Mikulich-Gilbertson, Susan K; Wagner, Brandie D; Grunwald, Gary K; Riggs, Paula D; Zerbe, Gary O
2018-01-01
Medical research is often designed to investigate changes in a collection of response variables that are measured repeatedly on the same subjects. The multivariate generalized linear mixed model (MGLMM) can be used to evaluate random coefficient associations (e.g. simple correlations, partial regression coefficients) among outcomes that may be non-normal and differently distributed by specifying a multivariate normal distribution for their random effects and then evaluating the latent relationship between them. Empirical Bayes predictors are readily available for each subject from any mixed model and are observable and hence plottable. Here, we evaluate whether second-stage association analyses of empirical Bayes predictors from an MGLMM provide a good approximation and visual representation of these latent association analyses using medical examples and simulations. Additionally, we compare these results with association analyses of empirical Bayes predictors generated from separate mixed models for each outcome, a procedure that could circumvent computational problems that arise when the dimension of the joint covariance matrix of random effects is large and prohibits estimation of latent associations. As has been shown in other analytic contexts, the p-values for all second-stage coefficients that were determined by naively assuming normality of empirical Bayes predictors provide a good approximation to p-values determined via permutation analysis. Analyzing outcomes that are interrelated with separate models in the first stage and then associating the resulting empirical Bayes predictors in a second stage results in different mean and covariance parameter estimates from the maximum likelihood estimates generated by an MGLMM. The potential for erroneous inference from using results from these separate models increases as the magnitude of the association among the outcomes increases. Thus, if computable, scatterplots of the conditionally independent empirical Bayes
Bayes linear statistics, theory & methods
Goldstein, Michael
2007-01-01
Bayesian methods combine information available from data with any prior information available from expert knowledge. The Bayes linear approach follows this path, offering a quantitative structure for expressing beliefs, and systematic methods for adjusting these beliefs, given observational data. The methodology differs from the full Bayesian methodology in that it establishes simpler approaches to belief specification and analysis based around expectation judgements. Bayes Linear Statistics presents an authoritative account of this approach, explaining the foundations, theory, methodology, and practicalities of this important field. The text provides a thorough coverage of Bayes linear analysis, from the development of the basic language to the collection of algebraic results needed for efficient implementation, with detailed practical examples. The book covers:The importance of partial prior specifications for complex problems where it is difficult to supply a meaningful full prior probability specification...
Empirical Bayes Approaches to Multivariate Fuzzy Partitions.
Woodbury, Max A.; Manton, Kenneth G.
1991-01-01
An empirical Bayes-maximum likelihood estimation procedure is presented for the application of fuzzy partition models in describing high dimensional discrete response data. The model describes individuals in terms of partial membership in multiple latent categories that represent bounded discrete spaces. (SLD)
The effect of loss functions on empirical Bayes reliability analysis
Directory of Open Access Journals (Sweden)
Camara Vincent A. R.
1998-01-01
Full Text Available The aim of the present study is to investigate the sensitivity of empirical Bayes estimates of the reliability function with respect to changes in the loss function. In addition to applying some of the basic analytical results on empirical Bayes reliability obtained with the use of the “popular” squared error loss function, we shall derive some expressions corresponding to empirical Bayes reliability estimates obtained with the Higgins–Tsokos, the Harris and our proposed logarithmic loss functions. The concept of efficiency, along with the notion of integrated mean square error, will be used as a criterion to numerically compare our results. It is shown that empirical Bayes reliability functions are in general sensitive to the choice of the loss function, and that the squared error loss does not always yield the best empirical Bayes reliability estimate.
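The loss-function sensitivity described above can be illustrated with a short sketch. The numbers are hypothetical: exponential lifetimes with a gamma posterior on the failure rate, comparing the squared-error-loss Bayes estimate of the reliability function with a naive plug-in estimate (the Higgins–Tsokos, Harris and logarithmic loss estimators from the paper are not reproduced here).

```python
import math

# Gamma posterior on the exponential failure rate lambda (hypothetical):
# shape a = a0 + n failures, rate b = b0 + total time on test.
a, b = 1.0 + 7, 2.0 + 100.0
t = 30.0  # mission time

# Under squared error loss, the Bayes estimate of R(t) = exp(-lambda*t)
# is the posterior mean, which is (b/(b+t))^a in closed form.
r_sq = (b / (b + t)) ** a

# Plug-in estimate using the posterior mean rate a/b instead.
r_plug = math.exp(-(a / b) * t)

print(round(r_sq, 4), round(r_plug, 4))  # the two estimates differ
```

By Jensen's inequality the posterior-mean estimate always exceeds the plug-in value here, which is one concrete way a change of estimation rule moves the reliability estimate.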
A Bayes linear Bayes method for estimation of correlated event rates.
Quigley, John; Wilson, Kevin J; Walls, Lesley; Bedford, Tim
2013-12-01
Typically, full Bayesian estimation of correlated event rates can be computationally challenging since estimators are intractable. When estimation of event rates represents one activity within a larger modeling process, there is an incentive to develop more efficient inference than provided by a full Bayesian model. We develop a new subjective inference method for correlated event rates based on a Bayes linear Bayes model under the assumption that events are generated from a homogeneous Poisson process. To reduce the elicitation burden we introduce homogenization factors to the model and, as an alternative to a subjective prior, an empirical method using the method of moments is developed. Inference under the new method is compared against estimates obtained under a full Bayesian model, which takes a multivariate gamma prior, where the predictive and posterior distributions are derived in terms of well-known functions. The mathematical properties of both models are presented. A simulation study shows that the Bayes linear Bayes inference method and the full Bayesian model provide equally reliable estimates. An illustrative example, motivated by a problem of estimating correlated event rates across different users in a simple supply chain, shows how ignoring the correlation leads to biased estimation of event rates. © 2013 Society for Risk Analysis.
An Empirical Bayes Approach to Mantel-Haenszel DIF Analysis.
Zwick, Rebecca; Thayer, Dorothy T.; Lewis, Charles
1999-01-01
Developed an empirical Bayes enhancement to Mantel-Haenszel (MH) analysis of differential item functioning (DIF) in which it is assumed that the MH statistics are normally distributed and that the prior distribution of underlying DIF parameters is also normal. (Author/SLD)
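The normal-normal shrinkage underlying this enhancement can be sketched as follows; the MH D-DIF statistics and their sampling variances below are hypothetical, and the prior mean and variance are estimated by a simple method of moments.

```python
import statistics

# Observed MH D-DIF statistics and their sampling variances (hypothetical).
x = [-0.9, -0.2, 0.1, 0.3, 1.4, -0.5, 0.0, 0.7]
s2 = [0.16, 0.09, 0.12, 0.10, 0.25, 0.14, 0.08, 0.20]

# Method-of-moments empirical prior: mean mu and variance tau2, obtained
# by subtracting the average sampling variance from the total variance.
mu = statistics.fmean(x)
tau2 = max(0.0, statistics.variance(x) - statistics.fmean(s2))

# Posterior (shrunken) DIF estimate for each item: a precision-weighted
# average of the observed statistic and the prior mean.
post = [(tau2 * xi + si * mu) / (tau2 + si) for xi, si in zip(x, s2)]
print([round(p, 3) for p in post])
```

Items with large sampling variance are pulled hardest toward the prior mean, which is the stabilising effect the empirical Bayes enhancement provides.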
Using Loss Functions for DIF Detection: An Empirical Bayes Approach.
Zwick, Rebecca; Thayer, Dorothy; Lewis, Charles
2000-01-01
Studied a method for flagging differential item functioning (DIF) based on loss functions. Builds on earlier research that led to the development of an empirical Bayes enhancement to the Mantel-Haenszel DIF analysis. Tested the method through simulation and found its performance better than some commonly used DIF classification systems. (SLD)
International Nuclear Information System (INIS)
Quigley, John; Hardman, Gavin; Bedford, Tim; Walls, Lesley
2011-01-01
Empirical Bayes provides one approach to estimating the frequency of rare events as a weighted average of the frequencies of an event and a pool of events. The pool will draw upon, for example, events with similar precursors. The higher the degree of homogeneity of the pool, the more accurate the Empirical Bayes estimator will be. We propose and evaluate a new method using homogenisation factors under the assumption that events are generated from a Homogeneous Poisson Process. The homogenisation factors are scaling constants, which can be elicited through structured expert judgement and used to align the frequencies of different events, hence homogenising the pool. The estimation error relative to the homogeneity of the pool is examined theoretically indicating that reduced error is associated with larger pool homogeneity. The effects of misspecified expert assessments of the homogenisation factors are examined theoretically and through simulation experiments. Our results show that the proposed Empirical Bayes method using homogenisation factors is robust under different degrees of misspecification.
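A minimal sketch of the pooling idea, with hypothetical counts, exposures and homogenisation factors, and a naive moment-matched gamma prior (the paper's estimation details differ).

```python
import statistics

# Pooled event counts and exposures (hypothetical), together with elicited
# homogenisation factors h_i that rescale each exposure so the pooled
# events can be treated as arising from a common Poisson rate.
counts = [0, 2, 1, 3, 1, 0, 2]
exposures = [4.0, 10.0, 6.0, 12.0, 5.0, 3.0, 9.0]
h = [1.0, 0.8, 1.2, 1.0, 0.9, 1.1, 1.0]

rates = [c / (f * e) for c, f, e in zip(counts, h, exposures)]

# Naive moment match of a gamma(alpha, beta) prior to the pooled rates.
m = statistics.fmean(rates)
v = statistics.variance(rates)
alpha, beta = m * m / v, m / v

# Empirical Bayes posterior mean rate for a target event with x observed
# events in time t: the conjugate gamma-Poisson update.
x, t = 0, 4.0
eb_rate = (alpha + x) / (beta + t)
print(round(eb_rate, 4))  # strictly positive even though x = 0
```

The pooled prior keeps the estimate away from the overly optimistic value of zero that maximum likelihood would return for an event with no observed occurrences.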
On the null distribution of Bayes factors in linear regression
We show that under the null, the 2 log (Bayes factor) is asymptotically distributed as a weighted sum of chi-squared random variables with a shifted mean. This claim holds for Bayesian multi-linear regression with a family of conjugate priors, namely, the normal-inverse-gamma prior, the g-prior, and...
Bayes Empirical Bayes Inference of Amino Acid Sites Under Positive Selection
DEFF Research Database (Denmark)
Yang, Ziheng; Wong, Wendy Shuk Wan; Nielsen, Rasmus
2005-01-01
, with ω > 1 indicating positive selection. Statistical distributions are used to model the variation in ω among sites, allowing a subset of sites to have ω > 1 while the rest of the sequence may be under purifying selection with ω < 1 ... probabilities that a site comes from the site class with ω > 1. Current implementations, however, use the naive EB (NEB) approach and fail to account for sampling errors in maximum likelihood estimates of model parameters, such as the proportions and ω ratios for the site classes. In small data sets lacking...... information, this approach may lead to unreliable posterior probability calculations. In this paper, we develop a Bayes empirical Bayes (BEB) approach to the problem, which assigns a prior to the model parameters and integrates over their uncertainties. We compare the new and old methods on real and simulated...
DEFF Research Database (Denmark)
Madsen, Henrik; Rosbjerg, Dan
1997-01-01
parameters is inferred from regional data using generalized least squares (GLS) regression. Two different Bayesian T-year event estimators are introduced: a linear estimator that requires only some moments of the prior distributions to be specified and a parametric estimator that is based on specified......A regional estimation procedure that combines the index-flood concept with an empirical Bayes method for inferring regional information is introduced. The model is based on the partial duration series approach with generalized Pareto (GP) distributed exceedances. The prior information of the model...
Empirical Bayes conditional independence graphs for regulatory network recovery
Mahdi, Rami; Madduri, Abishek S.; Wang, Guoqing; Strulovici-Barel, Yael; Salit, Jacqueline; Hackett, Neil R.; Crystal, Ronald G.; Mezey, Jason G.
2012-01-01
Motivation: Computational inference methods that make use of graphical models to extract regulatory networks from gene expression data can have difficulty reconstructing dense regions of a network, a consequence of both computational complexity and unreliable parameter estimation when sample size is small. As a result, identification of hub genes is of special difficulty for these methods. Methods: We present a new algorithm, Empirical Light Mutual Min (ELMM), for large network reconstruction that has properties well suited for recovery of graphs with high-degree nodes. ELMM reconstructs the undirected graph of a regulatory network using empirical Bayes conditional independence testing with a heuristic relaxation of independence constraints in dense areas of the graph. This relaxation allows only one gene of a pair with a putative relation to be aware of the network connection, an approach that is aimed at easing multiple testing problems associated with recovering densely connected structures. Results: Using in silico data, we show that ELMM has better performance than commonly used network inference algorithms including GeneNet, ARACNE, FOCI, GENIE3 and GLASSO. We also apply ELMM to reconstruct a network among 5492 genes expressed in human lung airway epithelium of healthy non-smokers, healthy smokers and individuals with chronic obstructive pulmonary disease assayed using microarrays. The analysis identifies dense sub-networks that are consistent with known regulatory relationships in the lung airway and also suggests novel hub regulatory relationships among a number of genes that play roles in oxidative stress and secretion. Availability and implementation: Software for running ELMM is made available at http://mezeylab.cb.bscb.cornell.edu/Software.aspx. Contact: ramimahdi@yahoo.com or jgm45@cornell.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:22685074
Flexible Modeling of Epidemics with an Empirical Bayes Framework
Brooks, Logan C.; Farrow, David C.; Hyun, Sangwon; Tibshirani, Ryan J.; Rosenfeld, Roni
2015-01-01
Seasonal influenza epidemics cause consistent, considerable, widespread loss annually in terms of economic burden, morbidity, and mortality. With access to accurate and reliable forecasts of a current or upcoming influenza epidemic’s behavior, policy makers can design and implement more effective countermeasures. This past year, the Centers for Disease Control and Prevention hosted the “Predict the Influenza Season Challenge”, with the task of predicting key epidemiological measures for the 2013–2014 U.S. influenza season with the help of digital surveillance data. We developed a framework for in-season forecasts of epidemics using a semiparametric Empirical Bayes framework, and applied it to predict the weekly percentage of outpatient doctor visits for influenza-like illness, and the season onset, duration, peak time, and peak height, with and without using Google Flu Trends data. Previous work on epidemic modeling has focused on developing mechanistic models of disease behavior and applying time series tools to explain historical data. However, tailoring these models to certain types of surveillance data can be challenging, and overly complex models with many parameters can compromise forecasting ability. Our approach instead produces possibilities for the epidemic curve of the season of interest using modified versions of data from previous seasons, allowing for reasonable variations in the timing, pace, and intensity of the seasonal epidemics, as well as noise in observations. Since the framework does not make strict domain-specific assumptions, it can easily be applied to other diseases with seasonal epidemics. This method produces a complete posterior distribution over epidemic curves, rather than, for example, solely point predictions of forecasting targets. We report prospective influenza-like-illness forecasts made for the 2013–2014 U.S. influenza season, and compare the framework’s cross-validated prediction error on historical data to
EbayesThresh: R Programs for Empirical Bayes Thresholding
Directory of Open Access Journals (Sweden)
Iain Johnstone
2005-04-01
Full Text Available Suppose that a sequence of unknown parameters is observed subject to independent Gaussian noise. The EbayesThresh package in the S language implements a class of Empirical Bayes thresholding methods that can take advantage of possible sparsity in the sequence, to improve the quality of estimation. The prior for each parameter in the sequence is a mixture of an atom of probability at zero and a heavy-tailed density. Within the package, this can be either a Laplace (double exponential) density or else a mixture of normal distributions with tail behavior similar to the Cauchy distribution. The mixing weight, or sparsity parameter, is chosen automatically by marginal maximum likelihood. If estimation is carried out using the posterior median, this is a random thresholding procedure; the estimation can also be carried out using other thresholding rules with the same threshold, and the package provides the posterior mean, and hard and soft thresholding, as additional options. This paper reviews the method, and gives details (far beyond those previously published) of the calculations needed for implementing the procedures. It explains and motivates both the general methodology, and the use of the EbayesThresh package, through simulated and real data examples. When estimating the wavelet transform of an unknown function, it is appropriate to apply the method level by level to the transform of the observed data. The package can carry out these calculations for wavelet transforms obtained using various packages in R and S-PLUS. Details, including a motivating example, are presented, and the application of the method to image estimation is also explored. The final topic considered is the estimation of a single sequence that may become progressively sparser along the sequence. An iterated least squares isotone regression method allows for the choice of a threshold that depends monotonically on the order in which the observations are made. An alternative
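The flavour of the method can be sketched in a few lines of Python rather than S. This is not the package's algorithm: it swaps the Laplace slab for a normal one, picks the sparsity weight by grid-search marginal likelihood, and uses a hard posterior-probability threshold instead of the posterior median; all data are made up.

```python
import math

def dnorm(x, sd=1.0):
    """Normal density with mean 0."""
    return math.exp(-0.5 * (x / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

# Observations: a sparse signal plus N(0,1) noise (hypothetical data).
z = [0.3, -0.5, 4.1, 0.2, -3.6, 0.1, 0.4, -0.2, 5.0, 0.0]
tau2 = 9.0  # assumed slab variance (a normal slab keeps the sketch short)

# Choose the mixing weight w by marginal maximum likelihood on a grid.
def loglik(w):
    return sum(math.log((1 - w) * dnorm(x) + w * dnorm(x, math.sqrt(1 + tau2)))
               for x in z)

w = max((i / 100 for i in range(1, 100)), key=loglik)

# Posterior probability that each parameter is nonzero; shrink the
# survivors and zero out the rest (a hard-thresholding rule).
def post_prob(x):
    s = w * dnorm(x, math.sqrt(1 + tau2))
    return s / (s + (1 - w) * dnorm(x))

est = [x * tau2 / (1 + tau2) if post_prob(x) > 0.5 else 0.0 for x in z]
print([round(e, 2) for e in est])
```

The three clearly nonzero observations survive (shrunk toward zero), while the seven noise-level values are set exactly to zero, which is the sparsity-adaptive behaviour the package automates.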
Estimating rate of occurrence of rare events with empirical bayes: A railway application
International Nuclear Information System (INIS)
Quigley, John; Bedford, Tim; Walls, Lesley
2007-01-01
Classical approaches to estimating the rate of occurrence of events perform poorly when data are few. Maximum likelihood estimators result in overly optimistic point estimates of zero for situations where there have been no events. Alternative empirical-based approaches have been proposed based on median estimators or non-informative prior distributions. While these alternatives offer an improvement over point estimates of zero, they can be overly conservative. Empirical Bayes procedures offer an unbiased approach through pooling data across different hazards to support stronger statistical inference. This paper considers the application of Empirical Bayes to high consequence low-frequency events, where estimates are required for risk mitigation decision support, such as demonstrating that risk is as low as reasonably practicable (ALARP). A summary of empirical Bayes methods is given and the choices of estimation procedures to obtain interval estimates are discussed. The approaches illustrated within the case study are based on the estimation of the rate of occurrence of train derailments within the UK. The usefulness of empirical Bayes within this context is discussed.
Empirical Bayes Credibility Models for Economic Catastrophic Losses by Regions
Directory of Open Access Journals (Sweden)
Jindrová Pavla
2017-01-01
Full Text Available Catastrophic events affect various regions of the world with increasing frequency and intensity. The number of catastrophic events and the amount of economic losses vary across world regions, and part of these losses is covered by insurance. Catastrophic events in recent years have been associated with increases in premiums for some lines of business. The article focuses on estimating the amount of net premiums that would be needed to cover the total or insured catastrophic losses in different world regions using Bühlmann and Bühlmann-Straub empirical credibility models based on data from Sigma Swiss Re 2010-2016. The empirical credibility models have been developed to estimate insurance premiums for short term insurance contracts using two ingredients: past data from the risk itself and collateral data from other sources considered to be relevant. In this article we deal with application of these models based on real data about the number of catastrophic events and the total economic and insured catastrophe losses in seven regions of the world in the time period 2009-2015. Estimated credible premiums by world region provide information on how much money will be needed in the monitored regions to cover total and insured catastrophic losses in the next year.
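A compact sketch of the Bühlmann credibility calculation with hypothetical regional loss data (the Bühlmann-Straub extension, which allows unequal exposures, is omitted).

```python
import statistics

# Annual catastrophe losses by region over five years (hypothetical units).
losses = {
    "Asia":          [120, 95, 310, 150, 80],
    "North America": [200, 450, 180, 220, 390],
    "Europe":        [60, 75, 50, 90, 65],
}

n = 5  # years of data per region
means = {r: statistics.fmean(x) for r, x in losses.items()}
m = statistics.fmean(means.values())                                    # collective mean
s2 = statistics.fmean(statistics.variance(x) for x in losses.values()) # within-region
a = statistics.variance(means.values()) - s2 / n                       # between-region

# Buhlmann credibility factor Z; the credible premium blends each
# region's own experience with the collective mean.
Z = n / (n + s2 / a)
premium = {r: Z * means[r] + (1 - Z) * m for r in losses}
print({r: round(p, 1) for r, p in premium.items()})
```

Each region's premium lies between its own historical mean and the collective mean, with the balance set by how credible (stable and distinctive) its own experience is.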
International Nuclear Information System (INIS)
Quigley, John; Walls, Lesley
2011-01-01
Mixing Bayes and Empirical Bayes inference provides reliability estimates for variant system designs by using relevant failure data - observed and anticipated - about engineering changes arising due to modification and innovation. A coherent inference framework is proposed to predict the realization of engineering concerns during product development so that informed decisions can be made about the system design and the analysis conducted to prove reliability. The proposed method involves combining subjective prior distributions for the number of engineering concerns with empirical priors for the non-parametric distribution of time to realize these concerns in such a way that we can cross-tabulate classes of concerns to failure events within time partitions at an appropriate level of granularity. To support efficient implementation, a computationally convenient hypergeometric approximation is developed for the counting distributions appropriate to our underlying stochastic model. The accuracy of our approximation over first-order alternatives is examined, and demonstrated, through an evaluation experiment. An industrial application illustrates model implementation and shows how estimates can be updated using information arising during development test and analysis.
A nonparametric empirical Bayes framework for large-scale multiple testing.
Martin, Ryan; Tokdar, Surya T
2012-07-01
We propose a flexible and identifiable version of the 2-groups model, motivated by hierarchical Bayes considerations, that features an empirical null and a semiparametric mixture model for the nonnull cases. We use a computationally efficient predictive recursion (PR) marginal likelihood procedure to estimate the model parameters, even the nonparametric mixing distribution. This leads to a nonparametric empirical Bayes testing procedure, which we call PRtest, based on thresholding the estimated local false discovery rates. Simulations and real data examples demonstrate that, compared to existing approaches, PRtest's careful handling of the nonnull density can give a much better fit in the tails of the mixture distribution which, in turn, can lead to more realistic conclusions.
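The thresholding step can be illustrated with a toy two-groups simulation. This replaces the paper's predictive recursion and semiparametric fit with a crude kernel density estimate and a central-matching estimate of the null proportion, so it is only a sketch of the local false discovery rate idea.

```python
import math
import random

random.seed(1)

# Simulated z-scores: 90% null N(0,1), 10% non-null N(3,1).
z = [random.gauss(0, 1) if random.random() < 0.9 else random.gauss(3, 1)
     for _ in range(1000)]

def phi(x):
    """Standard normal density (the theoretical null f0)."""
    return math.exp(-0.5 * x * x) / math.sqrt(2 * math.pi)

# Crude pi0 estimate by central matching: compare the observed fraction
# in [-1, 1] with what a pure N(0,1) sample would put there.
pi0 = min(1.0, sum(abs(x) <= 1 for x in z) / (0.6827 * len(z)))

def f(x, h=0.3):
    """Gaussian kernel estimate of the full mixture density."""
    return sum(phi((x - xi) / h) for xi in z) / (len(z) * h)

def lfdr(x):
    """Estimated local false discovery rate pi0 * f0(x) / f(x)."""
    return min(1.0, pi0 * phi(x) / f(x))

# Flag cases whose estimated lfdr falls below a 0.2 threshold.
flagged = [x for x in z if lfdr(x) < 0.2]
print(len(flagged))
```

Only observations far out in the right tail, where the non-null component dominates the mixture, clear the threshold; near zero the null density swamps the signal and the lfdr stays close to one.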
Noma, Hisashi; Matsui, Shigeyuki
2013-05-20
The main purpose of microarray studies is screening of differentially expressed genes as candidates for further investigation. Because of limited resources at this stage, prioritizing genes is a relevant statistical task in microarray studies. For effective gene selection, parametric empirical Bayes methods for ranking and selection of genes with largest effect sizes have been proposed (Noma et al., 2010; Biostatistics 11: 281-289). The hierarchical mixture model incorporates the differential and non-differential components and allows information borrowing across differential genes with separation from nuisance, non-differential genes. In this article, we develop empirical Bayes ranking methods via a semiparametric hierarchical mixture model. A nonparametric prior distribution, rather than a parametric prior distribution, for effect sizes is specified and estimated using the "smoothing by roughening" approach of Laird and Louis (1991; Computational Statistics and Data Analysis 12: 27-37). We present applications to childhood and infant leukemia clinical studies with microarrays for exploring genes related to prognosis or disease progression. Copyright © 2012 John Wiley & Sons, Ltd.
Spencer, Amy V; Cox, Angela; Lin, Wei-Yu; Easton, Douglas F; Michailidou, Kyriaki; Walters, Kevin
2016-04-01
There is a large amount of functional genetic data available, which can be used to inform fine-mapping association studies (in diseases with well-characterised disease pathways). Single nucleotide polymorphism (SNP) prioritization via Bayes factors is attractive because prior information can inform the effect size or the prior probability of causal association. This approach requires the specification of the effect size. If the information needed to estimate a priori the probability density for the effect sizes for causal SNPs in a genomic region is not consistent or is not available, then specifying a prior variance for the effect sizes is challenging. We propose both an empirical method to estimate this prior variance, and a coherent approach to using SNP-level functional data, to inform the prior probability of causal association. Through simulation we show that when ranking SNPs by our empirical Bayes factor in a fine-mapping study, the causal SNP rank is generally as high or higher than the rank using Bayes factors with other plausible values of the prior variance. Importantly, we also show that assigning SNP-specific prior probabilities of association based on expert prior functional knowledge of the disease mechanism can lead to improved causal SNP ranks compared to ranking with identical prior probabilities of association. We demonstrate the use of our methods by applying them to the fine mapping of the CASP8 region of chromosome 2 using genotype data from the Collaborative Oncological Gene-Environment Study (COGS) Consortium. The data we analysed included approximately 46,000 breast cancer case and 43,000 healthy control samples. © 2016 The Authors. Genetic Epidemiology published by Wiley Periodicals, Inc.
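The ranking quantity can be sketched with a Wakefield-style approximate Bayes factor; the summary statistics, prior variance W and annotation-based prior odds below are all hypothetical.

```python
import math

def abf(beta_hat, V, W):
    """Wakefield-style approximate Bayes factor for one SNP:
    beta_hat ~ N(beta, V), prior beta ~ N(0, W).
    Written so that larger values favour association."""
    z2 = beta_hat * beta_hat / V
    return math.sqrt(V / (V + W)) * math.exp(z2 * W / (2 * (V + W)))

# Hypothetical fine-mapping summary statistics (beta_hat, V) for three SNPs.
stats = [(0.02, 0.0004), (0.09, 0.0004), (0.01, 0.0009)]
W = 0.04  # prior variance of the effect size: the quantity the paper
          # proposes to estimate empirically rather than fix by convention

# Combine with SNP-specific prior odds from functional annotation.
prior_odds = [0.1, 0.1, 0.5]
posterior_odds = [p * abf(b, v, W) for (b, v), p in zip(stats, prior_odds)]
print([round(o, 2) for o in posterior_odds])
```

Only the SNP with a large standardized effect dominates the ranking; the functional prior odds shift the ordering among the weak signals without overturning strong evidence.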
Ator, Scott W.; Brakebill, John W.; Blomquist, Joel D.
2011-01-01
Spatially Referenced Regression on Watershed Attributes (SPARROW) was used to provide empirical estimates of the sources, fate, and transport of total nitrogen (TN) and total phosphorus (TP) in the Chesapeake Bay watershed, and the mean annual TN and TP flux to the bay and in each of 80,579 nontidal tributary stream reaches. Restoration efforts in recent decades have been insufficient to meet established standards for water quality and ecological conditions in Chesapeake Bay. The bay watershed includes 166,000 square kilometers of mixed land uses, multiple nutrient sources, and variable hydrogeologic, soil, and weather conditions, and bay restoration is complicated by the multitude of nutrient sources and complex interacting factors affecting the occurrence, fate, and transport of nitrogen and phosphorus from source areas to streams and the estuary. Effective and efficient nutrient management at the regional scale in support of Chesapeake Bay restoration requires a comprehensive understanding of the sources, fate, and transport of nitrogen and phosphorus in the watershed, which is only available through regional models. The current models, Chesapeake Bay nutrient SPARROW models, version 4 (CBTN_v4 and CBTP_v4), were constructed at a finer spatial resolution than previous SPARROW models for the Chesapeake Bay watershed (versions 1, 2, and 3), and include an updated timeframe and modified sources and other explanatory terms.
β-empirical Bayes inference and model diagnosis of microarray data
Directory of Open Access Journals (Sweden)
Hossain Mollah Mohammad
2012-06-01
Full Text Available Abstract Background Microarray data enables the high-throughput survey of mRNA expression profiles at the genomic level; however, the data presents a challenging statistical problem because of the large number of transcripts with small sample sizes that are obtained. To reduce the dimensionality, various Bayesian or empirical Bayes hierarchical models have been developed. However, because of the complexity of the microarray data, no model can explain the data fully. It is generally difficult to scrutinize the irregular patterns of expression that are not expected by the usual statistical gene by gene models. Results As an extension of empirical Bayes (EB) procedures, we have developed the β-empirical Bayes (β-EB) approach based on a β-likelihood measure which can be regarded as an ’evidence-based’ weighted (quasi-) likelihood inference. The weight of a transcript t is described as a power function of its likelihood, fβ(yt|θ). Genes with low likelihoods have unexpected expression patterns and low weights. By assigning low weights to outliers, the inference becomes robust. The value of β, which controls the balance between the robustness and efficiency, is selected by maximizing the predictive β0-likelihood by cross-validation. The proposed β-EB approach identified six significant (p < 10−5) contaminated transcripts as differentially expressed (DE) in normal/tumor tissues from the head and neck of cancer patients. These six genes were all confirmed to be related to cancer; they were not identified as DE genes by the classical EB approach. When applied to the eQTL analysis of Arabidopsis thaliana, the proposed β-EB approach identified some potential master regulators that were missed by the EB approach. Conclusions The simulation data and real gene expression data showed that the proposed β-EB method was robust against outliers. The distribution of the weights was used to scrutinize the irregular patterns of expression and diagnose the model
International Nuclear Information System (INIS)
Vesely, W.E.; Uryas'ev, S.P.; Samanta, P.K.
1993-01-01
Emergency Diesel Generators (EDGs) provide backup power to nuclear power plants in case of failure of AC buses. The reliability of EDGs is important to assure response to loss-of-offsite-power accident scenarios, a dominant contributor to plant risk. The reliable performance of EDGs has been of concern both for regulators and plant operators. In this paper the authors present an approach and results from the analysis of failure data from a large population of EDGs. They used an empirical Bayes approach to obtain both the population distribution and the individual failure probabilities from EDG failure-to-start and load-run data collected over 4 years for 194 EDGs at 63 plant units.
The Bayes linear approach to inference and decision-making for a reliability programme
International Nuclear Information System (INIS)
Goldstein, Michael; Bedford, Tim
2007-01-01
In reliability modelling it is conventional to build sophisticated models of the probabilistic behaviour of the component lifetimes in a system in order to deduce information about the probabilistic behaviour of the system lifetime. Decision modelling of the reliability programme requires a priori, therefore, an even more sophisticated set of models in order to capture the evidence the decision maker believes may be obtained from different types of data acquisition. Bayes linear analysis is a methodology that uses expectation rather than probability as the fundamental expression of uncertainty. By working only with expected values, a simpler level of modelling is needed as compared to full probability models. In this paper we shall consider the Bayes linear approach to the estimation of the mean time to failure (MTTF) of a component. The model built will take account of the variance in our estimate of the MTTF, based on a variety of sources of information.
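The core Bayes linear update uses only first- and second-order belief specifications. A scalar sketch for the MTTF, with hypothetical numbers:

```python
# Bayes linear adjustment of the MTTF by observed data D (scalar case):
# adjusted expectation E_D(X) = E(X) + Cov(X, D) / Var(D) * (D - E(D)).
# Only means, variances and covariances are specified -- no full
# probability distributions (all numbers below are hypothetical).
E_X, Var_X = 500.0, 90000.0   # prior beliefs about the MTTF (hours)
E_D, Var_D = 500.0, 120000.0  # beliefs about the observed mean lifetime
Cov_XD = 90000.0              # D = X + sampling noise, so Cov(X, D) = Var(X)

D = 340.0                     # observed mean time to failure
adjusted_E = E_X + Cov_XD / Var_D * (D - E_D)

# Adjusted variance: how much uncertainty the data resolves.
adjusted_Var = Var_X - Cov_XD ** 2 / Var_D
print(adjusted_E, adjusted_Var)
```

The adjusted expectation moves part of the way toward the data, and the adjusted variance quantifies the reduction in uncertainty, all without ever specifying full distributions; this is the simpler level of modelling the abstract refers to.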
Naznin, Farhana; Currie, Graham; Sarvi, Majid; Logan, David
2016-01-01
Streetcar/tram systems are growing worldwide, and many are given priority to increase speed and reliability performance in mixed traffic conditions. Research related to the road safety impact of tram priority is limited. This study explores the road safety impacts of tram priority measures including lane and intersection/signal priority measures. A before-after crash study was conducted using the empirical Bayes (EB) method to provide more accurate crash impact estimates by accounting for wider crash trends and regression to the mean effects. Before-after crash data for 29 intersections with tram signal priority and 23 arterials with tram lane priority in Melbourne, Australia, were analyzed to evaluate the road safety impact of tram priority. The EB before-after analysis results indicated a statistically significant adjusted crash reduction rate of 16.4% after implementation of tram priority measures. Signal priority measures were found to reduce crashes by 13.9% and lane priority by 19.4%. A disaggregate level simple before-after analysis indicated reductions in total and serious crashes as well as vehicle-, pedestrian-, and motorcycle-involved crashes. In addition, reductions in on-path crashes, pedestrian-involved crashes, and collisions among vehicles moving in the same and opposite directions and all other specific crash types were found after tram priority implementation. Results suggest that streetcar/tram priority measures result in safety benefits for all road users, including vehicles, pedestrians, and cyclists. Policy implications and areas for future research are discussed.
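The EB before-after logic reduces to a weighted average; a sketch with hypothetical numbers (the weight formula assumes a negative binomial crash model, and the real study also adjusts for exposure and wider trends).

```python
# Empirical Bayes estimate of expected crashes at a treated site:
# a weighted average of the site's observed count and the prediction
# from a safety performance function (SPF), with the weight set by the
# negative binomial overdispersion parameter k (numbers hypothetical).
mu = 6.2        # SPF-predicted crashes over the before period
observed = 11   # crashes actually observed at the site
k = 0.8         # overdispersion parameter of the NB crash model

w = 1.0 / (1.0 + mu / k)
expected_before = w * mu + (1 - w) * observed

# Crash reduction attributable to treatment: compare the EB-adjusted
# expectation (carried to an after period of equal exposure) against
# the observed after count.
after = 7
reduction = 100.0 * (expected_before - after) / expected_before
print(round(expected_before, 2), round(reduction, 1), "%")
```

Because the EB expectation sits between the SPF prediction and the raw count, sites that were observed high partly by chance are corrected downward before the comparison, which is how the method guards against regression-to-the-mean bias.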
Developing a Clustering-Based Empirical Bayes Analysis Method for Hotspot Identification
Directory of Open Access Journals (Sweden)
Yajie Zou
2017-01-01
Hotspot identification (HSID) is a critical part of network-wide safety evaluations. Typical methods for ranking sites are often rooted in using the Empirical Bayes (EB) method to estimate safety from both observed crash records and predicted crash frequency based on similar sites. The performance of the EB method is highly related to the selection of a reference group of sites (i.e., roadway segments or intersections similar to the target site) from which the safety performance functions (SPFs) used to predict crash frequency will be developed. As crash data often contain underlying heterogeneity that, in essence, can make them appear to be generated from distinct subpopulations, methods are needed to select similar sites in a principled manner. To overcome this possible heterogeneity problem, EB-based HSID methods that use common clustering methodologies (e.g., mixture models, K-means, and hierarchical clustering) to select “similar” sites for building SPFs are developed. Performance of the clustering-based EB methods is then compared using real crash data. Here, HSID results, when computed on Texas undivided rural highway crash data, suggest that all three clustering-based EB analysis methods are preferred over the conventional statistical methods. Thus, properly classifying the road segments for heterogeneous crash data can further improve HSID accuracy.
Zhang, Kui; Wiener, Howard; Beasley, Mark; George, Varghese; Amos, Christopher I; Allison, David B
2006-08-01
Individual genome scans for quantitative trait loci (QTL) mapping often suffer from low statistical power and imprecise estimates of QTL location and effect. This lack of precision yields large confidence intervals for QTL location, which are problematic for subsequent fine mapping and positional cloning. In prioritizing areas for follow-up after an initial genome scan and in evaluating the credibility of apparent linkage signals, investigators typically examine the results of other genome scans of the same phenotype and informally update their beliefs about which linkage signals in their scan most merit confidence and follow-up via a subjective-intuitive integration approach. A method that acknowledges the wisdom of this general paradigm but formally borrows information from other scans to increase confidence in objectivity would be a benefit. We developed an empirical Bayes analytic method to integrate information from multiple genome scans. The linkage statistic obtained from a single genome scan study is updated by incorporating statistics from other genome scans as prior information. This technique does not require that all studies have an identical marker map or a common estimated QTL effect. The updated linkage statistic can then be used for the estimation of QTL location and effect. We evaluate the performance of our method by using extensive simulations based on actual marker spacing and allele frequencies from available data. Results indicate that the empirical Bayes method can account for between-study heterogeneity, estimate the QTL location and effect more precisely, and provide narrower confidence intervals than results from any single individual study. We also compared the empirical Bayes method with a method originally developed for meta-analysis (a closely related but distinct purpose). In the face of marked heterogeneity among studies, the empirical Bayes method outperforms the comparator.
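The updating idea can be illustrated with a normal-normal empirical Bayes step: treat the statistics from other scans at a genomic location as the prior and shrink the current scan's statistic toward their precision-weighted mean. This is a generic sketch of the borrowing-strength mechanism, not the paper's exact statistic:

```python
import numpy as np

def eb_update_linkage(z_own, var_own, z_others, var_others):
    """Precision-weighted EB update of a linkage statistic.

    z_own, var_own       : statistic and variance from the current scan
    z_others, var_others : statistics/variances from other genome scans,
                           used here as empirical prior information
    Returns the posterior-mean style updated statistic.
    """
    var_others = np.asarray(var_others, dtype=float)
    z_others = np.asarray(z_others, dtype=float)
    prior_prec = np.sum(1.0 / var_others)
    prior_mean = np.sum(z_others / var_others) / prior_prec
    post_prec = 1.0 / var_own + prior_prec
    return (z_own / var_own + prior_mean * prior_prec) / post_prec

# A strong local signal (z=3.0) tempered by two weaker external scans:
updated = eb_update_linkage(3.0, 1.0, [1.5, 2.0], [2.0, 2.0])
```

The updated statistic sits between the scan's own value and the external consensus, with the balance set by the relative precisions.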
Observations and a linear model of water level in an interconnected inlet-bay system
Aretxabaleta, Alfredo; Ganju, Neil K.; Butman, Bradford; Signell, Richard
2017-01-01
A system of barrier islands and back-barrier bays occurs along southern Long Island, New York, and in many coastal areas worldwide. Characterizing the bay physical response to water level fluctuations is needed to understand flooding during extreme events and evaluate their relation to geomorphological changes. Offshore sea level is one of the main drivers of water level fluctuations in semienclosed back-barrier bays. We analyzed observed water levels (October 2007 to November 2015) and developed analytical models to better understand bay water level along southern Long Island. An increase (∼0.02 m change in 0.17 m amplitude) in the dominant M2 tidal amplitude (containing the largest fraction of the variability) was observed in Great South Bay during mid-2014. The observed changes in both tidal amplitude and bay water level transfer from offshore were related to the dredging of nearby inlets and possibly the changing size of a breach across Fire Island caused by Hurricane Sandy (after December 2012). The bay response was independent of the magnitude of the fluctuations (e.g., storms) at a specific frequency. An analytical model that incorporates bay and inlet dimensions reproduced the observed transfer function in Great South Bay and surrounding areas. The model predicts the transfer function in Moriches and Shinnecock bays where long-term observations were not available. The model is a simplified tool to investigate changes in bay water level and enables the evaluation of future conditions and alternative geomorphological settings.
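The paper's analytical model incorporates inlet and bay dimensions; as a much simpler illustration of the filtering idea, a bay connected to the ocean through a frictional inlet behaves like a first-order low-pass filter of offshore sea level. The time constant tau below is an assumed lumped parameter, not a value from the paper:

```python
import math

def bay_amplitude_ratio(omega, tau):
    """Amplitude transfer |bay|/|offshore| for the linear inlet-bay model
    d(eta_bay)/dt = (eta_offshore - eta_bay) / tau.

    omega : forcing frequency in rad/s (M2 tide is about 1.4e-4 rad/s)
    tau   : inlet-bay adjustment time scale in seconds (assumed)
    """
    return 1.0 / math.sqrt(1.0 + (omega * tau) ** 2)

m2 = 2.0 * math.pi / (12.42 * 3600.0)              # M2 tidal frequency
ratio = bay_amplitude_ratio(m2, tau=2.0 * 3600.0)  # assumed 2 h time constant
```

High-frequency fluctuations are damped more strongly than low-frequency ones, which is why changing inlet geometry (dredging, a breach) changes the tidal amplitude transferred into the bay.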
An Empirical Bayes before-after evaluation of road safety effects of a new motorway in Norway.
Elvik, Rune; Ulstein, Heidi; Wifstad, Kristina; Syrstad, Ragnhild S; Seeberg, Aase R; Gulbrandsen, Magnus U; Welde, Morten
2017-11-01
This paper presents an Empirical Bayes before-after evaluation of the road safety effects of a new motorway (freeway) in Østfold county, Norway. The before-period was 1996-2002. The after-period was 2009-2015. The road was rebuilt from an undivided two-lane road into a divided four-lane road. The number of killed or seriously injured road users was reduced by 75 percent, controlling for (downward) long-term trends and regression-to-the-mean (statistically significant at the 5 percent level; recorded numbers 71 before, 11 after). There were small changes in the number of injury accidents (185 before, 123 after; net effect -3%) and the number of slightly injured road users (403 before, 279 after; net effect +5%). Motorways appear to mainly reduce injury severity, not the number of accidents. The paper discusses challenges in implementing the Empirical Bayes design when less than ideal data are available. Copyright © 2017 Elsevier Ltd. All rights reserved.
Jaber, Abobaker M; Ismail, Mohd Tahir; Altaher, Alsaidi M
2014-01-01
This paper mainly forecasts the daily closing price of stock markets. We propose a two-stage technique that combines empirical mode decomposition (EMD) with the nonparametric method of local linear quantile (LLQ) regression. We use the proposed technique, EMD-LLQ, to forecast two stock index time series. Detailed experiments are implemented for the proposed method, in which the EMD-LLQ, EMD, and Holt-Winters methods are compared. The proposed EMD-LLQ model is determined to be superior to the EMD and Holt-Winters methods in predicting the stock closing prices.
Anderson, Clare
2011-01-01
This article explores the lives of two Andamanese women, both of whom the British called “Topsy.” The first part of the article takes an indigenous and gendered perspective on early British colonization of the Andamans in the 1860s, and through the experiences of a woman called Topsy stresses the sexual violence that underpinned colonial settlement as well as the British reliance on women as cultural interlocutors. Second, the article discusses colonial naming practices, and the employment of Andamanese women and men as nursemaids and household servants during the 1890s–1910s. Using an extraordinary murder case in which a woman known as Topsy-ayah was a central witness, it argues that both reveal something of the enduring associations and legacies of slavery, as well as the cultural influence of the Atlantic in the Bay of Bengal. In sum, these women's lives present a kaleidoscope view of colonization, gender, networks of Empire, labor, and domesticity in the Bay of Bengal.
The effect of changes in sea surface temperature on linear growth of Porites coral in Ambon Bay
International Nuclear Information System (INIS)
Corvianawatie, Corry; Putri, Mutiara R.; Cahyarini, Sri Y.
2015-01-01
Coral is one of the most important organisms in the coral reef ecosystem. There are several factors affecting coral growth, one of them being changes in sea surface temperature (SST). The purpose of this research is to understand the influence of SST variability on the annual linear growth of Porites coral taken from Ambon Bay. The annual coral linear growth was calculated and compared to the annual SST from the Extended Reconstructed Sea Surface Temperature version 3b (ERSST v3b) model. Coral growth was calculated by using Coral X-radiograph Density System (CoralXDS) software. Coral sample X-radiographs were used as input data. Chronology was developed by counting the coral’s annual growth bands. A pair of high and low density banding patterns observed in the coral’s X-radiograph represents one year of coral growth. The results of this study show that the Porites coral record extends from 2001 to 2009, with an average growth rate of 1.46 cm/year. Statistical analysis shows that the annual coral linear growth declined by 0.015 cm/year while the annual SST declined by 0.013°C/year. Annual SST and the annual linear growth of Porites coral in Ambon Bay are not significantly correlated (r=0.304, n=9, p>0.05). This indicates that annual SST variability does not significantly influence the linear growth of Porites coral from Ambon Bay. It is suggested that sedimentation load, salinity, pH or other environmental factors may affect annual linear coral growth.
The effect of changes in sea surface temperature on linear growth of Porites coral in Ambon Bay
Energy Technology Data Exchange (ETDEWEB)
Corvianawatie, Corry, E-mail: corvianawatie@students.itb.ac.id; Putri, Mutiara R., E-mail: mutiara.putri@fitb.itb.ac.id [Oceanography Study Program, Bandung Institute of Technology (ITB), Jl. Ganesha 10 Bandung (Indonesia); Cahyarini, Sri Y., E-mail: yuda@geotek.lipi.go.id [Research Center for Geotechnology, Indonesian Institute of Sciences (LIPI), Bandung (Indonesia)
2015-09-30
Coral is one of the most important organisms in the coral reef ecosystem. There are several factors affecting coral growth, one of them being changes in sea surface temperature (SST). The purpose of this research is to understand the influence of SST variability on the annual linear growth of Porites coral taken from Ambon Bay. The annual coral linear growth was calculated and compared to the annual SST from the Extended Reconstructed Sea Surface Temperature version 3b (ERSST v3b) model. Coral growth was calculated by using Coral X-radiograph Density System (CoralXDS) software. Coral sample X-radiographs were used as input data. Chronology was developed by counting the coral’s annual growth bands. A pair of high and low density banding patterns observed in the coral’s X-radiograph represents one year of coral growth. The results of this study show that the Porites coral record extends from 2001 to 2009, with an average growth rate of 1.46 cm/year. Statistical analysis shows that the annual coral linear growth declined by 0.015 cm/year while the annual SST declined by 0.013°C/year. Annual SST and the annual linear growth of Porites coral in Ambon Bay are not significantly correlated (r=0.304, n=9, p>0.05). This indicates that annual SST variability does not significantly influence the linear growth of Porites coral from Ambon Bay. It is suggested that sedimentation load, salinity, pH or other environmental factors may affect annual linear coral growth.
Construction and utilization of linear empirical core models for PWR in-core fuel management
International Nuclear Information System (INIS)
Okafor, K.C.
1988-01-01
An empirical core-model construction procedure for pressurized water reactor (PWR) in-core fuel management is developed that allows determining the optimal beginning-of-cycle (BOC) k∞ profiles in PWRs as a single linear-programming problem, and thus facilitates the overall optimization process for in-core fuel management through algorithmic simplification and reduced computation time. The optimal profile is defined as one that maximizes cycle burnup. The model construction scheme treats the fuel-assembly power fractions, burnup, and leakage as state variables and BOC zone enrichments as control variables. The core model consists of linear correlations between the state and control variables that describe fuel-assembly behavior in time and space. These correlations are obtained through time-dependent two-dimensional core simulations. The core model incorporates the effects of composition changes in all the enrichment control zones on a given fuel assembly and is valid at all times during the cycle for a given range of control variables. No assumption is made on the geometry of the control zones. Either a scattered or an annular composition distribution can be considered for model construction. The application of the methodology to a typical PWR core indicates good agreement between the model and exact simulation results.
Mejia, Amanda F; Nebel, Mary Beth; Barber, Anita D; Choe, Ann S; Pekar, James J; Caffo, Brian S; Lindquist, Martin A
2018-05-15
Reliability of subject-level resting-state functional connectivity (FC) is determined in part by the statistical techniques employed in its estimation. Methods that pool information across subjects to inform estimation of subject-level effects (e.g., Bayesian approaches) have been shown to enhance reliability of subject-level FC. However, fully Bayesian approaches are computationally demanding, while empirical Bayesian approaches typically rely on using repeated measures to estimate the variance components in the model. Here, we avoid the need for repeated measures by proposing a novel measurement error model for FC describing the different sources of variance and error, which we use to perform empirical Bayes shrinkage of subject-level FC towards the group average. In addition, since the traditional intra-class correlation coefficient (ICC) is inappropriate for biased estimates, we propose a new reliability measure denoted the mean squared error intra-class correlation coefficient (ICC_MSE) to properly assess the reliability of the resulting (biased) estimates. We apply the proposed techniques to test-retest resting-state fMRI data on 461 subjects from the Human Connectome Project to estimate connectivity between 100 regions identified through independent components analysis (ICA). We consider both correlation and partial correlation as the measure of FC and assess the benefit of shrinkage for each measure, as well as the effects of scan duration. We find that shrinkage estimates of subject-level FC exhibit substantially greater reliability than traditional estimates across various scan durations, even for the most reliable connections and regardless of connectivity measure. Additionally, we find partial correlation reliability to be highly sensitive to the choice of penalty term, and to be generally worse than that of full correlations except for certain connections and a narrow range of penalty values. This suggests that the penalty needs to be chosen carefully.
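The shrinkage step itself is simple once the variance components are in hand: each subject's connectivity estimate is pulled toward the group mean in proportion to how noisy it is. A schematic sketch only; the measurement error model that actually estimates the variance components is the paper's contribution and is not reproduced here (the variances below are assumed known):

```python
import numpy as np

def shrink_toward_group(subject_fc, group_mean_fc, var_within, var_between):
    """EB shrinkage of connectivity estimates toward the group average.

    subject_fc, group_mean_fc : arrays of FC estimates (e.g. Fisher-z values)
    var_within  : scan-to-scan noise variance (assumed known here)
    var_between : true between-subject variance (assumed known here)
    The weight lambda grows with within-subject noise.
    """
    lam = var_within / (var_within + var_between)
    return lam * group_mean_fc + (1.0 - lam) * subject_fc

subject = np.array([0.80, 0.10, -0.30])
group = np.array([0.50, 0.20, -0.10])
shrunk = shrink_toward_group(subject, group, var_within=0.03, var_between=0.01)
```

Each shrunk value lies between the subject's raw estimate and the group mean; the noisier the measurement, the closer it sits to the group.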
Hatfield, L.A.; Gutreuter, S.; Boogaard, M.A.; Carlin, B.P.
2011-01-01
Estimation of extreme quantal-response statistics, such as the concentration required to kill 99.9% of test subjects (LC99.9), remains a challenge in the presence of multiple covariates and complex study designs. Accurate and precise estimates of the LC99.9 for mixtures of toxicants are critical to ongoing control of a parasitic invasive species, the sea lamprey, in the Laurentian Great Lakes of North America. The toxicity of those chemicals is affected by local and temporal variations in water chemistry, which must be incorporated into the modeling. We develop multilevel empirical Bayes models for data from multiple laboratory studies. Our approach yields more accurate and precise estimation of the LC99.9 compared to alternative models considered. This study demonstrates that properly incorporating hierarchical structure in laboratory data yields better estimates of LC99.9 stream treatment values that are critical to larvae control in the field. In addition, out-of-sample prediction of the results of in situ tests reveals the presence of a latent seasonal effect not manifest in the laboratory studies, suggesting avenues for future study and illustrating the importance of dual consideration of both experimental and observational data. © 2011, The International Biometric Society.
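Once a dose-response model has been fitted on the logit scale, any extreme quantile such as the LC99.9 follows by inverting the linear predictor; the hierarchical model in the paper effectively does this per water-chemistry condition. A bare-bones single-level sketch with invented coefficients:

```python
import math

def lc_from_logit(beta0, beta1, p):
    """Invert logit(P(kill)) = beta0 + beta1 * log10(conc) to find the
    log10 concentration achieving kill probability p."""
    return (math.log(p / (1.0 - p)) - beta0) / beta1

# Hypothetical fitted coefficients on log10 concentration:
log_lc50 = lc_from_logit(-4.0, 2.0, 0.50)   # logit(0.5) = 0
log_lc999 = lc_from_logit(-4.0, 2.0, 0.999)
```

The LC99.9 sits far out in the tail of the fitted curve, which is why it is so sensitive to the slope estimate and benefits from the pooled, hierarchical estimation the paper describes.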
Directory of Open Access Journals (Sweden)
Simone Vincenzi
2014-09-01
The differences in demographic and life-history processes between organisms living in the same population have important consequences for ecological and evolutionary dynamics. Modern statistical and computational methods allow the investigation of individual and shared (among homogeneous groups) determinants of the observed variation in growth. We use an Empirical Bayes approach to estimate individual and shared variation in somatic growth using a von Bertalanffy growth model with random effects. To illustrate the power and generality of the method, we consider two populations of marble trout Salmo marmoratus living in Slovenian streams, where individually tagged fish have been sampled for more than 15 years. We use year-of-birth cohort, population density during the first year of life, and individual random effects as potential predictors of the von Bertalanffy growth function's parameters k (rate of growth) and L∞ (asymptotic size). Our results showed that size ranks were largely maintained throughout marble trout lifetime in both populations. According to the Akaike Information Criterion (AIC), the best models showed different growth patterns for year-of-birth cohorts as well as the existence of substantial individual variation in growth trajectories after accounting for the cohort effect. For both populations, models including density during the first year of life showed that growth tended to decrease with increasing population density early in life. Model validation showed that predictions of individual growth trajectories using the random-effects model were more accurate than predictions based on mean size-at-age of fish.
Vincenzi, Simone; Mangel, Marc; Crivelli, Alain J; Munch, Stephan; Skaug, Hans J
2014-09-01
The differences in demographic and life-history processes between organisms living in the same population have important consequences for ecological and evolutionary dynamics. Modern statistical and computational methods allow the investigation of individual and shared (among homogeneous groups) determinants of the observed variation in growth. We use an Empirical Bayes approach to estimate individual and shared variation in somatic growth using a von Bertalanffy growth model with random effects. To illustrate the power and generality of the method, we consider two populations of marble trout Salmo marmoratus living in Slovenian streams, where individually tagged fish have been sampled for more than 15 years. We use year-of-birth cohort, population density during the first year of life, and individual random effects as potential predictors of the von Bertalanffy growth function's parameters k (rate of growth) and L∞ (asymptotic size). Our results showed that size ranks were largely maintained throughout marble trout lifetime in both populations. According to the Akaike Information Criterion (AIC), the best models showed different growth patterns for year-of-birth cohorts as well as the existence of substantial individual variation in growth trajectories after accounting for the cohort effect. For both populations, models including density during the first year of life showed that growth tended to decrease with increasing population density early in life. Model validation showed that predictions of individual growth trajectories using the random-effects model were more accurate than predictions based on mean size-at-age of fish.
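The fixed part of the growth model is the von Bertalanffy function; in the paper, k and L∞ additionally carry cohort effects and individual random effects. The deterministic curve itself:

```python
import math

def von_bertalanffy(age, l_inf, k, t0=0.0):
    """Expected length at a given age.

    l_inf : asymptotic size; k : growth rate; t0 : theoretical age at length 0.
    In a random-effects version, l_inf and k would receive per-individual
    (and per-cohort) deviations, typically on the log scale.
    """
    return l_inf * (1.0 - math.exp(-k * (age - t0)))
```

Length increases monotonically with age and saturates at l_inf, which is why estimating individual deviations in k and L∞ captures both early growth rate and ultimate size differences between fish.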
Directory of Open Access Journals (Sweden)
Jo Nishino
2018-04-01
Genome-wide association studies (GWAS) suggest that the genetic architecture of complex diseases consists of unexpectedly numerous variants with small effect sizes. However, the polygenic architectures of many diseases have not been well characterized due to lack of simple and fast methods for unbiased estimation of the underlying proportion of disease-associated variants and their effect-size distribution. Applying empirical Bayes estimation of semi-parametric hierarchical mixture models to GWAS summary statistics, we confirmed that schizophrenia was extremely polygenic [~40% of independent genome-wide SNPs are risk variants, most within odds ratio (OR = 1.03)], whereas rheumatoid arthritis was less polygenic (~4 to 8% risk variants, a significant portion reaching OR = 1.05 to 1.1). For rheumatoid arthritis, stratified estimations revealed that expression quantitative trait loci in blood explained a large genetic variance, and low- and high-frequency derived alleles were prone to be risk and protective, respectively, suggesting a predominance of deleterious-risk and advantageous-protective mutations. Despite genetic correlation, effect-size distributions for schizophrenia and bipolar disorder differed across allele frequency. These analyses distinguished disease polygenic architectures and provided clues for etiological differences in complex diseases.
Monteys, Xavier; Harris, Paul; Caloca, Silvia
2014-05-01
The coastal shallow water zone can be a challenging and expensive environment within which to acquire bathymetry and other oceanographic data using traditional survey methods. Dangers and limited swath coverage make some of these areas unfeasible to survey using ship-borne systems, and turbidity can preclude marine LIDAR. As a result, an extensive part of the coastline worldwide remains completely unmapped. Satellite EO multispectral data, after processing, allow timely, cost-efficient and quality-controlled information to be used for planning, monitoring, and regulating coastal environments. They have the potential to deliver repetitive derivation of medium-resolution bathymetry, coastal water properties and seafloor characteristics in shallow waters. Over the last 30 years satellite passive imaging methods for bathymetry extraction, implementing analytical or empirical methods, have had limited success in predicting water depths. Different wavelengths of the solar light penetrate the water column to varying depths. They can provide acceptable results up to 20 m but become less accurate in deeper waters. The study area is located in the inner part of Dublin Bay, on the East coast of Ireland. The region investigated is a C-shaped inlet covering an area 10 km long and 5 km wide with water depths ranging from 0 to 10 m. The methodology employed in this research uses a ratio of reflectances from SPOT 5 satellite bands, differing from standard linear transform algorithms. High-accuracy water depths were derived using multibeam data. The final empirical model uses spatially weighted geographical tools to retrieve predicted depths. The results of this paper confirm that SPOT satellite scenes are suitable for predicting depths using empirical models in very shallow embayments. Spatial regression models show better adjustment in the predictions than non-spatial models. The spatial regression equation used provides realistic results down to 6 m below the water surface, with
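A common empirical form for ratio-based depth retrieval (e.g. Stumpf's log-ratio algorithm) regresses depth on the ratio of log-transformed band reflectances, with coefficients calibrated against multibeam soundings; the abstract's model additionally uses spatially weighted regression, which is not reproduced here. A non-spatial sketch with synthetic reflectances:

```python
import numpy as np

def fit_log_ratio_depth(r_blue, r_green, depth, n=1000.0):
    """Fit depth = m1 * ln(n*R_blue)/ln(n*R_green) + m0 by least squares.

    n is a fixed scaling constant keeping both logarithms positive.
    Returns (m1, m0); a new depth prediction is m1 * x + m0 for ratio x.
    """
    x = np.log(n * r_blue) / np.log(n * r_green)
    m1, m0 = np.polyfit(x, depth, 1)
    return m1, m0

# Synthetic calibration data generated to follow the model exactly:
rb = np.array([0.02, 0.05, 0.08, 0.12])
rg = np.array([0.03, 0.06, 0.09, 0.11])
x = np.log(1000.0 * rb) / np.log(1000.0 * rg)
m1, m0 = fit_log_ratio_depth(rb, rg, 5.0 * x - 2.0)
```

On real imagery the fit is against co-located sonar depths, and the residual structure is exactly what motivates the spatially weighted variant described in the abstract.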
International Nuclear Information System (INIS)
Da Silva Pinto, P.S.; Eustache, R.P.; Audenaert, M.; Bernassau, J.M.
1996-01-01
This work deals with empirical calculation of carbon-13 nuclear magnetic resonance chemical shifts by multiple linear regression and molecular modeling. Multiple linear regression is one way to obtain an equation able to describe the behaviour of the chemical shift for the molecules in the data base (rigid molecules with carbons). The methodology consists of defining descriptor parameters for the structures that can be related to the known carbon-13 chemical shifts of these molecules. Linear regression is then used to determine the significant parameters of the equation. The resulting equation can be extrapolated to molecules that present some resemblance to those of the data base. (O.L.). 20 refs., 4 figs., 1 tab
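The regression step described above is an ordinary multiple linear least-squares fit of known shifts against structural descriptor values. A sketch with entirely hypothetical descriptors and shift values:

```python
import numpy as np

# Rows: molecules in the data base; columns: structural descriptors
# (hypothetical placeholders, e.g. substituent counts). Shifts in ppm.
descriptors = np.array([
    [1.0, 0.0, 2.0],
    [0.0, 1.0, 1.0],
    [2.0, 1.0, 0.0],
    [1.0, 2.0, 1.0],
    [0.0, 0.0, 3.0],
])
shifts = np.array([25.1, 18.4, 33.0, 28.7, 15.2])

# Add an intercept column and solve the least-squares problem.
X = np.column_stack([np.ones(len(shifts)), descriptors])
coef, residuals, rank, _ = np.linalg.lstsq(X, shifts, rcond=None)
predicted = X @ coef  # shifts back-predicted for the data base molecules
```

For a molecule outside the data base but with comparable structure, the same descriptor vector (with intercept) dotted with `coef` gives the extrapolated shift.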
Prinyakupt, Jaroonrut; Pluempitiwiriyawej, Charnchai
2015-06-30
Blood smear microscopic images are routinely investigated by haematologists to diagnose most blood diseases. However, the task is quite tedious and time consuming. An automatic detection and classification of white blood cells within such images can accelerate the process tremendously. In this paper we propose a system to locate white blood cells within microscopic blood smear images, segment them into nucleus and cytoplasm regions, extract suitable features and finally, classify them into five types: basophil, eosinophil, neutrophil, lymphocyte and monocyte. Two sets of blood smear images were used in this study's experiments. Dataset 1, collected from Rangsit University, consisted of normal peripheral blood slides under a light microscope with 100× magnification; 555 images with 601 white blood cells were captured by a Nikon DS-Fi2 high-definition color camera and saved in JPG format of size 960 × 1,280 pixels at 15 pixels per 1 μm resolution. In dataset 2, 477 cropped white blood cell images were downloaded from CellaVision.com. They are in JPG format of size 360 × 363 pixels. The resolution is estimated to be 10 pixels per 1 μm. The proposed system comprises a pre-processing step, nucleus segmentation, cell segmentation, feature extraction, feature selection and classification. The main concept of the segmentation algorithm employed uses white blood cells' morphological properties and the calibrated size of a real cell relative to image resolution. The segmentation process combined thresholding, morphological operations and ellipse curve fitting. Consequently, several features were extracted from the segmented nucleus and cytoplasm regions. Prominent features were then chosen by a greedy search algorithm called sequential forward selection. Finally, with a set of selected prominent features, both linear and naïve Bayes classifiers were applied for performance comparison. This system was tested on normal peripheral blood smear slide images from two datasets.
An Analytical-empirical Calculation of Linear Attenuation Coefficient of Megavoltage Photon Beams.
Seif, F; Tahmasebi-Birgani, M J; Bayatiani, M R
2017-09-01
In this study, a method for calculating the linear attenuation coefficient was introduced. The linear attenuation coefficient was calculated with a new method based on the physics of photon interaction with matter, mathematical calculation, and consideration of the x-ray spectrum. The calculation was done for Cerrobend, a common radiotherapy beam modifier, and for mercury. The linear attenuation coefficients calculated with this new method are in an acceptable range. Also, the linear attenuation coefficient decreases slightly as the thickness of the attenuating filter (Cerrobend or mercury) increases, so the trend of linear attenuation coefficient variation agrees with other reports. The results showed that the attenuation ability of mercury was about 1.44 times that of Cerrobend. The method introduced in this study for linear attenuation coefficient calculation is general enough to treat beam modifiers of any shape or material using the same formalism; however, calculations were made only for mercury and Cerrobend attenuators. On the other hand, it seems that this method is suitable for designing high-energy shields or protectors.
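For a narrow monoenergetic beam, the linear attenuation coefficient follows directly from the exponential attenuation law I = I0·exp(−μx); the paper's contribution is handling a full megavoltage spectrum rather than a single energy. The single-energy relation, for context:

```python
import math

def linear_attenuation_coefficient(i0, i, thickness_cm):
    """mu (1/cm) from incident intensity i0, transmitted intensity i,
    and absorber thickness x, via I = I0 * exp(-mu * x)."""
    return math.log(i0 / i) / thickness_cm

# One half-value layer of material: transmission drops to 50%,
# so mu * x = ln(2).
mu = linear_attenuation_coefficient(100.0, 50.0, 1.0)
```

With a polyenergetic beam, beam hardening makes the effective mu drop with filter thickness, which is the slight decrease the abstract reports.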
An Empirical Comparison of Five Linear Equating Methods for the NEAT Design
Suh, Youngsuk; Mroch, Andrew A.; Kane, Michael T.; Ripkey, Douglas R.
2009-01-01
In this study, a data base containing the responses of 40,000 candidates to 90 multiple-choice questions was used to mimic data sets for 50-item tests under the "nonequivalent groups with anchor test" (NEAT) design. Using these smaller data sets, we evaluated the performance of five linear equating methods for the NEAT design with five levels of…
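For context, the simplest member of this family is the basic linear (mean-sigma) equating function, which matches the first two score moments of the two forms; the NEAT-design methods the study compares (e.g. Tucker, Levine) differ in how they use the anchor test to estimate those moments for a common synthetic population. A sketch of the basic form only:

```python
import numpy as np

def linear_equate(x_scores, y_scores, x):
    """Map a score x on form X to the form-Y scale by matching means and SDs.
    NEAT-design methods would replace these observed moments with
    anchor-adjusted synthetic-population moments."""
    mx, sx = np.mean(x_scores), np.std(x_scores)
    my, sy = np.mean(y_scores), np.std(y_scores)
    return my + (sy / sx) * (x - mx)

# Toy score distributions: form Y is harder (lower mean).
x_grp = np.array([30.0, 35.0, 40.0, 45.0, 50.0])
y_grp = np.array([28.0, 32.0, 36.0, 40.0, 44.0])
equated = linear_equate(x_grp, y_grp, 40.0)  # the mean of X maps to the mean of Y
```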
Directory of Open Access Journals (Sweden)
Vincenzo Capizzi
2013-05-01
This paper is aimed at identifying and analyzing the contribution of the major drivers of the performance of informal venture capitalists’ investments. The study analyzes data on Italian transactions and personal features of Italian business angels gathered during 2007–2011 with the support of IBAN (Italian Business Angels Network). The econometric analysis investigates the returns of business angels’ investments and their major determinants (industry, exit strategy, experience, holding period, rejection rate, and year of divestiture). The major results are the following: (1) differently from previous literature, the relationship between experience and IRR is quadratic and significant; (2) for the first time, quantitative data confirm that short holding periods (below 3 years) earn a lower IRR; (3) the rejection rate effect is logarithmic and its impact on IRR is positive and significant. Finally, the outcomes of the empirical analysis performed in this study allow identifying new and concrete insights on possible policy interventions.
Chen, Baojiang; Qin, Jing
2014-05-10
In statistical analysis, a regression model is needed if one is interested in finding the relationship between a response variable and covariates. When the response depends on a covariate, it may depend on it through some function of that covariate. If one has no knowledge of this functional form but expects it to be monotonically increasing or decreasing, then the isotonic regression model is preferable. Estimation of parameters for isotonic regression models is based on the pool-adjacent-violators algorithm (PAVA), where the monotonicity constraints are built in. With missing data, people often employ the augmented estimating method to improve estimation efficiency by incorporating auxiliary information through a working regression model. However, under the framework of the isotonic regression model, the PAVA does not work as the monotonicity constraints are violated. In this paper, we develop an empirical likelihood-based method for the isotonic regression model to incorporate the auxiliary information. Because the monotonicity constraints still hold, the PAVA can be used for parameter estimation. Simulation studies demonstrate that the proposed method can yield more efficient estimates, and in some situations, the efficiency improvement is substantial. We apply this method to a dementia study. Copyright © 2013 John Wiley & Sons, Ltd.
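The PAVA step referred to above has a compact stack-based implementation: scan the responses left to right and, whenever monotonicity is violated, pool adjacent blocks into their weighted mean. A minimal sketch (unit weights by default):

```python
def pava(y, weights=None):
    """Pool-adjacent-violators: weighted least-squares nondecreasing fit to y."""
    w = [1.0] * len(y) if weights is None else [float(v) for v in weights]
    vals, wts, cnts = [], [], []          # blocks: value, weight, run length
    for yi, wi in zip(y, w):
        vals.append(float(yi)); wts.append(wi); cnts.append(1)
        # Merge backwards while the last two blocks violate monotonicity.
        while len(vals) > 1 and vals[-2] > vals[-1]:
            pooled_w = wts[-2] + wts[-1]
            vals[-2] = (wts[-2] * vals[-2] + wts[-1] * vals[-1]) / pooled_w
            wts[-2] = pooled_w
            cnts[-2] += cnts[-1]
            vals.pop(); wts.pop(); cnts.pop()
    out = []
    for v, c in zip(vals, cnts):
        out.extend([v] * c)               # expand blocks back to full length
    return out
```

For example, `pava([1, 3, 2, 4])` pools the violating pair (3, 2) into their mean 2.5, leaving a nondecreasing fit.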
Scholz, Stefan; Graf von der Schulenburg, Johann-Matthias; Greiner, Wolfgang
2015-11-17
Regional differences in physician supply can be found in many health care systems, regardless of their organizational and financial structure. A theoretical model is developed for physicians' decisions on office location, covering demand-side factors and a consumption time function. To test the propositions following from the theoretical model, generalized linear models were estimated to explain differences across 412 German districts. Various factors found in the literature were included to control for physicians' regional preferences. Evidence in favor of the first three propositions of the theoretical model could be found. Specialists show a stronger association with more highly populated districts than GPs. Although indicators for regional preferences are significantly correlated with physician density, their coefficients are not as high as that of population density. If regional disparities are to be addressed by political action, the focus should be on counteracting those parameters representing physicians' preferences in over- and undersupplied regions.
Solvent effects in ionic liquids: empirical linear energy-density relationships.
Cerda-Monje, A; Aizman, A; Tapia, R A; Chiappe, C; Contreras, R
2012-07-28
Multiparameter linear energy-density relationships to model solvent effects in room temperature ionic liquids (RTILs) are introduced and tested. The model incorporates two solvent dependent and two specific solute-solvent parameters represented by a set of electronic indexes derived from the conceptual density functional theory. The specific solute-solvent interactions are described in terms of the electronic chemical potential for proton migration between the anion or cation and the transition state structure of a specific reaction. These indexes provide a quantitative estimation of the hydrogen bond (HB) acceptor basicity and the hydrogen bond donor acidity of the ionic solvent, respectively. A sound quantitative scale of HB strength is thereby obtained. The solvent dependent contributions are described by the global electrophilicity of the cation and nucleophilicity of the anion forming the ionic liquid. The model is illustrated for the kinetics of cycloaddition of cyclopentadiene towards acrolein. In general, cation HB acidity outweighs the remaining parameters for this reaction.
Directory of Open Access Journals (Sweden)
Amin Foroughi
2012-04-01
Road construction projects are considered among the most important governmental issues, since heavy investments are normally required in such projects. There is also a shortage of financial resources in the governmental budget, which makes asset allocation more challenging. One primary step in reducing cost is to determine the different risks associated with the execution of such project activities. In this study, we present important risk factors associated with road construction, organized in two levels, for a real-world case study in the rail-road industry located between the two cities of Esfahan and Deligan. The first group of risk factors includes the probability and the effects for various attributes including cost, time, quality and performance. The second group includes socio-economic factors as well as political and managerial aspects. The study finds 21 main risk factors and 193 sub risk factors. The factors are ranked using a group decision-making method called linear assignment. The preliminary results indicate that road construction projects could finish faster and with better outcomes if risk factors are carefully considered and their impacts reduced.
Multi-locus genome-wide association studies have become the state-of-the-art procedure to identify quantitative trait loci (QTL) associated with traits simultaneously. However, implementation of multi-locus models is still difficult. In this study, we integrated least angle regression with empirical B...
Directory of Open Access Journals (Sweden)
Jens Clausen
2010-06-01
This paper discusses the sustainability impact (contribution to sustainability, reduction of adverse environmental impacts) of online second-hand trading. A survey of eBay users shows that a relationship between the trading of used goods and the protection of natural resources is hardly recognized. Secondly, environmental motivation and the willingness to act in a sustainable manner differ widely between groups of consumers. Given these results from a user perspective, the paper tries to find objective indications of online second-hand trading's environmental impact. The greenhouse gas emissions resulting from the energy used for the trading transactions appear to be considerably lower than the emissions due to the (avoided) production of new goods. The paper concludes with a set of recommendations for second-hand trade and consumer policy. Information about the sustainability benefits of purchasing second-hand goods should be included in general consumer information, and arguments for changes in behavior should be targeted to different groups of consumers.
Boomer, Kathleen B; Weller, Donald E; Jordan, Thomas E
2008-01-01
The Universal Soil Loss Equation (USLE) and its derivatives are widely used for identifying watersheds with a high potential for degrading stream water quality. We compared sediment yields estimated from regional application of the USLE, the automated revised RUSLE2, and five sediment delivery ratio algorithms to measured annual average sediment delivery in 78 catchments of the Chesapeake Bay watershed. We did the same comparisons for another 23 catchments monitored by the USGS. Predictions exceeded observed sediment yields by more than 100% and were highly correlated with USLE erosion predictions (Pearson r range, 0.73-0.92) and with USLE estimates (r = 0.87); substituting RUSLE2 for the USLE model did not change the results. In ranked comparisons between observed and predicted sediment yields, the models failed to identify catchments with higher yields (r range, -0.28-0.00; p > 0.14). In a multiple regression analysis, soil erodibility, log(stream flow), basin shape (topographic relief ratio), the square-root transformed proportion of forest, and occurrence in the Appalachian Plateau province explained 55% of the observed variance in measured suspended sediment loads, but the model performed poorly (r(2) = 0.06) at predicting loads in the 23 USGS watersheds not used in fitting the model. The use of USLE or multiple regression models to predict sediment yields is not advisable despite their present widespread application. Integrated watershed models based on the USLE may also be unsuitable for making management decisions.
International Nuclear Information System (INIS)
David, B.; McNiven, I.J.; Leavesley, M.; Barker, B.; Mandui, H.; Richards, T.; Skelly, R.
2012-01-01
This paper reports on the ceramics from Squares A and B of Bogi 1, a newly excavated site at Caution Bay, south coast of mainland Papua New Guinea. A dense cultural horizon dated from c. 2150 to c. 2100 calBP and preceded by earlier cultural deposits contains previously undescribed ceramics of limited decorative variability almost exclusively focused on Anadara shell edge impressions below finger-grooved lips, which we term the Linear Shell Edge-Impressed Tradition. Here we present the chrono-stratigraphic evidence for this decorative tradition and how it relates to previously described shell-impressed ceramics from the broader region. (author). 16 refs., 9 figs., 3 tabs.
Ren, Wen-Long; Wen, Yang-Jun; Dunwell, Jim M; Zhang, Yuan-Ming
2018-03-01
Although nonparametric methods in genome-wide association studies (GWAS) are robust in quantitative trait nucleotide (QTN) detection, the absence of polygenic background control in single-marker association in genome-wide scans results in a high false positive rate. To overcome this issue, we proposed an integrated nonparametric method for multi-locus GWAS. First, a new model transformation was used to whiten the covariance matrix of the polygenic matrix K and environmental noise. Using the transformed model, a Kruskal-Wallis test along with least angle regression was then used to select all the markers that were potentially associated with the trait. Finally, all the selected markers were placed into a multi-locus model, their effects were estimated by empirical Bayes, and all the nonzero effects were further identified by a likelihood ratio test for true QTN detection. This method, named pKWmEB, was validated by a series of Monte Carlo simulation studies. As a result, pKWmEB effectively controlled the false positive rate, even though a less stringent significance criterion was adopted. More importantly, pKWmEB retained the high power of the Kruskal-Wallis test and provided QTN effect estimates. To further validate pKWmEB, we re-analyzed four flowering time related traits in Arabidopsis thaliana, and detected some previously reported genes that were not identified by the other methods.
International Nuclear Information System (INIS)
Conover, W.J.; Cox, D.D.; Martz, H.F.
1997-12-01
When using parametric empirical Bayes estimation methods for estimating the binomial or Poisson parameter, the validity of the assumed beta or gamma conjugate prior distribution is an important diagnostic consideration. Chi-square goodness-of-fit tests of the beta or gamma prior hypothesis are developed for use when the binomial sample sizes or Poisson exposure times vary. Nine examples illustrate the application of the methods, using real data from such diverse applications as the loss of feedwater flow rates in nuclear power plants, the probability of failure to run on demand and the failure rates of the high pressure coolant injection systems at US commercial boiling water reactors, the probability of failure to run on demand of emergency diesel generators in US commercial nuclear power plants, the rate of failure of aircraft air conditioners, baseball batting averages, the probability of testing positive for toxoplasmosis, and the probability of tumors in rats. The tests are easily applied in practice by means of corresponding Mathematica® computer programs, which are provided.
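The beta-binomial empirical Bayes setup whose prior these tests diagnose can be illustrated with a minimal method-of-moments sketch. This is our own simplified illustration assuming equal sample sizes (the paper's tests specifically handle varying sizes, which this sketch does not), and the function names are ours:

```python
def fit_beta_mom(p_hats):
    """Method-of-moments Beta(a, b) prior fitted to observed rates p_hats."""
    k = len(p_hats)
    m = sum(p_hats) / k                                  # prior mean estimate
    v = sum((p - m) ** 2 for p in p_hats) / (k - 1)      # between-unit variance
    common = m * (1 - m) / v - 1                         # implied a + b
    return m * common, (1 - m) * common

def eb_shrink(x, n, a, b):
    """Posterior mean of the binomial parameter under a Beta(a, b) prior."""
    return (x + a) / (n + a + b)
```

A unit whose raw rate equals the prior mean is left unchanged, while extreme raw rates are pulled toward it; the strength of the pull grows with a + b relative to n.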
Hatfield, Laura A.; Gutreuter, Steve; Boogaard, Michael A.; Carlin, Bradley P.
2011-01-01
Estimation of extreme quantal-response statistics, such as the concentration required to kill 99.9% of test subjects (LC99.9), remains a challenge in the presence of multiple covariates and complex study designs. Accurate and precise estimates of the LC99.9 for mixtures of toxicants are critical to ongoing control of a parasitic invasive species, the sea lamprey, in the Laurentian Great Lakes of North America. The toxicity of those chemicals is affected by local and temporal variations in water chemistry, which must be incorporated into the modeling. We develop multilevel empirical Bayes models for data from multiple laboratory studies. Our approach yields more accurate and precise estimation of the LC99.9 compared to alternative models considered. This study demonstrates that properly incorporating hierarchical structure in laboratory data yields better estimates of LC99.9 stream treatment values that are critical to larvae control in the field. In addition, out-of-sample prediction of the results of in situ tests reveals the presence of a latent seasonal effect not manifest in the laboratory studies, suggesting avenues for future study and illustrating the importance of dual consideration of both experimental and observational data.
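Once a quantal-response model has been fitted, an extreme statistic such as the LC99.9 follows by inverting the link function. The sketch below assumes a simple single-covariate logit on log10(dose); this is a minimal illustration, not the authors' multilevel Bayesian model, and the parameterization and names are ours:

```python
import math

def kill_prob(dose, beta0, beta1):
    """Logistic dose-response: P(kill) under logit(p) = beta0 + beta1*log10(dose)."""
    return 1.0 / (1.0 + math.exp(-(beta0 + beta1 * math.log10(dose))))

def lc(p, beta0, beta1):
    """Dose achieving kill probability p, e.g. p = 0.999 for the LC99.9."""
    logit = math.log(p / (1.0 - p))
    return 10.0 ** ((logit - beta0) / beta1)
```

Because the logit of 0.999 sits far in the tail, small uncertainty in beta1 translates into large uncertainty in the LC99.9, which is why the hierarchical pooling described above pays off.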
Xu, Xu Steven; Yuan, Min; Yang, Haitao; Feng, Yan; Xu, Jinfeng; Pinheiro, Jose
2017-01-01
Covariate analysis based on population pharmacokinetics (PPK) is used to identify clinically relevant factors. The likelihood ratio test (LRT) based on nonlinear mixed effect model fits is currently recommended for covariate identification, whereas individual empirical Bayesian estimates (EBEs) are considered unreliable due to the presence of shrinkage. The objectives of this research were to investigate the type I error for LRT and EBE approaches, to confirm the similarity of power between the LRT and EBE approaches from a previous report and to explore the influence of shrinkage on LRT and EBE inferences. Using an oral one-compartment PK model with a single covariate impacting on clearance, we conducted a wide range of simulations according to a two-way factorial design. The results revealed that the EBE-based regression not only provided almost identical power for detecting a covariate effect, but also controlled the false positive rate better than the LRT approach. Shrinkage of EBEs is likely not the root cause for decrease in power or inflated false positive rate although the size of the covariate effect tends to be underestimated at high shrinkage. In summary, contrary to the current recommendations, EBEs may be a better choice for statistical tests in PPK covariate analysis compared to LRT. We proposed a three-step covariate modeling approach for population PK analysis to utilize the advantages of EBEs while overcoming their shortcomings, which allows not only markedly reducing the run time for population PK analysis, but also providing more accurate covariate tests.
Froeschke, John T.; Stunz, Gregory W.; Sterba-Boatwright, Blair; Wildhaber, Mark L.
2010-01-01
Using a long-term fisheries-independent data set, we tested the 'shark nursery area concept' proposed by Heupel et al. (2007) with the suggested working assumptions that a shark nursery habitat would: (1) have an abundance of immature sharks greater than the mean abundance across all habitats where they occur; (2) be used by sharks repeatedly through time (years); and (3) see immature sharks remaining within the habitat for extended periods of time. We tested this concept using young-of-the-year (age 0) and juvenile (age 1+ yr) bull sharks Carcharhinus leucas from gill-net surveys conducted in Texas bays from 1976 to 2006 to estimate the potential nursery function of 9 coastal bays. Of the 9 bay systems considered as potential nursery habitat, only Matagorda Bay satisfied all 3 criteria for young-of-the-year bull sharks. Both Matagorda and San Antonio Bays met the criteria for juvenile bull sharks. Through these analyses we examined the utility of this approach for characterizing nursery areas and we also describe some practical considerations, such as the influence of the temporal or spatial scales considered when applying the nursery role concept to shark populations.
International Nuclear Information System (INIS)
Martins, César C.; Cabral, Ana Caroline; Barbosa-Cintra, Scheyla C.T.; Dauner, Ana Lúcia L.; Souza, Fernanda M.
2014-01-01
Babitonga Bay is a South Atlantic estuary with significant ecological function; it is part of the last remaining areas of mangrove communities in the Southern Hemisphere. The aim of this study was to determine the spatial distribution of faecal sterols and linear alkylbenzenes (LABs) in surface sediments and to perform an integrated evaluation of several molecular marker indices to assess the sewage contamination status in the study area. The highest observed concentrations of faecal sterols (coprostanol + epicoprostanol) and LABs were 6.65 μg g⁻¹ and 413.3 ng g⁻¹, respectively. Several faecal sterol indices were calculated and correlated with coprostanol levels; these analyses showed that the index limits presented in the current literature could underestimate the sewage contamination in this study area. For the overall estuarine system, a low sewage impact may be assumed based on the low total mass inventories calculated for coprostanol (between 1.4% and 4.8%). - Highlights: • Sewage contamination in a South Atlantic estuary was confirmed by molecular markers. • Faecal sterol indices were established as indicators of sewage contamination. • Estimates of the total mass inventory of coprostanol and LABs are presented. • Faecal sterols are preferable to LABs for the evaluation of sewage inputs in this study area. • Faecal sterol index limits were established for a subtropical environment to ensure a more precise and reliable assessment of sewage contamination.
Žvokelj, Matej; Zupan, Samo; Prebil, Ivan
2011-10-01
The article presents a novel non-linear multivariate and multiscale statistical process monitoring and signal denoising method which combines the strengths of the Kernel Principal Component Analysis (KPCA) non-linear multivariate monitoring approach with the benefits of Ensemble Empirical Mode Decomposition (EEMD) to handle multiscale system dynamics. The proposed method, which enables us to cope with complex, even severely non-linear systems with a wide dynamic range, was named EEMD-based multiscale KPCA (EEMD-MSKPCA). The method is quite general in nature and could be used in different areas for various tasks, even without any really deep understanding of the nature of the system under consideration. Its efficiency was first demonstrated by an illustrative example, after which the applicability for the task of bearing fault detection, diagnosis and signal denoising was tested on simulated as well as actual vibration and acoustic emission (AE) signals measured on a purpose-built large-size low-speed bearing test stand. The positive results obtained indicate that the proposed EEMD-MSKPCA method provides a promising tool for tackling non-linear multiscale data which present a convolved picture of many events occupying different regions in the time-frequency plane.
Directory of Open Access Journals (Sweden)
Angélica Garzón-Umerenkova
2018-04-01
This research aimed to analyze the linear bivariate correlation and structural relations between self-regulation -as a central construct-, with flow, health, procrastination and academic performance, in an academic context. A total of 363 college students took part, 101 men (27.8%) and 262 women (72.2%). Participants had an average age of 22 years and were between the first and fifth year of studies. They were from five different programs and two universities in Bogotá city (Colombia). A validated ad hoc questionnaire of physical and psychological health was applied along with a battery of tests to measure self-regulation, procrastination, and flourishing. To establish an association relationship, Pearson bivariate correlations were performed using SPSS software (v. 22.0), and structural relationship predictive analysis was performed using an SEM on AMOS software (v. 22.0). Regarding this linear association, it was established that (1) self-regulation has a significant positive association on flourishing and overall health, and a negative effect on procrastination. Regarding the structural relation, it confirmed that (2) self-regulation is a direct and positive predictor of flourishing and health; (3) self-regulation predicts procrastination directly and negatively, and academic performance indirectly and positively; and (4) age and gender have a prediction effect on the analyzed variables. Implications, limitations and future research scope are discussed.
Garzón-Umerenkova, Angélica; de la Fuente, Jesús; Amate, Jorge; Paoloni, Paola V.; Fadda, Salvatore; Pérez, Javier Fiz
2018-01-01
This research aimed to analyze the linear bivariate correlation and structural relations between self-regulation -as a central construct-, with flow, health, procrastination and academic performance, in an academic context. A total of 363 college students took part, 101 men (27.8%) and 262 women (72.2%). Participants had an average age of 22 years and were between the first and fifth year of studies. They were from five different programs and two universities in Bogotá city (Colombia). A validated ad hoc questionnaire of physical and psychological health was applied along with a battery of tests to measure self-regulation, procrastination, and flourishing. To establish an association relationship, Pearson bivariate correlations were performed using SPSS software (v. 22.0), and structural relationship predictive analysis was performed using an SEM on AMOS software (v. 22.0). Regarding this linear association, it was established that (1) self-regulation has a significant positive association on flourishing and overall health, and a negative effect on procrastination. Regarding the structural relation, it confirmed that (2) self-regulation is a direct and positive predictor of flourishing and health; (3) self-regulation predicts procrastination directly and negatively, and academic performance indirectly and positively; and (4) age and gender have a prediction effect on the analyzed variables. Implications, limitations and future research scope are discussed.
Directory of Open Access Journals (Sweden)
Peng Nai
2016-03-01
A great number of immigrants reside permanently in the Yunnan border area of China. To some extent, these people qualify as refugees or immigrants under international rules, which significantly shapes the social diversity of this area. However, this kind of social diversity often impairs the social order. Therefore, research on local immigrant integration can positively influence local social governance. This essay attempts to acquire data on the living situation of these border-area immigrants and refugees. The social integration of refugees and immigrants in the Yunnan border area of China is then analyzed through multivariable linear regression modeling based on these data, in order to propose more achievable resolutions.
Angela Mihai, L.
2013-03-01
Finite element simulations of different shear deformations in non-linear elasticity are presented. We pay particular attention to the Poynting effects in hyperelastic materials, complementing recent theoretical findings by showing these effects manifested by specific models. As the finite element method computes uniform deformations exactly, for simple shear deformation and pure shear stress, the Poynting effect is represented exactly, while for the generalised shear and simple torsion, where the deformation is non-uniform, the solution is approximated efficiently and guaranteed computational bounds on the magnitude of the Poynting effect are obtained. The numerical results further indicate that, for a given elastic material, the same sign effect occurs under different shearing mechanisms, showing the genericity of the Poynting effect under a variety of shearing loads. In order to derive numerical models that exhibit either the positive or the negative Poynting effect, the so-called generalised empirical inequalities, which are less restrictive than the usual empirical inequalities involving material parameters, are assumed. © 2012 Elsevier Ltd.
International Nuclear Information System (INIS)
Harker, Y.D.
1976-01-01
A semi-empirical analytical expression representing a fast reactor neutron spectrum has been developed. This expression was used in a non-linear regression computer routine to obtain, from measured multiple-foil integral reaction data, the neutron spectrum inside the Coupled Fast Reactivity Measurement Facility. In this application, six parameters in the analytical expression for the neutron spectrum were adjusted in the non-linear fitting process to maximize consistency between calculated and measured integral reaction rates for a set of 15 dosimetry detector foils. In two-thirds of the observations the calculated integral agreed with its respective measured value to within the experimental standard deviation, and in all but one case agreement within two standard deviations was obtained. Based on this quality of fit, the estimated 70 to 75 percent confidence intervals for the derived spectrum are 10 to 20 percent for the energy range 100 eV to 1 MeV, 10 to 50 percent for 1 MeV to 10 MeV, and 50 to 90 percent for 10 MeV to 18 MeV. The analytical model has demonstrated the flexibility to describe salient features of neutron spectra of the fast reactor type. The use of regression analysis with this model has produced a stable method to derive neutron spectra from a limited amount of integral data.
Owen, Art B
2001-01-01
Empirical likelihood provides inferences whose validity does not depend on specifying a parametric model for the data. Because it uses a likelihood, the method has certain inherent advantages over resampling methods: it uses the data to determine the shape of the confidence regions, and it makes it easy to combine data from multiple sources. It also facilitates incorporating side information, and it simplifies accounting for censored, truncated, or biased sampling. One of the first books published on the subject, Empirical Likelihood offers an in-depth treatment of this method for constructing confidence regions and testing hypotheses. The author applies empirical likelihood to a range of problems, from those as simple as setting a confidence region for a univariate mean under IID sampling, to problems defined through smooth functions of means, regression models, generalized linear models, estimating equations, or kernel smooths, and to sampling with non-identically distributed data. Abundant figures offer vi...
Empirical Bayes methods in road safety research.
Vogelesang, R.A.W.
1997-01-01
Road safety research is a wonderful combination of counting fatal accidents and using a toolkit containing prior, posterior, overdispersed Poisson, negative binomial and Gamma distributions, together with positive and negative regression effects, shrinkage estimators and fierce debates concerning
Angela Mihai, L.; Goriely, Alain
2013-01-01
Finite element simulations of different shear deformations in non-linear elasticity are presented. We pay particular attention to the Poynting effects in hyperelastic materials, complementing recent theoretical findings by showing these effects
Wang, Zheng-Xin; Hao, Peng; Yao, Pei-Yi
2017-12-13
The non-linear relationship between provincial economic growth and carbon emissions is investigated by using panel smooth transition regression (PSTR) models. The research indicates that, on the condition of separately taking Gross Domestic Product per capita (GDPpc), energy structure (Es), and urbanisation level (Ul) as transition variables, three models all reject the null hypothesis of a linear relationship, i.e., a non-linear relationship exists. The results show that the three models all contain only one transition function but different numbers of location parameters. The model taking GDPpc as the transition variable has two location parameters, while the other two models separately considering Es and Ul as the transition variables both contain one location parameter. The three models applied in the study all favourably describe the non-linear relationship between economic growth and CO₂ emissions in China. It also can be seen that the conversion rate of the influence of Ul on per capita CO₂ emissions is significantly higher than those of GDPpc and Es on per capita CO₂ emissions.
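The smooth transition at the core of a PSTR model is governed by a logistic function of the transition variable with a slope parameter and one or more location parameters. The following is a minimal sketch of that standard PSTR transition function (names are ours, and this is the generic form rather than the authors' fitted specification):

```python
import math

def transition(q, gamma, locations):
    """Logistic PSTR transition g(q; gamma, c1..cm) in [0, 1].

    q: transition variable (e.g. GDP per capita); gamma: slope;
    locations: the m location parameters where regimes switch.
    """
    prod = 1.0
    for c in locations:
        prod *= (q - c)
    return 1.0 / (1.0 + math.exp(-gamma * prod))
```

With one location parameter (as for the Es and Ul models described above), the regression coefficients shift smoothly from one regime to the other as q crosses c; with two location parameters (the GDPpc model), the middle range behaves as one regime and both tails as the other.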
African Journals Online (AJOL)
user
2015-02-23
... surveys to assess the vulnerability of the most important physical and eutrophication parameters along the El-Mex Bay coast. As a result of increasing population and industrial development, poorly treated industrial waste, domestic sewage, shipping-industry discharges and agricultural runoff are being released to the ...
Institute of Scientific and Technical Information of China (English)
Nong Yue HE; Chun YANG; Jian Xin TANG; Peng Feng XIAO; Hong CHEN
2003-01-01
KL molecular sieves with different framework compositions were secondarily synthesized by substituting Si for Al with a solution of (NH4)2SiF6. The internal tetrahedron symmetric stretch frequency ν (near 770 cm⁻¹) is linear in the molar fraction of Al (X_Al = Al/(Si+Al)) in the framework of KL samples: X_Al = -7.309×10⁻³ (ν - 760) + 0.3242.
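As a small hypothetical helper (the function name is ours), the reported linear calibration can be evaluated directly, where nu is the measured band position in cm⁻¹:

```python
def x_al(nu):
    """Molar Al fraction X_Al = Al/(Si+Al) from the internal tetrahedron
    symmetric stretch band position nu (cm^-1), per the reported calibration
    X_Al = -7.309e-3 * (nu - 760) + 0.3242."""
    return -7.309e-3 * (nu - 760.0) + 0.3242
```

A band at 770 cm⁻¹ thus corresponds to X_Al ≈ 0.251; as Si replaces Al and X_Al falls, the band shifts to higher wavenumber.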
International Nuclear Information System (INIS)
Peggs, S.; Talman, R.
1986-08-01
As proton accelerators get larger, and include more magnets, the conventional tracking programs which simulate them run slower. At the same time, in order to better optimize the higher cost of the accelerators, they must return more accurate results, even in the presence of a longer list of realistic effects, such as magnet errors and misalignments. For these reasons conventional tracking programs continue to be computationally bound, despite the continually increasing computing power available. This limitation is especially severe for a class of problems in which some lattice parameter is slowly varying, when a faithful description is only obtained by tracking for an exceedingly large number of turns. Examples are synchrotron oscillations in which the energy varies slowly with a period of, say, hundreds of turns, or magnet ripple or noise on a comparably slow time scale. In these cases one may wish to track for hundreds of periods of the slowly varying parameter. The purpose of this paper is to describe a method, still under development, in which element-by-element tracking around one turn is replaced by a single map, which can be processed far faster. Similar programs have already been written in which successive elements are "concatenated" with truncation to linear, sextupole, or octupole order, et cetera, using Lie algebraic techniques to preserve symplecticity. The method described here is rather more empirical than this but, in principle, contains information to all orders and is able to handle resonances in a more straightforward fashion.
voom: Precision weights unlock linear model analysis tools for RNA-seq read counts.
Law, Charity W; Chen, Yunshun; Shi, Wei; Smyth, Gordon K
2014-02-03
New normal linear modeling strategies are presented for analyzing read counts from RNA-seq experiments. The voom method estimates the mean-variance relationship of the log-counts, generates a precision weight for each observation and enters these into the limma empirical Bayes analysis pipeline. This opens access for RNA-seq analysts to a large body of methodology developed for microarrays. Simulation studies show that voom performs as well or better than count-based RNA-seq methods even when the data are generated according to the assumptions of the earlier methods. Two case studies illustrate the use of linear modeling and gene set testing methods.
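The first step of voom, converting counts to log2 counts-per-million with small offsets before the mean-variance trend is estimated, can be sketched as follows. This is only the log-CPM transform (the published offsets of 0.5 on counts and 1 on library sizes); the lowess trend fitting and the limma pipeline that produce the actual precision weights are omitted:

```python
import math

def log_cpm(counts, lib_sizes):
    """voom-style log2 counts-per-million.

    counts: genes x samples matrix of read counts;
    lib_sizes: per-sample library sizes (total counts).
    """
    return [
        [math.log2((c + 0.5) / (L + 1.0) * 1e6) for c, L in zip(row, lib_sizes)]
        for row in counts
    ]
```

The 0.5 offset keeps zero counts finite, and it is on this log-CPM scale that voom models the mean-variance relationship to assign each observation an inverse-variance weight for the empirical Bayes linear modeling step.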
Malykh, O V; Golub, A Yu; Teplyakov, V V
2011-05-11
Membrane gas separation technologies (air separation, hydrogen recovery from dehydrogenation processes, etc.) traditionally use glassy polymer membranes with dominating permeability of "small" gas molecules. For this purpose, membranes based on low free volume glassy polymers (e.g., polysulfone, tetrabromopolycarbonate and polyimides) are used. On the other hand, the application of membrane methods for recovering VOCs and some toxic gases from air, and for separating mixtures containing the lower hydrocarbons (in petrochemistry and oil refining), needs membranes with preferential permeation of components with relatively larger molecular sizes. In general, this kind of permeability is characteristic of rubbers and of high free volume glassy polymers. The data files accumulated (more than 1500 polymeric materials) represent the region of parameters "inside" these "boundaries." Two main approaches to the prediction of gas permeability of polymers are considered in this paper: (1) statistical treatment of published transport parameters of polymers and (2) prediction using a model of a "diffusion jump" with consideration of the key properties of the diffusing molecule and the polymeric matrix. Within approach (1), the paper presents N-dimensional methods for estimating the gas permeability of polymers using "selectivity/permeability" correlations. It is found that the optimal accuracy of prediction is provided at n=4. Within the solution-diffusion mechanism (2), the key properties include the effective molecular cross-section of the penetrating species, responsible for molecular transport in the polymeric matrix, and the well-known force constant (ε/k)_eff,i of the {6-12} potential for gas-gas interaction. A set of corrected effective molecular cross-sections of penetrants including noble gases (He, Ne, Ar, Kr, Xe), permanent gases (H2, O2, N2, CO), ballast and toxic gases (CO2, NO, NO2, SO2, H2S) and linear lower hydrocarbons (CH4
Lartillot, Nicolas; Rodrigue, Nicolas; Stubbs, Daniel; Richer, Jacques
2013-07-01
Modeling across-site variation of the substitution process is increasingly recognized as important for obtaining more accurate phylogenetic reconstructions. Both finite and infinite mixture models have been proposed and have been shown to significantly improve on classical single-matrix models. Compared with their finite counterparts, infinite mixtures have greater expressivity. However, they are computationally more challenging. This has resulted in practical compromises in the design of infinite mixture models. In particular, a fast but simplified version of a Dirichlet process model over equilibrium frequency profiles implemented in PhyloBayes has often been used in recent phylogenomics studies, while more refined model structures, which are more realistic and fit the data better empirically, have been practically out of reach. We introduce a Message Passing Interface (MPI) version of PhyloBayes, implementing the Dirichlet process mixture models as well as more classical empirical matrices and finite mixtures. The parallelization is made efficient thanks to the combination of two algorithmic strategies: a partial Gibbs sampling update of the tree topology and the use of a truncated stick-breaking representation for the Dirichlet process prior. The implementation shows close to linear gains in computational speed for up to 64 cores, thus allowing faster phylogenetic reconstruction under complex mixture models. PhyloBayes MPI is freely available from our website www.phylobayes.org.
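For illustration of the truncated stick-breaking representation mentioned in the abstract, here is a minimal sketch of drawing Dirichlet process mixture weights at a fixed truncation level. This is a generic textbook construction, not PhyloBayes code; the concentration parameter and truncation level below are arbitrary.

```python
import random

def stick_breaking_weights(alpha, K, seed=0):
    """Mixture weights from a truncated stick-breaking construction of a
    Dirichlet process prior with concentration alpha, truncated at K."""
    rng = random.Random(seed)
    weights, stick = [], 1.0
    for _ in range(K - 1):
        v = rng.betavariate(1.0, alpha)   # Beta(1, alpha) break fraction
        weights.append(stick * v)
        stick *= 1.0 - v                  # length of stick remaining
    weights.append(stick)                 # truncation: last weight absorbs the rest
    return weights

w = stick_breaking_weights(alpha=1.0, K=20)
```

Because the last component absorbs the leftover stick, the K weights sum to one exactly, which is what makes the finite truncation usable inside a Gibbs sampler.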
Press, J.; Broughton, J.; Kudela, R. M.
2014-12-01
Suspended and dissolved trace elements are key determinants of water quality in estuarine and coastal waters. High concentrations of trace element pollutants in the San Francisco Bay estuary necessitate consistent and thorough monitoring to mitigate adverse effects on biological systems and the contamination of water and food resources. Although existing monitoring programs collect annual in situ samples from fixed locations, models proposed by Benoit, Kudela, & Flegal (2010) enable calculation of the water column total concentration (WCT) and the water column dissolved concentration (WCD) of 14 trace elements in the San Francisco Bay from a more frequently sampled metric—suspended solids concentration (SSC). This study tests the application of these models with SSC calculated from remote sensing data, with the aim of validating a tool for continuous synoptic monitoring of trace elements in the San Francisco Bay. Using HICO imagery, semi-analytical and empirical SSC algorithms were tested against a USGS dataset. A single-band method with statistically significant linear fit (p Arsenic, Iron, and Lead in the southern region of the Bay were found to exceed EPA water quality criteria for human health and aquatic life. The results of this study demonstrate the potential of monitoring programs using remote observation of trace element concentrations, and provide the foundation for investigation of pollutant sources and pathways over time.
DEFF Research Database (Denmark)
Khair, Tabish
2017-01-01
Review of 'Inglorious Empire: What the British did to India' by Shashi Tharoor, London, Hurst Publishers, 2017, 296 pp., £20.00.
Information about the San Francisco Bay Water Quality Project (SFBWQP) Urban Greening Bay Area, a large-scale effort to re-envision urban landscapes to include green infrastructure (GI) making communities more livable and reducing stormwater runoff.
Ghnimi, Thouraya; Hassini, Lamine; Bagane, Mohamed
2016-12-01
The aim of this work is to determine the desorption isotherms and the drying kinetics of bay laurel leaves (Laurus nobilis L.). The desorption isotherms were measured at three temperature levels (50, 60 and 70 °C) and at water activities ranging from 0.057 to 0.88, using the static gravimetric method. Five sorption models were used to fit the experimental desorption isotherm data. It was found that the Kuhn model offers the best fit of the experimental moisture isotherms over the investigated ranges of temperature and water activity. The net isosteric heat of water desorption was evaluated using the Clausius-Clapeyron equation and was then best correlated to equilibrium moisture content by the empirical Tsami equation. Thin-layer convective drying curves of bay laurel leaves were obtained for temperatures of 45, 50, 60 and 70 °C, relative humidities of 5, 15, 30 and 45 % and air velocities of 1, 1.5 and 2 m/s. A nonlinear Levenberg-Marquardt regression procedure was used to fit the drying curves with five semi-empirical mathematical models available in the literature; R² and χ² were used to evaluate the goodness of fit of the models to the data. Based on the experimental drying curves, the drying characteristic curve (DCC) was established and fitted with a third-degree polynomial function. It was found that the Midilli-Kucuk model was the best semi-empirical model describing the thin-layer drying kinetics of bay laurel leaves. The effective moisture diffusivity and activation energy of bay laurel leaves were also identified.
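The Midilli-Kucuk thin-layer model named above has the standard form MR(t) = a·exp(-k·t^n) + b·t. As a sketch of how such a model is evaluated and scored with R², here it is with illustrative parameter values (not the fitted values from this study):

```python
import math

def midilli(t, a, k, n, b):
    """Midilli-Kucuk thin-layer drying model: moisture ratio at time t."""
    return a * math.exp(-k * t ** n) + b * t

def r_squared(observed, predicted):
    """Coefficient of determination used as a goodness-of-fit measure."""
    mean = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

# Illustrative parameters and time grid (hours); not from the paper's data.
times = [0.0, 0.5, 1.0, 2.0, 4.0]
obs = [midilli(t, a=1.0, k=0.8, n=1.1, b=-0.01) for t in times]
pred = [midilli(t, a=1.0, k=0.8, n=1.1, b=-0.01) for t in times]
r2 = r_squared(obs, pred)
```

In practice the parameters a, k, n, b would be estimated by nonlinear least squares (e.g. Levenberg-Marquardt, as in the study) against measured moisture ratios.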
Masuda, Yosuke; Yoshida, Tomoki; Yamaotsu, Noriyuki; Hirono, Shuichi
2018-01-01
We recently reported that the Gibbs free energy of hydrolytic water molecules (ΔGwat) in acyl-trypsin intermediates, calculated by hydration thermodynamics analysis, could be a useful metric for estimating the catalytic rate constants (kcat) of mechanism-based reversible covalent inhibitors. For thorough evaluation, the proposed method was tested with an increased number of covalent ligands that have no corresponding crystal structures. After modeling acyl-trypsin intermediate structures using flexible molecular superposition, ΔGwat values were calculated according to the proposed method. The orbital energies of the antibonding π* molecular orbitals (MOs) of the carbonyl C=O in the covalently modified catalytic serine (Eorb) were also calculated by semi-empirical MO calculations. Then, linear discriminant analysis (LDA) was performed to build a model that can discriminate covalent inhibitor candidates from substrate-like ligands using ΔGwat and Eorb. The model was built using a training set (10 compounds) and then validated by a test set (4 compounds). As a result, the training set and test set ligands were perfectly discriminated by the model. Hydrolysis was slower when (1) the hydrolytic water molecule has lower ΔGwat, or (2) the covalent ligand presents higher Eorb (higher reaction barrier). Results also showed that the entropic term of the hydrolytic water molecule (-TΔSwat) could be used for estimating kcat and for covalent inhibitor optimization; when the rotational freedom of the hydrolytic water molecule is limited, the chance for favorable interaction with the electrophilic acyl group would also be limited. The method proposed in this study would be useful for screening and optimizing mechanism-based reversible covalent inhibitors.
International Nuclear Information System (INIS)
Peggs, S.; Talman, R.
1987-01-01
As proton accelerators get larger and include more magnets, the conventional tracking programs that simulate them run slower. The purpose of this paper is to describe a method, still under development, in which element-by-element tracking around one turn is replaced by a single map, which can be processed far faster. The method assumes that a conventional program exists which can perform faithful tracking in the lattice under study for some hundreds of turns, with all lattice parameters held constant. An empirical map is then generated by comparison with the tracking program. A procedure has been outlined for determining an empirical Hamiltonian, which can represent motion through many nonlinear kicks, by taking data from a conventional tracking program. Though derived by an approximate method, this Hamiltonian is analytic in form and can be subjected to further analysis of varying degrees of mathematical rigor. Even though the empirical procedure has only been described in one transverse dimension, there is good reason to hope that it can be extended to include two transverse dimensions, so that it can become a more practical tool in realistic cases.
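To make the "one-turn map" idea concrete, here is a minimal stand-in: a purely linear one-turn map in normalized phase space (a rotation by the betatron phase advance 2π times the tune). This is an idealization for illustration only; the paper's empirical map would additionally encode nonlinear kicks.

```python
import math

def one_turn_map(x, p, tune):
    """Advance (x, p) by one turn using a single linear map: a rotation
    by 2*pi*tune in normalized phase space. Illustrative stand-in for
    the empirical map discussed above, not the paper's method."""
    mu = 2.0 * math.pi * tune
    return (math.cos(mu) * x + math.sin(mu) * p,
            -math.sin(mu) * x + math.cos(mu) * p)

# Track a particle for 1000 turns with an arbitrary fractional tune.
x, p = 1.0, 0.0
for _ in range(1000):
    x, p = one_turn_map(x, p, tune=0.31)
```

For a linear map the Courant-Snyder invariant (here simply x² + p²) is conserved turn after turn, which is one quick sanity check applied to any candidate map.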
Urban Noise Modelling in Boka Kotorska Bay
Directory of Open Access Journals (Sweden)
Aleksandar Nikolić
2014-04-01
Full Text Available Traffic is the most significant noise source in urban areas. The village of Kamenari in Boka Kotorska Bay is a site where, in a relatively small area, road traffic and sea (ferry) traffic take place at the same time. Owing to the specificity of the location, i.e. the very rare synergy of road and sea traffic sound effects in an urban area, and the expressed need to assess noise levels in a simple and quick way, a study was conducted using empirical methods and statistical analysis, which led to an acoustic model for estimating the equivalent noise level (Leq). The developed model for noise assessment in the village of Kamenari in Boka Kotorska Bay reproduces the possible noise levels at the observed site quite realistically, with very small deviations from the empirically obtained values.
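The equivalent noise level Leq that the model estimates is defined as the energy average of short-term sound levels, Leq = 10·log10 of the mean of 10^(L/10). A minimal sketch of that computation (generic acoustics formula, not the paper's regression model):

```python
import math

def leq(levels_db):
    """Equivalent continuous sound level from short-term dB samples:
    Leq = 10 * log10(mean of 10**(L/10)). Energy average, not an
    arithmetic mean of the dB values."""
    energies = [10 ** (L / 10.0) for L in levels_db]
    return 10.0 * math.log10(sum(energies) / len(energies))
```

Note that a single loud interval dominates the result: averaging 50 dB and 70 dB intervals gives about 67 dB, not 60 dB, because the average is taken on the energy scale.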
Rahaman, Obaidur; Estrada, Trilce P; Doren, Douglas J; Taufer, Michela; Brooks, Charles L; Armen, Roger S
2011-09-26
The performances of several two-step scoring approaches for molecular docking were assessed for their ability to predict binding geometries and free energies. Two new scoring functions designed for "step 2 discrimination" were proposed and compared to our CHARMM implementation of the linear interaction energy (LIE) approach using the Generalized-Born with Molecular Volume (GBMV) implicit solvation model. A scoring function S1 was proposed by considering only "interacting" ligand atoms as the "effective size" of the ligand and extended to an empirical regression-based pair potential S2. The S1 and S2 scoring schemes were trained and 5-fold cross-validated on a diverse set of 259 protein-ligand complexes from the Ligand Protein Database (LPDB). The regression-based parameters for S1 and S2 also demonstrated reasonable transferability in the CSARdock 2010 benchmark using a new data set (NRC HiQ) of diverse protein-ligand complexes. The ability of the scoring functions to accurately predict ligand geometry was evaluated by calculating the discriminative power (DP) of the scoring functions to identify native poses. The parameters for the LIE scoring function with the optimal discriminative power (DP) for geometry (step 1 discrimination) were found to be very similar to the best-fit parameters for binding free energy over a large number of protein-ligand complexes (step 2 discrimination). Reasonable performance of the scoring functions in enrichment of active compounds in four different protein target classes established that the parameters for S1 and S2 provided reasonable accuracy and transferability. Additional analysis was performed to definitively separate scoring function performance from molecular weight effects. This analysis included the prediction of ligand binding efficiencies for a subset of the CSARdock NRC HiQ data set where the number of ligand heavy atoms ranged from 17 to 35. This range of ligand heavy atoms is where improved accuracy of predicted ligand
DEFF Research Database (Denmark)
Engholm, Ida
2014-01-01
Celebrated as one of the leading and most valuable brands in the world, eBay has acquired iconic status on par with century-old brands such as Coca-Cola and Disney. The eBay logo is now synonymous with the world’s leading online auction website, and its design is associated with the company...
A fast EM algorithm for BayesA-like prediction of genomic breeding values.
Directory of Open Access Journals (Sweden)
Xiaochen Sun
Full Text Available Prediction accuracies of estimated breeding values for economically important traits are expected to benefit from genomic information. Single nucleotide polymorphism (SNP) panels used in genomic prediction are increasing in density, but the Markov Chain Monte Carlo (MCMC) estimation of SNP effects can be quite time consuming or slow to converge when a large number of SNPs are fitted simultaneously in a linear mixed model. Here we present an EM algorithm (termed "fastBayesA") without MCMC. This fastBayesA approach treats the variances of SNP effects as missing data and uses a joint posterior mode of effects, in contrast to the commonly used BayesA, which bases predictions on posterior means of effects. In each EM iteration, SNP effects are predicted as a linear combination of best linear unbiased predictions of breeding values from a mixed linear animal model that incorporates a weighted marker-based realized relationship matrix. Method fastBayesA converges after a few iterations to a joint posterior mode of SNP effects under the BayesA model. When applied to simulated quantitative traits with a range of genetic architectures, fastBayesA is shown to predict genomic estimated breeding values (GEBV) as accurately as BayesA but with less computing effort per SNP than BayesA. Method fastBayesA can be used as a computationally efficient substitute for BayesA, especially when an increasing number of markers bring unreasonable computational burden or slow convergence to MCMC approaches.
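The general idea behind BayesA-style estimation is that each SNP gets its own shrinkage, updated from the current size of its effect. The following is only a toy coordinate-update sketch of that idea with a scaled-t style penalty; it is NOT the published fastBayesA algorithm, and the hyperparameters nu and s2 and the toy data are invented for illustration.

```python
def shrinkage_cd(X, y, iters=50, nu=4.0, s2=1.0):
    """Toy adaptive-shrinkage coordinate updates: each coefficient's
    ridge penalty lam_j is recomputed from its current size, mimicking
    the effect-specific variances of a BayesA-type model."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    resid = list(y)                      # residual y - X*beta
    for _ in range(iters):
        for j in range(p):
            xj = [X[i][j] for i in range(n)]
            xx = sum(v * v for v in xj)
            # add SNP j's current contribution back into the residual
            r = [resid[i] + xj[i] * beta[j] for i in range(n)]
            xr = sum(xj[i] * r[i] for i in range(n))
            lam = (nu + 1.0) / (beta[j] ** 2 + nu * s2)  # adaptive shrinkage
            beta[j] = xr / (xx + lam)
            resid = [r[i] - xj[i] * beta[j] for i in range(n)]
    return beta

# Toy data: two markers with true effects roughly 2 and 1.
X = [[1, 0], [0, 1], [1, 1], [1, -1]]
y = [2.0, 1.0, 3.0, 1.0]
b = shrinkage_cd(X, y)
```

Both estimates come out shrunk toward zero relative to the least-squares values, with the larger effect shrunk proportionally less, which is the qualitative behavior such priors are chosen for.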
Shilov, Georgi E
1977-01-01
Covers determinants, linear spaces, systems of linear equations, linear functions of a vector argument, coordinate transformations, the canonical form of the matrix of a linear operator, bilinear and quadratic forms, Euclidean spaces, unitary spaces, quadratic forms in Euclidean and unitary spaces, finite-dimensional space. Problems with hints and answers.
Biscayne Bay Alongshore Epifauna
National Oceanic and Atmospheric Administration, Department of Commerce — Field studies to characterize the alongshore epifauna (shrimp, crabs, echinoderms, and small fishes) along the western shore of southern Biscayne Bay were started in...
National Oceanic and Atmospheric Administration, Department of Commerce — This image represents a 4x4 meter resolution bathymetric surface for Jobos Bay, Puerto Rico (in NAD83 UTM 19 North). The depth values are in meters referenced to the...
Hammond Bay Biological Station
Federal Laboratory Consortium — Hammond Bay Biological Station (HBBS), located near Millersburg, Michigan, is a field station of the USGS Great Lakes Science Center (GLSC). HBBS was established by...
National Oceanic and Atmospheric Administration, Department of Commerce — This data set consists of 0.5-meter pixel resolution, four band orthoimages covering the Humboldt Bay area. An orthoimage is remotely sensed image data in which...
Monterey Bay Aquarium Volunteer Guide Scheduling Analysis
2014-12-01
Keywords: Monterey Bay Aquarium, linear programming, network design, multicommodity flow, resilience. Volunteers fill many roles that include aquarium guides, information desk attendants, divers, and animal caregivers. Julie Packard, Executive Director of... further analyze the resiliency of the shifts to changes in staffing levels caused by no-shows or drop-ins. While the guide program managers have
Poisson-generalized gamma empirical Bayes model for disease ...
African Journals Online (AJOL)
In spatial disease mapping, the use of Bayesian estimation techniques is becoming popular for smoothing relative risk estimates. The most common Bayesian conjugate model for disease mapping is the Poisson-Gamma model (PG). To explore further the activity of smoothing of relative risk ...
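For reference, the Poisson-Gamma conjugacy that makes the PG model attractive gives a closed-form smoothed relative risk. A minimal sketch (standard conjugate result, with hypothetical hyperparameters a and b):

```python
def pg_posterior_rr(counts, expected, a, b):
    """Smoothed relative risks under the Poisson-Gamma model:
    y_i ~ Poisson(e_i * r_i) with r_i ~ Gamma(a, b) gives
    E[r_i | y_i] = (y_i + a) / (e_i + b), shrinking each area's
    crude rate y_i/e_i toward the prior mean a/b."""
    return [(y + a) / (e + b) for y, e in zip(counts, expected)]
```

Areas with little data are pulled strongly toward the prior mean, while areas with large expected counts keep estimates close to their crude rates; that is the "smoothing" referred to above.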
Empirical Bayes Estimation of Proportions in Several Groups.
1981-01-01
75 FR 11837 - Chesapeake Bay Watershed Initiative
2010-03-12
DEPARTMENT OF AGRICULTURE, Commodity Credit Corporation. Chesapeake Bay Watershed Initiative. AGENCY: ... ACTION: Notice of availability of program funds for the Chesapeake Bay Watershed Initiative. SUMMARY: The ... through the Chesapeake Bay Watershed Initiative for agricultural producers in the Chesapeake Bay watershed...
33 CFR 100.124 - Maggie Fischer Memorial Great South Bay Cross Bay Swim, Great South Bay, New York.
2010-07-01
33 CFR 100.124: Maggie Fischer Memorial Great South Bay Cross Bay Swim, Great South Bay, New York (Title 33, Navigation and Navigable Waters; 2010-07-01 edition)...
Predicting potentially toxigenic Pseudo-nitzschia blooms in the Chesapeake Bay
Anderson, Clarissa R.; Sapiano, Mathew R. P.; Prasad, M. Bala Krishna; Long, Wen; Tango, Peter J.; Brown, Christopher W.; Murtugudde, Raghu
2010-11-01
Harmful algal blooms are now recognized as a significant threat to the Chesapeake Bay, as they can severely compromise the economic viability of important recreational and commercial fisheries in the largest estuary of the United States. This study describes the development of empirical models for the potentially domoic acid-producing Pseudo-nitzschia species complex present in the Bay, developed from a 22-year time series of cell abundance and concurrent measurements of hydrographic and chemical properties. Using a logistic Generalized Linear Model (GLM) approach, model parameters and performance were compared over a range of Pseudo-nitzschia bloom thresholds relevant to toxin production by different species. Small-threshold blooms (≥10 cells/mL) are explained by time of year, location, and variability in surface values of phosphate, temperature, nitrate plus nitrite, and freshwater discharge. Medium- (≥100 cells/mL) to large-threshold (≥1000 cells/mL) blooms are further explained by salinity, silicic acid, dissolved organic carbon, and light attenuation (Secchi) depth. These predictors are similar to other models for Pseudo-nitzschia blooms on the west coast, suggesting commonalities across ecosystems. Hindcasts of bloom probabilities at a 19% bloom prediction point yield a Heidke Skill Score of ~53%, a Probability of Detection of ~75%, a False Alarm Ratio of ~52%, and a Probability of False Detection of ~9%. The implication of possible future changes in Bay-wide nutrient stoichiometry for Pseudo-nitzschia blooms is discussed.
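The verification scores quoted above (Heidke Skill Score, Probability of Detection, False Alarm Ratio, Probability of False Detection) all derive from a 2x2 forecast contingency table. A minimal sketch with made-up counts, not the study's hindcast data:

```python
def forecast_skill(hits, false_alarms, misses, correct_negatives):
    """Standard categorical verification scores for a binary forecast,
    computed from the 2x2 contingency table (a, b, c, d)."""
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    pod = a / (a + c)                      # Probability of Detection
    far = b / (a + b)                      # False Alarm Ratio
    pofd = b / (b + d)                     # Probability of False Detection
    # Heidke Skill Score: accuracy relative to random chance
    hss = 2 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))
    return pod, far, pofd, hss

# Illustrative counts only
pod, far, pofd, hss = forecast_skill(hits=3, false_alarms=1,
                                     misses=1, correct_negatives=5)
```

Moving the probability cut-off (the "bloom prediction point") trades POD against FAR, which is why the paper reports scores at a specific 19% threshold.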
According to extensive data obtained over its 13,000 km of shoreline, the Chesapeake Bay has been suffering a major, indeed unprecedented, reduction in submerged vegetation. Chesapeake Bay is alone in experiencing decline in submerged vegetation. Other estuary systems on the east coast of the United States are not so affected. These alarming results were obtained by the synthesis of the findings of numerous individual groups in addition to large consortium projects on the Chesapeake done over the past decade. R. J. Orth and R. A. Moore of the Virginia Institute of Marine Science pointed to the problem of the severe decline of submerged grasses on the Bay and along its tributaries. In a recent report, Orth and Moore note: “The decline, which began in the 1960's and accelerated in the 1970's, has affected all species in all areas. Many major river systems are now totally devoid of any rooted vegetation” (Science, 222, 51-53, 1983).
Crozier, G. F.; Schroeder, W. W.
1978-01-01
The termination of studies carried on for almost three years in the Mobile Bay area and adjacent continental shelf is reported. The initial results, concentrating on the shelf and lower bay, were presented in the interim report. The continued scope of work was designed to attempt a refinement of the mathematical model, assess the effectiveness of optical measurement of suspended particulate material, and disseminate the acquired information. The optical characteristics of particulate suspensions are affected by density gradients within the medium, density of the suspended particles, particle size, particle shape, particle quality, albedo, and the angle of refracted light. Several of these are discussed in detail.
International Nuclear Information System (INIS)
Suwono.
1978-01-01
A linear gate providing a variable gate duration from 0.40 μs to 4 μs was developed. The electronic circuitry consists of a linear circuit and an enable circuit. The input signal can be either unipolar or bipolar; if the input signal is bipolar, the negative portion is filtered out. The operation of the linear gate is controlled by the application of a positive enable pulse. (author)
International Nuclear Information System (INIS)
Vretenar, M
2014-01-01
The main features of radio-frequency linear accelerators are introduced, reviewing the different types of accelerating structures and presenting the main characteristic aspects of linac beam dynamics.
Linearization Method and Linear Complexity
Tanaka, Hidema
We focus on the relationship between the linearization method and linear complexity and show that the linearization method is another effective technique for calculating linear complexity. We analyze its effectiveness by comparison with the logic circuit method, and we compare the relevant conditions and necessary computational cost with those of the Berlekamp-Massey algorithm and the Games-Chan algorithm. The significant property of the linearization method is that it needs no output sequence from the pseudo-random number generator (PRNG), because it calculates linear complexity from the algebraic expression of the generator's algorithm. When a PRNG has n [bit] stages (registers or internal states), the necessary computational cost is smaller than O(2^n). By contrast, the Berlekamp-Massey algorithm needs O(N^2), where N (≅2^n) denotes the period. Since existing methods work from the output sequence, the initial value of the PRNG influences the resulting value of linear complexity, which is therefore generally given as an estimate. A linearization method, by contrast, calculates from the algorithm of the PRNG itself and can determine the lower bound of the linear complexity.
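As a reference point for the comparison above, the Berlekamp-Massey algorithm computes the linear complexity of a binary sequence directly from its output bits. A compact GF(2) implementation (standard algorithm, written here as a sketch):

```python
def berlekamp_massey(s):
    """Linear complexity of a binary sequence s (list of 0/1) via the
    Berlekamp-Massey algorithm over GF(2): returns the length L of the
    shortest LFSR that generates s."""
    n = len(s)
    c, b = [1] + [0] * n, [1] + [0] * n   # current / previous connection polynomials
    L, m = 0, -1                          # current LFSR length / last update position
    for i in range(n):
        # discrepancy between s[i] and the current LFSR's prediction
        d = s[i]
        for j in range(1, L + 1):
            d ^= c[j] & s[i - j]
        if d:
            t = c[:]
            shift = i - m
            for j in range(n + 1 - shift):
                c[j + shift] ^= b[j]      # c(x) += x^shift * b(x) over GF(2)
            if 2 * L <= i:
                L, m, b = i + 1 - L, i, t
    return L
```

Note this is exactly the output-sequence-based approach the abstract contrasts with: its answer depends on the observed bits, so for a PRNG it estimates the complexity of one particular output stream.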
Richards Bay effluent pipeline
CSIR Research Space (South Africa)
Lord, DA
1986-07-01
Full Text Available Matters of major concern identified in the effluent are the large volume of byproduct calcium sulphate (phosphogypsum), which would smother marine life; high concentrations of fluoride, highly toxic to marine life; heavy metals; and chlorinated organic material... [Contents: The Richards Bay pipeline; Environmental considerations; Phosphogypsum disposal; Effects of fluoride on locally occurring...]
Gao, F.
2017-01-01
The dissertation consists of research in three subjects in two themes—Bayes and networks: The first studies the posterior contraction rates for the Dirichlet-Laplace mixtures in a deconvolution setting (Chapter 1). The second subject regards the statistical inference in preferential attachment
Fast empirical Bayesian LASSO for multiple quantitative trait locus mapping
Directory of Open Access Journals (Sweden)
Xu Shizhong
2011-05-01
Full Text Available Abstract Background The Bayesian shrinkage technique has been applied to multiple quantitative trait locus (QTL) mapping to estimate the genetic effects of QTLs on quantitative traits from a very large set of possible effects, including the main and epistatic effects of QTLs. Although the recently developed empirical Bayes (EB) method significantly reduces computation compared with the fully Bayesian approach, its speed and accuracy are limited by the fact that numerical optimization is required to estimate the variance components in the QTL model. Results We developed a fast empirical Bayesian LASSO (EBLASSO) method for multiple QTL mapping. The fact that the EBLASSO can estimate the variance components in closed form, along with other algorithmic techniques, renders the EBLASSO method more efficient and accurate. Compared with the EB method, our simulation study demonstrated that the EBLASSO method could substantially improve the computational speed and detect more QTL effects without increasing the false positive rate. In particular, the EBLASSO algorithm running on a personal computer could easily handle a linear QTL model with more than 100,000 variables in our simulation study. Real data analysis also demonstrated that the EBLASSO method detected more reasonable effects than the EB method. Compared with the LASSO, our simulation showed that the current version of the EBLASSO implemented in Matlab had similar speed to the LASSO implemented in Fortran, and that the EBLASSO detected the same number of true effects as the LASSO but a much smaller number of false positive effects. Conclusions The EBLASSO method can handle a large number of effects, possibly including both the main and epistatic QTL effects, environmental effects and the effects of gene-environment interactions. It will be a very useful tool for multiple QTL mapping.
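As a baseline for the LASSO comparison above, here is the plain coordinate-descent LASSO with soft thresholding, which is the sparse-regression workhorse the EBLASSO is benchmarked against. This is the generic algorithm with toy data, not the authors' implementation:

```python
def lasso_cd(X, y, lam, iters=100):
    """Coordinate-descent LASSO: cycle over features, solving each
    one-dimensional subproblem by soft thresholding at lam."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(iters):
        for j in range(p):
            xj = [row[j] for row in X]
            xx = sum(v * v for v in xj)
            # partial residual excluding feature j
            r = [y[i] - sum(X[i][k] * beta[k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(xj[i] * r[i] for i in range(n))
            if rho > lam:
                beta[j] = (rho - lam) / xx    # soft threshold, positive side
            elif rho < -lam:
                beta[j] = (rho + lam) / xx    # soft threshold, negative side
            else:
                beta[j] = 0.0                 # zeroed out: sparsity
    return beta

# Toy data: y depends only on the first column; the second is noise-like.
X = [[1.0, 1.0], [2.0, -1.0], [3.0, 1.0], [4.0, -1.0]]
y = [2.0, 4.0, 6.0, 8.0]
b = lasso_cd(X, y, lam=1.0)
```

The irrelevant coefficient is driven exactly to zero while the relevant one is kept (slightly shrunk), which is the sparsity property that motivates LASSO-type priors in multi-QTL models.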
Estimates of vertical velocities and eddy coefficients in the Bay of Bengal
Digital Repository Service at National Institute of Oceanography (India)
Varkey, M.J.; Sastry, J.S.
Vertical velocities and eddy coefficients in the intermediate depths of the Bay of Bengal are calculated from mean hydrographic data for 300-mile squares. The linear current density (sigma-O) versus log-depth plots show a steady balance between...
Dejesusparada, N. (Principal Investigator); Bentancurt, J. J. V.; Herz, B. R.; Molion, L. B.
1980-01-01
Detection of water quality in Guanabara Bay using multispectral scanner digital data taken from the LANDSAT satellites was examined. To test these processes, an empirical (statistical) approach was chosen to observe the degree of relationship between LANDSAT data and in situ data taken simultaneously. The linear and nonlinear regression analyses were taken from among those developed by INPE in 1978. Results indicate that the best regression was for MSS band six, corrected for atmospheric effects, which gave a correlation coefficient of 0.99 and an average error of 6.59 micrograms/liter. This error was similar to that obtained in the laboratory. The chlorophyll content was between 0 and 100 micrograms/liter, as derived from the LANDSAT MSS.
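The empirical approach described above reduces to regressing in situ measurements on sensor values and reporting a correlation coefficient. A minimal ordinary-least-squares sketch with synthetic numbers (not the LANDSAT/chlorophyll data):

```python
def linreg(x, y):
    """Ordinary least-squares fit y ~ a + b*x, plus the Pearson
    correlation coefficient r between x and y."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((u - mx) ** 2 for u in x)
    syy = sum((v - my) ** 2 for v in y)
    sxy = sum((u - mx) * (v - my) for u, v in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    r = sxy / (sxx * syy) ** 0.5
    return a, b, r

# Synthetic example: exact line y = 1 + 2x gives r = 1
a, b, r = linreg([1.0, 2.0, 3.0, 4.0], [3.0, 5.0, 7.0, 9.0])
```

In the study's setting, x would be the band-6 radiance (after atmospheric correction) and y the measured chlorophyll concentration; r near 0.99 indicates the band explains almost all the variance.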
Said-Houari, Belkacem
2017-01-01
This self-contained, clearly written textbook on linear algebra is easily accessible for students. It begins with the simple linear equation and generalizes several notions from this equation for the system of linear equations and introduces the main ideas using matrices. It then offers a detailed chapter on determinants and introduces the main ideas with detailed proofs. The third chapter introduces the Euclidean spaces using very simple geometric ideas and discusses various major inequalities and identities. These ideas offer a solid basis for understanding general Hilbert spaces in functional analysis. The following two chapters address general vector spaces, including some rigorous proofs to all the main results, and linear transformation: areas that are ignored or are poorly explained in many textbooks. Chapter 6 introduces the idea of matrices using linear transformation, which is easier to understand than the usual theory of matrices approach. The final two chapters are more advanced, introducing t...
Sustainable development in the Hudson Bay/James Bay bioregion
International Nuclear Information System (INIS)
Anon.
1991-01-01
An overview is presented of projects planned for the James Bay/Hudson Bay region, and the expected environmental impacts of these projects. The watershed of James Bay and Hudson Bay covers well over one third of Canada, from southern Alberta to central Ontario to Baffin Island, as well as parts of north Dakota and Minnesota in the U.S.A. Hydroelectric power developments that change the timing and rate of flow of fresh water may cause changes in the nature and duration of ice cover, habitats of marine mammals, fish and migratory birds, currents into and out of Hudson Bay/James Bay, seasonal and annual loads of sediments and nutrients to marine ecosystems, and anadromous fish populations. Hydroelectric projects are proposed for the region by Quebec, Ontario and Manitoba. In January 1992, the Canadian Arctic Resources Committee (CARC), the Environmental Committee of Sanikuluaq, and the Rawson Academy of Arctic Science will launch the Hudson Bay/James Bay Bioregion Program, an independent initiative to apply an ecosystem approach to the region. Two main objectives are to provide a comprehensive assessment of the cumulative impacts of human activities on the marine and freshwater ecosystems of the Hudson Bay/James Bay bioregion, and to foster sustainable development by examining and proposing cooperative processes for decision making among governments, developers, aboriginal peoples and other stakeholders. 1 fig
Stoll, R R
1968-01-01
Linear Algebra is intended to be used as a text for a one-semester course in linear algebra at the undergraduate level. The treatment of the subject will be both useful to students of mathematics and those interested primarily in applications of the theory. The major prerequisite for mastering the material is the readiness of the student to reason abstractly. Specifically, this calls for an understanding of the fact that axioms are assumptions and that theorems are logical consequences of one or more axioms. Familiarity with calculus and linear differential equations is required for understand
Solow, Daniel
2014-01-01
This text covers the basic theory and computation for a first course in linear programming, including substantial material on mathematical proof techniques and sophisticated computation methods. Includes Appendix on using Excel. 1984 edition.
Liesen, Jörg
2015-01-01
This self-contained textbook takes a matrix-oriented approach to linear algebra and presents a complete theory, including all details and proofs, culminating in the Jordan canonical form and its proof. Throughout the development, the applicability of the results is highlighted. Additionally, the book presents special topics from applied linear algebra including matrix functions, the singular value decomposition, the Kronecker product and linear matrix equations. The matrix-oriented approach to linear algebra leads to a better intuition and a deeper understanding of the abstract concepts, and therefore simplifies their use in real world applications. Some of these applications are presented in detailed examples. In several ‘MATLAB-Minutes’ students can comprehend the concepts and results using computational experiments. Necessary basics for the use of MATLAB are presented in a short introduction. Students can also actively work with the material and practice their mathematical skills in more than 300 exerc...
Berberian, Sterling K
2014-01-01
Introductory treatment covers basic theory of vector spaces and linear maps - dimension, determinants, eigenvalues, and eigenvectors - plus more advanced topics such as the study of canonical forms for matrices. 1992 edition.
Searle, Shayle R
2012-01-01
This 1971 classic on linear models is once again available--as a Wiley Classics Library Edition. It features material that can be understood by any statistician who understands matrix algebra and basic statistical methods.
Christofilos, N.C.; Polk, I.J.
1959-02-17
Improvements in linear particle accelerators are described. A drift tube system for a linear ion accelerator reduces gap capacity between adjacent drift tube ends. This is accomplished by reducing the ratio of the diameter of the drift tube to the diameter of the resonant cavity. Concentration of magnetic field intensity at the longitudinal midpoint of the external surface of each drift tube is reduced by increasing the external drift tube diameter at the longitudinal center region.
Empirical Bayesian inference and model uncertainty
International Nuclear Information System (INIS)
Poern, K.
1994-01-01
This paper presents a hierarchical or multistage empirical Bayesian approach for the estimation of uncertainty concerning the intensity of a homogeneous Poisson process. A class of contaminated gamma distributions is considered to describe the uncertainty concerning the intensity. These distributions in turn are defined through a set of secondary parameters, the knowledge of which is also described and updated via Bayes formula. This two-stage Bayesian approach is an example where the modeling uncertainty is treated in a comprehensive way. Each contaminated gamma distribution, represented by a point in the 3D space of secondary parameters, can be considered as a specific model of the uncertainty about the Poisson intensity. Then, by the empirical Bayesian method each individual model is assigned a posterior probability
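The updating scheme described in this abstract (candidate gamma priors treated as "models" of the uncertainty, each assigned a posterior probability via Bayes' formula) can be illustrated with a minimal gamma-Poisson sketch. Everything below is a hypothetical illustration, not the paper's actual parametrization: the gamma shape/rate pairs, the equal prior weights, and the data (3 events over 20 time units) are invented.

```python
import math

def log_marginal(n, t, a, b):
    """Log marginal likelihood of n events in exposure time t when the
    Poisson intensity has a Gamma(a, b) prior (shape a, rate b):
    p(n) = Gamma(a+n) / (Gamma(a) n!) * b^a t^n / (b+t)^(a+n)."""
    return (math.lgamma(a + n) - math.lgamma(a) - math.lgamma(n + 1)
            + a * math.log(b) + n * math.log(t) - (a + n) * math.log(b + t))

def posterior_over_models(n, t, models, prior_weights):
    """Assign each candidate prior (each 'model' of the uncertainty) a
    posterior probability via Bayes' formula, then mix the conditional
    posterior means; each conditional posterior is Gamma(a + n, b + t)."""
    logs = [math.log(w) + log_marginal(n, t, a, b)
            for (a, b), w in zip(models, prior_weights)]
    peak = max(logs)
    unnorm = [math.exp(l - peak) for l in logs]  # shift to avoid underflow
    total = sum(unnorm)
    weights = [u / total for u in unnorm]
    mean = sum(w * (a + n) / (b + t)
               for w, (a, b) in zip(weights, models))
    return weights, mean

# Hypothetical setup: two candidate gamma priors with equal prior weight.
models = [(0.5, 1.0), (4.0, 10.0)]
weights, mean = posterior_over_models(3, 20.0, models, [0.5, 0.5])
```

The mixture posterior mean necessarily lies between the two conditional posterior means (a + n)/(b + t) of the individual models.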
THE RESPONSE OF MONTEREY BAY TO THE 2010 CHILEAN EARTHQUAKE
Directory of Open Access Journals (Sweden)
Laurence C. Breaker
2011-01-01
The primary frequencies contained in the arrival sequence produced by the tsunami from the Chilean earthquake of 2010 in Monterey Bay were extracted to determine the seiche modes that were produced. Singular Spectrum Analysis (SSA) and Ensemble Empirical Mode Decomposition (EEMD) were employed to extract the primary frequencies of interest. The wave train from the Chilean tsunami lasted for at least four days due to multipath arrivals that may not have included reflections from outside the bay but most likely did include secondary undulations, and energy trapping in the form of edge waves, inside the bay. The SSA decomposition resolved oscillations with periods of 52-57, 34-35, 26-27, and 21-22 minutes, all frequencies that have been predicted and/or observed in previous studies. The EEMD decomposition detected oscillations with periods of 50-55 and 21-22 minutes. Periods in the range of 50-57 minutes varied due to measurement uncertainties but almost certainly correspond to the first longitudinal mode of oscillation for Monterey Bay, periods of 34-35 minutes correspond to the first transverse mode of oscillation that assumes a nodal line across the entrance of the bay, a period of 26-27 minutes, although previously observed, may not represent a fundamental oscillation, and a period of 21-22 minutes has been predicted and observed previously. A period of ~37 minutes, close to the period of 34-35 minutes, was generated by the Great Alaskan Earthquake of 1964 in Monterey Bay and most likely represents the same mode of oscillation. The tsunamis associated with the Great Alaskan Earthquake and the Chilean Earthquake both entered Monterey Bay but initially arrived outside the bay from opposite directions. Unlike the Great Alaskan Earthquake, however, which excited only one resonant mode inside the bay, the Chilean Earthquake excited several modes suggesting that the asymmetric shape of the entrance to Monterey Bay was an important factor and that the
California Natural Resource Agency — The Bay Trail provides easily accessible recreational opportunities for outdoor enthusiasts, including hikers, joggers, bicyclists and skaters. It also offers a...
Olive, David J
2017-01-01
This text covers both multiple linear regression and some experimental design models. The text uses the response plot to visualize the model and to detect outliers, does not assume that the error distribution has a known parametric distribution, develops prediction intervals that work when the error distribution is unknown, suggests bootstrap hypothesis tests that may be useful for inference after variable selection, and develops prediction regions and large sample theory for the multivariate linear regression model that has m response variables. A relationship between multivariate prediction regions and confidence regions provides a simple way to bootstrap confidence regions. These confidence regions often provide a practical method for testing hypotheses. There is also a chapter on generalized linear models and generalized additive models. There are many R functions to produce response and residual plots, to simulate prediction intervals and hypothesis tests, to detect outliers, and to choose response trans...
International Nuclear Information System (INIS)
Alcaraz, J.
2001-01-01
After several years of study, e+e- linear colliders in the TeV range have emerged as the major and optimal high-energy physics projects for the post-LHC era. These notes summarize the present status, from the main accelerator and detector features to their physics potential. The LHC is expected to provide first discoveries in the new energy domain, whereas an e+e- linear collider in the 500 GeV-1 TeV range will be able to complement it to an unprecedented level of precision in all possible areas: Higgs, signals beyond the SM and electroweak measurements. It is evident that the Linear Collider program will constitute a major step in the understanding of the nature of the new physics beyond the Standard Model. (Author) 22 refs
Edwards, Harold M
1995-01-01
In his new undergraduate textbook, Harold M. Edwards proposes a radically new and thoroughly algorithmic approach to linear algebra. Originally inspired by the constructive philosophy of mathematics championed in the 19th century by Leopold Kronecker, the approach is well suited to students in the computer-dominated late 20th century. Each proof is an algorithm described in English that can be translated into the computer language the class is using and put to work solving problems and generating new examples, making the study of linear algebra a truly interactive experience. Designed for a one-semester course, this text adopts an algorithmic approach to linear algebra, giving the student many examples to work through and copious exercises to test their skills and extend their knowledge of the subject. Students at all levels will find much interactive instruction in this text, while teachers will find stimulating examples and methods of approach to the subject.
Incident wave run-up into narrow sloping bays and estuaries
Sinan Özeren, M.; Postacioglu, Nazmi; Canlı, Umut
2015-04-01
The problem is investigated using Carrier-Greenspan hodograph transformations. We perform a quasi-one-dimensional solution well into the bay, far enough from the mouth of the bay. The linearized boundary conditions at the mouth of the bay lead to an integral equation for the 2-D geometry. A semi-analytical optimization method has been developed to solve this integral equation. When the wavelength of the incident wave is much larger than the width of the bay, the conformal mapping of the bay and the semi-infinite sea onto the upper complex plane provides a solution of the integral equation in closed form. Particular emphasis is placed on the case where the frequency of the incident waves matches the real part of the natural frequency of oscillation of the bay. These natural frequencies are complex because of the radiation conditions imposed at the mouth of the bay. It is found that the imaginary part of these natural frequencies decreases with decreasing width of the bay. Thus the trapping of the waves in narrower bays leads to a strong resonance phenomenon when the frequency of the incident wave is equal to the real part of the natural frequency.
Humic Substances from Manila Bay and Bolinao Bay Sediments
Directory of Open Access Journals (Sweden)
Elma Llaguno
1997-12-01
The C, H, N composition of sedimentary humic acids (HA) extracted from three sites in Manila Bay and six sites in Bolinao Bay yielded H/C atomic ratios of 1.1-1.4 and N/C atomic ratios of 0.09-0.16. The Manila Bay HAs had lower H/C and N/C ratios compared to those from Bolinao Bay. The IR spectra showed prominent aliphatic C-H and amide I and II bands. The Manila Bay HAs also had less diverse molecular composition based on the GC-MS analysis of the CuO and alkaline permanganate oxidation products of the humic acids.
Empirical Test Case Specification
DEFF Research Database (Denmark)
Kalyanova, Olena; Heiselberg, Per
This document includes the empirical specification in the IEA task of evaluating building energy simulation computer programs for Double Skin Facade (DSF) constructions. Two approaches are involved in this procedure: one is the comparative approach and the other is the empirical one. In the comparative approach the outcomes of different software tools are compared, while in the empirical approach the modelling results are compared with the results of experimental test cases.
Karloff, Howard
1991-01-01
To this reviewer’s knowledge, this is the first book accessible to the upper division undergraduate or beginning graduate student that surveys linear programming from the Simplex Method…via the Ellipsoid algorithm to Karmarkar’s algorithm. Moreover, its point of view is algorithmic and thus it provides both a history and a case history of work in complexity theory. The presentation is admirable; Karloff's style is informal (even humorous at times) without sacrificing anything necessary for understanding. Diagrams (including horizontal brackets that group terms) aid in providing clarity. The end-of-chapter notes are helpful...Recommended highly for acquisition, since it is not only a textbook, but can also be used for independent reading and study. —Choice Reviews The reader will be well served by reading the monograph from cover to cover. The author succeeds in providing a concise, readable, understandable introduction to modern linear programming. —Mathematics of Computing This is a textbook intend...
Empirical Philosophy of Science
DEFF Research Database (Denmark)
Mansnerus, Erika; Wagenknecht, Susann
2015-01-01
Empirical insights have proven fruitful for the advancement of Philosophy of Science, but the integration of philosophical concepts and empirical data poses considerable methodological challenges. Debates in Integrated History and Philosophy of Science suggest that the advancement of philosophical knowledge takes place through the integration of empirical or historical research into philosophical studies, as Chang, Nersessian, Thagard and Schickore argue in their work. Building upon their contributions we will develop a blueprint for an Empirical Philosophy of Science that draws upon qualitative methods from the social sciences in order to advance our philosophical understanding of science in practice. We will regard the relationship between philosophical conceptualization and empirical data as an iterative dialogue between theory and data, which is guided by a particular ‘feeling with…
Minimum relative entropy, Bayes and Kapur
Woodbury, Allan D.
2011-04-01
The focus of this paper is to illustrate important philosophies on inversion and the similarities and differences between Bayesian and minimum relative entropy (MRE) methods. The development of each approach is illustrated through the general discrete linear inverse problem. MRE differs from both Bayes and classical statistical methods in that knowledge of moments is used as ‘data’ rather than sample values. MRE, like Bayes, presumes knowledge of a prior probability distribution and produces the posterior pdf itself. MRE attempts to produce this pdf based on the information provided by new moments. It will use moments of the prior distribution only if new data on these moments are not available. It is important to note that MRE makes a strong statement that the imposed constraints are exact and complete. In this way, MRE is maximally uncommitted with respect to unknown information. In general, since input data are known only to within a certain accuracy, it is important that any inversion method should allow for errors in the measured data. The MRE approach can accommodate such uncertainty and, in new work described here, previous results are modified to include a Gaussian prior. A variety of MRE solutions are reproduced under a number of assumed moments, including second-order central moments. Various solutions of Jacobs & van der Geest were repeated and clarified. Menke's weighted minimum length solution was shown to have a basis in information theory, and the classic least-squares estimate is shown as a solution to MRE under the conditions of more data than unknowns and where we utilize the observed data and their associated noise. An example inverse problem involving a gravity survey over a layered and faulted zone is shown. In all cases the inverse results match quite closely the actual density profile, at least in the upper portions of the profile. The similarity to the Bayes results presented here is a reflection of the fact that the MRE posterior pdf, and its mean
2006-01-01
The highest tides on Earth occur in the Minas Basin, the eastern extremity of the Bay of Fundy, Nova Scotia, Canada, where the tide range can reach 16 meters when the various factors affecting the tides are in phase. The primary cause of the immense tides of Fundy is a resonance of the Bay of Fundy-Gulf of Maine system. The system is effectively bounded at this outer end by the edge of the continental shelf with its approximately 40:1 increase in depth. The system has a natural period of approximately 13 hours, which is close to the 12h25m period of the dominant lunar tide of the Atlantic Ocean. Like a father pushing his daughter on a swing, the gentle Atlantic tidal pulse pushes the waters of the Bay of Fundy-Gulf of Maine basin at nearly the optimum frequency to cause a large to-and-fro oscillation. The greatest slosh occurs at the head (northeast end) of the system. The high tide image (top) was acquired April 20, 2001, and the low tide image (bottom) was acquired September 30, 2002. The images cover an area of 16.5 by 21 km, and are centered near 64 degrees west longitude and 45.5 degrees north latitude. With its 14 spectral bands from the visible to the thermal infrared wavelength region, and its high spatial resolution of 15 to 90 meters (about 50 to 300 feet), ASTER images Earth to map and monitor the changing surface of our planet. ASTER is one of five Earth-observing instruments launched December 18, 1999, on NASA's Terra satellite. The instrument was built by Japan's Ministry of Economy, Trade and Industry. A joint U.S./Japan science team is responsible for validation and calibration of the instrument and the data products. The broad spectral coverage and high spectral resolution of ASTER provides scientists in numerous disciplines with critical information for surface mapping, and monitoring of dynamic conditions and temporal change. Example applications are: monitoring glacial advances and retreats; monitoring potentially active volcanoes; identifying
Reduction of Linear Programming to Linear Approximation
Vaserstein, Leonid N.
2006-01-01
It is well known that every Chebyshev linear approximation problem can be reduced to a linear program. In this paper we show that conversely every linear program can be reduced to a Chebyshev linear approximation problem.
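The reduction mentioned in the first sentence is the classical one: min_x max_i |a_i·x − b_i| becomes the linear program "minimize t subject to −t ≤ a_i·x − b_i ≤ t for all i". As a dependency-free sketch (one unknown only, with a ternary search standing in for an LP solver, since the minimax objective is convex), not anything taken from the paper itself:

```python
def chebyshev_objective(x, a, b):
    """Minimax residual max_i |a_i * x - b_i| for a scalar unknown x."""
    return max(abs(ai * x - bi) for ai, bi in zip(a, b))

def solve_chebyshev_1d(a, b, lo=-1e3, hi=1e3, iters=200):
    """Minimize the convex, piecewise-linear minimax residual by ternary
    search.  The equivalent linear program (the classical reduction the
    abstract refers to) is:
        minimize t  subject to  -t <= a_i * x - b_i <= t  for all i.
    Ternary search replaces the LP solver purely to keep the sketch
    self-contained."""
    for _ in range(iters):
        m1 = lo + (hi - lo) / 3.0
        m2 = hi - (hi - lo) / 3.0
        if chebyshev_objective(m1, a, b) < chebyshev_objective(m2, a, b):
            hi = m2
        else:
            lo = m1
    x = (lo + hi) / 2.0
    return x, chebyshev_objective(x, a, b)
```

For instance, fitting x to residuals |x − 0| and |x − 2| gives the midpoint x = 1 with minimax error 1, exactly what the LP would return.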
Bio-optical water quality dynamics observed from MERIS in Pensacola Bay, Florida
Observed bio-optical water quality data collected from 2009 to 2011 in Pensacola Bay, Florida were used to develop empirical remote sensing retrieval algorithms for chlorophyll a (Chla), colored dissolved organic matter (CDOM), and suspended particulate matter (SPM). Time-series ...
Bayes procedures for adaptive inference in inverse problems for the white noise model
Knapik, B.T.; Szabó, B.T.; van der Vaart, A.W.; van Zanten, J.H.
2016-01-01
We study empirical and hierarchical Bayes approaches to the problem of estimating an infinite-dimensional parameter in mildly ill-posed inverse problems. We consider a class of prior distributions indexed by a hyperparameter that quantifies regularity. We prove that both methods we consider succeed
DEFF Research Database (Denmark)
A watershed moment of the twentieth century, the end of empire saw upheavals to global power structures and national identities. However, decolonisation profoundly affected individual subjectivities too. Life Writing After Empire examines how people around the globe have made sense of the post...... in order to understand how individual life writing reflects broader societal changes. From far-flung corners of the former British Empire, people have turned to life writing to manage painful or nostalgic memories, as well as to think about the past and future of the nation anew through the personal...
Theological reflections on empire
Directory of Open Access Journals (Sweden)
Allan A. Boesak
2009-11-01
Since the meeting of the World Alliance of Reformed Churches in Accra, Ghana (2004), and the adoption of the Accra Declaration, a debate has been raging in the churches about globalisation, socio-economic justice, ecological responsibility, political and cultural domination and globalised war. Central to this debate is the concept of empire and the way the United States is increasingly becoming its embodiment. Is the United States a global empire? This article argues that the United States has indeed become the expression of a modern empire and that this reality has considerable consequences, not just for global economics and politics but for theological reflection as well.
Yates, K.K.; Cronin, T. M.; Crane, M.; Hansen, M.; Nayeghandi, A.; Swarzenski, P.; Edgar, T.; Brooks, G.R.; Suthard, B.; Hine, A.; Locker, S.; Willard, D.A.; Hastings, D.; Flower, B.; Hollander, D.; Larson, R.A.; Smith, K.
2007-01-01
Many of the nation's estuaries have been environmentally stressed since the turn of the 20th century and will continue to be impacted in the future. Tampa Bay, one of the Gulf of Mexico's largest estuaries, exemplifies the threats that our estuaries face (EPA Report 2001, Tampa Bay Estuary Program-Comprehensive Conservation and Management Plan (TBEP-CCMP)). More than 2 million people live in the Tampa Bay watershed, and the population continues to grow. Demand for freshwater resources, conversion of undeveloped areas to residential and industrial uses, increases in storm-water runoff, and increased air pollution from urban and industrial sources are some of the known human activities that impact Tampa Bay. Beginning in 2001, additional anthropogenic modifications began in Tampa Bay, including construction of an underwater gas pipeline and a desalinization plant, expansion of existing ports, and increased freshwater withdrawal from three major tributaries to the bay. In January of 2001, the Tampa Bay Estuary Program (TBEP) and its partners identified a critical need for participation from the U.S. Geological Survey (USGS) in providing multidisciplinary expertise and a regional-scale, integrated science approach to address complex scientific research issues and critical scientific information gaps that are necessary for continued restoration and preservation of Tampa Bay. Tampa Bay stakeholders identified several critical science gaps for which USGS expertise was needed (Yates et al. 2001). These critical science gaps fall under four topical categories (or system components): 1) water and sediment quality, 2) hydrodynamics, 3) geology and geomorphology, and 4) ecosystem structure and function. Scientists and resource managers participating in Tampa Bay studies recognize that it is no longer sufficient to simply examine each of these estuarine system components individually. Rather, the interrelations among system components must be understood to develop conceptual and
Spatial and temporal distribution of two diazotrophic bacteria in the Chesapeake Bay.
Short, Steven M; Jenkins, Bethany D; Zehr, Jonathan P
2004-04-01
The aim of this study was to initiate autecological studies on uncultivated natural populations of diazotrophic bacteria by examining the distribution of specific diazotrophs in the Chesapeake Bay. By use of quantitative PCR, the abundance of two nifH sequences (907h22 and 912h4) was quantified in water samples collected along a transect from the head to the mouth of the Chesapeake Bay during cruises in April and October 2001 and 2002. Standard curves for the quantitative PCR assays demonstrated that the relationship between gene copies and cycle threshold was linear and highly reproducible from 1 to 10^7 gene copies. The maximum number of 907h22 gene copies detected was approximately 140 ml^-1 and the maximum number of 912h4 gene copies detected was approximately 340 ml^-1. Sequence 912h4 was most abundant at the mouth of the Chesapeake Bay, and in general, its abundance increased with increasing salinity, with the highest abundances observed in April 2002. Overall, the 907h22 phylotype was most abundant at the mid-bay station. Additionally, 907h22 was most abundant in the April samples from the mid-bay and mouth of the Chesapeake Bay. Despite the fact that the Chesapeake Bay is rarely nitrogen limited, our results show that individual nitrogen-fixing bacteria have distinct nonrandom spatial and seasonal distributions in the Chesapeake Bay and are either distributed by specific physical processes or adapted to different environmental niches.
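The standard-curve arithmetic described here (gene copies versus cycle threshold, linear over 1 to 10^7 copies) is conventional qPCR practice and can be sketched as follows. The slope of −3.32 cycles per decade and the intercept of 40 cycles are illustrative values for an ideal ~100%-efficiency assay, not numbers from this study.

```python
import math

def fit_line(xs, ys):
    """Ordinary least squares for y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical standard curve: Ct values measured for serial dilutions of
# 10^1 .. 10^7 gene copies (ideal ~100%-efficiency slope of -3.32/decade).
copies = [10 ** k for k in range(1, 8)]
cts = [40.0 - 3.32 * math.log10(c) for c in copies]
slope, intercept = fit_line([math.log10(c) for c in copies], cts)

def copies_from_ct(ct):
    """Invert the fitted standard curve to estimate gene copies
    in an unknown environmental sample from its measured Ct."""
    return 10 ** ((ct - intercept) / slope)
```

In practice the curve is fitted to replicate dilution-series measurements, and assay efficiency is reported from the fitted slope (efficiency = 10^(−1/slope) − 1).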
Management case study: Tampa Bay, Florida
Morrison, Gerold; Greening, Holly; Yates, Kimberly K.; Wolanski, Eric; McLusky, Donald S.
2011-01-01
Tampa Bay, Florida, USA, is a shallow, subtropical estuary that experienced severe cultural eutrophication between the 1940s and 1980s, a period when the human population of its watershed quadrupled. In response, citizen action led to the formation of a public- and private-sector partnership (the Tampa Bay Estuary Program), which adopted a number of management objectives to support the restoration and protection of the bay’s living resources. These included numeric chlorophyll a and water-clarity targets, as well as long-term goals addressing the spatial extent of seagrasses and other selected habitat types, to support estuarine-dependent faunal guilds. Over the past three decades, nitrogen controls involving sources such as wastewater treatment plants, stormwater conveyance systems, fertilizer manufacturing and shipping operations, and power plants have been undertaken to meet these and other management objectives. Cumulatively, these controls have resulted in a 60% reduction in annual total nitrogen (TN) loads relative to earlier worst-case (late 1970s) conditions. As a result, annual water-clarity and chlorophyll a targets are currently met in most years, and seagrass cover measured in 2008 was the highest recorded since 1950. Factors that have contributed to the observed improvements in Tampa Bay over the past several decades include the following: (1) Development of numeric, science-based water-quality targets to meet a long-term goal of restoring seagrass acreage to 1950s levels. Empirical and mechanistic models found that annual average chlorophyll a concentrations were a primary manageable factor affecting light attenuation. The models also quantified relationships between TN loads, chlorophyll a concentrations, light attenuation, and fluctuations in seagrass cover. The availability of long-term monitoring data, and a systematic process for using the data to evaluate the effectiveness of management actions, has allowed managers to track progress and
African Journals Online (AJOL)
FIRST LADY
2011-01-18
Jan 18, 2011 ... Empirical results reveal that consumption of sugar in Kenya varies ... experiences in trade in different regions of the world. Some studies ... To assess the relationship between domestic sugar retail prices and sugar sales in ...
Modern sedimentary environments in a large tidal estuary, Delaware Bay
Knebel, H.J.
1989-01-01
Data from an extensive grid of sidescan-sonar records reveal the distribution of sedimentary environments in the large, tidally dominated Delaware Bay estuary. Bathymetric features of the estuary include large tidal channels under the relatively deep (> 10 m water depth) central part of the bay, linear sand shoals (2-8 m relief) that parallel the sides of the tidal channels, and broad, low-relief plains that form the shallow bay margins. The two sedimentary environments that were identified are characterized by either (1) bedload transport and/or erosion or (2) sediment reworking and/or deposition. Sand waves and sand ribbons, composed of medium to coarse sands, define sites of active bedload transport within the tidal channels and in gaps between the linear shoals. The sand waves have spacings that vary from 1 to 70 m, amplitudes of 2 m or less, and crestlines that are usually straight. The orientations of the sand waves and ribbons indicate that bottom sediment movement may be toward either the northwest or southeast along the trends of the tidal channels, although sand-wave asymmetry indicates that the net bottom transport is directed northwestward toward the head of the bay. Gravelly, coarse-grained sediments, which appear as strongly reflective patterns on the sonographs, are also present along the axes and flanks of the tidal channels. These coarse sediments are lag deposits that have developed primarily where older strata were eroded at the bay floor. Conversely, fine sands that compose the linear shoals and muddy sands that cover the shallow bay margins appear mainly on the sonographs either as smooth featureless beds that have uniform light to moderate shading or as mosaics of light and dark patches produced by variations in grain size. These acoustic and textural characteristics are the result of sediment deposition and reworking. Data from this study (1) support the hypothesis that bed configurations under deep tidal flows are functions of current
National Oceanic and Atmospheric Administration, Department of Commerce — Samples were collected from October 15, 1985 through June 12, 1987 in emergent marsh and non-vegetated habitats throughout the Lavaca Bay system to characterize...
National Oceanic and Atmospheric Administration, Department of Commerce — Juvenile spotted seatrout and other sportfish are being monitored annually over a 6-mo period in Florida Bay to assess their abundance over time relative to...
Resilience of coastal wetlands to extreme hydrologic events in Apalachicola Bay
Medeiros, S. C.; Singh, A.; Tahsin, S.
2017-12-01
Extreme hydrologic events such as hurricanes and droughts continuously threaten wetlands, which provide key ecosystem services in coastal areas. The recovery time for vegetation after impact from these extreme events can be highly variable depending on the hazard type and intensity. Apalachicola Bay in Florida is home to a rich variety of saltwater and freshwater wetlands and is subject to a wide range of hydrologic hazards. Using spatiotemporal changes in Landsat-based empirical vegetation indices, we investigate the impact of hurricanes and drought on both freshwater and saltwater wetlands from 2000 to 2015 in Apalachicola Bay. Our results indicate that saltwater wetlands are more resilient than freshwater wetlands and suggest that in response to hurricanes, the coastal wetlands took almost a year to recover, while recovery following a drought period was observed after only a month.
Directory of Open Access Journals (Sweden)
Chua Ming-chung
2016-01-01
Utilizing powerful nuclear reactors as antineutrino sources, high mountains to provide ample shielding from cosmic rays in the vicinity, and functionally identical detectors with large target volume for near-far relative measurement, the Daya Bay Reactor Neutrino Experiment has achieved unprecedented precision in measuring the neutrino mixing angle θ13 and the neutrino mass-squared difference |Δm²ee|. I will report the latest Daya Bay results on neutrino oscillations and the light sterile neutrino search.
Directory of Open Access Journals (Sweden)
Tanwiwat Jaikuna
2017-02-01
Purpose: To develop an in-house software program able to calculate and generate the biological dose distribution and biological dose volume histogram by physical dose conversion using the linear-quadratic-linear (LQL) model. Material and methods: The Isobio software was developed using MATLAB version 2014b to calculate and generate the biological dose distribution and biological dose volume histograms. The physical dose from each voxel in the treatment plan was extracted through the Computational Environment for Radiotherapy Research (CERR), and the accuracy was verified by the differentiation between the dose volume histogram from CERR and the treatment planning system. The equivalent dose in 2 Gy fractions (EQD2) was calculated using the biological effective dose (BED) based on the LQL model. The software calculation and the manual calculation were compared for EQD2 verification with paired t-test statistical analysis using IBM SPSS Statistics version 22 (64-bit). Results: Two- and three-dimensional biological dose distributions and biological dose volume histograms were displayed correctly by the Isobio software. Different physical doses were found between CERR and the treatment planning system (TPS) in Oncentra, with 3.33% in the high-risk clinical target volume (HR-CTV) determined by D90%, 0.56% in the bladder and 1.74% in the rectum when determined by D2cc, and less than 1% in Pinnacle. The difference in EQD2 between the software calculation and the manual calculation was 0.00% and not statistically significant (p = 0.820, 0.095, and 0.593 for external beam radiation therapy (EBRT) and p = 0.240, 0.320, and 0.849 for brachytherapy (BT) in HR-CTV, bladder, and rectum, respectively). Conclusions: The Isobio software is a feasible tool to generate the biological dose distribution and biological dose volume histogram for treatment plan evaluation in both EBRT and BT.
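The EQD2 conversion named in this abstract rests on the standard linear-quadratic relations BED = D·(1 + d/(α/β)) and EQD2 = BED/(1 + 2/(α/β)), where d is the dose per fraction. The LQL model used by Isobio additionally switches to a linear term above a transition dose; this minimal sketch shows only the plain LQ core, and the α/β values in the example are assumed, not the study's.

```python
def bed(total_dose, n_fractions, alpha_beta):
    """Biologically effective dose under the plain LQ model:
    BED = D * (1 + d / (alpha/beta)), with d = dose per fraction (Gy)."""
    d = total_dose / n_fractions
    return total_dose * (1.0 + d / alpha_beta)

def eqd2(total_dose, n_fractions, alpha_beta):
    """Equivalent dose in 2 Gy fractions: EQD2 = BED / (1 + 2/(alpha/beta))."""
    return bed(total_dose, n_fractions, alpha_beta) / (1.0 + 2.0 / alpha_beta)
```

For instance, 35 Gy in 5 fractions with an assumed α/β = 3 Gy gives EQD2 = 70 Gy, and any schedule already delivered at 2 Gy per fraction maps to itself.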
Wind-Driven Waves in Tampa Bay, Florida
Gilbert, S. A.; Meyers, S. D.; Luther, M. E.
2002-12-01
Turbidity and nutrient flux due to sediment resuspension by waves and currents are important factors controlling water quality in Tampa Bay. During December 2001 and January 2002, four Sea Bird Electronics SeaGauge wave and tide recorders were deployed in Tampa Bay in each major bay segment. Since May 2002, a SeaGauge has been continuously deployed at a site in middle Tampa Bay as a component of the Bay Regional Atmospheric Chemistry Experiment (BRACE). Initial results for the summer 2002 data indicate that significant wave height is linearly dependent on wind speed and direction over a range of 1 to 12 m/s. The data were divided into four groups according to wind direction. Wave height dependence on wind speed was examined for each group. Both northeasterly and southwesterly winds force significant wave heights that are about 30% larger than those for northwesterly and southeasterly winds. This difference is explained by variations in fetch due to basin shape. Comparisons are made between these observations and the results of a SWAN-based model of Tampa Bay. The SWAN wave model is coupled to a three-dimensional circulation model and computes wave spectra at each model grid cell under observed wind conditions and modeled water velocity. When SWAN is run without dissipation, the model results are generally similar in wave period but about 25%-50% higher in significant wave height than the observations. The impact of various dissipation mechanisms such as bottom drag and whitecapping on the wave state is being investigated. Preliminary analyses on winter data give similar results.
Remotely Sensing Pollution: Detection and Monitoring of PCBs in the San Francisco Bay
Hilton, A.; Kudela, R. M.; Bausell, J.
2016-12-01
While the EPA banned polychlorinated biphenyls (PCBs) in 1977, they continue to persist in San Francisco Bay (SF Bay), often at dangerously high concentrations due to their long half-life. However, in spite of their associated health and environmental risks, PCB monitoring within SF Bay is extremely limited, due in large part to the high costs, both in terms of labor and capital, that are associated with it. In this study, a cost-effective alternative to in-situ PCB sampling is presented by demonstrating the feasibility of PCB detection via remote sensing. This was done by first establishing relationships between in-situ measurements of sum-of-40-PCB concentrations and total suspended sediment concentration (SSC) collected from 1998-2006 at 37 stations distributed throughout SF Bay. A correlation was found across all stations (R² = 0.32), which improved markedly upon partitioning stations into north bay (R² = 0.64), central bay (R² = 0.80) and south bay (R² = 0.52) regions. SSC was then compared from three USGS monitoring stations with temporally consistent Landsat 8 imagery. The resulting correlation between Landsat 8 Rrs(654) and SSC measured at USGS stations (R² = 0.50) was validated using an Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) image. The end product is a two-step empirical algorithm that can derive PCB concentrations from Landsat 8 imagery within SF Bay. This algorithm can generate spatial PCB concentration maps for SF Bay, which can in turn be utilized to improve the ability to forecast PCB concentration. The observation that the correlation between AVIRIS Rrs(657) and SSC was stronger than that of Landsat 8 suggests that the accuracy of this algorithm could be enhanced with improved atmospheric correction.
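The "two-step empirical algorithm" described above (reflectance → SSC → PCB) amounts to composing two fitted linear regressions. The sketch below uses invented calibration numbers purely to show that structure; the actual coefficients, bands, regional partitions, and units belong to the study.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Invented calibration data standing in for the study's two regressions.
rrs = [0.005, 0.010, 0.015, 0.020, 0.025]   # red-band Rrs (sr^-1), assumed
ssc = [8.0, 15.0, 23.0, 30.0, 38.0]         # suspended sediment (mg/L), assumed
pcb = [12.0, 21.0, 33.0, 42.0, 54.0]        # sum-of-PCBs for the same samples

s1, i1 = fit_line(rrs, ssc)   # step 1: reflectance -> SSC
s2, i2 = fit_line(ssc, pcb)   # step 2: SSC -> PCB

def pcb_from_rrs(r):
    """Two-step retrieval: chain the two fitted regressions."""
    return s2 * (s1 * r + i1) + i2
```

Applied per pixel of an atmospherically corrected scene, this composition is what turns a reflectance image into a spatial PCB concentration map.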
Empirical philosophy of science
DEFF Research Database (Denmark)
Wagenknecht, Susann; Nersessian, Nancy J.; Andersen, Hanne
2015-01-01
A growing number of philosophers of science make use of qualitative empirical data, a development that may reconfigure the relations between philosophy and sociology of science and that is reminiscent of efforts to integrate history and philosophy of science. Therefore, the first part...... of this introduction to the volume Empirical Philosophy of Science outlines the history of relations between philosophy and sociology of science on the one hand, and philosophy and history of science on the other. The second part of this introduction offers an overview of the papers in the volume, each of which...... is giving its own answer to questions such as: Why does the use of qualitative empirical methods benefit philosophical accounts of science? And how should these methods be used by the philosopher?...
DEFF Research Database (Denmark)
Sporring, Jon
Principal Component Analysis is a simple tool for obtaining linear models of stochastic data and is used both for data reduction (equivalently, noise elimination) and for data analysis. Principal Component Analysis fits a multivariate Gaussian distribution to the data, and the typical method is by...
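The Gaussian-fitting view of PCA in this abstract can be made concrete in two dimensions: estimate the sample covariance matrix and take its eigenvalues, whose ratio measures how much variance a one-component (linear) model captures. A stdlib-only sketch with made-up data:

```python
# 2-D PCA as Gaussian fitting: compute the sample covariance matrix, then its
# eigenvalues in closed form (2x2 symmetric case). The data are a small toy
# set, assumed for illustration only.
import math

data = [(2.5, 2.4), (0.5, 0.7), (2.2, 2.9), (1.9, 2.2),
        (3.1, 3.0), (2.3, 2.7), (2.0, 1.6), (1.0, 1.1)]

n = len(data)
mx = sum(x for x, _ in data) / n
my = sum(y for _, y in data) / n
sxx = sum((x - mx) ** 2 for x, _ in data) / (n - 1)
syy = sum((y - my) ** 2 for _, y in data) / (n - 1)
sxy = sum((x - mx) * (y - my) for x, y in data) / (n - 1)

# Eigenvalues of [[sxx, sxy], [sxy, syy]] via trace/determinant
tr, det = sxx + syy, sxx * syy - sxy * sxy
disc = math.sqrt(tr * tr / 4 - det)
lam1, lam2 = tr / 2 + disc, tr / 2 - disc   # lam1 >= lam2

# Fraction of total variance captured by the first principal component;
# a value near 1 means a single linear component models the data well.
explained = lam1 / (lam1 + lam2)
print(round(explained, 3))
```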
DEFF Research Database (Denmark)
Gravier, Magali
2011-01-01
The article discusses the concepts of federation and empire in the context of the European Union (EU). Even if these two concepts are not usually contrasted with one another, the article shows that they refer to related types of polities. Furthermore, they can be used at the same time because they shed light...... on different and complementary aspects of the European integration process. The article concludes that the EU is at the crossroads between federation and empire and may remain an ‘imperial federation’ for several decades. This could mean that the EU is on the verge of transforming itself to another type...
Empirical comparison of theories
International Nuclear Information System (INIS)
Opp, K.D.; Wippler, R.
1990-01-01
The book represents the first, comprehensive attempt to take an empirical approach for comparative assessment of theories in sociology. The aims, problems, and advantages of the empirical approach are discussed in detail, and the three theories selected for the purpose of this work are explained. Their comparative assessment is performed within the framework of several research projects, which among other subjects also investigate the social aspects of the protest against nuclear power plants. The theories analysed in this context are the theory of mental incongruities and that of the benefit, and their efficiency in explaining protest behaviour is compared. (orig./HSCH) [de
DEFF Research Database (Denmark)
Grund, Cynthia M.
The toolbox for empirically exploring the ways that artistic endeavors convey and activate meaning on the part of performers and audiences continues to expand. Current work employing methods at the intersection of performance studies, philosophy, motion capture and neuroscience to better understand...... musical performance and reception is inspired by traditional approaches within aesthetics, but it also challenges some of the presuppositions inherent in them. As an example of such work I present a research project in empirical music aesthetics begun last year and of which I am a team member....
Bayes estimation of the general hazard rate model
International Nuclear Information System (INIS)
Sarhan, A.
1999-01-01
In reliability theory and life testing models, the lifetime distributions are often specified by choosing a relevant hazard rate function. Here a general hazard rate function h(t) = a + bt^(c-1), where c, a, b are constants greater than zero, is considered. The parameter c is assumed to be known. The Bayes estimators of (a,b) based on the data of type II/item-censored testing without replacement are obtained. A large simulation study using the Monte Carlo method is done to compare the performance of the Bayes estimators with regression estimators of (a,b). The criterion for comparison is based on the Bayes risk associated with the respective estimator. Also, the influence of the number of failed items on the accuracy of the estimators (Bayes and regression) is investigated. Estimates for the parameters (a,b) of the linearly increasing hazard rate model h(t) = a + bt, where a, b are greater than zero, can be obtained as the special case c = 2.
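The general hazard rate above implies a cumulative hazard H(t) = a*t + (b/c)*t^c and survival function S(t) = exp(-H(t)), with c = 2 recovering the linearly increasing model. A small sketch with illustrative parameter values (not from the paper):

```python
# General hazard rate h(t) = a + b*t**(c-1) and its implied survival function
# S(t) = exp(-(a*t + (b/c)*t**c)). Parameter values are illustrative only.
import math

def hazard(t, a, b, c):
    return a + b * t ** (c - 1)

def survival(t, a, b, c):
    return math.exp(-(a * t + (b / c) * t ** c))

# c = 2 recovers the linearly increasing hazard h(t) = a + b*t
a, b, c = 0.5, 0.2, 2.0
print(round(hazard(1.0, a, b, c), 3))    # → 0.7, i.e. a + b*1
print(round(survival(0.0, a, b, c), 3))  # → 1.0, since H(0) = 0
```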
Empirical research through design
Keyson, D.V.; Bruns, M.
2009-01-01
This paper describes the empirical research through design method (ERDM), which differs from current approaches to research through design by enforcing the need for the designer, after a series of pilot prototype-based studies, to develop a priori a number of testable interaction design hypotheses
Essays in empirical microeconomics
Péter, A.N.
2016-01-01
The empirical studies in this thesis investigate various factors that could affect individuals' labor market, family formation and educational outcomes. Chapter 2 focuses on scheduling as a potential determinant of individuals' productivity. Chapter 3 looks at the role of a family factor on
Worship, Reflection, Empirical Research
Ding Dong,
2012-01-01
In my youth, I was a worshipper of Mao Zedong. From the latter stage of the Mao Era to the early years of Reform and Opening, I began to reflect on Mao and the Communist Revolution he launched. In recent years I’ve devoted myself to empirical historical research on Mao, seeking the truth about Mao and China’s modern history.
DEFF Research Database (Denmark)
Bang, Peter Fibiger
2007-01-01
This article seeks to establish a new set of organizing concepts for the analysis of the Roman imperial economy from Republic to late antiquity: tributary empire, portfolio capitalism and protection costs. Together these concepts better explain economic developments in the Roman world than the...
Empirically sampling Universal Dependencies
DEFF Research Database (Denmark)
Schluter, Natalie; Agic, Zeljko
2017-01-01
Universal Dependencies incur a high cost in computation for unbiased system development. We propose a 100% empirically chosen small subset of UD languages for efficient parsing system development. The technique used is based on measurements of model capacity globally. We show that the diversity o...
Long-Term Water Temperature Variations in Daya Bay, China Using Satellite and In Situ Observations
Directory of Open Access Journals (Sweden)
Jing Yu
2010-01-01
Daya Bay is a shallow, semi-enclosed bay in the northern section of the South China Sea. The present study analyzed variations of water temperature in Daya Bay over the past 21 years (1985 - 2005) using Advanced Very High Resolution Radiometer (AVHRR) satellite remote sensing data and in situ observations. Results showed that AVHRR readings of sea surface temperature (SST) increased by 0.07°C y-1. Linear regression analysis for monthly SST anomalies (SSTA) showed a shift from negative to positive in 1995 - 1996, after the Daya Bay nuclear power station commenced operations in 1994. The slope of the linear regression for SSTA nearly doubled from 0.05 (1985 - 1993) to 0.09 (1994 - 2005). Monthly AVHRR images showed a thermal plume from the power station and revealed the increase of SST over the 21 years. In situ observations of water temperature also showed an increasing trend for the same period (1985 - 2005). Variations in water temperature in Daya Bay were connected with climatic perturbations and increasing human activity, including thermal discharge from nuclear power stations and the rapid economic development around the bay area.
33 CFR 100.919 - International Bay City River Roar, Bay City, MI.
2010-07-01
... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false International Bay City River Roar, Bay City, MI. 100.919 Section 100.919 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF... Bay City River Roar, Bay City, MI. (a) Regulated Area. A regulated area is established to include all...
77 FR 2972 - Thunder Bay Power Company, Thunder Bay Power, LLC, et al.
2012-01-20
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Thunder Bay Power Company, Thunder Bay Power, LLC, et al.; Notice of Application for Transfer of Licenses, and Soliciting Comments and Motions To Intervene Thunder Bay Power Company Project No. 2404-095 Thunder Bay Power, LLC Midwest Hydro, Inc...
33 CFR 162.125 - Sturgeon Bay and the Sturgeon Bay Ship Canal, Wisc.
2010-07-01
... 33 Navigation and Navigable Waters 2 2010-07-01 2010-07-01 false Sturgeon Bay and the Sturgeon Bay Ship Canal, Wisc. 162.125 Section 162.125 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) PORTS AND WATERWAYS SAFETY INLAND WATERWAYS NAVIGATION REGULATIONS § 162.125 Sturgeon Bay and the Sturgeon Bay Ship...
2012-06-28
... 1625-AA00 Safety Zone; Alexandria Bay Chamber of Commerce, St. Lawrence River, Alexandria Bay, NY... restrict vessels from a portion of the St. Lawrence River during the Alexandria Bay Chamber of Commerce... of proposed rulemaking (NPRM) entitled Safety Zone; Alexandria Bay Chamber of Commerce, St. Lawrence...
Humboldt Bay, California Benthic Habitats 2009 Geodatabase
National Oceanic and Atmospheric Administration, Department of Commerce — Humboldt Bay is the largest estuary in California north of San Francisco Bay and represents a significant resource for the north coast region. Beginning in 2007 the...
Humboldt Bay Benthic Habitats 2009 Aquatic Setting
National Oceanic and Atmospheric Administration, Department of Commerce — Humboldt Bay is the largest estuary in California north of San Francisco Bay and represents a significant resource for the north coast region. Beginning in 2007 the...
San Francisco Bay Water Quality Improvement Fund
EPAs grant program to protect and restore San Francisco Bay. The San Francisco Bay Water Quality Improvement Fund (SFBWQIF) has invested in 58 projects along with 70 partners contributing to restore wetlands, water quality, and reduce polluted runoff.,
South Bay Salt Pond Mercury Studies Project
Information about the SFBWQP South Bay Salt Pond Mercury Studies Project, part of an EPA competitive grant program to improve SF Bay water quality focused on restoring impaired waters and enhancing aquatic resources.
Humboldt Bay, California Benthic Habitats 2009 Substrate
National Oceanic and Atmospheric Administration, Department of Commerce — Humboldt Bay is the largest estuary in California north of San Francisco Bay and represents a significant resource for the north coast region. Beginning in 2007 the...
Humboldt Bay, California Benthic Habitats 2009 Geoform
National Oceanic and Atmospheric Administration, Department of Commerce — Humboldt Bay is the largest estuary in California north of San Francisco Bay and represents a significant resource for the north coast region. Beginning in 2007 the...
Contaminant transport in Massachusetts Bay
Butman, Bradford
Construction of a new treatment plant and outfall to clean up Boston Harbor is currently one of the world's largest public works projects, costing about $4 billion. There is concern about the long-term impact of contaminants on Massachusetts Bay and adjacent Gulf of Maine because these areas are used extensively for transportation, recreation, fishing, and tourism, as well as waste disposal. Public concern also focuses on Stellwagen Bank, located on the eastern side of Massachusetts Bay, which is an important habitat for endangered whales. Contaminants reach Massachusetts Bay not only from Boston Harbor, but from other coastal communities on the Gulf of Maine, as well as from the atmosphere. Knowledge of the pathways, mechanisms, and rates at which pollutants are transported throughout these coastal environments is needed to address a wide range of management questions.
International Nuclear Information System (INIS)
Davis, J.M.; Pollock, J.R.
1992-01-01
Almost every day, it seems, someone is mentioning Prudhoe Bay---its development activities, the direction of its oil production, and more recently its decline rate. Almost as frequently, someone is mentioning the number of companies abandoning exploration in Alaska. The state faces a double-edged dilemma: decline of its most important oil field and a diminished effort to find a replacement for the lost production. ARCO has seen the Prudhoe Bay decline coming for some time and has been planning for it. We have reduced staff, and ARCO and BP Exploration are finding cost-effective ways to work more closely together through such vehicles as shared services. At the same time, ARCO is continuing its high level of Alaskan exploration. This article will assess the future of Prudhoe Bay from a technical perspective, review ARCO's exploration plans for Alaska, and suggest what the state can do to encourage other companies to invest in this crucial producing region and exploratory frontier
On misclassification probabilities of linear and quadratic classifiers ...
African Journals Online (AJOL)
We study the theoretical misclassification probability of linear and quadratic classifiers and examine the performance of these classifiers under distributional variations in theory and using simulation. We derive expressions for the Bayes errors of some competing distributions from the same family under location shift. Keywords: ...
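For the simplest location-shift setting the Bayes error has a closed form: for two equally likely Gaussian classes N(0, 1) and N(delta, 1), the optimal rule thresholds at delta/2 and errs with probability Phi(-delta/2). A sketch of that textbook case (not the paper's specific distributions):

```python
# Bayes error under a pure location shift between two equally likely
# univariate Gaussian classes N(0, 1) and N(delta, 1). The optimal (linear)
# rule classifies by the nearest mean, so the error is Phi(-delta/2).
import math

def std_normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bayes_error(delta):
    """Misclassification probability of the Bayes-optimal rule."""
    return std_normal_cdf(-delta / 2.0)

print(round(bayes_error(2.0), 4))  # → 0.1587, i.e. Phi(-1)
```

As expected, the error shrinks monotonically as the location shift grows.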
International Nuclear Information System (INIS)
Honda, Teruyuki; Kimura, Ken-ichiro
2003-01-01
Fourteen major and trace elements in marine sediment core samples collected from the coasts along eastern Japan, i.e. Tokyo Bay (II) (the recess), Tokyo Bay (IV) (the mouth), Mutsu Bay and Funka Bay, and from the Northwest Pacific basin as a comparative subject, were determined by instrumental neutron activation analysis (INAA). The sedimentation rates and sedimentary ages were calculated for the coastal sediment cores by the 210Pb method. The results obtained in this study are summarized as follows: (1) Lanthanoid abundance patterns suggested that the major origin of the sediments was terrigenous material. La*/Lu* and Ce*/La* ratios revealed that the sediments from Tokyo Bay (II) and Mutsu Bay more directly reflected the contribution from rivers than those of other regions. In addition, the Th/Sc ratio indicated that the coastal sediments mainly originated in materials from the volcanic island-arcs, the Japanese islands, whereas those from the Northwest Pacific came mainly from the continent. (2) The correlation between the Ce/U and Th/U ratios, with high correlation coefficients of 0.920 to 0.991, indicated that all the sediments from Tokyo Bay (II) and Funka Bay were in reducing conditions, while at least the upper sediments from Tokyo Bay (IV) and Mutsu Bay were in oxidizing conditions. (3) It became quite obvious that the sedimentation mechanism and sedimentation environment at Tokyo Bay (II) were different from those at Tokyo Bay (IV), since the sedimentation rate at Tokyo Bay (II) was approximately twice as large as that at Tokyo Bay (IV). The sedimentary age of the 5th layer (8∼10 cm in depth) from Funka Bay was calculated at approximately 1940∼50, which agrees with the time, 1943∼45, when Showa-shinzan was formed by the eruption of the Usu volcano. (author)
Mobile Bay turbidity plume study
Crozier, G. F.
1976-01-01
Laboratory and field transmissometer studies on the effect of suspended particulate material upon the appearance of water are reported. Quantitative correlations were developed between remotely sensed image density, optical sea truth data, and actual sediment load. Evaluation of satellite image sea truth data for an offshore plume projects contours of transmissivity for two different tidal phases. The data clearly demonstrate the speed of change and movement of the optical plume for water patterns associated with the mouth of Mobile Bay, in which relatively clear Gulf of Mexico water enters the bay on the eastern side. Data show that wind stress in excess of 15 knots has a marked impact in producing suspended sediment loads.
Automation in tube finishing bay
International Nuclear Information System (INIS)
Bhatnagar, Prateek; Satyadev, B.; Raghuraman, S.; Syama Sundara Rao, B.
1997-01-01
The automation concept in the tube finishing bay, introduced after the final pass annealing of PHWR tubes, resulted in the integration of a number of sub-systems in synchronisation with each other to produce final cut fuel tubes of specified length, tube finish etc. The tube finishing bay, which was physically segregated into four distinct areas: 1. tube spreader and stacking area, 2. I.D. sand blasting area, 3. end conditioning, wad blowing, end capping and O.D. wet grinding area, 4. tube inspection, tube cutting and stacking area, has been studied
Chesapeake Bay plume dynamics from LANDSAT
Munday, J. C., Jr.; Fedosh, M. S.
1981-01-01
LANDSAT images with enhancement and density slicing show that the Chesapeake Bay plume usually frequents the Virginia coast south of the Bay mouth. Southwestern (compared to northern) winds spread the plume easterly over a large area. Ebb tide images (compared to flood tide images) show a more dispersed plume. Flooding waters produce high turbidity levels over the shallow northern portion of the Bay mouth.
Default Bayes factors for ANOVA designs
Rouder, Jeffrey N.; Morey, Richard D.; Speckman, Paul L.; Province, Jordan M.
2012-01-01
Bayes factors have been advocated as superior to p-values for assessing statistical evidence in data. Despite the advantages of Bayes factors and the drawbacks of p-values, inference by p-values is still nearly ubiquitous. One impediment to the adoption of Bayes factors is a lack of practical
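A concrete sense of what a default Bayes factor computes can be given with the classic binomial example (simpler than the ANOVA designs of the paper itself): compare H0: p = 0.5 against H1: p ~ Uniform(0, 1) for k successes in n trials. Under the uniform prior the marginal likelihood of k is exactly 1/(n+1):

```python
# Textbook default Bayes factor for a binomial test, H0: p = 0.5 versus
# H1: p ~ Uniform(0, 1). This is the standard illustrative example, not the
# ANOVA-design Bayes factors developed in the paper.
from math import comb

def bayes_factor_01(k, n):
    """BF_01 = P(data | H0) / P(data | H1); values > 1 favour H0."""
    m0 = comb(n, k) * 0.5 ** n   # likelihood of k successes when p = 0.5
    m1 = 1.0 / (n + 1)           # marginal likelihood under the uniform prior
    return m0 / m1

print(round(bayes_factor_01(5, 10), 3))   # → 2.707: balanced data favour H0
```

Unlike a p-value, the Bayes factor can quantify evidence *for* the null, as the balanced-data case shows.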
Occurrence and survival of Vibrio alginolyticus in Tamouda Bay (Morocco).
Sabir, M; Cohen, N; Boukhanjer, A; Ennaji, M M
2011-10-15
The objectives of this study were to investigate the spatial and seasonal fluctuations of Vibrio alginolyticus in the marine environment of Tamouda Bay on the Mediterranean coast of Morocco and to determine the dominant environmental factors that govern these fluctuations. The samples (sea water, plankton, shellfish and sediment) were collected fortnightly for two years from three study sites on the coast of Tamouda Bay in northern Morocco. The Vibrio alginolyticus load was determined by the MPN method. Physicochemical parameters including sea water temperature, pH, salinity, turbidity and chlorophyll a concentration were determined. Analysis of variance of specific variables and several principal component analyses showed that the temperature of seawater is the major determinant of the seasonal distribution of Vibrio alginolyticus. The results showed a positive linear correlation between Vibrio alginolyticus and water temperature, pH, turbidity and chlorophyll a. Similarly, there are seasonal and spatial variations of Vibrio alginolyticus in the marine environment of Tamouda Bay, and the highest concentrations were recorded in both years of study during the warm season, whereas they were minimal during the cold season. A positive linear correlation was recorded between Vibrio alginolyticus populations in all ecological types of samples studied.
Linear Algebra and Smarandache Linear Algebra
Vasantha, Kandasamy
2003-01-01
The present book, on Smarandache linear algebra, not only studies the Smarandache analogues of linear algebra and its applications, it also aims to bridge the need for new research topics pertaining to linear algebra, purely in the algebraic sense. We have introduced Smarandache semilinear algebra, Smarandache bilinear algebra and Smarandache anti-linear algebra and their fuzzy equivalents. Moreover, in this book, we have brought out the study of linear algebra and vector spaces over finite p...
DEFF Research Database (Denmark)
Rasch, Astrid
Decolonisation was a major event of the twentieth century, redrawing maps and impacting on identity narratives around the globe. As new nations defined their place in the world, the national and imperial past was retold in new cultural memories. These developments have been studied at the level...... of the collective, but insufficient attention has been paid to how individuals respond to such narrative changes. This dissertation examines the relationship between individual and collective memory at the end of empire through analysis of 13 end of empire autobiographies by public intellectuals from Australia......, the Anglophone Caribbean and Zimbabwe. I conceive of memory as reconstructive and social, with individual memory striving to make sense of the past in the present in dialogue with surrounding narratives. By examining recurring tropes in the autobiographies, like colonial education, journeys to the imperial...
International Nuclear Information System (INIS)
Guillemoles, A.; Lazareva, A.
2008-01-01
Gazprom is conquering the world. The Russian industrial giant owns the largest gas reserves and enjoys the privilege of considerable power. Gazprom edits journals, owns hospitals and airplanes, and has even built cities where most of the inhabitants work for it. With 400,000 workers, Gazprom represents 8% of Russia's GDP. This inquiry describes the history and operation of this empire and shows how it has become a centerpiece of the government's strategy to reconquer Russian influence on the world scale. Is it going to be a winning game? Are the corruption affairs and the expected depletion of resources going to weaken the empire? The authors shed light on the political and diplomatic strategies that are played around the crucial dossier of energy supply. (J.S.)
Classification using Hierarchical Naive Bayes models
DEFF Research Database (Denmark)
Langseth, Helge; Dyhre Nielsen, Thomas
2006-01-01
Classification problems have a long history in the machine learning literature. One of the simplest, and yet most consistently well-performing set of classifiers is the Naïve Bayes models. However, an inherent problem with these classifiers is the assumption that all attributes used to describe......, termed Hierarchical Naïve Bayes models. Hierarchical Naïve Bayes models extend the modeling flexibility of Naïve Bayes models by introducing latent variables to relax some of the independence statements in these models. We propose a simple algorithm for learning Hierarchical Naïve Bayes models...
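The independence assumption the abstract refers to (and which hierarchical variants relax via latent variables) is easiest to see in a plain Naive Bayes classifier, where each attribute contributes an independent conditional probability. A stdlib-only sketch with toy binary data:

```python
# Minimal Naive Bayes over binary attributes, illustrating the
# conditional-independence assumption discussed above. Laplace smoothing;
# the data and labels are invented toy examples.
import math
from collections import defaultdict

def train(samples):
    """samples: list of (features, label). Returns priors and per-class
    conditional counts keyed by (attribute index, value)."""
    counts = defaultdict(int)
    cond = defaultdict(lambda: defaultdict(int))
    for feats, label in samples:
        counts[label] += 1
        for i, v in enumerate(feats):
            cond[label][(i, v)] += 1
    total = sum(counts.values())
    priors = {c: n / total for c, n in counts.items()}
    return priors, cond, counts

def predict(feats, priors, cond, counts):
    """Score each class by log prior plus summed log conditionals
    (Laplace-smoothed, two possible values per attribute)."""
    best, best_score = None, float("-inf")
    for c, prior in priors.items():
        score = math.log(prior)
        for i, v in enumerate(feats):
            score += math.log((cond[c][(i, v)] + 1) / (counts[c] + 2))
        if score > best_score:
            best, best_score = c, score
    return best

data = [((1, 0), "spam"), ((1, 1), "spam"), ((0, 0), "ham"), ((0, 1), "ham")]
priors, cond, counts = train(data)
print(predict((1, 0), priors, cond, counts))  # → spam
```

A hierarchical variant would insert latent variables between the class and correlated attribute groups, so that only the groups, not every attribute, are assumed independent.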
Vorobel, Vit; Daya Bay Collaboration
2017-07-01
The Daya Bay Reactor Neutrino Experiment was designed to measure θ13, the smallest mixing angle in the three-neutrino mixing framework, with unprecedented precision. The experiment consists of eight functionally identical detectors placed underground at different baselines from three pairs of nuclear reactors in South China. Since Dec. 2011, the experiment has been running stably for more than 4 years, and has collected the largest reactor anti-neutrino sample to date. Daya Bay is able to greatly improve the precision on θ13 and to make an independent measurement of the effective mass splitting in the electron antineutrino disappearance channel. Daya Bay can also perform a number of other precise measurements, such as a high-statistics determination of the absolute reactor antineutrino flux and spectrum, as well as a search for sterile neutrino mixing, among others. The most recent results from Daya Bay are discussed in this paper, as well as the current status and future prospects of the experiment.
Daya bay reactor neutrino experiment
International Nuclear Information System (INIS)
Cao Jun
2010-01-01
The Daya Bay Reactor Neutrino Experiment is a large international collaboration experiment under construction. The experiment aims to precisely determine the neutrino mixing angle θ13 by detecting the neutrinos produced by the Daya Bay Nuclear Power Plant. θ13 is one of the two unknown fundamental parameters in neutrino mixing. Its magnitude is a roadmap for future neutrino physics, and very likely related to the puzzle of missing antimatter in our universe. The precise measurement has very important physics significance. The detectors of Daya Bay are under construction now. Full operation is expected in 2011. Three years of data taking will reach the designed precision, determining sin^2(2θ13) to better than 0.01. The Daya Bay neutrino detector is an underground large nuclear detector of low background, low energy, and high precision. In this paper, the layout of the experiment, the design and fabrication progress of the detectors, and some highlighted nuclear detection techniques developed in the detector R and D are introduced. (author)
Linear and nonlinear analysis of high-power rf amplifiers
International Nuclear Information System (INIS)
Puglisi, M.
1983-01-01
After a survey of the state variable analysis method, the final amplifier for the CBA is analyzed taking into account the real beam waveshape. An empirical method for checking the stability of a non-linear system is also considered
Differences in Dynamic Brand Competition Across Markets: An Empirical Analysis
Jean-Pierre Dubé; Puneet Manchanda
2005-01-01
We investigate differences in the dynamics of marketing decisions across geographic markets empirically. We begin with a linear-quadratic game involving forward-looking firms competing on prices and advertising. Based on the corresponding Markov perfect equilibrium, we propose estimable econometric equations for demand and marketing policy. Our model allows us to measure empirically the strategic response of competitors along with economic measures such as firm profitability. We use a rich da...
Institute of Scientific and Technical Information of China (English)
赵琳; 傅联英; 陈波
2014-01-01
Advertising is not an uncommon tool in the non-price competition strategy pool, and it affects market performance non-linearly (Ishigaki, 2000). This paper employs a non-dynamic panel threshold regression model to investigate the non-linear relationship between advertising and market performance in the pharmaceutical industry. The empirical results find strong evidence of an inverted U-shaped relationship between them and identify a significant threshold effect. Specifically, advertising investment significantly promotes the profit of pharmaceutical enterprises when the intensity of advertising falls between 0 and 0.0491, while it significantly reduces their profit when the intensity of advertising exceeds 0.0491. The marginal effect of advertising on profit decreases, and thus the optimal intensity of advertising for the pharmaceutical industry is 0.0491. Further, pharmaceutical enterprises representing 4.7% of the full sample over-advertised during the observation period. However, large pharmaceutical enterprises over-advertised at a striking proportion of 27.3% in the subsample grouped by scale. Small and medium pharmaceutical enterprises also over-advertised, but at a lower proportion. The conclusions are beneficial to pharmaceutical enterprises in China, and some recommendations are offered.
Epistemology and Empirical Investigation
DEFF Research Database (Denmark)
Ahlström, Kristoffer
2008-01-01
Recently, Hilary Kornblith has argued that epistemological investigation is substantially empirical. In the present paper, I will first show that his claim is not contingent upon the further and, admittedly, controversial assumption that all objects of epistemological investigation are natural kinds....... Then, I will argue that, contrary to what Kornblith seems to assume, this methodological contention does not imply that there is no need for attending to our epistemic concepts in epistemology. Understanding the make-up of our concepts and, in particular, the purposes they fill, is necessary...
Numerical model for wind-driven circulation in the Bay of Bengal
Digital Repository Service at National Institute of Oceanography (India)
Bahulayan, N.; Varadachari, V.V.R.
Wind-driven circulation in the Bay of Bengal, generated by a southwest wind of constant speed (10 m/s) and direction (225 degrees TN), is presented. A non-linear hydrodynamic model is used for the simulation of circulation. Numerical...
Lithosphere structure and upper mantle characteristics below the Bay of Bengal
Digital Repository Service at National Institute of Oceanography (India)
Rao, G.S.; Radhakrishna, M.; Sreejith, K.M.; Krishna, K.S.; Bull, J.M.
The oceanic lithosphere in the Bay of Bengal (BOB) formed 80-120 Ma following the breakup of eastern Gondwanaland. Since its formation, it has been affected by the emplacement of two long N-S trending linear aseismic ridges (85°E and Ninetyeast...
Constructive Verification, Empirical Induction, and Falibilist Deduction: A Threefold Contrast
Directory of Open Access Journals (Sweden)
Julio Michael Stern
2011-10-01
This article explores some open questions related to the problem of verification of theories in the context of the empirical sciences by contrasting three epistemological frameworks. Each of these epistemological frameworks is based on a corresponding central metaphor, namely: (a) Neo-empiricism and the gambling metaphor; (b) Popperian falsificationism and the scientific tribunal metaphor; (c) Cognitive constructivism and the object as eigen-solution metaphor. Each of these epistemological frameworks has also historically co-evolved with a certain statistical theory and method for testing scientific hypotheses, respectively: (a) Decision-theoretic Bayesian statistics and Bayes factors; (b) Frequentist statistics and p-values; (c) Constructive Bayesian statistics and e-values. This article examines with special care the Zero Probability Paradox (ZPP), related to the verification of sharp or precise hypotheses. Finally, this article makes some remarks on Lakatos’ view of mathematics as a quasi-empirical science.
Empirical microeconomics action functionals
Baaquie, Belal E.; Du, Xin; Tanputraman, Winson
2015-06-01
A statistical generalization of microeconomics has been made in Baaquie (2013), where the market price of every traded commodity, at each instant of time, is considered to be an independent random variable. The dynamics of commodity market prices is modeled by an action functional, and the focus of this paper is to empirically determine the action functionals for different commodities. The correlation functions of the model are defined using a Feynman path integral. The model is calibrated using the unequal-time correlation of the market commodity prices as well as their cubic and quartic moments using a perturbation expansion. The consistency of the perturbation expansion is verified by a numerical evaluation of the path integral. Nine commodities drawn from the energy, metal and grain sectors are studied and their market behavior is described by the model to an accuracy of over 90% using only six parameters. The paper empirically establishes the existence of the action functional for commodity prices that was postulated to exist in Baaquie (2013).
Pérez-Rodríguez, Paulino; Gianola, Daniel; González-Camacho, Juan Manuel; Crossa, José; Manès, Yann; Dreisigacker, Susanne
2012-12-01
In genome-enabled prediction, parametric, semi-parametric, and non-parametric regression models have been used. This study assessed the predictive ability of linear and non-linear models using dense molecular markers. The linear models were linear on marker effects and included the Bayesian LASSO, Bayesian ridge regression, Bayes A, and Bayes B. The non-linear models (this refers to non-linearity on markers) were reproducing kernel Hilbert space (RKHS) regression, Bayesian regularized neural networks (BRNN), and radial basis function neural networks (RBFNN). These statistical models were compared using 306 elite wheat lines from CIMMYT genotyped with 1717 diversity array technology (DArT) markers and two traits, days to heading (DTH) and grain yield (GY), measured in each of 12 environments. It was found that the three non-linear models had better overall prediction accuracy than the linear regression specification. Results showed a consistent superiority of RKHS and RBFNN over the Bayesian LASSO, Bayesian ridge regression, Bayes A, and Bayes B models.
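The "linear on marker effects" family compared above can be illustrated with ordinary ridge regression, whose shrinkage of marker effects toward zero is the non-Bayesian analogue of Bayesian ridge regression. A closed-form two-marker sketch with invented genotype and phenotype data (not the CIMMYT wheat data):

```python
# Ridge regression with two markers, solving (X'X + lam*I) b = X'y in closed
# form for the 2x2 case. A sketch of the linear-on-marker-effects family; the
# data and the penalty lam are made up for illustration.
X = [(1, 0), (0, 1), (1, 1), (0, 0), (1, 0)]   # marker genotypes (synthetic)
y = [1.2, 0.8, 2.1, 0.1, 1.0]                  # phenotypes (synthetic)
lam = 0.5                                      # ridge penalty (assumed)

# Normal-equation components X'X and X'y
xtx = [[sum(r[i] * r[j] for r in X) for j in range(2)] for i in range(2)]
xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(2)]

# Solve the penalized 2x2 system by Cramer's rule
a = xtx[0][0] + lam; b = xtx[0][1]
c = xtx[1][0];       d = xtx[1][1] + lam
det = a * d - b * c
beta = [(d * xty[0] - b * xty[1]) / det, (a * xty[1] - c * xty[0]) / det]

def predict(markers):
    """Predicted phenotype: linear combination of shrunken marker effects."""
    return sum(m * w for m, w in zip(markers, beta))

print(round(predict((1, 1)), 3))
```

The non-linear methods in the abstract (RKHS, neural networks) replace this linear combination of marker effects with a non-linear function of the markers.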
Propagation of Tsunami-like Surface Long Waves in the Bays of a Variable Depth
Directory of Open Access Journals (Sweden)
A.Yu. Bazykina
2016-08-01
Within the framework of the nonlinear long wave theory the regularities of solitary long wave propagation in the semi-closed bays of model and real geometry are numerically studied. In the present article the zones of wave amplification in the bay are found. The first one is located near the wave run-up on the beach (in front of the bay entrance) and the other one in the middle part of the sea basin. Wave propagation in these zones is accompanied both by significant rise and considerable fall of the sea level. Narrowing of the bay entrance and increase of the entering wave length result in decrease of the sea level maximum rises and falls. The Feodosiya Gulf in the Black Sea is considered as a real basin. In general the dynamics of the waves in the gulf is similar to the wave dynamics in the model bay. Four zones of the strongest wave amplification in the Feodosiya Gulf are revealed in the article. The sea level maximum rises and extreme falls, which tend to grow with decrease of the entering wave length, are observed in these zones. The distance traveled by the wave before breaking (due to non-linear effects) was found to decrease with decreasing length of the wave entering the bay (gulf).
What 'empirical turn in bioethics'?
Hurst, Samia
2010-10-01
Uncertainty as to how we should articulate empirical data and normative reasoning seems to underlie most difficulties regarding the 'empirical turn' in bioethics. This article examines three different ways in which we could understand 'empirical turn'. Using real facts in normative reasoning is trivial and would not represent a 'turn'. Becoming an empirical discipline through a shift to the social and neurosciences would be a turn away from normative thinking, which we should not take. Conducting empirical research to inform normative reasoning is the usual meaning given to the term 'empirical turn'. In this sense, however, the turn is incomplete. Bioethics has imported methodological tools from empirical disciplines, but too often it has not imported the standards to which researchers in these disciplines are held. Integrating empirical and normative approaches also represents true added difficulties. Addressing these issues from the standpoint of debates on the fact-value distinction can cloud very real methodological concerns by displacing the debate to a level of abstraction where they need not be apparent. Ideally, empirical research in bioethics should meet standards for empirical and normative validity similar to those used in the source disciplines for these methods, and articulate these aspects clearly and appropriately. More modestly, criteria to ensure that none of these standards are completely left aside would improve the quality of empirical bioethics research and partly clear the air of critiques addressing its theoretical justification, when its rigour in the particularly difficult context of interdisciplinarity is what should be at stake.
2002-01-01
Rivers that empty into large bodies of water can have a significant impact on the thawing of nearshore winter ice. This true-color Moderate Resolution Imaging Spectroradiometer (MODIS) image from May 18, 2001, shows the Nelson River emptying spring runoff from the Manitoba province to the south into the southwestern corner of Canada's Hudson Bay. The warmer waters from more southern latitudes hasten melting of ice near the shore, though some still remained, perhaps because in shallow coastal waters, the ice could have been anchored to the bottom. High volumes of sediment in the runoff turned the inflow brown, and the rim of the retreating ice has taken on a dirty appearance even far to the east of the river's entrance into the Bay. The sediment would have further hastened the melting of the ice because its darker color would have absorbed more solar radiation than cleaner, whiter ice. Image courtesy Jacques Descloitres, MODIS Land Rapid Response Team at NASA GSFC
Human Power Empirically Explored
Energy Technology Data Exchange (ETDEWEB)
Jansen, A.J.
2011-01-18
Harvesting energy from the user's muscular power and converting it into electricity is a relatively unknown way to power consumer products. It nevertheless offers surprising opportunities for product designers; human-powered products function independently of the regular power infrastructure, are convenient and can be environmentally and economically beneficial. This work provides insight into the knowledge required to design human-powered energy systems in consumer products from a scientific perspective. It shows the development of human-powered products from the first introduction of the BayGen Freeplay radio in 1995 to current products and provides an overview and analysis of 211 human-powered products currently on the market. Although human power is generally perceived as beneficial for the environment, this thesis shows that achieving environmental benefit is only feasible when the environmental impact of additional materials in the energy conversion system is well balanced with the energy demands of the product's functionality. User testing with existing products showed a preference for speeds in the range of 70 to 190 rpm for crank lengths from 32 to 95 mm. The muscular input power varied from 5 to 21 W. The analysis of twenty graduation projects from the Faculty of Industrial Design Engineering in the field of human-powered products offers an interesting set of additional practice-based design recommendations. The knowledge-based approach to human power is very powerful in supporting the design of human-powered products. There is substantial potential for improvements in the domains of energy conversion, ergonomics and environment. This makes human power, when applied properly, environmentally and economically competitive over a wider range of applications than previously thought.
Linearly constrained minimax optimization
DEFF Research Database (Denmark)
Madsen, Kaj; Schjær-Jacobsen, Hans
1978-01-01
We present an algorithm for nonlinear minimax optimization subject to linear equality and inequality constraints which requires first order partial derivatives. The algorithm is based on successive linear approximations to the functions defining the problem. The resulting linear subproblems...
Structured Additive Regression Models: An R Interface to BayesX
Directory of Open Access Journals (Sweden)
Nikolaus Umlauf
2015-02-01
Structured additive regression (STAR) models provide a flexible framework for modeling possible nonlinear effects of covariates: they contain the well-established frameworks of generalized linear models and generalized additive models as special cases but also allow a wider class of effects, e.g., for geographical or spatio-temporal data, enabling the specification of complex and realistic models. BayesX is a standalone software package for fitting a general class of STAR models. Based on a comprehensive open-source regression toolbox written in C++, BayesX estimates STAR models using Bayesian inference based on Markov chain Monte Carlo simulation techniques, a mixed model representation of STAR models, or stepwise regression techniques combining penalized least squares estimation with model selection. BayesX not only covers models for responses from univariate exponential families, but also models from less-standard regression situations such as models for multi-categorical responses with either ordered or unordered categories, continuous time survival data, or continuous time multi-state models. This paper presents a new fully interactive R interface to BayesX: the R package R2BayesX. With the new package, STAR models can be conveniently specified using R's formula language (with some extended terms), fitted using the BayesX binary, represented in R with objects of suitable classes, and finally printed/summarized/plotted. This makes BayesX much more accessible to users familiar with R and adds extensive graphics capabilities for visualizing fitted STAR models. Furthermore, R2BayesX complements the already impressive capabilities for semiparametric regression in R by a comprehensive toolbox comprising in particular more complex response types and alternative inferential procedures such as simulation-based Bayesian inference.
Multiscale empirical interpolation for solving nonlinear PDEs
Calo, Victor M.
2014-12-01
In this paper, we propose a multiscale empirical interpolation method for solving nonlinear multiscale partial differential equations. The proposed method combines empirical interpolation techniques and local multiscale methods, such as the Generalized Multiscale Finite Element Method (GMsFEM). To solve nonlinear equations, the GMsFEM is used to represent the solution on a coarse grid with multiscale basis functions computed offline. Computing the GMsFEM solution involves calculating the system residuals and Jacobians on the fine grid. We use empirical interpolation concepts to evaluate these residuals and Jacobians of the multiscale system with a computational cost which is proportional to the size of the coarse-scale problem rather than the fully-resolved fine scale one. The empirical interpolation method uses basis functions which are built by sampling the nonlinear function we want to approximate a limited number of times. The coefficients needed for this approximation are computed in the offline stage by inverting an inexpensive linear system. The proposed multiscale empirical interpolation techniques: (1) divide computing the nonlinear function into coarse regions; (2) evaluate contributions of nonlinear functions in each coarse region taking advantage of a reduced-order representation of the solution; and (3) introduce multiscale proper-orthogonal-decomposition techniques to find appropriate interpolation vectors. We demonstrate the effectiveness of the proposed methods on several nonlinear multiscale PDEs that are solved with Newton's method and fully-implicit time marching schemes. Our numerical results show that the proposed methods provide a robust framework for solving nonlinear multiscale PDEs on a coarse grid with bounded error and significant computational cost reduction.
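The point-selection step at the heart of empirical interpolation can be illustrated with the standard DEIM greedy algorithm (a common discrete variant of the empirical interpolation idea, not the paper's multiscale formulation). The snapshot family, grid and basis size below are invented for illustration:

```python
import numpy as np

def deim_indices(U):
    """Greedy DEIM point selection from a basis U (n x m).

    Returns m interpolation indices P so a nonlinear vector f can be
    approximated as U @ solve(U[P], f[P]), i.e. f is sampled only at P.
    """
    n, m = U.shape
    P = [int(np.argmax(np.abs(U[:, 0])))]
    for l in range(1, m):
        # Interpolate the next basis vector on the current points...
        c = np.linalg.solve(U[np.ix_(P, range(l))], U[P, l])
        r = U[:, l] - U[:, :l] @ c      # ...and pick where the residual peaks.
        P.append(int(np.argmax(np.abs(r))))
    return np.array(P)

# Demo: a smooth parametric nonlinear function sampled on a grid,
# with an empirical (POD) basis extracted from snapshots via SVD.
x = np.linspace(0.0, 1.0, 200)
snapshots = np.column_stack(
    [np.exp(-mu * x) * np.sin(5 * x) for mu in np.linspace(1.0, 5.0, 30)]
)
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
U = U[:, :6]                             # 6-dimensional empirical basis
P = deim_indices(U)

f = np.exp(-2.7 * x) * np.sin(5 * x)     # a new parameter value
f_deim = U @ np.linalg.solve(U[P], f[P]) # reconstruct from only 6 samples of f
err = np.linalg.norm(f - f_deim) / np.linalg.norm(f)
print(f"relative DEIM error with 6 sample points: {err:.2e}")
```

The payoff mirrors the abstract's point: the nonlinear function is evaluated at only a handful of points (here 6 of 200), and the offline linear solve supplies the interpolation coefficients.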
Spill management strategy for the Chesapeake Bay
International Nuclear Information System (INIS)
Butler, H.L.; Chapman, R.S.; Johnson, B.H.
1990-01-01
The Chesapeake Bay Program is a unique cooperative effort between state and Federal agencies to restore the health and productivity of America's largest estuary. To assist in addressing specific management issues, a comprehensive three-dimensional, time-varying hydrodynamic and water quality model has been developed. The Bay modeling strategy will serve as an excellent framework for including submodules to predict the movement, dispersion, and weathering of accidental spills, such as of petroleum products or other chemicals. This paper presents sample results from the Bay application to illustrate the success of the model system in simulating Bay processes. Also, a review of model requirements for successful spill modeling in Chesapeake Bay is presented. Recommendations are given for implementing appropriate spill modules within the Bay model framework and establishing a strategy for model use in addressing management issues.
Transitioning a Chesapeake Bay Ecological Prediction System to Operations
Brown, C.; Green, D. S.; Eco Forecasters
2011-12-01
Ecological predictions of the impacts of physical, chemical, biological, and human-induced change on ecosystems and their components encompass a wide range of space and time scales and subject matter. They vary from predicting the occurrence and/or transport of certain species, such as harmful algal blooms, or biogeochemical constituents, such as dissolved oxygen concentrations, to large-scale ecosystem responses and higher trophic levels. The timescales of ecological prediction, including guidance and forecasts, range from nowcasts and short-term forecasts (days), to intraseasonal and interannual outlooks (weeks to months), to decadal and century projections in climate change scenarios. The spatial scales range from small coastal inlets to basin- and global-scale biogeochemical and ecological forecasts. The types of models that have been used include conceptual, empirical, mechanistic, and hybrid approaches. This presentation will identify the challenges and progress toward transitioning experimental model-based ecological prediction into operational guidance and forecasting. Recent efforts are targeting integration of regional ocean, hydrodynamic and hydrological models and leveraging weather and water service infrastructure to enable the prototyping of an operational ecological forecast capability for the Chesapeake Bay and its tidal tributaries. A pathfinder demonstration predicts the probability of encountering sea nettles (Chrysaora quinquecirrha), a stinging jellyfish. These jellyfish can negatively impact safety and economic activities in the bay, and an impact-based forecast that predicts where and when this biotic nuisance occurs may help management efforts. Bay-wide nowcasts and three-day forecasts of sea nettle probability are generated daily by forcing an empirical habitat model (that predicts the probability of sea nettles) with real-time and 3-day forecasts of sea-surface temperature (SST) and salinity (SSS). In the first demonstration
EGG: Empirical Galaxy Generator
Schreiber, C.; Elbaz, D.; Pannella, M.; Merlin, E.; Castellano, M.; Fontana, A.; Bourne, N.; Boutsia, K.; Cullen, F.; Dunlop, J.; Ferguson, H. C.; Michałowski, M. J.; Okumura, K.; Santini, P.; Shu, X. W.; Wang, T.; White, C.
2018-04-01
The Empirical Galaxy Generator (EGG) generates fake galaxy catalogs and images with realistic positions, morphologies and fluxes from the far-ultraviolet to the far-infrared. The catalogs are generated by egg-gencat and stored in binary FITS tables (column oriented). Another program, egg-2skymaker, is used to convert the generated catalog into ASCII tables suitable for ingestion by SkyMaker (ascl:1010.066) to produce realistic high resolution images (e.g., Hubble-like), while egg-gennoise and egg-genmap can be used to generate the low resolution images (e.g., Herschel-like). These tools can be used to test source extraction codes, or to evaluate the reliability of any map-based science (stacking, dropout identification, etc.).
Foundations of linear and generalized linear models
Agresti, Alan
2015-01-01
A valuable overview of the most important ideas and results in statistical analysis Written by a highly-experienced author, Foundations of Linear and Generalized Linear Models is a clear and comprehensive guide to the key concepts and results of linear statistical models. The book presents a broad, in-depth overview of the most commonly used statistical models by discussing the theory underlying the models, R software applications, and examples with crafted models to elucidate key ideas and promote practical model building. The book begins by illustrating the fundamentals of linear models,
75 FR 8297 - Tongass National Forest, Thorne Bay Ranger District, Thorne Bay, AK
2010-02-24
..., Thorne Bay, AK AGENCY: Forest Service, USDA. ACTION: Cancellation of Notice of intent to prepare an... Roberts, Zone Planner, Thorne Bay Ranger District, Tongass National Forest, P.O. Box 19001, Thorne Bay, AK 99919, telephone: 907-828-3250. SUPPLEMENTARY INFORMATION: The 47,007-acre Kosciusko Project Area is...
77 FR 44140 - Drawbridge Operation Regulation; Sturgeon Bay Ship Canal, Sturgeon Bay, WI
2012-07-27
... Maple-Oregon Bridges so vehicular traffic congestion would not develop on downtown Sturgeon Bay streets... movement of vehicular traffic in Sturgeon Bay. The Sturgeon Bay Ship Canal is approximately 8.6 miles long... significant increase in vehicular and vessel traffic during the peak tourist and navigation season between...
The onset of deglaciation of Cumberland Bay and Stromness Bay, South Georgia
Van Der Putten, N.; Verbruggen, C.
Carbon dating of basal peat deposits in Cumberland Bay and Stromness Bay and sediments from a lake in Stromness Bay, South Georgia indicates deglaciation at the very beginning of the Holocene before c. 9500 14C yr BP. This post-dates the deglaciation of one local lake which has been ice-free since
78 FR 46813 - Safety Zone; Evening on the Bay Fireworks; Sturgeon Bay, WI
2013-08-02
...-AA00 Safety Zone; Evening on the Bay Fireworks; Sturgeon Bay, WI AGENCY: Coast Guard, DHS. ACTION.... This temporary safety zone will restrict vessels from a portion of Sturgeon Bay due to a fireworks... hazards associated with the fireworks display. DATES: This rule is effective from 8 p.m. until 10 p.m. on...
Empirical validation of directed functional connectivity.
Mill, Ravi D; Bagic, Anto; Bostan, Andreea; Schneider, Walter; Cole, Michael W
2017-02-01
Mapping directions of influence in the human brain connectome represents the next phase in understanding its functional architecture. However, a host of methodological uncertainties have impeded the application of directed connectivity methods, which have primarily been validated via "ground truth" connectivity patterns embedded in simulated functional MRI (fMRI) and magneto-/electro-encephalography (MEG/EEG) datasets. Such simulations rely on many generative assumptions, and we hence utilized a different strategy involving empirical data in which a ground truth directed connectivity pattern could be anticipated with confidence. Specifically, we exploited the established "sensory reactivation" effect in episodic memory, in which retrieval of sensory information reactivates regions involved in perceiving that sensory modality. Subjects performed a paired associate task in separate fMRI and MEG sessions, in which a ground truth reversal in directed connectivity between auditory and visual sensory regions was instantiated across task conditions. This directed connectivity reversal was successfully recovered across different algorithms, including Granger causality and Bayes network (IMAGES) approaches, and across fMRI ("raw" and deconvolved) and source-modeled MEG. These results extend simulation studies of directed connectivity, and offer practical guidelines for the use of such methods in clarifying causal mechanisms of neural processing. Copyright © 2016 Elsevier Inc. All rights reserved.
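One of the algorithm families named above, Granger causality, can be sketched in a few lines of NumPy: a direction of influence is inferred by asking whether lagged values of one series improve prediction of another. The lag order, coefficients and simulated series below are invented toys, unrelated to the paper's fMRI/MEG analyses:

```python
import numpy as np

def granger_stat(x, y, lag=2):
    """F-like statistic for 'x Granger-causes y' at a fixed lag order.

    Compares the residual sum of squares of an autoregressive model for y
    with and without lagged x terms (plain OLS, no small-sample corrections).
    """
    n = len(y)
    Y = y[lag:]
    Zr = np.column_stack([y[lag - k:n - k] for k in range(1, lag + 1)])   # restricted
    Zf = np.column_stack([Zr] + [x[lag - k:n - k] for k in range(1, lag + 1)])  # full

    def rss(Z):
        Z1 = np.column_stack([np.ones(len(Y)), Z])
        beta, *_ = np.linalg.lstsq(Z1, Y, rcond=None)
        e = Y - Z1 @ beta
        return e @ e

    return (rss(Zr) - rss(Zf)) / rss(Zf)

# Toy system with a known ground-truth direction: x drives y, not vice versa.
rng = np.random.default_rng(1)
n = 2000
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.normal()

s_xy = granger_stat(x, y)   # should be large
s_yx = granger_stat(y, x)   # should be near zero
print("x -> y:", round(s_xy, 3), "  y -> x:", round(s_yx, 5))
```

This is the same logic of "ground truth directionality recovered from data" that the paper instantiates empirically across task conditions, rather than via simulation assumptions.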
Eugene N. Anderson
2016-01-01
The Mongol Empire, the largest contiguous empire the world has ever known, had, among other things, a goodly number of falconers, poultry raisers, birdcatchers, cooks, and other experts on various aspects of birding. We have records of this, largely in the Yinshan Zhengyao, the court nutrition manual of the Mongol empire in China (the Yuan Dynasty). It discusses in some detail 22 bird taxa, from swans to chickens. The Huihui Yaofang, a medical encyclopedia, lists ten taxa used medicinally. Ma...
Empirical Investigation of External Debt-Growth Nexus in Sub ...
African Journals Online (AJOL)
Empirical Investigation of External Debt-Growth Nexus in Sub-Saharan Africa. ... distributed lag (PARDL) model and panel non-linear autoregressive distributed lag (PNARDL) model to examine the relationship between external debt and economic growth using a panel dataset of 22 countries from 1985 to 2015. Its results ...
Uniform convergence of the empirical spectral distribution function
Mikosch, T; Norvaisa, R
1997-01-01
Let X be a linear process having a finite fourth moment. Assume F is a class of square-integrable functions. We consider the empirical spectral distribution function J(n,X) based on X and indexed by F. If F is totally bounded then J(n,X) satisfies a uniform strong law of large numbers. If, in
Empirical Model for Predicting Rate of Biogas Production | Adamu ...
African Journals Online (AJOL)
Rate of biogas production using cow manure as substrate was monitored in two laboratory scale batch reactors (13 liter and 108 liter capacities). Two empirical models based on the Gompertz and the modified logistic equations were used to fit the experimental data based on non-linear regression analysis using Solver tool ...
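The two growth equations named in the abstract are standard sigmoidal models for cumulative biogas volume; a sketch of their common parameterizations is below. The parameter values (yield potential A, maximum rate Rm, lag time lam) and time grid are invented for illustration, not fitted to the paper's cow-manure data:

```python
import numpy as np

E = np.e

def gompertz(t, A, Rm, lam):
    """Modified Gompertz model: cumulative biogas volume at time t.
    A = biogas yield potential, Rm = maximum production rate, lam = lag time."""
    return A * np.exp(-np.exp(Rm * E / A * (lam - t) + 1.0))

def logistic(t, A, Rm, lam):
    """Modified logistic model of cumulative biogas production."""
    return A / (1.0 + np.exp(4.0 * Rm / A * (lam - t) + 2.0))

t = np.linspace(0.0, 30.0, 301)                 # days (illustrative)
v = gompertz(t, A=400.0, Rm=35.0, lam=3.0)      # e.g. ml of biogas
w = logistic(t, A=400.0, Rm=35.0, lam=3.0)

# Both curves are sigmoidal: nearly flat during the lag phase, then rising
# toward the asymptote A; nonlinear regression would fit (A, Rm, lam) to data.
print(round(float(v[-1]), 1), round(float(w[-1]), 1))
```

In practice (A, Rm, lam) are estimated by nonlinear least squares against the measured cumulative volumes, which is what the abstract's "non-linear regression analysis using Solver" refers to.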
Empirical techniques in finance
Bhar, Ramaprasad
2005-01-01
This book offers the opportunity to study and experience advanced empirical techniques in finance and in general financial economics. It is not only suitable for students with an interest in the field, it is also highly recommended for academic researchers as well as researchers in industry. The book focuses on the contemporary empirical techniques used in the analysis of financial markets and how these are implemented using actual market data. With an emphasis on implementation, this book helps focus on strategies for rigorously combining finance theory and modeling technology to extend extant considerations in the literature. The main aim of this book is to equip the readers with an array of tools and techniques that will allow them to explore financial market problems with a fresh perspective. In this sense it is not another volume in econometrics. Of course, the traditional econometric methods are still valid and important; the contents of this book will bring in other related modeling topics tha...
Ordinal Log-Linear Models for Contingency Tables
Directory of Open Access Journals (Sweden)
Brzezińska Justyna
2016-12-01
A log-linear analysis is a method providing a comprehensive scheme to describe the association between categorical variables in a contingency table. The log-linear model specifies how the expected counts depend on the levels of the categorical variables for these cells and provides detailed information on the associations. The aim of this paper is to present theoretical, as well as empirical, aspects of ordinal log-linear models used for contingency tables with ordinal variables. We introduce log-linear models for ordinal variables: the linear-by-linear association model, the row effect model, the column effect model and Goodman's RC model. Algorithms, advantages and disadvantages will be discussed in the paper. An empirical analysis will be conducted with the use of R.
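The linear-by-linear association model can be fitted as a Poisson log-linear model by iteratively reweighted least squares. The sketch below uses equally spaced scores and an invented 3x3 table; it is a minimal NumPy stand-in for what the paper does in R, with dummy coding for the main effects:

```python
import numpy as np

def fit_linear_by_linear(counts, u=None, v=None, iters=50):
    """Fit the linear-by-linear association model by IRLS:
        log mu_ij = lam + lam_i^R + lam_j^C + beta * u_i * v_j
    with equally spaced scores u, v by default. Returns (beta, fitted table)."""
    I, J = counts.shape
    u = np.arange(I, dtype=float) if u is None else np.asarray(u, float)
    v = np.arange(J, dtype=float) if v is None else np.asarray(v, float)
    y = counts.ravel().astype(float)
    ri = np.repeat(np.arange(I), J)            # row index of each cell
    cj = np.tile(np.arange(J), I)              # column index of each cell
    cols = [np.ones(I * J)]
    cols += [(ri == i).astype(float) for i in range(1, I)]   # row main effects
    cols += [(cj == j).astype(float) for j in range(1, J)]   # column main effects
    cols.append(u[ri] * v[cj])                 # the single association term
    X = np.column_stack(cols)
    mu = y + 0.5                               # standard GLM starting values
    eta = np.log(mu)
    for _ in range(iters):                     # IRLS for the Poisson GLM
        z = eta + (y - mu) / mu
        beta = np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (mu * z))
        eta = X @ beta
        mu = np.exp(eta)
    return beta[-1], mu.reshape(I, J)

# A small ordinal table with a positive trend along the diagonal (invented).
table = np.array([[20, 10,  5],
                  [10, 20, 10],
                  [ 5, 10, 20]])
b, fitted = fit_linear_by_linear(table)
print(f"estimated association beta = {b:.3f}")
```

One consequence of maximum likelihood in this model family is that the fitted table reproduces the observed row and column margins exactly; the single parameter beta summarizes the ordinal association.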
Positive Quasi Linear Operator Formulation
International Nuclear Information System (INIS)
Berry, L.A.; Jaeger, E.F.
2005-01-01
Expressions for the RF quasi-linear operator are biquadratic sums over the Fourier modes (or FLR equivalent) that describe the RF electric field, with a kernel that is a function of the two wave vectors, k_L and k_R, in the sum. As a result of either an implicit or explicit average over field lines or flux surfaces, this kernel only depends on one parallel wave vector, conventionally k_∥. When k_∥ is an independent component of the representation for E, the sums are demonstrably positive. However, except for closed-field-line systems, k_∥ depends on the local direction of the equilibrium magnetic field, and, empirically, the absorbed energy and quasi-linear diffusion coefficients are observed to have negative features. We have formally introduced an independent k_∥ sum by Fourier transforming the RF electric field (assuming straight field lines) using a field-line-length coordinate. The resulting expression is positive. We have modeled this approach by calculating the quasi-linear operator for 'modes' with fixed k_∥. We form these modes by discretizing k_∥ and then assigning all of the Fourier components with k_∥ that fall within a given bin to that k_∥ mode. Results will be shown as a function of the number of bins. Future work will involve implementing the expressions derived from the Fourier transform and evaluating the dependence on field line length
Gibbs, James F.; Borcherdt, Roger D.
1974-01-01
Measurements of ground motion generated by nuclear explosions in Nevada have been completed for 99 locations in the San Francisco Bay region, California. The seismograms, Fourier amplitude spectra, spectral amplification curves for the signal, and the Fourier amplitude spectra of the seismic noise are presented for 60 locations. Analog amplifications, based on the maximum signal amplitude, are computed for an additional 39 locations. The recordings of the nuclear explosions show marked amplitude variations which are consistently related to the local geologic conditions of the recording site. The average spectral amplifications observed for vertical and horizontal ground motions are, respectively: (1, 1) for granite, (1.5, 1.6) for the Franciscan Formation, (2.3, 2.3) for other pre-Tertiary and Tertiary rocks, (3.0, 2.7) for the Santa Clara Formation, (3.3, 4.4) for older bay sediments, and (3.7, 11.3) for younger bay mud. Spectral amplification curves define predominant ground frequencies for younger bay mud sites and for some older bay sediment sites. The predominant frequencies for most sites were not clearly defined by the amplitude spectra computed from the seismic background noise. The intensities ascribed to various sites in the San Francisco Bay region for the California earthquake of April 18, 1906, are strongly dependent on distance from the zone of surface faulting and the geological character of the ground. Considering only those sites (approximately one square city block in size) for which there is good evidence for the degree of ascribed intensity, the intensities for 917 sites on Franciscan rocks generally decrease with the logarithm of distance as Intensity = 2.69 - 1.90 log (Distance in km). For sites on other geologic units, intensity increments, derived from this empirical relation, correlate strongly with the Average Horizontal Spectral Amplifications (AHSA) according to the empirical relation Intensity Increment = 0.27 + 2.70 log(AHSA).
Unique thermal record in False Bay
CSIR Research Space (South Africa)
Grundlingh, ML
1993-10-01
Over the past decade False Bay has assumed a prime position in terms of research into large South African bays. This is manifested by investigations that cover flow conditions, modelling, thermal structure, management, biology and nutrients, geology...
Hierarchical mixtures of naive Bayes classifiers
Wiering, M.A.
2002-01-01
Naive Bayes classifiers tend to perform very well on a large number of problem domains, although their representation power is quite limited compared to more sophisticated machine learning algorithms. In this pa- per we study combining multiple naive Bayes classifiers by using the hierar- chical
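The base learner discussed here can be sketched in a few lines. This is a plain Gaussian naive Bayes on invented data, illustrating the class-conditional independence assumption, not the paper's hierarchical mixture:

```python
import numpy as np

class GaussianNB:
    """Minimal Gaussian naive Bayes: features assumed independent within each class."""

    def fit(self, X, y):
        self.classes = np.unique(y)
        self.mu = np.array([X[y == c].mean(axis=0) for c in self.classes])
        self.var = np.array([X[y == c].var(axis=0) + 1e-9 for c in self.classes])
        self.logprior = np.log([np.mean(y == c) for c in self.classes])
        return self

    def predict(self, X):
        # log p(c|x) up to a constant: log p(c) + sum_f log N(x_f; mu_cf, var_cf)
        ll = -0.5 * (((X[:, None, :] - self.mu) ** 2) / self.var
                     + np.log(2 * np.pi * self.var)).sum(axis=2)
        return self.classes[np.argmax(ll + self.logprior, axis=1)]

# Two well-separated Gaussian blobs as a toy problem.
rng = np.random.default_rng(0)
X0 = rng.normal(loc=-1.0, size=(100, 3))
X1 = rng.normal(loc=+1.0, size=(100, 3))
X = np.vstack([X0, X1])
y = np.array([0] * 100 + [1] * 100)

nb = GaussianNB().fit(X, y)
acc = np.mean(nb.predict(X) == y)
print(f"training accuracy on a separable toy problem: {acc:.2f}")
```

A hierarchical mixture, as studied in the paper, would combine several such classifiers, trading the limited representation power of a single naive Bayes model for a richer ensemble.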
Safety culture development at Daya Bay NPP
International Nuclear Information System (INIS)
Zhang Shanming
2001-01-01
From the viewpoint of Organizational Behavior theory, the concept, development and affecting factors of safety culture are introduced. The focus is on the establishment, development and management practice of safety culture at Daya Bay NPP. A strong safety culture has demonstrably contributed greatly to improving performance at Daya Bay
The Holocene History of Placentia Bay, Newfoundland
DEFF Research Database (Denmark)
Sheldon, Christina; Seidenkrantz, Marit-Solveig; Reynisson, Njall
2013-01-01
Marine sediments analyzed from cores taken in Placentia Bay, Newfoundland, located in the Labrador Sea, captured oceanographic and climatic changes from the end of the Younger Dryas through the Holocene. Placentia Bay is an ideal site to capture changes in both the south-flowing Labrador Current ...
Towards a sustainable future in Hudson Bay
International Nuclear Information System (INIS)
Okrainetz, G.
1991-01-01
To date, ca $40-50 billion has been invested in or committed to hydroelectric development on the rivers feeding Hudson Bay. In addition, billions more have been invested in land uses such as forestry and mining within the Hudson Bay drainage basin. However, there has never been a study of the possible impacts on Hudson Bay resulting from this activity. Neither has there been any federal environmental assessment on any of the economic developments that affect Hudson Bay. To fill this gap in knowledge, the Hudson Bay Program was established. The program will not conduct scientific field research but will rather scan the published literature and consult with leading experts in an effort to identify biophysical factors that are likely to be significantly affected by the cumulative influence of hydroelectric and other developments within and outside the region. An annotated bibliography on Hudson Bay has been completed and used to prepare a science overview paper, which will be circulated for comment, revised, and used as the basis for a workshop on cumulative effects in Hudson Bay. Papers will then be commissioned for a second workshop to be held in fall 1993. A unique feature of the program is its integration of traditional ecological knowledge among the Inuit and Cree communities around Hudson Bay with the scientific approach to cumulative impact assessment. One goal of the program is to help these communities bring forward their knowledge in such a way that it can be integrated into the cumulative effects assessment
Final Empirical Test Case Specification
DEFF Research Database (Denmark)
Kalyanova, Olena; Heiselberg, Per
This document includes the empirical specification for the IEA task of evaluating building energy simulation computer programs for Double Skin Facade (DSF) constructions. Two approaches are involved in this procedure: one is the comparative approach and the other is the empirical one.
Directory of Open Access Journals (Sweden)
Robert Aldrich
2010-03-01
This paper argues that the colonial legacy is ever present in contemporary Europe. For a generation, most Europeans largely tried, publicly, to forget the colonial past, or remembered it only through the rose-coloured lenses of nostalgia; now the pendulum has swung to memory of that past – even perhaps, in the views of some, to a surfeit of memory, where each group agitates for its own version of history, its own recognition in laws and ceremonies, its own commemoration in museums and monuments, the valorization or repatriation of its own art and artefacts. Words such as ‘invasion,’ ‘racism’ and ‘genocide’ are emotional terms that provoke emotional reactions. Whether leaders should apologize for wrongs of the past – and which wrongs – remains a highly sensitive issue. The ‘return of the colonial’ thus has to do with ethics and politics as well as with history, and can link to statements of apology or recognition, legislation about certain views of history, monetary compensation, repatriation of objects, and—perhaps most importantly—redefinition of national identity and policy. The colonial flags may have been lowered, but many barricades seem to have been raised. Private memories—of loss of land, of unacknowledged service, of political, economic, social and cultural disenfranchisement, but also on the other side of defeat, national castigation and self-flagellation—have been increasingly public. Monuments and museums act not only as sites of history but as venues for political agitation and forums for academic debate – differences of opinion that have spread to the streets. Empire has a long after-life.
Empirical Support for Perceptual Conceptualism
Directory of Open Access Journals (Sweden)
Nicolás Alejandro Serrano
2018-03-01
The main objective of this paper is to show that perceptual conceptualism can be understood as an empirically meaningful position and, furthermore, that there is some degree of empirical support for its main theses. In order to do this, I will start by offering an empirical reading of the conceptualist position, and making three predictions from it. Then, I will consider recent experimental results from cognitive sciences that seem to point towards those predictions. I will conclude that, while the evidence offered by those experiments is far from decisive, it is enough not only to show that conceptualism is an empirically meaningful position but also that there is empirical support for it.
Empire as a Geopolitical Figure
DEFF Research Database (Denmark)
Parker, Noel
2010-01-01
This article analyses the ingredients of empire as a pattern of order with geopolitical effects. Noting the imperial form's proclivity for expansion from a critical reading of historical sociology, the article argues that the principal manifestation of earlier geopolitics lay not in the nation...... but in empire. That in turn has been driven by a view of the world as disorderly and open to the ordering will of empires (emanating, at the time of geopolitics' inception, from Europe). One implication is that empires are likely to figure in the geopolitical ordering of the globe at all times, in particular...... after all that has happened in the late twentieth century to undermine nationalism and the national state. Empire is indeed a probable, even for some an attractive form of regime for extending order over the disorder produced by globalisation. Geopolitics articulated in imperial expansion is likely...
Prediction of maximum earthquake intensities for the San Francisco Bay region
Borcherdt, Roger D.; Gibbs, James F.
1975-01-01
The intensity data for the California earthquake of April 18, 1906, are strongly dependent on distance from the zone of surface faulting and the geological character of the ground. Considering only those sites (approximately one square city block in size) for which there is good evidence for the degree of ascribed intensity, the empirical relation derived between 1906 intensities and distance perpendicular to the fault for 917 sites underlain by rocks of the Franciscan Formation is: Intensity = 2.69 - 1.90 log (Distance) (km). For sites on other geologic units, intensity increments, derived with respect to this empirical relation, correlate strongly with the Average Horizontal Spectral Amplifications (AHSA) determined from 99 three-component recordings of ground motion generated by nuclear explosions in Nevada. The resulting empirical relation is: Intensity Increment = 0.27 + 2.70 log (AHSA), and average intensity increments for the various geologic units are -0.29 for granite, 0.19 for Franciscan Formation, 0.64 for the Great Valley Sequence, 0.82 for Santa Clara Formation, 1.34 for alluvium, and 2.43 for bay mud. The maximum intensity map predicted from these empirical relations delineates areas in the San Francisco Bay region of potentially high intensity from future earthquakes on either the San Andreas fault or the Hayward fault.
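The two empirical relations quoted above translate directly into code. A minimal sketch, assuming the units stated in the abstract (distance in km, base-10 logarithms); the function names are ours:

```python
import math

def predicted_intensity(distance_km):
    # Intensity = 2.69 - 1.90 log(Distance), the 1906-intensity vs. distance
    # relation for sites on the Franciscan Formation.
    return 2.69 - 1.90 * math.log10(distance_km)

def intensity_increment(ahsa):
    # Intensity Increment = 0.27 + 2.70 log(AHSA).
    return 0.27 + 2.70 * math.log10(ahsa)

# Predicted intensity 10 km from the fault on bay mud, using the average
# increment of 2.43 quoted for that geologic unit:
i_bay_mud = predicted_intensity(10.0) + 2.43
```

At 10 km the base relation gives 2.69 - 1.90 = 0.79, so the bay-mud site comes out roughly 2.4 intensity units higher than a Franciscan-rock site at the same distance.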
Prediction of maximum earthquake intensities for the San Francisco Bay region
Energy Technology Data Exchange (ETDEWEB)
Borcherdt, R.D.; Gibbs, J.F.
1975-01-01
The intensity data for the California earthquake of Apr 18, 1906, are strongly dependent on distance from the zone of surface faulting and the geological character of the ground. Considering only those sites (approximately one square city block in size) for which there is good evidence for the degree of ascribed intensity, the empirical relation derived between 1906 intensities and distance perpendicular to the fault for 917 sites underlain by rocks of the Franciscan formation is intensity = 2.69 - 1.90 log (distance) (km). For sites on other geologic units, intensity increments, derived with respect to this empirical relation, correlate strongly with the average horizontal spectral amplifications (AHSA) determined from 99 three-component recordings of ground motion generated by nuclear explosions in Nevada. The resulting empirical relation is intensity increment = 0.27 + 2.70 log (AHSA), and average intensity increments for the various geologic units are -0.29 for granite, 0.19 for Franciscan formation, 0.64 for the Great Valley sequence, 0.82 for Santa Clara formation, 1.34 for alluvium, and 2.43 for bay mud. The maximum intensity map predicted from these empirical relations delineates areas in the San Francisco Bay region of potentially high intensity from future earthquakes on either the San Andreas fault or the Hayward fault.
Tuey, R. C.
1972-01-01
Computer solutions of linear programming problems are outlined. Information covers vector spaces, convex sets, and matrix algebra elements for solving simultaneous linear equations. Dual problems, reduced cost analysis, ranges, and error analysis are illustrated.
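As a concrete instance of the matrix-algebra element mentioned above (solving simultaneous linear equations), here is a small Gaussian-elimination sketch with partial pivoting; it is illustrative only, not production-grade:

```python
def solve_linear_system(a, b):
    # Solve A x = b for small dense systems by Gaussian elimination
    # with partial pivoting.
    n = len(b)
    m = [row[:] + [bi] for row, bi in zip(a, b)]  # augmented matrix [A | b]
    for col in range(n):
        # Pivot: bring the largest-magnitude entry to the diagonal.
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    # Back-substitution.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x
```

For example, the system 2x + y = 5, x + 3y = 10 has solution x = 1, y = 3.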
Energy Technology Data Exchange (ETDEWEB)
Peterson, David; Stofleth, Jerome H.; Saul, Venner W.
2017-07-11
Linear shaped charges are described herein. In a general embodiment, the linear shaped charge has an explosive with an elongated arrowhead-shaped profile. The linear shaped charge also has an elongated v-shaped liner that is inset into a recess of the explosive. Another linear shaped charge includes an explosive that is shaped as a star-shaped prism. Liners are inset into crevices of the explosive, where the explosive acts as a tamper.
Classifying Linear Canonical Relations
Lorand, Jonathan
2015-01-01
In this Master's thesis, we consider the problem of classifying, up to conjugation by linear symplectomorphisms, linear canonical relations (lagrangian correspondences) from a finite-dimensional symplectic vector space to itself. We give an elementary introduction to the theory of linear canonical relations and present partial results toward the classification problem. This exposition should be accessible to undergraduate students with a basic familiarity with linear algebra.
Bird surveys at McKinley Bay and Hutchison Bay, Northwest Territories, in 1991
Energy Technology Data Exchange (ETDEWEB)
Cornish, B J; Dickson, D L; Dickson, H L
1992-03-01
McKinley Bay is a shallow protected bay along the eastern Beaufort Sea coast which provides an important habitat for diving ducks. Since 1979, the bay has been the site of a winter harbor and support base for oil and gas exploration in the Beaufort Sea. Aerial surveys for bird abundance and distribution were conducted in August 1991 as a continuation of long-term monitoring of birds in McKinley Bay and Hutchison Bay, a nearby area used as a control. The main objectives of the 1991 surveys were to expand the set of baseline data on natural annual fluctuations in diving duck numbers, and to determine if numbers of diving ducks had changed since the initial 1981-85 surveys. On the day with the best survey conditions, the population of diving ducks at McKinley Bay was estimated at ca 32,000, significantly more than in 1981-85. At Hutchison Bay, there were an estimated 11,000 ducks. As in previous years, large numbers of diving ducks were observed off Atkinson Point at the northwest corner of McKinley Bay, at the south end of the bay, and in the northeast corner near a long spit. Most divers in Hutchison Bay were at the west side. Diving ducks, primarily Oldsquaw and scoter, were the most abundant bird group in the study area. Observed distribution patterns of birds are discussed with reference to habitat preferences. 16 refs., 7 figs., 30 tabs.
Lawson, C. L.; Krogh, F. T.; Gold, S. S.; Kincaid, D. R.; Sullivan, J.; Williams, E.; Hanson, R. J.; Haskell, K.; Dongarra, J.; Moler, C. B.
1982-01-01
The Basic Linear Algebra Subprograms (BLAS) library is a collection of 38 FORTRAN-callable routines for performing basic operations of numerical linear algebra. The BLAS library is a portable and efficient source of basic operations for designers of programs involving linear algebraic computations. The BLAS library is supplied in portable FORTRAN and Assembler code versions for IBM 370, UNIVAC 1100 and CDC 6000 series computers.
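The Level-1-style operations the library provides have simple, precisely specified semantics. A pure-Python sketch of two classic routines, with the FORTRAN names (DAXPY, DDOT) borrowed purely for illustration:

```python
def daxpy(a, x, y):
    # y := a*x + y, the BLAS DAXPY operation
    # (returns a new list here rather than updating y in place).
    return [a * xi + yi for xi, yi in zip(x, y)]

def ddot(x, y):
    # Dot product of two vectors, the BLAS DDOT operation.
    return sum(xi * yi for xi, yi in zip(x, y))
```

In the real library these are tuned, in-place FORTRAN/Assembler routines with stride arguments; the sketch only shows the mathematical operation each one performs.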
Directory of Open Access Journals (Sweden)
Eugene N. Anderson
2016-09-01
Full Text Available The Mongol Empire, the largest contiguous empire the world has ever known, had, among other things, a goodly number of falconers, poultry raisers, birdcatchers, cooks, and other experts on various aspects of birding. We have records of this, largely in the Yinshan Zhengyao, the court nutrition manual of the Mongol empire in China (the Yuan Dynasty). It discusses in some detail 22 bird taxa, from swans to chickens. The Huihui Yaofang, a medical encyclopedia, lists ten taxa used medicinally. Marco Polo also made notes on Mongol bird use. There are a few other records. This allows us to draw conclusions about Mongol ornithology, which apparently was sophisticated and detailed.
International Nuclear Information System (INIS)
Watson, Jim
1999-01-01
The UK government's consent for the construction of a gas-fired power plant at Baglan Bay in South Wales is reported, and the growing popularity of economic combined-cycle gas turbine (CCGT) power plants and the resulting environmental improvements are noted. The combining of gas and steam turbines, design developments, and the UK moratorium on planning consents for gas-fired power plants are discussed. General Electric's H System technology, which will lower the amount of energy lost in the conversion of natural gas to electricity, is described, and details of the ten most problematic CCGTs in the UK are given. The domination of the CCGT global market by four manufacturers, and the pressure on manufacturers to develop their designs, are considered. (UK)
Directory of Open Access Journals (Sweden)
Ishan Joshi
2015-09-01
Full Text Available Coastal bays, such as Barataria Bay, are important transition zones between the terrigenous and marine environments that are also optically complex due to elevated amounts of particulate and dissolved constituents. Monthly field data collected over a period of 15 months in 2010 and 2011 in Barataria Bay were used to develop an empirical band ratio algorithm for the Landsat-5 TM that showed a good correlation with the Colored Dissolved Organic Matter (CDOM) absorption coefficient at 355 nm (ag355) (R2 = 0.74). Landsat-derived CDOM maps generally captured the major details of CDOM distribution and seasonal influences, suggesting the potential use of Landsat imagery to monitor biogeochemistry in coastal water environments. An investigation of the seasonal variation in ag355 conducted using Landsat-derived ag355 as well as field data suggested the strong influence of seasonality in the different regions of the bay, with the marine end members (lower bay) experiencing generally low but highly variable ag355 and the freshwater end members (upper bay) experiencing high ag355 with low variability. Barataria Bay experienced a significant increase in ag355 during the freshwater release at the Davis Pond Freshwater Diversion (DPFD) following the Deepwater Horizon oil spill in 2010 and following the Mississippi River (MR) flood conditions in 2011, resulting in a weak linkage to salinity in comparison to the other seasons. Tree-based statistical analysis showed the influence of high river flow conditions, high- and low-pressure systems that appeared to control ag355 by ~28%, 29% and 43% of the time duration over the study period at the marine end member just outside the bay. An analysis of CDOM variability in 2010 revealed the strong influence of the MR in controlling CDOM abundance in the lower bay during the high flow conditions, while strong winds associated with cold fronts significantly increase CDOM abundance in the upper bay, thus revealing the important
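The band-ratio algorithm itself is not reproduced in the abstract. As a generic sketch of how such an empirical algorithm is calibrated, the ordinary-least-squares fit below regresses synthetic log-transformed ag355 values on a hypothetical band ratio; the band choice, coefficients, and data are all invented for illustration:

```python
import random

random.seed(0)

# Hypothetical band-ratio values and a synthetic "true" relation; the actual
# Landsat-5 TM bands and fitted coefficients are not given in the abstract.
true_a, true_b = 0.5, -2.0
ratios = [0.2 + 0.01 * i for i in range(100)]
ln_ag355 = [true_a + true_b * r + random.gauss(0.0, 0.05) for r in ratios]

# Ordinary least squares for ln(ag355) = a + b * band_ratio.
n = len(ratios)
mx = sum(ratios) / n
my = sum(ln_ag355) / n
b = (sum((x - v) * (y - my) for x, y, v in zip(ratios, ln_ag355, [mx] * n))
     / sum((x - mx) ** 2 for x in ratios))
a = my - b * mx
```

With low noise the fit recovers the generating coefficients closely, which is the same calibration logic (fit, then report R2 against field data) described for the Landsat algorithm.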
2013-10-15
... Safety Zone, Oyster Festival 30th Anniversary Fireworks Display, Oyster Bay; Oyster Bay, NY AGENCY: Coast... zone on the navigable waters of Oyster Bay near Oyster Bay, NY for the Oyster Festival 30th Anniversary... Oyster Festival 30th Anniversary Fireworks Display is scheduled for October 19, 2013 and is one of...
2010-10-01
... 46 Shipping 1 2010-10-01 2010-10-01 false Nantucket Sound, Vineyard Sound, Buzzards Bay, Narragansett Bay, MA, Block Island Sound and easterly entrance to Long Island Sound, NY. 7.20 Section 7.20... Atlantic Coast § 7.20 Nantucket Sound, Vineyard Sound, Buzzards Bay, Narragansett Bay, MA, Block Island...
2013-05-09
... DEPARTMENT OF DEFENSE Department of the Army, Corps of Engineers 33 CFR Part 334 East Bay, St. Andrews Bay and the Gulf of Mexico at Tyndall Air Force Base, Florida; Restricted Areas AGENCY: U.S. Army... read as follows: Sec. 334.665 East Bay, St. Andrews Bay and the Gulf of Mexico, Restricted Areas...
2010-03-29
...: Narragansett Bay, RI and Mount Hope Bay, RI and MA, Including the Providence River and Taunton River AGENCY... River and Mount Hope Bay in the vicinity of the two Brightman Street bridges have not been adopted and... Island and Mt. Hope Bay, MA.'' The notice was prompted primarily by two events: (1) The U.S. Army Corps...
Empirical Legality and Effective Reality
Directory of Open Access Journals (Sweden)
Hernán Pringe
2015-08-01
Full Text Available The conditions that Kant’s doctrine establishes are examined for the predication of the effective reality of certain empirical objects. It is maintained that (a) for such a predication, it is necessary to have not only perception but also a certain homogeneity of sensible data, and (b) the knowledge of the existence of certain empirical objects depends on the application of regulative principles of experience.
Empirical logic and quantum mechanics
International Nuclear Information System (INIS)
Foulis, D.J.; Randall, C.H.
1976-01-01
This article discusses some of the basic notions of quantum physics within the more general framework of operational statistics and empirical logic (as developed in Foulis and Randall, 1972, and Randall and Foulis, 1973). Empirical logic is a formal mathematical system in which the notion of an operation is primitive and undefined; all other concepts are rigorously defined in terms of such operations (which are presumed to correspond to actual physical procedures). (Auth.)
Empirical Research In Engineering Design
DEFF Research Database (Denmark)
Ahmed, Saeema
2007-01-01
Increasingly, engineering design research involves the use of empirical studies that are conducted within an industrial environment [Ahmed, 2001; Court 1995; Hales 1987]. Research into the use of information by designers or understanding how engineers build up experience are examples of research...... of research issues. This paper describes case studies of empirical research carried out within industry in engineering design, focusing upon information, knowledge and experience in engineering design. The paper describes the research methods employed, their suitability for the particular research aims......
Directory of Open Access Journals (Sweden)
Claudio Roberto Fóffano Vasconcelos
2016-01-01
Full Text Available The aim of this study is to examine empirically the validity of PPP in the context of unit root tests based on linear and non-linear models of the real effective exchange rate of Argentina, Brazil, Chile, Colombia, Mexico, Peru and Venezuela. For this purpose, we apply the Harvey et al. (2008) linearity test and the non-linear unit root test (Kruse, 2011). The results show that the series with linear characteristics are Argentina, Brazil, Chile, Colombia and Peru, and those with non-linear characteristics are Mexico and Venezuela. The linear unit root tests indicate that the real effective exchange rate is stationary for Chile and Peru, and the non-linear unit root tests indicate that Mexico is stationary. In the period analyzed, the results show support for the validity of PPP in only three of the seven countries.
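For illustration, here is a bare-bones Dickey-Fuller-style unit root check on a simulated stationary AR(1) series, a stand-in for a mean-reverting real exchange rate. This is not the Harvey et al. linearity test or the Kruse test used in the study, no critical values are computed, and all parameter values are ours; in practice a library implementation would be used:

```python
import random

random.seed(42)

# Simulate a stationary AR(1) series with phi = 0.5 (illustrative values).
phi, n = 0.5, 500
y = [0.0]
for _ in range(n):
    y.append(phi * y[-1] + random.gauss(0.0, 1.0))

# Dickey-Fuller regression without drift: dy_t = rho * y_{t-1} + e_t.
# A strongly negative t-statistic on rho is evidence against a unit root,
# i.e. evidence of stationarity (mean reversion).
dy = [y[t] - y[t - 1] for t in range(1, len(y))]
ylag = y[:-1]
rho = sum(a * d for a, d in zip(ylag, dy)) / sum(a * a for a in ylag)
resid = [d - rho * a for d, a in zip(dy, ylag)]
s2 = sum(e * e for e in resid) / (len(dy) - 1)
t_stat = rho / (s2 / sum(a * a for a in ylag)) ** 0.5
```

For a true AR(1) with phi = 0.5, the estimate of rho clusters around phi - 1 = -0.5 and the t-statistic is far below the usual Dickey-Fuller critical values, so the unit root is rejected.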
Identification of an Equivalent Linear Model for a Non-Linear Time-Variant RC-Structure
DEFF Research Database (Denmark)
Kirkegaard, Poul Henning; Andersen, P.; Brincker, Rune
This paper considers estimation of the maximum softening for a RC-structure subjected to earthquake excitation. The so-called Maximum Softening damage indicator relates the global damage state of the RC-structure to the relative decrease of the fundamental eigenfrequency in an equivalent linear...... are investigated and compared with ARMAX models used on a running window. The techniques are evaluated using simulated data generated by the non-linear finite element program SARCOF modeling a 10-storey 3-bay concrete structure subjected to amplitude modulated Gaussian white noise filtered through a Kanai......
Non linear system become linear system
Directory of Open Access Journals (Sweden)
Petre Bucur
2007-01-01
Full Text Available The present paper refers to the theory and practice of non-linear systems and their applications. We aim to integrate these systems in order to derive their response, as well as to highlight some outstanding features.
Linear motor coil assembly and linear motor
2009-01-01
An ironless linear motor (5) comprising a magnet track (53) and a coil assembly (50) operating in cooperation with said magnet track (53) and having a plurality of concentrated multi-turn coils (31 a-f, 41 a-d, 51 a-k), wherein the end windings (31E) of the coils (31 a-f, 41 a-e) are substantially
Description of gravity cores from San Pablo Bay and Carquinez Strait, San Francisco Bay, California
Woodrow, Donald L.; John L. Chin,; Wong, Florence L.; Fregoso, Theresa A.; Jaffe, Bruce E.
2017-06-27
Seventy-two gravity cores were collected by the U.S. Geological Survey in 1990, 1991, and 2000 from San Pablo Bay and Carquinez Strait, California. The gravity cores collected within San Pablo Bay contain bioturbated laminated silts and sandy clays, whole and broken bivalve shells (mostly mussels), fossil tube structures, and fine-grained plant or wood fragments. Gravity cores from the channel wall of Carquinez Strait east of San Pablo Bay consist of sand and clay layers, whole and broken bivalve shells (less than in San Pablo Bay), trace fossil tubes, and minute fragments of plant material.
Multi-GPGPU Tsunami simulation at Toyama-bay
Furuyama, Shoichi; Ueda, Yuki
2017-07-01
Accelerated multi-General-Purpose Graphics Processing Unit (GPGPU) computation of a tsunami run-up simulation was achieved over a wide area (the whole of Toyama Bay, Japan) using a faster computation technique. Toyama Bay has active faults in the seabed, so there is a high possibility of earthquakes and, in the case of a large earthquake, of tsunami waves; predicting the tsunami run-up area is therefore important for reducing damage to residents. However, the simulation is a very demanding task because of limited computer resources. A resolution on the order of several meters is required for a run-up simulation because artificial structures on the ground, such as roads, buildings, and houses, are very small, while at the same time a huge area must be simulated. In the Toyama Bay case the area is 42 km × 15 km; when 5 m × 5 m computational cells are used, over 26,000,000 cells are generated. A normal desktop CPU computer took about 10 hours for this calculation. Reducing the calculation time is an important problem for an immediate tsunami run-up prediction system, which would in turn help protect the many residents of the coastal region. This study reduced the calculation time by using a multi-GPGPU system equipped with six NVIDIA Tesla K20Xs, with InfiniBand network connections between computer nodes via the MVAPICH library. As a result, a 5.16-times-faster calculation was achieved on six GPUs than on one GPU, an 86% parallel efficiency relative to linear speed-up.
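The speedup figures reported above imply the stated parallel efficiency directly; a one-line check (the function name is ours):

```python
def parallel_efficiency(speedup, n_devices):
    # Parallel efficiency = achieved speedup / ideal (linear) speedup.
    return speedup / n_devices

# Figures reported in the abstract: 5.16x faster on six GPUs than on one.
eff = parallel_efficiency(5.16, 6)  # 0.86, i.e. the 86% quoted
```

Efficiency below 1.0 reflects the communication overhead between GPU nodes (here over InfiniBand via MVAPICH) that does not shrink as devices are added.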
Delaware River and Upper Bay Sediment Data
National Oceanic and Atmospheric Administration, Department of Commerce — The area of coverage consists of 192 square miles of benthic habitat mapped from 2005 to 2007 in the Delaware River and Upper Delaware Bay. The bottom sediment map...
Willapa Bay, Washington Benthic Habitats 1995 Biotic
National Oceanic and Atmospheric Administration, Department of Commerce — In June 1995, the Columbia River Estuary Study Taskforce (CREST) acquired 295 true color aerial photographs (1:12,000) of Willapa Bay, Washington, from the State of...
Willapa Bay, Washington Benthic Habitats 1995 Geoform
National Oceanic and Atmospheric Administration, Department of Commerce — In June 1995, the Columbia River Estuary Study Taskforce (CREST) acquired 295 true color aerial photographs (1:12,000) of Willapa Bay, Washington, from the State of...
National Oceanic and Atmospheric Administration, Department of Commerce — The effect of salinity on utilization of shallow-water nursery habitats by aquatic fauna was assessed in San Antonio Bay, Texas. Overall, 272 samples were collected...
Corpus Christi/East Matagorda Bay 1986
National Oceanic and Atmospheric Administration, Department of Commerce — Patterns of habitat utilization were compared among transplanted and natural Spartina alterniflora marshes in the Halls Lake area of Chocolate Bay in the Galveston...
San Francisco Bay Interferometric Bathymetry: Area B
National Oceanic and Atmospheric Administration, Department of Commerce — High resolution sonar data were collected over ultra-shallow areas of the San Francisco Bay estuary system. Bathymetric and acoustic backscatter data were collected...
BENTHIC MACROFAUNAL ALIENS IN WILLAPA BAY
Benthic macrofaunal samples were collected at random stations in Willapa Bay, WA, in four habitats [eelgrass (Zostera marina), Atlantic cordgrass (Spartina alterniflora), mud shrimp (Upogebia pugettensis), ghost shrimp (Neotrypaea californiensis)] in 1996 and in seven habitats (Z...
FL BAY SPECTROUT-POPULATION STATUS
National Oceanic and Atmospheric Administration, Department of Commerce — Juvenile spotted seatrout and other sportfish are being monitored annually over a 6-mo period in Florida Bay to assess their abundance over time relative to...
Willapa Bay, Washington Benthic Habitats 1995 Substrate
National Oceanic and Atmospheric Administration, Department of Commerce — In June 1995, the Columbia River Estuary Study Taskforce (CREST) acquired 295 true color aerial photographs (1:12,000) of Willapa Bay, Washington, from the State of...
Benthic harpacticoid copepods of Jiaozhou Bay, Qingdao
Ma, Lin; Li, Xinzheng
2017-09-01
The species richness of benthic harpacticoid copepod fauna in Jiaozhou Bay, Qingdao, on the southern coast of Shandong Peninsula, has not been comprehensively studied. We present a preliminary inventory of species for this region based on material found in nine sediment samples collected from 2011 to 2012. Our list includes 15 species belonging to 15 genera in 9 families; the most speciose family was the Miraciidae Dana, 1846 (seven species), and all other families were represented by single species only. Sediment characteristics and depth are determined to be important environmental determinants of harpacticoid distribution in this region. We briefly detail the known distributions of species and provide a key to facilitate their identification. Both harpacticoid species richness and the species/genus ratio in Jiaozhou Bay are lower than in Bohai Gulf and Gwangyang Bay. The poor knowledge of the distribution of benthic harpacticoids, in addition to low sampling effort in Jiaozhou Bay, likely contributes to the low species richness.
Biscayne Bay Florida Bottlenose Dolphin Studies
National Oceanic and Atmospheric Administration, Department of Commerce — These data sets include a compilation of small vessel based studies of bottlenose dolphins that reside within Biscayne Bay, Florida, adjacent estuaries and nearshore...
Energy Technology Data Exchange (ETDEWEB)
Wiedemann, H.
1981-11-01
Since no linear colliders have been built yet, it is difficult to know at what energy the linear cost scaling of linear colliders drops below the quadratic scaling of storage rings. There is, however, no doubt that a linear collider facility for a center-of-mass energy above, say, 500 GeV is significantly cheaper than an equivalent storage ring. In order to make the linear collider principle feasible at very high energies, a number of problems have to be solved. There are two kinds of problems: one related to the feasibility of the principle, and the other associated with minimizing the cost of constructing and operating such a facility. This lecture series describes the problems and possible solutions. Since the real test of a principle requires the construction of a prototype, I will in the last chapter describe the SLC project at the Stanford Linear Accelerator Center.
Blyth, T S
2002-01-01
Basic Linear Algebra is a text for first year students leading from concrete examples to abstract theorems, via tutorial-type exercises. More exercises (of the kind a student may expect in examination papers) are grouped at the end of each section. The book covers the most important basics of any first course on linear algebra, explaining the algebra of matrices with applications to analytic geometry, systems of linear equations, difference equations and complex numbers. Linear equations are treated via Hermite normal forms which provides a successful and concrete explanation of the notion of linear independence. Another important highlight is the connection between linear mappings and matrices leading to the change of basis theorem which opens the door to the notion of similarity. This new and revised edition features additional exercises and coverage of Cramer's rule (omitted from the first edition). However, it is the new, extra chapter on computer assistance that will be of particular interest to readers:...
International Nuclear Information System (INIS)
Wiedemann, H.
1981-11-01
Since no linear colliders have been built yet, it is difficult to know at what energy the linear cost scaling of linear colliders drops below the quadratic scaling of storage rings. There is, however, no doubt that a linear collider facility for a center-of-mass energy above, say, 500 GeV is significantly cheaper than an equivalent storage ring. In order to make the linear collider principle feasible at very high energies, a number of problems have to be solved. There are two kinds of problems: one related to the feasibility of the principle, and the other associated with minimizing the cost of constructing and operating such a facility. This lecture series describes the problems and possible solutions. Since the real test of a principle requires the construction of a prototype, I will in the last chapter describe the SLC project at the Stanford Linear Accelerator Center.
Matrices and linear transformations
Cullen, Charles G
1990-01-01
"Comprehensive ... an excellent introduction to the subject." - Electronic Engineer's Design Magazine. This introductory textbook, aimed at sophomore- and junior-level undergraduates in mathematics, engineering, and the physical sciences, offers a smooth, in-depth treatment of linear algebra and matrix theory. The major objects of study are matrices over an arbitrary field. Contents include Matrices and Linear Systems; Vector Spaces; Determinants; Linear Transformations; Similarity: Part I and Part II; Polynomials and Polynomial Matrices; Matrix Analysis; and Numerical Methods. The first
Efficient Non Linear Loudspeakers
DEFF Research Database (Denmark)
Petersen, Bo R.; Agerkvist, Finn T.
2006-01-01
Loudspeakers have traditionally been designed to be as linear as possible. However, as techniques for compensating non-linearities are emerging, it becomes possible to use other design criteria. This paper presents and examines a new idea for improving the efficiency of loudspeakers at high levels...... by changing the voice coil layout. This deliberate non-linear design means that a smaller amplifier can be used, which reduces system cost as well as power consumption....
A Glance at Bohai Bay Oil Province
Institute of Scientific and Technical Information of China (English)
Gao Shoubai
1995-01-01
The Chinese oil industry kept on developing in 1994. The oil production of Bohai Bay Oil Province, located in East China, also kept growing. Geologically, the total area of Bohai Bay Basin is about 200 000 km2 and the main structural units are: Liaohe Depression, Huanghua Depression, Jizhong Depression, Linqing Depression, Jiyang Depression, Changwei Depression, Bozhong Depression, Chengning Uplift and Cangjing Uplift (see figure 1). The areas of the main structural units are listed in the following:
Optimal non-linear health insurance.
Blomqvist, A
1997-06-01
Most theoretical and empirical work on efficient health insurance has been based on models with linear insurance schedules (a constant co-insurance parameter). In this paper, dynamic optimization techniques are used to analyse the properties of optimal non-linear insurance schedules in a model similar to one originally considered by Spence and Zeckhauser (American Economic Review, 1971, 61, 380-387) and reminiscent of those that have been used in the literature on optimal income taxation. The results of a preliminary numerical example suggest that the welfare losses from the implicit subsidy to employer-financed health insurance under US tax law may be a good deal smaller than previously estimated using linear models.
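To make the linear vs. non-linear distinction concrete: a linear schedule charges a constant coinsurance share of spending, while a non-linear schedule lets the marginal price vary with spending. The schedule below, with a deductible and a stop-loss, is purely illustrative; the parameter values are ours, not from the paper:

```python
def oop_linear(x, c=0.2):
    # Out-of-pocket spending under a linear schedule:
    # constant coinsurance rate c on total spending x.
    return c * x

def oop_nonlinear(x, deductible=500.0, c=0.2, stop_loss=2000.0):
    # A simple non-linear schedule: the patient pays full cost up to a
    # deductible, then coinsurance c above it, capped at a stop-loss.
    # All parameter values are illustrative only.
    if x <= deductible:
        return x
    return min(deductible + c * (x - deductible), stop_loss)
```

Under the non-linear schedule the marginal out-of-pocket price falls from 1 (below the deductible) to c and finally to 0 (past the stop-loss), which is the kind of structure the optimal-schedule analysis characterizes.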
Statistical detection of EEG synchrony using empirical bayesian inference.
Directory of Open Access Journals (Sweden)
Archana K Singh
Full Text Available There is growing interest in understanding how the brain utilizes synchronized oscillatory activity to integrate information across functionally connected regions. Computing phase-locking values (PLV) between EEG signals is a popular method for quantifying such synchronizations and elucidating their role in cognitive tasks. However, high-dimensionality in PLV data incurs a serious multiple testing problem. Standard multiple testing methods in neuroimaging research (e.g., false discovery rate, FDR) suffer severe loss of power, because they fail to exploit complex dependence structure between hypotheses that vary in spectral, temporal and spatial dimension. Previously, we showed that a hierarchical FDR and optimal discovery procedures could be effectively applied for PLV analysis to provide better power than FDR. In this article, we revisit the multiple comparison problem from a new Empirical Bayes perspective and propose the application of the local FDR method (locFDR; Efron, 2001) for PLV synchrony analysis to compute FDR as a posterior probability that an observed statistic belongs to a null hypothesis. We demonstrate the application of Efron's Empirical Bayes approach for PLV synchrony analysis for the first time. We use simulations to validate the specificity and sensitivity of locFDR and a real EEG dataset from a visual search study for experimental validation. We also compare locFDR with hierarchical FDR and optimal discovery procedures in both simulation and experimental analyses. Our simulation results showed that the locFDR can effectively control false positives without compromising on the power of PLV synchrony inference. Our results from the application of locFDR to experimental data detected more significant discoveries than our previously proposed methods, whereas the standard FDR method failed to detect any significant discoveries.
Statistical detection of EEG synchrony using empirical bayesian inference.
Singh, Archana K; Asoh, Hideki; Takeda, Yuji; Phillips, Steven
2015-01-01
There is growing interest in understanding how the brain utilizes synchronized oscillatory activity to integrate information across functionally connected regions. Computing phase-locking values (PLV) between EEG signals is a popular method for quantifying such synchronizations and elucidating their role in cognitive tasks. However, high-dimensionality in PLV data incurs a serious multiple testing problem. Standard multiple testing methods in neuroimaging research (e.g., false discovery rate, FDR) suffer severe loss of power, because they fail to exploit complex dependence structure between hypotheses that vary in spectral, temporal and spatial dimension. Previously, we showed that a hierarchical FDR and optimal discovery procedures could be effectively applied for PLV analysis to provide better power than FDR. In this article, we revisit the multiple comparison problem from a new Empirical Bayes perspective and propose the application of the local FDR method (locFDR; Efron, 2001) for PLV synchrony analysis to compute FDR as a posterior probability that an observed statistic belongs to a null hypothesis. We demonstrate the application of Efron's Empirical Bayes approach for PLV synchrony analysis for the first time. We use simulations to validate the specificity and sensitivity of locFDR and a real EEG dataset from a visual search study for experimental validation. We also compare locFDR with hierarchical FDR and optimal discovery procedures in both simulation and experimental analyses. Our simulation results showed that the locFDR can effectively control false positives without compromising on the power of PLV synchrony inference. Our results from the application of locFDR to experimental data detected more significant discoveries than our previously proposed methods, whereas the standard FDR method failed to detect any significant discoveries.
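A minimal sketch of the two-groups idea behind locFDR: under an assumed mixture of a standard-normal null and a wider zero-mean normal alternative, the local FDR of an observed statistic z is the posterior probability that z came from the null. The mixture parameters below are illustrative assumptions, not estimated from data as in Efron's actual method:

```python
import math

def normal_pdf(z, sd=1.0):
    # Zero-mean normal density.
    return math.exp(-0.5 * (z / sd) ** 2) / (sd * math.sqrt(2.0 * math.pi))

def loc_fdr(z, pi0=0.9, alt_sd=3.0):
    # Two-groups model: locfdr(z) = pi0 * f0(z) / f(z), read as the posterior
    # probability that the observed statistic z belongs to the null.
    # pi0 and alt_sd are illustrative values, not data-driven estimates.
    null = pi0 * normal_pdf(z)
    alt = (1.0 - pi0) * normal_pdf(z, alt_sd)
    return null / (null + alt)
```

Statistics near zero get a locFDR near 1 (almost certainly null), while large |z| values get a locFDR near 0, which is how the method flags discoveries while controlling false positives.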
A Bayes Formula for Nonlinear Filtering with Gaussian and Cox Noise
Directory of Open Access Journals (Sweden)
Vidyadhar Mandrekar
2011-01-01
Full Text Available A Bayes-type formula is derived for the nonlinear filter where the observation contains both general Gaussian noise as well as Cox noise whose jump intensity depends on the signal. This formula extends the well-known Kallianpur-Striebel formula in the classical nonlinear filter setting. We also discuss Zakai-type equations for both the unnormalized conditional distribution as well as the unnormalized conditional density in the case where the signal is a Markovian jump diffusion.
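The Kallianpur-Striebel formula that this result extends can be written, in one common notation (the symbols here are generic, not the paper's), as a ratio of two reference-measure expectations:

```latex
\pi_t(f) \;=\;
\frac{\mathbb{E}_{Q}\!\left[\, f(X_t)\,\Lambda_t \,\middle|\, \mathcal{F}^{Y}_{t} \right]}
     {\mathbb{E}_{Q}\!\left[\, \Lambda_t \,\middle|\, \mathcal{F}^{Y}_{t} \right]},
```

where \(\pi_t(f)\) is the conditional expectation of \(f(X_t)\) given the observation history \(\mathcal{F}^{Y}_{t}\), \(Q\) is a reference measure under which signal and observation decouple, and \(\Lambda_t\) is the likelihood-ratio (Radon-Nikodym derivative) process.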
An Empirical Bayes Mixture Model for Effect Size Distributions in Genome-Wide Association Studies
DEFF Research Database (Denmark)
Thompson, Wesley K.; Wang, Yunpeng; Schork, Andrew J.
2015-01-01
We propose a scale mixture of two normals model for effect size distributions of genome-wide association study (GWAS) test statistics. Test statistics corresponding to null associations are modeled as random draws from a normal distribution with zero mean; test statistics corresponding to non-null associations are also modeled as normal with zero mean, but with larger variance. The model is fit via minimizing discrepancies between the parametric mixture model and resampling-based nonparametric estimates of replication effect sizes and variances. We describe in detail the implications of this model for estimation of the non-null proportion, the probability of replication in de novo samples, and the local false discovery rate, and we examine the impact of linkage disequilibrium on effect sizes and parameter estimates, both analytically and in simulations. We apply this approach to meta-analysis test statistics from two large GWAS, one for Crohn's disease (CD) and the other for schizophrenia (SZ). A scale mixture of two normals distribution provides an excellent fit to the SZ nonparametric replication effect size estimates.
Goegebeur, Y.; de Boeck, P.; Molenberghs, G.
2010-01-01
The local influence diagnostics, proposed by Cook (1986), provide a flexible way to assess the impact of minor model perturbations on estimates of key model parameters. In this paper, we apply the local influence idea to the detection of test speededness in a model describing nonresponse in test data,
Contemporary statistical procedures (Parametric Empirical Bayes) and nuclear plant event rates
International Nuclear Information System (INIS)
Gaver, D.P.; Worledge, D.H.
1985-01-01
The conduct of a nuclear power plant probabilistic risk assessment (PRA) recognizes that each of a great many vital components and systems is subject to failure. One aspect of the PRA procedure is to quantify individual item failure propensity, often in terms of the failure rate parameter of an exponential distribution or Poisson process, and then to combine rates so as to effectively infer the probability of plant failure, e.g., core damage. The formal method of combination of such rates involves use of fault-tree analysis. The defensibility of the final fault-tree result depends both upon the adequacy of the failure representations of its components, and upon the correctness and inclusiveness of the fault tree logic. This paper focuses upon the first issue, in particular, upon contemporary proposals for deriving estimates of individual rates. The purpose of the paper is to present, in basically non-mathematical terms, the essential nature of some of these proposals, and an assessment of how they might fit into, and contribute positively to, a more defensible or trustworthy PRA process
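One contemporary parametric empirical Bayes scheme of the kind discussed here places a gamma prior on each item's Poisson failure rate and estimates the prior from the pooled data, so that each item's rate estimate is shrunk toward the fleet-wide mean. A hedged sketch (the failure records and moment-matching fit are invented for illustration, not PRA data):

```python
# Invented failure records: (observed failures, exposure time in hours)
data = [(1, 1000.0), (0, 800.0), (3, 1200.0), (2, 2000.0), (0, 500.0)]

# Raw (maximum likelihood) rate estimates per item
raw = [x / t for x, t in data]

# Moment-match a Gamma(alpha, beta) prior: mean alpha/beta, variance alpha/beta**2
m = sum(raw) / len(raw)
v = sum((r - m) ** 2 for r in raw) / len(raw)
beta = m / v
alpha = m * beta

# Gamma-Poisson conjugacy: the posterior mean rate for each item is a weighted
# average of its raw rate and the prior mean, i.e. shrinkage toward the pool
post = [(alpha + x) / (beta + t) for x, t in data]
```

Each posterior rate lies between the item's raw rate and the pooled prior mean, with items that have little exposure time shrunk most strongly; these shrunken rates would then feed the fault-tree combination step.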
An Empirical Bayes Mixture Model for Effect Size Distributions in Genome-Wide Association Studies.
Thompson, Wesley K; Wang, Yunpeng; Schork, Andrew J; Witoelar, Aree; Zuber, Verena; Xu, Shujing; Werge, Thomas; Holland, Dominic; Andreassen, Ole A; Dale, Anders M
2015-12-01
Characterizing the distribution of effects from genome-wide genotyping data is crucial for understanding important aspects of the genetic architecture of complex traits, such as number or proportion of non-null loci, average proportion of phenotypic variance explained per non-null effect, power for discovery, and polygenic risk prediction. To this end, previous work has used effect-size models based on various distributions, including the normal and normal mixture distributions, among others. In this paper we propose a scale mixture of two normals model for effect size distributions of genome-wide association study (GWAS) test statistics. Test statistics corresponding to null associations are modeled as random draws from a normal distribution with zero mean; test statistics corresponding to non-null associations are also modeled as normal with zero mean, but with larger variance. The model is fit via minimizing discrepancies between the parametric mixture model and resampling-based nonparametric estimates of replication effect sizes and variances. We describe in detail the implications of this model for estimation of the non-null proportion, the probability of replication in de novo samples, the local false discovery rate, and power for discovery of a specified proportion of phenotypic variance explained from additive effects of loci surpassing a given significance threshold. We also examine the crucial issue of the impact of linkage disequilibrium (LD) on effect sizes and parameter estimates, both analytically and in simulations. We apply this approach to meta-analysis test statistics from two large GWAS, one for Crohn's disease (CD) and the other for schizophrenia (SZ). A scale mixture of two normals distribution provides an excellent fit to the SZ nonparametric replication effect size estimates. While capturing the general behavior of the data, this mixture model underestimates the tails of the CD effect size distribution. 
We discuss the implications of pervasive small but replicating effects in CD and SZ on genomic control and power. Finally, we conclude that, despite having very similar estimates of variance explained by genotyped SNPs, CD and SZ have a broadly dissimilar genetic architecture, due to differing mean effect size and proportion of non-null loci.
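Under a scale mixture of two zero-mean normals, the local false discovery rate follows directly from Bayes' rule: it is the posterior probability that a test statistic came from the narrow (null) component. An illustrative sketch (the mixture parameters below are placeholders, not the fitted GWAS values):

```python
import math

def normpdf(z, sd):
    """Density of a zero-mean normal with standard deviation sd."""
    return math.exp(-0.5 * (z / sd) ** 2) / (sd * math.sqrt(2.0 * math.pi))

def local_fdr(z, pi1=0.05, sd0=1.0, sd1=3.0):
    """Posterior probability that z belongs to the null (narrow) component."""
    f0 = normpdf(z, sd0)                 # null density
    f1 = normpdf(z, sd1)                 # non-null density: same mean, larger variance
    f = (1.0 - pi1) * f0 + pi1 * f1      # marginal mixture density
    return (1.0 - pi1) * f0 / f

fdr_small = local_fdr(0.5)   # modest statistic: almost surely null
fdr_large = local_fdr(6.0)   # extreme statistic: almost surely non-null
```

The same mixture also yields the replication probability and power calculations the abstract describes, since all are functionals of the two component densities and the non-null proportion.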
Faraway, Julian J
2014-01-01
A Hands-On Way to Learning Data AnalysisPart of the core of statistics, linear models are used to make predictions and explain the relationship between the response and the predictors. Understanding linear models is crucial to a broader competence in the practice of statistics. Linear Models with R, Second Edition explains how to use linear models in physical science, engineering, social science, and business applications. The book incorporates several improvements that reflect how the world of R has greatly expanded since the publication of the first edition.New to the Second EditionReorganiz
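In the spirit of the book's hands-on approach (though the book itself works in R), the core least-squares computation behind a one-predictor linear model can be sketched in a few lines of Python with invented data:

```python
# Invented data: y is roughly 2*x plus noise
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 4.0, 6.2, 7.9, 10.1]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Closed-form least-squares estimates for the model y = b0 + b1*x
b1 = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
     / sum((x - mean_x) ** 2 for x in xs)
b0 = mean_y - b1 * mean_x
```

The fitted slope lands near the generating value of 2; in R the equivalent would be a single call to `lm(y ~ x)`, which also returns the inference machinery the book covers.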
Carr, Joseph
1996-01-01
The linear IC market is large and growing, as is the demand for well trained technicians and engineers who understand how these devices work and how to apply them. Linear Integrated Circuits provides in-depth coverage of the devices and their operation, but not at the expense of practical applications in which linear devices figure prominently. This book is written for a wide readership from FE and first degree students, to hobbyists and professionals.Chapter 1 offers a general introduction that will provide students with the foundations of linear IC technology. From chapter 2 onwa
Fault tolerant linear actuator
Tesar, Delbert
2004-09-14
In varying embodiments, the fault tolerant linear actuator of the present invention is a new and improved linear actuator with fault tolerance and positional control that may incorporate velocity summing, force summing, or a combination of the two. In one embodiment, the invention offers a velocity summing arrangement with a differential gear between two prime movers driving a cage, which then drives a linear spindle screw transmission. Other embodiments feature two prime movers driving separate linear spindle screw transmissions, one internal and one external, in a totally concentric and compact integrated module.
Superconducting linear accelerator cryostat
International Nuclear Information System (INIS)
Ben-Zvi, I.; Elkonin, B.V.; Sokolowski, J.S.
1984-01-01
A large vertical cryostat for a superconducting linear accelerator using quarter wave resonators has been developed. The essential technical details, operational experience and performance are described. (author)
Toxic phytoplankton in San Francisco Bay
Rodgers, Kristine M.; Garrison, David L.; Cloern, James E.
1996-01-01
The Regional Monitoring Program (RMP) was conceived and designed to document the changing distribution and effects of trace substances in San Francisco Bay, with focus on toxic contaminants that have become enriched by human inputs. However, coastal ecosystems like San Francisco Bay also have potential sources of naturally-produced toxic substances that can disrupt food webs and, under extreme circumstances, become threats to public health. The most prevalent source of natural toxins is from blooms of algal species that can synthesize metabolites that are toxic to invertebrates or vertebrates. Although San Francisco Bay is nutrient-rich, it has so far apparently been immune from the epidemic of harmful algal blooms in the world’s nutrient-enriched coastal waters. This absence of acute harmful blooms does not imply that San Francisco Bay has unique features that preclude toxic blooms. No sampling program has been implemented to document the occurrence of toxin-producing algae in San Francisco Bay, so it is difficult to judge the likelihood of such events in the future. This issue is directly relevant to the goals of RMP because harmful species of phytoplankton have the potential to disrupt ecosystem processes that support animal populations, cause severe illness or death in humans, and confound the outcomes of toxicity bioassays such as those included in the RMP. Our purpose here is to utilize existing data on the phytoplankton community of San Francisco Bay to provide a provisional statement about the occurrence, distribution, and potential threats of harmful algae in this Estuary.
Energy Technology Data Exchange (ETDEWEB)
Diane De Steven, Ph.D.; Maureen Tone, Ph.D.
1997-10-01
This report addresses four project objectives: (1) Gradient model of Carolina bay vegetation on the SRS--The authors use ordination analyses to identify environmental and landscape factors that are correlated with vegetation composition. Significant factors can provide a framework for site-based conservation of existing diversity, and they may also be useful site predictors for potential vegetation in bay restorations. (2) Regional analysis of Carolina bay vegetation diversity--They expand the ordination analyses to assess the degree to which SRS bays encompass the range of vegetation diversity found in the regional landscape of South Carolina's western Upper Coastal Plain. Such comparisons can indicate floristic status relative to regional potentials and identify missing species or community elements that might be re-introduced or restored. (3) Classification of vegetation communities in Upper Coastal Plain bays--They use cluster analysis to identify plant community-types at the regional scale, and explore how this classification may be functional with respect to significant environmental and landscape factors. An environmentally-based classification at the whole-bay level can provide a system of templates for managing bays as individual units and for restoring bays to desired plant communities. (4) Qualitative model for bay vegetation dynamics--They analyze present-day vegetation in relation to historic land uses and disturbances. The distinctive history of SRS bays provides the possibility of assessing pathways of post-disturbance succession. They attempt to develop a coarse-scale model of vegetation shifts in response to changing site factors; such qualitative models can provide a basis for suggesting management interventions that may be needed to maintain desired vegetation in protected or restored bays.
Kish, George R.; Harrison, Arnell S.; Alderson, Mark
2008-01-01
The U.S. Geological Survey, in cooperation with the Sarasota Bay Estuary Program conducted a retrospective review of characteristics of the Sarasota Bay watershed in west-central Florida. This report describes watershed characteristics, surface- and ground-water processes, and the environmental setting of the Sarasota Bay watershed. Population growth during the last 50 years is transforming the Sarasota Bay watershed from rural and agriculture to urban and suburban. The transition has resulted in land-use changes that influence surface- and ground-water processes in the watershed. Increased impervious cover decreases recharge to ground water and increases overland runoff and the pollutants carried in the runoff. Soil compaction resulting from agriculture, construction, and recreation activities also decreases recharge to ground water. Conventional approaches to stormwater runoff have involved conveyances and large storage areas. Low-impact development approaches, designed to provide recharge near the precipitation point-of-contact, are being used increasingly in the watershed. Simple pollutant loading models applied to the Sarasota Bay watershed have focused on large-scale processes and pollutant loads determined from empirical values and mean event concentrations. Complex watershed models and more intensive data-collection programs can provide the level of information needed to quantify (1) the effects of lot-scale land practices on runoff, storage, and ground-water recharge, (2) dry and wet season flux of nutrients through atmospheric deposition, (3) changes in partitioning of water and contaminants as urbanization alters predevelopment rainfall-runoff relations, and (4) linkages between watershed models and lot-scale models to evaluate the effect of small-scale changes over the entire Sarasota Bay watershed. As urbanization in the Sarasota Bay watershed continues, focused research on water-resources issues can provide information needed by water
Energy Technology Data Exchange (ETDEWEB)
Patten, B.C.
1983-04-01
Two issues concerning linearity or nonlinearity of natural systems are considered. Each is related to one of the two alternative defining properties of linear systems, superposition and decomposition. Superposition exists when a linear combination of inputs to a system results in the same linear combination of outputs that individually correspond to the original inputs. To demonstrate this property it is necessary that all initial states and inputs of the system which impinge on the output in question be included in the linear combination manipulation. As this is difficult or impossible to do with real systems of any complexity, nature appears nonlinear even though it may be linear. A linear system that displays nonlinear behavior for this reason is termed pseudononlinear. The decomposition property exists when the dynamic response of a system can be partitioned into an input-free portion due to state plus a state-free portion due to input. This is a characteristic of all linear systems, but not of nonlinear systems. Without the decomposition property, it is not possible to distinguish which portions of a system's behavior are due to innate characteristics (self) vs. outside conditions (environment), which is an important class of questions in biology and ecology. Some philosophical aspects of these findings are then considered. It is suggested that those ecologists who hold to the view that organisms and their environments are separate entities are in effect embracing a linear view of nature, even though their belief systems and mathematical models tend to be nonlinear. On the other hand, those who consider that the organism-environment complex forms a single inseparable unit are implicitly involved in nonlinear thought, which may be in conflict with the linear modes and models that some of them use. The need to rectify these ambivalences on the part of both groups is indicated.
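The two defining properties discussed here, superposition and decomposition, can be verified numerically for any concrete linear system. A small sketch using a first-order linear difference equation (the coefficient and input sequences are arbitrary illustrations):

```python
def simulate(a, x0, u):
    """First-order linear system x[k+1] = a*x[k] + u[k]; returns the state trajectory."""
    xs = [x0]
    for uk in u:
        xs.append(a * xs[-1] + uk)
    return xs

a = 0.8
u1 = [1.0, 0.0, 2.0, 0.0, 1.0, 0.0]
u2 = [0.0, 3.0, 0.0, 1.0, 0.0, 2.0]

# Superposition (zero initial state): the response to 2*u1 + 3*u2 equals
# 2*(response to u1) + 3*(response to u2)
combo = [2.0 * p + 3.0 * q for p, q in zip(u1, u2)]
lhs = simulate(a, 0.0, combo)
rhs = [2.0 * p + 3.0 * q
       for p, q in zip(simulate(a, 0.0, u1), simulate(a, 0.0, u2))]

# Decomposition: total response = input-free part (state only, zero input)
#                                + state-free part (zero state, input only)
total = simulate(a, 5.0, u1)
decomposed = [s + t for s, t in zip(simulate(a, 5.0, [0.0] * len(u1)),
                                    simulate(a, 0.0, u1))]
```

For a nonlinear update (say, `x**2` in place of `a*x`) both identities fail, which is exactly the distinction the abstract draws between linear and nonlinear systems.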
Linear colliders - prospects 1985
International Nuclear Information System (INIS)
Rees, J.
1985-06-01
We discuss the scaling laws of linear colliders and their consequences for accelerator design. We then report on the SLAC Linear Collider project and comment on experience gained on that project and its application to future colliders. 9 refs., 2 figs
International Nuclear Information System (INIS)
Richter, B.
1985-01-01
A report is given on the goals and progress of the SLAC Linear Collider. The author discusses the status of the machine and the detectors and give an overview of the physics which can be done at this new facility. He also gives some ideas on how (and why) large linear colliders of the future should be built
International Nuclear Information System (INIS)
Rogner, H.H.
1989-01-01
The submitted sections on linear programming are extracted from 'Theorie und Technik der Planung' (1978) by W. Blaas and P. Henseler and reformulated for presentation at the Workshop. They provide a brief introduction to the theory of linear programming and to some essential aspects of the SIMPLEX solution algorithm for the purposes of economic planning processes. 1 fig
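The geometric fact that the SIMPLEX algorithm exploits, namely that an optimum of a linear program lies at a vertex of the feasible polytope, can be illustrated by brute-force vertex enumeration on a toy two-variable problem (an illustration of the principle, not a practical algorithm):

```python
from itertools import combinations

# Maximize 3x + 2y subject to x + y <= 4, x + 3y <= 6, x >= 0, y >= 0,
# with every constraint written uniformly as a*x + b*y <= rhs
cons = [(1.0, 1.0, 4.0), (1.0, 3.0, 6.0), (-1.0, 0.0, 0.0), (0.0, -1.0, 0.0)]
c = (3.0, 2.0)

def feasible(p):
    return all(a * p[0] + b * p[1] <= rhs + 1e-9 for a, b, rhs in cons)

# Candidate vertices are intersections of pairs of constraint boundaries
best = None
for (a1, b1, r1), (a2, b2, r2) in combinations(cons, 2):
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        continue  # parallel boundaries never intersect
    x = (r1 * b2 - r2 * b1) / det   # Cramer's rule
    y = (a1 * r2 - a2 * r1) / det
    if feasible((x, y)):
        if best is None or c[0] * x + c[1] * y > c[0] * best[0] + c[1] * best[1]:
            best = (x, y)
```

SIMPLEX reaches the same vertex, (4, 0) here, by walking along edges of the polytope instead of enumerating every intersection, which is what makes it viable for planning problems with many variables.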
International Nuclear Information System (INIS)
Rowe, C.H.; Wilton, M.S. de.
1979-01-01
An improved recirculating electron beam linear accelerator of the racetrack type is described. The system comprises a beam path of four straight legs with four Pretzel bending magnets at the end of each leg to direct the beam into the next leg of the beam path. At least one of the beam path legs includes a linear accelerator. (UK)
Umayyad Relations with the Byzantine Empire
Directory of Open Access Journals (Sweden)
Mansoor Haidari
2017-06-01
Full Text Available This research investigates the political and military relations between the Umayyad caliphate and the Byzantine Empire. The aim of this research is to clarify the Umayyad caliphate's relations with the Byzantine Empire. We know that these relations mostly involved war and fighting. Because there were always intense conflicts between Muslims and the Byzantine Empire, the two powers had to maintain an active, continuous diplomacy to call truces and settle disputes. Thus, based on the general policy of the Umayyad caliphs, Christians were severely ignored and segregated within Islamic territories. This segregation of the Christians was highly affected by political relationships. It is worth mentioning that the Umayyad caliphs adopted the governing style of the Sassanid kings and the Roman Caesars into the Islamic caliphate system, but they did not establish civil institutions and administrative organizations.
BOOK REVIEW OF "CHESAPEAKE BAY BLUES: SCIENCE, POLITICS, AND THE STRUGGLE TO SAVE THE BAY"
This is a book review of "Chesapeake Bay Blues: Science, Politics, and the Struggle to Save the Bay". This book is very well written and provides an easily understandable description of the political challenges faced by those proposing new or more stringent environmental regulat...
77 FR 21890 - Drawbridge Operation Regulation; Sturgeon Bay Ship Canal, Sturgeon Bay, WI
2012-04-12
... Street and Maple-Oregon Bridges so vehicular traffic congestion would not develop on downtown Sturgeon... the efficient movement of vehicular traffic in Sturgeon Bay. The Sturgeon Bay Ship Canal is... experiences a significant increase in vehicular and vessel traffic during the peak tourist and navigation...
76 FR 28309 - Drawbridge Operation Regulation; Sturgeon Bay Ship Canal, Sturgeon Bay, WI
2011-05-17
... vehicular traffic congestion would not develop on downtown Sturgeon Bay streets due to unscheduled bridge... schedules during the peak tourist and navigation seasons to provide for the efficient movement of vehicular... between Lake Michigan and Green Bay. The area experiences a significant increase in vehicular and vessel...
Semidefinite linear complementarity problems
International Nuclear Information System (INIS)
Eckhardt, U.
1978-04-01
Semidefinite linear complementarity problems arise by discretization of variational inequalities describing e.g. elastic contact problems, free boundary value problems etc. In the present paper linear complementarity problems are introduced and the theory as well as the numerical treatment of them are described. In the special case of semidefinite linear complementarity problems a numerical method is presented which combines the advantages of elimination and iteration methods without suffering from their drawbacks. This new method has very attractive properties since it has a high degree of invariance with respect to the representation of the set of all feasible solutions of a linear complementarity problem by linear inequalities. By means of some practical applications the properties of the new method are demonstrated. (orig.) [de
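The abstract's hybrid elimination/iteration method is beyond a short example, but the structure of a linear complementarity problem, and one standard iterative method for the (semi)definite case, can be sketched (projected Gauss-Seidel; the matrix and vector below are invented):

```python
def lcp_pgs(M, q, iters=200):
    """Projected Gauss-Seidel for the LCP: find z >= 0 with w = M z + q >= 0
    and z[i] * w[i] = 0 for all i. A sketch; convergence is guaranteed for
    symmetric positive definite M."""
    n = len(q)
    z = [0.0] * n
    for _ in range(iters):
        for i in range(n):
            r = q[i] + sum(M[i][j] * z[j] for j in range(n))  # current w_i
            z[i] = max(0.0, z[i] - r / M[i][i])  # unconstrained step, then project
    return z

M = [[2.0, 1.0], [1.0, 2.0]]   # symmetric positive definite
q = [-5.0, 6.0]
z = lcp_pgs(M, q)
w = [q[i] + sum(M[i][j] * z[j] for j in range(len(q))) for i in range(len(q))]
```

At the solution each index is either active (z_i > 0, w_i = 0) or inactive (z_i = 0, w_i >= 0), which is exactly the discretized contact/free-boundary condition the abstract mentions.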
Axler, Sheldon
2015-01-01
This best-selling textbook for a second course in linear algebra is aimed at undergrad math majors and graduate students. The novel approach taken here banishes determinants to the end of the book. The text focuses on the central goal of linear algebra: understanding the structure of linear operators on finite-dimensional vector spaces. The author has taken unusual care to motivate concepts and to simplify proofs. A variety of interesting exercises in each chapter helps students understand and manipulate the objects of linear algebra. The third edition contains major improvements and revisions throughout the book. More than 300 new exercises have been added since the previous edition. Many new examples have been added to illustrate the key ideas of linear algebra. New topics covered in the book include product spaces, quotient spaces, and dual spaces. Beautiful new formatting creates pages with an unusually pleasant appearance in both print and electronic versions. No prerequisites are assumed other than the ...
Cambridge Bay: Six years later
International Nuclear Information System (INIS)
Edworthy, J.
1992-01-01
The story of a wind energy project in Cambridge Bay, Northwest Territories, is presented from the perspective of the company that supplied the equipment and supported the project through its life. The project was intended to demonstrate the technical, economic, institutional, and operational issues and barriers to the use of wind power in remote communities. The system, involving four Carter Model 25 units each rated at 25 kW, was installed in 1987 and commissioned in January 1988. Shortly thereafter, the Northern Canada Power Commission (which requested the project in the first place) was taken over by the territorial administration, and employee continuity was disrupted. At about the same time, Federal support for the project decreased. Technical problems included a transformer failure, a generator failure, and a failed yaw tube which turned out to be lightly designed and poorly made. The Carter turbine company also went out of business, making spare parts difficult to obtain. The utility organization changed abruptly in summer 1991 with the arrival of a new area superintendent who did not support the project. The wind farm was shut down in 1992. The project generated a total of 160,982 kWh with over 71% availability. The positive and negative results from the project are summarized and recommendations are made for future Arctic wind power projects. 2 figs., 1 tab
Handbook on linear motor application
International Nuclear Information System (INIS)
1988-10-01
This book is a guide to linear motor applications. It covers the classification and characteristics of linear motors; the terminology and operating principles of linear induction motors, including single-sided and double-sided types; linear DC motors, including moving-coil, moving-permanent-magnet, and other types; linear pulse motors of the variable-reluctance and permanent-magnet types; linear vibration actuators, including the moving-coil type; linear synchronous motors; linear electromagnetic motors and solenoids; and, finally, technical organization, magnetic levitation, and linear motor sensors.
Recent 137Cs deposition in sediments of Admiralty Bay, Antarctica
International Nuclear Information System (INIS)
Sanders, Christian J.; Santos, Isaac R.; Patchineelam, Sambasiva R.; Schaefer, Carlos; Silva-Filho, Emmanoel V.
2010-01-01
Cesium-137, radium-226 and lead-210 profiles of a 25 cm sediment core give an indication of recent changes in land-ocean interactions at a polar coastal environment (Admiralty Bay, King George Island, Antarctica). The linear sedimentation accumulation rate at the study site calculated from the unsupported 210 Pb profile was 6.7 mm/year from 1965 to 2005. A 3.5-fold increase in 137 Cs concentrations was observed in the top layer of this sediment core. This sharp increase seems to indicate a recent redistribution of fallout radionuclides previously deposited on soil, vegetation and snow. These results imply enhanced land-ocean interactions at this site likely as a result of climate change. Because our results are based on a single core, additional investigations are needed to confirm our observations.
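The linear sedimentation rate quoted here is conventionally obtained from the slope of log(unsupported 210Pb activity) versus depth, since under a constant-input, constant-rate (CIC) model the activity decays exponentially with depth. A sketch with synthetic, noise-free data (the activities are generated to match the abstract's ~6.7 mm/yr rate, and are not the core's measurements):

```python
import math

HALF_LIFE_PB210 = 22.3                    # years
LAM = math.log(2.0) / HALF_LIFE_PB210     # 210Pb decay constant, 1/yr
TRUE_RATE = 0.67                          # cm/yr, i.e. 6.7 mm/yr

# Synthetic unsupported 210Pb activities at several core depths
depths = [1.0, 5.0, 9.0, 13.0, 17.0]                            # cm
activity = [50.0 * math.exp(-LAM * z / TRUE_RATE) for z in depths]

# CIC model: ln(activity) is linear in depth with slope -lambda/s,
# so the accumulation rate is s = -lambda / slope
ln_a = [math.log(a) for a in activity]
n = len(depths)
mz = sum(depths) / n
ml = sum(ln_a) / n
slope = sum((z - mz) * (l - ml) for z, l in zip(depths, ln_a)) \
        / sum((z - mz) ** 2 for z in depths)
rate_cm_per_yr = -LAM / slope
```

With real data the regression is run on measured activities with scatter, and the recovered slope carries the uncertainty that dating studies such as this one must propagate.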
Linear discriminant analysis of character sequences using occurrences of words
Dutta, Subhajit; Chaudhuri, Probal; Ghosh, Anil
2014-01-01
Classification of character sequences, where the characters come from a finite set, arises in disciplines such as molecular biology and computer science. For discriminant analysis of such character sequences, the Bayes classifier based on Markov models turns out to have class boundaries defined by linear functions of occurrences of words in the sequences. It is shown that for such classifiers based on Markov models with unknown orders, if the orders are estimated from the data using cross-validation, the resulting classifier has Bayes risk consistency under suitable conditions. Even when Markov models are not valid for the data, we develop methods for constructing classifiers based on linear functions of occurrences of words, where the word length is chosen by cross-validation. Such linear classifiers are constructed using ideas of support vector machines, regression depth, and distance weighted discrimination. We show that classifiers with linear class boundaries have certain optimal properties in terms of their asymptotic misclassification probabilities. The performance of these classifiers is demonstrated in various simulated and benchmark data sets.
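The key observation, that the Markov-model Bayes classifier is linear in word occurrence counts, follows because the log-likelihood of a sequence is a sum of transition log-probabilities weighted by the counts of the corresponding words (bigrams, for a first-order model). A toy sketch with invented two-letter sequences:

```python
import math
from collections import Counter

def bigram_counts(seq):
    return Counter(zip(seq, seq[1:]))

def train(seqs, alphabet, smooth=1.0):
    """Per-class transition log-probabilities with add-one smoothing."""
    counts = Counter()
    for s in seqs:
        counts.update(bigram_counts(s))
    logp = {}
    for a in alphabet:
        total = sum(counts[(a, b)] for b in alphabet) + smooth * len(alphabet)
        for b in alphabet:
            logp[(a, b)] = math.log((counts[(a, b)] + smooth) / total)
    return logp

def score(seq, logp):
    # Log-likelihood = sum over words ab of count(ab) * log P(b|a):
    # a *linear* function of the bigram occurrence counts.
    return sum(n * logp[ab] for ab, n in bigram_counts(seq).items())

alphabet = "AB"
class1 = ["ABABABABAB", "ABABABAB"]   # alternating sequences
class2 = ["AABBAABBAA", "AAAABBBB"]   # blocky sequences
lp1, lp2 = train(class1, alphabet), train(class2, alphabet)
pred = 1 if score("ABABAB", lp1) > score("ABABAB", lp2) else 2
```

The decision boundary `score(seq, lp1) - score(seq, lp2) = 0` is a linear function of the bigram counts, which is the form the paper's SVM-, depth-, and DWD-based constructions also take when Markov assumptions are dropped.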
International Nuclear Information System (INIS)
Zhou, X; Zhang, Z Y; Zhang, Q M; Liu, Q; Ding, Y Y; Zhou, L; Cao, J
2015-01-01
We report measurements of the density of linear alkylbenzene at three temperatures between 4 and 23 °C, with pressures up to 10 MPa. The measurements have been analysed to yield the isobaric thermal expansion coefficients and, for the first time, the isothermal compressibilities of linear alkylbenzene. The relevance of these results for current-generation (i.e., Daya Bay) and next-generation (i.e., JUNO) large liquid scintillator neutrino detectors is discussed. (paper)
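The two quantities reported here are standard density derivatives: the isobaric thermal expansion coefficient α = -(1/ρ)(∂ρ/∂T)_p and the isothermal compressibility κ_T = (1/ρ)(∂ρ/∂p)_T. A finite-difference sketch on an invented density grid (the numbers are plausible placeholders for a liquid like linear alkylbenzene, not the paper's measurements):

```python
# Density grid rho[i][j] (kg/m^3) at temperatures T[i] (deg C), pressures p[j] (MPa)
T = [4.0, 13.0, 23.0]
p = [0.1, 5.0, 10.0]
rho = [
    [863.0, 866.1, 869.1],
    [856.2, 859.4, 862.5],
    [848.9, 852.2, 855.4],
]

i, j = 1, 1  # evaluate both derivatives at the middle grid point

# Isobaric thermal expansion: alpha = -(1/rho) (d rho / d T) at constant p,
# approximated with a central difference in T
alpha = -(rho[2][j] - rho[0][j]) / (T[2] - T[0]) / rho[i][j]         # 1/K

# Isothermal compressibility: kappa = (1/rho) (d rho / d p) at constant T,
# central difference in p (MPa converted to Pa)
kappa = (rho[i][2] - rho[i][0]) / ((p[2] - p[0]) * 1e6) / rho[i][j]  # 1/Pa
```

For organic liquids one expects α on the order of 10^-3 per K and κ_T on the order of 10^-10 per Pa; detector simulations use these coefficients to track scintillator density under ambient temperature and hydrostatic pressure changes.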
Physical processes in a coupled bay-estuary coastal system: Whitsand Bay and Plymouth Sound
Uncles, R. J.; Stephens, J. A.; Harris, C.
2015-09-01
Whitsand Bay and Plymouth Sound are located in the southwest of England. The Bay and Sound are separated by the ∼2-3 km-wide Rame Peninsula and connected by ∼10-20 m-deep English Channel waters. Results are presented from measurements of waves and currents, drogue tracking, surveys of salinity, temperature and turbidity during stratified and unstratified conditions, and bed sediment surveys. 2D and 3D hydrodynamic models are used to explore the generation of tidally- and wind-driven residual currents, flow separation and the formation of the Rame eddy, and the coupling between the Bay and the Sound. Tidal currents flow around the Rame Peninsula from the Sound to the Bay between approximately 3 h before to 2 h after low water and form a transport path between them that conveys lower salinity, higher turbidity waters from the Sound to the Bay. These waters are then transported into the Bay as part of the Bay-mouth limb of the Rame eddy and subsequently conveyed to the near-shore, east-going limb and re-circulated back towards Rame Head. The Simpson-Hunter stratification parameter indicates that much of the Sound and Bay are likely to stratify thermally during summer months. Temperature stratification in both is pronounced during summer and is largely determined by coastal, deeper-water stratification offshore. Small tidal stresses in the Bay are unable to move bed sediment of the observed sizes. However, the Bay and Sound are subjected to large waves that are capable of driving a substantial bed-load sediment transport. Measurements show relatively low levels of turbidity, but these respond rapidly to, and have a strong correlation with, wave height.
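The Simpson-Hunter stratification parameter used above is commonly written as log10(h/u³), with h the water depth in metres and u the depth-mean tidal current speed in m/s; larger values indicate that tidal stirring is too weak to prevent seasonal thermal stratification. A sketch with illustrative values (not the surveyed ones):

```python
import math

def simpson_hunter(h, u):
    """Stratification parameter log10(h / u**3): h = depth (m),
    u = depth-mean tidal current speed (m/s). Higher values favour
    seasonal stratification; lower values favour tidally mixed water."""
    return math.log10(h / u ** 3)

weak_stirring = simpson_hunter(20.0, 0.2)    # deeper site with slow currents
strong_stirring = simpson_hunter(10.0, 1.0)  # shallow site with fast currents
```

The deep, weakly stirred site scores well above the shallow, fast one, consistent with the abstract's finding that much of the Bay and Sound can stratify in summer despite small tidal stresses.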
Horii, Yuichi; Minomo, Kotaro; Ohtsuka, Nobutoshi; Motegi, Mamoru; Nojiri, Kiyoshi; Kannan, Kurunthachalam
2017-05-15
Surface waters including river water and effluent from sewage treatment plants (STPs) were collected from Tokyo Bay watershed, Japan, and analyzed for seven cyclic and linear volatile methylsiloxanes (VMSs), i.e., D3, D4, D5, D6, L3, L4, and L5 by an optimized purge and trap extraction method. The total concentrations of seven VMSs (ΣVMS) in river water ranged from […] watershed was estimated at 2300 kg. Our results indicate widespread distribution of VMSs in Tokyo Bay watershed and the influence of domestic wastewater discharges as a source of VMSs in the aquatic environment. Copyright © 2017 Elsevier B.V. All rights reserved.
Linear ubiquitination in immunity.
Shimizu, Yutaka; Taraborrelli, Lucia; Walczak, Henning
2015-07-01
Linear ubiquitination is a post-translational protein modification recently discovered to be crucial for innate and adaptive immune signaling. The function of linear ubiquitin chains is regulated at multiple levels: generation, recognition, and removal. These chains are generated by the linear ubiquitin chain assembly complex (LUBAC), the only known ubiquitin E3 capable of forming the linear ubiquitin linkage de novo. LUBAC is not only relevant for activation of nuclear factor-κB (NF-κB) and mitogen-activated protein kinases (MAPKs) in various signaling pathways, but importantly, it also regulates cell death downstream of immune receptors capable of inducing this response. Recognition of the linear ubiquitin linkage is specifically mediated by certain ubiquitin receptors, which is crucial for translation into the intended signaling outputs. LUBAC deficiency results in attenuated gene activation and increased cell death, causing pathologic conditions in both mice and humans. Removal of ubiquitin chains is mediated by deubiquitinases (DUBs). Two of them, OTULIN and CYLD, are constitutively associated with LUBAC. Here, we review the current knowledge on linear ubiquitination in immune signaling pathways and the biochemical mechanisms as to how linear polyubiquitin exerts its functions distinctly from those of other ubiquitin linkage types. © 2015 The Authors. Immunological Reviews Published by John Wiley & Sons Ltd.
Dividend yield strategies: Dogs of the Dow and Hounds of the Bay
Kapur, Ratul; Suryavanshi, Saurabh
2006-01-01
Over the years ‘Dogs of The Dow’ strategy has become an increasingly popular and intensely argued subject for both practitioners and academicians. This thesis examines the multifarious aspects of the ‘Dogs of The Dow’ (DoD) strategy and highlights both the euphemism of the believers and reservations of the skeptics. Further on, we empirically test the DoD strategy over a 16-year period from 1990 to 2005. A parallel study, Hounds of The Bay (HoB) is also carried out for the Canadian markets, o...
International Nuclear Information System (INIS)
Krivonos, S.O.; Sorin, A.S.
1994-06-01
We show that the Zamolodchikov and Polyakov-Bershadsky nonlinear algebras W_3 and W_3^(2) can be embedded as subalgebras into some linear algebras with a finite set of currents. Using these linear algebras we find new field realizations of W_3^(2) and W_3 which could be a starting point for constructing new versions of W-string theories. We also reveal a number of hidden relationships between W_3 and W_3^(2). We conjecture that similar linear algebras can exist for other W-algebras as well. (author). 10 refs
Schneider, Hans
1989-01-01
Linear algebra is one of the central disciplines in mathematics. A student of pure mathematics must know linear algebra if he is to continue with modern algebra or functional analysis. Much of the mathematics now taught to engineers and physicists requires it. This well-known and highly regarded text makes the subject accessible to undergraduates with little mathematical experience. Written mainly for students in physics, engineering, economics, and other fields outside mathematics, the book gives the theory of matrices and applications to systems of linear equations, as well as many related t
Linearity in Process Languages
DEFF Research Database (Denmark)
Nygaard, Mikkel; Winskel, Glynn
2002-01-01
The meaning and mathematical consequences of linearity (managing without a presumed ability to copy) are studied for a path-based model of processes which is also a model of affine-linear logic. This connection yields an affine-linear language for processes, automatically respecting open-map bisimulation, in which a range of process operations can be expressed. An operational semantics is provided for the tensor fragment of the language. Different ways to make assemblies of processes lead to different choices of exponential, some of which respect bisimulation.
Amir-Moez, A R; Sneddon, I N
1962-01-01
Elements of Linear Space is a detailed treatment of the elements of linear spaces, including real spaces with no more than three dimensions and complex n-dimensional spaces. The geometry of conic sections and quadric surfaces is considered, along with algebraic structures, especially vector spaces and transformations. Problems drawn from various branches of geometry are given.Comprised of 12 chapters, this volume begins with an introduction to real Euclidean space, followed by a discussion on linear transformations and matrices. The addition and multiplication of transformations and matrices a
Weisberg, Sanford
2013-01-01
Praise for the Third Edition: "...this is an excellent book which could easily be used as a course text..." -International Statistical Institute The Fourth Edition of Applied Linear Regression provides a thorough update of the basic theory and methodology of linear regression modeling. Demonstrating the practical applications of linear regression analysis techniques, the Fourth Edition uses interesting, real-world exercises and examples. Stressing central concepts such as model building, understanding parameters, assessing fit and reliability, and drawing conclusions, the new edition illus
Gazprom: the new Russian empire
International Nuclear Information System (INIS)
Cosnard, D.
2004-01-01
The author analyzes the economic and political impacts of the great Gazprom group, leader in the Russian energy sector, in Russia. Already number one in the world gas industry, this group is becoming the right hand of the Kremlin. The author therefore questions this empire's transparency and limits. (A.L.B.)
Phenomenology and the Empirical Turn
Zwier, Jochem; Blok, Vincent; Lemmens, Pieter
2016-01-01
This paper provides a phenomenological analysis of postphenomenological philosophy of technology. While acknowledging that the results of its analyses are to be recognized as original, insightful, and valuable, we will argue that in its execution of the empirical turn, postphenomenology forfeits
Empirical ethics as dialogical practice
Widdershoven, G.A.M.; Abma, T.A.; Molewijk, A.C.
2009-01-01
In this article, we present a dialogical approach to empirical ethics, based upon hermeneutic ethics and responsive evaluation. Hermeneutic ethics regards experience as the concrete source of moral wisdom. In order to gain a good understanding of moral issues, concrete detailed experiences and
Riet, Fred H. van
1990-01-01
A Dutch teacher presents reading, film viewing, and writing activities for "Empire of the Sun," J. G. Ballard's autobiographical account of life as a boy in Shanghai and in a Japanese internment camp during World War II (the subject of Steven Spielberg's film of the same name). Includes objectives, procedures, and several literature,…
Empirical Specification of Utility Functions.
Mellenbergh, Gideon J.
Decision theory can be applied to four types of decision situations in education and psychology: (1) selection; (2) placement; (3) classification; and (4) mastery. For the application of the theory, a utility function must be specified. Usually the utility function is chosen on a priori grounds. In this paper methods for the empirical assessment…
Empirical Productivity Indices and Indicators
B.M. Balk (Bert)
2016-01-01
The empirical measurement of productivity change (or difference) by means of indices and indicators starts with the ex post profit/loss accounts of a production unit. Key concepts are profit, leading to indicators, and profitability, leading to indices. The main task for the productivity
EMPIRICAL RESEARCH AND CONGREGATIONAL ANALYSIS ...
African Journals Online (AJOL)
empirical research has made to the process of congregational analysis. Part of this ... contextual congregational analysis – meeting social and divine desires") at the IAPT ... methodology of a congregational analysis should be regarded as a process. ... essential to create space for a qualitative and quantitative approach.
Empirical processes: theory and applications
Venturini Sergio
2005-01-01
Proceedings of the 2003 Summer School in Statistics and Probability in Torgnon (Aosta, Italy) held by Prof. Jon A. Wellner and Prof. M. Banerjee. The topic presented was the theory of empirical processes with applications to statistics (m-estimation, bootstrap, semiparametric theory).
Empirical laws, regularity and necessity
Koningsveld, H.
1973-01-01
In this book I have tried to develop an analysis of the concept of an empirical law, an analysis that differs in many ways from the alternative analyses found in contemporary literature dealing with the subject. I am referring especially to two well-known views, viz. the regularity and
Empirical analysis of consumer behavior
Huang, Yufeng
2015-01-01
This thesis consists of three essays in quantitative marketing, focusing on structural empirical analysis of consumer behavior. In the first essay, the author investigates the role of a consumer's skill of product usage, and its imperfect transferability across brands, in her product choice, and shows that
Pello, F. S.; Haumahu, S.; Huliselan, N. V.; Tuapattinaja, M. A.
2017-10-01
Inner Ambon Bay and Kao Bay have potential fisheries resources, one of which is molluscs. Molluscs, especially the class Bivalvia, have economic value and are consumed by coastal communities. This research analyzed saxitoxin (STX) concentrations in bivalves from Kao Bay and Inner Ambon Bay. The Saxitoxin ELISA Test Kit protocol was used to determine saxitoxin concentrations. The measurements showed that the highest concentration of saxitoxin (392.42 µg STXeq/100 g shellfish meat) was in Gafrarium tumidum from Ambon Bay, whereas the highest concentration from Kao Bay (321.83 µg STXeq/100 g shellfish meat) was in Mactra mera
Metal concentrations in Kandalaksha Bay, White Sea (Russia) following the spring snowmelt
International Nuclear Information System (INIS)
Cobelo-Garcia, A.; Millward, G.E.; Prego, R.; Lukashin, V.
2006-01-01
Elevated concentrations of dissolved and particulate Cd, Cu, Pb and Zn have been determined in the waters of Kandalaksha Bay (White Sea, Russia), following the ice melt in the spring of 2000. Dissolved metal maxima in the surface waters were observed at some stations and concentrations generally decreased with depth. The suspended particulate matter (SPM) comprised a non-lithogenic fraction in the range 12-83%, and had elevated metal concentrations that showed no trend with depth or salinity and was compositionally distinct from the sediments. A log-linear relationship existed between the concentrations of metals in sediments and in SPM and their respective Al concentrations, indicating a source of metal-rich particles, with low Al content, to the Bay. The results suggest that Kandalaksha Bay has been impacted by industrial activity on the Kola Peninsula and that restricted water exchange will hinder its recovery from metal contamination. - Elevated dissolved and particulate metal concentrations have been determined in the water column of Kandalaksha Bay, White Sea (Russia)
Gas exchange rates across the sediment-water and air-water interfaces in south San Francisco Bay
International Nuclear Information System (INIS)
Hartman, B.; Hammond, D.E.
1984-01-01
Radon 222 concentrations in the water and sedimentary columns and radon exchange rates across the sediment-water and air-water interfaces have been measured in a section of south San Francisco Bay. Two independent methods have been used to determine sediment-water exchange rates, and the annual averages of these methods agree within the uncertainty of the determinations, about 20%. The annual average of benthic fluxes from shoal areas is nearly a factor of 2 greater than fluxes from the channel areas. Fluxes from the shoal and channel areas exceed those expected from simple molecular diffusion by factors of 4 and 2, respectively, apparently due to macrofaunal irrigation. Values of the gas transfer coefficient for radon exchange across the air-water interface were determined by constructing a radon mass balance for the water column and by direct measurement using floating chambers. The chamber method appears to yield results which are too high. Transfer coefficients computed using the mass balance method range from 0.4 m/day to 1.8 m/day, with a 6-year average of 1.0 m/day. Gas exchange is linearly dependent upon wind speed over a wind speed range of 3.2-6.4 m/s, but shows no dependence upon current velocity. Gas transfer coefficients predicted from an empirical relationship between gas exchange rates and wind speed observed in lakes and the oceans are within 30% of the coefficients determined from the radon mass balance and are considerably more accurate than coefficients predicted from theoretical gas exchange models
Gas exchange rates across the sediment-water and air-water interfaces in south San Francisco Bay
Hartman, Blayne; Hammond, Douglas E.
1984-01-01
Radon 222 concentrations in the water and sedimentary columns and radon exchange rates across the sediment-water and air-water interfaces have been measured in a section of south San Francisco Bay. Two independent methods have been used to determine sediment-water exchange rates, and the annual averages of these methods agree within the uncertainty of the determinations, about 20%. The annual average of benthic fluxes from shoal areas is nearly a factor of 2 greater than fluxes from the channel areas. Fluxes from the shoal and channel areas exceed those expected from simple molecular diffusion by factors of 4 and 2, respectively, apparently due to macrofaunal irrigation. Values of the gas transfer coefficient for radon exchange across the air-water interface were determined by constructing a radon mass balance for the water column and by direct measurement using floating chambers. The chamber method appears to yield results which are too high. Transfer coefficients computed using the mass balance method range from 0.4 m/day to 1.8 m/day, with a 6-year average of 1.0 m/day. Gas exchange is linearly dependent upon wind speed over a wind speed range of 3.2–6.4 m/s, but shows no dependence upon current velocity. Gas transfer coefficients predicted from an empirical relationship between gas exchange rates and wind speed observed in lakes and the oceans are within 30% of the coefficients determined from the radon mass balance and are considerably more accurate than coefficients predicted from theoretical gas exchange models.
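The mass-balance determination of the transfer coefficient described above can be sketched as a simple budget closure: the radon evasion flux to the atmosphere is whatever is needed to balance the water-column radon budget, and dividing it by the surface-water excess radon gives k. The function and every input value below are hypothetical illustrations, not numbers from the study.

```python
# Illustrative radon mass-balance estimate of the gas transfer coefficient k
# (m/day). The budget terms and their magnitudes are invented placeholders.

def transfer_coefficient(production, decay, storage_change, excess_rn):
    """k = (production - decay - storage_change) / excess_Rn,
    with fluxes in atoms/m^2/day and excess_Rn in atoms/m^3.
    The evasion flux is the residual that closes the water-column budget."""
    evasion_flux = production - decay - storage_change
    return evasion_flux / excess_rn

k = transfer_coefficient(production=2.4e9, decay=1.2e9,
                         storage_change=0.2e9, excess_rn=1.0e9)
print(k)  # → 1.0 m/day
```

With the linear wind-speed dependence the abstract reports, such k estimates would then be regressed against wind speed over the 3.2-6.4 m/s range.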
UTILIZATION OF BAYES' THEOREM IN THE DETERMINATION OF ENT DISEASES
Directory of Open Access Journals (Sweden)
Sri Winiarti
2008-07-01
Full Text Available In the concept of searching for solutions with an artificial intelligence approach, various methods can be applied to handle uncertainty during the search process. One of them is Bayes' theorem. Uncertainty in the search process can arise because the knowledge contained in the system changes, so a method is needed to address this problem. In this study, a method for handling uncertainty using Bayes' theorem was applied to the problem of diagnosing ENT (ear, nose and throat) diseases. The subject of this research is the search process for determining ENT diseases using a forward-chaining reasoning model, with certainty handled by Bayes' theorem: the probability of a disease is computed and the probabilities of its individual symptoms are compared. The software development model used in this research is the Waterfall model, beginning with data analysis, system design, coding in Visual Basic 6.0, and system testing with a black-box test and an alpha test. The research produced a software application capable of determining ENT diseases, applying the Bayes method to handle uncertainty. System trials indicate that the application is feasible and usable. Keywords: Disease, ENT, Bayes' theorem.
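The Bayes-theorem step the abstract describes (computing a disease probability and comparing the probabilities of its symptoms) can be sketched as a small posterior update. The disease names, priors, and likelihoods below are invented placeholders, not values from the system, and the original was implemented in Visual Basic 6.0 rather than Python.

```python
# Sketch of a Bayes update over candidate diseases given observed symptoms.
# Assumes symptoms are conditionally independent given the disease.

def bayes_posterior(priors, likelihoods, symptoms):
    """priors: {disease: P(disease)};
    likelihoods: {disease: {symptom: P(symptom | disease)}};
    returns normalized posterior P(disease | symptoms)."""
    scores = {}
    for disease, p in priors.items():
        for s in symptoms:
            p *= likelihoods[disease].get(s, 0.0)
        scores[disease] = p
    total = sum(scores.values())
    return {d: v / total for d, v in scores.items()}

# Hypothetical two-disease example:
priors = {"otitis": 0.3, "sinusitis": 0.7}
lik = {"otitis": {"ear_pain": 0.8, "fever": 0.4},
       "sinusitis": {"ear_pain": 0.1, "fever": 0.5}}
post = bayes_posterior(priors, lik, ["ear_pain", "fever"])
print(max(post, key=post.get))  # → otitis
```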
Changing Salinity Patterns in Biscayne Bay, Florida
2004-01-01
Biscayne Bay, Fla., is a 428-square-mile (1,109-square-kilometer) subtropical estuarine ecosystem that includes Biscayne National Park, the largest marine park in the U.S. national park system (fig. 1). The bay began forming between 5,000 and 3,000 years ago as sea level rose and southern Florida was flooded. Throughout most of its history, the pristine waters of the bay supported abundant and diverse fauna and flora, and the bay was a nursery for the adjacent coral-reef and marine ecosystems. In the 20th century, urbanization of the Miami-Dade County area profoundly affected the environment of the bay. Construction of powerplants, water-treatment plants, and solid-waste sites and large-scale development along the shoreline stressed the ecosystem. Biscayne National Monument was established in 1968 to "preserve and protect for the education, inspiration, recreation and enjoyment of present and future generations a rare combination of terrestrial, marine, and amphibious life in a tropical setting of great natural beauty" (Public Law 90-606). The monument was enlarged in 1980 and designated a national park.
Estimating the empirical probability of submarine landslide occurrence
Geist, Eric L.; Parsons, Thomas E.; Mosher, David C.; Shipp, Craig; Moscardelli, Lorena; Chaytor, Jason D.; Baxter, Christopher D. P.; Lee, Homa J.; Urgeles, Roger
2010-01-01
The empirical probability for the occurrence of submarine landslides at a given location can be estimated from age dates of past landslides. In this study, tools developed to estimate earthquake probability from paleoseismic horizons are adapted to estimate submarine landslide probability. In both types of estimates, one has to account for the uncertainty associated with age-dating individual events as well as the open time intervals before and after the observed sequence of landslides. For observed sequences of submarine landslides, we typically only have the age date of the youngest event and possibly of a seismic horizon that lies below the oldest event in a landslide sequence. We use an empirical Bayes analysis based on the Poisson-Gamma conjugate prior model specifically applied to the landslide probability problem. This model assumes that landslide events as imaged in geophysical data are independent and occur in time according to a Poisson distribution characterized by a rate parameter λ. With this method, we are able to estimate the most likely value of λ and, importantly, the range of uncertainty in this estimate. Examples considered include landslide sequences observed in the Santa Barbara Channel, California, and in Port Valdez, Alaska. We confirm that, given the uncertainties of age dating, landslide complexes can be treated as single events by performing a statistical test of age dates representing the main failure episode of the Holocene Storegga landslide complex.
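The Poisson-Gamma conjugate update at the core of the approach above can be sketched in a few lines: a Gamma(α, β) prior on the rate λ, updated by an observed count of events over a known exposure time. The prior parameters and the event record below are hypothetical, not the Santa Barbara Channel or Port Valdez data.

```python
# Sketch of the Poisson-Gamma conjugate update for a landslide rate lambda.
# All numeric values are illustrative placeholders.

def posterior_rate(alpha, beta, n_events, obs_time):
    """Update a Gamma(alpha, beta) prior on the Poisson rate lambda after
    observing n_events landslides over obs_time (e.g. kyr).
    Returns the posterior (alpha, beta) and the posterior mean rate."""
    a_post = alpha + n_events   # shape gains one per observed event
    b_post = beta + obs_time    # rate parameter gains the exposure time
    return a_post, b_post, a_post / b_post

# Vague prior plus a hypothetical record of 4 dated slides in 10 kyr:
a, b, mean_rate = posterior_rate(0.5, 0.01, 4, 10.0)
print(round(mean_rate, 3))  # posterior mean of lambda, events per kyr
```

The full posterior Gamma(a, b) also yields the uncertainty range the abstract emphasizes, e.g. via its quantiles.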
Callier, Frank M.; Desoer, Charles A.
1991-01-01
The aim of this book is to provide a systematic and rigorous access to the main topics of linear state-space system theory in both the continuous-time case and the discrete-time case; and the I/O description of linear systems. The main thrusts of the work are the analysis of system descriptions and derivations of their properties, LQ-optimal control, state feedback and state estimation, and MIMO unity-feedback systems.
Surface layer temperature inversion in the Bay of Bengal
Digital Repository Service at National Institute of Oceanography (India)
Pankajakshan, T.; Gopalakrishna, V.V.; Muraleedharan, P.M.; Reddy, G.V.; Araligidad, N.; Shenoy, Shrikant
Surface layer temperature inversion occurring in the Bay of Bengal has been addressed. Hydrographic data archived in the Indian Oceanographic Data Center are used to understand various aspects of the temperature inversion of surface layer in the Bay...
Parameter Identification by Bayes Decision and Neural Networks
DEFF Research Database (Denmark)
Kulczycki, P.; Schiøler, Henrik
1994-01-01
The problem of parameter identification by Bayes point estimation using neural networks is investigated.
SF Bay Water Quality Improvement Fund: Projects and Accomplishments
San Francisco Bay Water Quality Improvement Fund (SFBWQIF) projects listed here are part of an EPA competitive grant program to improve SF Bay water quality focused on restoring impaired waters and enhancing aquatic resources.
South Bay Salt Pond Tidal Wetland Restoration Phase II Planning
Information about the SFBWQP South Bay Salt Pond Tidal Wetland Restoration Phase II Planning project, part of an EPA competitive grant program to improve SF Bay water quality focused on restoring impaired waters and enhancing aquatic resources.
South Bay Salt Pond Restoration, Phase II at Ravenswood
Information about the South Bay Salt Pond Restoration Project: Phase II Construction at Ravenswood, part of an EPA competitive grant program to improve SF Bay water quality focused on restoring impaired waters and enhancing aquatic resources.
Sediment grab data from October 1999 in Apalachicola Bay, Florida
National Oceanic and Atmospheric Administration, Department of Commerce — The Apalachicola Bay National Estuarine Research Reserve and the NOAA Office for Coastal Management worked together to map benthic habitats within Apalachicola Bay,...
1999 RoxAnn Data Points from Apalachicola Bay, Florida
National Oceanic and Atmospheric Administration, Department of Commerce — The Apalachicola Bay National Estuarine Research Reserve and the NOAA Office for Coastal Management worked together to map benthic habitats within Apalachicola Bay,...
Atomic mass formula with linear shell terms
International Nuclear Information System (INIS)
Uno, Masahiro; Yamada, Masami; Ando, Yoshihira; Tachibana, Takahiro.
1981-01-01
An atomic mass formula is constructed in the form of a sum of gross terms and empirical linear shell terms. Values of the shell parameters are determined after the statistical method of Uno and Yamada, which is characterized by inclusion of the error inherent in the mass formula. The resulting formula reproduces the input masses with the standard deviation of 393 keV. A prescription is given for estimating errors of calculated masses. The mass formula is compared with recent experimental data of Rb, Cs and Fr isotopes, which are not included in the input data, and also with the constant-shell-term formula of Uno and Yamada. (author)
Ion transmission in a linear radiofrequency spectrometer
International Nuclear Information System (INIS)
Gomet, J.-C.
1975-01-01
A linear radiofrequency spectrometer is used for the purpose of experimental determination of the absolute ionization cross sections of various ions obtained by electron impact on polyatomic molecules. The transmission of the apparatus is studied: it does not only depend on the mass resolution of the spectrometer, but also on the nature of the ions. It is affected by charge transfers, especially for the parent ions. An empirical way of correcting the apparatus function is given, which allows its use at 10^-6 Torr [fr]
Meteorological research studies at Jervis Bay, Australia
International Nuclear Information System (INIS)
Clark, G.H.; Bendun, E.O.K.
1974-07-01
A climatological study of the winds and temperature from the Jervis Bay region which commenced in October 1970 has shown the presence of a coastal sea breeze and secondary bay breeze circulation system. In an attempt to define the influence of the Murray's Beach site on the local atmospheric dispersion, special smoke plume photography studies were conducted in the lower atmosphere. In June 1972 a meteorological acoustic sounding research programme was initiated at the Jervis Bay settlement. The aims of the research are to calibrate the sounder in terms of surface wind, turbulence and temperature measurements pertinent to a description of the lower atmospheric dispersion potential. Preliminary results on six months' data have shown encouraging correlations between the acoustic sounder patterns and particularly the wind direction turbulence traces. (author)
Algae Reefs in Shark Bay, Western Australia, Australia
1990-01-01
Numerous algae reefs are seen in Shark Bay, Western Australia, Australia (26.0S, 113.5E) especially in the southern portions of the bay. The south end is more saline because tidal flow in and out of the bay is restricted by sediment deposited at the north and central end of the bay opposite the mouth of the Wooramel River. This extremely arid region produces little sediment runoff so that the waters are very clear, saline and rich in algae.
Mapping Oyster Reef Habitats in Mobile Bay
Bolte, Danielle
2011-01-01
Oyster reefs around the world are declining rapidly, and although they haven't received as much attention as coral reefs, they are just as important to their local ecosystems and economies. Oyster reefs provide habitats for many species of fish, invertebrates, and crustaceans, as well as the next generations of oysters. Oysters are also harvested from many of these reefs and are an important segment of many local economies, including that of Mobile Bay, where oysters rank in the top five commercial marine species both by landed weight and by dollar value. Although the remaining Mobile Bay oyster reefs are some of the least degraded in the world, projected climate change could have dramatic effects on the health of these important ecosystems. The viability of oyster reefs depends on water depth and temperature, appropriate pH and salinity levels, and the amount of dissolved oxygen in the water. Projected increases in sea level, changes in precipitation and runoff patterns, and changes in pH resulting from increases in the amount of carbon dioxide dissolved in the oceans could all affect the viability of oyster reefs in the future. Human activities such as dredging and unsustainable harvesting practices are also adversely impacting the oyster reefs. Fortunately, several projects are already under way to help rebuild or support existing or previously existing oyster reefs. The success of these projects will depend on the local effects of climate change on the current and potential habitats and man's ability to recognize and halt unsustainable harvesting practices. As the extent and health of the reefs changes, it will have impacts on the Mobile Bay ecosystem and economy, changing the resources available to the people who live there and to the rest of the country, since Mobile Bay is an important national source of seafood. This project identified potential climate change impacts on the oyster reefs of Mobile Bay, including the possible addition of newly viable
Lost lake - restoration of a Carolina bay
Energy Technology Data Exchange (ETDEWEB)
Hanlin, H.G.; McLendon, J.P. (Univ. of South Carolina, Aiken, SC (United States), Dept. of Biology and Geology); Wike, L.D. (Univ. of South Carolina, Aiken, SC (United States), Dept. of Biology and Geology; Westinghouse Savannah River Co., Aiken, SC (United States), Savannah River Technology Center); Dietsch, B.M. (Univ. of South Carolina, Aiken, SC (United States), Dept. of Biology and Geology; Univ. of Georgia, Aiken, SC (United States))
1994-09-01
Carolina bays are shallow wetland depressions found only on the Atlantic Coastal Plain. Although these isolated interstream wetlands support many types of communities, they share the common features of having a sandy margin, a fluctuating water level, an elliptical shape, and a northwest to southeast orientation. Lost Lake, an 11.3 hectare Carolina bay, was ditched and drained for agricultural production before establishment of the Savannah River Site in 1950. Later it received overflow from a seepage basin containing a variety of chemicals, primarily solvents and some heavy metals. In 1990 a plan was developed for the restoration of Lost Lake, and restoration activities were complete by mid-1991. Lost Lake is the first known project designed for the restoration and recovery of a Carolina bay. The bay was divided into eight soil treatment zones, allowing four treatments in duplicate. Each of the eight zones was planted with eight species of native wetland plants. Recolonization of the bay by amphibians and reptiles is being evaluated by using drift fences with pitfall traps and coverboard arrays in each of the treatment zones. Additional drift fences in five upland habitats were also established. Hoop turtle traps, funnel minnow traps, and dip nets were utilized for aquatic sampling. The presence of 43 species common to the region has been documented at Lost Lake. More than one-third of these species show evidence of breeding populations being established. Three species found prior to the restoration activity and a number of species common to undisturbed Carolina bays were not encountered. Colonization by additional species is anticipated as the wetland undergoes further succession.
STS-98 Destiny in Atlantis's payload bay
2001-01-01
KENNEDY SPACE CENTER, Fla. -- The U.S. Laboratory Destiny rests once again in Atlantis's payload bay, at Launch Pad 39A. Closing of the payload bay doors is imminent. Destiny, a key element in the construction of the International Space Station, is 28 feet long and weighs 16 tons. This research and command-and-control center is the most sophisticated and versatile space laboratory ever built. It will ultimately house a total of 23 experiment racks for crew support and scientific research. Destiny will be launched Feb. 7 on STS-98, the seventh construction flight to the ISS.
Management case study: Tampa Bay, Florida
Morrison, G.; Greening, H.S.; Yates, K.K.
2012-01-01
Tampa Bay, Florida, USA, is a shallow, subtropical estuary that experienced severe cultural eutrophication between the 1940s and 1980s, a period when the human population of its watershed quadrupled. In response, citizen action led to the formation of a public- and private-sector partnership (the Tampa Bay Estuary Program), which adopted a number of management objectives to support the restoration and protection of the bay's living resources. These included numeric chlorophyll a and water-clarity targets, as well as long-term goals addressing the spatial extent of sea grasses and other selected habitat types, to support estuarine-dependent faunal guilds.
Chondrichthyan occurrence and abundance trends in False Bay ...
African Journals Online (AJOL)
Commercial fishing in False Bay, South Africa, began in the 1600s. Today chondrichthyans are regularly taken in fisheries throughout the bay. Using a combination of catch, survey and life history data, the occurrence and long-term changes in populations of chondrichthyans in False Bay are described. Analyses of time ...
Empirically Testing Thematic Analysis (ETTA)
DEFF Research Database (Denmark)
Gildberg, Frederik Alkier; Bradley, Stephen K.; Tingleff, Elllen B.
2015-01-01
Text analysis is not a question of a right or wrong way to go about it, but a question of different traditions. These tend not only to give answers to how to conduct an analysis, but also to provide the answer as to why it is conducted in the way that it is. The problem, however, may be that the link between tradition and tool is unclear. The main objective of this article is therefore to present Empirically Testing Thematic Analysis, a step-by-step approach to thematic text analysis, discussing strengths and weaknesses, so that others might assess its potential as an approach that they might utilize/develop for themselves. The advantage of utilizing the presented analytic approach is argued to be the integral empirical testing, which should assure systematic development, interpretation and analysis of the source textual material.
International Nuclear Information System (INIS)
Grimm, J.W.; Lynch, J.A.
2005-01-01
Daily precipitation nitrate and ammonium concentration models were developed for the Chesapeake Bay Watershed (USA) using a linear least-squares regression approach and precipitation chemistry data from 29 National Atmospheric Deposition Program/National Trends Network (NADP/NTN) sites. Only weekly samples that comprised a single precipitation event were used in model development. The most significant variables in both ammonium and nitrate models included: precipitation volume, the number of days since the last event, a measure of seasonality, latitude, and the proportion of land within 8 km covered by forest or devoted to industry and transportation. Additional variables included in the nitrate model were the proportion of land within 0.8 km covered by water and/or forest. Local and regional ammonia and nitrogen oxide emissions were not as well correlated as land cover. Modeled concentrations compared very well with event chemistry data collected at six NADP/AirMoN sites within the Chesapeake Bay Watershed. Wet deposition estimates were also consistent with observed deposition at selected sites. Accurately describing the spatial distribution of precipitation volume throughout the watershed is important in providing critical estimates of wet-fall deposition of ammonium and nitrate. - A linear least-squares regression approach was used to develop daily precipitation nitrate and ammonium concentration models for the Chesapeake Bay Watershed
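The abstract does not give the fitted model itself, but a daily concentration regression of this general shape can be sketched with ordinary least squares. The predictor set and functional forms below (log precipitation volume, days since the last event, a seasonal harmonic) are illustrative assumptions, not the authors' actual specification:

```python
import math

def ols(X, y):
    """Ordinary least squares via the normal equations (X^T X) b = X^T y,
    solved with Gaussian elimination and partial pivoting.
    X: list of design rows, y: list of responses."""
    p = len(X[0])
    A = [[sum(X[i][j] * X[i][k] for i in range(len(X))) for k in range(p)]
         for j in range(p)]
    b = [sum(X[i][j] * y[i] for i in range(len(X))) for j in range(p)]
    for col in range(p):
        piv = max(range(col, p), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, p):
            f = A[r][col] / A[col][col]
            for c in range(col, p):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * p
    for r in range(p - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c]
                              for c in range(r + 1, p))) / A[r][r]
    return beta

def design_row(volume_mm, days_since_event, day_of_year):
    """Hypothetical predictors only: intercept, log event volume, days
    since the last event, and one seasonal harmonic."""
    return [1.0, math.log(volume_mm), float(days_since_event),
            math.sin(2 * math.pi * day_of_year / 365.0)]
```

In practice land-cover proportions and latitude would be appended as further columns, and the model would be fit to the single-event weekly samples described in the abstract.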
Essays in empirical industrial organization
Aguiar de Luque, Luis
2013-01-01
My PhD thesis consists of three chapters in Empirical Industrial Organization. The first two chapters focus on the relationship between firm performance and specific public policies. In particular, we analyze the cases of cooperative research and development (R&D) in the European Union and the regulation of public transports in France. The third chapter focuses on copyright protection in the digital era and analyzes the relationship between legal and illegal consumption of di...
Empirical research on Waldorf education
Randoll, Dirk; Peters, Jürgen
2015-01-01
Waldorf education began in 1919 with the first Waldorf School in Stuttgart and nowadays is widespread in many countries all over the world. Empirical research, however, has been rare until the early nineties and Waldorf education has not been discussed within educational science so far. This has changed during the last decades. This article reviews the results of surveys during the last 20 years and is mainly focused on German Waldorf Schools, because most investigations have been done in thi...
Empirical distribution function under heteroscedasticity
Czech Academy of Sciences Publication Activity Database
Víšek, Jan Ámos
2011-01-01
Roč. 45, č. 5 (2011), s. 497-508 ISSN 0233-1888 Grant - others:GA UK(CZ) GA402/09/0557 Institutional research plan: CEZ:AV0Z10750506 Keywords: Robustness * Convergence * Empirical distribution * Heteroscedasticity Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.724, year: 2011 http://library.utia.cas.cz/separaty/2011/SI/visek-0365534.pdf
Expert opinion vs. empirical evidence
Herman, Rod A; Raybould, Alan
2014-01-01
Expert opinion is often sought by government regulatory agencies when there is insufficient empirical evidence to judge the safety implications of a course of action. However, it can be reckless to continue following expert opinion when a preponderance of evidence is amassed that conflicts with this opinion. Factual evidence should always trump opinion in prioritizing the information that is used to guide regulatory policy. Evidence-based medicine has seen a dramatic upturn in recent years sp...
Blyth, T S
2002-01-01
Most of the introductory courses on linear algebra develop the basic theory of finite dimensional vector spaces, and in so doing relate the notion of a linear mapping to that of a matrix. Generally speaking, such courses culminate in the diagonalisation of certain matrices and the application of this process to various situations. Such is the case, for example, in our previous SUMS volume Basic Linear Algebra. The present text is a continuation of that volume, and has the objective of introducing the reader to more advanced properties of vector spaces and linear mappings, and consequently of matrices. For readers who are not familiar with the contents of Basic Linear Algebra we provide an introductory chapter that consists of a compact summary of the prerequisites for the present volume. In order to consolidate the student's understanding we have included a large number of illustrative and worked examples, as well as many exercises that are strategically placed throughout the text. Solutions to the ex...
Ranjbar, Mohammad Hassan; Hadjizadeh Zaker, Nasser
2016-11-01
Gorgan Bay is a semi-enclosed basin located in the southeast of the Caspian Sea in Iran and is an important marine habitat for fish and seabirds. In the present study, the environmental capacity of phosphorus in Gorgan Bay was estimated using a 3D ecological-hydrodynamic numerical model and a linear programming model. The distribution of phosphorus, simulated by the numerical model, was used as an index for the occurrence of eutrophication and to determine the water quality response field of each of the pollution sources. The linear programming model was used to calculate and allocate the total maximum allowable loads of phosphorus to each of the pollution sources such that eutrophication is prevented while the maximum environmental capacity is achieved. In addition, the effect of an artificial inlet on the environmental capacity of the bay was investigated. Observations of surface currents in Gorgan Bay were made by GPS-tracked surface drifters to provide data for calibration and verification of the numerical model. Drifters were deployed at five different points across the bay over a period of 5 days. The results indicated that the annual environmental capacity of phosphorus is approximately 141 t if a concentration of 0.0477 mg/l for phosphorus is set as the water quality criterion. Creating an artificial inlet with a width of 1 km in the western part of the bay would result in a threefold increase in the environmental capacity of the study area.
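The allocation step can be illustrated as a small linear program: maximize the summed source loads subject to each checkpoint's simulated concentration response staying below the water quality criterion. The response coefficients in the sketch below are invented, and brute-force vertex enumeration for two sources stands in for the proper LP solver a real study would use:

```python
from itertools import combinations

def allocate_two_sources(A, b):
    """Maximize L1 + L2 subject to A @ L <= b, L >= 0, for two sources,
    by enumerating intersections (vertices) of the constraint lines.
    A: rows [a1, a2] = concentration response per unit load at each
    checkpoint; b: allowed concentration margins (the criterion)."""
    lines = [(row[0], row[1], bi) for row, bi in zip(A, b)]
    lines += [(-1.0, 0.0, 0.0), (0.0, -1.0, 0.0)]  # -L1 <= 0, -L2 <= 0

    def feasible(p, tol=1e-9):
        return all(a1 * p[0] + a2 * p[1] <= c + tol for a1, a2, c in lines)

    best, best_val = None, -1.0
    for (a1, a2, c1), (a3, a4, c2) in combinations(lines, 2):
        det = a1 * a4 - a2 * a3
        if abs(det) < 1e-12:
            continue  # parallel constraints, no unique intersection
        L1 = (c1 * a4 - c2 * a2) / det  # Cramer's rule
        L2 = (a1 * c2 - a3 * c1) / det
        if feasible((L1, L2)) and L1 + L2 > best_val:
            best, best_val = (L1, L2), L1 + L2
    return best, best_val
```

With hypothetical responses A = [[0.001, 0.0004], [0.0002, 0.0008]] (mg/l per t/yr) and b = [0.0477, 0.0477] this returns the load split that just meets the criterion; an optimal LP solution always sits at such a vertex, which is why enumeration works for this toy size.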
Empirical isotropic chemical shift surfaces
International Nuclear Information System (INIS)
Czinki, Eszter; Csaszar, Attila G.
2007-01-01
A list of proteins is given for which spatial structures, with a resolution better than 2.5 Å, are known from entries in the Protein Data Bank (PDB) and isotropic chemical shift (ICS) values are known from the RefDB database related to the Biological Magnetic Resonance Bank (BMRB) database. The structures chosen provide, with unknown uncertainties, dihedral angles φ and ψ characterizing the backbone structure of the residues. The joint use of experimental ICSs of the same residues within the proteins, again with mostly unknown uncertainties, and ab initio ICS(φ,ψ) surfaces obtained for the model peptides For-(L-Ala)n-NH2, with n = 1, 3, and 5, resulted in so-called empirical ICS(φ,ψ) surfaces for all major nuclei of the 20 naturally occurring α-amino acids. Of the many empirical surfaces determined, it is the 13Cα ICS(φ,ψ) surface that seems most promising for identifying the major secondary structure types: α-helix, β-strand, left-handed helix (αD), and polyproline-II. Detailed tests suggest that Ala is a good model for many naturally occurring α-amino acids. Two-dimensional empirical 13Cα-1Hα ICS(φ,ψ) correlation plots, obtained so far only from computations on small peptide models, suggest the utility of the experimental information contained therein, and thus they should provide useful constraints for structure determinations of proteins
Two concepts of empirical ethics.
Parker, Malcolm
2009-05-01
The turn to empirical ethics answers two calls. The first is for a richer account of morality than that afforded by bioethical principlism, which is cast as excessively abstract and thin on the facts. The second is for the facts in question to be those of human experience and not some other, unworldly realm. Empirical ethics therefore promises a richer naturalistic ethics, but in fulfilling the second call it often fails to heed the metaethical requirements related to the first. Empirical ethics risks losing the normative edge which necessarily characterizes the ethical, by failing to account for the nature and the logic of moral norms. I sketch a naturalistic theory, teleological expressivism (TE), which negotiates the naturalistic fallacy by providing a more satisfactory means of taking into account facts and research data with ethical implications. The examples of informed consent and the euthanasia debate are used to illustrate the superiority of this approach, and the problems consequent on including the facts in the wrong kind of way.
Estimation and variable selection for generalized additive partial linear models
Wang, Li
2011-08-01
We study generalized additive partial linear models, proposing the use of polynomial spline smoothing for estimation of nonparametric functions, and deriving quasi-likelihood based estimators for the linear parameters. We establish asymptotic normality for the estimators of the parametric components. The procedure avoids solving large systems of equations as in kernel-based procedures and thus results in gains in computational simplicity. We further develop a class of variable selection procedures for the linear parameters by employing a nonconcave penalized quasi-likelihood, which is shown to have an asymptotic oracle property. Monte Carlo simulations and an empirical example are presented for illustration. © Institute of Mathematical Statistics, 2011.
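A minimal sketch of the estimation idea: under Gaussian errors the quasi-likelihood estimator reduces to ordinary least squares, so the nonparametric component can be approximated with a truncated-power polynomial spline basis and fit jointly with the linear parameters. The basis choice, knot placement, and variable names below are illustrative assumptions, not the authors' implementation:

```python
def spline_basis(t, knots, degree=3):
    """Truncated-power basis for polynomial spline smoothing:
    1, t, ..., t^degree, plus (t - k)_+^degree for each interior knot."""
    row = [t ** d for d in range(degree + 1)]
    row += [max(t - k, 0.0) ** degree for k in knots]
    return row

def solve(A, b):
    """Gaussian elimination with partial pivoting for a square system."""
    n = len(A)
    A = [row[:] for row in A]
    b = b[:]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (b[r] - sum(A[r][c] * x[c] for c in range(r + 1, n))) / A[r][r]
    return x

def fit_gaplm(t_vals, x_vals, y, knots, degree=3):
    """Least-squares fit of y ~ spline(t) + x @ beta on the combined
    design matrix; returns (spline_coefs, beta)."""
    X = [spline_basis(t, knots, degree) + list(x)
         for t, x in zip(t_vals, x_vals)]
    p = len(X[0])
    XtX = [[sum(r[j] * r[k] for r in X) for k in range(p)] for j in range(p)]
    Xty = [sum(r[j] * yi for r, yi in zip(X, y)) for j in range(p)]
    coef = solve(XtX, Xty)
    n_spline = degree + 1 + len(knots)
    return coef[:n_spline], coef[n_spline:]
```

Because everything reduces to one linear solve, the method avoids the large equation systems of kernel-based procedures, which is the computational gain the abstract notes; the penalized variable-selection step is not sketched here.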
International Nuclear Information System (INIS)
Mamyrin, B.A.; Shmikk, D.V.
1979-01-01
A description and operating principle of a linear mass reflectron with a V-shaped ion trajectory, a new non-magnetic time-of-flight mass spectrometer with high resolution, are presented. The ion-optical system of the device consists of an ion source with electron-impact ionization, accelerating gaps, reflector gaps, a drift space and an ion detector. Ions move in the linear mass reflectron along trajectories parallel to the axis of the analyzer chamber. The results of investigations of the experimental device are given. With an ion drift length of 0.6 m the device resolution is 1200 with respect to the peak width at half-height. Small-sized mass spectrometric transducers with high resolution and sensitivity may be designed on the basis of the linear mass reflectron principle
Olver, Peter J
2018-01-01
This textbook develops the essential tools of linear algebra, with the goal of imparting technique alongside contextual understanding. Applications go hand-in-hand with theory, each reinforcing and explaining the other. This approach encourages students to develop not only the technical proficiency needed to go on to further study, but an appreciation for when, why, and how the tools of linear algebra can be used across modern applied mathematics. Providing an extensive treatment of essential topics such as Gaussian elimination, inner products and norms, and eigenvalues and singular values, this text can be used for an in-depth first course, or an application-driven second course in linear algebra. In this second edition, applications have been updated and expanded to include numerical methods, dynamical systems, data analysis, and signal processing, while the pedagogical flow of the core material has been improved. Throughout, the text emphasizes the conceptual connections between each application and the un...
Banach, S
1987-01-01
This classic work by the late Stefan Banach has been translated into English so as to reach a yet wider audience. It contains the basics of the algebra of operators, concentrating on the study of linear operators, which corresponds to that of the linear forms a1x1 + a2x2 + ... + anxn of algebra. The book gathers results concerning linear operators defined in general spaces of a certain kind, principally in Banach spaces, examples of which are: the space of continuous functions, that of the pth-power-summable functions, Hilbert space, etc. The general theorems are interpreted in various mathematical areas, such as group theory, differential equations, integral equations, equations with infinitely many unknowns, functions of a real variable, summation methods and orthogonal series. A new fifty-page section ("Some Aspects of the Present Theory of Banach Spaces") complements this important monograph.
DEFF Research Database (Denmark)
Høskuldsson, Agnar
1996-01-01
Determination of the proper dimension of a given linear model is one of the most important tasks in applied modeling work. We consider here eight criteria that can be used to determine the dimension of the model, or equivalently, the number of components to use in the model. Four of these criteria ... the basic problems in determining the dimension of linear models. Then each of the eight measures is treated. The results are illustrated by examples.
Linear programming using Matlab
Ploskas, Nikolaos
2017-01-01
This book offers a theoretical and computational presentation of a variety of linear programming algorithms and methods with an emphasis on the revised simplex method and its components. A theoretical background and mathematical formulation is included for each algorithm as well as comprehensive numerical examples and corresponding MATLAB® code. The MATLAB® implementations presented in this book are sophisticated and allow users to find solutions to large-scale benchmark linear programs. Each algorithm is followed by a computational study on benchmark problems that analyze the computational behavior of the presented algorithms. As a solid companion to existing algorithmic-specific literature, this book will be useful to researchers, scientists, mathematical programmers, and students with a basic knowledge of linear algebra and calculus. The clear presentation enables the reader to understand and utilize all components of simplex-type methods, such as presolve techniques, scaling techniques, pivoting ru...
International Nuclear Information System (INIS)
Anon.
1994-01-01
The aim of the TESLA (TeV Superconducting Linear Accelerator) collaboration (at present 19 institutions from seven countries) is to establish the technology for a high energy electron-positron linear collider using superconducting radiofrequency cavities to accelerate its beams. Another basic goal is to demonstrate that such a collider can meet its performance goals in a cost effective manner. For this the TESLA collaboration is preparing a 500 MeV superconducting linear test accelerator at the DESY Laboratory in Hamburg. This TTF (TESLA Test Facility) consists of four cryomodules, each approximately 12 m long and containing eight 9-cell solid niobium cavities operating at a frequency of 1.3 GHz
Sta Maria, E J; Siringan, F P; Bulos, A dM; Sombrito, E Z
2009-01-01
The GEF/UNDP/IMO/PEMSEA project identifies Manila Bay as among the marine pollution hot spots in the Seas of East Asia. (210)Pb dating of its sediment can provide a historical perspective of its pollution loading. However, the validity of (210)Pb dating in a complex dynamic coastal system such as Manila Bay may come into question: land-based sediment input can be high, and physical and biological processes can disturb the sediment layers. In this report, the (210)Pb profiles of sediment cores from different parts of the bay are presented. The linear sedimentation rates are shown to be higher in the recent past and are also variable across the bay. The largest change in sedimentation rate coincided with the occurrence of a volcanic eruption in 1991 and is shown by applying a variant of the CIC model in sedimentation rate calculations. The data suggest that (210)Pb dating can be useful in estimating relative magnitudes of sedimentation rates, even in a complex dynamic coastal system like Manila Bay.
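The CIC (constant initial concentration) model mentioned in the abstract can be sketched as follows: if every layer starts with the same unsupported (210)Pb activity, activity decays exponentially with age, so ln(activity) is linear in depth and the slope yields the linear sedimentation rate. This is the textbook form of the model, not necessarily the authors' exact variant:

```python
import math

PB210_HALF_LIFE_YR = 22.3
LAMBDA = math.log(2) / PB210_HALF_LIFE_YR  # (210)Pb decay constant, 1/yr

def cic_sedimentation_rate(depths_cm, activities):
    """CIC model: A(z) = A(0) * exp(-lambda * z / s), so ln A(z) is
    linear in depth z with slope -lambda / s.  Fit the slope by simple
    least squares and return the sedimentation rate s in cm/yr."""
    ys = [math.log(a) for a in activities]
    n = len(depths_cm)
    xbar = sum(depths_cm) / n
    ybar = sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(depths_cm, ys)) /
             sum((x - xbar) ** 2 for x in depths_cm))
    return -LAMBDA / slope
```

A step change in the fitted slope partway down a core, as around the 1991 eruption layer described above, indicates a change in sedimentation rate between the two depth intervals.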
STUDY OF POLLUTANT DISTRIBUTION IN BENOA BAY USING NUMERICAL SIMULATION AND SATELLITE DATA
Directory of Open Access Journals (Sweden)
Komang Ardana
2012-11-01
Full Text Available Eutrophication caused by nitrate and phosphate contamination, together with sedimentation, is the main problem in the territorial waters of Benoa Bay. The distribution of phosphate pollutant in Benoa Bay was modeled numerically with the Princeton Ocean Model (POM). The inputs to this pollutant model were the tidal current pattern, the M2 tidal residual current, and the biological, physical and chemical factors that influence pollutant concentration. Meanwhile, the sediment concentration was mapped with ALOS AVNIR-2 satellite imagery, analysed with a statistical method (linear regression). The modeled phosphate concentrations ranged from 0.0022 mg/l to 0.1 mg/l, levels categorized as very hazardous to the water environment, because the phosphate concentration at the pollutant sources exceeded the environmental quality standard of 0.015 mg/l for fishery cultivation and tourism activity (Bali Governor Regulation No. 8, 2007). The direction of the distribution was governed by the current pattern: moving into the bay on the rising tide and out of the bay on the falling tide. The statistical approach with the ALOS AVNIR-2 sensor can be used to map the sediment distribution in Benoa Bay; the values were R2 = 0.3839 for Band 1, 0.6123 for Band 2 and 0.5468 for Band 3. With this methodology the correlation was not significant, because the quantity of in-situ data was small and the survey was not simultaneous with the satellite acquisition.
Kock, Alison; O’Riain, M. Justin; Mauff, Katya; Meÿer, Michael; Kotze, Deon; Griffiths, Charles
2013-01-01
White sharks (Carcharodon carcharias) are threatened apex predators and identification of their critical habitats and how these are used are essential to ensuring improved local and ultimately global white shark protection. In this study we investigated habitat use by white sharks in False Bay, South Africa, using acoustic telemetry. 56 sharks (39 female, 17 male), ranging in size from 1.7–5 m TL, were tagged with acoustic transmitters and monitored on an array of 30 receivers for 975 days. To investigate the effects of season, sex and size on habitat use we used a generalized linear mixed effects model. Tagged sharks were detected in the Bay in all months and across all years, but their use of the Bay varied significantly with the season and the sex of the shark. In autumn and winter males and females aggregated around the Cape fur seal colony at Seal Island, where they fed predominantly on young of the year seals. In spring and summer there was marked sexual segregation, with females frequenting the Inshore areas and males seldom being detected. The shift from the Island in autumn and winter to the Inshore region in spring and summer by females mirrors the seasonal peak in abundance of juvenile seals and of migratory teleost and elasmobranch species respectively. This study provides the first evidence of sexual segregation at a fine spatial scale and demonstrates that sexual segregation in white sharks is not restricted to adults, but is apparent for juveniles and sub-adults too. Overall, the results confirm False Bay as a critical area for white shark conservation as both sexes, across a range of sizes, frequent the Bay on an annual basis. The finding that female sharks aggregate in the Inshore regions when recreational use peaks highlights the need for ongoing shark-human conflict mitigation strategies. PMID:23383052
2011-04-25
... DEPARTMENT OF HOMELAND SECURITY Coast Guard 33 CFR Part 165 [Docket No. USCG-2011-0196] RIN 1625-AA00 Safety Zone; Bay Ferry II Maritime Security Exercise; San Francisco Bay, San Francisco, CA AGENCY... Security Exercise; San Francisco Bay, San Francisco, CA. (a) Location. The limits of this safety zone...
Marine littoral diatoms from the Gordon’s bay region of False Bay, Cape Province, South Africa
CSIR Research Space (South Africa)
Giffen, MH
1971-01-01
Full Text Available Council for Scientific and Industrial Research, Pretoria (Received: 5.2.1970). The Gordon's Bay region occupies the north-western corner of False Bay, a large rectangular bay, bounded on the west by the Cape Peninsula ending at Cape Point...
Linearly Adjustable International Portfolios
Fonseca, R. J.; Kuhn, D.; Rustem, B.
2010-09-01
We present an approach to multi-stage international portfolio optimization based on the imposition of a linear structure on the recourse decisions. Multiperiod decision problems are traditionally formulated as stochastic programs. Scenario tree based solutions however can become intractable as the number of stages increases. By restricting the space of decision policies to linear rules, we obtain a conservative tractable approximation to the original problem. Local asset prices and foreign exchange rates are modelled separately, which allows for a direct measure of their impact on the final portfolio value.
International Nuclear Information System (INIS)
Barkman, W.E.; Adams, W.Q.; Berrier, B.R.
1978-01-01
A linear induction motor has been operated on a test bed with a feedback pulse resolution of 5 nm (0.2 μin). Slewing tests with this slide drive have shown positioning errors less than or equal to 33 nm (1.3 μin) at feedrates between 0 and 25.4 mm/min (0-1 ipm). A 0.86-m (34-in)-stroke linear motor is being investigated, using the SPACO machine as a test bed. Initial results were encouraging, and work is continuing to optimize the servosystem compensation
Hogben, Leslie
2013-01-01
With a substantial amount of new material, the Handbook of Linear Algebra, Second Edition provides comprehensive coverage of linear algebra concepts, applications, and computational software packages in an easy-to-use format. It guides you from the very elementary aspects of the subject to the frontiers of current research. Along with revisions and updates throughout, the second edition of this bestseller includes 20 new chapters.New to the Second EditionSeparate chapters on Schur complements, additional types of canonical forms, tensors, matrix polynomials, matrix equations, special types of
Linear Algebra Thoroughly Explained
Vujičić, Milan
2008-01-01
Linear Algebra Thoroughly Explained provides a comprehensive introduction to the subject suitable for adoption as a self-contained text for courses at undergraduate and postgraduate level. The clear and comprehensive presentation of the basic theory is illustrated throughout with an abundance of worked examples. The book is written for teachers and students of linear algebra at all levels and across mathematics and the applied sciences, particularly physics and engineering. It will also be an invaluable addition to research libraries as a comprehensive resource book for the subject.
Nonlinear wave runup in long bays and firths: Samoa 2009 and Tohoku 2011 tsunamis
Didenkulova, I.; Pelinovsky, E.
2012-04-01
The recent catastrophic tsunami events in Samoa on 29 September 2009 and in Japan on 11 March 2011 demonstrated that a tsunami may experience abnormal amplification in long bays and firths, resulting in unexpectedly high wave runup. The capital city Pago Pago, located at the toe of a narrow 4-km-long bay and the most characteristic example of a long, narrow bay, was considerably damaged during the 2009 Samoa tsunami (destroyed infrastructure, boats and shipping containers carried inland into commercial areas, etc.). The runup height there reached 8 m over an inundation distance of 538 m at the toe of the bay, while the tsunami wave height measured by the tide gauge at the entrance of the bay was at most 3 m. The same situation was observed during the catastrophic 2011 Tohoku tsunami in Japan, whose coast contains numerous long bays and firths that experienced the highest wave runup and the strongest amplification; examples are the villages of Ofunato (Ryori Bay), where the wave runup reached 30 m, and Onagawa, where the wave amplified up to 17 m. Here we study the nonlinear dynamics of tsunami waves in an inclined U-shaped bay. The nonlinear shallow water equations can in this case be written in 1D form and solved analytically with the use of the hodograph transformation. This approach generalizes the well-known Carrier-Greenspan transformation for long wave runup on a plane beach. In the case of an inclined U-shaped bay it leads to an associated generalized wave equation for a symmetrical wave in fractal space; in the special case of a channel of parabolic cross-section it is the spherically symmetric linear wave equation. As a result, the solution of the Cauchy problem can be expressed in terms of elementary functions and has a simple form (with respect to analysis) for any kind of initial conditions. Wave regimes associated with various localized initial conditions, corresponding to problems of evolution and runup of tsunami, are considered and analyzed. Special attention is ...
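The hodograph reduction described in the abstract can be sketched in LaTeX; the notation here follows the general Carrier-Greenspan literature and is an assumption about the authors' exact symbols, not a quotation of their derivation:

```latex
% After the generalized Carrier--Greenspan (hodograph) transformation,
% the 1D nonlinear shallow-water equations for an inclined bay of
% cross-section $z \propto |y|^{m}$ reduce to a linear wave equation
\[
\frac{\partial^{2}\Phi}{\partial\lambda^{2}}
- \frac{\partial^{2}\Phi}{\partial\sigma^{2}}
- \frac{m_{*}}{\sigma}\,\frac{\partial\Phi}{\partial\sigma} = 0 ,
\]
% where $\sigma$ is proportional to the local wave speed, $\lambda$ is a
% time-like hodograph variable, and the (generally non-integer)
% coefficient $m_{*}$ is set by the cross-section -- the ``wave equation
% in fractal space''.  For a parabolic cross-section $m_{*} = 2$, giving
% the spherically symmetric wave equation with d'Alembert-type solution
\[
\Phi(\lambda,\sigma)
= \frac{\Psi_{1}(\sigma-\lambda) + \Psi_{2}(\sigma+\lambda)}{\sigma} ,
\]
% which is why the Cauchy problem admits solutions in elementary
% functions for arbitrary initial conditions.
```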
San Francisco Bay Water Quality Improvement Fund Points, SF Bay CA, 2015, US EPA Region 9
U.S. Environmental Protection Agency — The San Francisco Bay Water Quality Improvement Fund is a competitive grant program that is helping implement TMDLs to improve water quality, protect wetlands, and...
2012-09-17
... environmental, recreational, and socio-economic benefits and impacts of our LPP alternatives, and respond to... eco-tourism or natural resource-based visitor centers. Nestucca Bay NWR Alternative A: No Action Under...
2010-11-29
... County, Oregon. The refuge was established in 1991 with the acquisition of a 384-acre dairy farm, and has... pastures at Nestucca Bay NWR to tidal marsh, and what effect would this have on the refuge's ability to...
Pärnu Bay Golf Club = Pärnu Bay Golf Club / Arhitekt11
2016-01-01
Pärnu Bay Golf Club, architects Jürgen Lepper, Anto Savi, Margus Soonets, Janar Toomesso (Arhitekt11); interior architects Liina Vaino, Kaari Metslang, Hannelore Kääramees (Arhitekt11). Nominee for the 2016 annual award of the Architecture Endowment of the Cultural Endowment of Estonia.
Discharge between San Antonio Bay and Aransas Bay, southern Gulf Coast, Texas, May-September 1999
East, Jeffery W.
2001-01-01
Along the Gulf Coast of Texas, many estuaries and bays are important habitat and nurseries for aquatic life. San Antonio Bay and Aransas Bay, located about 50 and 30 miles northeast, respectively, of Corpus Christi, are two important estuarine nurseries on the southern Gulf Coast of Texas (fig. 1). According to the Texas Parks and Wildlife Department, “Almost 80 percent of the seagrasses [along the Texas Gulf Coast] are located in the Laguna Madre, an estuary that begins just south of Corpus Christi Bay and runs southward 140 miles to South Padre Island. Most of the remaining seagrasses, about 45,000 acres, are located in the heavily traveled San Antonio, Aransas and Corpus Christi Bay areas” (Shook, 2000). Population growth has led to greater demands on water supplies in Texas. The Texas Water Development Board, the Texas Parks and Wildlife Department, and the Texas Natural Resource Conservation Commission have the cooperative task of determining inflows required to maintain the ecological health of the State’s streams, rivers, bays, and estuaries. To determine these inflow requirements, the three agencies collect data and conduct studies on the need for instream flows and freshwater/saline water inflows to Texas estuaries. To assist in the determination of freshwater inflow requirements, the U.S. Geological Survey (USGS), in cooperation with the Texas Water Development Board, conducted a hydrographic survey of discharge (flow) between San Antonio Bay and Aransas Bay during the period May–September 1999. Automated instrumentation and acoustic technology were used to maximize the amount and quality of data that were collected, while minimizing personnel requirements. This report documents the discharge measured at two sites between the bays during May–September 1999 and describes the influences of meteorologic (wind and tidal) and hydrologic (freshwater inflow) conditions on discharge between the two bays. The movement of water between the bays is
2012-05-23
...The Coast Guard proposes to establish a temporary safety zone on the St. Lawrence River, Alexandria Bay, NY. This proposed rule is intended to restrict vessels from a portion of the St. Lawrence River during the Alexandria Bay Chamber of Commerce fireworks display. The safety zone established by this proposed rule is necessary to protect spectators and vessels from the hazards associated with a fireworks display.
International Nuclear Information System (INIS)
1978-07-01
This report identifies researchers, research activities, and data files applicable to the Chesapeake Bay estuarine system. The identified data were generated after 1973 on the following: submerged aquatic vegetation, shellfish bed closures, eutrophication, toxics accumulation in the food chain, dredging and spoil disposal, hydrologic modifications, modification of fisheries, shoreline erosion, wetlands alterations, and the effects of boating and shipping on water quality. Major past and current program monitoring in the Bay and its tributaries are summarized according to frequency
Elemental analysis of Uranouchi bay seabed sludge using PIXE
International Nuclear Information System (INIS)
Kabir, M. Hasnat; Narusawa, Tadashi; Nishiyama, Fumitaka; Sumi, Katsuhiro
2006-01-01
Elemental analyses were carried out on seabed sludge collected from Uranouchi Bay (Kochi, Japan) using Particle Induced X-ray Emission (PIXE). Seabed-sludge contamination with heavy metals as well as toxic elements has become one of the most serious environmental problems. The aim of the present study is to identify the areas of the bay polluted by heavy and toxic elements. As a result of the analyses of samples collected from eleven different places in the bay, seventeen elements, including toxic ones, were detected. The results suggest that the center region of the bay is seriously contaminated by heavy and toxic elements in comparison with the other areas of the bay. (author)
UTILIZING BAYES' THEOREM IN THE DETERMINATION OF ENT DISEASES
Directory of Open Access Journals (Sweden)
Sri Winiarti
2012-05-01
Full Text Available In tracing a solution with an artificial intelligence approach, various methods can be applied to handle the uncertainty that arises during the search process. One of them is Bayes' theorem. Uncertainty during the search can arise because the knowledge in the system changes, so a method is needed to handle it. In this research, a method for handling uncertainty with Bayes' theorem was applied to a search process for diagnosing ENT (ear, nose and throat) diseases. The subject of this research is a search process that determines ENT diseases using a forward-chaining reasoning model, with certainty handled by Bayes' theorem: the probability of a disease is computed and the probabilities of each of its symptoms are compared. The software development model used in this research is the Waterfall model, beginning with data analysis, followed by system design, coding in Visual Basic 6.0, and system testing with a black-box test and an alpha test. The research produced a software application that can determine ENT diseases, applying the Bayes method to handle uncertainty. System tests indicate that the application is feasible and usable.
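The Bayes step described in the abstract, computing the probability of a disease from the probabilities of its symptoms, can be sketched as a posterior update. The disease names, symptom names, and probabilities below are invented for illustration, and the conditional-independence (naive Bayes) assumption is this sketch's simplification, not necessarily the paper's exact procedure:

```python
def bayes_posterior(priors, likelihoods, observed):
    """Posterior P(disease | observed symptoms) via Bayes' theorem,
    assuming symptoms are conditionally independent given the disease.
    priors:      {disease: P(disease)}
    likelihoods: {disease: {symptom: P(symptom | disease)}}
    observed:    list of symptoms reported present."""
    joint = {}
    for disease, prior in priors.items():
        p = prior
        for s in observed:
            p *= likelihoods[disease].get(s, 0.0)
        joint[disease] = p  # P(disease) * prod P(symptom | disease)
    total = sum(joint.values())
    if total == 0.0:
        return {d: 0.0 for d in joint}  # evidence rules out all hypotheses
    return {d: p / total for d, p in joint.items()}  # normalize
```

In a forward-chaining system like the one described, the rules would first narrow the candidate diseases from the reported symptoms, and a computation like this would then rank the surviving candidates.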
Radioactive source management in Daya Bay NPP
International Nuclear Information System (INIS)
Mao Chun Yang
2000-01-01
'Small sources cause big accidents': such events have occurred worldwide many times. Radioactive source management in a nuclear power plant is very important for its safety record. This paper introduces the methods and experience of radioactive source management at Daya Bay NPP, covering the clarification of responsibilities, centralized management of high-activity sources, work process management, experience feedback, etc. (author)
Bathymetry (2011) for Fish Bay, St. John
National Oceanic and Atmospheric Administration, Department of Commerce — This image represents a LiDAR (Light Detection & Ranging) 0.3x0.3 meter resolution depth surface for Fish Bay, St. John in the U.S. Virgin Islands (USVI). The...
Sediment Characterization in St. Alban's Bay, VT
Nethercutt, S.; Manley, T.; Manley, P.
2017-12-01
St. Alban's Bay within Lake Champlain is plagued with harmful algal blooms. With intensification expected due to climate change, a multidisciplinary program (BREE: Basin Resilience to Extreme Events) was initiated in 2016. In order to assess the mobilization of harmful nutrients from sediment resuspension events and riverine input, 74 sediment samples were collected in a grid fashion throughout St. Alban's Bay. Sediments were deflocculated and analyzed using an LA920 Horiba laser scattering particle size distribution analyzer to define the frequency of sediment sizes from clay to sand. Gridded surfaces of mean sortable silt percentage, silt percentage, sand percentage, and clay percentage were used to represent the sediment distribution of the region. A plot of diameter versus frequency showed the bimodal nature of some of the sediments, with one peak at about 10 microns diameter (silt) and the second at about 525 microns diameter (sand). The data showed an extremely low percentage of clay relative to that of sand and silt. The highest frequencies of sortable silt, which represents the most easily mobilized particle size, are found in the deepest areas of the bay, suggesting that these regions are where dominant bottom flow occurs. The high occurrence of sortable silt in St. Alban's Bay suggests that sediment mobilization, and therefore nutrient mobilization, has the potential to occur. These data, combined with high-resolution multibeam and hydrodynamic data, will enable future models of water flow and sediment remobilization studies.
Underwater Gravity Survey of Northern Monterey Bay.
stations were occupied just above the swash zone. A complete Bouguer anomaly map was drawn and tied in with the previous land surveys and with one...covering the southern half of the bay. The isolines of the complete Bouguer anomaly indicate the relative vertical position of the basement complex Santa
Padilla Bay: The Estuary Guide. Level 2.
Friesem, Judy; Lynn, Valerie, Ed.
Estuaries are marine systems that serve as nurseries for animals, links in migratory pathways, and habitat for a complex community of organisms. This curriculum guide, intended for use at the middle school level, is designed for use with the on-site program developed by the Padilla Bay National Estuarine Research Reserve (Washington). The guide…
Carolina bays of the Savannah River Plant
Energy Technology Data Exchange (ETDEWEB)
Schalles, J.F. (Creighton Univ., Omaha, NE (USA)); Sharitz, R.R.; Gibbons, J.W.; Leversee, G.J.; Knox, J.N. (Savannah River Ecology Lab., Aiken, SC (USA))
1989-01-01
Much of the research to date on the Carolina bays of the Savannah River Plant and elsewhere has focused on certain species or on environmental features. Different levels of detail exist for different groups of organisms and reflect the diverse interests of previous investigators. This report summarizes aspects of research to date and presents data from numerous studies. 70 refs., 14 figs., 12 tabs.
Fecal indicator bacteria at Havana Bay
International Nuclear Information System (INIS)
Lopez Perez, Lisse; Gomez D'Angelo, Yamiris; Beltran Gonzalez, Jesus; Alvarez Valiente, Reinaldo
2013-01-01
Aims: Fecal indicator bacteria concentrations were evaluated in Havana Bay. Methods: Concentrations of traditional fecal indicator bacteria were determined between April 2010 and February 2011 by MPN methods: thermotolerant coliforms (CTT), Escherichia coli, fecal streptococci (EF) and intestinal enterococci (ENT) in seawater, and Clostridium perfringens in surface sediment. Results: CTT and E. coli levels were far above the Cuban water quality standard for indirect contact with water, showing the negative influence of sewage and rivers on the bay. EF and ENT were measured during sewage spills at the discharge site and were suitable indicators of fecal contamination there, but they did not show the same behavior at the other selected sites. This result is attributable to their well-known inactivation by sunlight in tropical zones and the presumable presence of humic acids in the waters of the bay. Conclusion: Fecal indicator bacteria and their statistical relationships reflect recent and chronic fecal contamination of the bay and nearby shores.
Tortuguero Bay [Puerto Rico] environmental studies
International Nuclear Information System (INIS)
Wood, E.D.; Youngbluth, M.J.; Nutt, M.E.; Yoshioka, P.; Canoy, M.J.
1975-01-01
Site selection surveys and environmental research studies of seven coastal sites in Puerto Rico for construction of power generating facilities were carried out. Data are presented on the physical, chemical, and geological parameters of the Tortuguero Bay site, and the ecological parameters of zooplankton, benthic invertebrates, plant and fish communities. (U.S.)
Roebuck Bay Invertebrate and bird Mapping 2006
Piersma, Theunis; Pearson, Grant B.; Hickey, Robert; Dittmann, Sabine; Rogers, Danny I.; Folmer, Eelke; Honkoop, Pieter; Drent, Jan; Goeij, Petra de; Marsh, Loisette
2006-01-01
1. This is a report on a survey of the benthic ecology of the intertidal flats along the northern shores of Roebuck Bay in June 2006. In the period 11-20 June we mapped both the invertebrate macrobenthic animals (those retained by a 1 mm sieve) over the whole of the northern intertidal area of
Morphological features in the Bay of Bengal
Digital Repository Service at National Institute of Oceanography (India)
Sarma, K.V.L.N.S.; Ramana, M.V.; Subrahmanyam, V.; Krishna, K.S.; Ramprasad, T.; Desa, M.
history of the Fan. After India's soft collision with the Eurasian plate, these events may have played a critical role in shaping various morphological features since late Eocene in the Bay of Bengal. The present 12 kHz Echo sounder data collected along...
ULF fluctuations at Terra Nova Bay (Antarctica)
Directory of Open Access Journals (Sweden)
A. Meloni
2000-06-01
Full Text Available ULF geomagnetic field measurements in Antarctica are a very important tool for better understanding the dynamics of the Earth's magnetosphere and its response to variable solar wind conditions. We review the results obtained in the last few years at the Italian observatory at Terra Nova Bay
National Oceanic and Atmospheric Administration, Department of Commerce — TASK NAME:(NRCS) Saginaw Bay, MI LiDAR LiDAR Data Acquisition and Processing Production Task USGS Contract No. G10PC00057 Task Order No. G11PD01254 Woolpert Order...
Divergent Priors and well Behaved Bayes Factors
R.W. Strachan (Rodney); H.K. van Dijk (Herman)
2011-01-01
Divergent priors are improper when defined on unbounded supports. Bartlett's paradox has been taken to imply that using improper priors results in ill-defined Bayes factors, preventing model comparison by posterior probabilities. However many improper priors have attractive properties
Bathymetry (2011) for Coral Bay, St. John
National Oceanic and Atmospheric Administration, Department of Commerce — This image represents a LiDAR (Light Detection & Ranging) 0.3x0.3 meter resolution depth surface for Coral Bay, St. John in the U.S. Virgin Islands (USVI). The...
Empirical Phenomenology: A Qualitative Research Approach (The ...
African Journals Online (AJOL)
Empirical Phenomenology: A Qualitative Research Approach (The Cologne Seminars) ... and practical application of empirical phenomenology in social research. ... and considers its implications for qualitative methods such as interviewing ...
Pb’s high sedimentation inside the bay mouth of Jiaozhou Bay
Yang, Dongfang; Miao, Zhenqing; Huang, Xinmin; Wei, Linzhen; Feng, Ming
2017-12-01
Sedimentation is one of the key environmental behaviors of pollutants in the ocean. This paper analyzed the seasonal and temporal variations of the sedimentation process of Pb in Jiaozhou Bay in 1987. Results showed that Pb contents in bottom waters of Jiaozhou Bay in May, July and November 1987 were 1.87-2.60 μg L-1, 15.11-19.68 μg L-1 and 11.08-15.18 μg L-1, respectively, and the corresponding pollution levels of Pb were slight, heavy and heavy. In May 1987, the sedimentation process was weak in waters outside the bay mouth, yet strong in waters in the middle of and inside the bay mouth. In July and November 1987, the sedimentation process was weak in waters outside the bay mouth, yet strong in waters inside the bay mouth. The seasonal and temporal variations of the sedimentation process of Pb were determined by variations in source input and by the vertical movement of the water.
Energy Technology Data Exchange (ETDEWEB)
Ledvina, Joseph A.
2008-05-01
Research on the effects of wetland restoration on reptiles and amphibians is becoming more common, but almost all of these studies have observed the colonization of recently disturbed habitats that were completely dry at the time of restoration. In a similar manner, investigations of herpetofaunal responses to forest management have focused on clearcuts, and less intensive stand manipulations are not as well studied. To evaluate community and population responses of reptiles and amphibians to hydrology restoration and canopy removal in the interior of previously degraded Carolina bays, I monitored herpetofauna in the uplands adjacent to six historically degraded Carolina bays at the Savannah River Site (SRS) in South Carolina for four years after restoration. To evaluate the effects of forest thinning on upland herpetofauna, forests were thinned in the margins of three of these bays. I used repeated-measures ANOVA to compare species richness and diversity and the abundance of selected species and guilds between these bays and those at three reference bays that were not historically drained and three control bays that remained degraded. I also used Non-metric Multidimensional Scaling (NMDS) to look for community-level patterns based on treatments.
America, Linearly Cyclical
C2C Jessica Adams; Dr. Brissett
2013-05-10
...his desires, his failings, and his aspirations follow the same general trend throughout history and throughout cultures. The founding fathers sought
International Nuclear Information System (INIS)
Southworth, B.
1985-01-01
The peak of the construction phase of the Stanford Linear Collider, SLC, to achieve 50 GeV electron-positron collisions has now been passed. The work remains on schedule to attempt colliding beams, initially at comparatively low luminosity, early in 1987. (orig./HSI).
International Nuclear Information System (INIS)
Mafra Neto, F.
1992-01-01
The dose of gamma radiation from a linear source of cesium-137 is obtained, addressing two difficulties: the oblique filtration of radiation as it crosses the platinum wall in different directions, and the dose correction due to scattering by the material medium of propagation. (C.G.C.)
Resistors Improve Ramp Linearity
Kleinberg, L. L.
1982-01-01
Simple modification to bootstrap ramp generator gives more linear output over longer sweep times. New circuit adds just two resistors, one of which is adjustable. Modification cancels nonlinearities due to variations in load on charging capacitor and due to changes in charging current as the voltage across capacitor increases.
LINEAR COLLIDERS: 1992 workshop
International Nuclear Information System (INIS)
Settles, Ron; Coignet, Guy
1992-01-01
As work on designs for future electron-positron linear colliders pushes ahead at major laboratories throughout the world within an international collaboration framework, the LC92 workshop held in Garmisch-Partenkirchen this summer, attended by 200 machine and particle physicists, provided a timely focus
Brameier, Markus
2007-01-01
Presents a variant of Genetic Programming that evolves imperative computer programs as linear sequences of instructions, in contrast to the more traditional functional expressions or syntax trees. This book serves as a reference for researchers, but also contains sufficient introduction for students and those who are new to the field
Dobbs, David E.
2013-01-01
A direct method is given for solving first-order linear recurrences with constant coefficients. The limiting value of that solution is studied as "n to infinity." This classroom note could serve as enrichment material for the typical introductory course on discrete mathematics that follows a calculus course.
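The direct-solution idea the note describes can be checked numerically. A sketch (my notation, not the author's): for x_{n+1} = a·x_n + b with constant a ≠ 1, the closed form is x_n = a^n·x_0 + b·(1 − a^n)/(1 − a), which converges to b/(1 − a) as n → ∞ when |a| < 1:

```python
def recurrence(x0, a, b, n):
    """Iterate x_{k+1} = a*x_k + b for n steps."""
    x = x0
    for _ in range(n):
        x = a * x + b
    return x

def closed_form(x0, a, b, n):
    """Direct solution x_n = a^n x0 + b (1 - a^n) / (1 - a), for a != 1."""
    return a**n * x0 + b * (1 - a**n) / (1 - a)

# The two agree, and for |a| < 1 both approach the limit b / (1 - a).
x_iter = recurrence(2.0, 0.5, 3.0, 40)
x_form = closed_form(2.0, 0.5, 3.0, 40)
limit = 3.0 / (1 - 0.5)
```

With a = 0.5 the geometric term a^n decays quickly, so after 40 steps the iterate sits essentially at the limiting value, as the note's "n to infinity" discussion suggests.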
International Nuclear Information System (INIS)
Takeda, Seishi
1992-01-01
The status of R and D of future e+e- linear colliders proposed by institutions throughout the world is described, including the JLC, NLC, VLEPP, CLIC, DESY/THD and TESLA projects. The parameters and RF sources are discussed. (G.P.) 36 refs.; 1 tab
Microbial biogeography of San Francisco Bay sediments
Lee, J. A.; Francis, C. A.
2014-12-01
The largest estuary on the west coast of North America, San Francisco Bay is an ecosystem of enormous biodiversity, and also enormous human impact. The benthos has experienced dredging, occupation by invasive species, and over a century of sediment input as a result of hydraulic mining. Although the Bay's great cultural and ecological importance has inspired numerous surveys of the benthic macrofauna, to date there has been almost no investigation of the microbial communities on the Bay floor. An understanding of those microbial communities would contribute significantly to our understanding of both the biogeochemical processes (which are driven by the microbiota) and the physical processes (which contribute to microbial distributions) in the Bay. Here, we present the first broad survey of bacterial and archaeal taxa in the sediments of the San Francisco Bay. We conducted 16S rRNA community sequencing of bacteria and archaea in sediment samples taken bimonthly for one year, from five sites spanning the salinity gradient between Suisun and Central Bay, in order to capture the effect of both spatial and temporal environmental variation on microbial diversity. From the same samples we also conducted deep sequencing of a nitrogen-cycling functional gene, nirS, allowing an assessment of evolutionary diversity at a much finer taxonomic scale within an important and widespread functional group of bacteria. We paired these sequencing projects with extensive geochemical metadata as well as information about macrofaunal distribution. Our data reveal a diversity of distinct biogeographical patterns among different taxa: clades ubiquitous across sites; clades that respond to measurable environmental drivers; and clades that show geographical site-specificity. These community datasets allow us to test the hypothesis that salinity is a major driver of both overall microbial community structure and community structure of the denitrifying bacteria specifically; and to assess
Isotope systematics of contaminant leads in Monterey Bay
International Nuclear Information System (INIS)
Flegal, A.R.; Rosman, K.J.R.; Stephenson, M.D.
1987-01-01
Isotopic compositions of stable lead (204Pb, 206Pb, 207Pb, and 208Pb) were utilized to identify a lead slag deposit as the principal source of contaminant lead in Monterey Bay. This point source had been indicated by anomalously high lead concentrations in native mussels (Mytilus californianus) near that deposit, which were orders of magnitude above the base-line concentration of the species (0.5 μg/g). Subsequent analyses revealed that the lead concentrations of both transplanted mussels and intertidal sediments were positively correlated with their proximity to the slag deposit. Complementary lead isotopic compositions substantiated those empirical correlations by demonstrating that the slag was the predominant source of contaminant lead in both the mussels and the sediments. Analyses of the digestive tracts of mussels from the slag deposit indicated that ingested slag particulates accounted for their elevated lead concentrations, while analyses of their gonads indicated that dissolved lead from other industrial sources was also being bioaccumulated by passive adsorption on exposed surfaces. Therefore, this study has demonstrated the potential of lead isotope systematics both to identify sources of lead contamination in marine organisms and to trace its biogeochemical cycle in the marine environment. 26 references, 3 figures, 5 tables
Unmixing of spectral components affecting AVIRIS imagery of Tampa Bay
Carder, Kendall L.; Lee, Z. P.; Chen, Robert F.; Davis, Curtiss O.
1993-09-01
According to Kirk's as well as Morel and Gentili's Monte Carlo simulations, the popular simple expression, R approximately equals 0.33 bb/a, relating subsurface irradiance reflectance (R) to the ratio of the backscattering coefficient (bb) to the absorption coefficient (a), is not valid for bb/a > 0.25. This means that it may no longer be valid for values of remote-sensing reflectance (the above-surface ratio of water-leaving radiance to downwelling irradiance) where Rrs > 0.01. Since no simple Rrs expression has been developed for very turbid waters, we developed one based in part on Monte Carlo simulations and empirical adjustments to an Rrs model, and applied it to rather turbid coastal waters near Tampa Bay to evaluate its utility for unmixing the optical components affecting the water-leaving radiance. With the high spectral (10 nm) and spatial (20 m) resolution of Airborne Visible-InfraRed Imaging Spectrometer (AVIRIS) data, the water depth and bottom type were deduced using the model for shallow waters. This research demonstrates the necessity of further work to improve interpretations of scenes with highly variable turbid waters, and it emphasizes the utility of high spectral-resolution data such as AVIRIS's for better understanding complicated coastal environments such as the west Florida shelf.
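The validity limit quoted in this abstract is easy to illustrate numerically. A minimal sketch in Python: the 0.33 factor and the bb/a ≤ 0.25 cutoff come from the abstract, while the function name and the validity-flag behavior are illustrative assumptions, not the authors' model:

```python
def irradiance_reflectance(bb, a):
    """Simple model R ~ 0.33 * bb / a, flagged as invalid for bb/a > 0.25
    (turbid water), per the Monte Carlo results cited in the abstract."""
    ratio = bb / a
    return 0.33 * ratio, ratio <= 0.25

# Clear water: the simple expression applies; turbid water: it breaks down,
# motivating the turbid-water Rrs model the authors developed.
r_clear, ok_clear = irradiance_reflectance(bb=0.02, a=0.2)   # bb/a = 0.1
r_turbid, ok_turbid = irradiance_reflectance(bb=0.3, a=0.5)  # bb/a = 0.6
```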
Rapid Crustal Uplift at Birch Bay, Washington
Sherrod, B. L.; Kelsey, H. M.; Blakely, R. J.
2010-12-01
Geomorphology and coastal marsh stratigraphy suggest late Holocene uplift of the shoreline at Birch Bay, located northwest of Bellingham, Washington, during an earthquake on a shallow fault. LiDAR images show a raised, late Holocene shoreline along Birch Bay, with ~1 m of elevation difference between the modern shoreline and the inferred paleoshoreline. Commercial seismic reflection images reveal an anticline in Tertiary and possibly Quaternary deposits underlying Birch Bay. NW-trending magnetic anomalies are likely associated with the Birch Bay anticline and other nearby structures. Taken together, the geophysical data and lidar images suggest uplift of young deposits along a NW-trending blind reverse fault. Stratigraphy from Terrell Creek marsh, located just south of Birch Bay, shows freshwater peat buried by lower intertidal muds, indicating local submergence ~1300 yr BP. Stratigraphy of a 70-cm sediment core from Birch Bay marsh, sitting astride the anticline imaged with seismic reflection data, shows mud buried by detrital peat. One radiocarbon age from the core places the abrupt change from mud to peat prior to 1520-1700 yr BP. We divide fossil diatom assemblages straddling the mud-peat contact at Birch Bay into three zones. The oldest zone consists primarily of intertidal and marine diatoms, dominated by Paralia sulcata, Scoleoneis tumida, Grammataphora oceanica, and Gyrosigma balticum. An intermediate zone, beginning at the sharp contact between mud and overlying peat, consists of a mixture of brackish marsh and freshwater species, dominated by Diploneis interrupta, with lesser amounts of Aulacoseira sp., Pinnularia viridis, Eunotia pectinalis, and Paralia sulcata. A third and youngest zone lies in the upper half of the peat and is dominated by poorly preserved freshwater diatoms, mostly Aulacoseira cf. crassapuntata, Pinnularia viridis, P. maior, Eunotia pectinalis, and E. praerupta. Paleoecological inferences, based on distributions of modern diatoms
POTENTIAL HAZARDS OF SEDIMENT IN KENDARI BAY, SOUTHEAST SULAWESI
Directory of Open Access Journals (Sweden)
Nur Adi Kristanto
2017-07-01
Full Text Available Kendari bay is located in front of Kendari city. There are two harbors in the inner part of the bay which are very important in supporting economic activities such as shipping and passenger transportation. The results of coastal characteristic mapping and a physical oceanography survey show varied coastal morphology, vegetation, weathering processes, sedimentation, currents, water depth and sea floor morphology. Kendari bay is an enclosed bay; the area is wide in the inner part and narrow at the mouth of the bay (outlet), so that the morphology resembles a bottle's neck. Numerous river mouths are concentrated around the bay. The rivers carry material from the land, since erosion on land is intensive. There is an indication that the sediment supplied from land through the river mouths is not equivalent to the outlet capacity: sediment load is trapped in the inner bay because of the outlet morphology. The high sedimentation rate thus plays an important role in the shallowing of the water depth in Kendari bay. This condition makes Kendari bay an area prone to sediment hazard due to the high rate of the sedimentary process. Therefore, to anticipate the hazards, precautions should be taken, given the role of Kendari bay as the center of activities in southeast Sulawesi. Further surveys are needed, such as marine geotechnical and onshore environmental studies, to collect data which can be used as a database for development planning. Key words: Potential hazard, sediment, Kendari Bay
Neural network modelling of planform geometry of headland-bay beaches
Iglesias, G.; López, I.; Castro, A.; Carballo, R.
2009-02-01
The shoreline of beaches in the lee of coastal salients or man-made structures, usually known as headland-bay beaches, has a distinctive curvature; wave fronts curve as a result of wave diffraction at the headland and in turn cause the shoreline to bend. The ensuing curved planform is of great interest both as a peculiar landform and in the context of engineering projects in which it is necessary to predict how a coastal structure will affect the sandy shoreline in its lee. A number of empirical models have been put forward, each based on a specific equation. A novel approach, based on the application of artificial neural networks, is presented in this work. Unlike the conventional method, no particular equation of the planform is embedded in the model. Instead, it is the model itself that learns about the problem from a series of examples of headland-bay beaches (the training set) and thereafter applies this self-acquired knowledge to other cases (the test set) for validation. Twenty-three headland-bay beaches from around the world were selected, of which sixteen and seven make up the training and test sets, respectively. As there is no well-developed theory for deciding upon the most convenient neural network architecture to deal with a particular data set, an experimental study was conducted in which ten different architectures with one and two hidden neuron layers and five training algorithms - 50 different options combining network architecture and training algorithm - were compared. Each of these options was implemented, trained and tested in order to find the best-performing approach for modelling the planform of headland-bay beaches. Finally, the selected neural network model was compared with a state-of-the-art planform model and was shown to outperform it.
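The learn-from-examples approach described above can be sketched with a toy one-hidden-layer network trained by gradient descent. This is a generic NumPy illustration of fitting a curved profile, under assumed toy data and an assumed 8-unit tanh architecture; it is not the architectures, training algorithms, or beach data of the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "planform" data: distance from a reference point as a smooth,
# curved function of angle (a stand-in for a headland-bay shoreline).
theta = np.linspace(0.2, 1.5, 40).reshape(-1, 1)
r = 1.0 / theta + 0.1 * theta

# One hidden layer of 8 tanh units, trained by full-batch gradient descent.
W1 = rng.normal(0.0, 0.5, (1, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.05

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

_, pred = forward(theta)
loss_before = float(np.mean((pred - r) ** 2))

for _ in range(2000):
    h, pred = forward(theta)
    err = pred - r                       # dL/dpred (up to a constant factor)
    gW2 = h.T @ err / len(theta); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h ** 2)   # back-propagate through tanh
    gW1 = theta.T @ dh / len(theta); gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred = forward(theta)
loss_after = float(np.mean((pred - r) ** 2))
```

In the study's terms, the training loop plays the role of fitting on the sixteen training beaches, and evaluating the trained network on held-out angles would correspond to the seven-beach test set.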
Fernandes, Veronica; Ramaiah, N.
2016-03-01
Mesozooplankton samples were collected from the mixed layer along a central (along 88°E) and a western transect in the Bay of Bengal during four seasons covered between 2001 and 2006, in order to investigate spatio-temporal variability in their biomass. At these stations, grazing and respiration rates were measured from live zooplankton hauled in from the surface during December 2005. Akin to the mesozooplankton "paradox" in the central and eastern Arabian Sea, biomass in the mixed layer was more or less invariant in the central and western Bay of Bengal, even as the chl a showed marginal temporal variation. The mesozooplankton production rate, calculated by empirical equation to be 70-246 mg C m-2 d-1, is on par with that of the Arabian Sea. Contrary to conventional belief, the mesozooplankton grazing impact was up to 83% of primary production (PP). Low PP coupled with very high zooplankton production (70% of PP), along with abundant bacterial production (50% of PP; Ramaiah et al., 2009), is likely to render the Bay of Bengal net heterotrophic, especially during the spring intermonsoon. Greater estimates of fecal pellet-carbon egestion by mesozooplankton compared to the average particulate organic carbon flux in sediment traps imply that much of the matter is recycled by heterotrophic communities in the mixed layer, facilitating nutrient regeneration for phytoplankton growth. We also calculated that over a third of the primary production is channelized for basin-wide zooplankton respiration, which accounts for 52 Mt C annually. In the current scenario of global warming, if low-productivity warm pools like the Bay of Bengal continue to be net heterotrophic, negative implications are imminent: enhanced emission of CO2 to the atmosphere, increased particulate flux to deeper waters, and greater utilization of dissolved oxygen resulting in expansion of the existing oxygen minimum zone.
Autumn photoproduction of carbon monoxide in Jiaozhou Bay, China
Ren, Chunyan; Yang, Guipeng; Lu, Xiaolan
2014-06-01
Carbon monoxide (CO) plays a significant role in global warming and atmospheric chemistry. The global oceans are net natural sources of atmospheric CO. CO at the surface ocean is primarily produced from the photochemical degradation of chromophoric dissolved organic matter (CDOM). In this study, the effects of photobleaching, temperature and the origin (terrestrial or marine) of CDOM on the apparent quantum yields (AQY) of CO were studied for seawater samples collected from Jiaozhou Bay. Our results demonstrate that photobleaching, temperature and the origin of CDOM strongly affected the efficiency of CO photoproduction. The concentration, absorbance and fluorescence of CDOM decreased exponentially with increasing light dose. Terrestrial riverine organic matter may be more prone to photodegradation than the marine algae-derived kind. The relationships between CO AQY and the dissolved organic carbon-specific absorption coefficient at 254 nm were nonlinear for the photobleached samples, whereas those of the original samples were strongly linear. This suggests that: 1) terrestrial riverine CDOM was more efficient than marine algae-derived CDOM for CO photoproduction; and 2) aromatic and olefinic moieties of the CDOM pool were affected more strongly by degradation processes than aliphatic ones were. The photoproduction rate of CO in autumn was estimated to be 31.98 μmol m-2 d-1, and the total DOC photomineralization was equivalent to 3.25%-6.35% of primary production in Jiaozhou Bay. Our results indicate that CO photochemistry in coastal areas is important for the oceanic carbon cycle.
Holocene evolution of Apalachicola Bay, Florida
Osterman, L.E.; Twichell, D.C.; Poore, R.Z.
2009-01-01
A program of geophysical mapping and vibracoring was conducted to better understand the geologic evolution of Apalachicola Bay. Analyses of the geophysical data and sediment cores, along with age control provided by 34 AMS 14C dates on marine shells and wood, reveal the following history. As sea level rose in the early Holocene, fluvial deposits filled the Apalachicola River paleochannel, which extended southward under the central part of the bay and seaward across the continental shelf. Sediments to either side of the paleochannel contain abundant wood fragments, with dates documenting that those areas were forested at 8,000 14C years b.p. As sea level continued to rise, spits formed of headland prodelta deposits. Between ~6,400 and ~2,500 14C years b.p., an Apalachicola prodelta prograded and receded several times across the inner shelf that underlies the western part of the bay. An eastern deltaic lobe was active for a shorter time, between ~5,800 and 5,100 14C years b.p. Estuarine benthic foraminiferal assemblages occurred in the western bay as early as 6,400 14C years b.p., indicating that some physical barrier to open-ocean circulation and shelf species was established by that time. It is considered that shoals formed in the region of the present barrier islands as the rising sea flooded an interstream divide. Estuarine conditions were established very early in the post-glacial flooding of the bay. © 2009 US Government.
Intermediation by Banks and Economic Growth: A Review of Empirical Evidence
Directory of Open Access Journals (Sweden)
Marijana Bađun
2009-06-01
Full Text Available This paper provides a review of empirical research on the link between financial intermediation by banks and economic growth. Special attention is paid to the issues of causality, non-linearity, time perspective, financial intermediation proxies, and interaction terms. The review shows that there are still quite a few unresolved issues in empirical research, which justifies scepticism towards prioritizing financial sector policies as a means of promoting economic growth. Progress in the finance and growth literature is slow and researchers seem to go round in circles. A possibly fruitful direction for future empirical research is the relationship between government and banks, especially from the standpoint of political economy.
Science and the British Empire.
Harrison, Mark
2005-03-01
The last few decades have witnessed a flowering of interest in the history of science in the British Empire. This essay aims to provide an overview of some of the most important work in this area, identifying interpretative shifts and emerging themes. In so doing, it raises some questions about the analytical framework in which colonial science has traditionally been viewed, highlighting interactions with indigenous scientific traditions and the use of network-based models to understand scientific relations within and beyond colonial contexts.
Empirical logic and tensor products
International Nuclear Information System (INIS)
Foulis, D.J.; Randall, C.H.
1981-01-01
In our work we are developing a formalism called empirical logic to support a generalization of conventional statistics; the resulting generalization is called operational statistics. We are not attempting to develop or advocate any particular physical theory; rather, we are formulating a precise 'language' in which such theories can be expressed, compared, evaluated, and related to laboratory experiments. We believe that only in such a language can the connections between real physical procedures (operations) and physical theories be made explicit and perspicuous. (orig./HSI)
Estimating monotonic rates from biological data using local linear regression.
Olito, Colin; White, Craig R; Marshall, Dustin J; Barneche, Diego R
2017-03-01
Accessing many fundamental questions in biology begins with empirical estimation of simple monotonic rates of underlying biological processes. Across a variety of disciplines, ranging from physiology to biogeochemistry, these rates are routinely estimated from non-linear and noisy time series data using linear regression and ad hoc manual truncation of non-linearities. Here, we introduce the R package LoLinR, a flexible toolkit to implement local linear regression techniques to objectively and reproducibly estimate monotonic biological rates from non-linear time series data, and demonstrate possible applications using metabolic rate data. LoLinR provides methods to easily and reliably estimate monotonic rates from time series data in a way that is statistically robust, facilitates reproducible research and is applicable to a wide variety of research disciplines in the biological sciences. © 2017. Published by The Company of Biologists Ltd.
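LoLinR itself is an R package; purely as an illustration of the underlying idea (the function name, exhaustive window search, and R-squared ranking below are simplifying assumptions, not LoLinR's actual weighted criteria), a local linear rate estimate can be sketched in Python:

```python
import numpy as np

def local_linear_rate(t, y, min_window=5):
    """Estimate a monotonic rate as the slope of the best-fitting
    local linear window of a noisy time series. Ranking candidate
    windows by R-squared is a simplification of what LoLinR does."""
    n = len(t)
    best_r2, best_slope = -np.inf, None
    for i in range(n - min_window + 1):
        for j in range(i + min_window, n + 1):
            tt, yy = t[i:j], y[i:j]
            slope, intercept = np.polyfit(tt, yy, 1)
            resid = yy - (slope * tt + intercept)
            ss_tot = np.sum((yy - yy.mean()) ** 2)
            r2 = 1.0 - np.sum(resid ** 2) / ss_tot if ss_tot > 0 else 0.0
            if r2 > best_r2:
                best_r2, best_slope = r2, slope
    return best_slope

# Noisy time series whose underlying rate is 2.0 units per time unit
rng = np.random.default_rng(0)
t = np.arange(20.0)
y = 2.0 * t + rng.normal(0.0, 0.1, t.size)
rate = local_linear_rate(t, y)  # close to 2.0
```

The point of the objective search is to replace the ad hoc manual truncation of non-linear segments that the abstract criticizes.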
Finite-dimensional linear algebra
Gockenbach, Mark S
2010-01-01
Contents: Some problems posed on vector spaces: linear equations; best approximation; diagonalization; summary. Fields and vector spaces: fields; vector spaces; subspaces; linear combinations and spanning sets; linear independence; basis and dimension; properties of bases; polynomial interpolation and the Lagrange basis; continuous piecewise polynomial functions. Linear operators: linear operators; more properties of linear operators; isomorphic vector spaces; linear operator equations; existence and uniqueness of solutions; the fundamental theorem; inverse operators. Gaussian elimination; Newton's method; linear ordinary differential eq
Mission Operations Planning with Preferences: An Empirical Study
Bresina, John L.; Khatib, Lina; McGann, Conor
2006-01-01
This paper presents an empirical study of some nonexhaustive approaches to optimizing preferences within the context of constraint-based, mixed-initiative planning for mission operations. This work is motivated by the experience of deploying and operating the MAPGEN (Mixed-initiative Activity Plan GENerator) system for the Mars Exploration Rover Mission. Responsiveness to the user is one of the important requirements for MAPGEN; hence, the additional computation time needed to optimize preferences must be kept within reasonable bounds. This was the primary motivation for studying non-exhaustive optimization approaches. The specific goals of the empirical study are to assess the impact on solution quality of two greedy heuristics used in MAPGEN and to assess the improvement gained by applying a linear programming optimization technique to the final solution.
Linearity and Non-linearity of Photorefractive effect in Materials ...
African Journals Online (AJOL)
In this paper we have studied the Linearity and Non-linearity of Photorefractive effect in materials using the band transport model. For low light beam intensities the change in the refractive index is proportional to the electric field for linear optics while for non-linear optics the change in refractive index is directly proportional ...
Topobathymetric model of Mobile Bay, Alabama
Danielson, Jeffrey J.; Brock, John C.; Howard, Daniel M.; Gesch, Dean B.; Bonisteel-Cormier, Jamie M.; Travers, Laurinda J.
2013-01-01
Topobathymetric Digital Elevation Models (DEMs) are a merged rendering of both topography (land elevation) and bathymetry (water depth) that provides a seamless elevation product useful for inundation mapping, as well as for other earth science applications, such as the development of sediment-transport, sea-level rise, and storm-surge models. This 1/9-arc-second (approximately 3 meters) resolution model of Mobile Bay, Alabama was developed using multiple topographic and bathymetric datasets, collected on different dates. The topographic data were obtained primarily from the U.S. Geological Survey (USGS) National Elevation Dataset (NED) (http://ned.usgs.gov/) at 1/9-arc-second resolution; USGS Experimental Advanced Airborne Research Lidar (EAARL) data (2 meters) (http://pubs.usgs.gov/ds/400/); and topographic lidar data (2 meters) and Compact Hydrographic Airborne Rapid Total Survey (CHARTS) lidar data (2 meters) from the U.S. Army Corps of Engineers (USACE) (http://www.csc.noaa.gov/digitalcoast/data/coastallidar/). Bathymetry was derived from digital soundings obtained from the National Oceanic and Atmospheric Administration’s (NOAA) National Geophysical Data Center (NGDC) (http://www.ngdc.noaa.gov/mgg/geodas/geodas.html) and from water-penetrating lidar sources, such as EAARL and CHARTS. Mobile Bay is ecologically important as it is the fourth largest estuary in the United States. The Mobile and Tensaw Rivers drain into the bay at the northern end with the bay emptying into the Gulf of Mexico at the southern end. Dauphin Island (a barrier island) and the Fort Morgan Peninsula form the mouth of Mobile Bay. Mobile Bay is 31 miles (50 kilometers) long by a maximum width of 24 miles (39 kilometers) with a total area of 413 square miles (1,070 square kilometers). The vertical datum of the Mobile Bay topobathymetric model is the North American Vertical Datum of 1988 (NAVD 88). All the topographic datasets were originally referenced to NAVD 88 and no transformations
Linearly Refined Session Types
Directory of Open Access Journals (Sweden)
Pedro Baltazar
2012-11-01
Full Text Available Session types capture precise protocol structure in concurrent programming, but do not specify properties of the exchanged values beyond their basic type. Refinement types are a form of dependent types that can address this limitation, combining types with logical formulae that may refer to program values and can constrain types using arbitrary predicates. We present a pi calculus with assume and assert operations, typed using a session discipline that incorporates refinement formulae written in a fragment of Multiplicative Linear Logic. Our original combination of session and refinement types, together with the well established benefits of linearity, allows very fine-grained specifications of communication protocols in which refinement formulae are treated as logical resources rather than persistent truths.
Kuznetsov, N.; Maz'ya, V.; Vainberg, B.
2002-08-01
This book gives a self-contained and up-to-date account of mathematical results in the linear theory of water waves. The study of waves has many applications, including the prediction of behavior of floating bodies (ships, submarines, tension-leg platforms etc.), the calculation of wave-making resistance in naval architecture, and the description of wave patterns over bottom topography in geophysical hydrodynamics. The first section deals with time-harmonic waves. Three linear boundary value problems serve as the approximate mathematical models for these types of water waves. The next section uses a plethora of mathematical techniques in the investigation of these three problems. The techniques used in the book include integral equations based on Green's functions, various inequalities between the kinetic and potential energy and integral identities which are indispensable for proving the uniqueness theorems. The so-called inverse procedure is applied to constructing examples of non-uniqueness, usually referred to as 'trapped modes.'
The International Linear Collider
Directory of Open Access Journals (Sweden)
List Benno
2014-04-01
Full Text Available The International Linear Collider (ILC) is a proposed e+e− linear collider with a centre-of-mass energy of 200–500 GeV, based on superconducting RF cavities. The ILC would be an ideal machine for precision studies of a light Higgs boson and the top quark, and would have a discovery potential for new particles that is complementary to that of LHC. The clean experimental conditions would allow the operation of detectors with extremely good performance; two such detectors, ILD and SiD, are currently being designed. Both make use of novel concepts for tracking and calorimetry. The Japanese High Energy Physics community has recently recommended to build the ILC in Japan.
DEFF Research Database (Denmark)
Høskuldsson, Agnar
1996-01-01
Determination of the proper dimension of a given linear model is one of the most important tasks in applied modeling work. We consider here eight criteria that can be used to determine the dimension of the model, or equivalently, the number of components to use in the model. Four of these criteria are widely used ones, while the remaining four are derived from the H-principle of mathematical modeling. Many examples from practice show that the criteria derived from the H-principle function better than the known and popular criteria for the number of components. We shall briefly review the basic problems in determining the dimension of linear models. Then each of the eight measures is treated. The results are illustrated by examples.
Goldowsky, Michael P. (Inventor)
1987-01-01
A reciprocating linear motor is formed with a pair of ring-shaped permanent magnets having opposite radial polarizations, held axially apart by a nonmagnetic yoke, which serves as an axially displaceable armature assembly. A pair of annularly wound coils having axial lengths which differ from the axial lengths of the permanent magnets are serially coupled together in mutual opposition and positioned with an outer cylindrical core in axial symmetry about the armature assembly. One embodiment includes a second pair of annularly wound coils serially coupled together in mutual opposition and an inner cylindrical core positioned in axial symmetry inside the armature radially opposite to the first pair of coils. Application of a potential difference across a serial connection of the two pairs of coils creates a current flow perpendicular to the magnetic field created by the armature magnets, thereby causing limited linear displacement of the magnets relative to the coils.
International Nuclear Information System (INIS)
Henneaux, Marc; Teitelboim, Claudio
2005-01-01
We show that duality transformations of linearized gravity in four dimensions, i.e., rotations of the linearized Riemann tensor and its dual into each other, can be extended to the dynamical fields of the theory so as to be symmetries of the action and not just symmetries of the equations of motion. Our approach relies on the introduction of two superpotentials, one for the spatial components of the spin-2 field and the other for their canonically conjugate momenta. These superpotentials are two-index, symmetric tensors. They can be taken to be the basic dynamical fields and appear locally in the action. They are simply rotated into each other under duality. In terms of the superpotentials, the canonical generator of duality rotations is found to have a Chern-Simons-like structure, as in the Maxwell case
International Nuclear Information System (INIS)
Phinney, N.
1992-01-01
The SLAC Linear Collider has begun a new era of operation with the SLD detector. During 1991 there was a first engineering run for the SLD in parallel with machine improvements to increase luminosity and reliability. For the 1992 run, a polarized electron source was added and more than 10,000 Zs with an average of 23% polarization have been logged by the SLD. This paper discusses the performance of the SLC in 1991 and 1992 and the technical advances that have produced higher luminosity. Emphasis will be placed on issues relevant to future linear colliders such as producing and maintaining high current, low emittance beams and focusing the beams to the micron scale for collisions. (Author) tab., 2 figs., 18 refs
Linear waves and instabilities
International Nuclear Information System (INIS)
Bers, A.
1975-01-01
The electrodynamic equations for small-amplitude waves and their dispersion relation in a homogeneous plasma are outlined. For such waves, energy and momentum, and their flow and transformation, are described. Perturbation theory of waves is treated and applied to linear coupling of waves, and the resulting instabilities from such interactions between active and passive waves. Linear stability analysis in time and space is described where the time-asymptotic, time-space Green's function for an arbitrary dispersion relation is developed. The perturbation theory of waves is applied to nonlinear coupling, with particular emphasis on pump-driven interactions of waves. Details of the time--space evolution of instabilities due to coupling are given. (U.S.)
Extended linear chain compounds
Linear chain substances span a large cross section of contemporary chemistry ranging from covalent polymers, to organic charge transfer complexes to nonstoichiometric transition metal coordination complexes. Their commonality, which coalesced intense interest in the theoretical and experimental solid state physics/chemistry communities, was based on the observation that these inorganic and organic polymeric substrates exhibit striking metal-like electrical and optical properties. Exploitation and extension of these systems has led to the systematic study of both the chemistry and physics of highly and poorly conducting linear chain substances. To gain a salient understanding of these complex materials rich in anomalous anisotropic electrical, optical, magnetic, and mechanical properties, the convergence of diverse skills and talents was required. The constructive blending of traditionally segregated disciplines such as synthetic and physical organic, inorganic, and polymer chemistry, crystallog...
Diamond, Jared M.
1966-01-01
1. The relation between osmotic gradient and rate of osmotic water flow has been measured in rabbit gall-bladder by a gravimetric procedure and by a rapid method based on streaming potentials. Streaming potentials were directly proportional to gravimetrically measured water fluxes. 2. As in many other tissues, water flow was found to vary with gradient in a markedly non-linear fashion. There was no consistent relation between the water permeability and either the direction or the rate of water flow. 3. Water flow in response to a given gradient decreased at higher osmolarities. The resistance to water flow increased linearly with osmolarity over the range 186-825 m-osM. 4. The resistance to water flow was the same when the gall-bladder separated any two bathing solutions with the same average osmolarity, regardless of the magnitude of the gradient. In other words, the rate of water flow is given by the expression (Om − Os)/[Ro′ + ½k′(Om + Os)], where Ro′ and k′ are constants and Om and Os are the bathing solution osmolarities. 5. Of the theories advanced to explain non-linear osmosis in other tissues, flow-induced membrane deformations, unstirred layers, asymmetrical series-membrane effects, and non-osmotic effects of solutes could not explain the results. However, experimental measurements of water permeability as a function of osmolarity permitted quantitative reconstruction of the observed water flow versus osmotic gradient curves. Hence non-linear osmosis in rabbit gall-bladder is due to a decrease in water permeability with increasing osmolarity. 6. The results suggest that aqueous channels in the cell membrane behave as osmometers, shrinking in concentrated solutions of impermeant molecules and thereby increasing membrane resistance to water flow. A mathematical formulation of such a membrane structure is offered. PMID:5945254
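The flow law in point 4 is easy to check numerically. In this sketch the constants Ro′ and k′ take arbitrary illustrative values (the paper's fitted values are not reproduced here); it shows that two bathing-solution pairs with the same average osmolarity share the same resistance, so flow scales linearly with the gradient:

```python
def water_flow(o_m, o_s, ro=1.0, k=0.01):
    """Flow = (Om - Os) / (Ro' + 0.5 * k' * (Om + Os)): the resistance
    term depends only on the average of the two bathing osmolarities."""
    return (o_m - o_s) / (ro + 0.5 * k * (o_m + o_s))

# Same average osmolarity (500 m-osM), different gradients:
f_wide = water_flow(600.0, 400.0)    # gradient of 200
f_narrow = water_flow(550.0, 450.0)  # gradient of 100
# Identical resistance, so f_wide is exactly twice f_narrow.
```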
Fundamentals of linear algebra
Dash, Rajani Ballav
2008-01-01
FUNDAMENTALS OF LINEAR ALGEBRA is a comprehensive Text Book, which can be used by students and teachers of All Indian Universities. The Text has an easy, understandable form and covers all topics of the UGC Curriculum. There are lots of worked out examples which help the students in solving the problems without anybody's help. The Problem sets have been designed keeping in view the questions asked in different examinations.
Sander, K F
1964-01-01
Linear Network Theory covers the significant algebraic aspect of network theory, with minimal reference to practical circuits. The book begins the presentation of network analysis with the exposition of networks containing resistances only, and follows it up with a discussion of networks involving inductance and capacity by way of the differential equations. Classification and description of certain networks, equivalent networks, filter circuits, and network functions are also covered. Electrical engineers, technicians, electronics engineers, electricians, and students learning the intricacies
Non linear viscoelastic models
DEFF Research Database (Denmark)
Agerkvist, Finn T.
2011-01-01
Viscoelastic effects are often present in loudspeaker suspensions; this can be seen in the displacement transfer function, which often shows a frequency dependent value below the resonance frequency. In this paper nonlinear versions of the standard linear solid model (SLS) are investigated. The simulations show that the nonlinear version of the Maxwell SLS model can result in a time dependent small-signal stiffness while the Kelvin-Voigt version does not.
Relativistic Linear Restoring Force
Clark, D.; Franklin, J.; Mann, N.
2012-01-01
We consider two different forms for a relativistic version of a linear restoring force. The pair comes from taking Hooke's law to be the force appearing on the right-hand side of the relativistic expressions dp/dt or dp/dτ. Either formulation recovers Hooke's law in the non-relativistic limit. In addition to these two forces, we…
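Written out, the two candidate forms are (a reconstruction from the abstract's notation, with p the relativistic momentum and τ the proper time):

```latex
% Hooke's law placed on the right-hand side of either relativistic
% equation of motion, for momentum p = \gamma m v:
\frac{\mathrm{d}p}{\mathrm{d}t} = -kx
\qquad \text{or} \qquad
\frac{\mathrm{d}p}{\mathrm{d}\tau} = -kx .
% In the non-relativistic limit v \ll c we have \gamma \to 1 and
% \mathrm{d}\tau \to \mathrm{d}t, so both reduce to m\ddot{x} = -kx.
```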
Superconducting linear colliders
International Nuclear Information System (INIS)
Anon.
1990-01-01
The advantages of superconducting radiofrequency (SRF) for particle accelerators have been demonstrated by successful operation of systems in the TRISTAN and LEP electron-positron collider rings respectively at the Japanese KEK Laboratory and at CERN. If performance continues to improve and costs can be lowered, this would open an attractive option for a high luminosity TeV (1000 GeV) linear collider
Perturbed asymptotically linear problems
Bartolo, R.; Candela, A. M.; Salvatore, A.
2012-01-01
The aim of this paper is investigating the existence of solutions of some semilinear elliptic problems on open bounded domains when the nonlinearity is subcritical and asymptotically linear at infinity and there is a perturbation term which is just continuous. Also in the case when the problem does not have a variational structure, suitable procedures and estimates allow us to prove that the number of distinct critical levels of the functional associated to the unperturbed problem is "stable" unde...
Miniature linear cooler development
International Nuclear Information System (INIS)
Pruitt, G.R.
1993-01-01
An overview is presented of the status of a family of miniature linear coolers currently under development by Hughes Aircraft Co. for use in hand held, volume limited or power limited infrared applications. These coolers, representing the latest additions to the Hughes family of TOP trademark [twin-opposed piston] linear coolers, have been fabricated and tested in three different configurations. Each configuration is designed to utilize a common compressor assembly resulting in reduced manufacturing costs. The baseline compressor has been integrated with two different expander configurations and has been operated with two different levels of input power. These various configuration combinations offer a wide range of performance and interface characteristics which may be tailored to applications requiring limited power and size without significantly compromising cooler capacity or cooldown characteristics. Key cooler characteristics and test data are summarized for three combinations of cooler configurations which are representative of the versatility of this linear cooler design. Configurations reviewed include the shortened coldfinger [1.50 to 1.75 inches long], limited input power [less than 17 Watts] for low power availability applications; the shortened coldfinger with higher input power for lightweight, higher performance applications; and coldfingers compatible with DoD 0.4 Watt Common Module coolers for wider range retrofit capability. Typical weight of these miniature linear coolers is less than 500 grams for the compressor, expander and interconnecting transfer line. Cooling capacity at 80K at room ambient conditions ranges from 400 mW to greater than 550 mW. Steady state power requirements for maintaining a heat load of 150 mW at 80K has been shown to be less than 8 Watts. Ongoing reliability growth testing is summarized including a review of the latest test article results
Directory of Open Access Journals (Sweden)
Avram Mihai
2017-01-01
Full Text Available The paper presents a linear pneumatic actuator with short working stroke. It consists of a pneumatic motor (a simple stroke cylinder or a membrane chamber), two 2/2 pneumatic distributors “all or nothing” electrically commanded for controlling the intake/outtake flow to/from the active chamber of the motor, a position transducer and a microcontroller. There is also presented the theoretical analysis (mathematical modelling and numerical simulation) accomplished.
Avram Mihai; Niţu Constantin; Bucşan Constantin; Grămescu Bogdan
2017-01-01
The paper presents a linear pneumatic actuator with short working stroke. It consists of a pneumatic motor (a simple stroke cylinder or a membrane chamber), two 2/2 pneumatic distributors “all or nothing” electrically commanded for controlling the intake/outtake flow to/from the active chamber of the motor, a position transducer and a microcontroller. There is also presented the theoretical analysis (mathematical modelling and numerical simulation) accomplished.
International Nuclear Information System (INIS)
Scheffel, J.
1984-03-01
The linear Grad-Shafranov equation for a toroidal, axisymmetric plasma is solved analytically. Exact solutions are given in terms of confluent hyper-geometric functions. As an alternative, simple and accurate WKBJ solutions are presented. With parabolic pressure profiles, both hollow and peaked toroidal current density profiles are obtained. As an example the equilibrium of a z-pinch with a square-shaped cross section is derived. (author)
Buttram, M.T.; Ginn, J.W.
1988-06-21
A linear induction accelerator includes a plurality of adder cavities arranged in a series and provided in a structure which is evacuated so that a vacuum inductance is provided between each adder cavity and the structure. An energy storage system for the adder cavities includes a pulsed current source and a respective plurality of bipolar converting networks connected thereto. The bipolar high-voltage, high-repetition-rate square pulse train sets and resets the cavities. 4 figs.
Empirical reality, empirical causality, and the measurement problem
International Nuclear Information System (INIS)
d'Espagnat, B.
1987-01-01
Does physics describe anything that can meaningfully be called independent reality, or is it merely operational? Most physicists implicitly favor an intermediate standpoint, which takes quantum physics into account, but which nevertheless strongly holds fast to quite strictly realistic ideas about apparently obvious facts concerning the macro-objects. Part 1 of this article, which is a survey of recent measurement theories, shows that, when made explicit, the standpoint in question cannot be upheld. Part 2 brings forward a proposal for making minimal changes to this standpoint in such a way as to remove such objections. The empirical reality thus constructed is a notion that, to some extent, does ultimately refer to the human means of apprehension and of data processing. It nevertheless cannot be said that it reduces to a mere name just labelling a set of recipes that never fail. It is shown that the usual notion of macroscopic causality must be endowed with similar features
Springer, T A
1998-01-01
"[The first] ten chapters...are an efficient, accessible, and self-contained introduction to affine algebraic groups over an algebraically closed field. The author includes exercises and the book is certainly usable by graduate students as a text or for self-study...the author [has a] student-friendly style… [The following] seven chapters... would also be a good introduction to rationality issues for algebraic groups. A number of results from the literature…appear for the first time in a text." –Mathematical Reviews (Review of the Second Edition) "This book is a completely new version of the first edition. The aim of the old book was to present the theory of linear algebraic groups over an algebraically closed field. Reading that book, many people entered the research field of linear algebraic groups. The present book has a wider scope. Its aim is to treat the theory of linear algebraic groups over arbitrary fields. Again, the author keeps the treatment of prerequisites self-contained. The material of t...
Parametric Linear Dynamic Logic
Directory of Open Access Journals (Sweden)
Peter Faymonville
2014-08-01
Full Text Available We introduce Parametric Linear Dynamic Logic (PLDL), which extends Linear Dynamic Logic (LDL) by temporal operators equipped with parameters that bound their scope. LDL was proposed as an extension of Linear Temporal Logic (LTL) that is able to express all ω-regular specifications while still maintaining many of LTL's desirable properties like an intuitive syntax and a translation into non-deterministic Büchi automata of exponential size. But LDL lacks capabilities to express timing constraints. By adding parameterized operators to LDL, we obtain a logic that is able to express all ω-regular properties and that subsumes parameterized extensions of LTL like Parametric LTL and PROMPT-LTL. Our main technical contribution is a translation of PLDL formulas into non-deterministic Büchi word automata of exponential size via alternating automata. This yields a PSPACE model checking algorithm and a realizability algorithm with doubly-exponential running time. Furthermore, we give tight upper and lower bounds on optimal parameter values for both problems. These results show that PLDL model checking and realizability are not harder than LTL model checking and realizability.
Quantum linear Boltzmann equation
International Nuclear Information System (INIS)
Vacchini, Bassano; Hornberger, Klaus
2009-01-01
We review the quantum version of the linear Boltzmann equation, which describes in a non-perturbative fashion, by means of scattering theory, how the quantum motion of a single test particle is affected by collisions with an ideal background gas. A heuristic derivation of this Lindblad master equation is presented, based on the requirement of translation-covariance and on the relation to the classical linear Boltzmann equation. After analyzing its general symmetry properties and the associated relaxation dynamics, we discuss a quantum Monte Carlo method for its numerical solution. We then review important limiting forms of the quantum linear Boltzmann equation, such as the case of quantum Brownian motion and pure collisional decoherence, as well as the application to matter wave optics. Finally, we point to the incorporation of quantum degeneracies and self-interactions in the gas by relating the equation to the dynamic structure factor of the ambient medium, and we provide an extension of the equation to include internal degrees of freedom.
International Nuclear Information System (INIS)
Emma, P.
1995-01-01
The Stanford Linear Collider (SLC) is the first and only high-energy e+e− linear collider in the world. Its most remarkable features are high intensity, submicron sized, polarized (e−) beams at a single interaction point. The main challenges posed by these unique characteristics include machine-wide emittance preservation, consistent high intensity operation, polarized electron production and transport, and the achievement of a high degree of beam stability on all time scales. In addition to serving as an important machine for the study of Z⁰ boson production and decay using polarized beams, the SLC is also an indispensable source of hands-on experience for future linear colliders. Each new year of operation has been highlighted with a marked improvement in performance. The most significant improvements for the 1994-95 run include new low impedance vacuum chambers for the damping rings, an upgrade to the optics and diagnostics of the final focus systems, and a higher degree of polarization from the electron source. As a result, the average luminosity has nearly doubled over the previous year with peaks approaching 10³⁰ cm⁻² s⁻¹ and an 80% electron polarization at the interaction point. These developments as well as the remaining identifiable performance limitations will be discussed
Inconsistency of Bayesian Inference for Misspecified Linear Models, and a Proposal for Repairing It
Grünwald, P.; van Ommen, T.
2017-01-01
We empirically show that Bayesian inference can be inconsistent under misspecification in simple linear regression problems, both in a model averaging/selection and in a Bayesian ridge regression setting. We use the standard linear model, which assumes homoskedasticity, whereas the data are
DEFF Research Database (Denmark)
Dlugosz, Stephan; Mammen, Enno; Wilke, Ralf
We consider the semiparametric generalised linear regression model which has mainstream empirical models such as the (partially) linear mean regression, logistic and multinomial regression as special cases. As an extension to related literature we allow a misclassified covariate to be interacted...
Inconsistency of Bayesian inference for misspecified linear models, and a proposal for repairing it
P.D. Grünwald (Peter); T. van Ommen (Thijs)
2017-01-01
We empirically show that Bayesian inference can be inconsistent under misspecification in simple linear regression problems, both in a model averaging/selection and in a Bayesian ridge regression setting. We use the standard linear model, which assumes homoskedasticity, whereas the data
Simulation of a medical linear accelerator for teaching purposes.
Anderson, Rhys; Lamey, Michael; MacPherson, Miller; Carlone, Marco
2015-05-08
Simulation software for medical linear accelerators that can be used in a teaching environment was developed. The components of linear accelerators were modeled to first order accuracy using analytical expressions taken from the literature. The expressions used constants that were empirically set such that realistic response could be expected. These expressions were programmed in a MATLAB environment with a graphical user interface in order to produce an environment similar to that of linear accelerator service mode. The program was evaluated in a systematic fashion, where parameters affecting the clinical properties of medical linear accelerator beams were adjusted independently, and the effects on beam energy and dose rate recorded. These results confirmed that beam tuning adjustments could be simulated in a simple environment. Further, adjustment of service parameters over a large range was possible, and this allows the demonstration of linear accelerator physics in an environment accessible to both medical physicists and linear accelerator service engineers. In conclusion, a software tool, named SIMAC, was developed to improve the teaching of linear accelerator physics in a simulated environment. SIMAC performed in a similar manner to medical linear accelerators. The authors hope that this tool will be valuable as a teaching tool for medical physicists and linear accelerator service engineers.
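As a rough sketch of this kind of first-order modeling (the function names, parameters, and constants below are illustrative assumptions, not SIMAC's actual expressions), beam energy can be written as rising with RF power and falling with beam loading, and dose rate as proportional to beam current and energy:

```python
def beam_energy(gun_current_ma, rf_power_mw, k_rf=3.0, k_load=0.004):
    """First-order beam energy (MeV): RF power raises it, beam loading
    (proportional to gun current) lowers it. Constants are illustrative,
    chosen only so the response moves in a realistic direction."""
    return k_rf * rf_power_mw - k_load * gun_current_ma

def dose_rate(gun_current_ma, energy_mev, k_dose=0.02):
    """First-order dose rate: proportional to beam current and energy."""
    return k_dose * gun_current_ma * energy_mev

e_nominal = beam_energy(100.0, 2.5)  # nominal tune
e_loaded = beam_energy(300.0, 2.5)   # raising current increases loading,
                                     # so the beam energy drops
d_nominal = dose_rate(100.0, e_nominal)
```

Adjusting one service parameter at a time and recording energy and dose rate, as in the paper's evaluation, then amounts to sweeping one argument of these functions.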
Dib, Josef; Mongongu, Cynthia; Buisson, Corinne; Molina, Adeline; Schänzer, Wilhelm; Thuss, Uwe; Thevis, Mario
2017-01-01
The development of new therapeutics potentially exhibiting performance-enhancing properties implicates the risk of their misuse by athletes in amateur and elite sports. Such drugs necessitate preventive anti-doping research for consideration in sports drug testing programmes. Hypoxia-inducible factor (HIF) stabilizers represent an emerging class of therapeutics that allows for increasing erythropoiesis in patients. BAY 85-3934 is a novel HIF stabilizer, which is currently undergoing phase-2 clinical trials. Consequently, the comprehensive characterization of BAY 85-3934 and human urinary metabolites as well as the implementation of these analytes into routine doping controls is of great importance. The mass spectrometric behaviour of the HIF stabilizer drug candidate BAY 85-3934 and a glucuronidated metabolite (BAY-348) were characterized by electrospray ionization-(tandem) mass spectrometry (ESI-MS(/MS)) and multiple-stage mass spectrometry (MSⁿ). Subsequently, two different laboratories established different analytical approaches (one each) enabling urine sample analyses by employing either direct urine injection or solid-phase extraction. The methods were cross-validated for the metabolite BAY-348, which is expected to represent an appropriate target analyte for human urine analysis. Two test methods allowing for the detection of BAY-348 in human urine were applied and cross-validated concerning the validation parameters specificity, linearity, lower limit of detection (LLOD; 1-5 ng/mL), ion suppression/enhancement (up to 78%), intra- and inter-day precision (3-21%), recovery (29-48%), and carryover. By means of ten spiked test urine samples sent blinded to one of the participating laboratories, the fitness-for-purpose of both assays was demonstrated, as all specimens were correctly identified by both testing methods. As no post-administration study samples were available, analyses of authentic urine specimens remain desirable.
International Nuclear Information System (INIS)
Gavrilas, M.; Munno, F.J.
1984-01-01
The stable element composition of the American oyster Crassostrea virginica collected between June 1978 and August 1983 in the Chesapeake Bay in the vicinity of Calvert Cliffs Nuclear Power Plant was analyzed by neutron activation. The minimum, maximum and the mean values of the elemental concentrations are given. The seasonal effect and the linear correlation between elements entering the oyster composition are shown. 7 references, 1 figure, 4 tables
Combining Empirical and Stochastic Models for Extreme Floods Estimation
Zemzami, M.; Benaabidate, L.
2013-12-01
Hydrological models can be defined as physical, mathematical or empirical. The latter class uses mathematical equations independent of the physical processes involved in the hydrological system. Linear regression and Gradex (Gradient of Extreme values) are classic examples of empirical models. However, conventional empirical models are still used as tools for hydrological analysis by probabilistic approaches. In many regions of the world, watersheds are not gauged. This is true even in developed countries, where the gauging network has continued to decline as a result of the lack of human and financial resources. Indeed, the obvious lack of data in these watersheds makes it impossible to apply some basic empirical models for daily forecasting. A combination of rainfall-runoff models therefore had to be found in which it would be possible to create our own data and use them to estimate the flow. The estimated design floods illustrate well the difficulties facing the hydrologist in constructing a standard empirical model in basins where hydrological information is rare. The construction of the climate-hydrological model, which is based on frequency analysis, was established to estimate the design flood in the Anseghmir catchments, Morocco. This complex model was chosen for its ability to be applied in watersheds where hydrological information is not sufficient. The method was found to be a powerful tool for estimating the design flood of the watershed as well as other hydrological elements (runoff, water volumes, ...). The hydrographic characteristics and climatic parameters were used to estimate the runoff, water volumes and design flood for different return periods.
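A classic empirical flood-frequency calculation of the kind referred to above — a Gumbel (EV1) distribution fitted to annual maxima, then a design-flood quantile for a chosen return period — can be sketched as follows. The discharge series is illustrative, not data from the Anseghmir catchments.

```python
import math
import statistics

def gumbel_fit(annual_maxima):
    """Method-of-moments fit of a Gumbel (EV1) distribution, as used in
    classical empirical flood-frequency analysis."""
    mean = statistics.mean(annual_maxima)
    std = statistics.stdev(annual_maxima)
    beta = std * math.sqrt(6.0) / math.pi   # scale parameter
    u = mean - 0.5772 * beta                # location (Euler-Mascheroni constant)
    return u, beta

def design_flood(u, beta, return_period):
    """Discharge whose annual exceedance probability is 1/return_period."""
    p = 1.0 - 1.0 / return_period
    return u - beta * math.log(-math.log(p))

# Illustrative annual maximum discharges (m^3/s):
amax = [42.0, 55.0, 38.0, 71.0, 49.0, 90.0, 61.0, 47.0, 84.0, 58.0]
u, beta = gumbel_fit(amax)
for t in (10, 50, 100):
    print(f"T={t:>3} yr: Q={design_flood(u, beta, t):.0f} m^3/s")
```

The design flood grows with the return period, as expected for an extreme-value quantile; with longer records, maximum-likelihood or L-moment fits are usually preferred over the moment fit sketched here.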
Probabilistic empirical prediction of seasonal climate: evaluation and potential applications
Dieppois, B.; Eden, J.; van Oldenborgh, G. J.
2017-12-01
Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a new evaluation of an established empirical system used to predict seasonal climate across the globe. Forecasts for surface air temperature, precipitation and sea level pressure are produced by the KNMI Probabilistic Empirical Prediction (K-PREP) system every month and disseminated via the KNMI Climate Explorer (climexp.knmi.nl). K-PREP is based on multiple linear regression and built on physical principles to the fullest extent with predictive information taken from the global CO2-equivalent concentration, large-scale modes of variability in the climate system and regional-scale information. K-PREP seasonal forecasts for the period 1981-2016 will be compared with corresponding dynamically generated forecasts produced by operational forecast systems. While there are many regions of the world where empirical forecast skill is extremely limited, several areas are identified where K-PREP offers comparable skill to dynamical systems. We discuss two key points in the future development and application of the K-PREP system: (a) the potential for K-PREP to provide a more useful basis for reference forecasts than those based on persistence or climatology, and (b) the added value of including K-PREP forecast information in multi-model forecast products, at least for known regions of good skill. We also discuss the potential development of
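A toy version of a multiple-linear-regression seasonal forecast in the spirit of K-PREP (predictors: a CO2-equivalent trend plus one mode-of-variability index) might look like the sketch below. The predictor values, coefficients, and noise level are all invented for the sketch; K-PREP's actual predictor set and fitting details are not reproduced here.

```python
import random

def solve3(a, b):
    """Gaussian elimination with partial pivoting for a 3x3 system
    (enough for an intercept plus two predictors)."""
    m = [row[:] + [bv] for row, bv in zip(a, b)]
    n = 3
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(m[r][i]))
        m[i], m[p] = m[p], m[i]
        for r in range(i + 1, n):
            f = m[r][i] / m[i][i]
            for c in range(i, n + 1):
                m[r][c] -= f * m[i][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (m[i][n] - sum(m[i][c] * x[c] for c in range(i + 1, n))) / m[i][i]
    return x

def fit_mlr(rows, y):
    """Ordinary least squares via normal equations X'X b = X'y."""
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    xty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(3)]
    return solve3(xtx, xty)

random.seed(1)
# Synthetic "seasonal anomaly" built from a CO2-equivalent trend plus an
# ENSO-like index; every number here is invented for illustration.
rows, y = [], []
for yr in range(1981, 2017):
    co2 = 340.0 + 2.0 * (yr - 1981)       # illustrative CO2-eq, ppm
    enso = random.gauss(0.0, 1.0)         # illustrative Nino3.4-like index
    anom = -6.0 + 0.02 * co2 + 0.4 * enso + random.gauss(0.0, 0.1)
    rows.append([1.0, co2, enso])
    y.append(anom)

b0, b_co2, b_enso = fit_mlr(rows, y)
print(round(b_co2, 3), round(b_enso, 2))  # near the true 0.02 and 0.4
```

Training on 1981-2016 and predicting the next season from the latest predictor values is then a single dot product with the fitted coefficients.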
Bird surveys at McKinley Bay and Hutchinson Bay, Northwest Territories, in 1990
Energy Technology Data Exchange (ETDEWEB)
Cornish, B J; Dickson, D L; Dickson, H L
1991-03-01
Monitoring surveys of bird abundance and distribution were conducted in 1990 at McKinley Bay in the Northwest Territories, the site of a winter harbour for drillships and the proposed location for a major year-round support base for oil and gas exploration. Primary objectives of the survey were to determine whether diving duck numbers had changed since the initial phase of the study from 1981-1985, and to provide additional baseline data on natural annual fluctuations in diving duck numbers. Three aerial surveys at each bay were carried out using techniques identical to those of previous years. On 5 August 1990, when survey conditions were considered the best of the three surveys, more than twice as many diving ducks were found in McKinley Bay and Hutchinson Bay as on average during the five years 1981-1985. Oldsquaw and scoters comprised ca 90% of the diving ducks observed, and both species showed significant increases in numbers. The increase in abundance of diving ducks was likely unrelated to industrial activity in the area, since a similar increase occurred in the control area, Hutchinson Bay. Many factors, including environmental factors such as those affecting nesting success and timing of the moult, as well as factors related to the survey methods, could be involved in causing the large fluctuations observed. 9 refs., 8 figs., 10 tabs.
2011-02-18
..., as an Addition to the Bay Mills Indian Reservation for the Bay Mills Indian Community of Michigan..., more or less, to be added to the Bay Mills Indian Reservation for the Bay Mills Indian Community of... the land described below. The land was proclaimed to be an addition to the Bay Mills Indian...
New and Improved Results from Daya Bay
CERN. Geneva
2017-01-01
Despite the great progress achieved in the last decades, neutrinos remain among the least understood fundamental particles to have been experimentally observed. The Daya Bay Reactor Neutrino Experiment consists of eight identically designed detectors placed underground at different baselines from three groups of nuclear reactors in China, a configuration that is ideally suited for studying the properties of these elusive particles. In this talk I will review the improved results released last summer by the Daya Bay collaboration. These results include (i) a precision measurement of the θ13 mixing angle and the effective mass splitting in the electron antineutrino disappearance channel with a dataset comprising more than 2.5 million antineutrino interactions, (ii) a high-statistics measurement of the absolute flux and spectrum of reactor-produced electron antineutrinos, and (iii) a search for light sterile neutrino mixing performed with more than three times the statistics of the previous result. I w...
The Bay of Pigs: Revisiting Two Museums
Directory of Open Access Journals (Sweden)
Peter Read
2007-08-01
The Museum of Playa Giron (the Bay of Pigs), in the region of Cienega De Zapata, Cuba, celebrates the repulse of Brigade 2506 as the first reverse of US imperialism on the American continents. The equivalent Brigade 2506 Museum in Miami, dedicated to and maintained by the members of Brigade 2506, celebrates defeat at the Bay of Pigs as a moral victory for the Cuban exiles. The forces were indeed implacable foes. Yet between the museums can be detected some curious similarities. Both present the common theme of the confrontation between forces of good and evil. Both celebrate the philosophy that dying for one's country is the greatest good a citizen may achieve. Both museums fly the common Cuban flag. Both museums identify a common enemy: the United States of America. This article, by comparing the displays in the two museums, analyses some cultural elements of what, despite decades of separation, in some ways remains a common Cuban culture.
Thatcher Bay, Washington, Nearshore Restoration Assessment
Breems, Joel; Wyllie-Echeverria, Sandy; Grossman, Eric E.; Elliott, Joel
2009-01-01
The San Juan Archipelago, located at the confluence of Puget Sound, the Strait of Juan de Fuca in Washington State, and the Strait of Georgia, British Columbia, Canada, provides essential nearshore habitat for diverse salmonid, forage fish, and bird populations. With 408 miles of coastline, the San Juan Islands provide a significant portion of the available nearshore habitat for the greater Puget Sound and are an essential part of the regional efforts to restore Puget Sound (Puget Sound Shared Strategy 2005). The nearshore areas of the San Juan Islands provide a critical link between the terrestrial and marine environments. For this reason the focus on restoration and conservation of nearshore habitat in the San Juan Islands is of paramount importance. Wood-waste was a common by-product of historical lumber-milling operations. To date, relatively little attention has been given to the impact of historical lumber-milling operations in the San Juan Archipelago. Thatcher Bay, on Blakely Island, located near the east edge of the archipelago, is presented here as a case study on the restoration potential for a wood-waste contaminated nearshore area. Case study components include (1) a brief discussion of the history of milling operations, (2) an estimate of the location and amount of the current distribution of wood-waste at the site, (3) a preliminary examination of the impacts of wood-waste on benthic flora and fauna at the site, and (4) the presentation of several restoration alternatives for the site. The history of milling activity in Thatcher Bay began in 1879 with the construction of a mill in the southeastern part of the bay. Milling activity continued for more than 60 years, until the mill closed in 1942. Currently, the primary evidence of the historical milling operations is the presence of approximately 5,000 yd³ of wood-waste contaminated sediments. The distribution and thickness of residual wood-waste at the site was determined by using sediment
Expert opinion vs. empirical evidence
Herman, Rod A; Raybould, Alan
2014-01-01
Expert opinion is often sought by government regulatory agencies when there is insufficient empirical evidence to judge the safety implications of a course of action. However, it can be reckless to continue following expert opinion when a preponderance of evidence is amassed that conflicts with this opinion. Factual evidence should always trump opinion in prioritizing the information that is used to guide regulatory policy. Evidence-based medicine has seen a dramatic upturn in recent years, spurred by examples where evidence indicated that certain treatments recommended by expert opinion increased death rates. We suggest that scientific evidence should also take priority over expert opinion in the regulation of genetically modified (GM) crops. Examples of regulatory data requirements that are not justified by the mass of evidence are described, and it is suggested that expertise in risk assessment should guide evidence-based regulation of GM crops. PMID:24637724
International Nuclear Information System (INIS)
Garbet, X.; Mourgues, F.; Samain, A.
1987-01-01
Among the various instabilities which could explain the anomalous electron heat transport observed in tokamaks during additional heating, microtearing turbulence is a reasonable candidate since it directly affects the magnetic topology. This turbulence may be described, in a proper frame rotating around the major axis, by a static vector potential. In strongly non-linear regimes, the flow of electrons along the stochastic field lines induces a current. The point is to know whether this current can sustain the turbulence. The mechanisms of this self-consistency, involving the combined effects of thermal diamagnetism and of the electric drift, are presented here.
Wangler, Thomas P
2008-01-01
Thomas P. Wangler received his B.S. degree in physics from Michigan State University, and his Ph.D. degree in physics and astronomy from the University of Wisconsin. After postdoctoral appointments at the University of Wisconsin and Brookhaven National Laboratory, he joined the staff of Argonne National Laboratory in 1966, working in the fields of experimental high-energy physics and accelerator physics. He joined the Accelerator Technology Division at Los Alamos National Laboratory in 1979, where he specialized in high-current beam physics and linear accelerator design and technology. In 2007
International Nuclear Information System (INIS)
Richter, B.; Bell, R.A.; Brown, K.L.
1980-06-01
The SLAC Linear Collider is designed to achieve an energy of 100 GeV in the electron-positron center-of-mass system by accelerating intense bunches of particles in the SLAC linac and transporting the electron and positron bunches in a special magnet system to a point where they are focused to a radius of about 2 microns and made to collide head on. The rationale for this new type of colliding beam system is discussed, the project is described, some of the novel accelerator physics issues involved are discussed, and some of the critical technical components are described.
Lopez, Cesar
2014-01-01
MATLAB is a high-level language and environment for numerical computation, visualization, and programming. Using MATLAB, you can analyze data, develop algorithms, and create models and applications. The language, tools, and built-in math functions enable you to explore multiple approaches and reach a solution faster than with spreadsheets or traditional programming languages, such as C/C++ or Java. MATLAB Linear Algebra introduces you to the MATLAB language with practical hands-on instructions and results, allowing you to quickly achieve your goals. In addition to giving an introduction to
2010-07-01
... 33 Navigation and Navigable Waters 2 2010-07-01 2010-07-01 false Safety/Security Zone: San... Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) PORTS AND WATERWAYS SAFETY... Areas Eleventh Coast Guard District § 165.1182 Safety/Security Zone: San Francisco Bay, San Pablo Bay...
Cosmogenic neutron production at Daya Bay
An, F. P.; Balantekin, A. B.; Band, H. R.; Bishai, M.; Blyth, S.; Cao, D.; Cao, G. F.; Cao, J.; Chan, Y. L.; Chang, J. F.; Chang, Y.; Chen, H. S.; Chen, S. M.; Chen, Y.; Chen, Y. X.; Cheng, J.; Cheng, Z. K.; Cherwinka, J. J.; Chu, M. C.; Chukanov, A.; Cummings, J. P.; Ding, Y. Y.; Diwan, M. V.; Dolgareva, M.; Dove, J.; Dwyer, D. A.; Edwards, W. R.; Gill, R.; Gonchar, M.; Gong, G. H.; Gong, H.; Grassi, M.; Gu, W. Q.; Guo, L.; Guo, X. H.; Guo, Y. H.; Guo, Z.; Hackenburg, R. W.; Hans, S.; He, M.; Heeger, K. M.; Heng, Y. K.; Higuera, A.; Hsiung, Y. B.; Hu, B. Z.; Hu, T.; Huang, H. X.; Huang, X. T.; Huang, Y. B.; Huber, P.; Huo, W.; Hussain, G.; Jaffe, D. E.; Jen, K. L.; Ji, X. L.; Ji, X. P.; Jiao, J. B.; Johnson, R. A.; Jones, D.; Kang, L.; Kettell, S. H.; Khan, A.; Koerner, L. W.; Kohn, S.; Kramer, M.; Kwok, M. W.; Langford, T. J.; Lau, K.; Lebanowski, L.; Lee, J.; Lee, J. H. C.; Lei, R. T.; Leitner, R.; Leung, J. K. C.; Li, C.; Li, D. J.; Li, F.; Li, G. S.; Li, Q. J.; Li, S.; Li, S. C.; Li, W. D.; Li, X. N.; Li, X. Q.; Li, Y. F.; Li, Z. B.; Liang, H.; Lin, C. J.; Lin, G. L.; Lin, S.; Lin, S. K.; Lin, Y.-C.; Ling, J. J.; Link, J. M.; Littenberg, L.; Littlejohn, B. R.; Liu, J. C.; Liu, J. L.; Loh, C. W.; Lu, C.; Lu, H. Q.; Lu, J. S.; Luk, K. B.; Ma, X. B.; Ma, X. Y.; Ma, Y. Q.; Malyshkin, Y.; Martinez Caicedo, D. A.; McDonald, K. T.; McKeown, R. D.; Mitchell, I.; Nakajima, Y.; Napolitano, J.; Naumov, D.; Naumova, E.; Ochoa-Ricoux, J. P.; Olshevskiy, A.; Pan, H.-R.; Park, J.; Patton, S.; Pec, V.; Peng, J. C.; Pinsky, L.; Pun, C. S. J.; Qi, F. Z.; Qi, M.; Qian, X.; Qiu, R. M.; Raper, N.; Ren, J.; Rosero, R.; Roskovec, B.; Ruan, X. C.; Steiner, H.; Sun, J. L.; Tang, W.; Taychenachev, D.; Treskov, K.; Tsang, K. V.; Tse, W.-H.; Tull, C. E.; Viaux, N.; Viren, B.; Vorobel, V.; Wang, C. H.; Wang, M.; Wang, N. Y.; Wang, R. G.; Wang, W.; Wang, X.; Wang, Y. F.; Wang, Z.; Wang, Z.; Wang, Z. M.; Wei, H. Y.; Wen, L. J.; Whisnant, K.; White, C. G.; Wise, T.; Wong, H. L. H.; Wong, S. 
C. F.; Worcester, E.; Wu, C.-H.; Wu, Q.; Wu, W. J.; Xia, D. M.; Xia, J. K.; Xing, Z. Z.; Xu, J. L.; Xu, Y.; Xue, T.; Yang, C. G.; Yang, H.; Yang, L.; Yang, M. S.; Yang, M. T.; Yang, Y. Z.; Ye, M.; Ye, Z.; Yeh, M.; Young, B. L.; Yu, Z. Y.; Zeng, S.; Zhan, L.; Zhang, C.; Zhang, C. C.; Zhang, H. H.; Zhang, J. W.; Zhang, Q. M.; Zhang, R.; Zhang, X. T.; Zhang, Y. M.; Zhang, Y. M.; Zhang, Y. X.; Zhang, Z. J.; Zhang, Z. P.; Zhang, Z. Y.; Zhao, J.; Zhou, L.; Zhuang, H. L.; Zou, J. H.; Daya Bay Collaboration
2018-03-01
Neutrons produced by cosmic ray muons are an important background for underground experiments studying neutrino oscillations, neutrinoless double beta decay, dark matter, and other rare-event signals. A measurement of the neutron yield in the three different experimental halls of the Daya Bay Reactor Neutrino Experiment at varying depth is reported. The neutron yield in Daya Bay's liquid scintillator is measured to be Y_n = (10.26 ± 0.86)×10⁻⁵, (10.22 ± 0.87)×10⁻⁵, and (17.03 ± 1.22)×10⁻⁵ μ⁻¹ g⁻¹ cm² at depths of 250, 265, and 860 meters-water-equivalent. These results are compared to other measurements and to the simulated neutron yield in FLUKA and Geant4. A global fit including the Daya Bay measurements yields a power law coefficient of 0.77 ± 0.03 for the dependence of the neutron yield on muon energy.
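The power-law dependence mentioned at the end (neutron yield versus mean muon energy) reduces to a straight-line fit in log-log space. The sketch below uses invented muon energies and a noise-free yield curve, not the Daya Bay data points:

```python
import math

# Fit Y_n = a * E_mu^b by least squares in log-log space.
# Energies and yields below are illustrative, not the measured values.
e_mu = [60.0, 64.0, 143.0]                      # GeV, illustrative
a_true, b_true = 4.0e-6, 0.77
yields = [a_true * e ** b_true for e in e_mu]   # noise-free for the sketch

lx = [math.log(e) for e in e_mu]
ly = [math.log(y) for y in yields]
mx = sum(lx) / len(lx)
my = sum(ly) / len(ly)
# slope of ln(Y) vs ln(E) is the power-law coefficient b
b = sum((x - mx) * (y - my) for x, y in zip(lx, ly)) / sum((x - mx) ** 2 for x in lx)
a = math.exp(my - b * mx)
print(round(b, 2))  # recovers the power-law coefficient
```

With real data each point would carry an uncertainty, so a weighted fit (or a likelihood fit) would replace the plain least squares shown here.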
Daya Bay Antineutrino Detector gas system
Band, H. R.; Cherwinka, J. J.; Chu, M.-C.; Heeger, K. M.; Kwok, M. W.; Shih, K.; Wise, T.; Xiao, Q.
2012-11-01
The Daya Bay Antineutrino Detector gas system is designed to protect the liquid scintillator targets of the antineutrino detectors against degradation and contamination from exposure to ambient laboratory air. The gas system is also used to monitor the leak tightness of the antineutrino detector assembly. The cover gas system constantly flushes the gas volumes above the liquid scintillator with dry nitrogen to minimize oxidation of the scintillator over the five year lifetime of the experiment. This constant flush also prevents the infiltration of radon or other contaminants into these detecting liquids keeping the internal backgrounds low. Since the Daya Bay antineutrino detectors are immersed in the large water pools of the muon veto system, other gas volumes are needed to protect vital detector cables or gas lines. These volumes are also purged with dry gas. Return gas is monitored for oxygen content and humidity to provide early warning of potentially damaging leaks. The design and performance of the Daya Bay Antineutrino Detector gas system is described.
An overview of San Francisco Bay PORTS
Cheng, Ralph T.; McKinnie, David; English, Chad; Smith, Richard E.
1998-01-01
The Physical Oceanographic Real-Time System (PORTS) provides observations of tides, tidal currents, and meteorological conditions in real time. The San Francisco Bay PORTS (SFPORTS) is a decision support system to facilitate safe and efficient maritime commerce. In addition to real-time observations, SFPORTS includes a nowcast numerical model forming a San Francisco Bay marine nowcast system. SFPORTS data and nowcast numerical model results are made available to users through the World Wide Web (WWW). A brief overview of SFPORTS is presented, from the data flow originating at instrument sensors to the final results delivered to end users on the WWW. A user-friendly interface for SFPORTS has been designed and implemented. Appropriate field data analysis, nowcast procedures, and the design and generation of graphics for WWW display of field data and nowcast results are presented and discussed. Furthermore, SFPORTS is designed to support hazardous materials spill prevention and response, and to serve as a resource for scientists studying the health of the San Francisco Bay ecosystem. The success (or failure) of SFPORTS in serving the intended user community is determined by the effectiveness of the user interface.
Linear regression and the normality assumption.
Schmidt, Amand F; Finan, Chris
2017-12-16
Researchers often perform arbitrary outcome transformations to fulfill the normality assumption of a linear regression model. This commentary explains and illustrates that in large data settings, such transformations are often unnecessary and, worse, may bias model estimates. Linear regression assumptions are illustrated using simulated data and an empirical example on the relation between time since type 2 diabetes diagnosis and glycated hemoglobin levels. Simulation results were evaluated on coverage; i.e., the number of times the 95% confidence interval included the true slope coefficient. Although outcome transformations bias point estimates, violations of the normality assumption in linear regression analyses do not. The normality assumption is necessary to unbiasedly estimate standard errors, and hence confidence intervals and P-values. However, in large sample sizes (e.g., where the number of observations per variable is >10) violations of this normality assumption often do not noticeably impact results. In contrast, assumptions on the parametric model, absence of extreme observations, homoscedasticity, and independence of the errors remain influential even in large sample size settings. Given that modern healthcare research typically includes thousands of subjects, focusing on the normality assumption is often unnecessary, does not guarantee valid results, and, worse, may bias estimates through the practice of outcome transformations.
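The central claim — that violating normality of the errors does not bias the OLS slope — is easy to check by simulation. A minimal stdlib-Python sketch with heavily skewed (exponential) errors; the sample sizes and coefficients are arbitrary choices for the demonstration, not those of the commentary:

```python
import random
import statistics

random.seed(42)

def ols_slope(x, y):
    """Ordinary least squares slope: cov(x, y) / var(x)."""
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

slopes = []
for _ in range(500):
    x = [random.uniform(0.0, 10.0) for _ in range(200)]
    # Heavily skewed (exponential) errors, centred at zero: normality violated.
    e = [random.expovariate(1.0) - 1.0 for _ in range(200)]
    y = [1.0 + 2.0 * xi + ei for xi, ei in zip(x, e)]
    slopes.append(ols_slope(x, y))

print(round(statistics.mean(slopes), 2))  # close to the true slope 2.0
```

Averaged over replications, the slope estimate sits at the true value despite the non-normal errors, which is exactly the unbiasedness property the commentary describes; normality would matter for the standard errors, not the point estimate.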
Special set linear algebra and special set fuzzy linear algebra
Kandasamy, W. B. Vasantha; Smarandache, Florentin; Ilanthenral, K.
2009-01-01
The authors in this book introduce the notion of special set linear algebra and special set fuzzy linear algebra, which is an extension of the notions of set linear algebra and set fuzzy linear algebra. These concepts are best suited to applications in multi-expert models and cryptology. This book has five chapters. In chapter one the basic concepts of set linear algebra are given in order to make this book self-contained. The notion of special set linear algebra and their fuzzy analog...
Distribution of polycyclic aromatic hydrocarbons in water and surface sediments from Daya Bay, China
International Nuclear Information System (INIS)
Zhou, J.L.; Maskaoui, K.
2003-01-01
Findings indicate an urgent need to establish a monitoring program for persistent organic pollutants in water and sediment. - Marine culture is thriving in China and represents a major component of the regional economy in coastal zones, yet the environmental quality of many of those areas has never been studied. This paper attempts to investigate the quality status of Daya Bay, a key aquaculture area in China. The levels of 16 polycyclic aromatic hydrocarbons (PAHs) were determined in water and sediment samples of the bay. The total concentrations of the 16 PAHs varied from 4228 to 29325 ng l⁻¹ in water, and from 115 to 1134 ng g⁻¹ dry weight in sediments. In comparison to many other marine systems studied, the PAH levels in Daya Bay waters were relatively high, and at six sites they were sufficiently high (>10 μg l⁻¹) to cause acute toxicity. The PAH composition pattern in sediments suggests dominance by medium to high molecular weight compounds, and the ratios of certain related PAHs indicate important pyrolytic and petrogenic sources. Further analysis showed that the distribution coefficient (K_D) increased with the organic carbon content of sediments, consistent with PAH partition theory. The organic carbon normalised distribution coefficient (K_oc) also increased with the compounds' octanol/water partition coefficient (K_ow), confirming the potential applicability of linear free energy relationships in the modelling and prediction of PAH behaviour in marine environments.
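The linear free-energy relationship check described at the end — K_oc derived from K_D and the sediment organic-carbon fraction, then regressed against K_ow in log space — can be sketched as follows. All K_D, f_oc, and log K_ow values below are illustrative placeholders, not the measurements from this study.

```python
import math

# Illustrative (K_D in l/kg, organic-carbon fraction f_oc, log10 K_ow) triples,
# loosely patterned on PAHs of increasing hydrophobicity:
samples = [
    (180.0, 0.010, 3.37),     # naphthalene-like
    (2400.0, 0.012, 4.57),    # phenanthrene-like
    (26000.0, 0.011, 5.18),   # pyrene-like
    (310000.0, 0.013, 6.04),  # benzo[a]pyrene-like
]

# Normalise the distribution coefficient by organic carbon: K_oc = K_D / f_oc.
log_koc = [math.log10(kd / foc) for kd, foc, _ in samples]
log_kow = [kow for _, _, kow in samples]

# Slope of log K_oc vs log K_ow; a slope near 1 supports the LFER.
mx = sum(log_kow) / len(log_kow)
my = sum(log_koc) / len(log_koc)
slope = sum((x - mx) * (y - my) for x, y in zip(log_kow, log_koc)) / \
        sum((x - mx) ** 2 for x in log_kow)
print(f"log K_oc vs log K_ow slope = {slope:.2f}")
```

In practice the regression would be run over all measured compounds and sites, with the intercept and scatter also reported; the sketch only shows the mechanics of the K_oc normalisation and the log-log fit.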
Whose Bay Street? Competing Narratives of Nassau's City Centre
Directory of Open Access Journals (Sweden)
Nona Patara Martin
2009-05-01
Bay Street has always been at the centre of commercial, cultural and political life in the Bahama Islands. It also acts as a gateway for millions of tourists who come to Nassau, the Bahamian capital, via cruise ships every year. Not surprisingly, Bahamians and non-Bahamians have widely divergent impressions of Bay Street. The need to accommodate the tourists who are critical to the Bahamian economy has meant that Bay Street, despite its deep social significance for Bahamians, has increasingly become a tourist space. With reference to the ‘sense of place’ and place attachment literature, this paper traces the transformation of Bay Street and attempts to tease out the most obvious tensions between the Bay Street that Bahamians experience and Bay Street as a port of call.
Phytoplankton growth, dissipation, and succession in estuarine environments. [Chesapeake Bay
Energy Technology Data Exchange (ETDEWEB)
Seliger, H H
1976-01-01
Two major advances in a study of phytoplankton ecology in the Chesapeake Bay are reported. The annual subsurface transport of a dinoflagellate species (Prorocentrum mariae labouriae) from the mouth of the bay a distance of 120 nautical miles northward to the region of the Bay Bridge was followed. Prorocentrum is a major seasonal dinoflagellate in the Chesapeake Bay and annually has been reported to form mahogany tides, dense reddish-brown patches, in the northern bay beginning in late spring and continuing through the summer. Subsequent to this annual appearance, the Prorocentrum spread southward and into the western tributary estuaries. The physiological and behavioral characteristics of the Prorocentrum were correlated with the physical water movements in the bay. A phytoplankton cage technique for the in situ measurement of the growth rates of natural mixed populations is described. (CH)
Energy Technology Data Exchange (ETDEWEB)
Munehiro, H
1980-05-29
When the carriage of a printer is driven by a rotary motor via a cable, the accuracy of the carriage position is limited by stretching or contraction and ageing of the cable. In order to solve this problem, a direct drive system was proposed, in which the printer carriage is driven by a linear motor. If the magnetic circuit of such a motor is to be kept compact, then the magnetic flux density in the air gap must be reduced or the motor travel must be reduced. It is the purpose of this invention to create an electrodynamic linear motor which is compact and light and at the same time has a relatively high constant force over a large travel. The invention is characterised by the fact that magnetic fields of alternating polarity are generated at equal intervals along the magnet track, and that the coil arrangement has two adjacent coils, whose size corresponds to half the length of each magnetic pole. A logic circuit is provided to select one of the two coils and to determine the direction of the current depending on the signals of a magnetic field sensor on the coil arrangement.
International Nuclear Information System (INIS)
Kozarov, A.; Petrov, O.; Antonov, J.; Sotirova, S.; Petrova, B.
2006-01-01
The purpose of the linear wind-power generator described in this article is to decrease the following disadvantages of the common wind-powered turbine: 1) large bending and twisting moments on the blades and the shaft, especially when strong winds and turbulence exist; 2) significant values of the natural oscillation period of the construction, resulting in the possibility of destructive resonance oscillations; 3) high velocity of the peripheral parts of the rotor, creating a danger for birds; 4) difficulties connected with installation and operation on the mountain ridges and passes where the wind energy potential is largest. The working surfaces of the generator in question, driven by the wind, are not connected to a common shaft; each moves along a railway track with few oscillations, so the size of each component is small and their number can be rather large. The trajectory is not a circle but a closed contour in a vertical plane, which consists of two rectilinear sectors, one above the other, connected at their ends by semi-circumferences. The mechanical energy of each component is converted into electrical energy on the principle of the linear electrical generator. Regulation is provided for when the direction of the wind is perpendicular to the route. A possibility of increased effectiveness is shown by directing additional quantities of air toward the movable components with static barriers
Geochemistry of sediments in the Back Bay and Yellowknife Bay of the Great Slave Lake
International Nuclear Information System (INIS)
Mudroch, A.; Joshi, S.R.; Sutherland, D.; Mudroch, P.; Dickson, K.M.
1989-01-01
Gold mining activities have generated wastes with high concentrations of arsenic and zinc in the vicinity of Yellowknife, Northwest Territories, Canada. Some of the waste material has been discharged into Yellowknife Bay of Great Slave Lake. Concentrations of arsenic and zinc were determined in sediment cores collected at the depositional areas of Yellowknife Bay. Sedimentation rates were estimated using two different radiometric approaches: the depth profiles of cesium 137 and lead 210. Geochemical analysis of the sediment cores indicated input of similar material into sampling areas over the past 50 yr. Age profiles of the sediment constructed from the radionuclide measurements were used to determine historical trends of arsenic and zinc inputs into Yellowknife Bay. The historical record was in good agreement with implemented remedial actions and the usage patterns of both elements. 16 refs., 6 figs., 3 tabs
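The lead-210 dating mentioned in this abstract is commonly done by fitting an exponential decay of unsupported 210Pb activity against depth. The sketch below assumes the simple constant-initial-concentration (CIC) model, A(z) = A0·exp(-λz/s); the abstract does not say which 210Pb model the authors actually used, and the core data here are synthetic:

```python
import math

PB210_HALF_LIFE_YR = 22.3
LAMBDA = math.log(2) / PB210_HALF_LIFE_YR   # 210Pb decay constant, 1/yr

def sedimentation_rate(depths_cm, activities):
    """Estimate sedimentation rate s (cm/yr) under the CIC model, where
    unsupported 210Pb activity decays as A(z) = A0 * exp(-LAMBDA * z / s).
    A least-squares fit of ln(A) against depth z gives slope = -LAMBDA/s."""
    ys = [math.log(a) for a in activities]
    n = len(depths_cm)
    mx = sum(depths_cm) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(depths_cm, ys))
             / sum((x - mx) ** 2 for x in depths_cm))
    return -LAMBDA / slope

# Synthetic core generated with a rate of 0.5 cm/yr (no measurement noise);
# the fit recovers the rate exactly.
true_rate = 0.5
depths = [0, 5, 10, 15, 20]
acts = [10.0 * math.exp(-LAMBDA * z / true_rate) for z in depths]
estimated = sedimentation_rate(depths, acts)
```

With real cores the 137Cs peak (e.g. the 1963 fallout maximum) provides an independent check on the fitted rate, which is presumably why the authors used both radionuclides.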
Spatial-temporal migration laws of Cd in Jiaozhou Bay
Yang, Dongfang; Li, Haixia; Zhang, Xiaolong; Wang, Qi; Miao, Zhenqing
2018-02-01
Many marine bays have been polluted by various pollutants, and understanding the migration laws of those pollutants is essential to scientific research and pollution control. This paper analyzed the spatial and temporal migration laws of Cd in the waters of Jiaozhou Bay during 1979-1983. Results showed that there were twenty spatial-temporal migration laws for the migration processes of Cd. These laws are helpful for better understanding the migration of Cd in marine bays, providing a basis for scientific research and pollution control.
Energy Technology Data Exchange (ETDEWEB)
Sunendra, Joshi R.; Kukkadapu, Ravi K.; Burdige, David J.; Bowden, Mark E.; Sparks, Donald L.; Jaisi, Deb P.
2015-05-19
The Chesapeake Bay, the largest and most productive estuary in the US, suffers from varying degrees of water quality issues fueled by both point and non-point nutrient sources. Restoration of the bay is complicated by the multitude of nutrient sources, their variable inputs and hydrological conditions, and complex interacting factors including climate forcing. These complexities not only restrict formulation of effective restoration plans but also open up debates on accountability for nutrient loading. A detailed understanding of sediment phosphorus (P) dynamics enables one to identify the exchange of dissolved constituents across the sediment-water interface and aids in better constraining the mechanisms and processes controlling the coupling between the sediments and the overlying waters. Here we used phosphate oxygen isotope ratios (δ18Op) in concert with sediment chemistry, XRD, and Mössbauer spectroscopy on sediment retrieved from an organic-rich, sulfidic site in the meso-haline portion of the mid-bay to identify sources and pathways of sedimentary P cycling and to infer potential feedback effects on bottom water hypoxia and surface water eutrophication. Isotope data indicate that the regeneration of inorganic P from organic matter degradation (remineralization) is the predominant, if not sole, pathway for authigenic P precipitation in the mid-bay sediments. We interpret that the excess inorganic P generated by remineralization should have overwhelmed any bottom-water and/or pore-water P derived from other sources or biogeochemical processes and exceeded saturation with respect to authigenic P precipitation. This is the first research to identify the predominance of the remineralization pathway over the remobilization (coupled Fe-P cycling) pathway in the Chesapeake Bay. Therefore, these results are expected to have significant implications for the current understanding of P cycling and benthic-pelagic coupling in the bay, particularly on the
Upgrade of Daya Bay full scope simulator
International Nuclear Information System (INIS)
2006-01-01
Daya Bay full scope simulator was manufactured by the French THOMSON Company in the early 1990s. It was put into operation in August 1992, one year before the plant's unit 1 was commissioned. During nearly 10 years, the Daya Bay simulator was used to train the control room operators; as many as 220 operators obtained operator or senior operator licenses. The Daya Bay simulator made a great contribution to the plant's operation. Owing to the limitations of simulation technology and computer capacity in that age, however, the Daya Bay simulator had deficiencies from the beginning, making maintenance difficult and gradually having more and more impact on operator training. - Poor performance: The main computer was the Gould CONCEPT 32/67. Its calculation speed was quite low and its memory very limited; even in the normal operation mode, the average CPU load was up to 80%. The simulation fidelity and scope were not sufficient, and could not meet the deeper levels of training demand. Many special plant scenarios were not simulated, so it was not possible to undertake verification exercises for the corresponding plant operations. - Poor maintainability: In hardware, the Gould CONCEPT 32/67 has a multi-board architecture, and the thousands of tiny connection pins between boards and chassis were the weak link; after repeated removal and reinsertion of boards during repairs, the connections became worse and worse. In addition, spare parts were difficult to order. Computer crashes happened very often, and each failure took a few hours, or even a few days, to fix. In software, suspension of simulation modules, OUT OF TIME errors, and software breakdowns occurred often. Restarting the system took over half an hour each time, which seriously interrupted normal training. In software maintenance, most modules were manually coded and the development tools were difficult to use; less than 10% of modifications related to the plant upgrade could be implemented on
Linearization of the Lorenz system
International Nuclear Information System (INIS)
Li, Chunbiao; Sprott, Julien Clinton; Thio, Wesley
2015-01-01
Partial and complete piecewise linearized versions of the Lorenz system are proposed. The linearized versions have an independent total amplitude control parameter. Further linearization leads naturally to a piecewise linear version of the diffusionless Lorenz system. A chaotic circuit with a single amplitude controller is then implemented using a new switch element, producing a chaotic oscillation that agrees with the numerical calculation for the piecewise linear diffusionless Lorenz system. - Highlights: • Partial and complete piecewise linearized versions of the Lorenz system are addressed. • The linearized versions have an independent total amplitude control parameter. • A piecewise linear version of the diffusionless Lorenz system is derived by further linearization. • A corresponding chaotic circuit without any multiplier is implemented for the chaotic oscillation
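The abstract does not give the linearized equations themselves. A minimal numerical sketch, assuming a plausible signum-based piecewise-linear form of the diffusionless Lorenz system (dx/dt = y - x, dy/dt = -xz, dz/dt = xy - R, with each product replaced by a sign/absolute-value term; the paper's exact equations may differ), can be integrated with a standard RK4 stepper:

```python
import math

def sgn(v):
    return (v > 0) - (v < 0)

def pwl_diffusionless_lorenz(state, R=1.0):
    """Assumed piecewise-linear surrogate of the diffusionless Lorenz system:
        dx/dt = y - x
        dy/dt = -z * sgn(x)
        dz/dt = |x| - R
    In each half-space x > 0 or x < 0 the system is exactly linear."""
    x, y, z = state
    return (y - x, -z * sgn(x), abs(x) - R)

def rk4(f, state, dt):
    """One classical fourth-order Runge-Kutta step."""
    k1 = f(state)
    k2 = f(tuple(s + 0.5 * dt * k for s, k in zip(state, k1)))
    k3 = f(tuple(s + 0.5 * dt * k for s, k in zip(state, k2)))
    k4 = f(tuple(s + dt * k for s, k in zip(state, k3)))
    return tuple(s + dt / 6.0 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

state = (1.0, 0.0, 0.0)
traj = [state]
for _ in range(1000):              # integrate to t = 10 with dt = 0.01
    state = rk4(pwl_diffusionless_lorenz, state, 0.01)
    traj.append(state)
```

Because the right-hand side is linear within each half-space, such systems map directly onto circuits built from comparators and switches rather than analog multipliers, which is the implementation advantage the abstract highlights.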
Topics in computational linear optimization
DEFF Research Database (Denmark)
Hultberg, Tim Helge
2000-01-01
Linear optimization has been an active area of research ever since the pioneering work of G. Dantzig more than 50 years ago. This research has produced a long sequence of practical as well as theoretical improvements of the solution techniques available for solving linear optimization problems...... of high quality solvers and the use of algebraic modelling systems to handle the communication between the modeller and the solver. This dissertation features four topics in computational linear optimization: A) automatic reformulation of mixed 0/1 linear programs, B) direct solution of sparse unsymmetric...... systems of linear equations, C) reduction of linear programs and D) integration of algebraic modelling of linear optimization problems in C++. Each of these topics is treated in a separate paper included in this dissertation. The efficiency of solving mixed 0-1 linear programs by linear programming based...
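To make the underlying problem concrete: a linear program maximizes a linear objective over a polyhedron, and a finite optimum always occurs at a vertex. For a toy two-variable instance (invented here for illustration; real solvers use the simplex or interior-point methods, not enumeration) the vertices can simply be enumerated:

```python
from itertools import combinations

# Maximize 3x + 2y subject to x + y <= 4, 2x + y <= 5, x >= 0, y >= 0.
# A finite LP optimum lies at a vertex of the feasible polygon, so in two
# dimensions we can intersect every pair of constraint lines and keep the
# feasible intersection points.
c = (3.0, 2.0)
A = [(1.0, 1.0), (2.0, 1.0), (-1.0, 0.0), (0.0, -1.0)]  # last two encode x,y >= 0
b = [4.0, 5.0, 0.0, 0.0]

def intersect(r1, r2):
    """Intersection of the two constraint boundary lines, or None if parallel."""
    (a1, b1), (a2, b2) = A[r1], A[r2]
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        return None
    x = (b[r1] * b2 - b[r2] * b1) / det      # Cramer's rule
    y = (a1 * b[r2] - a2 * b[r1]) / det
    return (x, y)

def feasible(p):
    return all(ai * p[0] + bi * p[1] <= bv + 1e-9
               for (ai, bi), bv in zip(A, b))

vertices = [p for i, j in combinations(range(len(A)), 2)
            if (p := intersect(i, j)) is not None and feasible(p)]
best = max(vertices, key=lambda p: c[0] * p[0] + c[1] * p[1])
```

Here the optimum is at (1, 3) with objective value 9. Vertex enumeration grows combinatorially and is hopeless beyond toy sizes, which is exactly why the efficient solver technology the dissertation studies matters.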
Linearization of the Lorenz system
Energy Technology Data Exchange (ETDEWEB)
Li, Chunbiao, E-mail: goontry@126.com [School of Electronic & Information Engineering, Nanjing University of Information Science & Technology, Nanjing 210044 (China); Engineering Technology Research and Development Center of Jiangsu Circulation Modernization Sensor Network, Jiangsu Institute of Commerce, Nanjing 211168 (China); Sprott, Julien Clinton [Department of Physics, University of Wisconsin–Madison, Madison, WI 53706 (United States); Thio, Wesley [Department of Electrical and Computer Engineering, The Ohio State University, Columbus, OH 43210 (United States)
2015-05-08
Partial and complete piecewise linearized versions of the Lorenz system are proposed. The linearized versions have an independent total amplitude control parameter. Further linearization leads naturally to a piecewise linear version of the diffusionless Lorenz system. A chaotic circuit with a single amplitude controller is then implemented using a new switch element, producing a chaotic oscillation that agrees with the numerical calculation for the piecewise linear diffusionless Lorenz system. - Highlights: • Partial and complete piecewise linearized versions of the Lorenz system are addressed. • The linearized versions have an independent total amplitude control parameter. • A piecewise linear version of the diffusionless Lorenz system is derived by further linearization. • A corresponding chaotic circuit without any multiplier is implemented for the chaotic oscillation.
Delineation of marsh types from Corpus Christi Bay, Texas, to Perdido Bay, Alabama, in 2010
Enwright, Nicholas M.; Hartley, Stephen B.; Couvillion, Brady R.; Brasher, Michael G.; Visser, Jenneke M.; Mitchell, Michael K.; Ballard, Bart M.; Parr, Mark W.; Wilson, Barry C.
2015-07-23
Coastal zone managers and researchers often require detailed information regarding emergent marsh vegetation types (that is, fresh, intermediate, brackish, and saline) for modeling habitat capacities and needs of marsh-dependent taxa (such as waterfowl and alligator). Detailed information on the extent and distribution of emergent marsh vegetation types throughout the northern Gulf of Mexico coast has been historically unavailable. In response, the U.S. Geological Survey, in collaboration with the Gulf Coast Joint Venture, the University of Louisiana at Lafayette, Ducks Unlimited, Inc., and Texas A&M University-Kingsville, produced a classification of emergent marsh vegetation types from Corpus Christi Bay, Texas, to Perdido Bay, Alabama.
On the linear programming bound for linear Lee codes.
Astola, Helena; Tabus, Ioan
2016-01-01
Based on an invariance-type property of the Lee-compositions of a linear Lee code, additional equality constraints can be introduced to the linear programming problem of linear Lee codes. In this paper, we formulate this property in terms of an action of the multiplicative group of the field [Formula: see text] on the set of Lee-compositions. We show some useful properties of certain sums of Lee-numbers, which are the eigenvalues of the Lee association scheme, appearing in the linear programming problem of linear Lee codes. Using the additional equality constraints, we formulate the linear programming problem of linear Lee codes in a very compact form, leading to a fast execution, which allows one to efficiently compute the bounds for large parameter values of the linear codes.
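The basic object behind these bounds is the Lee weight: over Z_q, a coordinate value a contributes min(a, q - a). A minimal sketch (the toy code and generator below are invented for illustration; the LP bound itself is not implemented here) computes the Lee weight distribution of a small linear code:

```python
q = 5                                    # alphabet Z_5

def lee_weight(v):
    """Lee weight of a vector over Z_q: each coordinate a in {0,...,q-1}
    contributes min(a, q - a), i.e. the shorter arc on the q-cycle."""
    return sum(min(a, q - a) for a in v)

# Tiny linear Lee code over Z_5: all scalar multiples of generator (1, 2).
gen = (1, 2)
code = {tuple((k * g) % q for g in gen) for k in range(q)}

# Lee weight distribution: number of codewords at each Lee weight. For a
# linear code this is also the Lee distance distribution from the zero word.
dist = {}
for cw in code:
    w = lee_weight(cw)
    dist[w] = dist.get(w, 0) + 1
```

For this code every nonzero codeword has Lee weight 3, so the distribution is {0: 1, 3: 4}. The LP bound works with exactly such distributions (refined into Lee-compositions) as the variables of the linear program.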
APPLICATION OF THE NAIVE BAYES ALGORITHM FOR CLASSIFYING INSURANCE CUSTOMER DATA
Directory of Open Access Journals (Sweden)
Bustami Bustami
2014-01-01
Data mining is a technique that exploits large volumes of data to obtain valuable, previously unknown information that can be used for important decision making. In this study, the authors mine the customer data of an insurance company to determine whether each customer's payments are current, partially delinquent, or delinquent. The data were analyzed using the Naive Bayes algorithm, one of the methods of probabilistic reasoning. The Naive Bayes algorithm classifies data into particular classes; the resulting patterns can then be used to assess prospective customers, so that the company can decide whether to accept or reject an applicant. Keywords: data mining, insurance, classification, Naive Bayes algorithm
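The classifier itself is short enough to sketch. The training rows, feature names, and class labels below are invented for illustration (the paper's actual customer attributes are not given in the abstract); the mechanics, class priors times per-feature likelihoods with Laplace smoothing, are the standard categorical Naive Bayes:

```python
import math
from collections import Counter, defaultdict

# Toy training data (invented): each row is (payment_history, income_level),
# each label is the customer's payment status.
rows = [("on_time", "high"), ("on_time", "low"),
        ("late", "low"), ("late", "low"), ("on_time", "high")]
labels = ["smooth", "smooth", "default", "default", "smooth"]

priors = Counter(labels)            # class -> count
counts = defaultdict(Counter)       # (feature index, class) -> value counts
vocab = defaultdict(set)            # feature index -> set of observed values
for row, y in zip(rows, labels):
    for i, v in enumerate(row):
        counts[(i, y)][v] += 1
        vocab[i].add(v)

def predict(row):
    """Pick the class maximizing log P(class) + sum_i log P(value_i | class),
    with Laplace (+1) smoothing so unseen values keep nonzero probability."""
    total = sum(priors.values())
    def score(y):
        lp = math.log(priors[y] / total)
        for i, v in enumerate(row):
            lp += math.log((counts[(i, y)][v] + 1)
                           / (priors[y] + len(vocab[i])))
        return lp
    return max(priors, key=score)
```

The "naive" part is the conditional-independence assumption between features given the class; it is rarely true of real customer data but often works well enough for screening decisions like the accept/reject call described in the abstract.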
Heavy metals in superficial sediment of Algiers Bay
International Nuclear Information System (INIS)
Benamar, M.A.; Toumert, C.L.; Chaouch, L.; Bacha, L.; Tobbeche, S.
1996-01-01
Sediment samples were collected at 33 stations in the Bay of Algiers to assess potential sources of pollution. The analyses were made by X-ray fluorescence (XRF) and atomic absorption spectrometry (AAS). The results give information about concentration levels in relation to the morphology of the bay (its funnel form). Only Co, Mn, Fe, and Cd present a particular distribution (unrelated to the sedimentary facies). The level of pollution by heavy metals in the bottom sediments of the Bay of Algiers has been compared with that of Surkouk, considered a region with low anthropogenic activity
MODELING THE 1958 LITUYA BAY MEGA-TSUNAMI, II
Directory of Open Access Journals (Sweden)
Charles L. Mader
2002-01-01
Lituya Bay, Alaska is a T-shaped bay, 7 miles long and up to 2 miles wide. The two arms at the head of the bay, Gilbert and Crillon Inlets, are part of a trench along the Fairweather Fault. On July 8, 1958, a magnitude 7.5 earthquake occurred along the Fairweather Fault with an epicenter near Lituya Bay. A mega-tsunami wave was generated that washed out trees to a maximum altitude of 520 meters at the entrance of Gilbert Inlet. Much of the rest of the shoreline of the bay was denuded by the tsunami from 30 to 200 meters altitude. In the previous study it was determined that if the 520 meter high run-up was 50 to 100 meters thick, the observed inundation in the rest of Lituya Bay could be numerically reproduced. It was also concluded that further studies would require full Navier-Stokes modeling similar to that required for asteroid-generated tsunami waves. During the summer of 2000, Hermann Fritz conducted experiments that reproduced the Lituya Bay 1958 event. The laboratory experiments indicated that the 1958 Lituya Bay 524 meter run-up on the spur ridge of Gilbert Inlet could be caused by a landslide impact. The landslide-impact-generated tsunami in Lituya Bay was modeled with the full Navier-Stokes AMR Eulerian compressible hydrodynamic code called SAGE, which includes the effect of gravity.
Heme oxygenase-1 mediates BAY 11-7085 induced ferroptosis.
Chang, Ling-Chu; Chiang, Shih-Kai; Chen, Shuen-Ei; Yu, Yung-Luen; Chou, Ruey-Hwang; Chang, Wei-Chao
2018-03-01
Ferroptosis is a form of oxidative cell death and has become a chemotherapeutic target for cancer treatment. BAY 11-7085 (BAY), a well-known IκBα inhibitor, suppressed viability in cancer cells via induction of ferroptotic death in an NF-κB-independent manner. Reactive oxygen species scavenging, relief of lipid peroxidation, replenishment of glutathione and thiol-containing agents, as well as iron chelation, rescued BAY-induced cell death. BAY upregulated a variety of Nrf2 target genes related to redox regulation, particularly heme oxygenase-1 (HO-1). Studies with specific inhibitors and shRNA interventions suggested that the hierarchy of induction is Nrf2-SLC7A11-HO-1. SLC7A11 inhibition by erastin, sulfasalazine, or shRNA interference sensitized cells to BAY-induced death. Overexpression of SLC7A11 attenuated BAY-inhibited cell viability. The ferroptotic process induced by hHO-1 overexpression further indicated that HO-1 is a key mediator of BAY-induced ferroptosis that operates through cellular redox regulation and iron accumulation. BAY causes compartmentalization of HO-1 into the nucleus and mitochondrion, followed by mitochondrial dysfunction and lysosomal targeting for mitophagy. In this study, we first discovered that BAY induces ferroptosis via the Nrf2-SLC7A11-HO-1 pathway and that HO-1 is a key mediator responding to the cellular redox status. Copyright © 2017 Elsevier B.V. All rights reserved.
Florida Bay: A history of recent ecological changes
Fourqurean, J.W.; Robblee, M.B.
1999-01-01
Florida Bay is a unique subtropical estuary at the southern tip of the Florida peninsula. Recent ecological changes (seagrass die-off, algal blooms, increased turbidity) to the Florida Bay ecosystem have focused the attention of the public, commercial interests, scientists, and resource managers on the factors influencing the structure and function of Florida Bay. Restoring Florida Bay to some historic condition is the goal of resource managers, but what is not clear is what an anthropogenically-unaltered Florida Bay would look like. While there is general consensus that human activities have contributed to the changes occurring in the Florida Bay ecosystem, a high degree of natural system variability has made elucidation of the links between human activity and Florida Bay dynamics difficult. Paleoecological analyses, examination of long-term datasets, and directed measurements of aspects of the ecology of Florida Bay all contribute to our understanding of the behavior of the bay, and allow quantification of the magnitude of the recent ecological changes with respect to historical variability of the system.
Establishing Empirical Bases for Sustainability Objectives
Lawrence Martin
2006-01-01
The argument is made that sustainability should be construed as measurable environmental conditions, and that sustainable development strategies should be considered in terms of how well they contribute to the sustainable condition target. A case study of the Chesapeake Bay is presented to illustrate how use of Material Flow Analysis (MFA) as a basic component in the...
The Pruned State-Space System for Non-Linear DSGE Models: Theory and Empirical Applications
DEFF Research Database (Denmark)
Andreasen, Martin Møller; Fernández-Villaverde, Jesús; Rubio-Ramírez, Juan F.
and impulse response functions. Thus, our analysis introduces GMM estimation for DSGE models approximated up to third-order and provides the foundation for indirect inference and SMM when simulation is required. We illustrate the usefulness of our approach by estimating a New Keynesian model with habits...... and Epstein-Zin preferences by GMM when using first and second unconditional moments of macroeconomic and financial data and by SMM when using additional third and fourth unconditional moments and non-Gaussian innovations....
Sediment depositional environment in some bays in Central west coast of India
Digital Repository Service at National Institute of Oceanography (India)
Rajamanickam, G.V.; Gujar, A.R.
negatively and Ratnagiri Bay positively skewed. Kalbadevi sediments show high kurtosis values while those of Mirya Bay show medium and Ratnagiri Bay low values. Bivariant plots between various textural parameters predict mixed environments, viz. for Kalbadevi...
On watermass mixing ratios and regenerated silicon in the Bay of Bengal
Digital Repository Service at National Institute of Oceanography (India)
Rao, D.P.; Sarma, V.V.; Rao, V.S.; Sudhakar, U.; Gupta, G.V.M.
Regeneration of silicon on mixing in the Bay of Bengal has been computed from six water masses [Bay of Bengal low saline water (BBLS), Bay of Bengal subsurface water (BBSS), northern southeast high salinity water (NSEHS), north Indian intermediate...
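Watermass mixing ratios of the kind this abstract computes are typically obtained by treating temperature and salinity as conservative tracers and solving a small linear system: mass conservation plus one equation per tracer. A minimal sketch for three endmembers (the temperature/salinity values below are invented, not the paper's hydrographic data):

```python
# Invented endmember properties (temperature degC, salinity) for three of
# the named water masses; real values come from hydrographic observations.
endmembers = {
    "BBLS":  (28.0, 32.0),
    "BBSS":  (24.0, 34.5),
    "NSEHS": (27.0, 35.2),
}

def mixing_fractions(T_obs, S_obs):
    """Solve f1+f2+f3 = 1, sum_i fi*Ti = T_obs, sum_i fi*Si = S_obs
    (three conservative constraints, three unknowns) by Cramer's rule."""
    (T1, S1), (T2, S2), (T3, S3) = endmembers.values()
    A = [[1.0, 1.0, 1.0], [T1, T2, T3], [S1, S2, S3]]
    rhs = [1.0, T_obs, S_obs]

    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    D = det3(A)
    fracs = []
    for col in range(3):
        M = [row[:] for row in A]
        for r in range(3):
            M[r][col] = rhs[r]
        fracs.append(det3(M) / D)
    return fracs

# A sample that is an exact 50/30/20 mixture recovers those fractions.
mix = [0.5, 0.3, 0.2]
T = sum(f * t for f, (t, s) in zip(mix, endmembers.values()))
S = sum(f * s for f, (t, s) in zip(mix, endmembers.values()))
fracs = mixing_fractions(T, S)
```

Once the fractions are known, the "expected" silicon from conservative mixing can be computed the same way; the excess of observed over expected silicon is the regenerated component the abstract refers to.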
Greenwich Bay is an urbanized embayment of Narragansett Bay potentially impacted by multiple stressors. The present study identified the important stressors affecting Greenwich Bay benthic fauna. First, existing data and information were used to confirm that the waterbody was imp...
Introduction to linear elasticity
Gould, Phillip L
2013-01-01
Introduction to Linear Elasticity, 3rd Edition, provides an applications-oriented grounding in the tensor-based theory of elasticity for students in mechanical, civil, aeronautical, and biomedical engineering, as well as materials and earth science. The book is distinct from the traditional text aimed at graduate students in solid mechanics by introducing the subject at a level appropriate for advanced undergraduate and beginning graduate students. The author's presentation allows students to apply the basic notions of stress analysis and move on to advanced work in continuum mechanics, plasticity, plate and shell theory, composite materials, viscoelasticity and finite method analysis. This book also: Emphasizes tensor-based approach while still distilling down to explicit notation Provides introduction to theory of plates, theory of shells, wave propagation, viscoelasticity and plasticity accessible to advanced undergraduate students Appropriate for courses following emerging trend of teaching solid mechan...
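The core constitutive relation such a text builds on is isotropic Hooke's law, sigma = lambda*tr(epsilon)*I + 2*mu*epsilon. A minimal sketch (the Lame constants below are merely steel-like illustrative values):

```python
def stress_from_strain(strain, lam, mu):
    """Isotropic linear elasticity: sigma_ij = lam*tr(eps)*delta_ij + 2*mu*eps_ij,
    with strain and stress represented as 3x3 nested lists."""
    tr = strain[0][0] + strain[1][1] + strain[2][2]
    return [[lam * tr * (1 if i == j else 0) + 2 * mu * strain[i][j]
             for j in range(3)] for i in range(3)]

# Uniaxial strain of 1e-3 along x, with steel-like Lame constants in Pa.
lam, mu = 1.15e11, 7.7e10
eps = [[1e-3, 0.0, 0.0], [0.0, 0.0, 0.0], [0.0, 0.0, 0.0]]
sigma = stress_from_strain(eps, lam, mu)
```

Note the lateral stresses sigma_yy = sigma_zz = lam*eps_xx are nonzero even though the lateral strains vanish; this confinement effect is exactly what the tensor formulation captures and what scalar "stress = E*strain" intuition misses.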
International Nuclear Information System (INIS)
Haniger, L.; Elger, R.; Kocandrle, L.; Zdebor, J.
1986-01-01
A linear step drive is described developed in Czechoslovak-Soviet cooperation and intended for driving WWER-1000 control rods. The functional principle is explained of the motor and the mechanical and electrical parts of the drive, power control, and the indicator of position are described. The motor has latches situated in the reactor at a distance of 3 m from magnetic armatures, it has a low structural height above the reactor cover, which suggests its suitability for seismic localities. Its magnetic circuits use counterpoles; the mechanical shocks at the completion of each step are damped using special design features. The position indicator is of a special design and evaluates motor position within ±1% of total travel. A drive diagram and the flow chart of both the control electronics and the position indicator are presented. (author) 4 figs
International Nuclear Information System (INIS)
Tjutju, R.L.
1977-01-01
A pulse amplifier is a standard and significant part of a spectrometer. Unlike other types of amplifier, it combines amplification with pulse shaping. Because of its special purpose, the device should fulfill the following requirements: high resolution, to obtain a yield comparable to the actual state of the source; a high signal-to-noise ratio, needed for good resolution; high linearity, to facilitate calibration; and good overload recovery, so that the device is capable of analyzing low-energy radiation that appears jointly with high-energy events. Other expectations of the device are economical and practical use and extensive applicability. For these reasons it is built on the standard NIM principle, taking the above considerations into account. High-quality components are used throughout, while their availability on the domestic market is secured. (author)
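The shaping role described above can be illustrated numerically. The sketch assumes a simple first-order discrete CR-RC model (a CR differentiator followed by an RC integrator with equal time constants), which is a textbook idealization rather than this instrument's actual circuit:

```python
def cr_rc_shape(signal, dt, tau):
    """Discrete CR-RC shaper: one-pole high-pass (CR) followed by one-pole
    low-pass (RC), both with time constant tau. For a unit step input the
    ideal continuous response is (t/tau)*exp(-t/tau), peaking near 1/e."""
    a = tau / (tau + dt)          # high-pass (differentiator) coefficient
    b = dt / (tau + dt)           # low-pass (integrator) coefficient
    hp, out = [0.0], [0.0]
    for i in range(1, len(signal)):
        hp.append(a * (hp[-1] + signal[i] - signal[i - 1]))
        out.append(out[-1] + b * (hp[i] - out[-1]))
    return out

# Step input, like the leading edge of a preamplifier pulse: the shaper
# turns it into a rounded unipolar peak that decays back toward zero.
dt, tau = 1e-8, 1e-6
step = [0.0] * 10 + [1.0] * 2000
shaped = cr_rc_shape(step, dt, tau)
```

This is why shaping and overload recovery are linked: the shaped pulse returns to the baseline quickly, so a small pulse arriving after a large one is not measured on top of a lingering tail.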