Kampers, Gerrit; Oberlack, Martin; Wacławczyk, Marta; Talamelli, Alessandro
2016-01-01
This volume collects the edited and reviewed contributions presented at the 6th iTi Conference in Bertinoro, covering fundamental and applied aspects of turbulence. In the spirit of the iTi conference, the volume was produced after the conference so that the authors could incorporate comments and discussions raised during the meeting. In the present book the contributions have been structured according to the topics: I Theory, II Wall bounded flows, III Particles in flows, IV Free flows, V Complex flows. The volume is dedicated to the memory of Prof. Konrad Bajer, who prematurely passed away in Warsaw on August 29, 2014.
International Nuclear Information System (INIS)
Moureau, S.
1993-01-01
ITI-MOVATS provides a wide range of test devices to monitor the performance of valves: motor operated gate or globe valve, butterfly valve, air operated valve, and check valve. The ITI-MOVATS testing equipment is used in the following three areas: actuator setup/baseline testing, periodic/post-maintenance testing, and differential pressure testing. The parameters typically measured with the MOVATS diagnostic system as well as the devices used to measure them are described. (Z.S.)
Implementing ITI for urban development locally
Directory of Open Access Journals (Sweden)
Garyfallia Katsavounidou
2017-12-01
Full Text Available In the current programming period (2014-2020) the European Commission has introduced a new strategic instrument, the Integrated Territorial Investment (ITI), which shifts decisions on the allocation of funds to the local level and, most importantly, enables funds to be drawn from several priority axes and from several European Structural and Investment Funds. Greece is one of the EU member countries that has committed to using ITIs as a tool for urban development. In August 2016, in the Region of Central Macedonia, urban authorities with a population of over 10,000 inhabitants were invited by the Managing Authority of the Regional Operational Programme to submit a Strategy for Sustainable Urban Development (SUD) through the mandatory implementation of the ITI tool. The paper focuses on one of these municipalities, the city of Veria, where the ITI approach has been implemented for the design of an ITI of urban scale (ITI-SUD). The integrated approach prescribed by the regional authorities forced municipalities to adopt governance approaches uncommon until now: involving multiple stakeholders in the entire process, from strategy development to project selection and implementation. The paper describes the benefits and challenges of the new approach as applied in the local context, showing the vertical and horizontal connections of urban development strategies. Most importantly, in the context of ‘procedural learning’ happening in Europe in the field of territorial cohesion, it offers an insight into how European cohesion policy strategies and tools are tested at the local level.
5th iTi Conference in Turbulence 2012
Oberlack, Martin; Peinke, Joachim
2014-01-01
This volume collects the edited and reviewed contributions presented at the 5th iTi Conference in Bertinoro, covering fundamental aspects of turbulent flows. In the spirit of the iTi initiative, the volume was produced after the conference so that the authors could incorporate comments and discussions raised during the meeting. Turbulence presents a large number of aspects and problems which are still unsolved and which challenge research communities in engineering and the physical sciences, in both basic and applied research. The book presents recent advances in theory related to new statistical approaches, the effect of non-linearities and the presence of symmetries. This edition presents new contributions related to the physics and control of laminar-turbulent transition in wall-bounded flows, which may have a significant impact on drag reduction applications. Turbulent boundary layers at increasing Reynolds number are the main subject of both computational and experimental long-term research programs ...
Directory of Open Access Journals (Sweden)
Hafez Salleh
2011-12-01
Full Text Available Most traditional IT/IS performance measures are based on productivity and process, and mainly focus on methods of investment appraisal. There is a need for alternative holistic measurement models that enable soft and hard issues to be measured qualitatively. A New IT/IS Capability Evaluation (NICE) framework has been designed to measure the capability of organisations to successfully implement IT systems, and it is applicable across industries. The idea is to provide managers with measurement tools that enable them to identify where improvements are required within their organisations and to indicate their readiness prior to IT investment. The NICE framework investigates four organisational key elements: IT, Environment, Process and People, and is composed of six progressive stages of maturity through which a company can develop its IT/IS capabilities. For each maturity stage, the NICE framework describes a set of critical success factors that must be in place for the company to achieve that stage.
A genetic map of mouse chromosome 1 near the Lsh-Ity-Bcg disease resistance locus.
Mock, B; Krall, M; Blackwell, J; O'Brien, A; Schurr, E; Gros, P; Skamene, E; Potter, M
1990-05-01
Isozyme and restriction fragment length polymorphism (RFLP) analyses of backcross progeny, recombinant inbred strains, and congenic strains of mice positioned eight genetic markers with respect to the Lsh-Ity-Bcg disease resistance locus. Allelic isoforms of Idh-1 and Pep-3 and RFLPs detected by Southern hybridization for Myl-1, Cryg, Vil, Achrg, bcl-2, and Ren-1,2, between BALB/cAnPt and DBA/2NPt mice, were utilized to examine the cosegregation of these markers with the Lsh-Ity-Bcg resistance phenotype in 103 backcross progeny. An additional 47 backcross progeny from a cross between C57BL/10ScSn and B10.L-Lshr/s mice were examined for the cosegregation of Myl-1 and Vil RFLPs with Lsh phenotypic differences. Similarly, BXD recombinant inbred strains were typed for RFLPs upon hybridization with Vil and Achrg. Recombination frequencies generated in the different test systems were statistically similar, and villin (Vil) was identified as the molecular marker closest (1.7 +/- 0.8 cM) to the Lsh-Ity-Bcg locus. Two other DNA sequences, nebulin (Neb) and an anonymous DNA fragment (D2S3), which map to a region of human chromosome 2q that is homologous to proximal mouse chromosome 1, were not closely linked to the Lsh-Ity-Bcg locus. This multipoint linkage analysis of chromosome 1 surrounding the Lsh-Ity-Bcg locus provides a basis for the eventual isolation of the disease gene.
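The map distances above come from cosegregation counts in backcross progeny. As a hedged illustration of how such an estimate is formed, the sketch below computes a recombination fraction and its binomial standard error; the counts used are invented for illustration, not the study's actual data, and for small fractions 100×r approximates the distance in cM.

```python
# Sketch: recombination frequency and standard error from backcross counts.
# The counts (3 recombinants out of 176 progeny) are hypothetical.
import math

def recombination_estimate(recombinants, total):
    """Return (frequency, standard error) for a backcross sample."""
    r = recombinants / total
    se = math.sqrt(r * (1 - r) / total)
    return r, se

r, se = recombination_estimate(3, 176)
print(f"r = {100 * r:.1f} +/- {100 * se:.1f} cM (approx.)")
```

For tighter linkage, pooling progeny across crosses (as the study did across test systems) shrinks the standard error roughly as 1/sqrt(n).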
Building a Taxonomic Data Editor: ITIS Taxonomic Workbench 6.0
Mitchell, David; Bowman, Lisa; Brockmeier, Christopher
2017-01-01
The Integrated Taxonomic Information System (ITIS - www.itis.gov) provides a regularly updated global database that currently contains over 840,000 scientific names and their hierarchy. A new rich Internet application for adding and editing ITIS data, Taxonomic Workbench 6.0, is being developed using the AngularJS framework. AngularJS is designed to take advantage of many features that are fairly recent to the web platform, facilitates a well-structured product that is easier to maintain, and...
7th iTi Conference in Turbulence
Talamelli, Alessandro; Oberlack, Martin; Peinke, Joachim
2017-01-01
This volume collects the edited and reviewed contributions presented at the 7th iTi Conference in Bertinoro, covering fundamental and applied aspects of turbulence. In the spirit of the iTi conference, the volume was produced after the conference so that the authors had the opportunity to incorporate comments and discussions raised during the meeting. In the present book, the contributions have been structured according to the topics: I Theory, II Wall bounded flows, III Pipe flow, IV Modelling, V Experiments, VI Miscellaneous topics.
4th iTi Conference in Turbulence
Peinke, Joachim; Talamelli, Alessandro; Castillo, Luciano; Hölling, Michael
2012-01-01
This fourth issue on "progress in turbulence" is based on the fourth ITI conference (ITI: interdisciplinary turbulence initiative), which took place in Bertinoro, northern Italy. Leading researchers from the engineering and physical sciences presented their latest results in turbulence research. Basic as well as applied research is driven by the notoriously difficult and essentially unsolved problem of turbulence. In this collection of contributions, clear progress can be seen in different aspects, ranging from a new quality of numerical simulations to new concepts of experimental investigation and new theoretical developments. The importance of turbulence is shown for a wide range of applications, of which combustion, energy, flow control and urban flows are a few examples found in this volume. A motivation was to connect the fundamentals of turbulence with renewable energy. This led us to add a special topic on the impact of turbulence on wind energy conversion. The structure of the present book...
Integrated Inflammatory Stress (ITIS) Model
DEFF Research Database (Denmark)
Bangsgaard, Elisabeth O.; Hjorth, Poul G.; Olufsen, Mette S.
2017-01-01
... maintains a long-term level of the stress hormone cortisol, which is also anti-inflammatory. A new integrated model of the interaction between these two subsystems of the inflammatory system is proposed and coined the integrated inflammatory stress (ITIS) model. The coupling mechanisms describing ... A constant activation results in elevated levels of the variables in the model, while a prolonged change of the oscillations in ACTH and cortisol concentrations is the most pronounced result of different LPS doses predicted by the model.
"Äiti, sä oot niin paras äiti" ("Mum, you're just the best mum"): the effect of an Ihmeelliset vuodet (Incredible Years) group on parenting skills
Pylkkönen, Taru
2011-01-01
”Äiti, sä oot niin paras äiti” (”Mum, you're just the best mum”): the effect of an Ihmeelliset vuodet (Incredible Years) group on parents' parenting skills. Year 2011, 51 + 12 pages. The aim of this thesis was to examine how the parenting group run in connection with the Ihmeelliset vuodet in school project affected the parenting skills of the participating parents. The parenting groups of the Incredible Years programme are family interventions aimed at the parents of children with conduct disorders, in which the children's behaviour ...
Directory of Open Access Journals (Sweden)
Kevin D Beck
2014-11-01
Full Text Available As a model of anxiety disorder vulnerability, male Wistar-Kyoto (WKY) rats acquire lever-press avoidance behavior more readily than outbred Sprague Dawley rats, and their acquisition is enhanced by the presence of a discrete signal presented during the inter-trial intervals (ITIs), suggesting it is perceived as a safety signal. A series of experiments was conducted to determine if this is the case. Additional experiments investigated whether the avoidance facilitation relies upon processing through the medial prefrontal cortex (mPFC). The results suggest that the ITI signal facilitates acquisition during the early stages of the avoidance acquisition process, when the rats are initially acquiring escape behavior and then transitioning to avoidance behavior. Post-avoidance introduction of the visual ITI signal into other associative learning tasks failed to confirm that the visual stimulus had acquired the properties of a conditioned inhibitor. Shortening the signal from the entirety of the 3 min ITI to only its first 5 s slowed acquisition during the first 4 sessions, suggesting the flashing light is not functioning as a feedback signal. The prelimbic (PL) cortex showed greater activation during the period of training when the transition from escape responding to avoidance responding occurs. Only combined PL + infralimbic cortex lesions modestly slowed avoidance acquisition, but PL cortex lesions slowed avoidance response latencies. Thus, the flashing-light ITI signal is likely neither perceived as a safety signal nor serving as a feedback signal. The functional role of the PL cortex appears to be to increase the drive towards responding to the threat of the warning signal. Hence, avoidance susceptibility displayed by male WKY rats may be driven, in part, both by external stimuli (the ITI signal) and by enhanced threat recognition of the warning signal via the PL cortex.
ITI implants with overdentures: a prevention of bone loss in edentulous mandibles?
DEFF Research Database (Denmark)
von Wowern, N; Harder, F; Hjørting-Hansen, E
1990-01-01
Changes in the bone mineral content (BMC) of edentulous mandibles with osseointegrated ITI implants supporting overdentures were measured in vivo by dual-photon absorptiometry. The BMC measurements were performed 3 weeks postoperatively and at the 2-year follow-up visit. Measurements were made...
Peterson, V M; Madonna, G S; Vogel, S N
1992-04-01
Inheritance of the Ity^r or the Ity^s allele of the murine Ity gene confers resistance or increased susceptibility, respectively, to Salmonella typhimurium infection. Recent studies have documented that Ity gene expression may determine net intracellular replication of S. typhimurium by modulating macrophage function. The purpose of this study was to determine if Ity gene expression modulated macrophage stem cell proliferation as well. To detect possible Ity-associated alterations in macrophage stem cell proliferation during endotoxin challenge or S. typhimurium infection, the congenic strain pair BALB/c (Ity^s) and C.D2-Idh-1, Pep-3 N20F8 (Ity^r) were injected intraperitoneally with 25 micrograms of bacterial lipopolysaccharide (LPS) or approximately 10^3 S. typhimurium, and myelopoiesis was evaluated. At 72 h after LPS injection, both BALB/c and C.D2 mice developed comparable degrees of bone marrow hypocellularity and splenomegaly, and cell sizing profiles indicated a normal response to a single injection of LPS in both strains of mice. Although an inhibitor of colony-stimulating factor activity was detected in the sera and plasma of C.D2 mice, the numbers of myeloid stem cells cultured from the bone marrow and spleen of each mouse strain were comparable. S. typhimurium infection resulted in earlier symptoms, a larger bacterial load, a higher mortality rate, and greater bone marrow hypocellularity and splenomegaly in BALB/c mice compared with those in C.D2 mice. Despite a dramatic increase in bacterial load, a decrease in both bone marrow and splenic myeloid stem cell numbers was noted in BALB/c mice, while stem cell numbers remained constant in C.D2 mice between days 3 and 5 and increased dramatically at day 7 after infection. These data suggest that BALB/c and C.D2 mice may exhibit a divergent myelopoietic response to S. typhimurium infection. It appears that a paradoxical failure of myelopoiesis in Ity^s mice during S. typhimurium infection may contribute to the ...
Lerchenmueller, Marc J; Sorenson, Olav
2016-01-01
We examined the usefulness (precision) and completeness (recall) of the Author-ity author disambiguation for PubMed articles by associating articles with scientists funded by the National Institutes of Health (NIH). In doing so, we exploited established unique identifiers, Principal Investigator (PI) IDs, that the NIH assigns to funded scientists. Analyzing a set of 36,987 NIH scientists who received their first R01 grant between 1985 and 2009, we identified 355,921 articles appearing in PubMed that allowed us to evaluate the precision and recall of the Author-ity disambiguation. We found that Author-ity identified the NIH scientists with 99.51% precision across the articles, with a corresponding recall of 99.64%. Precision and recall, moreover, appeared stable across common and uncommon last names, across ethnic backgrounds, and across levels of scientist productivity.
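The precision and recall figures reported above reduce to simple count ratios over true positives, false positives, and false negatives. A minimal sketch, with counts chosen only to reproduce percentages of the same order as the study's (they are not the study's actual counts):

```python
# Precision = TP / (TP + FP): of the articles attributed to a scientist,
# how many truly belong to them. Recall = TP / (TP + FN): of the articles
# that truly belong to them, how many were attributed.
def precision_recall(true_pos, false_pos, false_neg):
    precision = true_pos / (true_pos + false_pos)
    recall = true_pos / (true_pos + false_neg)
    return precision, recall

# Illustrative counts yielding ratios near the reported 99.51% / 99.64%:
p, r = precision_recall(true_pos=9951, false_pos=49, false_neg=36)
print(f"precision={p:.2%} recall={r:.2%}")  # prints: precision=99.51% recall=99.64%
```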
The Impact of Project Management Maturity upon IT/IS Project Management Outcomes
Carcillo, Anthony Joseph, Jr.
2013-01-01
Although it is assumed that increasing the institutionalization (or maturity) of project management in an organization leads to greater project success, the literature has diverse views. The purpose of this mixed methods study was to examine the correlation between project management maturity and IT/IS project outcomes. The sample consisted of two…
Statistical electromagnetics: Complex cavities
Naus, H.W.L.
2008-01-01
A selection of the literature on the statistical description of electromagnetic fields and complex cavities is concisely reviewed. Some essential concepts, for example, the application of the central limit theorem and the maximum entropy principle, are scrutinized. Implicit assumptions, biased
Statistical mechanics of complex networks
Rubi, Miguel; Diaz-Guilera, Albert
2003-01-01
Networks provide a useful model and graphic image for the description of a wide variety of web-like structures in the physical and man-made realms, e.g. protein networks, food webs and the Internet. The contributions gathered in the present volume provide both an introduction to, and an overview of, the multifaceted phenomenology of complex networks. Statistical Mechanics of Complex Networks also provides a state-of-the-art picture of current theoretical methods and approaches.
Czech Academy of Sciences Publication Activity Database
Mansfeldová, Věra; Janda, Pavel; Tarábková, Hana
2015-01-01
Roč. 182, NOV 2015 (2015), s. 1053-1059 ISSN 0013-4686 Institutional support: RVO:61388955 Keywords : biomimetic liquid membrane * ion resolution potentiometry * ITIES Subject RIV: CG - Electrochemistry Impact factor: 4.803, year: 2015
Brånemark and ITI dental implants in the human bone-grafted maxilla: a comparative evaluation
DEFF Research Database (Denmark)
Pinholt, Else M
2003-01-01
Twelve consecutive patients received machine-surfaced Brånemark fixtures and 13 consecutive patients received SLA-ITI fixtures. Gradual loading was applied after healing abutment application. After 6 months the permanent prosthetic reconstruction was provided to the patient, either as a fixed ...
Statistical Acceptance Plan for Asphalt Pavement Construction.
1998-05-01
K. C., Freidenrich, J., and Weed, R. M. (1992). "Managing Quality: Time for a National Policy," Transportation Research Record 1340 ... Baecher, G. (1987). "Statistical Quality Control of Engineered Embankments," Contract Report GL-87-2, Waterways Experiment Station, U.S. Army Corps ...
Statistical physics of complex systems a concise introduction
Bertin, Eric
2016-01-01
This course-tested primer provides graduate students and non-specialists with a basic understanding of the concepts and methods of statistical physics and demonstrates their wide range of applications to interdisciplinary topics in the field of complex system sciences, including selected aspects of theoretical modeling in biology and the social sciences. Generally speaking, the goals of statistical physics may be summarized as follows: on the one hand to study systems composed of a large number of interacting units, and on the other to predict the macroscopic, collective behavior of the system considered from the perspective of the microscopic laws governing the dynamics of the individual entities. These two goals are essentially also shared by what is now called 'complex systems science', and as such, systems studied in the framework of statistical physics may be considered to be among the simplest examples of complex systems – while also offering a rather well developed mathematical treatment. The second ...
Kolmogorov complexity, pseudorandom generators and statistical models testing
Czech Academy of Sciences Publication Activity Database
Šindelář, Jan; Boček, Pavel
2002-01-01
Roč. 38, č. 6 (2002), s. 747-759 ISSN 0023-5954 R&D Projects: GA ČR GA102/99/1564 Institutional research plan: CEZ:AV0Z1075907 Keywords : Kolmogorov complexity * pseudorandom generators * statistical models testing Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.341, year: 2002
Statistical complexity without explicit reference to underlying probabilities
Pennini, F.; Plastino, A.
2018-06-01
We show that extremely simple systems of a not too large number of particles can be simultaneously thermally stable and complex. To this end, we extend the notion of statistical complexity to simple configurations of non-interacting particles, without appeal to probabilities, and discuss configurational properties.
A Short Tale of the Black Sheep of -ITY
Computer Security Team
2013-01-01
Once upon a time, computer engineers of the ancient world used the abbreviation “-ITY” ([eye-tee]) as a shorthand for “Information TechnologY”. It was an appropriate abbreviation as it reminded everyone of the core purposes and aspects of information technology, which made not only the computer engineers, but also their clients, happy. Whenever the engineers were programming a software application or setting up a computing service to cover the needs of their clients, they stuck to the four paradigms of -ITY: * “Functional-ITY”, i.e. ensuring that a service or application has a purpose and a justification for being; * “Availabil-ITY”, i.e. ensuring that this service or application is functional whenever a client wants to use it; * “Usabil-ITY”, i.e. ensuring that this client does not get fed up by a badly designed user interface or disappointed by the service’s or application’s ...
Effective control of complex turbulent dynamical systems through statistical functionals.
Majda, Andrew J; Qi, Di
2017-05-30
Turbulent dynamical systems characterized by both a high-dimensional phase space and a large number of instabilities are ubiquitous among complex systems in science and engineering, including climate, material, and neural science. Control of these complex systems is a grand challenge, for example, in mitigating the effects of climate change or safe design of technology with fully developed shear turbulence. Control of flows in the transition to turbulence, where there is a small dimension of instabilities about a basic mean state, is an important and successful discipline. In complex turbulent dynamical systems, it is impossible to track and control the large dimension of instabilities, which strongly interact and exchange energy, and new control strategies are needed. The goal of this paper is to propose an effective statistical control strategy for complex turbulent dynamical systems based on a recent statistical energy principle and statistical linear response theory. We illustrate the potential practical efficiency and verify this effective statistical control strategy on the 40D Lorenz 1996 model in forcing regimes with various types of fully turbulent dynamics with nearly one-half of the phase space unstable.
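The 40-dimensional Lorenz 1996 model used above as a test bed for statistical control can be sketched in a few lines. This is a generic integration under assumed parameters (forcing F = 8, a common fully turbulent choice, and a fixed-step RK4 scheme with dt = 0.005), not the paper's control implementation:

```python
import numpy as np

def lorenz96_rhs(x, forcing=8.0):
    # dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F, with cyclic indices
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def rk4_step(x, dt, forcing=8.0):
    # classical fourth-order Runge-Kutta step
    k1 = lorenz96_rhs(x, forcing)
    k2 = lorenz96_rhs(x + 0.5 * dt * k1, forcing)
    k3 = lorenz96_rhs(x + 0.5 * dt * k2, forcing)
    k4 = lorenz96_rhs(x + dt * k3, forcing)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

x = 8.0 * np.ones(40)
x[0] += 0.01                      # small perturbation to trigger chaos
for _ in range(2000):             # integrate 10 time units at dt = 0.005
    x = rk4_step(x, 0.005)
print("mean:", x.mean(), "std:", x.std())
```

Statistical control strategies of the kind proposed act on mean and variance functionals of such a trajectory ensemble rather than tracking individual unstable directions.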
Complex Data Modeling and Computationally Intensive Statistical Methods
Mantovan, Pietro
2010-01-01
The last years have seen the advent and development of many devices able to record and store an always increasing amount of complex and high dimensional data; 3D images generated by medical scanners or satellite remote sensing, DNA microarrays, real time financial data, system control datasets. The analysis of this data poses new challenging problems and requires the development of novel statistical models and computational methods, fueling many fascinating and fast growing research areas of modern statistics. The book offers a wide variety of statistical methods and is addressed to statistici
Statistical physics of networks, information and complex systems
Energy Technology Data Exchange (ETDEWEB)
Ecke, Robert E [Los Alamos National Laboratory
2009-01-01
In this project we explore the mathematical methods and concepts of statistical physics that are finding abundant applications across the scientific and technological spectrum, from soft condensed matter systems and bio-informatics to economic and social systems. Our approach exploits the considerable similarity of concepts between statistical physics and computer science, allowing for a powerful multi-disciplinary approach that draws its strength from cross-fertilization and multiple interactions of researchers with different backgrounds. The work on this project takes advantage of the newly appreciated connection between computer science and statistics and addresses important problems in data storage, decoding, optimization, the information processing properties of the brain, the interface between quantum and classical information science, the verification of large software programs, modeling of complex systems including disease epidemiology, resource distribution issues, and the nature of highly fluctuating complex systems. Common themes that the project has been emphasizing are (i) neural computation, (ii) network theory and its applications, and (iii) a statistical physics approach to information theory. The project's efforts focus on the general problem of optimization and variational techniques, algorithm development and information theoretic approaches to quantum systems. These efforts are responsible for fruitful collaborations and the nucleation of science efforts that span multiple divisions such as EES, CCS, D, T, ISR and P. This project supports the DOE mission in Energy Security and Nuclear Non-Proliferation by developing novel information science tools for communication, sensing, and interacting complex networks such as the internet or energy distribution system.
Information Geometric Complexity of a Trivariate Gaussian Statistical Model
Directory of Open Access Journals (Sweden)
Domenico Felice
2014-05-01
Full Text Available We evaluate the information geometric complexity of entropic motion on low-dimensional Gaussian statistical manifolds in order to quantify how difficult it is to make macroscopic predictions about systems in the presence of limited information. Specifically, we observe that the complexity of such entropic inferences not only depends on the amount of available pieces of information but also on the manner in which such pieces are correlated. Finally, we uncover that, for certain correlational structures, the impossibility of reaching the most favorable configuration from an entropic inference viewpoint seems to lead to an information geometric analog of the well-known frustration effect that occurs in statistical physics.
A Concise Introduction to the Statistical Physics of Complex Systems
Bertin, Eric
2012-01-01
This concise primer (based on lectures given at summer schools on complex systems and on a masters degree course in complex systems modeling) will provide graduate students and newcomers to the field with the basic knowledge of the concepts and methods of statistical physics and its potential for application to interdisciplinary topics. Indeed, in recent years, statistical physics has begun to attract the interest of a broad community of researchers in the field of complex system sciences, ranging from biology to the social sciences, economics and computer science. More generally, a growing number of graduate students and researchers feel the need to learn some basic concepts and questions originating in other disciplines without necessarily having to master all of the corresponding technicalities and jargon. Generally speaking, the goals of statistical physics may be summarized as follows: on the one hand to study systems composed of a large number of interacting ‘entities’, and on the other to predict...
PREFACE: Statistical Physics of Complex Fluids
Golestanian, R.; Khajehpour, M. R. H.; Kolahchi, M. R.; Rouhani, S.
2005-04-01
The field of complex fluids is a rapidly developing, highly interdisciplinary field that brings together people from a plethora of backgrounds such as mechanical engineering, chemical engineering, materials science, applied mathematics, physics, chemistry and biology. In this melting pot of science, the traditional boundaries of various scientific disciplines have been set aside. It is this very property of the field that has guaranteed its richness and prosperity since the final decade of the 20th century and into the 21st. The C3 Commission of the International Union of Pure and Applied Physics (IUPAP), which is the commission for statistical physics that organizes the international STATPHYS conferences, encourages various, more focused, satellite meetings to complement the main event. For the STATPHYS22 conference in Bangalore (July 2004), Iran was recognized by the STATPHYS22 organizers as suitable to host such a satellite meeting and the Institute for Advanced Studies in Basic Sciences (IASBS) was chosen to be the site of this meeting. It was decided to organize a meeting in the field of complex fluids, which is a fairly developed field in Iran. This international meeting, and an accompanying summer school, were intended to boost international connections for both the research groups working in Iran, and several other groups working in the Middle East, South Asia and North Africa. The meeting, entitled `Statistical Physics of Complex Fluids' was held at the Institute for Advanced Studies in Basic Sciences (IASBS) in Zanjan, Iran, from 27 June to 1 July 2004. The main topics discussed at the meeting included: biological statistical physics, wetting and microfluidics, transport in complex media, soft and granular matter, and rheology of complex fluids. At this meeting, 22 invited lectures by eminent scientists were attended by 107 participants from different countries. The poster session consisted of 45 presentations which, in addition to the main topics of the
Statistical screening of input variables in a complex computer code
International Nuclear Information System (INIS)
Krieger, T.J.
1982-01-01
A method is presented for "statistical screening" of input variables in a complex computer code. The object is to determine the "effective" or important input variables by estimating the relative magnitudes of their associated sensitivity coefficients. This is accomplished by performing a numerical experiment consisting of a relatively small number of computer runs with the code, followed by a statistical analysis of the results. A formula for estimating the sensitivity coefficients is derived. Reference is made to an earlier work in which the method was applied to a complex reactor code with good results.
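The screening idea, a modest number of runs followed by a statistical analysis, can be illustrated with a toy stand-in for the expensive code. The least-squares regression below is a generic estimator of sensitivity coefficients, not the specific formula derived in the paper, and the 5-input model is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def complex_code(x):
    # stand-in for an expensive computer code with 5 inputs,
    # of which only x0 and x3 really matter
    return 4.0 * x[0] - 2.5 * x[3] + 0.01 * x[1]

n_runs, n_inputs = 30, 5
X = rng.uniform(-1, 1, size=(n_runs, n_inputs))      # randomized design
y = np.array([complex_code(row) for row in X])       # the "computer runs"

# least-squares fit y ~ X b + c; |b_i| estimates the sensitivity of input i
coef, *_ = np.linalg.lstsq(np.c_[X, np.ones(n_runs)], y, rcond=None)
ranking = np.argsort(-np.abs(coef[:n_inputs]))
print("inputs ranked by estimated sensitivity:", ranking)
```

With 30 runs over 5 inputs the regression cleanly separates the two effective variables (x0 and x3) from the negligible ones, which is the screening outcome the method aims for.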
Directory of Open Access Journals (Sweden)
Accarat Chaoumead
2012-01-01
Full Text Available Transparent conductive titanium-doped indium oxide (ITiO) films were deposited on Corning glass substrates by the RF magnetron sputtering method. The effects of RF sputtering power and Ar gas pressure on the structural and electrical properties of the films were investigated experimentally, using a 2.5 wt% TiO2-doped In2O3 target. The deposition rate was in the range of around 20~60 nm/min under the experimental conditions of 5~20 mTorr of gas pressure and 220~350 W of RF power. The lowest volume resistivity of 1.2×10−4 Ω-cm and an average optical transmittance of 75% were obtained for the ITiO film prepared at an RF power of 300 W and an Ar gas pressure of 15 mTorr. This volume resistivity of 1.2×10−4 Ω-cm is low enough for a transparent conducting layer in various electro-optical devices, and it is comparable with that of ITO or ZnO:Al conducting layers.
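For a transparent electrode, the reported volume resistivity translates into a sheet resistance via R_s = ρ / t. The 300 nm film thickness below is an assumed value for illustration only; the abstract reports deposition rates, not a final thickness:

```python
# Sheet resistance of a thin conducting film: R_s = rho / t (ohms per square).
def sheet_resistance(resistivity_ohm_cm, thickness_nm):
    thickness_cm = thickness_nm * 1e-7   # 1 nm = 1e-7 cm
    return resistivity_ohm_cm / thickness_cm

# 1.2e-4 ohm-cm at an assumed 300 nm gives about 4 ohm/sq
print(sheet_resistance(1.2e-4, 300.0))
```

A few ohms per square is in the range typically quoted for good ITO electrodes, which is why the authors call the resistivity "low enough" for electro-optical devices.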
Hämmerle, C H; Brägger, U; Bürgin, W; Lang, N P
1996-06-01
In order to achieve esthetically more satisfying results, it has been proposed to place ITI implants with the border between their rough and smooth surfaces below the level of the alveolar crest, thereby obtaining a submucosally located implant shoulder following healing. The aim of the present experimental study was to clinically and radiographically evaluate the tissue response to the placement of one-stage transmucosal implants with the border between the rough and the smooth surfaces sunk by 1 mm into a subcrestal location. 11 patients underwent comprehensive dental care including the placement of 2 implants of the ITI Dental Implant System in the same quadrant (test and control). Randomly assigned control implants were placed according to the manufacturer's instructions, i.e. with the border between the rough titanium plasma-sprayed and the smooth polished surfaces precisely at the alveolar crest. At the test implant the apical border of the polished surface was placed approximately 1 mm below the alveolar crest. Probing bone levels were assessed at implant placement (baseline) and 4 and 12 months later. Modified plaque and modified gingival indices were recorded at 1, 2, 3, 4 and 12 months. Clinical probing depth and "attachment" levels were measured at 4 and 12 months. All parameters were assessed at 6 sites around each implant. The mean for each implant was calculated and used for analysis. The Wilcoxon matched-pairs signed-rank test and the Student t-test were applied to detect differences over time and between the test and control implants. At baseline, a mean difference in probing bone level of -0.86 mm (SD 0.43 mm, p ...) ... placed more deeply. Both test and control implants lost a significant amount of clinical bone height during the first 4 months (test 1.16 mm, p ...) ... placed under standard conditions, the bone adjacent to the polished surface of more deeply placed ITI implants is also lost over time. From a biological point of view, the placement of the border between ...
Polychronakos fractional statistics with a complex-valued parameter
International Nuclear Information System (INIS)
Rovenchak, Andrij
2012-01-01
A generalization of quantum statistics is proposed in a fashion similar to the suggestion of Polychronakos [Phys. Lett. B 365, 202 (1996)] with the parameter α varying between −1 (fermionic case) and +1 (bosonic case). However, unlike the original formulation, it is suggested that intermediate values are located on the unit circle in the complex plane. In doing so one can avoid the case α = 0 corresponding to the Boltzmann statistics, which is not a quantum one. The limits of α → +1 and α → −1 reproducing small deviations from the Bose and Fermi statistics, respectively, are studied in detail. The equivalence between the statistics parameter and a possible dissipative part of the excitation spectrum is established. The case of a non-conserving number of excitations is analyzed. It is defined from the condition that the real part of the chemical potential equals zero. Thermodynamic quantities of a model system of two-dimensional harmonic oscillators are calculated.
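The abstract does not quote the distribution function; assuming the standard Polychronakos form n(ε) = 1/(e^{(ε−μ)/T} − α), a minimal numerical sketch (all parameter values illustrative) shows how a complex-valued α on the unit circle interpolates between the Fermi (α = −1) and Bose (α = +1) limits, with the real part of n taken as the physical occupation:

```python
import numpy as np

def occupation(eps, mu, T, alpha):
    """Polychronakos-type mean occupation n = 1/(exp((eps-mu)/T) - alpha).

    alpha = +1 gives Bose-Einstein, alpha = -1 Fermi-Dirac; intermediate
    values on the unit circle alpha = exp(i*pi*phi) are complex-valued."""
    return 1.0 / (np.exp((eps - mu) / T) - alpha)

eps, mu, T = 1.0, 0.2, 0.5           # illustrative energy, potential, temperature
n_bose  = occupation(eps, mu, T, +1.0)
n_fermi = occupation(eps, mu, T, -1.0)

# a point on the unit circle between the two limits:
alpha_c = np.exp(1j * np.pi * 0.25)
n_c = occupation(eps, mu, T, alpha_c)

assert np.isclose(n_bose,  1.0 / (np.exp(1.6) - 1.0))
assert np.isclose(n_fermi, 1.0 / (np.exp(1.6) + 1.0))
# the real part lies between the Fermi and Bose limits at this point:
assert n_fermi < n_c.real < n_bose
print(n_bose, n_fermi, n_c.real)
```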
Statistical emission of complex fragments from highly excited compound nucleus
International Nuclear Information System (INIS)
Matsuse, T.
1991-01-01
A full statistical analysis has been given in terms of the Extended Hauser-Feshbach method. The charge and kinetic energy distributions of the 35Cl + 12C reaction at E_lab = 180, 200 MeV and the 23Na + 24Mg reaction at E_lab = 89 MeV, which form the 47V compound nucleus, are investigated as a prototype of the light-mass system. The measured kinetic energy distributions of the complex fragments are shown to be well reproduced by the Extended Hauser-Feshbach method, so the observed complex fragment production is understood as statistical binary decay from the compound nucleus induced by heavy-ion reaction. Next, this method is applied to the study of complex fragment production from the 111In compound nucleus which is formed by the 84Kr + 27Al reaction at E_lab = 890 MeV. (K.A.) 18 refs., 10 figs
Measuring streetscape complexity based on the statistics of local contrast and spatial frequency.
Directory of Open Access Journals (Sweden)
André Cavalcante
Streetscapes are basic urban elements which play a major role in the livability of a city. The visual complexity of streetscapes is known to influence how people behave in such built spaces. However, how and which characteristics of a visual scene influence our perception of complexity have yet to be fully understood. This study proposes a method to evaluate the complexity perceived in streetscapes based on the statistics of local contrast and spatial frequency. Here, 74 streetscape images from four cities, including daytime and nighttime scenes, were ranked for complexity by 40 participants. Image processing was then used to locally segment contrast and spatial frequency in the streetscapes. The statistics of these characteristics were extracted and later combined to form a single objective measure. The direct use of statistics revealed structural or morphological patterns in streetscapes related to the perception of complexity. Furthermore, in comparison to conventional measures of visual complexity, the proposed objective measure exhibits a higher correlation with the opinion of the participants. Also, the performance of this method is more robust regarding different time scenarios.
Operation Statistics of the CERN Accelerators Complex for 2003
CERN. Geneva; Baird, S A; Rey, A; Steerenberg, R; CERN. Geneva. AB Department
2004-01-01
This report gives an overview of the performance of the different Accelerators (Linacs, PS Booster, PS, AD and SPS) of the CERN Accelerator Complex for 2003. It includes scheduled activities, beam availabilities, beam intensities and an analysis of faults and breakdowns by system and by beam. More information is available via the OP Statistics Tool: http://eLogbook.web.cern.ch/eLogbook/statistics.php and on the SPS HomePage: http://ab-div-op-sps.web.cern.ch/ab-div-op-sps/SPSss.html
Foundations of Complex Systems Nonlinear Dynamics, Statistical Physics, and Prediction
Nicolis, Gregoire
2007-01-01
Complexity is emerging as a post-Newtonian paradigm for approaching a large body of phenomena of concern at the crossroads of physical, engineering, environmental, life and human sciences from a unifying point of view. This book outlines the foundations of modern complexity research as it arose from the cross-fertilization of ideas and tools from nonlinear science, statistical physics and numerical simulation. It is shown how these developments lead to an understanding, both qualitative and quantitative, of the complex systems encountered in nature and in everyday experience and, conversely, h
International Nuclear Information System (INIS)
Onchi, T; Fujisawa, A; Sanpei, A; Himura, H; Masamune, S
2017-01-01
Permutation entropy and statistical complexity are measures for complex time series. The Bandt–Pompe methodology evaluates the probability distribution using permutations. The method is robust and effective for quantifying the information content of time series data. Statistical complexity is the product of the Jensen–Shannon divergence and the permutation entropy. These physical parameters are introduced to analyse time series of emission and magnetic fluctuations in low-aspect-ratio reversed-field pinch (RFP) plasma. The observed time-series data aggregate in a region of the plane, the so-called C–H plane, determined by entropy versus complexity. The C–H plane is a representation space used for distinguishing periodic, chaotic, stochastic and noisy processes of time series data. The characteristics of the emissions and magnetic fluctuations change under different RFP-plasma conditions. The statistical complexities of soft x-ray emissions and magnetic fluctuations depend on the relationships between reversal and pinch parameters. (paper)
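The Bandt–Pompe construction described above can be sketched in a few lines (embedding dimension and test signal are illustrative, not the paper's plasma data): count ordinal patterns, normalize the Shannon entropy by ln(d!), and multiply by the normalized Jensen–Shannon divergence to the uniform distribution to obtain the C–H plane coordinates.

```python
import math
import random
from itertools import permutations
from collections import Counter

def ordinal_distribution(x, d=3):
    """Bandt-Pompe: probability of each ordinal pattern of length d."""
    patterns = Counter(
        tuple(sorted(range(d), key=lambda k: x[i + k]))
        for i in range(len(x) - d + 1)
    )
    n = sum(patterns.values())
    # include zero-probability patterns so the support has size d!
    return [patterns.get(p, 0) / n for p in permutations(range(d))]

def shannon(p):
    return -sum(q * math.log(q) for q in p if q > 0)

def entropy_complexity(x, d=3):
    """Normalized permutation entropy H and Jensen-Shannon statistical
    complexity C = Q_J * H (the coordinates of the C-H plane)."""
    P = ordinal_distribution(x, d)
    N = len(P)                       # d! possible patterns
    H = shannon(P) / math.log(N)
    Pe = [1.0 / N] * N               # uniform reference distribution
    M = [(p + q) / 2 for p, q in zip(P, Pe)]
    JS = shannon(M) - shannon(P) / 2 - shannon(Pe) / 2
    # maximum possible Jensen-Shannon divergence, used for normalization
    JS_max = -0.5 * ((N + 1) / N * math.log(N + 1)
                     - 2 * math.log(2 * N) + math.log(N))
    return H, JS / JS_max * H

random.seed(0)
noise = [random.random() for _ in range(5000)]
H, C = entropy_complexity(noise)
print(H, C)  # white noise sits near (H ~ 1, C ~ 0) on the C-H plane
```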
Editorial for special issue of AJIS on Green IT/IS (Sustainable Computing)
Directory of Open Access Journals (Sweden)
Michael Steven Lane
2011-03-01
We are pleased to present this AJIS Special Issue on Green IT/IS (Sustainable Computing). There are five papers published in this special issue of the AJIS, reflecting the diversity of this emerging and important area of research in Information Systems. Environmental sustainability is one of, if not the, most important challenges facing organisations and society in the 21st century. Information systems and information technology have a major role to play, both in reducing their own environmental impact and in providing the systems and technological innovation to reduce the environmental impact of organisations. Currently there is a lack of rigorous, theory- and evidence-based empirical studies to provide a sound basis for understanding green IT best practices and how these can best be adopted in organisations. This special issue of the AJIS contributes to filling this gap in the knowledge concerning green IS and IT with five empirical research papers, each examining a different aspect of green IS and IT.
IT/IS plus E: exploring the need for e-integration
Miele, Renato; Gunasekaran, Angappa; Yusuf, Yahaya Y.
2000-10-01
The change in IT/IS strategy is about the Internet becoming a major part of the corporate environment and driving decisions more and more. Companies of all sizes and industries can fully engage employees, customers and partners to capitalize upon the new Internet economy. They can optimize supply chains, manage strategic relationships, reduce time to market, share vital information, and increase productivity and shareholder value. Remaining competitive in today's rapidly changing global marketplace requires fast action. The problem is now how much, how soon, and what kind of Internet-based components are essential for companies to be successful, and how the adoption of E-Integration can become a critical component of a company's survival in an increasingly competitive environment. How information, knowledge and innovation processes can drive business success are fundamental notions for the information-based economy, which have been extensively researched and confirmed throughout the IT revolution. The new capabilities to use the Internet to supply large amounts of relevant information from multiple internal and external sources make it possible to move from isolated Information Systems toward an integrated environment in every business organization. The article addresses how E-Integration must link together data from multiple sources, providing a seamless system, fully interoperable with the pre-existing IT environment, totally scalable and upgradeable.
Statistical complexity is maximized in a small-world brain.
Directory of Open Access Journals (Sweden)
Teck Liang Tan
In this paper, we study a network of Izhikevich neurons to explore what it means for a brain to be at the edge of chaos. To do so, we first constructed the phase diagram of a single Izhikevich excitatory neuron, and identified a small region of the parameter space where we find a large number of phase boundaries to serve as our edge of chaos. We then couple the outputs of these neurons directly to the parameters of other neurons, so that the neuron dynamics can drive transitions from one phase to another on an artificial energy landscape. Finally, we measure the statistical complexity of the parameter time series, while the network is tuned from a regular network to a random network using the Watts-Strogatz rewiring algorithm. We find that the statistical complexity of the parameter dynamics is maximized when the neuron network is most small-world-like. Our results suggest that the small-world architecture of neuron connections in brains is not accidental, but may be related to the information processing that they do.
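The Watts–Strogatz rewiring used to tune the network can be sketched from scratch (network size, degree and seeds are illustrative; this is the standard algorithm, not the paper's specific neuron coupling): start from a ring lattice and rewire each edge with probability p, then track the clustering coefficient that the rewiring destroys.

```python
import random

def watts_strogatz(n, k, p, seed=1):
    """Ring lattice of n nodes, each linked to its k nearest neighbours
    (k even), with each edge then rewired with probability p."""
    rng = random.Random(seed)
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(1, k // 2 + 1):
            adj[i].add((i + j) % n)
            adj[(i + j) % n].add(i)
    for i in range(n):
        for j in range(1, k // 2 + 1):
            old = (i + j) % n
            if rng.random() < p and old in adj[i]:
                # rewire (i, old) to a uniformly chosen non-neighbour
                choices = [v for v in range(n) if v != i and v not in adj[i]]
                new = rng.choice(choices)
                adj[i].discard(old); adj[old].discard(i)
                adj[i].add(new); adj[new].add(i)
    return adj

def clustering(adj):
    """Average local clustering coefficient."""
    total = 0.0
    for i, nbrs in adj.items():
        nb = list(nbrs)
        if len(nb) < 2:
            continue
        links = sum(1 for a in range(len(nb)) for b in range(a + 1, len(nb))
                    if nb[b] in adj[nb[a]])
        total += 2.0 * links / (len(nb) * (len(nb) - 1))
    return total / len(adj)

regular = watts_strogatz(100, 4, 0.0)   # p = 0: regular ring lattice
random_g = watts_strogatz(100, 4, 1.0)  # p = 1: fully rewired
print(clustering(regular), clustering(random_g))
```

For a ring lattice with 4 neighbours per node the clustering coefficient is exactly 0.5; full rewiring drives it down to roughly k/n.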
PREFACE: Counting Complexity: An international workshop on statistical mechanics and combinatorics
de Gier, Jan; Warnaar, Ole
2006-07-01
On 10-15 July 2005 the conference `Counting Complexity: An international workshop on statistical mechanics and combinatorics' was held on Dunk Island, Queensland, Australia in celebration of Tony Guttmann's 60th birthday. Dunk Island provided the perfect setting for engaging in almost all of Tony's life-long passions: swimming, running, food, wine and, of course, plenty of mathematics and physics. The conference was attended by many of Tony's close scientific friends from all over the world, and most talks were presented by his past and present collaborators. This volume contains the proceedings of the meeting and consists of 24 refereed research papers in the fields of statistical mechanics, condensed matter physics and combinatorics. These papers provide an excellent illustration of the breadth and scope of Tony's work. The very first contribution, written by Stu Whittington, contains an overview of the many scientific achievements of Tony over the past 40 years in mathematics and physics. The organizing committee, consisting of Richard Brak, Aleks Owczarek, Jan de Gier, Emma Lockwood, Andrew Rechnitzer and Ole Warnaar, gratefully acknowledges the Australian Mathematical Society (AustMS), the Australian Mathematical Sciences Institute (AMSI), the ARC Centre of Excellence for Mathematics and Statistics of Complex Systems (MASCOS), the ARC Complex Open Systems Research Network (COSNet), the Institute of Physics (IOP) and the Department of Mathematics and Statistics of The University of Melbourne for financial support in organizing the conference. Tony, we hope that your future years in mathematics will be numerous. Count yourself lucky! Tony Guttmann
Statistical mechanics of complex neural systems and high dimensional data
International Nuclear Information System (INIS)
Advani, Madhu; Lahiri, Subhaneil; Ganguli, Surya
2013-01-01
Recent experimental advances in neuroscience have opened new vistas into the immense complexity of neuronal networks. This proliferation of data challenges us on two parallel fronts. First, how can we form adequate theoretical frameworks for understanding how dynamical network processes cooperate across widely disparate spatiotemporal scales to solve important computational problems? Second, how can we extract meaningful models of neuronal systems from high dimensional datasets? To aid in these challenges, we give a pedagogical review of a collection of ideas and theoretical methods arising at the intersection of statistical physics, computer science and neurobiology. We introduce the interrelated replica and cavity methods, which originated in statistical physics as powerful ways to quantitatively analyze large highly heterogeneous systems of many interacting degrees of freedom. We also introduce the closely related notion of message passing in graphical models, which originated in computer science as a distributed algorithm capable of solving large inference and optimization problems involving many coupled variables. We then show how both the statistical physics and computer science perspectives can be applied in a wide diversity of contexts to problems arising in theoretical neuroscience and data analysis. Along the way we discuss spin glasses, learning theory, illusions of structure in noise, random matrices, dimensionality reduction and compressed sensing, all within the unified formalism of the replica method. Moreover, we review recent conceptual connections between message passing in graphical models, and neural computation and learning. Overall, these ideas illustrate how statistical physics and computer science might provide a lens through which we can uncover emergent computational functions buried deep within the dynamical complexities of neuronal networks. (paper)
Statistical analysis of complex systems with nonclassical invariant measures
Fratalocchi, Andrea
2011-02-28
I investigate the problem of finding a statistical description of a complex many-body system whose invariant measure cannot be constructed stemming from classical thermodynamics ensembles. By taking solitons as a reference system and by employing a general formalism based on the Ablowitz-Kaup-Newell-Segur scheme, I demonstrate how to build an invariant measure and, within a one-dimensional phase space, how to develop a suitable thermodynamics. A detailed example is provided with a universal model of wave propagation, with reference to a transparent potential sustaining gray solitons. The system shows a rich thermodynamic scenario, with a free-energy landscape supporting phase transitions and controllable emergent properties. I finally discuss the origin of such behavior, trying to identify common denominators in the area of complex dynamics.
Wismeijer, D; van Waas, MAJ; Mulder, J; Vermeeren, JIJF; Kalk, W
In a randomized controlled clinical trial carried out at the Ignatius teaching hospital in Breda, The Netherlands, 110 edentulous patients with severe mandibular bone loss were treated with implants of the ITI® Dental Implant System using 3 different treatment strategies: a mandibular overdenture
Statistics without Tears: Complex Statistics with Simple Arithmetic
Smith, Brian
2011-01-01
One of the often overlooked aspects of modern statistics is the analysis of time series data. Modern introductory statistics courses tend to rush to probabilistic applications involving risk and confidence. Rarely does the first level course linger on such useful and fascinating topics as time series decomposition, with its practical applications…
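The course material itself is not reproduced in the abstract; as an illustration of "complex statistics with simple arithmetic", a classical additive time-series decomposition can be done with nothing but sums and averages (the toy series and period below are assumptions, not the book's data):

```python
def decompose(x, period):
    """Classical additive decomposition: centred moving-average trend,
    then seasonal components as averages of the detrended values."""
    n, half = len(x), period // 2
    trend = [None] * n
    for i in range(half, n - half):
        if period % 2 == 0:   # even period: half-weight the end points
            acc = (0.5 * x[i - half] + sum(x[i - half + 1:i + half])
                   + 0.5 * x[i + half])
        else:
            acc = sum(x[i - half:i + half + 1])
        trend[i] = acc / period
    detrended = [(i, x[i] - trend[i]) for i in range(n) if trend[i] is not None]
    seasonal = []
    for phase in range(period):
        vals = [d for i, d in detrended if i % period == phase]
        seasonal.append(sum(vals) / len(vals))
    mean_s = sum(seasonal) / period
    return trend, [m - mean_s for m in seasonal]   # centre to zero mean

# toy series: linear trend plus a zero-mean seasonal pattern of period 4
season = [3.0, -1.0, -2.0, 0.0]
series = [0.5 * t + season[t % 4] for t in range(40)]
trend, seasonal = decompose(series, 4)
print(seasonal)   # recovers approximately [3, -1, -2, 0]
```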
Florkowska, Lucyna; Bryt-Nitarska, Izabela
2018-04-01
The notion of Integrated Territorial Investments (ITI) appears more and more frequently in contemporary regional development strategies. Formulating the main assumptions of ITI is a response to a growing need for a co-ordinated, multi-dimensional regional development suitable for the characteristics of a given area. Activities are mainly aimed at improving people's quality of life with their significant participation. These activities include implementing the Sustainable Development Goals (SDGs). Territorial investments include, among others, projects in areas where land and building use is governed not only by general regulations (Spatial Planning and Land Development Act) but also by separate legal acts. This issue also concerns areas with active mines and post-mining areas undergoing revitalization. For the areas specified above, land development and in particular making building investments is subject to the requirements set forth in the Geological and Mining Law and in the general regulations. In practice this means that factors connected with the present and future mining impacts must be taken into consideration in planning the investment process. This article discusses the role of proper assessment of local geological conditions as well as the current and future mining situation in the context of proper planning and performance of the Integrated Territorial Investment programme, and also in the context of implementing the SDGs. It also describes the technical and legislative factors which need to be taken into consideration in areas where mining is planned or where it took place in the past.
Murayama, Shogo; Kinugawa, Hikaru; Tokuda, Isao T.; Gotoda, Hiroshi
2018-02-01
We present an experimental study on the characterization of the dynamic behavior of the flow velocity field during thermoacoustic combustion oscillations in a turbulent confined combustor from the viewpoints of statistical complexity and complex-network theory, including the detection of a precursor of thermoacoustic combustion oscillations. The multiscale complexity-entropy causality plane clearly shows the possible presence of two dynamics, noisy periodic oscillations and noisy chaos, in the shear layer regions (1) between the outer recirculation region in the dump plate and a recirculation flow in the wake of the centerbody and (2) between the outer recirculation region in the dump plate and a vortex breakdown bubble away from the centerbody. The vertex strength in the turbulence network and the community structure of the vorticity field can identify the vortical interactions during thermoacoustic combustion oscillations. Sequential horizontal visibility graph motifs are useful for capturing a precursor of thermoacoustic combustion oscillations.
Statistics of Shared Components in Complex Component Systems
Mazzolini, Andrea; Gherardi, Marco; Caselle, Michele; Cosentino Lagomarsino, Marco; Osella, Matteo
2018-04-01
Many complex systems are modular. Such systems can be represented as "component systems," i.e., sets of elementary components, such as LEGO bricks in LEGO sets. The bricks found in a LEGO set reflect a target architecture, which can be built following a set-specific list of instructions. In other component systems, instead, the underlying functional design and constraints are not obvious a priori, and their detection is often a challenge of both scientific and practical importance, requiring a clear understanding of component statistics. Importantly, some quantitative invariants appear to be common to many component systems, most notably a common broad distribution of component abundances, which often resembles the well-known Zipf's law. Such "laws" affect in a general and nontrivial way the component statistics, potentially hindering the identification of system-specific functional constraints or generative processes. Here, we specifically focus on the statistics of shared components, i.e., the distribution of the number of components shared by different system realizations, such as the common bricks found in different LEGO sets. To account for the effects of component heterogeneity, we consider a simple null model, which builds system realizations by random draws from a universe of possible components. Under general assumptions on abundance heterogeneity, we provide analytical estimates of component occurrence, which quantify exhaustively the statistics of shared components. Surprisingly, this simple null model can positively explain important features of empirical component-occurrence distributions obtained from large-scale data on bacterial genomes, LEGO sets, and book chapters. Specific architectural features and functional constraints can be detected from occurrence patterns as deviations from these null predictions, as we show for the illustrative case of the "core" genome in bacteria.
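The null model described above — system realizations built by random draws from a universe of components with heterogeneous (Zipf-like) abundances — can be simulated directly; all sizes below are illustrative, not the paper's genome or LEGO data:

```python
import random

random.seed(42)
U = 500   # universe of component types (e.g. distinct bricks)
R = 100   # number of realizations (e.g. LEGO sets)
S = 50    # components drawn per realization

# Zipf-like abundances over the universe, as the null model assumes
# heterogeneous component frequencies
weights = [1.0 / (rank + 1) for rank in range(U)]

def draw_realization():
    """One realization: S weighted draws, kept as a set of components."""
    return set(random.choices(range(U), weights=weights, k=S))

realizations = [draw_realization() for _ in range(R)]

# occurrence o_j = number of realizations that contain component j;
# its distribution across j is the statistics of shared components
occurrence = [sum(j in r for r in realizations) for j in range(U)]

top = occurrence[0]        # most abundant component: shared by nearly all
rare = occurrence[U - 1]   # least abundant component: shared by very few
print(top, rare)
```

Even in this purely random null model, abundance heterogeneity alone produces a broad occurrence distribution, which is why deviations from it — rather than the broad shape itself — signal functional constraints.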
Czech Academy of Sciences Publication Activity Database
Silver, Barry Richard; Holub, Karel; Mareček, Vladimír
2017-01-01
Vol. 784, JAN 2017 (2017), pp. 1-5 ISSN 1572-6657 R&D Projects: GA ČR GA13-04630S Institutional support: RVO:61388955 Keywords: Ion transfer kinetics * ITIES * Tetraethylammonium ion Subject RIV: CG - Electrochemistry OBOR OECD: Electrochemistry (dry cells, batteries, fuel cells, corrosion metals, electrolysis) Impact factor: 3.012, year: 2016
Time interval between successive trading in foreign currency market: from microscopic to macroscopic
Sato, Aki-Hiro
2004-12-01
Recently, it has been shown that the inter-transaction interval (ITI) distribution of foreign currency rates has a fat tail. In order to understand the statistical properties of the ITI, a dealer model with N interacting agents is proposed. From numerical simulations it is confirmed that the ITI distribution of the dealer model has a power-law tail. A random multiplicative process (RMP) can be approximately derived from the ITI of the dealer model. Consequently, we conclude that the power-law tail of the ITI distribution of the dealer model is a result of the RMP.
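The paper's dealer model is not reproduced here, but the mechanism it invokes — a random multiplicative process generating a power-law tail — can be illustrated with a generic Kesten-type recursion x ← a·x + b, whose stationary distribution is known to be heavy-tailed when E[ln a] < 0 but multiplicative shocks occasionally exceed 1 (all parameter values are illustrative):

```python
import random

random.seed(7)

def kesten_samples(n, burn=1000):
    """Random multiplicative process x <- a*x + 1 with E[ln a] < 0.
    The additive term keeps x away from 0; the stationary distribution
    develops a power-law tail."""
    x, out = 1.0, []
    for t in range(n + burn):
        a = random.choice((0.5, 1.5))   # multiplicative shock
        x = a * x + 1.0
        if t >= burn:
            out.append(x)
    return out

xs = sorted(kesten_samples(100000))
median = xs[len(xs) // 2]
p999 = xs[int(0.999 * len(xs))]
# for an exponential tail p999/median would be about 10;
# the multiplicative process gives a far larger ratio
print(median, p999, p999 / median)
```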
STATISTICAL ANALYSIS OF RAW SUGAR MATERIAL FOR SUGAR PRODUCER COMPLEX
A. A. Gromkovskii; O. I. Sherstyuk
2015-01-01
Summary. The article examines statistical data on the development of the average weight and average sugar content of sugar beet roots. The successful solution of the problem of forecasting these raw-material indices is essential for solving control problems of the sugar-producing complex. By calculating the autocorrelation function, it is demonstrated that the trend component predominates in the growth of the raw-material characteristics. To construct the prediction model, it is proposed to use an autoregressive fir...
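The diagnostic used above — a slowly decaying autocorrelation function as the signature of a dominant trend component — is easy to sketch; the sample autocorrelation of a trending series stays near 1 at large lags, while for pure noise it collapses immediately (the toy series below are assumptions, not the sugar-beet data):

```python
import random

def acf(x, max_lag):
    """Sample autocorrelation function at lags 1..max_lag."""
    n = len(x)
    mean = sum(x) / n
    c0 = sum((v - mean) ** 2 for v in x) / n
    out = []
    for k in range(1, max_lag + 1):
        ck = sum((x[t] - mean) * (x[t + k] - mean) for t in range(n - k)) / n
        out.append(ck / c0)
    return out

random.seed(3)
trended = [0.1 * t + random.gauss(0, 1) for t in range(300)]  # trend dominates
noise = [random.gauss(0, 1) for t in range(300)]              # no trend

acf_trend = acf(trended, 10)
acf_noise = acf(noise, 10)
# a dominant trend keeps the ACF high even at lag 10,
# while for pure noise it is already near 0
print(acf_trend[9], acf_noise[9])
```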
Spectral statistics and scattering resonances of complex primes arrays
Wang, Ren; Pinheiro, Felipe A.; Dal Negro, Luca
2018-01-01
We introduce a class of aperiodic arrays of electric dipoles generated from the distribution of prime numbers in complex quadratic fields (Eisenstein and Gaussian primes) as well as quaternion primes (Hurwitz and Lifschitz primes), and study the nature of their scattering resonances using the vectorial Green's matrix method. In these systems we demonstrate several distinctive spectral properties, such as the absence of level repulsion in the strongly scattering regime, critical statistics of level spacings, and the existence of critical modes, which are extended fractal modes with long lifetimes not supported by either random or periodic systems. Moreover, we show that one can predict important physical properties, such as the existence of spectral gaps, by analyzing the eigenvalue distribution of the Green's matrix of the arrays in the complex plane. Our results unveil the importance of aperiodic correlations in prime number arrays for the engineering of gapped photonic media that support far richer mode localization and spectral properties compared to usual periodic and random media.
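Generating one such array is straightforward from the standard characterization of Gaussian primes (the grid size below is illustrative; the Green's matrix analysis itself is not reproduced): a + bi with a, b both nonzero is prime in Z[i] exactly when a² + b² is a rational prime, while on the axes the nonzero part must be a prime congruent to 3 mod 4.

```python
def is_prime(n):
    """Trial-division primality test (adequate for small norms)."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def is_gaussian_prime(a, b):
    """Primality of a + bi in Z[i]: for a, b both nonzero, a^2 + b^2 must
    be a rational prime; on the axes, the nonzero coordinate must have a
    prime absolute value congruent to 3 (mod 4)."""
    if a == 0:
        return is_prime(abs(b)) and abs(b) % 4 == 3
    if b == 0:
        return is_prime(abs(a)) and abs(a) % 4 == 3
    return is_prime(a * a + b * b)

# dipole positions of a small Gaussian-prime array:
sites = [(a, b) for a in range(-10, 11) for b in range(-10, 11)
         if is_gaussian_prime(a, b)]
print(len(sites))
assert (1, 1) in sites      # 1 + i has norm 2, which is prime
assert (3, 0) in sites      # 3 is prime and 3 = 3 (mod 4)
assert (5, 0) not in sites  # 5 = (2 + i)(2 - i) splits in Z[i]
```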
Second-Order Statistics for Wave Propagation through Complex Optical Systems
DEFF Research Database (Denmark)
Yura, H.T.; Hanson, Steen Grüner
1989-01-01
Closed-form expressions are derived for various statistical functions that arise in optical propagation through arbitrary optical systems that can be characterized by a complex ABCD matrix in the presence of distributed random inhomogeneities along the optical path. Specifically, within the second-order Rytov approximation, explicit general expressions are presented for the mutual coherence function, the log-amplitude and phase correlation functions, and the mean-square irradiance that are obtained in propagation through an arbitrary paraxial ABCD optical system containing Gaussian-shaped limiting
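The ABCD formalism underlying the abstract can be illustrated with the standard complex beam parameter transform q' = (Aq + B)/(Cq + D); the sketch below (wavelength, waist and propagation distance are illustrative) checks a free-space step against the analytic Gaussian-beam spot size w(z) = w₀√(1 + (z/z_R)²):

```python
import math

def propagate_q(q, abcd):
    """Transform the complex beam parameter q through an ABCD system."""
    (A, B), (C, D) = abcd
    return (A * q + B) / (C * q + D)

def spot_size(q, wavelength):
    """Spot size from 1/q = 1/R - i*wavelength/(pi*w^2)."""
    return math.sqrt(-wavelength / (math.pi * (1 / q).imag))

wavelength = 633e-9                 # HeNe wavelength, illustrative
w0 = 1e-3                           # 1 mm beam waist
zR = math.pi * w0 ** 2 / wavelength # Rayleigh range
q0 = 1j * zR                        # beam parameter at the waist

free_space = ((1.0, 2.0), (0.0, 1.0))   # ABCD matrix for L = 2 m of free space
q1 = propagate_q(q0, free_space)
w1 = spot_size(q1, wavelength)
w_analytic = w0 * math.sqrt(1 + (2.0 / zR) ** 2)
print(w1, w_analytic)
assert abs(w1 - w_analytic) < 1e-12
```

ABCD matrices compose by multiplication, so a cascade of elements (lens, free space, ...) is handled by a single matrix, which is what makes the closed-form statistical results of the paper possible.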
Kruger, Uwe
2012-01-01
The development and application of multivariate statistical techniques in process monitoring has gained substantial interest over the past two decades in academia and industry alike. Initially developed for monitoring and fault diagnosis in complex systems, such techniques have been refined and applied in various engineering areas, for example mechanical and manufacturing, chemical, electrical and electronic, and power engineering. The recipe for the tremendous interest in multivariate statistical techniques lies in their simplicity and adaptability for developing monitoring applications.
Statistical Physics of Complex Substitutive Systems
Jin, Qing
Diffusion processes are central to human interactions. Despite extensive studies that span multiple disciplines, our knowledge is limited to spreading processes in non-substitutive systems. Yet, a considerable number of ideas, products, and behaviors spread by substitution; to adopt a new one, agents must give up an existing one. This captures the spread of scientific constructs--forcing scientists to choose, for example, a deterministic or probabilistic worldview, as well as the adoption of durable items, such as mobile phones, cars, or homes. In this dissertation, I develop a statistical physics framework to describe, quantify, and understand substitutive systems. By empirically exploring three collected high-resolution datasets pertaining to such systems, I build a mechanistic model describing substitutions, which not only analytically predicts the universal macroscopic phenomenon discovered in the collected datasets, but also accurately captures the trajectories of individual items in a complex substitutive system, demonstrating a high degree of regularity and universality in substitutive systems. I also discuss the origins and insights of the parameters in the substitution model and possible generalization form of the mathematical framework. The systematical study of substitutive systems presented in this dissertation could potentially guide the understanding and prediction of all spreading phenomena driven by substitutions, from electric cars to scientific paradigms, and from renewable energy to new healthy habits.
Complexity control in statistical learning
Indian Academy of Sciences (India)
complexity of the class of models from which we are to choose our model. In this … As is explained in §2, we use the concept of covering numbers to quantify the complexity of a class of … called structural risk minimization (SRM). Vapnik …
Zikou, Anastasia K; Xydis, Vasileios G; Astrakas, Loukas G; Nakou, Iliada; Tzarouchi, Loukia C; Tzoufi, Meropi; Argyropoulou, Maria I
2016-07-01
There is evidence of microstructural changes in normal-appearing white matter of patients with tuberous sclerosis complex. To evaluate major white matter tracts in children with tuberous sclerosis complex using tract-based spatial statistics diffusion tensor imaging (DTI) analysis. Eight children (mean age ± standard deviation: 8.5 ± 5.5 years) with an established diagnosis of tuberous sclerosis complex and 8 age-matched controls were studied. The imaging protocol consisted of T1-weighted high-resolution 3-D spoiled gradient-echo sequence and a spin-echo, echo-planar diffusion-weighted sequence. Differences in the diffusion indices were evaluated using tract-based spatial statistics. Tract-based spatial statistics showed increased axial diffusivity in the children with tuberous sclerosis complex in the superior and anterior corona radiata, the superior longitudinal fascicle, the inferior fronto-occipital fascicle, the uncinate fascicle and the anterior thalamic radiation. No significant differences were observed in fractional anisotropy, mean diffusivity and radial diffusivity between patients and control subjects. No difference was found in the diffusion indices between the baseline and follow-up examination in the patient group. Patients with tuberous sclerosis complex have increased axial diffusivity in major white matter tracts, probably related to reduced axonal integrity.
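The diffusion indices compared in the study are standard scalar functions of the diffusion-tensor eigenvalues, and can be computed directly (the eigenvalues below are typical illustrative white-matter values, not the study's data):

```python
import math

def dti_indices(l1, l2, l3):
    """Standard scalar indices from diffusion-tensor eigenvalues
    (l1 >= l2 >= l3): axial (AD), radial (RD) and mean (MD) diffusivity,
    and fractional anisotropy (FA)."""
    ad = l1                          # axial diffusivity: principal eigenvalue
    rd = (l2 + l3) / 2.0             # radial diffusivity: mean of the other two
    md = (l1 + l2 + l3) / 3.0        # mean diffusivity
    num = (l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    fa = math.sqrt(1.5 * num / den)  # FA in [0, 1]
    return ad, rd, md, fa

# typical white-matter eigenvalues, in units of 10^-3 mm^2/s:
ad, rd, md, fa = dti_indices(1.7, 0.3, 0.3)
print(ad, rd, md, fa)

# isotropic diffusion gives FA = 0:
assert dti_indices(1.0, 1.0, 1.0)[3] == 0.0
```

An increase in AD with unchanged FA, MD and RD, as reported above, therefore reflects a change confined to the principal (axial) eigenvalue.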
Autonomous Modeling, Statistical Complexity and Semi-annealed Treatment of Boolean Networks
Gong, Xinwei
This dissertation presents three studies on Boolean networks. Boolean networks are a class of mathematical systems consisting of interacting elements with binary state variables. Each element is a node with a Boolean logic gate, and the presence of interactions between any two nodes is represented by directed links. Boolean networks that implement the logic structures of real systems are studied as coarse-grained models of the real systems. Large random Boolean networks are studied with mean field approximations and used to provide a baseline of possible behaviors of large real systems. This dissertation presents one study of the former type, concerning the stable oscillation of a yeast cell-cycle oscillator, and two studies of the latter type, respectively concerning the statistical complexity of large random Boolean networks and an extension of traditional mean field techniques that accounts for the presence of short loops. In the cell-cycle oscillator study, a novel autonomous update scheme is introduced to study the stability of oscillations in small networks. A motif that corrects pulse-growing perturbations and a motif that grows pulses are identified. A combination of the two motifs is capable of sustaining stable oscillations. Examining a Boolean model of the yeast cell-cycle oscillator using an autonomous update scheme yields evidence that it is endowed with such a combination. Random Boolean networks are classified as ordered, critical or disordered based on their response to small perturbations. In the second study, random Boolean networks are taken as prototypical cases for the evaluation of two measures of complexity based on a criterion for optimal statistical prediction. One measure, defined for homogeneous systems, does not distinguish between the static spatial inhomogeneity in the ordered phase and the dynamical inhomogeneity in the disordered phase. A modification in which complexities of individual nodes are calculated yields vanishing
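Coarse-grained Boolean dynamics of the kind studied in this dissertation is easy to prototype. Below is a minimal, self-contained sketch of a random Boolean network with synchronous updates and attractor-cycle detection (the size n=8, in-degree k=2 and seed are arbitrary illustrative choices, not parameters from the dissertation):

```python
import itertools
import random

def random_boolean_network(n, k, seed=0):
    """Build a random Boolean network: each node reads k random inputs
    through a randomly generated truth table."""
    rng = random.Random(seed)
    inputs = [rng.sample(range(n), k) for _ in range(n)]
    tables = [{bits: rng.randint(0, 1)
               for bits in itertools.product((0, 1), repeat=k)}
              for _ in range(n)]
    return inputs, tables

def step(state, inputs, tables):
    """Synchronous update: every node applies its Boolean rule at once."""
    return tuple(tables[i][tuple(state[j] for j in inputs[i])]
                 for i in range(len(state)))

def attractor_length(state, inputs, tables):
    """Iterate until a previously seen state recurs; return cycle length."""
    seen, t = {}, 0
    while state not in seen:
        seen[state] = t
        state = step(state, inputs, tables)
        t += 1
    return t - seen[state]

inputs, tables = random_boolean_network(n=8, k=2)
cycle = attractor_length((0,) * 8, inputs, tables)
```

Because the state space is finite, synchronous dynamics always falls onto a cycle; classifying such cycles under perturbation is what separates the ordered, critical and disordered regimes mentioned above.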
International Nuclear Information System (INIS)
Tiana-Alsina, J.; Torrent, M. C.; Masoller, C.; Garcia-Ojalvo, J.; Rosso, O. A.
2010-01-01
Low-frequency fluctuations (LFFs) represent a dynamical instability that occurs in semiconductor lasers when they are operated near the lasing threshold and subject to moderate optical feedback. LFFs consist of sudden power dropouts followed by gradual, stepwise recoveries. We analyze experimental time series of intensity dropouts and quantify the complexity of the underlying dynamics employing two tools from information theory, namely, Shannon's entropy and the Martin, Plastino, and Rosso statistical complexity measure. These measures are computed using a method based on ordinal patterns, by which the relative length and ordering of consecutive interdropout intervals (i.e., the time intervals between consecutive intensity dropouts) are analyzed, disregarding the precise timing of the dropouts and the absolute durations of the interdropout intervals. We show that this methodology is suitable for quantifying subtle characteristics of the LFFs, and in particular the transition to fully developed chaos that takes place when the laser's pump current is increased. Our method shows that the statistical complexity of the laser does not increase continuously with the pump current, but levels off before reaching the coherence collapse regime. This behavior coincides with that of the first- and second-order correlations of the interdropout intervals, suggesting that these correlations, and not the chaotic behavior, are what determine the level of complexity of the laser's dynamics. These results hold for two different dynamical regimes, namely, sustained LFFs and coexistence between LFFs and steady-state emission.
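The ordinal-pattern analysis described here can be sketched compactly. The following is an illustrative implementation of permutation entropy and an MPR-style statistical complexity; the Jensen-Shannon normalization constant follows the common closed form, and the input series are synthetic, not the experimental dropout data:

```python
from collections import Counter
from itertools import permutations
from math import log
import random

def ordinal_distribution(x, d):
    """Probability of each rank-order pattern over sliding windows of length d."""
    counts = Counter(tuple(sorted(range(d), key=lambda i: x[t + i]))
                     for t in range(len(x) - d + 1))
    total = sum(counts.values())
    return [counts.get(p, 0) / total for p in permutations(range(d))]

def shannon(p):
    return -sum(q * log(q) for q in p if q > 0)

def mpr_complexity(x, d):
    """Normalized permutation entropy H and MPR complexity C = H * Q, where Q
    is the Jensen-Shannon disequilibrium from the uniform pattern
    distribution, normalized to [0, 1]."""
    p = ordinal_distribution(x, d)
    n = len(p)
    h = shannon(p) / log(n)
    mix = [(q + 1.0 / n) / 2 for q in p]
    js = shannon(mix) - shannon(p) / 2 - log(n) / 2
    js_max = -0.5 * ((n + 1) / n * log(n + 1) - 2 * log(2 * n) + log(n))
    return h, h * js / js_max

random.seed(0)
h_mono, c_mono = mpr_complexity(list(range(200)), d=3)   # single pattern
h_rand, c_rand = mpr_complexity([random.random() for _ in range(5000)], d=3)
```

A monotone series produces a single ordinal pattern (zero entropy, zero complexity), while an i.i.d. random series has near-maximal entropy but, being structureless, also low complexity; interesting dynamics sit between these extremes.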
Directory of Open Access Journals (Sweden)
Xin Jin
2012-02-01
This study focuses on the preparation and enzymatic hydrolysis of an icariin/β-cyclodextrin inclusion complex to efficiently generate icaritin. The physical characteristics of the inclusion complex were evaluated by differential scanning calorimetry (DSC). Enzymatic hydrolysis was optimized for the conversion of the icariin/β-cyclodextrin complex to icaritin by Box–Behnken statistical design. The inclusion complex formulation increased the solubility of icariin approximately 17-fold, from 29.2 to 513.5 μg/mL at 60 °C. The optimum conditions predicted by the Box–Behnken design were as follows: 60 °C, pH 7.0, an enzyme/substrate ratio of 1:1.1 and a reaction time of 7 h. Under the optimal conditions the conversion of icariin was 97.91% and the reaction time was decreased by 68% compared with that without β-CD inclusion. Product analysis by melting point, ESI-MS, UV, IR, 1H NMR and 13C NMR confirmed the authenticity of icaritin, with a purity of 99.3% and a yield of 473 mg of icaritin from 1.1 g of icariin.
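A Box–Behnken design such as the one used for this optimization can be generated programmatically. A sketch in coded levels only (mapping the three factors to temperature, pH and enzyme/substrate ratio is up to the experimenter):

```python
from itertools import combinations, product

def box_behnken(k, center=1):
    """Box-Behnken design for k factors: all +/-1 combinations on each pair
    of factors with the remaining factors at 0, plus center runs."""
    runs = []
    for i, j in combinations(range(k), 2):
        for a, b in product((-1, 1), repeat=2):
            row = [0] * k
            row[i], row[j] = a, b
            runs.append(row)
    runs += [[0] * k for _ in range(center)]
    return runs

design = box_behnken(3)   # 12 edge-midpoint runs + 1 center run
```

For three factors this gives the classical 13-run design; a quadratic response surface is then fitted to the measured conversions to locate the optimum.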
The applications of Complexity Theory and Tsallis Non-extensive Statistics at Solar Plasma Dynamics
Pavlos, George
2015-04-01
As the solar plasma lives far from equilibrium, it is an excellent laboratory for testing complexity theory and non-equilibrium statistical mechanics. In this study, we present the highlights of complexity theory and Tsallis non-extensive statistical mechanics concerning their applications to solar plasma dynamics, especially sunspot, solar flare and solar wind phenomena. Generally, when a physical system is driven far from equilibrium states, some novel characteristics can be observed related to the nonlinear character of dynamics. In particular, the nonlinearity in space plasma dynamics can generate intermittent turbulence with the typical characteristics of the anomalous diffusion process and strange topologies of stochastic space plasma fields (velocity and magnetic fields) caused by the strange dynamics and strange kinetics (Zaslavsky, 2002). In addition, according to Zelenyi and Milovanov (2004), the complex character of the space plasma system includes the existence of non-equilibrium (quasi-)stationary states (NESS) having the topology of a percolating fractal set. The stabilization of a system near the NESS is perceived as a transition into a turbulent state determined by self-organization processes. The long-range correlation effects manifest themselves as a strange non-Gaussian behavior of kinetic processes near the NESS plasma state. The complex character of space plasma can also be described by the non-extensive statistical thermodynamics pioneered by Tsallis, which offers a consistent and effective theoretical framework, based on a generalization of Boltzmann-Gibbs (BG) entropy, to describe far-from-equilibrium nonlinear complex dynamics (Tsallis, 2009). In a series of recent papers, the hypothesis of Tsallis non-extensive statistics in the magnetosphere, sunspot dynamics, solar flares, solar wind and space plasma in general was tested and verified (Karakatsanis et al., 2013; Pavlos et al., 2014; 2015). Our study includes the analysis of solar plasma time
Energy Technology Data Exchange (ETDEWEB)
Paik, Joongcheol [University of Minnesota; Sotiropoulos, Fotis [University of Minnesota; Sale, Michael J [ORNL
2005-06-01
A numerical method is developed for carrying out unsteady Reynolds-averaged Navier-Stokes (URANS) simulations and detached-eddy simulations (DESs) in complex 3D geometries. The method is applied to simulate incompressible swirling flow in a typical hydroturbine draft tube, which consists of a strongly curved 90 degree elbow and two piers. The governing equations are solved with a second-order-accurate, finite-volume, dual-time-stepping artificial compressibility approach for a Reynolds number of 1.1 million on a mesh with 1.8 million nodes. The geometrical complexities of the draft tube are handled using domain decomposition with overset (chimera) grids. Numerical simulations show that unsteady statistical turbulence models can capture very complex 3D flow phenomena dominated by geometry-induced, large-scale instabilities and unsteady coherent structures such as the onset of vortex breakdown and the formation of the unsteady rope vortex downstream of the turbine runner. Both URANS and DES appear to yield the general shape and magnitude of mean velocity profiles in reasonable agreement with measurements. Significant discrepancies among the DES and URANS predictions of the turbulence statistics are also observed in the straight downstream diffuser.
Ma, Chuang; Chen, Han-Shuang; Lai, Ying-Cheng; Zhang, Hai-Feng
2018-02-01
Complex networks hosting binary-state dynamics arise in a variety of contexts. In spite of previous works, to fully reconstruct the network structure from observed binary data remains challenging. We articulate a statistical inference based approach to this problem. In particular, exploiting the expectation-maximization (EM) algorithm, we develop a method to ascertain the neighbors of any node in the network based solely on binary data, thereby recovering the full topology of the network. A key ingredient of our method is the maximum-likelihood estimation of the probabilities associated with actual or nonexistent links, and we show that the EM algorithm can distinguish the two kinds of probability values without any ambiguity, insofar as the length of the available binary time series is reasonably long. Our method does not require any a priori knowledge of the detailed dynamical processes, is parameter-free, and is capable of accurate reconstruction even in the presence of noise. We demonstrate the method using combinations of distinct types of binary dynamical processes and network topologies, and provide a physical understanding of the underlying reconstruction mechanism. Our statistical inference based reconstruction method contributes an additional piece to the rapidly expanding "toolbox" of data based reverse engineering of complex networked systems.
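The article's full EM-based reconstruction is more involved than can be shown here, but the core claim, that EM separates the two kinds of probability values (actual versus nonexistent links) without ambiguity, can be illustrated with a toy one-dimensional two-component Gaussian mixture. All data and starting values below are hypothetical:

```python
import math
import random

def em_two_gaussians(xs, iters=200):
    """1-D two-component Gaussian mixture fitted by EM: the E-step computes
    responsibilities, the M-step re-estimates weights, means and variances."""
    mu = [min(xs), max(xs)]       # crude initialization
    var = [0.01, 0.01]
    w = [0.5, 0.5]
    for _ in range(iters):
        resp = []
        for x in xs:
            dens = [w[k] / math.sqrt(2 * math.pi * var[k])
                    * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                    for k in (0, 1)]
            s = sum(dens)
            resp.append([d / s for d in dens])
        for k in (0, 1):
            rk = sum(r[k] for r in resp)
            w[k] = rk / len(xs)
            mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / rk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, xs)) / rk, 1e-6)
    return mu

random.seed(1)
# two well-separated clusters of "estimated link probabilities"
xs = ([random.gauss(0.05, 0.02) for _ in range(200)]
      + [random.gauss(0.6, 0.05) for _ in range(50)])
mu = sorted(em_two_gaussians(xs))
```

With a reasonably long time series the two clusters of estimates are well separated, so the mixture means converge to distinct values and each candidate link can be assigned unambiguously.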
On the role of complex phases in the quantum statistics of weak measurements
International Nuclear Information System (INIS)
Hofmann, Holger F
2011-01-01
Weak measurements carried out between quantum state preparation and post-selection result in complex values for self-adjoint operators, corresponding to complex conditional probabilities for the projections on specific eigenstates. In this paper it is shown that the complex phases of these weak conditional probabilities describe the dynamic response of the system to unitary transformations. Quantum mechanics thus unifies the statistical overlap of different states with the dynamical structure of transformations between these states. Specifically, it is possible to identify the phase of weak conditional probabilities directly with the action of a unitary transform that maximizes the overlap of initial and final states. This action provides a quantitative measure of how much quantum correlations can diverge from the deterministic relations between physical properties expected from classical physics or hidden variable theories. In terms of quantum information, the phases of weak conditional probabilities thus represent the logical tension between sets of three quantum states that is at the heart of quantum paradoxes. (paper)
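The complex weak value of an operator A between preparation |i> and post-selection |f> is A_w = <f|A|i> / <f|i>. A small numpy sketch (the states below are arbitrary choices that happen to produce a genuinely complex conditional probability):

```python
import numpy as np

def weak_value(pre, post, A):
    """Complex weak value A_w = <f|A|i> / <f|i>."""
    return (post.conj() @ A @ pre) / (post.conj() @ pre)

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
pre = (ket0 + ket1) / np.sqrt(2)          # preparation |i>
post = (ket0 + 1j * ket1) / np.sqrt(2)    # post-selection |f>

P1 = np.outer(ket1, ket1.conj())          # projector onto |1>
Aw = weak_value(pre, post, P1)            # complex conditional probability
Aw0 = weak_value(pre, post, np.outer(ket0, ket0.conj()))  # complement
```

The two projector weak values sum to one, as conditional probabilities must, while their imaginary parts carry the dynamical-response information discussed in the paper.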
Son, Ji Y; Ramos, Priscilla; DeWolf, Melissa; Loftus, William; Stigler, James W
2018-01-01
In this article, we begin to lay out a framework and approach for studying how students come to understand complex concepts in rich domains. Grounded in theories of embodied cognition, we advance the view that understanding of complex concepts requires students to practice, over time, the coordination of multiple concepts, and the connection of this system of concepts to situations in the world. Specifically, we explore the role that a teacher's gesture might play in supporting students' coordination of two concepts central to understanding in the domain of statistics: mean and standard deviation. In Study 1 we show that university students who have just taken a statistics course nevertheless have difficulty taking both mean and standard deviation into account when thinking about a statistical scenario. In Study 2 we show that presenting the same scenario with an accompanying gesture to represent variation significantly impacts students' interpretation of the scenario. Finally, in Study 3 we present evidence that instructional videos on the internet fail to leverage gesture as a means of facilitating understanding of complex concepts. Taken together, these studies illustrate an approach to translating current theories of cognition into principles that can guide instructional design.
Li, Jie; Li, Rui; You, Leiming; Xu, Anlong; Fu, Yonggui; Huang, Shengfeng
2015-01-01
Switching between different alternative polyadenylation (APA) sites plays an important role in the fine tuning of gene expression. New technologies for the execution of 3’-end enriched RNA-seq allow genome-wide detection of the genes that exhibit significant APA site switching between different samples. Here, we show that the independence test gives better results than the linear trend test in detecting APA site-switching events. Further examination suggests that the discrepancy between these two statistical methods arises from complex APA site-switching events that cannot be represented by a simple change of average 3’-UTR length. In theory, the linear trend test is only effective in detecting these simple changes. We classify the switching events into four switching patterns: two simple patterns (3’-UTR shortening and lengthening) and two complex patterns. By comparing the results of the two statistical methods, we show that complex patterns account for 1/4 of all observed switching events that happen between normal and cancerous human breast cell lines. Because simple and complex switching patterns may convey different biological meanings, they merit separate study. We therefore propose to combine both the independence test and the linear trend test in practice. First, the independence test should be used to detect APA site switching; second, the linear trend test should be invoked to identify simple switching events; and third, those complex switching events that pass independence testing but fail linear trend testing can be identified. PMID:25875641
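The contrast between the independence test and the linear trend test can be reproduced on a toy 2 x 3 site-usage table. In this sketch a "complex" switching event moves reads from the middle APA site to both flanking sites, leaving the mean 3'-UTR length unchanged: the Pearson chi-square statistic is large while the chi-square-for-trend statistic vanishes. The counts are invented for illustration:

```python
import numpy as np

def chi2_stat(table):
    """Pearson chi-square statistic for a 2 x k contingency table."""
    table = np.asarray(table, float)
    exp = np.outer(table.sum(1), table.sum(0)) / table.sum()
    return ((table - exp) ** 2 / exp).sum()

def trend_stat(table, scores):
    """Chi-square-for-trend: (N - 1) * r^2, where r is the correlation
    between group membership (0/1) and the ordinal column score."""
    table = np.asarray(table, float)
    n = table.sum()
    g = np.repeat([0, 1], table.sum(1).astype(int))
    s = np.concatenate([np.repeat(scores, row.astype(int)) for row in table])
    r = np.corrcoef(g, s)[0, 1]
    return (n - 1) * r ** 2

# complex switching: middle site redistributed to both flanks,
# so the average 3'-UTR length is unchanged
table = [[30, 40, 30],
         [45, 10, 45]]
chi2 = chi2_stat(table)               # large: usage depends on sample
trend = trend_stat(table, [0, 1, 2])  # ~0: no linear shift in mean length
```

This is exactly the failure mode described above: the trend test, sensitive only to a simple shortening or lengthening, misses the complex pattern that the independence test detects.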
Chemometric and Statistical Analyses of ToF-SIMS Spectra of Increasingly Complex Biological Samples
Energy Technology Data Exchange (ETDEWEB)
Berman, E S; Wu, L; Fortson, S L; Nelson, D O; Kulp, K S; Wu, K J
2007-10-24
Characterizing and classifying molecular variation within biological samples is critical for determining fundamental mechanisms of biological processes that will lead to new insights including improved disease understanding. Towards these ends, time-of-flight secondary ion mass spectrometry (ToF-SIMS) was used to examine increasingly complex samples of biological relevance, including monosaccharide isomers, pure proteins, complex protein mixtures, and mouse embryo tissues. The complex mass spectral data sets produced were analyzed using five common statistical and chemometric multivariate analysis techniques: principal component analysis (PCA), linear discriminant analysis (LDA), partial least squares discriminant analysis (PLSDA), soft independent modeling of class analogy (SIMCA), and decision tree analysis by recursive partitioning. PCA was found to be a valuable first step in multivariate analysis, providing insight both into the relative groupings of samples and into the molecular basis for those groupings. For the monosaccharides, pure proteins and protein mixture samples, all of LDA, PLSDA, and SIMCA were found to produce excellent classification given a sufficient number of compound variables calculated. For the mouse embryo tissues, however, SIMCA did not produce as accurate a classification. The decision tree analysis was found to be the least successful for all the data sets, providing neither as accurate a classification nor chemical insight for any of the tested samples. Based on these results we conclude that as the complexity of the sample increases, so must the sophistication of the multivariate technique used to classify the samples. PCA is a preferred first step for understanding ToF-SIMS data that can be followed by either LDA or PLSDA for effective classification analysis. This study demonstrates the strength of ToF-SIMS combined with multivariate statistical and chemometric techniques to classify increasingly complex biological samples
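The PCA-first workflow recommended above can be sketched on synthetic "spectra". The example uses PCA via SVD followed by nearest-centroid classification in PC space as a simple stand-in for LDA/PLSDA; all data are simulated and this is not the study's pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

def pca(X, n_components):
    """PCA via SVD on mean-centered data; returns scores and loadings."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T, Vt[:n_components]

# two synthetic "spectral" classes differing in a few peak channels
base = rng.normal(0, 0.1, size=(40, 50))
base[:20, 10] += 2.0   # class 0 peak
base[20:, 30] += 2.0   # class 1 peak
labels = np.array([0] * 20 + [1] * 20)

scores, loadings = pca(base, n_components=2)

# nearest-centroid classification in PC space (a simple stand-in for LDA)
centroids = np.array([scores[labels == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(np.linalg.norm(scores[:, None] - centroids, axis=2), axis=1)
accuracy = (pred == labels).mean()
```

The loadings also point back to the discriminating channels, mirroring the paper's observation that PCA reveals the molecular basis of the groupings, not just the groupings themselves.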
Narayanan, Roshni; Nugent, Rebecca; Nugent, Kenneth
2015-10-01
Accreditation Council for Graduate Medical Education guidelines require internal medicine residents to develop skills in the interpretation of medical literature and to understand the principles of research. A necessary component is the ability to understand the statistical methods used and their results, material that is not an in-depth focus of most medical school curricula and residency programs. Given the breadth and depth of the current medical literature and an increasing emphasis on complex, sophisticated statistical analyses, the statistical foundation and education necessary for residents are uncertain. We reviewed the statistical methods and terms used in 49 articles discussed at the journal club in the Department of Internal Medicine residency program at Texas Tech University between January 1, 2013 and June 30, 2013. We collected information on the study type and on the statistical methods used for summarizing and comparing samples, determining the relations between independent variables and dependent variables, and estimating models. We then identified the typical statistics education level at which each term or method is learned. A total of 14 articles came from the Journal of the American Medical Association Internal Medicine, 11 from the New England Journal of Medicine, 6 from the Annals of Internal Medicine, 5 from the Journal of the American Medical Association, and 13 from other journals. Twenty reported randomized controlled trials. Summary statistics included mean values (39 articles), category counts (38), and medians (28). Group comparisons were based on t tests (14 articles), χ2 tests (21), and nonparametric ranking tests (10). The relations between dependent and independent variables were analyzed with simple regression (6 articles), multivariate regression (11), and logistic regression (8). Nine studies reported odds ratios with 95% confidence intervals, and seven analyzed test performance using sensitivity and specificity calculations
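Two of the recurring quantities in such journal-club articles, the odds ratio with its Wald 95% confidence interval and sensitivity/specificity, take only a few lines to compute. The 2 x 2 counts below are hypothetical:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio from a 2x2 table [[a, b], [c, d]] with a Wald 95%
    confidence interval computed on the log scale."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

def sensitivity_specificity(tp, fn, tn, fp):
    """Test performance from true/false positive and negative counts."""
    return tp / (tp + fn), tn / (tn + fp)

or_, lo, hi = odds_ratio_ci(20, 80, 10, 90)        # hypothetical trial counts
sens, spec = sensitivity_specificity(45, 5, 90, 10)  # hypothetical test results
```

These are exactly the statistics most often reported in the surveyed articles, and they sit at the introductory end of the education spectrum the authors map out.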
Beginning R The Statistical Programming Language
Gardener, Mark
2012-01-01
Conquer the complexities of this open-source statistical language. R is fast becoming the de facto standard for statistical computing and analysis in science, business, engineering, and related fields. This book examines this complex language using simple statistical examples, showing how R operates in a user-friendly context. Both students and workers in fields that require extensive statistical analysis will find this book helpful as they learn to use R for simple summary statistics, hypothesis testing, creating graphs, regression, and much more. It covers formula notation, complex statistics
A Statistical Dynamic Approach to Structural Evolution of Complex Capital Market Systems
Shao, Xiao; Chai, Li H.
As an important part of modern financial systems, the capital market has played a crucial role in diverse social resource allocations and economic exchanges. Going beyond traditional models and theories based on neoclassical economics, and considering capital markets as typical complex open systems, this paper attempts to develop a new approach that overcomes some shortcomings of existing research. By defining the generalized entropy of capital market systems, a theoretical model and a nonlinear dynamic equation for the operation of capital markets are proposed from a statistical dynamics perspective. The US security market from 1995 to 2001 is then simulated and analyzed as a typical case. Some instructive results are discussed and summarized.
Complexity control in statistical learning
Indian Academy of Sciences (India)
Then we describe how the method of regularization is used to control complexity in learning. We discuss two examples of regularization, one in which the function space used is finite dimensional, and another in which it is a reproducing kernel Hilbert space. Our exposition follows the formulation of Cucker and Smale.
Application of statistical physics approaches to complex organizations
Matia, Kaushik
The first part of this thesis studies two different kinds of financial markets, namely, the stock market and the commodity market. Stock price fluctuations display certain scale-free statistical features that are not unlike those found in strongly-interacting physical systems. The possibility that new insights can be gained using concepts and methods developed to understand scale-free physical phenomena has stimulated considerable research activity in the physics community. In the first part of this thesis a comparative study of stocks and commodities is performed in terms of probability density function and correlations of stock price fluctuations. It is found that the probability density of the stock price fluctuation has a power law functional form with an exponent 3, which is similar across different markets around the world. We present an autoregressive model to explain the origin of the power law functional form of the probability density function of the price fluctuation. The first part also presents the discovery of unique features of the Indian economy, which we find displays a scale-dependent probability density function. In the second part of this thesis we quantify the statistical properties of fluctuations of complex systems like business firms and world scientific publications. We analyze class size of these systems mentioned above where units agglomerate to form classes. We find that the width of the probability density function of growth rate decays with the class size as a power law with an exponent beta which is universal in the sense that beta is independent of the system studied. We also identify two other scaling exponents, gamma connecting the unit size to the class size and gamma connecting the number of units to the class size, where products are units and firms are classes. Finally we propose a generalized preferential attachment model to describe the class size distribution. This model is successful in explaining the growth rate and class
Selected Manpower Statistics, FY-58
1959-01-31
Caregiver Statistics: Demographics
Because caregiving needs and services are wide-ranging and complex, statistics may vary from study to study.
Denker, Manfred
2017-01-01
Introductory Statistics and Random Phenomena integrates traditional statistical data analysis with new computational experimentation capabilities and concepts of algorithmic complexity and chaotic behavior in nonlinear dynamic systems. This was the first advanced text/reference to bring together such a comprehensive variety of tools for the study of random phenomena occurring in engineering and the natural, life, and social sciences. The crucial computer experiments are conducted using the readily available computer program Mathematica® Uncertain Virtual Worlds™ software packages, which optimize and facilitate the simulation environment. Brief tutorials are included that explain how to use the Mathematica® programs for effective simulation and computer experiments. Large and original real-life data sets are introduced and analyzed as a model for independent study. This is an excellent classroom tool and self-study guide. The material is presented in a clear and accessible style providing numerous...
Bruña, Ricardo; Poza, Jesús; Gómez, Carlos; García, María; Fernández, Alberto; Hornero, Roberto
2012-06-01
Alzheimer's disease (AD) is the most common cause of dementia. Over the last few years, a considerable effort has been devoted to exploring new biomarkers. Nevertheless, a better understanding of brain dynamics is still required to optimize therapeutic strategies. In this regard, the characterization of mild cognitive impairment (MCI) is crucial, due to the high conversion rate from MCI to AD. However, only a few studies have focused on the analysis of magnetoencephalographic (MEG) rhythms to characterize AD and MCI. In this study, we assess the ability of several parameters derived from information theory to describe spontaneous MEG activity from 36 AD patients, 18 MCI subjects and 26 controls. Three entropies (Shannon, Tsallis and Rényi entropies), one disequilibrium measure (based on the Euclidean distance, ED) and three statistical complexities (based on the López-Ruiz-Mancini-Calbet complexity, LMC) were used to estimate the irregularity and statistical complexity of MEG activity. Statistically significant differences between AD patients and controls were obtained with all parameters (p < ...). A validation procedure was applied; the accuracies reached 83.9% and 65.9% to discriminate AD and MCI subjects from controls, respectively. Our findings suggest that MCI subjects exhibit an intermediate pattern of abnormalities between normal aging and AD. Furthermore, the proposed parameters provide a new description of brain dynamics in AD and MCI.
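The information-theoretic quantities used in this study are straightforward to compute for a discrete distribution. A sketch (the LMC variant below uses the Euclidean disequilibrium, matching the paper's ED measure; the distributions are toy examples, not MEG data):

```python
import numpy as np

def shannon(p):
    """Shannon entropy in nats."""
    p = p[p > 0]
    return -(p * np.log(p)).sum()

def tsallis(p, q):
    """Tsallis entropy of order q."""
    return (1 - (p ** q).sum()) / (q - 1)

def renyi(p, q):
    """Renyi entropy of order q."""
    return np.log((p ** q).sum()) / (1 - q)

def lmc_complexity(p):
    """LMC statistical complexity: normalized Shannon entropy times the
    Euclidean disequilibrium from the uniform distribution."""
    n = len(p)
    h = shannon(p) / np.log(n)
    d = ((p - 1.0 / n) ** 2).sum()
    return h * d

uniform = np.full(4, 0.25)
peaked = np.array([0.7, 0.1, 0.1, 0.1])
c_uniform = lmc_complexity(uniform)   # zero: no disequilibrium
c_peaked = lmc_complexity(peaked)     # positive: order plus irregularity
```

The uniform distribution maximizes every entropy but has zero complexity, which is why a product measure like LMC is needed to capture the intermediate regimes discussed above.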
Energy Technology Data Exchange (ETDEWEB)
Al Mouhamed, Mayez
1977-09-15
In a number of complex physical systems the accessible signals are often characterized by random fluctuations about a mean value. The fluctuations (signature) often transmit information about the state of the system that the mean value cannot predict. This study is undertaken to elaborate statistical methods of anomaly detection on the basis of signature analysis of the noise inherent in the process. The algorithm presented first learns the characteristics of normal operation of a complex process. Then it detects small deviations from the normal behavior. The algorithm can be implemented in a medium-sized computer for on-line application. (author)
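A signature-analysis monitor of the sort described, learn the statistics of normal operation and then flag small deviations, can be sketched with second-order statistics and a Mahalanobis distance. The features and data below are hypothetical, not the report's algorithm:

```python
import numpy as np

rng = np.random.default_rng(42)

class SignatureMonitor:
    """Learns the mean and covariance of 'normal' noise features, then
    scores new vectors by their Mahalanobis distance from that baseline."""

    def fit(self, X):
        self.mu = X.mean(axis=0)
        self.prec = np.linalg.inv(np.cov(X, rowvar=False))
        return self

    def distance(self, x):
        d = x - self.mu
        return float(np.sqrt(d @ self.prec @ d))

# normal-operation features (e.g. band powers of the noise spectrum)
normal = rng.normal(0, 1, size=(500, 3))
monitor = SignatureMonitor().fit(normal)

typical = monitor.distance(np.zeros(3))                   # small deviation
anomalous = monitor.distance(np.array([5.0, -5.0, 5.0]))  # large deviation
```

Thresholding the distance (for example at a chi-distribution quantile) turns the score into an on-line anomaly alarm of the kind the report targets for a medium-sized computer.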
Tayurskii, Dmitrii; Abe, Sumiyoshi; Alexandre Wang, Q.
2012-11-01
The 3rd International Workshop on Statistical Physics and Mathematics for Complex Systems (SPMCS2012) was held from 25-30 August at Kazan (Volga Region) Federal University, Kazan, Russian Federation. The workshop was jointly organized by Kazan Federal University and the Institut Supérieur des Matériaux et Mécaniques Avancées (ISMANS), France. The series of SPMCS workshops was created in 2008 with the aim of being an interdisciplinary incubator for the worldwide exchange of innovative ideas and information about the latest results. The first workshop was held at ISMANS, Le Mans (France) in 2008, and the second at Huazhong Normal University, Wuhan (China) in 2010. At SPMCS2012, we wished to bring together a broad community of researchers from the different branches of the rapidly developing complexity science to discuss the fundamental theoretical challenges (geometry/topology, number theory, statistical physics, dynamical systems, etc) as well as experimental and applied aspects of many practical problems (condensed matter, disordered systems, financial markets, chemistry, biology, geoscience, etc). The program of SPMCS2012 was prepared based on three categories: (i) physical and mathematical studies (quantum mechanics, generalized nonequilibrium thermodynamics, nonlinear dynamics, condensed matter physics, nanoscience); (ii) natural complex systems (physical, geophysical, chemical and biological); (iii) social, economical, political agent systems and man-made complex systems. The conference attracted 64 participants from 10 countries. There were 10 invited lectures, 12 invited talks and 28 regular oral talks in the morning and afternoon sessions. The book of Abstracts is available from the conference website (http://www.ksu.ru/conf/spmcs2012/?id=3). A round table was also held, the topic of which was 'Recent and Anticipated Future Progress in Science of Complexity', discussing a variety of questions and opinions important for the understanding of the concept of
International Nuclear Information System (INIS)
Masoud, M.S.; Motaweh, H.A.; Ali, A.E.
1999-01-01
Full text: The electronic absorption spectra of the octahedral complexes containing monoethanolamine were recorded in different solvents (dioxane, chloroform, ethanol, dimethylformamide, dimethylsulfoxide and water). The data were analyzed by a multiple linear regression technique (a is the regression intercept) in which the independent variables are various empirical solvent polarity parameters; the constants were calculated with a statistics program on a PC. The solvent spectral data of the complexes are compared to those in Nujol; the solvents shift the spectral bands to the red. In the case of the Mn(MEA)Cl complex, numerous bands appear in the presence of CHCl3, DMF and DMSO, probably due to the numerous oxidation states. The solvent parameters E (solvent-solute hydrogen bonding and dipolar interaction), (dipolar interaction related to the dielectric constant), M (solute permanent dipole-solvent induced dipole) and N (solute permanent dipole-solvent permanent dipole) are correlated with the structure of the complexes. In hydrogen-bonding solvents, as the dielectric constant increases a blue shift occurs due to conjugation with high stability; the data in DMF and DMSO are nearly the same, probably owing to the similarity of these solvents
Eva, Megan M; Yuki, Kyoko E; Dauphinee, Shauna M; Schwartzentruber, Jeremy A; Pyzik, Michal; Paquet, Marilène; Lathrop, Mark; Majewski, Jacek; Vidal, Silvia M; Malo, Danielle
2014-01-01
Salmonella enterica is a ubiquitous Gram-negative intracellular bacterium that continues to pose a global challenge to human health. The etiology of Salmonella pathogenesis is complex and controlled by pathogen, environmental, and host genetic factors. In fact, patients immunodeficient in genes in the IL-12, IL-23/IFN-γ pathway are predisposed to invasive nontyphoidal Salmonella infection. Using a forward genomics approach by N-ethyl-N-nitrosourea (ENU) germline mutagenesis in mice, we identified the Ity14 (Immunity to Typhimurium locus 14) pedigree exhibiting increased susceptibility following in vivo Salmonella challenge. A DNA-binding domain mutation (p.G418_E445) in Stat4 (Signal Transducer and Activator of Transcription Factor 4) was the causative mutation. STAT4 signals downstream of IL-12 to mediate transcriptional regulation of inflammatory immune responses. In mutant Ity14 mice, the increased splenic and hepatic bacterial load resulted from an intrinsic defect in innate cell function, IFN-γ-mediated immunity, and disorganized granuloma formation. We further show that NK and NKT cells play an important role in mediating control of Salmonella in Stat4(Ity14/Ity14) mice. Stat4(Ity14/Ity14) mice had increased expression of genes involved in cell-cell interactions and communication, as well as increased CD11b expression on a subset of splenic myeloid dendritic cells, resulting in compromised recruitment of inflammatory cells to the spleen during Salmonella infection. Stat4(Ity14/Ity14) presented upregulated compensatory mechanisms, although inefficient and ultimately Stat4(Ity14/Ity14) mice develop fatal bacteremia. The following study further elucidates the pathophysiological impact of STAT4 during Salmonella infection.
Directory of Open Access Journals (Sweden)
Swarna Weerasinghe
2017-03-01
Conclusion: This study demonstrated the importance of using complex statistical models that account for data structures in public health risk assessments, and the consequences of failing to do so.
On the Logical Development of Statistical Models.
1983-12-01
A classical state-space representation of a simple time series model is y_t = μ_t + u_t, u_t = φ·u_{t−1} + e_t (2.2), where u_t and e_t are independent normal disturbances. The dependence of the state on its past values is displayed in the structural equation. This approach has been particularly useful in time series models. For example, model (2.2
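A model of the form (2.2) — an observed level plus an AR(1) disturbance — is easy to simulate and fit. The following sketch is illustrative only (parameter values and the simple moment estimator are assumptions, not from the report):

```python
import random

def simulate(mu, phi, sigma, n, seed=0):
    """Simulate y_t = mu + u_t with AR(1) disturbance u_t = phi*u_{t-1} + e_t."""
    rng = random.Random(seed)
    u, ys = 0.0, []
    for _ in range(n):
        u = phi * u + rng.gauss(0.0, sigma)
        ys.append(mu + u)
    return ys

def estimate_phi(ys):
    """Moment estimate of phi: lag-1 autocorrelation of the demeaned series."""
    m = sum(ys) / len(ys)
    d = [y - m for y in ys]
    num = sum(a * b for a, b in zip(d[:-1], d[1:]))
    den = sum(a * a for a in d)
    return num / den

ys = simulate(mu=5.0, phi=0.7, sigma=1.0, n=20000)
phi_hat = estimate_phi(ys)
print(round(phi_hat, 2))  # close to the true value 0.7
```

With a long series the lag-1 autocorrelation of the demeaned observations recovers φ; a full treatment would use the Kalman filter when the level itself varies.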
[''R"--project for statistical computing
DEFF Research Database (Denmark)
Dessau, R.B.; Pipper, Christian Bressen
2008-01-01
An introduction to the R project for statistical computing (www.R-project.org) is presented. The main topics are: 1. To make the professional community aware of "R" as a potent and free software for graphical and statistical analysis of medical data; 2. Simple well-known statistical tests are fairly easy to perform in R, but more complex modelling requires programming skills; 3. R is seen as a tool for teaching statistics and implementing complex modelling of medical data among medical professionals. Publication date: 2008/1/28
The Statistical Assessment of Latent Trait Dimensionality in Psychological Testing
1984-06-01
Chang, Pao-Erh Paul; Yang, Jen-Chih Rena; Den, Walter; Wu, Chang-Fu
2014-09-01
Emissions of volatile organic compounds (VOCs) are among the most frequent environmental nuisance complaints in urban areas, especially near industrial districts. Unfortunately, identifying the emission sources responsible for VOCs is inherently difficult. In this study, we proposed a dynamic approach to gradually confine the location of potential VOC emission sources in an industrial complex, by combining multi-path open-path Fourier transform infrared spectrometry (OP-FTIR) measurement with the statistical method of principal component analysis (PCA). Closed-cell FTIR was further used to verify the VOC emission sources by measuring VOCs emitted from selected exhaust stacks at factories in the confined areas. Multiple open-path monitoring lines were deployed during a 3-month monitoring campaign in a complex industrial district. The emission patterns were identified and the locations of emissions were confined using wind data collected simultaneously. N,N-Dimethylformamide (DMF), 2-butanone, toluene, and ethyl acetate, with mean concentrations of 80.0 ± 1.8, 34.5 ± 0.8, 103.7 ± 2.8, and 26.6 ± 0.7 ppbv, respectively, were identified as the major VOC mixture at all times of the day around the receptor site. As a toxic air pollutant, DMF was found in air samples at concentrations exceeding the ambient standard, despite the path-averaging effect of OP-FTIR on concentration levels. The PCA identified three major emission sources: PU coating, chemical packaging, and lithographic printing industries. Applying instrumental measurement and statistical modeling, this study has established a systematic approach for locating emission sources. Statistical modeling (PCA) plays an important role in reducing the dimensionality of a large measured dataset and identifying underlying emission sources; instrumental measurement, in turn, helps verify the outcomes of the statistical modeling. The field study has demonstrated the feasibility of
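The PCA step of such a study can be sketched in a few lines. The compound loadings, source labels, and noise level below are invented for illustration; only the technique — eigen-decomposition of the covariance of centred measurements — follows the abstract:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic path-averaged concentrations of 4 compounds driven by two
# hypothetical sources (e.g. a coating process and a printing process).
n = 500
source_a = rng.exponential(1.0, n)
source_b = rng.exponential(1.0, n)
profiles = np.array([[0.8, 0.1],   # per-source loading of each compound
                     [0.7, 0.2],
                     [0.1, 0.9],
                     [0.2, 0.8]])
X = np.column_stack([source_a, source_b]) @ profiles.T
X += rng.normal(0.0, 0.05, X.shape)          # measurement noise

# PCA: eigen-decomposition of the covariance of the centred data.
Xc = X - X.mean(axis=0)
evals, evecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
order = np.argsort(evals)[::-1]
explained = evals[order] / evals.sum()
print(explained.round(2))  # the first two components carry nearly all variance
```

The number of dominant eigenvalues indicates how many independent emission sources the measured mixture supports, which is how PCA reduces the dimensionality of the monitoring data.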
Tenenbaum, Joel
This thesis applies statistical physics concepts and methods to quantitatively analyze complex systems. The thesis is separated into four parts: (i) characteristics of earthquake systems, (ii) memory and volatility in data time series, (iii) the application of part (ii) to world financial markets, and (iv) statistical observations on the evolution of word usage. In Part I, we observe statistical patterns in the occurrence of earthquakes. We select a 14-year earthquake catalog covering the archipelago of Japan. We find that regions traditionally thought of as being too distant from one another for causal contact display remarkably high correlations, and the networks that result have a tendency to link highly connected areas with other highly connected areas. In Part II, we introduce and apply the concept of "volatility asymmetry", the primary use of which is in financial data. We explain the relation between memory and "volatility asymmetry" in terms of an asymmetry parameter lambda. We define a litmus test for determining whether lambda is statistically significant, propose a stochastic model based on this parameter, and use the model to further explain empirical data. In Part III, we expand on volatility asymmetry. Importing the concepts of time dependence and universality from physics, we explore the aspects of emerging (or "transition") economies in Eastern Europe as they relate to asymmetry. We find that these emerging markets in some instances behave like developed markets and in other instances do not, and that the distinction is a matter both of country and of time period, with crisis periods showing different asymmetry characteristics than "healthy" periods. In Part IV, we take note of a series of findings in econophysics, showing statistical growth similarities between a variety of different areas that all have in common the fact of taking place in areas that are both (i) competing and (ii) dynamic. We show that this same growth distribution can be
Industrial statistics with Minitab
Cintas, Pere Grima; Llabres, Xavier Tort-Martorell
2012-01-01
Industrial Statistics with MINITAB demonstrates the use of MINITAB as a tool for performing statistical analysis in an industrial context. This book covers introductory industrial statistics, exploring the most commonly used techniques alongside those that serve to give an overview of more complex issues. A plethora of examples in MINITAB are featured along with case studies for each of the statistical techniques presented. Industrial Statistics with MINITAB: Provides comprehensive coverage of user-friendly practical guidance to the essential statistical methods applied in industry. Explores
Statistically motivated model of mechanisms controlling evolution of deformation band substructure
Czech Academy of Sciences Publication Activity Database
Kratochvíl, J.; Kružík, Martin
2016-01-01
Roč. 81, č. 1 (2016), s. 196-208 ISSN 0749-6419 Grant - others:GA ČR(CZ) GAP107/12/0121 Institutional support: RVO:67985556 Keywords : Crystal plasticity * Microstructures * Deformation bands Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 5.702, year: 2016 http://library.utia.cas.cz/separaty/2016/MTR/kruzik-0457407.pdf
U mosta slúžiti. International scholarly conference on the 800th anniversary of the founding of the Order of Preachers (Dominicans)
Czech Academy of Sciences Publication Activity Database
Vytlačil, Lukáš
2016-01-01
Roč. 53, 2/3 (2016), s. 302-304 ISSN 0018-7003. [U mosta slúžiti. International scholarly conference on the 800th anniversary of the founding of the Order of Preachers (Dominicans). Prague, 29.09.2016-30.09.2016] Institutional support: RVO:68378076 Keywords : Dominicans * conferences * medieval music * graduals * medieval music theory * Hieronymus de Moravia Subject RIV: AL - Art, Architecture, Cultural Heritage
PAFit: A Statistical Method for Measuring Preferential Attachment in Temporal Complex Networks.
Directory of Open Access Journals (Sweden)
Thong Pham
Full Text Available Preferential attachment is a stochastic process that has been proposed to explain certain topological features characteristic of complex networks from diverse domains. The systematic investigation of preferential attachment is an important area of research in network science, not only for the theoretical matter of verifying whether this hypothesized process is operative in real-world networks, but also for the practical insights that follow from knowledge of its functional form. Here we describe a maximum likelihood based estimation method for the measurement of preferential attachment in temporal complex networks. We call the method PAFit, and implement it in an R package of the same name. PAFit constitutes an advance over previous methods primarily because we based it on a nonparametric statistical framework that enables attachment kernel estimation free of any assumptions about its functional form. We show that this results in PAFit outperforming the popular methods of Jeong and Newman in Monte Carlo simulations. What is more, we found that the application of PAFit to a publicly available Flickr social network dataset yielded clear evidence for a deviation of the attachment kernel from the popularly assumed log-linear form. Independent of our main work, we provide a correction to a consequential error in Newman's original method which had evidently gone unnoticed since its publication over a decade ago.
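The idea of nonparametric attachment-kernel estimation can be illustrated with a simplified sketch. This is not the PAFit algorithm (PAFit uses maximum likelihood); here each new node makes one degree-proportional attachment, and the kernel is estimated by normalizing attachment counts by exposure counts:

```python
import random
from collections import defaultdict

def grow(n_nodes, seed=0):
    """Grow a network in which each new node links to one existing node,
    chosen with probability proportional to degree (true kernel A_k = k)."""
    rng = random.Random(seed)
    deg = {0: 1, 1: 1}            # start from a single edge
    urn = [0, 1]                  # node v appears deg[v] times: O(1) sampling
    degcount = defaultdict(int, {1: 2})
    attach = defaultdict(int)     # attachments received, keyed by degree
    exposure = defaultdict(int)   # node-steps at which each degree was present
    for new in range(2, n_nodes):
        for k, c in degcount.items():
            exposure[k] += c
        target = rng.choice(urn)
        k = deg[target]
        attach[k] += 1
        degcount[k] -= 1
        degcount[k + 1] += 1
        deg[target] = k + 1
        deg[new] = 1
        degcount[1] += 1
        urn.extend([target, new])
    return attach, exposure

attach, exposure = grow(20000)
# Nonparametric kernel estimate: attachments normalized by exposure.
A = {k: attach[k] / exposure[k] for k in sorted(exposure) if exposure[k] > 1000}
print({k: round(A[k] / A[1], 1) for k in list(A)[:4]})  # roughly A_k/A_1 ~ k
```

Because no functional form is assumed, the same estimator would also reveal a kernel that deviates from the log-linear form A_k = k^alpha, which is the kind of deviation the abstract reports for the Flickr data.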
Equilibrium statistical mechanics
Jackson, E Atlee
2000-01-01
Ideal as an elementary introduction to equilibrium statistical mechanics, this volume covers both classical and quantum methodology for open and closed systems. Introductory chapters familiarize readers with probability and microscopic models of systems, while additional chapters describe the general derivation of the fundamental statistical mechanics relationships. The final chapter contains 16 sections, each dealing with a different application, ordered according to complexity, from classical through degenerate quantum statistical mechanics. Key features include an elementary introduction t
Complex dynamics of a new 3D Lorenz-type autonomous chaotic ...
Indian Academy of Sciences (India)
Fuchen Zhang
2017-11-17
Boslaugh, Sarah
2013-01-01
Need to learn statistics for your job? Want help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference for anyone new to the subject. Thoroughly revised and expanded, this edition helps you gain a solid understanding of statistics without the numbing complexity of many college texts. Each chapter presents easy-to-follow descriptions, along with graphics, formulas, solved examples, and hands-on exercises. If you want to perform common statistical analyses and learn a wide range of techniques without getting in over your head, this is your book.
Evolutionary Statistical Procedures
Baragona, Roberto; Poli, Irene
2011-01-01
This proposed text appears to be a good introduction to evolutionary computation for use in applied statistics research. The authors draw from a vast base of knowledge about the current literature in both the design of evolutionary algorithms and statistical techniques. Modern statistical research is on the threshold of solving increasingly complex problems in high dimensions, and the generalization of its methodology to parameters whose estimators do not follow mathematically simple distributions is underway. Many of these challenges involve optimizing functions for which analytic solutions a
Kassem, M.; Soize, C.; Gagliardini, L.
2009-06-01
In this paper, an energy-density field approach applied to the vibroacoustic analysis of complex industrial structures in the low- and medium-frequency ranges is presented. This approach uses a statistical computational model. The analyzed system consists of an automotive vehicle structure coupled with its internal acoustic cavity. The objective of this paper is to make use of the statistical properties of the frequency response functions of the vibroacoustic system observed from previous experimental and numerical work. The frequency response functions are expressed in terms of a dimensionless matrix which is estimated using the proposed energy approach. Using this dimensionless matrix, a simplified vibroacoustic model is proposed.
Polarimetric Segmentation Using Wishart Test Statistic
DEFF Research Database (Denmark)
Skriver, Henning; Schou, Jesper; Nielsen, Allan Aasbjerg
2002-01-01
A newly developed test statistic for equality of two complex covariance matrices following the complex Wishart distribution, and an associated asymptotic probability for the test statistic, has been used in a segmentation algorithm. The segmentation algorithm is based on the MUM (merge using moments) approach, which is a merging algorithm for single-channel SAR images. The polarimetric version described in this paper uses the above-mentioned test statistic for merging. The segmentation algorithm has been applied to polarimetric SAR data from the Danish dual-frequency, airborne polarimetric SAR, EMISAR...
Software Alchemy: Turning Complex Statistical Computations into Embarrassingly-Parallel Ones
Directory of Open Access Journals (Sweden)
Norman Matloff
2016-07-01
Full Text Available The growth in the use of computationally intensive statistical procedures, especially with big data, has necessitated the usage of parallel computation on diverse platforms such as multicore, GPUs, clusters and clouds. However, slowdown due to interprocess communication costs typically limits such methods to "embarrassingly parallel" (EP) algorithms, especially on non-shared memory platforms. This paper develops a broadly applicable method for converting many non-EP algorithms into statistically equivalent EP ones. The method is shown to yield excellent levels of speedup for a variety of statistical computations. It also overcomes certain problems of memory limitations.
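The core idea — apply the estimator independently to data chunks and average the chunk estimates — can be sketched as follows. The regression example and chunk count are illustrative assumptions; a real implementation would dispatch the chunks to parallel workers:

```python
import random
import statistics

def slope(xs, ys):
    """Ordinary least-squares slope of y on x."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

rng = random.Random(42)
xs = [rng.uniform(0, 10) for _ in range(40000)]
ys = [2.0 * x + rng.gauss(0, 1) for x in xs]

full = slope(xs, ys)              # estimator applied to the full data set

# Chunk averaging: apply the same estimator to r disjoint chunks and
# average -- each chunk is an embarrassingly parallel work unit.
r = 8
size = len(xs) // r
chunks = [slope(xs[i*size:(i+1)*size], ys[i*size:(i+1)*size]) for i in range(r)]
ca = statistics.fmean(chunks)

print(round(full, 3), round(ca, 3))  # both close to the true slope 2.0
```

The chunk average has the same asymptotic distribution as the full-data estimator for smooth estimators, which is the sense in which the converted algorithm is "statistically equivalent".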
Naghshpour, Shahdad
2012-01-01
Statistics is the branch of mathematics that deals with real-life problems. As such, it is an essential tool for economists. Unfortunately, the way you and many other economists learn the concept of statistics is not compatible with the way economists think and learn. The problem is worsened by the use of mathematical jargon and complex derivations. Here's a book that proves none of this is necessary. All the examples and exercises in this book are constructed within the field of economics, thus eliminating the difficulty of learning statistics with examples from fields that have no relation to business, politics, or policy. Statistics is, in fact, not more difficult than economics. Anyone who can comprehend economics can understand and use statistics successfully within this field, including you! This book utilizes Microsoft Excel to obtain statistical results, as well as to perform additional necessary computations. Microsoft Excel is not the software of choice for performing sophisticated statistical analy...
Statistical Analysis of Big Data on Pharmacogenomics
Fan, Jianqing; Liu, Han
2013-01-01
This paper discusses statistical methods for estimating complex correlation structure from large pharmacogenomic datasets. We selectively review several prominent statistical methods for estimating large covariance matrix for understanding correlation structure, inverse covariance matrix for network modeling, large-scale simultaneous tests for selecting significantly differently expressed genes and proteins and genetic markers for complex diseases, and high dimensional variable selection for identifying important molecules for understanding molecule mechanisms in pharmacogenomics. Their applications to gene network estimation and biomarker selection are used to illustrate the methodological power. Several new challenges of Big data analysis, including complex data distribution, missing data, measurement error, spurious correlation, endogeneity, and the need for robust statistical methods, are also discussed. PMID:23602905
Learning Predictive Statistics: Strategies and Brain Mechanisms.
Wang, Rui; Shen, Yuan; Tino, Peter; Welchman, Andrew E; Kourtzi, Zoe
2017-08-30
When immersed in a new environment, we are challenged to decipher initially incomprehensible streams of sensory information. However, quite rapidly, the brain finds structure and meaning in these incoming signals, helping us to predict and prepare ourselves for future actions. This skill relies on extracting the statistics of event streams in the environment that contain regularities of variable complexity from simple repetitive patterns to complex probabilistic combinations. Here, we test the brain mechanisms that mediate our ability to adapt to the environment's statistics and predict upcoming events. By combining behavioral training and multisession fMRI in human participants (male and female), we track the corticostriatal mechanisms that mediate learning of temporal sequences as they change in structure complexity. We show that learning of predictive structures relates to individual decision strategy; that is, selecting the most probable outcome in a given context (maximizing) versus matching the exact sequence statistics. These strategies engage distinct human brain regions: maximizing engages dorsolateral prefrontal, cingulate, sensory-motor regions, and basal ganglia (dorsal caudate, putamen), whereas matching engages occipitotemporal regions (including the hippocampus) and basal ganglia (ventral caudate). Our findings provide evidence for distinct corticostriatal mechanisms that facilitate our ability to extract behaviorally relevant statistics to make predictions. SIGNIFICANCE STATEMENT Making predictions about future events relies on interpreting streams of information that may initially appear incomprehensible. Past work has studied how humans identify repetitive patterns and associative pairings. However, the natural environment contains regularities that vary in complexity from simple repetition to complex probabilistic combinations. Here, we combine behavior and multisession fMRI to track the brain mechanisms that mediate our ability to adapt to
Boslaugh, Sarah
2008-01-01
Need to learn statistics as part of your job, or want some help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference that's perfect for anyone with no previous background in the subject. This book gives you a solid understanding of statistics without being too simple, yet without the numbing complexity of most college texts. You get a firm grasp of the fundamentals and a hands-on understanding of how to apply them before moving on to the more advanced material that follows. Each chapter presents you with easy-to-follow descriptions illustrat
Statistics and Data Interpretation for Social Work
Rosenthal, James
2011-01-01
"Without question, this text will be the most authoritative source of information on statistics in the human services. From my point of view, it is a definitive work that combines a rigorous pedagogy with a down-to-earth (commonsense) exploration of the complex and difficult issues in data analysis (statistics) and interpretation. I welcome its publication." -Praise for the First Edition. Written by a social worker for social work students, this is a nuts-and-bolts guide to statistics that presents complex calculations and concepts in clear, easy-to-understand language. It includes
Editorial to: Six papers on Dynamic Statistical Models
DEFF Research Database (Denmark)
2014-01-01
The following six papers are based on invited lectures at the satellite meeting held at the University of Copenhagen before the 58th World Statistics Congress of the International Statistical Institute in Dublin in 2011. At the invitation of the Bernoulli Society, the satellite meeting was organized around the theme "Dynamic Statistical Models" as a part of the Program of Excellence at the University of Copenhagen on "Statistical methods for complex and high dimensional models" (http://statistics.ku.dk/). The Excellence Program in Statistics was a research project to develop and investigate statistical methodology and theory for large and complex data sets that included biostatisticians and mathematical statisticians from three faculties at the University of Copenhagen. The satellite meeting took place August 17–19, 2011. Its purpose was to bring together researchers in statistics and related fields.
Capturing rogue waves by multi-point statistics
International Nuclear Information System (INIS)
Hadjihosseini, A; Wächter, Matthias; Peinke, J; Hoffmann, N P
2016-01-01
As an example of a complex system with extreme events, we investigate ocean wave states exhibiting rogue waves. We present a statistical method of data analysis based on multi-point statistics which for the first time allows the grasping of extreme rogue wave events in a highly satisfactory statistical manner. The key to the success of the approach is mapping the complexity of multi-point data onto the statistics of hierarchically ordered height increments for different time scales, for which we can show that a stochastic cascade process with Markov properties is governed by a Fokker–Planck equation. Conditional probabilities as well as the Fokker–Planck equation itself can be estimated directly from the available observational data. With this stochastic description surrogate data sets can in turn be generated, which makes it possible to work out arbitrary statistical features of the complex sea state in general, and extreme rogue wave events in particular. The results also open up new perspectives for forecasting the occurrence probability of extreme rogue wave events, and even for forecasting the occurrence of individual rogue waves based on precursory dynamics. (paper)
Complex network approach to characterize the statistical features of the sunspot series
International Nuclear Information System (INIS)
Zou, Yong; Liu, Zonghua; Small, Michael; Kurths, Jürgen
2014-01-01
Complex network approaches have been recently developed as an alternative framework to study the statistical features of time-series data. We perform a visibility-graph analysis on both the daily and monthly sunspot series. Based on the data, we propose two ways to construct the network: one is from the original observable measurements and the other is from a negative-inverse-transformed series. The degree distribution of the derived networks for the strong maxima has clear non-Gaussian properties, while the degree distribution for minima is bimodal. The long-term variation of the cycles is reflected by hubs in the network that span relatively large time intervals. Based on standard network structural measures, we propose to characterize the long-term correlations by waiting times between two subsequent events. The persistence range of the solar cycles has been identified over 15–1000 days by a power-law regime with scaling exponent γ = 2.04 of the occurrence time of two subsequent strong minima. In contrast, a persistent trend is not present in the maximal numbers, although maxima do have significant deviations from an exponential form. Our results suggest some new insights for evaluating existing models. (paper)
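A natural visibility graph of the kind used above can be built directly from its definition. This is a generic O(n²) construction, not the authors' code; the toy series below stands in for a sunspot record, with a single spike playing the role of a cycle maximum:

```python
def visibility_graph(series):
    """Natural visibility graph: nodes are time points; (a, b) is an edge
    when every intermediate point lies strictly below the straight line
    joining (a, y_a) and (b, y_b)."""
    n = len(series)
    edges = set()
    for a in range(n):
        for b in range(a + 1, n):
            ya, yb = series[a], series[b]
            visible = all(
                series[c] < ya + (yb - ya) * (c - a) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                edges.add((a, b))
    return edges

# A single spike "sees" every other point and becomes a hub of the graph,
# mirroring how strong maxima appear as hubs in the sunspot networks.
series = [1.0, 2.0, 1.5, 9.0, 1.2, 2.5, 1.1]
edges = visibility_graph(series)
degree = {i: sum(1 for e in edges if i in e) for i in range(len(series))}
print(max(degree, key=degree.get))  # prints 3, the index of the spike
```

The degree distribution of such a graph is what the abstract examines for Gaussianity and bimodality in the maxima and minima series.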
Functional statistics and related fields
Bongiorno, Enea; Cao, Ricardo; Vieu, Philippe
2017-01-01
This volume collects the latest methodological and applied contributions on functional, high-dimensional and other complex data, related statistical models and tools, as well as on operator-based statistics. It contains selected and refereed contributions presented at the Fourth International Workshop on Functional and Operatorial Statistics (IWFOS 2017), held in A Coruña, Spain, from 15 to 17 June 2017. The series of IWFOS workshops was initiated by the Working Group on Functional and Operatorial Statistics at the University of Toulouse in 2008. Since then, many of the major advances in functional statistics and related fields have been periodically presented and discussed at the IWFOS workshops.
Koorehdavoudi, Hana; Bogdan, Paul
2016-06-01
Biological systems are frequently categorized as complex systems due to their capabilities of generating spatio-temporal structures from apparent random decisions. In spite of research on analyzing biological systems, we lack a quantifiable framework for measuring their complexity. To fill this gap, in this paper, we develop a new paradigm to study a collective group of N agents moving and interacting in a three-dimensional space. Our paradigm helps to identify the spatio-temporal states of the motion of the group and their associated transition probabilities. This framework enables the estimation of the free energy landscape corresponding to the identified states. Based on the energy landscape, we quantify missing information, emergence, self-organization and complexity for a collective motion. We show that the collective motion of the group of agents evolves to reach the most probable state with relatively lowest energy level and lowest missing information compared to other possible states. Our analysis demonstrates that the natural group of animals exhibit a higher degree of emergence, self-organization and complexity over time. Consequently, this algorithm can be integrated into new frameworks to engineer collective motions to achieve certain degrees of emergence, self-organization and complexity.
Statistical benchmark for BosonSampling
International Nuclear Information System (INIS)
Walschaers, Mattia; Mayer, Klaus; Buchleitner, Andreas; Kuipers, Jack; Urbina, Juan-Diego; Richter, Klaus; Tichy, Malte Christopher
2016-01-01
Boson samplers—set-ups that generate complex many-particle output states through the transmission of elementary many-particle input states across a multitude of mutually coupled modes—promise the efficient quantum simulation of a classically intractable computational task, and challenge the extended Church–Turing thesis, one of the fundamental dogmas of computer science. However, as in all experimental quantum simulations of truly complex systems, one crucial problem remains: how to certify that a given experimental measurement record unambiguously results from enforcing the claimed dynamics, on bosons, fermions or distinguishable particles? Here we offer a statistical solution to the certification problem, identifying an unambiguous statistical signature of many-body quantum interference upon transmission across a multimode, random scattering device. We show that statistical analysis of only partial information on the output state allows to characterise the imparted dynamics through particle type-specific features of the emerging interference patterns. The relevant statistical quantifiers are classically computable, define a falsifiable benchmark for BosonSampling, and reveal distinctive features of many-particle quantum dynamics, which go much beyond mere bunching or anti-bunching effects. (fast track communication)
Goodman, J. W.
This book is based on the thesis that some training in the area of statistical optics should be included as a standard part of any advanced optics curriculum. Random variables are discussed, taking into account definitions of probability and random variables, distribution functions and density functions, an extension to two or more random variables, statistical averages, transformations of random variables, sums of real random variables, Gaussian random variables, complex-valued random variables, and random phasor sums. Other subjects examined are related to random processes, some first-order properties of light waves, the coherence of optical waves, some problems involving high-order coherence, effects of partial coherence on imaging systems, imaging in the presence of randomly inhomogeneous media, and fundamental limits in photoelectric detection of light. Attention is given to deterministic versus statistical phenomena and models, the Fourier transform, and the fourth-order moment of the spectrum of a detected speckle image.
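The random phasor sums mentioned at the end of the list above can be illustrated numerically. This sketch assumes unit-amplitude phasors with uniform random phases (a standard textbook setting, not taken from the book's own examples) and reproduces the classical result that the resulting intensity is approximately exponentially distributed — fully developed speckle:

```python
import cmath
import math
import random

def random_phasor_sum(n_phasors, rng):
    """Normalized sum of unit-amplitude phasors with uniform random phases."""
    return sum(cmath.exp(1j * rng.uniform(0, 2 * math.pi))
               for _ in range(n_phasors)) / math.sqrt(n_phasors)

rng = random.Random(1)
samples = [random_phasor_sum(100, rng) for _ in range(20000)]

# By the central limit theorem the real and imaginary parts are each
# ~ N(0, 1/2), so the intensity |A|^2 is approximately exponential with
# unit mean, and P(I < mean) ~ 1 - 1/e ~ 0.63.
intensity = [abs(a) ** 2 for a in samples]
mean_i = sum(intensity) / len(intensity)
frac_below_mean = sum(1 for i in intensity if i < mean_i) / len(intensity)
print(round(mean_i, 2), round(frac_below_mean, 2))  # ~1.0 and ~0.63
```

The same construction underlies the speckle statistics treated in the book's later chapters on imaging through randomly inhomogeneous media.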
Kalegowda, Yogesh; Harmer, Sarah L
2012-03-20
Time-of-flight secondary ion mass spectrometry (TOF-SIMS) spectra of mineral samples are complex, comprised of large mass ranges and many peaks. Consequently, characterization and classification analysis of these systems is challenging. In this study, different chemometric and statistical data evaluation methods, based on monolayer sensitive TOF-SIMS data, have been tested for the characterization and classification of copper-iron sulfide minerals (chalcopyrite, chalcocite, bornite, and pyrite) at different flotation pulp conditions (feed, conditioned feed, and Eh modified). The complex mass spectral data sets were analyzed using the following chemometric and statistical techniques: principal component analysis (PCA); principal component-discriminant functional analysis (PC-DFA); soft independent modeling of class analogy (SIMCA); and k-Nearest Neighbor (k-NN) classification. PCA was found to be an important first step in multivariate analysis, providing insight into both the relative grouping of samples and the elemental/molecular basis for those groupings. For samples exposed to oxidative conditions (at Eh ~430 mV), each technique (PCA, PC-DFA, SIMCA, and k-NN) was found to produce excellent classification. For samples at reductive conditions (at Eh ~ -200 mV SHE), k-NN and SIMCA produced the most accurate classification. Phase identification of particles that contain the same elements but a different crystal structure in a mixed multimetal mineral system has been achieved.
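A k-NN classification step like the one compared in this study can be sketched generically. The two-feature "spectra" and mineral labels below are toy assumptions, not TOF-SIMS data; in practice the feature vectors would be (dimension-reduced) peak intensities:

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify a query sample by majority vote among the k nearest
    training samples (Euclidean distance in feature space)."""
    dists = sorted((math.dist(x, query), label) for x, label in train)
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy 2-feature "spectra" for two hypothetical mineral classes.
train = [
    ((1.0, 0.1), "chalcopyrite"), ((1.1, 0.2), "chalcopyrite"),
    ((0.9, 0.15), "chalcopyrite"),
    ((0.2, 1.0), "pyrite"), ((0.3, 0.9), "pyrite"), ((0.15, 1.1), "pyrite"),
]
print(knn_predict(train, (1.05, 0.12)))  # prints "chalcopyrite"
```

Running k-NN after PCA, as in the study, amounts to computing these distances in the reduced principal-component space rather than over the raw mass spectrum.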
Geographical National Condition and Complex System
Directory of Open Access Journals (Sweden)
WANG Jiayao
2016-01-01
Full Text Available The significance of studying the complex system of geographical national conditions lies in rationally expressing the complex relationships of the "resources-environment-ecology-economy-society" system. Aiming at the problems faced by the statistical analysis of geographical national conditions, including the disunity of research contents, the inconsistency of scope and the uncertainty of goals, the present paper conducts a range of discussions from the perspectives of concept, theory and method, and designs solutions based on complex system theory and coordination-degree analysis methods. By analyzing the concepts of geographical national conditions, the geographical national conditions survey and geographical national conditions statistical analysis, as well as the relationships between them, the statistical contents and the analytical scope of geographical national conditions are clarified and defined. The investigation also clarifies the goals of the statistical analysis by analyzing the basic characteristics of geographical national conditions and of the complex system, and the consistency between coordination-degree analysis and statistical analysis. It outlines these goals, proposes a concept for the complex system of geographical national conditions, and describes that concept. Complex system theory provides new theoretical guidance for the statistical analysis of geographical national conditions, and the degree of coordination offers new approaches for undertaking the analysis, based on the measurement method and decision-making analysis scheme upon which the complex system of geographical national conditions is based. The analysis examines the overall trend via the degree of coordination of the complex system at the macro level, and determines the direction of remediation at the micro level from the degree of coordination among subsystems and within single systems. These results establish
DEFF Research Database (Denmark)
Larsen, Jeppe Veirum; Knoche, Hendrik
2017-01-01
Musical instruments and musical user interfaces provide rich input and feedback through mostly tangible interactions, resulting in complex behavior. However, publications of novel interfaces often lack the required detail due to the complexity or the focus on a specific part of the interfaces a...
Bayesian Statistics and Uncertainty Quantification for Safety Boundary Analysis in Complex Systems
He, Yuning; Davies, Misty Dawn
2014-01-01
The analysis of a safety-critical system often requires detailed knowledge of safe regions and their high-dimensional non-linear boundaries. We present a statistical approach to iteratively detect and characterize the boundaries, which are provided as parameterized shape candidates. Using methods from uncertainty quantification and active learning, we incrementally construct a statistical model from only a few simulation runs and obtain statistically sound estimates of the shape parameters for safety boundaries.
Practical statistics a handbook for business projects
Buglear, John
2013-01-01
Practical Statistics is a hands-on guide to statistics, progressing by complexity of data (univariate, bivariate, multivariate) and analysis (portray, summarise, generalise) in order to give the reader a solid understanding of the fundamentals and how to apply them.
Statistical physics of non-thermal phase transitions from foundations to applications
Abaimov, Sergey G
2015-01-01
Statistical physics can be used to better understand non-thermal complex systems—phenomena such as stock-market crashes, revolutions in society and in science, fractures in engineered materials and in the Earth’s crust, catastrophes, traffic jams, petroleum clusters, polymerization, self-organized criticality and many others exhibit behaviors resembling those of thermodynamic systems. In particular, many of these systems possess phase transitions identical to critical or spinodal phenomena in statistical physics. The application of the well-developed formalism of statistical physics to non-thermal complex systems may help to predict and prevent such catastrophes as earthquakes, snow-avalanches and landslides, failure of engineering structures, or economical crises. This book addresses the issue step-by-step, from phenomenological analogies between complex systems and statistical physics to more complex aspects, such as correlations, fluctuation-dissipation theorem, susceptibility, the concept of free ener...
A New Look at Worst Case Complexity: A Statistical Approach
Directory of Open Access Journals (Sweden)
Niraj Kumar Singh
2014-01-01
Full Text Available We present a new and improved worst-case complexity model for quicksort, y_worst(n, t_d) = b_0 + b_1·n² + g(n, t_d) + ε, where the left-hand side gives the worst-case time complexity, n is the input size, t_d is the frequency of sample elements, and g(n, t_d) is a function of both the input size n and the parameter t_d. The remaining terms, arising from linear regression, have their usual meanings. We claim this to be an improvement over the conventional model, namely y_worst(n) = b_0 + b_1·n + b_2·n² + ε, which stems from the worst-case O(n²) complexity of this algorithm.
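As a sketch of the regression idea behind such empirical complexity models (assuming a first-element-pivot quicksort, for which already-sorted input triggers the O(n²) worst case), one can count comparisons and fit y = b0 + b1·n² by ordinary least squares; the function names and inputs are illustrative, not the paper's exact setup.

```python
import sys
sys.setrecursionlimit(5000)  # sorted-input worst case recurses to depth n

def quicksort_comparisons(a):
    """Count comparisons made by first-element-pivot quicksort (in place)."""
    count = 0
    def qs(lo, hi):
        nonlocal count
        if lo >= hi:
            return
        pivot, i = a[lo], lo
        for j in range(lo + 1, hi + 1):
            count += 1
            if a[j] < pivot:
                i += 1
                a[i], a[j] = a[j], a[i]
        a[lo], a[i] = a[i], a[lo]
        qs(lo, i - 1)
        qs(i + 1, hi)
    qs(0, len(a) - 1)
    return count

def fit_quadratic(ns, ys):
    """Ordinary least squares for y = b0 + b1 * n^2 (2x2 normal equations)."""
    xs = [n * n for n in ns]
    m = len(ns)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    b1 = (m * sxy - sx * sy) / (m * sxx - sx * sx)
    b0 = (sy - b1 * sx) / m
    return b0, b1

ns = [50, 100, 200, 400]
ys = [quicksort_comparisons(list(range(n))) for n in ns]  # sorted = worst case
b0, b1 = fit_quadratic(ns, ys)
print(round(b1, 2))  # 0.5
```

On sorted input the comparison count is exactly n(n-1)/2, so the fitted b1 lands at about 0.5; the paper's g(n, t_d) term additionally captures the effect of repeated elements, which this sketch omits.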
Two statistical mechanics aspects of complex networks
Thurner, Stefan; Biely, Christoly
2006-12-01
By adopting an ensemble interpretation of non-growing rewiring networks, network theory can be reduced to a counting problem of possible network states and an identification of their associated probabilities. We present two scenarios of how different rewiring schemes can be used to control the state probabilities of the system. In particular, we review how, by generalizing the linking rules of random graphs in combination with superstatistics and quantum mechanical concepts, one can establish an exact relation between the degree distribution of any given network and the nodes’ linking probability distributions. In a second approach, we control state probabilities by a network Hamiltonian, whose characteristics are motivated by biological and socio-economical statistical systems. We demonstrate that a thermodynamics of networks becomes a fully consistent concept, allowing one to study, e.g., ‘phase transitions’ and to compute entropies through thermodynamic relations.
Complexity of universality and related problems for partially ordered NFAs
Czech Academy of Sciences Publication Activity Database
Krötzsch, M.; Masopust, Tomáš; Thomazo, M.
2017-01-01
Roč. 255, č. 1 (2017), s. 177-192 ISSN 0890-5401 Institutional support: RVO:67985840 Keywords: nondeterministic automata * partial order * universality Subject RIV: BA - General Mathematics OBOR OECD: Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8) Impact factor: 1.050, year: 2016 http://www.sciencedirect.com/science/article/pii/S0890540117300998?via%3Dihub
General Framework for Evaluating Password Complexity and Strength
2015-11-15
MD5 or SHA-1. Finally, an adversary is represented 1. We will drop the subscript when it is clear which alphabet is being considered. 2. We use an...when information about the protection function or attacker's capabilities is not clear (the most typical scenario when users are asked to select a...stronger password requirements: User attitudes and behaviors," in Proceedings of the Sixth Symposium on Usable Privacy and Security, ser. SOUPS '10. New
Statistical distribution of resonance parameters for inelastic scattering of fast neutrons
International Nuclear Information System (INIS)
Radunovic, J.
1973-01-01
This paper deals with the application of statistical methods to the analysis of nuclear reactions involving complex nuclei. It is shown that inelastic neutron scattering, which proceeds through the creation of a complex nucleus in the higher energy range, can be treated by a statistical approach
Landsman, V; Lou, W Y W; Graubard, B I
2015-05-20
We present a two-step approach for estimating hazard rates and, consequently, survival probabilities, by levels of general categorical exposure. The resulting estimator utilizes three sources of data: vital statistics data and census data are used at the first step to estimate the overall hazard rate for a given combination of gender and age group, and cohort data, constructed from a nationally representative complex survey with linked mortality records, are used at the second step to divide the overall hazard rate by exposure levels. We present an explicit expression for the resulting estimator and consider two methods for variance estimation that account for complex multistage sample design: (1) the leaving-one-out jackknife method, and (2) the Taylor linearization method, which provides an analytic formula for the variance estimator. The methods are illustrated with smoking and all-cause mortality data from the US National Health Interview Survey Linked Mortality Files, and the proposed estimator is compared with a previously studied crude hazard rate estimator that uses survey data only. The advantages of a two-step approach and possible extensions of the proposed estimator are discussed. Copyright © 2015 John Wiley & Sons, Ltd.
Synthesis, spectroscopic characterization and catalytic oxidation ...
Indian Academy of Sciences (India)
were characterized by infrared, electronic, electron paramagnetic resonance ... The catalytic oxidation property of ruthenium(III) complexes was also ... cies at room temperature. ..... aldehyde part of Schiff base ligands, catalytic activity of new ...
Palozzi, Jason; Pantopoulos, George; Maravelis, Angelos G.; Nordsvan, Adam; Zelilidis, Avraam
2018-02-01
This investigation presents an outcrop-based integrated study of internal division analysis and statistical treatment of turbidite bed thickness applied to a Carboniferous deep-water channel-levee complex in the Myall Trough, southeast Australia. Turbidite beds of the studied succession are characterized by a range of sedimentary structures grouped into two main associations, a thick-bedded and a thin-bedded one, that reflect channel-fill and overbank/levee deposits, respectively. Three vertically stacked channel-levee cycles have been identified. Results of statistical analysis of bed thickness, grain-size and internal division patterns applied on the studied channel-levee succession, indicate that turbidite bed thickness data seem to be well characterized by a bimodal lognormal distribution, which is possibly reflecting the difference between deposition from lower-density flows (in a levee/overbank setting) and very high-density flows (in a channel fill setting). Power law and exponential distributions were observed to hold only for the thick-bedded parts of the succession and cannot characterize the whole bed thickness range of the studied sediments. The succession also exhibits non-random clustering of bed thickness and grain-size measurements. The studied sediments are also characterized by the presence of statistically detected fining-upward sandstone packets. A novel quantitative approach (change-point analysis) is proposed for the detection of those packets. Markov permutation statistics also revealed the existence of order in the alternation of internal divisions in the succession expressed by an optimal internal division cycle reflecting two main types of gravity flow events deposited within both thick-bedded conglomeratic and thin-bedded sandstone associations. The analytical methods presented in this study can be used as additional tools for quantitative analysis and recognition of depositional environments in hydrocarbon-bearing research of ancient
Statistical Literacy in the Data Science Workplace
Grant, Robert
2017-01-01
Statistical literacy, the ability to understand and make use of statistical information including methods, has particular relevance in the age of data science, when complex analyses are undertaken by teams from diverse backgrounds. Not only is it essential to communicate to the consumers of information but also within the team. Writing from the…
DEFF Research Database (Denmark)
Bollerslev, Tim; Todorov, Victor
solely be explained by the level of the volatility. Our empirical investigations are essentially model-free. We estimate the expected values of the tails under the statistical probability measure from "medium" size jumps in high-frequency intraday prices and an extreme value theory approximation
Learning predictive statistics from temporal sequences: Dynamics and strategies.
Wang, Rui; Shen, Yuan; Tino, Peter; Welchman, Andrew E; Kourtzi, Zoe
2017-10-01
Human behavior is guided by our expectations about the future. Often, we make predictions by monitoring how event sequences unfold, even though such sequences may appear incomprehensible. Event structures in the natural environment typically vary in complexity, from simple repetition to complex probabilistic combinations. How do we learn these structures? Here we investigate the dynamics of structure learning by tracking human responses to temporal sequences that change in structure unbeknownst to the participants. Participants were asked to predict the upcoming item following a probabilistic sequence of symbols. Using a Markov process, we created a family of sequences, from simple frequency statistics (e.g., some symbols are more probable than others) to context-based statistics (e.g., symbol probability is contingent on preceding symbols). We demonstrate the dynamics with which individuals adapt to changes in the environment's statistics; that is, they extract the behaviorally relevant structures to make predictions about upcoming events. Further, we show that this structure learning relates to individual decision strategy; faster learning of complex structures relates to selection of the most probable outcome in a given context (maximizing) rather than matching of the exact sequence statistics. Our findings provide evidence for alternate routes to learning of behaviorally relevant statistics that facilitate our ability to predict future events in variable environments.
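A minimal sketch of the kind of context-based statistics described above, assuming a first-order Markov model over a hypothetical symbol stream; the "maximizing" strategy corresponds to always choosing the most frequent continuation for the current context.

```python
from collections import Counter, defaultdict

def fit_markov(seq):
    """Context-based statistics: count successors for each 1-symbol context."""
    counts = defaultdict(Counter)
    for prev, nxt in zip(seq, seq[1:]):
        counts[prev][nxt] += 1
    return counts

def predict_max(counts, context):
    # "Maximizing" strategy: always pick the most frequent continuation.
    return counts[context].most_common(1)[0][0]

seq = list("ABACABACABAD")           # hypothetical symbol stream
model = fit_markov(seq)
print(predict_max(model, "A"))       # B  (A is followed by B 3x, C 2x, D 1x)
```

Probability "matching", by contrast, would choose B, C, or D in proportion to their observed counts rather than always taking the mode.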
DEFF Research Database (Denmark)
Zhang, Shun; Yang, Yi; Hanson, Steen Grüner
2015-01-01
for the superiority of the proposed PSVC technique, we study the statistical properties of the spatial derivatives of the complex signal representation generated from the Riesz transform. Under the assumption of a Gaussian random process, a theoretical analysis for the pseudo Stokes vector correlation has been...... provided. Based on these results, we show mathematically that PSVC has a performance advantage over conventional intensity-based correlation technique....
Fish: A New Computer Program for Friendly Introductory Statistics Help
Brooks, Gordon P.; Raffle, Holly
2005-01-01
All introductory statistics students must master certain basic descriptive statistics, including means, standard deviations and correlations. Students must also gain insight into such complex concepts as the central limit theorem and standard error. This article introduces and describes the Friendly Introductory Statistics Help (FISH) computer…
COMPARATIVE STATISTICAL ANALYSIS OF GENOTYPES’ COMBINING
Directory of Open Access Journals (Sweden)
V. Z. Stetsyuk
2015-05-01
The program provides a desktop software suite for statistical calculations on a doctor's personal computer. Modern methods and tools for the development of information systems are described for creating the program.
Statistical Techniques for Project Control
Badiru, Adedeji B
2012-01-01
A project can be simple or complex. In each case, proven project management processes must be followed. In all cases of project management implementation, control must be exercised in order to assure that project objectives are achieved. Statistical Techniques for Project Control seamlessly integrates qualitative and quantitative tools and techniques for project control. It fills the void that exists in the application of statistical techniques to project control. The book begins by defining the fundamentals of project management then explores how to temper quantitative analysis with qualitati
Directory of Open Access Journals (Sweden)
D. F. Hurst
2011-12-01
Full Text Available We compare coincident, in situ, balloon-borne measurements of temperature (T) and pressure (P) by two radiosondes (Vaisala RS92, Intermet iMet-1-RSB) and similar measurements of relative humidity (RH) by RS92 sondes and frost point hygrometers. Data from a total of 28 balloon flights with at least one pair of radiosondes are analyzed in 1-km altitude bins to quantify measurement differences between the sonde sensors and how they vary with altitude. Each comparison (T, P, RH) exposes several profiles of anomalously large measurement differences. Measurement difference statistics, calculated with and without the anomalous profiles, are compared to uncertainties quoted by the radiosonde manufacturers. Excluding seven anomalous profiles, T differences between 19 pairs of RS92 and iMet sondes exceed their measurement uncertainty limits (2σ) 31% of the time and reveal a statistically significant, altitude-independent bias of 0.5 ± 0.2 °C. Similarly, RS92-iMet P differences in 22 non-anomalous profiles exceed their uncertainty limits 23% of the time, with a disproportionate 83% of the excessive P differences at altitudes >16 km. The RS92-iMet pressure differences increase smoothly from −0.6 hPa near the surface to 0.8 hPa above 25 km. Temperature and P differences between all 14 pairs of RS92 sondes exceed manufacturer-quoted reproducibility limits (σ) 28% and 11% of the time, respectively. About 95% of the excessive T differences are eliminated when 5 anomalous RS92-RS92 profiles are excluded. Only 5% of RH measurement differences between 14 pairs of RS92 sondes exceed the manufacturer's measurement reproducibility limit (σ). RH measurements by RS92 sondes are also compared to RH values calculated from frost point hygrometer measurements and coincident T measurements by the radiosondes. The influences of RS92-iMet T and P differences on RH values and water vapor mixing
Statistics: The stethoscope of a thinking urologist
Directory of Open Access Journals (Sweden)
Arun S Sivanandam
2009-01-01
Full Text Available Understanding statistical terminology and the ability to appraise clinical research findings and statistical tests are critical to the practice of evidence-based medicine. Urologists require statistics in their toolbox of skills in order to successfully sift through increasingly complex studies and realize the drawbacks of statistical tests. Currently, the level of evidence in urology literature is low, and the majority of research abstracts published for the American Urological Association (AUA) meetings lag behind in full-text publication because of a lack of statistical reporting. Underlying these issues is a distinct deficiency in solid comprehension of statistics in the literature and a discomfort with the application of statistics for clinical decision-making. This review examines the plight of statistics in urology and investigates the reason behind the white-coat aversion to biostatistics. Resources such as evidence-based medicine websites, primers in statistics, and guidelines for statistical reporting exist for quick reference by urologists. Ultimately, educators should take charge of monitoring statistical knowledge among trainees by bolstering competency requirements and creating sustained opportunities for statistics and methodology exposure.
Digital Repository Service at National Institute of Oceanography (India)
Biju, A.; Honey, U.K.; Kusum, K.K.; Jagadeesan, L.
[Table: station number, latitude (°N), and longitude (°E), with S. gracilis, temperature (°C), salinity, and density for the different seasons IMS (2004), SWM (2005), and NEM (2006).]
Simple statistical methods for software engineering data and patterns
Pandian, C Ravindranath
2015-01-01
Although there are countless books on statistics, few are dedicated to the application of statistical methods to software engineering. Simple Statistical Methods for Software Engineering: Data and Patterns fills that void. Instead of delving into overly complex statistics, the book details simpler solutions that are just as effective and connect with the intuition of problem solvers.Sharing valuable insights into software engineering problems and solutions, the book not only explains the required statistical methods, but also provides many examples, review questions, and case studies that prov
Statistical principles for prospective study protocols:
DEFF Research Database (Denmark)
Christensen, Robin; Langberg, Henning
2012-01-01
In the design of scientific studies it is essential to decide on which scientific questions one aims to answer, just as it is important to decide on the correct statistical methods to use to answer these questions. The correct use of statistical methods is crucial in all aspects of research...... to quantify relationships in data. Despite an increased focus on statistical content and complexity of biomedical research these topics remain difficult for most researchers. Statistical methods enable researchers to condense large spreadsheets with data into means, proportions, and difference between means......, risk differences, and other quantities that convey information. One of the goals in biomedical research is to develop parsimonious models - meaning as simple as possible. This approach is valid if the subsequent research report (the article) is written independent of whether the results...
Statistical and Computational Techniques in Manufacturing
2012-01-01
In recent years, interest in developing statistical and computational techniques for applied manufacturing engineering has increased. Today, due to the great complexity of manufacturing engineering and the high number of parameters used, conventional approaches are no longer sufficient. Therefore, in manufacturing, statistical and computational techniques have found several applications, namely, modelling and simulation of manufacturing processes, optimization of manufacturing parameters, monitoring and control, computer-aided process planning, etc. The present book aims to provide recent information on statistical and computational techniques applied in manufacturing engineering. The content is suitable for final undergraduate engineering courses or as a subject on manufacturing at the postgraduate level. This book serves as a useful reference for academics, statistical and computational science researchers, mechanical, manufacturing and industrial engineers, and professionals in industries related to manu...
Rusoja, Evan; Haynie, Deson; Sievers, Jessica; Mustafee, Navonil; Nelson, Fred; Reynolds, Martin; Sarriot, Eric; Swanson, Robert Chad; Williams, Bob
2018-01-30
Goals. Key messages: Systems thinking and complexity science, theories that acknowledge the dynamic, connected, and context-dependent nature of health, are highly relevant to the post-millennium development goal era yet lack consensus on their use in relation to health. Although heterogeneous, terms and concepts like emergence, dynamic/dynamical systems, nonlinear(ity), and interdependent/interconnected, as well as methods like system dynamics modelling and agent-based modelling, that comprise systems thinking and complexity science in the health literature are shared across an increasing number of publications within medical/healthcare disciplines. Planners, practitioners, and theorists who can better understand these key systems thinking and complexity science concepts will be better equipped to tackle the challenges of the upcoming development goals. © 2018 John Wiley & Sons, Ltd.
Conformity and statistical tolerancing
Leblond, Laurent; Pillet, Maurice
2018-02-01
Statistical tolerancing was first proposed by Shewhart (Economic Control of Quality of Manufactured Product, 1931; reprinted 1980 by ASQC). In spite of this long history, its use remains moderate. One of the probable reasons for this low utilization is undoubtedly the difficulty for designers to anticipate the risks of this approach. Arithmetic (worst-case) tolerancing allows a simple interpretation: conformity is defined by the presence of the characteristic in an interval. Statistical tolerancing is more complex in its definition; an interval is not sufficient to define conformance. To justify the statistical tolerancing formula used by designers, a tolerance interval should be interpreted as the interval where most of the parts produced should probably be located. This tolerance is justified by considering a conformity criterion for the parts guaranteeing low offsets on the latter characteristics. Unlike traditional arithmetic tolerancing, statistical tolerancing requires a sustained exchange of information between design and manufacturing to be used safely. This paper proposes a formal definition of conformity, which we apply successively to quadratic and arithmetic tolerancing. We introduce a concept of concavity, which helps us demonstrate the link between the tolerancing approach and conformity. We use this concept to demonstrate the various acceptable propositions of statistical tolerancing (in the decentring, dispersion space).
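The contrast between arithmetic and statistical tolerancing can be sketched with a toy tolerance stack. The component tolerances below are hypothetical, and the quadratic form shown is the standard root-sum-square stack, not necessarily the exact formula analyzed in the paper.

```python
import math

# Hypothetical component tolerances of a four-part stack-up.
tolerances = [0.1, 0.2, 0.05, 0.15]

worst_case = sum(tolerances)                             # arithmetic tolerancing
statistical = math.sqrt(sum(t * t for t in tolerances))  # root-sum-square stack
print(round(worst_case, 3), round(statistical, 3))       # 0.5 0.274
```

The statistical stack is markedly tighter than the worst case, which is exactly why its safe use depends on assumptions about the centring and dispersion of the individual processes.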
Indian Academy of Sciences (India)
The Editorial Board for the articles in the mathematical sciences section consisted of G. Misra, R. Mukherjee, R. Sujatha and myself. We decided to seek articles on the following broad themes: mathematical analysis, probability and statistics, number theory, the theory of Lie and algebraic groups, and algebraic geometry.
Unifying Complexity and Information
Ke, Da-Guan
2013-04-01
Complex systems, arising in many contexts in the computer, life, social, and physical sciences, have not shared a generally accepted complexity measure playing a fundamental role like the Shannon entropy H in statistical mechanics. Superficially conflicting criteria of complexity measurement, i.e. complexity-randomness (C-R) relations, have given rise to a special measure intrinsically adaptable to more than one criterion. However, the deep causes of the conflict and of the adaptability are not entirely clear. Here I trace the root of each representative or adaptable measure to its particular universal data-generating or -regenerating model (UDGM or UDRM). A representative measure for deterministic dynamical systems is found as a counterpart of the H for random processes, clearly redefining the boundary of different criteria. And a specific UDRM achieving the intrinsic adaptability enables a general information measure that ultimately solves all major disputes. This work encourages a single framework covering deterministic systems, statistical mechanics and real-world living organisms.
Directory of Open Access Journals (Sweden)
Kuo Zhang
2018-01-01
Full Text Available The mechanisms of acupuncture are still unclear. In order to reveal the regulatory effect of manual acupuncture (MA) on the neuroendocrine-immune (NEI) network and identify the key signaling molecules while MA modulates the NEI network, we used a rat complete Freund’s adjuvant (CFA) model to observe the analgesic and anti-inflammatory effects of MA and, what is more, we used statistical and complex network methods to analyze the data on the expression of 55 common signaling molecules of the NEI network in the ST36 (Zusanli) acupoint, serum, and hind foot pad tissue. The results indicate that MA had significant analgesic and anti-inflammatory effects on CFA rats; the key signaling molecules may play a key role while MA regulates the NEI network, but further research is needed.
Statistical modelling with quantile functions
Gilchrist, Warren
2000-01-01
Galton used quantiles more than a hundred years ago in describing data. Tukey and Parzen used them in the 60s and 70s in describing populations. Since then, the authors of many papers, both theoretical and practical, have used various aspects of quantiles in their work. Until now, however, no one had put all the ideas together to form what turns out to be a general approach to statistics. Statistical Modelling with Quantile Functions does just that. It systematically examines the entire process of statistical modelling, starting with using the quantile function to define continuous distributions. The author shows that by using this approach, it becomes possible to develop complex distributional models from simple components. A modelling kit can be developed that applies to the whole model - deterministic and stochastic components - and this kit operates by adding, multiplying, and transforming distributions rather than data. Statistical Modelling with Quantile Functions adds a new dimension to the practice of stati...
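A small sketch of the quantile-function "modelling kit" idea under stated assumptions: distributions are specified by Q(p), a new model is built by adding two quantile functions, and samples follow by inverse transform. The components chosen here (uniform plus exponential) are illustrative only, not an example from the book.

```python
import math
import random

def q_uniform(p, a=0.0, b=1.0):
    return a + (b - a) * p

def q_exponential(p, rate=1.0):
    return -math.log(1.0 - p) / rate

def q_combined(p):
    # The "kit" at work: adding quantile functions yields a new distribution.
    return q_uniform(p) + q_exponential(p)

random.seed(0)
sample = [q_combined(random.random()) for _ in range(10000)]  # inverse transform
median = sorted(sample)[len(sample) // 2]
print(round(q_combined(0.5), 3))  # 1.193, the model median Q(0.5)
```

Because the model is defined directly through Q(p), quantities like the median are read off analytically as Q(0.5), and the empirical sample median agrees closely with it.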
The CEO performance effect : Statistical issues and a complex fit perspective
Blettner, D.P.; Chaddad, F.R.; Bettis, R.
2012-01-01
How CEOs affect strategy and performance is important to strategic management research. We show that sophisticated statistical analysis alone is problematic for establishing the magnitude and causes of CEO impact on performance. We discuss three problem areas that substantially distort the
Towards an Information Theory of Complex Networks
Dehmer, Matthias; Mehler, Alexander
2011-01-01
For over a decade, complex networks have steadily grown as an important tool across a broad array of academic disciplines, with applications ranging from physics to social media. A tightly organized collection of carefully-selected papers on the subject, Towards an Information Theory of Complex Networks: Statistical Methods and Applications presents theoretical and practical results about information-theoretic and statistical models of complex networks in the natural sciences and humanities. The book's major goal is to advocate and promote a combination of graph-theoretic, information-theoreti
Fermi-Dirac statistics and traffic in complex networks.
de Moura, Alessandro P S
2005-06-01
We propose an idealized model for traffic in a network, in which many particles move randomly from node to node, following the network's links, and it is assumed that at most one particle can occupy any given node. This is intended to mimic the finite forwarding capacity of nodes in communication networks, thereby allowing the possibility of congestion and jamming phenomena. We show that the particles behave like free fermions, with appropriately defined energy-level structure and temperature. The statistical properties of this system are thus given by the corresponding Fermi-Dirac distribution. We use this to obtain analytical expressions for dynamical quantities of interest, such as the mean occupation of each node and the transport efficiency, for different network topologies and particle densities. We show that the subnetwork of free nodes always fragments into small isolated clusters for a sufficiently large number of particles, implying a communication breakdown at some density for all network topologies. These results are compared to direct simulations.
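The Fermi-Dirac bookkeeping in such models can be sketched as follows, assuming toy node "energies" and unit temperature; the mapping E_i = -ln(k_i) from degrees is a hypothetical choice for illustration, not necessarily the paper's energy-level structure. Given N particles, we solve for the chemical potential mu so that the occupations sum to N.

```python
import math

def fermi_dirac_occupations(energies, n_particles, T=1.0):
    """Solve for mu so Fermi-Dirac occupations sum to n_particles (bisection)."""
    def total(mu):
        return sum(1.0 / (math.exp((e - mu) / T) + 1.0) for e in energies)
    lo, hi = min(energies) - 50.0, max(energies) + 50.0
    for _ in range(200):          # total(mu) is increasing in mu
        mid = 0.5 * (lo + hi)
        if total(mid) < n_particles:
            lo = mid
        else:
            hi = mid
    mu = 0.5 * (lo + hi)
    return mu, [1.0 / (math.exp((e - mu) / T) + 1.0) for e in energies]

degrees = [1, 2, 2, 3, 5, 8]                # toy network degrees
energies = [-math.log(k) for k in degrees]  # hypothetical energy mapping
mu, occ = fermi_dirac_occupations(energies, n_particles=3)
print(round(sum(occ), 6))  # 3.0 (occupations sum to the particle number)
```

Every mean occupation stays between 0 and 1, which is the exclusion constraint: at most one particle per node.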
Ignorability in Statistical and Probabilistic Inference
DEFF Research Database (Denmark)
Jaeger, Manfred
2005-01-01
When dealing with incomplete data in statistical learning, or incomplete observations in probabilistic inference, one needs to distinguish the fact that a certain event is observed from the fact that the observed event has happened. Since the modeling and computational complexities entailed...
A statistical physics perspective on criticality in financial markets
International Nuclear Information System (INIS)
Bury, Thomas
2013-01-01
Stock markets are complex systems exhibiting collective phenomena and particular features such as synchronization, fluctuations distributed as power-laws, non-random structures and similarity to neural networks. Such specific properties suggest that markets operate at a very special point. Financial markets are believed to be critical by analogy to physical systems, but little statistically founded evidence has been given. Through a data-based methodology and comparison to simulations inspired by the statistical physics of complex systems, we show that the Dow Jones and index sets are not rigorously critical. However, financial systems are closer to criticality in the crash neighborhood. (paper)
Statistical physics and computational methods for evolutionary game theory
Javarone, Marco Alberto
2018-01-01
This book presents an introduction to Evolutionary Game Theory (EGT) which is an emerging field in the area of complex systems attracting the attention of researchers from disparate scientific communities. EGT allows one to represent and study several complex phenomena, such as the emergence of cooperation in social systems, the role of conformity in shaping the equilibrium of a population, and the dynamics in biological and ecological systems. Since EGT models belong to the area of complex systems, statistical physics constitutes a fundamental ingredient for investigating their behavior. At the same time, the complexity of some EGT models, such as those realized by means of agent-based methods, often require the implementation of numerical simulations. Therefore, beyond providing an introduction to EGT, this book gives a brief overview of the main statistical physics tools (such as phase transitions and the Ising model) and computational strategies for simulating evolutionary games (such as Monte Carlo algor...
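As one minimal computational example of the kind of evolutionary-game dynamics covered by such introductions (an assumption for illustration, not an excerpt from the book): discrete-time replicator dynamics for a two-strategy Prisoner's Dilemma with hypothetical payoffs R, S, T, P = 3, 0, 5, 1.

```python
# Hypothetical Prisoner's Dilemma payoffs (reward, sucker, temptation, punishment).
R, S, T, P = 3.0, 0.0, 5.0, 1.0

def replicator_step(x, dt=0.01):
    """One Euler step of replicator dynamics; x = fraction of cooperators."""
    f_c = R * x + S * (1.0 - x)          # cooperator payoff vs. the current mix
    f_d = T * x + P * (1.0 - x)          # defector payoff vs. the current mix
    f_bar = x * f_c + (1.0 - x) * f_d    # population-average payoff
    return x + dt * x * (f_c - f_bar)

x = 0.9
for _ in range(5000):
    x = replicator_step(x)
print(round(x, 6))  # 0.0: cooperation collapses, since defection dominates
```

Agent-based and Monte Carlo treatments replace this mean-field update with explicit populations on lattices or networks, which is where the statistical-physics machinery (phase transitions, Ising-like models) enters.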
Statistical Challenges in Modeling Big Brain Signals
Yu, Zhaoxia; Pluta, Dustin; Shen, Tong; Chen, Chuansheng; Xue, Gui; Ombao, Hernando
2017-01-01
Brain signal data are inherently big: massive in amount, complex in structure, and high in dimensions. These characteristics impose great challenges for statistical inference and learning. Here we review several key challenges, discuss possible solutions, and highlight future research directions.
1989 lectures in complex systems
International Nuclear Information System (INIS)
Jen, E.
1990-01-01
This report contains papers on the following topics: Lectures on a Theory of Computation and Complexity over the Reals; Algorithmic Information Content, Church-Turing Thesis, Physical Entropy, and Maxwell's Demon; Physical Measures of Complexity; An Introduction to Chaos and Prediction; Hamiltonian Chaos in Nonlinear Polarized Optical Beam; Chemical Oscillators and Nonlinear Chemical Dynamics; Isotropic Navier-Stokes Turbulence. I. Qualitative Features and Basic Equations; Isotropic Navier-Stokes Turbulence. II. Statistical Approximation Methods; Lattice Gases; Data-Parallel Computation and the Connection Machine; Preimages and Forecasting for Cellular Automata; Lattice-Gas Models for Multiphase Flows and Magnetohydrodynamics; Probabilistic Cellular Automata: Some Statistical Mechanical Considerations; Complexity Due to Disorder and Frustration; Self-Organization by Simulated Evolution; Theoretical Immunology; Morphogenesis by Cell Intercalation; and Theoretical Physics Meets Experimental Neurobiology
Estimation of global network statistics from incomplete data.
Directory of Open Access Journals (Sweden)
Catherine A Bliss
Full Text Available Complex networks underlie an enormous variety of social, biological, physical, and virtual systems. A profound complication for the science of complex networks is that in most cases, observing all nodes and all network interactions is impossible. Previous work addressing the impacts of partial network data is surprisingly limited, focuses primarily on missing nodes, and suggests that network statistics derived from subsampled data are not suitable estimators for the same network statistics describing the overall network topology. We generate scaling methods to predict true network statistics, including the degree distribution, from only partial knowledge of nodes, links, or weights. Our methods are transparent and do not assume a known generating process for the network, thus enabling prediction of network statistics for a wide variety of applications. We validate analytical results on four simulated network classes and empirical data sets of various sizes. We perform subsampling experiments by varying proportions of sampled data and demonstrate that our scaling methods can provide very good estimates of true network statistics while acknowledging limits. Lastly, we apply our techniques to a set of rich and evolving large-scale social networks, Twitter reply networks. Based on 100 million tweets, we use our scaling techniques to propose a statistical characterization of the Twitter Interactome from September 2008 to November 2008. Our treatment allows us to find support for Dunbar's hypothesis in detecting an upper threshold for the number of active social contacts that individuals maintain over the course of one week.
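A minimal sketch of the node-subsampling idea, under the simplifying assumption (mine, not the paper's) that nodes are observed uniformly at random, so an edge survives with probability p²; the graph size and densities are invented:

```python
import random

random.seed(1)
n, prob_edge = 400, 0.05
# Build an Erdos-Renyi graph as an edge list.
edges = [(i, j) for i in range(n) for j in range(i + 1, n)
         if random.random() < prob_edge]

p = 0.5  # fraction of nodes we get to observe
observed_nodes = set(random.sample(range(n), int(p * n)))
observed_edges = [e for e in edges
                  if e[0] in observed_nodes and e[1] in observed_nodes]

# Under uniform node sampling an edge survives only if both endpoints are
# sampled (probability p^2), so a simple scaling estimator for the true
# edge count divides the observed count by p^2.
estimated_edges = len(observed_edges) / p ** 2
print(len(edges), round(estimated_edges))
```

The paper's methods go well beyond this toy (degree distributions, weights, link sampling), but the same back-scaling logic is the starting point.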
Directory of Open Access Journals (Sweden)
Minna Pitkänen
Full Text Available Repetition suppression (RS) is evident as a weakened response to repeated stimuli after the initial response. RS has been demonstrated in motor-evoked potentials (MEPs) induced with transcranial magnetic stimulation (TMS). Here, we investigated the effect of inter-train interval (ITI) on the induction of RS of MEPs with the aim of optimizing the investigative protocols. Trains of TMS pulses, targeted to the primary motor cortex by neuronavigation, were applied at a stimulation intensity of 120% of the resting motor threshold. The stimulus trains included either four or twenty pulses with an inter-stimulus interval (ISI) of 1 s. The ITI was here defined as the interval between the last pulse in a train and the first pulse in the next train; the ITIs used here were 1, 3, 4, 6, 7, 12, and 17 s. RS was observed with all ITIs except the ITI of 1 s, in which the ITI was equal to the ISI. RS was more pronounced with longer ITIs. Shorter ITIs may not allow sufficient time for a return to baseline. RS may reflect a startle-like response to the first pulse of a train followed by habituation. Longer ITIs may allow more recovery time and in turn demonstrate greater RS. Our results indicate that RS can be studied with confidence at relatively short ITIs of 6 s and above.
Bayesian approach to inverse statistical mechanics
Habeck, Michael
2014-05-01
Inverse statistical mechanics aims to determine particle interactions from ensemble properties. This article looks at this inverse problem from a Bayesian perspective and discusses several statistical estimators to solve it. In addition, a sequential Monte Carlo algorithm is proposed that draws the interaction parameters from their posterior probability distribution. The posterior probability involves an intractable partition function that is estimated along with the interactions. The method is illustrated for inverse problems of varying complexity, including the estimation of a temperature, the inverse Ising problem, maximum entropy fitting, and the reconstruction of molecular interaction potentials.
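To make the Bayesian viewpoint concrete, here is a toy sketch, not Habeck's sequential Monte Carlo method, that infers an inverse temperature for a two-level system whose partition function Z(β) = 1 + e^(-β) is tractable, using a simple grid posterior under a flat prior; all sizes and the true β are invented:

```python
import math
import random

random.seed(2)
beta_true = 1.0
# Two-level system with energies {0, 1}: the excited state is occupied
# with Boltzmann probability exp(-beta) / Z(beta).
p1 = math.exp(-beta_true) / (1.0 + math.exp(-beta_true))
data = [1 if random.random() < p1 else 0 for _ in range(5000)]
n1 = sum(data)  # number of observed excited states

def log_like(beta):
    """Log-likelihood of the observed energies; Z(beta) is tractable here."""
    z = 1.0 + math.exp(-beta)
    return -beta * n1 - len(data) * math.log(z)

# Flat prior on a grid, so the MAP estimate is just the likelihood maximizer.
grid = [i / 100 for i in range(1, 301)]
beta_map = max(grid, key=log_like)
print(beta_map)  # should land near beta_true = 1.0
```

In realistic inverse problems Z(β) is intractable, which is exactly why the article estimates the partition function alongside the interactions.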
Directory of Open Access Journals (Sweden)
Rochelle E. Tractenberg
2016-12-01
Full Text Available Statistical literacy is essential to an informed citizenry, and two emerging trends highlight a growing need for training that achieves this literacy. The first trend is towards “big” data: while automated analyses can exploit massive amounts of data, the interpretation, and possibly more importantly the replication, of results are challenging without adequate statistical literacy. The second trend is that science and scientific publishing are struggling with insufficient or inappropriate statistical reasoning in writing, reviewing, and editing. This paper describes a model for statistical literacy (SL) and its development that can support modern scientific practice. An established curriculum development and evaluation tool, the Mastery Rubric, is integrated with a new developmental model of statistical literacy that reflects the complexity of reasoning and habits of mind that scientists need to cultivate in order to recognize, choose, and interpret statistical methods. This developmental model provides actionable evidence and explicit opportunities for consequential assessment that serves students, instructors, developers/reviewers/accreditors of a curriculum, and institutions. By supporting the enrichment, rather than increasing the amount, of statistical training in the basic and life sciences, this approach supports curriculum development, evaluation, and delivery to promote statistical literacy for students and a collective quantitative proficiency more broadly.
Foundation of statistical energy analysis in vibroacoustics
Le Bot, A
2015-01-01
This title deals with the statistical theory of sound and vibration. The foundation of statistical energy analysis is presented in great detail. In the modal approach, an introduction to random vibration with application to complex systems having a large number of modes is provided. For the wave approach, the phenomena of propagation, group speed, and energy transport are extensively discussed. Particular emphasis is given to the emergence of diffuse field, the central concept of the theory.
Statistical properties of deep inelastic reactions
International Nuclear Information System (INIS)
Moretto, L.G.
1983-08-01
The multifaceted aspects of deep-inelastic heavy-ion collisions are discussed in terms of the statistical equilibrium limit. It is shown that a conditional statistical equilibrium, where a number of degrees of freedom are thermalized while others are still relaxing, prevails in most of these reactions. The individual degrees of freedom that have been explored experimentally are considered in their statistical equilibrium limit, and the extent to which they appear to be thermalized is discussed. The interaction between degrees of freedom on their way towards equilibrium is shown to create complex feedback phenomena that may lead to self-regulation. A possible example of self-regulation is shown for the process of energy partition between fragments promoted by particle exchange. 35 references
African Journals Online (AJOL)
Cultural Factors in Managing an FMS Case Program: Saudi Arabian Army Ordnance Corps (SOCP) Program
1977-11-01
...expressed when attempting to discuss complex, sophisticated technical material with senior counterparts who possessed relative fluency in ...; they cannot be avoided; they can to a great extent be anticipated as critical management factors. By anticipating and preparing ...
Wang, W. L.; Tsui, K. L.; Lo, S. M.; Liu, S. B.
2018-01-01
Crowded transportation hubs such as metro stations are considered ideal places for the development and spread of epidemics. However, because of their complex spatial layouts and confined environments holding large numbers of highly mobile individuals, it is difficult to quantify human contacts in such settings, and disease-spreading dynamics there were less explored in previous studies. Owing to the heterogeneity and dynamic nature of human interactions, a growing number of studies have demonstrated the importance of contact distance and length of contact in transmission probabilities. In this study, we show how detailed information on contact and exposure patterns can be obtained by statistical analyses of microscopic crowd-simulation data. Specifically, a pedestrian simulation model, CityFlow, was employed to reproduce individuals' movements in a metro station based on site survey data, and values and distributions of individual contact rate and exposure in different simulation cases were obtained and analyzed. Interestingly, a Weibull distribution fitted the histogram values of individual-based exposure in each case very well. Moreover, we found that both individual contact rate and exposure had a linear relationship with the average crowd density of the environment. The results obtained in this paper can provide a reference for epidemic studies in complex and confined transportation hubs and refine existing disease-spreading models.
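The reported linear relationship between crowd density and contact rate can be captured with an ordinary least-squares fit. The sketch below is illustrative only: the (density, contact-rate) pairs are fabricated to lie exactly on a line and do not come from the CityFlow simulations:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical (density, contact-rate) pairs, made up for the example.
densities = [0.5, 1.0, 1.5, 2.0, 2.5]            # persons per m^2
contact_rates = [3.0 * d + 0.2 for d in densities]  # exactly linear by design
slope, intercept = fit_line(densities, contact_rates)
print(round(slope, 3), round(intercept, 3))  # 3.0 0.2
```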
Liu, Jinping; Tang, Zhaohui; Zhang, Jin; Chen, Qing; Xu, Pengfei; Liu, Wenzhong
2016-01-01
Computer vision as a fast, low-cost, noncontact, and online monitoring technology has been an important tool to inspect product quality, particularly on a large-scale assembly production line. However, the current industrial vision system is far from satisfactory in the intelligent perception of complex grain images, comprising a large number of local homogeneous fragmentations or patches without distinct foreground and background. We attempt to solve this problem based on the statistical modeling of spatial structures of grain images. We present a physical explanation in advance to indicate that the spatial structures of the complex grain images are subject to a representative Weibull distribution according to the theory of sequential fragmentation, which is well known in the continued comminution of ore grinding. To delineate the spatial structure of the grain image, we present a method of multiscale and omnidirectional Gaussian derivative filtering. Then, a product quality classifier based on sparse multikernel-least squares support vector machine is proposed to solve the low-confidence classification problem of imbalanced data distribution. The proposed method is applied on the assembly line of a food-processing enterprise to classify (or identify) automatically the production quality of rice. The experiments on the real application case, compared with the commonly used methods, illustrate the validity of our method.
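For intuition about the Weibull modeling step, here is a hedged sketch, not the authors' pipeline, that fits a Weibull shape and scale by linearized rank regression (a "Weibull plot") on synthetic data; the true parameters k = 2 and λ = 1.5 are invented:

```python
import math
import random

random.seed(3)
k_true, lam_true = 2.0, 1.5
# Inverse-CDF sampling from Weibull(k, lam): x = lam * (-ln(1-u))^(1/k).
xs = sorted(lam_true * (-math.log(1.0 - random.random())) ** (1.0 / k_true)
            for _ in range(2000))

# Linearized Weibull plot: ln(-ln(1 - F(x))) = k*ln(x) - k*ln(lam),
# with plotting positions F_i = (i + 0.5) / n for the sorted sample.
n = len(xs)
u = [math.log(x) for x in xs]
v = [math.log(-math.log(1.0 - (i + 0.5) / n)) for i in range(n)]
mu, mv = sum(u) / n, sum(v) / n
k_hat = (sum((a - mu) * (b - mv) for a, b in zip(u, v))
         / sum((a - mu) ** 2 for a in u))
lam_hat = math.exp(mu - mv / k_hat)
print(round(k_hat, 2), round(lam_hat, 2))  # near 2.0 and 1.5
```

Maximum-likelihood fitting is usually preferred in practice; rank regression is used here only because it needs nothing beyond least squares.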
Mock, B A; Holiday, D L; Cerretti, D P; Darnell, S C; O'Brien, A D; Potter, M
1994-01-01
The interval of mouse chromosome 1 extending from Idh-1 to Pep-3 harbors the natural resistance gene Ity/Lsh/Bcg; it controls the outcome of infection with Salmonella typhimurium, Leishmania donovani, and several Mycobacterium species. This region also contains a DNA repair gene, Rep-1, which determines the rapidity with which double-strand breaks in chromatin are repaired. BALB/cAnPt and DBA/2N mice differ in their phenotypic expression of these genes. To generate appropriate strains of mice for the study of these genes, a series of 10 C.D2 congenic strains recombinant across a 28-centimorgan interval of mouse chromosome 1 extending from Idh-1 to Pep-3 were derived from crosses of the C.D2-Idh-1 Pep-3 congenic strain back to BALB/cAn. Analyses of these recombinant strains will allow the correlation of biological-immunological phenotypes with defined genetic regions.
Statistical Challenges in Modeling Big Brain Signals
Yu, Zhaoxia
2017-11-01
Brain signal data are inherently big: massive in amount, complex in structure, and high in dimensions. These characteristics impose great challenges for statistical inference and learning. Here we review several key challenges, discuss possible solutions, and highlight future research directions.
Radio Observations of the S5 Sample
Indian Academy of Sciences (India)
Jun Liu & Xiang Liu
density in flux monitoring, standard deviation, the modulation index, the variability amplitude and the reduced χ². The last three columns (m, Y, χ²_red) are used to judge the degree of variability, and the definitions of these parameters are described in Kraus et al. (2003). We studied the statistics of the variability and ...
Emberson, Lauren L; Rubinstein, Dani Y
2016-08-01
The influence of statistical information on behavior (either through learning or adaptation) is quickly becoming foundational to many domains of cognitive psychology and cognitive neuroscience, from language comprehension to visual development. We investigate a central problem impacting these diverse fields: when encountering input with rich statistical information, are there any constraints on learning? This paper examines learning outcomes when adult learners are given statistical information across multiple levels of abstraction simultaneously: from abstract, semantic categories of everyday objects to individual viewpoints on these objects. After revealing statistical learning of abstract, semantic categories with scrambled individual exemplars (Exp. 1), participants viewed pictures where the categories as well as the individual objects predicted picture order (e.g., bird1-dog1, bird2-dog2). Our findings suggest that participants preferentially encode the relationships between the individual objects, even in the presence of statistical regularities linking semantic categories (Exps. 2 and 3). In a final experiment we investigate whether learners are biased towards learning object-level regularities or simply construct the most detailed model given the data (and therefore best able to predict the specifics of the upcoming stimulus) by investigating whether participants preferentially learn from the statistical regularities linking individual snapshots of objects or the relationship between the objects themselves (e.g., bird_picture1-dog_picture1, bird_picture2-dog_picture2). We find that participants fail to learn the relationships between individual snapshots, suggesting a bias towards object-level statistical regularities as opposed to merely constructing the most complete model of the input. This work moves beyond the previous existence proofs that statistical learning is possible at both very high and very low levels of abstraction (categories vs. individual
Goodman, Joseph W.
2000-07-01
The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research
Statistical learning and selective inference.
Taylor, Jonathan; Tibshirani, Robert J
2015-06-23
We describe the problem of "selective inference." This addresses the following challenge: Having mined a set of data to find potential associations, how do we properly assess the strength of these associations? The fact that we have "cherry-picked"--searched for the strongest associations--means that we must set a higher bar for declaring significant the associations that we see. This challenge becomes more important in the era of big data and complex statistical modeling. The cherry tree (dataset) can be very large and the tools for cherry picking (statistical learning methods) are now very sophisticated. We describe some recent new developments in selective inference and illustrate their use in forward stepwise regression, the lasso, and principal components analysis.
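The cherry-picking problem can be demonstrated in a few lines: selecting the largest of m null z-statistics and testing it naively inflates the false-positive rate, while a selection-aware (here, Šidák-style) adjustment restores it. This toy simulation illustrates the general issue, not the selective-inference machinery of the paper:

```python
import math
import random

random.seed(4)
m, trials = 20, 2000

def phi_sf(z):
    """Standard normal survival function P(Z > z)."""
    return 0.5 * math.erfc(z / math.sqrt(2.0))

false_naive = false_adjusted = 0
for _ in range(trials):
    # All m "associations" are pure noise; we cherry-pick the strongest one.
    zmax = max(random.gauss(0.0, 1.0) for _ in range(m))
    p_naive = phi_sf(zmax)            # ignores that we searched for the max
    p_adj = 1.0 - (1.0 - p_naive) ** m  # Sidak correction for the selection
    false_naive += p_naive < 0.05
    false_adjusted += p_adj < 0.05
print(false_naive / trials, false_adjusted / trials)
```

With m = 20 the naive rate approaches 1 − 0.95²⁰ ≈ 0.64, while the adjusted rate stays near the nominal 0.05.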
Using statistics to understand the environment
Cook, Penny A
2000-01-01
Using Statistics to Understand the Environment covers all the basic tests required for environmental practicals and projects and points the way to the more advanced techniques that may be needed in more complex research designs. Following an introduction to project design, the book covers methods to describe data, to examine differences between samples, and to identify relationships and associations between variables.Featuring: worked examples covering a wide range of environmental topics, drawings and icons, chapter summaries, a glossary of statistical terms and a further reading section, this book focuses on the needs of the researcher rather than on the mathematics behind the tests.
Nonextensive statistical mechanics of ionic solutions
International Nuclear Information System (INIS)
Varela, L.M.; Carrete, J.; Munoz-Sola, R.; Rodriguez, J.R.; Gallego, J.
2007-01-01
Classical mean-field Poisson-Boltzmann theory of ionic solutions is revisited in the theoretical framework of nonextensive Tsallis statistics. The nonextensive equivalent of Poisson-Boltzmann equation is formulated revisiting the statistical mechanics of liquids and the Debye-Hueckel framework is shown to be valid for highly diluted solutions even under circumstances where nonextensive thermostatistics must be applied. The lowest order corrections associated to nonadditive effects are identified for both symmetric and asymmetric electrolytes and the behavior of the average electrostatic potential in a homogeneous system is analytically and numerically analyzed for various values of the complexity measurement nonextensive parameter q
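Tsallis statistics replaces the Boltzmann factor with the q-exponential e_q(x) = [1 + (1 − q)x]^(1/(1−q)), which recovers the ordinary exponential as q → 1. A minimal numerical check (illustrative only, not the paper's derivation):

```python
import math

def q_exp(x, q):
    """Tsallis q-exponential; reduces to exp(x) in the limit q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    # The q-exponential is defined as 0 where the base goes non-positive.
    return base ** (1.0 / (1.0 - q)) if base > 0.0 else 0.0

# Sanity check: for q close to 1 the q-exponential approaches exp(x).
print(q_exp(-1.0, 1.0), q_exp(-1.0, 1.0001))
```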
"Statistical Techniques for Particle Physics" (2/4)
CERN. Geneva
2009-01-01
This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...
"Statistical Techniques for Particle Physics" (1/4)
CERN. Geneva
2009-01-01
This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...
"Statistical Techniques for Particle Physics" (4/4)
CERN. Geneva
2009-01-01
This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...
"Statistical Techniques for Particle Physics" (3/4)
CERN. Geneva
2009-01-01
This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...
Statistical Characterization of Electromagnetic Wave Propagation in Mine Environments
Yucel, Abdulkadir C.
2013-01-01
A computational framework for statistically characterizing electromagnetic (EM) wave propagation through mine tunnels and galleries is presented. The framework combines a multi-element probabilistic collocation method with a full-wave fast Fourier transform and fast multipole method accelerated surface integral equation-based EM simulator to statistically characterize fields from wireless transmitters in complex mine environments. 1536-1225 © 2013 IEEE.
Assessment of complications due to intratympanic injections
Directory of Open Access Journals (Sweden)
Yu-Chuan Liu
2016-03-01
Full Text Available Objective: The purpose of the study is to report and analyze the complications following intratympanic injections (ITI) of steroids. The occurrence rates of complications at different ITI sites, the four quadrants of the eardrum, were also compared. Methods: A retrospective clinical review in a medical center. Each patient received ITI twice a week for 2-3 consecutive weeks as a salvage therapy for sudden sensorineural hearing loss. Post-injection complications, especially transient dizziness and vertigo, were recorded. Patients with acute or chronic vertigo episodes in the preceding month were excluded. Results: A total of 59 patients with sudden sensorineural hearing loss were treated, and a total of 278 ITIs were performed in 1 year. The post-injection complications included pain, tongue numbness, transient dizziness, vertigo, tinnitus, and a small persistent perforation. There was no significant difference in the occurrence of these complications among the injection sites on the 4 quadrants of the tympanic membrane. However, there was a statistically significant difference in post-injection vertiginous episodes after IT injections to the posterior-inferior quadrant (Q3) and posterior-superior quadrant (Q4) compared to the anterior-superior quadrant (Q1) and anterior-inferior quadrant (Q2) (P = 0.0113). Conclusion: IT injection is recommended to be applied to Q2, since Q1 and Q4 injections are more likely to induce the adverse effect of tongue numbness, while the Q3 and Q4 areas are more likely to induce post-injection vertigo. Keywords: Intratympanic injection, Sudden deafness, Complications, Vertigo
Bayesian models a statistical primer for ecologists
Hobbs, N Thompson
2015-01-01
Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods, in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probability...
Mathematical and Statistical Opportunities in Cyber Security
Energy Technology Data Exchange (ETDEWEB)
Meza, Juan; Campbell, Scott; Bailey, David
2009-03-23
The role of mathematics in a complex system such as the Internet has yet to be deeply explored. In this paper, we summarize some of the important and pressing problems in cyber security from the viewpoint of open science environments. We start by posing the question 'What fundamental problems exist within cyber security research that can be helped by advanced mathematics and statistics'? Our first and most important assumption is that access to real-world data is necessary to understand large and complex systems like the Internet. Our second assumption is that many proposed cyber security solutions could critically damage both the openness and the productivity of scientific research. After examining a range of cyber security problems, we come to the conclusion that the field of cyber security poses a rich set of new and exciting research opportunities for the mathematical and statistical sciences.
Statistics and Informatics in Space Astrophysics
Feigelson, E.
2017-12-01
The interest in statistical and computational methodology has seen rapid growth in space-based astrophysics, parallel to the growth seen in Earth remote sensing. There is widespread agreement that scientific interpretation of the cosmic microwave background, discovery of exoplanets, and classifying multiwavelength surveys is too complex to be accomplished with traditional techniques. NASA operates several well-functioning Science Archive Research Centers providing 0.5 PBy datasets to the research community. These databases are integrated with full-text journal articles in the NASA Astrophysics Data System (200K pageviews/day). Data products use interoperable formats and protocols established by the International Virtual Observatory Alliance. NASA supercomputers also support complex astrophysical models of systems such as accretion disks and planet formation. Academic researcher interest in methodology has significantly grown in areas such as Bayesian inference and machine learning, and statistical research is underway to treat problems such as irregularly spaced time series and astrophysical model uncertainties. Several scholarly societies have created interest groups in astrostatistics and astroinformatics. Improvements are needed on several fronts. Community education in advanced methodology is not sufficiently rapid to meet the research needs. Statistical procedures within NASA science analysis software are sometimes not optimal, and pipeline development may not use modern software engineering techniques. NASA offers few grant opportunities supporting research in astroinformatics and astrostatistics.
Steganalysis based on reducing the differences of image statistical characteristics
Wang, Ran; Niu, Shaozhang; Ping, Xijian; Zhang, Tao
2018-04-01
Compared with the embedding process, image content has a more significant impact on the differences between image statistical characteristics. This makes image steganalysis a classification problem with larger within-class scatter distances and smaller between-class scatter distances; as a result, the steganalysis features may become inseparable because of differences in image statistical characteristics. In this paper, a new steganalysis framework is proposed that reduces the differences in image statistical characteristics caused by varying content and processing methods. The given images are segmented into several sub-images according to texture complexity. Steganalysis features are extracted separately from each subset with the same or similar texture complexity to build a classifier, and the final steganalysis result is obtained through a weighted fusion process. Theoretical analysis and experimental results demonstrate the validity of the framework.
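The segment-then-fuse pipeline described above can be sketched in a few lines. This is a hypothetical illustration, not the authors' implementation: the texture-complexity measure (local variance), the number of complexity bins, the threshold, and the per-subset classifier are all made-up stand-ins.

```python
# Hypothetical sketch of a segment-then-fuse steganalysis pipeline; the
# complexity measure, binning, and classifier are illustrative assumptions,
# not the paper's actual method.

def texture_complexity(block):
    """Variance of pixel values as a crude texture-complexity proxy."""
    n = len(block)
    mean = sum(block) / n
    return sum((p - mean) ** 2 for p in block) / n

def fused_decision(blocks, classify, n_bins=3, threshold=20.0):
    """Group sub-images by complexity, score each group with its own
    classifier, and fuse the scores weighted by group size."""
    groups = {}
    for b in blocks:
        level = min(int(texture_complexity(b) // threshold), n_bins - 1)
        groups.setdefault(level, []).append(b)
    total = sum(len(g) for g in groups.values())
    # Weighted fusion: each subset counts proportionally to its size.
    return sum(len(g) / total * classify(level, g)
               for level, g in groups.items())

# Toy usage with a dummy per-subset "classifier" returning a fixed score.
blocks = [[0, 0, 0, 0], [0, 50, 0, 50], [10, 200, 30, 180]]
score = fused_decision(blocks, classify=lambda level, g: 0.5)
print(score)  # e.g. score > 0.5 would flag the image as stego
```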
The Playground Game: Inquiry‐Based Learning About Research Methods and Statistics
Westera, Wim; Slootmaker, Aad; Kurvers, Hub
2014-01-01
The Playground Game is a web-based game that was developed for teaching research methods and statistics to nursing and social sciences students in higher education and vocational training. The complexity and abstract nature of research methods and statistics poses many challenges for students. The
Introduction to statistics using interactive MM*Stat elements
Härdle, Wolfgang Karl; Rönz, Bernd
2015-01-01
MM*Stat, together with its enhanced online version with interactive examples, offers a flexible tool that facilitates the teaching of basic statistics. It covers all the topics found in introductory descriptive statistics courses, including simple linear regression and time series analysis, the fundamentals of inferential statistics (probability theory, random sampling and estimation theory), and inferential statistics itself (confidence intervals, testing). MM*Stat is also designed to help students rework class material independently and to promote comprehension with the help of additional examples. Each chapter starts with the necessary theoretical background, which is followed by a variety of examples. The core examples are based on the content of the respective chapter, while the advanced examples, designed to deepen students’ knowledge, also draw on information and material from previous chapters. The enhanced online version helps students grasp the complexity and the practical relevance of statistical...
Physics-based statistical model and simulation method of RF propagation in urban environments
Pao, Hsueh-Yuan; Dvorak, Steven L.
2010-09-14
A physics-based statistical model and simulation/modeling method and system of electromagnetic wave propagation (wireless communication) in urban environments. In particular, the model is a computationally efficient closed-form parametric model of RF propagation in an urban environment, extracted from a physics-based statistical wireless channel simulation method and system. The simulation divides the complex urban environment into a network of interconnected urban canyon waveguides which can be analyzed individually; calculates spectral coefficients of modal fields in the waveguides excited by the propagation, using a database of statistical impedance boundary conditions which incorporates the complexity of building walls into the propagation model; determines statistical parameters of the calculated modal fields; and determines a parametric propagation model based on those statistical parameters, from which predictions of communications capability may be made.
Nokia, Miriam S; Mikkonen, Jarno E; Penttonen, Markku; Wikgren, Jan
2012-01-01
Oscillations in hippocampal local-field potentials (LFPs) reflect the crucial involvement of the hippocampus in memory trace formation: theta (4-8 Hz) oscillations and ripples (~200 Hz) occurring during sharp waves are thought to mediate encoding and consolidation, respectively. During sharp wave-ripple complexes (SPW-Rs), hippocampal cell firing closely follows the pattern that took place during the initial experience, most likely reflecting replay of that event. Disrupting hippocampal ripples using electrical stimulation either during training in awake animals or during sleep after training retards spatial learning. Here, adult rabbits were trained in trace eyeblink conditioning, a hippocampus-dependent associative learning task. A bright light was presented to the animals during the inter-trial interval (ITI), when awake, either during SPW-Rs or irrespective of their neural state. Learning was particularly poor when the light was presented following SPW-Rs. While the light did not disrupt the ripple itself, it elicited a theta-band oscillation, a state that does not usually coincide with SPW-Rs. Thus, it seems that consolidation depends on neuronal activity within and beyond the hippocampus taking place immediately after, but by no means limited to, hippocampal SPW-Rs.
Use Of R in Statistics Lithuania
Directory of Open Access Journals (Sweden)
Tomas Rudys
2016-06-01
Recently R has become more and more popular among official statistics offices. It can be used not only for research purposes, but also for the production of official statistics. Statistics Lithuania recently started an analysis of where R can be used and whether it could replace some other statistical programming languages or systems. For this purpose a working group was arranged. In this paper we present an overview of the current situation regarding the implementation of R at Statistics Lithuania, some problems we are facing and some future plans. At present, R is used mainly for research purposes. Looking forward, a short course on basic R was prepared, and at the moment we are starting to use R for data analysis, data manipulation from Oracle databases, preparation of some reports, data editing and survey estimation. On the other hand, we have found some problems when working with big data sets, and also with survey sampling, as there are surveys with complex sampling designs. We are also analysing the running of R on our servers in order to be able to use more random access memory (RAM). Despite the problems, we are trying to use R in more fields in the production of official statistics.
Applications of Probabilistic Combiners on Linear Feedback Shift Register Sequences
2016-12-01
...a three-variable function. Our tests on the resulting output strings show a drastic increase in complexity, while simultaneously passing the stringent randomness tests required by the... Decryption of a message that has been encrypted using bitwise XOR is quite simple, since each bit is its own additive inverse.
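The XOR property mentioned at the end can be demonstrated directly. The sketch below is illustrative, not the report's actual combiner: a small Fibonacci LFSR (state and tap positions chosen arbitrarily) produces a keystream, and the same XOR operation both encrypts and decrypts because each bit is its own additive inverse (x ^ k ^ k == x).

```python
# Illustrative LFSR keystream + bitwise-XOR round trip (not the report's
# actual combiner; the register length and taps are arbitrary choices).

def lfsr_stream(state, taps, n):
    """Yield n bits from a Fibonacci LFSR, given an initial state
    (list of bits) and tap positions into that state."""
    state = list(state)
    out = []
    for _ in range(n):
        out.append(state[-1])          # output the last bit
        fb = 0
        for t in taps:                 # feedback = XOR of tapped bits
            fb ^= state[t]
        state = [fb] + state[:-1]      # shift, insert feedback
    return out

def xor_bits(bits, key):
    return [b ^ k for b, k in zip(bits, key)]

message = [1, 0, 0, 0, 1, 1, 0, 1]     # e.g. the byte 10001101
key = lfsr_stream([1, 0, 1, 1], taps=[0, 3], n=len(message))
cipher = xor_bits(message, key)
assert xor_bits(cipher, key) == message  # decryption is re-encryption
```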
Statistical theory of multi-step compound and direct reactions
International Nuclear Information System (INIS)
Feshbach, H.; Kerman, A.; Koonin, S.
1980-01-01
The theory of nuclear reactions is extended so as to include a statistical treatment of multi-step processes. Two types are distinguished, the multi-step compound and the multi-step direct. The wave functions for the system are grouped according to their complexity. The multi-step direct process involves explicitly those states which are open, while the multi-step compound involves those which are bound. In addition to the random phase assumption, which is applied differently to the multi-step direct and to the multi-step compound cross-sections, it is assumed that the residual interaction will have non-vanishing matrix elements between states whose complexities differ by at most one unit. This is referred to as the chaining hypothesis. Explicit expressions for the double differential cross-section giving the angular distribution and energy spectrum are obtained for both reaction types. The statistical multi-step compound cross-sections are symmetric about 90°. The classical statistical theory of nuclear reactions is a special limiting case. The cross-section for the statistical multi-step direct reaction consists of a set of convolutions of single-step direct cross-sections. For the many-step case it is possible to derive a diffusion equation in momentum space. Application is made to the reaction ¹⁸¹Ta(p,n)¹⁸¹W using the statistical multi-step compound formalism.
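The convolution structure of the statistical multi-step direct cross-section can be illustrated with a toy discrete example; the bin values below are invented for illustration, not nuclear data.

```python
# Toy illustration of the convolution structure: a two-step energy spectrum
# obtained by convolving a normalized single-step spectrum with itself.
# The bins and probabilities are made up, not physical cross-sections.

def convolve(a, b):
    """Discrete convolution: c[k] = sum_i a[i] * b[k - i]."""
    c = [0.0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            c[i + j] += ai * bj
    return c

single_step = [0.1, 0.6, 0.3]            # probability per energy-loss bin
two_step = convolve(single_step, single_step)
print(two_step)                          # spectrum after two sequential steps
assert abs(sum(two_step) - 1.0) < 1e-9   # normalization is preserved
```

Chaining further convolutions gives the many-step spectra whose continuum limit is the diffusion equation in momentum space mentioned in the abstract.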
Petersson, K M; Nichols, T E; Poline, J B; Holmes, A P
1999-01-01
Functional neuroimaging (FNI) provides experimental access to the intact living brain, making it possible to study higher cognitive functions in humans. In this review and in a companion paper in this issue, we discuss some common methods used to analyse FNI data. The emphasis in both papers is on the assumptions and limitations of the methods reviewed. There are several methods available to analyse FNI data, indicating that none is optimal for all purposes. In order to make optimal use of the methods available it is important to know their limits of applicability. For the interpretation of FNI results it is also important to take into account the assumptions, approximations and inherent limitations of the methods used. This paper gives a brief overview of some non-inferential descriptive methods and common statistical models used in FNI. Issues relating to the complex problem of model selection are discussed. In general, proper model selection is a necessary prerequisite for the validity of the subsequent statistical inference. The non-inferential section describes methods that, combined with inspection of parameter estimates and other simple measures, can aid in the process of model selection and verification of assumptions. The section on statistical models covers approaches to global normalization and some aspects of univariate, multivariate, and Bayesian models. Finally, approaches to functional connectivity and effective connectivity are discussed. In the companion paper we review issues related to signal detection and statistical inference. PMID:10466149
Overdispersion in nuclear statistics
International Nuclear Information System (INIS)
Semkow, Thomas M.
1999-01-01
The modern statistical distribution theory is applied to the development of the overdispersion theory in ionizing-radiation statistics for the first time. The physical nuclear system is treated as a sequence of binomial processes, each depending on a characteristic probability, such as probability of decay, detection, etc. The probabilities fluctuate in the course of a measurement, and the physical reasons for that are discussed. If the average values of the probabilities change from measurement to measurement, which originates from the random Lexis binomial sampling scheme, then the resulting distribution is overdispersed. The generating functions and probability distribution functions are derived, followed by a moment analysis. The Poisson and Gaussian limits are also given. The distribution functions belong to a family of generalized hypergeometric factorial moment distributions by Kemp and Kemp, and can serve as likelihood functions for the statistical estimations. An application to radioactive decay with detection is described and working formulae are given, including a procedure for testing the counting data for overdispersion. More complex experiments in nuclear physics (such as solar neutrino) can be handled by this model, as well as distinguishing between the source and background
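The abstract mentions a procedure for testing counting data for overdispersion. As a hedged illustration, the textbook index-of-dispersion check (which may differ from the paper's own working formulae) compares the sample variance of repeated counts to the Poisson expectation that variance equals mean:

```python
# Index-of-dispersion check for overdispersion in repeated counts, assuming a
# Poisson null hypothesis. This is the standard textbook version, offered as
# an illustration; the paper derives its own, more general, test procedure.
# Under the null, (n - 1) * s^2 / xbar is approximately chi-square, n-1 dof.

def dispersion_statistic(counts):
    n = len(counts)
    xbar = sum(counts) / n
    s2 = sum((c - xbar) ** 2 for c in counts) / (n - 1)
    return (n - 1) * s2 / xbar, s2 / xbar   # chi-square stat, variance/mean

# Invented example counts from repeated measurements of one source.
counts = [98, 103, 95, 130, 70, 110, 92, 121]
stat, ratio = dispersion_statistic(counts)
print(f"chi2 = {stat:.1f}, variance/mean = {ratio:.2f}")
# A variance/mean ratio well above 1 suggests overdispersion beyond
# Poisson counting noise (e.g. fluctuating detection probability).
```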
Emolo, Antonio; Zollo, Aldo; Picozzi, Matteo; Martino, Claudio; Elia, Luca; Verderame, Gerardo; De Risi, Maria Teresa; Ricci, Paolo; Lombardi, Anna; Bindi, Dino; Parolai, Stefano; Boxberger, Tobias; Miranda, Nicola
2014-05-01
One of the main objectives of WP7 (Strategic Applications and Capacity Building), in the framework of the REAKT (Strategies and Tools for Real-Time Earthquake RisK ReducTion) FP7 European project, is to evaluate the effectiveness of EEW and real-time risk assessment procedures in reducing seismic risk for various industrial partners and end-users. In the context of the REAKT project, the AMRA-RISSCLab group is engaged in a feasibility study on the application of earthquake early-warning procedures in two high schools located in the Irpinia region (southern Italy), an area that in 1980 was struck by a magnitude 6.9 earthquake. In this work we report on the activities carried out during the last 24 months at the school ITIS 'E. Majorana', located in Somma Vesuviana, a village in the neighbourhood of Naples. In order to perform continuous seismic monitoring of the site, which includes a building with a rather complex structure, 5 accelerometric stations have been installed in different parts of the school. In particular, a 24-bit ADC (Sigma/Delta) Agecodagis-Kefren data-logger has been installed with a Guralp CMG-5TC accelerometer with a 0.25g full scale in the school courtyard, while 4 SOSEWIN sensors have also been installed at different locations within the building. Commercial ADSL lines provide transmission of real-time data to the EEW centre. Data streams are now acquired in real time in the PRESToPlus (regional and on-site, threshold-based early-warning) software platform [1]. The recent December 29, 2013 M 5.1 Monti del Matese earthquake gave us a unique opportunity to use real strong-motion data to test the performance of the threshold-based early-warning method at the school. The on-site method [2] aims to define alert levels at the monitored site. In particular, at each station the characteristic P-wave period (τc) and the peak displacement (Pd) are measured on the initial P-wave signal. They are compared with threshold values previously established through an...
On Nonextensive Statistics, Chaos and Fractal Strings
Castro, C
2004-01-01
Motivated by the growing evidence of universality and chaos in QFT and string theory, we study the Tsallis non-extensive statistics (with a non-additive $q$-entropy) of an ensemble of fractal strings and branes of different dimensionalities. Non-equilibrium systems with complex dynamics in stationary states may exhibit large fluctuations of intensive quantities which are described in terms of generalized statistics. Tsallis statistics is a particular representative of such a class. The non-extensive entropy and probability distribution of a canonical ensemble of fractal strings and branes is studied in terms of their dimensional spectrum, which leads to a natural upper cutoff in energy and establishes a direct correlation among dimensions, energy and temperature. The absolute zero temperature (Kelvin) corresponds to zero dimensions (energy) and an infinite temperature corresponds to infinite dimensions. In the concluding remarks some applications of fractal statistics, quasi-particles, knot theory, quantum...
Direct Learning of Systematics-Aware Summary Statistics
CERN. Geneva
2018-01-01
Complex machine learning tools, such as deep neural networks and gradient boosting algorithms, are increasingly being used to construct powerful discriminative features for High Energy Physics analyses. These methods are typically trained with simulated or auxiliary data samples by optimising some classification or regression surrogate objective. The learned feature representations are then used to build a sample-based statistical model to perform inference (e.g. interval estimation or hypothesis testing) over a set of parameters of interest. However, the effectiveness of this approach can be reduced by the presence of known uncertainties that cause differences between training and experimental data, included in the statistical model via nuisance parameters. This work presents an end-to-end algorithm, which leverages existing deep learning technologies but aims directly to produce inference-optimal sample-summary statistics. By including the statistical model and a differentiable approximation of ...
A perceptual space of local image statistics.
Victor, Jonathan D; Thengone, Daniel J; Rizvi, Syed M; Conte, Mary M
2015-12-01
Local image statistics are important for visual analysis of textures, surfaces, and form. There are many kinds of local statistics, including those that capture luminance distributions, spatial contrast, oriented segments, and corners. While sensitivity to each of these kinds of statistics has been well studied, much less is known about visual processing when multiple kinds of statistics are relevant, in large part because the dimensionality of the problem is high and different kinds of statistics interact. To approach this problem, we focused on binary images on a square lattice - a reduced set of stimuli which nevertheless taps many kinds of local statistics. In this 10-parameter space, we determined psychophysical thresholds to each kind of statistic (16 observers) and all of their pairwise combinations (4 observers). Sensitivities and isodiscrimination contours were consistent across observers. Isodiscrimination contours were elliptical, implying a quadratic interaction rule, which in turn determined ellipsoidal isodiscrimination surfaces in the full 10-dimensional space, and made predictions for sensitivities to complex combinations of statistics. These predictions, including the prediction of a combination of statistics that was metameric to random, were verified experimentally. Finally, check size had only a mild effect on sensitivities over the range from 2.8 to 14 min, but sensitivities to second- and higher-order statistics were substantially lower at 1.4 min. In sum, local image statistics form a perceptual space that is highly stereotyped across observers, in which different kinds of statistics interact according to simple rules. Copyright © 2015 Elsevier Ltd. All rights reserved.
Metrics to Compare Aircraft Operating and Support Costs in the Department of Defense
2015-01-01
...a phenomenon in regression analysis called multicollinearity, which makes problematic the interpretation of the coefficient estimates of highly... indicating a very high amount of multicollinearity and suggesting that the magnitude of the coefficients on those variables should be treated with caution... multicollinearity between these independent variables, one must be cautious when interpreting the statistical relationship between flying hours and cost. The...
Directory of Open Access Journals (Sweden)
Anne eBobin-Bègue
2014-12-01
This study examined young children's abilities to switch from rhythm production, with short Inter-Tap Intervals (ITIs), to temporal interval production, with long ITIs (> 1 s), in a sensorimotor synchronization task. Children aged 3 and 5 years were given 6 sessions of synchronization. In a control group, they had to synchronize their ITIs to an Inter-Stimulus Interval (ISI) of 4 s. In the experimental group, they had to progressively increase their ITIs from one session to the next (from a 0.4-s to a 4.0-s ISI). Our results showed that the 5-year-olds produced longer ITIs than the 3-year-olds in synchronization. However, the ITIs of the 5-year-olds never exceeded 1.5 s, and ITIs were more variable in the control than in the experimental group. In addition, at 5 years, boys had more difficulty than girls in changing their tapping rhythm. These results suggest a temporal window in sensorimotor synchronization, beyond which the rhythm is lost and synchronization becomes difficult.
Topology for statistical modeling of petascale data.
Energy Technology Data Exchange (ETDEWEB)
Pascucci, Valerio (University of Utah, Salt Lake City, UT); Mascarenhas, Ajith Arthur; Rusek, Korben (Texas A&M University, College Station, TX); Bennett, Janine Camille; Levine, Joshua (University of Utah, Salt Lake City, UT); Pebay, Philippe Pierre; Gyulassy, Attila (University of Utah, Salt Lake City, UT); Thompson, David C.; Rojas, Joseph Maurice (Texas A&M University, College Station, TX)
2011-07-01
This document presents current technical progress and dissemination of results for the Mathematics for Analysis of Petascale Data (MAPD) project titled 'Topology for Statistical Modeling of Petascale Data', funded by the Office of Science Advanced Scientific Computing Research (ASCR) Applied Math program. Many commonly used algorithms for mathematical analysis do not scale well enough to accommodate the size or complexity of petascale data produced by computational simulations. The primary goal of this project is thus to develop new mathematical tools that address both the petascale size and uncertain nature of current data. At a high level, our approach is based on the complementary techniques of combinatorial topology and statistical modeling. In particular, we use combinatorial topology to filter out spurious data that would otherwise skew statistical modeling techniques, and we employ advanced algorithms from algebraic statistics to efficiently find globally optimal fits to statistical models. This document summarizes the technical advances we have made to date that were made possible in whole or in part by MAPD funding. These technical contributions can be divided loosely into three categories: (1) advances in the field of combinatorial topology, (2) advances in statistical modeling, and (3) new integrated topological and statistical methods.
A Reliability Test of a Complex System Based on Empirical Likelihood
Zhou, Yan; Fu, Liya; Zhang, Jun; Hui, Yongchang
2016-01-01
To analyze the reliability of a complex system described by minimal paths, an empirical likelihood method is proposed to solve the reliability test problem when the subsystem distributions are unknown. Furthermore, we provide a reliability test statistic for the complex system and derive the limit distribution of the test statistic. We can therefore obtain confidence intervals for the reliability and make statistical inferences. Simulation studies also demonstrate the theoretical results.
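The minimal-path description of system reliability can be made concrete with a toy calculation. Note the inversion relative to the paper: here component reliabilities are assumed known and independent (the empirical likelihood method is for exactly the case where the distributions are unknown), so this sketch only illustrates the minimal-path structure.

```python
# Reliability of a system described by minimal path sets, via
# inclusion-exclusion, assuming independent components with KNOWN
# reliabilities (a toy setting; the paper treats unknown distributions).
from itertools import combinations

def system_reliability(paths, r):
    """P(at least one minimal path has all its components working)."""
    total = 0.0
    for k in range(1, len(paths) + 1):
        for subset in combinations(paths, k):
            comps = set().union(*subset)      # components in this union
            term = 1.0
            for c in comps:
                term *= r[c]                  # independence assumption
            total += (-1) ** (k + 1) * term   # inclusion-exclusion sign
    return total

# Series-parallel example: two minimal paths {1,2} and {3,4}.
r = {1: 0.9, 2: 0.9, 3: 0.8, 4: 0.8}
rel = system_reliability([{1, 2}, {3, 4}], r)
print(round(rel, 4))
```

For this structure the answer agrees with the direct formula 1 − (1 − r₁r₂)(1 − r₃r₄), a useful sanity check on the inclusion-exclusion sum.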
Ganju, Jitendra; Yu, Xinxin; Ma, Guoguang Julie
2013-01-01
Formal inference in randomized clinical trials is based on controlling the type I error rate associated with a single pre-specified statistic. The deficiency of using just one method of analysis is that it depends on assumptions that may not be met. For robust inference, we propose pre-specifying multiple test statistics and relying on the minimum p-value for testing the null hypothesis of no treatment effect. The null hypothesis associated with the various test statistics is that the treatment groups are indistinguishable. The critical value for hypothesis testing comes from permutation distributions. Rejection of the null hypothesis when the smallest p-value is less than the critical value controls the type I error rate at its designated value. Even if one of the candidate test statistics has low power, the adverse effect on the power of the minimum p-value statistic is modest. Its use is illustrated with examples. We conclude that it is better to rely on the minimum p-value than on a single statistic, particularly when that single statistic is the logrank test, because of the cost and complexity of many survival trials. Copyright © 2013 John Wiley & Sons, Ltd.
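A minimal sketch of the minimum p-value idea follows, with two illustrative pre-specified statistics (mean difference and median difference) and permutation p-values. The data are invented, and a full analysis would also calibrate the minimum against its own permutation distribution, as the abstract's critical-value construction requires.

```python
# Sketch of the minimum-p-value approach with two pre-specified statistics
# and label permutations. Data and statistics are illustrative, not from the
# paper; a real analysis compares p_min to the permutation distribution of
# the minimum, not to alpha directly.
import random

def perm_pvalue(stat, x, y, n_perm, rng):
    """Two-sided permutation p-value for one test statistic."""
    obs = stat(x, y)
    pooled = x + y
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        if abs(stat(pooled[:len(x)], pooled[len(x):])) >= abs(obs):
            count += 1
    return (count + 1) / (n_perm + 1)

mean_diff = lambda a, b: sum(a) / len(a) - sum(b) / len(b)
median = lambda a: sorted(a)[len(a) // 2]
median_diff = lambda a, b: median(a) - median(b)

rng = random.Random(0)
x = [4.1, 5.0, 6.2, 5.5, 4.8]      # invented treatment-group responses
y = [3.0, 3.9, 2.8, 3.5, 4.0]      # invented control-group responses
p_min = min(perm_pvalue(s, x[:], y[:], 999, rng)
            for s in (mean_diff, median_diff))
print(p_min)
```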
DEFF Research Database (Denmark)
Wu, Mo; Forchhammer, Søren; Aghito, Shankar Manuel
2007-01-01
A complexity control algorithm for H.264 advanced video coding is proposed. The algorithm can control the complexity of integer inter motion estimation for a given target complexity. The Rate-Distortion-Complexity performance is improved by a complexity prediction model, simple analysis of the past statistics and a control scheme. The algorithm also works well under scene change conditions. Test results for coding interlaced video (720x576 PAL) are reported.
From Hamiltonian chaos to complex systems a nonlinear physics approach
Leonetti, Marc
2013-01-01
From Hamiltonian Chaos to Complex Systems: A Nonlinear Physics Approach collects contributions on recent developments in non-linear dynamics and statistical physics with an emphasis on complex systems. This book provides a wide range of state-of-the-art research in these fields. The unifying aspect of this book is a demonstration of how similar tools coming from dynamical systems, nonlinear physics, and statistical dynamics can lead to a large panorama of research in various fields of physics and beyond, most notably with the perspective of application in complex systems. This book also: Illustrates the broad research influence of tools coming from dynamical systems, nonlinear physics, and statistical dynamics Adopts a pedagogic approach to facilitate understanding by non-specialists and students Presents applications in complex systems Includes 150 illustrations From Hamiltonian Chaos to Complex Systems: A Nonlinear Physics Approach is an ideal book for graduate students and researchers working in applied...
Energy Technology Data Exchange (ETDEWEB)
Kamp, Derek van der [University of Victoria, Pacific Climate Impacts Consortium, Victoria, BC (Canada); University of Victoria, School of Earth and Ocean Sciences, Victoria, BC (Canada); Curry, Charles L. [Environment Canada University of Victoria, Canadian Centre for Climate Modelling and Analysis, Victoria, BC (Canada); University of Victoria, School of Earth and Ocean Sciences, Victoria, BC (Canada); Monahan, Adam H. [University of Victoria, School of Earth and Ocean Sciences, Victoria, BC (Canada)
2012-04-15
A regression-based downscaling technique was applied to monthly mean surface wind observations from stations throughout western Canada as well as from buoys in the Northeast Pacific Ocean over the period 1979-2006. A predictor set was developed from principal component analysis of the three wind components at 500 hPa and mean sea-level pressure taken from the NCEP Reanalysis II. Building on the results of a companion paper, Curry et al. (Clim Dyn 2011), the downscaling was applied to both wind speed and wind components, in an effort to evaluate the utility of each type of predictand. Cross-validated prediction skill varied strongly with season, with autumn and summer displaying the highest and lowest skill, respectively. In most cases wind components were predicted with better skill than wind speeds. The predictive ability of wind components was found to be strongly related to their orientation. Wind components with the best predictions were often oriented along topographically significant features such as constricted valleys, mountain ranges or ocean channels. This influence of directionality on predictive ability is most prominent during autumn and winter at inland sites with complex topography. Stations in regions with relatively flat terrain (where topographic steering is minimal) exhibit inter-station consistencies including region-wide seasonal shifts in the direction of the best predicted wind component. The conclusion that wind components can be skillfully predicted only over a limited range of directions at most stations limits the scope of statistically downscaled wind speed predictions. It seems likely that such limitations apply to other regions of complex terrain as well. (orig.)
Subjective randomness as statistical inference.
Griffiths, Thomas L; Daniels, Dylan; Austerweil, Joseph L; Tenenbaum, Joshua B
2018-06-01
Some events seem more random than others. For example, when tossing a coin, a sequence of eight heads in a row does not seem very random. Where do these intuitions about randomness come from? We argue that subjective randomness can be understood as the result of a statistical inference assessing the evidence that an event provides for having been produced by a random generating process. We show how this account provides a link to previous work relating randomness to algorithmic complexity, in which random events are those that cannot be described by short computer programs. Algorithmic complexity is both incomputable and too general to capture the regularities that people can recognize, but viewing randomness as statistical inference provides two paths to addressing these problems: considering regularities generated by simpler computing machines, and restricting the set of probability distributions that characterize regularity. Building on previous work exploring these different routes to a more restricted notion of randomness, we define strong quantitative models of human randomness judgments that apply not just to binary sequences - which have been the focus of much of the previous work on subjective randomness - but also to binary matrices and spatial clustering. Copyright © 2018 Elsevier Inc. All rights reserved.
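The inference view of randomness can be sketched as a likelihood ratio between a fair-coin model and a simple "regular" model; the repetition-biased Markov model and its parameter below are illustrative stand-ins for the richer generative models considered in the paper.

```python
# Hedged sketch of randomness-as-statistical-inference: score a binary
# sequence by log P(seq | random) - log P(seq | regular), where "regular"
# is an illustrative repetition-biased Markov model (p_repeat is an
# assumption, not a parameter from the paper).
import math

def randomness_score(seq, p_repeat=0.9):
    # Fair-coin model: every symbol has probability 1/2.
    log_random = len(seq) * math.log(0.5)
    # Regular model: first symbol 1/2, then each symbol repeats the
    # previous one with probability p_repeat.
    log_regular = math.log(0.5)
    for prev, cur in zip(seq, seq[1:]):
        log_regular += math.log(p_repeat if cur == prev else 1 - p_repeat)
    return log_random - log_regular   # higher => looks more random

# A run of eight heads is well explained by the regular model, so it
# scores lower (less subjectively random) than a mixed sequence.
assert randomness_score("HHHHHHHH") < randomness_score("HTHHTHTT")
```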
Maximizing information exchange between complex networks
International Nuclear Information System (INIS)
West, Bruce J.; Geneston, Elvis L.; Grigolini, Paolo
2008-01-01
Science is not merely the smooth progressive interaction of hypothesis, experiment and theory, although it sometimes has that form. More realistically the scientific study of any given complex phenomenon generates a number of explanations, from a variety of perspectives, that eventually requires synthesis to achieve a deep level of insight and understanding. One such synthesis has created the field of out-of-equilibrium statistical physics as applied to the understanding of complex dynamic networks. Over the past forty years the concept of complexity has undergone a metamorphosis. Complexity was originally seen as a consequence of memory in individual particle trajectories, in full agreement with a Hamiltonian picture of microscopic dynamics and, in principle, macroscopic dynamics could be derived from the microscopic Hamiltonian picture. The main difficulty in deriving macroscopic dynamics from microscopic dynamics is the need to take into account the actions of a very large number of components. The existence of events such as abrupt jumps, considered by the conventional continuous time random walk approach to describing complexity was never perceived as conflicting with the Hamiltonian view. Herein we review many of the reasons why this traditional Hamiltonian view of complexity is unsatisfactory. We show that as a result of technological advances, which make the observation of single elementary events possible, the definition of complexity has shifted from the conventional memory concept towards the action of non-Poisson renewal events. We show that the observation of crucial processes, such as the intermittent fluorescence of blinking quantum dots as well as the brain's response to music, as monitored by a set of electrodes attached to the scalp, has forced investigators to go beyond the traditional concept of complexity and to establish closer contact with the nascent field of complex networks. Complex networks form one of the most challenging areas of modern
Maximizing information exchange between complex networks
West, Bruce J.; Geneston, Elvis L.; Grigolini, Paolo
2008-10-01
Science is not merely the smooth progressive interaction of hypothesis, experiment and theory, although it sometimes has that form. More realistically the scientific study of any given complex phenomenon generates a number of explanations, from a variety of perspectives, that eventually requires synthesis to achieve a deep level of insight and understanding. One such synthesis has created the field of out-of-equilibrium statistical physics as applied to the understanding of complex dynamic networks. Over the past forty years the concept of complexity has undergone a metamorphosis. Complexity was originally seen as a consequence of memory in individual particle trajectories, in full agreement with a Hamiltonian picture of microscopic dynamics and, in principle, macroscopic dynamics could be derived from the microscopic Hamiltonian picture. The main difficulty in deriving macroscopic dynamics from microscopic dynamics is the need to take into account the actions of a very large number of components. The existence of events such as abrupt jumps, considered by the conventional continuous time random walk approach to describing complexity was never perceived as conflicting with the Hamiltonian view. Herein we review many of the reasons why this traditional Hamiltonian view of complexity is unsatisfactory. We show that as a result of technological advances, which make the observation of single elementary events possible, the definition of complexity has shifted from the conventional memory concept towards the action of non-Poisson renewal events. We show that the observation of crucial processes, such as the intermittent fluorescence of blinking quantum dots as well as the brain’s response to music, as monitored by a set of electrodes attached to the scalp, has forced investigators to go beyond the traditional concept of complexity and to establish closer contact with the nascent field of complex networks. Complex networks form one of the most challenging areas of
Maximizing information exchange between complex networks
Energy Technology Data Exchange (ETDEWEB)
West, Bruce J. [Mathematical and Information Science, Army Research Office, Research Triangle Park, NC 27708 (United States); Physics Department, Duke University, Durham, NC 27709 (United States)], E-mail: bwest@nc.rr.com; Geneston, Elvis L. [Center for Nonlinear Science, University of North Texas, P.O. Box 311427, Denton, TX 76203-1427 (United States); Physics Department, La Sierra University, 4500 Riverwalk Parkway, Riverside, CA 92515 (United States); Grigolini, Paolo [Center for Nonlinear Science, University of North Texas, P.O. Box 311427, Denton, TX 76203-1427 (United States); Istituto di Processi Chimico Fisici del CNR, Area della Ricerca di Pisa, Via G. Moruzzi, 56124, Pisa (Italy); Dipartimento di Fisica ' E. Fermi' Universita' di Pisa, Largo Pontecorvo 3, 56127 Pisa (Italy)
2008-10-15
Science is not merely the smooth progressive interaction of hypothesis, experiment and theory, although it sometimes has that form. More realistically, the scientific study of any given complex phenomenon generates a number of explanations, from a variety of perspectives, that eventually require synthesis to achieve a deep level of insight and understanding. One such synthesis has created the field of out-of-equilibrium statistical physics as applied to the understanding of complex dynamic networks. Over the past forty years the concept of complexity has undergone a metamorphosis. Complexity was originally seen as a consequence of memory in individual particle trajectories, in full agreement with a Hamiltonian picture of microscopic dynamics; in principle, macroscopic dynamics could be derived from the microscopic Hamiltonian picture. The main difficulty in deriving macroscopic dynamics from microscopic dynamics is the need to take into account the actions of a very large number of components. The existence of events such as abrupt jumps, considered by the conventional continuous time random walk approach to describing complexity, was never perceived as conflicting with the Hamiltonian view. Herein we review many of the reasons why this traditional Hamiltonian view of complexity is unsatisfactory. We show that as a result of technological advances, which make the observation of single elementary events possible, the definition of complexity has shifted from the conventional memory concept towards the action of non-Poisson renewal events. We show that the observation of crucial processes, such as the intermittent fluorescence of blinking quantum dots as well as the brain's response to music, as monitored by a set of electrodes attached to the scalp, has forced investigators to go beyond the traditional concept of complexity and to establish closer contact with the nascent field of complex networks. Complex networks form one of the most challenging areas of
Information geometric methods for complexity
Felice, Domenico; Cafaro, Carlo; Mancini, Stefano
2018-03-01
Research on the use of information geometry (IG) in modern physics has witnessed significant advances recently. In this review article, we report on the utilization of IG methods to define measures of complexity in both classical and, whenever available, quantum physical settings. A paradigmatic example of a dramatic change in complexity is given by phase transitions (PTs). Hence, we review both global and local aspects of PTs described in terms of the scalar curvature of the parameter manifold and the components of the metric tensor, respectively. We also report on the behavior of geodesic paths on the parameter manifold used to gain insight into the dynamics of PTs. Going further, we survey measures of complexity arising in the geometric framework. In particular, we quantify complexity of networks in terms of the Riemannian volume of the parameter space of a statistical manifold associated with a given network. We are also concerned with complexity measures that account for the interactions of a given number of parts of a system that cannot be described in terms of a smaller number of parts of the system. Finally, we investigate complexity measures of entropic motion on curved statistical manifolds that arise from a probabilistic description of physical systems in the presence of limited information. The Kullback-Leibler divergence, the distance to an exponential family and volumes of curved parameter manifolds, are examples of essential IG notions exploited in our discussion of complexity. We conclude by discussing strengths, limits, and possible future applications of IG methods to the physics of complexity.
Universal Poisson Statistics of mRNAs with Complex Decay Pathways.
Thattai, Mukund
2016-01-19
Messenger RNA (mRNA) dynamics in single cells are often modeled as a memoryless birth-death process with a constant probability per unit time that an mRNA molecule is synthesized or degraded. This predicts a Poisson steady-state distribution of mRNA number, in close agreement with experiments. This is surprising, since mRNA decay is known to be a complex process. The paradox is resolved by realizing that the Poisson steady state generalizes to arbitrary mRNA lifetime distributions. A mapping between mRNA dynamics and queueing theory highlights an identifiability problem: a measured Poisson steady state is consistent with a large variety of microscopic models. Here, I provide a rigorous and intuitive explanation for the universality of the Poisson steady state. I show that the mRNA birth-death process and its complex decay variants all take the form of the familiar Poisson law of rare events, under a nonlinear rescaling of time. As a corollary, not only steady-states but also transients are Poisson distributed. Deviations from the Poisson form occur only under two conditions, promoter fluctuations leading to transcriptional bursts or nonindependent degradation of mRNA molecules. These results place severe limits on the power of single-cell experiments to probe microscopic mechanisms, and they highlight the need for single-molecule measurements. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
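The birth-death picture described above is easy to check numerically. The sketch below (a minimal Gillespie-style simulation written for illustration, not code from the paper) simulates constant-rate synthesis and independent per-molecule degradation, then verifies that the time-weighted copy-number distribution has a Fano factor (variance/mean) near 1, as a Poisson steady state requires; the rate values are arbitrary:

```python
import random

def gillespie_mrna(k_syn=10.0, k_deg=1.0, t_end=2000.0, t_burn=100.0, seed=1):
    """Simulate the memoryless mRNA birth-death process.

    Synthesis occurs at constant rate k_syn; each molecule decays
    independently at rate k_deg.  The predicted steady state is
    Poisson with mean k_syn / k_deg.
    Returns (copy number, dwell time) pairs after a burn-in period.
    """
    rng = random.Random(seed)
    t, n = 0.0, 0
    weighted = []
    while t < t_end:
        rate = k_syn + n * k_deg            # total event rate
        dt = rng.expovariate(rate)          # time to the next event
        if t > t_burn:                      # discard the transient
            weighted.append((n, dt))
        t += dt
        if rng.random() < k_syn / rate:     # birth vs. death
            n += 1
        else:
            n -= 1
    return weighted

w = gillespie_mrna()
tot = sum(dt for _, dt in w)
mean = sum(n * dt for n, dt in w) / tot                 # ~ k_syn / k_deg = 10
var = sum((n - mean) ** 2 * dt for n, dt in w) / tot
print(round(mean, 1), round(var / mean, 2))             # Fano factor near 1
```

Weighting each state by its dwell time (rather than counting jump events) is what makes the empirical distribution converge to the stationary one.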
Applied statistics in agricultural, biological, and environmental sciences.
Agronomic research often involves the measurement and collection of multiple response variables in an effort to understand the more complex nature of the system being studied. Multivariate statistical methods encompass the simultaneous analysis of all random variables measured on each experimental or s...
Statistical physics of hard combinatorial optimization: Vertex cover problem
Zhao, Jin-Hua; Zhou, Hai-Jun
2014-07-01
Typical-case computation complexity is a research topic at the boundary of computer science, applied mathematics, and statistical physics. In the last twenty years, the replica-symmetry-breaking mean field theory of spin glasses and the associated message-passing algorithms have greatly deepened our understanding of typical-case computation complexity. In this paper, we use the vertex cover problem, a basic nondeterministic-polynomial (NP)-complete combinatorial optimization problem with wide applications, as an example to introduce the statistical physics methods and algorithms. We do not go into the technical details but emphasize mainly the intuitive physical meaning of the message-passing equations. An unfamiliar reader should be able to understand, to a large extent, the physics behind the mean field approaches and to adapt the mean field methods to other optimization problems.
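The vertex cover problem itself is simple to state in code. The message-passing algorithms surveyed above are beyond a short sketch, so as a baseline for contrast here is the classic maximal-matching 2-approximation (a textbook algorithm, not the paper's method; the example graph is illustrative):

```python
def vertex_cover_2approx(edges):
    """Maximal-matching 2-approximation for minimum vertex cover.

    Repeatedly pick an edge with both endpoints uncovered and add
    both endpoints; the result is at most twice the optimum size.
    """
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.add(u)
            cover.add(v)
    return cover

# A 5-cycle: the optimum cover has size 3.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
cover = vertex_cover_2approx(edges)
assert all(u in cover or v in cover for u, v in edges)   # every edge covered
print(len(cover))
```

On this instance the approximation returns 4 vertices against an optimum of 3, within the guaranteed factor of 2.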
Statistical Analysis of Radio Propagation Channel in Ruins Environment
Directory of Open Access Journals (Sweden)
Jiao He
2015-01-01
Full Text Available The cellphone-based localization system for search and rescue in complex high-density ruins has attracted great interest in recent years, where the radio channel characteristics are critical for the design and development of such a system. This paper presents a spatial smoothing estimation via rotational invariance technique (SS-ESPRIT) for radio channel characterization of high-density ruins. The radio propagation at three typical mobile communication bands (0.9, 1.8, and 2 GHz) is investigated in two different scenarios. Channel parameters, such as arrival time, delays, and complex amplitudes, are statistically analyzed. Furthermore, a channel simulator is built based on these statistics. By comparative analysis of average excess delay and delay spread, the validation results show a good agreement between the measurements and the channel modeling results.
Quantum formalism for classical statistics
Wetterich, C.
2018-06-01
In static classical statistical systems the problem of information transport from a boundary to the bulk finds a simple description in terms of wave functions or density matrices. While the transfer matrix formalism is a type of Heisenberg picture for this problem, we develop here the associated Schrödinger picture that keeps track of the local probabilistic information. The transport of the probabilistic information between neighboring hypersurfaces obeys a linear evolution equation, and therefore the superposition principle for the possible solutions. Operators are associated to local observables, with rules for the computation of expectation values similar to quantum mechanics. We discuss how non-commutativity naturally arises in this setting. Also other features characteristic of quantum mechanics, such as complex structure, change of basis or symmetry transformations, can be found in classical statistics once formulated in terms of wave functions or density matrices. We construct for every quantum system an equivalent classical statistical system, such that time in quantum mechanics corresponds to the location of hypersurfaces in the classical probabilistic ensemble. For suitable choices of local observables in the classical statistical system one can, in principle, compute all expectation values and correlations of observables in the quantum system from the local probabilistic information of the associated classical statistical system. Realizing a static memory material as a quantum simulator for a given quantum system is not a matter of principle, but rather of practical simplicity.
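The transfer matrix formalism mentioned above can be illustrated on the simplest classical statistical system: a zero-field periodic 1D Ising chain, where the 2x2 transfer matrix gives the partition function exactly as a trace. The sketch below is my own illustration (not from the paper) and checks the eigenvalue formula against brute-force enumeration of all spin configurations:

```python
import math

def transfer_matrix_partition(N, beta_J):
    """Partition function of a periodic zero-field 1D Ising chain.

    The transfer matrix T = [[e^bJ, e^-bJ], [e^-bJ, e^bJ]] has
    eigenvalues 2*cosh(bJ) and 2*sinh(bJ), so
    Z = Tr(T^N) = lam_plus**N + lam_minus**N.
    """
    lam_plus = 2.0 * math.cosh(beta_J)
    lam_minus = 2.0 * math.sinh(beta_J)
    return lam_plus ** N + lam_minus ** N

def brute_force_partition(N, beta_J):
    """Direct sum over all 2**N spin configurations (sanity check)."""
    Z = 0.0
    for config in range(2 ** N):
        spins = [1 if (config >> i) & 1 else -1 for i in range(N)]
        energy = sum(spins[i] * spins[(i + 1) % N] for i in range(N))
        Z += math.exp(beta_J * energy)
    return Z

Z_tm = transfer_matrix_partition(8, 0.5)
Z_bf = brute_force_partition(8, 0.5)
print(abs(Z_tm - Z_bf) < 1e-9 * Z_bf)   # the two agree
```

Powers of the transfer matrix propagate the local probabilistic information from site to site, which is exactly the boundary-to-bulk information transport the abstract describes.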
Online Statistical Modeling (Regression Analysis) for Independent Responses
Made Tirta, I.; Anggraeni, Dian; Pandutama, Martinus
2017-06-01
Regression analysis (statistical modelling) is among the statistical methods most frequently needed in analyzing quantitative data, especially to model relationships between response and explanatory variables. Nowadays, statistical models have been developed in various directions to model various types of complex relationships in data. Rich varieties of advanced and recent statistical models are mostly available in open source software (one of them is R). However, these advanced statistical models are not very friendly to novice R users, since they are based on programming scripts or a command line interface. Our research aims to develop a web interface (based on R and Shiny), so that the most recent and advanced statistical modelling is readily available, accessible, and applicable on the web. We have previously made interfaces in the form of e-tutorials for several modern and advanced statistical models in R, especially for independent responses (including linear models/LM, generalized linear models/GLM, generalized additive models/GAM, and generalized additive models for location, scale and shape/GAMLSS). In this research we unified them in the form of data analysis, including models using computer-intensive statistics (bootstrap and Markov Chain Monte Carlo/MCMC). All are readily accessible in our online Virtual Statistics Laboratory. The web interface makes statistical modeling easier to apply and easier to compare models in order to find the most appropriate model for the data.
Rabin, Laura A.; Nutter-Upham, Katherine E.
2010-01-01
We describe an active learning exercise intended to improve undergraduate students' understanding of statistics by grounding complex concepts within a meaningful, applied context. Students in a journal excerpt activity class read brief excerpts of statistical reporting from published research articles, answered factual and interpretive questions,…
Statistical and machine learning approaches for network analysis
Dehmer, Matthias
2012-01-01
Explore the multidisciplinary nature of complex networks through machine learning techniques Statistical and Machine Learning Approaches for Network Analysis provides an accessible framework for structurally analyzing graphs by bringing together known and novel approaches on graph classes and graph measures for classification. By providing different approaches based on experimental data, the book uniquely sets itself apart from the current literature by exploring the application of machine learning techniques to various types of complex networks. Comprised of chapters written by internation
Renyi statistics in equilibrium statistical mechanics
International Nuclear Information System (INIS)
Parvan, A.S.; Biro, T.S.
2010-01-01
The Renyi statistics in the canonical and microcanonical ensembles is examined, both in general and in particular for the ideal gas. In the microcanonical ensemble the Renyi statistics is equivalent to the Boltzmann-Gibbs statistics. By the exact analytical results for the ideal gas, it is shown that in the canonical ensemble, taking the thermodynamic limit, the Renyi statistics is also equivalent to the Boltzmann-Gibbs statistics. Furthermore, it satisfies the requirements of equilibrium thermodynamics, i.e. the thermodynamic potential of the statistical ensemble is a homogeneous function of first degree of its extensive variables of state. We conclude that the Renyi statistics arrives at the same thermodynamic relations as those stemming from the Boltzmann-Gibbs statistics in this limit.
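The limit in which Rényi statistics reduces to Boltzmann-Gibbs statistics is easy to see numerically at the level of entropies: the Rényi entropy S_q = ln(Σ p_i^q) / (1 − q) approaches the Shannon (Boltzmann-Gibbs) entropy as q → 1. The sketch below (an illustration with an arbitrary example distribution, not code from the paper) shows the convergence:

```python
import math

def renyi_entropy(p, q):
    """Renyi entropy S_q = ln(sum_i p_i**q) / (1 - q), for q != 1."""
    return math.log(sum(pi ** q for pi in p)) / (1.0 - q)

def shannon_entropy(p):
    """Boltzmann-Gibbs/Shannon limit: S = -sum_i p_i * ln(p_i)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.125, 0.125]        # an arbitrary normalized distribution
for q in (2.0, 1.1, 1.01, 1.001):    # S_q climbs toward the Shannon value
    print(round(renyi_entropy(p, q), 4))
print(round(shannon_entropy(p), 4))  # = 1.75 * ln(2) for this distribution
```

For this distribution the Shannon entropy is 1.75 ln 2 ≈ 1.213, and S_q at q = 1.001 already matches it to about two decimal places.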
Measurement and Statistics of Application Business in Complex Internet
Wang, Lei; Li, Yang; Li, Yipeng; Wu, Shuhang; Song, Shiji; Ren, Yong
Owing to independent topologies and an autonomic routing mechanism, the logical networks formed by Internet application business behavior significantly influence the physical networks. In this paper, the backbone traffic of TUNET (Tsinghua University Networks) is measured; furthermore, the two most important application businesses, HTTP and P2P, are analyzed at the IP-packet level. It is shown that uplink HTTP and P2P packet behavior presents spatio-temporal power-law characteristics with exponents 1.25 and 1.53, respectively. Downlink HTTP packet behavior also presents power-law characteristics, but with a smaller exponent γ = 0.82, which differs from traditional complex networks research results. Moreover, the downlink P2P packet distribution presents an approximate power-law, which means that flow equilibrium actually profits little from the distributed peer-to-peer mechanism.
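A power-law exponent like the 1.53 reported above is often estimated by a least-squares fit on log-log axes. The sketch below does this on synthetic data (the TUNET measurements are not available here) and recovers the planted exponent; for heavy-tailed empirical data a maximum-likelihood estimator is usually preferred over log-log regression, so treat this as the simplest possible approach:

```python
import math
import random

def fit_loglog_slope(xs, ys):
    """Least-squares slope of log(y) versus log(x).

    For y ~ x**(-gamma) the returned slope is approximately -gamma.
    """
    lx = [math.log(x) for x in xs]
    ly = [math.log(y) for y in ys]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    num = sum((a - mx) * (b - my) for a, b in zip(lx, ly))
    den = sum((a - mx) ** 2 for a in lx)
    return num / den

# Synthetic distribution with planted exponent gamma = 1.53 plus mild noise.
rng = random.Random(0)
xs = list(range(1, 200))
ys = [x ** -1.53 * math.exp(rng.gauss(0, 0.05)) for x in xs]
gamma = -fit_loglog_slope(xs, ys)
print(round(gamma, 2))   # close to the planted 1.53
```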
Bouhaj, M.; von Estorff, O.; Peiffer, A.
2017-09-01
In the application of Statistical Energy Analysis (SEA) to complex assembled structures, a purely predictive model often exhibits errors. These errors are mainly due to a lack of accurate modelling of the power transmission mechanism described through the Coupling Loss Factors (CLF). Experimental SEA (ESEA) is used in practice by the automotive and aerospace industry to verify and update the model, or to derive the CLFs for use in an SEA predictive model when analytical estimates cannot be made. This work is particularly motivated by the lack of procedures for estimating the variance and confidence intervals of the statistical quantities when using the ESEA technique. The aim of this paper is to introduce procedures enabling a statistical description of the measured power input, vibration energies, and the derived SEA parameters. Particular emphasis is placed on the identification of structural CLFs of complex built-up structures, comparing different methods. By adopting a Stochastic Energy Model (SEM), the ensemble average in ESEA is also addressed. For this purpose, expressions are obtained to randomly perturb the energy matrix elements and generate individual samples for the Monte Carlo (MC) technique applied to derive the ensemble-averaged CLF. From the results of ESEA tests conducted on an aircraft fuselage section, the SEM approach provides better estimates of the CLFs than classical matrix inversion methods. The expected range of CLF values and the synthesized energy are used as quality criteria for the matrix inversion, allowing critical SEA subsystems to be assessed, which might require a more refined statistical description of the excitation and response fields. Moreover, the impact of the variance of the normalized vibration energy on the uncertainty of the derived CLFs is outlined.
Statistical Methods for Environmental Pollution Monitoring
Energy Technology Data Exchange (ETDEWEB)
Gilbert, Richard O. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)
1987-01-01
The application of statistics to environmental pollution monitoring studies requires a knowledge of statistical analysis methods particularly well suited to pollution data. This book fills that need by providing sampling plans, statistical tests, parameter estimation procedures, and references to pertinent publications. Most of the statistical techniques are relatively simple, and examples, exercises, and case studies are provided to illustrate procedures. The book is logically divided into three parts. Chapters 1, 2, and 3 are introductory chapters. Chapters 4 through 10 discuss field sampling designs, and Chapters 11 through 18 deal with a broad range of statistical analysis procedures. Some statistical techniques given here are not commonly seen in statistics books. For example, see methods for handling correlated data (Sections 4.5 and 11.12), for detecting hot spots (Chapter 10), and for estimating a confidence interval for the mean of a lognormal distribution (Section 13.2). Also, Appendix B lists a computer code that estimates and tests for trends over time at one or more monitoring stations using nonparametric methods (Chapters 16 and 17). Unfortunately, some important topics could not be included because of their complexity and the need to limit the length of the book. For example, only brief mention could be made of time series analysis using Box-Jenkins methods and of kriging techniques for estimating spatial and spatial-time patterns of pollution, although multiple references on these topics are provided. Also, no discussion of methods for assessing risks from environmental pollution could be included.
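One topic the book singles out, a confidence interval for the mean of a lognormal distribution, can be approximated without the specialized lognormal methods it describes. The sketch below uses a percentile bootstrap on synthetic concentration data; this is a generic alternative written for illustration, not the procedure of the book's Section 13.2:

```python
import math
import random

def bootstrap_ci_mean(data, n_boot=2000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval for the mean of
    positively skewed (e.g. lognormal) data.

    Resamples the data with replacement n_boot times and takes the
    alpha/2 and 1-alpha/2 quantiles of the resampled means.
    """
    rng = random.Random(seed)
    n = len(data)
    means = sorted(
        sum(rng.choice(data) for _ in range(n)) / n for _ in range(n_boot)
    )
    lo = means[int(n_boot * alpha / 2)]
    hi = means[int(n_boot * (1 - alpha / 2))]
    return lo, hi

# Synthetic lognormal "concentration" data; the true mean is e^0.5 ~ 1.65.
rng = random.Random(42)
data = [math.exp(rng.gauss(0.0, 1.0)) for _ in range(200)]
lo, hi = bootstrap_ci_mean(data)
print(round(lo, 2), round(hi, 2))
```

The bootstrap makes no distributional assumption, which is convenient but generally less efficient than a method that exploits the lognormal form when that form actually holds.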
Phenomenon of statistical instability of the third type systems—complexity
Eskov, V. V.; Gavrilenko, T. V.; Eskov, V. M.; Vokhmina, Yu. V.
2017-11-01
The problem of the existence and special properties of third type systems has been formulated within the new chaos-self-organization theory. In fact, a global problem of the possibility of the existence of steady-state regimes for homeostatic systems has been considered. These systems include not only medical and biological systems, but also the dynamics of meteorological parameters, as well as the ambient parameters of the environment in which humans are located. The new approach has been used to give a new definition for homeostatic systems (complexity).
Directory of Open Access Journals (Sweden)
Zaira M Alieva
2016-01-01
Full Text Available The article analyzes the application of mathematical and statistical methods in the analysis of socio-humanistic texts. It describes the essence of mathematical and statistical methods and presents examples of their use in the study of humanities and social phenomena. It considers the key issues faced by the expert in the application of mathematical-statistical methods in the socio-humanitarian sphere, including the persistent contrast between the socio-humanitarian sciences and mathematics; the complexity of identifying the object that is the bearer of the problem; and the need for a probabilistic approach. Conclusions are drawn from the results of the study.
DEFF Research Database (Denmark)
Conradsen, Knut; Nielsen, Allan Aasbjerg; Schou, Jesper
2003-01-01
. Based on this distribution, a test statistic for equality of two such matrices and an associated asymptotic probability for obtaining a smaller value of the test statistic are derived and applied successfully to change detection in polarimetric SAR data. In a case study, EMISAR L-band data from April 17...... to HH, VV, or HV data alone, the derived test statistic reduces to the well-known gamma likelihood-ratio test statistic. The derived test statistic and the associated significance value can be applied as a line or edge detector in fully polarimetric SAR data also....
Directory of Open Access Journals (Sweden)
Rasekh HR
2011-11-01
Full Text Available Hamid Reza Rasekh1, Ali Imani1, Mehran Karimi2, Mina Golestani1
1Shahid Beheshti University of Medical Sciences, Department of Pharmaceutical Management and Pharmacoeconomics, School of Pharmacy, Tehran, 2University of Shiraz Medical Sciences, Hematology Research Center, Shiraz, Iran
Background: In developing countries, the treatment of hemophilia patients with inhibitors is presently the most challenging and serious issue in hemophilia management, direct costs of clotting factor concentrates accounting for >98% of the highest economic burden absorbed for the health care of patients in this setting. In the setting of chronic diseases, cost-utility analysis, which takes into account the beneficial effects of a given treatment/health care intervention in terms of health-related quality of life, is likely to be the most appropriate approach.
Objective: The aim of this study was to assess the incremental cost-effectiveness ratios of immune tolerance induction (ITI) therapy with plasma-derived factor VIII concentrates versus on-demand treatment with recombinant-activated FVIIa (rFVIIa) in hemophilia A with high titer inhibitors from an Iranian Ministry of Health perspective.
Methods: This study was based on the study of Knight et al, which evaluated the cost-effectiveness ratios of different treatments for hemophilia A with high-responding inhibitors. To adapt Knight et al's results to the Iranian context, a few clinical parameters were varied, and cost data were replaced with the corresponding Iranian estimates of resource use. The time horizon of the analysis was 10 years. One-way sensitivity analyses were performed, varying the cost of the clotting factor, the drug dose, and the administration frequency, to test the robustness of the analysis.
Results: Comparison of the incremental cost-effectiveness ratios between the three ITI protocols and the on-demand regimen with rFVIIa shows that all three ITI protocols dominate the on-demand regimen with r
International Nuclear Information System (INIS)
Tadaki, Kohtaro
2010-01-01
The statistical mechanical interpretation of algorithmic information theory (AIT, for short) was introduced and developed by our former works [K. Tadaki, Local Proceedings of CiE 2008, pp. 425-434, 2008] and [K. Tadaki, Proceedings of LFCS'09, Springer's LNCS, vol. 5407, pp. 422-440, 2009], where we introduced the notion of thermodynamic quantities, such as partition function Z(T), free energy F(T), energy E(T), statistical mechanical entropy S(T), and specific heat C(T), into AIT. We then discovered that, in the interpretation, the temperature T equals the partial randomness of the values of all these thermodynamic quantities, where the notion of partial randomness is a stronger representation of the compression rate by means of program-size complexity. Furthermore, we showed that this situation holds for the temperature T itself, which is one of the most typical thermodynamic quantities. Namely, we showed that, for each of the thermodynamic quantities Z(T), F(T), E(T), and S(T) above, the computability of its value at temperature T gives a sufficient condition for T ∈ (0,1) to satisfy the condition that the partial randomness of T equals T. In this paper, based on a physical argument on the same level of mathematical strictness as normal statistical mechanics in physics, we develop a total statistical mechanical interpretation of AIT which actualizes a perfect correspondence to normal statistical mechanics. We do this by identifying a microcanonical ensemble in the framework of AIT. As a result, we clarify the statistical mechanical meaning of the thermodynamic quantities of AIT.
Management of complex dynamical systems
MacKay, R. S.
2018-02-01
Complex dynamical systems are systems with many interdependent components which evolve in time. One might wish to control their trajectories, but a more practical alternative is to control just their statistical behaviour. In many contexts this would be both sufficient and a more realistic goal, e.g. climate and socio-economic systems. I refer to it as ‘management’ of complex dynamical systems. In this paper, some mathematics for management of complex dynamical systems is developed in the weakly dependent regime, and questions are posed for the strongly dependent regime.
Complex data modeling and computationally intensive methods for estimation and prediction
Secchi, Piercesare; Advances in Complex Data Modeling and Computational Methods in Statistics
2015-01-01
The book is addressed to statisticians working at the forefront of the statistical analysis of complex and high dimensional data and offers a wide variety of statistical models, computer intensive methods and applications: network inference from the analysis of high dimensional data; new developments for bootstrapping complex data; regression analysis for measuring the downsize reputational risk; statistical methods for research on the human genome dynamics; inference in non-euclidean settings and for shape data; Bayesian methods for reliability and the analysis of complex data; methodological issues in using administrative data for clinical and epidemiological research; regression models with differential regularization; geostatistical methods for mobility analysis through mobile phone data exploration. This volume is the result of a careful selection among the contributions presented at the conference "S.Co.2013: Complex data modeling and computationally intensive methods for estimation and prediction" held...
Statistical Models of Adaptive Immune populations
Sethna, Zachary; Callan, Curtis; Walczak, Aleksandra; Mora, Thierry
The availability of large (10^4-10^6 sequences) datasets of B or T cell populations from a single individual allows reliable fitting of complex statistical models for naïve generation, somatic selection, and hypermutation. It is crucial to utilize a probabilistic/informational approach when modeling these populations. The inferred probability distributions allow for population characterization, calculation of probability distributions of various hidden variables (e.g. number of insertions), as well as statistical properties of the distribution itself (e.g. entropy). In particular, the differences between the T cell populations of embryonic and mature mice will be examined as a case study. Comparing these populations, as well as proposed mixed populations, provides a concrete exercise in model creation, comparison, choice, and validation.
Implementing the “Big Data” Concept in Official Statistics
О. V.
2017-01-01
Big data is a huge resource that needs to be used at all levels of economic planning. The article is devoted to the study of the development of the concept of “Big Data” in the world and its impact on the transformation of statistical simulation of economic processes. Statistics at the current stage should take into account the complex system of international economic relations, which functions in the conditions of globalization and brings new forms of economic development in small open ec...
International Nuclear Information System (INIS)
Tokuyama, M.; Stanley, H.E.
2000-01-01
The main purpose of the Tohwa University International Conference on Statistical Physics is to provide an opportunity for an international group of experimentalists, theoreticians, and computational scientists who are working on various fields of statistical physics to gather together and discuss their recent advances. The conference covered six topics: complex systems, general methods of statistical physics, biological physics, cross-disciplinary physics, information science, and econophysics
Understanding Statistics - Cancer Statistics
Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.
An Application of Multivariate Statistical Analysis for Query-Driven Visualization
Energy Technology Data Exchange (ETDEWEB)
Gosink, Luke J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Garth, Christoph [Univ. of California, Davis, CA (United States); Anderson, John C. [Univ. of California, Davis, CA (United States); Bethel, E. Wes [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Joy, Kenneth I. [Univ. of California, Davis, CA (United States)
2011-03-01
Driven by the ability to generate ever-larger, increasingly complex data, there is an urgent need in the scientific community for scalable analysis methods that can rapidly identify salient trends in scientific data. Query-Driven Visualization (QDV) strategies are among the small subset of techniques that can address both large and highly complex datasets. This paper extends the utility of QDV strategies with a statistics-based framework that integrates non-parametric distribution estimation techniques with a new segmentation strategy to visually identify statistically significant trends and features within the solution space of a query. In this framework, query distribution estimates help users to interactively explore their query's solution and visually identify the regions where the combined behavior of constrained variables is most important, statistically, to their inquiry. Our new segmentation strategy extends the distribution estimation analysis by visually conveying the individual importance of each variable to these regions of high statistical significance. We demonstrate the analysis benefits these two strategies provide and show how they may be used to facilitate the refinement of constraints over variables expressed in a user's query. We apply our method to datasets from two different scientific domains to demonstrate its broad applicability.
Directory of Open Access Journals (Sweden)
Balá Peter
2003-09-01
Full Text Available Mechanical activation belongs to innovative procedures which intensify technological processes by creating new surfaces and making a defective structure of the solid phase. Mechanical impact on the solid phase is a suitable procedure to ensure the mobility of its structure elements and to accumulate the mechanical energy that is later used in the processes of leaching. The aim of this study was to carry out the mechanical activation of a complex CuPbZn sulphide concentrate (Slovak deposit) in an attritor, using statistical methods for the design of factorial experiments, and to determine the conditions for preparing the optimum mechanically activated sample of the studied concentrate. The following parameters of the attritor were studied as variables: the weight of sample/steel balls (degree of mill filling), the number of revolutions of the milling shaft, and the time of mechanical activation. Interpretation of the chosen variables inducing the mechanical activation of the complex CuPbZn concentrate was also carried out using statistical methods of factorial design of experiments. The presented linear model (a 2^3 factorial experiment) does not directly support the search for an optimum, therefore this model was extended to a nonlinear model by use of a second-order orthogonal polynomial. This nonlinear model does not adequately describe the process of new surface formation by the mechanical activation of the studied concentrate. It would be necessary to extend the presented nonlinear model to a nonlinear model of third order or to choose another model. With regard to economy, in terms of minimal energy input consumption, the sample with the value of 524 kWh t^-1 and with the maximum value of specific surface area, 8.59 m^2 g^-1 (as a response of the factorial experiment), was chosen as the optimum mechanically activated sample of the studied concentrate. The optimum mechanically activated sample of the complex CuPbZn sulphide concentrate was prepared.
International Nuclear Information System (INIS)
Beck, W.
1984-01-01
The complexity of computer programs for the solution of scientific and technical problems raises many questions. Typical questions concern the strengths and weaknesses of computer programs, the propagation of uncertainties in the input data, the sensitivity of output data to input data, and the replacement of complex models by simpler ones that provide equivalent results within certain ranges. These questions are of general practical importance, and principled answers may be found by statistical methods based on the Monte Carlo method. In this report suitable statistical methods are selected, described and evaluated. They are implemented in the modular program system STAR, which is a component of the program system RSYST. The design of STAR takes into account users with different levels of knowledge of data processing and statistics; the variety of statistical methods and of generating and evaluating procedures; the processing of large data sets in complex structures; the coupling to other components of RSYST and to programs outside RSYST; and the requirement that the system can be easily modified and enlarged. Four examples are given which demonstrate the application of STAR. (orig.) [de]
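The core idea behind such sensitivity studies (sample the uncertain inputs, run them through the model, and measure how strongly each input drives the output) can be sketched in a few lines. The toy model, input distributions and the correlation-based sensitivity measure below are illustrative assumptions, not taken from STAR:

```python
import random
import statistics

def model(x, y):
    # toy stand-in for a complex simulation code
    return 3.0 * x + 0.5 * y * y

random.seed(1)
N = 10_000
xs = [random.gauss(1.0, 0.1) for _ in range(N)]  # uncertain input 1
ys = [random.gauss(2.0, 0.2) for _ in range(N)]  # uncertain input 2
outs = [model(x, y) for x, y in zip(xs, ys)]

mean_out = statistics.fmean(outs)  # propagated mean
sd_out = statistics.stdev(outs)    # propagated uncertainty

def corr(a, b):
    """Pearson correlation, used here as a crude sensitivity measure."""
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b)) / (len(a) - 1)
    return cov / (statistics.stdev(a) * statistics.stdev(b))

print(mean_out, sd_out, corr(xs, outs), corr(ys, outs))
```

In this toy setup the second input dominates the output spread, which the input-output correlations make visible without any knowledge of the model's internals.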
Lu, Qiongshi; Li, Boyang; Ou, Derek; Erlendsdottir, Margret; Powles, Ryan L; Jiang, Tony; Hu, Yiming; Chang, David; Jin, Chentian; Dai, Wei; He, Qidu; Liu, Zefeng; Mukherjee, Shubhabrata; Crane, Paul K; Zhao, Hongyu
2017-12-07
Despite the success of large-scale genome-wide association studies (GWASs) on complex traits, our understanding of their genetic architecture is far from complete. Jointly modeling multiple traits' genetic profiles has provided insights into the shared genetic basis of many complex traits. However, large-scale inference sets a high bar for both statistical power and biological interpretability. Here we introduce a principled framework to estimate annotation-stratified genetic covariance between traits using GWAS summary statistics. Through theoretical and numerical analyses, we demonstrate that our method provides accurate covariance estimates, thereby enabling researchers to dissect both the shared and distinct genetic architecture across traits to better understand their etiologies. Among 50 complex traits with publicly accessible GWAS summary statistics (N total ≈ 4.5 million), we identified more than 170 pairs with statistically significant genetic covariance. In particular, we found strong genetic covariance between late-onset Alzheimer disease (LOAD) and amyotrophic lateral sclerosis (ALS), two major neurodegenerative diseases, in single-nucleotide polymorphisms (SNPs) with high minor allele frequencies and in SNPs located in the predicted functional genome. Joint analysis of LOAD, ALS, and other traits highlights LOAD's correlation with cognitive traits and hints at an autoimmune component for ALS. Copyright © 2017 American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.
Statistical assessment of the learning curves of health technologies.
Ramsay, C R; Grant, A M; Wallace, S A; Garthwaite, P H; Monk, A F; Russell, I T
2001-01-01
(1) To describe systematically studies that directly assessed the learning curve effect of health technologies. (2) Systematically to identify 'novel' statistical techniques applied to learning curve data in other fields, such as psychology and manufacturing. (3) To test these statistical techniques in data sets from studies of varying designs to assess health technologies in which learning curve effects are known to exist. METHODS - STUDY SELECTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): For a study to be included, it had to include a formal analysis of the learning curve of a health technology using a graphical, tabular or statistical technique. METHODS - STUDY SELECTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): For a study to be included, it had to include a formal assessment of a learning curve using a statistical technique that had not been identified in the previous search. METHODS - DATA SOURCES: Six clinical and 16 non-clinical biomedical databases were searched. A limited amount of handsearching and scanning of reference lists was also undertaken. METHODS - DATA EXTRACTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): A number of study characteristics were abstracted from the papers such as study design, study size, number of operators and the statistical method used. METHODS - DATA EXTRACTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): The new statistical techniques identified were categorised into four subgroups of increasing complexity: exploratory data analysis; simple series data analysis; complex data structure analysis; generic techniques. METHODS - TESTING OF STATISTICAL METHODS: Some of the statistical methods identified in the systematic searches for single (simple) operator series data and for multiple (complex) operator series data were illustrated and explored using three data sets. The first was a case series of 190 consecutive laparoscopic fundoplication procedures performed by a single surgeon; the second
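One of the simplest statistical models for single-operator learning-curve data of the kind reviewed here is a power law, t = a · n^(−b), fitted by ordinary least squares on the log-log scale. The case series below is synthetic and the parameter values are invented for illustration:

```python
import math
import random

random.seed(0)
# synthetic case series: operative time falls off as experience accumulates
n_cases = 100
times = [60.0 * (i + 1) ** -0.2 * random.uniform(0.9, 1.1) for i in range(n_cases)]

# fit log(t) = log(a) - b * log(n) by ordinary least squares
xs = [math.log(i + 1) for i in range(n_cases)]
ys = [math.log(t) for t in times]
mx = sum(xs) / len(xs)
my = sum(ys) / len(ys)
b = -sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
a = math.exp(my + b * mx)
print(f"fitted learning curve: t = {a:.1f} * n^(-{b:.3f})")
```

The fitted exponent b summarizes the learning rate; more elaborate techniques reviewed in the report (hierarchical models for multiple operators, CUSUM-type monitoring) extend this same idea.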
Statistical physics of an anyon gas
International Nuclear Information System (INIS)
Dasnieres de Veigy, A.
1994-01-01
In quantum two-dimensional physics, anyons are particles which obey statistics intermediate between Bose-Einstein and Fermi-Dirac statistics. The wave amplitude can change by an arbitrary phase under particle exchanges. Contrary to bosons or fermions, the permutation group cannot uniquely characterize this phase and one must introduce the braid group. It is shown that the statistical ''interaction'' is equivalent to an Aharonov-Bohm interaction which derives from a Chern-Simons Lagrangian. The main subject of this thesis is the thermodynamics of an anyon gas. Since the complete spectrum of N anyons seems out of reach, we have performed a perturbative computation of the equation of state at second order near Bose or Fermi statistics. Ultraviolet divergences are avoided by noting that the short-range singularities of the statistical interaction force the wave functions to vanish when two particles approach each other (statistical exclusion). The gas is confined in a harmonic well in order to obtain the thermodynamic limit when the harmonic attraction goes to zero. Infrared divergences thus cancel in this limit and a finite virial expansion is obtained. The complexity of the anyon model appears in this result. We have also computed the equation of state of an anyon gas in a magnetic field strong enough to project the system onto its degenerate ground state. This result concerns anyons with any statistics. One then finds an exclusion principle generalizing the Pauli principle to anyons. On the other hand, we have defined a model of two-dimensional particles topologically interacting at a distance. The anyon model is recovered as a particular case where all particles are identical. (orig.)
Mathematical-statistical models and qualitative theories for economic and social sciences
Maturo, Fabrizio; Kacprzyk, Janusz
2017-01-01
This book presents a broad spectrum of problems related to statistics, mathematics, teaching, social science, and economics, as well as a range of tools and techniques that can be used to solve these problems. It is the result of a scientific collaboration between experts in the field of economic and social systems from the University of Defence in Brno (Czech Republic), G. d’Annunzio University of Chieti-Pescara (Italy), Pablo de Olavide University of Sevilla (Spain), and Ovidius University in Constanţa (Romania). The studies included were selected using a peer-review process and reflect the heterogeneity and complexity of economic and social phenomena. They present interesting empirical research from around the globe and from several research fields, such as statistics, decision making, mathematics, complexity, psychology, sociology and economics. The volume is divided into two parts. The first part, “Recent trends in mathematical and statistical models for economic and social sciences”, collects pap...
Webb, S. J.; Ashwal, L. D.; Cooper, G. R.
2007-12-01
Susceptibility (n=~110,000) and density (n=~2500) measurements on core samples have been collected in a stratigraphic context from the Bellevue (BV-1) 2950 m deep borehole in the Northern Lobe of the Bushveld Complex. This drill core starts in the granitoid roof rocks, extends through the entire Upper Zone, and ends approximately in the middle of the Main Zone. These physical property measurements now provide an extensive database useful for geophysical modeling and stratigraphic studies. In an effort to quantify the periodicity of the layering we have applied various statistical and wavelet methods to analyze the susceptibility and density data. The density data have revealed a strong periodic layering with a scale of ~80 m that extends through the Main and Upper Zones. In the Main Zone the layering is unusual in that the density values increase upwards by as much as 10%. This is due to systematic variation in the modal abundance of mafic silicates and appears to be related to separate pulses during emplacement. The magnetic susceptibility data in the Upper Zone also show a strong cyclicity of similar scale. The discrete wavelet transform, using the real Haar wavelet, has been applied to help discretise the susceptibility data and clarifies the geological boundaries without blurring them, which is a common problem with multipoint moving averages. As expected, the histogram of the entire data set is non-Gaussian, with a long tail for high values. We can roughly fit a power law to the log histogram plot indicating a probable fractal distribution of susceptibilities. However if we window the data in the range 750-1000 m the histogram is very different. This region shows a strong peak and no power law relationship. This dramatic change in statistical properties prompted us to investigate these properties more thoroughly. To complement the wavelet analysis we have calculated various statistical measures (mean, standard deviation, skew, and
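The Haar transform mentioned above is simple to state: each level averages and differences adjacent pairs of samples, so a sharp boundary appears as a single large detail coefficient rather than being smeared, as a moving average would smear it. A one-level sketch on synthetic step data standing in for a lithological boundary:

```python
import math

def haar_dwt(signal):
    """One level of the discrete Haar wavelet transform.
    Returns (approximation, detail) coefficient lists."""
    s = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal) - 1, 2)]
    return approx, detail

# a sharp step (standing in for a geological boundary) placed inside one sample pair
data = [1.0] * 7 + [5.0] * 9
approx, detail = haar_dwt(data)
print(detail)  # only the pair straddling the boundary carries a nonzero coefficient
```

Repeating the transform on the approximation coefficients gives the multilevel decomposition used to study layering at different scales.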
Generalized Statistical Mechanics at the Onset of Chaos
Directory of Open Access Journals (Sweden)
Alberto Robledo
2013-11-01
Full Text Available Transitions to chaos in archetypal low-dimensional nonlinear maps offer real and precise model systems in which to assess proposed generalizations of statistical mechanics. The known association of chaotic dynamics with the structure of Boltzmann–Gibbs (BG) statistical mechanics has suggested the potential verification of these generalizations at the onset of chaos, when the only Lyapunov exponent vanishes and ergodic and mixing properties cease to hold. There are three well-known routes to chaos in these deterministic dissipative systems, period-doubling, quasi-periodicity and intermittency, which provide the setting in which to explore the limit of validity of the standard BG structure. It has been shown that there is a rich and intricate behavior for both the dynamics within and towards the attractors at the onset of chaos and that these two kinds of properties are linked via generalized statistical-mechanical expressions. Amongst the topics presented are: (i) permanently growing sensitivity fluctuations and their infinite family of generalized Pesin identities; (ii) the emergence of statistical-mechanical structures in the dynamics along the routes to chaos; (iii) dynamical hierarchies with modular organization; and (iv) limit distributions of sums of deterministic variables. The occurrence of generalized entropy properties in condensed-matter physical systems is illustrated by considering critical fluctuations, localization transition and glass formation. We complete our presentation with the description of the manifestations of the dynamics at the transitions to chaos in various kinds of complex systems, such as frequency and size rank distributions and complex network images of time series. We discuss the results.
Multivariate statistical methods and data mining in particle physics (4/4)
CERN. Geneva
2008-01-01
The lectures will cover multivariate statistical methods and their applications in High Energy Physics. The methods will be viewed in the framework of a statistical test, as used e.g. to discriminate between signal and background events. Topics will include an introduction to the relevant statistical formalism, linear test variables, neural networks, probability density estimation (PDE) methods, kernel-based PDE, decision trees and support vector machines. The methods will be evaluated with respect to criteria relevant to HEP analyses such as statistical power, ease of computation and sensitivity to systematic effects. Simple computer examples that can be extended to more complex analyses will be presented.
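Among the linear test variables covered in such lectures, the Fisher discriminant is the classic example: project each event onto the direction that best separates the signal and background means, then cut on the projected value. A self-contained sketch on synthetic Gaussian "events" (the class distributions and the cut placement are invented for illustration):

```python
import random

random.seed(42)
# synthetic 2-D "events": signal and background as displaced Gaussian clouds
signal = [(random.gauss(1.0, 1.0), random.gauss(1.0, 1.0)) for _ in range(2000)]
background = [(random.gauss(-1.0, 1.0), random.gauss(-1.0, 1.0)) for _ in range(2000)]

def mean(points):
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

mu_s, mu_b = mean(signal), mean(background)
# for equal, roughly isotropic covariances the Fisher direction reduces to mu_s - mu_b
w = (mu_s[0] - mu_b[0], mu_s[1] - mu_b[1])

def t(p):
    """Linear test statistic: projection onto the Fisher direction."""
    return w[0] * p[0] + w[1] * p[1]

cut = 0.5 * (t(mu_s) + t(mu_b))  # place the cut midway between the class means
eff_sig = sum(t(p) > cut for p in signal) / len(signal)
rej_bkg = sum(t(p) <= cut for p in background) / len(background)
print(eff_sig, rej_bkg)
```

In a full analysis the direction would use the inverse of the summed class covariance matrices; neural networks and decision trees generalize the same signal/background test to nonlinear boundaries.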
Multivariate statistical methods and data mining in particle physics (2/4)
CERN. Geneva
2008-01-01
The lectures will cover multivariate statistical methods and their applications in High Energy Physics. The methods will be viewed in the framework of a statistical test, as used e.g. to discriminate between signal and background events. Topics will include an introduction to the relevant statistical formalism, linear test variables, neural networks, probability density estimation (PDE) methods, kernel-based PDE, decision trees and support vector machines. The methods will be evaluated with respect to criteria relevant to HEP analyses such as statistical power, ease of computation and sensitivity to systematic effects. Simple computer examples that can be extended to more complex analyses will be presented.
Multivariate statistical methods and data mining in particle physics (1/4)
CERN. Geneva
2008-01-01
The lectures will cover multivariate statistical methods and their applications in High Energy Physics. The methods will be viewed in the framework of a statistical test, as used e.g. to discriminate between signal and background events. Topics will include an introduction to the relevant statistical formalism, linear test variables, neural networks, probability density estimation (PDE) methods, kernel-based PDE, decision trees and support vector machines. The methods will be evaluated with respect to criteria relevant to HEP analyses such as statistical power, ease of computation and sensitivity to systematic effects. Simple computer examples that can be extended to more complex analyses will be presented.
Statistical analysis of next generation sequencing data
Nettleton, Dan
2014-01-01
Next Generation Sequencing (NGS) is the latest high throughput technology to revolutionize genomic research. NGS generates massive genomic datasets that play a key role in the big data phenomenon that surrounds us today. To extract signals from high-dimensional NGS data and make valid statistical inferences and predictions, novel data analytic and statistical techniques are needed. This book contains 20 chapters written by prominent statisticians working with NGS data. The topics range from basic preprocessing and analysis with NGS data to more complex genomic applications such as copy number variation and isoform expression detection. Research statisticians who want to learn about this growing and exciting area will find this book useful. In addition, many chapters from this book could be included in graduate-level classes in statistical bioinformatics for training future biostatisticians who will be expected to deal with genomic data in basic biomedical research, genomic clinical trials and personalized med...
HistFitter software framework for statistical data analysis
Baak, M.; Côte, D.; Koutsman, A.; Lorenz, J.; Short, D.
2015-01-01
We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fitted to data and interpreted with statistical tests. A key innovation of HistFitter is its design, which is rooted in core analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its very fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with mu...
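The underlying statistical operation, fitting a signal-strength parameter to observed counts by maximizing a Poisson likelihood, can be illustrated without HistFitter itself. The single-bin toy below (the counts, signal and background expectations are invented, and a grid scan stands in for a real minimizer) is not HistFitter's API:

```python
import math

def nll(mu, n_obs, s, b):
    """Poisson negative log-likelihood (up to a constant) for signal strength mu."""
    lam = mu * s + b
    return lam - n_obs * math.log(lam)

# toy single-bin "signal region": expect s = 10 signal and b = 50 background events
n_obs, s, b = 65, 10.0, 50.0

# grid scan over the signal strength, standing in for a real minimizer
grid = [i / 100.0 for i in range(501)]
mu_hat = min(grid, key=lambda mu: nll(mu, n_obs, s, b))
print(mu_hat)  # the analytic maximum-likelihood estimate is (n_obs - b) / s = 1.5
```

Frameworks like HistFitter build the same kind of likelihood simultaneously over control, signal and validation regions, with nuisance parameters for systematic uncertainties.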
Using Carbon Emissions Data to "Heat Up" Descriptive Statistics
Brooks, Robert
2012-01-01
This article illustrates using carbon emissions data in an introductory statistics assignment. The carbon emissions data has desirable characteristics including: choice of measure; skewness; and outliers. These complexities allow research and public policy debate to be introduced. (Contains 4 figures and 2 tables.)
Statistical uncertainties and unrecognized relationships
International Nuclear Information System (INIS)
Rankin, J.P.
1985-01-01
Hidden relationships in specific designs directly contribute to inaccuracies in reliability assessments. Uncertainty factors at the system level may sometimes be applied in attempts to compensate for the impact of such unrecognized relationships. Often uncertainty bands are used to relegate unknowns to a miscellaneous category of low-probability occurrences. However, experience and modern analytical methods indicate that perhaps the dominant, most probable and significant events are sometimes overlooked in statistical reliability assurances. The author discusses the utility of two unique methods of identifying the otherwise often unforeseeable system interdependencies for statistical evaluations. These methods are sneak circuit analysis and a checklist form of common cause failure analysis. Unless these techniques (or a suitable equivalent) are also employed along with the more widely-known assurance tools, high reliability of complex systems may not be adequately assured. This concern is indicated by specific illustrations. 8 references, 5 figures
International Nuclear Information System (INIS)
Androsenko, A.A.; Androsenko, P.A.
1983-01-01
A description is given of the structure, input procedure and recording rules of initial data for the BRAND programme complex intended for the Monte Carlo simulation of neutron physics experiments. The BRAND complex ideology is based on non-analogue simulation of the neutron and photon transport process (statistical weights are used, absorption and escape of particles from the considered region are taken into account, shifted readouts from the coordinate part of the transition kernel density are applied, local estimations are used, etc.). The preparation of initial data is described in detail for three sections: general information for the Monte Carlo calculation, source definition, and data describing the geometry of the system. The complex runs on the BESM-6 computer; the basic programming language is FORTRAN, and the code comprises more than 8000 statements.
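The non-analogue idea mentioned here, carrying a statistical weight instead of killing a particle at absorption ("implicit capture"), can be shown in a toy forward-only slab-penetration game. The cross-sections, geometry and weight cutoff below are invented for the sketch and have nothing to do with the actual BRAND input format:

```python
import math
import random

random.seed(7)
SIGMA_T = 1.0   # total macroscopic cross-section (1/cm), invented
SIGMA_A = 0.3   # absorption part
THICK = 5.0     # slab thickness (cm)
N = 20_000

def analog():
    """Analogue game: a particle is killed outright when it is absorbed."""
    leaked = 0
    for _ in range(N):
        x = 0.0
        while True:
            x += -math.log(random.random()) / SIGMA_T  # free flight (forward-only toy)
            if x >= THICK:
                leaked += 1
                break
            if random.random() < SIGMA_A / SIGMA_T:
                break  # absorbed
    return leaked / N

def implicit_capture():
    """Non-analogue game: absorption only reduces the statistical weight."""
    total = 0.0
    for _ in range(N):
        x, w = 0.0, 1.0
        while True:
            x += -math.log(random.random()) / SIGMA_T
            if x >= THICK:
                total += w
                break
            w *= 1.0 - SIGMA_A / SIGMA_T  # survive the collision with reduced weight
            if w < 1e-3:
                break  # crude weight cutoff (a real code would play Russian roulette)
    return total / N

p_analog, p_weighted = analog(), implicit_capture()
print(p_analog, p_weighted)  # both estimate exp(-SIGMA_A * THICK) ≈ 0.223
```

Both games are unbiased estimators of the same leakage probability in this toy, but the weighted game scores a contribution from every history, which typically lowers the variance.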
Managing Macroeconomic Risks by Using Statistical Simulation
Directory of Open Access Journals (Sweden)
Merkaš Zvonko
2017-06-01
Full Text Available The paper analyzes the possibilities of using statistical simulation in the measurement of macroeconomic risks. At the level of the whole world, macroeconomic risks have increased significantly due to excessive imbalances. Using analytical statistical methods and Monte Carlo simulation, the authors interpret the collected data sets and compare and analyze them in order to mitigate potential risks. The empirical part of the study is a qualitative case study that uses statistical methods and Monte Carlo simulation for managing macroeconomic risks, which is the central theme of this work. The application of statistical simulation is necessary because the system for which the model must be specified is too complex for an analytical approach. The objective of the paper is to point out the need to consider significant macroeconomic risks, particularly in terms of the number of the unemployed in the society, the movement of gross domestic product and the country’s credit rating, and to use data previously processed by statistical methods, through statistical simulation, to analyze the existing model of managing macroeconomic risks and to suggest elements for the development of a management model that will keep the probability and consequences of emerging macroeconomic risks as low as possible. The stochastic characteristics of the system, defined by random variables as input values with given probability distributions, require the performance of a large number of iterations over which the output of the model is recorded and the mathematical expectations are calculated. The paper expounds the basic procedures and techniques of discrete statistical simulation applied to systems that can be characterized by a number of events which represent a set of circumstances that have caused a change in the system’s state and the possibility of its application in the field of assessment of macroeconomic risks. The method has no
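A minimal Monte Carlo risk assessment in the spirit described, drawing random macroeconomic shocks, iterating the model many times, and reading off mathematical expectations and tail probabilities, might look as follows. The growth parameters and the five-year horizon are invented for illustration:

```python
import random
import statistics

random.seed(123)

def simulate_gdp_path(years=5, growth_mean=0.02, growth_sd=0.03):
    """One simulated GDP trajectory under normally distributed annual growth shocks."""
    gdp = 100.0  # index: today's GDP = 100
    for _ in range(years):
        gdp *= 1.0 + random.gauss(growth_mean, growth_sd)
    return gdp

N = 10_000
finals = [simulate_gdp_path() for _ in range(N)]
expected_gdp = statistics.fmean(finals)                 # mathematical expectation
p_decline = sum(f < 100.0 for f in finals) / N          # risk: GDP below today in 5 years
print(expected_gdp, p_decline)
```

The same loop structure extends to correlated shocks for unemployment or credit-rating transitions; the number of iterations controls the Monte Carlo error of the estimated probabilities.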
Nonextensive statistical mechanics and high energy physics
Directory of Open Access Journals (Sweden)
Tsallis Constantino
2014-04-01
Full Text Available The use of the celebrated Boltzmann-Gibbs entropy and statistical mechanics is justified for ergodic-like systems. In contrast, complex systems typically require more powerful theories. We will provide a brief introduction to nonadditive entropies (characterized by indices like q which, in the q → 1 limit, recover the standard Boltzmann-Gibbs entropy) and the associated nonextensive statistical mechanics. We then present some recent applications to systems such as high-energy collisions, black holes and others. In addition to that, we clarify and illustrate the neat distinction that exists between Lévy distributions and q-exponential ones, a point which occasionally causes some confusion in the literature, very particularly in the LHC literature.
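For reference, the nonadditive entropy and the q-exponential mentioned here are commonly written as (standard definitions, with k a Boltzmann-like constant):

```latex
S_q = k\,\frac{1 - \sum_i p_i^{\,q}}{q - 1},
\qquad
\lim_{q \to 1} S_q = -k \sum_i p_i \ln p_i \quad \text{(Boltzmann--Gibbs)},
```
```latex
e_q^{x} \equiv \bigl[1 + (1-q)\,x\bigr]^{1/(1-q)},
\qquad
\lim_{q \to 1} e_q^{x} = e^{x}.
```

For q > 1 the q-exponential decays with a power-law tail, which is why it is sometimes mistaken for a Lévy distribution; the two families are nonetheless distinct, as the abstract emphasizes.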
A Statistical Evaluation of Atmosphere-Ocean General Circulation Models: Complexity vs. Simplicity
Robert K. Kaufmann; David I. Stern
2004-01-01
The principal tools used to model future climate change are General Circulation Models which are deterministic high resolution bottom-up models of the global atmosphere-ocean system that require large amounts of supercomputer time to generate results. But are these models a cost-effective way of predicting future climate change at the global level? In this paper we use modern econometric techniques to evaluate the statistical adequacy of three general circulation models (GCMs) by testing thre...
National Statistical Commission and Indian Official Statistics*
Indian Academy of Sciences (India)
IAS Admin
a good collection of official statistics of that time. With more .... statistical agencies and institutions to provide details of statistical activities .... ing several training programmes. .... ful completion of Indian Statistical Service examinations, the.
Right-sizing statistical models for longitudinal data.
Wood, Phillip K; Steinley, Douglas; Jackson, Kristina M
2015-12-01
Arguments are proposed that researchers using longitudinal data should consider more and less complex statistical model alternatives to their initially chosen techniques in an effort to "right-size" the model to the data at hand. Such model comparisons may alert researchers who use poorly fitting, overly parsimonious models to more complex, better-fitting alternatives and, alternatively, may identify more parsimonious alternatives to overly complex (and perhaps empirically underidentified and/or less powerful) statistical models. A general framework is proposed for considering (often nested) relationships between a variety of psychometric and growth curve models. A 3-step approach is proposed in which models are evaluated based on the number and patterning of variance components prior to selection of better-fitting growth models that explain both mean and variation-covariation patterns. The orthogonal free curve slope intercept (FCSI) growth model is considered a general model that includes, as special cases, many models, including the factor mean (FM) model (McArdle & Epstein, 1987), McDonald's (1967) linearly constrained factor model, hierarchical linear models (HLMs), repeated-measures multivariate analysis of variance (MANOVA), and the linear slope intercept (linearSI) growth model. The FCSI model, in turn, is nested within the Tuckerized factor model. The approach is illustrated by comparing alternative models in a longitudinal study of children's vocabulary and by comparing several candidate parametric growth and chronometric models in a Monte Carlo study. (c) 2015 APA, all rights reserved.
Official Statistics and Statistics Education: Bridging the Gap
Directory of Open Access Journals (Sweden)
Gal Iddo
2017-03-01
Full Text Available This article aims to challenge official statistics providers and statistics educators to ponder on how to help non-specialist adult users of statistics develop those aspects of statistical literacy that pertain to official statistics. We first document the gap in the literature in terms of the conceptual basis and educational materials needed for such an undertaking. We then review skills and competencies that may help adults to make sense of statistical information in areas of importance to society. Based on this review, we identify six elements related to official statistics about which non-specialist adult users should possess knowledge in order to be considered literate in official statistics: (1) the system of official statistics and its work principles; (2) the nature of statistics about society; (3) indicators; (4) statistical techniques and big ideas; (5) research methods and data sources; and (6) awareness and skills for citizens’ access to statistical reports. Based on this ad hoc typology, we discuss directions that official statistics providers, in cooperation with statistics educators, could take in order to (1) advance the conceptualization of skills needed to understand official statistics, and (2) expand educational activities and services, specifically by developing a collaborative digital textbook and a modular online course, to improve public capacity for understanding of official statistics.
Statistics of resonances in a one-dimensional chain: a weak disorder limit
International Nuclear Information System (INIS)
Vinayak
2012-01-01
We study statistics of resonances in a one-dimensional disordered chain coupled to an outer world simulated by a perfect lead. We consider a limiting case for weak disorder and derive some results which are new in these studies. The main focus of this study is to describe the statistics of the scattered complex energies. We derive compact analytic statistical results for long chains. A comparison of these results has been found to be in good agreement with numerical simulations. (paper)
Formica, S; Roach, T I; Blackwell, J M
1994-05-01
The murine resistance gene Lsh/Ity/Bcg regulates activation of macrophages for tumour necrosis factor-alpha (TNF-alpha)-dependent production of nitric oxide mediating antimicrobial activity against Leishmania, Salmonella and Mycobacterium. As Lsh is differentially expressed in macrophages from different tissue sites, experiments were performed to determine whether interaction with extracellular matrix (ECM) proteins would influence the macrophage TNF-alpha response. Plating of bone marrow-derived macrophages onto purified fibrinogen or fibronectin-rich L929 cell-derived matrices, but not onto mannan, was itself sufficient to stimulate TNF-alpha release, with significantly higher levels released from congenic B10.L-Lshr compared to C57BL/10ScSn (Lshs) macrophages. Only macrophages plated onto fibrinogen also released measurable levels of nitrites, again higher in Lshr compared to Lshs macrophages. Addition of interferon-gamma (IFN-gamma), but not bacterial lipopolysaccharide or mycobacterial lipoarabinomannan, as a second signal enhanced the TNF-alpha and nitrite responses of macrophages plated onto fibrinogen, particularly in the Lshr macrophages. Interaction with fibrinogen and fibronectin also primed macrophages for an enhanced TNF-alpha response to leishmanial parasites, but this was only translated into enhanced nitrite responses in the presence of IFN-gamma. In these experiments, Lshr macrophages remained superior in their TNF-alpha responses throughout, but to a degree which reflected the magnitude of the difference observed on ECM alone. Hence, the specificity for the enhanced TNF-alpha responses of Lshr macrophages lay in their interaction with fibrinogen and fibronectin ECM, while a differential nitrite response was only observed with fibrinogen and/or IFN-gamma. The results are discussed in relation to the possible function of the recently cloned candidate gene Nramp, which has structural identity to eukaryote transporters and an N-terminal cytoplasmic
Theory of overdispersion in counting statistics caused by fluctuating probabilities
International Nuclear Information System (INIS)
Semkow, Thomas M.
1999-01-01
It is shown that the random Lexis fluctuations of probabilities such as probability of decay or detection cause the counting statistics to be overdispersed with respect to the classical binomial, Poisson, or Gaussian distributions. The generating and the distribution functions for the overdispersed counting statistics are derived. Applications to radioactive decay with detection and more complex experiments are given, as well as distinguishing between the source and background, in the presence of overdispersion. Monte-Carlo verifications are provided
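The mechanism is easy to demonstrate numerically: counts whose underlying rate is itself random show a variance-to-mean (Fano) ratio above 1, whereas a plain Poisson process gives a ratio of 1. A quick sketch (the rate distribution is invented for illustration; the Poisson sampler is Knuth's classic method, adequate for small means):

```python
import math
import random
import statistics

random.seed(5)

def poisson(lam):
    """Knuth's multiplication-of-uniforms Poisson sampler (fine for small lam)."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p > L:
        k += 1
        p *= random.random()
    return k - 1

N = 20_000
fixed = [poisson(10.0) for _ in range(N)]                        # constant rate
fluct = [poisson(random.uniform(5.0, 15.0)) for _ in range(N)]   # Lexis-type fluctuating rate

fano_fixed = statistics.pvariance(fixed) / statistics.fmean(fixed)
fano_fluct = statistics.pvariance(fluct) / statistics.fmean(fluct)
print(fano_fixed, fano_fluct)  # ~1 for plain Poisson, >1 when the rate fluctuates
```

For a mixed Poisson the excess is exactly Var(λ)/E(λ), which is the overdispersion the abstract derives analytically.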
Using Mobile Apps to Entice General Education Students into Technology Fields
Liu, Michelle; Murphy, Diane
2013-01-01
It is of national importance to increase the number of college students pursuing degrees in information systems/information technology (IT/IS) subjects. The primary focus at many institutions is renovating or enhancing existing IT/IS programs and the target audience is the students who have selected to major in IT/IS subjects. This paper looks at…
Research on Statistical Flow of the Complex Background Based on Image Method
Directory of Open Access Journals (Sweden)
Yang Huanhai
2014-06-01
Full Text Available As urbanization in our country continues to accelerate, the pressure on city road traffic systems keeps increasing. The importance of intelligent transportation systems based on computer vision technology is therefore becoming ever more significant, and using image processing technology for vehicle detection has become a hot topic in this research field. Only vehicles that are accurately segmented from the background can be recognized and tracked. Video-based vehicle detection combined with image processing can identify the number, types and motion characteristics of the many vehicles in the same scene, and can thus provide a real-time basis for intelligent traffic control. This paper first introduces the concept of intelligent transportation systems and the importance of image processing technology in vehicle recognition and traffic statistics, gives an overview of video vehicle detection methods, compares video detection with other detection technologies, and points out the advantages of video detection. Finally, we design a real-time and reliable background subtraction method and a vehicle recognition method based on region area and information fusion, implemented with the MATLAB/GUI development tool on the Windows operating system platform. In this paper, the algorithm is applied to frames of traffic flow images. The experimental results show that the algorithm achieves very good results in vehicle recognition and traffic flow statistics.
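A stripped-down version of the background-subtraction step described above, reduced to a 1-D row of pixels with an exponential running-average background model (all pixel values, the learning rate and the threshold are invented for the sketch, and the paper's own implementation is in MATLAB rather than Python):

```python
def update_background(bg, frame, alpha=0.05):
    """Exponential running-average background model, per pixel."""
    return [(1 - alpha) * b + alpha * f for b, f in zip(bg, frame)]

def foreground_mask(bg, frame, thresh=20.0):
    """Flag pixels that differ strongly from the background estimate."""
    return [abs(f - b) > thresh for b, f in zip(bg, frame)]

# toy 1-D "frames": a bright vehicle (pixel value 200) moves across a static road (value 50)
W = 10
frames = []
for t in range(8):
    frame = [50.0] * W
    frame[t] = 200.0  # the moving object
    frames.append(frame)

bg = [50.0] * W  # start from a clean road image
for frame in frames:
    mask = foreground_mask(bg, frame)   # segment foreground before updating
    bg = update_background(bg, frame)
print(mask)  # for the last frame, only the vehicle pixel is flagged
```

Because the background adapts slowly, a pixel the vehicle occupied only briefly relaxes back toward the road value, so each new frame flags only the current vehicle position; counting connected foreground regions then yields the flow statistics.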
Statistics of natural binaural sounds.
Directory of Open Access Journals (Sweden)
Wiktor Młynarski
Full Text Available Binaural sound localization is usually considered a discrimination task, where interaural phase (IPD) and level (ILD) disparities at narrowly tuned frequency channels are utilized to identify the position of a sound source. In natural conditions, however, binaural circuits are exposed to stimulation by sound waves originating from multiple, often moving and overlapping sources. Therefore the statistics of binaural cues depend on the acoustic properties and spatial configuration of the environment. The distributions of cues encountered naturally, and their dependence on physical properties of an auditory scene, have not been studied before. In the present work we analyzed statistics of naturally encountered binaural sounds. We performed binaural recordings of three auditory scenes with varying spatial configuration and analyzed empirical cue distributions from each scene. We found that certain properties, such as the spread of IPD distributions and the overall shape of ILD distributions, do not vary strongly between different auditory scenes. Moreover, we found that ILD distributions vary much more weakly across frequency channels, and IPDs often attain much higher values, than can be predicted from head filtering properties. To understand the complexity of the binaural hearing task in the natural environment, sound waveforms were analyzed by performing Independent Component Analysis (ICA). Properties of the learned basis functions indicate that in natural conditions sound waves in each ear are predominantly generated by independent sources. This implies that real-world sound localization must rely on mechanisms more complex than mere cue extraction.
Statistics of natural binaural sounds.
Młynarski, Wiktor; Jost, Jürgen
2014-01-01
Binaural sound localization is usually considered a discrimination task, where interaural phase (IPD) and level (ILD) disparities at narrowly tuned frequency channels are utilized to identify the position of a sound source. In natural conditions, however, binaural circuits are exposed to stimulation by sound waves originating from multiple, often moving and overlapping sources. Therefore the statistics of binaural cues depend on the acoustic properties and spatial configuration of the environment. The distributions of cues encountered naturally, and their dependence on physical properties of an auditory scene, have not been studied before. In the present work we analyzed statistics of naturally encountered binaural sounds. We performed binaural recordings of three auditory scenes with varying spatial configuration and analyzed empirical cue distributions from each scene. We found that certain properties, such as the spread of IPD distributions and the overall shape of ILD distributions, do not vary strongly between different auditory scenes. Moreover, we found that ILD distributions vary much more weakly across frequency channels, and IPDs often attain much higher values, than can be predicted from head filtering properties. To understand the complexity of the binaural hearing task in the natural environment, sound waveforms were analyzed by performing Independent Component Analysis (ICA). Properties of the learned basis functions indicate that in natural conditions sound waves in each ear are predominantly generated by independent sources. This implies that real-world sound localization must rely on mechanisms more complex than mere cue extraction.
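The two cues the abstract analyzes, ILD and IPD, are easy to define concretely. The sketch below (illustrative only, with invented signal parameters) computes both for a pure tone that reaches the two ears with different amplitude and delay:

```python
import math

# Illustrative computation of the two classical binaural cues for a
# 500 Hz tone: interaural level difference (ILD, in dB) and interaural
# phase difference (IPD, in radians). All parameters are made up.

def rms(x):
    return math.sqrt(sum(v * v for v in x) / len(x))

fs = 8000.0          # sample rate (Hz)
f = 500.0            # tone frequency (Hz)
delay = 0.0005       # right-ear delay in seconds (0.5 ms)
n = 1600             # number of samples (an integer number of periods)

left = [1.0 * math.sin(2 * math.pi * f * t / fs) for t in range(n)]
right = [0.5 * math.sin(2 * math.pi * f * (t / fs - delay)) for t in range(n)]

ild_db = 20 * math.log10(rms(left) / rms(right))   # level difference in dB
ipd_rad = 2 * math.pi * f * delay                  # phase difference in radians
```

For the amplitudes and delay chosen here, the ILD is about 6 dB and the IPD is π/2; the paper's point is that empirical distributions of such cues in real scenes are broader than single-source geometry predicts.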
Simulations with complex measure
International Nuclear Information System (INIS)
Markham, J.K.; Kieu, T.D.
1997-01-01
A method is proposed to handle the sign problem in the simulation of systems having indefinite or complex-valued measures. In general, this new approach, which is based on renormalisation blocking, is shown to yield statistical errors smaller than the crude Monte Carlo method using absolute values of the original measures. The improved method is applied to the 2D Ising model with the temperature generalised to take on complex values. It is also adapted to implement Monte Carlo Renormalisation Group calculations of the magnetic and thermal critical exponents. 10 refs., 4 tabs., 7 figs.
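The "crude Monte Carlo" baseline the abstract compares against can be sketched directly: sample with the absolute value of the complex measure and correct with the phase, so that ⟨O⟩ = ⟨O·phase⟩_|w| / ⟨phase⟩_|w|. This is the reweighting baseline, not the paper's renormalisation-blocking method; the four-state toy system is invented.

```python
import random

# Crude reweighting for a complex-valued measure: sample states with
# probability proportional to |w|, correct expectations with the phase.
# Toy system with four states and invented complex weights.

random.seed(0)
states = [0, 1, 2, 3]
w = [1.0 + 0.2j, 0.8 - 0.1j, 0.5 + 0.4j, 0.3 + 0.0j]   # complex weights
O = [0.0, 1.0, 2.0, 3.0]                               # observable values

# Exact expectation under the complex measure, for comparison.
exact = sum(Oi * wi for Oi, wi in zip(O, w)) / sum(w)

absw = [abs(wi) for wi in w]
num, den = 0.0 + 0.0j, 0.0 + 0.0j
for _ in range(200000):
    s = random.choices(states, weights=absw)[0]
    phase = w[s] / abs(w[s])        # the "sign" being averaged
    num += O[s] * phase
    den += phase
estimate = num / den                # <O*phase> / <phase>
```

When phases nearly cancel, the denominator shrinks toward zero and the statistical error explodes; that is the sign problem the blocking method is designed to tame.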
Change detection in full and dual polarization SAR data and the complex Wishart distribution
DEFF Research Database (Denmark)
Nielsen, Allan Aasbjerg; Conradsen, Knut; Skriver, Henning
A test statistic for equality of two complex variance-covariance matrices following the complex Wishart distribution, with an associated probability of observing a smaller value of the test statistic, is sketched. We demonstrate the use of the test statistic and the associated probability measure for change detection in both full and dual polarimetry synthetic aperture radar (SAR) data collected by the Danish EMISAR system.
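For equal numbers of looks n, the likelihood-ratio statistic for equality of two p×p complex Wishart matrices has a closed determinant form, ln Q = n(2p·ln 2 + ln|X| + ln|Y| − 2·ln|X+Y|). The sketch below is a hedged reconstruction in the style of this line of work (not the authors' code), for the dual-polarimetry case p = 2 with invented sample matrices:

```python
import math

# ln Q for equality of two 2x2 complex Wishart matrices, equal looks n:
#   ln Q = n * (2p*ln2 + ln|X| + ln|Y| - 2*ln|X+Y|),  Q in (0, 1].

def det2(m):
    """Determinant of a 2x2 complex matrix."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def add2(a, b):
    return [[a[i][j] + b[i][j] for j in range(2)] for i in range(2)]

def lnQ(X, Y, n, p=2):
    return n * (2 * p * math.log(2)
                + math.log(abs(det2(X)))
                + math.log(abs(det2(Y)))
                - 2 * math.log(abs(det2(add2(X, Y)))))

# Invented Hermitian, positive-definite sample covariance sums.
X = [[2.0, 0.5 + 0.1j], [0.5 - 0.1j, 1.0]]
Y = [[3.0, 0.2 - 0.3j], [0.2 + 0.3j, 1.5]]
n = 13

same = lnQ(X, X, n)   # identical matrices: ln Q = 0 (no change)
diff = lnQ(X, Y, n)   # differing matrices: ln Q < 0 (possible change)
```

A change alarm is raised when −2·ln Q exceeds a threshold from its asymptotic chi-squared distribution.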
Energy Technology Data Exchange (ETDEWEB)
Radunovic, J [Institute of nuclear sciences Boris Kidric, Vinca, Beograd (Yugoslavia)
1973-07-01
This paper deals with the application of a statistical method to the analysis of nuclear reactions involving complex nuclei. It is shown that inelastic neutron scattering, which occurs through the creation of a complex nucleus in the higher energy range, can be treated by a statistical approach.
Statistical Physics in the Era of Big Data
Wang, Dashun
2013-01-01
With the wealth of data provided by a wide range of high-throughput measurement tools and technologies, statistical physics of complex systems is entering a new phase, impacting in a meaningful fashion a wide range of fields, from cell biology to computer science to economics. In this dissertation, by applying tools and techniques developed in…
Phonetic diversity, statistical learning, and acquisition of phonology.
Pierrehumbert, Janet B
2003-01-01
In learning to perceive and produce speech, children master complex language-specific patterns. Daunting language-specific variation is found both in the segmental domain and in the domain of prosody and intonation. This article reviews the challenges posed by results in phonetic typology and sociolinguistics for the theory of language acquisition. It argues that categories are initiated bottom-up from statistical modes in use of the phonetic space, and sketches how exemplar theory can be used to model the updating of categories once they are initiated. It also argues that bottom-up initiation of categories is successful thanks to the perception-production loop operating in the speech community. The behavior of this loop means that the superficial statistical properties of speech available to the infant indirectly reflect the contrastiveness and discriminability of categories in the adult grammar. The article also argues that the developing system is refined using internal feedback from type statistics over the lexicon, once the lexicon is well-developed. The application of type statistics to a system initiated with surface statistics does not cause a fundamental reorganization of the system. Instead, it exploits confluences across levels of representation which characterize human language and make bootstrapping possible.
An 'electronic' extramural course in epidemiology and medical statistics.
Ostbye, T
1989-03-01
This article describes an extramural university course in epidemiology and medical statistics taught using a computer conferencing system, microcomputers and data communications. Computer conferencing was shown to be a powerful, yet quite easily mastered, vehicle for distance education. It allows health personnel unable to attend regular classes due to geographical or time constraints, to take part in an interactive learning environment at low cost. This overcomes part of the intellectual and social isolation associated with traditional correspondence courses. Teaching of epidemiology and medical statistics is well suited to computer conferencing, even if the asynchronicity of the medium makes discussion of the most complex statistical concepts a little cumbersome. Computer conferencing may also prove to be a useful tool for teaching other medical and health related subjects.
Dai, Mingwei; Ming, Jingsi; Cai, Mingxuan; Liu, Jin; Yang, Can; Wan, Xiang; Xu, Zongben
2017-09-15
Results from genome-wide association studies (GWAS) suggest that a complex phenotype is often affected by many variants with small effects, known as 'polygenicity'. Tens of thousands of samples are often required to ensure the statistical power to identify these variants with small effects. However, it is often the case that a research group can only get approval for access to individual-level genotype data with a limited sample size (e.g. a few hundred or thousand). Meanwhile, summary statistics generated using single-variant-based analysis are becoming publicly available, and the sample sizes associated with these summary statistics datasets are usually quite large. How to make the most efficient use of existing abundant data resources largely remains an open question. In this study, we propose a statistical approach, IGESS, to increase the statistical power of identifying risk variants and improve the accuracy of risk prediction by integrating individual-level genotype data and summary statistics. An efficient algorithm based on variational inference is developed to handle the genome-wide analysis. Through comprehensive simulation studies, we demonstrated the advantages of IGESS over methods which take either individual-level data or summary statistics as input. We applied IGESS to perform integrative analysis of Crohn's disease data from WTCCC and summary statistics from other studies. IGESS was able to significantly increase the statistical power of identifying risk variants and to improve the risk prediction accuracy from 63.2% (±0.4%) to 69.4% (±0.1%) using about 240,000 variants. The IGESS software is available at https://github.com/daviddaigithub/IGESS . Contact: zbxu@xjtu.edu.cn or xwan@comp.hkbu.edu.hk or eeyang@hkbu.edu.hk. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
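Why combining a small individual-level dataset with large-sample summary statistics boosts power can be illustrated with a much simpler device than the IGESS variational algorithm: inverse-variance-weighted combination of two effect estimates, whose combined variance is smaller than either input. This is a simplified illustration, not the paper's method; the numbers are invented.

```python
# Inverse-variance weighting of two effect-size estimates for one variant:
# a small individual-level study and a large public summary statistic.
# Smaller combined standard error -> larger |z| -> higher power.

def combine(beta1, se1, beta2, se2):
    w1, w2 = 1 / se1**2, 1 / se2**2
    beta = (w1 * beta1 + w2 * beta2) / (w1 + w2)
    se = (w1 + w2) ** -0.5
    return beta, se

beta_ind, se_ind = 0.12, 0.05   # small individual-level study (invented)
beta_sum, se_sum = 0.10, 0.02   # large summary-statistics study (invented)

beta_c, se_c = combine(beta_ind, se_ind, beta_sum, se_sum)
z_combined = beta_c / se_c      # test statistic after integration
```

IGESS goes further by modeling all variants jointly with a polygenic prior, but the variance-reduction intuition is the same.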
Quantitative and statistical approaches to geography a practical manual
Matthews, John A
2013-01-01
Quantitative and Statistical Approaches to Geography: A Practical Manual is a practical introduction to some quantitative and statistical techniques of use to geographers and related scientists. The book is composed of 15 chapters; each begins with an outline of the purpose and necessary mechanics of a technique or group of techniques and concludes with exercises reflecting the particular approach adopted. These exercises aim to enhance students' ability to use the techniques as part of the process by which sound judgments are made according to scientific standards while tackling complex problems. After a brief introduction to the principles of quantitative and statistical geography, the book deals with measures of central tendency; probability statements and maps; the problems of time-dependence, time-series analysis, non-normality, and data transformations; and the elements of sampling methodology. Other chapters cover confidence intervals and estimation from samples, statistical hy...
Visual wetness perception based on image color statistics.
Sawayama, Masataka; Adelson, Edward H; Nishida, Shin'ya
2017-05-01
Color vision provides humans and animals with the abilities to discriminate colors based on the wavelength composition of light and to determine the location and identity of objects of interest in cluttered scenes (e.g., ripe fruit among foliage). However, we argue that color vision can inform us about much more than color alone. Since a trichromatic image carries more information about the optical properties of a scene than a monochromatic image does, color can help us recognize complex material qualities. Here we show that human vision uses color statistics of an image for the perception of an ecologically important surface condition (i.e., wetness). Psychophysical experiments showed that overall enhancement of chromatic saturation, combined with a luminance tone change that increases the darkness and glossiness of the image, tended to make dry scenes look wetter. Theoretical analysis along with image analysis of real objects indicated that our image transformation, which we call the wetness enhancing transformation, is consistent with actual optical changes produced by surface wetting. Furthermore, we found that the wetness enhancing transformation operator was more effective for the images with many colors (large hue entropy) than for those with few colors (small hue entropy). The hue entropy may be used to separate surface wetness from other surface states having similar optical properties. While surface wetness and surface color might seem to be independent, there are higher order color statistics that can influence wetness judgments, in accord with the ecological statistics. The present findings indicate that the visual system uses color image statistics in an elegant way to help estimate the complex physical status of a scene.
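The "wetness enhancing transformation" described above combines a boost in chromatic saturation with a darkening tone change. The sketch below is a rough illustration with assumed parameters (gain 1.5, tone 0.7), not the authors' exact operator; it applies the two changes per pixel in HSV space.

```python
import colorsys

# Rough sketch of a wetness-enhancing transformation: increase chromatic
# saturation and darken the tone of each pixel. Pixels are (r, g, b)
# floats in [0, 1]; sat_gain and tone values are assumptions.

def wet_transform(pixels, sat_gain=1.5, tone=0.7):
    out = []
    for r, g, b in pixels:
        h, s, v = colorsys.rgb_to_hsv(r, g, b)
        s = min(1.0, s * sat_gain)   # more saturated
        v = v * tone                 # darker, wetter-looking
        out.append(colorsys.hsv_to_rgb(h, s, v))
    return out

dry = [(0.8, 0.6, 0.4), (0.5, 0.7, 0.3)]   # invented "dry scene" pixels
wet = wet_transform(dry)
```

The paper's further observation is that this transformation is more convincing for images with large hue entropy (many colors) than for nearly monochrome ones.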
Spectral statistics of 'cellular' billiards
International Nuclear Information System (INIS)
Gutkin, Boris
2011-01-01
For a bounded domain Ω_0 ⊂ R^2 whose boundary contains a number of flat pieces Γ_i, i = 1, ..., l, we consider a family of non-symmetric billiards Ω constructed by patching several copies of Ω_0 along the Γ_i. It is demonstrated that the length spectrum of the periodic orbits in Ω is degenerate, with the multiplicities determined by a matrix group G. We study the energy spectrum of the corresponding quantum billiard problem in Ω and show that it can be split into a number of uncorrelated subspectra corresponding to a set of irreducible representations α of G. Assuming that the classical dynamics in Ω_0 are chaotic, we derive a semiclassical trace formula for each spectral component and show that their energy level statistics are the same as in standard random matrix ensembles. Depending on whether α is real, pseudo-real or complex, the spectrum has Gaussian orthogonal, Gaussian symplectic or Gaussian unitary statistics, respectively.
Oliveira, Fernando C; Ferreira, Carlos E R; Haas, Cristina S; Oliveira, Leonardo G; Mondadori, Rafael G; Schneider, Augusto; Rovani, Monique T; Gonçalves, Paulo B D; Vieira, Arnaldo D; Gasperin, Bernardo G; Lucia, Thomaz
2017-03-01
Intratesticular injection (ITI) of sodium chloride (NaCl) is efficient for chemical castration of young calves, but its effects on calves' welfare are unknown. Two experiments were conducted: to evaluate the effects of ITI of 20% NaCl on stress and inflammatory markers in calves less than 20 days old, and to assess the efficiency of ITI of 30% NaCl in 5-month-old calves. In Experiment 1, control calves were only restrained and compared to calves submitted to castration through surgery (SC) or ITI with 20% NaCl (n = 9/group). No differences were observed for the eye corner temperature measured by thermography from 60 s before to 60 s after the procedures (P > 0.05). In the SC group, acute serum cortisol levels increased at 30 and 60 min after the procedure, but increased levels in the ITI group occurred only at 30 min (P < 0.05). Scrotal temperature was higher at D1 in the SC group than in the other groups, but lowest at D4 compared to the control (both P < 0.05). Castration through ITI of 20% NaCl in young calves was followed by slight stress and inflammatory responses compared to surgical castration. However, ITI of 30% NaCl was ineffective for chemical castration of 5-month-old calves. Copyright © 2016 Elsevier Inc. All rights reserved.
Bayesian models: A statistical primer for ecologists
Hobbs, N. Thompson; Hooten, Mevin B.
2015-01-01
Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods, in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probability and develops a step-by-step sequence of connected ideas, including basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and inference from single and multiple models. This unique book places less emphasis on computer coding, favoring instead a concise presentation of the mathematical statistics needed to understand how and why Bayesian analysis works. It also explains how to write out properly formulated hierarchical Bayesian models and use them in computing, research papers, and proposals. This primer enables ecologists to understand the statistical principles behind Bayesian modeling and apply them to research, teaching, policy, and management. Key features: presents the mathematical and statistical foundations of Bayesian modeling in language accessible to non-statisticians; covers basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and more; deemphasizes computer coding in favor of basic principles; and explains how to write out properly factored statistical expressions representing Bayesian models.
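Markov chain Monte Carlo, one of the topics the primer covers, can be demonstrated in a dozen lines. The sketch below (an illustration of mine, not an example from the book) runs a random-walk Metropolis sampler on the posterior of a detection probability theta given k successes in n trials under a flat prior, for which the analytic posterior is Beta(k+1, n−k+1).

```python
import math
import random

# Random-walk Metropolis for the posterior of a success probability
# theta given k successes in n Bernoulli trials, flat prior.
# Analytic answer: Beta(k+1, n-k+1), posterior mean (k+1)/(n+2).

random.seed(1)
k, n = 7, 10

def log_post(theta):
    if not 0 < theta < 1:
        return -math.inf          # outside the support
    return k * math.log(theta) + (n - k) * math.log(1 - theta)

theta, samples = 0.5, []
for i in range(60000):
    prop = theta + random.gauss(0, 0.1)           # random-walk proposal
    if math.log(random.random()) < log_post(prop) - log_post(theta):
        theta = prop                              # accept
    if i >= 10000:                                # discard burn-in
        samples.append(theta)

post_mean = sum(samples) / len(samples)           # ~ (k+1)/(n+2)
```

The same accept/reject recipe scales to the hierarchical models the book builds, where only the log-posterior changes.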
Modelling the structure of complex networks
DEFF Research Database (Denmark)
Herlau, Tue
…networks has been independently studied as mathematical objects in their own right. As such, there has been both an increased demand for statistical methods for complex networks and a quickly growing mathematical literature on the subject. In this dissertation we explore aspects of modelling complex networks. ... The next chapters treat some of the various symmetries, representer theorems and probabilistic structures often deployed in the modelling of complex networks, the construction of sampling methods, and various network models. The introductory chapters serve to provide context for the included written…
Statistical learning and the challenge of syntax: Beyond finite state automata
Elman, Jeff
2003-10-01
Over the past decade, it has been clear that even very young infants are sensitive to the statistical structure of language input presented to them, and use the distributional regularities to induce simple grammars. But can such statistically-driven learning also explain the acquisition of more complex grammar, particularly when the grammar includes recursion? Recent claims (e.g., Hauser, Chomsky, and Fitch, 2002) have suggested that the answer is no, and that at least recursion must be an innate capacity of the human language acquisition device. In this talk evidence will be presented that indicates that, in fact, statistically-driven learning (embodied in recurrent neural networks) can indeed enable the learning of complex grammatical patterns, including those that involve recursion. When the results are generalized to idealized machines, it is found that the networks are at least equivalent to Push Down Automata. Perhaps more interestingly, with limited and finite resources (such as are presumed to exist in the human brain) these systems demonstrate patterns of performance that resemble those in humans.
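The claim that the networks reach at least the power of push-down automata can be made concrete with the classic example separating the two machine classes: recognizing a^n b^n requires unbounded memory, which a finite-state machine lacks. The sketch below (my illustration; the function name is invented) uses a counter, equivalent to a PDA with a single stack symbol:

```python
# Counter automaton recognizing a^n b^n -- the canonical recursive
# pattern no finite-state machine can recognize. Equivalent to a
# push-down automaton whose stack holds one symbol type.

def accepts_anbn(s):
    count, seen_b = 0, False
    for ch in s:
        if ch == 'a':
            if seen_b:          # an 'a' after a 'b' is illegal
                return False
            count += 1          # push
        elif ch == 'b':
            seen_b = True
            count -= 1          # pop
            if count < 0:       # more b's than a's so far
                return False
        else:
            return False        # alphabet is {a, b}
    return count == 0           # stack empty: every 'a' was matched
```

A recurrent network that learns this language must implement something functionally like `count` in its hidden state, which is the substance of Elman's argument.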
Central Limit Theorem for Exponentially Quasi-local Statistics of Spin Models on Cayley Graphs
Reddy, Tulasi Ram; Vadlamani, Sreekar; Yogeshwaran, D.
2018-04-01
Central limit theorems for linear statistics of lattice random fields (including spin models) are usually proven under suitable mixing conditions or quasi-associativity. Many interesting examples of spin models do not satisfy mixing conditions, and on the other hand, it does not seem easy to show central limit theorems for local statistics via quasi-associativity. In this work, we prove general central limit theorems for local statistics and exponentially quasi-local statistics of spin models on discrete Cayley graphs with polynomial growth. Further, we supplement these results by proving similar central limit theorems for random fields on discrete Cayley graphs taking values in a countable space, but under the stronger assumptions of α-mixing (for local statistics) and exponential α-mixing (for exponentially quasi-local statistics). All our central limit theorems assume a suitable variance lower bound, like many others in the literature. We illustrate our general central limit theorem with specific examples of lattice spin models and statistics arising in computational topology, statistical physics and random networks. Examples of clustering spin models include quasi-associated spin models with fast decaying covariances like the off-critical Ising model, level sets of Gaussian random fields with fast decaying covariances like the massive Gaussian free field, and determinantal point processes with fast decaying kernels. Examples of local statistics include intrinsic volumes, face counts, and component counts of random cubical complexes, while exponentially quasi-local statistics include nearest neighbour distances in spin models and Betti numbers of sub-critical random cubical complexes.
Birth tourism: socio-demographic and statistical aspects
Directory of Open Access Journals (Sweden)
Anatoly V. Korotkov
2016-01-01
Full Text Available The purpose of the study is to research the issue of birth tourism. The article considers socio-demographic and statistical aspects of research on inbound birth tourism in the Russian Federation. The literature analysis shows that the degree to which birth tourism has been studied lags behind its actual scale. The media have accumulated a significant amount of information on birth tourism in Russia that requires processing, systematization and interpretation; this can and should become an independent area of study for sociologists and demographers, aimed at developing recommendations for managing the socio-demographic processes involved in birth tourism in our country. It is necessary to identify the problems that will inevitably arise; at present, the process is almost unregulated. These problems are complex and require the joint efforts of sociologists and demographers. However, it is impossible to obtain reliable results and to develop management decisions without attention to the statistical aspect of the problem. Methodological support must be created for collecting and processing information and for developing models of birth tourism. At the initial stage it is necessary to identify the direction and objectives of the analysis, to determine the factors driving this process, to develop a hierarchical system of statistical indicators, and to obtain the information needed for calculating specific indicators. Comprehensive research on birth tourism should be based on the methodology of sociology, demography and statistics, including statistical observation, interviews with residents, analysis of the structure and concentration of birth tourism in the country, analysis of its dynamics, classification of factors and reasons, grouping of regions by the development of the studied processes and, of course, the development of economic-statistical indicators. The article reveals the problem of the significant influence of the
Innovations in Statistical Observations of Consumer Prices
Directory of Open Access Journals (Sweden)
Olga Stepanovna Oleynik
2016-10-01
Full Text Available This article analyzes the innovative changes in the methodology of statistical surveys of consumer prices. These changes are reflected in the "Official statistical methodology for the organization of statistical observation of consumer prices for goods and services and the calculation of the consumer price index", approved by order of the Federal State Statistics Service of December 30, 2014, no. 734. The essence of the innovation is the use of mathematical methods in determining the range of trade and service objects to be studied, and in calculating a sufficient number of observed price quotes based on price dispersion, the share of the observed product (service) representative in consumer spending, and an indicator of the complexity of price registration. The authors analyzed the mathematical calculations of the required number of quotations for observation in the Volgograd region in 2016 and compared the results with the number of quotes included in the monitoring. The authors believe that the implementation of these mathematical models made it possible to substantially reduce the influence of subjective factors in the organization of consumer price monitoring, and therefore to increase the objectivity of the resulting statistics on consumer prices and inflation. At the same time, the proposed methodology needs further improvement in its treatment of goods, products and service representatives having a minor share in consumer expenditure.
Li, Ziru; Zhang, Xusheng
2008-12-01
Infrared thermal imaging (ITI) is a potential imaging technique for the health care field of traditional Chinese medicine (TCM). Successful application demands respecting the characteristics and regularity of ITI of the human body and designing rigorous trials. First, the influence of time must be taken into account, as the ITI of the human body varies markedly with time. Second, relative magnitude is preferred as the index of image features. Third, scatter diagrams and the method of least squares can present important information for evaluating the health care effect. A double-blind placebo-controlled randomized trial was undertaken to study the influence of Shengsheng capsule, a TCM health food with an immunity-adjusting function, on the ITI of the human body. The results showed that the effect of Shengsheng capsule on people with a weak constitution, or in a period of weakness, could be reflected objectively by ITI. The relative efficacy rate was 81.3% for the trial group and 30.0% for the control group, a significant difference between the two groups (P=0.003). The sensitivity and objectivity of ITI are thus of great importance to the health care field of TCM.
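The "method of least squares" the abstract recommends for evaluating trends in scatter diagrams is the standard closed-form linear fit. The sketch below is a generic illustration (the data points are invented, not taken from the trial):

```python
# Closed-form simple linear regression y = a + b*x, as one would fit a
# trend line through a scatter of (day, relative magnitude) points.
# The data below are hypothetical.

def least_squares(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

days = [0, 7, 14, 21, 28]
rel_mag = [1.00, 1.04, 1.09, 1.13, 1.16]   # hypothetical trend
a, b = least_squares(days, rel_mag)        # intercept, slope per day
```

A positive fitted slope over the treatment period would be the kind of objective trend evidence the authors advocate extracting from ITI features.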
Zeng, Irene Sui Lan; Lumley, Thomas
2018-01-01
Integrated omics is becoming a new channel for investigating the complex molecular system in modern biological science and sets a foundation for systematic learning for precision medicine. The statistical/machine learning methods that have emerged in the past decade for integrated omics are not only innovative but also multidisciplinary, with integrated knowledge in biology, medicine, statistics, machine learning, and artificial intelligence. Here, we review the nontrivial classes of learning methods from the statistical aspects and streamline these learning methods within the statistical learning framework. The intriguing findings from the review are that the methods used are generalizable to other disciplines with complex systematic structure, and that integrated omics is part of an integrated information science which has collated and integrated different types of information for inferences and decision making. We review the statistical learning methods of exploratory and supervised learning from 42 publications. We also discuss the strengths and limitations of the extended principal component analysis, cluster analysis, network analysis, and regression methods. Statistical techniques such as penalization for sparsity induction when there are fewer observations than features, and the use of a Bayesian approach when there is prior knowledge to be integrated, are also included in the commentary. For completeness of the review, a table of currently available software and packages from 23 publications for omics is summarized in the appendix.
Directory of Open Access Journals (Sweden)
Chaeyoung Lee
2012-11-01
Full Text Available Epistasis, which may explain a large portion of the phenotypic variation for complex economic traits of animals, has been ignored in many genetic association studies. A Bayesian method was introduced to draw inferences about multilocus genotypic effects based on their marginal posterior distributions obtained by a Gibbs sampler. A simulation study was conducted to provide statistical powers under various unbalanced designs using this method. Data were simulated by combined designs of number of loci, within-genotype variance, and sample size in unbalanced designs with or without null combined-genotype cells. Mean empirical statistical power was estimated for testing the posterior mean estimate of the combined genotype effect. A practical example of obtaining empirical statistical power estimates with a given sample size was provided under unbalanced designs. The empirical statistical powers would be useful for determining an optimal design when interactive associations of multiple loci with complex phenotypes are examined.
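The notion of "empirical statistical power" used above is simply the fraction of simulated datasets in which the chosen test detects a true effect. The sketch below illustrates the idea with a plain two-sample z-test rather than the paper's Gibbs-sampler procedure; the effect size, sample size and alpha level are invented.

```python
import random
import statistics

# Empirical power by simulation: simulate many datasets with a true
# mean difference delta, count how often a two-sample z-test (known
# sigma, alpha = 0.05 two-sided) rejects the null.

random.seed(2)

def detects(n, delta, sigma=1.0, z_crit=1.96):
    a = [random.gauss(0.0, sigma) for _ in range(n)]
    b = [random.gauss(delta, sigma) for _ in range(n)]
    se = (sigma**2 / n + sigma**2 / n) ** 0.5
    z = (statistics.mean(b) - statistics.mean(a)) / se
    return abs(z) > z_crit

reps = 2000
power = sum(detects(n=30, delta=0.8) for _ in range(reps)) / reps
```

In the paper the same loop is run over unbalanced multilocus designs, with the rejection rule replaced by a test on the posterior mean of the combined genotype effect.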
International Nuclear Information System (INIS)
Lim, Gyeong Hui
2008-03-01
This book consists of 15 chapters, covering: the basic concepts and meaning of statistical thermodynamics; Maxwell-Boltzmann statistics; ensembles; thermodynamic functions and fluctuations; statistical dynamics of independent-particle systems; ideal molecular systems; chemical equilibrium and chemical reaction rates in ideal gas mixtures; classical statistical thermodynamics; the ideal lattice model; lattice statistics and non-ideal lattice models; imperfect gas theory of liquids; the theory of solutions; statistical thermodynamics of interfaces; statistical thermodynamics of high-molecular systems; and quantum statistics.
DbAccess: Interactive Statistics and Graphics for Plasma Physics Databases
International Nuclear Information System (INIS)
Davis, W.; Mastrovito, D.
2003-01-01
DbAccess is an X-Windows application, written in IDL®, meeting many specialized statistical and graphical needs of NSTX [National Spherical Torus Experiment] plasma physicists, such as regression statistics and the analysis of variance. Flexible "views" and "joins", which include options for complex SQL expressions, facilitate mixing data from different database tables. General Atomics Plot Objects add extensive graphical and interactive capabilities. An example is included for plasma confinement-time scaling analysis using a multiple linear regression least-squares power fit.
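The confinement-time scaling example mentioned above, a multiple linear regression least-squares power fit, works by taking logarithms of a power law so it becomes linear in the unknown exponents. The sketch below is a generic illustration (variable names, data and the true exponents are all invented, and no DbAccess/IDL code is implied):

```python
import math

# Power-law scaling fit: tau = C * Ip**a * P**b becomes linear in logs,
#   ln tau = ln C + a*ln Ip + b*ln P,
# and is fit by least squares via the 3x3 normal equations.

def solve3(A, y):
    """Gaussian elimination with partial pivoting for a 3x3 system."""
    M = [row[:] + [v] for row, v in zip(A, y)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, 3):
            f = M[r][i] / M[i][i]
            for c in range(i, 4):
                M[r][c] -= f * M[i][c]
    x = [0.0] * 3
    for i in (2, 1, 0):
        x[i] = (M[i][3] - sum(M[i][c] * x[c] for c in range(i + 1, 3))) / M[i][i]
    return x

# Synthetic "shots" generated from tau = 0.05 * Ip**0.9 * P**-0.5.
shots = [(ip, p, 0.05 * ip**0.9 * p**-0.5)
         for ip in (0.5, 0.8, 1.0) for p in (1.0, 2.0, 4.0)]

# Normal equations X^T X beta = X^T y with rows [1, ln Ip, ln P].
rows = [(1.0, math.log(ip), math.log(p)) for ip, p, _ in shots]
ys = [math.log(t) for _, _, t in shots]
XtX = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
Xty = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(3)]
lnC, a, b = solve3(XtX, Xty)   # recovered exponents and prefactor
```

On noise-free synthetic data the fit recovers the generating exponents exactly; with real shot data the residuals give the regression statistics the tool reports.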
Statistical model of natural stimuli predicts edge-like pooling of spatial frequency channels in V2
Directory of Open Access Journals (Sweden)
Gutmann Michael
2005-02-01
Full Text Available Abstract. Background: It has been shown that the classical receptive fields of simple and complex cells in the primary visual cortex emerge from the statistical properties of natural images by forcing the cell responses to be maximally sparse or independent. We investigate how to learn features beyond the primary visual cortex from the statistical properties of modelled complex-cell outputs. In previous work, we showed that a new model, non-negative sparse coding, led to the emergence of features which code for contours of a given spatial frequency band. Results: We applied ordinary independent component analysis to modelled outputs of complex cells that span different frequency bands. The analysis led to the emergence of features which pool spatially coherent across-frequency activity in the modelled primary visual cortex. Thus, the statistically optimal way of processing complex-cell outputs abandons separate frequency channels, while preserving and even enhancing orientation tuning and spatial localization. As a technical aside, we found that the non-negativity constraint is not necessary: ordinary independent component analysis produces essentially the same results as our previous work. Conclusion: We propose that the pooling that emerges allows the features to code for realistic low-level image features related to step edges. Further, the results prove the viability of statistical modelling of natural images as a framework that produces quantitative predictions of visual processing.
Dynamical systems examples of complex behaviour
Jost, Jürgen
2005-01-01
Our aim is to introduce, explain, and discuss the fundamental problems, ideas, concepts, results, and methods of the theory of dynamical systems and to show how they can be used in specific examples. We do not intend to give a comprehensive overview of the present state of research in the theory of dynamical systems, nor a detailed historical account of its development. We try to explain the important results, often neglecting technical refinements and, usually, we do not provide proofs. One of the basic questions in studying dynamical systems, i.e. systems that evolve in time, is the construction of invariants that allow us to classify qualitative types of dynamical evolution, to distinguish between qualitatively different dynamics, and to study transitions between different types. It is also important to find out when a certain dynamic behavior is stable under small perturbations, as well as to understand the various scenarios of instability. Finally, an essential aspect of a dynamic evolution is the transformat...
Energy Technology Data Exchange (ETDEWEB)
Marzouk, Youssef [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)
2016-08-31
Predictive simulation of complex physical systems increasingly rests on the interplay of experimental observations with computational models. Key inputs, parameters, or structural aspects of models may be incomplete or unknown, and must be developed from indirect and limited observations. At the same time, quantified uncertainties are needed to qualify computational predictions in the support of design and decision-making. In this context, Bayesian statistics provides a foundation for inference from noisy and limited data, but at prohibitive computational expense. This project intends to make rigorous predictive modeling *feasible* in complex physical systems, via accelerated and scalable tools for uncertainty quantification, Bayesian inference, and experimental design. Specific objectives are as follows: 1. Develop adaptive posterior approximations and dimensionality reduction approaches for Bayesian inference in high-dimensional nonlinear systems. 2. Extend accelerated Bayesian methodologies to large-scale sequential data assimilation, fully treating nonlinear models and non-Gaussian state and parameter distributions. 3. Devise efficient surrogate-based methods for Bayesian model selection and the learning of model structure. 4. Develop scalable simulation/optimization approaches to nonlinear Bayesian experimental design, for both parameter inference and model selection. 5. Demonstrate these inferential tools on chemical kinetic models in reacting flow, constructing and refining thermochemical and electrochemical models from limited data. Demonstrate Bayesian filtering on canonical stochastic PDEs and in the dynamic estimation of inhomogeneous subsurface properties and flow fields.
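The Bayesian inference the project seeks to accelerate can be illustrated, in its plainest (unaccelerated) form, by a random-walk Metropolis sampler for a toy problem. This is a generic textbook sketch, not the project's actual machinery; the data, prior, and step size are all assumed for illustration.

```python
import numpy as np

# Toy inference: estimate the mean of a Gaussian with known unit variance.
rng = np.random.default_rng(1)
data = rng.normal(3.0, 1.0, 50)

def log_post(mu):
    # flat prior; Gaussian likelihood with sigma = 1
    return -0.5 * np.sum((data - mu) ** 2)

samples = []
mu = 0.0
for _ in range(5000):
    prop = mu + rng.normal(0, 0.5)          # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(mu):
        mu = prop                           # accept
    samples.append(mu)
posterior_mean = np.mean(samples[1000:])    # discard burn-in
```

The cost of such samplers grows quickly with model dimension and per-evaluation expense, which is exactly the bottleneck that surrogate models and dimension reduction, as listed in the objectives, are meant to relieve.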
[Statistics for statistics?--Thoughts about psychological tools].
Berger, Uwe; Stöbel-Richter, Yve
2007-12-01
Statistical methods hold a prominent place in psychologists' education. Known to be difficult to understand and laborious to learn, these contents are feared by students. Those who do not aspire to a research career at a university quickly forget the drilled material. Furthermore, because at first glance it does not apply to work with patients and other target groups, the methodological education as a whole has often been questioned. For many psychological practitioners, statistical education makes sense only insofar as it commands respect from other professions, namely physicians. For their own work, statistics is rarely taken seriously as a professional tool. The reason seems clear: statistics treats numbers, while psychotherapy treats subjects. So, is statistics an end in itself? With this article, we try to answer the question of whether and how statistical methods are represented within psychotherapeutic and psychological research. To that end, we analyzed 46 original papers from a complete volume of the journal Psychotherapy, Psychosomatics, Psychological Medicine (PPmP). Within the volume, 28 different analysis methods were applied, of which 89 per cent were directly based on statistics. Being able to write and critically read original papers, the backbone of research, presumes a high degree of statistical education. To ignore statistics means to ignore research, and ultimately to surrender one's own professional work to arbitrariness.
A complex autoregressive model and application to monthly temperature forecasts
Directory of Open Access Journals (Sweden)
X. Gu
2005-11-01
Full Text Available A complex autoregressive model was established based on the mathematic derivation of the least squares for the complex number domain which is referred to as the complex least squares. The model is different from the conventional way that the real number and the imaginary number are separately calculated. An application of this new model shows a better forecast than forecasts from other conventional statistical models, in predicting monthly temperature anomalies in July at 160 meteorological stations in mainland China. The conventional statistical models include an autoregressive model, where the real number and the imaginary number are separately disposed, an autoregressive model in the real number domain, and a persistence-forecast model.
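The central point above, solving the least-squares problem directly in the complex number domain rather than fitting real and imaginary parts separately, can be sketched with a complex AR(1) model. NumPy's `lstsq` accepts complex design matrices, so the estimator is a one-liner; the coefficient value and noise level are assumptions for the sketch, not the paper's temperature data.

```python
import numpy as np

# Complex AR(1): z_t = phi * z_{t-1} + eps_t, with phi and eps_t complex.
rng = np.random.default_rng(2)
phi_true = 0.6 + 0.3j
n = 2000
z = np.zeros(n, dtype=complex)
for t in range(1, n):
    eps = rng.normal(0, 0.1) + 1j * rng.normal(0, 0.1)
    z[t] = phi_true * z[t - 1] + eps

# Complex least squares: one regression in C, not two in R.
X = z[:-1].reshape(-1, 1)
phi_hat = np.linalg.lstsq(X, z[1:], rcond=None)[0][0]
```

The single complex regression couples the real and imaginary parts through the complex coefficient, which is what distinguishes it from the "separately calculated" conventional approach the abstract contrasts against.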
Nonlinear Phenomena in Complex Systems: From Nano to Macro Scale
Stanley, H
2014-01-01
Topics of complex system physics and their interdisciplinary applications to different problems in seismology, biology, economy, sociology, energy and nanotechnology are covered in this new work from renowned experts in their fields. In particular, contributed papers contain original results on network science, earthquake dynamics, econophysics, sociophysics, nanoscience and biological physics. Most of the papers use interdisciplinary approaches based on statistical physics, quantum physics and other topics of complex system physics. Papers on econophysics and sociophysics are focussed on societal aspects of physics such as opinion dynamics, public debates, and financial and economic stability. This work will be of interest to statistical physicists, economists, biologists, seismologists and all scientists working in interdisciplinary topics of complexity.
Complex networks: Dynamics and security
Indian Academy of Sciences (India)
This paper presents a perspective in the study of complex networks by focusing on how dynamics may affect network security under attacks. ... Department of Mathematics and Statistics, Arizona State University, Tempe, Arizona 85287, USA; Institute of Mathematics and Computer Science, University of Sao Paulo, Brazil ...
Dynamics and statistics of unstable quantum states
International Nuclear Information System (INIS)
Sokolov, V.V.; Zelevinsky, V.G.
1989-01-01
The statistical theory of spectra formulated in terms of random matrices is extended to unstable states. The energies and widths of these states are treated as real and imaginary parts of complex eigenvalues for an effective non-hermitian hamiltonian. Eigenvalue statistics are investigated under simple assumptions. If the coupling through common decay channels is weak we obtain a Wigner distribution for the level spacings and a Porter-Thomas one for the widths, the only exception being spacings smaller than the widths, where level repulsion fades out. Meanwhile, in the complex energy plane the repulsion of eigenvalues is quadratic, in accordance with the T-noninvariant character of decaying systems. In the opposite case of strong coupling with the continuum, k short-lived states are formed (k being the number of open decay channels). These states accumulate almost the whole total width, the rest of the states becoming long-lived. Such a perestroika corresponds to separation of direct processes (a nuclear analogue of Dicke coherent superradiance). At small channel number, Ericson fluctuations of the cross sections are found to be suppressed. The one-channel case is considered in detail. The joint distribution of energies and widths is obtained. The average cross sections and density of unstable states are calculated. (orig.)
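The effective non-hermitian hamiltonian picture above is easy to reproduce numerically for the one-channel case: take a GOE matrix, couple it to a single decay channel, and read energies and widths off the complex eigenvalues. Matrix size, coupling strength, and scaling are assumed values for the sketch.

```python
import numpy as np

# H_eff = H - (i/2) A A^T with one open channel: H is GOE, A a real
# coupling vector. Complex eigenvalues E - (i/2)Gamma give energies
# (real parts) and widths Gamma = -2 * imaginary parts.
rng = np.random.default_rng(3)
N, gamma = 200, 0.1
H = rng.normal(0, 1, (N, N))
H = (H + H.T) / np.sqrt(2 * N)              # GOE, scaled to unit band
A = rng.normal(0, np.sqrt(gamma), (N, 1))   # channel coupling amplitudes
H_eff = H - 0.5j * (A @ A.T)
ev = np.linalg.eigvals(H_eff)
energies, widths = ev.real, -2 * ev.imag
```

Two exact checks follow from the construction: all widths are nonnegative (the anti-hermitian part is negative semidefinite), and the widths sum to the total coupling strength, since the trace of H_eff fixes the sum of the eigenvalues.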
Turchi, Janita; Devan, Bryan; Yin, Pingbo; Sigrist, Emmalynn; Mishkin, Mortimer
2010-07-01
The monkey's ability to learn a set of visual discriminations presented concurrently just once a day on successive days (24-h ITI task) is based on habit formation, which is known to rely on a visuo-striatal circuit and to be independent of visuo-rhinal circuits that support one-trial memory. Consistent with this dissociation, we recently reported that performance on the 24-h ITI task is impaired by a striatal-function blocking agent, the dopaminergic antagonist haloperidol, and not by a rhinal-function blocking agent, the muscarinic cholinergic antagonist scopolamine. In the present study, monkeys were trained on a short-ITI form of concurrent visual discrimination learning, one in which a set of stimulus pairs is repeated not only across daily sessions but also several times within each session (in this case, at about 4-min ITIs). Asymptotic discrimination learning rates in the non-drug condition were reduced by half, from approximately 11 trials/pair on the 24-h ITI task to approximately 5 trials/pair on the 4-min ITI task, and this faster learning was impaired by systemic injections of either haloperidol or scopolamine. The results suggest that in the version of concurrent discrimination learning used here, the short ITIs within a session recruit both visuo-rhinal and visuo-striatal circuits, and that the final performance level is driven by both cognitive memory and habit formation working in concert.
HistFitter software framework for statistical data analysis
Energy Technology Data Exchange (ETDEWEB)
Baak, M. [CERN, Geneva (Switzerland); Besjes, G.J. [Radboud University Nijmegen, Nijmegen (Netherlands); Nikhef, Amsterdam (Netherlands); Cote, D. [University of Texas, Arlington (United States); Koutsman, A. [TRIUMF, Vancouver (Canada); Lorenz, J. [Ludwig-Maximilians-Universitaet Muenchen, Munich (Germany); Excellence Cluster Universe, Garching (Germany); Short, D. [University of Oxford, Oxford (United Kingdom)
2015-04-15
We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface. (orig.)
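The core operation the abstract describes, building a probability density function from a configuration and fitting it to data, can be mimicked in miniature without HistFitter's actual API (which sits on RooFit/RooStats and is not reproduced here). Below is a generic binned Poisson likelihood fit with an assumed background shape, signal shape, and signal-strength parameter mu; all numbers are invented for the sketch.

```python
import numpy as np

# Binned signal-plus-background model: expected counts lam_i = b_i + mu * s_i.
rng = np.random.default_rng(4)
bkg = np.array([50.0, 40.0, 30.0, 20.0])   # expected background per bin
sig = np.array([0.0, 2.0, 10.0, 5.0])      # signal shape at mu = 1
observed = rng.poisson(bkg + 2.0 * sig)    # pseudo-data with mu_true = 2

def nll(mu):
    # Poisson negative log-likelihood, dropping mu-independent terms
    lam = bkg + mu * sig
    return np.sum(lam - observed * np.log(lam))

# Crude grid-scan maximum-likelihood estimate of the signal strength.
mus = np.linspace(0, 5, 501)
mu_hat = mus[np.argmin([nll(m) for m in mus])]
```

Real analyses replace the grid scan with Minuit-style minimization and add nuisance parameters for systematics; the control/signal/validation-region logic mentioned above amounts to fitting several such binned models simultaneously with shared parameters.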
HistFitter software framework for statistical data analysis
International Nuclear Information System (INIS)
Baak, M.; Besjes, G.J.; Cote, D.; Koutsman, A.; Lorenz, J.; Short, D.
2015-01-01
We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface. (orig.)
Cressie, Noel; Calder, Catherine A; Clark, James S; Ver Hoef, Jay M; Wikle, Christopher K
2009-04-01
Analyses of ecological data should account for the uncertainty in the process(es) that generated the data. However, accounting for these uncertainties is a difficult task, since ecology is known for its complexity. Measurement and/or process errors are often the only sources of uncertainty modeled when addressing complex ecological problems, yet analyses should also account for uncertainty in sampling design, in model specification, in parameters governing the specified model, and in initial and boundary conditions. Only then can we be confident in the scientific inferences and forecasts made from an analysis. Probability and statistics provide a framework that accounts for multiple sources of uncertainty. Given the complexities of ecological studies, the hierarchical statistical model is an invaluable tool. This approach is not new in ecology, and there are many examples (both Bayesian and non-Bayesian) in the literature illustrating the benefits of this approach. In this article, we provide a baseline for concepts, notation, and methods, from which discussion on hierarchical statistical modeling in ecology can proceed. We have also planted some seeds for discussion and tried to show where the practical difficulties lie. Our thesis is that hierarchical statistical modeling is a powerful way of approaching ecological analysis in the presence of inevitable but quantifiable uncertainties, even if practical issues sometimes require pragmatic compromises.
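The hierarchical idea above, separate data and process models with uncertainty at each level, can be shown in its simplest empirical-Bayes form: site-level means drawn from a common distribution, each site estimate shrunk toward the grand mean in proportion to its sampling noise. Group counts, means, and variances are assumed values for the sketch.

```python
import numpy as np

# Process model: true site means ~ Normal(10, 2); data model: obs ~ Normal(mean, 4).
rng = np.random.default_rng(5)
n_sites, n_obs = 20, 10
site_means = rng.normal(10.0, 2.0, n_sites)
data = rng.normal(site_means[:, None], 4.0, (n_sites, n_obs))

y_bar = data.mean(axis=1)
grand = y_bar.mean()
var_within = 4.0**2 / n_obs                  # sampling variance of a site mean
var_between = max(y_bar.var(ddof=1) - var_within, 1e-9)
shrink = var_between / (var_between + var_within)
shrunk = grand + shrink * (y_bar - grand)    # partial pooling
```

A fully Bayesian treatment would put priors on the variances as well; the point of the sketch is only that information flows between levels, so noisy sites borrow strength from the ensemble.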
Habitat Complexity in Aquatic Microcosms Affects Processes Driven by Detritivores.
Directory of Open Access Journals (Sweden)
Lorea Flores
Full Text Available Habitat complexity can influence predation rates (e.g. by providing refuge but other ecosystem processes and species interactions might also be modulated by the properties of habitat structure. Here, we focussed on how complexity of artificial habitat (plastic plants, in microcosms, influenced short-term processes driven by three aquatic detritivores. The effects of habitat complexity on leaf decomposition, production of fine organic matter and pH levels were explored by measuring complexity in three ways: 1. as the presence vs. absence of habitat structure; 2. as the amount of structure (3 or 4.5 g of plastic plants; and 3. as the spatial configuration of structures (measured as fractal dimension. The experiment also addressed potential interactions among the consumers by running all possible species combinations. In the experimental microcosms, habitat complexity influenced how species performed, especially when comparing structure present vs. structure absent. Treatments with structure showed higher fine particulate matter production and lower pH compared to treatments without structures and this was probably due to higher digestion and respiration when structures were present. When we explored the effects of the different complexity levels, we found that the amount of structure added explained more than the fractal dimension of the structures. We give a detailed overview of the experimental design, statistical models and R codes, because our statistical analysis can be applied to other study systems (and disciplines such as restoration ecology. We further make suggestions of how to optimise statistical power when artificially assembling, and analysing, 'habitat complexity' by not confounding complexity with the amount of structure added. In summary, this study highlights the importance of habitat complexity for energy flow and the maintenance of ecosystem processes in aquatic ecosystems.
International Nuclear Information System (INIS)
Dai, Wu-Sheng; Xie, Mi
2013-01-01
In this paper, we give a general discussion on the calculation of the statistical distribution from a given operator relation of creation, annihilation, and number operators. Our result shows that as long as the relation between the number operator and the creation and annihilation operators can be expressed as a†b = Λ(N) or N = Λ⁻¹(a†b), where N, a†, and b denote the number, creation, and annihilation operators, i.e., N is a function of a quadratic product of the creation and annihilation operators, the corresponding statistical distribution is the Gentile distribution, a statistical distribution in which the maximum occupation number is an arbitrary integer. As examples, we discuss the statistical distributions corresponding to various operator relations. In particular, besides the Bose–Einstein and Fermi–Dirac cases, we discuss the statistical distributions for various schemes of intermediate statistics, especially various q-deformation schemes. Our result shows that the statistical distributions corresponding to various q-deformation schemes are various Gentile distributions with different maximum occupation numbers, which are determined by the deformation parameter q. This result shows that the results given in much of the literature on the q-deformation distribution are inaccurate or incomplete. Highlights: a general discussion on calculating the statistical distribution from relations of creation, annihilation, and number operators; a systematic study of the statistical distributions corresponding to various q-deformation schemes; an argument that many results on q-deformation distributions in the literature are inaccurate or incomplete.
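The Gentile distribution mentioned above has a closed form for the mean occupation number, and it interpolates between the familiar limits: n_max = 1 recovers Fermi-Dirac and n_max → ∞ recovers Bose-Einstein. The sketch below checks both limits numerically (x denotes (E - μ)/kT, taken positive).

```python
import numpy as np

def gentile(x, n_max):
    # Mean occupation for maximum occupancy n_max, valid for x > 0:
    # n(x) = 1/(e^x - 1) - (n_max + 1)/(e^{(n_max+1) x} - 1)
    return (1.0 / (np.exp(x) - 1.0)
            - (n_max + 1.0) / (np.exp((n_max + 1.0) * x) - 1.0))

x = 0.7
fd = 1.0 / (np.exp(x) + 1.0)   # Fermi-Dirac
be = 1.0 / (np.exp(x) - 1.0)   # Bose-Einstein
```

For n_max = 1 the two terms combine algebraically to 1/(e^x + 1), and for large n_max the second term is exponentially suppressed, leaving the Bose-Einstein form.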
Between-Trial Forgetting Due to Interference and Time in Motor Adaptation.
Directory of Open Access Journals (Sweden)
Sungshin Kim
Full Text Available Learning a motor task with temporally spaced presentations or with other tasks intermixed between presentations reduces performance during training, but can enhance retention post training. These two effects are known as the spacing and contextual interference effect, respectively. Here, we aimed at testing a unifying hypothesis of the spacing and contextual interference effects in visuomotor adaptation, according to which spaced presentations or interference by another task promote between-trial forgetting, which depresses performance during acquisition but promotes retention. We first performed an experiment with three visuomotor adaptation conditions: a short inter-trial-interval (ITI condition (SHORT-ITI; a long ITI condition (LONG-ITI; and an alternating condition with two alternated opposite tasks (ALT, with the same single-task ITI as in LONG-ITI. In the SHORT-ITI condition, there was the fastest increase in performance during training and the largest immediate forgetting in the retention tests. In contrast, in the ALT condition, there was the slowest increase in performance during training and little immediate forgetting in the retention tests. Compared to these two conditions, in the LONG-ITI condition we found an intermediate increase in performance during training and intermediate immediate forgetting. To account for these results, we fitted to the data six possible adaptation models with one or two time scales, and with interference in the fast, the slow, or both time scales. Model comparison confirmed that two time scales and some degree of interference in either time scale are needed to account for our experimental results. In summary, our results suggest that retention following adaptation is modulated by the degree of between-trial forgetting, which is due to time-based decay in a single adaptation task and to interference in multiple adaptation tasks.
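The two-timescale adaptation models referred to above follow a standard state-space form: a fast state that learns and forgets quickly plus a slow state that does the opposite, both driven by the trial error. The retention/decay parameters below are assumed round numbers, not the paper's fitted values.

```python
import numpy as np

# Two-state model: x' = A x + B * error, error = perturbation - (x_f + x_s).
A_f, B_f = 0.60, 0.20    # fast process: strong decay, high learning rate
A_s, B_s = 0.995, 0.02   # slow process: weak decay, low learning rate

def simulate(n_trials, perturbation=1.0):
    x_f = x_s = 0.0
    outputs = []
    for _ in range(n_trials):
        error = perturbation - (x_f + x_s)
        x_f = A_f * x_f + B_f * error
        x_s = A_s * x_s + B_s * error
        outputs.append(x_f + x_s)
    return np.array(outputs), x_f, x_s

out, x_f, x_s = simulate(200)
```

At asymptote the slow state carries most of the learning, which is why retention after training is dominated by the slow process; adding decay or interference terms between trials to either state is how the six candidate models in the study differ.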
Aspects of statistical spectroscopy relevant to effective-interaction theory
International Nuclear Information System (INIS)
French, J.B.
1975-01-01
The three aspects of statistical spectroscopy discussed in this paper are the information content of complex spectra: procedures for spectroscopy in huge model spaces, useful in effective-interaction theory; and practical ways of identifying and calculating measurable parameters of the effective Hamiltonian and other operators, and of comparing different effective Hamiltonians. (4 figures) (U.S.)
Non-extensive statistical aspects of clustering and nuclear multi-fragmentation
International Nuclear Information System (INIS)
Calboreanu, A.
2002-01-01
Recent developments concerning the application of non-extensive Tsallis statistics to describe clustering phenomena are briefly presented. Cluster formation is a common feature of a large number of physical phenomena encountered in molecular and nuclear physics, astrophysics, condensed matter and biophysics. Common to all these is the large number of degrees of freedom, thus justifying a statistical approach. However, the conventional statistical mechanics paradigm seems to fail in dealing with clustering. Whether this is due to the prevalence of complex dynamical constraints, or is a manifestation of new statistics, is a subject of considerable interest that was intensively debated during the last few years. Tsallis' conjecture has proved extremely appealing due to its rather elegant and transparent basic arguments. We present here evidence for its adequacy for the study of a large class of physical phenomena related to cluster formation. An application to nuclear multi-fragmentation is presented. (author)
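The building block of the non-extensive formalism mentioned above is the q-exponential, which replaces the Boltzmann factor and reduces to the ordinary exponential as q → 1. A minimal sketch, with illustrative arguments only:

```python
import numpy as np

def q_exp(x, q):
    # Tsallis q-exponential: e_q(x) = [1 + (1-q) x]^(1/(1-q)) when the
    # bracket is positive, 0 otherwise; q -> 1 recovers exp(x).
    if abs(q - 1.0) < 1e-12:
        return np.exp(x)
    base = 1.0 + (1.0 - q) * x
    return np.where(base > 0, base ** (1.0 / (1.0 - q)), 0.0)
```

For q > 1 the tail decays as a power law rather than exponentially, which is the feature exploited when fitting fragment-size or cluster-size distributions that conventional (extensive) statistics cannot reproduce.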
Quantum communication complexity advantage implies violation of a Bell inequality
H. Buhrman (Harry); L. Czekaj (Łukasz); A. Grudka (Andrzej); M. Horodecki (Michał); P. Horodecki (Paweł); M. Markiewicz (Marcin); F. Speelman (Florian); S. Strelchuk (Sergii)
2016-01-01
We obtain a general connection between a large quantum advantage in communication complexity and Bell nonlocality. We show that given any protocol offering a sufficiently large quantum advantage in communication complexity, there exists a way of obtaining measurement statistics that
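The Bell nonlocality side of the connection above is usually quantified by the CHSH inequality: local statistics obey |S| ≤ 2, while the optimal quantum (singlet-state) strategy reaches Tsirelson's bound 2√2. The sketch below evaluates the quantum value from the singlet correlator E(a, b) = -cos(a - b) at the standard optimal angles.

```python
import numpy as np

def E(a, b):
    # Singlet-state correlator for measurement angles a, b
    return -np.cos(a - b)

# Standard optimal CHSH angles
a0, a1 = 0.0, np.pi / 2
b0, b1 = np.pi / 4, 3 * np.pi / 4
S = abs(E(a0, b0) - E(a0, b1) + E(a1, b0) + E(a1, b1))
```

Any measurement statistics with |S| > 2 certify nonlocality, which is the kind of violation the communication-complexity advantage is shown to imply.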
Quantum communication complexity advantage implies violation of a Bell inequality
H. Buhrman (Harry); L. Czekaj (Łukasz); A. Grudka (Andrzej); M. Horodecki (Michał); P. Horodecki (Paweł); M. Markiewicz (Marcin); F. Speelman (Florian); S. Strelchuk (Sergii)
2015-01-01
We obtain a general connection between a quantum advantage in communication complexity and non-locality. We show that given any protocol offering a (sufficiently large) quantum advantage in communication complexity, there exists a way of obtaining measurement statistics which violate
Stochastic electromagnetic radiation of complex sources
Naus, H.W.L.
2007-01-01
The emission of electromagnetic radiation by localized complex electric charge and current distributions is studied. A statistical formalism in terms of general dynamical multipole fields is developed. The appearing coefficients are treated as stochastic variables. Hereby as much as possible a
Statistical-mechanical lattice models for protein-DNA binding in chromatin
International Nuclear Information System (INIS)
Teif, Vladimir B; Rippe, Karsten
2010-01-01
Statistical-mechanical lattice models for protein-DNA binding are well established as a method to describe complex ligand binding equilibria measured in vitro with purified DNA and protein components. Recently, a new field of applications has opened up for this approach since it has become possible to experimentally quantify genome-wide protein occupancies in relation to the DNA sequence. In particular, the organization of the eukaryotic genome by histone proteins into a nucleoprotein complex termed chromatin has been recognized as a key parameter that controls the access of transcription factors to the DNA sequence. New approaches have to be developed to derive statistical-mechanical lattice descriptions of chromatin-associated protein-DNA interactions. Here, we present the theoretical framework for lattice models of histone-DNA interactions in chromatin and investigate the (competitive) DNA binding of other chromosomal proteins and transcription factors. The results have a number of applications for quantitative models for the regulation of gene expression.
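The lattice descriptions discussed above are typically evaluated with transfer matrices. A minimal one-dimensional sketch: each site is empty or bound (binding weight K), with a cooperativity factor w between adjacent bound sites; K, w, and the lattice length are assumed toy values, and the brute-force sum verifies the transfer-matrix partition function.

```python
import itertools
import numpy as np

K, w, N = 2.0, 3.0, 10   # binding constant, cooperativity, lattice size (toy values)

# T[s, s'] = statistical weight added when a site in state s (0 empty,
# 1 bound) is followed by a site in state s'.
T = np.array([[1.0, K],
              [1.0, K * w]])
v0 = np.array([1.0, K])            # weights for the first site
Z_tm = v0 @ np.linalg.matrix_power(T, N - 1) @ np.array([1.0, 1.0])

# Brute-force check over all 2^N occupancy patterns.
Z_bf = 0.0
for config in itertools.product([0, 1], repeat=N):
    weight = K ** sum(config)
    weight *= w ** sum(a and b for a, b in zip(config, config[1:]))
    Z_bf += weight
```

Realistic chromatin models extend this scheme to ligands covering many sites, sequence-dependent weights, and competing protein species, but the bookkeeping is the same partition-function machinery.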
Hartmann, Alexander K
2005-01-01
A concise, comprehensive introduction to the topic of statistical physics of combinatorial optimization, bringing together theoretical concepts and algorithms from computer science with analytical methods from physics. The result bridges the gap between statistical physics and combinatorial optimization, investigating problems taken from theoretical computing, such as the vertex-cover problem, with the concepts and methods of theoretical physics. The authors cover rapid developments and analytical methods that are both extremely complex and spread by word-of-mouth, providing all the necessary
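The vertex-cover problem named above is easy to state in code: find a smallest set of vertices touching every edge. A brute-force solution on a tiny assumed graph illustrates the combinatorial object the book studies (its statistical-physics phenomena, such as phase transitions in random instances, only appear at much larger sizes).

```python
import itertools

# A small example graph: a triangle {0,1,2} plus a path 2-3-4.
edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4)]
n = 5

def is_cover(subset):
    # A vertex cover must touch every edge at least once.
    return all(u in subset or v in subset for u, v in edges)

# Try subsets in order of increasing size; the first cover found is minimum.
best = None
for k in range(n + 1):
    for cand in itertools.combinations(range(n), k):
        if is_cover(set(cand)):
            best = set(cand)
            break
    if best is not None:
        break
```

The exhaustive search is exponential in n, which is exactly why the statistical-physics analysis of typical-case hardness, the subject of the book, is interesting.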
Topology for Statistical Modeling of Petascale Data
Energy Technology Data Exchange (ETDEWEB)
Pascucci, Valerio [Univ. of Utah, Salt Lake City, UT (United States); Levine, Joshua [Univ. of Utah, Salt Lake City, UT (United States); Gyulassy, Attila [Univ. of Utah, Salt Lake City, UT (United States); Bremer, P. -T. [Univ. of Utah, Salt Lake City, UT (United States)
2013-10-31
Many commonly used algorithms for mathematical analysis do not scale well enough to accommodate the size or complexity of petascale data produced by computational simulations. The primary goal of this project is to develop new mathematical tools that address both the petascale size and uncertain nature of current data. At a high level, the approach of the entire team involving all three institutions is based on the complementary techniques of combinatorial topology and statistical modelling. In particular, we use combinatorial topology to filter out spurious data that would otherwise skew statistical modelling techniques, and we employ advanced algorithms from algebraic statistics to efficiently find globally optimal fits to statistical models. The overall technical contributions can be divided loosely into three categories: (1) advances in the field of combinatorial topology, (2) advances in statistical modelling, and (3) new integrated topological and statistical methods. Roughly speaking, the division of labor between our 3 groups (Sandia Labs in Livermore, Texas A&M in College Station, and U Utah in Salt Lake City) is as follows: the Sandia group focuses on statistical methods and their formulation in algebraic terms, and finds the application problems (and data sets) most relevant to this project; the Texas A&M group develops new algebraic geometry algorithms, in particular with fewnomial theory; and the Utah group develops new algorithms in computational topology via Discrete Morse Theory. However, we hasten to point out that our three groups stay in tight contact via videoconference every 2 weeks, so there is much synergy of ideas between the groups. The remainder of this document focuses on the contributions that had greater direct involvement from the team at the University of Utah in Salt Lake City.
Hayslett, H T
1991-01-01
Statistics covers the basic principles of statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the
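One of the topics listed above, the normal approximation to the binomial, is worth a worked check: with the continuity correction, P(X ≤ k) for X ~ Binomial(n, p) is approximated by Φ((k + 0.5 - np) / √(np(1-p))). The numbers below are an illustrative case, not taken from the book.

```python
import math

def binom_cdf(k, n, p):
    # Exact binomial CDF by direct summation
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def normal_approx(k, n, p):
    # Normal approximation with continuity correction
    mu, sigma = n * p, math.sqrt(n * p * (1 - p))
    z = (k + 0.5 - mu) / sigma
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

exact = binom_cdf(55, 100, 0.5)
approx = normal_approx(55, 100, 0.5)
```

For n = 100 and p = 0.5 the two agree to three decimal places; the approximation deteriorates when np or n(1-p) is small.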
Nearing, G. S.
2014-12-01
Statistical models consistently out-perform conceptual models in the short term, however to account for a nonstationary future (or an unobserved past) scientists prefer to base predictions on unchanging and commutable properties of the universe - i.e., physics. The problem with physically-based hydrology models is, of course, that they aren't really based on physics - they are based on statistical approximations of physical interactions, and we almost uniformly lack an understanding of the entropy associated with these approximations. Thermodynamics is successful precisely because entropy statistics are computable for homogeneous (well-mixed) systems, and ergodic arguments explain the success of Newton's laws to describe systems that are fundamentally quantum in nature. Unfortunately, similar arguments do not hold for systems like watersheds that are heterogeneous at a wide range of scales. Ray Solomonoff formalized the situation in 1968 by showing that, given infinite evidence, simultaneously minimizing model complexity and entropy in predictions always leads to the best possible model. The open question in hydrology is what happens when we don't have infinite evidence - for example, when the future will not look like the past, or when one watershed does not behave like another. How do we isolate stationary and commutable components of watershed behavior? I propose that one possible answer to this dilemma lies in a formal combination of physics and statistics. In this talk I outline my recent analogue of Solomonoff's idea (Solomonoff's theorem was digital) that allows us to quantify the complexity/entropy tradeoff in a way that is intuitive to physical scientists. I show how to formally combine "physical" and statistical methods for model development in a way that allows us to derive the theoretically best possible model given any physics approximation(s) and available observations. Finally, I apply an analogue of Solomonoff's theorem to evaluate the
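The complexity/entropy tradeoff described above can be shown in miniature with an information-criterion model selection: fit polynomials of increasing order and penalize each parameter, so the score balances goodness of fit against complexity. This AIC sketch is a standard MDL-flavored proxy for the idea, not Solomonoff's construction; the data-generating polynomial and noise level are assumed.

```python
import numpy as np

# Synthetic data from a quadratic (true order = 2) plus Gaussian noise.
rng = np.random.default_rng(6)
x = np.linspace(-1, 1, 60)
y = 1.0 - 2.0 * x + 0.5 * x**2 + rng.normal(0, 0.1, x.size)

def aic(order):
    # AIC = n * log(RSS / n) + 2k: fit quality plus a complexity penalty
    coef = np.polyfit(x, y, order)
    resid = y - np.polyval(coef, x)
    rss = np.sum(resid**2)
    k = order + 1
    return x.size * np.log(rss / x.size) + 2 * k

best_order = min(range(9), key=aic)
```

Underfitting (order < 2) leaves structure in the residuals and a large entropy term; overfitting pays the complexity penalty without a compensating fit improvement, so the criterion favors an order near the true one.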
Industrial commodity statistics yearbook 2001. Production statistics (1992-2001)
International Nuclear Information System (INIS)
2003-01-01
This is the thirty-fifth in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title The Growth of World Industry and the next eight editions under the title Yearbook of Industrial Statistics. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. The statistics refer to the ten-year period 1992-2001 for about 200 countries and areas
Industrial commodity statistics yearbook 2002. Production statistics (1993-2002)
International Nuclear Information System (INIS)
2004-01-01
This is the thirty-sixth in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics, while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title 'The Growth of World Industry' and the next eight editions under the title 'Yearbook of Industrial Statistics'. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. The statistics refer to the ten-year period 1993-2002 for about 200 countries and areas
Industrial commodity statistics yearbook 2000. Production statistics (1991-2000)
International Nuclear Information System (INIS)
2002-01-01
This is the thirty-fourth in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics, while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title The Growth of World Industry and the next eight editions under the title Yearbook of Industrial Statistics. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. Most of the statistics refer to the ten-year period 1991-2000 for about 200 countries and areas
International Nuclear Information System (INIS)
Kongsoe, H.E.; Lauridsen, K.
1993-09-01
SIMON is a program for reliability calculations and statistical analysis. The program is of the Monte Carlo type; it is designed with high flexibility and has a large potential for application to complex problems, such as reliability analyses of very large systems and of systems where complex modelling or knowledge of special details is required. Examples of application of the program, including input and output, for reliability and statistical analysis are presented. (au) (3 tabs., 3 ills., 5 refs.)
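The Monte Carlo approach this record refers to can be illustrated with a toy example (not SIMON itself; the system layout and failure probabilities below are hypothetical): estimate the reliability of a small system by repeatedly sampling component states.

```python
import random

random.seed(42)

# Hypothetical system: two redundant pumps in parallel, in series with one valve.
# The system works if the valve works AND at least one pump works.
P_PUMP_FAIL, P_VALVE_FAIL = 0.1, 0.02

def system_works():
    pump_a = random.random() >= P_PUMP_FAIL
    pump_b = random.random() >= P_PUMP_FAIL
    valve = random.random() >= P_VALVE_FAIL
    return valve and (pump_a or pump_b)

n = 100_000
reliability = sum(system_works() for _ in range(n)) / n
exact = (1 - P_VALVE_FAIL) * (1 - P_PUMP_FAIL ** 2)  # analytic value for this toy system
print(f"Monte Carlo: {reliability:.4f}  exact: {exact:.4f}")
```

For such a small system the analytic value is available; the point of a Monte Carlo code is that the same sampling loop scales to systems far too complex for closed-form analysis.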
Scalable Algorithms for Adaptive Statistical Designs
Directory of Open Access Journals (Sweden)
Robert Oehmke
2000-01-01
We present a scalable, high-performance solution to multidimensional recurrences that arise in adaptive statistical designs. Adaptive designs are an important class of learning algorithms for a stochastic environment, and we focus on the problem of optimally assigning patients to treatments in clinical trials. While adaptive designs have significant ethical and cost advantages, they are rarely utilized because of the complexity of optimizing and analyzing them. Computational challenges include massive memory requirements, few calculations per memory access, and multiply-nested loops with dynamic indices. We analyze the effects of various parallelization options, and while standard approaches do not work well, with effort an efficient, highly scalable program can be developed. This allows us to solve problems thousands of times more complex than those solved previously, which helps make adaptive designs practical. Further, our work applies to many other problems involving neighbor recurrences, such as generalized string matching.
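The "multidimensional recurrence" behind optimal adaptive allocation can be seen in a tiny sketch (a standard Bayesian bandit recurrence, not the authors' parallel code; horizon and priors are illustrative). The state is the four counts of successes/failures on each arm, and the value function is defined by exactly the kind of nested recurrence whose memory footprint explodes with the horizon.

```python
from functools import lru_cache

HORIZON = 20  # total patients to allocate (kept tiny; real designs are far larger)

@lru_cache(maxsize=None)
def expected_successes(s1, f1, s2, f2):
    """Max expected successes for the remaining patients, given observed
    successes/failures on each treatment (uniform Beta(1,1) priors)."""
    if s1 + f1 + s2 + f2 == HORIZON:
        return 0.0
    p1 = (s1 + 1) / (s1 + f1 + 2)  # posterior mean of arm 1
    p2 = (s2 + 1) / (s2 + f2 + 2)  # posterior mean of arm 2
    arm1 = p1 * (1 + expected_successes(s1 + 1, f1, s2, f2)) \
         + (1 - p1) * expected_successes(s1, f1 + 1, s2, f2)
    arm2 = p2 * (1 + expected_successes(s1, f1, s2 + 1, f2)) \
         + (1 - p2) * expected_successes(s1, f1, s2, f2 + 1)
    return max(arm1, arm2)

value = expected_successes(0, 0, 0, 0)
print(f"expected successes under optimal adaptive allocation: {value:.3f}")
```

Equal randomization would give HORIZON/2 expected successes here; the adaptive value exceeds that, which is the ethical advantage the abstract mentions. The state space grows as roughly HORIZON^4, which is why scalable parallel solutions matter.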
Bayesian Statistics: Concepts and Applications in Animal Breeding – A Review
Directory of Open Access Journals (Sweden)
Laxmikant-Sambhaji Kokate
2011-07-01
Statistics uses two major approaches: conventional (or frequentist) and Bayesian. The Bayesian approach provides a complete paradigm for both statistical inference and decision making under uncertainty. Bayesian methods solve many of the difficulties faced by conventional statistical methods and extend the applicability of statistical methods. They exploit the use of probabilistic models to formulate scientific problems. Two obstacles to using Bayesian statistics are its computational difficulty and the need to specify prior probability distributions. Markov Chain Monte Carlo (MCMC) methods were applied to overcome the computational difficulty, and interest in Bayesian methods was renewed. In Bayesian statistics, the Bayesian structural equation model (SEM) is used. It provides a powerful and flexible approach for studying quantitative traits for a wide spectrum of problems and thus has no operational difficulties, with the exception of some complex cases. In this method problems are solved with ease, and statisticians find it comfortable for expressing results and employing the available software to analyze a large variety of problems.
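The MCMC idea mentioned in the review can be shown in a minimal sketch (a generic random-walk Metropolis sampler on a toy problem, not an animal-breeding model; data and tuning constants are hypothetical): draw samples from a posterior that is only known up to a constant.

```python
import math
import random

random.seed(1)

# Toy data: coin flips; infer P(heads) with a flat prior.
flips = [1, 1, 0, 1, 1, 1, 0, 1, 0, 1]  # 7 heads out of 10

def log_posterior(p):
    if not 0 < p < 1:
        return -math.inf
    heads = sum(flips)
    return heads * math.log(p) + (len(flips) - heads) * math.log(1 - p)

samples, p = [], 0.5
for _ in range(20_000):
    proposal = p + random.gauss(0, 0.1)          # random-walk proposal
    delta = log_posterior(proposal) - log_posterior(p)
    if delta >= 0 or random.random() < math.exp(delta):
        p = proposal                             # Metropolis accept/reject
    samples.append(p)

burned = samples[2_000:]                         # discard burn-in
print(f"posterior mean of p: {sum(burned) / len(burned):.3f}")
```

Here the exact posterior is Beta(8, 4) with mean 2/3, so the chain's average should land near 0.667; in realistic breeding models the posterior has no closed form, and sampling is the only practical route.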
Turistik Destinasyonlarda Tüketici Temelli Marka Değerinin Ölçülmesi: Anamur Üzerine Bir Araştırma
Burçin Cevdet ÇETİNSÖZ; Gökhan KARAKEÇİLİ*
2018-01-01
This research aims to investigate the validity of the consumer-based brand value model and the relationships between its dimensions for a tourism destination. The sample group of the study consisted of domestic visitors to Anamur district in June and July. As a data collection tool, a questionnaire was used to measure customer brand values for tourism destinations. In order to analyze the data, statistical techniques such as descriptive analysis (arithmetic mean and frequency analysis), ...
Estrés y burnout docente: conceptos, causas y efectos
Zavala Zavala, José
2012-01-01
In this article the concepts of stress (eustress and distress) and stress coping are reviewed, mainly under a transactional perspective, and burnout, or professional wearing syndrome, is described as a consequence of chronic stress. A series of statistics related to Latin-American teachers' perceptions of stress and health conditions is also offered, raising the suspicion that they may suffer from burnout. Stress has direct effects on the physical and ...
Mechanisms of Radiation Induced Effects in Carbon Nanotubes
2016-10-01
[Figure residue: electrical conductivity, log (S/m), versus fluence (ions/cm²), for metallic, semi-metallic, doped, and semi-doped samples.] ...distributions in SWCNT networks because of both experimental time scale and sample preparation. Selective Au nanoparticle (Au-NP) deposition onto ... deposit metal nanoparticles onto SWCNTs for other applications. The nucleation of Au-NPs has been statistically analyzed as a function of exposure time
Anchored LH2 complexes in 2D polarization imaging.
Tubasum, Sumera; Sakai, Shunsuke; Dewa, Takehisa; Sundström, Villy; Scheblykin, Ivan G; Nango, Mamoru; Pullerits, Tõnu
2013-09-26
Protein is a soft material with inherently large structural disorder. Consequently, the bulk spectroscopies of photosynthetic pigment protein complexes provide averaged information where many details are lost. Here we report spectroscopy of single light-harvesting complexes where fluorescence excitation and detection polarizations are both independently rotated. Two samples of peripheral antenna (LH2) complexes from Rhodopseudomonas acidophila were studied. In one, the complexes were embedded in polyvinyl alcohol (PVA) film; in the other, they were anchored on the glass surface and covered by the PVA film. LH2 contains two rings of pigment molecules: B800 and B850. The B800 excitation polarization properties of the two samples were found to be very similar, indicating that orientation statistics of LH2s are the same in these two very different preparations. At the same time, we found a significant difference in B850 emission polarization statistics. We conclude that the B850 band of the anchored sample is substantially more disordered. We argue that both B800 excitation and B850 emission polarization properties can be explained by the tilt of the anchored LH2s due to the spin-casting of the PVA film on top of the complexes and related shear forces. Due to the tilt, the orientation statistics of two samples become similar. Anchoring is expected to orient the LH2s so that B850 is closer to the substrate. Consequently, the tilt-related strain leads to larger deformation and disorder in B850 than in B800.
Shell model in large spaces and statistical spectroscopy
International Nuclear Information System (INIS)
Kota, V.K.B.
1996-01-01
For many nuclear structure problems of current interest it is essential to deal with the shell model in large spaces. For this, three different approaches are now in use, two of which are: (i) the conventional shell model diagonalization approach, taking into account new advances in computer technology; (ii) the shell model Monte Carlo method. A brief overview of these two methods is given. Large space shell model studies raise fundamental questions regarding the information content of the shell model spectrum of complex nuclei. This led to the third approach: statistical spectroscopy methods. The principles of statistical spectroscopy have their basis in nuclear quantum chaos, and they are described in some detail, substantiated by large scale shell model calculations. (author)
Do we need statistics when we have linguistics?
Directory of Open Access Journals (Sweden)
Cantos Gómez Pascual
2002-01-01
Statistics is known to be a quantitative approach to research. However, most of the research done in the fields of language and linguistics is of a different kind, namely qualitative. Succinctly, qualitative analysis differs from quantitative analysis in that in the former no attempt is made to assign frequencies, percentages and the like to the linguistic features found or identified in the data. In quantitative research, linguistic features are classified and counted, and even more complex statistical models are constructed in order to explain these observed facts. In qualitative research, however, we use the data only for identifying and describing features of language usage and for providing real occurrences/examples of particular phenomena. In this paper, we shall try to show how quantitative methods and statistical techniques can supplement qualitative analyses of language. We shall attempt to present some mathematical and statistical properties of natural languages, and introduce some of the quantitative methods which are of the most value in working empirically with texts and corpora, illustrating the various issues with numerous examples and moving from the most basic descriptive techniques (frequency counts and percentages) to decision-taking techniques (chi-square and z-score) and to more sophisticated statistical language models (Type-Token/Lemma-Token/Lemma-Type formulae, cluster analysis and discriminant function analysis).
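Two of the techniques the abstract names, frequency counts with type-token ratios and the chi-square test, fit in a few lines (the two toy "corpora" below are invented for illustration):

```python
from collections import Counter

text_a = "the cat sat on the mat and the dog sat on the log".split()
text_b = "a cat a dog a bird and a fish walked into a story".split()

# Descriptive techniques: frequency counts and type-token ratio.
freq_a = Counter(text_a)
ttr_a = len(freq_a) / len(text_a)
print("most frequent in A:", freq_a.most_common(2), " TTR:", round(ttr_a, 2))

# Decision-taking technique: 2x2 chi-square for a word's frequency in A vs B.
def chi_square(word):
    a, b = Counter(text_a)[word], Counter(text_b)[word]
    na, nb = len(text_a), len(text_b)
    table = [[a, na - a], [b, nb - b]]
    total = na + nb
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            expected = sum(table[i]) * (table[0][j] + table[1][j]) / total
            chi2 += (obs - expected) ** 2 / expected
    return chi2

print("chi-square for 'the':", round(chi_square("the"), 2))
```

A large chi-square value flags "the" as significantly more characteristic of corpus A than of corpus B, the kind of quantitative evidence the paper argues can supplement a qualitative reading.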
Statistical probability tables CALENDF program
International Nuclear Information System (INIS)
Ribon, P.
1989-01-01
The purposes of the probability tables are to obtain a dense data representation and to calculate integrals by quadratures. They are mainly used in the USA for calculations by Monte Carlo and in the USSR and Europe for self-shielding calculations by the sub-group method. The moment probability tables, in addition to providing a more substantial mathematical basis and calculation methods, are adapted for condensation and mixture calculations, which are the crucial operations for reactor physics specialists. However, their extension is limited by the statistical hypothesis they imply. Efforts are being made to remove this obstacle, at the cost, it must be said, of greater complexity
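The two stated purposes, dense representation and integration by quadrature, can be illustrated with a generic sketch (not the CALENDF algorithm; the lognormal "cross section" and the shielding-like functional are invented for illustration): compress a large sample into a few equal-probability bands and integrate a nonlinear functional over the table.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical fluctuating cross section: lognormal with heavy variation.
sigma_samples = rng.lognormal(mean=0.0, sigma=1.0, size=200_000)

def probability_table(samples, n_bands=8):
    """Dense representation: equal-probability bands, each represented
    by its conditional mean, with weight 1/n_bands."""
    edges = np.quantile(samples, np.linspace(0, 1, n_bands + 1))
    bands = [samples[(samples >= lo) & (samples < hi)]
             for lo, hi in zip(edges, edges[1:])]
    return np.array([b.mean() for b in bands]), np.full(n_bands, 1.0 / n_bands)

def shield(sigma):
    """A self-shielding-like nonlinear functional (illustrative only)."""
    return sigma / (sigma + 1.0)

sig, w = probability_table(sigma_samples)
quadrature = (w * shield(sig)).sum()     # integral from the 8-number table
reference = shield(sigma_samples).mean() # direct average over all samples
print(f"8-band table: {quadrature:.4f}  direct average: {reference:.4f}")
```

Eight numbers reproduce the integral of a nonlinear functional to a few parts in a thousand here, which is the appeal of probability tables for repeated reactor-physics calculations.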
Multivariate statistical analysis of major and trace element data for ...
African Journals Online (AJOL)
Multivariate statistical analysis of major and trace element data for niobium exploration in the peralkaline granites of the anorogenic ring-complex province of Nigeria. PO Ogunleye, EC Ike, I Garba. No abstract available. Journal of Mining and Geology Vol. 40(2) 2004: 107-117.
Quantify the complexity of turbulence
Tao, Xingtian; Wu, Huixuan
2017-11-01
Many researchers have used Reynolds stress, power spectrum and Shannon entropy to characterize a turbulent flow, but few of them have measured the complexity of turbulence. Yet as this study shows, conventional turbulence statistics and Shannon entropy have limits when quantifying the flow complexity. Thus, it is necessary to introduce new complexity measures, such as topology complexity and excess information, to describe turbulence. Our test flow is a classic turbulent cylinder wake at Reynolds number 8100. Along the stream-wise direction, the flow becomes more isotropic and the magnitudes of normal Reynolds stresses decrease monotonically. These seem to indicate that the flow dynamics becomes simpler downstream. However, the Shannon entropy keeps increasing along the flow direction and the dynamics seems to be more complex, because the large-scale vortices cascade to small eddies, and the flow is less correlated and more unpredictable. In fact, these two contradictory observations each partially describe the complexity of a turbulent wake. Our measurements (up to 40 diameters downstream of the cylinder) show that the flow's degree of complexity actually increases at first and then becomes a constant (or drops slightly) along the stream-wise direction. University of Kansas General Research Fund.
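The claimed limitation of one-point statistics can be demonstrated with a small synthetic sketch (invented data, not the authors' measurements): two velocity records with identical variance (the Reynolds-stress proxy) and nearly identical one-point Shannon entropy, yet completely different correlation structure.

```python
import numpy as np

rng = np.random.default_rng(3)

def shannon_entropy(u, bins=32):
    """Entropy (bits) of the histogrammed velocity fluctuations."""
    p, _ = np.histogram(u, bins=bins)
    p = p[p > 0] / p.sum()
    return -(p * np.log2(p)).sum()

# Same variance, different structure: white noise vs. correlated large-scale motion.
n = 20_000
white = rng.standard_normal(n)
correlated = np.convolve(rng.standard_normal(n), np.ones(50) / np.sqrt(50), "same")

for name, u in [("white", white), ("correlated", correlated)]:
    u = u / u.std()  # equalize the Reynolds-stress proxy <u'u'>
    print(f"{name:>10}: var = {u.var():.2f}  H = {shannon_entropy(u):.2f} bits")
```

Both records have unit variance and near-equal histogram entropy, yet one is unpredictable noise and the other coherent large-scale motion; distinguishing them requires measures sensitive to structure, which motivates the complexity measures introduced in the study.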
Statistical cluster analysis and diagnosis of nuclear system level performance
International Nuclear Information System (INIS)
Teichmann, T.; Levine, M.M.; Samanta, P.K.; Kato, W.Y.
1985-01-01
The complexity of individual nuclear power plants and the importance of maintaining reliable and safe operations makes it desirable to complement the deterministic analyses of these plants by corresponding statistical surveys and diagnoses. Based on such investigations, one can then explore, statistically, the anticipation, prevention, and when necessary, the control of such failures and malfunctions. This paper, and the accompanying one by Samanta et al., describe some of the initial steps in exploring the feasibility of setting up such a program on an integrated and global (industry-wide) basis. The conceptual statistical and data framework was originally outlined in BNL/NUREG-51609, NUREG/CR-3026, and the present work aims at showing how some important elements might be implemented in a practical way (albeit using hypothetical or simulated data)
International Nuclear Information System (INIS)
Eliazar, Iddo
2017-01-01
The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
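The defining property of a harmonic Poisson process, a Poisson process on the positive half-line with intensity proportional to 1/x, can be simulated directly (a generic illustration of the object the paper studies; the constants are arbitrary). Under the map x -> log x it becomes homogeneous, which yields both the Poisson count and the scale-invariant (log-uniform) placement of points.

```python
import numpy as np

rng = np.random.default_rng(11)

def harmonic_poisson(c, a, b, trials=20_000):
    """Sample a Poisson process with intensity c/x on [a, b].
    The total mass is c * log(b/a), so the count is Poisson with that mean,
    and given the count the points are i.i.d. log-uniform on [a, b]."""
    mean = c * np.log(b / a)
    counts = rng.poisson(mean, size=trials)
    points = np.exp(rng.uniform(np.log(a), np.log(b), size=counts.sum()))
    return counts, points

counts, points = harmonic_poisson(c=2.0, a=1.0, b=100.0)
print(f"mean count: {counts.mean():.2f}  (theory: {2.0 * np.log(100.0):.2f})")

# Scale invariance: each decade [1, 10] and [10, 100] carries equal mass.
frac_low = np.mean(points < 10.0)
print(f"fraction of points below 10: {frac_low:.3f}  (theory: 0.5)")
```

The equal-mass-per-decade behavior is the source of the connections the paper draws to Pareto statistics, Benford's law, and 1/f noise.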
Energy Technology Data Exchange (ETDEWEB)
Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il
2017-05-15
The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
Borgbjerg, Jens; Bøgsted, Martin; Lindholt, Jes S; Behr-Rasmussen, Carsten; Hørlyck, Arne; Frøkjær, Jens B
2018-02-01
Controversy exists regarding optimal caliper placement in ultrasound assessment of maximum abdominal aortic diameter. This study aimed primarily to determine reproducibility of caliper placement in relation to the aortic wall with the three principal methods: leading to leading edge (LTL), inner to inner edge (ITI), and outer to outer edge (OTO). The secondary aim was to assess the mean difference between the OTO, ITI, and LTL diameters and estimate the impact of using either of these methods on abdominal aortic aneurysm (AAA) prevalence in a screening program. Radiologists (n=18) assessed the maximum antero-posterior abdominal aortic diameter by completing repeated caliper placements with the OTO, LTL, and ITI methods on 50 still abdominal aortic images obtained from an AAA screening program. Inter-observer reproducibility was calculated as the limit of agreement with the mean (LoA), which represents expected deviation of a single observer from the mean of all observers. Intra-observer reproducibility was assessed averaging the LoA for each observer with their repeated measurements. Based on data from an AAA screening trial and the estimated mean differences between the three principal methods, AAA prevalence was estimated using each of the methods. The inter-observer LoA of the OTO, ITI, and LTL was 2.6, 1.9, and 1.9 mm, whereas the intra-observer LoA was 2.0, 1.6, and 1.5 mm, respectively. Mean differences of 5.0 mm were found between OTO and ITI measurements, 2.6 mm between OTO and LTL measurements, and 2.4 mm between LTL and ITI measurements. The prevalence of AAA almost doubled using OTO instead of ITI, while the difference between ITI and LTL was minor (3.3% vs. 4.0% AAA). The study shows superior reproducibility of LTL and ITI compared with the OTO method of caliper placement in ultrasound determination of maximum abdominal aortic diameter, and the choice of caliper placement method significantly affects the prevalence of AAAs in screening programs
Statistical mechanics for a class of quantum statistics
International Nuclear Information System (INIS)
Isakov, S.B.
1994-01-01
Generalized statistical distributions for identical particles are introduced for the case where filling a single-particle quantum state by particles depends on filling states of different momenta. The system of one-dimensional bosons with a two-body potential that can be solved by means of the thermodynamic Bethe ansatz is shown to be equivalent thermodynamically to a system of free particles obeying statistical distributions of the above class. The quantum statistics arising in this way are completely determined by the two-particle scattering phases of the corresponding interacting systems. An equation determining the statistical distributions for these statistics is derived
Social media in Romanian public administration – case study: National Institute of Statistics
Iulia Alexandra Nicolescu; Andreea Mirica
2015-01-01
Social media offers great opportunities, especially considering the widening of transparency in public administration. Given the importance, the challenges and the complexity of social media-based communication in public administration, this paper aims to provide an analysis of the impact that social media has on official statistics communication and dissemination. Using social media as one of the key communication channels in official statistics in Romania has been implemented only since the late 2...
Complexity of Economical Systems
Directory of Open Access Journals (Sweden)
G. P. Pavlos
2015-01-01
In this study new theoretical concepts are described concerning the interpretation of economical complex dynamics. In addition, a summary of an extended algorithm of nonlinear time series analysis is provided, which is applied not only to economical time series but also to other physical complex systems (e.g. [22, 24]). In general, the economy is a vast and complicated set of arrangements and actions wherein agents (consumers, firms, banks, investors, government agencies) buy and sell, speculate, trade, oversee, bring products into being, offer services, invest in companies, strategize, explore, forecast, compete, learn, innovate, and adapt. As a result, economic and financial variables such as foreign exchange rates, gross domestic product, interest rates, production, stock market prices and unemployment exhibit the large-amplitude and aperiodic fluctuations evident in complex systems. Thus, economics can be considered a spatially distributed non-equilibrium complex system, for which new theoretical concepts, such as Tsallis nonextensive statistical mechanics and strange dynamics, percolation, and non-Gaussian, multifractal and multiscale dynamics related to fractional Langevin equations, can be used for modeling and understanding economical complexity locally or globally.
Project risk management in complex petrochemical system
Directory of Open Access Journals (Sweden)
Kirin Snežana
2012-01-01
Investigation of risk in complex industrial systems, as well as evaluation of the main factors influencing the decision making and implementation process, using a large petrochemical company as an example, has proved the importance of successful project risk management. This is even more emphasized when analyzing systems with a complex structure, i.e. with several organizational units. It has been shown that successful risk management requires modern methods, based on adequate application of statistical analysis methods.
Product development projects dynamics and emergent complexity
Schlick, Christopher
2016-01-01
This book primarily explores two topics: the representation of simultaneous, cooperative work processes in product development projects with the help of statistical models, and the assessment of their emergent complexity using a metric from theoretical physics (Effective Measure Complexity, EMC). It is intended to promote more effective management of development projects by shifting the focus from the structural complexity of the product being developed to the dynamic complexity of the development processes involved. The book is divided into four main parts, the first of which provides an introduction to vector autoregression models, periodic vector autoregression models and linear dynamical systems for modeling cooperative work in product development projects. The second part presents theoretical approaches for assessing complexity in the product development environment, while the third highlights and explains closed-form solutions for the complexity metric EMC for vector autoregression models and linear dyn...
Statistical Image Analysis of Tomograms with Application to Fibre Geometry Characterisation
DEFF Research Database (Denmark)
Emerson, Monica Jane
The goal of this thesis is to develop statistical image analysis tools to characterise the micro-structure of complex materials used in energy technologies, with a strong focus on fibre composites. These quantification tools are based on extracting geometrical parameters defining structures from 2D ... with high resolution both in space and time to observe fast micro-structural changes. This thesis demonstrates that statistical image analysis combined with X-ray CT opens up numerous possibilities for understanding the behaviour of fibre composites under real life conditions. Besides enabling
Statistics with JMP graphs, descriptive statistics and probability
Goos, Peter
2015-01-01
Peter Goos, Department of Statistics, University of Leuven, Faculty of Bio-Science Engineering and University of Antwerp, Faculty of Applied Economics, Belgium; David Meintrup, Department of Mathematics and Statistics, University of Applied Sciences Ingolstadt, Faculty of Mechanical Engineering, Germany. Thorough presentation of introductory statistics and probability theory, with numerous examples and applications using JMP. Descriptive Statistics and Probability provides an accessible and thorough overview of the most important descriptive statistics for nominal, ordinal and quantitative data with partic
Implementing the “Big Data” Concept in Official Statistics
Directory of Open Access Journals (Sweden)
О. V.
2017-02-01
Big data is a huge resource that needs to be used at all levels of economic planning. The article is devoted to the study of the development of the concept of “Big Data” in the world and its impact on the transformation of statistical simulation of economic processes. Statistics at the current stage should take into account the complex system of international economic relations, which functions in the conditions of globalization and brings new forms of economic development in small open economies. Statistical science should take into account such phenomena as the gig economy, the common economy, institutional factors, etc. The concepts of “Big Data” and open data are analyzed, and problems of implementing “Big Data” in official statistics are shown. The ways of implementing “Big Data” in the official statistics of Ukraine through active use of the technological opportunities of mobile operators, navigation systems, surveillance cameras, social networks, etc. are presented. The possibilities of using “Big Data” in different sectors of the economy, including at the level of companies, are shown. The problems of storing large volumes of data are highlighted. The study shows that “Big Data” is a huge resource that should be used across the Ukrainian economy.
Perspectives and challenges in statistical physics and complex systems for the next decade
Raposo, Ernesto P; Gomes Eleutério da Luz, Marcos
2014-01-01
Statistical Physics (SP) has followed an unusual evolutionary path in science. Originally aiming to provide a fundamental basis for another important branch of Physics, namely Thermodynamics, SP gradually became an independent field of research in its own right. But despite more than a century of steady progress, there are still plenty of challenges and open questions in the SP realm. In fact, the area is still rapidly evolving, in contrast to other branches of science, which already have well defined scopes and borderlines of applicability. This difference is due to the steadily expanding num
Statistical mechanics of two-dimensional and geophysical flows
International Nuclear Information System (INIS)
Bouchet, Freddy; Venaille, Antoine
2012-01-01
The theoretical study of the self-organization of two-dimensional and geophysical turbulent flows is addressed based on statistical mechanics methods. This review is a self-contained presentation of classical and recent works on this subject, from the statistical mechanics basis of the theory up to applications to Jupiter's troposphere and ocean vortices and jets. Emphasis has been placed on examples with available analytical treatment in order to favor a better understanding of the physics and dynamics. After a brief presentation of the 2D Euler and quasi-geostrophic equations, the specificity of two-dimensional and geophysical turbulence is emphasized. The equilibrium microcanonical measure is built from the Liouville theorem. Important statistical mechanics concepts (large deviations and mean field approach) and thermodynamic concepts (ensemble inequivalence and negative heat capacity) are briefly explained and described. On this theoretical basis, we predict the output of the long time evolution of complex turbulent flows as statistical equilibria. This is applied to make quantitative models of two-dimensional turbulence, the Great Red Spot and other Jovian vortices, ocean jets like the Gulf Stream, and ocean vortices. A detailed comparison between these statistical equilibria and real flow observations is provided. We also present recent results for non-equilibrium situations, for the studies of either the relaxation towards equilibrium or non-equilibrium steady states. In this last case, forces and dissipation are in a statistical balance; fluxes of conserved quantities characterize the system, and microcanonical or other equilibrium measures no longer describe the system.
A Statistical Approach For Modeling Tropical Cyclones. Synthetic Hurricanes Generator Model
Energy Technology Data Exchange (ETDEWEB)
Pasqualini, Donatella [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-05-11
This manuscript briefly describes a statistical approach to generate synthetic tropical cyclone tracks to be used in risk evaluations. The Synthetic Hurricane Generator (SynHurG) model allows modeling hurricane risk in the United States, supporting decision makers and implementations of adaptation strategies to extreme weather. In the literature there are mainly two approaches to model hurricane hazard for risk prediction: deterministic-statistical approaches, where the storm key physical parameters are calculated using complex physical climate models and the tracks are usually determined statistically from historical data; and statistical approaches, where both variables and tracks are estimated stochastically using historical records. SynHurG falls in the second category, adopting a pure stochastic approach.
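The pure stochastic approach can be caricatured in a toy generator (this is not SynHurG; the genesis region, heading, speed and recurvature distributions below are invented placeholders for distributions that would be fitted to historical records):

```python
import math
import random

random.seed(9)

def synthetic_track(steps=30):
    """One synthetic track: genesis point, then stepwise heading/speed draws."""
    lat, lon = random.uniform(10, 20), random.uniform(-60, -40)  # genesis region
    heading = math.radians(random.gauss(135, 20))  # north-westward (deg CCW from east)
    track = [(lat, lon)]
    for _ in range(steps):
        speed = max(0.05, random.gauss(0.3, 0.1))        # degrees per 6-hour step
        heading += math.radians(random.gauss(1.0, 5.0))  # slow recurvature + noise
        lat += speed * math.sin(heading)
        lon += speed * math.cos(heading)
        track.append((lat, lon))
    return track

tracks = [synthetic_track() for _ in range(1000)]
mean_final_lat = sum(t[-1][0] for t in tracks) / len(tracks)
print(f"mean final latitude over 1000 synthetic tracks: {mean_final_lat:.1f} deg")
```

Generating thousands of such tracks, with distributions estimated from the historical record rather than assumed, is what lets a stochastic model quantify landfall risk beyond the few storms actually observed.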
Characterization of time series via Rényi complexity-entropy curves
Jauregui, M.; Zunino, L.; Lenzi, E. K.; Mendes, R. S.; Ribeiro, H. V.
2018-05-01
One of the most useful tools for distinguishing between chaotic and stochastic time series is the so-called complexity-entropy causality plane. This diagram involves two complexity measures: the Shannon entropy and the statistical complexity. Recently, this idea has been generalized by considering the Tsallis monoparametric generalization of the Shannon entropy, yielding complexity-entropy curves. These curves have proven to enhance the discrimination among different time series related to stochastic and chaotic processes of numerical and experimental nature. Here we further explore these complexity-entropy curves in the context of the Rényi entropy, which is another monoparametric generalization of the Shannon entropy. By combining the Rényi entropy with the proper generalization of the statistical complexity, we associate a parametric curve (the Rényi complexity-entropy curve) with a given time series. We explore this approach in a series of numerical and experimental applications, demonstrating the usefulness of this new technique for time series analysis. We show that the Rényi complexity-entropy curves enable the differentiation among time series of chaotic, stochastic, and periodic nature. In particular, time series of stochastic nature are associated with curves displaying positive curvature in a neighborhood of their initial points, whereas curves related to chaotic phenomena have a negative curvature; finally, periodic time series are represented by vertical straight lines.
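The entropy coordinate of the complexity-entropy construction can be sketched in a few lines. The snippet below computes Bandt-Pompe ordinal-pattern probabilities and the normalized Rényi entropy for a stochastic and a chaotic series; it is a minimal illustration of the entropy half of the plane only, and omits the generalized statistical-complexity coordinate that the paper pairs with it.

```python
import math
import random
from collections import Counter
from itertools import permutations

def ordinal_probs(series, d=3):
    """Bandt-Pompe ordinal-pattern probabilities over all d! patterns."""
    windows = zip(*(series[i:] for i in range(d)))
    counts = Counter(tuple(sorted(range(d), key=lambda i: w[i])) for w in windows)
    n = sum(counts.values())
    return [counts.get(p, 0) / n for p in permutations(range(d))]

def renyi_entropy(probs, alpha):
    """Renyi entropy normalized by the log of the support size; alpha=1 -> Shannon."""
    nz = [p for p in probs if p > 0]
    if abs(alpha - 1.0) < 1e-12:
        h = -sum(p * math.log(p) for p in nz)
    else:
        h = math.log(sum(p ** alpha for p in nz)) / (1.0 - alpha)
    return h / math.log(len(probs))

random.seed(1)
noise = [random.gauss(0.0, 1.0) for _ in range(10000)]   # stochastic series
logistic = [0.4]
for _ in range(9999):
    logistic.append(4.0 * logistic[-1] * (1.0 - logistic[-1]))  # chaotic series

for alpha in (0.5, 1.0, 2.0):
    print(alpha, renyi_entropy(ordinal_probs(noise), alpha),
          renyi_entropy(ordinal_probs(logistic), alpha))
```

White noise visits all ordinal patterns nearly uniformly (entropy close to 1), while the logistic map has forbidden patterns and lands lower for every alpha, which is the separation the curves exploit.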
Towards the disease biomarker in an individual patient using statistical health monitoring
Engel, J.; Blanchet, L.M.; Engelke, U.F.; Wevers, R.A.; Buydens, L.M.
2014-01-01
In metabolomics, identification of complex diseases is often based on application of (multivariate) statistical techniques to the data. Commonly, each disease requires its own specific diagnostic model, separating healthy and diseased individuals, which is not very practical in a diagnostic setting.
Preliminary study on the time-related changes of the infrared thermal images of the human body
Li, Ziru; Zhang, Xusheng; Lin, Gang; Chen, Zhigang
2009-08-01
It is of great importance to study the manifestations and the influencing factors of the time-related changes of infrared thermal images (ITI) of human body since the variable body surface temperature distribution seriously affected the application of ITI in medicine. In this paper, manifestations of time-related changes of the ITI of human body from three double-blind randomized trials and their correlation with meteorological factors (e.g. temperature, pressure, humidity, cold front passage and tropical cyclone landing) were studied. The trials were placebo or drug controlled studying the influences of Chinese medicine health food (including Shengsheng capsule with immunity adjustment function, Shengan capsule with sleep improvement function and Shengyi capsule with the function of helping to decrease serum lipid) on the ITI of human body. In the first thirty-six days of the trials images were scanned every six days and image data in the seven observation time spots (including the 0, 6, 12, 18, 24, 30, 36 day of the trial) were used for the time-related study. For every subject the scanned time was fixed in the day within two hours. The ITI features which could reflect the functions of the health foods were studied. The indexes of the features were relative magnitude (temperature difference between the viewing area and the reference area). Results showed that the variation tendencies of the trial group and control group were basically the same in placebo controlled trials and some of the long-term effects of Chinese medicine health food could be reflected significantly in certain time spots in the first thirty-six days. Time-related changes of the ITI of human body were closely related with meteorological factors but there were other influencing factors still need to be studied. As the ITI of human body could reflect the influences of Chinese medicine health foods and are closely related with meteorology, there are bright prospects for the application of ITI in
Virial-statistic method for calculation of atom and molecule energies
International Nuclear Information System (INIS)
Borisov, Yu.A.
1977-01-01
A virial-statistical method has been applied to the calculation of the atomization energies of the following molecules: Mo(CO) 6 , Cr(CO) 6 , Fe(CO) 5 , MnH(CO) 5 , CoH(CO) 4 , Ni(CO) 4 . The principles of this method are briefly presented. Calculation results are given for the individual contributions to the atomization energies together with the calculated and experimental atomization energies (D). For the Mo(CO) 6 complex Dsub(calc) = 1759 and Dsub(exp) = 1763 kcal/mole. Calculated and experimental combination heat values for carbonyl complexes are presented. These values are shown to be adequately consistent [ru
Statistics Anxiety and Business Statistics: The International Student
Bell, James A.
2008-01-01
Does the international student suffer from statistics anxiety? To investigate this, the Statistics Anxiety Rating Scale (STARS) was administered to sixty-six beginning statistics students, including twelve international students and fifty-four domestic students. Due to the small number of international students, nonparametric methods were used to…
Workshop on Nonlinear Phenomena in Complex Systems
1989-01-01
This book contains a thorough treatment of neural networks, cellular-automata and synergetics, in an attempt to provide three different approaches to nonlinear phenomena in complex systems. These topics are of major interest to physicists active in the fields of statistical mechanics and dynamical systems. They have been developed with a high degree of sophistication and include the refinements necessary to work with the complexity of real systems as well as the more recent research developments in these areas.
Complex fragment emission from hot compound nuclei
International Nuclear Information System (INIS)
Moretto, L.G.
1986-03-01
The experimental evidence for compound nucleus emission of complex fragments at low energies is used to interpret the emission of the same fragments at higher energies. The resulting experimental picture is that of highly excited compound nuclei formed in incomplete fusion processes which decay statistically. In particular, complex fragments appear to be produced mostly through compound nucleus decay. In the appendix a geometric-kinematic theory for incomplete fusion and the associated momentum transfer is outlined. 10 refs., 19 figs
Health-Care Waste Treatment Technology Selection Using the Interval 2-Tuple Induced TOPSIS Method
Directory of Open Access Journals (Sweden)
Chao Lu
2016-06-01
Full Text Available Health-care waste (HCW management is a major challenge for municipalities, particularly in the cities of developing nations. Selecting the best treatment technology for HCW can be regarded as a complex multi-criteria decision making (MCDM issue involving a number of alternatives and multiple evaluation criteria. In addition, decision makers tend to express their personal assessments via multi-granularity linguistic term sets because of different backgrounds and knowledge, some of which may be imprecise, uncertain and incomplete. Therefore, the main objective of this study is to propose a new hybrid decision making approach combining interval 2-tuple induced distance operators with the technique for order preference by similarity to an ideal solution (TOPSIS for tackling HCW treatment technology selection problems with linguistic information. The proposed interval 2-tuple induced TOPSIS (ITI-TOPSIS can not only model the uncertainty and diversity of the assessment information given by decision makers, but also reflect the complex attitudinal characters of decision makers and provide much more complete information for the selection of the optimum disposal alternative. Finally, an empirical example in Shanghai, China is provided to illustrate the proposed decision making method, and results show that the ITI-TOPSIS proposed in this paper can solve the problem of HCW treatment technology selection effectively.
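The TOPSIS backbone of the proposed method can be sketched with the classical crisp variant; the paper's interval 2-tuple induced operators, which handle the linguistic and uncertain assessments, are not reproduced here, and the alternative scores below are purely hypothetical.

```python
import math

def topsis(matrix, weights, benefit):
    """Classical TOPSIS on a crisp decision matrix.
    matrix[i][j]: score of alternative i on criterion j;
    benefit[j]: True if larger-is-better for criterion j."""
    m, n = len(matrix), len(matrix[0])
    # vector-normalize each column, then apply the criterion weights
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    anti  = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos = math.dist(row, ideal)   # distance to the ideal solution
        d_neg = math.dist(row, anti)    # distance to the anti-ideal solution
        scores.append(d_neg / (d_pos + d_neg))  # closeness coefficient
    return scores

# hypothetical scores for three HCW treatment technologies on four criteria
# (cost and emissions: smaller is better; capacity and reliability: larger is better)
matrix  = [[5.0, 3.0, 8.0, 7.0],
           [3.0, 6.0, 6.0, 8.0],
           [7.0, 2.0, 9.0, 5.0]]
weights = [0.3, 0.3, 0.2, 0.2]
benefit = [False, False, True, True]
scores = topsis(matrix, weights, benefit)
best = max(range(len(scores)), key=scores.__getitem__)
print(scores, "best alternative:", best)
```

The induced interval 2-tuple machinery in the paper replaces the crisp matrix and distances above with linguistic-interval counterparts, but the ranking step is the same closeness-coefficient idea.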
Spreadsheets as tools for statistical computing and statistics education
Neuwirth, Erich
2000-01-01
Spreadsheets are a ubiquitous program category, and we will discuss their use in statistics and statistics education on various levels, ranging from very basic examples to extremely powerful methods. Since the spreadsheet paradigm is very familiar to many potential users, using it as the interface to statistical methods can make statistics more easily accessible.
Relativistic effects on complexity indexes in atoms in position and momentum spaces
International Nuclear Information System (INIS)
Maldonado, P.; Sarsa, A.; Buendia, E.; Galvez, F.J.
2010-01-01
Three different statistical measures of complexity are explored for the atoms He to Ra. The measures are analysed in both position and momentum spaces. Relativistic effects on the complexity indexes are systematically studied. These effects are discussed in terms of the information content factor and the disorder terms of the complexity indexes. Relativistic and non-relativistic complexity indexes are calculated from Optimized Effective Potential densities.
Register-based statistics statistical methods for administrative data
Wallgren, Anders
2014-01-01
This book provides a comprehensive and up to date treatment of theory and practical implementation in Register-based statistics. It begins by defining the area, before explaining how to structure such systems, as well as detailing alternative approaches. It explains how to create statistical registers, how to implement quality assurance, and the use of IT systems for register-based statistics. Further to this, clear details are given about the practicalities of implementing such statistical methods, such as protection of privacy and the coordination and coherence of such an undertaking. Thi
Directory of Open Access Journals (Sweden)
Peter M Visscher
2014-04-01
We have recently developed analysis methods (GREML) to estimate the genetic variance of a complex trait/disease and the genetic correlation between two complex traits/diseases using genome-wide single nucleotide polymorphism (SNP) data in unrelated individuals. Here we use analytical derivations and simulations to quantify the sampling variance of the estimate of the proportion of phenotypic variance captured by all SNPs for quantitative traits and case-control studies. We also derive the approximate sampling variance of the estimate of a genetic correlation in a bivariate analysis, when two complex traits are either measured on the same or different individuals. We show that the sampling variance is inversely proportional to the number of pairwise contrasts in the analysis and to the variance in SNP-derived genetic relationships. For bivariate analysis, the sampling variance of the genetic correlation additionally depends on the harmonic mean of the proportion of variance explained by the SNPs for the two traits and the genetic correlation between the traits, and depends on the phenotypic correlation when the traits are measured on the same individuals. We provide an online tool for calculating the power of detecting genetic (co)variation using genome-wide SNP data. The new theory and online tool will be helpful for planning experimental designs to estimate the missing heritability that has not yet been fully revealed through genome-wide association studies, and for estimating the genetic overlap between complex traits (diseases), in particular when the traits (diseases) are not measured on the same samples.
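The inverse-proportionality result above turns into a quick back-of-the-envelope power calculation. A minimal sketch for the quantitative-trait case, assuming the commonly quoted value of about 2e-5 for the variance of SNP-derived relationships among conventionally unrelated individuals (an assumption on our part; the authors' online tool is the authoritative calculator):

```python
import math

# approximate variance of SNP-derived genetic relationships among
# conventionally unrelated individuals (an assumed, commonly quoted value)
VAR_REL = 2e-5

def se_h2_snp(n, var_rel=VAR_REL):
    """Approximate standard error of the SNP-heritability estimate for a
    quantitative trait on n unrelated individuals: var ~ 2 / (n^2 * var_rel)."""
    return math.sqrt(2.0 / (n ** 2 * var_rel))

for n in (3000, 10000, 50000):
    print(n, round(se_h2_snp(n), 4))
```

Because the standard error shrinks like 1/n under this approximation, halving the uncertainty requires doubling the sample size, which is why SNP-heritability studies need tens of thousands of individuals.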
Statistical assessment on a combined analysis of GRYN-ROMN-UCBN upland vegetation vital signs
Irvine, Kathryn M.; Rodhouse, Thomas J.
2014-01-01
As of 2013, Rocky Mountain and Upper Columbia Basin Inventory and Monitoring Networks have multiple years of vegetation data and Greater Yellowstone Network has three years of vegetation data and monitoring is ongoing in all three networks. Our primary objective is to assess whether a combined analysis of these data aimed at exploring correlations with climate and weather data is feasible. We summarize the core survey design elements across protocols and point out the major statistical challenges for a combined analysis at present. The dissimilarity in response designs between ROMN and UCBN-GRYN network protocols presents a statistical challenge that has not been resolved yet. However, the UCBN and GRYN data are compatible as they implement a similar response design; therefore, a combined analysis is feasible and will be pursued in future. When data collected by different networks are combined, the survey design describing the merged dataset is (likely) a complex survey design. A complex survey design is the result of combining datasets from different sampling designs. A complex survey design is characterized by unequal probability sampling, varying stratification, and clustering (see Lohr 2010 Chapter 7 for general overview). Statistical analysis of complex survey data requires modifications to standard methods, one of which is to include survey design weights within a statistical model. We focus on this issue for a combined analysis of upland vegetation from these networks, leaving other topics for future research. We conduct a simulation study on the possible effects of equal versus unequal probability selection of points on parameter estimates of temporal trend using available packages within the R statistical computing package. We find that, as written, using lmer or lm for trend detection in a continuous response and clm and clmm for visually estimated cover classes with “raw” GRTS design weights specified for the weight argument leads to substantially
Light-driven molecular machine at ITIES
DEFF Research Database (Denmark)
Kornyshev, A.A.; Kuimova, M.; Kuznetsov, A.M.
2007-01-01
We suggest a principle of operation of a new molecular device that transforms the energy of light into repetitive mechanical motions. Such a device can also serve as a model system for the study of the effect of electric field on intramolecular electron transfer. We discuss the design of suitable...
Light-driven molecular machine at ITIES
International Nuclear Information System (INIS)
Kornyshev, Alexei A; Kuimova, Marina; Kuznetsov, Alexander M; Ulstrup, Jens; Urbakh, Michael
2007-01-01
We suggest a principle of operation of a new molecular device that transforms the energy of light into repetitive mechanical motions. Such a device can also serve as a model system for the study of the effect of electric field on intramolecular electron transfer. We discuss the design of suitable molecular systems and the methods that may monitor the 'performance' of such a machine
Marketing Use of Instagram (Marketingové využitie Instagramu)
Poláková, Lucia
2015-01-01
This bachelor thesis deals with the use of Instagram for marketing purposes. The theoretical part includes basic information about the Internet, online marketing and social media, but it mainly focuses on Instagram: its principles, the companies that use this medium, and successful campaigns run on it. The practical part examines the activities of the company Birell on Instagram and evaluates its marketing campaign on this social medium.
Sparse approximation of currents for statistics on curves and surfaces.
Durrleman, Stanley; Pennec, Xavier; Trouvé, Alain; Ayache, Nicholas
2008-01-01
Computing, processing, and visualizing statistics on shapes such as curves or surfaces is a real challenge, with many applications ranging from medical image analysis to computational geometry. Modelling such geometrical primitives with currents avoids feature-based approaches as well as point-correspondence methods. This framework has proven powerful for registering brain surfaces and for measuring geometrical invariants. However, while state-of-the-art methods perform pairwise registrations efficiently, new numerical schemes are required to process groupwise statistics, whose complexity grows with the size of the database. Statistics such as the mean and principal modes of a set of shapes often have a heavy and highly redundant representation. We therefore propose to find an adapted basis on which the mean and principal modes have a sparse decomposition. Besides the computational improvement, this sparse representation offers a way to visualize and interpret statistics on currents. Experiments show the relevance of the approach on 34 sets of 70 sulcal lines and on 50 sets of 10 meshes of deep brain structures.
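The sparse-decomposition idea can be illustrated with plain greedy matching pursuit on a toy dictionary. The paper works with dictionaries adapted to currents on curves and surfaces; the orthogonal sinusoid dictionary below is only a hand-rolled stand-in chosen so the recovery is exact and easy to check.

```python
import math

def matching_pursuit(signal, dictionary, n_atoms):
    """Greedy matching pursuit: approximate `signal` as a sparse
    combination of unit-norm dictionary atoms."""
    residual = list(signal)
    picks = []
    for _ in range(n_atoms):
        # pick the atom most correlated with the current residual
        best = max(range(len(dictionary)),
                   key=lambda k: abs(sum(r * a for r, a in zip(residual, dictionary[k]))))
        coef = sum(r * a for r, a in zip(residual, dictionary[best]))
        residual = [r - coef * a for r, a in zip(residual, dictionary[best])]
        picks.append((best, coef))
    return picks, residual

# toy dictionary: unit-norm sampled sinusoids (mutually orthogonal)
N = 64
dictionary = []
for f in range(1, 9):
    atom = [math.sin(2 * math.pi * f * t / N) for t in range(N)]
    norm = math.sqrt(sum(a * a for a in atom))
    dictionary.append([a / norm for a in atom])

# a "mean shape" signal built from atoms 2 and 5; MP should recover exactly those
signal = [3.0 * a + 1.5 * b for a, b in zip(dictionary[2], dictionary[5])]
picks, residual = matching_pursuit(signal, dictionary, 2)
print(picks)
```

The point mirrors the abstract: once an adapted basis exists, a mean or principal mode that looks heavy and redundant collapses to a handful of (atom, coefficient) pairs that are easy to store, visualize and interpret.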
Software Used to Generate Cancer Statistics - SEER Cancer Statistics
Videos that highlight topics and trends in cancer statistics and definitions of statistical terms. Also software tools for analyzing and reporting cancer statistics, which are used to compile SEER's annual reports.
Probability density cloud as a geometrical tool to describe statistics of scattered light.
Yaitskova, Natalia
2017-04-01
First-order statistics of scattered light is described using the representation of the probability density cloud, which visualizes a two-dimensional distribution for complex amplitude. The geometric parameters of the cloud are studied in detail and are connected to the statistical properties of phase. The moment-generating function for intensity is obtained in a closed form through these parameters. An example of exponentially modified normal distribution is provided to illustrate the functioning of this geometrical approach.
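The cloud construction can be sketched numerically: sample a complex amplitude, treat the two-dimensional distribution of its real and imaginary parts as the "cloud", and read first-order intensity statistics off its geometry. The fully developed speckle case below (circular Gaussian amplitude) is an assumed illustrative example, not the exponentially modified case worked out in the paper.

```python
import math
import random

random.seed(42)
M = 200_000
# complex amplitude with independent Gaussian quadratures ("fully developed speckle")
re = [random.gauss(0.0, 1.0) for _ in range(M)]
im = [random.gauss(0.0, 1.0) for _ in range(M)]

# geometric parameters of the probability density cloud in the complex plane
centroid = (sum(re) / M, sum(im) / M)
spread_re = math.sqrt(sum(x * x for x in re) / M)
spread_im = math.sqrt(sum(y * y for y in im) / M)

# first-order intensity statistics follow from the cloud's geometry
I = [x * x + y * y for x, y in zip(re, im)]
mean_I = sum(I) / M
contrast = math.sqrt(sum((i - mean_I) ** 2 for i in I) / M) / mean_I
print("centroid:", centroid)
print("spreads:", spread_re, spread_im, "mean I:", mean_I, "contrast:", contrast)
```

A centered, circularly symmetric cloud yields the exponential intensity law with unit contrast; shifting or deforming the cloud changes the phase statistics and hence the intensity distribution, which is the connection the paper formalizes through the moment-generating function.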
Understanding Statistics and Statistics Education: A Chinese Perspective
Shi, Ning-Zhong; He, Xuming; Tao, Jian
2009-01-01
In recent years, statistics education in China has made great strides. However, there still exists a fairly large gap with the advanced levels of statistics education in more developed countries. In this paper, we identify some existing problems in statistics education in Chinese schools and make some proposals as to how they may be overcome. We…
Pestman, Wiebe R
2009-01-01
This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.
Bayesian statistics in radionuclide metrology: measurement of a decaying source
International Nuclear Information System (INIS)
Bochud, F. O.; Bailat, C.J.; Laedermann, J.P.
2007-01-01
The most intuitive way of defining a probability is perhaps through the frequency at which it appears when a large number of trials are realized in identical conditions. The probability derived from the obtained histogram characterizes the so-called frequentist or conventional statistical approach. In this sense, probability is defined as a physical property of the observed system. By contrast, in Bayesian statistics, a probability is not a physical property or a directly observable quantity, but a degree of belief or an element of inference. The goal of this paper is to show how Bayesian statistics can be used in radionuclide metrology and what its advantages and disadvantages are compared with conventional statistics. This is performed through the example of an yttrium-90 source typically encountered in environmental surveillance measurement. Because of the very low activity of this kind of source and the small half-life of the radionuclide, this measurement takes several days, during which the source decays significantly. Several methods are proposed to compute simultaneously the number of unstable nuclei at a given reference time, the decay constant and the background. Asymptotically, all approaches give the same result. However, Bayesian statistics produces coherent estimates and confidence intervals in a much smaller number of measurements. Apart from the conceptual understanding of statistics, the main difficulty that could deter radionuclide metrologists from using Bayesian statistics is the complexity of the computation. (authors)
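The simultaneous estimation described above can be sketched with a small grid posterior. Everything numeric below is invented for illustration (rates, grids, ten daily measurements), the decay constant is held fixed rather than estimated, and a Gaussian approximation replaces the Poisson counting likelihood; the paper's actual analysis is richer.

```python
import math
import random

random.seed(0)
# --- simulate a measurement of a decaying source over ten days ---
T_HALF = 2.67                               # days, yttrium-90 half-life
lam = math.log(2.0) / T_HALF                # decay constant, assumed known here
TRUE_N0, TRUE_BKG = 200.0, 5.0              # invented rates: source at t=0, background
days = list(range(10))
counts = [random.gauss(TRUE_N0 * math.exp(-lam * t) + TRUE_BKG,
                       math.sqrt(TRUE_N0 * math.exp(-lam * t) + TRUE_BKG))
          for t in days]

# --- grid posterior over (initial rate n0, background b), flat priors ---
post = {}
for n0 in range(150, 251):
    for b10 in range(0, 41):                # background grid 0.0 .. 20.0 in 0.5 steps
        b = 0.5 * b10
        ll = 0.0
        for t, c in zip(days, counts):
            mu = n0 * math.exp(-lam * t) + b
            # Gaussian approximation to the Poisson counting likelihood
            ll += -0.5 * (c - mu) ** 2 / mu - 0.5 * math.log(mu)
        post[(n0, b)] = ll

mx = max(post.values())
w = {k: math.exp(v - mx) for k, v in post.items()}
z = sum(w.values())
n0_mean = sum(k[0] * v for k, v in w.items()) / z
bkg_mean = sum(k[1] * v for k, v in w.items()) / z
print(f"posterior means: n0 ~ {n0_mean:.1f}, background ~ {bkg_mean:.1f}")
```

The coherence advantage the abstract mentions shows up here: the joint posterior delivers estimates and credible intervals for source and background from the same few measurements, instead of requiring separate long background runs.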
Sampling, Probability Models and Statistical Reasoning: Statistical Inference
Indian Academy of Sciences (India)
Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 49-58. Mohan Delampady and V. R. Padmawar. General Article.
Mapping and discrimination of networks in the complexity-entropy plane
Wiedermann, Marc; Donges, Jonathan F.; Kurths, Jürgen; Donner, Reik V.
2017-10-01
Complex networks are usually characterized in terms of their topological, spatial, or information-theoretic properties and combinations of the associated metrics are used to discriminate networks into different classes or categories. However, even with the present variety of characteristics at hand it still remains a subject of current research to appropriately quantify a network's complexity and correspondingly discriminate between different types of complex networks, like infrastructure or social networks, on such a basis. Here we explore the possibility to classify complex networks by means of a statistical complexity measure that has formerly been successfully applied to distinguish different types of chaotic and stochastic time series. It is composed of a network's averaged per-node entropic measure characterizing the network's information content and the associated Jenson-Shannon divergence as a measure of disequilibrium. We study 29 real-world networks and show that networks of the same category tend to cluster in distinct areas of the resulting complexity-entropy plane. We demonstrate that within our framework, connectome networks exhibit among the highest complexity while, e.g., transportation and infrastructure networks display significantly lower values. Furthermore, we demonstrate the utility of our framework by applying it to families of random scale-free and Watts-Strogatz model networks. We then show in a second application that the proposed framework is useful to objectively construct threshold-based networks, such as functional climate networks or recurrence networks, by choosing the threshold such that the statistical network complexity is maximized.
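The entropy-times-disequilibrium construction can be sketched on degree sequences alone. This is a simplified proxy for the paper's averaged per-node entropic measure: here H is the normalized Shannon entropy of the degree distribution and C = H * JSD, with JSD the Jensen-Shannon divergence from the uniform distribution, and the "random graph" is just an independent binomial degree sequence rather than an actual sampled graph.

```python
import math
import random

def entropy_complexity(degrees):
    """Entropy-complexity coordinates from a degree sequence: normalized
    Shannon entropy H of the degree distribution and C = H * JSD, where
    JSD is the Jensen-Shannon divergence from the uniform distribution."""
    dist = [degrees.count(k) / len(degrees) for k in range(max(degrees) + 1)]
    u = [1.0 / len(dist)] * len(dist)
    def H(q):
        return -sum(p * math.log(p) for p in q if p > 0)
    h_norm = H(dist) / math.log(len(dist))
    mid = [(a + b) / 2.0 for a, b in zip(dist, u)]
    jsd = H(mid) - 0.5 * H(dist) - 0.5 * H(u)
    return h_norm, h_norm * jsd

random.seed(3)
n = 200
ring_degrees = [2] * n   # ring lattice: perfectly regular, zero entropy
# independent binomial degrees as a stand-in for an Erdos-Renyi graph
er_degrees = [sum(random.random() < 0.05 for _ in range(n - 1)) for _ in range(n)]

print("ring lattice:", entropy_complexity(ring_degrees))
print("random graph:", entropy_complexity(er_degrees))
```

The regular lattice sits at the origin of the plane (zero entropy, zero complexity) while the disordered degree sequence lands at finite entropy and complexity, which is the kind of separation the paper exploits to cluster network categories.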
On advisability of developing automatic complexes of radiation flow detection
International Nuclear Information System (INIS)
Akopov, V.S.; Voronin, S.A.; Meshalkin, I.A.
1976-01-01
On the basis of mathematical treatment of statistical data obtained by inquest of specialists from a number of factories, problems associated with the determination of the most acceptable efficiency of radiation defectoscopy automatized complexes are considered. Production requirements for radiation control sensitivity are generalized. The use of providing the complexes with computer technique is substantiated
Non-statistical behavior of coupled optical systems
International Nuclear Information System (INIS)
Perez, G.; Pando Lambruschini, C.; Sinha, S.; Cerdeira, H.A.
1991-10-01
We study globally coupled chaotic maps modeling an optical system, and find clear evidence of non-statistical behavior: the mean square deviation (MSD) of the mean field saturates with respect to increase in the number of elements coupled, after a critical value, and its distribution is clearly non-Gaussian. We also find that the power spectrum of the mean field displays well defined peaks, indicating a subtle coherence among different elements, even in the ''turbulent'' phase. This system is a physically realistic model that may be experimentally realizable. It is also a higher dimensional example (as each individual element is given by a complex map). Its study confirms that the phenomena observed in a wide class of coupled one-dimensional maps are present here as well. This gives more evidence to believe that such non-statistical behavior is probably generic in globally coupled systems. We also investigate the influence of parametric fluctuations on the MSD. (author). 10 refs, 7 figs, 1 tab
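The mean-field MSD diagnostic above can be reproduced on the simpler one-dimensional cousins the abstract mentions. The snippet uses globally coupled logistic maps as a stand-in for the complex optical maps of the paper, with invented parameters; whether the MSD visibly saturates rather than falling like 1/n depends on the parameter regime, so the code only computes the diagnostic.

```python
import random

def mean_field_msd(n, a=1.9, eps=0.1, steps=2000, burn=500, seed=123):
    """Mean-square deviation of the mean field h_t = (1/n) sum_i f(x_i(t))
    for n globally coupled maps x_i' = (1 - eps) f(x_i) + eps h."""
    f = lambda x: 1.0 - a * x * x           # logistic map in the 1 - a x^2 form
    rng = random.Random(seed)
    x = [rng.uniform(-1.0, 1.0) for _ in range(n)]
    hs = []
    for t in range(steps):
        fx = [f(v) for v in x]
        h = sum(fx) / n                     # instantaneous mean field
        x = [(1.0 - eps) * v + eps * h for v in fx]
        if t >= burn:                       # discard the transient
            hs.append(h)
    m = sum(hs) / len(hs)
    return sum((h - m) ** 2 for h in hs) / len(hs)

# if the elements averaged out like independent variables, the MSD would fall
# roughly as 1/n; slower decay or saturation signals hidden coherence
for n in (10, 100, 1000):
    print(n, mean_field_msd(n))
```

Plotting this MSD against n (and a power spectrum of the stored mean-field series) is exactly the kind of analysis the abstract describes for detecting non-statistical behavior.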
Directory of Open Access Journals (Sweden)
Suhrad G Banugaria
OBJECTIVE: Although enzyme replacement therapy (ERT) is a highly effective therapy, CRIM-negative (CN) infantile Pompe disease (IPD) patients typically mount a strong immune response which abrogates the efficacy of ERT, resulting in clinical decline and death. This study was designed to demonstrate that immune tolerance induction (ITI) prevents or diminishes the development of antibody titers, resulting in a better clinical outcome compared to CN IPD patients treated with ERT monotherapy. METHODS: We evaluated the safety, efficacy and feasibility of a clinical algorithm designed to accurately identify CN IPD patients and minimize delays between CRIM status determination and initiation of an ITI regimen (a combination of rituximab, methotrexate and IVIG) concurrent with ERT. Clinical and laboratory data, including measures of efficacy analysis for response to ERT, were analyzed and compared to CN IPD patients treated with ERT monotherapy. RESULTS: Seven CN IPD patients were identified and started on the ITI regimen concurrent with ERT. Median time from diagnosis of CN status to commencement of ERT and ITI was 0.5 months (range: 0.1-1.6 months). At baseline, all patients had significant cardiomyopathy and all but one required respiratory support. The ITI regimen was safely tolerated in all seven cases. Four patients never seroconverted and remained antibody-free. One patient died from respiratory failure. Two patients required another course of the ITI regimen. In addition to their clinical improvement, the antibody titers observed in these patients were much lower than those seen in ERT monotherapy treated CN patients. CONCLUSIONS: The ITI regimen appears safe and efficacious and holds promise in altering the natural history of CN IPD by increasing ERT efficacy. An algorithm such as this substantiates the benefits of accelerated diagnosis and management of CN IPD patients, thus further supporting the importance of early identification and treatment
International Nuclear Information System (INIS)
2005-01-01
For the years 2004 and 2005 the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics published in the Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown on the back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: changes in GDP, energy consumption and electricity consumption; carbon dioxide emissions from fossil fuel use; coal consumption; consumption of natural gas; peat consumption; domestic oil deliveries; import prices of oil; consumer prices of principal oil products; fuel prices in heat production; fuel prices in electricity production; price of electricity by type of consumer; average monthly spot prices at the Nord Pool power exchange; total energy consumption by source and CO2 emissions; supplies and total consumption of electricity (GWh); energy imports by country of origin in January-June 2003; energy exports by recipient country in January-June 2003; consumer prices of liquid fuels; consumer prices of hard coal, natural gas and indigenous fuels; price of natural gas by type of consumer; price of electricity by type of consumer; price of district heating by type of consumer; excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and energy taxes, precautionary stock fees and oil pollution fees.
Statistical Process Control in a Modern Production Environment
DEFF Research Database (Denmark)
Windfeldt, Gitte Bjørg
Paper 1 is aimed at practitioners, to help them test the assumption that the observations in a sample are independent and identically distributed, an assumption that is essential when using classical Shewhart charts. The test can easily be performed in the control chart setup using the samples gathered there and standard statistical software. In Paper 2 a new method for process monitoring is introduced. The method uses a statistical model of the quality characteristic and a sliding window of observations to estimate the probability that the next item will not respect the specifications. If the estimated probability exceeds a pre-determined threshold the process is stopped. The method is flexible, allowing a complexity in modeling that remains invisible to the end user. Furthermore, the method allows diagnostic plots to be built from the parameter estimates, which can provide valuable insight
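The sliding-window monitoring idea from Paper 2 can be sketched as follows. The normal model, specification limits, window size and threshold below are all assumed for illustration; the thesis allows far richer models behind the same interface.

```python
import math
import random
from collections import deque

def prob_out_of_spec(window, lsl, usl):
    """Estimate P(next item outside [lsl, usl]) from a normal model fitted
    to a sliding window of recent observations."""
    n = len(window)
    mu = sum(window) / n
    sd = math.sqrt(sum((x - mu) ** 2 for x in window) / (n - 1))
    Phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF
    return Phi((lsl - mu) / sd) + (1.0 - Phi((usl - mu) / sd))

random.seed(7)
window = deque(maxlen=30)     # sliding window of the last 30 observations
THRESHOLD = 0.01              # stop the process when P(out of spec) exceeds this
stopped_at = None
for i in range(200):
    drift = 0.0 if i < 120 else 0.05 * (i - 120)    # the process mean starts drifting
    x = random.gauss(10.0 + drift, 0.5)             # simulated quality characteristic
    window.append(x)
    if len(window) == window.maxlen:
        p = prob_out_of_spec(window, 8.5, 11.5)
        if p > THRESHOLD:
            stopped_at = i
            break
print("stopped at item", stopped_at)
```

The end user only sees the estimated probability and the stop decision; the model fitted inside `prob_out_of_spec` can be swapped for something more complex without changing that interface, which is the flexibility the abstract emphasizes.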
Statistical laws in urban mobility from microscopic GPS data in the area of Florence
International Nuclear Information System (INIS)
Bazzani, Armando; Giorgini, Bruno; Rambaldi, Sandro; Gallotti, Riccardo; Giovannini, Luca
2010-01-01
The application of Statistical Physics to social systems is mainly related to the search for macroscopic laws that can be derived from experimental data averaged in time or space, assuming the system is in a steady state. One of the major goals would be to find a connection between the statistical laws and the microscopic properties: for example, to understand the nature of the microscopic interactions or to point out the existence of interaction networks. Probability theory suggests the existence of a few classes of stationary distributions in the thermodynamic limit, so that the question is whether a statistical physics approach is able to capture the complex nature of social systems. We have analyzed a large GPS database for single-vehicle mobility in the Florence urban area, obtaining statistical laws for path lengths, for activity downtimes and for activity degrees. We also show that simple generic assumptions on the microscopic behavior could explain the existence of stationary macroscopic laws, with a universal function describing the distribution. Our conclusion is that understanding the system's complexity requires a dynamical database for the microscopic evolution, which allows us to resolve both small space and time scales in order to study the transients
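The workflow of extracting a macroscopic law from microscopic records can be sketched generically. The exponential form and rate below are assumed purely for illustration (the paper's actual fitted laws for Florence are not reproduced here); the point is the fit-and-compare step applied to a large set of individual-vehicle records.

```python
import math
import random

random.seed(11)
# hypothetical microscopic records: activity downtimes (hours) of single vehicles,
# here simulated from an exponential law with an assumed rate
TRUE_RATE = 0.5
downtimes = [random.expovariate(TRUE_RATE) for _ in range(50000)]

# maximum-likelihood fit of a candidate macroscopic law (exponential)
rate_hat = len(downtimes) / sum(downtimes)

# compare empirical and fitted survival functions at a few durations
for t in (1.0, 3.0, 6.0):
    empirical = sum(d > t for d in downtimes) / len(downtimes)
    fitted = math.exp(-rate_hat * t)
    print(t, round(empirical, 4), round(fitted, 4))
```

With real GPS data the same comparison, repeated for several candidate laws (exponential, power law, log-normal), is what establishes which stationary macroscopic distribution the microscopic behavior actually produces.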
Statistical learning in high energy and astrophysics
International Nuclear Information System (INIS)
Zimmermann, J.
2005-01-01
This thesis studies the performance of statistical learning methods in high energy and astrophysics where they have become a standard tool in physics analysis. They are used to perform complex classification or regression by intelligent pattern recognition. This kind of artificial intelligence is achieved by the principle ''learning from examples'': The examples describe the relationship between detector events and their classification. The application of statistical learning methods is either motivated by the lack of knowledge about this relationship or by tight time restrictions. In the first case learning from examples is the only possibility since no theory is available which would allow to build an algorithm in the classical way. In the second case a classical algorithm exists but is too slow to cope with the time restrictions. It is therefore replaced by a pattern recognition machine which implements a fast statistical learning method. But even in applications where some kind of classical algorithm had done a good job, statistical learning methods convinced by their remarkable performance. This thesis gives an introduction to statistical learning methods and how they are applied correctly in physics analysis. Their flexibility and high performance will be discussed by showing intriguing results from high energy and astrophysics. These include the development of highly efficient triggers, powerful purification of event samples and exact reconstruction of hidden event parameters. The presented studies also show typical problems in the application of statistical learning methods. They should be only second choice in all cases where an algorithm based on prior knowledge exists. Some examples in physics analyses are found where these methods are not used in the right way leading either to wrong predictions or bad performance. Physicists also often hesitate to profit from these methods because they fear that statistical learning methods cannot be controlled in a
Statistical learning in high energy and astrophysics
Energy Technology Data Exchange (ETDEWEB)
Zimmermann, J.
2005-06-16
This thesis studies the performance of statistical learning methods in high energy and astrophysics, where they have become a standard tool in physics analysis. They are used to perform complex classification or regression by intelligent pattern recognition. This kind of artificial intelligence is achieved by the principle of ''learning from examples'': the examples describe the relationship between detector events and their classification. The application of statistical learning methods is motivated either by the lack of knowledge about this relationship or by tight time restrictions. In the first case, learning from examples is the only possibility, since no theory is available that would allow one to build an algorithm in the classical way. In the second case, a classical algorithm exists but is too slow to cope with the time restrictions; it is therefore replaced by a pattern recognition machine which implements a fast statistical learning method. But even in applications where some kind of classical algorithm had done a good job, statistical learning methods have convinced through their remarkable performance. This thesis gives an introduction to statistical learning methods and how they are applied correctly in physics analysis. Their flexibility and high performance are discussed through intriguing results from high energy and astrophysics, including the development of highly efficient triggers, powerful purification of event samples and exact reconstruction of hidden event parameters. The presented studies also show typical problems in the application of statistical learning methods: they should only be second choice in all cases where an algorithm based on prior knowledge exists. Some examples in physics analyses are found where these methods are not used in the right way, leading either to wrong predictions or bad performance. Physicists also often hesitate to profit from these methods because they fear that statistical learning methods cannot be controlled in a
Indian Academy of Sciences (India)
The special theory of relativity is one of the cornerstones of physics. …
A new universality class in corpus of texts; A statistical physics study
Najafi, Elham; Darooneh, Amir H.
2018-05-01
Text can be regarded as a complex system, and methods from statistical physics can be used to study it. In this work, by means of statistical physics methods, we reveal new universal behaviors of texts associated with the fractality values of words in a text. The fractality measure indicates the importance of words in a text by considering the distribution pattern of words throughout the text. We observed a power-law relation between the fractality of a text and its vocabulary size, for both texts and corpora. We also observed this behavior in biological data.
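The fractality measure above depends on how a word's occurrences are distributed through the text. As an illustration only (not the authors' exact measure), a box-counting sketch over word positions, in Python:

```python
import math
from collections import defaultdict

def word_positions(tokens):
    """Map each word to the list of its positions in the token stream."""
    pos = defaultdict(list)
    for i, w in enumerate(tokens):
        pos[w].append(i)
    return pos

def box_counting_dimension(positions, length, scales=(1, 2, 4, 8, 16)):
    """Crude box-counting estimate of how a word's occurrences fill the
    text: count boxes of each size containing at least one occurrence,
    then fit log(box count) against log(length/size) by least squares."""
    xs, ys = [], []
    for s in scales:
        boxes = {p // s for p in positions}
        xs.append(math.log(length / s))
        ys.append(math.log(len(boxes)))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)
```

A word spread evenly through the text fills boxes at every scale (dimension near 1), while a word confined to one passage fills few (dimension near 0); the scale list and the least-squares fit are illustrative choices, not those of the paper.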
Statistical analysis of probabilistic models of software product lines with quantitative constraints
DEFF Research Database (Denmark)
Beek, M.H. ter; Legay, A.; Lluch Lafuente, Alberto
2015-01-01
We investigate the suitability of statistical model checking for the analysis of probabilistic models of software product lines with complex quantitative constraints and advanced feature installation options. Such models are specified in the feature-oriented language QFLan, a rich process algebra...... of certain behaviour to the expected average cost of products. This is supported by a Maude implementation of QFLan, integrated with the SMT solver Z3 and the distributed statistical model checker MultiVeStA. Our approach is illustrated with a bikes product line case study....
Whole Frog Project and Virtual Frog Dissection Statistics
Speeding up the MATLAB complex networks package using graphic processors
International Nuclear Information System (INIS)
Zhang Bai-Da; Wu Jun-Jie; Li Xin; Tang Yu-Hua
2011-01-01
The availability of computers and communication networks allows us to gather and analyse data on a far larger scale than previously. At present, it is believed that statistics is a suitable method to analyse networks with millions, or more, of vertices. The MATLAB language, with its mass of statistical functions, is a good choice to rapidly realize an algorithm prototype of complex networks. The performance of the MATLAB codes can be further improved by using graphics processing units (GPUs). This paper presents the strategies and performance of the GPU implementation of a complex networks package, using the Jacket toolbox of MATLAB. Compared with some commercially available CPU implementations, the GPU implementation achieves a speedup of, on average, 11.3×. The experimental results show that the GPU platform combined with the MATLAB language is a good combination for complex network research. (interdisciplinary physics and related areas of science and technology)
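The kind of matrix-heavy network statistic that benefits from GPU execution can be sketched in NumPy; the identical expressions run on a GPU by swapping in an array library such as CuPy. This is a generic illustration of the approach, not code from the Jacket-based package:

```python
import numpy as np

def clustering_coefficients(A):
    """Local clustering coefficient of every node of an undirected graph
    given as a dense 0/1 adjacency matrix. The whole computation is a
    few matrix products, which is what maps well onto a GPU."""
    A = np.asarray(A, dtype=float)
    k = A.sum(axis=1)                      # node degrees
    triangles = np.diag(A @ A @ A) / 2.0   # closed 3-walks through each node / 2
    denom = k * (k - 1)
    with np.errstate(divide="ignore", invalid="ignore"):
        c = np.where(denom > 0, 2.0 * triangles / denom, 0.0)
    return c
```

On a complete triangle every node has coefficient 1; on a 3-node path every coefficient is 0, since no triangles exist.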
Generalized statistical mechanics approaches to earthquakes and tectonics
Papadakis, Giorgos; Michas, Georgios
2016-01-01
Despite the extreme complexity that characterizes the mechanism of the earthquake generation process, simple empirical scaling relations apply to the collective properties of earthquakes and faults in a variety of tectonic environments and scales. The physical characterization of those properties and the scaling relations that describe them attract wide scientific interest and are incorporated in the probabilistic forecasting of seismicity on local, regional and planetary scales. Considerable progress has been made in the analysis of the statistical mechanics of earthquakes, which, based on the principle of entropy, can provide a physical rationale for the macroscopic properties frequently observed. The scale-invariant properties, the (multi) fractal structures and the long-range interactions that have been found to characterize fault and earthquake populations have recently led to the consideration of non-extensive statistical mechanics (NESM) as a consistent statistical mechanics framework for the description of seismicity. The consistency between NESM and observations has been demonstrated in a series of publications on seismicity, faulting, rock physics and other fields of geosciences. The aim of this review is to present in a concise manner the fundamental macroscopic properties of earthquakes and faulting and how these can be derived by using the notions of statistical mechanics and NESM, providing further insights into earthquake physics and fault growth processes. PMID:28119548
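A minimal sketch of the NESM building block mentioned above, the Tsallis q-exponential, which replaces the ordinary exponential in non-extensive distributions. The survival-function form and its parameters below are purely illustrative, not values fitted to any catalogue in the review:

```python
import numpy as np

def q_exp(x, q):
    """Tsallis q-exponential: exp_q(x) = [1 + (1-q)x]^(1/(1-q)) where the
    bracket is positive, and 0 otherwise; reduces to exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return np.exp(x)
    base = 1.0 + (1.0 - q) * np.asarray(x, dtype=float)
    safe = np.where(base > 0.0, base, 1.0)   # avoid invalid powers
    return np.where(base > 0.0, safe ** (1.0 / (1.0 - q)), 0.0)

def magnitude_survival(m, a, q):
    """Hypothetical NESM-style survival function P(>m) = exp_q(-a*m);
    a and q here are illustrative assumptions only."""
    return q_exp(-a * np.asarray(m, dtype=float), q)
```

For q > 1 the survival function decays as a power law in the tail rather than exponentially, which is the qualitative feature NESM brings to earthquake-size statistics.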
Summer School Mathematical Foundations of Complex Networked Information Systems
Fosson, Sophie; Ravazzi, Chiara
2015-01-01
Introducing the reader to the mathematics behind complex networked systems, these lecture notes investigate graph theory, graphical models, and methods from statistical physics. Complex networked systems play a fundamental role in our society, both in everyday life and in scientific research, with applications ranging from physics and biology to economics and finance. The book is self-contained, and requires only an undergraduate mathematical background.
Analysis of statistical misconception in terms of statistical reasoning
Maryati, I.; Priatna, N.
2018-05-01
Reasoning skill is needed by everyone in the globalization era, because every person has to be able to manage and use information from all over the world, which can be obtained easily. Statistical reasoning skill is the ability to collect, group, process, interpret, and draw conclusions from information. Developing this skill can be done through various levels of education. However, the skill remains low because many people, students included, assume that statistics is just the ability to count and use formulas. Students still have a negative attitude toward courses related to research. The purpose of this research is to analyze students' misconceptions in a descriptive statistics course with respect to statistical reasoning skill. The observation was done by analyzing the results of a misconception test and a statistical reasoning skill test, and by observing the effect of students' misconceptions on statistical reasoning skill. The sample of this research was 32 students of the mathematics education department who had taken the descriptive statistics course. The mean value of the misconception test was 49.7 (standard deviation 10.6), whereas the mean value of the statistical reasoning skill test was 51.8 (standard deviation 8.5). Taking 65 as the minimal value for standard achievement of course competence, the students' mean values are lower than the standard. The misconception study emphasized which subtopics should be considered. Based on the assessment results, students' misconceptions occur in: 1) writing mathematical sentences and symbols well, 2) understanding basic definitions, 3) determining which concept to use in solving a problem. Statistical reasoning skill was assessed on reasoning from: 1) data, 2) representation, 3) statistical format, 4) probability, 5) samples, and 6) association.
Statistical Inference at Work: Statistical Process Control as an Example
Bakker, Arthur; Kent, Phillip; Derry, Jan; Noss, Richard; Hoyles, Celia
2008-01-01
To characterise statistical inference in the workplace this paper compares a prototypical type of statistical inference at work, statistical process control (SPC), with a type of statistical inference that is better known in educational settings, hypothesis testing. Although there are some similarities between the reasoning structure involved in…
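A simplified Shewhart-style sketch of the SPC inference discussed above, flagging points outside 3-sigma control limits. For simplicity it estimates sigma from the sample standard deviation; production SPC charts usually estimate it from moving ranges or subgroup statistics:

```python
def shewhart_limits(samples):
    """3-sigma control limits from a baseline sample (simplified:
    sigma is the sample standard deviation, not a moving-range estimate)."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)
    sigma = var ** 0.5
    return mean - 3 * sigma, mean, mean + 3 * sigma

def out_of_control(samples, lcl, ucl):
    """Indices of points falling outside the control limits."""
    return [i for i, x in enumerate(samples) if x < lcl or x > ucl]
```

The inferential step is exactly the contrast the article draws with hypothesis testing: each new point is implicitly tested against the in-control process distribution.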
Action detection by double hierarchical multi-structure space-time statistical matching model
Han, Jing; Zhu, Junwei; Cui, Yiyin; Bai, Lianfa; Yue, Jiang
2018-03-01
To handle the complex information in videos and the low efficiency of existing detection, an action detection model based on a neighboring Gaussian structure and 3D LARK features is put forward. We exploit a double hierarchical multi-structure space-time statistical matching model (DMSM) for temporal action localization. First, a neighboring Gaussian structure is presented to describe the multi-scale structural relationship. Then, a space-time statistical matching method is proposed to obtain two similarity matrices on both large and small scales, which combines double hierarchical structural constraints in the model through both the neighboring Gaussian structure and the 3D LARK local structure. Finally, the double hierarchical similarity is fused and analyzed to detect actions. Besides, the multi-scale composite template extends the model to multi-view application. Experimental results of DMSM on the complex visual tracker benchmark data sets and the THUMOS 2014 data sets show promising performance. Compared with other state-of-the-art algorithms, DMSM achieves superior performance.
Data Mining and Complex Problems: Case Study in Composite Materials
Rabelo, Luis; Marin, Mario
2009-01-01
Data mining is defined as the discovery of useful, possibly unexpected, patterns and relationships in data using statistical and non-statistical techniques in order to develop schemes for decision and policy making. Data mining can be used to discover the sources and causes of problems in complex systems. In addition, data mining can support simulation strategies by finding the different constants and parameters to be used in the development of simulation models. This paper introduces a framework for data mining and its application to complex problems. To further explain some of the concepts outlined in this paper, the potential application to the NASA Shuttle Reinforced Carbon-Carbon structures and genetic programming is used as an illustration.
A study of complex particle emission in the pre-equilibrium statistical model
International Nuclear Information System (INIS)
Miao Rongzhi; Wu Guohua
1986-01-01
A concept of the quasi-composite system in the process of pre-equilibrium emission is presented in this paper. On the basis of the principle of detailed balance, the existence of the factor [γ_β ω(π_β, 0, ν_β, 0, E-U) g_{π,ν}] has been proved, taking into account the distinguishability between protons and neutrons. A formula for the rate of complex particle emission in the pre-equilibrium process is obtained. The theoretical results fit the experimental data quite well; especially in the high-energy part of the energy spectrum the agreement is much better than before
A Statistical Primer: Understanding Descriptive and Inferential Statistics
Gillian Byrne
2007-01-01
As libraries and librarians move more towards evidence-based decision making, the data being generated in libraries is growing. Understanding the basics of statistical analysis is crucial for evidence-based practice (EBP), in order to correctly design and analyze research as well as to evaluate the research of others. This article covers the fundamentals of descriptive and inferential statistics, from hypothesis construction to sampling to common statistical techniques including chi-square, co...
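As an illustration of one inferential technique the article covers, a from-scratch Pearson chi-square statistic for a contingency table; the 3.841 value below is the standard 5% critical value for one degree of freedom:

```python
def chi_square_statistic(table):
    """Pearson chi-square statistic for a contingency table given as
    rows x columns of observed counts: sum of (O - E)^2 / E, where E is
    the expected count under independence of rows and columns."""
    row_tot = [sum(r) for r in table]
    col_tot = [sum(c) for c in zip(*table)]
    total = sum(row_tot)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / total
            chi2 += (obs - exp) ** 2 / exp
    return chi2
```

For a 2x2 table of, say, user group versus service use, a statistic above 3.841 rejects independence at the 5% level.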
Solution of the statistical bootstrap with Bose statistics
International Nuclear Information System (INIS)
Engels, J.; Fabricius, K.; Schilling, K.
1977-01-01
A brief and transparent way to introduce Bose statistics into the statistical bootstrap of Hagedorn and Frautschi is presented. The resulting bootstrap equation is solved by a cluster expansion for the grand canonical partition function. The shift of the ultimate temperature due to Bose statistics is determined through an iteration process. We discuss two-particle spectra of the decaying fireball (with given mass) as obtained from its grand microcanonical level density
Fundamental statistical features and self-similar properties of tagged networks
International Nuclear Information System (INIS)
Palla, Gergely; Farkas, Illes J; Pollner, Peter; Vicsek, Tamas; Derenyi, Imre
2008-01-01
We investigate the fundamental statistical features of tagged (or annotated) networks having a rich variety of attributes associated with their nodes. Tags (attributes, annotations, properties, features, etc) provide essential information about the entity represented by a given node, thus, taking them into account represents a significant step towards a more complete description of the structure of large complex systems. Our main goal here is to uncover the relations between the statistical properties of the node tags and those of the graph topology. In order to better characterize the networks with tagged nodes, we introduce a number of new notions, including tag-assortativity (relating link probability to node similarity), and new quantities, such as node uniqueness (measuring how rarely the tags of a node occur in the network) and tag-assortativity exponent. We apply our approach to three large networks representing very different domains of complex systems. A number of the tag related quantities display analogous behaviour (e.g. the networks we studied are tag-assortative, indicating possible universal aspects of tags versus topology), while some other features, such as the distribution of the node uniqueness, show variability from network to network allowing for pin-pointing large scale specific features of real-world complex networks. We also find that for each network the topology and the tag distribution are scale invariant, and this self-similar property of the networks can be well characterized by the tag-assortativity exponent, which is specific to each system.
Directory of Open Access Journals (Sweden)
J. F. Pankow
2008-05-01
The SIMPOL.1 group contribution method is developed for predicting the liquid vapor pressure p°_L (atm) and enthalpy of vaporization ΔH_vap (kJ mol⁻¹) of organic compounds as functions of temperature (T). For each compound i, the method assumes log₁₀ p°_{L,i}(T) = Σ_k ν_{k,i} b_k(T), where ν_{k,i} is the number of groups of type k, and b_k(T) is the contribution to log₁₀ p°_{L,i}(T) by each group of type k. A zeroeth group is included that uses b₀(T) with ν_{0,i} = 1 for all i. A total of 30 structural groups are considered: molecular carbon, alkyl hydroxyl, aromatic hydroxyl, alkyl ether, alkyl ring ether, aromatic ether, aldehyde, ketone, carboxylic acid, ester, nitrate, nitro, alkyl amine (primary, secondary, and tertiary), aromatic amine, amide (primary, secondary, and tertiary), peroxide, hydroperoxide, peroxy acid, C=C, carbonylperoxynitrate, nitro-phenol, nitro-ester, aromatic rings, non-aromatic rings, C=C–C=O in a non-aromatic ring, and carbon on the acid side of an amide. The T dependence in each of the b_k(T) is assumed to follow b(T) = B₁/T + B₂ + B₃T + B₄ ln T. Values of the B coefficients are fit using an initial basis set of 272 compounds for which experimentally based functions p°_{L,i} = f_i(T) are available. The range of vapor pressure considered spans fourteen orders of magnitude. The ability of the initially fitted B coefficients to predict p°_L values is examined using a test set of 184 compounds and a T range that is as wide as 273
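The group-contribution scheme has a direct computational form. The sketch below uses hypothetical B coefficients purely to show the mechanics; the fitted SIMPOL.1 values must be taken from the paper itself:

```python
import math

# Hypothetical illustrative coefficients (B1, B2, B3, B4) per group --
# NOT the fitted SIMPOL.1 values, which are given in the paper.
GROUP_COEFFS = {
    "zeroeth":        (-400.0, 1.5, -0.002, 0.2),
    "carbon":         ( -80.0, 0.3, -0.001, 0.0),
    "alkyl_hydroxyl": (-500.0, 0.5,  0.000, 0.0),
}

def b_k(T, B):
    """Group contribution b_k(T) = B1/T + B2 + B3*T + B4*ln(T)."""
    B1, B2, B3, B4 = B
    return B1 / T + B2 + B3 * T + B4 * math.log(T)

def log10_vapor_pressure(group_counts, T):
    """log10 p_L(T) = sum_k nu_k * b_k(T); the zeroeth group always
    enters once with nu = 1, as in the paper."""
    total = b_k(T, GROUP_COEFFS["zeroeth"])
    for group, nu in group_counts.items():
        total += nu * b_k(T, GROUP_COEFFS[group])
    return total
```

Because the model is linear in the group counts, adding a group simply shifts log₁₀ p°_L by ν_k b_k(T), which is what makes the basis-set fit a linear regression problem.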
Szulc, Stefan
1965-01-01
Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field. Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then
Goodman, Joseph W
2015-01-01
This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems. The book covers a variety of statistical problems in optics, including both theory and applications. The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced i
Networking—a statistical physics perspective
Yeung, Chi Ho; Saad, David
2013-03-01
Networking encompasses a variety of tasks related to the communication of information on networks; it has a substantial economic and societal impact on a broad range of areas including transportation systems, wired and wireless communications and a range of Internet applications. As transportation and communication networks become increasingly more complex, the ever increasing demand for congestion control, higher traffic capacity, quality of service, robustness and reduced energy consumption requires new tools and methods to meet these conflicting requirements. The new methodology should serve for gaining better understanding of the properties of networking systems at the macroscopic level, as well as for the development of new principled optimization and management algorithms at the microscopic level. Methods of statistical physics seem best placed to provide new approaches as they have been developed specifically to deal with nonlinear large-scale systems. This review aims at presenting an overview of tools and methods that have been developed within the statistical physics community and that can be readily applied to address the emerging problems in networking. These include diffusion processes, methods from disordered systems and polymer physics, probabilistic inference, which have direct relevance to network routing, file and frequency distribution, the exploration of network structures and vulnerability, and various other practical networking applications.
Networking—a statistical physics perspective
International Nuclear Information System (INIS)
Yeung, Chi Ho; Saad, David
2013-01-01
Networking encompasses a variety of tasks related to the communication of information on networks; it has a substantial economic and societal impact on a broad range of areas including transportation systems, wired and wireless communications and a range of Internet applications. As transportation and communication networks become increasingly more complex, the ever increasing demand for congestion control, higher traffic capacity, quality of service, robustness and reduced energy consumption requires new tools and methods to meet these conflicting requirements. The new methodology should serve for gaining better understanding of the properties of networking systems at the macroscopic level, as well as for the development of new principled optimization and management algorithms at the microscopic level. Methods of statistical physics seem best placed to provide new approaches as they have been developed specifically to deal with nonlinear large-scale systems. This review aims at presenting an overview of tools and methods that have been developed within the statistical physics community and that can be readily applied to address the emerging problems in networking. These include diffusion processes, methods from disordered systems and polymer physics, probabilistic inference, which have direct relevance to network routing, file and frequency distribution, the exploration of network structures and vulnerability, and various other practical networking applications. (topical review)
All of statistics a concise course in statistical inference
Wasserman, Larry
2004-01-01
This book is for people who want to learn probability and statistics quickly. It brings together many of the main ideas in modern statistics in one place. The book is suitable for students and researchers in statistics, computer science, data mining and machine learning. This book covers a much wider range of topics than a typical introductory text on mathematical statistics. It includes modern topics like nonparametric curve estimation, bootstrapping and classification, topics that are usually relegated to follow-up courses. The reader is assumed to know calculus and a little linear algebra. No previous knowledge of probability and statistics is required. The text can be used at the advanced undergraduate and graduate level. Larry Wasserman is Professor of Statistics at Carnegie Mellon University. He is also a member of the Center for Automated Learning and Discovery in the School of Computer Science. His research areas include nonparametric inference, asymptotic theory, causality, and applications to astrophysics, bi...
The Association of Academic Health Sciences Libraries Annual Statistics: a thematic history.
Shedlock, James; Byrd, Gary D
2003-04-01
The Annual Statistics of Medical School Libraries in the United States and Canada (Annual Statistics) is the most recognizable achievement of the Association of Academic Health Sciences Libraries in its history to date. This article gives a thematic history of the Annual Statistics, emphasizing the leadership role of editors and Editorial Boards, the need for cooperation and membership support to produce comparable data useful for everyday management of academic medical center libraries and the use of technology as a tool for data gathering and publication. The Annual Statistics' origin is recalled, and survey features and content are related to the overall themes. The success of the Annual Statistics is evident in the leadership skills of the first editor, Richard Lyders, executive director of the Houston Academy of Medicine-Texas Medical Center Library. The history shows the development of a survey instrument that strives to produce reliable and valid data for a diverse group of libraries while reflecting the many complex changes in the library environment. The future of the Annual Statistics is assured by the anticipated changes facing academic health sciences libraries, namely the need to reflect the transition from a physical environment to an electronic operation.
Complex Empiricism and the Quantification of Uncertainty in Paleoclimate Reconstructions
Brumble, K. C.
2014-12-01
Because the global climate cannot be observed directly, and because of vast and noisy data sets, climate science is a rich field to study how computational statistics informs what it means to do empirical science. Traditionally held virtues of empirical science and empirical methods like reproducibility, independence, and straightforward observation are complicated by representational choices involved in statistical modeling and data handling. Examining how climate reconstructions instantiate complicated empirical relationships between model, data, and predictions reveals that the path from data to prediction does not match traditional conceptions of empirical inference either. Rather, the empirical inferences involved are "complex" in that they require articulation of a good deal of statistical processing wherein assumptions are adopted and representational decisions made, often in the face of substantial uncertainties. Proxy reconstructions are both statistical and paleoclimate science activities aimed at using a variety of proxies to reconstruct past climate behavior. Paleoclimate proxy reconstructions also involve complex data handling and statistical refinement, leading to the current emphasis in the field on the quantification of uncertainty in reconstructions. In this presentation I explore how the processing needed for the correlation of diverse, large, and messy data sets necessitate the explicit quantification of the uncertainties stemming from wrangling proxies into manageable suites. I also address how semi-empirical pseudo-proxy methods allow for the exploration of signal detection in data sets, and as intermediary steps for statistical experimentation.
Statistical and Fractal Processing of Phase Images of Human Biological Fluids
Directory of Open Access Journals (Sweden)
MARCHUK, Y. I.
2010-11-01
Performed in this work are complex statistical and fractal analyses of the phase properties inherent to birefringence networks of liquid crystals consisting of optically thin layers prepared from human bile. Within the framework of a statistical approach, the authors have investigated the values and ranges of change of the statistical moments of the 1st to 4th orders that characterize coordinate distributions of the phase shifts between orthogonal amplitude components of laser radiation transformed by human bile with various pathologies. Using the Gram-Charlier method, correlation criteria are ascertained for differentiating the phase maps of pathologically changed liquid-crystal networks. In the framework of the fractal approach, the dimensionalities of self-similar coordinate phase distributions are determined, as well as features of the transformation of logarithmic dependences of the power spectra of these distributions for various types of human pathologies.
On spin and matrix models in the complex plane
International Nuclear Information System (INIS)
Damgaard, P.H.; Heller, U.M.
1993-01-01
We describe various aspects of statistical mechanics defined in the complex temperature or coupling-constant plane. Using exactly solvable models, we analyse such aspects as renormalization group flows in the complex plane, the distribution of partition function zeros, and the question of new coupling-constant symmetries of complex-plane spin models. The double-scaling form of matrix models is shown to be exactly equivalent to finite-size scaling of two-dimensional spin systems. This is used to show that the string susceptibility exponents derived from matrix models can be obtained numerically with very high accuracy from the scaling of finite-N partition function zeros in the complex plane. (orig.)
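The finite-N partition-function zeros discussed above can be computed directly for a small system. The sketch below enumerates a 6-site periodic 1D Ising ferromagnet and finds the zeros of its fugacity polynomial, which the Lee-Yang circle theorem places on the unit circle; this is a standard textbook construction, not the matrix-model calculation of the paper:

```python
import itertools
import numpy as np

def ising_fugacity_polynomial(N, betaJ):
    """Coefficients c_n of Z ~ sum_n c_n z^n for a periodic 1D Ising
    chain, where z = exp(-2*beta*h) and n counts the down spins; the
    positive coefficients come from brute-force enumeration."""
    coeffs = np.zeros(N + 1)
    for spins in itertools.product((1, -1), repeat=N):
        energy = sum(spins[i] * spins[(i + 1) % N] for i in range(N))
        coeffs[spins.count(-1)] += np.exp(betaJ * energy)
    return coeffs

# Lee-Yang circle theorem: for a ferromagnet (J > 0) every zero of the
# fugacity polynomial lies on the unit circle |z| = 1.
zeros = np.roots(ising_fugacity_polynomial(6, betaJ=0.5)[::-1])
```

Plotting such zeros in the complex-z plane, and watching them pinch the positive real axis as N grows, is the finite-size-scaling picture the abstract connects to the double-scaling limit of matrix models.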
Rizou, Ourania; Klonari, Aikaterini
2016-01-01
In the 21st century, the age of information and technology, statistical literacy is of increasing importance for everyday life. In addition, education innovation and globalisation in the past decade in Europe have resulted in a new perceived complexity of reality that has affected the curriculum and statistics education, with a shift from…
Complex networks from multivariate time series
Czech Academy of Sciences Publication Activity Database
Paluš, Milan; Hartman, David; Vejmelka, Martin
2010-01-01
Roč. 12, - (2010), A-14382 ISSN 1607-7962. [General Assembly of the European Geophysical Society. 02.05.2010-07.05.2010, Vienna] R&D Projects: GA AV ČR IAA300420805 Institutional research plan: CEZ:AV0Z10300504 Keywords: complex network * surface air temperature * reanalysis data * global change Subject RIV: BB - Applied Statistics, Operational Research
International Nuclear Information System (INIS)
Zhao Yi; Small, Michael; Coward, David; Howell, Eric; Zhao Chunnong; Ju Li; Blair, David
2006-01-01
We describe the application of complexity estimation and the surrogate data method to identify deterministic dynamics in simulated gravitational wave (GW) data contaminated with white and coloured noises. The surrogate method uses algorithmic complexity as a discriminating statistic to decide if noisy data contain a statistically significant level of deterministic dynamics (the GW signal). The results illustrate that the complexity method is sensitive to a small amplitude simulated GW background (SNR down to 0.08 for white noise and 0.05 for coloured noise) and is also more robust than commonly used linear methods (autocorrelation or Fourier analysis)
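A common way to generate the surrogates used in such a test is Fourier phase randomization, which preserves the linear properties of the data while destroying any deterministic structure. A minimal sketch follows; the paper's actual discriminating statistic, algorithmic complexity, is not implemented here:

```python
import numpy as np

def phase_randomized_surrogate(x, rng):
    """Fourier surrogate: keep the amplitude spectrum of x, draw random
    phases, and invert. The power spectrum (hence the autocorrelation)
    is preserved, which is the linear null hypothesis of the test."""
    n = len(x)
    X = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, len(X))
    Y = np.abs(X) * np.exp(1j * phases)
    Y[0] = X[0]              # preserve the mean exactly
    if n % 2 == 0:
        Y[-1] = X[-1]        # keep the Nyquist bin real for even n
    return np.fft.irfft(Y, n)
```

In a surrogate test one computes the discriminating statistic on the data and on an ensemble of such surrogates; a data value well outside the surrogate distribution rejects the linear-noise null, which is how a deterministic GW component would be flagged.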
Testing and qualification of confidence in statistical procedures
Energy Technology Data Exchange (ETDEWEB)
Serghiuta, D.; Tholammakkil, J.; Hammouda, N. [Canadian Nuclear Safety Commission (Canada); O'Hagan, A. [Sheffield Univ. (United Kingdom)
2014-07-01
This paper discusses a framework for designing artificial test problems, evaluation criteria, and two of the benchmark tests developed under a research project initiated by the Canadian Nuclear Safety Commission to investigate approaches for the qualification of tolerance limit methods and algorithms proposed for application in the optimization of CANDU regional/neutron overpower protection trip setpoints for aged conditions. A significant component of this investigation has been the development of a series of benchmark problems of gradually increasing complexity, from simple 'theoretical' problems up to complex problems closer to the real application. The first benchmark problem discussed in this paper is a simplified scalar problem which does not involve the extremal (maximum or minimum) operations typically encountered in real applications. The second benchmark is a high-dimensional, but still simple, problem for statistical inference of maximum channel power during normal operation. Bayesian algorithms have been developed for each benchmark problem to provide an independent way of constructing tolerance limits from the same data, allowing an assessment of how well different methods make use of those data and, depending on the type of application, an evaluation of the level of 'conservatism'. The Bayesian method is not, however, used as a reference method, or 'gold' standard, but simply as an independent review method. The approach and the tests developed can be used as a starting point for developing a generic suite of empirical studies (generic in the sense of applying to whatever statistical method is proposed), with clear criteria for passing those tests. Some lessons learned are also discussed, in particular concerning the need to assure the completeness of the description of the application and the role of completeness of input information. It is concluded that a formal process which includes extended and detailed benchmark
Computing Science and Statistics. Volume 24. Graphics and Visualization
1993-03-01
Mike West, Institute of Statistics & Decision Sciences, Duke University, Durham NC 27708, USA. Abstract: density estimation techniques with importance sampling.
Generalized quantum statistics
International Nuclear Information System (INIS)
Chou, C.
1992-01-01
In this paper, a non-anyonic generalization of quantum statistics is presented, in which Fermi-Dirac statistics (FDS) and Bose-Einstein statistics (BES) appear as two special cases. The new quantum statistics, which is characterized by the dimension of its single-particle Fock space, contains three consistent parts, namely the generalized bilinear quantization, the generalized quantum mechanical description and the corresponding statistical mechanics.
Scalable Harmonization of Complex Networks With Local Adaptive Controllers
Czech Academy of Sciences Publication Activity Database
Kárný, Miroslav; Herzallah, R.
2017-01-01
Roč. 47, č. 3 (2017), s. 394-404 ISSN 2168-2216 R&D Projects: GA ČR GA13-13502S Institutional support: RVO:67985556 Keywords: Adaptive control * Adaptive estimation * Bayes methods * Complex networks * Decentralized control * Feedback * Feedforward systems * Recursive estimation Subject RIV: BB - Applied Statistics, Operational Research OBOR OECD: Statistics and probability Impact factor: 2.350, year: 2016 http://library.utia.cas.cz/separaty/2016/AS/karny-0457337.pdf
Do neural nets learn statistical laws behind natural language?
Directory of Open Access Journals (Sweden)
Shuntaro Takahashi
Full Text Available The performance of deep learning in natural language processing has been spectacular, but the reasons for this success remain unclear because of the inherent complexity of deep learning. This paper provides empirical evidence of its effectiveness and of a limitation of neural networks for language engineering. Specifically, we demonstrate that a neural language model based on long short-term memory (LSTM) effectively reproduces Zipf's law and Heaps' law, two representative statistical properties underlying natural language. We discuss the quality of reproducibility and the emergence of Zipf's law and Heaps' law as training progresses. We also point out that the neural language model has a limitation in reproducing long-range correlation, another statistical property of natural language. This understanding could provide a direction for improving the architectures of neural networks.
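The two laws can be checked on any token stream. A minimal sketch (illustrative code, not the paper's implementation): Zipf's law predicts a log-log rank-frequency slope near -1, and Heaps' law predicts sublinear vocabulary growth:

```python
from collections import Counter
import math

def zipf_slope(tokens):
    """Least-squares slope of log(frequency) vs log(rank);
    Zipf's law predicts a slope near -1 on natural text."""
    freqs = sorted(Counter(tokens).values(), reverse=True)
    xs = [math.log(r) for r in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

def heaps_points(tokens):
    """(tokens seen, distinct types seen) pairs; Heaps' law predicts
    types ~ K * tokens**b with 0 < b < 1."""
    seen, out = set(), []
    for i, t in enumerate(tokens, 1):
        seen.add(t)
        out.append((i, len(seen)))
    return out
```

Running `zipf_slope` on text generated by a trained language model and on the training corpus gives a direct comparison of the kind the paper reports.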
Statistical learning modeling method for space debris photometric measurement
Sun, Wenjing; Sun, Jinqiu; Zhang, Yanning; Li, Haisen
2016-03-01
Photometric measurement is an important way to identify space debris, but present methods of photometric measurement place many constraints on the star image and need complex image processing. To address these problems, a statistical learning modeling method for space debris photometric measurement is proposed based on the global consistency of the star image, and the statistical information of star images is used to eliminate the measurement noises. First, the known stars on the star image are divided into training stars and testing stars. Then, the training stars are used to fit the parameters of the photometric measurement model by least squares, and the testing stars are used to calculate the measurement accuracy of the photometric measurement model. Experimental results show that the accuracy of the proposed photometric measurement model is about 0.1 magnitudes.
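The train/test split described above can be sketched as an ordinary least-squares calibration; the linear instrumental-to-catalog magnitude model below is an assumption for illustration, not the paper's exact model:

```python
def fit_photometric_model(train):
    """Least-squares fit of catalog magnitude = a + b * instrumental
    magnitude over the training stars (list of (inst, cat) pairs)."""
    n = len(train)
    sx = sum(x for x, _ in train)
    sy = sum(y for _, y in train)
    sxx = sum(x * x for x, _ in train)
    sxy = sum(x * y for x, y in train)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

def rms_error(model, test):
    """Measurement accuracy of the fitted model on the testing stars."""
    a, b = model
    return (sum((y - (a + b * x)) ** 2 for x, y in test) / len(test)) ** 0.5
```

An RMS error around 0.1 magnitudes on the testing stars would correspond to the accuracy the abstract reports.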
Quantitative Analysis of Complex Tropical Forest Stands: A Review ...
African Journals Online (AJOL)
The importance of data analysis in quantitative assessment of natural resources .... Data collection design is an important process in complex forest statistical ... Ideally, the sample size should be equal among groups and sufficiently large.
Computational and Statistical Models: A Comparison for Policy Modeling of Childhood Obesity
Mabry, Patricia L.; Hammond, Ross; Ip, Edward Hak-Sing; Huang, Terry T.-K.
As systems science methodologies have begun to emerge as a set of innovative approaches to address complex problems in behavioral, social science, and public health research, some apparent conflicts with traditional statistical methodologies for public health have arisen. Computational modeling is an approach set in context that integrates diverse sources of data to test the plausibility of working hypotheses and to elicit novel ones. Statistical models are reductionist approaches geared towards testing the null hypothesis. While these two approaches may seem contrary to each other, we propose that they are in fact complementary and can be used jointly to advance solutions to complex problems. Outputs from statistical models can be fed into computational models, and outputs from computational models can lead to further empirical data collection and statistical models. Together, this presents an iterative process that refines the models and contributes to a greater understanding of the problem and its potential solutions. The purpose of this panel is to foster communication and understanding between statistical and computational modelers. Our goal is to shed light on the differences between the approaches and convey what kinds of research inquiries each one is best for addressing and how they can serve complementary (and synergistic) roles in the research process, to mutual benefit. For each approach the panel will cover the relevant "assumptions" and how the differences in what is assumed can foster misunderstandings. The interpretations of the results from each approach will be compared and contrasted and the limitations for each approach will be delineated. We will use illustrative examples from CompMod, the Comparative Modeling Network for Childhood Obesity Policy. The panel will also incorporate interactive discussions with the audience on the issues raised here.
Rumsey, Deborah
2011-01-01
The fun and easy way to get down to business with statistics. Stymied by statistics? No fear: this friendly guide offers clear, practical explanations of statistical ideas, techniques, formulas, and calculations, with lots of examples that show you how these concepts apply to your everyday life. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tracks to a typical first-semester statistics course.
Additional methodology development for statistical evaluation of reactor safety analyses
International Nuclear Information System (INIS)
Marshall, J.A.; Shore, R.W.; Chay, S.C.; Mazumdar, M.
1977-03-01
The project described is motivated by the desire for methods to quantify uncertainties and to identify conservatisms in nuclear power plant safety analysis. The report examines statistical methods useful for assessing the probability distribution of output response from complex nuclear computer codes, considers sensitivity analysis and several other topics, and also sets the path for using the developed methods for realistic assessment of the design basis accident
International Nuclear Information System (INIS)
Ali, S A; Kim, D-H; Cafaro, C; Giffin, A
2012-01-01
Information geometric techniques and inductive inference methods hold great promise for solving computational problems of interest in classical and quantum physics, especially with regard to complexity characterization of dynamical systems in terms of their probabilistic description on curved statistical manifolds. In this paper, we investigate the possibility of describing the macroscopic behavior of complex systems in terms of the underlying statistical structure of their microscopic degrees of freedom by the use of statistical inductive inference and information geometry. We review the maximum relative entropy formalism and the theoretical structure of the information geometrodynamical approach to chaos on statistical manifolds M_S. Special focus is devoted to a description of the roles played by the sectional curvature K_{M_S}, the Jacobi field intensity J_{M_S} and the information geometrodynamical entropy S_{M_S}. These quantities serve as powerful information-geometric complexity measures of information-constrained dynamics associated with arbitrary chaotic and regular systems defined on M_S. Finally, the application of such information-geometric techniques to several theoretical models is presented.
Statistical models describing the energy signature of buildings
DEFF Research Database (Denmark)
Bacher, Peder; Madsen, Henrik; Thavlov, Anders
2010-01-01
Approximately one third of the primary energy production in Denmark is used for heating in buildings. Therefore efforts to accurately describe and improve the energy performance of the building mass are very important. For this purpose statistical models describing the energy signature of a building ... or varying energy prices. The paper will give an overview of statistical methods and applied models based on experiments carried out in FlexHouse, which is an experimental building in SYSLAB, Risø DTU. The models are of different complexity and can provide estimates of physical quantities such as UA-values, time constants of the building, and other parameters related to the heat dynamics. A method for selecting the most appropriate model for a given building is outlined and finally a perspective of the applications is given. Acknowledgements to the Danish Energy Saving Trust and the Interreg IV ``Vind i ...
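The simplest energy-signature model behind a UA-value estimate is a steady-state heat balance, heating power roughly equal to UA times the indoor-outdoor temperature difference. A hedged sketch (the FlexHouse models are dynamic and considerably more complex than this):

```python
def estimate_ua(heat_kw, temp_diff):
    """Steady-state energy-signature estimate: regress heating power
    (kW) on indoor-outdoor temperature difference (K) through the
    origin; the slope is the building's UA-value (kW/K)."""
    num = sum(p * dt for p, dt in zip(heat_kw, temp_diff))
    den = sum(dt * dt for dt in temp_diff)
    return num / den
```

Dynamic models of the kind used in the paper additionally estimate time constants, which requires time-series (e.g. state-space) methods rather than a static regression.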
Multivariate methods and forecasting with IBM SPSS statistics
Aljandali, Abdulkader
2017-01-01
This is the second of a two-part guide to quantitative analysis using the IBM SPSS Statistics software package; this volume focuses on multivariate statistical methods and advanced forecasting techniques. More often than not, regression models involve more than one independent variable. For example, forecasting methods are commonly applied to aggregates such as inflation rates, unemployment, exchange rates, etc., that have complex relationships with determining variables. This book introduces multivariate regression models and provides examples to help understand the theory underpinning the models. The book presents the fundamentals of multivariate regression and then moves on to examine several related techniques that have application in business-orientated fields such as logistic and multinomial regression. Forecasting tools such as the Box-Jenkins approach to time series modeling are introduced, as well as exponential smoothing and naïve techniques. This part also covers hot topics such as Factor Analysis, Dis...
Nick, Todd G
2007-01-01
Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.
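The measures of location and spread discussed in the chapter reduce to a few formulas; a minimal illustrative sketch (function and key names are ours, not the chapter's):

```python
import math

def describe(xs):
    """Basic descriptive statistics for a quantitative variable:
    sample size, mean, sample standard deviation, and median."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / (n - 1)   # sample variance
    s = sorted(xs)
    median = s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2
    return {"n": n, "mean": mean, "sd": math.sqrt(var), "median": median}
```

Comparing the mean with the median is a quick check for skew, one of the patterns such summaries are meant to uncover.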
Statistical physics of human beings in games: Controlled experiments
International Nuclear Information System (INIS)
Liang Yuan; Huang Ji-Ping
2014-01-01
It is important to know whether the laws or phenomena in statistical physics for natural systems with non-adaptive agents still hold for social human systems with adaptive agents, because this implies whether it is possible to study or understand social human systems by using statistical physics originating from natural systems. For this purpose, we review the role of human adaptability in four kinds of specific human behaviors, namely, normal behavior, herd behavior, contrarian behavior, and hedge behavior. The approach is based on controlled experiments in the framework of market-directed resource-allocation games. The role of the controlled experiments could be at least two-fold: adopting the real human decision-making process so that the system under consideration could reflect the performance of genuine human beings; making it possible to obtain macroscopic physical properties of a human system by tuning a particular factor of the system, thus directly revealing cause and effect. As a result, both computer simulations and theoretical analyses help to show a few counterparts of some laws or phenomena in statistical physics for social human systems: two-phase phenomena or phase transitions, entropy-related phenomena, and a non-equilibrium steady state. This review highlights the role of human adaptability in these counterparts, and makes it possible to study or understand some particular social human systems by means of statistical physics coming from natural systems. (topical review - statistical physics and complex systems)
Proof of the Spin Statistics Connection 2: Relativistic Theory
Santamato, Enrico; De Martini, Francesco
2017-12-01
The traditional standard theory of quantum mechanics is unable to solve the spin-statistics problem, i.e. to justify the utterly important "Pauli Exclusion Principle", except by the adoption of the complex standard relativistic quantum field theory. In a recent paper (Santamato and De Martini in Found Phys 45(7):858-873, 2015) we presented a proof of the spin-statistics problem in the nonrelativistic approximation on the basis of the "Conformal Quantum Geometrodynamics". In the present paper, by the same theory the proof of the spin-statistics theorem is extended to the relativistic domain in the general scenario of curved spacetime. The relativistic approach allows one to formulate a manifestly Weyl gauge-invariant theory step by step and to emphasize some fundamental aspects of group theory in the demonstration. No relativistic quantum field operators are used and the particle exchange properties are drawn from the conservation of the intrinsic helicity of elementary particles. It is therefore this property, not considered in the standard quantum mechanics, which determines the correct spin-statistics connection observed in Nature (Santamato and De Martini in Found Phys 45(7):858-873, 2015). The present proof of the spin-statistics theorem is simpler than the one presented in Santamato and De Martini (Found Phys 45(7):858-873, 2015), because it is based on symmetry group considerations only, without having recourse to frames attached to the particles. Second quantization and anticommuting operators are not necessary.
A robust statistical method for association-based eQTL analysis.
Directory of Open Access Journals (Sweden)
Ning Jiang
Full Text Available It has been well established that the theoretical kernel for the recently surging genome-wide association study (GWAS) is statistical inference of linkage disequilibrium (LD) between a tested genetic marker and a putative locus affecting a disease trait. However, LD analysis is vulnerable to several confounding factors, of which population stratification is the most prominent. Whilst many methods have been proposed to correct for the influence, either through predicting the structure parameters or correcting inflation in the test statistic due to the stratification, these may not be feasible or may impose further statistical problems in practical implementation. We propose here a novel statistical method to control spurious LD in GWAS from population structure by incorporating a control marker into testing for significance of genetic association of a polymorphic marker with phenotypic variation of a complex trait. The method avoids the need for structure prediction, which may be infeasible or inadequate in practice, and accounts properly for a varying effect of population stratification on different regions of the genome under study. Utility and statistical properties of the new method were tested through an intensive computer simulation study and an association-based genome-wide mapping of expression quantitative trait loci in genetically divergent human populations. The analyses show that the new method confers an improved statistical power for detecting genuine genetic association in subpopulations and an effective control of spurious associations stemming from population structure when compared with two other popularly implemented methods in the GWAS literature.
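One simple way to realize the control-marker idea (a hedged sketch, not the authors' estimator) is to remove the phenotypic variation explained by the control marker before estimating the candidate marker's effect:

```python
def residualize(y, c):
    """Regress y on covariate c (plus intercept) and return residuals."""
    n = len(y)
    mc, my = sum(c) / n, sum(y) / n
    b = (sum((ci - mc) * (yi - my) for ci, yi in zip(c, y))
         / sum((ci - mc) ** 2 for ci in c))
    a = my - b * mc
    return [yi - (a + b * ci) for ci, yi in zip(c, y)]

def marker_effect(y, g, control):
    """Effect of marker genotype g on phenotype y after removing the
    variation explained by a control marker (stratification proxy)."""
    r = residualize(y, control)
    n = len(r)
    mg, mr = sum(g) / n, sum(r) / n
    return (sum((gi - mg) * (ri - mr) for gi, ri in zip(g, r))
            / sum((gi - mg) ** 2 for gi in g))
```

When the control marker tracks subpopulation membership, spurious association driven by stratification is absorbed into the covariate rather than attributed to the tested marker.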
Statistics in Schools. Educate your students about the value and everyday use of statistics. The Statistics in Schools program provides resources for teaching and learning with real-life data. Explore the site for standards-aligned, classroom-ready activities.
Hussein, R; Khalifa, A
2011-01-01
During the last decade, Egypt has experienced a revolution in the field of Information and Communication Technology (ICT) that has had a corresponding impact on the field of healthcare. Since 1993, the Information Technology Institute (ITI) has been leading the development of Information Technology (IT) professional training and education in Egypt to produce top quality IT professionals who are now considered the backbone of the IT revolution in Egypt. For the past five years, ITI has been pursuing the objective of building high caliber health professionals who can effectively serve the ever-growing information society. Academic links have been established with internationally renowned universities, e.g., Oregon Health and Science University (OHSU) in the US and the University of Leipzig in Germany, in addition to those with the Egyptian Fellowship Board, in order to enrich ITI Medical Informatics Education and Research. The ITI Biomedical and Health Informatics (BMHI) education and training programs target fresh graduates as well as life-long learners. Therefore, the program's learning objectives are framed within the context of the four specialization tracks: Healthcare Management (HCM), Biomedical Informatics Research (BMIR), Bioinformatics Professional (BIP), and Healthcare Professional (HCP). The ITI BMHI research projects tackle a wide range of current challenges in this field, such as knowledge management in healthcare, providing tele-consultation services for diagnosis and treatment of infectious diseases for underserved regions in Egypt, and exploring the cultural and educational aspects of Nanoinformatics. Since 2006, ITI has been positively contributing to the development of the discipline of BMHI in Egypt in order to support improved healthcare services.
Fast Quantum Algorithm for Predicting Descriptive Statistics of Stochastic Processes
Williams Colin P.
1999-01-01
Stochastic processes are used as a modeling tool in several sub-fields of physics, biology, and finance. Analytic understanding of the long-term behavior of such processes is only tractable for very simple types of stochastic processes such as Markovian processes. However, in real-world applications more complex stochastic processes often arise. In physics, the complicating factor might be nonlinearities; in biology it might be memory effects; and in finance it might be the non-random intentional behavior of participants in a market. In the absence of analytic insight, one is forced to understand these more complex stochastic processes via numerical simulation techniques. In this paper we present a quantum algorithm for performing such simulations. In particular, we show how a quantum algorithm can predict arbitrary descriptive statistics (moments) of N-step stochastic processes in just O(√N) time. That is, the quantum complexity is the square root of the classical complexity for performing such simulations. This is a significant speedup in comparison to the current state of the art.
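For contrast with the claimed O(√N) quantum cost, the classical baseline estimates moments by simulating whole paths at O(N) cost each. A minimal illustrative sketch (a plain Monte Carlo baseline, unrelated to the quantum algorithm itself):

```python
import random

def walk_moments(n_steps, n_paths, k_max=2, seed=0):
    """Classical Monte Carlo estimate of the first k_max moments of the
    endpoint of an n_steps-step +/-1 random walk; each sampled path
    costs O(n_steps) to generate."""
    rng = random.Random(seed)
    ends = [sum(rng.choice((-1, 1)) for _ in range(n_steps))
            for _ in range(n_paths)]
    return [sum(e ** k for e in ends) / n_paths for k in range(1, k_max + 1)]
```

For this walk the exact values are a first moment of 0 and a second moment of n_steps, which the simulation recovers to within sampling error.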
Complex groundwater flow systems as traveling agent models
Directory of Open Access Journals (Sweden)
Oliver López Corona
2014-10-01
Full Text Available Analyzing field data from pumping tests, we show that, as with many other natural phenomena, groundwater flow exhibits complex dynamics described by a 1/f power spectrum. This result is theoretically studied within an agent perspective. Using a traveling agent model, we prove that this statistical behavior emerges when the medium is complex. Some heuristic reasoning is provided to justify both spatial and dynamic complexity, as the result of the superposition of an infinite number of stochastic processes. Moreover, we show that this implies that non-Kolmogorovian probability is needed for its study, and provide a set of new partial differential equations for groundwater flow.
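A 1/f spectrum claim of this kind is usually checked with a periodogram and a log-log slope fit. A small self-contained sketch (naive O(n²) DFT for clarity; this is not the authors' analysis pipeline):

```python
import cmath
import math

def periodogram(x):
    """Naive DFT periodogram: power at positive frequency bins 1..n/2-1."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) ** 2 / n
            for k in range(1, n // 2)]

def spectral_slope(power):
    """Log-log slope of power vs frequency bin; 1/f noise gives ~ -1."""
    xs = [math.log(k) for k in range(1, len(power) + 1)]
    ys = [math.log(p) for p in power]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    return num / sum((a - mx) ** 2 for a in xs)
```

A slope near -1 on the pumping-test drawdown record is the signature of the 1/f behavior reported in the abstract.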
Quantum statistical vibrational entropy and enthalpy of formation of helium-vacancy complex in BCC W
Energy Technology Data Exchange (ETDEWEB)
Wen, Haohua [Sino-French Institute of Nuclear Engineering and Technology, Sun Yat-Sen University, 519082, Zhuhai (China); Woo, C.H., E-mail: chung.woo@polyu.edu.hk [ME Department, The Hong Kong Polytechnic University, Hong Kong SAR (China)
2016-12-15
High-temperature advanced-reactor design and operation require knowledge of in-reactor materials properties far from the thermal ground state. Temperature-dependence due to the effects of lattice vibrations is important to the understanding and formulation of atomic processes involved in irradiation-damage accumulation. In this paper, we concentrate on the formation of the He-V complex. The free-energy change in this regard is derived via thermodynamic integration from the phase-space trajectories generated from MD simulations based on the quantum fluctuation-dissipation relation. The change of frequency distribution of vibration modes during the complex formation is properly accounted for, and the corresponding entropy change avoids the classical ln(T) divergence that violates the third law. The vibrational enthalpy and entropy of formation calculated this way have significant effects on the He kinetics during irradiation.
Neto, Olmiro Andrade; Gasperin, Bernardo G; Rovani, Monique T; Ilha, Gustavo F; Nóbrega, Janduí E; Mondadori, Rafael G; Gonçalves, Paulo B D; Antoniazzi, Alfredo Q
2014-10-15
Castration of male calves is necessary for trading, to facilitate handling and to prevent reproduction. However, some methods of castration are traumatic and lead to economic losses because of infection and myiasis. The objective of the present study was to evaluate the efficiency of intratesticular injection (ITI) of hypertonic sodium chloride (NaCl; 20%) solution for male calf castration during the first weeks of life. Forty male calves were allocated to one of the following experimental groups: negative control, surgically castrated immediately after birth; positive control, intact males; G1, ITI at 1 to 5 days old; G2, ITI at 15 to 20 days old; and G3, ITI at 25 to 30 days old. Intratesticular injection induced coagulative necrosis of Leydig cells and seminiferous tubules, leading to extensive fibrosis. Testosterone secretion and testicular development were severely impaired in 12-month-old animals from the G1 and G2 groups (P<0.05), in which no testicular structure and no sperm cells were observed during breeding soundness evaluation. Rectal and scrotal temperatures were not affected by the different procedures. In conclusion, ITI of hypertonic NaCl solution induces sterility and completely suppresses testosterone secretion when performed during the first 20 days of life. Copyright © 2014 Elsevier Inc. All rights reserved.
Quantum mechanics: why complex Hilbert space?
Cassinelli, G.; Lahti, P.
2017-10-01
We outline a programme for an axiomatic reconstruction of quantum mechanics based on the statistical duality of states and effects that combines the use of a theorem of Solér with the idea of symmetry. We also discuss arguments favouring the choice of the complex field. This article is part of the themed issue `Second quantum revolution: foundational questions'.
Bayesian models based on test statistics for multiple hypothesis testing problems.
Ji, Yuan; Lu, Yiling; Mills, Gordon B
2008-04-01
We propose a Bayesian method for the problem of multiple hypothesis testing that is routinely encountered in bioinformatics research, such as the differential gene expression analysis. Our algorithm is based on modeling the distributions of test statistics under both null and alternative hypotheses. We substantially reduce the complexity of the process of defining posterior model probabilities by modeling the test statistics directly instead of modeling the full data. Computationally, we apply a Bayesian FDR approach to control the number of rejections of null hypotheses. To check if our model assumptions for the test statistics are valid for various bioinformatics experiments, we also propose a simple graphical model-assessment tool. Using extensive simulations, we demonstrate the performance of our models and the utility of the model-assessment tool. In the end, we apply the proposed methodology to an siRNA screening and a gene expression experiment.
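The approach of modeling the test statistics directly can be illustrated with a two-groups mixture on z-scores; the specific densities and prior weight below are assumptions for illustration, not the authors' fitted model:

```python
import math

def norm_pdf(z, mu=0.0, sd=1.0):
    """Density of the normal distribution N(mu, sd**2) at z."""
    return math.exp(-0.5 * ((z - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def posterior_null(z, pi0=0.9, mu_alt=3.0):
    """Two-groups model on the test statistics themselves:
    z ~ pi0 * N(0,1) + (1 - pi0) * N(mu_alt, 1)."""
    f0 = pi0 * norm_pdf(z)
    f1 = (1 - pi0) * norm_pdf(z, mu_alt)
    return f0 / (f0 + f1)

def bayes_fdr_reject(zs, q=0.05, **kw):
    """Bayesian FDR control: reject the hypotheses with the smallest
    posterior null probabilities while their running mean stays <= q."""
    probs = sorted((posterior_null(z, **kw), z) for z in zs)
    rejected, total = [], 0.0
    for i, (p, z) in enumerate(probs, 1):
        total += p
        if total / i <= q:
            rejected.append(z)
        else:
            break
    return rejected
```

In practice the mixture weight and the alternative density would be estimated from the full set of observed statistics, which is what keeps the method far cheaper than modeling the full data.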
Joint probability of statistical success of multiple phase III trials.
Zhang, Jianliang; Zhang, Jenny J
2013-01-01
In drug development, after completion of phase II proof-of-concept trials, the sponsor needs to make a go/no-go decision to start expensive phase III trials. The probability of statistical success (PoSS) of the phase III trials based on data from earlier studies is an important factor in that decision-making process. Instead of statistical power, the predictive power of a phase III trial, which takes into account the uncertainty in the estimation of treatment effect from earlier studies, has been proposed to evaluate the PoSS of a single trial. However, regulatory authorities generally require statistical significance in two (or more) trials for marketing licensure. We show that the predictive statistics of two future trials are statistically correlated through use of the common observed data from earlier studies. Thus, the joint predictive power should not be evaluated as a simplistic product of the predictive powers of the individual trials. We develop the relevant formulae for the appropriate evaluation of the joint predictive power and provide numerical examples. Our methodology is further extended to the more complex phase III development scenario comprising more than two (K > 2) trials, that is, the evaluation of the PoSS of at least k₀ (k₀≤ K) trials from a program of K total trials. Copyright © 2013 John Wiley & Sons, Ltd.
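The correlation between the two trials' predictive statistics arises because both condition on the same phase II estimate. A hedged Monte Carlo sketch (illustrative only, not the paper's closed-form formulae) draws the true effect once per simulation and then runs both trials:

```python
import random

def joint_poss(theta_hat, se_phase2, se_phase3,
               n_sims=20000, z_crit=1.96, seed=0):
    """Monte Carlo joint probability that BOTH phase III trials reach
    significance, with the true effect drawn once per simulation from
    the phase II posterior N(theta_hat, se_phase2**2); se_phase3 is
    the standard error of each phase III estimate."""
    rng = random.Random(seed)
    both = 0
    for _ in range(n_sims):
        theta = rng.gauss(theta_hat, se_phase2)   # shared uncertainty
        z1 = rng.gauss(theta, se_phase3) / se_phase3
        z2 = rng.gauss(theta, se_phase3) / se_phase3
        both += (z1 > z_crit) and (z2 > z_crit)
    return both / n_sims
```

Because the same draw of theta feeds both trials, this joint probability generally exceeds the simplistic product of the two marginal predictive powers, which is the paper's central point.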
PRIS-STATISTICS: Power Reactor Information System Statistical Reports. User's Manual
International Nuclear Information System (INIS)
2013-01-01
The IAEA developed the Power Reactor Information System (PRIS)-Statistics application to assist PRIS end users with generating statistical reports from PRIS data. Statistical reports provide an overview of the status, specification and performance results of every nuclear power reactor in the world. This user's manual was prepared to facilitate the use of the PRIS-Statistics application and to provide guidelines and detailed information for each report in the application. Statistical reports support analyses of nuclear power development and strategies, and the evaluation of nuclear power plant performance. The PRIS database can be used for comprehensive trend analyses and benchmarking against best performers and industrial standards.
Nonlinear Dynamics, Chaotic and Complex Systems
Infeld, E.; Zelazny, R.; Galkowski, A.
2011-04-01
Part I. Dynamic Systems Bifurcation Theory and Chaos: 1. Chaos in random dynamical systems V. M. Gunldach; 2. Controlling chaos using embedded unstable periodic orbits: the problem of optimal periodic orbits B. R. Hunt and E. Ott; 3. Chaotic tracer dynamics in open hydrodynamical flows G. Karolyi, A. Pentek, T. Tel and Z. Toroczkai; 4. Homoclinic chaos L. P. Shilnikov; Part II. Spatially Extended Systems: 5. Hydrodynamics of relativistic probability flows I. Bialynicki-Birula; 6. Waves in ionic reaction-diffusion-migration systems P. Hasal, V. Nevoral, I. Schreiber, H. Sevcikova, D. Snita, and M. Marek; 7. Anomalous scaling in turbulence: a field theoretical approach V. Lvov and I. Procaccia; 8. Abelian sandpile cellular automata M. Markosova; 9. Transport in an incompletely chaotic magnetic field F. Spineanu; Part III. Dynamical Chaos Quantum Physics and Foundations Of Statistical Mechanics: 10. Non-equilibrium statistical mechanics and ergodic theory L. A. Bunimovich; 11. Pseudochaos in statistical physics B. Chirikov; 12. Foundations of non-equilibrium statistical mechanics J. P. Dougherty; 13. Thermomechanical particle simulations W. G. Hoover, H. A. Posch, C. H. Dellago, O. Kum, C. G. Hoover, A. J. De Groot and B. L. Holian; 14. Quantum dynamics on a Markov background and irreversibility B. Pavlov; 15. Time chaos and the laws of nature I. Prigogine and D. J. Driebe; 16. Evolutionary Q and cognitive systems: dynamic entropies and predictability of evolutionary processes W. Ebeling; 17. Spatiotemporal chaos information processing in neural networks H. Szu; 18. Phase transitions and learning in neural networks C. Van den Broeck; 19. Synthesis of chaos A. Vanecek and S. Celikovsky; 20. Computational complexity of continuous problems H. Wozniakowski; Part IV. Complex Systems As An Interface Between Natural Sciences and Environmental Social and Economic Sciences: 21. Stochastic differential geometry in finance studies V. G. Makhankov; Part V. Conference Banquet
Sadovskii, Michael V
2012-01-01
This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics, all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity, and the modern theory of critical phenomena. Beyond that, attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.
Applying Statistical Design to Control the Risk of Over-Design with Stochastic Simulation
Directory of Open Access Journals (Sweden)
Yi Wu
2010-02-01
Full Text Available By comparing a hard real-time system and a soft real-time system, this article elicits the risk of over-design in soft real-time system design. To deal with this risk, a novel concept of statistical design is proposed. Statistical design is the process of accurately accounting for and mitigating the effects of variation in part geometry and other environmental conditions, while at the same time optimizing a target performance factor. However, statistical design can be a very difficult and complex task when using classical mathematical methods. Thus, a simulation methodology to optimize the design is proposed in order to bridge the gap between real-time analysis and optimization for robust and reliable system design.
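The statistical-design idea (quantify the effect of variation instead of stacking worst cases) can be sketched with a tiny Monte Carlo; the function names and the Gaussian tolerance model are illustrative assumptions, not the article's simulation methodology:

```python
import random

def prob_meets_spec(nominal, tol_sd, spec_limit, perf, n=50000, seed=0):
    """Monte Carlo statistical design: sample the varying quantity
    around its nominal value, propagate it through the performance
    model `perf`, and estimate the probability of meeting the spec."""
    rng = random.Random(seed)
    hits = sum(perf(rng.gauss(nominal, tol_sd)) <= spec_limit
               for _ in range(n))
    return hits / n
```

Designing to, say, a 95% probability of meeting the spec instead of the absolute worst case is precisely what avoids the over-design risk of treating a soft real-time system as a hard one.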
International Nuclear Information System (INIS)
Samatova, Nagiza F; Branstetter, Marcia; Ganguly, Auroop R; Hettich, Robert; Khan, Shiraj; Kora, Guruprasad; Li, Jiangtian; Ma, Xiaosong; Pan, Chongle; Shoshani, Arie; Yoginath, Srikanth
2006-01-01
Ultrascale computing and high-throughput experimental technologies have enabled the production of scientific data about complex natural phenomena. With this opportunity comes a new problem: the massive quantities of data so produced. Answers to fundamental questions about the nature of those phenomena remain largely hidden in the produced data. The goal of this work is to provide a scalable, high-performance statistical data analysis framework to help scientists perform interactive analyses of these raw data to extract knowledge. Towards this goal we have been developing an open source parallel statistical analysis package, called Parallel R, that lets scientists employ a wide range of statistical analysis routines on high performance shared and distributed memory architectures without having to deal with the intricacies of parallelizing these routines.
Using student models to generate feedback in a university course on statistical sampling
Tacoma, S.G.; Drijvers, P.H.M.; Boon, P.B.J.
2017-01-01
Due to the complexity of the topic and a lack of individual guidance, introductory statistics courses at university are often challenging. Automated feedback might help to address this issue. In this study, we explore the use of student models to provide feedback. The research question is how
The Complexity of Solar and Geomagnetic Indices
Pesnell, W. Dean
2017-08-01
How far in advance can the sunspot number be predicted with any degree of confidence? Solar cycle predictions are needed to plan long-term space missions. Fleets of satellites circle the Earth collecting science data, protecting astronauts, and relaying information. All of these satellites are sensitive at some level to solar cycle effects. Statistical and time-series analyses of the sunspot number are often used to predict solar activity. These methods have not been completely successful, as the solar dynamo changes over time and one cycle's sunspots are not a faithful predictor of the next cycle's activity. In some ways, using these techniques is similar to asking whether the stock market can be predicted. It has been shown that the Dow Jones Industrial Average (DJIA) can be more accurately predicted during periods when it obeys certain statistical properties than at other times. The Hurst exponent is one such way to partition the data. Another measure of the complexity of a time series is the fractal dimension. We can use these measures of complexity to compare the sunspot number with other solar and geomagnetic indices. Our concentration is on how trends are removed by the various techniques, either internally or externally. Comparisons of the statistical properties of the various solar indices may guide us in understanding how the dynamo manifests in the various indices and in the Sun.
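The Hurst-exponent partitioning mentioned above can be made concrete with a short sketch. Below is a minimal rescaled-range (R/S) estimator in Python; the window scheme, sample sizes, and synthetic test series are illustrative choices, not those of the paper.

```python
import numpy as np

def hurst_rs(series, min_window=8):
    # Rescaled-range (R/S) estimate of the Hurst exponent:
    # slope of log(R/S) versus log(window size).
    x = np.asarray(series, dtype=float)
    n = len(x)
    sizes, rs_means = [], []
    w = min_window
    while w <= n // 4:
        rs = []
        for start in range(0, n - w + 1, w):
            seg = x[start:start + w]
            dev = np.cumsum(seg - seg.mean())   # cumulative deviation from the window mean
            s = seg.std()
            if s > 0:
                rs.append((dev.max() - dev.min()) / s)
        sizes.append(w)
        rs_means.append(np.mean(rs))
        w *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_means), 1)
    return slope

rng = np.random.default_rng(42)
h_noise = hurst_rs(rng.standard_normal(8192))              # uncorrelated noise: H near 0.5
h_walk = hurst_rs(np.cumsum(rng.standard_normal(8192)))    # trending series: H near 1
```

A value near 0.5 indicates an uncorrelated series, while values near 1 indicate persistent trends, which is one reason trend removal matters when comparing solar indices.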
A weighted generalized score statistic for comparison of predictive values of diagnostic tests.
Kosinski, Andrzej S
2013-03-15
Positive and negative predictive values are important measures of a medical diagnostic test performance. We consider testing equality of two positive or two negative predictive values within a paired design in which all patients receive two diagnostic tests. The existing statistical tests for testing equality of predictive values are either Wald tests based on the multinomial distribution or the empirical Wald and generalized score tests within the generalized estimating equations (GEE) framework. As presented in the literature, these test statistics have considerably complex formulas without clear intuitive insight. We propose re-formulations that are mathematically equivalent but algebraically simple and intuitive. As the new re-formulation clearly shows, the generalized score statistic does not always reduce to the commonly used score statistic in the independent samples case. To alleviate this, we introduce a weighted generalized score (WGS) test statistic that incorporates the empirical covariance matrix with newly proposed weights. This statistic is simple to compute, always reduces to the score statistic in the independent samples situation, and preserves type I error better than the other statistics, as demonstrated by simulations. Thus, we believe that the proposed WGS statistic is the preferred statistic for testing equality of two predictive values and for corresponding sample size computations. The new formulas of the Wald statistics may be useful for easy computation of confidence intervals for differences of predictive values. The introduced concepts have potential to lead to development of the WGS test statistic in a general GEE setting. Copyright © 2012 John Wiley & Sons, Ltd.
Statistical distribution for generalized ideal gas of fractional-statistics particles
International Nuclear Information System (INIS)
Wu, Y.
1994-01-01
We derive the occupation-number distribution in a generalized ideal gas of particles obeying fractional statistics, including mutual statistics, by adopting a state-counting definition. When there is no mutual statistics, the statistical distribution interpolates between bosons and fermions, and respects a fractional exclusion principle (except for bosons). Anyons in a strong magnetic field at low temperatures constitute such a physical system. Applications to the thermodynamic properties of quasiparticle excitations in the Laughlin quantum Hall fluid are discussed
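Wu's occupation-number distribution can be evaluated numerically. The sketch below solves the transcendental equation w^g (1+w)^(1-g) = e^x, with x = (ε − μ)/kT, by bisection and returns n = 1/(w + g); it recovers Bose-Einstein at g = 0 and Fermi-Dirac at g = 1, with the "semion" case g = 1/2 interpolating between them.

```python
import math

def occupation(x, g):
    # Mean occupation number for exclusion-statistics parameter g in [0, 1],
    # where x = (epsilon - mu) / kT.  Solve w**g * (1+w)**(1-g) = exp(x),
    # then n = 1 / (w + g).
    if g == 0.0:
        return 1.0 / (math.exp(x) - 1.0)   # Bose-Einstein limit (requires x > 0)
    # f(w) = g*ln(w) + (1-g)*ln(1+w) - x is monotone increasing in w,
    # so bisection in log space converges to the unique root.
    lo, hi = 1e-300, 1e300
    for _ in range(200):
        mid = math.sqrt(lo * hi)
        f = g * math.log(mid) + (1.0 - g) * math.log1p(mid) - x
        if f > 0:
            hi = mid
        else:
            lo = mid
    return 1.0 / (math.sqrt(lo * hi) + g)

n_semion = occupation(1.0, 0.5)   # lies between the Bose and Fermi values
```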
Childhood Cancer Statistics – Graphs and Infographics
View discovery in OLAP databases through statistical combinatorial optimization
Energy Technology Data Exchange (ETDEWEB)
Hengartner, Nick W [Los Alamos National Laboratory]; Burke, John [PNNL]; Critchlow, Terence [PNNL]; Joslyn, Cliff [PNNL]; Hogan, Emilie [PNNL]
2009-01-01
OnLine Analytical Processing (OLAP) is a relational database technology providing users with rapid access to summary, aggregated views of a single large database, and is widely recognized for knowledge representation and discovery in high-dimensional relational databases. OLAP technologies provide intuitive and graphical access to the massively complex set of possible summary views available in large relational (SQL) structured data repositories. The capability of OLAP database software systems to handle data complexity comes at a high price for analysts, presenting them a combinatorially vast space of views of a relational database. We respond to the need to deploy technologies sufficient to allow users to guide themselves to areas of local structure by casting the space of 'views' of an OLAP database as a combinatorial object of all projections and subsets, and 'view discovery' as a search process over that lattice. We equip the view lattice with statistical information theoretical measures sufficient to support a combinatorial optimization process. We outline 'hop-chaining' as a particular view discovery algorithm over this object, wherein users are guided across a permutation of the dimensions by searching for successive two-dimensional views, pushing seen dimensions into an increasingly large background filter in a 'spiraling' search process. We illustrate this work in the context of data cubes recording summary statistics for radiation portal monitors at US ports.
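As a toy illustration of scoring candidate two-dimensional views with an information-theoretic measure (the hop-chaining algorithm itself involves considerably more machinery), one can rank column pairs of a flat fact table by mutual information; the table, columns, and data below are invented:

```python
import math
from collections import Counter
from itertools import combinations

def mutual_information(rows, i, j):
    # Estimate I(X_i; X_j) in bits from co-occurrence counts.
    n = len(rows)
    ci, cj, cij = Counter(), Counter(), Counter()
    for r in rows:
        ci[r[i]] += 1
        cj[r[j]] += 1
        cij[(r[i], r[j])] += 1
    mi = 0.0
    for (a, b), c in cij.items():
        p_ab = c / n
        mi += p_ab * math.log2(p_ab / ((ci[a] / n) * (cj[b] / n)))
    return mi

# Hypothetical fact table: (port, detector, alarm) -- alarm depends on detector only.
rows = [("A", "x", 1), ("A", "y", 0), ("B", "x", 1), ("B", "y", 0)] * 25
scores = {(i, j): mutual_information(rows, i, j)
          for i, j in combinations(range(3), 2)}
best = max(scores, key=scores.get)   # most informative 2-D view
```

Here the (detector, alarm) pair carries one full bit of mutual information while the other pairs carry none, so a greedy view-discovery step would surface that projection first.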
Xylanases of thermophilic bacteria from Icelandic hot springs
Energy Technology Data Exchange (ETDEWEB)
Pertulla, M.; Raettoe, M.; Viikari, L. [VTT, Biotechnical Lab., Espoo (Finland)]; Kondradsdottir, M. [Dept. of Biotechnology, Technological Inst. of Iceland, Reykjavik (Iceland)]; Kristjansson, J.K. [Dept. of Biotechnology, Technological Inst. of Iceland, Reykjavik (Iceland); Inst. of Biotechnology, Iceland Univ., Reykjavik (Iceland)]
1993-02-01
Thermophilic, aerobic bacteria isolated from Icelandic hot springs were screened for xylanase activity. Of 97 strains tested, 14 were found to be xylanase positive. Xylanase activities up to 12 nkat/ml were produced by these strains in shake flasks on xylan medium. The xylanases of the two strains producing the highest activities (ITI 36 and ITI 283) were similar with respect to temperature and pH optima (80 °C and pH 8.0). Xylanase production of strain ITI 36 was found to be induced by xylan and xylose. Xylanase activity of 24 nkat/ml was obtained with this strain in a laboratory-scale fermentor cultivation on xylose medium. β-Xylosidase activity was also detected in the culture filtrate. The thermal half-life of ITI 36 xylanase was 24 h at 70 °C. The highest production of sugars from hydrolysis of beech xylan was obtained at 70 °C, although xylan depolymerization was detected even up to 90 °C. (orig.).
Nonparametric statistical inference
Gibbons, Jean Dickinson
2010-01-01
Overall, this remains a very fine book suitable for a graduate-level course in nonparametric statistics. I recommend it for all people interested in learning the basic ideas of nonparametric statistical inference. -Eugenia Stoimenova, Journal of Applied Statistics, June 2012 … one of the best books available for a graduate (or advanced undergraduate) text for a theory course on nonparametric statistics. … a very well-written and organized book on nonparametric statistics, especially useful and recommended for teachers and graduate students. -Biometrics, 67, September 2011. This excellently presente
Statistics for Research
Dowdy, Shirley; Chilko, Daniel
2011-01-01
Praise for the Second Edition "Statistics for Research has other fine qualities besides superior organization. The examples and the statistical methods are laid out with unusual clarity by the simple device of using special formats for each. The book was written with great care and is extremely user-friendly."-The UMAP Journal Although the goals and procedures of statistical research have changed little since the Second Edition of Statistics for Research was published, the almost universal availability of personal computers and statistical computing application packages have made it possible f
Head First Statistics
Griffiths, Dawn
2009-01-01
Wouldn't it be great if there were a statistics book that made histograms, probability distributions, and chi square analysis more enjoyable than going to the dentist? Head First Statistics brings this typically dry subject to life, teaching you everything you want and need to know about statistics through engaging, interactive, and thought-provoking material, full of puzzles, stories, quizzes, visual aids, and real-world examples. Whether you're a student, a professional, or just curious about statistical analysis, Head First's brain-friendly formula helps you get a firm grasp of statistics
Statistical considerations on safety analysis
International Nuclear Information System (INIS)
Pal, L.; Makai, M.
2004-01-01
statement is true. In some cases the statistical aspects of safety are misused or misinterpreted; for example, the number of runs prescribed for several outputs is correct only for statistically independent outputs. We do not know the probability distribution of the output variables subjected to safety limitations. At the same time, for some asymmetric distributions the 0.95/0.95 methodology simply fails: if we repeated the calculations, in many cases we would get a value higher than the basic value, which means that limit violation becomes more and more probable in the repeated analysis. Consistent application of order statistics, or application of the sign test, may offer a way out of the present situation. The authors are also convinced that efforts should be made to study the statistics of the output variables and the occurrence of chaos in the analyzed cases. All these observations should influence the application of best-estimate methods in safety analysis, and they underline the opinion that any realistic modeling and simulation of complex systems must include the probabilistic features of the system and the environment
Hartmann, Alexander K.; Weigt, Martin
2005-10-01
A concise, comprehensive introduction to the topic of statistical physics of combinatorial optimization, bringing together theoretical concepts and algorithms from computer science with analytical methods from physics. The result bridges the gap between statistical physics and combinatorial optimization, investigating problems taken from theoretical computing, such as the vertex-cover problem, with the concepts and methods of theoretical physics. The authors cover rapid developments and analytical methods that are both extremely complex and spread by word-of-mouth, providing all the necessary basics in required detail. Throughout, the algorithms are shown with examples and calculations, while the proofs are given in a way suitable for graduate students, post-docs, and researchers. Ideal for newcomers to this young, multidisciplinary field.
Statistical and Visualization Data Mining Tools for Foundry Production
Directory of Open Access Journals (Sweden)
M. Perzyk
2007-07-01
Full Text Available In recent years a rapid development of a new, interdisciplinary knowledge area, called data mining, has been observed. Its main task is extracting useful information from previously collected large amounts of data. The main possibilities and potential applications of data mining in manufacturing industry are characterized. The main types of data mining techniques are briefly discussed, including statistical, artificial intelligence, database and visualization tools. The statistical methods and visualization methods are presented in more detail, showing their general possibilities, advantages, and characteristic examples of applications in foundry production. Results of the author's research are presented, aimed at validation of selected statistical tools which can be easily and effectively used in manufacturing industry. A performance analysis of ANOVA- and contingency-table-based methods, dedicated to determination of the most significant process parameters as well as to detection of possible interactions among them, has been made. Several numerical tests have been performed using simulated data sets with assumed hidden relationships, as well as some real data, related to the strength of ductile cast iron, collected in a foundry. It is concluded that the statistical methods offer relatively easy and fairly reliable tools for extraction of that type of knowledge about foundry manufacturing processes. However, further research is needed, aimed at explanation of some imperfections of the investigated tools as well as assessment of their validity for more complex tasks.
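The ANOVA-based significance screening described above can be sketched in a few lines. The tensile-strength samples below are invented for illustration (the study used real foundry measurements), and the function is the textbook one-way F statistic rather than the authors' tooling:

```python
import numpy as np

def one_way_anova_F(*groups):
    # Classical one-way ANOVA F statistic:
    # between-group mean square over within-group mean square.
    all_data = np.concatenate([np.asarray(g, dtype=float) for g in groups])
    grand = all_data.mean()
    k = len(groups)
    n = len(all_data)
    ss_between = sum(len(g) * (np.mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(((np.asarray(g) - np.mean(g)) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical tensile-strength samples (MPa) at three melt temperatures.
low  = [402.0, 398.5, 401.2, 399.8, 400.5]
mid  = [405.1, 406.3, 404.8, 405.9, 406.0]
high = [411.2, 410.5, 412.0, 409.8, 411.5]
F = one_way_anova_F(low, mid, high)
```

A large F relative to the F-distribution critical value flags the grouping parameter (here, melt temperature) as significant, which is exactly the screening role ANOVA plays in the study.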
A Simplified Algorithm for Statistical Investigation of Damage Spreading
International Nuclear Information System (INIS)
Gecow, Andrzej
2009-01-01
On the way to simulating the adaptive evolution of a complex system describing a living object or a human-developed project, a fitness should be defined on node states or network external outputs. Feedbacks lead to circular attractors of these states or outputs, which make it difficult to define a fitness. The main statistical effects of the adaptive condition are the result of the small-change tendency, and to appear they only need a statistically correct size of the damage initiated by an evolutionary change of the system. This observation allows one to cut the loops of feedbacks and, in effect, to obtain a particular statistically correct state instead of a long circular attractor, which in the quenched model is expected for a chaotic network with feedback. Defining fitness on such states is simple. We calculate only damaged nodes, and only once. Such an algorithm is optimal for investigation of damage spreading, i.e. the statistical connections of structural parameters of the initial change with the size of the effected damage. It is a reversed-annealed method: functions and states (signals) may be randomly substituted, but connections are important and are preserved. The small damages important for adaptive evolution are correctly depicted, in comparison to the Derrida annealed approximation, which expects equilibrium levels for large networks. The algorithm indicates these levels correctly. The relevant program in Pascal, which executes the algorithm for a wide range of parameters, can be obtained from the author.
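The flavor of a damage-spreading measurement (though not the author's Pascal program or the reversed-annealed bookkeeping) can be sketched as a one-step annealed experiment on a random Boolean network, with node functions drawn lazily as inputs are encountered:

```python
import random

def damage_size(n_nodes, k, trials=200, seed=1):
    # One-step damage spreading in a random Boolean network with k inputs
    # per node.  Flip one node, update all nodes once, count differing
    # outputs; Boolean functions are memoized so both runs share them.
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        inputs = [[rng.randrange(n_nodes) for _ in range(k)]
                  for _ in range(n_nodes)]
        funcs = [{} for _ in range(n_nodes)]   # lazily drawn truth tables

        def evaluate(state):
            out = []
            for i in range(n_nodes):
                key = tuple(state[j] for j in inputs[i])
                if key not in funcs[i]:
                    funcs[i][key] = rng.randrange(2)
                out.append(funcs[i][key])
            return out

        state = [rng.randrange(2) for _ in range(n_nodes)]
        damaged = list(state)
        damaged[rng.randrange(n_nodes)] ^= 1   # initial change: flip one node
        a, b = evaluate(state), evaluate(damaged)
        total += sum(x != y for x, y in zip(a, b))
    return total / trials

d = damage_size(50, 2)   # expected one-step damage is roughly k/2
```

Each node sees the flipped node among its inputs with probability about k/n and then differs with probability 1/2, so the mean one-step damage is about k/2; iterating such steps is what distinguishes the chaotic and ordered regimes the abstract refers to.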
Software for statistical data analysis used in Higgs searches
International Nuclear Information System (INIS)
Gumpert, Christian; Moneta, Lorenzo; Cranmer, Kyle; Kreiss, Sven; Verkerke, Wouter
2014-01-01
The analysis and interpretation of data collected by the Large Hadron Collider (LHC) requires advanced statistical tools in order to quantify the agreement between observation and theoretical models. RooStats is a project providing a statistical framework for data analysis with the focus on discoveries, confidence intervals and combination of different measurements in both Bayesian and frequentist approaches. It employs the RooFit data modelling language where mathematical concepts such as variables, (probability density) functions and integrals are represented as C++ objects. RooStats and RooFit rely on the persistency technology of the ROOT framework. The usage of a common data format enables the concept of digital publishing of complicated likelihood functions. The statistical tools have been developed in close collaboration with the LHC experiments to ensure their applicability to real-life use cases. Numerous physics results have been produced using the RooStats tools, with the discovery of the Higgs boson by the ATLAS and CMS experiments being certainly the most popular among them. We will discuss tools currently used by LHC experiments to set exclusion limits, to derive confidence intervals and to estimate discovery significances based on frequentist statistics and the asymptotic behaviour of likelihood functions. Furthermore, new developments in RooStats and performance optimisation necessary to cope with complex models depending on more than 1000 variables will be reviewed
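For a concrete flavor of the frequentist machinery, the widely used asymptotic discovery significance for a single counting channel (a textbook likelihood-ratio formula, not actual RooStats code) is easy to evaluate:

```python
import math

def z_discovery(n_obs, b):
    # Asymptotic discovery significance for a counting experiment with
    # observed count n_obs and known expected background b:
    #   Z = sqrt(2 * (n * ln(n / b) - (n - b)))
    # derived from the profile-likelihood ratio; no background uncertainty.
    if n_obs <= b:
        return 0.0
    return math.sqrt(2.0 * (n_obs * math.log(n_obs / b) - (n_obs - b)))

z = z_discovery(150, 100)                 # 50-event excess over 100 expected
naive = (150 - 100) / math.sqrt(100)      # simple s / sqrt(b) estimate
```

Note that the likelihood-based Z is smaller than the naive s/sqrt(b) for an upward fluctuation, one reason the asymptotic formulas are preferred when quoting significances.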
Potirakis, Stelios M.; Contoyiannis, Yiannis; Kopanas, John; Kalimeris, Anastasios; Antonopoulos, George; Peratzakis, Athanasios; Eftaxias, Konstantinos; Nomicos, Costantinos
2014-05-01
A phenomenon is called "complex" when it refers to a system whose phenomenological laws, which describe the global behavior of the system, are not necessarily directly related to the "microscopic" laws that regulate the evolution of its elementary parts. The field of study of complex systems considers that the dynamics of complex systems are founded on universal principles that may be used to describe disparate problems ranging from particle physics to the economies of societies. Several authors have suggested that earthquake (EQ) dynamics can be analyzed within mathematical frameworks similar to those of economic dynamics and neurodynamics. A central property of the EQ preparation process is the occurrence of coherent large-scale collective behavior with a very rich structure, resulting from repeated nonlinear interactions among the constituents of the system. As a result, nonextensive statistics is an appropriate, physically meaningful tool for the study of EQ dynamics. Since fracture-induced electromagnetic (EM) precursors are observable manifestations of the underlying EQ preparation process, the analysis of a fracture-induced EM precursor observed prior to the occurrence of a large EQ can also be conducted within the nonextensive statistics framework. Within the frame of the investigation for universal principles that may hold for different dynamical systems related to the genesis of extreme events, we present here statistical similarities of the pre-earthquake EM emissions related to an EQ with the pre-ictal electrical brain activity related to an epileptic seizure, and with the pre-crisis economic observables related to the collapse of a share. It is demonstrated that all three dynamical systems' observables can be analyzed in the frame of nonextensive statistical mechanics, while the frequency-size relations of appropriately defined "events" that precede the extreme event related to each one of these different systems present striking quantitative
Blind identification and separation of complex-valued signals
Moreau, Eric
2013-01-01
Blind identification consists of estimating a multi-dimensional system only through the use of its output, and source separation, the blind estimation of the inverse of the system. Estimation is generally carried out using different statistics of the output. The authors of this book consider the blind identification and source separation problem in the complex-domain, where the available statistical properties are richer and include non-circularity of the sources - underlying components. They define identifiability conditions and present state-of-the-art algorithms that are based on algebraic methods as well as iterative algorithms based on maximum likelihood theory. Contents 1. Mathematical Preliminaries. 2. Estimation by Joint Diagonalization. 3. Maximum Likelihood ICA. About the Authors Eric Moreau is Professor of Electrical Engineering at the University of Toulon, France. His research interests concern statistical signal processing, high order statistics and matrix/tensor decompositions with applic...
A Nineteenth Century Statistical Society that Abandoned Statistics
Stamhuis, I.H.
2007-01-01
In 1857, a Statistical Society was founded in the Netherlands. Within this society, statistics was considered a systematic, quantitative, and qualitative description of society. In the course of time, the society attracted a wide and diverse membership, although the number of physicians on its rolls
Korenchenko, Anna E.; Vorontsov, Alexander G.; Gelchinski, Boris R.; Sannikov, Grigorii P.
2018-04-01
We discuss the problem of dimer formation during the homogeneous nucleation of atomic metal vapor in an inert gas environment. We simulated nucleation with molecular dynamics and carried out a statistical analysis of double- and triple-atomic collisions as the two ways of long-lived diatomic complex formation. A close pair of atoms with a lifetime greater than the mean time interval between atom-atom collisions is called a long-lived diatomic complex. We found that double- and triple-atomic collisions gave approximately the same probabilities of long-lived diatomic complex formation, but the internal energy of the resulting state was essentially lower in the second case. Some diatomic complexes formed in three-particle collisions are stable enough to be a critical nucleus.
Quantum communication complexity advantage implies violation of a Bell inequality
Buhrman, Harry; Czekaj, Łukasz; Grudka, Andrzej; Horodecki, Michał; Horodecki, Paweł; Markiewicz, Marcin; Speelman, Florian; Strelchuk, Sergii
2016-01-01
We obtain a general connection between a large quantum advantage in communication complexity and Bell nonlocality. We show that given any protocol offering a sufficiently large quantum advantage in communication complexity, there exists a way of obtaining measurement statistics that violate some Bell inequality. Our main tool is port-based teleportation. If the gap between quantum and classical communication complexity can grow arbitrarily large, the ratio of the quantum value to the classical value of the Bell quantity becomes unbounded with the increase in the number of inputs and outputs. PMID:26957600
On Generalized Stam Inequalities and Fisher–Rényi Complexity Measures
Directory of Open Access Journals (Sweden)
Steeve Zozor
2017-09-01
Full Text Available Information-theoretic inequalities play a fundamental role in numerous scientific and technological areas (e.g., estimation and communication theories, signal and information processing, quantum physics, …), as they generally express the impossibility of having a complete description of a system via a finite number of information measures. In particular, they gave rise to the design of various quantifiers (statistical complexity measures) of the internal complexity of a (quantum) system. In this paper, we introduce a three-parametric Fisher–Rényi complexity, named the (p, β, λ)-Fisher–Rényi complexity, based on both a two-parametric extension of the Fisher information and the Rényi entropies of a probability density function ρ characteristic of the system. This complexity measure quantifies the combined balance of the spreading and the gradient contents of ρ, and has the three main properties of a statistical complexity: invariance under translation and scaling transformations, and a universal bound from below. The latter is proved by generalizing the Stam inequality, which lower-bounds the product of the Shannon entropy power and the Fisher information of a probability density function. An extension of this inequality was already proposed by Bercher and Lutwak, a particular case of the general one, where the three parameters are linked, allowing one to determine the sharp lower bound and the associated probability density with minimal complexity. Using the notion of differential-escort deformation, we are able to determine the sharp bound of the complexity measure even when the three parameters are decoupled (in a certain range). We determine as well the distribution that saturates the inequality: the (p, β, λ)-Gaussian distribution, which involves an inverse incomplete beta function. Finally, the complexity measure is calculated for various quantum-mechanical states of the harmonic and hydrogenic systems, which are the two main
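The classical Stam inequality underlying these bounds, namely that the product of the Shannon entropy power and the Fisher information is at least 1 with equality for Gaussians, can be checked numerically on a grid (a sketch: the grid range, step size, and integration scheme are ad hoc choices):

```python
import numpy as np

def entropy_power_and_fisher(rho, dx):
    # Differential entropy h = -int rho ln rho, entropy power
    # N = exp(2h) / (2*pi*e), and Fisher information J = int rho'^2 / rho,
    # all approximated by Riemann sums on a uniform grid.
    h = -np.sum(rho * np.log(rho)) * dx
    N = np.exp(2.0 * h) / (2.0 * np.pi * np.e)
    drho = np.gradient(rho, dx)
    J = np.sum(drho ** 2 / rho) * dx
    return N, J

x = np.linspace(-10.0, 10.0, 20001)
dx = x[1] - x[0]

gauss = np.exp(-x**2 / 2.0) / np.sqrt(2.0 * np.pi)
N, J = entropy_power_and_fisher(gauss, dx)
stam_gauss = N * J          # equals 1: the Stam equality case

laplace = 0.5 * np.exp(-np.abs(x))
Nl, Jl = entropy_power_and_fisher(laplace, dx)
stam_laplace = Nl * Jl      # strictly greater than 1 for a non-Gaussian density
```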
Contributions to sampling statistics
Conti, Pier; Ranalli, Maria
2014-01-01
This book contains a selection of the papers presented at the ITACOSM 2013 Conference, held in Milan in June 2013. ITACOSM is the bi-annual meeting of the Survey Sampling Group S2G of the Italian Statistical Society, intended as an international forum of scientific discussion on the developments of theory and application of survey sampling methodologies and applications in human and natural sciences. The book gathers research papers carefully selected from both invited and contributed sessions of the conference. The whole book appears to be a relevant contribution to various key aspects of sampling methodology and techniques; it deals with some hot topics in sampling theory, such as calibration, quantile-regression and multiple frame surveys, and with innovative methodologies in important topics of both sampling theory and applications. Contributions cut across current sampling methodologies such as interval estimation for complex samples, randomized responses, bootstrap, weighting, modeling, imputati...
Minimized state complexity of quantum-encoded cryptic processes
Riechers, Paul M.; Mahoney, John R.; Aghamohammadi, Cina; Crutchfield, James P.
2016-05-01
The predictive information required for proper trajectory sampling of a stochastic process can be more efficiently transmitted via a quantum channel than a classical one. This recent discovery allows quantum information processing to drastically reduce the memory necessary to simulate complex classical stochastic processes. It also points to a new perspective on the intrinsic complexity that nature must employ in generating the processes we observe. The quantum advantage increases with codeword length: the length of process sequences used in constructing the quantum communication scheme. In analogy with the classical complexity measure, statistical complexity, we use this reduced communication cost as an entropic measure of state complexity in the quantum representation. Previously difficult to compute, the quantum advantage is expressed here in closed form using spectral decomposition. This allows for efficient numerical computation of the quantum-reduced state complexity at all encoding lengths, including infinite. Additionally, it makes clear how finite-codeword reduction in state complexity is controlled by the classical process's cryptic order, and it allows asymptotic analysis of infinite-cryptic-order processes.
Energy Technology Data Exchange (ETDEWEB)
Cohen, E.G.D.
1985-01-01
The following topics were dealt with: walks, walls and ordering in low dimensions; renormalisation of fluids; wetting transition; phases and phase transitions; liquid-vapour interface; statistical mechanics in lattice gauge theory; hydrodynamic instabilities; complex dynamics and chaos; dynamical transitions; phase separation and pattern formation; kinetic theory of clustering; localisation.
Consolidity analysis for fully fuzzy functions, matrices, probability and statistics
Directory of Open Access Journals (Sweden)
Walaa Ibrahim Gabr
2015-03-01
Full Text Available The paper presents a comprehensive review of the know-how for developing the systems consolidity theory for modeling, analysis, optimization and design in a fully fuzzy environment. The development of systems consolidity theory included its extension to handle new functions of different dimensionalities, fuzzy analytic geometry, fuzzy vector analysis, functions of fuzzy complex variables, ordinary differentiation of fuzzy functions and partial fractions of fuzzy polynomials. On the other hand, the handling of fuzzy matrices covered determinants of fuzzy matrices, the eigenvalues of fuzzy matrices, and solving least-squares fuzzy linear equations. The approach was also demonstrated to be applicable in a systematic way to new fuzzy probabilistic and statistical problems. This included extending the conventional probabilistic and statistical analysis to handle fuzzy random data. Applications also covered the consolidity of fuzzy optimization problems. Various solved numerical examples have demonstrated that the new consolidity concept is highly effective in solving in a compact form the propagation of fuzziness in linear, nonlinear, multivariable and dynamic problems with different types of complexity. Finally, it is demonstrated that the suggested fuzzy mathematics can be easily embedded within normal mathematics through building a special fuzzy-functions library inside the computational Matlab toolbox or using other similar software languages.
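As background for the fully fuzzy computations described (this is standard triangular fuzzy-number arithmetic, not the paper's consolidity machinery), a minimal sketch of how fuzziness propagates through simple operations:

```python
# A triangular fuzzy number is a tuple (a, b, c): membership rises
# linearly from a to the peak b, then falls linearly from b to c.

def tfn_add(p, q):
    # Sum of two triangular fuzzy numbers (component-wise).
    return tuple(x + y for x, y in zip(p, q))

def tfn_scale(k, p):
    # Crisp scalar times a triangular fuzzy number; a negative factor
    # reverses the support interval.
    a, b, c = (k * v for v in p)
    return (a, b, c) if k >= 0 else (c, b, a)

def alpha_cut(p, alpha):
    # Interval of values with membership >= alpha, for alpha in [0, 1].
    a, b, c = p
    return (a + alpha * (b - a), c - alpha * (c - b))

u = (1.0, 2.0, 3.0)   # "about 2"
v = (0.5, 1.0, 2.0)   # "about 1", skewed right
s = tfn_add(u, v)     # support widens: fuzziness accumulates
```

Even this toy example shows the propagation effect the paper studies: each operation widens the support, so a long computation chain can drift toward heavy fuzziness unless it is controlled.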
Statistical calculation of complete events in medium-energy nuclear collisions
International Nuclear Information System (INIS)
Randrup, J.
1984-01-01
Several heavy-ion accelerators throughout the world are presently able to deliver beams of heavy nuclei with kinetic energies in the range from tens to hundreds of MeV per nucleon, the so-called medium or intermediate energy range. At such energies a large number of final channels are open, each consisting of many nuclear fragments. The disassembly of the collision system is expected to be a very complicated process and a detailed dynamical description is beyond their present capability. However, by virtue of the complexity of the process, statistical considerations may be useful. A statistical description of the disassembly yields the least biased expectations about the outcome of a collision process and provides a meaningful reference against which more specific dynamical models, as well as the data, can be discussed. This lecture presents the essential tools for formulating a statistical model for the nuclear disassembly process. The authors consider the quick disassembly (explosion) of a hot nuclear system, a so-called source, into multifragment final states, which compete according to their statistical weight. First some useful notation is introduced. Then the expressions for exclusive and inclusive distributions are given and the factorization of an exclusive distribution into inclusive ones is carried out. In turn, the grand canonical approximation for one-fragment inclusive distributions is introduced. Finally, it is outlined how to generate a statistical sample of complete final states. On this basis, a model for statistical simulation of complete events in medium-energy nuclear collisions has been developed
Statistical data analysis using SAS intermediate statistical methods
Marasinghe, Mervyn G
2018-01-01
The aim of this textbook (previously titled SAS for Data Analytics) is to teach the use of SAS for statistical analysis of data for advanced undergraduate and graduate students in statistics, data science, and disciplines involving analyzing data. The book begins with an introduction beyond the basics of SAS, illustrated with non-trivial, real-world, worked examples. It proceeds to SAS programming and applications, SAS graphics, statistical analysis of regression models, analysis of variance models, analysis of variance with random and mixed effects models, and then takes the discussion beyond regression and analysis of variance to conclude. Pedagogically, the authors introduce theory and methodological basis topic by topic, present a problem as an application, followed by a SAS analysis of the data provided and a discussion of results. The text focuses on applied statistical problems and methods. Key features include: end of chapter exercises, downloadable SAS code and data sets, and advanced material suitab...
Online incidental statistical learning of audiovisual word sequences in adults: a registered report.
Kuppuraj, Sengottuvel; Duta, Mihaela; Thompson, Paul; Bishop, Dorothy
2018-02-01
Statistical learning has been proposed as a key mechanism in language learning. Our main goal was to examine whether adults are capable of simultaneously extracting statistical dependencies in a task where stimuli include a range of structures amenable to statistical learning within a single paradigm. We devised an online statistical learning task using real-word auditory-picture sequences that vary in two dimensions: (i) predictability and (ii) adjacency of dependent elements. This task was followed by an offline recall task to probe learning of each sequence type. We registered three hypotheses with specific predictions. First, adults would extract regular patterns from a continuous stream (an effect of grammaticality). Second, within grammatical conditions, they would show differential speeding up for each condition as a function of the statistical complexity of the condition and exposure. Third, our novel approach to measuring online statistical learning would be reliable in showing individual differences in statistical learning ability. Further, we explored the relation between statistical learning and a measure of verbal short-term memory (STM). Forty-two participants were tested and retested after an interval of at least 3 days on our novel statistical learning task. We analysed the reaction time data using a novel regression discontinuity approach. Consistent with prediction, participants showed a grammaticality effect, agreeing with the predicted order of difficulty for learning different statistical structures. Furthermore, a learning index from the task showed acceptable test-retest reliability (r = 0.67). However, STM did not correlate with statistical learning. We discuss the findings, noting the benefits of online measures in tracking the learning process.
Position on mouse chromosome 1 of a gene that controls resistance to Salmonella typhimurium.
Taylor, B A; O'Brien, A D
1982-06-01
Ity is a gene which regulates the magnitude of Salmonella typhimurium growth in murine tissues and, hence, the innate salmonella resistance of mice. The results of a five-point backcross clearly showed that the correct gene order on chromosome 1 is fz-Idh-1-Ity-ln-Pep-3.
Quantum mechanics: why complex Hilbert space?
Cassinelli, G; Lahti, P
2017-11-13
We outline a programme for an axiomatic reconstruction of quantum mechanics based on the statistical duality of states and effects that combines the use of a theorem of Solér with the idea of symmetry. We also discuss arguments favouring the choice of the complex field. This article is part of the themed issue 'Second quantum revolution: foundational questions'. © 2017 The Author(s).
Sun, Hokeun; Wang, Shuang
2014-08-15
Existing association methods for rare variants from sequencing data have focused on aggregating variants in a gene or a genetic region, because analysing individual rare variants is underpowered. However, these existing methods cannot identify which of the rare variants in a gene or a genetic region are associated with the complex diseases or traits. Once phenotypic associations of a gene or a genetic region are identified, the natural next step in an association study with sequencing data is to locate the susceptible rare variants within the gene or the genetic region. In this article, we propose a power set-based statistical selection procedure that is able to identify the locations of the potentially susceptible rare variants within a disease-related gene or a genetic region. The selection performance of the proposed procedure was evaluated through simulation studies, where we demonstrated its feasibility and superior power over several comparable existing methods. In particular, the proposed method is able to handle the mixed effects when both risk and protective variants are present in a gene or a genetic region. The proposed selection procedure was also applied to the sequence data on the ANGPTL gene family from the Dallas Heart Study to identify potentially susceptible rare variants within the trait-related genes. An R package 'rvsel' can be downloaded from http://www.columbia.edu/~sw2206/ and http://statsun.pusan.ac.kr. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Algorithm for image retrieval based on edge gradient orientation statistical code.
Zeng, Jiexian; Zhao, Yonggang; Li, Weiye; Fu, Xiang
2014-01-01
The image edge gradient direction not only contains important shape information but is also simple and cheap to compute. Because edge gradient direction histograms and the edge direction autocorrelogram are not rotation invariant, we put forward an image retrieval algorithm based on an edge gradient orientation statistical code (hereinafter referred to as EGOSC), which carries over the statistical treatment of eight-neighborhood chain-code edge directions to the statistics of the edge gradient direction. First, we construct the n-direction vector and impose a maximal-summation constraint on the EGOSC to ensure that the algorithm is effectively rotation invariant. Then, we use the Euclidean distance of the edge gradient direction entropy to measure shape similarity, so that the method is insensitive to scaling, color, and illumination change. The experimental results and the algorithm analysis demonstrate that the algorithm can be used for content-based image retrieval and gives good retrieval results.
A Review of Modeling Bioelectrochemical Systems: Engineering and Statistical Aspects
Directory of Open Access Journals (Sweden)
Shuai Luo
2016-02-01
Full Text Available Bioelectrochemical systems (BES) are promising technologies to convert organic compounds in wastewater to electrical energy through a series of complex physical-chemical, biological and electrochemical processes. Representative BES such as microbial fuel cells (MFCs) have been studied and advanced for energy recovery. Substantial experimental and modeling efforts have been made to investigate the processes involved in electricity generation toward the improvement of BES performance for practical applications. However, many parameters can potentially affect these processes, making optimization of system performance hard to achieve. Mathematical models, including engineering models and statistical models, are powerful tools to help understand the interactions among the parameters in BES and to optimize BES configuration and operation. This review paper aims to introduce and discuss recent developments in BES modeling from engineering and statistical aspects, including analysis of model structure, description of application cases and sensitivity analysis of various parameters. It is expected to serve as a compass for integrating engineering and statistical modeling strategies to improve model accuracy for BES development.
Bayesian statistic methods and their application in probabilistic simulation models
Directory of Open Access Journals (Sweden)
Sergio Iannazzo
2007-03-01
Full Text Available Bayesian statistical methods are facing a rapidly growing level of interest and acceptance in the field of health economics. The reasons for this success are probably to be found in the theoretical foundations of the discipline, which make these techniques more appealing for decision analysis. To this should be added modern IT progress, which has produced several flexible and powerful statistical software frameworks. Among them, probably one of the most notable is the BUGS language project and its standalone application for MS Windows, WinBUGS. The scope of this paper is to introduce the subject and to show some interesting applications of WinBUGS in developing complex economic models based on Markov chains. The advantages of this approach reside in the elegance of the code produced and in its capability to easily develop probabilistic simulations. Moreover, an example of the integration of Bayesian inference models in a Markov model is shown. This last feature lets the analyst conduct statistical analyses on the available sources of evidence and exploit them directly as inputs to the economic model.
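The paper's central idea, propagating Bayesian parameter uncertainty through a Markov model, can be sketched outside WinBUGS as well. Below is a minimal Python analogue: a hypothetical three-state cohort model whose transition probabilities are drawn from Beta posteriors. All state names, counts and priors are invented for illustration and are not taken from the paper.

```python
import random

# Illustrative only: a 3-state Markov cohort model (Healthy -> Sick ->
# Dead) whose transition probabilities are drawn from Beta posteriors,
# mimicking how Bayesian inference results feed a probabilistic
# economic model. State names, counts and priors are invented.

random.seed(42)

def simulate_cohort(p_hs, p_sd, cycles=20):
    """Propagate cohort state occupancy over discrete cycles."""
    healthy, sick, dead = 1.0, 0.0, 0.0
    for _ in range(cycles):
        healthy, sick, dead = (
            healthy * (1 - p_hs),                # stay healthy
            healthy * p_hs + sick * (1 - p_sd),  # become or stay sick
            dead + sick * p_sd,                  # die (absorbing state)
        )
    return healthy, sick, dead

# 2000 posterior draws: Beta(successes + 1, failures + 1) posteriors
# from hypothetical data (10/100 healthy became sick; 5/50 sick died).
draws = [
    simulate_cohort(random.betavariate(11, 91), random.betavariate(6, 46))
    for _ in range(2000)
]
mean_dead = sum(d[2] for d in draws) / len(draws)
print(round(mean_dead, 2))  # mean probability of death after 20 cycles
```

Sampling the transition probabilities rather than fixing them is what makes the simulation probabilistic: the spread of `draws` quantifies how parameter uncertainty propagates into the model output.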
THE INTEGRATED SHORT-TERM STATISTICAL SURVEYS: EXPERIENCE OF NBS IN MOLDOVA
Directory of Open Access Journals (Sweden)
Oleg CARA
2012-07-01
Full Text Available The users’ rising need for relevant, reliable, coherent and timely data for the early diagnosis of economic vulnerability and of turning points in business cycles, especially during a financial and economic crisis, calls for a prompt, coordinated answer from statistical institutions. High-quality short-term statistics are of special interest for emerging market economies, such as the Moldavian one, which are extremely vulnerable when facing economic recession. Answering the challenge of producing a coherent and adequate image of economic activity, by using the system of indicators and definitions efficiently applied at the level of the European Union, the National Bureau of Statistics (NBS) of the Republic of Moldova has launched the development of an integrated system of short-term statistics (STS) based on advanced international experience. Thus, in 2011, NBS implemented the integrated statistical survey on STS based on consistent concepts harmonized with EU standards. The integration of the production processes, which were previously separated, rests on a common technical infrastructure and on standardized procedures and techniques for data production. This complex survey with its holistic approach has consolidated statistical data quality, now comparable at the European level, and significantly reduced the information burden on business units, especially small ones. The reform of STS based on the integrated survey has been possible thanks to the consistent methodological and practical support given to NBS by the National Institute of Statistics (INS) of Romania, for which we thank our Romanian colleagues.
Blakemore, J S
1962-01-01
Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co...
Hall, Michelle G; Mattingley, Jason B; Dux, Paul E
2015-08-01
The brain exploits redundancies in the environment to efficiently represent the complexity of the visual world. One example of this is ensemble processing, which provides a statistical summary of elements within a set (e.g., mean size). Another is statistical learning, which involves the encoding of stable spatial or temporal relationships between objects. It has been suggested that ensemble processing over arrays of oriented lines disrupts statistical learning of structure within the arrays (Zhao, Ngo, McKendrick, & Turk-Browne, 2011). Here we asked whether ensemble processing and statistical learning are mutually incompatible, or whether this disruption might occur because ensemble processing encourages participants to process the stimulus arrays in a way that impedes statistical learning. In Experiment 1, we replicated Zhao and colleagues' finding that ensemble processing disrupts statistical learning. In Experiments 2 and 3, we found that statistical learning was unimpaired by ensemble processing when task demands necessitated (a) focal attention to individual items within the stimulus arrays and (b) the retention of individual items in working memory. Together, these results are consistent with an account suggesting that ensemble processing and statistical learning can operate over the same stimuli given appropriate stimulus processing demands during exposure to regularities. (c) 2015 APA, all rights reserved.
Aeromagnetometry and aeroradiometry of Gabal El Kahfa ring complex, Eastern Desert, Egypt
International Nuclear Information System (INIS)
Meleik, M.L.; Ammar, A.A.; Fouad, K.M.; Rabie, S.I.
1988-01-01
The existence of Gabal El Kahfa ring complex, located in the Eastern Desert of Egypt, has been ascertained from aeromagnetic and aeroradiometric survey data as well as by aerial photography. Two maps have been constructed: one for the net aerial radiometric measurements reduced to ground level and one for the aerial magnetic data corrected for the regional normal gradient of the earth's magnetic field. The aeroradioactivity data have been interpreted geologically and analyzed statistically to outline the various radiometric units and compute their characteristic statistics. The ring complex showed a circular radiometric feature and yielded a radiometric mean background and standard deviation of 6.48 and 0.89 μR/h, respectively. Moreover, the observations followed a normal distribution. The aeromagnetic data have been reduced to the north magnetic pole, then filtered to produce the regional- and residual-component maps. Statistical trend analysis was conducted for the tectonic lineaments resulting from the three magnetic maps, to define the structural framework of the area under study. This statistical study proved the existence of trends in the east-west, northwest-southeast, northeast-southwest and north-south directions. The ring complex is characterized by a circular magnetic feature, whose average relief is 2150 nT. It is included within a high east-west trending magnetic zone, which represents a deep-seated uplift or anticline bordered on all sides by faults of different trends. 14 refs., 14 figs
Bayes linear statistics, theory & methods
Goldstein, Michael
2007-01-01
Bayesian methods combine information available from data with any prior information available from expert knowledge. The Bayes linear approach follows this path, offering a quantitative structure for expressing beliefs, and systematic methods for adjusting these beliefs, given observational data. The methodology differs from the full Bayesian methodology in that it establishes simpler approaches to belief specification and analysis based around expectation judgements. Bayes Linear Statistics presents an authoritative account of this approach, explaining the foundations, theory, methodology, and practicalities of this important field. The text provides a thorough coverage of Bayes linear analysis, from the development of the basic language to the collection of algebraic results needed for efficient implementation, with detailed practical examples. The book covers: the importance of partial prior specifications for complex problems where it is difficult to supply a meaningful full prior probability specification...
International Nuclear Information System (INIS)
Nemnes, G A; Anghel, D V
2010-01-01
We present a stochastic method for the simulation of the time evolution in systems which obey generalized statistics, namely fractional exclusion statistics and Gentile's statistics. The transition rates are derived in the framework of canonical ensembles. This approach introduces a tool for describing interacting fermionic and bosonic systems in non-equilibrium as ideal FES systems, in a computationally efficient manner. The two types of statistics are analyzed comparatively, indicating their intrinsic thermodynamic differences and revealing key aspects related to the species size
Symmetry and Algorithmic Complexity of Polyominoes and Polyhedral Graphs
Zenil, Hector; Kiani, Narsis A.; Tegner, Jesper
2018-02-24
We introduce a definition of algorithmic symmetry able to capture essential aspects of geometric symmetry. We review, study and apply a method for approximating the algorithmic complexity (also known as Kolmogorov-Chaitin complexity) of graphs and networks based on the concept of Algorithmic Probability (AP). AP is a concept (and method) capable of recursively enumerating all properties of a computable (causal) nature beyond statistical regularities. We explore the connections of algorithmic complexity---both theoretical and numerical---with geometric properties, mainly symmetry and topology, from an (algorithmic) information-theoretic perspective. We show that approximations to algorithmic complexity by lossless compression and an Algorithmic Probability-based method can characterize properties of polyominoes, polytopes, regular and quasi-regular polyhedra as well as polyhedral networks, thereby demonstrating its profiling capabilities.
Statistical Emulator for Expensive Classification Simulators
Ross, Jerret; Samareh, Jamshid A.
2016-01-01
Expensive simulators prevent meaningful analysis of the phenomena they model. To get around this problem, the concept of using a statistical emulator as a surrogate representation of the simulator was introduced in the 1980s. Present-day simulators have become more and more complex, and as a result a single run can take days, weeks or even months. Many new techniques, termed criteria, have been introduced that sequentially select the next best (most informative to the emulator) point to run on the simulator. These criteria allow an emulator to be built from only a small number of simulator runs. We follow and extend this framework to expensive classification simulators.
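As a rough illustration of such sequential selection criteria (not the authors' actual criterion), here is a toy Python loop that spends a small simulator budget one point at a time, always running the candidate scored most informative. The criterion used, distance to the nearest already-simulated point, and the cheap stand-in simulator are illustrative assumptions.

```python
import random

# Toy sequential design for a classification emulator: run the
# expensive simulator only at the candidate point scored most
# "informative". The criterion here (distance to the nearest
# already-simulated point) is a simple space-filling rule; real
# criteria use the emulator's predictive uncertainty.

def expensive_simulator(x):
    """Stand-in binary classifier: inside/outside a disc."""
    return 1 if x[0] ** 2 + x[1] ** 2 < 0.5 else 0

def nearest_dist(x, design):
    return min(((x[0] - p[0]) ** 2 + (x[1] - p[1]) ** 2) ** 0.5 for p in design)

random.seed(7)
candidates = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(200)]
design = [candidates.pop()]
labels = {design[0]: expensive_simulator(design[0])}
for _ in range(15):                      # budget: 15 more simulator runs
    x = max(candidates, key=lambda c: nearest_dist(c, design))
    candidates.remove(x)
    design.append(x)
    labels[x] = expensive_simulator(x)
print(len(labels), "simulator runs used")
```

The 16 labelled points would then train the emulator; in a real setting the criterion would be recomputed from the refitted emulator after every run.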
DEFF Research Database (Denmark)
Tryggestad, Kjell
2004-01-01
The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved...
Temporal scaling and spatial statistical analyses of groundwater level fluctuations
Sun, H.; Yuan, L., Sr.; Zhang, Y.
2017-12-01
Natural dynamics such as groundwater level fluctuations can exhibit multifractionality and/or multifractality, likely due to multi-scale aquifer heterogeneity and controlling factors, whose statistics require efficient quantification methods. This study explores multifractionality and non-Gaussian properties in groundwater dynamics, expressed by time series of daily level fluctuations at three wells located in the lower Mississippi valley, after removing the seasonal cycle, through temporal scaling and spatial statistical analysis. First, using time-scale multifractional analysis, a systematic statistical method is developed to analyze groundwater level fluctuations quantified by the time-scale local Hurst exponent (TS-LHE). Results show that the TS-LHE does not remain constant, implying fractal-scaling behavior that changes with time and location. Hence, we can distinguish the potentially location-dependent scaling feature, which may characterize the hydrologic dynamic system. Second, spatial statistical analysis shows that the increment of groundwater level fluctuations exhibits a heavy-tailed, non-Gaussian distribution, which can be better quantified by a Lévy stable distribution. Monte Carlo simulations of the fluctuation process also show that the linear fractional stable motion model can well depict the transient dynamics (i.e., the fractal non-Gaussian property) of groundwater level, while fractional Brownian motion is inadequate to describe natural processes with anomalous dynamics. Analysis of temporal scaling and spatial statistics may therefore provide useful information and quantification for further understanding the nature of complex dynamics in hydrology.
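The scaling analysis described above rests on the Hurst exponent. As a hedged sketch, the following Python estimates H from the scaling of lagged-increment variance, Var[x(t+τ) − x(t)] ∝ τ^{2H}, which is one standard estimator and not necessarily the TS-LHE method of the study; the data are a simulated random walk, not well records. Applying the estimator in sliding windows would give a time-localized exponent in the spirit of the TS-LHE.

```python
import math
import random

def hurst_from_increments(series, lags=(1, 2, 4, 8, 16)):
    """Estimate the Hurst exponent H from the scaling law
    Var[x(t+lag) - x(t)] ~ lag^(2H), via least squares in log-log."""
    xs, ys = [], []
    for lag in lags:
        inc = [series[i + lag] - series[i] for i in range(len(series) - lag)]
        m = sum(inc) / len(inc)
        var = sum((v - m) ** 2 for v in inc) / len(inc)
        xs.append(math.log(lag))
        ys.append(math.log(var))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope / 2.0       # slope in log-log space equals 2H

random.seed(0)
walk = [0.0]                 # ordinary Brownian-type walk: true H = 0.5
for _ in range(20000):
    walk.append(walk[-1] + random.gauss(0, 1))
print(round(hurst_from_increments(walk), 2))  # close to 0.5
```

Persistent (H > 0.5) or anti-persistent (H < 0.5) series would shift the estimate accordingly.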
Statistical application of groundwater monitoring data at the Hanford Site
International Nuclear Information System (INIS)
Chou, C.J.; Johnson, V.G.; Hodges, F.N.
1993-09-01
Effective use of groundwater monitoring data requires both statistical and geohydrologic interpretations. At the Hanford Site in south-central Washington state such interpretations are used for (1) detection monitoring, assessment monitoring, and/or corrective action at Resource Conservation and Recovery Act sites; (2) compliance testing for operational groundwater surveillance; (3) impact assessments at active liquid-waste disposal sites; and (4) cleanup decisions at Comprehensive Environmental Response Compensation and Liability Act sites. Statistical tests such as the Kolmogorov-Smirnov two-sample test are used to test the hypothesis that chemical concentrations from spatially distinct subsets or populations are identical within the uppermost unconfined aquifer. Experience at the Hanford Site in applying groundwater background data indicates that background must be considered as a statistical distribution of concentrations, rather than a single value or threshold. The use of a single numerical value as a background-based standard ignores important information and may result in excessive or unnecessary remediation. Appropriate statistical evaluation techniques include the Wilcoxon rank sum test, the Quantile test, "hot spot" comparisons, and Kolmogorov-Smirnov types of tests. Application of such tests is illustrated with several case studies derived from Hanford groundwater monitoring programs. To avoid possible misuse of such data, an understanding of their limitations is needed. In addition to statistical test procedures, geochemical and hydrologic considerations are integral parts of the decision process. For this purpose a phased approach is recommended that proceeds from the simple to the more complex, and from overview to detailed analysis.
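A minimal sketch of the kind of two-sample comparison mentioned above: the Kolmogorov-Smirnov statistic between a background distribution and a monitored-well sample. The concentrations below are simulated for illustration; this shows the statistic only, not the Hanford decision procedure.

```python
import random

# Illustrative two-sample Kolmogorov-Smirnov comparison of a
# "background" population against two hypothetical wells; the
# concentrations are simulated, not Hanford data.

def ks_statistic(a, b):
    """Maximum vertical distance between the empirical CDFs of a and b."""
    sa, sb = sorted(a), sorted(b)
    d = 0.0
    for v in sorted(set(sa) | set(sb)):
        fa = sum(x <= v for x in sa) / len(sa)
        fb = sum(x <= v for x in sb) / len(sb)
        d = max(d, abs(fa - fb))
    return d

random.seed(1)
background = [random.gauss(2.0, 0.5) for _ in range(100)]  # mg/L, invented
impacted   = [random.gauss(3.0, 0.5) for _ in range(100)]  # shifted mean
clean      = [random.gauss(2.0, 0.5) for _ in range(100)]  # same distribution

print(round(ks_statistic(background, impacted), 2))  # large distance
print(round(ks_statistic(background, clean), 2))     # small distance
```

Comparing the statistic against its null distribution (or using a library routine such as SciPy's two-sample test) would supply the p-value; the point here is that the background enters as a whole distribution, not a single threshold.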
The role of shape complexity in the detection of closed contours.
Wilder, John; Feldman, Jacob; Singh, Manish
2016-09-01
The detection of contours in noise has been extensively studied, but the detection of closed contours, such as the boundaries of whole objects, has received relatively little attention. Closed contours pose substantial challenges not present in the simple (open) case, because they form the outlines of whole shapes and thus take on a range of potentially important configural properties. In this paper we consider the detection of closed contours in noise as a probabilistic decision problem. Previous work on open contours suggests that contour complexity, quantified as the negative log probability (Description Length, DL) of the contour under a suitably chosen statistical model, impairs contour detectability; more complex (statistically surprising) contours are harder to detect. In this study we extended this result to closed contours, developing a suitable probabilistic model of whole shapes that gives rise to several distinct though interrelated measures of shape complexity. We asked subjects to detect either natural shapes (Exp. 1) or experimentally manipulated shapes (Exp. 2) embedded in noise fields. We found systematic effects of global shape complexity on detection performance, demonstrating how aspects of global shape and form influence the basic process of object detection. Copyright © 2015 Elsevier Ltd. All rights reserved.
Wannier, Gregory Hugh
1966-01-01
Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics.Designed for
HIV intertest interval among MSM in King County, Washington.
Katz, David A; Dombrowski, Julia C; Swanson, Fred; Buskin, Susan E; Golden, Matthew R; Stekler, Joanne D
2013-02-01
The authors examined temporal trends and correlates of HIV testing frequency among men who have sex with men (MSM) in King County, Washington. The authors evaluated data from MSM testing for HIV at the Public Health-Seattle & King County (PHSKC) STD Clinic and Gay City Health Project (GCHP), and testing history data from MSM in PHSKC HIV surveillance. The intertest interval (ITI) was defined as the number of days between the last negative HIV test and the current testing visit or first positive test. Correlates of the log(10)-transformed ITI were determined using generalised estimating equations linear regression. Between 2003 and 2010, the median ITIs among MSM seeking HIV testing at the STD Clinic and GCHP were 215 (IQR: 124-409) and 257 (IQR: 148-503) days, respectively. In multivariate analyses, younger age, having only male partners and reporting ≥10 male sex partners in the last year were associated with shorter ITIs at both testing sites. Among GCHP attendees, having a regular healthcare provider, seeking a test as part of a regular schedule and inhaled nitrite use in the last year were also associated with shorter ITIs. ITIs were longer at GCHP (median 359 vs 255 days, p=0.02). Although MSM in King County appear to be testing at frequent intervals, further efforts are needed to reduce the time that HIV-infected persons are unaware of their status.
National Research Council Canada - National Science Library
Willsky, Alan S
2008-01-01
...: (a) the use of graphical, hierarchical, and multiresolution representations for the development of statistical modeling methodologies for complex phenomena and for the construction of scalable algorithms...
Characterizing time series via complexity-entropy curves
Ribeiro, Haroldo V.; Jauregui, Max; Zunino, Luciano; Lenzi, Ervin K.
2017-06-01
The search for patterns in time series is a very common task when dealing with complex systems. This is usually accomplished by employing a complexity measure such as entropies and fractal dimensions. However, such measures usually only capture a single aspect of the system dynamics. Here, we propose a family of complexity measures for time series based on a generalization of the complexity-entropy causality plane. By replacing the Shannon entropy by a monoparametric entropy (Tsallis q-entropy) and after considering the proper generalization of the statistical complexity (q-complexity), we build up a parametric curve (the q-complexity-entropy curve) that is used for characterizing and classifying time series. Based on simple exact results and numerical simulations of stochastic processes, we show that these curves can distinguish among different long-range, short-range, and oscillating correlated behaviors. Also, we verify that simulated chaotic and stochastic time series can be distinguished based on whether these curves are open or closed. We further test this technique in experimental scenarios related to chaotic laser intensity, stock price, sunspot, and geomagnetic dynamics, confirming its usefulness. Finally, we prove that these curves enhance the automatic classification of time series with long-range correlations and interbeat intervals of healthy subjects and patients with heart disease.
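A hedged sketch of the entropy half of this construction: Bandt-Pompe ordinal pattern probabilities combined with a normalized Tsallis q-entropy, evaluated at several q for a noisy and a periodic series. The statistical complexity part of the q-complexity-entropy curve is omitted, and the embedding dimension and series are illustrative choices, not the paper's settings.

```python
import math
import random
from collections import Counter
from itertools import permutations

def ordinal_distribution(series, d=3):
    """Bandt-Pompe ordinal (permutation) pattern probabilities for
    embedding dimension d."""
    counts = Counter(
        tuple(sorted(range(d), key=lambda k: series[i + k]))
        for i in range(len(series) - d + 1)
    )
    total = sum(counts.values())
    return [counts.get(p, 0) / total for p in permutations(range(d))]

def tsallis_q_entropy(probs, q):
    """Normalized Tsallis q-entropy (1 for the uniform distribution;
    reduces to normalized Shannon entropy as q -> 1)."""
    n = len(probs)
    if q == 1.0:
        return -sum(p * math.log(p) for p in probs if p > 0) / math.log(n)
    return (1 - sum(p ** q for p in probs if p > 0)) / (1 - n ** (1 - q))

random.seed(3)
noise = [random.random() for _ in range(4000)]    # irregular series
wave = [math.sin(0.2 * i) for i in range(4000)]   # regular series
for q in (0.5, 1.0, 2.0):
    hn = tsallis_q_entropy(ordinal_distribution(noise), q)
    hw = tsallis_q_entropy(ordinal_distribution(wave), q)
    print(q, round(hn, 2), round(hw, 2))  # noise entropy exceeds wave entropy
```

Sweeping q continuously, and pairing each entropy with the corresponding q-complexity, is what traces out the curve used in the paper for classification.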
Analysis and classification of ECG-waves and rhythms using circular statistics and vector strength
Directory of Open Access Journals (Sweden)
Janßen Jan-Dirk
2017-09-01
Full Text Available The most common way to analyse heart rhythm is to calculate the RR-interval and the heart rate variability. For further evaluation, descriptive statistics are often used. Here we introduce a new and more natural heart rhythm analysis tool that is based on circular statistics and vector strength. Vector strength is a tool to measure the periodicity or lack of periodicity of a signal. We divide the signal into non-overlapping window segments and project the detected R-waves around the unit circle using the complex exponential function and the median RR-interval. In addition, we calculate the vector strength and apply circular statistics as well as an angular histogram to the R-wave vectors. This approach enables an intuitive visualization and analysis of rhythmicity. Our results show that ECG-waves and rhythms can be easily visualized, analysed and classified by circular statistics and vector strength.
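The vector strength computation described in the abstract is compact enough to sketch directly. The event times below are synthetic, and the 0.8 s period stands in for the median RR-interval.

```python
import math

def vector_strength(event_times, period):
    """Length of the mean resultant vector after projecting event
    times onto the unit circle with the given period: 1.0 means
    perfectly periodic events, values near 0 mean no phase locking."""
    phases = [2 * math.pi * (t % period) / period for t in event_times]
    c = sum(math.cos(p) for p in phases) / len(phases)
    s = sum(math.sin(p) for p in phases) / len(phases)
    return math.hypot(c, s)

# Synthetic "R-wave" times: one beat per 0.8 s median RR-interval,
# once perfectly regular and once with strong jitter (illustrative data).
regular = [0.8 * k for k in range(50)]
jittery = [0.8 * k + 0.3 * math.sin(7 * k) for k in range(50)]

print(round(vector_strength(regular, 0.8), 3))   # -> 1.0
print(round(vector_strength(jittery, 0.8), 3))   # much smaller
```

Computing this per non-overlapping window, as the authors do, yields a rhythmicity profile over time; the phase angles themselves feed the angular histogram.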
New Hybrid Monte Carlo methods for efficient sampling. From physics to biology and statistics
International Nuclear Information System (INIS)
Akhmatskaya, Elena; Reich, Sebastian
2011-01-01
We introduce a class of novel hybrid methods for detailed simulations of large complex systems in physics, biology, materials science and statistics. These generalized shadow Hybrid Monte Carlo (GSHMC) methods combine the advantages of stochastic and deterministic simulation techniques. They utilize a partial momentum update to retain some of the dynamical information, employ modified Hamiltonians to overcome exponential performance degradation with the system's size, and make use of the multi-scale nature of complex systems. Variants of GSHMC were developed for atomistic simulation, particle simulation and statistics: GSHMC (a thermodynamically consistent implementation of constant-temperature molecular dynamics), MTS-GSHMC (multiple-time-stepping GSHMC), meso-GSHMC (a Metropolis-corrected dissipative particle dynamics (DPD) method), and the generalized shadow Hamiltonian Monte Carlo method GSHmMC (a GSHMC for statistical simulations). All of these are compatible with other enhanced sampling techniques and suitable for massively parallel computing, allowing for a range of multi-level parallel strategies. A brief description of the GSHMC approach, examples of its application on high performance computers and a comparison with other existing techniques are given. Our approach is shown to resolve such problems as resonance instabilities of the MTS methods and non-preservation of thermodynamic equilibrium properties in DPD, and to outperform known methods in sampling efficiency by an order of magnitude. (author)
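For orientation, a plain Hamiltonian Monte Carlo sampler on a one-dimensional Gaussian target is sketched below. This is only the textbook ancestor of GSHMC, with no shadow Hamiltonian and no partial momentum update; the step size and trajectory length are arbitrary illustrative choices.

```python
import math
import random

# Plain Hamiltonian Monte Carlo on a 1-D standard normal target:
# leapfrog integration of Hamiltonian dynamics plus a Metropolis
# accept/reject step. GSHMC modifies exactly these ingredients.

def leapfrog(x, p, eps, steps, grad_u):
    """Integrate Hamilton's equations for potential U with leapfrog."""
    p -= 0.5 * eps * grad_u(x)
    for _ in range(steps - 1):
        x += eps * p
        p -= eps * grad_u(x)
    x += eps * p
    p -= 0.5 * eps * grad_u(x)
    return x, p

def hmc(n, eps=0.2, steps=10):
    u = lambda x: 0.5 * x * x        # potential: -log N(0,1) + const
    grad_u = lambda x: x
    x, samples = 0.0, []
    for _ in range(n):
        p0 = random.gauss(0, 1)                        # fresh momentum
        x1, p1 = leapfrog(x, p0, eps, steps, grad_u)
        dh = (u(x1) + 0.5 * p1 * p1) - (u(x) + 0.5 * p0 * p0)
        if random.random() < math.exp(min(0.0, -dh)):  # Metropolis test
            x = x1
        samples.append(x)
    return samples

random.seed(5)
samples = hmc(5000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(round(mean, 2), round(var, 2))  # should be near 0 and 1
```

GSHMC replaces the full momentum refresh with a partial one and evaluates acceptance with a shadow Hamiltonian that the leapfrog integrator conserves more accurately, which is where its efficiency gain comes from.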
Computational algebraic geometry for statistical modeling FY09Q2 progress.
Energy Technology Data Exchange (ETDEWEB)
Thompson, David C.; Rojas, Joseph Maurice; Pebay, Philippe Pierre
2009-03-01
This is a progress report on polynomial system solving for statistical modeling. This quarter we have developed our first model of shock response data and an algorithm for identifying the chamber cone containing a polynomial system in n variables with n+k terms within polynomial time - a significant improvement over previous algorithms, all having exponential worst-case complexity. We have implemented and verified the chamber cone algorithm for n+3 and are working to extend the implementation to handle arbitrary k. Later sections of this report explain chamber cones in more detail; the next section provides an overview of the project and how the current progress fits into it.
Johnson, Norman
This is the third volume of a collection of seminal papers in the statistical sciences written during the past 110 years. These papers have each had an outstanding influence on the development of statistical theory and practice over the last century. Each paper is preceded by an introduction written by an authority in the field providing background information and assessing its influence. Volume III concentrates on articles from the 1980s while including some earlier articles not included in Volumes I and II. Samuel Kotz is Professor of Statistics in the College of Business and Management at the University of Maryland. Norman L. Johnson is Professor Emeritus of Statistics at the University of North Carolina. Also available: Breakthroughs in Statistics Volume I: Foundations and Basic Theory Samuel Kotz and Norman L. Johnson, Editors 1993. 631 pp. Softcover. ISBN 0-387-94037-5 Breakthroughs in Statistics Volume II: Methodology and Distribution Samuel Kotz and Norman L. Johnson, Edi...
Chiou, Chei-Chang; Wang, Yu-Min; Lee, Li-Tze
2014-08-01
Statistical knowledge is widely used in academia; however, statistics teachers struggle with the issue of how to reduce students' statistics anxiety and enhance students' statistics learning. This study assesses the effectiveness of a "one-minute paper strategy" in reducing students' statistics-related anxiety and in improving students' statistics-related achievement. Participants were 77 undergraduates from two classes enrolled in applied statistics courses. An experiment was implemented according to a pretest/posttest comparison group design. The quasi-experimental design showed that the one-minute paper strategy significantly reduced students' statistics anxiety and improved students' statistics learning achievement. The strategy was a better instructional tool than the textbook exercise for reducing students' statistics anxiety and improving students' statistics achievement.
[A study in cognitive complexity of the self: an evaluation of the Linville's index].
Hayashi, F; Horiuchi, T
1997-02-01
The purpose of the present study was to evaluate H statistic, proposed by Linville (1985, 1987), as an index for cognitive complexity of the self. Linville asserted that high self-complexity would act as a buffer against life stress or depression. One hundred and eighty-seven undergraduates sorted 40 personality-trait adjectives into as many categories as necessary in order to describe themselves. In addition, 126 participants filled out several scales, including self-consciousness and self-esteem. Main findings were as follows: (a) H statistic was not significantly associated with any variable related to the self-ratings, and showed no stress-buffering effect. (b) On the other hand, participants who had high cognitive complexity for the negative aspects of the self, as operationalized by Woolfolk, Novalany, Gara, Allen, and Polino (1995), were low in self-esteem and high in public self-consciousness. The results suggest that cognitive complexity of the negative self may indicate a predisposition for depression or neurosis. (c) Also, women scored significantly higher than men on cognitive complexity of the negative self.
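For context, Linville's H is commonly cited in the dimensionality form H = log2(n) - (sum_i n_i log2 n_i) / n, where n is the total number of trait placements and n_i the size of group i. The sketch below assumes this simplified, non-overlapping form of the trait sort; Linville's full measure is defined over combinations of group memberships:

```python
import math

def linville_h(group_sizes):
    """H statistic for self-complexity (simplified form, assuming
    non-overlapping trait sorts): H = log2(n) - (sum n_i*log2(n_i)) / n.
    Higher H indicates greater cognitive complexity of the self."""
    n = sum(group_sizes)
    return math.log2(n) - sum(ni * math.log2(ni) for ni in group_sizes) / n

# One undifferentiated self-aspect containing all 40 adjectives: H = 0.
h_low = linville_h([40])
# Forty one-trait aspects: maximal H for 40 adjectives, log2(40) ~ 5.32.
h_high = linville_h([1] * 40)
```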
Bakar, Ab Rahim; Mohamed, Shamsiah; Hamzah, Ramlah
2013-01-01
This study was performed to identify the employability skills of technical students from the Industrial Training Institutes (ITI) and Indigenous People's Trust Council (MARA) Skills Training Institutes (IKM) in Malaysia. The study sample consisted of 850 final year trainees of IKM and ITI. The sample was chosen by a random sampling procedure from…
Epidemiology of Maternal Mortality in Malawi
African Journals Online (AJOL)
live births. Causes and determinants of maternal mortality. Global causes of maternal mortality. Across the globe the causes of maternal deaths are strik- ..... at home". Findings from Thyolo, Mangochi and Chikwawa were similar. Perceived quality of care. Like anywhere in the world, the perceived quality of care in ...
Approach of Complex Networks for the Determination of Brain Death
Institute of Scientific and Technical Information of China (English)
SUN Wei-Gang; CAO Jian-Ting; WANG Ru-Bin
2011-01-01
In clinical practice, brain death is the irreversible end of all brain activity. Compared to current statistical methods for the determination of brain death, we focus on the approach of complex networks for real-world electroencephalography in its determination. Brain functional networks constructed by correlation analysis are derived, and statistical network quantities used for distinguishing the patients in coma or brain death state, such as average strength, clustering coefficient and average path length, are calculated. Numerical results show that the values of network quantities of patients in coma state are larger than those of patients in brain death state. Our findings might provide valuable insights on the determination of brain death.
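The three network quantities named in the abstract can be computed from an adjacency matrix as follows (a self-contained sketch; the paper builds the network by correlation analysis of EEG channels, which is not reproduced here):

```python
import numpy as np
from collections import deque

def network_quantities(A):
    """Average strength, mean clustering coefficient, and average shortest
    path length for an undirected adjacency matrix A (assumed connected)."""
    A = np.asarray(A, dtype=float)
    n = len(A)
    degrees = A.sum(axis=1)
    avg_strength = degrees.mean()
    # Clustering: triangles through node i over possible neighbour pairs.
    triangles = np.diag(A @ A @ A) / 2.0
    possible = degrees * (degrees - 1) / 2.0
    clustering = np.mean([t / p if p > 0 else 0.0
                          for t, p in zip(triangles, possible)])
    # Average path length via breadth-first search from every node.
    total, pairs = 0, 0
    for s in range(n):
        dist, queue = {s: 0}, deque([s])
        while queue:
            u = queue.popleft()
            for v in np.nonzero(A[u])[0]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(d for node, d in dist.items() if node != s)
        pairs += len(dist) - 1
    return avg_strength, clustering, total / pairs

# Fully connected 4-node network: strength 3, clustering 1, path length 1.
K4 = np.ones((4, 4)) - np.eye(4)
```

Lower values of these quantities for brain-death patients, as the abstract reports, would indicate sparser and less clustered functional connectivity.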
Phase flow and statistical structure of Galton-board systems
International Nuclear Information System (INIS)
Lue, A.; Brenner, H.
1993-01-01
Galton boards, found in museum exhibits devoted to science and technology, are often used to demonstrate visually the ubiquity of so-called ''laws of probability'' via an experimental realization of normal distributions. A detailed theoretical study of Galton-board phase-space dynamics and statistical behavior is presented. The study is based on a simple inelastic-collision model employing a particle falling through a spatially periodic lattice of rigid, convex scatterers. We show that such systems exhibit indeterminate behavior through the presence of strange attractors or strange repellers in phase space; nevertheless, we also show that these systems exhibit regular and predictable behavior under specific circumstances. Phase-space strange attractors, periodic attractors, and strange repellers are present in numerical simulations, confirming results anticipated from geometric analysis. The system's geometry (dictated by lattice geometry and density as well as the direction of gravity) is observed to play a dominant role in stability, phase-flow topology, and statistical observations. Smale horseshoes appear to exist in the low-lattice-density limit and may exist in other regimes. These horseshoes are generated by homoclinic orbits whose existence is dictated by system characteristics. The horseshoes lead directly to deterministic chaos in the system. Strong evidence exists for ergodicity in all attractors. Phase-space complexities are manifested at all observed levels, particularly statistical ones. Consequently, statistical observations are critically dependent upon system details. Under well-defined circumstances, these observations display behavior which does not constitute a realization of the ''laws of probability.''
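The museum-exhibit picture the abstract starts from is the purely stochastic idealization, in which each ball takes independent left/right deflections and the bin counts follow a binomial law converging to a normal shape. That idealization (not the paper's deterministic inelastic-collision model) can be sketched as:

```python
import numpy as np

def galton_board(n_balls=10000, n_rows=12, seed=0):
    """Idealized stochastic Galton board: each ball takes n_rows
    independent left/right (0/1) deflections, so its final bin index
    is Binomial(n_rows, 0.5) distributed. Returns the bin counts."""
    rng = np.random.default_rng(seed)
    bins = rng.integers(0, 2, size=(n_balls, n_rows)).sum(axis=1)
    return np.bincount(bins, minlength=n_rows + 1)

# Histogram peaked near bin n_rows/2 = 6, approximating a normal curve.
counts = galton_board()
```

The paper's point is precisely that a deterministic, collision-based board need not realize this binomial picture: geometry and attractor structure can make the statistics depend critically on system details.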
Towards consistent and reliable Dutch and international energy statistics for the chemical industry
International Nuclear Information System (INIS)
Neelis, M.L.; Pouwelse, J.W.
2008-01-01
Consistent and reliable energy statistics are of vital importance for proper monitoring of energy-efficiency policies. In recent studies, irregularities have been reported in the Dutch energy statistics for the chemical industry. We studied in depth the company data that form the basis of the energy statistics in the Netherlands between 1995 and 2004 to find causes for these irregularities. We discovered that chemical products have occasionally been included, resulting in statistics with an inconsistent system boundary. Lack of guidance in the survey for the complex energy conversions in the chemical industry also resulted in large fluctuations for certain energy commodities. The findings of our analysis have been the basis for a new survey that has been used since 2007. We demonstrate that the annual questionnaire used for the international energy statistics can result in problems comparable to those observed in the Netherlands. We suggest including chemical residual gas as an energy commodity in the questionnaire and including the energy conversions in the chemical industry in the international energy statistics. In addition, we think the questionnaire should be explicit about the treatment of basic chemical products produced at refineries and in the petrochemical industry to avoid system boundary problems.