WorldWideScience

Sample records for automatic type inference

  1. Inference of Well-Typings for Logic Programs with Application to Termination Analysis

    DEFF Research Database (Denmark)

    Bruynooghe, M.; Gallagher, John Patrick; Humbeeck, W. Van

    2005-01-01

    A method is developed to infer a polymorphic well-typing for a logic program. Our motivation is to improve the automation of termination analysis by deriving types from which norms can automatically be constructed. Previous work on type-based termination analysis used either types declared by the user, or automatically generated monomorphic types describing the success set of predicates. The latter types are less precise and result in weaker termination conditions than those obtained from declared types. Our type inference procedure involves solving set constraints generated from the program and derives a well-typing in contrast to a success-set approximation. Experiments so far show that our automatically inferred well-typings are close to the declared types and result in termination conditions that are as strong as those obtained with declared types. We describe the method, its implementation…

  2. Automatic physical inference with information maximizing neural networks

    Science.gov (United States)

    Charnock, Tom; Lavaux, Guilhem; Wandelt, Benjamin D.

    2018-04-01

    Compressing large data sets to a manageable number of summaries that are informative about the underlying parameters vastly simplifies both frequentist and Bayesian inference. When only simulations are available, these summaries are typically chosen heuristically, so they may inadvertently miss important information. We introduce a simulation-based machine learning technique that trains artificial neural networks to find nonlinear functionals of data that maximize Fisher information: information maximizing neural networks (IMNNs). In test cases where the posterior can be derived exactly, likelihood-free inference based on automatically derived IMNN summaries produces nearly exact posteriors, showing that these summaries are good approximations to sufficient statistics. In a series of numerical examples of increasing complexity and astrophysical relevance we show that IMNNs are robustly capable of automatically finding optimal, nonlinear summaries of the data even in cases where linear compression fails: inferring the variance of Gaussian signal in the presence of noise, inferring cosmological parameters from mock simulations of the Lyman-α forest in quasar spectra, and inferring frequency-domain parameters from LISA-like detections of gravitational waveforms. In this final case, the IMNN summary outperforms linear data compression by avoiding the introduction of spurious likelihood maxima. We anticipate that the automatic physical inference method described in this paper will be essential to obtain both accurate and precise cosmological parameter estimates from complex and large astronomical data sets, including those from LSST and Euclid.
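
    A rough sense of the objective can be given in a few lines. The sketch below is a toy illustration, not the authors' IMNN code: the simulator, summary function, and step size are all invented, and it estimates by finite differences the Fisher information of a candidate summary, which is the quantity an IMNN is trained to maximize.

```python
# Toy estimate of the Fisher information F = mu'^T C^{-1} mu' of a data
# summary, using simulations only; an IMNN replaces summary() with a neural
# network and adjusts its weights to maximize F.
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n_sims=2000, n_data=50):
    """Toy simulator: zero-mean Gaussian data whose variance is theta."""
    return rng.normal(0.0, np.sqrt(theta), size=(n_sims, n_data))

def summary(d):
    """Candidate compression to one number per simulation."""
    return d.var(axis=1, ddof=1, keepdims=True)

def fisher_info(theta, eps=0.05):
    """Finite-difference estimate of the summaries' Fisher information."""
    s0 = summary(simulate(theta))
    cov = np.cov(s0, rowvar=False).reshape(1, 1)
    dmu = (summary(simulate(theta + eps)).mean(axis=0)
           - summary(simulate(theta - eps)).mean(axis=0)) / (2 * eps)
    return float(dmu @ np.linalg.inv(cov) @ dmu)

print(fisher_info(1.0))  # the sample variance is a near-sufficient summary here
```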

  3. Automatic inference of indexing rules for MEDLINE

    Directory of Open Access Journals (Sweden)

    Shooshan Sonya E

    2008-11-01

    Full Text Available Abstract Background: Indexing is a crucial step in any information retrieval system. In MEDLINE, a widely used database of the biomedical literature, the indexing process involves the selection of Medical Subject Headings in order to describe the subject matter of articles. The need for automatic tools to assist MEDLINE indexers in this task is growing with the increasing number of publications being added to MEDLINE. Methods: In this paper, we describe the use and the customization of Inductive Logic Programming (ILP to infer indexing rules that may be used to produce automatic indexing recommendations for MEDLINE indexers. Results: Our results show that this original ILP-based approach outperforms manual rules when they exist. In addition, the use of ILP rules also improves the overall performance of the Medical Text Indexer (MTI, a system producing automatic indexing recommendations for MEDLINE. Conclusion: We expect the sets of ILP rules obtained in this experiment to be integrated into MTI.
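
    To make the flavor of the output concrete, here is a toy illustration of the kind of symbolic indexing rules ILP induces, i.e., human-readable conditions that trigger a MeSH recommendation. The rules below are invented stand-ins, not the learned rule set.

```python
# Hypothetical indexing rules of the form ILP can learn for MEDLINE:
# IF a condition holds for the citation THEN recommend a MeSH heading.
RULES = [
    (lambda doc: "randomized" in doc["abstract"].lower(),
     "Randomized Controlled Trials as Topic"),
    (lambda doc: "mice" in doc["title"].lower(), "Mice"),
]

def recommend(doc):
    """Fire every rule whose condition matches and collect its heading."""
    return [mesh for cond, mesh in RULES if cond(doc)]

doc = {"title": "A study in mice", "abstract": "A randomized design was used."}
print(recommend(doc))  # ['Randomized Controlled Trials as Topic', 'Mice']
```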

  4. Type Inference for Session Types in the Pi-Calculus

    DEFF Research Database (Denmark)

    Graversen, Eva Fajstrup; Harbo, Jacob Buchreitz; Huttel, Hans

    2014-01-01

    In this paper we present a direct algorithm for session type inference for the π-calculus. Type inference for session types has previously been achieved by either imposing limitations and restriction on the π-calculus, or by reducing the type inference problem to that for linear types. Our approach...

  5. Type Inference with Inequalities

    DEFF Research Database (Denmark)

    Schwartzbach, Michael Ignatieff

    1991-01-01

    Type inference can be phrased as constraint-solving over types. We consider an implicitly typed language equipped with recursive types, multiple inheritance, 1st order parametric polymorphism, and assignments. Type correctness is expressed as satisfiability of a possibly infinite collection of (monotonic) inequalities on the types of variables and expressions. A general result about systems of inequalities over semilattices yields a solvable form. We distinguish between deciding typability (the existence of solutions) and type inference (the computation of a minimal solution). In our case, both…
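
    The minimal-solution computation admits a compact sketch. The following is an illustrative least-fixed-point solver, not the paper's algorithm; the three-point lattice and the constraints are invented, and the inequalities have the form "variable ≥ join of terms".

```python
# Least solution of monotone inequalities over a small join-semilattice
# (Bot <= Int, Bool <= Top) by Kleene fixed-point iteration: start every
# variable at Bot and raise it until all inequalities hold.
from functools import reduce

def join(a, b):
    if a == b:
        return a
    if a == "Bot":
        return b
    if b == "Bot":
        return a
    return "Top"

# Each variable must dominate the join of the listed variables/constants.
constraints = {
    "x": ["Int"],         # x >= Int
    "y": ["x"],           # y >= x
    "z": ["y", "Bool"],   # z >= y join Bool
}

def solve(constraints):
    sol = {v: "Bot" for v in constraints}
    lookup = lambda t: sol.get(t, t)   # t is a variable or a lattice constant
    changed = True
    while changed:
        changed = False
        for v, terms in constraints.items():
            new = reduce(join, map(lookup, terms), "Bot")
            if new != sol[v]:
                sol[v], changed = new, True
    return sol

print(solve(constraints))  # {'x': 'Int', 'y': 'Int', 'z': 'Top'}
```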

  6. Making Type Inference Practical

    DEFF Research Database (Denmark)

    Schwartzbach, Michael Ignatieff; Oxhøj, Nicholas; Palsberg, Jens

    1992-01-01

    We present the implementation of a type inference algorithm for untyped object-oriented programs with inheritance, assignments, and late binding. The algorithm significantly improves our previous one, presented at OOPSLA'91, since it can handle collection classes, such as List, in a useful way. Also, the complexity has been dramatically improved, from exponential time to low polynomial time. The implementation uses the techniques of incremental graph construction and constraint template instantiation to avoid representing intermediate results, doing superfluous work, and recomputing type information. Experiments indicate that the implementation type-checks as many as 100 lines per second. This results in a mature product, on which a number of tools can be based, for example a safety tool, an image compression tool, a code optimization tool, and an annotation tool. This may make type inference for object…

  7. Object-Oriented Type Inference

    DEFF Research Database (Denmark)

    Schwartzbach, Michael Ignatieff; Palsberg, Jens

    1991-01-01

    We present a new approach to inferring types in untyped object-oriented programs with inheritance, assignments, and late binding. It guarantees that all messages are understood, annotates the program with type information, allows polymorphic methods, and can be used as the basis of an op...

  8. Type inference for correspondence types

    DEFF Research Database (Denmark)

    Hüttel, Hans; Gordon, Andy; Hansen, Rene Rydhof

    2009-01-01

    We present a correspondence type/effect system for authenticity in a π-calculus with polarized channels, dependent pair types and effect terms and show how one may, given a process P and an a priori type environment E, generate constraints that are formulae in the Alternating Least Fixed-Point (ALFP) logic. We then show how a reasonable model of the generated constraints yields a type/effect assignment such that P becomes well-typed with respect to E if and only if this is possible. The formulae generated satisfy a finite model property; a system of constraints is satisfiable if and only if it has a finite model. As a consequence, we obtain the result that type/effect inference in our system is polynomial-time decidable.

  9. Adaptive neuro-fuzzy inference system based automatic generation control

    Energy Technology Data Exchange (ETDEWEB)

    Hosseini, S.H.; Etemadi, A.H. [Department of Electrical Engineering, Sharif University of Technology, Tehran (Iran)

    2008-07-15

    Fixed gain controllers for automatic generation control are designed at nominal operating conditions and fail to provide the best control performance over a wide range of operating conditions. So, to keep system performance near its optimum, it is desirable to track the operating conditions and use updated parameters to compute control gains. A control scheme based on an adaptive neuro-fuzzy inference system (ANFIS), which is trained by the results of off-line studies obtained using particle swarm optimization, is proposed in this paper to optimize and update control gains in real-time according to load variations. Also, frequency relaxation is implemented using ANFIS. The efficiency of the proposed method is demonstrated via simulations. Compliance of the proposed method with the NERC control performance standard is verified. (author)
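
    The gain-scheduling idea reduces to mapping the current operating condition to updated gains. The sketch below uses plain interpolation over an invented table of operating points; in the paper this mapping is learned offline by particle swarm optimization and generalized by the ANFIS.

```python
# Schedule AGC controller gains against the current loading condition.
# The (load, Kp, Ki) table is hypothetical; an ANFIS would replace np.interp
# with a trained fuzzy mapping covering multiple condition variables.
import numpy as np

loads = np.array([0.2, 0.5, 0.8, 1.0])   # per-unit operating conditions
kp = np.array([0.90, 0.70, 0.50, 0.40])  # tuned proportional gains
ki = np.array([0.30, 0.22, 0.15, 0.12])  # tuned integral gains

def scheduled_gains(load):
    """Interpolate updated control gains for the measured load."""
    return float(np.interp(load, loads, kp)), float(np.interp(load, loads, ki))

print(scheduled_gains(0.65))  # (0.6, 0.185): between the 0.5 and 0.8 p.u. rows
```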

  10. Automatic approach to deriving fuzzy slope positions

    Science.gov (United States)

    Zhu, Liang-Jun; Zhu, A.-Xing; Qin, Cheng-Zhi; Liu, Jun-Zhi

    2018-03-01

    Fuzzy characterization of slope positions is important for geographic modeling. Most of the existing fuzzy classification-based methods for fuzzy characterization require extensive user intervention in data preparation and parameter setting, which is tedious and time-consuming. This paper presents an automatic approach to overcoming these limitations in the prototype-based inference method for deriving fuzzy membership value (or similarity) to slope positions. The key contribution is a procedure for finding the typical locations and setting the fuzzy inference parameters for each slope position type. Instead of being determined totally by users in the prototype-based inference method, in the proposed approach the typical locations and fuzzy inference parameters for each slope position type are automatically determined by a rule set based on prior domain knowledge and the frequency distributions of topographic attributes. Furthermore, the preparation of topographic attributes (e.g., slope gradient, curvature, and relative position index) is automated, so the proposed automatic approach has only one necessary input, i.e., the gridded digital elevation model of the study area. All compute-intensive algorithms in the proposed approach were speeded up by parallel computing. Two study cases were provided to demonstrate that this approach can properly, conveniently and quickly derive the fuzzy slope positions.
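
    The core of the prototype-based inference step can be sketched briefly. In the toy version below, the prototype attribute values and spreads are invented (the real approach derives them from attribute frequency distributions and domain rules); a cell's membership in each slope position is the minimum of Gaussian similarities to that position's typical attribute values.

```python
# Prototype-based fuzzy membership to slope positions: compare a cell's
# topographic attributes (slope gradient, relative position index) to each
# slope-position prototype and combine per-attribute similarities by min.
import numpy as np

PROTOTYPES = {  # hypothetical typical values
    "ridge":     {"slope": 2.0,  "rpi": 1.0},
    "backslope": {"slope": 25.0, "rpi": 0.5},
    "valley":    {"slope": 3.0,  "rpi": 0.0},
}
SPREAD = {"slope": 8.0, "rpi": 0.2}  # assumed fuzzy inference parameters

def membership(cell):
    out = {}
    for name, proto in PROTOTYPES.items():
        sims = [np.exp(-0.5 * ((cell[a] - proto[a]) / SPREAD[a]) ** 2)
                for a in proto]
        out[name] = min(sims)  # limiting operator across attributes
    return out

print(membership({"slope": 5.0, "rpi": 0.9}))  # highest membership: ridge
```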

  11. An analysis of line-drawings based upon automatically inferred grammar and its application to chest x-ray images

    International Nuclear Information System (INIS)

    Nakayama, Akira; Yoshida, Yuuji; Fukumura, Teruo

    1984-01-01

    Grammar inference is one technique for analyzing image structure. Applied to naturally obtained images, however, it poses several problems, since no practical grammatical technique for two-dimensional images has been established. The authors developed a technique that solves these problems, with the main purpose of automating the structure analysis of naturally obtained images. The first half of this paper describes the automatic inference of a line-drawing generation grammar and the line-drawing analysis based on it. The second half reports on an actual analysis. The proposed technique extracts object line drawings from line drawings containing noise. Its effectiveness was evaluated on the task of extracting rib center lines from thin-line chest X-ray images of practical scale and complexity. In this example, the total number of characteristic points (ends, branch points and intersections) composing the line drawings was 377 per image, and the total number of line segments composing them was 566 on average per sheet. The extraction ratio was 86.6 %, which seems reasonable considering the complexity of the input line drawings. Further, the result was compared with the rib center lines identified by the automatic screening system AISCR-V3, a conventional processing technique, and was satisfactory considering the versatility of this method. (Wakatsuki, Y.)

  12. Automatic segmentation of coronary angiograms based on fuzzy inferring and probabilistic tracking

    Directory of Open Access Journals (Sweden)

    Shoujun Zhou

    2010-08-01

    Full Text Available Abstract Background: Segmentation of the coronary angiogram is important in computer-assisted artery motion analysis or reconstruction of 3D vascular structures from a single-plane or biplane angiographic system. Developing fully automated and accurate vessel segmentation algorithms is highly challenging, especially when extracting vascular structures with large variations in image intensities and noise, as well as with variable cross-sections or vascular lesions. Methods: This paper presents a novel tracking method for automatic segmentation of the coronary artery tree in X-ray angiographic images, based on probabilistic vessel tracking and fuzzy structure pattern inferring. The method is composed of two main steps: preprocessing and tracking. In preprocessing, multiscale Gabor filtering and Hessian matrix analysis were used to enhance and extract vessel features from the original angiographic image, leading to a vessel feature map as well as a vessel direction map. In tracking, a seed point was first automatically detected by analyzing the vessel feature map. Subsequently, two operators [i.e., a probabilistic tracking operator (PTO) and a vessel structure pattern detector (SPD)] worked together based on the detected seed point to extract vessel segments or branches one at a time. The local structure pattern was inferred by a multi-feature based fuzzy inferring function employed in the SPD. The identified structure pattern, such as crossing or bifurcation, was used to control the tracking process, for example, to keep tracking the current segment or start tracking a new one, depending on the detected pattern. Results: By appropriate integration of these advanced preprocessing and tracking steps, our tracking algorithm is able to extract both vessel axis lines and edge points, as well as measure the arterial diameters in various complicated cases. For example, it can walk across gaps along the longitudinal vessel direction, manage varying vessel…

  13. Automatic fuzzy inference system development for marker-based watershed segmentation

    International Nuclear Information System (INIS)

    Gonzalez, M A; Meschino, G J; Ballarin, V L

    2007-01-01

    Texture image segmentation is a constant challenge in digital image processing. The partition of an image into regions that allow the experienced observer to obtain the necessary information can be done using a Mathematical Morphology tool called the Watershed Transform. This transform is able to distinguish extremely complex objects and is easily adaptable to various kinds of images. The success of the Watershed Transform depends essentially on the existence of unequivocal markers for each of the objects of interest. The standard methods for marker detection are highly specific and complex when objects presenting great variability of shape, size and texture are processed. This paper proposes the automatic generation of a fuzzy inference system for marker detection, using object selection done by the expert. This method allows applying the Watershed Transform to biomedical images with different kinds of texture. The results allow concluding that the proposed method is an effective tool for the application of the Watershed Transform.
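
    For readers unfamiliar with the transform itself, the following is a minimal marker-based watershed sketch in scikit-image; threshold-based markers on a random image stand in for the paper's fuzzy-inference markers.

```python
# Marker-based watershed: flood a gradient relief starting only from the
# supplied markers, so marker quality decides segmentation quality.
import numpy as np
from skimage.filters import sobel
from skimage.segmentation import watershed

image = np.random.rand(128, 128)      # placeholder for a texture image
gradient = sobel(image)               # relief on which the flooding runs

markers = np.zeros_like(image, dtype=int)
markers[image < 0.1] = 1              # "background" seeds (assumed rule)
markers[image > 0.9] = 2              # "object" seeds (assumed rule)

labels = watershed(gradient, markers)  # every pixel gets label 1 or 2
print(np.unique(labels))
```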

  14. SEMANTIC PATCH INFERENCE

    DEFF Research Database (Denmark)

    Andersen, Jesper

    2009-01-01

    Collateral evolution is the problem of updating several library-using programs in response to API changes in the used library. In this dissertation we address the issue of understanding collateral evolutions by automatically inferring a high-level specification of the changes evident in a given set … specifications inferred by spdiff in Linux are shown. We find that the inferred specifications concisely capture the actual collateral evolution performed in the examples…

  15. Extending Dylan's type system for better type inference and error detection

    DEFF Research Database (Denmark)

    Mehnert, Hannes

    2010-01-01

    … a dynamically typed language. Dylan poses several special challenges for gradual typing, such as multiple return values, variable-arity methods and generic functions (multiple dispatch). In this paper Dylan is extended with function types and parametric polymorphism. We implemented the type system … and a unification-based type inference algorithm in the mainstream Dylan compiler. As a case study we use the Dylan standard library (roughly 32000 lines of code), which demonstrates that the implementation generates faster code with fewer errors. Some previously undiscovered errors in the Dylan library were revealed.

  16. Automatic Detection of Wild-type Mouse Cranial Sutures

    DEFF Research Database (Denmark)

    Ólafsdóttir, Hildur; Darvann, Tron Andre; Hermann, Nuno V.

    … automatic detection of the cranial sutures becomes important. We have previously built a craniofacial, wild-type mouse atlas from a set of 10 Micro CT scans using a B-spline-based nonrigid registration method by Rueckert et al. Subsequently, all volumes were registered nonrigidly to the atlas. Using …, the observer traced the sutures on each of the mouse volumes as well. The observer outperforms the automatic approach by approximately 0.1 mm. All mice have similar errors, while the suture error plots reveal that sutures 1 and 2 are cumbersome, both for the observer and the automatic approach. These sutures can…

  17. Automatic inference of geometric camera parameters and inter-camera topology in uncalibrated disjoint surveillance cameras

    Science.gov (United States)

    den Hollander, Richard J. M.; Bouma, Henri; Baan, Jan; Eendebak, Pieter T.; van Rest, Jeroen H. C.

    2015-10-01

    Person tracking across non-overlapping cameras and other types of video analytics benefit from spatial calibration information that allows an estimation of the distance between cameras and a relation between pixel coordinates and world coordinates within a camera. In a large environment with many cameras, or for frequent ad-hoc deployments of cameras, the cost of this calibration is high. This creates a barrier for the use of video analytics. Automating the calibration allows for a short configuration time, and the use of video analytics in a wider range of scenarios, including ad-hoc crisis situations and large scale surveillance systems. We show an autocalibration method entirely based on pedestrian detections in surveillance video in multiple non-overlapping cameras. In this paper, we show the two main components of automatic calibration. The first shows the intra-camera geometry estimation that leads to an estimate of the tilt angle, focal length and camera height, which is important for the conversion from pixels to meters and vice versa. The second component shows the inter-camera topology inference that leads to an estimate of the distance between cameras, which is important for spatio-temporal analysis of multi-camera tracking. This paper describes each of these methods and provides results on realistic video data.
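
    The role of the intra-camera parameters can be made concrete with a little flat-ground geometry. The sketch below is a pinhole model with invented calibration numbers, not the paper's estimator: it converts an image row to a metric ground distance from the estimated tilt angle, focal length, and camera height.

```python
# Pixel-to-meter conversion for a pinhole camera over a flat ground plane:
# a pixel row fixes the ray's angle below the horizon, and together with the
# camera height that fixes the ground distance.
import math

def ground_distance(y_pix, cy, f_pix, tilt_rad, height_m):
    """Distance (m) from the camera foot to the ground point imaged at row
    y_pix; rows grow downward and cy is the principal-point row."""
    ray_below_horizon = tilt_rad + math.atan2(y_pix - cy, f_pix)
    if ray_below_horizon <= 0:
        raise ValueError("ray is at or above the horizon")
    return height_m / math.tan(ray_below_horizon)

# Hypothetical calibration: 1000 px focal length, 10 deg tilt, 4 m height.
print(ground_distance(700, cy=540, f_pix=1000.0,
                      tilt_rad=math.radians(10), height_m=4.0))
```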

  18. [Comparison of different types automatic water-supply system for mouse rearing (author's transl)].

    Science.gov (United States)

    Kikuchi, S; Suzuki, M; Tagashira, Y

    1979-04-01

    Rearing and breeding scores were compared between groups of mice (JCL: ICR and ddN strains) raised with two different types of automatic water-supply systems, the Japanese type and the American type, using a manual water-supply system as control. The mice raised with the manual water-supply system were superior in body weight gain as compared to those with the two automatic water-supply systems. As to the survival rate, however, the manual water-supply system and the Japanese type gave better results than the American type. As to weanling rate in the breeding test, the manual water-supply system gave a somewhat better result than either of the two automatic types. Accidental water leaks, which are a serious problem of automatic systems, occurred frequently only when the American type was used. The only defect revealed for the Japanese type was that it was unfavorable for mice of smaller size (e.g., young ddN mice), resulting in lower body weight gain as well as lower breeding scores.

  19. Automatic and strategic processes in advertising effects

    DEFF Research Database (Denmark)

    Grunert, Klaus G.

    1996-01-01

    Two kinds of cognitive processes can be distinguished: automatic processes, which are mostly subconscious, are learned and changed very slowly, and are not subject to the capacity limitations of working memory; and strategic processes, which are conscious and subject to capacity limitations. … the retrieval of information, and provide a heuristic for brand evaluation. Strategic processes govern learning and inference formation. The relative importance of both types of processes will depend on product involvement. The distinction of these two types of processes leads to some conclusions which are at variance with current notions about advertising effects. For example, the attention span problem will be relevant only for strategic processes, not for automatic processes; a certain amount of learning can occur with very little conscious effort; and advertising's effect on brand evaluation may be more stable…

  20. Depression, automatic thoughts, alexithymia, and assertiveness in patients with tension-type headache.

    Science.gov (United States)

    Yücel, Basak; Kora, Kaan; Ozyalçín, Süleyman; Alçalar, Nilüfer; Ozdemir, Ozay; Yücel, Aysen

    2002-03-01

    The role of psychological factors related to headache has long been a focus of investigation. The aim of this study was to evaluate depression, automatic thoughts, alexithymia, and assertiveness in persons with tension-type headache and to compare the results with those from healthy controls. One hundred five subjects with tension-type headache (according to the criteria of the International Headache Society classification) and 70 controls were studied. The Beck Depression Inventory, Automatic Thoughts Scale, Toronto Alexithymia Scale, and Rathus Assertiveness Schedule were administered to both groups. Sociodemographic variables and headache features were evaluated via a semistructured scale. Compared with healthy controls, the subjects with headache had significantly higher scores on measures of depression, automatic thoughts, and alexithymia and lower scores on assertiveness. Subjects with chronic tension-type headache had higher depression and automatic thoughts scores than those with episodic tension-type headache. These findings suggested that persons with tension-type headache have high depression scores and also may have difficulty with expression of their emotions. Headache frequency appears to influence the likelihood of coexisting depression.

  1. Fisher information and statistical inference for phase-type distributions

    DEFF Research Database (Denmark)

    Bladt, Mogens; Esparza, Luz Judith R; Nielsen, Bo Friis

    2011-01-01

    This paper is concerned with statistical inference for both continuous and discrete phase-type distributions. We consider maximum likelihood estimation, where traditionally the expectation-maximization (EM) algorithm has been employed. Certain numerical aspects of this method are revised and we...

  2. AUTOMATIC ROAD GAP DETECTION USING FUZZY INFERENCE SYSTEM

    Directory of Open Access Journals (Sweden)

    S. Hashemi

    2012-09-01

    Full Text Available Automatic feature extraction from aerial and satellite images is a high-level data-processing task which is still one of the most important research topics of the field. Most research in this area focuses on the early step of road detection, where road tracking methods, morphological analysis, dynamic programming and snakes, multi-scale and multi-resolution methods, stereoscopic and multi-temporal analysis, and hyperspectral experiments are some of the mature methods. Although most research focuses on detection algorithms, none of them can extract the road network perfectly. On the other hand, post-processing algorithms, which concentrate on refining road detection results, are not as well developed. In this article, the main aim is to design an intelligent method to detect and compensate road gaps remaining in the early results of road detection algorithms. The proposed algorithm consists of five main steps, as follows. 1) Short gap coverage: a multi-scale morphological operator is designed that covers short gaps in a hierarchical scheme. 2) Long gap detection: the long gaps that could not be covered in the previous stage are detected using a fuzzy inference system; for this purpose, a knowledge base consisting of expert rules is designed and fired on gap candidates from the road detection results. 3) Long gap coverage: detected long gaps are compensated by two strategies, linear and polynomial; shorter gaps are filled by line fitting, while longer ones are compensated by polynomials. 4) Accuracy assessment: in order to evaluate the obtained results, some accuracy assessment criteria are proposed. These criteria are obtained by comparing the obtained results with truly compensated ones produced by a human expert. The complete evaluation of the obtained results, with their technical discussion, is given in the full paper.

  3. Automatic Road Gap Detection Using Fuzzy Inference System

    Science.gov (United States)

    Hashemi, S.; Valadan Zoej, M. J.; Mokhtarzadeh, M.

    2011-09-01

    Automatic feature extraction from aerial and satellite images is a high-level data-processing task which is still one of the most important research topics of the field. Most research in this area focuses on the early step of road detection, where road tracking methods, morphological analysis, dynamic programming and snakes, multi-scale and multi-resolution methods, stereoscopic and multi-temporal analysis, and hyperspectral experiments are some of the mature methods. Although most research focuses on detection algorithms, none of them can extract the road network perfectly. On the other hand, post-processing algorithms, which concentrate on refining road detection results, are not as well developed. In this article, the main aim is to design an intelligent method to detect and compensate road gaps remaining in the early results of road detection algorithms. The proposed algorithm consists of five main steps, as follows. 1) Short gap coverage: a multi-scale morphological operator is designed that covers short gaps in a hierarchical scheme. 2) Long gap detection: the long gaps that could not be covered in the previous stage are detected using a fuzzy inference system; for this purpose, a knowledge base consisting of expert rules is designed and fired on gap candidates from the road detection results. 3) Long gap coverage: detected long gaps are compensated by two strategies, linear and polynomial; shorter gaps are filled by line fitting, while longer ones are compensated by polynomials. 4) Accuracy assessment: in order to evaluate the obtained results, some accuracy assessment criteria are proposed. These criteria are obtained by comparing the obtained results with truly compensated ones produced by a human expert. The complete evaluation of the obtained results, with their technical discussion, is given in the full paper.
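
    The two gap-coverage strategies in step 3 come down to fitting the known centreline points on both sides of a gap. A toy version is given below, with invented coordinates: degree 1 for short gaps and a low-order polynomial for long ones.

```python
# Bridge road-centreline gaps by fitting x -> y on the points flanking the
# gap and evaluating the fit across it: a line for short gaps, a polynomial
# for long ones.
import numpy as np

def bridge_gap(xs, ys, gap_xs, degree):
    coeffs = np.polyfit(xs, ys, deg=degree)
    return np.polyval(coeffs, gap_xs)

x_known = np.array([0, 1, 2, 8, 9, 10], dtype=float)  # points on both sides
y_known = 0.1 * x_known ** 2                          # a gently curving road

print(bridge_gap(x_known, y_known, np.array([2.5, 3.0]), degree=1))  # short
print(bridge_gap(x_known, y_known, np.arange(3.0, 8.0), degree=2))   # long
```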

  4. Automatic Type Recognition and Mapping of Global Tropical Cyclone Disaster Chains (TDC)

    Directory of Open Access Journals (Sweden)

    Ran Wang

    2016-10-01

    Full Text Available The catastrophic events caused by meteorological disasters are becoming more severe in the context of global warming. The disaster chains triggered by tropical cyclones induce serious losses of population and economy. It is necessary to make regional type recognition of the Tropical Cyclone Disaster Chain (TDC) effective in order to enable targeted prevention. This study explores a method for the automatic recognition and mapping of TDC and designs a software system. We constructed an automatic recognition system in terms of the characteristics of a hazard-formative environment, based on the theory of a natural disaster system. The ArcEngine components enable an intelligent software system to present results through an automatic mapping approach. The study data come from global metadata such as the Digital Elevation Model (DEM), terrain slope, population density and Gross Domestic Product (GDP). The results show that: (1) according to the characteristics of geomorphology type, we establish a type recognition system for global TDC; (2) based on the recognition principle, we design a software system with the functions of automatic recognition and mapping; and (3) we validate the type distribution against real cases of TDC. The results show that the automatic recognition function has good reliability. The study can provide the basis for a targeted regional disaster prevention strategy, as well as regional sustainable development.

  5. Toward the Decision Tree for Inferring Requirements Maturation Types

    Science.gov (United States)

    Nakatani, Takako; Kondo, Narihito; Shirogane, Junko; Kaiya, Haruhiko; Hori, Shozo; Katamine, Keiichi

    Requirements are elicited step by step during the requirements engineering (RE) process. However, some types of requirements are elicited completely only after the scheduled requirements elicitation process is finished. Such a situation is regarded as problematic. In our study, the difficulty of eliciting various kinds of requirements is observed per component. We refer to the components as observation targets (OTs) and introduce the term “requirements maturation,” which denotes when and how requirements are elicited completely in the project. Requirements maturation is discussed for physical and logical OTs. OTs viewed from a logical viewpoint are called logical OTs, e.g., quality requirements. The requirements of physical OTs, e.g., modules, components, subsystems, etc., include functional and non-functional requirements. They are influenced by their requesters' environmental changes, as well as developers' technical changes. In order to infer the requirements maturation period of each OT, we need to know how much these factors influence the OTs' requirements maturation. Based on the observation of actual past projects, we defined the PRINCE (Pre Requirements Intelligence Net Consideration and Evaluation) model. It aims to guide developers in their observation of the requirements maturation of OTs. We quantitatively analyzed the actual cases with their requirements elicitation process and extracted essential factors that influence requirements maturation. The results of interviews with project managers were analyzed with WEKA, a data mining system, from which the decision tree was derived. This paper introduces the PRINCE model and the category of logical OTs to be observed. The decision tree that helps developers infer the maturation type of an OT is also described. We evaluate the tree through real projects and discuss its ability to infer requirements maturation types.
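
    As a schematic of the final step, a decision tree over project factors can be induced and read off directly. The factor vectors and labels below are invented, and scikit-learn stands in for the WEKA analysis used in the paper.

```python
# Toy decision tree inferring a requirements-maturation type from two
# hypothetical project factors: requester volatility and technology novelty.
from sklearn.tree import DecisionTreeClassifier, export_text

X = [[0, 0], [0, 1], [1, 0], [1, 1], [0, 0], [1, 1]]  # factor vectors
y = ["early", "late", "late", "very-late", "early", "very-late"]

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["volatility", "novelty"]))
print(tree.predict([[1, 0]]))  # -> ['late']
```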

  6. Spontaneous Facial Mimicry Is Enhanced by the Goal of Inferring Emotional States: Evidence for Moderation of "Automatic" Mimicry by Higher Cognitive Processes.

    Science.gov (United States)

    Murata, Aiko; Saito, Hisamichi; Schug, Joanna; Ogawa, Kenji; Kameda, Tatsuya

    2016-01-01

    A number of studies have shown that individuals often spontaneously mimic the facial expressions of others, a tendency known as facial mimicry. This tendency has generally been considered a reflex-like "automatic" response, but several recent studies have shown that the degree of mimicry may be moderated by contextual information. However, the cognitive and motivational factors underlying the contextual moderation of facial mimicry require further empirical investigation. In this study, we present evidence that the degree to which participants spontaneously mimic a target's facial expressions depends on whether participants are motivated to infer the target's emotional state. In the first study we show that facial mimicry, assessed by facial electromyography, occurs more frequently when participants are specifically instructed to infer a target's emotional state than when given no instruction. In the second study, we replicate this effect using the Facial Action Coding System to show that participants are more likely to mimic facial expressions of emotion when they are asked to infer the target's emotional state, rather than make inferences about a physical trait unrelated to emotion. These results provide convergent evidence that the explicit goal of understanding a target's emotional state affects the degree of facial mimicry shown by the perceiver, suggesting moderation of reflex-like motor activities by higher cognitive processes.

  7. Integrating distributed Bayesian inference and reinforcement learning for sensor management

    NARCIS (Netherlands)

    Grappiolo, C.; Whiteson, S.; Pavlin, G.; Bakker, B.

    2009-01-01

    This paper introduces a sensor management approach that integrates distributed Bayesian inference (DBI) and reinforcement learning (RL). DBI is implemented using distributed perception networks (DPNs), a multiagent approach to performing efficient inference, while RL is used to automatically

  8. On stylistic automatization of lexical units in various types of contexts

    Directory of Open Access Journals (Sweden)

    В В Зуева

    2009-12-01

    Full Text Available Stylistic automatization of lexical units in various types of contexts is investigated in this article. Following the works of Boguslav Havranek and other linguists of the Prague Linguistic School automatization is treated as a contextual narrowing of the meaning of a lexical unit to the level of its complete predictability in situational contexts and the lack of stylistic contradiction with other lexical units in speech.

  9. An Automatic Lab-on-Disc System for Blood Typing.

    Science.gov (United States)

    Chang, Yaw-Jen; Fan, Yi-Hua; Chen, Shia-Chung; Lee, Kuan-Hua; Lou, Liao-Yong

    2018-04-01

    A blood-typing assay is a critical test to ensure the serological compatibility of a donor and an intended recipient prior to a blood transfusion. This article presents a lab-on-disc blood-typing system to conduct a total of eight assays for a patient, including forward-typing tests, reverse-typing tests, and irregular-antibody tests. These assays are carried out in a microfluidic disc simultaneously. A blood-typing apparatus was designed to automatically manipulate the disc. The blood type can be determined by integrating the results of red blood cell (RBC) agglutination in the microchannels. The experimental results of our current 40 blood samples show that the results agree with those examined in the hospital. The accuracy reaches 97.5%.

  10. A Comparative Analysis of Fuzzy Inference Engines in Context of ...

    African Journals Online (AJOL)

    Fuzzy inference engine has found successful applications in a wide variety of fields, such as automatic control, data classification, decision analysis, expert engines, time series prediction, robotics, pattern recognition, etc. This paper presents a comparative analysis of three fuzzy inference engines, max-product, max-min ...
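
    The engines named above differ in how the fuzzified input is composed with the rule relation. Below is a small numpy illustration, with invented membership values, of max-min versus max-product composition.

```python
# Compose a fuzzy input with a fuzzy relation R (input terms x output terms)
# under the two operators being compared.
import numpy as np

mu_in = np.array([0.2, 0.8, 0.5])   # memberships of the crisp input
R = np.array([[0.9, 0.1],
              [0.4, 0.7],
              [0.6, 0.3]])

max_min = np.max(np.minimum(mu_in[:, None], R), axis=0)
max_prod = np.max(mu_in[:, None] * R, axis=0)
print(max_min)   # [0.5 0.7]
print(max_prod)  # [0.32 0.56]
```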

  11. A Risk Assessment System with Automatic Extraction of Event Types

    Science.gov (United States)

    Capet, Philippe; Delavallade, Thomas; Nakamura, Takuya; Sandor, Agnes; Tarsitano, Cedric; Voyatzi, Stavroula

    In this article we describe the joint effort of experts in linguistics, information extraction and risk assessment to integrate EventSpotter, an automatic event extraction engine, into ADAC, an automated early warning system. By detecting as early as possible weak signals of emerging risks ADAC provides a dynamic synthetic picture of situations involving risk. The ADAC system calculates risk on the basis of fuzzy logic rules operated on a template graph whose leaves are event types. EventSpotter is based on a general purpose natural language dependency parser, XIP, enhanced with domain-specific lexical resources (Lexicon-Grammar). Its role is to automatically feed the leaves with input data.

  12. Fetal ECG extraction via Type-2 adaptive neuro-fuzzy inference systems.

    Science.gov (United States)

    Ahmadieh, Hajar; Asl, Babak Mohammadzadeh

    2017-04-01

    We proposed a noninvasive method for separating the fetal ECG (FECG) from the maternal ECG (MECG) by using Type-2 adaptive neuro-fuzzy inference systems. The method can extract FECG components from the abdominal signal by using one abdominal channel, including maternal and fetal cardiac signals and other environmental noise signals, and one chest channel. The proposed algorithm detects the nonlinear dynamics of the mother's body, so the components of the MECG are estimated from the abdominal signal. By subtracting the estimated maternal cardiac signal from the abdominal signal, the fetal cardiac signal can be extracted. This algorithm was applied to synthetic ECG signals generated based on the models developed by McSharry et al. and Behar et al., and also to the real DaISy database. In environments with high uncertainty, our method performs better than the Type-1 fuzzy method. Specifically, in evaluation of the algorithm with the synthetic data based on the McSharry model, for input signals with an SNR of -5 dB, the SNR of the extracted FECG was improved by 38.38% in comparison with the Type-1 fuzzy method. Also, the results show that increasing the uncertainty or decreasing the input SNR increases the percentage improvement in SNR of the extracted FECG. For instance, when the SNR of the input signal decreases to -30 dB, our proposed algorithm improves the SNR of the extracted FECG by 71.06% with respect to the Type-1 fuzzy method. The same results were obtained on synthetic data based on the Behar model. Our results on the real database reflect the success of the proposed method in separating the maternal and fetal heart signals even if their waves overlap in time. Moreover, the proposed algorithm was applied to simulated fetal ECG with ectopic beats and achieved good results in separating FECG from MECG. The results show the superiority of the proposed Type-2 neuro-fuzzy inference method over the Type-1 neuro-fuzzy inference and the polynomial networks methods, which is due to its…
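
    The estimate-and-subtract framework itself is easy to demonstrate with a much simpler stand-in for the Type-2 neuro-fuzzy estimator. Below, an LMS adaptive filter applied to synthetic sinusoidal "ECGs" (an illustration of the subtraction scheme, not the paper's method) predicts the maternal component of the abdominal channel from the chest channel and keeps the residual as the fetal estimate.

```python
# Adaptive maternal-ECG cancellation: predict the abdominal channel's
# maternal component from the chest channel; the residual approximates FECG.
import numpy as np

def lms_cancel(chest, abdominal, n_taps=8, mu=0.01):
    w = np.zeros(n_taps)
    fetal = np.zeros_like(abdominal)
    for n in range(n_taps, len(abdominal)):
        x = chest[n - n_taps:n][::-1]       # recent chest samples
        mecg_hat = w @ x                    # predicted maternal component
        fetal[n] = abdominal[n] - mecg_hat  # residual ~ FECG + noise
        w += mu * fetal[n] * x              # LMS weight update
    return fetal

t = np.arange(4000) / 500.0
mecg = np.sin(2 * np.pi * 1.2 * t)         # maternal rhythm (~72 bpm)
fecg = 0.2 * np.sin(2 * np.pi * 2.3 * t)   # fetal rhythm (~138 bpm)
est = lms_cancel(mecg, 0.8 * mecg + fecg)  # residual tracks fecg once adapted
```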

  13. Wallac automatic alarm dosimeter type RAD21

    International Nuclear Information System (INIS)

    Burgess, P. H.; Iles, W.J.

    1980-02-01

    The Automatic Alarm Dosimeter type RAD 21 is a battery-powered personal dosemeter and exposure rate alarm monitor, designed to be worn on the body. It covers an exposure range from 0.1 to 999.9 mR and has an audible alarm which can be pre-set over the range 1 mR h⁻¹ to 250 mR h⁻¹. The instrument is designed to measure x- and γ-radiation over the energy range 50 keV to 3 MeV. The facilities and controls, the radiation, electrical, environmental and mechanical characteristics, and the manual have been evaluated. (U.K.)

  14. Type-Based Automated Verification of Authenticity in Asymmetric Cryptographic Protocols

    DEFF Research Database (Denmark)

    Dahl, Morten; Kobayashi, Naoki; Sun, Yunde

    2011-01-01

    Gordon and Jeffrey developed a type system for verification of asymmetric and symmetric cryptographic protocols. We propose a modified version of Gordon and Jeffrey's type system and develop a type inference algorithm for it, so that protocols can be verified automatically as they are, without any type annotations or explicit type casts. We have implemented a protocol verifier SpiCa based on the algorithm, and confirmed its effectiveness…

  15. A new type industrial total station based on target automatic collimation

    Science.gov (United States)

    Lao, Dabao; Zhou, Weihu; Ji, Rongyi; Dong, Dengfeng; Xiong, Zhi; Wei, Jiang

    2018-01-01

    For industrial field measurement, present measuring instruments require manual operation and collimation, which results in low efficiency. In order to solve this problem, a new type of industrial total station is presented in this paper. The new instrument can identify and trace a cooperative target automatically; in the meantime, the coordinate of the target is measured in real time. Realizing the system required working out key technologies including high-precision absolute distance measurement, compact high-accuracy angle measurement, automatic target collimation with vision, and fast precise control. After customized system assembly and adjustment, the new industrial total station was established. As the experiments demonstrated, the coordinate accuracy of the instrument is under 15 ppm at a distance of 60 m, which proves that the measuring system is feasible. The results show that the total station can satisfy most industrial field measurement requirements.

  16. Automatic Imitation

    Science.gov (United States)

    Heyes, Cecilia

    2011-01-01

    "Automatic imitation" is a type of stimulus-response compatibility effect in which the topographical features of task-irrelevant action stimuli facilitate similar, and interfere with dissimilar, responses. This article reviews behavioral, neurophysiological, and neuroimaging research on automatic imitation, asking in what sense it is "automatic"…

  17. Koka: Programming with Row Polymorphic Effect Types

    Directory of Open Access Journals (Sweden)

    Daan Leijen

    2014-06-01

    Full Text Available We propose a programming model where effects are treated in a disciplined way, and where the potential side-effects of a function are apparent in its type signature. The type and effect of expressions can also be inferred automatically, and we describe a polymorphic type inference system based on Hindley-Milner style inference. A novel feature is that we support polymorphic effects through row-polymorphism using duplicate labels. Moreover, we show that our effects are not just syntactic labels but have a deep semantic connection to the program. For example, if an expression can be typed without an _exn_ effect, then it will never throw an unhandled exception. Similar to Haskell's `runST` we show how we can safely encapsulate stateful operations. Through the state effect, we can also safely combine state with let-polymorphism without needing either imperative type variables or a syntactic value restriction. Finally, our system is implemented fully in a new language called Koka and has been used successfully on various small to medium-sized sample programs ranging from a Markdown processor to a tier-splitted chat application. You can try out Koka live at www.rise4fun.com/koka/tutorial.
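
    As a side note on the Hindley-Milner machinery mentioned above, the unification step at its core fits in a few lines of Python (illustrative only; it omits the occurs check, generalization, and of course Koka's effect rows on the function arrow):

```python
# Unification of simple type terms: strings starting with "?" are type
# variables, other strings are constants, tuples are type constructors.
def resolve(t, s):
    while isinstance(t, str) and t.startswith("?") and t in s:
        t = s[t]
    return t

def unify(a, b, s):
    a, b = resolve(a, s), resolve(b, s)
    if a == b:
        return s
    if isinstance(a, str) and a.startswith("?"):
        return {**s, a: b}
    if isinstance(b, str) and b.startswith("?"):
        return {**s, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            s = unify(x, y, s)
        return s
    raise TypeError(f"cannot unify {a} with {b}")

# Applying the identity function (?a -> ?a) to an int forces ?r = int.
s = unify(("->", "?a", "?a"), ("->", "int", "?r"), {})
print(resolve("?r", s))  # int
```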

  18. Analyzing Program Termination and Complexity Automatically with AProVE

    DEFF Research Database (Denmark)

    Giesl, Jürgen; Aschermann, Cornelius; Brockschmidt, Marc

    2017-01-01

    In this system description, we present the tool AProVE for automatic termination and complexity proofs of Java, C, Haskell, Prolog, and rewrite systems. In addition to classical term rewrite systems (TRSs), AProVE also supports rewrite systems containing built-in integers (int-TRSs). To analyze...... programs in high-level languages, AProVE automatically converts them to (int-)TRSs. Then, a wide range of techniques is employed to prove termination and to infer complexity bounds for the resulting rewrite systems. The generated proofs can be exported to check their correctness using automatic certifiers...

  19. Automatic invariant detection in dynamic web applications

    NARCIS (Netherlands)

    Groeneveld, F.; Mesbah, A.; Van Deursen, A.

    2010-01-01

    The complexity of modern web applications increases as client-side JavaScript and dynamic DOM programming are used to offer a more interactive web experience. In this paper, we focus on improving the dependability of such applications by automatically inferring invariants from the client-side and

  20. Intracranial EEG correlates of implicit relational inference within the hippocampus.

    Science.gov (United States)

    Reber, T P; Do Lam, A T A; Axmacher, N; Elger, C E; Helmstaedter, C; Henke, K; Fell, J

    2016-01-01

    Drawing inferences from past experiences enables adaptive behavior in future situations. Inference has been shown to depend on hippocampal processes. Usually, inference is considered a deliberate and effortful mental act which happens during retrieval, and requires the focus of our awareness. Recent fMRI studies hint at the possibility that some forms of hippocampus-dependent inference can also occur during encoding and possibly also outside of awareness. Here, we sought to further explore the feasibility of hippocampal implicit inference, and specifically address the temporal evolution of implicit inference using intracranial EEG. Presurgical epilepsy patients with hippocampal depth electrodes viewed a sequence of word pairs, and judged the semantic fit between two words in each pair. Some of the word pairs entailed a common word (e.g., "winter-red," "red-cat") such that an indirect relation was established in following word pairs (e.g., "winter-cat"). The behavioral results suggested that drawing inference implicitly from past experience is feasible because indirect relations seemed to foster "fit" judgments while the absence of indirect relations fostered "do not fit" judgments, even though the participants were unaware of the indirect relations. An event-related potential (ERP) difference emerging 400 ms post-stimulus was evident in the hippocampus during encoding, suggesting that indirect relations were already established automatically during encoding of the overlapping word pairs. Further ERP differences emerged later post-stimulus (1,500 ms), were modulated by the participants' responses and were evident during encoding and test. Furthermore, response-locked ERP effects were evident at test. These ERP effects could hence be a correlate of the interaction of implicit memory with decision-making. Together, the data map out a time-course in which the hippocampus automatically integrates memories from discrete but related episodes to implicitly influence future…

  1. Approximation Methods for Inference and Learning in Belief Networks: Progress and Future Directions

    National Research Council Canada - National Science Library

    Pazzan, Michael

    1997-01-01

    .... In this research project, we have investigated methods and implemented algorithms for efficiently making certain classes of inference in belief networks, and for automatically learning certain...

  2. Integrated Automatic Workflow for Phylogenetic Tree Analysis Using Public Access and Local Web Services.

    Science.gov (United States)

    Damkliang, Kasikrit; Tandayya, Pichaya; Sangket, Unitsa; Pasomsub, Ekawat

    2016-11-28

    At present, coding sequences (CDS) have been discovered and larger CDS are being revealed frequently. Approaches and related tools have also been developed and upgraded concurrently, especially for phylogenetic tree analysis. This paper proposes an integrated automatic Taverna workflow for phylogenetic tree inference analysis using public-access web services at the European Bioinformatics Institute (EMBL-EBI) and the Swiss Institute of Bioinformatics (SIB), and our own locally deployed web services. The workflow input is a set of CDS in the Fasta format. The workflow supports bootstrapping replication numbers from 1,000 to 20,000. The workflow performs tree inference with the Parsimony (PARS), Distance Matrix - Neighbor Joining (DIST-NJ), and Maximum Likelihood (ML) algorithms of the EMBOSS PHYLIPNEW package, based on our proposed Multiple Sequence Alignment (MSA) similarity score. The local web services are implemented and deployed in two forms, using Soaplab2 and Apache Axis2: SOAP and Java Web Service (JWS) endpoints providing WSDL interfaces to Taverna Workbench, a workflow manager. The workflow has been validated, its performance has been measured, and its results have been verified. The workflow's execution time is less than ten minutes for inferring a tree with 10,000 bootstrap replicates. This paper proposes a new integrated automatic workflow which will be beneficial to bioinformaticians with an intermediate level of knowledge and experience. All local services have been deployed at our portal http://bioservices.sci.psu.ac.th.

  3. Inference of segmented color and texture description by tensor voting.

    Science.gov (United States)

    Jia, Jiaya; Tang, Chi-Keung

    2004-06-01

    A robust synthesis method is proposed to automatically infer missing color and texture information from a damaged 2D image by (N)D tensor voting (N > 3). The same approach is generalized to range and 3D data in the presence of occlusion, missing data and noise. Our method translates texture information into an adaptive (N)D tensor, followed by a voting process that infers noniteratively the optimal color values in the (N)D texture space. A two-step method is proposed. First, we perform segmentation based on insufficient geometry, color, and texture information in the input, and extrapolate partitioning boundaries by either 2D or 3D tensor voting to generate a complete segmentation for the input. Missing colors are synthesized using (N)D tensor voting in each segment. Different feature scales in the input are automatically adapted by our tensor scale analysis. Results on a variety of difficult inputs demonstrate the effectiveness of our tensor voting approach.

  4. GRN2SBML: automated encoding and annotation of inferred gene regulatory networks complying with SBML.

    Science.gov (United States)

    Vlaic, Sebastian; Hoffmann, Bianca; Kupfer, Peter; Weber, Michael; Dräger, Andreas

    2013-09-01

    GRN2SBML automatically encodes gene regulatory networks derived from several inference tools in the Systems Biology Markup Language (SBML). Via its graphical user interface, the networks can be annotated through the simple object access protocol (SOAP)-based application programming interface of BioMart Central Portal and the Minimum Information Required In the Annotation of Models (MIRIAM) registry. Additionally, we provide an R package, which processes the output of supported inference algorithms and automatically passes all required parameters to GRN2SBML. Therefore, GRN2SBML closes a gap in the processing pipeline between the inference of gene regulatory networks and their subsequent analysis, visualization and storage. GRN2SBML is freely available under the GNU Public License version 3 and can be downloaded from http://www.hki-jena.de/index.php/0/2/490. General information on GRN2SBML, examples and tutorials are available at the tool's web page.

  5. Automatic NMR-based identification of chemical reaction types in mixtures of co-occurring reactions.

    Science.gov (United States)

    Latino, Diogo A R S; Aires-de-Sousa, João

    2014-01-01

    The combination of chemoinformatics approaches with NMR techniques and the increasing availability of data allow the resolution of problems far beyond the original application of NMR in structure elucidation/verification. The diversity of applications can range from process monitoring, metabolic profiling, authentication of products, to quality control. An application related to the automatic analysis of complex mixtures concerns mixtures of chemical reactions. We encoded mixtures of chemical reactions with the difference between the (1)H NMR spectra of the products and the reactants. All the signals arising from all the reactants of the co-occurring reactions were taken together (a simulated spectrum of the mixture of reactants) and the same was done for products. The difference spectrum is taken as the representation of the mixture of chemical reactions. A data set of 181 chemical reactions was used, each reaction manually assigned to one of 6 types. From this dataset, we simulated mixtures where two reactions of different types would occur simultaneously. Automatic learning methods were trained to classify the reactions occurring in a mixture from the (1)H NMR-based descriptor of the mixture. Unsupervised learning methods (self-organizing maps) produced a reasonable clustering of the mixtures by reaction type, and allowed the correct classification of 80% and 63% of the mixtures in two independent test sets of different similarity to the training set. With random forests (RF), the percentage of correct classifications was increased to 99% and 80% for the same test sets. The RF probability associated to the predictions yielded a robust indication of their reliability. This study demonstrates the possibility of applying machine learning methods to automatically identify types of co-occurring chemical reactions from NMR data. Using no explicit structural information about the reactions participants, reaction elucidation is performed without structure elucidation of
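
    The classification setup lends itself to a compact sketch. The vectors below are synthetic stand-ins for binned difference spectra, not NMR data, and scikit-learn's random forest stands in for the authors' implementation.

```python
# Classify co-occurring reaction types from a binned difference spectrum
# (product minus reactant signals) with a random forest; predict_proba gives
# the reliability indication mentioned above.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
n_bins = 64  # binned chemical-shift axis

def fake_mixture(label):
    """Stand-in for a difference spectrum with a class-specific peak."""
    base = np.zeros(n_bins)
    base[label * 8:label * 8 + 4] = 1.0
    return base + 0.1 * rng.standard_normal(n_bins)

X = np.array([fake_mixture(k % 6) for k in range(300)])
y = np.array([k % 6 for k in range(300)])

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
proba = clf.predict_proba(fake_mixture(3).reshape(1, -1))[0]
print(proba.argmax(), proba.max())  # predicted type and its probability
```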

  6. Automatic NMR-based identification of chemical reaction types in mixtures of co-occurring reactions.

    Directory of Open Access Journals (Sweden)

    Diogo A R S Latino

    Full Text Available The combination of chemoinformatics approaches with NMR techniques and the increasing availability of data allow the resolution of problems far beyond the original application of NMR in structure elucidation/verification. The diversity of applications can range from process monitoring, metabolic profiling, authentication of products, to quality control. An application related to the automatic analysis of complex mixtures concerns mixtures of chemical reactions. We encoded mixtures of chemical reactions with the difference between the (1)H NMR spectra of the products and the reactants. All the signals arising from all the reactants of the co-occurring reactions were taken together (a simulated spectrum of the mixture of reactants) and the same was done for products. The difference spectrum is taken as the representation of the mixture of chemical reactions. A data set of 181 chemical reactions was used, each reaction manually assigned to one of 6 types. From this dataset, we simulated mixtures where two reactions of different types would occur simultaneously. Automatic learning methods were trained to classify the reactions occurring in a mixture from the (1)H NMR-based descriptor of the mixture. Unsupervised learning methods (self-organizing maps) produced a reasonable clustering of the mixtures by reaction type, and allowed the correct classification of 80% and 63% of the mixtures in two independent test sets of different similarity to the training set. With random forests (RF), the percentage of correct classifications was increased to 99% and 80% for the same test sets. The RF probability associated to the predictions yielded a robust indication of their reliability. This study demonstrates the possibility of applying machine learning methods to automatically identify types of co-occurring chemical reactions from NMR data. Using no explicit structural information about the reactions participants, reaction elucidation is performed without structure…

  7. Deliberation versus automaticity in decision making: Which presentation format features facilitate automatic decision making?

    Directory of Open Access Journals (Sweden)

    Anke Soellner

    2013-05-01

    Full Text Available The idea of automatic decision making approximating normatively optimal decisions without necessitating much cognitive effort is intriguing. Whereas recent findings support the notion that such fast, automatic processes explain empirical data well, little is known about the conditions under which such processes are selected rather than more deliberate stepwise strategies. We investigate the role of the format of information presentation, focusing explicitly on the ease of information acquisition and its influence on information integration processes. In a probabilistic inference task, the standard matrix employed in prior research was contrasted with a newly created map presentation format and additional variations of both presentation formats. Across three experiments, a robust presentation format effect emerged: automatic decision making was more prevalent in the matrix (with high information accessibility), whereas sequential decision strategies prevailed when the presentation format demanded more information acquisition effort. Further scrutiny of the effect showed that it is not driven by the presentation format as such, but rather by the extent of information search induced by a format. Thus, if information is accessible with minimal need for information search, information integration is likely to proceed in a perception-like, holistic manner. In turn, a moderate demand for information search decreases the likelihood of behavior consistent with the assumptions of automatic decision making.

  8. Don’t Interrupt Me While I Type: Inferring Text Entered Through Gesture Typing on Android Keyboards

    Directory of Open Access Journals (Sweden)

    Simon Laurent

    2016-07-01

    Full Text Available We present a new side-channel attack against soft keyboards that support gesture typing on Android smartphones. An application without any special permissions can observe the number and timing of the screen hardware interrupts and system-wide software interrupts generated during user input, and analyze this information to make inferences about the text being entered by the user. System-wide information is usually considered less sensitive than app-specific information, but we provide concrete evidence that this may be mistaken. Our attack applies to all Android versions, including Android M where the SELinux policy is tightened.
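    The observation channel itself is easy to reproduce on a stock Linux box. The sketch below samples the system-wide interrupt counters that any unprivileged process can read and flags bursts that may coincide with input activity; the threshold and sampling rate are arbitrary choices, and the inference model that maps timings to text is deliberately omitted.

        import time

        def total_interrupts(path="/proc/interrupts"):
            # Sum every numeric per-CPU counter in the table.
            with open(path) as f:
                next(f)                        # skip the CPU header row
                return sum(int(tok) for line in f for tok in line.split()[1:]
                           if tok.isdigit())

        deltas, prev = [], total_interrupts()
        for _ in range(200):                   # ~2 s trace at 10 ms resolution
            time.sleep(0.01)
            cur = total_interrupts()
            deltas.append(cur - prev)
            prev = cur

        baseline = sorted(deltas)[len(deltas) // 2]    # median, idle rate
        bursts = sum(d > 3 * max(baseline, 1) for d in deltas)
        print(f"baseline {baseline} irqs/10 ms, {bursts} burst samples")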

  9. Data Provenance Inference in Logic Programming: Reducing Effort of Instance-driven Debugging

    NARCIS (Netherlands)

    Huq, M.R.; Mileo, Alessandra; Wombacher, Andreas

    Data provenance allows scientists in different domains to validate their models and algorithms and to uncover anomalies and unexpected behaviors. In previous works, we described on-the-fly interpretation of (Python) scripts to build a workflow provenance graph automatically and then infer fine-grained

  10. Vertically Integrated Seismological Analysis II : Inference

    Science.gov (United States)

    Arora, N. S.; Russell, S.; Sudderth, E.

    2009-12-01

    Methods for automatically associating detected waveform features with hypothesized seismic events, and localizing those events, are a critical component of efforts to verify the Comprehensive Test Ban Treaty (CTBT). As outlined in our companion abstract, we have developed a hierarchical model which views detection, association, and localization as an integrated probabilistic inference problem. In this abstract, we provide more details on the Markov chain Monte Carlo (MCMC) methods used to solve this inference task. MCMC generates samples from a posterior distribution π(x) over possible worlds x by defining a Markov chain whose states are the worlds x, and whose stationary distribution is π(x). In the Metropolis-Hastings (M-H) method, transitions in the Markov chain are constructed in two steps. First, given the current state x, a candidate next state x′ is generated from a proposal distribution q(x′ | x), which may be (more or less) arbitrary. Second, the transition to x′ is not automatic, but occurs with an acceptance probability α(x′ | x) = min(1, π(x′)q(x | x′)/(π(x)q(x′ | x))). The seismic event model outlined in our companion abstract is quite similar to those used in multitarget tracking, for which MCMC has proved very effective. In this model, each world x is defined by a collection of events, a list of properties characterizing those events (times, locations, magnitudes, and types), and the association of each event to a set of observed detections. The target distribution is π(x) = P(x | y), the posterior distribution over worlds x given the observed waveform data y at all stations. Proposal distributions then implement several types of moves between worlds. For example, birth moves create new events; death moves delete existing events; split moves partition the detections for an event into two new events; merge moves combine event pairs; swap moves modify the properties and associations for pairs of events. Importantly, the rules for
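    The acceptance rule above is easy to state in code. The sketch below runs plain Metropolis-Hastings on a toy one-dimensional target with a symmetric random-walk proposal (so the q-ratio cancels); the posterior over seismic "worlds" and the birth/death/split/merge moves of the paper would replace these stand-ins.

        import numpy as np

        rng = np.random.default_rng(1)
        log_pi = lambda x: -0.5 * x**2                  # toy target: standard normal

        def mh_chain(n_steps, step=1.0, x=0.0):
            samples = []
            for _ in range(n_steps):
                x_prop = x + rng.normal(scale=step)     # symmetric q(x' | x)
                log_alpha = log_pi(x_prop) - log_pi(x)  # log of the M-H ratio
                if np.log(rng.uniform()) < log_alpha:
                    x = x_prop                          # accept the proposed world
                samples.append(x)                       # else keep the current one
            return np.array(samples)

        s = mh_chain(10_000)
        print("mean ~ 0:", round(s.mean(), 3), " var ~ 1:", round(s.var(), 3))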

  11. Type Ia Supernova Light Curve Inference: Hierarchical Models for Nearby SN Ia in the Optical and Near Infrared

    Science.gov (United States)

    Mandel, Kaisey; Kirshner, R. P.; Narayan, G.; Wood-Vasey, W. M.; Friedman, A. S.; Hicken, M.

    2010-01-01

    I have constructed a comprehensive statistical model for Type Ia supernova light curves spanning optical through near infrared data simultaneously. The near infrared light curves are found to be excellent standard candles (σ(M_H) = 0.11 ± 0.03 mag) that are less vulnerable to systematic error from dust extinction, a major confounding factor for cosmological studies. A hierarchical statistical framework incorporates coherently multiple sources of randomness and uncertainty, including photometric error, intrinsic supernova light curve variations and correlations, dust extinction and reddening, peculiar velocity dispersion and distances, for probabilistic inference with Type Ia SN light curves. Inferences are drawn from the full probability density over individual supernovae and the SN Ia and dust populations, conditioned on a dataset of SN Ia light curves and redshifts. To compute probabilistic inferences with hierarchical models, I have developed BayeSN, a Markov Chain Monte Carlo algorithm based on Gibbs sampling. This code explores and samples the global probability density of parameters describing individual supernovae and the population. I have applied this hierarchical model to optical and near infrared data of over 100 nearby Type Ia SN from PAIRITEL, the CfA3 sample, and the literature. Using this statistical model, I find that SN with optical and NIR data have a smaller residual scatter in the Hubble diagram than SN with only optical data. The continued study of Type Ia SN in the near infrared will be important for improving their utility as precise and accurate cosmological distance indicators.

  12. An ontology for Autism Spectrum Disorder (ASD) to infer ASD phenotypes from Autism Diagnostic Interview-Revised data.

    Science.gov (United States)

    Mugzach, Omri; Peleg, Mor; Bagley, Steven C; Guter, Stephen J; Cook, Edwin H; Altman, Russ B

    2015-08-01

    Our goal is to create an ontology that will allow data integration and reasoning with subject data to classify subjects, and based on this classification, to infer new knowledge on Autism Spectrum Disorder (ASD) and related neurodevelopmental disorders (NDD). We take a first step toward this goal by extending an existing autism ontology to allow automatic inference of ASD phenotypes and Diagnostic & Statistical Manual of Mental Disorders (DSM) criteria based on subjects' Autism Diagnostic Interview-Revised (ADI-R) assessment data. Knowledge regarding diagnostic instruments, ASD phenotypes and risk factors was added to augment an existing autism ontology via Ontology Web Language class definitions and semantic web rules. We developed a custom Protégé plugin for enumerating combinatorial OWL axioms to support the many-to-many relations of ADI-R items to diagnostic categories in the DSM. We utilized a reasoner to infer whether 2642 subjects, whose data was obtained from the Simons Foundation Autism Research Initiative, meet DSM-IV-TR (DSM-IV) and DSM-5 diagnostic criteria based on their ADI-R data. We extended the ontology by adding 443 classes and 632 rules that represent phenotypes, along with their synonyms, environmental risk factors, and frequency of comorbidities. Applying the rules on the data set showed that the method produced accurate results: the true positive and true negative rates for inferring autistic disorder diagnosis according to DSM-IV criteria were 1 and 0.065, respectively; the true positive rate for inferring ASD based on DSM-5 criteria was 0.94. The ontology allows automatic inference of subjects' disease phenotypes and diagnosis with high accuracy. The ontology may benefit future studies by serving as a knowledge base for ASD. In addition, by adding knowledge of related NDDs, commonalities and differences in manifestations and risk factors could be automatically inferred, contributing to the understanding of ASD pathophysiology.
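    Stripped of the OWL machinery, the inference pattern is a rule base that counts abnormal items per diagnostic criterion. The sketch below shows that pattern in plain Python; the item codes, cutoffs, and criterion structure are invented for illustration and are not the actual ADI-R/DSM mapping.

        # One subject's (hypothetical) ADI-R item scores; >= 2 counts as abnormal.
        ADI_R_SCORES = {"A1": 2, "A2": 3, "B1": 1, "B4": 2, "C2": 3}

        DSM5_CRITERIA = {                       # criterion -> (items, min abnormal)
            "social_communication": (["A1", "A2", "B1"], 2),
            "restricted_repetitive": (["B4", "C2"], 2),
        }

        def meets_criterion(scores, items, min_abnormal, cutoff=2):
            return sum(scores.get(i, 0) >= cutoff for i in items) >= min_abnormal

        asd = all(meets_criterion(ADI_R_SCORES, items, k)
                  for items, k in DSM5_CRITERIA.values())
        print("rules infer an ASD diagnosis:", asd)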

  13. Automatic Reverse Engineering of Private Flight Control Protocols of UAVs

    Directory of Open Access Journals (Sweden)

    Ran Ji

    2017-01-01

    Full Text Available The increasing use of civil unmanned aerial vehicles (UAVs) has the potential to threaten public safety and privacy. Therefore, airspace administrators urgently need an effective method to regulate UAVs. Understanding the meaning and format of UAV flight control commands by automatic protocol reverse-engineering techniques is highly beneficial to UAV regulation. To improve our understanding of the meaning and format of UAV flight control commands, this paper proposes a method to automatically analyze the private flight control protocols of UAVs. First, we classify flight control commands collected from a binary network trace into clusters; then, we analyze the meaning of flight control commands by the accumulated error of each cluster; next, we extract the binary format of commands and infer field semantics in these commands; and finally, we infer the location of the check field in commands and the generator polynomial matrix. The proposed approach is validated via experiments on a widely used consumer UAV.
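    Two of those steps, clustering commands and spotting variable fields, can be sketched on synthetic fixed-length commands. The opcode byte, field offsets, and cluster count below are all made up; real traces also vary in length, which this toy ignores.

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(2)
        base = rng.integers(0, 256, size=16)         # a 16-byte command template
        cmds = np.tile(base, (100, 1))
        cmds[:50, 0], cmds[50:, 0] = 0x10, 0x20      # two command types (opcode byte)
        cmds[:50, 4] = rng.integers(0, 256, 50)      # e.g. a throttle payload field
        cmds[50:, 7] = rng.integers(0, 256, 50)      # e.g. a yaw payload field

        labels = KMeans(n_clusters=2, n_init=10, random_state=3).fit(
            cmds[:, :4].astype(float)).labels_       # cluster on header bytes only
        for k in range(2):
            var = cmds[labels == k].var(axis=0)      # high variance => variable field
            print(f"cluster {k}: candidate variable fields at offsets",
                  np.nonzero(var > 0)[0])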

  14. Methodology for Automatic Ontology Generation Using Database Schema Information

    Directory of Open Access Journals (Sweden)

    JungHyen An

    2018-01-01

    Full Text Available An ontology is a model language that supports the functions to integrate conceptually distributed domain knowledge and infer relationships among the concepts. Ontologies are developed based on the target domain knowledge. As a result, methodologies to automatically generate an ontology from metadata that characterize the domain knowledge are becoming important. However, existing methodologies for automatically generating an ontology from metadata require the domain metadata to be provided in a predetermined template, and it is difficult to manage the data that accumulate on the ontology itself as the domain OWL (Ontology Web Language) individuals continuously increase. The database schema has a feature of domain knowledge and provides structural functions to efficiently process the knowledge-based data. In this paper, we propose a methodology to automatically generate ontologies and manage OWL individuals through an interaction of the database and the ontology. We describe the automatic ontology generation process with example schema and demonstrate the effectiveness of the automatically generated ontology by comparing it with existing ontologies using the ontology quality score.
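    The core of such a methodology, mapping tables to classes, columns to datatype properties, and foreign keys to object properties, fits in a few lines. This sketch uses an in-memory SQLite schema invented for the example and prints Turtle-like statements instead of calling an OWL library.

        import sqlite3

        con = sqlite3.connect(":memory:")
        con.execute("CREATE TABLE patient (id INTEGER PRIMARY KEY, name TEXT, age INTEGER)")
        con.execute("CREATE TABLE visit (id INTEGER PRIMARY KEY, patient_id INTEGER "
                    "REFERENCES patient(id), date TEXT)")
        XSD = {"INTEGER": "xsd:integer", "TEXT": "xsd:string"}

        tables = [r[0] for r in con.execute(
            "SELECT name FROM sqlite_master WHERE type='table'")]
        for t in tables:
            print(f":{t.capitalize()} a owl:Class .")          # table -> class
            for cid, col, dtype, *_ in con.execute(f"PRAGMA table_info({t})"):
                rng_ = XSD.get(dtype, "xsd:string")            # column -> datatype property
                print(f":{t}_{col} a owl:DatatypeProperty ; "
                      f"rdfs:domain :{t.capitalize()} ; rdfs:range {rng_} .")
            for fk in con.execute(f"PRAGMA foreign_key_list({t})"):
                print(f":{t}_has_{fk[2]} a owl:ObjectProperty ; "
                      f"rdfs:range :{fk[2].capitalize()} .")   # FK -> object property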

  15. The Automatic Annotation of the Semiotic Type of Hand Gestures in Obama’s Humorous Speeches

    DEFF Research Database (Denmark)

    Navarretta, Costanza

    2018-01-01

    is expressed by speech or by adding new information to what is uttered. The automatic classification of the semiotic type of gestures from their shape description can contribute to their interpretation in human-human communication and in advanced multimodal interactive systems. We annotated and analysed hand...

  16. Estimating uncertainty of inference for validation

    Energy Technology Data Exchange (ETDEWEB)

    Booker, Jane M [Los Alamos National Laboratory; Langenbrunner, James R [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2010-09-30

    We present a validation process based upon the concept that validation is an inference-making activity. This has always been true, but the association has not been as important before as it is now. Previously, theory had been confirmed by more data, and predictions were possible based on data. The process today is to infer from theory to code and from code to prediction, making the role of prediction somewhat automatic, and a machine function. Validation is defined as determining the degree to which a model and code is an accurate representation of experimental test data. Imbedded in validation is the intention to use the computer code to predict. To predict is to accept the conclusion that an observable final state will manifest; therefore, prediction is an inference whose goodness relies on the validity of the code. Quantifying the uncertainty of a prediction amounts to quantifying the uncertainty of validation, and this involves the characterization of uncertainties inherent in theory/models/codes and the corresponding data. An introduction to inference making and its associated uncertainty is provided as a foundation for the validation problem. A mathematical construction for estimating the uncertainty in the validation inference is then presented, including a possibility distribution constructed to represent the inference uncertainty for validation under uncertainty. The estimation of inference uncertainty for validation is illustrated using data and calculations from Inertial Confinement Fusion (ICF). The ICF measurements of neutron yield and ion temperature were obtained for direct-drive inertial fusion capsules at the Omega laser facility. The glass capsules, containing the fusion gas, were systematically selected with the intent of establishing a reproducible baseline of high-yield 10^13-10^14 neutron output. The deuterium-tritium ratio in these experiments was varied to study its influence upon yield. This paper on validation inference is the

  17. A Network Inference Workflow Applied to Virulence-Related Processes in Salmonella typhimurium

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, Ronald C.; Singhal, Mudita; Weller, Jennifer B.; Khoshnevis, Saeed; Shi, Liang; McDermott, Jason E.

    2009-04-20

    Inference of the structure of mRNA transcriptional regulatory networks, protein regulatory or interaction networks, and protein activation/inactivation-based signal transduction networks is a critical task in systems biology. In this article we discuss a workflow for the reconstruction of parts of the transcriptional regulatory network of the pathogenic bacterium Salmonella typhimurium based on the information contained in sets of microarray gene expression data now available for that organism, and describe our results obtained by following this workflow. The primary tool is one of the network inference algorithms deployed in the Software Environment for BIological Network Inference (SEBINI). Specifically, we selected the algorithm called Context Likelihood of Relatedness (CLR), which uses the mutual information contained in the gene expression data to infer regulatory connections. The associated analysis pipeline automatically stores the inferred edges from the CLR runs within SEBINI and, upon request, transfers the inferred edges into either Cytoscape or the plug-in Collective Analysis of Biological Interaction Networks (CABIN) tool for further post-analysis of the inferred regulatory edges. The following article presents the outcome of this workflow, as well as the protocols followed for microarray data collection, data cleansing, and network inference. Our analysis revealed several interesting interactions, functional groups, metabolic pathways, and regulons in S. typhimurium.
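    The CLR scoring itself is compact: mutual information between discretized expression profiles, z-scored against each gene's background and combined per pair. The sketch below applies it to random data standing in for the S. typhimurium microarrays.

        import numpy as np
        from sklearn.metrics import mutual_info_score

        rng = np.random.default_rng(4)
        expr = rng.normal(size=(20, 100))      # 20 genes x 100 expression arrays

        def discretize(x, q=(0.25, 0.5, 0.75)):
            return np.digitize(x, np.quantile(x, q))

        d = np.array([discretize(g) for g in expr])
        n = len(d)
        mi = np.array([[mutual_info_score(d[i], d[j]) for j in range(n)]
                       for i in range(n)])

        # z-score each MI value against the background of its two genes.
        z = (mi - mi.mean(axis=1, keepdims=True)) / mi.std(axis=1, keepdims=True)
        clr = np.sqrt(np.maximum(z, 0) ** 2 + np.maximum(z.T, 0) ** 2)
        np.fill_diagonal(clr, 0)
        i, j = np.unravel_index(clr.argmax(), clr.shape)
        print(f"top candidate regulatory edge: gene {i} -- gene {j}")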

  18. Statistical Inference at Work: Statistical Process Control as an Example

    Science.gov (United States)

    Bakker, Arthur; Kent, Phillip; Derry, Jan; Noss, Richard; Hoyles, Celia

    2008-01-01

    To characterise statistical inference in the workplace this paper compares a prototypical type of statistical inference at work, statistical process control (SPC), with a type of statistical inference that is better known in educational settings, hypothesis testing. Although there are some similarities between the reasoning structure involved in…

  19. Automatically explaining machine learning prediction results: a demonstration on type 2 diabetes risk prediction.

    Science.gov (United States)

    Luo, Gang

    2016-01-01

    Predictive modeling is a key component of solutions to many healthcare problems. Among all predictive modeling approaches, machine learning methods often achieve the highest prediction accuracy, but suffer from a long-standing open problem precluding their widespread use in healthcare. Most machine learning models give no explanation for their prediction results, whereas interpretability is essential for a predictive model to be adopted in typical healthcare settings. This paper presents the first complete method for automatically explaining results for any machine learning predictive model without degrading accuracy. We implemented the method in code. Using the electronic medical record data set from the Practice Fusion diabetes classification competition containing patient records from all 50 states in the United States, we demonstrated the method on predicting type 2 diabetes diagnosis within the next year. For the champion machine learning model of the competition, our method explained prediction results for 87.4% of patients who were correctly predicted by the model to have a type 2 diabetes diagnosis within the next year. Our demonstration showed the feasibility of automatically explaining results for any machine learning predictive model without degrading accuracy.

  20. LAIT: a local ancestry inference toolkit.

    Science.gov (United States)

    Hui, Daniel; Fang, Zhou; Lin, Jerome; Duan, Qing; Li, Yun; Hu, Ming; Chen, Wei

    2017-09-06

    Inferring local ancestry in individuals of mixed ancestry has many applications, most notably in identifying disease-susceptible loci that vary among different ethnic groups. Many software packages are available for inferring local ancestry in admixed individuals. However, most of these existing software packages require specifically formatted input files and generate output files of various types, which is inconvenient in practice. We developed a tool set, Local Ancestry Inference Toolkit (LAIT), which can convert standardized files into software-specific input file formats as well as standardize and summarize inference results for four popular local ancestry inference software packages: HAPMIX, LAMP, LAMP-LD, and ELAI. We tested LAIT using both simulated and real data sets and demonstrated that LAIT makes it convenient to run multiple local ancestry inference packages. In addition, we evaluated the performance of the supported software packages, mainly focusing on inference accuracy and computational resources used. We provide a toolkit to facilitate the use of local ancestry inference software, especially for users with limited bioinformatics background.

  1. Automatic recognition of coronal type II radio bursts: The ARBIS 2 method and first observations

    Science.gov (United States)

    Lobzin, Vasili; Cairns, Iver; Robinson, Peter; Steward, Graham; Patterson, Garth

    Major space weather events such as solar flares and coronal mass ejections are usually accompanied by solar radio bursts, which can potentially be used for real-time space weather forecasts. Type II radio bursts are produced near the local plasma frequency and its harmonic by fast electrons accelerated by a shock wave moving through the corona and solar wind with a typical speed of 1000 km s^-1. The coronal bursts have dynamic spectra with frequency gradually falling with time and durations of several minutes. We present a new method developed to detect type II coronal radio bursts automatically and describe its implementation in an extended Automated Radio Burst Identification System (ARBIS 2). Preliminary tests of the method with spectra obtained in 2002 show that the performance of the current implementation is quite high, ~80%, while the probability of false positives is reasonably low, with one false positive per 100-200 hr for high solar activity and less than one false event per 10000 hr for low solar activity periods. The first automatically detected coronal type II radio bursts are also presented. ARBIS 2 is now operational with IPS Radio and Space Services, providing email alerts and event lists internationally.

  2. Using suggestion to model different types of automatic writing.

    Science.gov (United States)

    Walsh, E; Mehta, M A; Oakley, D A; Guilmette, D N; Gabay, A; Halligan, P W; Deeley, Q

    2014-05-01

    Our sense of self includes awareness of our thoughts and movements, and our control over them. This feeling can be altered or lost in neuropsychiatric disorders as well as in phenomena such as "automatic writing" whereby writing is attributed to an external source. Here, we employed suggestion in highly hypnotically suggestible participants to model various experiences of automatic writing during a sentence completion task. Results showed that the induction of hypnosis, without additional suggestion, was associated with a small but significant reduction of control, ownership, and awareness for writing. Targeted suggestions produced a double dissociation between thought and movement components of writing, for both feelings of control and ownership, and additionally, reduced awareness of writing. Overall, suggestion produced selective alterations in the control, ownership, and awareness of thought and motor components of writing, thus enabling key aspects of automatic writing, observed across different clinical and cultural settings, to be modelled. Copyright © 2014. Published by Elsevier Inc.

  3. PALM: a paralleled and integrated framework for phylogenetic inference with automatic likelihood model selectors.

    Directory of Open Access Journals (Sweden)

    Shu-Hwa Chen

    Full Text Available BACKGROUND: Selecting an appropriate substitution model and deriving a tree topology for a given sequence set are essential in phylogenetic analysis. However, such time-consuming, computationally intensive tasks rely on knowledge of substitution model theories and related expertise to run through all possible combinations of several separate programs. To ensure a thorough and efficient analysis and avert tedious manipulations of various programs, this work presents an intuitive framework, the phylogenetic reconstruction with automatic likelihood model selectors (PALM), with convincing, updated algorithms and a best-fit model selection mechanism for seamless phylogenetic analysis. METHODOLOGY: As an integrated framework of ClustalW, PhyML, MODELTEST, ProtTest, and several in-house programs, PALM evaluates the fitness of 56 substitution models for nucleotide sequences and 112 substitution models for protein sequences with scores in various criteria. The input for PALM can be either sequences in FASTA format or a sequence alignment file in PHYLIP format. To accelerate the computing of maximum likelihood and bootstrapping, this work integrates MPICH2/PhyML, PalmMonitor and Palm job controller across several machines with multiple processors and adopts the task parallelism approach. Moreover, an intuitive and interactive web component, PalmTree, is developed for displaying and operating the output tree with options of tree rooting, branches swapping, viewing the branch length values, and viewing bootstrapping score, as well as removing nodes to restart analysis iteratively. SIGNIFICANCE: The workflow of PALM is straightforward and coherent. Via a succinct, user-friendly interface, researchers unfamiliar with phylogenetic analysis can easily use this server to submit sequences, retrieve the output, and re-submit a job based on a previous result if some sequences are to be deleted or added for phylogenetic reconstruction. PALM results in an inference of

  4. Content-aware automatic cropping for consumer photos

    Science.gov (United States)

    Tang, Hao; Tretter, Daniel; Lin, Qian

    2013-03-01

    Consumer photos are typically authored once, but need to be retargeted for reuse in various situations. These include printing a photo on different size paper, changing the size and aspect ratio of an embedded photo to accommodate the dynamic content layout of web pages or documents, adapting a large photo for browsing on small displays such as mobile phone screens, and improving the aesthetic quality of a photo that was badly composed at the capture time. In this paper, we propose a novel, effective, and comprehensive content-aware automatic cropping (hereafter referred to as "autocrop") method for consumer photos to achieve the above purposes. Our autocrop method combines the state-of-the-art context-aware saliency detection algorithm, which aims to infer the likely intent of the photographer, and the "branch-and-bound" efficient subwindow search optimization technique, which seeks to locate the globally optimal cropping rectangle in a fast manner. Unlike most current autocrop methods, which can only crop a photo into an arbitrary rectangle, our autocrop method can automatically crop a photo into either a rectangle of arbitrary dimensions or a rectangle of the desired aspect ratio specified by the user. The aggressiveness of the cropping operation may be either automatically determined by the method or manually indicated by the user with ease. In addition, our autocrop method is extended to support the cropping of a photo into non-rectangular shapes such as polygons of any number of sides. It may also be potentially extended to return multiple cropping suggestions, which will enable the creation of new photos to enrich the original photo collections. Our experimental results show that the proposed autocrop method in this paper can generate high-quality crops for consumer photos of various types.
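    The optimization step can be sketched independently of the saliency model. Below, a synthetic saliency map stands in for the context-aware detector, and an exhaustive integral-image scan over fixed-aspect rectangles stands in for the branch-and-bound subwindow search; both return the same maximizer, branch-and-bound just gets there faster.

        import numpy as np

        rng = np.random.default_rng(5)
        sal = rng.random((120, 160))                   # stand-in saliency map
        sal[40:80, 60:120] += 2.0                      # a salient subject region

        ii = sal.cumsum(axis=0).cumsum(axis=1)         # integral image
        def rect_sum(t, l, b, r):                      # saliency inside sal[t:b, l:r]
            s = ii[b - 1, r - 1]
            if t: s -= ii[t - 1, r - 1]
            if l: s -= ii[b - 1, l - 1]
            if t and l: s += ii[t - 1, l - 1]
            return s

        h, w = 60, 80                                  # crop with the desired aspect
        best = max((rect_sum(t, l, t + h, l + w), t, l)
                   for t in range(sal.shape[0] - h + 1)
                   for l in range(sal.shape[1] - w + 1))
        print("best crop top-left:", best[1:], "saliency captured:", round(best[0], 1))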

  5. AD Model Builder: using automatic differentiation for statistical inference of highly parameterized complex nonlinear models

    DEFF Research Database (Denmark)

    Fournier, David A.; Skaug, Hans J.; Ancheta, Johnoel

    2011-01-01

    Many criteria for statistical parameter estimation, such as maximum likelihood, are formulated as a nonlinear optimization problem. Automatic Differentiation Model Builder (ADMB) is a programming framework based on automatic differentiation, aimed at highly nonlinear models with a large number... of such a feature is the generic implementation of Laplace approximation of high-dimensional integrals for use in latent variable models. We also review the literature in which ADMB has been used, and discuss future development of ADMB as an open source project. Overall, the main advantages of ADMB are flexibility...
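    ADMB itself is a C++ framework using reverse-mode differentiation, but the underlying idea, derivatives propagated exactly alongside values rather than approximated by finite differences, can be illustrated with forward-mode dual numbers in a few lines; this toy is only meant to show that idea, not ADMB's machinery.

        class Dual:
            """A value together with its derivative, propagated exactly."""
            def __init__(self, val, der=0.0):
                self.val, self.der = val, der
            def _wrap(self, o):
                return o if isinstance(o, Dual) else Dual(o)
            def __add__(self, o):
                o = self._wrap(o)
                return Dual(self.val + o.val, self.der + o.der)
            __radd__ = __add__
            def __mul__(self, o):
                o = self._wrap(o)
                return Dual(self.val * o.val,
                            self.der * o.val + self.val * o.der)  # product rule
            __rmul__ = __mul__

        def f(x):                   # a stand-in nonlinear objective
            return x * x * x + 2 * x

        y = f(Dual(1.5, 1.0))       # seed with dx/dx = 1
        print("f(1.5) =", y.val, " f'(1.5) =", y.der, " (exact: 3*1.5**2 + 2 = 8.75)")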

  6. 30 CFR 77.314 - Automatic temperature control instruments.

    Science.gov (United States)

    2010-07-01

    30 CFR, Mineral Resources, Underground Coal Mines, Thermal Dryers, § 77.314 Automatic temperature control instruments: (a) Automatic temperature control instruments for thermal dryer systems shall be of the recording type. (b) Automatic...

  7. Automatic segmentation of time-lapse microscopy images depicting a live Dharma embryo.

    Science.gov (United States)

    Zacharia, Eleni; Bondesson, Maria; Riu, Anne; Ducharme, Nicole A; Gustafsson, Jan-Åke; Kakadiaris, Ioannis A

    2011-01-01

    Biological inferences about the toxicity of chemicals reached during experiments on the zebrafish Dharma embryo can be greatly affected by the analysis of the time-lapse microscopy images depicting the embryo. Among the stages of image analysis, automatic and accurate segmentation of the Dharma embryo is the most crucial and challenging. In this paper, an accurate and automatic segmentation approach for the segmentation of the Dharma embryo data obtained by fluorescent time-lapse microscopy is proposed. Experiments performed in four stacks of 3D images over time have shown promising results.

  8. Impact of Sample Type and DNA Isolation Procedure on Genomic Inference of Microbiome Composition

    DEFF Research Database (Denmark)

    Knudsen, Berith Elkær; Bergmark, Lasse; Munk, Patrick

    2016-01-01

    Explorations of complex microbiomes using genomics greatly enhance our understanding about their diversity, biogeography, and function. The isolation of DNA from microbiome specimens is a key prerequisite for such examinations, but challenges remain in obtaining sufficient DNA quantities required for certain sequencing approaches, achieving accurate genomic inference of microbiome composition, and facilitating comparability of findings across specimen types and sequencing projects. These aspects are particularly relevant for the genomics-based global surveillance of infectious agents and antimicrobial... that in standard protocols. Based on this insight, we designed an improved DNA isolation procedure optimized for microbiome genomics that can be used for the three examined specimen types and potentially also for other biological specimens. A standard operating procedure is available from https://dx.doi.org/10......

  9. Nonparametric predictive inference for reliability of a k-out-of-m:G system with multiple component types

    International Nuclear Information System (INIS)

    Aboalkhair, Ahmad M.; Coolen, Frank P.A.; MacPhee, Iain M.

    2014-01-01

    Nonparametric predictive inference for system reliability has recently been presented, with specific focus on k-out-of-m:G systems. The reliability of systems is quantified by lower and upper probabilities of system functioning, given binary test results on components, taking uncertainty about component functioning and indeterminacy due to limited test information explicitly into account. Thus far, systems considered were series configurations of subsystems, with each subsystem i a k_i-out-of-m_i:G system which consisted of only one type of components. Key results are briefly summarized in this paper, and as an important generalization new results are presented for a single k-out-of-m:G system consisting of components of multiple types. The important aspects of redundancy and diversity for such systems are discussed. - Highlights: • New results on nonparametric predictive inference for system reliability. • Prediction of system reliability based on test data for components. • New insights on system redundancy optimization and diversity. • Components that appear inferior in tests may be included to enhance redundancy
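    For precisely known component reliabilities, the k-out-of-m:G probability is a finite sum; this is the quantity the paper brackets with lower and upper probabilities when only limited binary test data are available. A sketch with two component types, made-up numbers throughout:

        from math import comb

        def binom_pmf(m, p):
            return [comb(m, j) * p**j * (1 - p)**(m - j) for j in range(m + 1)]

        def k_out_of_m_two_types(k, m1, p1, m2, p2):
            # P(at least k of the m1 + m2 independent components function).
            pmf1, pmf2 = binom_pmf(m1, p1), binom_pmf(m2, p2)
            return sum(pmf1[a] * pmf2[b]
                       for a in range(m1 + 1) for b in range(m2 + 1) if a + b >= k)

        # 4-out-of-6 system mixing 3 reliable units with 3 cheaper, diverse units.
        print(round(k_out_of_m_two_types(4, 3, 0.95, 3, 0.80), 4))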

  10. Disentangling Complexity in Bayesian Automatic Adaptive Quadrature

    Science.gov (United States)

    Adam, Gheorghe; Adam, Sanda

    2018-02-01

    The paper describes a Bayesian automatic adaptive quadrature (BAAQ) solution for numerical integration which is simultaneously robust, reliable, and efficient. Detailed discussion is provided of three main factors which contribute to the enhancement of these features: (1) refinement of the m-panel automatic adaptive scheme through the use of integration-domain-length-scale-adapted quadrature sums; (2) fast early problem complexity assessment - enables the non-transitive choice among three execution paths: (i) immediate termination (exceptional cases); (ii) pessimistic - involves time and resource consuming Bayesian inference resulting in radical reformulation of the problem to be solved; (iii) optimistic - asks exclusively for subrange subdivision by bisection; (3) use of the weaker accuracy target from the two possible ones (the input accuracy specifications and the intrinsic integrand properties respectively) - results in maximum possible solution accuracy under minimum possible computing time.
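    The optimistic path (iii), subrange subdivision by bisection, is the classical adaptive-quadrature core. The sketch below shows it as plain recursive Simpson with the standard error estimate; the Bayesian complexity assessment that decides when this cheap path is trusted is not modeled here.

        import math

        def simpson(f, a, b):
            m = (a + b) / 2
            return (b - a) / 6 * (f(a) + 4 * f(m) + f(b))

        def adaptive(f, a, b, tol=1e-8):
            m = (a + b) / 2
            whole = simpson(f, a, b)
            halves = simpson(f, a, m) + simpson(f, m, b)
            if abs(whole - halves) < 15 * tol:     # standard Simpson error bound
                return halves + (halves - whole) / 15
            return adaptive(f, a, m, tol / 2) + adaptive(f, m, b, tol / 2)  # bisect

        print(adaptive(math.sin, 0.0, math.pi))    # exact value: 2.0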

  11. Shaping electromagnetic waves using software-automatically-designed metasurfaces.

    Science.gov (United States)

    Zhang, Qian; Wan, Xiang; Liu, Shuo; Yuan Yin, Jia; Zhang, Lei; Jun Cui, Tie

    2017-06-15

    We present a fully digital procedure of designing reflective coding metasurfaces to shape reflected electromagnetic waves. The design procedure is completely automatic, controlled by a personal computer. In details, the macro coding units of metasurface are automatically divided into several types (e.g. two types for 1-bit coding, four types for 2-bit coding, etc.), and each type of the macro coding units is formed by discretely random arrangement of micro coding units. By combining an optimization algorithm and commercial electromagnetic software, the digital patterns of the macro coding units are optimized to possess constant phase difference for the reflected waves. The apertures of the designed reflective metasurfaces are formed by arranging the macro coding units with certain coding sequence. To experimentally verify the performance, a coding metasurface is fabricated by automatically designing two digital 1-bit unit cells, which are arranged in array to constitute a periodic coding metasurface to generate the required four-beam radiations with specific directions. Two complicated functional metasurfaces with circularly- and elliptically-shaped radiation beams are realized by automatically designing 4-bit macro coding units, showing excellent performance of the automatic designs by software. The proposed method provides a smart tool to realize various functional devices and systems automatically.

  12. TYPE Ia SUPERNOVA LIGHT-CURVE INFERENCE: HIERARCHICAL BAYESIAN ANALYSIS IN THE NEAR-INFRARED

    International Nuclear Information System (INIS)

    Mandel, Kaisey S.; Friedman, Andrew S.; Kirshner, Robert P.; Wood-Vasey, W. Michael

    2009-01-01

    We present a comprehensive statistical analysis of the properties of Type Ia supernova (SN Ia) light curves in the near-infrared using recent data from Peters Automated InfraRed Imaging TELescope and the literature. We construct a hierarchical Bayesian framework, incorporating several uncertainties including photometric error, peculiar velocities, dust extinction, and intrinsic variations, for principled and coherent statistical inference. SN Ia light-curve inferences are drawn from the global posterior probability of parameters describing both individual supernovae and the population conditioned on the entire SN Ia NIR data set. The logical structure of the hierarchical model is represented by a directed acyclic graph. Fully Bayesian analysis of the model and data is enabled by an efficient Markov Chain Monte Carlo algorithm exploiting the conditional probabilistic structure using Gibbs sampling. We apply this framework to the JHK_s SN Ia light-curve data. A new light-curve model captures the observed J-band light-curve shape variations. The marginal intrinsic variances in peak absolute magnitudes are σ(M_J) = 0.17 ± 0.03, σ(M_H) = 0.11 ± 0.03, and σ(M_Ks) = 0.19 ± 0.04. We describe the first quantitative evidence for correlations between the NIR absolute magnitudes and J-band light-curve shapes, and demonstrate their utility for distance estimation. The average residual in the Hubble diagram for the training set SNe at cz > 2000 km s^-1 is 0.10 mag. The new application of bootstrap cross-validation to SN Ia light-curve inference tests the sensitivity of the statistical model fit to the finite sample and estimates the prediction error at 0.15 mag. These results demonstrate that SN Ia NIR light curves are as effective as corrected optical light curves, and, because they are less vulnerable to dust absorption, they have great potential as precise and accurate cosmological distance indicators.

  13. DeepInfer: open-source deep learning deployment toolkit for image-guided therapy

    Science.gov (United States)

    Mehrtash, Alireza; Pesteie, Mehran; Hetherington, Jorden; Behringer, Peter A.; Kapur, Tina; Wells, William M.; Rohling, Robert; Fedorov, Andriy; Abolmaesumi, Purang

    2017-03-01

    Deep learning models have outperformed some of the previous state-of-the-art approaches in medical image analysis. Instead of using hand-engineered features, deep models attempt to automatically extract hierarchical representations at multiple levels of abstraction from the data. Therefore, deep models are usually considered to be more flexible and robust solutions for image analysis problems compared to conventional computer vision models. They have demonstrated significant improvements in computer-aided diagnosis and automatic medical image analysis applied to such tasks as image segmentation, classification and registration. However, deploying deep learning models often has a steep learning curve and requires detailed knowledge of various software packages. Thus, many deep models have not been integrated into the clinical research workflows, causing a gap between the state-of-the-art machine learning in medical applications and evaluation in clinical research procedures. In this paper, we propose "DeepInfer" - an open-source toolkit for developing and deploying deep learning models within the 3D Slicer medical image analysis platform. Utilizing a repository of task-specific models, DeepInfer allows clinical researchers and biomedical engineers to deploy a trained model selected from the public registry, and apply it to new data without the need for software development or configuration. As two practical use cases, we demonstrate the application of DeepInfer in prostate segmentation for targeted MRI-guided biopsy and identification of the target plane in 3D ultrasound for spinal injections.

  14. Automatic welding processes for reactor coolant pipes used in PWR type nuclear power plant

    International Nuclear Information System (INIS)

    Hamada, T.; Nakamura, A.; Nagura, Y.; Sakamoto, N.

    1979-01-01

    The authors developed automatic welding processes (submerged arc welding process and TIG welding process) for application to the welding of reactor coolant pipes which constitute the most important part of the PWR type nuclear power plant. Submerged arc welding process is suitable for flat position welding in which pipes can be rotated, while TIG welding process is suitable for all position welding. This paper gives an outline of the two processes and the results of tests performed using these processes. (author)

  15. An analysis pipeline for the inference of protein-protein interaction networks

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, Ronald C.; Singhal, Mudita; Daly, Don S.; Gilmore, Jason M.; Cannon, William R.; Domico, Kelly O.; White, Amanda M.; Auberry, Deanna L.; Auberry, Kenneth J.; Hooker, Brian S.; Hurst, G. B.; McDermott, Jason E.; McDonald, W. H.; Pelletier, Dale A.; Schmoyer, Denise A.; Wiley, H. S.

    2009-12-01

    An analysis pipeline has been created for deployment of a novel algorithm, the Bayesian Estimator of Protein-Protein Association Probabilities (BEPro), for use in the reconstruction of protein-protein interaction networks. We have combined the Software Environment for BIological Network Inference (SEBINI), an interactive environment for the deployment and testing of network inference algorithms that use high-throughput data, and the Collective Analysis of Biological Interaction Networks (CABIN), software that allows integration and analysis of protein-protein interaction and gene-to-gene regulatory evidence obtained from multiple sources, to allow interactions computed by BEPro to be stored, visualized, and further analyzed. Incorporating BEPro into SEBINI and automatically feeding the resulting inferred network into CABIN, we have created a structured workflow for protein-protein network inference and supplemental analysis from sets of mass spectrometry bait-prey experiment data. SEBINI demo site: https://www.emsl.pnl.gov/SEBINI/ Contact: ronald.taylor@pnl.gov. BEPro is available at http://www.pnl.gov/statistics/BEPro3/index.htm. Contact: ds.daly@pnl.gov. CABIN is available at http://www.sysbio.org/dataresources/cabin.stm. Contact: mudita.singhal@pnl.gov.

  16. POPPER, a simple programming language for probabilistic semantic inference in medicine.

    Science.gov (United States)

    Robson, Barry

    2015-01-01

    Our previous reports described the use of the Hyperbolic Dirac Net (HDN) as a method for probabilistic inference from medical data, and a proposed probabilistic medical Semantic Web (SW) language Q-UEL to provide that data. Rather like a traditional Bayes Net, that HDN provided estimates of joint and conditional probabilities, and was static, with no need for evolution due to "reasoning". Use of the SW will require, however, (a) at least the semantic triple with more elaborate relations than conditional ones, as seen in use of most verbs and prepositions, and (b) rules for logical, grammatical, and definitional manipulation that can generate changes in the inference net. Here is described the simple POPPER language for medical inference. It can be automatically written by Q-UEL, or by hand. Based on studies with our medical students, it is believed that a tool like this may help in medical education and that a physician unfamiliar with SW science can understand it. It is here used to explore the considerable challenges of assigning probabilities, and not least what the meaning and utility of inference net evolution would be for a physician. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Evaluation of stationary and non-stationary geostatistical models for inferring hydraulic conductivity values at Aespoe

    International Nuclear Information System (INIS)

    La Pointe, P.R.

    1994-11-01

    This report describes the comparison of stationary and non-stationary geostatistical models for the purpose of inferring block-scale hydraulic conductivity values from packer tests at Aespoe. The comparison between models is made through the evaluation of cross-validation statistics for three experimental designs. The first experiment consisted of a 'Delete-1' test previously used at Finnsjoen. The second test consisted of 'Delete-10%' and the third test was a 'Delete-50%' test. Preliminary data analysis showed that the 3 m and 30 m packer test data can be treated as a sample from a single population for the purposes of geostatistical analyses. Analysis of the 3 m data does not indicate that there are any systematic statistical changes with depth, rock type, fracture zone vs non-fracture zone or other mappable factor. Directional variograms are ambiguous to interpret due to the clustered nature of the data, but do not show any obvious anisotropy that should be accounted for in geostatistical analysis. Stationary analysis suggested that there exists a sizeable spatially uncorrelated component ('Nugget Effect') in the 3 m data, on the order of 60% of the observed variance for the various models fitted. Four different nested models were automatically fit to the data. Results for all models in terms of cross-validation statistics were very similar for the first set of validation tests. Non-stationary analysis established that both the order of drift and the order of the intrinsic random functions is low. This study also suggests that conventional cross-validation studies and automatic variogram fitting are not necessarily evaluating how well a model will infer block scale hydraulic conductivity values. 20 refs, 20 figs, 14 tabs

  18. Fused Regression for Multi-source Gene Regulatory Network Inference.

    Directory of Open Access Journals (Sweden)

    Kari Y Lam

    2016-12-01

    Full Text Available Understanding gene regulatory networks is critical to understanding cellular differentiation and response to external stimuli. Methods for global network inference have been developed and applied to a variety of species. Most approaches consider the problem of network inference independently in each species, despite evidence that gene regulation can be conserved even in distantly related species. Further, network inference is often confined to single data-types (single platforms and single cell types). We introduce a method for multi-source network inference that allows simultaneous estimation of gene regulatory networks in multiple species or biological processes through the introduction of priors based on known gene relationships, such as orthology, incorporated using fused regression. This approach improves network inference performance even when orthology mapping and conservation are incomplete. We refine this method by presenting an algorithm that extracts the true conserved subnetwork from a larger set of potentially conserved interactions and demonstrate the utility of our method in cross-species network inference. Last, we demonstrate our method's utility in learning from data collected on different experimental platforms.
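    One simple way to realize the fusion idea, assuming a ridge-type penalty that ties together the coefficients of orthologous genes (the paper's actual estimator may differ in detail), is gradient descent on the joint least-squares objective; the data here are synthetic.

        import numpy as np

        rng = np.random.default_rng(6)
        X1, X2 = rng.normal(size=(80, 5)), rng.normal(size=(60, 5))
        b_true = np.array([1.0, -2.0, 0.0, 0.5, 3.0])
        y1 = X1 @ b_true + rng.normal(scale=0.1, size=80)
        y2 = X2 @ (b_true + 0.1) + rng.normal(scale=0.1, size=60)  # nearly conserved

        lam, b1, b2 = 5.0, np.zeros(5), np.zeros(5)
        for _ in range(5000):
            g1 = X1.T @ (X1 @ b1 - y1) / len(y1) + lam * (b1 - b2)  # fusion penalty
            g2 = X2.T @ (X2 @ b2 - y2) / len(y2) + lam * (b2 - b1)
            b1, b2 = b1 - 0.01 * g1, b2 - 0.01 * g2

        print("species 1:", np.round(b1, 2))
        print("species 2:", np.round(b2, 2))   # pulled toward each other by lam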

  19. Method for automatic control rod operation using rule-based control

    International Nuclear Information System (INIS)

    Kinoshita, Mitsuo; Yamada, Naoyuki; Kiguchi, Takashi

    1988-01-01

    An automatic control rod operation method using rule-based control is proposed. Its features are as follows: (1) a production system to recognize plant events, determine control actions and realize fast inference (fast selection of a suitable production rule); (2) use of the fuzzy control technique to determine quantitative control variables. The method's performance was evaluated by simulation tests on automatic control rod operation at a BWR plant start-up. The results were as follows: (1) the performance related to stabilization of controlled variables and the time required for reactor start-up was superior to that of other methods such as PID control and program control methods; (2) the process time to select and interpret the suitable production rule, which was the same as that required for event recognition or determination of control action, was short enough (below 1 s) for real-time control. The results showed that the method is effective for automatic control rod operation. (author)
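    Feature (2) can be sketched in isolation: triangular memberships over a dimensionless period error, two rules, and centroid defuzzification yielding a quantitative rod-speed command. Membership shapes, rule base, and units below are invented for illustration, not the plant's actual design.

        import numpy as np

        def tri(x, a, b, c):                    # triangular membership function
            return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

        def rod_speed(error):                   # error in reactor period, a.u.
            u = np.linspace(-1.0, 1.0, 201)     # candidate rod-speed commands
            w_ins = tri(error, -2.0, -1.0, 0.0)     # rule 1: error negative -> insert
            w_wdr = tri(error,  0.0,  1.0, 2.0)     # rule 2: error positive -> withdraw
            agg = np.maximum(w_ins * tri(u, -1.0, -0.5, 0.0),
                             w_wdr * tri(u,  0.0,  0.5, 1.0))
            return float((u * agg).sum() / agg.sum()) if agg.sum() else 0.0

        for e in (-1.5, -0.3, 0.0, 0.8):
            print(f"period error {e:+.1f} -> rod speed {rod_speed(e):+.2f}")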

  20. 46 CFR 63.25-1 - Small automatic auxiliary boilers.

    Science.gov (United States)

    2010-10-01

    46 CFR, Shipping, Marine Engineering, Auxiliary Boilers, Requirements for Specific Types of Automatic Auxiliary Boilers, § 63.25-1 Small automatic auxiliary boilers: Small automatic auxiliary boilers, defined as having heat-input ratings of 400,000 Btu/hr...

  1. AUTOMATIC RECOGNITION OF CORONAL TYPE II RADIO BURSTS: THE AUTOMATED RADIO BURST IDENTIFICATION SYSTEM METHOD AND FIRST OBSERVATIONS

    International Nuclear Information System (INIS)

    Lobzin, Vasili V.; Cairns, Iver H.; Robinson, Peter A.; Steward, Graham; Patterson, Garth

    2010-01-01

    Major space weather events such as solar flares and coronal mass ejections are usually accompanied by solar radio bursts, which can potentially be used for real-time space weather forecasts. Type II radio bursts are produced near the local plasma frequency and its harmonic by fast electrons accelerated by a shock wave moving through the corona and solar wind with a typical speed of ∼1000 km s^-1. The coronal bursts have dynamic spectra with frequency gradually falling with time and durations of several minutes. This Letter presents a new method developed to detect type II coronal radio bursts automatically and describes its implementation in an extended Automated Radio Burst Identification System (ARBIS 2). Preliminary tests of the method with spectra obtained in 2002 show that the performance of the current implementation is quite high, ∼80%, while the probability of false positives is reasonably low, with one false positive per 100-200 hr for high solar activity and less than one false event per 10000 hr for low solar activity periods. The first automatically detected coronal type II radio burst is also presented.

  2. Children's and adults' judgments of the certainty of deductive inferences, inductive inferences, and guesses.

    Science.gov (United States)

    Pillow, Bradford H; Pearson, Raeanne M; Hecht, Mary; Bremer, Amanda

    2010-01-01

    Children and adults rated their own certainty following inductive inferences, deductive inferences, and guesses. Beginning in kindergarten, participants rated deductions as more certain than weak inductions or guesses. Deductions were rated as more certain than strong inductions beginning in Grade 3, and fourth-grade children and adults differentiated strong inductions, weak inductions, and informed guesses from pure guesses. By Grade 3, participants also gave different types of explanations for their deductions and inductions. These results are discussed in relation to children's concepts of cognitive processes, logical reasoning, and epistemological development.

  3. Landslide Fissure Inference Assessment by ANFIS and Logistic Regression Using UAS-Based Photogrammetry

    Directory of Open Access Journals (Sweden)

    Ozgun Akcay

    2015-10-01

    Full Text Available Unmanned Aerial Systems (UAS) are now capable of gathering high-resolution data; therefore, landslides can be explored in detail at larger scales. In this research, 132 aerial photographs were captured, and 85,456 features were detected and matched automatically using UAS photogrammetry. The root mean square (RMS) values of the image coordinates of the Ground Control Points (GCPs) varied from 0.521 to 2.293 pixels, whereas the maximum RMS value of automatically matched features was calculated as 2.921 pixels. Using the 3D point cloud acquired by aerial photogrammetry, the raster datasets of aspect, slope, and maximally stable extremal regions (MSER, detecting visual uniformity) were defined as three variables in order to reason about fissure structures on the landslide surface. An Adaptive Neuro Fuzzy Inference System (ANFIS) and a Logistic Regression (LR) were implemented using training datasets to infer fissure data appropriately. The accuracy of the predictive models was evaluated by drawing receiver operating characteristic (ROC) curves and by calculating the area under the ROC curve (AUC). The experiments showed that high-resolution imagery is an indispensable data source for modeling and validating landslide fissures appropriately.
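    For the logistic-regression branch, the fit-and-score loop is a few lines of scikit-learn; the three features below are random stand-ins for the aspect, slope, and MSER rasters, and the fissure labels are synthetic.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(7)
        X = rng.normal(size=(2000, 3))           # aspect, slope, MSER stand-ins
        y = (X @ np.array([0.2, 1.5, 2.0]) + rng.normal(size=2000) > 0).astype(int)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        clf = LogisticRegression().fit(X_tr, y_tr)
        auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
        print("fissure-vs-no-fissure AUC:", round(auc, 3))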

  4. Refining Automatically Extracted Knowledge Bases Using Crowdsourcing.

    Science.gov (United States)

    Li, Chunhua; Zhao, Pengpeng; Sheng, Victor S; Xian, Xuefeng; Wu, Jian; Cui, Zhiming

    2017-01-01

    Machine-constructed knowledge bases often contain noisy and inaccurate facts. There exists significant work in developing automated algorithms for knowledge base refinement. Automated approaches improve the quality of knowledge bases but are far from perfect. In this paper, we leverage crowdsourcing to improve the quality of automatically extracted knowledge bases. As human labelling is costly, an important research challenge is how we can use limited human resources to maximize the quality improvement for a knowledge base. To address this problem, we first introduce a concept of semantic constraints that can be used to detect potential errors and do inference among candidate facts. Then, based on semantic constraints, we propose rank-based and graph-based algorithms for crowdsourced knowledge refining, which judiciously select the most beneficial candidate facts to conduct crowdsourcing and prune unnecessary questions. Our experiments show that our method improves the quality of knowledge bases significantly and outperforms state-of-the-art automatic methods under a reasonable crowdsourcing cost.

  5. Refining Automatically Extracted Knowledge Bases Using Crowdsourcing

    Directory of Open Access Journals (Sweden)

    Chunhua Li

    2017-01-01

    Full Text Available Machine-constructed knowledge bases often contain noisy and inaccurate facts. There exists significant work in developing automated algorithms for knowledge base refinement. Automated approaches improve the quality of knowledge bases but are far from perfect. In this paper, we leverage crowdsourcing to improve the quality of automatically extracted knowledge bases. As human labelling is costly, an important research challenge is how we can use limited human resources to maximize the quality improvement for a knowledge base. To address this problem, we first introduce a concept of semantic constraints that can be used to detect potential errors and do inference among candidate facts. Then, based on semantic constraints, we propose rank-based and graph-based algorithms for crowdsourced knowledge refining, which judiciously select the most beneficial candidate facts to conduct crowdsourcing and prune unnecessary questions. Our experiments show that our method improves the quality of knowledge bases significantly and outperforms state-of-the-art automatic methods under a reasonable crowdsourcing cost.

  6. Automatic Facial Expression Recognition and Operator Functional State

    Science.gov (United States)

    Blanson, Nina

    2012-01-01

    The prevalence of human error in safety-critical occupations remains a major challenge to mission success despite increasing automation in control processes. Although various methods have been proposed to prevent incidences of human error, none of these have been developed to employ the detection and regulation of Operator Functional State (OFS), or the optimal condition of the operator while performing a task, in work environments due to drawbacks such as obtrusiveness and impracticality. A video-based system with the ability to infer an individual's emotional state from facial feature patterning mitigates some of the problems associated with other methods of detecting OFS, like obtrusiveness and impracticality in integration with the mission environment. This paper explores the utility of facial expression recognition as a technology for inferring OFS by first expounding on the intricacies of OFS and the scientific background behind emotion and its relationship with an individual's state. Then, descriptions of the feedback loop and the emotion protocols proposed for the facial recognition program are explained. A basic version of the facial expression recognition program uses Haar classifiers and OpenCV libraries to automatically locate key facial landmarks during a live video stream. Various methods of creating facial expression recognition software are reviewed to guide future extensions of the program. The paper concludes with an examination of the steps necessary in the research of emotion and recommendations for the creation of an automatic facial expression recognition program for use in real-time, safety-critical missions.

  7. Automatic Facial Expression Recognition and Operator Functional State

    Science.gov (United States)

    Blanson, Nina

    2011-01-01

    The prevalence of human error in safety-critical occupations remains a major challenge to mission success despite increasing automation in control processes. Although various methods have been proposed to prevent incidences of human error, none of these have been developed to employ the detection and regulation of Operator Functional State (OFS), or the optimal condition of the operator while performing a task, in work environments due to drawbacks such as obtrusiveness and impracticality. A video-based system with the ability to infer an individual's emotional state from facial feature patterning mitigates some of the problems associated with other methods of detecting OFS, like obtrusiveness and impracticality in integration with the mission environment. This paper explores the utility of facial expression recognition as a technology for inferring OFS by first expounding on the intricacies of OFS and the scientific background behind emotion and its relationship with an individual's state. Then, descriptions of the feedback loop and the emotion protocols proposed for the facial recognition program are explained. A basic version of the facial expression recognition program uses Haar classifiers and OpenCV libraries to automatically locate key facial landmarks during a live video stream. Various methods of creating facial expression recognition software are reviewed to guide future extensions of the program. The paper concludes with an examination of the steps necessary in the research of emotion and recommendations for the creation of an automatic facial expression recognition program for use in real-time, safety-critical missions.
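    A basic version of the detection stage described in both abstracts can be built with OpenCV's stock Haar cascade; the image path below is a placeholder, and the landmark and emotion layers that would actually infer OFS are not included.

        import cv2

        cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

        img = cv2.imread("operator.jpg")           # placeholder test image path
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        for (x, y, w, h) in faces:                 # box each detected face
            cv2.rectangle(img, (x, y), (x + w, y + h), (0, 255, 0), 2)
        print(f"{len(faces)} face(s) found")
        cv2.imwrite("operator_annotated.jpg", img)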

  8. Automatic Incubator-type Temperature Control System for Brain Hypothermia Treatment

    Science.gov (United States)

    Gaohua, Lu; Wakamatsu, Hidetoshi

    An automatic air-cooling incubator is proposed to replace the manual water-cooling blanket to control the brain tissue temperature for brain hypothermia treatment. Its feasibility is theoretically discussed as follows: First, an adult patient with the cooling incubator is modeled as a linear dynamical patient-incubator biothermal system. The patient is represented by an 18-compartment structure and described by its state equations. The air-cooling incubator provides almost the same cooling effect as the water-cooling blanket, if a light breeze of speed around 3 m/s is circulated in the incubator. Then, in order to control the brain temperature automatically, an adaptive-optimal control algorithm is adopted, while the patient-blanket therapeutic system is considered as a reference model. Finally, the brain temperature of the patient-incubator biothermal system is controlled to follow the given reference temperature course, in which the adaptive algorithm is confirmed to be useful for unknown environmental changes and/or metabolic rate changes of the patient in the incubating system. Thus, the present work ensures the development of the automatic air-cooling incubator for a better temperature regulation of the brain hypothermia treatment in ICU.

  9. Optimization of analytical parameters for inferring relationships among Escherichia coli isolates from repetitive-element PCR by maximizing correspondence with multilocus sequence typing data.

    Science.gov (United States)

    Goldberg, Tony L; Gillespie, Thomas R; Singer, Randall S

    2006-09-01

    Repetitive-element PCR (rep-PCR) is a method for genotyping bacteria based on the selective amplification of repetitive genetic elements dispersed throughout bacterial chromosomes. The method has great potential for large-scale epidemiological studies because of its speed and simplicity; however, objective guidelines for inferring relationships among bacterial isolates from rep-PCR data are lacking. We used multilocus sequence typing (MLST) as a "gold standard" to optimize the analytical parameters for inferring relationships among Escherichia coli isolates from rep-PCR data. We chose 12 isolates from a large database to represent a wide range of pairwise genetic distances, based on the initial evaluation of their rep-PCR fingerprints. We conducted MLST with these same isolates and systematically varied the analytical parameters to maximize the correspondence between the relationships inferred from rep-PCR and those inferred from MLST. Methods that compared the shapes of densitometric profiles ("curve-based" methods) yielded consistently higher correspondence values between data types than did methods that calculated indices of similarity based on shared and different bands (maximum correspondences of 84.5% and 80.3%, respectively). Curve-based methods were also markedly more robust in accommodating variations in user-specified analytical parameter values than were "band-sharing coefficient" methods, and they enhanced the reproducibility of rep-PCR. Phylogenetic analyses of rep-PCR data yielded trees with high topological correspondence to trees based on MLST and high statistical support for major clades. These results indicate that rep-PCR yields accurate information for inferring relationships among E. coli isolates and that accuracy can be enhanced with the use of analytical methods that consider the shapes of densitometric profiles.
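
    A minimal sketch of the "curve-based" comparison described above: whole densitometric profiles are compared by Pearson correlation and the resulting distances are clustered into a tree, as opposed to scoring shared and different bands. The random profiles, profile length, and UPGMA linkage are illustrative assumptions, not the study's actual data or software settings.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage

# profiles: one densitometric trace per isolate (rows), resampled to equal length.
rng = np.random.default_rng(0)
profiles = rng.random((12, 500))  # placeholder for 12 real rep-PCR lane profiles

# "Curve-based" comparison: Pearson correlation between whole profiles,
# converted to a distance for clustering (vs. band-sharing coefficients).
corr = np.corrcoef(profiles)
dist = 1.0 - corr[np.triu_indices(len(profiles), k=1)]  # condensed distance vector
tree = linkage(dist, method="average")  # UPGMA tree, comparable to an MLST tree
```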

  10. Visualization of simulated urban spaces: inferring parameterized generation of streets, parcels, and aerial imagery.

    Science.gov (United States)

    Vanegas, Carlos A; Aliaga, Daniel G; Benes, Bedrich; Waddell, Paul

    2009-01-01

    Urban simulation models and their visualization are used to help regional planning agencies evaluate alternative transportation investments, land use regulations, and environmental protection policies. Typical urban simulations provide spatially distributed data about number of inhabitants, land prices, traffic, and other variables. In this article, we build on a synergy of urban simulation, urban visualization, and computer graphics to automatically infer an urban layout for any time step of the simulation sequence. In addition to standard visualization tools, our method gathers data of the original street network, parcels, and aerial imagery and uses the available simulation results to infer changes to the original urban layout and produce a new and plausible layout for the simulation results. In contrast with previous work, our approach automatically updates the layout based on changes in the simulation data and thus can scale to a large simulation over many years. The method in this article offers a substantial step forward in building integrated visualization and behavioral simulation systems for use in community visioning, planning, and policy analysis. We demonstrate our method on several real cases using a 200 GB database for a 16,300 km2 area surrounding Seattle.

  11. I Hear You Eat and Speak: Automatic Recognition of Eating Condition and Food Type, Use-Cases, and Impact on ASR Performance.

    Science.gov (United States)

    Hantke, Simone; Weninger, Felix; Kurle, Richard; Ringeval, Fabien; Batliner, Anton; Mousa, Amr El-Desoky; Schuller, Björn

    2016-01-01

    We propose a new recognition task in the area of computational paralinguistics: automatic recognition of eating conditions in speech, i. e., whether people are eating while speaking, and what they are eating. To this end, we introduce the audio-visual iHEARu-EAT database featuring 1.6 k utterances of 30 subjects (mean age: 26.1 years, standard deviation: 2.66 years, gender balanced, German speakers), six types of food (Apple, Nectarine, Banana, Haribo Smurfs, Biscuit, and Crisps), and read as well as spontaneous speech, which is made publicly available for research purposes. We start with demonstrating that for automatic speech recognition (ASR), it pays off to know whether speakers are eating or not. We also propose automatic classification both by brute-forcing of low-level acoustic features as well as higher-level features related to intelligibility, obtained from an Automatic Speech Recogniser. Prediction of the eating condition was performed with a Support Vector Machine (SVM) classifier employed in a leave-one-speaker-out evaluation framework. Results show that the binary prediction of eating condition (i. e., eating or not eating) can be easily solved independently of the speaking condition; the obtained average recalls are all above 90%. Low-level acoustic features provide the best performance on spontaneous speech, which reaches up to 62.3% average recall for multi-way classification of the eating condition, i. e., discriminating the six types of food, as well as not eating. The early fusion of features related to intelligibility with the brute-forced acoustic feature set improves the performance on read speech, reaching a 66.4% average recall for the multi-way classification task. Analysing features and classifier errors leads to a suitable ordinal scale for eating conditions, on which automatic regression can be performed with up to 56.2% determination coefficient.
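
    The classification setup (an SVM in a leave-one-speaker-out framework, scored by average recall) maps directly onto scikit-learn primitives, sketched below. The synthetic features and labels, the feature dimensionality, the kernel, and the C value are placeholders, not the paper's configuration.

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import recall_score

# X: acoustic feature vectors per utterance, y: 7-way eating-condition labels,
# speaker: subject id per utterance; all placeholders for the iHEARu-EAT data.
rng = np.random.default_rng(1)
X, y = rng.random((300, 40)), rng.integers(0, 7, 300)
speaker = rng.integers(0, 30, 300)

recalls = []
for train, test in LeaveOneGroupOut().split(X, y, groups=speaker):
    clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
    clf.fit(X[train], y[train])
    # unweighted average recall (UAR), the measure reported in the paper
    recalls.append(recall_score(y[test], clf.predict(X[test]),
                                average="macro", zero_division=0))
print(f"mean UAR over speakers: {np.mean(recalls):.3f}")
```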

  12. Automatic Traffic-Based Internet Control Message Protocol (ICMP) Model Generation for ns-3

    Science.gov (United States)

    2015-12-01

    ARL-TR-7543, US Army Research Laboratory, December 2015. Automatic Traffic-Based Internet Control Message Protocol (ICMP) Model Generation for ns-3, by Jaime C Acosta and Felipe Jovel (Survivability/Lethality Analysis Directorate, ARL), with Felipe Sotelo and Caesar … Planned extensions noted in the report include supporting more protocols (especially at different layers of the OSI model) and implementing an inference engine to extract inter- and intra-packet dependencies.

  13. Automatic vibration monitoring system for the diagnostic inspection of the WWER-440 type nuclear power plants

    International Nuclear Information System (INIS)

    Hollo, E.; Siklossy, P.; Toth, Zs.

    1982-01-01

    In the Hungarian Research Institute for Electric Power Industry (VEIKI) an automatic vibration monitoring system for diagnostics and inspection of nuclear power plants of type WWER-440 was developed. The paper summarizes the results of this work and investigates the use of mechanical vibrations and oscillations induced by flow for fault diagnosis. The design of the hardware system, the present software possibilities, the laboratory experiments and the guidelines for future software developments are also described in detail. (A.L.)

  14. Automatic exchange unit for control rod drive device

    International Nuclear Information System (INIS)

    Nasu, Seiji; Sasaki, Masayoshi.

    1982-01-01

    Purpose: To enable automatic restart and continued operation, without an external power-interruption remedy device, when the interrupted power source is recovered during an automatic positioning operation. Constitution: In an automatic exchange unit for a control rod drive device, of the control type that drives the deviation between the positioning target position and the present position of the device to zero, the position data of the drive device and the positioning target value of the device are automatically read, and an operation-inhibit interlock is applied to the control system until the data reading is completed and the automatic operation start or restart conditions are sequentially confirmed. After the confirmation, the interlock is released to start or restart the automatic operation. Accordingly, the automatic operation can be safely restarted and continued. (Yoshihara, H.)

  15. Examining the influence of psychopathy, hostility biases, and automatic processing on criminal offenders' Theory of Mind

    NARCIS (Netherlands)

    Nentjes, L.; Bernstein, D.; Arntz, A.; van Breukelen, G.; Slaats, M.

    2015-01-01

    Theory of Mind (ToM) is a social perceptual skill that refers to the ability to take someone else's perspective and infer what others think. The current study examined the effect of potential hostility biases, as well as controlled (slow) versus automatic (fast) processing, on ToM performance in criminal offenders.

  16. Packaging design as communicator of product attributes: Effects on consumers’ attribute inferences

    NARCIS (Netherlands)

    van Ooijen, I.

    2016-01-01

    This dissertation will focus on two types of attribute inferences that result from packaging design cues. First, the effects of product packaging design on quality-related inferences are investigated. Second, the effects of product packaging design on healthiness-related inferences are examined.

  17. Automatic recognition of offensive team formation in american football plays

    KAUST Repository

    Atmosukarto, Indriyati

    2013-06-01

    Compared to security surveillance and military applications, where automated action analysis is prevalent, the sports domain is extremely under-served. Most existing software packages for sports video analysis require manual annotation of important events in the video. American football is the most popular sport in the United States, however most game analysis is still done manually. Line of scrimmage and offensive team formation recognition are two statistics that must be tagged by American Football coaches when watching and evaluating past play video clips, a process which takes many man hours per week. These two statistics are also the building blocks for more high-level analysis such as play strategy inference and automatic statistic generation. In this paper, we propose a novel framework where given an American football play clip, we automatically identify the video frame in which the offensive team lines in formation (formation frame), the line of scrimmage for that play, and the type of player formation the offensive team takes on. The proposed framework achieves 95% accuracy in detecting the formation frame, 98% accuracy in detecting the line of scrimmage, and up to 67% accuracy in classifying the offensive team's formation. To validate our framework, we compiled a large dataset comprising more than 800 play-clips of standard and high definition resolution from real-world football games. This dataset will be made publicly available for future comparison. © 2013 IEEE.

  18. Kernel learning at the first level of inference.

    Science.gov (United States)

    Cawley, Gavin C; Talbot, Nicola L C

    2014-05-01

    Kernel learning methods, whether Bayesian or frequentist, typically involve multiple levels of inference, with the coefficients of the kernel expansion being determined at the first level and the kernel and regularisation parameters carefully tuned at the second level, a process known as model selection. Model selection for kernel machines is commonly performed via optimisation of a suitable model selection criterion, often based on cross-validation or theoretical performance bounds. However, if there are a large number of kernel parameters, as for instance in the case of automatic relevance determination (ARD), there is a substantial risk of over-fitting the model selection criterion, resulting in poor generalisation performance. In this paper we investigate the possibility of learning the kernel, for the Least-Squares Support Vector Machine (LS-SVM) classifier, at the first level of inference, i.e. parameter optimisation. The kernel parameters and the coefficients of the kernel expansion are jointly optimised at the first level of inference, minimising a training criterion with an additional regularisation term acting on the kernel parameters. The key advantage of this approach is that the values of only two regularisation parameters need be determined in model selection, substantially alleviating the problem of over-fitting the model selection criterion. The benefits of this approach are demonstrated using a suite of synthetic and real-world binary classification benchmark problems, where kernel learning at the first level of inference is shown to be statistically superior to the conventional approach, improves on our previous work (Cawley and Talbot, 2007) and is competitive with Multiple Kernel Learning approaches, but with reduced computational expense. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. System for automatic checking of nuclear radiation detectors of scintillation type

    International Nuclear Information System (INIS)

    Gutierrez O, E.; Vilchis P, A.; Romero G, M.; Torres B, M.A.; Garcia H, J.M.

    2001-01-01

    This work describes an automatic system for checking scintillation-type nuclear radiation detectors. The system is used in the laboratory to verify the parameters that define the reliable operation of each detector, and it compares the obtained results with those provided by the manufacturer so that the operator can issue acceptance or rejection criteria. The checking system consists of a data acquisition card built around a digital signal processor (DSP), a programmable high-voltage source, and an insertion and conversion module. These components interact with a personal computer to provide the operator with the energy spectra, the nuclear pulse shape, and the figure of merit. The obtained results are shown in graphic form and/or as numerical values, and they can be stored in a data file and/or printed. To facilitate interaction between the computer and the user, the system software was written in a commercial graphical programming language (virtual instrumentation). (Author)

  20. Hybrid Optical Inference Machines

    Science.gov (United States)

    1991-09-27

    … a set of facts can be generated in the dyadic form "u, R 1,2". Eichmann and Caulfield [19] consider the same type of encoding schemes. These architectures are based primarily on optical inner products. [19] G. Eichmann and H. J. Caulfield, "Optical Learning (Inference) Machines".

  1. Training Inference Making Skills Using a Situation Model Approach Improves Reading Comprehension

    Directory of Open Access Journals (Sweden)

    Lisanne Bos

    2016-02-01

    Full Text Available This study aimed to enhance third and fourth graders’ text comprehension at the situation model level. Therefore, we tested a reading strategy training developed to target inference making skills, which are widely considered to be pivotal to situation model construction. The training was grounded in contemporary literature on situation model-based inference making and addressed the source (text-based versus knowledge-based), type (necessary versus unnecessary for (re-)establishing coherence), and depth of an inference (making single lexical inferences versus combining multiple lexical inferences), as well as the type of searching strategy (forward versus backward). Results indicated that, compared to a control group (n = 51), children who followed the experimental training (n = 67) improved their inference making skills supportive to situation model construction. Importantly, our training also resulted in increased levels of general reading comprehension and motivation. In sum, this study showed that a ‘level of text representation’-approach can provide a useful framework to teach inference making skills to third and fourth graders.

  2. Automatic extraction of corpus callosum from midsagittal head MR image and examination of Alzheimer-type dementia objective diagnostic system in feature analysis

    International Nuclear Information System (INIS)

    Kaneko, Tomoyuki; Kodama, Naoki; Kaeriyama, Tomoharu; Fukumoto, Ichiro

    2004-01-01

    We studied the objective diagnosis of Alzheimer-type dementia based on changes in the corpus callosum. We examined midsagittal head MR images of 40 Alzheimer-type dementia patients (15 men and 25 women; mean age, 75.4±5.5 years) and 31 healthy elderly persons (10 men and 21 women; mean age, 73.4±7.5 years), 71 subjects altogether. First, the corpus callosum was automatically extracted from the midsagittal head MR images. Next, the Alzheimer-type dementia patients were compared with the healthy elderly individuals using shape-factor features and six co-occurrence-matrix features of the corpus callosum. Automatic extraction of the corpus callosum succeeded in 64 of the 71 individuals, for an extraction rate of 90.1%. A statistically significant difference was found in 7 of the 9 features between the Alzheimer-type dementia patients and the healthy elderly adults. Discriminant analysis using the 7 features demonstrated a sensitivity of 82.4%, specificity of 89.3%, and overall accuracy of 85.5%. These results indicated the possibility of an objective diagnostic system for Alzheimer-type dementia using feature analysis based on changes in the corpus callosum. (author)
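
    Co-occurrence-matrix (GLCM) texture features of the kind used here are available off the shelf; the sketch below computes six of them with scikit-image. The random patch stands in for a segmented corpus callosum, and the six property names are scikit-image's, assumed to approximate (not reproduce) the paper's feature set.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops  # scikit-image >= 0.19 names

# Placeholder 8-bit patch standing in for the extracted corpus callosum region.
rng = np.random.default_rng(0)
roi = rng.integers(0, 256, (64, 64), dtype=np.uint8)

# Gray-level co-occurrence matrix at distance 1, angle 0 (settings assumed).
glcm = graycomatrix(roi, distances=[1], angles=[0], levels=256,
                    symmetric=True, normed=True)
features = {p: float(graycoprops(glcm, p)[0, 0])
            for p in ("contrast", "dissimilarity", "homogeneity",
                      "energy", "correlation", "ASM")}
print(features)
```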

  3. First order augmentation to tensor voting for boundary inference and multiscale analysis in 3D.

    Science.gov (United States)

    Tong, Wai-Shun; Tang, Chi-Keung; Mordohai, Philippos; Medioni, Gérard

    2004-05-01

    Most computer vision applications require the reliable detection of boundaries. In the presence of outliers, missing data, orientation discontinuities, and occlusion, this problem is particularly challenging. We propose to address it by complementing the tensor voting framework, which was limited to second order properties, with first order representation and voting. First order voting fields and a mechanism to vote for 3D surface and volume boundaries and curve endpoints in 3D are defined. Boundary inference is also useful for a second difficult problem in grouping, namely, automatic scale selection. We propose an algorithm that automatically infers the smallest scale that can preserve the finest details. Our algorithm then proceeds with progressively larger scales to ensure continuity where it has not been achieved. Therefore, the proposed approach does not oversmooth features or delay the handling of boundaries and discontinuities until model misfit occurs. The interaction of smooth features, boundaries, and outliers is accommodated by the unified representation, making possible the perceptual organization of data in curves, surfaces, volumes, and their boundaries simultaneously. We present results on a variety of data sets to show the efficacy of the improved formalism.

  4. Diagnosis - Using automatic test equipment and artificial intelligence expert systems

    Science.gov (United States)

    Ramsey, J. E., Jr.

    Three expert systems (ATEOPS, ATEFEXPERS, and ATEFATLAS), which were created to direct automatic test equipment (ATE), are reviewed. The purpose of the project was to develop an expert system to troubleshoot the converter-programmer power supply card for the F-15 aircraft and have that expert system direct the automatic test equipment. Each expert system uses a different knowledge base or inference engine, basing the testing on the circuit schematic, test requirements document, or ATLAS code. Implementing generalized modules allows the expert systems to be used for any different unit under test. Using converted ATLAS to LISP code allows the expert system to direct any ATE using ATLAS. The constraint propagated frame system allows for the expansion of control by creating the ATLAS code, checking the code for good software engineering techniques, directing the ATE, and changing the test sequence as needed (planning).

  5. Length Scales in Bayesian Automatic Adaptive Quadrature

    Directory of Open Access Journals (Sweden)

    Adam Gh.

    2016-01-01

    Full Text Available Two conceptual developments in the Bayesian automatic adaptive quadrature approach to the numerical solution of one-dimensional Riemann integrals [Gh. Adam, S. Adam, Springer LNCS 7125, 1–16 (2012)] are reported. First, it is shown that the numerical quadrature which avoids overcomputing and minimizes the hidden floating point loss of precision asks for the consideration of three classes of integration domain lengths endowed with specific quadrature sums: microscopic (trapezoidal rule), mesoscopic (Simpson rule), and macroscopic (quadrature sums of high algebraic degrees of precision). Second, sensitive diagnostic tools for the Bayesian inference on macroscopic ranges, coming from the use of Clenshaw-Curtis quadrature, are derived.
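
    The three length-scale classes can be illustrated with a dispatcher that selects a quadrature sum from the integration-domain length. The thresholds and the 16-node Gauss-Legendre rule standing in for the "high algebraic degree" quadrature sums are illustrative assumptions, not values from the paper.

```python
import numpy as np

def quadrature(f, a, b, micro=1e-8, meso=1e-4):
    """Pick a quadrature sum by integration-domain length, echoing the three
    classes in the paper; the thresholds here are illustrative."""
    h = b - a
    if h < micro:                       # microscopic: trapezoidal rule
        return 0.5 * h * (f(a) + f(b))
    if h < meso:                        # mesoscopic: Simpson rule
        return h / 6.0 * (f(a) + 4.0 * f(0.5 * (a + b)) + f(b))
    # macroscopic: a high-degree rule, here 16-node Gauss-Legendre on [a, b]
    x, w = np.polynomial.legendre.leggauss(16)
    return 0.5 * h * np.dot(w, f(0.5 * h * x + 0.5 * (a + b)))

print(quadrature(np.sin, 0.0, np.pi))   # ~2.0
```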

  6. 30 CFR 75.1103-6 - Automatic fire sensors; actuation of fire suppression systems.

    Science.gov (United States)

    2010-07-01

    30 CFR 75.1103-6 (Mineral Resources, 2010-07-01), Fire Protection: Automatic fire sensors; actuation of fire suppression systems. Point-type heat sensors or automatic fire sensor and warning device systems may be used to actuate deluge-type water systems...

  7. Improvement in the performance of CAD for the Alzheimer-type dementia based on automatic extraction of temporal lobe from coronal MR images

    International Nuclear Information System (INIS)

    Kaeriyama, Tomoharu; Kodama, Naoki; Kaneko, Tomoyuki; Shimada, Tetsuo; Tanaka, Hiroyuki; Takeda, Ai; Fukumoto, Ichiro

    2004-01-01

    In this study, we extracted whole-brain and temporal lobe images from MR images (26 healthy elderly controls and 34 Alzheimer-type dementia patients) by means of binarization, mask processing, template matching, Hough transformation, boundary tracing, etc. We assessed the extraction accuracy by comparing the extracted images to images extracted by a radiological technologist. The results of assessment by concordance rate were: whole brain, 91.3±4.3%; right temporal lobe, 83.3±6.9%; left temporal lobe, 83.7±7.6%. Furthermore, discriminant analysis using six textural features demonstrated a sensitivity and specificity of 100% when the healthy elderly controls were compared to the Alzheimer-type dementia patients. Our research showed the possibility of automatic objective diagnosis of temporal lobe abnormalities from automatically extracted images of the temporal lobes. (author)

  8. Fast Markov chain Monte Carlo sampling for sparse Bayesian inference in high-dimensional inverse problems using L1-type priors

    International Nuclear Information System (INIS)

    Lucka, Felix

    2012-01-01

    Sparsity has become a key concept for solving high-dimensional inverse problems using variational regularization techniques. Recently, using similar sparsity-constraints in the Bayesian framework for inverse problems by encoding them in the prior distribution has attracted attention. Important questions about the relation between regularization theory and Bayesian inference still need to be addressed when using sparsity promoting inversion. A practical obstacle for these examinations is the lack of fast posterior sampling algorithms for sparse, high-dimensional Bayesian inversion. Accessing the full range of Bayesian inference methods requires being able to draw samples from the posterior probability distribution in a fast and efficient way. This is usually done using Markov chain Monte Carlo (MCMC) sampling algorithms. In this paper, we develop and examine a new implementation of a single-component Gibbs MCMC sampler for sparse priors relying on L1-norms. We demonstrate that the efficiency of our Gibbs sampler increases when the level of sparsity or the dimension of the unknowns is increased. This property is contrary to the properties of the most commonly applied Metropolis–Hastings (MH) sampling schemes. We demonstrate that the efficiency of MH schemes for L1-type priors dramatically decreases when the level of sparsity or the dimension of the unknowns is increased. Practically, Bayesian inversion for L1-type priors using MH samplers is not feasible at all. As this is commonly believed to be an intrinsic feature of MCMC sampling, the performance of our Gibbs sampler also challenges common beliefs about the applicability of sample based Bayesian inference. (paper)
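
    For concreteness, the sketch below implements the Metropolis-Hastings baseline the paper argues against: a random-walk sampler on a posterior with a Gaussian likelihood and an L1 (Laplace-type) prior. The problem sizes, step size, and regularisation weight are invented; the paper's single-component Gibbs sampler is the more efficient alternative and is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, sigma, lam, step = 20, 40, 0.1, 5.0, 0.05   # all values illustrative
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[:3] = 1.0                                  # sparse ground truth
y = A @ x_true + sigma * rng.standard_normal(m)

def neg_log_post(x):
    """Negative log-posterior: Gaussian likelihood plus L1 (Laplace) prior."""
    r = y - A @ x
    return r @ r / (2 * sigma**2) + lam * np.abs(x).sum()

x = np.zeros(n)
e = neg_log_post(x)
accepted, samples = 0, []
for _ in range(20000):
    prop = x + step * rng.standard_normal(n)      # random-walk proposal
    e_prop = neg_log_post(prop)
    if np.log(rng.random()) < e - e_prop:         # Metropolis accept/reject
        x, e, accepted = prop, e_prop, accepted + 1
    samples.append(x.copy())
print(f"acceptance rate: {accepted / 20000:.2f}")
print("posterior mean, first 5 components:", np.mean(samples[2000:], axis=0)[:5])
```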

  9. Comparison of Urban Human Movements Inferring from Multi-Source Spatial-Temporal Data

    Science.gov (United States)

    Cao, Rui; Tu, Wei; Cao, Jinzhou; Li, Qingquan

    2016-06-01

    The quantification of human movements is very hard because of the sparsity of traditional data and the labour-intensiveness of the data collecting process. Recently, abundant spatial-temporal data give us an opportunity to observe human movement. This research investigates the relationship of city-wide human movements inferred from two types of spatial-temporal data at the traffic analysis zone (TAZ) level. The first type of human movement is inferred from long-term smart card transaction data recording boarding actions. The second type of human movement is extracted from citywide time-sequenced mobile phone data with a 30-minute interval. Travel volume, travel distance and travel time are used to measure aggregated human movements in the city. To further examine the relationship between the two types of inferred movements, a linear correlation analysis is conducted on the hourly travel volume. The obtained results show that human movements inferred from smart card data and mobile phone data have a correlation of 0.635. However, there are still some non-ignorable differences in some special areas. This research not only reveals the citywide spatial-temporal human dynamics but also benefits the understanding of the reliability of inferring human movements from big spatial-temporal data.

  11. Misleading or Falsification? Inferring Deceptive Strategies and Types in Online News and Social Media

    Energy Technology Data Exchange (ETDEWEB)

    Volkova, Svitlana; Jang, Jin Yea

    2018-04-27

    Deceptive information in online news and social media has had a dramatic effect on our society in recent years. This study is the first to gain deeper insights into writers' intent behind digital misinformation by analyzing psycholinguistic signals: moral foundations and connotations extracted from different types of deceptive news ranging from strategic disinformation to propaganda and hoaxes. To ensure consistency of our findings and generalizability across domains, we experiment with data from: (1) confirmed cases of disinformation in news summaries, (2) propaganda, hoax, and disinformation news pages, and (3) social media news. We first contrast lexical markers of biased language, syntactic and stylistic signals, and connotations across deceptive news types including disinformation, propaganda, and hoaxes, and deceptive strategies including misleading or falsification. We then incorporate these insights to build machine learning and deep learning predictive models to infer deception strategies and deceptive news types. Our experimental results demonstrate that, unlike earlier work on deception detection, content combined with biased language markers, moral foundations, and connotations leads to better predictive performance of deception strategies compared to syntactic and stylistic signals (as reported in earlier work on deceptive reviews). The falsification strategy is easier to identify than the misleading strategy. Disinformation is more difficult to predict than propaganda or hoaxes. Deceptive news types (disinformation, propaganda, and hoaxes), unlike deceptive strategies (falsification and misleading), are more salient, and thus easier to identify in tweets than in news reports. Finally, our novel connotation analysis across deception types provides deeper understanding of writers' perspectives and therefore reveals the intentions behind digital misinformation.

  12. Type Inference of Turbo Pascal

    DEFF Research Database (Denmark)

    Hougaard, Ole Ildsgaard; Schwartzbach, Michael I; Askari, Hosein

    1995-01-01

    of Turbo Pascal. It has the form of a preprocessor that analyzes programs in which the type annotations are only partial or even absent. The resulting program has full type annotations, will be accepted by the standard Turbo Pascal compiler, and has polymorphic use of procedures resolved by means of code...

  13. Automatic-heuristic and executive-analytic processing during reasoning: Chronometric and dual-task considerations.

    Science.gov (United States)

    De Neys, Wim

    2006-06-01

    Human reasoning has been shown to overly rely on intuitive, heuristic processing instead of a more demanding analytic inference process. Four experiments tested the central claim of current dual-process theories that analytic operations involve time-consuming executive processing whereas the heuristic system would operate automatically. Participants solved conjunction fallacy problems and indicative and deontic selection tasks. Experiment 1 established that making correct analytic inferences demanded more processing time than did making heuristic inferences. Experiment 2 showed that burdening the executive resources with an attention-demanding secondary task decreased correct, analytic responding and boosted the rate of conjunction fallacies and indicative matching card selections. Results were replicated in Experiments 3 and 4 with a different secondary-task procedure. Involvement of executive resources for the deontic selection task was less clear. Findings validate basic processing assumptions of the dual-process framework and complete the correlational research programme of K. E. Stanovich and R. F. West (2000).

  14. Inference of type-specific HPV transmissibility, progression and clearance rates: a mathematical modelling approach.

    Directory of Open Access Journals (Sweden)

    Helen C Johnson

    Full Text Available Quantifying rates governing the clearance of Human Papillomavirus (HPV) and its progression to clinical disease, together with viral transmissibility and the duration of naturally-acquired immunity, is essential in estimating the impact of vaccination programmes and screening or testing regimes. However, the complex natural history of HPV makes this difficult. We infer the viral transmissibility, rate of waning natural immunity and rates of progression and clearance of infection of 13 high-risk and 2 non-oncogenic HPV types, making use of a number of rich datasets from Sweden. Estimates of viral transmissibility, clearance of initial infection and waning immunity were derived in a Bayesian framework by fitting a susceptible-infectious-recovered-susceptible (SIRS) transmission model to age- and type-specific HPV prevalence data from both a cross-sectional study and a randomised controlled trial (RCT) of primary HPV screening. The models fitted well, but over-estimated the prevalence of four high-risk types with respect to the data. Three of these types (HPV-33, -35 and -58) are among the most closely related phylogenetically to the most prevalent type, HPV-16. The fourth (HPV-45) is the most closely related to HPV-18, the second most prevalent type. We suggest that this may be an indicator of cross-immunity. Rates of progression and clearance of clinical lesions were additionally estimated from longitudinal data gathered as part of the same RCT. Our estimates of progression and clearance rates are consistent with the findings of survival analysis studies and we extend the literature by estimating progression and clearance rates for non-16 and non-18 high-risk types. We anticipate that such type-specific estimates will be useful in the parameterisation of further models and in developing our understanding of HPV natural history.
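
    A minimal SIRS model of the kind fitted in the paper, written as an ODE system, is sketched below; the rate constants are invented for illustration and are not the inferred HPV type-specific estimates.

```python
import numpy as np
from scipy.integrate import odeint

def sirs(state, t, beta, gamma, omega):
    """Basic SIRS dynamics; the paper fits a type-specific variant of this
    model to HPV prevalence data in a Bayesian framework."""
    s, i, r = state
    return [-beta * s * i + omega * r,      # susceptible
            beta * s * i - gamma * i,       # infectious
            gamma * i - omega * r]          # recovered, waning at rate omega

beta, gamma, omega = 0.3, 0.1, 0.02         # illustrative transmission,
t = np.linspace(0, 365, 366)                # clearance and waning rates (1/day)
s, i, r = odeint(sirs, [0.99, 0.01, 0.0], t, args=(beta, gamma, omega)).T
print(f"endemic prevalence after 1 year: {i[-1]:.3f}")
```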

  15. Efficient inference of population size histories and locus-specific mutation rates from large-sample genomic variation data.

    Science.gov (United States)

    Bhaskar, Anand; Wang, Y X Rachel; Song, Yun S

    2015-02-01

    With the recent increase in study sample sizes in human genetics, there has been growing interest in inferring historical population demography from genomic variation data. Here, we present an efficient inference method that can scale up to very large samples, with tens or hundreds of thousands of individuals. Specifically, by utilizing analytic results on the expected frequency spectrum under the coalescent and by leveraging the technique of automatic differentiation, which allows us to compute gradients exactly, we develop a very efficient algorithm to infer piecewise-exponential models of the historical effective population size from the distribution of sample allele frequencies. Our method is orders of magnitude faster than previous demographic inference methods based on the frequency spectrum. In addition to inferring demography, our method can also accurately estimate locus-specific mutation rates. We perform extensive validation of our method on simulated data and show that it can accurately infer multiple recent epochs of rapid exponential growth, a signal that is difficult to pick up with small sample sizes. Lastly, we use our method to analyze data from recent sequencing studies, including a large-sample exome-sequencing data set of tens of thousands of individuals assayed at a few hundred genic regions. © 2015 Bhaskar et al.; Published by Cold Spring Harbor Laboratory Press.

  16. A new method for automatic discontinuity traces sampling on rock mass 3D model

    Science.gov (United States)

    Umili, G.; Ferrero, A.; Einstein, H. H.

    2013-02-01

    A new automatic method for discontinuity traces mapping and sampling on a rock mass digital model is described in this work. The implemented procedure allows one to automatically identify discontinuity traces on a Digital Surface Model: traces are detected directly as surface breaklines, by means of maximum and minimum principal curvature values of the vertices that constitute the model surface. Color influence and user errors, that usually characterize the trace mapping on images, are eliminated. Also trace sampling procedures based on circular windows and circular scanlines have been implemented: they are used to infer trace data and to calculate values of mean trace length, expected discontinuity diameter and intensity of rock discontinuities. The method is tested on a case study: results obtained applying the automatic procedure on the DSM of a rock face are compared to those obtained performing a manual sampling on the orthophotograph of the same rock face.

  17. Inferring Pairwise Interactions from Biological Data Using Maximum-Entropy Probability Models.

    Directory of Open Access Journals (Sweden)

    Richard R Stein

    2015-07-01

    Full Text Available Maximum entropy-based inference methods have been successfully used to infer direct interactions from biological datasets such as gene expression data or sequence ensembles. Here, we review undirected pairwise maximum-entropy probability models in two categories of data types, those with continuous and categorical random variables. As a concrete example, we present recently developed inference methods from the field of protein contact prediction and show that a basic set of assumptions leads to similar solution strategies for inferring the model parameters in both variable types. These parameters reflect interactive couplings between observables, which can be used to predict global properties of the biological system. Such methods are applicable to the important problems of protein 3-D structure prediction and association of gene-gene networks, and they enable potential applications to the analysis of gene alteration patterns and to protein design.

  18. Driver Behavioral Changes through Interactions with an Automatic Brake System for Collision Avoidance

    Science.gov (United States)

    Itoh, Makoto; Fujiwara, Yusuke; Inagaki, Toshiyuki

    This paper discusses drivers' behavioral changes resulting from the use of an automatic brake system for preventing rear-end collisions. Three types of automatic brake systems are investigated in this study. The Type 1 brake system applies a strong automatic brake when a collision is very imminent. The Type 2 brake system initiates brake operation softly when a rear-end crash may be anticipated. Types 1 and 2 are for avoidance of a collision. The Type 3 brake system, on the other hand, applies a strong automatic brake to reduce the damage when a collision cannot be avoided. An experiment was conducted with a driving simulator in order to analyze the drivers' possible behavioral changes. The results showed that the time headway (THW) during the car-following phase was reduced by use of an automatic brake system of any type. The inverse of time to collision (TTC), which is an index of the driver's brake timing, increased with use of the Type 1 brake system when the deceleration rate of the lead vehicle was relatively low. However, the brake timing did not change when the drivers used the Type 2 or 3 brake systems. As a whole, dangerous behavioral changes, such as overreliance on a brake system, were not observed for any type of brake system.
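
    The two behavioral indices used in this study are simple ratios, sketched below as plain functions: time headway (THW) and the inverse of time to collision (1/TTC).

```python
def time_headway(gap_m: float, own_speed_mps: float) -> float:
    """THW: bumper-to-bumper gap divided by the following car's own speed."""
    return gap_m / own_speed_mps

def inverse_ttc(gap_m: float, closing_speed_mps: float) -> float:
    """1/TTC: relative closing speed over gap; a larger value at brake onset
    means later braking."""
    return closing_speed_mps / gap_m

# Example: a 20 m gap at 20 m/s, closing on the lead vehicle at 4 m/s.
print(time_headway(20.0, 20.0))   # 1.0 s
print(inverse_ttc(20.0, 4.0))     # 0.2 1/s
```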

  19. Statistical inference based on divergence measures

    CERN Document Server

    Pardo, Leandro

    2005-01-01

    The idea of using functionals of Information Theory, such as entropies or divergences, in statistical inference is not new. However, in spite of the fact that divergence statistics have become a very good alternative to the classical likelihood ratio test and the Pearson-type statistic in discrete models, many statisticians remain unaware of this powerful approach.Statistical Inference Based on Divergence Measures explores classical problems of statistical inference, such as estimation and hypothesis testing, on the basis of measures of entropy and divergence. The first two chapters form an overview, from a statistical perspective, of the most important measures of entropy and divergence and study their properties. The author then examines the statistical analysis of discrete multivariate data with emphasis is on problems in contingency tables and loglinear models using phi-divergence test statistics as well as minimum phi-divergence estimators. The final chapter looks at testing in general populations, prese...

  20. Discontinuity of maximum entropy inference and quantum phase transitions

    International Nuclear Information System (INIS)

    Chen, Jianxin; Ji, Zhengfeng; Yu, Nengkun; Zeng, Bei; Li, Chi-Kwong; Poon, Yiu-Tung; Shen, Yi; Zhou, Duanlu

    2015-01-01

    In this paper, we discuss the connection between two genuinely quantum phenomena—the discontinuity of quantum maximum entropy inference and quantum phase transitions at zero temperature. It is shown that the discontinuity of the maximum entropy inference of local observable measurements signals the non-local type of transitions, where local density matrices of the ground state change smoothly at the transition point. We then propose to use the quantum conditional mutual information of the ground state as an indicator to detect the discontinuity and the non-local type of quantum phase transitions in the thermodynamic limit. (paper)

  1. Automatic and strategic effects in the guidance of attention by working memory representations.

    Science.gov (United States)

    Carlisle, Nancy B; Woodman, Geoffrey F

    2011-06-01

    Theories of visual attention suggest that working memory representations automatically guide attention toward memory-matching objects. Some empirical tests of this prediction have produced results consistent with working memory automatically guiding attention. However, others have shown that individuals can strategically control whether working memory representations guide visual attention. Previous studies have not independently measured automatic and strategic contributions to the interactions between working memory and attention. In this study, we used a classic manipulation of the probability of valid, neutral, and invalid cues to tease apart the nature of such interactions. This framework utilizes measures of reaction time (RT) to quantify the costs and benefits of attending to memory-matching items and infer the relative magnitudes of automatic and strategic effects. We found both costs and benefits even when the memory-matching item was no more likely to be the target than other items, indicating an automatic component of attentional guidance. However, the costs and benefits essentially doubled as the probability of a trial with a valid cue increased from 20% to 80%, demonstrating a potent strategic effect. We also show that the instructions given to participants led to a significant change in guidance distinct from the actual probability of events during the experiment. Together, these findings demonstrate that the influence of working memory representations on attention is driven by both automatic and strategic interactions. Copyright © 2010 Elsevier B.V. All rights reserved.

  2. Automatic recognition of ship types from infrared images using superstructure moment invariants

    Science.gov (United States)

    Li, Heng; Wang, Xinyu

    2007-11-01

    Automatic object recognition is an active area of interest for military and commercial applications. In this paper, a system for autonomous recognition of ship types in infrared images is proposed. Firstly, a segmentation approach based on detection of the target's salient features, with subsequent shadow removal, is proposed as the basis of the subsequent object recognition. Considering that the differences between the shapes of various ships lie mainly in their superstructures, we then use superstructure moment functions invariant to translation, rotation and scale differences in input patterns, and develop a robust algorithm for obtaining the ship superstructure. Subsequently, a back-propagation neural network is used as a classifier in the recognition stage, and projection images of simulated three-dimensional ship models are used as the training sets. Our recognition model was implemented and experimentally validated using both simulated three-dimensional ship model images and real images derived from video of an AN/AAS-44V Forward Looking Infrared (FLIR) sensor.

  3. Automatization of welding

    International Nuclear Information System (INIS)

    Iwabuchi, Masashi; Tomita, Jinji; Nishihara, Katsunori.

    1978-01-01

    Automatization of welding is one of the effective measures for securing a high degree of quality of nuclear power equipment, as well as for coping with the working environment at the plant site. Among the automatic welders in practical use for welding nuclear power apparatus in the factories of Toshiba and IHI, the latest ones, for pipes and lining tanks, are described here. The pipe welder performs buttering welding on the inside of the pipe end as the so-called IGSCC countermeasure, and the succeeding butt welding, through the same controller. The lining tank welder is able to perform simultaneous welding of two parallel weld lines on a large thin-plate lining tank. Both types of welder are demonstrating excellent performance at the shops as well as at the plant site. (author)

  4. Automatic classification of ovarian cancer types from cytological images using deep convolutional neural networks.

    Science.gov (United States)

    Wu, Miao; Yan, Chuanbo; Liu, Huiqiang; Liu, Qian

    2018-06-29

    Ovarian cancer is one of the most common gynecologic malignancies. Accurate classification of ovarian cancer types (serous carcinoma, mucinous carcinoma, endometrioid carcinoma, clear cell carcinoma) is an essential part of differential diagnosis. Computer-aided diagnosis (CADx) can provide useful advice for pathologists to determine the diagnosis correctly. In our study, we employed a Deep Convolutional Neural Network (DCNN) based on AlexNet to automatically classify the different types of ovarian cancers from cytological images. The DCNN consists of five convolutional layers, three max pooling layers, and two fully connected layers. We then trained the model on two groups of input data separately: one was the original image data and the other was augmented image data, including image enhancement and image rotation. The testing results were obtained by 10-fold cross-validation, showing that the accuracy of the classification models improved from 72.76 to 78.20% by using augmented images as training data. The developed scheme was useful for classifying ovarian cancers from cytological images. © 2018 The Author(s).
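
    The abstract pins down the architecture's outline (five convolutional layers, three max-pooling layers, two fully connected layers, four classes), which the PyTorch sketch below follows. The channel counts, kernel sizes, and 224x224 input resolution are borrowed from the standard AlexNet and are assumptions, since the paper does not specify them here.

```python
import torch
import torch.nn as nn

class OvarianNet(nn.Module):
    """AlexNet-style DCNN matching the stated outline: five convolutional
    layers, three max-pooling layers, two fully connected layers."""
    def __init__(self, n_classes: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 64, 11, stride=4, padding=2), nn.ReLU(),
            nn.MaxPool2d(3, stride=2),                       # pool 1
            nn.Conv2d(64, 192, 5, padding=2), nn.ReLU(),
            nn.MaxPool2d(3, stride=2),                       # pool 2
            nn.Conv2d(192, 384, 3, padding=1), nn.ReLU(),
            nn.Conv2d(384, 256, 3, padding=1), nn.ReLU(),
            nn.Conv2d(256, 256, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(3, stride=2),                       # pool 3
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(256 * 6 * 6, 4096), nn.ReLU(), nn.Dropout(),
            nn.Linear(4096, n_classes),                      # 4 cancer types
        )

    def forward(self, x):
        return self.classifier(self.features(x))

logits = OvarianNet()(torch.randn(1, 3, 224, 224))  # one 224x224 RGB image
print(logits.shape)  # torch.Size([1, 4])
```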

  5. Assessing children's inference generation: what do tests of reading comprehension measure?

    Science.gov (United States)

    Bowyer-Crane, Claudine; Snowling, Margaret J

    2005-06-01

    Previous research suggests that children with specific comprehension difficulties have problems with the generation of inferences. This raises important questions as to whether poor comprehenders have poor comprehension skills generally, or whether their problems are confined to specific inference types. The main aims of the study were (a) to classify, using two commonly used tests of reading comprehension, the questions requiring the generation of inferences, and (b) to investigate the relative performance of skilled and less-skilled comprehenders on questions tapping different inference types. The performance of 10 poor comprehenders (mean age 110.06 months) was compared with the performance of 10 normal readers (mean age 112.78 months) on two tests of reading comprehension. A qualitative analysis of the NARA II (form 1) and the WORD comprehension subtest was carried out. Participants were then administered the NARA II, WORD comprehension subtest and a test of non-word reading. The NARA II was heavily reliant on the generation of knowledge-based inferences, while the WORD comprehension subtest was biased towards the retention of literal information. Children identified by the NARA II as having comprehension difficulties performed in the normal range on the WORD comprehension subtests. Further, children with comprehension difficulties performed poorly on questions requiring the generation of knowledge-based and elaborative inferences. However, they were able to answer questions requiring attention to literal information or use of cohesive devices at a level comparable to normal readers. Different reading tests tap different types of inferencing skills. Less-skilled comprehenders have particular difficulty applying real-world knowledge to a text during reading, and this has implications for the formulation of effective intervention strategies.

  6. A method for studying the hunting oscillations of an airplane with a simple type of automatic control

    Science.gov (United States)

    Jones, R. T.

    1976-01-01

    A method is presented for predicting the amplitude and frequency, under certain simplifying conditions, of the hunting oscillations of an automatically controlled aircraft with lag in the control system or in the response of the aircraft to the controls. If the steering device is actuated by a simple right-left type of signal, the series of alternating fixed-amplitude signals occurring during the hunting may ordinarily be represented by a square wave. Formulas are given expressing the response to such a variation of signal in terms of the response to a unit signal.

  7. An Analysis of Multi-type Relational Interactions in FMA Using Graph Motifs with Disjointness Constraints

    Science.gov (United States)

    Zhang, Guo-Qiang; Luo, Lingyun; Ogbuji, Chime; Joslyn, Cliff; Mejino, Jose; Sahoo, Satya S

    2012-01-01

    The interaction of multiple types of relationships among anatomical classes in the Foundational Model of Anatomy (FMA) can provide inferred information valuable for quality assurance. This paper introduces a method called Motif Checking (MOCH) to study the effects of such multi-relation type interactions for detecting logical inconsistencies as well as other anomalies represented by the motifs. MOCH represents patterns of multi-type interaction as small labeled (with multiple types of edges) sub-graph motifs, whose nodes represent class variables, and labeled edges represent relational types. By representing FMA as an RDF graph and motifs as SPARQL queries, fragments of FMA are automatically obtained as auditing candidates. Leveraging the scalability and reconfigurability of Semantic Web Technology, we performed exhaustive analyses of a variety of labeled sub-graph motifs. The quality assurance feature of MOCH comes from the distinct use of a subset of the edges of the graph motifs as constraints for disjointness, thereby bringing a rule-based flavor to the approach as well. With possible disjointness implied by antonyms, we performed manual inspection of the resulting FMA fragments and tracked down sources of abnormal inferred conclusions (logical inconsistencies), which are amenable to programmatic revision of the FMA. Our results demonstrate that MOCH provides a unique source of valuable information for quality assurance. Since our approach is general, it is applicable to any ontological system with an OWL representation. PMID:23304382
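
    Since MOCH expresses motifs as SPARQL queries over an RDF rendering of the FMA, a motif check reduces to a few lines with rdflib, sketched below. The file path, the namespace IRI, and the two predicate names in the query are placeholders, not the FMA's actual vocabulary; the real motifs additionally carry disjointness-constraint edges.

```python
from rdflib import Graph, Namespace

FMA = Namespace("http://example.org/fma#")  # placeholder namespace, not the real FMA IRI

g = Graph()
g.parse("fma.owl", format="xml")  # path to an RDF/OWL rendering of the FMA (assumed)

# One motif as a SPARQL query: a pair of classes connected by two different
# relation types at once, flagged as an auditing candidate. Both predicate
# names are hypothetical stand-ins for FMA part-of relation variants.
MOTIF = """
SELECT ?a ?b WHERE {
    ?a fma:regional_part ?b .
    ?a fma:constitutional_part ?b .
}
"""
for row in g.query(MOTIF, initNs={"fma": FMA}):
    print(row.a, row.b)
```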

  9. NIFTY - Numerical Information Field Theory. A versatile PYTHON library for signal inference

    Science.gov (United States)

    Selig, M.; Bell, M. R.; Junklewitz, H.; Oppermann, N.; Reinecke, M.; Greiner, M.; Pachajoa, C.; Enßlin, T. A.

    2013-06-01

    NIFTy (Numerical Information Field Theory) is a software package designed to enable the development of signal inference algorithms that operate regardless of the underlying spatial grid and its resolution. Its object-oriented framework is written in Python, although it accesses libraries written in Cython, C++, and C for efficiency. NIFTy offers a toolkit that abstracts discretized representations of continuous spaces, fields in these spaces, and operators acting on fields into classes. Thereby, the correct normalization of operations on fields is taken care of automatically without concerning the user. This allows for an abstract formulation and programming of inference algorithms, including those derived within information field theory. Thus, NIFTy permits its user to rapidly prototype algorithms in 1D, and then apply the developed code in higher-dimensional settings of real world problems. The set of spaces on which NIFTy operates comprises point sets, n-dimensional regular grids, spherical spaces, their harmonic counterparts, and product spaces constructed as combinations of those. The functionality and diversity of the package is demonstrated by a Wiener filter code example that successfully runs without modification regardless of the space on which the inference problem is defined. NIFTy homepage http://www.mpa-garching.mpg.de/ift/nifty/; Excerpts of this paper are part of the NIFTy source code and documentation.
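
    The Wiener filter demonstration mentioned above computes m = S R^T (R S R^T + N)^(-1) d; a grid-bound numpy version is sketched below to show the algorithm that NIFTy abstracts away from any particular space. The Gaussian signal covariance, identity response, and noise level are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x = np.arange(n)
S = np.exp(-0.5 * ((x[:, None] - x[None, :]) / 5.0) ** 2)  # assumed signal covariance
R = np.eye(n)                                              # trivial response
N = 0.1 * np.eye(n)                                        # assumed noise covariance

# Draw a signal from S, observe it through R with additive noise.
signal = np.linalg.cholesky(S + 1e-8 * np.eye(n)) @ rng.standard_normal(n)
d = R @ signal + rng.multivariate_normal(np.zeros(n), N)

m = S @ R.T @ np.linalg.solve(R @ S @ R.T + N, d)          # Wiener estimate
print(f"residual rms: {np.sqrt(np.mean((m - signal) ** 2)):.3f}")
```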

  10. Adaptive neuro-fuzzy inference systems for semi-automatic discrimination between seismic events: a study in Tehran region

    Science.gov (United States)

    Vasheghani Farahani, Jamileh; Zare, Mehdi; Lucas, Caro

    2012-04-01

    This article presents an adaptive neuro-fuzzy inference system (ANFIS) for the classification of low-magnitude seismic events reported in Iran by the network of the Tehran Disaster Mitigation and Management Organization (TDMMO). ANFIS classifiers were used to detect seismic events using six inputs that defined the events. Neuro-fuzzy coding was applied using the six extracted features as ANFIS inputs. Two types of events were defined: weak earthquakes and mining blasts. The data comprised 748 events (6289 signals) ranging from magnitude 1.1 to 4.6, recorded at 13 seismic stations between 2004 and 2009; some 223 earthquakes with M ≤ 2.2 are included in this database. Data sets from the south, east, and southeast of the city of Tehran were used to evaluate the best short-period seismic discriminants, and features such as origin time of the event, distance (source to station), latitude of the epicenter, longitude of the epicenter, magnitude, and spectral analysis (fc of the Pg wave) were used as inputs, increasing the rate of correct classification and decreasing the confusion rate between weak earthquakes and quarry blasts. The performance of the ANFIS model was evaluated for training and classification accuracy. The results confirmed that the proposed ANFIS model has good potential for discriminating seismic events.

  11. Probabilistic logic networks a comprehensive framework for uncertain inference

    CERN Document Server

    Goertzel, Ben; Goertzel, Izabela Freire; Heljakka, Ari

    2008-01-01

    This comprehensive book describes Probabilistic Logic Networks (PLN), a novel conceptual, mathematical and computational approach to uncertain inference. A broad scope of reasoning types are considered.

  12. Image Analysis of Endoscopic Ultrasonography in Submucosal Tumor Using Fuzzy Inference

    Directory of Open Access Journals (Sweden)

    Kwang Baek Kim

    2013-01-01

    Full Text Available Endoscopists usually diagnose a submucosal tumor based on subjective evaluation of images obtained by endoscopic ultrasonography. In this paper, we propose a method to automatically extract areas of gastrointestinal stromal tumor (GIST) and lipoma from the ultrasonic image to assist those specialists. We also propose an algorithm to differentiate GIST from non-GIST by fuzzy inference from such images, after applying ROC analysis to the mean and standard deviation of the brightness information. In experiments using the kind of real images medical specialists work with, we verify that our method gives such specialists real help in the efficient classification of submucosal tumors.

  13. Development of a doorframe-typed swinging seedling pick-up device for automatic field transplantation

    Energy Technology Data Exchange (ETDEWEB)

    Han, H.; Mao, H.; Hu, J.; Tian, K.

    2015-07-01

    A doorframe-typed swinging seedling pick-up device for automatic field transplanters was developed and evaluated in a laboratory. The device, consisting of a path manipulator and two grippers, moves the pins slowly to extract seedlings from the tray cells and returns quickly to the pick-up point for the next extraction. The path manipulator was constructed as a creative series combination of type-Ⅱ mechanisms: an oscillating guide linkage mechanism and a grooved globoidal cam mechanism. The gripper is a pincette-type mechanism whose pick-up pins penetrate the root mass for seedling extraction. The dynamics of the designed seedling pick-up device were simulated with ADAMS software. As this was the first prototype, various performance tests were conducted under local production conditions to determine the optimal machine operating parameters and transplant production conditions. Moved by the swinging pick-up device, the gripper with multiple fine pins can effectively complete the transplanting work cycle of extracting, transferring, and discharging a seedling. The laboratory evaluation showed that the pick-up device equipped with two grippers can extract 80 seedlings/min with a 90% success rate and a 3% failure rate in discharging seedlings, using 42-day-old tomato plantlets. The quality of seedling extraction was satisfactory. (Author)

  14. Implementing and analyzing the multi-threaded LP-inference

    Science.gov (United States)

    Bolotova, S. Yu; Trofimenko, E. V.; Leschinskaya, M. V.

    2018-03-01

    Logical production equations provide new possibilities for optimizing backward inference in intelligent production-type systems. The strategy of relevant backward inference aims to minimize the number of queries to an external information source (either a database or an interactive user). The idea of the method is based on computing the set of initial preimages and searching for the true preimage. Each stage can be executed independently and in parallel, and the actual work at a given stage can also be distributed between parallel computers. This paper is devoted to parallel algorithms for relevant inference based on an advanced “pipeline” scheme of parallel computation, which increases the degree of parallelism. The authors also provide some details of the LP-structures implementation.
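    As a rough illustration of the two-stage idea (computing candidate preimages, then testing them), here is a hedged Python sketch with stub stage functions standing in for the paper's LP-structure operations:

    ```python
    # Sketch of the "pipeline" parallelisation idea (hypothetical stage
    # functions; the actual LP-structure operations are not shown).
    from concurrent.futures import ThreadPoolExecutor

    def compute_preimages(query):
        """Stage 1: compute the set of initial preimages for a query (stub)."""
        return [f"{query}-pre{i}" for i in range(4)]

    def test_preimage(preimage):
        """Stage 2: check one candidate preimage for truth (stub)."""
        return preimage.endswith("pre2")   # pretend one candidate is true

    def relevant_backward_inference(queries, workers=4):
        with ThreadPoolExecutor(max_workers=workers) as pool:
            # Stages overlap: preimage generation for all queries is submitted
            # up front, while candidates of earlier queries are already tested.
            for query, preimages in zip(queries, pool.map(compute_preimages, queries)):
                for preimage, ok in zip(preimages, pool.map(test_preimage, preimages)):
                    if ok:
                        yield query, preimage
                        break

    print(list(relevant_backward_inference(["q1", "q2"])))
    ```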

  15. Automatic categorization of diverse experimental information in the bioscience literature.

    Science.gov (United States)

    Fang, Ruihua; Schindelman, Gary; Van Auken, Kimberly; Fernandes, Jolene; Chen, Wen; Wang, Xiaodong; Davis, Paul; Tuli, Mary Ann; Marygold, Steven J; Millburn, Gillian; Matthews, Beverley; Zhang, Haiyan; Brown, Nick; Gelbart, William M; Sternberg, Paul W

    2012-01-26

    Curation of information from bioscience literature into biological knowledge databases is a crucial way of capturing experimental information in a computable form. During the biocuration process, a critical first step is to identify, from all published literature, the papers that contain results for the specific data type the curator is interested in annotating. This step normally requires curators to manually examine many papers to ascertain which few contain information of interest and is thus usually time-consuming. We developed an automatic method for identifying papers containing these curation data types among a large pool of published scientific papers, based on the machine learning method Support Vector Machine (SVM). This classification system is completely automatic and can be readily applied to diverse experimental data types. It has been in production use for automatic categorization of 10 different experimental data types in the biocuration process at WormBase for the past two years, and it is in the process of being adopted in the biocuration processes at FlyBase and the Saccharomyces Genome Database (SGD). We anticipate that this method can be readily adopted by various databases in the biocuration community, thereby greatly reducing the time spent on an otherwise laborious and demanding task. We also developed a simple, readily automated procedure to utilize training papers of similar data types from different bodies of literature, such as C. elegans and D. melanogaster, to identify papers with any of these data types for a single database. This approach has great significance because for some data types, especially those of low occurrence, a single corpus often does not have enough training papers to achieve satisfactory performance. We successfully tested the method on ten data types from WormBase, fifteen data types from FlyBase and three data types from Mouse Genomics Informatics (MGI). It is being used in the curation work flow at WormBase for
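    A minimal sketch of such an SVM document classifier, using scikit-learn rather than the production system's actual feature pipeline (corpus and labels below are invented):

    ```python
    # Minimal SVM-based paper classifier sketch with scikit-learn; the
    # production system's features and training corpora are database-specific.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import LinearSVC

    # Hypothetical tiny corpus: papers labelled by whether they contain
    # a given curation data type (1) or not (0).
    papers = [
        "RNAi knockdown of daf-2 extended lifespan ...",
        "We review recent advances in nematode ecology ...",
        "Expression pattern of unc-119 was assayed by GFP reporter ...",
        "A historical perspective on model organisms ...",
    ]
    labels = [1, 0, 1, 0]

    clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2), min_df=1), LinearSVC())
    clf.fit(papers, labels)

    print(clf.predict(["GFP reporter expression in body wall muscle ..."]))
    ```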

  16. Derivation of groundwater flow-paths based on semi-automatic extraction of lineaments from remote sensing data

    OpenAIRE

    U. Mallast; R. Gloaguen; S. Geyer; T. Rödiger; C. Siebert

    2011-01-01

    In this paper we present a semi-automatic method to infer groundwater flow-paths based on the extraction of lineaments from digital elevation models. This method is especially adequate in remote and inaccessible areas where in-situ data are scarce. The combined method of linear filtering and object-based classification provides a lineament map with a high degree of accuracy. Subsequently, lineaments are differentiated into geological and morphological lineaments using auxili...

  17. The Generator of the Event Structure Lexicon (GESL): Automatic Annotation of Event Structure for Textual Inference Tasks

    Science.gov (United States)

    Im, Seohyun

    2013-01-01

    This dissertation aims to develop the Generator of the Event Structure Lexicon (GESL), a tool that automates annotation of the event structure of verbs in text to support textual inference tasks related to lexically entailed subevents. The output of the GESL is the Event Structure Lexicon (ESL), a lexicon of verbs in text which includes…

  18. SubClonal Hierarchy Inference from Somatic Mutations: Automatic Reconstruction of Cancer Evolutionary Trees from Multi-region Next Generation Sequencing.

    Science.gov (United States)

    Niknafs, Noushin; Beleva-Guthrie, Violeta; Naiman, Daniel Q; Karchin, Rachel

    2015-10-01

    Recent improvements in next-generation sequencing of tumor samples and the ability to identify somatic mutations at low allelic fractions have opened the way for new approaches to model the evolution of individual cancers. The power and utility of these models is increased when tumor samples from multiple sites are sequenced. Temporal ordering of the samples may provide insight into the etiology of both primary and metastatic lesions and rationalizations for tumor recurrence and therapeutic failures. Additional insights may be provided by temporal ordering of evolving subclones--cellular subpopulations with unique mutational profiles. Current methods for subclone hierarchy inference tightly couple the problem of temporal ordering with that of estimating the fraction of cancer cells harboring each mutation. We present a new framework that includes a rigorous statistical hypothesis test and a collection of tools that make it possible to decouple these problems, which we believe will enable substantial progress in the field of subclone hierarchy inference. The methods presented here can be flexibly combined with methods developed by others addressing either of these problems. We provide tools to interpret hypothesis test results, which inform phylogenetic tree construction, and we introduce the first genetic algorithm designed for this purpose. The utility of our framework is systematically demonstrated in simulations. For most tested combinations of tumor purity, sequencing coverage, and tree complexity, good power (≥ 0.8) can be achieved and Type 1 error is well controlled when at least three tumor samples are available from a patient. Using data from three published multi-region tumor sequencing studies of (murine) small cell lung cancer, acute myeloid leukemia, and chronic lymphocytic leukemia, in which the authors reconstructed subclonal phylogenetic trees by manual expert curation, we show how different configurations of our tools can identify either a single

  19. SubClonal Hierarchy Inference from Somatic Mutations: Automatic Reconstruction of Cancer Evolutionary Trees from Multi-region Next Generation Sequencing.

    Directory of Open Access Journals (Sweden)

    Noushin Niknafs

    2015-10-01

    Full Text Available Recent improvements in next-generation sequencing of tumor samples and the ability to identify somatic mutations at low allelic fractions have opened the way for new approaches to model the evolution of individual cancers. The power and utility of these models is increased when tumor samples from multiple sites are sequenced. Temporal ordering of the samples may provide insight into the etiology of both primary and metastatic lesions and rationalizations for tumor recurrence and therapeutic failures. Additional insights may be provided by temporal ordering of evolving subclones--cellular subpopulations with unique mutational profiles. Current methods for subclone hierarchy inference tightly couple the problem of temporal ordering with that of estimating the fraction of cancer cells harboring each mutation. We present a new framework that includes a rigorous statistical hypothesis test and a collection of tools that make it possible to decouple these problems, which we believe will enable substantial progress in the field of subclone hierarchy inference. The methods presented here can be flexibly combined with methods developed by others addressing either of these problems. We provide tools to interpret hypothesis test results, which inform phylogenetic tree construction, and we introduce the first genetic algorithm designed for this purpose. The utility of our framework is systematically demonstrated in simulations. For most tested combinations of tumor purity, sequencing coverage, and tree complexity, good power (≥ 0.8) can be achieved and Type 1 error is well controlled when at least three tumor samples are available from a patient. Using data from three published multi-region tumor sequencing studies of (murine) small cell lung cancer, acute myeloid leukemia, and chronic lymphocytic leukemia, in which the authors reconstructed subclonal phylogenetic trees by manual expert curation, we show how different configurations of our tools can

  20. A Rewriting Logic Approach to Type Inference

    Science.gov (United States)

    Ellison, Chucky; Şerbănuţă, Traian Florin; Roşu, Grigore

    Meseguer and Roşu proposed rewriting logic semantics (RLS) as a programming language definitional framework that unifies operational and algebraic denotational semantics. RLS has already been used to define a series of didactic and real languages, but its benefits in connection with defining and reasoning about type systems have not been fully investigated. This paper shows how the same RLS style employed for giving formal definitions of languages can be used to define type systems. The same term-rewriting mechanism used to execute RLS language definitions can now be used to execute type systems, yielding type checkers or type inferencers. The proposed approach is exemplified by defining the Hindley-Milner polymorphic type inferencer $\mathcal{W}$ as a rewrite logic theory and using this definition to obtain a type inferencer by executing it in a rewriting logic engine. The inferencer obtained this way compares favorably with other definitions or implementations of $\mathcal{W}$. The performance of the executable definition is within an order of magnitude of that of highly optimized implementations of type inferencers, such as that of OCaml.
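    The core of any Hindley-Milner inferencer, whether implemented directly or as a rewrite theory, is first-order unification; a toy Python sketch (with the occurs check omitted for brevity) is:

    ```python
    # First-order unification, the engine inside Hindley-Milner inference
    # (a toy sketch; the paper instead defines W as a rewriting logic theory).
    def resolve(t, subst):
        """Follow variable bindings to a representative."""
        while isinstance(t, str) and t in subst:
            t = subst[t]
        return t

    def unify(t1, t2, subst=None):
        """Unify two types; variables are strings, constructors are tuples.
        NOTE: the occurs check is omitted for brevity."""
        if subst is None:
            subst = {}
        t1, t2 = resolve(t1, subst), resolve(t2, subst)
        if t1 == t2:
            return subst
        if isinstance(t1, str):                     # t1 is a type variable
            subst[t1] = t2
            return subst
        if isinstance(t2, str):                     # t2 is a type variable
            subst[t2] = t1
            return subst
        if t1[0] == t2[0] and len(t1) == len(t2):   # same constructor and arity
            for a, b in zip(t1[1:], t2[1:]):
                subst = unify(a, b, subst)
            return subst
        raise TypeError(f"cannot unify {t1} with {t2}")

    # ('->', a, b) is the function type a -> b; lowercase strings are variables.
    print(unify(('->', 'a', ('int',)), ('->', ('bool',), 'b')))
    # {'a': ('bool',), 'b': ('int',)}
    ```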

  1. MATRIX-VECTOR ALGORITHMS OF LOCAL POSTERIORI INFERENCE IN ALGEBRAIC BAYESIAN NETWORKS ON QUANTA PROPOSITIONS

    Directory of Open Access Journals (Sweden)

    A. A. Zolotin

    2015-07-01

    Full Text Available Posteriori inference is one of the three kinds of probabilistic-logic inference in the theory of probabilistic graphical models and the basis for processing knowledge patterns with probabilistic uncertainty using Bayesian networks. The paper deals with the task of describing local posteriori inference in algebraic Bayesian networks, which represent a class of probabilistic graphical models, by means of matrix-vector equations. The latter are essentially based on the use of the tensor product of matrices, the Kronecker degree, and the Hadamard product. Matrix equations for calculating vectors of posteriori probabilities within posteriori inference in knowledge patterns with quanta propositions are obtained. Similar equations have already been discussed in the theory of algebraic Bayesian networks, but they were built only for the case of posteriori inference in knowledge patterns on the ideals of conjuncts. During the synthesis and development of the matrix-vector equations on probability vectors of quanta propositions, a number of earlier results concerning normalizing factors in posteriori inference and the assignment of a linear projective operator with a selector vector were adapted. We consider all three types of incoming evidence - deterministic, stochastic and inaccurate - combined with scalar and interval estimation of the probability of truth of propositional formulas in the knowledge patterns. Linear programming problems are formed; their solution gives the desired interval values of posterior probabilities in the case of inaccurate evidence or interval estimates in a knowledge pattern. This sort of description of posteriori inference makes it possible to extend the set of knowledge pattern types usable in local and global posteriori inference, as well as to simplify complex software implementation by using existing third-party libraries that effectively support the representation and processing of matrices and vectors.
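    The matrix-vector building blocks named above (tensor/Kronecker product, Kronecker degree, Hadamard product, normalizing factor) look as follows in NumPy; the shapes and numbers are illustrative and do not reproduce the paper's actual equations:

    ```python
    # Building blocks of the matrix-vector equations, in NumPy (shapes and
    # values are illustrative only).
    import numpy as np

    A = np.array([[1.0, 0.0], [0.5, 0.5]])
    B = np.array([[0.9, 0.1], [0.2, 0.8]])

    kron = np.kron(A, B)        # tensor (Kronecker) product, shape (4, 4)
    kron_deg = np.kron(A, A)    # Kronecker degree 2 of A
    print(kron.shape, kron_deg.shape)

    p = np.array([0.2, 0.3, 0.1, 0.4])   # probability vector over quanta
    e = np.array([1.0, 1.0, 0.0, 0.0])   # selector vector for an evidence

    posterior = p * e                    # Hadamard product applies the evidence
    posterior /= posterior.sum()         # normalizing factor
    print(posterior)                     # [0.4 0.6 0.  0. ]
    ```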

  2. Genealogical and evolutionary inference with the human Y chromosome.

    Science.gov (United States)

    Stumpf, M P; Goldstein, D B

    2001-03-02

    Population genetics has emerged as a powerful tool for unraveling human history. In addition to the study of mitochondrial and autosomal DNA, attention has recently focused on Y-chromosome variation. Ambiguities and inaccuracies in data analysis, however, pose an important obstacle to further development of the field. Here we review the methods available for genealogical inference using Y-chromosome data. Approaches can be divided into those that do and those that do not use an explicit population model in genealogical inference. We describe the strengths and weaknesses of these model-based and model-free approaches, as well as difficulties associated with the mutation process that affect both methods. In the case of genealogical inference using microsatellite loci, we use coalescent simulations to show that relatively simple generalizations of the mutation process can greatly increase the accuracy of genealogical inference. Because model-free and model-based approaches have different biases and limitations, we conclude that there is considerable benefit in the continued use of both types of approaches.

  3. Culture, attribution and automaticity: a social cognitive neuroscience view.

    Science.gov (United States)

    Mason, Malia F; Morris, Michael W

    2010-06-01

    A fundamental challenge facing social perceivers is identifying the cause underlying other people's behavior. Evidence indicates that East Asian perceivers are more likely than Western perceivers to reference the social context when attributing a cause to a target person's actions. One outstanding question is whether this reflects a culture's influence on automatic or on controlled components of causal attribution. After reviewing behavioral evidence that culture can shape automatic mental processes as well as controlled reasoning, we discuss the evidence in favor of cultural differences in automatic and controlled components of causal attribution more specifically. We contend that insights emerging from social cognitive neuroscience research can inform this debate. After introducing an attribution framework popular among social neuroscientists, we consider findings relevant to the automaticity of attribution, before speculating how one could use a social neuroscience approach to clarify whether culture affects automatic, controlled or both types of attribution processes.

  4. Automatic alignment of radionuclide images

    International Nuclear Information System (INIS)

    Barber, D.C.

    1982-01-01

    The variability of the position, dimensions and orientation of a radionuclide image within the field of view of a gamma camera hampers attempts to analyse the image numerically. This paper describes a method of using a set of training images of a particular type, in this case right lateral brain images, to define the likely variations in the position, dimensions and orientation for that type of image and to provide alignment data for a program that automatically aligns new images of the specified type to a standard position, size and orientation. Examples are given of the use of this method on three types of radionuclide image. (author)

  5. Entropic Inference

    Science.gov (United States)

    Caticha, Ariel

    2011-03-01

    In this tutorial we review the essential arguments behind entropic inference. We focus on the epistemological notion of information and its relation to the Bayesian beliefs of rational agents. The problem of updating from a prior to a posterior probability distribution is tackled through an eliminative induction process that singles out the logarithmic relative entropy as the unique tool for inference. The resulting method of Maximum relative Entropy (ME) includes as special cases both MaxEnt and Bayes' rule, and therefore unifies the two themes of these workshops—the Maximum Entropy and the Bayesian methods—into a single general inference scheme.
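    The functional singled out by this argument is the standard relative entropy of a candidate posterior p with respect to the prior q, maximized subject to the constraints that encode the new information:

    ```latex
    % Relative-entropy updating functional (standard form): maximise over p
    % subject to the constraints carried by the new information.
    S[p \mid q] = -\int \mathrm{d}x \, p(x) \, \log \frac{p(x)}{q(x)}
    ```

    A uniform prior q recovers MaxEnt, while constraints given by observed data recover Bayes' rule, matching the unification claimed above.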

  6. Towards automatic understanding of technical texts

    Energy Technology Data Exchange (ETDEWEB)

    Hajicova, E; Sgall, P

    1981-01-01

    The authors briefly describe an experiment with a natural language interface to databases of a common type. The main part of this paper is devoted to the system under development for natural language understanding with automatic construction of its data collection. 12 references.

  7. Automatic modulation recognition of communication signals

    CERN Document Server

    Azzouz, Elsayed Elsayed

    1996-01-01

    Automatic modulation recognition is a rapidly evolving area of signal analysis. In recent years, interest from academic and military research institutes has focused on the research and development of modulation recognition algorithms. Any communication intelligence (COMINT) system comprises three main blocks: receiver front-end, modulation recogniser and output stage. Considerable work has been done in the area of receiver front-ends. The work at the output stage is concerned with information extraction, recording and exploitation, and begins with signal demodulation, which requires accurate knowledge of the signal modulation type. There are two main reasons for knowing the current modulation type of a signal: to preserve the signal information content, and to decide upon a suitable counter-action, such as jamming. Automatic Modulation Recognition of Communications Signals describes this modulation recognition process in depth. Drawing on several years of research, the authors provide a cr...

  8. Learning algorithms and automatic processing of languages

    International Nuclear Information System (INIS)

    Fluhr, Christian Yves Andre

    1977-01-01

    This research thesis concerns the field of artificial intelligence, addressing learning algorithms applied to the automatic processing of languages. The author first briefly describes some mechanisms of human intelligence and how these mechanisms are simulated on a computer, outlining the specific role of learning in various manifestations of intelligence. Then, based on the theory of Markov algorithms, the author discusses the notion of a learning algorithm. Two main types of learning algorithm are addressed: first, a sanction-based algorithm of the 'algorithm-teacher dialogue' type, which aims at learning to resolve grammatical ambiguities in submitted texts; second, an algorithm for a document system that automatically structures semantic data obtained from a set of texts, so as to be able to answer, by reference, any question on the content of these texts.

  9. Probabilistic Inference of Biological Networks via Data Integration

    Directory of Open Access Journals (Sweden)

    Mark F. Rogers

    2015-01-01

    Full Text Available There is significant interest in inferring the structure of subcellular networks of interaction. Here we consider supervised interactive network inference, in which a reference set of known network links and nonlinks is used to train a classifier for predicting new links. Many types of data are relevant to inferring functional links between genes, motivating the use of data integration. We use pairwise kernels to predict novel links, along with multiple kernel learning to integrate distinct sources of data into a decision function. We evaluate various pairwise kernels to establish which are most informative and compare individual kernel accuracies with accuracies for weighted combinations. By associating a probability measure with classifier predictions, we enable cautious classification, which can increase accuracy by restricting predictions to high-confidence instances, and data cleaning, which can mitigate the influence of mislabeled training instances. Although one pairwise kernel (the tensor product pairwise kernel) appears to work best, different kernels may contribute complementary information about interactions: experiments in S. cerevisiae (yeast) reveal that a weighted combination of pairwise kernels applied to different types of data yields the highest predictive accuracy. Combined with cautious classification and data cleaning, we can achieve predictive accuracies of up to 99.6%.
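    The tensor product pairwise kernel highlighted above has a compact closed form; a hedged NumPy sketch for a symmetric pair-prediction task (base kernel choice and feature vectors are invented):

    ```python
    # Tensor product pairwise kernel, symmetrised over pair orderings:
    # K((a,b),(c,d)) = k(a,c)k(b,d) + k(a,d)k(b,c).
    import numpy as np

    def rbf(x, y, gamma=1.0):
        """An assumed base kernel on individual genes/proteins."""
        return np.exp(-gamma * np.sum((x - y) ** 2))

    def tppk(pair1, pair2, k=rbf):
        """Tensor product pairwise kernel on two pairs of objects."""
        (a, b), (c, d) = pair1, pair2
        return k(a, c) * k(b, d) + k(a, d) * k(b, c)

    a, b, c, d = np.random.default_rng(1).normal(size=(4, 5))  # toy features
    print(tppk((a, b), (c, d)))
    ```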

  10. The Multivariate Generalised von Mises Distribution: Inference and Applications

    DEFF Research Database (Denmark)

    Navarro, Alexandre Khae Wu; Frellsen, Jes; Turner, Richard

    2017-01-01

    Circular variables arise in a multitude of data-modelling contexts ranging from robotics to the social sciences, but they have been largely overlooked by the machine learning community. This paper partially redresses this imbalance by extending some standard probabilistic modelling tools to the circular domain. … These models can leverage standard modelling tools (e.g. kernel functions and automatic relevance determination). Third, we show that the posterior distribution in these models is a mGvM distribution, which enables development of an efficient variational free-energy scheme for performing approximate inference and approximate maximum-likelihood learning…

  11. Oocytes Polar Body Detection for Automatic Enucleation

    Directory of Open Access Journals (Sweden)

    Di Chen

    2016-02-01

    Full Text Available Enucleation is a crucial step in cloning. In order to achieve automatic blind enucleation, the polar body of the oocyte must be detected automatically. Conventional polar body detection approaches have a low success rate or low efficiency. We propose a polar body detection method based on machine learning in this paper. On the one hand, an improved Histogram of Oriented Gradients (HOG) algorithm is employed to extract features of polar body images, which increases the success rate. On the other hand, a position prediction method is put forward to narrow the search range for the polar body, which improves efficiency. Experimental results show that the success rate is 96% for various types of polar bodies. Furthermore, the method was applied to an enucleation experiment and improved the degree of automation of enucleation.
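    A minimal sketch of the HOG-plus-classifier idea using scikit-image and scikit-learn (the paper's improved HOG variant and its trained model are not reproduced; images and labels below are random placeholders):

    ```python
    # HOG features + SVM classifier sketch; parameters are assumptions.
    import numpy as np
    from skimage.feature import hog
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    # Hypothetical 64x64 grayscale crops: 1 = polar body present, 0 = absent.
    images = rng.random((8, 64, 64))
    labels = np.array([1, 0, 1, 0, 1, 0, 1, 0])

    features = np.array([
        hog(img, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))
        for img in images
    ])

    clf = SVC(kernel="linear").fit(features, labels)
    print(clf.predict(features[:2]))
    ```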

  12. Affective theory of mind inferences contextually influence the recognition of emotional facial expressions.

    Science.gov (United States)

    Stewart, Suzanne L K; Schepman, Astrid; Haigh, Matthew; McHugh, Rhian; Stewart, Andrew J

    2018-03-14

    The recognition of emotional facial expressions is often subject to contextual influence, particularly when the face and the context convey similar emotions. We investigated whether spontaneous, incidental affective theory of mind inferences made while reading vignettes describing social situations would produce context effects on the identification of same-valenced emotions (Experiment 1) as well as differently-valenced emotions (Experiment 2) conveyed by subsequently presented faces. Crucially, we found an effect of context on reaction times in both experiments while, in line with previous work, we found evidence for a context effect on accuracy only in Experiment 1. This demonstrates that affective theory of mind inferences made at the pragmatic level of a text can automatically, contextually influence the perceptual processing of emotional facial expressions in a separate task even when those emotions are of a distinctive valence. Thus, our novel findings suggest that language acts as a contextual influence to the recognition of emotional facial expressions for both same and different valences.

  13. Automatic categorization of diverse experimental information in the bioscience literature

    Directory of Open Access Journals (Sweden)

    Fang Ruihua

    2012-01-01

    Full Text Available Background: Curation of information from bioscience literature into biological knowledge databases is a crucial way of capturing experimental information in a computable form. During the biocuration process, a critical first step is to identify, from all published literature, the papers that contain results for the specific data type the curator is interested in annotating. This step normally requires curators to manually examine many papers to ascertain which few contain information of interest and is thus usually time-consuming. We developed an automatic method for identifying papers containing these curation data types among a large pool of published scientific papers, based on the machine learning method Support Vector Machine (SVM). This classification system is completely automatic and can be readily applied to diverse experimental data types. It has been in production use for automatic categorization of 10 different experimental data types in the biocuration process at WormBase for the past two years, and it is in the process of being adopted in the biocuration processes at FlyBase and the Saccharomyces Genome Database (SGD). We anticipate that this method can be readily adopted by various databases in the biocuration community, thereby greatly reducing the time spent on an otherwise laborious and demanding task. We also developed a simple, readily automated procedure to utilize training papers of similar data types from different bodies of literature, such as C. elegans and D. melanogaster, to identify papers with any of these data types for a single database. This approach has great significance because for some data types, especially those of low occurrence, a single corpus often does not have enough training papers to achieve satisfactory performance. Results: We successfully tested the method on ten data types from WormBase, fifteen data types from FlyBase and three data types from Mouse Genomics Informatics (MGI). It is being used in

  14. Automatic categorization of diverse experimental information in the bioscience literature

    Science.gov (United States)

    2012-01-01

    Background: Curation of information from bioscience literature into biological knowledge databases is a crucial way of capturing experimental information in a computable form. During the biocuration process, a critical first step is to identify, from all published literature, the papers that contain results for the specific data type the curator is interested in annotating. This step normally requires curators to manually examine many papers to ascertain which few contain information of interest and is thus usually time-consuming. We developed an automatic method for identifying papers containing these curation data types among a large pool of published scientific papers, based on the machine learning method Support Vector Machine (SVM). This classification system is completely automatic and can be readily applied to diverse experimental data types. It has been in production use for automatic categorization of 10 different experimental data types in the biocuration process at WormBase for the past two years, and it is in the process of being adopted in the biocuration processes at FlyBase and the Saccharomyces Genome Database (SGD). We anticipate that this method can be readily adopted by various databases in the biocuration community, thereby greatly reducing the time spent on an otherwise laborious and demanding task. We also developed a simple, readily automated procedure to utilize training papers of similar data types from different bodies of literature, such as C. elegans and D. melanogaster, to identify papers with any of these data types for a single database. This approach has great significance because for some data types, especially those of low occurrence, a single corpus often does not have enough training papers to achieve satisfactory performance. Results: We successfully tested the method on ten data types from WormBase, fifteen data types from FlyBase and three data types from Mouse Genomics Informatics (MGI). It is being used in the curation work flow at

  15. Bayesian inference in processing experimental data: principles and basic applications

    International Nuclear Information System (INIS)

    D'Agostini, G

    2003-01-01

    This paper introduces general ideas and some basic methods of the Bayesian probability theory applied to physics measurements. Our aim is to make the reader familiar, through examples rather than rigorous formalism, with concepts such as the following: model comparison (including the automatic Ockham's Razor filter provided by the Bayesian approach); parametric inference; quantification of the uncertainty about the value of physical quantities, also taking into account systematic effects; role of marginalization; posterior characterization; predictive distributions; hierarchical modelling and hyperparameters; Gaussian approximation of the posterior and recovery of conventional methods, especially maximum likelihood and chi-square fits under well-defined conditions; conjugate priors, transformation invariance and maximum entropy motivated priors; and Monte Carlo (MC) estimates of expectation, including a short introduction to Markov Chain MC methods
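    One of the listed ingredients, conjugate priors, can be shown in a few lines; the Beta-Binomial pair below is a standard textbook example rather than one taken from the paper (counts are invented):

    ```python
    # Conjugate-prior example: Beta prior + Binomial likelihood -> Beta posterior.
    from scipy import stats

    a0, b0 = 1.0, 1.0          # uniform Beta(1, 1) prior on an efficiency
    successes, trials = 7, 10  # observed detector counts (made-up numbers)

    a, b = a0 + successes, b0 + trials - successes   # posterior is Beta(a, b)
    posterior = stats.beta(a, b)

    print(posterior.mean())                  # posterior expectation
    print(posterior.interval(0.95))          # 95% credible interval
    ```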

  16. Evaluation of artificial time series microarray data for dynamic gene regulatory network inference.

    Science.gov (United States)

    Xenitidis, P; Seimenis, I; Kakolyris, S; Adamopoulos, A

    2017-08-07

    High-throughput technology like microarrays is widely used in the inference of gene regulatory networks (GRNs). We focused on time series data since we are interested in the dynamics of GRNs and the identification of dynamic networks. We evaluated the amount of information that exists in artificial time series microarray data and the ability of an inference process to produce accurate models based on them. We used dynamic artificial gene regulatory networks in order to create artificial microarray data. Key features that characterize microarray data, such as the time separation of directly triggered genes, the percentage of directly triggered genes, and the triggering function type, were altered in order to reveal the limits that the nature of microarray data imposes on the inference process. We examined the effect of various factors on inference performance, such as network size, the presence of noise in the microarray data, and network sparseness. We used a system theory approach and examined the relationship between the pole placement of the inferred system and the inference performance. We also examined the relationship between inference performance in the time domain and true system parameter identification. Simulation results indicated that time separation and the percentage of directly triggered genes are crucial factors; network sparseness, the triggering function type and noise in the input data also affect inference performance. When two factors were varied simultaneously, variation of one parameter was found to significantly affect the dynamic response of the other. Crucial factors were also examined using a real GRN, and the acquired results confirmed the simulation findings with artificial data. Different initial conditions were also used as an alternative triggering approach; the relevant results confirmed that the number of datasets constitutes the most significant parameter with regard to inference performance.
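    The system-theory viewpoint mentioned above (poles of the inferred system) can be illustrated by fitting a linear dynamic model to time-series snapshots and inspecting its eigenvalues; the toy network below is an assumption, not the paper's data generator:

    ```python
    # Fit x[k+1] = A x[k] to time-series data and inspect the poles of A.
    import numpy as np

    rng = np.random.default_rng(0)
    A_true = np.array([[0.9, 0.1, 0.0],
                       [0.0, 0.8, 0.1],
                       [0.1, 0.0, 0.7]])    # a stable 3-gene toy network

    X = np.zeros((50, 3))
    X[0] = rng.random(3)
    for k in range(49):                      # simulate expression time series
        X[k + 1] = A_true @ X[k] + 0.01 * rng.normal(size=3)

    # Least-squares estimate of A from consecutive snapshots
    M, *_ = np.linalg.lstsq(X[:-1], X[1:], rcond=None)
    A_hat = M.T

    poles = np.linalg.eigvals(A_hat)
    print(np.abs(poles))                     # |pole| < 1 => inferred system stable
    ```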

  17. More than one kind of inference: re-examining what's learned in feature inference and classification.

    Science.gov (United States)

    Sweller, Naomi; Hayes, Brett K

    2010-08-01

    Three studies examined how task demands that impact on attention to typical or atypical category features shape the category representations formed through classification learning and inference learning. During training categories were learned via exemplar classification or by inferring missing exemplar features. In the latter condition inferences were made about missing typical features alone (typical feature inference) or about both missing typical and atypical features (mixed feature inference). Classification and mixed feature inference led to the incorporation of typical and atypical features into category representations, with both kinds of features influencing inferences about familiar (Experiments 1 and 2) and novel (Experiment 3) test items. Those in the typical inference condition focused primarily on typical features. Together with formal modelling, these results challenge previous accounts that have characterized inference learning as producing a focus on typical category features. The results show that two different kinds of inference learning are possible and that these are subserved by different kinds of category representations.

  18. Perceptual inference.

    Science.gov (United States)

    Aggelopoulos, Nikolaos C

    2015-08-01

    Perceptual inference refers to the ability to infer sensory stimuli from predictions that result from internal neural representations built through prior experience. Methods of Bayesian statistical inference and decision theory model cognition adequately by using error sensing either in guiding action or in "generative" models that predict the sensory information. In this framework, perception can be seen as a process qualitatively distinct from sensation, a process of information evaluation using previously acquired and stored representations (memories) that is guided by sensory feedback. The stored representations can be utilised as internal models of sensory stimuli enabling long term associations, for example in operant conditioning. Evidence for perceptual inference is contributed by such phenomena as the cortical co-localisation of object perception with object memory, the response invariance in the responses of some neurons to variations in the stimulus, as well as from situations in which perception can be dissociated from sensation. In the context of perceptual inference, sensory areas of the cerebral cortex that have been facilitated by a priming signal may be regarded as comparators in a closed feedback loop, similar to the better known motor reflexes in the sensorimotor system. The adult cerebral cortex can be regarded as similar to a servomechanism, in using sensory feedback to correct internal models, producing predictions of the outside world on the basis of past experience.

  19. Efficient Exact Inference With Loss Augmented Objective in Structured Learning.

    Science.gov (United States)

    Bauer, Alexander; Nakajima, Shinichi; Muller, Klaus-Robert

    2016-08-19

    Structural support vector machine (SVM) is an elegant approach for building complex and accurate models with structured outputs. However, its applicability relies on the availability of efficient inference algorithms--the state-of-the-art training algorithms repeatedly perform inference to compute a subgradient or to find the most violating configuration. In this paper, we propose an exact inference algorithm for maximizing nondecomposable objectives arising from a special type of high-order potential with a decomposable internal structure. As an important application, our method covers loss augmented inference, which enables the slack and margin scaling formulations of structural SVM with a variety of dissimilarity measures, e.g., Hamming loss, precision and recall, Fβ-loss, intersection over union, and many other functions that can be efficiently computed from the contingency table. We demonstrate the advantages of our approach in natural language parsing and sequence segmentation applications.

  20. A general Bayes Weibull inference model for accelerated life testing

    International Nuclear Information System (INIS)

    Dorp, J. Rene van; Mazzuchi, Thomas A.

    2005-01-01

    This article presents the development of a general Bayes inference model for accelerated life testing. The failure times at a constant stress level are assumed to belong to a Weibull distribution, but the specification of strict adherence to a parametric time-transformation function is not required. Rather, prior information is used to indirectly define a multivariate prior distribution for the scale parameters at the various stress levels and the common shape parameter. Using the approach, Bayes point estimates as well as probability statements for use-stress (and accelerated) life parameters may be inferred from a host of testing scenarios. The inference procedure accommodates both the interval data sampling strategy and type I censored sampling strategy for the collection of ALT test data. The inference procedure uses the well-known MCMC (Markov Chain Monte Carlo) methods to derive posterior approximations. The approach is illustrated with an example
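    As a stand-in for the paper's full multivariate-prior, multi-stress MCMC scheme, a minimal Metropolis sampler for the Weibull parameters at a single stress level might look like this (synthetic data, the flat positive prior, and proposal scales are all assumptions):

    ```python
    # Minimal Metropolis sampler for Weibull (shape, scale); illustrative only.
    import numpy as np

    rng = np.random.default_rng(0)
    data = rng.weibull(1.5, size=30) * 100.0      # synthetic failure times

    def log_post(shape, scale):
        if shape <= 0 or scale <= 0:
            return -np.inf                        # vague positive prior
        z = data / scale
        # Weibull log-likelihood: sum log[(k/s)(t/s)^(k-1) exp(-(t/s)^k)]
        return np.sum(np.log(shape / scale) + (shape - 1) * np.log(z) - z ** shape)

    samples, theta = [], np.array([1.0, 50.0])
    lp = log_post(*theta)
    for _ in range(5000):
        prop = theta + rng.normal(scale=[0.1, 5.0])
        lp_prop = log_post(*prop)
        if np.log(rng.random()) < lp_prop - lp:   # Metropolis accept step
            theta, lp = prop, lp_prop
        samples.append(theta)

    print(np.mean(samples[1000:], axis=0))        # posterior mean (shape, scale)
    ```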

  1. Automatic weld torch guidance control system

    Science.gov (United States)

    Smaith, H. E.; Wall, W. A.; Burns, M. R., Jr.

    1982-01-01

    A highly reliable, fully digital, closed-circuit television optical type automatic weld seam tracking control system was developed. This automatic tracking equipment is used to reduce weld tooling costs and increase overall automatic welding reliability. The system utilizes a charge injection device digital camera which has 60,512 individual pixels as the light-sensing elements. Through conventional scanning means, each pixel in the focal plane is sequentially scanned, the light level signal digitized, and an 8-bit word transmitted to scratch pad memory. From memory, the microprocessor performs an analysis of the digital signal and computes the tracking error. Lastly, the corrective signal is transmitted to a cross-seam actuator digital drive motor controller to complete the closed-loop feedback tracking system. This weld seam tracking control system is capable of a tracking accuracy of ±0.2 mm or better. As configured, the system is applicable to square butt, V-groove, and lap joint weldments.

  2. Automation of chromosomes analysis. Automatic system for image processing

    International Nuclear Information System (INIS)

    Le Go, R.; Cosnac, B. de; Spiwack, A.

    1975-01-01

    The A.S.T.I. is an automatic system for the fast conversational processing of all kinds of images (cells, chromosomes) converted to a numerical data set (120,000 points, 16 grey levels stored in a MOS memory) through a fast D.O. analyzer. The system automatically performs the isolation of any individual image, whose area and weighted area are computed. These results are directly displayed on the command panel and can be transferred to a mini-computer for further computations. A bright spot allows parts of an image to be picked out and the results to be displayed. This study is particularly directed towards automatic karyotyping [fr]

  3. Text comprehension in children: comparing different classes of inferences by using on-line methodology / Compreensão de texto em crianças: comparações entre diferentes classes de inferência a partir de uma metodologia on-line

    Directory of Open Access Journals (Sweden)

    Alina Galvão Spinillo

    2007-01-01

    Full Text Available This study, using an on-line methodology, examined 7- and 9-year-old children's text comprehension in relation to different types of inferences constructed during a story reading task: causal inferences, state inferences and inferences of prediction (what happens next in the story). The on-line methodology consists of asking the child inferential questions during text comprehension, immediately after the subject has read a passage. Because inferences of prediction involve extratextual information and require raising hypotheses about the continuity of the narrative, children had difficulties in predicting events that had not yet occurred in the story. It was concluded that the ability to make inferences during text comprehension varies according to the type of inferential question presented and that this ability develops with age. The innovative aspect of the on-line methodology and its relevance to research on text comprehension are discussed.

  4. Multimodel inference and adaptive management

    Science.gov (United States)

    Rehme, S.E.; Powell, L.A.; Allen, Craig R.

    2011-01-01

    Ecology is an inherently complex science coping with correlated variables, nonlinear interactions and multiple scales of pattern and process, making it difficult for experiments to result in clear, strong inference. Natural resource managers, policy makers, and stakeholders rely on science to provide timely and accurate management recommendations. However, the time necessary to untangle the complexities of interactions within ecosystems is often far greater than the time available to make management decisions. One method of coping with this problem is multimodel inference. Multimodel inference assesses uncertainty by calculating likelihoods among multiple competing hypotheses, but multimodel inference results are often equivocal. Despite this, there may be pressure for ecologists to provide management recommendations regardless of the strength of their study’s inference. We reviewed papers in the Journal of Wildlife Management (JWM) and the journal Conservation Biology (CB) to quantify the prevalence of multimodel inference approaches, the resulting inference (weak versus strong), and how authors dealt with the uncertainty. Thirty-eight percent and 14%, respectively, of articles in the JWM and CB used multimodel inference approaches. Strong inference was rarely observed, with only 7% of JWM and 20% of CB articles resulting in strong inference. We found the majority of weak inference papers in both journals (59%) gave specific management recommendations. Model selection uncertainty was ignored in most recommendations for management. We suggest that adaptive management is an ideal method to resolve uncertainty when research results in weak inference.
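    The quantity underlying most multimodel-inference work of this kind is the set of Akaike weights computed from AIC differences; a short sketch with invented AIC values:

    ```python
    # Akaike weights from AIC differences (numbers are made up).
    import numpy as np

    aic = np.array([210.3, 211.1, 214.8, 220.0])   # AIC of competing models
    delta = aic - aic.min()                         # AIC differences
    weights = np.exp(-0.5 * delta)
    weights /= weights.sum()                        # relative model support

    print(weights.round(3))   # no dominant model here => weak inference
    ```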

  5. Type-assisted automatic garbage collection for lock-free data structures

    OpenAIRE

    Yang, Albert Mingkun; Wrigstad, Tobias

    2017-01-01

    We introduce Isolde, an automatic garbage collection scheme designed specifically for managing memory in lock-free data structures, such as stacks, lists, maps and queues. Isolde exists as a plug-in memory manager, designed to sit on top of another memory manager and use its allocator and reclaimer (if one exists). Isolde treats a lock-free data structure as a logical heap, isolated from the rest of the program. This allows garbage collection outside of Isolde to take place without affecting th...

  6. Advanced computer-controlled automatic alpha-beta air sample counter

    International Nuclear Information System (INIS)

    Howell, W.P.; Bruinekool, D.J.; Stapleton, E.E.

    1983-01-01

    An improved computer-controlled automatic alpha-beta air sample counter was developed, based upon an earlier automatic air sample counter design. The system consists of an automatic sample changer, an electronic counting system utilizing a large silicon diode detector, a small desk-type microcomputer, a high-speed matrix printer, and the necessary data interfaces. The system is operated by commands from the keyboard and programs stored on magnetic tape cassettes. The programs provide for background counting, the chi-square test, radon subtraction, and sample counting for sample periods of one day to one week. Output data are printed by the matrix printer on standard multifold paper. The data output includes gross beta, gross alpha, and plutonium results. Data are automatically corrected for background, counter efficiency, and, in the gross alpha and plutonium channels, for the presence of radon.

  7. Modeling and control of an unstable system using probabilistic fuzzy inference system

    Directory of Open Access Journals (Sweden)

    Sozhamadevi N.

    2015-09-01

    Full Text Available A new type of Fuzzy Inference System is proposed: a Probabilistic Fuzzy Inference System that models and minimizes the effects of statistical uncertainties. The blend of two different concepts, degree of truth and probability of truth, in a unique framework leads to this new concept. This combination is carried out in both fuzzy sets and fuzzy rules, giving rise to Probabilistic Fuzzy Sets and Probabilistic Fuzzy Rules. Introducing these probabilistic elements, a distinctive probabilistic fuzzy inference system is developed, involving fuzzification, inference and output processing. This integrated approach accounts for all of the uncertainty present in the systems, such as rule uncertainties and measurement uncertainties, and leads to a design that performs optimally after training. In this paper, a Probabilistic Fuzzy Inference System is applied to the modeling and control of a highly nonlinear, unstable system, and its effectiveness is demonstrated.

  8. Inferring Group Processes from Computer-Mediated Affective Text Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Schryver, Jack C [ORNL; Begoli, Edmon [ORNL; Jose, Ajith [Missouri University of Science and Technology; Griffin, Christopher [Pennsylvania State University

    2011-02-01

    Political communications in the form of unstructured text convey rich connotative meaning that can reveal underlying group social processes. Previous research has focused on sentiment analysis at the document level, but we extend this analysis to sub-document levels through a detailed analysis of affective relationships between entities extracted from a document. Instead of pure sentiment analysis, which is just positive or negative, we explore nuances of affective meaning in 22 affect categories. Our affect propagation algorithm automatically calculates and displays extracted affective relationships among entities in graphical form in our prototype (TEAMSTER), starting with seed lists of affect terms. Several useful metrics are defined to infer underlying group processes by aggregating affective relationships discovered in a text. Our approach has been validated with annotated documents from the MPQA corpus, achieving a performance gain of 74% over comparable random guessers.

  9. Automatic design of magazine covers

    Science.gov (United States)

    Jahanian, Ali; Liu, Jerry; Tretter, Daniel R.; Lin, Qian; Damera-Venkata, Niranjan; O'Brien-Strain, Eamonn; Lee, Seungyon; Fan, Jian; Allebach, Jan P.

    2012-03-01

    In this paper, we propose a system for automatic design of magazine covers that quantifies a number of concepts from art and aesthetics. Our solution to automatic design of this type of media has been shaped by input from professional designers, magazine art directors and editorial boards, and journalists. Consequently, a number of principles in design and rules in designing magazine covers are delineated. Several techniques are derived and employed in order to quantify and implement these principles and rules in the format of a software framework. At this stage, our framework divides the task of design into three main modules: layout of magazine cover elements, choice of color for masthead and cover lines, and typography of cover lines. Feedback from professional designers on our designs suggests that our results are congruent with their intuition.

  10. Optimal inference with suboptimal models: Addiction and active Bayesian inference

    Science.gov (United States)

    Schwartenbeck, Philipp; FitzGerald, Thomas H.B.; Mathys, Christoph; Dolan, Ray; Wurst, Friedrich; Kronbichler, Martin; Friston, Karl

    2015-01-01

    When casting behaviour as active (Bayesian) inference, optimal inference is defined with respect to an agent’s beliefs – based on its generative model of the world. This contrasts with normative accounts of choice behaviour, in which optimal actions are considered in relation to the true structure of the environment – as opposed to the agent’s beliefs about worldly states (or the task). This distinction shifts an understanding of suboptimal or pathological behaviour away from aberrant inference as such, to understanding the prior beliefs of a subject that cause them to behave less ‘optimally’ than our prior beliefs suggest they should behave. Put simply, suboptimal or pathological behaviour does not speak against understanding behaviour in terms of (Bayes optimal) inference, but rather calls for a more refined understanding of the subject’s generative model upon which their (optimal) Bayesian inference is based. Here, we discuss this fundamental distinction and its implications for understanding optimality, bounded rationality and pathological (choice) behaviour. We illustrate our argument using addictive choice behaviour in a recently described ‘limited offer’ task. Our simulations of pathological choices and addictive behaviour also generate some clear hypotheses, which we hope to pursue in ongoing empirical work. PMID:25561321

  11. A consideration of the operation of automatic production machines.

    Science.gov (United States)

    Hoshi, Toshiro; Sugimoto, Noboru

    2015-01-01

    At worksites, various automatic production machines are in use to release workers from muscular labor or labor in detrimental environments. On the other hand, a large number of industrial accidents have been caused by automatic production machines. In view of this, this paper considers the operation of automatic production machines from the viewpoint of accident prevention and points out two types of machine operation: operation for which quick performance is required (operation that must not be delayed), and operation for which composed performance is required (operation that must not be performed in haste). These operations are distinguished by operation buttons of suitable colors and shapes. This paper shows that these characteristics are evaluated as "asymmetric on the time-axis". For workers to accept the risk of automatic production machines, it is generally a precondition that harm be sufficiently small or easily avoided. In this connection, this paper shows the possibility of facilitating the acceptance of the risk of automatic production machines by enhancing the asymmetry on the time-axis.

  12. Inference rule and problem solving

    Energy Technology Data Exchange (ETDEWEB)

    Goto, S

    1982-04-01

    Intelligent information processing refers to having man's intellectual activity executed on a computer, with inference, in place of ordinary calculation, as the basic operational mechanism of such information processing. Many inference rules are derived from the syllogisms of formal logic. The problem of programming this inference function is referred to as problem solving. Although inference and problem solving are closely related logically, the computational ability of current computers for inference remains at a low level. For clarifying the relation between inference and computers, nonmonotonic logic has been considered. The paper deals with the above topics. 16 references.

  13. Inference of gene-phenotype associations via protein-protein interaction and orthology.

    Directory of Open Access Journals (Sweden)

    Panwen Wang

    Full Text Available One of the fundamental goals of genetics is to understand gene functions and their associated phenotypes. To achieve this goal, in this study we developed a computational algorithm that uses orthology and protein-protein interaction information to infer gene-phenotype associations for multiple species. Furthermore, we developed a web server that provides genome-wide phenotype inference for six species: fly, human, mouse, worm, yeast, and zebrafish. We evaluated our inference method by comparing the inferred results with known gene-phenotype associations. The high Area Under the Curve values indicate the strong performance of our method. By applying our method to two representative human diseases, Type 2 Diabetes and Breast Cancer, we demonstrated that our method is able to identify related Gene Ontology terms and Kyoto Encyclopedia of Genes and Genomes pathways. The web server can be used to infer functions and putative phenotypes of a gene, along with the candidate genes of a phenotype, and thus aids in disease candidate gene discovery. Our web server is available at http://jjwanglab.org/PhenoPPIOrth.

  14. Automatic Tuning of Control Parameters for Single Speed Engines

    OpenAIRE

    Olsson, Johan

    2004-01-01

    In Scania’s single speed engines for industrial and marine use, the engine speed is controlled by a PI-controller. This controller is tuned independently of engine type and application, which brings certain disadvantages since the engines are used in a wide range of applications where the dynamics may differ. In this thesis, the possibility of tuning the controller automatically for a specific engine installation has been investigated. The work shows that automatic tuning is possible. By performin...
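    The thesis's own tuning procedure is not described in this record; one common route to automatic PI tuning, shown purely for illustration, is to measure the ultimate gain and period (e.g. via a relay experiment) and apply Ziegler-Nichols rules:

    ```python
    # Ziegler-Nichols PI settings from ultimate gain Ku and period Tu.
    # Illustrative only; the thesis's actual tuning method may differ.
    def zn_pi(Ku, Tu):
        Kp = 0.45 * Ku            # proportional gain
        Ti = Tu / 1.2             # integral time
        return Kp, Kp / Ti        # (Kp, Ki) with Ki = Kp / Ti

    Kp, Ki = zn_pi(Ku=8.0, Tu=2.5)   # assumed measured values for one engine
    print(Kp, Ki)
    ```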

  15. IERIAS: inference engine for reactor accident diagnostic system using knowledge engineering technique

    International Nuclear Information System (INIS)

    Yokobayashi, Masao; Yoshida, Kazuo; Kohsaka, Atsuo; Yamamoto, Minoru.

    1984-11-01

    This report describes an inference engine, IERIAS, which has been developed for a diagnostic system to identify the cause and type of an abnormal transient of a reactor plant. This system, using knowledge engineering techniques, consists of a knowledge base and an inference engine. The inference engine IERIAS is designed to treat time-varying plant data. The major features of IERIAS are: (1) the history of transients can be treated; (2) the knowledge base can be divided into several knowledge units; (3) the programming language UTILISP is used, which is suitable for symbolic data manipulation. Inference was made using IERIAS with a knowledge base created from the simulated results of various transients from a PWR plant simulator. The results showed the good applicability of IERIAS to reactor diagnosis. (author)

  16. Knowledge and inference

    CERN Document Server

    Nagao, Makoto

    1990-01-01

    Knowledge and Inference discusses an important problem for software systems: How do we treat knowledge and ideas on a computer and how do we use inference to solve problems on a computer? The book talks about the problems of knowledge and inference for the purpose of merging artificial intelligence and library science. The book begins by clarifying the concept of "knowledge" from many points of view, followed by a chapter on the current state of library science and the place of artificial intelligence in library science. Subsequent chapters cover central topics in the artificial intellig...

  17. Automatic accounting of nuclear materials at WWER type reactor NPPs

    International Nuclear Information System (INIS)

    Babaev, N.S.; Poznyakov, N.L.; Strelkov, D.F.

    1978-01-01

    The possibilities of automatic accounting of nuclear materials at NPPs based on WWER reactors are considered. Organizational and technical principles of an automated accounting system that takes into consideration IAEA requirements for accounting documentation are proposed. A program for accounting materials using a BESM-6 computer is described. Operation of the program requires that all accounting data be recorded on conventional carriers of computer information (magnetic tapes, discs, perforated cards), which constitute the basic NPP accounting documents and may be used directly as initial data for a corresponding information program.

  18. Geometric statistical inference

    International Nuclear Information System (INIS)

    Periwal, Vipul

    1999-01-01

    A reparametrization-covariant formulation of the inverse problem of probability is explicitly solved for finite sample sizes. The inferred distribution is explicitly continuous for finite sample size. A geometric solution of the statistical inference problem in higher dimensions is outlined

  19. Applications of automatic differentiation in topology optimization

    DEFF Research Database (Denmark)

    Nørgaard, Sebastian A.; Sagebaum, Max; Gauger, Nicolas R.

    2017-01-01

    and is demonstrated on two separate, previously published types of problems in topology optimization. Two separate software packages for automatic differentiation, CoDiPack and Tapenade, are considered, and their performance and usability trade-offs are discussed and compared to a hand-coded adjoint gradient...

  20. A parameter-adaptive dynamic programming approach for inferring cophylogenies

    DEFF Research Database (Denmark)

    Merkle, Daniel; Middendorf, Martin; Wieseke, Nicolas

    2010-01-01

    Background: Coevolutionary systems like hosts and their parasites are commonly used model systems for evolutionary studies. Inferring the coevolutionary history based on given phylogenies of both groups is often done by employing a set of possible types of events that happened during coevolution....

  1. High-resolution magnetic resonance imaging reveals nuclei of the human amygdala: manual segmentation to automatic atlas

    DEFF Research Database (Denmark)

    Saygin, Z M; Kliemann, D; Iglesias, J. E.

    2017-01-01

    The amygdala is composed of multiple nuclei with unique functions and connections in the limbic system and to the rest of the brain. However, standard in vivo neuroimaging tools to automatically delineate the amygdala into its multiple nuclei are still rare. By scanning postmortem specimens at high...... resolution (100-150µm) at 7T field strength (n = 10), we were able to visualize and label nine amygdala nuclei (anterior amygdaloid, cortico-amygdaloid transition area; basal, lateral, accessory basal, central, cortical medial, paralaminar nuclei). We created an atlas from these labels using a recently...... developed atlas building algorithm based on Bayesian inference. This atlas, which will be released as part of FreeSurfer, can be used to automatically segment nine amygdala nuclei from a standard resolution structural MR image. We applied this atlas to two publicly available datasets (ADNI and ABIDE...

  2. Inferring biological tasks using Pareto analysis of high-dimensional data.

    Science.gov (United States)

    Hart, Yuval; Sheftel, Hila; Hausser, Jean; Szekely, Pablo; Ben-Moshe, Noa Bossel; Korem, Yael; Tendler, Avichai; Mayo, Avraham E; Alon, Uri

    2015-03-01

    We present the Pareto task inference method (ParTI; http://www.weizmann.ac.il/mcb/UriAlon/download/ParTI) for inferring biological tasks from high-dimensional biological data. Data are described as a polytope, and features maximally enriched closest to the vertices (or archetypes) allow identification of the tasks the vertices represent. We demonstrate that human breast tumors and mouse tissues are well described by tetrahedrons in gene expression space, with specific tumor types and biological functions enriched at each of the vertices, suggesting four key tasks.

  3. Development of automatic facilities for ZEPHYR

    International Nuclear Information System (INIS)

    Eder, O.; Lackner, E.; Pohl, F.; Schilling, H.B.

    1982-04-01

    This concept of remotely controlled facilities for repair and maintenance tasks inside the ZEPHYR vacuum vessel uses a supporting structure to insert various types of mobile automatic devices, which are guided by an egg-shaped disc that is part of the supporting structure. Considerations of adapting the guiding disc to the vessel contour are included. (orig.)

  4. Goal inferences about robot behavior : goal inferences and human response behaviors

    NARCIS (Netherlands)

    Broers, H.A.T.; Ham, J.R.C.; Broeders, R.; De Silva, P.; Okada, M.

    2014-01-01

    This explorative research focused on the goal inferences human observers draw based on a robot's behavior, and the extent to which those inferences predict people's behavior in response to that robot. Results show that different robot behaviors cause different response behavior from people.

  5. Gene regulatory network inference by point-based Gaussian approximation filters incorporating the prior information.

    Science.gov (United States)

    Jia, Bin; Wang, Xiaodong

    2013-12-17

    The extended Kalman filter (EKF) has been applied to inferring gene regulatory networks. However, it is well known that the EKF becomes less accurate when the system exhibits high nonlinearity. In addition, certain prior information about the gene regulatory network exists in practice, and no systematic approach has been developed to incorporate such prior information into the Kalman-type filter for inferring the structure of the gene regulatory network. In this paper, an inference framework based on point-based Gaussian approximation filters that can exploit the prior information is developed to solve the gene regulatory network inference problem. Different point-based Gaussian approximation filters, including the unscented Kalman filter (UKF), the third-degree cubature Kalman filter (CKF3), and the fifth-degree cubature Kalman filter (CKF5), are employed. Several types of network prior information, including the existing network structure information, sparsity assumption, and the range constraint of parameters, are considered, and the corresponding filters incorporating the prior information are developed. Experiments on a synthetic network of eight genes and the yeast protein synthesis network of five genes are carried out to demonstrate the performance of the proposed framework. The results show that the proposed methods provide more accurate inference results than existing methods, such as the EKF and the traditional UKF.
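    All of the point-based filters named in this record share one building block: instead of linearising the nonlinearity as the EKF does, they push a small set of deterministically chosen sigma points through it and re-average. As a hedged illustration only (not the authors' code; the array shapes, weight constants and the toy sigmoid nonlinearity are assumptions), a minimal numpy sketch of the unscented transform underlying the UKF:

```python
import numpy as np

def unscented_transform(mu, P, f, alpha=1.0, beta=2.0, kappa=1.0):
    """Propagate a Gaussian N(mu, P) through a nonlinearity f by pushing
    2n+1 deterministically chosen sigma points through f and re-averaging,
    instead of linearising f as the EKF does."""
    n = len(mu)
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)            # matrix square root
    sigma = np.vstack([mu, mu + S.T, mu - S.T])      # shape (2n+1, n)
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))   # mean weights
    wc = wm.copy()                                   # covariance weights
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
    Y = np.array([f(s) for s in sigma])              # transformed points
    mu_y = wm @ Y
    Pyy = (wc[:, None] * (Y - mu_y)).T @ (Y - mu_y)
    return mu_y, Pyy

# Toy nonlinear "measurement" of two gene activities (purely illustrative)
f = lambda x: 1.0 / (1.0 + np.exp(-x))               # sigmoid activation
mu_y, Pyy = unscented_transform(np.array([0.5, -0.2]), 0.1 * np.eye(2), f)
```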

  6. Automatic topics segmentation for TV news video

    Science.gov (United States)

    Hmayda, Mounira; Ejbali, Ridha; Zaied, Mourad

    2017-03-01

    Automatic identification of television programs in the TV stream is an important task for operating archives. This article proposes a new spatio-temporal approach to identifying the programs in a TV stream in two main steps. First, a reference catalogue of video features for visual jingles is built. We exploit the features that characterize instances of the same program type to identify the different types of programs in the television stream. The role of the video features is to represent the visual invariants of each visual jingle using appropriate automatic descriptors for each television program. Second, programs in the television stream are identified by examining the similarity of the video signal to the visual jingles in the catalogue. The main idea of the identification process is to compare the visual similarity of the video signal features in the television stream to the catalogue. After presenting the proposed approach, the paper reports encouraging experimental results on several streams extracted from different channels and composed of several programs.

  7. A canonical correlation analysis-based dynamic bayesian network prior to infer gene regulatory networks from multiple types of biological data.

    Science.gov (United States)

    Baur, Brittany; Bozdag, Serdar

    2015-04-01

    One of the challenging and important computational problems in systems biology is to infer gene regulatory networks (GRNs) of biological systems. Several methods that exploit gene expression data have been developed to tackle this problem. In this study, we propose the use of copy number and DNA methylation data to infer GRNs. We developed an algorithm that scores regulatory interactions between genes based on canonical correlation analysis. In this algorithm, copy number or DNA methylation variables are treated as potential regulator variables, and expression variables are treated as potential target variables. We first validated that the canonical correlation analysis method is able to infer true interactions with high accuracy. We showed that the use of DNA methylation or copy number datasets leads to improved inference over steady-state expression. Our results also showed that epigenetic and structural information could be used to infer directionality of regulatory interactions. Additional improvements in GRN inference can be gleaned from incorporating the result in an informative prior in a dynamic Bayesian algorithm. This is the first study that incorporates copy number and DNA methylation into an informative prior in dynamic Bayesian framework. By closely examining top-scoring interactions with different sources of epigenetic or structural information, we also identified potential novel regulatory interactions.
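    The scoring step described above can be sketched in a few lines. The snippet below is an illustrative stand-in, not the authors' algorithm: it scores one candidate regulator block (copy number or methylation variables) against one target block (expression variables) by the correlation of the first pair of canonical variates; the variable names and simulated data are assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

def cca_interaction_score(regulator, target):
    """Score a candidate regulatory interaction as the first canonical
    correlation between a regulator block (e.g. copy-number probes of
    gene A) and a target block (expression probes of gene B).
    Rows are samples, columns are variables."""
    cca = CCA(n_components=1)
    u, v = cca.fit_transform(regulator, target)
    # correlation of the first pair of canonical variates
    return abs(np.corrcoef(u[:, 0], v[:, 0])[0, 1])

rng = np.random.default_rng(0)
cn = rng.normal(size=(50, 3))                  # copy-number features, 50 samples
expr = cn @ rng.normal(size=(3, 2)) + 0.5 * rng.normal(size=(50, 2))
print(cca_interaction_score(cn, expr))         # close to 1 for a true interaction
```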

  8. An advanced computer-controlled automatic alpha-beta air sample counter

    International Nuclear Information System (INIS)

    Howell, W.P.; Bruinekool, D.J.; Stapleton, E.E.

    1984-01-01

    An improved computer-controlled automatic alpha-beta air sample counter was developed, based upon an earlier automatic air sample counter design. The system consists of an automatic sample changer, an electronic counting system utilizing a large silicon diode detector, a small desk-type microcomputer, a high-speed matrix printer and the necessary data interfaces. The system is operated by commands from the keyboard and programs stored on magnetic tape cassettes. The programs provide for background counting, a chi-square test, radon subtraction and sample counting for sample periods of one day to one week. Output data are printed by the matrix printer on standard multifold paper. The data output includes gross beta, gross alpha and plutonium results. Data are automatically corrected for background, counter efficiency, and, in the gross alpha and plutonium channels, for the presence of radon.

  9. Automatic differentiation bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Corliss, G.F. [comp.]

    1992-07-01

    This is a bibliography of work related to automatic differentiation. Automatic differentiation is a technique for the fast, accurate propagation of derivative values using the chain rule. It is neither symbolic nor numeric. Automatic differentiation is a fundamental tool for scientific computation, with applications in optimization, nonlinear equations, nonlinear least squares approximation, stiff ordinary differential equations, partial differential equations, continuation methods, and sensitivity analysis. This report is an updated version of the bibliography which originally appeared in Automatic Differentiation of Algorithms: Theory, Implementation, and Application.
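    The bibliography's one-line definition, fast and accurate propagation of derivative values via the chain rule, neither symbolic nor numeric, is easiest to see in forward mode with dual numbers. A minimal sketch, illustrative only and not tied to any package in the bibliography:

```python
from dataclasses import dataclass
import math

@dataclass
class Dual:
    """A value together with its derivative. Each operation applies the
    chain rule numerically: no symbolic expression is ever built, and no
    finite-difference approximation is made."""
    val: float
    der: float
    def __add__(self, o): return Dual(self.val + o.val, self.der + o.der)
    def __mul__(self, o): return Dual(self.val * o.val,
                                      self.der * o.val + self.val * o.der)

def dsin(x: Dual) -> Dual:
    # chain rule for sin: d/dt sin(x(t)) = cos(x) * x'
    return Dual(math.sin(x.val), math.cos(x.val) * x.der)

# d/dx [x * sin(x)] at x = 1.2: seed the derivative of x with 1.0
x = Dual(1.2, 1.0)
y = x * dsin(x)
print(y.val, y.der)   # value, and sin(1.2) + 1.2*cos(1.2)
```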

  10. Automatic generation of gene finders for eukaryotic species

    DEFF Research Database (Denmark)

    Terkelsen, Kasper Munch; Krogh, A.

    2006-01-01

    Background: The number of sequenced eukaryotic genomes is rapidly increasing. This means that over time it will be hard to keep supplying customised gene finders for each genome. This calls for procedures to automatically generate species-specific gene finders and to re-train them as the quantity and quality of reliable gene annotation grows. Results: We present a procedure, Agene, that automatically generates a species-specific gene predictor from a set of reliable mRNA sequences and a genome. We apply a Hidden Markov model (HMM) that implements explicit length distribution modelling for all gene structure blocks using acyclic discrete phase type distributions. The state structure of each HMM is generated dynamically from an array of sub-models to include only gene features represented in the training set. Conclusion: Acyclic discrete phase type distributions are well suited to model sequence...

  11. A combined deep-learning and deformable-model approach to fully automatic segmentation of the left ventricle in cardiac MRI.

    Science.gov (United States)

    Avendi, M R; Kheradvar, Arash; Jafarkhani, Hamid

    2016-05-01

    Segmentation of the left ventricle (LV) from cardiac magnetic resonance imaging (MRI) datasets is an essential step for calculation of clinical indices such as ventricular volume and ejection fraction. In this work, we employ deep learning algorithms combined with deformable models to develop and evaluate a fully automatic LV segmentation tool from short-axis cardiac MRI datasets. The method employs deep learning algorithms to learn the segmentation task from ground truth data. Convolutional networks are employed to automatically detect the LV chamber in the MRI dataset. Stacked autoencoders are used to infer the LV shape. The inferred shape is incorporated into deformable models to improve the accuracy and robustness of the segmentation. We validated our method using 45 cardiac MR datasets from the MICCAI 2009 LV segmentation challenge and showed that it outperforms the state-of-the-art methods. Excellent agreement with the ground truth was achieved. Validation metrics, percentage of good contours, Dice metric, average perpendicular distance and conformity, were computed as 96.69%, 0.94, 1.81 mm and 0.86, versus those of 79.2-95.62%, 0.87-0.9, 1.76-2.97 mm and 0.67-0.78, obtained by other methods, respectively. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. Entropic Inference

    OpenAIRE

    Caticha, Ariel

    2010-01-01

    In this tutorial we review the essential arguments behind entropic inference. We focus on the epistemological notion of information and its relation to the Bayesian beliefs of rational agents. The problem of updating from a prior to a posterior probability distribution is tackled through an eliminative induction process that singles out the logarithmic relative entropy as the unique tool for inference. The resulting method of Maximum relative Entropy (ME) includes as special cases both MaxEn...

  13. How CBO Estimates Automatic Stabilizers

    Science.gov (United States)

    2015-11-01

    the economy. Most types of revenues—mainly personal, corporate, and social insurance taxes—are sensitive to the business cycle and account for most of... Medicare taxes for self-employed people, taxes on production and imports, and unemployment insurance taxes. Those six categories account for the bulk of... federal tax revenues. Individual taxes account for most of the automatic stabilizers from revenues, followed by Social Security plus Medicare

  14. Conversion of KEGG metabolic pathways to SBGN maps including automatic layout.

    Science.gov (United States)

    Czauderna, Tobias; Wybrow, Michael; Marriott, Kim; Schreiber, Falk

    2013-08-16

    Biologists make frequent use of databases containing large and complex biological networks. One popular database is the Kyoto Encyclopedia of Genes and Genomes (KEGG) which uses its own graphical representation and manual layout for pathways. While some general drawing conventions exist for biological networks, arbitrary graphical representations are very common. Recently, a new standard has been established for displaying biological processes, the Systems Biology Graphical Notation (SBGN), which aims to unify the look of such maps. Ideally, online repositories such as KEGG would automatically provide networks in a variety of notations including SBGN. Unfortunately, this is non-trivial, since converting between notations may add, remove or otherwise alter map elements so that the existing layout cannot be simply reused. Here we describe a methodology for automatic translation of KEGG metabolic pathways into the SBGN format. We infer important properties of the KEGG layout and treat these as layout constraints that are maintained during the conversion to SBGN maps. This allows for the drawing and layout conventions of SBGN to be followed while creating maps that are still recognizably the original KEGG pathways. This article details the steps in this process and provides examples of the final result.

  15. Learning Convex Inference of Marginals

    OpenAIRE

    Domke, Justin

    2012-01-01

    Graphical models trained using maximum likelihood are a common tool for probabilistic inference of marginal distributions. However, this approach suffers difficulties when either the inference process or the model is approximate. In this paper, the inference process is first defined to be the minimization of a convex function, inspired by free energy approximations. Learning is then done directly in terms of the performance of the inference process at univariate marginal prediction. The main ...

  16. Bayesian Estimation and Inference using Stochastic Hardware

    Directory of Open Access Journals (Sweden)

    Chetan Singh Thakur

    2016-03-01

    Full Text Available In this paper, we present the implementation of two types of Bayesian inference problems to demonstrate the potential of building probabilistic algorithms in hardware using a single set of building blocks with the ability to perform these computations in real time. The first implementation, referred to as the BEAST (Bayesian Estimation and Stochastic Tracker), demonstrates a simple problem where an observer uses an underlying Hidden Markov Model (HMM) to track a target in one dimension. In this implementation, sensors make noisy observations of the target position at discrete time steps. The tracker learns the transition model for target movement, and the observation model for the noisy sensors, and uses these to estimate the target position by solving the Bayesian recursive equation online. We show the tracking performance of the system and demonstrate how it can learn the observation model, the transition model, and the external distractor (noise) probability interfering with the observations. In the second implementation, referred to as the Bayesian INference in DAG (BIND), we show how inference can be performed in a Directed Acyclic Graph (DAG) using stochastic circuits. We show how these building blocks can be easily implemented using simple digital logic gates. An advantage of the stochastic electronic implementation is that it is robust to certain types of noise, which may become an issue in integrated circuit (IC) technology with feature sizes in the order of tens of nanometers due to their low noise margin, the effect of high-energy cosmic rays and the low supply voltage. In our framework, the flipping of random individual bits would not affect the system performance because information is encoded in a bit stream.

  17. Bayesian Estimation and Inference Using Stochastic Electronics.

    Science.gov (United States)

    Thakur, Chetan Singh; Afshar, Saeed; Wang, Runchun M; Hamilton, Tara J; Tapson, Jonathan; van Schaik, André

    2016-01-01

    In this paper, we present the implementation of two types of Bayesian inference problems to demonstrate the potential of building probabilistic algorithms in hardware using a single set of building blocks with the ability to perform these computations in real time. The first implementation, referred to as the BEAST (Bayesian Estimation and Stochastic Tracker), demonstrates a simple problem where an observer uses an underlying Hidden Markov Model (HMM) to track a target in one dimension. In this implementation, sensors make noisy observations of the target position at discrete time steps. The tracker learns the transition model for target movement, and the observation model for the noisy sensors, and uses these to estimate the target position by solving the Bayesian recursive equation online. We show the tracking performance of the system and demonstrate how it can learn the observation model, the transition model, and the external distractor (noise) probability interfering with the observations. In the second implementation, referred to as the Bayesian INference in DAG (BIND), we show how inference can be performed in a Directed Acyclic Graph (DAG) using stochastic circuits. We show how these building blocks can be easily implemented using simple digital logic gates. An advantage of the stochastic electronic implementation is that it is robust to certain types of noise, which may become an issue in integrated circuit (IC) technology with feature sizes in the order of tens of nanometers due to their low noise margin, the effect of high-energy cosmic rays and the low supply voltage. In our framework, the flipping of random individual bits would not affect the system performance because information is encoded in a bit stream.
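    The core computation both records describe, solving the Bayesian recursive equation online for a discrete HMM, fits in a few lines of ordinary software, which makes the contrast with the stochastic-hardware implementation concrete. A minimal numpy sketch (the grid size, transition matrix and Gaussian sensor likelihood are illustrative assumptions, not the paper's parameters):

```python
import numpy as np

def bayes_recursive_step(belief, A, likelihood):
    """One step of the Bayesian recursion for a discrete HMM:
    predict with the transition model A[i, j] = P(x_t = j | x_{t-1} = i),
    then update with the observation likelihood P(z_t | x_t = j)."""
    predicted = belief @ A
    posterior = predicted * likelihood
    return posterior / posterior.sum()

n = 20                                      # discretised 1-D positions
A = np.eye(n) * 0.6 + np.eye(n, k=1) * 0.2 + np.eye(n, k=-1) * 0.2
A /= A.sum(axis=1, keepdims=True)           # rows sum to 1 at the edges too
belief = np.full(n, 1.0 / n)                # flat prior over target position

obs, sigma = 8, 2.0                         # a noisy sensor reading
lik = np.exp(-0.5 * ((np.arange(n) - obs) / sigma) ** 2)
belief = bayes_recursive_step(belief, A, lik)
print(belief.argmax())                      # MAP estimate of target position
```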

  18. GAMBIT: the global and modular beyond-the-standard-model inference tool

    Science.gov (United States)

    Athron, Peter; Balazs, Csaba; Bringmann, Torsten; Buckley, Andy; Chrząszcz, Marcin; Conrad, Jan; Cornell, Jonathan M.; Dal, Lars A.; Dickinson, Hugh; Edsjö, Joakim; Farmer, Ben; Gonzalo, Tomás E.; Jackson, Paul; Krislock, Abram; Kvellestad, Anders; Lundberg, Johan; McKay, James; Mahmoudi, Farvah; Martinez, Gregory D.; Putze, Antje; Raklev, Are; Ripken, Joachim; Rogan, Christopher; Saavedra, Aldo; Savage, Christopher; Scott, Pat; Seo, Seon-Hee; Serra, Nicola; Weniger, Christoph; White, Martin; Wild, Sebastian

    2017-11-01

    We describe the open-source global fitting package GAMBIT: the Global And Modular Beyond-the-Standard-Model Inference Tool. GAMBIT combines extensive calculations of observables and likelihoods in particle and astroparticle physics with a hierarchical model database, advanced tools for automatically building analyses of essentially any model, a flexible and powerful system for interfacing to external codes, a suite of different statistical methods and parameter scanning algorithms, and a host of other utilities designed to make scans faster, safer and more easily-extendible than in the past. Here we give a detailed description of the framework, its design and motivation, and the current models and other specific components presently implemented in GAMBIT. Accompanying papers deal with individual modules and present first GAMBIT results. GAMBIT can be downloaded from gambit.hepforge.org.

  19. GAMBIT. The global and modular beyond-the-standard-model inference tool

    Energy Technology Data Exchange (ETDEWEB)

    Athron, Peter; Balazs, Csaba [Monash University, School of Physics and Astronomy, Melbourne, VIC (Australia); Australian Research Council Centre of Excellence for Particle Physics at the Tera-scale (Australia); Bringmann, Torsten; Dal, Lars A.; Gonzalo, Tomas E.; Krislock, Abram; Raklev, Are [University of Oslo, Department of Physics, Oslo (Norway); Buckley, Andy [University of Glasgow, SUPA, School of Physics and Astronomy, Glasgow (United Kingdom); Chrzaszcz, Marcin [Universitaet Zuerich, Physik-Institut, Zurich (Switzerland); Polish Academy of Sciences, H. Niewodniczanski Institute of Nuclear Physics, Krakow (Poland); Conrad, Jan; Edsjoe, Joakim; Farmer, Ben; Lundberg, Johan [AlbaNova University Centre, Oskar Klein Centre for Cosmoparticle Physics, Stockholm (Sweden); Stockholm University, Department of Physics, Stockholm (Sweden); Cornell, Jonathan M. [McGill University, Department of Physics, Montreal, QC (Canada); Dickinson, Hugh [University of Minnesota, Minnesota Institute for Astrophysics, Minneapolis, MN (United States); Jackson, Paul; White, Martin [Australian Research Council Centre of Excellence for Particle Physics at the Tera-scale (Australia); University of Adelaide, Department of Physics, Adelaide, SA (Australia); Kvellestad, Anders; Savage, Christopher [NORDITA, Stockholm (Sweden); McKay, James [Imperial College London, Blackett Laboratory, Department of Physics, London (United Kingdom); Mahmoudi, Farvah [Univ Lyon, Univ Lyon 1, ENS de Lyon, CNRS, Centre de Recherche Astrophysique de Lyon UMR5574, Saint-Genis-Laval (France); CERN, Theoretical Physics Department, Geneva (Switzerland); Martinez, Gregory D. [University of California, Physics and Astronomy Department, Los Angeles, CA (United States); Putze, Antje [LAPTh, Universite de Savoie, CNRS, Annecy-le-Vieux (France); Ripken, Joachim [Max Planck Institute for Solar System Research, Goettingen (Germany); Rogan, Christopher [Harvard University, Department of Physics, Cambridge, MA (United States); Saavedra, Aldo [Australian Research Council Centre of Excellence for Particle Physics at the Tera-scale (Australia); The University of Sydney, Faculty of Engineering and Information Technologies, Centre for Translational Data Science, School of Physics, Sydney, NSW (Australia); Scott, Pat [Imperial College London, Blackett Laboratory, Department of Physics, London (United Kingdom); Seo, Seon-Hee [Seoul National University, Department of Physics and Astronomy, Seoul (Korea, Republic of); Serra, Nicola [Universitaet Zuerich, Physik-Institut, Zurich (Switzerland); Weniger, Christoph [University of Amsterdam, GRAPPA, Institute of Physics, Amsterdam (Netherlands); Wild, Sebastian [DESY, Hamburg (Germany); Collaboration: The GAMBIT Collaboration

    2017-11-15

    We describe the open-source global fitting package GAMBIT: the Global And Modular Beyond-the-Standard-Model Inference Tool. GAMBIT combines extensive calculations of observables and likelihoods in particle and astroparticle physics with a hierarchical model database, advanced tools for automatically building analyses of essentially any model, a flexible and powerful system for interfacing to external codes, a suite of different statistical methods and parameter scanning algorithms, and a host of other utilities designed to make scans faster, safer and more easily-extendible than in the past. Here we give a detailed description of the framework, its design and motivation, and the current models and other specific components presently implemented in GAMBIT. Accompanying papers deal with individual modules and present first GAMBIT results. GAMBIT can be downloaded from gambit.hepforge.org. (orig.)

  20. GAMBIT. The global and modular beyond-the-standard-model inference tool

    International Nuclear Information System (INIS)

    Athron, Peter; Balazs, Csaba; Bringmann, Torsten; Dal, Lars A.; Gonzalo, Tomas E.; Krislock, Abram; Raklev, Are; Buckley, Andy; Chrzaszcz, Marcin; Conrad, Jan; Edsjoe, Joakim; Farmer, Ben; Lundberg, Johan; Cornell, Jonathan M.; Dickinson, Hugh; Jackson, Paul; White, Martin; Kvellestad, Anders; Savage, Christopher; McKay, James; Mahmoudi, Farvah; Martinez, Gregory D.; Putze, Antje; Ripken, Joachim; Rogan, Christopher; Saavedra, Aldo; Scott, Pat; Seo, Seon-Hee; Serra, Nicola; Weniger, Christoph; Wild, Sebastian

    2017-01-01

    We describe the open-source global fitting package GAMBIT: the Global And Modular Beyond-the-Standard-Model Inference Tool. GAMBIT combines extensive calculations of observables and likelihoods in particle and astroparticle physics with a hierarchical model database, advanced tools for automatically building analyses of essentially any model, a flexible and powerful system for interfacing to external codes, a suite of different statistical methods and parameter scanning algorithms, and a host of other utilities designed to make scans faster, safer and more easily-extendible than in the past. Here we give a detailed description of the framework, its design and motivation, and the current models and other specific components presently implemented in GAMBIT. Accompanying papers deal with individual modules and present first GAMBIT results. GAMBIT can be downloaded from gambit.hepforge.org. (orig.)

  1. Bayesian electron density inference from JET lithium beam emission spectra using Gaussian processes

    Science.gov (United States)

    Kwak, Sehyun; Svensson, J.; Brix, M.; Ghim, Y.-C.; JET Contributors

    2017-03-01

    A Bayesian model to infer edge electron density profiles is developed for the JET lithium beam emission spectroscopy (Li-BES) system, measuring Li I (2p-2s) line radiation using 26 channels with ∼1 cm spatial resolution and 10∼20 ms temporal resolution. The density profile is modelled using a Gaussian process prior, and the uncertainty of the density profile is calculated by a Markov Chain Monte Carlo (MCMC) scheme. From the spectra measured by the transmission grating spectrometer, the Li I line intensities are extracted and modelled as a function of the plasma density by a multi-state model which describes the relevant processes between neutral lithium beam atoms and plasma particles. The spectral model fully takes into account interference filter and instrument effects, which are separately estimated, again using Gaussian processes. The line intensities are inferred based on a spectral model consistent with the measured spectra within their uncertainties, which include photon statistics and electronic noise. Our newly developed method to infer JET edge electron density profiles has the following advantages in comparison to the conventional method: (i) it provides full posterior distributions of edge density profiles, including their associated uncertainties; (ii) the available radial range for density profiles is increased to the full observation range (∼26 cm); (iii) an assumption of a monotonic electron density profile is not necessary; (iv) the absolute calibration factor of the diagnostic system is automatically estimated, overcoming the limitation of the conventional technique and allowing us to infer the electron density profiles for all pulses without preprocessing the data or an additional boundary condition; and (v) since the full spectrum is modelled, the procedure of modulating the beam to measure the background signal is necessary only when the Li I line overlaps with impurity lines.
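    For readers unfamiliar with the Gaussian process prior at the heart of this model: the prior encodes smoothness of the density profile through a covariance kernel, and candidate profiles can be drawn from it before any likelihood is applied. A minimal sketch (the squared-exponential kernel, radial grid and length scale are illustrative assumptions, not the JET analysis settings):

```python
import numpy as np

def sq_exp_kernel(r, rp, sigma_f=1.0, ell=2.0):
    """Squared-exponential covariance between radial positions (in cm):
    nearby points are strongly correlated, so sampled profiles are smooth."""
    return sigma_f**2 * np.exp(-0.5 * ((r[:, None] - rp[None, :]) / ell) ** 2)

r = np.linspace(0.0, 26.0, 26)                    # ~26 channels over ~26 cm
K = sq_exp_kernel(r, r) + 1e-8 * np.eye(len(r))   # jitter for numerical stability
L = np.linalg.cholesky(K)

rng = np.random.default_rng(1)
samples = L @ rng.normal(size=(len(r), 5))        # five draws from the GP prior
# Each column is one smooth candidate profile; in the full Bayesian model
# such priors are combined with the spectral likelihood via MCMC.
```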

  2. A Type System for Tom

    Directory of Open Access Journals (Sweden)

    Claude Kirchner

    2010-03-01

    Full Text Available Extending a given language with new dedicated features is a general and widely used approach to making a programming language better adapted to its problems. Being closer to the application, this leads to fewer programming flaws and easier maintenance. But of course one would still like to perform program analysis on these kinds of extended languages, in particular type checking and inference. In this case one has to make the typing of the extended features compatible with the typing of the starting language. The Tom programming language is a typical example of such a situation, as it consists of an extension of Java that adds pattern matching, more particularly associative pattern matching, and reduction strategies. This paper presents a type system with subtyping for Tom, compatible with Java's type system, that performs both type checking and type inference. We propose an algorithm that checks whether all patterns of a Tom program are well-typed. In addition, we propose an algorithm based on equality and subtyping constraints that infers the types of variables occurring in a pattern. Both algorithms are exemplified, and the proposed type system is shown to be sound and complete.
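    Tom's inference algorithm solves equality and subtyping constraints gathered from patterns; the equality half is essentially unification. As a hedged sketch of that half only (subtyping and the occurs check are omitted, and the term encoding is invented for illustration):

```python
def is_var(t):
    # type variables are strings beginning with a tick: 'a, 'b, ...
    return isinstance(t, str) and t.startswith("'")

def resolve(t, subst):
    # follow chains of bindings for a top-level type variable
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def unify(t1, t2, subst):
    """Solve an equality constraint between type terms. Types are
    variables or tuples such as ('List', 'a); occurs check omitted."""
    t1, t2 = resolve(t1, subst), resolve(t2, subst)
    if t1 == t2:
        return subst
    if is_var(t1):
        return {**subst, t1: t2}
    if is_var(t2):
        return {**subst, t2: t1}
    if isinstance(t1, tuple) and isinstance(t2, tuple) and len(t1) == len(t2):
        for a, b in zip(t1, t2):
            subst = unify(a, b, subst)
        return subst
    raise TypeError(f"cannot unify {t1} with {t2}")

# Infer 'a from the pattern constraint List('a) = List(Int)
print(unify(('List', "'a"), ('List', 'Int'), {}))   # {"'a": 'Int'}
```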

  3. Probabilistic inductive inference: a survey

    OpenAIRE

    Ambainis, Andris

    2001-01-01

    Inductive inference is a recursion-theoretic theory of learning, first developed by E. M. Gold (1967). This paper surveys developments in probabilistic inductive inference. We mainly focus on finite inference of recursive functions, since this simple paradigm has produced the most interesting (and most complex) results.

  4. Bayesian statistical inference

    Directory of Open Access Journals (Sweden)

    Bruno De Finetti

    2017-04-01

    Full Text Available This work was translated into English and published in the volume: Bruno De Finetti, Induction and Probability, Biblioteca di Statistica, eds. P. Monari, D. Cocchi, Clueb, Bologna, 1993. Bayesian statistical Inference is one of the last fundamental philosophical papers in which we can find the essence of De Finetti's approach to statistical inference.

  5. Cortical information flow during inferences of agency

    Directory of Open Access Journals (Sweden)

    Myrthe L. Dogge

    2014-08-01

    Full Text Available Building on the recent finding that agency experiences do not merely rely on sensorimotor information but also on cognitive cues, this exploratory study uses electroencephalographic recordings to examine functional connectivity during agency inference processing in a setting where action and outcome are independent. Participants completed a computerized task in which they pressed a button followed by one of two color words (red or blue) and rated their experienced agency over producing the color. Before executing the action, a matching or mismatching color word was pre-activated by explicitly instructing participants to produce the color (goal condition) or by briefly presenting the color word (prime condition). In both conditions, experienced agency was higher in matching versus mismatching trials. Furthermore, increased electroencephalography (EEG)-based connectivity strength was observed between parietal and frontal nodes and within the (pre)frontal cortex when color-outcomes matched with goals and participants reported high agency. This pattern of increased connectivity was not identified in trials where outcomes were pre-activated through primes. These results suggest that different connections are involved in the experience and in the loss of agency, as well as in inferences of agency resulting from different types of pre-activation. Moreover, the findings provide novel support for the involvement of a fronto-parietal network in agency inferences.

  6. Is there a hierarchy of social inferences? The likelihood and speed of inferring intentionality, mind, and personality.

    Science.gov (United States)

    Malle, Bertram F; Holbrook, Jess

    2012-04-01

    People interpret behavior by making inferences about agents' intentionality, mind, and personality. Past research studied such inferences one at a time; in real life, people make these inferences simultaneously. The present studies therefore examined whether four major inferences (intentionality, desire, belief, and personality), elicited simultaneously in response to an observed behavior, might be ordered in a hierarchy of likelihood and speed. To achieve generalizability, the studies included a wide range of stimulus behaviors, presented them verbally and as dynamic videos, and assessed inferences both in a retrieval paradigm (measuring the likelihood and speed of accessing inferences immediately after they were made) and in an online processing paradigm (measuring the speed of forming inferences during behavior observation). Five studies provide evidence for a hierarchy of social inferences, from intentionality and desire to belief to personality, that is stable across verbal and visual presentations and that parallels the order found in developmental and primate research. (c) 2012 APA, all rights reserved.

  7. Development of Automatic Remote Exposure Controller for Gamma Radiography

    International Nuclear Information System (INIS)

    Joo, Gwang Tae; Shin, Jin Seong; Kim, Dong Eun; Song, Jung Ho; Choo, Seung Hwan; Chang, Hong Keun

    2002-01-01

    Recently, about 1,000 sets of gamma radiographic equipment have been in manual use in Korea, operated by about 2,500 persons. In order for radiographers to work effectively while avoiding the hazards of high-level radiation from the source, many field workers have long expected the development of a wireless automatic remote exposure controller. The KITCO research team has developed an automatic remote exposure controller that can regulate the speed of 0.4∼1.2 m/s with a 24 V, 200 W BLDC motor whose output of 54 kgf· provides suitable torque and a safety factor for the work. The developed automatic remote exposure controller can control the motor rpm, the pigtail position by photo-sensor, and the exposure time by a timer with an RF sensor. Thus, the developed equipment is expected to be usable in many practical applications, with the economic advantage of combining automatic and manual operation, since it can be attached to existing manual remote exposure controllers and used with both AC and DC power.

  8. The automatic regulation of the basal dose on the insulin pump for the treatment of patients that have Diabetes type 1.

    Science.gov (United States)

    Mehanović, Sifet; Mujić, Midhat

    2010-05-01

    Diabetes mellitus type 1 is a chronic metabolic disorder whose main characteristic is hyperglycemia. It usually occurs in the early years of life because of the absolute or relative absence of active insulin, caused by an autoimmune disease of the beta cells of the pancreas. Despite numerous studies and efforts by scientists, the therapy for diabetes type 1 is based on the substitution of insulin. Even though the principles of the therapy have not changed much, important changes have occurred in the production and usage of insulin. Lately, insulin pumps have become more frequent in the therapy for diabetes type 1. The functioning of the pump is based on the continuous delivery of insulin in a small dose ("the basal dose") that keeps the level of glycemia in the blood constant. The increase of glycemia during a meal is reduced with an additional dose of insulin ("the bolus dose"). The use of insulin pumps and continuous glucose sensors has provided easier and more efficient monitoring of the diabetes, better metabolic control and a better quality of life for the patient and his/her family. This work presents a method for automatic regulation of the basal dose of insulin through the synthesis of the functions of the insulin pump and the continuous glucose sensor. The aim is to contribute to the development of the controlling algorithm on the insulin pump for the automatic regulation of the glucose concentration in the blood. This could be a step closer to delivering the dose of insulin that is really needed for the basic needs of the organism, and a significant contribution to the development of the artificial pancreas.

  9. The Automatic Regulation of the Basal Dose on the Insulin Pump for the Treatment of Patients that have Diabetes Type 1

    Directory of Open Access Journals (Sweden)

    Sifet Mehanović

    2010-05-01

    Full Text Available Diabetes mellitus type 1 is a chronic metabolic disorder whose main characteristic is hyperglycemia. It usually occurs in the early years of life because of the absolute or relative absence of active insulin, caused by an autoimmune disease of the β cells of the pancreas. Despite numerous studies and efforts by scientists, the therapy for diabetes type 1 is based on the substitution of insulin. Even though the principles of the therapy have not changed much, important changes have occurred in the production and usage of insulin. Lately, insulin pumps have become more frequent in the therapy for diabetes type 1. The functioning of the pump is based on the continuous delivery of insulin in a small dose (“the basal dose”), which keeps the level of glycemia in the blood constant. The increase of glycemia during a meal is reduced with an additional dose of insulin (“the bolus dose”). The use of insulin pumps and continuous glucose sensors has provided easier and more efficient monitoring of the diabetes, better metabolic control and a better quality of life for the patient and his/her family. This work presents a method for automatic regulation of the basal dose of insulin through the synthesis of the functions of the insulin pump and the continuous glucose sensor. The aim is to contribute to the development of the controlling algorithm on the insulin pump for the automatic regulation of the glucose concentration in the blood. This could be a step closer to delivering the dose of insulin that is really needed for the basic needs of the organism, and a significant contribution to the development of the artificial pancreas.

  10. Next Generation Model 8800 Automatic TLD Reader

    International Nuclear Information System (INIS)

    Velbeck, K.J.; Streetz, K.L.; Rotunda, J.E.

    1999-01-01

    BICRON NE has developed an advanced version of the Model 8800 Automatic TLD Reader. Improvements in the reader include a Windows NT™-based operating system and a Pentium microprocessor for the host controller, a servo-controlled transport, a VGA display, mouse control, and modular assembly. This high capacity reader will automatically read fourteen hundred TLD cards in one loading. Up to four elements in a card can be heated without mechanical contact, using hot nitrogen gas. Improvements in performance include an increased throughput rate and more precise card positioning. Operation is simplified through easy-to-read Windows-type screens. Glow curves are displayed graphically along with light intensity, temperature, and channel scaling. Maintenance and diagnostic aids are included for easier troubleshooting. A click of a mouse will command actions that are displayed in easy-to-understand English words. Available options include an internal 90Sr irradiator, automatic TLD calibration, and two different extremity monitoring modes. Results from testing include reproducibility, reader stability, linearity, detection threshold, residue, primary power supply voltage and frequency, transient voltage, drop testing, and light leakage. (author)

  11. INFERENCE BUILDING BLOCKS

    Science.gov (United States)

    2018-02-15

    expressed a variety of inference techniques on discrete and continuous distributions: exact inference, importance sampling, Metropolis-Hastings (MH)... without redoing any math or rewriting any code. And although our main goal is composable reuse, our performance is also good because we can use... control paths. The Hakaru language can express mixtures of discrete and continuous distributions, but the current disintegration transformation...

  12. Practical Bayesian Inference

    Science.gov (United States)

    Bailer-Jones, Coryn A. L.

    2017-04-01

    Preface; 1. Probability basics; 2. Estimation and uncertainty; 3. Statistical models and inference; 4. Linear models, least squares, and maximum likelihood; 5. Parameter estimation: single parameter; 6. Parameter estimation: multiple parameters; 7. Approximating distributions; 8. Monte Carlo methods for inference; 9. Parameter estimation: Markov chain Monte Carlo; 10. Frequentist hypothesis testing; 11. Model comparison; 12. Dealing with more complicated problems; References; Index.

  13. Inferring genetic interactions from comparative fitness data.

    Science.gov (United States)

    Crona, Kristina; Gavryushkin, Alex; Greene, Devin; Beerenwinkel, Niko

    2017-12-20

    Darwinian fitness is a central concept in evolutionary biology. In practice, however, it is hardly possible to measure fitness for all genotypes in a natural population. Here, we present quantitative tools to make inferences about epistatic gene interactions when the fitness landscape is only incompletely determined due to imprecise measurements or missing observations. We demonstrate that genetic interactions can often be inferred from fitness rank orders, where all genotypes are ordered according to fitness, and even from partial fitness orders. We provide a complete characterization of rank orders that imply higher order epistasis. Our theory applies to all common types of gene interactions and facilitates comprehensive investigations of diverse genetic interactions. We analyzed various genetic systems comprising HIV-1, the malaria-causing parasite Plasmodium vivax, the fungus Aspergillus niger, and the TEM-family of β-lactamase associated with antibiotic resistance. For all systems, our approach revealed higher order interactions among mutations.
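    The simplest instance of the epistasis the paper characterizes is the two-locus case, where the interaction reduces to a single signed quantity; the paper's contribution is to recover such conclusions from rank orders and partial orders rather than from measured fitness values. A toy illustration with hypothetical fitness values:

```python
def epistasis_2locus(f00, f01, f10, f11):
    """Pairwise epistasis e = f11 - f10 - f01 + f00: if e != 0, the fitness
    effect of one mutation depends on the genetic background."""
    return f11 - f10 - f01 + f00

# Hypothetical fitnesses of wild type, the two single mutants, and the
# double mutant (not data from the paper)
print(epistasis_2locus(f00=1.00, f01=1.10, f10=1.05, f11=1.05))  # -0.10: antagonism
```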

  14. Structure of the automatic system for plasma equilibrium position control

    International Nuclear Information System (INIS)

    Gubarev, V.F.; Krivonos, Yu.G.; Samojlenko, Yu.I.; Snegur, A.A.

    1978-01-01

    The principles of constructing an automatic system for controlling the equilibrium position of the plasma filament inside the discharge chamber of a tokamak-type installation are considered. A combined current control system for the control winding is suggested. The most powerful subsystem creates current in the control winding according to a program calculated beforehand. This subsystem provides rough plasma equilibrium along the major radius. A subsystem that changes the current within small limits according to the feedback principle operates simultaneously. Stabilization of the plasma position in the discharge chamber is thereby achieved. The advantage of such a construction is that it reduces the required power of the automatic regulator without relaxing the accuracy requirements for maintaining equilibrium. A subsystem for automatic vertical control of the plasma position is also included. Such an approach to the construction of the automatic control system proves to be correct; it is based on experience with similar devices at existing thermonuclear facilities.
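    The division of labour described above, a high-power pre-programmed current for rough equilibrium plus a low-power feedback correction, is the classical feedforward-plus-feedback pattern. A schematic sketch (the programme waveform, gains and plant response below are invented for illustration and are unrelated to any real tokamak model):

```python
import numpy as np

def control_current(t, error, integ, kp=0.8, ki=0.4):
    """Combined scheme: a pre-computed programme current provides rough
    equilibrium along the major radius, while a low-power PI feedback
    term corrects small deviations of the measured plasma position."""
    i_program = 10.0 + 2.0 * np.sin(0.5 * t)     # programme computed beforehand
    return i_program + kp * error + ki * integ   # small-range correction

# Illustrative loop: drive a measured displacement (arbitrary units) to zero
dt, integ, error = 1e-3, 0.0, 0.05
for k in range(1000):
    integ += error * dt
    i_ctrl = control_current(k * dt, error, integ)
    error *= 0.995            # stand-in for the plasma responding to i_ctrl
```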

  15. Automatic Fiscal Stabilizers

    Directory of Open Access Journals (Sweden)

    Narcis Eduard Mitu

    2013-11-01

    Full Text Available Policies or institutions (built into an economic system) that automatically tend to dampen economic cycle fluctuations in income, employment, etc., without direct government intervention. For example, in boom times, progressive income tax automatically reduces the money supply as incomes and spending rise. Similarly, in recessionary times, payment of unemployment benefits injects more money into the system and stimulates demand. Also called automatic stabilizers or built-in stabilizers.

  16. Research and Development of Fully Automatic Alien Smoke Stack and Packaging System

    Science.gov (United States)

    Yang, Xudong; Ge, Qingkuan; Peng, Tao; Zuo, Ping; Dong, Weifu

    2017-12-01

    To address the low efficiency of manual sorting and packaging at current tobacco distribution centers, a safe, efficient and fully automatic alien smoke stacking and packaging system was developed. The functions of the fully automatic alien smoke stack and packaging system are implemented with PLC control technology, servo control technology, robot technology, image recognition technology and human-computer interaction technology. The characteristics, principles, control process and key technologies of the system are discussed in detail. After installation and commissioning, the fully automatic alien smoke stack and packaging system performed well and met the requirements for shaped cigarettes.

  17. Effectiveness of an Automatic Tracking Software in Underwater Motion Analysis

    Directory of Open Access Journals (Sweden)

    Fabrício A. Magalhaes

    2013-12-01

    Full Text Available Tracking of markers placed on anatomical landmarks is a common practice in sports science to perform the kinematic analyses that interest both athletes and coaches. Although different software programs have been developed to automatically track markers and/or features, none of them was specifically designed to analyze underwater motion. Hence, this study aimed to evaluate the effectiveness of software developed for automatic tracking of underwater movements (DVP), based on the Kanade-Lucas-Tomasi feature tracker. Twenty-one video recordings of different aquatic exercises (n = 2940 marker positions) were manually tracked to determine the markers' center coordinates. Then, the videos were automatically tracked using DVP and a commercially available software package (COM). Since tracking techniques may produce false targets, an operator was instructed to stop the automatic procedure and to correct the position of the cursor whenever the distance between the calculated marker coordinate and the reference one was higher than 4 pixels. The proportion of manual interventions required by the software was used as a measure of the degree of automation. Overall, manual interventions were 10.4% lower for DVP (7.4%) than for COM (17.8%). Moreover, when examining the different exercise modes separately, the percentage of manual interventions was 5.6% to 29.3% lower for DVP than for COM. Similar results were observed when analyzing the type of marker rather than the type of exercise, with 9.9% fewer manual interventions for DVP than for COM. In conclusion, based on these results, the automatic tracking software presented can be used as a valid and useful tool for underwater motion analysis.
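    Since DVP builds on the Kanade-Lucas-Tomasi tracker, the core loop can be approximated with OpenCV's pyramidal Lucas-Kanade implementation. The sketch below is not DVP itself: the video filename and initial marker positions are placeholders, and the 4-pixel check is a crude stand-in for the paper's criterion, which compares against manually tracked reference coordinates.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture("underwater_trial.avi")   # hypothetical input video
ok, frame = cap.read()
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Initial marker centres, e.g. clicked by the operator on the first frame
p0 = np.array([[[120.0, 240.0]], [[310.0, 255.0]]], dtype=np.float32)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Pyramidal Lucas-Kanade: track each marker from the previous frame
    p1, st, err = cv2.calcOpticalFlowPyrLK(prev_gray, gray, p0, None,
                                           winSize=(21, 21), maxLevel=3)
    jump = np.linalg.norm((p1 - p0).reshape(-1, 2), axis=1)
    if (st.ravel() == 0).any() or (jump > 4.0).any():
        pass  # flag the frame for manual correction, as in the protocol
    prev_gray, p0 = gray, p1
```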

  18. CHLOE: a system for the automatic handling of spark pictures

    International Nuclear Information System (INIS)

    Butler, J.W.; Hodges, D.; Royston, R.

    The system for automatic data handling uses commercially available or state-of-the-art components. The system is flexible enough to accept information from various types of experiments involving photographic data acquisition.

  19. Bidirectional automatic release of reserve for low voltage network made with low capacity PLCs

    Science.gov (United States)

    Popa, I.; Popa, G. N.; Diniş, C. M.; Deaconu, S. I.

    2018-01-01

    The article presents the design of a bidirectional automatic release of reserve implemented on two types of low-capacity programmable logic controllers: PS-3 from Klöckner-Moeller and Zelio from Schneider. It analyses the electronic timing circuits that can be used for making the bidirectional automatic release of reserve: a time-on delay circuit and a time-off delay circuit (two types). The paper presents the code sequences for timing performed on the PS-3 PLC, the logical functions for the bidirectional automatic release of reserve, the classical control electrical diagram (with contacts, relays, and time relays), the electronic control diagram (with logic gates and timing circuits), the code (in IL language) written for the PS-3 PLC, and the code (in FBD language) written for the Zelio PLC. A comparative analysis of the use of the two types of PLC is carried out, and the advantages of using PLCs are presented.
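    The time-on delay element at the heart of such transfer logic is easy to state outside IL or FBD. A Python sketch of an on-delay timer (TON) driving the transfer decision; the 2 s preset and the sampling period are invented for illustration:

```python
class TimeOnDelay:
    """On-delay timer (TON): the output goes true only after the input has
    stayed true for `preset` seconds, which prevents switching to the
    reserve supply on short voltage dips."""
    def __init__(self, preset):
        self.preset, self.elapsed = preset, 0.0
    def update(self, input_on, dt):
        self.elapsed = self.elapsed + dt if input_on else 0.0
        return self.elapsed >= self.preset

# Transfer to the reserve only if the main supply stays dead for 2 s
ton = TimeOnDelay(preset=2.0)
main_ok_samples = [False] * 25                 # sampled every 0.1 s
for main_ok in main_ok_samples:
    transfer_to_reserve = ton.update(not main_ok, dt=0.1)
print(transfer_to_reserve)                     # True after 2 s of outage
```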

  20. Logical inference and evaluation

    International Nuclear Information System (INIS)

    Perey, F.G.

    1981-01-01

    Most methodologies of evaluation currently used are based upon the theory of statistical inference. It is generally perceived that this theory is not capable of dealing satisfactorily with what are called systematic errors. Theories of logical inference should be capable of treating all of the information available, including that not involving frequency data. A theory of logical inference is presented as an extension of deductive logic via the concept of plausibility and the application of group theory. Some conclusions, based upon the application of this theory to evaluation of data, are also given.

  1. Type Classes for Lightweight Substructural Types

    Directory of Open Access Journals (Sweden)

    Edward Gan

    2015-02-01

    Full Text Available Linear and substructural types are powerful tools, but adding them to standard functional programming languages often means introducing extra annotations and typing machinery. We propose a lightweight substructural type system design that recasts the structural rules of weakening and contraction as type classes; we demonstrate this design in a prototype language, Clamp. Clamp supports polymorphic substructural types as well as an expressive system of mutable references. At the same time, it adds little additional overhead to a standard Damas-Hindley-Milner type system enriched with type classes. We have established type safety for the core model and implemented a type checker with type inference in Haskell.
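    Python has no substructural type system, so the discipline Clamp enforces statically can only be mimicked dynamically, but a runtime sketch still conveys what forbidding weakening and contraction means for a value. Purely illustrative; this is not how Clamp works internally:

```python
class LinearRef:
    """Runtime analogue of a linear value: using it twice violates
    contraction, dropping it unused violates weakening. Clamp rules out
    both statically by recasting weakening and contraction as type
    classes; here the checks happen only at run time, for illustration."""
    def __init__(self, value):
        self._value, self._consumed = value, False
    def consume(self):
        if self._consumed:
            raise RuntimeError("contraction: linear value used twice")
        self._consumed = True
        return self._value
    def __del__(self):
        if not self._consumed:
            print("weakening: linear value dropped without being used")

token = LinearRef("one-shot capability")   # hypothetical linear resource
payload = token.consume()                  # first (and only) permitted use
# token.consume()                          # a second use would raise
```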

  2. Causal Inference and Explaining Away in a Spiking Network

    Science.gov (United States)

    Moreno-Bote, Rubén; Drugowitsch, Jan

    2015-01-01

    While the brain uses spiking neurons for communication, theoretical research on brain computations has mostly focused on non-spiking networks. The nature of spike-based algorithms that achieve complex computations, such as object probabilistic inference, is largely unknown. Here we demonstrate that a family of high-dimensional quadratic optimization problems with non-negativity constraints can be solved exactly and efficiently by a network of spiking neurons. The network naturally imposes the non-negativity of causal contributions that is fundamental to causal inference, and uses simple operations, such as linear synapses with realistic time constants, and neural spike generation and reset non-linearities. The network infers the set of most likely causes from an observation using explaining away, which is dynamically implemented by spike-based, tuned inhibition. The algorithm performs remarkably well even when the network intrinsically generates variable spike trains, the timing of spikes is scrambled by external sources of noise, or the network is mistuned. This type of network might underlie tasks such as odor identification and classification. PMID:26621426
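    The optimization problem the spiking network solves, a quadratic objective under non-negativity constraints, can be stated and solved conventionally for comparison. A numpy sketch using projected gradient descent (the templates, step size and iteration count are illustrative assumptions; the paper's contribution is solving this with spiking dynamics, not with this solver):

```python
import numpy as np

def explain_away(A, y, iters=500, lr=0.01):
    """Infer non-negative causes x minimising ||y - A x||^2 with x >= 0.
    The non-negativity projection plays the role the paper assigns to
    tuned inhibition: once one cause explains the observation, the
    gradient suppresses the competing causes."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - y)
        x = np.maximum(0.0, x - lr * grad)   # project onto x >= 0
    return x

# Two overlapping "odor" templates; the observation is generated by cause 0
A = np.array([[1.0, 0.8],
              [0.9, 1.0],
              [0.1, 0.7]])
y = 2.0 * A[:, 0] + 0.05 * np.random.default_rng(2).normal(size=3)
print(explain_away(A, y).round(2))           # cause 0 large, cause 1 near 0
```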

  3. ANUBIS: artificial neuromodulation using a Bayesian inference system.

    Science.gov (United States)

    Smith, Benjamin J H; Saaj, Chakravarthini M; Allouis, Elie

    2013-01-01

    Gain tuning is a crucial part of controller design and depends not only on an accurate understanding of the system in question, but also on the designer's ability to predict what disturbances and other perturbations the system will encounter throughout its operation. This letter presents ANUBIS (artificial neuromodulation using a Bayesian inference system), a novel biologically inspired technique for automatically tuning controller parameters in real time. ANUBIS is based on the Bayesian brain concept and modifies it by incorporating a model of the neuromodulatory system comprising four artificial neuromodulators. It has been applied to the controller of EchinoBot, a prototype walking rover for Martian exploration. ANUBIS has been implemented at three levels of the controller; gait generation, foot trajectory planning using Bézier curves, and foot trajectory tracking using a terminal sliding mode controller. We compare the results to a similar system that has been tuned using a multilayer perceptron. The use of Bayesian inference means that the system retains mathematical interpretability, unlike other intelligent tuning techniques, which use neural networks, fuzzy logic, or evolutionary algorithms. The simulation results show that ANUBIS provides significant improvements in efficiency and adaptability of the three controller components; it allows the robot to react to obstacles and uncertainties faster than the system tuned with the MLP, while maintaining stability and accuracy. As well as advancing rover autonomy, ANUBIS could also be applied to other situations where operating conditions are likely to change or cannot be accurately modeled in advance, such as process control. In addition, it demonstrates one way in which neuromodulation could fit into the Bayesian brain framework.

  4. Inference

    DEFF Research Database (Denmark)

    Møller, Jesper

    (This text, written by Jesper Møller, Aalborg University, is submitted for the collection ‘Stochastic Geometry: Highlights, Interactions and New Perspectives', edited by Wilfrid S. Kendall and Ilya Molchanov, to be published by Clarendon Press, Oxford, and planned to appear as Section 4.1 with the title ‘Inference'.) This contribution concerns statistical inference for parametric models used in stochastic geometry, based on quick and simple simulation-free procedures as well as more comprehensive methods using Markov chain Monte Carlo (MCMC) simulations. Due to space limitations the focus...

  5. Lower complexity bounds for lifted inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2015-01-01

    …instances of the model. Numerous approaches for such “lifted inference” techniques have been proposed. While it has been demonstrated that these techniques will lead to significantly more efficient inference on some specific models, there are only very recent and still quite restricted results that show the feasibility of lifted inference on certain syntactically defined classes of models. Lower complexity bounds that imply some limitations for the feasibility of lifted inference on more expressive model classes were established earlier in Jaeger (2000; Jaeger, M. 2000. On the complexity of inference about…) … that under the assumption that NETIME ≠ ETIME, there is no polynomial lifted inference algorithm for knowledge bases of weighted, quantifier-, and function-free formulas. Further strengthening earlier results, this is also shown to hold for approximate inference and for knowledge bases not containing…

  6. Data mining in forecasting PVT correlations of crude oil systems based on Type1 fuzzy logic inference systems

    Science.gov (United States)

    El-Sebakhy, Emad A.

    2009-09-01

    Pressure-volume-temperature (PVT) properties are very important in reservoir engineering computations. There are many empirical approaches for predicting various PVT properties based on empirical correlations and statistical regression models. Over the last decade, researchers have utilized neural networks to develop more accurate PVT correlations, and these successes have opened the door for data mining techniques to play a major role in the oil and gas industry. Unfortunately, the neural network correlations developed so far are often limited, and global correlations are usually less accurate than local correlations. Recently, adaptive neuro-fuzzy inference systems have been proposed as a new intelligence framework for both prediction and classification based on a fuzzy clustering optimization criterion and ranking. This paper proposes neuro-fuzzy inference systems for estimating PVT properties of crude oil systems. The new framework is an efficient hybrid intelligent machine learning scheme for modeling the kind of uncertainty associated with vagueness and imprecision. We briefly describe the learning steps and the use of the Takagi-Sugeno-Kang (TSK) model and the Gustafson-Kessel clustering algorithm with K detected clusters from the given database; this combination has featured in a wide range of medical, power control, and business applications, often with promising results. A comparative study is carried out to compare the performance of this new framework with the most popular modeling techniques, such as neural networks, nonlinear regression, and empirical correlation algorithms. The results show that the neuro-fuzzy systems are accurate and reliable, and outperform most of the existing forecasting techniques. Future work could apply neuro-fuzzy systems to clustering 3D seismic data, identification of lithofacies types, and other reservoir characterization tasks.
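
    For readers unfamiliar with the underlying machinery, a minimal first-order Takagi-Sugeno-Kang inference step looks as follows; the membership centers, widths, and rule consequents are invented placeholders, not the paper's trained PVT model:

    ```python
    import numpy as np

    # Minimal first-order TSK sketch with Gaussian memberships.
    def gauss(x, c, s):
        return np.exp(-0.5 * ((x - c) / s) ** 2)

    # Two rules over one input (e.g., a solution gas-oil ratio); each rule has
    # a linear consequent y = a*x + b. All numbers are illustrative assumptions.
    rules = [
        {"c": 200.0, "s": 80.0, "a": 0.0010, "b": 1.05},   # "low" region
        {"c": 800.0, "s": 150.0, "a": 0.0004, "b": 1.40},  # "high" region
    ]

    def tsk_predict(x):
        w = np.array([gauss(x, r["c"], r["s"]) for r in rules])  # firing strengths
        y = np.array([r["a"] * x + r["b"] for r in rules])       # rule consequents
        return float(np.dot(w, y) / w.sum())                     # weighted average

    print(tsk_predict(400.0))  # interpolates between the two rule outputs
    ```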

  7. An Intuitive Dashboard for Bayesian Network Inference

    International Nuclear Information System (INIS)

    Reddy, Vikas; Farr, Anna Charisse; Wu, Paul; Mengersen, Kerrie; Yarlagadda, Prasad K D V

    2014-01-01

    Current Bayesian network software packages provide a good graphical interface for users who design and develop Bayesian networks for various applications. However, the intended end-users of these networks may not necessarily find such an interface appealing, and at times it can be overwhelming, particularly when the number of nodes in the network is large. To circumvent this problem, this paper presents an intuitive dashboard, which provides an additional layer of abstraction, enabling end-users to easily perform inferences over the Bayesian networks. Unlike most software packages, which display the nodes and arcs of the network, the developed tool organises the nodes based on cause-and-effect relationships, making the user interaction more intuitive and friendly. In addition to performing various types of inferences, users can conveniently use the tool to verify the behaviour of the developed Bayesian network. The tool has been developed using the QT and SMILE libraries in C++.
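
    A minimal sketch of the kind of query such a dashboard dispatches to the underlying network: exact inference by enumeration on a two-node cause-effect model. The conditional probability values below are invented:

    ```python
    # Exact inference by enumeration on a tiny cause -> effect network.
    # The CPT numbers are illustrative, not from the paper's tool.
    P_cause = {True: 0.2, False: 0.8}
    P_effect_given_cause = {True: 0.9, False: 0.1}   # P(effect=True | cause)

    def posterior_cause_given_effect(effect=True):
        # Bayes rule: P(c|e) = P(e|c) P(c) / sum_c' P(e|c') P(c')
        joint = {c: P_cause[c] * (P_effect_given_cause[c] if effect
                                  else 1 - P_effect_given_cause[c])
                 for c in (True, False)}
        z = sum(joint.values())
        return {c: p / z for c, p in joint.items()}

    print(posterior_cause_given_effect())   # {True: ~0.69, False: ~0.31}
    ```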

  8. An Intuitive Dashboard for Bayesian Network Inference

    Science.gov (United States)

    Reddy, Vikas; Charisse Farr, Anna; Wu, Paul; Mengersen, Kerrie; Yarlagadda, Prasad K. D. V.

    2014-03-01

    Current Bayesian network software packages provide a good graphical interface for users who design and develop Bayesian networks for various applications. However, the intended end-users of these networks may not necessarily find such an interface appealing, and at times it can be overwhelming, particularly when the number of nodes in the network is large. To circumvent this problem, this paper presents an intuitive dashboard, which provides an additional layer of abstraction, enabling end-users to easily perform inferences over the Bayesian networks. Unlike most software packages, which display the nodes and arcs of the network, the developed tool organises the nodes based on cause-and-effect relationships, making the user interaction more intuitive and friendly. In addition to performing various types of inferences, users can conveniently use the tool to verify the behaviour of the developed Bayesian network. The tool has been developed using the QT and SMILE libraries in C++.

  9. Variations on Bayesian Prediction and Inference

    Science.gov (United States)

    2016-05-09

    There are a number of statistical inference problems that are not generally formulated via a full probability model. For the problem of inference about an unknown parameter, the Bayesian approach requires a full probability model/likelihood, which can be an obstacle.

  10. Acquiring concepts and features of novel words by two types of learning: direct mapping and inference.

    Science.gov (United States)

    Chen, Shuang; Wang, Lin; Yang, Yufang

    2014-04-01

    This study examined the semantic representation of novel words learnt in two conditions: directly mapping a novel word to a concept (Direct mapping: DM) and inferring the concept from provided features (Inferred learning: IF). A condition where no definite concept could be inferred (No basic-level meaning: NM) served as a baseline. The semantic representation of each novel word was assessed via a semantic-relatedness judgment task. In this task, the learned novel word served as a prime, while the corresponding concept, an unlearned feature of the concept, and an unrelated word served as targets. ERP responses to the targets, primed by the novel words in the three learning conditions, were compared. For the corresponding concept, smaller N400s were elicited in the DM and IF conditions than in the NM condition, indicating that the concept could be obtained in both learning conditions. However, for the unlearned feature, the targets in the IF condition produced an N400 effect, whereas those in the DM condition elicited an LPC effect, relative to the NM condition. No ERP difference was observed among the three learning conditions for the unrelated words. The results indicate that the conditions of learning affect the semantic representation of novel words, and that the unlearned feature was activated by the novel word only in the IF learning condition. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. Automatic computer aided analysis algorithms and system for adrenal tumors on CT images.

    Science.gov (United States)

    Chai, Hanchao; Guo, Yi; Wang, Yuanyuan; Zhou, Guohui

    2017-12-04

    Adrenal tumors disturb the secretory function of adrenocortical cells, leading to many diseases, and different kinds of adrenal tumors require different therapeutic schedules. In practical diagnosis, judging the tumor type by reading hundreds of CT images relies heavily on the doctor's experience. This paper proposes an automatic computer-aided analysis method for adrenal tumor detection and classification. It consists of automatic segmentation algorithms, feature extraction, and classification algorithms. These algorithms were integrated into a system operated through a graphical interface built with the MATLAB graphical user interface (GUI) tools. The accuracy of the automatic computer-aided segmentation and classification reached 90% on 436 CT images. The experiments demonstrated the stability and reliability of this automatic computer-aided analytic system.

  12. An Overview of Automaticity and Implications For Training the Thinking Process

    National Research Council Canada - National Science Library

    Holt, Brian

    2002-01-01

    …visual search to battlefield thinking). The results of this examination suggest that automaticity can be developed using consistent rules and extensive practice that vary depending on the type of task...

  13. Automatic classification of defects in weld pipe

    International Nuclear Information System (INIS)

    Anuar Mikdad Muad; Mohd Ashhar Hj Khalid; Abdul Aziz Mohamad; Abu Bakar Mhd Ghazali; Abdul Razak Hamzah

    2000-01-01

    With the advancement of computer imaging technology, the image on hard radiographic film can be digitized and stored in a computer, and the manual process of defect recognition and classification may be replaced by the computer. In this paper, a computerized method for automatic detection and classification of common defects in film radiography of weld pipe is described. The detection and classification processes consist of automatically selecting the area of interest on the image and then classifying common defects using image processing and special algorithms. Analysis of the attributes of each defect, such as area, size, shape and orientation, is carried out by the feature analysis process. These attributes reveal the type of each defect. These methods of defect classification result in a high success rate. Our experience showed that sharp film images produced better results.

  14. Automatic classification of defects in weld pipe

    International Nuclear Information System (INIS)

    Anuar Mikdad Muad; Mohd Ashhar Khalid; Abdul Aziz Mohamad; Abu Bakar Mhd Ghazali; Abdul Razak Hamzah

    2001-01-01

    With the advancement of computer imaging technology, the image on hard radiographic film can be digitized and stored in a computer, and the manual process of defect recognition and classification may be replaced by the computer. In this paper, a computerized method for automatic detection and classification of common defects in film radiography of weld pipe is described. The detection and classification processes consist of automatically selecting the area of interest on the image and then classifying common defects using image processing and special algorithms. Analysis of the attributes of each defect, such as area, size, shape and orientation, is carried out by the feature analysis process. These attributes reveal the type of each defect. These methods of defect classification result in a high success rate. Our experience showed that sharp film images produced better results. (Author)

  15. Bayesian CP Factorization of Incomplete Tensors with Automatic Rank Determination.

    Science.gov (United States)

    Zhao, Qibin; Zhang, Liqing; Cichocki, Andrzej

    2015-09-01

    CANDECOMP/PARAFAC (CP) tensor factorization of incomplete data is a powerful technique for tensor completion through explicitly capturing the multilinear latent factors. The existing CP algorithms require the tensor rank to be manually specified; however, the determination of tensor rank remains a challenging problem, especially for CP rank. In addition, existing approaches do not take into account uncertainty information of the latent factors or of the missing entries. To address these issues, we formulate CP factorization using a hierarchical probabilistic model and employ a fully Bayesian treatment by incorporating a sparsity-inducing prior over the multiple latent factors and appropriate hyperpriors over all hyperparameters, resulting in automatic rank determination. To learn the model, we develop an efficient deterministic Bayesian inference algorithm, which scales linearly with data size. Our method is characterized as a tuning-parameter-free approach, which can effectively infer underlying multilinear factors with a low-rank constraint, while also providing predictive distributions over missing entries. Extensive simulations on synthetic data illustrate the intrinsic capability of our method to recover the ground-truth CP rank and prevent overfitting, even when a large fraction of entries is missing. Moreover, the results from real-world applications, including image inpainting and facial image synthesis, demonstrate that our method outperforms state-of-the-art approaches for both tensor factorization and tensor completion in terms of predictive performance.
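
    The non-Bayesian core that the paper builds on can be sketched compactly; below is plain CP factorization by alternating least squares (ALS) on a complete tensor, with sizes and rank as assumptions. The paper's contribution replaces this with a fully Bayesian treatment whose sparsity priors shrink superfluous components, yielding automatic rank determination:

    ```python
    import numpy as np

    # Minimal CP-ALS sketch (not the paper's Bayesian algorithm).
    def khatri_rao(U, V):
        # Column-wise Khatri-Rao product: row (j*K + k) equals U[j] * V[k].
        return (U[:, None, :] * V[None, :, :]).reshape(-1, U.shape[1])

    def cp_als(X, rank, n_iter=200, seed=0):
        I, J, K = X.shape
        rng = np.random.default_rng(seed)
        A, B, C = (rng.standard_normal((n, rank)) for n in (I, J, K))
        for _ in range(n_iter):
            A = X.reshape(I, -1) @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
            B = X.transpose(1, 0, 2).reshape(J, -1) @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
            C = X.transpose(2, 0, 1).reshape(K, -1) @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
        return A, B, C

    rng = np.random.default_rng(1)
    A0, B0, C0 = (rng.standard_normal((n, 2)) for n in (6, 7, 8))
    X = np.einsum('ir,jr,kr->ijk', A0, B0, C0)          # exact rank-2 tensor
    A, B, C = cp_als(X, rank=2)
    err = np.linalg.norm(X - np.einsum('ir,jr,kr->ijk', A, B, C)) / np.linalg.norm(X)
    print(round(err, 6))                                # relative error, typically ~0
    ```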

  16. Adaptive Inference on General Graphical Models

    OpenAIRE

    Acar, Umut A.; Ihler, Alexander T.; Mettu, Ramgopal; Sumer, Ozgur

    2012-01-01

    Many algorithms and applications involve repeatedly solving variations of the same inference problem; for example we may want to introduce new evidence to the model or perform updates to conditional dependencies. The goal of adaptive inference is to take advantage of what is preserved in the model and perform inference more rapidly than from scratch. In this paper, we describe techniques for adaptive inference on general graphs that support marginal computation and updates to the conditional ...

  17. Finding weak points automatically

    International Nuclear Information System (INIS)

    Archinger, P.; Wassenberg, M.

    1999-01-01

    Operators of nuclear power stations have to carry out material tests on selected components at regular intervals. A fully automated test, which achieves clearly higher reproducibility than partially automated variants, would therefore provide a solution. In addition, a fully automated test reduces the radiation dose received by the test personnel. (orig.) [de]

  18. Assessing school-aged children's inference-making: the effect of story test format in listening comprehension.

    Science.gov (United States)

    Freed, Jenny; Cain, Kate

    2017-01-01

    Comprehension is critical for classroom learning and educational success. Inferences are integral to good comprehension: successful comprehension requires the listener to generate local coherence inferences, which involve integrating information between clauses, and global coherence inferences, which involve integrating textual information with background knowledge to infer motivations, themes, etc. A central priority for the diagnosis of comprehension difficulties, and for our understanding of why these difficulties arise, is the development of valid assessment instruments. We explored typically developing children's ability to make local and global coherence inferences using a novel assessment of listening comprehension. The aims were to determine whether children were more likely to make the target inferences when these were asked during story presentation rather than after presentation of the story, and whether there were any age differences between conditions. Children in Years 3 (n = 29) and 5 (n = 31) listened to short stories presented either in a segmented format, in which questions to assess local and global coherence inferences were asked at specific points during story presentation, or in a whole format, in which all the questions were asked after the story had been presented. There was developmental progression between age groups for both types of inference question. Children also scored higher on the global coherence inference questions than on the local coherence inference questions. There was a benefit of the segmented format for younger children, particularly for the local inference questions. The results suggest that children are more likely to make target inferences if prompted during presentation of the story, and that this format is particularly facilitative for younger children and for local coherence inferences. This has implications for the design of comprehension assessments as well as for supporting children with comprehension difficulties in the classroom.

  19. Automatic coding method of the ACR Code

    International Nuclear Information System (INIS)

    Park, Kwi Ae; Ihm, Jong Sool; Ahn, Woo Hyun; Baik, Seung Kook; Choi, Han Yong; Kim, Bong Gi

    1993-01-01

    The authors developed a computer program for automatic coding of the ACR (American College of Radiology) code. Automatic coding of the ACR code is essential for computerization of the data in the department of radiology. This program was written in the FoxBASE language and has been used for automatic coding of diagnoses in the Department of Radiology, Wallace Memorial Baptist, since May 1992. The ACR dictionary consists of 11 files, one for the organ codes and the others for the pathology codes. The organ code is obtained by typing the organ name or the code number itself among the upper- and lower-level codes of the selected entry, which are simultaneously displayed on the screen. According to the first digit of the selected organ code, the corresponding pathology code file is chosen automatically, and the proper pathology code is obtained in the same fashion as the organ code selection. An example of an obtained ACR code is '131.3661'. This procedure is reproducible regardless of the number of data fields. Because this program was written in 'User's Defined Function' form, decoding of the stored ACR code is achieved by the same program, and incorporation of this program into other data processing programs is possible. This program has the merits of simple operation, accurate and detailed coding, and easy adaptation to other programs. Therefore, this program can be used to automate routine work in the department of radiology.
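
    The two-step lookup the program describes can be sketched as follows; all codes and names below are invented placeholders rather than real ACR dictionary entries, with only the '131.3661' example taken from the text:

    ```python
    # Hypothetical two-step ACR-style lookup: pick an organ code, then use
    # its leading digit to select the matching pathology code file.
    organ_codes = {"1": "abdomen", "13": "liver", "131": "liver, lobe"}  # invented
    pathology_files = {"1": {"3661": "hypothetical finding"}}  # keyed by first digit

    def acr_code(organ, pathology):
        if organ not in organ_codes:
            raise KeyError("unknown organ code")
        pfile = pathology_files[organ[0]]   # pathology file chosen by first digit
        if pathology not in pfile:
            raise KeyError("unknown pathology code")
        return f"{organ}.{pathology}"       # e.g. '131.3661', as in the abstract

    print(acr_code("131", "3661"))
    ```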

  20. The use of automatic programming techniques for fault tolerant computing systems

    Science.gov (United States)

    Wild, C.

    1985-01-01

    It is conjectured that the production of software for ultra-reliable computing systems, such as those required by the Space Station, aircraft, nuclear power plants and the like, will require a high degree of automation as well as fault tolerance. In this paper, the relationship between automatic programming techniques and fault tolerant computing systems is explored. Initial efforts in the automatic synthesis of code from assertions to be used for error detection, as well as the automatic generation of assertions and test cases from abstract data type specifications, are outlined. Speculation on the ability to generate truly diverse designs capable of recovering from errors by exploring alternate paths in the program synthesis tree is discussed. Some initial thoughts are given on the use of knowledge-based systems for the global detection of abnormal behavior using expectations, and on the goal-directed reconfiguration of resources to meet critical mission objectives. One of the sources of information for these systems would be the knowledge captured during the automatic programming process.

  1. Ontology-Based High-Level Context Inference for Human Behavior Identification

    Science.gov (United States)

    Villalonga, Claudia; Razzaq, Muhammad Asif; Khan, Wajahat Ali; Pomares, Hector; Rojas, Ignacio; Lee, Sungyoung; Banos, Oresti

    2016-01-01

    Recent years have witnessed huge progress in the automatic identification of individual primitives of human behavior, such as activities or locations. However, the complex nature of human behavior demands more abstract contextual information for its analysis. This work presents an ontology-based method that combines low-level primitives of behavior, namely activity, locations and emotions, unprecedented to date, to intelligently derive more meaningful high-level context information. The paper contributes a new open ontology describing both low-level and high-level context information, as well as their relationships. Furthermore, a framework building on the developed ontology and reasoning models is presented and evaluated. The proposed method proves to be robust while identifying high-level contexts even in the event of erroneously-detected low-level contexts. Despite reasonable inference times being obtained for a relevant set of users and instances, additional work is required to scale to long-term scenarios with a large number of users. PMID:27690050

  2. Ontology-Based High-Level Context Inference for Human Behavior Identification

    Directory of Open Access Journals (Sweden)

    Claudia Villalonga

    2016-09-01

    Full Text Available Recent years have witnessed huge progress in the automatic identification of individual primitives of human behavior, such as activities or locations. However, the complex nature of human behavior demands more abstract contextual information for its analysis. This work presents an ontology-based method that combines low-level primitives of behavior, namely activity, locations and emotions, unprecedented to date, to intelligently derive more meaningful high-level context information. The paper contributes a new open ontology describing both low-level and high-level context information, as well as their relationships. Furthermore, a framework building on the developed ontology and reasoning models is presented and evaluated. The proposed method proves to be robust while identifying high-level contexts even in the event of erroneously-detected low-level contexts. Despite reasonable inference times being obtained for a relevant set of users and instances, additional work is required to scale to long-term scenarios with a large number of users.

  3. Learning Probabilistic Inference through Spike-Timing-Dependent Plasticity.

    Science.gov (United States)

    Pecevski, Dejan; Maass, Wolfgang

    2016-01-01

    Numerous experimental data show that the brain is able to extract information from complex, uncertain, and often ambiguous experiences. Furthermore, it can use such learnt information for decision making through probabilistic inference. Several models have been proposed that aim at explaining how probabilistic inference could be performed by networks of neurons in the brain. We propose here a model that can also explain how such a neural network could acquire the necessary information from examples. We show that spike-timing-dependent plasticity in combination with intrinsic plasticity generates, in ensembles of pyramidal cells with lateral inhibition, a fundamental building block for that: probabilistic associations between neurons that represent, through their firing, current values of random variables. Furthermore, by combining such adaptive network motifs in a recursive manner, the resulting network is enabled to extract statistical information from complex input streams and to build an internal model for the distribution p* that generates the examples it receives. This holds even if p* contains higher-order moments. The analysis of this learning process is supported by a rigorous theoretical foundation. Furthermore, we show that the network can use the learnt internal model immediately for prediction, decision making, and other types of probabilistic inference.
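
    A minimal abstraction of the claimed building block (not the paper's spiking model): if a synapse is updated only when the postsynaptic neuron fires, a local rule of the form w += eta * (pre - w) converges to P(pre active | post fired). The toy statistics below are assumptions:

    ```python
    import numpy as np

    # Toy stand-in for an STDP-like local rule: the weight converges to the
    # conditional probability of presynaptic activity given a postsynaptic
    # spike. All probabilities here are invented for illustration.
    rng = np.random.default_rng(0)
    eta, w = 0.05, 0.0
    p_pre_given_post = 0.7
    for _ in range(5000):
        post = rng.random() < 0.3                         # postsynaptic spike
        if post:                                          # update only on post spikes
            pre = float(rng.random() < p_pre_given_post)  # presynaptic activity
            w += eta * (pre - w)                          # local, Hebbian-style update
    print(round(w, 2))  # fluctuates around 0.7 = P(pre | post)
    ```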

  4. The automatic component of habit in health behavior: habit as cue-contingent automaticity.

    Science.gov (United States)

    Orbell, Sheina; Verplanken, Bas

    2010-07-01

    Habit might be usefully characterized as a form of automaticity that involves the association of a cue and a response. Three studies examined habitual automaticity in regard to different aspects of the cue-response relationship characteristic of unhealthy and healthy habits. In each study, habitual automaticity was assessed by the Self-Report Habit Index (SRHI). In Study 1 SRHI scores correlated with attentional bias to smoking cues in a Stroop task. Study 2 examined the ability of a habit cue to elicit an unwanted habit response. In a prospective field study, habitual automaticity in relation to smoking when drinking alcohol in a licensed public house (pub) predicted the likelihood of cigarette-related action slips 2 months later after smoking in pubs had become illegal. In Study 3 experimental group participants formed an implementation intention to floss in response to a specified situational cue. Habitual automaticity of dental flossing was rapidly enhanced compared to controls. The studies provided three different demonstrations of the importance of cues in the automatic operation of habits. Habitual automaticity assessed by the SRHI captured aspects of a habit that go beyond mere frequency or consistency of the behavior. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  5. Genomic inferences of domestication events are corroborated by written records in Brassica rapa.

    Science.gov (United States)

    Qi, Xinshuai; An, Hong; Ragsdale, Aaron P; Hall, Tara E; Gutenkunst, Ryan N; Chris Pires, J; Barker, Michael S

    2017-07-01

    Demographic modelling is often used with population genomic data to infer the relationships and ages among populations. However, relatively few analyses are able to validate these inferences with independent data. Here, we leverage written records that describe distinct Brassica rapa crops to corroborate demographic models of domestication. Brassica rapa crops are renowned for their outstanding morphological diversity, but the relationships and order of domestication remain unclear. We generated genomewide SNPs from 126 accessions collected globally using high-throughput transcriptome data. Analyses of more than 31,000 SNPs across the B. rapa genome revealed evidence for five distinct genetic groups and supported a European-Central Asian origin of B. rapa crops. Our results supported the traditionally recognized South Asian and East Asian B. rapa groups with evidence that pak choi, Chinese cabbage and yellow sarson are likely monophyletic groups. In contrast, the oil-type B. rapa subsp. oleifera and brown sarson were polyphyletic. We also found no evidence to support the contention that rapini is the wild type or the earliest domesticated subspecies of B. rapa. Demographic analyses suggested that B. rapa was introduced to Asia 2,400-4,100 years ago, and that Chinese cabbage originated 1,200-2,100 years ago via admixture of pak choi and European-Central Asian B. rapa. We also inferred significantly different levels of founder effect among the B. rapa subspecies. Written records from antiquity that document these crops are consistent with these inferences. The concordance between our age estimates of domestication events with historical records provides unique support for our demographic inferences. © 2017 John Wiley & Sons Ltd.

  6. Methodology Investigation Automatic Magnetic Recording Borescope.

    Science.gov (United States)

    1986-01-01

    … or other brushless signal coupling devices to the extent possible and feasible, to reduce or eliminate the need for slip-ring and brush-type signal … the inspection head, is used to magnetically couple the necessary energy across the rotary interface. Because there is (1) an appreciable air gap in … were written. (2) As required by the contract, the signal conditioners in the MB employ automatic gain control to compensate for the changes in …

  7. Testing AGN unification via inference from large catalogs

    Science.gov (United States)

    Nikutta, Robert; Ivezic, Zeljko; Elitzur, Moshe; Nenkova, Maia

    2018-01-01

    Source orientation and clumpiness of the central dust are the main factors in AGN classification. Type-1 QSOs are easy to observe and large samples are available (e.g. in SDSS), but obscured type-2 AGN are dimmer and redder as our line of sight is more obscured, making it difficult to obtain a complete sample. WISE has found up to a million QSOs. With only 4 bands and a relatively small aperture, the analysis of individual sources is challenging, but the large sample allows inference of bulk properties at a very significant level. CLUMPY (www.clumpy.org) is arguably the most popular database of AGN torus SEDs. We model the ensemble properties of the entire WISE AGN content using regularized linear regression, with orientation-dependent CLUMPY color-color-magnitude (CCM) tracks as basis functions. We can reproduce the observed number counts per CCM bin with percent-level accuracy, and simultaneously infer the probability distributions of all torus parameters, redshifts, additional SED components, and identify type-1/2 AGN populations through their IR properties alone. We increase the statistical power of our AGN unification tests even further by adding other datasets as axes in the regression problem. To this end, we make use of the NOAO Data Lab (datalab.noao.edu), which hosts several high-level large datasets and provides very powerful tools for handling large data, e.g. cross-matched catalogs, fast remote queries, etc.

  8. The Russo-Williamson Theses in the social sciences: causal inference drawing on two types of evidence.

    Science.gov (United States)

    Claveau, François

    2012-12-01

    This article examines two theses formulated by Russo and Williamson (2007) in their study of causal inference in the health sciences. The two theses are assessed against evidence from a specific case in the social sciences, i.e., research on the institutional determinants of the aggregate unemployment rate. The first Russo-Williamson Thesis is that a causal claim can only be established when it is jointly supported by difference-making and mechanistic evidence. This thesis is shown not to hold. While researchers in my case study draw extensively on both types of evidence, one causal claim out of the three analyzed is established even though it is exclusively supported by mechanistic evidence. The second Russo-Williamson Thesis is that standard accounts of causality fail to handle the dualist epistemology highlighted in the first Thesis. I argue that a counterfactual-manipulationist account of causality--which is endorsed by many philosophers as well as many social scientists--can perfectly make sense of the typical strategy in my case study to draw on both difference-making and mechanistic evidence; it is just an instance of the common strategy of increasing evidential variety. Copyright © 2012 Elsevier Ltd. All rights reserved.

  9. The inference from a single case: moral versus scientific inferences in implementing new biotechnologies.

    Science.gov (United States)

    Hofmann, B

    2008-06-01

    Are there similarities between scientific and moral inference? This is the key question in this article. It takes as its point of departure an instance of one person's story in the media changing both Norwegian public opinion and a brand-new Norwegian law prohibiting the use of saviour siblings. The case appears to falsify existing norms and to establish new ones. The analysis of this case reveals similarities in the modes of inference in science and morals, inasmuch as (a) a single case functions as a counter-example to an existing rule; (b) there is a common presupposition of stability, similarity and order, which makes it possible to reason from a few cases to a general rule; and (c) this makes it possible to hold things together and retain order. In science, these modes of inference are referred to as falsification, induction and consistency. In morals, they have a variety of other names. Hence, even without abandoning the fact-value divide, there appear to be similarities between inference in science and inference in morals, which may encourage communication across the boundaries between "the two cultures" and which are relevant to medical humanities.

  10. Introductory statistical inference

    CERN Document Server

    Mukhopadhyay, Nitis

    2014-01-01

    This gracefully organized text reveals the rigorous theory of probability and statistical inference in the style of a tutorial, using worked examples, exercises, figures, tables, and computer simulations to develop and illustrate concepts. Drills and boxed summaries emphasize and reinforce important ideas and special techniques.Beginning with a review of the basic concepts and methods in probability theory, moments, and moment generating functions, the author moves to more intricate topics. Introductory Statistical Inference studies multivariate random variables, exponential families of dist

  11. Type tests to the automatic thermoluminescent dosimetry system acquired by the CPHR for personal dosimetry

    International Nuclear Information System (INIS)

    Molina P, D.; Pernas S, R.; Martinez G, A.

    2006-01-01

    The CPHR individual monitoring service acquired an automatic RADOS TLD system to improve its capacity to satisfy the increasing needs of its national customers. The TLD system consists of two automatic TLD readers (model DOSACUS), a TLD irradiator, and personal dosimeter cards including slides and holders. The dosimeters are composed of these personal dosimeter cards and LiF:Mg,Cu,P (model GR-200) detectors. The readers provide the detectors with a constant-temperature readout cycle using hot nitrogen gas. In order to evaluate the performance characteristics of the system, different performance tests recommended by the IEC 1066 standard were carried out. Important dosimetric characteristics evaluated were batch homogeneity, reproducibility, detection threshold, energy dependence, residual signal, and fading. The results of the tests showed good performance characteristics of the system. (Author)

  12. Microprocessor controlled system for automatic and semi-automatic syntheses of radiopharmaceuticals

    International Nuclear Information System (INIS)

    Ruth, T.J.; Adam, M.J.; Morris, D.; Jivan, S.

    1986-01-01

    A computer-based system has been constructed to control the automatic synthesis of 2-deoxy-2-[18F]fluoro-D-glucose and is also being used in the development of an automatic synthesis of L-6-[18F]fluorodopa. (author)

  13. Active inference, communication and hermeneutics.

    Science.gov (United States)

    Friston, Karl J; Frith, Christopher D

    2015-07-01

    Hermeneutics refers to interpretation and translation of text (typically ancient scriptures) but also applies to verbal and non-verbal communication. In a psychological setting it nicely frames the problem of inferring the intended content of a communication. In this paper, we offer a solution to the problem of neural hermeneutics based upon active inference. In active inference, action fulfils predictions about how we will behave (e.g., predicting we will speak). Crucially, these predictions can be used to predict both self and others--during speaking and listening respectively. Active inference mandates the suppression of prediction errors by updating an internal model that generates predictions--both at fast timescales (through perceptual inference) and slower timescales (through perceptual learning). If two agents adopt the same model, then--in principle--they can predict each other and minimise their mutual prediction errors. Heuristically, this ensures they are singing from the same hymn sheet. This paper builds upon recent work on active inference and communication to illustrate perceptual learning using simulated birdsongs. Our focus here is the neural hermeneutics implicit in learning, where communication facilitates long-term changes in generative models that are trying to predict each other. In other words, communication induces perceptual learning and enables others to (literally) change our minds and vice versa. Copyright © 2015 The Authors. Published by Elsevier Ltd.. All rights reserved.

  14. Studies in the extensively automatic construction of large odds-based inference networks from structured data. Examples from medical, bioinformatics, and health insurance claims data.

    Science.gov (United States)

    Robson, B; Boray, S

    2018-04-01

    Theoretical and methodological principles are presented for the construction of very large inference nets for odds calculations, composed of hundreds, many thousands, or more elements and, in this paper, generated by structured data mining. It is argued that the usual small inference nets can sometimes represent rather simple, arbitrary estimates. Examples of applications are presented in clinical and public health data analysis, medical claims data and the detection of irregular entries, and bioinformatics data. Construction of large nets benefits from the application of a theory of expected information for sparse data and from the Dirac notation and algebra; the extent to which these are important here is briefly discussed. The purposes of the study include (a) exploration of the properties of large inference nets and of perturbation and tacit conditionality models, (b) using these to propose simpler models, including one that a physician could use routinely, analogous to a "risk score", (c) examination of the merit of describing optimal performance in a single measure that combines accuracy, specificity, and sensitivity in place of a ROC curve, and (d) the relationship to methods for detecting anomalous and potentially fraudulent data. Copyright © 2018 Elsevier Ltd. All rights reserved.
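
    The additive structure that makes such nets scale can be sketched under a naive-Bayes-style independence assumption, where each element contributes a log-likelihood-ratio term; the feature likelihoods below are invented:

    ```python
    import math

    # Minimal odds-based net: each observed feature adds its weight of
    # evidence (a log likelihood ratio) to the prior log-odds. Numbers are
    # illustrative assumptions, not from the paper.
    prior_odds = 0.1 / 0.9                 # P(condition) = 0.1
    features = {                           # P(f | condition), P(f | no condition)
        "marker_a": (0.8, 0.2),
        "marker_b": (0.6, 0.3),
    }

    def posterior_odds(observed):
        log_odds = math.log(prior_odds)
        for f in observed:
            p1, p0 = features[f]
            log_odds += math.log(p1 / p0)  # each net element is one additive term
        return math.exp(log_odds)

    odds = posterior_odds(["marker_a", "marker_b"])
    print(round(odds / (1 + odds), 3))     # posterior probability ~0.471
    ```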

  15. Automatic Camera Control

    DEFF Research Database (Denmark)

    Burelli, Paolo; Preuss, Mike

    2014-01-01

    Automatically generating computer animations is a challenging and complex problem with applications in games and film production. In this paper, we investigate how to translate a shot list for a virtual scene into a series of virtual camera configurations — i.e., automatically controlling the virtual...

  16. Choosing Actuators for Automatic Control Systems of Thermal Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Gorbunov, A. I., E-mail: gor@tornado.nsk.ru [JSC “Tornado Modular Systems” (Russian Federation); Serdyukov, O. V. [Siberian Branch of the Russian Academy of Sciences, Institute of Automation and Electrometry (Russian Federation)

    2015-03-15

    Two types of actuators for automatic control systems of thermal power plants are analyzed: (i) pulse-controlled actuator and (ii) analog-controlled actuator with positioning function. The actuators are compared in terms of control circuit, control accuracy, reliability, and cost.

  17. Accuracy of Automatic Carbohydrate, Protein, Fat and Calorie Counting Based on Voice Descriptions of Meals in People with Type 1 Diabetes

    Directory of Open Access Journals (Sweden)

    Piotr Ladyzynski

    2018-04-01

    Full Text Available The aim of this work was to assess the accuracy of automatic macronutrient and calorie counting based on voice descriptions of meals provided by people with unstable type 1 diabetes using the developed expert system (VoiceDiab), in comparison with reference counting made by a dietitian, and to evaluate the impact of insulin doses recommended by a physician on glycemic control in the study's participants. We also compared insulin doses calculated using the algorithm implemented in the VoiceDiab system. Meal descriptions were provided by 30 hospitalized patients (mean hemoglobin A1c of 8.4%, i.e., 68 mmol/mol). In 16 subjects, the physician determined insulin boluses based on the data provided by the system, and in 14 subjects, based on data provided by the dietitian. On the one hand, the differences introduced by patients who subjectively described their meals, compared to those introduced by the system using average characteristics of food products, although statistically significant, were low enough not to have a significant impact on the insulin doses automatically calculated by the system. On the other hand, glycemic control was comparable regardless of whether the physician used the system-estimated or the reference content of meals to determine insulin doses.
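
    For orientation, the standard bolus arithmetic that carbohydrate estimates feed into looks like the sketch below; the carbohydrate ratio, correction factor, and target are invented values, and the paper's actual dosing algorithm is not specified in this abstract:

    ```python
    # Hypothetical textbook-style bolus calculation, not the VoiceDiab algorithm.
    def insulin_bolus(carbs_g, bg_mmol, target_bg=6.0, icr=10.0, isf=2.0):
        """carbs_g / icr covers the meal; (bg - target) / isf corrects glycemia."""
        meal = carbs_g / icr                              # one unit per icr grams of carbs
        correction = max(0.0, (bg_mmol - target_bg) / isf)
        return round(meal + correction, 1)

    print(insulin_bolus(carbs_g=60, bg_mmol=9.0))  # 6.0 + 1.5 = 7.5 units
    ```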

  18. Automatic depressurization system of BWR type nuclear power plant

    International Nuclear Information System (INIS)

    Fujii, Masahiko.

    1993-01-01

    In the present invention, depressurization is conducted while preserving the versatility and retardancy of the water injection system, so that safety is improved. That is, a means is provided that judges whether the turbine-driven water injection system is operating or not by the following conditions: (1) the discharge pressure of the turbine-driven pump is greater than a set value, (2) the flow rate of the turbine-driven water injection system is greater than a set value, (3) the injection valve of the turbine-driven water injection system into the reactor is open, or a combination of (1) to (3). With such procedures, when the automatic depressurization system is needed during operation of the turbine-driven water injection system, the reactor pressure is decreased until the low-pressure water injection system can operate, but not to such a level that the turbine-driven water injection system is isolated. Therefore, versatility and retardancy of the water injection system are ensured. As a result, the reliability of the reactor cooling means is improved. (I.S.)

  19. Subgrouping Automata: automatic sequence subgrouping using phylogenetic tree-based optimum subgrouping algorithm.

    Science.gov (United States)

    Seo, Joo-Hyun; Park, Jihyang; Kim, Eun-Mi; Kim, Juhan; Joo, Keehyoung; Lee, Jooyoung; Kim, Byung-Gee

    2014-02-01

    Sequence subgrouping for a given sequence set can enable various informative tasks such as the functional discrimination of sequence subsets and the functional inference of unknown sequences. Because the identity threshold for sequence subgrouping may vary with the given sequence set, it is highly desirable to construct a robust subgrouping algorithm that automatically identifies an optimal identity threshold and generates subgroups for a given sequence set. To this end, an automatic sequence subgrouping method named 'Subgrouping Automata' (SA) was constructed. First, a tree analysis module analyzes the structure of the tree and enumerates all possible subgroups at each node. A sequence similarity analysis module calculates the average sequence similarity for all subgroups at each node. A representative sequence generation module finds a representative sequence for each subgroup using profile analysis and self-scoring. Average sequence similarities are calculated for all nodes, and 'Subgrouping Automata' searches for the node showing the statistically maximal increase in sequence similarity using Student's t-values. The node showing the maximum t-value, i.e., the most significant difference in average sequence similarity between two adjacent nodes, is determined as the optimum subgrouping node in the phylogenetic tree. Further analysis showed that the optimum subgrouping node from SA prevents both under-subgrouping and over-subgrouping. Copyright © 2013. Published by Elsevier Ltd.
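
    A minimal sketch of the idea (not the published tool): cluster items by pairwise distance, scan cut thresholds, and choose the cut just before the largest drop in average within-subgroup similarity, a crude stand-in for the paper's Student's-t criterion. The toy data are invented:

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import pdist, squareform

    # Toy points stand in for sequences; distances stand in for (1 - identity).
    rng = np.random.default_rng(0)
    pts = np.vstack([rng.normal(0, 0.1, (5, 2)), rng.normal(3, 0.1, (5, 2))])
    Z = linkage(pdist(pts), method="average")
    D = squareform(pdist(pts))

    def within_score(labels):
        """Negative mean within-subgroup distance; higher = tighter subgroups."""
        vals = []
        for g in np.unique(labels):
            idx = np.flatnonzero(labels == g)
            if len(idx) > 1:
                vals.append(-D[np.ix_(idx, idx)][np.triu_indices(len(idx), 1)].mean())
        return float(np.mean(vals)) if vals else 0.0

    cuts = np.linspace(0.1, 4.0, 40)
    scores = [within_score(fcluster(Z, t, criterion="distance")) for t in cuts]
    best = cuts[int(np.argmin(np.diff(scores)))]   # cut just before subgroups collapse
    print(best, fcluster(Z, best, criterion="distance"))  # the two families recovered
    ```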

  20. Inference of miRNA targets using evolutionary conservation and pathway analysis

    Directory of Open Access Journals (Sweden)

    van Nimwegen Erik

    2007-03-01

    Full Text Available Abstract. Background: MicroRNAs have emerged as important regulatory genes in a variety of cellular processes and, in recent years, hundreds of such genes have been discovered in animals. In contrast, functional annotations are available only for a very small fraction of these miRNAs, and even in these cases only partially. Results: We developed a general Bayesian method for the inference of miRNA target sites, in which, for each miRNA, we explicitly model the evolution of orthologous target sites in a set of related species. Using this method we predict target sites for all known miRNAs in flies, worms, fish, and mammals. By comparing our predictions in fly with a reference set of experimentally tested miRNA-mRNA interactions we show that our general method performs at least as well as the most accurate methods available to date, including ones specifically tailored for target prediction in fly. An important novel feature of our model is that it explicitly infers the phylogenetic distribution of functional target sites, independently for each miRNA. This allows us to infer species-specific and clade-specific miRNA targeting. We also show that, in long human 3' UTRs, miRNA target sites occur preferentially near the start and near the end of the 3' UTR. To characterize miRNA function beyond the predicted lists of targets we further present a method to infer significant associations between the sets of targets predicted for individual miRNAs and specific biochemical pathways, in particular those of the KEGG pathway database. We show that this approach retrieves several known functional miRNA-mRNA associations, and predicts novel functions for known miRNAs in cell growth and in development. Conclusion: We have presented a Bayesian target prediction algorithm without any tunable parameters, that can be applied to sequences from any clade of species. The algorithm automatically infers the phylogenetic distribution of functional sites for each miRNA, and…
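
    As a point of reference, the seed matching that richer target-site models build on reduces to scanning a 3' UTR for the reverse complement of the miRNA seed (nucleotides 2-8); the sequences below are illustrative:

    ```python
    # Minimal seed-match scan (the paper's Bayesian evolutionary model is far
    # richer). The UTR sequence is invented; let-7 is a well-known miRNA.
    COMPLEMENT = str.maketrans("AUCG", "UAGC")

    def seed_sites(mirna, utr):
        seed = mirna[1:8]                          # seed region, positions 2-8
        target = seed.translate(COMPLEMENT)[::-1]  # reverse complement (RNA)
        return [i for i in range(len(utr) - len(target) + 1)
                if utr[i:i + len(target)] == target]

    mirna = "UGAGGUAGUAGGUUGUAUAGUU"               # let-7
    utr = "AAAACUACCUCAAAAACUACCUCAGG"             # toy 3' UTR
    print(seed_sites(mirna, utr))                  # [4, 16]: two 7mer seed matches
    ```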

  1. Motor automaticity in Parkinson’s disease

    Science.gov (United States)

    Wu, Tao; Hallett, Mark; Chan, Piu

    2017-01-01

    Bradykinesia is the most important feature contributing to motor difficulties in Parkinson’s disease (PD). However, the pathophysiology underlying bradykinesia is not fully understood. One important aspect is that PD patients have difficulty in performing learned motor skills automatically, but this problem has been generally overlooked. Here we review motor deficits in PD associated with impaired motor automaticity, such as reduced arm swing, decreased stride length, freezing of gait, micrographia and reduced facial expression. Recent neuroimaging studies have revealed some neural mechanisms underlying impaired motor automaticity in PD, including less efficient neural coding of movement, failure to shift automated motor skills to the sensorimotor striatum, instability of the automatic mode within the striatum, and use of attentional control and/or compensatory efforts to execute movements usually performed automatically in healthy people. PD patients lose previously acquired automatic skills due to their impaired sensorimotor striatum, and have difficulty in acquiring new automatic skills or restoring lost motor skills. More investigation of the pathophysiology of motor automaticity, the effect of L-dopa or surgical treatments on automaticity, and the potential role of measures of automaticity in the early diagnosis of PD would be valuable. PMID:26102020

  2. Optimization methods for logical inference

    CERN Document Server

    Chandru, Vijay

    2011-01-01

    Merging logic and mathematics in deductive inference: an innovative, cutting-edge approach. Optimization methods for logical inference? Absolutely, say Vijay Chandru and John Hooker, two major contributors to this rapidly expanding field. And even though "solving logical inference problems with optimization methods may seem a bit like eating sauerkraut with chopsticks... it is the mathematical structure of a problem that determines whether an optimization model can help solve it, not the context in which the problem occurs." Presenting powerful, proven optimization techniques for logic in…

  3. Inference in 'poor' languages

    Energy Technology Data Exchange (ETDEWEB)

    Petrov, S.

    1996-10-01

    Languages with a solvable implication problem but without complete and consistent systems of inference rules ('poor' languages) are considered. The problem of the existence of a finite, complete and consistent inference rule system for a 'poor' language is stated independently of the language or rule syntax. Several properties of the problem are proved. An application of the results to the language of join dependencies is given.

  4. Inference of Transmission Network Structure from HIV Phylogenetic Trees.

    Science.gov (United States)

    Giardina, Federica; Romero-Severson, Ethan Obie; Albert, Jan; Britton, Tom; Leitner, Thomas

    2017-01-01

    Phylogenetic inference is an attractive means to reconstruct transmission histories and epidemics. However, there is not a perfect correspondence between transmission history and virus phylogeny. Both node height and topological differences may occur, depending on the interaction between within-host evolutionary dynamics and between-host transmission patterns. To investigate these interactions, we added a within-host evolutionary model in epidemiological simulations and examined if the resulting phylogeny could recover different types of contact networks. To further improve realism, we also introduced patient-specific differences in infectivity across disease stages, and on the epidemic level we considered incomplete sampling and the age of the epidemic. Second, we implemented an inference method based on approximate Bayesian computation (ABC) to discriminate among three well-studied network models and jointly estimate both network parameters and key epidemiological quantities such as the infection rate. Our ABC framework used both topological and distance-based tree statistics for comparison between simulated and observed trees. Overall, our simulations showed that a virus time-scaled phylogeny (genealogy) may be substantially different from the between-host transmission tree. This has important implications for the interpretation of what a phylogeny reveals about the underlying epidemic contact network. In particular, we found that while the within-host evolutionary process obscures the transmission tree, the diversification process and infectivity dynamics also add discriminatory power to differentiate between different types of contact networks. We also found that the possibility to differentiate contact networks depends on how far an epidemic has progressed, where distance-based tree statistics have more power early in an epidemic. Finally, we applied our ABC inference on two different outbreaks from the Swedish HIV-1 epidemic.
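
    The ABC loop itself is compact; a minimal rejection-ABC sketch with a toy simulator and summary statistic (the paper's simulator, summaries, and tolerances are far richer) looks like this:

    ```python
    import numpy as np

    # Minimal rejection ABC: keep prior draws whose simulated summary lands
    # close to the observed one. The "epidemic" simulator and the observed
    # value are invented stand-ins for tree-derived statistics.
    rng = np.random.default_rng(0)

    def simulate_summary(rate, n=200):
        return rng.poisson(rate, n).mean()        # toy summary statistic

    observed = 2.3                                # pretend summary from the data
    accepted = []
    for _ in range(20000):
        rate = rng.uniform(0.1, 5.0)              # prior over the infection rate
        if abs(simulate_summary(rate) - observed) < 0.1:   # ABC tolerance
            accepted.append(rate)

    print(round(np.mean(accepted), 2), len(accepted))  # posterior mean near 2.3
    ```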

  5. EI: A Program for Ecological Inference

    Directory of Open Access Journals (Sweden)

    Gary King

    2004-09-01

    Full Text Available The program EI provides a method of inferring individual behavior from aggregate data. It implements the statistical procedures, diagnostics, and graphics from the book A Solution to the Ecological Inference Problem: Reconstructing Individual Behavior from Aggregate Data (King 1997). Ecological inference, as traditionally defined, is the process of using aggregate (i.e., "ecological") data to infer discrete individual-level relationships of interest when individual-level data are not available. Ecological inferences are required in political science research when individual-level surveys are unavailable (e.g., local or comparative electoral politics), unreliable (racial politics), insufficient (political geography), or infeasible (political history). They are also required in numerous areas of major significance in public policy (e.g., for applying the Voting Rights Act) and in academic disciplines ranging from epidemiology and marketing to sociology and quantitative history.
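
    The deterministic bounds that ecological inference sharpens can be sketched directly; for a single aggregate unit with group share x and outcome share t, the unobserved group-specific rate is bounded as follows (numbers are invented):

    ```python
    # Duncan-Davis (method-of-bounds) sketch, the deterministic core that
    # King's statistical method refines with district-level information.
    def duncan_davis_bounds(x, t):
        """x: fraction of the population in group X; t: fraction with outcome T.
        Returns (lower, upper) bounds on P(T | X) for this aggregate unit."""
        lower = max(0.0, (t - (1 - x)) / x)   # even if all non-X have the outcome
        upper = min(1.0, t / x)               # even if only X-members have it
        return lower, upper

    print(duncan_davis_bounds(x=0.7, t=0.5))  # (~0.286, ~0.714)
    ```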

  6. Automatic Photoelectric Telescope Service

    International Nuclear Information System (INIS)

    Genet, R.M.; Boyd, L.J.; Kissell, K.E.; Crawford, D.L.; Hall, D.S.; BDM Corp., McLean, VA; Kitt Peak National Observatory, Tucson, AZ; Dyer Observatory, Nashville, TN)

    1987-01-01

    Automatic observatories have the potential of gathering sizable amounts of high-quality astronomical data at low cost. The Automatic Photoelectric Telescope Service (APT Service) has realized this potential and is routinely making photometric observations of a large number of variable stars. However, without observers to provide on-site monitoring, it was necessary to incorporate special quality checks into the operation of the APT Service at its multiple automatic telescope installation on Mount Hopkins. 18 references

  7. Automatic imitation: A meta-analysis.

    Science.gov (United States)

    Cracco, Emiel; Bardi, Lara; Desmet, Charlotte; Genschow, Oliver; Rigoni, Davide; De Coster, Lize; Radkova, Ina; Deschrijver, Eliane; Brass, Marcel

    2018-05-01

    Automatic imitation is the finding that movement execution is facilitated by compatible and impeded by incompatible observed movements. In the past 15 years, automatic imitation has been studied to understand the relation between perception and action in social interaction. Although research on this topic started in cognitive science, interest quickly spread to related disciplines such as social psychology, clinical psychology, and neuroscience. However, important theoretical questions have remained unanswered. Therefore, in the present meta-analysis, we evaluated seven key questions on automatic imitation. The results, based on 161 studies containing 226 experiments, revealed an overall effect size of g_z = 0.95, 95% CI [0.88, 1.02]. Moderator analyses identified automatic imitation as a flexible, largely automatic process that is driven by movement and effector compatibility, but is also influenced by spatial compatibility. Automatic imitation was found to be stronger for forced choice tasks than for simple response tasks, for human agents than for nonhuman agents, and for goalless actions than for goal-directed actions. However, it was not modulated by more subtle factors such as animacy beliefs, motion profiles, or visual perspective. Finally, there was no evidence for a relation between automatic imitation and either empathy or autism. Among other things, these findings point toward actor-imitator similarity as a crucial modulator of automatic imitation and challenge the view that imitative tendencies are an indicator of social functioning. The current meta-analysis has important theoretical implications and sheds light on longstanding controversies in the literature on automatic imitation and related domains. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
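
    The pooling behind a summary such as g_z = 0.95, 95% CI [0.88, 1.02] is standard random-effects meta-analysis; below is a minimal DerSimonian-Laird sketch with invented per-study effects and variances, not the meta-analysis data:

    ```python
    import numpy as np

    g = np.array([0.8, 1.1, 0.9, 1.2, 0.7])        # invented per-study effect sizes
    v = np.array([0.02, 0.05, 0.03, 0.04, 0.06])   # invented per-study variances

    w = 1 / v                                       # fixed-effect weights
    g_fe = np.sum(w * g) / w.sum()
    q = np.sum(w * (g - g_fe) ** 2)                 # Cochran's Q heterogeneity
    tau2 = max(0.0, (q - (len(g) - 1)) / (w.sum() - np.sum(w ** 2) / w.sum()))
    w_re = 1 / (v + tau2)                           # random-effects weights
    pooled = np.sum(w_re * g) / w_re.sum()
    se = np.sqrt(1 / w_re.sum())
    print(round(pooled, 2),
          [round(pooled - 1.96 * se, 2), round(pooled + 1.96 * se, 2)])
    ```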

  8. Logical inference techniques for loop parallelization

    KAUST Repository

    Oancea, Cosmin E.; Rauchwerger, Lawrence

    2012-01-01

    This paper presents a fully automatic approach to loop parallelization that integrates the use of static and run-time analysis and thus overcomes many known difficulties such as nonlinear and indirect array indexing and complex control flow. Our hybrid analysis framework validates the parallelization transformation by verifying the independence of the loop's memory references. To this end it represents array references using the USR (uniform set representation) language and expresses the independence condition as an equation, S = Ø, where S is a set expression representing array indexes. Using a language instead of an array-abstraction representation for S results in a smaller number of conservative approximations but exhibits a potentially-high runtime cost. To alleviate this cost we introduce a language translation F from the USR set-expression language to an equally rich language of predicates (F(S) ⇒ S = Ø). Loop parallelization is then validated using a novel logic inference algorithm that factorizes the obtained complex predicates (F(S)) into a sequence of sufficient-independence conditions that are evaluated first statically and, when needed, dynamically, in increasing order of their estimated complexities. We evaluate our automated solution on 26 benchmarks from PERFECTCLUB and SPEC suites and show that our approach is effective in parallelizing large, complex loops and obtains much better full program speedups than the Intel and IBM Fortran compilers. Copyright © 2012 ACM.
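
    The run-time half of such hybrid analysis reduces, in spirit, to checking before parallel execution that no two iterations touch the same memory in conflicting ways; below is a minimal sketch with an invented write-index function:

    ```python
    # Toy run-time independence check: parallelization is safe only if the
    # write sets of distinct iterations never overlap. The index function is
    # invented, not the USR machinery of the paper.
    def written_indices(i):
        return {2 * i, 2 * i + 1}     # array elements iteration i writes

    def independent(n_iters):
        seen = set()
        for i in range(n_iters):
            w = written_indices(i)
            if seen & w:              # overlap => cross-iteration dependence
                return False
            seen |= w
        return True

    print(independent(100))  # True: the writes partition the array, safe to parallelize
    ```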

  10. Development of the Automatic Modeling System for Reaction Mechanisms Using REX+JGG

    Science.gov (United States)

    Takahashi, Takahiro; Kawai, Kohei; Nakai, Hiroyuki; Ema, Yoshinori

    The identification of appropriate reaction models is very helpful for developing chemical vapor deposition (CVD) processes. In this study, we developed an automatic modeling system that analyzes experimental data on the cross-sectional shapes of films deposited on substrates with nanometer- or micrometer-sized trenches. The system then identifies a suitable reaction model to describe the film deposition. The inference engine used by the system to model the reaction mechanism was designed using real-coded genetic algorithms (RCGAs): a generation alternation model named "just generation gap" (JGG) and a real-coded crossover named "real-coded ensemble crossover" (REX). We studied the effect of REX+JGG on the system's performance, and found that the system with REX+JGG was the most accurate and reliable at model identification among the algorithms we compared.
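
    A compact sketch of the REX+JGG combination named above, minimising a toy objective in place of the model-fit criterion; the population size, k, and number of children are illustrative, not the paper's settings.

      import numpy as np

      rng = np.random.default_rng(0)

      def rex(parents, n_children):
          # REX: children are sampled around the parents' centroid using
          # normally distributed mixing weights
          k, n = parents.shape
          g = parents.mean(axis=0)
          xi = rng.normal(0.0, np.sqrt(1.0 / (k - 1)), size=(n_children, k))
          return g + xi @ (parents - g)

      def jgg_step(pop, f, k=4, n_children=20):
          # JGG: k randomly chosen parents are replaced by the k best children
          idx = rng.choice(len(pop), size=k, replace=False)
          children = rex(pop[idx], n_children)
          pop[idx] = children[np.argsort([f(c) for c in children])[:k]]
          return pop

      f = lambda x: float(np.sum(x ** 2))          # toy stand-in for model error
      pop = rng.uniform(-5, 5, size=(30, 3))
      for _ in range(200):
          pop = jgg_step(pop, f)
      print(min(f(x) for x in pop))                # approaches 0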

  11. The Automatic Integration of Folksonomies with Taxonomies Using Non-axiomatic Logic

    Science.gov (United States)

    Geldart, Joe; Cummins, Stephen

    Cooperative tagging systems such as folksonomies are powerful tools when used to annotate information resources. The inherent power of folksonomies is in their ability to allow casual users to easily contribute ad hoc, yet meaningful, resource metadata without any specialist training. Older folksonomies have begun to degrade because of their lack of internal structure and the use of many low-quality tags. This chapter describes a remedy for some of the problems associated with folksonomies. We introduce a method for the automatic integration and inference of the relationships between tags and resources in a folksonomy using non-axiomatic logic. We test this method on the CiteULike corpus of tags by comparing precision and recall between it and standard keyword search. Our results show that non-axiomatic reasoning is a promising technique for integrating tagging systems with more structured knowledge representations.
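
    To make the "non-axiomatic" part concrete, the sketch below shows the standard NAL deduction truth function, in which every statement carries a (frequency, confidence) pair; the tag-subsumption statements are hypothetical examples, not drawn from the CiteULike corpus.

      def nal_deduction(t1, t2):
          # combine "A -> B" <f1, c1> and "B -> C" <f2, c2> into "A -> C"
          (f1, c1), (f2, c2) = t1, t2
          return f1 * f2, f1 * f2 * c1 * c2

      # e.g. "python -> programming" and "programming -> computer-science"
      print(nal_deduction((0.9, 0.9), (0.8, 0.9)))   # -> (0.72, 0.5832)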

  12. Gauging Variational Inference

    Energy Technology Data Exchange (ETDEWEB)

    Chertkov, Michael [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ahn, Sungsoo [Korea Advanced Inst. Science and Technology (KAIST), Daejeon (Korea, Republic of); Shin, Jinwoo [Korea Advanced Inst. Science and Technology (KAIST), Daejeon (Korea, Republic of)

    2017-05-25

    Computing the partition function is the most important statistical inference task arising in applications of graphical models (GMs). Since it is computationally intractable, approximate methods have been used to resolve the issue in practice, where mean-field (MF) and belief propagation (BP) are arguably the most popular and successful approaches of a variational type. In this paper, we propose two new variational schemes, coined Gauged-MF (G-MF) and Gauged-BP (G-BP), improving MF and BP, respectively. Both provide lower bounds for the partition function by utilizing the so-called gauge transformation, which modifies factors of the GM while keeping the partition function invariant. Moreover, we prove that both G-MF and G-BP are exact for GMs with a single loop of a special structure, even though the bare MF and BP perform badly in this case. Our extensive experiments, on complete GMs of relatively small size and on large GMs (up to 300 variables), confirm that the newly proposed algorithms outperform and generalize MF and BP.
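
    For context, the sketch below computes the classical (ungauged) mean-field lower bound on log Z for a small pairwise binary model and checks it against exact enumeration; G-MF and G-BP tighten bounds of exactly this kind. The model size and coupling strengths are illustrative.

      import numpy as np
      from itertools import product

      rng = np.random.default_rng(1)
      n = 8
      J = np.triu(rng.normal(0, 0.3, (n, n)), 1)     # couplings J_ij, i < j
      h = rng.normal(0, 0.5, n)                      # local fields

      # exact log-partition function by brute force (2^n states)
      states = np.array(list(product([-1.0, 1.0], repeat=n)))
      energies = np.einsum('si,ij,sj->s', states, J, states) + states @ h
      log_z = np.log(np.exp(energies).sum())

      # naive mean-field: damped coordinate ascent on magnetisations m_i
      m = np.zeros(n)
      for _ in range(500):
          m = 0.8 * m + 0.2 * np.tanh(h + (J + J.T) @ m)
      p = (1 + m) / 2
      entropy = -np.sum(p * np.log(p + 1e-12) + (1 - p) * np.log(1 - p + 1e-12))
      mf_bound = m @ J @ m + h @ m + entropy         # Gibbs variational bound

      print(mf_bound, '<=', log_z)                   # MF lower-bounds log Z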

  13. Model-based and design-based inference goals frame how to account for neighborhood clustering in studies of health in overlapping context types.

    Science.gov (United States)

    Lovasi, Gina S; Fink, David S; Mooney, Stephen J; Link, Bruce G

    2017-12-01

    Accounting for non-independence in health research often warrants attention. Particularly, the availability of geographic information systems data has increased the ease with which studies can add measures of the local "neighborhood" even if participant recruitment was through other contexts, such as schools or clinics. We highlight a tension between two perspectives that is often present, but particularly salient when more than one type of potentially health-relevant context is indexed (e.g., both neighborhood and school). On the one hand, a model-based perspective emphasizes the processes producing outcome variation, and observed data are used to make inference about that process. On the other hand, a design-based perspective emphasizes inference to a well-defined finite population, and is commonly invoked by those using complex survey samples or those with responsibility for the health of local residents. These two perspectives have divergent implications when deciding whether clustering must be accounted for analytically and how to select among candidate cluster definitions, though the perspectives are by no means monolithic. There are tensions within each perspective as well as between perspectives. We aim to provide insight into these perspectives and their implications for population health researchers. We focus on the crucial step of deciding which cluster definition or definitions to use at the analysis stage, as this has consequences for all subsequent analytic and interpretational challenges with potentially clustered data.

  14. Stop valve with automatic control and locking for nuclear reactors

    International Nuclear Information System (INIS)

    Chung, D.K.

    1980-01-01

    This invention generally concerns an automatic control and locking stop valve. Specifically it relates to the use of such a valve in a nuclear reactor of the type containing absorber elements supported by a fluid and intended for stopping the reactor in complete safety [fr

  15. On the criticality of inferred models

    Science.gov (United States)

    Mastromatteo, Iacopo; Marsili, Matteo

    2011-10-01

    Advanced inference techniques allow one to reconstruct a pattern of interaction from high-dimensional data sets obtained by simultaneously probing thousands of units of extended systems, such as cells, neural tissues and financial markets. We focus here on the statistical properties of inferred models and argue that inference procedures are likely to yield models which are close to singular values of parameters, akin to critical points in physics where phase transitions occur. These are points where the response of physical systems to external perturbations, as measured by the susceptibility, is very large and diverges in the limit of infinite size. We show that the reparameterization-invariant metric in the space of probability distributions of these models (the Fisher information) is directly related to the susceptibility of the inferred model. As a result, distinguishable models tend to accumulate close to critical points, where the susceptibility diverges in infinite systems. This region is the one where the estimate of inferred parameters is most stable. In order to illustrate these points, we discuss inference of interacting point processes with application to financial data and show that sensible choices of observation time scales naturally yield models which are close to criticality.
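
    The paper's key observation can be reproduced in a few lines: for an exponential-family model p(s) ∝ exp(J·t(s)), the Fisher information I(J) equals the variance of the sufficient statistic t, i.e. a susceptibility. The toy sketch below, for a small Ising ring, shows I(J) peaking at an intermediate coupling, the finite-size analogue of a critical point.

      import numpy as np
      from itertools import product

      n = 12
      states = np.array(list(product([-1.0, 1.0], repeat=n)))
      t = np.sum(states * np.roll(states, 1, axis=1), axis=1)   # sum_i s_i s_{i+1}

      for J in np.linspace(0.0, 2.0, 9):
          w = np.exp(J * t)
          p = w / w.sum()
          fisher = p @ (t - p @ t) ** 2        # I(J) = Var_J[t(s)]
          print(f'J = {J:.2f}   I(J) = {fisher:.2f}')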

  17. An Inference Language for Imaging

    DEFF Research Database (Denmark)

    Pedemonte, Stefano; Catana, Ciprian; Van Leemput, Koen

    2014-01-01

    We introduce iLang, a language and software framework for probabilistic inference. The iLang framework enables the definition of directed and undirected probabilistic graphical models and the automated synthesis of high performance inference algorithms for imaging applications. The iLang framewor...

  18. Automatic evaluation isn't that crude! Moderation of masked affective priming by type of valence

    NARCIS (Netherlands)

    Wentura, D.; Degner, J.

    2010-01-01

    In two experiments, the automatic processing of evaluative information was investigated using a masked affective priming paradigm, varying valence (positive vs. negative) and relevance (other-relevant traits vs. possessor-relevant traits; Peeters, 1983) of prime and target stimuli. It was found that

  19. Text comprehension in children: comparing different classes of inferences by using on-line methodology

    Directory of Open Access Journals (Sweden)

    Alina Galvão Spinillo

    2007-01-01

    Full Text Available This study, using an on-line methodology, examined 7- and 9-year-old children's text comprehension in relation to different types of inferences constructed during the reading of a story: causal inferences, state inferences and inferences of prediction (what happens next in the story). The on-line methodology consists of interrupting the reading of the text and asking the child inferential questions about each passage read and about what the reader thinks will come next (prediction). Because inferences of prediction involve extratextual information and require raising hypotheses about the continuity of the narrative, children had difficulties in predicting events that had not yet occurred in the story. It was concluded that the ability to make inferences during text comprehension varies according to the type of inferential question presented, and that this ability develops with age. The innovative character of the on-line methodology and its relevance to research in this area are discussed.

  20. Automatic indexing, compiling and classification

    International Nuclear Information System (INIS)

    Andreewsky, Alexandre; Fluhr, Christian.

    1975-06-01

    A review of the principles of automatic indexing is followed by a comparison and summing-up of work by the authors and by a Soviet team from the Moscow INFORM-ELECTRO Institute. The mathematical and linguistic problems of the automatic building of thesauri and of automatic classification are examined [fr

  1. Inference

    DEFF Research Database (Denmark)

    Møller, Jesper

    2010-01-01

    Chapter 9: This contribution concerns statistical inference for parametric models used in stochastic geometry, based on quick and simple simulation-free procedures as well as on more comprehensive methods based on a maximum likelihood or Bayesian approach combined with Markov chain Monte Carlo (MCMC) techniques. Due to space limitations the focus is on spatial point processes.

  2. Automatic control study of the icing research tunnel refrigeration system

    Science.gov (United States)

    Kieffer, Arthur W.; Soeder, Ronald H.

    1991-01-01

    The Icing Research Tunnel (IRT) at the NASA Lewis Research Center is a subsonic, closed-return atmospheric tunnel. The tunnel includes a heat exchanger and a refrigeration plant to achieve the desired air temperature and a spray system to generate the type of icing conditions that would be encountered by aircraft. At the present time, the tunnel air temperature is controlled by manual adjustment of freon refrigerant flow control valves. An upgrade of this facility calls for these control valves to be adjusted by an automatic controller. The digital computer simulation of the IRT refrigeration plant and the automatic controller that was used in the simulation are discussed.
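
    As a rough illustration of what such an automatic controller does (the report's actual controller design is not reproduced here), the sketch below is a generic discrete PID loop computing a valve command from the error between desired and measured tunnel air temperatures; all gains and the sign convention are illustrative.

      def pid_step(setpoint, measured, state, kp=2.0, ki=0.1, kd=0.5, dt=1.0):
          err = setpoint - measured
          state['integral'] += err * dt
          deriv = (err - state['prev_err']) / dt
          state['prev_err'] = err
          return kp * err + ki * state['integral'] + kd * deriv   # valve command

      state = {'integral': 0.0, 'prev_err': 0.0}
      # tunnel at -6.5 degC, target -10 degC; how the command maps to valve
      # position depends on the plant's sign convention
      print(pid_step(-10.0, -6.5, state))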

  3. Feature Inference Learning and Eyetracking

    Science.gov (United States)

    Rehder, Bob; Colner, Robert M.; Hoffman, Aaron B.

    2009-01-01

    Besides traditional supervised classification learning, people can learn categories by inferring the missing features of category members. It has been proposed that feature inference learning promotes learning a category's internal structure (e.g., its typical features and interfeature correlations) whereas classification promotes the learning of…

  4. Learning Probabilistic Inference through Spike-Timing-Dependent Plasticity123

    Science.gov (United States)

    Pecevski, Dejan

    2016-01-01

    Numerous experimental data show that the brain is able to extract information from complex, uncertain, and often ambiguous experiences. Furthermore, it can use such learnt information for decision making through probabilistic inference. Several models have been proposed that aim at explaining how probabilistic inference could be performed by networks of neurons in the brain. We propose here a model that can also explain how such a neural network could acquire the necessary information from examples. We show that spike-timing-dependent plasticity, in combination with intrinsic plasticity, generates in ensembles of pyramidal cells with lateral inhibition a fundamental building block for this: probabilistic associations between neurons that represent, through their firing, current values of random variables. Furthermore, by combining such adaptive network motifs in a recursive manner, the resulting network is enabled to extract statistical information from complex input streams and to build an internal model of the distribution p* that generates the examples it receives. This holds even if p* contains higher-order moments. The analysis of this learning process is supported by a rigorous theoretical foundation. Furthermore, we show that the network can use the learnt internal model immediately for prediction, decision making, and other types of probabilistic inference. PMID:27419214

  5. Distinguishing between statistical significance and practical/clinical meaningfulness using statistical inference.

    Science.gov (United States)

    Wilkinson, Michael

    2014-03-01

    Decisions about support for predictions of theories in light of data are made using statistical inference. The dominant approach in sport and exercise science is the Neyman-Pearson (N-P) significance-testing approach. When applied correctly it provides a reliable procedure for making dichotomous decisions for accepting or rejecting zero-effect null hypotheses with known and controlled long-run error rates. Type I and type II error rates must be specified in advance and the latter controlled by conducting an a priori sample size calculation. The N-P approach does not provide the probability of hypotheses or indicate the strength of support for hypotheses in light of data, yet many scientists believe it does. Outcomes of analyses allow conclusions only about the existence of non-zero effects, and provide no information about the likely size of true effects or their practical/clinical value. Bayesian inference can show how much support data provide for different hypotheses, and how personal convictions should be altered in light of data, but the approach is complicated by formulating probability distributions about prior subjective estimates of population effects. A pragmatic solution is magnitude-based inference, which allows scientists to estimate the true magnitude of population effects and how likely they are to exceed an effect magnitude of practical/clinical importance, thereby integrating elements of subjective Bayesian-style thinking. While this approach is gaining acceptance, progress might be hastened if scientists appreciate the shortcomings of traditional N-P null hypothesis significance testing.
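
    The mechanics of the magnitude-based estimate are simple. Under a normal approximation to the sampling distribution (and an effectively flat prior), the chance that the true effect exceeds a smallest worthwhile change (SWC) follows directly from the observed effect and its standard error; the numbers below are invented for illustration.

      from math import erf, sqrt

      def prob_exceeds(observed, se, threshold):
          # P(true effect > threshold) for a N(observed, se^2) posterior
          z = (observed - threshold) / se
          return 0.5 * (1 + erf(z / sqrt(2)))

      obs, se, swc = 0.30, 0.15, 0.20
      print('P(beneficial) =', round(prob_exceeds(obs, se, swc), 2))
      print('P(harmful)    =', round(1 - prob_exceeds(obs, se, -swc), 2))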

  6. Automatic Prompt System in the Process of Mapping plWordNet on Princeton WordNet

    Directory of Open Access Journals (Sweden)

    Paweł Kędzia

    2015-06-01

    Full Text Available The paper offers a critical evaluation of the power and usefulness of an automatic prompt system based on the extended Relaxation Labelling algorithm in the process of (manual) mapping of plWordNet onto Princeton WordNet. To this end the results of manual mapping – that is, the inter-lingual relations between plWN and PWN synsets – are juxtaposed with the automatic prompts that were generated for the source-language synsets to be mapped. We check the number and type of inter-lingual relations introduced on the basis of automatic prompts and the distance of the respective prompt synsets from the actual target-language synsets.

  7. Automatic analysis of children’s engagement using interactional network features

    NARCIS (Netherlands)

    Kim, Jaebok; Truong, Khiet Phuong

    We explored the automatic analysis of vocal non-verbal cues of a group of children in the context of engagement and collaborative play. For the current study, we defined two types of engagement on groups of children: harmonised and unharmonised. A spontaneous audiovisual corpus with groups of

  8. Automatic analysis of microscopic images of red blood cell aggregates

    Science.gov (United States)

    Menichini, Pablo A.; Larese, Mónica G.; Riquelme, Bibiana D.

    2015-06-01

    Red blood cell aggregation is one of the most important factors in blood viscosity at stasis or at very low rates of flow. The basic structure of aggregates is a linear array of cells, commonly termed rouleaux. Enhanced or abnormal aggregation is seen in clinical conditions such as diabetes and hypertension, producing alterations in the microcirculation, some of which can be analyzed through the characterization of aggregated cells. Image processing and analysis for the characterization of RBC aggregation have frequently been done manually or semi-automatically using interactive tools. We propose a system that processes images of RBC aggregation and automatically obtains the characterization and quantification of the different types of RBC aggregates. The present technique could be adapted for routine use in hemorheological and clinical biochemistry laboratories, because this automatic method is rapid, efficient and economical, and at the same time independent of the user performing the analysis (ensuring repeatability of the analysis).

  9. Automatic 3d Building Model Generations with Airborne LiDAR Data

    Science.gov (United States)

    Yastikli, N.; Cetin, Z.

    2017-11-01

    LiDAR systems are becoming more and more popular because of their potential for obtaining point clouds of vegetation and man-made objects on the earth's surface in an accurate and quick way. Nowadays, these airborne systems are frequently used in a wide range of applications such as DEM/DSM generation, topographic mapping, object extraction, vegetation mapping, 3-dimensional (3D) modelling and simulation, change detection, engineering works, revision of maps, coastal management and bathymetry. 3D building model generation is one of the most prominent applications of LiDAR systems, and is of major importance for urban planning, illegal construction monitoring, 3D city modelling, environmental simulation, tourism, security, telecommunication, mobile navigation, etc. Manual or semi-automatic 3D building model generation is a costly and very time-consuming process for these applications. Thus, an approach is needed that generates 3D building models automatically, in a simple and quick way, for the many studies which include building modelling. In this study, automatic 3D building model generation with airborne LiDAR data is aimed at. An approach is proposed for automatic 3D building model generation that includes the automatic point-based classification of the raw LiDAR point cloud. The proposed point-based classification uses hierarchical rules for the automatic production of 3D building models. Detailed analyses of the parameters used in the hierarchical rules have been performed to improve classification results using different test areas identified in the study area. The proposed approach has been tested in the study area, which has partly open areas, forest areas and many types of buildings, in Zekeriyakoy, Istanbul, using the TerraScan module of TerraSolid. The 3D building model was generated automatically using the results of the automatic point-based classification. The obtained results of this research on the study area verified that automatic 3D

  11. Bayesian pedigree inference with small numbers of single nucleotide polymorphisms via a factor-graph representation.

    Science.gov (United States)

    Anderson, Eric C; Ng, Thomas C

    2016-02-01

    We develop a computational framework for addressing pedigree inference problems using small numbers (80-400) of single nucleotide polymorphisms (SNPs). Our approach relaxes the assumptions, which are commonly made, that sampling is complete with respect to the pedigree and that there is no genotyping error. It relies on representing the inferred pedigree as a factor graph and invoking the Sum-Product algorithm to compute and store quantities that allow the joint probability of the data to be rapidly computed under a large class of rearrangements of the pedigree structure. This allows efficient MCMC sampling over the space of pedigrees, and, hence, Bayesian inference of pedigree structure. In this paper we restrict ourselves to inference of pedigrees without loops using SNPs assumed to be unlinked. We present the methodology in general for multigenerational inference, and we illustrate the method by applying it to the inference of full sibling groups in a large sample (n=1157) of Chinook salmon typed at 95 SNPs. The results show that our method provides a better point estimate and estimate of uncertainty than the currently best-available maximum-likelihood sibling reconstruction method. Extensions of this work to more complex scenarios are briefly discussed.
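
    The factor-graph machinery the method builds on can be illustrated on a toy chain. The sketch below runs sum-product messages on x1 - f12 - x2 - f23 - x3 and checks the resulting marginal against brute force; in the pedigree setting the variables would be genotypes and the factors Mendelian transmission probabilities with a genotyping-error model, but here they are arbitrary.

      import numpy as np

      k = 3                                    # states per variable
      rng = np.random.default_rng(2)
      f12, f23 = rng.random((k, k)), rng.random((k, k))          # pairwise factors
      u1, u2, u3 = rng.random(k), rng.random(k), rng.random(k)   # unary evidence

      m1_to_2 = u1 @ f12                       # sum over x1 of u1(x1) f12(x1, x2)
      m3_to_2 = f23 @ u3                       # sum over x3 of f23(x2, x3) u3(x3)
      marg2 = u2 * m1_to_2 * m3_to_2
      marg2 /= marg2.sum()

      # brute-force check of the marginal of x2
      joint = np.einsum('a,ab,b,bc,c->abc', u1, f12, u2, f23, u3)
      print(marg2, joint.sum(axis=(0, 2)) / joint.sum())   # identical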

  12. Causal inference in nonlinear systems: Granger causality versus time-delayed mutual information

    Science.gov (United States)

    Li, Songting; Xiao, Yanyang; Zhou, Douglas; Cai, David

    2018-05-01

    The Granger causality (GC) analysis has been extensively applied to infer causal interactions in dynamical systems arising from economy and finance, physics, bioinformatics, neuroscience, social science, and many other fields. In the presence of potential nonlinearity in these systems, the validity of the GC analysis in general is questionable. To illustrate this, here we first construct minimal nonlinear systems and show that the GC analysis fails to infer causal relations in these systems—it gives rise to all types of incorrect causal directions. In contrast, we show that the time-delayed mutual information (TDMI) analysis is able to successfully identify the direction of interactions underlying these nonlinear systems. We then apply both methods to neuroscience data collected from experiments and demonstrate that the TDMI analysis but not the GC analysis can identify the direction of interactions among neuronal signals. Our work exemplifies inference hazards in the GC analysis in nonlinear systems and suggests that the TDMI analysis can be an appropriate tool in such a case.
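
    A minimal numerical sketch of the paper's point: a purely quadratic coupling from x to y is invisible to linear (correlation/GC-style) measures but is picked up by the delayed mutual information, here estimated with a simple histogram. The system and binning choices are illustrative.

      import numpy as np

      rng = np.random.default_rng(3)
      T = 50_000
      x = rng.normal(size=T)
      y = np.empty(T); y[0] = 0.0
      y[1:] = x[:-1] ** 2 + 0.1 * rng.normal(size=T - 1)   # nonlinear drive x -> y

      def mutual_info(a, b, bins=16):
          pab, _, _ = np.histogram2d(a, b, bins=bins)
          pab /= pab.sum()
          pa, pb = pab.sum(axis=1), pab.sum(axis=0)
          mask = pab > 0
          return np.sum(pab[mask] * np.log(pab[mask] / np.outer(pa, pb)[mask]))

      print('corr(x_{t-1}, y_t) ~', round(float(np.corrcoef(x[:-1], y[1:])[0, 1]), 3))
      print('TDMI(x_{t-1}; y_t) ~', round(float(mutual_info(x[:-1], y[1:])), 3))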

  13. TYPE Ia SUPERNOVA LIGHT CURVE INFERENCE: HIERARCHICAL MODELS IN THE OPTICAL AND NEAR-INFRARED

    International Nuclear Information System (INIS)

    Mandel, Kaisey S.; Narayan, Gautham; Kirshner, Robert P.

    2011-01-01

    We have constructed a comprehensive statistical model for Type Ia supernova (SN Ia) light curves spanning optical through near-infrared (NIR) data. A hierarchical framework coherently models multiple random and uncertain effects, including intrinsic supernova (SN) light curve covariances, dust extinction and reddening, and distances. An improved BAYESN Markov Chain Monte Carlo code computes probabilistic inferences for the hierarchical model by sampling the global probability density of parameters describing individual SNe and the population. We have applied this hierarchical model to optical and NIR data of 127 SNe Ia from PAIRITEL, CfA3, Carnegie Supernova Project, and the literature. We find an apparent population correlation between the host galaxy extinction A_V and the ratio of total-to-selective dust absorption R_V. For SNe with low dust extinction, R_V ∼ 2.5-2.9, while at high extinctions, A_V ≳ 1, low values of R_V < 2 are favored. The NIR luminosities are excellent standard candles and are less sensitive to dust extinction. They exhibit low correlation with optical peak luminosities, and thus provide independent information on distances. The combination of NIR and optical data constrains the dust extinction and improves the predictive precision of individual SN Ia distances by about 60%. Using cross-validation, we estimate an rms distance modulus prediction error of 0.11 mag for SNe with optical and NIR data versus 0.15 mag for SNe with optical data alone. Continued study of SNe Ia in the NIR is important for improving their utility as precise and accurate cosmological distance indicators.

  14. Automatic generation of data merging program codes.

    OpenAIRE

    Hyensook, Kim; Oussena, Samia; Zhang, Ying; Clark, Tony

    2010-01-01

    Data merging is an essential part of ETL (Extract-Transform-Load) processes used to build a data warehouse system. To avoid reinventing merging techniques, we propose a Data Merging Meta-model (DMM) and its transformation into executable program codes in the manner of model-driven engineering. DMM allows defining relationships of different model entities and their merging types at the conceptual level. Our formalized transformation, described using ATL (ATLAS Transformation Language), enables automatic g...

  15. Automatic dose-rate controlling equipment

    International Nuclear Information System (INIS)

    Szasz, T.; Nagy Czirok, Cs.; Batki, L.; Antal, S.

    1977-01-01

    A patent for dose-rate controlling equipment that can be attached to X-ray image-amplifiers is presented. In the new equipment the current of the photocathode of the image-amplifier is fed into the regulating unit, which controls the X-ray generator automatically. The advantages of the equipment are the following: it can be simply attached to any type of X-ray image-amplifier; it accomplishes fast and sensitive regulation; it makes possible the control of both the mA and the kV values; and it is attached to the most reliable point of the image-transmission chain. (L.E.)

  16. Transitioning Existing Content: inferring organisation-specific documents

    Directory of Open Access Journals (Sweden)

    Arijit Sengupta

    2000-11-01

    Full Text Available A definition for a document type within an organization represents an organizational norm about the way the organizational actors represent products and supporting evidence of organizational processes. Generating a good organization-specific document structure is, therefore, important since it can capture a shared understanding among the organizational actors about how certain business processes should be performed. Current tools that generate document type definitions focus on the underlying technology, emphasizing tags created in a single instance document. The tools, thus, fall short of capturing the shared understanding between organizational actors about how a given document type should be represented. We propose a method for inferring organization-specific document structures using multiple instance documents as inputs. The method consists of heuristics that combine individual document definitions, which may have been compiled using standard algorithms. We propose a number of heuristics utilizing artificial intelligence and natural language processing techniques. As the research progresses, the heuristics will be tested on a suite of test cases representing multiple instance documents for different document types. The complete methodology will be implemented as a research prototype

  17. Forward and backward inference in spatial cognition.

    Directory of Open Access Journals (Sweden)

    Will D Penny

    Full Text Available This paper shows that the various computations underlying spatial cognition can be implemented using statistical inference in a single probabilistic model. Inference is implemented using a common set of 'lower-level' computations involving forward and backward inference over time. For example, to estimate where you are in a known environment, forward inference is used to optimally combine location estimates from path integration with those from sensory input. To decide which way to turn to reach a goal, forward inference is used to compute the likelihood of reaching that goal under each option. To work out which environment you are in, forward inference is used to compute the likelihood of sensory observations under the different hypotheses. For reaching sensory goals that require a chaining together of decisions, forward inference can be used to compute a state trajectory that will lead to that goal, and backward inference to refine the route and estimate control signals that produce the required trajectory. We propose that these computations are reflected in recent findings of pattern replay in the mammalian brain. Specifically, that theta sequences reflect decision making, theta flickering reflects model selection, and remote replay reflects route and motor planning. We also propose a mapping of the above computational processes onto lateral and medial entorhinal cortex and hippocampus.
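
    The location-estimation step described above reduces to precision-weighted fusion of two Gaussians, as in a one-dimensional Kalman update; the sketch below uses invented numbers for the path-integration prior and the sensory cue.

      def fuse(mu_prior, var_prior, obs, var_obs):
          # forward inference: combine the path-integration estimate (prior)
          # with a noisy sensory observation, weighting each by its precision
          k = var_prior / (var_prior + var_obs)      # Kalman gain
          return mu_prior + k * (obs - mu_prior), (1 - k) * var_prior

      # path integration says ~5.0 m along the corridor, vision says 6.0 m
      print(fuse(5.0, 1.0, 6.0, 0.5))   # posterior leans toward the sharper cue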

  18. A comparison of algorithms for inference and learning in probabilistic graphical models.

    Science.gov (United States)

    Frey, Brendan J; Jojic, Nebojsa

    2005-09-01

    Research into methods for reasoning under uncertainty is currently one of the most exciting areas of artificial intelligence, largely because it has recently become possible to record, store, and process large amounts of data. While impressive achievements have been made in pattern classification problems such as handwritten character recognition, face detection, speaker identification, and prediction of gene function, it is even more exciting that researchers are on the verge of introducing systems that can perform large-scale combinatorial analyses of data, decomposing the data into interacting components. For example, computational methods for automatic scene analysis are now emerging in the computer vision community. These methods decompose an input image into its constituent objects, lighting conditions, motion patterns, etc. Two of the main challenges are finding effective representations and models in specific applications and finding efficient algorithms for inference and learning in these models. In this paper, we advocate the use of graph-based probability models and their associated inference and learning algorithms. We review exact techniques and various approximate, computationally efficient techniques, including iterated conditional modes, the expectation maximization (EM) algorithm, Gibbs sampling, the mean field method, variational techniques, structured variational techniques and the sum-product algorithm ("loopy" belief propagation). We describe how each technique can be applied in a vision model of multiple, occluding objects and contrast the behaviors and performances of the techniques using a unifying cost function, free energy.
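
    As a concrete instance of one surveyed technique, the sketch below runs the EM algorithm on a two-component one-dimensional Gaussian mixture, the simplest case of decomposing data into interacting components; the data and initialisation are synthetic.

      import numpy as np

      rng = np.random.default_rng(4)
      x = np.concatenate([rng.normal(-2, 1, 500), rng.normal(3, 1, 500)])

      def normal_pdf(v, mu, sd):
          return np.exp(-0.5 * ((v - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

      w, mu, sd = np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([1.0, 1.0])
      for _ in range(100):
          # E-step: responsibility of each component for each point
          lik = np.stack([w[k] * normal_pdf(x, mu[k], sd[k]) for k in range(2)])
          r = lik / lik.sum(axis=0)
          # M-step: re-estimate weights, means and standard deviations
          nk = r.sum(axis=1)
          w, mu = nk / len(x), (r @ x) / nk
          sd = np.sqrt((r * (x - mu[:, None]) ** 2).sum(axis=1) / nk)

      print(w, mu, sd)   # recovers ~[0.5, 0.5], [-2, 3], [1, 1]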

  19. The Diagnostic System of A – 604 Automatic Transmission

    Directory of Open Access Journals (Sweden)

    Czaban Jaroslaw

    2014-09-01

    Full Text Available Automatic gearboxes are gaining popularity in Europe. The limited interest in the diagnosis of this type of transmission in Poland results from its small share of the market of operated cars, so special diagnostic devices are not readily available. These factors lead to expensive repairs, often involving the replacement of a subassembly with a new or aftermarket one. Prophylactic diagnostic tests, which could eliminate future gearbox system failures, are conducted only to a small extent. In the paper, a proposal for a diagnostic system for the popular A-604 gearbox is presented. The authors are exploring the possibility of using such devices for the functional checkout of gearboxes after renovation. The system drives the tested object, connected to a simulated load, while a special controller, replacing the original one, is responsible for controlling gearbox operation. In this way the state of the mechanical and hydraulic parts is evaluated. Analysis of the signal runs registered during measurements allows conclusions about the correctness of operation, whereas comparison with stock data verifies the technical state of the automatic gearbox.

  20. Intrinsic and extrinsic motivators of attachment under active inference

    Science.gov (United States)

    Nolte, Tobias; Friston, Karl; Edalat, Abbas

    2018-01-01

    This paper addresses the formation of infant attachment types within the context of active inference: a holistic account of action, perception and learning in the brain. We show how the organised forms of attachment (secure, avoidant and ambivalent) might arise in (Bayesian) infants. Specifically, we show that these distinct forms of attachment emerge from a minimisation of free energy—over interoceptive states relating to internal stress levels—when seeking proximity to caregivers who have a varying impact on these interoceptive states. In line with empirical findings in disrupted patterns of affective communication, we then demonstrate how exteroceptive cues (in the form of caregiver-mediated AMBIANCE affective communication errors, ACE) can result in disorganised forms of attachment in infants of caregivers who consistently increase stress when the infant seeks proximity, but can have an organising (towards ambivalence) effect in infants of inconsistent caregivers. In particular, we differentiate disorganised attachment from avoidance in terms of the high epistemic value of proximity seeking behaviours (resulting from the caregiver’s misleading exteroceptive cues) that preclude the emergence of coherent and organised behavioural policies. Our work, the first to formulate infant attachment in terms of active inference, makes a new testable prediction with regards to the types of affective communication errors that engender ambivalent attachment. PMID:29621266

  1. Hybrid Origins of Citrus Varieties Inferred from DNA Marker Analysis of Nuclear and Organelle Genomes

    Science.gov (United States)

    Kitajima, Akira; Nonaka, Keisuke; Yoshioka, Terutaka; Ohta, Satoshi; Goto, Shingo; Toyoda, Atsushi; Fujiyama, Asao; Mochizuki, Takako; Nagasaki, Hideki; Kaminuma, Eli; Nakamura, Yasukazu

    2016-01-01

    Most indigenous citrus varieties are assumed to be natural hybrids, but their parentage has so far been determined in only a few cases because of their wide genetic diversity and the low transferability of DNA markers. Here we infer the parentage of indigenous citrus varieties using simple sequence repeat and indel markers developed from various citrus genome sequence resources. Parentage tests with 122 known hybrids using the selected DNA markers certify their transferability among those hybrids. Identity tests confirm that most variant strains are selected mutants, but we find four types of kunenbo (Citrus nobilis) and three types of tachibana (Citrus tachibana), for which we suggest different origins. Structure analysis with DNA markers that are in Hardy–Weinberg equilibrium deduces three basic taxa coinciding with the current understanding of citrus ancestors. Genotyping analysis of 101 indigenous citrus varieties with 123 selected DNA markers infers the parentages of 22 indigenous citrus varieties, including Satsuma, Temple, and iyo, and single parents of 45 indigenous citrus varieties, including kunenbo, C. ichangensis, and Ichang lemon, by allele-sharing and parentage tests. Genotyping analysis of chloroplast and mitochondrial genomes using 11 DNA markers classifies their cytoplasmic genotypes into 18 categories and deduces the combination of seed and pollen parents. Likelihood ratio analysis verifies the inferred parentages with significant scores. The reconstructed genealogy identifies 12 types of varieties consisting of Kishu, kunenbo, yuzu, koji, sour orange, dancy, kobeni mikan, sweet orange, tachibana, Cleopatra, willowleaf mandarin, and pummelo, which have played pivotal roles in the occurrence of these indigenous varieties. The inferred parentage of the indigenous varieties confirms their hybrid origins, as found by recent studies. PMID:27902727

  2. A formal model of interpersonal inference

    Directory of Open Access Journals (Sweden)

    Michael eMoutoussis

    2014-03-01

    Full Text Available Introduction: We propose that active Bayesian inference – a general framework for decision-making – can equally be applied to interpersonal exchanges. Social cognition, however, entails special challenges. We address these challenges through a novel formulation of a formal model and demonstrate its psychological significance. Method: We review relevant literature, especially with regard to interpersonal representations, formulate a mathematical model and present a simulation study. The model accommodates normative models from utility theory and places them within the broader setting of Bayesian inference. Crucially, we endow people's prior beliefs, into which utilities are absorbed, with preferences of self and others. The simulation illustrates the model's dynamics and furnishes elementary predictions of the theory. Results: 1. Because beliefs about self and others inform both the desirability and plausibility of outcomes, in this framework interpersonal representations become beliefs that have to be actively inferred. This inference, akin to 'mentalising' in the psychological literature, is based upon the outcomes of interpersonal exchanges. 2. We show how some well-known social-psychological phenomena (e.g. self-serving biases) can be explained in terms of active interpersonal inference. 3. Mentalising naturally entails Bayesian updating of how people value social outcomes. Crucially this includes inference about one's own qualities and preferences. Conclusion: We inaugurate a Bayes-optimal framework for modelling intersubject variability in mentalising during interpersonal exchanges. Here, interpersonal representations are endowed with explicit functional and affective properties. We suggest the active inference framework lends itself to the study of psychiatric conditions where mentalising is distorted.

  3. Type tests to the automatic system of thermoluminescent dosimetry acquired by the CPHR for personnel dosimetry

    International Nuclear Information System (INIS)

    Molina P, D.; Pernas S, R.

    2005-01-01

    The CPHR individual monitoring service acquired an automatic RADOS TLD system to improve its capacity to satisfy the increasing needs of its national customers. The TLD system consists of two automatic TLD readers (model DOSACUS), a TLD irradiator, and personal dosimeter cards with slides and holders. The dosimeters were composed of these personal dosimeter cards and LiF:Mg,Cu,P (model GR-200) detectors. The readers provide the detectors with a constant-temperature readout cycle using hot nitrogen gas. In order to evaluate the performance characteristics of the system, different performance tests recommended by the IEC 1066 standard were carried out. Important dosimetric characteristics evaluated were batch homogeneity, reproducibility, detection threshold, energy dependence, residual signal and fading. The results of the tests showed good performance characteristics of the system. (Author)

  4. Distributional Inference

    NARCIS (Netherlands)

    Kroese, A.H.; van der Meulen, E.A.; Poortema, Klaas; Schaafsma, W.

    1995-01-01

    The making of statistical inferences in distributional form is conceptually complicated because the epistemic 'probabilities' assigned are mixtures of fact and fiction. In this respect they are essentially different from 'physical' or 'frequency-theoretic' probabilities. The distributional form is

  5. Automaticity and stability of adaptation to a foreign-accented speaker

    NARCIS (Netherlands)

    Witteman, M.J.; Bardhan, N.P.; Weber, A.C.; McQueen, J.M.

    2015-01-01

    In three cross-modal priming experiments we asked whether adaptation to a foreign-accented speaker is automatic, and whether adaptation can be seen after a long delay between initial exposure and test. Dutch listeners were exposed to a Hebrew-accented Dutch speaker with two types of Dutch words:

  6. A Flexible and Configurable Architecture for Automatic Control Remote Laboratories

    Science.gov (United States)

    Kalúz, Martin; García-Zubía, Javier; Fikar, Miroslav; Cirka, Luboš

    2015-01-01

    In this paper, we propose a novel approach in hardware and software architecture design for implementation of remote laboratories for automatic control. In our contribution, we show the solution with flexible connectivity at back-end, providing features of multipurpose usage with different types of experimental devices, and fully configurable…

  7. Automatic sets and Delone sets

    International Nuclear Information System (INIS)

    Barbe, A; Haeseler, F von

    2004-01-01

    Automatic sets D ⊂ Z^m are characterized by having a finite number of decimations. They are equivalently generated by fixed points of certain substitution systems, or by certain finite automata. As examples, two-dimensional versions of the Thue-Morse, Baum-Sweet, Rudin-Shapiro and paperfolding sequences are presented. We give a necessary and sufficient condition for an automatic set D ⊂ Z^m to be a Delone set in R^m. The result is then extended to automatic sets that are defined as fixed points of certain substitutions. The morphology of automatic sets is discussed by means of examples
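
    For intuition, one standard two-dimensional Thue-Morse construction (whether it matches the paper's exact definition is not claimed here) admits a membership test that a finite automaton reading the binary digits of (i, j) can decide:

      def in_thue_morse_2d(i, j):
          # member iff the total number of 1-bits of i and j is even
          return (bin(i).count('1') + bin(j).count('1')) % 2 == 0

      for i in range(8):
          print(''.join('#' if in_thue_morse_2d(i, j) else '.' for j in range(8)))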

  8. Application of software technology to automatic test data analysis

    Science.gov (United States)

    Stagner, J. R.

    1991-01-01

    The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.

  9. Continuous Integrated Invariant Inference, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed project will develop a new technique for invariant inference and embed this and other current invariant inference and checking techniques in an...

  10. MRI in assessing children with learning disability, focal findings, and reduced automaticity.

    Science.gov (United States)

    Urion, David K; Huff, Hanalise V; Carullo, Maria Paulina

    2015-08-18

    In children with clinically diagnosed learning disabilities with focal findings on neurologic or neuropsychological evaluations, there is a hypothesized association between disorders in automaticity and focal structural abnormalities observed in brain MRIs. We undertook a retrospective analysis of cases referred to a tertiary-hospital-based learning disabilities program. Individuals were coded as having a focal deficit if either neurologic or neuropsychological evaluation demonstrated focal dysfunction. Those with abnormal MRI findings were categorized based on findings. Children with abnormalities from each of these categories were compared in terms of deficits in automaticity, as measured by the tasks of Rapid Automatized Naming, Rapid Alternating Stimulus Naming, or the timed motor performance battery from the Physical and Neurological Examination for Soft Signs. Data were compared in children with and without disorders of automaticity regarding type of brain structure abnormality. Of the 1,587 children evaluated, 127 had a focal deficit. Eighty-seven had a brain MRI (52 on 1.5-tesla machines and 35 on 3.0-tesla machines). Forty of these images were found to be abnormal. These children were compared with a clinic sample of 150 patients with learning disabilities and no focal findings on examination, who also had undergone MRI. Only 5 of the latter group had abnormalities on MRI. Reduced verbal automaticity was associated with cerebellar abnormalities, whereas reduced automaticity on motor or motor and verbal tasks was associated with white matter abnormalities. Reduced automaticity of retrieval and slow timed motor performance appear to be highly associated with MRI findings.

  11. Hanger-type laundry monitor system

    International Nuclear Information System (INIS)

    Aoyama, Kei; Kouno, Yoshio; Yanagishima, Ryouhei; Ikeda, Yasuyuki; Nakatani, Masahiro

    1987-01-01

    Laundry monitors are installed in nuclear power plants and other nuclear facilities in order to efficiently detect radioactive contamination remaining on the surfaces of working clothes that were used in the controlled area and washed afterward. The number of working clothes that must be measured has been increasing in accordance with the increase in nuclear facilities. This fact, together with recently intensified radiation control, requires automatic, high-speed and highly sensitive measurement. The conveyor-type laundry monitor, in which the working clothes are inserted by a metal net conveyor, has been generally used, and recently a new system with an automatic folder has become more popular. However, this type of system has limited capacity, because the clothes are conveyed longitudinally, and it requires considerable space when installed. Fuji Electric Co., Ltd. has been engaged in research and development on an optimum laundry monitor system for nuclear facilities since a joint investigation with ten electric power companies in Japan in 1982. Consequently, a hanger-type laundry monitor system using an automatic hanger conveyor was developed, and two systems were delivered to Chubu Electric Power Co., Ltd. in 1986. This system detects radioactive contamination on the working clothes, picks out the contaminated clothes and folds the uncontaminated clothes fully automatically and continuously. Moreover, it shortens the measurement time, because the clothes are conveyed transversely, and saves installation space, so that it can be regarded as a notably complete system. This report describes the outline of the hanger-type laundry monitor system. (author)

  12. AUTOMATIC CLASSIFICATION OF VARIABLE STARS IN CATALOGS WITH MISSING DATA

    International Nuclear Information System (INIS)

    Pichara, Karim; Protopapas, Pavlos

    2013-01-01

    We present an automatic classification method for astronomical catalogs with missing data. We use Bayesian networks and a probabilistic graphical model that allows us to perform inference to predict missing values given observed data and dependency relationships between variables. To learn a Bayesian network from incomplete data, we use an iterative algorithm that utilizes sampling methods and expectation maximization to estimate the distributions and probabilistic dependencies of variables from data with missing values. To test our model, we use three catalogs with missing data (SAGE, Two Micron All Sky Survey, and UBVI) and one complete catalog (MACHO). We examine how classification accuracy changes when information from missing data catalogs is included, how our method compares to traditional missing data approaches, and at what computational cost. Integrating these catalogs with missing data, we find that classification of variable objects improves by a few percent and by 15% for quasar detection while keeping the computational cost the same.
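
    The core inference-with-missing-data idea can be sketched without the Bayesian-network machinery: the simplified EM loop below imputes each missing entry with its conditional mean under a multivariate Gaussian and re-estimates the parameters (a full EM would also add the conditional covariance in the M-step). Dimensions and the missingness rate are illustrative.

      import numpy as np

      rng = np.random.default_rng(5)
      A = rng.normal(size=(3, 3))
      true_mu, true_cov = np.array([0.0, 1.0, -1.0]), A @ A.T + np.eye(3)
      X = rng.multivariate_normal(true_mu, true_cov, size=500)
      X[rng.random(X.shape) < 0.2] = np.nan          # ~20% of entries missing

      mu, cov = np.nanmean(X, axis=0), np.eye(3)
      for _ in range(50):
          Xf = X.copy()
          for r in range(len(X)):
              miss = np.isnan(X[r]); obs = ~miss
              if miss.all():
                  Xf[r] = mu                         # nothing observed at all
              elif miss.any():
                  # conditional mean of missing given observed entries
                  Xf[r, miss] = mu[miss] + cov[np.ix_(miss, obs)] @ np.linalg.solve(
                      cov[np.ix_(obs, obs)], X[r, obs] - mu[obs])
          mu, cov = Xf.mean(axis=0), np.cov(Xf, rowvar=False)

      print(np.round(mu, 2))   # close to the true mean despite missing entries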

  13. Automatic Image Segmentation Using Active Contours with Univariate Marginal Distribution

    Directory of Open Access Journals (Sweden)

    I. Cruz-Aceves

    2013-01-01

    Full Text Available This paper presents a novel automatic image segmentation method based on the theory of active contour models and estimation of distribution algorithms. The proposed method uses the univariate marginal distribution model to infer statistical dependencies between the control points on different active contours. These contours have been generated through an alignment process of reference shape priors, in order to increase the exploration and exploitation capabilities regarding different interactive segmentation techniques. This proposed method is applied in the segmentation of the hollow core in microscopic images of photonic crystal fibers and it is also used to segment the human heart and ventricular areas from datasets of computed tomography and magnetic resonance images, respectively. Moreover, to evaluate the performance of the medical image segmentations compared to regions outlined by experts, a set of similarity measures has been adopted. The experimental results suggest that the proposed image segmentation method outperforms the traditional active contour model and the interactive Tseng method in terms of segmentation accuracy and stability.
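
    The univariate marginal distribution model at the heart of the method can be sketched in its continuous form: each variable keeps an independent Gaussian marginal that is refitted to the best fraction of each sampled population. In the paper the variables are contour control points scored by an image energy; the quadratic objective below is a toy stand-in.

      import numpy as np

      rng = np.random.default_rng(6)
      target = np.array([2.0, -1.0, 0.5])                        # toy optimum
      energy = lambda pts: np.sum((pts - target) ** 2, axis=1)   # to minimise

      mu, sd = np.zeros(3), np.full(3, 2.0)
      for _ in range(40):
          pop = rng.normal(mu, sd, size=(100, 3))           # sample marginals
          elite = pop[np.argsort(energy(pop))[:20]]         # keep best 20%
          mu, sd = elite.mean(axis=0), elite.std(axis=0) + 1e-3
      print(np.round(mu, 2))                                # nears the optimum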

  15. Automatic measuring device for atomic oxygen concentrations (1962)

    Energy Technology Data Exchange (ETDEWEB)

    Weill, J; Deiss, M; Mercier, R [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1962-07-01

    Within the framework of the activities of the Autonomous Reactor Electronics Section we have developed a device, which renders automatic one type of measurement carried out in the Physical Chemistry Department at the Saclay Research Centre. We define here: - the physico-chemical principle of the apparatus which is adapted to the measurement of atomic oxygen concentrations; - the physical principle of the automatic measurement; - the properties, performance, constitution, use and maintenance of the automatic measurement device. It is concluded that the principle of the automatic device, whose tests have confirmed the estimation of the theoretical performance, could usefully be adapted to other types of measurement. (authors)

  16. Development of an automatic reactor inspection system

    International Nuclear Information System (INIS)

    Kim, Jae Hee; Eom, Heung Seop; Lee, Jae Cheol; Choi, Yoo Raek; Moon, Soon Seung

    2002-02-01

    Using recent technologies in mobile robotics and computer science, we developed an automatic inspection system for weld lines of the reactor vessel. The ultrasonic inspection of the reactor pressure vessel is currently performed by commercialized robot manipulators. However, since the conventional fixed-type robot manipulator is very large, heavy and expensive, it needs a long inspection time and is hard to handle and maintain. In order to resolve these problems, we developed a new automatic inspection system using a small mobile robot crawling on the vertical wall of the reactor vessel. According to our conceptual design, we developed the reactor inspection system including an underwater inspection robot, a laser position control subsystem, an ultrasonic data acquisition/analysis subsystem and a main control subsystem. We successfully carried out underwater experiments on a reactor vessel mockup and on a real reactor vessel, ready for Ulchin nuclear power plant unit 6, at Doosan Heavy Industry in Korea. After this project, we plan to commercialize our inspection system. Using this system, we can expect a large reduction of the inspection time, performance enhancement, automatic management of inspection history, etc. From the economic point of view, we can also expect import substitution of more than 4 million dollars. The established essential technologies for intelligent control and automation are expected to be applied to the automation of similar systems in nuclear power plants.

  17. Text Summarization Evaluation: Correlating Human Performance on an Extrinsic Task with Automatic Intrinsic Metrics

    National Research Council Canada - National Science Library

    President, Stacy F; Dorr, Bonnie J

    2006-01-01

    This research describes two types of summarization evaluation methods, intrinsic and extrinsic, and concentrates on determining the level of correlation between automatic intrinsic methods and human...

  18. Development of a parameter optimization technique for the design of automatic control systems

    Science.gov (United States)

    Whitaker, P. H.

    1977-01-01

    Parameter optimization techniques for the design of linear automatic control systems that are applicable to both continuous and digital systems are described. The model performance index is used as the optimization criterion because of the physical insight that can be attached to it. The design emphasis is to start with the simplest system configuration that experience indicates would be practical. Design parameters are specified, and a digital computer program is used to select that set of parameter values which minimizes the performance index. The resulting design is examined, and complexity, through the use of more complex information processing or more feedback paths, is added only if performance fails to meet operational specifications. System performance specifications are assumed to be such that the desired step function time response of the system can be inferred.
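
    The workflow the record describes — fix a simple controller configuration, then let a program search the parameter space for the values minimizing a performance index — can be sketched as follows. The first-order plant, PI controller structure and integral-squared-error index below are illustrative assumptions, not the report's model performance index.

        import numpy as np
        from scipy.optimize import minimize

        def step_response_cost(params, dt=0.01, t_end=5.0):
            """Integral-squared-error of a unit-step response for a PI
            controller driving the toy plant dy/dt = -y + u."""
            kp, ki = params
            y, integ, cost = 0.0, 0.0, 0.0
            for _ in range(int(t_end / dt)):
                e = 1.0 - y
                integ += e * dt
                u = kp * e + ki * integ
                y += dt * (-y + u)          # Euler step of the plant
                cost += e * e * dt
                if not np.isfinite(y):
                    return 1e6              # penalize unstable gain settings
            return cost

        best = minimize(step_response_cost, x0=[1.0, 0.5], method="Nelder-Mead")
        print(best.x)   # gains that minimize the performance index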

  19. Model-based and design-based inference goals frame how to account for neighborhood clustering in studies of health in overlapping context types

    Directory of Open Access Journals (Sweden)

    Gina S. Lovasi

    2017-12-01

    Full Text Available Accounting for non-independence in health research often warrants attention. Particularly, the availability of geographic information systems data has increased the ease with which studies can add measures of the local “neighborhood” even if participant recruitment was through other contexts, such as schools or clinics. We highlight a tension between two perspectives that is often present, but particularly salient when more than one type of potentially health-relevant context is indexed (e.g., both neighborhood and school. On the one hand, a model-based perspective emphasizes the processes producing outcome variation, and observed data are used to make inference about that process. On the other hand, a design-based perspective emphasizes inference to a well-defined finite population, and is commonly invoked by those using complex survey samples or those with responsibility for the health of local residents. These two perspectives have divergent implications when deciding whether clustering must be accounted for analytically and how to select among candidate cluster definitions, though the perspectives are by no means monolithic. There are tensions within each perspective as well as between perspectives. We aim to provide insight into these perspectives and their implications for population health researchers. We focus on the crucial step of deciding which cluster definition or definitions to use at the analysis stage, as this has consequences for all subsequent analytic and interpretational challenges with potentially clustered data.

  20. Quantum-Like Representation of Non-Bayesian Inference

    Science.gov (United States)

    Asano, M.; Basieva, I.; Khrennikov, A.; Ohya, M.; Tanaka, Y.

    2013-01-01

    This research is related to the problem of "irrational decision making or inference" that has been discussed in cognitive psychology. There are experimental studies whose statistical data cannot be described by classical probability theory, and the process of decision making generating these data cannot be reduced to classical Bayesian inference. For this problem, a number of quantum-like cognitive models of decision making have been proposed. Our previous work represented the classical Bayesian inference in a natural way in the framework of quantum mechanics. By using this representation, in this paper we discuss the non-Bayesian (irrational) inference that is biased by effects like quantum interference. Further, we describe the "psychological factor" disturbing "rationality" as an "environment" correlating with the "main system" of usual Bayesian inference.

  1. Bayesian Inference Methods for Sparse Channel Estimation

    DEFF Research Database (Denmark)

    Pedersen, Niels Lovmand

    2013-01-01

    This thesis deals with sparse Bayesian learning (SBL) with application to radio channel estimation. As opposed to the classical approach for sparse signal representation, we focus on the problem of inferring complex signals. Our investigations within SBL constitute the basis for the development...... of Bayesian inference algorithms for sparse channel estimation. Sparse inference methods aim at finding the sparse representation of a signal given in some overcomplete dictionary of basis vectors. Within this context, one of our main contributions to the field of SBL is a hierarchical representation...... analysis of the complex prior representation, where we show that the ability to induce sparse estimates of a given prior heavily depends on the inference method used and, interestingly, whether real or complex variables are inferred. We also show that the Bayesian estimators derived from the proposed...
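
    For orientation, a textbook (real-valued) sparse Bayesian learning iteration with automatic relevance determination looks roughly as follows; the thesis's contributions concern complex-valued signals and richer hierarchical priors, which this sketch does not capture, and the demo data are invented.

        import numpy as np

        def sbl_ard(Phi, y, noise_var=0.01, iters=50):
            """Classic SBL/ARD update: each dictionary column gets its own
            precision alpha_j; columns whose precision diverges are effectively
            pruned, yielding a sparse weight vector."""
            n, m = Phi.shape
            alpha = np.ones(m)
            for _ in range(iters):
                Sigma = np.linalg.inv(np.diag(alpha) + Phi.T @ Phi / noise_var)
                mu = Sigma @ Phi.T @ y / noise_var
                gamma = 1.0 - alpha * np.diag(Sigma)   # effective d.o.f. per atom
                alpha = gamma / (mu ** 2 + 1e-12)
            return mu

        rng = np.random.default_rng(0)
        Phi = rng.standard_normal((50, 100))           # overcomplete dictionary
        w = np.zeros(100); w[[3, 30, 77]] = [1.0, -2.0, 1.5]
        y = Phi @ w + 0.1 * rng.standard_normal(50)
        w_hat = sbl_ard(Phi, y)                        # sparse estimate of w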

  2. Automatisms: bridging clinical neurology with criminal law.

    Science.gov (United States)

    Rolnick, Joshua; Parvizi, Josef

    2011-03-01

    The law, like neurology, grapples with the relationship between disease states and behavior. Sometimes, the two disciplines share the same terminology, such as automatism. In law, the "automatism defense" is a claim that action was involuntary or performed while unconscious. Someone charged with a serious crime can acknowledge committing the act and yet may go free if, relying on the expert testimony of clinicians, the court determines that the act of crime was committed in a state of automatism. In this review, we explore the relationship between the use of automatism in the legal and clinical literature. We close by addressing several issues raised by the automatism defense: semantic ambiguity surrounding the term automatism, the presence or absence of consciousness during automatisms, and the methodological obstacles that have hindered the study of cognition during automatisms. Copyright © 2010 Elsevier Inc. All rights reserved.

  3. Statistical inference: an integrated Bayesian/likelihood approach

    CERN Document Server

    Aitkin, Murray

    2010-01-01

    Filling a gap in current Bayesian theory, Statistical Inference: An Integrated Bayesian/Likelihood Approach presents a unified Bayesian treatment of parameter inference and model comparisons that can be used with simple diffuse prior specifications. This novel approach provides new solutions to difficult model comparison problems and offers direct Bayesian counterparts of frequentist t-tests and other standard statistical methods for hypothesis testing. After an overview of the competing theories of statistical inference, the book introduces the Bayes/likelihood approach used throughout.

  4. Research designs and making causal inferences from health care studies.

    Science.gov (United States)

    Flannelly, Kevin J; Jankowski, Katherine R B

    2014-01-01

    This article summarizes the major types of research designs used in healthcare research, including experimental, quasi-experimental, and observational studies. Observational studies are divided into survey studies (descriptive and correlational studies), case-studies and analytic studies, the last of which are commonly used in epidemiology: case-control, retrospective cohort, and prospective cohort studies. Similarities and differences among the research designs are described and the relative strength of evidence they provide is discussed. Emphasis is placed on five criteria for drawing causal inferences that are derived from the writings of the philosopher John Stuart Mill, especially his methods or canons. The application of the criteria to experimentation is explained. Particular attention is given to the degree to which different designs meet the five criteria for making causal inferences. Examples of specific studies that have used various designs in chaplaincy research are provided.

  5. Illustration interface of accident progression in PWR by quick inference based on multilevel flow models

    International Nuclear Information System (INIS)

    Yoshikawa, H.; Ouyang, J.; Niwa, Y.

    2006-01-01

    In this paper, a new accident inference method is proposed using a goal- and function-oriented modeling method called the Multilevel Flow Model, focusing on explaining the cause-consequence relations and the objective of automatic actions during an accident at a nuclear power plant. Users can easily grasp how the various plant parameters will behave and how the various safety facilities will be activated sequentially to cope with the accident until the nuclear power plant settles into a safe state, i.e., the shutdown state. The applicability of the developed method was validated by conducting an internet-based 'view' experiment with voluntary respondents; in the future, the interface design will be further elaborated and instruction contents introduced to make it a usable CAI system. (authors)

  6. Manual and automatic locomotion scoring systems in dairy cows: A review

    NARCIS (Netherlands)

    Schlageter-Tello, A.; Bokkers, E.A.M.; Groot Koerkamp, P.W.G.; Hertem, van T.; Viazzi, S.; Romanini Bites, E.; Halachmi, I.; Bahr, C.; Berckmans, D.; Lokhorst, K.

    2014-01-01

    The objective of this review was to describe, compare and evaluate agreement, reliability, and validity of manual and automatic locomotion scoring systems (MLSSs and ALSSs, respectively) used in dairy cattle lameness research. There are many different types of MLSSs and ALSSs. Twenty-five MLSSs were

  7. Genetic interaction motif finding by expectation maximization – a novel statistical model for inferring gene modules from synthetic lethality

    Directory of Open Access Journals (Sweden)

    Ye Ping

    2005-12-01

    Full Text Available Abstract Background Synthetic lethality experiments identify pairs of genes with complementary function. More direct functional associations (for example, greater probability of membership in a single protein complex) may be inferred between genes that share synthetic lethal interaction partners than between genes that are directly synthetic lethal. Probabilistic algorithms that identify gene modules based on motif discovery are highly appropriate for the analysis of synthetic lethal genetic interaction data and have great potential in integrative analysis of heterogeneous datasets. Results We have developed Genetic Interaction Motif Finding (GIMF), an algorithm for unsupervised motif discovery from synthetic lethal interaction data. Interaction motifs are characterized by position weight matrices and optimized through expectation maximization. Given a seed gene, GIMF performs a nonlinear transform on the input genetic interaction data and automatically assigns genes to the motif or non-motif category. We demonstrate the capacity to extract known and novel pathways for Saccharomyces cerevisiae (budding yeast). Annotations suggested for several uncharacterized genes are supported by recent experimental evidence. GIMF is efficient in computation, requires no training and automatically down-weights promiscuous genes with high degrees. Conclusion GIMF effectively identifies pathways from synthetic lethality data with several unique features. It is mostly suitable for building gene modules around seed genes. Optimal choice of one single model parameter allows construction of gene networks with different levels of confidence. The generic probabilistic framework of GIMF may be used to group other types of biological entities, such as proteins, based on stochastic motifs. Analysis of the strongest motifs discovered by the algorithm indicates that synthetic lethal interactions are depleted between genes within a motif, suggesting that synthetic

  8. Automatic recognition of cardiac arrhythmias based on the geometric patterns of Poincaré plots

    International Nuclear Information System (INIS)

    Zhang, Lijuan; Guo, Tianci; Xi, Bin; Fan, Yang; Wang, Kun; Bi, Jiacheng; Wang, Ying

    2015-01-01

    The Poincaré plot emerges as an effective tool for assessing cardiovascular autonomic regulation. It displays nonlinear characteristics of heart rate variability (HRV) from electrocardiographic (ECG) recordings and gives a global view of the long range of ECG signals. In telemedicine or computer-aided diagnosis systems, it would offer significant auxiliary information for diagnosis if the patterns of the Poincaré plots could be automatically classified. Therefore, we developed an automatic classification system to distinguish five geometric patterns of the Poincaré plots arising from four types of cardiac arrhythmias. Statistical features are designed from the plot measurements, and an ensemble classifier of three types of neural networks is proposed. To address the difficulty of setting a proper threshold for classifying the multiple categories, the threshold selection strategy is analyzed. 24 h ECG monitoring recordings from 674 patients, covering four types of cardiac arrhythmias, are adopted for recognition. For comparison, Support Vector Machine (SVM) classifiers with linear and Gaussian kernels are also applied. The experimental results demonstrate the effectiveness of the extracted features and the better performance of the designed classifier. Our study can be applied to automatically diagnose the corresponding sinus rhythms and arrhythmia substrates in telemedicine and computer-aided diagnosis systems. (paper)
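
    The standard quantitative descriptors of a Poincaré plot, on which geometric-pattern features of this kind typically build, are SD1 and SD2: the dispersion of successive RR-interval pairs perpendicular to and along the line of identity. A minimal sketch (the RR intervals are made up; the paper's feature set is richer):

        import numpy as np

        def poincare_features(rr):
            """SD1/SD2 descriptors of the Poincare plot of RR intervals (ms).
            SD1 captures short-term variability, SD2 long-term variability."""
            x, y = rr[:-1], rr[1:]
            sd1 = np.std((y - x) / np.sqrt(2))   # spread across the identity line
            sd2 = np.std((y + x) / np.sqrt(2))   # spread along the identity line
            return sd1, sd2, sd1 / sd2

        rr = np.array([812, 800, 795, 810, 830, 825, 840, 820], dtype=float)
        print(poincare_features(rr))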

  9. PhySIC_IST: cleaning source trees to infer more informative supertrees.

    Science.gov (United States)

    Scornavacca, Celine; Berry, Vincent; Lefort, Vincent; Douzery, Emmanuel J P; Ranwez, Vincent

    2008-10-04

    supertrees than PhySIC, while preserving low type I error compared to the well-known MRP method. Two biological case studies on animals confirm that the STC preprocess successfully detects anomalies in the source trees while STC+PhySIC_IST provides well-resolved supertrees agreeing with current knowledge in systematics. The paper introduces and tests two new methodologies, PhySIC_IST and STC, that demonstrate the interest in inferring non-plenary supertrees as well as preprocessing the source trees. An implementation of the methods is available at: http://www.atgc-montpellier.fr/physic_ist/.

  10. An automatic taxonomy of galaxy morphology using unsupervised machine learning

    Science.gov (United States)

    Hocking, Alex; Geach, James E.; Sun, Yi; Davey, Neil

    2018-01-01

    We present an unsupervised machine learning technique that automatically segments and labels galaxies in astronomical imaging surveys using only pixel data. Distinct from previous unsupervised machine learning approaches used in astronomy we use no pre-selection or pre-filtering of target galaxy type to identify galaxies that are similar. We demonstrate the technique on the Hubble Space Telescope (HST) Frontier Fields. By training the algorithm using galaxies from one field (Abell 2744) and applying the result to another (MACS 0416.1-2403), we show how the algorithm can cleanly separate early and late type galaxies without any form of pre-directed training for what an 'early' or 'late' type galaxy is. We then apply the technique to the HST Cosmic Assembly Near-infrared Deep Extragalactic Legacy Survey (CANDELS) fields, creating a catalogue of approximately 60 000 classifications. We show how the automatic classification groups galaxies of similar morphological (and photometric) type and make the classifications public via a catalogue, a visual catalogue and galaxy similarity search. We compare the CANDELS machine-based classifications to human-classifications from the Galaxy Zoo: CANDELS project. Although there is not a direct mapping between Galaxy Zoo and our hierarchical labelling, we demonstrate a good level of concordance between human and machine classifications. Finally, we show how the technique can be used to identify rarer objects and present lensed galaxy candidates from the CANDELS imaging.

  11. Visual recognition and inference using dynamic overcomplete sparse learning.

    Science.gov (United States)

    Murray, Joseph F; Kreutz-Delgado, Kenneth

    2007-09-01

    We present a hierarchical architecture and learning algorithm for visual recognition and other visual inference tasks such as imagination, reconstruction of occluded images, and expectation-driven segmentation. Using properties of biological vision for guidance, we posit a stochastic generative world model and from it develop a simplified world model (SWM) based on a tractable variational approximation that is designed to enforce sparse coding. Recent developments in computational methods for learning overcomplete representations (Lewicki & Sejnowski, 2000; Teh, Welling, Osindero, & Hinton, 2003) suggest that overcompleteness can be useful for visual tasks, and we use an overcomplete dictionary learning algorithm (Kreutz-Delgado, et al., 2003) as a preprocessing stage to produce accurate, sparse codings of images. Inference is performed by constructing a dynamic multilayer network with feedforward, feedback, and lateral connections, which is trained to approximate the SWM. Learning is done with a variant of the back-propagation-through-time algorithm, which encourages convergence to desired states within a fixed number of iterations. Vision tasks require large networks, and to make learning efficient, we take advantage of the sparsity of each layer to update only a small subset of elements in a large weight matrix at each iteration. Experiments on a set of rotated objects demonstrate various types of visual inference and show that increasing the degree of overcompleteness improves recognition performance in difficult scenes with occluded objects in clutter.

  12. Inference Attacks and Control on Database Structures

    Directory of Open Access Journals (Sweden)

    Muhamed Turkanovic

    2015-02-01

    Full Text Available Today’s databases store information with sensitivity levels that range from public to highly sensitive; hence, ensuring confidentiality can be highly important, but also requires costly control. This paper focuses on the inference problem on different database structures. It presents possible threats to privacy in relation to inference, and control methods for mitigating these threats. The paper shows that using only access control, without any inference control, is inadequate, since these models are unable to protect against indirect data access. Furthermore, it covers new inference problems which arise from the dimensions of new technologies like XML, semantics, etc.

  13. A METHOD OF AUTOMATIC DETERMINATION OF THE NUMBER OF THE ELECTRICAL MOTORS SIMULTANEOUSLY WORKING IN GROUP

    Directory of Open Access Journals (Sweden)

    A. V. Voloshko

    2016-11-01

    Full Text Available Purpose. To propose a method for automatically determining the number of operating high-voltage electric motors in a group of the same type, based on the analysis of power consumption data obtained from the electric power meters installed at the motor connections. Results. An algorithm was developed for automatically determining the number of working electric motors in a group of the same type, based on determining the minimum power value above which a motor is considered to be on. Originality. For the first time, a method for the automatic determination of the number of working high-voltage motors of the same type in a group was proposed. Practical value. The obtained results may be used to introduce automated accounting of the running time of each motor and to calculate the parameters of an equivalent induction or synchronous motor.
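
    The decision rule reduces to thresholding each motor's metered power: a motor is counted as running whenever its reading exceeds the minimum power value. A toy sketch, with hypothetical readings and an illustrative threshold:

        import numpy as np

        P_MIN_KW = 5.0   # assumed minimum power below which a motor is "off"

        # Hypothetical active-power readings (kW) from meters installed at the
        # connections of four same-type motors, one row per metering interval.
        readings = np.array([
            [120.3, 0.4, 118.9, 0.2],
            [119.8, 117.5, 118.1, 0.3],
        ])

        running = (readings > P_MIN_KW).sum(axis=1)
        print(running)   # number of motors running in each interval -> [2 3]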

  14. Semi-automatic ultrasonic inspection of PWR upper internal immersed components

    International Nuclear Information System (INIS)

    Dombret, P.; Coquette, A.; Cermak, J.; Verspeelt, D.

    1985-01-01

    The present paper describes the characteristics of a semi-automatic ultrasonic inspection system. The components inspected are the so-called flexures, small pins located at the upper part of the control rod tube-guides, some of which happened to break in a few Westinghouse-type PWRs. Inspection results and other system capabilities are also mentioned.

  15. Dispatch of Helicopter Emergency Medical Services Via Advanced Automatic Collision Notification.

    Science.gov (United States)

    Matsumoto, Hisashi; Mashiko, Kunihiro; Hara, Yoshiaki; Yagi, Takanori; Hayashida, Kazuyuki; Mashiko, Kazuki; Saito, Nobuyuki; Iida, Hiroaki; Motomura, Tomokazu; Yasumatsu, Hiroshi; Kameyama, Daisuke; Hirabayashi, Atsushi; Yokota, Hiroyuki; Ishikawa, Hirotoshi; Kunimatsu, Takaji

    2016-03-01

    Advanced automatic collision notification (AACN) is a system for predicting occupant injury from collision information. If the helicopter emergency medical services (HEMS) physician can be alerted by AACN, it may be possible to reduce the time to patient contact. The purpose of this study was to validate the feasibility of early HEMS dispatch via AACN. A full-scale validation study was conducted. A car equipped with AACN was made to collide with a wall. Immediately after the collision, the HEMS was alerted directly by the operation center, which received the information from AACN. Elapsed times were recorded and compared with those inferred from the normal, real-world HEMS emergency request process. AACN information was sent to the operation center only 7 s after the collision; the HEMS was dispatched after 3 min. The helicopter landed at the temporary helipad 18 min later. Finally, medical intervention was started 21 min after the collision. Without AACN, it was estimated that the HEMS would be requested 14 min after the collision by fire department personnel. The start of treatment was estimated to be at 32 min, which was 11 min later than that associated with the use of AACN. The dispatch of the HEMS using the AACN can shorten the start time of treatment for patients in motor vehicle collisions. This study demonstrated that it is feasible to automatically alert and activate the HEMS via AACN. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Inference comprehension in text reading: Performance of individuals with right- versus left-hemisphere lesions and the influence of cognitive functions.

    Science.gov (United States)

    Silagi, Marcela Lima; Radanovic, Marcia; Conforto, Adriana Bastos; Mendonça, Lucia Iracema Zanotto; Mansur, Leticia Lessa

    2018-01-01

    Right-hemisphere lesions (RHL) may impair inference comprehension. However, comparative studies between left-hemisphere lesions (LHL) and RHL are rare, especially regarding reading comprehension. Moreover, further knowledge of the influence of cognition on inferential processing in this task is needed. To compare the performance of patients with RHL and LHL on an inference reading comprehension task. We also aimed to analyze the effects of lesion site and to verify correlations between cognitive functions and performance on the task. Seventy-five subjects were equally divided into the groups RHL, LHL, and control group (CG). The Implicit Management Test was used to evaluate inference comprehension. In this test, subjects read short written passages and subsequently answer five types of questions (explicit, logical, distractor, pragmatic, and other), which require different types of inferential reasoning. The cognitive functional domains of attention, memory, executive functions, language, and visuospatial abilities were assessed using the Cognitive Linguistic Quick Test (CLQT). The LHL and RHL groups presented difficulties in inferential comprehension in comparison with the CG. However, the RHL group presented lower scores than the LHL group on logical, pragmatic and other questions. A covariance analysis did not show any effect of lesion site within the hemispheres. Overall, all cognitive domains were correlated with all the types of questions from the inference test (especially logical, pragmatic, and other). Attention and visuospatial abilities affected the scores of both the RHL and LHL groups, and only memory influenced the performance of the RHL group. Lesions in either hemisphere may cause difficulties in making inferences during reading. However, processing more complex inferences was more difficult for patients with RHL than for those with LHL, which suggests that the right hemisphere plays an important role in tasks with higher comprehension demands.

  17. Inference comprehension in text reading: Performance of individuals with right- versus left-hemisphere lesions and the influence of cognitive functions.

    Directory of Open Access Journals (Sweden)

    Marcela Lima Silagi

    Full Text Available Right-hemisphere lesions (RHL) may impair inference comprehension. However, comparative studies between left-hemisphere lesions (LHL) and RHL are rare, especially regarding reading comprehension. Moreover, further knowledge of the influence of cognition on inferential processing in this task is needed. To compare the performance of patients with RHL and LHL on an inference reading comprehension task. We also aimed to analyze the effects of lesion site and to verify correlations between cognitive functions and performance on the task. Seventy-five subjects were equally divided into the groups RHL, LHL, and control group (CG). The Implicit Management Test was used to evaluate inference comprehension. In this test, subjects read short written passages and subsequently answer five types of questions (explicit, logical, distractor, pragmatic, and other), which require different types of inferential reasoning. The cognitive functional domains of attention, memory, executive functions, language, and visuospatial abilities were assessed using the Cognitive Linguistic Quick Test (CLQT). The LHL and RHL groups presented difficulties in inferential comprehension in comparison with the CG. However, the RHL group presented lower scores than the LHL group on logical, pragmatic and other questions. A covariance analysis did not show any effect of lesion site within the hemispheres. Overall, all cognitive domains were correlated with all the types of questions from the inference test (especially logical, pragmatic, and other). Attention and visuospatial abilities affected the scores of both the RHL and LHL groups, and only memory influenced the performance of the RHL group. Lesions in either hemisphere may cause difficulties in making inferences during reading. However, processing more complex inferences was more difficult for patients with RHL than for those with LHL, which suggests that the right hemisphere plays an important role in tasks with higher comprehension

  18. Colour transformations and K-means segmentation for automatic cloud detection

    Directory of Open Access Journals (Sweden)

    Martin Blazek

    2015-08-01

    Full Text Available The main aim of this work is to find simple criteria for the automatic recognition of several meteorological phenomena using optical digital sensors (e.g., wide-field cameras, automatic DSLR cameras or robotic telescopes). The output of those sensors is commonly represented in RGB channels containing information about both colour and luminosity, even when normalised. Transformation into other colour spaces (e.g., CIE 1931 xyz, CIE L*a*b*, YCbCr) can separate colour from luminosity, which is especially useful in the image processing for automatic cloud boundary recognition. Different colour transformations provide different sectorizations of cloudy images. Hence, the analysed meteorological phenomena (cloud types, clear sky) project differently into the colour diagrams of each international colour system. In such diagrams, statistical tools can be applied in search of criteria which could distinguish clear sky from a covered one and possibly even perform a meteorological classification of cloud types. For the purpose of this work, a database of sky images (both clear and cloudy) was acquired, with emphasis on a variety of different observation conditions (e.g., time, altitude, solar angle). The effectiveness of several colour transformations for meteorological application is discussed, and the representation of different clouds (or clear sky) in those colour systems is analysed. Utilisation of this algorithm would be useful in all-sky surveys, supplementary meteorological observations, solar cell effectiveness predictions or daytime astronomical solar observations.
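
    A rough sketch of such a pipeline — separate luminosity from chroma, then cluster pixels on the chroma components only — follows, using a YCbCr transform and a small hand-rolled K-means. The random image, cluster count and interpretation of the two clusters are illustrative assumptions, not the paper's tuned criteria.

        import numpy as np

        def rgb_to_ycbcr(img):
            """Split float RGB in [0,1] into luminance Y and chroma Cb, Cr."""
            r, g, b = img[..., 0], img[..., 1], img[..., 2]
            y = 0.299 * r + 0.587 * g + 0.114 * b
            return y, 0.564 * (b - y), 0.713 * (r - y)

        def kmeans(X, k=2, iters=20, seed=0):
            rng = np.random.default_rng(seed)
            centers = X[rng.choice(len(X), k, replace=False)]
            for _ in range(iters):
                labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
                centers = np.array([X[labels == j].mean(axis=0)
                                    if np.any(labels == j) else centers[j]
                                    for j in range(k)])
            return labels

        # Cluster on chroma only, so cloud/sky separation is insensitive to
        # overall brightness; a real pipeline would post-classify the clusters.
        img = np.random.rand(64, 64, 3)          # stand-in for an all-sky frame
        _, cb, cr = rgb_to_ycbcr(img)
        mask = kmeans(np.stack([cb.ravel(), cr.ravel()], axis=1)).reshape(64, 64)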

  19. Supervised dictionary learning for inferring concurrent brain networks.

    Science.gov (United States)

    Zhao, Shijie; Han, Junwei; Lv, Jinglei; Jiang, Xi; Hu, Xintao; Zhao, Yu; Ge, Bao; Guo, Lei; Liu, Tianming

    2015-10-01

    Task-based fMRI (tfMRI) has been widely used to explore functional brain networks via a predefined stimulus paradigm in the fMRI scan. Traditionally, the general linear model (GLM) has been the dominant approach for detecting task-evoked networks. However, GLM focuses on task-evoked or event-evoked brain responses and possibly ignores intrinsic brain functions. In comparison, dictionary learning and sparse coding methods have attracted much attention recently, and these methods have shown the promise of automatically and systematically decomposing fMRI signals into meaningful task-evoked and intrinsic concurrent networks. Nevertheless, two notable limitations of current data-driven dictionary learning methods are that the prior knowledge of the task paradigm is not sufficiently utilized and that establishing correspondences among dictionary atoms in different brains has been challenging. In this paper, we propose a novel supervised dictionary learning and sparse coding method for inferring functional networks from tfMRI data, which combines the advantages of model-driven and data-driven methods. The basic idea is to fix the task stimulus curves as predefined model-driven dictionary atoms and only optimize the remaining, data-driven dictionary atoms. Application of this novel methodology on the publicly available human connectome project (HCP) tfMRI datasets has achieved promising results.
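
    The key idea — keep the task-paradigm regressors as fixed dictionary atoms and learn only the remaining data-driven atoms — can be sketched as an alternating procedure. The sparse-coding step below (a ridge solve followed by soft-thresholding) is a deliberately crude stand-in for the proper sparse solver used in the paper, and all shapes and data are invented.

        import numpy as np

        def supervised_dict_learning(Y, D_task, n_free=10, lam=0.1, iters=30, seed=0):
            """Alternate sparse coding / dictionary update with fixed task atoms.
            Y: (time, voxels) signals; D_task: (time, k) fixed task regressors;
            only the n_free data-driven atoms are updated."""
            rng = np.random.default_rng(seed)
            t = Y.shape[0]
            D_free = rng.standard_normal((t, n_free))
            D_free /= np.linalg.norm(D_free, axis=0)
            for _ in range(iters):
                D = np.hstack([D_task, D_free])
                A = np.linalg.solve(D.T @ D + lam * np.eye(D.shape[1]), D.T @ Y)
                A = np.sign(A) * np.maximum(np.abs(A) - lam, 0.0)  # soft threshold
                k = D_task.shape[1]
                R = Y - D_task @ A[:k]            # residual after the fixed atoms
                D_free = R @ np.linalg.pinv(A[k:])  # least-squares atom update
                D_free /= np.linalg.norm(D_free, axis=0) + 1e-12
            return D_free, A

        rng = np.random.default_rng(0)
        Y = rng.standard_normal((200, 500))       # toy fMRI-like signals
        D_task = rng.standard_normal((200, 3))    # stand-in task regressors
        D_free, A = supervised_dict_learning(Y, D_task)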

  20. Automatic modulation classification principles, algorithms and applications

    CERN Document Server

    Zhu, Zhechen

    2014-01-01

    Automatic Modulation Classification (AMC) has been a key technology in many military, security, and civilian telecommunication applications for decades. In military and security applications, modulation often serves as another level of encryption; in modern civilian applications, multiple modulation types can be employed by a signal transmitter to control the data rate and link reliability. This book offers comprehensive documentation of AMC models, algorithms and implementations for successful modulation recognition. It provides an invaluable theoretical and numerical comparison of AMC algo

  1. A simple semi-automatic approach for land cover classification from multispectral remote sensing imagery.

    Directory of Open Access Journals (Sweden)

    Dong Jiang

    Full Text Available Land cover data represent a fundamental data source for various types of scientific research. The classification of land cover based on satellite data is a challenging task, and an efficient classification method is needed. In this study, an automatic scheme is proposed for the classification of land use from multispectral remote sensing images based on change detection and a semi-supervised classifier. The satellite image can be automatically classified using only the prior land cover map and existing images; human involvement is therefore reduced to a minimum, ensuring the operability of the method. The method was tested in the Qingpu District of Shanghai, China. Using Environment Satellite 1 (HJ-1) images of 2009 with 30 m spatial resolution, the areas were classified into five main types of land cover based on previous land cover data and spectral features. The results agreed well with validation land cover maps, with a Kappa value of 0.79 and statistical area biases of less than 6%. This study proposed a simple semi-automatic approach for land cover classification using prior maps with satisfactory accuracy, integrating the accuracy of visual interpretation with the performance of automatic classification methods. The method can conveniently be used for land cover mapping in areas lacking ground reference information or for identifying rapid variations of land cover (such as rapid urbanization).

  2. Automatic section cutting and forming control of longitudinal-axial-roadheaders

    Energy Technology Data Exchange (ETDEWEB)

    Jie Tian; Yang Yang; Guo-Qiang Chen; Hong-Yao Wang; Jian-Gong Liu; Miao Wu [China University of Mining and Technology (Beijing), Beijing (China). School of Mechanical, Electronic and Information Engineering

    2009-01-15

    To reduce the useless driving workload and the volume of filling, and to improve excavating efficiency, a method for automatic laneway-section forming control of longitudinal-axial roadheaders was presented. Firstly, an automatic mine laneway section cutting process was developed according to the actual conditions of a coal mine. Then, a kinematic analysis of the automatic section forming control was carried out, including analysis of the swing mechanism, the spatial position of the cutting head and a geometric analysis of its mechanical structure. Geometrical relationship formulas were worked out between the spatial position coordinates of the cutter head, the expansion increments of the hydraulic cylinders and the swinging angles of the cantilever. The results show that the control mode of directly measuring the swing angles of the cutting head is simpler and more effective. The proposed method was put into practice in EBZ160 and EBZ200 boom-type roadheaders with very good experimental results, laying a foundation for further study on position detection and direction correction of roadheaders. 11 refs., 5 figs., 1 tab.

  3. Evidence of the Generalization and Construct Representation Inferences for the "GRE"® Revised General Test Sentence Equivalence Item Type. ETS GRE® Board Research Report. ETS GRE®-17-02. ETS Research Report. RR-17-05

    Science.gov (United States)

    Bejar, Isaac I.; Deane, Paul D.; Flor, Michael; Chen, Jing

    2017-01-01

    The report is the first systematic evaluation of the sentence equivalence item type introduced by the "GRE"® revised General Test. We adopt a validity framework to guide our investigation based on Kane's approach to validation whereby a hierarchy of inferences that should be documented to support score meaning and interpretation is…

  4. Inference in models with adaptive learning

    NARCIS (Netherlands)

    Chevillon, G.; Massmann, M.; Mavroeidis, S.

    2010-01-01

    Identification of structural parameters in models with adaptive learning can be weak, causing standard inference procedures to become unreliable. Learning also induces persistent dynamics, and this makes the distribution of estimators and test statistics non-standard. Valid inference can be

  5. Learning Additional Languages as Hierarchical Probabilistic Inference: Insights From First Language Processing.

    Science.gov (United States)

    Pajak, Bozena; Fine, Alex B; Kleinschmidt, Dave F; Jaeger, T Florian

    2016-12-01

    We present a framework of second and additional language (L2/Ln) acquisition motivated by recent work on socio-indexical knowledge in first language (L1) processing. The distribution of linguistic categories covaries with socio-indexical variables (e.g., talker identity, gender, dialects). We summarize evidence that implicit probabilistic knowledge of this covariance is critical to L1 processing, and propose that L2/Ln learning uses the same type of socio-indexical information to probabilistically infer latent hierarchical structure over previously learned and new languages. This structure guides the acquisition of new languages based on their inferred place within that hierarchy, and is itself continuously revised based on new input from any language. This proposal unifies L1 processing and L2/Ln acquisition as probabilistic inference under uncertainty over socio-indexical structure. It also offers a new perspective on crosslinguistic influences during L2/Ln learning, accommodating gradient and continued transfer (both negative and positive) from previously learned to novel languages, and vice versa.

  6. Report on achievements of research and development of an automatic sewing system in fiscal 1988. Production systems by apparel types; 1988 nendo jido hosei system no kenkyu kaihatsu seika hokokusho. Fukushubetsu seisan system

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1989-03-01

    This paper describes the production systems by apparel type, from the fiscal 1988 achievement report on developing an automatic sewing system. Studies were continued on model garments as the experimental objects for the apparel-side studies, and the processing manuals were expanded and improved. In order to expand application of the automatic sewing system technology, design studies proceeded on an experimental plant for five kinds of apparel, namely tops, bottoms, dresses, sportswear and nightwear, to prepare for the comprehensive operation of the plant. The study items included layout, handling, communications, utilities, and buildings. Factory design and layout drawings were drawn up for the trial experimental plants by apparel type. In the FA/FMS study, it was decided to study the factory design taking ladies' blazer coats as the object; shortening the production time was discussed, and the device and software specifications were written up. Changing and switching of cloths and sizes for the five apparel types will be performed during the plant operation studies. Cooperation between subcommittees was tightened for adjusting the experimental plant schedule and technical problems. (NEDO)

  7. Neural Bases of Automaticity

    Science.gov (United States)

    Servant, Mathieu; Cassey, Peter; Woodman, Geoffrey F.; Logan, Gordon D.

    2018-01-01

    Automaticity allows us to perform tasks in a fast, efficient, and effortless manner after sufficient practice. Theories of automaticity propose that across practice processing transitions from being controlled by working memory to being controlled by long-term memory retrieval. Recent event-related potential (ERP) studies have sought to test this…

  8. Computing effects for correspondence types

    OpenAIRE

    Hüttel, Hans

    2010-01-01

    We show that type and effect inference is possible for a type and  effect system for authenticity using non-injective correspondences, opponent  types and a spi-calculus with symmetric encryption. We do this by a general  account of how effects can be computed given knowledge of how and where they  appear in type judgments. 

  9. Computing effects for correspondence types

    DEFF Research Database (Denmark)

    Hüttel, Hans

    2010-01-01

    We show that type and effect inference is possible for a type and effect system for authenticity using non-injective correspondences, opponent types and a spi-calculus with symmetric encryption. We do this by a general account of how effects can be computed given knowledge of how and where they appear in type judgments.

  10. Design of a computerized application for the quality control of film automatic processors

    International Nuclear Information System (INIS)

    Merillas del Castillo, A.; Guibelalde del Castillo, E.; Fernandez Soto, J.M.; Vano carruana, E.

    1997-01-01

    A description is presented of freeware software for the quality control of automatic film processors in diagnostic radiology, developed by the authors (CC-PRO ver 1.0). The application has been developed for use with automatic scanning densitometers of the X-Rite 380 type, and also supports manual data acquisition. By means of standard sensitometric techniques and the developed software, trend analysis of the sensitometric variables and film processor diagnostics can be carried out with an important productivity improvement, easy management and test consistency. (Author) 6 refs.

  11. Inferring network topology from complex dynamics

    International Nuclear Information System (INIS)

    Shandilya, Srinivas Gorur; Timme, Marc

    2011-01-01

    Inferring the network topology from dynamical observations is a fundamental problem pervading research on complex systems. Here, we present a simple, direct method for inferring the structural connection topology of a network, given an observation of one collective dynamical trajectory. The general theoretical framework is applicable to arbitrary network dynamical systems described by ordinary differential equations. No interference (external driving) is required and the type of dynamics is hardly restricted in any way. In particular, the observed dynamics may be arbitrarily complex; stationary, invariant or transient; synchronous or asynchronous and chaotic or periodic. Presupposing a knowledge of the functional form of the dynamical units and of the coupling functions between them, we present an analytical solution to the inverse problem of finding the network topology from observing a time series of state variables only. Robust reconstruction is achieved in any sufficiently long generic observation of the system. We extend our method to simultaneously reconstructing both the entire network topology and all parameters appearing linear in the system's equations of motion. Reconstruction of network topology and system parameters is viable even in the presence of external noise that distorts the original dynamics substantially. The method provides a conceptually new step towards reconstructing a variety of real-world networks, including gene and protein interaction networks and neuronal circuits.
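
    In the special case of additive pairwise coupling with known unit dynamics f and coupling function g, the reconstruction reduces to one linear least-squares problem per node, as in the following sketch; the network size, time step and choice of f and g are illustrative assumptions, and the paper's framework is considerably more general.

        import numpy as np

        def infer_topology(X, dt, f, g):
            """Recover the coupling matrix A from one trajectory of
                dx_i/dt = f(x_i) + sum_j A[i, j] * g(x_j).
            X has shape (timesteps, nodes)."""
            dX = np.gradient(X, dt, axis=0)     # estimate time derivatives
            G = g(X)                             # regressors, one column per node
            rhs = dX - f(X)                      # what the coupling must explain
            # one least-squares problem per node i: G @ A[i, :] ~ rhs[:, i]
            A_T, *_ = np.linalg.lstsq(G, rhs, rcond=None)
            return A_T.T

        # Tiny demo on a random linear-coupling network (for illustration only)
        rng = np.random.default_rng(1)
        A_true = rng.normal(0, 0.1, (5, 5))
        x, dt, traj = rng.normal(size=5), 0.01, []
        for _ in range(2000):
            traj.append(x.copy())
            x = x + dt * (-x + A_true @ np.tanh(x))   # Euler integration
        A_hat = infer_topology(np.array(traj), dt, lambda X: -X, np.tanh)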

  12. Brand and automaticity

    OpenAIRE

    Liu, J.

    2008-01-01

    A presumption of most consumer research is that consumers endeavor to maximize the utility of their choices and are in complete control of their purchasing and consumption behavior. However, everyday life experience suggests that many of our choices are not all that reasoned or conscious. Indeed, automaticity, one facet of behavior, is indispensable to complete the portrait of consumers. Despite its importance, little attention is paid to how the automatic side of behavior can be captured and...

  13. User Interaction in Semi-Automatic Segmentation of Organs at Risk: a Case Study in Radiotherapy.

    Science.gov (United States)

    Ramkumar, Anjana; Dolz, Jose; Kirisli, Hortense A; Adebahr, Sonja; Schimek-Jasch, Tanja; Nestle, Ursula; Massoptier, Laurent; Varga, Edit; Stappers, Pieter Jan; Niessen, Wiro J; Song, Yu

    2016-04-01

    Accurate segmentation of organs at risk is an important step in radiotherapy planning. Manual segmentation being a tedious procedure and prone to inter- and intra-observer variability, there is a growing interest in automated segmentation methods. However, automatic methods frequently fail to provide satisfactory result, and post-processing corrections are often needed. Semi-automatic segmentation methods are designed to overcome these problems by combining physicians' expertise and computers' potential. This study evaluates two semi-automatic segmentation methods with different types of user interactions, named the "strokes" and the "contour", to provide insights into the role and impact of human-computer interaction. Two physicians participated in the experiment. In total, 42 case studies were carried out on five different types of organs at risk. For each case study, both the human-computer interaction process and quality of the segmentation results were measured subjectively and objectively. Furthermore, different measures of the process and the results were correlated. A total of 36 quantifiable and ten non-quantifiable correlations were identified for each type of interaction. Among those pairs of measures, 20 of the contour method and 22 of the strokes method were strongly or moderately correlated, either directly or inversely. Based on those correlated measures, it is concluded that: (1) in the design of semi-automatic segmentation methods, user interactions need to be less cognitively challenging; (2) based on the observed workflows and preferences of physicians, there is a need for flexibility in the interface design; (3) the correlated measures provide insights that can be used in improving user interaction design.

  14. Automatic Program Development

    DEFF Research Database (Denmark)

    Automatic Program Development is a tribute to Robert Paige (1947-1999), our accomplished and respected colleague, and moreover our good friend, whose untimely passing was a loss to our academic and research community. We have collected the revised, updated versions of the papers published in his honor in the Higher-Order and Symbolic Computation Journal in the years 2003 and 2005. Among them there are two papers by Bob: (i) a retrospective view of his research lines, and (ii) a proposal for future studies in the area of the automatic program derivation. The book also includes some papers by members of the IFIP Working Group 2.1 of which Bob was an active member. All papers are related to some of the research interests of Bob and, in particular, to the transformational development of programs and their algorithmic derivation from formal specifications. Automatic Program Development offers...

  15. Fiducial inference - A Neyman-Pearson interpretation

    NARCIS (Netherlands)

    Salome, D; VonderLinden, W; Dose; Fischer, R; Preuss, R

    1999-01-01

    Fisher's fiducial argument is a tool for deriving inferences in the form of a probability distribution on the parameter space, not based on Bayes's Theorem. Lindley established that in exceptional situations fiducial inferences coincide with posterior distributions; in the other situations fiducial

  16. Uncertainty in prediction and in inference

    NARCIS (Netherlands)

    Hilgevoord, J.; Uffink, J.

    1991-01-01

    The concepts of uncertainty in prediction and inference are introduced and illustrated using the diffraction of light as an example. The close relationship between the concepts of uncertainty in inference and resolving power is noted. A general quantitative measure of uncertainty in

  17. Polynomial Chaos Acceleration for the Bayesian Inference of Random Fields with Gaussian Priors and Uncertain Covariance Hyper-Parameters

    KAUST Repository

    Le Maitre, Olivier

    2015-01-07

    We address model dimensionality reduction in the Bayesian inference of Gaussian fields, considering a prior covariance function with unknown hyper-parameters. The Karhunen-Loeve (KL) expansion of a prior Gaussian process is traditionally derived assuming a fixed covariance function with pre-assigned hyper-parameter values. Thus, the mode strengths of the Karhunen-Loeve expansion inferred using available observations, as well as the resulting inferred process, depend on the pre-assigned values for the covariance hyper-parameters. Here, we seek to infer the process and the covariance hyper-parameters in a single Bayesian inference. To this end, the uncertainty in the hyper-parameters is treated by means of a coordinate transformation, leading to a KL-type expansion on a fixed reference basis of spatial modes, but with random coordinates conditioned on the hyper-parameters. A Polynomial Chaos (PC) expansion of the model prediction is also introduced to accelerate the Bayesian inference and the sampling of the posterior distribution with an MCMC method. The PC expansion of the model prediction also relies on a coordinate transformation, enabling us to avoid expanding the dependence of the prediction with respect to the covariance hyper-parameters. We demonstrate the efficiency of the proposed method on a transient diffusion equation by inferring spatially-varying log-diffusivity fields from noisy data.
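
    For reference, the conventional starting point the abstract contrasts with — a discrete KL expansion computed for fixed covariance hyper-parameters — can be sketched as an eigen-decomposition of the covariance matrix on a grid. The squared-exponential covariance and the hyper-parameter values below are assumptions for illustration.

        import numpy as np

        def kl_modes(x, corr_len, var=1.0, n_modes=10):
            """Discrete Karhunen-Loeve modes of a squared-exponential covariance
            on grid points x, for *fixed* hyper-parameters (var, corr_len)."""
            C = var * np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / corr_len ** 2)
            vals, vecs = np.linalg.eigh(C)
            idx = np.argsort(vals)[::-1][:n_modes]   # keep dominant modes
            return vals[idx], vecs[:, idx]

        x = np.linspace(0.0, 1.0, 200)
        vals, vecs = kl_modes(x, corr_len=0.2)
        # Prior sample: sum_k sqrt(lambda_k) * xi_k * phi_k with xi_k ~ N(0, 1)
        xi = np.random.default_rng(0).standard_normal(len(vals))
        sample = vecs @ (np.sqrt(vals) * xi)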

  18. Einstein SSS+MPC observations of Seyfert type galaxies

    Science.gov (United States)

    Holt, S. S.; Turner, T. J.; Mushotzky, R. F.; Weaver, K.

    1989-01-01

    The X-ray spectra of 27 Seyfert galaxies measured with the Solid State Spectrometer (SSS) onboard the Einstein Observatory are investigated. This new investigation features the utilization of simultaneous data from the Monitor Proportional Counter (MPC) and automatic correction for systematic effects in the SSS. The new results are that the best-fit single power law indices agree with those previously reported, but that soft excesses are inferred for at least 20 percent of the measured spectra. The soft excesses are consistent with either an approximately 0.25 keV black body or Fe-L line emission.

  19. Development of advanced automatic operation system for nuclear ship. 1. Perfect automatic normal operation

    International Nuclear Information System (INIS)

    Nakazawa, Toshio; Yabuuti, Noriaki; Takahashi, Hiroki; Shimazaki, Junya

    1999-02-01

    The development of operation support systems, such as automatic operation systems and anomaly diagnosis systems for the nuclear reactor, is very important in a practical nuclear ship because of the limited number of operators and the severe conditions, under which receiving support from others in case of an accident is very difficult. The goal of the development of the operation support systems is to realize a fully automatic control system covering the whole sequence of normal operation, from reactor start-up to shutdown. The automatic control system for normal operation has been developed based on the operating experience of the first Japanese nuclear ship 'Mutsu'. The automation technique was verified against 'Mutsu' plant data from manual operation. Fully automatic control of the start-up and shutdown operations was achieved by setting the desired operation values and the limiting values of parameter fluctuation, and by making operation programs for the principal equipment, such as the main coolant pump and the heaters. This report presents the automatic operation system developed for reactor start-up and shutdown, and the verification of the system using the Nuclear Ship Engineering Simulator System. (author)

  20. 14 CFR 23.1329 - Automatic pilot system.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Automatic pilot system. 23.1329 Section 23...: Installation § 23.1329 Automatic pilot system. If an automatic pilot system is installed, it must meet the following: (a) Each system must be designed so that the automatic pilot can— (1) Be quickly and positively...

  1. 46 CFR 52.01-10 - Automatic controls.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 2 2010-10-01 2010-10-01 false Automatic controls. 52.01-10 Section 52.01-10 Shipping... Requirements § 52.01-10 Automatic controls. (a) Each main boiler must meet the special requirements for automatic safety controls in § 62.35-20(a)(1) of this chapter. (b) Each automatically controlled auxiliary...

  2. Polynomial Chaos Surrogates for Bayesian Inference

    KAUST Repository

    Le Maitre, Olivier

    2016-01-06

    The Bayesian inference is a popular probabilistic method to solve inverse problems, such as the identification of field parameters in a PDE model. The inference relies on the Bayes rule to update the prior density of the sought field from observations and derive its posterior distribution. In most cases the posterior distribution has no explicit form and has to be sampled, for instance using a Markov-Chain Monte Carlo method. In practice the prior field parameter is decomposed and truncated (e.g. by means of Karhunen-Loève decomposition) to recast the inference problem into the inference of a finite number of coordinates. Although proved effective in many situations, the Bayesian inference as sketched above faces several difficulties requiring improvements. First, sampling the posterior can be an extremely costly task, as it requires multiple resolutions of the PDE model for different values of the field parameter. Second, when the observations are not very informative, the inferred parameter field can depend strongly on its prior, which can be somewhat arbitrary. These issues have motivated the introduction of reduced models or surrogates for the (approximate) determination of the parametrized PDE solution and of hyper-parameters in the description of the prior field. Our contribution focuses on recent developments in these two directions: the acceleration of posterior sampling by means of Polynomial Chaos expansions, and the efficient treatment of parametrized covariance functions for the prior field. We also discuss the possibility of making this approach adaptive to further improve its efficiency.

  3. Interactive Instruction in Bayesian Inference

    DEFF Research Database (Denmark)

    Khan, Azam; Breslav, Simon; Hornbæk, Kasper

    2018-01-01

    An instructional approach is presented to improve human performance in solving Bayesian inference problems. Starting from the original text of the classic Mammography Problem, the textual expression is modified and visualizations are added according to Mayer's principles of instruction. These principles concern coherence, personalization, signaling, segmenting, multimedia, spatial contiguity, and pretraining. Principles of self-explanation and interactivity are also applied. Four experiments on the Mammography Problem showed that these principles help participants answer the questions, suggesting that an instructional approach to improving human performance in Bayesian inference is a promising direction.
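
    For readers unfamiliar with the Mammography Problem, the computation participants must perform is a single application of Bayes' rule. The figures below are those most commonly quoted for the problem (1% prevalence, 80% sensitivity, 9.6% false-positive rate) and may differ from the exact numbers used in the study.

```python
# Bayes' rule on the commonly quoted Mammography Problem numbers.
p_cancer = 0.01
p_pos_given_cancer = 0.80
p_pos_given_healthy = 0.096

p_pos = (p_pos_given_cancer * p_cancer
         + p_pos_given_healthy * (1 - p_cancer))
p_cancer_given_pos = p_pos_given_cancer * p_cancer / p_pos
print(f"P(cancer | positive) = {p_cancer_given_pos:.3f}")  # ~0.078
```

    The counterintuitively low posterior (about 8%) is exactly what makes the problem a standard benchmark for instruction in Bayesian reasoning.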

  4. Inferring Phylogenetic Networks Using PhyloNet.

    Science.gov (United States)

    Wen, Dingqiao; Yu, Yun; Zhu, Jiafan; Nakhleh, Luay

    2018-07-01

    PhyloNet was released in 2008 as a software package for representing and analyzing phylogenetic networks. At the time of its release, the main functionalities in PhyloNet consisted of measures for comparing network topologies and a single heuristic for reconciling gene trees with a species tree. Since then, PhyloNet has grown significantly. The software package now includes a wide array of methods for inferring phylogenetic networks from data sets of unlinked loci while accounting for both reticulation (e.g., hybridization) and incomplete lineage sorting. In particular, PhyloNet now allows for maximum parsimony, maximum likelihood, and Bayesian inference of phylogenetic networks from gene tree estimates. Furthermore, Bayesian inference directly from sequence data (sequence alignments or biallelic markers) is implemented. Maximum parsimony is based on an extension of the "minimizing deep coalescences" criterion to phylogenetic networks, whereas maximum likelihood and Bayesian inference are based on the multispecies network coalescent. All methods allow for multiple individuals per species. As computing the likelihood of a phylogenetic network is computationally hard, PhyloNet allows for evaluation and inference of networks using a pseudolikelihood measure. PhyloNet summarizes the results of the various analyses and generates phylogenetic networks in the extended Newick format that is readily viewable by existing visualization software.

  5. Automatic welding and cladding in heavy fabrication

    International Nuclear Information System (INIS)

    Altamer, A. de

    1980-01-01

    A description is given of the automatic welding processes used by an Italian fabricator of pressure vessels for petrochemical and nuclear plant. The automatic submerged arc welding, submerged arc strip cladding, pulsed TIG, hot wire TIG and MIG welding processes have proved satisfactory in terms of process reliability, metal deposition rate, and cost effectiveness for low alloy and carbon steels. An example shows sequences required during automatic butt welding, including heat treatments. Factors which govern satisfactory automatic welding include automatic anti-drift rotator device, electrode guidance and bead programming system, the capability of single and dual head operation, flux recovery and slag removal systems, operator environment and controls, maintaining continuity of welding and automatic reverse side grinding. Automatic welding is used for: joining vessel sections; joining tubes to tubeplate; cladding of vessel rings and tubes, dished ends and extruded nozzles; nozzle to shell and butt welds, including narrow gap welding. (author)

  6. Active inference and learning.

    Science.gov (United States)

    Friston, Karl; FitzGerald, Thomas; Rigoli, Francesco; Schwartenbeck, Philipp; O'Doherty, John; Pezzulo, Giovanni

    2016-09-01

    This paper offers an active inference account of choice behaviour and learning. It focuses on the distinction between goal-directed and habitual behaviour and how they contextualise each other. We show that habits emerge naturally (and autodidactically) from sequential policy optimisation when agents are equipped with state-action policies. In active inference, behaviour has explorative (epistemic) and exploitative (pragmatic) aspects that are sensitive to ambiguity and risk respectively, where epistemic (ambiguity-resolving) behaviour enables pragmatic (reward-seeking) behaviour and the subsequent emergence of habits. Although goal-directed and habitual policies are usually associated with model-based and model-free schemes, we find the more important distinction is between belief-free and belief-based schemes. The underlying (variational) belief updating provides a comprehensive (if metaphorical) process theory for several phenomena, including the transfer of dopamine responses, reversal learning, habit formation and devaluation. Finally, we show that active inference reduces to a classical (Bellman) scheme, in the absence of ambiguity. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  7. Honesty by Typing

    OpenAIRE

    Bartoletti, Massimo; Scalas, Alceste; Tuosto, Emilio; Zunino, Roberto

    2013-01-01

    We propose a type system for a calculus of contracting processes. Processes can establish sessions by stipulating contracts, and then can interact either by keeping the promises made, or not. Type safety guarantees that a typeable process is honest - that is, it abides by the contracts it has stipulated in all possible contexts, even in presence of dishonest adversaries. Type inference is decidable, and it allows one to safely approximate the honesty of processes using either synchronous or async...

  8. Active Inference, homeostatic regulation and adaptive behavioural control.

    Science.gov (United States)

    Pezzulo, Giovanni; Rigoli, Francesco; Friston, Karl

    2015-11-01

    We review a theory of homeostatic regulation and adaptive behavioural control within the Active Inference framework. Our aim is to connect two research streams that are usually considered independently; namely, Active Inference and associative learning theories of animal behaviour. The former uses a probabilistic (Bayesian) formulation of perception and action, while the latter calls on multiple (Pavlovian, habitual, goal-directed) processes for homeostatic and behavioural control. We offer a synthesis of these classical processes and cast them as successive hierarchical contextualisations of sensorimotor constructs, using the generative models that underpin Active Inference. This dissolves any apparent mechanistic distinction between the optimization processes that mediate classical control or learning. Furthermore, we generalize the scope of Active Inference by emphasizing interoceptive inference and homeostatic regulation. The ensuing homeostatic (or allostatic) perspective provides an intuitive explanation for how priors act as drives or goals to enslave action, and emphasises the embodied nature of inference. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  9. Generative Inferences Based on Learned Relations

    Science.gov (United States)

    Chen, Dawn; Lu, Hongjing; Holyoak, Keith J.

    2017-01-01

    A key property of relational representations is their "generativity": From partial descriptions of relations between entities, additional inferences can be drawn about other entities. A major theoretical challenge is to demonstrate how the capacity to make generative inferences could arise as a result of learning relations from…

  10. Design of Control System for Kiwifruit Automatic Grading Machine

    Directory of Open Access Journals (Sweden)

    Xingjian Zuo

    2013-05-01

    The kiwifruit automatic grading machine is an important machine for postharvest processing of kiwifruit, and its control system is what makes the machine intelligent. The control system designed in this paper comprises a host computer and a slave microcontroller. The host computer provides a visual grading interface for the machine using LabVIEW software; the slave microcontroller adopts an STC89C52 microcontroller as its core, and C language is used to write programs controlling a position sensor module, push-pull electromagnets, motor driving modules and a power supply, which govern the operation of the machine as well as the raising and lowering of the grading baffle plates. The desired control performance was confirmed through testing, and intelligent operation of the machine was realized.

  11. Automatic differentiation of functions

    International Nuclear Information System (INIS)

    Douglas, S.R.

    1990-06-01

    Automatic differentiation is a method of computing derivatives of functions to any order in any number of variables. The functions must be expressible as combinations of elementary functions. When evaluated at specific numerical points, the derivatives have no truncation error and are automatically found. The method is illustrated by simple examples. Source code in FORTRAN is provided
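
    The report's FORTRAN source is not reproduced here; instead, a minimal Python sketch illustrates the idea, assuming the forward-mode (dual-number) variant of automatic differentiation (the abstract does not specify which variant is used). Each value carries its derivative, so the derivative at a point is produced automatically and without truncation error.

```python
# Forward-mode automatic differentiation with dual numbers.
import math

class Dual:
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # product rule, applied automatically at every multiplication
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def sin(x):
    # chain rule for an elementary function
    return Dual(math.sin(x.val), math.cos(x.val) * x.der)

# f(x) = x*sin(x) + 3x, so f'(x) = sin(x) + x*cos(x) + 3
x = Dual(2.0, 1.0)            # seed derivative dx/dx = 1
y = x * sin(x) + 3 * x
print(y.val, y.der)           # value and exact derivative at x = 2
```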

  12. NetBenchmark: a bioconductor package for reproducible benchmarks of gene regulatory network inference.

    Science.gov (United States)

    Bellot, Pau; Olsen, Catharina; Salembier, Philippe; Oliveras-Vergés, Albert; Meyer, Patrick E

    2015-09-29

    In the last decade, a great number of methods for reconstructing gene regulatory networks from expression data have been proposed. However, very few tools and datasets allow those methods to be evaluated accurately and reproducibly. Hence, we propose here a new tool, able to perform a systematic, yet fully reproducible, evaluation of transcriptional network inference methods. Our open-source and freely available Bioconductor package aggregates a large set of tools to assess the robustness of network inference algorithms against different simulators, topologies, sample sizes and noise intensities. The benchmarking framework, which uses various datasets, highlights the specialization of some methods toward particular network types and data. As a result, it is possible to identify the techniques with the broadest overall performance.

  13. Automatic detection of children's engagement using non-verbal features and ordinal learning

    NARCIS (Netherlands)

    Kim, Jaebok; Truong, Khiet Phuong; Evers, Vanessa

    In collaborative play, young children can exhibit different types of engagement. Some children are engaged with other children in the play activity while others are just looking. In this study, we investigated methods to automatically detect the children's levels of engagement in play settings using

  14. Solar Powered Automatic Shrimp Feeding System

    Directory of Open Access Journals (Sweden)

    Dindo T. Ani

    2015-12-01

    Automation has brought many advances to existing technologies; one development with particular promise is the solar powered automatic shrimp feeding system. Solar power, a renewable energy source, can be an alternative solution to the energy crisis, and using it in an automatic system also reduces the manpower required. The researchers believe an automatic shrimp feeding system may help solve problems with manual feeding operations. The project study aimed to design and develop a solar powered automatic shrimp feeding system. It specifically sought to prepare the design specifications of the project, to determine the methods of fabrication and assembly, and to test the response time of the automatic shrimp feeding system. The researchers designed and developed an automatic system which utilizes a 10 hour timer, set to intervals preferred by the user, that runs as a continuous process. The magnetic contactor acts as a switch connected to the 10 hour timer, controlling the activation or termination of electrical loads; it is powered by a solar panel outputting electrical power and a rechargeable battery in electrical communication with the solar panel for storing the power. Through a series of tests, the components of the modified system were proven functional and operated within the desired output. It was recommended that the timer be tested to avoid malfunction and achieve a fully automatic system, and that the system be improved to handle changes in the scope of the project.

  15. Verifying Process Algebra Proofs in Type Theory

    NARCIS (Netherlands)

    Sellink, M.P.A.

    In this paper we study automatic verification of proofs in process algebra. Formulas of process algebra are represented by types in typed λ-calculus. Inhabitants (terms) of these types represent proofs. The specific typed λ-calculus we use is the Calculus of Inductive Constructions as implemented

  16. Automatic feature-based grouping during multiple object tracking.

    Science.gov (United States)

    Erlikhman, Gennady; Keane, Brian P; Mettler, Everett; Horowitz, Todd S; Kellman, Philip J

    2013-12-01

    Contour interpolation automatically binds targets with distractors to impair multiple object tracking (Keane, Mettler, Tsoi, & Kellman, 2011). Is interpolation special in this regard or can other features produce the same effect? To address this question, we examined the influence of eight features on tracking: color, contrast polarity, orientation, size, shape, depth, interpolation, and a combination (shape, color, size). In each case, subjects tracked 4 of 8 objects that began as undifferentiated shapes, changed features as motion began (to enable grouping), and returned to their undifferentiated states before halting. We found that intertarget grouping improved performance for all feature types except orientation and interpolation (Experiment 1 and Experiment 2). Most importantly, target-distractor grouping impaired performance for color, size, shape, combination, and interpolation. The impairments were, at times, large (>15% decrement in accuracy) and occurred relative to a homogeneous condition in which all objects had the same features at each moment of a trial (Experiment 2), and relative to a "diversity" condition in which targets and distractors had different features at each moment (Experiment 3). We conclude that feature-based grouping occurs for a variety of features besides interpolation, even when irrelevant to task instructions and contrary to the task demands, suggesting that interpolation is not unique in promoting automatic grouping in tracking tasks. Our results also imply that various kinds of features are encoded automatically and in parallel during tracking.

  17. Polite Interactions with Robots

    DEFF Research Database (Denmark)

    Benotti, Luciana; Blackburn, Patrick Rowan

    2016-01-01

    We sketch an inference architecture that permits linguistic aspects of politeness to be interpreted; we do so by applying the ideas of politeness theory to the SCARE corpus of task-oriented dialogues, a type of dialogue of particular relevance to robotics. The fragment of the SCARE corpus we analyzed contains 77 uses of politeness strategies: our inference architecture covers 58 of them using classical AI planning techniques; the remainder require other forms of means-ends inference. So by the end of the paper we will have discussed in some detail how to interpret automatically different forms...

  18. PENERAPAN FUZZY INFERENCE SYSTEM TAKAGI-SUGENO-KANG PADA SISTEM PAKAR DIAGNOSA PENYAKIT GIGI

    Directory of Open Access Journals (Sweden)

    Lutfi Salisa Setiawati

    2016-04-01

    Generally, an expert system only shows the type of disease after the user selects symptoms. In this study, a disease severity level is added. The method applied to calculate the severity is the Fuzzy Inference System of Takagi-Sugeno-Kang (the Sugeno method). This study aims to determine whether the Takagi-Sugeno-Kang Fuzzy Inference System can work in an expert system for diagnosing dental diseases. The severity levels obtained in this research are: reversible pulpitis 38.53%, irreversible pulpitis 59.64%, periodontitis 69.62%, acute periodontitis 51.43%, gingivitis 45.5%, acute pericoronitis 53.93%, sub-acute pericoronitis 52.14%, chronic pericoronitis 46.05%, early-stage dental caries 37.61%, dental caries progressing toward an advanced stage 43.89%, advanced-stage dental caries 51.76%, gangrene of the pulp 42.5%, pulp polyps 56.43%, and periostitis 58.55%. The conclusion of the study is that the Takagi-Sugeno-Kang Fuzzy Inference System method can be applied to a dental expert system. Keywords: Teeth, Expert System, Dental Expert System, Fuzzy Logic, Fuzzy Inference System, Takagi-Sugeno-Kang, Fuzzy Sugeno
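
    As a hedged sketch of how a Takagi-Sugeno-Kang system yields a crisp severity value: rule antecedents are fuzzy memberships, rule consequents are linear functions of the inputs, and the output is the firing-strength-weighted average of the consequents. The symptom variables, membership functions, and rule base below are invented for illustration and are not the rules of the cited system.

```python
# Minimal Takagi-Sugeno-Kang (Sugeno) inference with two illustrative rules.
def tri(x, a, b, c):
    """Triangular membership function over [a, c] with peak at b."""
    return max(min((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def tsk_severity(pain, swelling):
    # Rule 1: IF pain is high AND swelling is high
    #         THEN z = 0.6*pain + 0.4*swelling
    w1 = min(tri(pain, 5, 10, 10), tri(swelling, 5, 10, 10))
    z1 = 0.6 * pain + 0.4 * swelling
    # Rule 2: IF pain is low OR swelling is low
    #         THEN z = 0.3*pain + 0.2*swelling
    w2 = max(tri(pain, 0, 0, 5), tri(swelling, 0, 0, 5))
    z2 = 0.3 * pain + 0.2 * swelling
    # Sugeno defuzzification: weighted average of the crisp rule outputs,
    # scaled here to a percentage (inputs assumed on a 0-10 scale)
    return (w1 * z1 + w2 * z2) / (w1 + w2 + 1e-9) * 10

print(f"severity = {tsk_severity(pain=7, swelling=6):.1f}%")
```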

  19. Parametric statistical inference basic theory and modern approaches

    CERN Document Server

    Zacks, Shelemyahu; Tsokos, C P

    1981-01-01

    Parametric Statistical Inference: Basic Theory and Modern Approaches presents the developments and modern trends in statistical inference to students who do not have advanced mathematical and statistical preparation. The topics discussed in the book are basic and common to many fields of statistical inference and thus serve as a jumping board for in-depth study. The book is organized into eight chapters. Chapter 1 provides an overview of how the theory of statistical inference is presented in subsequent chapters. Chapter 2 briefly discusses statistical distributions and their properties. Chapt

  20. Automatic analysis of macerals and reflectance; Analisis Automatico de Macerales y Reflectancia

    Energy Technology Data Exchange (ETDEWEB)

    Catalina, J.C.; Alarcon, D.; Gonzalez Prado, J.

    1998-12-01

    A new system has been developed to perform automatically macerals and reflectance analysis of single-seam bituminous coals, improving the interlaboratory accuracy of these types of analyses. The system follows the same steps as the manual method, requiring a human operator for preparation of coal samples and system startup; then, sample scanning, microscope focusing and field centre analysis are fully automatic. The main and most innovative idea of this approach is to coordinate an expert system with an image processing system, using both reflectance and morphological information. In this way, the system tries to reproduce the analysis procedure followed by a human expert in petrography. (Author)

  1. Inferring the emotions of friends versus strangers: the role of culture and self-construal.

    Science.gov (United States)

    Ma-Kellams, Christine; Blascovich, Jim

    2012-07-01

    Three studies examined cross-cultural differences in empathic accuracy (the ability to correctly infer another's emotional experience) within the context of different relationships. East-West cultural differences in self-construal were hypothesized to differentiate levels of empathic accuracy across relationship types. In contrast to the independent self prevalent among members of Western cultures, members of Eastern cultures generally view the self as interdependent with those with whom they have a relationship. Easterners, relative to Westerners, are more concerned with the thoughts or feelings of close others and less concerned with the thoughts or feelings of those with whom they have no relational link (i.e., strangers). Across three studies, the authors found that East Asians, compared with European Americans, made more accurate inferences regarding the emotions of close others (i.e., friends), but less accurate inferences regarding the emotions of strangers. Furthermore, individual differences in interdependent self-construal among East Asians predicted the degree of empathic accuracy.

  2. Qualitative reasoning for biological network inference from systematic perturbation experiments.

    Science.gov (United States)

    Badaloni, Silvana; Di Camillo, Barbara; Sambo, Francesco

    2012-01-01

    The systematic perturbation of the components of a biological system has been proven among the most informative experimental setups for the identification of causal relations between the components. In this paper, we present Systematic Perturbation-Qualitative Reasoning (SPQR), a novel Qualitative Reasoning approach to automate the interpretation of the results of systematic perturbation experiments. Our method is based on a qualitative abstraction of the experimental data: for each perturbation experiment, measured values of the observed variables are modeled as lower, equal or higher than the measurements in the wild type condition, when no perturbation is applied. The algorithm exploits a set of IF-THEN rules to infer causal relations between the variables, analyzing the patterns of propagation of the perturbation signals through the biological network, and is specifically designed to minimize the rate of false positives among the inferred relations. Tested on both simulated and real perturbation data, SPQR indeed exhibits a significantly higher precision than the state of the art.
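
    The following sketch illustrates only the qualitative-abstraction step described above: each measurement under a perturbation is recoded as lower (-1), equal (0) or higher (+1) relative to wild type, and a deliberately simplified IF-THEN rule proposes candidate causal edges. The actual SPQR rule set is more elaborate and specifically designed to limit false positives.

```python
import numpy as np

def qualitative(perturbed, wild_type, tol=0.1):
    """Return -1 (lower), 0 (equal) or +1 (higher) vs. wild type."""
    delta = (perturbed - wild_type) / (np.abs(wild_type) + 1e-9)
    return np.where(delta > tol, 1, np.where(delta < -tol, -1, 0))

# rows: knock-down experiment on gene i; columns: observed genes (toy data)
wild = np.array([1.0, 1.0, 1.0])
data = np.array([[0.2, 0.3, 1.0],    # perturb gene 0
                 [1.0, 0.2, 0.9],    # perturb gene 1
                 [1.0, 1.1, 0.2]])   # perturb gene 2

Q = np.vstack([qualitative(row, wild) for row in data])

# simplified rule: IF perturbing gene i changes gene j (j != i),
# THEN propose a candidate edge i -> j
edges = [(i, j) for i in range(3) for j in range(3)
         if i != j and Q[i, j] != 0]
print(edges)   # [(0, 1)]: knocking down gene 0 also lowered gene 1
```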

  3. Variational inference & deep learning: A new synthesis

    OpenAIRE

    Kingma, D.P.

    2017-01-01

    In this thesis, Variational Inference and Deep Learning: A New Synthesis, we propose novel solutions to the problems of variational (Bayesian) inference, generative modeling, representation learning, semi-supervised learning, and stochastic optimization.

  4. Variational inference & deep learning : A new synthesis

    NARCIS (Netherlands)

    Kingma, D.P.

    2017-01-01

    In this thesis, Variational Inference and Deep Learning: A New Synthesis, we propose novel solutions to the problems of variational (Bayesian) inference, generative modeling, representation learning, semi-supervised learning, and stochastic optimization.

  5. Automatic text summarization

    CERN Document Server

    Torres Moreno, Juan Manuel

    2014-01-01

    This new textbook examines the motivations and the different algorithms for automatic document summarization (ADS), and provides a recent survey of the state of the art. The book shows the main problems of ADS, the difficulties involved, and the solutions provided by the community. It presents recent advances in ADS, as well as current applications and trends. The approaches are statistical, linguistic and symbolic. Several examples are included in order to clarify the theoretical concepts. The books currently available in the area of Automatic Document Summarization are not recent. Powerful algorithms have been developed

  6. Automatic Ultrasound Scanning

    DEFF Research Database (Denmark)

    Moshavegh, Ramin

    on the user adjustments on the scanner interface to optimize the scan settings. This explains the huge interest in the subject of this PhD project entitled “AUTOMATIC ULTRASOUND SCANNING”. The key goals of the project have been to develop automated techniques to minimize the unnecessary settings on the scanners, and to improve the computer-aided diagnosis (CAD) in ultrasound by introducing new quantitative measures. Thus, four major issues concerning automation of the medical ultrasound are addressed in this PhD project. They touch upon gain adjustments in ultrasound, automatic synthetic aperture image

  7. Ensemble stacking mitigates biases in inference of synaptic connectivity

    Directory of Open Access Journals (Sweden)

    Brendan Chambers

    2018-03-01

    A promising alternative to directly measuring the anatomical connections in a neuronal population is inferring the connections from the activity. We employ simulated spiking neuronal networks to compare and contrast commonly used inference methods that identify likely excitatory synaptic connections using statistical regularities in spike timing. We find that simple adjustments to standard algorithms improve inference accuracy: A signing procedure improves the power of unsigned mutual-information-based approaches and a correction that accounts for differences in mean and variance of background timing relationships, such as those expected to be induced by heterogeneous firing rates, increases the sensitivity of frequency-based methods. We also find that different inference methods reveal distinct subsets of the synaptic network and each method exhibits different biases in the accurate detection of reciprocity and local clustering. To correct for errors and biases specific to single inference algorithms, we combine methods into an ensemble. Ensemble predictions, generated as a linear combination of multiple inference algorithms, are more sensitive than the best individual measures alone, and are more faithful to ground-truth statistics of connectivity, mitigating biases specific to single inference methods. These weightings generalize across simulated datasets, emphasizing the potential for the broad utility of ensemble-based approaches. Mapping the routing of spikes through local circuitry is crucial for understanding neocortical computation. Under appropriate experimental conditions, these maps can be used to infer likely patterns of synaptic recruitment, linking activity to underlying anatomical connections. Such inferences help to reveal the synaptic implementation of population dynamics and computation. We compare a number of standard functional measures to infer underlying connectivity. We find that regularization impacts measures
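
    A minimal sketch of the stacking idea, under the assumption that each base inference method reduces to a per-pair score: the ensemble is a linear combination of those scores whose weights are learned against ground-truth connectivity from simulations. The two base scores below are synthetic stand-ins for measures such as a cross-correlation peak and mutual information.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n_pairs = 500

# toy per-pair scores from two base inference methods, plus simulated
# ground-truth labels (roughly 10% of pairs are true connections)
truth = rng.uniform(size=n_pairs) < 0.1
score_xcorr = truth * 1.0 + rng.normal(0, 0.7, n_pairs)
score_mi = truth * 0.8 + rng.normal(0, 0.9, n_pairs)
X = np.column_stack([score_xcorr, score_mi])

# the stacked ensemble is a learned linear combination of the base scores
stack = LogisticRegression().fit(X, truth)
ensemble_score = stack.predict_proba(X)[:, 1]
print("learned weights:", stack.coef_)
```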

  8. Constraint Satisfaction Inference : Non-probabilistic Global Inference for Sequence Labelling

    NARCIS (Netherlands)

    Canisius, S.V.M.; van den Bosch, A.; Daelemans, W.; Basili, R.; Moschitti, A.

    2006-01-01

    We present a new method for performing sequence labelling based on the idea of using a machine-learning classifier to generate several possible output sequences, and then applying an inference procedure to select the best sequence among those. Most sequence labelling methods following a similar

  9. Reasoning about Informal Statistical Inference: One Statistician's View

    Science.gov (United States)

    Rossman, Allan J.

    2008-01-01

    This paper identifies key concepts and issues associated with the reasoning of informal statistical inference. I focus on key ideas of inference that I think all students should learn, including at secondary level as well as tertiary. I argue that a fundamental component of inference is to go beyond the data at hand, and I propose that statistical…

  10. Meta-learning framework applied in bioinformatics inference system design.

    Science.gov (United States)

    Arredondo, Tomás; Ormazábal, Wladimir

    2015-01-01

    This paper describes a meta-learner inference system development framework which is applied and tested in the implementation of bioinformatic inference systems. These inference systems are used for the systematic classification of the best candidates for inclusion in bacterial metabolic pathway maps. This meta-learner-based approach utilises a workflow in which the user provides feedback on final classification decisions, which are stored together with the analysed genetic sequences for periodic inference system training. The inference systems were trained and tested with three different data sets related to the bacterial degradation of aromatic compounds. The analysis of the meta-learner-based framework involved contrasting several optimisation methods with various parameters. The obtained inference systems were also contrasted with other standard classification methods, and accurate prediction capabilities were observed.

  11. GIS Data Based Automatic High-Fidelity 3D Road Network Modeling

    Science.gov (United States)

    Wang, Jie; Shen, Yuzhong

    2011-01-01

    3D road models are widely used in many computer applications such as racing games and driving simulations. However, almost all high-fidelity 3D road models have been generated manually by professional artists at the expense of intensive labor. There are very few existing methods for automatically generating high-fidelity 3D road networks, especially those existing in the real world. This paper presents a novel approach that can automatically produce high-fidelity 3D road network models from real 2D road GIS data that mainly contain road centerline information. The proposed method first builds parametric representations of the road centerlines through segmentation and fitting. A basic set of civil engineering rules (e.g., cross slope, superelevation, grade) for road design is then selected in order to generate realistic road surfaces in compliance with these rules. While the proposed method applies to any type of road, this paper mainly addresses the automatic generation of complex traffic interchanges and intersections, which are the most sophisticated elements in road networks.

  12. Improving Automation Routines for Automatic Heating Load Detection in Buildings

    Directory of Open Access Journals (Sweden)

    Stephen Timlin

    2012-11-01

    Energy managers use weather compensation data and heating system cut off routines to reduce heating energy consumption in buildings and improve user comfort. These routines are traditionally based on the calculation of an estimated building load that is inferred from the external dry bulb temperature at any point in time. While this method does reduce heating energy consumption and accidental overheating, it can be inaccurate under some weather conditions and therefore has limited effectiveness. There remains considerable scope to improve on the accuracy and relevance of the traditional method by expanding the calculations used to include a larger range of environmental metrics. It is proposed that weather compensation and automatic shut off routines that are commonly used could be improved notably with little additional cost by the inclusion of additional weather metrics. This paper examines the theoretical relationship between various external metrics and building heating loads. Results of the application of an advanced routine to a recently constructed building are examined, and estimates are made of the potential savings that can be achieved through the use of the routines proposed.
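
    A small sketch of the two routines discussed: a classic heating curve that maps outdoor dry-bulb temperature to a flow-temperature setpoint, and an extended load estimate that folds in additional weather metrics (solar gain and wind). The curve slope and correction coefficients are illustrative assumptions, not values from the paper.

```python
def flow_setpoint(t_outdoor, t_room=20.0, slope=1.4):
    """Classic heating curve: colder outside -> hotter flow water."""
    return t_room + slope * (t_room - t_outdoor)

def inferred_load(t_outdoor, solar_wm2=0.0, wind_ms=0.0,
                  k_solar=0.01, k_wind=0.5):
    """Extended load estimate: solar gain reduces load, wind raises it."""
    base = max(20.0 - t_outdoor, 0.0)        # degree-based base load
    return max(base - k_solar * solar_wm2 + k_wind * wind_ms, 0.0)

print(flow_setpoint(t_outdoor=-5))                 # 55.0 (degrees C)
print(inferred_load(5, solar_wm2=400, wind_ms=3))  # solar offsets part of load
```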

  13. A completely automatic operation type super-safe fast reactor, RAPID. Its application to dispersion source on lunar and earth surfaces

    International Nuclear Information System (INIS)

    Kanbe, Mitsuru; Tsunoda, Hirokazu; Mishima, Kaichiro; Kawasaki, Akira; Iwamura, Takamichi

    2002-01-01

    From the viewpoint of responding flexibly to future electric power demand, expectations for small-scale reactors as distributed power sources are gradually increasing. Such reactors are considered to be of growing importance not only as power sources close to their markets in advanced nations but also as sources in developing nations. A study on the development of the completely automatic operation type super-safe fast reactor RAPID (Refueling by All Pins Integrated Design) has been carried out as part of the nuclear energy basic research promotion system, a three-year project started in 1999 and entrusted by the Japan Atomic Energy Research Institute to a group including the Central Research Institute of Electric Power Industry (CRIEPI). As the reactor is a lithium-cooled fast reactor with 200 kW of electric output intended for use on the lunar surface, it can also be applied as an ultra-small nuclear reactor on the earth, and has the potential to become a new option for future nuclear power generation. CRIEPI has also investigated various types of fast reactors (the RAPID series) as distributed power sources on the earth. These super-safe fast reactors, centered on RAPID-L, are introduced here. (G.K.)

  14. Automatic control systems engineering

    International Nuclear Information System (INIS)

    Shin, Yun Gi

    2004-01-01

    This book describes automatic control for electrical and electronic systems, covering the history of automatic control, the Laplace transform, block diagrams and signal flow diagrams, electrometers, linearization of systems, state space, state-space analysis of electric systems, sensors, hydraulic control systems, stability, the time response of linear dynamic systems, the concept of the root locus, procedures for drawing the root locus, frequency response, and the design of control systems.

  15. Inferring Gene Regulatory Networks Using Conditional Regulation Pattern to Guide Candidate Genes.

    Directory of Open Access Journals (Sweden)

    Fei Xiao

    Combining path consistency (PC) algorithms with conditional mutual information (CMI) is widely used in the reconstruction of gene regulatory networks. CMI has many advantages over the Pearson correlation coefficient in measuring the non-linear dependence used to infer gene regulatory networks, and it can also discriminate direct regulations from indirect ones. However, it is still a challenge to select the conditional genes in an optimal way, which affects the performance and computational complexity of the PC algorithm. In this study, we develop a novel conditional mutual information-based algorithm, namely RPNI (Regulation Pattern based Network Inference), to infer gene regulatory networks. For conditional gene selection, we define the co-regulation pattern, indirect-regulation pattern and mixture-regulation pattern as three candidate patterns to guide the selection of candidate genes. To demonstrate the potential of our algorithm, we apply it to gene expression data from the DREAM challenge. Experimental results show that RPNI outperforms existing conditional mutual information-based methods in both accuracy and time complexity for different sizes of gene samples. Furthermore, the robustness of our algorithm is demonstrated by noise interference analysis using different types of noise.
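
    The quantity at the core of such methods is the conditional mutual information I(X;Y|Z). Below is a generic histogram-based estimator after equal-width discretization; it illustrates the measure itself and is not the RPNI implementation.

```python
import numpy as np
from collections import Counter

def cmi(x, y, z, bins=3):
    """Estimate I(X;Y|Z) in nats after equal-width discretization."""
    disc = lambda v: np.digitize(v, np.linspace(v.min(), v.max(), bins + 1)[1:-1])
    x, y, z = disc(x), disc(y), disc(z)
    n = len(x)
    pxyz = Counter(zip(x, y, z))
    pxz, pyz, pz = Counter(zip(x, z)), Counter(zip(y, z)), Counter(z)
    total = 0.0
    for (xi, yi, zi), c in pxyz.items():
        # I(X;Y|Z) = sum p(x,y,z) * log[ p(x,y,z) p(z) / (p(x,z) p(y,z)) ]
        total += (c / n) * np.log(c * pz[zi] / (pxz[(xi, zi)] * pyz[(yi, zi)]))
    return total

rng = np.random.default_rng(0)
z = rng.normal(size=500)
x = z + rng.normal(0, 0.5, 500)
y = z + rng.normal(0, 0.5, 500)
print(cmi(x, y, z))   # close to zero: x and y are conditionally independent
```

    In a PC-style procedure, a near-zero conditional mutual information given some conditioning gene z is evidence that an apparent x-y dependence is indirect, so the candidate edge can be removed.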

  16. Development of advanced automatic control system for nuclear ship. 2. Perfect automatic operation after reactor scram events

    International Nuclear Information System (INIS)

    Yabuuchi, Noriaki; Nakazawa, Toshio; Takahashi, Hiroki; Shimazaki, Junya; Hoshi, Tsutao

    1997-11-01

    An automatic operation system has been developed for the purpose of realizing fully automatic plant operation after reactor scram events. The goal of the automatic operation after a reactor scram event is to bring the reactor to a hot stand-by condition automatically. The basic functions of this system are: to monitor the actions of the safety equipment after a reactor scram, to control the equipment necessary to bring the reactor to a hot stand-by condition automatically, and to energize the decay heat removal system. The performance of this system was evaluated by comparing results obtained with the Nuclear Ship Engineering Simulation System (NESSY) with those measured in the scram test of the nuclear ship 'Mutsu'. As a result, it was shown that this system has sufficient performance to bring the reactor to a hot stand-by condition quickly and safely. (author)

  17. Development of advanced automatic control system for nuclear ship. 2. Perfect automatic operation after reactor scram events

    Energy Technology Data Exchange (ETDEWEB)

    Yabuuchi, Noriaki; Nakazawa, Toshio; Takahashi, Hiroki; Shimazaki, Junya; Hoshi, Tsutao [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1997-11-01

    An automatic operation system has been developed for the purpose of realizing fully automatic plant operation after reactor scram events. The goal of the automatic operation after a reactor scram event is to bring the reactor to a hot stand-by condition automatically. The basic functions of this system are: to monitor the actions of the safety equipment after a reactor scram, to control the equipment necessary to bring the reactor to a hot stand-by condition automatically, and to energize the decay heat removal system. The performance of this system was evaluated by comparing results obtained with the Nuclear Ship Engineering Simulation System (NESSY) with those measured in the scram test of the nuclear ship 'Mutsu'. As a result, it was shown that this system has sufficient performance to bring the reactor to a hot stand-by condition quickly and safely. (author)

  18. Individuals with fear of blushing explicitly and automatically associate blushing with social costs

    NARCIS (Netherlands)

    Glashouwer, K.A.; de Jong, P.J.; Dijk, C.; Buwalda, F.M.

    2011-01-01

    To explain fear of blushing, it has been proposed that individuals with fear of blushing overestimate the social costs of their blushing. Current information-processing models emphasize the relevance of differentiating between more automatic and more explicit cognitions, as both types of cognitions

  19. Individuals with Fear of Blushing Explicitly and Automatically Associate Blushing with Social Costs

    NARCIS (Netherlands)

    Glashouwer, Klaske A.; de Jong, Peter J.; Dijk, Corine; Buwalda, Femke M.

    2011-01-01

    To explain fear of blushing, it has been proposed that individuals with fear of blushing overestimate the social costs of their blushing. Current information-processing models emphasize the relevance of differentiating between more automatic and more explicit cognitions, as both types of cognitions

  20. Statistical inference and Aristotle's Rhetoric.

    Science.gov (United States)

    Macdonald, Ranald R

    2004-11-01

    Formal logic operates in a closed system where all the information relevant to any conclusion is present, whereas this is not the case when one reasons about events and states of the world. Pollard and Richardson drew attention to the fact that the reasoning behind statistical tests does not lead to logically justifiable conclusions. In this paper statistical inferences are defended not by logic but by the standards of everyday reasoning. Aristotle invented formal logic, but argued that people mostly get at the truth with the aid of enthymemes--incomplete syllogisms which include arguing from examples, analogies and signs. It is proposed that statistical tests work in the same way--in that they are based on examples, invoke the analogy of a model and use the size of the effect under test as a sign that the chance hypothesis is unlikely. Of existing theories of statistical inference only a weak version of Fisher's takes this into account. Aristotle anticipated Fisher by producing an argument of the form that there were too many cases in which an outcome went in a particular direction for that direction to be plausibly attributed to chance. We can therefore conclude that Aristotle would have approved of statistical inference and there is a good reason for calling this form of statistical inference classical.

  1. Deep Learning for Population Genetic Inference.

    Science.gov (United States)

    Sheehan, Sara; Song, Yun S

    2016-03-01

    Given genomic variation data from multiple individuals, computing the likelihood of complex population genetic models is often infeasible. To circumvent this problem, we introduce a novel likelihood-free inference framework by applying deep learning, a powerful modern technique in machine learning. Deep learning makes use of multilayer neural networks to learn a feature-based function from the input (e.g., hundreds of correlated summary statistics of data) to the output (e.g., population genetic parameters of interest). We demonstrate that deep learning can be effectively employed for population genetic inference and learning informative features of data. As a concrete application, we focus on the challenging problem of jointly inferring natural selection and demography (in the form of a population size change history). Our method is able to separate the global nature of demography from the local nature of selection, without sequential steps for these two factors. Studying demography and selection jointly is motivated by Drosophila, where pervasive selection confounds demographic analysis. We apply our method to 197 African Drosophila melanogaster genomes from Zambia to infer both their overall demography, and regions of their genome under selection. We find many regions of the genome that have experienced hard sweeps, and fewer under selection on standing variation (soft sweep) or balancing selection. Interestingly, we find that soft sweeps and balancing selection occur more frequently closer to the centromere of each chromosome. In addition, our demographic inference suggests that previously estimated bottlenecks for African Drosophila melanogaster are too extreme.
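
    A toy sketch of the likelihood-free workflow described above: draw parameters, simulate data, compute summary statistics, and train a network to map summaries back to parameters. The one-parameter simulator is a stand-in for a population-genetic model, and the network architecture is an arbitrary choice.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def simulate(theta, n=200):
    """Toy simulator standing in for a population-genetic model."""
    sample = rng.normal(theta, 1.0, n)
    # summary statistics of the simulated data
    return [sample.mean(), sample.std(), np.median(sample)]

# training set: parameters drawn from the prior, paired with summaries
thetas = rng.uniform(-2, 2, 1000)
summaries = np.array([simulate(t) for t in thetas])

net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000)
net.fit(summaries, thetas)

obs = simulate(0.5)                       # "observed" data summaries
print("inferred theta:", net.predict([obs])[0])
```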

  2. Position automatic determination technology

    International Nuclear Information System (INIS)

    1985-10-01

    This book tells of method of position determination and characteristic, control method of position determination and point of design, point of sensor choice for position detector, position determination of digital control system, application of clutch break in high frequency position determination, automation technique of position determination, position determination by electromagnetic clutch and break, air cylinder, cam and solenoid, stop position control of automatic guide vehicle, stacker crane and automatic transfer control.

  3. Automatic supervision and fault detection of PV systems based on power losses analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chouder, A.; Silvestre, S. [Electronic Engineering Department, Universitat Politecnica de Catalunya, C/Jordi Girona 1-3, Campus Nord UPC, 08034 Barcelona (Spain)

    2010-10-15

    In this work, we present a new automatic supervision and fault detection procedure for PV systems, based on power losses analysis. This automatic supervision system has been developed in the Matlab and Simulink environment. It includes parameter extraction techniques to calculate the main PV system parameters from monitoring data under real working conditions, taking into account the environmental irradiance and module temperature evolution, allowing simulation of the PV system behaviour in real time. The automatic supervision method analyses the output power losses present on the DC side of the PV generator, the capture losses. Two new power losses indicators are defined: thermal capture losses (L_ct) and miscellaneous capture losses (L_cm). The processing of these indicators allows the supervision system to generate a faulty signal as an indicator of fault detection in the PV system operation. Two new indicators of the deviation of the DC variables with respect to the simulated ones have also been defined: the current and voltage ratios, R_C and R_V. By analysing both the faulty signal and the current/voltage ratios, the type of fault can be identified. The automatic supervision system has been successfully tested experimentally. (author)

  4. Automatic supervision and fault detection of PV systems based on power losses analysis

    International Nuclear Information System (INIS)

    Chouder, A.; Silvestre, S.

    2010-01-01

    In this work, we present a new automatic supervision and fault detection procedure for PV systems, based on power losses analysis. This automatic supervision system has been developed in the Matlab and Simulink environment. It includes parameter extraction techniques to calculate the main PV system parameters from monitoring data under real working conditions, taking into account the environmental irradiance and module temperature evolution, allowing simulation of the PV system behaviour in real time. The automatic supervision method analyses the output power losses present on the DC side of the PV generator, the capture losses. Two new power losses indicators are defined: thermal capture losses (L_ct) and miscellaneous capture losses (L_cm). The processing of these indicators allows the supervision system to generate a faulty signal as an indicator of fault detection in the PV system operation. Two new indicators of the deviation of the DC variables with respect to the simulated ones have also been defined: the current and voltage ratios, R_C and R_V. By analysing both the faulty signal and the current/voltage ratios, the type of fault can be identified. The automatic supervision system has been successfully tested experimentally.
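
    Under common performance-ratio definitions (reference yield Y_r = G/1000, array yield Y_a = P_dc/P_stc, capture losses L_c = Y_r - Y_a), a plausible split into thermal and miscellaneous parts might look as follows. The split rule, the temperature coefficient, and the ratio definition are assumptions for the sketch; the paper's exact formulas may differ.

```python
import numpy as np

G = np.array([800., 950., 600.])        # irradiance, W/m^2
Tm = np.array([40., 52., 35.])          # module temperature, deg C
P_dc = np.array([3.1, 3.5, 2.4])        # measured DC power, kW
P_stc = 4.5                             # rated power at STC, kW
gamma = -0.004                          # power temperature coefficient, 1/degC

Yr = G / 1000.0                         # reference yield
Ya = P_dc / P_stc                       # array yield
Lc = Yr - Ya                            # total capture losses
Lct = Yr * (-gamma) * (Tm - 25.0)       # thermal part (above-STC temperature)
Lcm = Lc - Lct                          # miscellaneous remainder

Rc = P_dc / (P_stc * Yr)                # measured vs. expected DC ratio
print(Lct, Lcm, Rc)
```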

  5. Classification of sports types from tracklets

    DEFF Research Database (Denmark)

    Gade, Rikke; Moeslund, Thomas B.

    Automatic analysis of video is important in order to process and exploit large amounts of data, e.g. for sports analysis. Classification of sports types is one of the first steps towards a fully automatic analysis of the activities performed at sports arenas. In this work we test the idea that sports types can be classified from features extracted from short trajectories of the players. From tracklets created by a Kalman filter tracker we extract four robust features: total distance, lifespan, distance span and mean speed. For classification we use a quadratic discriminant analysis. In our experiments we use 30 two-minute thermal video sequences from each of five different sports types. By applying a 10-fold cross validation we obtain a correct classification rate of 94.5%.
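
    A sketch of the classification pipeline on synthetic tracklets: the four features named above are computed from (T, 2) position arrays and fed to a quadratic discriminant classifier under 10-fold cross-validation. The tracklet format, frame rate, and toy data are assumptions.

```python
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

def tracklet_features(positions, fps=25):
    """Four features from a (T, 2) array of player positions."""
    steps = np.linalg.norm(np.diff(positions, axis=0), axis=1)
    total_distance = steps.sum()
    lifespan = len(positions) / fps
    distance_span = np.linalg.norm(positions.max(0) - positions.min(0))
    mean_speed = total_distance / lifespan
    return [total_distance, lifespan, distance_span, mean_speed]

# toy data: random-walk tracklets for two "sports" with different speeds
rng = np.random.default_rng(0)
X, y = [], []
for label, scale in [(0, 0.1), (1, 0.4)]:
    for _ in range(50):
        track = np.cumsum(rng.normal(0, scale, (100, 2)), axis=0)
        X.append(tracklet_features(track))
        y.append(label)

scores = cross_val_score(QuadraticDiscriminantAnalysis(), X, y, cv=10)
print("10-fold accuracy:", scores.mean())
```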

  6. A Meta-Analysis of Multiple Matched Copy Number and Transcriptomics Data Sets for Inferring Gene Regulatory Relationships

    Science.gov (United States)

    Newton, Richard; Wernisch, Lorenz

    2014-01-01

    Inferring gene regulatory relationships from observational data is challenging. Manipulation and intervention is often required to unravel causal relationships unambiguously. However, gene copy number changes, as they frequently occur in cancer cells, might be considered natural manipulation experiments on gene expression. An increasing number of data sets on matched array comparative genomic hybridisation and transcriptomics experiments from a variety of cancer pathologies are becoming publicly available. Here we explore the potential of a meta-analysis of thirty such data sets. The aim of our analysis was to assess the potential of in silico inference of trans-acting gene regulatory relationships from this type of data. We found sufficient correlation signal in the data to infer gene regulatory relationships, with interesting similarities between data sets. A number of genes had highly correlated copy number and expression changes in many of the data sets and we present predicted potential trans-acted regulatory relationships for each of these genes. The study also investigates to what extent heterogeneity between cell types and between pathologies determines the number of statistically significant predictions available from a meta-analysis of experiments. PMID:25148247

  7. A neurocomputational model of automatic sequence production.

    Science.gov (United States)

    Helie, Sebastien; Roeder, Jessica L; Vucovich, Lauren; Rünger, Dennis; Ashby, F Gregory

    2015-07-01

    Most behaviors unfold in time and include a sequence of submovements or cognitive activities. In addition, most behaviors are automatic and repeated daily throughout life. Yet, relatively little is known about the neurobiology of automatic sequence production. Past research suggests a gradual transfer from the associative striatum to the sensorimotor striatum, but a number of more recent studies challenge this role of the BG in automatic sequence production. In this article, we propose a new neurocomputational model of automatic sequence production in which the main role of the BG is to train cortical-cortical connections within the premotor areas that are responsible for automatic sequence production. The new model is used to simulate four different data sets from human and nonhuman animals, including (1) behavioral data (e.g., RTs), (2) electrophysiology data (e.g., single-neuron recordings), (3) macrostructure data (e.g., TMS), and (4) neurological circuit data (e.g., inactivation studies). We conclude with a comparison of the new model with existing models of automatic sequence production and discuss a possible new role for the BG in automaticity and its implication for Parkinson's disease.

  8. Vikodak--A Modular Framework for Inferring Functional Potential of Microbial Communities from 16S Metagenomic Datasets.

    Directory of Open Access Journals (Sweden)

    Sunil Nagpal

    The overall metabolic/functional potential of any given environmental niche is a function of the sum total of genes/proteins/enzymes that are encoded and expressed by the various interacting microbes residing in that niche. Consequently, prior (collated) information pertaining to the genes and enzymes encoded by the resident microbes can aid in indirectly reconstructing/inferring the metabolic/functional potential of a given microbial community (given its taxonomic abundance profile). In this study, we present Vikodak, a multi-modular package that is based on the above assumption and automates inferring and/or comparing the functional characteristics of an environment using taxonomic abundance generated from one or more environmental sample datasets. With the underlying assumptions of co-metabolism and independent contributions of different microbes in a community, a concerted effort has been made to accommodate microbial co-existence patterns in the various modules incorporated in Vikodak. Validation experiments on over 1400 metagenomic samples have confirmed the utility of Vikodak in (a) deciphering enzyme abundance profiles of any KEGG metabolic pathway, (b) functional resolution of distinct metagenomic environments, (c) inferring patterns of functional interaction between resident microbes, and (d) automating statistical comparison of functional features of studied microbiomes. Novel features incorporated in Vikodak also facilitate the automatic removal of false positives and spurious functional predictions. With novel provisions for comprehensive functional analysis, the inclusion of microbial co-existence pattern based algorithms, automated inter-environment comparisons, in-depth analysis of individual metabolic pathways and greater flexibility at the user end, Vikodak is expected to be an important value addition to the family of existing tools for 16S based function prediction. A web implementation of Vikodak can be publicly accessed at: http
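
    The underlying assumption lends itself to a one-line linear algebra sketch: a community functional profile is the taxonomic abundance vector multiplied by a precomputed taxon-by-function gene content matrix. The taxa, function identifiers, and matrix entries below are invented for illustration; Vikodak's actual reference data and modules are far richer.

```python
import numpy as np

taxa = ["Escherichia", "Bacteroides", "Prevotella"]
functions = ["K00001", "K00002", "K00003"]        # hypothetical KO ids

# gene copies of each function per taxon (illustrative numbers)
content = np.array([[2, 0, 1],
                    [0, 3, 1],
                    [1, 1, 0]])

abundance = np.array([0.5, 0.3, 0.2])             # 16S relative abundances

profile = abundance @ content                     # inferred functional potential
print(dict(zip(functions, profile)))
```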

  9. Using Alien Coins to Test Whether Simple Inference Is Bayesian

    Science.gov (United States)

    Cassey, Peter; Hawkins, Guy E.; Donkin, Chris; Brown, Scott D.

    2016-01-01

    Reasoning and inference are well-studied aspects of basic cognition that have been explained as statistically optimal Bayesian inference. Using a simplified experimental design, we conducted quantitative comparisons between Bayesian inference and human inference at the level of individuals. In 3 experiments, with more than 13,000 participants, we…

  10. On Maximum Entropy and Inference

    Directory of Open Access Journals (Sweden)

    Luigi Gresele

    2017-11-01

    Maximum entropy is a powerful concept that entails a sharp separation between relevant and irrelevant variables. It is typically invoked in inference, once an assumption is made on what the relevant variables are, in order to estimate a model from data that affords predictions on all other (dependent) variables. Conversely, maximum entropy can be invoked to retrieve the relevant variables (sufficient statistics) directly from the data, once a model is identified by Bayesian model selection. We explore this approach in the case of spin models with interactions of arbitrary order, and we discuss how relevant interactions can be inferred. In this perspective, the dimensionality of the inference problem is set not by the number of parameters in the model, but by the frequency distribution of the data. We illustrate the method by showing its ability to recover the correct model in a few prototype cases and discuss its application to a real dataset.

  11. Automatic analyzing device for chlorine ion

    International Nuclear Information System (INIS)

    Sugibayashi, Shinji; Morikawa, Yoshitake; Fukase, Kazuo; Kashima, Hiromasa.

    1997-01-01

    The present invention provides a device for automatically analyzing trace amounts of chlorine ions contained in the feedwater, condensate and reactor water of a BWR type power plant. Zero adjustment and span calibration in this device are conducted as follows: (1) a standard chlorine ion solution is supplied from a tank to a mixer by a constant volume pump, where it is diluted and mixed with purified water to form a standard liquid; (2) the pH of the standard liquid is adjusted by a pH adjuster; (3) the standard liquid is supplied to an electrode cell to conduct zero adjustment or span calibration. Chlorine ions in a specimen are measured by the device of the present invention as follows: (1) the specimen is supplied to a head tank through a line filter; (2) the pH of the specimen is adjusted by a pH adjuster; (3) the specimen is supplied to an electrode cell to electrically measure the concentration of chlorine ions in the specimen. The device of the present invention can automatically analyze trace amounts of chlorine ions with high accuracy, thereby improving sensitivity and reducing the operator's burden and radiation exposure. (I.S.)

  12. Automatic ultrasound image enhancement for 2D semi-automatic breast-lesion segmentation

    Science.gov (United States)

    Lu, Kongkuo; Hall, Christopher S.

    2014-03-01

    Breast cancer is the fastest growing cancer, accounting for 29% of new cases in 2012, and the second leading cause of cancer death among women in the United States and worldwide. Ultrasound (US) has been used as an indispensable tool for breast cancer detection/diagnosis and treatment. In computer-aided assistance, lesion segmentation is a preliminary but vital step, but the task is quite challenging in US images, due to imaging artifacts that complicate detection and measurement of the suspect lesions. The lesions usually present with poor boundary features and vary significantly in size, shape, and intensity distribution between cases. Automatic methods are highly application dependent, while manual tracing methods are extremely time consuming and have a great deal of intra- and inter-observer variability. Semi-automatic approaches are designed to counterbalance the advantages and drawbacks of the automatic and manual methods. However, considerable user interaction might be necessary to ensure reasonable segmentation for a wide range of lesions. This work proposes an automatic enhancement approach to improve the boundary searching ability of the live wire method and to reduce the necessary user interaction while keeping the segmentation performance. Based on the results of segmentation of 50 2D breast lesions in US images, less user interaction is required to achieve desired accuracy, i.e. < 80%, when auto-enhancement is applied for live-wire segmentation.

  13. The ‘Continuing Misfortune’ of Automatism in Early Surrealism

    Directory of Open Access Journals (Sweden)

    Tessel M. Bauduin

    2015-09-01

    In the 1924 Manifesto of Surrealism surrealist leader André Breton (1896-1966) defined Surrealism as ‘psychic automatism in its pure state,’ positioning ‘psychic automatism’ as both a concept and a technique. This definition followed upon an intense period of experimentation with various forms of automatism among the proto-surrealist group; predominantly automatic writing, but also induced dream states. This article explores how surrealist ‘psychic automatism’ functioned as a mechanism for communication, or the expression of thought as directly as possible through the unconscious, in the first two decades of Surrealism. It touches upon automatic writing, hysteria as an automatic bodily performance of the unconscious, dreaming and the experimentation with induced dream states, and automatic drawing and other visual arts-techniques that could be executed more or less automatically as well. For all that the surrealists reinvented automatism for their own poetic, artistic and revolutionary aims, the automatic techniques were primarily drawn from contemporary Spiritualism, psychical research and experimentation with mediums, and the article teases out the connections to mediumistic automatism. It is demonstrated how the surrealists effectively and successfully divested automatism of all things spiritual. It furthermore becomes clear that despite various mishaps, automatism in many forms was a very successful creative technique within Surrealism.

  14. A versatile Czochralski crystal growth system with automatic diameter control

    Science.gov (United States)

    Aggarwal, M. D.; Metzl, R.; Wang, W. S.; Choi, J.

    1995-07-01

    A versatile Czochralski crystal pulling system with automatic diameter control for the growth of nonlinear optical oxide crystals is discussed. Pure and doped bulk single crystals of bismuth silicon oxide (Bi12SiO20) have been successfully grown using this system. The system consists of a regular Czochralski type pulling system with provision for continuous weighing of the growing crystal to provide feedback for power control.
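    Since the record describes closing the loop from crystal weight to heater power, a minimal proportional-control sketch may help; the gain, rates, and units below are illustrative assumptions, not values from the system described.

    ```python
    # Minimal sketch of weight-based diameter control in Czochralski growth.
    # All constants are invented for illustration, not taken from the record.

    def control_step(measured_weight_rate, target_weight_rate, heater_power, gain=0.5):
        """Proportional controller: if the crystal gains mass too fast
        (diameter too large), raise heater power to shrink the meniscus."""
        error = measured_weight_rate - target_weight_rate
        return heater_power + gain * error

    power = 100.0  # arbitrary units
    for rate in [1.2, 1.1, 1.05, 1.0]:  # simulated weight-gain readings (g/min)
        power = control_step(rate, target_weight_rate=1.0, heater_power=power)
        print(f"adjusted heater power: {power:.2f}")
    ```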

  15. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Chavira, Mark; Darwiche, Adnan

    2004-01-01

We describe a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating ...

  16. Causal inference in economics and marketing.

    Science.gov (United States)

    Varian, Hal R

    2016-07-05

This is an elementary introduction to causal inference in economics written for readers familiar with machine learning methods. The critical step in any causal analysis is estimating the counterfactual: a prediction of what would have happened in the absence of the treatment. The powerful techniques used in machine learning may be useful for developing better estimates of the counterfactual, potentially improving causal inference.
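    The counterfactual-estimation step described above can be sketched with an off-the-shelf learner: fit an outcome model on untreated units and predict what the treated units would have looked like without treatment. The data, model choice, and true effect below are invented for illustration.

    ```python
    # Hedged sketch of ML-based counterfactual estimation (not Varian's own code).
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 3))          # covariates
    treated = rng.random(1000) < 0.4        # treatment indicator
    y = X @ np.array([1.0, -0.5, 0.2]) + 2.0 * treated + rng.normal(size=1000)

    # Fit an outcome model on the untreated units only...
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(X[~treated], y[~treated])

    # ...and predict what treated units would have looked like untreated.
    counterfactual = model.predict(X[treated])
    effect = y[treated].mean() - counterfactual.mean()
    print(f"estimated treatment effect: {effect:.2f}  (true effect: 2.0)")
    ```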

  17. Uncertainty in prediction and in inference

    International Nuclear Information System (INIS)

    Hilgevoord, J.; Uffink, J.

    1991-01-01

The concepts of uncertainty in prediction and inference are introduced and illustrated using the diffraction of light as an example. The close relationship between the concepts of uncertainty in inference and resolving power is noted. A general quantitative measure of uncertainty in inference can be obtained by means of the so-called statistical distance between probability distributions. When applied to quantum mechanics, this distance leads to a measure of the distinguishability of quantum states, which essentially is the absolute value of the matrix element between the states. The importance of this result to the quantum mechanical uncertainty principle is noted. The second part of the paper provides a derivation of the statistical distance on the basis of the so-called method of support.
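    If the statistical distance meant here coincides with the standard (Wootters) form, which is an assumption on our part, the two quantities the abstract connects can be written as:

    ```latex
    % Statistical distance between probability distributions p and q, and the
    % induced distinguishability measure for pure quantum states:
    \[
      d(p, q) \;=\; \arccos\!\Big( \sum_i \sqrt{p_i\, q_i} \Big),
      \qquad
      \cos d(\psi_1, \psi_2) \;=\; \big|\langle \psi_1 | \psi_2 \rangle\big| .
    \]
    ```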

  18. Automatic prediction of facial trait judgments: appearance vs. structural models.

    Directory of Open Access Journals (Sweden)

    Mario Rojas

Evaluating other individuals with respect to personality characteristics plays a crucial role in human relations and it is the focus of attention for research in diverse fields such as psychology and interactive computer systems. In psychology, face perception has been recognized as a key component of this evaluation system. Multiple studies suggest that observers use face information to infer personality characteristics. Interactive computer systems are trying to take advantage of these findings and apply them to increase the natural aspect of interaction and to improve their performance. Here, we experimentally test whether the automatic prediction of facial trait judgments (e.g. dominance) can be made by using the full appearance information of the face and whether a reduced representation of its structure is sufficient. We evaluate two separate approaches: a holistic representation model using the facial appearance information and a structural model constructed from the relations among facial salient points. State-of-the-art machine learning methods are applied to (a) derive a facial trait judgment model from training data and (b) predict a facial trait value for any face. Furthermore, we address the issue of whether there are specific structural relations among facial points that predict perception of facial traits. Experimental results over a set of labeled data (9 different trait evaluations) and classification rules (4 rules) suggest that (a) prediction of perception of facial traits is learnable by both holistic and structural approaches; (b) the most reliable prediction of facial trait judgments is obtained by certain types of holistic descriptions of the face appearance; and (c) for some traits such as attractiveness and extroversion, there are relationships between specific structural features and social perceptions.

  19. Programmable automatic alpha--beta air sample counter

    International Nuclear Information System (INIS)

    Howell, W.P.

    1978-01-01

A programmable automatic alpha-beta air sample counter was developed for routine sample counting by operational health physics personnel. The system is composed of an automatic sample changer utilizing a large silicon diode detector, an electronic counting system with energy analysis capability, an automatic data acquisition controller, an interface module, and a teletypewriter with paper tape punch and paper tape reader. The system is operated through the teletypewriter keyboard and the paper tape reader, which are used to instruct the automatic data acquisition controller. Paper tape programs are provided for background counting, the chi-squared test, and sample counting. Output data are printed by the teletypewriter on standard continuous roll or multifold paper. Data are automatically corrected for background and counter efficiency.
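    The background and efficiency corrections such a counter automates reduce to simple arithmetic; a sketch follows, with all counts, times, and the efficiency value invented for illustration.

    ```python
    # Simple net-activity calculation of the kind such a counter automates.
    # Numbers are illustrative assumptions, not values from the record.

    def net_count_rate(gross_counts, count_time_s, bkg_counts, bkg_time_s, efficiency):
        """Background-corrected, efficiency-corrected activity (decays/s)."""
        gross_rate = gross_counts / count_time_s
        bkg_rate = bkg_counts / bkg_time_s
        return (gross_rate - bkg_rate) / efficiency

    # e.g. 480 gross counts in 60 s, 120 background counts in 600 s, 25% efficiency
    print(net_count_rate(480, 60, 120, 600, 0.25))  # -> 31.2 decays/s
    ```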

  20. Nonparametric predictive inference in statistical process control

    NARCIS (Netherlands)

    Arts, G.R.J.; Coolen, F.P.A.; Laan, van der P.

    2000-01-01

New methods for statistical process control are presented, where the inferences have a nonparametric predictive nature. We consider several problems in process control in terms of uncertainties about future observable random quantities, and we develop inferences for these random quantities based on ...

  1. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Darwiche, Adnan; Chavira, Mark

    2006-01-01

    We describe in this paper a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference...

  2. Making inference from wildlife collision data: inferring predator absence from prey strikes

    Directory of Open Access Journals (Sweden)

    Peter Caley

    2017-02-01

Wildlife collision data are ubiquitous, though challenging for making ecological inference due to typically irreducible uncertainty relating to the sampling process. We illustrate a new approach that is useful for generating inference from predator data arising from wildlife collisions. By simply conditioning on a second prey species sampled via the same collision process, and by using a biologically realistic numerical response function, we can produce a coherent numerical response relationship between predator and prey. This relationship can then be used to make inference on the population size of the predator species, including the probability of extinction. The statistical conditioning enables us to account for unmeasured variation in factors influencing the runway strike incidence for individual airports and to enable valid comparisons. A practical application of the approach for testing hypotheses about the distribution and abundance of a predator species is illustrated using the hypothesized red fox incursion into Tasmania, Australia. We estimate that conditional on the numerical response between fox and lagomorph runway strikes on mainland Australia, the predictive probability of observing no runway strikes of foxes in Tasmania after observing 15 lagomorph strikes is 0.001. We conclude there is enough evidence to safely reject the null hypothesis that there is a widespread red fox population in Tasmania at a population density consistent with prey availability. The method is novel and has potential wider application.

  3. Making inference from wildlife collision data: inferring predator absence from prey strikes.

    Science.gov (United States)

    Caley, Peter; Hosack, Geoffrey R; Barry, Simon C

    2017-01-01

Wildlife collision data are ubiquitous, though challenging for making ecological inference due to typically irreducible uncertainty relating to the sampling process. We illustrate a new approach that is useful for generating inference from predator data arising from wildlife collisions. By simply conditioning on a second prey species sampled via the same collision process, and by using a biologically realistic numerical response function, we can produce a coherent numerical response relationship between predator and prey. This relationship can then be used to make inference on the population size of the predator species, including the probability of extinction. The statistical conditioning enables us to account for unmeasured variation in factors influencing the runway strike incidence for individual airports and to enable valid comparisons. A practical application of the approach for testing hypotheses about the distribution and abundance of a predator species is illustrated using the hypothesized red fox incursion into Tasmania, Australia. We estimate that conditional on the numerical response between fox and lagomorph runway strikes on mainland Australia, the predictive probability of observing no runway strikes of foxes in Tasmania after observing 15 lagomorph strikes is 0.001. We conclude there is enough evidence to safely reject the null hypothesis that there is a widespread red fox population in Tasmania at a population density consistent with prey availability. The method is novel and has potential wider application.
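    In its simplest form, the conditioning argument reduces to a per-strike binomial: if fox and lagomorph strikes arrive in a fixed rate ratio r, each recorded strike is independently a fox with probability r/(1+r). The sketch below is a stand-in for the paper's full analysis (which integrates over uncertainty in the numerical response); r is an assumed value chosen so the result lands near the quoted 0.001.

    ```python
    # Minimal stand-in for the paper's predictive calculation; 'r' is an
    # assumed, fixed fox-to-lagomorph strike-rate ratio, not an estimate
    # from the paper.

    def prob_no_fox_strikes(n_lagomorph_strikes, r):
        """P(every one of n observed strikes is a lagomorph) when each strike
        is independently a fox with probability r / (1 + r)."""
        q_fox = r / (1.0 + r)
        return (1.0 - q_fox) ** n_lagomorph_strikes

    # With r ~ 0.58 (illustrative), 15 lagomorph strikes and no fox strikes
    # is already a ~0.001-probability outcome:
    print(prob_no_fox_strikes(15, 0.58))
    ```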

  4. Causal inference in biology networks with integrated belief propagation.

    Science.gov (United States)

    Chang, Rui; Karr, Jonathan R; Schadt, Eric E

    2015-01-01

Inferring causal relationships among molecular and higher order phenotypes is a critical step in elucidating the complexity of living systems. Here we propose a novel method for inferring causality that is no longer constrained by the conditional dependency arguments that limit the ability of statistical causal inference methods to resolve causal relationships within sets of graphical models that are Markov equivalent. Our method utilizes Bayesian belief propagation to infer the responses of perturbation events on molecular traits given a hypothesized graph structure. A distance measure between the inferred response distribution and the observed data is defined to assess the 'fitness' of the hypothesized causal relationships. To test our algorithm, we infer causal relationships within equivalence classes of gene networks, in which the possible functional interactions are assumed to be nonlinear, given synthetic microarray and RNA sequencing data. We also apply our method to infer causality in a real metabolic network containing a v-structure and a feedback loop. We show that our method can recapitulate the causal structure and recover the feedback loop from steady-state data alone, which conventional methods cannot.

  5. Robot-assisted automatic ultrasound calibration.

    Science.gov (United States)

    Aalamifar, Fereshteh; Cheng, Alexis; Kim, Younsu; Hu, Xiao; Zhang, Haichong K; Guo, Xiaoyu; Boctor, Emad M

    2016-10-01

Ultrasound (US) calibration is the process of determining the unknown transformation from a coordinate frame such as the robot's tooltip to the US image frame and is a necessary task for any robotic or tracked US system. US calibration requires submillimeter-range accuracy for most applications, but it is a time-consuming and repetitive task. We provide a new framework for automatic US calibration with robot assistance and without the need for temporal calibration. US calibration based on an active echo (AE) phantom was previously proposed, and its superiority over conventional cross-wire phantom-based calibration was shown. In this work, we use AE to guide the robotic arm motion through the process of data collection; we combine the capability of the AE point to localize itself in the frame of the US image with the automatic motion of the robotic arm to provide a framework for calibrating the arm to the US image automatically. We demonstrated the efficacy of the automated method compared to the manual method through experiments. To highlight the necessity of frequent ultrasound calibration, it is demonstrated that the calibration precision changed from 1.67 to 3.20 mm if the data collection is not repeated after a dismounting/mounting of the probe holder. In a large data set experiment, similar reconstruction precision of automatic and manual data collection was observed, while the time was reduced by 58%. In addition, we compared ten automatic calibrations with ten manual ones, each performed in 15 min, and showed that all the automatic ones could converge when the initial matrix was set to identity, while this was not achieved by the manual data sets. Given the same initial matrix, the repeatability of the automatic method was [0.46, 0.34, 0.80, 0.47] versus [0.42, 0.51, 0.98, 1.15] mm in the manual case for the four US image corners. The submillimeter accuracy requirement of US calibration makes frequent data collections unavoidable. We proposed an automated …
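    A core geometric subproblem in calibration of this kind is recovering a rigid transformation from paired points. The sketch below uses the standard Kabsch/SVD solution; it is not the authors' active-echo method, and the test data are synthetic.

    ```python
    # Hedged sketch: point-based rigid registration (Kabsch/SVD), a common
    # building block in calibration pipelines. Not the authors' AE method.
    import numpy as np

    def rigid_transform(A, B):
        """Find R, t minimizing ||(R @ A.T).T + t - B|| for paired Nx3 points."""
        ca, cb = A.mean(axis=0), B.mean(axis=0)
        H = (A - ca).T @ (B - cb)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:          # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = cb - R @ ca
        return R, t

    A = np.random.default_rng(1).normal(size=(20, 3))
    R_true = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)
    B = A @ R_true.T + np.array([1.0, 2.0, 3.0])
    R, t = rigid_transform(A, B)
    print(np.allclose(R, R_true), np.allclose(t, [1, 2, 3]))  # True True
    ```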

  6. Learning algorithms and automatic processing of languages; Algorithmes a apprentissage et traitement automatique des langues

    Energy Technology Data Exchange (ETDEWEB)

    Fluhr, Christian Yves Andre

    1977-06-15

This research thesis concerns the field of artificial intelligence. It addresses learning algorithms applied to the automatic processing of languages. The author first briefly describes some mechanisms of human intelligence in order to describe how these mechanisms are simulated on a computer, and outlines the specific role of learning in various manifestations of intelligence. Then, based on the theory of Markov algorithms, the author discusses the notion of a learning algorithm. Two main types of learning algorithms are then addressed: first, a sanction-based algorithm of the 'algorithm-teacher dialogue' type, which aims at learning how to resolve grammatical ambiguities in submitted texts; second, an algorithm related to a documentation system, which automatically structures semantic data obtained from a set of texts in order to be able to answer, by reference, any question on the content of these texts.

  7. Efficient Bayesian inference for ARFIMA processes

    Science.gov (United States)

    Graves, T.; Gramacy, R. B.; Franzke, C. L. E.; Watkins, N. W.

    2015-03-01

    Many geophysical quantities, like atmospheric temperature, water levels in rivers, and wind speeds, have shown evidence of long-range dependence (LRD). LRD means that these quantities experience non-trivial temporal memory, which potentially enhances their predictability, but also hampers the detection of externally forced trends. Thus, it is important to reliably identify whether or not a system exhibits LRD. In this paper we present a modern and systematic approach to the inference of LRD. Rather than Mandelbrot's fractional Gaussian noise, we use the more flexible Autoregressive Fractional Integrated Moving Average (ARFIMA) model which is widely used in time series analysis, and of increasing interest in climate science. Unlike most previous work on the inference of LRD, which is frequentist in nature, we provide a systematic treatment of Bayesian inference. In particular, we provide a new approximate likelihood for efficient parameter inference, and show how nuisance parameters (e.g. short memory effects) can be integrated over in order to focus on long memory parameters, and hypothesis testing more directly. We illustrate our new methodology on the Nile water level data, with favorable comparison to the standard estimators.
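    The 'FI' in ARFIMA is the fractional differencing operator (1 - B)^d, whose coefficients follow a simple recursion; a sketch, with the value of d chosen arbitrarily for illustration:

    ```python
    # Sketch of the fractional differencing filter (1 - B)^d at the heart of
    # ARFIMA; coefficients follow the standard binomial-expansion recursion.
    import numpy as np

    def frac_diff(x, d):
        """Apply (1 - B)^d to a series via its binomial expansion."""
        n = len(x)
        w = np.zeros(n)
        w[0] = 1.0
        for k in range(1, n):
            w[k] = w[k - 1] * (k - 1 - d) / k   # pi_k = pi_{k-1} (k-1-d)/k
        return np.convolve(x, w)[:n]            # causal filtering

    rng = np.random.default_rng(0)
    noise = rng.normal(size=500)
    long_memory = frac_diff(noise, d=-0.3)      # d < 0: fractional *integration*
    print(long_memory[:5])
    ```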

  8. Automation strategies in five domains - A comparison of levels of automation, function allocation and visualisation of automatic functions

    International Nuclear Information System (INIS)

    Andersson, J.

    2011-01-01

This study was conducted as a field study in which control room operators and engineers from the refinery, heat and power, aviation, shipping and nuclear domains were interviewed regarding the use of automation and the visualisation of automatic functions. The purpose of the study was to collect experiences and best practices from the five studied domains on levels of automation, function allocation and visualisation of automatic functions. In total, nine different control room settings were visited. The studied settings were compared using a systemic approach based on a human-machine systems model. The results show that the 'left-over principle' is still the most commonly applied approach to function allocation, but in high-risk settings the decision whether to automate or not is more carefully considered. Regarding the visualisation of automatic functions, it was found that as long as each display type (process-based, function-oriented, situation-oriented and task-based) is applied so that it corresponds to the same level of abstraction as the technical system, the operator's mental model will be supported. No single display type can, however, readily match all levels of abstraction at the same time; all display types are still needed and serve different purposes. (Author)

  9. Estimation of parameter uncertainty for an activated sludge model using Bayesian inference: a comparison with the frequentist method.

    Science.gov (United States)

    Zonta, Zivko J; Flotats, Xavier; Magrí, Albert

    2014-08-01

The procedure commonly used for the assessment of the parameters included in activated sludge models (ASMs) relies on the estimation of their optimal value within a confidence region (i.e. frequentist inference). Once optimal values are estimated, parameter uncertainty is computed through the covariance matrix. However, alternative approaches that treat the model parameters as probability distributions (i.e. Bayesian inference) may be of interest. The aim of this work is to apply (and compare) both Bayesian and frequentist inference methods when assessing uncertainty for an ASM-type model that considers intracellular storage and biomass growth simultaneously. Practical identifiability was addressed exclusively considering respirometric profiles based on the oxygen uptake rate and with the aid of probabilistic global sensitivity analysis. Parameter uncertainty was thus estimated according to both the Bayesian and frequentist inferential procedures. Results were compared in order to highlight the strengths and weaknesses of both approaches. Since it was demonstrated that Bayesian inference can be reduced to the frequentist approach under particular hypotheses, the former can be considered the more general methodology. Hence, the use of Bayesian inference is encouraged for tackling inferential issues in ASM environments.
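    A generic illustration of the Bayesian route is a random-walk Metropolis sampler; the toy Gaussian likelihood below stands in for the respirometric ASM model, which is far more involved.

    ```python
    # Generic random-walk Metropolis sketch (toy 1-D problem, not the ASM model).
    import numpy as np

    rng = np.random.default_rng(0)
    data = rng.normal(loc=1.5, scale=1.0, size=50)   # synthetic observations

    def log_posterior(theta):
        # flat prior; Gaussian likelihood with known unit variance
        return -0.5 * np.sum((data - theta) ** 2)

    samples, theta = [], 0.0
    for _ in range(20000):
        proposal = theta + rng.normal(scale=0.3)
        if np.log(rng.random()) < log_posterior(proposal) - log_posterior(theta):
            theta = proposal
        samples.append(theta)

    post = np.array(samples[5000:])                  # drop burn-in
    print(f"posterior mean {post.mean():.2f} +/- {post.std():.2f}")
    ```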

  10. Sensitivity analysis and design optimization through automatic differentiation

    International Nuclear Information System (INIS)

    Hovland, Paul D; Norris, Boyana; Strout, Michelle Mills; Bhowmick, Sanjukta; Utke, Jean

    2005-01-01

Automatic differentiation is a technique for transforming a program or subprogram that computes a function, including arbitrarily complex simulation codes, into one that computes the derivatives of that function. We describe the implementation and application of automatic differentiation tools. We highlight recent advances in the combinatorial algorithms and compiler technology that underlie successful implementation of automatic differentiation tools. We discuss applications of automatic differentiation in design optimization and sensitivity analysis. We also describe ongoing research in the design of language-independent source transformation infrastructures for automatic differentiation algorithms.
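    A minimal illustration of the transformation the abstract describes is forward-mode automatic differentiation with dual numbers, where every arithmetic operation carries a derivative alongside its value:

    ```python
    # Minimal forward-mode automatic differentiation with dual numbers.
    class Dual:
        def __init__(self, val, dot=0.0):
            self.val, self.dot = val, dot          # value and derivative parts
        def __add__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            return Dual(self.val + o.val, self.dot + o.dot)
        __radd__ = __add__
        def __mul__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            return Dual(self.val * o.val,
                        self.val * o.dot + self.dot * o.val)  # product rule
        __rmul__ = __mul__

    def f(x):
        return 3 * x * x + 2 * x + 1   # any composition of +, * works unchanged

    x = Dual(2.0, 1.0)                 # seed dx/dx = 1
    y = f(x)
    print(y.val, y.dot)                # 17.0 and f'(2) = 6*2 + 2 = 14.0
    ```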

  11. Automatic neutron dosimetry system based on fluorescent nuclear track detector technology

    International Nuclear Information System (INIS)

    Akselrod, M.S.; Fomenko, V.V.; Bartz, J.A.; Haslett, T.L.

    2014-01-01

    For the first time, the authors are describing an automatic fluorescent nuclear track detector (FNTD) reader for neutron dosimetry. FNTD is a luminescent integrating type of detector made of aluminium oxide crystals that does not require electronics or batteries during irradiation. Non-destructive optical readout of the detector is performed using a confocal laser scanning fluorescence imaging with near-diffraction limited resolution. The fully automatic table-top reader allows one to load up to 216 detectors on a tray, read their engraved IDs using a CCD camera and optical character recognition, scan and process simultaneously two types of images in fluorescent and reflected laser light contrast to eliminate false-positive tracks related to surface and volume crystal imperfections. The FNTD dosimetry system allows one to measure neutron doses from 0.1 mSv to 20 Sv and covers neutron energies from thermal to 20 MeV. The reader is characterised by a robust, compact optical design, fast data processing electronics and user-friendly software. The first table-top automatic FNTD neutron dosimetry system was successfully tested for LLD, linearity and ability to measure neutrons in mixed neutron-photon fields satisfying US and ISO standards. This new neutron dosimetry system provides advantages over other technologies including environmental stability of the detector material, wide range of detectable neutron energies and doses, detector re-readability and re-usability and all-optical readout. A new adaptive image processing algorithm reliably removes false-positive tracks associated with surface and bulk crystal imperfections. (authors)

  12. Deep Learning for Population Genetic Inference.

    Directory of Open Access Journals (Sweden)

    Sara Sheehan

    2016-03-01

Given genomic variation data from multiple individuals, computing the likelihood of complex population genetic models is often infeasible. To circumvent this problem, we introduce a novel likelihood-free inference framework by applying deep learning, a powerful modern technique in machine learning. Deep learning makes use of multilayer neural networks to learn a feature-based function from the input (e.g., hundreds of correlated summary statistics of data) to the output (e.g., population genetic parameters of interest). We demonstrate that deep learning can be effectively employed for population genetic inference and learning informative features of data. As a concrete application, we focus on the challenging problem of jointly inferring natural selection and demography (in the form of a population size change history). Our method is able to separate the global nature of demography from the local nature of selection, without sequential steps for these two factors. Studying demography and selection jointly is motivated by Drosophila, where pervasive selection confounds demographic analysis. We apply our method to 197 African Drosophila melanogaster genomes from Zambia to infer both their overall demography, and regions of their genome under selection. We find many regions of the genome that have experienced hard sweeps, and fewer under selection on standing variation (soft sweep) or balancing selection. Interestingly, we find that soft sweeps and balancing selection occur more frequently closer to the centromere of each chromosome. In addition, our demographic inference suggests that previously estimated bottlenecks for African Drosophila melanogaster are too extreme.

  13. Deep Learning for Population Genetic Inference

    Science.gov (United States)

    Sheehan, Sara; Song, Yun S.

    2016-01-01

    Given genomic variation data from multiple individuals, computing the likelihood of complex population genetic models is often infeasible. To circumvent this problem, we introduce a novel likelihood-free inference framework by applying deep learning, a powerful modern technique in machine learning. Deep learning makes use of multilayer neural networks to learn a feature-based function from the input (e.g., hundreds of correlated summary statistics of data) to the output (e.g., population genetic parameters of interest). We demonstrate that deep learning can be effectively employed for population genetic inference and learning informative features of data. As a concrete application, we focus on the challenging problem of jointly inferring natural selection and demography (in the form of a population size change history). Our method is able to separate the global nature of demography from the local nature of selection, without sequential steps for these two factors. Studying demography and selection jointly is motivated by Drosophila, where pervasive selection confounds demographic analysis. We apply our method to 197 African Drosophila melanogaster genomes from Zambia to infer both their overall demography, and regions of their genome under selection. We find many regions of the genome that have experienced hard sweeps, and fewer under selection on standing variation (soft sweep) or balancing selection. Interestingly, we find that soft sweeps and balancing selection occur more frequently closer to the centromere of each chromosome. In addition, our demographic inference suggests that previously estimated bottlenecks for African Drosophila melanogaster are too extreme. PMID:27018908

  14. Automatically sweeping dual-channel boxcar integrator

    International Nuclear Information System (INIS)

    Keefe, D.J.; Patterson, D.R.

    1978-01-01

An automatically sweeping dual-channel boxcar integrator has been developed to automate the search for a signal that repeatedly follows a trigger pulse by a constant or slowly varying time delay when that signal is completely hidden in random electrical noise and dc-offset drifts. The automatically sweeping dual-channel boxcar integrator improves the signal-to-noise ratio and eliminates dc-drift errors in the same way that a conventional dual-channel boxcar integrator does, but, in addition, automatically locates the hidden signal. When the signal is found, its time delay is displayed with 100-ns resolution, and its peak value is automatically measured and displayed. This relieves the operator of the tedious, time-consuming, and error-prone search for the signal whenever the time delay changes. The automatically sweeping boxcar integrator can also be used as a conventional dual-channel boxcar integrator. In either mode, it can repeatedly integrate a signal up to 990 times and thus make accurate measurements of the signal pulse height in the presence of random noise, dc offsets, and unsynchronized interfering signals.
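    The signal-recovery principle behind the instrument can be sketched in a few lines: averaging many triggered sweeps suppresses random noise roughly as the square root of the number of integrations. The pulse shape, noise level, and sweep count below are invented.

    ```python
    # Sketch of the core boxcar idea: averaging N triggered sweeps lifts a
    # repetitive signal out of random noise (SNR grows roughly as sqrt(N)).
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 500)
    signal = 0.2 * np.exp(-((t - 0.6) ** 2) / 0.001)   # small pulse at fixed delay

    sweeps = signal + rng.normal(scale=1.0, size=(990, t.size))  # buried in noise
    avg = sweeps.mean(axis=0)                                    # 990 integrations

    print("single-sweep peak SNR ~", signal.max() / 1.0)         # ~0.2, invisible
    print("averaged peak position:", t[np.argmax(avg)])          # ~0.6 recovered
    ```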

  15. A Bayesian Network Schema for Lessening Database Inference

    National Research Council Canada - National Science Library

    Chang, LiWu; Moskowitz, Ira S

    2001-01-01

… The authors introduce a formal schema for database inference analysis, based upon a Bayesian network structure, which identifies critical parameters involved in the inference problem and represents …

  16. Analysing Solar-like Oscillations with an Automatic Pipeline

    International Nuclear Information System (INIS)

    Mathur, S.; Garcia, R. A.; Regulo, C.; Ballot, J.; Salabert, D.; Chaplin, W. J.

    2009-01-01

    The Kepler mission will provide a huge amount of asteroseismic data during the next few years, among which hundreds of solar-like stars will be targeted. The amount of stars and their observation length represent a step forward in the comprehension of the stellar evolution that has already been initiated by CoRoT and MOST missions. Up to now, the slow cadence of observed targets allowed an individual and personalized analysis of each star. During the survey phase of Kepler, this will be impossible. This is the reason why, within the AsteroFLAG team, we have been developing automatic pipelines for the Kepler solar-like oscillation stars. Our code starts by finding the frequency-range where p-mode power is present and, after fitting the background, it looks for the mode amplitudes as well as the central frequency of the p-mode hump. A good estimation of the large separation can thus be inferred in this region. If the signal to noise is high enough, the code obtains the characteristics of the p modes by doing a global fitting on the power spectrum. Here, we will first describe a few features of this pipeline and its application to AsteroFLAG synthetic data to check the validity of the code.
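    One common way to obtain the large separation from the p-mode region (the pipeline's own estimator may differ) is the autocorrelation of the power spectrum, which peaks near the mode spacing; a toy sketch with an artificial comb of modes:

    ```python
    # Hedged sketch: estimating the large separation from a power spectrum
    # via its autocorrelation. The comb of modes below is artificial.
    import numpy as np

    def large_separation(power, dfreq):
        """power: spectrum on a uniform frequency grid with spacing dfreq."""
        p = power - power.mean()
        ac = np.correlate(p, p, mode="full")[p.size - 1:]  # lags 0, 1, 2, ...
        lag = np.argmax(ac[5:]) + 5                        # skip the zero-lag peak
        return lag * dfreq

    # toy comb with one mode every 135 frequency bins (solar-like, illustrative)
    freq = np.arange(0, 5000, 1.0)
    power = np.zeros_like(freq)
    power[(freq % 135) < 1] = 1.0
    print(large_separation(power, dfreq=1.0))              # -> 135.0
    ```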

  17. Personality in speech assessment and automatic classification

    CERN Document Server

    Polzehl, Tim

    2015-01-01

    This work combines interdisciplinary knowledge and experience from research fields of psychology, linguistics, audio-processing, machine learning, and computer science. The work systematically explores a novel research topic devoted to automated modeling of personality expression from speech. For this aim, it introduces a novel personality assessment questionnaire and presents the results of extensive labeling sessions to annotate the speech data with personality assessments. It provides estimates of the Big 5 personality traits, i.e. openness, conscientiousness, extroversion, agreeableness, and neuroticism. Based on a database built on the questionnaire, the book presents models to tell apart different personality types or classes from speech automatically.

  18. Automatic control of a negative ion source

    International Nuclear Information System (INIS)

    Saadatmand, K.; Sredniawski, J.; Solensten, L.

    1989-01-01

A CAMAC based control architecture is devised for a Berkeley-type H⁻ volume ion source. The architecture employs three 80386 PCs. One PC is dedicated to control and monitoring of source operation. The other PC functions with digitizers to provide data acquisition of waveforms. The third PC is used for off-line analysis. Initially, operation of the source was put under remote computer control (supervisory). This was followed by development of an automated startup procedure. Finally, a study of the physics of operation is now underway to establish a data base from which automatic beam optimization can be derived. (orig.)

  19. Automatic control of a negative ion source

    Science.gov (United States)

    Saadatmand, K.; Sredniawski, J.; Solensten, L.

    1989-04-01

A CAMAC based control architecture is devised for a Berkeley-type H⁻ volume ion source [1]. The architecture employs three 80386 PCs. One PC is dedicated to control and monitoring of source operation. The other PC functions with digitizers to provide data acquisition of waveforms. The third PC is used for off-line analysis. Initially, operation of the source was put under remote computer control (supervisory). This was followed by development of an automated startup procedure. Finally, a study of the physics of operation is now underway to establish a data base from which automatic beam optimization can be derived.

  20. Automatic control of a negative ion source

    Energy Technology Data Exchange (ETDEWEB)

    Saadatmand, K.; Sredniawski, J.; Solensten, L. (Grumman Corp., Long Island, NY (USA))

    1989-04-01

A CAMAC based control architecture is devised for a Berkeley-type H⁻ volume ion source. The architecture employs three 80386 PCs. One PC is dedicated to control and monitoring of source operation. The other PC functions with digitizers to provide data acquisition of waveforms. The third PC is used for off-line analysis. Initially, operation of the source was put under remote computer control (supervisory). This was followed by development of an automated startup procedure. Finally, a study of the physics of operation is now underway to establish a data base from which automatic beam optimization can be derived. (orig.)

  1. What Automaticity Deficit? Activation of Lexical Information by Readers with Dyslexia in a Rapid Automatized Naming Stroop-Switch Task

    Science.gov (United States)

    Jones, Manon W.; Snowling, Margaret J.; Moll, Kristina

    2016-01-01

    Reading fluency is often predicted by rapid automatized naming (RAN) speed, which as the name implies, measures the automaticity with which familiar stimuli (e.g., letters) can be retrieved and named. Readers with dyslexia are considered to have less "automatized" access to lexical information, reflected in longer RAN times compared with…

  2. Automatization of welding for nuclear power equipments and facilities

    International Nuclear Information System (INIS)

    Tamai, Yasumasa; Matsumoto, Teruo; Koyama, Takaichi

    1980-01-01

Because high reliability is required in the construction of nuclear power plants, and because modification work on existing plants must reduce radiation exposure, automated and remotely operated welding have become increasingly necessary. This paper describes the present state of welding automation at Hitachi Ltd. for machines, equipment and piping for nuclear power plants, and introduces the aims of the development, the features of the equipment and the state of application to actual plants, centering on the automation of welding for large structures such as reactor containment vessels and the remote-type automatic welding system for piping. These automations yielded substantial improvements in the welding quality required of machines and equipment for atomic energy. Notable results were also obtained in work peculiar to nuclear power plants, where both reduced radiation exposure to workers and high-quality welding are demanded. The present state of welding automation for nuclear installations at Hitachi Ltd., the development of automatic welding equipment and its application to actual plants, and the development and application of the automatic pipe working machine for reducing radiation exposure are explained. (Kako, I.)

  3. Explanatory Preferences Shape Learning and Inference.

    Science.gov (United States)

    Lombrozo, Tania

    2016-10-01

    Explanations play an important role in learning and inference. People often learn by seeking explanations, and they assess the viability of hypotheses by considering how well they explain the data. An emerging body of work reveals that both children and adults have strong and systematic intuitions about what constitutes a good explanation, and that these explanatory preferences have a systematic impact on explanation-based processes. In particular, people favor explanations that are simple and broad, with the consequence that engaging in explanation can shape learning and inference by leading people to seek patterns and favor hypotheses that support broad and simple explanations. Given the prevalence of explanation in everyday cognition, understanding explanation is therefore crucial to understanding learning and inference. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Grammatical inference algorithms, routines and applications

    CERN Document Server

    Wieczorek, Wojciech

    2017-01-01

This book focuses on grammatical inference, presenting classic and modern methods of grammatical inference from the perspective of practitioners. To do so, it employs the Python programming language to present all of the methods discussed. Grammatical inference is a field that lies at the intersection of multiple disciplines, with contributions from computational linguistics, pattern recognition, machine learning, computational biology, formal learning theory and many others. Though the book is largely practical, it also includes elements of learning theory, combinatorics on words, the theory of automata and formal languages, plus references to real-world problems. The listings presented here can be directly copied and pasted into other programs, thus making the book a valuable source of ready recipes for students, academic researchers, and programmers alike, as well as an inspiration for their further development.
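    In the spirit of the book's Python listings, a minimal grammatical-inference building block is the prefix tree acceptor constructed from positive samples (state-merging algorithms such as RPNI then generalize it). This sketch is ours, not a listing from the book:

    ```python
    # Prefix tree acceptor (PTA) built from positive example strings.
    def build_pta(samples):
        """Return transitions {(state, symbol): state} and accepting states."""
        transitions, accepting, next_state = {}, set(), 1
        for word in samples:
            state = 0
            for symbol in word:
                if (state, symbol) not in transitions:
                    transitions[(state, symbol)] = next_state
                    next_state += 1
                state = transitions[(state, symbol)]
            accepting.add(state)
        return transitions, accepting

    def accepts(transitions, accepting, word):
        state = 0
        for symbol in word:
            if (state, symbol) not in transitions:
                return False
            state = transitions[(state, symbol)]
        return state in accepting

    trans, acc = build_pta(["ab", "abb", "ba"])
    print(accepts(trans, acc, "abb"), accepts(trans, acc, "bb"))  # True False
    ```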

  5. 14 CFR 29.1329 - Automatic pilot system.

    Science.gov (United States)

    2010-01-01

From 14 CFR § 29.1329, Automatic pilot system: (a) Each automatic pilot system must be designed so that the automatic pilot can (1) Be sufficiently overpowered by one pilot to allow control of the rotorcraft; and (2) Be readily and positively …

  6. 14 CFR 27.1329 - Automatic pilot system.

    Science.gov (United States)

    2010-01-01

From 14 CFR § 27.1329, Automatic pilot system: (a) Each automatic pilot system must be designed so that the automatic pilot can (1) Be sufficiently overpowered by one pilot to allow control of the rotorcraft; and (2) Be readily and positively …

  7. Occurrence and evolutionary inferences about Kranz anatomy in Cyperaceae (Poales

    Directory of Open Access Journals (Sweden)

    SHIRLEY MARTINS

    2015-12-01

Cyperaceae is the angiosperm family with the greatest diversity of species with Kranz anatomy. Four different types of Kranz anatomy (chlorocyperoid, eleocharoid, fimbristyloid and rhynchosporoid) have been described for this family, and the occurrence and structural characteristics of these types are important for tracing evolutionary hypotheses. The purpose of this study was to examine the available data on Cyperaceae Kranz anatomy, emphasizing taxonomy, geographic distribution, habitat and anatomy, to infer the potential origin of Kranz anatomy in this family. The results showed that the four types of Kranz anatomy (associated with C4 photosynthesis) in Cyperaceae emerged numerous times in unrelated phylogenetic groups. However, the convergence of these anatomical types, except rhynchosporoid, was observed in certain groups. Thus, the diverse origin of these species might result from different environmental pressures that promote photorespiration. The greatest variation in the occurrence of Kranz anatomy and anatomical types was observed in Eleocharis, in which the C4 pathway emerged recently compared with other genera of the family, and the species of this genus occur in aquatic environments.

  8. BagReg: Protein inference through machine learning.

    Science.gov (United States)

    Zhao, Can; Liu, Dao; Teng, Ben; He, Zengyou

    2015-08-01

Protein inference from the identified peptides is of primary importance in shotgun proteomics. The target of protein inference is to identify whether each candidate protein is truly present in the sample. To date, many computational methods have been proposed to solve this problem. However, there is still no method that can fully utilize the information hidden in the input data. In this article, we propose a learning-based method named BagReg for protein inference. The method first extracts five features from the input data, and then chooses each feature in turn as the class feature, separately building models to predict the presence probabilities of proteins. Finally, the weak results from the five prediction models are aggregated to obtain the final result. We test our method on six publicly available data sets. The experimental results show that our method is superior to the state-of-the-art protein inference algorithms. Copyright © 2015 Elsevier Ltd. All rights reserved.
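    A loose sketch of the aggregation scheme described above, under the assumption that 'choosing each feature as the class feature' means training one predictor per feature from the remaining features and averaging the resulting presence probabilities; data and thresholds are invented:

    ```python
    # Hedged sketch of BagReg-style aggregation: one predictor per
    # feature-as-target, averaged presence probabilities. Illustrative only.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 5))            # 5 protein-level features
    truth = (X.sum(axis=1) + rng.normal(size=300)) > 0   # protein present?

    probs = []
    for target in range(5):
        # binarize the chosen feature as a noisy stand-in class label and
        # predict it from the remaining four features
        label = X[:, target] > 0
        rest = np.delete(X, target, axis=1)
        clf = LogisticRegression().fit(rest, label)
        probs.append(clf.predict_proba(rest)[:, 1])

    final = np.mean(probs, axis=0)           # aggregate the five weak outputs
    print("agreement with ground truth:", np.mean((final > 0.5) == truth))
    ```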

  9. Automatic differentiation algorithms in model analysis

    NARCIS (Netherlands)

    Huiskes, M.J.

    2002-01-01

Title: Automatic differentiation algorithms in model analysis
Author: M.J. Huiskes
Date: 19 March 2002

In this thesis, automatic differentiation algorithms and derivative-based methods …

  10. Sensitometric characteristics of D-, E- and F-speed dental radiographic films in manual and automatic processing

    Directory of Open Access Journals (Sweden)

    Jahangir Haghani

    2012-12-01

BACKGROUND: The purpose of this study was to evaluate the sensitometric characteristics of Ultraspeed, Ektaspeed Plus and Insight dental radiographic films using manual and automatic processing systems. METHODS: In this experimental in vitro study, an aluminum step-wedge was used to construct characteristic curves for D-, E- and F-speed radiographic films (Eastman Kodak, Rochester, USA). All films were processed in an Iranian processing solution (Chemical Industries Co., Tehran, Iran), both manually and automatically, over a period of six days. Unexposed films of the three types were processed manually and automatically to determine base-plus-fog density. Speed and film contrast were measured according to the International Standards Organization definitions. RESULTS: There was a significant difference in density among the D-, E- and F-speed films in both the manual and automatic processing systems (P < 0.001), and a significant difference in density between the Ultraspeed and Insight films. There was no significant difference in contrast among the D-, E- and F-speed films in the manual and automatic processing systems (P = 0.255, P = 0.26). There was a significant difference in speed among the D-, E- and F-speed films in the manual and automatic processing systems (P = 0.034, P = 0.04). CONCLUSIONS: The choice of processing system can affect radiographic characteristics. The F-speed film processed in an automatic system has greater speed than with manual processing, and it provides a further reduction in radiation exposure without detriment to image quality.

  11. Integrating Automatic Speech Recognition and Machine Translation for Better Translation Outputs

    DEFF Research Database (Denmark)

    Liyanapathirana, Jeevanthi

… translations, combining machine translation with computer-assisted translation has drawn attention in current research. This combines two prospects: the opportunity of ensuring high-quality translation along with a significant performance gain. Automatic Speech Recognition (ASR) is another important area, which provides important functionalities in language processing and natural language understanding tasks. In this work we integrate automatic speech recognition and machine translation in parallel. We aim to avoid manual typing of possible translations, as dictating the translation would take less time … to the n-best list rescoring, we also use word graphs with the expectation of arriving at a tighter integration of ASR and MT models. Integration methods include constraining ASR models using language and translation models of MT, and vice versa. We currently develop and experiment with different methods …

  12. Inference of Transcription Regulatory Network in Low Phytic Acid Soybean Seeds

    Directory of Open Access Journals (Sweden)

    Neelam Redekar

    2017-11-01

A dominant loss-of-function mutation in the myo-inositol phosphate synthase (MIPS) gene and recessive loss-of-function mutations in two multidrug-resistant protein type ABC transporter genes not only reduce seed phytic acid levels in soybean, but also affect the pathways associated with seed development, ultimately resulting in low emergence. To understand the regulatory mechanisms and identify key genes that intervene in the seed development process in low phytic acid crops, we performed computational inference of gene regulatory networks in low and normal phytic acid soybeans using time-course transcriptomic data and multiple network inference algorithms. We identified a set of putative candidate transcription factors and their regulatory interactions with genes that function in myo-inositol biosynthesis, auxin-ABA signaling, and seed dormancy. We evaluated the performance of our unsupervised network inference method by comparing the predicted regulatory network with published regulatory interactions in Arabidopsis. Some contrasting regulatory interactions were observed in low phytic acid mutants compared with non-mutant lines. These findings provide important hypotheses on the regulation of myo-inositol metabolism and phytohormone signaling in developing low phytic acid soybeans. The computational pipeline used for unsupervised network learning in this study is provided as open source software and is freely available at https://lilabatvt.github.io/LPANetwork/.
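    The simplest member of the network-inference family such pipelines build on scores candidate edges by expression correlation across time points; the paper combines several more sophisticated algorithms, so the following is only a baseline sketch on synthetic data:

    ```python
    # Baseline sketch: correlation-based edge scoring for network inference.
    # The threshold and data are arbitrary illustrative choices.
    import numpy as np

    rng = np.random.default_rng(0)
    n_genes, n_timepoints = 20, 30
    expr = rng.normal(size=(n_genes, n_timepoints))
    expr[3] = expr[0] * 0.9 + rng.normal(scale=0.3, size=n_timepoints)  # planted edge

    corr = np.corrcoef(expr)                 # gene-by-gene correlation matrix
    np.fill_diagonal(corr, 0.0)
    edges = np.argwhere(np.abs(corr) > 0.8)  # threshold is an arbitrary choice
    print(edges)                             # should recover the planted 0 <-> 3 edge
    ```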

  13. Impact of noise on molecular network inference.

    Directory of Open Access Journals (Sweden)

    Radhakrishnan Nagarajan

Molecular entities work in concert as a system and mediate phenotypic outcomes and disease states. There has been recent interest in modelling the associations between molecular entities from their observed expression profiles as networks using a battery of algorithms. These networks have proven to be useful abstractions of the underlying pathways and signalling mechanisms. Noise is ubiquitous in molecular data and can have a pronounced effect on the inferred network. Noise can be an outcome of several factors including: inherent stochastic mechanisms at the molecular level, variation in the abundance of molecules, heterogeneity, sensitivity of the biological assay, or measurement artefacts prevalent especially in high-throughput settings. The present study investigates the impact of discrepancies in noise variance on pairwise dependencies, conditional dependencies and constraint-based Bayesian network structure learning algorithms that incorporate conditional independence tests as a part of the learning process. Popular network motifs and fundamental connections, namely (a) common-effect, (b) three-chain, and (c) coherent type-I feed-forward loop (FFL), are investigated. The choice of these elementary networks can be attributed to their prevalence across more complex networks. Analytical expressions elucidating the impact of discrepancies in noise variance on pairwise dependencies and conditional dependencies for special cases of these motifs are presented. Subsequently, the impact of noise on two popular constraint-based Bayesian network structure learning algorithms, Grow-Shrink (GS) and Incremental Association Markov Blanket (IAMB), which implicitly incorporate tests for conditional independence, is investigated. Finally, the impact of noise on networks inferred from publicly available single-cell molecular expression profiles is investigated. While discrepancies in noise variance are overlooked in routine molecular network inference, the …

  14. Ensemble stacking mitigates biases in inference of synaptic connectivity.

    Science.gov (United States)

    Chambers, Brendan; Levy, Maayan; Dechery, Joseph B; MacLean, Jason N

    2018-01-01

    A promising alternative to directly measuring the anatomical connections in a neuronal population is inferring the connections from the activity. We employ simulated spiking neuronal networks to compare and contrast commonly used inference methods that identify likely excitatory synaptic connections using statistical regularities in spike timing. We find that simple adjustments to standard algorithms improve inference accuracy: A signing procedure improves the power of unsigned mutual-information-based approaches and a correction that accounts for differences in mean and variance of background timing relationships, such as those expected to be induced by heterogeneous firing rates, increases the sensitivity of frequency-based methods. We also find that different inference methods reveal distinct subsets of the synaptic network and each method exhibits different biases in the accurate detection of reciprocity and local clustering. To correct for errors and biases specific to single inference algorithms, we combine methods into an ensemble. Ensemble predictions, generated as a linear combination of multiple inference algorithms, are more sensitive than the best individual measures alone, and are more faithful to ground-truth statistics of connectivity, mitigating biases specific to single inference methods. These weightings generalize across simulated datasets, emphasizing the potential for the broad utility of ensemble-based approaches.
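    The linear-combination step can be sketched as standard stacking: fit a logistic meta-model on individual connectivity scores against known ground truth. The two scores below are synthetic stand-ins for the mutual-information and frequency-based measures discussed:

    ```python
    # Sketch of ensemble stacking over connectivity scores (synthetic data).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n_pairs = 2000
    truth = rng.random(n_pairs) < 0.1             # 10% of pairs truly connected

    # two noisy inference scores, each weakly informative on its own
    score_a = truth + rng.normal(scale=1.0, size=n_pairs)
    score_b = truth + rng.normal(scale=1.2, size=n_pairs)
    features = np.column_stack([score_a, score_b])

    stack = LogisticRegression().fit(features, truth)
    combined = stack.predict_proba(features)[:, 1]  # ensemble score per pair

    for name, s in [("A", score_a), ("B", score_b), ("stacked", combined)]:
        top = np.argsort(s)[-200:]                  # 200 best-ranked pairs
        print(name, "precision@200:", truth[top].mean())
    ```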

  15. Stochastic processes inference theory

    CERN Document Server

    Rao, Malempati M

    2014-01-01

    This is the revised and enlarged 2nd edition of the authors’ original text, which was intended to be a modest complement to Grenander's fundamental memoir on stochastic processes and related inference theory. The present volume gives a substantial account of regression analysis, both for stochastic processes and measures, and includes recent material on Ridge regression with some unexpected applications, for example in econometrics. The first three chapters can be used for a quarter or semester graduate course on inference on stochastic processes. The remaining chapters provide more advanced material on stochastic analysis suitable for graduate seminars and discussions, leading to dissertation or research work. In general, the book will be of interest to researchers in probability theory, mathematical statistics and electrical and information theory.

  16. Russell and Humean Inferences

    Directory of Open Access Journals (Sweden)

    João Paulo Monteiro

    2001-12-01

Russell's The Problems of Philosophy tries to establish a new theory of induction, at the same time that Hume is there accused of an "irrational scepticism about induction". But a careful analysis of the theory of knowledge explicitly acknowledged by Hume reveals that, contrary to the standard interpretation in the twentieth century, possibly influenced by Russell, Hume deals exclusively with causal inference (which he never classifies as "causal induction", although now we are entitled to do so), never with inductive inference in general, mainly generalizations about sensible qualities of objects (whether, e.g., "all crows are black" or not is not among Hume's concerns). Russell's theories are thus only false alternatives to Hume's, in (1912) or in his (1948).

  17. Efficient algorithms for conditional independence inference

    Czech Academy of Sciences Publication Activity Database

    Bouckaert, R.; Hemmecke, R.; Lindner, S.; Studený, Milan

    2010-01-01

Vol. 11, No. 1 (2010), pp. 3453-3479. ISSN 1532-4435. R&D Projects: GA ČR GA201/08/0539; GA MŠk 1M0572. Institutional research plan: CEZ:AV0Z10750506. Keywords: conditional independence inference; linear programming approach. Subject RIV: BA - General Mathematics. Impact factor: 2.949, year: 2010. http://library.utia.cas.cz/separaty/2010/MTR/studeny-efficient algorithms for conditional independence inference.pdf

  18. LIKELIHOOD-FREE COSMOLOGICAL INFERENCE WITH TYPE Ia SUPERNOVAE: APPROXIMATE BAYESIAN COMPUTATION FOR A COMPLETE TREATMENT OF UNCERTAINTY

    Energy Technology Data Exchange (ETDEWEB)

    Weyant, Anja; Wood-Vasey, W. Michael [Pittsburgh Particle Physics, Astrophysics, and Cosmology Center (PITT PACC), Physics and Astronomy Department, University of Pittsburgh, Pittsburgh, PA 15260 (United States); Schafer, Chad, E-mail: anw19@pitt.edu [Department of Statistics, Carnegie Mellon University, Pittsburgh, PA 15213 (United States)

    2013-02-20

    Cosmological inference becomes increasingly difficult when complex data-generating processes cannot be modeled by simple probability distributions. With the ever-increasing size of data sets in cosmology, there is an increasing burden placed on adequate modeling; systematic errors in the model will dominate where previously these were swamped by statistical errors. For example, Gaussian distributions are an insufficient representation for errors in quantities like photometric redshifts. Likewise, it can be difficult to quantify analytically the distribution of errors that are introduced in complex fitting codes. Without a simple form for these distributions, it becomes difficult to accurately construct a likelihood function for the data as a function of parameters of interest. Approximate Bayesian computation (ABC) provides a means of probing the posterior distribution when direct calculation of a sufficiently accurate likelihood is intractable. ABC allows one to bypass direct calculation of the likelihood but instead relies upon the ability to simulate the forward process that generated the data. These simulations can naturally incorporate priors placed on nuisance parameters, and hence these can be marginalized in a natural way. We present and discuss ABC methods in the context of supernova cosmology using data from the SDSS-II Supernova Survey. Assuming a flat cosmology and constant dark energy equation of state, we demonstrate that ABC can recover an accurate posterior distribution. Finally, we show that ABC can still produce an accurate posterior distribution when we contaminate the sample with Type IIP supernovae.

  19. LIKELIHOOD-FREE COSMOLOGICAL INFERENCE WITH TYPE Ia SUPERNOVAE: APPROXIMATE BAYESIAN COMPUTATION FOR A COMPLETE TREATMENT OF UNCERTAINTY

    International Nuclear Information System (INIS)

    Weyant, Anja; Wood-Vasey, W. Michael; Schafer, Chad

    2013-01-01

    Cosmological inference becomes increasingly difficult when complex data-generating processes cannot be modeled by simple probability distributions. With the ever-increasing size of data sets in cosmology, there is an increasing burden placed on adequate modeling; systematic errors in the model will dominate where previously these were swamped by statistical errors. For example, Gaussian distributions are an insufficient representation for errors in quantities like photometric redshifts. Likewise, it can be difficult to quantify analytically the distribution of errors that are introduced in complex fitting codes. Without a simple form for these distributions, it becomes difficult to accurately construct a likelihood function for the data as a function of parameters of interest. Approximate Bayesian computation (ABC) provides a means of probing the posterior distribution when direct calculation of a sufficiently accurate likelihood is intractable. ABC allows one to bypass direct calculation of the likelihood but instead relies upon the ability to simulate the forward process that generated the data. These simulations can naturally incorporate priors placed on nuisance parameters, and hence these can be marginalized in a natural way. We present and discuss ABC methods in the context of supernova cosmology using data from the SDSS-II Supernova Survey. Assuming a flat cosmology and constant dark energy equation of state, we demonstrate that ABC can recover an accurate posterior distribution. Finally, we show that ABC can still produce an accurate posterior distribution when we contaminate the sample with Type IIP supernovae.
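    A rejection-ABC loop in miniature, using a toy Gaussian-mean problem in place of the supernova analysis; prior range, tolerance, and summary statistic are arbitrary choices for illustration:

    ```python
    # Minimal rejection-ABC sketch (toy model, not the supernova analysis).
    import numpy as np

    rng = np.random.default_rng(0)
    observed = rng.normal(loc=0.7, scale=1.0, size=200)   # "data"
    obs_summary = observed.mean()                         # summary statistic

    accepted = []
    for _ in range(50000):
        theta = rng.uniform(-3, 3)                        # draw from the prior
        sim = rng.normal(loc=theta, scale=1.0, size=200)  # forward-simulate
        if abs(sim.mean() - obs_summary) < 0.05:          # keep close matches
            accepted.append(theta)

    post = np.array(accepted)
    print(f"ABC posterior: {post.mean():.2f} +/- {post.std():.2f}")
    ```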

  20. State-Space Inference and Learning with Gaussian Processes

    OpenAIRE

    Turner, R; Deisenroth, MP; Rasmussen, CE

    2010-01-01

State-space inference and learning with Gaussian processes (GPs) is an unsolved problem. We propose a new, general methodology for inference and learning in nonlinear state-space models that are described probabilistically by non-parametric GP models. We apply the expectation maximization algorithm to iterate between inference in the latent state-space and learning the parameters of the underlying GP dynamics model. …

  1. Microsoft excel's automatic data processing and diagram drawing of RIA internal quality control parameters

    International Nuclear Information System (INIS)

    Zeng Pingfan; Liu Guoqiang

    2006-01-01

We performed automatic data processing and diagram drawing for various parameters of RIA's internal quality control (IQC) using Microsoft Excel (ME). Using ME's AVERAGE and STDEV functions, we obtained x-bar, s and CV%. With PEARSON, we obtained the serum quality control coefficients (r). Inputting the original data into the diagram's self-definition item, the diagram was drawn automatically. By using logical tests, we obtained the quality control judgments, with the status, timing and data of the various quality control parameters. Over the past four years, the ME data processing, diagram drawing and quality control judgments have proven accurate, convenient and correct. The method is quick, easy to manage, and realizes automatic computer processing of RIA's IQC. Conclusion: the method is applicable to all types of RIA's IQC. (authors)
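    The same IQC quantities are easy to reproduce outside Excel; a sketch follows (control values are invented, and the 2s in-control rule is one common convention, not necessarily the authors'):

    ```python
    # IQC quantities: AVERAGE -> mean, STDEV -> sample SD, CV% = 100*s/xbar.
    import statistics

    control_values = [5.1, 4.9, 5.3, 5.0, 4.8, 5.2]   # invented control results

    xbar = statistics.mean(control_values)
    s = statistics.stdev(control_values)               # sample SD, like STDEV
    cv_percent = 100.0 * s / xbar

    # simple 2s rule as an illustrative in-control check
    in_control = all(abs(v - xbar) <= 2 * s for v in control_values)
    print(f"xbar={xbar:.3f}  s={s:.3f}  CV%={cv_percent:.2f}  in control: {in_control}")
    ```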

  2. Super learning to hedge against incorrect inference from arbitrary parametric assumptions in marginal structural modeling.

    Science.gov (United States)

    Neugebauer, Romain; Fireman, Bruce; Roy, Jason A; Raebel, Marsha A; Nichols, Gregory A; O'Connor, Patrick J

    2013-08-01

    Clinical trials are unlikely to ever be launched for many comparative effectiveness research (CER) questions. Inferences from hypothetical randomized trials may however be emulated with marginal structural modeling (MSM) using observational data, but success in adjusting for time-dependent confounding and selection bias typically relies on parametric modeling assumptions. If these assumptions are violated, inferences from MSM may be inaccurate. In this article, we motivate the application of a data-adaptive estimation approach called super learning (SL) to avoid reliance on arbitrary parametric assumptions in CER. Using the electronic health records data from adults with new-onset type 2 diabetes, we implemented MSM with inverse probability weighting (IPW) estimation to evaluate the effect of three oral antidiabetic therapies on the worsening of glomerular filtration rate. Inferences from IPW estimation were noticeably sensitive to the parametric assumptions about the associations between both the exposure and censoring processes and the main suspected source of confounding, that is, time-dependent measurements of hemoglobin A1c. SL was successfully implemented to harness flexible confounding and selection bias adjustment from existing machine learning algorithms. Erroneous IPW inference about clinical effectiveness because of arbitrary and incorrect modeling decisions may be avoided with SL.
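
    A toy sketch of the IPW step, assuming a single binary exposure and a synthetic confounder standing in for the time-dependent hemoglobin A1c measurements; super learning would replace the lone logistic regression below with a cross-validated ensemble of candidate learners.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)

    # hypothetical data: confounder columns (first one drives treatment choice)
    L = rng.normal(size=(5000, 3))
    A = rng.binomial(1, 1.0 / (1.0 + np.exp(-L[:, 0])))   # exposure depends on L

    # parametric propensity model: P(A = 1 | L)
    ps = LogisticRegression().fit(L, A).predict_proba(L)[:, 1]

    # stabilized inverse probability weights
    p_A = A.mean()
    sw = np.where(A == 1, p_A / ps, (1.0 - p_A) / (1.0 - ps))
    ```

    In scikit-learn, StackingClassifier is one off-the-shelf way to assemble the kind of multi-algorithm ensemble that SL uses in place of the single parametric propensity model.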

  3. Enhancing Transparency and Control When Drawing Data-Driven Inferences About Individuals.

    Science.gov (United States)

    Chen, Daizhuo; Fraiberger, Samuel P; Moakler, Robert; Provost, Foster

    2017-09-01

    Recent studies show the remarkable power of fine-grained information disclosed by users on social network sites to infer users' personal characteristics via predictive modeling. Similar fine-grained data are being used successfully in other commercial applications. In response, attention is turning increasingly to the transparency that organizations provide to users as to what inferences are drawn and why, as well as to what sort of control users can be given over inferences that are drawn about them. In this article, we focus on inferences about personal characteristics based on information disclosed by users' online actions. As a use case, we explore personal inferences that are made possible from "Likes" on Facebook. We first present a means for providing transparency into the information responsible for inferences drawn by data-driven models. We then introduce the "cloaking device", a mechanism for users to inhibit the use of particular pieces of information in inference. Using these analytical tools we ask two main questions: (1) How much information must users cloak to significantly affect inferences about their personal traits? We find that usually users must cloak only a small portion of their actions to inhibit inference. We also find that, encouragingly, false-positive inferences are significantly easier to cloak than true-positive inferences. (2) Can firms change their modeling behavior to make cloaking more difficult? The answer is a definitive yes. We demonstrate a simple modeling change that requires users to cloak substantially more information to affect the inferences drawn. The upshot is that organizations can provide transparency and control even into complicated, predictive model-driven inferences, but they also can make control easier or harder for their users.
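
    A hedged sketch of one way a cloaking mechanism could work with a linear model (the paper's own procedure may differ): greedily zero out the active features with the largest positive coefficients until the predicted probability drops below the decision threshold.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def cloak(model: LogisticRegression, likes: np.ndarray, threshold: float = 0.5):
        """Greedily cloak (zero out) the active 'Likes' that contribute most to a
        positive inference until the predicted probability falls below threshold."""
        x = likes.astype(float).copy()
        cloaked = []
        while model.predict_proba(x.reshape(1, -1))[0, 1] >= threshold:
            active = np.flatnonzero(x)          # Likes the user could still cloak
            if active.size == 0:
                break
            # for binary features, the contribution of Like j is its coefficient
            j = active[np.argmax(model.coef_[0][active])]
            if model.coef_[0][j] <= 0:
                break  # remaining active features no longer push the score up
            x[j] = 0.0
            cloaked.append(j)
        return cloaked
    ```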

  4. Automatic intelligent cruise control

    OpenAIRE

    Stanton, NA; Young, MS

    2006-01-01

    This paper reports a study on the evaluation of automatic intelligent cruise control (AICC) from a psychological perspective. It was anticipated that AICC would have an effect upon the psychology of driving: namely, make the driver feel they have less control, reduce the level of trust in the vehicle, and make drivers less situationally aware, but it might also reduce workload and make driving less stressful. Drivers were asked to drive in a driving simulator under manual and automatic inte...

  5. Control System Design for Automatic Cavity Tuning Machines

    Energy Technology Data Exchange (ETDEWEB)

    Carcagno, R.; Khabiboulline, T.; Kotelnikov, S.; Makulski, A.; Nehring, R.; Nogiec, J.; Ross, M.; Schappert, W.; /Fermilab; Goessel, A.; Iversen, J.; Klinke, D.; /DESY

    2009-05-01

    A series of four automatic tuning machines for 9-cell TESLA-type cavities are being developed and fabricated in a collaborative effort among DESY, FNAL, and KEK. These machines are intended to support high-throughput cavity fabrication for construction of large SRF-based accelerator projects. Two of these machines will be delivered to cavity vendors for the tuning of XFEL cavities. The control system for these machines must support a high level of automation adequate for industrial use by non-expert operators. This paper describes the control system hardware and software design for these machines.

  6. Control System Design for Automatic Cavity Tuning Machines

    International Nuclear Information System (INIS)

    Carcagno, R.; Khabiboulline, T.; Kotelnikov, S.; Makulski, A.; Nehring, R.; Nogiec, J.; Ross, M.; Schappert, W.; Goessel, A.; Iversen, J.; Klinke, D.

    2009-01-01

    A series of four automatic tuning machines for 9-cell TESLA-type cavities are being developed and fabricated in a collaborative effort among DESY, FNAL, and KEK. These machines are intended to support high-throughput cavity fabrication for construction of large SRF-based accelerator projects. Two of these machines will be delivered to cavity vendors for the tuning of XFEL cavities. The control system for these machines must support a high level of automation adequate for industrial use by non-expert operators. This paper describes the control system hardware and software design for these machines.

  7. HIERARCHICAL PROBABILISTIC INFERENCE OF COSMIC SHEAR

    International Nuclear Information System (INIS)

    Schneider, Michael D.; Dawson, William A.; Hogg, David W.; Marshall, Philip J.; Bard, Deborah J.; Meyers, Joshua; Lang, Dustin

    2015-01-01

    Point estimators for the shearing of galaxy images induced by gravitational lensing involve a complex inverse problem in the presence of noise, pixelization, and model uncertainties. We present a probabilistic forward modeling approach to gravitational lensing inference that has the potential to mitigate the biased inferences in most common point estimators and is practical for upcoming lensing surveys. The first part of our statistical framework requires specification of a likelihood function for the pixel data in an imaging survey given parameterized models for the galaxies in the images. We derive the lensing shear posterior by marginalizing over all intrinsic galaxy properties that contribute to the pixel data (i.e., not limited to galaxy ellipticities) and learn the distributions for the intrinsic galaxy properties via hierarchical inference with a suitably flexible conditional probability distribution specification. We use importance sampling to separate the modeling of small imaging areas from the global shear inference, thereby rendering our algorithm computationally tractable for large surveys. With simple numerical examples we demonstrate the improvements in accuracy from our importance sampling approach, as well as the significance of the conditional distribution specification for the intrinsic galaxy properties when the data are generated from an unknown number of distinct galaxy populations with different morphological characteristics.

  8. Inverse Ising inference with correlated samples

    International Nuclear Information System (INIS)

    Obermayer, Benedikt; Levine, Erel

    2014-01-01

    Correlations between two variables of a high-dimensional system can be indicative of an underlying interaction, but can also result from indirect effects. Inverse Ising inference is a method to distinguish one from the other. Essentially, the parameters of the least constrained statistical model are learned from the observed correlations such that direct interactions can be separated from indirect correlations. Among many other applications, this approach has been helpful for protein structure prediction, because residues which interact in the 3D structure often show correlated substitutions in a multiple sequence alignment. In this context, samples used for inference are not independent but share an evolutionary history on a phylogenetic tree. Here, we discuss the effects of correlations between samples on global inference. Such correlations could arise due to phylogeny but also via other slow dynamical processes. We present a simple analytical model to address the resulting inference biases, and develop an exact method accounting for background correlations in alignment data by combining phylogenetic modeling with an adaptive cluster expansion algorithm. We find that popular reweighting schemes are only marginally effective at removing phylogenetic bias, suggest a rescaling strategy that yields better results, and provide evidence that our conclusions carry over to the frequently used mean-field approach to the inverse Ising problem. (paper)
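
    A compact illustration of the two ingredients discussed above, under the naive mean-field approximation to the inverse Ising problem; the paper's adaptive cluster expansion and explicit phylogenetic correction are substantially more involved.

    ```python
    import numpy as np

    def mean_field_couplings(spins: np.ndarray, weights=None) -> np.ndarray:
        """Naive mean-field inverse Ising: couplings J ~ -C^{-1}, where C is the
        connected correlation matrix of +/-1 spin samples (rows = samples)."""
        C = np.cov(spins, rowvar=False, aweights=weights)
        J = -np.linalg.inv(C)
        np.fill_diagonal(J, 0.0)  # keep only pairwise couplings
        return J

    def sequence_weights(spins: np.ndarray, theta: float = 0.2) -> np.ndarray:
        """The popular reweighting scheme the paper critiques: down-weight each
        sample by the number of samples within normalized Hamming distance theta."""
        d = (spins[:, None, :] != spins[None, :, :]).mean(axis=-1)
        return 1.0 / (d < theta).sum(axis=1)
    ```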

  9. Bayesian structural inference for hidden processes

    Science.gov (United States)

    Strelioff, Christopher C.; Crutchfield, James P.

    2014-04-01

    We introduce a Bayesian approach to discovering patterns in structurally complex processes. The proposed method of Bayesian structural inference (BSI) relies on a set of candidate unifilar hidden Markov model (uHMM) topologies for inference of process structure from a data series. We employ a recently developed exact enumeration of topological ɛ-machines. (A sequel then removes the topological restriction.) This subset of the uHMM topologies has the added benefit that inferred models are guaranteed to be ɛ-machines, irrespective of estimated transition probabilities. Properties of ɛ-machines and uHMMs allow for the derivation of analytic expressions for estimating transition probabilities, inferring start states, and comparing the posterior probability of candidate model topologies, despite process internal structure being only indirectly present in data. We demonstrate BSI's effectiveness in estimating a process's randomness, as reflected by the Shannon entropy rate, and its structure, as quantified by the statistical complexity. We also compare using the posterior distribution over candidate models and the single, maximum a posteriori model for point estimation and show that the former more accurately reflects uncertainty in estimated values. We apply BSI to in-class examples of finite- and infinite-order Markov processes, as well as to an out-of-class, infinite-state hidden process.

  10. The Impact of Disablers on Predictive Inference

    Science.gov (United States)

    Cummins, Denise Dellarosa

    2014-01-01

    People consider alternative causes when deciding whether a cause is responsible for an effect (diagnostic inference) but appear to neglect them when deciding whether an effect will occur (predictive inference). Five experiments were conducted to test a 2-part explanation of this phenomenon: namely, (a) that people interpret standard predictive…

  11. Automatic radioxenon analyzer for CTBT monitoring

    International Nuclear Information System (INIS)

    Bowyer, T.W.; Abel, K.H.; Hensley, W.K.

    1996-12-01

    Over the past 3 years, with support from US DOE's NN-20 Comprehensive Test Ban Treaty (CTBT) R&D program, PNNL has developed and demonstrated a fully automatic analyzer for collecting and measuring the four Xe radionuclides, 131mXe (11.9 d), 133mXe (2.19 d), 133Xe (5.24 d), and 135Xe (9.10 h), in the atmosphere. These radionuclides are important signatures in monitoring for compliance with a CTBT. Activity ratios permit discriminating radioxenon from a nuclear detonation from that produced by nuclear reactor operations, nuclear fuel reprocessing, or medical isotope production and usage. In the analyzer, Xe is continuously and automatically separated from the atmosphere at flow rates of about 7 m³/h on a sorption bed. Aliquots collected for 6-12 h are automatically analyzed by electron-photon coincidence spectrometry to produce sensitivities in the range of 20-100 μBq/m³ of air, about 100-fold better than reported laboratory-based procedures for short collection intervals. Spectral data are automatically analyzed, and the calculated radioxenon concentrations and raw gamma-ray spectra are automatically transmitted to data centers.

  12. Radio-controlled automatic gas meter-reading system; Releve automatique de compteur par radio

    Energy Technology Data Exchange (ETDEWEB)

    Yasui, M. [Osaka Gas Co., Ltd (Japan); Ishikawa, K.; Fujiwara, J. [Tokyo Gas Co., Ltd. (Japan); Ichihashi, T. [Toho Gas Co., Ltd. (Japan)

    2000-07-01

    In Japan, an automatic gas meter-reading system is in operation that also incorporates functions for monitoring abnormalities in gas use and for remote-controlled emergency gas supply shutoff. This system has been realized by linking microcomputer-controlled gas meters (called 'intelligent gas meters') equipped with an automatic shutoff mechanism to the gas utility company's operation center via communication lines. While the present system uses cable communication lines, Tokyo Gas Co., Ltd., Osaka Gas Co., Ltd. and Toho Gas Co., Ltd. have jointly developed a new system based on radio communication. This paper introduces this new system. While radio-controlled meter-reading systems are used in many countries around the world solely for automatic meter reading, our recently developed system is also capable of monitoring for abnormalities in gas use and remote-controlled emergency gas supply shutoff, thanks to its almost real-time two-way communication function. The new system can serve for a period of ten years without recharging. It is also characterized by its applicability as different systems according to purpose: 1) conventional automatic meter-reading system (terminal network control unit or T-NCU), 2) large-scale radio-controlled meter-reading system, and 3) portable terminal-type radio-controlled meter-reading system. (authors)

  13. Using Historical Data to Automatically Identify Air-Traffic Control Behavior

    Science.gov (United States)

    Lauderdale, Todd A.; Wu, Yuefeng; Tretto, Celeste

    2014-01-01

    This project seeks to develop statistically based machine learning models to characterize the types of errors present when using current systems to predict future aircraft states. These models will be data-driven, based on large quantities of historical data. Once developed, they will be used to infer situations in the historical data where an air-traffic controller intervened on an aircraft's route, even when there is no direct recording of this action.

  14. Causal inference and longitudinal data: a case study of religion and mental health.

    Science.gov (United States)

    VanderWeele, Tyler J; Jackson, John W; Li, Shanshan

    2016-11-01

    We provide an introduction to causal inference with longitudinal data and discuss the complexities of analysis and interpretation when exposures can vary over time. We consider what types of causal questions can be addressed with standard regression-based analyses and what types of covariate control and control for the prior values of outcome and exposure must be made to reason about causal effects. We also consider newer classes of causal models, including marginal structural models, that can assess questions of the joint effects of time-varying exposures and can take into account feedback between the exposure and outcome over time. Such feedback renders cross-sectional data ineffective for drawing inferences about causation. The challenges are illustrated by analyses concerning potential effects of religious service attendance on depression, in which there may in fact be effects in both directions, with service attendance preventing subsequent depression but depression itself leading to lower levels of subsequent religious service attendance. Longitudinal designs, with careful control for prior exposures, outcomes, and confounders, and suitable methodology, will strengthen research on mental health, religion and health, and the biomedical and social sciences generally.

  15. Inference as Prediction

    Science.gov (United States)

    Watson, Jane

    2007-01-01

    Inference, or decision making, is seen in curriculum documents as the final step in a statistical investigation. For a formal statistical enquiry this may be associated with sophisticated tests involving probability distributions. For young students without the mathematical background to perform such tests, it is still possible to draw informal…

  16. Improving Precision of Types

    DEFF Research Database (Denmark)

    Winther, Johnni

    Types in programming languages provide a powerful tool for the programmer to document the code so that a large aspect of the intent can not only be presented to fellow programmers but also be checked automatically by compilers. The precision with which types model the behavior of programs is crucial to the quality of these automated checks, and in this thesis we present three different improvements to the precision of types in three different aspects of the Java programming language. First we show how to extend the type system in Java with a new type which enables the detection of unintended...

  17. Type checking by domain analysis in Ampersand

    NARCIS (Netherlands)

    Joosten, S.M.M.; Joosten, S.J.C.; Kahl, W.; Winter, M.; Oliveira, J.N.

    2015-01-01

    In the process of incorporating subtyping in relation algebra, an algorithm was found to derive the subtyping relation from the program to be checked. By using domain analysis rather than type inference, this algorithm offers an attractive visualization of the type derivation process. This

  18. Problem solving and inference mechanisms

    Energy Technology Data Exchange (ETDEWEB)

    Furukawa, K; Nakajima, R; Yonezawa, A; Goto, S; Aoyama, A

    1982-01-01

    The heart of the fifth generation computer will be powerful mechanisms for problem solving and inference. A deduction-oriented language is to be designed, which will form the core of the whole computing system. The language is based on predicate logic with the extended features of structuring facilities, meta structures and relational data base interfaces. Parallel computation mechanisms and specialized hardware architectures are being investigated to make possible efficient realization of the language features. The project includes research into an intelligent programming system, a knowledge representation language and system, and a meta inference system to be built on the core. 30 references.

  19. Recent Advances in System Reliability: Signatures, Multi-state Systems and Statistical Inference

    CERN Document Server

    Frenkel, Ilia

    2012-01-01

    Recent Advances in System Reliability discusses developments in modern reliability theory such as signatures, multi-state systems and statistical inference. It describes the latest achievements in these fields, and covers the application of these achievements to reliability engineering practice. The chapters cover a wide range of new theoretical subjects and have been written by leading experts in reliability theory and its applications.  The topics include: concepts and different definitions of signatures (D-spectra),  their  properties and applications  to  reliability of coherent systems and network-type structures; Lz-transform of Markov stochastic process and its application to multi-state system reliability analysis; methods for cost-reliability and cost-availability analysis of multi-state systems; optimal replacement and protection strategy; and statistical inference. Recent Advances in System Reliability presents many examples to illustrate the theoretical results. Real world multi-state systems...

  20. Automatic plasma control in magnetic traps

    International Nuclear Information System (INIS)

    Samojlenko, Y.; Chuyanov, V.

    1984-01-01

    Hot plasma is essentially in thermodynamic non-steady state. Automatic plasma control basically means monitoring deviations from steady state and producing a suitable magnetic or electric field which brings the plasma back to its original state. Briefly described are two systems of automatic plasma control: control with a magnetic field using a negative impedance circuit, and control using an electric field. It appears that systems of automatic plasma stabilization will be an indispensable component of the fusion reactor and its possibilities will in many ways determine the reactor economy. (Ha)

  1. Elements of Causal Inference: Foundations and Learning Algorithms

    DEFF Research Database (Denmark)

    Peters, Jonas Martin; Janzing, Dominik; Schölkopf, Bernhard

    A concise and self-contained introduction to causal inference, increasingly important in data science and machine learning.

  2. Word Processing in Dyslexics: An Automatic Decoding Deficit?

    Science.gov (United States)

    Yap, Regina; van der Leij, Aryan

    1993-01-01

    Compares dyslexic children with normal readers on measures of phonological decoding and automatic word processing. Finds that dyslexics have a deficit in automatic phonological decoding skills. Discusses results within the framework of the phonological deficit and the automatization deficit hypotheses. (RS)

  3. Automatic test equipment for C&I of compact LWR

    International Nuclear Information System (INIS)

    Mayya, Anuradha; Marathe, P.P.; Madala, Kalyan C.

    2014-01-01

    The C&I of the compact LWR consists of a wide variety of electronic modules. Testing these modules manually was found to be very cumbersome. To ease their testing, Automatic Test Equipment (ATE) was developed jointly by BARC and ECIL. This paper describes the design of two ATEs for testing 69 types of modules. A power supply ATE was developed for 43 types of power supply modules of type AC-AC, AC-DC, DC-DC and signal conditioning modules. A VME ATE was developed to test 26 types of VME bus based and other microcontroller based non-bussed modules. These ATEs are used for automated black-box testing of modules by feeding power and control inputs and checking the outputs without operator intervention. This paper describes the important considerations in design and the major design challenges. (author)

  4. Bayesian methods for hackers probabilistic programming and Bayesian inference

    CERN Document Server

    Davidson-Pilon, Cameron

    2016-01-01

    Bayesian methods of inference are deeply natural and extremely powerful. However, most discussions of Bayesian inference rely on intensely complex mathematical analyses and artificial examples, making it inaccessible to anyone without a strong mathematical background. Now, though, Cameron Davidson-Pilon introduces Bayesian inference from a computational perspective, bridging theory to practice–freeing you to get results using computing power. Bayesian Methods for Hackers illuminates Bayesian inference through probabilistic programming with the powerful PyMC language and the closely related Python tools NumPy, SciPy, and Matplotlib. Using this approach, you can reach effective solutions in small increments, without extensive mathematical intervention. Davidson-Pilon begins by introducing the concepts underlying Bayesian inference, comparing it with other techniques and guiding you through building and training your first Bayesian model. Next, he introduces PyMC through a series of detailed examples a...
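
    As a flavour of the computational style the book advocates (shown here without PyMC, using the conjugate closed form so the snippet stays dependency-light), here is coin-bias inference from made-up flip counts:

    ```python
    from scipy import stats

    # hypothetical data: 140 heads in 250 flips; uniform Beta(1, 1) prior
    heads, flips = 140, 250
    posterior = stats.beta(1 + heads, 1 + flips - heads)

    print(posterior.mean())          # posterior mean of the coin's bias
    print(posterior.interval(0.95))  # 95% credible interval
    ```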

  5. Using Approximate Bayesian Computation to infer sex ratios from acoustic data.

    Science.gov (United States)

    Lehnen, Lisa; Schorcht, Wigbert; Karst, Inken; Biedermann, Martin; Kerth, Gerald; Puechmaille, Sebastien J

    2018-01-01

    Population sex ratios are of high ecological relevance, but are challenging to determine in species lacking conspicuous external cues indicating their sex. Acoustic sexing is an option if vocalizations differ between sexes, but is precluded by overlapping distributions of the values of male and female vocalizations in many species. A method allowing the inference of sex ratios despite such an overlap will therefore greatly increase the information extractable from acoustic data. To meet this demand, we developed a novel approach using Approximate Bayesian Computation (ABC) to infer the sex ratio of populations from acoustic data. Additionally, parameters characterizing the male and female distribution of acoustic values (mean and standard deviation) are inferred. This information is then used to probabilistically assign a sex to a single acoustic signal. We furthermore develop a simpler means of sex ratio estimation based on the exclusion of calls from the overlap zone. Applying our methods to simulated data demonstrates that sex ratio and acoustic parameter characteristics of males and females are reliably inferred by the ABC approach. Applying both the ABC and the exclusion method to empirical datasets (echolocation calls recorded in colonies of lesser horseshoe bats, Rhinolophus hipposideros) provides similar sex ratios as molecular sexing. Our methods aim to facilitate evidence-based conservation, and to benefit scientists investigating ecological or conservation questions related to sex- or group specific behaviour across a wide range of organisms emitting acoustic signals. The developed methodology is non-invasive, low-cost and time-efficient, thus allowing the study of many sites and individuals. We provide an R-script for the easy application of the method and discuss potential future extensions and fields of applications. The script can be easily adapted to account for numerous biological systems by adjusting the type and number of groups to be
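
    A sketch of the simpler exclusion estimator described above, assuming females call above and males below an overlap zone in peak frequency; the thresholds are hypothetical and would in practice come from reference recordings of known-sex individuals.

    ```python
    import numpy as np

    def sex_ratio_by_exclusion(freqs, male_max=110.0, female_min=112.0):
        """Discard calls whose peak frequency (kHz) falls in the male/female
        overlap zone [male_max, female_min] and take the ratio among the
        remaining, unambiguous calls."""
        freqs = np.asarray(freqs)
        females = (freqs > female_min).sum()
        males = (freqs < male_max).sum()
        return females / (females + males)
    ```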

  6. Mathematical modelling and quality indices optimization of automatic control systems of reactor facility

    International Nuclear Information System (INIS)

    Severin, V.P.

    2007-01-01

    The mathematical modeling of automatic control systems of reactor facility WWER-1000 with various regulator types is considered. The linear and nonlinear models of neutron power control systems of nuclear reactor WWER-1000 with various group numbers of delayed neutrons are designed. The results of optimization of direct quality indexes of neutron power control systems of nuclear reactor WWER-1000 are designed. The identification and optimization of level control systems with various regulator types of steam generator are executed

  7. Causal inference in econometrics

    CERN Document Server

    Kreinovich, Vladik; Sriboonchitta, Songsak

    2016-01-01

    This book is devoted to the analysis of causal inference which is one of the most difficult tasks in data analysis: when two phenomena are observed to be related, it is often difficult to decide whether one of them causally influences the other one, or whether these two phenomena have a common cause. This analysis is the main focus of this volume. To get a good understanding of the causal inference, it is important to have models of economic phenomena which are as accurate as possible. Because of this need, this volume also contains papers that use non-traditional economic models, such as fuzzy models and models obtained by using neural networks and data mining techniques. It also contains papers that apply different econometric models to analyze real-life economic dependencies.

  8. Implementation of automatic protection switching in an optical cross connect

    OpenAIRE

    Uy, Jason

    2005-01-01

    Having a reliable network is a hard requirement for telecommunication companies when deploying new networks. Service providers and enterprise customers lose a lot of money any time an interruption of internet service occurs. The SONET/SDH specification specifies several different types of topology that support redundancy. An Automatic Protection Switching (APS) mechanism is specified for each topology to dictate how a network behaves in a failure event. For this project, a software implementa...

  9. Assessment of network inference methods: how to cope with an underdetermined problem.

    Directory of Open Access Journals (Sweden)

    Caroline Siegenthaler

    The inference of biological networks is an active research area in the field of systems biology. The number of network inference algorithms has grown tremendously in the last decade, underlining the importance of a fair assessment and comparison among these methods. Current assessments of the performance of an inference method typically involve the application of the algorithm to benchmark datasets and the comparison of the network predictions against the gold standard or reference networks. While the network inference problem is often deemed underdetermined, implying that the inference problem does not have a (unique) solution, the consequences of such an attribute have not been rigorously taken into consideration. Here, we propose a new procedure for assessing the performance of gene regulatory network (GRN) inference methods. The procedure takes into account the underdetermined nature of the inference problem, in which gene regulatory interactions that are inferable or non-inferable are determined based on causal inference. The assessment relies on a new definition of the confusion matrix, which excludes errors associated with non-inferable gene regulations. For demonstration purposes, the proposed assessment procedure is applied to the DREAM 4 In Silico Network Challenge. The results show a marked change in the ranking of participating methods when taking network inferability into account.
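
    The key idea, excluding non-inferable regulations from the confusion matrix, can be sketched in a few lines, assuming boolean adjacency matrices and an inferability mask derived from causal analysis:

    ```python
    import numpy as np

    def assess(pred: np.ndarray, gold: np.ndarray, inferable: np.ndarray):
        """Precision/recall over inferable gene pairs only: regulations flagged
        as non-inferable are excluded from the confusion matrix instead of being
        counted as errors (pred and gold are boolean adjacency matrices)."""
        m = inferable.astype(bool)
        tp = np.sum(pred & gold & m)
        fp = np.sum(pred & ~gold & m)
        fn = np.sum(~pred & gold & m)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        return precision, recall
    ```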

  10. Reactor power automatically controlling method and device for BWR type reactor

    International Nuclear Information System (INIS)

    Murata, Akira; Miyamoto, Yoshiyuki; Tanigawa, Naoshi.

    1997-01-01

    For automatic reactor power control, when the deviation exceeds a predetermined value the aimed value is kept constant, and when the deviation decreases below the predetermined value the aimed value is increased again. Alternatively, when the reactor power variation coefficient decreases below a predetermined value the aimed value is held constant, and when the variation coefficient exceeds the predetermined value the aimed value is increased. When the reactor power variation coefficient exceeds a first determined value, the aimed value is increased at a predetermined variation coefficient; when the variation coefficient decreases below the first determined value, and also when the deviation between the aimed value and the actual reactor power exceeds a second determined value, the aimed value is maintained constant. When the deviation increases or the reactor power variation coefficient decreases, the aimed value is maintained at the predetermined value without being increased, so the deviation does not grow excessively, thereby avoiding excessive overshoot. (N.H.)
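
    A toy rendition of the hold/ramp rule in Python; the thresholds and ramp size are illustrative, not taken from the patent:

    ```python
    def next_setpoint(setpoint, power, rate, ramp=0.1, max_dev=2.0, min_rate=0.05):
        """One step of the hold/ramp rule: keep the aimed value fixed while the
        reactor lags (large deviation or low power-variation rate), and ramp it
        toward the demanded power otherwise."""
        deviation = setpoint - power
        if deviation > max_dev or rate < min_rate:
            return setpoint          # hold: avoid winding up the target
        return setpoint + ramp       # resume increasing the aimed value
    ```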

  11. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  12. AWSCS-A System to Evaluate Different Approaches for the Automatic Composition and Execution of Web Services Flows.

    Science.gov (United States)

    Tardiole Kuehne, Bruno; Estrella, Julio Cezar; Nunes, Luiz Henrique; Martins de Oliveira, Edvard; Hideo Nakamura, Luis; Gomes Ferreira, Carlos Henrique; Carlucci Santana, Regina Helena; Reiff-Marganiec, Stephan; Santana, Marcos José

    2015-01-01

    This paper proposes a system named AWSCS (Automatic Web Service Composition System) to evaluate different approaches for the automatic composition of Web services, based on QoS parameters measured at execution time. AWSCS implements different approaches for automatic composition of Web services and also executes the resulting flows. To demonstrate the results of this paper, a scenario was developed in which empirical flows were built to show the operation of AWSCS, since algorithms for automatic composition are not readily available to test. The results allow us to study the behaviour of running composite Web services when flows with the same functionality but different problem-solving strategies are compared. Furthermore, we observed that both the magnitude of the load applied to the running system and the type of load submitted are important factors in determining which approach to Web service composition achieves the best performance in production.

  13. AWSCS-A System to Evaluate Different Approaches for the Automatic Composition and Execution of Web Services Flows.

    Directory of Open Access Journals (Sweden)

    Bruno Tardiole Kuehne

    This paper proposes a system named AWSCS (Automatic Web Service Composition System) to evaluate different approaches for the automatic composition of Web services, based on QoS parameters measured at execution time. AWSCS implements different approaches for automatic composition of Web services and also executes the resulting flows. To demonstrate the results of this paper, a scenario was developed in which empirical flows were built to show the operation of AWSCS, since algorithms for automatic composition are not readily available to test. The results allow us to study the behaviour of running composite Web services when flows with the same functionality but different problem-solving strategies are compared. Furthermore, we observed that both the magnitude of the load applied to the running system and the type of load submitted are important factors in determining which approach to Web service composition achieves the best performance in production.

  14. Forensic Automatic Speaker Recognition Based on Likelihood Ratio Using Acoustic-phonetic Features Measured Automatically

    Directory of Open Access Journals (Sweden)

    Huapeng Wang

    2015-01-01

    Forensic speaker recognition is experiencing a remarkable paradigm shift in terms of the evaluation framework and presentation of voice evidence. This paper proposes a new method of forensic automatic speaker recognition using the likelihood ratio framework to quantify the strength of voice evidence. The proposed method uses a reference database to calculate the within- and between-speaker variability. Some acoustic-phonetic features are extracted automatically using the software VoiceSauce. The effectiveness of the approach was tested using two Mandarin databases: a mobile telephone database and a landline database. The experimental results indicate that these acoustic-phonetic features do have some discriminating potential and are worth using for discrimination. The automatic acoustic-phonetic features show acceptable discriminative performance and can provide more reliable results in evidence analysis when fused with other kinds of voice features.
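
    A minimal sketch of likelihood-ratio scoring, assuming single Gaussian models for the suspect and the reference population; practical systems fit richer within- and between-speaker models, but the structure of the score is the same. All numbers below are invented.

    ```python
    from scipy.stats import norm

    def log_lr(x, suspect_mean, suspect_sd, pop_mean, pop_sd):
        """Log likelihood ratio for one acoustic-phonetic measurement x:
        same-speaker (suspect) model vs. reference-population model."""
        return norm.logpdf(x, suspect_mean, suspect_sd) - norm.logpdf(x, pop_mean, pop_sd)

    # hypothetical F0 values (Hz) from the questioned recording; log-LRs sum
    # over features that are assumed independent
    evidence = [205.0, 214.0, 208.0]
    score = sum(log_lr(x, 210.0, 8.0, 190.0, 25.0) for x in evidence)
    # score > 0 supports the same-speaker hypothesis; score < 0 the opposite
    ```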

  15. Fuzzy logic controller using different inference methods

    International Nuclear Information System (INIS)

    Liu, Z.; De Keyser, R.

    1994-01-01

    In this paper the design of fuzzy controllers by using different inference methods is introduced. Configuration of the fuzzy controllers includes a general rule-base which is a collection of fuzzy PI or PD rules, the triangular fuzzy data model and a centre of gravity defuzzification algorithm. The generalized modus ponens (GMP) is used with the minimum operator of the triangular norm. Under the sup-min inference rule, six fuzzy implication operators are employed to calculate the fuzzy look-up tables for each rule base. The performance is tested in simulated systems with MATLAB/SIMULINK. Results show the effects of using the fuzzy controllers with different inference methods and applied to different test processes
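
    A small Mamdani-style sketch with triangular membership functions, sup-min rule firing and centre-of-gravity defuzzification, mirroring the configuration described above; the rule base and universes are made up.

    ```python
    import numpy as np

    def tri(x, a, b, c):
        # triangular membership function over [a, c], peaking at b
        return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

    u = np.linspace(-1.0, 1.0, 201)   # output universe: change of control action

    def fuzzy_pi(e, de):
        # three PI-style rules fired with sup-min (Mamdani) inference
        rules = [
            (min(tri(e, -2, -1, 0), tri(de, -2, -1, 0)), tri(u, -2, -1, 0)),  # decrease
            (min(tri(e, -1, 0, 1),  tri(de, -1, 0, 1)),  tri(u, -1, 0, 1)),   # hold
            (min(tri(e, 0, 1, 2),   tri(de, 0, 1, 2)),   tri(u, 0, 1, 2)),    # increase
        ]
        # clip each consequent at its firing strength, aggregate with max
        agg = np.max([np.minimum(s, cons) for s, cons in rules], axis=0)
        return float((u * agg).sum() / (agg.sum() + 1e-12))  # centre of gravity

    print(fuzzy_pi(0.4, -0.1))  # example: small positive error, falling slowly
    ```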

  16. Automatic inference of geometric camera parameters and intercamera topology in uncalibrated disjoint surveillance cameras

    NARCIS (Netherlands)

    Hollander, R.J.M. den; Bouma, H.; Baan, J.; Eendebak, P.T.; Rest, J.H.C. van

    2015-01-01

    Person tracking across non-overlapping cameras and other types of video analytics benefit from spatial calibration information that allows an estimation of the distance between cameras and a relation between pixel coordinates and world coordinates within a camera. In a large environment with many

  17. An algebra-based method for inferring gene regulatory networks.

    Science.gov (United States)

    Vera-Licona, Paola; Jarrah, Abdul; Garcia-Puente, Luis David; McGee, John; Laubenbacher, Reinhard

    2014-03-26

    The inference of gene regulatory networks (GRNs) from experimental observations is at the heart of systems biology. This includes the inference of both the network topology and its dynamics. While there are many algorithms available to infer the network topology from experimental data, less emphasis has been placed on methods that infer network dynamics. Furthermore, since the network inference problem is typically underdetermined, it is essential to have the option of incorporating into the inference process, prior knowledge about the network, along with an effective description of the search space of dynamic models. Finally, it is also important to have an understanding of how a given inference method is affected by experimental and other noise in the data used. This paper contains a novel inference algorithm using the algebraic framework of Boolean polynomial dynamical systems (BPDS), meeting all these requirements. The algorithm takes as input time series data, including those from network perturbations, such as knock-out mutant strains and RNAi experiments. It allows for the incorporation of prior biological knowledge while being robust to significant levels of noise in the data used for inference. It uses an evolutionary algorithm for local optimization with an encoding of the mathematical models as BPDS. The BPDS framework allows an effective representation of the search space for algebraic dynamic models that improves computational performance. The algorithm is validated with both simulated and experimental microarray expression profile data. Robustness to noise is tested using a published mathematical model of the segment polarity gene network in Drosophila melanogaster. Benchmarking of the algorithm is done by comparison with a spectrum of state-of-the-art network inference methods on data from the synthetic IRMA network to demonstrate that our method has good precision and recall for the network reconstruction task, while also predicting several of the
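
    The BPDS representation itself is easy to sketch: each node updates as a polynomial over GF(2), with multiplication acting as AND and addition mod 2 as XOR. The toy model below is invented for illustration; the paper's evolutionary algorithm searches over such polynomial systems, scoring candidates by how well their trajectories reproduce the observed time series.

    ```python
    import numpy as np

    def step(state, polys):
        """Synchronous update of a Boolean polynomial dynamical system over GF(2):
        each node's polynomial is a list of monomials (tuples of variable indices,
        combined by AND), and monomials are added mod 2 (XOR)."""
        return np.array([sum(np.prod(state[list(m)]) for m in poly) % 2
                         for poly in polys])

    # hypothetical 3-gene model: x0' = x1*x2, x1' = x0 + x2, x2' = x1
    polys = [[(1, 2)], [(0,), (2,)], [(1,)]]
    trajectory = [np.array([1, 0, 1])]
    for _ in range(5):
        trajectory.append(step(trajectory[-1], polys))
    ```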

  18. Active inference, sensory attenuation and illusions.

    Science.gov (United States)

    Brown, Harriet; Adams, Rick A; Parees, Isabel; Edwards, Mark; Friston, Karl

    2013-11-01

    Active inference provides a simple and neurobiologically plausible account of how action and perception are coupled in producing (Bayes) optimal behaviour. This can be seen most easily as minimising prediction error: we can either change our predictions to explain sensory input through perception, or actively change sensory input to fulfil our predictions. In active inference, action is mediated by classical reflex arcs that minimise proprioceptive prediction error created by descending proprioceptive predictions. However, this creates a conflict between action and perception, in that self-generated movements require predictions to override the sensory evidence that one is not actually moving, yet ignoring sensory evidence means that externally generated sensations will not be perceived. Conversely, attending to (proprioceptive and somatosensory) sensations enables the detection of externally generated events but precludes the generation of actions. This conflict can be resolved by attenuating the precision of sensory evidence during movement or, equivalently, attending away from the consequences of self-made acts. We propose that this Bayes-optimal withdrawal of precise sensory evidence during movement is the cause of psychophysical sensory attenuation. Furthermore, it explains the force-matching illusion and reproduces empirical results almost exactly. Finally, if attenuation is removed, the force-matching illusion disappears and false (delusional) inferences about agency emerge. This is important, given the negative correlation between sensory attenuation and delusional beliefs in normal subjects, and the reduction in the magnitude of the illusion in schizophrenia. Active inference therefore links the neuromodulatory optimisation of precision to sensory attenuation and illusory phenomena during the attribution of agency in normal subjects. It also provides a functional account of deficits in syndromes characterised by false inference.

  19. Bayesian Inference and Online Learning in Poisson Neuronal Networks.

    Science.gov (United States)

    Huang, Yanping; Rao, Rajesh P N

    2016-08-01

    Motivated by the growing evidence for Bayesian computation in the brain, we show how a two-layer recurrent network of Poisson neurons can perform both approximate Bayesian inference and learning for any hidden Markov model. The lower-layer sensory neurons receive noisy measurements of hidden world states. The higher-layer neurons infer a posterior distribution over world states via Bayesian inference from inputs generated by sensory neurons. We demonstrate how such a neuronal network with synaptic plasticity can implement a form of Bayesian inference similar to Monte Carlo methods such as particle filtering. Each spike in a higher-layer neuron represents a sample of a particular hidden world state. The spiking activity across the neural population approximates the posterior distribution over hidden states. In this model, variability in spiking is regarded not as a nuisance but as an integral feature that provides the variability necessary for sampling during inference. We demonstrate how the network can learn the likelihood model, as well as the transition probabilities underlying the dynamics, using a Hebbian learning rule. We present results illustrating the ability of the network to perform inference and learning for arbitrary hidden Markov models.
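
    A bootstrap particle filter for a two-state hidden Markov model with Poisson observations gives a feel for the sampling scheme the network approximates: each particle plays the role of one higher-layer spike sampling a hidden world state. The transition matrix and firing rates below are invented.

    ```python
    import numpy as np
    from scipy.stats import poisson

    rng = np.random.default_rng(0)
    T = np.array([[0.95, 0.05], [0.10, 0.90]])  # hidden-state transition matrix
    rates = np.array([2.0, 8.0])                # Poisson firing rate per state

    # simulate a hidden Markov chain and spike-count observations
    states = [0]
    for _ in range(99):
        states.append(rng.choice(2, p=T[states[-1]]))
    spikes = rng.poisson(rates[np.array(states)])

    # bootstrap particle filter: propagate, weight by likelihood, resample
    N = 1000
    particles = rng.integers(0, 2, size=N)
    posterior_p1 = []
    for y in spikes:
        particles = np.array([rng.choice(2, p=T[s]) for s in particles])
        w = poisson.pmf(y, rates[particles])
        w /= w.sum()
        posterior_p1.append(float(w @ (particles == 1)))   # P(state=1 | y_1..t)
        particles = rng.choice(particles, size=N, p=w)     # resample
    ```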

  20. Automation strategies in five domains - A comparison of levels of automation, function allocation and visualisation of automatic functions

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, J. (Chalmers Univ. of Technology. Division Design and Human factors. Dept. of Product and Production Development, Goeteborg (Sweden))

    2011-01-15

    This study was conducted as a field study in which control room operators and engineers from the refinery, heat and power, aviation, shipping and nuclear domains were interviewed regarding the use of automation and the visualisation of automatic functions. The purpose of the study was to collect experiences and best practices from the five studied domains on levels of automation, function allocation and visualisation of automatic functions. In total, nine different control room settings were visited. The studied settings were compared using a systemic approach based on a human-machine systems model. The results show that the 'left over principle' is still the most commonly applied approach to function allocation, but in high-risk settings the decision whether or not to automate is more carefully considered. Regarding the visualisation of automatic functions, it was found that as long as each display type (process based, functional oriented, situation oriented and task based) is applied so that it corresponds to the same level of abstraction as the technical system, the operator's mental model will be supported. No single display type can, however, readily match all levels of abstraction at the same time; all display types are still needed and serve different purposes. (Author)