WorldWideScience

Sample records for model hmm framework

  1. Unified HMM-based layout analysis framework and algorithm

    Institute of Scientific and Technical Information of China (English)

    陈明; 丁晓青; 吴佑寿

    2003-01-01

    To address the layout analysis problem for complex or irregular document images, a unified HMM-based layout analysis framework is presented in this paper. Based on the multi-resolution wavelet analysis of the document image, the HMM method is used in both an inner-scale image model and a trans-scale context model to classify pixel region properties such as text, picture, or background. In each scale, an HMM direct segmentation method is used to obtain a better inner-scale classification result. Another HMM is then used to fuse the inner-scale results from each scale to obtain a better final segmentation result. The optimized algorithm uses a stop rule in the coarse-to-fine multi-scale segmentation process, so the speed is improved remarkably. Experiments demonstrate the efficiency of the proposed algorithm.

  2. HMM-based Trust Model

    DEFF Research Database (Denmark)

    ElSalamouny, Ehab; Nielsen, Mogens; Sassone, Vladimiro

    2010-01-01

    ... with their dynamic behaviour. Using Hidden Markov Models (HMMs) for both modelling and approximating the behaviours of principals, we introduce the HMM-based trust model as a new approach to evaluating trust in systems exhibiting dynamic behaviour. This model avoids the fixed-behaviour assumption, which is considered the major limitation of the existing Beta trust model. We show the consistency of the HMM-based trust model and contrast it against the well-known Beta trust model with the decay principle in terms of estimation precision.

  3. AN HMM BASED ANALYSIS FRAMEWORK FOR SEMANTIC VIDEO EVENTS

    Institute of Scientific and Technical Information of China (English)

    You Junyong; Liu Guizhong; Zhang Yaxin

    2007-01-01

    Semantic video analysis plays an important role in the fields of machine intelligence and pattern recognition. In this paper, a semantic recognition framework for compressed videos based on the Hidden Markov Model (HMM) is proposed to analyze video events according to six low-level features. After a detailed analysis of video events, the pattern of global motion and five foreground features (the principal parts of videos) are employed as the observations of the Hidden Markov Model to classify events in videos. Applications of the proposed framework to several video event detection tasks demonstrate its promise for semantic video analysis.

  4. HMM Framework for Industrial Maintenance Activities

    OpenAIRE

    Roblès, Bernard; Avila, Manuel; Duculty, Florent; Vrignat, Pascal; Begot, Stéphane; Kratz, Frédéric

    2013-01-01

    This paper uses the Hidden Markov Model to model an industrial process seen as a discrete event system. Different graphical structures based on Markov automata, called topologies, are proposed. We designed a Synthetic Hidden Markov Model based on a real industrial process. This Synthetic Model is intended to produce industrial maintenance observations (or "symbols"), with a corresponding degradation indicator. These time series events are shown as Markov chain...

  5. pHMM-tree: phylogeny of profile hidden Markov models.

    Science.gov (United States)

    Huo, Luyang; Zhang, Han; Huo, Xueting; Yang, Yasong; Li, Xueqiong; Yin, Yanbin

    2017-04-01

    Protein families are often represented by profile hidden Markov models (pHMMs). Homology between two distant protein families can be determined by comparing their pHMMs. Here we explored the idea of building a phylogeny of protein families using the distance matrix of their pHMMs. We developed a new software tool and web server (pHMM-tree) that allows four major types of input: (i) multiple pHMM files, (ii) multiple aligned protein sequence files, (iii) a mixture of pHMM and aligned sequence files and (iv) unaligned protein sequences in a single file. The output will be a pHMM phylogeny of different protein families delineating their relationships. We have applied pHMM-tree to build phylogenies for CAZyme (carbohydrate active enzyme) classes and Pfam clans, which attested to its usefulness in the phylogenetic representation of the evolutionary relationships among distant protein families. This software is implemented in C/C++ and is available at http://cys.bios.niu.edu/pHMM-Tree/source/. Contact: zhanghan@nankai.edu.cn or yyin@niu.edu. Supplementary data are available at Bioinformatics online.
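
    A minimal sketch of the core idea, building a tree of protein families from a precomputed pHMM distance matrix via hierarchical clustering. The family names and distance values below are hypothetical placeholders, and pHMM-tree's actual distance measure and tree-building algorithm may differ.

    ```python
    # Sketch: build a family tree from a precomputed pHMM distance matrix.
    # Families and distances are hypothetical; pHMM-tree's own distance
    # computation and tree construction may differ.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, dendrogram
    from scipy.spatial.distance import squareform

    families = ["GH5", "GH10", "GH26", "GH51"]          # hypothetical CAZyme families
    D = np.array([[0.0, 0.4, 0.5, 0.9],
                  [0.4, 0.0, 0.3, 0.8],
                  [0.5, 0.3, 0.0, 0.7],
                  [0.9, 0.8, 0.7, 0.0]])                # pairwise pHMM distances

    Z = linkage(squareform(D), method="average")        # UPGMA-style agglomerative clustering
    tree = dendrogram(Z, labels=families, no_plot=True)
    print(tree["ivl"])                                  # leaf order of the resulting family tree
    ```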

  6. Creating Discriminative Models for Time Series Classification and Clustering by HMM Ensembles.

    Science.gov (United States)

    Asadi, Nazanin; Mirzaei, Abdolreza; Haghshenas, Ehsan

    2016-12-01

    Classification of temporal data sequences is a fundamental branch of machine learning with a broad range of real-world applications. Since the dimensionality of temporal data is significantly larger than that of static data, and its modeling and interpretation are more complicated, classification and clustering of temporal data are more complex as well. Hidden Markov models (HMMs) are well-known statistical models for modeling and analyzing sequence data. In addition, ensemble methods, which employ multiple models to obtain the target model, have shown good performance in experiments. These facts strongly motivate employing HMM ensembles for the classification and clustering of time series data. So far, no effective classification or clustering method based on HMM ensembles has been proposed, and the limited existing HMM ensemble methods have trouble with the vital task of separating models of distinct classes. In this paper, a new framework based on HMM ensembles for classification and clustering is proposed. In addition to its strong theoretical background, which employs the Rényi entropy in the ensemble learning procedure, the main contribution of the proposed method is that it addresses the problem HMM-based methods have in separating models of distinct classes by considering the inverse emission matrix of the opposite class to build an opposite model. The proposed algorithms perform more effectively than other methods, especially other HMM ensemble-based methods. Moreover, the proposed clustering framework, which derives benefits from both similarity-based and model-based methods, together with the Rényi-based ensemble method, demonstrated its superiority on several measures.
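
    A sketch of the baseline this work builds on: train one HMM per class and classify a sequence by maximum log-likelihood. It assumes the hmmlearn package; the paper's ensemble, Rényi-entropy, and "opposite model" machinery is not reproduced here.

    ```python
    # Baseline: one HMM per class, classification by log-likelihood.
    import numpy as np
    from hmmlearn.hmm import GaussianHMM

    def train_class_hmms(sequences_by_class, n_states=3):
        """sequences_by_class: dict label -> list of (T_i, d) feature arrays."""
        models = {}
        for label, seqs in sequences_by_class.items():
            X = np.concatenate(seqs)                # stack all sequences of the class
            lengths = [len(s) for s in seqs]        # per-sequence lengths for EM training
            m = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
            models[label] = m.fit(X, lengths)
        return models

    def classify(models, seq):
        # score() returns the log-likelihood of the sequence under each class model
        return max(models, key=lambda label: models[label].score(seq))
    ```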

  7. An HMM-based comparative genomic framework for detecting introgression in eukaryotes.

    Directory of Open Access Journals (Sweden)

    Kevin J Liu

    2014-06-01

    One outcome of interspecific hybridization and the subsequent effects of evolutionary forces is introgression, which is the integration of genetic material from one species into the genome of an individual in another species. The evolution of several groups of eukaryotic species has involved hybridization, and cases of adaptation through introgression have already been established. In this work, we report on PhyloNet-HMM, a new comparative genomic framework for detecting introgression in genomes. PhyloNet-HMM combines phylogenetic networks with hidden Markov models (HMMs) to simultaneously capture the (potentially reticulate) evolutionary history of the genomes and dependencies within genomes. A novel aspect of our work is that it also accounts for incomplete lineage sorting and dependence across loci. Application of our model to variation data from chromosome 7 in the mouse (Mus musculus domesticus) genome detected a recently reported adaptive introgression event involving the rodent poison resistance gene Vkorc1, in addition to other newly detected introgressed genomic regions. Based on our analysis, we estimate that about 9% of all sites within chromosome 7 are of introgressive origin (these cover about 13 Mbp of chromosome 7 and over 300 genes). Further, our model detected no introgression in a negative control data set. We also found that our model accurately detected introgression and other evolutionary processes in synthetic data sets simulated under the coalescent model with recombination, isolation, and migration. Our work provides a powerful framework for systematic analysis of introgression while simultaneously accounting for dependence across sites, point mutations, recombination, and ancestral polymorphism.

  8. Prediction of nuclear proteins using SVM and HMM models

    Directory of Open Access Journals (Sweden)

    Raghava Gajendra PS

    2009-01-01

    Abstract Background The nucleus, a highly organized organelle, plays an important role in cellular homeostasis. Nuclear proteins are crucial for chromosomal maintenance/segregation, gene expression, RNA processing/export, and many other processes. Several methods have been developed in the past for predicting nuclear proteins. The aim of the present study is to develop a new method for predicting nuclear proteins with higher accuracy. Results All modules were trained and tested on a non-redundant dataset and evaluated using the five-fold cross-validation technique. First, Support Vector Machine (SVM) based modules were developed using amino acid and dipeptide compositions and achieved Matthews correlation coefficients (MCC) of 0.59 and 0.61, respectively. Second, we developed SVM modules using split amino acid composition (SAAC) and achieved a maximum MCC of 0.66. Third, a hidden Markov model (HMM) based module/profile was developed for searching exclusively nuclear and non-nuclear domains in a protein. Finally, a hybrid module was developed by combining the SVM module and the HMM profile and achieved an MCC of 0.87 with an accuracy of 94.61%. This method performs better than existing methods when evaluated on blind/independent datasets. Our method estimated 31.51%, 21.89%, 26.31%, 25.72% and 24.95% of the proteins as nuclear proteins in the Saccharomyces cerevisiae, Caenorhabditis elegans, Drosophila melanogaster, mouse and human proteomes, respectively. Based on the above modules, we have developed a web server, NpPred, for predicting nuclear proteins http://www.imtech.res.in/raghava/nppred/. Conclusion This study describes a highly accurate method for predicting nuclear proteins. An SVM module has been developed for the first time using SAAC for predicting nuclear proteins, where the amino acid compositions of the N-terminus and the remaining protein are computed separately. In addition, our study is a first documentation where exclusively nuclear...
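
    A hedged sketch of the composition-based SVM module described above: 20-dimensional amino acid composition features fed to an SVM classifier. The training sequences and labels are placeholders; the paper's SAAC variant additionally computes the composition of the N-terminus and the remainder of the protein separately.

    ```python
    # Sketch: amino acid composition features + SVM for nuclear/non-nuclear prediction.
    # Sequences and labels below are hypothetical placeholders.
    from collections import Counter
    import numpy as np
    from sklearn.svm import SVC

    AA = "ACDEFGHIKLMNPQRSTVWY"

    def aa_composition(seq):
        counts = Counter(seq)
        return np.array([counts.get(a, 0) / len(seq) for a in AA])  # fraction of each residue

    train_seqs = ["MKRPLLTAAK", "MKKRDEASPR", "MSDEEKLGGA", "MAVLTGDEEP"]  # hypothetical proteins
    labels = [1, 1, 0, 0]                                               # 1 = nuclear, 0 = non-nuclear

    X = np.vstack([aa_composition(s) for s in train_seqs])
    clf = SVC(kernel="rbf").fit(X, labels)
    print(clf.predict([aa_composition("MAAKRKQDEL")]))
    ```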

  9. Dynamic HMM Model with Estimated Dynamic Property in Continuous Mandarin Speech Recognition

    Institute of Scientific and Technical Information of China (English)

    CHEN Feili; ZHU Jie

    2003-01-01

    A new dynamic HMM (hidden Markov model) is introduced in this paper, which describes the relationship between the dynamic property and the feature space. A method to estimate the dynamic property is discussed, which makes the dynamic HMM much more practical for real-time speech recognition. Experiments on a large-vocabulary continuous Mandarin speech recognition task show that the dynamic HMM can achieve about 10% error reduction for both tonal and toneless syllables. The estimated dynamic property achieves nearly the same (or even better) performance as the extracted dynamic property.

  10. Speech-To-Text Conversion STT System Using Hidden Markov Model HMM

    Directory of Open Access Journals (Sweden)

    Su Myat Mon

    2015-06-01

    Abstract Speech is the easiest way to communicate with each other. Speech processing is widely used in many applications such as security devices, household appliances, cellular phones, ATM machines and computers. Human-computer interfaces have been developed so that those suffering from some kind of disability can communicate or interact conveniently. Speech-to-Text Conversion (STT) systems have many benefits for deaf or dumb people and find applications in our daily lives. In the same way, the aim of this system is to convert input speech signals into text output for deaf or dumb students in the educational field. This paper presents an approach that extracts features from the speech signals of isolated spoken words using Mel Frequency Cepstral Coefficients (MFCC), and the Hidden Markov Model (HMM) method is applied to train and test the audio files to recognize the spoken word. The speech database is created using MATLAB. The original speech signals are then preprocessed, and feature vectors are extracted from these speech samples and used as the observation sequences of the Hidden Markov Model (HMM) recognizer. The feature vectors are analyzed in the HMM depending on the number of states.
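
    A hedged Python analogue of the described pipeline (the paper itself uses MATLAB): MFCC features per utterance, one HMM per word, and recognition by maximum log-likelihood. The librosa and hmmlearn packages, file names, and word list are assumptions, not the paper's setup.

    ```python
    # Sketch: MFCC feature extraction + per-word HMMs for isolated word recognition.
    import librosa
    import numpy as np
    from hmmlearn.hmm import GaussianHMM

    def mfcc_features(path, n_mfcc=13):
        y, sr = librosa.load(path, sr=None)
        return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc).T   # (frames, n_mfcc)

    # Hypothetical training files for two vocabulary words.
    words = {"yes": ["yes_01.wav", "yes_02.wav"], "no": ["no_01.wav", "no_02.wav"]}

    models = {}
    for word, files in words.items():
        feats = [mfcc_features(f) for f in files]
        X, lengths = np.concatenate(feats), [len(f) for f in feats]
        models[word] = GaussianHMM(n_components=5, covariance_type="diag", n_iter=30).fit(X, lengths)

    test = mfcc_features("unknown.wav")
    print(max(models, key=lambda w: models[w].score(test)))        # recognized word
    ```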

  11. Accelerated Profile HMM Searches.

    Directory of Open Access Journals (Sweden)

    Sean R Eddy

    2011-10-01

    Profile hidden Markov models (profile HMMs) and probabilistic inference methods have made important contributions to the theory of sequence database homology search. However, practical use of profile HMM methods has been hindered by the computational expense of existing software implementations. Here I describe an acceleration heuristic for profile HMMs, the "multiple segment Viterbi" (MSV) algorithm. The MSV algorithm computes an optimal sum of multiple ungapped local alignment segments using a striped vector-parallel approach previously described for fast Smith/Waterman alignment. MSV scores follow the same statistical distribution as gapped optimal local alignment scores, allowing rapid evaluation of the significance of an MSV score and thus facilitating its use as a heuristic filter. I also describe a 20-fold acceleration of the standard profile HMM Forward/Backward algorithms using a method I call "sparse rescaling". These methods are assembled in a pipeline in which high-scoring MSV hits are passed on for reanalysis with the full HMM Forward/Backward algorithm. This accelerated pipeline is implemented in the freely available HMMER3 software package. Performance benchmarks show that the use of the heuristic MSV filter sacrifices negligible sensitivity compared to unaccelerated profile HMM searches. HMMER3 is substantially more sensitive and 100- to 1000-fold faster than HMMER2. HMMER3 is now about as fast as BLAST for protein searches.
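
    The MSV filter and sparse-rescaled Forward/Backward run inside HMMER3 itself; from the user's side, an accelerated search is a single hmmsearch invocation. Below is a hedged example of wrapping such a call from Python; the profile and database file names are assumptions.

    ```python
    # Sketch: run an accelerated profile HMM search with HMMER3's hmmsearch.
    # Requires HMMER3 on PATH; file names are hypothetical placeholders.
    import subprocess

    subprocess.run(
        ["hmmsearch", "--tblout", "hits.tbl", "-E", "1e-5",
         "globins4.hmm", "uniprot_sprot.fasta"],
        check=True,
    )
    # hits.tbl then contains one line per target sequence with E-values and bit scores.
    ```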

  12. Soft context clustering for F0 modeling in HMM-based speech synthesis

    Science.gov (United States)

    Khorram, Soheil; Sameti, Hossein; King, Simon

    2015-12-01

    This paper proposes the use of a new binary decision tree, which we call a soft decision tree, to improve generalization performance compared to the conventional 'hard' decision tree method that is used to cluster context-dependent model parameters in statistical parametric speech synthesis. We apply the method to improve the modeling of fundamental frequency, which is an important factor in synthesizing natural-sounding high-quality speech. Conventionally, hard decision tree-clustered hidden Markov models (HMMs) are used, in which each model parameter is assigned to a single leaf node. However, this 'divide-and-conquer' approach leads to data sparsity, with the consequence that it suffers from poor generalization, meaning that it is unable to accurately predict parameters for models of unseen contexts: the hard decision tree is a weak function approximator. To alleviate this, we propose the soft decision tree, which is a binary decision tree with soft decisions at the internal nodes. In this soft clustering method, internal nodes select both their children with certain membership degrees; therefore, each node can be viewed as a fuzzy set with a context-dependent membership function. The soft decision tree improves model generalization and provides a superior function approximator because it is able to assign each context to several overlapped leaves. In order to use such a soft decision tree to predict the parameters of the HMM output probability distribution, we derive the smoothest (maximum entropy) distribution which captures all partial first-order moments and a global second-order moment of the training samples. Employing such a soft decision tree architecture with maximum entropy distributions, a novel speech synthesis system is trained using maximum likelihood (ML) parameter re-estimation and synthesis is achieved via maximum output probability parameter generation. In addition, a soft decision tree construction algorithm optimizing a log-likelihood measure...
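
    A toy numpy illustration of the soft-decision idea described above: instead of a hard yes/no split, a sigmoid membership sends each context to both children, and the leaf predictions are blended by the membership degrees. All parameters and feature encodings below are hypothetical; the paper's trees are trained on real context features with maximum entropy leaf distributions.

    ```python
    # Toy soft decision node: membership-weighted blend of two leaf predictions.
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def soft_node(x, w, b, left_leaf_mean, right_leaf_mean):
        """x: encoded context vector; w, b: soft split parameters."""
        m = sigmoid(w @ x + b)                       # membership degree of the left child
        return m * left_leaf_mean + (1.0 - m) * right_leaf_mean

    x = np.array([1.0, 0.0, 0.3])                    # hypothetical context of one HMM state
    f0 = soft_node(x, w=np.array([0.8, -1.2, 0.5]), b=0.1,
                   left_leaf_mean=120.0, right_leaf_mean=180.0)
    print(f"predicted F0 mean: {f0:.1f} Hz")
    ```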

  13. HMM Adaptation for child speech synthesis

    CSIR Research Space (South Africa)

    Govender, Avashna

    2015-09-01

    Hidden Markov Model (HMM)-based synthesis in combination with speaker adaptation has proven to be an approach that is well-suited for child speech synthesis. This paper describes the development and evaluation of different HMM-based child speech...

  14. Identification of divergent protein domains by combining HMM-HMM comparisons and co-occurrence detection.

    Directory of Open Access Journals (Sweden)

    Amel Ghouila

    Identification of protein domains is a key step for understanding protein function. Hidden Markov Models (HMMs) have proved to be a powerful tool for this task. The Pfam database notably provides a large collection of HMMs which are widely used for the annotation of proteins in sequenced organisms. This is done via sequence/HMM comparisons. However, this approach may lack sensitivity when searching for domains in divergent species. Recently, methods for HMM/HMM comparisons have been proposed and proved to be more sensitive than sequence/HMM approaches in certain cases. However, these approaches are usually not used for protein domain discovery at a genome scale, and the benefit that could be expected from their utilization for this problem has not been investigated. Using proteins of P. falciparum and L. major as examples, we investigate the extent to which HMM/HMM comparisons can identify new domain occurrences not already identified by sequence/HMM approaches. We show that although HMM/HMM comparisons are much more sensitive than sequence/HMM comparisons, they are not sufficiently accurate to be used as a standalone complement of sequence/HMM approaches at the genome scale. Hence, we propose to use domain co-occurrence--the general domain tendency to preferentially appear along with some favorite domains in the proteins--to improve the accuracy of the approach. We show that the combination of HMM/HMM comparisons and co-occurrence domain detection boosts protein annotations. At an estimated False Discovery Rate of 5%, it revealed 901 and 1098 new domains in Plasmodium and Leishmania proteins, respectively. Manual inspection of part of these predictions shows that it contains several domain families that were missing in the two organisms. All new domain occurrences have been integrated in the EuPathDomains database, along with the GO annotations that can be deduced.

  15. Identification of divergent protein domains by combining HMM-HMM comparisons and co-occurrence detection.

    Science.gov (United States)

    Ghouila, Amel; Florent, Isabelle; Guerfali, Fatma Zahra; Terrapon, Nicolas; Laouini, Dhafer; Yahia, Sadok Ben; Gascuel, Olivier; Bréhélin, Laurent

    2014-01-01

    Identification of protein domains is a key step for understanding protein function. Hidden Markov Models (HMMs) have proved to be a powerful tool for this task. The Pfam database notably provides a large collection of HMMs which are widely used for the annotation of proteins in sequenced organisms. This is done via sequence/HMM comparisons. However, this approach may lack sensitivity when searching for domains in divergent species. Recently, methods for HMM/HMM comparisons have been proposed and proved to be more sensitive than sequence/HMM approaches in certain cases. However, these approaches are usually not used for protein domain discovery at a genome scale, and the benefit that could be expected from their utilization for this problem has not been investigated. Using proteins of P. falciparum and L. major as examples, we investigate the extent to which HMM/HMM comparisons can identify new domain occurrences not already identified by sequence/HMM approaches. We show that although HMM/HMM comparisons are much more sensitive than sequence/HMM comparisons, they are not sufficiently accurate to be used as a standalone complement of sequence/HMM approaches at the genome scale. Hence, we propose to use domain co-occurrence--the general domain tendency to preferentially appear along with some favorite domains in the proteins--to improve the accuracy of the approach. We show that the combination of HMM/HMM comparisons and co-occurrence domain detection boosts protein annotations. At an estimated False Discovery Rate of 5%, it revealed 901 and 1098 new domains in Plasmodium and Leishmania proteins, respectively. Manual inspection of part of these predictions shows that it contains several domain families that were missing in the two organisms. All new domain occurrences have been integrated in the EuPathDomains database, along with the GO annotations that can be deduced.

  16. Network Attack Classification and Recognition Using HMM and Improved Evidence Theory

    Directory of Open Access Journals (Sweden)

    Gang Luo

    2016-04-01

    In this paper, a decision model for fusion classification based on HMM-DS is proposed, and training and recognition methods for the model are given. A pure HMM classifier cannot achieve an ideal balance between each model's ability to identify its own target and the maximum difference between models. Therefore, in this paper the results of the HMMs are integrated into the Dempster-Shafer (DS) framework, with the HMMs providing state probabilities for DS. The output of each hidden Markov model is used as a body of evidence. An improved evidence theory method is proposed to fuse the results and overcome the drawbacks of the pure HMM, improving the classification accuracy of the system. We compare our approach with the traditional evidence theory method, other representative improved DS methods, the pure HMM method and common classification methods. The experimental results show that our proposed method has a significant practical effect in improving the training process of network attack classification with high accuracy.
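
    A sketch of the fusion step: each HMM's normalized class probabilities act as a body of evidence and are combined with the classical Dempster rule restricted to singleton hypotheses. The class labels and mass values are hypothetical, and the paper's improved combination rule differs from this classical version.

    ```python
    # Classical Dempster's rule over singleton attack classes (sketch).
    def dempster_combine(m1, m2):
        """m1, m2: dicts mapping class labels to mass (evidence from two HMMs)."""
        conflict = sum(m1[a] * m2[b] for a in m1 for b in m2 if a != b)
        k = 1.0 - conflict                          # normalization constant
        return {a: m1[a] * m2.get(a, 0.0) / k for a in m1}

    hmm1 = {"dos": 0.6, "probe": 0.3, "normal": 0.1}   # hypothetical evidence from HMM 1
    hmm2 = {"dos": 0.5, "probe": 0.4, "normal": 0.1}   # hypothetical evidence from HMM 2
    print(dempster_combine(hmm1, hmm2))
    ```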

  17. HMM in Predicting Protein Secondary Structure

    Institute of Scientific and Technical Information of China (English)

    Huang Jing; Shi Feng; Zou Xiu-fen; Li Yuan-xiang; Zhou Huai-bei

    2003-01-01

    We introduce a new method, the duration Hidden Markov Model (dHMM), to predict the secondary structure of proteins. In our study, we divide the basic secondary structures of proteins into three classes: H (α-helix), E (β-sheet) and O (others, including coil and turn). The HMM is a probabilistic model that accounts for the interactions between adjacent amino acids (these interactions are represented by transition probabilities), and we use a genetic algorithm to determine the model parameters. After improving the model and fixing its parameters, we wrote a program, HMMPS. Our example shows that the HMM is a good method for protein secondary structure prediction.

  18. A Hybrid Approach for Co-Channel Speech Segregation based on CASA, HMM Multipitch Tracking, and Medium Frame Harmonic Model

    Directory of Open Access Journals (Sweden)

    Ashraf M. Mohy Eldin

    2013-08-01

    This paper proposes a hybrid approach for co-channel speech segregation. An HMM (hidden Markov model) is used to track the pitches of two talkers. The resulting pitch tracks are then enriched with the prominent pitch. The enriched tracks are correctly grouped using pitch continuity. Medium-frame harmonics are used to extract the second pitch for frames with only one pitch deduced in the previous steps. Finally, the pitch tracks are input to CASA (computational auditory scene analysis) to segregate the mixed speech. The center frequency range of the gammatone filter banks is maximized to reduce the overlap between the filtered channels for better segregation. Experiments were conducted using this hybrid approach on the speech separation challenge database and compared to the single (non-hybrid) approaches, i.e. signal processing and CASA. Results show that the hybrid approach outperforms the single approaches.

  19. Hybrid SVM/HMM Method for Face Recognition

    Institute of Scientific and Technical Information of China (English)

    刘江华; 陈佳品; 程君实

    2004-01-01

    A face recognition system based on Support Vector Machines (SVM) and Hidden Markov Models (HMM) is proposed. The powerful discriminative ability of the SVM is combined with the temporal modeling ability of the HMM. The output of the SVM is moderated into a probability output, which replaces the Mixture of Gaussians (MOG) in the HMM. Wavelet transformation is used to extract the observation vectors, which reduces the data dimension and improves robustness. The hybrid system is compared with a pure HMM face recognition method on the ORL and Yale face databases. Experimental results show that the hybrid method has better performance.

  20. A Survey on Hidden Markov Model (HMM) Based Intention Prediction Techniques

    Directory of Open Access Journals (Sweden)

    Mrs. Manisha Bharati

    2016-01-01

    The extensive use of virtualization in implementing cloud infrastructure brings unrivaled security concerns for cloud tenants or customers and introduces an additional layer that itself must be completely configured and secured. Intruders can exploit the large amount of cloud resources for their attacks. This paper discusses two approaches. In the first, three features, namely ongoing attacks, autonomic prevention actions, and a risk measure, are integrated into our Autonomic Cloud Intrusion Detection Framework (ACIDF), as most current security technologies do not provide the essential security features for cloud systems, such as early warnings about future ongoing attacks, autonomic prevention actions, and a risk measure. The early warnings are signaled through a new finite-state hidden Markov prediction model that captures the interaction between the attackers and cloud assets. The risk assessment model measures the potential impact of a threat on assets given its occurrence probability. The estimated risk of each security alert is updated dynamically as the alert is correlated to prior ones. This enables the adaptive risk metric to evaluate the cloud's overall security state. The prediction system raises early warnings about potential attacks to the autonomic component, the controller. Thus, the controller can take proactive corrective actions before the attacks pose a serious security risk to the system. The second approach is Attack Sequence Detection (ASD): tasks from different users may be performed on the same machine, so one primary security concern is whether user data are secure in the cloud. On the other hand, a hacker may use cloud computing to launch a larger range of attacks, such as a port scan request in a cloud with multiple virtual machines executing such malicious actions. In addition, a hacker may perform a sequence of attacks in order to compromise his target system in the cloud, for example, evading an easy-to-exploit machine in a...

  1. HMM-ModE – Improved classification using profile hidden Markov models by optimising the discrimination threshold and modifying emission probabilities with negative training sequences

    Directory of Open Access Journals (Sweden)

    Nandi Soumyadeep

    2007-03-01

    Abstract Background Profile Hidden Markov Models (HMMs) are statistical representations of protein families derived from patterns of sequence conservation in multiple alignments and have been used to identify remote homologues with considerable success. These conservation patterns arise from fold-specific signals, shared across multiple families, and function-specific signals unique to the families. The availability of sequences pre-classified according to their function permits the use of negative training sequences to improve the specificity of the HMM, both by optimizing the threshold cutoff and by modifying emission probabilities to minimize the influence of fold-specific signals. A protocol to generate family-specific HMMs is described that first constructs a profile HMM from an alignment of the family's sequences and then uses this model to identify sequences belonging to other classes that score above the default threshold (false positives). Ten-fold cross-validation is used to optimise the discrimination threshold score for the model. The advent of fast multiple alignment methods enables the use of the profile alignments to align the true and false positive sequences, and the resulting alignments are used to modify the emission probabilities in the original model. Results The protocol, called HMM-ModE, was validated on a set of sequences belonging to six sub-families of the AGC family of kinases. These sequences have an average sequence similarity of 63% within the group, though each sub-group has a different substrate specificity. Optimisation of the discrimination threshold, by scoring negative sequences against the model, improves the specificity in test cases from an average of 21% to 98%. Further discrimination by the HMM after modifying model probabilities using negative training sequences is provided in a few cases, the average specificity rising to 99%. Similar improvements were obtained with a sample of G-Protein coupled receptors...

  2. A stochastic HMM-based forecasting model for fuzzy time series.

    Science.gov (United States)

    Li, Sheng-Tun; Cheng, Yi-Chung

    2010-10-01

    Recently, fuzzy time series have attracted more academic attention than traditional time series due to their capability of dealing with the uncertainty and vagueness inherent in the data collected. The formulation of fuzzy relations is one of the key issues affecting forecasting results. Most of the present works adopt IF-THEN rules for relationship representation, which leads to higher computational overhead and rule redundancy. Sullivan and Woodall proposed a Markov-based formulation and a forecasting model to reduce computational overhead; however, its applicability is limited to handling one-factor problems. In this paper, we propose a novel forecasting model based on the hidden Markov model by enhancing Sullivan and Woodall's work to allow handling of two-factor forecasting problems. Moreover, in order to make the nature of conjecture and randomness of forecasting more realistic, the Monte Carlo method is adopted to estimate the outcome. To test the effectiveness of the resulting stochastic model, we conduct two experiments and compare the results with those from other models. The first experiment consists of forecasting the daily average temperature and cloud density in Taipei, Taiwan, and the second experiment is based on the Taiwan Weighted Stock Index by forecasting the exchange rate of the New Taiwan dollar against the U.S. dollar. In addition to improving forecasting accuracy, the proposed model adheres to the central limit theorem, and thus, the result statistically approximates to the real mean of the target value being forecast.

  3. An improved HMM/SVM dynamic hand gesture recognition algorithm

    Science.gov (United States)

    Zhang, Yi; Yao, Yuanyuan; Luo, Yuan

    2015-10-01

    In order to improve the recognition rate and stability of dynamic hand gesture recognition, and to address the low accuracy of the classical HMM algorithm in training the B parameter, this paper proposes an improved HMM/SVM dynamic gesture recognition algorithm. In the calculation of the B parameter of the HMM model, the SVM algorithm, which has strong classification ability, is introduced. The state outputs of the SVM are converted into probabilities through the sigmoid function, and these probabilities are treated as the observation probabilities of the HMM model. This optimizes the B parameter of the HMM model and improves the recognition rate of the system. At the same time, it also enhances the accuracy and the real-time performance of the human-computer interaction. Experiments show that the algorithm is robust under complex background and varying illumination conditions. The average recognition rate increased from 86.4% to 97.55%.
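
    A minimal sketch of the described B-parameter replacement: SVM decision scores are squashed through a sigmoid into pseudo observation probabilities that an HMM can use as emissions. The training data below are random placeholders and the state/class setup is an assumption.

    ```python
    # Sketch: sigmoid-converted SVM outputs as HMM observation (B) probabilities.
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(60, 10))            # hypothetical gesture feature vectors
    y_train = rng.integers(0, 3, size=60)          # 3 hidden states / gesture sub-phases

    svm = SVC(kernel="rbf", decision_function_shape="ovr").fit(X_train, y_train)

    def emission_probs(frame):
        scores = svm.decision_function(frame.reshape(1, -1)).ravel()  # one score per state
        probs = 1.0 / (1.0 + np.exp(-scores))                         # sigmoid conversion
        return probs / probs.sum()                 # normalized row, usable as B entries

    print(emission_probs(rng.normal(size=10)))
    ```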

  4. BW Trained HMM based Aerial Image Segmentation

    Directory of Open Access Journals (Sweden)

    R Rajasree

    2011-03-01

    Image segmentation is an essential preprocessing step in complicated, composite image processing algorithms. In segmenting aerial images, the cost of misclassification may depend on the true class. In this paper, motivated by recent advances in machine learning theory, I introduce a method to minimize the misclassification cost with class-dependent costs. The procedure assumes the hidden Markov model (HMM), which has been popularly used for image segmentation in recent years. We represent all feasible HMM-based segmenters (or classifiers) as a set of points in the receiver operating characteristic (ROC) space. Optimizing HMM parameters is still an important and challenging problem in automatic image segmentation research. Usually the Baum-Welch (B-W) algorithm is used to calculate the HMM model parameters. However, the B-W algorithm uses an initial random guess of the parameters; therefore, after convergence the output tends to be close to this initial value, which is not necessarily the global optimum of the model parameters. In this project, a genetic-algorithm-based adaptive Baum-Welch (GA-BW) is proposed.
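
    The GA-BW optimizer itself is not reproduced here; the sketch below only shows the local-optimum problem being mitigated in the simplest way, by restarting Baum-Welch (hmmlearn's EM fit) from several random initializations and keeping the best-scoring model. The observation data are random placeholders.

    ```python
    # Sketch: Baum-Welch with multiple random restarts to reduce sensitivity
    # to the initial random guess of the HMM parameters.
    import numpy as np
    from hmmlearn.hmm import GaussianHMM

    X = np.random.default_rng(1).normal(size=(500, 4))   # stand-in feature vectors

    best_model, best_ll = None, -np.inf
    for seed in range(10):                                # 10 random restarts
        m = GaussianHMM(n_components=3, covariance_type="diag",
                        n_iter=100, random_state=seed).fit(X)
        ll = m.score(X)                                   # log-likelihood after EM
        if ll > best_ll:
            best_model, best_ll = m, ll
    print(f"best log-likelihood over restarts: {best_ll:.1f}")
    ```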

  5. HMM Logos for visualization of protein families

    Directory of Open Access Journals (Sweden)

    Schultz Jörg

    2004-01-01

    Abstract Background Profile Hidden Markov Models (pHMMs) are a widely used tool for protein family research. Up to now, however, there has been no method to visualize all of their central aspects graphically in an intuitively understandable way. Results We present a visualization method that incorporates both emission and transition probabilities of the pHMM, thus extending the sequence logos introduced by Schneider and Stephens. For each emitting state of the pHMM, we display a stack of letters. The stack height is determined by the deviation of the position's letter emission frequencies from the background frequencies. The stack width visualizes both the probability of reaching the state (the hitting probability) and the expected number of letters the state emits during a pass through the model (the state's expected contribution). A web interface offering online creation of HMM Logos and the corresponding source code can be found at the Logos web server of the Max Planck Institute for Molecular Genetics http://logos.molgen.mpg.de. Conclusions We demonstrate that HMM Logos can be a useful tool for the biologist: we use them to highlight differences between two homologous subfamilies of GTPases, Rab and Ras, and we show that they are able to indicate structural elements of Ras.

  6. HHblits: lightning-fast iterative protein sequence searching by HMM-HMM alignment.

    Science.gov (United States)

    Remmert, Michael; Biegert, Andreas; Hauser, Andreas; Söding, Johannes

    2011-12-25

    Sequence-based protein function and structure prediction depends crucially on sequence-search sensitivity and accuracy of the resulting sequence alignments. We present an open-source, general-purpose tool that represents both query and database sequences by profile hidden Markov models (HMMs): 'HMM-HMM-based lightning-fast iterative sequence search' (HHblits; http://toolkit.genzentrum.lmu.de/hhblits/). Compared to the sequence-search tool PSI-BLAST, HHblits is faster owing to its discretized-profile prefilter, has 50-100% higher sensitivity and generates more accurate alignments.

  7. Learning Pullback HMM Distances.

    Science.gov (United States)

    Cuzzolin, Fabio; Sapienza, Michael

    2014-07-01

    Recent work in action recognition has exposed the limitations of methods which directly classify local features extracted from spatio-temporal video volumes. In contrast, encoding the actions' dynamics via generative dynamical models has a number of attractive features; however, using all-purpose distances for their classification does not necessarily deliver good results. We propose a general framework for learning distance functions for generative dynamical models, given a training set of labelled videos. The optimal distance function is selected among a family of pullback ones, induced by a parametrised automorphism of the space of models. We focus here on hidden Markov models and their model space, and design an appropriate automorphism there. Experimental results are presented which show how pullback learning greatly improves action recognition performances with respect to base distances.

  8. BW Trained HMM based Aerial Image Segmentation

    OpenAIRE

    R Rajasree; J. Nalini; S C Ramesh

    2011-01-01

    Image segmentation is an essential preprocessing step in complicated, composite image processing algorithms. In segmenting aerial images, the cost of misclassification may depend on the true class. In this paper, motivated by recent advances in machine learning theory, I introduce a method to minimize the misclassification cost with class-dependent costs. The procedure assumes the hidden Markov model (HMM), which has been popularly used...

  9. Efficient Blind System Identification of Non-Gaussian Auto-Regressive Models with HMM Modeling of the Excitation

    DEFF Research Database (Denmark)

    Li, Chunjian; Andersen, Søren Vang

    2007-01-01

    We propose two blind system identification methods that exploit the underlying dynamics of non-Gaussian signals. The two signal models to be identified are: an Auto-Regressive (AR) model driven by a discrete-state Hidden Markov process, and the same model whose output is perturbed by white Gaussian noise...

  10. Duration-Distribution-Based HMM for Speech Recognition

    Institute of Scientific and Technical Information of China (English)

    WANG Zuo-ying; XIAO Xi

    2006-01-01

    To overcome the defects of duration modeling in the homogeneous Hidden Markov Model (HMM) for speech recognition, a duration-distribution-based HMM (DDBHMM) is proposed in this paper, based on a formalized definition of a left-to-right inhomogeneous Markov model. It has been demonstrated that it can be identically defined by either the state duration or the state transition probability. Speaker-independent continuous speech recognition experiments show that by only modeling the state duration in DDBHMM, a significant improvement (17.8% error rate reduction) can be achieved compared with the classical HMM. The ideal properties of DDBHMM hold promise for many aspects of speech modeling, such as the modeling of state duration, speed variation, speech discontinuity, and interframe correlation.

  11. HMM-FRAME: accurate protein domain classification for metagenomic sequences containing frameshift errors

    Directory of Open Access Journals (Sweden)

    Sun Yanni

    2011-05-01

    Abstract Background Protein domain classification is an important step in metagenomic annotation. The state-of-the-art method for protein domain classification is profile HMM-based alignment. However, the relatively high rates of insertions and deletions in homopolymer regions of pyrosequencing reads create frameshifts, causing conventional profile HMM alignment tools to generate alignments with marginal scores. This makes error-containing gene fragments unclassifiable with conventional tools. Thus, there is a need for an accurate domain classification tool that can detect and correct sequencing errors. Results We introduce HMM-FRAME, a protein domain classification tool based on an augmented Viterbi algorithm that can incorporate error models from different sequencing platforms. HMM-FRAME corrects sequencing errors and classifies putative gene fragments into domain families. It achieved high error detection sensitivity and specificity in a data set with annotated errors. We applied HMM-FRAME to Targeted Metagenomics and a published metagenomic data set. The results showed that our tool can correct frameshifts in error-containing sequences, generate much longer alignments with significantly smaller E-values, and classify more sequences into their native families. Conclusions HMM-FRAME provides a complementary protein domain classification tool to conventional profile HMM-based methods for data sets containing frameshifts. Its current implementation is best used for small-scale metagenomic data sets. The source code of HMM-FRAME can be downloaded at http://www.cse.msu.edu/~zhangy72/hmmframe/ and at https://sourceforge.net/projects/hmm-frame/.

  12. Subspace Distribution Clustering HMM for Chinese Digit Speech Recognition

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    As a kind of statistical method, the technique of the Hidden Markov Model (HMM) is widely used for speech recognition. In order to train the HMM to be more effective with a much smaller amount of data, the Subspace Distribution Clustering Hidden Markov Model (SDCHMM), derived from the Continuous Density Hidden Markov Model (CDHMM), is introduced. With parameter tying, a new method to train SDCHMMs is described. Compared with the conventional training method, an SDCHMM recognizer trained by means of the new method achieves higher accuracy and speed. Experimental results show that the SDCHMM recognizer outperforms the CDHMM recognizer on speech recognition of Chinese digits.

  13. An HMM-Like Dynamic Time Warping Scheme for Automatic Speech Recognition

    Directory of Open Access Journals (Sweden)

    Ing-Jr Ding

    2014-01-01

    In the past, the kernel of automatic speech recognition (ASR) was dynamic time warping (DTW), which is feature-based template matching and belongs to the category of dynamic programming (DP) techniques. Although DTW is an early ASR technique, it remains popular in many applications and plays an important role in the well-known Kinect-based gesture recognition application. This paper proposes an intelligent speech recognition system using an improved DTW approach for multimedia and home automation services. The improved DTW presented in this work, called HMM-like DTW, is essentially a hidden Markov model (HMM)-like method in which the concept of the typical HMM statistical model is brought into the design of DTW. The developed HMM-like DTW method, transforming feature-based DTW recognition into model-based DTW recognition, is able to behave like the HMM recognition technique, and therefore the proposed HMM-like DTW with the HMM-like recognition model has the capability to further perform model adaptation (also known as speaker adaptation). A series of experimental results in home automation-based multimedia access service environments demonstrate the superiority and effectiveness of the developed smart speech recognition system based on HMM-like DTW.
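
    For reference, a classic DTW distance between two feature sequences, which is the template-matching baseline this paper starts from; the HMM-like extensions and adaptation scheme are not shown, and the feature dimensions below are placeholders.

    ```python
    # Classic dynamic time warping distance between two feature sequences.
    import numpy as np

    def dtw_distance(a, b):
        """a, b: arrays of shape (T, d); returns the DTW alignment cost."""
        n, m = len(a), len(b)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = np.linalg.norm(a[i - 1] - b[j - 1])       # local frame distance
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[n, m]

    template = np.random.rand(40, 13)   # e.g. MFCC frames of a stored word
    query = np.random.rand(55, 13)      # frames of an incoming utterance
    print(dtw_distance(template, query))
    ```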

  14. Optimization of HMM Parameters Based on Chaos and Genetic Algorithm for Hand Gesture Recognition

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    In order to prevent the standard genetic algorithm (SGA) from premature convergence, chaos is introduced into the GA, thus forming a chaotic anneal genetic algorithm (CAGA). The ergodicity of chaos is used to initialize the population, and a chaotic anneal mutation operator is used as a substitute for the mutation operator in the SGA. CAGA is a unified framework for the existing chaotic mutation methods. To validate the proposed algorithm, three algorithms, i.e. Baum-Welch, SGA and CAGA, are compared on training hidden Markov models (HMM) to recognize hand gestures. Experiments on twenty-six alphabetical gestures show CAGA's validity.

  15. Linear Predictive Coding (LPC) Method with Hidden Markov Model (HMM) Classification for Arabic Words Spoken by Indonesian Speakers

    Directory of Open Access Journals (Sweden)

    Ririn Kusumawati

    2016-05-01

    In the classification stage, a Hidden Markov Model is used to analyze the voice signal and search for the maximum possible value that can be recognized. The parameters obtained from the modeling are used for comparison with the speech of the Arabic speakers. The classification test results show that Hidden Markov Models with Linear Predictive Coding feature extraction achieve an average accuracy of 78.6% for test data at a sampling frequency of 8000 Hz, 80.2% for test data at 22050 Hz, and 79% for test data at 44100 Hz.
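
    A hedged sketch of the LPC feature-extraction step described above, using librosa. The frame length, hop size, LPC order, sampling rate and file name are assumptions, not the paper's exact settings.

    ```python
    # Sketch: frame-wise LPC coefficients as the observation sequence for an HMM.
    import librosa
    import numpy as np

    def lpc_features(path, order=12, frame_len=400, hop=160):
        y, sr = librosa.load(path, sr=8000)                        # one of the tested sampling rates
        frames = librosa.util.frame(y, frame_length=frame_len, hop_length=hop).T
        feats = [librosa.lpc(np.ascontiguousarray(f), order=order)[1:]  # drop the leading 1
                 for f in frames]
        return np.vstack(feats)                                    # (n_frames, order)

    print(lpc_features("arabic_word.wav").shape)                   # hypothetical recording
    ```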

  16. Effect of HMM Glutenin Subunits on Wheat Quality Attributes

    Directory of Open Access Journals (Sweden)

    Daniela Horvat

    2009-01-01

    Glutenin is a group of polymeric gluten proteins. Glutenin molecules consist of glutenin subunits linked together with disulphide bonds and having higher (HMM-GS) and lower (LMM-GS) molecular mass. The main objective of this study is the evaluation of the influence of HMM-GS on flour processing properties. Seven bread wheat genotypes with contrasting quality attributes and different HMM-GS composition were analyzed over three years. The composition and quantity of HMM-GS were determined by SDS-PAGE and RP-HPLC, respectively. The quality diversity among genotypes was estimated by the analysis of wheat grain, flour and bread quality parameters. The presence of HMM glutenin subunits 1 and 2* at the Glu-A1 locus and subunits 5+10 at the Glu-D1 locus, as well as a higher proportion of total HMM-GS, had a positive effect on wheat quality. Cluster analysis of the three groups of data (genotype and HMM-GS, flour and bread quality, and dough rheology) yielded the same hierarchical structure for the first three levels, and the similarity of the corresponding dendrograms was proved by the principal eigenvalues of the corresponding Euclidean distance matrices. The obtained similarity in classification based on essentially different types of measurements reflects a strong natural association between genetic data, product quality and physical properties. Principal component analysis (PCA) was applied to effectively reduce the large data set into a lower dimension of latent variables amenable to analysis. PCA of the total set of data (15 variables) revealed a very strong interrelationship between the variables. The first three PCA components accounted for 96% of the total variance, which was significant at the 0.05 level, considered as the level of experimental error. These data imply that the quality of wheat cultivars can be related to HMM-GS data, which should be taken into account in breeding programs assisted by computer models with the aim to...

  17. Appropriate baseline values for HMM-based speech recognition

    CSIR Research Space (South Africa)

    Barnard, E

    2004-11-01

    A number of issues related to the development of speech-recognition systems with Hidden Markov Models (HMM) are discussed. A set of systematic experiments using the HTK toolkit and the TIMIT database are used to elucidate matters such as the number...

  18. Deciding of HMM parameters based on number of critical points for gesture recognition from motion capture data

    CERN Document Server

    Cholewa, Michał

    2011-01-01

    This paper presents a method for choosing the number of states of an HMM based on the number of critical points of the motion capture data. The choice of Hidden Markov Model (HMM) parameters is crucial for the recognizer's performance, as it is the first step of training and cannot be corrected automatically within the HMM. In this article we define a predictor of the number of states based on the number of critical points of the sequence and test its effectiveness against sample data.
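
    A small numpy sketch of the kind of predictor described: count local extrema (critical points) in the motion-capture channels and derive an HMM state count from that number. The scaling factor and the mocap data are assumptions; the exact mapping used in the paper may differ.

    ```python
    # Sketch: predict the number of HMM states from critical points of mocap channels.
    import numpy as np

    def count_critical_points(signal):
        d = np.diff(signal)
        return int(np.sum(np.sign(d[:-1]) != np.sign(d[1:])))   # sign changes of the slope

    def predict_n_states(mocap, scale=0.5, min_states=2):
        """mocap: array (T, channels); average critical points per channel."""
        avg = np.mean([count_critical_points(mocap[:, c]) for c in range(mocap.shape[1])])
        return max(min_states, int(round(scale * avg)))

    mocap = np.cumsum(np.random.default_rng(2).normal(size=(200, 6)), axis=0)  # fake joint angles
    print(predict_n_states(mocap))
    ```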

  19. HMM based Korean Named Entity Recognition

    Directory of Open Access Journals (Sweden)

    Yi-Gyu Hwang

    2003-02-01

    In this paper, we present a named entity recognition model for the Korean language. Named entity recognition is an essential and important process in Question Answering and Information Extraction systems. This paper proposes an HMM-based named entity recognition method using compound word construction principles. In Korean, over 60% of named entities (NEs) are compound words. A compound word may consist of a proper noun, common noun, or bound noun, etc. There is an inter-contextual relationship among the nouns which constitute an NE, and the NE and its surrounding words also have a contextual relationship. To account for these relationships, we classified nouns into 4 word classes (Independent Entity, Constituent Entity, Adjacent Entity, Not an Entity). With this classification, our system obtains contextual and lexical information by a stochastic machine learning method from NE-labeled training data. Experimental results show that this approach performs better than a rule-based approach for Korean named-entity recognition.

  20. Multiple instance learning for hidden Markov models: application to landmine detection

    Science.gov (United States)

    Bolton, Jeremy; Yuksel, Seniha Esen; Gader, Paul

    2013-06-01

    Multiple instance learning is a recently researched learning paradigm in machine intelligence which operates under conditions of uncertainty. A Multiple Instance Hidden Markov Model (MI-HMM) is investigated with applications to landmine detection using ground penetrating radar data. Without introducing any additional parameters, the MI-HMM provides an elegant and simple way to learn the parameters of an HMM in a multiple instance framework. The efficacy of the model is shown on a real landmine dataset. Experiments on the landmine dataset show that MI-HMM learning is effective.

  1. DWT BASED HMM FOR FACE RECOGNITION

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    A novel Discrete Wavelet Transform (DWT) based Hidden Markov Model (HMM) for face recognition is presented in this letter. To improve the accuracy of the HMM-based face recognition algorithm, DWT is used to replace the Discrete Cosine Transform (DCT) for observation sequence extraction. Extensive experiments are conducted on two public databases and the results show that the proposed method improves accuracy significantly, especially when the face database is large and only a few training images are available.
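
    A hedged sketch of the feature-extraction step: a 2-D DWT (PyWavelets) of horizontal strips of a face image yields the observation sequence for the HMM, in place of DCT coefficients. The strip size, wavelet, and coefficient selection are assumptions; the letter's exact scheme may differ.

    ```python
    # Sketch: DWT-based observation sequence extraction from a face image.
    import numpy as np
    import pywt

    def dwt_observations(face, strip_height=8, n_coeffs=16):
        obs = []
        for top in range(0, face.shape[0] - strip_height + 1, strip_height // 2):
            strip = face[top:top + strip_height, :]
            cA, (cH, cV, cD) = pywt.dwt2(strip, "haar")      # one-level 2-D DWT of the strip
            obs.append(cA.ravel()[:n_coeffs])                 # keep low-frequency coefficients
        return np.vstack(obs)                                 # (n_strips, n_coeffs)

    face = np.random.rand(112, 92)                             # ORL-sized grayscale placeholder
    print(dwt_observations(face).shape)
    ```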

  2. HMM FEATURE MODELLING AND IMMUNE CLONE OPTIMAL CLASSIFICATION FOR ABNORMAL TRAFFIC EVENTS

    Institute of Scientific and Technical Information of China (English)

    杜佳颖; 隋强强

    2011-01-01

    Aiming at the indirect correlation between the observation data and the states to be discriminated in traffic video event description, the relative velocity, distance and position of vehicles on the freeway are selected as the observed physical quantities, and the degree of risk of an abnormal event is defined as the detection state; a hidden Markov learning algorithm is employed to establish an event feature description model (HMM). Since the HMM event classification interface is nonlinear, an immune clone concentration clustering algorithm (ICCCA) is used for the optimised classification of normal/abnormal events, which overcomes the limitation of classifying directly with an HMM threshold, as well as the weaknesses of traditional classification algorithms, such as requiring many constraints and easily falling into local optima; it can therefore obtain the globally optimal classification result accurately and quickly. Thirty episodes of normal video and seventy episodes of vehicle-crash video were used for testing, and the ROC curves and computational complexities of vehicle-crash detection with the HMM threshold, a neural network algorithm and an SVM algorithm were compared. The results show that the method proposed in this paper performs better than the other classification algorithms in detecting vehicle crashes on the freeway.

  3. Ensemble learning HMM for motion recognition and retrieval by Isomap dimension reduction

    Institute of Scientific and Technical Information of China (English)

    XIANG Jian; WENG Jian-guang; ZHUANG Yue-ting; WU Fei

    2006-01-01

    Along with the development of motion capture techniques, more and more 3D motion databases have become available. In this paper, a novel approach is presented for motion recognition and retrieval based on ensemble HMM (hidden Markov model) learning. Due to the high dimensionality of motion features, Isomap nonlinear dimension reduction is applied to the training data for ensemble HMM learning. For handling new motion data, Isomap is generalized based on the estimation of the underlying eigenfunctions. Each action class is then learned with one HMM. Since ensemble learning can effectively enhance supervised learning, ensembles of weak HMM learners are built. Experimental results show that the approach is effective for motion data recognition and retrieval.
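
    A minimal sketch of the described preprocessing: reduce high-dimensional motion frames with Isomap, then train one HMM per action class on the embedded data. The frames, clip lengths and labels are random placeholders; the ensemble of weak HMM learners and the out-of-sample Isomap extension are not shown.

    ```python
    # Sketch: Isomap dimension reduction followed by one HMM per action class.
    import numpy as np
    from sklearn.manifold import Isomap
    from hmmlearn.hmm import GaussianHMM

    rng = np.random.default_rng(3)
    frames = rng.normal(size=(300, 62))           # e.g. 62-D joint-angle frames
    lengths = [100, 100, 100]                     # three motion clips
    labels = ["walk", "run", "jump"]

    embedding = Isomap(n_neighbors=10, n_components=5)
    low_dim = embedding.fit_transform(frames)     # nonlinear dimension reduction

    models, start = {}, 0
    for label, T in zip(labels, lengths):
        clip = low_dim[start:start + T]
        models[label] = GaussianHMM(n_components=4, covariance_type="diag", n_iter=30).fit(clip)
        start += T
    print(sorted(models))
    ```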

  4. HMM based automated wheelchair navigation using EOG traces in EEG

    Science.gov (United States)

    Aziz, Fayeem; Arof, Hamzah; Mokhtar, Norrima; Mubin, Marizan

    2014-10-01

    This paper presents a wheelchair navigation system based on a hidden Markov model (HMM), which we developed to assist those with restricted mobility. The semi-autonomous system is equipped with obstacle/collision avoidance sensors and it takes the electrooculography (EOG) signal traces from the user as commands to maneuver the wheelchair. The EOG traces originate from eyeball and eyelid movements and they are embedded in EEG signals collected from the scalp of the user at three different locations. Features extracted from the EOG traces are used to determine whether the eyes are open or closed, and whether the eyes are gazing to the right, center, or left. These features are utilized as inputs to a few support vector machine (SVM) classifiers, whose outputs are regarded as observations to an HMM. The HMM determines the state of the system and generates commands for navigating the wheelchair accordingly. The use of simple features and the implementation of a sliding window that captures important signatures in the EOG traces result in a fast execution time and high classification rates. The wheelchair is equipped with a proximity sensor and it can move forward and backward in three directions. The asynchronous system achieved an average classification rate of 98% when tested with online data while its average execution time was less than 1 s. It was also tested in a navigation experiment where all of the participants managed to complete the tasks successfully without collisions.

  5. Computer-Aided Modeling Framework

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    ... development and application. The proposed work is a part of the project for development of methods and tools that will allow systematic generation, analysis and solution of models for various objectives. It will use the computer-aided modeling framework that is based on a modeling methodology, which combines... In this contribution, the concept of template-based modeling is presented and application is highlighted for the specific case of catalytic membrane fixed bed models. The modeling template is integrated in a generic computer-aided modeling framework. Furthermore, modeling templates enable the idea of model reuse... are generated through the template in ICAS-MoT and translated into a model object. Once in ICAS-MoT, the model is numerically analyzed, solved and identified. A computer-aided modeling framework integrating systematic model derivation and development tools has been developed. It includes features for model...

  6. Dicyanometallates as Model Extended Frameworks

    Science.gov (United States)

    2016-01-01

    We report the structures of eight new dicyanometallate frameworks containing molecular extra-framework cations. These systems include a number of hybrid inorganic–organic analogues of conventional ceramics, such as Ruddlesden–Popper phases and perovskites. The structure types adopted are rationalized in the broader context of all known dicyanometallate framework structures. We show that the structural diversity of this family can be understood in terms of (i) the charge and coordination preferences of the particular metal cation acting as framework node, and (ii) the size, shape, and extent of incorporation of extra-framework cations. In this way, we suggest that dicyanometallates form a particularly attractive model family of extended frameworks in which to explore the interplay between molecular degrees of freedom, framework topology, and supramolecular interactions. PMID:27057759

  7. Geologic Framework Model (GFM2000)

    Energy Technology Data Exchange (ETDEWEB)

    T. Vogt

    2004-08-26

    The purpose of this report is to document the geologic framework model, version GFM2000 with regard to input data, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, and the differences between GFM2000 and previous versions. The version number of this model reflects the year during which the model was constructed. This model supersedes the previous model version, documented in Geologic Framework Model (GFM 3.1) (CRWMS M&O 2000 [DIRS 138860]). The geologic framework model represents a three-dimensional interpretation of the geology surrounding the location of the monitored geologic repository for spent nuclear fuel and high-level radioactive waste at Yucca Mountain. The geologic framework model encompasses and is limited to an area of 65 square miles (168 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the geologic framework model (shown in Figure 1-1) were chosen to encompass the exploratory boreholes and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The upper surface of the model is made up of the surface topography and the depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The geologic framework model was constructed from geologic map and borehole data. Additional information from measured stratigraphic sections, gravity profiles, and seismic profiles was also considered. The intended use of the geologic framework model is to provide a geologic framework over the area of interest consistent with the level of detail needed for hydrologic flow and radionuclide transport modeling through the UZ and for repository design. The model is limited by the availability of data and relative amount of geologic complexity found in an area. The geologic framework model is inherently limited by scale and content. The grid spacing used in the

  8. Computer-Aided Modeling Framework

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    As the required models may be complex and span multiple time and/or length scales, their development and application for product-process design is not trivial. Therefore, a systematic modeling framework can contribute by significantly reducing the time and resources needed for model development and application. The proposed work is part of a project for the development of methods and tools that allow systematic generation, analysis and solution of models for various objectives. It uses a computer-aided modeling framework that is based on a modeling methodology, which combines ... In this contribution, the concept of template-based modeling is presented and its application is highlighted for the specific case of catalytic membrane fixed bed models. The modeling template is integrated in a generic computer-aided modeling framework. Furthermore, modeling templates enable the idea of model reuse ...

  9. A Strategy for Selecting Classes of Symbols from Classes of Graphemes in HMM-Based Handwritten Word Recognition

    Directory of Open Access Journals (Sweden)

    Cinthia O. A. Freitas

    2004-06-01

    Full Text Available This paper presents a new strategy for selecting classes of symbols from classes of graphemes in HMM-based handwritten word recognition of Brazilian legal amounts. The paper discusses features, graphemes and symbols, as our baseline system is based on a global approach in which the explicit segmentation of words into letters or pseudo-letters is avoided and HMM models are used. In this framework, the input data are the symbols of an alphabet based on graphemes extracted from the word images and visible to the Hidden Markov Model. The idea is to introduce high-level concepts, such as perceptual features (loops, ascenders, descenders, concavities and convexities), and to provide fast and informative feedback about the information contained in each class of grapheme for symbol class selection. The paper presents an algorithm based on Mutual Information and HMM working in the same evaluation process. Finally, the experimental results demonstrate that it is possible to select from the “original” grapheme set (composed of 94 graphemes) an alphabet of symbols (composed of 29 symbols). We conclude that the discriminating power of the grapheme is very important for consolidating an alphabet of symbols.
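
    As a schematic illustration of the mutual-information part of such a selection procedure (not the paper's joint MI/HMM algorithm), one can rank grapheme classes by the mutual information between their occurrence and the word class using scikit-learn; all labels below are synthetic:

      # Sketch: rank grapheme classes by the mutual information between their
      # occurrence and the word class, keeping only the most informative ones.
      # Counts and thresholds are illustrative.
      import numpy as np
      from sklearn.metrics import mutual_info_score

      rng = np.random.default_rng(2)
      n_samples, n_graphemes, n_words = 5000, 94, 30

      word_class = rng.integers(0, n_words, size=n_samples)
      # Binary presence of each grapheme in each word image (synthetic, partly
      # correlated with the word class so some graphemes carry information).
      presence = rng.random((n_samples, n_graphemes)) < 0.2
      presence[:, :10] |= (word_class[:, None] % 10 == np.arange(10)[None, :])

      mi = np.array([mutual_info_score(word_class, presence[:, g])
                     for g in range(n_graphemes)])

      keep = np.argsort(mi)[::-1][:29]          # retain a 29-symbol alphabet
      print("selected grapheme indices:", sorted(keep.tolist()))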

  10. Hybrid model decomposition of speech and noise in a radial basis function neural model framework

    DEFF Research Database (Denmark)

    Sørensen, Helge Bjarup Dissing; Hartmann, Uwe

    1994-01-01

    The approach applied is based on a combination of the hidden Markov model (HMM) decomposition method for speech recognition in noise, developed by Varga and Moore (1990) from DRA, and the hybrid (HMM/RBF) recognizer containing hidden Markov models and radial basis function (RBF) neural networks, developed by Singer and Lippmann (1992) from MIT Lincoln Lab. The present authors modified the hybrid recognizer to fit into the decomposition method to achieve high-performance speech recognition in noisy environments. The approach has been denoted the hybrid model decomposition method, and it provides an optimal method for decomposition of speech and noise by using a set of speech pattern models and a noise model(s), each realized as an HMM/RBF pattern model.

  11. Environmental modeling framework invasiveness: analysis and implications

    Science.gov (United States)

    Environmental modeling frameworks support scientific model development by providing an Application Programming Interface (API) which model developers use to implement models. This paper presents results of an investigation on the framework invasiveness of environmental modeling frameworks. Invasiven...

  12. A Model for Rearchitecting Frameworks

    Directory of Open Access Journals (Sweden)

    Galal H. Galal-Edeen

    2009-07-01

    Full Text Available Software rearchitecting is the process of obtaining a documented architecture for an existing system. There are many software rearchitecting frameworks based on different concepts and context-related issues for a specific application or programming language, such as Rigi, Ciao, SPOOL, Symphony, and the Software Rearchitecting Action Framework (SRAF). Most of these frameworks focus on the reverse engineering of source code and neglect the role of stakeholders in enhancing and developing their systems. This paper presents a systematic analysis and comparative study of rearchitecting frameworks using generic architecture characteristics or elements. The comparative study is driven by the major requirements that a rearchitecting framework should satisfy. An efficient model is then proposed based on the trends that emerged from the comparative analysis; it takes into account the evaluation criteria applied to the compared frameworks. Conclusions and remarks are highlighted.

  13. A SPEAKER ADAPTABLE VERY LOW BIT RATE SPEECH CODER BASED ON HMM

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    This paper presents a speaker-adaptable very low bit rate speech coder based on HMM (Hidden Markov Model) that includes the dynamic features, i.e., the delta and delta-delta parameters of speech. The performance of this speech coder is improved by using dynamic features generated by an algorithm for speech parameter generation from HMM, because the generated speech parameter vectors reflect not only the means of the static and dynamic feature vectors but also their covariances. The encoder part is equivalent to an HMM-based phoneme recognizer and transmits phoneme indexes, state durations, pitch information and speaker characteristics adaptation vectors to the decoder. The decoder receives these messages and concatenates the phoneme HMM sequence according to the phoneme indexes. Then the decoder generates a sequence of mel-cepstral coefficient vectors using the HMM-based speech parameter generation technique. Finally, the decoder synthesizes speech by directly exciting the MLSA (Mel Log Spectrum Approximation) filter with the generated mel-cepstral coefficient vectors, according to the pitch information.
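
    The decoder side of such a coder can be caricatured as follows; this toy sketch only expands transmitted phoneme indexes and state durations into a trajectory of state mean vectors, and deliberately omits the ML parameter-generation step that uses the delta/delta-delta statistics and covariances, as well as pitch and the MLSA synthesis filter. All models and sizes are hypothetical:

      # Sketch of the decoder idea only: phoneme indexes and state durations are
      # expanded into a parameter trajectory by concatenating HMM state means.
      import numpy as np

      rng = np.random.default_rng(3)
      n_cep = 13  # mel-cepstral order (illustrative)

      # Hypothetical per-phoneme HMMs: 3 emitting states, each with a mean vector.
      phoneme_models = {p: rng.normal(size=(3, n_cep)) for p in ["a", "k", "s"]}

      def decode(phoneme_indexes, state_durations):
          """Concatenate state mean vectors according to the transmitted durations."""
          frames = []
          for phone, durations in zip(phoneme_indexes, state_durations):
              means = phoneme_models[phone]
              for state, dur in enumerate(durations):
                  frames.extend([means[state]] * dur)
          return np.vstack(frames)

      # Transmitted message: phoneme sequence plus per-state frame counts.
      traj = decode(["k", "a", "s"], [(3, 5, 2), (4, 8, 3), (2, 6, 4)])
      print("generated mel-cepstral trajectory shape:", traj.shape)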

  14. Evaluation of various feature extraction methods for landmine detection using hidden Markov models

    Science.gov (United States)

    Hamdi, Anis; Frigui, Hichem

    2012-06-01

    Hidden Markov Models (HMM) have proved to be effective for detecting buried land mines using data collected by a moving-vehicle-mounted ground penetrating radar (GPR). The general framework for a HMM-based landmine detector consists of building a HMM model for mine signatures and a HMM model for clutter signatures. A test alarm is assigned a confidence proportional to the probability of that alarm being generated by the mine model and inversely proportional to its probability in the clutter model. The HMM models are built based on features extracted from GPR training signatures. These features are expected to capture the salient properties of the 3-dimensional alarms in a compact representation. The baseline HMM framework for landmine detection is based on gradient features. It models the time varying behavior of GPR signals, encoded using edge direction information, to compute the likelihood that a sequence of measurements is consistent with a buried landmine. In particular, the HMM mine model learns the hyperbolic shape associated with the signature of a buried mine by three states that correspond to the succession of an increasing edge, a flat edge, and a decreasing edge. Recently, for the same application, other features have been used with different classifiers. In particular, the Edge Histogram Descriptor (EHD) has been used within a K-nearest neighbor classifier. Another descriptor is based on Gabor features and has been used within a discrete HMM classifier. A third feature, closely related to the EHD, is the Bar histogram feature. This feature has been used within a Neural Networks classifier for handwritten word recognition. In this paper, we propose an evaluation of the HMM based landmine detection framework with several feature extraction techniques. We adapt and evaluate the EHD, Gabor, Bar, and baseline gradient feature extraction methods. We compare the performance of these features using a large and diverse GPR data collection.
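
    The mine-versus-clutter confidence described above can be sketched with hmmlearn as follows, using synthetic feature sequences in place of the EHD/Gabor/bar/gradient features extracted from GPR signatures; state counts and data are illustrative only:

      # Sketch: confidence of a test alarm = log-likelihood under the mine HMM
      # minus log-likelihood under the clutter HMM.
      import numpy as np
      from hmmlearn.hmm import GaussianHMM

      rng = np.random.default_rng(4)

      def make_seqs(offset, n=30, length=15, dim=8):
          return [rng.normal(loc=offset, size=(length, dim)) for _ in range(n)]

      def fit_hmm(seqs, n_states=3):
          X = np.vstack(seqs)
          lengths = [len(s) for s in seqs]
          return GaussianHMM(n_components=n_states, covariance_type="diag",
                             n_iter=50).fit(X, lengths)

      mine_hmm = fit_hmm(make_seqs(offset=1.0))     # rising/flat/falling edge states
      clutter_hmm = fit_hmm(make_seqs(offset=0.0))

      def confidence(seq):
          return mine_hmm.score(seq) - clutter_hmm.score(seq)

      alarm = rng.normal(loc=1.0, size=(15, 8))
      print("alarm confidence:", confidence(alarm))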

  15. CMAQ Model Evaluation Framework

    Science.gov (United States)

    CMAQ is tested to establish the modeling system’s credibility in predicting pollutants such as ozone and particulate matter. Evaluation of CMAQ has been designed to assess the model’s performance for specific time periods and for specific uses.

  16. HHsenser: exhaustive transitive profile search using HMM-HMM comparison.

    Science.gov (United States)

    Söding, Johannes; Remmert, Michael; Biegert, Andreas; Lupas, Andrei N

    2006-07-01

    HHsenser is the first server to offer exhaustive intermediate profile searches, which it combines with pairwise comparison of hidden Markov models. Starting from a single protein sequence or a multiple alignment, it can iteratively explore whole superfamilies, producing few or no false positives. The output is a multiple alignment of all detected homologs. HHsenser's sensitivity should make it a useful tool for evolutionary studies. It may also aid applications that rely on diverse multiple sequence alignments as input, such as homology-based structure and function prediction, or the determination of functional residues by conservation scoring and functional subtyping. HHsenser can be accessed at http://hhsenser.tuebingen.mpg.de/. It has also been integrated into our structure and function prediction server HHpred (http://hhpred.tuebingen.mpg.de/) to improve predictions for near-singleton sequences.

  17. Application of HMM-SVM in Fault Diagnosis of Analog Circuits

    Institute of Scientific and Technical Information of China (English)

    刘任洋; 吴文全; 李超; 马龙

    2013-01-01

    Since the incipient soft faults of analog circuits are hard to identify well using only a Hidden Markov Model (HMM) or a Support Vector Machine (SVM), a new fault diagnosis method based on a combined HMM-SVM model is proposed. Firstly, the dimensionality of the experimental samples is reduced and a preliminary partition is obtained by Principal Component Analysis (PCA). Then, an HMM is used to calculate the matching degree between the test samples and each fault state, which forms the feature vectors for the SVM used in the final diagnosis. The results show that HMM-SVM is better than a single HMM or SVM model for incipient fault diagnosis, raising the average fault recognition rate to above 95%.
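
    A minimal sketch of the PCA-HMM-SVM pipeline with scikit-learn and hmmlearn, on synthetic data; the per-fault-state HMM log-likelihoods play the role of the matching-degree feature vector passed to the SVM, and all sizes are illustrative:

      # Sketch of the PCA -> HMM -> SVM pipeline: per-fault-state HMM log-likelihoods
      # form the feature vector that the SVM classifies.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.svm import SVC
      from hmmlearn.hmm import GaussianHMM

      rng = np.random.default_rng(5)
      n_faults, seqs_per_fault, seq_len, raw_dim = 4, 20, 25, 12

      raw = {f: [rng.normal(loc=f, size=(seq_len, raw_dim)) for _ in range(seqs_per_fault)]
             for f in range(n_faults)}

      # Step 1: PCA dimensionality reduction fitted on all training frames.
      pca = PCA(n_components=5).fit(np.vstack([s for ss in raw.values() for s in ss]))

      # Step 2: one HMM per fault state, trained on the reduced sequences.
      hmms = {}
      for f, seqs in raw.items():
          red = [pca.transform(s) for s in seqs]
          hmms[f] = GaussianHMM(n_components=3, covariance_type="diag",
                                n_iter=40).fit(np.vstack(red), [len(s) for s in red])

      def hmm_features(seq):
          """Feature vector = log-likelihood of the sequence under each fault HMM."""
          z = pca.transform(seq)
          return np.array([hmms[f].score(z) for f in range(n_faults)])

      # Step 3: SVM on the HMM matching-degree features.
      X = np.array([hmm_features(s) for ss in raw.values() for s in ss])
      y = np.repeat(np.arange(n_faults), seqs_per_fault)
      svm = SVC(kernel="rbf").fit(X, y)

      test = rng.normal(loc=2, size=(seq_len, raw_dim))
      print("diagnosed fault state:", svm.predict(hmm_features(test)[None, :])[0])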

  18. Application of Improved HMM Algorithm in Slag Detection System

    Institute of Scientific and Technical Information of China (English)

    TAN Da-peng; LI Pei-yu; PAN Xiao-hong

    2009-01-01

    To solve the problems of the ladle slag detection system (SDS), such as high cost, short service life, and inconvenient maintenance, a new SDS realization method based on a hidden Markov model (HMM) was put forward. The physical process of continuous casting was analyzed, and the vibration signal was chosen as the main detection signal, since molten steel and slag differ in density and therefore generate different shock vibrations. An automatic control experiment platform oriented to SDS was established, and the vibration sensor was installed far away from the molten steel, which addresses the problem of the sensor being easily worn out. The combination of vector quantization technology with the learning of HMM process parameters was optimized, and the re-estimation formula was revised to enhance recognition effectiveness. Industrial field experiments proved that this system requires low cost and little rebuilding of current devices, and its slag detection rate can exceed 95%.

  19. NEW HMM ALGORITHM FOR TOPOLOGY OPTIMIZATION

    Institute of Scientific and Technical Information of China (English)

    Zuo Kongtian; Zhao Yudong; Chen Liping; Zhong Yifang; Huang Yuying

    2005-01-01

    A new hybrid MMA-MGCMMA (HMM) algorithm for solving topology optimization problems is presented. This algorithm combines the method of moving asymptotes (MMA) algorithm and the modified globally convergent version of the method of moving asymptotes (MGCMMA) algorithm in the optimization process. The algorithm preserves the advantages of both MMA and MGCMMA. The optimizer is switched from MMA to MGCMMA automatically, depending on the numerical oscillation observed during the calculation. This algorithm can improve calculation efficiency and accelerate convergence compared with either the MMA or the MGCMMA algorithm alone, which is proven with an example.

  20. Explorations in the History of Machines and Mechanisms : Proceedings of HMM2012

    CERN Document Server

    Ceccarelli, Marco

    2012-01-01

    This book contains the proceedings of HMM2012, the 4th International Symposium on Historical Developments in the field of Mechanism and Machine Science (MMS). These proceedings cover recent research concerning all aspects of the development of MMS from antiquity until the present and its historiography: machines, mechanisms, kinematics, dynamics, concepts and theories, design methods, collections of methods, collections of models, institutions and biographies.

  1. Human Activity Recognition Based on Combined SVM & HMM

    Institute of Scientific and Technical Information of China (English)

    苏竑宇; 陈启安; 吴海涛

    2015-01-01

    Abstract: Being able to recognize human activities is essential for several intelligent applications, including personal assistive robotics and smart homes. In this paper, we perform recognition of human activity in daily living environments based on a combined SVM & HMM model. Firstly, we use an RGBD sensor (a Microsoft Kinect) as the input sensor and extract a set of fused features, including motion features, body structure features and joint polar coordinate features. Secondly, we propose a combined SVM & HMM model, which not only exploits the ability of the SVM to reflect differences among samples but also retains the ability of the HMM to deal with continuous activities. The two-level SVM & HMM model thus plays to the respective advantages of SVM and HMM and overcomes the drawbacks in accuracy, robustness and computational efficiency of a single SVM model or a traditional HMM model in human activity recognition. Experimental results show that the proposed algorithm possesses better robustness and discriminative power.

  2. jpHMM: recombination analysis in viruses with circular genomes such as the hepatitis B virus

    Science.gov (United States)

    Schultz, Anne-Kathrin; Bulla, Ingo; Abdou-Chekaraou, Mariama; Gordien, Emmanuel; Morgenstern, Burkhard; Zoaulim, Fabien; Dény, Paul; Stanke, Mario

    2012-01-01

    jpHMM is a very accurate and widely used tool for recombination detection in genomic sequences of HIV-1. Here, we present an extension of jpHMM to analyze recombinations in viruses with circular genomes such as the hepatitis B virus (HBV). Sequence analysis of circular genomes is usually performed on linearized sequences using linear models. Since linear models are unable to model dependencies between nucleotides at the 5′- and 3′-end of a sequence, this can result in inaccurate predictions of recombination breakpoints and thus in incorrect classification of viruses with circular genomes. The proposed circular jpHMM takes into account the circularity of the genome and is not biased against recombination breakpoints close to the 5′- or 3′-end of the linearized version of the circular genome. It can be applied automatically to any query sequence without assuming a specific origin for the sequence coordinates. We apply the method to genomic sequences of HBV and visualize its output in a circular form. jpHMM is available online at http://jphmm.gobics.de for download and as a web server for HIV-1 and HBV sequences. PMID:22600739

  3. Improved ASL based Gesture Recognition using HMM for System Application

    Directory of Open Access Journals (Sweden)

    Shalini Anand

    2014-03-01

    Full Text Available Gesture recognition is a growing field of research, and among the various forms of human-computer interaction, hand gesture recognition is very popular for interaction between humans and machines. It is a non-verbal way of communication, and this research area is full of innovative approaches. This project aims at recognizing 34 basic static hand gestures based on American Sign Language (ASL), including alphabets as well as numbers (0 to 9). Two alphabets, J and Z, are not considered, because the project aims at recognizing static hand gestures, whereas according to ASL these two are dynamic. The main features used are optimization of the database using a neural network and a Hidden Markov Model (HMM). The algorithm is based on shape-based features, keeping in mind that the shape of the human hand is the same for all human beings except in some situations.

  4. A Unified Framework for Systematic Model Improvement

    DEFF Research Database (Denmark)

    Kristensen, Niels Rode; Madsen, Henrik; Jørgensen, Sten Bay

    2003-01-01

    A unified framework for improving the quality of continuous time models of dynamic systems based on experimental data is presented. The framework is based on an interplay between stochastic differential equation (SDE) modelling, statistical tests and multivariate nonparametric regression...

  5. Hybrid Genetic Algorithm Based Optimization of Coupled HMM for Complex Interacting Processes Recognition

    Institute of Scientific and Technical Information of China (English)

    Liu Jianghua(刘江华); Chen Jiapin; Cheng Junshi

    2004-01-01

    Coupled Hidden Markov Model (CHMM) is an extension of the traditional HMM, mainly used for modeling complex interactive processes such as two-hand gestures. However, the problem of finding optimal model parameters is still of great interest to researchers in this area. This paper proposes a hybrid genetic algorithm (HGA) for CHMM training. Chaos is used to initialize the GA and as a mutation operator. Experiments on Chinese TaiChi gestures show that standard GA (SGA) based CHMM training is superior to Maximum Likelihood (ML) HMM training. The HGA approach has the highest recognition rate of 98.0769%, followed by 96.1538% for SGA, while the ML method achieves only 69.2308%.

  6. Crystallization Kinetics within a Generic Modelling Framework

    DEFF Research Database (Denmark)

    Meisler, Kresten Troelstrup; von Solms, Nicolas; Gernaey, Krist

    2013-01-01

    An existing generic modelling framework has been expanded with tools for kinetic model analysis. The analysis of kinetics is carried out within the framework, where kinetic constitutive models are collected, analysed and utilized for the simulation of crystallization operations. A modelling procedure is proposed to extract the information from kinetic model analysis of crystallization operations and utilize it for faster evaluation of crystallization operations.

  7. Geologic Framework Model Analysis Model Report

    Energy Technology Data Exchange (ETDEWEB)

    R. Clayton

    2000-12-19

    The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphic sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M&O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure 1), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM). The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and the

  8. Deriving Framework Usages Based on Behavioral Models

    Science.gov (United States)

    Zenmyo, Teruyoshi; Kobayashi, Takashi; Saeki, Motoshi

    One of the critical issues in framework-based software development is the huge introduction cost caused by the technical gap between developers and users of frameworks. This paper proposes a technique for deriving framework usages that implement a given requirements specification. By using the derived usages, users can use the frameworks without understanding them in detail. Requirements specifications that describe definite behavioral requirements cannot be related to frameworks as-is, since frameworks do not have a definite control structure; users must customize them to suit given requirements specifications. To cope with this issue, a new technique based on satisfiability problems (SAT) is employed to derive the control structures of the framework model. In the proposed technique, requirements specifications and frameworks are modeled as Labeled Transition Systems (LTSs) with branch conditions represented by predicates. Truth assignments of the branch conditions in the framework models are not given initially, representing the customizable control structure. The derivation of truth assignments of the branch conditions is regarded as a SAT problem by assuming relations between the termination states of the requirements specification model and those of the framework model. This derivation technique is incorporated into a technique we have proposed previously for relating actions of requirements specifications to those of frameworks. Furthermore, this paper discusses a case study of typical use cases in e-commerce systems.

  9. A Novel Approach to Detect Network Attacks Using G-HMM-Based Temporal Relations between Internet Protocol Packets

    Directory of Open Access Journals (Sweden)

    Han Kyusuk

    2011-01-01

    Full Text Available This paper introduces novel attack detection approaches for mobile and wireless device security and networks that consider temporal relations between internet packets. We first present a field selection technique using a Genetic Algorithm and generate a Packet-based Mining Association Rule (PMAR) from an original Mining Association Rule for a Support Vector Machine in a mobile and wireless network environment. Through preprocessing with PMAR, the SVM inputs can account for time variation between packets in the mobile and wireless network. We then present a Gaussian observation Hidden Markov Model (G-HMM) to exploit the hidden relationships between packets based on probabilistic estimation. In our G-HMM approach, we also apply G-HMM feature reduction for better initialization. We demonstrate the usefulness of our SVM and G-HMM approaches with GA on the MIT Lincoln Lab datasets and on a live dataset that we captured on a real mobile and wireless network. Moreover, the experimental results are verified by cross-validation tests.

  10. Frameworks for understanding and describing business models

    DEFF Research Database (Denmark)

    Nielsen, Christian; Roslender, Robin

    2014-01-01

    This chapter provides, in a chronological fashion, an introduction to six frameworks that one can apply to describing, understanding and also potentially innovating business models. These six frameworks have been chosen carefully as they represent six very different perspectives on business models and in this manner complement each other. There is a multitude of frameworks that could have been chosen, and we urge the reader to search out and trial these for themselves. The six chosen models (year of release in parentheses) are: • Service-Profit Chain (1994) • Strategic Systems Auditing (1997) • Strategy Maps (2001) • Intellectual Capital Statements (2003) • Chesbrough’s framework for Open Business Models (2006) • Business Model Canvas (2008)

  11. Comparison of HMM and DTW methods in automatic recognition of pathological phoneme pronunciation

    OpenAIRE

    Wielgat, Robert; Zielinski, Tomasz P.; Swietojanski, Pawel; Zoladz, Piotr; Król, Daniel; Wozniak, Tomasz; Grabias, Stanislaw

    2007-01-01

    In the paper, the recently proposed Human Factor Cepstral Coefficients (HFCC) are used for automatic recognition of pathological phoneme pronunciation in the speech of impaired children, and the efficiency of this approach is compared to the application of the standard Mel-Frequency Cepstral Coefficients (MFCC) as the feature vector. Both dynamic time warping (DTW), working on whole words or embedded phoneme patterns, and hidden Markov models (HMM) are used as classifiers in the presented research. Obtained resul...

  12. A UML profile for framework modeling

    Institute of Scientific and Technical Information of China (English)

    XU Xiao-liang(徐小良); WANG Le-yu(汪乐宇); ZHOU Hong(周泓)

    2004-01-01

    The current standard Unified Modeling Language (UML) cannot model framework flexibility and extendibility adequately due to the lack of appropriate constructs to distinguish framework hot-spots from kernel elements. A new UML profile that customizes UML for framework modeling was presented using the extension mechanisms of UML, providing a group of UML extensions to meet the needs of framework modeling. In this profile, extended class diagrams and sequence diagrams were defined to straightforwardly identify the hot-spots and describe their instantiation restrictions. A transformation model based on design patterns was also put forward, such that the profile-based framework design diagrams can be automatically mapped to the corresponding implementation diagrams. It was shown that the presented profile makes framework modeling more straightforward and therefore easier to understand and instantiate.

  13. A UML profile for framework modeling.

    Science.gov (United States)

    Xu, Xiao-liang; Wang, Le-yu; Zhou, Hong

    2004-01-01

    The current standard Unified Modeling Language (UML) cannot model framework flexibility and extendability adequately due to the lack of appropriate constructs to distinguish framework hot-spots from kernel elements. A new UML profile that customizes UML for framework modeling was presented using the extension mechanisms of UML, providing a group of UML extensions to meet the needs of framework modeling. In this profile, extended class diagrams and sequence diagrams were defined to straightforwardly identify the hot-spots and describe their instantiation restrictions. A transformation model based on design patterns was also put forward, such that the profile-based framework design diagrams can be automatically mapped to the corresponding implementation diagrams. It was shown that the presented profile makes framework modeling more straightforward and therefore easier to understand and instantiate.

  14. Estimating VDT Mental Fatigue Using Multichannel Linear Descriptors and KPCA-HMM

    Directory of Open Access Journals (Sweden)

    Yi Ouyang

    2008-04-01

    Full Text Available The impacts of prolonged visual display terminal (VDT) work on the central nervous system and the autonomic nervous system are observed and analyzed based on electroencephalogram (EEG) and heart rate variability (HRV). Power spectral indices of HRV, the P300 components based on a visual oddball task, and multichannel linear descriptors of EEG are combined to estimate the change in mental fatigue. The results show that long-term VDT work induces mental fatigue. The power spectra of HRV, the P300 components, and the multichannel linear descriptors of EEG are correlated with the mental fatigue level. Cognitive information processing slows down after long-term VDT work. Moreover, the multichannel linear descriptors of EEG can effectively reflect the changes of the θ, α, and β waves and may be used as indices of the mental fatigue level. Kernel principal component analysis (KPCA) and a hidden Markov model (HMM) are combined to differentiate two mental fatigue states. The investigation suggests that the joint KPCA-HMM method can effectively reduce the dimensionality of the feature vectors, accelerate classification, and improve the accuracy of mental fatigue estimation up to a maximum of 88%. Hence KPCA-HMM could be a promising model for the estimation of mental fatigue.
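
    The KPCA-HMM idea can be sketched as follows, with scikit-learn's KernelPCA and hmmlearn's GaussianHMM; a recording is assigned to the fatigue state whose HMM scores it highest. The features, kernel parameters and state counts are placeholders, not those of the study:

      # Sketch: kernel PCA reduces the EEG/HRV feature vectors, then one HMM per
      # mental-fatigue state is trained; a test recording goes to the best-scoring state.
      import numpy as np
      from sklearn.decomposition import KernelPCA
      from hmmlearn.hmm import GaussianHMM

      rng = np.random.default_rng(6)
      states = {"alert": 0.0, "fatigued": 1.5}
      seq_len, raw_dim, reduced_dim = 30, 20, 4

      train = {name: [rng.normal(loc=mu, size=(seq_len, raw_dim)) for _ in range(15)]
               for name, mu in states.items()}

      kpca = KernelPCA(n_components=reduced_dim, kernel="rbf", gamma=0.05)
      kpca.fit(np.vstack([s for ss in train.values() for s in ss]))

      models = {}
      for name, seqs in train.items():
          red = [kpca.transform(s) for s in seqs]
          models[name] = GaussianHMM(n_components=2, covariance_type="diag",
                                     n_iter=40).fit(np.vstack(red), [len(s) for s in red])

      def classify(seq):
          z = kpca.transform(seq)
          return max(models, key=lambda name: models[name].score(z))

      recording = rng.normal(loc=1.5, size=(seq_len, raw_dim))
      print("estimated state:", classify(recording))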

  15. Estimating VDT Mental Fatigue Using Multichannel Linear Descriptors and KPCA-HMM

    Science.gov (United States)

    Zhang, Chong; Zheng, Chongxun; Yu, Xiaolin; Ouyang, Yi

    2008-12-01

    The impacts of prolonged visual display terminal (VDT) work on the central nervous system and the autonomic nervous system are observed and analyzed based on electroencephalogram (EEG) and heart rate variability (HRV). Power spectral indices of HRV, the P300 components based on a visual oddball task, and multichannel linear descriptors of EEG are combined to estimate the change in mental fatigue. The results show that long-term VDT work induces mental fatigue. The power spectra of HRV, the P300 components, and the multichannel linear descriptors of EEG are correlated with the mental fatigue level. Cognitive information processing slows down after long-term VDT work. Moreover, the multichannel linear descriptors of EEG can effectively reflect the changes of the θ, α, and β waves and may be used as indices of the mental fatigue level. The kernel principal component analysis (KPCA) and hidden Markov model (HMM) are combined to differentiate two mental fatigue states. The investigation suggests that the joint KPCA-HMM method can effectively reduce the dimensionality of the feature vectors, accelerate classification, and improve the accuracy of mental fatigue estimation up to a maximum of 88%. Hence KPCA-HMM could be a promising model for the estimation of mental fatigue.

  16. An HMM posterior decoder for sequence feature prediction that includes homology information

    DEFF Research Database (Denmark)

    Käll, Lukas; Krogh, Anders Stærmose; Sonnhammer, Erik L. L.

    2005-01-01

    Motivation: When predicting sequence features like transmembrane topology, signal peptides, coil-coil structures, protein secondary structure or genes, extra support can be gained from homologs. Results: We present here a general hidden Markov model (HMM) decoding algorithm that combines probabilities for sequence features of homologs by considering the average of the posterior label probability of each position in a global sequence alignment. The algorithm is an extension of the previously described ‘optimal accuracy' decoder, allowing homology information to be used. It was benchmarked using an HMM for transmembrane topology and signal peptide prediction, Phobius. We found that the performance was substantially increased when incorporating information from homologs. Availability: A prediction server for transmembrane topology and signal peptides that uses the algorithm is available at http...
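
    The homology-averaging step can be illustrated with a few lines of NumPy: per-position posterior label probabilities (which the real decoder obtains from the HMM's forward-backward pass) are averaged over the aligned homologs before the best label is chosen. The matrices below are synthetic and gap handling is omitted:

      # Sketch: average per-position posterior label probabilities across aligned
      # homologs, then pick the best label per alignment column.
      import numpy as np

      rng = np.random.default_rng(7)
      labels = ["membrane", "inside", "outside"]
      align_len, n_homologs = 50, 5

      # Posterior label probabilities for each homolog, mapped onto the columns of
      # a global alignment (gap columns could be masked; omitted in this toy).
      posteriors = rng.dirichlet(np.ones(len(labels)), size=(n_homologs, align_len))

      avg = posteriors.mean(axis=0)                 # average over homologs
      prediction = [labels[i] for i in avg.argmax(axis=1)]
      print(prediction[:10])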

  17. Infinite hidden Markov models for unusual-event detection in video.

    Science.gov (United States)

    Pruteanu-Malinici, Iulian; Carin, Lawrence

    2008-05-01

    We address the problem of unusual-event detection in a video sequence. Invariant subspace analysis (ISA) is used to extract features from the video, and the time-evolving properties of these features are modeled via an infinite hidden Markov model (iHMM), which is trained using "normal"/"typical" video. The iHMM retains a full posterior density function on all model parameters, including the number of underlying HMM states. Anomalies (unusual events) are detected subsequently if a low likelihood is observed when associated sequential features are submitted to the trained iHMM. A hierarchical Dirichlet process framework is employed in the formulation of the iHMM. The evaluation of posterior distributions for the iHMM is achieved in two ways: via Markov chain Monte Carlo and using a variational Bayes formulation. Comparisons are made to modeling based on conventional maximum-likelihood-based HMMs, as well as to Dirichlet-process-based Gaussian-mixture models.
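
    The detection principle can be sketched with a conventional Gaussian HMM from hmmlearn standing in for the paper's infinite HMM: train on "normal" feature sequences and flag a new sequence as unusual when its per-frame log-likelihood falls below a threshold derived from the training scores. Data and the threshold rule are illustrative:

      # Sketch: likelihood-threshold anomaly detection with a plain Gaussian HMM
      # (a simplification of the paper's iHMM with full posterior inference).
      import numpy as np
      from hmmlearn.hmm import GaussianHMM

      rng = np.random.default_rng(8)
      normal = [rng.normal(size=(60, 10)) for _ in range(40)]   # ISA-like features

      hmm = GaussianHMM(n_components=5, covariance_type="diag", n_iter=50)
      hmm.fit(np.vstack(normal), [len(s) for s in normal])

      train_scores = np.array([hmm.score(s) / len(s) for s in normal])
      threshold = train_scores.mean() - 3 * train_scores.std()   # illustrative rule

      def is_unusual(seq):
          return hmm.score(seq) / len(seq) < threshold

      print("unusual?", is_unusual(rng.normal(loc=4.0, size=(60, 10))))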

  18. A framework for sustainable interorganizational business model

    OpenAIRE

    Neupane, Ganesh Prasad; Haugland, Sven A.

    2016-01-01

    Drawing on literature on business model innovations and sustainability, this paper develops a framework for sustainable interorganizational business models. The aim of the framework is to enhance the sustainability of firms’ business models by enabling firms to create future value by taking into account environmental, social and economic factors. The paper discusses two themes: (1) application of the term sustainability to business model innovation, and (2) implications of integrating sustain...

  19. Crystallization Kinetics within a Generic Modeling Framework

    DEFF Research Database (Denmark)

    Meisler, Kresten Troelstrup; von Solms, Nicolas; Gernaey, Krist V.

    2014-01-01

    A new and extended version of a generic modeling framework for analysis and design of crystallization operations is presented. The new features of this framework are described, with focus on development, implementation, identification, and analysis of crystallization kinetic models. Issues related to the modeling of various kinetic phenomena like nucleation, growth, agglomeration, and breakage are discussed in terms of model forms, model parameters, their availability and/or estimation, and their selection and application for specific crystallization operational scenarios under study. The advantages of employing a well-structured model library for storage, use/reuse, and analysis of the kinetic models are highlighted. Examples illustrating the application of the modeling framework for kinetic model discrimination related to simulation of specific crystallization scenarios and for kinetic model parameter...

  20. DATA DRIVEN DESIGN OF HMM TOPOLOGY FOR ON-LINE HANDWRITING RECOGNITION

    NARCIS (Netherlands)

    Lee, J.L.; Kim, J.; Kim, J.H.

    2004-01-01

    Although HMM is widely used for on-line handwriting recognition, there is no simple and well-established way of designing the HMM topology. We propose a data-driven systematic method to design HMM topology. Data samples in a single pattern class are structurally simplified into a sequence of straigh...

  1. MDM: A Mode Diagram Modeling Framework

    DEFF Research Database (Denmark)

    Wang, Zheng; Pu, Geguang; Li, Jianwen

    2012-01-01

    Although such systems are widely used in safety-critical embedded domains, there is a lack of domain-specific formal modelling languages for them in the relevant industry. To address this problem, we propose a formal visual modeling framework called mode diagram as a concise and precise way ... A model checking technique can then be used to verify the mode diagram models against desired properties. To demonstrate the viability of our approach, we have applied our modelling framework to some real-life case studies from industry and helped detect two design defects for some spacecraft control systems.

  2. Graphical Model Debugger Framework for Embedded Systems

    DEFF Research Database (Denmark)

    Zeng, Kebin

    2010-01-01

    Model Driven Software Development has offered a faster way to design and implement embedded real-time software by moving the design to a model level, and by transforming models to code. However, the testing of embedded systems has remained at the code level. This paper presents a Graphical Model Debugger Framework, providing an auxiliary avenue of analysis of system models at runtime by executing generated code and updating models synchronously, which allows embedded developers to focus on the model level. With the model debugger, embedded developers can graphically test their design model and check the running status of the system, which offers a debugging capability on a higher level of abstraction. The framework intends to contribute a tool to the Eclipse society, especially suitable for model-driven development of embedded systems.

  3. Graphical Model Debugger Framework for Embedded Systems

    DEFF Research Database (Denmark)

    Zeng, Kebin; Guo, Yu; Angelov, Christo K.

    2010-01-01

    Model Driven Software Development has offered a faster way to design and implement embedded real-time software by moving the design to a model level, and by transforming models to code. However, the testing of embedded systems has remained at the code level. This paper presents a Graphical Model Debugger Framework, providing an auxiliary avenue of analysis of system models at runtime by executing generated code and updating models synchronously, which allows embedded developers to focus on the model level. With the model debugger, embedded developers can graphically test their design model...

  4. A Framework for Modelling Software Requirements

    Directory of Open Access Journals (Sweden)

    Dhirendra Pandey

    2011-05-01

    Full Text Available Requirement engineering plays an important role in producing quality software products. In recent years, several requirement framework approaches have been designed to provide an end-to-end solution for the system development life cycle. Textual requirements specifications are difficult to learn, design, understand, review, and maintain, whereas pictorial modelling is widely recognized as an effective requirement analysis tool. In this paper, we present a requirement modelling framework together with an analysis of modern requirements modelling techniques. We also discuss various domains of requirement engineering with the help of modelling elements such as a semantic map of business concepts, lifecycles of business objects, business processes, business rules, a system context diagram, use cases and their scenarios, constraints, and user interface prototypes. The proposed framework is illustrated with the case study of an inventory management system.

  5. Key Frame Selection for One-Two Hand Gesture Recognition with HMM

    Directory of Open Access Journals (Sweden)

    Ketki P. Kshirsagar

    2015-06-01

    Full Text Available Sign language recognition is a popular research area involving computer vision, pattern recognition and image processing. It enhances the communication capabilities of mute persons. In this paper, an object-based key frame selection method is presented. The forward algorithm is used to assess shape similarity for one- and two-handed gesture recognition, with and without features, using the HMM method. The proposed approach uses a hidden Markov model with a key frame selection facility and gesture trajectory features for one- and two-hand gesture recognition. Experimental results demonstrate the effectiveness of the proposed scheme for recognizing one-handed American Sign Language and two-handed British Sign Language.
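
    For reference, the forward algorithm mentioned above can be written in a few lines of NumPy; it computes the likelihood of a discrete observation sequence under an HMM, which can then be compared across gesture models. The parameters below are illustrative only:

      # Sketch: scaled forward pass returning the log-likelihood of a discrete
      # observation sequence under a 2-state, 3-symbol HMM.
      import numpy as np

      start = np.array([0.6, 0.4])
      trans = np.array([[0.7, 0.3],
                        [0.4, 0.6]])
      emit = np.array([[0.5, 0.4, 0.1],     # P(symbol | state 0)
                       [0.1, 0.3, 0.6]])    # P(symbol | state 1)

      def forward_loglik(obs):
          """Log-likelihood of a discrete observation sequence (scaled forward pass)."""
          alpha = start * emit[:, obs[0]]
          loglik = np.log(alpha.sum())
          alpha /= alpha.sum()
          for o in obs[1:]:
              alpha = (alpha @ trans) * emit[:, o]
              loglik += np.log(alpha.sum())
              alpha /= alpha.sum()
          return loglik

      print(forward_loglik([0, 1, 2, 2, 1, 0]))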

  6. Knowledge Encapsulation Framework for Collaborative Social Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Cowell, Andrew J.; Gregory, Michelle L.; Marshall, Eric J.; McGrath, Liam R.

    2009-03-24

    This paper describes the Knowledge Encapsulation Framework (KEF), a suite of tools to enable knowledge inputs (relevant, domain-specific facts) to modeling and simulation projects, as well as other domains that require effective collaborative workspaces for knowledge-based tasks. This framework can be used to capture evidence (e.g., trusted material such as journal articles and government reports), discover new evidence (covering both trusted and social media), enable discussions surrounding domain-specific topics and provide automatically generated semantic annotations for improved corpus investigation. The current KEF implementation is presented within a wiki environment, providing a simple but powerful collaborative space for team members to review, annotate, discuss and align evidence with their modeling frameworks. The novelty in this approach lies in the combination of automatically tagged and user-vetted resources, which increases user trust in the environment, leading to ease of adoption for the collaborative environment.

  7. A framework of benchmarking land models

    Science.gov (United States)

    Luo, Y. Q.; Randerson, J.; Abramowitz, G.; Bacour, C.; Blyth, E.; Carvalhais, N.; Ciais, P.; Dalmonech, D.; Fisher, J.; Fisher, R.; Friedlingstein, P.; Hibbard, K.; Hoffman, F.; Huntzinger, D.; Jones, C. D.; Koven, C.; Lawrence, D.; Li, D. J.; Mahecha, M.; Niu, S. L.; Norby, R.; Piao, S. L.; Qi, X.; Peylin, P.; Prentice, I. C.; Riley, W.; Reichstein, M.; Schwalm, C.; Wang, Y. P.; Xia, J. Y.; Zaehle, S.; Zhou, X. H.

    2012-02-01

    Land models, which have been developed by the modeling community in the past two decades to predict future states of ecosystems and climate, have to be critically evaluated for their performance skills of simulating ecosystem responses and feedback to climate change. Benchmarking is an emerging procedure to measure and evaluate performance of models against a set of defined standards. This paper proposes a benchmarking framework for evaluation of land models. The framework includes (1) targeted aspects of model performance to be evaluated; (2) a set of benchmarks as defined references to test model performance; (3) metrics to measure and compare performance skills among models so as to identify model strengths and deficiencies; and (4) model improvement. Component 4 may or may not be involved in a benchmark analysis but is an ultimate goal of general modeling research. Land models are required to simulate exchange of water, energy, carbon and sometimes other trace gases between the atmosphere and the land-surface, and should be evaluated for their simulations of biophysical processes, biogeochemical cycles, and vegetation dynamics across timescales in response to both weather and climate change. Benchmarks that are used to evaluate models generally consist of direct observations, data-model products, and data-derived patterns and relationships. Metrics of measuring mismatches between models and benchmarks may include (1) a priori thresholds of acceptable model performance and (2) a scoring system to combine data-model mismatches for various processes at different temporal and spatial scales. The benchmark analyses should identify clues of weak model performance for future improvement. Iterations between model evaluation and improvement via benchmarking shall demonstrate progress of land modeling and help establish confidence in land models for their predictions of future states of ecosystems and climate.

  8. A framework of benchmarking land models

    Directory of Open Access Journals (Sweden)

    Y. Q. Luo

    2012-02-01

    Full Text Available Land models, which have been developed by the modeling community in the past two decades to predict future states of ecosystems and climate, have to be critically evaluated for their performance skills of simulating ecosystem responses and feedback to climate change. Benchmarking is an emerging procedure to measure and evaluate performance of models against a set of defined standards. This paper proposes a benchmarking framework for evaluation of land models. The framework includes (1) targeted aspects of model performance to be evaluated; (2) a set of benchmarks as defined references to test model performance; (3) metrics to measure and compare performance skills among models so as to identify model strengths and deficiencies; and (4) model improvement. Component 4 may or may not be involved in a benchmark analysis but is an ultimate goal of general modeling research. Land models are required to simulate exchange of water, energy, carbon and sometimes other trace gases between the atmosphere and the land-surface, and should be evaluated for their simulations of biophysical processes, biogeochemical cycles, and vegetation dynamics across timescales in response to both weather and climate change. Benchmarks that are used to evaluate models generally consist of direct observations, data-model products, and data-derived patterns and relationships. Metrics of measuring mismatches between models and benchmarks may include (1) a priori thresholds of acceptable model performance and (2) a scoring system to combine data-model mismatches for various processes at different temporal and spatial scales. The benchmark analyses should identify clues of weak model performance for future improvement. Iterations between model evaluation and improvement via benchmarking shall demonstrate progress of land modeling and help establish confidence in land models for their predictions of future states of ecosystems and climate.

  9. Cytoview: Development of a cell modelling framework

    Indian Academy of Sciences (India)

    Prashant Khodade; Samta Malhotra; Nirmal Kumar; M Sriram Iyengar; N Balakrishnan; Nagasuma Chandra

    2007-08-01

    The biological cell, a natural self-contained unit of prime biological importance, is an enormously complex machine that can be understood at many levels. A higher-level perspective of the entire cell requires integration of various features into coherent, biologically meaningful descriptions. There are some efforts to model cells based on their genome, proteome or metabolome descriptions. However, there are no established methods as yet to describe cell morphologies, capture similarities and differences between different cells or between healthy and disease states. Here we report a framework to model various aspects of a cell and integrate knowledge encoded at different levels of abstraction, with cell morphologies at one end to atomic structures at the other. The different issues that have been addressed are ontologies, feature description and model building. The framework describes dotted representations and tree data structures to integrate diverse pieces of data and parametric models enabling size, shape and location descriptions. The framework serves as a first step in integrating different levels of data available for a biological cell and has the potential to lead to development of computational models in our pursuit to model cell structure and function, from which several applications can flow out.

  10. An evaluation framework for participatory modelling

    Science.gov (United States)

    Krueger, T.; Inman, A.; Chilvers, J.

    2012-04-01

    Strong arguments for participatory modelling in hydrology can be made on substantive, instrumental and normative grounds. These arguments have led to increasingly diverse groups of stakeholders (here anyone affecting or affected by an issue) getting involved in hydrological research and the management of water resources. In fact, participation has become a requirement of many research grants, programs, plans and policies. However, evidence of beneficial outcomes of participation as suggested by the arguments is difficult to generate and therefore rare. This is because outcomes are diverse, distributed, often tacit, and take time to emerge. In this paper we develop an evaluation framework for participatory modelling focussed on learning outcomes. Learning encompasses many of the potential benefits of participation, such as better models through diversity of knowledge and scrutiny, stakeholder empowerment, greater trust in models and ownership of subsequent decisions, individual moral development, reflexivity, relationships, social capital, institutional change, resilience and sustainability. Based on the theories of experiential, transformative and social learning, complemented by practitioner experience our framework examines if, when and how learning has occurred. Special emphasis is placed on the role of models as learning catalysts. We map the distribution of learning between stakeholders, scientists (as a subgroup of stakeholders) and models. And we analyse what type of learning has occurred: instrumental learning (broadly cognitive enhancement) and/or communicative learning (change in interpreting meanings, intentions and values associated with actions and activities; group dynamics). We demonstrate how our framework can be translated into a questionnaire-based survey conducted with stakeholders and scientists at key stages of the participatory process, and show preliminary insights from applying the framework within a rural pollution management situation in

  11. Talking Cure Models: A Framework of Analysis

    Science.gov (United States)

    Marx, Christopher; Benecke, Cord; Gumz, Antje

    2017-01-01

    Psychotherapy is commonly described as a “talking cure,” a treatment method that operates through linguistic action and interaction. The operative specifics of therapeutic language use, however, are insufficiently understood, mainly due to a multitude of disparate approaches that advance different notions of what “talking” means and what “cure” implies in the respective context. Accordingly, a clarification of the basic theoretical structure of “talking cure models,” i.e., models that describe therapeutic processes with a focus on language use, is a desideratum of language-oriented psychotherapy research. Against this background the present paper suggests a theoretical framework of analysis which distinguishes four basic components of “talking cure models”: (1) a foundational theory (which suggests how linguistic activity can affect and transform human experience), (2) an experiential problem state (which defines the problem or pathology of the patient), (3) a curative linguistic activity (which defines linguistic activities that are supposed to effectuate a curative transformation of the experiential problem state), and (4) a change mechanism (which defines the processes and effects involved in such transformations). The purpose of the framework is to establish a terminological foundation that allows for systematically reconstructing basic properties and operative mechanisms of “talking cure models.” To demonstrate the applicability and utility of the framework, five distinct “talking cure models” which spell out the details of curative “talking” processes in terms of (1) catharsis, (2) symbolization, (3) narrative, (4) metaphor, and (5) neurocognitive inhibition are introduced and discussed in terms of the framework components. In summary, we hope that our framework will prove useful for the objective of clarifying the theoretical underpinnings of language-oriented psychotherapy research and help to establish a more comprehensive

  12. Estimation of Phoneme-Specific HMM Topologies for the Automatic Recognition of Dysarthric Speech

    Directory of Open Access Journals (Sweden)

    Santiago-Omar Caballero-Morales

    2013-01-01

    Full Text Available Dysarthria is a frequently occurring motor speech disorder which can be caused by neurological trauma, cerebral palsy, or degenerative neurological diseases. Because dysarthria affects phonation, articulation, and prosody, spoken communication of dysarthric speakers gets seriously restricted, affecting their quality of life and confidence. Assistive technology has led to the development of speech applications to improve the spoken communication of dysarthric speakers. In this field, this paper presents an approach to improve the accuracy of HMM-based speech recognition systems. Because phonatory dysfunction is a main characteristic of dysarthric speech, the phonemes of a dysarthric speaker are affected at different levels. Thus, the approach consists in finding the most suitable type of HMM topology (Bakis, Ergodic for each phoneme in the speaker’s phonetic repertoire. The topology is further refined with a suitable number of states and Gaussian mixture components for acoustic modelling. This represents a difference when compared with studies where a single topology is assumed for all phonemes. Finding the suitable parameters (topology and mixtures components is performed with a Genetic Algorithm (GA. Experiments with a well-known dysarthric speech database showed statistically significant improvements of the proposed approach when compared with the single topology approach, even for speakers with severe dysarthria.
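
    The per-phoneme search described above can be pictured as a genetic algorithm whose chromosome encodes, for every phoneme, a topology type, a state count and a mixture count. The following is only a minimal sketch of that idea; the fitness function is a hypothetical stand-in for actually training and scoring the phoneme HMMs, and the names and parameter ranges are assumptions rather than the paper's settings.

      import random

      TOPOLOGIES = ["bakis", "ergodic"]

      def random_gene():
          return {"topology": random.choice(TOPOLOGIES),
                  "states": random.randint(3, 7),
                  "mixtures": random.choice([1, 2, 4, 8])}

      def random_chromosome(phonemes):
          return {p: random_gene() for p in phonemes}

      def mutate(chromosome, rate=0.1):
          child = {p: dict(g) for p, g in chromosome.items()}
          for gene in child.values():
              if random.random() < rate:
                  gene.update(random_gene())
          return child

      def evaluate_recognition_accuracy(chromosome):
          # Hypothetical fitness: a real system would train one HMM per phoneme
          # with the encoded settings and return accuracy on a development set.
          return random.random()

      def ga_search(phonemes, population=20, generations=10):
          pop = [random_chromosome(phonemes) for _ in range(population)]
          for _ in range(generations):
              scored = sorted(pop, key=evaluate_recognition_accuracy, reverse=True)
              elite = scored[: population // 4]   # keep the best quarter
              pop = elite + [mutate(random.choice(elite))
                             for _ in range(population - len(elite))]
          return max(pop, key=evaluate_recognition_accuracy)

      best = ga_search(["a", "e", "i", "o", "u", "s", "t"])
      print(best["a"])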

  13. HMM based Offline Signature Verification system using Contourlet Transform and Textural features

    Directory of Open Access Journals (Sweden)

    K N PUSHPALATHA

    2014-07-01

    Full Text Available Handwritten signatures occupy a very special place in the identification of an individual, and verification is a challenging task because of the possible variations in directions and shapes of the constituent strokes of written samples. In this paper we investigate an offline verification system based on the fusion of the contourlet transform, directional features and a Hidden Markov Model (HMM) as classifier. The handwritten signature image is preprocessed for noise removal and a two-level contourlet transform is applied to obtain a feature vector; the same processing is applied to both the query and database signature images. The textural features are computed and concatenated with the contourlet transform coefficients to form the final feature vector. The classification results are computed using the HTK tool with the HMM classifier. The experimental results are computed using GPDS-960 database images to obtain parameters such as the False Rejection Rate (FRR), False Acceptance Rate (FAR) and Total Success Rate (TSR). The results show that the values of FRR and FAR are improved compared to the existing algorithm.

  14. Combining slanted-frame classifiers for improved HMM-based Arabic handwriting recognition.

    Science.gov (United States)

    Al-Hajj Mohamad, Ramy; Likforman-Sulem, Laurence; Mokbel, Chafic

    2009-07-01

    The problem addressed in this study is the offline recognition of handwritten Arabic city names. The names are assumed to belong to a fixed lexicon of about 1,000 entries. A state-of-the-art classical right-left hidden Markov model (HMM)-based recognizer (reference system) using the sliding window approach is developed. The feature set includes both baseline-independent and baseline-dependent features. The analysis of the errors made by the recognizer shows that the inclination, overlap, and shifted positions of diacritical marks are major sources of errors. In this paper, we propose an approach to cope with these problems. Our approach relies on the combination of three homogeneous HMM-based classifiers. All classifiers have the same topology as the reference system and differ only in the orientation of the sliding window. We compare three combination schemes of these classifiers at the decision level. Our reported results on the benchmark IFN/ENIT database of Arabic Tunisian city names give a recognition rate higher than 90 percent and demonstrate the superiority of the neural network-based combination. Our results also show that the combination of classifiers performs better than a single classifier dealing with slant-corrected images and that the approach is robust for a wide range of orientation angles.

  15. A framework for benchmarking land models

    Science.gov (United States)

    Luo, Y. Q.; Randerson, J. T.; Abramowitz, G.; Bacour, C.; Blyth, E.; Carvalhais, N.; Ciais, P.; Dalmonech, D.; Fisher, J. B.; Fisher, R.; Friedlingstein, P.; Hibbard, K.; Hoffman, F.; Huntzinger, D.; Jones, C. D.; Koven, C.; Lawrence, D.; Li, D. J.; Mahecha, M.; Niu, S. L.; Norby, R.; Piao, S. L.; Qi, X.; Peylin, P.; Prentice, I. C.; Riley, W.; Reichstein, M.; Schwalm, C.; Wang, Y. P.; Xia, J. Y.; Zaehle, S.; Zhou, X. H.

    2012-10-01

    Land models, which have been developed by the modeling community in the past few decades to predict future states of ecosystems and climate, have to be critically evaluated for their performance skills of simulating ecosystem responses and feedback to climate change. Benchmarking is an emerging procedure to measure performance of models against a set of defined standards. This paper proposes a benchmarking framework for evaluation of land model performances and, meanwhile, highlights major challenges at this infant stage of benchmark analysis. The framework includes (1) targeted aspects of model performance to be evaluated, (2) a set of benchmarks as defined references to test model performance, (3) metrics to measure and compare performance skills among models so as to identify model strengths and deficiencies, and (4) model improvement. Land models are required to simulate exchange of water, energy, carbon and sometimes other trace gases between the atmosphere and land surface, and should be evaluated for their simulations of biophysical processes, biogeochemical cycles, and vegetation dynamics in response to climate change across broad temporal and spatial scales. Thus, one major challenge is to select and define a limited number of benchmarks to effectively evaluate land model performance. The second challenge is to develop metrics of measuring mismatches between models and benchmarks. The metrics may include (1) a priori thresholds of acceptable model performance and (2) a scoring system to combine data-model mismatches for various processes at different temporal and spatial scales. The benchmark analyses should identify clues of weak model performance to guide future development, thus enabling improved predictions of future states of ecosystems and climate. The near-future research effort should be on development of a set of widely acceptable benchmarks that can be used to objectively, effectively, and reliably evaluate fundamental properties of land models
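
    As a concrete illustration of the second challenge (a scoring system that combines data-model mismatches), the sketch below turns a normalized RMSE per process into a simple skill score and aggregates the scores with weights. The variable names, weights and toy numbers are assumptions for illustration, not part of the proposed framework.

      # Toy illustration of a benchmark scoring system: normalized RMSE per
      # process is converted to a skill score and combined with (assumed) weights.
      import numpy as np

      def skill_score(model, observed):
          """1 is a perfect match; 0 means errors as large as the observed variability."""
          rmse = np.sqrt(np.mean((model - observed) ** 2))
          return max(0.0, 1.0 - rmse / np.std(observed))

      benchmarks = {
          "gpp":         (np.array([2.1, 2.4, 2.0]),   np.array([2.0, 2.5, 2.2])),
          "soil_carbon": (np.array([10.2, 9.8, 11.0]), np.array([10.0, 10.0, 10.5])),
      }
      weights = {"gpp": 0.6, "soil_carbon": 0.4}   # assumed relative importance

      overall = sum(weights[k] * skill_score(m, o) for k, (m, o) in benchmarks.items())
      print(f"overall benchmark score: {overall:.2f}")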

  16. A framework for benchmarking land models

    Directory of Open Access Journals (Sweden)

    Y. Q. Luo

    2012-10-01

    Full Text Available Land models, which have been developed by the modeling community in the past few decades to predict future states of ecosystems and climate, have to be critically evaluated for their performance skills of simulating ecosystem responses and feedback to climate change. Benchmarking is an emerging procedure to measure performance of models against a set of defined standards. This paper proposes a benchmarking framework for evaluation of land model performances and, meanwhile, highlights major challenges at this infant stage of benchmark analysis. The framework includes (1) targeted aspects of model performance to be evaluated, (2) a set of benchmarks as defined references to test model performance, (3) metrics to measure and compare performance skills among models so as to identify model strengths and deficiencies, and (4) model improvement. Land models are required to simulate exchange of water, energy, carbon and sometimes other trace gases between the atmosphere and land surface, and should be evaluated for their simulations of biophysical processes, biogeochemical cycles, and vegetation dynamics in response to climate change across broad temporal and spatial scales. Thus, one major challenge is to select and define a limited number of benchmarks to effectively evaluate land model performance. The second challenge is to develop metrics of measuring mismatches between models and benchmarks. The metrics may include (1) a priori thresholds of acceptable model performance and (2) a scoring system to combine data–model mismatches for various processes at different temporal and spatial scales. The benchmark analyses should identify clues of weak model performance to guide future development, thus enabling improved predictions of future states of ecosystems and climate. The near-future research effort should be on development of a set of widely acceptable benchmarks that can be used to objectively, effectively, and reliably evaluate fundamental properties

  17. Automatic Speech Segmentation Based on HMM

    Directory of Open Access Journals (Sweden)

    M. Kroul

    2007-06-01

    Full Text Available This contribution deals with the problem of automatic phoneme segmentation using HMMs. Automating the speech segmentation task is important for applications where a large amount of data needs to be processed, so manual segmentation is out of the question. In this paper we focus on automatic segmentation of recordings, which will be used for triphone synthesis unit database creation. For speech synthesis, the speech unit quality is a crucial aspect, so maximal segmentation accuracy is needed here. In this work, different kinds of HMMs with various parameters have been trained and their usefulness for automatic segmentation is discussed. At the end of this work, some segmentation accuracy tests of all models are presented.

  18. Printed Arabic Character Recognition Using HMM

    Institute of Scientific and Technical Information of China (English)

    Abbas H.Hassin; Xiang-Long Tang; Jia-Feng Liu; Wei Zhao

    2004-01-01

    The Arabic language has a very rich vocabulary. More than 200 million people speak this language natively, and over 1 billion people use it in several religion-related activities. In this paper a new technique is presented for recognizing printed Arabic characters. After a word is segmented, each character/word is entirely transformed into a feature vector. The features of printed Arabic characters include strokes and bays in various directions, endpoints, intersection points, loops, dots and zigzags. The word skeleton is decomposed into a number of links in orthographic order, and then it is transferred into a sequence of symbols using vector quantization. A single hidden Markov model has been used for recognizing the printed Arabic characters. Experimental results show that the high recognition rate depends on the number of states in each sample.

  19. Computer-aided modeling framework – a generic modeling template

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    This work focuses on the development of a computer-aided modeling framework. The framework is a knowledge-based system that is built on a generic modeling language and structured on workflows for different modeling tasks. The overall objective is to support model developers and users to generate...... and test models systematically, efficiently and reliably. In this way, development of products and processes can be made faster, cheaper and more efficient. In this contribution, as part of the framework, a generic modeling template for the systematic derivation of problem specific models is presented....... The application of the modeling template is highlighted with a case study related to the modeling of a catalytic membrane reactor coupling dehydrogenation of ethylbenzene with hydrogenation of nitrobenzene...

  20. An entropic framework for modeling economies

    Science.gov (United States)

    Caticha, Ariel; Golan, Amos

    2014-08-01

    We develop an information-theoretic framework for economic modeling. This framework is based on principles of entropic inference that are designed for reasoning on the basis of incomplete information. We take the point of view of an external observer who has access to limited information about broad macroscopic economic features. We view this framework as complementary to more traditional methods. The economy is modeled as a collection of agents about whom we make no assumptions of rationality (in the sense of maximizing utility or profit). States of statistical equilibrium are introduced as those macrostates that maximize entropy subject to the relevant information codified into constraints. The basic assumption is that this information refers to supply and demand and is expressed in the form of the expected values of certain quantities (such as inputs, resources, goods, production functions, utility functions and budgets). The notion of economic entropy is introduced. It provides a measure of the uniformity of the distribution of goods and resources. It captures both the welfare state of the economy as well as the characteristics of the market (say, monopolistic, concentrated or competitive). Prices, which turn out to be the Lagrange multipliers, are endogenously generated by the economy. Further studies include the equilibrium between two economies and the conditions for stability. As an example, the case of the nonlinear economy that arises from linear production and utility functions is treated in some detail.
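
    A small worked example of the entropic inference step: with discrete states and a single expected-value constraint, the maximum-entropy distribution takes the Gibbs form p_i proportional to exp(-lam * x_i), and the multiplier lam (the price-like quantity mentioned above) can be found numerically. The sketch below uses made-up numbers and plain bisection; it is illustrative only, not the paper's economy model.

      import numpy as np

      x = np.array([1.0, 2.0, 3.0, 4.0])   # possible quantities of a good
      target_mean = 1.8                     # constraint: E[x] = 1.8

      def mean_under(lam):
          w = np.exp(-lam * x)
          p = w / w.sum()
          return (p * x).sum()

      lo, hi = -50.0, 50.0                  # mean_under is decreasing in lam
      for _ in range(100):
          mid = 0.5 * (lo + hi)
          if mean_under(mid) > target_mean:
              lo = mid                      # need a larger multiplier
          else:
              hi = mid

      lam = 0.5 * (lo + hi)
      p = np.exp(-lam * x); p /= p.sum()
      print("lambda:", round(lam, 3), "distribution:", np.round(p, 3))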

  1. MDM: A Mode Diagram Modeling Framework

    Directory of Open Access Journals (Sweden)

    Zheng Wang

    2012-12-01

    Full Text Available Periodic control systems used in spacecraft and automobiles are usually period-driven and can be decomposed into different modes with each mode representing a system state observed from outside. Such systems may also involve intensive computing in their modes. Despite the fact that such control systems are widely used in the above-mentioned safety-critical embedded domains, there is a lack of domain-specific formal modelling languages for such systems in the relevant industry. To address this problem, we propose a formal visual modeling framework called mode diagram as a concise and precise way to specify and analyze such systems. To capture the temporal properties of periodic control systems, we provide, along with mode diagram, a property specification language based on interval logic for the description of concrete temporal requirements the engineers are concerned with. The statistical model checking technique can then be used to verify the mode diagram models against desired properties. To demonstrate the viability of our approach, we have applied our modelling framework to some real-life case studies from industry and helped detect two design defects for some spacecraft control systems.

  2. Cluster-Based Adaptation Using Density Forest for HMM Phone Recognition

    DEFF Research Database (Denmark)

    Abou-Zleikha, Mohamed; Tan, Zheng-Hua; Christensen, Mads Græsbøll

    2014-01-01

    the data of each leaf (cluster) in each tree with the corresponding GMM adapted by the leaf data using the MAP method. The results show that the proposed approach achieves 3.8% (absolute) lower phone error rate compared with the standard HMM/GMM and 0.8% (absolute) lower PER compared with bagged HMM/GMM....

  3. An HMM Based Terrain Elevation Matching Algorithm

    Institute of Scientific and Technical Information of China (English)

    冯庆堂; 沈林成; 常文森; 叶媛媛

    2005-01-01

    Terrain-aided navigation (TAN) uses terrain height variations under an aircraft to render the position estimate to bound the inertial navigation system (INS) error. This paper proposes a new terrain elevation matching(TEM) model, viz. Hidden-Markov-model(HMM) based TEM (HMMTEM) model. With the given model, an HMMTEM algorithm using Viterbi algorithm is designed and implemented to estimate the position error in INS. The simulation results show that HMMTEM algorithm can better improve the positioning precision of autonomous navigation than SITAN algorithm.
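
    The decoding step referred to above is ordinary Viterbi decoding. A minimal log-domain NumPy version is sketched below with a toy two-state model; the transition and emission values are invented and unrelated to the terrain-matching setup.

      import numpy as np

      def viterbi(log_start, log_trans, log_emit):
          """log_emit has shape (T, N): log P(observation_t | state)."""
          T, N = log_emit.shape
          delta = log_start + log_emit[0]
          back = np.zeros((T, N), dtype=int)
          for t in range(1, T):
              scores = delta[:, None] + log_trans          # (from, to)
              back[t] = scores.argmax(axis=0)
              delta = scores.max(axis=0) + log_emit[t]
          path = [int(delta.argmax())]
          for t in range(T - 1, 0, -1):
              path.append(int(back[t, path[-1]]))
          return path[::-1]

      log_start = np.log([0.5, 0.5])
      log_trans = np.log([[0.8, 0.2],
                          [0.3, 0.7]])
      log_emit = np.log([[0.9, 0.1],
                         [0.6, 0.4],
                         [0.2, 0.8]])
      print(viterbi(log_start, log_trans, log_emit))   # most likely state sequence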

  4. A Framework of Memory Consistency Models

    Institute of Scientific and Technical Information of China (English)

    胡伟武; 施巍松; et al.

    1998-01-01

    Previous descriptions of memory consistency models in shared-memory multiprocessor systems are mainly expressed as constraints on the memory access event ordering and hence are hardware-centric. This paper presents a framework of memory consistency models which describes the memory consistency model on the behavior level. Based on the understanding that the behavior of an execution is determined by the execution order of conflicting accesses, a memory consistency model is defined as an interprocessor synchronization mechanism which orders the execution of operations from different processors. The synchronization order of an execution under a certain consistency model is also defined. The synchronization order, together with the program order, determines the behavior of an execution. This paper also presents criteria for correct programs and correct implementation of consistency models. Regarding an implementation of a consistency model as certain memory event ordering constraints, this paper provides a method to prove the correctness of consistency model implementations, and the correctness of the lock-based cache coherence protocol is proved with this method.

  5. The Optical Character Recognition for Cursive Script Using HMM: A Review

    Directory of Open Access Journals (Sweden)

    Saeeda Naz

    2014-11-01

    Full Text Available Automatic character recognition has a wide variety of applications such as automatic postal mail sorting, number plate recognition, automatic form readers and entering text from PDAs. Automatic character recognition for cursive scripts is a complex process facing unique issues not found in other scripts. Many solutions have been proposed in the literature to handle the complexities of cursive script character recognition. This paper presents a comprehensive literature review of Optical Character Recognition (OCR) for off-line and on-line character recognition for the Urdu, Arabic and Persian languages, based on the Hidden Markov Model (HMM). We survey almost all significant approaches proposed and conclude with future directions of OCR for cursive languages.

  6. Performance Evaluation of Conventional and Hybrid Feature Extractions Using Multivariate HMM Classifier

    Directory of Open Access Journals (Sweden)

    Veton Z. Këpuska

    2015-04-01

    Full Text Available Speech feature extraction and likelihood evaluation are considered the main issues in speech recognition systems. Although both techniques have been developed and improved, they remain an active area of research. This paper investigates the performance of conventional and hybrid speech feature extraction algorithms, namely Mel Frequency Cepstrum Coefficients (MFCC), Linear Prediction Cepstrum Coefficients (LPCC), perceptual linear prediction (PLP) and RASTA-PLP, using a multivariate Hidden Markov Model (HMM) classifier. The performance of the speech recognition system is evaluated based on the word error rate (WER), which is given for different data sets of human voice using the isolated-speech TIDIGIT corpus sampled at 8 kHz. The data include the pronunciation of eleven words (zero to nine, plus oh) recorded from 208 different adult speakers (men and women); each person uttered each word two times.
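
    Word error rate, the evaluation metric used here, is the Levenshtein distance between the reference and hypothesis word sequences divided by the reference length. A small self-contained sketch (with illustrative strings) follows.

      # Word error rate via Levenshtein alignment; example strings are illustrative only.
      def word_error_rate(reference, hypothesis):
          ref, hyp = reference.split(), hypothesis.split()
          d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
          for i in range(len(ref) + 1):
              d[i][0] = i
          for j in range(len(hyp) + 1):
              d[0][j] = j
          for i in range(1, len(ref) + 1):
              for j in range(1, len(hyp) + 1):
                  sub = d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
                  d[i][j] = min(sub, d[i - 1][j] + 1, d[i][j - 1] + 1)
          return d[len(ref)][len(hyp)] / max(len(ref), 1)

      print(word_error_rate("one two three oh", "one too three"))  # 0.5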

  7. Analysis on Recommended System for Web Information Retrieval Using HMM

    Directory of Open Access Journals (Sweden)

    Himangni Rathore

    2014-11-01

    Full Text Available The web is a rich domain of data and knowledge, which is spread over the world in an unstructured manner. A growing number of users continuously access this information over the internet. Web mining is an application of data mining in which web-related data are extracted and manipulated to extract knowledge. Data mining applied to web information is referred to as web mining, which is further divided into three major domains: web usage mining, web content mining and web structure mining. The proposed work is intended to work with web usage mining. The aim is to improve user feedback and user navigation pattern discovery for a CRM system. Finally, an HMM-based algorithm is used for finding patterns in the data, a method which promises to provide much more accurate recommendations.

  8. A Procurement Performance Model for Construction Frameworks

    Directory of Open Access Journals (Sweden)

    Terence Y M Lam

    2015-07-01

    Full Text Available Collaborative construction frameworks have been developed in the United Kingdom (UK) to create longer term relationships between clients and suppliers in order to improve project outcomes. Research undertaken into highways maintenance set within a major county council has confirmed that such collaborative procurement methods can improve time, cost and quality of construction projects. Building upon this and examining the same single case, this research aims to develop a performance model through identification of performance drivers in the whole project delivery process including pre and post contract phases. An a priori performance model based on operational and sociological constructs was proposed and then checked by a pilot study. Factor analysis and central tendency statistics from the questionnaires as well as content analysis from the interview transcripts were conducted. It was confirmed that long term relationships, financial and non-financial incentives and stronger communication are the sociological behaviour factors driving performance. The interviews also established that key performance indicators (KPIs) can be used as an operational measure to improve performance. With the a posteriori performance model, client project managers can effectively and collaboratively manage contractor performance through procurement measures, including the use of longer contract terms and KPIs, so that the expected project outcomes can be achieved. The findings also make a significant contribution to construction framework procurement theory by identifying the interrelated sociological and operational performance drivers. This study is set predominantly in the field of highways civil engineering. It is suggested that building based projects or other projects that share characteristics are grouped together and used for further research of the phenomena discovered.

  9. Motor Imagery EEG Classification Based on CI-HMM

    Institute of Scientific and Technical Information of China (English)

    孟明; 满海涛; 佘青山

    2013-01-01

    In applications of the hidden Markov model (HMM) to motor imagery electroencephalogram (EEG) classification, the independence assumption of the HMM is inconsistent with the inherent correlation of EEG signals. In order to resolve this problem, an EEG classification method based on a Choquet fuzzy integral HMM (CI-HMM) is proposed. The independence assumption of the HMM is relaxed by substituting the monotonicity of fuzzy integrals for the additivity of probability measures. Each signal was segmented using an overlapping sliding window. Then, from each segment, the absolute mean, wavelength and wavelet packet based relative energy features were extracted to constitute the observation sequence used for CI-HMM training and classification. The BCI Competition 2008 Datasets 1 with two classes of motor imagery were selected for the classification experiments. The experimental results show that this method can effectively improve the performance of the HMM method for motor imagery EEG classification.

  10. Conceptual Frameworks in the Doctoral Research Process: A Pedagogical Model

    Science.gov (United States)

    Berman, Jeanette; Smyth, Robyn

    2015-01-01

    This paper contributes to consideration of the role of conceptual frameworks in the doctoral research process. Through reflection on the two authors' own conceptual frameworks for their doctoral studies, a pedagogical model has been developed. The model posits the development of a conceptual framework as a core element of the doctoral…

  11. The Conceptual Integration Modeling Framework: Abstracting from the Multidimensional Model

    CERN Document Server

    Rizzolo, Flavio; Pottinger, Rachel; Wong, Kwok

    2010-01-01

    Data warehouses are overwhelmingly built through a bottom-up process, which starts with the identification of sources, continues with the extraction and transformation of data from these sources, and then loads the data into a set of data marts according to desired multidimensional relational schemas. End user business intelligence tools are added on top of the materialized multidimensional schemas to drive decision making in an organization. Unfortunately, this bottom-up approach is costly both in terms of the skilled users needed and the sheer size of the warehouses. This paper proposes a top-down framework in which data warehousing is driven by a conceptual model. The framework offers both design time and run time environments. At design time, a business user first uses the conceptual modeling language as a multidimensional object model to specify what business information is needed; then she maps the conceptual model to a pre-existing logical multidimensional representation. At run time, a system will tra...

  12. Model based risk assessment - the CORAS framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune; Thunem, Atoosa P-J.

    2004-04-15

    Traditional risk analysis and assessment is based on failure-oriented models of the system. In contrast to this, model-based risk assessment (MBRA) utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The target models are then used as input sources for complementary risk analysis and assessment techniques, as well as a basis for the documentation of the assessment results. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tested with a successful outcome through a series of seven trials within the telemedicine and e-commerce areas. The CORAS project in general and the CORAS application of MBRA in particular have contributed positively to the visibility of model-based risk assessment and thus to the disclosure of several potentials for further exploitation of various aspects within this important research field. In that connection, the CORAS methodology's possibilities for further improvement towards utilization in more complex architectures and also in other application domains such as the nuclear field can be addressed. The latter calls for adapting the framework to address nuclear standards such as IEC 60880 and IEC 61513. For this development we recommend applying a trial-driven approach within the nuclear field. The tool-supported approach for combining risk analysis and system development also fits well with the HRP proposal for developing an Integrated Design Environment (IDE) providing efficient methods and tools to support control room systems design. (Author)

  13. A Smallholder Socio-hydrological Modelling Framework

    Science.gov (United States)

    Pande, S.; Savenije, H.; Rathore, P.

    2014-12-01

    Smallholders are farmers who own less than 2 ha of farmland. They often have low productivity and thus remain at subsistence level. The fact that nearly 80% of Indian farmers are smallholders, who merely own a third of total farmland and belong to the poorest quartile, yet produce nearly 40% of the country's foodgrains, underlines the importance of understanding the socio-hydrology of a smallholder. We present a framework to understand the socio-hydrological system dynamics of a smallholder. It couples the dynamics of 6 main variables that are most relevant at the scale of a smallholder: local storage (soil moisture and other water storage), capital, knowledge, livestock production, soil fertility and grass biomass production. The model incorporates rule-based adaptation mechanisms (for example: adjusting expenditures on food and fertilizers, selling livestock, etc.) of smallholders when they face adverse socio-hydrological conditions, such as low annual rainfall, higher intra-annual variability in rainfall or variability in agricultural prices. It allows us to study the sustainability of smallholder farming systems under various settings. We apply the framework to understand the socio-hydrology of smallholders in Aurangabad, Maharashtra, India. This district has witnessed the suicides of many sugarcane farmers who could not extricate themselves from the debt trap. These farmers lack irrigation and are susceptible to fluctuating sugar prices and intra-annual hydroclimatic variability. This presentation discusses two aspects in particular: whether government interventions to absolve the debt of farmers are enough, and what is the value of investing in local storages that can buffer intra-annual variability in rainfall and of strengthening safety nets, either by creating opportunities for alternative sources of income or by crop diversification.

  14. Systematic identification of crystallization kinetics within a generic modelling framework

    DEFF Research Database (Denmark)

    Abdul Samad, Noor Asma Fazli Bin; Meisler, Kresten Troelstrup; Gernaey, Krist

    2012-01-01

    A systematic approach to the development of constitutive models within a generic modelling framework has been developed for use in the design, analysis and simulation of crystallization operations. The framework contains a tool for model identification connected with a generic crystallizer modelling tool-box, a tool...

  15. Research on Focused Crawler Based on HMM

    Institute of Scientific and Technical Information of China (English)

    谢治军; 杨武; 李稚楹; 宋静静

    2012-01-01

    A focused crawler is a core component of a vertical search engine; it collects data resources for subject-oriented user queries. This paper proposes an approach for a focused crawler based on HMM, which considers not only the web page content but also the context of the web link structure. First, the clustering result of the current web page is taken as the observation state and the link distance from the current web page to the target web page is taken as the hidden state; then the HMM learns the user's topic browsing patterns and is used to collect more topic-related web pages. Experiments show that the focused crawler based on HMM can capture a large number of high-quality web pages related to the target topics, and its crawling efficiency is better than that of a Best-First focused crawler.

  16. Hidden Markov Models with Factored Gaussian Mixtures Densities

    Institute of Scientific and Technical Information of China (English)

    LI Hao-zheng; LIU Zhi-qiang; ZHU Xiang-hua

    2004-01-01

    We present a factorial representation of Gaussian mixture models for observation densities in Hidden Markov Models (HMMs), which uses factorial learning in the HMM framework. We derive the re-estimation formulas for estimating the factorized parameters by the Expectation Maximization (EM) algorithm. We conduct several experiments to compare the performance of this model structure with Factorial Hidden Markov Models (FHMMs) and HMMs; some conclusions and promising empirical results are presented.
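
    One simple reading of a factored observation density is a product of small per-feature Gaussian mixtures instead of one mixture over the full feature vector. The sketch below evaluates such a density; the parameters are invented and the factorization shown is a simplification of the model described in the record.

      import numpy as np

      def gaussian(x, mean, var):
          return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2 * np.pi * var)

      def factored_density(x, factors):
          """factors: list (one per feature) of (weights, means, variances) arrays."""
          density = 1.0
          for xi, (w, mu, var) in zip(x, factors):
              density *= np.sum(w * gaussian(xi, mu, var))
          return density

      factors = [
          (np.array([0.7, 0.3]), np.array([0.0, 2.0]), np.array([1.0, 0.5])),
          (np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([0.8, 0.8])),
      ]
      print(factored_density(np.array([0.2, -0.5]), factors))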

  17. An evolutionary method for learning HMM structure: prediction of protein secondary structure

    DEFF Research Database (Denmark)

    Won, Kyoung-Jae; Hamelryck, Thomas; Prügel-Bennett, Adam;

    2007-01-01

    Therefore, we have developed a method for evolving the structure of HMMs automatically, using Genetic Algorithms (GAs). RESULTS: In the GA procedure, populations of HMMs are assembled from biologically meaningful building blocks. Mutation and crossover operators were designed to explore the space...... HMM also calculates the probabilities associated with the predictions. We carefully examined the performance of the HMM based predictor, both under the multiple- and single-sequence...

  18. Making sense of implementation theories, models and frameworks

    National Research Council Canada - National Science Library

    Nilsen, Per

    2015-01-01

    .... The aim of this article is to propose a taxonomy that distinguishes between different categories of theories, models and frameworks in implementation science, to facilitate appropriate selection...

  19. Modeling Framework for Mining Lifecycle Management

    Directory of Open Access Journals (Sweden)

    Na Lu

    2014-03-01

    Full Text Available In the process of informatization of mining engineering, it is difficult to directly exchange and share information across different phases and different application systems, which causes information isolation and information gaps due to the lack of unified data exchange standards and information integration mechanisms. The purpose of this research is to build a modeling framework for mining lifecycle information management. The concept of mining lifecycle management (MLM) is proposed based on product lifecycle management (PLM) and the Hall three-dimensional structure. The frame system of mining lifecycle management has been established through the application of information integration technologies and information standards. A four-layer structure for the realization of the MLM system is put forward, which lays out the development method of the MLM system. The application indicates that the proposed theories and technologies have solved the problem of information isolation between different phases and applications in mining engineering, and have laid a foundation for information exchange, sharing and integration across the mining lifecycle.

  20. An Exploratory Investigation on the Invasiveness of Environmental Modeling Frameworks

    Science.gov (United States)

    This paper provides initial results of an exploratory investigation on the invasiveness of environmental modeling frameworks. Invasiveness is defined as the coupling between application (i.e., model) and framework code used to implement the model. By comparing the implementation of an environmenta...

  1. A Software Service Framework Model Based on Agent

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    This paper presents an agent-based software service framework model called ASF, and defines the basic concepts and structure of the ASF model. It also describes the management and process mechanisms in the ASF model.

  2. Business model framework applications in health care: A systematic review.

    Science.gov (United States)

    Fredriksson, Jens Jacob; Mazzocato, Pamela; Muhammed, Rafiq; Savage, Carl

    2017-01-01

    It has proven to be a challenge for health care organizations to achieve the Triple Aim. In the business literature, business model frameworks have been used to understand how organizations are aligned to achieve their goals. We conducted a systematic literature review with an explanatory synthesis approach to understand how business model frameworks have been applied in health care. We found a large increase in applications of business model frameworks during the last decade. E-health was the most common context of application. We identified six applications of business model frameworks: business model description, financial assessment, classification based on pre-defined typologies, business model analysis, development, and evaluation. Our synthesis suggests that the choice of business model framework and constituent elements should be informed by the intent and context of application. We see a need for harmonization in the choice of elements in order to increase generalizability, simplify application, and help organizations realize the Triple Aim.

  3. LAMMPS Framework for Dynamic Bonding and an Application Modeling DNA

    DEFF Research Database (Denmark)

    Svaneborg, Carsten

    2012-01-01

    and bond types. When breaking bonds, all angular and dihedral interactions involving broken bonds are removed. The framework allows chemical reactions to be modeled, and we use it to simulate a simplistic, coarse-grained DNA model. The resulting DNA dynamics illustrates the power of the present framework....

  4. IDEF method-based simulation model design and development framework

    Directory of Open Access Journals (Sweden)

    Ki-Young Jeong

    2009-09-01

    Full Text Available The purpose of this study is to provide an IDEF method-based integrated framework for a business process simulation model to reduce the model development time by increasing the communication and knowledge reusability during a simulation project. In this framework, simulation requirements are collected by a function modeling method (IDEF0) and a process modeling method (IDEF3). Based on these requirements, a common data model is constructed using the IDEF1X method. From this reusable data model, multiple simulation models are automatically generated using a database-driven simulation model development approach. The framework is claimed to help both the requirement collection and experimentation phases during a simulation project by improving system knowledge, model reusability, and maintainability through the systematic use of three descriptive IDEF methods and the features of relational database technologies. A complex semiconductor fabrication case study was used as a testbed to evaluate and illustrate the concepts and the framework. Two different simulation software products were used to develop and control the semiconductor model from the same knowledge base. The case study empirically showed that this framework could help improve the simulation project processes by using IDEF-based descriptive models and relational database technology. The authors also concluded that this framework could be easily applied to other analytical model generation by separating the logic from the data.

  5. A Simulation and Modeling Framework for Space Situational Awareness

    Energy Technology Data Exchange (ETDEWEB)

    Olivier, S S

    2008-09-15

    This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. The framework is based on a flexible, scalable architecture to enable efficient, physics-based simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. The details of the modeling and simulation framework are described, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical brightness calculations, generic radar system models, generic optical system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The use of this integrated simulation and modeling framework on a specific scenario involving space debris is demonstrated.

  6. Isolated Word Recognition Using Ergodic Hidden Markov Models and Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Warih Maharani

    2012-03-01

    Full Text Available Speech-to-text is one of the speech recognition applications in which the speech signal is processed, recognized and converted into a textual representation. The Hidden Markov Model (HMM) is the most widely used method in speech recognition. However, the level of accuracy achieved with an HMM is strongly influenced by the optimization of the feature extraction process and the modelling method. Hence, in this research, the use of a genetic algorithm (GA) to optimize an ergodic HMM was tested. In the hybrid HMM-GA, the GA is used to optimize the Baum-Welch method in the training process. This is useful for improving the accuracy of recognition results produced by HMM parameters that would otherwise yield low accuracy when the HMM is tested. Based on the experiments, accuracy increases by 20% to 41%, proving that combining a GA with the HMM method can give better results compared with an HMM system not combined with any optimization method.
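
    The hybrid idea, training-set likelihood as the fitness that a GA tries to maximize over HMM parameters, can be sketched in a few lines. Below, the forward algorithm supplies the log-likelihood and a crude mutate-and-select loop stands in for the full GA; the toy observation sequence and mutation scheme are assumptions, not the paper's exact setup.

      import numpy as np

      rng = np.random.default_rng(0)

      def forward_loglik(pi, A, B, obs):
          """Scaled forward algorithm: log P(obs | pi, A, B)."""
          alpha = pi * B[:, obs[0]]
          loglik = np.log(alpha.sum())
          alpha /= alpha.sum()
          for o in obs[1:]:
              alpha = (alpha @ A) * B[:, o]
              loglik += np.log(alpha.sum())
              alpha /= alpha.sum()
          return loglik

      def normalize_rows(M):
          return M / M.sum(axis=1, keepdims=True)

      def mutate(A, B, scale=0.05):
          A2 = normalize_rows(np.abs(A + rng.normal(0, scale, A.shape)))
          B2 = normalize_rows(np.abs(B + rng.normal(0, scale, B.shape)))
          return A2, B2

      obs = [0, 1, 1, 0, 2, 1]                       # toy observation indices
      pi = np.array([0.6, 0.4])
      A = normalize_rows(rng.random((2, 2)))
      B = normalize_rows(rng.random((2, 3)))

      best = (A, B, forward_loglik(pi, A, B, obs))
      for _ in range(200):                           # crude GA stand-in: mutate and select
          A2, B2 = mutate(*best[:2])
          ll = forward_loglik(pi, A2, B2, obs)
          if ll > best[2]:
              best = (A2, B2, ll)
      print("best log-likelihood:", round(best[2], 3))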

  7. Modelling Framework of a Neural Object Recognition

    Directory of Open Access Journals (Sweden)

    Aswathy K S

    2016-02-01

    Full Text Available In many industrial, medical and scientific image processing applications, various feature and pattern recognition techniques are used to match specific features in an image with a known template. Despite the capabilities of these techniques, some applications require simultaneous analysis of multiple, complex, and irregular features within an image, as in semiconductor wafer inspection. In wafer inspection, discovered defects are often complex and irregular and demand more human-like inspection techniques to recognize irregularities. By incorporating neural network techniques, such image processing systems can be trained with large numbers of images until the system eventually learns to recognize irregularities. The aim of this project is to develop a framework for a machine-learning system that can classify objects of different categories. The framework utilizes toolboxes in Matlab such as the Computer Vision Toolbox and the Neural Network Toolbox.

  8. An object-oriented modelling framework for the arterial wall.

    Science.gov (United States)

    Balaguera, M I; Briceño, J C; Glazier, J A

    2010-02-01

    An object-oriented modelling framework for the arterial wall is presented. The novelty of the framework is the possibility of generating customizable artery models, taking advantage of imaging technology. To our knowledge, this is the first object-oriented modelling framework for the arterial wall. Existing models do not allow close structural mapping with the arterial microstructure as in the object-oriented framework. In the implemented model, passive behaviour of the arterial wall was considered and the tunica adventitia was the system under study. As verification, a model of an arterial segment was generated. In order to simulate its deformation, a matrix structural mechanics simulator was implemented. Two simulations were conducted, one for an axial loading test and the other for a pressure-volume test. Each simulation began with a sensitivity analysis in order to determine the best parameter combination and to compare the results with analogue controls. In both cases, the simulated results closely reproduced, qualitatively and quantitatively, the analogue control plots.

  9. An Ontology-Based Framework for Modeling User Behavior

    DEFF Research Database (Denmark)

    Razmerita, Liana

    2011-01-01

    This paper focuses on the role of user modeling and semantically enhanced representations for personalization. This paper presents a generic Ontology-based User Modeling framework (OntobUMf), its components, and its associated user modeling processes. This framework models the behavior of the users and classifies its users according to their behavior. The user ontology is the backbone of OntobUMf and has been designed according to the Information Management System Learning Information Package (IMS LIP). The user ontology includes a Behavior concept that extends the IMS LIP specification and defines... The results of this research may contribute to the development of other frameworks for modeling user behavior, other semantically enhanced user modeling frameworks, or other semantically enhanced information systems.

  10. A Mathematical Modeling Framework for Analysis of Functional Clothing

    Directory of Open Access Journals (Sweden)

    Xiaolin Man

    2007-11-01

    Full Text Available In the analysis and design of functional clothing systems, it is helpful to quantify the effects of a system on a wearer’s physical performance capabilities. Toward this end, a clothing modeling framework for quantifying the mechanical interactions between a given clothing system design and a specific wearer performing defined physical tasks is proposed. The modeling framework consists of three interacting modules: (1) a macroscale fabric mechanics/dynamics model; (2) a collision detection and contact correction module; and (3) a human motion module. In the proposed framework, the macroscopic fabric model is based on a rigorous large deformation continuum-degenerated shell theory representation. Material models that capture the stress-strain behavior of different clothing fabrics are used in the continuum shell framework. The collision and contact module enforces the impenetrability constraint between the fabric and human body and computes the associated contact forces between the two. The human body is represented in the current framework as an assemblage of overlapping ellipsoids that undergo rigid body motions consistent with human motions while performing actions such as walking, running, or jumping. The transient rigid body motions of each ellipsoidal body segment in time are determined using motion capture technology. The integrated modeling framework is then exercised to quantify the resistance that the clothing exerts on the wearer during the specific activities under consideration. Current results from the framework are presented and its intended applications are discussed along with some of the key challenges remaining in clothing system modeling.

  11. A Modeling Framework for Conventional and Heat Integrated Distillation Columns

    DEFF Research Database (Denmark)

    Bisgaard, Thomas; Huusom, Jakob Kjøbsted; Abildskov, Jens

    2013-01-01

    In this paper, a generic, modular model framework for describing fluid separation by distillation is presented. At present, the framework is able to describe a conventional distillation column and a heat-integrated distillation column, but due to a modular structure the database can be further ex...

  12. A framework for habitat monitoring and climate change modelling

    NARCIS (Netherlands)

    Villoslada, Miguel; Bunce, Robert G.H.; Sepp, Kalev; Jongman, Rob H.G.; Metzger, Marc J.; Kull, Tiiu; Raet, Janar; Kuusemets, Valdo; Kull, Ain; Leito, Aivar

    2017-01-01

    Environmental stratifications provide the framework for efficient surveillance and monitoring of biodiversity and ecological resources, as well as modelling exercises. An obstacle for agricultural landscape monitoring in Estonia has been the lack of a framework for the objective selection of

  13. A framework for modeling uncertainty in regional climate change

    Science.gov (United States)

    In this study, we present a new modeling framework and a large ensemble of climate projections to investigate the uncertainty in regional climate change over the United States associated with four dimensions of uncertainty. The sources of uncertainty considered in this framework ...

  14. A Framework for Dimensionality Assessment for Multidimensional Item Response Models

    Science.gov (United States)

    Svetina, Dubravka; Levy, Roy

    2014-01-01

    A framework is introduced for considering dimensionality assessment procedures for multidimensional item response models. The framework characterizes procedures in terms of their confirmatory or exploratory approach, parametric or nonparametric assumptions, and applicability to dichotomous, polytomous, and missing data. Popular and emerging…

  15. Generic Model Predictive Control Framework for Advanced Driver Assistance Systems

    NARCIS (Netherlands)

    Wang, M.

    2014-01-01

    This thesis deals with a model predictive control framework for control design of Advanced Driver Assistance Systems, where car-following tasks are under control. The framework is applied to design several autonomous and cooperative controllers and to examine the controller properties at the microsc

  17. HMMEditor: a visual editing tool for profile hidden Markov model

    Directory of Open Access Journals (Sweden)

    Cheng Jianlin

    2008-03-01

    Full Text Available Abstract Background Profile Hidden Markov Model (HMM) is a powerful statistical model to represent a family of DNA, RNA, and protein sequences. Profile HMM has been widely used in bioinformatics research such as sequence alignment, gene structure prediction, motif identification, protein structure prediction, and biological database search. However, few comprehensive, visual editing tools for profile HMM are publicly available. Results We develop a visual editor for profile Hidden Markov Models (HMMEditor). HMMEditor can visualize the profile HMM architecture, transition probabilities, and emission probabilities. Moreover, it provides functions to edit and save HMM and parameters. Furthermore, HMMEditor allows users to align a sequence against the profile HMM and to visualize the corresponding Viterbi path. Conclusion HMMEditor provides a set of unique functions to visualize and edit a profile HMM. It is a useful tool for biological sequence analysis and modeling. Both HMMEditor software and web service are freely available.

  18. Coastal Ecosystem Integrated Compartment Model (ICM): Modeling Framework

    Science.gov (United States)

    Meselhe, E. A.; White, E. D.; Reed, D.

    2015-12-01

    The Integrated Compartment Model (ICM) was developed as part of the 2017 Coastal Master Plan modeling effort. It is a comprehensive numerical hydrodynamic model coupled to various geophysical process models. Simplifying assumptions related to some of the flow dynamics are applied to increase the computational efficiency of the model. The model can be used to provide insights about coastal ecosystems and evaluate restoration strategies. It builds on existing tools where possible and incorporates newly developed tools where necessary. It can perform decadal simulations (~ 50 years) across the entire Louisiana coast. It includes several improvements over the approach used to support the 2012 Master Plan, such as: additional processes in the hydrology, vegetation, wetland and barrier island morphology subroutines, increased spatial resolution, and integration of previously disparate models into a single modeling framework. The ICM includes habitat suitability indices (HSIs) to predict broad spatial patterns of habitat change, and it provides an additional integration to a dynamic fish and shellfish community model which quantitatively predicts potential changes in important fishery resources. It can be used to estimate the individual and cumulative effects of restoration and protection projects on the landscape, including a general estimate of water levels associated with flooding. The ICM is also used to examine possible impacts of climate change and future environmental scenarios (e.g. precipitation, eustatic sea level rise, subsidence, tropical storms, etc.) on the landscape and on the effectiveness of restoration projects. The ICM code is publicly accessible, and coastal restoration and protection groups interested in planning-level modeling are encouraged to explore its utility as a computationally efficient tool to examine ecosystem response to future physical or ecological changes, including the implementation of restoration and protection strategies.

  19. The Reputation Evaluation Based on Optimized Hidden Markov Model in E-Commerce

    Directory of Open Access Journals (Sweden)

    Liu Chang

    2013-01-01

    Full Text Available Nowadays, a large number of reputation systems have been deployed in practical applications or investigated in the literature to protect buyers from deception and malicious behaviors in online transactions. As an efficient Bayesian analysis tool, Hidden Markov Model (HMM has been used into e-commerce to describe the dynamic behavior of sellers. Traditional solutions adopt Baum-Welch algorithm to train model parameters which is unstable due to its inability to find a globally optimal solution. Consequently, this paper presents a reputation evaluation mechanism based on the optimized Hidden Markov Model, which is called PSOHMM. The algorithm takes full advantage of the search mechanism in Particle Swarm Optimization (PSO algorithm to strengthen the learning ability of HMM and PSO has been modified to guarantee interval and normalization constraints in HMM. Furthermore, a simplified reputation evaluation framework based on HMM is developed and applied to analyze the specific behaviors of sellers. The simulation experiments demonstrate that the proposed PSOHMM has better performance to search optimal model parameters than BWHMM, has faster convergence speed, and is more stable than BWHMM. Compared with Average and Beta reputation evaluation mechanism, PSOHMM can reflect the behavior changes of sellers more quickly in e-commerce systems.
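
    The constraint handling mentioned above (keeping HMM parameters valid during PSO updates) can be sketched as a repair step after every position update: clip to the unit interval and renormalize each row so transition and emission matrices stay stochastic. The PSO update below is deliberately simplified and the numbers are illustrative, not the paper's configuration.

      import numpy as np

      rng = np.random.default_rng(1)

      def repair(position):
          """Clip to the unit interval, then renormalize every row to sum to 1."""
          p = np.clip(position, 1e-6, 1.0)
          return p / p.sum(axis=1, keepdims=True)

      def pso_step(positions, velocities, personal_best, global_best,
                   w=0.7, c1=1.5, c2=1.5):
          r1, r2 = rng.random(positions.shape), rng.random(positions.shape)
          velocities = (w * velocities
                        + c1 * r1 * (personal_best - positions)
                        + c2 * r2 * (global_best - positions))
          positions = np.array([repair(p) for p in positions + velocities])
          return positions, velocities

      # Example: a swarm of 5 candidate 3-state transition matrices.
      swarm = np.array([repair(rng.random((3, 3))) for _ in range(5)])
      velocity = np.zeros_like(swarm)
      swarm, velocity = pso_step(swarm, velocity, swarm.copy(), swarm[0])
      print(swarm[0].sum(axis=1))   # each row still sums to 1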

  20. Fuzzy C-Means Clustering Based Phonetic Tied-Mixture HMM in Speech Recognition

    Institute of Scientific and Technical Information of China (English)

    XU Xiang-hua; ZHU Jie; GUO Qiang

    2005-01-01

    A fuzzy clustering analysis based phonetic tied-mixture HMM (FPTM) was presented to decrease parameter size and improve the robustness of parameter training. The FPTM was synthesized from state-tied HMMs by a modified fuzzy C-means clustering algorithm. Each Gaussian codebook of the FPTM was built from Gaussian components within the same root node in the phonetic decision tree. The experimental results on large vocabulary Mandarin speech recognition show that, compared with a conventional phonetic tied-mixture HMM and a state-tied HMM with approximately the same number of Gaussian mixtures, the FPTM achieves word error rate reductions of 4.84% and 13.02%, respectively. Combining the two schemes of mixture weight pruning and fuzzy merging of Gaussian centers, a significant parameter size reduction was achieved with little impact on recognition accuracy.
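
    A minimal fuzzy C-means sketch of the kind used to build tied Gaussian codebooks: each item (here a Gaussian mean vector) receives a soft membership to every codebook centre. The data and settings below are illustrative only, not the modified algorithm of the paper.

      import numpy as np

      def fuzzy_c_means(X, n_clusters, m=2.0, n_iter=50):
          rng = np.random.default_rng(0)
          U = rng.random((len(X), n_clusters))
          U /= U.sum(axis=1, keepdims=True)            # soft memberships
          for _ in range(n_iter):
              W = U ** m
              centres = (W.T @ X) / W.sum(axis=0)[:, None]
              d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-9
              U = 1.0 / (d ** (2 / (m - 1)))           # closer centres get more weight
              U /= U.sum(axis=1, keepdims=True)
          return centres, U

      means = np.array([[0.0, 0.1], [0.2, -0.1], [5.0, 5.1], [4.8, 5.3]])
      centres, memberships = fuzzy_c_means(means, n_clusters=2)
      print(np.round(memberships, 2))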

  1. Conceptualising Business Models: Definitions, Frameworks and Classifications

    Directory of Open Access Journals (Sweden)

    Erwin Fielt

    2013-12-01

    Full Text Available The business model concept is gaining traction in different disciplines but is still criticized for being fuzzy and vague and lacking consensus on its definition and compositional elements. In this paper we set out to advance our understanding of the business model concept by addressing three areas of foundational research: business model definitions, business model elements, and business model archetypes. We define a business model as a representation of the value logic of an organization in terms of how it creates and captures customer value. This abstract and generic definition is made more specific and operational by the compositional elements that need to address the customer, value proposition, organizational architecture (firm and network level) and economics dimensions. Business model archetypes complement the definition and elements by providing a more concrete and empirical understanding of the business model concept. The main contributions of this paper are (1) explicitly including the customer value concept in the business model definition and focussing on value creation, (2) presenting four core dimensions that business model elements need to cover, (3) arguing for flexibility by adapting and extending business model elements to cater for different purposes and contexts (e.g. technology, innovation, strategy), (4) stressing a more systematic approach to business model archetypes by using business model elements for their description, and (5) suggesting to use business model archetype research for the empirical exploration and testing of business model elements and their relationships.

  2. A conceptual framework for a mentoring model for nurse educators ...

    African Journals Online (AJOL)

    A conceptual framework for a mentoring model for nurse educators. ... recruiting and retaining nurse educators to meet the demands of teaching and learning ... approaches focusing on reasoning strategies, literature control and empirical data ...

  3. Bregman divergence as general framework to estimate unnormalized statistical models

    CERN Document Server

    Gutmann, Michael

    2012-01-01

    We show that the Bregman divergence provides a rich framework to estimate unnormalized statistical models for continuous or discrete random variables, that is, models which do not integrate or sum to one, respectively. We prove that recent estimation methods such as noise-contrastive estimation, ratio matching, and score matching belong to the proposed framework, and explain their interconnection based on supervised learning. Further, we discuss the role of boosting in unsupervised learning.

  4. POSITIVE LEADERSHIP MODELS: THEORETICAL FRAMEWORK AND RESEARCH

    Directory of Open Access Journals (Sweden)

    Javier Blanch, Francisco Gil

    2016-09-01

    Full Text Available The objective of this article is twofold; firstly, we establish the theoretical boundaries of positive leadership and the reasons for its emergence. It is related to the new paradigm of positive psychology that has recently been shaping the scope of organizational knowledge. This conceptual framework has triggered the development of the various forms of positive leadership (i.e. transformational, servant, spiritual, authentic, and positive). Although the construct does not seem univocally defined, these different types of leadership overlap and share a significant affinity. Secondly, we review the empirical evidence that shows the impact of positive leadership in organizations and we highlight the positive relationship between these forms of leadership and key positive organizational variables. Lastly, we analyse future research areas in order to further develop this concept.

  5. An Extensible Model and Analysis Framework

    Science.gov (United States)

    2010-11-01

    for a total of 543 seconds. For comparison purposes, in interpreted mode, opening the model took 224 seconds and running the model took 217 seconds...contains 19683 entities. 9 A comparison of the key model complexity metrics may be found in Table 3. Table 3: Comparison of the model...Triquetrum/RCP supports assembling in arbitrary ways. (12/08 presentation) 2. Prototyped OSGi component architecture for use with Netbeans and

  6. A Simulation and Modeling Framework for Space Situational Awareness

    Science.gov (United States)

    Olivier, S.

    This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. This framework includes detailed models for threat scenarios, signatures, sensors, observables and knowledge extraction algorithms. The framework is based on a flexible, scalable architecture to enable efficient simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. We will describe the details of the modeling and simulation framework, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical and infra-red brightness calculations, generic radar system models, generic optical and infra-red system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The specific modeling of the Space Surveillance Network is performed in collaboration with the Air Force Space Command Space Control Group. We will demonstrate the use of this integrated simulation and modeling framework on specific threat scenarios, including space debris and satellite maneuvers, and we will examine the results of case studies involving the addition of new sensor systems, used in conjunction with the Space Surveillance Network, for improving space situational awareness.

  7. An Ising model for metal-organic frameworks

    Science.gov (United States)

    Höft, Nicolas; Horbach, Jürgen; Martín-Mayor, Victor; Seoane, Beatriz

    2017-08-01

    We present a three-dimensional Ising model where lines of equal spins are frozen such that they form an ordered framework structure. The frame spins impose an external field on the rest of the spins (active spins). We demonstrate that this "porous Ising model" can be seen as a minimal model for condensation transitions of gas molecules in metal-organic frameworks. Using Monte Carlo simulation techniques, we compare the phase behavior of a porous Ising model with that of a particle-based model for the condensation of methane (CH4) in the isoreticular metal-organic framework IRMOF-16. For both models, we find a line of first-order phase transitions that end in a critical point. We show that the critical behavior in both cases belongs to the 3D Ising universality class, in contrast to other phase transitions in confinement such as capillary condensation.
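
    As a minimal illustration of the "porous Ising model" idea (not the paper's code or its IRMOF-16 parameters), the sketch below runs Metropolis Monte Carlo on a small cubic lattice in which an assumed framework of frozen +1 spins along lines acts as an external field on the remaining active spins.

```python
# Minimal Metropolis sketch, assuming a small cubic lattice, a frozen framework of +1
# spins along lines, and illustrative coupling/temperature values; only the active
# spins are updated.
import numpy as np

rng = np.random.default_rng(0)
L, J, h, T, sweeps = 12, 1.0, 0.0, 4.0, 50
spins = rng.choice([-1, 1], size=(L, L, L))
frame = np.zeros((L, L, L), dtype=bool)
frame[::4, ::4, :] = True                          # frozen lines of equal spins
spins[frame] = 1

def neighbour_sum(s, x, y, z):
    return (s[(x + 1) % L, y, z] + s[(x - 1) % L, y, z] +
            s[x, (y + 1) % L, z] + s[x, (y - 1) % L, z] +
            s[x, y, (z + 1) % L] + s[x, y, (z - 1) % L])

active = np.argwhere(~frame)
for _ in range(sweeps):
    for x, y, z in active[rng.permutation(len(active))]:
        dE = 2 * spins[x, y, z] * (J * neighbour_sum(spins, x, y, z) + h)
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[x, y, z] *= -1                   # Metropolis acceptance

print("magnetisation of the active spins:", spins[~frame].mean())
```

    Scanning the field h (playing the role of the chemical potential of the gas) at fixed temperature is one way such a toy model exhibits the first-order condensation line described in the abstract.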

  8. Mediation Analysis in a Latent Growth Curve Modeling Framework

    Science.gov (United States)

    von Soest, Tilmann; Hagtvet, Knut A.

    2011-01-01

    This article presents several longitudinal mediation models in the framework of latent growth curve modeling and provides a detailed account of how such models can be constructed. Logical and statistical challenges that might arise when such analyses are conducted are also discussed. Specifically, we discuss how the initial status (intercept) and…

  9. A DSM-based framework for integrated function modelling

    DEFF Research Database (Denmark)

    Eisenbart, Boris; Gericke, Kilian; Blessing, Lucienne T. M.

    2017-01-01

    an integrated function modelling framework, which specifically aims at relating the different function modelling perspectives prominently addressed in different disciplines. It uses interlinked matrices based on the concept of DSM and MDM in order to facilitate cross-disciplinary modelling and analysis...

  10. Threat model framework and methodology for personal networks (PNs)

    DEFF Research Database (Denmark)

    Prasad, Neeli R.

    2007-01-01

    is to give a structured, convenient approach for building threat models. A framework for the threat model is presented with a list of requirements for methodology. The methodology will be applied to build a threat model for Personal Networks. Practical tools like UML sequence diagrams and attack trees have...

  11. MODELS FOR NETWORK DYNAMICS - A MARKOVIAN FRAMEWORK

    NARCIS (Netherlands)

    LEENDERS, RTAJ

    1995-01-01

    A question not very often addressed in social network analysis relates to network dynamics and focuses on how networks arise and change. It alludes to the idea that ties do not arise or vanish randomly, but (partly) as a consequence of human behavior and preferences. Statistical models for modeling

  12. A Computational Framework for Realistic Retina Modeling.

    Science.gov (United States)

    Martínez-Cañada, Pablo; Morillas, Christian; Pino, Begoña; Ros, Eduardo; Pelayo, Francisco

    2016-11-01

    Computational simulations of the retina have led to valuable insights about the biophysics of its neuronal activity and processing principles. A great number of retina models have been proposed to reproduce the behavioral diversity of the different visual processing pathways. While many of these models share common computational stages, previous efforts have been more focused on fitting specific retina functions rather than generalizing them beyond a particular model. Here, we define a set of computational retinal microcircuits that can be used as basic building blocks for the modeling of different retina mechanisms. To validate the hypothesis that similar processing structures may be repeatedly found in different retina functions, we implemented a series of retina models simply by combining these computational retinal microcircuits. Accuracy of the retina models for capturing neural behavior was assessed by fitting published electrophysiological recordings that characterize some of the best-known phenomena observed in the retina: adaptation to the mean light intensity and temporal contrast, and differential motion sensitivity. The retinal microcircuits are part of a new software platform for efficient computational retina modeling from single-cell to large-scale levels. It includes an interface with spiking neural networks that allows simulation of the spiking response of ganglion cells and integration with models of higher visual areas.

  13. GeoFramework: Coupling multiple models of mantle convection within a computational framework

    Science.gov (United States)

    Tan, E.; Choi, E.; Thoutireddy, P.; Gurnis, M.; Aivazis, M.

    2004-12-01

    Geological processes usually encompass a broad spectrum of length and time scales. Traditionally, a modeling code (solver) is developed for a problem of specific length and time scales, but the utility of the solver beyond the designated purpose is usually limited. As we have come to recognize that geological processes often result from the dynamic coupling of deformation across a wide range of time and spatial scales, more robust methods are needed. One means to address this need is through the integration of complementary modeling codes, while attempting to reuse existing software as much as possible. The GeoFramework project addresses this by developing a suite of reusable and combinable tools for the Earth science community. GeoFramework is based on and extends Pyre, a Python-based modeling framework, developed to link solid (Lagrangian) and fluid (Eulerian) solvers, as well as mesh generators, visualization packages, and databases, with one another for engineering applications. Under the framework, solvers are aware of each other's presence and can interact by exchanging information across adjacent mesh boundaries. We will show an example of linking two instances of the CitcomS finite element solver within GeoFramework. A high-resolution regional mantle convection model is linked with a global mantle convection model. The global solver has a resolution of ~180 km horizontally and 35-100 km (with mesh refinement) vertically. The fine mesh has a resolution of ~40 km horizontally and vertically. The fine mesh is centered on the Hawaii hotspot. A vertical plume is used as an initial condition. Time-varying plate velocity models are imposed from 80 Ma onwards, and we have investigated how the plume conduit is deflected by the global circulation patterns as a function of mantle viscosity, plume flux, and plate motion.

  14. Multilevel Models: Conceptual Framework and Applicability

    Directory of Open Access Journals (Sweden)

    Roxana-Otilia-Sonia Hrițcu

    2015-10-01

    Full Text Available Individuals and the social or organizational groups they belong to can be viewed as a hierarchical system situated on different levels. Individuals are situated on the first level of the hierarchy and they are nested together on the higher levels. Individuals interact with the social groups they belong to and are influenced by these groups. Traditional methods that study the relationships between data, like simple regression, do not take into account the hierarchical structure of the data and the effects of a group membership and, hence, results may be invalidated. Unlike standard regression modelling, the multilevel approach takes into account the individuals as well as the groups to which they belong. To take advantage of the multilevel analysis it is important that we recognize the multilevel characteristics of the data. In this article we introduce the outlines of multilevel data and we describe the models that work with such data. We introduce the basic multilevel model, the two-level model: students can be nested into classes, individuals into countries and the general two-level model can be extended very easily to several levels. Multilevel analysis has begun to be extensively used in many research areas. We present the most frequent study areas where multilevel models are used, such as sociological studies, education, psychological research, health studies, demography, epidemiology, biology, environmental studies and entrepreneurship. We support the idea that since hierarchies exist everywhere, multilevel data should be recognized and analyzed properly by using multilevel modelling.
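
    For the basic two-level case described above (students nested in classes), a random-intercept model can be fitted with statsmodels' MixedLM; the data, variable names and effect sizes below are assumptions for illustration, not the article's examples.

```python
# Minimal sketch, assuming simulated data and names (students nested in classes, one
# level-1 predictor): a two-level random-intercept model fitted with statsmodels MixedLM.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_classes, n_students = 30, 25
class_id = np.repeat(np.arange(n_classes), n_students)
class_effect = rng.normal(scale=2.0, size=n_classes)[class_id]   # level-2 variation
x = rng.normal(size=class_id.size)                               # level-1 predictor
y = 50 + 3 * x + class_effect + rng.normal(scale=5.0, size=x.size)

df = pd.DataFrame({"score": y, "x": x, "class_id": class_id})
model = smf.mixedlm("score ~ x", df, groups=df["class_id"])      # random intercept per class
result = model.fit()
print(result.summary())
```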

  15. Traffic modelling framework for electric vehicles

    Science.gov (United States)

    Schlote, Arieh; Crisostomi, Emanuele; Kirkland, Stephen; Shorten, Robert

    2012-07-01

    This article reviews and improves a recently proposed model of road network dynamics. The model is also adapted and generalised to represent the patterns of battery consumption of electric vehicles travelling in the road network. Simulations from the mobility simulator SUMO are given to support and to illustrate the efficacy of the proposed approach. Applications relevant in the field of electric vehicles, such as optimal routing and traffic load control, are provided to illustrate how the proposed model can be used to address typical problems arising in contemporary road network planning and electric vehicle mobility.

  16. TRANSFORM - TRANsient Simulation Framework of Reconfigurable Models

    Energy Technology Data Exchange (ETDEWEB)

    2017-09-01

    Existing development tools for early stage design and scoping of energy systems are often time consuming to use, proprietary, and do not contain the necessary function to model complete systems (i.e., controls, primary, and secondary systems) in a common platform. The Modelica programming language based TRANSFORM tool (1) provides a standardized, common simulation environment for early design of energy systems (i.e., power plants), (2) provides a library of baseline component modules to be assembled into full plant models using available geometry, design, and thermal-hydraulic data, (3) defines modeling conventions for interconnecting component models, and (4) establishes user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  17. Fisher Information Framework for Time Series Modeling

    CERN Document Server

    Venkatesan, R C

    2016-01-01

    A robust prediction model invoking the Takens embedding theorem, whose working hypothesis is obtained via an inference procedure based on the minimum Fisher information principle, is presented. The coefficients of the ansatz, central to the working hypothesis, satisfy a time independent Schrödinger-like equation in a vector setting. The inference of i) the probability density function of the coefficients of the working hypothesis and ii) the establishing of a constraint driven pseudo-inverse condition for the modeling phase of the prediction scheme, is made, for the case of normal distributions, with the aid of the quantum mechanical virial theorem. The well-known reciprocity relations and the associated Legendre transform structure for the Fisher information measure (FIM, hereafter)-based model in a vector setting (with least square constraints) are self-consistently derived. These relations are demonstrated to yield an intriguing form of the FIM for the modeling phase, which defi...

  18. A Modeling Framework for Gossip-based Information Spread

    CERN Document Server

    Bakhshi, Rena; Fokkink, Wan; van Steen, Maarten

    2011-01-01

    We present an analytical framework for gossip protocols based on the pairwise information exchange between interacting nodes. This framework allows for studying the impact of protocol parameters on the performance of the protocol. Previously, gossip-based information dissemination protocols have been analyzed under the assumption of perfect, lossless communication channels. We extend our framework for the analysis of networks with lossy channels. We show how the presence of message loss, coupled with specific topology configurations,impacts the expected behavior of the protocol. We validate the obtained models against simulations for two protocols.

  19. Applying HMM-Based Driver Fatigue Recognition in Smart Vehicle Space

    Institute of Scientific and Technical Information of China (English)

    郁伟炜; 吴卿

    2011-01-01

    Smart vehicle space is a specific and focused manifestation of pervasive computing; this paper presents an application of driver fatigue recognition based on the hidden Markov model (HMM). The authors select the PERCLOS feature variable as a low-level context for evaluating driver fatigue, and establish the HMM through training on a large number of sample data. They then identify the most likely hidden driver state from the observation sequence using the Viterbi algorithm, and remind drivers in order to ensure safe driving behaviour. Finally, a case study in a simulation environment confirmed the validity of the scheme.
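
    A minimal Viterbi decoding sketch for the kind of two-state (alert/fatigued) HMM over discretised PERCLOS observations described above; all probabilities and the observation binning are assumed toy values, not the trained model of the paper.

```python
# Minimal sketch, assuming a two-state (alert/fatigued) HMM and PERCLOS observations
# discretised into low/medium/high bins; all probabilities are illustrative.
import numpy as np

pi = np.array([0.9, 0.1])                          # initial: mostly alert
A = np.array([[0.95, 0.05],                        # alert -> {alert, fatigued}
              [0.10, 0.90]])                       # fatigued -> {alert, fatigued}
B = np.array([[0.70, 0.25, 0.05],                  # P(low/med/high PERCLOS | alert)
              [0.05, 0.35, 0.60]])                 # P(low/med/high PERCLOS | fatigued)

def viterbi(obs, pi, A, B):
    T, N = len(obs), len(pi)
    delta = np.zeros((T, N))
    psi = np.zeros((T, N), dtype=int)
    delta[0] = np.log(pi) + np.log(B[:, obs[0]])
    for t in range(1, T):
        scores = delta[t - 1][:, None] + np.log(A)     # scores[i, j]: from state i to j
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + np.log(B[:, obs[t]])
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):                      # backtrack the best state sequence
        path.append(int(psi[t, path[-1]]))
    return path[::-1]

obs = [0, 0, 1, 1, 2, 2, 2, 1, 2]                  # PERCLOS bins over time
print(viterbi(obs, pi, A, B))                      # decoded 0=alert / 1=fatigued sequence
```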

  20. Multicriteria framework for selecting a process modelling language

    Science.gov (United States)

    Scanavachi Moreira Campos, Ana Carolina; Teixeira de Almeida, Adiel

    2016-01-01

    The choice of process modelling language can affect business process management (BPM) since each modelling language shows different features of a given process and may limit the ways in which a process can be described and analysed. However, choosing the appropriate modelling language for process modelling has become a difficult task because of the availability of a large number of modelling languages and also due to the lack of guidelines for evaluating and comparing languages so as to assist in selecting the most appropriate one. This paper proposes a framework for selecting a modelling language in accordance with the purposes of modelling. This framework is based on the semiotic quality framework (SEQUAL) for evaluating process modelling languages and a multicriteria decision aid (MCDA) approach in order to select the most appropriate language for BPM. This study does not attempt to set out new forms of assessment and evaluation criteria, but does attempt to demonstrate how two existing approaches can be combined so as to solve the problem of selection of modelling language. The framework is described in this paper and then demonstrated by means of an example. Finally, the advantages and disadvantages of using SEQUAL and MCDA in an integrated manner are discussed.

  1. Theoretical Tinnitus framework: A Neurofunctional Model

    Directory of Open Access Journals (Sweden)

    Iman Ghodratitoostani

    2016-08-01

    Full Text Available Subjective tinnitus is the conscious (attended) awareness perception of sound in the absence of an external source and can be classified as an auditory phantom perception. The current tinnitus development models depend on the role of external events congruently paired with the causal physical events that precipitate the phantom perception. We propose a novel Neurofunctional tinnitus model to indicate that the conscious perception of phantom sound is essential in activating the cognitive-emotional value. The cognitive-emotional value plays a crucial role in governing attention allocation as well as developing annoyance within tinnitus clinical distress. Structurally, the Neurofunctional tinnitus model includes the peripheral auditory system, the thalamus, the limbic system, brain stem, basal ganglia, striatum and the auditory along with prefrontal cortices. Functionally, we assume the model includes presence of continuous or intermittent abnormal signals at the peripheral auditory system or midbrain auditory paths. Depending on the availability of attentional resources, the signals may or may not be perceived. The cognitive valuation process strengthens the lateral-inhibition and noise canceling mechanisms in the mid-brain, which leads to the cessation of sound perception and renders the signal evaluation irrelevant. However, the sourceless sound is eventually perceived and can be cognitively interpreted as suspicious or an indication of a disease in which the cortical top-down processes weaken the noise canceling effects. This results in an increase in cognitive and emotional negative reactions such as depression and anxiety. The negative or positive cognitive-emotional feedbacks within the top-down approach may have no relation to the previous experience of the patients. They can also be associated with aversive stimuli similar to abnormal neural activity in generating the phantom sound. Cognitive and emotional reactions depend on general

  2. Theoretical Tinnitus Framework: A Neurofunctional Model.

    Science.gov (United States)

    Ghodratitoostani, Iman; Zana, Yossi; Delbem, Alexandre C B; Sani, Siamak S; Ekhtiari, Hamed; Sanchez, Tanit G

    2016-01-01

    Subjective tinnitus is the conscious (attended) awareness perception of sound in the absence of an external source and can be classified as an auditory phantom perception. Earlier literature establishes three distinct states of conscious perception as unattended, attended, and attended awareness conscious perception. The current tinnitus development models depend on the role of external events congruently paired with the causal physical events that precipitate the phantom perception. We propose a novel Neurofunctional Tinnitus Model to indicate that the conscious (attended) awareness perception of phantom sound is essential in activating the cognitive-emotional value. The cognitive-emotional value plays a crucial role in governing attention allocation as well as developing annoyance within tinnitus clinical distress. Structurally, the Neurofunctional Tinnitus Model includes the peripheral auditory system, the thalamus, the limbic system, brainstem, basal ganglia, striatum, and the auditory along with prefrontal cortices. Functionally, we assume the model includes presence of continuous or intermittent abnormal signals at the peripheral auditory system or midbrain auditory paths. Depending on the availability of attentional resources, the signals may or may not be perceived. The cognitive valuation process strengthens the lateral-inhibition and noise canceling mechanisms in the mid-brain, which leads to the cessation of sound perception and renders the signal evaluation irrelevant. However, the "sourceless" sound is eventually perceived and can be cognitively interpreted as suspicious or an indication of a disease in which the cortical top-down processes weaken the noise canceling effects. This results in an increase in cognitive and emotional negative reactions such as depression and anxiety. The negative or positive cognitive-emotional feedbacks within the top-down approach may have no relation to the previous experience of the patients. They can also be

  3. A community-based framework for aquatic ecosystem models

    DEFF Research Database (Denmark)

    Trolle, Didde; Hamilton, D. P.; Hipsey, M. R.;

    2012-01-01

    Here, we communicate a point of departure in the development of aquatic ecosystem models, namely a new community-based framework, which supports an enhanced and transparent union between the collective expertise that exists in the communities of traditional ecologists and model developers. Through a literature survey, we document the growing importance of numerical aquatic ecosystem models while also noting the difficulties, up until now, of the aquatic scientific community to make significant advances in these models during the past two decades. Through a common forum for aquatic ecosystem modellers we aim to (i) advance collaboration within the aquatic ecosystem modelling community, (ii) enable increased use of models for research, policy and ecosystem-based management, (iii) facilitate a collective framework using common (standardised) code to ensure that model development is incremental, (iv) ...

  4. A framework for quantifying net benefits of alternative prognostic models

    DEFF Research Database (Denmark)

    Rapsomaniki, Eleni; White, Ian R; Wood, Angela M

    2012-01-01

    New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit...

  5. Industrial Sector Energy Efficiency Modeling (ISEEM) Framework Documentation

    Energy Technology Data Exchange (ETDEWEB)

    Karali, Nihan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Xu, Tengfang [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sathaye, Jayant [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-12-12

    The goal of this study is to develop a new bottom-up industry-sector energy-modeling framework with an agenda of addressing least-cost regional and global carbon reduction strategies, improving on the capabilities and limitations of existing models by allowing trading across regions and countries as an alternative.

  6. A framework for quantifying net benefits of alternative prognostic models

    NARCIS (Netherlands)

    Rapsomaniki, Eleni; White, Ian R.; Wood, Angela M.; Thompson, Simon G.

    2012-01-01

    New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit) o

  7. Conceptual Frameworks and Research Models on Resilience in Leadership

    OpenAIRE

    Janet Ledesma

    2014-01-01

    The purpose of this article was to discuss conceptual frameworks and research models on resilience theory. The constructs of resilience, the history of resilience theory, models of resilience, variables of resilience, career resilience, and organizational resilience will be examined and discussed as they relate to leadership development. The literature demonstrates that there is a direct relationship between the stress...

  8. A National Modeling Framework for Water Management Decisions

    Science.gov (United States)

    Bales, J. D.; Cline, D. W.; Pietrowsky, R.

    2013-12-01

    The National Weather Service (NWS), the U.S. Army Corps of Engineers (USACE), and the U.S. Geological Survey (USGS), all Federal agencies with complementary water-resources activities, entered into an Interagency Memorandum of Understanding (MOU) "Collaborative Science Services and Tools to Support Integrated and Adaptive Water Resources Management" to collaborate in activities that are supportive to their respective missions. One of the interagency activities is the development of a highly integrated national water modeling framework and information services framework. Together these frameworks establish a common operating picture, improve modeling and synthesis, support the sharing of data and products among agencies, and provide a platform for incorporation of new scientific understanding. Each of the agencies has existing operational systems to assist in carrying out their respective missions. The systems generally are designed, developed, tested, fielded, and supported by specialized teams. A broader, shared approach is envisioned and would include community modeling, wherein multiple independent investigators or teams develop and contribute new modeling capabilities based on science advances; modern technology in coupling model components and visualizing results; and a coupled atmospheric - hydrologic model construct such that the framework could be used in real-time water-resources decision making or for long-term management decisions. The framework also is being developed to account for organizational structures of the three partners such that, for example, national data sets can move down to the regional scale, and vice versa. We envision the national water modeling framework to be an important element of North American Water Program, to contribute to goals of the Program, and to be informed by the science and approaches developed as a part of the Program.

  9. The Guided System Development Framework: Modeling and Verifying Communication Systems

    DEFF Research Database (Denmark)

    Carvalho Quaresma, Jose Nuno; Probst, Christian W.; Nielson, Flemming

    2014-01-01

    In a world that increasingly relies on the Internet to function, application developers rely on the implementations of protocols to guarantee the security of data transferred. Whether a chosen protocol gives the required guarantees, and whether the implementation does the same, is usually unclear. The Guided System Development framework contributes to more secure communication systems by aiding the development of such systems. The framework features a simple modelling language, step-wise refinement from models to implementation, interfaces to security verification tools, and code generation from...

  10. GEMFsim: A Stochastic Simulator for the Generalized Epidemic Modeling Framework

    CERN Document Server

    Sahneh, Faryad Darabi; Shakeri, Heman; Fan, Futing; Scoglio, Caterina

    2016-01-01

    The recently proposed generalized epidemic modeling framework (GEMF) lays the groundwork for systematically constructing a broad spectrum of stochastic spreading processes over complex networks. This article builds an algorithm for exact, continuous-time numerical simulation of GEMF-based processes. Moreover, the implementation of this algorithm, GEMFsim, is available in popular scientific programming platforms such as MATLAB, R, Python, and C; GEMFsim facilitates simulating stochastic spreading models that fit in the GEMF framework. Using these simulations one can examine the accuracy of mean-field-type approximations that are commonly used for analytical study of spreading processes on complex networks.
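
    GEMFsim itself is available for MATLAB, R, Python and C; as an assumed, much-reduced analogue of what such an exact continuous-time simulation does, the sketch below runs a Gillespie-style SIS process on a random network.

```python
# Minimal sketch, assuming a toy SIS process on a random graph (not GEMFsim itself):
# each susceptible node is infected at rate beta times its number of infected
# neighbours, and each infected node recovers at rate delta.
import numpy as np

rng = np.random.default_rng(0)
n, beta, delta, t_max = 50, 0.3, 1.0, 20.0
adj = (rng.random((n, n)) < 0.1).astype(int)
adj = np.triu(adj, 1)
adj = adj + adj.T                                  # symmetric adjacency matrix
state = np.zeros(n, dtype=int)
state[rng.choice(n, 3, replace=False)] = 1         # three initially infected nodes

t = 0.0
while t < t_max and state.sum() > 0:
    inf_neigh = adj @ state                        # infected neighbours per node
    rates = np.where(state == 1, delta, beta * inf_neigh)
    total = rates.sum()
    if total == 0:
        break
    t += rng.exponential(1.0 / total)              # exponential waiting time to next event
    node = rng.choice(n, p=rates / total)          # node of the next transition
    state[node] = 1 - state[node]                  # flip S <-> I
    # record (t, state.sum()) here to build an epidemic curve

print("infected nodes at the end:", int(state.sum()))
```

    Averaging many such runs against, for example, an N-intertwined mean-field prediction is the kind of comparison the abstract refers to.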

  11. A software engineering perspective on environmental modeling framework design: The object modeling system

    Science.gov (United States)

    The environmental modeling community has historically been concerned with the proliferation of models and the effort associated with collective model development tasks (e.g., code generation, data provisioning and transformation, etc.). Environmental modeling frameworks (EMFs) have been developed to...

  12. The application of HMM in gait recognition using lower limb SEMG

    Institute of Scientific and Technical Information of China (English)

    孟明; 佘青山; 罗志增

    2011-01-01

    A hidden Markov model (HMM) based classification method for recognizing gait phases using surface electromyographic (SEMG) signals of the lower limb was presented. Four time-domain features were extracted within a time segment of each channel of SEMG signals to preserve pattern structure. According to the division of the gait cycle, the structure of the HMM was determined, in which each state was associated with a gait phase. A modified Baum-Welch algorithm was used to estimate the HMM parameters. The Viterbi algorithm achieves phase recognition by finding the best state sequence to assign corresponding phases to the given segments. The experimental results show that the HMM has a unique advantage in classifying sequentially changing signals.
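
    The four time-domain features are not named in this record; a common choice for SEMG segments is mean absolute value, waveform length, zero crossings and slope sign changes, sketched below as an assumed example of the per-segment feature vector that would feed the HMM.

```python
# Minimal sketch; the record does not name its four features, so four common SEMG
# time-domain choices are assumed: mean absolute value (MAV), waveform length (WL),
# zero crossings (ZC) and slope sign changes (SSC), computed per segment.
import numpy as np

def td_features(seg, thr=0.01):
    seg = np.asarray(seg, dtype=float)
    mav = np.mean(np.abs(seg))                                      # MAV
    wl = np.sum(np.abs(np.diff(seg)))                               # WL
    zc = np.sum((seg[:-1] * seg[1:] < 0) &
                (np.abs(seg[:-1] - seg[1:]) > thr))                 # ZC with threshold
    d = np.diff(seg)
    ssc = np.sum((d[:-1] * d[1:] < 0) &
                 ((np.abs(d[:-1]) > thr) | (np.abs(d[1:]) > thr)))  # SSC with threshold
    return np.array([mav, wl, zc, ssc])

rng = np.random.default_rng(0)
segment = rng.normal(scale=0.1, size=2000)         # stand-in for one SEMG channel segment
print(td_features(segment))                        # 4-dimensional observation vector
```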

  13. A Bayesian framework for parameter estimation in dynamical models.

    Directory of Open Access Journals (Sweden)

    Flávio Codeço Coelho

    Full Text Available Mathematical models in biology are powerful tools for the study and exploration of complex dynamics. Nevertheless, bringing theoretical results to an agreement with experimental observations involves acknowledging a great deal of uncertainty intrinsic to our theoretical representation of a real system. Proper handling of such uncertainties is key to the successful usage of models to predict experimental or field observations. This problem has been addressed over the years by many tools for model calibration and parameter estimation. In this article we present a general framework for uncertainty analysis and parameter estimation that is designed to handle uncertainties associated with the modeling of dynamic biological systems while remaining agnostic as to the type of model used. We apply the framework to fit an SIR-like influenza transmission model to 7 years of incidence data in three European countries: Belgium, the Netherlands and Portugal.
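
    As a minimal sketch of the kind of fit described (not the authors' framework or data), the snippet below simulates weekly SIR incidence, generates synthetic Poisson counts, and estimates beta and gamma with a short random-walk Metropolis chain under a flat prior.

```python
# Minimal sketch, assuming synthetic weekly incidence data and a flat prior: SIR
# dynamics are integrated with scipy, counts follow a Poisson likelihood, and a
# random-walk Metropolis chain samples (beta, gamma).
import numpy as np
from scipy.integrate import odeint

def sir(y, t, beta, gamma):
    S, I, R = y
    return [-beta * S * I, beta * S * I - gamma * I, gamma * I]

t = np.arange(0, 30)                               # 30 weeks
y0 = [0.999, 0.001, 0.0]
rng = np.random.default_rng(0)

def incidence(beta, gamma):
    S = odeint(sir, y0, t, args=(beta, gamma))[:, 0]
    return np.maximum(-np.diff(S), 1e-9) * 1e5     # new cases per 100,000 per week

data = rng.poisson(incidence(0.6, 0.2))            # synthetic "observed" counts

def log_post(theta):
    beta, gamma = theta
    if beta <= 0 or gamma <= 0:
        return -np.inf                             # flat prior on the positive quadrant
    lam = incidence(beta, gamma)
    return np.sum(data * np.log(lam) - lam)        # Poisson log-likelihood

theta = np.array([0.5, 0.3])
lp = log_post(theta)
samples = []
for _ in range(4000):
    prop = theta + rng.normal(scale=0.02, size=2)  # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)

print("posterior mean (beta, gamma):", np.mean(samples[2000:], axis=0))
```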

  14. ECoS, a framework for modelling hierarchical spatial systems.

    Science.gov (United States)

    Harris, John R W; Gorley, Ray N

    2003-10-01

    A general framework for modelling hierarchical spatial systems has been developed and implemented as the ECoS3 software package. The structure of this framework is described, and illustrated with representative examples. It allows the set-up and integration of sets of advection-diffusion equations representing multiple constituents interacting in a spatial context. Multiple spaces can be defined, with zero, one or two-dimensions and can be nested, and linked through constituent transfers. Model structure is generally object-oriented and hierarchical, reflecting the natural relations within its real-world analogue. Velocities, dispersions and inter-constituent transfers, together with additional functions, are defined as properties of constituents to which they apply. The resulting modular structure of ECoS models facilitates cut and paste model development, and template model components have been developed for the assembly of a range of estuarine water quality models. Published examples of applications to the geochemical dynamics of estuaries are listed.
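
    Each ECoS constituent is governed by an advection-diffusion equation; as an assumed one-dimensional illustration (not ECoS code), the sketch below advances such an equation with explicit upwind/central finite differences.

```python
# Minimal one-dimensional sketch, assuming illustrative grid, velocity and dispersion
# values: one constituent advected and dispersed along an estuary-like axis.
import numpy as np

nx, dx, dt, steps = 100, 100.0, 10.0, 500       # cells, cell size (m), time step (s), steps
u, D = 0.05, 5.0                                # advection velocity (m/s), dispersion (m^2/s)
c = np.zeros(nx)
c[0] = 1.0                                      # constituent held at the upstream boundary

for _ in range(steps):
    adv = -u * (c - np.roll(c, 1)) / dx                          # upwind advection
    dif = D * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2   # central diffusion
    c[1:-1] += dt * (adv + dif)[1:-1]                            # boundaries kept fixed

print("centre of mass (cell index):", (np.arange(nx) * c).sum() / c.sum())
```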

  15. Emotion Recognition in Speech Based on HMM and PNN

    Institute of Scientific and Technical Information of China (English)

    叶斌

    2011-01-01

    Speech emotion recognition aims to give computers the ability to understand emotion from voice characteristics, ultimately enabling natural, warm and lively human-computer interaction. A speech emotion recognition algorithm based on the HMM (hidden Markov model) and the PNN (probabilistic neural network) was developed. In the system, the basic prosodic and spectral parameters are extracted first; the PNN is then used to model the statistical features and the HMM to model the temporal features. The sum and product rules are used to combine the probabilities from each group of features for the final decision. Experimental results confirm the capability and efficiency of the proposed method.
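
    The sum and product combination rules themselves are simple to state; the sketch below applies them to assumed toy posteriors from the two classifiers to show how the final decision is taken.

```python
# Minimal sketch with assumed toy posteriors and emotion labels (not the paper's
# features or trained HMM/PNN models): the sum and product rules combine the two
# classifiers' outputs into a single decision.
import numpy as np

emotions = ["neutral", "happy", "angry", "sad"]
p_hmm = np.array([0.10, 0.20, 0.60, 0.10])       # posterior from the temporal (HMM) model
p_pnn = np.array([0.15, 0.10, 0.55, 0.20])       # posterior from the statistical (PNN) model

p_sum = (p_hmm + p_pnn) / 2.0                    # sum rule
p_prod = p_hmm * p_pnn
p_prod /= p_prod.sum()                           # product rule, renormalised

print("sum rule decision:    ", emotions[int(p_sum.argmax())])
print("product rule decision:", emotions[int(p_prod.argmax())])
```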

  16. A unified framework for Schelling's model of segregation

    CERN Document Server

    Rogers, Tim

    2011-01-01

    Schelling's model of segregation is one of the first and most influential models in the field of social simulation. There are many variations of the model which have been proposed and simulated over the last forty years, though the present state of the literature on the subject is somewhat fragmented and lacking comprehensive analytical treatments. In this article a unified mathematical framework for Schelling's model and its many variants is developed. This methodology is useful in two regards: firstly, it provides a tool with which to understand the differences observed between models; secondly, phenomena which appear in several model variations may be understood in more depth through analytic studies of simpler versions.

  17. A unified framework for Schelling's model of segregation

    Science.gov (United States)

    Rogers, Tim; McKane, Alan J.

    2011-07-01

    Schelling's model of segregation is one of the first and most influential models in the field of social simulation. There are many variations of the model which have been proposed and simulated over the last forty years, though the present state of the literature on the subject is somewhat fragmented and lacking comprehensive analytical treatments. In this paper a unified mathematical framework for Schelling's model and its many variants is developed. This methodology is useful in two regards: firstly, it provides a tool with which to understand the differences observed between models; secondly, phenomena which appear in several model variations may be understood in more depth through analytic studies of simpler versions.

  18. 3D Building Model Fitting Using A New Kinetic Framework

    CERN Document Server

    Brédif, Mathieu; Pierrot-Deseilligny, Marc; Maître, Henri

    2008-01-01

    We describe a new approach to fit the polyhedron describing a 3D building model to the point cloud of a Digital Elevation Model (DEM). We introduce a new kinetic framework that hides from its user the combinatorial complexity of determining or maintaining the polyhedron topology, allowing the design of a simple variational optimization. This new kinetic framework allows the manipulation of a bounded polyhedron with simple faces by specifying the target plane equations of each of its faces. It proceeds by evolving continuously from the polyhedron defined by its initial topology and its initial plane equations to a polyhedron that is as topologically close as possible to the initial polyhedron but with the new plane equations. This kinetic framework handles internally the necessary topological changes that may be required to keep the faces simple and the polyhedron bounded. For each intermediate configuration where the polyhedron loses the simplicity of its faces or its boundedness, the simplest topological mod...

  19. New framework for standardized notation in wastewater treatment modelling

    DEFF Research Database (Denmark)

    Corominas, L.; Rieger, L.; Takacs, I.

    2010-01-01

    Many unit process models are available in the field of wastewater treatment. All of these models use their own notation, causing problems for documentation, implementation and connection of different models (using different sets of state variables). The main goal of this paper is to propose a new notational framework which allows unique and systematic naming of state variables and parameters of biokinetic models in the wastewater treatment field. The symbols are based on one main letter that gives a general description of the state variable or parameter and several subscript levels that provide greater specification. The overall result is a framework that can be used in whole plant modelling, which consists of different fields such as activated sludge, anaerobic digestion, sidestream treatment, membrane bioreactors, metabolic approaches, fate of micropollutants and biofilm processes. The main objective of this consensus building paper is to establish a consistent set of rules that can be applied to existing and, most importantly, future models.

  20. A Liver-Centric Multiscale Modeling Framework for Xenobiotics

    Science.gov (United States)

    Swat, Maciej; Cosmanescu, Alin; Clendenon, Sherry G.; Wambaugh, John F.; Glazier, James A.

    2016-01-01

    We describe a multi-scale, liver-centric in silico modeling framework for acetaminophen pharmacology and metabolism. We focus on a computational model to characterize whole body uptake and clearance, liver transport and phase I and phase II metabolism. We do this by incorporating sub-models that span three scales; Physiologically Based Pharmacokinetic (PBPK) modeling of acetaminophen uptake and distribution at the whole body level, cell and blood flow modeling at the tissue/organ level and metabolism at the sub-cellular level. We have used standard modeling modalities at each of the three scales. In particular, we have used the Systems Biology Markup Language (SBML) to create both the whole-body and sub-cellular scales. Our modeling approach allows us to run the individual sub-models separately and allows us to easily exchange models at a particular scale without the need to extensively rework the sub-models at other scales. In addition, the use of SBML greatly facilitates the inclusion of biological annotations directly in the model code. The model was calibrated using human in vivo data for acetaminophen and its sulfate and glucuronate metabolites. We then carried out extensive parameter sensitivity studies including the pairwise interaction of parameters. We also simulated population variation of exposure and sensitivity to acetaminophen. Our modeling framework can be extended to the prediction of liver toxicity following acetaminophen overdose, or used as a general purpose pharmacokinetic model for xenobiotics. PMID:27636091

  1. New framework for standardized notation in wastewater treatment modelling.

    Science.gov (United States)

    Corominas, L L; Rieger, L; Takács, I; Ekama, G; Hauduc, H; Vanrolleghem, P A; Oehmen, A; Gernaey, K V; van Loosdrecht, M C M; Comeau, Y

    2010-01-01

    Many unit process models are available in the field of wastewater treatment. All of these models use their own notation, causing problems for documentation, implementation and connection of different models (using different sets of state variables). The main goal of this paper is to propose a new notational framework which allows unique and systematic naming of state variables and parameters of biokinetic models in the wastewater treatment field. The symbols are based on one main letter that gives a general description of the state variable or parameter and several subscript levels that provide greater specification. Only those levels that make the name unique within the model context are needed in creating the symbol. The paper describes specific problems encountered with the currently used notation, presents the proposed framework and provides additional practical examples. The overall result is a framework that can be used in whole plant modelling, which consists of different fields such as activated sludge, anaerobic digestion, sidestream treatment, membrane bioreactors, metabolic approaches, fate of micropollutants and biofilm processes. The main objective of this consensus building paper is to establish a consistent set of rules that can be applied to existing and most importantly, future models. Applying the proposed notation should make it easier for everyone active in the wastewater treatment field to read, write and review documents describing modelling projects.

  2. A computational framework for a database of terrestrial biosphere models

    Science.gov (United States)

    Metzler, Holger; Müller, Markus; Ceballos-Núñez, Verónika; Sierra, Carlos A.

    2016-04-01

    Most terrestrial biosphere models consist of a set of coupled ordinary first order differential equations. Each equation represents a pool containing carbon with a certain turnover rate. Although such models share some basic mathematical structures, they can have very different properties such as number of pools, cycling rates, and internal fluxes. We present a computational framework that helps analyze the structure and behavior of terrestrial biosphere models using as an example the process of soil organic matter decomposition. The same framework can also be used for other sub-processes such as carbon fixation or allocation. First, the models have to be fed into a database consisting of simple text files with a common structure. Then they are read in using Python and transformed into an internal 'Model Class' that can be used to automatically create an overview stating the model's structure, state variables, internal and external fluxes. SymPy, a Python library for symbolic mathematics, helps to also calculate the Jacobian matrix at possibly given steady states and the eigenvalues of this matrix. If complete parameter sets are available, the model can also be run using R to simulate its behavior under certain conditions and to support a deeper stability analysis. In this case, the framework is also able to provide phase-plane plots if appropriate. Furthermore, an overview of all the models in the database can be given to help identify their similarities and differences.
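
    As an assumed two-pool example of what the framework automates (not an entry from the database itself), the sketch below builds a small carbon-pool model symbolically with SymPy, solves for its steady state, and computes the Jacobian and its eigenvalues.

```python
# Minimal sketch, assuming a hypothetical two-pool carbon model: the right-hand side
# is written symbolically, the steady state is solved, and the Jacobian plus its
# eigenvalues are derived automatically.
import sympy as sp

C1, C2 = sp.symbols("C1 C2", positive=True)          # pool contents
u, k1, k2, a21 = sp.symbols("u k1 k2 a21", positive=True)

# input u to pool 1, first-order decay, partial transfer from pool 1 to pool 2
f = sp.Matrix([u - k1 * C1,
               a21 * k1 * C1 - k2 * C2])

x = sp.Matrix([C1, C2])
J = f.jacobian(x)                                    # system Jacobian
steady = sp.solve(list(f), [C1, C2], dict=True)[0]   # steady state of both pools
print("steady state:", steady)
print("Jacobian:", J)
print("eigenvalues:", list(J.eigenvals()))
```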

  3. Service business model framework and the service innovation scope

    NARCIS (Netherlands)

    van der Aa, W.; van der Rhee, B.; Victorino, L.

    2011-01-01

    In this paper we present a framework for service business models. We build on three streams of research. The first stream is the service management and marketing literature that focuses on the specific challenges of managing a service business. The second stream consists of research on e-business

  4. Designing for Learning and Play - The Smiley Model as Framework

    DEFF Research Database (Denmark)

    Weitze, Charlotte Lærke

    2016-01-01

    This paper presents a framework for designing engaging learning experiences in games – the Smiley Model. In this Design-Based Research project, student-game-designers were learning inside a gamified learning design - while designing and implementing learning goals from curriculum into the small d...

  5. The BMW Model: A New Framework for Teaching Monetary Economics

    Science.gov (United States)

    Bofinger, Peter; Mayer, Eric; Wollmershauser, Timo

    2006-01-01

    Although the IS/LM-AS/AD model is still the central tool of macroeconomic teaching in most macroeconomic textbooks, it has been criticized by several economists. Colander (1995) demonstrated that the framework is logically inconsistent, Romer (2000) showed that it is unable to deal with a monetary policy that uses the interest rate as its…

  6. A compositional modelling framework for exploring MPSoC systems

    DEFF Research Database (Denmark)

    Tranberg-Hansen, Anders Sejer; Madsen, Jan

    2009-01-01

    This paper presents a novel compositional framework for system level performance estimation and exploration of Multi-Processor System On Chip (MPSoC) based systems. The main contributions are the definition of a compositional model which allows quantitative performance estimation to be carried ou...

  7. A Model-Driven Framework to Develop Personalized Health Monitoring

    Directory of Open Access Journals (Sweden)

    Algimantas Venčkauskas

    2016-07-01

    Full Text Available Both distributed healthcare systems and the Internet of Things (IoT) are currently hot topics. The latter is a new computing paradigm to enable advanced capabilities in engineering various applications, including those for healthcare. For such systems, the core social requirement is the privacy/security of the patient information along with the technical requirements (e.g., energy consumption) and capabilities for adaptability and personalization. Typically, the functionality of the systems is predefined by the patient's data collected using sensor networks along with medical instrumentation; then, the data is transferred through the Internet for treatment and decision-making. Therefore, systems creation is indeed challenging. In this paper, we propose a model-driven framework to develop the IoT-based prototype and its reference architecture for personalized health monitoring (PHM) applications. The framework contains a multi-layered structure with feature-based modeling and feature model transformations at the top and the application software generation at the bottom. We have validated the framework using available tools and developed an experimental PHM to test some aspects of the functionality of the reference architecture in real time. The main contribution of the paper is the development of the model-driven computational framework with emphasis on the synergistic effect of security and energy issues.

  8. Public–private partnership conceptual framework and models for the ...

    African Journals Online (AJOL)

    This paper presents public–private partnership (PPP) framework models for funding and financing of water services ... capital markets to finance water infrastructure, particularly local bond markets ... for the provision of water services infrastructure assets to be ... of water use charges and/or tariffs (pricing), regulatory impact.

  9. Generic modelling framework for economic analysis of battery systems

    DEFF Research Database (Denmark)

    You, Shi; Rasmussen, Claus Nygaard

    2011-01-01

    for battery cycle life estimation, since the cycle life plays a central role in the economic analysis of BS. To illustrate the modelling framework, a case study using a Sodium Sulfur Battery (NAS) system with 5-minute regulating service is performed. The economic performances of two dispatch scenarios, a so...

  10. A Liver-centric Multiscale Modeling Framework for Xenobiotics

    Science.gov (United States)

    We describe a multi-scale framework for modeling acetaminophen-induced liver toxicity. Acetaminophen is a widely used analgesic. Overdose of acetaminophen can result in liver injury via its biotransformation into a toxic product, which further induces massive necrosis. Our study foc...

  11. Service business model framework and the service innovation scope

    NARCIS (Netherlands)

    van der Aa, W.; van der Rhee, B.; Victorino, L.

    2011-01-01

    In this paper we present a framework for service business models. We build on three streams of research. The first stream is the service management and marketing literature that focuses on the specific challenges of managing a service business. The second stream consists of research on e-business mo

  12. The BMW Model: A New Framework for Teaching Monetary Economics

    Science.gov (United States)

    Bofinger, Peter; Mayer, Eric; Wollmershauser, Timo

    2006-01-01

    Although the IS/LM-AS/AD model is still the central tool of macroeconomic teaching in most macroeconomic textbooks, it has been criticized by several economists. Colander (1995) demonstrated that the framework is logically inconsistent, Romer (2000) showed that it is unable to deal with a monetary policy that uses the interest rate as its…

  13. Model-based safety architecture framework for complex systems

    NARCIS (Netherlands)

    Schuitemaker, K.; Rajabalinejad, M.; Braakhuis, J.G.; Podofilini, Luca; Sudret, Bruno; Stojadinovic, Bozidar; Zio, Enrico; Kröger, Wolfgang

    2015-01-01

    The shift to transparency and rising need of the general public for safety, together with the increasing complexity and interdisciplinarity of modern safety-critical Systems of Systems (SoS) have resulted in a Model-Based Safety Architecture Framework (MBSAF) for capturing and sharing architectural

  14. A Graph Based Framework to Model Virus Integration Sites

    Directory of Open Access Journals (Sweden)

    Raffaele Fronza

    2016-01-01

    Here, we addressed the challenge to: (1) define the notion of CIS on graph models, (2) demonstrate that the structure of CIS enters in the category of scale-free networks and (3) show that our network approach analyzes CIS dynamically in an integrated systems biology framework using the Retroviral Transposon Tagged Cancer Gene Database (RTCGD) as a testing dataset.

  15. Framework for Understanding Structural Errors (FUSE): a modular framework to diagnose differences between hydrological models

    Science.gov (United States)

    Clark, Martyn P.; Slater, Andrew G.; Rupp, David E.; Woods, Ross A.; Vrugt, Jasper A.; Gupta, Hoshin V.; Wagener, Thorsten; Hay, Lauren E.

    2008-01-01

    The problems of identifying the most appropriate model structure for a given problem and quantifying the uncertainty in model structure remain outstanding research challenges for the discipline of hydrology. Progress on these problems requires understanding of the nature of differences between models. This paper presents a methodology to diagnose differences in hydrological model structures: the Framework for Understanding Structural Errors (FUSE). FUSE was used to construct 79 unique model structures by combining components of 4 existing hydrological models. These new models were used to simulate streamflow in two of the basins used in the Model Parameter Estimation Experiment (MOPEX): the Guadalupe River (Texas) and the French Broad River (North Carolina). Results show that the new models produced simulations of streamflow that were at least as good as the simulations produced by the models that participated in the MOPEX experiment. Our initial application of the FUSE method for the Guadalupe River exposed relationships between model structure and model performance, suggesting that the choice of model structure is just as important as the choice of model parameters. However, further work is needed to evaluate model simulations using multiple criteria to diagnose the relative importance of model structural differences in various climate regimes and to assess the amount of independent information in each of the models. This work will be crucial to both identifying the most appropriate model structure for a given problem and quantifying the uncertainty in model structure. To facilitate research on these problems, the FORTRAN-90 source code for FUSE is available upon request from the lead author.

  16. A Framework for Understanding Physics Students' Computational Modeling Practices

    Science.gov (United States)

    Lunk, Brandon Robert

    With the growing push to include computational modeling in the physics classroom, we are faced with the need to better understand students' computational modeling practices. While existing research on programming comprehension explores how novices and experts generate programming algorithms, little of this discusses how domain content knowledge, and physics knowledge in particular, can influence students' programming practices. In an effort to better understand this issue, I have developed a framework for modeling these practices based on a resource stance towards student knowledge. A resource framework models knowledge as the activation of vast networks of elements called "resources." Much like neurons in the brain, resources that become active can trigger cascading events of activation throughout the broader network. This model emphasizes the connectivity between knowledge elements and provides a description of students' knowledge base. Together with resources, the concepts of "epistemic games" and "frames" provide a means for addressing the interaction between content knowledge and practices. Although this framework has generally been limited to describing conceptual and mathematical understanding, it also provides a means for addressing students' programming practices. In this dissertation, I will demonstrate this facet of a resource framework as well as fill in an important missing piece: a set of epistemic games that can describe students' computational modeling strategies. The development of this theoretical framework emerged from the analysis of video data of students generating computational models during the laboratory component of a Matter & Interactions: Modern Mechanics course. Student participants across two semesters were recorded as they worked in groups to fix pre-written computational models that were initially missing key lines of code. Analysis of this video data showed that the students' programming practices were highly influenced by

  17. A Systematic Modelling Framework for Phase Transfer Catalyst Systems

    DEFF Research Database (Denmark)

    Anantpinijwatna, Amata; Sales-Cruz, Mauricio; Hyung Kim, Sun

    2016-01-01

    in an aqueous phase. These reacting systems are receiving increased attention as novel organic synthesis options due to their flexible operation, higher product yields, and ability to avoid hazardous or expensive solvents. Major considerations in the design and analysis of PTC systems are physical and chemical equilibria, as well as kinetic mechanisms and rates. This paper presents a modelling framework for design and analysis of PTC systems that requires a minimum amount of experimental data to develop and employ the necessary thermodynamic and reaction models and embeds them into a reactor model for simulation. The application of the framework is made to two cases in order to highlight the performance and issues of activity coefficient models for predicting design and operation and the effects when different organic solvents are employed.

  18. Indeterminate direction relation model based on fuzzy description framework

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The indetermination of direction relation is a hot topic for fuzzy GIS researchers. The existing models only study the effects of indetermination of spatial objects, but ignore the uncertainty of the direction reference framework. In this paper, first a formalized representation model of indeterminate spatial objects is designed based on the quadruple (x,y,A,μ); then a fuzzy direction reference framework is constructed by revising the cone method, in which the partitions of direction tiles are smooth and continuous, and two neighboring sections overlap in the transitional zones via a fuzzy method. Grounded on these, a fuzzy description model for indeterminate direction relation is proposed in which the uncertainty of all three parts (source object, reference object and reference frame) is taken into account simultaneously. In the end, case studies are implemented to test the rationality and validity of the model.

  19. Framework of Distributed Coupled Atmosphere-Ocean-Wave Modeling System

    Institute of Scientific and Technical Information of China (English)

    WEN Yuanqiao; HUANG Liwen; DENG Jian; ZHANG Jinfeng; WANG Sisi; WANG Lijun

    2006-01-01

    In order to investigate the interactions between the atmosphere and ocean as well as their important role in the intense weather systems of coastal areas, and to improve the forecasting ability for hazardous weather processes in coastal areas, a coupled atmosphere-ocean-wave modeling system has been developed. The agent-based environment framework for linking models allows flexible and dynamic information exchange between models. For the purpose of flexibility, portability and scalability, the framework of the whole system takes a multi-layer architecture that includes a user interface layer, a computational layer and a service-enabling layer. The numerical experiment presented in this paper demonstrates the performance of the distributed coupled modeling system.

  20. Theoretical Models and Operational Frameworks in Public Health Ethics

    Science.gov (United States)

    Petrini, Carlo

    2010-01-01

    The article is divided into three sections: (i) an overview of the main ethical models in public health (theoretical foundations); (ii) a summary of several published frameworks for public health ethics (practical frameworks); and (iii) a few general remarks. Rather than maintaining the superiority of one position over the others, the main aim of the article is to summarize the basic approaches proposed thus far concerning the development of public health ethics by describing and comparing the various ideas in the literature. With this in mind, an extensive list of references is provided. PMID:20195441

  1. Theoretical Models and Operational Frameworks in Public Health Ethics

    Directory of Open Access Journals (Sweden)

    Carlo Petrini

    2010-01-01

    Full Text Available The article is divided into three sections: (i) an overview of the main ethical models in public health (theoretical foundations); (ii) a summary of several published frameworks for public health ethics (practical frameworks); and (iii) a few general remarks. Rather than maintaining the superiority of one position over the others, the main aim of the article is to summarize the basic approaches proposed thus far concerning the development of public health ethics by describing and comparing the various ideas in the literature. With this in mind, an extensive list of references is provided.

  2. A High Performance Bayesian Computing Framework for Spatiotemporal Uncertainty Modeling

    Science.gov (United States)

    Cao, G.

    2015-12-01

    All types of spatiotemporal measurements are subject to uncertainty. As spatiotemporal data become increasingly involved in scientific research and decision making, it is important to appropriately model the impact of uncertainty. Quantitatively modeling spatiotemporal uncertainty, however, is a challenging problem considering the complex dependence and data heterogeneities. State-space models provide a unifying and intuitive framework for dynamic systems modeling. In this paper, we aim to extend conventional state-space models for uncertainty modeling in space-time contexts while accounting for spatiotemporal effects and data heterogeneities. Gaussian Markov Random Field (GMRF) models, also known as conditional autoregressive models, are arguably the most commonly used methods for modeling spatially dependent data. GMRF models basically assume that a geo-referenced variable primarily depends on its neighborhood (Markov property), and the spatial dependence structure is described via a precision matrix. Recent studies have shown that GMRFs are efficient approximations to the commonly used Gaussian fields (e.g., Kriging); compared with Gaussian fields, GMRFs enjoy a series of appealing features, such as fast computation and the ability to easily account for heterogeneities in spatial data (e.g., point and areal). This paper represents each spatial dataset as a GMRF and integrates them into a state-space form to statistically model the temporal dynamics. Different types of spatial measurements (e.g., categorical, count or continuous) can be accounted for by corresponding link functions. A fast alternative to the MCMC framework, the so-called Integrated Nested Laplace Approximation (INLA), was adopted for model inference. Preliminary case studies will be conducted to showcase the advantages of the described framework. In the first case, we apply the proposed method to model the water table elevation of the Ogallala aquifer over the past decades. In the second case, we analyze the
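
    As a concrete illustration of the GMRF building block described above, the sketch below assembles the sparse precision matrix of a first-order random walk on a one-dimensional lattice; the lattice size and precision parameter are arbitrary placeholders, and the actual study works with spatial lattices and INLA-based inference.

```python
# Minimal sketch (not the author's code): a first-order Gaussian Markov
# random field on a 1-D lattice, defined through its sparse precision matrix.
import numpy as np
from scipy import sparse

n = 100        # number of lattice sites (illustrative)
kappa = 1.0    # precision of the increments (assumed)

# Random-walk-of-order-1 structure: diagonal counts neighbours, off-diagonals are -1.
main = np.full(n, 2.0)
main[0] = main[-1] = 1.0
Q = kappa * sparse.diags([-np.ones(n - 1), main, -np.ones(n - 1)],
                         offsets=[-1, 0, 1], format="csc")

# Markov property: each site interacts only with its neighbours, so Q is
# tridiagonal and inference can exploit sparse factorisations.
print(Q.toarray()[:4, :4])
```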

  3. Compendium of models from a gauge U(1) framework

    Science.gov (United States)

    Ma, Ernest

    2016-06-01

    A gauge U(1) framework was established in 2002 to extend the supersymmetric Standard Model. It has many possible realizations. Whereas all have the necessary and sufficient ingredients to explain the possible 750 GeV diphoton excess, observed recently by the ATLAS and CMS Collaborations at the large hadron collider (LHC), they differ in other essential aspects. A compendium of such models is discussed.

  4. Modeling Movement Primitives with Hidden Markov Models for Robotic and Biomedical Applications.

    Science.gov (United States)

    Karg, Michelle; Kulić, Dana

    2017-01-01

    Movement primitives are elementary motion units and can be combined sequentially or simultaneously to compose more complex movement sequences. A movement primitive time series consists of a sequence of motion phases. This progression through a set of motion phases can be modeled by Hidden Markov Models (HMMs). HMMs are stochastic processes that model time series data as the evolution of a hidden state variable through a discrete set of possible values, where each state value is associated with an observation (emission) probability. Each motion phase is represented by one of the hidden states and the sequential order by their transition probabilities. The observations of the MP-HMM are the sensor measurements of the human movement, for example, motion capture or inertial measurements. The emission probabilities are modeled as Gaussians. In this chapter, the MP-HMM modeling framework is described and applications to motion recognition and motion performance assessment are discussed. The selected applications include parametric MP-HMMs for explicitly modeling variability in movement performance and the comparison of MP-HMMs based on the log-likelihood, the Kullback-Leibler divergence, the extended HMM-based F-statistic, and gait-specific reference-based measures.
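
    A minimal sketch of the log-likelihood comparison mentioned above is given below. It assumes the third-party hmmlearn package and uses synthetic data in place of motion-capture measurements; state counts and dimensions are arbitrary choices, not those of the chapter.

```python
# Sketch of recognising a movement primitive by comparing log-likelihoods
# under per-class Gaussian HMMs. Uses the third-party `hmmlearn` package
# (an assumption); data are synthetic placeholders.
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(0)
walk_trials = [rng.normal(0.0, 1.0, size=(100, 3)) for _ in range(10)]
reach_trials = [rng.normal(2.0, 1.0, size=(80, 3)) for _ in range(10)]

def fit_mp_hmm(trials, n_states=4):
    # Stack all trials; `lengths` tells the HMM where each sequence ends.
    X = np.vstack(trials)
    lengths = [len(t) for t in trials]
    return GaussianHMM(n_components=n_states, covariance_type="diag",
                       n_iter=50).fit(X, lengths)

hmm_walk, hmm_reach = fit_mp_hmm(walk_trials), fit_mp_hmm(reach_trials)

# Recognise a new trial by picking the MP-HMM with the higher log-likelihood.
new_trial = rng.normal(2.0, 1.0, size=(90, 3))
scores = {"walk": hmm_walk.score(new_trial), "reach": hmm_reach.score(new_trial)}
print(max(scores, key=scores.get), scores)
```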

  5. Advancing Integrated Systems Modelling Framework for Life Cycle Sustainability Assessment

    Directory of Open Access Journals (Sweden)

    Anthony Halog

    2011-02-01

    Full Text Available The need for an integrated methodological framework for sustainability assessment has been widely discussed and is urgent due to increasingly complex environmental system problems. These problems have impacts on ecosystems and human well-being which represent a threat to the economic performance of countries and corporations. Integrated assessment crosses issues; spans spatial and temporal scales; looks forward and backward; and incorporates multi-stakeholder inputs. This study aims to develop an integrated methodology by capitalizing on the complementary strengths of different methods used by industrial ecologists and biophysical economists. The computational methodology proposed here is a systems-perspective, integrative, and holistic approach for sustainability assessment which attempts to link basic science and technology to policy formulation. The framework adopts life cycle thinking methods (LCA, LCC, and SLCA); stakeholder analysis supported by multi-criteria decision analysis (MCDA); and dynamic system modelling. Following the Pareto principle, the critical sustainability criteria, indicators and metrics (i.e., hotspots) can be identified and further modelled using system dynamics or agent-based modelling and improved by data envelopment analysis (DEA) and sustainability network theory (SNT). The framework is being applied to the development of biofuel supply chain networks. The framework can provide new ways of integrating knowledge across the divides between social and natural sciences as well as between critical and problem-solving research.

  6. Modelling Framework to Support Decision-Making in Manufacturing Enterprises

    Directory of Open Access Journals (Sweden)

    Tariq Masood

    2013-01-01

    Full Text Available Systematic model-driven decision-making is crucial to design, engineer, and transform manufacturing enterprises (MEs). Choosing and applying the best philosophies and techniques is challenging as most MEs deploy complex and unique configurations of process-resource systems and seek economies of scope and scale in respect of changing and distinctive product flows. This paper presents a novel systematic enhanced integrated modelling framework to facilitate transformation of MEs, which is centred on CIMOSA. Application of the new framework in an automotive industrial case study is also presented. The following new contributions to knowledge are made: (1) an innovative structured framework that can support various decisions in design, optimisation, and control to reconfigure MEs; (2) an enriched and generic process modelling approach with the capability to represent both static and dynamic aspects of MEs; and (3) an automotive industrial case application showing benefits in terms of reduced lead time and cost with improved responsiveness of the process-resource system, with a special focus on PPC. It is anticipated that the new framework is not limited to the automotive industry only but has a wider scope of application. Therefore, it would be interesting to extend its testing with different configurations and decision-making levels.

  7. Possibilities: A framework for modeling students' deductive reasoning in physics

    Science.gov (United States)

    Gaffney, Jonathan David Housley

    Students often make errors when trying to solve qualitative or conceptual physics problems, and while many successful instructional interventions have been generated to prevent such errors, the process of deduction that students use when solving physics problems has not been thoroughly studied. In an effort to better understand that reasoning process, I have developed a new framework, which is based on the mental models framework in psychology championed by P. N. Johnson-Laird. My new framework models how students search possibility space when thinking about conceptual physics problems and suggests that errors arise from failing to flesh out all possibilities. It further suggests that instructional interventions should focus on making apparent those possibilities, as well as all physical consequences those possibilities would incur. The possibilities framework emerged from the analysis of data from a unique research project specifically invented for the purpose of understanding how students use deductive reasoning. In the selection task, participants were given a physics problem along with three written possible solutions with the goal of identifying which one of the three possible solutions was correct. Each participant was also asked to identify the errors in the incorrect solutions. For the study presented in this dissertation, participants not only performed the selection task individually on four problems, but they were also placed into groups of two or three and asked to discuss with each other the reasoning they used in making their choices and attempt to reach a consensus about which solution was correct. Finally, those groups were asked to work together to perform the selection task on three new problems. The possibilities framework appropriately models the reasoning that students use, and it makes useful predictions about potentially helpful instructional interventions. The study reported in this dissertation emphasizes the useful insight the

  8. A Probabilistic Model for Face Transformation with Application to Person Identification

    Directory of Open Access Journals (Sweden)

    Rose Kenneth

    2004-01-01

    Full Text Available A novel approach for content-based image retrieval and its specialization to face recognition are described. While most face recognition techniques aim at modeling faces, our goal is to model the transformation between face images of the same person. As a global face transformation may be too complex to be modeled directly, it is approximated by a collection of local transformations with a constraint that imposes consistency between neighboring transformations. Local transformations and neighborhood constraints are embedded within a probabilistic framework using two-dimensional hidden Markov models (2D HMMs). We further introduce a new efficient technique, called turbo-HMM (T-HMM), for approximating intractable 2D HMMs. Experimental results on a face identification task show that our novel approach compares favorably to the popular eigenfaces and fisherfaces algorithms.

  9. An enhanced BSIM modeling framework for self-heating aware circuit design

    Science.gov (United States)

    Schleyer, M.; Leuschner, S.; Baumgartner, P.; Mueller, J.-E.; Klar, H.

    2014-11-01

    This work proposes a modeling framework to enhance the industry-standard BSIM4 MOSFET models with capabilities for coupled electro-thermal simulations. An automated simulation environment extracts thermal information from model data as provided by the semiconductor foundry. The standard BSIM4 model is enhanced with a Verilog-A based wrapper module, adding thermal nodes which can be connected to a thermal-equivalent RC network. The proposed framework allows a fully automated extraction process based on the netlist of the top-level design and the model library. A numerical analysis tool is used to control the extraction flow and to obtain all required parameters. The framework is used to model self-heating effects on a fully integrated class A/AB power amplifier (PA) designed in a standard 65 nm CMOS process. The PA is driven with +30 dBm output power, leading to an average temperature rise of approximately 40 °C over ambient temperature.

  10. Speech Recognition Using HMM with MFCC - An Analysis Using Frequency Spectral Decomposition Technique

    Directory of Open Access Journals (Sweden)

    Ibrahim Patel

    2010-12-01

    Full Text Available This paper presents an approach to speech signal recognition using frequency spectral information with the Mel frequency scale to improve speech feature representation in an HMM-based recognition approach. Frequency spectral information is incorporated into the conventional Mel-spectrum-based speech recognition approach. The Mel frequency approach observes the speech signal at a single, fixed resolution, so features overlap across resolutions and recognition accuracy is limited. Resolution decomposition with frequency separation is therefore used as the mapping approach for the HMM-based speech recognition system. Simulation results show an improvement in the quality metrics of speech recognition with respect to computational time and learning accuracy.
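
    For readers unfamiliar with the front end, the sketch below computes Mel-frequency cepstral features frame by frame, the kind of observation vectors an HMM recogniser consumes. It assumes the third-party librosa package and a synthetic signal; the paper's own feature pipeline and resolution-decomposition step are not reproduced here.

```python
# Illustrative sketch only: extracting MFCC observation vectors for an HMM
# recogniser, using the third-party `librosa` package (an assumption).
import numpy as np
import librosa

# Synthetic 1-second "utterance" so the snippet is self-contained.
sr = 16000
y = 0.1 * np.sin(2 * np.pi * 440 * np.arange(sr) / sr).astype(np.float32)

# Short-time analysis: 25 ms windows with a 10 ms hop, 13 cepstral coefficients.
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13,
                            n_fft=int(0.025 * sr), hop_length=int(0.010 * sr))
delta = librosa.feature.delta(mfcc)   # first-order temporal derivatives

# Each column (frame) becomes one observation vector for the HMM.
observations = np.vstack([mfcc, delta]).T
print(observations.shape)   # (n_frames, 26)
```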

  11. A modeling framework for system restoration from cascading failures.

    Science.gov (United States)

    Liu, Chaoran; Li, Daqing; Zio, Enrico; Kang, Rui

    2014-01-01

    System restoration from cascading failures is an integral part of the overall defense against catastrophic breakdown in networked critical infrastructures. From the outbreak of cascading failures to the system complete breakdown, actions can be taken to prevent failure propagation through the entire network. While most analysis efforts have been carried out before or after cascading failures, restoration during cascading failures has been rarely studied. In this paper, we present a modeling framework to investigate the effects of in-process restoration, which depends strongly on the timing and strength of the restoration actions. Furthermore, in the model we also consider additional disturbances to the system due to restoration actions themselves. We demonstrate that the effect of restoration is also influenced by the combination of system loading level and restoration disturbance. Our modeling framework will help to provide insights on practical restoration from cascading failures and guide improvements of reliability and resilience of actual network systems.
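
    A toy version of in-process restoration can be written as a load-redistribution cascade on a graph in which a fraction of failed nodes is reinstated after a chosen delay. Everything below (degree-based loads, capacity margin, timing and strength of the restoration) is an illustrative assumption, not the authors' model.

```python
# Toy sketch of a load-capacity cascade with a timed restoration action.
import networkx as nx

def cascade(G, shock_node, margin=1.05, restore_step=2, restore_frac=0.5):
    init_load = {n: float(d) for n, d in G.degree()}   # crude proxy for load
    load = dict(init_load)
    capacity = {n: margin * init_load[n] for n in G}
    failed, frontier = set(), {shock_node}
    for step in range(1, 20):
        failed |= frontier
        # Load of newly failed nodes is shed onto their surviving neighbours.
        for f in frontier:
            nbrs = [n for n in G[f] if n not in failed]
            for n in nbrs:
                load[n] += load[f] / max(len(nbrs), 1)
        if step == restore_step:
            # Timed in-process restoration: bring back a fraction of failed
            # nodes and reset their load (the "restoration strength").
            restored = set(list(failed)[: int(restore_frac * len(failed))])
            failed -= restored
            for n in restored:
                load[n] = init_load[n]
        frontier = {n for n in G if n not in failed and load[n] > capacity[n]}
        if not frontier:
            break
    return failed

G = nx.erdos_renyi_graph(200, 0.03, seed=1)
print(f"{len(cascade(G, shock_node=0))} nodes ultimately failed")
```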

  12. `Dhara': An Open Framework for Critical Zone Modeling

    Science.gov (United States)

    Le, P. V.; Kumar, P.

    2016-12-01

    Processes in the Critical Zone, which sustain terrestrial life, are tightly coupled across hydrological, physical, biological, chemical, pedological, geomorphological and ecological domains over both short and long timescales. Observations and quantification of the Earth's surface across these domains using emerging high resolution measurement technologies such as light detection and ranging (lidar) and hyperspectral remote sensing are enabling us to characterize fine scale landscape attributes over large spatial areas. This presents a unique opportunity to develop novel approaches to model the Critical Zone that can capture fine scale intricate dependencies across the different processes in 3D. The development of interdisciplinary tools that transcend individual disciplines and capture new levels of complexity and emergent properties is at the core of Critical Zone science. Here we introduce an open, high-performance computing framework (`Dhara') for modeling complex processes in the Critical Zone. The framework is designed to be modular in structure with the aim to create uniform and efficient tools to facilitate and leverage process modeling. It also provides flexibility to maintain, collaborate, and co-develop additional components by the scientific community. We show the essential framework that simulates ecohydrologic dynamics and surface/sub-surface coupling in 3D using hybrid parallel CPU-GPU computing. We demonstrate that the open framework in Dhara is feasible for detailed, multi-process, and large-scale modeling of the Critical Zone, which opens up exciting possibilities. We will also present outcomes from a Modeling Summer Institute led by the Intensively Managed Critical Zone Observatory (IMLCZO) with representation from several CZOs and international representatives.

  13. Web Mining Based on Hybrid Simulated Annealing Genetic Algorithm and HMM

    Institute of Scientific and Technical Information of China (English)

    邹腊梅; 龚向坚

    2012-01-01

    The training algorithm used for HMMs is a local-search, sub-optimal algorithm that is sensitive to initial parameters. Training a typical hidden Markov model with randomly chosen parameters therefore often ends in a local optimum, and Web information mining with such a model is ineffective. The genetic algorithm (GA) has a strong global search ability but converges slowly and is prone to premature convergence, while simulated annealing (SA) has a strong local search ability but tends to wander randomly and lacks global search power. Combining the advantages of the genetic algorithm and simulated annealing, this paper proposes a hybrid simulated annealing genetic algorithm (SGA). The best SGA parameters are chosen by experiment, and the SGA is combined with Baum-Welch to optimise the initial HMM parameters during Web mining, compensating for the sensitivity of the Baum-Welch algorithm to its starting point. Experimental results on extraction over five fields show that the SGA significantly improves performance in precision and recall.
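
    The hybrid search can be pictured as a genetic algorithm whose offspring pass through a simulated-annealing acceptance test before entering the next generation. The sketch below is generic: the fitness function is a placeholder standing in for the HMM data likelihood that Baum-Welch would subsequently refine, and all operator settings are assumptions rather than the paper's values.

```python
# Rough sketch of a hybrid simulated-annealing genetic algorithm (SGA).
import numpy as np

rng = np.random.default_rng(0)

def fitness(x):
    # Placeholder objective; in the paper this would score candidate
    # initial HMM parameters, e.g. via the forward-algorithm likelihood.
    return -np.sum((x - 0.3) ** 2)

def sga(pop_size=20, dim=8, generations=50, t0=1.0, cooling=0.95):
    pop = rng.random((pop_size, dim))
    temp = t0
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(scores)[-pop_size // 2:]]        # selection
        children = []
        for _ in range(pop_size):
            a, b = parents[rng.integers(len(parents), size=2)]
            child = np.where(rng.random(dim) < 0.5, a, b)          # crossover
            child += rng.normal(0, 0.1, dim) * (rng.random(dim) < 0.2)  # mutation
            # Simulated-annealing acceptance: keep worse children with a
            # probability that shrinks as the temperature cools.
            base = parents[rng.integers(len(parents))]
            delta = fitness(child) - fitness(base)
            children.append(child if delta > 0 or
                            rng.random() < np.exp(delta / temp) else base)
        pop, temp = np.clip(np.array(children), 0.0, 1.0), temp * cooling
    return max(pop, key=fitness)

best = sga()
print(fitness(best))
```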

  14. Research study on harmonized molecular materials (HMM); Bunshi kyocho zairyo ni kansuru chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    As functional materials to satisfy various needs for environmental harmonization and efficient conversion in information-oriented and aging societies, HMMs were surveyed. Living bodies effectively carry out transmission/processing of information and transport/conversion of substances, and these functions are based on harmonization between organic molecules, and between those and metal or inorganic ones. HMM is a key class of substances for artificially realizing these bio-related functions. Its R & D aims at (1) making a breakthrough in production processes based on innovation of material separation/conversion technology, (2) contributing to an information-oriented society through high-efficiency devices, and (3) growing a functional bio-material industry. HMM is classified into three categories: (1) assembly materials such as organic ultra-thin films (LB film, self-organizing film) and organic/inorganic hybrid materials for optoelectronics, sensors and devices, (2) mesophase materials such as functional separation membranes and photo-conductive materials, and (3) microporous materials such as synthetic catalysts using guest/host materials. 571 refs., 88 figs., 21 tabs.

  15. Mechanisms of Soil Aggregation: a biophysical modeling framework

    Science.gov (United States)

    Ghezzehei, T. A.; Or, D.

    2016-12-01

    Soil aggregation is one of the main crosscutting concepts in all sub-disciplines and applications of soil science from agriculture to climate regulation. The concept generally refers to adhesion of primary soil particles into distinct units that remain stable when subjected to disruptive forces. It is one of the most sensitive soil qualities that readily respond to disturbances such as cultivation, fire, drought, flooding, and changes in vegetation. These changes are commonly quantified and incorporated in soil models indirectly as alterations in carbon content and type, bulk density, aeration, permeability, as well as water retention characteristics. Soil aggregation that is primarily controlled by organic matter generally exhibits hierarchical organization of soil constituents into stable units that range in size from a few microns to centimeters. However, this conceptual model of soil aggregation as the key unifying mechanism remains poorly quantified and is rarely included in predictive soil models. Here we provide a biophysical framework for quantitative and predictive modeling of soil aggregation and its attendant soil characteristics. The framework treats aggregates as hotspots of biological, chemical and physical processes centered around roots and root residue. We keep track of the life cycle of an individual aggregate from its genesis in the rhizosphere, fueled by rhizodeposition and mediated by vigorous microbial activity, until its disappearance when the root-derived resources are depleted. The framework synthesizes current understanding of microbial life in porous media; water holding and soil binding capacity of biopolymers; and environmental controls on soil organic matter dynamics. The framework paves the way for integration of processes that are presently modeled as disparate or poorly coupled processes, including storage and protection of carbon, microbial activity, greenhouse gas fluxes, movement and storage of water, resistance of soils against

  16. Bayesian-based Project Monitoring: Framework Development and Model Testing

    Directory of Open Access Journals (Sweden)

    Budi Hartono

    2015-12-01

    Full Text Available During project implementation, risk becomes an integral part of project monitoring. Therefore, a tool that could dynamically include elements of risk in project progress monitoring is needed. The objective of this study is to develop a general framework that addresses such a concern. The developed framework consists of three interrelated major building blocks, namely: Risk Register (RR), Bayesian Network (BN), and Project Time Networks (PTN) for dynamic project monitoring. RR is used to list and to categorize identified project risks. PTN is utilized for modeling the relationship between project activities. BN is used to reflect the interdependence among risk factors and to bridge RR and PTN. A residential development project is chosen as a working example and the result shows that the proposed framework has been successfully applied. The specific model of the development project is also successfully developed and is used to monitor the project progress. It is shown in this study that the proposed BN-based model provides superior performance in terms of forecast accuracy compared to the extant models.

  17. A VGI data integration framework based on linked data model

    Science.gov (United States)

    Wan, Lin; Ren, Rongrong

    2015-12-01

    This paper addresses geographic data integration and sharing for multiple online VGI data sets. We propose a semantic-enabled framework for an online VGI-source cooperative application environment to solve a target class of geospatial problems. Based on linked data technologies, one of the core components of the Semantic Web, we construct relationship links among geographic features distributed across diverse VGI platforms using linked data modeling methods, then deploy these semantic-enabled entities on the web, and eventually form an interconnected geographic data network to support geospatial information cooperative application across multiple VGI data sources. The mapping and transformation from VGI sources to the RDF linked data model is presented to guarantee a unified data representation model among different online social geographic data sources. We propose a mixed strategy which combines spatial distance similarity and feature name attribute similarity as the measure standard to compare and match different geographic features in various VGI data sets. Our work also focuses on how to apply Markov logic networks to achieve interlinks of the same linked data in different VGI-based linked data sets. In our method, the automatic generation of a co-reference object identification model from geographic linked data is discussed in more detail. The result is a large geographic linked data network across loosely coupled VGI web sites. The results of the experiment built on our framework and the evaluation of our method show that the framework is reasonable and practicable.
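
    The mixed matching strategy, a weighted combination of spatial proximity and name similarity, can be illustrated as follows; the weights, distance scale and decision threshold are arbitrary assumptions, and the example coordinates and names are made up.

```python
# Small sketch of combining spatial-distance similarity with name-string
# similarity to decide whether two VGI features describe the same object.
import math
from difflib import SequenceMatcher

def name_similarity(a, b):
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def spatial_similarity(p, q, scale_m=100.0):
    # Rough planar distance in metres for nearby points (assumption).
    dx = (p[1] - q[1]) * 111_320 * math.cos(math.radians(p[0]))
    dy = (p[0] - q[0]) * 110_540
    return math.exp(-math.hypot(dx, dy) / scale_m)

def match_score(f, g, w_space=0.6, w_name=0.4):
    return (w_space * spatial_similarity(f["coord"], g["coord"])
            + w_name * name_similarity(f["name"], g["name"]))

osm = {"name": "Central Railway Station", "coord": (30.5167, 114.3333)}
wiki = {"name": "Central Station", "coord": (30.5169, 114.3340)}
print(f"same feature? {match_score(osm, wiki) > 0.7}")
```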

  18. A Building Model Framework for a Genetic Algorithm Multi-objective Model Predictive Control

    DEFF Research Database (Denmark)

    Arendt, Krzysztof; Ionesi, Ana; Jradi, Muhyiddine

    2016-01-01

    Mock-Up Interface, which is used to link the models with the MPC system. The framework was used to develop and run initial thermal and CO2 models. Their performance and the implementation procedure are discussed in the present paper. The framework is going to be implemented in the MPC system planned...

  19. A framework to model real-time databases

    CERN Document Server

    Idoudi, Nizar; Duvallet, Claude; Sadeg, Bruno; Bouaziz, Rafik; Gargouri, Faiez

    2010-01-01

    Real-time databases deal with time-constrained data and time-constrained transactions. The design of this kind of database requires the introduction of new concepts to support both the data structures and the dynamic behaviour of the database. In this paper, we give an overview of different aspects of real-time databases and clarify the requirements for their modelling. Then, we present a framework for real-time database design and describe its fundamental operations. A case study demonstrates the validity of the structural model and illustrates SQL queries and Java code generated from the classes of the model.

  20. An Integrated Modeling Framework for Probable Maximum Precipitation and Flood

    Science.gov (United States)

    Gangrade, S.; Rastogi, D.; Kao, S. C.; Ashfaq, M.; Naz, B. S.; Kabela, E.; Anantharaj, V. G.; Singh, N.; Preston, B. L.; Mei, R.

    2015-12-01

    With the increasing frequency and magnitude of extreme precipitation and flood events projected in the future climate, there is a strong need to enhance our modeling capabilities to assess the potential risks to critical energy-water infrastructures such as major dams and nuclear power plants. In this study, an integrated modeling framework is developed through high performance computing to investigate the climate change effects on probable maximum precipitation (PMP) and probable maximum flood (PMF). Multiple historical storms from 1981-2012 over the Alabama-Coosa-Tallapoosa River Basin near the Atlanta metropolitan area are simulated by the Weather Research and Forecasting (WRF) model using the Climate Forecast System Reanalysis (CFSR) forcings. After further WRF model tuning, these storms are used to simulate PMP through moisture maximization at initial and lateral boundaries. A high resolution hydrological model, the Distributed Hydrology-Soil-Vegetation Model, implemented at 90 m resolution and calibrated by the U.S. Geological Survey streamflow observations, is then used to simulate the corresponding PMF. In addition to the control simulation that is driven by CFSR, multiple storms from the Community Climate System Model version 4 under the Representative Concentration Pathway 8.5 emission scenario are used to simulate PMP and PMF in the projected future climate conditions. The multiple PMF scenarios developed through this integrated modeling framework may be utilized to evaluate the vulnerability of existing energy-water infrastructures with respect to various aspects associated with PMP and PMF.

  1. Automatic Model Generation Framework for Computational Simulation of Cochlear Implantation

    DEFF Research Database (Denmark)

    Mangado Lopez, Nerea; Ceresa, Mario; Duchateau, Nicolas

    2016-01-01

    Recent developments in computational modeling of cochlear implantation are promising to study in silico the performance of the implant before surgery. However, creating a complete computational model of the patient's anatomy while including an external device geometry remains challenging. To address such a challenge, we propose an automatic framework for the generation of patient-specific meshes for finite element modeling of the implanted cochlea. First, a statistical shape model is constructed from high-resolution anatomical μCT images. Then, by fitting the statistical model to a patient's CT image, an accurate model of the patient-specific cochlea anatomy is obtained. An algorithm based on the parallel transport frame is employed to perform the virtual insertion of the cochlear implant. Our automatic framework also incorporates the surrounding bone and nerve fibers and assigns

  2. A Structural Model Decomposition Framework for Systems Health Management

    Science.gov (United States)

    Roychoudhury, Indranil; Daigle, Matthew J.; Bregon, Anibal; Pulido, Belamino

    2013-01-01

    Systems health management (SHM) is an important set of technologies aimed at increasing system safety and reliability by detecting, isolating, and identifying faults; and predicting when the system reaches end of life (EOL), so that appropriate fault mitigation and recovery actions can be taken. Model-based SHM approaches typically make use of global, monolithic system models for online analysis, which results in a loss of scalability and efficiency for large-scale systems. Improvement in scalability and efficiency can be achieved by decomposing the system model into smaller local submodels and operating on these submodels instead. In this paper, the global system model is analyzed offline and structurally decomposed into local submodels. We define a common model decomposition framework for extracting submodels from the global model. This framework is then used to develop algorithms for solving model decomposition problems for the design of three separate SHM technologies, namely, estimation (which is useful for fault detection and identification), fault isolation, and EOL prediction. We solve these model decomposition problems using a three-tank system as a case study.

  3. Trading USDCHF filtered by Gold dynamics via HMM coupling

    OpenAIRE

    2013-01-01

    We devise a USDCHF trading strategy using the dynamics of gold as a filter. Our strategy involves modelling both USDCHF and gold using a coupled hidden Markov model (CHMM). The observations will be indicators, RSI and CCI, which will be used as triggers for our trading signals. Upon decoding the model in each iteration, we can get the next most probable state and the next most probable observation. Hopefully by taking advantage of intermarket analysis and the Markov property implicit in the m...

  4. Implementation of a Tour Guide Robot System Using RFID Technology and Viterbi Algorithm-Based HMM for Speech Recognition

    Directory of Open Access Journals (Sweden)

    Neng-Sheng Pai

    2014-01-01

    Full Text Available This paper applied speech recognition and RFID technologies to turn an omni-directional mobile robot into a robot with voice control and tour-guide introduction functions. For speech recognition, the speech signals were captured by short-time processing. The speaker first recorded the isolated words for the robot to create a speech database of specific speakers. After pre-processing of this speech database, the feature parameters of the cepstrum and delta-cepstrum were obtained using linear predictive coding (LPC). Then, the Hidden Markov Model (HMM) was used for model training on the speech database, and the Viterbi algorithm was used to find an optimal state sequence as the reference sample for speech recognition. The trained reference models were loaded onto the industrial computer on the robot platform, and the user uttered the isolated words to be tested. After processing with the same feature extraction and comparison against the stored reference models, the model whose Viterbi path gave the maximum total probability was taken as the recognition result. Finally, the speech recognition and RFID systems were tested in a real environment to prove their feasibility and stability, and were implemented on the omni-directional mobile robot.
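
    The 'maximum total probability path' step is the standard Viterbi recursion. A compact, generic log-space implementation is sketched below (not the authors' code); the two-state example at the end is synthetic.

```python
# Generic Viterbi decoder in log space for a discrete-observation HMM.
import numpy as np

def viterbi(log_pi, log_A, log_B, obs):
    """log_pi: (S,) initial, log_A: (S,S) transition, log_B: (S,V) emission."""
    S, T = log_A.shape[0], len(obs)
    delta = np.empty((T, S))
    psi = np.zeros((T, S), dtype=int)
    delta[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, T):
        cand = delta[t - 1][:, None] + log_A      # cand[i, j]: from i to j
        psi[t] = cand.argmax(axis=0)
        delta[t] = cand.max(axis=0) + log_B[:, obs[t]]
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):                 # backtrack the best path
        path.append(int(psi[t][path[-1]]))
    return path[::-1], float(delta[-1].max())

# Tiny two-state example with three possible observation symbols.
log = np.log
pi = log([0.6, 0.4])
A = log([[0.7, 0.3], [0.4, 0.6]])
B = log([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]])
print(viterbi(pi, A, B, obs=[0, 1, 2, 2]))
```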

  5. Research on Web Information Extraction Based on Improved Genetic Annealing and HMM

    Institute of Scientific and Technical Information of China (English)

    李荣; 冯丽萍; 王鸿斌

    2014-01-01

    To further raise the accuracy of Web information extraction, and to address the shortcomings of the hidden Markov model (HMM) and its hybrid methods in parameter optimisation, we present a Web extraction algorithm based on improved genetic annealing and HMM. First, the algorithm sets up a novel HMM with a backward dependency assumption; secondly, it applies the improved genetic annealing algorithm to optimise the HMM parameters. After the genetic operators and the parameters of simulated annealing (SA) have been improved, the subpopulations are classified according to the adaptive crossover and mutation probabilities of the genetic algorithm (GA) in order to realise multi-group parallel search and information exchange, which avoids premature convergence and speeds up convergence. SA is then used as a GA operator to strengthen the local search capability. Finally, the bi-order Viterbi algorithm is used for decoding. Compared with existing HMM optimisation methods, the comprehensive Fβ=1 value in the experiments increases by 6% on average, which shows that the improved algorithm can effectively raise extraction accuracy and search performance.

  6. A Simulink simulation framework of a MagLev model

    Energy Technology Data Exchange (ETDEWEB)

    Boudall, H.; Williams, R.D.; Giras, T.C. [University of Virginia, Charlottesville (United States). School of Enegineering and Applied Science

    2003-09-01

    This paper presents a three-degree-of-freedom model of a section of the magnetically levitated train Maglev. The Maglev system dealt with in this article utilizes electromagnetic levitation. Each MagLev vehicle section is viewed as two separate parts, namely a body and a chassis, coupled by a set of springs and dampers. The MagLev model includes the propulsion, the guidance and the levitation systems. The equations of motion are developed. A Simulink simulation framework is implemented in order to study the interaction between the different systems and the dynamics of a MagLev vehicle. The simulation framework will eventually serve as a tool to assist the design and development of the Maglev system in the United States of America. (author)
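
    The body/chassis coupling described above boils down to coupled mass-spring-damper equations of motion. The sketch below integrates a one-degree-of-freedom-per-part version with placeholder parameters; it is only meant to show the structure of the equations, not MagLev design values.

```python
# Illustrative two-mass body/chassis sketch with a spring-damper coupling.
import numpy as np
from scipy.integrate import solve_ivp

m_body, m_chassis = 30_000.0, 10_000.0      # kg (assumed)
k, c = 5.0e5, 2.0e4                         # coupling spring / damper (assumed)

def rhs(t, y):
    x_b, v_b, x_c, v_c = y
    f_couple = k * (x_c - x_b) + c * (v_c - v_b)   # spring-damper force on body
    guideway = 0.01 * np.sin(2 * np.pi * 2.0 * t)  # toy guideway disturbance
    a_b = f_couple / m_body
    a_c = (-f_couple + k * (guideway - x_c)) / m_chassis
    return [v_b, a_b, v_c, a_c]

sol = solve_ivp(rhs, (0.0, 5.0), [0.0, 0.0, 0.0, 0.0], max_step=1e-3)
print(f"peak body displacement: {np.abs(sol.y[0]).max():.4f} m")
```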

  7. Modelling Framework and Assistive Device for Peripheral Intravenous Injections

    Science.gov (United States)

    Kam, Kin F.; Robinson, Martin P.; Gilbert, Mathew A.; Pelah, Adar

    2016-02-01

    Intravenous access for blood sampling or drug administration that requires peripheral venepuncture is perhaps the most common invasive procedure practiced in hospitals, clinics and general practice surgeries. We describe an idealised mathematical framework for modelling the dynamics of the peripheral venepuncture process. Basic assumptions of the model are confirmed through motion analysis of needle trajectories during venepuncture, taken from video recordings of a skilled practitioner injecting into a practice kit. The framework is also applied to the design and construction of a proposed device for accurate needle guidance during venepuncture administration, which was assessed as consistent and repeatable in application and did not lead to over-puncture. The study provides insights into the ubiquitous peripheral venepuncture process and may contribute to applications in training and in the design of new devices, including for use in robotic automation.

  8. Integrating knowledge seeking into knowledge management models and frameworks

    Directory of Open Access Journals (Sweden)

    Francois Lottering

    2012-02-01

    Full Text Available Background: A striking feature of the knowledge management (KM literature is that the standard list of KM processes either subsumes or overlooks the process of knowledge seeking. Knowledge seeking is manifestly under-theorised, making the need to address this gap in KM theory and practice clear and urgent.Objectives: This article investigates the theoretical status of the knowledge-seeking process in extant KM models and frameworks. It also statistically describes knowledge seeking and knowledge sharing practices in a sample of South African companies. Using this data, it proposes a KM model based on knowledge seeking.Method: Knowledge seeking is traced in a number of KM models and frameworks with a specific focus on Han Lai and Margaret Graham’s adapted KM cycle model, which separates knowledge seeking from knowledge sharing. This empirical investigation used a questionnaire to examine knowledge seeking and knowledge sharing practices in a sample of South African companies.Results: This article critiqued and elaborated on the adapted KM cycle model of Lai and Graham. It identified some of the key features of knowledge seeking practices in the workplace. It showed that knowledge seeking and sharing are human-centric actions and that seeking knowledge uses trust and loyalty as its basis. It also showed that one cannot separate knowledge seeking from knowledge sharing.Conclusion: The knowledge seeking-based KM model elaborates on Lai and Graham’s model. It provides insight into how and where people seek and share knowledge in the workplace. The article concludes that it is necessary to cement the place of knowledge seeking in KM models as well as frameworks and suggests that organisations should apply its findings to improving their knowledge management strategies. 

  9. Certified reduced basis model validation: A frequentistic uncertainty framework

    OpenAIRE

    Patera, A. T.; Huynh, Dinh Bao Phuong; Knezevic, David; Patera, Anthony T.

    2011-01-01

    We introduce a frequentistic validation framework for assessment — acceptance or rejection — of the consistency of a proposed parametrized partial differential equation model with respect to (noisy) experimental data from a physical system. Our method builds upon the Hotelling T² statistical hypothesis test for bias first introduced by Balci and Sargent in 1984 and subsequently extended by McFarland and Mahadevan (2008). Our approach introduces two new elements: a spectral repre...

  10. Common and Innovative Visuals: A sparsity modeling framework for video.

    Science.gov (United States)

    Abdolhosseini Moghadam, Abdolreza; Kumar, Mrityunjay; Radha, Hayder

    2014-05-02

    Efficient video representation models are critical for many video analysis and processing tasks. In this paper, we present a framework based on the concept of finding the sparsest solution to model video frames. To model the spatio-temporal information, frames from one scene are decomposed into two components: (i) a common frame, which describes the visual information common to all the frames in the scene/segment, and (ii) a set of innovative frames, which depicts the dynamic behaviour of the scene. The proposed approach exploits and builds on recent results in the field of compressed sensing to jointly estimate the common frame and the innovative frames for each video segment. We refer to the proposed modeling framework by CIV (Common and Innovative Visuals). We show how the proposed model can be utilized to find scene change boundaries and extend CIV to videos from multiple scenes. Furthermore, the proposed model is robust to noise and can be used for various video processing applications without relying on motion estimation and detection or image segmentation. Results for object tracking, video editing (object removal, inpainting) and scene change detection are presented to demonstrate the efficiency and the performance of the proposed model.

  11. Dynamic modelling of household automobile transactions within a microsimulation framework

    Energy Technology Data Exchange (ETDEWEB)

    Mohammadian, A.

    2002-07-01

    This thesis presents a newly developed dynamic model of household automobile transactions within an integrated land-use transportation and environment (ILUTE) modeling system framework. It is a market-based decision-making tool for use by individuals who have to choose between adding new vehicles to a fleet, disposing of vehicles, trading one of the vehicles of a fleet, or doing nothing. Different approaches were used within the model, including an artificial neural network, hedonic price regression, and vehicle class and vintage choices. The model can also predict the complex behaviour of individuals deciding whether to become active in the market. An estimation approach was used to incorporate the vehicle type choice model into the main dynamic transaction choice model.

  12. A framework for the calibration of social simulation models

    CERN Document Server

    Ciampaglia, Giovanni Luca

    2013-01-01

    Simulation with agent-based models is increasingly used in the study of complex socio-technical systems and in social simulation in general. This paradigm offers a number of attractive features, namely the possibility of modeling emergent phenomena within large populations. As a consequence, often the quantity in need of calibration may be a distribution over the population whose relation with the parameters of the model is analytically intractable. Nevertheless, we can simulate. In this paper we present a simulation-based framework for the calibration of agent-based models with distributional output based on indirect inference. We illustrate our method step by step on a model of norm emergence in an online community of peer production, using data from three large Wikipedia communities. Model fit and diagnostics are discussed.

  13. Population balance models: a useful complementary modelling framework for future WWTP modelling.

    Science.gov (United States)

    Nopens, Ingmar; Torfs, Elena; Ducoste, Joel; Vanrolleghem, Peter A; Gernaey, Krist V

    2015-01-01

    Population balance models (PBMs) represent a powerful modelling framework for the description of the dynamics of properties that are characterised by distributions. This distribution of properties under transient conditions has been demonstrated in many chemical engineering applications. Modelling efforts of several current and future unit processes in wastewater treatment plants could potentially benefit from this framework, especially when distributed dynamics have a significant impact on the overall unit process performance. In these cases, current models that rely on average properties cannot sufficiently capture the true behaviour and even lead to completely wrong conclusions. Examples of distributed properties are bubble size, floc size, crystal size or granule size. In these cases, PBMs can be used to develop new knowledge that can be embedded in our current models to improve their predictive capability. Hence, PBMs should be regarded as a complementary modelling framework to biokinetic models. This paper provides an overview of current applications, future potential and limitations of PBMs in the field of wastewater treatment modelling, thereby looking over the fence to other scientific disciplines.

  14. Population Balance Models: A useful complementary modelling framework for future WWTP modelling

    DEFF Research Database (Denmark)

    Nopens, Ingmar; Torfs, Elena; Ducoste, Joel

    2014-01-01

    processes in WWTPs could potentially benefit from this framework, especially when distributed dynamics have a significant impact on the overall unit process performance. In these cases, current models that rely on average properties cannot sufficiently capture the true behaviour. Examples are bubble size...

  15. An Integrated Snow Radiance and Snow Physics Modeling Framework for Cold Land Surface Modeling

    Science.gov (United States)

    Kim, Edward J.; Tedesco, Marco

    2006-01-01

    Recent developments in forward radiative transfer modeling and physical land surface modeling are converging to allow the assembly of an integrated snow/cold lands modeling framework for land surface modeling and data assimilation applications. The key elements of this framework include: a forward radiative transfer model (FRTM) for snow, a snowpack physical model, a land surface water/energy cycle model, and a data assimilation scheme. Together these form a flexible framework for self-consistent remote sensing and water/energy cycle studies. In this paper we will describe the elements and the integration plan. Each element of this framework is modular so the choice of element can be tailored to match the emphasis of a particular study. For example, within our framework, four choices of an FRTM are available to simulate the brightness temperature of snow; two models are available to model the physical evolution of the snowpack and underlying soil, and two models are available to handle the water/energy balance at the land surface. Since the framework is modular, other models, physical or statistical, can be accommodated too. All modules will operate within the framework of the Land Information System (LIS), a land surface modeling framework with data assimilation capabilities running on a parallel-node computing cluster at the NASA Goddard Space Flight Center. The advantages of such an integrated modular framework built on the LIS will be described through examples, e.g., studies to analyze snow field experiment observations, and simulations of future satellite missions for snow and cold land processes.

  16. C-HiLasso: A Collaborative Hierarchical Sparse Modeling Framework

    CERN Document Server

    Sprechmann, Pablo; Sapiro, Guillermo; Eldar, Yonina

    2010-01-01

    Sparse modeling is a powerful framework for data analysis and processing. Traditionally, encoding in this framework is performed by solving an L1-regularized linear regression problem, commonly referred to as Lasso or Basis Pursuit. In this work we combine the sparsity-inducing property of the Lasso model at the individual feature level, with the block-sparsity property of the Group Lasso model, where sparse groups of features are jointly encoded, obtaining a sparsity pattern hierarchically structured. This results in the Hierarchical Lasso (HiLasso), which shows important practical modeling advantages. We then extend this approach to the collaborative case, where a set of simultaneously coded signals share the same sparsity pattern at the higher (group) level, but not necessarily at the lower (inside the group) level, obtaining the collaborative HiLasso model (C-HiLasso). Such signals then share the same active groups, or classes, but not necessarily the same active set. This model is very well suited for ap...
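
    The hierarchical sparsity pattern can be illustrated with the two thresholding operators that underlie Lasso-type and Group-Lasso-type penalties: element-wise soft-thresholding inside groups followed by block soft-thresholding of whole groups. The sketch below is a generic illustration of this idea, not the authors' solver, and the regularisation values are arbitrary.

```python
# Generic sketch: element-wise (Lasso) plus group-wise (Group Lasso)
# soft-thresholding, producing a hierarchically structured sparsity pattern.
import numpy as np

def soft_threshold(x, lam):
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def group_soft_threshold(x, groups, lam):
    out = x.copy()
    for g in groups:
        norm = np.linalg.norm(x[g])
        out[g] = 0.0 if norm <= lam else (1.0 - lam / norm) * x[g]
    return out

def hierarchical_prox(x, groups, lam1, lam2):
    # Within-group sparsity first, then whole groups are switched on or off.
    return group_soft_threshold(soft_threshold(x, lam1), groups, lam2)

x = np.array([0.1, 2.0, -1.5, 0.05, 0.2, -0.1])
groups = [slice(0, 3), slice(3, 6)]          # two blocks of three features
print(hierarchical_prox(x, groups, lam1=0.3, lam2=1.0))
```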

  17. Designing for Learning and Play - The Smiley Model as Framework

    DEFF Research Database (Denmark)

    Weitze, Charlotte Lærke

    2016-01-01

    This paper presents a framework for designing engaging learning experiences in games – the Smiley Model. In this Design-Based Research project, student-game-designers were learning inside a gamified learning design while designing and implementing learning goals from the curriculum into the small...... digital games. The Smiley Model inspired and provided a scaffold or a heuristic for the overall gamified learning design – as well as for the students' learning game design processes when creating small games, turning the learning situation into an engaging experience. The audience for the experiments

  18. A constitutive model for magnetostriction based on thermodynamic framework

    Science.gov (United States)

    Ho, Kwangsoo

    2016-08-01

    This work presents a general framework for the continuum-based formulation of dissipative materials with magneto-mechanical coupling from the viewpoint of irreversible thermodynamics. The thermodynamically consistent model developed for the magnetic hysteresis is extended to include the magnetostrictive effect. The dissipative and hysteretic response of magnetostrictive materials is captured through the introduction of internal state variables. The evolution rate of magnetostrictive strain as well as magnetization is derived from thermodynamic and dissipative potentials in accordance with the general principles of thermodynamics. It is then demonstrated that the constitutive model is able to describe the magneto-mechanical behavior by comparing simulation results with the experimental data reported in the literature.

  19. Next generation framework for aquatic modeling of the Earth System

    Directory of Open Access Journals (Sweden)

    C. J. Vörösmarty

    2009-03-01

    Full Text Available Earth System model development is becoming an increasingly complex task. As scientists attempt to represent the physical and bio-geochemical processes and various feedback mechanisms in unprecedented detail, the models themselves are becoming increasingly complex. At the same time, the complexity of the surrounding IT infrastructure is growing as well. Earth System models must manage a vast amount of data in heterogeneous computing environments. Numerous development efforts are under way to ease that burden and offer model development platforms that reduce IT challenges and allow scientists to focus on their science. While these new modeling frameworks (e.g. FMS, ESMF, CCA, OpenMI) do provide solutions to many IT challenges (performing input/output, managing space and time, establishing model coupling, etc.), they are still considerably complex and often have steep learning curves.

    The Next generation Framework for Aquatic Modeling of the Earth System (NextFrAMES), a revised version of FrAMES, has numerous similarities to those developed by other teams, but represents a novel model development paradigm. NextFrAMES is built around a modeling XML that lets modelers express the overall model structure and provides an API for dynamically linked plugins to represent the processes. The model XML is executed by the NextFrAMES run-time engine that parses the model definition, loads the module plugins, performs the model I/O and executes the model calculations. NextFrAMES has a minimalistic view of representing spatial domains and treats every domain (regardless of its layout, such as grid, network tree, individual points, polygons, etc.) as a vector of objects. NextFrAMES performs computations on multiple domains, and interactions between different spatial domains are carried out through couplers. NextFrAMES allows processes to operate at different frequencies by providing rudimentary aggregation and disaggregation facilities.

    NextFrAMES was

  20. An economic evaluation of home management of malaria in Uganda: an interactive Markov model.

    Science.gov (United States)

    Lubell, Yoel; Mills, Anne J; Whitty, Christopher J M; Staedke, Sarah G

    2010-08-27

    Home management of malaria (HMM), promoting presumptive treatment of febrile children in the community, is advocated to improve prompt appropriate treatment of malaria in Africa. The cost-effectiveness of HMM is likely to vary widely in different settings and with the antimalarial drugs used. However, no data on the cost-effectiveness of HMM programmes are available. A Markov model was constructed to estimate the cost-effectiveness of HMM as compared to conventional care for febrile illnesses in children without HMM. The model was populated with data from Uganda, but is designed to be interactive, allowing the user to adjust certain parameters, including the antimalarials distributed. The model calculates the cost per disability adjusted life year averted and presents the incremental cost-effectiveness ratio compared to a threshold value. Model output is stratified by level of malaria transmission and the probability that a child would receive appropriate care from a health facility, to indicate the circumstances in which HMM is likely to be cost-effective. The model output suggests that the cost-effectiveness of HMM varies with malaria transmission, the probability of appropriate care, and the drug distributed. Where transmission is high and the probability of appropriate care is limited, HMM is likely to be cost-effective from a provider perspective. Even with the most effective antimalarials, HMM remains an attractive intervention only in areas of high malaria transmission and in medium transmission areas with a lower probability of appropriate care. HMM is generally not cost-effective in low transmission areas, regardless of which antimalarial is distributed. Considering the analysis from the societal perspective decreases the attractiveness of HMM. Syndromic HMM for children with fever may be a useful strategy for higher transmission settings with limited health care and diagnosis, but is not appropriate for all settings. HMM may need to be tailored to
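
    The structure of such a Markov cost-effectiveness comparison can be sketched as two cohort simulations that differ only in their transition probabilities and per-cycle costs, with the incremental cost-effectiveness ratio computed at the end. All numbers below are placeholders, not the Ugandan data, and the DALY accounting is deliberately crude.

```python
# Highly simplified sketch of a two-strategy Markov cohort comparison.
# States: well, ill, dead. Transition probabilities and costs are invented.
import numpy as np

def run_cohort(P, cost_per_ill_cycle, daly_weight, cycles=52, n=1000):
    dist = np.array([0.0, 1.0, 0.0]) * n          # cohort starts ill
    cost = dalys = 0.0
    for _ in range(cycles):
        cost += dist[1] * cost_per_ill_cycle
        # Crude weekly DALY proxy: disability while ill plus full weight if dead.
        dalys += (dist[1] * daly_weight + dist[2]) / 52.0
        dist = dist @ P
    return cost, dalys

P_conventional = np.array([[0.98, 0.01, 0.01],
                           [0.60, 0.38, 0.02],
                           [0.00, 0.00, 1.00]])
P_hmm = np.array([[0.98, 0.01, 0.01],
                  [0.75, 0.24, 0.01],
                  [0.00, 0.00, 1.00]])

c0, d0 = run_cohort(P_conventional, cost_per_ill_cycle=1.0, daly_weight=0.2)
c1, d1 = run_cohort(P_hmm, cost_per_ill_cycle=1.5, daly_weight=0.2)
icer = (c1 - c0) / (d0 - d1)    # extra cost per DALY averted
print(f"ICER: {icer:.2f} cost units per DALY averted")
```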

  1. An economic evaluation of home management of malaria in Uganda: an interactive Markov model.

    Directory of Open Access Journals (Sweden)

    Yoel Lubell

    Full Text Available BACKGROUND: Home management of malaria (HMM), promoting presumptive treatment of febrile children in the community, is advocated to improve prompt appropriate treatment of malaria in Africa. The cost-effectiveness of HMM is likely to vary widely in different settings and with the antimalarial drugs used. However, no data on the cost-effectiveness of HMM programmes are available. METHODS/PRINCIPAL FINDINGS: A Markov model was constructed to estimate the cost-effectiveness of HMM as compared to conventional care for febrile illnesses in children without HMM. The model was populated with data from Uganda, but is designed to be interactive, allowing the user to adjust certain parameters, including the antimalarials distributed. The model calculates the cost per disability adjusted life year averted and presents the incremental cost-effectiveness ratio compared to a threshold value. Model output is stratified by level of malaria transmission and the probability that a child would receive appropriate care from a health facility, to indicate the circumstances in which HMM is likely to be cost-effective. The model output suggests that the cost-effectiveness of HMM varies with malaria transmission, the probability of appropriate care, and the drug distributed. Where transmission is high and the probability of appropriate care is limited, HMM is likely to be cost-effective from a provider perspective. Even with the most effective antimalarials, HMM remains an attractive intervention only in areas of high malaria transmission and in medium transmission areas with a lower probability of appropriate care. HMM is generally not cost-effective in low transmission areas, regardless of which antimalarial is distributed. Considering the analysis from the societal perspective decreases the attractiveness of HMM. CONCLUSION: Syndromic HMM for children with fever may be a useful strategy for higher transmission settings with limited health care and diagnosis, but is
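
    The arithmetic at the core of such an interactive Markov model can be illustrated in a few lines of Python. The sketch below runs a three-state cohort model (well, febrile, dead) for two strategies and reports the incremental cost-effectiveness ratio per DALY averted; every transition probability, cost and disability weight is an invented placeholder, not a parameter of the Uganda model described above.

```python
import numpy as np

# Hypothetical three-state weekly cohort model: 0 = well, 1 = febrile, 2 = dead.
# All matrices, costs and disability weights are illustrative placeholders.
def run_cohort(P, cost_per_state, disability_weight, cycles=52):
    """Run a Markov cohort model; return (total cost, DALYs accrued) per child."""
    state = np.array([1.0, 0.0, 0.0])             # whole cohort starts well
    total_cost = total_daly = 0.0
    for _ in range(cycles):
        state = state @ P                         # cohort distribution after one cycle
        total_cost += float(state @ cost_per_state)
        total_daly += float(state @ disability_weight) / 52.0  # weekly cycles -> years
    return total_cost, total_daly

P_conventional = np.array([[0.9715, 0.028, 0.0005],
                           [0.600,  0.390, 0.010],
                           [0.000,  0.000, 1.000]])
P_hmm = np.array([[0.9715, 0.028, 0.0005],        # home management: faster recovery,
                  [0.750,  0.246, 0.004],         # fewer deaths (placeholder values)
                  [0.000,  0.000, 1.000]])

costs = np.array([0.0, 1.5, 0.0])                 # USD spent per cycle in each state
dw = np.array([0.0, 0.2, 1.0])                    # disability weights (death counted as 1)

cost_conv, daly_conv = run_cohort(P_conventional, costs, dw)
cost_hmm, daly_hmm = run_cohort(P_hmm, costs + np.array([0.1, 0.5, 0.0]), dw)

dalys_averted = daly_conv - daly_hmm
icer = (cost_hmm - cost_conv) / dalys_averted
print(f"DALYs averted per child: {dalys_averted:.4f}; ICER: {icer:.2f} USD per DALY averted")
```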

  2. A HMM-Based Method for Vocal Fold Pathology Diagnosis

    Directory of Open Access Journals (Sweden)

    Vahid Majidnezhad

    2012-11-01

    Full Text Available Acoustic analysis is a suitable method for vocal fold pathology diagnosis: it can complement, and in some cases replace, other invasive methods based on direct observation of the vocal folds. There are different approaches to vocal fold pathology diagnosis. This paper presents a method based on hidden Markov models which classifies speech into two classes: normal and pathological. Two hidden Markov models are trained on these two classes of speech, and the trained models are then used to classify the dataset. The proposed method is able to classify the speech samples with an accuracy of 93.75%. The results of this algorithm provide insights that can help biologists and computer scientists design high-performance systems for the detection of vocal fold pathology.
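
    The two-model likelihood comparison described above can be sketched with the hmmlearn package; the synthetic feature sequences, model sizes and all parameters below are placeholders standing in for the authors' acoustic front end, so this is only a structural illustration.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(0)

def synth_sequences(n_seqs, shift, n_frames=120, n_feats=6):
    """Toy 'acoustic' feature sequences; `shift` crudely mimics a pathology offset."""
    return [rng.normal(loc=shift, scale=1.0, size=(n_frames, n_feats)) for _ in range(n_seqs)]

def fit_class_hmm(sequences, n_states=3):
    """Fit one HMM per class on the concatenated training sequences."""
    X = np.vstack(sequences)
    lengths = [len(s) for s in sequences]
    model = GaussianHMM(n_components=n_states, covariance_type="diag",
                        n_iter=50, random_state=0)
    model.fit(X, lengths)
    return model

normal_train, patho_train = synth_sequences(20, 0.0), synth_sequences(20, 0.8)
hmm_normal = fit_class_hmm(normal_train)
hmm_patho = fit_class_hmm(patho_train)

# Classify a held-out sample by comparing per-class log-likelihoods.
test = synth_sequences(1, 0.8)[0]
label = "pathological" if hmm_patho.score(test) > hmm_normal.score(test) else "normal"
print("predicted:", label)
```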

  3. Model Selection Framework for Graph-based data

    CERN Document Server

    Caceres, Rajmonda S; Schmidt, Matthew C; Miller, Benjamin A; Campbell, William M

    2016-01-01

    Graphs are powerful abstractions for capturing complex relationships in diverse application settings. An active area of research focuses on theoretical models that define the generative mechanism of a graph. Yet given the complexity and inherent noise in real datasets, it is still very challenging to identify the best model for a given observed graph. We discuss a framework for graph model selection that leverages a long list of graph topological properties and a random forest classifier to learn and classify different graph instances. We fully characterize the discriminative power of our approach as we sweep through the parameter space of two generative models, the Erdos-Renyi and the stochastic block model. We show that our approach gets very close to known theoretical bounds and we provide insight on which topological features play a critical discriminating role.
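
    The general recipe (topological features plus a random forest) can be sketched with networkx and scikit-learn as below; the feature set, graph sizes and generator parameters are arbitrary illustrative choices, not those used by the authors.

```python
import numpy as np
import networkx as nx
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

def topo_features(G):
    """A small, arbitrary set of topological descriptors of a graph."""
    degrees = [d for _, d in G.degree()]
    return [nx.density(G), nx.transitivity(G), nx.average_clustering(G),
            float(np.mean(degrees)), float(np.std(degrees))]

def sample_graphs(n_graphs=100, n=100, seed=0):
    rng = np.random.default_rng(seed)
    X, y = [], []
    for _ in range(n_graphs):
        p = rng.uniform(0.04, 0.10)
        er = nx.erdos_renyi_graph(n, p, seed=int(rng.integers(1_000_000)))
        X.append(topo_features(er)); y.append(0)          # class 0: Erdos-Renyi
        sbm = nx.stochastic_block_model([n // 2, n // 2],
                                        [[2 * p, 0.2 * p], [0.2 * p, 2 * p]],
                                        seed=int(rng.integers(1_000_000)))
        X.append(topo_features(sbm)); y.append(1)         # class 1: stochastic block model
    return np.array(X), np.array(y)

X, y = sample_graphs()
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```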

  4. Hidden Markov Model-based Packet Loss Concealment for Voice over IP

    DEFF Research Database (Denmark)

    Rødbro, Christoffer A.; Murthi, Manohar N.; Andersen, Søren Vang

    2006-01-01

    As voice over IP proliferates, packet loss concealment (PLC) at the receiver has emerged as an important factor in determining voice quality of service. Through the use of heuristic variations of signal and parameter repetition and overlap-add interpolation to handle packet loss, conventional PLC systems largely ignore the dynamics of the statistical evolution of the speech signal, possibly leading to perceptually annoying artifacts. To address this problem, we propose the use of hidden Markov models for PLC. With a hidden Markov model (HMM) tracking the evolution of speech signal parameters, we demonstrate how PLC is performed within a statistical signal processing framework. Moreover, we show how the HMM is used to index a specially designed PLC module for the particular signal context, leading to signal-contingent PLC. Simulation examples, objective tests, and subjective listening tests...

  5. A framework for similarity recognition of CAD models

    Directory of Open Access Journals (Sweden)

    Leila Zehtaban

    2016-07-01

    Full Text Available A designer is mainly supported in design decisions by two essential factors: intelligence and experience, which aid the designer by predicting the interconnections between the required design parameters. Through classification of product data and similarity recognition between new and existing designs, it is partially possible to replace the experience required by an inexperienced designer. Given this context, the current paper addresses a framework for recognition and flexible retrieval of similar models in product design. The idea is to establish an infrastructure for transferring design as well as the required PLM (Product Lifecycle Management) know-how to the design phase of product development in order to reduce the design time. Furthermore, such a method can also be applied as a brainstorming aid for new and creative product development. The proposed framework has been tested and benchmarked, showing promising results.

  6. The ontology model of FrontCRM framework

    Science.gov (United States)

    Budiardjo, Eko K.; Perdana, Wira; Franshisca, Felicia

    2013-03-01

    Adoption and implementation of Customer Relationship Management (CRM) is not merely a technological installation; the emphasis is rather on the application of a customer-centric philosophy and culture as a whole. CRM must begin at the level of business strategy, the only level at which thorough organizational changes can be made. The change agenda can then be directed to departmental plans and supported by information technology. Work processes related to the CRM concept include marketing, sales, and services. FrontCRM is developed as a framework to guide the identification of CRM-related business processes, based on a strategic planning approach. This leads to the identification of processes and practices in every process area related to marketing, sales, and services. The ontology model presented in this paper serves as a tool to avoid misunderstanding of the framework, to define practices systematically within each process area, and to find CRM software features related to those practices.

  7. Modeling phenotypic plasticity in growth trajectories: a statistical framework.

    Science.gov (United States)

    Wang, Zhong; Pang, Xiaoming; Wu, Weimiao; Wang, Jianxin; Wang, Zuoheng; Wu, Rongling

    2014-01-01

    Phenotypic plasticity, that is, multiple phenotypes produced by a single genotype in response to environmental change, has been thought to play an important role in evolution and speciation. Historically, knowledge about phenotypic plasticity has resulted from the analysis of static traits measured at a single time point. New insight into the adaptive nature of plasticity can be gained by an understanding of how organisms alter their developmental processes in a range of environments. Recent advances in statistical modeling of functional data and developmental genetics allow us to construct a dynamic framework of plastic response in developmental form and pattern. Under this framework, development, genetics, and evolution can be synthesized through statistical bridges to better address how evolution results from phenotypic variation in the process of development via genetic alterations.

  8. Velo: A Knowledge Management Framework for Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Gorton, Ian; Sivaramakrishnan, Chandrika; Black, Gary D.; White, Signe K.; Purohit, Sumit; Lansing, Carina S.; Madison, Michael C.; Schuchardt, Karen L.; Liu, Yan

    2012-03-01

    Modern scientific enterprises are inherently knowledge-intensive. Scientific studies in domains such as geosciences, climate, and biology require the acquisition and manipulation of large amounts of experimental and field data to create inputs for large-scale computational simulations. The results of these simulations are then analyzed, leading to refinements of inputs and models and additional simulations. The results of this process must be managed and archived to provide justifications for regulatory decisions and publications that are based on the models. In this paper we introduce our Velo framework that is designed as a reusable, domain independent knowledge management infrastructure for modeling and simulation. Velo leverages, integrates and extends open source collaborative and content management technologies to create a scalable and flexible core platform that can be tailored to specific scientific domains. We describe the architecture of Velo for managing and associating the various types of data that are used and created in modeling and simulation projects, as well as the framework for integrating domain-specific tools. To demonstrate realizations of Velo, we describe examples from two deployed sites for carbon sequestration and climate modeling. These provide concrete examples of the inherent extensibility and utility of our approach.

  9. Optimization Framework for Stochastic Modeling of Annual Streamflows

    Science.gov (United States)

    Srivastav, R. K.; Srinivasan, K.; Sudheer, K.

    2008-12-01

    Synthetic streamflow data generation involves the synthesis of likely streamflow patterns that are statistically indistinguishable from the observed streamflow data. The various kinds of stochastic models adopted for streamflow generation in hydrology are: i) parametric models, which hypothesize the form of the dependence structure and the distributional form a priori (examples are AR, ARMA); ii) nonparametric models (examples are bootstrap/kernel based methods), which characterize the laws of chance describing the streamflow process without recourse to prior assumptions as to the form or structure of these laws; iii) hybrid models, which blend both parametric and non-parametric models advantageously to model the streamflows effectively. Despite the many developments that have taken place in the field of stochastic modeling of streamflows over the last four decades, accurate prediction of the storage and the critical drought (water use) characteristics has posed a persistent challenge to the stochastic modeler. This may be because, usually, the stochastic streamflow model parameters are estimated by minimizing a statistically based objective function (such as maximum likelihood (MLE) or least squares estimation) and subsequently the efficacy of the models is validated based on the accuracy of prediction of the estimates of the water-use characteristics. In this study a framework is proposed to find the optimal hybrid model (blend of ARMA(1,1) and moving block bootstrap (MBB)) based on the explicit objective function of minimizing the relative bias in estimating the storage capacity of the reservoir. The optimal parameter set of the hybrid model is obtained based on the search over a multi-dimensional parameter space involving simultaneous exploration of the parametric (ARMA[1,1]) as well as the non-parametric (MBB) components. This is achieved using the efficient evolutionary search based optimization tool, namely non-dominated sorting genetic

  10. Important factors in HMM-based phonetic segmentation

    CSIR Research Space (South Africa)

    Van Niekerk, DR

    2007-11-01

    Full Text Available of phones, which represent the acoustic realisations of the smallest meaningful units of speech, namely phonemes. Data in this form can be used to construct language based systems (including speech recognition and synthesis systems) through the training of statistical models or the definition of acoustic databases, as well as aid language research in general. The accuracy and consistency of phonetic labels are crucial to the eventual quality of systems dependent on speech data. Labels...

  11. A Structural Model Decomposition Framework for Hybrid Systems Diagnosis

    Science.gov (United States)

    Daigle, Matthew; Bregon, Anibal; Roychoudhury, Indranil

    2015-01-01

    Nowadays, a large number of practical systems in aerospace and industrial environments are best represented as hybrid systems that consist of discrete modes of behavior, each defined by a set of continuous dynamics. These hybrid dynamics make the on-line fault diagnosis task very challenging. In this work, we present a new modeling and diagnosis framework for hybrid systems. Models are composed from sets of user-defined components using a compositional modeling approach. Submodels for residual generation are then generated for a given mode, and reconfigured efficiently when the mode changes. Efficient reconfiguration is established by exploiting causality information within the hybrid system models. The submodels can then be used for fault diagnosis based on residual generation and analysis. We demonstrate the efficient causality reassignment, submodel reconfiguration, and residual generation for fault diagnosis using an electrical circuit case study.

  12. Generalized framework for context-specific metabolic model extraction methods

    Directory of Open Access Journals (Sweden)

    Semidán Robaina Estévez

    2014-09-01

    Full Text Available Genome-scale metabolic models are increasingly applied to investigate the physiology not only of simple prokaryotes, but also of eukaryotes such as plants, characterized by compartmentalized cells of multiple types. While genome-scale models aim at including the entirety of known metabolic reactions, mounting evidence has indicated that only a subset of these reactions is active in a given context, including: developmental stage, cell type, or environment. As a result, several methods have been proposed to reconstruct context-specific models from existing genome-scale models by integrating various types of high-throughput data. Here we present a mathematical framework that puts all existing methods under one umbrella, provides the means to better understand their functioning, highlights similarities and differences, and helps users select the most suitable method for an application.

  13. A computational framework for modeling targets as complex adaptive systems

    Science.gov (United States)

    Santos, Eugene; Santos, Eunice E.; Korah, John; Murugappan, Vairavan; Subramanian, Suresh

    2017-05-01

    Modeling large military targets is a challenge as they can be complex systems encompassing myriad combinations of human, technological, and social elements that interact, leading to complex behaviors. Moreover, such targets have multiple components and structures, extending across multiple spatial and temporal scales, and are in a state of change, either in response to events in the environment or changes within the system. Complex adaptive system (CAS) theory can help in capturing the dynamism, interactions, and more importantly various emergent behaviors, displayed by the targets. However, a key stumbling block is incorporating information from various intelligence, surveillance and reconnaissance (ISR) sources, while dealing with the inherent uncertainty, incompleteness and time criticality of real world information. To overcome these challenges, we present a probabilistic reasoning network based framework called complex adaptive Bayesian Knowledge Base (caBKB). caBKB is a rigorous, overarching and axiomatic framework that models two key processes, namely information aggregation and information composition. While information aggregation deals with the union, merger and concatenation of information and takes into account issues such as source reliability and information inconsistencies, information composition focuses on combining information components where such components may have well defined operations. Since caBKBs can explicitly model the relationships between information pieces at various scales, it provides unique capabilities such as the ability to de-aggregate and de-compose information for detailed analysis. Using a scenario from the Network Centric Operations (NCO) domain, we will describe how our framework can be used for modeling targets with a focus on methodologies for quantifying NCO performance metrics.

  14. Probabilistic assessment of agricultural droughts using graphical models

    Science.gov (United States)

    Ramadas, Meenu; Govindaraju, Rao S.

    2015-07-01

    Agricultural droughts are often characterized by soil moisture in the root zone of the soil, but crop needs are rarely factored into the analysis. Since water needs vary with crops, agricultural drought incidences in a region can be characterized better if crop responses to soil water deficits are also accounted for in the drought index. This study investigates agricultural droughts driven by plant stress due to soil moisture deficits using crop stress functions available in the literature. Crop water stress is assumed to begin at the soil moisture level corresponding to incipient stomatal closure, and reaches its maximum at the crop's wilting point. Using available location-specific crop acreage data, a weighted crop water stress function is computed. A new probabilistic agricultural drought index is then developed within a hidden Markov model (HMM) framework that provides model uncertainty in drought classification and accounts for time dependence between drought states. The proposed index allows probabilistic classification of the drought states and takes due cognizance of the stress experienced by the crop due to soil moisture deficit. The capabilities of HMM model formulations for assessing agricultural droughts are compared to those of current drought indices such as standardized precipitation evapotranspiration index (SPEI) and self-calibrating Palmer drought severity index (SC-PDSI). The HMM model identified critical drought events and several drought occurrences that are not detected by either SPEI or SC-PDSI, and shows promise as a tool for agricultural drought studies.
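
    A minimal sketch of the two ingredients described above, an acreage-weighted crop water stress function and an HMM fitted to the resulting stress series, is given below using synthetic soil moisture and the hmmlearn package; the crop parameters (incipient stomatal closure and wilting points), acreage weights and the two-state model are illustrative assumptions, not the values used in the study.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(1)

# Synthetic weekly root-zone soil moisture (volumetric fraction) for one location.
theta = np.clip(0.28 + 0.08 * np.sin(np.linspace(0, 6 * np.pi, 300))
                + rng.normal(0, 0.03, 300), 0.05, 0.45)

# Hypothetical crop parameters: stress starts at incipient stomatal closure (theta_star)
# and reaches 1.0 at the wilting point (theta_wilt); acreage gives the crop weights.
crops = {"maize": (0.24, 0.12, 0.6), "beans": (0.22, 0.10, 0.4)}  # (theta_star, theta_wilt, weight)

def crop_stress(theta, theta_star, theta_wilt):
    return np.clip((theta_star - theta) / (theta_star - theta_wilt), 0.0, 1.0)

weighted_stress = sum(w * crop_stress(theta, ts, tw) for ts, tw, w in crops.values())

# Two-state HMM (non-drought / drought) over the weighted stress series;
# posterior state probabilities give a probabilistic drought classification.
hmm = GaussianHMM(n_components=2, covariance_type="diag", n_iter=100, random_state=0)
X = weighted_stress.reshape(-1, 1)
hmm.fit(X)
posterior = hmm.predict_proba(X)
drought_state = int(np.argmax(hmm.means_.ravel()))   # state with the higher mean stress
print("P(drought) in final week:", round(posterior[-1, drought_state], 3))
```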

  15. Vulnerability Assessment Models to Drought: Toward a Conceptual Framework

    Directory of Open Access Journals (Sweden)

    Kiumars Zarafshani

    2016-06-01

    Full Text Available Drought is regarded as a slow-onset natural disaster that causes inevitable damage to water resources and to farm life. Currently, crisis management is the basis of drought mitigation plans; however, studies to date indicate that effective drought management strategies are based on risk management. As a primary tool in mitigating the impact of drought, vulnerability assessment can be used as a benchmark in drought mitigation plans and to enhance farmers’ ability to cope with drought. Moreover, the literature pertaining to drought has focused extensively on its impact, awarding only limited attention to vulnerability assessment as a tool. Therefore, the main purpose of this paper is to develop a conceptual framework for designing a vulnerability model in order to assess farmers’ level of vulnerability before, during and after the onset of drought. Use of this drought vulnerability model would aid disaster relief workers by enhancing the adaptive capacity of farmers when facing the impacts of drought. The paper starts with the definition of vulnerability and outlines the different vulnerability frameworks developed thus far. It then identifies various approaches to vulnerability assessment and finally offers the most appropriate model. The paper concludes that the introduced model can guide drought mitigation programs in countries that are impacted the most by drought.

  16. Computer-aided modeling framework – a generic modeling template for catalytic membrane fixed bed reactors

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    2013-01-01

    This work focuses on the development of a computer-aided modeling framework. The framework is a knowledge-based system that is built on a generic modeling language and structured based on workflows for different general modeling tasks. The overall objective of this work is to support model developers and users in generating and testing models systematically, efficiently and reliably. In this way, development of products and processes can be faster, cheaper and very efficient. In this contribution, as part of the framework a generic modeling template for the systematic derivation of problem specific catalytic membrane fixed bed models is developed. The application of the modeling template is highlighted with a case study related to the modeling of a catalytic membrane reactor coupling dehydrogenation of ethylbenzene with hydrogenation of nitrobenzene.

  17. Web Information Extraction Based on a hybrid of HMM/WNN

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    A hybrid model is presented for information extraction that combines Hidden Markov Models (HMM) and a Wavelet Neural Network (WNN). It first characterizes the nodes of a web page and establishes different HMMs according to the page content; the WNN then selects the appropriate HMM for information extraction. Where the HMM cannot extract important information accurately, the WNN is used as an auxiliary discriminator. Experiments show that this hybrid model can improve the accuracy of Web information extraction.

  18. Tarmo: A Framework for Parallelized Bounded Model Checking

    CERN Document Server

    Wieringa, Siert; Heljanko, Keijo; 10.4204/EPTCS.14.5

    2009-01-01

    This paper investigates approaches to parallelizing Bounded Model Checking (BMC) for shared memory environments as well as for clusters of workstations. We present a generic framework for parallelized BMC named Tarmo. Our framework can be used with any incremental SAT encoding for BMC but for the results in this paper we use only the current state-of-the-art encoding for full PLTL. Using this encoding allows us to check both safety and liveness properties, contrary to an earlier work on distributing BMC that is limited to safety properties only. Despite our focus on BMC after it has been translated to SAT, existing distributed SAT solvers are not well suited for our application. This is because solving a BMC problem is not solving a set of independent SAT instances but rather involves solving multiple related SAT instances, encoded incrementally, where the satisfiability of each instance corresponds to the existence of a counterexample of a specific length. Our framework includes a generic architecture for a ...

  19. Kernel PCA for HMM-Based Cursive Handwriting Recognition

    Science.gov (United States)

    Fischer, Andreas; Bunke, Horst

    In this paper, we propose Kernel Principal Component Analysis as a feature selection method for offline cursive handwriting recognition based on Hidden Markov Models. In contrast to formerly used feature selection methods, namely standard Principal Component Analysis and Independent Component Analysis, nonlinearity is achieved by making use of a radial basis function kernel. In an experimental study we demonstrate that the proposed nonlinear method has a great potential to improve cursive handwriting recognition systems and is able to significantly outperform linear feature selection methods. We consider two diverse datasets of isolated handwritten words for the experimental evaluation, the first consisting of modern English words, and the second consisting of medieval Middle High German words.
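
    The nonlinear feature selection step can be sketched with scikit-learn, contrasting linear PCA with an RBF-kernel Kernel PCA on synthetic, nonlinearly separable data; the toy data and the logistic-regression stand-in for the downstream HMM recognizer are assumptions for illustration only.

```python
import numpy as np
from sklearn.datasets import make_circles
from sklearn.decomposition import PCA, KernelPCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic nonlinearly separable "features" standing in for handwriting observations.
X, y = make_circles(n_samples=400, factor=0.3, noise=0.08, random_state=0)

def score_with(transformer):
    """Project the features and report mean cross-validated accuracy of a simple classifier."""
    Z = transformer.fit_transform(X)
    return cross_val_score(LogisticRegression(max_iter=1000), Z, y, cv=5).mean()

print("linear PCA     :", round(score_with(PCA(n_components=2)), 3))
print("RBF Kernel PCA :", round(score_with(KernelPCA(n_components=2, kernel="rbf", gamma=2.0)), 3))
```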

  20. HHsenser: exhaustive transitive profile search using HMM–HMM comparison

    Science.gov (United States)

    Söding, Johannes; Remmert, Michael; Biegert, Andreas; Lupas, Andrei N.

    2006-01-01

    HHsenser is the first server to offer exhaustive intermediate profile searches, which it combines with pairwise comparison of hidden Markov models. Starting from a single protein sequence or a multiple alignment, it can iteratively explore whole superfamilies, producing few or no false positives. The output is a multiple alignment of all detected homologs. HHsenser's sensitivity should make it a useful tool for evolutionary studies. It may also aid applications that rely on diverse multiple sequence alignments as input, such as homology-based structure and function prediction, or the determination of functional residues by conservation scoring and functional subtyping. HHsenser can be accessed at . It has also been integrated into our structure and function prediction server HHpred () to improve predictions for near-singleton sequences. PMID:16845029

  1. Tarmo: A Framework for Parallelized Bounded Model Checking

    Directory of Open Access Journals (Sweden)

    Siert Wieringa

    2009-12-01

    Full Text Available This paper investigates approaches to parallelizing Bounded Model Checking (BMC) for shared memory environments as well as for clusters of workstations. We present a generic framework for parallelized BMC named Tarmo. Our framework can be used with any incremental SAT encoding for BMC but for the results in this paper we use only the current state-of-the-art encoding for full PLTL. Using this encoding allows us to check both safety and liveness properties, contrary to an earlier work on distributing BMC that is limited to safety properties only. Despite our focus on BMC after it has been translated to SAT, existing distributed SAT solvers are not well suited for our application. This is because solving a BMC problem is not solving a set of independent SAT instances but rather involves solving multiple related SAT instances, encoded incrementally, where the satisfiability of each instance corresponds to the existence of a counterexample of a specific length. Our framework includes a generic architecture for a shared clause database that allows easy clause sharing between SAT solver threads solving various such instances. We present extensive experimental results obtained with multiple variants of our Tarmo implementation. Our shared memory variants have a significantly better performance than conventional single threaded approaches, which is a result that many users can benefit from as multi-core and multi-processor technology is widely available. Furthermore we demonstrate that our framework can be deployed in a typical cluster of workstations, where several multi-core machines are connected by a network.

  2. Using hidden markov models to improve quantifying physical activity in accelerometer data - a simulation study.

    Directory of Open Access Journals (Sweden)

    Vitali Witowski

    Full Text Available INTRODUCTION: The use of accelerometers to objectively measure physical activity (PA) has become the most preferred method of choice in recent years. Traditionally, cutpoints are used to assign impulse counts recorded by the devices to sedentary and activity ranges. Here, hidden Markov models (HMM) are used to improve the cutpoint method to achieve a more accurate identification of the sequence of modes of PA. METHODS: 1,000 days of labeled accelerometer data have been simulated. For the simulated data the actual sedentary behavior and activity range of each count is known. The cutpoint method is compared with HMMs based on the Poisson distribution (HMM[Pois]), the generalized Poisson distribution (HMM[GenPois]) and the Gaussian distribution (HMM[Gauss]) with regard to misclassification rate (MCR), bout detection, detection of the number of activities performed during the day and runtime. RESULTS: The cutpoint method had a misclassification rate (MCR) of 11% followed by HMM[Pois] with 8%, HMM[GenPois] with 3% and HMM[Gauss] having the best MCR with less than 2%. HMM[Gauss] detected the correct number of bouts in 12.8% of the days, HMM[GenPois] in 16.1%, HMM[Pois] and the cutpoint method in none. HMM[GenPois] identified the correct number of activities in 61.3% of the days, whereas HMM[Gauss] only in 26.8%. HMM[Pois] did not identify the correct number at all and seemed to overestimate the number of activities. Runtime varied between 0.01 seconds (cutpoint), 2.0 minutes (HMM[Gauss]) and 14.2 minutes (HMM[GenPois]). CONCLUSIONS: Using simulated data, HMM-based methods were superior in activity classification when compared to the traditional cutpoint method and seem to be appropriate to model accelerometer data. Of the HMM-based methods, HMM[Gauss] seemed to be the most appropriate choice to assess real-life accelerometer data.
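
    A toy version of this comparison can be reproduced with hmmlearn, as sketched below: counts are simulated from a two-state Markov regime and then classified both with a fixed cutpoint and with a Gaussian HMM decoded by Viterbi. The regime parameters, cutpoint and sequence length are arbitrary, and the Poisson and generalized-Poisson variants are not shown.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(2)

# Simulate a hidden sedentary/active regime and noisy impulse counts per epoch.
T, p_stay = 5000, 0.97
states = np.zeros(T, dtype=int)
for t in range(1, T):
    states[t] = states[t - 1] if rng.random() < p_stay else 1 - states[t - 1]
means, sds = np.array([80.0, 900.0]), np.array([60.0, 250.0])   # counts per epoch (toy)
counts = np.maximum(rng.normal(means[states], sds[states]), 0.0)

# 1) Traditional cutpoint classification.
cutpoint = 400.0
cut_pred = (counts > cutpoint).astype(int)

# 2) Two-state Gaussian HMM decoded with Viterbi.
hmm = GaussianHMM(n_components=2, covariance_type="diag", n_iter=100, random_state=0)
X = counts.reshape(-1, 1)
hmm.fit(X)
hmm_pred = hmm.predict(X)
if hmm.means_.ravel()[0] > hmm.means_.ravel()[1]:   # align state labels with the truth
    hmm_pred = 1 - hmm_pred

print("cutpoint MCR:", round(np.mean(cut_pred != states), 3))
print("HMM MCR     :", round(np.mean(hmm_pred != states), 3))
```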

  3. Strategic assessment of capacity consumption in railway networks: Framework and model

    DEFF Research Database (Denmark)

    Jensen, Lars Wittrup; Landex, Alex; Nielsen, Otto Anker

    2017-01-01

    In this paper, we develop a new framework for strategic planning purposes to calculate railway infrastructure occupation and capacity consumption in networks, independent of a timetable. Furthermore, a model implementing the framework is presented. In this model different train sequences...

  4. Model-Driven Policy Framework for Data Centers

    DEFF Research Database (Denmark)

    Caba, Cosmin Marius; Kentis, Angelos Mimidis; Soler, José

    2016-01-01

    Data Centers (DCs) continue to become increasingly complex, due to comprising multiple functional entities (e.g. routing, orchestration). Managing the multitude of interconnected components in the DC becomes difficult and error prone, leading to slow service provisioning, lack of QoS support, etc. Moreover, the lack of simple solutions for managing the configuration and behavior of the DC components makes the DC hard to configure and slow in adapting to changes in business needs. In this paper, we propose a model-driven framework for policy-based management for DCs, to simplify not only the service...

  5. Generic modelling framework for economic analysis of battery systems

    DEFF Research Database (Denmark)

    You, Shi; Rasmussen, Claus Nygaard

    2011-01-01

    Deregulated electricity markets provide opportunities for Battery Systems (BS) to participate in energy arbitrage and ancillary services (regulation, operating reserves, contingency reserves, voltage regulation, power quality etc.). To evaluate the economic viability of BS with different business...... for battery cycle life estimation, since the cycle life plays a central role in the economic analysis of BS. To illustrate the modelling framework, a case study using a Sodium Sulfur Battery (NAS) system with 5-minute regulating service is performed. The economic performances of two dispatch scenarios, a so...

  6. HMM-based lexicon-driven and lexicon-free word recognition for online handwritten Indic scripts.

    Science.gov (United States)

    Bharath, A; Madhvanath, Sriganesh

    2012-04-01

    Research for recognizing online handwritten words in Indic scripts is at its early stages when compared to Latin and Oriental scripts. In this paper, we address this problem specifically for two major Indic scripts--Devanagari and Tamil. In contrast to previous approaches, the techniques we propose are largely data driven and script independent. We propose two different techniques for word recognition based on Hidden Markov Models (HMM): lexicon driven and lexicon free. The lexicon-driven technique models each word in the lexicon as a sequence of symbol HMMs according to a standard symbol writing order derived from the phonetic representation. The lexicon-free technique uses a novel Bag-of-Symbols representation of the handwritten word that is independent of symbol order and allows rapid pruning of the lexicon. On handwritten Devanagari word samples featuring both standard and nonstandard symbol writing orders, a combination of lexicon-driven and lexicon-free recognizers significantly outperforms either of them used in isolation. In contrast, most Tamil word samples feature the standard symbol order, and the lexicon-driven recognizer outperforms the lexicon free one as well as their combination. The best recognition accuracies obtained for 20,000 word lexicons are 87.13 percent for Devanagari when the two recognizers are combined, and 91.8 percent for Tamil using the lexicon-driven technique.

  7. An Integrated Framework Advancing Membrane Protein Modeling and Design.

    Directory of Open Access Journals (Sweden)

    Rebecca F Alford

    2015-09-01

    Full Text Available Membrane proteins are critical functional molecules in the human body, constituting more than 30% of open reading frames in the human genome. Unfortunately, a myriad of difficulties in overexpression and reconstitution into membrane mimetics severely limit our ability to determine their structures. Computational tools are therefore instrumental to membrane protein structure prediction, consequently increasing our understanding of membrane protein function and their role in disease. Here, we describe a general framework facilitating membrane protein modeling and design that combines the scientific principles for membrane protein modeling with the flexible software architecture of Rosetta3. This new framework, called RosettaMP, provides a general membrane representation that interfaces with scoring, conformational sampling, and mutation routines that can be easily combined to create new protocols. To demonstrate the capabilities of this implementation, we developed four proof-of-concept applications for (1) prediction of free energy changes upon mutation; (2) high-resolution structural refinement; (3) protein-protein docking; and (4) assembly of symmetric protein complexes, all in the membrane environment. Preliminary data show that these algorithms can produce meaningful scores and structures. The data also suggest needed improvements to both sampling routines and score functions. Importantly, the applications collectively demonstrate the potential of combining the flexible nature of RosettaMP with the power of Rosetta algorithms to facilitate membrane protein modeling and design.

  8. A hybrid parallel framework for the cellular Potts model simulations

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Yi [Los Alamos National Laboratory]; He, Kejing [SOUTH CHINA UNIV]; Dong, Shoubin [SOUTH CHINA UNIV]

    2009-01-01

    The Cellular Potts Model (CPM) has been widely used for biological simulations. However, most current implementations are either sequential or approximated, which can't be used for large scale complex 3D simulation. In this paper we present a hybrid parallel framework for CPM simulations. The time-consuming PDE solving, cell division, and cell reaction operations are distributed to clusters using the Message Passing Interface (MPI). The Monte Carlo lattice update is parallelized on shared-memory SMP systems using OpenMP. Because the Monte Carlo lattice update is much faster than the PDE solving and SMP systems are more and more common, this hybrid approach achieves good performance and high accuracy at the same time. Based on the parallel Cellular Potts Model, we studied the avascular tumor growth using a multiscale model. The application and performance analysis show that the hybrid parallel framework is quite efficient. The hybrid parallel CPM can be used for the large scale simulation (~10^8 sites) of complex collective behavior of numerous cells (~10^6).

  9. CIMS: A FRAMEWORK FOR INFRASTRUCTURE INTERDEPENDENCY MODELING AND ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Donald D. Dudenhoeffer; May R. Permann; Milos Manic

    2006-12-01

    Today’s society relies greatly upon an array of complex national and international infrastructure networks such as transportation, utilities, telecommunication, and even financial networks. While modeling and simulation tools have provided insight into the behavior of individual infrastructure networks, a far less understood area is that of the interrelationships among multiple infrastructure networks including the potential cascading effects that may result due to these interdependencies. This paper first describes infrastructure interdependencies as well as presenting a formalization of interdependency types. Next the paper describes a modeling and simulation framework called CIMS© and the work that is being conducted at the Idaho National Laboratory (INL) to model and simulate infrastructure interdependencies and the complex behaviors that can result.

  10. A framework for modeling emerging diseases to inform management

    Science.gov (United States)

    Russell, Robin E.; Katz, Rachel A.; Richgels, Katherine L.D.; Walsh, Daniel P.; Grant, Evan

    2017-01-01

    The rapid emergence and reemergence of zoonotic diseases requires the ability to rapidly evaluate and implement optimal management decisions. Actions to control or mitigate the effects of emerging pathogens are commonly delayed because of uncertainty in the estimates and the predicted outcomes of the control tactics. The development of models that describe the best-known information regarding the disease system at the early stages of disease emergence is an essential step for optimal decision-making. Models can predict the potential effects of the pathogen, provide guidance for assessing the likelihood of success of different proposed management actions, quantify the uncertainty surrounding the choice of the optimal decision, and highlight critical areas for immediate research. We demonstrate how to develop models that can be used as a part of a decision-making framework to determine the likelihood of success of different management actions given current knowledge.

  11. A Building Model Framework for a Genetic Algorithm Multi-objective Model Predictive Control

    DEFF Research Database (Denmark)

    Arendt, Krzysztof; Ionesi, Ana; Jradi, Muhyiddine

    2016-01-01

    implemented only in few buildings. The following difficulties hinder the widespread usage of MPC: (1) significant model development time, (2) limited portability of models, (3) model computational demand. In the present study a new model development framework for an MPC system based on a Genetic Algorithm (GA...

  12. Validated Real Time Middle Ware For Distributed Cyber Physical Systems Using HMM

    Directory of Open Access Journals (Sweden)

    Ankit Mundra

    2013-04-01

    Full Text Available Distributed Cyber Physical Systems designed for different scenarios must be capable enough to perform in an efficient manner in every situation. Earlier approaches, such as CORBA, have performed but with different time constraints. Therefore, there was a need to design reconfigurable, robust, validated and consistent real-time middleware systems with end-to-end timing. In the DCPS-HMM we have proposed processor efficiency and data validation, which may prove crucial in implementing various distributed systems such as credit card systems or file transfer through a network.

  13. Integrating HMM-Based Speech Recognition With Direct Manipulation In A Multimodal Korean Natural Language Interface

    CERN Document Server

    Lee, G; Kim, S; Lee, Geunbae; Lee, Jong-Hyeok; Kim, Sangeok

    1996-01-01

    This paper presents an HMM-based speech recognition engine and its integration into direct manipulation interfaces for a Korean document editor. Speech recognition can reduce the typical tedious and repetitive actions which are inevitable in standard GUIs (graphic user interfaces). Our system consists of a general speech recognition engine called ABrain (Auditory Brain) and a speech-commandable document editor called SHE (Simple Hearing Editor). ABrain is a phoneme-based speech recognition engine which achieves up to a 97% discrete command recognition rate. SHE is a EuroBridge widget-based document editor that supports speech commands as well as direct manipulation interfaces.

  14. An integrated modelling framework for neural circuits with multiple neuromodulators

    Science.gov (United States)

    Vemana, Vinith

    2017-01-01

    Neuromodulators are endogenous neurochemicals that regulate biophysical and biochemical processes, which control brain function and behaviour, and are often the targets of neuropharmacological drugs. Neuromodulator effects are generally complex, partly owing to the involvement of broad innervation, co-release of neuromodulators, complex intra- and extrasynaptic mechanisms, the existence of multiple receptor subtypes and high interconnectivity within the brain. In this work, we propose an efficient yet sufficiently realistic computational neural modelling framework to study some of these complex behaviours. Specifically, we propose a novel dynamical neural circuit model that integrates the effective neuromodulator-induced currents based on various experimental data (e.g. electrophysiology, neuropharmacology and voltammetry). The model can incorporate multiple interacting brain regions, including neuromodulator sources, simulates efficiently, and is easily extendable to large-scale brain models, e.g. for neuroimaging purposes. As an example, we model a network of mutually interacting neural populations in the lateral hypothalamus, dorsal raphe nucleus and locus coeruleus, which are major sources of the neuromodulators orexin/hypocretin, serotonin and norepinephrine/noradrenaline, respectively, and which play significant roles in regulating many physiological functions. We demonstrate that such a model can provide predictions of systemic drug effects of the popular antidepressants (e.g. reuptake inhibitors), neuromodulator antagonists or their combinations. Finally, we developed user-friendly graphical user interface software for model simulation and visualization for both fundamental sciences and pharmacological studies. PMID:28100828
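
    The notion of an effective neuromodulator-induced current can be illustrated, very loosely, with a small firing-rate model in which a slowly varying modulatory tone adds to the input of one population; the equations, time constants and weights below are generic textbook choices, not the circuit model of the paper.

```python
import numpy as np

# Two-population firing-rate model with an "effective" neuromodulatory current.
# Dynamics (all parameters illustrative): tau dr/dt = -r + f(w * r_other + I_ext + I_mod).
def f(x):
    return np.maximum(0.0, np.tanh(x))            # rectified saturating transfer function

dt, T = 1.0, 2000                                  # step (ms) and number of steps
tau_e, tau_i, tau_mod = 20.0, 10.0, 500.0          # time constants (ms)
w_ei, w_ie = 1.2, -1.0                             # E->I and I->E coupling weights
r_e = r_i = i_mod = 0.0

trace = np.zeros((T, 3))
for t in range(T):
    i_ext = 0.8 if 500 <= t < 1500 else 0.2        # external drive pulse
    # Neuromodulatory tone rises slowly while the excitatory population is active.
    i_mod += dt / tau_mod * (-i_mod + 0.6 * r_e)
    r_e += dt / tau_e * (-r_e + f(w_ie * r_i + i_ext + i_mod))
    r_i += dt / tau_i * (-r_i + f(w_ei * r_e))
    trace[t] = (r_e, r_i, i_mod)

print("peak excitatory rate:", round(trace[:, 0].max(), 3),
      "| final neuromodulatory tone:", round(trace[-1, 2], 3))
```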

  15. A python framework for environmental model uncertainty analysis

    Science.gov (United States)

    White, Jeremy; Fienen, Michael; Doherty, John E.

    2016-01-01

    We have developed pyEMU, a Python framework for Environmental Modeling Uncertainty analyses, an open-source tool that is non-intrusive, easy-to-use, computationally efficient, and scalable to highly-parameterized inverse problems. The framework implements several types of linear (first-order, second-moment (FOSM)) and non-linear uncertainty analyses. The FOSM-based analyses can also be completed prior to parameter estimation to help inform important modeling decisions, such as parameterization and objective function formulation. Complete workflows for several types of FOSM-based and non-linear analyses are documented in example notebooks implemented using Jupyter that are available in the online pyEMU repository. Example workflows include basic parameter and forecast analyses, data worth analyses, and error-variance analyses, as well as usage of parameter ensemble generation and management capabilities. These workflows document the necessary steps and provide insights into the results, with the goal of educating users not only in how to apply pyEMU, but also in the underlying theory of applied uncertainty quantification.
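
    The linear (FOSM) propagation underlying these analyses amounts to a few lines of linear algebra, sketched below in plain NumPy; the Jacobians, prior covariance and noise levels are invented placeholders, and this is the underlying algebra rather than the pyEMU API.

```python
import numpy as np

rng = np.random.default_rng(3)
n_par, n_obs = 8, 20

# Placeholder sensitivities: observation and forecast Jacobians w.r.t. the parameters.
X = rng.normal(size=(n_obs, n_par))      # d(observations)/d(parameters)
y = rng.normal(size=(1, n_par))          # d(forecast)/d(parameters)

C_prior = np.diag(np.full(n_par, 0.5))   # assumed prior parameter covariance
R = np.diag(np.full(n_obs, 0.1))         # assumed observation noise covariance

# FOSM: prior forecast variance, then conditioning on the observations (Schur complement).
prior_var = float(y @ C_prior @ y.T)
gain = C_prior @ X.T @ np.linalg.inv(X @ C_prior @ X.T + R)
C_post = C_prior - gain @ X @ C_prior
post_var = float(y @ C_post @ y.T)

print(f"forecast std: prior {prior_var**0.5:.3f} -> posterior {post_var**0.5:.3f}")
```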

  16. Receptor modeling application framework for particle source apportionment.

    Science.gov (United States)

    Watson, John G; Zhu, Tan; Chow, Judith C; Engelbrecht, Johann; Fujita, Eric M; Wilson, William E

    2002-12-01

    Receptor models infer contributions from particulate matter (PM) source types using multivariate measurements of particle chemical and physical properties. Receptor models complement source models that estimate concentrations from emissions inventories and transport meteorology. Enrichment factor, chemical mass balance, multiple linear regression, eigenvector, edge detection, neural network, aerosol evolution, and aerosol equilibrium models have all been used to solve particulate air quality problems, and more than 500 citations of their theory and application document these uses. While elements, ions, and carbons were often used to apportion TSP, PM10, and PM2.5 among many source types, many of these components have been reduced in source emissions such that more complex measurements of carbon fractions, specific organic compounds, single particle characteristics, and isotopic abundances now need to be measured in source and receptor samples. Compliance monitoring networks are not usually designed to obtain data for the observables, locations, and time periods that allow receptor models to be applied. Measurements from existing networks can be used to form conceptual models that allow the needed monitoring network to be optimized. The framework for using receptor models to solve air quality problems consists of: (1) formulating a conceptual model; (2) identifying potential sources; (3) characterizing source emissions; (4) obtaining and analyzing ambient PM samples for major components and source markers; (5) confirming source types with multivariate receptor models; (6) quantifying source contributions with the chemical mass balance; (7) estimating profile changes and the limiting precursor gases for secondary aerosols; and (8) reconciling receptor modeling results with source models, emissions inventories, and receptor data analyses.
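
    Step (6) of this framework, the chemical mass balance, can be posed as a non-negative least-squares problem in which ambient species concentrations are expressed as a source-profile matrix times unknown source contributions. The profiles and ambient vector below are invented for illustration and omit the measurement-uncertainty weighting used in the operational CMB model.

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical source profiles: mass fraction of each species emitted by each source type.
species = ["OC", "EC", "SO4", "NO3", "Si"]
profiles = np.array([
    # motor veh.  wood smoke  soil dust
    [0.35,        0.55,       0.05],   # OC
    [0.25,        0.10,       0.01],   # EC
    [0.05,        0.02,       0.02],   # SO4
    [0.05,        0.02,       0.01],   # NO3
    [0.01,        0.01,       0.30],   # Si
])

ambient = np.array([6.0, 2.8, 0.9, 0.8, 1.5])   # measured concentrations (ug/m3, toy values)

# Solve ambient = profiles @ contributions for non-negative source contributions.
contributions, residual_norm = nnls(profiles, ambient)
for name, c in zip(["motor vehicles", "wood smoke", "soil dust"], contributions):
    print(f"{name:14s}: {c:5.2f} ug/m3")
print("residual norm:", round(residual_norm, 3))
```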

  17. A Model-driven Framework for Educational Game Design

    Directory of Open Access Journals (Sweden)

    Bill Roungas

    2016-09-01

    Full Text Available Educational games are a class of serious games whose main purpose is to teach some subject to their players. Despite the many existing design frameworks, these games are too often created in an ad-hoc manner, and typically without the use of a game design document (GDD. We argue that a reason for this phenomenon is that current ways to structure, create and update GDDs do not increase the value of the artifact in the design and development process. As a solution, we propose a model-driven, web-based knowledge management environment that supports game designers in the creation of a GDD that accounts for and relates educational and entertainment game elements. The foundation of our approach is our devised conceptual model for educational games, which also defines the structure of the design environment. We present promising results from an evaluation of our environment with eight experts in serious games.

  18. A Bisimulation-based Hierarchical Framework for Software Development Models

    Directory of Open Access Journals (Sweden)

    Ping Liang

    2013-08-01

    Full Text Available Software development models have matured since the emergence of software engineering: the waterfall model, V-model, spiral model, etc. To ensure the successful implementation of those models, various metrics for software products and the development process have been developed alongside them, such as CMMI, software metrics, and process re-engineering. The quality of software products and processes can thereby be kept as consistent as possible, and the abstract integrity of a software product can be achieved. In reality, however, the maintenance cost of software products remains high, and grows even higher as software evolves, owing to inconsistencies introduced by changes and to inherent errors in the products. It is better to build a robust software product that can sustain as many changes as possible. Therefore, this paper proposes a process-algebra-based hierarchical framework to extract an abstract equivalent of the deliverable at the end of each phase of a software product from its software development models. The process algebra equivalent of the deliverable is developed hierarchically along with the development of the software product, applying bisimulation to test-run the deliverables of the phases and guarantee the consistency and integrity of the software development and product in a straightforward mathematical way. An algorithm is also given to carry out the assessment of the phase deliverable in process algebra.

  19. A Categorical Framework for Model Classification in the Geosciences

    Science.gov (United States)

    Hauhs, Michael; Trancón y Widemann, Baltasar; Lange, Holger

    2016-04-01

    Models have a mixed record of success in the geosciences. In meteorology, model development and implementation has been among the first and most successful examples of triggering computer technology in science. On the other hand, notorious problems such as the 'equifinality issue' in hydrology lead to a rather mixed reputation of models in other areas. The most successful models in geosciences are applications of dynamic systems theory to non-living systems or phenomena. Thus, we start from the hypothesis that the success of model applications relates to the influence of life on the phenomenon under study. We thus focus on the (formal) representation of life in models. The aim is to investigate whether disappointment in model performance is due to system properties such as heterogeneity and historicity of ecosystems, or rather reflects an abstraction and formalisation problem at a fundamental level. As a formal framework for this investigation, we use category theory as applied in computer science to specify behaviour at an interface. Its methods have been developed for translating and comparing formal structures among different application areas and seems highly suited for a classification of the current "model zoo" in the geosciences. The approach is rather abstract, with a high degree of generality but a low level of expressibility. Here, category theory will be employed to check the consistency of assumptions about life in different models. It will be shown that it is sufficient to distinguish just four logical cases to check for consistency of model content. All four cases can be formalised as variants of coalgebra-algebra homomorphisms. It can be demonstrated that transitions between the four variants affect the relevant observations (time series or spatial maps), the formalisms used (equations, decision trees) and the test criteria of success (prediction, classification) of the resulting model types. We will present examples from hydrology and ecology in

  20. An Extended Model Driven Framework for End-to-End Consistent Model Transformation

    Directory of Open Access Journals (Sweden)

    Mr. G. Ramesh

    2016-08-01

    Full Text Available Model Driven Development (MDD) results in quick transformation from models to corresponding systems. Forward engineering features of modelling tools can help in generating source code from models. To build a robust system it is important to have consistency checking in the design models, and likewise between the design model and the transformed implementation. Our framework, named Extensible Real Time Software Design Inconsistency Checker (XRTSDIC) and proposed in our previous papers, supports consistency checking in design models. This paper focuses on automatic model transformation. An algorithm and transformation rules for model transformation from UML class diagrams to ERD and SQL are proposed. The model transformation bestows many advantages such as reducing the cost of development, improving quality, enhancing productivity and increasing customer satisfaction. The proposed framework has been enhanced to ensure that transformed implementations conform to their model counterparts, besides checking end-to-end consistency.
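
    A toy sketch of such a rule-based transformation, from a class-diagram-like description to relational DDL, is given below; the in-memory model format, naming rules and type map are invented for illustration and are not the transformation rules proposed in the paper.

```python
# A toy class-diagram-like model (classes, typed attributes, associations) rewritten as
# SQL DDL. The structure, naming rules and type map below are invented for illustration.
TYPE_MAP = {"int": "INTEGER", "string": "VARCHAR(255)", "date": "DATE", "bool": "BOOLEAN"}

classes = {
    "Customer": {"attributes": [("id", "int"), ("name", "string")], "references": []},
    "Purchase": {"attributes": [("id", "int"), ("placed_on", "date")], "references": ["Customer"]},
}

def to_ddl(model):
    statements = []
    for name, spec in model.items():
        cols = [f"  {attr} {TYPE_MAP[typ]}" for attr, typ in spec["attributes"]]
        cols[0] += " PRIMARY KEY"                      # rule: first attribute becomes the key
        for ref in spec["references"]:                 # rule: association -> foreign key column
            cols.append(f"  {ref.lower()}_id INTEGER REFERENCES {ref.lower()}(id)")
        statements.append(f"CREATE TABLE {name.lower()} (\n" + ",\n".join(cols) + "\n);")
    return "\n\n".join(statements)

print(to_ddl(classes))
```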

  1. Usage Intention Framework Model: A Fuzzy Logic Interpretation of the Classical Utaut Model

    Science.gov (United States)

    Sandaire, Johnny

    2009-01-01

    A fuzzy conjoint analysis (FCA: Turksen, 1992) model for enhancing management decision in the technology adoption domain was implemented as an extension to the UTAUT model (Venkatesh, Morris, Davis, & Davis, 2003). Additionally, a UTAUT-based Usage Intention Framework Model (UIFM) introduced a closed-loop feedback system. The empirical evidence…

  2. Participatory Model Construction and Model Use in Natural Resource Management: a Framework for Reflection

    NARCIS (Netherlands)

    Bots, P.W.G.; Van Daalen, C.E.

    2008-01-01

    In this article we propose a framework which can assist analysts in their reflection on the requirements for a participatory modelling exercise in natural resource management. Firstly, we distinguish different types of formal models which may be developed, ranging from models that focus on (bio)phys

  4. An Integrated Hydro-Economic Modelling Framework to Evaluate Water Allocation Strategies I: Model Development.

    NARCIS (Netherlands)

    George, B.; Malano, H.; Davidson, B.; Hellegers, P.; Bharati, L.; Sylvain, M.

    2011-01-01

    In this paper an integrated modelling framework for water resources planning and management that can be used to carry out an analysis of alternative policy scenarios for water allocation and use is described. The modelling approach is based on integrating a network allocation model (REALM) and a soc

  5. Learning Hidden Markov Models using Non-Negative Matrix Factorization

    CERN Document Server

    Cybenko, George

    2008-01-01

    The Baum-Welch algorithm together with its derivatives and variations has been the main technique for learning Hidden Markov Models (HMM) from observational data. We present an HMM learning algorithm based on the non-negative matrix factorization (NMF) of higher order Markovian statistics that is structurally different from Baum-Welch and its associated approaches. The described algorithm supports estimation of the number of recurrent states of an HMM and iterates the non-negative matrix factorization (NMF) algorithm to improve the learned HMM parameters. Numerical examples are provided as well.
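
    The sketch below is only loosely inspired by this idea: it factorizes the empirical matrix of second-order (consecutive-pair) statistics of a simulated discrete HMM output with scikit-learn's NMF, and reads the rank at which the reconstruction error flattens as a guess at the number of recurrent hidden states. It is not the algorithm of the paper; the rank-selection heuristic and all parameters are assumptions.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(4)

# Simulate a discrete-output HMM with 3 hidden states and 6 observation symbols.
A = np.array([[0.8, 0.1, 0.1], [0.1, 0.8, 0.1], [0.1, 0.1, 0.8]])
B = np.array([[0.6, 0.3, 0.1, 0.0, 0.0, 0.0],
              [0.0, 0.1, 0.6, 0.3, 0.0, 0.0],
              [0.0, 0.0, 0.0, 0.1, 0.3, 0.6]])
s, obs = 0, []
for _ in range(10000):
    obs.append(rng.choice(6, p=B[s]))
    s = rng.choice(3, p=A[s])
obs = np.array(obs)

# Second-order statistics: empirical joint distribution of consecutive symbol pairs.
P2 = np.zeros((6, 6))
np.add.at(P2, (obs[:-1], obs[1:]), 1.0)
P2 /= P2.sum()

# Factorize with increasing rank; the reconstruction error typically flattens near the
# true number of hidden states, since the pair distribution has non-negative rank <= 3 here.
for rank in range(1, 6):
    nmf = NMF(n_components=rank, init="nndsvda", max_iter=500, random_state=0)
    W = nmf.fit_transform(P2)
    err = np.linalg.norm(P2 - W @ nmf.components_)
    print(f"rank {rank}: reconstruction error {err:.4f}")
```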

  6. DATA CONTEXT MODEL IN THE PROCESS INTEGRATION FRAMEWORK

    Institute of Scientific and Technical Information of China (English)

    ZHAO Bo; YAN Yan; NING Ruxin; LI Shiyun

    2008-01-01

    Process integration is an important aspect of the product development process. Recent research has focused on project management, workflow management and process modeling. Based on an analysis of the process, the product development process is divided into three levels of granularity, from the macroscopic to the microscopic. Our research concentrates on the workflow and the fine-grained design process. To represent the data and the relationships among them for process integration, a context model is introduced and its characteristics are analyzed. The tree-like inheritance structure among the context model's classes is illustrated, and the reference relationships among them are explained. Extensible markup language (XML) files are then used to describe these classes. A four-tier framework for process integration has been established, in which the model-view-controller pattern is used to separate the context model from its various views. The integration of applications is achieved by encapsulating the enterprise's business logic as distributed services. A prototype system for the design of an air filter has been applied in an institute.

  7. Assessment of Solution Uncertainties in Single-Column Modeling Frameworks.

    Science.gov (United States)

    Hack, James J.; Pedretti, John A.

    2000-01-01

    Single-column models (SCMs) have been extensively promoted in recent years as an effective means to develop and test physical parameterizations targeted for more complex three-dimensional climate models. Although there are some clear advantages associated with single-column modeling, there are also some significant disadvantages, including the absence of large-scale feedbacks. Basic limitations of an SCM framework can make it difficult to interpret solutions, and at times contribute to rather striking failures to identify even first-order sensitivities as they would be observed in a global climate simulation. This manuscript will focus on one of the basic experimental approaches currently exploited by the single-column modeling community, with an emphasis on establishing the inherent uncertainties in the numerical solutions. The analysis will employ the standard physics package from the NCAR CCM3 and will illustrate the nature of solution uncertainties that arise from nonlinearities in parameterized physics. The results of this study suggest the need to make use of an ensemble methodology when conducting single-column modeling investigations.

  8. Structural uncertainty in watershed phosphorus modeling: Toward a stochastic framework

    Science.gov (United States)

    Chen, Lei; Gong, Yongwei; Shen, Zhenyao

    2016-06-01

    Structural uncertainty is an important source of model predictive errors, but few studies have been conducted on the error-transitivity from model structure to nonpoint source (NPS) prediction. In this study, we focused on the structural uncertainty caused by the algorithms and equations that are used to describe the phosphorus (P) cycle at the watershed scale. The sensitivity of simulated P to each algorithm/equation was quantified using the Soil and Water Assessment Tool (SWAT) in the Three Gorges Reservoir Area, China. The results indicated that the ratios of C:N and P:N for humic materials, as well as the algorithm of fertilization and P leaching contributed the largest output uncertainties. In comparison, the initiation of inorganic P in the soil layer and the transformation algorithm between P pools are less sensitive for the NPS-P predictions. In addition, the coefficient of variation values were quantified as 0.028-0.086, indicating that the structure-induced uncertainty is minor compared to NPS-P prediction uncertainty caused by the model input and parameters. Using the stochastic framework, the cumulative probability of simulated NPS-P data provided a trade-off between expenditure burden and desired risk. In this sense, this paper provides valuable information for the control of model structural uncertainty, and can be extrapolated to other model-based studies.

  9. A Data Driven Framework for Integrating Regional Climate Models

    Science.gov (United States)

    Lansing, C.; Kleese van Dam, K.; Liu, Y.; Elsethagen, T.; Guillen, Z.; Stephan, E.; Critchlow, T.; Gorton, I.

    2012-12-01

    There are increasing needs for research addressing complex climate sensitive issues of concern to decision-makers and policy planners at a regional level. Decisions about allocating scarce water across competing municipal, agricultural, and ecosystem demands is just one of the challenges ahead, along with decisions regarding competing land use priorities such as biofuels, food, and species habitat. Being able to predict the extent of future climate change in the context of introducing alternative energy production strategies requires a new generation of modeling capabilities. We will also need more complete representations of human systems at regional scales, incorporating the influences of population centers, land use, agriculture and existing and planned electrical demand and generation infrastructure. At PNNL we are working towards creating a first-of-a-kind capability known as the Integrated Regional Earth System Model (iRESM). The fundamental goal of the iRESM initiative is the critical analyses of the tradeoffs and consequences of decision and policy making for integrated human and environmental systems. This necessarily combines different scientific processes, bridging different temporal and geographic scales and resolving the semantic differences between them. To achieve this goal, iRESM is developing a modeling framework and supporting infrastructure that enable the scientific team to evaluate different scenarios in light of specific stakeholder questions such as "How do regional changes in mean climate states and climate extremes affect water storage and energy consumption and how do such decisions influence possible mitigation and carbon management schemes?" The resulting capability will give analysts a toolset to gain insights into how regional economies can respond to climate change mitigation policies and accelerated deployment of alternative energy technologies. The iRESM framework consists of a collection of coupled models working with high

  10. A modelling framework to simulate foliar fungal epidemics using functional-structural plant models.

    Science.gov (United States)

    Garin, Guillaume; Fournier, Christian; Andrieu, Bruno; Houlès, Vianney; Robert, Corinne; Pradal, Christophe

    2014-09-01

    Sustainable agriculture requires the identification of new, environmentally responsible strategies of crop protection. Modelling of pathosystems can allow a better understanding of the major interactions inside these dynamic systems and may lead to innovative protection strategies. In particular, functional-structural plant models (FSPMs) have been identified as a means to optimize the use of architecture-related traits. A current limitation lies in the inherent complexity of this type of modelling, and thus the purpose of this paper is to provide a framework to both extend and simplify the modelling of pathosystems using FSPMs. Different entities and interactions occurring in pathosystems were formalized in a conceptual model. A framework based on these concepts was then implemented within the open-source OpenAlea modelling platform, using the platform's general strategy of modelling plant-environment interactions and extending it to handle plant interactions with pathogens. New developments include a generic data structure for representing lesions and dispersal units, and a series of generic protocols to communicate with objects representing the canopy and its microenvironment in the OpenAlea platform. Another development is the addition of a library of elementary models involved in pathosystem modelling. Several plant and physical models are already available in OpenAlea and can be combined in models of pathosystems using this framework approach. Two contrasting pathosystems are implemented using the framework and illustrate its generic utility. Simulations demonstrate the framework's ability to simulate multiscaled interactions within pathosystems, and also show that models are modular components within the framework and can be extended. This is illustrated by testing the impact of canopy architectural traits on fungal dispersal. This study provides a framework for modelling a large number of pathosystems using FSPMs. This structure can accommodate both

  11. A Theoretical Framework for Physics Education Research: Modeling Student Thinking

    CERN Document Server

    Redish, E F

    2004-01-01

    Education is a goal-oriented field. But if we want to treat education scientifically so we can accumulate, evaluate, and refine what we learn, then we must develop a theoretical framework that is strongly rooted in objective observations and through which different theoretical models of student thinking can be compared. Much that is known in the behavioral sciences is robust and observationally based. In this paper, I draw from a variety of fields ranging from neuroscience to sociolinguistics to propose an over-arching theoretical framework that allows us to both make sense of what we see in the classroom and to compare a variety of specific theoretical approaches. My synthesis is organized around an analysis of the individual's cognition and how it interacts with the environment. This leads to a two level system, a knowledge-structure level where associational patterns dominate, and a control-structure level where one can describe expectations and epistemology. For each level, I sketch some plausible startin...

  12. Fault diagnosis approach for rotor system based on LMD-approximate entropy and HMM

    Institute of Scientific and Technical Information of China (English)

    赵荣珍; 于昊; 徐继刚

    2012-01-01

A new fault diagnosis approach for rotor systems was proposed based on local mean decomposition (LMD) approximate entropy and hidden Markov models (HMM). The fine localization property of LMD and the approximate entropy, combined with an HMM, were used to identify and quantify the fault type. Using the LMD method, the vibration signal of the rotor system was decomposed into a sum of product function (PF) components whose instantaneous frequencies have physical meaning. The approximate entropies of the first three PF components were taken as the eigenvector of the signal, and this eigenvector was input into an HMM classifier to recognize the fault type. Simulation results showed that this method can effectively extract the fault characteristics and, combined with the dynamic statistical characteristics of the HMM, intelligently identify the rotor fault type.

  13. TP-model transformation-based-control design frameworks

    CERN Document Server

    Baranyi, Péter

    2016-01-01

    This book covers new aspects and frameworks of control, design, and optimization based on the TP model transformation and its various extensions. The author outlines the three main steps of polytopic and LMI based control design: 1) development of the qLPV state-space model, 2) generation of the polytopic model; and 3) application of LMI to derive controller and observer. He goes on to describe why literature has extensively studied LMI design, but has not focused much on the second step, in part because the generation and manipulation of the polytopic form was not tractable in many cases. The author then shows how the TP model transformation facilitates this second step and hence reveals new directions, leading to powerful design procedures and the formulation of new questions. The chapters of this book, and the complex dynamical control tasks which they cover, are organized so as to present and analyze the beneficial aspect of the family of approaches (control, design, and optimization). Additionally, the b...

  14. A Multiple Reaction Modelling Framework for Microbial Electrochemical Technologies

    Science.gov (United States)

    Oyetunde, Tolutola; Sarma, Priyangshu M.; Ahmad, Farrukh; Rodríguez, Jorge

    2017-01-01

    A mathematical model for the theoretical evaluation of microbial electrochemical technologies (METs) is presented that incorporates a detailed physico-chemical framework, includes multiple reactions (both at the electrodes and in the bulk phase) and involves a variety of microbial functional groups. The model is applied to two theoretical case studies: (i) A microbial electrolysis cell (MEC) for continuous anodic volatile fatty acids (VFA) oxidation and cathodic VFA reduction to alcohols, for which the theoretical system response to changes in applied voltage and VFA feed ratio (anode-to-cathode) as well as membrane type are investigated. This case involves multiple parallel electrode reactions in both anode and cathode compartments; (ii) A microbial fuel cell (MFC) for cathodic perchlorate reduction, in which the theoretical impact of feed flow rates and concentrations on the overall system performance are investigated. This case involves multiple electrode reactions in series in the cathode compartment. The model structure captures interactions between important system variables based on first principles and provides a platform for the dynamic description of METs involving electrode reactions both in parallel and in series and in both MFC and MEC configurations. Such a theoretical modelling approach, largely based on first principles, appears promising in the development and testing of MET control and optimization strategies. PMID:28054959

  15. Improving NASA's Multiscale Modeling Framework for Tropical Cyclone Climate Study

    Science.gov (United States)

    Shen, Bo-Wen; Nelson, Bron; Cheung, Samson; Tao, Wei-Kuo

    2013-01-01

    One of the current challenges in tropical cyclone (TC) research is how to improve our understanding of TC interannual variability and the impact of climate change on TCs. Recent advances in global modeling, visualization, and supercomputing technologies at NASA show potential for such studies. In this article, the authors discuss recent scalability improvement to the multiscale modeling framework (MMF) that makes it feasible to perform long-term TC-resolving simulations. The MMF consists of the finite-volume general circulation model (fvGCM), supplemented by a copy of the Goddard cumulus ensemble model (GCE) at each of the fvGCM grid points, giving 13,104 GCE copies. The original fvGCM implementation has a 1D data decomposition; the revised MMF implementation retains the 1D decomposition for most of the code, but uses a 2D decomposition for the massive copies of GCEs. Because the vast majority of computation time in the MMF is spent computing the GCEs, this approach can achieve excellent speedup without incurring the cost of modifying the entire code. Intelligent process mapping allows differing numbers of processes to be assigned to each domain for load balancing. The revised parallel implementation shows highly promising scalability, obtaining a nearly 80-fold speedup by increasing the number of cores from 30 to 3,335.

  16. A Smart Modeling Framework for Integrating BMI-enabled Models as Web Services

    Science.gov (United States)

    Jiang, P.; Elag, M.; Kumar, P.; Peckham, S. D.; Liu, R.; Marini, L.; Hsu, L.

    2015-12-01

Service-oriented computing provides an opportunity to couple web service models using semantic web technology. Through this approach, models that are exposed as web services can be conserved in their own local environment, thus making it easy for modelers to maintain and update the models. In integrated modeling, the service-oriented loose-coupling approach requires (1) a set of models as web services, (2) the model metadata describing the external features of a model (e.g., variable name, unit, computational grid, etc.) and (3) a model integration framework. We present the architecture of coupling web service models that are self-describing by utilizing a smart modeling framework. We expose models that are encapsulated with CSDMS (Community Surface Dynamics Modeling System) Basic Model Interfaces (BMI) as web services. The BMI-enabled models are self-describing by uncovering models' metadata through BMI functions. After a BMI-enabled model is exposed as a service, a client can initialize, execute and retrieve the meta-information of the model by calling its BMI functions over the web. Furthermore, a revised version of EMELI (Peckham, 2015), an Experimental Modeling Environment for Linking and Interoperability, is chosen as the framework for coupling BMI-enabled web service models. EMELI allows users to combine a set of component models into a complex model by standardizing the model interface using BMI as well as providing a set of utilities smoothing the integration process (e.g., temporal interpolation). We modify the original EMELI so that the revised modeling framework is able to initialize, execute and find the dependencies of the BMI-enabled web service models. By using the revised EMELI, an example will be presented on integrating a set of topoflow model components that are BMI-enabled and exposed as web services. Reference: Peckham, S.D. (2014) EMELI 1.0: An experimental smart modeling framework for automatic coupling of self-describing models, Proceedings of HIC 2014
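
    The Basic Model Interface standardizes a small set of calls such as initialize, update, get_value and finalize. The toy class below is only a sketch of what a BMI-style component might look like in Python; the linear-reservoir model, its variable name and its configuration keys are invented for illustration and are not part of the CSDMS specification, which also defines grid, unit and time metadata functions.

    class LinearReservoirBMI:
        # Toy model exposing a BMI-style interface (illustrative only).

        def initialize(self, config=None):
            config = config or {}
            self.storage = config.get("initial_storage", 10.0)  # mm
            self.k = config.get("recession", 0.1)               # 1/day
            self.time = 0.0

        def update(self):
            # One explicit daily step of dS/dt = -k * S.
            self.storage -= self.k * self.storage
            self.time += 1.0

        def get_value(self, name):
            if name == "water_storage":
                return self.storage
            raise KeyError(name)

        def finalize(self):
            pass

    # A generic driver only needs the interface, not the model internals.
    model = LinearReservoirBMI()
    model.initialize({"initial_storage": 50.0})
    for _ in range(5):
        model.update()
        print(model.time, model.get_value("water_storage"))
    model.finalize()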

  17. A FRAMEWORK FOR AN OPEN SOURCE GEOSPATIAL CERTIFICATION MODEL

    Directory of Open Access Journals (Sweden)

    T. U. R. Khan

    2016-06-01

The geospatial industry is forecasted to have enormous growth in the forthcoming years and an extended need for a well-educated workforce. Hence ongoing education and training play an important role in professional life. In parallel, in the geospatial and IT arena as well as in political discussion and legislation, Open Source solutions, open data proliferation, and the use of open standards have an increasing significance. Based on the Memorandum of Understanding between the International Cartographic Association, the OSGeo Foundation, and ISPRS, this development led to the implementation of the ICA-OSGeo-Lab initiative with its mission “Making geospatial education and opportunities accessible to all”. Discussions in this initiative and the growth and maturity of geospatial Open Source software initiated the idea to develop a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification approaches are already offered by numerous organisations, e.g., the GIS Certification Institute, GeoAcademy, ASPRS, and software vendors such as Esri, Oracle, and RedHat. They focus on different fields of expertise and have different levels and ways of examination, which are offered for a wide range of fees. The development of the certification framework presented here is based on the analysis of diverse bodies of knowledge concepts, i.e., the NCGIA Core Curriculum, URISA Body Of Knowledge, USGIF Essential Body Of Knowledge, the “Geographic Information: Need to Know”, currently under development, and the Geospatial Technology Competency Model (GTCM). The latter provides a US-American-oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry and essentially influenced the certification framework. In addition to the theoretical analysis of existing resources, the geospatial community was involved in two ways. An online survey about the relevance of Open Source was performed and

  18. a Framework for AN Open Source Geospatial Certification Model

    Science.gov (United States)

    Khan, T. U. R.; Davis, P.; Behr, F.-J.

    2016-06-01

The geospatial industry is forecasted to have enormous growth in the forthcoming years and an extended need for a well-educated workforce. Hence ongoing education and training play an important role in professional life. In parallel, in the geospatial and IT arena as well as in political discussion and legislation, Open Source solutions, open data proliferation, and the use of open standards have an increasing significance. Based on the Memorandum of Understanding between the International Cartographic Association, the OSGeo Foundation, and ISPRS, this development led to the implementation of the ICA-OSGeo-Lab initiative with its mission "Making geospatial education and opportunities accessible to all". Discussions in this initiative and the growth and maturity of geospatial Open Source software initiated the idea to develop a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification approaches are already offered by numerous organisations, e.g., the GIS Certification Institute, GeoAcademy, ASPRS, and software vendors such as Esri, Oracle, and RedHat. They focus on different fields of expertise and have different levels and ways of examination, which are offered for a wide range of fees. The development of the certification framework presented here is based on the analysis of diverse bodies of knowledge concepts, i.e., the NCGIA Core Curriculum, URISA Body Of Knowledge, USGIF Essential Body Of Knowledge, the "Geographic Information: Need to Know", currently under development, and the Geospatial Technology Competency Model (GTCM). The latter provides a US-American-oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry and essentially influenced the certification framework. In addition to the theoretical analysis of existing resources, the geospatial community was involved in two ways. An online survey about the relevance of Open Source was performed and evaluated with 105

  19. A framework of modeling detector systems for computed tomography simulations

    Science.gov (United States)

    Youn, H.; Kim, D.; Kim, S. H.; Kam, S.; Jeon, H.; Nam, J.; Kim, H. K.

    2016-01-01

The ultimate development in computed tomography (CT) technology may be a system that can provide images with excellent lesion conspicuity at a patient dose as low as possible. Imaging simulation tools have been used cost-effectively for such developments and will continue to be. For a more accurate and realistic imaging simulation, the signal and noise propagation through a CT detector system has been modeled in this study using cascaded linear-systems theory. The simulation results are validated by comparison with measurements from a laboratory flat-panel micro-CT system. Although the image noise obtained from the simulations at higher exposures is slightly smaller than that obtained from the measurements, the difference between them is reasonably acceptable. According to the simulation results for various exposure levels and additive electronic noise levels, x-ray quantum noise dominates over the additive electronic noise. The framework for modeling a CT detector system suggested in this study will be helpful for the development of an accurate and realistic projection simulation model.

  20. A Production Model for Construction: A Theoretical Framework

    Directory of Open Access Journals (Sweden)

    Ricardo Antunes

    2015-03-01

The building construction industry faces challenges such as increasing project complexity and scope requirements alongside shorter deadlines. Additionally, economic uncertainty and rising business competition, with a subsequent decrease in profit margins for the industry, demand the development of new approaches to construction management. However, the building construction sector relies on practices based on intuition and experience, overlooking the dynamics of its production system. Furthermore, researchers maintain that the construction industry has no history of applying mathematical approaches to model and manage production. Much work has been carried out on how manufacturing practices, mostly lean principles, apply to construction projects. Nevertheless, there has been little research to understand the fundamental mechanisms of production in construction. This study develops an in-depth literature review to examine the existing knowledge about production models and their characteristics in order to establish a foundation for dynamic production systems management in construction. As a result, a theoretical framework is proposed, which will be instrumental in the future development of mathematical production models aimed at predicting the performance and behaviour of dynamic project-based systems in construction.

  1. A unified framework for benchmark dose estimation applied to mixed models and model averaging

    DEFF Research Database (Denmark)

    Ritz, Christian; Gerhard, Daniel; Hothorn, Ludwig A.

    2013-01-01

    This article develops a framework for benchmark dose estimation that allows intrinsically nonlinear dose-response models to be used for continuous data in much the same way as is already possible for quantal data. This means that the same dose-response model equations may be applied to both...

  2. Name segmentation using hidden Markov models and its application in record linkage

    Directory of Open Access Journals (Sweden)

    Rita de Cassia Braga Gonçalves

    2014-10-01

This study aimed to evaluate the use of hidden Markov models (HMM) for the segmentation of person names and its influence on record linkage. An HMM was applied to the segmentation of patient's and mother's names in the databases of the Mortality Information System (SIM), the Information Subsystem for High Complexity Procedures (APAC), and the Hospital Information System (AIH). A sample of 200 patients from each database was segmented via HMM, and the results were compared to those from segmentation by the authors. The APAC-SIM and APAC-AIH databases were linked using three different segmentation strategies, one of which used HMM. Conformity of segmentation via HMM varied from 90.5% to 92.5%. The different segmentation strategies yielded similar results in the record linkage process. This study suggests that segmentation of Brazilian names via HMM is no more effective than traditional segmentation approaches in the linkage process.
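
    The paper does not publish its model parameters; the sketch below only illustrates how Viterbi decoding assigns name-part labels to the tokens of a person's name. The states, transition and emission probabilities, and the surname-particle heuristic are all invented for illustration.

    import numpy as np

    states = ["FIRST", "MIDDLE", "LAST"]
    # Illustrative parameters only; the study estimated its model from real name data.
    pi = np.array([0.90, 0.05, 0.05])
    A = np.array([[0.10, 0.60, 0.30],   # FIRST  -> FIRST / MIDDLE / LAST
                  [0.05, 0.45, 0.50],   # MIDDLE -> ...
                  [0.05, 0.15, 0.80]])  # LAST   -> ...

    def emission(token):
        # Very rough per-state token likelihoods (surname-particle heuristic).
        if token.lower() in {"da", "de", "do", "dos", "das"}:
            return np.array([0.05, 0.15, 0.80])
        return np.array([0.40, 0.30, 0.30])

    def viterbi(tokens):
        logd = np.log(pi) + np.log(emission(tokens[0]))
        back = []
        for tok in tokens[1:]:
            scores = logd[:, None] + np.log(A)      # previous state x next state
            back.append(scores.argmax(axis=0))
            logd = scores.max(axis=0) + np.log(emission(tok))
        path = [int(logd.argmax())]
        for ptr in reversed(back):
            path.append(int(ptr[path[-1]]))
        return [states[s] for s in reversed(path)]

    print(viterbi("Rita de Cassia Braga Goncalves".split()))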

  3. Flexible modeling frameworks to replace small ensembles of hydrological models and move toward large ensembles?

    Science.gov (United States)

    Addor, Nans; Clark, Martyn P.; Mizukami, Naoki

    2017-04-01

    Climate change impacts on hydrological processes are typically assessed using small ensembles of hydrological models. That is, a handful of hydrological models are typically driven by a larger number of climate models. Such a setup has several limitations. Because the number of hydrological models is small, only a small proportion of the model space is sampled, likely leading to an underestimation of the uncertainties in the projections. Further, sampling is arbitrary: although hydrological models should be selected to provide a representative sample of existing models (in terms of complexity and governing hypotheses), they are instead usually selected based on legacy reasons. Furthermore, running several hydrological models currently constitutes a practical challenge because each model must be setup and calibrated individually. Finally, and probably most importantly, the differences between the projected impacts cannot be directly related to differences between hydrological models, because the models are different in almost every possible aspect. We are hence in a situation in which different hydrological models deliver different projections, but for reasons that are mostly unclear, and in which the uncertainty in the projections is probably underestimated. To overcome these limitations, we are experimenting with the flexible modeling framework FUSE (Framework for Understanding Model Errors). FUSE enables to construct conceptual models piece by piece (in a "pick and mix" approach), so it can be used to generate a large number of models that mimic existing models and/or models that differ from other models in single targeted respect (e.g. how baseflow is generated). FUSE hence allows for controlled modeling experiments, and for a more systematic and exhaustive sampling of the model space. Here we explore climate change impacts over the contiguous USA on a 12km grid using two groups of three models: the first group involves the commonly used models VIC, PRMS and HEC

  4. Acid deposition: decision framework. Volume 1. Description of conceptual framework and decision-tree models. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Balson, W.E.; Boyd, D.W.; North, D.W.

    1982-08-01

    Acid precipitation and dry deposition of acid materials have emerged as an important environmental issue affecting the electric utility industry. This report presents a framework for the analysis of decisions on acid deposition. The decision framework is intended as a means of summarizing scientific information and uncertainties on the relation between emissions from electric utilities and other sources, acid deposition, and impacts on ecological systems. The methodology for implementing the framework is that of decision analysis, which provides a quantitative means of analyzing decisions under uncertainty. The decisions of interest include reductions in sulfur oxide and other emissions thought to be precursors of acid deposition, mitigation of acid deposition impacts through means such as liming of waterways and soils, and choice of strategies for research. The report first gives an overview of the decision framework and explains the decision analysis methods with a simplified caricature example. The state of scientific information and the modeling assumptions for the framework are then discussed for the three main modules of the framework: emissions and control technologies; long-range transport and chemical conversion in the atmosphere; and ecological impacts. The report then presents two versions of a decision tree model that implements the decision framework. The basic decision tree addresses decisions on emissions control and mitigation in the immediate future and a decade hence, and it includes uncertainties in the long-range transport and ecological impacts. The research emphasis decision tree addresses the effect of research funding on obtaining new information as the basis for future decisions. Illustrative data and calculations using the decision tree models are presented.

  5. AN INTEGRATED MODELING FRAMEWORK FOR CARBON MANAGEMENT TECHNOLOGIES

    Energy Technology Data Exchange (ETDEWEB)

    Anand B. Rao; Edward S. Rubin; Michael B. Berkenpas

    2004-03-01

CO₂ capture and storage (CCS) is gaining widespread interest as a potential method to control greenhouse gas emissions from fossil fuel sources, especially electric power plants. Commercial applications of CO₂ separation and capture technologies are found in a number of industrial process operations worldwide. Many of these capture technologies also are applicable to fossil fuel power plants, although applications to large-scale power generation remain to be demonstrated. This report describes the development of a generalized modeling framework to assess alternative CO₂ capture and storage options in the context of multi-pollutant control requirements for fossil fuel power plants. The focus of the report is on post-combustion CO₂ capture using amine-based absorption systems at pulverized coal-fired plants, which are the most prevalent technology used for power generation today. The modeling framework builds on the previously developed Integrated Environmental Control Model (IECM). The expanded version with carbon sequestration is designated as IECM-cs. The expanded modeling capability also includes natural gas combined cycle (NGCC) power plants and integrated coal gasification combined cycle (IGCC) systems as well as pulverized coal (PC) plants. This report presents details of the performance and cost models developed for an amine-based CO₂ capture system, representing the baseline of current commercial technology. The key uncertainties and variability in process design, performance and cost parameters which influence the overall cost of carbon mitigation also are characterized. The new performance and cost models for CO₂ capture systems have been integrated into the IECM-cs, along with models to estimate CO₂ transport and storage costs. The CO₂ control system also interacts with other emission control technologies such as flue gas desulfurization (FGD) systems for SO₂ control. The integrated model is applied to
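
    As a small illustration of the kind of figure of merit such an integrated model produces, the function below computes the commonly used cost of CO₂ avoided from the cost of electricity and emission rates of a reference plant and a capture-equipped plant; the numerical values are invented and do not come from the IECM.

    def cost_of_co2_avoided(coe_ref, coe_ccs, em_ref, em_ccs):
        # coe_* : cost of electricity, $/MWh
        # em_*  : CO2 emitted to the atmosphere, t CO2 / MWh
        return (coe_ccs - coe_ref) / (em_ref - em_ccs)

    # Invented example values for a pulverized-coal plant with an amine scrubber.
    print(round(cost_of_co2_avoided(coe_ref=50.0, coe_ccs=82.0,
                                    em_ref=0.80, em_ccs=0.11), 1), "$/t CO2 avoided")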

  6. Retrofitting Non-Cognitive-Diagnostic Reading Assessment under the Generalized DINA Model Framework

    Science.gov (United States)

    Chen, Huilin; Chen, Jinsong

    2016-01-01

    Cognitive diagnosis models (CDMs) are psychometric models developed mainly to assess examinees' specific strengths and weaknesses in a set of skills or attributes within a domain. By adopting the Generalized-DINA model framework, the recently developed general modeling framework, we attempted to retrofit the PISA reading assessments, a…

  7. Model-Based Reasoning in the Physics Laboratory: Framework and Initial Results

    Science.gov (United States)

    Zwickl, Benjamin M.; Hu, Dehui; Finkelstein, Noah; Lewandowski, H. J.

    2015-01-01

    We review and extend existing frameworks on modeling to develop a new framework that describes model-based reasoning in introductory and upper-division physics laboratories. Constructing and using models are core scientific practices that have gained significant attention within K-12 and higher education. Although modeling is a broadly applicable…

  8. A Global Modeling Framework for Plasma Kinetics: Development and Applications

    Science.gov (United States)

    Parsey, Guy Morland

The modern study of plasmas, and applications thereof, has developed synchronously with computer capabilities since the mid-1950s. Complexities inherent to these charged-particle, many-body systems have resulted in the development of multiple simulation methods (particle-in-cell, fluid, global modeling, etc.) in order to both explain observed phenomena and predict outcomes of plasma applications. Recognizing that different algorithms are chosen to best address specific topics of interest, this thesis centers around the development of an open-source global model framework for the focused study of non-equilibrium plasma kinetics. After verification and validation of the framework, it was used to study two physical phenomena: plasma-assisted combustion and the recently proposed optically-pumped rare gas metastable laser. Global models permeate chemistry and plasma science, relying on spatial averaging to focus attention on the dynamics of reaction networks. Defined by a set of species continuity and energy conservation equations, the required data and constructed systems are conceptually similar across most applications, providing a light platform for exploratory and result-search parameter scanning. Unfortunately, it is common practice for custom code to be developed for each application, an enormous duplication of effort which negatively affects the quality of the software produced. Presented herein, the Python-based Kinetic Global Modeling framework (KGMf) was designed to support all modeling phases: collection and analysis of reaction data, construction of an exportable system of model ODEs, and a platform for interactive evaluation and post-processing analysis. A symbolic ODE system is constructed for interactive manipulation and generation of a Jacobian, both of which are compiled as operation-optimized C-code. Plasma-assisted combustion and ignition (PAC/PAI) embody the modernization of burning fuel by opening up new avenues of control and optimization
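
    The KGMf itself is not reproduced here, but the core step of assembling a symbolic species-balance ODE system and deriving its Jacobian can be sketched with SymPy; the two-reaction network, species names and rate coefficients below are invented for illustration.

    import sympy as sp

    # Invented two-reaction network: e + A -> 2e + A+ (ionization, rate k1)
    #                                e + A+ -> A      (recombination, rate k2)
    ne, nA, nAp = sp.symbols("n_e n_A n_Ap", positive=True)
    k1, k2 = sp.symbols("k1 k2", positive=True)

    r1 = k1 * ne * nA    # ionization rate
    r2 = k2 * ne * nAp   # recombination rate

    # Species continuity equations d n_i / dt assembled from the reaction rates.
    rhs = sp.Matrix([r1 - r2,     # electrons
                     -r1 + r2,    # neutrals
                     r1 - r2])    # ions

    jac = rhs.jacobian(sp.Matrix([ne, nA, nAp]))
    print(jac)

    # Compile to fast numerical callables (the KGMf generates optimized C code;
    # lambdify is the lightweight stand-in here) for use with a stiff integrator.
    f = sp.lambdify((ne, nA, nAp, k1, k2), rhs)
    print(f(1e9, 1e16, 1e9, 1e-9, 1e-7))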

  9. Young diabetics' compliance in the framework of the MIMIC model.

    Science.gov (United States)

    Kyngäs, H; Hentinen, M; Koivukangas, P; Ohinmaa, A

    1996-11-01

    The compliance of 346 young diabetics aged 13-17 years with health regimens is analysed in the framework of a MIMIC (multiple indicators, multiple causes) model. The data were compiled by means of a questionnaire on compliance, conditions for compliance, the meaning attached to treatment and the impact of the disease, and the model constructed using the LISREL VII programme, treating compliance as an unobserved variable formulated in terms of observed causes (x-variables) and observed indicators (y-variables). The resulting solutions are entirely satisfactory. The goodness-of-fit index is 0.983, the root mean square residual 0.058 and the chi-squared statistic 43.35 (P compliance to be indicated by self-care behaviour, responsibility for treatment, intention to pursue the treatment and collaboration with the physician, and to be greatly determined by motivation, experience of the results of treatment and having the energy and will-power to pursue the treatment and, to a lesser extent, by a sense of normality and fear.

  10. Smart licensing and environmental flows: Modeling framework and sensitivity testing

    Science.gov (United States)

    Wilby, R. L.; Fenn, C. R.; Wood, P. J.; Timlett, R.; Lequesne, T.

    2011-12-01

    Adapting to climate change is just one among many challenges facing river managers. The response will involve balancing the long-term water demands of society with the changing needs of the environment in sustainable and cost effective ways. This paper describes a modeling framework for evaluating the sensitivity of low river flows to different configurations of abstraction licensing under both historical climate variability and expected climate change. A rainfall-runoff model is used to quantify trade-offs among environmental flow (e-flow) requirements, potential surface and groundwater abstraction volumes, and the frequency of harmful low-flow conditions. Using the River Itchen in southern England as a case study it is shown that the abstraction volume is more sensitive to uncertainty in the regional climate change projection than to the e-flow target. It is also found that "smarter" licensing arrangements (involving a mix of hands off flows and "rising block" abstraction rules) could achieve e-flow targets more frequently than conventional seasonal abstraction limits, with only modest reductions in average annual yield, even under a hotter, drier climate change scenario.
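
    The licence configurations examined in the paper are not spelled out in the abstract; the function below is only a guess at what a combined "hands-off flow plus rising block" rule might look like, with no abstraction below a protected flow and an increasing share of flow released for abstraction in successive blocks above it. All thresholds and fractions are invented.

    def allowed_abstraction(flow, hands_off=2.0,
                            blocks=((3.0, 0.1), (5.0, 0.2), (8.0, 0.3))):
        # Hypothetical licence rule: no abstraction at or below the hands-off
        # flow (m3/s); above it, each exceeded block threshold releases a
        # further fraction of the flow for abstraction. Numbers are invented.
        if flow <= hands_off:
            return 0.0
        allowed = 0.0
        lower = hands_off
        for threshold, fraction in blocks:
            if flow <= lower:
                break
            allowed += fraction * (min(flow, threshold) - lower)
            lower = threshold
        return allowed

    for q in (1.5, 2.5, 4.0, 10.0):
        print(q, round(allowed_abstraction(q), 3))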

  11. Python framework for kinetic modeling of electronically excited reaction pathways

    Science.gov (United States)

    Verboncoeur, John; Parsey, Guy; Guclu, Yaman; Christlieb, Andrew

    2012-10-01

The use of plasma energy to enhance and control the chemical reactions during combustion, a technology referred to as "plasma assisted combustion" (PAC), can result in a variety of beneficial effects: e.g. stable lean operation, pollution reduction, and a wider range of p-T operating conditions. While experimental evidence abounds, theoretical understanding of PAC is at best incomplete, and numerical tools still lack reliable predictive capabilities. In the context of a joint experimental-numerical effort at Michigan State University, we present here an open-source modular Python framework dedicated to the dynamic optimization of non-equilibrium PAC systems. Multiple sources of experimental reaction data, e.g. reaction rates, cross-sections and oscillator strengths, are used in order to quantify the effect of data uncertainty and limiting assumptions. A collisional-radiative model (CRM) is implemented to organize reactions by importance and as a potential means of measuring a non-Maxwellian electron energy distribution function (EEDF), when coupled to optical emission spectroscopy data. Finally, we explore scaling laws in PAC parameter space using a kinetic global model (KGM) accelerated with CRM optimized reaction sequences and sparse stiff integrators.

  12. A modeling and simulation framework for electrokinetic nanoparticle treatment

    Science.gov (United States)

    Phillips, James

    2011-12-01

    The focus of this research is to model and provide a simulation framework for the packing of differently sized spheres within a hard boundary. The novel contributions of this dissertation are the cylinders of influence (COI) method and sectoring method implementations. The impetus for this research stems from modeling electrokinetic nanoparticle (EN) treatment, which packs concrete pores with differently sized nanoparticles. We show an improved speed of the simulation compared to previously published results of EN treatment simulation while obtaining similar porosity reduction results. We mainly focused on readily, commercially available particle sizes of 2 nm and 20 nm particles, but have the capability to model other sizes. Our simulation has graphical capabilities and can provide additional data unobtainable from physical experimentation. The data collected has a median of 0.5750 and a mean of 0.5504. The standard error is 0.0054 at alpha = 0.05 for a 95% confidence interval of 0.5504 +/- 0.0054. The simulation has produced maximum packing densities of 65% and minimum packing densities of 34%. Simulation data are analyzed using linear regression via the R statistical language to obtain two equations: one that describes porosity reduction based on all cylinder and particle characteristics, and another that focuses on describing porosity reduction based on cylinder diameter for 2 and 20 nm particles into pores of 100 nm height. Simulation results are similar to most physical results obtained from MIP and WLR. Some MIP results do not fall within the simulation limits; however, this is expected as MIP has been documented to be an inaccurate measure of pore distribution and porosity of concrete. Despite the disagreement between WLR and MIP, there is a trend that porosity reduction is higher two inches from the rebar as compared to the rebar-concrete interface. The simulation also detects a higher porosity reduction further from the rebar. This may be due to particles

  13. Inference with Constrained Hidden Markov Models in PRISM

    CERN Document Server

    Christiansen, Henning; Lassen, Ole Torp; Petit, Matthieu

    2010-01-01

A Hidden Markov Model (HMM) is a common statistical model which is widely used for the analysis of biological sequence data and other sequential phenomena. In the present paper we show how HMMs can be extended with side-constraints and present constraint solving techniques for efficient inference. Defining HMMs with side-constraints in Constraint Logic Programming has advantages in terms of more compact expression and pruning opportunities during inference. We present a PRISM-based framework for extending HMMs with side-constraints and show how well-known constraints such as cardinality and all-different are integrated. We experimentally validate our approach on the biologically motivated problem of global pairwise alignment.
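
    The PRISM programs themselves are not reproduced here; the NumPy sketch below illustrates the general idea of exact decoding under a side-constraint by augmenting the Viterbi dynamic-programming state with a counter, here a cardinality constraint requiring exactly K visits to one state. All parameters are invented.

    import numpy as np

    def constrained_viterbi(obs, pi, A, B, K):
        # Most likely state path that visits state 1 exactly K times. The DP
        # state is (HMM state, number of visits so far): the usual way a
        # cardinality side-constraint is folded into exact inference.
        n, T = A.shape[0], len(obs)
        delta = np.full((n, K + 1), -np.inf)            # best log-prob so far
        back = np.zeros((T, n, K + 1, 2), dtype=int)    # back-pointers
        for s in range(n):
            c = 1 if s == 1 else 0
            if c <= K:
                delta[s, c] = np.log(pi[s]) + np.log(B[s, obs[0]])
        for t in range(1, T):
            new = np.full((n, K + 1), -np.inf)
            for s in range(n):
                inc = 1 if s == 1 else 0
                for c in range(inc, K + 1):
                    scores = delta[:, c - inc] + np.log(A[:, s])
                    prev = int(scores.argmax())
                    val = scores[prev] + np.log(B[s, obs[t]])
                    if val > new[s, c]:
                        new[s, c] = val
                        back[t, s, c] = (prev, c - inc)
            delta = new
        s, c = int(delta[:, K].argmax()), K             # end with exactly K visits
        path = [s]
        for t in range(T - 1, 0, -1):
            s, c = back[t, s, c]
            path.append(int(s))
        return list(reversed(path))

    pi = np.array([0.6, 0.4])
    A = np.array([[0.7, 0.3], [0.4, 0.6]])
    B = np.array([[0.9, 0.1], [0.2, 0.8]])              # 2 states x 2 symbols
    print(constrained_viterbi([0, 1, 1, 0, 1], pi, A, B, K=2))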

  14. Modelling supported driving as an optimal control cycle: Framework and model characteristics

    CERN Document Server

    Wang, Meng; Daamen, Winnie; Hoogendoorn, Serge P; van Arem, Bart

    2014-01-01

    Driver assistance systems support drivers in operating vehicles in a safe, comfortable and efficient way, and thus may induce changes in traffic flow characteristics. This paper puts forward a receding horizon control framework to model driver assistance and cooperative systems. The accelerations of automated vehicles are controlled to optimise a cost function, assuming other vehicles driving at stationary conditions over a prediction horizon. The flexibility of the framework is demonstrated with controller design of Adaptive Cruise Control (ACC) and Cooperative ACC (C-ACC) systems. The proposed ACC and C-ACC model characteristics are investigated analytically, with focus on equilibrium solutions and stability properties. The proposed ACC model produces plausible human car-following behaviour and is unconditionally locally stable. By careful tuning of parameters, the ACC model generates similar stability characteristics as human driver models. The proposed C-ACC model results in convective downstream and abso...
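
    The paper's cost function and controller tuning are not given in the abstract; the sketch below only illustrates the receding-horizon idea for an ACC-like controller: candidate accelerations are scored over a short prediction horizon, assuming a constant-speed leader (mirroring the "stationary conditions" assumption), and the best one is applied before re-planning. The cost terms and weights are invented.

    import numpy as np

    def acc_receding_horizon(gap, v, v_lead, horizon=5.0, dt=0.5,
                             desired_time_gap=1.5, s0=2.0,
                             w_gap=1.0, w_speed=0.5, w_u=0.1):
        # Pick the constant acceleration minimizing a simple horizon cost.
        candidates = np.linspace(-3.0, 2.0, 51)     # admissible accelerations
        best_u, best_cost = 0.0, np.inf
        for u in candidates:
            g, s, cost = gap, v, 0.0
            for _ in range(int(horizon / dt)):
                s = max(0.0, s + u * dt)
                g += (v_lead - s) * dt
                desired_gap = s0 + desired_time_gap * s
                cost += (w_gap * (g - desired_gap) ** 2
                         + w_speed * (s - v_lead) ** 2
                         + w_u * u ** 2) * dt
            if cost < best_cost:
                best_u, best_cost = u, cost
        return best_u      # apply this step, then re-plan (receding horizon)

    print(acc_receding_horizon(gap=20.0, v=28.0, v_lead=25.0))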

  15. HPeak: an HMM-based algorithm for defining read-enriched regions in ChIP-Seq data

    Directory of Open Access Journals (Sweden)

    Maher Christopher A

    2010-07-01

Background: Protein-DNA interaction constitutes a basic mechanism for the genetic regulation of target gene expression. Deciphering this mechanism has been a daunting task due to the difficulty in characterizing protein-bound DNA on a large scale. A powerful technique has recently emerged that couples chromatin immunoprecipitation (ChIP) with next-generation sequencing (ChIP-Seq). This technique provides a direct survey of the cistrome of transcription factors and other chromatin-associated proteins. In order to realize the full potential of this technique, increasingly sophisticated statistical algorithms have been developed to analyze the massive amount of data generated by this method. Results: Here we introduce HPeak, a Hidden Markov model (HMM)-based peak-finding algorithm for analyzing ChIP-Seq data to identify protein-interacting genomic regions. In contrast to the majority of available ChIP-Seq analysis software packages, HPeak is a model-based approach allowing for rigorous statistical inference. This approach enables HPeak to accurately infer genomic regions enriched with sequence reads by assuming realistic probability distributions, in conjunction with a novel weighting scheme on the sequencing read coverage. Conclusions: Using biologically relevant data collections, we found that HPeak showed a higher prevalence of the expected transcription factor binding motifs in ChIP-enriched sequences relative to the control sequences when compared to other currently available ChIP-Seq analysis approaches. Additionally, in comparison to the ChIP-chip assay, ChIP-Seq provides higher resolution along with improved sensitivity and specificity of binding site detection. The HPeak program and an additional file are freely available at http://www.sph.umich.edu/csg/qin/HPeak.
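
    HPeak's actual emission model, parameters and weighting scheme are described in the paper; the toy sketch below only shows the underlying idea of a two-state (background versus enriched) HMM over binned read counts, using Poisson emissions and a scaled forward-backward pass to obtain per-bin posterior probabilities of enrichment. All counts and rates are invented.

    import numpy as np
    from scipy.stats import poisson

    def enriched_posterior(counts, lam_bg=2.0, lam_en=12.0, p_stay=0.95):
        # Posterior P(enriched | counts) per bin for a 2-state Poisson HMM.
        counts = np.asarray(counts)
        T = len(counts)
        A = np.array([[p_stay, 1 - p_stay], [1 - p_stay, p_stay]])
        pi = np.array([0.9, 0.1])                       # background, enriched
        E = np.vstack([poisson.pmf(counts, lam_bg),
                       poisson.pmf(counts, lam_en)]).T  # T x 2 emission likelihoods

        alpha = np.zeros((T, 2))
        beta = np.zeros((T, 2))
        alpha[0] = pi * E[0]
        alpha[0] /= alpha[0].sum()
        for t in range(1, T):                           # scaled forward pass
            alpha[t] = (alpha[t - 1] @ A) * E[t]
            alpha[t] /= alpha[t].sum()
        beta[-1] = 1.0
        for t in range(T - 2, -1, -1):                  # scaled backward pass
            beta[t] = A @ (E[t + 1] * beta[t + 1])
            beta[t] /= beta[t].sum()
        gamma = alpha * beta
        return (gamma / gamma.sum(axis=1, keepdims=True))[:, 1]

    counts = [1, 3, 2, 0, 9, 14, 11, 13, 2, 1, 0, 2]
    print(np.round(enriched_posterior(counts), 2))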

  16. Digital Moon: A three-dimensional framework for lunar modeling

    Science.gov (United States)

    Paige, D. A.; Elphic, R. C.; Foote, E. J.; Meeker, S. R.; Siegler, M. A.; Vasavada, A. R.

    2009-12-01

    The Moon has a complex three-dimensional shape with significant large-scale and small-scale topographic relief. The Moon’s topography largely controls the distribution of incident solar radiation, as well as the scattered solar and infrared radiation fields. Topography also affects the Moon’s interaction with the space environment, its magnetic field, and the propagation of seismic waves. As more extensive and detailed lunar datasets become available, there is an increasing need to interpret and compare them with the results of physical models in a fully three-dimensional context. We have developed a three-dimensional framework for lunar modeling we call the Digital Moon. The goal of this work is to enable high fidelity physical modeling and visualization of the Moon in a parallel computing environment. The surface of the Moon is described by a continuous triangular mesh of arbitrary shape and spatial scale. For regions of limited geographic extent, it is convenient to employ meshes on a rectilinear grid. However for global-scale modeling, we employ a continuous geodesic gridding scheme (Teanby, 2008). Each element in the mesh surface is allowed to have a unique set of physical properties. Photon and particle interactions between mesh elements are modeled using efficient ray tracing algorithms. Heat, mass, photon and particle transfer within each mesh element are modeled in one dimension. Each compute node is assigned a portion of the mesh and collective interactions between elements are handled through network interfaces. We have used the model to calculate lunar surface and subsurface temperatures that can be compared directly with radiometric temperatures measured by the Diviner Lunar Radiometer Experiment on the Lunar Reconnaissance Orbiter. The model includes realistic surface photometric functions based on goniometric measurements of lunar soil samples (Foote and Paige, 2009), and one-dimensional thermal models based on lunar remote sensing and Apollo

  17. Modeling and simulation of complex systems a framework for efficient agent-based modeling and simulation

    CERN Document Server

    Siegfried, Robert

    2014-01-01

    Robert Siegfried presents a framework for efficient agent-based modeling and simulation of complex systems. He compares different approaches for describing structure and dynamics of agent-based models in detail. Based on this evaluation the author introduces the "General Reference Model for Agent-based Modeling and Simulation" (GRAMS). Furthermore he presents parallel and distributed simulation approaches for execution of agent-based models -from small scale to very large scale. The author shows how agent-based models may be executed by different simulation engines that utilize underlying hard

  18. Multi-scale feature based double-layer HMM and its application in behavior recognition

    Institute of Scientific and Technical Information of China (English)

    梅雪; 胡石; 许松松; 张继法

    2012-01-01

Drawing on the multi-scale, multi-resolution nature of human visual perception, and aiming at human movement behavior recognition in intelligent video surveillance systems, a double-layer hidden Markov model (DL-HMM) is developed based on multi-scale behavior features. Reflecting the characteristics of human behavior, the number of HMM states is set according to the number of selected key postures. To exploit the relationships between the multi-scale structures hidden in human movement, two features at different scales, the motion trajectory and the wavelet moments of the body posture edge, are applied in the two layers of the DL-HMM, providing richer cross-scale information about the behavior. Experiments on the Weizmann human behavior database and on indoor video recorded by the authors show that a five-state HMM better matches the characteristics of human motion, and that the five-state DL-HMM based on multi-scale features achieves a higher recognition rate than traditional methods using a single-layer HMM.

  19. Instant e-Teaching Framework Model for Live Online Teaching

    Directory of Open Access Journals (Sweden)

    Suhailan Safei

    2011-03-01

Instant e-Teaching is a new concept that supplements the e-Teaching and e-Learning environment in providing full and comprehensive modern education styles. E-Learning technology enables self-learning among students on a certain subject using online references and materials, whereas instant e-Teaching requires a 'face-to-face' characteristic between teacher and student, who simultaneously execute actions and gain instant responses. The word instant extends e-Teaching with the concept of real-time teaching. The challenge of online and instant teaching lies not merely in the technologies and system efficiency; the system must also be usable and friendly enough to replicate the traditional classroom environment during delivery of the class. For this purpose, an instant e-Teaching framework has been developed that emulates a dedicated virtual classroom and is primarily designed for synchronous, live sharing of current teaching notes. The model has been demonstrated using an Arabic recitation teaching prototype and evaluated from professional users' perspectives.

  20. A graphical model framework for decoding in the visual ERP-based BCI speller

    NARCIS (Netherlands)

    Martens, S.M.M.; Mooij, J.M.; Hill, N.J.; Farquhar, J.D.R.; Schölkopf, B.

    2011-01-01

    We present a graphical model framework for decoding in the visual ERP-based speller system. The proposed framework allows researchers to build generative models from which the decoding rules are obtained in a straightforward manner. We suggest two models for generating brain signals conditioned on

  1. Adaptive invasive species distribution models: A framework for modeling incipient invasions

    Science.gov (United States)

    Uden, Daniel R.; Allen, Craig R.; Angeler, David G.; Corral, Lucia; Fricke, Kent A.

    2015-01-01

    The utilization of species distribution model(s) (SDM) for approximating, explaining, and predicting changes in species’ geographic locations is increasingly promoted for proactive ecological management. Although frameworks for modeling non-invasive species distributions are relatively well developed, their counterparts for invasive species—which may not be at equilibrium within recipient environments and often exhibit rapid transformations—are lacking. Additionally, adaptive ecological management strategies address the causes and effects of biological invasions and other complex issues in social-ecological systems. We conducted a review of biological invasions, species distribution models, and adaptive practices in ecological management, and developed a framework for adaptive, niche-based, invasive species distribution model (iSDM) development and utilization. This iterative, 10-step framework promotes consistency and transparency in iSDM development, allows for changes in invasive drivers and filters, integrates mechanistic and correlative modeling techniques, balances the avoidance of type 1 and type 2 errors in predictions, encourages the linking of monitoring and management actions, and facilitates incremental improvements in models and management across space, time, and institutional boundaries. These improvements are useful for advancing coordinated invasive species modeling, management and monitoring from local scales to the regional, continental and global scales at which biological invasions occur and harm native ecosystems and economies, as well as for anticipating and responding to biological invasions under continuing global change.

  2. Conceptual Frameworks and Research Models on Resilience in Leadership

    Directory of Open Access Journals (Sweden)

    Janet Ledesma

    2014-08-01

The purpose of this article was to discuss conceptual frameworks and research models on resilience theory. The constructs of resilience, the history of resilience theory, models of resilience, variables of resilience, career resilience, and organizational resilience will be examined and discussed as they relate to leadership development. The literature demonstrates that there is a direct relationship between the stress of the leader’s job and his or her ability to maintain resilience in the face of prolonged contact with adversity. This article discusses resilience theory as it relates to leadership development. The concepts associated with resilience, which include thriving and hardiness, are explored with the belief that resilient leaders are invaluable to the sustainability of an organization. In addition, the constructs of resilience and the history of resilience studies in the fields of psychiatry, developmental psychopathology, human development, medicine, epidemiology, and the social sciences are examined. Survival, recovery, and thriving are concepts associated with resilience and describe the stage at which a person may be during or after facing adversity. The concept of “thriving” refers to a person’s ability to go beyond his or her original level of functioning and to grow and function despite repeated exposure to stressful experiences. The literature suggests a number of variables that characterize resilience and thriving. These variables include positive self-esteem, hardiness, strong coping skills, a sense of coherence, self-efficacy, optimism, strong social resources, adaptability, risk-taking, low fear of failure, determination, perseverance, and a high tolerance of uncertainty. These are reviewed in this article. The findings in this article suggest that those who develop leaders need to create safe environments to help emerging and existing leaders thrive as individuals and as organizational leaders in the area of resilience

  3. A novel HMM distributed classifier for the detection of gait phases by means of a wearable inertial sensor network.

    Science.gov (United States)

    Taborri, Juri; Rossi, Stefano; Palermo, Eduardo; Patanè, Fabrizio; Cappa, Paolo

    2014-09-02

    In this work, we decided to apply a hierarchical weighted decision, proposed and used in other research fields, for the recognition of gait phases. The developed and validated novel distributed classifier is based on hierarchical weighted decision from outputs of scalar Hidden Markov Models (HMM) applied to angular velocities of foot, shank, and thigh. The angular velocities of ten healthy subjects were acquired via three uni-axial gyroscopes embedded in inertial measurement units (IMUs) during one walking task, repeated three times, on a treadmill. After validating the novel distributed classifier and scalar and vectorial classifiers-already proposed in the literature, with a cross-validation, classifiers were compared for sensitivity, specificity, and computational load for all combinations of the three targeted anatomical segments. Moreover, the performance of the novel distributed classifier in the estimation of gait variability in terms of mean time and coefficient of variation was evaluated. The highest values of specificity and sensitivity (>0.98) for the three classifiers examined here were obtained when the angular velocity of the foot was processed. Distributed and vectorial classifiers reached acceptable values (>0.95) when the angular velocity of shank and thigh were analyzed. Distributed and scalar classifiers showed values of computational load about 100 times lower than the one obtained with the vectorial classifier. In addition, distributed classifiers showed an excellent reliability for the evaluation of mean time and a good/excellent reliability for the coefficient of variation. In conclusion, due to the better performance and the small value of computational load, the here proposed novel distributed classifier can be implemented in the real-time application of gait phases recognition, such as to evaluate gait variability in patients or to control active orthoses for the recovery of mobility of lower limb joints.

  4. A Novel HMM Distributed Classifier for the Detection of Gait Phases by Means of a Wearable Inertial Sensor Network

    Directory of Open Access Journals (Sweden)

    Juri Taborri

    2014-09-01

    Full Text Available In this work, we decided to apply a hierarchical weighted decision, proposed and used in other research fields, for the recognition of gait phases. The developed and validated novel distributed classifier is based on hierarchical weighted decision from outputs of scalar Hidden Markov Models (HMM) applied to angular velocities of foot, shank, and thigh. The angular velocities of ten healthy subjects were acquired via three uni-axial gyroscopes embedded in inertial measurement units (IMUs) during one walking task, repeated three times, on a treadmill. After validating, with a cross-validation, the novel distributed classifier and the scalar and vectorial classifiers already proposed in the literature, the classifiers were compared for sensitivity, specificity, and computational load for all combinations of the three targeted anatomical segments. Moreover, the performance of the novel distributed classifier in the estimation of gait variability in terms of mean time and coefficient of variation was evaluated. The highest values of specificity and sensitivity (>0.98) for the three classifiers examined here were obtained when the angular velocity of the foot was processed. Distributed and vectorial classifiers reached acceptable values (>0.95) when the angular velocity of shank and thigh were analyzed. Distributed and scalar classifiers showed values of computational load about 100 times lower than the one obtained with the vectorial classifier. In addition, distributed classifiers showed an excellent reliability for the evaluation of mean time and a good/excellent reliability for the coefficient of variation. In conclusion, due to the better performance and the small value of computational load, the novel distributed classifier proposed here can be implemented in real-time gait phase recognition applications, for example to evaluate gait variability in patients or to control active orthoses for the recovery of mobility of lower limb joints.

  5. A Novel HMM Distributed Classifier for the Detection of Gait Phases by Means of a Wearable Inertial Sensor Network

    Science.gov (United States)

    Taborri, Juri; Rossi, Stefano; Palermo, Eduardo; Patanè, Fabrizio; Cappa, Paolo

    2014-01-01

    In this work, we decided to apply a hierarchical weighted decision, proposed and used in other research fields, for the recognition of gait phases. The developed and validated novel distributed classifier is based on hierarchical weighted decision from outputs of scalar Hidden Markov Models (HMM) applied to angular velocities of foot, shank, and thigh. The angular velocities of ten healthy subjects were acquired via three uni-axial gyroscopes embedded in inertial measurement units (IMUs) during one walking task, repeated three times, on a treadmill. After validating, with a cross-validation, the novel distributed classifier and the scalar and vectorial classifiers already proposed in the literature, the classifiers were compared for sensitivity, specificity, and computational load for all combinations of the three targeted anatomical segments. Moreover, the performance of the novel distributed classifier in the estimation of gait variability in terms of mean time and coefficient of variation was evaluated. The highest values of specificity and sensitivity (>0.98) for the three classifiers examined here were obtained when the angular velocity of the foot was processed. Distributed and vectorial classifiers reached acceptable values (>0.95) when the angular velocity of shank and thigh were analyzed. Distributed and scalar classifiers showed values of computational load about 100 times lower than the one obtained with the vectorial classifier. In addition, distributed classifiers showed an excellent reliability for the evaluation of mean time and a good/excellent reliability for the coefficient of variation. In conclusion, due to the better performance and the small value of computational load, the novel distributed classifier proposed here can be implemented in real-time gait phase recognition applications, for example to evaluate gait variability in patients or to control active orthoses for the recovery of mobility of lower limb joints. PMID:25184488

  6. Analysis of Decision Trees in Context Clustering of Hidden Markov Model Based Thai Speech Synthesis

    Directory of Open Access Journals (Sweden)

    Suphattharachai Chomphan

    2011-01-01

    Full Text Available Problem statement: In Thai speech synthesis using a Hidden Markov model (HMM) based synthesis system, the tonal speech quality is degraded due to tone distortion. This major problem must be treated appropriately to preserve the tone characteristics of each syllable unit, since tone brings about the intelligibility of the synthesized speech. It is therefore necessary to establish the tone questions and other phonetic questions in the tree-based context clustering process accordingly. Approach: This study describes the analysis of questions in the tree-based context clustering process of an HMM-based speech synthesis system for the Thai language. In the system, spectrum, pitch or F0, and state duration are modeled simultaneously in a unified HMM framework, and their parameter distributions are clustered independently by using a decision-tree based context clustering technique. The contextual factors which affect spectrum, pitch and duration, i.e., part of speech, position and number of phones in a syllable, position and number of syllables in a word, position and number of words in a sentence, phone type and tone type, are taken into account for constructing the questions of the decision tree. All in all, thirteen sets of questions are analyzed in comparison. Results: In the experiment, we analyzed the decision trees by counting the number of questions in each node coming from those thirteen sets and by calculating the dominance score given to each question as the reciprocal of the distance from the root node to the question node. The highest count and dominance score belong to the phonetic-type question set, while the second and third highest belong to the part-of-speech and tone-type sets. Conclusion: By counting the number of questions in each node and calculating the dominance score, we can set the priority of each question set. All in all, the analysis results bring about further development of Thai speech synthesis with efficient context clustering process in
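
    The dominance score described above can be computed with a simple tree walk; the sketch below assumes hypothetical node objects carrying question_set and children fields, and scores the root at distance one to avoid division by zero.

        # Sketch: count questions per question set and accumulate dominance scores
        # (reciprocal of the distance from the root) over a clustering decision tree.
        from collections import defaultdict

        def analyze_tree(root):
            counts, dominance = defaultdict(int), defaultdict(float)
            stack = [(root, 1)]                      # root taken as distance 1 (assumption)
            while stack:
                node, dist = stack.pop()
                if node.question_set is not None:    # leaf nodes carry no question
                    counts[node.question_set] += 1
                    dominance[node.question_set] += 1.0 / dist
                stack.extend((child, dist + 1) for child in node.children)
            return counts, dominance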

  7. An Enhanced Informed Watermarking Scheme Using the Posterior Hidden Markov Model

    Directory of Open Access Journals (Sweden)

    Chuntao Wang

    2014-01-01

    Full Text Available Designing a practical watermarking scheme with high robustness, feasible imperceptibility, and large capacity remains one of the most important research topics in robust watermarking. This paper presents a posterior hidden Markov model (HMM)-based informed image watermarking scheme, which well enhances the practicability of the prior-HMM-based informed watermarking with favorable robustness, imperceptibility, and capacity. To make the encoder and decoder use the (nearly) identical posterior HMM, each cover image at the encoder and each received image at the decoder are attacked with JPEG compression at an equivalently small quality factor (QF). The attacked images are then employed to estimate HMM parameter sets for both the encoder and decoder, respectively. Numerical simulations show that a small QF of 5 is an optimum setting for practical use. Based on this posterior HMM, we develop an enhanced posterior-HMM-based informed watermarking scheme. Extensive experimental simulations show that the proposed scheme is comparable to its prior counterpart in which the HMM is estimated with the original image, but it avoids the transmission of the prior HMM from the encoder to the decoder. This thus well enhances the practical application of HMM-based informed watermarking systems. Also, it is demonstrated that the proposed scheme has the robustness comparable to the state-of-the-art with significantly reduced computation time.

  8. An enhanced informed watermarking scheme using the posterior hidden Markov model.

    Science.gov (United States)

    Wang, Chuntao

    2014-01-01

    Designing a practical watermarking scheme with high robustness, feasible imperceptibility, and large capacity remains one of the most important research topics in robust watermarking. This paper presents a posterior hidden Markov model (HMM-) based informed image watermarking scheme, which well enhances the practicability of the prior-HMM-based informed watermarking with favorable robustness, imperceptibility, and capacity. To make the encoder and decoder use the (nearly) identical posterior HMM, each cover image at the encoder and each received image at the decoder are attacked with JPEG compression at an equivalently small quality factor (QF). The attacked images are then employed to estimate HMM parameter sets for both the encoder and decoder, respectively. Numerical simulations show that a small QF of 5 is an optimum setting for practical use. Based on this posterior HMM, we develop an enhanced posterior-HMM-based informed watermarking scheme. Extensive experimental simulations show that the proposed scheme is comparable to its prior counterpart in which the HMM is estimated with the original image, but it avoids the transmission of the prior HMM from the encoder to the decoder. This thus well enhances the practical application of HMM-based informed watermarking systems. Also, it is demonstrated that the proposed scheme has the robustness comparable to the state-of-the-art with significantly reduced computation time.
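
    The shared-attack step described above can be reproduced with any JPEG codec; the sketch below uses Pillow (an assumption, not the authors' toolchain) to re-compress an image at the small quality factor suggested above, so that encoder and decoder can estimate nearly identical HMM parameters without transmitting the prior model.

        # Sketch: JPEG "pre-attack" at a small quality factor (QF = 5).
        # The wavelet-domain HMM would then be estimated from the returned image at both ends.
        import io
        from PIL import Image

        def posterior_reference(image, qf=5):
            """image: a PIL image (cover image at the encoder, received image at the decoder)."""
            buf = io.BytesIO()
            image.convert("L").save(buf, format="JPEG", quality=qf)
            buf.seek(0)
            return Image.open(buf)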

  9. Online handwriting recognition of Uyghur words using GMM and HMM

    Institute of Scientific and Technical Information of China (English)

    许辉; 热依曼吐尔逊; 吾守尔斯拉木

    2014-01-01

    This paper presents an online handwriting recognition system for Uyghur words based on a GMM and HMM dual-engine recognition model. In the GMM part, the system extracts 8-directional features, generates 8-directional pattern images, locates spatial sampling points, and extracts blurred directional features; after iterative refinement training of the model, the GMM model files are produced. In the HMM part, the system obtains the line-segment feature sequence by applying the line_segment_features method and, after iterative refinement training of the model, obtains the HMM model files. The GMM and HMM model files are packaged and encapsulated separately and then jointly packaged into a dictionary. In the first phase of the experiments, the recognition rate reached 97%; in the second phase, the recognition rate increased to 99%.

  10. 3D Geological Framework Models as a Teaching Aid for Geoscience

    Science.gov (United States)

    Kessler, H.; Ward, E.; Geological ModelsTeaching Project Team

    2010-12-01

    3D geological models have great potential as a resource for universities when teaching foundation geological concepts as they allow the student to visualise and interrogate UK geology. They are especially useful when dealing with the conversion of 2D field, map and GIS outputs into three dimensional geological units, which is a common problem for all students of geology. Today’s earth science students use a variety of skills and processes during their learning experience including the application of schemas, spatial thinking, image construction, detecting patterns, memorising figures, mental manipulation and interpretation, making predictions and deducing the orientation of themselves and the rocks. 3D geological models can reinforce spatial thinking strategies and encourage students to think about processes and properties, in turn helping the student to recognise pre-learnt geological principles in the field and to convert what they see at the surface into a picture of what is going on at depth. Learning issues faced by students may also be encountered by experts, policy managers, and stakeholders when dealing with environmental problems. Therefore, educational research on student learning in earth science may also improve environmental decision making. 3D geological framework models enhance the learning of Geosciences because they: ● enable a student to observe, manipulate and interpret geology; in particular the models instantly convert two-dimensional geology (maps, boreholes and cross-sections) into three dimensions which is a notoriously difficult geospatial skill to acquire. ● can be orientated to whatever the user finds comfortable and most aids recognition and interpretation. ● can be used either to teach geosciences to complete beginners or add to an experienced student's body of knowledge (whatever point that may be at). Models could therefore be packaged as a complete educational journey or students and tutor can select certain areas of the model

  11. Airline Sustainability Modeling: A New Framework with Application of Bayesian Structural Equation Modeling

    Directory of Open Access Journals (Sweden)

    Hashem Salarzadeh Jenatabadi

    2016-11-01

    Full Text Available There are many factors which could influence the sustainability of airlines. The main purpose of this study is to introduce a framework for a financial sustainability index and model it based on structural equation modeling (SEM) with maximum likelihood and Bayesian predictors. The introduced framework includes economic performance, operational performance, cost performance, and financial performance. Based on both Bayesian SEM (Bayesian-SEM) and Classical SEM (Classical-SEM), it was found that economic performance, together with both operational performance and cost performance, is significantly related to the financial performance index. The four mathematical indices employed to compare the efficiency of Bayesian-SEM and Classical-SEM in predicting the airline financial performance are root mean square error, coefficient of determination, mean absolute error, and mean absolute percentage error. The outputs confirmed that the framework with Bayesian prediction delivered a good fit with the data, although the framework predicted with a Classical-SEM approach did not produce a well-fitting model. The reasons for this discrepancy between Classical and Bayesian predictions, as well as the potential advantages and caveats with the application of the Bayesian approach in airline sustainability studies, are debated.
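
    The four comparison indices named above are standard; a small sketch of how they might be computed for either set of predictions is given below (variable names are illustrative, not taken from the study).

        # Sketch: RMSE, coefficient of determination, MAE and MAPE for a set of predictions.
        import numpy as np

        def fit_indices(y_true, y_pred):
            y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
            err = y_true - y_pred
            return {
                "RMSE": np.sqrt(np.mean(err ** 2)),
                "R2":   1.0 - np.sum(err ** 2) / np.sum((y_true - y_true.mean()) ** 2),
                "MAE":  np.mean(np.abs(err)),
                "MAPE": np.mean(np.abs(err / y_true)) * 100.0,   # assumes no zero targets
            }

        # e.g. fit_indices(observed_financial_index, bayesian_sem_predictions)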

  12. Legitimising neural network river forecasting models: a new data-driven mechanistic modelling framework

    Science.gov (United States)

    Mount, N. J.; Dawson, C. W.; Abrahart, R. J.

    2013-01-01

    In this paper we address the difficult problem of gaining an internal, mechanistic understanding of a neural network river forecasting (NNRF) model. Neural network models in hydrology have long been criticised for their black-box character, which prohibits adequate understanding of their modelling mechanisms and has limited their broad acceptance by hydrologists. In response, we here present a new, data-driven mechanistic modelling (DDMM) framework that incorporates an evaluation of the legitimacy of a neural network's internal modelling mechanism as a core element in the model development process. The framework is exemplified for two NNRF modelling scenarios, and uses a novel adaptation of first-order, partial-derivative, relative sensitivity analysis methods as the means by which each model's mechanistic legitimacy is explored. The results demonstrate the limitations of standard, goodness-of-fit validation procedures applied by NNRF modellers, by highlighting how the internal mechanisms of complex models that produce the best fit scores can have much lower legitimacy than simpler counterparts whose scores are only slightly inferior. The study emphasises the urgent need for better mechanistic understanding of neural network-based hydrological models and the further development of methods for elucidating their mechanisms.
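
    One way to approximate a first-order, partial-derivative, relative sensitivity analysis of the kind mentioned above is by central finite differences around a reference input; the sketch below is a generic illustration, not the authors' exact procedure, and predict stands for any trained forecasting model.

        # Sketch: relative sensitivity S_i = (dy/dx_i) * (x_i / y) for each model input.
        import numpy as np

        def relative_sensitivities(predict, x, eps=1e-4):
            x = np.asarray(x, dtype=float)
            y0 = predict(x)
            sens = np.zeros_like(x)
            for i in range(x.size):
                dx = np.zeros_like(x)
                dx[i] = eps * max(abs(x[i]), 1.0)
                dy_dx = (predict(x + dx) - predict(x - dx)) / (2.0 * dx[i])
                sens[i] = dy_dx * x[i] / y0      # dimensionless, relative measure
            return sens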

  13. Ensemble hidden Markov models with application to landmine detection

    Science.gov (United States)

    Hamdi, Anis; Frigui, Hichem

    2015-12-01

    We introduce an ensemble learning method for temporal data that uses a mixture of hidden Markov models (HMM). We hypothesize that the data are generated by K models, each of which reflects a particular trend in the data. The proposed approach, called ensemble HMM (eHMM), is based on clustering within the log-likelihood space and has two main steps. First, one HMM is fit to each of the N individual training sequences. For each fitted model, we evaluate the log-likelihood of each sequence. This results in an N-by-N log-likelihood distance matrix that will be partitioned into K groups using a relational clustering algorithm. In the second step, we learn the parameters of one HMM per cluster. We propose using and optimizing various training approaches for the different K groups depending on their size and homogeneity. In particular, we investigate the maximum likelihood (ML), the minimum classification error (MCE), and the variational Bayesian (VB) training approaches. Finally, to test a new sequence, its likelihood is computed in all the models and a final confidence value is assigned by combining the models' outputs using an artificial neural network. We propose both discrete and continuous versions of the eHMM. Our approach was evaluated on a real-world application for landmine detection using ground-penetrating radar (GPR). Results show that both the continuous and discrete eHMM can identify meaningful and coherent HMM mixture components that describe different properties of the data. Each HMM mixture component models a group of data that share common attributes. These attributes are reflected in the mixture model's parameters. The results indicate that the proposed method outperforms the baseline HMM that uses one model for each class in the data.
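
    The first eHMM step (one HMM per training sequence plus the N-by-N log-likelihood matrix) can be sketched as below, assuming hmmlearn for the individual models; the relational clustering, per-cluster retraining (ML/MCE/VB) and neural-network fusion stages are omitted.

        # Sketch: fit one HMM per training sequence and score every sequence under every model.
        import numpy as np
        from hmmlearn.hmm import GaussianHMM

        def loglik_matrix(sequences, n_states=3):
            """sequences: list of (T_i, d) observation arrays, one per training sequence."""
            models = []
            for seq in sequences:
                m = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
                m.fit(seq)
                models.append(m)
            L = np.zeros((len(sequences), len(sequences)))
            for i, m in enumerate(models):
                for j, seq in enumerate(sequences):
                    L[i, j] = m.score(seq)       # log-likelihood of sequence j under model i
            return L                             # later partitioned into K clusters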

  14. Designing a framework to design a business model for the 'bottom of the pyramid' population

    NARCIS (Netherlands)

    Ver loren van Themaat, Tanye; Schutte, Cornelius S.L.; Lutters, Diederick

    2013-01-01

    This article presents a framework for developing and designing a business model to target the bottom of the pyramid (BoP) population. Using blue ocean strategy and business model literature, integrated with research on the BoP, the framework offers a systematic approach for organisations to analyse

  15. Designing a framework to design a business model for the 'bottom of the pyramid' population

    NARCIS (Netherlands)

    Ver loren van Themaat, Tanye; Schutte, Cornelius S.L.; Lutters, Eric

    2013-01-01

    This article presents a framework for developing and designing a business model to target the bottom of the pyramid (BoP) population. Using blue ocean strategy and business model literature, integrated with research on the BoP, the framework offers a systematic approach for organisations to analyse

  16. A modeling framework for the evolution and spread of antibiotic resistance: literature review and model categorization.

    Science.gov (United States)

    Spicknall, Ian H; Foxman, Betsy; Marrs, Carl F; Eisenberg, Joseph N S

    2013-08-15

    Antibiotic-resistant infections complicate treatment and increase morbidity and mortality. Mathematical modeling has played an integral role in improving our understanding of antibiotic resistance. In these models, parameter sensitivity is often assessed, while model structure sensitivity is not. To examine the implications of this, we first reviewed the literature on antibiotic-resistance modeling published between 1993 and 2011. We then classified each article's model structure into one or more of 6 categories based on the assumptions made in those articles regarding within-host and population-level competition between antibiotic-sensitive and antibiotic-resistant strains. Each model category has different dynamic implications with respect to how antibiotic use affects resistance prevalence, and therefore each may produce different conclusions about optimal treatment protocols that minimize resistance. Thus, even if all parameter values are correctly estimated, inferences may be incorrect because of the incorrect selection of model structure. Our framework provides insight into model selection.

  17. Gear Fault Pattern Recognition Based on the Optimum Wavelet Packet Decomposition and HMM

    Institute of Scientific and Technical Information of China (English)

    郑思来; 王细洋

    2014-01-01

    The key point of gear fault pattern recognition is the extraction of fault features from the vibration signal. Aiming at feature extraction for gear fault pattern recognition, a method based on the Optimum Wavelet Packet Decomposition (OWPD) and the Hidden Markov Model (HMM) is proposed in this paper. The collected vibration signals are decomposed with the wavelet packet, the characteristic energies automatically selected by OWPD are used to construct feature vectors, and these are employed as the input to the HMM for training and testing, realizing gear fault pattern recognition. Finally, the effectiveness and accuracy of the new method are validated by experiments.
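
    A minimal sketch of such a feature extraction and classification pipeline is given below, assuming PyWavelets and pre-trained hmmlearn models; the wavelet, decomposition level and frame length are illustrative, and the optimum-packet selection of OWPD is simplified here to keeping all terminal nodes.

        # Sketch: frame-wise wavelet-packet energy features, scored by per-fault-class HMMs.
        import numpy as np
        import pywt

        def energy_feature_sequence(signal, frame_len=1024, wavelet="db4", level=3):
            feats = []
            for start in range(0, len(signal) - frame_len + 1, frame_len):
                wp = pywt.WaveletPacket(signal[start:start + frame_len], wavelet, maxlevel=level)
                e = np.array([np.sum(node.data ** 2) for node in wp.get_level(level, "natural")])
                feats.append(e / e.sum())                # normalized packet energies
            return np.asarray(feats)                     # shape (n_frames, 2**level)

        def classify(feature_seq, models):
            """models: dict mapping a fault label to a trained hmmlearn GaussianHMM."""
            return max(models, key=lambda label: models[label].score(feature_seq))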

  18. A Framework Model for an Order Fulfillment System Based on Service Oriented Architecture

    Institute of Scientific and Technical Information of China (English)

    YANG Li-xi; LI Shi-qi

    2008-01-01

    To effectively implement order fulfillment, we present an integrated framework model focusing on the whole process of order fulfillment. Firstly, five aims of the OFS (order fulfillment system) are established. Then, after discussing three major processes of order fulfillment, we summarize the functional and quality attributes of the OFS. Subsequently, we investigate SOA (Service Oriented Architecture) and present an SOA meta-model to serve as an integrated framework and to fulfill the quality requirements. Moreover, based on the SOA meta-model, we construct a conceptual framework model that aims to conveniently integrate other functions from different systems into the order fulfillment system. This model offers enterprises a new approach to implementing order fulfillment.

  19. Model-Based Reasoning in the Upper-Division Physics Laboratory: Framework and Initial Results

    CERN Document Server

    Zwickl, Benjamin M; Finkelstein, Noah; Lewandowski, H J

    2014-01-01

    Constructing and using models are core scientific practices that have gained significant attention within K-12 and higher education. Although modeling is a broadly applicable process, within physics education, it has been preferentially applied to the iterative development of broadly applicable principles (e.g., Newton's laws of motion in introductory mechanics). We review and extend existing frameworks on modeling to develop a new framework that more naturally describes model-based reasoning in upper-division physics labs. A significant feature of the new framework is that measurement tools (in addition to the physical system being studied) are subjected to the process of modeling. Think-aloud interviews were used to document examples of model-based reasoning in the laboratory and refine the modeling framework. The interviews showed how students productively applied similar facets of modeling to the physical system and measurement tools: construction, prediction, interpretation of data, identification of mod...

  20. Computer-aided modeling framework for efficient model development, analysis and identification

    DEFF Research Database (Denmark)

    Heitzig, Martina; Sin, Gürkan; Sales Cruz, Mauricio;

    2011-01-01

    Model-based computer aided product-process engineering has attained increased importance in a number of industries, including pharmaceuticals, petrochemicals, fine chemicals, polymers, biotechnology, food, energy, and water. This trend is set to continue due to the substantial benefits computer-aided methods introduce. The key prerequisite of computer-aided product-process engineering is however the availability of models of different types, forms, and application modes. The development of the models required for the systems under investigation tends to be a challenging and time-consuming task... The methodology consists of two branches; the first branch deals with single-scale model development while the second branch introduces features for multiscale model development to the methodology. In this paper, the emphasis is on the single-scale model development and application part. The modeling framework and the supported stepwise...

  1. A unified framework for modeling landscape evolution by discrete flows

    Science.gov (United States)

    Shelef, Eitan; Hilley, George E.

    2016-05-01

    Topographic features such as branched valley networks and undissected convex-up hillslopes are observed in disparate physical environments. In some cases, these features are formed by sediment transport processes that occur discretely in space and time, while in others, by transport processes that are uniformly distributed across the landscape. This paper presents an analytical framework that reconciles the basic attributes of such sediment transport processes with the topographic features that they form and casts those in terms that are likely common to different physical environments. In this framework, temporal changes in surface elevation reflect the frequency with which the landscape is traversed by geophysical flows generated discretely in time and space. This frequency depends on the distance to which flows travel downslope, which depends on the dynamics of individual flows, the lithologic and topographic properties of the underlying substrate, and the coevolution of topography, erosion, and the routing of flows over the topographic surface. To explore this framework, we postulate simple formulations for sediment transport and flow runout distance and demonstrate that the conditions for hillslope and channel network formation can be cast in terms of fundamental parameters such as distance from drainage divide and a friction-like coefficient that describes a flow's resistance to motion. The framework we propose is intentionally general, but the postulated formulas can be substituted with those that aim to describe a specific process and to capture variations in the size distribution of such flow events.

  2. Framework for Modelling Multiple Input Complex Aggregations for Interactive Installations

    DEFF Research Database (Denmark)

    Padfield, Nicolas; Andreasen, Troels

    2012-01-01

    We describe a generalized framework as a method and design tool for creating interactive installations with a demand for exploratory meaning creation, not limited to the design stage, but extending into the stage where the installation meets participants and audience. The proposed solution is bas...

  3. Parametric design and analysis framework with integrated dynamic models

    DEFF Research Database (Denmark)

    Negendahl, Kristoffer

    2014-01-01

    of building energy and indoor environment, are generally confined to late in the design process. Consequence based design is a framework intended for the early design stage. It involves interdisciplinary expertise that secures validity and quality assurance with a simulationist while sustaining autonomous...

  4. Holland's RIASEC Model as an Integrative Framework for Individual Differences

    Science.gov (United States)

    Armstrong, Patrick Ian; Day, Susan X.; McVay, Jason P.; Rounds, James

    2008-01-01

    Using data from published sources, the authors investigated J. L. Holland's (1959, 1997) theory of interest types as an integrative framework for organizing individual differences variables that are used in counseling psychology. Holland's interest types were used to specify 2- and 3-dimensional interest structures. In Study 1, measures of…

  6. Total Quality Management (TQM framework for e-learning based on EFQM and Kirkpatrick models

    Directory of Open Access Journals (Sweden)

    Jeanne Schreurs

    2006-07-01

    Full Text Available The EFQM excellence model is a famous quality management tool. We have translated it to be useful in e-learning quality management. EFQM will be used as a framework for self-evaluation. We developed the e-learning stakeholder model. We identified the main criteria and positioned them in the stakeholder model. We briefly present the Kirkpatrick evaluation model of e-learning. We developed a Kirkpatrick-EFQM self-assessment framework. We propose a limited, learner-centric self-assessment framework. A preliminary set of quality criteria has been identified for self-assessment by the learners.

  7. Software Process Improvement Framework Based on CMMI Continuous Model Using QFD

    Directory of Open Access Journals (Sweden)

    Yonghui Cao

    2013-01-01

    Full Text Available In an era of rapid technological innovation and change, the key to a company's survival is the continuous improvement of its processes. In this paper, we introduce Software Process Improvement (SPI) and Quality Function Deployment (QFD); and, to combine the staged model and the continuous model in CMMI, the Software Process Improvement framework with CMMI has two parts: 1) an SPI framework with the CMMI staged model based on QFD and 2) an SPI framework with the CMMI continuous model based on QFD. Finally, we also draw conclusions.

  8. From Principles to Details: Integrated Framework for Architecture Modelling of Large Scale Software Systems

    Directory of Open Access Journals (Sweden)

    Andrzej Zalewski

    2013-06-01

    Full Text Available There exist numerous models of software architecture (box models, ADLs, UML, architectural decisions), architecture modelling frameworks (views, enterprise architecture frameworks) and even standards recommending practice for the architectural description. We show in this paper that there is still a gap between these rather abstract frameworks/standards and existing architecture models. Frameworks and standards define what should be modelled rather than which models should be used and how these models are related to each other. We intend to prove that a less abstract modelling framework is needed for the effective modelling of large scale software intensive systems. It should provide more precise guidance on the kinds of models to be employed and on how they should relate to each other. The paper defines principles that can serve as a basis for an integrated model. Finally, the structure of such a model has been proposed. It comprises three layers: the upper one – architectural policy – reflects corporate policy and strategies in architectural terms, the middle one – system organisation pattern – represents the core structural concepts and their rationale at a given level of scope, the lower one contains detailed architecture models. Architectural decisions play an important role here: they model the core architectural concepts explaining the detailed models, and they organise the entire integrated model and the relations between its submodels.

  9. System modeling with the DISC framework: evidence from safety-critical domains.

    Science.gov (United States)

    Reiman, Teemu; Pietikäinen, Elina; Oedewald, Pia; Gotcheva, Nadezhda

    2012-01-01

    The objective of this paper is to illustrate the development and application of the Design for Integrated Safety Culture (DISC) framework for system modeling by evaluating organizational potential for safety in nuclear and healthcare domains. The DISC framework includes criteria for good safety culture and a description of functions that the organization needs to implement in order to orient the organization toward the criteria. Three case studies will be used to illustrate the utilization of the DISC framework in practice.

  10. Experimental development based on mapping rule between requirements analysis model and web framework specific design model.

    Science.gov (United States)

    Okuda, Hirotaka; Ogata, Shinpei; Matsuura, Saeko

    2013-12-01

    Model Driven Development is a promising approach to developing high-quality software systems. We have proposed a method of model-driven requirements analysis using the Unified Modeling Language (UML). The main feature of our method is to automatically generate a Web user interface prototype from the UML requirements analysis model so that we can confirm the validity of input/output data for each page and of page transitions on the system by directly operating the prototype. We propose a mapping rule in which design information independent of each web application framework implementation is defined based on the requirements analysis model, so as to improve the traceability to the final product from the valid requirements analysis model. This paper discusses the result of applying our method to the development of a Group Work Support System that is currently running in our department.

  11. Sign Language Recognition Based on Position and Movement Using Multi-Stream HMM

    Science.gov (United States)

    Nishida, Masafumi; Maebatake, Masaru; Suzuki, Iori; Horiuchi, Yasuo; Kuroiwa, Shingo

    To establish a universal communication environment, computer systems should recognize various modal communication languages. In conventional sign language recognition, recognition is performed at the word level using gesture information about hand shape and movement. In conventional studies, each feature is given the same weight when calculating the probability for recognition. We think hand position is very important for sign language recognition, since the meaning of a word differs according to hand position. In this study, we propose a sign language recognition method using a multi-stream HMM technique to show the importance of position and movement information for sign language recognition. We conducted recognition experiments using 28,200 sign language word data. As a result, 82.1 % recognition accuracy was obtained with the appropriate weights (position:movement = 0.2:0.8), while 77.8 % was obtained with equal weights, demonstrating that it is necessary to put more weight on movement than on position in sign language recognition.
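
    The weighting described above can be expressed directly on the per-stream observation log-likelihoods; the sketch below is a generic illustration of a multi-stream combination, with the reported weights (position 0.2, movement 0.8) as defaults, and is not the authors' system.

        # Sketch: combine position-stream and movement-stream observation scores per HMM state.
        import numpy as np

        def combined_log_obs(logb_position, logb_movement, w_pos=0.2, w_mov=0.8):
            """Each argument: (T, n_states) array of per-stream log b_j(o_t) values.
            The weighted sum replaces the single-stream observation term inside the
            usual Viterbi / forward recursions used for word recognition."""
            return w_pos * np.asarray(logb_position) + w_mov * np.asarray(logb_movement)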

  12. Modelling plasticity of unsaturated soils in a thermodynamically consistent framework

    CERN Document Server

    Coussy, O

    2010-01-01

    Constitutive equations of unsaturated soils are often derived in a thermodynamically consistent framework through the use of a unique 'effective' interstitial pressure. This latter is naturally chosen as the space-averaged interstitial pressure. However, experimental observations have revealed that two stress state variables were needed to describe the stress-strain-strength behaviour of unsaturated soils. The thermodynamics analysis presented here shows that the most general approach to the behaviour of unsaturated soils actually requires three stress state variables: the suction, which is required to describe the retention properties of the soil, and two effective stresses, which are required to describe the soil deformation at water saturation held constant. Actually, it is shown that a simple assumption related to internal deformation leads to the need for a unique effective stress to formulate the stress-strain constitutive equation describing the soil deformation. An elastoplastic framework is then presented ...

  13. A Framework for Modeling and Simulation of the Artificial

    Science.gov (United States)

    2012-01-01

    A musical performance can have a style of symphonic, folk, or jazz, and can therefore have an ensemble of orchestra, small group, or soloist. Such choices are expressed as constraints, for example: (constraint m3 :musical-performance (==> (equale (e@ style) jazz) (or (equale (e@ ensemble) small-group) (equale (e@ ensemble) orchestra))))

  14. A Modular Simulation Framework for Assessing Swarm Search Models

    Science.gov (United States)

    2014-09-01

    explored utilizing today’s search decision support and analysis tools. This thesis develops a framework in MATLAB that allows the investigation of search

  15. Advances in Geoscience Modeling: Smart Modeling Frameworks, Self-Describing Models and the Role of Standardized Metadata

    Science.gov (United States)

    Peckham, Scott

    2016-04-01

    Over the last decade, model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System) and ESMF (Earth System Modeling Framework) have developed mechanisms that make it much easier for modelers to connect heterogeneous sets of process models in a plug-and-play manner to create composite "system models". These mechanisms greatly simplify code reuse, but must simultaneously satisfy many different design criteria. They must be able to mediate or compensate for differences between the process models, such as their different programming languages, computational grids, time-stepping schemes, variable names and variable units. However, they must achieve this interoperability in a way that: (1) is noninvasive, requiring only relatively small and isolated changes to the original source code, (2) does not significantly reduce performance, (3) is not time-consuming or confusing for a model developer to implement, (4) can very easily be updated to accommodate new versions of a given process model and (5) does not shift the burden of providing model interoperability to the model developers. In tackling these design challenges, model framework developers have learned that the best solution is to provide each model with a simple, standardized interface, i.e. a set of standardized functions that make the model: (1) fully-controllable by a caller (e.g. a model framework) and (2) self-describing with standardized metadata. Model control functions are separate functions that allow a caller to initialize the model, advance the model's state variables in time and finalize the model. Model description functions allow a caller to retrieve detailed information on the model's input and output variables, its computational grid and its timestepping scheme. If the caller is a modeling framework, it can use the self description functions to learn about each process model in a collection to be coupled and then automatically call framework service components (e.g. regridders
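
    The kind of standardized interface described above can be sketched as a small set of control and self-description methods; the sketch below is illustrative only and is not the official BMI or CSDMS specification.

        # Sketch: a minimal, BMI-like component interface a coupling framework could call.
        class ModelComponentInterface:
            # control functions
            def initialize(self, config_file: str) -> None: ...
            def update(self) -> None: ...            # advance state variables one time step
            def finalize(self) -> None: ...

            # self-description functions (standardized metadata queries)
            def get_input_var_names(self) -> tuple: ...
            def get_output_var_names(self) -> tuple: ...
            def get_var_units(self, name: str) -> str: ...
            def get_time_step(self) -> float: ...
            def get_grid_shape(self, name: str) -> tuple: ...

            # value access used by the framework's regridding / mediation services
            def get_value(self, name: str): ...
            def set_value(self, name: str, value) -> None: ...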

  16. Collaborative Project. A Flexible Atmospheric Modeling Framework for the Community Earth System Model (CESM)

    Energy Technology Data Exchange (ETDEWEB)

    Gettelman, Andrew [University Corporation For Atmospheric Research (UCAR), Boulder, CO (United States)

    2015-10-01

    In this project we have been upgrading the Multiscale Modeling Framework (MMF) in the Community Atmosphere Model (CAM), also known as Super-Parameterized CAM (SP-CAM). This has included a major effort to update the coding standards and interface with CAM so that it can be placed on the main development trunk. It has also included development of a new software structure for CAM to be able to handle sub-grid column information. These efforts have formed the major thrust of the work.

  17. A PLM components monitoring framework for SMEs based on a PLM maturity model and FAHP methodology

    OpenAIRE

    Zhang, Haiqing; Sekhari, Aicha; Ouzrout, Yacine; Bouras, Abdelaziz

    2014-01-01

    Right PLM components selection and investments increase business advantages. This paper develops a PLM components monitoring framework to assess and guide PLM implementation in small and middle enterprises (SMEs). The framework builds upon PLM maturity models and decision-making methodology. PLM maturity model has the capability to analyze PLM functionalities and evaluate PLM components. A proposed PLM components maturity assessment (PCMA) model can obtain general maturity levels of PLM compo...

  18. A Bayesian Approach for Structural Learning with Hidden Markov Models

    Directory of Open Access Journals (Sweden)

    Cen Li

    2002-01-01

    Full Text Available Hidden Markov Models (HMM) have proved to be a successful modeling paradigm for dynamic and spatial processes in many domains, such as speech recognition, genomics, and general sequence alignment. Typically, in these applications, the model structures are predefined by domain experts. Therefore, the HMM learning problem focuses on the learning of the parameter values of the model to fit the given data sequences. However, when one considers other domains, such as economics and physiology, model structure capturing the system dynamic behavior is not available. In order to successfully apply the HMM methodology in these domains, it is important that a mechanism is available for automatically deriving the model structure from the data. This paper presents a HMM learning procedure that simultaneously learns the model structure and the maximum likelihood parameter values of a HMM from data. The HMM model structures are derived based on the Bayesian model selection methodology. In addition, we introduce a new initialization procedure for HMM parameter value estimation based on the K-means clustering method. Experimental results with artificially generated data show the effectiveness of the approach.
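
    A minimal sketch of the two ingredients highlighted above, K-means-based initialization of the emission parameters and a model-selection score over candidate numbers of states, is given below; it assumes hmmlearn and scikit-learn and uses BIC as a stand-in for the paper's Bayesian model selection.

        # Sketch: K-means initialization of Gaussian emissions plus BIC-based state selection.
        import numpy as np
        from sklearn.cluster import KMeans
        from hmmlearn.hmm import GaussianHMM

        def fit_with_kmeans_init(X, n_states):
            km = KMeans(n_clusters=n_states, n_init=10).fit(X)
            model = GaussianHMM(n_components=n_states, covariance_type="diag",
                                n_iter=100, init_params="stc")   # keep the K-means means
            model.means_ = km.cluster_centers_
            model.fit(X)
            return model

        def select_structure(X, candidate_states=(2, 3, 4, 5)):
            best, best_bic = None, np.inf
            for k in candidate_states:
                m = fit_with_kmeans_init(X, k)
                # free parameters: transitions, start probabilities, means and diagonal variances
                n_params = k * (k - 1) + (k - 1) + 2 * k * X.shape[1]
                bic = -2.0 * m.score(X) + n_params * np.log(X.shape[0])
                if bic < best_bic:
                    best, best_bic = m, bic
            return best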

  19. Assessing Students' Understandings of Biological Models and Their Use in Science to Evaluate a Theoretical Framework

    Science.gov (United States)

    Grünkorn, Juliane; Upmeier zu Belzen, Annette; Krüger, Dirk

    2014-01-01

    Research in the field of students' understandings of models and their use in science describes different frameworks concerning these understandings. Currently, there is no conjoint framework that combines these structures and so far, no investigation has focused on whether it reflects students' understandings sufficiently (empirical evaluation).…

  20. Applying the Nominal Response Model within a Longitudinal Framework to Construct the Positive Family Relationships Scale

    Science.gov (United States)

    Preston, Kathleen Suzanne Johnson; Parral, Skye N.; Gottfried, Allen W.; Oliver, Pamella H.; Gottfried, Adele Eskeles; Ibrahim, Sirena M.; Delany, Danielle

    2015-01-01

    A psychometric analysis was conducted using the nominal response model under the item response theory framework to construct the Positive Family Relationships scale. Using data from the Fullerton Longitudinal Study, this scale was constructed within a long-term longitudinal framework spanning middle childhood through adolescence. Items tapping…

  1. Robustness Analysis of Road Networks: a Framework with Combined DTA Models

    NARCIS (Netherlands)

    Li, M.

    2008-01-01

    Network robustness is the ability of a road network to function properly in the face of unpredictable and exceptional incidents. A systematic framework with combined dynamic traffic assignment (DTA) models is designed for the analysis of road network robustness. With this framework, network performance co

  2. A model based safety architecture framework for Dutch high speed train lines

    NARCIS (Netherlands)

    Schuitemaker, K.; Braakhuis, J.G.; Rajabalinejad, M.

    2015-01-01

    This paper presents a model-based safety architecture framework (MBSAF) for capturing and sharing architectural knowledge of safety cases of safety-critical systems of systems (SoS). Whilst architecture frameworks in the systems engineering domain often consider safety as a dependent attribute, this st

  3. Towards a Framework for Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth

    2010-01-01

    This paper describes a framework currently under development for modelling, simulation, and verification of relay interlocking systems as used by the Danish railways. The framework is centred around a domain-specific language (DSL) for describing such systems, and provides (1) a graphical editor ...

  4. Integration of the DAYCENT Biogeochemical Model within a Multi-Model Framework

    Energy Technology Data Exchange (ETDEWEB)

    David Muth

    2012-07-01

    Agricultural residues are the largest near term source of cellulosic biomass for bioenergy production, but removing agricultural residues sustainably requires considering the critical roles that residues play in the agronomic system. Determining sustainable removal rates for agricultural residues has received significant attention and integrated modeling strategies have been built to evaluate sustainable removal rates considering soil erosion and organic matter constraints. However the current integrated model does not quantitatively assess soil carbon and long term crop yield impacts of residue removal. Furthermore the current integrated model does not evaluate the greenhouse gas impacts of residue removal, specifically N2O and CO2 gas fluxes from the soil surface. The DAYCENT model simulates several important processes for determining agroecosystem performance. These processes include daily Nitrogen-gas flux, daily carbon dioxide flux from soil respiration, soil organic carbon and nitrogen, net primary productivity, and daily water and nitrate leaching. Each of these processes is an indicator of sustainability when evaluating emerging cellulosic biomass production systems for bioenergy. A potentially vulnerable cellulosic biomass resource is agricultural residues. This paper presents the integration of the DAYCENT model with the existing integration framework modeling tool to investigate additional environmental impacts of agricultural residue removal. The integrated model is extended to facilitate two-way coupling between DAYCENT and the existing framework. The extended integrated model is applied to investigate additional environmental impacts from a recent sustainable agricultural residue removal dataset. The integrated model with DAYCENT finds some differences in sustainable removal rates compared to previous results for a case study county in Iowa. The extended integrated model with

  5. EarthCube - Earth System Bridge: Spanning Scientific Communities with Interoperable Modeling Frameworks

    Science.gov (United States)

    Peckham, S. D.; DeLuca, C.; Gochis, D. J.; Arrigo, J.; Kelbert, A.; Choi, E.; Dunlap, R.

    2014-12-01

    In order to better understand and predict environmental hazards of weather/climate, ecology and deep earth processes, geoscientists develop and use physics-based computational models. These models are used widely both in academic and federal communities. Because of the large effort required to develop and test models, there is widespread interest in component-based modeling, which promotes model reuse and simplified coupling to tackle problems that often cross discipline boundaries. In component-based modeling, the goal is to make relatively small changes to models that make it easy to reuse them as "plug-and-play" components. Sophisticated modeling frameworks exist to rapidly couple these components to create new composite models. They allow component models to exchange variables while accommodating different programming languages, computational grids, time-stepping schemes, variable names and units. Modeling frameworks have arisen in many modeling communities. CSDMS (Community Surface Dynamics Modeling System) serves the academic earth surface process dynamics community, while ESMF (Earth System Modeling Framework) serves many federal Earth system modeling projects. Others exist in both the academic and federal domains and each satisfies design criteria that are determined by the community they serve. While they may use different interface standards or semantic mediation strategies, they share fundamental similarities. The purpose of the Earth System Bridge project is to develop mechanisms for interoperability between modeling frameworks, such as the ability to share a model or service component. This project has three main goals: (1) Develop a Framework Description Language (ES-FDL) that allows modeling frameworks to be described in a standard way so that their differences and similarities can be assessed. (2) Demonstrate that if a model is augmented with a framework-agnostic Basic Model Interface (BMI), then simple, universal adapters can go from BMI to a

  6. Temporo-spatial model construction using the MML and software framework.

    Science.gov (United States)

    Chang, David C; Dokos, Socrates; Lovell, Nigel H

    2011-12-01

    Development of complex temporo-spatial biological computational models can be a time consuming and arduous task. These models may contain hundreds of differential equations as well as realistic geometries that may require considerable investment in time to ensure that all model components are correctly implemented and error free. To tackle this problem, the Modeling Markup Languages (MML) and software framework is a modular XML/HDF5-based specification and toolkits that aims to simplify this process. The main goal of this framework is to encourage reusability, sharing and storage. To achieve this, the MML framework utilizes the CellML specification and repository, which comprises an extensive range of curated models available for use. The MML framework is an open-source project available at http://mml.gsbme.unsw.edu.au.

  7. A general mathematical framework for representing soil organic matter dynamics in biogeochemistry models

    Science.gov (United States)

    Sierra, C. A.; Mueller, M.

    2013-12-01

    Recent work has highlighted the importance of nonlinear interactions in representing the decomposition of soil organic matter (SOM). It is unclear, however, how to integrate these concepts into larger biogeochemical models or into a more general mathematical description of the decomposition process. Here we present a mathematical framework that generalizes both previous decomposition models and recent ideas about nonlinear microbial interactions. The framework is based on a set of four basic principles: 1) mass balance, 2) heterogeneity in the decomposability of SOM, 3) transformations in the decomposability of SOM over time, 4) energy limitation of decomposers. This framework generalizes a large majority of SOM decomposition models proposed to date. We illustrate the application of this framework to the development of a continuous model that includes the ideas in the Dual Arrhenius Michaelis-Menten Model (DAMM) for explicitly representing temperature-moisture limitations of enzyme activity in the decomposition of heterogeneous substrates.
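
    As a hedged illustration (the notation below is an assumption, not necessarily the authors' exact formulation), the four principles can be collected into a single mass-balance statement for a vector of carbon pools of differing decomposability:

        \frac{d\mathbf{C}(t)}{dt} \;=\; \mathbf{I}(t) \;+\; \mathbf{T}(\mathbf{C},t)\,\mathbf{N}(\mathbf{C},t)\,\mathbf{C}(t)

    Here C(t) collects the mass in pools of different decomposability (principle 2), I(t) is the external input, N(C,t) is a diagonal matrix of decomposition rates that may depend nonlinearly on microbial biomass, temperature and moisture (principle 4, as in DAMM-type limitations), and T(C,t) encodes transfers among pools, i.e. transformations of decomposability over time (principle 3); mass balance (principle 1) is expressed by the equation itself.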

  8. Model-based reasoning in the physics laboratory: Framework and initial results

    Science.gov (United States)

    Zwickl, Benjamin M.; Hu, Dehui; Finkelstein, Noah; Lewandowski, H. J.

    2015-12-01

    [This paper is part of the Focused Collection on Upper Division Physics Courses.] We review and extend existing frameworks on modeling to develop a new framework that describes model-based reasoning in introductory and upper-division physics laboratories. Constructing and using models are core scientific practices that have gained significant attention within K-12 and higher education. Although modeling is a broadly applicable process, within physics education, it has been preferentially applied to the iterative development of broadly applicable principles (e.g., Newton's laws of motion in introductory mechanics). A significant feature of the new framework is that measurement tools (in addition to the physical system being studied) are subjected to the process of modeling. Think-aloud interviews were used to refine the framework and demonstrate its utility by documenting examples of model-based reasoning in the laboratory. When applied to the think-aloud interviews, the framework captures and differentiates students' model-based reasoning and helps identify areas of future research. The interviews showed how students productively applied similar facets of modeling to the physical system and measurement tools: construction, prediction, interpretation of data, identification of model limitations, and revision. Finally, we document students' challenges in explicitly articulating assumptions when constructing models of experimental systems and further challenges in model construction due to students' insufficient prior conceptual understanding. A modeling perspective reframes many of the seemingly arbitrary technical details of measurement tools and apparatus as an opportunity for authentic and engaging scientific sense making.

  9. The Framework Dedicated to Three Phase Flows Wellbore Modelling

    Directory of Open Access Journals (Sweden)

    Bartlomiej Bielecki

    2015-01-01

    Full Text Available To predict physical properties in a wellbore during oil and gas production, scientists use empirical correlations or mechanistic algorithms. Here the most appropriate correlations for the subject are presented, and the authors study how to combine all correlations into a full framework that returns all production parameters at every depth in a wellbore. Additionally, the resulting simulation results are analysed. Based on the presented algorithms, a suitable tool has been implemented, and the results shown in this paper are taken from this application.

  10. Model-based visual tracking the OpenTL framework

    CERN Document Server

    Panin, Giorgio

    2011-01-01

    This book has two main goals: to provide a unified and structured overview of this growing field, as well as to propose a corresponding software framework, the OpenTL library, developed by the author and his working group at TUM-Informatik. The main objective of this work is to show how most real-world application scenarios can be naturally cast into a common description vocabulary, and therefore implemented and tested in a fully modular and scalable way, through the definition of a layered, object-oriented software architecture. The resulting architecture covers in a seamless way all processin

  11. Estimation methods for nonlinear state-space models in ecology

    DEFF Research Database (Denmark)

    Pedersen, Martin Wæver; Berg, Casper Willestofte; Thygesen, Uffe Høgsbro

    2011-01-01

    The use of nonlinear state-space models for analyzing ecological systems is increasing. A wide range of estimation methods for such models are available to ecologists; however, it is not always clear which is the appropriate method to choose. To this end, three approaches to estimation in the theta logistic model for population dynamics were benchmarked by Wang (2007). Similarly, we examine and compare the estimation performance of three alternative methods using simulated data. The first approach is to partition the state-space into a finite number of states and formulate the problem as a hidden Markov model (HMM). The second method uses the mixed effects modeling and fast numerical integration framework of the AD Model Builder (ADMB) open-source software. The third alternative is to use the popular Bayesian framework of BUGS. The study showed that state and parameter estimation performance
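
    The first (HMM) approach lends itself to a short sketch: discretize the latent log abundance of the theta logistic model onto a grid and run a standard HMM filter over it. The parameterization below is illustrative and this is not the benchmark code of the cited studies.

        # Sketch: grid-based HMM filtering for a theta logistic state-space model.
        import numpy as np
        from scipy.stats import norm

        def hmm_filter_theta_logistic(y, grid, r0, K, theta, sig_proc, sig_obs):
            """y: observed log abundances; grid: discretized latent log-abundance values."""
            def step(x):                                                 # deterministic skeleton
                return x + r0 * (1.0 - (np.exp(x) / K) ** theta)
            # transition matrix: P[i, j] = P(next state near grid[j] | current state grid[i])
            P = norm.pdf(grid[None, :], loc=step(grid)[:, None], scale=sig_proc)
            P /= P.sum(axis=1, keepdims=True)
            alpha = np.full(grid.size, 1.0 / grid.size)                  # uniform prior
            filtered = []
            for obs in y:
                alpha = alpha @ P                                        # predict
                alpha = alpha * norm.pdf(obs, loc=grid, scale=sig_obs)   # update
                alpha /= alpha.sum()
                filtered.append(grid @ alpha)                            # posterior mean state
            return np.array(filtered)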

  12. A model independent S/W framework for search-based software testing.

    Science.gov (United States)

    Oh, Jungsup; Baik, Jongmoon; Lim, Sung-Hwa

    2014-01-01

    In Model-Based Testing (MBT) area, Search-Based Software Testing (SBST) has been employed to generate test cases from the model of a system under test. However, many types of models have been used in MBT. If the type of a model has changed from one to another, all functions of a search technique must be reimplemented because the types of models are different even if the same search technique has been applied. It requires too much time and effort to implement the same algorithm over and over again. We propose a model-independent software framework for SBST, which can reduce redundant works. The framework provides a reusable common software platform to reduce time and effort. The software framework not only presents design patterns to find test cases for a target model but also reduces development time by using common functions provided in the framework. We show the effectiveness and efficiency of the proposed framework with two case studies. The framework improves the productivity by about 50% when changing the type of a model.
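    The core idea, a search technique written only against a model-independent interface so that changing the model type does not force re-implementing the search, can be sketched as follows. The TestModel protocol, the hill-climbing search, and the toy state-machine model are hypothetical illustrations, not the framework's actual API.

        import random
        from typing import Protocol, List

        class TestModel(Protocol):
            """Hypothetical model-independent interface used by the search technique."""
            def random_candidate(self) -> List[int]: ...
            def neighbours(self, candidate: List[int]) -> List[List[int]]: ...
            def fitness(self, candidate: List[int]) -> float: ...   # higher = better coverage

        def hill_climb(model: TestModel, iterations: int = 100) -> List[int]:
            """Generic search: works for any model type that implements the interface."""
            best = model.random_candidate()
            best_fit = model.fitness(best)
            for _ in range(iterations):
                for cand in model.neighbours(best):
                    fit = model.fitness(cand)
                    if fit > best_fit:
                        best, best_fit = cand, fit
            return best

        class ToyStateMachineModel:
            """Toy model: a test case is a sequence of 5 input symbols (0-3);
            fitness counts distinct symbols used (a stand-in for transition coverage)."""
            def random_candidate(self):
                return [random.randrange(4) for _ in range(5)]
            def neighbours(self, candidate):
                out = []
                for i in range(len(candidate)):
                    c = list(candidate)
                    c[i] = random.randrange(4)
                    out.append(c)
                return out
            def fitness(self, candidate):
                return float(len(set(candidate)))

        print(hill_climb(ToyStateMachineModel()))

    Swapping in a different model type only requires another class with the same three methods; the search code itself stays untouched, which is the redundancy the framework aims to remove.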

  13. Hydrogeologic Framework Model for the Saturated Zone Site Scale flow and Transport Model

    Energy Technology Data Exchange (ETDEWEB)

    T. Miller

    2004-11-15

    The purpose of this report is to document the 19-unit, hydrogeologic framework model (19-layer version, output of this report) (HFM-19) with regard to input data, modeling methods, assumptions, uncertainties, limitations, and validation of the model results in accordance with AP-SIII.10Q, Models. The HFM-19 is developed as a conceptual model of the geometric extent of the hydrogeologic units at Yucca Mountain and is intended specifically for use in the development of the "Saturated Zone Site-Scale Flow Model" (BSC 2004 [DIRS 170037]). Primary inputs to this model report include the GFM 3.1 (DTN: MO9901MWDGFM31.000 [DIRS 103769]), borehole lithologic logs, geologic maps, geologic cross sections, water level data, topographic information, and geophysical data as discussed in Section 4.1. Figure 1-1 shows the information flow among all of the saturated zone (SZ) reports and the relationship of this conceptual model in that flow. The HFM-19 is a three-dimensional (3-D) representation of the hydrogeologic units surrounding the location of the Yucca Mountain geologic repository for spent nuclear fuel and high-level radioactive waste. The HFM-19 represents the hydrogeologic setting for the Yucca Mountain area that covers about 1,350 km² and includes a saturated thickness of about 2.75 km. The boundaries of the conceptual model were primarily chosen to be coincident with grid cells in the Death Valley regional groundwater flow model (DTN: GS960808312144.003 [DIRS 105121]) such that the base of the site-scale SZ flow model is consistent with the base of the regional model (2,750 meters below a smoothed version of the potentiometric surface), encompasses the exploratory boreholes, and provides a framework over the area of interest for groundwater flow and radionuclide transport modeling. In depth, the model domain extends from land surface to the base of the regional groundwater flow model (D'Agnese et al. 1997 [DIRS 100131], p 2). For the site

  14. Baltes' SOC model of successful ageing as a potential framework for stroke rehabilitation.

    Science.gov (United States)

    Donnellan, C; O'Neill, D

    2014-01-01

    The aim of this paper is to explore approaches used to address some stroke rehabilitation interventions and to examine the potential use of one of the life-span theories, the Baltes' model of selective optimisation with compensation (SOC), as a potential framework. Some of the key considerations for a stroke rehabilitation intervention framework are highlighted, including accommodating the life management changes post stroke, addressing alterations in self-regulation, acknowledging losses and focusing on a person-centred approach for the transition from acute rehabilitation to the home or community setting. The Baltes' SOC model is then described in terms of these considerations for a stroke rehabilitation intervention framework. The Baltes' SOC model may offer further insights, including ageing considerations, for stroke rehabilitation approaches and interventions. It has potential to facilitate some of the necessary complexities of adjustment required in stroke rehabilitation. However, further development in terms of empirical support is required for using the model as a framework to structure stroke rehabilitation intervention. Implications for Rehabilitation: There is a scarcity of theoretical frameworks that can facilitate and be inclusive of all the necessary complexities of adjustment required in stroke rehabilitation. In addition to motor recovery post stroke, rehabilitation intervention frameworks should be goal orientated; address self-regulatory processes; be person-centred; and use a common language for goal planning, setting and attainment. The Baltes' SOC model is one such framework that may address some of the considerations for stroke rehabilitation, including motor recovery and other life management aspects.

  15. A model-based framework for the analysis of team communication in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Yun Hyung [Knowledge and Information Management Department, Korea Institute of Nuclear Safety, 19 Guseong-Dong, Yuseong-Gu, Daejeon 335-338 (Korea, Republic of)], E-mail: yhchung@kins.re.kr; Yoon, Wan Chul [Intelligent Service Engineering, Korea Advanced Institute of Science and Technology, 373-1 Guseong-Dong, Yuseong-Gu, Daejeon 305-701 (Korea, Republic of); Min, Daihwan [Department of MIS, Korea University, 208 Seochang-Dong, Jochiwon-Eup, Yongi-Gun, Choongnam 339-700 (Korea, Republic of)

    2009-06-15

    Advanced human-machine interfaces are rapidly changing the interaction between humans and systems, with the level of abstraction of the presented information, the human task characteristics, and the modes of communication all affected. To accommodate the changes in the human/system co-working environment, an extended communication analysis framework is needed that can describe and relate the tasks, verbal exchanges, and information interface. This paper proposes an extended analytic framework, referred to as the H-H-S (human-human-system) communication analysis framework, which can model the changes in team communication that are emerging in these new working environments. The stage-specific decision-making model and analysis tool of the proposed framework make the analysis of team communication easier by providing visual clues. The usefulness of the proposed framework is demonstrated with an in-depth comparison of the characteristics of communication in the conventional and advanced main control rooms of nuclear power plants.

  16. A framework for modeling interregional population distribution and economic growth.

    Science.gov (United States)

    Ledent, J; Gordon, P

    1981-01-01

    "An integrated model is proposed to capture economic and demographic interactions in a system of regions. This model links the interregional economic model of Isard (1960) and the interregional demographic model of Rogers (1975) via functions describing consumption and migration patterns. Migration rates are determined jointly with labor force participation rates and unemployment rates."

  17. Framework of Pattern Recognition Model Based on the Cognitive Psychology

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    According to the fundamental theory of the visual cognition mechanism and cognitive psychology, the visual pattern recognition model is introduced briefly. Three pattern recognition models, i.e. the template-based matching model, the prototype-based matching model and the feature-based matching model, are built and discussed separately. In addition, the influence of object background information and of the visual focus point on the result of pattern recognition is also discussed, using the recognition of fuzzy letters and figures as an example.

  18. Poly(ethylene glycol) (PEG) in a Polyethylene (PE) Framework: A Simple Model for Simulation Studies of a Soluble Polymer in an Open Framework.

    Science.gov (United States)

    Xie, Liangxu; Chan, Kwong-Yu; Quirke, Nick

    2017-08-16

    Canonical molecular dynamics simulations are performed to investigate the behavior of single-chain and multiple-chain poly(ethylene glycol) (PEG) contained within a cubic framework spanned by polyethylene (PE) chains. This simple model is the first of its kind to study the chemical physics of polymer-threaded organic frameworks, which are materials with potential applications in catalysis and separation processes. For a single-chain 9-mer, 14-mer, and 18-mer in a small framework, the PEG will interact strongly with the framework and assume a more linear chain geometry with an increased radius of gyration Rg compared to that in a large framework. The interaction between PEG and the framework decreases with increasing mesh size in both vacuum and water. In the limit of a framework with an infinitely large cavity (infinitely long linkers), PEG behavior approaches simulation results without a framework. The solvation of PEG is simulated by adding explicit TIP3P water molecules to a 6-chain PEG 14-mer aggregate confined in a framework. The 14-mer chains are readily solvated and leach out of a large 2.6 nm mesh framework. There are fewer water-PEG interactions in a small 1.0 nm mesh framework, as indicated by a smaller number of hydrogen bonds. The PEG aggregate, however, still partially dissolves but is retained within the 1.0 nm framework. The preliminary results illustrate the effectiveness of the simple model in studying polymer-threaded framework materials and in optimizing polymer or framework parameters for high performance.
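    As a side note on the quantity reported above, the radius of gyration Rg follows directly from the chain's atom coordinates. The helper below is a generic, optionally mass-weighted sketch, not the analysis code of the cited study; the four-bead "chain" is a made-up example.

        import numpy as np

        def radius_of_gyration(coords, masses=None):
            """Rg = sqrt( sum_i m_i |r_i - r_com|^2 / sum_i m_i ).
            coords: (N, 3) array of positions; masses: optional (N,) array (equal masses if omitted)."""
            coords = np.asarray(coords, dtype=float)
            m = np.ones(len(coords)) if masses is None else np.asarray(masses, dtype=float)
            com = (m[:, None] * coords).sum(axis=0) / m.sum()       # centre of mass
            sq = ((coords - com) ** 2).sum(axis=1)                  # squared distances to the COM
            return np.sqrt((m * sq).sum() / m.sum())

        # Toy example: four beads of a short chain (coordinates in nm, illustrative only).
        chain = [[0.0, 0.0, 0.0], [0.3, 0.0, 0.0], [0.6, 0.1, 0.0], [0.9, 0.1, 0.1]]
        print(radius_of_gyration(chain))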

  19. Modeling the marketing strategy-performance relationship : towards an hierarchical marketing performance framework

    NARCIS (Netherlands)

    Huizingh, Eelko K.R.E.; Zengerink, Evelien

    2001-01-01

    Accurate measurement of marketing performance is an important topic for both marketing academics and marketing managers. Many researchers have recognized that marketing performance measurement should go beyond financial measurement. In this paper we propose a conceptual framework that models

  20. Modeling the marketing strategy-performance relationship : towards an hierarchical marketing performance framework

    NARCIS (Netherlands)

    Huizingh, Eelko K.R.E.; Zengerink, Evelien

    2001-01-01

    Accurate measurement of marketing performance is an important topic for both marketing academics and marketing managers. Many researchers have recognized that marketing performance measurement should go beyond financial measurement. In this paper we propose a conceptual framework that models marketi

  1. The Regional Hydrologic Extremes Assessment System: A software framework for hydrologic modeling and data assimilation

    National Research Council Canada - National Science Library

    Konstantinos M Andreadis; Narendra Das; Dimitrios Stampoulis; Amor Ines; Joshua B Fisher; Stephanie Granger; Jessie Kawata; Eunjin Han; Ali Behrangi

    2017-01-01

    The Regional Hydrologic Extremes Assessment System (RHEAS) is a prototype software framework for hydrologic modeling and data assimilation that automates the deployment of water resources nowcasting and forecasting applications...

  2. A reference model and technical framework for mobile social software for learning

    NARCIS (Netherlands)

    De Jong, Tim; Specht, Marcus; Koper, Rob

    2008-01-01

    De Jong, T., Specht, M., & Koper, R. (2008). A reference model and technical framework for mobile social software for learning. Presented at the IADIS m-learning 2008 Conference. April, 11-13, 2008, Carvoeiro, Portugal.

  3. The IIR evaluation model: a framework for evaluation of interactive information retrieval systems

    Directory of Open Access Journals (Sweden)

    Borlund Pia

    2003-01-01

    Full Text Available An alternative approach to evaluation of interactive information retrieval (IIR) systems is proposed. The model provides a framework for the collection and analysis of IR interaction data.

  4. A flexible and efficient multi-model framework in support of water management

    Science.gov (United States)

    Wolfs, Vincent; Tran Quoc, Quan; Willems, Patrick

    2016-05-01

    Flexible, fast and accurate water quantity models are essential tools in support of water management. Adjustable levels of model detail and the ability to handle varying spatial and temporal resolutions are requisite model characteristics to ensure that such models can be employed efficiently in various applications. This paper uses a newly developed flexible modelling framework that aims to generate such models. The framework incorporates several approaches to model catchment hydrology, rivers and floodplains, and the urban drainage system by lumping processes on different levels. To illustrate this framework, a case study of integrated hydrological-hydraulic modelling is elaborated for the Grote Nete catchment in Belgium. Three conceptual rainfall-runoff models (NAM, PDM and VHM) were implemented in a generalized model structure, allowing flexibility in the spatial resolution by means of an innovative disaggregation/aggregation procedure. They were linked to conceptual hydraulic models of the rivers in the catchment, which were developed by means of an advanced model structure identification and calibration procedure. The conceptual models manage to emulate the simulation results of a detailed full hydrodynamic model accurately. The models configured using the approaches of this framework are well-suited for many applications in water management due to their very short calculation time, interfacing possibilities and adjustable level of detail.

  5. A framework for sharing and integrating remote sensing and GIS models based on Web service.

    Science.gov (United States)

    Chen, Zeqiang; Lin, Hui; Chen, Min; Liu, Deer; Bao, Ying; Ding, Yulin

    2014-01-01

    Sharing and integrating Remote Sensing (RS) and Geographic Information System/Science (GIS) models are critical for developing practical application systems. Facilitating model sharing and model integration is a problem for model publishers and model users, respectively. To address this problem, a framework based on a Web service for sharing and integrating RS and GIS models is proposed in this paper. The fundamental idea of the framework is to publish heterogeneous RS and GIS models into standard Web services for sharing and interoperation and then to integrate the RS and GIS models using Web services. For the former, a "black box" and a visual method are employed to facilitate the publishing of the models as Web services. For the latter, model integration based on the geospatial workflow and semantic supported marching method is introduced. Under this framework, model sharing and integration is applied for developing the Pearl River Delta water environment monitoring system. The results show that the framework can facilitate model sharing and model integration for model publishers and model users.

  6. NONLINEAR EXTENSION OF ASYMMETRIC GARCH MODEL WITHIN NEURAL NETWORK FRAMEWORK

    Directory of Open Access Journals (Sweden)

    Josip Arnerić

    2016-05-01

    Full Text Available The importance of volatility for all market participants has led to the development and application of various econometric models. The most popular models for modelling volatility are GARCH-type models, because they can account for the excess kurtosis and asymmetric effects of financial time series. Since the standard GARCH(1,1) model usually indicates high persistence in the conditional variance, empirical research has turned to the GJR-GARCH model and revealed its superiority in fitting the asymmetric heteroscedasticity in the data. In order to capture both asymmetry and nonlinearity in the data, the goal of this paper is to develop a parsimonious NN model as an extension to the GJR-GARCH model and to determine whether GJR-GARCH-NN outperforms the GJR-GARCH model.
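    For reference (not quoted from the paper), the GJR-GARCH(1,1) conditional variance that the paper extends is commonly written as

        \sigma_t^2 = \omega + \left(\alpha + \gamma\, I_{t-1}\right)\varepsilon_{t-1}^2 + \beta\,\sigma_{t-1}^2,
        \qquad
        I_{t-1} = \begin{cases} 1, & \varepsilon_{t-1} < 0, \\ 0, & \text{otherwise,} \end{cases}

    so the indicator term activates the extra coefficient γ for negative shocks, producing the asymmetric response; the NN extension studied in the paper presumably augments this recursion with a neural-network component.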

  7. Multi-stream LSTM-HMM decoding and histogram equalization for noise robust keyword spotting.

    Science.gov (United States)

    Wöllmer, Martin; Marchi, Erik; Squartini, Stefano; Schuller, Björn

    2011-09-01

    Highly spontaneous, conversational, and potentially emotional and noisy speech is known to be a challenge for today's automatic speech recognition (ASR) systems, which highlights the need for advanced algorithms that improve speech features and models. Histogram Equalization is an efficient method to reduce the mismatch between clean and noisy conditions by normalizing all moments of the probability distribution of the feature vector components. In this article, we propose to combine histogram equalization and multi-condition training for robust keyword detection in noisy speech. To better cope with conversational speaking styles, we show how contextual information can be effectively exploited in a multi-stream ASR framework that dynamically models context-sensitive phoneme estimates generated by a long short-term memory neural network. The proposed techniques are evaluated on the SEMAINE database-a corpus containing emotionally colored conversations with a cognitive system for "Sensitive Artificial Listening".
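    A minimal sketch of feature-space histogram equalization as used here: each feature dimension's empirical distribution is mapped onto a reference distribution (a standard Gaussian in this sketch) through its rank-based CDF. This is a generic illustration, not the authors' implementation; the synthetic "noisy" features are placeholders.

        import numpy as np
        from scipy.stats import norm

        def histogram_equalize(features, eps=1e-6):
            """Map each column of a (frames x dims) feature matrix to a standard Gaussian
            by replacing values with the inverse normal CDF of their empirical ranks."""
            features = np.asarray(features, dtype=float)
            out = np.empty_like(features)
            n = len(features)
            for d in range(features.shape[1]):
                ranks = np.argsort(np.argsort(features[:, d]))     # ranks 0 .. n-1
                cdf = (ranks + 0.5) / n                            # keep CDF strictly inside (0, 1)
                out[:, d] = norm.ppf(np.clip(cdf, eps, 1 - eps))
            return out

        # Example: equalize a small block of skewed, "noisy" MFCC-like features (synthetic values).
        rng = np.random.default_rng(0)
        noisy = rng.gamma(shape=2.0, scale=1.0, size=(100, 13))
        clean_like = histogram_equalize(noisy)
        print(clean_like.mean(axis=0)[:3], clean_like.std(axis=0)[:3])   # roughly zero mean, unit variance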

  8. An Integrated Qualitative and Quantitative Biochemical Model Learning Framework Using Evolutionary Strategy and Simulated Annealing.

    Science.gov (United States)

    Wu, Zujian; Pang, Wei; Coghill, George M

    2015-01-01

    Both qualitative and quantitative model learning frameworks for biochemical systems have been studied in computational systems biology. In this research, after introducing two forms of pre-defined component patterns to represent biochemical models, we propose an integrative qualitative and quantitative modelling framework for inferring biochemical systems. In the proposed framework, interactions between reactants in the candidate models for a target biochemical system are evolved and eventually identified by the application of a qualitative model learning approach with an evolution strategy. Kinetic rates of the models generated from qualitative model learning are then further optimised by employing a quantitative approach with simulated annealing. Experimental results indicate that our proposed integrative framework can learn the relationships between biochemical reactants qualitatively and make the model replicate the behaviours of the target system by optimising the kinetic rates quantitatively. Moreover, potential reactants of a target biochemical system can be discovered by hypothesising complex reactants in the synthetic models. Based on the biochemical models learned from the proposed framework, biologists can further perform experimental studies in the wet laboratory. In this way, natural biochemical systems can be better understood.
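    The quantitative stage described above can be sketched with a generic simulated-annealing loop over kinetic rates, scoring each candidate by the mismatch between a simulated trajectory and the target behaviour. The toy one-reaction model, cooling schedule, and all constants below are illustrative assumptions, not the authors' setup.

        import math
        import random

        def simulate(rate, steps=50, dt=0.1, x0=1.0):
            """Toy 'biochemical model': first-order decay x' = -rate * x, integrated with Euler steps."""
            x, traj = x0, []
            for _ in range(steps):
                x += dt * (-rate * x)
                traj.append(x)
            return traj

        target = simulate(rate=0.7)                       # stand-in for the target system's behaviour

        def cost(rate):
            sim = simulate(rate)
            return sum((a - b) ** 2 for a, b in zip(sim, target))

        def anneal(start=2.0, temp=1.0, cooling=0.95, iters=500):
            current, best = start, start
            for _ in range(iters):
                cand = abs(current + random.gauss(0.0, 0.1))          # perturb the kinetic rate
                delta = cost(cand) - cost(current)
                if delta < 0 or random.random() < math.exp(-delta / max(temp, 1e-9)):
                    current = cand                                     # accept better, or worse with prob.
                    if cost(current) < cost(best):
                        best = current
                temp *= cooling                                        # geometric cooling
            return best

        print(anneal())   # should approach the target rate of 0.7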

  9. National culture and business model change: a framework for successful expansions

    DEFF Research Database (Denmark)

    Dalby, J.; Nielsen, L.S.; Lueg, Rainer;

    2014-01-01

    Dalby, J., Nielsen, L. S., Lueg, R., Pedersen, L., Tomoni, A. C. 2014. National culture and business model change: a framework for successful expansions. Journal of Enterprising Culture, 22(4): 379-498.

  10. Super-Exponential Solution in Markovian Supermarket Models: Framework and Challenge

    OpenAIRE

    Li, Quan-Lin

    2011-01-01

    Marcel F. Neuts opened a key door in numerical computation of stochastic models by means of phase-type (PH) distributions and Markovian arrival processes (MAPs). To celebrate his 75th birthday, this paper reports a more general framework of Markovian supermarket models, including a system of differential equations for the fraction measure and a system of nonlinear equations for the fixed point. To understand this framework heuristically, this paper gives a detailed analysis for three important…

  11. A scalable delivery framework and a pricing model for streaming media with advertisements

    Science.gov (United States)

    Al-Hadrusi, Musab; Sarhan, Nabil J.

    2008-01-01

    This paper presents a delivery framework for streaming media with advertisements and an associated pricing model. The delivery model combines the benefits of periodic broadcasting and stream merging. The advertisements' revenues are used to subsidize the price of the media content. The pricing is determined based on the total ads' viewing time. Moreover, this paper presents an efficient ad allocation scheme and three modified scheduling policies that are well suited to the proposed delivery framework. Furthermore, we study the effectiveness of the delivery framework and the various scheduling policies through extensive simulation in terms of numerous metrics, including customer defection probability, average number of ads viewed per client, price, arrival rate, profit, and revenue.

  12. Integrating water quality modeling with ecological risk assessment for nonpoint source pollution control: A conceptual framework

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Y.D.; McCutcheon, S.C.; Rasmussen, T.C.; Nutter, W.L.; Carsel, R.F.

    1993-01-01

    The historical development of water quality protection goals and strategies in the United States is reviewed. The review leads to the identification and discussion of three components (i.e., management mechanism, environmental investigation approaches, and environmental assessment and criteria) for establishing a management framework for nonpoint source pollution control. Water quality modeling and ecological risk assessment are the two most important and promising approaches to the operation of the proposed management framework. A conceptual framework that shows the general integrative relationships between water quality modeling and ecological risk assessment is presented. (Copyright (c) 1993 IAWQ.)

  13. BioASF: a framework for automatically generating executable pathway models specified in BioPAX.

    Science.gov (United States)

    Haydarlou, Reza; Jacobsen, Annika; Bonzanni, Nicola; Feenstra, K Anton; Abeln, Sanne; Heringa, Jaap

    2016-06-15

    Biological pathways play a key role in most cellular functions. To better understand these functions, diverse computational and cell biology researchers use biological pathway data for various analysis and modeling purposes. For specifying these biological pathways, a community of researchers has defined BioPAX and provided various tools for creating, validating and visualizing BioPAX models. However, a generic software framework for simulating BioPAX models is missing. Here, we attempt to fill this gap by introducing a generic simulation framework for BioPAX. The framework explicitly separates the execution model from the model structure as provided by BioPAX, with the advantage that the modelling process becomes more reproducible and intrinsically more modular; this ensures natural biological constraints are satisfied upon execution. The framework is based on the principles of discrete event systems and multi-agent systems, and is capable of automatically generating a hierarchical multi-agent system for a given BioPAX model. To demonstrate the applicability of the framework, we simulated two types of biological network models: a gene regulatory network modeling the haematopoietic stem cell regulators and a signal transduction network modeling the Wnt/β-catenin signaling pathway. We observed that the results of the simulations performed using our framework were entirely consistent with the simulation results reported by the researchers who developed the original models in a proprietary language. The framework, implemented in Java, is open source and its source code, documentation and tutorial are available at http://www.ibi.vu.nl/programs/BioASF. Contact: j.heringa@vu.nl. © The Author 2016. Published by Oxford University Press.

  14. Finite element model updating using bayesian framework and modal properties

    CSIR Research Space (South Africa)

    Marwala, T

    2005-01-01

    Full Text Available Finite element (FE) models are widely used to predict the dynamic characteristics of aerospace structures. These models often give results that differ from measured results and therefore need to be updated to match measured results. Some...

  15. A Generic Software Framework for Data Assimilation and Model Calibration

    NARCIS (Netherlands)

    Van Velzen, N.

    2010-01-01

    The accuracy of dynamic simulation models can be increased by using observations in conjunction with a data assimilation or model calibration algorithm. However, implementing such algorithms usually increases the complexity of the model software significantly. By using concepts from object oriented

  16. Integration of GFDL Data Portal into FMS Runtime Environment (FRE) modeling framework

    Science.gov (United States)

    Nikonov, S.; Balaji, V.; Rehbein, C.; Malysheva, Y.

    2009-12-01

    The complexities of modern climate modeling and the tremendous volume of model output data require advancements in the development of a climate model data dissemination infrastructure, stressing tight integration with the modeling framework. It is particularly important on the threshold of the IPCC Assessment Report 5, where data volume growth is expected to be two orders of magnitude beyond what was required for AR4. Shared infrastructure between the modeling framework and the data preparation and publishing framework will more easily allow the automation of three important phases of data dissemination: - Proper metadata annotation of datasets, including but not limited to model principles, configurations and simulation details; - Making data available using community-adopted standards; - Web-based data publishing which can feature semantic discoverability, navigation, federalization and analysis of published datasets. A framework which integrates the running of models with the publishing of data is a mutually profitable process. Climate model authors would have a powerful tool for composing and analyzing conducted experiments using FRE and a well-organized metadata database. In turn, those maintaining the data access portal gain direct access to metadata used in FMS that is useful in the process of publishing data. The kernel of this integration is a model development database that contains metadata describing all aspects of the modeling process, ranging from model configuration through model runtime, including model output analysis and data publishing. The database is populated using an automated process that collects model information and metadata via the model's XML-based configuration file. The framework also allows one to work backwards, generating synthesized experiment configurations using experiment components established in prior conducted experiments. In addition, a web-based graphical interface grants the model author a friendly and comfortable way to

  17. Integration of the Radiation Belt Environment Model Into the Space Weather Modeling Framework

    Science.gov (United States)

    Glocer, A.; Toth, G.; Fok, M.; Gombosi, T.; Liemohn, M.

    2009-01-01

    We have integrated the Fok radiation belt environment (RBE) model into the space weather modeling framework (SWMF). RBE is coupled to the global magnetohydrodynamics component (represented by the Block-Adaptive-Tree Solar-wind Roe-type Upwind Scheme, BATS-R-US, code) and the Ionosphere Electrodynamics component of the SWMF, following initial results using the Weimer empirical model for the ionospheric potential. The radiation belt (RB) model solves the convection-diffusion equation of the plasma in the energy range of 10 keV to a few MeV. In stand-alone mode RBE uses Tsyganenko's empirical models for the magnetic field, and Weimer's empirical model for the ionospheric potential. In the SWMF the BATS-R-US model provides the time dependent magnetic field by efficiently tracing the closed magnetic field-lines and passing the geometrical and field strength information to RBE at a regular cadence. The ionosphere electrodynamics component uses a two-dimensional vertical potential solver to provide new potential maps to the RBE model at regular intervals. We discuss the coupling algorithm and show some preliminary results with the coupled code. We run our newly coupled model for periods of steady solar wind conditions and compare our results to the RB model using an empirical magnetic field and potential model. We also simulate the RB for an active time period and find that there are substantial differences in the RB model results when changing either the magnetic field or the electric field, including the creation of an outer belt enhancement via rapid inward transport on the time scale of tens of minutes.

  18. VOML: A Framework for Modelling Virtual Organizations and Virtual Breeding Environment

    Directory of Open Access Journals (Sweden)

    Noor Jahan Rajper

    2015-07-01

    Full Text Available This paper presents the VOML (Virtual Organization Modelling Language) framework. VOML is a formal approach for specifying VOs (Virtual Organizations) and their VBEs (Virtual Breeding Environments). The VOML framework allows domain users to model a system in terms of their domain terminology, and from that domain-specific model the IT community can derive a complete operational model closer to the underlying execution environment. The framework is a collection of three sub-languages, each covering different aspects which are considered paramount at a particular level of VO representation. We present VOML and its underlying methodological approach in detail and demonstrate how to model VOs. Our focus will be on the methodological approach that VOML supports and on the language primitives that VOML offers for modelling VOs.

  19. Physical Models of Galaxy Formation in a Cosmological Framework

    OpenAIRE

    Somerville, Rachel S.; Davé, Romeel

    2014-01-01

    Modeling galaxy formation in a cosmological context presents one of the greatest challenges in astrophysics today, due to the vast range of scales and numerous physical processes involved. Here we review the current status of models that employ two leading techniques to simulate the physics of galaxy formation: semi-analytic models and numerical hydrodynamic simulations. We focus on a set of observational targets that describe the evolution of the global and structural properties of galaxies ...

  20. Viewpoints: a framework for object oriented database modelling and distribution

    Directory of Open Access Journals (Sweden)

    Fouzia Benchikha

    2006-01-01

    Full Text Available The viewpoint concept has received widespread attention recently. Its integration into a data model improves the flexibility of the conventional object-oriented data model and allows one to improve the modelling power of objects. The viewpoint paradigm can be used as a means of providing multiple descriptions of an object and as a means of mastering the complexity of current database systems enabling them to be developed in a distributed manner. The contribution of this paper is twofold: to define an object data model integrating viewpoints in databases and to present a federated database system integrating multiple sources following a local-as-extended-view approach.

  1. Assessing uncertainties in global cropland futures using a conditional probabilistic modelling framework

    NARCIS (Netherlands)

    Engström, Kerstin; Olin, Stefan; Rounsevell, Mark D A; Brogaard, Sara; Van Vuuren, Detlef P.; Alexander, Peter; Murray-Rust, Dave; Arneth, Almut

    2016-01-01

    We present a modelling framework to simulate probabilistic futures of global cropland areas that are conditional on the SSP (shared socio-economic pathway) scenarios. Simulations are based on the Parsimonious Land Use Model (PLUM) linked with the global dynamic vegetation model LPJ-GUESS (Lund-Potsd

  2. An ice sheet model validation framework for the Greenland ice sheet

    NARCIS (Netherlands)

    Price, Stephen F.; Hoffman, Matthew J.; Bonin, Jennifer A.; Howat, Ian M.; Neumann, Thomas; Saba, Jack; Tezaur, Irina; Guerber, Jeffrey; Chambers, Don P.; Evans, Katherine J.; Kennedy, Joseph H.; Lenaerts, Jan; Lipscomb, William H.; Perego, Mauro; Salinger, Andrew G.; Tuminaro, Raymond S.; Van Den Broeke, Michiel R.; Nowicki, Sophie M J

    2017-01-01

    We propose a new ice sheet model validation framework - the Cryospheric Model Comparison Tool (CmCt) - that takes advantage of ice sheet altimetry and gravimetry observations collected over the past several decades and is applied here to modeling of the Greenland ice sheet. We use realistic

  3. Assessing uncertainties in global cropland futures using a conditional probabilistic modelling framework

    NARCIS (Netherlands)

    Engström, Kerstin; Olin, Stefan; Rounsevell, Mark D A; Brogaard, Sara; Van Vuuren, Detlef P.; Alexander, Peter; Murray-Rust, Dave; Arneth, Almut

    2016-01-01

    We present a modelling framework to simulate probabilistic futures of global cropland areas that are conditional on the SSP (shared socio-economic pathway) scenarios. Simulations are based on the Parsimonious Land Use Model (PLUM) linked with the global dynamic vegetation model LPJ-GUESS

  4. Deep Modeling: Circuit Characterization Using Theory Based Models in a Data Driven Framework

    Energy Technology Data Exchange (ETDEWEB)

    Bolme, David S [ORNL; Mikkilineni, Aravind K [ORNL; Rose, Derek C [ORNL; Yoginath, Srikanth B [ORNL; Holleman, Jeremy [University of Tennessee, Knoxville (UTK); Judy, Mohsen [University of Tennessee, Knoxville (UTK), Department of Electrical Engineering and Computer Science

    2017-01-01

    Analog computational circuits have been demonstrated to provide substantial improvements in power and speed relative to digital circuits, especially for applications requiring extreme parallelism but only modest precision. Deep machine learning is one such area and stands to benefit greatly from analog and mixed-signal implementations. However, even at modest precisions, offsets and non-linearity can degrade system performance. Furthermore, in all but the simplest systems, it is impossible to directly measure the intermediate outputs of all sub-circuits. The result is that circuit designers are unable to accurately evaluate the non-idealities of computational circuits in-situ and are therefore unable to fully utilize measurement results to improve future designs. In this paper we present a technique to use deep learning frameworks to model physical systems. Recently developed libraries like TensorFlow make it possible to use back propagation to learn parameters in the context of modeling circuit behavior. Offsets and scaling errors can be discovered even for sub-circuits that are deeply embedded in a computational system and not directly observable. The learned parameters can be used to refine simulation methods or to identify appropriate compensation strategies. We demonstrate the framework using a mixed-signal convolution operator as an example circuit.

  5. Toward the Establishment of a Common Framework for Model Evaluation

    DEFF Research Database (Denmark)

    Olesen, H. R.

    1996-01-01

    Proceedings of the Twenty-first NATO/CCMS International Technical Meeting on Air Pollution Modeling and Its Application, held November 6-10, 1995, in Baltimore, Maryland.

  6. Abdominal surgery process modeling framework for simulation using spreadsheets.

    Science.gov (United States)

    Boshkoska, Biljana Mileva; Damij, Talib; Jelenc, Franc; Damij, Nadja

    2015-08-01

    We provide a continuation of the existing Activity Table Modeling methodology with a modular spreadsheet simulation. The simulation model developed comprises 28 modeling elements for the abdominal surgery cycle process. The simulation of a two-week patient flow in an abdominal clinic with 75 beds demonstrates the applicability of the methodology. The simulation does not include macros, so programming experience is not essential for replicating or upgrading the model. Unlike the existing methods, the proposed solution employs a modular approach to modeling the activities that ensures better readability, the possibility of easily upgrading the model with other activities, and its easy extension and connection with other similar models. We propose a first-in-first-served approach for the simulation of servicing multiple patients. The uncertain time duration of the activities is modeled using the function "rand()". The patients' movements from one activity to the next are tracked with nested "if()" functions, thus allowing easy re-creation of the process without the need for complex programming. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  7. An integrated hydrologic modeling framework for coupling SWAT with MODFLOW

    Science.gov (United States)

    The Soil and Water Assessment Tool (SWAT), MODFLOW, and Energy Balance based Evapotranspiration (EB_ET) models are extensively used to estimate different components of the hydrological cycle. Surface and subsurface hydrological processes are modeled in SWAT but limited to the extent of shallow aquif...

  8. A MULTISCALE, CELL-BASED FRAMEWORK FOR MODELING CANCER DEVELOPMENT

    Energy Technology Data Exchange (ETDEWEB)

    JIANG, YI [Los Alamos National Laboratory

    2007-01-16

    Cancer remains one of the leading causes of death due to disease. We use a systems approach that combines mathematical modeling, numerical simulation, and in vivo and in vitro experiments to develop a predictive model that medical researchers can use to study and treat cancerous tumors. The multiscale, cell-based model includes intracellular regulation, cellular level dynamics and intercellular interactions, and extracellular level chemical dynamics. The intracellular level protein regulation and signaling pathways are described by Boolean networks. The cellular level growth and division dynamics, cellular adhesion and interaction with the extracellular matrix are described by a lattice Monte Carlo model (the Cellular Potts Model). The extracellular dynamics of the signaling molecules and metabolites are described by a system of reaction-diffusion equations. All three levels of the model are integrated through a hybrid parallel scheme into a high-performance simulation tool. The simulation results reproduce experimental data in both avascular tumors and tumor angiogenesis. By combining the model with experimental data to construct biologically accurate simulations of tumors and their vascular systems, this model will enable medical researchers to gain a deeper understanding of the cellular and molecular interactions associated with cancer progression and treatment.
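    To make the intracellular level concrete, the sketch below performs a synchronous update of a tiny Boolean regulatory network; the three-gene rules are invented for illustration and are not the regulatory networks used in the cited model.

        # Synchronous Boolean network update for a toy three-gene regulatory motif.
        # Rules (illustrative only): A is an external signal held constant,
        # B is activated by A, C is activated by B but repressed by A.

        def step(state):
            a, b, c = state["A"], state["B"], state["C"]
            return {"A": a, "B": a, "C": b and not a}

        state = {"A": True, "B": False, "C": False}
        for t in range(4):
            print(t, state)
            state = step(state)   # all genes are updated simultaneously from the previous state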

  9. A business model for IPTV service: A dynamic framework

    NARCIS (Netherlands)

    Bouwman, H.; Zhengjia, M.; Duin, P. van der; Limonard, S.

    2008-01-01

    Purpose - The purpose of this paper is to investigate a possible business model for telecom operators for entering the IPTV (digital television) market. Design/methodology/approach - The approach takes the form of a case study, literature search and interviews. Findings - The IPTV business model alw

  10. Physical Models of Galaxy Formation in a Cosmological Framework

    Science.gov (United States)

    Somerville, Rachel S.; Davé, Romeel

    2015-08-01

    Modeling galaxy formation in a cosmological context presents one of the greatest challenges in astrophysics today due to the vast range of scales and numerous physical processes involved. Here we review the current status of models that employ two leading techniques to simulate the physics of galaxy formation: semianalytic models and numerical hydrodynamic simulations. We focus on a set of observational targets that describe the evolution of the global and structural properties of galaxies from roughly cosmic high noon (z ∼ 2-3) to the present. Although minor discrepancies remain, overall, models show remarkable convergence among different methods and make predictions that are in qualitative agreement with observations. Modelers have converged on a core set of physical processes that are critical for shaping galaxy properties. This core set includes cosmological accretion, strong stellar-driven winds that are more efficient at low masses, black hole feedback that preferentially suppresses star formation at high masses, and structural and morphological evolution through merging and environmental processes. However, all cosmological models currently adopt phenomenological implementations of many of these core processes, which must be tuned to observations. Many details of how these diverse processes interact within a hierarchical structure formation setting remain poorly understood. Emerging multiscale simulations are helping to bridge the gap between stellar and cosmological scales, placing models on a firmer, more physically grounded footing. Concurrently, upcoming telescope facilities will provide new challenges and constraints for models, particularly by directly constraining inflows and outflows through observations of gas in and around galaxies.

  11. A Framework for Modeling and Analyzing Complex Distributed Systems

    Science.gov (United States)

    2005-08-15

    tool Kronos, Hybrid Systems III: Verification and Control, Springer-Verlag, pages 208-219, LNCS, volume 1066, 1996 [16] Roberto De Prisco, Alan Fekete... Open-Kronos model checker for timed automata. Monte Carlo model checking has already been implemented in Open-Kronos and has demonstrated significant

  12. Instant e-Teaching Framework Model for Live Online Teaching

    CERN Document Server

    Safei, Suhailan; Rose, Ahmad Nazari Mohd; Rahman, Mohd Nordin Abdul

    2011-01-01

    Instant e-Teaching is a new concept that supplements the e-Teaching and e-Learning environment in providing full and comprehensive modern education styles. e-Learning technology depicts the concept of enabling self-learning among students on a certain subject using online references and materials, while instant e-Teaching requires a 'face-to-face' characteristic between teacher and student to simultaneously execute actions and gain instant responses. The word instant enhances e-Teaching with the concept of real-time teaching. The challenge of delivering online and instant teaching is not merely one of relying on the technologies and system efficiency; the system must also be usable and friendly enough to replicate the traditional classroom environment during class delivery. For this purpose, an instant e-Teaching framework has been developed that emulates a dedicated virtual classroom and is primarily designed for synchronous and live sharing of current teaching notes. The m...

  13. Physical Models of Galaxy Formation in a Cosmological Framework

    CERN Document Server

    Somerville, Rachel S

    2014-01-01

    Modeling galaxy formation in a cosmological context presents one of the greatest challenges in astrophysics today, due to the vast range of scales and numerous physical processes involved. Here we review the current status of models that employ two leading techniques to simulate the physics of galaxy formation: semi-analytic models and numerical hydrodynamic simulations. We focus on a set of observational targets that describe the evolution of the global and structural properties of galaxies from roughly Cosmic High Noon ($z\\sim 2-3$) to the present. Although minor discrepancies remain, overall, models show remarkable convergence between different methods and make predictions that are in qualitative agreement with observations. Modelers seem to have converged on a core set of physical processes that are critical for shaping galaxy properties. This core set includes cosmological accretion, strong stellar-driven winds that are more efficient at low masses, black hole feedback that preferentially suppresses star...

  14. Genetic Algorithms Principles Towards Hidden Markov Model

    Directory of Open Access Journals (Sweden)

    Nabil M. Hewahi

    2011-10-01

    Full Text Available In this paper we propose a general approach based on Genetic Algorithms (GAs) to evolve Hidden Markov Models (HMMs). The problem appears when experts assign probability values for an HMM: they use only some limited inputs, so the assigned probability values might not be accurate enough to serve in other cases from the same domain. We introduce a GA-based approach to find suitable probability values so that the HMM remains mostly correct in more cases than the ones originally used to assign the probability values.
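    A minimal sketch of the idea: each individual in the population encodes the HMM transition and emission probabilities, and fitness is the log-likelihood of observation sequences under that HMM (computed with the forward algorithm). The mutation-only, elitist GA, the uniform initial state distribution, and all sizes below are illustrative assumptions, not the authors' encoding.

        import numpy as np

        rng = np.random.default_rng(1)
        N_STATES, N_SYMBOLS = 2, 3

        def random_hmm():
            A = rng.random((N_STATES, N_STATES)); A /= A.sum(axis=1, keepdims=True)   # transitions
            B = rng.random((N_STATES, N_SYMBOLS)); B /= B.sum(axis=1, keepdims=True)  # emissions
            return A, B

        def log_likelihood(hmm, seq):
            """Scaled forward algorithm with a uniform initial distribution (assumption)."""
            A, B = hmm
            alpha = np.full(N_STATES, 1.0 / N_STATES) * B[:, seq[0]]
            ll = np.log(alpha.sum()); alpha /= alpha.sum()
            for obs in seq[1:]:
                alpha = (alpha @ A) * B[:, obs]
                s = alpha.sum()
                ll += np.log(s); alpha /= s
            return ll

        def mutate(hmm, scale=0.05):
            A, B = hmm
            A = np.abs(A + rng.normal(0, scale, A.shape)); A /= A.sum(axis=1, keepdims=True)
            B = np.abs(B + rng.normal(0, scale, B.shape)); B /= B.sum(axis=1, keepdims=True)
            return A, B

        def evolve(sequences, pop_size=30, generations=100):
            pop = [random_hmm() for _ in range(pop_size)]
            fit = lambda h: sum(log_likelihood(h, s) for s in sequences)
            for _ in range(generations):
                pop.sort(key=fit, reverse=True)                     # keep the fittest half,
                survivors = pop[: pop_size // 2]                    # refill with mutated copies
                pop = survivors + [mutate(h) for h in survivors]
            return max(pop, key=fit)

        # Toy training data: symbol sequences the evolved HMM should fit well.
        data = [[0, 0, 1, 2, 2], [0, 1, 1, 2, 2], [0, 0, 1, 1, 2]]
        best = evolve(data)
        print(sum(log_likelihood(best, s) for s in data))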

  15. Comparison of Hugoniots calculated for aluminum in the framework of three quantum-statistical models

    CERN Document Server

    Kadatskiy, Maxim A

    2015-01-01

    The results of calculations of thermodynamic properties of aluminum under shock compression in the framework of the Thomas-Fermi model, the Thomas-Fermi model with quantum and exchange corrections and the Hartree-Fock-Slater model are presented. The influences of the thermal motion and the interaction of ions are taken into account in the framework of three models: the ideal gas, the one-component plasma and the charged hard spheres. Calculations are performed in the pressure range from 1 to $10^7$ GPa. Calculated Hugoniots are compared with available experimental data.
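    For context (not part of the abstract), shocked states on the Hugoniot satisfy the Rankine-Hugoniot energy relation linking the initial state $(P_0, V_0, E_0)$ to the shocked state $(P, V, E)$; the quantum-statistical models above supply the equation of state $E(P, V)$ that closes it:

        E - E_0 = \tfrac{1}{2}\,\bigl(P + P_0\bigr)\bigl(V_0 - V\bigr)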

  16. Comparison of Hugoniots calculated for aluminum in the framework of three quantum-statistical models

    Science.gov (United States)

    Kadatskiy, M. A.; Khishchenko, K. V.

    2015-11-01

    The results of calculations of thermodynamic properties of aluminum under shock compression in the framework of the Thomas-Fermi model, the Thomas-Fermi model with quantum and exchange corrections and the Hartree-Fock-Slater model are presented. The influences of the thermal motion and the interaction of ions are taken into account in the framework of three models: the ideal gas, the one-component plasma and the charged hard spheres. Calculations are performed in the pressure range from 1 to 10⁷ GPa. Calculated Hugoniots are compared with available experimental data.

  17. Total Quality Management (TQM) framework for e-learning based on EFQM and Kirkpatrick models

    OpenAIRE

    Jeanne Schreurs

    2006-01-01

    The EFQM excellence model is a well-known quality management tool. We have translated it to be useful in e-learning quality management. EFQM will be used as a framework for self-evaluation. We developed the e-learning stakeholder model. We identified the main criteria and positioned them in the stakeholder model. We briefly present the Kirkpatrick evaluation model of e-learning. We developed a Kirkpatrick-EFQM self-assessment framework. We propose the limited learner-centric self-assessment frame...

  18. MoVES - A Framework for Modelling and Verifying Embedded Systems

    DEFF Research Database (Denmark)

    Brekling, Aske Wiid; Hansen, Michael Reichhardt; Madsen, Jan

    2009-01-01

    The MoVES framework is being developed to assist in the early phases of embedded systems design. A system is modelled as an application running on an execution platform. The application is modelled through the individual tasks, and the execution platform is modelled through the processing elements...... consumption. A simple specification language for embedded systems and a verification backend are presented. The framework has a modular, parameterized structure supporting easy extension and adaptation of the specification language as well as of the verification backend. We show, using a number of small...... examples, how MoVES can be used to model and analyze embedded systems....

  19. A framework to establish credibility of computational models in biology.

    Science.gov (United States)

    Patterson, Eann A; Whelan, Maurice P

    2017-10-01

    Computational models in biology and biomedical science are often constructed to aid people's understanding of phenomena or to inform decisions with socioeconomic consequences. Model credibility is the willingness of people to trust a model's predictions and is often difficult to establish for computational biology models. A 3 × 3 matrix has been proposed to allow such models to be categorised with respect to their testability and epistemic foundation in order to guide the selection of an appropriate process of validation to supply evidence to establish credibility. Three approaches to validation are identified that can be deployed depending on whether a model is deemed untestable, testable or lies somewhere in between. In the latter two cases, the validation process involves the quantification of uncertainty which is a key output. The issues arising due to the complexity and inherent variability of biological systems are discussed and the creation of 'digital twins' proposed as a means to alleviate the issues and provide a more robust, transparent and traceable route to model credibility and acceptance. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.

  20. Interaction between GIS and hydrologic model: A preliminary approach using ArcHydro Framework Data Model

    Directory of Open Access Journals (Sweden)

    Silvio Jorge C. Simões

    2013-08-01

    Full Text Available In different regions of Brazil, population growth and economic development can degrade water quality, compromising watershed health and human supply. Because of its ability to combine spatial and temporal data in the same environment and to create water resources management (WRM) models, the Geographical Information System (GIS) is a powerful tool for managing water resources, preventing floods and estimating water supply. This paper discusses the integration between GIS and hydrological models and presents a case study relating to the upper section of the Paraíba do Sul Basin (São Paulo State portion), situated in the Southeast of Brazil. The case study presented in this paper has a database suitable for the basin's dimensions, including digitized topographic maps at a 1:50,000 scale. From an ArcGIS®/ArcHydro Framework Data Model, a geometric network was created to produce different raster products. The first grid derived from the digital elevation model (DEM) is the flow direction map, followed by the flow accumulation, stream and catchment maps. The next steps in this research are to include the different multipurpose reservoirs situated along the Paraíba do Sul River and to incorporate rainfall time series data in ArcHydro to build a hydrologic data model within a GIS environment in order to produce a comprehensive spatial-temporal model.
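    As an illustration of the first derived grid mentioned above, a flow-direction raster can be computed from a DEM with the common D8 rule: each cell drains to its steepest downslope neighbour among the eight surrounding cells. The sketch below is a generic D8 implementation, not the ArcHydro toolchain; the edge handling, direction encoding, and toy DEM are simplifying assumptions.

        import numpy as np

        def d8_flow_direction(dem):
            """Return, for each interior cell, the (drow, dcol) offset of the steepest
            downslope neighbour among the 8 surrounding cells (None for pits)."""
            dem = np.asarray(dem, dtype=float)
            offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
                       (0, 1), (1, -1), (1, 0), (1, 1)]
            rows, cols = dem.shape
            direction = np.full((rows, cols), None, dtype=object)
            for r in range(1, rows - 1):
                for c in range(1, cols - 1):
                    best, best_drop = None, 0.0
                    for dr, dc in offsets:
                        dist = np.hypot(dr, dc)                      # diagonal neighbours are farther away
                        drop = (dem[r, c] - dem[r + dr, c + dc]) / dist
                        if drop > best_drop:
                            best, best_drop = (dr, dc), drop
                    direction[r, c] = best
            return direction

        # Tiny synthetic DEM sloping towards the lower-right corner (values are elevations).
        dem = np.array([[9, 8, 7, 6],
                        [8, 7, 6, 5],
                        [7, 6, 5, 4],
                        [6, 5, 4, 3]])
        print(d8_flow_direction(dem)[1][1])   # expected (1, 1): steepest descent is diagonal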

  1. Next Generation Framework for Aquatic Modeling of the Earth System (NextFrAMES)

    Science.gov (United States)

    Fekete, B. M.; Wollheim, W. M.; Lakhankar, T.; Vorosmarty, C. J.

    2008-12-01

    Earth System model development is becoming an increasingly complex task. As scientists attempt to represent the physical and bio-geochemical processes and various feedback mechanisms in unprecedented detail, the models themselves are becoming increasingly complex. At the same time, the surrounding IT infrastructure needed to carry out these detailed model computations is growing increasingly complex as well. To be accurate and useful, Earth System models must manage a vast amount of data in heterogeneous computing environments ranging from single CPU systems to Beowulf-type computer clusters. Scientists developing Earth System models increasingly confront obstacles associated with IT infrastructure. Numerous development efforts are under way to ease that burden and offer model development platforms that reduce IT challenges and allow scientists to focus on their science. While these new modeling frameworks (e.g. FMS, ESMF, CCA, OpenMI) do provide solutions to many IT challenges (performing input/output, managing space and time, establishing model coupling, etc.), they are still considerably complex and often have steep learning curves. Over the course of the last fifteen years, the University of New Hampshire developed several modeling frameworks independently from the above-mentioned efforts (Data Assembler, Frameworks for Aquatic Modeling of the Earth System and NextFrAMES, which is continued at CCNY). While the UNH modeling frameworks have numerous similarities to those developed by other teams, these frameworks, in particular the latest NextFrAMES, represent a novel model development paradigm. While other modeling frameworks focus on providing services to modelers to perform various tasks, NextFrAMES strives to hide all of those services and provide a new approach for modelers to express their scientific thoughts. From a scientific perspective, most models have two core elements: the overall model structure (defining the linkages between the simulated processes

  2. Open Models of Decision Support Towards a Framework

    OpenAIRE

    Diasio, Stephen Ray

    2012-01-01

    This thesis presents a framework for open models of decision support in organizations. The work takes the form of a compendium of articles that analyse the inflows and outflows of knowledge in organizations, as well as existing decision support technologies. The underlying factors driving new models for open forms of decision support are presented. The thesis presents a study of the different typologies of decision support mod...

  3. Extending the Modelling Framework for Gas-Particle Systems

    DEFF Research Database (Denmark)

    Rosendahl, Lasse Aistrup

    , with very good results. Single particle combustion has been tested using a number of different particle combustion models applied to coal and straw particles. Comparing the results of these calculations to measurements on straw burnout, the results indicate that for straw, existing heterogeneous combustion...... models perform well, and may be used in high temperature ranges. Finally, the particle tracking and combustion model is applied to an existing coal and straw co-fuelled burner. The results indicate that again, the straw follows very different trajectories than the coal particles, and also that burnout...

  4. FAULT DIAGNOSIS APPROACH BASED ON HIDDEN MARKOV MODEL AND SUPPORT VECTOR MACHINE

    Institute of Scientific and Technical Information of China (English)

    LIU Guanjun; LIU Xinmin; QIU Jing; HU Niaoqing

    2007-01-01

    Aiming at solving the problems of machine learning in fault diagnosis, a diagnostic approach based on the hidden Markov model (HMM) and the support vector machine (SVM) is proposed. The HMM describes intra-class variation well and is good at dealing with continuous dynamic signals, while the SVM expresses inter-class differences effectively and has strong classification ability. The proposed approach builds on the merits of both. An experiment is then carried out on the transmission system of a helicopter. Using features extracted from gearbox vibration signals, this HMM-SVM based diagnostic approach is trained and used to monitor and diagnose the gearbox's faults. The results show that this method achieves higher diagnostic accuracy with small training samples than either HMM-based or SVM-based diagnosis alone.
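    One common way to realize such a combination (not necessarily the authors' exact scheme) is to train one HMM per fault class and feed the per-class log-likelihoods of a vibration feature sequence to an SVM that draws the inter-class boundary. The sketch below assumes the hmmlearn and scikit-learn libraries; the class names, model sizes, and synthetic stand-in features are placeholders.

        import numpy as np
        from hmmlearn.hmm import GaussianHMM
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        FAULT_CLASSES = ["normal", "gear_crack"]          # placeholder class names

        def fake_sequences(mean, n_seq=20, length=40, dim=4):
            """Stand-in for vibration feature sequences (e.g. spectral or wavelet features)."""
            return [mean + rng.normal(0, 1, size=(length, dim)) for _ in range(n_seq)]

        train = {"normal": fake_sequences(0.0), "gear_crack": fake_sequences(1.5)}

        # 1) One HMM per fault class, trained on that class's sequences.
        hmms = {}
        for label, seqs in train.items():
            X = np.vstack(seqs)
            lengths = [len(s) for s in seqs]
            hmms[label] = GaussianHMM(n_components=3, n_iter=20).fit(X, lengths)

        # 2) Per-class HMM log-likelihoods become the feature vector for the SVM.
        def hmm_features(seq):
            return np.array([hmms[label].score(seq) for label in FAULT_CLASSES])

        X_svm, y_svm = [], []
        for label, seqs in train.items():
            for s in seqs:
                X_svm.append(hmm_features(s))
                y_svm.append(label)

        svm = SVC(kernel="rbf").fit(X_svm, y_svm)

        # 3) Diagnose a new vibration sequence.
        test_seq = fake_sequences(1.5, n_seq=1)[0]
        print(svm.predict([hmm_features(test_seq)]))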

  5. Model Components of the Certification Framework for Geologic Carbon Sequestration Risk Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Oldenburg, Curtis M.; Bryant, Steven L.; Nicot, Jean-Philippe; Kumar, Navanit; Zhang, Yingqi; Jordan, Preston; Pan, Lehua; Granvold, Patrick; Chow, Fotini K.

    2009-06-01

    We have developed a framework for assessing the leakage risk of geologic carbon sequestration sites. This framework, known as the Certification Framework (CF), emphasizes wells and faults as the primary potential leakage conduits. Vulnerable resources are grouped into compartments, and impacts due to leakage are quantified by the leakage flux or concentrations that could potentially occur in compartments under various scenarios. The CF utilizes several model components to simulate leakage scenarios. One model component is a catalog of results of reservoir simulations that can be queried to estimate plume travel distances and times, rather than requiring CF users to run new reservoir simulations for each case. Other model components developed for the CF and described here include fault characterization using fault-population statistics; fault connection probability using fuzzy rules; well-flow modeling with a drift-flux model implemented in TOUGH2; and atmospheric dense-gas dispersion using a mesoscale weather prediction code.

  6. Development of a practical modeling framework for estimating the impact of wind technology on bird populations

    Energy Technology Data Exchange (ETDEWEB)

    Morrison, M.L. [California State Univ., Sacramento, CA (United States); Pollock, K.H. [North Carolina State Univ., Raleigh, NC (United States)

    1997-11-01

    One of the most pressing environmental concerns related to wind project development is the potential for avian fatalities caused by the turbines. The goal of this project is to develop a useful, practical modeling framework for evaluating potential wind power plant impacts that can be generalized to most bird species. This modeling framework could be used to get a preliminary understanding of the likelihood of significant impacts to birds, in a cost-effective way. The authors accomplish this by (1) reviewing the major factors that can influence the persistence of a wild population; (2) briefly reviewing various models that can aid in estimating population status and trend, including methods of evaluating model structure and performance; (3) reviewing survivorship and population projections; and (4) developing a framework for using models to evaluate the potential impacts of wind development on birds.

  8. Non-local first-order modelling of crowd dynamics: a multidimensional framework with applications

    CERN Document Server

    Bruno, Luca; Tricerri, Paolo; Venuti, Fiammetta

    2010-01-01

    In this work a physical modelling framework is presented, describing the intelligent, non-local, and anisotropic behaviour of pedestrians. Its phenomenological basics and constitutive elements are detailed, and a qualitative analysis is provided. Within this common framework, two first-order mathematical models, along with related numerical solution techniques, are derived. The models are oriented to specific real world applications: a one-dimensional model of crowd-structure interaction in footbridges and a two-dimensional model of pedestrian flow in an underground station with several obstacles and exits. The noticeable heterogeneity of the applications demonstrates the significance of the physical framework and its versatility in addressing different engineering problems. The results of the simulations point out the key role played by the physiological and psychological features of human perception on the overall crowd dynamics.

  9. Effective Thermal Conductivity Modeling of Sandstones: SVM Framework Analysis

    Science.gov (United States)

    Rostami, Alireza; Masoudi, Mohammad; Ghaderi-Ardakani, Alireza; Arabloo, Milad; Amani, Mahmood

    2016-06-01

    Among the most significant physical characteristics of porous media, the effective thermal conductivity (ETC) is used for estimating the efficiency of thermal enhanced oil recovery processes, hydrocarbon reservoir thermal design, and numerical simulation. This paper reports the implementation of an innovative least squares support vector machine (LS-SVM) algorithm for the development of an enhanced model capable of predicting the ETCs of dry sandstones. By means of several statistical parameters, the validity of the presented model was evaluated. The prediction of the developed model for determining the ETCs of dry sandstones was in excellent agreement with the reported data, with a coefficient of determination (R²) of 0.983 and an average absolute relative deviation of 0.35 %. Results from the present research show that the proposed LS-SVM model is robust, reliable, and efficient in calculating the ETCs of sandstones.
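
    The record names the LS-SVM algorithm without giving a formulation; the sketch below shows the standard least-squares SVM regression setup (solving the linear KKT system with an RBF kernel) that such a model typically rests on. The kernel width, regularisation value and variable names are illustrative assumptions rather than the settings used by the authors.

      # Minimal LS-SVM regression sketch (RBF kernel), assuming the standard
      # Suykens formulation; gamma and sigma values are illustrative only.
      import numpy as np

      def rbf_kernel(A, B, sigma=1.0):
          d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
          return np.exp(-d2 / (2.0 * sigma ** 2))

      def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
          n = len(y)
          K = rbf_kernel(X, X, sigma)
          # KKT system: [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
          A = np.zeros((n + 1, n + 1))
          A[0, 1:] = 1.0
          A[1:, 0] = 1.0
          A[1:, 1:] = K + np.eye(n) / gamma
          rhs = np.concatenate(([0.0], y))
          sol = np.linalg.solve(A, rhs)
          return sol[0], sol[1:]          # bias b, dual weights alpha

      def lssvm_predict(X_train, b, alpha, X_new, sigma=1.0):
          return rbf_kernel(X_new, X_train, sigma) @ alpha + b

      # Toy usage with made-up sandstone features and ETC targets:
      # b, alpha = lssvm_fit(X_train, y_train)
      # y_hat = lssvm_predict(X_train, b, alpha, X_test)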

  10. A PROBABILISTIC MODELING FRAMEWORK FOR PREDICTING POPULATION EXPOSURES TO BENZENE

    Science.gov (United States)

    The US Environmental Protection Agency (EPA) is modifying their probabilistic Stochastic Human Exposure Dose Simulation (SHEDS) model to assess aggregate exposures to air toxics. Air toxics include urban Hazardous Air Pollutants (HAPS) such as benzene from mobile sources, part...

  11. A parametric framework for modelling of bioelectrical signals

    CERN Document Server

    Mughal, Yar Muhammad

    2016-01-01

    This book examines non-invasive, electrical-based methods for disease diagnosis and assessment of heart function. In particular, a formalized signal model is proposed since this offers several advantages over methods that rely on measured data alone. By using a formalized representation, the parameters of the signal model can be easily manipulated and/or modified, thus providing mechanisms that allow researchers to reproduce and control such signals. In addition, having such a formalized signal model makes it possible to develop computer tools that can be used for manipulating and understanding how signal changes result from various heart conditions, as well as for generating input signals for experimenting with and evaluating the performance of e.g. signal extraction methods. The work focuses on bioelectrical information, particularly electrical bio-impedance (EBI). Once the EBI has been measured, the corresponding signals have to be modelled for analysis. This requires a structured approach in order to move...

  12. Model Adaptation for Prognostics in a Particle Filtering Framework

    Data.gov (United States)

    National Aeronautics and Space Administration — One of the key motivating factors for using particle filters for prognostics is the ability to include model parameters as part of the state vector to be estimated....

  13. A Framework for Structural Equation Models in General Pedigrees

    National Research Council Canada - National Science Library

    Morris, Nathan J; Elston, Robert C; Stein, Catherine M

    Background/Aims: Structural Equation Modeling (SEM) is an analysis approach that accounts for both the causal relationships between variables and the errors associated with the measurement of these variables...

  14. Integrated Modeling Framework for Anthropometry and Physiology Virtual Body

    Science.gov (United States)

    2007-06-01

    The report references crash-dummy (HYBRID III and THOR) databases, the generation of geometrical models, and a state-of-the-art instrumented thorax for blast simulations; early humans were represented as simple articulated bodies made of segments, with applications including performance assessment, de-mining and ballistic protection. Cited resource: ATB 1998, Articulated Total Body Model, Version V (www.cfdrc.com).

  15. Multi-Fidelity Framework for Modeling Combustion Instability

    Science.gov (United States)

    2016-07-27

    by integrating a reduced-order model (ROM) for combustion response into the linearized Euler equations. The ROM is developed from CFD simulations...distinguishable instability behaviors. The coupling between the ROM and the Euler equations requires two-way information transfer between the two...

  16. NSME: a framework for network worm modeling and simulation

    OpenAIRE

    Lin, Siming; Cheng, Xueqi

    2006-01-01

    Various worms have a devastating impact on Internet. Packet level network modeling and simulation has become an approach to find effective countermeasures against worm threat. However, current alternatives are not fit enough for this purpose. For instance, they mostly focus on the details of lower layers of the network so that the abstraction of application layer is very coarse. In our work, we propose a formal description of network and worm models, and define network virtualization level...

  17. A fuzzy rule based framework for noise annoyance modeling.

    Science.gov (United States)

    Botteldooren, Dick; Verkeyn, Andy; Lercher, Peter

    2003-09-01

    Predicting the effect of noise on individual people and small groups is an extremely difficult task due to the influence of a multitude of factors that vary from person to person and from context to context. Moreover, noise annoyance is inherently a vague concept. That is why, in this paper, it is argued that noise annoyance models should identify a fuzzy set of possible effects rather than seek a very accurate crisp prediction. Fuzzy rule based models seem ideal candidates for this task. This paper provides the theoretical background for building these models. Existing empirical knowledge is used to extract a few typical rules that allow making the model more specific for small groups of individuals. The resulting model is tested on two large-scale social surveys augmented with exposure simulations. The testing demonstrates how this new way of thinking about noise effect modeling can be used in practice both in management support as a "noise annoyance adviser" and in social science for testing hypotheses such as the effect of noise sensitivity or the degree of urbanization.
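
    As an illustration of the kind of fuzzy rule base the abstract refers to, the toy sketch below maps a noise level and a sensitivity score to a fuzzy set over annoyance using two Mamdani-style rules. The membership functions, rule set and breakpoints are invented for illustration and are not taken from the paper.

      # Toy Mamdani-style fuzzy rule sketch: noise level (dB) and a sensitivity score
      # are mapped to a fuzzy set over annoyance. Membership functions and the two
      # rules are invented for illustration, not taken from the paper.
      import numpy as np

      def tri(x, a, b, c):
          """Triangular membership function with support [a, c] and peak at b."""
          return np.maximum(np.minimum((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

      def annoyance_possibility(noise_db, sensitivity):
          # Antecedent memberships (illustrative breakpoints).
          noise_high = tri(noise_db, 55, 70, 85)
          noise_low  = tri(noise_db, 30, 45, 60)
          sens_high  = tri(sensitivity, 0.4, 0.8, 1.2)   # sensitivity assumed in [0, 1]

          # Output universe: annoyance score 0..10 with "low" / "high" fuzzy sets.
          annoy = np.linspace(0, 10, 101)
          out_low  = tri(annoy, 0, 2, 5)
          out_high = tri(annoy, 5, 8, 10)

          # Rule 1: IF noise is high AND sensitivity is high THEN annoyance is high.
          # Rule 2: IF noise is low THEN annoyance is low.
          r1 = min(noise_high, sens_high)
          r2 = noise_low
          # Aggregate the clipped consequents into one possibility distribution.
          return annoy, np.maximum(np.minimum(out_high, r1), np.minimum(out_low, r2))

      # e.g. annoy, mu = annoyance_possibility(72.0, 0.9)  -> fuzzy set over annoyance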

  18. A full annual cycle modeling framework for American black ducks

    Science.gov (United States)

    Robinson, Orin J.; McGowan, Conor; Devers, Patrick K.; Brook, Rodney W.; Huang, Min; Jones, Malcom; McAuley, Daniel G.; Zimmerman, Guthrie

    2016-01-01

    American black ducks (Anas rubripes) are a harvested, international migratory waterfowl species in eastern North America. Despite an extended period of restrictive harvest regulations, the black duck population is still below the population goal identified in the North American Waterfowl Management Plan (NAWMP). It has been hypothesized that density-dependent factors restrict population growth in the black duck population and that habitat management (increases, improvements, etc.) may be a key component of growing black duck populations and reaching the prescribed NAWMP population goal. Using banding data from 1951 to 2011 and breeding population survey data from 1990 to 2014, we developed a full annual cycle population model for the American black duck. This model uses the seven management units as set by the Black Duck Joint Venture, allows movement into and out of each unit during each season, and models survival and fecundity for each region separately. We compare model population trajectories with observed population data and abundance estimates from the breeding season counts to show the accuracy of this full annual cycle model. With this model, we then show how to simulate the effects of habitat management on the continental black duck population.

  19. Model Adaptation for Prognostics in a Particle Filtering Framework

    Directory of Open Access Journals (Sweden)

    Bhaskar Saha

    2011-01-01

    Full Text Available One of the key motivating factors for using particle filters for prognostics is the ability to include model parameters as part of the state vector to be estimated. This performs model adaptation in conjunction with state tracking, and thus produces a tuned model that can be used for long-term predictions. This feature of particle filters works in large part because they are not subject to the “curse of dimensionality”, i.e. the exponential growth of computational complexity with state dimension. However, in practice, this property holds only for “well-designed” particle filters as dimensionality increases. This paper explores the notion of wellness of design in the context of predicting remaining useful life for individual discharge cycles of Li-ion and Li-polymer batteries. Prognostic metrics are used to analyze the tradeoff between different model designs and prediction performance. Results demonstrate how sensitivity analysis may be used to arrive at a well-designed prognostic model that can take advantage of the model adaptation properties of a particle filter.
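
    A minimal sketch of the augmented-state idea described above follows: the unknown model parameter is appended to the state vector so that resampling adapts the model while the state is tracked. The exponential capacity-fade dynamics, noise levels and measurements are stand-ins, not the battery model or data used in the paper.

      # Minimal particle-filter sketch with the model parameter appended to the
      # state vector, so the filter adapts the model while tracking the state.
      import numpy as np

      rng = np.random.default_rng(0)

      def propagate(particles, dt=1.0, q_state=0.01, q_param=0.001):
          """particles: (N, 2) array of [state x, decay-rate parameter k]."""
          x, k = particles[:, 0], particles[:, 1]
          x_new = x * np.exp(-k * dt) + rng.normal(0, q_state, x.shape)
          k_new = k + rng.normal(0, q_param, k.shape)      # artificial parameter evolution
          return np.column_stack([x_new, k_new])

      def update(particles, weights, z, r=0.05):
          """Reweight particles by the likelihood of measurement z of the state."""
          lik = np.exp(-0.5 * ((z - particles[:, 0]) / r) ** 2)
          w = weights * lik
          return w / (w.sum() + 1e-300)

      def resample(particles, weights):
          idx = rng.choice(len(weights), size=len(weights), p=weights)
          return particles[idx], np.full(len(weights), 1.0 / len(weights))

      # One filtering pass over a stream of (made-up) measurements:
      N = 500
      particles = np.column_stack([rng.normal(1.0, 0.1, N), rng.uniform(0.01, 0.2, N)])
      weights = np.full(N, 1.0 / N)
      for z in [0.95, 0.90, 0.86]:
          particles = propagate(particles)
          weights = update(particles, weights, z)
          particles, weights = resample(particles, weights)
      estimate = weights @ particles          # joint state + parameter estimate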

  20. Multiscale Model of Colorectal Cancer Using the Cellular Potts Framework.

    Science.gov (United States)

    Osborne, James M

    2015-01-01

    Colorectal cancer (CRC) is one of the major causes of death in the developed world and forms a canonical example of tumorigenesis. CRC arises from a string of mutations of individual cells in the colorectal crypt, making it particularly suited for multiscale multicellular modeling, where mutations of individual cells can be clearly represented and their effects readily tracked. In this paper, we present a multicellular model of the onset of colorectal cancer, utilizing the cellular Potts model (CPM). We use the model to investigate how, through the modification of their mechanical properties, mutant cells colonize the crypt. Moreover, we study the influence of mutations on the shape of cells in the crypt, suggesting possible cell- and tissue-level indicators for identifying early-stage cancerous crypts. Crucially, we discuss the effect that the motility parameters of the model (key factors in the behavior of the CPM) have on the distribution of cells within a homeostatic crypt, resulting in an optimal parameter regime that accurately reflects biological assumptions. In summary, the key results of this paper are 1) how to couple the CPM with processes occurring on other spatial scales, using the example of the crypt to motivate suitable motility parameters; 2) modeling mutant cells with the CPM; 3) and investigating how mutations influence the shape of cells in the crypt.

  1. Design theoretic analysis of three system modeling frameworks.

    Energy Technology Data Exchange (ETDEWEB)

    McDonald, Michael James

    2007-05-01

    This paper analyzes three simulation architectures from the context of modeling scalability to address System of System (SoS) and Complex System problems. The paper first provides an overview of the SoS problem domain and reviews past work in analyzing model and general system complexity issues. It then identifies and explores the issues of vertical and horizontal integration as well as coupling and hierarchical decomposition as the system characteristics and metrics against which the tools are evaluated. In addition, it applies Nam Suh's Axiomatic Design theory as a construct for understanding coupling and its relationship to system feasibility. Next it describes the application of MATLAB, Swarm, and Umbra (three modeling and simulation approaches) to modeling swarms of Unmanned Flying Vehicle (UAV) agents in relation to the chosen characteristics and metrics. Finally, it draws general conclusions for analyzing model architectures that go beyond those analyzed. In particular, it identifies decomposition along phenomena of interaction and modular system composition as enabling features for modeling large heterogeneous complex systems.

  2. Utilisation of theoretical models and frameworks in the process of evidence synthesis.

    Science.gov (United States)

    Godfrey, Christina M; Harrison, Margaret B; Graham, Ian D; Ross-White, Amanda

    2010-01-01

    A systematic review is a comprehensive enquiry or study of secondary data sources. There is a research question, an a priori articulation of methods and a set of procedures to focus the investigation. Despite these rigorous structures to guide the review, synthesising evidence is a challenging, resource-intense and time-consuming process. Large volumes of information complicate not only the search functions, but also the conceptualisation of the evidence needed to create concise and integrated results. Use of a theoretical model or framework could serve as an essential element in effectively focusing the review and designing the methods to respond to the knowledge question. This scoping review sought to confirm the value of models or frameworks used by authors working within traditional methodologies for evidence synthesis. Types of participants: the focus of this review was on the context of health care. Types of intervention(s)/phenomena of interest: all studies that discussed models or frameworks used specifically to address the process of synthesis were included. Types of studies: discussion, scholarship or methodology papers and reviews were included. Types of outcome: all theoretical models or frameworks were described, with specific attention to the purpose of the framework for each study, and the contribution of the framework to the process of synthesis. The search strategy aimed to find both published and unpublished studies. A three-step search strategy was utilised. The databases for published material included CINAHL, Medline, EMBASE, PsycINFO, AMED, Cochrane, BioMed Central, Scirus and Mednar. Databases for unpublished material included Dissertation Abstracts, Sociological Abstracts and conference proceedings. The review was a focused scoping review to locate and describe the contribution of theoretical models or frameworks to the process of synthesis. The methodological quality of the discussion papers was therefore not assessed. Data was extracted from

  3. Towards uncertainty quantification and parameter estimation for Earth system models in a component-based modeling framework

    Science.gov (United States)

    Peckham, Scott D.; Kelbert, Anna; Hill, Mary C.; Hutton, Eric W. H.

    2016-05-01

    Component-based modeling frameworks make it easier for users to access, configure, couple, run and test numerical models. However, they do not typically provide tools for uncertainty quantification or data-based model verification and calibration. To better address these important issues, modeling frameworks should be integrated with existing, general-purpose toolkits for optimization, parameter estimation and uncertainty quantification. This paper identifies and then examines the key issues that must be addressed in order to make a component-based modeling framework interoperable with general-purpose packages for model analysis. As a motivating example, one of these packages, DAKOTA, is applied to a representative but nontrivial surface process problem of comparing two models for the longitudinal elevation profile of a river to observational data. Results from a new mathematical analysis of the resulting nonlinear least squares problem are given and then compared to results from several different optimization algorithms in DAKOTA.
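
    As a generic illustration of casting such a model-observation comparison as a nonlinear least-squares problem, the sketch below fits a simple longitudinal elevation profile to synthetic observations with SciPy. The exponential profile, parameter names and data are purely illustrative and are not the river models, the DAKOTA workflow, or the data analysed in the paper.

      # Generic nonlinear least-squares sketch of fitting a longitudinal elevation
      # profile model to observations; the exponential profile and parameter names
      # are illustrative only, not the models compared in the paper.
      import numpy as np
      from scipy.optimize import least_squares

      def profile_model(params, x):
          """Illustrative model: elevation decays exponentially downstream."""
          z0, k, z_base = params
          return z_base + z0 * np.exp(-k * x)

      def residuals(params, x_obs, z_obs):
          return profile_model(params, x_obs) - z_obs

      x_obs = np.linspace(0, 100e3, 25)      # distance downstream (m)
      z_obs = 500 * np.exp(-x_obs / 40e3) + 20 + np.random.default_rng(1).normal(0, 5, 25)

      fit = least_squares(residuals, x0=[400.0, 1e-5, 0.0], args=(x_obs, z_obs))
      print("estimated parameters:", fit.x, "cost:", fit.cost)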

  4. Building an Open Source Framework for Integrated Catchment Modeling

    Science.gov (United States)

    Jagers, B.; Meijers, E.; Villars, M.

    2015-12-01

    In order to develop effective strategies and associated policies for environmental management, we need to understand the dynamics of the natural system as a whole and the human role therein. This understanding is gained by comparing our mental model of the world with observations from the field. However, to properly understand the system we should look at dynamics of water, sediments, water quality, and ecology throughout the whole system from catchment to coast both at the surface and in the subsurface. Numerical models are indispensable in helping us understand the interactions of the overall system, but we need to be able to update and adjust them to improve our understanding and test our hypotheses. To support researchers around the world with this challenging task we started a few years ago with the development of a new open source modeling environment DeltaShell that integrates distributed hydrological models with 1D, 2D, and 3D hydraulic models including generic components for the tracking of sediment, water quality, and ecological quantities throughout the hydrological cycle composed of the aforementioned components. The open source approach combined with a modular approach based on open standards, which allow for easy adjustment and expansion as demands and knowledge grow, provides an ideal starting point for addressing challenging integrated environmental questions.

  5. Implementation of a PETN failure model using ARIA's general chemistry framework

    Energy Technology Data Exchange (ETDEWEB)

    Hobbs, Michael L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-01-01

    We previously developed a PETN thermal decomposition model that accurately predicts thermal ignition and detonator failure [1]. This model was originally developed for CALORE [2] and required several complex user subroutines. Recently, a simplified version of the PETN decomposition model was implemented into ARIA [3] using a general chemistry framework without need for user subroutines. Detonator failure was also predicted with this new model using ENCORE. The model was simplified by 1) basing the model on moles rather than mass, 2) simplifying the thermal conductivity model, and 3) implementing ARIA’s new phase change model. This memo briefly describes the model, implementation, and validation.

  6. A Flexible Atmospheric Modeling Framework for the CESM

    Energy Technology Data Exchange (ETDEWEB)

    Randall, David [Colorado State University; Heikes, Ross [Colorado State University; Konor, Celal [Colorado State University

    2014-11-12

    We have created two global dynamical cores based on the unified system of equations and Z-grid staggering on an icosahedral grid, which are collectively called UZIM (Unified Z-grid Icosahedral Model). The z-coordinate version (UZIM-height) can be run in hydrostatic and nonhydrostatic modes. The sigma-coordinate version (UZIM-sigma) runs in only hydrostatic mode. The super-parameterization has been included as a physics option in both models. The UZIM versions with the super-parameterization are called SUZI. With SUZI-height, we have completed aquaplanet runs. With SUZI-sigma, we are making aquaplanet runs and realistic climate simulations. SUZI-sigma includes realistic topography and a SiB3 model to parameterize the land-surface processes.

  7. A CONCEPTUAL FRAMEWORK FOR SUSTAINABLE POULTRY SUPPLY CHAIN MODEL

    Directory of Open Access Journals (Sweden)

    Mohammad SHAMSUDDOHA

    2013-12-01

    Full Text Available Nowadays, the sustainable supply chain is a crucial consideration for future-focused industries. Attention to supply chain management has increased markedly since the 1980s, when firms discovered the benefits of mutual relationships within and beyond their own organization. For this reason, researchers are working to develop new theories and models that might help the corporate sector achieve sustainability in its supply chains. This growing interest is reflected in the number of papers published, in particular by journals, since 1980. The objectives of this paper are twofold. First, it offers a literature review of sustainable supply chain management covering papers published in the last three decades. Second, it offers a conceptual sustainable supply chain process model in light of triple bottom line theory. The model was developed through in-depth interviews with an entrepreneur from a poultry case industry in Bangladesh.

  8. A Design of a Network Intrusion Detection Algorithm Based on HMM and a Supervised Self-Organizing Map Network

    Institute of Scientific and Technical Information of China (English)

    李志坚

    2016-01-01

    To effectively safeguard network security and overcome the inability of traditional intrusion detection systems to accurately identify attack categories, a hybrid network intrusion detection method based on a supervised SOM network and an HMM is designed. Simulation experiments show that the proposed method detects network intrusions effectively and maintains a high detection rate even when the number of training samples is small, giving it a clear advantage over other detection methods.

  9. Isolated Word Recognition From In-Ear Microphone Data Using Hidden Markov Models (HMM)

    Science.gov (United States)

    2006-03-01

    word_folder = dir; cd(w); %zlen = []; p = 1;
    for k = 3:length(word_folder)
        datapath = strcat(wpath, '\', word_folder(k).name);
        [datacropped, speechlength, speechflag] = endpoint1(datapath); % Calls the ... endpoint detection function.
        fprintf(1, '... Cropping file: %s and length, %6.2f\

  10. A Framework for Modelling Connective Tissue Changes in VIIP Syndrome

    Science.gov (United States)

    Ethier, C. R.; Best, L.; Gleason, R.; Mulugeta, L.; Myers, J. G.; Nelson, E. S.; Samuels, B. C.

    2014-01-01

    Insertion of astronauts into microgravity induces a cascade of physiological adaptations, notably including a cephalad fluid shift. Longer-duration flights carry an increased risk of developing Visual Impairment and Intracranial Pressure (VIIP) syndrome, a spectrum of ophthalmic changes including posterior globe flattening, choroidal folds, distension of the optic nerve sheath, kinking of the optic nerve and potentially permanent degradation of visual function. The slow onset of changes in VIIP, their chronic nature, and the similarity of certain clinical features of VIIP to ophthalmic findings in patients with raised intracranial pressure strongly suggest that: (i) biomechanical factors play a role in VIIP, and (ii) connective tissue remodeling must be accounted for if we wish to understand the pathology of VIIP. Our goal is to elucidate the pathophysiology of VIIP and suggest countermeasures based on biomechanical modeling of ocular tissues, suitably informed by experimental data, and followed by validation and verification. We specifically seek to understand the quasi-homeostatic state that evolves over weeks to months in space, during which ocular tissue remodeling occurs. This effort is informed by three bodies of work: (i) modeling of cephalad fluid shifts; (ii) modeling of ophthalmic tissue biomechanics in glaucoma; and (iii) modeling of connective tissue changes in response to biomechanical loading.

  11. Understanding organizational congruence: formal model and simulation framework.

    NARCIS (Netherlands)

    Dignum, M.V.; Dignum, F.P.M.

    2008-01-01

    Despite a large number of studies, the effect of organizational structure on the performance and the individual cognition of its members is still not well understood. Our research aims at developing tools and formalisms to model organizations and evaluate their performance under different circumstances.

  12. An Active Lattice Model in a Bayesian Framework

    DEFF Research Database (Denmark)

    Carstensen, Jens Michael

    1996-01-01

    by penalizing deviations in alignment and lattice node distance. The Markov random field represents prior knowledge about the lattice structure, and through an observation model that incorporates the visual appearance of the nodes, we can simulate realizations from the posterior distribution. A maximum...

  13. A Framework for Non-Gaussian Signal Modeling and Estimation

    Science.gov (United States)

    1999-06-01

    the minimum entropy estimator," Trabajos de Estadistica, vol. 19, pp. 55-65, 1968. ... Nonparametric Function Estimation, Modeling, and Simulation. Philadelphia: Society for Industrial and Applied Mathematics, 1990. [200] D. M. Titterington

  14. Levine's Conservation Model: A Framework for Advanced Gerontology Nursing Practice.

    Science.gov (United States)

    Abumaria, Ibrahim Mahmoud; Hastings-Tolsma, Marie; Sakraida, Teresa J

    2015-01-01

    Growing numbers of older adults place increased demands on already burdened healthcare systems. The cost of managing chronic illnesses mandates greater emphasis on management and prevention. This article explores the adaptation of Levine's Conservation Model as a structure for providing care to the older adult by the adult-gerontology primary care nurse practitioner (AGNP). The AGNP role, designed to provide quality care to adult and older adult populations, offers the opportunity to not only manage health care of the elderly, but to also advocate, lead in collaborative care efforts, conduct advanced planning, and manage and negotiate health delivery systems. The use of nursing models can foster the design of effective interventions that promote health of the older adult, particularly in the long-term care environment. Levine's Conservation Model provides a useful structure for older adult care in the long-term care setting. As an ideal care manager, the AGNP would be well served to consider use of the model to guide advanced nursing practice. Recommendations for clinical practice, research, and health policy. © 2014 Wiley Periodicals, Inc.

  15. Spectral element modelling of floating bodies in a Boussinesq framework

    DEFF Research Database (Denmark)

    Engsig-Karup, Allan Peter; Eskilsson, Claes; Ricchiuto, Mario

    The wave energy sector relies heavily on the use of linear hydrodynamic models for the assessment of motions, loads and power production. The linear codes are computationally efficient and produce good results if applied within their application window. However, recent studies using two-phase VOF...

  16. A Framework for the Modelling of Biphasic Reacting Systems

    DEFF Research Database (Denmark)

    Anantpinijwatna, Amata; Sin, Gürkan; O’Connell, John P.

    2014-01-01

    Biphasic reacting systems have a broad application range from organic reactions in pharmaceutical and agro-bio industries to CO 2 capture. However, mathematical modelling of biphasic reacting systems is a formidable challenge due to many phenomena underlying the process such as chemical equilibri...... systems: a PTC-based reaction system and pseudo-PTC system....

  17. An Empirical Generative Framework for Computational Modeling of Language Acquisition

    Science.gov (United States)

    Waterfall, Heidi R.; Sandbank, Ben; Onnis, Luca; Edelman, Shimon

    2010-01-01

    This paper reports progress in developing a computer model of language acquisition in the form of (1) a generative grammar that is (2) algorithmically learnable from realistic corpus data, (3) viable in its large-scale quantitative performance and (4) psychologically real. First, we describe new algorithmic methods for unsupervised learning of…

  18. Quasi-continuous stochastic simulation framework for flood modelling

    Science.gov (United States)

    Moustakis, Yiannis; Kossieris, Panagiotis; Tsoukalas, Ioannis; Efstratiadis, Andreas

    2017-04-01

    Typically, flood modelling in the context of everyday engineering practices is addressed through event-based deterministic tools, e.g., the well-known SCS-CN method. A major shortcoming of such approaches is the ignorance of uncertainty, which is associated with the variability of soil moisture conditions and the variability of rainfall during the storm event. In event-based modeling, the sole expression of uncertainty is the return period of the design storm, which is assumed to represent the acceptable risk of all output quantities (flood volume, peak discharge, etc.). On the other hand, the varying antecedent soil moisture conditions across the basin are represented by means of scenarios (e.g., the three AMC types by SCS), while the temporal distribution of rainfall is represented through standard deterministic patterns (e.g., the alternative blocks method). In order to address these major inconsistencies, simultaneously preserving the simplicity and parsimony of the SCS-CN method, we have developed a quasi-continuous stochastic simulation approach, comprising the following steps: (1) generation of synthetic daily rainfall time series; (2) update of potential maximum soil moisture retention, on the basis of accumulated five-day rainfall; (3) estimation of daily runoff through the SCS-CN formula, using as inputs the daily rainfall and the updated value of soil moisture retention; (4) selection of extreme events and application of the standard SCS-CN procedure for each specific event, on the basis of synthetic rainfall. This scheme requires the use of two stochastic modelling components, namely the CastaliaR model, for the generation of synthetic daily data, and the HyetosMinute model, for the disaggregation of daily rainfall to finer temporal scales. Outcomes of this approach are a large number of synthetic flood events, allowing for expressing the design variables in statistical terms and thus properly evaluating the flood risk.
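
    A minimal sketch of the daily SCS-CN step of this scheme (steps 2 and 3 above) is given below: the maximum retention is updated from the accumulated five-day antecedent rainfall before the daily runoff is computed with the standard SCS-CN formula. The curve number, the antecedent-moisture thresholds and the 0.2 initial-abstraction ratio are illustrative defaults, not the values used by the authors.

      # Sketch of the daily SCS-CN step of the quasi-continuous scheme: the maximum
      # retention S is updated from the accumulated five-day antecedent rainfall and
      # daily runoff follows the standard SCS-CN formula. CN values and the AMC
      # thresholds are illustrative, not those used by the authors.
      import numpy as np

      def cn_to_s(cn):
          return 25400.0 / cn - 254.0          # maximum retention S in mm

      def daily_runoff(rain_mm, cn_normal=75.0, lam=0.2):
          rain = np.asarray(rain_mm, float)
          runoff = np.zeros_like(rain)
          for t in range(len(rain)):
              antecedent = rain[max(0, t - 5):t].sum()     # five-day antecedent rainfall
              if antecedent < 13.0:                        # AMC I (dry), illustrative threshold
                  cn = cn_normal / (2.281 - 0.01281 * cn_normal)
              elif antecedent > 28.0:                      # AMC III (wet), illustrative threshold
                  cn = cn_normal / (0.427 + 0.00573 * cn_normal)
              else:                                        # AMC II (normal)
                  cn = cn_normal
              s = cn_to_s(cn)
              ia = lam * s                                 # initial abstraction
              p = rain[t]
              runoff[t] = (p - ia) ** 2 / (p - ia + s) if p > ia else 0.0
          return runoff

      # e.g. q = daily_runoff(synthetic_daily_rainfall_mm)  -> daily runoff depths (mm)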

  19. Generic framework for mining cellular automata models on protein-folding simulations.

    Science.gov (United States)

    Diaz, N; Tischer, I

    2016-05-13

    Cellular automata model identification is an important way of building simplified simulation models. In this study, we describe a generic architectural framework to ease the development process of new metaheuristic-based algorithms for cellular automata model identification in protein-folding trajectories. Our framework was developed by a methodology based on design patterns that allow an improved experience for new algorithms development. The usefulness of the proposed framework is demonstrated by the implementation of four algorithms, able to obtain extremely precise cellular automata models of the protein-folding process with a protein contact map representation. Dynamic rules obtained by the proposed approach are discussed, and future use for the new tool is outlined.

  20. Multiple HMM Fuzzy Integral Algorithm for Web Access Prediction

    Institute of Scientific and Technical Information of China (English)

    陈铁军; 覃征; 贺升平

    2008-01-01

    Predicting which page a user will browse next is an important research direction on the WWW. A user browsing-page prediction model is proposed that fuses the prediction results of multiple-order HMMs (Hidden Markov Models) through a fuzzy integral. The algorithm first extends the prior-information space of classical fusion prediction algorithms: different user browsing patterns are classified, a first-order multi-Markov-chain model is built, and its training results are used as weighting indices; the prediction results of the first- to Nth-order HMMs are then fused by means of fuzzy integral theory. Performance tests show that the prediction accuracy of this model is superior to that of existing multi-HMM fusion methods for user browsing-page prediction. The method can be widely applied in web site management, e-commerce and web page prefetching.
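
    The fusion step can be illustrated with a short sketch: confidence scores produced by HMMs of order 1 to N for a candidate page are combined with a Choquet integral taken with respect to a Sugeno lambda-fuzzy measure, whose densities play the role of the per-order weights mentioned above. The density values, the scores and the use of SciPy's root finder are illustrative assumptions, not the exact fusion operator of the paper.

      # Sketch of fusing the next-page confidence scores of 1st..N-th order HMMs with
      # a Choquet integral over a Sugeno lambda-fuzzy measure. The densities g_i
      # (importance of each HMM order) and the scores are placeholders.
      import numpy as np
      from scipy.optimize import brentq

      def sugeno_lambda(densities):
          """Solve prod(1 + lam*g_i) = 1 + lam for the measure parameter lam."""
          densities = np.asarray(densities, float)
          f = lambda lam: np.prod(1.0 + lam * densities) - (1.0 + lam)
          if abs(densities.sum() - 1.0) < 1e-9:
              return 0.0                              # additive measure
          # lam lies in (-1, 0) if the densities sum to more than 1, in (0, inf) otherwise.
          return brentq(f, -0.999999, -1e-9) if densities.sum() > 1 else brentq(f, 1e-9, 1e6)

      def choquet(scores, densities):
          """Choquet integral of per-source scores w.r.t. the lambda-fuzzy measure."""
          lam = sugeno_lambda(densities)
          order = np.argsort(scores)[::-1]            # sources sorted by descending score
          prev_g, total = 0.0, 0.0
          for i in order:
              g_cur = densities[i] + prev_g + lam * densities[i] * prev_g
              total += scores[i] * (g_cur - prev_g)
              prev_g = g_cur
          return total

      # Placeholder example: confidence that page P is next, from HMMs of order 1..3.
      scores = np.array([0.62, 0.71, 0.55])
      densities = np.array([0.5, 0.4, 0.3])           # per-order importance (illustrative)
      fused = choquet(scores, densities)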

  1. Improvement of Dynamic Hand Gesture Recognition Technology Based on the HMM Method

    Institute of Scientific and Technical Information of China (English)

    于美娟; 马希荣

    2011-01-01

    To improve the efficiency and accuracy of HMM-based dynamic hand gesture recognition, and to address the high computational complexity of the HMM training stage, a new method combining the HMM algorithm with dynamic programming is presented. It improves the training stage of the HMM algorithm and enhances the accuracy and responsiveness of human-robot interaction.

  2. Periodic model of LTA framework containing various non-tetrahedral cations

    Science.gov (United States)

    Koleżyński, A.; Mikuła, A.; Król, M.

    2016-03-01

    A simplified periodic model of Linde Type A zeolite (LTA) structure with various selected mono- and di-valent extra-framework cations was formulated. Ab initio calculations (geometry optimization and vibrational spectra calculations) using the proposed model were carried out by means of Crystal09 program. The resulting structures and simulated spectra were analyzed in detail and compared with the experimental ones. The presented results show that in most cases the proposed model agrees well with experimental results. Individual bands were assigned to respective normal modes of vibration and the changes resulting from the selective substitution of extra framework cations were described and explained.

  3. Periodic model of LTA framework containing various non-tetrahedral cations.

    Science.gov (United States)

    Koleżyński, A; Mikuła, A; Król, M

    2016-03-15

    A simplified periodic model of Linde Type A zeolite (LTA) structure with various selected mono- and di-valent extra-framework cations was formulated. Ab initio calculations (geometry optimization and vibrational spectra calculations) using the proposed model were carried out by means of Crystal09 program. The resulting structures and simulated spectra were analyzed in detail and compared with the experimental ones. The presented results show that in most cases the proposed model agrees well with experimental results. Individual bands were assigned to respective normal modes of vibration and the changes resulting from the selective substitution of extra framework cations were described and explained.

  4. A Proof-Irrelevant Model of Martin-Löf's Logical Framework

    DEFF Research Database (Denmark)

    Fridlender, Daniel

    2003-01-01

    We extend the proof-irrelevant model defined in Smith (1988) to the whole of Martin-Löf's logical framework. The main difference here is the existence of a type whose objects themselves represent types rather than proof-objects. This means that the model must now be able to distinguish between...... also show how to extend it when the logical framework itself is enlarged with inductive definitions. In doing so, a variant of Church numerals is introduced. As in Smith (1988), the model can only be defined in the absence of universes, and it is useful to obtain an elementary proof of consistency...

  5. A model-based framework for incremental scale-up of wastewater treatment processes

    DEFF Research Database (Denmark)

    Mauricio Iglesias, Miguel; Sin, Gürkan

    Scale-up is traditionally done following specific ratios or rules of thumb which do not lead to optimal results. We present a generic framework to assist in the scale-up of wastewater treatment processes based on multiscale modelling, multiobjective optimisation and a validation of the model at the new...... large scale. The framework is illustrated by the scale-up of a complete autotrophic nitrogen removal process. The model-based multiobjective scale-up offers a promising improvement compared with empirical scale-up rules based on rules of thumb.

  6. Establishing a Numerical Modeling Framework for Hydrologic Engineering Analyses of Extreme Storm Events

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Xiaodong; Hossain, Faisal; Leung, L. Ruby

    2017-08-01

    In this study a numerical modeling framework for simulating extreme storm events was established using the Weather Research and Forecasting (WRF) model. Such a framework is necessary for the derivation of engineering parameters such as probable maximum precipitation that are the cornerstone of large water management infrastructure design. Here this framework was built based on a heavy storm that occurred in Nashville (USA) in 2010, and verified using two other extreme storms. To achieve the optimal setup, several combinations of model resolutions, initial/boundary conditions (IC/BC), cloud microphysics and cumulus parameterization schemes were evaluated using multiple metrics of precipitation characteristics. The evaluation suggests that WRF is most sensitive to IC/BC option. Simulation generally benefits from finer resolutions up to 5 km. At the 15km level, NCEP2 IC/BC produces better results, while NAM IC/BC performs best at the 5km level. Recommended model configuration from this study is: NAM or NCEP2 IC/BC (depending on data availability), 15km or 15km-5km nested grids, Morrison microphysics and Kain-Fritsch cumulus schemes. Validation of the optimal framework suggests that these options are good starting choices for modeling extreme events similar to the test cases. This optimal framework is proposed in response to emerging engineering demands of extreme storm events forecasting and analyses for design, operations and risk assessment of large water infrastructures.

  7. A review of the quantification and communication of uncertainty associated with geological framework models

    Science.gov (United States)

    Mathers, Steve; Lark, Murray

    2015-04-01

    Digital Geological Framework Models show geology in three dimensions; they can most easily be thought of as 3D geological maps. The volume of the model is divided into distinct geological units using a suitable rock classification, in the same way that geological maps are. Like geological maps, the models are generic and many are intended to be fit for any geoscience purpose. Over the last decade many Geological Survey Organisations (GSOs) worldwide have begun to communicate their geological understanding of the subsurface through Geological Framework Models and themed derivatives, and the traditional printed geological map has been increasingly phased out. Building Geological Framework Models entails the assembly of all the known geospatial information into a single workspace for interpretation. The calculated models are commonly displayed either as a stack of geological surfaces or boundaries (unit tops, bases, unconformities) or as solid calculated blocks of 3D geology with the unit volumes filled in with colour or symbols. The studied volume, however, must be completely populated, so decisions on the subsurface distribution of units must be made even where considerable uncertainty exists. There is naturally uncertainty associated with any Geological Framework Model, and this is composed of two main components: the uncertainty in the geospatial data used to constrain the model, and the uncertainty related to the model construction, which includes factors such as the choice of modeller(s), the choice of software(s), and the modelling workflow. Uncertainty is the inverse of confidence, reliability or certainty; other closely related terms include risk, commonly used in preference to uncertainty where financial or safety matters are presented, and probability, used as a statistical measure of uncertainty. We can consider uncertainty in geological framework models to be of two main types: uncertainty in the geospatial data used to constrain the model; this differs with the distinct

  8. Application of computer-aided multi-scale modelling framework – Aerosol case study

    DEFF Research Database (Denmark)

    Heitzig, Martina; Sin, Gürkan; Glarborg, Peter

    -aided methods provide. The key prerequisite of computer-aided product-process engineering is however the availability of models of different types, forms and application modes. The development of the models required for the systems under investigation tends to be a challenging and time-consuming task involving...... numerous steps, expert skills and different modelling tools. This motivates the development of a computer-aided modelling framework that supports the user during model development, documentation, analysis, identification, application and re-use with the goal to increase the efficiency of the modelling...... generation, optimal equation ordering, eigenvalue analysis. Once the models have been constructed and analysed the modelling framework incorporates 3 application work-flows for: identification, simulation and design. For these application work-flows different solvers that can solve a large range of different...

  9. A Computational Framework for Phase-field Modeling

    Science.gov (United States)

    2011-01-01

    twin embryo within an otherwise perfect single crystal (12). Analytical models based on free energy variations in the context of phase... transformations have been applied to describe twin nucleation (12, 13). Such approaches consider nucleation of a twin embryo of idealized geometry—an elliptical... Crystals, in preparation, 2011. 12. Christian, J. W.; Mahajan, S. Deformation Twinning. Prog. Mater. Sci. 1995, 39, 1–157. 13. Lee, J. K.; Yoo, M

  10. Framework for an asymptotically safe standard model via dynamical breaking

    Science.gov (United States)

    Abel, Steven; Sannino, Francesco

    2017-09-01

    We present a consistent embedding of the matter and gauge content of the Standard Model into an underlying asymptotically safe theory that has a well-determined interacting UV fixed point in the large color/flavor limit. The scales of symmetry breaking are determined by two mass-squared parameters with the breaking of electroweak symmetry being driven radiatively. There are no other free parameters in the theory apart from gauge couplings.

  11. A unifying kinetic framework for modeling oxidoreductase-catalyzed reactions

    OpenAIRE

    Chang, Ivan; Baldi, Pierre

    2013-01-01

    Motivation: Oxidoreductases are a fundamental class of enzymes responsible for the catalysis of oxidation–reduction reactions, crucial in most bioenergetic metabolic pathways. From their common root in the ancient prebiotic environment, oxidoreductases have evolved into diverse and elaborate protein structures with specific kinetic properties and mechanisms adapted to their individual functional roles and environmental conditions. While accurate kinetic modeling of oxidoreductases is thus imp...

  12. USER CONTEXT MODELS : A FRAMEWORK TO EASE SOFTWARE FORMAL VERIFICATIONS

    OpenAIRE

    2010-01-01

    This article is accepted to appear in ICEIS 2010 proceedings; International audience; Several works emphasize the difficulties of software verification applied to embedded systems. In past years, formal verification techniques and tools were widely developed and used by the research community. However, the use of formal verification at an industrial scale remains difficult, expensive and time-consuming. This is due to the size and the complexity of the manipulated models, but also to the impo...

  13. Introducing the Core Probability Framework and Discrete-Element Core Probability Model for efficient stochastic macroscopic modelling

    NARCIS (Netherlands)

    Calvert, S.C.; Taale, H.; Hoogendoorn, S.P.

    2014-01-01

    In this contribution the Core Probability Framework (CPF) is introduced with the application of the Discrete-Element Core Probability Model (DE-CPM) as a new DNL for dynamic macroscopic modelling of stochastic traffic flow. The model is demonstrated for validation in a test case and for computationa

  14. Using the Bifocal Modeling Framework to Resolve "Discrepant Events" between Physical Experiments and Virtual Models in Biology

    Science.gov (United States)

    Blikstein, Paulo; Fuhrmann, Tamar; Salehi, Shima

    2016-01-01

    In this paper, we investigate an approach to supporting students' learning in science through a combination of physical experimentation and virtual modeling. We present a study that utilizes a scientific inquiry framework, which we call "bifocal modeling," to link student-designed experiments and computer models in real time. In this…

  15. Computational fluid dynamics framework for aerodynamic model assessment

    Science.gov (United States)

    Vallespin, D.; Badcock, K. J.; Da Ronch, A.; White, M. D.; Perfect, P.; Ghoreyshi, M.

    2012-07-01

    This paper reviews the work carried out at the University of Liverpool to assess the use of CFD methods for aircraft flight dynamics applications. Three test cases are discussed in the paper, namely, the Standard Dynamic Model, the Ranger 2000 jet trainer and the Stability and Control Unmanned Combat Air Vehicle. For each of these, a tabular aerodynamic model based on CFD predictions is generated along with validation against wind tunnel experiments and flight test measurements. The main purpose of the paper is to assess the validity of the tables of aerodynamic data for the force and moment prediction of realistic aircraft manoeuvres. This is done by generating a manoeuvre based on the tables of aerodynamic data, and then replaying the motion through a time-accurate computational fluid dynamics calculation. The resulting forces and moments from these simulations were compared with predictions from the tables. As the latter are based on a set of steady-state predictions, the comparisons showed perfect agreement for slow manoeuvres. As manoeuvres became more aggressive some disagreement was seen, particularly during periods of large rates of change in attitudes. Finally, the Ranger 2000 model was used on a flight simulator.

  16. Modelling diagnosis in physical therapy: a blackboard framework and models of experts and novices.

    Science.gov (United States)

    James, G A

    2007-03-01

    The primary objective of this study was to explore clinical reasoning in physical therapy and to highlight the similarities and differences by modelling the diagnostic phase of clinical reasoning. An experimental design comparing expert and novice physical therapists was utilized. Concurrent verbal protocols detailing the clinical reasoning about standardized case material were elicited. A framework for modelling diagnosis was specified and provided the parameters for analysis. The diagnostic utterances were classified as cues or hypotheses and the knowledge utilized was identified. The experts recruited significantly more knowledge than the novices (p = 0.01) and used more cues (p < 0.01). Their diagnoses were more accurate when compared to the original diagnosis. This difference between the experts and novices was reflected in the differences shown in the models (p < 0.01). The differences between these subjects focused upon the knowledge recruitment, which impacted on the accuracy of the diagnosis. The novices' inaccurate or non-existent diagnoses led to poor quality of treatment prescription. Modelling proved to be a useful way of representing these differences.

  17. Crops in silico: A community wide multi-scale computational modeling framework of plant canopies

    Science.gov (United States)

    Srinivasan, V.; Christensen, A.; Borkiewic, K.; Yiwen, X.; Ellis, A.; Panneerselvam, B.; Kannan, K.; Shrivastava, S.; Cox, D.; Hart, J.; Marshall-Colon, A.; Long, S.

    2016-12-01

    Current crop models predict a looming gap between supply and demand for primary foodstuffs over the next 100 years. While significant yield increases were achieved in major food crops during the early years of the green revolution, the current rates of yield increases are insufficient to meet future projected food demand. Furthermore, with projected reduction in arable land, decrease in water availability, and increasing impacts of climate change on future food production, innovative technologies are required to sustainably improve crop yield. To meet these challenges, we are developing Crops in silico (Cis), a biologically informed, multi-scale, computational modeling framework that can facilitate whole plant simulations of crop systems. The Cis framework is capable of linking models of gene networks, protein synthesis, metabolic pathways, physiology, growth, and development in order to investigate crop response to different climate scenarios and resource constraints. This modeling framework will provide the mechanistic details to generate testable hypotheses toward accelerating directed breeding and engineering efforts to increase future food security. A primary objective for building such a framework is to create synergy among an inter-connected community of biologists and modelers to create a realistic virtual plant. This framework advantageously casts the detailed mechanistic understanding of individual plant processes across various scales in a common scalable framework that makes use of current advances in high performance and parallel computing. We are currently designing a user friendly interface that will make this tool equally accessible to biologists and computer scientists. Critically, this framework will provide the community with much needed tools for guiding future crop breeding and engineering, understanding the emergent implications of discoveries at the molecular level for whole plant behavior, and improved prediction of plant and ecosystem

  18. Developing the multi-level functioning interface framework for DER models

    DEFF Research Database (Denmark)

    Han, Xue; Bindner, Henrik W.; You, Shi

    2013-01-01

    The paper summarises several modelling applications of distributed energy resources (DERs) for various purposes, and describes the related operational issues regarding the complexity of the future distribution grid. Furthermore, a multi-level functioning interface framework is proposed for DER mo....... The information mapping for photovoltaic panel (PV) modelling is also provided as an example....

  19. A general simulation model developing process based on five-object framework

    Institute of Scientific and Technical Information of China (English)

    胡安斌; 伞冶; 陈建明; 陈永强

    2003-01-01

    Different paradigms that relate verification and validation to the simulation model have different development processes. A simulation model development process based on the Five-Object Framework (FOF) is discussed in this paper. An example is given to demonstrate the application of the proposed method.

  20. A modeling framework for characterizing near-road air pollutant concentration at community scales

    Science.gov (United States)

    In this study, we combine information from transportation network, traffic emissions, and dispersion model to develop a framework to inform exposure estimates for traffic-related air pollutants (TRAPs) with a high spatial resolution. A Research LINE source dispersion model (R-LIN...

  1. The Dimensions of Social Justice Model: Transforming Traditional Group Work into a Socially Just Framework

    Science.gov (United States)

    Ratts, Manivong J.; Anthony, Loni; Santos, KristiAnna Nicole T.

    2010-01-01

    Social justice is a complex and abstract concept that can be difficult to discuss and integrate within group work. To address this concern, this article introduces readers to the Dimensions of Social Justice Model. The model provides group leaders with a conceptual framework for understanding the degree to which social justice is integrated within…

  2. Support of the Generic Framework programme : calibration of groundwater flow models

    NARCIS (Netherlands)

    Stroet, Chris C.B.M. te; Minnema, Benny

    2003-01-01

    This report is to support the “Generic Framework programme” which consists of a series of projects to create a standard in the modelling processes that are used in water management issues. The topic of support is the field of model calibration. TNO-NITG is elaborating the calibration of groundwater

  3. A system-level multiprocessor system-on-chip modeling framework

    DEFF Research Database (Denmark)

    Virk, Kashif Munir; Madsen, Jan

    2004-01-01

    We present a system-level modeling framework to model system-on-chips (SoC) consisting of heterogeneous multiprocessors and network-on-chip communication structures in order to enable the developers of today's SoC designs to take advantage of the flexibility and scalability of network-on-chip...

  4. A framework to a mass dimension one fermionic sigma model

    CERN Document Server

    Rogerio, R J Bueno; Pereira, S H; da Rocha, Roldao

    2016-01-01

    In this paper a mass dimension one fermionic sigma model, realized by the eigenspinors of the charge conjugation operator with dual helicity (Elko spinors), is developed. Such spinors are chosen as a specific realization of mass dimension one spinors, wherein the non-commutative fermionic feature is ruled by torsion. Moreover, we analyse Elko spinors as a source of matter in an expanding background and find that such mass dimension one fermions can serve not only as dark matter but can also induce an effective cosmological constant.

  5. A Framework for Conceptual Modeling of Geographic Data Quality

    DEFF Research Database (Denmark)

    Friis-Christensen, Anders; Christensen, J.V.; Jensen, Christian Søndergaard

    2004-01-01

    Sustained advances in wireless communications, geo-positioning, and consumer electronics pave the way to a kind of location-based service that relies on the tracking of the continuously changing positions of an entire population of service users. This type of service is characterized by large...... determined by how "good" the data is, as different applications of geographic data require different qualities of the data are met. Such qualities concern the object level as well as the attribute level of the data. This paper presents a systematic and integrated approach to the conceptual modeling...

  6. A new fit-for-purpose model testing framework: Decision Crash Tests

    Science.gov (United States)

    Tolson, Bryan; Craig, James

    2016-04-01

    Decision-makers in water resources are often burdened with selecting appropriate multi-million dollar strategies to mitigate the impacts of climate or land use change. Unfortunately, the suitability of existing hydrologic simulation models to accurately inform decision-making is in doubt because the testing procedures used to evaluate model utility (i.e., model validation) are insufficient. For example, many authors have noted that a good standard framework for model testing, the Klemeš Crash Tests (KCTs), i.e., the classic model validation procedures from Klemeš (1986) that Andréassian et al. (2009) renamed as KCTs, has yet to become common practice in hydrology. Furthermore, Andréassian et al. (2009) claim that the progression of hydrological science requires widespread use of KCTs and the development of new crash tests. Existing simulation (not forecasting) model testing procedures such as KCTs look backwards (checking for consistency between simulations and past observations) rather than forwards (explicitly assessing whether the model is likely to support future decisions). We propose a fundamentally different, forward-looking, decision-oriented hydrologic model testing framework based upon the concept of fit-for-purpose model testing that we call Decision Crash Tests or DCTs. Key DCT elements are: (i) the model purpose (i.e., the decision the model is meant to support) must be identified so that model outputs can be mapped to management decisions; and (ii) the framework evaluates not just the selected hydrologic model but the entire suite of model-building decisions associated with model discretization, calibration, etc. The framework is constructed to directly and quantitatively evaluate model suitability. The DCT framework is applied to a model building case study on the Grand River in Ontario, Canada. A hypothetical binary decision scenario is analysed (upgrade or do not upgrade the existing flood control structure) under two different sets of model building
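
    A toy illustration of the "map model outputs to management decisions" idea in this record, not the authors' actual DCT procedure: the capacity threshold, the model-building variants, and the simulated design floods below are all invented for the sketch.

      # Toy sketch: map simulated design floods from several model-building
      # variants onto the binary decision in the abstract (upgrade / do not
      # upgrade a flood control structure). All numbers are hypothetical.
      DESIGN_CAPACITY = 850.0   # m3/s the existing structure can safely pass (assumed)

      def decision(design_flood_m3s):
          """Map a simulated design flood onto a management decision."""
          return "upgrade" if design_flood_m3s > DESIGN_CAPACITY else "do not upgrade"

      # Each entry stands for one complete suite of model-building choices
      # (structure, discretization, calibration period, ...) and its design flood.
      model_variants = {
          "coarse grid, short calibration": 780.0,
          "fine grid, short calibration":   905.0,
          "fine grid, long calibration":    930.0,
      }

      reference_decision = "upgrade"   # decision implied by the benchmark data (assumed)
      for name, flood in model_variants.items():
          d = decision(flood)
          verdict = "supports" if d == reference_decision else "would mislead"
          print(f"{name:32s} -> {d:14s} ({verdict} the reference decision)")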

  7. A hidden Markov model approach for determining expression from genomic tiling micro arrays

    DEFF Research Database (Denmark)

    Terkelsen, Kasper Munch; Gardner, P. P.; Arctander, Peter;

    2006-01-01

    HMM, that adaptively models tiling data prior to predicting expression on genomic sequence. A hidden Markov model (HMM) is used to model the distributions of tiling array probe scores in expressed and non-expressed regions. The HMM is trained on sets of probes mapped to regions of annotated expression and non-expression...... Results can be downloaded and viewed from our web site [2]. Conclusion: The value of adaptive modelling of fluorescence scores prior to categorisation into expressed and non-expressed probes is demonstrated. Our results indicate that our adaptive approach is superior to the previous analysis in terms...
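
    A minimal sketch of the two-state idea described in this record, assuming the Python library hmmlearn and synthetic probe scores; the Gaussian parameters, the data, and the region-reporting step are illustrative and are not the trained model from the paper.

      # Two-state HMM over tiling-array probe scores (expressed vs. background).
      # Assumes hmmlearn is installed; all numbers below are made up.
      import numpy as np
      from hmmlearn import hmm

      rng = np.random.default_rng(0)
      # Synthetic probe scores along a chromosome: low-scoring background with
      # one higher-scoring "expressed" stretch in the middle.
      scores = np.concatenate([rng.normal(0.0, 1.0, 300),
                               rng.normal(2.5, 1.0, 100),
                               rng.normal(0.0, 1.0, 300)]).reshape(-1, 1)

      model = hmm.GaussianHMM(n_components=2, covariance_type="diag",
                              n_iter=50, random_state=0)
      model.fit(scores)                 # unsupervised fit of the two score distributions
      states = model.predict(scores)    # Viterbi path: one 0/1 label per probe

      # Report contiguous runs of the higher-mean state as putative expressed regions.
      expressed_state = int(np.argmax(model.means_.ravel()))
      padded = np.concatenate([[0], states == expressed_state, [0]])
      runs = np.flatnonzero(np.diff(padded)).reshape(-1, 2)
      for start, end in runs:
          print(f"putative expressed region: probes {start}-{end - 1}")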

  8. Clinical Interdisciplinary Collaboration Models and Frameworks From Similarities to Differences: A Systematic Review

    Science.gov (United States)

    Mahdizadeh, Mousa; Heydari, Abbas; Moonaghi, Hossien Karimi

    2015-01-01

    Introduction: Various models of interdisciplinary collaboration in clinical nursing have been presented, but a comprehensive model is not yet available. The purpose of this study is to review the evidence from qualitative studies that have presented a model or framework of interdisciplinary collaboration in clinical nursing. Methods: All articles and theses published from 1990 to 10 June 2014, in English or Persian, that presented a model or framework of clinical collaboration were searched using the databases ProQuest, Scopus, PubMed, Science Direct, and the Iranian databases SID, Magiran, and Iranmedex. Keywords consistent with MeSH terms, such as nurse-physician relations, care team, collaboration, and interdisciplinary relations, and their Persian equivalents were used. Results: Contexts, processes, and outcomes of interdisciplinary collaboration were extracted as findings. One of the major components affecting collaboration, emphasized by most of the models, was the background of collaboration. Most studies suggested that the outcomes of collaboration were improved care, physician and nurse satisfaction, cost control, fewer clinical errors, and improved patient safety. Conclusion: The models and frameworks had different structures, backgrounds, and conditions, but their outcomes were similar. Organizational structure, culture, and social factors are important aspects of clinical collaboration, so these factors should be considered in order to improve the quality and effectiveness of clinical collaboration. PMID:26153158

  9. A Physics-Informed Machine Learning Framework for RANS-based Predictive Turbulence Modeling

    Science.gov (United States)

    Xiao, Heng; Wu, Jinlong; Wang, Jianxun; Ling, Julia

    2016-11-01

    Numerical models based on the Reynolds-averaged Navier-Stokes (RANS) equations are widely used in turbulent flow simulations in support of engineering design and optimization. In these models, turbulence modeling introduces significant uncertainties in the predictions. In light of the decades-long stagnation encountered by the traditional approach of turbulence model development, data-driven methods have been proposed as a promising alternative. We will present a data-driven, physics-informed machine-learning framework for predictive turbulence modeling based on RANS models. The framework consists of three components: (1) prediction of discrepancies in RANS modeled Reynolds stresses based on machine learning algorithms, (2) propagation of improved Reynolds stresses to quantities of interests with a modified RANS solver, and (3) quantitative, a priori assessment of predictive confidence based on distance metrics in the mean flow feature space. Merits of the proposed framework are demonstrated in a class of flows featuring massive separations. Significant improvements over the baseline RANS predictions are observed. The favorable results suggest that the proposed framework is a promising path toward RANS-based predictive turbulence modeling in the era of big data. (SAND2016-7435 A).
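
    A hedged sketch of component (1) of the framework described above: regressing a Reynolds-stress discrepancy on mean-flow features. The feature names, the synthetic data, and the choice of scikit-learn's random forest are assumptions for illustration, not the authors' implementation.

      # Sketch of component (1): learn a Reynolds-stress discrepancy from
      # mean-flow features. Synthetic data stands in for DNS/LES training flows.
      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      rng = np.random.default_rng(1)
      n_cells = 5000
      # Hypothetical mean-flow features per cell: strain-rate magnitude, pressure
      # gradient magnitude, wall-distance Reynolds number, turbulence intensity.
      X_train = rng.random((n_cells, 4))
      # Hypothetical "truth minus RANS" discrepancy in one Reynolds-stress component.
      y_train = 0.3 * X_train[:, 0] - 0.1 * X_train[:, 2] + 0.05 * rng.normal(size=n_cells)

      model = RandomForestRegressor(n_estimators=200, random_state=0)
      model.fit(X_train, y_train)

      # The predicted discrepancy would be added to the baseline RANS Reynolds
      # stresses before re-solving the mean-flow equations (component 2).
      X_new = rng.random((10, 4))       # features from a new flow configuration
      print(model.predict(X_new)[:3])

      # Component (3) gauges confidence from distance in feature space; a crude
      # stand-in is the standardized Euclidean distance to the training-cloud mean.
      d = np.linalg.norm((X_new - X_train.mean(0)) / X_train.std(0), axis=1)
      print("feature-space distance to training data:", np.round(d, 2))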

  10. A Mobility and Traffic Generation Framework for Modeling and Simulating Ad Hoc Communication Networks

    Directory of Open Access Journals (Sweden)

    Chris Barrett

    2004-01-01

    Full Text Available We present a generic mobility and traffic generation framework that can be incorporated into a tool for modeling and simulating large-scale ad hoc networks. Three components of this framework, namely a mobility data generator (MDG, a graph structure generator (GSG and an occlusion modification tool (OMT allow a variety of mobility models to be incorporated into the tool. The MDG module generates positions of transceivers at specified time instants. The GSG module constructs the graph corresponding to the ad hoc network from the mobility data provided by MDG. The OMT module modifies the connectivity of the graph produced by GSG to allow for occlusion effects. With two other modules, namely an activity data generator (ADG which generates packet transmission activities for transceivers and a packet activity simulator (PAS which simulates the movement and interaction of packets among the transceivers, the framework allows the modeling and simulation of ad hoc communication networks. The design of the framework allows a user to incorporate various realistic parameters crucial in the simulation. We illustrate the utility of our framework through a comparative study of three mobility models. Two of these are synthetic models (random waypoint and exponentially correlated mobility proposed in the literature. The third model is based on an urban population mobility modeling tool (TRANSIMS developed at the Los Alamos National Laboratory. This tool is capable of providing comprehensive information about the demographics, mobility and interactions of members of a large urban population. A comparison of these models is carried out by computing a variety of parameters associated with the graph structures generated by the models. There has recently been interest in the structural properties of graphs that arise in real world systems. We examine two aspects of this for the graphs created by the mobility models: change associated with power control (range of
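
    A hedged sketch of the division of labour described above: an MDG-style position update (here, a simple random-waypoint rule) followed by a GSG-style connectivity graph (a unit-disk rule). The parameters and the concrete update rule are illustrative and are not taken from the paper's tool.

      # MDG-style mobility generation (random waypoint) and GSG-style graph
      # construction (edge when two transceivers are within radio range).
      import math
      import random

      random.seed(42)
      N_NODES, AREA, SPEED, RANGE, STEPS = 20, 1000.0, 15.0, 250.0, 50

      pos = [(random.uniform(0, AREA), random.uniform(0, AREA)) for _ in range(N_NODES)]
      dest = [(random.uniform(0, AREA), random.uniform(0, AREA)) for _ in range(N_NODES)]

      def step(p, d, speed):
          """Move point p toward waypoint d by at most `speed`; new waypoint on arrival."""
          dx, dy = d[0] - p[0], d[1] - p[1]
          dist = math.hypot(dx, dy)
          if dist <= speed:
              return d, (random.uniform(0, AREA), random.uniform(0, AREA))
          return (p[0] + speed * dx / dist, p[1] + speed * dy / dist), d

      for t in range(STEPS):
          pos, dest = map(list, zip(*(step(p, d, SPEED) for p, d in zip(pos, dest))))
          # GSG: an edge exists whenever two transceivers are within radio range;
          # an OMT-style occlusion pass would additionally prune blocked links.
          edges = [(i, j) for i in range(N_NODES) for j in range(i + 1, N_NODES)
                   if math.hypot(pos[i][0] - pos[j][0], pos[i][1] - pos[j][1]) <= RANGE]

      print(f"snapshot at t={STEPS}: {len(edges)} links among {N_NODES} nodes")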

  11. A modeling framework for investment planning in interdependent infrastructures in multi-hazard environments.

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Nathanael J. K.; Gearhart, Jared Lee; Jones, Dean A.; Nozick, Linda Karen; Prince, Michael

    2013-09-01

    Currently, much of protection planning is conducted separately for each infrastructure and hazard. Limited funding requires a balance of expenditures between terrorism and natural hazards based on potential impacts. This report documents the results of a Laboratory Directed Research & Development (LDRD) project that created a modeling framework for investment planning in interdependent infrastructures focused on multiple hazards, including terrorism. To develop this framework, three modeling elements were integrated: natural hazards, terrorism, and interdependent infrastructures. For natural hazards, a methodology was created for specifying events consistent with regional hazards. For terrorism, we modeled the terrorists' actions based on assumptions regarding their knowledge, goals, and target identification strategy. For infrastructures, we focused on predicting post-event performance due to specific terrorist attacks and natural hazard events, tempered by appropriate infrastructure investments. We demonstrate the utility of this framework with various examples, including protection of electric power, roadway, and hospital networks.

  12. Towards a Common Framework for the Identification of Landforms on Terrain Models

    Directory of Open Access Journals (Sweden)

    Eric Guilbert

    2017-01-01

    Full Text Available A landform is a physical feature of the terrain with its own recognisable shape. Its definition is often qualitative and inherently vague. Hence, landforms are difficult to formalise in a logical model that can be implemented. We propose for that purpose a framework where these qualitative and vague definitions are transformed successively during different phases to yield an implementable data structure. Our main consideration is that landforms are characterised by salient elements as perceived by users. Hence, a common prototype based on an object-oriented approach is defined that shall apply to all landforms. This framework shall facilitate the definition of conceptual models for other landforms and relies on the use of ontology design patterns to express common elements and structures. The model is illustrated on examples from the literature, showing that existing works undertaken separately can be developed under a common framework.

  13. Model-based Computer Aided Framework for Design of Process Monitoring and Analysis Systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2009-01-01

    In the manufacturing industry, for example, the pharmaceutical industry, a thorough understanding of the process is necessary in addition to a properly designed monitoring and analysis system (PAT system) to consistently obtain the desired end-product properties. A model-based computer-aided framework including the methods and tools through which the design of monitoring and analysis systems for product quality control can be generated, analyzed and/or validated, has been developed. Two important supporting tools developed as part of the framework are a knowledge base and a model library...... subject to the maintenance constraints of the desired product quality. The application of the model-based framework is highlighted through a case study involving the operation of a fermentation process.

  14. Modeling Complex Biological Flows in Multi-Scale Systems using the APDEC Framework

    Energy Technology Data Exchange (ETDEWEB)

    Trebotich, D

    2006-06-24

    We have developed advanced numerical algorithms to model biological fluids in multiscale flow environments using the software framework developed under the SciDAC APDEC ISIC. The foundation of our computational effort is an approach for modeling DNA-laden fluids as "bead-rod" polymers whose dynamics are fully coupled to an incompressible viscous solvent. The method is capable of modeling short range forces and interactions between particles using soft potentials and rigid constraints. Our methods are based on higher-order finite difference methods in complex geometry with adaptivity, leveraging algorithms and solvers in the APDEC Framework. Our Cartesian grid embedded boundary approach to incompressible viscous flow in irregular geometries has also been interfaced to a fast and accurate level-sets method within the APDEC Framework for extracting surfaces from volume renderings of medical image data and used to simulate cardio-vascular and pulmonary flows in critical anatomies.

  15. A Framework of Environmental Modelling and Information Sharing for Urban Air Pollution Control and Management

    Institute of Scientific and Technical Information of China (English)

    LIU Gang-jun; FU Er-jiang; WANG Yun-jia; ZHANG Ke-fei; HAN Bao-ping; ARROWSMITH Colin

    2007-01-01

    More effective environmental pollution control and management are needed due to the increasing environmental impacts from a range of human activities and the growing public demands for a better living environment. Urban air pollution is a serious environmental issue that poses adverse impacts on the health of people and the environment in most metropolitan areas. In this paper, we propose a geoinformatics augmented framework of environmental modelling and information sharing for supporting effective urban air pollution control and management. This framework is outlined in terms of its key components and processes including: 1) an integrated, adaptive network of sensors for environmental monitoring; 2) a set of distributed, interoperable databases for data management; 3) a set of intelligent, robust algorithms and models for environmental modelling; 4) a set of flexible, efficient user interfaces for data access and information sharing; and 5) a reliable, high capacity, high performance computing and communication infrastructure for integrating and supporting other framework components and processes.

  16. Designing A Framework To Design A Business Model For The 'Bottom Of The Pyramid' Population

    Directory of Open Access Journals (Sweden)

    Ver Loren van Themaat, Tanye

    2013-11-01

    Full Text Available This article presents a framework for developing and designing a business model to target the bottom of the pyramid (BoP population. Using blue ocean strategy and business model literature, integrated with research on the BoP, the framework offers a systematic approach for organisations to analyse and understand all aspects of the BoP and their environment, and then design a business model that minimises the risk of failure and fulfils the core requirements of the BoP. A case study on Capitec Bank demonstrates how the framework can be applied to the real world. The case study shows the practical examples that Capitec uses to target the BoP successfully, and the logic behind these actions. Further validation was done through interviews with experts in the relevant fields used in this study.

  17. A Bayesian modelling framework for tornado occurrences in North America.

    Science.gov (United States)

    Cheng, Vincent Y S; Arhonditsis, George B; Sills, David M L; Gough, William A; Auld, Heather

    2015-03-25

    Tornadoes represent one of nature's most hazardous phenomena that have been responsible for significant destruction and devastating fatalities. Here we present a Bayesian modelling approach for elucidating the spatiotemporal patterns of tornado activity in North America. Our analysis shows a significant increase in the Canadian Prairies and the Northern Great Plains during the summer, indicating a clear transition of tornado activity from the United States to Canada. The linkage between monthly-averaged atmospheric variables and likelihood of tornado events is characterized by distinct seasonality; the convective available potential energy is the predominant factor in the summer; vertical wind shear appears to have a strong signature primarily in the winter and secondarily in the summer; and storm relative environmental helicity is most influential in the spring. The present probabilistic mapping can be used to draw inference on the likelihood of tornado occurrence in any location in North America within a selected time period of the year.
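
    A hedged sketch of the kind of model implied by this record: a Bayesian Poisson regression of monthly tornado counts on CAPE, vertical wind shear, and storm-relative helicity, written with PyMC. The priors, covariates, and synthetic data are placeholders; the paper's actual model is hierarchical and spatially explicit.

      # Bayesian Poisson regression of monthly tornado counts on three
      # atmospheric covariates. Assumes PyMC (v5); data are synthetic.
      import numpy as np
      import pymc as pm

      rng = np.random.default_rng(7)
      n = 240                                  # e.g. 20 years x 12 months for one grid cell
      X = rng.normal(size=(n, 3))              # standardized CAPE, shear, helicity
      counts = rng.poisson(np.exp(0.2 + 0.8 * X[:, 0] + 0.3 * X[:, 2]))

      with pm.Model() as tornado_model:
          intercept = pm.Normal("intercept", 0.0, 2.0)
          beta = pm.Normal("beta", 0.0, 1.0, shape=3)
          log_rate = intercept + pm.math.dot(X, beta)
          pm.Poisson("obs", mu=pm.math.exp(log_rate), observed=counts)
          idata = pm.sample(1000, tune=1000, chains=2, random_seed=7)

      # Posterior means of the covariate effects (CAPE, shear, helicity).
      print(idata.posterior["beta"].mean(dim=("chain", "draw")).values)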

  18. A Monte Carlo Simulation Framework for Testing Cosmological Models

    Directory of Open Access Journals (Sweden)

    Heymann Y.

    2014-10-01

    Full Text Available We tested alternative cosmologies using Monte Carlo simulations based on the sampling method of the zCosmos galactic survey. The survey encompasses a collection of observable galaxies with respective redshifts that have been obtained for a given spectroscopic area of the sky. Using a cosmological model, we can convert the redshifts into light-travel times and, by slicing the survey into small redshift buckets, compute a curve of galactic density over time. Because foreground galaxies obstruct the images of more distant galaxies, we simulated the theoretical galactic density curve using an average galactic radius. By comparing the galactic density curves of the simulations with that of the survey, we could assess the cosmologies. We applied the test to the expanding-universe cosmology of de Sitter and to a dichotomous cosmology.
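
    A hedged sketch of the bucketing step described above: bin survey redshifts, convert bucket edges to light-travel times under an assumed cosmology, and form a galactic density curve. The flat-LambdaCDM lookback-time integral and the synthetic redshift list below are illustrative stand-ins for whichever cosmology and survey sample are under test.

      # Slice a redshift list into buckets and form a density curve over
      # light-travel time. Flat LambdaCDM is assumed purely for illustration.
      import numpy as np
      from scipy.integrate import quad

      H0 = 70.0                                # km/s/Mpc
      OM, OL = 0.3, 0.7
      H0_PER_GYR = H0 / 978.0                  # approximate conversion to 1/Gyr

      def lookback_time_gyr(z):
          """Light-travel time to redshift z in Gyr (flat LambdaCDM)."""
          integrand = lambda zp: 1.0 / ((1 + zp) * np.sqrt(OM * (1 + zp) ** 3 + OL))
          t, _ = quad(integrand, 0.0, z)
          return t / H0_PER_GYR

      rng = np.random.default_rng(3)
      redshifts = rng.uniform(0.1, 1.0, 8000)  # stand-in for the survey's redshift list

      edges = np.linspace(0.1, 1.0, 19)        # redshift buckets
      counts, _ = np.histogram(redshifts, bins=edges)
      t_edges = np.array([lookback_time_gyr(z) for z in edges])
      dt = np.diff(t_edges)                    # Gyr spanned by each bucket

      density_curve = counts / dt              # galaxies per Gyr of light-travel time
      for z_lo, z_hi, d in zip(edges[:-1], edges[1:], density_curve):
          print(f"z {z_lo:.2f}-{z_hi:.2f}: {d:8.1f} galaxies/Gyr")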

  19. Large geospatial images discovery: metadata model and technological framework

    Directory of Open Access Journals (Sweden)

    Lukáš Brůha

    2015-12-01

    Full Text Available The advancements in geospatial web technology triggered efforts for disclosure of valuable resources of historical collections. This paper focuses on the role of spatial data infrastructures (SDI in such efforts. The work describes the interplay between SDI technologies and potential use cases in libraries such as cartographic heritage. The metadata model is introduced to link up the sources from these two distinct fields. To enhance the data search capabilities, the work focuses on the representation of the content-based metadata of raster images, which is the crucial prerequisite to target the search in a more effective way. The architecture of the prototype system for automatic raster data processing, storage, analysis and distribution is introduced. The architecture responds to the characteristics of input datasets, namely to the continuous flow of very large raster data and related metadata. Proposed solutions are illustrated on the case study of cartometric analysis of digitised early maps and related metadata encoding.

  20. A climate robust integrated modelling framework for regional impact assessment of climate change

    Science.gov (United States)

    Janssen, Gijs; Bakker, Alexander; van Ek, Remco; Groot, Annemarie; Kroes, Joop; Kuiper, Marijn; Schipper, Peter; van Walsum, Paul; Wamelink, Wieger; Mol, Janet

    2013-04-01

    Decision making towards climate proofing the water management of regional catchments can benefit greatly from the availability of a climate robust integrated modelling framework, capable of a consistent assessment of climate change impacts on the various interests present in the catchments. In the Netherlands, much effort has been devoted to developing state-of-the-art regional dynamic groundwater models with a very high spatial resolution (25x25 m2). Still, these models are not completely satisfactory to decision makers because the modelling concepts do not take into account feedbacks between meteorology, vegetation/crop growth, and hydrology. This introduces uncertainties in forecasting the effects of climate change on groundwater, surface water, agricultural yields, and development of groundwater dependent terrestrial ecosystems. These uncertainties add to the uncertainties about the predictions on climate change itself. In order to create an integrated, climate robust modelling framework, we coupled existing model codes on hydrology, agriculture and nature that are currently in use at the different research institutes in the Netherlands. The modelling framework consists of the model codes MODFLOW (groundwater flow), MetaSWAP (vadose zone), WOFOST (crop growth), SMART2-SUMO2 (soil-vegetation) and NTM3 (nature valuation). MODFLOW, MetaSWAP and WOFOST are coupled online (i.e. exchange information on a time-step basis). Thus, changes in meteorology and CO2-concentrations affect crop growth, and feedbacks between crop growth, vadose zone water movement and groundwater recharge are accounted for. The model chain WOFOST-MetaSWAP-MODFLOW generates hydrological input for the ecological prediction model combination SMART2-SUMO2-NTM3. The modelling framework was used to support the regional water management decision making process in the 267 km2 Baakse Beek-Veengoot catchment in the east of the Netherlands. Computations were performed for regionalized 30-year climate change
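
    A hedged, pseudocode-style sketch of the online (per-time-step) coupling described above, in which crop growth, vadose-zone, and groundwater components exchange state every step. The class and method names are invented for illustration and do not correspond to the actual MODFLOW, MetaSWAP, or WOFOST programming interfaces.

      # Online coupling loop: vegetation -> unsaturated zone -> groundwater -> vegetation.
      # All component classes are hypothetical stand-ins for the real model codes.
      class CropModel:          # WOFOST-like: responds to weather, CO2 and soil moisture
          def grow(self, weather, co2, root_zone_moisture):
              demand = 2.5 * min(root_zone_moisture / 0.3, 1.0)          # mm/day
              return {"transpiration_demand": demand}

      class VadoseZoneModel:    # MetaSWAP-like: partitions water in the unsaturated zone
          def update(self, precipitation, transpiration_demand, groundwater_head):
              recharge = max(precipitation - transpiration_demand, 0.0)  # mm/day
              return {"root_zone_moisture": 0.25, "recharge": recharge}

      class GroundwaterModel:   # MODFLOW-like: solves regional groundwater flow
          def solve(self, recharge):
              return {"head": -2.0 + 0.1 * recharge}                     # m below surface

      def run_coupled(weather_series, co2_series):
          crop, vadose, gw = CropModel(), VadoseZoneModel(), GroundwaterModel()
          state = {"root_zone_moisture": 0.25, "head": -2.0}
          for weather, co2 in zip(weather_series, co2_series):
              crop_out = crop.grow(weather, co2, state["root_zone_moisture"])
              vz_out = vadose.update(weather["precip"],
                                     crop_out["transpiration_demand"], state["head"])
              gw_out = gw.solve(vz_out["recharge"])
              state = {"root_zone_moisture": vz_out["root_zone_moisture"],
                       "head": gw_out["head"]}
              yield state   # this hydrological output would feed SMART2-SUMO2-NTM3

      for s in run_coupled([{"precip": 3.0}] * 3, [420.0] * 3):
          print(s)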