WorldWideScience

Sample records for classification-tree based models

  1. Predictive mapping of soil organic carbon in wet cultivated lands using classification-tree based models

    DEFF Research Database (Denmark)

    Kheir, Rania Bou; Greve, Mogens Humlekrog; Bøcher, Peder Klith

    2010-01-01

    ... the geographic distribution of SOC across Denmark using remote sensing (RS), geographic information systems (GIS) and decision-tree modeling (un-pruned and pruned classification trees). Seventeen parameters, i.e. parent material, soil type, landscape type, elevation, slope gradient, slope aspect, mean curvature, ... field measurements in the area of interest (Denmark). A large number of tree-based classification models (588) were developed using (i) all of the parameters, (ii) all Digital Elevation Model (DEM) parameters only, (iii) the primary DEM parameters only, (iv) the remote sensing (RS) indices only, (v) selected pairs of parameters, (vi) soil type, parent material and landscape type only, and (vii) the parameters having a high impact on SOC distribution in built pruned trees. The best constructed classification tree models (three in number) with the lowest misclassification error (ME) ...

  2. Investigation of the Effect of Traffic Parameters on Road Hazard Using Classification Tree Model

    Directory of Open Access Journals (Sweden)

    Md. Mahmud Hasan

    2012-09-01

    Full Text Available This paper presents a method for the identification of hazardous situations on freeways. For this study, an approximately 18 km long section of the Eastern Freeway in Melbourne, Australia was selected as a test bed. Three categories of data, i.e. traffic, weather and accident records, were used for the analysis and modelling. A classification tree based model was developed to estimate crash risk probability. In formulating the models, it was found that weather conditions did not have a significant impact on accident occurrence, so the classification tree was built using only two traffic indices: traffic flow and vehicle speed. The formulated classification tree is able to identify possible hazard and non-hazard situations on the freeway. The outcome of the study will aid hazard mitigation strategies.
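The tree-building step described above can be sketched with scikit-learn. Everything below is a synthetic stand-in: the labelling rule, units and thresholds are invented for illustration (the study's hazard labels come from accident records), while the two features mirror the paper's traffic flow and vehicle speed indices.

```python
# Hypothetical sketch: a classification tree flagging hazardous freeway
# conditions from two traffic indices only. Synthetic data, not the
# Eastern Freeway dataset.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 400
flow = rng.uniform(500, 4000, n)   # vehicles per hour (assumed units)
speed = rng.uniform(20, 110, n)    # km/h
# Assumed labelling rule: high flow combined with low speed marks a hazard;
# real labels would come from accident records.
hazard = ((flow > 2500) & (speed < 60)).astype(int)

X = np.column_stack([flow, speed])
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, hazard)
train_acc = clf.score(X, hazard)
print(f"training accuracy: {train_acc:.2f}")
```

Because the synthetic rule is axis-aligned, a shallow tree recovers it almost exactly; real accident data would of course be far noisier.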

  3. Predicting student satisfaction with courses based on log data from a virtual learning environment – a neural network and classification tree model

    Directory of Open Access Journals (Sweden)

    Ivana Đurđević Babić

    2015-03-01

    Full Text Available Student satisfaction with courses in academic institutions is an important issue and is recognized as a form of support in ensuring effective and quality education, as well as enhancing student course experience. This paper investigates whether there is a connection between student satisfaction with courses and log data on student courses in a virtual learning environment. Furthermore, it explores whether a successful classification model for predicting student satisfaction with courses can be developed based on course log data and compares the results obtained from the implemented methods. The research was conducted at the Faculty of Education in Osijek and included analysis of log data and course satisfaction on a sample of third and fourth year students. Multilayer Perceptron (MLP) neural networks with different activation functions, Radial Basis Function (RBF) neural networks, and classification tree models were developed, trained and tested in order to classify students into one of two categories of course satisfaction. Type I and type II errors, classification accuracy, and input variable importance were used for model comparison. The results indicate that a successful classification model using the tested methods can be created. The MLP model provides the highest average classification accuracy and is least prone to misclassifying students with a low level of course satisfaction, although a t-test for the difference in proportions showed that the difference in performance between the compared models is not statistically significant. Student involvement in forum discussions is recognized as a valuable predictor of student satisfaction with courses in all observed models.
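A minimal sketch of the model comparison, assuming scikit-learn's MLPClassifier and DecisionTreeClassifier as stand-ins for the study's neural network and classification tree models; the data is synthetic, not the Osijek course logs.

```python
# Illustrative comparison: train an MLP and a classification tree on the
# same synthetic "course log" features and compare held-out accuracy,
# as the study does with its real data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=8, n_informative=4,
                           random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=42)

mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=42).fit(X_tr, y_tr)
tree = DecisionTreeClassifier(max_depth=4, random_state=42).fit(X_tr, y_tr)

acc = {"MLP": mlp.score(X_te, y_te), "tree": tree.score(X_te, y_te)}
print(acc)
```

On real data one would add the study's other comparisons (type I/II errors, variable importance, a t-test on the accuracy difference).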

  4. Study and ranking of determinants of Taenia solium infections by classification tree models

    OpenAIRE

    Mwape, Kabemba E.; Phiri, Isaac K.; Praet, Nicolas; Dorny, Pierre; Muma, John B; Zulu, Gideon; Speybroeck, Niko; Gabriël, Sarah

    2015-01-01

    Taenia solium taeniasis/cysticercosis is an important public health problem occurring mainly in developing countries. This work aimed to study the determinants of human T. solium infections in the Eastern province of Zambia and rank them in order of importance. A household (HH)-level questionnaire was administered to 680 HHs from 53 villages in two rural districts and the taeniasis and cysticercosis status determined. A classification tree model (CART) was used to define the relative importan...

  5. Study and ranking of determinants of Taenia solium infections by classification tree models.

    Science.gov (United States)

    Mwape, Kabemba E; Phiri, Isaac K; Praet, Nicolas; Dorny, Pierre; Muma, John B; Zulu, Gideon; Speybroeck, Niko; Gabriël, Sarah

    2015-01-01

    Taenia solium taeniasis/cysticercosis is an important public health problem occurring mainly in developing countries. This work aimed to study the determinants of human T. solium infections in the Eastern province of Zambia and rank them in order of importance. A household (HH)-level questionnaire was administered to 680 HHs from 53 villages in two rural districts, and the taeniasis and cysticercosis status was determined. A classification tree model (CART) was used to define the relative importance of, and interactions between, different predictor variables in their effect on taeniasis and cysticercosis. The Katete study area had a significantly higher taeniasis and cysticercosis prevalence than the Petauke area. The CART analysis for Katete showed that the most important determinant for cysticercosis infections was the number of HH inhabitants (6 to 10), and for taeniasis it was a number of HH inhabitants > 6. The most important determinant in Petauke for cysticercosis was the age of head of household (> 32 years), and for taeniasis it was age. The most important determinant of both taeniasis and cysticercosis infections was the number of HH inhabitants (6 to 10) in Katete district and age in Petauke. The results suggest that control measures should target HHs with a high number of inhabitants and older individuals.
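The determinant-ranking idea can be sketched as follows. The household variables and the outcome model below are invented for illustration (not the Zambian questionnaire data), with a CART-style tree's feature importances providing the ranking.

```python
# Sketch of ranking determinants with a CART model. Synthetic stand-in
# data; the assumed outcome model makes infection risk rise with
# household size, echoing (not reproducing) the study's finding.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(7)
n = 500
hh_size = rng.integers(1, 12, n)    # household inhabitants
head_age = rng.integers(18, 80, n)  # age of head of household
latrine = rng.integers(0, 2, n)     # presence of a latrine (noise here)
p = 1 / (1 + np.exp(-(0.5 * hh_size - 3)))   # assumed risk model
infected = (rng.random(n) < p).astype(int)

X = np.column_stack([hh_size, head_age, latrine])
names = ["hh_size", "head_age", "latrine"]
cart = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, infected)
ranking = sorted(zip(names, cart.feature_importances_),
                 key=lambda t: -t[1])
print(ranking)
```

The importances sum to one, so the sorted list gives a relative ranking of determinants analogous to the one reported for the real covariates.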

  6. Superiority of Classification Tree versus Cluster, Fuzzy and Discriminant Models in a Heartbeat Classification System.

    Directory of Open Access Journals (Sweden)

    Vessela Krasteva

    Full Text Available This study presents a 2-stage heartbeat classifier of supraventricular (SVB) and ventricular (VB) beats. Stage 1 makes a computationally-efficient classification of SVB-beats, using a simple correlation threshold criterion for finding a close match with a predominant normal (reference) beat template. The non-matched beats are next subjected to measurement of 20 basic features, tracking the beat and reference template morphology and RR-variability, for subsequent refined classification into the SVB or VB-class by Stage 2. Four linear classifiers are compared: cluster, fuzzy, linear discriminant analysis (LDA) and classification tree (CT), all subjected to iterative training for selection of the optimal feature space among an extended 210-sized set embodying interactive second-order effects between the 20 independent features. The optimization process minimizes, at equal weight, the false positives in the SVB-class and the false negatives in the VB-class. Training with the European ST-T, AHA and MIT-BIH Supraventricular Arrhythmia databases found the best performance settings of all classification models: Cluster (30 features), Fuzzy (72 features), LDA (142 coefficients), CT (221 decision nodes), with the top-3 best scored features being: normalized current RR-interval, higher/lower frequency content ratio, and beat-to-template correlation. Unbiased test-validation with the MIT-BIH Arrhythmia database rates the classifiers in descending order of their specificity for the SVB-class: CT (99.9%), LDA (99.6%), Cluster (99.5%), Fuzzy (99.4%); sensitivity for ventricular ectopic beats as part of the VB-class (commonly reported in published beat-classification studies): CT (96.7%), Fuzzy (94.4%), LDA (94.2%), Cluster (92.4%); positive predictivity: CT (99.2%), Cluster (93.6%), LDA (93.0%), Fuzzy (92.4%). CT has superior accuracy by 0.3-6.8 percentage points, with the advantage of easy model-complexity configuration by pruning the tree, which consists of easily interpretable 'if-then' rules.
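A toy sketch of the 2-stage scheme: Stage 1 accepts beats whose correlation with a reference template exceeds a threshold (the 0.9 below is an assumed value), and Stage 2 classifies the rest with a tree. The signals and the two Stage-2 features are synthetic simplifications of the paper's 20-feature setup.

```python
# Two-stage sketch: correlation-threshold screening, then a tree on the
# non-matched beats. Synthetic "beats", not ECG databases.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 50)
template = np.sin(2 * np.pi * 2 * t)          # reference "normal" beat

def beat(kind, noise):
    base = template if kind == "N" else np.sin(2 * np.pi * 4 * t)
    return base + noise * rng.normal(size=t.size)

beats = ([beat("N", 0.05) for _ in range(15)] +   # clean normals
         [beat("N", 0.5) for _ in range(5)] +     # noisy normals
         [beat("V", 0.1) for _ in range(20)])     # "ventricular" beats
labels = ["N"] * 20 + ["V"] * 20

def corr(a, b):
    return float(np.corrcoef(a, b)[0, 1])

# Stage 1: simple correlation threshold (assumed value 0.9).
stage1 = ["N" if corr(b, template) > 0.9 else None for b in beats]
passed = [i for i, s in enumerate(stage1) if s is None]

# Stage 2: tree on two basic features of the non-matched beats.
feats = np.array([[corr(b, template), b.std()] for b in beats])
tree = DecisionTreeClassifier(random_state=0)
tree.fit(feats[passed], [labels[i] for i in passed])

final = [s if s is not None else tree.predict(feats[i:i + 1])[0]
         for i, s in enumerate(stage1)]
acc = float(np.mean([f == l for f, l in zip(final, labels)]))
print(f"overall accuracy: {acc:.2f}")
```

The design point illustrated is the same as the paper's: the cheap Stage-1 test disposes of the easy majority of beats, so the more expensive feature extraction and tree run only on the residue.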

  7. Support vector machine classification trees based on fuzzy entropy of classification.

    Science.gov (United States)

    de Boves Harrington, Peter

    2017-02-15

    The support vector machine (SVM) is a powerful classifier that has recently been implemented in a classification tree (SVMTreeG). This classifier partitions the data by finding gaps in the data space. For large and complex datasets, there may be no gaps in the data space, confounding this type of classifier. A novel algorithm was devised that uses fuzzy entropy to find optimal partitions for situations when clusters of data overlap in the data space. A kernel version of the fuzzy entropy algorithm was also devised. A fast support vector machine implementation is used that has no cost parameter C or slack variables to optimize. Statistical comparisons among the tree classifiers, using bootstrapped Latin partitions, were made with a synthetic XOR data set, validated with ten prediction sets comprising 50,000 objects, and a data set of NMR spectra obtained from 12 tea sample extracts.

  8. Unimodal transform of variables selected by interval segmentation purity for classification tree modeling of high-dimensional microarray data.

    Science.gov (United States)

    Du, Wen; Gu, Ting; Tang, Li-Juan; Jiang, Jian-Hui; Wu, Hai-Long; Shen, Guo-Li; Yu, Ru-Qin

    2011-09-15

    As a greedy search algorithm, classification and regression tree (CART) modeling easily relapses into overfitting when applied to microarray gene expression data. A straightforward solution is to filter irrelevant genes by identifying significant ones. Because some significant genes with multi-modal expression patterns, exhibiting systematic differences among within-class samples, are difficult to identify with existing methods, a strategy of unimodal transform of variables selected by interval segmentation purity (UTISP) is proposed for CART modeling. First, significant genes exhibiting varied expression patterns are identified by a variable selection method based on interval segmentation purity. Then, a unimodal transform is implemented to offer unimodal featured variables for CART modeling via feature extraction. Because significant genes with complex expression patterns are properly identified and unimodal features extracted in advance, this strategy potentially improves the performance of CART in combating overfitting or underfitting while modeling microarray data. The developed strategy is demonstrated using two microarray data sets. The results reveal that UTISP-based CART provides superior performance to k-nearest neighbors or CARTs coupled with other gene-identifying strategies, indicating that UTISP-based CART holds great promise for microarray data analysis.

  9. A Method for Application of Classification Tree Models to Map Aquatic Vegetation Using Remotely Sensed Images from Different Sensors and Dates

    Directory of Open Access Journals (Sweden)

    Ying Cai

    2012-09-01

    Full Text Available In previous attempts to identify aquatic vegetation from remotely-sensed images using classification trees (CT), the images used to apply CT models to different times or locations necessarily originated from the same satellite sensor as that from which the original images used in model development came, greatly limiting the application of CT. We have developed an effective normalization method to improve the robustness of CT models when applied to images originating from different sensors and dates. A total of 965 ground-truth samples of aquatic vegetation types were obtained in 2009 and 2010 in Taihu Lake, China. Using relevant spectral indices (SI) as classifiers, we manually developed a stable CT model structure and then applied a standard CT algorithm to obtain quantitative (optimal) thresholds from 2009 ground-truth data and images from Landsat7-ETM+, HJ-1B-CCD, Landsat5-TM and ALOS-AVNIR-2 sensors. Optimal CT thresholds produced average classification accuracies of 78.1%, 84.7% and 74.0% for emergent vegetation, floating-leaf vegetation and submerged vegetation, respectively. However, the optimal CT thresholds for different sensor images differed from each other, with an average relative variation (RV) of 6.40%. We developed and evaluated three new approaches to normalizing the images. The best-performing method (Method of 0.1% index scaling) normalized the SI images using tailored percentages of extreme pixel values. Using the images normalized by the Method of 0.1% index scaling, CT models for a particular sensor in which thresholds were replaced by those from the models developed for images originating from other sensors provided average classification accuracies of 76.0%, 82.8% and 68.9% for emergent vegetation, floating-leaf vegetation and submerged vegetation, respectively. Applying the CT models developed for normalized 2009 images to 2010 images resulted in high classification (78.0%-93.3%) and overall (92.0%-93.1%) accuracies. Our...
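The normalization idea can be sketched as percentile-based index scaling. The 0.1th-percentile choice and the linear rescaling below are assumptions about the method, and the two "sensor" images are synthetic affine transforms of one scene.

```python
# Sketch of percentile-based index normalization: rescale a spectral-index
# image using robust extreme values so thresholds transfer across sensors.
import numpy as np

def normalize_index(img, pct=0.1):
    lo = np.percentile(img, pct)         # robust minimum (0.1th percentile)
    hi = np.percentile(img, 100 - pct)   # robust maximum
    return np.clip((img - lo) / (hi - lo), 0.0, 1.0)

rng = np.random.default_rng(0)
# Two synthetic "sensor" images of the same scene with different gain/offset.
scene = rng.normal(size=(100, 100))
img_a = 2.0 * scene + 5.0
img_b = 0.5 * scene - 1.0

na, nb = normalize_index(img_a), normalize_index(img_b)
# After normalization the images agree, so one threshold can serve both.
print(float(np.abs(na - nb).max()))
```

Because percentiles are equivariant under positive affine transforms, sensor-specific gain and offset cancel out exactly in this toy case; real cross-sensor differences are only approximately affine.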

  10. Building classification trees to explain the radioactive contamination levels of the plants; Construction d'arbres de discrimination pour expliquer les niveaux de contamination radioactive des vegetaux

    Energy Technology Data Exchange (ETDEWEB)

    Briand, B

    2008-04-15

    The objective of this thesis is the development of a method allowing the identification of factors leading to various radioactive contamination levels of plants. The suggested methodology is based on the use of a radioecological model of radionuclide transfer through the environment (the A.S.T.R.A.L. computer code) and a classification-tree method. In particular, to avoid the instability problems of classification trees and to preserve the tree structure, a node-level stabilizing technique is used. Empirical comparisons are carried out between classification trees built by this method (called the R.E.N. method) and those obtained by the C.A.R.T. method. A similarity measure is defined to compare the structure of two classification trees, and this measure is used to study the stabilizing performance of the R.E.N. method. The suggested methodology is applied to a simplified contamination scenario. From the results obtained, we can identify the main variables responsible for the various radioactive contamination levels of four leafy vegetables (lettuce, cabbage, spinach and leek). Some rules extracted from these classification trees may be usable in a post-accident context. (author)

  11. 分类树模型分析糖尿病早期肾损害相关影响因素%Classification tree model analysis on related factors of early renal damage in type 2 diabetic patients

    Institute of Scientific and Technical Information of China (English)

    赵文波; 李明; 唐骅; 刘迅; 司美君; 彭晖; 娄探奇

    2013-01-01

    Objective: To analyze the impact factors for early renal damage in type 2 diabetic patients by the classification tree model. Methods: A total of 601 patients with type 2 diabetes were enrolled. According to glomerular filtration rates and urine albumin quantification, the patients were divided into a type 2 diabetes group (418 cases) and an early diabetic renal damage group (183 cases). The clinical data of the patients were recorded to analyze the main influential factors for microalbuminuria in type 2 diabetic patients using the Exhaustive CHAID classification tree algorithm. Results: Six important explanatory variables were screened out by the classification tree model from the 34 candidate variables related to early renal damage: fibrinogen, history of hypertension, retinopathy, Cys C level, SBP and peripheral neuropathy. Elevated fibrinogen was the main factor. Conclusion: The classification tree model can effectively analyze the major influential factors of early renal damage in type 2 diabetic patients, and it can help develop prevention and treatment methods.

  12. The Hybrid of Classification Tree and Extreme Learning Machine for Permeability Prediction in Oil Reservoir

    KAUST Repository

    Prasetyo Utomo, Chandra

    2011-06-01

    Permeability is an important parameter connected with oil reservoirs. Predicting the permeability could save millions of dollars. Unfortunately, petroleum engineers have faced numerous challenges arriving at cost-efficient predictions. Much work has been carried out to solve this problem. The main challenge is to handle the high range of permeability in each reservoir. For about a hundred years, mathematicians and engineers have tried to deliver the best prediction models; however, none of them have produced satisfying results. In the last two decades, artificial intelligence models have been used. The current best model for permeability prediction is the extreme learning machine (ELM). It produces fairly good results, but a clear explanation of the model is hard to come by because it is so complex. The aim of this research is to propose a way out of this complexity through the design of a hybrid intelligent model. In this proposal, the system combines classification and regression models to predict the permeability value, based on well log data. In order to handle the high range of permeability values, a classification tree is utilized. A benefit of this innovation is that the tree represents knowledge in a clear and succinct fashion and thereby avoids the complexity of all previous models. Finally, the ELM is used as the final predictor. Results demonstrate that this proposed hybrid model performs better than support vector machines (SVM) and ELM in terms of correlation coefficient. Moreover, the classification tree model potentially leads to better communication among petroleum engineers concerning this important process and has wider implications for oil reservoir management efficiency.

  13. Use of classification trees to apportion single echo detections to species: Application to the pelagic fish community of Lake Superior

    Science.gov (United States)

    Yule, Daniel L.; Adams, Jean V.; Hrabik, Thomas R.; Vinson, Mark R.; Woiak, Zebadiah; Ahrenstroff, Tyler D.

    2013-01-01

    Acoustic methods are used to estimate the density of pelagic fish in large lakes with results of midwater trawling used to assign species composition. Apportionment in lakes having mixed species can be challenging because only a small fraction of the water sampled acoustically is sampled with trawl gear. Here we describe a new method where single echo detections (SEDs) are assigned to species based on classification tree models developed from catch data that separate species based on fish size and the spatial habitats they occupy. During the summer of 2011, we conducted a spatially-balanced lake-wide acoustic and midwater trawl survey of Lake Superior. A total of 51 sites in four bathymetric depth strata (0–30 m, 30–100 m, 100–200 m, and >200 m) were sampled. We developed classification tree models for each stratum and found fish length was the most important variable for separating species. To apply these trees to the acoustic data, we needed to identify a target strength to length (TS-to-L) relationship appropriate for all abundant Lake Superior pelagic species. We tested performance of 7 general (i.e., multi-species) relationships derived from three published studies. The best-performing relationship was identified by comparing predicted and observed catch compositions using a second independent Lake Superior data set. Once identified, the relationship was used to predict lengths of SEDs from the lake-wide survey, and the classification tree models were used to assign each SED to a species. Exotic rainbow smelt (Osmerus mordax) were the most common species at bathymetric depths 100 m (384 million; 6.0 kt). Cisco (Coregonus artedi) were widely distributed over all strata with their population estimated at 182 million (44 kt). The apportionment method we describe should be transferable to other large lakes provided fish are not tightly aggregated, and an appropriate TS-to-L relationship for abundant pelagic fish species can be determined.

  14. DIF Trees: Using Classification Trees to Detect Differential Item Functioning

    Science.gov (United States)

    Vaughn, Brandon K.; Wang, Qiu

    2010-01-01

    A nonparametric tree classification procedure is used to detect differential item functioning for items that are dichotomously scored. Classification trees are shown to be an alternative to traditional Mantel-Haenszel and logistic regression analysis for detecting differential item functioning. A nonparametric…

  15. Prediction of Infertility Treatment Outcomes Using Classification Trees

    Directory of Open Access Journals (Sweden)

    Milewska Anna Justyna

    2016-12-01

    Full Text Available Infertility is currently a common problem with causes that are often unexplained, which complicates treatment. In many cases, the use of ART methods provides the only possibility of getting pregnant. Analysis of this type of data is very complex. Increasingly, data mining methods and artificial intelligence techniques are appropriate for solving such problems. In this study, classification trees were used for analysis. This resulted in identifying the group of patients most likely to get pregnant while using in vitro fertilization.

  16. Stroke damage detection using classification trees on electrical bioimpedance cerebral spectroscopy measurements.

    Science.gov (United States)

    Atefi, Seyed Reza; Seoane, Fernando; Thorlin, Thorleif; Lindecrantz, Kaj

    2013-08-07

    After cancer and cardiovascular disease, stroke is the third greatest cause of death worldwide. Given the limitations of the current imaging technologies used for stroke diagnosis, the need for portable, non-invasive and less expensive diagnostic tools is crucial. Previous studies have suggested that electrical bioimpedance (EBI) measurements from the head might contain useful clinical information related to changes produced in the cerebral tissue after the onset of stroke. In this study, we recorded 720 EBI spectroscopy (EBIS) measurements from two different head regions of 18 hemispheres of nine subjects. Three of these subjects had suffered a unilateral haemorrhagic stroke. A number of features based on structural and intrinsic frequency-dependent properties of the cerebral tissue were extracted. These features were then fed into a classification tree. The results show that a full classification of damaged and undamaged cerebral tissue was achieved after three hierarchical classification steps. Lastly, the performance of the classification tree was assessed using Leave-One-Out Cross Validation (LOO-CV). Although the results of this study are limited to a small database and the observations must be verified further with a larger cohort of patients, these findings confirm that EBI measurements contain useful information for assessing the health of brain tissue after stroke, and support the hypothesis that classification features based on Cole parameters, spectral information and the geometry of EBIS measurements are useful for differentiating between healthy and stroke-damaged brain tissue.
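The evaluation step can be sketched with scikit-learn's LeaveOneOut; the features below are synthetic, not bioimpedance spectra.

```python
# Leave-One-Out Cross Validation of a classification tree, as used to
# assess the EBIS-based classifier. Synthetic features for illustration.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=60, n_features=5, n_informative=3,
                           random_state=0)
scores = cross_val_score(DecisionTreeClassifier(max_depth=3, random_state=0),
                         X, y, cv=LeaveOneOut())
print(f"LOO-CV accuracy: {scores.mean():.2f} over {len(scores)} folds")
```

LOO-CV fits one model per sample, which is why it suits small databases like the nine-subject cohort here: no data is wasted on a held-out split.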

  17. Construction of a risk prediction model for proteinuria progression in diabetic nephropathy by classification tree%应用分类树模型构建糖尿病肾病蛋白尿进展的风险预测模型

    Institute of Scientific and Technical Information of China (English)

    赵文波; 李明; 唐骅; 司美君; 刘迅; 娄探奇

    2013-01-01

    Objective: To establish a risk prediction model for the progression of microalbuminuria to proteinuria in diabetic nephropathy, and to evaluate its value for type 2 diabetes patients with early renal damage. Methods: A total of 256 patients with confirmed type 2 diabetes who were inpatients or outpatients in our hospital from January to June 2012 were enrolled in this study. According to estimated glomerular filtration rate (eGFR) staging and urine albumin quantitative analysis, they were divided into a proteinuria group (n = 73) and an early diabetic kidney damage group (n = 183). The clinical data of the patients were recorded to analyze the main factors for the progression of diabetic nephropathy using the Exhaustive CHAID classification tree algorithm. The value of the established model was evaluated by the Risk statistic and index map. Results: The model had 3 strata and 11 nodes. Four explanatory variables were screened out in the model: CysC level, history of hypertension, waist-hip ratio and albumin level. The most important risk factors were CysC level and waist-hip ratio. The misclassification probability (Risk value) of the model was 0.141, and the classification tree model fitted the data very well. Conclusion: The classification tree model can not only properly predict the progression of microalbuminuria to proteinuria in diabetic nephropathy, but also reveal complex interaction effects among the factors.

  18. Forest classification trees and forest support vector machines algorithms: Demonstration using microarray data.

    Science.gov (United States)

    Zintzaras, Elias; Kowald, Axel

    2010-05-01

    Classification into multiple classes when the measured variables outnumber the samples is a major methodological challenge in -omics studies. Two algorithms that overcome the dimensionality problem are presented: the forest classification tree (FCT) and the forest support vector machines (FSVM). In FCT, a set of variables is randomly chosen and a classification tree (CT) is grown using a forward classification algorithm. The process is repeated and a forest of CTs is derived. Finally, the most frequent variables from the trees with the smallest apparent misclassification rate (AMR) are used to construct a productive tree. In FSVM, the CTs are replaced by SVMs. The methods are demonstrated using prostate gene expression data for classifying tissue samples into four tumor types. For a threshold split value of 0.001 and utilizing 100 markers, the productive CT consisted of 29 terminal nodes and achieved perfect classification (AMR = 0). When the threshold value was set to 0.01, a tree with 17 terminal nodes was constructed based on 15 markers (AMR = 7%). In FSVM, reducing the fraction of the forest used to construct the best classifier from the top 80% to the top 20% reduced the misclassification to 25% (when using 200 markers). The proposed methodologies may be used for identifying important variables in high-dimensional data. Furthermore, the FCT allows exploring the data structure and provides a decision rule.
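A minimal sketch of the FCT idea under stated assumptions (random 5-variable subsets, depth-3 trees, the best 20 of 100 trees by apparent misclassification rate); the data is synthetic, not prostate expression data.

```python
# Forest-of-classification-trees sketch: grow trees on random variable
# subsets, keep those with the lowest apparent misclassification rate
# (AMR), and count which variables the best trees use most often.
import numpy as np
from collections import Counter
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=200, n_features=30, n_informative=4,
                           n_redundant=0, shuffle=False, random_state=0)
rng = np.random.default_rng(0)

results = []
for _ in range(100):                      # grow a forest of 100 trees
    cols = rng.choice(X.shape[1], size=5, replace=False)
    t = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X[:, cols], y)
    amr = 1 - t.score(X[:, cols], y)      # apparent misclassification rate
    used = cols[np.unique(t.tree_.feature[t.tree_.feature >= 0])]
    results.append((amr, used))

results.sort(key=lambda r: r[0])
best = results[:20]                       # trees with the smallest AMR
counts = Counter(f for _, used in best for f in used)
top = [f for f, _ in counts.most_common(4)]
print("most frequent variables in best trees:", top)
```

With `shuffle=False` the informative variables are columns 0-3, so the frequency count over the best trees should recover at least one of them, mirroring how FCT selects variables for the final "productive" tree.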

  19. Integrating TM and Ancillary Geographical Data with Classification Trees for Land Cover Classification of Marsh Area

    Institute of Scientific and Technical Information of China (English)

    NA Xiaodong; ZHANG Shuqing; ZHANG Huaiqing; LI Xiaofeng; YU Huan; LIU Chunyue

    2009-01-01

    The main objective of this research is to determine the capacity of land cover classification combining spectral and textural features of Landsat TM imagery with ancillary geographical data in wetlands of the Sanjiang Plain, Heilongjiang Province, China. Semi-variograms and Z-test values were calculated to assess the separability of grey-level co-occurrence texture measures so as to maximize the difference between land cover types. The degree of spatial autocorrelation showed that window sizes of 3×3 pixels and 11×11 pixels were most appropriate for Landsat TM image texture calculations. The texture analysis showed that the co-occurrence entropy, dissimilarity and variance texture measures, derived from the Landsat TM spectral bands and vegetation indices, provided the most significant statistical differentiation between land cover types. Subsequently, a Classification and Regression Tree (CART) algorithm was applied to three different combinations of predictors: 1) TM imagery alone (TM-only); 2) TM imagery plus image texture (TM+TXT model); and 3) all predictors including TM imagery, image texture and additional ancillary GIS information (TM+TXT+GIS model). Compared with traditional Maximum Likelihood Classification (MLC) supervised classification, the three classification tree predictive models significantly reduced the overall error rate. Image texture measures and ancillary geographical variables effectively suppressed speckle noise and markedly reduced the classification error rate for marsh. For the classification tree model making use of all available predictors, the omission error rate was 12.90% and the commission error rate was 10.99% for marsh. The developed method is portable, relatively easy to implement, and should be applicable in other settings and over larger extents.
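The texture step can be sketched by computing a grey-level co-occurrence matrix (GLCM) and two of the measures named above (entropy and dissimilarity) on toy images; the single horizontal offset and the tiny images are simplifications of the windowed computation on real imagery.

```python
# GLCM sketch: co-occurrence counts for a horizontal pixel offset, plus
# the entropy and dissimilarity measures derived from them.
import numpy as np

def glcm(img, levels=8):
    """Normalized co-occurrence matrix for the (0, 1) horizontal offset."""
    m = np.zeros((levels, levels))
    for a, b in zip(img[:, :-1].ravel(), img[:, 1:].ravel()):
        m[a, b] += 1
    return m / m.sum()

rng = np.random.default_rng(0)
# "smooth" has long horizontal runs; "noisy" is uncorrelated pixel noise.
smooth = np.repeat(rng.integers(0, 8, (11, 2)), 6, axis=1)[:, :11]
noisy = rng.integers(0, 8, (11, 11))

measures = {}
for name, img in [("smooth", smooth), ("noisy", noisy)]:
    p = glcm(img)
    nz = p[p > 0]
    i, j = np.indices(p.shape)
    measures[name] = {"entropy": float(-np.sum(nz * np.log(nz))),
                      "dissimilarity": float(np.sum(p * np.abs(i - j)))}
print(measures)
```

Smooth texture concentrates mass on the GLCM diagonal (low entropy, low dissimilarity), while speckle spreads it out, which is exactly the contrast the CART predictors exploit.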

  20. Genetic Algorithms and Classification Trees in Feature Discovery: Diabetes and the NHANES database

    Energy Technology Data Exchange (ETDEWEB)

    Heredia-Langner, Alejandro; Jarman, Kristin H.; Amidan, Brett G.; Pounds, Joel G.

    2013-09-01

    This paper presents a feature selection methodology that can be applied to datasets containing a mixture of continuous and categorical variables. Using a Genetic Algorithm (GA), this method explores a dataset and selects a small set of features relevant for the prediction of a binary (1/0) response. Binary classification trees and an objective function based on conditional probabilities are used to measure the fitness of a given subset of features. The method is applied to health data in order to find factors useful for the prediction of diabetes. Results show that our algorithm is capable of narrowing down the set of predictors to around 8 factors that can be validated using reputable medical and public health resources.
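A compact sketch of GA-driven feature selection with a tree-based fitness function. Population size, rates and the data are illustrative, and the fitness here is cross-validated accuracy rather than the paper's conditional-probability objective.

```python
# Genetic-algorithm feature selection sketch: bitmask chromosomes,
# truncation selection, one-point crossover, bit-flip mutation, with a
# depth-3 classification tree scoring each feature subset.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=200, n_features=20, n_informative=3,
                           n_redundant=0, shuffle=False, random_state=1)
rng = np.random.default_rng(1)

def fitness(mask):
    if not mask.any():
        return 0.0
    clf = DecisionTreeClassifier(max_depth=3, random_state=0)
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

pop = rng.random((12, 20)) < 0.3           # initial random feature masks
for gen in range(10):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[::-1][:6]]   # truncation selection
    children = []
    for _ in range(6):
        a, b = parents[rng.choice(6, 2, replace=False)]
        cut = rng.integers(1, 20)
        child = np.concatenate([a[:cut], b[cut:]])  # one-point crossover
        flip = rng.random(20) < 0.05                # bit-flip mutation
        children.append(np.where(flip, ~child, child))
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(m) for m in pop])]
print("selected features:", np.flatnonzero(best))
```

Scoring subsets with a cheap, shallow tree keeps each of the GA's many fitness evaluations affordable, which is the same trade-off the paper's binary-tree fitness function makes.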

  21. Classification tree and minimum-volume ellipsoid analyses of the distribution of ponderosa pine in the western USA

    Science.gov (United States)

    Norris, Jodi R.; Jackson, Stephen T.; Betancourt, Julio L.

    2006-01-01

    Aim: Ponderosa pine (Pinus ponderosa Douglas ex Lawson & C. Lawson) is an economically and ecologically important conifer that has a wide geographic range in the western USA, but is mostly absent from the geographic centre of its distribution - the Great Basin and adjoining mountain ranges. Much of its modern range was achieved by migration of geographically distinct Sierra Nevada (P. ponderosa var. ponderosa) and Rocky Mountain (P. ponderosa var. scopulorum) varieties in the last 10,000 years. Previous research has confirmed genetic differences between the two varieties, and measurable genetic exchange occurs where their ranges now overlap in western Montana. A variety of approaches in bioclimatic modelling is required to explore the ecological differences between these varieties and their implications for historical biogeography and impending changes in western landscapes. Location: Western USA. Methods: We used a classification tree analysis and a minimum-volume ellipsoid as models to explain the broad patterns of distribution of ponderosa pine in modern environments using climatic and edaphic variables. Most biogeographical modelling assumes that the target group represents a single, ecologically uniform taxonomic population. Classification tree analysis does not require this assumption because it allows the creation of pathways that predict multiple positive and negative outcomes. Thus, classification tree analysis can be used to test the ecological uniformity of the species. In addition, a multidimensional ellipsoid was constructed to describe the niche of each variety of ponderosa pine, and distances from the niche were calculated and mapped on a 4-km grid for each ecological variable. Results: The resulting classification tree identified three dominant pathways predicting ponderosa pine presence. Two of these three pathways correspond roughly to the distribution of var. ponderosa, and the third pathway generally corresponds to the distribution of var...
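The ellipsoid part can be sketched with scikit-learn's minimum covariance determinant estimator, a robust relative of the minimum-volume ellipsoid; the climate values below are synthetic, not the 4-km grid data.

```python
# Robust niche-ellipsoid sketch: fit a minimum covariance determinant
# estimator to presence points and measure each candidate site's squared
# Mahalanobis distance from the fitted ellipsoid.
import numpy as np
from sklearn.covariance import MinCovDet

rng = np.random.default_rng(2)
# Synthetic (temperature, precipitation) values at presence locations.
presence = rng.normal(loc=[10.0, 600.0], scale=[2.0, 80.0], size=(200, 2))

mcd = MinCovDet(random_state=0).fit(presence)
# Distance of two candidate grid cells from the fitted niche.
inside = np.array([10.5, 620.0])
outside = np.array([25.0, 100.0])
d_in, d_out = mcd.mahalanobis(np.vstack([inside, outside]))
print(f"inside: {d_in:.1f}, outside: {d_out:.1f}")
```

Mapping such distances over a grid gives the "distance from the niche" surfaces described above, with large distances marking climates outside the variety's ellipsoid.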

  2. Mastectomy or breast conserving surgery? Factors affecting type of surgical treatment for breast cancer – a classification tree approach

    Directory of Open Access Journals (Sweden)

    O'Neill Terry

    2006-04-01

    Full Text Available Abstract Background A critical choice facing breast cancer patients is which surgical treatment – mastectomy or breast conserving surgery (BCS) – is most appropriate. Several studies have investigated factors that impact the type of surgery chosen, identifying features such as place of residence, age at diagnosis, tumor size, socio-economic and racial/ethnic elements as relevant. Such assessment of "propensity" is important in understanding issues such as a reported under-utilisation of BCS among women for whom such treatment was not contraindicated. Using Western Australian (WA) data, we further examine the factors associated with the type of surgical treatment for breast cancer using a classification tree approach. This approach deals naturally with complicated interactions between factors, and so allows flexible and interpretable models for treatment choice to be built that add to the current understanding of this complex decision process. Methods Data were extracted from the WA Cancer Registry on women diagnosed with breast cancer in WA from 1990 to 2000. Subjects' treatment preferences were predicted from covariates using both classification trees and logistic regression. Results Tumor size was the primary determinant of patient choice, with subjects whose tumors were smaller than 20 mm in diameter preferring BCS. For subjects with tumors greater than 20 mm in diameter, factors such as patient age, nodal status, and tumor histology become relevant as predictors of patient choice. Conclusion Classification trees perform as well as logistic regression for predicting patient choice, but are much easier to interpret for clinical use. The selected tree can inform clinicians' advice to patients.
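
The 20 mm tumour-size split reported above is exactly the kind of threshold a classification tree's impurity criterion recovers from data. The sketch below uses hypothetical sizes and treatment labels (not the WA Cancer Registry data) to show how a root split is chosen by minimising weighted Gini impurity:

```python
# Toy sketch: choosing a classification tree's root split by Gini impurity.
# The tumour sizes and treatment labels below are hypothetical.

def gini(labels):
    """Gini impurity of a list of 0/1 class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n  # fraction of class 1 (e.g. chose BCS)
    return 2 * p * (1 - p)

def best_split(values, labels):
    """Threshold on `values` giving the lowest weighted Gini impurity."""
    best_t, best_score = None, float("inf")
    for t in sorted(set(values)):
        left = [l for v, l in zip(values, labels) if v <= t]
        right = [l for v, l in zip(values, labels) if v > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
        if score < best_score:
            best_t, best_score = t, score
    return best_t

# Hypothetical cohort: tumour diameter (mm) and treatment (1 = BCS, 0 = mastectomy)
size = [8, 12, 15, 18, 19, 22, 25, 30, 35, 40]
bcs  = [1,  1,  1,  1,  1,  0,  0,  0,  1,  0]

print(best_split(size, bcs))  # → 19, just under the 20 mm boundary
```

On this toy cohort the best threshold lands just under 20 mm; CART then recurses on each side, which is how factors such as nodal status can become relevant only within the larger-tumour branch.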

  3. Remote sensing of aquatic vegetation distribution in Taihu Lake using an improved classification tree with modified thresholds.

    Science.gov (United States)

    Zhao, Dehua; Jiang, Hao; Yang, Tangwu; Cai, Ying; Xu, Delin; An, Shuqing

    2012-03-01

    Classification trees (CT) have been used successfully in the past to classify aquatic vegetation from spectral indices (SI) obtained from remotely-sensed images. However, applying CT models developed for certain image dates to other time periods within the same year or among different years can reduce the classification accuracy. In this study, we developed CT models with modified thresholds using extreme SI values (CT(m)) to improve the stability of the models when applying them to different time periods. A total of 903 ground-truth samples were obtained in September of 2009 and 2010 and classified as emergent, floating-leaf, or submerged vegetation or other cover types. Classification trees were developed for 2009 (Model-09) and 2010 (Model-10) using field samples and a combination of two images from winter and summer. Overall accuracies of these models were 92.8% and 94.9%, respectively, which confirmed the ability of CT analysis to map aquatic vegetation in Taihu Lake. However, Model-10 had only 58.9-71.6% classification accuracy and 31.1-58.3% agreement (i.e., pixels classified the same in the two maps) for aquatic vegetation when it was applied to image pairs from both a different time period in 2010 and a similar time period in 2009. We developed a method to estimate the effects of extrinsic (EF) and intrinsic (IF) factors on model uncertainty using MODIS images. Results indicated that 71.1% of the instability in classification between time periods was due to EF, which might include changes in atmospheric conditions, sun-view angle and water quality. The remainder was due to IF, such as phenological and growth status differences between time periods. The modified version of Model-10 (i.e. CT(m)) performed better than traditional CT with different image dates. When applied to 2009 images, the CT(m) version of Model-10 had very similar thresholds and performance as Model-09, with overall accuracies of 92.8% and 90.5% for Model-09 and the CT(m) version of Model

  4. A cross-cultural investigation of college student alcohol consumption: a classification tree analysis.

    Science.gov (United States)

    Kitsantas, Panagiota; Kitsantas, Anastasia; Anagnostopoulou, Tanya

    2008-01-01

    In this cross-cultural study, the authors attempted to identify high-risk subgroups for alcohol consumption among college students. American and Greek students (N = 132) answered questions about alcohol consumption, religious beliefs, attitudes toward drinking, advertisement influences, parental monitoring, and drinking consequences. Heavy drinkers in the American group were younger and less religious than were infrequent drinkers. In the Greek group, heavy drinkers tended to deny the negative results of drinking alcohol and use a permissive attitude to justify it, whereas infrequent drinkers were more likely to be monitored by their parents. These results suggest that parental monitoring and an emphasis on informing students about the negative effects of alcohol on their health and social and academic lives may be effective methods of reducing alcohol consumption. Classification tree analysis revealed that student attitudes toward drinking were important in the classification of American and Greek drinkers, indicating that this is a powerful predictor of alcohol consumption regardless of ethnic background.

  5. Applying Classification Trees to Hospital Administrative Data to Identify Patients with Lower Gastrointestinal Bleeding.

    Directory of Open Access Journals (Sweden)

    Juned Siddique

    Full Text Available Lower gastrointestinal bleeding (LGIB) is a common cause of acute hospitalization. Currently, there is no accepted standard for identifying patients with LGIB in hospital administrative data. The objective of this study was to develop and validate a set of classification algorithms that use hospital administrative data to identify LGIB. Our sample consists of patients admitted between July 1, 2001 and June 30, 2003 (derivation cohort) and July 1, 2003 and June 30, 2005 (validation cohort) to the general medicine inpatient service of the University of Chicago Hospital, a large urban academic medical center. Confirmed cases of LGIB in both cohorts were determined by reviewing the charts of those patients who had at least 1 of 36 principal or secondary International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) diagnosis codes associated with LGIB. Classification trees were used on the data of the derivation cohort to develop a set of decision rules for identifying patients with LGIB. These rules were then applied to the validation cohort to assess their performance. Three classification algorithms were identified and validated: a high specificity rule with 80.1% sensitivity and 95.8% specificity, a rule that balances sensitivity and specificity (87.8% sensitivity, 90.9% specificity), and a high sensitivity rule with 100% sensitivity and 91.0% specificity. These classification algorithms can be used in future studies to evaluate resource utilization and assess outcomes associated with LGIB without the use of chart review.
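
The sensitivity/specificity trade-offs quoted above follow directly from confusion-matrix counts. A minimal sketch, with hypothetical validation counts chosen to approximate the high-specificity rule's figures (80.1% is approximated as 80/100 here):

```python
# Sensitivity/specificity from confusion-matrix counts (hypothetical numbers).

def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts for a high-specificity rule: 80 of 100 true LGIB cases
# flagged, 958 of 1000 non-cases correctly left unflagged.
sens, spec = sens_spec(tp=80, fn=20, tn=958, fp=42)
print(f"sensitivity={sens:.1%}, specificity={spec:.1%}")
# → sensitivity=80.0%, specificity=95.8%
```

The high-sensitivity rule accepts more false positives in exchange for the guarantee that no true LGIB admission is missed.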

  6. Knowledge-Based Classification in Automated Soil Mapping

    Institute of Scientific and Technical Information of China (English)

    ZHOU BIN; WANG RENCHAO

    2003-01-01

    A machine-learning approach was developed for automated building of knowledge bases for soil resources mapping by using a classification tree to generate knowledge from training data. With this method, building a knowledge base for automated soil mapping was easier than using the conventional knowledge acquisition approach. The knowledge base built by the classification tree was used by the knowledge classifier to perform the soil type classification of Longyou County, Zhejiang Province, China using Landsat TM bi-temporal images and GIS data. To evaluate the performance of the resultant knowledge bases, the classification results were compared to an existing soil map based on a field survey. The accuracy assessment and analysis of the resultant soil maps suggested that the knowledge bases built by the machine-learning method were of good quality for mapping the distribution of soil classes over the study area.

  7. Systematic Model-in-the-Loop Test of Embedded Control Systems

    Science.gov (United States)

    Krupp, Alexander; Müller, Wolfgang

    Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides the lack of methodical support for natural language formalization, there does not exist a standardized and accepted means for formal property definition as a target for verification planning. This article addresses several shortcomings of embedded system verification. An Enhanced Classification Tree Method is developed based on the established Classification Tree Method for Embedded Systems (CTM/ES), which applies a hardware verification language to define a verification environment.

  8. Predicting Chemically Induced Duodenal Ulcer and Adrenal Necrosis with Classification Trees

    Science.gov (United States)

    Giampaolo, Casimiro; Gray, Andrew T.; Olshen, Richard A.; Szabo, Sandor

    1991-07-01

    Binary tree-structured statistical classification algorithms and properties of 56 model alkyl nucleophiles were brought to bear on two problems of experimental pharmacology and toxicology. Each rat of a learning sample of 745 was administered one compound and autopsied to determine the presence of duodenal ulcer or adrenal hemorrhagic necrosis. The cited statistical classification schemes were then applied to these outcomes and 67 features of the compounds to ascertain those characteristics that are associated with biologic activity. For predicting duodenal ulceration, dipole moment, melting point, and solubility in octanol are particularly important, while for predicting adrenal necrosis, important features include the number of sulfhydryl groups and double bonds. These methods may constitute inexpensive but powerful ways to screen untested compounds for possible organ-specific toxicity. Mechanisms for the etiology and pathogenesis of the duodenal and adrenal lesions are suggested, as are additional avenues for drug design.

  9. A decision tree algorithm for investigation of model biases related to dynamical cores and physical parameterizations.

    Science.gov (United States)

    Soner Yorgun, M; Rood, Richard B

    2016-12-01

    An object-based evaluation method using a pattern recognition algorithm (i.e., classification trees) is applied to the simulated orographic precipitation for idealized experimental setups using the National Center for Atmospheric Research (NCAR) Community Atmosphere Model (CAM) with the finite volume (FV) and the Eulerian spectral transform dynamical cores with varying resolutions. Daily simulations were analyzed and three different types of precipitation features were identified by the classification tree algorithm. The statistical characteristics of these features (i.e., maximum value, mean value, and variance) were calculated to quantify the difference between the dynamical cores and changing resolutions. Even with the simple and smooth topography in the idealized setups, complexity in the precipitation fields simulated by the models develops quickly. The classification tree algorithm using objective thresholding successfully detected different types of precipitation features even as the complexity of the precipitation field increased. The results show that the complexity and the bias introduced in small-scale phenomena due to the spectral transform method of the CAM Eulerian spectral dynamical core is prominent, and is an important reason for its dissimilarity from the FV dynamical core. The resolvable scales, both in horizontal and vertical dimensions, have a significant effect on the simulation of precipitation. The results of this study also suggest that an efficient and informative study about the biases produced by GCMs should involve daily (or even hourly) output (rather than monthly mean) analysis over local scales.

  10. Estimating the Probability of Vegetation to Be Groundwater Dependent Based on the Evaluation of Tree Models

    Directory of Open Access Journals (Sweden)

    Isabel C. Pérez Hoyos

    2016-04-01

    Full Text Available Groundwater Dependent Ecosystems (GDEs) are increasingly threatened by humans’ rising demand for water resources. Consequently, it is imperative to identify the location of GDEs to protect them. This paper develops a methodology to identify the probability of an ecosystem to be groundwater dependent. Probabilities are obtained by modeling the relationship between the known locations of GDEs and factors influencing groundwater dependence, namely water table depth and climatic aridity index. Probabilities are derived for the state of Nevada, USA, using modeled water table depth and aridity index values obtained from the Global Aridity database. The selected model results from a performance comparison of classification trees (CT) and random forests (RF). Based on a threshold-independent accuracy measure, RF has a better ability to generate probability estimates. Considering a threshold that minimizes the misclassification rate for each model, RF also proves to be more accurate. Regarding training accuracy, performance measures such as accuracy, sensitivity, and specificity are higher for RF. For the test set, higher values of accuracy and kappa for CT highlight the fact that these measures are greatly affected by low prevalence. As shown for RF, the choice of the cutoff probability value has important consequences on model accuracy and the overall proportion of locations where GDEs are found.
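
A random forest's probability estimate is simply the fraction of its trees voting for a class, and, as the abstract concludes, the cutoff applied to that probability changes the misclassification rate. A minimal sketch with hypothetical per-site probabilities and GDE labels:

```python
# Sketch: random-forest-style probabilities and the effect of the cutoff.
# All probabilities and labels below are hypothetical.

def forest_probability(tree_votes):
    """Fraction of trees voting 'groundwater dependent' (class 1)."""
    return sum(tree_votes) / len(tree_votes)

def misclassification(probs, labels, cutoff):
    """Misclassification rate when classifying at a probability cutoff."""
    preds = [1 if p >= cutoff else 0 for p in probs]
    return sum(p != y for p, y in zip(preds, labels)) / len(labels)

print(forest_probability([1, 1, 0, 1]))  # → 0.75 (3 of 4 trees vote GDE)

# Hypothetical site probabilities and true GDE labels
probs  = [0.9, 0.8, 0.45, 0.4, 0.2, 0.1]
labels = [1,   1,   1,    1,   0,   0]

print(misclassification(probs, labels, 0.5))  # default cutoff misses two GDEs
print(misclassification(probs, labels, 0.3))  # → 0.0; lower cutoff fits this data
```

Here lowering the cutoff from 0.5 to 0.3 recovers the two borderline GDE sites, which is the kind of cutoff sensitivity the study reports.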

  11. The Comparative Analysis of the Application of Several Scoring Models of Consumer Credit in China%多种个人信用评分模型在中国应用的比较研究

    Institute of Scientific and Technical Information of China (English)

    石庆焱; 靳云汇

    2004-01-01

    Based on a set of credit card samples from a Chinese commercial bank, the first systematic comparative study of various statistical credit scoring models in China was made. The comparison indicated that every model has its own strengths and weaknesses. The strengths of linear discriminant analysis, linear programming, and logistic regression are that these models are explainable and their outputs can be expressed as a linear scorecard (and so can be easily implemented), but they have a higher misclassification rate than the others. Neural network and classification tree models have higher predictive accuracy, but may be 'over-fitted', and their outputs are hard to explain.
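
The "linear scorecard" property noted above comes from the fact that logistic-regression log-odds are a linear function of the features, so they can be rescaled into additive points. A sketch with hypothetical coefficients and scaling (base score and points-to-double-odds are illustrative conventions, not values from the study):

```python
import math

# Sketch: converting logistic-regression log-odds into a points scorecard.
# Coefficients, base score, and PDO below are hypothetical.

coefs = {"intercept": -1.2, "age_over_30": 0.8, "owns_home": 0.6, "prior_default": -1.5}

def log_odds(features):
    """Linear log-odds of being a good payer for a set of active features."""
    return coefs["intercept"] + sum(coefs[f] for f in features)

def score(features, base=600, pdo=20):
    """Points score: every `pdo` points doubles the odds of being a good payer."""
    return base + pdo / math.log(2) * log_odds(features)

good = score(["age_over_30", "owns_home"])
bad = score(["prior_default"])
print(round(good), round(bad))  # → 606 522
```

Because each feature contributes a fixed number of points, the model's decision for any applicant can be read off additively, which is exactly the explainability advantage the abstract describes.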

  12. Application of classification trees for the qualitative differentiation of focal liver lesions suspicious for metastasis in gadolinium-EOB-DTPA-enhanced liver MR imaging

    Energy Technology Data Exchange (ETDEWEB)

    Schelhorn, J. [Sophien und Hufeland Klinikum, Weimar (Germany). Dept. of Radiology and Nuclear Medicine; Benndorf, M.; Dietzel, M.; Burmeister, H.P.; Kaiser, W.A.; Baltzer, P.A.T. [Jena Univ. (Germany). Inst. of Diagnostic and Interventional Radiology

    2012-09-15

    Purpose: To evaluate the diagnostic accuracy of qualitative descriptors alone and in combination for the classification of focal liver lesions (FLLs) suspicious for metastasis in gadolinium-EOB-DTPA-enhanced liver MR imaging. Materials and Methods: Consecutive patients with clinically suspected liver metastases were eligible for this retrospective investigation. 50 patients met the inclusion criteria. All underwent Gd-EOB-DTPA-enhanced liver MRI (T2w, chemical shift T1w, dynamic T1w). Primary liver malignancies or treated lesions were excluded. All investigations were read by two blinded observers (O1, O2). Both independently identified the presence of lesions and evaluated predefined qualitative lesion descriptors (signal intensities, enhancement pattern and morphology). A reference standard was determined under consideration of all clinical and follow-up information. Statistical analysis included contingency tables (chi-square, kappa statistics), descriptor combinations using classification trees (CHAID methodology), and ROC analysis. Results: In 38 patients, 120 FLLs (52 benign, 68 malignant) were present. 115 (48 benign, 67 malignant) were identified by the observers. The enhancement pattern and the relative SI on T2w and late enhanced T1w images contributed significantly to the differentiation of FLLs. The overall classification accuracy was 91.3 % (O1) and 88.7 % (O2), kappa = 0.902. Conclusion: The combination of qualitative lesion descriptors proposed in this work revealed high diagnostic accuracy and interobserver agreement in the differentiation of focal liver lesions suspicious for metastases using Gd-EOB-DTPA-enhanced liver MRI. (orig.)

  13. Computer-assisted detection of colonic polyps with CT colonography using neural networks and binary classification trees.

    Science.gov (United States)

    Jerebko, Anna K; Summers, Ronald M; Malley, James D; Franaszek, Marek; Johnson, C Daniel

    2003-01-01

    Detection of colonic polyps in CT colonography is problematic due to complexities of polyp shape and the surface of the normal colon. Published results indicate the feasibility of computer-aided detection of polyps but better classifiers are needed to improve specificity. In this paper we compare the classification results of two approaches: neural networks and recursive binary trees. As our starting point we collect surface geometry information from three-dimensional reconstruction of the colon, followed by a filter based on selected variables such as region density, Gaussian and average curvature and sphericity. The filter returns sites that are candidate polyps, based on earlier work using detection thresholds, to which the neural nets or the binary trees are applied. A data set of 39 polyps from 3 to 25 mm in size was used in our investigation. For both neural net and binary trees we use tenfold cross-validation to better estimate the true error rates. The backpropagation neural net with one hidden layer trained with Levenberg-Marquardt algorithm achieved the best results: sensitivity 90% and specificity 95% with 16 false positives per study.
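
The tenfold cross-validation used above can be sketched independently of any classifier: the samples are split into ten folds, each held out exactly once, so every candidate contributes one out-of-sample prediction and the pooled error rate is less optimistic than a single train/test split. A minimal sketch of the index bookkeeping (no neural net or tree included):

```python
# Sketch: k-fold index generation for cross-validation.
# n = 39 mirrors the polyp count above; the folds themselves are generic.

def k_fold_indices(n, k=10):
    """Yield (train, test) index lists for k roughly equal interleaved folds."""
    folds = [list(range(i, n, k)) for i in range(k)]
    for i, test in enumerate(folds):
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train, test

n = 39
covered = sorted(idx for _, test in k_fold_indices(n) for idx in test)
print(covered == list(range(n)))  # → True: every sample held out exactly once
```

In practice the classifier is retrained on each `train` list and scored on the matching `test` list; averaging those ten scores gives the cross-validated sensitivity and specificity reported in the abstract.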

  14. Identification of area-level influences on regions of high cancer incidence in Queensland, Australia: a classification tree approach

    Directory of Open Access Journals (Sweden)

    Mengersen Kerrie L

    2011-07-01

    Full Text Available Abstract Background Strategies for cancer reduction and management are targeted at both individual and area levels. Area-level strategies require careful understanding of geographic differences in cancer incidence, in particular the association with factors such as socioeconomic status, ethnicity and accessibility. This study aimed to identify the complex interplay of area-level factors associated with high area-specific incidence of Australian priority cancers using a classification and regression tree (CART approach. Methods Area-specific smoothed standardised incidence ratios were estimated for priority-area cancers across 478 statistical local areas in Queensland, Australia (1998-2007, n = 186,075. For those cancers with significant spatial variation, CART models were used to identify whether area-level accessibility, socioeconomic status and ethnicity were associated with high area-specific incidence. Results The accessibility of a person's residence had the most consistent association with the risk of cancer diagnosis across the specific cancers. Many cancers were likely to have high incidence in more urban areas, although male lung cancer and cervical cancer tended to have high incidence in more remote areas. The impact of socioeconomic status and ethnicity on these associations differed by type of cancer. Conclusions These results highlight the complex interactions between accessibility, socioeconomic status and ethnicity in determining cancer incidence risk.

  15. Stochastic gradient boosting classification trees for forest fuel types mapping through airborne laser scanning and IRS LISS-III imagery

    Science.gov (United States)

    Chirici, G.; Scotti, R.; Montaghi, A.; Barbati, A.; Cartisano, R.; Lopez, G.; Marchetti, M.; McRoberts, R. E.; Olsson, H.; Corona, P.

    2013-12-01

    This paper presents an application of Airborne Laser Scanning (ALS) data in conjunction with an IRS LISS-III image for mapping forest fuel types. For two study areas of 165 km2 and 487 km2 in Sicily (Italy), 16,761 plots of size 30-m × 30-m were distributed using a tessellation-based stratified sampling scheme. ALS metrics and spectral signatures from IRS extracted for each plot were used as predictors to classify forest fuel types observed and identified by photointerpretation and fieldwork. Following use of traditional parametric methods that produced unsatisfactory results, three non-parametric classification approaches were tested: (i) classification and regression tree (CART), (ii) the CART bagging method called Random Forests, and (iii) the CART bagging/boosting stochastic gradient boosting (SGB) approach. This contribution summarizes previous experiences using ALS data for estimating forest variables useful for fire management in general and for fuel type mapping, in particular. It summarizes characteristics of classification and regression trees, presents the pre-processing operation, the classification algorithms, and the achieved results. The results demonstrated superiority of the SGB method with overall accuracy of 84%. The most relevant ALS metric was canopy cover, defined as the percent of non-ground returns. Other relevant metrics included the spectral information from IRS and several other ALS metrics such as percentiles of the height distribution, the mean height of all returns, and the number of returns.
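
Stochastic gradient boosting, the best-performing method above, fits each new tree to the residuals of the current model on a random subsample of the data and adds it with a shrinkage factor. A toy sketch with one-split regression stumps and squared loss on hypothetical one-dimensional data (not the ALS/IRS features):

```python
import random

# Toy sketch of stochastic gradient boosting: each stage fits a one-split
# regression stump to residuals of a random subsample, then adds it with
# shrinkage. Data below are hypothetical presence/absence labels.

def fit_stump(xs, rs):
    """Best single-split stump (threshold, left mean, right mean) on residuals."""
    best = None
    for t in xs:
        left = [r for x, r in zip(xs, rs) if x <= t]
        right = [r for x, r in zip(xs, rs) if x > t]
        lm = sum(left) / len(left) if left else 0.0
        rm = sum(right) / len(right) if right else 0.0
        sse = sum((r - lm) ** 2 for r in left) + sum((r - rm) ** 2 for r in right)
        if best is None or sse < best[0]:
            best = (sse, t, lm, rm)
    return best[1:]

def sgb(xs, ys, stages=50, lr=0.1, subsample=0.7, seed=0):
    """Additive model F_m = F_{m-1} + lr * stump fitted on a random subsample."""
    rng = random.Random(seed)
    pred = [0.0] * len(xs)
    for _ in range(stages):
        idx = rng.sample(range(len(xs)), int(subsample * len(xs)))
        res = [ys[i] - pred[i] for i in idx]  # residuals = negative gradient
        t, lm, rm = fit_stump([xs[i] for i in idx], res)
        pred = [p + lr * (lm if x <= t else rm) for x, p in zip(xs, pred)]
    return pred

xs = [1, 2, 3, 4, 5, 6, 7, 8]
ys = [0, 0, 0, 0, 1, 1, 1, 1]  # e.g. fuel type absent/present along a gradient
pred = sgb(xs, ys)
print([round(p, 2) for p in pred])  # approaches 0 on the left, 1 on the right
```

The subsampling is what makes the method "stochastic"; together with shrinkage it reduces the overfitting that a single deep tree (or plain boosting) would exhibit.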

  16. Model-based geostatistics

    CERN Document Server

    Diggle, Peter J

    2007-01-01

    Model-based geostatistics refers to the application of general statistical principles of modeling and inference to geostatistical problems. This volume provides a treatment of model-based geostatistics, emphasizing statistical methods and applications. It also features analyses of datasets from a range of scientific contexts.

  17. Web Based VRML Modelling

    NARCIS (Netherlands)

    Kiss, S.

    2001-01-01

    Presents a method to connect VRML (Virtual Reality Modeling Language) and Java components in a Web page using EAI (External Authoring Interface), which makes it possible to interactively generate and edit VRML meshes. The meshes used are based on regular grids, to provide an interaction and modeling

  18. Automated morphological analysis of bone marrow cells in microscopic images for diagnosis of leukemia: nucleus-plasma separation and cell classification using a hierarchical tree model of hematopoiesis

    Science.gov (United States)

    Krappe, Sebastian; Wittenberg, Thomas; Haferlach, Torsten; Münzenmayer, Christian

    2016-03-01

    The morphological differentiation of bone marrow is fundamental for the diagnosis of leukemia. Currently, the counting and classification of the different types of bone marrow cells is done manually under the use of bright field microscopy. This is a time-consuming, subjective, tedious and error-prone process. Furthermore, repeated examinations of a slide may yield intra- and inter-observer variances. For that reason a computer assisted diagnosis system for bone marrow differentiation is pursued. In this work we focus (a) on a new method for the separation of nucleus and plasma parts and (b) on a knowledge-based hierarchical tree classifier for the differentiation of bone marrow cells in 16 different classes. Classification trees are easily interpretable and understandable and provide a classification together with an explanation. Using classification trees, expert knowledge (i.e. knowledge about similar classes and cell lines in the tree model of hematopoiesis) is integrated in the structure of the tree. The proposed segmentation method is evaluated with more than 10,000 manually segmented cells. For the evaluation of the proposed hierarchical classifier more than 140,000 automatically segmented bone marrow cells are used. Future automated solutions for the morphological analysis of bone marrow smears could potentially apply such an approach for the pre-classification of bone marrow cells and thereby shortening the examination time.

  19. Model Based Definition

    Science.gov (United States)

    Rowe, Sidney E.

    2010-01-01

    In September 2007, the Engineering Directorate at the Marshall Space Flight Center (MSFC) created the Design System Focus Team (DSFT). MSFC was responsible for the in-house design and development of the Ares 1 Upper Stage and the Engineering Directorate was preparing to deploy a new electronic Configuration Management and Data Management System with the Design Data Management System (DDMS) based upon a Commercial Off The Shelf (COTS) Product Data Management (PDM) System. The DSFT was to establish standardized CAD practices and a new data life cycle for design data. Of special interest here, the design teams were to implement Model Based Definition (MBD) in support of the Upper Stage manufacturing contract. It is noted that this MBD does use partially dimensioned drawings for auxiliary information to the model. The design data lifecycle implemented several new release states to be used prior to formal release that allowed the models to move through a flow of progressive maturity. The DSFT identified some 17 Lessons Learned as outcomes of the standards development, pathfinder deployments and initial application to the Upper Stage design completion. Some of the high value examples are reviewed.

  20. Cluster Based Text Classification Model

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah; Wiil, Uffe Kock

    2011-01-01

    We propose a cluster based classification model for suspicious email detection and other text classification tasks. The text classification tasks comprise many training examples that require a complex classification model. Using clusters for classification makes the model simpler and increases...

  1. Understanding the joint effects of family and other developmental contexts on the sense of coherence (SOC): a person-focused analysis using the Classification Tree.

    Science.gov (United States)

    García-Moya, Irene; Moreno, Carmen; Jiménez-Iglesias, Antonia

    2013-10-01

    Using a person-focused approach, the present study sought to identify meaningful constellations of contextual factors that led to predominantly high and low levels of sense of coherence (SOC). Specifically, the contributions of the quality of parent-child relationships, teacher and classmate support, models of behaviour in the peer group, and neighbourhood assets were examined in a representative sample of Spanish adolescents aged 13 to 18 that had taken part in the 2010 edition of the study Health Behaviour in School-aged Children. The quality of parent-child relationships emerged as the main predictor of SOC for the whole sample, but the remaining factors also made significant contributions, which underlines the importance of the simultaneous analysis of the main contexts in adolescents' lives. Additionally, the identified constellations usually included compensatory effects, so no factor should be considered to be completely determining. Interestingly, the role of support at school was different depending on contextual profiles.

  2. Model Construct Based Enterprise Model Architecture and Its Modeling Approach

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    In order to support enterprise integration, a model construct based enterprise model architecture and its modeling approach are studied in this paper. First, the structural makeup and internal relationships of the enterprise model architecture are discussed. Then, the concept of a reusable model construct (MC), which belongs to the control view and can help to derive other views, is proposed. The modeling approach based on model constructs consists of three steps: reference model architecture synthesis, enterprise model customization, and system design and implementation. Following the MC-based modeling approach, a case study with the background of one-kind-product machinery manufacturing enterprises is illustrated. It is shown that the proposed model construct based enterprise model architecture and modeling approach are practical and efficient.

  3. HMM-based Trust Model

    DEFF Research Database (Denmark)

    ElSalamouny, Ehab; Nielsen, Mogens; Sassone, Vladimiro

    2010-01-01

    with their dynamic behaviour. Using Hidden Markov Models (HMMs) for both modelling and approximating the behaviours of principals, we introduce the HMM-based trust model as a new approach to evaluating trust in systems exhibiting dynamic behaviour. This model avoids the fixed behaviour assumption which is considered...... the major limitation of the existing Beta trust model. We show the consistency of the HMM-based trust model and contrast it against the well-known Beta trust model with the decay principle in terms of the estimation precision....

  4. Model-based Software Engineering

    DEFF Research Database (Denmark)

    2010-01-01

    The vision of model-based software engineering is to make models the main focus of software development and to automatically generate software from these models. Part of that idea works already today. But, there are still difficulties when it comes to behaviour. Actually, there is no lack in models...

  5. Model-Based Reasoning

    Science.gov (United States)

    Ifenthaler, Dirk; Seel, Norbert M.

    2013-01-01

    In this paper, there will be a particular focus on mental models and their application to inductive reasoning within the realm of instruction. A basic assumption of this study is the observation that the construction of mental models and related reasoning is a slowly developing capability of cognitive systems that emerges effectively with proper…

  6. Principles of models based engineering

    Energy Technology Data Exchange (ETDEWEB)

    Dolin, R.M.; Hefele, J.

    1996-11-01

    This report describes a Models Based Engineering (MBE) philosophy and implementation strategy that has been developed at Los Alamos National Laboratory's Center for Advanced Engineering Technology. A major theme in this discussion is that models based engineering is an information management technology enabling the development of information driven engineering. Unlike other information management technologies, models based engineering encompasses the breadth of engineering information, from design intent through product definition to consumer application.

  7. Element-Based Computational Model

    Directory of Open Access Journals (Sweden)

    Conrad Mueller

    2012-02-01

    Full Text Available A variation on the data-flow model is proposed for developing parallel architectures. While the model is a data driven model, it has significant differences from the data-flow model. The proposed model has an evaluation cycle of processing elements (encapsulated data) that is similar to the instruction cycle of the von Neumann model. The elements contain the information required to process them. The model is inherently parallel. An emulation of the model has been implemented. The objective of this paper is to motivate support for taking the research further. Using matrix multiplication as a case study, the element/data-flow based model is compared with the instruction-based model. This is done using complexity analysis followed by empirical testing to verify this analysis. The positive results are given as motivation for the research to be taken to the next stage - that is, implementing the model using FPGAs.

  8. Classification Trees for Problems with Monotonicity Constraints

    NARCIS (Netherlands)

    R. Potharst (Rob); A.J. Feelders

    2002-01-01

    For classification problems with ordinal attributes, very often the class attribute should increase with each or some of the explaining attributes. These are called classification problems with monotonicity constraints. Classical decision tree algorithms such as CART or C4.5 generally do
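The constraint itself can be stated operationally: if an instance is weakly dominated on every explaining attribute by another, its label must not exceed the other's. A small violation counter for a labelled dataset (a sketch, not the authors' algorithm):

```python
def monotonicity_violations(X, y):
    """Count ordered pairs (i, j) where X[i] is componentwise <= X[j]
    but the class label y[i] is strictly greater than y[j] --
    i.e. pairs that violate the monotonicity constraint."""
    viol = 0
    n = len(X)
    for i in range(n):
        for j in range(n):
            if i != j and all(a <= b for a, b in zip(X[i], X[j])) and y[i] > y[j]:
                viol += 1
    return viol
```

A monotone classifier must score zero on its own predictions; a count above zero flags where a standard CART/C4.5 tree breaks the constraint.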

  9. Graph Model Based Indoor Tracking

    DEFF Research Database (Denmark)

    Jensen, Christian Søndergaard; Lu, Hua; Yang, Bin

    2009-01-01

    The tracking of the locations of moving objects in large indoor spaces is important, as it enables a range of applications related to, e.g., security and indoor navigation and guidance. This paper presents a graph model based approach to indoor tracking that offers a uniform data management infrastructure for different symbolic positioning technologies, e.g., Bluetooth and RFID. More specifically, the paper proposes a model of indoor space that comprises a base graph and mappings that represent the topology of indoor space at different levels. The resulting model can be used for one or several indoor positioning technologies. Focusing on RFID-based positioning, an RFID-specific reader deployment graph model is built from the base graph model. This model is then used in several algorithms for constructing and refining trajectories from raw RFID readings. Empirical studies with implementations...
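One use of such a graph model, rejecting raw readings that imply physically impossible transitions, can be sketched with a toy zone-adjacency map (hypothetical zones; not the paper's deployment graph):

```python
def clean_trajectory(readings, adjacency):
    """Drop readings that would require a jump between non-adjacent zones;
    a crude stand-in for the paper's trajectory-refinement algorithms."""
    if not readings:
        return []
    path = [readings[0]]
    for zone in readings[1:]:
        if zone == path[-1] or zone in adjacency[path[-1]]:
            path.append(zone)   # physically possible transition, keep it
        # otherwise: discard as a spurious reading
    return path
```

With `adjacency = {'lobby': {'hall'}, 'hall': {'lobby', 'office'}, 'office': {'hall'}}`, the reading sequence `['lobby', 'hall', 'lobby', 'office']` drops the impossible lobby-to-office jump.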

  10. Model-based consensus

    NARCIS (Netherlands)

    Boumans, Marcel

    2014-01-01

    The aim of the rational-consensus method is to produce “rational consensus”, that is, “mathematical aggregation”, by weighing the performance of each expert on the basis of his or her knowledge and ability to judge relevant uncertainties. The measurement of the performance of the experts is based on

  12. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    The paper demonstrates that a wide variety of event-based modeling approaches are based on special cases of the same general event concept, and that the general event concept can be used to unify the otherwise unrelated fields of information modeling and process modeling. A set of event-based modeling approaches are analyzed and the results are used to formulate a general event concept that can be used for unifying the seemingly unrelated event concepts. Events are characterized as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The general event concept can be used to guide systems analysis and design and to improve modeling approaches...

  13. Empirically Based, Agent-based models

    Directory of Open Access Journals (Sweden)

    Elinor Ostrom

    2006-12-01

    Full Text Available There is an increasing drive to combine agent-based models with empirical methods. An overview is provided of the various empirical methods that are used for different kinds of questions. Four categories of empirical approaches are identified in which agent-based models have been empirically tested: case studies, stylized facts, role-playing games, and laboratory experiments. We discuss how these different types of empirical studies can be combined. The various ways empirical techniques are used illustrate the main challenges of contemporary social sciences: (1) how to develop models that are generalizable and still applicable in specific cases, and (2) how to scale up the processes of interactions of a few agents to interactions among many agents.

  14. Base Flow Model Validation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The program focuses on turbulence modeling enhancements for predicting high-speed rocket base flows. A key component of the effort is the collection of high-fidelity...

  15. Constraint Based Modeling Going Multicellular.

    Science.gov (United States)

    Martins Conde, Patricia do Rosario; Sauter, Thomas; Pfau, Thomas

    2016-01-01

    Constraint based modeling has seen applications in many microorganisms. For example, there are now established methods to determine potential genetic modifications and external interventions to increase the efficiency of microbial strains in chemical production pipelines. In addition, multiple models of multicellular organisms have been created including plants and humans. While initially the focus here was on modeling individual cell types of the multicellular organism, this focus recently started to switch. Models of microbial communities, as well as multi-tissue models of higher organisms have been constructed. These models thereby can include different parts of a plant, like root, stem, or different tissue types in the same organ. Such models can elucidate details of the interplay between symbiotic organisms, as well as the concerted efforts of multiple tissues and can be applied to analyse the effects of drugs or mutations on a more systemic level. In this review we give an overview of the recent development of multi-tissue models using constraint based techniques and the methods employed when investigating these models. We further highlight advances in combining constraint based models with dynamic and regulatory information and give an overview of these types of hybrid or multi-level approaches.

  16. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2009-01-01

    The purpose of the paper is to obtain insight into and provide practical advice for event-based conceptual modeling. We analyze a set of event concepts and use the results to formulate a conceptual event model that is used to identify guidelines for creation of dynamic process models and static information models. We characterize events as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The conceptual event model is used to characterize a variety of event concepts and it is used to illustrate how events can be used to integrate dynamic modeling of processes and static modeling of information structures. The results are unique in the sense that no other general event concept has been used to unify a similar broad variety of seemingly incompatible event concepts. The general event concept can be used...

  17. Event-Based Activity Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2004-01-01

    We present and discuss a modeling approach that supports event-based modeling of information and activity in information systems. Interacting human actors and IT-actors may carry out such activity. We use events to create meaningful relations between information structures and the related activities...

  18. Modelling Gesture Based Ubiquitous Applications

    CERN Document Server

    Zacharia, Kurien; Varghese, Surekha Mariam

    2011-01-01

    A cost-effective, gesture-based modelling technique called Virtual Interactive Prototyping (VIP) is described in this paper. Prototyping is implemented by projecting a virtual model of the equipment to be prototyped. Users can interact with the virtual model like the original working equipment. Image and sound processing techniques are used to capture and track the user's interactions with the model. VIP is a flexible and interactive prototyping method that has many applications in ubiquitous computing environments. Different commercial as well as socio-economic applications, and the extension of VIP to interactive advertising, are also discussed.

  19. Agent Based Multiviews Requirements Model

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Based on current research in viewpoints-oriented requirements engineering and intelligent agents, we present the concept of a viewpoint agent and its abstract model based on a meta-language for multiviews requirements engineering. It provides a basis for consistency checking and integration of different viewpoint requirements; at the same time, this checking and integration work can be realized automatically by virtue of an intelligent agent's autonomy, proactiveness and social ability. Finally, we introduce a practical application of the model through a case study of a data flow diagram.

  20. Model-Based Security Testing

    CERN Document Server

    Schieferdecker, Ina; Schneider, Martin; 10.4204/EPTCS.80.1

    2012-01-01

    Security testing aims at validating software system requirements related to security properties like confidentiality, integrity, authentication, authorization, availability, and non-repudiation. Although security testing techniques have been available for many years, there have been few approaches that allow for specification of test cases at a higher level of abstraction, for enabling guidance on test identification and specification, as well as for automated test generation. Model-based security testing (MBST) is a relatively new field, especially dedicated to the systematic and efficient specification and documentation of security test objectives, security test cases and test suites, as well as to their automated or semi-automated generation. In particular, the combination of security modelling and test generation approaches is still a challenge in research and of high interest for industrial applications. MBST includes e.g. security functional testing, model-based fuzzing, risk- and threat-oriented testing,...

  1. Kernel model-based diagnosis

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The methods for computing the kernel consistency-based diagnoses and the kernel abductive diagnoses are only suited to the situation where part of the fault behavioral modes of the components are known. A characterization of kernel model-based diagnosis based on general causal theory is proposed, which overcomes the limitation of the above methods when all behavioral modes of each component are known. Using this method, when the observation subsets deduced logically are respectively assigned to the empty or the whole observation set, the kernel consistency-based diagnoses and the kernel abductive diagnoses can deal with all situations. The direct relationship between this diagnostic procedure and the prime implicants/implicates is proved, thus linking the theoretical result with its implementation.

  2. Model-based tomographic reconstruction

    Science.gov (United States)

    Chambers, David H; Lehman, Sean K; Goodman, Dennis M

    2012-06-26

    A model-based approach to estimating wall positions for a building is developed and tested using simulated data. It borrows two techniques from geophysical inversion problems, layer stripping and stacking, and combines them with a model-based estimation algorithm that minimizes the mean-square error between the predicted signal and the data. The technique is designed to process multiple looks from an ultra wideband radar array. The processed signal is time-gated and each section processed to detect the presence of a wall and estimate its position, thickness, and material parameters. The floor plan of a building is determined by moving the array around the outside of the building. In this paper we describe how the stacking and layer stripping algorithms are combined and show the results from a simple numerical example of three parallel walls.
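The estimation step, choosing model parameters that minimize the mean-square error between the predicted signal and the data, can be sketched as a simple parameter search (illustrative only; the paper's algorithm operates on time-gated radar returns and estimates position, thickness, and material parameters):

```python
def fit_parameter(data, predict, candidates):
    """Return the candidate parameter whose predicted signal has the
    smallest mean-square error against the measured data.
    `predict(p)` is a hypothetical forward model producing a signal."""
    def mse(p):
        pred = predict(p)
        return sum((a - b) ** 2 for a, b in zip(pred, data)) / len(data)
    return min(candidates, key=mse)
```

A layer-stripping scheme would apply such a fit section by section of the gated signal, subtracting each estimated wall's contribution before fitting the next.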

  3. Model-Based Security Testing

    Directory of Open Access Journals (Sweden)

    Ina Schieferdecker

    2012-02-01

    Full Text Available Security testing aims at validating software system requirements related to security properties like confidentiality, integrity, authentication, authorization, availability, and non-repudiation. Although security testing techniques have been available for many years, there have been few approaches that allow for specification of test cases at a higher level of abstraction, for enabling guidance on test identification and specification, as well as for automated test generation. Model-based security testing (MBST) is a relatively new field, especially dedicated to the systematic and efficient specification and documentation of security test objectives, security test cases and test suites, as well as to their automated or semi-automated generation. In particular, the combination of security modelling and test generation approaches is still a challenge in research and of high interest for industrial applications. MBST includes e.g. security functional testing, model-based fuzzing, risk- and threat-oriented testing, and the usage of security test patterns. This paper provides a survey on MBST techniques and the related models as well as samples of new methods and tools that are under development in the European ITEA2 project DIAMONDS.

  4. Crowdsourcing Based 3d Modeling

    Science.gov (United States)

    Somogyi, A.; Barsi, A.; Molnar, B.; Lovas, T.

    2016-06-01

    Web-based photo albums that support organizing and viewing the users' images are widely used. These services provide a convenient solution for storing, editing and sharing images. In many cases, the users attach geotags to the images in order to enable using them e.g. in location based applications on social networks. Our paper discusses a procedure that collects open access images from a site frequently visited by tourists. Geotagged pictures showing a sight or tourist attraction are selected and processed in photogrammetric processing software that produces the 3D model of the captured object. For the particular investigation we selected three attractions in Budapest. To assess the geometrical accuracy, we used laser scanning and DSLR as well as smart phone photography to derive reference values for verifying the spatial model obtained from the web-album images. The investigation shows how detailed and accurate models can be derived using photogrammetric processing software, simply from images of the community, without visiting the site.

  5. An Agent Based Classification Model

    CERN Document Server

    Gu, Feng; Greensmith, Julie

    2009-01-01

    The major function of this model is to access the UCI Wisconsin Breast Cancer data-set [1] and classify the data items into two categories, normal and anomalous. This kind of classification can be referred to as anomaly detection, which discriminates anomalous behaviour from normal behaviour in computer systems. One popular solution for anomaly detection is Artificial Immune Systems (AIS). AIS are adaptive systems inspired by theoretical immunology and observed immune functions, principles and models which are applied to problem solving. The Dendritic Cell Algorithm (DCA) [2] is an AIS algorithm developed specifically for anomaly detection. It has been successfully applied to intrusion detection in computer security. It is believed that agent-based modelling is an ideal approach for implementing AIS, as intelligent agents could be the perfect representations of immune entities in AIS. This model evaluates the feasibility of re-implementing the DCA in an agent-based simulation environment...

  6. Multi-robot system learning based on evolutionary classification

    Directory of Open Access Journals (Sweden)

    Manko Sergey

    2016-01-01

    Full Text Available This paper presents a novel machine learning method for agents of a multi-robot system. The learning process is based on knowledge discovery through continual analysis of robot sensory information. We demonstrate that classification trees and evolutionary forests may be a basis for creation of autonomous robots capable both of learning and knowledge exchange with other agents in multi-robot system. The results of experimental studies confirm the effectiveness of the proposed approach.

  7. Research on BOM based composable modeling method

    NARCIS (Netherlands)

    Zhang, M.; He, Q.; Gong, J.

    2013-01-01

    Composable modeling methods have been a research hotspot in the area of Modeling and Simulation for a long time. In order to increase the reuse and interoperability of BOM based models, this paper puts forward a composable modeling method based on BOM and studies the basic theory of composable modeling m

  8. Modelling distribution of marine benthos from hydroacoustics and underwater video

    Science.gov (United States)

    Holmes, K. W.; Van Niel, K. P.; Radford, B.; Kendrick, G. A.; Grove, S. L.

    2008-08-01

    Broad-scale mapping of marine benthos is required for marine resource management and conservation. This study combines textural derivatives based on bathymetry from multibeam hydroacoustics with underwater video observations to model and map sessile biota between 10- and 60-m water depth over 35 km² in Point Addis Marine National Park (MNP), Vic., Australia. Classification tree models and maps were developed for macroalgae (all types, mixed red algae, Ecklonia, and rhodoliths) and sessile invertebrates (all types, sponges, and ascidians). Model accuracy was tested on 25% of the video observation dataset reserved from modelling. Models fit well for most macroalgae categories (correct classification rates of 67-84%), but are not as good for sessile invertebrate classes (correct classification rates of 57-62%). The poor fit of the sessile invertebrate models may be the combined result of grouping organisms with different environmental requirements and the effect of false absences recorded during video interpretation due to poor image quality. Probability maps, binary single-class maps, and multi-class maps supply spatially explicit, detailed information on the distribution of sessile benthic biota within the MNP and provide information at a landscape-scale for ecological investigations and marine management.
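The validation protocol here, reserving 25% of the video observations and reporting correct classification rates on them, amounts to the following sketch (hypothetical helper names):

```python
import random

def holdout_split(records, test_fraction=0.25, seed=0):
    """Reserve a fraction of observations for validation, as in the study
    (25% of the video dataset withheld from modelling). Seeded for repeatability."""
    rng = random.Random(seed)
    shuffled = records[:]
    rng.shuffle(shuffled)
    n_test = int(len(shuffled) * test_fraction)
    return shuffled[n_test:], shuffled[:n_test]   # (train, test)

def correct_classification_rate(y_true, y_pred):
    """Fraction of held-out observations the model classifies correctly."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
```

The reported 67-84% (macroalgae) and 57-62% (invertebrates) figures are this rate computed on the withheld 25%.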

  9. Intelligent model-based OPC

    Science.gov (United States)

    Huang, W. C.; Lai, C. M.; Luo, B.; Tsai, C. K.; Chih, M. H.; Lai, C. W.; Kuo, C. C.; Liu, R. G.; Lin, H. T.

    2006-03-01

    Optical proximity correction is the technique of pre-distorting mask layouts so that the printed patterns are as close to the desired shapes as possible. For model-based optical proximity correction, a lithographic model is needed to predict the edge position (contour) of patterns on the wafer after lithographic processing. Generally, segmentation of edges is performed prior to the correction. Pattern edges are dissected into several small segments with corresponding target points. During the correction, the edges are moved back and forth from the initial drawn position, assisted by the lithographic model, to finally settle on the proper positions. When the correction converges, the intensity predicted by the model at every target point hits the model-specific threshold value. Several iterations are required to achieve convergence, and the computation time increases with the number of iterations required. An artificial neural network is an information-processing paradigm inspired by biological nervous systems, such as the way the brain processes information. It is composed of a large number of highly interconnected processing elements (neurons) working in unison to solve specific problems. A neural network can be a powerful data-modeling tool that is able to capture and represent complex input/output relationships. The network can accurately predict the behavior of a system via the learning procedure. A radial basis function network, a variant of the artificial neural network, is an efficient function approximator. In this paper, a radial basis function network was used to build a mapping from the segment characteristics to the edge shift from the drawn position. This network can provide a good initial guess for each segment before OPC is carried out. The good initial guess reduces the required iterations. Consequently, cycle time can be shortened effectively. The optimization of the radial basis function network for this system was performed using a genetic algorithm.
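The mapping the paper builds, from segment characteristics to an initial edge-shift guess, can be sketched with a one-dimensional Gaussian RBF network fitted by exact interpolation (all names and data hypothetical; the actual network is multi-dimensional and tuned by a genetic algorithm):

```python
import math

def solve(A, b):
    """Tiny Gauss-Jordan elimination with partial pivoting, adequate for
    the small linear systems of this sketch."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

def rbf_fit(centers, values, sigma=1.0):
    """Solve Phi w = values so the Gaussian RBF network reproduces the
    training pairs (segment feature -> observed edge shift) exactly."""
    Phi = [[math.exp(-((c1 - c2) / sigma) ** 2) for c2 in centers] for c1 in centers]
    return solve(Phi, values)

def rbf_predict(x, centers, weights, sigma=1.0):
    """Network output: weighted sum of Gaussian basis functions."""
    return sum(w * math.exp(-((x - c) / sigma) ** 2) for c, w in zip(centers, weights))
```

Because the weights solve the interpolation system exactly, the network reproduces the training values at the centers; between centers it provides the smooth initial guess that cuts OPC iterations.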

  10. Optimal pricing decision model based on activity-based costing

    Institute of Scientific and Technical Information of China (English)

    王福胜; 常庆芳

    2003-01-01

    In order to find out the applicability of the optimal pricing decision model based on the conventional cost behavior model after activity-based costing has given a strong shock to the conventional cost behavior model and its assumptions, detailed analyses have been made using the activity-based cost behavior and cost-volume-profit analysis model. It is concluded from these analyses that the theory behind the construction of the optimal pricing decision model is still tenable under activity-based costing, but the conventional optimal pricing decision model must be modified as appropriate to the activity-based costing based cost behavior model and cost-volume-profit analysis model, and an optimal pricing decision model is really a product pricing decision model constructed by following the economic principle of maximizing profit.
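For reference, the economic principle of maximizing profit has a simple closed form under a textbook linear-demand assumption (an illustration only; the paper's contribution concerns how activity-based costing changes the cost side of this optimization):

```python
def optimal_price(a, b, unit_cost):
    """Profit-maximizing price for linear demand q = a - b*p and constant
    unit cost c: maximize (p - c) * (a - b*p), whose first-order condition
    gives p* = (a/b + c) / 2."""
    return (a / b + unit_cost) / 2
```

For example, with a = 100, b = 2 and unit cost 10, the optimum price is 30; under activity-based costing the unit cost term would itself become a function of the cost drivers.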

  11. Differential geometry based multiscale models.

    Science.gov (United States)

    Wei, Guo-Wei

    2010-08-01

    Large chemical and biological systems such as fuel cells, ion channels, molecular motors, and viruses are of great importance to the scientific community and public health. Typically, these complex systems in conjunction with their aquatic environment pose a fabulous challenge to theoretical description, simulation, and prediction. In this work, we propose a differential geometry based multiscale paradigm to model complex macromolecular systems, and to put macroscopic and microscopic descriptions on an equal footing. In our approach, the differential geometry theory of surfaces and geometric measure theory are employed as a natural means to couple the macroscopic continuum mechanical description of the aquatic environment with the microscopic discrete atomistic description of the macromolecule. Multiscale free energy functionals, or multiscale action functionals are constructed as a unified framework to derive the governing equations for the dynamics of different scales and different descriptions. Two types of aqueous macromolecular complexes, ones that are near equilibrium and others that are far from equilibrium, are considered in our formulations. We show that generalized Navier-Stokes equations for the fluid dynamics, generalized Poisson equations or generalized Poisson-Boltzmann equations for electrostatic interactions, and Newton's equation for the molecular dynamics can be derived by the least action principle. These equations are coupled through the continuum-discrete interface whose dynamics is governed by potential driven geometric flows. Comparison is given to classical descriptions of the fluid and electrostatic interactions without geometric flow based micro-macro interfaces. The detailed balance of forces is emphasized in the present work. We further extend the proposed multiscale paradigm to micro-macro analysis of electrohydrodynamics, electrophoresis, fuel cells, and ion channels. We derive generalized Poisson-Nernst-Planck equations that are

  12. Binary classification of dyslipidemia from the waist-to-hip ratio and body mass index: a comparison of linear, logistic, and CART models

    Directory of Open Access Journals (Sweden)

    Paccaud Fred

    2004-04-01

    Full Text Available Abstract Background We sought to improve upon previously published statistical modeling strategies for binary classification of dyslipidemia for general population screening purposes based on the waist-to-hip circumference ratio and body mass index anthropometric measurements. Methods Study subjects were participants in WHO-MONICA population-based surveys conducted in two Swiss regions. Outcome variables were based on the total serum cholesterol to high density lipoprotein cholesterol ratio. The other potential predictor variables were gender, age, current cigarette smoking, and hypertension. The models investigated were: (i) linear regression; (ii) logistic classification; (iii) regression trees; (iv) classification trees ((iii) and (iv) are collectively known as "CART"). Binary classification performance of the region-specific models was externally validated by classifying the subjects from the other region. Results Waist-to-hip circumference ratio and body mass index remained modest predictors of dyslipidemia. Correct classification rates for all models were 60–80%, with marked gender differences. Gender-specific models provided only small gains in classification. The external validations provided assurance about the stability of the models. Conclusions There were no striking differences between either the algebraic (i, ii) vs. non-algebraic (iii, iv), or the regression (i, iii) vs. classification (ii, iv) modeling approaches. Anticipated advantages of the CART vs. simple additive linear and logistic models were less than expected in this particular application with a relatively small set of predictor variables. CART models may be more useful when considering main effects and interactions between larger sets of predictor variables.
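To show how a CART-style model differs from the algebraic ones, the root of a classification tree is the single threshold that minimizes weighted Gini impurity; a toy illustration on one predictor such as BMI (not the study's fitted tree):

```python
def gini(labels):
    """Gini impurity of a set of binary labels: 2p(1-p)."""
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n
    return 2 * p * (1 - p)

def best_split(x, y):
    """Find the threshold on a single predictor (e.g. BMI) minimizing the
    weighted Gini impurity of the two child nodes -- the root split of a
    CART classification tree."""
    best = (None, float('inf'))
    for t in sorted(set(x)):
        left = [yi for xi, yi in zip(x, y) if xi <= t]
        right = [yi for xi, yi in zip(x, y) if xi > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
        if score < best[1]:
            best = (t, score)
    return best
```

On the toy data in the test below, the best root split is the threshold 22, which separates the two classes perfectly (weighted impurity 0); a full CART tree recurses this split on each child node.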

  13. Classification Tree Model for Identifying the Permeability Class of Drugs in the Biopharmaceutics Classification System

    Institute of Scientific and Technical Information of China (English)

    曾垂宇; 王晓艳

    2013-01-01

    A classification tree model based on molecular properties was constructed to identify the permeability class of compounds under the Biopharmaceutics Classification System (BCS). Caco-2 permeability data collected from different literature sources formed the training set used to build the classification tree model, and the model was then applied to classify an external test set, the US Food and Drug Administration's BCS standard compounds. The resulting rule for identifying the BCS permeability class of a compound is: if the number of hydrogen-bond donor atoms is < 4, the sum of the positive van der Waals polar surface area is < 40.71, and the solvation energy is > -33.89, the compound is of high permeability; otherwise it is of low permeability. This classification structure-property relationship model achieved correct classification rates of 91.43% on the 105-compound training set and 82.35% on the 17-compound external test set. The model was successfully applied to identify the molecular properties distinguishing high- from low-permeability BCS standard compounds, providing a simple and effective classification method for drug permeability identification.
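The decision rule reported in the abstract can be written down directly (thresholds as reported; units as in the original study):

```python
def bcs_permeability(h_bond_donors, pos_vdw_psa, solvation_energy):
    """Classification-tree rule reported by the study: a compound is highly
    permeable iff it has fewer than 4 H-bond donor atoms, a positive van der
    Waals polar surface area sum below 40.71, and solvation energy above -33.89."""
    if h_bond_donors < 4 and pos_vdw_psa < 40.71 and solvation_energy > -33.89:
        return "high"
    return "low"
```

This three-node tree is the entire model; its simplicity is what makes the reported 91.43%/82.35% classification rates notable.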

  14. Guide to APA-Based Models

    Science.gov (United States)

    Robins, Robert E.; Delisi, Donald P.

    2008-01-01

    In Robins and Delisi (2008), a linear decay model, a new IGE model by Sarpkaya (2006), and a series of APA-Based models were scored using data from three airports. This report is a guide to the APA-based models.

  15. Developing Empirically Based Models of Practice.

    Science.gov (United States)

    Blythe, Betty J.; Briar, Scott

    1985-01-01

    Over the last decade emphasis has shifted from theoretically based models of practice to empirically based models whose elements are derived from clinical research. These models are defined and a developing model of practice through the use of single-case methodology is examined. Potential impediments to this new role are identified. (Author/BL)

  16. Proceedings Tenth Workshop on Model Based Testing

    OpenAIRE

    Pakulin, Nikolay; Petrenko, Alexander K.; Schlingloff, Bernd-Holger

    2015-01-01

    The workshop is devoted to model-based testing of both software and hardware. Model-based testing uses models describing the required behavior of the system under consideration to guide such efforts as test selection and test results evaluation. Testing validates the real system behavior against the models and checks that the implementation conforms to them, but it is also capable of finding errors in the models themselves. The intent of this workshop is to bring together researchers and users of model...

  17. Product Modelling for Model-Based Maintenance

    NARCIS (Netherlands)

    Houten, van F.J.A.M.; Tomiyama, T.; Salomons, O.W.

    1998-01-01

    The paper describes the fundamental concepts of maintenance and the role that information technology can play in the support of maintenance activities. Function-Behaviour-State modelling is used to describe faults and deterioration of mechanisms in terms of user perception and measurable quantities.

  18. Observation-Based Modeling for Model-Based Testing

    NARCIS (Netherlands)

    Kanstrén, T.; Piel, E.; Gross, H.-G.

    2009-01-01

    One of the single most important reasons that modeling and model-based testing are not yet common practice in industry is the perceived difficulty of making the models up to the level of detail and quality required for their automated processing. Models unleash their full potential only through suffi

  19. Return of feature-based cost modeling

    Science.gov (United States)

    Creese, Robert C.; Patrawala, Taher B.

    1998-10-01

    Feature based cost modeling is thought of as a relatively new approach to cost modeling, but it saw considerable development in the 1950's. Considerable work was published in the 1950's by Boeing on costs for various casting processes--sand casting, die casting, investment casting and permanent mold casting--as a function of a single casting feature, casting volume. Additional approaches to feature based cost modeling have been made, and this work is a review of previous works and a proposed integrated model for feature based cost modeling.

  20. Extensions in model-based system analysis

    OpenAIRE

    Graham, Matthew R.

    2007-01-01

    Model-based system analysis techniques provide a means for determining desired system performance prior to actual implementation. In addition to specifying desired performance, model-based analysis techniques require mathematical descriptions that characterize relevant behavior of the system. The developments of this dissertation give extended formulations for control-relevant model estimation as well as model-based analysis conditions for performance requirements specified as frequency do...

  1. Trace-Based Code Generation for Model-Based Testing

    NARCIS (Netherlands)

    Kanstrén, T.; Piel, E.; Gross, H.-G.

    2009-01-01

    Paper Submitted for review at the Eighth International Conference on Generative Programming and Component Engineering. Model-based testing can be a powerful means to generate test cases for the system under test. However, creating a useful model for model-based testing requires expertise in the (fo

  2. Hierarchical Geometric Constraint Model for Parametric Feature Based Modeling

    Institute of Scientific and Technical Information of China (English)

    高曙明; 彭群生

    1997-01-01

    A new geometric constraint model is described, which is hierarchical and suitable for parametric feature based modeling. In this model, different levels of geometric information are represented to support various stages of a design process. An efficient approach to parametric feature based modeling is also presented, adopting the high level geometric constraint model. The low level geometric model, such as B-reps, can be derived automatically from the high level geometric constraint model, enabling designers to perform their task of detailed design.

  3. The Culture Based Model: Constructing a Model of Culture

    Science.gov (United States)

    Young, Patricia A.

    2008-01-01

    Recent trends reveal that models of culture aid in mapping the design and analysis of information and communication technologies. Therefore, models of culture are powerful tools to guide the building of instructional products and services. This research examines the construction of the culture based model (CBM), a model of culture that evolved…

  4. Data-driven modeling of hydroclimatic trends and soil moisture: Multi-scale data integration and decision support

    Science.gov (United States)

    Coopersmith, Evan Joseph

    regime curve data and facilitate the development of cluster-specific algorithms. Given the desire to enable intelligent decision-making at any location, this classification system is developed in a manner that will allow for classification anywhere in the U.S., even in an ungauged basin. Daily time series data from 428 catchments in the MOPEX database are analyzed to produce an empirical classification tree, partitioning the United States into regions of hydroclimatic similarity. In constructing a classification tree based upon 55 years of data, it is important to recognize the non-stationary nature of climate data. The shifts in climatic regimes will cause certain locations to shift their ultimate position within the classification tree, requiring decision-makers to alter land usage, farming practices, and equipment needs, and algorithms to adjust accordingly. This work adapts the classification model to address the issue of regime shifts over larger temporal scales and suggests how land usage and farming protocol may vary from hydroclimatic shifts in decades to come. Finally, the generalizability of the hydroclimatic classification system is tested with a physically-based soil moisture model calibrated at several locations throughout the continental United States. The soil moisture model is calibrated at a given site and then applied with the same parameters at other sites within and outside the same hydroclimatic class. The model's performance deteriorates minimally if the calibration and validation locations are within the same hydroclimatic class, but deteriorates significantly if the calibration and validation sites are located in different hydroclimatic classes. These soil moisture estimates at the field scale are then further refined by the introduction of LiDAR elevation data, distinguishing faster-drying peaks and ridges from slower-drying valleys. The inclusion of LiDAR enabled multiple locations within the same field to be predicted accurately despite non
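
    The threshold-based partitioning such an empirical classification tree performs can be sketched in a few lines. The indices and split values below are hypothetical illustrations, not the thresholds learned from the MOPEX data:

    ```python
    def classify_catchment(aridity_index, precip_seasonality):
        """Toy hydroclimatic classification tree (hypothetical thresholds).

        Splits catchments into coarse regions of hydroclimatic similarity,
        in the spirit of a tree learned from daily catchment time series.
        """
        if aridity_index > 1.0:            # water-limited regime
            if precip_seasonality > 0.5:
                return "arid, seasonal"
            return "arid, uniform"
        # energy-limited regime
        if precip_seasonality > 0.5:
            return "humid, seasonal"
        return "humid, uniform"
    ```

    A regime shift that pushes a catchment's aridity index across the split moves it to a different leaf, which is exactly the non-stationarity the abstract discusses.
    
    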

  5. Probabilistic Model-Based Safety Analysis

    CERN Document Server

    Güdemann, Matthias; 10.4204/EPTCS.28.8

    2010-01-01

    Model-based safety analysis approaches aim at finding critical failure combinations by analysis of models of the whole system (i.e. software, hardware, failure modes and environment). The advantage of these methods compared to traditional approaches is that the analysis of the whole system gives more precise results. Only a few model-based approaches have been applied to answer quantitative questions in safety analysis, often limited to analysis of specific failure propagation models, limited types of failure modes or without system dynamics and behavior, as direct quantitative analysis uses large amounts of computing resources. New achievements in the domain of (probabilistic) model-checking now allow for overcoming this problem. This paper shows how functional models based on synchronous parallel semantics, which can be used for system design, implementation and qualitative safety analysis, can be directly re-used for (model-based) quantitative safety analysis. Accurate modeling of different types of proba...

  6. Firm Based Trade Models and Turkish Economy

    Directory of Open Access Journals (Sweden)

    Nilüfer ARGIN

    2015-12-01

    Full Text Available Among all international trade models, only firm-based trade models explain firms' actions and behavior in world trade. Firm-based trade models focus on the trade behavior of the individual firms that actually conduct intra-industry trade, and they can genuinely explain the globalization process. These approaches also cover multinational corporations, supply chains and outsourcing. Our paper aims to explain and analyze Turkish exports in the context of firm-based trade models. We use UNCTAD data on exports by SITC Rev. 3 categorization to explain total exports and 255 products, and we calculate the intensive and extensive margins of Turkish firms.

  7. A genome-wide tree- and forest-based association analysis of comorbidity of alcoholism and smoking

    OpenAIRE

    Ye, Yuanqing; Zhong, Xiaoyun; Zhang, Heping

    2005-01-01

    Genetic mechanisms underlying alcoholism are complex. Understanding the etiology of alcohol dependence and its comorbid conditions such as smoking is important because of the significant health concerns. In this report, we describe a method based on classification trees and deterministic forests for association studies to perform a genome-wide joint association analysis of alcoholism and smoking. This approach is used to analyze the single-nucleotide polymorphism data from the Collaborative S...

  8. Dynamic modelling and analysis of biochemical networks: mechanism-based models and model-based experiments.

    Science.gov (United States)

    van Riel, Natal A W

    2006-12-01

    Systems biology applies quantitative, mechanistic modelling to study genetic networks, signal transduction pathways and metabolic networks. Mathematical models of biochemical networks can look very different. An important reason is that the purpose and application of a model are essential for the selection of the best mathematical framework. Fundamental aspects of selecting an appropriate modelling framework and a strategy for model building are discussed. Concepts and methods from system and control theory provide a sound basis for the further development of improved and dedicated computational tools for systems biology. Identification of the network components and rate constants that are most critical to the output behaviour of the system is one of the major problems raised in systems biology. Current approaches and methods of parameter sensitivity analysis and parameter estimation are reviewed. It is shown how these methods can be applied in the design of model-based experiments which iteratively yield models that are decreasingly wrong and increasingly gain predictive power.
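
    The parameter sensitivity analysis reviewed here can be illustrated with a one-line finite-difference estimate of the normalized (logarithmic) sensitivity. The Michaelis-Menten rate law below is used purely as a stand-in for a biochemical network component:

    ```python
    def mm_rate(s=2.0, vmax=10.0, km=1.0):
        """Michaelis-Menten rate law, a stand-in network component."""
        return vmax * s / (km + s)

    def log_sensitivity(f, params, name, eps=1e-6):
        """Normalized sensitivity d ln f / d ln p via a forward difference."""
        base = f(**params)
        bumped = dict(params)
        bumped[name] *= 1.0 + eps
        return (f(**bumped) - base) / (base * eps)
    ```

    For this rate law the sensitivity to vmax is exactly 1 while the sensitivity to km is -km/(km+s); ranking parameters by such values is one way to identify the rate constants most critical to the output behaviour.
    
    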

  9. Traceability in Model-Based Testing

    Directory of Open Access Journals (Sweden)

    Mathew George

    2012-11-01

    Full Text Available The growing complexities of software and the demand for shorter time to market are two important challenges that face today’s IT industry. These challenges demand the increase of both productivity and quality of software. Model-based testing is a promising technique for meeting these challenges. Traceability modeling is a key issue and challenge in model-based testing. Relationships between the different models will help to navigate from one model to another, and trace back to the respective requirements and the design model when the test fails. In this paper, we present an approach for bridging the gaps between the different models in model-based testing. We propose relation definition markup language (RDML for defining the relationships between models.

  10. Lévy-based growth models

    DEFF Research Database (Denmark)

    Jónsdóttir, Kristjana Ýr; Schmiegel, Jürgen; Jensen, Eva Bjørn Vedel

    2008-01-01

    In the present paper, we give a condensed review, for the nonspecialist reader, of a new modelling framework for spatio-temporal processes, based on Lévy theory. We show the potential of the approach in stochastic geometry and spatial statistics by studying Lévy-based growth modelling of planar...... objects. The growth models considered are spatio-temporal stochastic processes on the circle. As a by product, flexible new models for space–time covariance functions on the circle are provided. An application of the Lévy-based growth models to tumour growth is discussed....

  11. Distributed Prognostics Based on Structural Model Decomposition

    Data.gov (United States)

    National Aeronautics and Space Administration — Within systems health management, prognostics focuses on predicting the remaining useful life of a system. In the model-based prognostics paradigm, physics-based...

  12. Base Flow Model Validation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation is the systematic "building-block" validation of CFD/turbulence models employing a GUI driven CFD code (RPFM) and existing as well as new data sets to...

  13. Integration of Simulink Models with Component-based Software Models

    DEFF Research Database (Denmark)

    Marian, Nicolae

    2008-01-01

    Model based development aims to facilitate the development of embedded control systems by emphasizing the separation of the design level from the implementation level. Model based design involves the use of multiple models that represent different views of a system, having different semantics...... of abstract system descriptions. Usually, in mechatronics systems, design proceeds by iterating model construction, model analysis, and model transformation. Constructing a MATLAB/Simulink model, a plant and controller behavior is simulated using graphical blocks to represent mathematical and logical...... constructs and process flow, then software code is generated. A Simulink model is a representation of the design or implementation of a physical system that satisfies a set of requirements. A software component-based system aims to organize system architecture and behaviour as a means of computation...

  14. PCA-based lung motion model

    CERN Document Server

    Li, Ruijiang; Jia, Xun; Zhao, Tianyu; Lamb, James; Yang, Deshan; Low, Daniel A; Jiang, Steve B

    2010-01-01

    Organ motion induced by respiration may cause clinically significant targeting errors and greatly degrade the effectiveness of conformal radiotherapy. It is therefore crucial to be able to model respiratory motion accurately. A recently proposed lung motion model based on principal component analysis (PCA) has been shown to be promising on a few patients. However, there is still a need to understand the underlying reason why it works. In this paper, we present a much deeper and detailed analysis of the PCA-based lung motion model. We provide the theoretical justification of the effectiveness of PCA in modeling lung motion. We also prove that under certain conditions, the PCA motion model is equivalent to the 5D motion model, which is based on the physiology and anatomy of the lung. The modeling power of the PCA model was tested on clinical data and the average 3D error was found to be below 1 mm.
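
    The core of a PCA motion model can be sketched with plain NumPy: stack displacement fields from different respiratory phases, subtract the mean, and keep the leading singular vectors. The data below are synthetic, not patient data:

    ```python
    import numpy as np

    def fit_pca_motion(X, n_modes=2):
        """X: (n_phases, n_dof) matrix of displacement fields.
        Returns the mean field and the top principal motion modes."""
        mean = X.mean(axis=0)
        _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
        return mean, Vt[:n_modes]

    def project_reconstruct(x, mean, modes):
        """Project a displacement field onto the modes and reconstruct it."""
        coeffs = (x - mean) @ modes.T
        return mean + coeffs @ modes
    ```

    When the true motion really is low-dimensional, as the paper argues, a field from an unseen phase is reconstructed almost exactly from a handful of modes.
    
    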

  15. A Multiple Model Approach to Modeling Based on LPF Algorithm

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Input-output data fitting methods are often used for modeling nonlinear systems of unknown structure. Based on model-on-demand tactics, a multiple model approach to modeling nonlinear systems is presented. The basic idea is to find, from vast historical system input-output data sets, the data sets matching the current working point, and then to develop a local model using the Local Polynomial Fitting (LPF) algorithm. As the working point changes, multiple local models are built, which together realize exact modeling of the global system. Compared with other methods, the simulation results show good performance: the estimation is simple, effective and reliable.
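
    The model-on-demand idea is straightforward to sketch: at each query point, select the nearest historical samples and fit a local polynomial to them. This is a minimal one-dimensional version; the paper's setting is more general:

    ```python
    import numpy as np

    def lpf_predict(x_hist, y_hist, x_query, k=12, deg=2):
        """Model-on-demand prediction via Local Polynomial Fitting:
        fit a degree-`deg` polynomial to the k samples nearest x_query."""
        nearest = np.argsort(np.abs(x_hist - x_query))[:k]
        coeffs = np.polyfit(x_hist[nearest], y_hist[nearest], deg)
        return np.polyval(coeffs, x_query)
    ```

    As the working point moves, a fresh local model is fitted from a different neighborhood of the historical data, which is how multiple local models jointly cover the global nonlinear system.
    
    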

  16. Behavior and Design Intent Based Product Modeling

    Directory of Open Access Journals (Sweden)

    László Horváth

    2004-11-01

    Full Text Available A knowledge based modeling of mechanical products is presented for industrial CAD/CAM systems. An active model is proposed that comprises knowledge from modeling procedures, generic part models and engineers. Present day models of mechanical systems do not contain data about the background of human decisions. This situation motivated the authors' investigations into exchanging design intent information between engineers. Their concept was to extend product models to be capable of describing design intent information. Several human-computer and human-human communication issues were considered. The complex communication problem has been divided into four sub-problems, namely communication of the human intent source with the computer system, representation of human intent, exchange of intent data between modeling procedures, and communication of the represented intent with humans. The paper discusses the scenario of intelligent modeling based engineering. Then key concepts for the application of computational intelligence in computer model based engineering systems are detailed, including knowledge driven models as well as areas of their application. Next, behavior based models with intelligent content involving specifications and knowledge for the design processes are emphasized, and an active part modeling is proposed and possibilities for its application are outlined. Finally, design intent supported intelligent modeling is discussed.

  17. Model Validation in Ontology Based Transformations

    Directory of Open Access Journals (Sweden)

    Jesús M. Almendros-Jiménez

    2012-10-01

    Full Text Available Model Driven Engineering (MDE is an emerging approach of software engineering. MDE emphasizes the construction of models from which the implementation should be derived by applying model transformations. The Ontology Definition Meta-model (ODM has been proposed as a profile for UML models of the Web Ontology Language (OWL. In this context, transformations of UML models can be mapped into ODM/OWL transformations. On the other hand, model validation is a crucial task in model transformation. Meta-modeling gives a syntactic structure to source and target models. However, semantic requirements have to be imposed on source and target models. A given transformation will be sound when source and target models fulfill the syntactic and semantic requirements. In this paper, we present an approach for model validation in ODM based transformations. Adopting a logic programming based transformational approach, we show how it is possible to transform and validate models. Properties to be validated range from structural and semantic requirements of models (pre and post conditions to properties of the transformation (invariants. The approach has been applied to a well-known example of model transformation: the Entity-Relationship (ER to Relational Model (RM transformation.

  18. A Role-Based Fuzzy Assignment Model

    Institute of Scientific and Technical Information of China (English)

    ZUO Bao-he; FENG Shan

    2002-01-01

    It's very important to dynamically assign the tasks to corresponding actors in workflow management system, especially in complex applications. This improves the flexibility of workflow systems.In this paper, a role-based workflow model with fuzzy optimized intelligent assignment is proposed and applied in the investment management system. A groupware-based software model is also proposed.

  19. An acoustical model based monitoring network

    NARCIS (Netherlands)

    Wessels, P.W.; Basten, T.G.H.; Eerden, F.J.M. van der

    2010-01-01

    In this paper the approach for an acoustical model based monitoring network is demonstrated. This network is capable of reconstructing a noise map, based on the combination of measured sound levels and an acoustic model of the area. By pre-calculating the sound attenuation within the network the noi

  20. Comparing repetition-based melody segmentation models

    NARCIS (Netherlands)

    Rodríguez López, M.E.; de Haas, Bas; Volk, Anja

    2014-01-01

    This paper reports on a comparative study of computational melody segmentation models based on repetition detection. For the comparison we implemented five repetition-based segmentation models, and subsequently evaluated their capacity to automatically find melodic phrase boundaries in a corpus of 2

  1. Phylogenetic invariants for group-based models

    CERN Document Server

    Donten-Bury, Maria

    2010-01-01

    In this paper we investigate properties of algebraic varieties representing group-based phylogenetic models. We give the (first) example of a nonnormal general group-based model for an abelian group. Following Kaie Kubjas we also determine some invariants of group-based models showing that the associated varieties do not have to be deformation equivalent. We propose a method of generating many phylogenetic invariants and in particular we show that our approach gives the whole ideal of the claw tree for the 3-Kimura model under the assumption of the conjecture of Sturmfels and Sullivant. This, combined with the results of Sturmfels and Sullivant, would make it possible to determine all phylogenetic invariants for any tree for the 3-Kimura model and possibly for other group-based models.

  2. Model-based internal wave processing

    Energy Technology Data Exchange (ETDEWEB)

    Candy, J.V.; Chambers, D.H.

    1995-06-09

    A model-based approach is proposed to solve the oceanic internal wave signal processing problem that is based on state-space representations of the normal-mode vertical velocity and plane wave horizontal velocity propagation models. It is shown that these representations can be utilized to spatially propagate the modal (depth) vertical velocity functions given the basic parameters (wave numbers, Brunt-Vaisala frequency profile etc.) developed from the solution of the associated boundary value problem as well as the horizontal velocity components. Based on this framework, investigations are made of model-based solutions to the signal enhancement problem for internal waves.

  3. Key-Based Data Model

    Science.gov (United States)

    1994-05-16

    progressive repetition. It is used, principally, to train small units to perform tasks requiring a high degree of teamwork, such as fire and maneuver actions in...an administrative structure that has a mission. An established need based on a valid deficiency in an administrative structure with a mission. Person A

  4. Agent-based modeling of sustainable behaviors

    CERN Document Server

    Sánchez-Maroño, Noelia; Fontenla-Romero, Oscar; Polhill, J; Craig, Tony; Bajo, Javier; Corchado, Juan

    2017-01-01

    Using the O.D.D. (Overview, Design concepts, Detail) protocol, this title explores the role of agent-based modeling in predicting the feasibility of various approaches to sustainability. The chapters incorporated in this volume consist of real case studies to illustrate the utility of agent-based modeling and complexity theory in discovering a path to more efficient and sustainable lifestyles. The topics covered within include: households' attitudes toward recycling, designing decision trees for representing sustainable behaviors, negotiation-based parking allocation, auction-based traffic signal control, and others. This selection of papers will be of interest to social scientists who wish to learn more about agent-based modeling as well as experts in the field of agent-based modeling.

  5. IP Network Management Model Based on NGOSS

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jin-yu; LI Hong-hui; LIU Feng

    2004-01-01

    This paper addresses a management model for IP networks based on the Next Generation Operation Support System (NGOSS). It bases network management on all the operational actions of the ISP and provides QoS to user services by managing end-to-end Service Level Agreements (SLAs) along the whole path. Based on web and coordination technology, this paper gives an implementation architecture of this model.

  6. Agent-based modeling and network dynamics

    CERN Document Server

    Namatame, Akira

    2016-01-01

    The book integrates agent-based modeling and network science. It is divided into three parts, namely, foundations, primary dynamics on and of social networks, and applications. The book begins with the network origin of agent-based models, known as cellular automata, and introduces a number of classic models, such as Schelling’s segregation model and Axelrod’s spatial game. The essence of the foundation part is the network-based agent-based models in which agents follow network-based decision rules. Under the influence of the substantial progress in network science in the late 1990s, these models have been extended from using lattices into using small-world networks, scale-free networks, etc. The book also shows that the modern network science mainly driven by game-theorists and sociophysicists has inspired agent-based social scientists to develop alternative formation algorithms, known as agent-based social networks. The book reviews a number of pioneering and representative models in this family. Upon the gi...
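
    Schelling's segregation model, cited above as a classic, fits in a few lines: agents on a ring are satisfied when enough neighbors share their type, and unsatisfied agents relocate. This is a minimal one-dimensional sketch, not the lattice or network variants the book covers:

    ```python
    import random

    def satisfied(grid, i, threshold=0.5):
        """An agent is satisfied when the fraction of like-typed ring
        neighbors is at least `threshold`."""
        left, right = grid[i - 1], grid[(i + 1) % len(grid)]
        same = (left == grid[i]) + (right == grid[i])
        return same / 2 >= threshold

    def step(grid, rng, threshold=0.5):
        """Swap one random unsatisfied agent with a random other position."""
        unhappy = [i for i in range(len(grid)) if not satisfied(grid, i, threshold)]
        if not unhappy:
            return grid
        i = rng.choice(unhappy)
        j = rng.randrange(len(grid))
        grid = list(grid)
        grid[i], grid[j] = grid[j], grid[i]
        return grid
    ```

    Repeated steps tend to drive the ring toward clustered patterns even under mild individual preferences, the kind of emergent macro effect agent-based models are built to expose.
    
    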

  7. Model-Based Motion Tracking of Infants

    DEFF Research Database (Denmark)

    Olsen, Mikkel Damgaard; Herskind, Anna; Nielsen, Jens Bo;

    2014-01-01

    Even though motion tracking is a widely used technique to analyze and measure human movements, only a few studies focus on motion tracking of infants. In recent years, a number of studies have emerged focusing on analyzing the motion pattern of infants, using computer vision. Most of these studies...... are based on 2D images, but few are based on 3D information. In this paper, we present a model-based approach for tracking infants in 3D. The study extends a novel study on graph-based motion tracking of infants and we show that the extension improves the tracking results. A 3D model is constructed...... that resembles the body surface of an infant, where the model is based on simple geometric shapes and a hierarchical skeleton model....

  8. Simulation-based Manufacturing System Modeling

    Institute of Scientific and Technical Information of China (English)

    卫东; 金烨; 范秀敏; 严隽琪

    2003-01-01

    In recent years, computer simulation appears to be very advantageous technique for researching the resource-constrained manufacturing system. This paper presents an object-oriented simulation modeling method, which combines the merits of traditional methods such as IDEF0 and Petri Net. In this paper, a four-layer-one-angel hierarchical modeling framework based on OOP is defined. And the modeling description of these layers is expounded, such as: hybrid production control modeling and human resource dispatch modeling. To validate the modeling method, a case study of an auto-product line in a motor manufacturing company has been carried out.

  9. Energy based prediction models for building acoustics

    DEFF Research Database (Denmark)

    Brunskog, Jonas

    2012-01-01

    In order to reach robust and simplified yet accurate prediction models, energy based principles are commonly used in many fields of acoustics, especially in building acoustics. This includes simple energy flow models, the framework of statistical energy analysis (SEA) as well as more elaborated...... principles as, e.g., wave intensity analysis (WIA). The European standards for building acoustic predictions, the EN 12354 series, are based on energy flow and SEA principles. In the present paper, different energy based prediction models are discussed and critically reviewed. Special attention is placed...

  10. Probabilistic Model-Based Background Subtraction

    DEFF Research Database (Denmark)

    Krüger, Volker; Andersen, Jakob; Prehn, Thomas

    2005-01-01

    manner. Bayesian propagation over time is used for proper model selection and tracking during model-based background subtraction. Bayes propagation is attractive in our application as it allows to deal with uncertainties during tracking. We have tested our approach on suitable outdoor video data....... is the correlation between pixels. In this paper we introduce a model-based background subtraction approach which facilitates prior knowledge of pixel correlations for clearer and better results. Model knowledge is being learned from good training video data, the data is stored for fast access in a hierarchical...
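
    For contrast with the correlation-aware, model-based approach described here, the per-pixel baseline it improves on can be sketched as a running-Gaussian background subtractor. This is a generic illustration, not the authors' hierarchical model:

    ```python
    import numpy as np

    class RunningGaussianBG:
        """Per-pixel background model: flag a pixel as foreground when it
        deviates from its running mean by more than k standard deviations."""

        def __init__(self, first_frame, alpha=0.05, k=2.5, init_var=25.0):
            self.mean = first_frame.astype(float)
            self.var = np.full(first_frame.shape, init_var)
            self.alpha, self.k = alpha, k

        def apply(self, frame):
            frame = frame.astype(float)
            diff = np.abs(frame - self.mean)
            fg = diff > self.k * np.sqrt(self.var)
            # adapt the model only where the pixel still looks like background
            upd = ~fg
            self.mean[upd] += self.alpha * (frame - self.mean)[upd]
            self.var[upd] += self.alpha * (diff**2 - self.var)[upd]
            return fg
    ```

    Because each pixel is treated independently, this baseline ignores exactly the pixel correlations that the model-based approach exploits for cleaner masks.
    
    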

  11. Software Testing Method Based on Model Comparison

    Institute of Scientific and Technical Information of China (English)

    XIE Xiao-dong; LU Yan-sheng; MAO Cheng-yin

    2008-01-01

    A model comparison based software testing method (MCST) is proposed. In this method, the requirements and programs of the software under test are transformed into the same form and described by the same model description language (MDL). Then, the requirements are transformed into a specification model and the programs into an implementation model. Thus, the elements and structures of the two models are compared, and the differences between them are obtained. Based on these differences, a test suite is generated. Different MDLs can be chosen for the software under test. The usage of two classical MDLs in MCST, the equivalence classes model and the extended finite state machine (EFSM) model, is described with example applications. The results show that the test suites generated by MCST are more efficient and smaller than those of some other testing methods, such as the path-coverage testing method, the object state diagram testing method, etc.

  12. Distributed Prognostics based on Structural Model Decomposition

    Science.gov (United States)

    Daigle, Matthew J.; Bregon, Anibal; Roychoudhury, I.

    2014-01-01

    Within systems health management, prognostics focuses on predicting the remaining useful life of a system. In the model-based prognostics paradigm, physics-based models are constructed that describe the operation of a system and how it fails. Such approaches consist of an estimation phase, in which the health state of the system is first identified, and a prediction phase, in which the health state is projected forward in time to determine the end of life. Centralized solutions to these problems are often computationally expensive, do not scale well as the size of the system grows, and introduce a single point of failure. In this paper, we propose a novel distributed model-based prognostics scheme that formally describes how to decompose both the estimation and prediction problems into independent local subproblems whose solutions may be easily composed into a global solution. The decomposition of the prognostics problem is achieved through structural decomposition of the underlying models. The decomposition algorithm creates from the global system model a set of local submodels suitable for prognostics. Independent local estimation and prediction problems are formed based on these local submodels, resulting in a scalable distributed prognostics approach that allows the local subproblems to be solved in parallel, thus offering increases in computational efficiency. Using a centrifugal pump as a case study, we perform a number of simulation-based experiments to demonstrate the distributed approach, compare the performance with a centralized approach, and establish its scalability. Index Terms: model-based prognostics, distributed prognostics, structural model decomposition.
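
    The compose-from-local-solutions idea can be illustrated with a toy: each submodel predicts its own remaining useful life (here from a hypothetical linear degradation law), and the global end of life is the earliest local one:

    ```python
    def local_rul(health, rate, failure_threshold=0.0):
        """Toy local prediction: linear degradation of one health variable."""
        return (health - failure_threshold) / rate

    def global_rul(submodels):
        """Compose independent local RUL predictions into the system RUL.
        submodels maps a component name to (current health, degradation rate)."""
        ruls = {name: local_rul(h, r) for name, (h, r) in submodels.items()}
        return min(ruls.values()), ruls
    ```

    In the paper's scheme the submodels come from structural decomposition of a physics-based model and each local estimation/prediction problem can run in parallel; the toy above only shows the final composition step.
    
    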

  13. Sketch-based Interfaces and Modeling

    CERN Document Server

    Jorge, Joaquim

    2011-01-01

    The field of sketch-based interfaces and modeling (SBIM) is concerned with developing methods and techniques to enable users to interact with a computer through sketching - a simple, yet highly expressive medium. SBIM blends concepts from computer graphics, human-computer interaction, artificial intelligence, and machine learning. Recent improvements in hardware, coupled with new machine learning techniques for more accurate recognition, and more robust depth inferencing techniques for sketch-based modeling, have resulted in an explosion of both sketch-based interfaces and pen-based computing

  14. PDF-based heterogeneous multiscale filtration model.

    Science.gov (United States)

    Gong, Jian; Rutland, Christopher J

    2015-04-21

    Motivated by modeling of gasoline particulate filters (GPFs), a probability density function (PDF) based heterogeneous multiscale filtration (HMF) model is developed to calculate filtration efficiency of clean particulate filters. A new methodology based on statistical theory and classic filtration theory is developed in the HMF model. Based on the analysis of experimental porosimetry data, a pore size probability density function is introduced to represent heterogeneity and multiscale characteristics of the porous wall. The filtration efficiency of a filter can be calculated as the sum of the contributions of individual collectors. The resulting HMF model overcomes the limitations of classic mean filtration models which rely on tuning of the mean collector size. Sensitivity analysis shows that the HMF model recovers the classical mean model when the pore size variance is very small. The HMF model is validated by fundamental filtration experimental data from different scales of filter samples. The model shows a good agreement with experimental data at various operating conditions. The effects of the microstructure of filters on filtration efficiency as well as the most penetrating particle size are correctly predicted by the model.
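
    The HMF model's key move, replacing a single mean collector with a pore-size distribution, can be sketched as a PDF-weighted sum of per-pore efficiencies. The unit-collector efficiency below is a made-up placeholder, not the paper's filtration theory:

    ```python
    import numpy as np

    def overall_efficiency(pore_diam, pdf_weights, unit_eff):
        """Filtration efficiency as a pore-size-PDF-weighted sum of the
        contributions of individual collectors."""
        w = np.asarray(pdf_weights, dtype=float)
        w = w / w.sum()                      # normalize the discretized PDF
        return float(np.sum(w * unit_eff(np.asarray(pore_diam, dtype=float))))

    # hypothetical unit-collector efficiency: smaller pores capture better
    unit_eff = lambda d: 1.0 - np.exp(-10.0 / d)
    ```

    When the pore-size variance shrinks to zero, all mass sits at one pore size and the weighted sum collapses to the classic mean-collector model, mirroring the sensitivity result quoted in the abstract.
    
    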

  15. Workflow-Based Dynamic Enterprise Modeling

    Institute of Scientific and Technical Information of China (English)

    黄双喜; 范玉顺; 罗海滨; 林慧萍

    2002-01-01

    Traditional systems for enterprise modeling and business process control are often static and cannot adapt to the changing environment. This paper presents a workflow-based method to dynamically execute the enterprise model. This method gives an explicit representation of the business process logic and the relationships between the elements involved in the process. An execution-oriented integrated enterprise modeling system is proposed in combination with other enterprise views. The enterprise model can be established and executed dynamically in the actual environment due to the dynamic properties of the workflow model.

  16. Bayesian Network Based XP Process Modelling

    Directory of Open Access Journals (Sweden)

    Mohamed Abouelela

    2010-07-01

    Full Text Available A Bayesian Network based mathematical model has been used for modelling the Extreme Programming software development process. The model is capable of predicting the expected finish time and the expected defect rate for each XP release. Therefore, it can be used to determine the success/failure of any XP project. The model takes into account the effect of three XP practices, namely: Pair Programming, Test Driven Development and Onsite Customer practices. The model’s predictions were validated against two case studies. Results show the precision of our model, especially in predicting the project finish time.

  17. Model Based Testing for Agent Systems

    Science.gov (United States)

    Zhang, Zhiyong; Thangarajah, John; Padgham, Lin

    Although agent technology is gaining world wide popularity, a hindrance to its uptake is the lack of proper testing mechanisms for agent based systems. While many traditional software testing methods can be generalized to agent systems, there are many aspects that are different and which require an understanding of the underlying agent paradigm. In this paper we present certain aspects of a testing framework that we have developed for agent based systems. The testing framework is a model based approach using the design models of the Prometheus agent development methodology. In this paper we focus on model based unit testing and identify the appropriate units, present mechanisms for generating suitable test cases and for determining the order in which the units are to be tested, and present a brief overview of the unit testing process and an example. Although we use the design artefacts from Prometheus, the approach is suitable for any plan and event based agent system.

  18. Grey-theory based intrusion detection model

    Institute of Scientific and Technical Information of China (English)

    Qin Boping; Zhou Xianwei; Yang Jun; Song Cunyi

    2006-01-01

    To solve the problem that current intrusion detection models need large-scale data for real-time model formulation, an intrusion detection system model based on grey theory (GTIDS) is presented. Grey theory has the merits of requiring less original data, placing fewer limitations on the distribution pattern, and using a simpler modeling algorithm. With these merits GTIDS constructs its model from a partial time sequence for rapid detection of intrusive acts in a secure system. In this detection model the rates of false drops and false retrievals are effectively reduced through modeling twice and repeated detection on the target data. Furthermore, the GTIDS framework and the specific modeling algorithm are presented. The effectiveness of GTIDS is demonstrated through emulated experiments comparing it with Snort and the next-generation intrusion detection expert system (NIDES) of SRI International.
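
    The workhorse of grey modeling is the GM(1,1) model, which needs only a handful of data points: accumulate the series, fit the two grey parameters by least squares, and forecast. The sketch below is the standard textbook construction, not necessarily the paper's exact detection algorithm:

    ```python
    import numpy as np

    def gm11_forecast(x0, steps=1):
        """GM(1,1) grey model: fit on a short series x0 and forecast ahead.

        Works with very few samples and no distributional assumptions --
        the properties a grey-theory IDS relies on.
        """
        x0 = np.asarray(x0, dtype=float)
        x1 = np.cumsum(x0)                    # accumulated generating series
        z = 0.5 * (x1[1:] + x1[:-1])          # background values
        B = np.column_stack([-z, np.ones_like(z)])
        a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
        x1_hat = lambda k: (x0[0] - b / a) * np.exp(-a * k) + b / a
        n = len(x0)
        return np.array([x1_hat(k) - x1_hat(k - 1) for k in range(n, n + steps)])
    ```

    In an IDS setting, a large gap between the grey forecast of normal traffic and the observed value is what flags a potentially intrusive act.
    
    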

  19. Econophysics of agent-based models

    CERN Document Server

    Aoyama, Hideaki; Chakrabarti, Bikas; Chakraborti, Anirban; Ghosh, Asim

    2014-01-01

    The primary goal of this book is to present the research findings and conclusions of physicists, economists, mathematicians and financial engineers working in the field of "Econophysics" who have undertaken agent-based modelling, comparison with empirical studies and related investigations. Most standard economic models assume the existence of the representative agent, who is “perfectly rational” and applies the utility maximization principle when taking action. One reason for this is the desire to keep models mathematically tractable: no tools are available to economists for solving non-linear models of heterogeneous adaptive agents without explicit optimization. In contrast, multi-agent models, which originated from statistical physics considerations, allow us to go beyond the prototype theories of traditional economics involving the representative agent. This book is based on the Econophys-Kolkata VII Workshop, at which many such modelling efforts were presented. In the book, leading researchers in the...

  20. Approximation Algorithms for Model-Based Diagnosis

    NARCIS (Netherlands)

    Feldman, A.B.

    2010-01-01

    Model-based diagnosis is an area of abductive inference that uses a system model, together with observations about system behavior, to isolate sets of faulty components (diagnoses) that explain the observed behavior, according to some minimality criterion. This thesis presents greedy approximation a

  1. Rule-based Modelling and Tunable Resolution

    Directory of Open Access Journals (Sweden)

    Russ Harmer

    2009-11-01

    Full Text Available We investigate the use of an extension of rule-based modelling for cellular signalling to create a structured space of model variants. This enables the incremental development of rule sets that start from simple mechanisms and which, by a gradual increase in agent and rule resolution, evolve into more detailed descriptions.

  2. Rule-based Modelling and Tunable Resolution

    CERN Document Server

    Harmer, Russ

    2009-01-01

    We investigate the use of an extension of rule-based modelling for cellular signalling to create a structured space of model variants. This enables the incremental development of rule sets that start from simple mechanisms and which, by a gradual increase in agent and rule resolution, evolve into more detailed descriptions.

  3. Agent-based modelling of cholera diffusion

    NARCIS (Netherlands)

    Augustijn, Ellen-Wien; Doldersum, Tom; Useya, Juliana; Augustijn, Denie

    2016-01-01

    This paper introduces a spatially explicit agent-based simulation model for micro-scale cholera diffusion. The model simulates both an environmental reservoir of naturally occurring V. cholerae bacteria and hyperinfectious V. cholerae. Objective of the research is to test if runoff from open refuse

  4. Agent-based modelling of cholera diffusion

    NARCIS (Netherlands)

    Augustijn, Ellen-Wien; Doldersum, Tom; Useya, Juliana; Augustijn, Denie

    2016-01-01

    This paper introduces a spatially explicit agent-based simulation model for micro-scale cholera diffusion. The model simulates both an environmental reservoir of naturally occurring V. cholerae bacteria and hyperinfectious V. cholerae. Objective of the research is to test if runoff from open refuse d

  5. Integration of Simulink Models with Component-based Software Models

    DEFF Research Database (Denmark)

    Marian, Nicolae; Top, Søren

    2008-01-01

    constructs and process flow, then software code is generated. A Simulink model is a representation of the design or implementation of a physical system that satisfies a set of requirements. A software component-based system aims to organize system architecture and behaviour as a means of computation...... constraints. COMDES (Component-based Design of Software for Distributed Embedded Systems) is such a component-based system framework developed by the software engineering group of Mads Clausen Institute for Product Innovation (MCI), University of Southern Denmark. Once specified, the software model has...... between control engineering and software engineering....

  6. Mineral resources estimation based on block modeling

    Science.gov (United States)

    Bargawa, Waterman Sulistyana; Amri, Nur Ali

    2016-02-01

    The estimation in this paper uses three kinds of block models: nearest neighbor polygon, inverse distance squared and ordinary kriging. These techniques are weighting schemes based on the principle that a block's content is a linear combination of the grade data from the samples around the block being estimated. The case study is gold-silver resource modeling in the Pongkor area, where the mineralization is believed to take the shape of quartz veins formed by a hydrothermal process of the epithermal type. Resource modeling includes data entry, statistical and variographic analysis, topographic and geological modeling, block model construction, estimation parameters, model presentation and tabulation of mineral resources. The skewed distribution is handled here with a robust semivariogram. The mineral resource classification generated in this model is based on an analysis of the kriging standard deviation and the number of samples used in the estimation of each block. The results are used to evaluate the performance of the OK and IDS estimators. Based on visual and statistical analysis, it is concluded that the OK model gives estimates closer to the data used for modeling.
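Of the three estimators, inverse distance squared is the simplest to illustrate; the sketch below uses hypothetical sample coordinates and grades, not data from the Pongkor case study.

```python
def idw_estimate(block_xy, samples, power=2):
    """Estimate a block grade as the inverse-distance-weighted mean of
    surrounding sample grades.  `samples` is a list of (x, y, grade);
    power=2 gives the inverse distance squared (IDS) scheme."""
    num = den = 0.0
    for x, y, grade in samples:
        d2 = (x - block_xy[0]) ** 2 + (y - block_xy[1]) ** 2
        if d2 == 0.0:                    # block centre coincides with a sample
            return grade
        w = 1.0 / d2 ** (power / 2.0)    # weight = 1 / distance**power
        num += w * grade
        den += w
    return num / den
```

Two samples at equal distance from the block centre contribute equally, so the estimate is their plain average; closer samples dominate as the distance ratio grows.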

  7. Gradient based filtering of digital elevation models

    DEFF Research Database (Denmark)

    Knudsen, Thomas; Andersen, Rune Carbuhn

    We present a filtering method for digital terrain models (DTMs). The method is based on mathematical morphological filtering within gradient (slope) defined domains. The intention with the filtering procedure is to improve the cartographic quality of height contours generated from a DTM based...

  8. Information modelling and knowledge bases XXV

    CERN Document Server

    Tokuda, T; Jaakkola, H; Yoshida, N

    2014-01-01

    Because of our ever increasing use of and reliance on technology and information systems, information modelling and knowledge bases continue to be important topics in those academic communities concerned with data handling and computer science. As the information itself becomes more complex, so do the levels of abstraction and the databases themselves. This book is part of the series Information Modelling and Knowledge Bases, which concentrates on a variety of themes in the important domains of conceptual modeling, design and specification of information systems, multimedia information modelin

  9. Multiagent-Based Model For ESCM

    OpenAIRE

    Delia MARINCAS

    2011-01-01

    Web based applications for Supply Chain Management (SCM) are now a necessity for every company in order to meet the increasing customer demands, to face the global competition and to make profit. Multiagent-based approach is appropriate for eSCM because it shows many of the characteristics a SCM system should have. For this reason, we have proposed a multiagent-based eSCM model which configures a virtual SC, automates the SC activities: selling, purchasing, manufacturing, planning, inventory,...

  10. Flower solid modeling based on sketches

    Institute of Scientific and Technical Information of China (English)

    Zhan DING; Shu-chang XU; Xiu-zi YE; Yin ZHANG; San-yuan ZHANG

    2008-01-01

    In this paper we propose a method to model flowers as solid shapes. Based on the method of Ijiri et al. (2005), we separate the individual flower and inflorescence modeling procedures into structure and geometry modeling. We incorporate interactive editing gestures that allow the user to edit structure parameters freely on the structure diagram. Furthermore, we use free-hand sketching techniques to allow users to create and edit 3D geometrical elements freely and easily. The final step is to automatically merge all independent 3D geometrical elements into a single waterproof mesh. Our experiments show that this solid modeling approach is promising. Using our approach, novice users can create vivid flower models easily and freely. The generated flower model is waterproof. It can have applications in visualization, animation, gaming, and in toys and decorations if printed out on 3D rapid prototyping devices.

  11. New global ICT-based business models

    DEFF Research Database (Denmark)

    The New Global Business Model (NEWGIBM) book describes the background, theory references, case studies, results and learning imparted by the NEWGIBM project, which is supported by ICT, to a research group during the period from 2005-2011. The book is a result of the efforts and the collaborative...... Contents: The Theoretical History and Background of Business Models; The Theoretical Background of Business Model Innovation; ICT - a Key Enabler in Innovating New Global Business Models; The NEWGIBM Research Methodology; The Analytical Model for NEWGIBM; Industry Service - Technology Centre; The KMD Case; Smart House Case; The Nano Solar Case; The Master Cat Case; The Pitfalls Of The Blue Ocean Strategy - Implications Of "The Six Paths Framework"; Network-Based Innovation - Combining Exploration and Exploitation?; Innovating New Business Models in Inter-firm Collaboration; NEW Global Business Models - What Did......

  12. Spatial disaggregation of complex soil map units at regional scale based on soil-landscape relationships

    Science.gov (United States)

    Vincent, Sébastien; Lemercier, Blandine; Berthier, Lionel; Walter, Christian

    2015-04-01

    Accurate soil information over large extents is essential for managing agronomic and environmental issues. Where it exists, information on soil is often sparse or available at a coarser resolution than required. Typically, the spatial distribution of soil at regional scale is represented as a set of polygons defining soil map units (SMUs), each describing several soil types that are not spatially delineated, together with a semantic database describing these objects. Delineating soil types within SMUs, i.e. spatially disaggregating the SMUs, improves the accuracy of soil information derived from legacy data. The aim of this study was to predict soil types by spatial disaggregation of SMUs through a decision-tree approach, considering expert knowledge on soil-landscape relationships embedded in soil databases. The DSMART (Disaggregation and Harmonisation of Soil Map Units Through Resampled Classification Trees) algorithm developed by Odgers et al. (2014) was used. It requires soil information, environmental covariates and calibration samples to build and then extrapolate decision trees. To assign a soil type to a particular spatial position, a weighted random allocation approach is applied: each soil type in the SMU is weighted according to its assumed proportion of occurrence in the SMU. Thus, soil-landscape relationships are not considered in the current version of DSMART. Expert rules on soil distribution considering the relief, parent material and wetland location were proposed to drive the allocation of soil types to sampled positions, in order to integrate the soil-landscape relationships. Semantic information about the spatial organization of soil types within SMUs and exhaustive landscape descriptors were used. In the eastern part of Brittany (NW France), 171 soil types were described; their relative areas in the SMUs were estimated, and the geomorphological and geological contexts were recorded. The model predicted 144 soil types. An external validation was performed by comparing predicted
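The weighted random allocation step described above can be sketched as follows; the soil-type names and proportions are hypothetical, and this is not the DSMART source code.

```python
import random

def allocate_soil_type(smu_composition, rng=random):
    """Draw one soil type for a sampled position, weighting each soil
    type by its assumed proportion of occurrence within the SMU."""
    types = list(smu_composition)
    weights = [smu_composition[t] for t in types]
    return rng.choices(types, weights=weights, k=1)[0]

# hypothetical SMU: three soil types with assumed area proportions
smu = {"Luvisol": 0.5, "Cambisol": 0.3, "Gleysol": 0.2}
```

Repeated over many resampled positions, the drawn labels approach the assumed proportions, which is exactly why the allocation ignores soil-landscape relationships unless expert rules steer it.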

  13. Deformable surface modeling based on dual subdivision

    Institute of Scientific and Technical Information of China (English)

    WANG Huawei; SUN Hanqiu; QIN Kaihuai

    2005-01-01

    Based on dual Doo-Sabin subdivision and the corresponding parameterization, a modeling technique of deformable surfaces is presented in this paper. In the proposed model, all the dynamic parameters are computed in a unified way for both non-defective and defective subdivision matrices, and central differences are used to discretize the Lagrangian dynamics equation instead of backward differences. Moreover, a local scheme is developed to solve the dynamics equation approximately, thus the order of the linear equation is reduced greatly. Therefore, the proposed model is more efficient and faster than the existing dynamic models. It can be used for deformable surface design, interactive surface editing, medical imaging and simulation.

  14. Rule-based transformations for geometric modelling

    Directory of Open Access Journals (Sweden)

    Thomas Bellet

    2011-02-01

    Full Text Available The context of this paper is the use of formal methods for topology-based geometric modelling. Topology-based geometric modelling deals with objects of various dimensions and shapes. Usually, objects are defined by a graph-based topological data structure and by an embedding that associates each topological element (vertex, edge, face, etc.) with relevant data such as its geometric shape (position, curve, surface, etc.) or application-dedicated data (e.g. molecule concentration level in a biological context). We propose to define topology-based geometric objects as labelled graphs. The arc labelling defines the topological structure of the object, whose topological consistency is then ensured by labelling constraints. Nodes have as many labels as there are different data kinds in the embedding. Labelling constraints then ensure that the embedding is consistent with the topological structure. Thus, topology-based geometric objects constitute a particular subclass of a category of labelled graphs in which nodes have multiple labels.

  15. Rule-based transformations for geometric modelling

    CERN Document Server

    Bellet, Thomas; Gall, Pascale Le; 10.4204/EPTCS.48.5

    2011-01-01

    The context of this paper is the use of formal methods for topology-based geometric modelling. Topology-based geometric modelling deals with objects of various dimensions and shapes. Usually, objects are defined by a graph-based topological data structure and by an embedding that associates each topological element (vertex, edge, face, etc.) with relevant data as their geometric shape (position, curve, surface, etc.) or application dedicated data (e.g. molecule concentration level in a biological context). We propose to define topology-based geometric objects as labelled graphs. The arc labelling defines the topological structure of the object whose topological consistency is then ensured by labelling constraints. Nodes have as many labels as there are different data kinds in the embedding. Labelling constraints ensure then that the embedding is consistent with the topological structure. Thus, topology-based geometric objects constitute a particular subclass of a category of labelled graphs in which nodes hav...

  16. MLP based models to predict PM10, O3 concentrations, in Sines industrial area

    Science.gov (United States)

    Durao, R.; Pereira, M. J.

    2012-04-01

    Sines is an important Portuguese industrial area located on the southwest coast of Portugal, with important protected natural areas nearby. The main economic activities are related to this industrial area, the deep-water port, and the petrochemical and thermo-electric industries. Nevertheless, tourism is also an important economic activity, especially in summer, with potential to grow. The aim of this study is to develop prediction models of pollutant concentration categories (e.g. low concentration and high concentration) in order to provide early warnings to the competent authorities responsible for air quality management. Knowing in advance that high pollutant concentrations will occur allows the implementation of mitigation actions and the release of precautionary alerts to the population. The regional air quality monitoring network consists of three monitoring stations where a set of pollutant concentrations is registered on a continuous basis. From this set, tropospheric ozone (O3) and particulate matter (PM10) stand out, due to the high concentrations occurring in the region and their adverse effects on human health. Moreover, the major industrial plants of the region monitor the SO2, NO2 and particle flows emitted at the principal chimneys (point sources), also on a continuous basis. Artificial neural networks (ANNs) were therefore the methodology applied to predict next-day pollutant concentrations; owing to their structure, ANNs have the ability to capture the non-linear relationships between predictor variables. Hence the first step of this study was to apply multivariate exploratory techniques to select the best predictor variables; the classification tree methodology (CART) was revealed to be the most appropriate in this case. Results show that atmospheric pollutant concentrations depend mainly on industrial emissions and on a complex combination of meteorological factors and the time of year. In the second step, the Multi

  17. Applying Model Checking to Generate Model-Based Integration Tests from Choreography Models

    Science.gov (United States)

    Wieczorek, Sebastian; Kozyura, Vitaly; Roth, Andreas; Leuschel, Michael; Bendisposto, Jens; Plagge, Daniel; Schieferdecker, Ina

    Choreography models describe the communication protocols between services. Testing of service choreographies is an important task for the quality assurance of service-based systems as used e.g. in the context of service-oriented architectures (SOA). The formal modeling of service choreographies enables a model-based integration testing (MBIT) approach. We present MBIT methods for our service choreography modeling approach called Message Choreography Models (MCM). For the model-based testing of service choreographies, MCMs are translated into Event-B models and used as input for our test generator which uses the model checker ProB.

  18. Development of Ensemble Model Based Water Demand Forecasting Model

    Science.gov (United States)

    Kwon, Hyun-Han; So, Byung-Jin; Kim, Seong-Hyeon; Kim, Byung-Seop

    2014-05-01

    In recent years, the Smart Water Grid (SWG) concept has emerged globally over the last decade and has gained significant recognition in South Korea. In particular, there has been growing interest in water demand forecasting and optimal pump operation, which has led to various studies on energy saving and improvement of water supply reliability. Existing water demand forecasting models fall into two groups in terms of how they model and predict behaviour in time series. One considers embedded patterns such as seasonality, periodicity and trends; the other is an autoregressive model using short-memory Markovian processes (Emmanuel et al., 2012). The main disadvantage of these models is that, because the system is nonlinear, their predictability of water demand at sub-daily scales is limited. In this regard, this study aims to develop a nonlinear ensemble model for hourly water demand forecasting which allows us to estimate uncertainties across different model classes. The proposed model consists of two parts. One is a multi-model scheme based on a combination of independent prediction models. The other is a cross-validation scheme, the Bagging approach introduced by Breiman (1996), used to derive weighting factors for the individual models. The individual forecasting models used in this study are a linear regression model, polynomial regression, multivariate adaptive regression splines (MARS) and support vector machines (SVM). The concepts are demonstrated through application to observations from water plants at several locations in South Korea. Keywords: water demand, non-linear model, ensemble forecasting model, uncertainty. Acknowledgements: This subject is supported by the Korea Ministry of Environment as part of the "Projects for Developing Eco-Innovation Technologies (GT-11-G-02-001-6)".
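A Bagging-style derivation of combination weights can be sketched as follows; the inverse-error weighting rule and the model names are assumptions for illustration, not the authors' exact scheme.

```python
def ensemble_weights(cv_errors):
    """Turn cross-validation errors of the individual models into
    normalised combination weights (lower error -> larger weight)."""
    inv = {name: 1.0 / err for name, err in cv_errors.items()}
    total = sum(inv.values())
    return {name: w / total for name, w in inv.items()}

def ensemble_forecast(predictions, weights):
    """Combine the individual model forecasts with the derived weights."""
    return sum(weights[name] * yhat for name, yhat in predictions.items())
```

A model that halves the cross-validation error of its peers receives twice their weight, so the ensemble leans toward whichever model class has proven most reliable for that hour of demand.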

  19. MEGen: A Physiologically Based Pharmacokinetic Model Generator

    Directory of Open Access Journals (Sweden)

    George D Loizou

    2011-11-01

    Full Text Available Physiologically based pharmacokinetic models are being used in an increasing number of different areas. These not only include the human safety assessment of pharmaceuticals, pesticides, biocides and environmental chemicals but also for food animal, wild mammal and avian risk assessment. The value of PBPK models is that they are tools for estimating tissue dosimetry by integrating in vitro and in vivo mechanistic, pharmacokinetic and toxicological information through their explicit mathematical description of important anatomical, physiological and biochemical determinants of chemical uptake, disposition and elimination. However, PBPK models are perceived as complex, data hungry, resource intensive and time consuming. In addition, model validation and verification are hindered by the relative complexity of the equations. To begin to address these issues a freely available web application for the rapid construction and documentation of bespoke PBPK models is under development. Here we present an overview of the current capabilities of MEGen, a model equation generator and parameter database and discuss future developments.

  20. Activity-based resource capability modeling

    Institute of Scientific and Technical Information of China (English)

    CHENG Shao-wu; XU Xiao-fei; WANG Gang; SUN Xue-dong

    2008-01-01

    To analyse and optimize an enterprise process in a wide scope, an activity-based method of modeling resource capabilities is presented. It models resource capabilities by means of the same structure as an activity; that is, resource capabilities are defined by input objects, actions and output objects. A set of activity-based resource capability modeling rules and matching rules between an activity and a resource are introduced. This method can be used to describe not only the capability of manufacturing tools, but also the capabilities of persons, applications, etc. It unifies the modeling of the capabilities of all kinds of resources in an enterprise and supports the optimization of the resource allocation of a process.
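Matching an activity against a resource capability that shares the same input/action/output structure can be sketched as follows; the objects and actions are hypothetical examples, not taken from the paper.

```python
def matches(activity, capability):
    """An activity matches a resource capability when the capability
    accepts the activity's input objects, can perform its actions,
    and can yield its output objects (all modelled as sets)."""
    a_in, a_act, a_out = activity
    c_in, c_act, c_out = capability
    return a_in <= c_in and a_act <= c_act and a_out <= c_out

# hypothetical example: a drilling activity matched against a machining centre
drill = ({"steel_blank"}, {"drill"}, {"drilled_part"})
machining_centre = ({"steel_blank", "alloy_blank"},
                    {"drill", "mill"},
                    {"drilled_part", "milled_part"})
```

Because capabilities of tools, persons and applications all share this triple structure, the same subset test allocates any kind of resource to a process step.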

  1. Spatial interactions in agent-based modeling

    CERN Document Server

    Ausloos, Marcel; Merlone, Ugo

    2014-01-01

    Agent Based Modeling (ABM) has become a widespread approach to model complex interactions. In this chapter after briefly summarizing some features of ABM the different approaches in modeling spatial interactions are discussed. It is stressed that agents can interact either indirectly through a shared environment and/or directly with each other. In such an approach, higher-order variables such as commodity prices, population dynamics or even institutions, are not exogenously specified but instead are seen as the results of interactions. It is highlighted in the chapter that the understanding of patterns emerging from such spatial interaction between agents is a key problem as much as their description through analytical or simulation means. The chapter reviews different approaches for modeling agents' behavior, taking into account either explicit spatial (lattice based) structures or networks. Some emphasis is placed on recent ABM as applied to the description of the dynamics of the geographical distribution o...

  2. Graphical model construction based on evolutionary algorithms

    Institute of Scientific and Technical Information of China (English)

    Youlong YANG; Yan WU; Sanyang LIU

    2006-01-01

    Using Bayesian networks to model promising solutions from the current population of an evolutionary algorithm can ensure an efficient and intelligent search for the optimum. However, constructing a Bayesian network that fits a given dataset is an NP-hard problem, and it also consumes massive computational resources. This paper develops a methodology for constructing a graphical model based on the Bayesian Dirichlet metric. Our approach is derived from a set of propositions and theorems obtained by studying the local metric relationship of networks matching the dataset. The paper presents an algorithm to construct a tree model from a set of potential solutions using the above approach. This method is important not only for evolutionary algorithms based on graphical models, but also for machine learning and data mining. The experimental results show that the exact theoretical results and the approximations match very well.

  3. Reaction-contingency based bipartite Boolean modelling

    Science.gov (United States)

    2013-01-01

    Background: Intracellular signalling systems are highly complex, rendering mathematical modelling of large signalling networks infeasible or impractical. Boolean modelling provides one feasible approach to whole-network modelling, but at the cost of dequantification and decontextualisation of activation. That is, these models cannot distinguish between the different downstream roles played by the same component activated in different contexts. Results: Here, we address this with a bipartite Boolean modelling approach. Briefly, we use a state-oriented approach with separate update rules based on reactions and contingencies. This approach retains contextual activation information and distinguishes distinct signals passing through a single component. Furthermore, we integrate this approach into the rxncon framework to support automatic model generation and iterative model definition and validation. We benchmark this method with the previously mapped MAP kinase network in yeast, showing that minor adjustments suffice to produce a functional network description. Conclusions: Taken together, we (i) present a bipartite Boolean modelling approach that retains contextual activation information, (ii) provide software support for automatic model generation, visualisation and simulation, and (iii) demonstrate its use for iterative model generation and validation. PMID:23835289
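The separation of reaction rules from contingency (context) rules can be illustrated with a toy synchronous update; the component and reaction names below are hypothetical, not taken from the yeast MAP kinase benchmark.

```python
def step(state, reactions, contingencies):
    """One synchronous update of a bipartite Boolean network: a reaction
    fires only when its substrates are present AND its contingency
    (contextual requirement) holds in the current state.  States record
    specific forms (bound, phosphorylated) rather than one 'active' bit."""
    new = dict(state)
    for product, (substrates, contingency) in reactions.items():
        fires = all(state[s] for s in substrates) and contingencies[contingency](state)
        new[product] = fires
    return new

# hypothetical fragment: kinase K phosphorylates T only while bound to scaffold S
state = {"K": True, "S": True, "K--S": False, "T-p": False}
reactions = {
    "K--S": (("K", "S"), "always"),       # binding reaction
    "T-p": (("K",), "K_bound_to_S"),      # phosphorylation, context-gated
}
contingencies = {
    "always": lambda s: True,
    "K_bound_to_S": lambda s: s["K--S"],
}
```

Because the phosphorylation rule consults the bound form "K--S" rather than K alone, the same kinase activated in a different context would not produce this signal, which is the contextual information a plain Boolean model loses.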

  4. Evolutionary modeling-based approach for model errors correction

    Science.gov (United States)

    Wan, S. Q.; He, W. P.; Wang, L.; Jiang, W.; Zhang, W.

    2012-08-01

    The inverse problem of using the information in historical data to estimate model errors is one of the frontier research topics in science. In this study, we investigate such a problem using the classic Lorenz (1963) equations as the prediction model and the Lorenz equations with a periodic evolutionary function as an accurate representation of reality to generate "observational data." On the basis of the intelligent features of evolutionary modeling (EM), including self-organization, self-adaptation and self-learning, the dynamic information contained in the historical data can be identified and extracted automatically by computer. Thereby, a new approach to estimating model errors based on EM is proposed in the present paper. Numerical tests demonstrate the ability of the new approach to correct model structural errors. In fact, it can realize the combination of statistics and dynamics to a certain extent.

  5. A Multiagent Based Model for Tactical Planning

    Science.gov (United States)

    2002-10-01

    Pub. Co., 1985. [10] Castillo, J.M. Aproximación mediante procedimientos de Inteligencia Artificial al planeamiento táctico [An Artificial Intelligence approach to tactical planning]. Doctoral Thesis... been developed under the same conceptual model and using similar Artificial Intelligence tools. We use four different stimulus/response agents in... The conceptual model is built on the basis of agent theory. To implement the different agents we have used Artificial Intelligence techniques such

  6. GIS-Based Hydrogeological-Parameter Modeling

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    A regression model is proposed to relate the variation of water well depth with topographic properties (area and slope), the variation of hydraulic conductivity and the vertical decay factor. The implementation of this model in a GIS environment (ARC/INFO), based on known water data and a DEM, is used to estimate the variation of hydraulic conductivity and the decay factor of different lithology units in a watershed context.

  7. Atom-Role-Based Access Control Model

    Science.gov (United States)

    Cai, Weihong; Huang, Richeng; Hou, Xiaoli; Wei, Gang; Xiao, Shui; Chen, Yindong

    Role-based access control (RBAC) model has been widely recognized as an efficient access control model and becomes a hot research topic of information security at present. However, in the large-scale enterprise application environments, the traditional RBAC model based on the role hierarchy has the following deficiencies: Firstly, it is unable to reflect the role relationships in complicated cases effectively, which does not accord with practical applications. Secondly, the senior role unconditionally inherits all permissions of the junior role, thus if a user is under the supervisor role, he may accumulate all permissions, and this easily causes the abuse of permission and violates the least privilege principle, which is one of the main security principles. To deal with these problems, we, after analyzing permission types and role relationships, proposed the concept of atom role and built an atom-role-based access control model, called ATRBAC, by dividing the permission set of each regular role based on inheritance path relationships. Through the application-specific analysis, this model can well meet the access control requirements.
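The core idea of splitting each regular role's permissions into atom roles along inheritance relationships can be sketched as follows; the role and permission names are hypothetical, and this simplified partition stands in for the full ATRBAC construction.

```python
def atomise(role_perms, inherits):
    """Split each regular role's permission set into atom roles: the
    permissions it holds in its own right versus those shared with each
    junior role it inherits from, so that a senior role need not
    unconditionally inherit everything a junior role can do."""
    atoms = {}
    for role, perms in role_perms.items():
        inherited = set()
        for junior in inherits.get(role, ()):
            inherited |= role_perms[junior]
            # atom role for the permissions reachable via this junior role
            atoms[role + ":via:" + junior] = set(perms) & set(role_perms[junior])
        # atom role for the role's own (non-inherited) permissions
        atoms[role + ":own"] = set(perms) - inherited
    return atoms
```

Granting a user only the atom roles a task actually needs, rather than the whole senior role, is how the partition supports the least privilege principle.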

  8. Image-Based Multiresolution Implicit Object Modeling

    Directory of Open Access Journals (Sweden)

    Sarti Augusto

    2002-01-01

    Full Text Available We discuss two image-based 3D modeling methods based on a multiresolution evolution of a volumetric function's level set. In the former method, the role of the level-set implosion is to fuse ("sew" and "stitch") together several partial reconstructions (depth maps) into a closed model. In the latter, the level set's implosion is steered directly by the texture mismatch between views. Both solutions share the characteristic of operating in an adaptive multiresolution fashion, in order to boost computational efficiency and robustness.

  9. Physically based modeling and animation of tornado

    Institute of Scientific and Technical Information of China (English)

    LIU Shi-guang; WANG Zhang-ye; GONG Zheng; CHEN Fei-fei; PENG Qun-sheng

    2006-01-01

    Realistic modeling and rendering of dynamic tornado scene is recognized as a challenging task for researchers of computer graphics. In this paper a new physically based method for simulating and animating tornado scene is presented. We first propose a Two-Fluid model based on the physical theory of tornado, then we simulate the flow of tornado and its interaction with surrounding objects such as debris, etc. Taking the scattering and absorption of light by the participating media into account, the illumination effects of the tornado scene can be generated realistically. With the support of graphics hardware, various kinds of dynamic tornado scenes can be rendered at interactive rates.

  10. A multivalued knowledge-base model

    CERN Document Server

    Achs, Agnes

    2010-01-01

    The basic aim of our study is to give a possible model for handling uncertain information. This model is worked out in the framework of DATALOG. First the concept of fuzzy Datalog is summarized, then its extensions to intuitionistic- and interval-valued fuzzy logic are given, and the concept of bipolar fuzzy Datalog is introduced. Based on these ideas, the concept of a multivalued knowledge-base is defined as a quadruple of background knowledge, a deduction mechanism, a connecting algorithm, and a function set of the program, which helps us determine the uncertainty levels of the results. Finally, a possible evaluation strategy is given.

  11. Model Based Control of Reefer Container Systems

    DEFF Research Database (Denmark)

    Sørensen, Kresten Kjær

    This thesis is concerned with the development of model based control for the Star Cool refrigerated container (reefer) with the objective of reducing energy consumption. This project has been carried out under the Danish Industrial PhD programme and has been financed by Lodam together with the Da...

  12. Model-based testing for embedded systems

    CERN Document Server

    Zander, Justyna; Mosterman, Pieter J

    2011-01-01

    What the experts have to say about Model-Based Testing for Embedded Systems: "This book is exactly what is needed at the exact right time in this fast-growing area. From its beginnings over 10 years ago of deriving tests from UML statecharts, model-based testing has matured into a topic with both breadth and depth. Testing embedded systems is a natural application of MBT, and this book hits the nail exactly on the head. Numerous topics are presented clearly, thoroughly, and concisely in this cutting-edge book. The authors are world-class leading experts in this area and teach us well-used

  13. SWIFT MODELLER: a Java based GUI for molecular modeling.

    Science.gov (United States)

    Mathur, Abhinav; Shankaracharya; Vidyarthi, Ambarish S

    2011-10-01

    MODELLER is command-line software that requires tedious formatting of inputs and the writing of Python scripts, which most users are not comfortable with; visualization of its output is also cumbersome due to verbose files. This makes the whole software protocol complex and requires extensive study of the MODELLER manuals and tutorials. Here we describe SWIFT MODELLER, a GUI that automates the formatting, scripting and data-extraction processes and presents them in an interactive way, making MODELLER much easier to use than before. The screens in SWIFT MODELLER are designed with homology modeling in mind, and their flow depicts its steps. It eliminates the formatting of inputs, the scripting processes and the analysis of verbose output files through automation, making the pasting of the target sequence the only prerequisite. Jmol (a 3D structure visualization tool) has been integrated into the GUI, which opens and displays the protein data bank files created by the MODELLER software. All files required and created by the software are saved in a folder named after the work instance's date and time of execution. SWIFT MODELLER lowers the skill level required for the software through automation of many of the steps in the original software protocol, thus saving an enormous amount of time per instance and making MODELLER very easy to work with.

  14. PV panel model based on datasheet values

    DEFF Research Database (Denmark)

    Sera, Dezso; Teodorescu, Remus; Rodriguez, Pedro

    2007-01-01

    This work presents the construction of a model for a PV panel using the single-diode five-parameter model, based exclusively on datasheet parameters. The model takes into account the series and parallel (shunt) resistance of the panel. The equivalent circuit and the basic equations of the PV cell/panel in Standard Test Conditions (STC) are shown, as well as the parameter extraction from the datasheet values. The temperature dependence of the cell dark saturation current is expressed with an alternative formula, which gives better correlation with the datasheet values of the power temperature dependence...
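    The single-diode model named above can be sketched numerically: the panel current obeys the implicit equation I = Iph - I0*(exp((V + I*Rs)/(a*Vt)) - 1) - (V + I*Rs)/Rsh, solved here by damped fixed-point iteration. All parameter values below are hypothetical illustrations, not taken from any datasheet or from the paper.

```python
import math

def pv_current(v, i_ph, i_0, r_s, r_sh, a, v_t, iters=200, damp=0.5):
    """Solve the implicit single-diode equation
        I = Iph - I0*(exp((V + I*Rs)/(a*Vt)) - 1) - (V + I*Rs)/Rsh
    for the panel current at terminal voltage V, by damped fixed-point iteration."""
    i = i_ph                       # start from the photo-generated current
    for _ in range(iters):
        vd = v + i * r_s           # voltage across the diode branch
        f = i_ph - i_0 * (math.exp(vd / (a * v_t)) - 1.0) - vd / r_sh
        i = (1.0 - damp) * i + damp * f   # damped update for robust convergence
    return i

# Hypothetical 36-cell panel parameters (NOT from any real datasheet):
params = dict(i_ph=5.0, i_0=1e-7, r_s=0.2, r_sh=300.0, a=1.3,
              v_t=36 * 0.0257)    # Ns * kT/q at ~25 degC
i_sc = pv_current(0.0, **params)  # short-circuit current, slightly below i_ph
```

Sweeping `v` from zero toward open circuit traces the familiar I-V curve; the shunt resistance accounts for the small gap between `i_sc` and the photo-current.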

  15. Model Predictive Control based on Finite Impulse Response Models

    DEFF Research Database (Denmark)

    Prasath, Guru; Jørgensen, John Bagterp

    2008-01-01

    We develop a regularized l2 finite impulse response (FIR) predictive controller with input and input-rate constraints. Feedback is based on a simple constant output disturbance filter. The performance of the predictive controller in the face of plant-model mismatch is investigated by simulations...
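    The FIR model class underlying such a predictive controller can be illustrated in a few lines: the predicted output is a finite convolution of past inputs with the impulse-response coefficients. The coefficients below are hypothetical, not values from the paper.

```python
def fir_predict(h, u_past):
    """FIR model output: y(k) = sum_i h[i] * u(k-1-i),
    where u_past lists past inputs, most recent first."""
    return sum(hi * ui for hi, ui in zip(h, u_past))

# Illustrative impulse-response coefficients (hypothetical):
h = [0.5, 0.3, 0.2]
y_step = fir_predict(h, [1.0, 1.0, 1.0])  # unit step held long enough: sum(h)
y_new = fir_predict(h, [1.0, 0.0, 0.0])   # only the newest input is nonzero
```

For a unit step held over the whole memory of the model the prediction equals the steady-state gain `sum(h)`; an MPC stacks such predictions over a horizon and optimizes the future inputs subject to input and input-rate constraints.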

  16. Incident Duration Modeling Using Flexible Parametric Hazard-Based Models

    Directory of Open Access Journals (Sweden)

    Ruimin Li

    2014-01-01

    Assessing and prioritizing the duration time and effects of traffic incidents on major roads present significant challenges for road network managers. This study examines the effect of numerous factors associated with various types of incidents on their duration and proposes an incident duration prediction model. Several parametric accelerated failure time hazard-based models were examined, including Weibull, log-logistic, log-normal, and generalized gamma, as well as all models with gamma heterogeneity and flexible parametric hazard-based models with degrees of freedom ranging from one to ten, by analyzing a traffic incident dataset obtained from the Incident Reporting and Dispatching System in Beijing in 2008. Results show that different factors significantly affect different incident time phases, whose best distributions were diverse. Given the best hazard-based models of each incident time phase, the prediction result can be reasonable for most incidents. The results of this study can aid traffic incident management agencies not only in implementing strategies that would reduce incident duration, and thus reduce congestion, secondary incidents, and the associated human and economic losses, but also in effectively predicting incident duration time.

  17. Incident duration modeling using flexible parametric hazard-based models.

    Science.gov (United States)

    Li, Ruimin; Shang, Pan

    2014-01-01

    Assessing and prioritizing the duration time and effects of traffic incidents on major roads present significant challenges for road network managers. This study examines the effect of numerous factors associated with various types of incidents on their duration and proposes an incident duration prediction model. Several parametric accelerated failure time hazard-based models were examined, including Weibull, log-logistic, log-normal, and generalized gamma, as well as all models with gamma heterogeneity and flexible parametric hazard-based models with degrees of freedom ranging from one to ten, by analyzing a traffic incident dataset obtained from the Incident Reporting and Dispatching System in Beijing in 2008. Results show that different factors significantly affect different incident time phases, whose best distributions were diverse. Given the best hazard-based models of each incident time phase, the prediction result can be reasonable for most incidents. The results of this study can aid traffic incident management agencies not only in implementing strategies that would reduce incident duration, and thus reduce congestion, secondary incidents, and the associated human and economic losses, but also in effectively predicting incident duration time.
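    The accelerated failure time (AFT) idea can be sketched with the Weibull case: covariates scale the duration through lambda = exp(beta . x), and the expected duration is lambda * Gamma(1 + 1/k). The coefficients below are hypothetical, not those estimated from the Beijing dataset.

```python
import math

def weibull_aft_expected_duration(beta, x, shape_k):
    """Expected duration under a Weibull AFT model.
    Scale lambda = exp(beta . x); E[T] = lambda * Gamma(1 + 1/k)."""
    scale = math.exp(sum(b * xi for b, xi in zip(beta, x)))
    return scale * math.gamma(1.0 + 1.0 / shape_k)

def weibull_survival(t, scale, shape_k):
    """P(duration > t) for a Weibull with the given scale and shape."""
    return math.exp(-((t / scale) ** shape_k))

# Hypothetical: intercept only, median-scale incident of ~30 min, shape k = 1
# (k = 1 reduces to the exponential case, so E[T] equals the scale).
expected = weibull_aft_expected_duration([math.log(30.0)], [1.0], 1.0)
```

Adding covariates (incident type, lane blockage, time of day) to `beta` and `x` stretches or shrinks the scale multiplicatively, which is exactly the AFT interpretation.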

  18. Model-Based Power Plant Master Control

    Energy Technology Data Exchange (ETDEWEB)

    Boman, Katarina; Thomas, Jean; Funkquist, Jonas

    2010-08-15

    The main goal of the project has been to evaluate the potential of a coordinated master control for a solid fuel power plant in terms of tracking capability, stability and robustness. The control strategy has been model-based predictive control (MPC) and the plant used in the case study has been the Vattenfall power plant Idbaecken in Nykoeping. A dynamic plant model based on nonlinear physical models was used to imitate the true plant in MATLAB/SIMULINK simulations. The basis for this model was already developed in previous Vattenfall internal projects, along with a simulation model of the existing control implementation with traditional PID controllers. The existing PID control is used as a reference performance, and it has been thoroughly studied and tuned in these previous Vattenfall internal projects. A turbine model was developed with characteristics based on the results of steady-state simulations of the plant using the software EBSILON. Using the derived model as a representative for the actual process, an MPC control strategy was developed using linearization and gain-scheduling. The control signal constraints (rate of change) and constraints on outputs were implemented to comply with plant constraints. After tuning the MPC control parameters, a number of simulation scenarios were performed to compare the MPC strategy with the existing PID control structure. The simulation scenarios also included cases highlighting the robustness properties of the MPC strategy. From the study, the main conclusions are: - The proposed Master MPC controller shows excellent set-point tracking performance even though the plant has strong interactions and non-linearity, and the controls and their rate of change are bounded. - The proposed Master MPC controller is robust, stable in the presence of disturbances and parameter variations. Even though the current study only considered a very small number of the possible disturbances and modelling errors, the considered cases are

  19. Frequency response function-based model updating using Kriging model

    Science.gov (United States)

    Wang, J. T.; Wang, C. J.; Zhao, J. P.

    2017-03-01

    An acceleration frequency response function (FRF) based model updating method is presented in this paper, which introduces a Kriging model as a metamodel into the optimization process instead of iterating the finite element analysis directly. The Kriging model serves as a fast-running surrogate that reduces solving time and facilitates the application of intelligent algorithms in model updating. The training samples for the Kriging model are generated by design of experiment (DOE), whose response corresponds to the difference between the experimental acceleration FRFs and their counterparts from the finite element model (FEM) at selected frequency points. The boundary condition is taken into account, and a two-step DOE method is proposed for reducing the number of training samples. The first step selects the design variables from the boundary condition, and the selected variables are passed to the second step for generating the training samples. The optimization results of the design variables are taken as the updated values to calibrate the FEM, so that the analytical FRFs tend to coincide with the experimental FRFs. The proposed method is performed successfully on a composite honeycomb sandwich beam; after model updating, the analytical acceleration FRFs match the experimental data significantly better, especially when the damping ratios are adjusted.

  20. Consensus of classification trees for skin sensitisation hazard prediction

    OpenAIRE

    ASTURIOL BOFILL DAVID; Casati, Silvia; Worth, Andrew

    2016-01-01

    Since March 2013, it is no longer possible to market in the European Union (EU) cosmetics containing new ingredients tested on animals. Although several in silico alternatives are available and achievements have been made in the development and regulatory adoption of skin sensitisation non-animal tests, there is not yet a generally accepted approach for skin sensitisation assessment that would fully substitute the need for animal testing. The aim of this work was to build a defined approac...

  1. Classification Trees and the Analysis of Helicopter Vibration Data

    OpenAIRE

    Larson, Harold J.

    1997-01-01

    Health and Usage Monitoring Systems are receiving a great deal of interest, in an attempt to increase the safety and operational readiness of helicopters, and to minimize maintenance costs. These systems monitor (and can record) various flight parameters, pilot conversations, engine exhaust debris, metallic chip detector levels in the lubrication system, rotor track and balance, as well as vibration levels at selected locations throughout the airframe and the power drive system. Vibration lev...

  2. Classification tree analysis of factors affecting parking choices in Qatar

    OpenAIRE

    Shaaban, K.; Pande, A

    2015-01-01

    Qatar has experienced a significant population growth in the past decade. The growth has been accompanied by an increase in automobile ownership rates leading to parking problems especially in the capital city of Doha. The objective of this study was to find the factors affecting people's choice of parking in this rich developing country when different parking options are available. Two commercial centers located in the city of Doha, Qatar were selected for this study; the City Center mall an...

  3. Unifying Model-Based and Reactive Programming within a Model-Based Executive

    Science.gov (United States)

    Williams, Brian C.; Gupta, Vineet; Norvig, Peter (Technical Monitor)

    1999-01-01

    Real-time, model-based deduction has recently emerged as a vital component in AI's tool box for developing highly autonomous reactive systems. Yet one of the current hurdles towards developing model-based reactive systems is the number of methods simultaneously employed, and their corresponding melange of programming and modeling languages. This paper offers an important step towards unification. We introduce RMPL, a rich modeling language that combines probabilistic, constraint-based modeling with reactive programming constructs, while offering a simple semantics in terms of hidden state Markov processes. We introduce probabilistic, hierarchical constraint automata (PHCA), which allow Markov processes to be expressed in a compact representation that preserves the modularity of RMPL programs. Finally, a model-based executive, called Reactive Burton, is described that exploits this compact encoding to perform efficient simulation, belief state update and control sequence generation.

  4. CEAI: CCM based Email Authorship Identification Model

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah

    2013-01-01

    content features. It is observed that the use of such features in the authorship identification process has a positive impact on the accuracy of the authorship identification task. We performed experiments to justify our arguments and compared the results with other baseline models. Experimental results...

  5. Prototype-based models in machine learning

    NARCIS (Netherlands)

    Biehl, Michael; Hammer, Barbara; Villmann, Thomas

    2016-01-01

    An overview is given of prototype-based models in machine learning. In this framework, observations, i.e., data, are stored in terms of typical representatives. Together with a suitable measure of similarity, the systems can be employed in the context of unsupervised and supervised analysis of poten

  6. Model based development of engine control algorithms

    NARCIS (Netherlands)

    Dekker, H.J.; Sturm, W.L.

    1996-01-01

    Model based development of engine control systems has several advantages. The development time and costs are strongly reduced because much of the development and optimization work is carried out by simulating both engine and control system. After optimizing the control algorithm it can be executed b

  7. What's Missing in Model-Based Teaching

    Science.gov (United States)

    Khan, Samia

    2011-01-01

    In this study, the author investigated how four science teachers employed model-based teaching (MBT) over a 1-year period. The purpose of the research was to develop a baseline of the fundamental and specific dimensions of MBT that are present and absent in science teaching. Teacher interviews, classroom observations, and pre and post-student…

  8. Néron Models and Base Change

    DEFF Research Database (Denmark)

    Halle, Lars Halvard; Nicaise, Johannes

    on Néron component groups, Edixhoven’s filtration and the base change conductor of Chai and Yu, and we study these invariants using various techniques such as models of curves, sheaves on Grothendieck sites and non-archimedean uniformization. We then apply our results to the study of motivic zeta functions...

  9. Agent based computational model of trust

    NARCIS (Netherlands)

    A. Gorobets (Alexander); B. Nooteboom (Bart)

    2004-01-01

    textabstractThis paper employs the methodology of Agent-Based Computational Economics (ACE) to investigate under what conditions trust can be viable in markets. The emergence and breakdown of trust is modeled in a context of multiple buyers and suppliers. Agents adapt their trust in a partner, the w

  10. Sandboxes for Model-Based Inquiry

    Science.gov (United States)

    Brady, Corey; Holbert, Nathan; Soylu, Firat; Novak, Michael; Wilensky, Uri

    2015-01-01

    In this article, we introduce a class of constructionist learning environments that we call "Emergent Systems Sandboxes" ("ESSs"), which have served as a centerpiece of our recent work in developing curriculum to support scalable model-based learning in classroom settings. ESSs are a carefully specified form of virtual…

  11. Port-based modeling of mechatronic systems

    NARCIS (Netherlands)

    Breedveld, Peter C.

    2004-01-01

    Many engineering activities, including mechatronic design, require that a multidomain or ‘multi-physics’ system and its control system be designed as an integrated system. This contribution discusses the background and tools for a port-based approach to integrated modeling and simulation of physical

  12. Statistical Seasonal Sea Surface based Prediction Model

    Science.gov (United States)

    Suarez, Roberto; Rodriguez-Fonseca, Belen; Diouf, Ibrahima

    2014-05-01

    The interannual variability of the sea surface temperature (SST) plays a key role in the strongly seasonal rainfall regime of the West African region. The predictability of the seasonal cycle of rainfall is widely discussed by the scientific community, with results that remain unsatisfactory due to the difficulty dynamical models have in reproducing the behavior of the Inter Tropical Convergence Zone (ITCZ). To tackle this problem, a statistical model based on oceanic predictors has been developed at the Universidad Complutense de Madrid (UCM) with the aim of complementing and enhancing the predictability of the West African Monsoon (WAM) as an alternative to the coupled models. The model, called S4CAST (SST-based Statistical Seasonal Forecast), is based on discriminant analysis techniques, specifically Maximum Covariance Analysis (MCA) and Canonical Correlation Analysis (CCA). Beyond the application of the model to the prediction of rainfall in West Africa, its use extends to a range of oceanic, atmospheric and health-related parameters influenced by the temperature of the sea surface as a defining factor of variability.

  13. Introducing Waqf Based Takaful Model in India

    Directory of Open Access Journals (Sweden)

    Syed Ahmed Salman

    2014-03-01

    Objective – Waqf is a unique feature of the socioeconomic system of Islam in a multi-religious and developing country like India. India is a country rich in waqf assets, and the history of waqf in India can be traced back 800 years. Most researchers suggest how waqf can be used as a tool to mitigate the poverty of Muslims. India has the third largest Muslim population after Indonesia and Pakistan; however, the majority of Muslims belong to the low-income group and are in need of help. It is believed that waqf can be utilized for the betterment of the Indian Muslim community. Among the available uses of waqf assets, the main objective of this paper is to introduce a waqf-based takaful model in India, and to highlight how this proposed model can be adopted there. Methods – Library research is applied, since this paper relies on secondary data, by thoroughly reviewing the most relevant literature. Result – India, as a country rich in waqf assets, should fully utilize these resources to help Muslims through takaful. Conclusion – In this study, we have proposed a waqf-based takaful model for India combining the concepts of mudarabah and wakalah. We recommend this model based on the background and situation of the country. Since we have not tested the viability of this model in India, future research should continue with this testing. Keywords: Waqf, Takaful, Poverty, India

  14. Whole body acid-base modeling revisited.

    Science.gov (United States)

    Ring, Troels; Nielsen, Søren

    2017-04-01

    The textbook account of whole body acid-base balance in terms of endogenous acid production, renal net acid excretion, and gastrointestinal alkali absorption, which is the only comprehensive model around, has never been applied in clinical practice or been formally validated. To improve understanding of acid-base modeling, we managed to write up this conventional model as an expression solely in terms of urine chemistry. Renal net acid excretion and endogenous acid production were already formulated in terms of urine chemistry, and from the literature we could also express gastrointestinal alkali absorption in terms of urine excretions. With a few assumptions it was possible to see that this expression of net acid balance was arithmetically identical to minus urine charge, whereby, under developing acidosis, urine was predicted to acquire a net negative charge. The literature already mentions unexplained negative urine charges, so we scrutinized a series of seminal papers and confirmed empirically the theoretical prediction that the observed urine charge did become negative as acidosis developed. Hence, we conclude that the conventional model is problematic, since it predicts what is physiologically impossible. We therefore need a new model for whole body acid-base balance that does not have impossible implications. Furthermore, new experimental studies are needed to account for the charge imbalance in urine during the development of acidosis.
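    The charge-balance argument can be made concrete with a toy calculation: net urine charge is the sum of measured cation excretions minus the sum of measured anion excretions, and under the identity described above, net acid balance equals minus that charge. All ion values below are hypothetical mEq/day figures, purely for illustration.

```python
def urine_net_charge(cations_meq, anions_meq):
    """Net urine charge (mEq/day): measured cations minus measured anions.
    Under the identity discussed above, net acid balance = -(urine charge),
    so developing acidosis corresponds to an increasingly negative charge."""
    return sum(cations_meq) - sum(anions_meq)

# Hypothetical daily excretions (mEq/day), for illustration only:
baseline = urine_net_charge([120, 60], [130, 40])  # e.g. Na + K vs Cl + others
acidosis = urine_net_charge([110, 55], [150, 45])  # measured anions outpace cations
```

A real urine sample is electroneutral; a computed net charge merely reflects ions left out of the measurement, which is exactly the tension the abstract exploits.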

  15. Deriving Framework Usages Based on Behavioral Models

    Science.gov (United States)

    Zenmyo, Teruyoshi; Kobayashi, Takashi; Saeki, Motoshi

    One of the critical issues in framework-based software development is the huge introduction cost caused by the technical gap between developers and users of frameworks. This paper proposes a technique for deriving framework usages that implement a given requirements specification. By using the derived usages, users can use a framework without understanding it in detail. Requirements specifications that describe definite behavioral requirements cannot be related to frameworks as-is, since frameworks deliberately lack a definite control structure so that users can customize them to suit given requirements specifications. To cope with this issue, a new technique based on satisfiability problems (SAT) is employed to derive the control structures of the framework model. In the proposed technique, requirements specifications and frameworks are modeled as Labeled Transition Systems (LTSs) with branch conditions represented by predicates. Truth assignments of the branch conditions in the framework models are not given initially, representing the customizable control structure. The derivation of truth assignments of the branch conditions is cast as a SAT instance by assuming relations between termination states of the requirements specification model and those of the framework model. This derivation technique is incorporated into a technique we have proposed previously for relating actions of requirements specifications to those of frameworks. Furthermore, this paper discusses a case study of typical use cases in e-commerce systems.
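    The derivation step can be sketched in miniature: enumerate truth assignments of the framework's branch conditions and keep those satisfying a constraint that links termination states. A real SAT solver replaces the brute-force loop; the branch names and the constraint below are hypothetical, not from the paper.

```python
from itertools import product

def derive_assignments(branch_vars, constraint):
    """Brute-force 'SAT' sketch: enumerate truth assignments of the
    framework's branch conditions and keep those satisfying the
    termination-state constraint."""
    solutions = []
    for values in product([False, True], repeat=len(branch_vars)):
        env = dict(zip(branch_vars, values))
        if constraint(env):
            solutions.append(env)
    return solutions

# Hypothetical constraint: the framework reaches the required termination
# state iff branch b1 is taken and branch b2 is not.
sols = derive_assignments(["b1", "b2"], lambda e: e["b1"] and not e["b2"])
```

Each surviving assignment fixes a concrete control structure of the framework model, which is the "usage" handed to the framework user.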

  16. Agent Based Modeling Applications for Geosciences

    Science.gov (United States)

    Stein, J. S.

    2004-12-01

    Agent-based modeling techniques have successfully been applied to systems in which complex behaviors or outcomes arise from varied interactions between individuals in the system. Each individual interacts with its environment, as well as with other individuals, by following a set of relatively simple rules. Traditionally this "bottom-up" modeling approach has been applied to problems in economics and sociology, but it has more recently been introduced to various disciplines in the geosciences. The technique can help explain the origin of complex processes from a relatively simple set of rules, incorporate large and detailed datasets where they exist, and simulate the effects of extreme events on system-wide behavior. Challenges associated with this modeling method include significant computational requirements for keeping track of thousands to millions of agents, and a lack of both established model-validation methods and a formal methodology for evaluating model uncertainty. Challenges specific to the geosciences include how to define agents that control water, contaminant fluxes, climate forcing and other physical processes, and how to link these "geo-agents" into larger agent-based simulations that include social systems such as demographics, economics and regulation. Effective management of limited natural resources (such as water, hydrocarbons, or land) requires an understanding of the factors that influence demand for these resources on regional and temporal scales. Agent-based models can be used to simulate this demand across a variety of sectors under a range of conditions and to determine effective and robust management policies and monitoring strategies. The recent focus on the role of biological processes in the geosciences is another example of an area that could benefit from agent-based applications. A typical approach to modeling the effect of biological processes in geologic media has been to represent these processes in

  17. Entropy Based Modelling for Estimating Demographic Trends.

    Directory of Open Access Journals (Sweden)

    Guoqi Li

    In this paper, an entropy-based method is proposed to forecast the demographic changes of countries. We formulate the estimation of future demographic profiles as a constrained optimization problem, anchored on the empirically validated assumption that the entropy of the age distribution increases in time. The procedure of the proposed method involves three stages, namely: (1) prediction of the age distribution of a country's population based on an "age-structured population model"; (2) estimation of the age distribution of each individual household size with an entropy-based formulation based on an "individual household size model"; and (3) estimation of the number of each household size based on a "total household size model". The last stage is achieved by projecting the age distribution of the country's population (obtained in stage 1) onto the age distributions of individual household sizes (obtained in stage 2). The effectiveness of the proposed method is demonstrated on real-world data, and it is general and versatile enough to be extended to other time-dependent demographic variables.
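    The core assumption, that the entropy of the age distribution increases over time, can be checked with a toy example; the two distributions below are hypothetical, not data from the paper.

```python
import math

def shannon_entropy(dist):
    """Shannon entropy (nats) of a discrete distribution; zero bins are skipped."""
    total = sum(dist)
    return -sum((p / total) * math.log(p / total) for p in dist if p > 0)

young = [0.40, 0.35, 0.20, 0.05]  # bottom-heavy age pyramid (hypothetical)
aged  = [0.28, 0.26, 0.25, 0.21]  # flatter distribution of an ageing population
```

As the pyramid flattens the entropy grows toward its upper bound, log of the number of age bins, which is what makes "entropy increases in time" a usable constraint in the optimization.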

  18. Family-Based Model Checking Without a Family-Based Model Checker

    DEFF Research Database (Denmark)

    Dimovski, Aleksandar; Al-Sibahi, Ahmad Salim; Brabrand, Claus

    2015-01-01

    For systems with variability, specialized family-based model checking algorithms allow efficient verification of multiple variants, simultaneously. These algorithms scale much better than "brute force" verification of individual systems, one-by-one. Nevertheless, they can deal with only very small variational models. We address two key problems of family-based model checking. First, we improve scalability by introducing abstractions that simplify variability. Second, we reduce the burden of maintaining specialized family-based model checkers, by showing how the presented variability abstractions can be used to model-check variational models using the standard version of (single system) SPIN. The abstractions are first defined as Galois connections on semantic domains. We then show how to translate them into syntactic source-to-source transformations on variational models. This allows the use of SPIN...

  19. Mesoscopic model of actin-based propulsion.

    Directory of Open Access Journals (Sweden)

    Jie Zhu

    Two theoretical models dominate current understanding of actin-based propulsion: the microscopic polymerization ratchet model predicts that growing and writhing actin filaments generate forces and movements, while the macroscopic elastic propulsion model suggests that deformation and stress of the growing actin gel are responsible for the propulsion. We examine both experimentally and computationally the 2D movement of ellipsoidal beads propelled by actin tails and show that neither of the two models can explain the observed bistability of the orientation of the beads. To explain the data, we develop a 2D hybrid mesoscopic model by reconciling these two models, such that individual actin filaments undergoing nucleation, elongation, attachment, detachment and capping are embedded into the boundary of a node-spring viscoelastic network representing the macroscopic actin gel. Stochastic simulations of this 'in silico' actin network show that the combined effects of macroscopic elastic deformation and microscopic ratchets can explain the observed bistable orientation of the actin-propelled ellipsoidal beads. To test the theory further, we analyze the observed distribution of the curvatures of the trajectories and show that the hybrid model's predictions fit the data. Finally, we demonstrate that the model can explain both concave-up and concave-down force-velocity relations for growing actin networks, depending on the characteristic time scale and network recoil. To summarize, we propose that both microscopic polymerization ratchets and macroscopic stresses of the deformable actin network are responsible for force and movement generation.

  20. Modeling Leaves Based on Real Image

    Institute of Scientific and Technical Information of China (English)

    CAO Yu-kun; LI Yun-feng; ZHU Qing-sheng; LIU Yin-bin

    2004-01-01

    Plants have complex structures. The shape of a plant component is vital for capturing the characteristics of a species. One of the challenges in computer graphics is to create the geometry of objects in an intuitive and direct way while allowing interactive manipulation of the resulting shapes. In this paper, an interactive method for modeling leaves based on real images is proposed, using biological data for individual plants. The modeling process begins with a one-dimensional analogue of implicit surfaces, from which a 2D silhouette of a leaf is generated based on image segmentation. The silhouette skeleton is thus obtained. Feature parameters of the leaf are extracted based on biological experimental data, and the obtained leaf structure is then modified by comparing the synthetic result with the real leaf so as to make the leaf structure more realistic. Finally, the leaf mesh is constructed by sweeps.

  1. Multiagent-Based Model For ESCM

    Directory of Open Access Journals (Sweden)

    Delia MARINCAS

    2011-01-01

    Web-based applications for Supply Chain Management (SCM) are now a necessity for every company in order to meet increasing customer demands, to face global competition and to make a profit. A multiagent-based approach is appropriate for eSCM because it exhibits many of the characteristics a SCM system should have. For this reason, we have proposed a multiagent-based eSCM model which configures a virtual SC and automates the SC activities: selling, purchasing, manufacturing, planning, inventory, etc. This model will allow better coordination of the supply chain network and will increase the effectiveness of the Web and intelligent technologies employed in eSCM software.

  2. PARTICIPATION BASED MODEL OF SHIP CREW MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Toni Bielić

    2014-10-01

    This paper analyses the participation-based model on board ship as a possibly optimal leadership model in the shipping industry, with an accent on the decision-making process. The authors have tried to define the master's behaviour model and management style, identifying the drawbacks and disadvantages of a vertical, pyramidal organization with the master at the top. The paper describes the efficiency of decision making within a team organization and the optimization of a ship's organisation by introducing teamwork on board. Three examples of ship accidents are studied and evaluated through the "Leader-participation" model. The model of participation-based management, as a model of teamwork, has been applied in studying the cause and effect of accidents, with a critical review of communication and the management of human resources on a ship. The results show that the cause of all three accidents was the autocratic behaviour of the leaders and a lack of communication within teams.

  3. Mandarin Pronunciation Modeling Based on CASS Corpus

    Institute of Scientific and Technical Information of China (English)

    郑方; 宋战江; Pascale Fung; William Byrne

    2002-01-01

    Pronunciation variability is an important issue that must be faced when developing practical automatic spontaneous speech recognition systems. In this paper, the factors that may affect the recognition performance are analyzed, including those specific to the Chinese language. By studying the INITIAL/FINAL (IF) characteristics of the Chinese language and developing the Bayesian equation, the concepts of generalized INITIAL/FINAL (GIF) and generalized syllable (GS), the GIF modeling and the IF-GIF modeling, as well as context-dependent pronunciation weighting, are proposed based on a well phonetically transcribed seed database. By using these methods, the Chinese syllable error rate (SER) is reduced by 6.3% and 4.2% compared with the GIF modeling and IF modeling respectively when the language model, such as a syllable or word N-gram, is not used. The effectiveness of these methods is also proved when more data without phonetic transcription are used to refine the acoustic model using the proposed iterative forced-alignment based transcribing (IFABT) method, achieving a 5.7% SER reduction.

  4. Satellite-based terrestrial production efficiency modeling

    Directory of Open Access Journals (Sweden)

    Obersteiner Michael

    2009-09-01

    Full Text Available Abstract Production efficiency models (PEMs) are based on the theory of light use efficiency (LUE), which states that a relatively constant relationship exists between photosynthetic carbon uptake and radiation receipt at the canopy level. Challenges remain, however, in the application of the PEM methodology to global net primary productivity (NPP) monitoring. The objectives of this review are as follows: (1) to describe the general functioning of six PEMs (CASA, GLO-PEM, TURC, C-Fix, MOD17, and BEAMS) identified in the literature; (2) to review each model to determine potential improvements to the general PEM methodology; (3) to review the related literature on satellite-based gross primary productivity (GPP) and NPP modeling for additional possibilities for improvement; and (4) based on this review, to propose items for coordinated research. This review noted a number of possibilities for improvement to the general PEM architecture, ranging from LUE to meteorological and satellite-based inputs. Current PEMs tend to treat the globe similarly in terms of physiological and meteorological factors, often ignoring unique regional aspects. Each of the existing PEMs has developed unique methods to estimate NPP, and the combination of the most successful of these could lead to improvements. It may be beneficial to develop regional PEMs that can be combined under a global framework. The results of this review suggest the creation of a hybrid PEM could bring about a significant enhancement to the PEM methodology and thus terrestrial carbon flux modeling. Key items topping the PEM research agenda identified in this review include the following: LUE should not be assumed constant, but should vary by plant functional type (PFT) or photosynthetic pathway; evidence is mounting that PEMs should consider incorporating diffuse radiation; continue to pursue relationships between satellite-derived variables and LUE, GPP and autotrophic respiration (Ra); there is an urgent need for
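    The light-use-efficiency relation these PEMs share can be written as GPP = ε × fAPAR × PAR, with a maximum efficiency down-regulated by environmental stress scalars (MOD17, for instance, uses temperature and vapour-pressure-deficit scalars). A minimal sketch with illustrative parameter values, not the formulation of any one of the six models:

```python
def gpp_lue(par, fapar, eps_max, t_scalar, w_scalar):
    """Light-use-efficiency GPP estimate (gC m-2 d-1).

    par      : incident photosynthetically active radiation (MJ m-2 d-1)
    fapar    : fraction of PAR absorbed by the canopy (0..1)
    eps_max  : maximum light use efficiency (gC MJ-1), varies by PFT
    t_scalar : temperature down-regulation scalar (0..1)
    w_scalar : water-stress down-regulation scalar (0..1)
    """
    eps = eps_max * t_scalar * w_scalar   # realized light use efficiency
    return eps * fapar * par              # efficiency times absorbed PAR

# Example: a temperate forest pixel on a clear, mildly stressed summer day
gpp = gpp_lue(par=10.0, fapar=0.8, eps_max=1.0, t_scalar=0.9, w_scalar=0.8)
# gpp is about 5.76 gC m-2 d-1
```

The review's point that ε should vary by PFT or photosynthetic pathway corresponds to choosing `eps_max` per biome rather than globally.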

  5. Modeling acquaintance networks based on balance theory

    Directory of Open Access Journals (Sweden)

    Vukašinović Vida

    2014-09-01

    Full Text Available An acquaintance network is a social structure made up of a set of actors and the ties between them. These ties change dynamically as a consequence of incessant interactions between the actors. In this paper we introduce a social network model called the Interaction-Based (IB) model that involves well-known sociological principles. The connections between the actors and the strength of the connections are influenced by the continuous positive and negative interactions between the actors and, vice versa, the future interactions are more likely to happen between the actors that are connected with stronger ties. The model is also inspired by the social behavior of animal species, particularly that of ants in their colony. A model evaluation showed that the IB model turned out to be sparse. The model has a small diameter and an average path length that grows in proportion to the logarithm of the number of vertices. The clustering coefficient is relatively high, and its value stabilizes in larger networks. The degree distributions are slightly right-skewed. In the mature phase of the IB model, i.e., when the number of edges does not change significantly, most of the network properties do not change significantly either. The IB model was found to be the best of all the compared models in simulating the e-mail URV (University Rovira i Virgili of Tarragona) network because the properties of the IB model more closely matched those of the e-mail URV network than did those of the other models.
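    The core dynamic described in the abstract — ties strengthened by positive interactions, weakened by negative ones, with future interaction partners biased toward stronger ties — can be sketched in a few lines. The update rule and parameters below are illustrative, not the paper's exact specification:

```python
import random

def simulate_ib(n_actors, n_interactions, p_positive=0.8, seed=1):
    """Toy version of interaction-based tie dynamics: each interaction
    strengthens (positive) or weakens (negative) a tie, and partners are
    chosen with probability proportional to current tie strength plus a
    baseline of 1 (so strangers can still meet)."""
    rng = random.Random(seed)
    w = {}  # key (i, j) with i < j -> nonnegative tie strength
    for _ in range(n_interactions):
        i = rng.randrange(n_actors)
        # partner chosen proportional to tie strength + baseline
        weights = [0 if j == i else w.get((min(i, j), max(i, j)), 0) + 1
                   for j in range(n_actors)]
        j = rng.choices(range(n_actors), weights=weights)[0]
        key = (min(i, j), max(i, j))
        delta = 1 if rng.random() < p_positive else -1
        w[key] = max(0, w.get(key, 0) + delta)  # strength never negative
    return w

ties = simulate_ib(n_actors=10, n_interactions=200)
```

Even this toy version reproduces the qualitative feedback loop the model is built on: frequently interacting pairs accumulate strength and are then chosen more often.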

  6. Multivariate statistical modelling based on generalized linear models

    CERN Document Server

    Fahrmeir, Ludwig

    1994-01-01

    This book is concerned with the use of generalized linear models for univariate and multivariate regression analysis. Its emphasis is to provide a detailed introductory survey of the subject based on the analysis of real data drawn from a variety of subjects including the biological sciences, economics, and the social sciences. Where possible, technical details and proofs are deferred to an appendix in order to provide an accessible account for non-experts. Topics covered include: models for multi-categorical responses, model checking, time series and longitudinal data, random effects models, and state-space models. Throughout, the authors have taken great pains to discuss the underlying theoretical ideas in ways that relate well to the data at hand. As a result, numerous researchers whose work relies on the use of these models will find this an invaluable account to have on their desks. "The basic aim of the authors is to bring together and review a large part of recent advances in statistical modelling of m...
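    The fitting algorithm underlying essentially all the GLMs the book covers is iteratively reweighted least squares (Fisher scoring). A bare-bones logistic-regression version on toy data (no convergence checks or safeguards, purely for illustration):

```python
import numpy as np

def logistic_glm_irls(X, y, iters=15):
    """Fit a logistic GLM (binomial response, logit link) by iteratively
    reweighted least squares. X is an (n, p) design matrix that already
    contains an intercept column; y holds 0/1 responses."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        eta = X @ beta                       # linear predictor
        mu = 1.0 / (1.0 + np.exp(-eta))      # inverse logit link
        W = mu * (1.0 - mu)                  # GLM working weights
        z = eta + (y - mu) / W               # working response
        beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
    return beta

# Toy, non-separable data: larger x makes the outcome more likely
x = np.array([-2.0, -1.5, -1.0, -0.5, 0.0, 0.2, 0.5, 1.0, 1.5, 2.0])
y = np.array([0.0, 1.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0])
X = np.column_stack([np.ones_like(x), x])
beta = logistic_glm_irls(X, y)
```

Swapping the link and variance functions in the two commented lines yields the other exponential-family GLMs (Poisson with log link, etc.) discussed in the book.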

  7. An immune based dynamic intrusion detection model

    Institute of Scientific and Technical Information of China (English)

    LI Tao

    2005-01-01

    With the dynamic description method for self and antigen, and the concept of dynamic immune tolerance for lymphocytes in network-security domain presented in this paper, a new immune based dynamic intrusion detection model (Idid) is proposed. In Idid, the dynamic models and the corresponding recursive equations of the lifecycle of mature lymphocytes, and the immune memory are built. Therefore, the problem of the dynamic description of self and nonself in computer immune systems is solved, and the defect of the low efficiency of mature lymphocyte generating in traditional computer immune systems is overcome. Simulations of this model are performed, and the comparison experiment results show that the proposed dynamic intrusion detection model has a better adaptability than the traditional methods.

  8. Mechanics model for actin-based motility.

    Science.gov (United States)

    Lin, Yuan

    2009-02-01

    We present here a mechanics model for force generation by actin polymerization. The possible adhesions between the actin filaments and the load surface, as well as the nucleation and capping of filament tips, are included in this model on top of the well-known elastic Brownian ratchet formulation. A closed-form solution is provided from which the force-velocity relationship, summarizing the mechanics of polymerization, can be drawn. Model predictions of the velocity of moving beads driven by actin polymerization are consistent with experimental observations. This model also seems capable of explaining the enhanced actin-based motility of Listeria monocytogenes and of beads in the presence of Vasodilator-stimulated phosphoprotein, as observed in recent experiments.
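    The elastic Brownian ratchet that this model extends already gives a closed-form force-velocity relation of the form v(F) = v_free · exp(−F·δ / k_B·T), where δ is the filament elongation per added monomer (about 2.7 nm for actin). A sketch with illustrative numbers; the paper's full model adds adhesion and tip nucleation/capping on top of this:

```python
import math

def ratchet_velocity(force_pn, v_free=100.0, delta_nm=2.7, kbt_pn_nm=4.1):
    """Brownian-ratchet force-velocity relation v(F) = v_free*exp(-F*delta/kBT).

    force_pn   : opposing load in piconewtons
    v_free     : unloaded polymerization velocity (here nm/s, illustrative)
    delta_nm   : elongation per monomer (~2.7 nm for actin)
    kbt_pn_nm  : thermal energy, ~4.1 pN*nm at room temperature
    """
    return v_free * math.exp(-force_pn * delta_nm / kbt_pn_nm)

v0 = ratchet_velocity(0.0)   # zero load recovers the free velocity
v1 = ratchet_velocity(1.0)   # exponentially slower against a 1 pN load
```

The exponential decay of velocity with load is the signature the closed-form solution in the paper generalizes.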

  9. Model based monitoring of stormwater runoff quality

    DEFF Research Database (Denmark)

    Birch, Heidi; Vezzaro, Luca; Mikkelsen, Peter Steen

    2012-01-01

    Monitoring of micropollutants (MP) in stormwater is essential to evaluate the impacts of stormwater on the receiving aquatic environment. The aim of this study was to investigate how different strategies for monitoring of stormwater quality (combination of model with field sampling) affect the information obtained about MPs discharged from the monitored system. A dynamic stormwater quality model was calibrated using MP data collected by volume-proportional and passive sampling in a storm drainage system in the outskirts of Copenhagen (Denmark), and a 10-year rain series was used to find annual ... ... for calibration of the model resulted in the same predicted level but narrower model prediction bounds than calibrations based on volume-proportional samples, allowing a better exploitation of the resources allocated for stormwater quality management.

  10. Image-based modelling of organogenesis.

    Science.gov (United States)

    Iber, Dagmar; Karimaddini, Zahra; Ünal, Erkan

    2016-07-01

    One of the major challenges in biology concerns the integration of data across length and time scales into a consistent framework: how do macroscopic properties and functionalities arise from the molecular regulatory networks, and how can they change as a result of mutations? Morphogenesis provides an excellent model system to study how simple molecular networks robustly control complex processes on the macroscopic scale despite molecular noise, and how important functional variants can emerge from small genetic changes. Recent advancements in three-dimensional imaging technologies, computer algorithms and computer power now allow us to develop and analyse increasingly realistic models of biological control. Here, we present our pipeline for image-based modelling that includes the segmentation of images, the determination of displacement fields and the solution of systems of partial differential equations on the growing, embryonic domains. The development of suitable mathematical models, the data-based inference of parameter sets and the evaluation of competing models are still challenging, and current approaches are discussed.
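    To illustrate the last stage of such a pipeline: pure diffusion on a uniformly growing 1-D domain can be mapped to a fixed unit interval, which turns the growth into a dilution term, dc/dt = D/L(t)² · d²c/dx² − (L'/L)·c. The explicit solver below is a deliberately minimal sketch; the actual pipeline solves full reaction-diffusion systems on segmented 3-D domains with far more robust numerics:

```python
import numpy as np

def diffuse_on_growing_domain(c, D, growth_rate, dt, steps, L0=1.0):
    """Explicit finite differences for diffusion on a uniformly growing
    1-D domain mapped to fixed coordinates x in [0, 1]:
        dc/dt = D / L(t)**2 * d2c/dx2 - growth_rate * c
    with L(t) = L0 * exp(growth_rate * t); the last term is dilution by
    growth. Zero-flux boundaries at both ends."""
    c = c.copy()
    n = len(c)
    dx = 1.0 / (n - 1)
    L = L0
    for _ in range(steps):
        lap = np.zeros_like(c)
        lap[1:-1] = (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2
        lap[0] = 2 * (c[1] - c[0]) / dx**2        # zero-flux end
        lap[-1] = 2 * (c[-2] - c[-1]) / dx**2     # zero-flux end
        c += dt * (D / L**2 * lap - growth_rate * c)
        L *= np.exp(growth_rate * dt)             # domain keeps growing
    return c

c0 = np.zeros(51); c0[25] = 1.0                   # point source mid-domain
c = diffuse_on_growing_domain(c0, D=1e-3, growth_rate=0.1, dt=0.01, steps=100)
```

The time step is chosen well inside the explicit stability limit dt ≤ dx²L²/(2D); a production solver would use implicit time stepping instead.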

  11. Model-based Tomographic Reconstruction Literature Search

    Energy Technology Data Exchange (ETDEWEB)

    Chambers, D H; Lehman, S K

    2005-11-30

    In the process of preparing a proposal for internal research funding, a literature search was conducted on the subject of model-based tomographic reconstruction (MBTR). The purpose of the search was to ensure that the proposed research would not replicate any previous work. We found that the overwhelming majority of work on MBTR which used parameterized models of the object was theoretical in nature. Only three researchers had applied the technique to actual data. In this note, we summarize the findings of the literature search.

  12. Soft sensor modeling based on Gaussian processes

    Institute of Scientific and Technical Information of China (English)

    XIONG Zhi-hua; HUANG Guo-hong; SHAO Hui-he

    2005-01-01

    In order to meet the demand of online optimal running, a novel soft sensor modeling approach based on Gaussian processes was proposed. The approach is moderately simple to implement and use without loss of performance. It is trained by optimizing the hyperparameters using the scaled conjugate gradient algorithm, with the squared exponential covariance function employed. Experimental simulations on a real-world example from a refinery demonstrate the advantages of the soft sensor modeling approach. Meanwhile, the method opens new possibilities for the application of kernel methods to potential fields.
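    A minimal numpy version of GP regression with the squared exponential covariance conveys the idea. Hyperparameters are fixed here for brevity, whereas the paper optimizes them with scaled conjugate gradients; the one-dimensional toy input stands in for real process variables:

```python
import numpy as np

def se_kernel(X1, X2, length=1.0, signal=1.0):
    """Squared exponential covariance k(x, x') = s^2 exp(-(x-x')^2 / 2l^2)."""
    d2 = (X1[:, None] - X2[None, :]) ** 2
    return signal**2 * np.exp(-0.5 * d2 / length**2)

def gp_predict(X, y, Xs, noise=1e-2, length=1.0, signal=1.0):
    """GP posterior mean and variance at test inputs Xs (1-D inputs)."""
    K = se_kernel(X, X, length, signal) + noise * np.eye(len(X))
    Ks = se_kernel(Xs, X, length, signal)
    Kss = se_kernel(Xs, Xs, length, signal)
    L = np.linalg.cholesky(K)                    # stable inversion of K
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = Ks @ alpha
    v = np.linalg.solve(L, Ks.T)
    var = np.diag(Kss) - np.sum(v**2, axis=0)    # posterior variance
    return mean, var

# Toy "soft sensor": infer a hard-to-measure quality variable from one input
X = np.linspace(0.0, 5.0, 20)
y = np.sin(X)
mean, var = gp_predict(X, y, np.array([2.5]))
```

The posterior variance is what makes GPs attractive for soft sensing: the model reports how trustworthy each inferred value is.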

  13. History-based trust negotiation model

    Institute of Scientific and Technical Information of China (English)

    ZHAO Yi-zhu; ZHAO Yan-hua; LU Hong-wei

    2009-01-01

    Trust negotiation (TN) is an approach to establishing trust between strangers through the iterative disclosure of digital credentials. Speeding up subsequent negotiations between the same negotiators is a problem worthy of research. This paper introduces the concept of a visiting card and presents a history-based trust negotiation (HBTN) model. HBTN creates an account for a counterpart at the first negotiation and records the valid credentials that the counterpart disclosed during each trust negotiation in its historical information base (HIB). In the following negotiations, neither party needs to disclose those credentials again. HBTN speeds up subsequent negotiations between entities that interact with each other frequently without impairing privacy preservation.

  14. Model-Based Trace-Checking

    CERN Document Server

    Howard, Y; Gravell, A; Ferreira, C; Augusto, J C

    2011-01-01

    Trace analysis can be a useful way to discover problems in a program under test. Rather than writing a special purpose trace analysis tool, this paper proposes that traces can usefully be analysed by checking them against a formal model using a standard model-checker or else an animator for executable specifications. These techniques are illustrated using a Travel Agent case study implemented in J2EE. We added trace beans to this code that write trace information to a database. The traces are then extracted and converted into a form suitable for analysis by Spin, a popular model-checker, and Pro-B, a model-checker and animator for the B notation. This illustrates the technique, and also the fact that such a system can have a variety of models, in different notations, that capture different features. These experiments have demonstrated that model-based trace-checking is feasible. Future work is focussed on scaling up the approach to larger systems by increasing the level of automation.

  15. Agent Based Model of Livestock Movements

    Science.gov (United States)

    Miron, D. J.; Emelyanova, I. V.; Donald, G. E.; Garner, G. M.

    The modelling of livestock movements within Australia is of national importance for the purposes of the management and control of exotic disease spread, infrastructure development and the economic forecasting of livestock markets. In this paper an agent-based model for the forecasting of livestock movements is presented. It models livestock movements from farm to farm through a saleyard. The decision of farmers to sell or buy cattle is often complex and involves many factors such as the climate forecast, commodity prices, the type of farm enterprise, the number of animals available and associated off-shore effects. In this model the farm agent's intelligence is implemented using a fuzzy decision tree that utilises two of these factors: the livestock price fetched at the last sale and the number of stock on the farm. On each iteration of the model, farms choose either to buy, sell or abstain from the market, thus creating an artificial supply and demand. The buyers and sellers then congregate at the saleyard, where livestock are auctioned using a second-price sealed bid. The price time series output by the model exhibits properties similar to those found in real livestock markets.
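    The saleyard's clearing mechanism, a second-price sealed bid (Vickrey auction), is easy to state precisely. A sketch with illustrative bid values; the agents' bid formation via the fuzzy decision tree is not reproduced here:

```python
def second_price_auction(bids):
    """Second-price sealed-bid (Vickrey) auction: the highest bidder wins
    but pays the second-highest bid.
    bids: dict mapping bidder id -> bid. Returns (winner, price)."""
    if len(bids) < 2:
        raise ValueError("need at least two bids")
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    price = ranked[1][1]        # winner pays the runner-up's bid
    return winner, price

# Three farm agents bid on a pen of cattle (illustrative numbers):
winner, price = second_price_auction({"farmA": 920, "farmB": 870, "farmC": 850})
# farmA wins and pays 870
```

A second-price rule makes truthful bidding a dominant strategy, which simplifies the farm agents' decision logic in the simulation.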

  16. Agent-based modeling in ecological economics.

    Science.gov (United States)

    Heckbert, Scott; Baynes, Tim; Reeson, Andrew

    2010-01-01

    Interconnected social and environmental systems are the domain of ecological economics, and models can be used to explore feedbacks and adaptations inherent in these systems. Agent-based modeling (ABM) represents autonomous entities, each with dynamic behavior and heterogeneous characteristics. Agents interact with each other and their environment, resulting in emergent outcomes at the macroscale that can be used to quantitatively analyze complex systems. ABM is contributing to research questions in ecological economics in the areas of natural resource management and land-use change, urban systems modeling, market dynamics, changes in consumer attitudes, innovation, and diffusion of technology and management practices, commons dilemmas and self-governance, and psychological aspects to human decision making and behavior change. Frontiers for ABM research in ecological economics involve advancing the empirical calibration and validation of models through mixed methods, including surveys, interviews, participatory modeling, and, notably, experimental economics to test specific decision-making hypotheses. Linking ABM with other modeling techniques at the level of emergent properties will further advance efforts to understand dynamics of social-environmental systems.

  17. Individual eye model based on wavefront aberration

    Science.gov (United States)

    Guo, Huanqing; Wang, Zhaoqi; Zhao, Qiuling; Quan, Wei; Wang, Yan

    2005-03-01

    Based on the widely used Gullstrand-Le Grand eye model, an individual human eye model has been established here, which has individual corneal data, anterior chamber depth and eyeball depth. Most importantly, the wavefront aberration calculated from the individual eye model is equal to the eye's wavefront aberration measured with a Hartmann-Shack wavefront sensor. There are four main steps to build the model. First, a corneal topography instrument was used to measure the corneal surfaces and depth, and in order to input the cornea into the optical model, a high-order aspheric surface (a Zernike Fringe Sag surface) was chosen to fit the corneal surfaces. Second, a Hartmann-Shack wavefront sensor, which offers the Zernike polynomials to describe the wavefront aberration, was built to measure the wavefront aberration of the eye. Third, the eye's axial lengths between every part were measured with A-ultrasonic technology. Then the data were input into the optical design software ZEMAX and the crystalline lens's shape was optimized with the aberration as the merit function. The individual eye model, which has the same wavefront aberrations as the real eye, is thus established.
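    The Hartmann-Shack sensor reports the wavefront as a sum of Zernike polynomials over the unit pupil. A sketch evaluating two standard low-order terms (Noll-normalized defocus and astigmatism); the coefficient values are illustrative, and a full eye model sums many more terms:

```python
import math

def wavefront(rho, theta, c_defocus=0.0, c_astig=0.0):
    """Wavefront error (same units as the coefficients, e.g. microns)
    from two low-order Zernike terms on the unit pupil (0 <= rho <= 1):
      defocus      Z(2, 0) = sqrt(3) * (2*rho**2 - 1)
      astigmatism  Z(2, 2) = sqrt(6) * rho**2 * cos(2*theta)
    """
    z20 = math.sqrt(3.0) * (2.0 * rho**2 - 1.0)
    z22 = math.sqrt(6.0) * rho**2 * math.cos(2.0 * theta)
    return c_defocus * z20 + c_astig * z22

w_edge = wavefront(1.0, 0.0, c_defocus=0.5)   # pupil edge, pure defocus
```

In the paper's pipeline, coefficients like `c_defocus` come from the sensor measurement, and ZEMAX adjusts the lens surfaces until the model reproduces them.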

  18. Model based risk assessment - the CORAS framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune; Thunem, Atoosa P-J.

    2004-04-15

    Traditional risk analysis and assessment is based on failure-oriented models of the system. In contrast, model-based risk assessment (MBRA) utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The target models are then used as input sources for complementary risk analysis and assessment techniques, as well as a basis for the documentation of the assessment results. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tested with successful outcomes through a series of seven trials within the telemedicine and e-commerce areas. The CORAS project in general, and the CORAS application of MBRA in particular, have contributed positively to the visibility of model-based risk assessment and thus to the disclosure of several potentials for further exploitation of various aspects within this important research field. In that connection, the CORAS methodology's possibilities for further improvement towards utilization in more complex architectures, and also in other application domains such as the nuclear field, can be addressed. The latter calls for adapting the framework to address nuclear standards such as IEC 60880 and IEC 61513. For this development we recommend applying a trial-driven approach within the nuclear field. The tool-supported approach for combining risk analysis and system development also fits well with the HRP proposal for developing an Integrated Design Environment (IDE) providing efficient methods and tools to support control room systems design. (Author)

  19. Business Models for NFC Based Mobile Payments

    DEFF Research Database (Denmark)

    Chae, Johannes Sang-Un; Hedman, Jonas

    2015-01-01

    Purpose: The purpose of the paper is to develop a business model framework for NFC based mobile payment solutions consisting of four mutually interdependent components: the value service, value network, value architecture, and value finance. Design: Using a comparative case study method, the paper investigates Google Wallet and ISIS Mobile Wallet and their underlying business models. Findings: Google Wallet and ISIS Mobile Wallet are focusing on providing an enhanced customer experience with their mobile wallet through a multifaceted value proposition. The delivery of this offering requires cooperation from multiple stakeholders and the creation of an ecosystem. Furthermore, they focus on the scalability of their value propositions. Originality / value: The paper offers an applicable business model framework that allows practitioners and academics to study current and future mobile payment approaches.

  20. On Reading-Based Writing Instruction Model

    Institute of Scientific and Technical Information of China (English)

    李大艳; 王建安

    2012-01-01

    English writing is a complex, integrative process drawing on comprehensive skills. Many students are still unable to write a coherent English paragraph after having learned English for many years at school. Helping college students improve their writing competence is a great challenge facing English teaching in China. Research on writing instruction abroad has flourished; in China, however, research in this field lags far behind, so there is a great need for a more efficient writing instruction model suited to the Chinese context. Enlightened by Krashen's input hypothesis and Swain's output hypothesis, the writer puts forward a Reading-Based Writing Instruction Model. This paper discusses the effectiveness of this model from different perspectives.

  1. Realistic face modeling based on multiple deformations

    Institute of Scientific and Technical Information of China (English)

    GONG Xun; WANG Guo-yin

    2007-01-01

    On the basis of the assumption that the human face belongs to a linear class, a multiple-deformation model is proposed to recover face shape from a few points on a single 2D image. Compared with conventional methods, this study has the following advantages. First, the proposed modified 3D sparse deforming model is a noniterative approach that can compute global translation efficiently and accurately. Second, the overfitting problem can be alleviated based on the proposed multiple-deformation model. Finally, by keeping the main features, the generated texture is realistic. Comparison results on ground truth data show that this novel method outperforms existing methods and that realistic 3D faces can be recovered efficiently from a single photograph.

  2. Business Models for NFC based mobile payments

    Directory of Open Access Journals (Sweden)

    Johannes Sang Un Chae

    2015-01-01

    Full Text Available Purpose: The purpose of the paper is to develop a business model framework for NFC based mobile payment solutions consisting of four mutually interdependent components: the value service, value network, value architecture, and value finance. Design: Using a comparative case study method, the paper investigates Google Wallet and ISIS Mobile Wallet and their underlying business models. Findings: Google Wallet and ISIS Mobile Wallet are focusing on providing an enhanced customer experience with their mobile wallet through a multifaceted value proposition. The delivery of its offering requires cooperation from multiple stakeholders and the creation of an ecosystem. Furthermore, they focus on the scalability of their value propositions. Originality / value: The paper offers an applicable business model framework that allows practitioners and academics to study current and future mobile payment approaches.

  3. Inference-based procedural modeling of solids

    KAUST Repository

    Biggers, Keith

    2011-11-01

    As virtual environments become larger and more complex, there is an increasing need for more automated construction algorithms to support the development process. We present an approach for modeling solids by combining prior examples with a simple sketch. Our algorithm uses an inference-based approach to incrementally fit patches together in a consistent fashion to define the boundary of an object. This algorithm samples and extracts surface patches from input models, and develops a Petri net structure that describes the relationship between patches along an imposed parameterization. Then, given a new parameterized line or curve, we use the Petri net to logically fit patches together in a manner consistent with the input model. This allows us to easily construct objects of varying sizes and configurations using arbitrary articulation, repetition, and interchanging of parts. The result of our process is a solid model representation of the constructed object that can be integrated into a simulation-based environment. © 2011 Elsevier Ltd. All rights reserved.

  4. Concept Tree Based Information Retrieval Model

    Directory of Open Access Journals (Sweden)

    Chunyan Yuan

    2014-05-01

    Full Text Available This paper proposes a novel concept-based query expansion technique named the Markov concept tree model (MCTM), which discovers term relationships through a concept tree deduced from a term Markov network. We address two important issues for query expansion: the selection and the weighting of expansion search terms. In contrast to earlier methods, queries are expanded by adding those terms that are most similar to the concept of the query, rather than by selecting terms that are similar to a single query term. Utilizing a Markov network constructed according to the co-occurrence information of the terms in the collection, the method generates a concept tree for each original query term, removes the redundant and irrelevant nodes in the concept tree, then adjusts the weight of the original query and the weight of the expansion terms based on a pruning algorithm. We use this model for query expansion and evaluate its effectiveness by examining the accuracy and robustness of the expansion methods. Compared with the baseline model, experiments on a standard dataset reveal that this method achieves better query quality.
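    The co-occurrence statistics that drive the Markov network can be sketched directly: count how often term pairs share a document, then expand a query term with its strongest co-occurring neighbours. This toy version omits the concept tree, pruning and weighting stages; the documents and function names are illustrative:

```python
from collections import Counter
from itertools import combinations

def cooccurrence_weights(docs):
    """Count, for every unordered term pair, the number of documents in
    which both terms appear. docs: list of whitespace-tokenized strings."""
    pair_counts = Counter()
    for doc in docs:
        terms = set(doc.split())
        for a, b in combinations(sorted(terms), 2):
            pair_counts[(a, b)] += 1
    return pair_counts

def expand_query(term, docs, k=2):
    """Return the k terms that most often co-occur with the query term."""
    scored = Counter()
    for (a, b), c in cooccurrence_weights(docs).items():
        if a == term:
            scored[b] += c
        elif b == term:
            scored[a] += c
    return [t for t, _ in scored.most_common(k)]

docs = ["markov network model", "markov concept tree", "query expansion model"]
expanded = expand_query("markov", docs, k=4)
```

The MCTM improves on exactly this kind of single-term expansion by scoring candidates against the concept of the whole query rather than one term at a time.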

  5. Néron models and base change

    CERN Document Server

    Halle, Lars Halvard

    2016-01-01

    Presenting the first systematic treatment of the behavior of Néron models under ramified base change, this book can be read as an introduction to various subtle invariants and constructions related to Néron models of semi-abelian varieties, motivated by concrete research problems and complemented with explicit examples. Néron models of abelian and semi-abelian varieties have become an indispensable tool in algebraic and arithmetic geometry since Néron introduced them in his seminal 1964 paper. Applications range from the theory of heights in Diophantine geometry to Hodge theory. We focus specifically on Néron component groups, Edixhoven’s filtration and the base change conductor of Chai and Yu, and we study these invariants using various techniques such as models of curves, sheaves on Grothendieck sites and non-archimedean uniformization. We then apply our results to the study of motivic zeta functions of abelian varieties. The final chapter contains a list of challenging open questions. This book is a...

  6. Model-based target and background characterization

    Science.gov (United States)

    Mueller, Markus; Krueger, Wolfgang; Heinze, Norbert

    2000-07-01

    Up to now, most approaches to target and background characterization (and exploitation) concentrate solely on the information given by pixels. In many cases this is a complex and unprofitable task. During the development of automatic exploitation algorithms the main goal is the optimization of certain performance parameters. These parameters are measured during test runs while applying one algorithm with one parameter set to images that consist of image domains with very different domain characteristics (targets and various types of background clutter). Model-based geocoding and registration approaches provide means for utilizing the information stored in GIS (Geographical Information Systems). The geographical information stored in the various GIS layers can define ROE (Regions of Expectations) and may allow for dedicated algorithm parametrization and development. ROI (Region of Interest) detection algorithms (in most cases MMO (Man-Made Object) detection) use implicit target and/or background models. The detection algorithms for ROIs utilize gradient direction models that have to be matched with transformed image domain data. In most cases simple threshold calculations on the match results discriminate target object signatures from the background. The geocoding approaches extract line-like structures (street signatures) from the image domain and match the graph constellation against a vector model extracted from a GIS (Geographical Information System) database. Apart from geocoding, the algorithms can also be used for image-to-image registration (multi-sensor and data fusion) and may be used for the creation and validation of geographical maps.

  7. Model-based vision for car following

    Science.gov (United States)

    Schneiderman, Henry; Nashman, Marilyn; Lumia, Ronald

    1993-08-01

    This paper describes a vision processing algorithm that supports autonomous car following. The algorithm visually tracks the position of a 'lead vehicle' from the vantage of a pursuing 'chase vehicle'. The algorithm requires a 2-D model of the back of the lead vehicle. This model is composed of line segments corresponding to features that give rise to strong edges. There are seven sequential stages of computation: (1) Extracting edge points; (2) Associating extracted edge points with the model features; (3) Determining the position of each model feature; (4) Determining the model position; (5) Updating the motion model of the object; (6) Predicting the position of the object in the next image; (7) Predicting the location of all object features from the predicted object position. All processing is confined to the 2-D image plane. The 2-D model location computed in this processing is used to determine the position of the lead vehicle with respect to a 3-D coordinate frame affixed to the chase vehicle. This algorithm has been used as part of a complete system to drive an autonomous vehicle, a High Mobility Multipurpose Wheeled Vehicle (HMMWV), such that it follows a lead vehicle at speeds up to 35 km/hr. The algorithm runs at an update rate of 15 Hertz and has a worst case computational delay of 128 ms. The algorithm is implemented under the NASA/NBS Standard Reference Model for Telerobotic Control System Architecture (NASREM) and runs on a dedicated vision processing engine and a VME-based multiprocessor system.
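    Stages (5)-(7) amount to maintaining a simple motion model and predicting the object one frame ahead. The paper does not spell out its filter; an alpha-beta tracker is one minimal way such a predict-correct cycle can be realized, with illustrative gain values:

```python
def alpha_beta_step(pos, vel, measured_pos, dt, alpha=0.85, beta=0.3):
    """One predict-correct cycle of an alpha-beta tracker for the 2-D
    image position of the lead-vehicle model. pos, vel, measured_pos are
    (x, y) tuples; returns (new_pos, new_vel, predicted_pos)."""
    pred = tuple(p + v * dt for p, v in zip(pos, vel))        # predict ahead
    resid = tuple(m - q for m, q in zip(measured_pos, pred))  # measurement error
    new_pos = tuple(q + alpha * r for q, r in zip(pred, resid))   # correct position
    new_vel = tuple(v + beta * r / dt for v, r in zip(vel, resid))  # correct velocity
    return new_pos, new_vel, pred

# A lead vehicle drifting right at 15 px/frame is tracked without lag:
pos, vel = (100.0, 50.0), (15.0, 0.0)
for t in range(1, 4):
    pos, vel, pred = alpha_beta_step(pos, vel, (100.0 + 15.0 * t, 50.0), dt=1.0)
```

The predicted position `pred` is what stages (6)-(7) hand back to the edge-matching stages to narrow the search window in the next frame.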

  8. Model-based explanation of plant knowledge

    Energy Technology Data Exchange (ETDEWEB)

    Huuskonen, P.J. [VTT Electronics, Oulu (Finland). Embedded Software

    1997-12-31

    This thesis deals with computer explanation of knowledge related to design and operation of industrial plants. The needs for explanation are motivated through case studies and literature reviews. A general framework for analysing plant explanations is presented. Prototypes demonstrate key mechanisms for implementing parts of the framework. Power plants, steel mills, paper factories, and high energy physics control systems are studied to set requirements for explanation. The main problems are seen to be either lack or abundance of information. Design knowledge in particular is found missing at plants. Support systems and automation should be enhanced with ways to explain plant knowledge to the plant staff. A framework is formulated for analysing explanations of plant knowledge. It consists of three parts: 1. a typology of explanation, organised by the class of knowledge (factual, functional, or strategic) and by the target of explanation (processes, automation, or support systems), 2. an identification of explanation tasks generic for the plant domain, and 3. an identification of essential model types for explanation (structural, behavioural, functional, and teleological). The tasks use the models to create the explanations of the given classes. Key mechanisms are discussed to implement the generic explanation tasks. Knowledge representations based on objects and their relations form a vocabulary to model and present plant knowledge. A particular class of models, means-end models, are used to explain plant knowledge. Explanations are generated through searches in the models. Hypertext is adopted to communicate explanations over dialogue based on context. The results are demonstrated in prototypes. The VICE prototype explains the reasoning of an expert system for diagnosis of rotating machines at power plants. The Justifier prototype explains design knowledge obtained from an object-oriented plant design tool. 
Enhanced access mechanisms into on-line documentation are...

  9. Family-Based Model Checking Without a Family-Based Model Checker

    DEFF Research Database (Denmark)

    Dimovski, Aleksandar; Al-Sibahi, Ahmad Salim; Brabrand, Claus;

    2015-01-01

    be used to model-check variational models using the standard version of (single system) SPIN. The abstractions are first defined as Galois connections on semantic domains. We then show how to translate them into syntactic source-to-source transformations on variational models. This allows the use of SPIN...... with all its accumulated optimizations for efficient verification of variational models without any knowledge about variability. We demonstrate the practicality of this method on several examples using both the SNIP (family based) and SPIN (single system) model checkers....

  10. Effect of species rarity on the accuracy of species distribution models for reptiles and amphibians in southern California

    Science.gov (United States)

    Franklin, J.; Wejnert, K.E.; Hathaway, S.A.; Rochester, C.J.; Fisher, R.N.

    2009-01-01

    Aim: Several studies have found that more accurate predictive models of species' occurrences can be developed for rarer species; however, one recent study found the relationship between range size and model performance to be an artefact of sample prevalence, that is, the proportion of presence versus absence observations in the data used to train the model. We examined the effect of model type, species rarity class, species' survey frequency, detectability and manipulated sample prevalence on the accuracy of distribution models developed for 30 reptile and amphibian species. Location: Coastal southern California, USA. Methods: Classification trees, generalized additive models and generalized linear models were developed using species presence and absence data from 420 locations. Model performance was measured using sensitivity, specificity and the area under the curve (AUC) of the receiver-operating characteristic (ROC) plot based on twofold cross-validation, or on bootstrapping. Predictors included climate, terrain, soil and vegetation variables. Species were assigned to rarity classes by experts. The data were sampled to generate subsets with varying ratios of presences and absences to test for the effect of sample prevalence. Join count statistics were used to characterize spatial dependence in the prediction errors. Results: Species in classes with higher rarity were more accurately predicted than common species, and this effect was independent of sample prevalence. Although positive spatial autocorrelation remained in the prediction errors, it was weaker than was observed in the species occurrence data. The differences in accuracy among model types were slight. Main conclusions: Using a variety of modelling methods, more accurate species distribution models were developed for rarer than for more common species. This was presumably because it is difficult to discriminate suitable from unsuitable habitat for habitat generalists, and not as an artefact of the
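The accuracy measures named in this record (sensitivity, specificity, and the AUC of the ROC plot) can be computed directly from presence/absence labels and model scores. A minimal pure-Python sketch; the labels and scores below are invented, not the study's data:

```python
# Sketch of the accuracy measures used in the abstract: sensitivity,
# specificity and rank-based AUC. Pure Python; labels/scores are invented.

def confusion_rates(labels, scores, threshold=0.5):
    """Sensitivity and specificity at a fixed score threshold."""
    tp = sum(1 for y, s in zip(labels, scores) if y == 1 and s >= threshold)
    fn = sum(1 for y, s in zip(labels, scores) if y == 1 and s < threshold)
    tn = sum(1 for y, s in zip(labels, scores) if y == 0 and s < threshold)
    fp = sum(1 for y, s in zip(labels, scores) if y == 0 and s >= threshold)
    return tp / (tp + fn), tn / (tn + fp)

def auc(labels, scores):
    """AUC as the probability that a presence outscores an absence."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 0, 0, 0]               # presences and absences
scores = [0.9, 0.8, 0.4, 0.6, 0.3, 0.1]   # model output scores
sens, spec = confusion_rates(labels, scores)
print(sens, spec, auc(labels, scores))
```

The rank-based AUC form used here is equivalent to the area under the ROC curve and needs no explicit thresholds.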

  11. Grid based calibration of SWAT hydrological models

    Directory of Open Access Journals (Sweden)

    D. Gorgan

    2012-07-01

    Full Text Available The calibration and execution of large hydrological models, such as SWAT (Soil and Water Assessment Tool), developed for large areas, high resolution, and huge input data, require not only long execution times but also substantial computational resources. The SWAT hydrological model supports studies and predictions of the impact of land management practices on water, sediment, and agricultural chemical yields in complex watersheds. The paper presents the gSWAT application as a practical web-based solution for environmental specialists to calibrate extensive hydrological models and to run scenarios, by hiding the complex control of processes and heterogeneous resources across the grid-based high-performance computation infrastructure. The paper highlights the basic functionalities of the gSWAT platform, and the features of the graphical user interface. The presentation is concerned with the development of working sessions, interactive control of calibration, direct and basic editing of parameters, process monitoring, and graphical and interactive visualization of the results. The experiments performed on different SWAT models and the obtained results demonstrate the benefits brought by the grid parallel and distributed environment as a solution for the processing platform. All the instances of SWAT models used in the reported experiments have been developed through the enviroGRIDS project, targeting the Black Sea catchment area.

  12. Physics-based models of the plasmasphere

    Energy Technology Data Exchange (ETDEWEB)

    Jordanova, Vania K [Los Alamos National Laboratory]; Pierrard, Viviane [Belgium]; Goldstein, Jerry [SWRI]; André, Nicolas [ESTEC/ESA]; Kotova, Galina A [SRI, Russia]; Lemaire, Joseph F [Belgium]; Liemohn, Mike W [Univ. of Michigan]; Matsui, H [Univ. of New Hampshire]

    2008-01-01

    We describe recent progress in physics-based models of the plasmasphere using the fluid and the kinetic approaches. Global modeling of the dynamics and influence of the plasmasphere is presented. Results from global plasmasphere simulations are used to understand and quantify (i) the electric potential pattern and evolution during geomagnetic storms, and (ii) the influence of the plasmasphere on the excitation of electromagnetic ion cyclotron (EMIC) waves and precipitation of energetic ions in the inner magnetosphere. The interactions of the plasmasphere with the ionosphere and the other regions of the magnetosphere are pointed out. We show the results of simulations for the formation of the plasmapause and discuss the influence of plasmaspheric wind and of ultra low frequency (ULF) waves on the transport of plasmaspheric material. Theoretical formulations used to model the electric field and plasma distribution in the plasmasphere are given. Model predictions are compared to recent CLUSTER and IMAGE observations, but also to results of earlier models and satellite observations.

  13. Model-based Prognostics with Concurrent Damage Progression Processes

    Data.gov (United States)

    National Aeronautics and Space Administration — Model-based prognostics approaches rely on physics-based models that describe the behavior of systems and their components. These models must account for the several...

  14. A Comparison of Filter-based Approaches for Model-based Prognostics

    Data.gov (United States)

    National Aeronautics and Space Administration — Model-based prognostics approaches use domain knowledge about a system and its failure modes through the use of physics-based models. Model-based prognosis is...

  15. Flow based vs. demand based energy-water modelling

    Science.gov (United States)

    Rozos, Evangelos; Nikolopoulos, Dionysis; Efstratiadis, Andreas; Koukouvinos, Antonios; Makropoulos, Christos

    2015-04-01

    The water flow in hydro-power generation systems is often used downstream to cover other types of demand, such as irrigation and water supply. However, the typical case is that the energy demand (operation of the hydro-power plant) and the water demand do not coincide. Furthermore, the water inflow into a reservoir is a stochastic process. Things become more complicated if renewable resources (wind turbines or photovoltaic panels) are included in the system. For this reason, the assessment and optimization of the operation of hydro-power systems are challenging tasks that require computer modelling. This modelling should not only simulate the water budget of the reservoirs and the energy production/consumption (pumped storage), but should also take into account the constraints imposed by the natural or artificial water network using a flow routing algorithm. HYDRONOMEAS, for example, uses an elegant mathematical approach (digraph) to calculate the flow in a water network based on: the demands (input timeseries), the water availability (simulated) and the capacity of the transmission components (properties of channels, rivers, pipes, etc.). The input timeseries of demand should be estimated by another model and linked to the corresponding network nodes. A model that could be used to estimate these timeseries is UWOT. UWOT is a bottom-up urban water cycle model that simulates the generation, aggregation and routing of water demand signals. In this study, we explore the potential of UWOT in simulating the operation of complex hydrosystems that include energy generation. The evident advantage of this approach is the use of a single model instead of one model for the estimation of demands and another for the system simulation. An application of UWOT in a large-scale system is attempted in mainland Greece in an area extending over 130×170 km². The challenges, the peculiarities and the advantages of this approach are examined and critically discussed.

  16. Physiologically Based Pharmacokinetic (PBPK) Modeling of ...

    Science.gov (United States)

    Background: Quantitative estimation of toxicokinetic variability in the human population is a persistent challenge in risk assessment of environmental chemicals. Traditionally, inter-individual differences in the population are accounted for by default assumptions or, in rare cases, are based on human toxicokinetic data. Objectives: To evaluate the utility of genetically diverse mouse strains for estimating toxicokinetic population variability for risk assessment, using trichloroethylene (TCE) metabolism as a case study. Methods: We used data on oxidative and glutathione conjugation metabolism of TCE in 16 inbred and one hybrid mouse strains to calibrate and extend existing physiologically-based pharmacokinetic (PBPK) models. We added one-compartment models for glutathione metabolites and a two-compartment model for dichloroacetic acid (DCA). A Bayesian population analysis of inter-strain variability was used to quantify variability in TCE metabolism. Results: Concentration-time profiles for TCE metabolism to oxidative and glutathione conjugation metabolites varied across strains. Median predictions for the metabolic flux through oxidation were less variable (5-fold range) than those through glutathione conjugation (10-fold range). For oxidative metabolites, median predictions of trichloroacetic acid production were less variable (2-fold range) than those of DCA production (5-fold range), although uncertainty bounds for DCA exceeded the predicted variability. Conclusions: ...

  17. Model-based phase-shifting interferometer

    Science.gov (United States)

    Liu, Dong; Zhang, Lei; Shi, Tu; Yang, Yongying; Chong, Shiyao; Miao, Liang; Huang, Wei; Shen, Yibing; Bai, Jian

    2015-10-01

    A model-based phase-shifting interferometer (MPI) is developed, in which a novel calculation technique is proposed instead of the traditional complicated system structure, to achieve versatile, high-precision and quantitative surface tests. In the MPI, a partial null lens (PNL) is employed to implement the non-null test. With some alternative PNLs, similar to the transmission spheres in ZYGO interferometers, the MPI provides a flexible test for general spherical and aspherical surfaces. Based on modern computer modeling techniques, a reverse iterative optimizing reconstruction (ROR) method is employed for the retrace error correction of the non-null test, as well as figure error reconstruction. A self-compiled ray-tracing program is set up for accurate system modeling and reverse ray tracing. The surface figure error can then be easily extracted from the wavefront data in the form of Zernike polynomials by the ROR method. Experiments on spherical and aspherical tests are presented to validate the flexibility and accuracy. The test results are compared with those of a Zygo interferometer (null tests), which demonstrates the high accuracy of the MPI. With such accuracy and flexibility, the MPI possesses great potential in modern optical shop testing.

  18. Cosmic emergy based ecological systems modelling

    Science.gov (United States)

    Chen, H.; Chen, G. Q.; Ji, X.

    2010-09-01

    Ecological systems modelling based on the unified biophysical measure of cosmic emergy in terms of embodied cosmic exergy is illustrated in this paper with ecological accounting, simulation and scenario analysis, by a case study for the regional socio-economic ecosystem associated with the municipality of Beijing. An urbanized regional ecosystem model with eight subsystems of natural support, agriculture, urban production, population, finance, land area, potential environmental impact, and culture is representatively presented in exergy circuit language with 12 state variables governing by corresponding ecodynamic equations, and 60 flows and auxiliary variables. To characterize the regional socio-economy as an ecosystem, a series of ecological indicators based on cosmic emergy are devised. For a systematic ecological account, cosmic exergy transformities are provided for various dimensions including climate flows, natural resources, industrial products, cultural products, population with educational hierarchy, and environmental emissions. For the urban ecosystem of Beijing in the period from 1990 to 2005, ecological accounting is carried out and characterized in full details. Taking 2000 as the starting point, systems modelling is realized to predict the urban evolution in a one hundred time horizon. For systems regulation, scenario analyses with essential policy-making implications are made to illustrate the long term systems effects of the expected water diversion and rise in energy price.

  19. Rainwater harvesting: model-based design evaluation.

    Science.gov (United States)

    Ward, S; Memon, F A; Butler, D

    2010-01-01

    The rate of uptake of rainwater harvesting (RWH) in the UK has been slow to date, but is expected to gain momentum in the near future. The designs of two different new-build rainwater harvesting systems, based on simple methods, are evaluated using three different design methods, including a continuous simulation modelling approach. The RWH systems are shown to fulfill 36% and 46% of WC demand. Financial analyses reveal that RWH systems within large commercial buildings may be more financially viable than smaller domestic systems. It is identified that design methods based on simple approaches generate tank sizes substantially larger than the continuous simulation. Comparison of the actual tank sizes and those calculated using continuous simulation established that the tanks installed are oversized for their associated demand level and catchment size. Oversizing tanks can lead to excessive system capital costs, which currently hinders the uptake of systems. Furthermore, it is demonstrated that the catchment area size is often overlooked when designing UK-based RWH systems. With respect to these findings, a recommendation for a transition from the use of simple tools to continuous simulation models is made.
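The continuous-simulation approach this record recommends for tank sizing reduces to a daily water balance. A hedged sketch of a yield-after-spillage (YAS) style balance; the rainfall series, roof area, demand, tank size, and runoff coefficient below are invented, not values from the study:

```python
# Hedged sketch of a continuous-simulation tank sizing run: daily
# yield-after-spillage (YAS) water balance. All inputs are invented.

def simulate_rwh(rain_mm, roof_m2, demand_m3, tank_m3, runoff_coeff=0.85):
    """Return water-saving efficiency: volume supplied / total demand."""
    storage = supplied = 0.0
    for r in rain_mm:
        inflow = r / 1000.0 * roof_m2 * runoff_coeff  # m3 captured today
        storage = min(storage + inflow, tank_m3)      # spill first (YAS)
        met = min(demand_m3, storage)                 # then meet demand
        storage -= met
        supplied += met
    return supplied / (demand_m3 * len(rain_mm))

rain = [0, 5, 0, 12, 3, 0, 0, 8, 20, 0]   # mm/day, invented series
eff = simulate_rwh(rain, roof_m2=100.0, demand_m3=0.3, tank_m3=2.0)
print(round(eff, 3))
```

Running the same balance over a long rainfall record for a range of tank sizes is exactly how continuous simulation exposes the oversizing that the simple methods produce.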

  20. Human physiologically based pharmacokinetic model for propofol

    Directory of Open Access Journals (Sweden)

    Schnider Thomas W

    2005-04-01

    Full Text Available Abstract Background Propofol is widely used for both short-term anesthesia and long-term sedation. It has unusual pharmacokinetics because of its high lipid solubility. The standard approach to describing the pharmacokinetics is by a multi-compartmental model. This paper presents the first detailed human physiologically based pharmacokinetic (PBPK) model for propofol. Methods PKQuest, a freely distributed software routine http://www.pkquest.com, was used for all the calculations. The "standard human" PBPK parameters developed in previous applications are used. It is assumed that the blood and tissue binding is determined by simple partition into the tissue lipid, which is characterized by two previously determined sets of parameters: (1) the value of the propofol oil/water partition coefficient; (2) the lipid fraction in the blood and tissues. The model was fit to the individual experimental data of Schnider et al. (Anesthesiology, 1998; 88:1170), in which an initial bolus dose was followed 60 minutes later by a one-hour constant infusion. Results The PBPK model provides a good description of the experimental data over a large range of input dosage, subject age and fat fraction. Only one adjustable parameter (the liver clearance) is required to describe the constant infusion phase for each individual subject. In order to fit the bolus injection phase, for 10 of the 24 subjects it was necessary to assume that a fraction of the bolus dose was sequestered and then slowly released from the lungs (characterized by two additional parameters). The average weighted residual error (WRE) of the PBPK model fit to both the bolus and infusion phases was 15%; similar to the WRE for just the constant infusion phase obtained by Schnider et al. using a 6-parameter NONMEM compartmental model. Conclusion A PBPK model using standard human parameters and a simple description of tissue binding provides a good description of human propofol kinetics.
The major advantage of a...
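As a loose illustration of the bolus-plus-infusion dosing pattern described above (NOT the paper's PBPK model, which has many tissue compartments), a one-compartment model with first-order elimination can be stepped with Euler integration. The dose, volume of distribution, and clearance below are invented example values:

```python
# One-compartment sketch (not the paper's PBPK model) of a bolus followed
# by a constant infusion starting at t = 60 min; first-order elimination,
# Euler integration. Dose, volume and clearance are invented.

def simulate_conc(bolus_mg, infusion_mg_min, vd_l, cl_l_min,
                  t_end_min, dt=0.1, infusion_start=60.0):
    amount = bolus_mg                      # drug in the compartment, mg
    series = []
    for i in range(round(t_end_min / dt)):
        rate_in = infusion_mg_min if i * dt >= infusion_start else 0.0
        elim = cl_l_min * (amount / vd_l)  # first-order elimination, mg/min
        amount += (rate_in - elim) * dt
        series.append(amount / vd_l)       # plasma concentration, mg/L
    return series

conc = simulate_conc(bolus_mg=140.0, infusion_mg_min=10.0,
                     vd_l=30.0, cl_l_min=1.8, t_end_min=120.0)
print(round(conc[0], 2), round(conc[-1], 2))
```

The concentration decays exponentially after the bolus and then climbs toward the infusion steady state, the qualitative shape the paper's far richer model has to reproduce per subject.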

  1. Trip Generation Model Based on Destination Attractiveness

    Institute of Scientific and Technical Information of China (English)

    YAO Liya; GUAN Hongzhi; YAN Hai

    2008-01-01

    Traditional trip generation forecasting methods use unified average trip generation rates to determine trip generation volumes in various traffic zones without considering the individual characteristics of each traffic zone. Therefore, the results can have significant errors. To reduce the forecasting error produced by uniform trip generation rates for different traffic zones, the behavior of each traveler was studied instead of the characteristics of the traffic zone. This paper gives a method for calculating the trip efficiency and the effect of traffic zones combined with a destination selection model based on disaggregate theory for trip generation. Beijing data is used with the trip generation method to predict trip volumes. The results show that the disaggregate model in this paper is more accurate than the traditional method. An analysis of the factors influencing traveler behavior and destination selection shows that the attractiveness of the traffic zone strongly affects the trip generation volume.
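The disaggregate destination-choice step this record describes can be sketched as a multinomial logit in which a zone's utility rises with its attractiveness and falls with travel cost. The coefficients and zone data below are invented, not those of the Beijing study:

```python
# Sketch of a disaggregate destination-choice model: multinomial logit
# over traffic zones. Coefficients and zone data are invented.

import math

def destination_probabilities(zones, beta_attr=1.0, beta_cost=-0.3):
    """P(zone) proportional to exp(utility)."""
    utils = {z: beta_attr * a + beta_cost * c for z, (a, c) in zones.items()}
    denom = sum(math.exp(u) for u in utils.values())
    return {z: math.exp(u) / denom for z, u in utils.items()}

zones = {                        # zone: (attractiveness index, travel cost)
    "CBD":      (3.0, 6.0),
    "suburb A": (1.5, 2.0),
    "suburb B": (1.0, 4.0),
}
probs = destination_probabilities(zones)
print(max(probs, key=probs.get))
```

Summing each zone's choice probabilities over all travelers, rather than applying one average rate per zone, is what lets the disaggregate approach reflect zone attractiveness.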

  2. CNEM: Cluster Based Network Evolution Model

    Directory of Open Access Journals (Sweden)

    Sarwat Nizamani

    2015-01-01

    Full Text Available This paper presents a network evolution model based on a clustering approach. The proposed approach depicts network evolution, demonstrating network formation from individual nodes to a fully evolved network. An agglomerative hierarchical clustering method is applied for the evolution of the network. In the paper, we present three case studies which show the evolution of networks from scratch. These case studies include: the terrorist network of the 9/11 incidents, the terrorist network of the WMD (Weapons of Mass Destruction) plot against France, and a network of tweets discussing a topic. The network of 9/11 is also used for evaluation, using other social network analysis methods, which show that the clusters created using the proposed model of network evolution are of good quality; thus the proposed method can be used by law enforcement agencies to further investigate criminal networks.
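The agglomerative hierarchical clustering step at the heart of the proposed model can be illustrated with a minimal single-linkage implementation; the node names and pairwise distances below are invented:

```python
# Minimal single-linkage agglomerative clustering, the class of method the
# abstract applies to grow a network from individual nodes. Data invented.

def single_linkage(items, dist, k):
    """Merge the two closest clusters until only k clusters remain."""
    clusters = [{x} for x in items]
    while len(clusters) > k:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(dist[a][b] for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] |= clusters.pop(j)   # j > i, so index i stays valid
    return clusters

nodes = ["a", "b", "c", "d"]
dist = {"a": {"b": 1, "c": 4, "d": 5},
        "b": {"a": 1, "c": 3, "d": 6},
        "c": {"a": 4, "b": 3, "d": 2},
        "d": {"a": 5, "b": 6, "c": 2}}
clusters = sorted(sorted(c) for c in single_linkage(nodes, dist, 2))
print(clusters)
```

Recording the sequence of merges (rather than only the final partition) yields the evolution from isolated nodes to the fully formed network that the abstract describes.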

  3. Cloth Modeling Based on Particle System

    Institute of Scientific and Technical Information of China (English)

    钟跃崎; 王善元

    2001-01-01

    A physics-based particle system is employed for cloth modeling, supported by two basic algorithms: one is the construction of the internal and external forces acting on the particle system in terms of KES-F bending and shearing tests, and the other is the collision algorithm, in which collision detection is carried out by means of bisection of the time step and the collision response is handled according to the empirical law for frictionless collision. With these algorithms, the geometric state of particles can be expressed as ordinary differential equations, which are numerically solved by fourth-order Runge-Kutta integration. Different draping figures of cotton fabric and wool fabric prove that such a particle system is suitable for 3D cloth modeling and simulation.
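The fourth-order Runge-Kutta integration mentioned in this record can be sketched on a single damped spring "particle" standing in for the cloth particle system; the stiffness and damping values below are invented:

```python
# Sketch of the fourth-order Runge-Kutta step mentioned in the abstract,
# applied to one damped spring "particle" (a stand-in for the cloth
# particle system; stiffness and damping values are invented).

def rk4_step(f, state, t, h):
    k1 = f(t, state)
    k2 = f(t + h / 2, [s + h / 2 * k for s, k in zip(state, k1)])
    k3 = f(t + h / 2, [s + h / 2 * k for s, k in zip(state, k2)])
    k4 = f(t + h,     [s + h * k for s, k in zip(state, k3)])
    return [s + h / 6 * (a + 2 * b + 2 * c + d)
            for s, a, b, c, d in zip(state, k1, k2, k3, k4)]

def damped_spring(t, state, k=40.0, c=0.8, m=1.0):
    x, v = state
    return [v, (-k * x - c * v) / m]      # dx/dt, dv/dt

state = [1.0, 0.0]                        # stretched 1 unit, at rest
for step in range(1000):                  # 10 s of simulated time
    state = rk4_step(damped_spring, state, step * 0.01, 0.01)
print(abs(state[0]) < 0.1)                # oscillation has decayed
```

A cloth model applies the same step to every particle's position and velocity, with the force term assembled from the bending/shearing springs and external loads.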

  4. Agent based modeling in tactical wargaming

    Science.gov (United States)

    James, Alex; Hanratty, Timothy P.; Tuttle, Daniel C.; Coles, John B.

    2016-05-01

    Army staffs at division, brigade, and battalion levels often plan for contingency operations. As such, analysts consider the impact and potential consequences of actions taken. The Army Military Decision-Making Process (MDMP) dictates identification and evaluation of possible enemy courses of action; however, non-state actors often do not exhibit the same level and consistency of planned actions that the MDMP was originally designed to anticipate. The fourth MDMP step is a particular challenge, wargaming courses of action within the context of complex social-cultural behaviors. Agent-based Modeling (ABM) and its resulting emergent behavior is a potential solution to model terrain in terms of the human domain and improve the results and rigor of the traditional wargaming process.

  5. Model-based vision using geometric hashing

    Science.gov (United States)

    Akerman, Alexander, III; Patton, Ronald

    1991-04-01

    The Geometric Hashing technique developed by the NYU Courant Institute has been applied to various automatic target recognition applications. In particular, I-MATH has extended the hashing algorithm to perform automatic target recognition of synthetic aperture radar (SAR) imagery. For this application, the hashing is performed upon the geometric locations of dominant scatterers. In addition to being a robust model-based matching algorithm -- invariant under translation, scale, and 3D rotations of the target -- hashing is of particular utility because it can still perform effective matching when the target is partially obscured. Moreover, hashing is very amenable to a SIMD parallel processing architecture, and thus potentially real-time implementable.
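The voting idea behind geometric hashing can be shown in a toy 2D form: model points are stored in a hash table by their coordinates relative to every ordered basis pair, and a scene (here rotated, shifted, and with one point occluded) votes for the basis that explains it. The point sets below are invented; similarity invariance comes from a complex-number basis transform:

```python
# Toy 2D geometric hashing: model features are stored by coordinates
# relative to every ordered basis pair; a partially occluded, rotated and
# shifted scene votes for matching bases. Point sets are invented.

def invariant(p, o, b):
    """Coordinates of p in the frame where o -> b is the unit x-axis."""
    z = (complex(*p) - complex(*o)) / (complex(*b) - complex(*o))
    return (round(z.real, 2), round(z.imag, 2))

def build_table(model):
    table = {}
    for i, o in enumerate(model):
        for j, b in enumerate(model):
            if i == j:
                continue
            for p in model:
                if p == o or p == b:
                    continue
                table.setdefault(invariant(p, o, b), []).append((i, j))
    return table

def vote(table, scene):
    """Use the first two scene points as a basis and tally votes."""
    votes = {}
    o, b = scene[0], scene[1]
    for p in scene[2:]:
        for basis in table.get(invariant(p, o, b), []):
            votes[basis] = votes.get(basis, 0) + 1
    return max(votes.items(), key=lambda kv: kv[1]) if votes else None

model = [(0, 0), (4, 0), (4, 3), (1, 2)]
# Scene: model rotated 90 degrees, shifted by (5, 5), point (1, 2) occluded.
scene = [(5, 5), (5, 9), (2, 9)]
best = vote(build_table(model), scene)
print(best)     # winning model basis pair and its vote count
```

Because every point votes independently, the occluded point simply contributes no vote, which is why the match still succeeds on partial data.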

  6. Model-Based Enterprise Summit Report

    Science.gov (United States)

    2014-02-01

    Fragments recovered from the summit slides: GE highlights the “industrial internet” for tying information together, resulting in better decisions, and is focusing on data quality for in-process models. Model-based measurement planning does a number of things well, such as generating collision-free probe paths and facilitating sensor orientation selection. Other slides address miscommunications about the cloud (“It’s not safe”, “Not for enterprise”, “B2C, not B2B”).

  7. Agent-based modeling and simulation

    CERN Document Server

    Taylor, Simon

    2014-01-01

    Operational Research (OR) deals with the use of advanced analytical methods to support better decision-making. It is multidisciplinary with strong links to management science, decision science, computer science and many application areas such as engineering, manufacturing, commerce and healthcare. In the study of emergent behaviour in complex adaptive systems, Agent-based Modelling & Simulation (ABMS) is being used in many different domains such as healthcare, energy, evacuation, commerce, manufacturing and defense. This collection of articles presents a convenient introduction to ABMS with pa

  8. Knowledge-based geometric modeling in construction

    DEFF Research Database (Denmark)

    Bonev, Martin; Hvam, Lars

    2012-01-01

    a considerably high amount of their resources is required for designing and specifying the majority of their product assortment. As design decisions are hereby based on knowledge and experience about behaviour and applicability of construction techniques and materials for a predefined design situation, smart...... tools need to be developed, to support these activities. In order to achieve a higher degree of design automation, this study proposes a framework for using configuration systems within the CAD environment together with suitable geometric modeling techniques on the example of a Danish manufacturer......

  9. Flow Modeling Based Wall Element Technique

    Directory of Open Access Journals (Sweden)

    Sabah Tamimi

    2012-08-01

    Full Text Available Two types of flow were examined: pressure-driven flow, and a combination of pressure and Couette flow, for confined turbulent flow in a smooth straight channel, with a one-equation model used to depict the turbulent viscosity. A finite element technique based on a zone close to a solid wall was adopted for predicting the distribution of the pertinent variables in this zone, and was examined even in the case when the near-wall zone was extended away from the wall. The imposed technique has been validated and compares well with other techniques.

  10. Energy-based models for environmental biotechnology.

    Science.gov (United States)

    Rodríguez, Jorge; Lema, Juan M; Kleerebezem, Robbert

    2008-07-01

    Environmental biotechnology is evolving. Current process objectives include the production of chemicals and/or energy carriers (biofuels) in addition to the traditional objective of removing pollutants from waste. To maximise product yields and minimise biomass production, future processes will rely on anaerobic microbial communities. Anaerobic processes are characterised by small Gibbs energy changes in the reactions catalysed, and this provides clear thermodynamic process boundaries. Here, a Gibbs-energy-based methodology is proposed for mathematical modelling of energy-limited anaerobic ecosystems. This methodology provides a basis for the description of microbial activities as a function of environmental factors, which will allow enhanced catalysis of specific reactions of interest for process development.

  11. Integrated Semantic Similarity Model Based on Ontology

    Institute of Scientific and Technical Information of China (English)

    LIU Ya-Jun; ZHAO Yun

    2004-01-01

    To solve the problem of the inadequacy of semantic processing in intelligent question answering systems, an integrated semantic similarity model which calculates the semantic similarity using geometric distance and information content is presented in this paper. With the help of the interrelationships between concepts, the information content of concepts, and the strength of the edges in the ontology network, we can calculate the semantic similarity between two concepts and provide information for the further calculation of the semantic similarity between a user's question and the answers in the knowledge base. The results of experiments on the prototype have shown that the semantic problem in natural language processing can also be solved with the help of the knowledge and the abundant semantic information in ontology. More than 90% accuracy with less than 50 ms average searching time has been reached in the intelligent question answering prototype system based on ontology. The results are very satisfactory.
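The combination of geometric (path) distance and information content that this record describes can be sketched on a tiny invented ontology; the hierarchy, corpus counts, and weighting below are all made up for illustration:

```python
# Toy combination of geometric (path) distance and information content on
# an invented mini-ontology; counts and weights are made up.

import math

hierarchy = {"dog": "mammal", "cat": "mammal", "mammal": "animal",
             "sparrow": "bird", "bird": "animal", "animal": None}
counts = {"dog": 30, "cat": 20, "sparrow": 10,
          "mammal": 60, "bird": 15, "animal": 100}

def ancestors(c):
    out = []
    while c is not None:
        out.append(c)
        c = hierarchy[c]
    return out

def similarity(a, b, alpha=0.5):
    pa, pb = ancestors(a), ancestors(b)
    lcs = next(c for c in pa if c in pb)            # lowest common subsumer
    edge_dist = pa.index(lcs) + pb.index(lcs)       # geometric distance
    ic = -math.log(counts[lcs] / counts["animal"])  # information content
    return alpha / (1 + edge_dist) + (1 - alpha) * ic

print(round(similarity("dog", "cat"), 3),
      round(similarity("dog", "sparrow"), 3))
```

A rarer common subsumer carries more information content, so siblings under a specific concept score higher than pairs that only meet at the root.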

  12. Python-Based Applications for Hydrogeological Modeling

    Science.gov (United States)

    Khambhammettu, P.

    2013-12-01

    Python is a general-purpose, high-level programming language whose design philosophy emphasizes code readability. Add-on packages supporting fast array computation (numpy), plotting (matplotlib), and scientific/mathematical functions (scipy) have resulted in a powerful ecosystem for scientists interested in exploratory data analysis, high-performance computing and data visualization. Three examples are provided to demonstrate the applicability of the Python environment in hydrogeological applications. Python programs were used to model an aquifer test and estimate aquifer parameters at a Superfund site. The aquifer test conducted at a Groundwater Circulation Well was modeled with the Python/FORTRAN-based TTIM Analytic Element Code. The aquifer parameters were estimated with PEST such that a good match was produced between the simulated and observed drawdowns. Python scripts were written to interface with PEST and visualize the results. A convolution-based approach was used to estimate source concentration histories based on observed concentrations at receptor locations. Unit Response Functions (URFs) that relate the receptor concentrations to a unit release at the source were derived with the ATRANS code. The impact of any releases at the source could then be estimated by convolving the source release history with the URFs. Python scripts were written to compute and visualize receptor concentrations for user-specified source histories. The framework provided a simple and elegant way to test various hypotheses about the site. A Python/FORTRAN-based program TYPECURVEGRID-Py was developed to compute and visualize groundwater elevations and drawdown through time in response to a regional uniform hydraulic gradient and the influence of pumping wells using either the Theis solution for a fully-confined aquifer or the Hantush-Jacob solution for a leaky confined aquifer. The program supports an arbitrary number of wells that can operate according to arbitrary schedules.
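The Theis solution mentioned above gives confined-aquifer drawdown as s = Q·W(u)/(4πT) with u = r²S/(4Tt), where W is the exponential-integral well function. A hedged sketch with the well function evaluated by its series expansion; the pumping rate and aquifer properties below are invented example values (SI units):

```python
# Hedged sketch of the Theis confined-aquifer drawdown solution; the well
# function W(u) = E1(u) is evaluated by its series expansion (small u).
# Pumping rate and aquifer properties are invented (SI units).

import math

def well_function(u, terms=30):
    """W(u) = -gamma - ln(u) + sum_{n>=1} (-1)^(n+1) u^n / (n * n!)."""
    total = -0.5772156649015329 - math.log(u)
    sign, fact = 1.0, 1.0
    for n in range(1, terms + 1):
        fact *= n
        total += sign * u ** n / (n * fact)
        sign = -sign
    return total

def theis_drawdown(Q, T, S, r, t):
    """Drawdown s = Q * W(u) / (4 pi T), with u = r^2 S / (4 T t)."""
    u = r * r * S / (4.0 * T * t)
    return Q / (4.0 * math.pi * T) * well_function(u)

# Drawdown 50 m from a well pumping 0.01 m3/s, after one day:
s = theis_drawdown(Q=0.01, T=5e-3, S=1e-4, r=50.0, t=86400.0)
print(round(s, 3))
```

Evaluating this on a grid of (r, t) points and superposing one term per well is essentially what a type-curve program like the one described must do internally.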

  13. Biologically based multistage modeling of radiation effects

    Energy Technology Data Exchange (ETDEWEB)

    William Hazelton; Suresh Moolgavkar; E. Georg Luebeck

    2005-08-30

    This past year we have made substantial progress in modeling the contribution of homeostatic regulation to low-dose radiation effects and carcinogenesis. We have worked to refine and apply our multistage carcinogenesis models to explicitly incorporate cell cycle states, simple and complex damage, checkpoint delay, slow and fast repair, differentiation, and apoptosis to study the effects of low-dose ionizing radiation in mouse intestinal crypts, as well as in other tissues. We have one paper accepted for publication in "Advances in Space Research", and another manuscript in preparation describing this work. I also wrote a chapter describing our combined cell-cycle and multistage carcinogenesis model that will be published in a book on stochastic carcinogenesis models edited by Wei-Yuan Tan. In addition, we organized and held a workshop on "Biologically Based Modeling of Human Health Effects of Low-dose Ionizing Radiation", July 28-29, 2005 at Fred Hutchinson Cancer Research Center in Seattle, Washington. We had over 20 participants, including Mary Helen Barcellos-Hoff as keynote speaker, talks by most of the low-dose modelers in the DOE low-dose program, experimentalists including Les Redpath (and Mary Helen), Noelle Metting from DOE, and Tony Brooks. It appears that homeostatic regulation may be central to understanding low-dose radiation phenomena. The primary effects of ionizing radiation (IR) are cell killing, delayed cell cycling, and induction of mutations. However, homeostatic regulation causes cells that are killed or damaged by IR to eventually be replaced. Cells with an initiating mutation may have a replacement advantage, leading to clonal expansion of these initiated cells. Thus we have focused particularly on modeling effects that disturb homeostatic regulation as early steps in the carcinogenic process. There are two primary considerations that support our focus on homeostatic regulation. First, a number of...

  14. Model based control of refrigeration systems

    Energy Technology Data Exchange (ETDEWEB)

    Sloth Larsen, L.F.

    2005-11-15

    The subject of this Ph.D. thesis is model based control of refrigeration systems. Model based control covers a variety of different types of control that incorporate mathematical models. In this thesis the main subject has therefore been restricted to system optimizing control. The optimizing control is divided into two layers, where the system-oriented top layer deals with set-point optimizing control and the lower layer deals with dynamical optimizing control in the subsystems. The thesis has two main contributions, i.e. a novel approach for set-point optimization and a novel approach for desynchronization based on dynamical optimization. The focus in the development of the proposed set-point optimizing control has been on deriving a simple and general method that can easily be applied to various compositions of the same class of systems, such as refrigeration systems. The method is based on a set of parameter-dependent static equations describing the considered process. By adapting the parameters to the given process, predicting the steady state, and computing a steady-state gradient of the cost function, the process can be driven continuously towards zero gradient, i.e. the optimum (if the cost function is convex). The method furthermore deals with system constraints by introducing barrier functions; hereby the best possible performance can be obtained while taking the given constraints into account, e.g. under extreme operational conditions. The proposed method has been applied to a test refrigeration system, placed at Aalborg University, for minimization of the energy consumption. Here it was proved that by using general static parameter-dependent system equations it was possible to drive the set-points close to the optimum and thus reduce the power consumption by up to 20%. In the dynamical optimizing layer the idea is to optimize the operation of the subsystems, or the groupings of subsystems, that limit the obtainable system performance. In systems...
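The set-point idea in this record, driving a convex cost toward zero gradient while barrier functions keep the operating point inside its constraints, can be illustrated in miniature. The cost function, bounds, and barrier weight below are invented, not the thesis' refrigeration model:

```python
# Miniature of set-point optimization with a barrier function: drive a
# convex cost toward zero gradient while a log-barrier keeps the set-point
# inside (lo, hi). Cost, bounds and barrier weight are invented.

def optimize_setpoint(x0, lo, hi, mu=0.5, step=0.05, iters=2000):
    """Gradient descent on (x - 3)^2 - mu*ln(x - lo) - mu*ln(hi - x)."""
    x = x0
    for _ in range(iters):
        g = 2.0 * (x - 3.0)                        # cost gradient
        g += -mu / (x - lo) + mu / (hi - x)        # log-barrier gradient
        x -= step * g
    return x

# Unconstrained optimum is x = 3, but the barrier holds x inside (0, 2):
x_opt = optimize_setpoint(x0=1.0, lo=0.0, hi=2.0)
print(round(x_opt, 3))
```

The barrier gradient grows without bound at the constraint, so the iterate settles just inside the feasible set instead of at the unconstrained optimum, which is the "best possible performance taking the given constraints into account" behaviour the abstract describes.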

  15. Modeling Web-based Educational Systems: Process Design Teaching Model

    Directory of Open Access Journals (Sweden)

    Elena Rokou

    2004-01-01

    Full Text Available Using modeling languages is essential to the construction of educational systems based on software engineering principles and methods. Furthermore, the instructional design is undoubtedly the cornerstone of the design and development of educational systems. Although several methodologies and languages have been proposed for the specification of isolated educational multimedia systems, none has optimum results for the description of these systems and, especially, for their pedagogical aspect. Of course this is due primarily to how these systems function and are applied; it is not due to the language itself, although its special characteristics contribute substantially to the development of these systems, sometimes positively and sometimes negatively. In this paper, we briefly describe the introduction of stereotypes to the pedagogical design of educational systems and appropriate modifications of the existing package diagrams of UML (Unified Modeling Language). The main objective of these new stereotypes is to describe sufficiently the mechanisms of generation, monitoring and re-adapting of teaching and student models which can be used in educational applications.

  16. Models-Based Practice: Great White Hope or White Elephant?

    Science.gov (United States)

    Casey, Ashley

    2014-01-01

    Background: Many critical curriculum theorists in physical education have advocated a model- or models-based approach to teaching in the subject. This paper explores the literature base around models-based practice (MBP) and asks if this multi-models approach to curriculum planning has the potential to be the great white hope of pedagogical change…

  17. Activity Recognition Using Biomechanical Model Based Pose Estimation

    OpenAIRE

    Reiss, Attila; Hendeby, Gustaf; Bleser, Gabriele; Stricker, Didier

    2010-01-01

    In this paper, a novel activity recognition method based on signal-oriented and model-based features is presented. The model-based features are calculated from shoulder and elbow joint angles and torso orientation, provided by upper-body pose estimation based on a biomechanical body model. The recognition performance of signal-oriented and model-based features is compared within this paper, and the potential of improving recognition accuracy by combining the two approaches is proved: the accu...

  18. Model-based estimation of individual fitness

    Science.gov (United States)

    Link, W.A.; Cooch, E.G.; Cam, E.

    2002-01-01

    Fitness is the currency of natural selection, a measure of the propagation rate of genotypes into future generations. Its various definitions have the common feature that they are functions of survival and fertility rates. At the individual level, the operative level for natural selection, these rates must be understood as latent features, genetically determined propensities existing at birth. This conception of rates requires that individual fitness be defined and estimated by consideration of the individual in a modelled relation to a group of similar individuals; the only alternative is to consider a sample of size one, unless a clone of identical individuals is available. We present hierarchical models describing individual heterogeneity in survival and fertility rates and allowing for associations between these rates at the individual level. We apply these models to an analysis of life histories of Kittiwakes (Rissa tridactyla) observed at several colonies on the Brittany coast of France. We compare Bayesian estimation of the population distribution of individual fitness with estimation based on treating individual life histories in isolation, as samples of size one (e.g. McGraw & Caswell, 1996).

  19. Intellectual Model-Based Configuration Management Conception

    Directory of Open Access Journals (Sweden)

    Bartusevics Arturs

    2014-07-01

    Full Text Available Software configuration management is one of the most important disciplines within the software development project, which helps control the software evolution process and allows including into the end project only tested and validated changes. To achieve this, software management completes certain tasks. Concrete tools are used for technical implementation of tasks, such as version control systems, servers of continuous integration, compilers, etc. A correct configuration management process usually requires several tools, which mutually exchange information by generating various kinds of transfers. When it comes to introducing the configuration management process, often there are situations when tool installation is started, yet at that given moment there is no general picture of the total process. The article offers a model-based configuration management concept, which foresees the development of an abstract model for the configuration management process that later is transformed to lower abstraction level models and tools are indicated to support the technical process. A solution of this kind allows a more rational introduction and configuration of tools

  20. An Attack Modeling Based on Colored Petri Net

    Institute of Scientific and Technical Information of China (English)

    ZHOU Shijie; QIN Zhiguang; ZHANG Feng; LIU Jinde

    2004-01-01

    A colored Petri net (CPN) based attack modeling approach is addressed. Compared with graph-based modeling, a CPN-based attack model is flexible enough to model Internet intrusions because of its static and dynamic features. The processes and rules for building a CPN-based attack model from an attack tree are also presented. In order to evaluate the risk of intrusion, some cost elements are added to CPN-based attack modeling. This extended model is useful in intrusion detection and risk evaluation. Experience shows that it is easy to exploit the CPN-based attack modeling approach to provide controlling functions, such as intrusion response and intrusion defense. A case study given in this paper shows that the CPN-based attack model has many unique characteristics which the attack tree model lacks.
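    The firing semantics underlying such a model can be shown with a minimal ordinary Petri net in Python; a CPN additionally attaches typed ("colored") data to tokens. The places and the attack step below are invented for illustration:

```python
def enabled(marking, pre):
    # A transition is enabled when every input place holds enough tokens.
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    # Firing consumes tokens from input places and produces tokens in
    # output places, returning the new marking.
    if not enabled(marking, pre):
        raise ValueError("transition not enabled")
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# Hypothetical attack step "exploit": a scanned host becomes compromised.
m0 = {"scanned": 1, "compromised": 0}
m1 = fire(m0, pre={"scanned": 1}, post={"compromised": 1})
```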

  1. Model-based control of networked systems

    CERN Document Server

    Garcia, Eloy; Montestruque, Luis A

    2014-01-01

    This monograph introduces a class of networked control systems (NCS) called model-based networked control systems (MB-NCS) and presents various architectures and control strategies designed to improve the performance of NCS. The overall performance of NCS considers the appropriate use of network resources, particularly network bandwidth, in conjunction with the desired response of the system being controlled.   The book begins with a detailed description of the basic MB-NCS architecture that provides stability conditions in terms of state feedback updates. It also covers typical problems in NCS such as network delays, network scheduling, and data quantization, as well as more general control problems such as output feedback control, nonlinear systems stabilization, and tracking control.   Key features and topics include: Time-triggered and event-triggered feedback updates Stabilization of uncertain systems subject to time delays, quantization, and extended absence of feedback Optimal control analysis and ...

  2. Model Based Control of Refrigeration Systems

    DEFF Research Database (Denmark)

    Larsen, Lars Finn Sloth

    for automation of these procedures, that is to incorporate some "intelligence" in the control system, this project was started up. The main emphasis of this work has been on model based methods for system optimizing control in supermarket refrigeration systems. The idea of implementing a system optimizing...... optimizing the steady state operation "set-point optimizing control" and a part optimizing dynamic behaviour of the system "dynamical optimizing control". A novel approach for set-point optimization will be presented. The general idea is to use a prediction of the steady state, for computation of the cost...... that the crosscouplings are disregarded. Though these cross couplings are relatively weak, it turns out that they pose a major problem when hysteresis controllers are used in the local loops. It can namely cause synchronization of the distributed hysteresis controllers, thus accumulating the effects of the cross...

  3. Ecosystem Based Business Model of Smart Grid

    DEFF Research Database (Denmark)

    Lundgaard, Morten Raahauge; Ma, Zheng; Jørgensen, Bo Nørregaard

    2015-01-01

    This paper tries to investigate the ecosystem based business model in a smart grid infrastructure and the potential of value capture in the highly complex macro infrastructure such as smart grid. This paper proposes an alternative perspective to study the smart grid business ecosystem to support...... the infrastructural challenges, such as the interoperability of business components for smart grid. So far little research has explored the business ecosystem in the smart grid concept. The study on the smart grid with the theory of business ecosystem may open opportunities to understand market catalysts. This study...... contributes an understanding of business ecosystem applicable for smart grid. Smart grid infrastructure is an intricate business ecosystem, which have several intentions to deliver the value proposition and what it should be. The findings help to identify and capture value from markets....

  4. Model based optimization of EMC input filters

    Energy Technology Data Exchange (ETDEWEB)

    Raggl, K; Kolar, J. W. [Swiss Federal Institute of Technology, Power Electronic Systems Laboratory, Zuerich (Switzerland); Nussbaumer, T. [Levitronix GmbH, Zuerich (Switzerland)

    2008-07-01

    Input filters of power converters for compliance with regulatory electromagnetic compatibility (EMC) standards are often over-dimensioned in practice due to a non-optimal selection of number of filter stages and/or the lack of solid volumetric models of the inductor cores. This paper presents a systematic filter design approach based on a specific filter attenuation requirement and volumetric component parameters. It is shown that a minimal volume can be found for a certain optimal number of filter stages for both the differential mode (DM) and common mode (CM) filter. The considerations are carried out exemplarily for an EMC input filter of a single phase power converter for the power levels of 100 W, 300 W, and 500 W. (author)

  5. Multiple Damage Progression Paths in Model-based Prognostics

    Data.gov (United States)

    National Aeronautics and Space Administration — Model-based prognostics approaches employ domain knowledge about a system, its components, and how they fail through the use of physics-based models. Component...

  6. An Agent-Based Monetary Production Simulation Model

    DEFF Research Database (Denmark)

    Bruun, Charlotte

    2006-01-01

    An Agent-Based Simulation Model Programmed in Objective Borland Pascal. Program and source code is downloadable.

  7. Distributed Damage Estimation for Prognostics based on Structural Model Decomposition

    Data.gov (United States)

    National Aeronautics and Space Administration — Model-based prognostics approaches capture system knowledge in the form of physics-based models of components that include how they fail. These methods consist of...

  8. Integrated Enterprise Modeling Method Based on Workflow Model and Multiviews%Integrated Enterprise Modeling Method Based on Workflow Model and Multiviews

    Institute of Scientific and Technical Information of China (English)

    林慧苹; 范玉顺; 吴澄

    2001-01-01

    Many enterprise modeling methods are proposed to model the business process of enterprises and to implement CIM systems. But difficulties are still encountered when these methods are applied to the CIM system design and implementation. This paper proposes a new integrated enterprise modeling methodology based on the workflow model. The system architecture and the integrated modeling environment are described with a new simulation strategy. The modeling process and the relationship between the workflow model and the views are discussed.

  9. A model-based evaluation system of enterprise

    Institute of Scientific and Technical Information of China (English)

    Yan Junwei; Ye Yang; Wang Jian

    2005-01-01

    This paper analyses the architecture of enterprise modeling, proposes indicator selection principles and indicator decomposition methods, examines the approaches to the evaluation of enterprise modeling and designs an evaluation model of AHP. Then a model-based evaluation system of enterprise is presented to effectively evaluate the business model in the framework of enterprise modeling.

  10. Framework of Pattern Recognition Model Based on the Cognitive Psychology

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    According to the fundamental theory of the visual cognition mechanism and cognitive psychology, the visual pattern recognition model is introduced briefly. Three pattern recognition models, i.e. the template-based matching model, the prototype-based matching model and the feature-based matching model, are built and discussed separately. In addition, the influence of object background information and the visual focus point on the result of pattern recognition is also discussed, with the example of recognition of fuzzy letters and figures.
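    The template-based matching model, for instance, reduces to choosing the stored template closest to the input pattern. The toy 3x3 letter patterns below are invented for illustration:

```python
import numpy as np

# Stored templates for two letters as 3x3 binary patterns (toy data).
TEMPLATES = {
    "T": np.array([[1, 1, 1], [0, 1, 0], [0, 1, 0]]),
    "L": np.array([[1, 0, 0], [1, 0, 0], [1, 1, 1]]),
}

def match(pattern):
    # Pick the template with the smallest pixel-wise distance.
    return min(TEMPLATES, key=lambda k: np.abs(TEMPLATES[k] - pattern).sum())

# A fuzzy "T" with one corrupted pixel is still matched to the T template.
noisy_t = np.array([[1, 1, 1], [0, 1, 0], [0, 0, 0]])
label = match(noisy_t)
```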

  11. Validating agent based models through virtual worlds.

    Energy Technology Data Exchange (ETDEWEB)

    Lakkaraju, Kiran; Whetzel, Jonathan H.; Lee, Jina; Bier, Asmeret Brooke; Cardona-Rivera, Rogelio E.; Bernstein, Jeremy Ray Rhythm

    2014-01-01

    As the US continues its vigilance against distributed, embedded threats, understanding the political and social structure of these groups becomes paramount for predicting and disrupting their attacks. Agent-based models (ABMs) serve as a powerful tool to study these groups. While the popularity of social network tools (e.g., Facebook, Twitter) has provided extensive communication data, there is a lack of fine-grained behavioral data with which to inform and validate existing ABMs. Virtual worlds, in particular massively multiplayer online games (MMOGs), where large numbers of people interact within a complex environment for long periods of time, provide an alternative source of data. These environments provide a rich social setting where players engage in a variety of activities observed between real-world groups: collaborating and/or competing with other groups, conducting battles for scarce resources, and trading in a market economy. Strategies employed by player groups surprisingly reflect those seen in present-day conflicts, where players use diplomacy or espionage as their means for accomplishing their goals. In this project, we propose to address the need for fine-grained behavioral data by acquiring and analyzing game data from a commercial MMOG, referred to within this report as Game X. The goals of this research were: (1) devising toolsets for analyzing virtual world data to better inform the rules that govern a social ABM and (2) exploring how virtual worlds could serve as a source of data to validate ABMs established for analogous real-world phenomena. During this research, we studied certain patterns of group behavior to complement social modeling efforts where a significant lack of detailed examples of observed phenomena exists. This report outlines our work examining group behaviors that underlie what we have termed the Expression-To-Action (E2A) problem: determining the changes in social contact that lead individuals/groups to engage in a particular behavior

  12. Model Mapping Approach Based on Ontology Semantics

    Directory of Open Access Journals (Sweden)

    Jinkui Hou

    2013-09-01

    Full Text Available The mapping relations between different models are the foundation for model transformation in model-driven software development. On the basis of ontology semantics, model mappings between different levels are classified by using the structural semantics of modeling languages. The general definition process for mapping relations is explored, and the principles of structure mapping are proposed subsequently. The approach is further illustrated by the mapping relations from the class model of an object-oriented modeling language to C programming codes. The application research shows that the approach provides theoretical guidance for the realization of model mapping and can thus effectively support model-driven software development.

  13. Tree-based disease classification using protein data.

    Science.gov (United States)

    Zhu, Hongtu; Yu, Chang-Yung; Zhang, Heping

    2003-09-01

    A reliable and precise classification of diseases is essential for successful diagnosis and treatment. Using mass spectrometry from clinical specimens, scientists may find protein variations among diseases and use this information to improve diagnosis. In this paper, we propose a novel procedure to classify disease status based on protein data from mass spectrometry. Our new tree-based algorithm consists of three steps: projection, selection and classification tree. The projection step aims to project all observations from specimens onto the same bases so that the projected data have fixed coordinates. Thus, for each specimen, we obtain a large vector of 'coefficients' on the same basis. The purpose of the selection step is data reduction, condensing the large vector from the projection step into a much lower-order informative vector. Finally, using these reduced vectors, we apply recursive partitioning to construct an informative classification tree. This method has been successfully applied to protein data provided by the Department of Radiology and Chemistry at Duke University.
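    The three steps can be mirrored with simplified stand-ins (mean-pooling for the projection, a class-separation score for the selection, and a single split in place of recursive partitioning); the spectra below are synthetic, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic mass spectra: 40 specimens x 100 m/z channels; the "diseased"
# class has an elevated peak region (invented data).
X = rng.normal(0.0, 1.0, size=(40, 100))
y = np.repeat([0, 1], 20)
X[y == 1, 30:40] += 3.0

def project(spectra, n_bins=10):
    # Projection: mean-pool every spectrum onto the same 10-bin basis,
    # giving each specimen a fixed-coordinate coefficient vector.
    return spectra.reshape(len(spectra), n_bins, -1).mean(axis=2)

def select(coeffs, labels, k=1):
    # Selection: keep the k coefficients with the largest separation
    # between the class means.
    gap = np.abs(coeffs[labels == 0].mean(0) - coeffs[labels == 1].mean(0))
    return np.argsort(gap)[-k:]

coeffs = project(X)
best = select(coeffs, y)

# Classification-tree step, reduced here to one split on the best feature.
feat = coeffs[:, best[0]]
thresh = (feat[y == 0].mean() + feat[y == 1].mean()) / 2
accuracy = ((feat > thresh).astype(int) == y).mean()
```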

  14. A Software Service Framework Model Based on Agent

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    This paper presents an agent-based software service framework model called ASF, and defines the basic concepts and structure of the ASF model. It also describes the management and process mechanisms in the ASF model.

  15. Dynamic Metabolic Model Building Based on the Ensemble Modeling Approach

    Energy Technology Data Exchange (ETDEWEB)

    Liao, James C. [Univ. of California, Los Angeles, CA (United States)

    2016-10-01

    Ensemble modeling of kinetic systems addresses the challenges of kinetic model construction, with respect to parameter value selection, and still allows for the rich insights possible from kinetic models. This project aimed to show that constructing, implementing, and analyzing such models is a useful tool for the metabolic engineering toolkit, and that they can result in actionable insights from models. Key concepts are developed and deliverable publications and results are presented.

  16. Measurement-based load modeling: Theory and application

    Institute of Scientific and Technical Information of China (English)

    MA; Jin; HAN; Dong; HE; RenMu

    2007-01-01

    Load model is one of the most important elements in power system operation and control. However, owing to its complexity, load modeling is still an open and very difficult problem. Summarizing our work on measurement-based load modeling in China for more than twenty years, this paper systematically introduces the mathematical theory and applications of load modeling. The flow chart and algorithms for measurement-based load modeling are presented. A composite load model structure with 13 parameters is also proposed. Analysis results based on trajectory sensitivity theory indicate the importance of the load model parameters for the identification. Case studies show the accuracy of the presented measurement-based load model. The load model thus built has been validated by field measurements all over China. Future working directions on measurement-based load modeling are also discussed in the paper.

  17. A Bit Progress on Word—Based Language Model

    Institute of Scientific and Technical Information of China (English)

    陈勇; 陈国评

    2003-01-01

    A good language model is essential to a postprocessing algorithm for recognition systems. In the past, researchers have presented various language models, such as character-based language models, word-based language models, syntactical-rules language models, hybrid models, etc. The word N-gram model is by far an effective and efficient model, but one has to address the problem of data sparseness in establishing the model. Katz and Kneser et al. respectively presented effective remedies to solve this challenging problem. In this study, we proposed an improvement to their methods by incorporating Chinese language-specific information or Chinese word class information into the system.
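    The data-sparseness problem the abstract mentions can be made concrete with a tiny bigram model; the add-k smoothing below is a deliberately simpler remedy than the Katz and Kneser-Ney methods cited, and the corpus is invented:

```python
from collections import Counter

def train_bigram(tokens):
    # Count bigrams and unigram histories from a token sequence.
    return Counter(zip(tokens, tokens[1:])), Counter(tokens)

def prob(bigrams, unigrams, w1, w2, vocab_size, k=0.5):
    # Add-k smoothing: unseen bigrams get nonzero probability mass.
    return (bigrams[(w1, w2)] + k) / (unigrams[w1] + k * vocab_size)

tokens = "the cat sat on the mat the cat ran".split()
bigrams, unigrams = train_bigram(tokens)
V = len(set(tokens))

p_seen = prob(bigrams, unigrams, "the", "cat", V)    # observed bigram
p_unseen = prob(bigrams, unigrams, "the", "sat", V)  # never observed
```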

  18. Study of chaos based on a hierarchical model

    Energy Technology Data Exchange (ETDEWEB)

    Yagi, Masatoshi; Itoh, Sanae-I. [Kyushu Univ., Fukuoka (Japan). Research Inst. for Applied Mechanics

    2001-12-01

    Study of chaos based on a hierarchical model is briefly reviewed. Here we categorize hierarchical model equations, i.e., (1) a model with a few degrees of freedom, e.g., the Lorenz model, (2) a model with intermediate degrees of freedom like a shell model, and (3) a model with many degrees of freedom such as a Navier-Stokes equation. We discuss the nature of chaos and turbulence described by these models via Lyapunov exponents. The interpretation of results observed in fundamental plasma experiments is also shown based on a shell model. (author)

  19. MRO CTX-based Digital Terrain Models

    Science.gov (United States)

    Dumke, Alexander

    2016-04-01

    In planetary surface sciences, digital terrain models (DTM) are paramount when it comes to understanding and quantifying processes. In this contribution an approach for the derivation of digital terrain models from stereo images of the NASA Mars Reconnaissance Orbiter (MRO) Context Camera (CTX) is described. CTX consists of a 350 mm focal length telescope and 5000 CCD sensor elements and is operated as a pushbroom camera. It acquires images with ~6 m/px over a swath width of ~30 km of the Mars surface [1]. Today, several approaches for the derivation of CTX DTMs exist [e.g. 2, 3, 4]. The approach discussed here is based on established software combined with proprietary software, as described below. The main processing task for the derivation of CTX stereo DTMs is based on six steps: (1) First, CTX images are radiometrically corrected using the ISIS software package [5]. (2) For selected CTX stereo images, exterior orientation data from reconstructed NAIF SPICE data are extracted [6]. (3) In the next step, High Resolution Stereo Camera (HRSC) DTMs [7, 8, 9] are used for the rectification of CTX stereo images to reduce the search area during image matching. Here, HRSC DTMs are used due to their higher spatial resolution compared to MOLA DTMs. (4) The determination of coordinates of homologous points between stereo images, i.e. the stereo image matching process, consists of two steps: first, a cross-correlation to obtain approximate values and, secondly, their use in a least-squares matching (LSM) process in order to obtain subpixel positions. (5) The stereo matching results are then used to generate object points from forward ray intersections. (6) As a last step, the DTM raster generation is performed using software developed at the German Aerospace Center, Berlin, whereby only object points with an error smaller than a threshold value are used. References: [1] Malin, M. C. et al., 2007, JGR 112, doi:10.1029/2006JE002808 [2] Broxton, M. J. et al
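    The first stage of step (4) can be sketched as a 1-D normalised cross-correlation that locates a template at integer-pixel precision (least-squares matching would then refine this to subpixel); the signal below is synthetic, not CTX imagery:

```python
import numpy as np

def ncc_offset(ref, search):
    # Slide the reference patch along the search line and return the
    # offset with the highest correlation coefficient.
    best, best_score = 0, -2.0
    r = (ref - ref.mean()) / ref.std()
    for off in range(len(search) - len(ref) + 1):
        w = search[off:off + len(ref)]
        w = (w - w.mean()) / w.std()
        score = (r * w).mean()
        if score > best_score:
            best, best_score = off, score
    return best

rng = np.random.default_rng(1)
line = rng.normal(size=200)          # synthetic image line
template = line[73:88].copy()        # patch cut out at offset 73
offset = ncc_offset(template, line)  # recovered integer-pixel offset
```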

  20. A stability condition for turbulence model: From EMMS model to EMMS-based turbulence model

    CERN Document Server

    Zhang, Lin; Wang, Limin; Li, Jinghai

    2013-01-01

    The closure problem of turbulence is still a challenging issue in turbulence modeling. In this work, a stability condition is used to close turbulence. Specifically, we regard single-phase flow as a mixture of turbulent and non-turbulent fluids, separating the structure of turbulence. Subsequently, according to the picture of the turbulent eddy cascade, the energy contained in turbulent flow is decomposed into different parts and then quantified. A turbulence stability condition, similar to the principle of the energy-minimization multi-scale (EMMS) model for gas-solid systems, is formulated to close the dynamic constraint equations of turbulence, allowing the heterogeneous structural parameters of turbulence to be optimized. We call this model the `EMMS-based turbulence model', and use it to construct the corresponding turbulent viscosity coefficient. To validate the EMMS-based turbulence model, it is used to simulate two classical benchmark problems, lid-driven cavity flow and turbulent flow with forced con...

  1. The Community-based Whole Magnetosphere Model

    Science.gov (United States)

    2011-11-15

    Tribulations and Exultations in Coupling Models of the Magnetosphere with Ionosphere-Thermosphere Models, Planetary Aeronomy ISSI Meeting, Bern. (CWMM Final Report, Ridley)

  2. Model Based Analysis and Test Generation for Flight Software

    Science.gov (United States)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  3. Recommendation based on trust diffusion model.

    Science.gov (United States)

    Yuan, Jinfeng; Li, Li

    2014-01-01

    Recommender systems are emerging as a powerful and popular tool for recommending online information relevant to a given user. The traditional recommendation system suffers from the cold start problem and the data sparsity problem. Many methods have been proposed to solve these problems, but few can achieve satisfactory efficiency. In this paper, we present a method which combines the trust diffusion (DiffTrust) algorithm and probabilistic matrix factorization (PMF). DiffTrust is first used to study the possible diffusions of trust between various users. It is able to make use of the implicit relationships of the trust network, thus alleviating the data sparsity problem. Probabilistic matrix factorization (PMF) is then employed to combine the users' tastes with their trusted friends' interests. We evaluate the algorithm on the Flixster, Moviedata, and Epinions datasets, respectively. The experimental results show that recommendation based on our proposed DiffTrust + PMF model achieves high performance in terms of root mean square error (RMSE), Recall, and F-measure.
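    A generic trust-diffusion rule (not the paper's exact DiffTrust formulation) can be sketched by propagating direct trust along multi-hop paths with attenuation; the trust values below are invented:

```python
import numpy as np

# Direct trust among 4 users: row i trusts column j (illustrative values).
T = np.array([
    [0.0, 0.8, 0.0, 0.0],
    [0.0, 0.0, 0.9, 0.0],
    [0.0, 0.0, 0.0, 0.7],
    [0.0, 0.0, 0.0, 0.0],
])

def diffuse(trust, hops=2, decay=0.5):
    # Propagate trust along paths of up to `hops` links, attenuating
    # each additional hop; keep the strongest evidence per pair.
    total, walk = trust.copy(), trust.copy()
    for _ in range(hops - 1):
        walk = decay * (walk @ trust)
        total = np.maximum(total, walk)
    return total

T2 = diffuse(T)
# User 0 now holds indirect trust in user 2 (0 -> 1 -> 2), easing sparsity.
```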

  4. GENETIC-BASED NUTRITION RECOMMENDATION MODEL

    Directory of Open Access Journals (Sweden)

    S. A.A. Fayoumi

    2014-01-01

    Full Text Available Evolutionary computing is the collective name for a range of problem-solving techniques based on principles of biological evolution, such as natural selection and genetic inheritance. These techniques are being widely applied to a variety of problems in many vital fields. Evolutionary Algorithms (EAs), which apply the principles of evolutionary computation, such as the genetic algorithm, particle swarm, ant colony and bees algorithm, play an important role in the decision-making process. EAs serve many fields which affect our life directly, such as medicine, engineering, transportation and communications. One of these vital fields is nutrition, which can be viewed from several points of view: medical, physical, social, environmental and psychological. This study presents a proposed model that shows how evolutionary computing in general, and the genetic algorithm specifically, as a powerful evolutionary algorithm, can be used to recommend an appropriate nutrition style, on the medical and physical sides only, to each person according to his/her personal and medical measurements.

  5. Modeling dark fermentation for biohydrogen production: ADM1-based model vs. Gompertz model

    Energy Technology Data Exchange (ETDEWEB)

    Gadhamshetty, Venkataramana [Air Force Research Laboratory, Tyndall AFB, 139 Barnes Drive, Panama City, FL 32403 (United States); Arudchelvam, Yalini; Nirmalakhandan, Nagamany [Civil Engineering Department, New Mexico State University, Las Cruces, NM 88003 (United States); Johnson, David C. [Institute for Energy and Environment, New Mexico State University, Las Cruces, NM 88003 (United States)

    2010-01-15

    Biohydrogen production by dark fermentation in batch reactors was modeled using the Gompertz equation and a model based on Anaerobic Digestion Model (ADM1). The ADM1 framework, which has been well accepted for modeling methane production by anaerobic digestion, was modified in this study for modeling hydrogen production. Experimental hydrogen production data from eight reactor configurations varying in pressure conditions, temperature, type and concentration of substrate, inocula source, and stirring conditions were used to evaluate the predictive abilities of the two modeling approaches. Although the quality of fit between the measured and fitted hydrogen evolution by the Gompertz equation was high in all the eight reactor configurations with r² ≈ 0.98, each configuration required a different set of model parameters, negating its utility as a general approach to predict hydrogen evolution. On the other hand, the ADM1-based model (ADM1BM) with predefined parameters was able to predict COD, cumulative hydrogen production, as well as volatile fatty acids production, albeit at a slightly lower quality of fit. Agreement between the experimental temporal hydrogen evolution data and the ADM1BM predictions was statistically significant with r² > 0.91 and p-value < 1E-04. Sensitivity analysis of the validated model revealed that hydrogen production was sensitive to only six parameters in the ADM1BM. (author)
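    The Gompertz side of the comparison uses the modified Gompertz form commonly applied to cumulative hydrogen curves; the parameter values below are invented for illustration, not the paper's fitted values:

```python
import math

def gompertz(t, P, Rm, lam):
    # Modified Gompertz: P = H2 potential (mL), Rm = maximum production
    # rate (mL/h), lam = lag time (h).
    return P * math.exp(-math.exp(Rm * math.e / P * (lam - t) + 1))

P, Rm, lam = 250.0, 30.0, 6.0
early = gompertz(2.0, P, Rm, lam)   # inside the lag phase: near zero
late = gompertz(48.0, P, Rm, lam)   # plateau: approaches the potential P
```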

  6. Research of database-based modeling for mining management system

    Institute of Scientific and Technical Information of China (English)

    WU Hai-feng; JIN Zhi-xin; BAI Xi-jun

    2005-01-01

    Put forward a method to construct simulation models automatically with database-based automatic modeling (DBAM) for mining systems. Designed a standard simulation model linked to an open-cut automobile dispatch system. Analyzed and found out the law among them, and designed a model maker to realize the automatic programming of the new model program.

  7. Definition of Model-based diagnosis problems with Altarica

    OpenAIRE

    Pencolé, Yannick; Chanthery, Elodie; Peynot, Thierry

    2016-01-01

    International audience; This paper presents a framework for modeling diagnosis problems based on a formal language called Altarica. The initial purpose of the language Altarica was to define a modeling language for safety analysis. This language has been developed as a collaboration between academic and industrial partners and is used in some industrial companies. The paper shows that the expressivity of this language, mixing event-based and state-based models, is sufficient to model classi...

  8. Modeling and simulation of complex systems a framework for efficient agent-based modeling and simulation

    CERN Document Server

    Siegfried, Robert

    2014-01-01

    Robert Siegfried presents a framework for efficient agent-based modeling and simulation of complex systems. He compares different approaches for describing structure and dynamics of agent-based models in detail. Based on this evaluation the author introduces the "General Reference Model for Agent-based Modeling and Simulation" (GRAMS). Furthermore he presents parallel and distributed simulation approaches for execution of agent-based models -from small scale to very large scale. The author shows how agent-based models may be executed by different simulation engines that utilize underlying hard

  9. Computer Profiling Based Model for Investigation

    Directory of Open Access Journals (Sweden)

    Neeraj Choudhary

    2011-10-01

    Full Text Available Computer profiling is used for computer forensic analysis; this paper proposes and elaborates on a novel model for use in computer profiling, the computer profiling object model. The computer profiling object model is an information model which models a computer as objects with various attributes and inter-relationships. Together these provide the information necessary for a human investigator or an automated reasoning engine to make judgments as to the probable usage and evidentiary value of a computer system. The computer profiling object model can be implemented so as to support automated analysis, providing an investigator with the information needed to decide whether manual analysis is required.

  10. On Process Modelling Using Physical Oriented And Phenomena Based Principles

    Directory of Open Access Journals (Sweden)

    Mihai Culea

    2000-12-01

    Full Text Available This work presents a modelling framework based on a phenomena description of the process. The approach is taken to make process models easy to understand and construct in heterogeneous, possibly distributed, modelling and simulation environments. A simplified case study of a heat exchanger is considered, using the Modelica modelling language to check the proposed concept. The partial results are promising, and the research effort will be extended into a computer-aided modelling environment based on phenomena.

  11. Model based sustainable production of biomethane

    OpenAIRE

    Biernacki, Piotr

    2014-01-01

    The main intention of this dissertation was to evaluate sustainable production of biomethane with use of mathematical modelling. To achieve this goal, widely acknowledged models like Anaerobic Digestion Model No.1 (ADM1), describing anaerobic digestion, and electrolyte Non-Random Two Liquid Model (eNRTL), for gas purification, were utilized. The experimental results, batch anaerobic digestion of different substrates and carbon dioxide solubility in 2-(Ethylamino)ethanol, were used to determin...

  12. A model-based multisensor data fusion knowledge management approach

    Science.gov (United States)

    Straub, Jeremy

    2014-06-01

    A variety of approaches exist for combining data from multiple sensors. The model-based approach combines data based on its support for or refutation of elements of the model which in turn can be used to evaluate an experimental thesis. This paper presents a collection of algorithms for mapping various types of sensor data onto a thesis-based model and evaluating the truth or falsity of the thesis, based on the model. The use of this approach for autonomously arriving at findings and for prioritizing data are considered. Techniques for updating the model (instead of arriving at a true/false assertion) are also discussed.

  13. A Size-based Ecosystem Model

    DEFF Research Database (Denmark)

    Ravn-Jonsen, Lars

     Ecosystem Management requires models that can link the ecosystem level to the operation level. This link can be created by an ecosystem production model. Because the function of the individual fish in the marine ecosystem, seen in a trophic context, is closely related to its size, the model groups...... fish according to size. The model summarises individual predation events into ecosystem-level properties, and thereby uses the law of conservation of mass as a framework. This paper provides the background, the conceptual model, basic assumptions, integration of fishing activities, mathematical...... completion, and a numeric implementation. Using two experiments, the model's ability to act as a tool for economic production analysis and regulation-design testing is demonstrated. The presented model is the simplest possible and is built on the principles of (i) size, as the attribute that determines...

  14. Genetic Algorithm Based Microscale Vehicle Emissions Modelling

    Directory of Open Access Journals (Sweden)

    Sicong Zhu

    2015-01-01

    Full Text Available There is a need to match emission estimations accuracy with the outputs of transport models. The overall error rate in long-term traffic forecasts resulting from strategic transport models is likely to be significant. Microsimulation models, whilst high-resolution in nature, may have similar measurement errors if they use the outputs of strategic models to obtain traffic demand predictions. At the microlevel, this paper discusses the limitations of existing emissions estimation approaches. Emission models for predicting emission pollutants other than CO2 are proposed. A genetic algorithm approach is adopted to select the predicting variables for the black box model. The approach is capable of solving combinatorial optimization problems. Overall, the emission prediction results reveal that the proposed new models outperform conventional equations in terms of accuracy and robustness.
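
    A genetic algorithm for selecting predictor variables, as described above, can be sketched as a search over feature bitmasks. The sketch below (synthetic data and a simple correlation-based fitness, not the emission variables of the study) evolves masks by truncation selection, one-point crossover and point mutation:

```python
import random

random.seed(0)

# Synthetic data: y depends only on features 0 and 3 (an assumed toy setup).
N_FEAT, N_OBS = 6, 200
X = [[random.gauss(0, 1) for _ in range(N_FEAT)] for _ in range(N_OBS)]
y = [2.0 * row[0] - 1.5 * row[3] for row in X]

def corr(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    va = sum((u - ma) ** 2 for u in a) ** 0.5
    vb = sum((v - mb) ** 2 for v in b) ** 0.5
    return cov / (va * vb)

def fitness(mask):
    # Reward predictors correlated with y, penalise model size (parsimony).
    score = sum(abs(corr([row[i] for row in X], y))
                for i in range(N_FEAT) if mask[i])
    return score - 0.2 * sum(mask)

def evolve(pop_size=30, gens=40):
    pop = [[random.randint(0, 1) for _ in range(N_FEAT)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, N_FEAT)     # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:             # point mutation
                i = random.randrange(N_FEAT)
                child[i] = 1 - child[i]
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
```

    With this fitness, the optimum is the mask selecting exactly the informative features; real applications would score candidate masks by the predictive accuracy of a fitted emission model instead.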

  15. Overcoming limitations of model-based diagnostic reasoning systems

    Science.gov (United States)

    Holtzblatt, Lester J.; Marcotte, Richard A.; Piazza, Richard L.

    1989-01-01

    The development of a model-based diagnostic system to overcome the limitations of model-based reasoning systems is discussed. It is noted that model-based reasoning techniques can be used to analyze the failure behavior and diagnosability of system and circuit designs as part of the system process itself. One goal of current research is the development of a diagnostic algorithm which can reason efficiently about large numbers of diagnostic suspects and can handle both combinational and sequential circuits. A second goal is to address the model-creation problem by developing an approach for using design models to construct the GMODS model in an automated fashion.

  16. The study of snack behaviors and applying classification tree to filter influencing factors among middle school students in Guangzhou

    Institute of Scientific and Technical Information of China (English)

    田唤; 马绍斌; 范存欣; 刘国宁; 陈然

    2011-01-01

    Objective: To describe the snack behaviors of junior middle school students in Guangzhou, investigate the influencing factors, and provide a reference for schools and health departments in carrying out snack education. Methods: Using convenience sampling, a questionnaire survey was conducted among all students of Tianxiu Middle School in Guangzhou. Snack behaviors were compared by awareness of the Consumer Guide to Children and Adolescents on Snacks and by gender, and a classification tree model was applied to filter the influencing factors. Results: The snack consumption rate was 46.3%, and the rates of good snack behaviors ranged from 12.1% to 72.6%. Students who knew the guide differed significantly in paying attention to snack nutrition (χ2 = 18.317, P < 0.05) and to the quality-guarantee period (χ2 = 54.014, P < 0.05), in not eating snacks while surfing the Internet or watching TV (χ2 = 4.799, P < 0.05), in not eating snacks within 1-2 h before or after meals (χ2 = 56.147, P < 0.05), and in not replacing meals with snacks (χ2 = 7.635, P < 0.05); most snack behaviors also differed significantly by gender (P < 0.05). The classification tree analysis showed that parental attitude, weekly snack allowance, awareness of the guide, and snack packaging influenced the students' daily frequency of snack consumption. Conclusion: The snack consumption concepts and behaviors of junior middle school students still need improvement; strengthening multi-department cooperation and multi-channel education to guide reasonable snack selection and consumption is important.
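
    Classification trees of the kind applied above choose, at each node, the factor yielding the greatest information gain (entropy reduction). A minimal one-split sketch (the records and factor values below are invented for illustration, not the survey data):

```python
import math
from collections import Counter

# Toy records (hypothetical): each row is
# (parental_attitude, weekly_allowance, knows_guide) -> frequent snacker?
rows = [
    (("permissive", "high", "no"), 1),
    (("permissive", "high", "yes"), 1),
    (("permissive", "low", "no"), 1),
    (("permissive", "low", "yes"), 1),
    (("strict", "high", "no"), 1),
    (("strict", "high", "yes"), 0),
    (("strict", "low", "no"), 0),
    (("strict", "low", "yes"), 0),
]

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, feature_idx):
    """Entropy reduction obtained by splitting on one feature."""
    base = entropy([y for _, y in rows])
    split = {}
    for x, y in rows:
        split.setdefault(x[feature_idx], []).append(y)
    remainder = sum(len(ys) / len(rows) * entropy(ys) for ys in split.values())
    return base - remainder

gains = [information_gain(rows, i) for i in range(3)]
root = max(range(3), key=lambda i: gains[i])  # feature chosen at the tree root
```

    In this toy data parental attitude is the most informative factor, so it becomes the root split; a full tree would recurse into each branch with the remaining factors.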

  17. A general model for membrane-based separation processes

    DEFF Research Database (Denmark)

    Soni, Vipasha; Abildskov, Jens; Jonsson, Gunnar Eigil;

    2009-01-01

    behaviour will play an important role. In this paper, modelling of membrane-based processes for separation of gas and liquid mixtures are considered. Two general models, one for membrane-based liquid separation processes (with phase change) and another for membrane-based gas separation are presented...

  18. Agent Based Reasoning in Multilevel Flow Modeling

    DEFF Research Database (Denmark)

    Lind, Morten; Zhang, Xinxin

    2012-01-01

    to launch the MFM Workbench into an agent based environment, which can complement disadvantages of the original software. The agent-based MFM Workbench is centered on a concept called “Blackboard System” and use an event based mechanism to arrange the reasoning tasks. This design will support the new...

  19. MQ-2 A Tool for Prolog-based Model Querying

    DEFF Research Database (Denmark)

    Acretoaie, Vlad; Störrle, Harald

    2012-01-01

    MQ-2 integrates a Prolog console into the MagicDraw1 modeling environment and equips this console with features targeted specifically to the task of querying models. The vision of MQ-2 is to make Prolog-based model querying accessible to both student and expert modelers by offering powerful query...

  20. An approach for activity-based DEVS model specification

    DEFF Research Database (Denmark)

    Alshareef, Abdurrahman; Sarjoughian, Hessam S.; Zarrin, Bahram

    2016-01-01

    activity-based behavior modeling of parallel DEVS atomic models. We consider UML activities and actions as fundamental units of behavior modeling, especially in the presence of recent advances in the UML 2.5 specifications. We describe in detail how to approach activity modeling with a set of elemental...

  1. A Stock Pricing Model Based on Arithmetic Brown Motion

    Institute of Scientific and Technical Information of China (English)

    YAN Yong-xin; HAN Wen-xiu

    2001-01-01

    This paper presents a new stock pricing model based on arithmetic Brownian motion. The model completely overcomes the shortcomings of the Gordon model. With the model, investors can estimate the stock value of surplus companies, deficit companies, zero-growth companies and bankrupt companies, for long-term or short-term investment.

  2. Functional Behavioral Assessment: A School Based Model.

    Science.gov (United States)

    Asmus, Jennifer M.; Vollmer, Timothy R.; Borrero, John C.

    2002-01-01

    This article begins by discussing requirements for functional behavioral assessment under the Individuals with Disabilities Education Act and then describes a comprehensive model for the application of behavior analysis in the schools. The model includes descriptive assessment, functional analysis, and intervention and involves the participation…

  3. Designing Network-based Business Model Ontology

    DEFF Research Database (Denmark)

    Hashemi Nekoo, Ali Reza; Ashourizadeh, Shayegheh; Zarei, Behrouz

    2015-01-01

    Survival in a dynamic environment is not achieved without a map. Scanning and monitoring of the market show business models to be a fruitful tool. But scholars believe that old-fashioned business models are dead, as they do not incorporate the effect of the internet and networks. This paper...

  4. A Network Formation Model Based on Subgraphs

    CERN Document Server

    Chandrasekhar, Arun

    2016-01-01

    We develop a new class of random-graph models for the statistical estimation of network formation that allow for substantial correlation in links. Various subgraphs (e.g., links, triangles, cliques, stars) are generated and their union results in a network. We provide estimation techniques for recovering the rates at which the underlying subgraphs were formed. We illustrate the models via a series of applications including testing for incentives to form cross-caste relationships in rural India, testing to see whether network structure is used to enforce risk-sharing, testing as to whether networks change in response to a community's exposure to microcredit, and show that these models significantly outperform stochastic block models in matching observed network characteristics. We also establish asymptotic properties of the models and various estimators, which requires proving a new Central Limit Theorem for correlated random variables.
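
    The subgraph-union construction can be sketched directly: generate links and triangles independently at given rates, then take the union of their edges. A minimal sketch (parameters are illustrative):

```python
import random
from itertools import combinations

random.seed(1)

def form_network(n, p_link, p_triangle):
    """Union of independently generated subgraphs: pair links and triangles."""
    edges = set()
    for i, j in combinations(range(n), 2):        # direct links
        if random.random() < p_link:
            edges.add((i, j))
    for i, j, k in combinations(range(n), 3):     # triangles
        if random.random() < p_triangle:
            edges |= {(i, j), (i, k), (j, k)}
    return edges

g = form_network(30, 0.02, 0.002)
```

    The triangle subgraphs inject link correlation (clustering) that an independent-links model cannot produce; estimation then works backwards from an observed network to the subgraph formation rates.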

  5. Image based modeling of tumor growth.

    Science.gov (United States)

    Meghdadi, N; Soltani, M; Niroomand-Oscuii, H; Ghalichi, F

    2016-09-01

    Tumors are a main cause of morbidity and mortality worldwide. Despite the efforts of the clinical and research communities, little has been achieved in the past decades in terms of improving the treatment of aggressive tumors. Understanding the underlying mechanism of tumor growth and evaluating the effects of different therapies are valuable steps in predicting the survival time and improving the patients' quality of life. Several studies have been devoted to tumor growth modeling at different levels to improve the clinical outcome by predicting the results of specific treatments. Recent studies have proposed patient-specific models using clinical data usually obtained from clinical images and evaluating the effects of various therapies. The aim of this review is to highlight the imaging role in tumor growth modeling and provide a worthwhile reference for biomedical and mathematical researchers with respect to tumor modeling using the clinical data to develop personalized models of tumor growth and evaluating the effect of different therapies.
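
    Many tumor growth models in this literature are built on simple growth laws such as Gompertzian growth, dV/dt = a·V·ln(K/V), with the carrying capacity K and rate a calibrated from clinical images. A minimal forward-Euler sketch (parameter values are illustrative only):

```python
import math

def simulate_gompertz_growth(v0, a, K, dt=0.01, steps=2000):
    """Forward-Euler integration of dV/dt = a * V * ln(K / V)."""
    v = v0
    traj = [v]
    for _ in range(steps):
        v += dt * a * v * math.log(K / v)
        traj.append(v)
    return traj

# Tumor volume grows from 1 toward the carrying capacity K = 100 (arbitrary units).
traj = simulate_gompertz_growth(v0=1.0, a=0.5, K=100.0)
```

    Patient-specific modeling amounts to re-estimating such parameters from successive images and comparing predicted and observed volumes under a given therapy.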

  6. Simplified Atmospheric Dispersion Model and Model-Based Real Field Estimation System of Air Pollution

    Institute of Scientific and Technical Information of China (English)

    2015-01-01

    The atmospheric dispersion model has been well developed and applied in pollution emergencies and prediction. Based on the sophisticated air diffusion model, this paper proposes a simplified model and some optimizations regarding meteorological and geological conditions. The model is suitable for what is proposed as a Real Field Monitor and Estimation system. The principle of the simplified diffusion model and its optimization is studied. The design of the Real Field Monitor system based on this model and its fundamental implementations are introduced.

  7. Multimedia Data Modeling Based on Temporal Logic and XYZ System

    Institute of Scientific and Technical Information of China (English)

    MA Huadong; LIU Shenquan

    1999-01-01

    This paper proposes a new approach to modeling multimedia data. The new approach is a multimedia data model based on temporal logic and the XYZ System. It supports formal specifications in a multimedia system. Using this model, we can not only specify information units but also design and script a multimedia title in a unified framework. Based on this model, an interactive multimedia authoring environment has been developed.

  8. Comparing Ray-Based and Wave-Based Models of Cross-Beam Energy Transfer

    Science.gov (United States)

    Follett, R. K.; Edgell, D. H.; Shaw, J. G.; Froula, D. H.; Myatt, J. F.

    2016-10-01

    Ray-based models of cross-beam energy transfer (CBET) are used in radiation-hydrodynamics codes to calculate laser-energy deposition. The accuracy of ray-based CBET models is limited by assumptions about the polarization and phase of the interacting laser beams and by the use of a paraxial Wentzel-Kramers-Brillouin (WKB) approximation. A 3-D wave-based solver (LPSE-CBET) is used to study the nonlinear interaction between overlapping laser beams in underdense plasma. A ray-based CBET model is compared to the wave-based model and shows good agreement in simple geometries where the assumptions of the ray-based model are satisfied. Near caustic surfaces, the assumptions of the ray-based model break down and the calculated energy transfer deviates from wave-based calculations. This material is based upon work supported by the Department of Energy National Nuclear Security Administration under Award Number DE-NA0001944.

  9. Image-Based Modeling of Plants and Trees

    CERN Document Server

    Kang, Sing Bang

    2009-01-01

    Plants and trees are among the most complex natural objects. Much work has been done attempting to model them, with varying degrees of success. In this book, we review the various approaches in computer graphics, which we categorize as rule-based, image-based, and sketch-based methods. We describe our approaches for modeling plants and trees using images. Image-based approaches have the distinct advantage that the resulting model inherits the realistic shape and complexity of a real plant or tree. We use different techniques for modeling plants (with relatively large leaves) and trees (with re

  10. Physics-Based Pneumatic Hammer Instability Model Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Florida Turbine Technologies (FTT) proposes to conduct research necessary to develop a physics-based pneumatic hammer instability model for hydrostatic bearings...

  11. Model-Based Reasoning in Humans Becomes Automatic with Training.

    Directory of Open Access Journals (Sweden)

    Marcos Economides

    2015-09-01

    Full Text Available Model-based and model-free reinforcement learning (RL have been suggested as algorithmic realizations of goal-directed and habitual action strategies. Model-based RL is more flexible than model-free but requires sophisticated calculations using a learnt model of the world. This has led model-based RL to be identified with slow, deliberative processing, and model-free RL with fast, automatic processing. In support of this distinction, it has recently been shown that model-based reasoning is impaired by placing subjects under cognitive load--a hallmark of non-automaticity. Here, using the same task, we show that cognitive load does not impair model-based reasoning if subjects receive prior training on the task. This finding is replicated across two studies and a variety of analysis methods. Thus, task familiarity permits use of model-based reasoning in parallel with other cognitive demands. The ability to deploy model-based reasoning in an automatic, parallelizable fashion has widespread theoretical implications, particularly for the learning and execution of complex behaviors. It also suggests a range of important failure modes in psychiatric disorders.
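
    The distinction drawn above can be illustrated with a toy two-stage task: a model-based agent recomputes action values from a learned transition model, so a change in rewards updates its choices immediately, without relearning. A minimal sketch (the task structure and numbers are assumed, not those of the cited study):

```python
# Tiny two-step task: choose a1/a2 at the start state; each leads, with the
# probabilities below, to one of two second-stage states with fixed rewards.
P = {"a1": {"s1": 0.7, "s2": 0.3},   # assumed (learned) transition model
     "a2": {"s1": 0.3, "s2": 0.7}}
R = {"s1": 1.0, "s2": 0.0}           # current second-stage rewards

def model_based_values(P, R):
    """Action values computed on the fly from the model (one-step lookahead)."""
    return {a: sum(p * R[s] for s, p in trans.items())
            for a, trans in P.items()}

Q = model_based_values(P, R)
assert Q["a1"] > Q["a2"]

# Hallmark of model-based control: if rewards change, replanning flips the
# preference at once; a model-free agent would have to relearn cached values.
R_new = {"s1": 0.0, "s2": 1.0}
Q_new = model_based_values(P, R_new)
assert Q_new["a2"] > Q_new["a1"]
```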

  12. Probabilistic reasoning for assembly-based 3D modeling

    KAUST Repository

    Chaudhuri, Siddhartha

    2011-01-01

    Assembly-based modeling is a promising approach to broadening the accessibility of 3D modeling. In assembly-based modeling, new models are assembled from shape components extracted from a database. A key challenge in assembly-based modeling is the identification of relevant components to be presented to the user. In this paper, we introduce a probabilistic reasoning approach to this problem. Given a repository of shapes, our approach learns a probabilistic graphical model that encodes semantic and geometric relationships among shape components. The probabilistic model is used to present components that are semantically and stylistically compatible with the 3D model that is being assembled. Our experiments indicate that the probabilistic model increases the relevance of presented components. © 2011 ACM.

  13. Geometric deviation modeling by kinematic matrix based on Lagrangian coordinate

    Science.gov (United States)

    Liu, Weidong; Hu, Yueming; Liu, Yu; Dai, Wanyi

    2015-09-01

    Typical representations of dimensional and geometric accuracy are limited to the self-representation of dimensional and geometric deviation based on geometry-variation thinking; the interaction between the geometric variation and the posture variation of a multi-rigid body is not included. In this paper, a kinematic matrix model based on Lagrangian coordinates is introduced, with the purpose of providing a unified model for geometric variation and posture variation and their interactive, integrated analysis. A kinematic model with a joint, a local base and a movable base is built. The ideal feature of the functional geometry is treated as the base body, and the fitting feature of the functional geometry is treated as the adjacent movable body; the local base of the kinematic model is fixed onto the ideal geometry, and the movable base is fixed onto the fitting geometry. Furthermore, the geometric deviation is treated as a relative translation or rotation between the movable base and the local base, expressed in Lagrangian coordinates. Moreover, kinematic matrices based on Lagrangian coordinates are constructed for different types of geometric tolerance zones, and the total degrees of freedom of each kinematic model are discussed. Finally, the Lagrangian coordinate library and kinematic matrix library for geometric deviation modeling are illustrated with an example of a block-and-piston fit. Dimensional and geometric tolerances of the shaft-and-hole fitting feature are constructed by kinematic matrices and Lagrangian coordinates, and the results indicate that the proposed kinematic matrix approach is capable and robust in modeling dimensional and geometric tolerances.
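
    Small deviations of this kind are commonly written as homogeneous transforms in which three translations and three small rotations of the movable base relative to the local base appear directly. A minimal sketch (first-order small-angle form, as is usual in tolerance analysis; the numbers are illustrative):

```python
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def deviation_matrix(dx, dy, dz, rx, ry, rz):
    """Homogeneous 4x4 transform for a small deviation of the movable base
    relative to the local base (linearised small-angle rotations)."""
    return [[1.0, -rz,  ry, dx],
            [ rz, 1.0, -rx, dy],
            [-ry,  rx, 1.0, dz],
            [0.0, 0.0, 0.0, 1.0]]

# Compose the nominal (identity) pose with a small positional/angular deviation.
nominal = [[float(i == j) for j in range(4)] for i in range(4)]
dev = deviation_matrix(0.01, 0.0, -0.02, 0.0, 0.001, 0.0)
actual = matmul(nominal, dev)
assert abs(actual[0][3] - 0.01) < 1e-12   # x offset carried through
```

    A tolerance-zone type then simply constrains which of the six deviation parameters are free and over what intervals.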

  14. Statistical Model-Based Face Pose Estimation

    Institute of Scientific and Technical Information of China (English)

    GE Xinliang; YANG Jie; LI Feng; WANG Huahua

    2007-01-01

    A robust face pose estimation approach is proposed by using face shape statistical model approach and pose parameters are represented by trigonometric functions. The face shape statistical model is firstly built by analyzing the face shapes from different people under varying poses. The shape alignment is vital in the process of building the statistical model. Then, six trigonometric functions are employed to represent the face pose parameters. Lastly, the mapping function is constructed between face image and face pose by linearly relating different parameters. The proposed approach is able to estimate different face poses using a few face training samples. Experimental results are provided to demonstrate its efficiency and accuracy.

  15. ALC: automated reduction of rule-based models

    Directory of Open Access Journals (Sweden)

    Gilles Ernst

    2008-10-01

    Full Text Available Abstract Background Combinatorial complexity is a challenging problem for the modeling of cellular signal transduction since the association of a few proteins can give rise to an enormous amount of feasible protein complexes. The layer-based approach is an approximative, but accurate method for the mathematical modeling of signaling systems with inherent combinatorial complexity. The number of variables in the simulation equations is highly reduced and the resulting dynamic models show a pronounced modularity. Layer-based modeling allows for the modeling of systems not accessible previously. Results ALC (Automated Layer Construction is a computer program that highly simplifies the building of reduced modular models, according to the layer-based approach. The model is defined using a simple but powerful rule-based syntax that supports the concepts of modularity and macrostates. ALC performs consistency checks on the model definition and provides the model output in different formats (C MEX, MATLAB, Mathematica and SBML as ready-to-run simulation files. ALC also provides additional documentation files that simplify the publication or presentation of the models. The tool can be used offline or via a form on the ALC website. Conclusion ALC allows for a simple rule-based generation of layer-based reduced models. The model files are given in different formats as ready-to-run simulation files.

  16. Demand forecast model based on CRM

    Science.gov (United States)

    Cai, Yuancui; Chen, Lichao

    2006-11-01

    As the management philosophy of putting the customer at the centre becomes increasingly internalized, forecasting customer demand becomes more and more important. In demand forecasting for customer relationship management, traditional forecast methods have great limitations because of the large uncertainty of demand, which calls for new models. In this paper, the notion is to forecast demand according to the characteristics of potential customers and to build the model accordingly. The model first describes customers using multiple uniform indexes. Second, it acquires characteristic customers on the basis of a data warehouse and data-mining technology. Finally, the most similar characteristic customer is found by comparison, and the demand of a new customer is forecast from that most similar characteristic customer.
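
    The final step described, forecasting a new customer's demand from the most similar characteristic customer, is essentially a nearest-neighbour lookup over the uniform indexes. A minimal sketch (the feature encoding and figures are invented):

```python
def most_similar_customer(new, known):
    """Pick the characteristic customer closest to the new one
    (Euclidean distance over uniform numeric indexes)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(known, key=lambda c: dist(c["features"], new))

# Hypothetical characteristic customers mined from the warehouse:
# features = (age band, income band, past order volume), demand = units/month.
known = [
    {"features": (2, 3, 10), "demand": 120},
    {"features": (4, 1, 2),  "demand": 15},
    {"features": (3, 3, 8),  "demand": 90},
]
new_customer = (3, 3, 9)
forecast = most_similar_customer(new_customer, known)["demand"]
```

    A production system would normalise the indexes and average over several nearest characteristic customers rather than relying on a single match.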

  17. AADL and Model-based Engineering

    Science.gov (United States)

    2014-10-20

    Slide excerpts ("AADL and Model-Based Engineering", Feiler, Oct 20, 2014, Carnegie Mellon University): the modeling approach supports safety analyses such as failure modes and effects analysis, fault trees, and Markov processes. Goal: a general facility for modeling fault/error/failure behaviors. Early work by Steve Vestal included generation of fault trees and Markov models; a 1999 requirements document preceded the AADL standard, with industry input. The approach provides a common fault ontology and error propagation contracts between components.

  18. 3D Object Recognition Based on Linear Lie Algebra Model

    Institute of Scientific and Technical Information of China (English)

    LI Fang-xing; WU Ping-dong; SUN Hua-fei; PENG Lin-yu

    2009-01-01

    A surface model called the fibre bundle model and a 3D object model based on the linear Lie algebra model are proposed. An algorithm for 3D object recognition using the linear Lie algebra models is then presented. It is a convenient recognition method for objects that are symmetric about some axis. Using the presented algorithm, the representation matrices of the fibre or the base curve can be obtained from only a finite number of points of the linear Lie algebra model. Finally, some recognition results on real objects are given.

  19. An Emotional Agent Model Based on Granular Computing

    Directory of Open Access Journals (Sweden)

    Jun Hu

    2012-01-01

    Full Text Available Affective computing is of great significance for achieving intelligent information processing and harmonious communication between human beings and computers. A new emotional agent model is proposed in this paper to give agents the ability to handle emotions, based on granular computing theory and the traditional BDI agent model. First, a new emotion knowledge base based on granular computing for emotion expression is presented in the model. Second, a new emotional reasoning algorithm based on granular computing is proposed. Third, a new emotional agent model based on granular computing is presented. Finally, based on the model, an emotional agent for patient assistance in a hospital is realized; experimental results show that it handles simple emotions efficiently.

  20. Mechanics and model-based control of advanced engineering systems

    CERN Document Server

    Irschik, Hans; Krommer, Michael

    2014-01-01

    Mechanics and Model-Based Control of Advanced Engineering Systems collects 32 contributions presented at the International Workshop on Advanced Dynamics and Model Based Control of Structures and Machines, which took place in St. Petersburg, Russia in July 2012. The workshop continued a series of international workshops, which started with a Japan-Austria Joint Workshop on Mechanics and Model Based Control of Smart Materials and Structures and a Russia-Austria Joint Workshop on Advanced Dynamics and Model Based Control of Structures and Machines. In the present volume, 10 full-length papers based on presentations from Russia, 9 from Austria, 8 from Japan, 3 from Italy, one from Germany and one from Taiwan are included, which represent the state of the art in the field of mechanics and model based control, with particular emphasis on the application of advanced structures and machines.

  1. Understanding uncertainty in process-based hydrological models

    Science.gov (United States)

    Clark, M. P.; Kavetski, D.; Slater, A. G.; Newman, A. J.; Marks, D. G.; Landry, C.; Lundquist, J. D.; Rupp, D. E.; Nijssen, B.

    2013-12-01

    Building an environmental model requires making a series of decisions regarding the appropriate representation of natural processes. While some of these decisions can already be based on well-established physical understanding, gaps in our current understanding of environmental dynamics, combined with incomplete knowledge of properties and boundary conditions of most environmental systems, make many important modeling decisions far more ambiguous. There is consequently little agreement regarding what a 'correct' model structure is, especially at relatively larger spatial scales such as catchments and beyond. In current practice, faced with such a range of decisions, different modelers will generally make different modeling decisions, often on an ad hoc basis, based on their balancing of process understanding, the data available to evaluate the model, the purpose of the modeling exercise, and their familiarity with or investment in an existing model infrastructure. This presentation describes development and application of multiple-hypothesis models to evaluate process-based hydrologic models. Our numerical model uses robust solutions of the hydrology and thermodynamic governing equations as the structural core, and incorporates multiple options to represent the impact of different modeling decisions, including multiple options for model parameterizations (e.g., below-canopy wind speed, thermal conductivity, storage and transmission of liquid water through soil, etc.), as well as multiple options for model architecture, that is, the coupling and organization of different model components (e.g., representations of sub-grid variability and hydrologic connectivity, coupling with groundwater, etc.). 
Application of this modeling framework across a collection of different research basins demonstrates that differences among model parameterizations are often overwhelmed by differences among equally-plausible model parameter sets, while differences in model architecture lead

  2. Model-based Prognostics under Limited Sensing

    Data.gov (United States)

    National Aeronautics and Space Administration — Prognostics is crucial to providing reliable condition-based maintenance decisions. To obtain accurate predictions of component life, a variety of sensors are often...

  3. Flood forecasting for River Mekong with data-based models

    Science.gov (United States)

    Shahzad, Khurram M.; Plate, Erich J.

    2014-09-01

    In many regions of the world, the task of flood forecasting is made difficult because only a limited database is available for generating a suitable forecast model. This paper demonstrates that in such cases parsimonious data-based hydrological models for flood forecasting can be developed if the special conditions of climate and topography are used to advantage. As an example, the middle reach of River Mekong in South East Asia is considered, where a database of discharges from seven gaging stations on the river and 31 rainfall stations on the subcatchments between gaging stations is available for model calibration. Special conditions existing for River Mekong are identified and used in developing first a network connecting all discharge gages and then models for forecasting discharge increments between gaging stations. Our final forecast model (Model 3) is a linear combination of two structurally different basic models: a model (Model 1) using linear regressions for forecasting discharge increments, and a model (Model 2) using rainfall-runoff models. Although the model based on linear regressions works reasonably well for short times, better results are obtained with rainfall-runoff modeling. However, forecast accuracy of Model 2 is limited by the quality of rainfall forecasts. For best results, both models are combined by taking weighted averages to form Model 3. Model quality is assessed by means of both persistence index PI and standard deviation of forecast error.
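
    The persistence index PI used above to assess forecast quality compares the model's squared errors with those of a naive persistence forecast (repeat the last observed value); PI = 1 means a perfect forecast, PI = 0 means no better than persistence. A minimal sketch (toy numbers, not Mekong data):

```python
def persistence_index(observed, forecast, lead=1):
    """PI = 1 - SSE(forecast) / SSE(naive persistence forecast),
    where persistence simply repeats the value observed `lead` steps earlier."""
    sse_model = sum((o - f) ** 2
                    for o, f in zip(observed[lead:], forecast[lead:]))
    sse_persist = sum((observed[t] - observed[t - lead]) ** 2
                      for t in range(lead, len(observed)))
    return 1.0 - sse_model / sse_persist

obs = [10.0, 12.0, 15.0, 19.0, 24.0, 28.0]     # rising hydrograph (toy)
good = [10.0, 12.1, 14.8, 19.3, 23.7, 28.2]    # close to observations
naive = obs[:1] + obs[:-1]                      # pure persistence, shifted by one
assert persistence_index(obs, good) > 0.9
assert abs(persistence_index(obs, naive)) < 1e-9  # persistence itself scores 0
```

    On a slowly responding river such as the Mekong, persistence is a strong baseline, which is why PI is a more demanding score there than plain correlation.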

  4. Selection of probability based weighting models for Boolean retrieval system

    Energy Technology Data Exchange (ETDEWEB)

    Ebinuma, Y. (Japan Atomic Energy Research Inst., Tokai, Ibaraki. Tokai Research Establishment)

    1981-09-01

Automatic weighting models based on probability theory were studied to determine whether they can be applied to Boolean search logic, including the logical sum. The INIS database was used for searches with one particular search formula. Among sixteen models, three with good ranking performance were selected. These three models were then applied to searches with nine search formulas in the same database. It was found that two of the models show slightly better average ranking performance, while the third, the simplest one, also seems practical.
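The general idea of ranking the hits of a Boolean query with a probabilistic term weight can be sketched as follows. The collection, the query, and the simplified Robertson/Sparck-Jones-style idf weight are illustrative assumptions, not the sixteen models or the INIS data of the study.

```python
import math

# Illustrative sketch: rank documents matching a Boolean OR query by summing
# a probabilistic term weight over the matched terms.

docs = {
    1: {"reactor", "safety", "cooling"},
    2: {"reactor", "fuel"},
    3: {"safety", "regulation"},
    4: {"fuel", "storage"},
    5: {"cooling", "pumps"},
    6: {"waste", "disposal"},
}
query = {"reactor", "safety"}          # Boolean query: reactor OR safety

N = len(docs)
df = {t: sum(t in d for d in docs.values()) for t in query}
# Simplified Robertson/Sparck-Jones idf-style weight
w = {t: math.log((N - df[t] + 0.5) / (df[t] + 0.5)) for t in query}

hits = {doc_id: sum(w[t] for t in query & terms)
        for doc_id, terms in docs.items() if query & terms}
ranking = sorted(hits, key=hits.get, reverse=True)
print(ranking)
```

The Boolean operator decides *which* documents are retrieved; the probabilistic weights only order them, which is the role such weighting models play on top of a Boolean system.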

  5. Cognitive Modeling for Agent-Based Simulation of Child Maltreatment

    Science.gov (United States)

    Hu, Xiaolin; Puddy, Richard

This paper extends previous work to develop cognitive modeling for agent-based simulation of child maltreatment (CM). The developed model is inspired by parental efficacy, parenting stress, and the theory of planned behavior. It provides an explanatory, process-oriented model of CM and incorporates causality relationships and feedback loops from different factors in the social ecology in order to simulate the dynamics of CM. We describe the model and present simulation results to demonstrate the features of this model.

  6. Gradient-based adaptation of continuous dynamic model structures

    Science.gov (United States)

    La Cava, William G.; Danai, Kourosh

    2016-01-01

    A gradient-based method of symbolic adaptation is introduced for a class of continuous dynamic models. The proposed model structure adaptation method starts with the first-principles model of the system and adapts its structure after adjusting its individual components in symbolic form. A key contribution of this work is its introduction of the model's parameter sensitivity as the measure of symbolic changes to the model. This measure, which is essential to defining the structural sensitivity of the model, not only accommodates algebraic evaluation of candidate models in lieu of more computationally expensive simulation-based evaluation, but also makes possible the implementation of gradient-based optimisation in symbolic adaptation. The proposed method is applied to models of several virtual and real-world systems that demonstrate its potential utility.

  7. Thermodynamics-based models of transcriptional regulation with gene sequence.

    Science.gov (United States)

    Wang, Shuqiang; Shen, Yanyan; Hu, Jinxing

    2015-12-01

Quantitative models of gene regulatory activity have the potential to improve our mechanistic understanding of transcriptional regulation. However, the few models available today have been based on simplistic assumptions about the sequences being modeled or heuristic approximations of the underlying regulatory mechanisms. In this work, we have developed a thermodynamics-based model to predict gene expression driven by any DNA sequence. The proposed model relies on a continuous-time, differential-equation description of transcriptional dynamics. The sequence features of the promoter are exploited to derive the binding affinity based on statistical molecular thermodynamics. Experimental results show that the proposed model can effectively identify the activity levels of transcription factors and the regulatory parameters. Compared with previous models, the proposed model reveals more biological sense.
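The statistical-thermodynamic core of such models is the Boltzmann/Langmuir form for site occupancy. The concentration and binding-energy values below are illustrative assumptions, not parameters fitted in the paper.

```python
import math

# Minimal sketch of a thermodynamic binding step: the probability that a
# transcription factor (TF) occupies a site follows a Langmuir isotherm with
# an equilibrium constant set by the binding free energy.

R = 1.987e-3   # gas constant, kcal/(mol*K)
T = 310.0      # physiological temperature, K

def occupancy(tf_conc_molar, delta_g_kcal):
    """P(bound) = c*K / (1 + c*K) with K = exp(-dG / RT)."""
    K = math.exp(-delta_g_kcal / (R * T))
    return tf_conc_molar * K / (1.0 + tf_conc_molar * K)

# More negative dG (stronger site) -> higher occupancy at the same [TF].
weak = occupancy(1e-9, -9.0)
strong = occupancy(1e-9, -13.0)
print(round(weak, 3), round(strong, 3))
```

In a full sequence-based model, `delta_g_kcal` would itself be derived from the promoter sequence (e.g. from a position weight matrix) rather than supplied by hand.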

  8. Model Based Analysis of Insider Threats

    DEFF Research Database (Denmark)

    Chen, Taolue; Han, Tingting; Kammueller, Florian

    2016-01-01

    In order to detect malicious insider attacks it is important to model and analyse infrastructures and policies of organisations and the insiders acting within them. We extend formal approaches that allow modelling such scenarios by quantitative aspects to enable a precise analysis of security...... designs. Our framework enables evaluating the risks of an insider attack to happen quantitatively. The framework first identifies an insider's intention to perform an inside attack, using Bayesian networks, and in a second phase computes the probability of success for an inside attack by this actor, using...... probabilistic model checking. We provide prototype tool support using Matlab for Bayesian networks and PRISM for the analysis of Markov decision processes, and validate the framework with case studies....

  9. Active Appearance Model Based Hand Gesture Recognition

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

This paper addresses the application of hand gesture recognition in monocular image sequences using the Active Appearance Model (AAM). The proposed algorithm is composed of constructing AAMs and fitting the models to the region of interest. In the training stage, according to manually labeled feature points, the relative AAM is constructed and the corresponding average feature is obtained. In the recognition stage, the hand gesture region is first segmented by skin and movement cues. Secondly, the models are fitted to the image that includes the hand gesture, and the relative features are extracted. Thirdly, classification is done by comparing the extracted features with the average features. 30 different gestures of Chinese sign language are applied to test the effectiveness of the method. The experimental results indicate good performance of the algorithm.

  10. Improved world-based language model

    Institute of Scientific and Technical Information of China (English)

    CHEN Yong(陈勇); CHAN Kwok-ping

    2004-01-01

In order to construct a good language model for use in the postprocessing phase of a recognition system, a smoothing technique must be used to solve the data sparseness problem. In the past, many smoothing techniques have been proposed; among them, Katz's smoothing technique is well known. However, we found a weakness in Katz's smoothing technique. We improved this approach by incorporating a kind of special Chinese language information and Chinese word class information into the language model. We tested the new smoothing technique with a Chinese character recognition system. The experimental results showed that better performance can be achieved.
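The backoff idea behind such smoothing can be sketched with a fixed-discount bigram model: seen bigrams are discounted and the freed probability mass backs off to the unigram distribution. Real Katz smoothing derives its discounts from Good-Turing counts, so the fixed discount below is an illustrative simplification only.

```python
from collections import Counter

# Simplified backoff sketch in the spirit of Katz smoothing.

corpus = "the cat sat on the mat the cat ate".split()
unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))
D = 0.5  # fixed discount (assumption; Katz derives it from Good-Turing)

def p_backoff(w1, w2):
    c1 = unigrams[w1]
    c12 = bigrams[(w1, w2)]
    if c12 > 0:
        return (c12 - D) / c1            # discounted bigram estimate
    # mass released by discounting, spread over unseen continuations
    seen = [b for (a, b) in bigrams if a == w1]
    alpha = D * len(seen) / c1
    backoff_mass = sum(unigrams[b] for b in unigrams if b not in seen)
    return alpha * unigrams[w2] / backoff_mass if backoff_mass else 0.0

print(round(p_backoff("the", "cat"), 3))   # seen bigram
print(round(p_backoff("cat", "on"), 3))    # backed-off estimate
```

For each history `w1` the seen and backed-off parts together sum to one, which is the property a smoothing scheme must preserve.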

  11. Controlling reuse in pattern-based model-to-model transformations

    OpenAIRE

    Guerra, Esther,; De Lara, Juan,; Orejas, Fernando

    2010-01-01

    Model-to-model transformation is a central activity in Model-Driven Engineering that consists of transforming models from a source to a target language. Pattern-based model-to-model transformation is our approach for specifying transformations in a declarative, relational and formal style. The approach relies on patterns describing allowed or forbidden relations between two models. These patterns are compiled into operational mechanisms to perform forward and backward transformations. Inspire...

  12. CSPBuilder - CSP based Scientific Workflow Modelling

    DEFF Research Database (Denmark)

    Friborg, Rune Møllegaard; Vinter, Brian

    2008-01-01

not previously been called for. The framework is CSP presented as a scientific workflow model, specialized for scientific computing applications. The purpose of the framework is to enable scientists to exploit large parallel computation resources, which has previously been hard due to the difficulty of concurrent

  13. An Optimization Model Based on Game Theory

    Directory of Open Access Journals (Sweden)

    Yang Shi

    2014-04-01

Full Text Available Game theory has a wide range of applications in economics, but it is seldom used in the field of computer science, especially in optimization algorithms. In this paper, we integrate the thinking of game theory into an optimization algorithm and propose a new optimization model that can be widely used in optimization processing. This optimization model is divided into two types, called “the complete consistency” and “the partial consistency”; the partial consistency adds a disturbance strategy on the basis of the complete consistency. When the model’s consistency is satisfied, the Nash equilibrium of the optimization model is globally optimal, and when the consistency is not met, the presence of the perturbation strategy can improve the applicability of the algorithm. Basic experiments suggest that this optimization model has broad applicability and good performance, and it gives a new idea for some intractable problems in the field of artificial intelligence.

  14. Automata-Based CSL Model Checking

    DEFF Research Database (Denmark)

    Zhang, Lijun; Jansen, David N.; Nielson, Flemming;

    2011-01-01

    For continuous-time Markov chains, the model-checking problem with respect to continuous-time stochastic logic (CSL) has been introduced and shown to be decidable by Aziz, Sanwal, Singhal and Brayton in 1996. The presented decision procedure, however, has exponential complexity. In this paper, we...

  15. Model-based satellite image fusion

    DEFF Research Database (Denmark)

    Aanæs, Henrik; Sveinsson, J. R.; Nielsen, Allan Aasbjerg

    2008-01-01

    A method is proposed for pixel-level satellite image fusion derived directly from a model of the imaging sensor. By design, the proposed method is spectrally consistent. It is argued that the proposed method needs regularization, as is the case for any method for this problem. A framework for pixel...

  16. A Qualitative Acceleration Model Based on Intervals

    Directory of Open Access Journals (Sweden)

    Ester MARTINEZ-MARTIN

    2013-08-01

    Full Text Available On the way to autonomous service robots, spatial reasoning plays a main role since it properly deals with problems involving uncertainty. In particular, we are interested in knowing people's pose to avoid collisions. With that aim, in this paper, we present a qualitative acceleration model for robotic applications including representation, reasoning and a practical application.

  17. Agent-based Modeling Methodology for Analyzing Weapons Systems

    Science.gov (United States)

    2015-03-26

Figure 14: Simulation Study Methodology for the Weapon System Analysis, Metrics Definition and Data Collection. The analysis plan calls for... AGENT-BASED MODELING METHODOLOGY FOR ANALYZING WEAPONS SYSTEMS. Thesis by Casey D. Connors, Major, USA, presented to the Faculty, Department of Operational Sciences.

  18. Software Reuse of Mobile Systems based on Modelling

    Directory of Open Access Journals (Sweden)

    Guo Ping

    2016-01-01

Full Text Available This paper presents an architectural-style-based modelling approach for the architectural design and analysis of mobile systems. The approach is developed based on UML-like meta-models and graph transformation techniques to support sound methodological principles, formal analysis and refinement. The approach can support mobile system development.

  19. Towards automatic model based controller design for reconfigurable plants

    DEFF Research Database (Denmark)

    Michelsen, Axel Gottlieb; Stoustrup, Jakob; Izadi-Zamanabadi, Roozbeh

    2008-01-01

    This paper introduces model-based Plug and Play Process Control, a novel concept for process control, which allows a model-based control system to be reconfigured when a sensor or an actuator is plugged into a controlled process. The work reported in this paper focuses on composing a monolithic m...

  20. Neural mass model-based tracking of anesthetic brain states

    NARCIS (Netherlands)

    Kuhlmann, Levin; Freestone, Dean R.; Manton, Jonathan H.; Heyse, Bjorn; Vereecke, Hugo E. M.; Lipping, Tarmo; Struys, Michel M. R. F.; Liley, David T. J.

    2016-01-01

    Neural mass model-based tracking of brain states from electroencephalographic signals holds the promise of simultaneously tracking brain states while inferring underlying physiological changes in various neuroscientific and clinical applications. Here, neural mass model-based tracking of brain state

  1. An Active Learning Exercise for Introducing Agent-Based Modeling

    Science.gov (United States)

    Pinder, Jonathan P.

    2013-01-01

    Recent developments in agent-based modeling as a method of systems analysis and optimization indicate that students in business analytics need an introduction to the terminology, concepts, and framework of agent-based modeling. This article presents an active learning exercise for MBA students in business analytics that demonstrates agent-based…

  2. A Granular Computing Model Based on Tolerance relation

    Institute of Scientific and Technical Information of China (English)

    WANG Guo-yin; HU Feng; HUANG Hai; WU Yu

    2005-01-01

Granular computing is a new intelligent computing theory based on the partition of problem concepts. Processing incomplete information systems directly is an important problem in rough set theory. In this paper, a granular computing model based on a tolerance relation for processing incomplete information systems is developed. Furthermore, a criterion for attribute necessity is proposed in this model.
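The tolerance relation for incomplete tables can be sketched directly: two objects are tolerant when they agree on every attribute where neither value is missing. The table below is an illustrative toy example, not data from the paper.

```python
# Tolerance relation for an incomplete information table: None marks a
# missing value, and missing values are compatible with anything.

table = {
    "x1": {"color": "red",  "size": "big",  "weight": None},
    "x2": {"color": "red",  "size": None,   "weight": "heavy"},
    "x3": {"color": "blue", "size": "big",  "weight": "light"},
}

def tolerant(a, b):
    """True if a and b agree on all attributes where both are known."""
    return all(a[k] is None or b[k] is None or a[k] == b[k] for k in a)

def tolerance_class(name):
    """The granule of objects tolerant with the named object."""
    return {m for m in table if tolerant(table[name], table[m])}

print(sorted(tolerance_class("x1")))
```

Unlike the equivalence classes of classical rough sets, these granules may overlap without partitioning the universe, which is exactly what makes tolerance relations suitable for incomplete data.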

  3. Agent-based modelling of socio-technical systems

    CERN Document Server

    van Dam, Koen H; Lukszo, Zofia

    2012-01-01

    Here is a practical introduction to agent-based modelling of socio-technical systems, based on methodology developed at TU Delft, which has been deployed in a number of case studies. Offers theory, methods and practical steps for creating real-world models.

  4. Group Contribution Based Process Flowsheet Synthesis, Design and Modelling

    DEFF Research Database (Denmark)

    Gani, Rafiqul; d'Anterroches, Loïc

    2004-01-01

This paper presents a process-group-contribution method to model, simulate and synthesize a flowsheet. The process-group based representation of a flowsheet together with a process "property" model are presented. The process-group based synthesis method is developed on the basis of the computer...

  5. Model-reduced gradient-based history matching

    NARCIS (Netherlands)

    Kaleta, M.P.

    2011-01-01

    Since the world's energy demand increases every year, the oil & gas industry makes a continuous effort to improve fossil fuel recovery. Physics-based petroleum reservoir modeling and closed-loop model-based reservoir management concept can play an important role here. In this concept measured data a

  6. Comparing models for quantitative risk assessment: an application to the European Registry of foreign body injuries in children.

    Science.gov (United States)

    Berchialla, Paola; Scarinzi, Cecilia; Snidero, Silvia; Gregori, Dario

    2016-08-01

Risk assessment is the systematic study of decisions subject to uncertain consequences. Increasing interest has focused on modeling techniques such as Bayesian networks because of their capability of (1) combining, in a probabilistic framework, different types of evidence, including both expert judgments and objective data; (2) overturning previous beliefs in the light of new information; and (3) making predictions even with incomplete data. In this work, we compare Bayesian networks with other classical quantitative risk assessment techniques such as neural networks, classification trees, random forests and logistic regression models. Hybrid approaches, combining both classification trees and Bayesian networks, are also considered. Among Bayesian networks, a clear distinction is made between a purely data-driven approach and the combination of expert knowledge with objective data. The aim of this paper is to evaluate which of these models can best be applied, in the framework of quantitative risk assessment, to assess the safety of children who are exposed to the risk of inhalation/insertion/aspiration of consumer products. The issue of preventing injuries in children is of paramount importance, in particular where product design is involved: quantifying the risk associated with product characteristics can be of great usefulness in addressing product safety design regulation. Data from the European Registry of Foreign Body Injuries formed the starting evidence for risk assessment. Results showed that Bayesian networks offer both ease of interpretability and accuracy in making predictions, even if simpler models like logistic regression still performed well.

  7. (Re)configuration based on model generation

    CERN Document Server

    Friedrich, Gerhard; Falkner, Andreas A; Haselböck, Alois; Schenner, Gottfried; Schreiner, Herwig; 10.4204/EPTCS.65.3

    2011-01-01

    Reconfiguration is an important activity for companies selling configurable products or services which have a long life time. However, identification of a set of required changes in a legacy configuration is a hard problem, since even small changes in the requirements might imply significant modifications. In this paper we show a solution based on answer set programming, which is a logic-based knowledge representation formalism well suited for a compact description of (re)configuration problems. Its applicability is demonstrated on simple abstractions of several real-world scenarios. The evaluation of our solution on a set of benchmark instances derived from commercial (re)configuration problems shows its practical applicability.

  8. Kinetic data base for combustion modeling

    Energy Technology Data Exchange (ETDEWEB)

    Tsang, W.; Herron, J.T. [National Institute of Standards and Technology, Gaithersburg, MD (United States)

    1993-12-01

The aim of this work is to develop a set of evaluated rate constants for use in the simulation of hydrocarbon combustion. The approach has been to begin with the small molecules and then introduce larger species with the various structural elements that can be found in all hydrocarbon fuels and decomposition products. Currently, the data base contains most of the species present in combustion systems with up to four carbon atoms. Thus, practically all the structural groupings found in aliphatic compounds have now been captured. The direction of future work is the addition of aromatic compounds to the data base.

  9. Geometric Feature Extraction and Model Reconstruction Based on Scattered Data

    Institute of Scientific and Technical Information of China (English)

    胡鑫; 习俊通; 金烨

    2004-01-01

A method of 3D model reconstruction based on scattered point data in reverse engineering is presented here. The topological relationship of the scattered points was established first, then the data set was triangulated to reconstruct the mesh surface model. The curvatures of the cloud data were calculated based on the mesh surface, and the point data were segmented by an edge-based method; every patch of data was fitted by a quadric or freeform surface, with the type of quadric surface decided automatically from the parameters, and at last the whole CAD model was created. An example of a mouse model was employed to confirm the effectiveness of the algorithm.

  10. A method to manage the model base in DSS

    Institute of Scientific and Technical Information of China (English)

    孙成双; 李桂君

    2004-01-01

How to manage and use models in DSS is a most important subject. Generally, it costs a lot of money and time to develop a model base management system in the development of a DSS, and most such systems are simple in function or cannot be used efficiently in practice. It is an effective, applicable and economical choice to make use of the interfaces of professional computer software to develop a model base management system. This paper presents a method of using MATLAB, a well-known technical computing environment, as the development platform of a model base management system. The main functional framework of a MATLAB-based model base management system is discussed. Finally, its feasible application is illustrated in the field of construction projects.

  11. Robust speech features representation based on computational auditory model

    Institute of Scientific and Technical Information of China (English)

    LU Xugang; JIA Chuan; DANG Jianwu

    2004-01-01

A speech signal processing and feature extraction method based on a computational auditory model is proposed. The computational model is based on psychological and physiological knowledge and digital signal processing methods. For each stage of the hearing perception system, there is a corresponding computational model to simulate its function, and speech features are extracted from it; in each stage, features at different levels are extracted. A further processing step for the primary auditory spectrum, based on lateral inhibition, is proposed to extract much more robust speech features. All these features can be regarded as internal representations of the speech stimulus in the hearing system. Robust speech recognition experiments were conducted to test the robustness of the features. Results show that the representations based on the proposed computational auditory model are robust representations of speech signals.
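The lateral-inhibition step can be sketched as each spectral channel being suppressed by a fraction of its neighbours, which sharpens peaks. The kernel weight and the toy spectrum below are illustrative assumptions, not the paper's auditory model.

```python
# Toy lateral inhibition over a one-dimensional "auditory spectrum":
# each channel subtracts k/2 times each immediate neighbour, then is
# half-wave rectified (negative activations clipped to zero).

def lateral_inhibition(spectrum, k=0.5):
    out = []
    for i, s in enumerate(spectrum):
        left = spectrum[i - 1] if i > 0 else 0.0
        right = spectrum[i + 1] if i < len(spectrum) - 1 else 0.0
        out.append(max(0.0, s - k * 0.5 * (left + right)))
    return out

spectrum = [0.1, 0.2, 0.9, 0.3, 0.1]     # broad spectral peak
print(lateral_inhibition(spectrum))
```

The peak channel survives almost intact while its shoulders are suppressed, which is the contrast-enhancement effect that makes the derived features more robust to broadband noise.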

  12. A Family of RBAC- Based Workflow Authorization Models

    Institute of Scientific and Technical Information of China (English)

    HONG Fan; XING Guang-lin

    2005-01-01

A family of RBAC-based workflow authorization models, called RWAM, is proposed. RWAM consists of a basic model and other models constructed from it. The basic model provides the notion of temporal permission, which means that a user can perform a certain operation on a task only for a time interval; this not only ensures that only authorized users can execute a task but also that the authorization flow is synchronized with the workflow. The two advanced models of RWAM deal with role hierarchy and constraints, respectively. RWAM ranges from simple to complex and provides a general reference model for other research and development in this area.
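The notion of temporal permission can be sketched as an ordinary RBAC permission tuple extended with a validity interval. The role, task and interval names below are hypothetical illustrations, not the paper's formal definitions.

```python
from datetime import datetime

# Minimal sketch of a temporal permission: (role, task, operation) is
# authorized only inside [start, end].

class TemporalPermission:
    def __init__(self, role, task, operation, start, end):
        self.role, self.task, self.operation = role, task, operation
        self.start, self.end = start, end

    def allows(self, role, task, operation, when):
        return (role == self.role and task == self.task and
                operation == self.operation and
                self.start <= when <= self.end)

perm = TemporalPermission(
    role="reviewer", task="approve_invoice", operation="sign",
    start=datetime(2024, 1, 1, 9, 0), end=datetime(2024, 1, 1, 17, 0))

print(perm.allows("reviewer", "approve_invoice", "sign",
                  datetime(2024, 1, 1, 12, 0)))   # inside the interval
print(perm.allows("reviewer", "approve_invoice", "sign",
                  datetime(2024, 1, 2, 12, 0)))   # after it
```

Tying the interval to the task's scheduled activation is what keeps the authorization flow synchronized with the workflow: the permission simply does not exist outside the task's time window.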

  13. Characteristics of a Logistics-Based Business Model

    OpenAIRE

    Sandberg, Erik; Kihlén, Tobias; Abrahamsson, Mats

    2011-01-01

    In companies where excellence in logistics is decisive for the outperformance of competitors and logistics has an outspoken role for the strategy of the firm, there is present what we refer to here as a “logistics-based business model.” Based on a multiple case study of three Nordic retail companies, the purpose of this article is to explore the characteristics of such a logistics-based business model. As such, this research helps to provide structure to logistics-based business models and id...

  14. A data base for galaxy evolution modeling

    NARCIS (Netherlands)

    Leitherer, C; Alloin, D; FritzVonAlvensleben, U; Gallagher, JS; Huchra, JP; Matteucci, F; OConnell, RW; Beckman, JE; Bertelli, GP; Bica, E; Boisson, C; Bonatto, C; Bothun, GD; Bressan, A; Brodie, JP; Bruzual, G; Burstein, D; Buser, R; Caldwell, N; Casuso, E; Cervino, M; Charlot, S; Chavez, M; Chiosi, C; Christian, CA; Cuisinier, F; Dallier, R; deKoter, A; Delisle, S; Diaz, AI; Dopita, MA; Dorman, B; Fagotto, F; Fanelli, MN; Fioc, M; GarciaVargas, ML; Girardi, L; Goldader, JD; Hardy, E; Heckman, TM; Iglesias, J; Jablonka, P; Joly, M; Jones, L; Kurth, O; Lancon, A; Lejeune, T; Loxen, J; Maeder, A; Malagnini, ML; Marigo, P; MasHesse, JM; Meynet, G; Moller, CS; Molla, ML; Morossi, C; Nasi, E; Nichols, JS; Odegaard, KJR; Parker, JWM; Pastoriza, MG; Peletier, R; Robert, C; RoccaVolmerange, B; Schaerer, D; Schmidt, A; Schmitt, HR; Schommer, RA; Schmutz, W; Silva, L; Stasinska, G; Sutherland, RS; Tantalo, R; Traat, P; Vallenari, A; Vazdekis, A; Walborn, NR; Worthey, G

    1996-01-01

    This paper represents a collective effort to provide an extensive electronic data base useful for the interpretation of the spectra and evolution of galaxies. A broad variety of empirical and theoretical data is discussed here, and the data are made fully available in the AAS CD-ROM Series, Vol. 7.

  15. Fujisaki Model Based Intonation Modeling for Korean TTS System

    Science.gov (United States)

    Kim, Byeongchang; Lee, Jinsik; Lee, Gary Geunbae

One of the enduring problems in developing a high-quality TTS (text-to-speech) system is pitch contour generation. Considering language-specific knowledge, an adjusted Fujisaki model for a Korean TTS system is introduced along with refined machine-learning features. The results of quantitative and qualitative evaluations show the validity of our system: the accuracy of the phrase command prediction is 0.8928; the correlations of the predicted amplitudes of a phrase command and an accent command are 0.6644 and 0.6002, respectively; our method achieved the level of "fair" naturalness (3.6) on a MOS scale for generated F0 curves.
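In the standard Fujisaki formulation, ln F0 is a baseline plus the responses to phrase commands (impulses) and accent commands (pedestals). The sketch below uses conventional default time constants and made-up command values, not the fitted Korean parameters of the paper.

```python
import math

# Sketch of a Fujisaki F0 contour:
#   ln F0(t) = ln Fb + sum Ap*Gp(t - T0) + sum Aa*[Ga(t - T1) - Ga(t - T2)]

ALPHA, BETA, GAMMA = 2.0, 20.0, 0.9   # conventional default constants

def Gp(t):   # phrase control: impulse response of a 2nd-order system
    return ALPHA * ALPHA * t * math.exp(-ALPHA * t) if t > 0 else 0.0

def Ga(t):   # accent control: ceiling-limited step response
    return min(1.0 - (1.0 + BETA * t) * math.exp(-BETA * t), GAMMA) if t > 0 else 0.0

def f0(t, fb, phrases, accents):
    """phrases: list of (Ap, T0); accents: list of (Aa, T1, T2)."""
    v = math.log(fb)
    v += sum(ap * Gp(t - t0) for ap, t0 in phrases)
    v += sum(aa * (Ga(t - t1) - Ga(t - t2)) for aa, t1, t2 in accents)
    return math.exp(v)

phrases = [(0.5, 0.0)]           # one phrase command at t = 0 s
accents = [(0.4, 0.3, 0.6)]      # one accent command from 0.3 s to 0.6 s
curve = [f0(t / 100.0, 120.0, phrases, accents) for t in range(100)]
print(round(min(curve), 1), round(max(curve), 1))
```

Predicting the commands (their timing and amplitudes) is the machine-learning part of the system; once predicted, the contour itself follows deterministically from these equations.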

  16. Warehouse Optimization Model Based on Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Guofeng Qin

    2013-01-01

Full Text Available This paper takes the Bao Steel logistics automated warehouse system as an example. The premise is to keep the center of gravity of the shelf below half of the shelf height; as a result, the time cost of getting or putting goods on the shelf is reduced, and the distance between goods of the same kind is also reduced. A multiobjective optimization model is constructed and a genetic algorithm is used to optimize it, yielding a local optimal solution. Before optimization, the average time cost of getting or putting goods was 4.52996 s, and the average distance between goods of the same kind was 2.35318 m. After optimization, the average time cost is 4.28859 s, and the average distance is 1.97366 m. From this analysis, we can draw the conclusion that this model can improve the efficiency of cargo storage.
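A genetic algorithm for slot assignment can be sketched as evolving permutations of item groups over shelf levels so that frequently accessed items sit low. The fitness function, item data and GA settings below are illustrative assumptions, not the Bao Steel model.

```python
import random

# Toy GA: assign 6 item groups to 6 slots (a permutation) minimizing
# total access cost = frequency * shelf level of the assigned slot.

random.seed(1)
frequency = [1, 3, 9, 7, 1, 5]        # access frequency per item group
levels = [1, 1, 2, 2, 3, 3]           # shelf level of each slot (1 = bottom)

def cost(assign):
    return sum(frequency[item] * levels[slot] for slot, item in enumerate(assign))

def crossover(a, b):                  # order crossover keeping a permutation
    cut = random.randrange(1, len(a))
    head = a[:cut]
    return head + [x for x in b if x not in head]

def mutate(a):                        # swap two positions in place
    i, j = random.sample(range(len(a)), 2)
    a[i], a[j] = a[j], a[i]

pop = [random.sample(range(6), 6) for _ in range(20)]
for _ in range(50):
    pop.sort(key=cost)                # elitist selection: best 5 breed
    kids = [crossover(random.choice(pop[:5]), random.choice(pop[:5]))
            for _ in range(15)]
    for k in kids:
        if random.random() < 0.2:
            mutate(k)
    pop = pop[:5] + kids

best = min(pop, key=cost)
print(best, cost(best))
```

A real warehouse model would add the second objective (distance between goods of the same kind) either as a weighted sum or via Pareto ranking; the permutation encoding and order crossover carry over unchanged.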

  17. Nonlinear system modeling based on experimental data

    Energy Technology Data Exchange (ETDEWEB)

    PAEZ,THOMAS L.; HUNTER,NORMAN F.

    2000-02-02

    The canonical variate analysis technique is used in this investigation, along with a data transformation algorithm, to identify a system in a transform space. The transformation algorithm involves the preprocessing of measured excitation/response data with a zero-memory-nonlinear transform, specifically, the Rosenblatt transform. This transform approximately maps the measured excitation and response data from its own space into the space of uncorrelated, standard normal random variates. Following this transform, it is appropriate to model the excitation/response relation as linear since Gaussian inputs excite Gaussian responses in linear structures. The linear model is identified in the transform space using the canonical variate analysis approach, and system responses in the original space are predicted using inverse Rosenblatt transformation. An example is presented.
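In one dimension, the mapping to standard-normal variates can be approximated by an empirical-CDF/normal-score transform (the multivariate Rosenblatt transform chains conditional CDFs in the same way). The data and plotting position below are illustrative assumptions.

```python
from statistics import NormalDist

# Sketch of a zero-memory normalizing transform: rank each sample, convert
# the rank to an empirical CDF value, then push it through the inverse
# standard-normal CDF. Distinct sample values are assumed for simplicity.

def normal_scores(x):
    n = len(x)
    ranks = {v: r for r, v in enumerate(sorted(x), start=1)}
    nd = NormalDist()
    # Hazen plotting position (r - 0.5)/n keeps the CDF away from 0 and 1
    return [nd.inv_cdf((ranks[v] - 0.5) / n) for v in x]

data = [2.0, 9.0, 4.0, 7.0, 1.0, 16.0]   # skewed, non-Gaussian sample
z = normal_scores(data)
print([round(v, 2) for v in z])
```

After such a transform, a linear (Gaussian) model can be identified in the transformed space, and predictions are mapped back with the inverse transform, as the abstract describes.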

  18. Model-based scenarios of Mediterranean droughts

    Directory of Open Access Journals (Sweden)

    M. Weiß

    2007-11-01

    Full Text Available This study examines the change in current 100-year hydrological drought frequencies in the Mediterranean in comparison to the 2070s as simulated by the global model WaterGAP. The analysis considers socio-economic and climate changes as indicated by the IPCC scenarios A2 and B2 and the global general circulation model ECHAM4. Under these conditions today's 100-year drought is estimated to occur 10 times more frequently in the future over a large part of the Northern Mediterranean while in North Africa, today's 100-year drought will occur less frequently. Water abstractions are shown to play a minor role in comparison to the impact of climate change, but can intensify the situation.

  19. Online Knowledge-Based Model for Big Data Topic Extraction.

    Science.gov (United States)

    Khan, Muhammad Taimoor; Durrani, Mehr; Khalid, Shehzad; Aziz, Furqan

    2016-01-01

Lifelong machine learning (LML) models learn with experience, maintaining a knowledge base without user intervention. Unlike traditional single-domain models, they can easily scale up to explore big data. The existing LML models have high data dependency, consume more resources, and do not support streaming data. This paper proposes an online LML model (OAMC) to support streaming data with reduced data dependency. By engineering the knowledge base and introducing new knowledge features, the learning pattern of the model is improved for data arriving in pieces. OAMC improves accuracy as topic coherence by 7% for streaming data while reducing the processing cost to half.

  20. A VENSIM BASED ANALYSIS FOR SUPPLY CHAIN MODEL

    Directory of Open Access Journals (Sweden)

    Mohammad SHAMSUDDOHA

    2014-01-01

Full Text Available The emphasis on the supply chain has increased in recent years among academic and industry circles. In this paper, a supply chain model is developed based on a case study of the poultry industry under the Vensim environment. System dynamics, supply chain, design science and the case method under a positivist and quantitative paradigm are studied to develop a simulation model. The objectives of this paper are to review the literature, develop a Vensim-based simulation supply chain model, and examine the model qualitatively and quantitatively. The model is also briefly discussed in relation to the forward, reverse and mainstream supply chains of the case.

  1. On grey relation projection model based on projection pursuit

    Institute of Scientific and Technical Information of China (English)

    Wang Shuo; Yang Shanlin; Ma Xijun

    2008-01-01

A multidimensional grey relation projection value can be synthesized into a one-dimensional projection value by using a projection pursuit model. The larger the projection value is, the better the model. Thus, according to the projection value, the best one can be chosen from the model aggregation. Because projection pursuit modeling based on an accelerating genetic algorithm can simplify the implementation procedure of the projection pursuit technique and overcome its complex calculation as well as the difficulty of implementing its program, a new method can be obtained for choosing the best grey relation projection model based on the projection pursuit technique.

  2. Online Knowledge-Based Model for Big Data Topic Extraction

    Science.gov (United States)

    Khan, Muhammad Taimoor; Durrani, Mehr; Khalid, Shehzad; Aziz, Furqan

    2016-01-01

Lifelong machine learning (LML) models learn with experience, maintaining a knowledge base without user intervention. Unlike traditional single-domain models, they can easily scale up to explore big data. The existing LML models have high data dependency, consume more resources, and do not support streaming data. This paper proposes an online LML model (OAMC) to support streaming data with reduced data dependency. By engineering the knowledge base and introducing new knowledge features, the learning pattern of the model is improved for data arriving in pieces. OAMC improves accuracy as topic coherence by 7% for streaming data while reducing the processing cost to half. PMID:27195004

  3. An XML-based information model for archaeological pottery

    Institute of Scientific and Technical Information of China (English)

    LIU De-zhi; RAZDAN Anshuman; SIMON Arleyn; BAE Myungsoo

    2005-01-01

An information model is defined to support sharing scientific information on the Web for archaeological pottery. Apart from non-shape information, such as age, material, etc., the model also includes shape information and shape feature information. Shape information is collected by laser scanners and geometric modelling techniques. Feature information is generated from shape information via feature extraction techniques. The model is used in an integrated storage, archival, and sketch-based query and retrieval system for 3D objects, Native American ceramic vessels. A novel aspect of the information model is that it is fully implemented in XML and is designed for Web-based visual query and storage applications.
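An XML record in the spirit of such a vessel information model might be parsed as follows. The element names and values are invented for illustration; they are not the schema of the described system.

```python
import xml.etree.ElementTree as ET

# Hypothetical vessel record combining non-shape, shape and feature
# information, parsed with the standard library.

record = """
<vessel id="v042">
  <nonshape><age>Classic period</age><material>ceramic</material></nonshape>
  <shape><height unit="cm">24.5</height><rim-diameter unit="cm">11.2</rim-diameter></shape>
  <features><feature type="handle" count="2"/></features>
</vessel>
"""

root = ET.fromstring(record)
print(root.get("id"), root.findtext("nonshape/material"),
      root.find("shape/height").text)
```

Keeping the three information types in separate sub-elements is what lets one system serve both attribute queries (age, material) and sketch-based shape queries over the same records.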

  4. Model-Based Traffic Control for Sustainable Mobility

    NARCIS (Netherlands)

    Zegeye, S.K.

    2011-01-01

    Computationally efficient dynamic models of fuel consumption, emissions, and dispersion of emissions are developed. A fast and practically feasible model-based controller is proposed. Using the developed models, the controller steers the traffic flow in such a way that a balanced trade-off between the t

  5. Understanding Elementary Astronomy by Making Drawing-Based Models

    NARCIS (Netherlands)

    van Joolingen, W. R.; Aukes, Annika V A; Gijlers, H.; Bollen, L.

    2015-01-01

    Modeling is an important approach in the teaching and learning of science. In this study, we attempt to bring modeling within the reach of young children by creating the SimSketch modeling system, which is based on freehand drawings that can be turned into simulations. This system was used by 247 ch

  6. IDEF method-based simulation model design and development framework

    Directory of Open Access Journals (Sweden)

    Ki-Young Jeong

    2009-09-01

    Full Text Available The purpose of this study is to provide an IDEF method-based integrated framework for a business process simulation model to reduce the model development time by increasing communication and knowledge reusability during a simulation project. In this framework, simulation requirements are collected by a function modeling method (IDEF0) and a process modeling method (IDEF3). Based on these requirements, a common data model is constructed using the IDEF1X method. From this reusable data model, multiple simulation models are automatically generated using a database-driven simulation model development approach. The framework is claimed to help both the requirement collection and experimentation phases during a simulation project by improving system knowledge, model reusability, and maintainability through the systematic use of three descriptive IDEF methods and the features of relational database technologies. A complex semiconductor fabrication case study was used as a testbed to evaluate and illustrate the concepts and the framework. Two different simulation software products were used to develop and control the semiconductor model from the same knowledge base. The case study empirically showed that this framework could help improve simulation project processes by using IDEF-based descriptive models and relational database technology. The authors also concluded that this framework could be easily applied to other analytical model generation by separating the logic from the data.

  7. Clone Detection for Graph-Based Model Transformation Languages

    DEFF Research Database (Denmark)

    Strüber, Daniel; Plöger, Jennifer; Acretoaie, Vlad

    2016-01-01

    has been proposed for programming and modeling languages; yet no specific ones have emerged for model transformation languages. In this paper, we explore clone detection for graph-based model transformation languages. We introduce potential use cases for such techniques in the context of constructive...

  8. Case-Based Modeling for Learning Management and Interpersonal Skills

    Science.gov (United States)

    Lyons, Paul

    2008-01-01

    This article offers an introduction to case-based modeling (CBM) and a demonstration of the efficacy of this instructional model. CBM is grounded primarily in the concepts and theory of experiential learning, augmented by concepts of script creation. Although it is labor intensive, the model is one that has value for instruction in various…

  9. Physiologically based kinetic modeling of the bioactivation of myristicin

    NARCIS (Netherlands)

    Al-Malahmeh, Amer J.; Al-Ajlouni, Abdelmajeed; Wesseling, Sebastiaan; Soffers, Ans E.M.F.; Al-Subeihi, A.; Kiwamoto, Reiko; Vervoort, Jacques; Rietjens, Ivonne M.C.M.

    2016-01-01

    The present study describes physiologically based kinetic (PBK) models for the alkenylbenzene myristicin that were developed by extension of the PBK models for the structurally related alkenylbenzene safrole in rat and human. The newly developed myristicin models revealed that the formation of th

  10. [Model-based biofuels system analysis: a review].

    Science.gov (United States)

    Chang, Shiyan; Zhang, Xiliang; Zhao, Lili; Ou, Xunmin

    2011-03-01

    Model-based system analysis is an important tool for evaluating the potential and impacts of biofuels, and for drafting biofuels technology roadmaps and targets. The broad reach of the biofuels supply chain requires that biofuels system analyses span a range of disciplines, including agriculture/forestry, energy, economics, and the environment. Here we reviewed various models developed for or applied to modeling biofuels, and presented a critical analysis of Agriculture/Forestry System Models, Energy System Models, Integrated Assessment Models, Micro-level Cost, Energy and Emission Calculation Models, and Specific Macro-level Biofuel Models. We focused on the models' strengths, weaknesses, and applicability, facilitating the selection of a suitable type of model for specific issues. Such an analysis was a prerequisite for future biofuels system modeling, and represented a valuable resource for researchers and policy makers.

  11. Model based control of dynamic atomic force microscope.

    Science.gov (United States)

    Lee, Chibum; Salapaka, Srinivasa M

    2015-04-01

    A model-based robust control approach is proposed that significantly improves imaging bandwidth for the dynamic mode atomic force microscopy. A model for cantilever oscillation amplitude and phase dynamics is derived and used for the control design. In particular, the control design is based on a linearized model and robust H(∞) control theory. This design yields a significant improvement when compared to the conventional proportional-integral designs and verified by experiments.

  12. Model based control of dynamic atomic force microscope

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chibum [Department of Mechanical System Design Engineering, Seoul National University of Science and Technology, Seoul 139-743 (Korea, Republic of); Salapaka, Srinivasa M., E-mail: salapaka@illinois.edu [Department of Mechanical Science and Engineering, University of Illinois at Urbana-Champaign, Urbana, Illinois 61801 (United States)

    2015-04-15

    A model-based robust control approach is proposed that significantly improves imaging bandwidth for the dynamic mode atomic force microscopy. A model for cantilever oscillation amplitude and phase dynamics is derived and used for the control design. In particular, the control design is based on a linearized model and robust H{sub ∞} control theory. This design yields a significant improvement when compared to the conventional proportional-integral designs and verified by experiments.

  13. Internal Model Based Active Disturbance Rejection Control

    OpenAIRE

    Pan, Jinwen; Wang, Yong

    2016-01-01

    The basic active disturbance rejection control (BADRC) algorithm with only one order higher extended state observer (ESO) proves to be robust to both internal and external disturbances. An advantage of BADRC is that in many applications it can achieve high disturbance attenuation level without requiring a detailed model of the plant or disturbance. However, this can be regarded as a disadvantage when the disturbance characteristic is known since the BADRC algorithm cannot exploit such informa...

  14. Weather forecasting based on hybrid neural model

    Science.gov (United States)

    Saba, Tanzila; Rehman, Amjad; AlGhamdi, Jarallah S.

    2017-02-01

    Making deductions and expectations about climate has been a challenge all through mankind's history. Exact meteorological predictions help to foresee and handle problems well in time. Different strategies have been investigated using various machine learning techniques in reported forecasting systems. The current research treats climate as a major challenge for machine information mining and deduction. Accordingly, this paper presents a hybrid neural model (MLP and RBF) to enhance the accuracy of weather forecasting. The proposed hybrid model ensures precise forecasting given the specialized nature of climate-anticipating frameworks. The study concentrates on data representing Saudi Arabia weather forecasting. The main input features employed to train the individual and hybrid neural networks include average dew point, minimum temperature, maximum temperature, mean temperature, average relative moistness, precipitation, normal wind speed, high wind speed and average cloudiness. The output layer is composed of two neurons to represent rainy and dry weather. Moreover, a trial and error approach is adopted to select an appropriate number of inputs to the hybrid neural network. Correlation coefficient, RMSE and scatter index are the standard yardsticks adopted for forecast accuracy measurement. Individually, MLP forecasting results are better than RBF's; however, the proposed simplified hybrid neural model achieves better forecasting accuracy than both individual networks. Additionally, the results are better than those reported in the state of the art, using a simple neural structure that reduces training time and complexity.

  15. Solitonic Models Based on Quantum Groups and the Standard Model

    CERN Document Server

    Finkelstein, Robert J

    2010-01-01

    The idea that the elementary particles might have the symmetry of knots has had a long history. In any current formulation of this idea, however, the knot must be quantized. The present review is a summary of a small set of papers that began as an attempt to correlate the properties of quantized knots with the empirical properties of the elementary particles. As the ideas behind these papers have developed over a number of years the model has evolved, and this review is intended to present the model in its current form. The original picture of an elementary fermion as a solitonic knot of field, described by the trefoil representation of SUq(2), has expanded into its current form in which a knotted field is complementary to a composite structure composed of three or more preons that in turn are described by the fundamental representation of SLq(2). These complementary descriptions may be interpreted as describing single composite particles composed of three or more preons bound by a knotted field.

  16. Numerical simulation of base flow with hot base bleed for two jet models

    OpenAIRE

    Wen-jie Yu; Yong-gang Yu; Bin Ni

    2014-01-01

    In order to improve the benefits of base bleed in base flow field, the base flow with hot base bleed for two jet models is studied. Two-dimensional axisymmetric Navier–Stokes equations are computed by using a finite volume scheme. The base flow of a cylinder afterbody with base bleed is simulated. The simulation results are validated with the experimental data, and the experimental results are well reproduced. On this basis, the base flow fields with base bleed for a circular jet model and an...

  17. BeetleBase: the model organism database for Tribolium castaneum

    OpenAIRE

    Wang, Liangjiang; Wang, Suzhi; Li, Yonghua; Paradesi, Martin S. R.; Brown, Susan J

    2006-01-01

    BeetleBase () is an integrated resource for the Tribolium research community. The red flour beetle (Tribolium castaneum) is an important model organism for genetics, developmental biology, toxicology and comparative genomics, the genome of which has recently been sequenced. BeetleBase is constructed to integrate the genomic sequence data with information about genes, mutants, genetic markers, expressed sequence tags and publications. BeetleBase uses the Chado data model and software component...

  18. User Context Aware Base Station Power Flow Model

    OpenAIRE

    Walsh, Barbara; Farrell, Ronan

    2005-01-01

    At present the testing of power amplifiers within base station transmitters is limited to testing at component level as opposed to testing at the system level. While the detection of catastrophic failure is possible, that of performance degradation is not. This paper proposes a base station model with respect to transmitter output power with the aim of introducing system level monitoring of the power amplifier behaviour within the base station. Our model reflects the expe...

  19. A Knowledge Representation Model for Video-Based Animation

    Institute of Scientific and Technical Information of China (English)

    劳志强; 潘云鹤

    1998-01-01

    In this paper, a brief survey of knowledge-based animation techniques is given. Then a VideoStream-based Knowledge Representation Model (VSKRM) for Joint Objects is presented, which includes the knowledge representation of: Graphic Object, Action and VideoStream. Next, a general description of the UI framework of a system based on the VSKRM model is given. Finally, a conclusion is reached.

  20. Identity-based encryption with wildcards in the standard model

    Institute of Scientific and Technical Information of China (English)

    MING Yang; SHEN Xiao-qin; WANG Yu-min

    2009-01-01

    In this article, based on Chatterjee-Sarkar' hierarchical identity-based encryption (HIBE), a novel identity-based encryption with wildcards (WIBE) scheme is proposed and is proven secure in the standard model (without random oracle). The proposed scheme is proven to be secure assuming that the decisional Bilinear Diffie-Hellman (DBDH) problem is hard. Compared with the Wa-WIBE scheme that is secure in the standard model, our scheme has shorter common parameters and ciphertext length.

  1. Reduced model-based decision-making in schizophrenia.

    Science.gov (United States)

    Culbreth, Adam J; Westbrook, Andrew; Daw, Nathaniel D; Botvinick, Matthew; Barch, Deanna M

    2016-08-01

    Individuals with schizophrenia have a diminished ability to use reward history to adaptively guide behavior. However, tasks traditionally used to assess such deficits often rely on multiple cognitive and neural processes, leaving etiology unresolved. In the current study, we adopted recent computational formalisms of reinforcement learning to distinguish between model-based and model-free decision-making in hopes of specifying mechanisms associated with reinforcement-learning dysfunction in schizophrenia. Under this framework, decision-making is model-free to the extent that it relies solely on prior reward history, and model-based if it relies on prospective information such as motivational state, future consequences, and the likelihood of obtaining various outcomes. Model-based and model-free decision-making was assessed in 33 schizophrenia patients and 30 controls using a 2-stage 2-alternative forced choice task previously demonstrated to discern individual differences in reliance on the 2 forms of reinforcement-learning. We show that, compared with controls, schizophrenia patients demonstrate decreased reliance on model-based decision-making. Further, parameter estimates of model-based behavior correlate positively with IQ and working memory measures, suggesting that model-based deficits seen in schizophrenia may be partially explained by higher-order cognitive deficits. These findings demonstrate specific reinforcement-learning and decision-making deficits and thereby provide valuable insights for understanding disordered behavior in schizophrenia. (PsycINFO Database Record)
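In analyses of this kind of two-step task (following the computational formalism the study adopts), first-stage choice values are commonly modeled as a weighted mixture of model-based and model-free estimates, with the weight indexing reliance on model-based control. A minimal illustrative sketch; the names and numbers below are hypothetical, not the study's code:

```python
def hybrid_stage1_values(q_mf, q_mb, w):
    """Weighted mixture of model-free (q_mf) and model-based (q_mb) action
    values for stage 1 of the two-step task; w indexes reliance on
    model-based control (w = 0: purely model-free, w = 1: purely model-based)."""
    return {a: w * q_mb[a] + (1.0 - w) * q_mf[a] for a in q_mf}

# A low-w agent (reduced model-based control) weighs reward history (q_mf)
# more heavily than prospective, task-structure-based estimates (q_mb).
vals = hybrid_stage1_values({"left": 0.2, "right": 0.6},
                            {"left": 0.8, "right": 0.4}, w=0.25)
```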

  2. A Technology-based Model for Learning

    Directory of Open Access Journals (Sweden)

    Michael Williams

    2004-12-01

    Full Text Available The Math Emporium, opened in 1997, is an open 7000-square-meter facility with 550+ workstations arranged in an array of widely spaced hexagonal "pods", designed to support group work while maintaining an academic air. We operate it 24/7 with math support personnel in attendance 12 hours per day. Students have access to online course resources at all times, from anywhere. We have used this unique asset to transform traditional classroom-based courses into technology-based learning programs that have no class meetings at all. The structure of the program is very different from the conventional one, having a new set of expectations and motivations. The results include: more effective students, substantial cost savings, economies of scale and scope and a streamlined process for creating new on-line courses.

  3. Ray-Based Reflectance Model for Diffraction

    CERN Document Server

    Cuypers, Tom; Haber, Tom; Bekaert, Philippe; Raskar, Ramesh

    2011-01-01

    We present a novel method of simulating wave effects in graphics using ray-based renderers with a new function: the Wave BSDF (Bidirectional Scattering Distribution Function). Reflections from neighboring surface patches represented by local BSDFs are mutually independent. However, in many surfaces with wavelength-scale microstructures, interference and diffraction requires a joint analysis of reflected wavefronts from neighboring patches. We demonstrate a simple method to compute the BSDF for the entire microstructure, which can be used independently for each patch. This allows us to use traditional ray-based rendering pipelines to synthesize wave effects of light and sound. We exploit the Wigner Distribution Function (WDF) to create transmissive, reflective, and emissive BSDFs for various diffraction phenomena in a physically accurate way. In contrast to previous methods for computing interference, we circumvent the need to explicitly keep track of the phase of the wave by using BSDFs that include positiv...

  4. Spherical Individual Cell-Based Models

    OpenAIRE

    Krinner, Axel

    2010-01-01

    Over the last decade a huge amount of experimental data on biological systems has been generated by modern high-throughput methods. Aided by bioinformatics, the '-omics' (genomics, transcriptomics, proteomics, metabolomics and interactomics) have listed, quantified and analyzed molecular components and interactions on all levels of cellular regulation. However, a comprehensive framework, that does not only list, but links all those components, is still largely missing. The biology-based but h...

  5. An Efficient Semantic Model For Concept Based Clustering And Classification

    Directory of Open Access Journals (Sweden)

    SaiSindhu Bandaru

    2012-03-01

    Full Text Available Usually in text mining techniques the basic measures like term frequency of a term (word or phrase) are computed to determine the importance of the term in the document. But with statistical analysis alone, the original semantics of the term may not carry the exact meaning of the term. To overcome this problem, a new framework has been introduced which relies on a concept-based model and a synonym-based approach. The proposed model can efficiently find significant matching and related concepts between documents according to the concept-based and synonym-based approaches. Large sets of experiments using the proposed model on different data sets in clustering and classification are conducted. Experimental results demonstrate the substantial enhancement of the clustering quality using sentence-based, document-based, corpus-based and combined-approach concept analysis. A new similarity measure has been proposed to find the similarity between a document and the existing clusters, which can be used in classification of the document with existing clusters.
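The synonym-expansion idea behind such concept-based matching can be illustrated with a toy overlap measure. This is a sketch under assumed data structures (a synonym dictionary, Jaccard overlap), not the paper's actual similarity measure:

```python
def concept_similarity(doc_terms, cluster_terms, synonyms):
    """Toy concept/synonym-based similarity: expand each term set with its
    synonyms, then compute Jaccard overlap of the expanded sets.
    The synonym dictionary and the Jaccard choice are illustrative only."""
    def expand(terms):
        out = set(terms)
        for t in terms:
            out |= set(synonyms.get(t, []))
        return out

    a, b = expand(doc_terms), expand(cluster_terms)
    return len(a & b) / len(a | b) if (a | b) else 0.0

# "car" and "auto" match once synonyms are expanded, unlike plain term overlap.
sim = concept_similarity({"car"}, {"auto"}, {"car": ["auto"]})
```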

  6. A Requirements Analysis Model Based on QFD

    Institute of Scientific and Technical Information of China (English)

    TANG Zhi-wei; Nelson K.H.Tang

    2004-01-01

    The enterprise resource planning (ERP) system has emerged to offer an integrated IT solution, and more and more enterprises are adopting this system and regarding it as an important innovation. However, there is already evidence of high failure risks in ERP project implementation; one major reason is poor analysis of the requirements for system implementation. In this paper, the importance of requirements analysis for ERP project implementation is highlighted, and a requirements analysis model applying quality function deployment (QFD) is presented, which will support conducting requirements analysis for ERP projects.

  7. Graph-based modelling in engineering

    CERN Document Server

    Rysiński, Jacek

    2017-01-01

    This book presents versatile, modern and creative applications of graph theory in mechanical engineering, robotics and computer networks. Topics related to mechanical engineering include e.g. machine and mechanism science, mechatronics, robotics, gearing and transmissions, design theory and production processes. The graphs treated are simple graphs, weighted and mixed graphs, bond graphs, Petri nets, logical trees etc. The authors represent several countries in Europe and America, and their contributions show how different, elegant, useful and fruitful the utilization of graphs in modelling of engineering systems can be.

  8. Tsunami Propagation Models Based on First Principles

    Science.gov (United States)

    2012-11-21

    obstacle and strike land in the shadow regions. Since v ∝ √h according to Eq. (9), the velocity decreases nearer the coast as the depth decreases. The wave...Earth by the two locations is, from spherical trigonometry, θ = cos⁻¹(sin λ_s sin λ_d + cos λ_s cos λ_d cos(φ_d − φ_s)) (37). The linear...speed of propagation, bending of tsunamis around obstacles and depth of the ocean, among others. Two-dimensional models on flat and spherical ocean
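The two standard relations referenced in this record snippet, the shallow-water propagation speed v = √(gh) and the great-circle central angle between source and destination (the spherical law of cosines, Eq. (37)), can be sketched as follows; the function and variable names are my own:

```python
import math

def central_angle(lat_s, lon_s, lat_d, lon_d):
    """Great-circle central angle between source (s) and destination (d),
    all angles in radians (spherical law of cosines, cf. Eq. (37))."""
    return math.acos(
        math.sin(lat_s) * math.sin(lat_d)
        + math.cos(lat_s) * math.cos(lat_d) * math.cos(lon_d - lon_s)
    )

def shallow_water_speed(depth_m, g=9.81):
    """Long-wave (tsunami) speed v = sqrt(g * h): slower as depth decreases."""
    return math.sqrt(g * depth_m)

# The wave slows approaching the coast: deep ocean vs. coastal shelf.
v_deep = shallow_water_speed(4000.0)   # ~198 m/s
v_shelf = shallow_water_speed(10.0)    # ~9.9 m/s
```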

  9. A SPICE model for a phase-change memory cell based on the analytical conductivity model

    Science.gov (United States)

    Yiqun, Wei; Xinnan, Lin; Yuchao, Jia; Xiaole, Cui; Jin, He; Xing, Zhang

    2012-11-01

    For the peripheral circuit design of phase-change memory, it is necessary to have an accurate compact model of a phase-change memory cell for circuit simulation. Compared with existing models, the model presented in this work includes an analytical conductivity model, which is deduced by means of carrier transport theory instead of a fitting model based on measurement. In addition, this model includes an analytical temperature model based on the 1D heat-transfer equation and a phase-transition dynamic model based on the JMA equation to simulate the phase-change process. The above models for phase-change memory are integrated using the Verilog-A language, and results show that this model is able to simulate the I-V characteristics and the programming characteristics accurately.
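The JMA (Johnson-Mehl-Avrami) kinetics mentioned above are commonly written as X(t) = 1 - exp(-(k*t)^n) for the crystallized fraction; a minimal sketch, where the rate constant k and Avrami exponent n are hypothetical placeholders (in a real cell model k is strongly temperature dependent):

```python
import math

def jma_crystallized_fraction(t, k, n):
    """Johnson-Mehl-Avrami (JMA) kinetics: crystallized fraction
    X(t) = 1 - exp(-(k*t)**n), with rate constant k and Avrami exponent n.
    Parameter values are hypothetical, not from the paper."""
    return 1.0 - math.exp(-((k * t) ** n))
```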

  10. A community-based framework for aquatic ecosystem models

    DEFF Research Database (Denmark)

    Trolle, Didde; Hamilton, D. P.; Hipsey, M. R.;

    2012-01-01

    Here, we communicate a point of departure in the development of aquatic ecosystem models, namely a new community-based framework, which supports an enhanced and transparent union between the collective expertise that exists in the communities of traditional ecologists and model developers. Through a literature survey, we document the growing importance of numerical aquatic ecosystem models while also noting the difficulties, up until now, of the aquatic scientific community to make significant advances in these models during the past two decades. Through a common forum for aquatic ecosystem modellers we aim to (i) advance collaboration within the aquatic ecosystem modelling community, (ii) enable increased use of models for research, policy and ecosystem-based management, (iii) facilitate a collective framework using common (standardised) code to ensure that model development is incremental, (iv) ...

  11. Weibull Parameters Estimation Based on Physics of Failure Model

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Sørensen, John Dalsgaard

    2012-01-01

    Reliability estimation procedures are discussed for the example of fatigue development in solder joints using a physics of failure model. The accumulated damage is estimated based on a physics of failure model, the Rainflow counting algorithm and Miner's rule. A threshold model is used ... distribution. Methods from structural reliability analysis are used to model the uncertainties and to assess the reliability for fatigue failure. Maximum Likelihood and Least Squares estimation techniques are used to estimate fatigue life distribution parameters.
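Miner's rule, used above for damage accumulation, sums the fractional fatigue usage per stress range (with cycle counts typically coming from Rainflow counting). A minimal sketch with made-up cycle data, not values from the paper:

```python
def miners_damage(cycle_counts, cycles_to_failure):
    """Miner's rule: accumulated damage D = sum(n_i / N_i), where n_i cycles
    are applied at a stress range whose fatigue life is N_i cycles.
    Failure is predicted when D >= 1."""
    return sum(n / N for n, N in zip(cycle_counts, cycles_to_failure))

# Hypothetical Rainflow output: 1e4 cycles at a range with N = 1e5,
# plus 5e3 cycles at a harsher range with N = 2e4.
D = miners_damage([1e4, 5e3], [1e5, 2e4])  # 0.1 + 0.25 = 0.35
```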

  12. Research on Modeling of Hydropneumatic Suspension Based on Fractional Order

    OpenAIRE

    Junwei Zhang; Sizhong Chen; Yuzhuang Zhao; Jianbo Feng; Chang Liu; Ying Fan

    2015-01-01

    With excellent properties such as nonlinear stiffness, adjustable vehicle height, and good vibration resistance, hydropneumatic suspension (HS) has been more and more widely applied to heavy vehicles and engineering vehicles. Traditional modeling methods are still confined to simple models that do not take many factors into consideration. A hydropneumatic suspension model based on fractional order (HSM-FO) is built with the advantage of fractional order (FO) in viscoelastic material modeling considerin...

  13. Pervasive Computing Location-aware Model Based on Ontology

    Institute of Scientific and Technical Information of China (English)

    PU Fang; CAI Hai-bin; CAO Qi-ying; SUN Dao-qing; LI Tong

    2008-01-01

    In order to integrate heterogeneous location-aware systems into a pervasive computing environment, a novel pervasive computing location-aware model based on ontology is presented. A location-aware model ontology (LMO) is constructed. The location-aware model has the capabilities of sharing knowledge, reasoning and dynamically adjusting the usage policies of services through a unified semantic location manner. Finally, the work process of the proposed location-aware model is explained through an application scenario.

  14. Markovian Building Blocks for Individual-Based Modelling

    OpenAIRE

    Nilsson, Lars Anders Fredrik; Nielsen, Bo Friis; Thygesen, Uffe Høgsbro; Beyer, Jan

    2007-01-01

    The present thesis consists of a summary report, four research articles, one technical report and one manuscript. The subject of the thesis is individual-based stochastic models. The summary report is composed of three parts and a brief history of some basic models in population biology. This history is included in order to provide a reader that has no previous exposure to models in population biology with a sufficient background to understand some of the biological models that are mentioned ...

  15. Business model for sensor-based fall recognition systems.

    Science.gov (United States)

    Fachinger, Uwe; Schöpke, Birte

    2014-01-01

    AAL systems require, in addition to sophisticated and reliable technology, adequate business models for their launch and sustainable establishment. This paper presents the basic features of alternative business models for a sensor-based fall recognition system which was developed within the context of the "Lower Saxony Research Network Design of Environments for Ageing" (GAL). The models were developed in parallel with the R&D process, with successive adaptation and concretization. An overview of the basic features (i.e. nine partial models) of the business model is given, and the mutually exclusive alternatives for each partial model are presented. The partial models are interconnected, and the combinations of compatible alternatives lead to consistent alternative business models. However, at the current state, only initial concepts of alternative business models can be deduced. The next step will be to gather additional information to work out more detailed models.

  16. UML statechart based rigorous modeling of real-time system

    Institute of Scientific and Technical Information of China (English)

    LAI Ming-zhi; YOU Jin-yuan

    2005-01-01

    Rigorous modeling can ensure correctness and reduce cost in embedded real-time system development. Software methods are needed for rigorous modeling of embedded real-time systems. PVS is a formal method with precisely defined syntax and semantics. Systems modeled by PVS specifications can be verified by tools. Combining the widely used UML with PVS, this paper provides a novel modeling and verification approach for embedded real-time systems. In this approach, we provide 1) a time-extended UML statechart for modeling the dynamic behavior of an embedded real-time system; 2) an approach to capture timed-automata-based semantics from a timed statechart; and 3) an algorithm to generate a finite state model expressed in a PVS specification for model checking. The benefits of our approach include flexibility and user friendliness in modeling, extendability in formalization and verification content, and better performance. The modeling and verification of time constraints is a highlight of this paper.

  17. Phase Correlation Based Iris Image Registration Model

    Institute of Scientific and Technical Information of China (English)

    Jun-Zhou Huang; Tie-Niu Tan; Li Ma; Yun-Hong Wang

    2005-01-01

    Iris recognition is one of the most reliable personal identification methods. In iris recognition systems, image registration is an important component. Accurately registering iris images leads to a higher recognition rate for an iris recognition system. This paper proposes a phase correlation based method for iris image registration with sub-pixel accuracy. Compared with existing methods, it is insensitive to image intensity and can compensate, to a certain extent, for the non-linear iris deformation caused by pupil movement. Experimental results show that the proposed algorithm has an encouraging performance.
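Basic phase correlation (without the sub-pixel refinement the paper adds) recovers a translation from the peak of the inverse transform of the normalized cross-power spectrum. A NumPy sketch of the core idea, not the paper's implementation:

```python
import numpy as np

def phase_correlation(a, b):
    """Estimate the integer cyclic shift (dy, dx) such that
    b ~= np.roll(a, (dy, dx), axis=(0, 1)), via the normalized
    cross-power spectrum (no sub-pixel refinement here)."""
    A, B = np.fft.fft2(a), np.fft.fft2(b)
    R = np.conj(A) * B
    R /= np.abs(R) + 1e-12            # whiten: keep phase only
    corr = np.fft.ifft2(R).real       # impulse at the shift
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # fold shifts beyond half the image size back to negative offsets
    if dy > a.shape[0] // 2:
        dy -= a.shape[0]
    if dx > a.shape[1] // 2:
        dx -= a.shape[1]
    return int(dy), int(dx)

rng = np.random.default_rng(0)
img = rng.random((64, 64))
shifted = np.roll(img, (3, -5), axis=(0, 1))
print(phase_correlation(img, shifted))  # prints (3, -5)
```

A sub-pixel method like the one proposed would additionally interpolate around the correlation peak instead of taking the integer argmax.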

  18. Intelligent Transportation and Evacuation Planning A Modeling-Based Approach

    CERN Document Server

    Naser, Arab

    2012-01-01

    Intelligent Transportation and Evacuation Planning: A Modeling-Based Approach provides a new paradigm for evacuation planning strategies and techniques. Recently, evacuation planning and modeling have increasingly attracted interest among researchers as well as government officials. This interest stems from the recent catastrophic hurricanes and weather-related events that occurred in the southeastern United States (Hurricane Katrina and Rita). The evacuation methods that were in place before and during the hurricanes did not work well and resulted in thousands of deaths. This book offers insights into the methods and techniques that allow for implementing mathematical-based, simulation-based, and integrated optimization and simulation-based engineering approaches for evacuation planning. This book also: Comprehensively discusses the application of mathematical models for evacuation and intelligent transportation modeling Covers advanced methodologies in evacuation modeling and planning Discusses principles a...

  19. Viscoelastic Parameter Model of Magnetorheological Elastomers Based on Abel Dashpot

    Directory of Open Access Journals (Sweden)

    Fei Guo

    2014-04-01

    Full Text Available In this paper, a parametric constitutive model based on the Abel dashpot is established in a simple form and with clear physical meaning to deduce the expression for the dynamic mechanical modulus of MREs. Meanwhile, considering the compressive stress on MREs in shear mechanical property experiments and in vibration damper applications, some improvements are made to the particle chain model based on the coupled field. In addition, to verify the accuracy of the overall model, five groups of MRE samples based on silicone rubber with different volume fractions are prepared, and the MCR51 rheometer is used to conduct dynamic mechanical property experiments based on frequency and magnetic field scanning. Finally, experimental results indicate that the established model fits the laboratory data well; namely, the relationship between the dynamic modulus of MREs and changes in frequency and magnetic field is well described by the model.

  20. A review of urban residential choice models using agent-based modeling

    NARCIS (Netherlands)

    Huang, Qingxu; Parker, Dawn C.; Filatova, Tatiana; Sun, Shipeng

    2014-01-01

    Urban land-use modeling methods have experienced substantial improvements in the last several decades. With the advancement of urban land-use change theories and modeling techniques, a considerable number of models have been developed. The relatively young approach, agent-based modeling, provides ur

  1. Scale-based spatial data model for GIS

    Institute of Scientific and Technical Information of China (English)

    WEI Zu-kuan

    2004-01-01

    Being the primary medium of geographical information and the elementary objects manipulated, almost all maps in existing GIS adopt the layer-based model to represent geographic information. However, a map represented in the layer-based model is difficult to extend. Furthermore, in a Web-based GIS, transmitting the spatial data for map viewing is slow. To solve these problems, this paper proposes a new method for representing spatial data: the scale-based model. In this model, maps are represented at three levels: scale-view, block, and spatial object, and organized as a set of map layers, named Scale-Views, each associated with given scales. Lastly, a prototype Web-based GIS using the proposed spatial data representation is described briefly.

  2. Discrete Element Simulation of Asphalt Mastics Based on Burgers Model

    Institute of Scientific and Technical Information of China (English)

    LIU Yu; FENG Shi-rong; HU Xia-guang

    2007-01-01

    In order to investigate the viscoelastic performance of asphalt mastics, a micro-mechanical model for asphalt mastics was built by applying the Burgers model to discrete element simulation and constructing a Burgers contact model. A numerical simulation of creep tests was then conducted, and results from the simulation were compared with the analytical solution of the Burgers model. The comparison showed that the two results agreed well with each other, suggesting that a discrete element model based on the Burgers model can be employed in numerical simulations of asphalt mastics.
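    The analytical creep solution of the Burgers model that such simulations are checked against is standard: under constant stress sigma, eps(t) = sigma * (1/E1 + t/eta1 + (1 - exp(-E2*t/eta2))/E2). A small sketch (parameter values hypothetical):

```python
import math

def burgers_creep_strain(sigma, t, E1, eta1, E2, eta2):
    """Creep strain of the Burgers model (a Maxwell and a Kelvin-Voigt
    element in series) under constant stress sigma:
      eps(t) = sigma * (1/E1 + t/eta1 + (1 - exp(-E2*t/eta2)) / E2)"""
    compliance = 1.0 / E1 + t / eta1 + (1.0 - math.exp(-E2 * t / eta2)) / E2
    return sigma * compliance

# instantaneous elastic jump at t = 0 is sigma/E1; strain then grows with a
# retarded (Kelvin-Voigt) part plus steady viscous flow through eta1
eps0 = burgers_creep_strain(sigma=0.1, t=0.0, E1=50.0, eta1=1.0e4, E2=20.0, eta2=1.0e3)
```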

  3. INTRUSION DETECTION BASED ON THE SECOND-ORDER STOCHASTIC MODEL

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    This paper presents a new method based on a second-order stochastic model for computer intrusion detection. The results show that the performance of the second-order stochastic model is better than that of a first-order stochastic model. In this study, different window sizes are also used to test the performance of the model. The detection results show that the second-order stochastic model is not as sensitive to the window size as the first-order stochastic model and models from other previous research; the detection results for window sizes 6 and 10 are the same.
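    The abstract does not give the formulation, but a second-order stochastic model of this kind is commonly a Markov chain that conditions each event on the two preceding ones, scored over sliding windows. A hedged sketch of that idea (the class name, smoothing scheme, and toy event alphabet are ours, not the paper's):

```python
from collections import defaultdict

class SecondOrderModel:
    """Second-order stochastic (Markov) model: P(e_t | e_{t-2}, e_{t-1}),
    trained on normal event traces; low windowed likelihood flags anomalies."""

    def __init__(self, smoothing=1e-3):
        self.counts = defaultdict(lambda: defaultdict(int))
        self.smoothing = smoothing
        self.alphabet = set()

    def fit(self, seq):
        self.alphabet.update(seq)
        for a, b, c in zip(seq, seq[1:], seq[2:]):
            self.counts[(a, b)][c] += 1

    def prob(self, a, b, c):
        ctx = self.counts[(a, b)]
        total = sum(ctx.values())
        v = max(len(self.alphabet), 1)          # additive smoothing
        return (ctx[c] + self.smoothing) / (total + self.smoothing * v)

    def window_scores(self, seq, size=6):
        """Mean transition probability per sliding window of `size` events."""
        probs = [self.prob(a, b, c) for a, b, c in zip(seq, seq[1:], seq[2:])]
        per_window = size - 2                   # transitions in one window
        return [sum(probs[i:i + per_window]) / per_window
                for i in range(len(probs) - per_window + 1)]

model = SecondOrderModel()
model.fit(list("abcabcabcabc"))                 # toy "normal" trace
normal = model.window_scores(list("abcabc"))
anomalous = model.window_scores(list("abcxyz")) # unseen transitions score low
```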

  4. GIS-BASED 1-D DIFFUSIVE WAVE OVERLAND FLOW MODEL

    Energy Technology Data Exchange (ETDEWEB)

    KALYANAPU, ALFRED [Los Alamos National Laboratory; MCPHERSON, TIMOTHY N. [Los Alamos National Laboratory; BURIAN, STEVEN J. [NON LANL

    2007-01-17

    This paper presents a GIS-based 1-d distributed overland flow model and summarizes an application to simulate a flood event. The model estimates infiltration using the Green-Ampt approach and routes excess rainfall using the 1-d diffusive wave approximation. The model was designed to use readily available topographic, soils, and land use/land cover data and rainfall predictions from a meteorological model. An assessment of model performance was performed for a small catchment and a large watershed, both in urban environments. Simulated runoff hydrographs were compared to observations for a selected set of validation events. Results confirmed the model provides reasonable predictions in a short period of time.
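    The Green-Ampt approach used by the model computes infiltration capacity as fc = K * (1 + psi*dtheta/F), with F the cumulative infiltration; rainfall beyond capacity becomes the excess that is routed overland. An explicit-time-step sketch (parameter values illustrative, not the paper's):

```python
def green_ampt_excess(rain, dt, K, psi, dtheta):
    """Explicit-time-step Green-Ampt: infiltration capacity
    fc = K * (1 + psi*dtheta/F), F = cumulative infiltration.
    rain: rainfall intensities (mm/h), dt: step (h), K: saturated
    conductivity (mm/h), psi: wetting-front suction (mm), dtheta:
    moisture deficit (-). Returns per-step excess rainfall (mm/h)."""
    F = 1e-6                 # tiny seed avoids division by zero at t = 0
    excess = []
    for r in rain:
        fc = K * (1.0 + psi * dtheta / F)
        f = min(r, fc)       # actual infiltration limited by supply
        F += f * dt
        excess.append(r - f)
    return excess

# capacity decays as the wetting front advances, so runoff grows in a storm
ex = green_ampt_excess(rain=[50.0] * 8, dt=0.25, K=10.0, psi=110.0, dtheta=0.3)
```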

  5. Jamming transitions in force-based models for pedestrian dynamics

    CERN Document Server

    Chraibi, Mohcine; Tordeux, Antoine; Nishinari, Katsuhiro; Schadschneider, Andreas; Seyfried, Armin

    2015-01-01

    Force-based models describe pedestrian dynamics in analogy to classical mechanics by a system of second order ordinary differential equations. By investigating the linear stability of two main classes of forces, parameter regions with unstable homogeneous states are identified. In this unstable regime it is then checked whether phase transitions or stop-and-go waves occur. Results based on numerical simulations show, however, that the investigated models lead to unrealistic behavior in the form of backwards-moving pedestrians and overlapping. This is one reason why stop-and-go waves have not been observed in these models. The unrealistic behavior is not related to the numerical treatment of the dynamic equations but rather indicates an intrinsic problem of this model class. Identifying the underlying generic problems gives indications of how to define models that do not show such unrealistic behavior. As an example we introduce a new force-based model which produces realistic jam dynamics without the appearance of u...
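    A representative member of this model class is a second-order equation with a driving term that relaxes speed toward a desired value plus an exponential repulsion from the predecessor. A 1-D single-file sketch on a ring (our own toy parameters, not the authors' calibration):

```python
import math

def simulate_ring(n=10, length=50.0, v0=1.2, tau=0.5, a=2.0, D=1.0, d0=0.4,
                  dt=0.01, steps=2000):
    """Euler integration of a 1-D force-based model on a ring:
    x_i'' = (v0 - x_i')/tau - a*exp((d0 - gap_i)/D),
    where gap_i is the headway to the predecessor."""
    x = [i * length / n for i in range(n)]
    v = [0.0] * n
    for _ in range(steps):
        acc = []
        for i in range(n):
            gap = (x[(i + 1) % n] - x[i]) % length   # headway on the ring
            acc.append((v0 - v[i]) / tau - a * math.exp((d0 - gap) / D))
        for i in range(n):                           # synchronous Euler update
            v[i] += acc[i] * dt
            x[i] = (x[i] + v[i] * dt) % length
    return x, v

positions, speeds = simulate_ring()
```

    In the homogeneous regime all agents settle at the same speed just below v0; the instabilities the paper analyzes appear when densities rise and the repulsion term dominates.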

  6. Model Based Mission Assurance: Emerging Opportunities for Robotic Systems

    Science.gov (United States)

    Evans, John W.; DiVenti, Tony

    2016-01-01

    The emergence of Model Based Systems Engineering (MBSE) in a Model Based Engineering framework has created new opportunities to improve effectiveness and efficiencies across the assurance functions. The MBSE environment supports not only system architecture development, but also Systems Safety, Reliability and Risk Analysis concurrently in the same framework. Linking to detailed design will further improve assurance capabilities to support failure avoidance and mitigation in flight systems. This is also leading to new assurance functions, including model assurance and management of uncertainty in the modeling environment. Further, assurance cases, structured hierarchical arguments or models, are emerging as a basis for supporting a comprehensive viewpoint for Model Based Mission Assurance (MBMA).

  7. CONCEPTUAL MODELING BASED ON LOGICAL EXPRESSION AND EVOLVEMENT

    Institute of Scientific and Technical Information of China (English)

    Yl Guodong; ZHANG Shuyou; TAN Jianrong; JI Yangjian

    2007-01-01

    Aiming at the problem of abstract and polytype information modeling in product conceptual design, a method of conceptual modeling based on logical expression and evolvement is presented. Based on the logic expressions of the product conceptual design information, a function/logic/structure mapping model is set up. First, the function semantics is transformed into logical expressions through function/logic mapping. Second, the methods of logical evolvement are utilized to describe the function analysis, function/structure mapping and structure combination. Last, the logical structure scheme is transformed into geometrical sketch through logic/structure mapping. The conceptual design information and modeling process are described uniformly with logical methods in the model, and an effective method for computer aided conceptual design based on the model is implemented.

  8. Distributed Maximality based CTL Model Checking

    Directory of Open Access Journals (Sweden)

    Djamel Eddine Saidouni

    2010-05-01

    Full Text Available In this paper we investigate an approach that performs a distributed CTL model-checking algorithm on a network of workstations using Kleene three-valued logic. The state space is partitioned among the network nodes, and the incomplete state spaces are represented as Maximality-based Labeled Transition Systems (MLTS), which are able to express true concurrency. The same algorithm is executed in parallel in each node for a given property on an incomplete MLTS: it computes the set of states that satisfy or fail the property, while states for which the partial state space lacks sufficient information for a precise answer concerning the complete state space are assigned the third value, meaning "unknown whether true or false". To resolve these, the nodes exchange the information needed to conclude the result for the complete state space. An experimental version of the algorithm is currently being implemented in the functional programming language Erlang.

  9. Physiologically-based pharmacokinetic simulation modelling.

    Science.gov (United States)

    Grass, George M; Sinko, Patrick J

    2002-03-31

    Drug selection is now widely viewed as an important and relatively new, yet largely unsolved, bottleneck in the drug discovery and development process. In order to achieve an efficient selection process, high quality, rapid, predictive and correlative ADME models are required in order for them to be confidently used to support critical financial decisions. Systems that can be relied upon to accurately predict performance in humans have not existed, and decisions have been made using tools whose capabilities could not be verified until candidates went to clinical trial, leading to the high failure rates historically observed. However, with the sequencing of the human genome, advances in proteomics, the anticipation of the identification of a vastly greater number of potential targets for drug discovery, and the potential of pharmacogenomics to require individualized evaluation of drug kinetics as well as drug effects, there is an urgent need for rapid and accurately computed pharmacokinetic properties.

  10. Repetition-based Interactive Facade Modeling

    KAUST Repository

    AlHalawani, Sawsan

    2012-07-01

    Modeling and reconstruction of urban environments has gained researchers' attention throughout the past few years. It spreads in a variety of directions across multiple disciplines such as image processing, computer graphics and computer vision, as well as architecture, geoscience and remote sensing. Having a virtual world of our real cities is very attractive for various purposes such as entertainment, engineering and government, among many others. In this thesis, we address the problem of processing a single facade image to acquire useful information that can be utilized to manipulate the facade and generate variations of facade images which can later be used for texturing buildings. Typical facade structures exhibit a rectilinear distribution wherein windows and other elements are organized in a grid of horizontal and vertical repetitions of similar patterns. In the first part of this thesis, we propose an efficient algorithm that exploits information obtained from a single image to identify the distribution grid of the dominant elements, i.e. windows. This detection method is initially assisted by the user marking the dominant window, followed by an automatic process for identifying its repeated instances, which are used to define the structure grid. Given the distribution grid, we allow the user to interactively manipulate the facade by adding, deleting, resizing or repositioning the windows in order to generate new facade structures. Having this utility for interactive facades is very valuable for creating facade variations and generating new textures for building models. Ultimately, there is a wide range of interesting possibilities of interactions to be explored.

  11. Uncertainty Representation and Interpretation in Model-based Prognostics Algorithms based on Kalman Filter Estimation

    Data.gov (United States)

    National Aeronautics and Space Administration — This article discusses several aspects of uncertainty represen- tation and management for model-based prognostics method- ologies based on our experience with Kalman...

  12. Rodent model of activity-based anorexia.

    Science.gov (United States)

    Carrera, Olaia; Fraga, Ángela; Pellón, Ricardo; Gutiérrez, Emilio

    2014-04-10

    Activity-based anorexia (ABA) consists of a procedure that involves the simultaneous exposure of animals to a restricted feeding schedule, while free access is allowed to an activity wheel. Under these conditions, animals show a progressive increase in wheel running, a reduced efficiency in food intake to compensate for their increased activity, and a severe progression of weight loss. Due to the parallelism with the clinical manifestations of anorexia nervosa including increased activity, reduced food intake and severe weight loss, the ABA procedure has been proposed as the best analog of human anorexia nervosa (AN). Thus, ABA research could both allow a better understanding of the mechanisms underlying AN and generate useful leads for treatment development in AN.

  13. Adoption of the Competency-Based Training Model in Entrepreneurship

    Directory of Open Access Journals (Sweden)

    I Ketut Santra

    2009-01-01

    Full Text Available The aim of this research is to improve the teaching method in the entrepreneurship subject by adopting competency-based training (CBT). The major task was to formulate and design the entrepreneurship competencies: Personal, Situational and Strategic, and Business competence, each described through subtopics. After designing and formulating the games and simulations, the research proceeded to implement competency-based training in a real class over one semester, from September 2006 to early February 2007. The lessons learnt from the implementation period show that CBT can improve student competence in the Personal, Situational and Strategic, and Business areas, all of which are important for a successful entrepreneur; it also marks an application of "Kurikulum Berbasis Kompetensi". There is ample evidence of the achievement of CBT in the entrepreneurship subject. First, the physical achievement: all of the students' business plans became real businesses, documented by pictures of the students' actual businesses. Second, the theoretical achievement: the Personal, Situational and Strategic, and Business competences have statistically significant relations with the quality of both the business plan and the real business. The effect of these competences on business-plan quality is 84.4%, and on real-business quality 77.2%. This statistical evidence suggests that the redesign of the entrepreneurship subject was the right approach: the content of the entrepreneurial competences (Personal, Situational and Strategic, and Business) has an impact on students starting and running their own businesses.

  14. Model-Based Integration and Interpretation of Data

    DEFF Research Database (Denmark)

    Petersen, Johannes

    2004-01-01

    Data integration and interpretation plays a crucial role in supervisory control. The paper defines a set of generic inference steps for the data integration and interpretation process based on a three-layer model of system representations. The three-layer model is used to clarify the combination...... of constraint and object-centered representations of the work domain throwing new light on the basic principles underlying the data integration and interpretation process of Rasmussen's abstraction hierarchy as well as other model-based approaches combining constraint and object-centered representations. Based...

  15. Interactive Coherence-Based Façade Modeling

    KAUST Repository

    Musialski, Przemyslaw

    2012-05-01

    We propose a novel interactive framework for modeling building facades from images. Our method is based on the notion of coherence-based editing which allows exploiting partial symmetries across the facade at any level of detail. The proposed workflow mixes manual interaction with automatic splitting and grouping operations based on unsupervised cluster analysis. In contrast to previous work, our approach leads to detailed 3d geometric models with up to several thousand regions per facade. We compare our modeling scheme to others and evaluate our approach in a user study with an experienced user and several novice users.

  16. A Role- Based PMI Security Model for E- Government

    Institute of Scientific and Technical Information of China (English)

    WU Li-jun; SU Kai-le; YANG Zhi-hua

    2005-01-01

    We introduce the general AC (attribute certificate), the role specification AC and the role assignment AC, and discuss the role-based PMI (Privilege Management Infrastructure) architecture. The role-based PMI security model for E-government is researched by combining the role-based PMI with the PKI (Public Key Infrastructure) architecture. The model has the advantages of flexibility, convenience, less storage space and less network consumption, etc. We are going to use this security model in the E-government system.

  17. Anchor-based English-Chinese Bilingual Chunk Alignment Model

    Institute of Scientific and Technical Information of China (English)

    WU We-lin; CHENG Chang-sheng; XU Liang-xian; LU Ru-zhan

    2005-01-01

    Chunk alignment for a bilingual corpus is the basis of example-based machine translation. An anchor-based English-Chinese bilingual chunk alignment model and the corresponding alignment algorithm are presented in this paper. The model can effectively overcome the sparse-data problem due to the limited size of the bilingual corpus. In this model, chunk segmentation disambiguation is delayed until the alignment process, which improves the accuracy of chunk segmentation. The experimental results demonstrate the feasibility and viability of this model.

  18. Elastoplastic cup model for cement-based materials

    Directory of Open Access Journals (Sweden)

    Yan ZHANG

    2010-03-01

    Full Text Available Based on experimental data obtained from triaxial tests and a hydrostatic test, a cup model was formulated. Two plastic mechanisms, respectively a deviatoric shearing and a pore collapse, are taken into account. This model also considers the influence of confining pressure. In this paper, the calibration of the model is detailed and numerical simulations of the main mechanical behavior of cement paste over a large range of stress are described, showing good agreement with experimental results. The case study shows that this cup model has extensive applicability for cement-based materials and other quasi-brittle and high-porosity materials in a complex stress state.

  19. BP Network Based Users' Interest Model in Mining WWW Cache

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    By analyzing the WWW cache model, we put forward a user-interest description method based on fuzzy theory and user-interest inferential relations based on a BP (back-propagation) neural network. With this method, users' interest in the WWW cache can be described, and the neural network of users' interest can be constructed by positive spreading of interest and negative spreading of errors. This neural network can then infer users' interest. This model is not a simple extension of the simple interest model, but a comprehensive improvement of the model and its related algorithms.
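    The BP network referred to here is an ordinary back-propagation multilayer perceptron: interest spreads forward through the layers, errors spread backward as weight corrections. A minimal one-hidden-layer version, trained on toy binary "interest" features (the data, architecture, and hyperparameters are illustrative only):

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_bp(data, hidden=4, lr=0.5, epochs=5000, seed=1):
    """Minimal one-hidden-layer back-propagation network: online gradient
    descent on squared error with sigmoid units throughout."""
    rnd = random.Random(seed)
    n_in = len(data[0][0])
    W1 = [[rnd.uniform(-1.0, 1.0) for _ in range(n_in)] for _ in range(hidden)]
    b1 = [0.0] * hidden
    W2 = [rnd.uniform(-1.0, 1.0) for _ in range(hidden)]
    b2 = 0.0
    for _ in range(epochs):
        for x, y in data:
            # forward pass (positive spread of interest)
            h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
                 for row, b in zip(W1, b1)]
            o = sigmoid(sum(w * hj for w, hj in zip(W2, h)) + b2)
            # backward pass (negative spread of errors)
            delta_o = (o - y) * o * (1.0 - o)
            for j in range(hidden):
                delta_h = delta_o * W2[j] * h[j] * (1.0 - h[j])
                W2[j] -= lr * delta_o * h[j]
                for i in range(n_in):
                    W1[j][i] -= lr * delta_h * x[i]
                b1[j] -= lr * delta_h
            b2 -= lr * delta_o
    def predict(x):
        h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
             for row, b in zip(W1, b1)]
        return sigmoid(sum(w * hj for w, hj in zip(W2, h)) + b2)
    return predict

# hypothetical data: a page is "interesting" only if both fuzzy features fire
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
predict = train_bp(data)
```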

  20. An Agent-Based Modeling for Pandemic Influenza in Egypt

    OpenAIRE

    Khalil, Khaled M.; Abdel-Aziz, M.; Nazmy, Taymour T.; Salem, Abdel-Badeeh M.

    2010-01-01

    Pandemic influenza has great potential to cause large and rapid increases in deaths and serious illness. The objective of this paper is to develop an agent-based model to simulate the spread of pandemic influenza (novel H1N1) in Egypt. The proposed multi-agent model is based on the modeling of individuals' interactions in a space time context. The proposed model involves different types of parameters such as: social agent attributes, distribution of Egypt population, and patterns of agents' i...

  1. Characterization of mitochondrial haplogroups in a large population-based sample from the United States.

    Science.gov (United States)

    Mitchell, Sabrina L; Goodloe, Robert; Brown-Gentry, Kristin; Pendergrass, Sarah A; Murdock, Deborah G; Crawford, Dana C

    2014-07-01

    Mitochondrial DNA (mtDNA) haplogroups are valuable for investigations in forensic science, molecular anthropology, and human genetics. In this study, we developed a custom panel of 61 mtDNA markers for high-throughput classification of European, African, and Native American/Asian mitochondrial haplogroup lineages. Using these mtDNA markers, we constructed a mitochondrial haplogroup classification tree and classified 18,832 participants from the National Health and Nutrition Examination Surveys (NHANES). To our knowledge, this is the largest study to date characterizing mitochondrial haplogroups in a population-based sample from the United States, and the first study characterizing mitochondrial haplogroup distributions in self-identified Mexican Americans separately from Hispanic Americans of other descent. We observed clear differences in the distribution of maternal genetic ancestry consistent with proposed admixture models for these subpopulations, underscoring the genetic heterogeneity of the United States Hispanic population. The mitochondrial haplogroup distributions in the other self-identified racial/ethnic groups within NHANES were largely comparable to previous studies. Mitochondrial haplogroup classification was highly concordant with self-identified race/ethnicity (SIRE) in non-Hispanic whites (94.8 %), but was considerably lower in admixed populations including non-Hispanic blacks (88.3 %), Mexican Americans (81.8 %), and other Hispanics (61.6 %), suggesting SIRE does not accurately reflect maternal genetic ancestry, particularly in populations with greater proportions of admixture. Thus, it is important to consider inconsistencies between SIRE and genetic ancestry when performing genetic association studies. The mitochondrial haplogroup data that we have generated, coupled with the epidemiologic variables in NHANES, is a valuable resource for future studies investigating the contribution of mtDNA variation to human health and disease.

  2. Numerical simulation of base flow with hot base bleed for two jet models

    Institute of Scientific and Technical Information of China (English)

    Wen-jie YU; Yong-gang YU; Bin NI

    2014-01-01

    In order to improve the benefits of base bleed in the base flow field, the base flow with hot base bleed is studied for two jet models. Two-dimensional axisymmetric Navier-Stokes equations are computed using a finite volume scheme. The base flow of a cylinder afterbody with base bleed is simulated. The simulation results are validated against the experimental data, and the experimental results are well reproduced. On this basis, the base flow fields with base bleed for a circular jet model and an annular jet model are investigated for injection temperatures from 830 K to 2200 K. The results show that the base pressure of the annular jet model is higher than that of the circular jet model across the range of injection parameter and injection temperature. For the circular jet model, the hot gases are concentrated in the vicinity of the base. For the annular jet model, the bleed gases flow directly into the shear layer, so the hot gases are concentrated in the shear layer. The latter temperature distribution is better for increasing the base pressure.

  3. Powerline Communications Channel Modelling Methodology Based on Statistical Features

    CERN Document Server

    Tan, Bo

    2012-01-01

    This paper proposes a new channel modelling method for powerline communications networks based on the multipath profile in the time domain. The new channel model is developed to be applied in a range of Powerline Communications (PLC) research topics such as impulse noise modelling, deployment and coverage studies, and communications theory analysis. To develop the methodology, channels are categorised according to their propagation distance and power delay profile. The statistical multipath parameters such as path arrival time, magnitude and interval for each category are analyzed to build the model. Each channel generated from the proposed statistical model represents a different realisation of a PLC network. Simulation results in both the time and frequency domains show that the proposed statistical modelling method, which integrates the impact of network topology, reproduces the PLC channel features of the underlying transmission line theory model. Furthermore, two potential application scenarios are d...

  4. Hybrid and adaptive meta-model-based global optimization

    Science.gov (United States)

    Gu, J.; Li, G. Y.; Dong, Z.

    2012-01-01

    As an efficient and robust technique for global optimization, meta-model-based search methods have been increasingly used in solving complex and computation intensive design optimization problems. In this work, a hybrid and adaptive meta-model-based global optimization method that can automatically select appropriate meta-modelling techniques during the search process to improve search efficiency is introduced. The search initially applies three representative meta-models concurrently. Progress towards a better performing model is then introduced by selecting sample data points adaptively according to the calculated values of the three meta-models to improve modelling accuracy and search efficiency. To demonstrate the superior performance of the new algorithm over existing search methods, the new method is tested using various benchmark global optimization problems and applied to a real industrial design optimization example involving vehicle crash simulation. The method is particularly suitable for design problems involving computation intensive, black-box analyses and simulations.

  5. Multi-level spherical moments based 3D model retrieval

    Institute of Scientific and Technical Information of China (English)

    LIU Wei; HE Yuan-jun

    2006-01-01

    In this paper a novel 3D model retrieval method is proposed that employs multi-level spherical moment analysis and relies on voxelization and spherical mapping of the 3D models. For a given polygon-soup 3D model, a pose normalization step is first performed to align the model into a canonical coordinate frame and thus define the shape representation with respect to this orientation. Afterward we rasterize its exterior surface into cubical voxel grids; then a series of homocentric spheres, centered on the center of the voxel grid, cut the voxel grid into several spherical images. Finally, moments belonging to each sphere are computed, and the moments of all spheres constitute the descriptor of the model. Experiments showed that Euclidean distance based on this kind of feature vector can distinguish different 3D models well and that the 3D model retrieval system based on this algorithm yields satisfactory performance.
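    A stripped-down version of the shell idea: bin occupied voxels into concentric spheres about the centroid and use the per-shell occupancy (a zeroth-order moment) as the feature vector compared by Euclidean distance. This sketch omits pose normalization and the higher-order moments the paper computes:

```python
import math

def shell_descriptor(voxels, n_shells=8):
    """Bin occupied voxel centers into concentric shells about the centroid;
    the normalised per-shell occupancy is the shape feature vector."""
    cx = sum(v[0] for v in voxels) / len(voxels)
    cy = sum(v[1] for v in voxels) / len(voxels)
    cz = sum(v[2] for v in voxels) / len(voxels)
    radii = [math.dist((cx, cy, cz), v) for v in voxels]
    rmax = max(radii) or 1.0                       # guard degenerate shapes
    hist = [0.0] * n_shells
    for r in radii:
        hist[min(int(n_shells * r / rmax), n_shells - 1)] += 1.0
    return [h / len(voxels) for h in hist]         # count/scale invariant

def descriptor_distance(d1, d2):
    """Euclidean distance between two shell descriptors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(d1, d2)))

# two toy voxel clouds: a compact cube and an elongated rod
cube = [(x, y, z) for x in range(4) for y in range(4) for z in range(4)]
rod = [(i, 0, 0) for i in range(64)]
d_cube, d_rod = shell_descriptor(cube), shell_descriptor(rod)
```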

  6. Research on Bayesian Network Based User's Interest Model

    Institute of Scientific and Technical Information of China (English)

    ZHANG Weifeng; XU Baowen; CUI Zifeng; XU Lei

    2007-01-01

    Filtering and selectively retrieving the vast amount of information on the Internet is of great practical significance for improving the quality of the information users access. On the basis of analyzing the existing users' interest models and some basic questions of users' interest (representation, derivation and identification of users' interest), a Bayesian network based users' interest model is given. In this model, the users' interest reduction algorithm based on the Markov blanket model is used to reduce interest noise, and then documents the user is and is not interested in are used to train the Bayesian network. Compared to the simple model, this model has advantages such as small space requirements, a simple reasoning method and a high recognition rate. The experimental results show this model more appropriately reflects the user's interest, and has higher performance and good usability.

  7. An Agent-Based Modeling for Pandemic Influenza in Egypt

    CERN Document Server

    Khalil, Khaled M; Nazmy, Taymour T; Salem, Abdel-Badeeh M

    2010-01-01

    Pandemic influenza has great potential to cause large and rapid increases in deaths and serious illness. The objective of this paper is to develop an agent-based model to simulate the spread of pandemic influenza (novel H1N1) in Egypt. The proposed multi-agent model is based on the modeling of individuals' interactions in a space time context. The proposed model involves different types of parameters such as: social agent attributes, distribution of Egypt population, and patterns of agents' interactions. Analysis of modeling results leads to understanding the characteristics of the modeled pandemic, transmission patterns, and the conditions under which an outbreak might occur. In addition, the proposed model is used to measure the effectiveness of different control strategies to intervene the pandemic spread.
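    A minimal agent-based epidemic sketch in the same spirit, with uniform random mixing standing in for the paper's space-time interaction patterns (all parameter values are hypothetical, not Egypt-specific):

```python
import random

def run_abm(n=500, seed_cases=5, p_transmit=0.04, contacts=10,
            infectious_days=7, days=120, seed=42):
    """Minimal agent-based SIR: each infectious agent meets `contacts`
    random agents per day; susceptibles are infected with probability
    p_transmit per contact. States: 0 = S, 1 = I, 2 = R."""
    rnd = random.Random(seed)
    state = [1] * seed_cases + [0] * (n - seed_cases)
    sick_for = [0] * n
    curve = []                           # daily count of infectious agents
    for _ in range(days):
        for i in [k for k in range(n) if state[k] == 1]:
            for _ in range(contacts):
                j = rnd.randrange(n)
                if state[j] == 0 and rnd.random() < p_transmit:
                    state[j] = 1         # newly infected
        for i in range(n):
            if state[i] == 1:
                sick_for[i] += 1
                if sick_for[i] >= infectious_days:
                    state[i] = 2         # recovered, immune
        curve.append(sum(s == 1 for s in state))
    return curve, state

curve, final_state = run_abm()
```

    Control strategies can be compared by re-running with, e.g., fewer daily contacts (social distancing) or a lower p_transmit, and inspecting the peak of `curve`.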

  8. Integrating Design Decision Management with Model-based Software Development

    DEFF Research Database (Denmark)

    Könemann, Patrick

    Design decisions are continuously made during the development of software systems and are important artifacts for design documentation. Dedicated decision management systems are often used to capture such design knowledge. Most such systems are, however, separated from the design artifacts...... of the system. In model-based software development, where design models are used to develop a software system, outcomes of many design decisions have big impact on design models. The realization of design decisions is often manual and tedious work on design models. Moreover, keeping design models consistent...... with all made decisions is a time-consuming manual task that is often performed in peer reviews. In this thesis, a generic technology has been developed for extracting model differences from models and transferring them to other models. These concepts, called model-independent differences, can be used...

  9. Simulating cancer growth with multiscale agent-based modeling.

    Science.gov (United States)

    Wang, Zhihui; Butner, Joseph D; Kerketta, Romica; Cristini, Vittorio; Deisboeck, Thomas S

    2015-02-01

    There have been many techniques developed in recent years to in silico model a variety of cancer behaviors. Agent-based modeling is a specific discrete-based hybrid modeling approach that allows simulating the role of diversity in cell populations as well as within each individual cell; it has therefore become a powerful modeling method widely used by computational cancer researchers. Many aspects of tumor morphology including phenotype-changing mutations, the adaptation to microenvironment, the process of angiogenesis, the influence of extracellular matrix, reactions to chemotherapy or surgical intervention, the effects of oxygen and nutrient availability, and metastasis and invasion of healthy tissues have been incorporated and investigated in agent-based models. In this review, we introduce some of the most recent agent-based models that have provided insight into the understanding of cancer growth and invasion, spanning multiple biological scales in time and space, and we further describe several experimentally testable hypotheses generated by those models. We also discuss some of the current challenges of multiscale agent-based cancer models.

  10. Image based 3D city modeling : Comparative study

    Science.gov (United States)

    Singh, S. P.; Jain, K.; Mandla, V. R.

    2014-06-01

    A 3D city model is a digital representation of the Earth's surface and its related objects such as buildings, trees, vegetation, and some manmade features belonging to an urban area. The demand for 3D city modeling is increasing rapidly for various engineering and non-engineering applications. Generally, four main image-based approaches are used to generate virtual 3D city models: sketch-based modeling, procedural-grammar-based modeling, close-range-photogrammetry-based modeling, and modeling based mainly on computer vision techniques. SketchUp, CityEngine, Photomodeler and Agisoft Photoscan are the main software packages representing these approaches respectively. These packages have different approaches and methods suitable for image-based 3D city modeling. The literature shows that, to date, no complete comparative study of creating complete 3D city models from images has been available. This paper gives a comparative assessment of these four image-based 3D modeling approaches, based mainly on data acquisition methods, data processing techniques and output 3D model products. For this research work, the study area is the campus of the civil engineering department, Indian Institute of Technology, Roorkee (India). This 3D campus acts as a prototype for a city. The study also explains various governing parameters, factors and work experiences, gives a brief introduction to the four image-based techniques along with their strengths and weaknesses, and comments on what can and cannot be done with each package. It concludes that every package has some advantages and limitations, and that the choice of software depends on the user's requirements for the 3D project. For a normal visualization project, SketchUp is a good option. For 3D documentation records, Photomodeler gives good results. For large city

  11. RPOA Model-Based Optimal Resource Provisioning

    Directory of Open Access Journals (Sweden)

    Noha El. Attar

    2014-01-01

    Full Text Available Optimal utilization of resources is the core of the provisioning process in cloud computing. Sometimes the local resources of a data center are not adequate to satisfy users' requirements, so providers need to create several data centers in different geographical areas around the world and spread users' applications across these resources to satisfy the QoS requirements of both service providers and customers. Given this expansion of resources and applications, transmission cost and time must be considered as significant factors in the allocation process. In our previous paper, a Resource Provision Optimal Algorithm (RPOA) based on Particle Swarm Optimization (PSO) was introduced to find near-optimal resource utilization subject to the customer's budget and deadline. This paper enhances the RPOA algorithm to find near-optimal resource utilization while also including data transfer time and cost, in addition to the customer's budget and deadline, in the performance measurement.

  12. Parameter optimization in differential geometry based solvation models.

    Science.gov (United States)

    Wang, Bao; Wei, G W

    2015-10-01

    Differential geometry (DG) based solvation models are a new class of variational implicit solvent approaches that are able to avoid unphysical solvent-solute boundary definitions and associated geometric singularities, and to dynamically couple polar and non-polar interactions in a self-consistent framework. Our earlier study indicates that the DG based non-polar solvation model outperforms other methods in non-polar solvation energy predictions. However, the DG based full solvation model has not shown its superiority in solvation analysis, due to the difficulty of its parametrization, which must ensure the stability of the solution of the strongly coupled nonlinear Laplace-Beltrami and Poisson-Boltzmann equations. In this work, we introduce new parameter learning algorithms based on perturbation and convex optimization theories to stabilize the numerical solution and thus achieve an optimal parametrization of the DG based solvation models. An interesting feature of the present DG based solvation model is that it provides accurate solvation free energy predictions for both polar and non-polar molecules in a unified formulation. Extensive numerical experiments demonstrate that the present DG based solvation model delivers some of the most accurate predictions of the solvation free energies for a large number of molecules.

  13. Error model identification of inertial navigation platform based on errors-in-variables model

    Institute of Scientific and Technical Information of China (English)

    Liu Ming; Liu Yu; Su Baoku

    2009-01-01

    Because the real input acceleration cannot be obtained during the error model identification of an inertial navigation platform, both the input and output data contain noise. In this case, the conventional regression model and the least squares (LS) method will produce biased estimates. Based on the models of the inertial navigation platform error and the observation error, the errors-in-variables (EV) model and the total least squares (TLS) method are proposed to identify the error model of the inertial navigation platform. The estimation precision is improved, and the results are better than those of the LS method based on the conventional regression model. The simulation results illustrate the effectiveness of the proposed method.
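
The LS-versus-TLS contrast drawn in this abstract can be sketched numerically. When the regressors themselves are noisy (the errors-in-variables setting), ordinary least squares is biased toward zero, while the classical SVD-based TLS estimator is not. The synthetic data below are illustrative; this is not the paper's platform error model.

```python
import numpy as np

def total_least_squares(A, b):
    """SVD-based total least squares: the solution is read off the right
    singular vector of the augmented matrix [A | b] associated with the
    smallest singular value."""
    n = A.shape[1]
    _, _, Vt = np.linalg.svd(np.hstack([A, b.reshape(-1, 1)]))
    v = Vt[-1]               # singular vector for the smallest singular value
    return -v[:n] / v[n]

# Errors-in-variables setting: noise on BOTH the inputs and the output.
rng = np.random.default_rng(0)
x_true = np.array([2.0, -1.0])
A_clean = rng.normal(size=(2000, 2))
b = A_clean @ x_true + 0.5 * rng.normal(size=2000)   # noisy output
A = A_clean + 0.5 * rng.normal(size=(2000, 2))       # noisy input

x_ls = np.linalg.lstsq(A, b, rcond=None)[0]   # biased (attenuated) here
x_tls = total_least_squares(A, b)             # consistent under EV noise
```

With this data, the LS estimate is attenuated (roughly `x_true / 1.25` for this noise level), whereas the TLS estimate stays close to `x_true`.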

  14. Using Built-In Domain-Specific Modeling Support to Guide Model-Based Test Generation

    CERN Document Server

    Kanstrén, Teemu; 10.4204/EPTCS.80.5

    2012-01-01

    We present a model-based testing approach to support automated test generation with domain-specific concepts. This includes a language expert who is an expert at building test models and domain experts who are experts in the domain of the system under test. First, we provide a framework to support the language expert in building test models using a full (Java) programming language with the help of simple but powerful modeling elements of the framework. Second, based on the model built with this framework, the toolset automatically forms a domain-specific modeling language that can be used to further constrain and guide test generation from these models by a domain expert. This makes it possible to generate a large set of test cases covering the full model, chosen (constrained) parts of the model, or manually define specific test cases on top of the model while using concepts familiar to the domain experts.

  15. RESEARCH ON VIRTUAL-PART-BASED CONNECTING ELEMENT MODELING

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    Based on an analysis of the inner characteristics of inter-part detail modification and the assembly relations of mechanical connecting elements, the idea of extending part feature modeling to inter-part feature modeling for assembly purposes is presented, and virtual-part-based connecting element modeling is proposed. During assembly modeling, base parts are modified by a Boolean subtraction between the virtual part and the part to be connected. A dynamic matching algorithm, based on a list database, is designed for the dynamic extension and off-line editing of connecting parts and virtual parts, and the design rules of connecting elements are encapsulated by the virtual part. A prototype software module for the rapid design of connecting elements is implemented on the self-developed CAD/CAM platform SuperMan.

  16. CDMBE: A Case Description Model Based on Evidence

    Directory of Open Access Journals (Sweden)

    Jianlin Zhu

    2015-01-01

    Full Text Available By combining the advantages of argument maps and Bayesian networks, a case description model based on evidence (CDMBE), which is suitable for the continental law system, is proposed to describe criminal cases. The model adopts credibility-based logical reasoning and quantifies evidence-based reasoning from the available evidence. To be consistent with practical inference rules, five types of relationships and a set of rules are defined to calculate the credibility of assumptions from the credibility and supportability of the related evidence. Experiments show that the model can capture users' ideas in a single figure and that the results calculated from CDMBE are in line with those from a Bayesian model.

  17. Model-based approach for elevator performance estimation

    Science.gov (United States)

    Esteban, E.; Salgado, O.; Iturrospe, A.; Isasa, I.

    2016-02-01

    In this paper, a dynamic model of an elevator installation is presented in the state space domain. The model comprises both the mechanical and the electrical subsystems, including the electrical machine and a closed-loop field oriented control. The proposed model is employed for monitoring the condition of the elevator installation, using the Kalman filter as an observer. The Kalman observer estimates the elevator car acceleration, which determines the elevator ride quality, based solely on the machine control signature and the encoder signal. Five elevator key performance indicators are then calculated from the estimated car acceleration. The proposed procedure is evaluated experimentally by comparing the key performance indicators calculated from the estimated car acceleration with those obtained from actual acceleration measurements on a test bench, and is finally compared with a sliding mode observer.
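
The observer idea can be illustrated with a generic discrete Kalman filter. The kinematic state model, sample time, and noise covariances below are illustrative assumptions, not the paper's elevator model (which is also driven by the machine control signature); here acceleration is inferred from noisy position readings alone.

```python
import numpy as np

dt = 0.01                           # sample time (assumed)

# Kinematic state [position, velocity, acceleration]; the encoder reads position.
F = np.array([[1.0, dt, 0.5 * dt**2],
              [0.0, 1.0, dt],
              [0.0, 0.0, 1.0]])
H = np.array([[1.0, 0.0, 0.0]])
Q = 1e-4 * np.eye(3)                # process noise (tuning choice)
R = np.array([[1e-6]])              # encoder noise covariance

def kalman_step(x, P, z):
    """One predict/update cycle of the discrete Kalman filter."""
    x = F @ x                       # predict
    P = F @ P @ F.T + Q
    S = H @ P @ H.T + R             # update with encoder reading z
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(3) - K @ H) @ P
    return x, P

# Simulated ride: constant 1.2 m/s^2 acceleration, 1 mm encoder noise.
rng = np.random.default_rng(1)
x_est, P = np.zeros(3), np.eye(3)
for k in range(500):
    t = k * dt
    z = np.array([0.5 * 1.2 * t**2 + 1e-3 * rng.normal()])
    x_est, P = kalman_step(x_est, P, z)
# x_est[2] now holds the estimated car acceleration (close to 1.2)
```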

  18. Model Based Iterative Reconstruction for Bright Field Electron Tomography (Postprint)

    Science.gov (United States)

    2013-02-01

    ...the reconstruction when typical algorithms such as Filtered Back Projection (FBP) and the Simultaneous Iterative Reconstruction Technique (SIRT) are applied to the data. Model based iterative reconstruction (MBIR) provides a powerful framework for tomographic...

  19. Physics-Based Pneumatic Hammer Instability Model Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this project is to develop a physics-based pneumatic hammer instability model that accurately predicts the stability of hydrostatic bearings...

  20. A Model-based Prognostics Approach Applied to Pneumatic Valves

    Data.gov (United States)

    National Aeronautics and Space Administration — Within the area of systems health management, the task of prognostics centers on predicting when components will fail. Model-based prognostics exploits domain...

  1. A Model-Based Prognostics Approach Applied to Pneumatic Valves

    Data.gov (United States)

    National Aeronautics and Space Administration — Within the area of systems health management, the task of prognostics centers on predicting when components will fail. Model-based prognostics exploits domain...

  2. Model-based Prognostics with Fixed-lag Particle Filters

    Data.gov (United States)

    National Aeronautics and Space Administration — Model-based prognostics exploits domain knowledge of the system, its components, and how they fail by casting the underlying physical phenomena in a...

  3. A Model-based Avionic Prognostic Reasoner (MAPR)

    Data.gov (United States)

    National Aeronautics and Space Administration — The Model-based Avionic Prognostic Reasoner (MAPR) presented in this paper is an innovative solution for non-intrusively monitoring the state of health (SoH) and...

  4. Probabilistic Model-Based Diagnosis for Electrical Power Systems

    Data.gov (United States)

    National Aeronautics and Space Administration — We present in this article a case study of the probabilistic approach to model-based diagnosis. Here, the diagnosed system is a real-world electrical power system,...

  5. Human insulin dynamics in women: a physiologically based model.

    Science.gov (United States)

    Weiss, Michael; Tura, Andrea; Kautzky-Willer, Alexandra; Pacini, Giovanni; D'Argenio, David Z

    2016-02-01

    Currently available models of insulin dynamics are mostly based on the classical compartmental structure and, thus, their physiological utility is limited. In this work, we describe the development of a physiologically based model and its application to data from 154 patients who underwent an insulin-modified intravenous glucose tolerance test (IM-IVGTT). To determine the time profile of endogenous insulin delivery without using C-peptide data and to evaluate the transcapillary transport of insulin, the hepatosplanchnic, renal, and peripheral beds were incorporated into the circulatory model as separate subsystems. Physiologically reasonable population mean estimates were obtained for all estimated model parameters, including plasma volume, interstitial volume of the peripheral circulation (mainly skeletal muscle), uptake clearance into the interstitial space, hepatic and renal clearance, as well as total insulin delivery into plasma. The results indicate that, at a population level, the proposed physiologically based model provides a useful description of insulin disposition, which allows for the assessment of muscle insulin uptake.

  6. Cyberpsychology: a human-interaction perspective based on cognitive modeling.

    Science.gov (United States)

    Emond, Bruno; West, Robert L

    2003-10-01

    This paper argues for the relevance of cognitive modeling and cognitive architectures to cyberpsychology. From a human-computer interaction point of view, cognitive modeling can have benefits both for theory and model building, and for the design and evaluation of sociotechnical systems usability. Cognitive modeling research applied to human-computer interaction has two complementary objectives: (1) to develop theories and computational models of human interactive behavior with information and collaborative technologies, and (2) to use the computational models as building blocks for the design, implementation, and evaluation of interactive technologies. From the perspective of building theories and models, cognitive modeling offers the possibility to anchor cyberpsychology theories and models in cognitive architectures. From the perspective of the design and evaluation of socio-technical systems, cognitive models can provide the basis for simulated users, which can play an important role in usability testing. As an example of the application of cognitive modeling to technology design, the paper presents a simulation of interactive behavior with five different adaptive menu algorithms: random, fixed, stacked, frequency based, and activation based. Results of the simulation indicate that fixed menu positions seem to offer the best support for classification-like tasks such as filing e-mails. This research is part of the Human-Computer Interaction and Broadband Visual Communication research programs at the National Research Council of Canada, in collaboration with the Carleton Cognitive Modeling Lab at Carleton University.
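
The fixed-versus-adaptive menu comparison can be caricatured with a toy cost model: a user pays one step when an item is where they remember it, and a linear scan when it has moved. The item set, usage distribution, and search-cost assumptions below are hypothetical, not those of the paper's cognitive simulations.

```python
import random
from collections import defaultdict

ITEMS = [f"folder{i}" for i in range(10)]

def search_cost(menu, item, remembered_pos):
    """Hypothetical cost model: 1 step if the item is where the user
    remembers it, otherwise a linear scan to its current position."""
    pos = menu.index(item)
    return 1 if remembered_pos.get(item) == pos else pos + 1

def simulate(adaptive, n_trials=2000, seed=42):
    rng = random.Random(seed)
    menu = list(ITEMS)
    counts = defaultdict(int)
    remembered = {}
    total = 0
    # Zipf-like usage: low-numbered folders are selected more often
    weights = [1 / (i + 1) for i in range(len(ITEMS))]
    for _ in range(n_trials):
        item = rng.choices(ITEMS, weights)[0]
        total += search_cost(menu, item, remembered)
        remembered[item] = menu.index(item)   # user memorizes current slot
        counts[item] += 1
        if adaptive:   # frequency-based: reorder by use count each trial
            menu = sorted(ITEMS, key=lambda it: -counts[it])
    return total / n_trials

fixed_cost = simulate(adaptive=False)
freq_cost = simulate(adaptive=True)
print(fixed_cost, freq_cost)
```

Under this cost model the fixed layout never invalidates the user's positional memory, which is one intuition behind the paper's finding that fixed positions supported the filing task best.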

  7. A NEW DYNAMIC DEFENSE MODEL BASED ON ACTIVE DECEPTION

    Institute of Scientific and Technical Information of China (English)

    Gong Jing; Sun Zhixin; Gu Qiang

    2009-01-01

    Aiming at the shortcomings of traditional passive deception models, this paper constructs a Decoy Platform based on Intelligent Agents (DPIA) to realize dynamic defense. The paper explores a new dynamic defense model based on active deception, introduces its architecture, and expatiates on the communication methods and the security guarantees for information transfer. Simulation results show that the DPIA can attract hacker activity, lead abnormal traffic into itself, disperse a large number of attack data, and ensure real network security.

  8. Hidden Markov Model Based Automated Fault Localization for Integration Testing

    OpenAIRE

    Ge, Ning; NAKAJIMA, SHIN; Pantel, Marc

    2013-01-01

    International audience; Integration testing is an expensive activity in software testing, especially for fault localization in complex systems. Model-based diagnosis (MBD) provides various benefits in terms of scalability and robustness. In this work, we propose a novel MBD approach for automated fault localization in integration testing. Our method is based on the Hidden Markov Model (HMM), an abstraction of a system's components used to simulate component behaviour. The core of this metho...
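
A minimal sketch of the HMM idea behind such an approach: hidden component health states are decoded from observed integration-test verdicts with the Viterbi algorithm. The two-state model and all probabilities below are invented for illustration; they are not the paper's component abstraction.

```python
import numpy as np

# Hypothetical two-state component model: hidden state is OK or FAULTY,
# observations are integration-test verdicts (0 = pass, 1 = fail).
states = ["OK", "FAULTY"]
start = np.array([0.9, 0.1])
trans = np.array([[0.95, 0.05],    # OK tends to stay OK
                  [0.10, 0.90]])   # faults tend to persist
emit = np.array([[0.9, 0.1],       # OK components mostly pass
                 [0.2, 0.8]])      # faulty components mostly fail

def viterbi(obs):
    """Most likely hidden state sequence (log-space Viterbi)."""
    n = len(obs)
    logd = np.log(start) + np.log(emit[:, obs[0]])
    back = np.zeros((n, 2), dtype=int)
    for t in range(1, n):
        scores = logd[:, None] + np.log(trans)   # scores[i, j]: i -> j
        back[t] = scores.argmax(axis=0)
        logd = scores.max(axis=0) + np.log(emit[:, obs[t]])
    path = [int(logd.argmax())]
    for t in range(n - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return [states[s] for s in reversed(path)]

# A run of verdicts: passes, then a burst of failures
print(viterbi([0, 0, 0, 1, 1, 1, 1, 0]))
```

The decoder attributes the failure burst to a sustained FAULTY phase rather than to isolated flaky verdicts, which is the kind of localization signal an HMM abstraction provides.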

  9. Discrete Discriminant analysis based on tree-structured graphical models

    DEFF Research Database (Denmark)

    Perez de la Cruz, Gonzalo; Eslava, Guillermina

    The purpose of this paper is to illustrate the potential use of discriminant analysis based on tree-structured graphical models for discrete variables. This is done by comparing its empirical performance using estimated error rates for real and simulated data. The results show that discriminant...... analysis based on tree-structured graphical models is a simple nonlinear method competitive with, and sometimes superior to, other well-known linear methods like those assuming mutual independence between variables and linear logistic regression....

  10. Intelligent Cost Modeling Based on Soft Computing for Avionics Systems

    Institute of Scientific and Technical Information of China (English)

    ZHU Li-li; LI Zhuang-sheng; XU Zong-ze

    2006-01-01

    In parametric cost estimating, objections to using statistical Cost Estimating Relationships (CERs) and parametric models include problems of low statistical significance due to limited data points, biases in the underlying data, and lack of robustness. Soft Computing (SC) technologies are used for building intelligent cost models. The SC models are systemically evaluated based on their training and prediction of the historical cost data of airborne avionics systems. Results indicating the strengths and weaknesses of each model are presented. In general, the intelligent cost models have higher prediction precision, better data adaptability, and stronger self-learning capability than the regression CERs.

  11. A Prototype-Based Resonance Model of Rhythm Categorization

    Directory of Open Access Journals (Sweden)

    Rasmus Bååth

    2014-10-01

    Full Text Available Categorization of rhythmic patterns is prevalent in musical practice, an example of this being the transcription of (possibly not strictly metrical) music into musical notation. In this article we implement a dynamical systems' model of rhythm categorization based on the resonance theory of rhythm perception developed by Large (2010). This model is used to simulate the categorical choices of participants in two experiments of Desain and Honing (2003). The model accurately replicates the experimental data. Our results support resonance theory as a viable model of rhythm perception and show that by viewing rhythm perception as a dynamical system it is possible to model central properties of rhythm categorization.

  12. A prototype-based resonance model of rhythm categorization.

    Science.gov (United States)

    Bååth, Rasmus; Lagerstedt, Erik; Gärdenfors, Peter

    2014-01-01

    Categorization of rhythmic patterns is prevalent in musical practice, an example of this being the transcription of (possibly not strictly metrical) music into musical notation. In this article we implement a dynamical systems' model of rhythm categorization based on the resonance theory of rhythm perception developed by Large (2010). This model is used to simulate the categorical choices of participants in two experiments of Desain and Honing (2003). The model accurately replicates the experimental data. Our results support resonance theory as a viable model of rhythm perception and show that by viewing rhythm perception as a dynamical system it is possible to model central properties of rhythm categorization.

  13. A New Model of Ultracapacitors Based on Fractal Fundamentals

    Directory of Open Access Journals (Sweden)

    Xiaodong Zhang

    2014-01-01

    Full Text Available An intuitive model is proposed in this paper to describe the electrical behavior of certain ultracapacitors. The model is based on a simple expression that can be fully characterized by five real numbers. In this paper, the measured impedances of three ultracapacitors as a function of frequency are compared to model results. There is good agreement between the model and measurements. Results presented in a previous study are also reviewed and the paper demonstrates that those results are also consistent with the newly described model.

  14. A General Polygon-based Deformable Model for Object Recognition

    DEFF Research Database (Denmark)

    Jensen, Rune Fisker; Carstensen, Jens Michael

    1999-01-01

    We propose a general scheme for object localization and recognition based on a deformable model. The model combines shape and image properties by warping an arbitrary prototype intensity template according to the deformation in shape. The shape deformations are constrained by a probabilistic...... distribution, which, combined with a match of the warped intensity template and the image, forms the final criterion used for localization and recognition of a given object. The chosen representation gives the model the ability to model an almost arbitrary object. Besides the actual model, a full general scheme...

  15. Petri Net-Based Workflow Access Control Model

    Institute of Scientific and Technical Information of China (English)

    陈卓; 骆婷; 石磊; 洪帆

    2004-01-01

    Access control is an important protection mechanism for information systems. This paper shows how to implement access control in a workflow system. We give a workflow access control model (WACM) based on several current access control models. The model supports role assignment and dynamic authorization. The paper defines the workflow using Petri nets: it first gives the definition and description of the workflow, then analyzes the architecture of the workflow access control model (WACM). Finally, an example of an e-commerce workflow access control model is discussed in detail.
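
A Petri-net definition of a workflow can be sketched minimally: a transition is enabled when every input place holds enough tokens, and firing it moves tokens to the output places. The two-step approval net below is a hypothetical example, not the paper's WACM model.

```python
# Minimal place/transition Petri net.
class PetriNet:
    def __init__(self, places):
        self.marking = dict(places)          # place -> token count
        self.transitions = {}                # name -> (inputs, outputs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking[p] >= n for p, n in inputs.items())

    def fire(self, name):
        if not self.enabled(name):
            raise RuntimeError(f"transition {name!r} is not enabled")
        inputs, outputs = self.transitions[name]
        for p, n in inputs.items():          # consume input tokens
            self.marking[p] -= n
        for p, n in outputs.items():         # produce output tokens
            self.marking[p] = self.marking.get(p, 0) + n

# Toy two-step approval workflow (hypothetical example)
net = PetriNet({"submitted": 1, "reviewed": 0, "approved": 0})
net.add_transition("review", {"submitted": 1}, {"reviewed": 1})
net.add_transition("approve", {"reviewed": 1}, {"approved": 1})

assert not net.enabled("approve")    # approval must wait for review
net.fire("review")
net.fire("approve")
print(net.marking)                   # {'submitted': 0, 'reviewed': 0, 'approved': 1}
```

The enabling rule is what lets a workflow net encode ordering constraints such as "approve only after review", which an access control layer can then build on.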

  16. Toward a validation process for model based safety analysis

    OpenAIRE

    Adeline, Romain; Cardoso, Janette; Darfeuil, Pierre; Humbert, Sophie; Seguin, Christel

    2010-01-01

    Today, model-based processes are becoming more and more widespread for analyzing systems. However, there is no formal testing approach to ensure that the formal model is compliant with the real system. In this paper, we choose to study AltaRica models. We present a general process to soundly construct and validate an AltaRica formal model, with the focus on the validation phase, i.e. verifying the compliance between the model and the real system. For this, the proposed process recommends...

  17. Model-Based Reconstructive Elasticity Imaging Using Ultrasound

    Directory of Open Access Journals (Sweden)

    Salavat R. Aglyamov

    2007-01-01

    Full Text Available Elasticity imaging is a reconstructive imaging technique where tissue motion in response to mechanical excitation is measured using modern imaging systems, and the estimated displacements are then used to reconstruct the spatial distribution of Young's modulus. Here we present an ultrasound elasticity imaging method that utilizes the model-based technique for Young's modulus reconstruction. Based on the geometry of the imaged object, only one axial component of the strain tensor is used. The numerical implementation of the method is highly efficient because the reconstruction is based on an analytic solution of the forward elastic problem. The model-based approach is illustrated using two potential clinical applications: differentiation of liver hemangioma and staging of deep venous thrombosis. Overall, these studies demonstrate that model-based reconstructive elasticity imaging can be used in applications where the geometry of the object and the surrounding tissue is somewhat known and certain assumptions about the pathology can be made.

  18. Electromagnetic Model and Image Reconstruction Algorithms Based on EIT System

    Institute of Scientific and Technical Information of China (English)

    CAO Zhang; WANG Huaxiang

    2006-01-01

    An intuitive 2D model of a circular electrical impedance tomography (EIT) sensor with small electrodes is established based on the theory of analytic functions. The validity of the model is proved using the solution of the Laplace equation. Suggestions on electrode optimization and an explanation of the ill-conditioned sensitivity matrix are provided based on the model, which takes electrode distance into account and can be generalized to any sensor over a simply connected region through a conformal transformation. Image reconstruction algorithms based on the model are implemented to show the feasibility of the model using experimental data collected from the EIT system developed at Tianjin University. In a simulation with a human-chest-like configuration, electrical conductivity distributions are reconstructed using equi-potential backprojection (EBP) and Tikhonov regularization (TR) based on a conformal transformation of the model. The algorithms based on the model are suitable for online image reconstruction, and the reconstructed results are good in both size and position.

  19. CEAI: CCM-based email authorship identification model

    Directory of Open Access Journals (Sweden)

    Sarwat Nizamani

    2013-11-01

    Full Text Available In this paper we present a model for email authorship identification (EAI) by employing a Cluster-based Classification (CCM) technique. Traditionally, stylometric features have been successfully employed in various authorship analysis tasks; we extend the traditional feature set to include some more interesting and effective features for email authorship identification (e.g., the last punctuation mark used in an email, the tendency of an author to use capitalization at the start of an email, or the punctuation after a greeting or farewell). We also included Info Gain feature selection based content features. It is observed that the use of such features in the authorship identification process has a positive impact on the accuracy of the authorship identification task. We performed experiments to justify our arguments and compared the results with other baseline models. Experimental results reveal that the proposed CCM-based email authorship identification model, along with the proposed feature set, outperforms the state-of-the-art support vector machine (SVM)-based models, as well as the models proposed by Iqbal et al. (2010, 2013) [1,2]. The proposed model attains accuracy rates of 94% for 10 authors, 89% for 25 authors, and 81% for 50 authors, respectively, on the Enron dataset, while 89.5% accuracy has been achieved on a real email dataset constructed by the authors. The results on the Enron dataset have been achieved for quite a large number of authors compared to the models proposed by Iqbal et al. [1,2].
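
A few of the stylometric cues the abstract names (the last punctuation mark, capitalization at the start of an email) are easy to sketch as feature extractors. The function below is a simplified illustration, not the paper's full CCM pipeline, and the extra `exclamation_rate` feature is an invented addition of the same flavor.

```python
import re

def email_style_features(body):
    """Extract a few stylometric cues of the kind the CEAI abstract
    describes (simplified illustration)."""
    stripped = body.strip()
    punct = re.findall(r"[.!?,;:]", stripped)
    return {
        "last_punctuation": punct[-1] if punct else None,
        "starts_capitalized": bool(stripped) and stripped[0].isupper(),
        "exclamation_rate": stripped.count("!") / max(len(stripped), 1),
    }

features = email_style_features("hi team, the build is green! ship it")
print(features)
```

Vectors like this, one per email, are what a cluster-based or SVM classifier would then be trained on.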

  20. The impact of design-based modeling instruction on seventh graders' spatial abilities and model-based argumentation

    Science.gov (United States)

    McConnell, William J.

    Due to the call of current science education reform for the integration of engineering practices within science classrooms, design-based instruction is receiving much attention in science education literature. Although some aspect of modeling is often included in well-known design-based instructional methods, it is not always a primary focus. The purpose of this study was to better understand how design-based instruction with an emphasis on scientific modeling might impact students' spatial abilities and their model-based argumentation abilities. In the following mixed-method multiple case study, seven seventh grade students attending a secular private school in the Mid-Atlantic region of the United States underwent an instructional intervention involving design-based instruction, modeling and argumentation. Through the course of a lesson involving students in exploring the interrelatedness of the environment and an animal's form and function, students created and used multiple forms of expressed models to assist them in model-based scientific argument. Pre/post data were collected through the use of The Purdue Spatial Visualization Test: Rotation, the Mental Rotation Test and interviews. Other data included a spatial activities survey, student artifacts in the form of models, notes, exit tickets, and video recordings of students throughout the intervention. Spatial abilities tests were analyzed using descriptive statistics while students' arguments were analyzed using the Instrument for the Analysis of Scientific Curricular Arguments and a behavior protocol. Models were analyzed using content analysis and interviews and all other data were coded and analyzed for emergent themes. Findings in the area of spatial abilities included increases in spatial reasoning for six out of seven participants, and an immense difference in the spatial challenges encountered by students when using CAD software instead of paper drawings to create models. 
Students perceived 3D printed

  1. Model Checking-Based Testing of Web Applications

    Institute of Scientific and Technical Information of China (English)

    ZENG Hongwei; MIAO Huaikou

    2007-01-01

    A formal model representing the navigation behavior of a Web application as a Kripke structure is proposed, and an approach that applies model checking to test case generation is presented. The Object Relation Diagram, as the object model, is employed to describe the object structure of a Web application design and can be translated into the behavior model. A key problem of model checking-based test generation for a Web application is how to construct a set of trap properties that are intended to cause violations of model checking against the behavior model and thereby output counterexamples used to construct the test sequences. We give an algorithm that derives trap properties from the object model with respect to node and edge coverage criteria.

  2. A DSM-based framework for integrated function modelling

    DEFF Research Database (Denmark)

    Eisenbart, Boris; Gericke, Kilian; Blessing, Lucienne T. M.

    2017-01-01

    Function modelling is proposed in the literature from different disciplines, in interdisciplinary approaches, and used in practice with the intention of facilitating system conceptualisation. However, function models across disciplines are largely diverse addressing different function modelling...... an integrated function modelling framework, which specifically aims at relating between the different function modelling perspectives prominently addressed in different disciplines. It uses interlinked matrices based on the concept of DSM and MDM in order to facilitate cross-disciplinary modelling and analysis...... perspectives and using different structures and forms for representing the contained information. This hampers the exchange of information between the models and poses particular challenges to joint modelling and shared comprehension between designers from different disciplines. This article proposes...

  3. MDA based-approach for UML Models Complete Comparison

    CERN Document Server

    Chaouni, Samia Benabdellah; Mouline, Salma

    2011-01-01

    If a modeling task is distributed, it will frequently be necessary to integrate models developed by different team members. Problems occur in the model integration step, particularly in the comparison phase of the integration. This issue has been discussed in several domains and for various models. However, previous approaches have not correctly handled semantic comparison. In this paper, we provide an MDA-based approach for model comparison which aims at comparing UML models. We develop a hybrid approach which takes into account syntactic, semantic and structural comparison aspects. For this purpose, we use a domain ontology as well as other resources such as dictionaries. We propose a decision support system which permits the user to validate (or not) correspondences extracted in the comparison phase. For implementation, we propose an extension of the generic correspondence metamodel AMW in order to transform UML models to the correspondence model.

  4. Evaluation of Artificial Intelligence Based Models for Chemical Biodegradability Prediction

    Directory of Open Access Journals (Sweden)

    Aleksandar Sabljic

    2004-12-01

    Full Text Available This study presents a review of biodegradability modeling efforts, including a detailed assessment of two models developed using an artificial intelligence based methodology. Validation results for these models on an independent, quality-reviewed database demonstrate that the models perform well when compared, against the same data, to another commonly used biodegradability model. The ability of models induced by an artificial intelligence methodology to accommodate complex interactions in detailed systems, and the demonstrated reliability of the approach evaluated by this study, indicate that the methodology may have application in broadening the scope of biodegradability models. Given adequate data on the biodegradability of chemicals under environmental conditions, this may allow for the development of future models that include, for example, surface interface impacts on biodegradability.

  5. Map-Based Channel Model for Urban Macrocell Propagation Scenarios

    Directory of Open Access Journals (Sweden)

    Jose F. Monserrat

    2015-01-01

    Full Text Available The evolution of LTE towards 5G has started, and different research projects and institutions are in the process of verifying new technology components through simulations. Coordination between groups is strongly recommended and, in this sense, a common definition of test cases and simulation models is needed. The scope of this paper is to present a realistic channel model for urban macrocell scenarios. This model is map-based and takes into account the layout of the buildings situated in the area under study. A detailed description of the model is given, together with a comparison with other widely used channel models. The benchmark includes a measurement campaign in which the proposed model is shown to be much closer to the actual behavior of a cellular system. Particular attention is given to the outdoor component of the model, since it is here that the proposed approach differs most from previous models.

  6. Improved Computational Model of Grid Cells Based on Column Structure

    Institute of Scientific and Technical Information of China (English)

    Yang Zhou; Dewei Wu; Weilong Li; Jia Du

    2016-01-01

    To simulate the firing pattern of biological grid cells, this paper presents an improved computational model of grid cells based on column structure. In this model, the displacement along different directions is processed by a modulus operation, and the obtained remainder is associated with the firing rate of the grid cell. Compared with the original model, two parts are improved: the base of the modulus operation is changed, and the firing rate in the firing field is encoded by a Gaussian-like function. Simulation validates that the firing pattern generated by the improved computational model is more consistent with biological characteristics than that of the original model. Moreover, although the firing pattern is badly degraded by cumulative positioning error, the computational model can still generate the regular hexagonal firing pattern when the real-time positioning results are corrected.
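The mechanism the abstract describes (displacement processed by a modulus operation, with the remainder mapped to a firing rate through a Gaussian-like function) can be sketched roughly as follows. All parameter values, and the three-direction layout used to obtain a hexagonal pattern, are illustrative assumptions, not taken from the paper:

```python
import math

def grid_cell_firing_rate(x, y, spacing=1.0, sigma=0.2,
                          directions_deg=(0.0, 60.0, 120.0)):
    """Toy modulus-based grid cell: project the displacement onto three
    directions 60 degrees apart, wrap each projection with a modulus
    operation, and combine Gaussian-like bumps around the remainders."""
    rate = 1.0
    for deg in directions_deg:
        theta = math.radians(deg)
        proj = x * math.cos(theta) + y * math.sin(theta)
        remainder = proj % spacing              # modulus operation on displacement
        # distance to the nearest firing-field centre along this direction
        d = min(remainder, spacing - remainder)
        rate *= math.exp(-(d ** 2) / (2 * sigma ** 2))  # Gaussian-like tuning
    return rate

print(grid_cell_firing_rate(0.0, 0.0))        # maximal at a lattice point
print(grid_cell_firing_rate(0.5, 0.5) < 1.0)  # lower away from field centres
```

The product over three wrapped directions is what yields the periodic, hexagon-like arrangement of firing fields.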

  7. Support vector machine-based multi-model predictive control

    Institute of Scientific and Technical Information of China (English)

    Zhejing BA; Youxian SUN

    2008-01-01

    In this paper, a support vector machine-based multi-model predictive control is proposed, in which SVM classification combines well with SVM regression. First, each working environment is modeled by SVM regression, and the support vector machine network-based model predictive control (SVMN-MPC) algorithm corresponding to each environment is developed; a multi-class SVM model is then established to recognize multiple operating conditions. As for control, the current environment is identified by the multi-class SVM model, and the corresponding SVMN-MPC controller is activated at each sampling instant. The proposed modeling, switching and controller design is demonstrated through simulation results.
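A minimal sketch of the multi-model idea, assuming two synthetic operating conditions and using scikit-learn's generic `SVC`/`SVR` (the paper's SVMN-MPC algorithm itself is not reproduced here, only the regression-per-regime plus classifier-based switching):

```python
# Sketch: one SVR model per operating condition plus a multi-class SVM
# that selects which model to use at each sampling instant.
import numpy as np
from sklearn.svm import SVC, SVR

rng = np.random.default_rng(0)

# Two synthetic operating conditions with different input-output maps.
X1 = rng.uniform(-1, 1, (100, 1)); y1 = 2.0 * X1.ravel()         # condition 0
X2 = rng.uniform(2, 4, (100, 1));  y2 = -1.0 * X2.ravel() + 5.0  # condition 1

# One regression model per condition (the "multi-model" part).
models = [SVR(kernel="rbf").fit(X1, y1), SVR(kernel="rbf").fit(X2, y2)]

# Multi-class SVM that recognises the current operating condition.
X_all = np.vstack([X1, X2])
labels = np.array([0] * len(X1) + [1] * len(X2))
classifier = SVC().fit(X_all, labels)

def predict(x):
    """Identify the regime, then activate the corresponding model."""
    x = np.atleast_2d(x)
    regime = int(classifier.predict(x)[0])
    return regime, float(models[regime].predict(x)[0])

print(predict([[0.5]]))  # condition 0, prediction near 2 * 0.5
print(predict([[3.0]]))  # condition 1, prediction near -3.0 + 5.0
```

In the full scheme each regression model would feed a predictive controller; here only the switching logic is shown.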

  8. Model-based reinforcement learning with dimension reduction.

    Science.gov (United States)

    Tangkaratt, Voot; Morimoto, Jun; Sugiyama, Masashi

    2016-12-01

    The goal of reinforcement learning is to learn an optimal policy which controls an agent to acquire the maximum cumulative reward. The model-based reinforcement learning approach learns a transition model of the environment from data, and then derives the optimal policy using the transition model. However, learning an accurate transition model in high-dimensional environments requires a large amount of data which is difficult to obtain. To overcome this difficulty, in this paper, we propose to combine model-based reinforcement learning with the recently developed least-squares conditional entropy (LSCE) method, which simultaneously performs transition model estimation and dimension reduction. We also further extend the proposed method to imitation learning scenarios. The experimental results show that policy search combined with LSCE performs well for high-dimensional control tasks including real humanoid robot control.

  9. Tutorial on agent-based modeling and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Macal, C. M.; North, M. J.; Decision and Information Sciences

    2005-01-01

    Agent-based modeling and simulation (ABMS) is a new approach to modeling systems composed of autonomous, interacting agents. ABMS promises to have far-reaching effects on the way that businesses use computers to support decision-making and researchers use electronic laboratories to support their research. Some have gone so far as to contend that ABMS is a third way of doing science, besides deductive and inductive reasoning. Computational advances have made possible a growing number of agent-based applications in a variety of fields, ranging from modeling agent behavior in the stock market and supply chains, to predicting the spread of epidemics and the threat of bio-warfare, to modeling consumer behavior and understanding the fall of ancient civilizations, to name a few. This tutorial describes the theoretical and practical foundations of ABMS, identifies toolkits and methods for developing ABMS models, and provides some thoughts on the relationship between ABMS and traditional modeling techniques.

  10. Comparing model predictions for ecosystem-based management

    DEFF Research Database (Denmark)

    Jacobsen, Nis Sand; Essington, Timothy E.; Andersen, Ken Haste

    2016-01-01

    E)) and a size-structured fish community model. The models were compared with respect to predicted ecological consequences of fishing to identify commonalities and differences in model predictions for the California Current fish community. We compared the models regarding direct and indirect responses to fishing...... on one or more species. The size-based model predicted a higher fishing mortality needed to reach maximum sustainable yield than EwE for most species. The size-based model also predicted stronger top-down effects of predator removals than EwE. In contrast, EwE predicted stronger bottom-up effects...... of forage fisheries removal. In both cases, the differences are due to the presumed degree of trophic overlap between juveniles of large-bodied fish and adult stages of forage fish. These differences highlight how each model’s emphasis on distinct details of ecological processes affects its predictions...

  11. Building information modeling based on intelligent parametric technology

    Institute of Scientific and Technical Information of China (English)

    ZENG Xudong; TAN Jie

    2007-01-01

    In order to push the information organization process of the building industry, promote sustainable architectural design and enhance the competitiveness of China's building industry, the author studies building information modeling (BIM) based on intelligent parametric modeling technology. Building information modeling is a new technology in the field of computer-aided architectural design, which contains not only geometric data, but also a great amount of engineering data throughout the lifecycle of a building. The author also compares BIM technology with two-dimensional CAD technology, and demonstrates the advantages and characteristics of intelligent parametric modeling technology. Building information modeling, which is based on intelligent parametric modeling technology, will certainly replace traditional computer-aided architectural design and become the new driving force to push forward China's building industry in this information age.

  12. Pricing Model of Multiattribute Derivatives Based on Mixed Process

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    By analyzing the behavior and character of derivative securities, the authors established a pricing model for multiattribute derivative securities whose underlying asset price follows a mixed process, obtained a new option-pricing model for multiattribute derivatives based on the mixed process, and improved some earlier results.

  13. Model based decision support for planning of road maintenance

    NARCIS (Netherlands)

    Worm, J.M.; Harten, van A.

    1996-01-01

    In this article we describe a Decision Support Model, based on Operational Research methods, for the multi-period planning of maintenance of bituminous pavements. This model is a tool for the road manager to assist in generating an optimal maintenance plan for a road. Optimal means: minimising the N

  14. Mental Models about Seismic Effects: Students' Profile Based Comparative Analysis

    Science.gov (United States)

    Moutinho, Sara; Moura, Rui; Vasconcelos, Clara

    2016-01-01

    Nowadays, meaningful learning takes a central role in science education and is based on mental models that allow individuals to represent the real world. Thus, it is essential to analyse students' mental models, promoting an easier reconstruction of scientific knowledge by allowing them to become consistent with the curricular…

  15. A Judgement-Based Model of Workplace Learning

    Science.gov (United States)

    Athanasou, James A.

    2004-01-01

    The purpose of this paper is to outline a judgement-based model of adult learning. This approach is set out as a Perceptual-Judgemental-Reinforcement approach to social learning under conditions of complexity and where there is no single, clearly identified correct response. The model builds upon the Hager-Halliday thesis of workplace learning and…

  16. Inquiry based learning as didactic model in distant learning

    NARCIS (Netherlands)

    Rothkrantz, L.J.M.

    2015-01-01

    In recent years, many universities have become involved in the development of Massive Open Online Courses (MOOCs). Unfortunately, an appropriate didactic model for cooperative networked learning is lacking. In this paper we introduce inquiry-based learning as a didactic model. Students are assumed to ask themselves quest

  17. Bioluminescence tomography based on the phase approximation model

    OpenAIRE

    Cong, W; Wang, G.

    2010-01-01

    A reconstruction method of bioluminescence sources is proposed based on a phase approximation model. Compared with the diffuse approximation, this phase approximation model more correctly predicts bioluminescence photon propagation in biological tissues, so that bioluminescence tomography can accurately locate and quantify the distribution of bioluminescence sources. The compressive sensing (CS) technique is applied to regularize the inverse source reconstruction to enhance numerical stabilit...

  18. Model-reduced gradient-based history matching

    NARCIS (Netherlands)

    Kaleta, M.P.; Hanea, R.G.; Heemink, A.W.; Jansen, J.D.

    2010-01-01

    Gradient-based history matching algorithms can be used to adapt the uncertain parameters in a reservoir model using production data. They require, however, the implementation of an adjoint model to compute the gradients, which is usually an enormous programming effort. We propose a new approach to g

  19. Structuring Qualitative Data for Agent-Based Modelling

    NARCIS (Netherlands)

    Ghorbani, Amineh; Dijkema, Gerard P.J.; Schrauwen, Noortje

    2015-01-01

    Using ethnography to build agent-based models may result in more empirically grounded simulations. Our study on innovation practice and culture in the Westland horticulture sector served to explore what information and data from ethnographic analysis could be used in models and how. MAIA, a framewor

  20. Models Based Practices in Physical Education: A Sociocritical Reflection

    Science.gov (United States)

    Landi, Dillon; Fitzpatrick, Katie; McGlashan, Hayley

    2016-01-01

    In this paper, we reflect on models-based practices in physical education using a sociocritical lens. Drawing links between neoliberal moves in education, and critical approaches to the body and physicality, we take a view that models are useful tools that are worth integrating into physical education, but we are apprehensive to suggest they…

  1. Archive Design Based on Planets Inspired Logical Object Model

    DEFF Research Database (Denmark)

    Zierau, Eld; Johansen, Anders

    2008-01-01

    We describe a proposal for a logical data model based on preliminary work in the Planets project. In OAIS terms, the main areas discussed are related to the introduction of a logical data model for representing the past, present and future versions of the digital object associated with the Archival...

  2. SDRAM-based packet buffer model for high speed switches

    DEFF Research Database (Denmark)

    Rasmussen, Anders; Ruepp, Sarah Renée; Berger, Michael Stübert

    2011-01-01

    This article investigates how the performance of SDRAM-based packet buffering systems for high-performance switches can be simulated using OPNET. In order to include the access-pattern-dependent performance of SDRAM modules in simulations, a custom SDRAM model is implemented in OPNET Modeller...

  3. CBCT versus MSCT-Based Models on Assessing Condylar Morphology

    Science.gov (United States)

    Gomes, Liliane Rosas; Gomes, Marcelo Regis; Gonçalves, João Roberto; Ruellas, Antônio Carlos O.; Wolford, Larry M.; Paniagua, Beatriz; Benavides, Erika; Cevidanes, Lúcia Helena Soares

    2015-01-01

    Objective To quantitatively compare condylar morphology using CBCT- and MSCT-based virtual 3D surface models. Study Design The sample consisted of a secondary data analysis of CBCT and MSCT scans obtained for clinical purposes from 74 patients treated with condylar resection and prosthetic joint replacement. 3D surface models of 146 condyles were constructed from each scan modality. Across-subject models were approximated and voxel-based registration was performed between homologous CBCT and MSCT images, making it possible to create average CBCT- and MSCT-based condylar models. SPHARM-PDM provided matching points on each corresponding model. ShapeAnalysisMANCOVA assessed statistically significant differences between observers and imaging modalities. A one-sample t-test evaluated the null hypothesis that the mean differences between each CBCT- and MSCT-based model were not clinically significant (0.68). During pairwise comparison, the mean difference observed was 0.406 mm, SD 0.173. The one-sample t-test showed that mean differences between each paired CBCT- and MSCT-based model were not clinically significant (P=0.411). Conclusion 3D surface models constructed from CBCT images are comparable to those derived from MSCT scans and may be considered reliable tools for assessing condylar morphology. PMID:26679363

  4. A Memory-Based Model of Hick's Law

    Science.gov (United States)

    Schneider, Darryl W.; Anderson, John R.

    2011-01-01

    We propose and evaluate a memory-based model of Hick's law, the approximately linear increase in choice reaction time with the logarithm of set size (the number of stimulus-response alternatives). According to the model, Hick's law reflects a combination of associative interference during retrieval from declarative memory and occasional savings…
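Hick's law itself, the approximately linear increase of choice reaction time with the logarithm of set size, can be written down directly. The coefficients below are illustrative placeholders, not fitted values from the paper:

```python
import math

def hicks_law_rt(n_alternatives, a=0.2, b=0.15):
    """Hick's law: mean choice reaction time grows roughly linearly with
    log2 of the number of stimulus-response alternatives. a is a base
    time (s) and b a slope (s per bit); both values are illustrative."""
    return a + b * math.log2(n_alternatives + 1)

for n in (1, 2, 4, 8):
    print(n, round(hicks_law_rt(n), 3))
```

The memory-based account in the paper aims to explain *why* this logarithmic relationship arises, rather than merely describing it as this formula does.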

  5. On infrastructure network design with agent-based modelling

    NARCIS (Netherlands)

    Chappin, E.J.L.; Heijnen, P.W.

    2014-01-01

    We have developed an agent-based model to optimize green-field network design in an industrial area. We aim to capture some of the deep uncertainties surrounding infrastructure design by modelling it and developing specific ant colony optimizations. Hence, we propose a variety of extensions to our exist

  6. ATTEND: Toward a Mindfulness-Based Bereavement Care Model

    Science.gov (United States)

    Cacciatore, Joanne; Flint, Melissa

    2012-01-01

    Few, if any, mindfulness-based bereavement care models exist. The ATTEND (attunement, trust, touch, egalitarianism, nuance, and death education) model is an interdisciplinary paradigm for providers, including physicians, social workers, therapists, nursing staff, and others. Using a case example to enhance the breadth and depth of understanding,…

  7. Event-Based Corpuscular Model for Quantum Optics Experiments

    NARCIS (Netherlands)

    Michielsen, K.; Jin, F.; Raedt, H. De

    2011-01-01

    A corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one is presented. The event-based corpuscular model is shown to give a u

  8. Optimal Model-Based Control in HVAC Systems

    DEFF Research Database (Denmark)

    Komareji, Mohammad; Stoustrup, Jakob; Rasmussen, Henrik;

    2008-01-01

    This paper presents optimal model-based control of a heating, ventilating, and air-conditioning (HVAC) system. This HVAC system is made of two heat exchangers: an air-to-air heat exchanger (a rotary wheel heat recovery) and a water-to- air heat exchanger. First dynamic model of the HVAC system...

  9. Model-Based GUI Testing Using Uppaal at Novo Nordisk

    DEFF Research Database (Denmark)

    H. Hjort, Ulrik; Rasmussen, Jacob Illum; Larsen, Kim Guldstrand

    2009-01-01

    This paper details a collaboration between Aalborg University and Novo Nordisk in developing an automatic model-based test generation tool for system testing of the graphical user interface of a medical device on an embedded platform. The tool takes as input a UML Statemachine model and generates...

  10. An Agent-Based Approach to Modeling Online Social Influence

    NARCIS (Netherlands)

    Maanen, P.P. van; Vecht, B. van der

    2013-01-01

    The aim of this study is to better understand social influence in online social media. Therefore, we propose a method in which we implement, validate and improve an individual behavior model. The behavior model is based on three fundamental behavioral principles of social influence from the literatu

  11. Spatial transferability of landscape-based hydrological models

    Science.gov (United States)

    Gao, Hongkai; Hrachowitz, Markus; Fenicia, Fabrizio; Gharari, Shervan; Sriwongsitanon, Nutchanart; Savenije, Hubert

    2015-04-01

    Landscapes, mainly distinguished by land surface topography and vegetation cover, are crucial in defining runoff generation mechanisms, interception capacity and transpiration processes. Landscape information provides modelers with a way to take catchment heterogeneity into account while simultaneously keeping model complexity low. A landscape-based hydrological modelling framework (FLEX-Topo), with parallel model structures, was developed and tested in various catchments with diverse climate, topography and land cover conditions. Landscape classification is the basic and most crucial procedure in creating a tailor-made model for a given catchment, as it explicitly relates hydrologic similarity to landscape similarity, which is the basis of this type of model. The study catchment is therefore classified into landscape units that fulfil a similar hydrological function, based on classification criteria such as the height above the nearest drainage, slope, aspect and land cover. At present, the suggested model includes four distinguishable landscapes: hillslopes, terraces/plateaus, riparian areas, and glacierized areas. Different parallel model structures are then associated with the different landscape units to describe their different dominant runoff generation mechanisms. These hydrological units run in parallel and are connected only through the groundwater reservoir. The transferability of this landscape-based model can then be compared with that of a lumped model. In this study, FLEX-Topo was developed and tested in three study sites: two cold-arid catchments in China (the upper Heihe River and the Urumqi Glacier No.1 catchment), and one tropical catchment in Thailand (the upper Ping River). Stringent model tests indicate that FLEX-Topo, allowing for more process heterogeneity than lumped model formulations, exhibits a higher capability to be spatially transferred. Furthermore, the simulated water balances, including internal fluxes, hydrograph
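A toy sketch of the landscape-classification step, assuming simple thresholds on height above the nearest drainage (HAND) and slope. The threshold values and decision order are illustrative assumptions, not those of FLEX-Topo:

```python
def classify_landscape(hand_m, slope_deg, glacier=False,
                       hand_threshold=5.0, slope_threshold=11.0):
    """Assign a cell to one of the four landscape units named in the
    abstract, using HAND (m) and slope (degrees). Thresholds are
    illustrative placeholders."""
    if glacier:
        return "glacier"
    if hand_m < hand_threshold:
        return "riparian"          # low-lying cells near the drainage network
    if slope_deg > slope_threshold:
        return "hillslope"         # steep cells above the riparian zone
    return "terrace/plateau"       # high but flat cells

print(classify_landscape(2.0, 3.0))    # riparian
print(classify_landscape(30.0, 20.0))  # hillslope
print(classify_landscape(30.0, 5.0))   # terrace/plateau
```

Each resulting unit would then be given its own parallel model structure, with only the groundwater reservoir shared.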

  12. A Model of Resurgence Based on Behavioral Momentum Theory

    OpenAIRE

    Shahan, Timothy A; Sweeney, Mary M.

    2011-01-01

    Resurgence is the reappearance of an extinguished behavior when an alternative behavior reinforced during extinction is subsequently placed on extinction. Resurgence is of particular interest because it may be a source of relapse to problem behavior following treatments involving alternative reinforcement. In this article we develop a quantitative model of resurgence based on the augmented model of extinction provided by behavioral momentum theory. The model suggests that alternative reinforc...

  13. Elastoplastic cup model for cement-based materials

    OpenAIRE

    Zhang, Yan; Shao, Jian-Fu

    2010-01-01

    Based on experimental data obtained from triaxial tests and a hydrostatic test, a cup model was formulated. Two plastic mechanisms, respectively a deviatoric shearing and a pore collapse, are taken into account. This model also considers the influence of confining pressure. In this paper, the calibration of the model is detailed and numerical simulations of the main mechanical behavior of cement paste over a large range of stress are described, showing good agreement with experimental results...

  14. Web Service Description and Discovery Based on Semantic Model

    Institute of Scientific and Technical Information of China (English)

    YANG Xuemei; XU Lizhen; DONG Yisheng; WANG Yongli

    2006-01-01

    A novel semantic model for Web service description and discovery is proposed in this paper through an extension of the profile model of the Web Ontology Language for Services (OWL-S). Similarity matching of Web services is implemented by computing a weighted sum of a semantic similarity value, based on a domain-specific ontology, and a dynamic satisfaction-degree evaluation of quality of service (QoS). Experiments show that the proposed semantic matching model is efficient.
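The weighted-summation matching the abstract describes might be sketched as follows; the weights, candidate service names and score values are invented for illustration:

```python
def match_score(semantic_sim, qos_satisfaction, w_semantic=0.7, w_qos=0.3):
    """Combine an ontology-based semantic similarity value with a QoS
    satisfaction degree via a weighted sum. Weights are illustrative
    and assumed to sum to 1."""
    assert abs(w_semantic + w_qos - 1.0) < 1e-9
    return w_semantic * semantic_sim + w_qos * qos_satisfaction

# Hypothetical candidates: (semantic similarity, QoS satisfaction) in [0, 1].
candidates = {"svcA": (0.9, 0.5), "svcB": (0.6, 0.95)}
ranked = sorted(candidates, key=lambda s: match_score(*candidates[s]),
                reverse=True)
print(ranked)  # svcA wins under a semantics-heavy weighting
```

Shifting weight toward QoS would reverse the ranking, which is the point of making the satisfaction evaluation dynamic.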

  15. The algorithmic anatomy of model-based evaluation

    OpenAIRE

    Daw, Nathaniel D.; Dayan, Peter

    2014-01-01

    Despite many debates in the first half of the twentieth century, it is now largely a truism that humans and other animals build models of their environments and use them for prediction and control. However, model-based (MB) reasoning presents severe computational challenges. Alternative, computationally simpler, model-free (MF) schemes have been suggested in the reinforcement learning literature, and have afforded influential accounts of behavioural and neural data. Here, we study the realiza...

  16. Crosscumulants Based Approaches for the Structure Identification of Volterra Models

    Institute of Scientific and Technical Information of China (English)

    Houda Mathlouthi; Kamel Abederrahim; Faouzi Msahli; Gerard Favier

    2009-01-01

    In this paper, we address the problem of structure identification of Volterra models, which consists in estimating the model order and the memory length of each kernel. Two methods based on input-output crosscumulants are developed. The first one uses a zero-mean independent and identically distributed Gaussian input, and the second one concerns a symmetric input sequence. Simulations are performed on six models having different orders and kernel memory lengths to demonstrate the advantages of the proposed methods.

  17. Observationally-Based Data/Model Metrics from the Southern Ocean Climate Model Atlas

    Science.gov (United States)

    Abell, J.; Russell, J. L.; Goodman, P. J.

    2015-12-01

    The Southern Ocean Climate Model Atlas makes available observationally-based, standardized data/model metrics for the latest simulations of climate and projections of climate change from available climate models. Global climate model simulations differ greatly in the Southern Ocean, so the development of consistent, observationally-based metrics by which to assess the fidelity of model simulations is essential. We will present metrics showing and quantifying the results of the modern-day climate simulations over the Southern Ocean from models submitted as part of the CMIP5/IPCC-AR5 process. Our analysis will focus on the simulations of temperature, salinity and carbon at various depths and along significant hydrographic sections. Skill varies with the metric considered, both between models and within individual models.

  18. Observer-based and Regression Model-based Detection of Emerging Faults in Coal Mills

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Lin, Bao; Jørgensen, Sten Bay

    2006-01-01

    In order to improve the reliability of power plants it is important to detect faults as quickly as possible, and in doing so it is worth finding the most efficient method. Since modeling of large-scale systems is time consuming, it is interesting to compare a model-based method with data-driven ones...... In this paper three different fault detection approaches are compared using the example of a coal mill in which a fault emerges. The compared methods are based on: an optimal unknown input observer, and static and dynamic regression model-based detections. The conclusion of the comparison is that the observer-based scheme......

  19. An agent-based multi-scale wind generation model

    Energy Technology Data Exchange (ETDEWEB)

    Kremers, E.; Lewald, N. [Karlsruhe Univ., Karlsruhe (Germany). European Inst. for Energy Research; Barambones, O.; Gonzalez de Durana, J.M. [Univ. of the Basque Country, Vitoria (Spain). Dept. of Engineering

    2009-07-01

    The introduction of renewable energies, the liberalization of energy markets and the emergence of new, distributed producers that feed into the grid at almost every level of the system have all contributed to a paradigm shift in energy systems. This paper presented an agent-based model for simulating wind power systems on multiple time scales. The purpose of the study was to generate a flexible model that would permit simulating the output of a wind farm. The model was developed using multiparadigm modelling. It also combined a variety of approaches such as agent-based modelling, discrete events and dynamic systems. The paper explained the theoretical background concerning the basic models for wind speed generation and power turbines, as well as the fundamentals of agent-based modelling. The implementation of these models was illustrated. The paper also discussed several sample simulations and discussed the application of the model. It was concluded that the paradigm change encompassed new tools and methods that could deal with decentralized decision-making, planning and self-organisation. The large amount of new technologies in the energy production chain requires a shift from a top-down to a more bottom-up approach. 12 refs., 1 tab., 7 figs.

  20. Support Vector Machine-Based Nonlinear System Modeling and Control

    Institute of Scientific and Technical Information of China (English)

    张浩然; 韩正之; 冯瑞; 于志强

    2003-01-01

    This paper provides an introduction to the support vector machine, a new kernel-based technique introduced in statistical learning theory and structural risk minimization, and then presents a modeling-control framework based on SVM. Finally, a numerical experiment is carried out to demonstrate the correctness and effectiveness of the proposed approach.

  1. Reo: A Channel-based Coordination Model for Component Composition

    NARCIS (Netherlands)

    Arbab, F.

    2004-01-01

    In this paper, we present Reo, which forms a paradigm for composition of software components based on the notion of mobile channels. Reo is a channel-based exogenous coordination model in which complex coordinators, called connectors, are compositionally built out of simpler ones. The simplest conne

  2. Assessing Graduate Attributes: Building a Criteria-Based Competency Model

    Science.gov (United States)

    Ipperciel, Donald; ElAtia, Samira

    2014-01-01

    Graduate attributes (GAs) have become a necessary framework of reference for the 21st-century competency-based model of higher education. However, the issue of evaluating and assessing GAs still remains uncharted territory. In this article, we present a criteria-based method of assessment that allows for an institution-wide comparison of the…

  3. A Model of Homeschooling Based on Technology in Malaysia

    Science.gov (United States)

    Alias, Norlidah; Rahman, Mohd. Nazri Abdul; Siraj, Saedah; Ibrahim, Ruslina

    2013-01-01

    Homeschooling in Malaysia is a form of alternative education that emphasizes quality education based on moral values and belief in strengthening family ties. The purpose of this study is to produce a model of homeschooling technology-based learning activities in Malaysia as a guideline to improve the quality of education, curriculum and organize…

  4. Dynamic fracture modeling in shell structures based on XFEM

    NARCIS (Netherlands)

    Larsson, R.; Mediavilla, J.; Fagerström, M.

    2011-01-01

    Through-the-thickness crack propagation in thin-walled structures is dealt with in this paper. The formulation is based on the cohesive zone concept applied to a kinematically consistent shell model enhanced with an XFEM-based discontinuous kinematical representation. The resulting formulation compr

  5. A channel-based coordination model for component composition

    NARCIS (Netherlands)

    Arbab, F.

    2002-01-01

    In this paper, we present Pεω, a paradigm for composition of software components based on the notion of mobile channels. Pεω is a channel-based exogenous coordination model wherein complex coordinators, called connectors, are compositionally built out of simpler ones.

  6. Evaluating hydrological model performance using information theory-based metrics

    Science.gov (United States)

    Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to use information theory-based metrics to see whether they can serve as a complementary tool for hydrologic m...
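As one example of an information-theoretic metric of the kind proposed (the abstract does not name the specific metrics used in the work), the Kullback–Leibler divergence between binned distributions of measured and simulated streamflow can be computed as:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence between two discrete distributions,
    e.g. histograms of measured vs. simulated flow classes. Assumes
    q is strictly positive wherever p is."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical histogram-style distributions over three flow classes.
measured  = [0.5, 0.3, 0.2]
simulated = [0.4, 0.4, 0.2]
print(round(kl_divergence(measured, simulated), 4))
print(kl_divergence(measured, measured))  # 0.0 for identical distributions
```

Unlike a squared-error score, this kind of metric compares the distributional shape of the two series rather than their point-by-point agreement.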

  7. Adaptive, Model-Based Monitoring and Threat Detection

    Science.gov (United States)

    Valdes, Alfonso; Skinner, Keith

    2002-09-01

    We explore the suitability of model-based probabilistic techniques, such as Bayes networks, to the field of intrusion detection and alert report correlation. We describe a network intrusion detection system (IDS) using Bayes inference, wherein the knowledge base is encoded not as rules but as conditional probability relations between observables and hypotheses of normal and malicious usage. The same high-performance Bayes inference library was employed in a component of the Mission-Based Correlation effort, using an initial knowledge base that adaptively learns the security administrator's preference for alert priority and rank. Another major effort demonstrated probabilistic techniques in heterogeneous sensor correlation. We provide results for simulated attack data, live traffic, and the CyberPanel Grand Challenge Problem. Our results establish that model-based probabilistic techniques are an important complementary capability to signature-based methods in detection and correlation.
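A minimal sketch of the core idea, encoding the knowledge base as conditional probabilities rather than rules and updating over competing hypotheses of normal and malicious usage. All probabilities and observable names are invented for illustration, and a naive-Bayes independence assumption stands in for a full Bayes network:

```python
# Hypothetical knowledge base: P(observable | hypothesis), not rules.
priors = {"normal": 0.99, "malicious": 0.01}
likelihoods = {
    "failed_logins": {"normal": 0.02, "malicious": 0.60},
    "port_scan":     {"normal": 0.01, "malicious": 0.70},
}

def posterior(observed):
    """P(hypothesis | observed events), assuming the observables are
    conditionally independent given the hypothesis."""
    scores = {}
    for h, p in priors.items():
        for obs in observed:
            p *= likelihoods[obs][h]
        scores[h] = p
    total = sum(scores.values())
    return {h: s / total for h, s in scores.items()}

post = posterior(["failed_logins", "port_scan"])
print(post)  # the malicious hypothesis dominates despite the small prior
```

Even with a 1% prior, two jointly unlikely-under-normal observables push the malicious posterior above 0.9, which is the kind of evidence combination the IDS performs.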

  8. Mathematical Modeling of Column-Base Connections under Monotonic Loading

    Directory of Open Access Journals (Sweden)

    Gholamreza Abdollahzadeh

    2014-12-01

    Full Text Available Considerable damage to steel structures occurred during the Hyogo-ken Nanbu Earthquake. Among the affected components, many exposed-type column bases failed in several consistent patterns, such as brittle base plate fracture, excessive bolt elongation, unexpected early bolt failure, and inferior construction work. The lessons from these failures led to the need for an improved understanding of column base behavior. Joint behavior must be modeled when analyzing semi-rigid frames, which is associated with a mathematical model of the moment–rotation curve. The most accurate models use continuous nonlinear functions. This article covers three areas of steel joint research: (1) analysis methods for semi-rigid joints; (2) prediction methods for the mechanical behavior of joints; and (3) mathematical representations of the moment–rotation curve. In the current study, a new exponential model to depict the moment–rotation relationship of column base connections is proposed. The proposed nonlinear model represents an approach to the prediction of M–θ curves, taking into account the possible failure modes and the deformation characteristics of the connection elements. The new model has three physical parameters, along with two curve-fitted factors. The physical parameters are generated from the dimensional details of the connection as well as the material properties. The M–θ curves obtained by the model are compared with published connection tests and 3D FEM research. The proposed mathematical model closely characterizes M–θ behavior through the full range of loading/rotations. As a result, modeling column base connections with the proposed mathematical model can provide crucial information in advance, and overcome the disadvantages of the time-consuming workmanship and cost of experimental studies.
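A generic exponential moment–rotation curve of the kind the abstract describes can be sketched as below. The functional form and parameter values are illustrative, not the paper's three-parameter, two-factor model:

```python
import math

def moment_rotation(theta, m_u=300.0, k_i=50000.0):
    """Generic exponential M-theta curve: moment rises with initial
    stiffness k_i and saturates at the ultimate moment m_u. Units are
    illustrative (kN*m for m_u, kN*m/rad for k_i, rad for theta)."""
    return m_u * (1.0 - math.exp(-k_i * theta / m_u))

print(moment_rotation(0.0))              # zero moment at zero rotation
print(moment_rotation(0.05) < 300.0)     # approaches but never exceeds m_u
print(moment_rotation(0.001))            # near-linear at small rotations
```

The exponential form captures the two features semi-rigid connection models need: a finite initial stiffness and a smooth transition to a plastic plateau.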

  9. Flow Formulation-based Model for the Curriculum-based Course Timetabling Problem

    DEFF Research Database (Denmark)

    Bagger, Niels-Christian Fink; Kristiansen, Simon; Sørensen, Matias;

    2015-01-01

    In this work we present a new mixed integer programming formulation for the curriculum-based course timetabling problem. We show that the model contains an underlying network model by dividing the problem into two models and then connecting the two models back into one model using a maximum flow...... problem. This decreases the number of integer variables significantly and improves the performance compared to the basic formulation. It also shows competitiveness with other approaches based on mixed integer programming from the literature and improves the currently best known lower bound on one data...
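
    The abstract's network substructure rests on a classical maximum flow subproblem. A minimal self-contained sketch of maximum flow itself (Edmonds–Karp on an invented toy graph; this is not the paper's MIP formulation):

```python
from collections import deque

def max_flow(capacity, s, t):
    """Edmonds-Karp maximum flow; capacity is {node: {neighbor: cap}}."""
    # Flow on every directed edge, plus reverse entries for residual edges.
    flow = {u: {v: 0 for v in capacity[u]} for u in capacity}
    adj = {u: set() for u in capacity}
    for u in capacity:
        for v in capacity[u]:
            adj[u].add(v)
            adj.setdefault(v, set()).add(u)
            flow.setdefault(v, {}).setdefault(u, 0)
    total = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph.
        parent = {s: None}
        q = deque([s])
        while q and t not in parent:
            u = q.popleft()
            for v in adj[u]:
                if v not in parent and capacity.get(u, {}).get(v, 0) - flow[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return total
        # Recover the path, find its bottleneck, and augment along it.
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(capacity.get(u, {}).get(v, 0) - flow[u][v] for u, v in path)
        for u, v in path:
            flow[u][v] += bottleneck
            flow[v][u] -= bottleneck
        total += bottleneck

cap = {'s': {'a': 3, 'b': 2}, 'a': {'b': 1, 't': 2}, 'b': {'t': 3}, 't': {}}
value = max_flow(cap, 's', 't')
```

    Embedding such a flow structure in a MIP lets flow conservation replace many integer variables with continuous ones, which is the performance lever the abstract describes.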

  10. Construct Method of Predicting Satisfaction Model Based on Technical Characteristics

    Institute of Scientific and Technical Information of China (English)

    YANG Xiao-an; DENG Qian; SUN Guan-long; ZHANG Wei-she

    2011-01-01

    To construct an objective mapping model between customer requirements and product technical characteristics, this paper puts forward a novel approach based on customer satisfaction information mined from case products and expert satisfaction information on technical characteristics. In this method, technical characteristic evaluation values are expressed as rough numbers, and the technical characteristic target sequence is determined on the basis of benefit-type, cost-type and middle-type characteristics. The calculated satisfactions of customers and of technical characteristics are used as input and output elements to construct a BP network model, which is simulated with MATLAB using the case of electric bicycles.
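
    A minimal sketch of the BP (back-propagation) network idea the abstract describes, with invented satisfaction data and network sizes (the paper's actual inputs, outputs and MATLAB setup are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for the paper's data: rows are case products, inputs are
# customer satisfaction scores, outputs are technical-characteristic
# satisfaction scores. All values are invented for illustration.
X = rng.random((20, 4))
Y = X @ rng.random((4, 3)) / 4.0          # targets kept in [0, 1]

# One hidden layer with sigmoid activations, trained by plain backprop.
W1 = rng.normal(0.0, 0.5, (4, 6)); b1 = np.zeros(6)
W2 = rng.normal(0.0, 0.5, (6, 3)); b2 = np.zeros(3)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

initial_mse = None
lr = 0.5
for _ in range(2000):
    H = sigmoid(X @ W1 + b1)              # hidden activations
    P = sigmoid(H @ W2 + b2)              # network output
    if initial_mse is None:
        initial_mse = ((P - Y) ** 2).mean()
    dP = (P - Y) * P * (1 - P)            # output-layer delta
    dH = (dP @ W2.T) * H * (1 - H)        # hidden-layer delta
    W2 -= lr * H.T @ dP / len(X); b2 -= lr * dP.mean(0)
    W1 -= lr * X.T @ dH / len(X); b1 -= lr * dH.mean(0)

P = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
mse = ((P - Y) ** 2).mean()
```

    The trained network then predicts technical-characteristic satisfaction for a new customer-satisfaction profile via the same forward pass.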

  11. Whole-body acid-base modeling revisited

    DEFF Research Database (Denmark)

    Ring, Troels; Nielsen, Søren

    2017-01-01

    The textbook account of whole-body acid-base balance in terms of endogenous acid production, renal net acid excretion and gastrointestinal alkali absorption, which is the only comprehensive model around, has never been applied in clinical practice or been formally validated. In order to improve...... understanding of acid-base modeling, we managed to write up this conventional model as an expression solely in terms of urine chemistry. Renal net acid excretion and endogenous acid production were already formulated in terms of urine chemistry, and we could from the literature also see gastrointestinal alkali absorption...

  12. A Separated Domain-Based Kernel Model for Trusted Computing

    Institute of Scientific and Technical Information of China (English)

    FANG Yanxiang; SHEN Changxiang; XU Jingdong; WU Gongyi

    2006-01-01

    This paper first gives an investigation of trusted computing on mainstream operating systems (OSs). Based on the observations, it is pointed out that trusted computing cannot be achieved due to the lack of a separation mechanism for the components in mainstream OSs. In order to provide such a separation mechanism, this paper proposes a separated domain-based kernel model (SDBKM), and this model is verified by non-interference theory. By monitoring and simplifying the trust dependence between domains, this model can solve problems in trust measurement such as denial-of-service (DoS) attacks and host security, and reduce the overhead of measurement.

  13. Cascaded process model based control: packed absorption column application.

    Science.gov (United States)

    Govindarajan, Anand; Jayaraman, Suresh Kumar; Sethuraman, Vijayalakshmi; Raul, Pramod R; Rhinehart, R Russell

    2014-03-01

    Nonlinear, adaptive, process-model based control is demonstrated in a cascaded single-input-single-output mode for pressure drop control in a pilot-scale packed absorption column. The process is shown to be nonlinear. Control is demonstrated in both servo and regulatory modes, for no wind-up in a constrained situation, and for bumpless transfer. Model adaptation is demonstrated and shown to provide process insight. The application procedure is revealed as a design guide to aid others in implementing process-model based control.
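
    A toy single-loop illustration of the process-model based control idea (an internal-model-style correction on an invented first-order process; the paper's cascaded, adaptive pilot-scale controller is considerably more elaborate):

```python
# A "true" first-order process and a deliberately imperfect internal
# model of it (all parameter values are invented for illustration).
a_p, b_p = 0.90, 0.20      # actual process:  y[k+1] = a_p*y[k] + b_p*u[k]
a_m, b_m = 0.85, 0.25      # controller's model of the process

y, y_m, setpoint = 0.0, 0.0, 1.0
for _ in range(100):
    # Model-based control law: invert the model one step ahead, with the
    # target shifted by the observed model/process mismatch so that
    # modeling error does not leave a steady-state offset.
    mismatch = y - y_m
    u = (setpoint - mismatch - a_m * y_m) / b_m
    y_m = a_m * y_m + b_m * u          # model prediction
    y = a_p * y + b_p * u              # process response
```

    Despite the mismatched model, the mismatch feedback drives the process output to the setpoint, which is the core benefit of model-based control the abstract demonstrates at pilot scale.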

  14. Towards an Intelligent Project Based Organization Business Model

    Directory of Open Access Journals (Sweden)

    Alami Marrouni Oussama

    2013-01-01

    Full Text Available The global economy is undergoing a recession phase that has made competition tougher and imposed a new business framework. Businesses have to shift from classical management approaches to an Intelligent Project Based Organization (IPBO) model that provides flexibility and agility. The IPBO model is intended to reinforce the proven advantages of the Project Based Organization (PBO) through the use of suitable Enterprise Intelligence (EI) systems. The goal of this paper is to propose an IPBO model that combines the benefits of PBO and EI and helps overcome their pitfalls.

  15. Modelling a Compliant-Based Precise Positioning Stage

    Directory of Open Access Journals (Sweden)

    Giedrius Augustinavičius

    2012-12-01

    Full Text Available The paper presents the modelling of a precise dual-axis flexure-based positioning system for micro-positioning applications. The positioning system features a monolithic architecture, flexure-based joints and piezo stacks. Its workspace has been evaluated via analytical approaches, and the amplification mechanism is optimally designed. The mathematical model of the positioning system has been derived and verified by finite element analysis (FEA). The established analytical and FEA models are helpful for optimizing a reliable architecture and improving the performance of the positioning system.

  16. Evaluating performances of simplified physically based models for landslide susceptibility

    Directory of Open Access Journals (Sweden)

    G. Formetta

    2015-12-01

    Full Text Available Rainfall-induced shallow landslides cause loss of life and significant damage to private and public properties, transportation systems, etc. Predicting the locations susceptible to shallow landslides is a complex task that involves many disciplines: hydrology, geotechnical science, geomorphology, and statistics. Usually one of two main approaches is used to accomplish this task: statistical or physically based models. Reliable model application involves automatic parameter calibration, objective quantification of the quality of susceptibility maps, and model sensitivity analysis. This paper presents a methodology to systematically and objectively calibrate, verify and compare different models and different model performance indicators in order to identify and select the models whose behavior is most reliable for a certain case study. The procedure was implemented in a package of models for landslide susceptibility analysis integrated in the NewAge-JGrass hydrological model. The package includes three simplified physically based models for landslide susceptibility analysis (M1, M2, and M3) and a component for model verification. It computes eight goodness-of-fit indices by comparing pixel-by-pixel model results and measurement data. Moreover, the integration of the package in NewAge-JGrass allows the use of other components such as geographic information system tools to manage input-output processes, and automatic calibration algorithms to estimate model parameters. The system was applied to a case study in Calabria (Italy) along the Salerno-Reggio Calabria highway, between Cosenza and the Altilia municipality. The analysis showed that, among all the optimized indices and all three models, the optimization of the index distance to perfect classification in the receiver operating characteristic plane (D2PC) coupled with model M3 is the best modeling solution for this test case.
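
    The D2PC index mentioned in the abstract is the Euclidean distance of a model's (FPR, TPR) point from the perfect-classification corner (0, 1) of the ROC plane, so smaller is better. A small sketch (the confusion-matrix counts are invented):

```python
import math

def d2pc(tp, fn, fp, tn):
    """Distance to perfect classification in the ROC plane.

    tp, fn : correctly / incorrectly classified positive (unstable) pixels
    fp, tn : incorrectly / correctly classified negative (stable) pixels
    """
    tpr = tp / (tp + fn)   # true positive rate (sensitivity)
    fpr = fp / (fp + tn)   # false positive rate (1 - specificity)
    return math.hypot(1.0 - tpr, fpr)

# A model that finds most unstable pixels but raises some false alarms
# sits close to, though not at, the perfect corner.
score = d2pc(tp=90, fn=10, fp=20, tn=80)
```

    Minimizing D2PC during calibration therefore trades off missed landslides (1 - TPR) against false alarms (FPR) symmetrically.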

  17. Evaluating performances of simplified physically based models for landslide susceptibility

    Science.gov (United States)

    Formetta, G.; Capparelli, G.; Versace, P.

    2015-12-01

    Rainfall-induced shallow landslides cause loss of life and significant damage to private and public properties, transportation systems, etc. Predicting the locations susceptible to shallow landslides is a complex task that involves many disciplines: hydrology, geotechnical science, geomorphology, and statistics. Usually one of two main approaches is used to accomplish this task: statistical or physically based models. Reliable model application involves automatic parameter calibration, objective quantification of the quality of susceptibility maps, and model sensitivity analysis. This paper presents a methodology to systematically and objectively calibrate, verify and compare different models and different model performance indicators in order to identify and select the models whose behavior is most reliable for a certain case study. The procedure was implemented in a package of models for landslide susceptibility analysis integrated in the NewAge-JGrass hydrological model. The package includes three simplified physically based models for landslide susceptibility analysis (M1, M2, and M3) and a component for model verification. It computes eight goodness-of-fit indices by comparing pixel-by-pixel model results and measurement data. Moreover, the integration of the package in NewAge-JGrass allows the use of other components such as geographic information system tools to manage input-output processes, and automatic calibration algorithms to estimate model parameters. The system was applied to a case study in Calabria (Italy) along the Salerno-Reggio Calabria highway, between Cosenza and the Altilia municipality. The analysis showed that, among all the optimized indices and all three models, the optimization of the index distance to perfect classification in the receiver operating characteristic plane (D2PC) coupled with model M3 is the best modeling solution for this test case.

  18. An inverse model for magnetorheological dampers based on a restructured phenomenological model

    Science.gov (United States)

    Qian, Li-Jun; Liu, Bo; Chen, Peng; Bai, Xian-Xu

    2016-04-01

    Magnetorheological dampers (MRDs), semi-active actuators based on the MR effect, have great potential in vibration/shock control systems. However, it is difficult to establish their inverse model due to their intrinsic strongly nonlinear hysteresis behavior, and consequently precise, fast and effective control cannot be realized. This paper presents an inverse model for MRDs based on a restructured phenomenological model incorporating the "normalization" concept. The proposed inverse model of MRDs is validated by simulation of force tracking. The research results indicate that the inverse model could be applied for damping force control with consideration of the strongly nonlinear hysteresis behavior of MRDs.

  19. Modeling the desired direction in a force-based model for pedestrian dynamics

    CERN Document Server

    Chraibi, Mohcine; Schadschneider, Andreas; Seyfried, Armin

    2012-01-01

    We introduce an enhanced model based on the generalized centrifugal force model. Furthermore, the desired direction of pedestrians is investigated. A new approach leaning on the well-known concept of static and dynamic floor-fields in cellular automata is presented. Numerical results of the model are presented and compared with empirical data.
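
    The static floor field mentioned in the abstract is commonly computed as a shortest-path distance from every walkable cell to the exit on the cellular grid; pedestrians then descend this field toward the exit. A minimal sketch (the grid layout and exit location are invented):

```python
from collections import deque

def static_floor_field(grid, exit_cell):
    """BFS distance from the exit to every walkable cell ('.').

    grid      : list of equal-length strings; '#' marks walls.
    exit_cell : (row, col) of the exit; must be walkable.
    Returns a dict {(row, col): steps_to_exit}; walls are absent.
    """
    rows, cols = len(grid), len(grid[0])
    field = {exit_cell: 0}
    q = deque([exit_cell])
    while q:
        r, c = q.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == '.' and (nr, nc) not in field):
                field[(nr, nc)] = field[(r, c)] + 1
                q.append((nr, nc))
    return field

grid = ["....",
        ".#..",
        "...."]
field = static_floor_field(grid, (0, 0))
```

    A dynamic floor field, by contrast, is updated at every time step from pedestrian traces; the force-based model in the abstract borrows both concepts for the desired direction.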

  20. Examination of Modeling Languages to Allow Quantitative Analysis for Model-Based Systems Engineering

    Science.gov (United States)

    2014-06-01

    BOM Base Object Model; BPMN Business Process Model & Notation; DOD...SysML. There are many variants, such as the Unified Profile for DODAF/MODAF (UPDM) and Business Process Model & Notation (BPMN), that have origins in