WorldWideScience

Sample records for classification-tree based models

  1. Investigation of the Effect of Traffic Parameters on Road Hazard Using Classification Tree Model

    Directory of Open Access Journals (Sweden)

    Md. Mahmud Hasan

    2012-09-01

    Full Text Available This paper presents a method for the identification of hazardous situations on freeways. For this study, an approximately 18 km long section of the Eastern Freeway in Melbourne, Australia was selected as a test bed. Three categories of data, i.e. traffic, weather and accident records, were used for the analysis and modelling. A classification tree based model was developed to estimate crash risk probability. In formulating the models, it was found that weather conditions did not have a significant impact on accident occurrence, so the classification tree was built using only two traffic indices: traffic flow and vehicle speed. The formulated classification tree is able to identify possible hazard and non-hazard situations on the freeway. The outcome of the study will aid hazard mitigation strategies.
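
    Below is a minimal sketch, in Python with scikit-learn, of the kind of two-feature classification tree this abstract describes. The feature units, the labelling rule and the synthetic data are illustrative assumptions, not values from the study.

        # Hedged sketch: a two-feature hazard classifier (traffic flow, vehicle speed)
        # trained on synthetic stand-ins for the freeway measurements.
        import numpy as np
        from sklearn.tree import DecisionTreeClassifier, export_text

        rng = np.random.default_rng(0)
        n = 500
        flow = rng.uniform(200, 2200, n)    # vehicles/hour (assumed units)
        speed = rng.uniform(20, 110, n)     # km/h (assumed units)
        # Assumed labelling rule for the synthetic data: dense, slow traffic is hazardous.
        hazard = ((flow > 1500) & (speed < 60)).astype(int)

        X = np.column_stack([flow, speed])
        tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, hazard)
        print(export_text(tree, feature_names=["traffic_flow", "vehicle_speed"]))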

  2. Predictive mapping of soil organic carbon in wet cultivated lands using classification-tree based models: the case study of Denmark.

    Science.gov (United States)

    Bou Kheir, Rania; Greve, Mogens H; Bøcher, Peder K; Greve, Mette B; Larsen, René; McCloy, Keith

    2010-05-01

    Soil organic carbon (SOC) is one of the most important carbon stocks globally and has large potential to affect global climate. Distribution patterns of SOC in Denmark constitute a nation-wide baseline for studies on soil carbon changes (with respect to the Kyoto protocol). This paper predicts and maps the geographic distribution of SOC across Denmark using remote sensing (RS), geographic information systems (GISs) and decision-tree modeling (un-pruned and pruned classification trees). Seventeen parameters, i.e. parent material, soil type, landscape type, elevation, slope gradient, slope aspect, mean curvature, plan curvature, profile curvature, flow accumulation, specific catchment area, tangent slope, tangent curvature, steady-state wetness index, Normalized Difference Vegetation Index (NDVI), Normalized Difference Wetness Index (NDWI) and Soil Color Index (SCI), were generated to statistically explain SOC field measurements in the area of interest (Denmark). A large number of tree-based classification models (588) were developed using (i) all of the parameters, (ii) all Digital Elevation Model (DEM) parameters only, (iii) the primary DEM parameters only, (iv) the remote sensing (RS) indices only, (v) selected pairs of parameters, (vi) soil type, parent material and landscape type only, and (vii) the parameters having a high impact on SOC distribution in built pruned trees. The three best classification tree models, having both the lowest misclassification error (ME) and the lowest number of nodes (N), are: (i) the tree (T1) combining all of the parameters (ME=29.5%; N=54); (ii) the tree (T2) based on parent material, soil type and landscape type (ME=31.5%; N=14); and (iii) the tree (T3) constructed using parent material, soil type, landscape type, elevation, tangent slope and SCI (ME=30%; N=39). The SOC maps produced at 1:50,000 cartographic scale using these trees match closely, with coincidence values equal to 90.5% (Map T1
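
    A sketch of the pruned-vs-unpruned comparison this abstract reports, scoring trees by misclassification error (ME) and node count (N) as above; the synthetic data stand in for the seventeen SOC predictors, and the ccp_alpha pruning strengths are assumptions.

        # Hedged sketch: cost-complexity pruning trades a small rise in ME for far fewer nodes.
        from sklearn.datasets import make_classification
        from sklearn.model_selection import train_test_split
        from sklearn.tree import DecisionTreeClassifier

        X, y = make_classification(n_samples=1000, n_features=17, n_informative=6,
                                   random_state=1)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

        for alpha in [0.0, 0.005, 0.02]:    # 0.0 = un-pruned
            t = DecisionTreeClassifier(ccp_alpha=alpha, random_state=1).fit(X_tr, y_tr)
            me = 1.0 - t.score(X_te, y_te)  # misclassification error on held-out data
            print(f"ccp_alpha={alpha}: ME={me:.1%}, N={t.tree_.node_count}")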

  3. Predicting student satisfaction with courses based on log data from a virtual learning environment – a neural network and classification tree model

    Directory of Open Access Journals (Sweden)

    Ivana Đurđević Babić

    2015-03-01

    Full Text Available Student satisfaction with courses in academic institutions is an important issue and is recognized as a form of support in ensuring effective and quality education, as well as enhancing the student course experience. This paper investigates whether there is a connection between student satisfaction with courses and log data on student courses in a virtual learning environment. Furthermore, it explores whether a successful classification model for predicting student satisfaction with a course can be developed based on course log data, and compares the results obtained from the implemented methods. The research was conducted at the Faculty of Education in Osijek and included analysis of log data and course satisfaction on a sample of third and fourth year students. Multilayer Perceptron (MLP) neural networks with different activation functions, Radial Basis Function (RBF) neural networks, and classification tree models were developed, trained and tested in order to classify students into one of two categories of course satisfaction. Type I and type II errors, input variable importance and classification accuracy were used for model comparison. The results indicate that a successful classification model can be created using the tested methods. The MLP model provides the highest average classification accuracy and the smallest tendency to misclassify students with a low level of course satisfaction, although a t-test for the difference in proportions showed that the difference in performance between the compared models is not statistically significant. Student involvement in forum discussions is recognized as a valuable predictor of student satisfaction with courses in all observed models.
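
    A sketch of the MLP-versus-tree comparison on a binary satisfaction target, reporting the type I and type II error rates the abstract mentions; the ten synthetic features are assumed stand-ins for the course log variables.

        # Hedged sketch: compare an MLP and a classification tree on synthetic log data.
        from sklearn.datasets import make_classification
        from sklearn.metrics import confusion_matrix
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier
        from sklearn.tree import DecisionTreeClassifier

        X, y = make_classification(n_samples=600, n_features=10, random_state=2)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=2)

        for name, clf in [("MLP", MLPClassifier(max_iter=2000, random_state=2)),
                          ("tree", DecisionTreeClassifier(max_depth=4, random_state=2))]:
            tn, fp, fn, tp = confusion_matrix(y_te, clf.fit(X_tr, y_tr).predict(X_te)).ravel()
            print(f"{name}: type I = {fp / (fp + tn):.2f}, type II = {fn / (fn + tp):.2f}")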

  4. Study and ranking of determinants of Taenia solium infections by classification tree models.

    Science.gov (United States)

    Mwape, Kabemba E; Phiri, Isaac K; Praet, Nicolas; Dorny, Pierre; Muma, John B; Zulu, Gideon; Speybroeck, Niko; Gabriël, Sarah

    2015-01-01

    Taenia solium taeniasis/cysticercosis is an important public health problem occurring mainly in developing countries. This work aimed to study the determinants of human T. solium infections in the Eastern province of Zambia and rank them in order of importance. A household (HH)-level questionnaire was administered to 680 HHs from 53 villages in two rural districts and the taeniasis and cysticercosis status determined. A classification tree model (CART) was used to define the relative importance and interactions between different predictor variables in their effect on taeniasis and cysticercosis. The Katete study area had a significantly higher taeniasis and cysticercosis prevalence than the Petauke area. The CART analysis for Katete showed that the most important determinant for cysticercosis infections was the number of HH inhabitants (6 to 10) and for taeniasis was the number of HH inhabitants > 6. The most important determinant in Petauke for cysticercosis was the age of head of household > 32 years and for taeniasis it was age < 55 years. The CART analysis showed that the most important determinant for both taeniasis and cysticercosis infections was the number of HH inhabitants (6 to 10) in Katete district and age in Petauke. The results suggest that control measures should target HHs with a high number of inhabitants and older individuals.

  5. Superiority of Classification Tree versus Cluster, Fuzzy and Discriminant Models in a Heartbeat Classification System

    Science.gov (United States)

    Krasteva, Vessela; Jekova, Irena; Leber, Remo; Schmid, Ramun; Abächerli, Roger

    2015-01-01

    This study presents a 2-stage heartbeat classifier of supraventricular (SVB) and ventricular (VB) beats. Stage 1 makes a computationally-efficient classification of SVB beats, using a simple correlation threshold criterion to find a close match with a predominant normal (reference) beat template. The non-matched beats are next subjected to measurement of 20 basic features, tracking the beat and reference template morphology and RR-variability, for subsequent refined classification into the SVB or VB class by Stage 2. Four linear classifiers are compared: cluster, fuzzy, linear discriminant analysis (LDA) and classification tree (CT), all subjected to iterative training for selection of the optimal feature space among an extended 210-sized set embodying interactive second-order effects between the 20 independent features. The optimization process minimizes, at equal weight, the false positives in the SVB class and false negatives in the VB class. Training with the European ST-T, AHA, and MIT-BIH Supraventricular Arrhythmia databases found the best performance settings of all classification models: Cluster (30 features), Fuzzy (72 features), LDA (142 coefficients), CT (221 decision nodes), with the top-3 best scored features: normalized current RR-interval, higher/lower frequency content ratio, and beat-to-template correlation. Unbiased test-validation with the MIT-BIH Arrhythmia database rates the classifiers in descending order of their specificity for the SVB class: CT (99.9%), LDA (99.6%), Cluster (99.5%), Fuzzy (99.4%); sensitivity for ventricular ectopic beats as part of the VB class (commonly reported in published beat-classification studies): CT (96.7%), Fuzzy (94.4%), LDA (94.2%), Cluster (92.4%); positive predictivity: CT (99.2%), Cluster (93.6%), LDA (93.0%), Fuzzy (92.4%). CT has superior accuracy by 0.3–6.8 percentage points, with the advantage of easy model-complexity configuration by pruning the tree, which consists of easily interpretable 'if-then' rules. PMID:26461492
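
    A toy sketch of the two-stage idea: Stage 1 accepts beats whose correlation with a reference template exceeds a threshold, and only non-matched beats reach the Stage 2 tree. The template shape, the 0.97 threshold, the two toy features and the synthetic labels are all assumptions, not the paper's 20-feature design.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(3)
        template = np.sin(np.linspace(0, 2 * np.pi, 50))   # reference beat (assumed shape)
        noise_sd = rng.choice([0.05, 0.3, 0.8], 300)       # per-beat deviation (assumed)
        beats = template + rng.normal(0.0, noise_sd[:, None], (300, 50))

        # Stage 1: simple correlation threshold against the reference template.
        r = np.array([np.corrcoef(b, template)[0, 1] for b in beats])
        matched = r > 0.97                                 # assumed threshold, not the paper's
        rest = beats[~matched]                             # non-matched beats go to Stage 2

        # Stage 2 stand-in: a small tree on two toy morphology features instead of
        # the paper's 20 beat/RR features.
        feats = np.column_stack([rest.std(axis=1), np.abs(rest).max(axis=1)])
        labels = (noise_sd[~matched] > 0.5).astype(int)    # assumed VB labels for toy data
        stage2 = DecisionTreeClassifier(max_depth=3, random_state=0).fit(feats, labels)
        print(f"Stage 1 matched {matched.sum()}/{len(beats)} beats; Stage 2 sees {len(rest)}")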

  6. An improved classification tree analysis of high cost modules based upon an axiomatic definition of complexity

    Science.gov (United States)

    Tian, Jianhui; Porter, Adam; Zelkowitz, Marvin V.

    1992-01-01

    Identification of high cost modules has been viewed as one mechanism to improve overall system reliability, since such modules tend to produce more than their share of problems. A decision tree model was used to identify such modules. In this paper, a previously developed axiomatic model of program complexity is merged with the previously developed decision tree process to improve the ability to identify such modules. This improvement was tested using data from the NASA Software Engineering Laboratory.

  7. Building classification trees to explain the radioactive contamination levels of the plants

    International Nuclear Information System (INIS)

    The objective of this thesis is the development of a method allowing the identification of factors leading to various radioactive contamination levels of plants. The methodology suggested is based on the use of a radioecological model of radionuclide transfer through the environment (the A.S.T.R.A.L. computer code) and a classification-tree method. In particular, to avoid the instability problems of classification trees and to preserve the tree structure, a node-level stabilizing technique is used. Empirical comparisons are carried out between classification trees built by this method (called the R.E.N. method) and those obtained by the C.A.R.T. method. A similarity measure is defined to compare the structure of two classification trees. This measure is used to study the stabilizing performance of the R.E.N. method. The methodology suggested is applied to a simplified contamination scenario. From the results obtained, we can identify the main variables responsible for the various radioactive contamination levels of four leafy vegetables (lettuce, cabbage, spinach and leek). Some rules extracted from these classification trees may be usable in a post-accident context. (author)

  8. Segmentation of Firms by Means of Classification Trees

    OpenAIRE

    Mirosława Lasek; Marek Pęczkowski

    2002-01-01

    The objective of the paper was to present the utility and applicability of the method of generating classification trees for the segmentation of firms by their economic standing, i.e. their financial and assets condition. Classification tree generation belongs to the group of 'data mining' methods that make it possible to discover, based on large data sets, relationships and links among data. The variables used to classify the firms were financial and assets indices, in...

  9. Consensus of classification trees for skin sensitisation hazard prediction.

    Science.gov (United States)

    Asturiol, D; Casati, S; Worth, A

    2016-10-01

    Since March 2013, it has no longer been possible to market cosmetics containing new ingredients tested on animals in the European Union (EU). Although several in silico alternatives are available and achievements have been made in the development and regulatory adoption of non-animal skin sensitisation tests, there is not yet a generally accepted approach to skin sensitisation assessment that would fully substitute for animal testing. The aim of this work was to build a defined approach (i.e. a predictive model based on readouts from various information sources that uses a fixed procedure for generating a prediction) for skin sensitisation hazard prediction (sensitiser/non-sensitiser) using Local Lymph Node Assay (LLNA) results as reference classifications. To derive the model, we built a dataset with high quality data from in chemico (DPRA) and in vitro (KeratinoSens™ and h-CLAT) methods, complemented with predictions from several software packages. The modelling exercise showed that skin sensitisation hazard was better predicted by classification trees based on in silico predictions. The defined approach consists of a consensus of two classification trees that are based on descriptors accounting for protein reactivity and structural features. The model showed an accuracy of 0.93, sensitivity of 0.98, and specificity of 0.85 for 269 chemicals. In addition, the defined approach provides a measure of confidence associated with the prediction. PMID:27458072
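
    A sketch of a fixed "defined approach" built from a consensus of two trees. The split of descriptors into reactivity and structural groups and the OR voting rule are assumptions for illustration; the paper's exact consensus rule is not reproduced here.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.model_selection import train_test_split
        from sklearn.tree import DecisionTreeClassifier

        # Twelve synthetic descriptors: the first six stand in for protein-reactivity
        # descriptors, the last six for structural features (an assumed split).
        X, y = make_classification(n_samples=269, n_features=12, n_informative=8,
                                   random_state=4)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=4)

        t1 = DecisionTreeClassifier(max_depth=3, random_state=4).fit(X_tr[:, :6], y_tr)
        t2 = DecisionTreeClassifier(max_depth=3, random_state=4).fit(X_tr[:, 6:], y_tr)
        # Assumed combination rule: call "sensitiser" when either tree does (OR vote).
        consensus = np.logical_or(t1.predict(X_te[:, :6]), t2.predict(X_te[:, 6:])).astype(int)
        print("consensus accuracy:", (consensus == y_te).mean().round(3))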

  10. Predicting 'very poor' beach water quality gradings using classification tree.

    Science.gov (United States)

    Thoe, Wai; Choi, King Wah; Lee, Joseph Hun-wei

    2016-02-01

    A beach water quality prediction system has been developed in Hong Kong using multiple linear regression (MLR) models. However, linear models are found to be weak at capturing the infrequent 'very poor' water quality occasions when Escherichia coli (E. coli) concentration exceeds 610 counts/100 mL. This study uses a classification tree to increase the accuracy in predicting the 'very poor' water quality events at three Hong Kong beaches affected either by non-point source or point source pollution. Binary-output classification trees (to predict whether E. coli concentration exceeds 610 counts/100 mL) are developed over the periods before and after the implementation of the Harbour Area Treatment Scheme, when systematic changes in water quality were observed. Results show that classification trees can capture more 'very poor' events in both periods when compared to the corresponding linear models, with an increase in correct positives by an average of 20%. Classification trees are also developed at two beaches to predict the four-category Beach Water Quality Indices. They perform worse than the binary tree and give excessive false alarms of 'very poor' events. Finally, a combined modelling approach using both MLR model and classification tree is proposed to enhance the beach water quality prediction system for Hong Kong. PMID:26837834
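
    A sketch of the binary-output formulation described above: threshold E. coli at 610 counts/100 mL to form the target, then fit a tree. The two predictors and the class weighting are assumptions standing in for the beach forecasting inputs.

        import numpy as np
        from sklearn.metrics import recall_score
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(5)
        rainfall = rng.gamma(2.0, 10.0, 400)                  # assumed predictor (mm)
        salinity = rng.normal(30.0, 3.0, 400)                 # assumed predictor (psu)
        ecoli = np.exp(3 + 0.05 * rainfall + rng.normal(0, 1, 400))  # synthetic counts/100 mL
        very_poor = (ecoli > 610).astype(int)                 # the 610 counts/100 mL cut-off

        X = np.column_stack([rainfall, salinity])
        # class_weight="balanced" is one assumed way to favour catching the rare
        # 'very poor' events, mirroring the emphasis on correct positives.
        tree = DecisionTreeClassifier(class_weight="balanced", max_depth=3,
                                      random_state=0).fit(X, very_poor)
        print("training recall on 'very poor':", recall_score(very_poor, tree.predict(X)))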

  11. Design of Radar Software Test Cases Based on Classification Tree

    Institute of Scientific and Technical Information of China (English)

    职晓; 裴阿平; 张江华

    2014-01-01

    Owing to the larger and larger scale of modern radar software, it is less and less feasible in engineering practice to test every functional unit with common combinatorial coverage techniques. To address the deficiency that designing test cases with the classification tree method (CTM) generates a large number of redundant test cases, this paper proposes applying the orthogonal experimental design method to the case set generated by CTM, simplifying and optimizing the test suite so as to improve testing efficiency. The experimental results show that test-case optimization based on orthogonal experimental design can effectively reduce redundant test cases and save test resources and cost, and that it has practical value in engineering.

  12. A Method for Application of Classification Tree Models to Map Aquatic Vegetation Using Remotely Sensed Images from Different Sensors and Dates

    Directory of Open Access Journals (Sweden)

    Ying Cai

    2012-09-01

    Full Text Available In previous attempts to identify aquatic vegetation from remotely-sensed images using classification trees (CT), the images used to apply CT models to different times or locations necessarily originated from the same satellite sensor as the original images used in model development, greatly limiting the application of CT. We have developed an effective normalization method to improve the robustness of CT models when applied to images originating from different sensors and dates. A total of 965 ground-truth samples of aquatic vegetation types were obtained in 2009 and 2010 in Taihu Lake, China. Using relevant spectral indices (SI) as classifiers, we manually developed a stable CT model structure and then applied a standard CT algorithm to obtain quantitative (optimal) thresholds from 2009 ground-truth data and images from the Landsat7-ETM+, HJ-1B-CCD, Landsat5-TM and ALOS-AVNIR-2 sensors. Optimal CT thresholds produced average classification accuracies of 78.1%, 84.7% and 74.0% for emergent vegetation, floating-leaf vegetation and submerged vegetation, respectively. However, the optimal CT thresholds for the different sensor images differed from each other, with an average relative variation (RV) of 6.40%. We developed and evaluated three new approaches to normalizing the images. The best-performing method (Method of 0.1% index scaling) normalized the SI images using tailored percentages of extreme pixel values. Using the images normalized by the Method of 0.1% index scaling, CT models for a particular sensor in which thresholds were replaced by those from the models developed for images originating from other sensors provided average classification accuracies of 76.0%, 82.8% and 68.9% for emergent vegetation, floating-leaf vegetation and submerged vegetation, respectively. Applying the CT models developed for normalized 2009 images to 2010 images resulted in high classification (78.0%–93.3%) and overall (92.0%–93.1%) accuracies. Our
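
    A sketch of percentile-based normalization in the spirit of the 0.1% index scaling method: rescale a spectral-index image so its 0.1st and 99.9th percentiles map to 0 and 1. The exact formulation in the paper may differ; the image here is a synthetic stand-in.

        import numpy as np

        def normalize_index(img, low_pct=0.1, high_pct=99.9):
            """Map the [0.1st, 99.9th] percentile range of an index image to [0, 1]."""
            lo, hi = np.percentile(img, [low_pct, high_pct])
            return np.clip((img - lo) / (hi - lo), 0.0, 1.0)

        # Stand-in for an NDVI-like spectral-index image from one sensor.
        si_image = np.random.default_rng(6).normal(0.3, 0.2, (100, 100))
        norm = normalize_index(si_image)
        print(norm.min(), norm.max())   # 0.0 and 1.0 after scaling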

  13. Building classification trees to explain the radioactive contamination levels of the plants

    Energy Technology Data Exchange (ETDEWEB)

    Briand, B

    2008-04-15

    The objective of this thesis is the development of a method allowing the identification of factors leading to various radioactive contamination levels of plants. The methodology suggested is based on the use of a radioecological model of radionuclide transfer through the environment (the A.S.T.R.A.L. computer code) and a classification-tree method. In particular, to avoid the instability problems of classification trees and to preserve the tree structure, a node-level stabilizing technique is used. Empirical comparisons are carried out between classification trees built by this method (called the R.E.N. method) and those obtained by the C.A.R.T. method. A similarity measure is defined to compare the structure of two classification trees. This measure is used to study the stabilizing performance of the R.E.N. method. The methodology suggested is applied to a simplified contamination scenario. From the results obtained, we can identify the main variables responsible for the various radioactive contamination levels of four leafy vegetables (lettuce, cabbage, spinach and leek). Some rules extracted from these classification trees may be usable in a post-accident context. (author)

  14. Predictive mapping of soil organic carbon in wet cultivated lands using classification-tree based models

    DEFF Research Database (Denmark)

    Kheir, Rania Bou; Greve, Mogens Humlekrog; Bøcher, Peder Klith;

    2010-01-01

    Soil organic carbon (SOC) is one of the most important carbon stocks globally and has large potential to affect global climate. Distribution patterns of SOC in Denmark constitute a nation-wide baseline for studies on soil carbon changes (with respect to Kyoto protocol). This paper predicts and maps...

  15. The Hybrid of Classification Tree and Extreme Learning Machine for Permeability Prediction in Oil Reservoir

    KAUST Repository

    Prasetyo Utomo, Chandra

    2011-06-01

    Permeability is an important parameter connected with oil reservoirs. Predicting permeability could save millions of dollars. Unfortunately, petroleum engineers have faced numerous challenges in arriving at cost-efficient predictions. Much work has been carried out to solve this problem. The main challenge is to handle the high range of permeability in each reservoir. For about a hundred years, mathematicians and engineers have tried to deliver the best prediction models; however, none have produced satisfying results. In the last two decades, artificial intelligence models have been used. The current best model for permeability prediction is the extreme learning machine (ELM). It produces fairly good results, but a clear explanation of the model is hard to come by because it is so complex. The aim of this research is to propose a way out of this complexity through the design of a hybrid intelligent model. In this proposal, the system combines classification and regression models to predict the permeability value, based on well log data. In order to handle the high range of permeability values, a classification tree is utilized. A benefit of this innovation is that the tree represents knowledge in a clear and succinct fashion and thereby avoids the complexity of all previous models. Finally, it is important to note that the ELM is used as the final predictor. Results demonstrate that this proposed hybrid model performs better than support vector machines (SVM) and ELM in terms of correlation coefficient. Moreover, the classification tree model potentially leads to better communication among petroleum engineers concerning this important process and has wider implications for oil reservoir management efficiency.

  16. Predicting battle outcomes with classification trees

    OpenAIRE

    Coban, Muzaffer.

    2001-01-01

    Historical combat data analysis is a way of understanding the factors affecting battle outcomes. Current studies mostly prefer simulations that are based on mathematical abstractions of battles. However, these abstractions emphasize objective variables, such as force ratio, and have very limited ability to model important intangible factors like morale, leadership, and luck. Historical combat analysis provides a way to understand battles with data taken from the actual battlefield...

  17. DIF Trees: Using Classification Trees to Detect Differential Item Functioning

    Science.gov (United States)

    Vaughn, Brandon K.; Wang, Qiu

    2010-01-01

    A nonparametric tree classification procedure is used to detect differential item functioning for items that are dichotomously scored. Classification trees are shown to be an alternative to the traditional Mantel-Haenszel and logistic regression analyses for detecting differential item functioning. A nonparametric…

  18. Computer-aided diagnosis of Alzheimer's disease using support vector machines and classification trees

    International Nuclear Information System (INIS)

    This paper presents a computer-aided diagnosis technique for improving the accuracy of early diagnosis of Alzheimer-type dementia. The proposed methodology is based on the selection of voxels for which Welch's t-test between the two classes, normal and Alzheimer images, is greater than a given threshold. The mean and standard deviation of intensity values are calculated for the selected voxels and used as feature vectors for two different classifiers: support vector machines with a linear kernel, and classification trees. The proposed methodology reaches greater than 95% accuracy in the classification task.

  19. Integrating TM and Ancillary Geographical Data with Classification Trees for Land Cover Classification of Marsh Area

    Institute of Scientific and Technical Information of China (English)

    NA Xiaodong; ZHANG Shuqing; ZHANG Huaiqing; LI Xiaofeng; YU Huan; LIU Chunyue

    2009-01-01

    The main objective of this research is to determine the capacity of land cover classification combining spectral and textural features of Landsat TM imagery with ancillary geographical data in wetlands of the Sanjiang Plain, Heilongjiang Province, China. Semi-variograms and Z-test values were calculated to assess the separability of grey-level co-occurrence texture measures, to maximize the difference between land cover types. The degree of spatial autocorrelation showed that window sizes of 3×3 pixels and 11×11 pixels were most appropriate for Landsat TM image texture calculations. The texture analysis showed that the co-occurrence entropy, dissimilarity, and variance texture measures, derived from the Landsat TM spectral bands and vegetation indices, provided the most significant statistical differentiation between land cover types. Subsequently, a Classification and Regression Tree (CART) algorithm was applied to three different combinations of predictors: 1) TM imagery alone (TM-only); 2) TM imagery plus image texture (TM+TXT model); and 3) all predictors including TM imagery, image texture and additional ancillary GIS information (TM+TXT+GIS model). Compared with traditional Maximum Likelihood Classification (MLC) supervised classification, the three classification tree predictive models reduced the overall error rate significantly. Image texture measures and ancillary geographical variables effectively suppressed speckle noise and markedly reduced the classification error rate for marsh. For the classification tree model making use of all available predictors, the omission error rate was 12.90% and the commission error rate was 10.99% for marsh. The developed method is portable, relatively easy to implement and should be applicable in other settings and over larger extents.

  20. Genetic Algorithms and Classification Trees in Feature Discovery: Diabetes and the NHANES database

    Energy Technology Data Exchange (ETDEWEB)

    Heredia-Langner, Alejandro; Jarman, Kristin H.; Amidan, Brett G.; Pounds, Joel G.

    2013-09-01

    This paper presents a feature selection methodology that can be applied to datasets containing a mixture of continuous and categorical variables. Using a Genetic Algorithm (GA), this method explores a dataset and selects a small set of features relevant for the prediction of a binary (1/0) response. Binary classification trees and an objective function based on conditional probabilities are used to measure the fitness of a given subset of features. The method is applied to health data in order to find factors useful for the prediction of diabetes. Results show that our algorithm is capable of narrowing down the set of predictors to around 8 factors that can be validated using reputable medical and public health resources.
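
    A bare-bones sketch of GA-style feature selection scored by a classification tree, in the spirit of the method described above; the mutation/selection loop, population size, and the small per-feature fitness penalty (which pushes the search toward compact subsets) are all assumptions, not the authors' implementation.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.model_selection import cross_val_score
        from sklearn.tree import DecisionTreeClassifier

        # Synthetic stand-in for a mixed NHANES-like table with a binary response.
        X, y = make_classification(n_samples=500, n_features=40, n_informative=8,
                                   random_state=7)
        rng = np.random.default_rng(7)

        def fitness(mask):
            # Tree-based fitness with an assumed small penalty per selected feature.
            if not mask.any():
                return 0.0
            clf = DecisionTreeClassifier(max_depth=4, random_state=7)
            return cross_val_score(clf, X[:, mask], y, cv=3).mean() - 0.01 * mask.sum()

        pop = rng.random((20, 40)) < 0.2                 # 20 random feature subsets
        for _ in range(15):                              # a few generations
            scores = np.array([fitness(m) for m in pop])
            parents = pop[np.argsort(scores)[-10:]]      # keep the best half
            children = parents.copy()
            flip = rng.random(children.shape) < 0.05     # mutate by flipping bits
            children[flip] = ~children[flip]
            pop = np.vstack([parents, children])

        best = pop[np.argmax([fitness(m) for m in pop])]
        print("selected features:", np.flatnonzero(best))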

  1. Mastectomy or breast conserving surgery? Factors affecting type of surgical treatment for breast cancer – a classification tree approach

    International Nuclear Information System (INIS)

    A critical choice facing breast cancer patients is which surgical treatment – mastectomy or breast conserving surgery (BCS) – is most appropriate. Several studies have investigated factors that impact the type of surgery chosen, identifying features such as place of residence, age at diagnosis, tumor size, and socio-economic and racial/ethnic elements as relevant. Such assessment of 'propensity' is important in understanding issues such as a reported under-utilisation of BCS among women for whom such treatment was not contraindicated. Using Western Australian (WA) data, we further examine the factors associated with the type of surgical treatment for breast cancer using a classification tree approach. This approach deals naturally with complicated interactions between factors, and so allows flexible and interpretable models for treatment choice to be built that add to the current understanding of this complex decision process. Data were extracted from the WA Cancer Registry on women diagnosed with breast cancer in WA from 1990 to 2000. Subjects' treatment preferences were predicted from covariates using both classification trees and logistic regression. Tumor size was the primary determinant of patient choice, with subjects with tumors smaller than 20 mm in diameter preferring BCS. For subjects with tumors greater than 20 mm in diameter, factors such as patient age, nodal status, and tumor histology become relevant as predictors of patient choice. Classification trees perform as well as logistic regression for predicting patient choice, but are much easier to interpret for clinical use. The selected tree can inform clinicians' advice to patients.

  2. Consequences of spatial autocorrelation for niche-based models

    DEFF Research Database (Denmark)

    Segurado, P.; Araújo, Miguel B.; Kunin, W. E.

    2006-01-01

    1. Spatial autocorrelation is an important source of bias in most spatial analyses. We explored the bias introduced by spatial autocorrelation on the explanatory and predictive power of species' distribution models, and make recommendations for dealing with the problem. 2. Analyses were based ... of the original variables. Univariate models of species' distributions using generalized linear models (GLM), generalized additive models (GAM) and classification tree analysis (CTA) were fitted for each variable permutation. Variation of accuracy measures with spatial autocorrelation of the original predictor ... of significance based on randomizations were obtained. 3. Spatial autocorrelation was shown to represent a serious problem for niche-based species' distribution models. Significance values were found to be inflated up to 90-fold. 4. In general, GAM and CTA performed better than GLM, although all three methods...

  3. A simple and robust classification tree for differentiation between benign and malignant lesions in MR-mammography

    Energy Technology Data Exchange (ETDEWEB)

    Baltzer, Pascal A.T. [Medical University Vienna, Department of Radiology, Vienna (Austria); Dietzel, Matthias [University hospital Erlangen, Department of Neuroradiology, Erlangen (Germany); Kaiser, Werner A. [University Hospital Jena, Institute of Diagnostic and Interventional Radiology 1, Jena (Germany)

    2013-08-15

    In the face of multiple available diagnostic criteria in MR-mammography (MRM), a practical algorithm for lesion classification is needed. Such an algorithm should be as simple as possible and include only important independent lesion features to differentiate benign from malignant lesions. This investigation aimed to develop a simple classification tree for differential diagnosis in MRM. A total of 1,084 lesions in standardised MRM with subsequent histological verification (648 malignant, 436 benign) were investigated. Seventeen lesion criteria were assessed by 2 readers in consensus. Classification analysis was performed using the chi-squared automatic interaction detection (CHAID) method. Results include the probability for malignancy for every descriptor combination in the classification tree. A classification tree incorporating 5 lesion descriptors with a depth of 3 ramifications (1, root sign; 2, delayed enhancement pattern; 3, border, internal enhancement and oedema) was calculated. Of all 1,084 lesions, 262 (40.4 %) and 106 (24.3 %) could be classified as malignant and benign with an accuracy above 95 %, respectively. Overall diagnostic accuracy was 88.4 %. The classification algorithm reduced the number of categorical descriptors from 17 to 5 (29.4 %), resulting in a high classification accuracy. More than one third of all lesions could be classified with accuracy above 95 %. (orig.)

  4. Knowledge-Based Classification in Automated Soil Mapping

    Institute of Scientific and Technical Information of China (English)

    ZHOU BIN; WANG RENCHAO

    2003-01-01

    A machine-learning approach was developed for automated building of knowledge bases for soil resources mapping, using a classification tree to generate knowledge from training data. With this method, building a knowledge base for automated soil mapping was easier than using the conventional knowledge acquisition approach. The knowledge base built by the classification tree was used by the knowledge classifier to perform soil type classification of Longyou County, Zhejiang Province, China, using Landsat TM bi-temporal images and GIS data. To evaluate the performance of the resultant knowledge bases, the classification results were compared to an existing soil map based on a field survey. The accuracy assessment and analysis of the resultant soil maps suggested that the knowledge base built by the machine-learning method was of good quality for mapping the distribution of soil classes over the study area.

  5. Classification tree for risk assessment in patients suffering from congestive heart failure via long-term heart rate variability.

    Science.gov (United States)

    Melillo, Paolo; De Luca, Nicola; Bracale, Marcello; Pecchia, Leandro

    2013-05-01

    This study aims to develop an automatic classifier for risk assessment in patients suffering from congestive heart failure (CHF). The proposed classifier separates lower risk patients from higher risk ones using standard long-term heart rate variability (HRV) measures. Patients are labeled as lower or higher risk according to the New York Heart Association (NYHA) classification. A retrospective analysis of two public Holter databases was performed, analyzing the data of 12 patients suffering from mild CHF (NYHA I and II), labeled as lower risk, and 32 suffering from severe CHF (NYHA III and IV), labeled as higher risk. Only patients with a fraction of total heartbeat (RR) intervals classified as normal-to-normal (NN) intervals (NN/RR) higher than 80% were selected as eligible, in order to have satisfactory signal quality. Classification and regression tree (CART) analysis was employed to develop the classifiers. A total of 30 higher risk and 11 lower risk patients were included in the analysis. The proposed classification trees achieved a sensitivity and a specificity rate of 93.3% and 63.6%, respectively, in identifying higher risk patients. Finally, the rules obtained by CART are comprehensible and consistent with the consensus shown by previous studies that depressed HRV is a useful tool for risk assessment in patients suffering from CHF. PMID:24592473

  6. Predicting smear negative pulmonary tuberculosis with classification trees and logistic regression: a cross-sectional study

    Directory of Open Access Journals (Sweden)

    Kritski Afrânio

    2006-02-01

    Full Text Available Abstract Background Smear negative pulmonary tuberculosis (SNPT) accounts for 30% of pulmonary tuberculosis cases reported yearly in Brazil. This study aimed to develop a prediction model for SNPT for outpatients in areas with scarce resources. Methods The study enrolled 551 patients with clinical-radiological suspicion of SNPT in Rio de Janeiro, Brazil. The original data were divided into two equivalent samples for generation and validation of the prediction models. Symptoms, physical signs and chest X-rays were used for constructing logistic regression and classification and regression tree models. From the logistic regression, we generated a clinical and radiological prediction score. The area under the receiver operating characteristic curve, sensitivity, and specificity were used to evaluate the model's performance in both generation and validation samples. Results It was possible to generate predictive models for SNPT with sensitivity ranging from 64% to 71% and specificity ranging from 58% to 76%. Conclusion The results suggest that these models might be useful as screening tools for estimating the risk of SNPT, optimizing the utilization of more expensive tests, and avoiding the costs of unnecessary anti-tuberculosis treatment. Such models might be cost-effective tools in a health care network with a hierarchical distribution of scarce resources.

  7. Predicting smear negative pulmonary tuberculosis with classification trees and logistic regression: a cross-sectional study

    OpenAIRE

    Kritski Afrânio; Chaisson Richard E; Conde Marcus; Rezende Valéria MC; Soares Sérgio; Bastos Luiz; Mello Fernanda; Ruffino-Netto Antonio; Werneck Guilherme

    2006-01-01

    Abstract Background Smear negative pulmonary tuberculosis (SNPT) accounts for 30% of pulmonary tuberculosis cases reported yearly in Brazil. This study aimed to develop a prediction model for SNPT for outpatients in areas with scarce resources. Methods The study enrolled 551 patients with clinical-radiological suspicion of SNPT, in Rio de Janeiro, Brazil. The original data was divided into two equivalent samples for generation and validation of the prediction models. Symptoms, physical signs ...

  8. Predicting Chemically Induced Duodenal Ulcer and Adrenal Necrosis with Classification Trees

    Science.gov (United States)

    Giampaolo, Casimiro; Gray, Andrew T.; Olshen, Richard A.; Szabo, Sandor

    1991-07-01

    Binary tree-structured statistical classification algorithms and properties of 56 model alkyl nucleophiles were brought to bear on two problems of experimental pharmacology and toxicology. Each rat of a learning sample of 745 was administered one compound and autopsied to determine the presence of duodenal ulcer or adrenal hemorrhagic necrosis. The cited statistical classification schemes were then applied to these outcomes and 67 features of the compounds to ascertain those characteristics that are associated with biologic activity. For predicting duodenal ulceration, dipole moment, melting point, and solubility in octanol are particularly important, while for predicting adrenal necrosis, important features include the number of sulfhydryl groups and double bonds. These methods may constitute inexpensive but powerful ways to screen untested compounds for possible organ-specific toxicity. Mechanisms for the etiology and pathogenesis of the duodenal and adrenal lesions are suggested, as are additional avenues for drug design.

  9. Systematic Model-in-the-Loop Test of Embedded Control Systems

    Science.gov (United States)

    Krupp, Alexander; Müller, Wolfgang

    Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides the lack of methodical support for natural-language formalization, there is no standardized and accepted means for formal property definition as a target for verification planning. This article addresses several of these shortcomings of embedded system verification. An Enhanced Classification Tree Method is developed, based on the established Classification Tree Method for Embedded Systems (CTM/ES), which applies a hardware verification language to define a verification environment.

  10. Data mining methods in the prediction of Dementia: A real-data comparison of the accuracy, sensitivity and specificity of linear discriminant analysis, logistic regression, neural networks, support vector machines, classification trees and random forests

    Directory of Open Access Journals (Sweden)

    Santana Isabel

    2011-08-01

    Full Text Available Abstract Background Dementia and cognitive impairment associated with aging are a major medical and social concern. Neuropsychological testing is a key element in the diagnostic procedures of Mild Cognitive Impairment (MCI), but presently has limited value in the prediction of progression to dementia. We advance the hypothesis that newer statistical classification methods derived from data mining and machine learning, like Neural Networks, Support Vector Machines and Random Forests, can improve the accuracy, sensitivity and specificity of predictions obtained from neuropsychological testing. Seven nonparametric classifiers derived from data mining methods (Multilayer Perceptron Neural Networks, Radial Basis Function Neural Networks, Support Vector Machines, CART, CHAID and QUEST Classification Trees, and Random Forests) were compared to three traditional classifiers (Linear Discriminant Analysis, Quadratic Discriminant Analysis and Logistic Regression) in terms of overall classification accuracy, specificity, sensitivity, area under the ROC curve and Press' Q. Model predictors were 10 neuropsychological tests currently used in the diagnosis of dementia. Statistical distributions of classification parameters obtained from a 5-fold cross-validation were compared using Friedman's nonparametric test. Results Press' Q test showed that all classifiers performed better than chance alone (p ... Conclusions When taking into account sensitivity, specificity and overall classification accuracy, Random Forests and Linear Discriminant Analysis rank first among all the classifiers tested in the prediction of dementia using several neuropsychological tests. These methods may be used to improve the accuracy, sensitivity and specificity of Dementia predictions from neuropsychological testing.
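
    A sketch of the comparison protocol this abstract describes: several classifiers scored over 5-fold cross-validation, then Friedman's nonparametric test applied to the per-fold accuracies. The three classifiers and synthetic data are reduced stand-ins for the study's ten models and ten neuropsychological predictors.

        from scipy.stats import friedmanchisquare
        from sklearn.datasets import make_classification
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score
        from sklearn.tree import DecisionTreeClassifier

        # Synthetic stand-in for 10 neuropsychological predictors of a binary outcome.
        X, y = make_classification(n_samples=400, n_features=10, random_state=8)
        models = {"LDA": LinearDiscriminantAnalysis(),
                  "CART": DecisionTreeClassifier(random_state=8),
                  "RF": RandomForestClassifier(random_state=8)}
        # Per-fold accuracies from a 5-fold cross-validation, as in the study design.
        scores = {name: cross_val_score(m, X, y, cv=5) for name, m in models.items()}
        stat, p = friedmanchisquare(*scores.values())
        print({n: round(s.mean(), 3) for n, s in scores.items()}, f"Friedman p={p:.3f}")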

  11. The application of GIS based decision-tree models for generating the spatial distribution of hydromorphic organic landscapes in relation to digital terrain data

    Directory of Open Access Journals (Sweden)

    R. Bou Kheir

    2010-06-01

    Full Text Available Accurate information about organic/mineral soil occurrence is a prerequisite for many land resources management applications (including climate change mitigation). This paper aims at investigating the potential of using geomorphometrical analysis and decision tree modeling to predict the geographic distribution of hydromorphic organic landscapes in unsampled areas of Denmark. Nine primary topographic parameters (elevation, slope angle, slope aspect, plan curvature, profile curvature, tangent curvature, flow direction, flow accumulation, and specific catchment area) and one secondary parameter (steady-state topographic wetness index) were generated from Digital Elevation Models (DEMs) acquired using airborne LIDAR (Light Detection and Ranging) systems. They were used along with existing digital data collected from other sources (soil type, geological substrate and landscape type) to explain organic/mineral field measurements in hydromorphic landscapes of the chosen Danish area. A large number of tree-based classification models (186) were developed using (1) all of the parameters, (2) the primary DEM-derived topographic (morphological/hydrological) parameters only, (3) selected pairs of parameters and (4) excluding each parameter one at a time from the potential pool of predictor parameters. The best classification tree model (with the lowest misclassification error and the smallest number of terminal nodes and predictor parameters) combined the steady-state topographic wetness index and soil type, and explained 68% of the variability in organic/mineral field measurements. The overall accuracy of the predictive organic/inorganic landscapes map produced (at 1:50,000 cartographic scale) using the best tree was estimated to be ca. 75%. The proposed classification-tree model is relatively simple, quick, realistic and practical, and it can be applied to other areas, thereby providing a tool to facilitate the implementation of pedological/hydrological plans for conservation

  12. The application of GIS based decision-tree models for generating the spatial distribution of hydromorphic organic landscapes in relation to digital terrain data

    Directory of Open Access Journals (Sweden)

    R. Bou Kheir

    2010-01-01

    Full Text Available Accurate information about soil organic carbon (SOC), presented in spatial form, is a prerequisite for many land resources management applications (including climate change mitigation). This paper aims to investigate the potential of using geomorphometrical analysis and decision tree modeling to predict the geographic distribution of hydromorphic organic landscapes in unsampled areas of Denmark. Nine primary topographic parameters (elevation, slope angle, slope aspect, plan curvature, profile curvature, tangent curvature, flow direction, flow accumulation, and specific catchment area) and one secondary parameter (steady-state topographic wetness index) were generated from Digital Elevation Models (DEMs) acquired using airborne LIDAR (Light Detection and Ranging) systems. They were used along with existing digital data collected from other sources (soil type, geological substrate and landscape type) to statistically explain SOC field measurements in hydromorphic landscapes of the chosen Danish area. A large number of tree-based classification models (186) were developed using (1) all of the parameters, (2) the primary DEM-derived topographic (morphological/hydrological) parameters only, (3) selected pairs of parameters and (4) excluding each parameter one at a time from the potential pool of predictor parameters. The best classification tree model (with the lowest misclassification error and the smallest number of terminal nodes and predictor parameters) combined the steady-state topographic wetness index and soil type, and explained 68% of the variability in field SOC measurements. The overall accuracy of the produced predictive SOC map (at 1:50,000 cartographic scale) using the best tree was estimated to be ca. 75%. The proposed classification-tree model is relatively simple, quick, realistic and practical, and it can be applied to other areas, thereby providing a tool to help with the implementation of pedological/hydrological plans for conservation and sustainable

  13. Estimating the Probability of Vegetation to Be Groundwater Dependent Based on the Evaluation of Tree Models

    Directory of Open Access Journals (Sweden)

    Isabel C. Pérez Hoyos

    2016-04-01

    Full Text Available Groundwater Dependent Ecosystems (GDEs) are increasingly threatened by humans' rising demand for water resources. Consequently, it is imperative to identify the location of GDEs in order to protect them. This paper develops a methodology to estimate the probability that an ecosystem is groundwater dependent. Probabilities are obtained by modeling the relationship between the known locations of GDEs and factors influencing groundwater dependence, namely water table depth and climatic aridity index. Probabilities are derived for the state of Nevada, USA, using modeled water table depth and aridity index values obtained from the Global Aridity database. The model selected results from the performance comparison of classification trees (CT) and random forests (RF). Based on a threshold-independent accuracy measure, RF has a better ability to generate probability estimates. Considering a threshold that minimizes the misclassification rate for each model, RF also proves to be more accurate. Regarding training accuracy, performance measures such as accuracy, sensitivity, and specificity are higher for RF. For the test set, higher values of accuracy and kappa for CT highlight the fact that these measures are greatly affected by low prevalence. As shown for RF, the choice of the cutoff probability value has important consequences on model accuracy and the overall proportion of locations where GDEs are found.
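
    A sketch of the CT-versus-RF comparison: both models emit membership probabilities via predict_proba and are ranked by a threshold-independent AUC. The two synthetic predictors stand in for water table depth and the aridity index, and the ~15% prevalence is an assumption mimicking the low-prevalence setting noted above.

        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split
        from sklearn.tree import DecisionTreeClassifier

        X, y = make_classification(n_samples=800, n_features=2, n_informative=2,
                                   n_redundant=0, weights=[0.85], random_state=9)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=9)

        for name, m in [("CT", DecisionTreeClassifier(max_depth=4, random_state=9)),
                        ("RF", RandomForestClassifier(random_state=9))]:
            proba = m.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]  # P(groundwater dependent)
            print(f"{name}: AUC = {roc_auc_score(y_te, proba):.3f}")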

  14. The Comparative Analysis of the Application of Several Scoring Models of Consumer Credit in China

    Institute of Scientific and Technical Information of China (English)

    石庆焱; 靳云汇

    2004-01-01

    Based on a credit card sample from a Chinese commercial bank, a systematic comparative study of various statistical credit scoring models was carried out, the first of its kind in China. The comparison indicated that every model has its own strengths and weaknesses. The strengths of linear discriminant analysis, linear programming, and logistic regression are that these models are explainable and their outputs can form a linear scorecard (and so can be easily implemented), but they have a higher misclassification rate than the others. Neural network and classification tree models have higher predictive accuracy, but may be 'over-fitted', and their outputs are hard to explain.

  15. Identification of area-level influences on regions of high cancer incidence in Queensland, Australia: a classification tree approach

    Directory of Open Access Journals (Sweden)

    Mengersen Kerrie L

    2011-07-01

    Full Text Available Abstract Background Strategies for cancer reduction and management are targeted at both individual and area levels. Area-level strategies require careful understanding of geographic differences in cancer incidence, in particular the association with factors such as socioeconomic status, ethnicity and accessibility. This study aimed to identify the complex interplay of area-level factors associated with high area-specific incidence of Australian priority cancers using a classification and regression tree (CART) approach. Methods Area-specific smoothed standardised incidence ratios were estimated for priority-area cancers across 478 statistical local areas in Queensland, Australia (1998-2007, n = 186,075). For those cancers with significant spatial variation, CART models were used to identify whether area-level accessibility, socioeconomic status and ethnicity were associated with high area-specific incidence. Results The accessibility of a person's residence had the most consistent association with the risk of cancer diagnosis across the specific cancers. Many cancers were likely to have high incidence in more urban areas, although male lung cancer and cervical cancer tended to have high incidence in more remote areas. The impact of socioeconomic status and ethnicity on these associations differed by type of cancer. Conclusions These results highlight the complex interactions between accessibility, socioeconomic status and ethnicity in determining cancer incidence risk.

  16. Model-based geostatistics

    CERN Document Server

    Diggle, Peter J

    2007-01-01

    Model-based geostatistics refers to the application of general statistical principles of modeling and inference to geostatistical problems. This volume provides a treatment of model-based geostatistics with an emphasis on statistical methods and applications. It also features analyses of datasets from a range of scientific contexts.

  17. Staying Power of Churn Prediction Models

    NARCIS (Netherlands)

    Risselada, Hans; Verhoef, Peter C.; Bijmolt, Tammo H. A.

    2010-01-01

    In this paper, we study the staying power of various churn prediction models. Staying power is defined as the predictive performance of a model in a number of periods after the estimation period. We examine two methods, logit models and classification trees, both with and without bagging.
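
    A sketch pairing the two methods named above, each with and without bagging. Assessing true staying power would require re-scoring the fitted models on data from later periods; the single synthetic snapshot here only illustrates the four model variants.

        from sklearn.datasets import make_classification
        from sklearn.ensemble import BaggingClassifier
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score
        from sklearn.tree import DecisionTreeClassifier

        # Synthetic stand-in for one estimation-period churn snapshot.
        X, y = make_classification(n_samples=1000, n_features=15, random_state=10)

        for name, base in [("logit", LogisticRegression(max_iter=1000)),
                           ("tree", DecisionTreeClassifier(random_state=10))]:
            for bagged in (False, True):
                model = BaggingClassifier(base, n_estimators=50,
                                          random_state=10) if bagged else base
                acc = cross_val_score(model, X, y, cv=5).mean()
                print(f"{name:5s} bagging={bagged}: accuracy={acc:.3f}")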

  18. Prospective Testing and Redesign of a Temporal Biomarker Based Risk Model for Patients With Septic Shock: Implications for Septic Shock Biology

    Directory of Open Access Journals (Sweden)

    Hector R. Wong

    2015-12-01

    Full Text Available The temporal version of the pediatric sepsis biomarker risk model (tPERSEVERE) estimates the risk of a complicated course in children with septic shock based on biomarker changes from days 1 to 3 of septic shock. We validated tPERSEVERE performance in a prospective cohort, with an a priori plan to redesign tPERSEVERE if it did not perform well. Biomarkers were measured in the validation cohort (n = 168) and study subjects were classified according to tPERSEVERE. To redesign tPERSEVERE, the validation cohort and the original derivation cohort (n = 299) were combined and randomly allocated to training (n = 374) and test (n = 93) sets. tPERSEVERE was redesigned using the training set and CART methodology. tPERSEVERE performed poorly in the validation cohort, with an area under the curve (AUC) of 0.67 (95% CI: 0.58–0.75). Failure analysis revealed potential confounders related to clinical characteristics. The redesigned tPERSEVERE model had an AUC of 0.83 (0.79–0.87) and a sensitivity of 93% (68–97) for estimating the risk of a complicated course. Similar performance was seen in the test set. The classification tree segregated patients into two broad endotypes of septic shock characterized by either excessive inflammation or immune suppression.

  19. Development of river ecosystem models for Flemish watercourses: case studies in the Zwalm river basin.

    Science.gov (United States)

    Goethals, P; Dedecker, A; Raes, N; Adriaenssens, V; Gabriels, W; De Pauw, N

    2001-01-01

    Only recently has modelling been accepted as an interesting and powerful tool to support river quality assessment and management. The 'River Invertebrate Prediction and Classification System' (RIVPACS), based on statistical modelling, was one of the first and best known systems in this context. RIVPACS was developed to classify macroinvertebrate community types and to predict the fauna expected to occur in different types of watercourses, based on a small number of environmental variables. The prediction is essentially a static 'target' for the fauna to be expected at a site with stated environmental features, in the absence of environmental stress, so the system is of rather limited use in river assessment and management. Models that offer a prediction of faunal responses to changes in environmental features (e.g. changes in discharge regime, dissolved oxygen level, ...) would be of considerable value for river management. In this context, models based on classification trees, artificial neural networks and fuzzy logic were developed and applied to predict macroinvertebrate communities in the Zwalm river basin located in Flanders, Belgium. Structural characteristics (meandering, substrate type, flow velocity, ...) and physical-chemical variables (dissolved oxygen, pH, ...) were used as inputs to predict the presence or absence of macroinvertebrate taxa in the headwaters and brooks of the Zwalm river basin. In total, data from 60 measurement sites were available. The reliability and the particular strengths and weaknesses of these techniques were compared and evaluated. Classification trees performed well in general in predicting the absence or presence of the different macroinvertebrate taxa, and also allowed general relations to be deduced from the dataset. Models based on artificial neural networks (ANNs) were also good at predicting the macroinvertebrate communities at the different sites. Sensitivity analyses related to ANNs allowed studying the impact of the input

  20. Classification tree analysis in serous ovarian adenocarcinoma patients for prognostic factors associated with three-year survival probability

    Institute of Scientific and Technical Information of China (English)

    祝洪澜; 李艺; 赵一鸣; 崔恒; 赵彦; 昌晓红; 冯捷; 魏丽惠

    2008-01-01

    Objective To use a classification tree model to explore the prognostic factors associated with three-year survival in patients with serous ovarian adenocarcinoma. Methods We retrospectively analyzed 81 serous ovarian adenocarcinoma patients with three-year clinical outcomes who were first treated at Peking University People's Hospital between January 1991 and December 2003, using classification and regression tree (CART) software to establish the classification tree and evaluate prognostic factors. Results In the classification tree model established with the CART software, age was the most important prognostic factor for three-year survival; the other factors were, in turn, International Federation of Gynecology and Obstetrics (FIGO) surgical-pathologic stage, lymph node metastasis, residual disease after operation, postoperative chemotherapy regimen and pathologic grade. Substitute-variable analysis showed that surgical-pathologic stage and lymph node metastasis were the main substitute variables for age. Conclusion Age, FIGO stage, lymph node metastasis, residual disease after operation, postoperative chemotherapy regimen and pathologic grade are the main prognostic factors for three-year survival in serous ovarian adenocarcinoma patients.

  1. Study on the risk factors of hypertension among rural residents in mid-west areas of Shandong province, using the classification tree analysis methodology

    Institute of Scientific and Technical Information of China (English)

    刘甲野; 马吉祥; 徐爱强; 付振涛; 贺桂顺; 贾崇奇; 于洋

    2008-01-01

    Objective To explore the risk factors of hypertension and the high-risk populations among adults aged ≥25 in the mid-western rural areas of Shandong province, and to provide evidence for the development of intervention measures. Methods Subjects aged ≥25 were selected by a multi-stage stratified random sampling method. All participants were interviewed with a standard questionnaire and physically examined for height, weight, waist circumference, blood pressure and fasting plasma glucose (FPG). Classification tree analysis was employed to determine the risk factors of hypertension and the related high-risk populations. Results The major risk factors of hypertension were age, abdominal obesity, overweight or obesity, family history and high blood sugar. The major high-risk populations included: a) the elderly; b) the middle-aged with high blood sugar, with abdominal obesity/overweight, or with a family history; c) the middle-aged with both a family history and abdominal obesity. In the classification tree analysis, sensitivity, specificity and overall correct rates were 71.87%, 66.38% and 68.79% respectively on the 'learning sample', and 70.70%, 65.84% and 67.97% respectively on the 'testing sample'. Conclusion Reduction of both weight and blood sugar is a common prevention measure for the general population. Different kinds of prevention and control measures should be taken according to the different risk factors present in the targeted high-risk populations, and community-based hypertension prevention and control measures should be integrated when targeting populations at high risk.
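
    The sensitivity, specificity and overall correct rates quoted above are standard confusion-matrix quantities. A small self-contained sketch (with made-up labels) of how they are computed:

```python
# Sketch: computing sensitivity, specificity and the overall correct rate
# for a fitted classification tree, on either the learning or testing sample.
import numpy as np
from sklearn.metrics import confusion_matrix

def tree_performance(y_true, y_pred):
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
    sensitivity = tp / (tp + fn)                 # e.g. 71.87% on the learning sample
    specificity = tn / (tn + fp)                 # e.g. 66.38%
    overall = (tp + tn) / (tp + tn + fp + fn)    # e.g. 68.79%
    return sensitivity, specificity, overall

# Made-up labels purely to exercise the function:
y_true = np.array([1, 1, 0, 0, 1, 0, 1, 0])
y_pred = np.array([1, 0, 0, 1, 1, 0, 1, 0])
print(tree_performance(y_true, y_pred))
```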

  2. Model Based Definition

    Science.gov (United States)

    Rowe, Sidney E.

    2010-01-01

    In September 2007, the Engineering Directorate at the Marshall Space Flight Center (MSFC) created the Design System Focus Team (DSFT). MSFC was responsible for the in-house design and development of the Ares 1 Upper Stage and the Engineering Directorate was preparing to deploy a new electronic Configuration Management and Data Management System with the Design Data Management System (DDMS) based upon a Commercial Off The Shelf (COTS) Product Data Management (PDM) System. The DSFT was to establish standardized CAD practices and a new data life cycle for design data. Of special interest here, the design teams were to implement Model Based Definition (MBD) in support of the Upper Stage manufacturing contract. It is noted that this MBD does use partially dimensioned drawings for auxiliary information to the model. The design data lifecycle implemented several new release states to be used prior to formal release that allowed the models to move through a flow of progressive maturity. The DSFT identified some 17 Lessons Learned as outcomes of the standards development, pathfinder deployments and initial application to the Upper Stage design completion. Some of the high value examples are reviewed.

  3. Method for gesture based modeling

    DEFF Research Database (Denmark)

    2006-01-01

    A computer program based method is described for creating models using gestures. On an input device, such as an electronic whiteboard, a user draws a gesture which is recognized by a computer program and interpreted relative to a predetermined meta-model. Based on the interpretation, an algorithm...... is assigned to the gesture drawn by the user. The executed algorithm may, for example, consist in creating a new model element, modifying an existing model element, or deleting an existing model element....

  4. Model Construct Based Enterprise Model Architecture and Its Modeling Approach

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    In order to support enterprise integration, a kind of model construct based enterprise model architecture and its modeling approach are studied in this paper. First, the structural makeup and internal relationships of the enterprise model architecture are discussed. Then, the concept of the reusable model construct (MC), which belongs to the control view and can help to derive other views, is proposed. The modeling approach based on model constructs consists of three steps: reference model architecture synthesis, enterprise model customization, and system design and implementation. Following the MC-based modeling approach, a case study with the background of one-kind-product machinery manufacturing enterprises is illustrated. It is shown that the proposed model construct based enterprise model architecture and modeling approach are practical and efficient.

  5. Model-based software design

    Science.gov (United States)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui; Yenne, Britt; Vansickle, Larry; Ballantyne, Michael

    1992-01-01

    Domain-specific knowledge is required to create specifications, generate code, and understand existing systems. Our approach to automating software design is based on instantiating an application domain model with industry-specific knowledge and then using that model to achieve the operational goals of specification elicitation and verification, reverse engineering, and code generation. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model.

  6. Model-Based Reasoning

    Science.gov (United States)

    Ifenthaler, Dirk; Seel, Norbert M.

    2013-01-01

    In this paper, there will be a particular focus on mental models and their application to inductive reasoning within the realm of instruction. A basic assumption of this study is the observation that the construction of mental models and related reasoning is a slowly developing capability of cognitive systems that emerges effectively with proper…

  7. Model-based Software Engineering

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    2010-01-01

    The vision of model-based software engineering is to make models the main focus of software development and to automatically generate software from these models. Part of that idea works already today. But, there are still difficulties when it comes to behaviour. Actually, there is no lack in models...

  8. Principles of models based engineering

    Energy Technology Data Exchange (ETDEWEB)

    Dolin, R.M.; Hefele, J.

    1996-11-01

    This report describes a Models Based Engineering (MBE) philosophy and implementation strategy that has been developed at Los Alamos National Laboratory's Center for Advanced Engineering Technology. A major theme in this discussion is that models based engineering is an information management technology enabling the development of information driven engineering. Unlike other information management technologies, models based engineering encompasses the breadth of engineering information, from design intent through product definition to consumer application.

  9. Graph Model Based Indoor Tracking

    DEFF Research Database (Denmark)

    Jensen, Christian Søndergaard; Lu, Hua; Yang, Bin

    2009-01-01

    infrastructure for different symbolic positioning technologies, e.g., Bluetooth and RFID. More specifically, the paper proposes a model of indoor space that comprises a base graph and mappings that represent the topology of indoor space at different levels. The resulting model can be used for one or several...... indoor positioning technologies. Focusing on RFID-based positioning, an RFID specific reader deployment graph model is built from the base graph model. This model is then used in several algorithms for constructing and refining trajectories from raw RFID readings. Empirical studies with implementations...

  10. Model-based consensus

    NARCIS (Netherlands)

    Boumans, Marcel

    2014-01-01

    The aim of the rational-consensus method is to produce “rational consensus”, that is, “mathematical aggregation”, by weighing the performance of each expert on the basis of his or her knowledge and ability to judge relevant uncertainties. The measurement of the performance of the experts is based on

  11. Model-based consensus

    NARCIS (Netherlands)

    M. Boumans

    2014-01-01

    The aim of the rational-consensus method is to produce "rational consensus", that is, "mathematical aggregation", by weighing the performance of each expert on the basis of his or her knowledge and ability to judge relevant uncertainties. The measurement of the performance of the experts is based on

  12. Cluster Based Text Classification Model

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah; Wiil, Uffe Kock

    2011-01-01

    We propose a cluster based classification model for suspicious email detection and other text classification tasks. The text classification tasks comprise many training examples that require a complex classification model. Using clusters for classification makes the model simpler and increases......, the classifier is trained on each cluster having reduced dimensionality and less number of examples. The experimental results show that the proposed model outperforms the existing classification models for the task of suspicious email detection and topic categorization on the Reuters-21578 and 20 Newsgroups...... datasets. Our model also outperforms A Decision Cluster Classification (ADCC) and the Decision Cluster Forest Classification (DCFC) models on the Reuters-21578 dataset....

  13. Methods of development fuzzy logic driven decision-support models in copper alloys processing

    Directory of Open Access Journals (Sweden)

    S. Kluska-Nawarecka

    2010-01-01

    Development of a diagnostic decision support system using a logical formalism other than bivalent logic, in particular fuzzy logic, allows inference from facts presented not as explicit numbers but described by linguistic variables such as 'high level', 'low temperature', 'too much content', etc. Thanks to this, the inference process resembles human decision-making under real-world conditions. Expert knowledge allows the discovery of functions describing the relationship between the classification of a set of objects and their characteristics, on the basis of which it is possible to create decision rules for classifying new objects of so far unknown classification. This process can be automated. Experimental studies conducted on copper alloys provide large amounts of data. Processing of these data can be greatly accelerated by classification tree algorithms, which provide classes that can be used in a fuzzy inference model. Fuzzy logic also provides flexibility in allocating objects to classes on the basis of membership functions (which resembles events in real-world conditions). Decision-making in foundry operations often requires reliance on incomplete and ambiguous knowledge; hence the conclusions drawn from the data and facts may be true only 'to some extent', and the technologist has to determine what level of confidence is acceptable, although the degree of accuracy for specific criteria is defined by the membership function, which takes values from the interval [0, 1]. This paper describes the methodology and the process of developing fuzzy logic-based decision-making models from data preprocessed with classification trees, scoped to the diverse characteristics of copper alloys processing. The algorithms for automatic classification in materials research on copper alloys are clearly innovative and hold promise for practical applications in this area.
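
    A minimal sketch of the kind of membership function referred to above. The linguistic classes and breakpoints are invented for illustration; in practice they would come from the classification-tree preprocessing and expert knowledge described in the paper.

```python
# Sketch: triangular fuzzy membership functions mapping a crisp measurement
# (e.g. an alloying element content) to degrees of membership in [0, 1].
def triangular(x, a, b, c):
    """Membership rises linearly from a to b and falls from b to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

content = 4.2  # hypothetical measured content, in percent
memberships = {
    "low":    triangular(content, 0.0, 2.0, 4.5),
    "medium": triangular(content, 2.0, 4.5, 7.0),
    "high":   triangular(content, 4.5, 7.0, 9.5),
}
print(memberships)  # a sample can partially belong to two classes at once
```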

  14. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    The paper demonstrates that a wide variety of event-based modeling approaches are based on special cases of the same general event concept, and that the general event concept can be used to unify the otherwise unrelated fields of information modeling and process modeling. A set of event......-based modeling approaches are analyzed and the results are used to formulate a general event concept that can be used for unifying the seemingly unrelated event concepts. Events are characterized as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms...... of information structures. The general event concept can be used to guide systems analysis and design and to improve modeling approaches....

  15. Modeling Guru: Knowledge Base for NASA Modelers

    Science.gov (United States)

    Seablom, M. S.; Wojcik, G. S.; van Aartsen, B. H.

    2009-05-01

    Modeling Guru is an on-line knowledge-sharing resource for anyone involved with or interested in NASA's scientific models or High End Computing (HEC) systems. Developed and maintained by NASA's Software Integration and Visualization Office (SIVO) and the NASA Center for Computational Sciences (NCCS), Modeling Guru's combined forums and knowledge base for research and collaboration is becoming a repository for the accumulated expertise of NASA's scientific modeling and HEC communities. All NASA modelers and associates are encouraged to participate and provide knowledge about the models and systems so that other users may benefit from their experience. Modeling Guru is divided into a hierarchy of communities, each with its own set of forums and knowledge base documents. Current modeling communities include those for space science, land and atmospheric dynamics, atmospheric chemistry, and oceanography. In addition, there are communities focused on NCCS systems, HEC tools and libraries, and programming and scripting languages. Anyone may view most of the content on Modeling Guru (available at http://modelingguru.nasa.gov/), but users must log in to post messages and subscribe to community postings. The site offers a full range of "Web 2.0" features, including discussion forums, "wiki" document generation, document uploading, RSS feeds, search tools, blogs, email notification, and "breadcrumb" links. A discussion (a.k.a. forum "thread") is used to post comments, solicit feedback, or ask questions. If a thread is marked as a question, SIVO will monitor it and normally respond within a day. Discussions can include embedded images, tables, and formatting through the use of the Rich Text Editor. Also, users can add "Tags" to their threads to facilitate later searches. The "knowledge base" is comprised of documents that are used to capture and share expertise with others. The default "wiki" document lets users edit within the browser so others can easily collaborate on the

  16. Base Flow Model Validation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The program focuses on turbulence modeling enhancements for predicting high-speed rocket base flows. A key component of the effort is the collection of...

  17. Constraint Based Modeling Going Multicellular.

    Science.gov (United States)

    Martins Conde, Patricia do Rosario; Sauter, Thomas; Pfau, Thomas

    2016-01-01

    Constraint based modeling has seen applications in many microorganisms. For example, there are now established methods to determine potential genetic modifications and external interventions to increase the efficiency of microbial strains in chemical production pipelines. In addition, multiple models of multicellular organisms have been created including plants and humans. While initially the focus here was on modeling individual cell types of the multicellular organism, this focus recently started to switch. Models of microbial communities, as well as multi-tissue models of higher organisms have been constructed. These models thereby can include different parts of a plant, like root, stem, or different tissue types in the same organ. Such models can elucidate details of the interplay between symbiotic organisms, as well as the concerted efforts of multiple tissues and can be applied to analyse the effects of drugs or mutations on a more systemic level. In this review we give an overview of the recent development of multi-tissue models using constraint based techniques and the methods employed when investigating these models. We further highlight advances in combining constraint based models with dynamic and regulatory information and give an overview of these types of hybrid or multi-level approaches.

  18. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2009-01-01

    The purpose of the paper is to obtain insight into and provide practical advice for event-based conceptual modeling. We analyze a set of event concepts and use the results to formulate a conceptual event model that is used to identify guidelines for creation of dynamic process models and static...... information models. We characterize events as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The conceptual event model is used to characterize a variety of event concepts and it is used to illustrate how events can...... be used to integrate dynamic modeling of processes and static modeling of information structures. The results are unique in the sense that no other general event concept has been used to unify a similar broad variety of seemingly incompatible event concepts. The general event concept can be used...

  19. Event-Based Activity Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2004-01-01

    We present and discuss a modeling approach that supports event-based modeling of information and activity in information systems. Interacting human actors and IT-actors may carry out such activity. We use events to create meaningful relations between information structures and the related activit...

  20. Classification tree study of factors affecting isolated systolic hypertension in rural areas of Hanzhong city of Shaanxi Province

    Institute of Scientific and Technical Information of China (English)

    方士华; 颜虹; 党少农; 李强; 赵亚玲; 刘小宁; 杨睿海; 任勇

    2013-01-01

    Objective To explore the factors affecting isolated systolic hypertension (ISH) in rural areas of Hanzhong city, Shaanxi Province, so as to provide evidence and recommendations for ISH prevention and control. Methods Data came from a survey on the health of rural residents in Hanzhong city, Shaanxi Province. The classification and regression tree (CART) algorithm suited to binary outcome variables was selected to fit a classification tree model, which was evaluated by misclassification rate, correct classification rate and the ROC curve. Results The misclassification rates of the model on the training, validation and test sets were 16.6%, 16.8% and 14.3% respectively, and the Root ASE was 0.357, 0.351 and 0.334 respectively; the area under the ROC curve was greater than 0.5, indicating that the model fitted well. The model first split the whole population by age; the affecting factors were, in order of importance, age, body mass index, education level, smoking, average monthly family income and high salt intake. Conclusion ISH is influenced by multiple factors. Primary prevention should target these factors, and tailored intervention strategies should be applied to the different high-risk groups in order to reduce the incidence of ISH and improve the quality of life of rural residents.
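
    A sketch of the evaluation scheme described above: misclassification rate on training, validation and test partitions, plus the area under the ROC curve. The survey data are simulated; only the evaluation logic mirrors the abstract.

```python
# Sketch: fit a classification tree and report misclassification rate and
# ROC AUC on train/validation/test splits. Data are simulated stand-ins.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 2000
X = np.column_stack([
    rng.integers(25, 90, n),   # age
    rng.normal(24.0, 4.0, n),  # body mass index
    rng.integers(0, 4, n),     # education level
    rng.integers(0, 2, n),     # smoking
])
y = (0.04 * X[:, 0] + 0.05 * X[:, 1] + rng.normal(0, 1, n) > 4.5).astype(int)  # ISH proxy

X_tmp, X_test, y_tmp, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X_tmp, y_tmp, test_size=0.25, random_state=0)

cart = DecisionTreeClassifier(min_samples_leaf=50, random_state=0).fit(X_tr, y_tr)
for name, Xs, ys in [("train", X_tr, y_tr), ("valid", X_val, y_val), ("test", X_test, y_test)]:
    auc = roc_auc_score(ys, cart.predict_proba(Xs)[:, 1])
    print(f"{name}: misclassification {1 - cart.score(Xs, ys):.3f}, AUC {auc:.3f}")
```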

  1. Modelling Gesture Based Ubiquitous Applications

    CERN Document Server

    Zacharia, Kurien; Varghese, Surekha Mariam

    2011-01-01

    A cost effective, gesture based modelling technique called Virtual Interactive Prototyping (VIP) is described in this paper. Prototyping is implemented by projecting a virtual model of the equipment to be prototyped. Users can interact with the virtual model like the original working equipment. Image and sound processing techniques are used for capturing and tracking the user's interactions with the model. VIP is a flexible and interactive prototyping method that has many applications in ubiquitous computing environments. Different commercial as well as socio-economic applications and the extension of VIP to interactive advertising are also discussed.

  2. Sketch-based geologic modeling

    Science.gov (United States)

    Rood, M. P.; Jackson, M.; Hampson, G.; Brazil, E. V.; de Carvalho, F.; Coda, C.; Sousa, M. C.; Zhang, Z.; Geiger, S.

    2015-12-01

    Two-dimensional (2D) maps and cross-sections, and 3D conceptual models, are fundamental tools for understanding, communicating and modeling geology. Yet geologists lack dedicated and intuitive tools that allow rapid creation of such figures and models. Standard drawing packages produce only 2D figures that are not suitable for quantitative analysis. Geologic modeling packages can produce 3D models and are widely used in the groundwater and petroleum communities, but are often slow and non-intuitive to use, requiring the creation of a grid early in the modeling workflow and the use of geostatistical methods to populate the grid blocks with geologic information. We present an alternative approach to rapidly create figures and models using sketch-based interface and modelling (SBIM). We leverage methods widely adopted in other industries to prototype complex geometries and designs. The SBIM tool contains built-in geologic rules that constrain how sketched lines and surfaces interact. These rules are based on the logic of superposition and cross-cutting relationships that follow from rock-forming processes, including deposition, deformation, intrusion and modification by diagenesis or metamorphism. The approach allows rapid creation of multiple, geologically realistic figures and models in 2D and 3D using a simple, intuitive interface. The user can sketch in plan- or cross-section view. Geologic rules are used to extrapolate sketched lines in real time to create 3D surfaces. Quantitative analysis can be carried out directly on the models. Alternatively, they can be output as simple figures or imported directly into other modeling tools. The software runs on a tablet PC and can be used in a variety of settings including the office, classroom and field. The speed and ease of use of SBIM enables multiple interpretations to be developed from limited data, uncertainty to be readily appraised, and figures and models to be rapidly updated to incorporate new data or concepts.

  3. Agent Based Multiviews Requirements Model

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Based on current research on viewpoints-oriented requirements engineering and intelligent agents, we present the concept of the viewpoint agent and its abstract model based on a meta-language for multiviews requirements engineering. It provides a basis for consistency checking and integration of different viewpoint requirements; at the same time, this checking and integration work can be automatically realized by virtue of the intelligent agent's autonomy, proactiveness and social ability. Finally, we introduce a practical application of the model through the case study of a data flow diagram.

  4. HMM-based Trust Model

    DEFF Research Database (Denmark)

    ElSalamouny, Ehab; Nielsen, Mogens; Sassone, Vladimiro

    2010-01-01

    Probabilistic trust has been adopted as an approach to taking security sensitive decisions in modern global computing environments. Existing probabilistic trust frameworks either assume fixed behaviour for the principals or incorporate the notion of ‘decay' as an ad hoc approach to cope with thei...... the major limitation of existing Beta trust model. We show the consistency of the HMM-based trust model and contrast it against the well known Beta trust model with the decay principle in terms of the estimation precision....

  5. Model-Based Security Testing

    CERN Document Server

    Schieferdecker, Ina; Schneider, Martin; 10.4204/EPTCS.80.1

    2012-01-01

    Security testing aims at validating software system requirements related to security properties like confidentiality, integrity, authentication, authorization, availability, and non-repudiation. Although security testing techniques have been available for many years, there have been few approaches that allow for the specification of test cases at a higher level of abstraction, for enabling guidance on test identification and specification, as well as for automated test generation. Model-based security testing (MBST) is a relatively new field and especially dedicated to the systematic and efficient specification and documentation of security test objectives, security test cases and test suites, as well as to their automated or semi-automated generation. In particular, the combination of security modelling and test generation approaches is still a challenge in research and of high interest for industrial applications. MBST includes e.g. security functional testing, model-based fuzzing, risk- and threat-oriented testing,...

  6. Model-based sensor diagnosis

    International Nuclear Information System (INIS)

    Running a nuclear power plant involves monitoring data provided by the installation's sensors. Operators and computerized systems then use these data to establish a diagnostic of the plant. However, the instrumentation system is complex, and is not immune to faults and failures. This paper presents a system for detecting sensor failures using a topological description of the installation and a set of component models. This model of the plant implicitly contains relations between sensor data. These relations must always be checked if all the components are functioning correctly. The failure detection task thus consists of checking these constraints. The constraints are extracted in two stages. Firstly, a qualitative model of their existence is built using structural analysis. Secondly, the models are formally handled according to the results of the structural analysis, in order to establish the constraints on the sensor data. This work constitutes an initial step in extending model-based diagnosis, as the information on which it is based is suspect. This work will be followed by surveillance of the detection system. When the instrumentation is assumed to be sound, the unverified constraints indicate errors on the plant model. (authors). 8 refs., 4 figs

  7. Kernel model-based diagnosis

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The methods for computing the kernel consistency-based diagnoses and the kernel abductive diagnoses are only suited for the situation where part of the fault behavioral modes of the components are known. The characterization of the kernel model-based diagnosis based on the general causal theory is proposed, which can break through the limitation of the above methods when all behavioral modes of each component are known. Using this method, when observation subsets deduced logically are respectively assigned to the empty or the whole observation set, the kernel consistency-based diagnoses and the kernel abductive diagnoses can deal with all situations. The direct relationship between this diagnostic procedure and the prime implicants/implicates is proved, thus linking the theoretical results with implementation.

  8. Model-based requirements engineering

    CERN Document Server

    Holt, Jon

    2012-01-01

    This book provides a hands-on introduction to model-based requirements engineering and management by describing a set of views that form the basis for the approach. These views take into account each individual requirement in terms of its description, but then also provide each requirement with meaning by putting it into the correct 'context'. A requirement that has been put into a context is known as a 'use case' and may be based upon either stakeholders or levels of hierarchy in a system. Each use case must then be analysed and validated by defining a combination of scenarios and formal mathematica

  9. Model-based tomographic reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Chambers, David H.; Lehman, Sean K.; Goodman, Dennis M.

    2012-06-26

    A model-based approach to estimating wall positions for a building is developed and tested using simulated data. It borrows two techniques from geophysical inversion problems, layer stripping and stacking, and combines them with a model-based estimation algorithm that minimizes the mean-square error between the predicted signal and the data. The technique is designed to process multiple looks from an ultra wideband radar array. The processed signal is time-gated and each section processed to detect the presence of a wall and estimate its position, thickness, and material parameters. The floor plan of a building is determined by moving the array around the outside of the building. In this paper we describe how the stacking and layer stripping algorithms are combined and show the results from a simple numerical example of three parallel walls.

  10. Energy based hybrid turbulence modeling

    Science.gov (United States)

    Haering, Sigfried; Moser, Robert

    2015-11-01

    Traditional hybrid approaches exhibit deficiencies when used for fluctuating smooth-wall separation and reattachment, necessitating ad-hoc delaying functions and model tuning that make them no longer useful as a predictive tool. Additionally, complex geometries and flows often require high cell aspect-ratios and large grid gradients as a compromise between resolution and cost. Such transitions and inconsistencies in resolution detrimentally affect the fidelity of the simulation. We present the continued development of a new hybrid RANS/LES modeling approach specifically developed to address these challenges. In general, modeled turbulence is returned to resolved scales by reduced or negative model viscosity until a balance between theoretical and actual modeled turbulent kinetic energy is attained given the available resolution. Anisotropy in the grid and resolved field is directly integrated into this balance. A viscosity-based correction is proposed to account for resolution inhomogeneities. Both the hybrid framework and resolution gradient corrections are energy conserving through an exchange of resolved and modeled turbulence.

  11. An Agent Based Classification Model

    CERN Document Server

    Gu, Feng; Greensmith, Julie

    2009-01-01

    The major function of this model is to access the UCI Wisconsin Breast Cancer data-set[1] and classify the data items into two categories, which are normal and anomalous. This kind of classification can be referred to as anomaly detection, which discriminates anomalous behaviour from normal behaviour in computer systems. One popular solution for anomaly detection is Artificial Immune Systems (AIS). AIS are adaptive systems inspired by theoretical immunology and observed immune functions, principles and models which are applied to problem solving. The Dendritic Cell Algorithm (DCA)[2] is an AIS algorithm that is developed specifically for anomaly detection. It has been successfully applied to intrusion detection in computer security. It is believed that agent-based modelling is an ideal approach for implementing AIS, as intelligent agents could be the perfect representations of immune entities in AIS. This model evaluates the feasibility of re-implementing the DCA in an agent-based simulation environ...

  12. Crowdsourcing Based 3d Modeling

    Science.gov (United States)

    Somogyi, A.; Barsi, A.; Molnar, B.; Lovas, T.

    2016-06-01

    Web-based photo albums that support organizing and viewing the users' images are widely used. These services provide a convenient solution for storing, editing and sharing images. In many cases, the users attach geotags to the images in order to enable using them e.g. in location based applications on social networks. Our paper discusses a procedure that collects open access images from a site frequently visited by tourists. Geotagged pictures showing the image of a sight or tourist attraction are selected and processed in photogrammetric processing software that produces the 3D model of the captured object. For the particular investigation we selected three attractions in Budapest. To assess the geometrical accuracy, we used laser scanner and DSLR as well as smart phone photography to derive reference values to enable verifying the spatial model obtained from the web-album images. The investigation shows how detailed and accurate models could be derived applying photogrammetric processing software, simply by using images of the community, without visiting the site.

  13. Trace-Based Code Generation for Model-Based Testing

    OpenAIRE

    Kanstrén, T.; Piel, E.; Gross, H.-G.

    2009-01-01

    Paper Submitted for review at the Eighth International Conference on Generative Programming and Component Engineering. Model-based testing can be a powerful means to generate test cases for the system under test. However, creating a useful model for model-based testing requires expertise in the (formal) modeling language of the used tool and the general concept of modeling the system under test for effective test generation. A commonly used modeling notation is to describe the model through a...

  14. Business value modeling based on BPMN models

    OpenAIRE

    Masoumigoudarzi, Farahnaz

    2014-01-01

    In this study we try to clarify the modeling and measurement of 'business values', as they are defined in a business context, in the business processes of a company, and we introduce different methods and select the one best suited to modeling the company's business values. These methods have been used by researchers in business analytics and by senior managers of many companies. The focus of this project is business value detection and modeling. The basis of this research is on BPM...

  15. Intelligent model-based OPC

    Science.gov (United States)

    Huang, W. C.; Lai, C. M.; Luo, B.; Tsai, C. K.; Chih, M. H.; Lai, C. W.; Kuo, C. C.; Liu, R. G.; Lin, H. T.

    2006-03-01

    Optical proximity correction is the technique of pre-distorting mask layouts so that the printed patterns are as close to the desired shapes as possible. For model-based optical proximity correction, a lithographic model to predict the edge position (contour) of patterns on the wafer after lithographic processing is needed. Generally, segmentation of edges is performed prior to the correction. Pattern edges are dissected into several small segments with corresponding target points. During the correction, the edges are moved back and forth from the initial drawn position, assisted by the lithographic model, to finally settle on the proper positions. When the correction converges, the intensity predicted by the model at every target point hits the model-specific threshold value. Several iterations are required to achieve convergence, and the computation time increases with the number of required iterations. An artificial neural network is an information-processing paradigm inspired by the way biological nervous systems, such as the brain, process information. It is composed of a large number of highly interconnected processing elements (neurons) working in unison to solve specific problems. A neural network can be a powerful data-modeling tool that is able to capture and represent complex input/output relationships. The network can accurately predict the behavior of a system via the learning procedure. A radial basis function network, a variant of the artificial neural network, is an efficient function approximator. In this paper, a radial basis function network was used to build a mapping from the segment characteristics to the edge shift from the drawn position. This network can provide a good initial guess for each segment on which OPC is carried out. The good initial guess reduces the required iterations. Consequently, cycle time can be shortened effectively. The optimization of the radial basis function network for this system was performed by a genetic algorithm
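
    As a rough illustration of the approximator involved, the sketch below builds a small radial basis function network in NumPy, with centres drawn from the training data and output weights fitted by least squares. The segment features and edge shifts are synthetic placeholders; the paper itself tunes the network with a genetic algorithm rather than this simple recipe.

```python
# Sketch: an RBF network mapping segment characteristics to an edge shift.
import numpy as np

def rbf_design(X, centres, gamma):
    # Gaussian basis: phi[i, j] = exp(-gamma * ||x_i - c_j||^2)
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 3))                 # e.g. width, spacing, local density
y = np.sin(X @ np.array([2.0, -1.0, 0.5]))     # stand-in for measured edge shifts

centres = X[rng.choice(len(X), size=20, replace=False)]
Phi = rbf_design(X, centres, gamma=10.0)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)    # least-squares output weights

pred = Phi @ w                                 # initial-guess edge shifts for OPC
print(f"RMS error: {np.sqrt(np.mean((pred - y) ** 2)):.4f}")
```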

  16. Modelling distribution of marine benthos from hydroacoustics and underwater video

    Science.gov (United States)

    Holmes, K. W.; Van Niel, K. P.; Radford, B.; Kendrick, G. A.; Grove, S. L.

    2008-08-01

    Broad-scale mapping of marine benthos is required for marine resource management and conservation. This study combines textural derivatives based on bathymetry from multibeam hydroacoustics with underwater video observations to model and map sessile biota between 10- and 60-m water depth over 35 km 2 in Point Addis Marine National Park (MNP), Vic., Australia. Classification tree models and maps were developed for macroalgae (all types, mixed red algae, Ecklonia, and rhodoliths) and sessile invertebrates (all types, sponges, and ascidians). Model accuracy was tested on 25% of the video observation dataset reserved from modelling. Models fit well for most macroalgae categories (correct classification rates of 67-84%), but are not as good for sessile invertebrate classes (correct classification rates of 57-62%). The poor fit of the sessile invertebrate models may be the combined result of grouping organisms with different environmental requirements and the effect of false absences recorded during video interpretation due to poor image quality. Probability maps, binary single-class maps, and multi-class maps supply spatially explicit, detailed information on the distribution of sessile benthic biota within the MNP and provide information at a landscape-scale for ecological investigations and marine management.

  17. Sensor-based interior modeling

    International Nuclear Information System (INIS)

    Robots and remote systems will play crucial roles in future decontamination and decommissioning (D&D) of nuclear facilities. Many of these facilities, such as uranium enrichment plants, weapons assembly plants, research and production reactors, and fuel recycling facilities, are dormant; there is also an increasing number of commercial reactors whose useful lifetime is nearly over. To reduce worker exposure to radiation and to the occupational and other hazards associated with D&D tasks, robots will execute much of the work agenda. Traditional teleoperated systems rely on human understanding (based on information gathered by remote viewing cameras) of the work environment to safely control the remote equipment. However, removing the operator from the work site substantially reduces his efficiency and effectiveness. To approach the productivity of a human worker, tasks will be performed telerobotically, in which many aspects of task execution are delegated to robot controllers and other software. This paper describes a system that semi-automatically builds a virtual world for remote D&D operations by constructing 3-D models of a robot's work environment. Planar and quadric surface representations of objects typically found in nuclear facilities are generated from laser rangefinder data with a minimum of human interaction. The surface representations are then incorporated into a task space model that can be viewed and analyzed by the operator, accessed by motion planning and robot safeguarding algorithms, and ultimately used by the operator to instruct the robot at a level much higher than teleoperation

  18. Optimal pricing decision model based on activity-based costing

    Institute of Scientific and Technical Information of China (English)

    王福胜; 常庆芳

    2003-01-01

    In order to find out the applicability of the optimal pricing decision model based on the conventional cost behavior model after activity-based costing has given a strong shock to the conventional cost behavior model and its assumptions, detailed analyses have been made using the activity-based cost behavior and cost-volume-profit analysis model. It is concluded from these analyses that the theory behind the construction of the optimal pricing decision model is still tenable under activity-based costing, but the conventional optimal pricing decision model must be modified as appropriate to the activity-based costing based cost behavior model and cost-volume-profit analysis model, and an optimal pricing decision model is really a product pricing decision model constructed by following the economic principle of maximizing profit.

  19. Differential geometry based multiscale models.

    Science.gov (United States)

    Wei, Guo-Wei

    2010-08-01

    Large chemical and biological systems such as fuel cells, ion channels, molecular motors, and viruses are of great importance to the scientific community and public health. Typically, these complex systems in conjunction with their aquatic environment pose a fabulous challenge to theoretical description, simulation, and prediction. In this work, we propose a differential geometry based multiscale paradigm to model complex macromolecular systems, and to put macroscopic and microscopic descriptions on an equal footing. In our approach, the differential geometry theory of surfaces and geometric measure theory are employed as a natural means to couple the macroscopic continuum mechanical description of the aquatic environment with the microscopic discrete atomistic description of the macromolecule. Multiscale free energy functionals, or multiscale action functionals are constructed as a unified framework to derive the governing equations for the dynamics of different scales and different descriptions. Two types of aqueous macromolecular complexes, ones that are near equilibrium and others that are far from equilibrium, are considered in our formulations. We show that generalized Navier-Stokes equations for the fluid dynamics, generalized Poisson equations or generalized Poisson-Boltzmann equations for electrostatic interactions, and Newton's equation for the molecular dynamics can be derived by the least action principle. These equations are coupled through the continuum-discrete interface whose dynamics is governed by potential driven geometric flows. Comparison is given to classical descriptions of the fluid and electrostatic interactions without geometric flow based micro-macro interfaces. The detailed balance of forces is emphasized in the present work. We further extend the proposed multiscale paradigm to micro-macro analysis of electrohydrodynamics, electrophoresis, fuel cells, and ion channels. We derive generalized Poisson-Nernst-Planck equations that are
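
    For orientation, the classical Poisson-Boltzmann equation, of which the generalized equations mentioned above are extensions, is sketched below; the symbols follow the usual continuum-electrostatics conventions and are not taken from the paper.

```latex
% Schematic classical Poisson-Boltzmann equation. The generalized versions in
% the paper couple this to geometric flows of the micro-macro interface.
% \phi: electrostatic potential, \epsilon: dielectric profile,
% \rho_f: fixed charge density, c_i, q_i: bulk concentration and charge of
% mobile ion species i, k_B T: thermal energy.
-\nabla \cdot \bigl( \epsilon(\mathbf{r}) \, \nabla \phi(\mathbf{r}) \bigr)
  = \rho_f(\mathbf{r})
  + \sum_i c_i \, q_i \exp\!\left( -\frac{q_i \, \phi(\mathbf{r})}{k_B T} \right)
```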

  20. Memristor model based on fuzzy window function

    OpenAIRE

    Abdel-Kader, Rabab Farouk; Abuelenin, Sherif M.

    2016-01-01

    Memristor (memory-resistor) is the fourth passive circuit element. We introduce a memristor model based on a fuzzy logic window function. Fuzzy models are flexible, which enables the capture of the pinched hysteresis behavior of the memristor. The introduced fuzzy model avoids common problems associated with window-function based memristor models, such as the terminal state problem, and the symmetry issues. The model captures the memristor behavior with a simple rule-base which gives an insig...

  1. Classification tree model for identifying drug permeability classes in the Biopharmaceutics Classification System

    Institute of Scientific and Technical Information of China (English)

    曾垂宇; 王晓艳

    2013-01-01

    A classification tree model based on molecular properties was constructed to identify the permeability class of compounds under the Biopharmaceutics Classification System (BCS). Caco-2 permeability data collected from different literature sources were used as the training set to build the classification tree model, which was then applied to classify an external test set consisting of the US Food and Drug Administration's BCS standard compounds. The resulting rule for identifying the BCS permeability class of a compound is: if the number of hydrogen-bond donor atoms is less than 4, the sum of the positive van der Waals polar surface area is less than 40.71 and the solvation energy is greater than -33.89, the compound is highly permeable; otherwise it has low permeability. The classification structure-property relationship model achieved correct classification rates of 91.43% on the 105-compound training set and 82.35% on the 17-compound external test set. The model successfully identified the molecular properties underlying the high/low permeability classification of the BCS standard compounds, providing a simple and effective classification method for drug permeability identification.
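
    The reported rule is simple enough to transcribe directly. The function below encodes it with the published thresholds; the function and argument names are our own, and unit conventions follow the paper's descriptors.

```python
# The permeability decision rule reported above, transcribed as a function.
def bcs_permeability_class(n_hbond_donors, pos_vdw_polar_surface, solvation_energy):
    """Return the BCS permeability class, 'high' or 'low'."""
    if (n_hbond_donors < 4
            and pos_vdw_polar_surface < 40.71
            and solvation_energy > -33.89):
        return "high"
    return "low"

print(bcs_permeability_class(2, 35.0, -20.0))  # -> high
print(bcs_permeability_class(5, 35.0, -20.0))  # -> low (too many H-bond donors)
```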

  2. A respiratory alert model for the Shenandoah Valley, Virginia, USA.

    Science.gov (United States)

    Hondula, David M; Davis, Robert E; Knight, David B; Sitka, Luke J; Enfield, Kyle; Gawtry, Stephen B; Stenger, Phillip J; Deaton, Michael L; Normile, Caroline P; Lee, Temple R

    2013-01-01

    Respiratory morbidity (particularly COPD and asthma) can be influenced by short-term weather fluctuations that affect air quality and lung function. We developed a model to evaluate meteorological conditions associated with respiratory hospital admissions in the Shenandoah Valley of Virginia, USA. We generated ensembles of classification trees based on six years of respiratory-related hospital admissions (64,620 cases) and a suite of 83 potential environmental predictor variables. As our goal was to identify short-term weather linkages to high admission periods, the dependent variable was formulated as a binary classification of five-day moving average respiratory admission departures from the seasonal mean value. Accounting for seasonality removed the long-term apparent inverse relationship between temperature and admissions. We generated eight total models specific to the northern and southern portions of the valley for each season. All eight models demonstrate predictive skill (mean odds ratio = 3.635) when evaluated using a randomization procedure. The predictor variables selected by the ensembling algorithm vary across models, and both meteorological and air quality variables are included. In general, the models indicate complex linkages between respiratory health and environmental conditions that may be difficult to identify using more traditional approaches.
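
    A sketch of the dependent-variable construction and ensemble modelling described above, on simulated data. The real study used six years of admissions and 83 candidate environmental predictors; the bagged trees below stand in for whatever ensembling scheme the authors used.

```python
# Sketch: five-day moving-average admissions, expressed as a binary departure
# from the seasonal mean, modelled by an ensemble of classification trees.
import numpy as np
import pandas as pd
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(3)
dates = pd.date_range("2000-01-01", periods=6 * 365, freq="D")
admissions = 30 + 10 * np.cos(2 * np.pi * dates.dayofyear / 365) + rng.normal(0, 3, len(dates))
weather = pd.DataFrame({"max_temp": rng.normal(20, 8, len(dates)),
                        "pm25": rng.uniform(2, 35, len(dates))}, index=dates)

adm5 = pd.Series(admissions, index=dates).rolling(5).mean()
seasonal = adm5.groupby(dates.dayofyear).transform("mean")   # day-of-year mean
label = ((adm5 - seasonal) > 0).astype(int)[4:]              # drop rolling warm-up

model = BaggingClassifier(DecisionTreeClassifier(max_depth=4),
                          n_estimators=100, random_state=0)
model.fit(weather.iloc[4:], label)
print(f"in-sample accuracy: {model.score(weather.iloc[4:], label):.2f}")
```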

  3. Guide to APA-Based Models

    Science.gov (United States)

    Robins, Robert E.; Delisi, Donald P.

    2008-01-01

    In Robins and Delisi (2008), a linear decay model, a new IGE model by Sarpkaya (2006), and a series of APA-Based models were scored using data from three airports. This report is a guide to the APA-based models.

  4. CEAI: CCM based Email Authorship Identification Model

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah

    2013-01-01

    reveal that the proposed CCM-based email authorship identification model, along with the proposed feature set, outperforms the state-of-the-art support vector machine (SVM)-based models, as well as the models proposed by Iqbal et al. [1, 2]. The proposed model attains an accuracy rate of 94% for 10...

  5. Test case generation based on orthogonal table for software black-box testing

    Institute of Scientific and Technical Information of China (English)

    LIU Jiu-fu; YANG Zhong; YANG Zhen-xing; SUN Lin

    2008-01-01

    Software testing is an important means of assuring software quality. This paper presents a practicable method for generating software test cases which is operational and highly efficient. We discuss the identification of software specification categories and choices and construct a classification tree. Based on an orthogonal array, it is then easy to generate test cases, and the number of test cases produced by this method is far smaller than that of all combinations of the choices.
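
    To make the orthogonal-array step concrete, the sketch below uses the standard L9(3^4) array to generate nine test cases for a hypothetical specification with four categories of three choices each; the categories themselves are invented for illustration.

```python
# Sketch: test case generation from the L9(3^4) orthogonal array. Every pair
# of levels across any two columns appears equally often, so 9 cases give
# strong pairwise coverage instead of all 3**4 = 81 combinations.
L9 = [
    (0, 0, 0, 0), (0, 1, 1, 1), (0, 2, 2, 2),
    (1, 0, 1, 2), (1, 1, 2, 0), (1, 2, 0, 1),
    (2, 0, 2, 1), (2, 1, 0, 2), (2, 2, 1, 0),
]
categories = {
    "browser": ["Firefox", "Chrome", "Edge"],
    "os":      ["Windows", "Linux", "macOS"],
    "locale":  ["en", "de", "zh"],
    "network": ["wifi", "ethernet", "offline"],
}
names = list(categories)
for i, row in enumerate(L9, start=1):
    case = {names[j]: categories[names[j]][level] for j, level in enumerate(row)}
    print(f"test case {i}: {case}")
```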

  6. Trace-Based Code Generation for Model-Based Testing

    NARCIS (Netherlands)

    Kanstrén, T.; Piel, E.; Gross, H.-G.

    2009-01-01

    Paper Submitted for review at the Eighth International Conference on Generative Programming and Component Engineering. Model-based testing can be a powerful means to generate test cases for the system under test. However, creating a useful model for model-based testing requires expertise in the (fo

  7. Electrical Compact Modeling of Graphene Base Transistors

    Directory of Open Access Journals (Sweden)

    Sébastien Frégonèse

    2015-11-01

    Following the recent development of the Graphene Base Transistor (GBT), a new electrical compact model for GBT devices is proposed. The transistor model includes the quantum capacitance model to obtain a self-consistent base potential. It also uses a versatile transfer current equation to be compatible with the different possible GBT configurations, and it accounts for high injection conditions thanks to a transit-time based charge model. Finally, the developed large signal model has been implemented in Verilog-A code and can be used for simulation in a standard circuit design environment such as Cadence or ADS. The model has been verified using advanced numerical simulation.

  8. Landscape patterns as habitat predictors: Building and testing models for cavity-nesting birds in the Uinta Mountains of Utah, USA

    Science.gov (United States)

    Lawler, J.J.; Edwards, T.C.

    2002-01-01

    The ability to predict species occurrences quickly is often crucial for managers and conservation biologists with limited time and funds. We used measured associations with landscape patterns to build accurate predictive habitat models that were quickly and easily applied (i.e., required no additional data collection in the field to make predictions). We used classification trees (a nonparametric alternative to discriminant function analysis, logistic regression, and other generalized linear models) to model nesting habitat of red-naped sapsuckers (Sphyrapicus nuchalis), northern flickers (Colaptes auratus), tree swallows (Tachycineta bicolor), and mountain chickadees (Parus gambeli) in the Uinta Mountains of northeastern Utah, USA. We then tested the predictive capability of the models with independent data collected in the field the following year. The models built for the northern flicker, red-naped sapsucker, and tree swallow were relatively accurate (84%, 80%, and 75% nests correctly classified, respectively) compared to the models for the mountain chickadee (50% nests correctly classified). All four models were more selective than a null model that predicted habitat based solely on a gross association with aspen forests. We conclude that associations with landscape patterns can be used to build relatively accurate, easy to use, predictive models for some species. Our results stress, however, that both selecting the proper scale at which to assess landscape associations and empirically testing the models derived from those associations are crucial for building useful predictive models.

  9. EPR-based material modelling of soils

    Science.gov (United States)

    Faramarzi, Asaad; Alani, Amir M.

    2013-04-01

    In the past few decades, as a result of the rapid developments in computational software and hardware, alternative computer-aided pattern recognition approaches have been introduced to model many engineering problems, including the constitutive modelling of materials. The main idea behind pattern recognition systems is that they learn adaptively from experience and extract various discriminants, each appropriate for its purpose. In this work an approach is presented for developing material models for soils based on evolutionary polynomial regression (EPR). EPR is a recently developed hybrid data mining technique that searches for structured mathematical equations (representing the behaviour of a system) using a genetic algorithm and the least squares method. Stress-strain data from triaxial tests are used to train and develop EPR-based material models for soil. The developed models are compared with some of the well-known conventional material models and it is shown that EPR-based models can provide a better prediction of the behaviour of soils. The main benefits of using EPR-based material models are that they provide a unified approach to the constitutive modelling of all materials (i.e., all aspects of material behaviour can be implemented within the unified environment of an EPR model) and that they do not require any arbitrary choice of constitutive (mathematical) models. In EPR-based material models there are no material parameters to be identified, and as the model is trained directly from experimental data, EPR-based material models are the shortest route from experimental research (data) to numerical modelling. Another advantage of EPR-based constitutive models is that as more experimental data become available, the quality of the EPR prediction can be improved by learning from the additional data, and therefore the EPR model can become more effective and robust. The developed EPR-based material models can be incorporated in finite element (FE) analysis.
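
    As a rough sketch of the least-squares half of EPR: the genetic algorithm would normally search over which polynomial terms to include, so in the snippet below the candidate terms are simply fixed by hand and only their coefficients are fitted. The stress-strain data are synthetic stand-ins for triaxial test results.

```python
# Sketch: fitting coefficients of a fixed polynomial term structure by least
# squares, as EPR does once the GA has proposed the terms.
import numpy as np

rng = np.random.default_rng(1)
strain = rng.uniform(0.0, 0.05, 100)
confining = rng.uniform(50.0, 300.0, 100)  # hypothetical confining pressure, kPa
stress = 1200 * strain - 8000 * strain**2 + 0.4 * confining + rng.normal(0, 2, 100)

# Candidate term structure (the part a GA would normally evolve):
terms = np.column_stack([strain, strain**2, confining, strain * confining])
coeffs, *_ = np.linalg.lstsq(terms, stress, rcond=None)
print("fitted coefficients:", np.round(coeffs, 2))
```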

  10. The Culture Based Model: Constructing a Model of Culture

    Science.gov (United States)

    Young, Patricia A.

    2008-01-01

    Recent trends reveal that models of culture aid in mapping the design and analysis of information and communication technologies. Therefore, models of culture are powerful tools to guide the building of instructional products and services. This research examines the construction of the culture based model (CBM), a model of culture that evolved…

  11. Finite mixture models and model-based clustering

    Directory of Open Access Journals (Sweden)

    Volodymyr Melnykov

    2010-01-01

    Finite mixture models have a long history in statistics, having been used to model population heterogeneity, generalize distributional assumptions, and, lately, to provide a convenient yet formal framework for clustering and classification. This paper provides a detailed review of mixture models and model-based clustering. Recent trends as well as open problems in the area are also discussed.

  12. Model-based DSL frameworks

    NARCIS (Netherlands)

    Kurtev, I.; Bézivin, J.; Jouault, F.; Valduriez, P.

    2006-01-01

    More than five years ago, the OMG proposed the Model Driven Architecture (MDA™) approach to deal with the separation of platform dependent and independent aspects in information systems. Since then, the initial idea of MDA evolved and Model Driven Engineering (MDE) is being increasingly promoted to

  13. Data-driven modeling of hydroclimatic trends and soil moisture: Multi-scale data integration and decision support

    Science.gov (United States)

    Coopersmith, Evan Joseph

regime curve data and facilitate the development of cluster-specific algorithms. Given the desire to enable intelligent decision-making at any location, this classification system is developed in a manner that will allow for classification anywhere in the U.S., even in an ungauged basin. Daily time series data from 428 catchments in the MOPEX database are analyzed to produce an empirical classification tree, partitioning the United States into regions of hydroclimatic similarity. In constructing a classification tree based upon 55 years of data, it is important to recognize the non-stationary nature of climate data. The shifts in climatic regimes will cause certain locations to shift their ultimate position within the classification tree, requiring decision-makers to alter land usage, farming practices, and equipment needs, and algorithms to adjust accordingly. This work adapts the classification model to address the issue of regime shifts over larger temporal scales and suggests how land-usage and farming protocol may vary from hydroclimatic shifts in decades to come. Finally, the generalizability of the hydroclimatic classification system is tested with a physically-based soil moisture model calibrated at several locations throughout the continental United States. The soil moisture model is calibrated at a given site and then applied with the same parameters at other sites within and outside the same hydroclimatic class. The model's performance deteriorates minimally if the calibration and validation locations are within the same hydroclimatic class, but deteriorates significantly if the calibration and validation sites are located in different hydroclimatic classes. These soil moisture estimates at the field scale are then further refined by the introduction of LiDAR elevation data, distinguishing faster-drying peaks and ridges from slower-drying valleys. The inclusion of LiDAR enabled multiple locations within the same field to be predicted accurately despite non

  14. Firm Based Trade Models and Turkish Economy

    Directory of Open Access Journals (Sweden)

    Nilüfer ARGIN

    2015-12-01

    Among all international trade models, only firm-based trade models explain firms' actions and behavior in world trade. Firm-based trade models focus on the trade behavior of the individual firms that actually conduct intra-industry trade, and they can truly explain the globalization process. These approaches also cover multinational cooperation, supply chains and outsourcing. Our paper aims to explain and analyze Turkish exports in the context of firm-based trade models. We use UNCTAD data on exports by SITC Rev 3 categorization to explain total exports and 255 products, and calculate the intensive and extensive margins of Turkish firms.

  15. Traceability in Model-Based Testing

    Directory of Open Access Journals (Sweden)

    Mathew George

    2012-11-01

    Full Text Available The growing complexities of software and the demand for shorter time to market are two important challenges that face today’s IT industry. These challenges demand the increase of both productivity and quality of software. Model-based testing is a promising technique for meeting these challenges. Traceability modeling is a key issue and challenge in model-based testing. Relationships between the different models will help to navigate from one model to another, and trace back to the respective requirements and the design model when the test fails. In this paper, we present an approach for bridging the gaps between the different models in model-based testing. We propose relation definition markup language (RDML for defining the relationships between models.

  16. Lévy-based growth models

    DEFF Research Database (Denmark)

    Jónsdóttir, Kristjana Ýr; Schmiegel, Jürgen; Jensen, Eva Bjørn Vedel

    2008-01-01

    In the present paper, we give a condensed review, for the nonspecialist reader, of a new modelling framework for spatio-temporal processes, based on Lévy theory. We show the potential of the approach in stochastic geometry and spatial statistics by studying Lévy-based growth modelling of planar...... objects. The growth models considered are spatio-temporal stochastic processes on the circle. As a by-product, flexible new models for space–time covariance functions on the circle are provided. An application of the Lévy-based growth models to tumour growth is discussed....

  17. Distributed Prognostics Based on Structural Model Decomposition

    Data.gov (United States)

    National Aeronautics and Space Administration — Within systems health management, prognostics focuses on predicting the remaining useful life of a system. In the model-based prognostics paradigm, physics-based...

  18. Base Flow Model Validation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation is the systematic "building-block" validation of CFD/turbulence models employing a GUI driven CFD code (RPFM) and existing as well as new data sets...

  19. Integration of Simulink Models with Component-based Software Models

    DEFF Research Database (Denmark)

    Marian, Nicolae

    2008-01-01

    Model based development aims to facilitate the development of embedded control systems by emphasizing the separation of the design level from the implementation level. Model based design involves the use of multiple models that represent different views of a system, having different semantics...... of abstract system descriptions. Usually, in mechatronics systems, design proceeds by iterating model construction, model analysis, and model transformation. Constructing a MATLAB/Simulink model, a plant and controller behavior is simulated using graphical blocks to represent mathematical and logical...... constructs and process flow, then software code is generated. A Simulink model is a representation of the design or implementation of a physical system that satisfies a set of requirements. A software component-based system aims to organize system architecture and behaviour as a means of computation...

  20. Radar Target Modelling Based on RCS Measurements

    OpenAIRE

    Wessling, Andreas

    2002-01-01

    When simulating target seekers, there is a great need for computationally efficient target models. This report considers a study of radar target modelling based on Inverse Synthetic Aperture Radar (ISAR) measurements of generic aircraft. The results underlie future modelling of full-size air targets. A method is developed for two-dimensional modelling of aspect-dependent target scattering. The approach taken is to generate point-scatterer models of two targets, where each point scatterer is...

  1. A Multiple Model Approach to Modeling Based on LPF Algorithm

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Input-output data fitting methods are often used for unknown-structure nonlinear system modeling. Based on model-on-demand tactics, a multiple model approach to modeling for nonlinear systems is presented. The basic idea is to find, from vast historical system input-output data sets, some data sets matching the current working point, then to develop a local model using the Local Polynomial Fitting (LPF) algorithm. With the change of working points, multiple local models are built, which realize exact modeling for the global system. Compared with other methods, the simulation results show good performance: the estimation is simple, effective and reliable.

  2. Model Validation in Ontology Based Transformations

    Directory of Open Access Journals (Sweden)

    Jesús M. Almendros-Jiménez

    2012-10-01

    Full Text Available Model Driven Engineering (MDE) is an emerging approach of software engineering. MDE emphasizes the construction of models from which the implementation should be derived by applying model transformations. The Ontology Definition Meta-model (ODM) has been proposed as a profile for UML models of the Web Ontology Language (OWL). In this context, transformations of UML models can be mapped into ODM/OWL transformations. On the other hand, model validation is a crucial task in model transformation. Meta-modeling makes it possible to give a syntactic structure to source and target models. However, semantic requirements have to be imposed on source and target models. A given transformation will be sound when source and target models fulfill the syntactic and semantic requirements. In this paper, we present an approach for model validation in ODM based transformations. Adopting a logic programming based transformational approach we show how it is possible to transform and validate models. Properties to be validated range from structural and semantic requirements of models (pre- and post-conditions) to properties of the transformation (invariants). The approach has been applied to a well-known example of model transformation: the Entity-Relationship (ER) to Relational Model (RM) transformation.

  3. Tools for model-based security engineering: models vs. code

    OpenAIRE

    Jürjens, Jan; Yu, Yijun

    2007-01-01

    We present tools to support model-based security engineering on both the model and the code level. In the approach supported by these tools, one firstly specifies the security-critical part of the system (e.g. a crypto protocol) using the UML security extension UMLsec. The models are automatically verified for security properties using automated theorem provers. These are implemented within a framework that supports implementing verification routines, based on XMI output of the diagrams from ...

  4. A Role-Based Fuzzy Assignment Model

    Institute of Scientific and Technical Information of China (English)

    ZUO Bao-he; FENG Shan

    2002-01-01

    It is very important to dynamically assign tasks to the corresponding actors in a workflow management system, especially in complex applications; this improves the flexibility of workflow systems. In this paper, a role-based workflow model with fuzzy optimized intelligent assignment is proposed and applied in an investment management system. A groupware-based software model is also proposed.

  5. An acoustical model based monitoring network

    NARCIS (Netherlands)

    Wessels, P.W.; Basten, T.G.H.; Eerden, F.J.M. van der

    2010-01-01

    In this paper the approach for an acoustical model based monitoring network is demonstrated. This network is capable of reconstructing a noise map, based on the combination of measured sound levels and an acoustic model of the area. By pre-calculating the sound attenuation within the network the noi

  6. Hybrid data mining-regression for infrastructure risk assessment based on zero-inflated data

    International Nuclear Information System (INIS)

    Infrastructure disaster risk assessment seeks to estimate the probability of a given customer or area losing service during a disaster, sometimes in conjunction with estimating the duration of each outage. This is often done on the basis of past data about the effects of similar events impacting the same or similar systems. In many situations this past performance data from infrastructure systems is zero-inflated; it has more zeros than can be appropriately modeled with standard probability distributions. The data are also often non-linear and exhibit threshold effects due to the complexities of infrastructure system performance. Standard zero-inflated statistical models such as zero-inflated Poisson and zero-inflated negative binomial regression models do not adequately capture these complexities. In this paper we develop a novel method that is a hybrid classification tree/regression method for complex, zero-inflated data sets. We investigate its predictive accuracy based on a large number of simulated data sets and then demonstrate its practical usefulness with an application to hurricane power outage risk assessment for a large utility based on actual data from the utility. While formulated for infrastructure disaster risk assessment, this method is promising for data-driven analysis for other situations with zero-inflated, complex data exhibiting response thresholds.
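
    The two-stage idea can be sketched as follows: a classification tree first separates zero from non-zero outcomes, and a regression model then predicts the magnitude of the non-zero cases. This is a generic reconstruction of the hybrid tree/regression approach, with invented data and a plain linear regressor standing in for the paper's method.

      import numpy as np
      from sklearn.tree import DecisionTreeClassifier
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(2)
      X = rng.random((1000, 4))
      is_zero = rng.random(1000) < 0.7                     # ~70% structural zeros
      y = np.where(is_zero, 0.0, 5.0 * X[:, 0] + rng.normal(0, 0.5, 1000))

      clf = DecisionTreeClassifier(max_depth=4).fit(X, y > 0)   # stage 1: zero vs non-zero
      reg = LinearRegression().fit(X[y > 0], y[y > 0])          # stage 2: magnitude

      def predict(X_new):
          # Predict zero unless the tree flags a non-zero outcome.
          nonzero = clf.predict(X_new)
          out = np.zeros(len(X_new))
          out[nonzero] = reg.predict(X_new[nonzero])
          return out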

  7. An Agent Based Classification Model

    OpenAIRE

    Gu, Feng; Aickelin, Uwe; Greensmith, Julie

    2009-01-01

    The major function of this model is to access the UCI Wisconsin Breast Cancer data-set [1] and classify the data items into two categories, which are normal and anomalous. This kind of classification can be referred to as anomaly detection, which discriminates anomalous behaviour from normal behaviour in computer systems. One popular solution for anomaly detection is Artificial Immune Systems (AIS). AIS are adaptive systems inspired by theoretical immunology and observed immune functions, p...

  8. Model Based Control of Solidification

    OpenAIRE

    Furenes, Beathe

    2009-01-01

    The objective of this thesis is to develop models for use in the control of a solidification process. Solidification is the phase change from liquid to solid, and takes place in many important processes ranging from production engineering to solid-state physics. Often during solidification, undesired effects like e.g. variation of composition, microstructure, etc. occur. The solidification structure and its associated defects often persist throughout the subsequent operations, and thus good co...

  9. Model-based internal wave processing

    Energy Technology Data Exchange (ETDEWEB)

    Candy, J.V.; Chambers, D.H.

    1995-06-09

    A model-based approach is proposed to solve the oceanic internal wave signal processing problem that is based on state-space representations of the normal-mode vertical velocity and plane wave horizontal velocity propagation models. It is shown that these representations can be utilized to spatially propagate the modal (depth) vertical velocity functions given the basic parameters (wave numbers, Brunt-Vaisala frequency profile etc.) developed from the solution of the associated boundary value problem as well as the horizontal velocity components. Based on this framework, investigations are made of model-based solutions to the signal enhancement problem for internal waves.

  10. PV panel model based on datasheet values

    DEFF Research Database (Denmark)

    Sera, Dezso; Teodorescu, Remus; Rodriguez, Pedro

    2007-01-01

    This work presents the construction of a model for a PV panel using the single-diode five-parameter model, based exclusively on data-sheet parameters. The model takes into account the series and parallel (shunt) resistance of the panel. The equivalent circuit and the basic equations of the PV cell...
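
    The single-diode five-parameter model reduces to one implicit equation per operating point, I = Iph - I0*(exp((V + I*Rs)/(n*Vt)) - 1) - (V + I*Rs)/Rsh, which can be solved numerically for the current I at a given voltage V. The sketch below does this with illustrative parameter values; the paper's actual datasheet-extraction procedure is not reproduced.

      import numpy as np
      from scipy.optimize import brentq

      # Five parameters (illustrative values, not from a real datasheet):
      Iph, I0, Rs, Rsh, n = 5.0, 1e-9, 0.2, 300.0, 1.3
      Vt = 0.025852 * 60           # thermal voltage times cell count, ~25 degC (assumed)

      def current(V):
          # Solve the implicit I-V equation for the panel current at voltage V.
          f = lambda I: (Iph - I0 * (np.exp((V + I * Rs) / (n * Vt)) - 1)
                         - (V + I * Rs) / Rsh - I)
          return brentq(f, 0.0, Iph + 1.0)

      print(current(30.0))         # panel current at 30 V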

  11. Modelling a Peroxidase-based Optical Biosensor

    OpenAIRE

    Juozas Kulys; Evelina Gaidamauskaitė; Romas Baronas

    2007-01-01

    The response of a peroxidase-based optical biosensor was modelled digitally. A mathematical model of the optical biosensor is based on a system of non-linear reaction-diffusion equations. The modelling biosensor comprises two compartments, an enzyme layer and an outer diffusion layer. The digital simulation was carried out using finite difference technique. The influence of the substrate concentration as well as of the thickness of both the enzyme and diffusion layers on the biosensor respons...

  12. IP Network Management Model Based on NGOSS

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jin-yu; LI Hong-hui; LIU Feng

    2004-01-01

    This paper addresses a management model for IP networks based on the Next Generation Operation Support System (NGOSS). It bases network management on all the operational actions of the ISP and provides QoS to user services by managing end-to-end Service Level Agreements (SLA) along the whole path. Based on web and coordination technology, this paper gives an implementation architecture for this model.

  13. Model Based Control of Reefer Container Systems

    DEFF Research Database (Denmark)

    Sørensen, Kresten Kjær

    This thesis is concerned with the development of model based control for the Star Cool refrigerated container (reefer) with the objective of reducing energy consumption. This project has been carried out under the Danish Industrial PhD programme and has been financed by Lodam together...... with the Danish Ministry of Science, Technology and Innovation. The main contributions in this thesis are on the subjects of modeling, simulation and control of a reefer and experimental model validation. A modular nonlinear simulation model is developed using a control oriented approach, resulting in a model...... that matches the states that are important for control very well, both statically and dynamically, and different options for faster simulation of the model are investigated. Based on the model, a model predictive controller is developed and shown to reduce energy consumption by up to 21.9%....

  14. Model-based Abstraction of Data Provenance

    DEFF Research Database (Denmark)

    Probst, Christian W.; Hansen, René Rydhof

    2014-01-01

    to bigger models, and the analyses adapt accordingly. Our approach extends provenance both with the origin of data, the actors and processes involved in the handling of data, and policies applied while doing so. The model and corresponding analyses are based on a formal model of spatial and organisational......Identifying provenance of data provides insights to the origin of data and intermediate results, and has recently gained increased interest due to data-centric applications. In this work we extend a data-centric system view with actors handling the data and policies restricting actions....... This extension is based on provenance analysis performed on system models. System models have been introduced to model and analyse spatial and organisational aspects of organisations, to identify, e.g., potential insider threats. Both the models and analyses are naturally modular; models can be combined...

  15. Agent-based modeling and network dynamics

    CERN Document Server

    Namatame, Akira

    2016-01-01

    The book integrates agent-based modeling and network science. It is divided into three parts, namely, foundations, primary dynamics on and of social networks, and applications. The book begins with the network origin of agent-based models, known as cellular automata, and introduces a number of classic models, such as Schelling's segregation model and Axelrod's spatial game. The essence of the foundation part is the network-based agent-based models in which agents follow network-based decision rules. Under the influence of the substantial progress in network science in the late 1990s, these models have been extended from using lattices to using small-world networks, scale-free networks, etc. The book also shows that modern network science, mainly driven by game-theorists and sociophysicists, has inspired agent-based social scientists to develop alternative formation algorithms, known as agent-based social networks. The book reviews a number of pioneering and representative models in this family. Upon the gi...

  16. Simulation-based Manufacturing System Modeling

    Institute of Scientific and Technical Information of China (English)

    卫东; 金烨; 范秀敏; 严隽琪

    2003-01-01

    In recent years, computer simulation has appeared to be a very advantageous technique for researching resource-constrained manufacturing systems. This paper presents an object-oriented simulation modeling method, which combines the merits of traditional methods such as IDEF0 and Petri nets. In this paper, a four-layer-one-angle hierarchical modeling framework based on OOP is defined, and the modeling description of these layers is expounded, such as hybrid production control modeling and human resource dispatch modeling. To validate the modeling method, a case study of an auto-product line in a motor manufacturing company has been carried out.

  17. Model-Based Clustering of Large Networks

    CERN Document Server

    Vu, Duy Quang; Schweinberger, Michael

    2012-01-01

    We describe a network clustering framework, based on finite mixture models, that can be applied to discrete-valued networks with hundreds of thousands of nodes and billions of edge variables. Relative to other recent model-based clustering work for networks, we introduce a more flexible modeling framework, improve the variational-approximation estimation algorithm, discuss and implement standard error estimation via a parametric bootstrap approach, and apply these methods to much larger datasets than those seen elsewhere in the literature. The more flexible modeling framework is achieved through introducing novel parameterizations of the model, giving varying degrees of parsimony, using exponential family models whose structure may be exploited in various theoretical and algorithmic ways. The algorithms, which we show how to adapt to the more complicated optimization requirements introduced by the constraints imposed by the novel parameterizations we propose, are based on variational generalized EM algorithms...

  18. Software Testing Method Based on Model Comparison

    Institute of Scientific and Technical Information of China (English)

    XIE Xiao-dong; LU Yan-sheng; MAO Cheng-yin

    2008-01-01

    A model comparison based software testing method (MCST) is proposed. In this method, the requirements and programs of the software under test are transformed into the same form, described by the same model description language (MDL). Then, the requirements are transformed into a specification model and the programs into an implementation model. Thus, the elements and structures of the two models are compared, and the differences between them are obtained. Based on the differences, a test suite is generated. Different MDLs can be chosen for the software under test. The usage of two classical MDLs in MCST, the equivalence classes model and the extended finite state machine (EFSM) model, is described with example applications. The results show that the test suites generated by MCST are more efficient and smaller than those of some other testing methods, such as the path-coverage testing method, the object state diagram testing method, etc.

  19. Model-based reasoning and large-knowledge bases

    International Nuclear Information System (INIS)

    In such engineering fields as nuclear power plant engineering, technical information expressed in the form of schematics is frequently used. A new paradigm for model-based reasoning (MBR) and an AI tool called PLEXSYS (plant expert system) using this paradigm have been developed. PLEXSYS and the underlying paradigm are specifically designed to handle schematic drawings, by expressing drawings as models and supporting various sophisticated searches on these models. Two application systems have been constructed with PLEXSYS: one generates PLEXSYS models from existing CAD data files, and the other provides functions for nuclear power plant design support. Since the models can be generated from existing data resources, the design support system automatically has full access to a large-scale model or knowledge base representing actual nuclear power plants. (author)

  20. Sketch-based Interfaces and Modeling

    CERN Document Server

    Jorge, Joaquim

    2011-01-01

    The field of sketch-based interfaces and modeling (SBIM) is concerned with developing methods and techniques to enable users to interact with a computer through sketching - a simple, yet highly expressive medium. SBIM blends concepts from computer graphics, human-computer interaction, artificial intelligence, and machine learning. Recent improvements in hardware, coupled with new machine learning techniques for more accurate recognition, and more robust depth inferencing techniques for sketch-based modeling, have resulted in an explosion of both sketch-based interfaces and pen-based computing

  1. Model-based clustered-dot screening

    Science.gov (United States)

    Kim, Sang Ho

    2006-01-01

    I propose a halftone screen design method based on a human visual system model and the characteristics of the electro-photographic (EP) printer engine. Generally, screen design methods based on human visual models produce dispersed-dot type screens while design methods considering EP printer characteristics generate clustered-dot type screens. In this paper, I propose a cost function balancing the conflicting characteristics of the human visual system and the printer. By minimizing the obtained cost function, I design a model-based clustered-dot screen using a modified direct binary search algorithm. Experimental results demonstrate the superior quality of the model-based clustered-dot screen compared to a conventional clustered-dot screen.

  2. Multiscale agent-based consumer market modeling.

    Energy Technology Data Exchange (ETDEWEB)

    North, M. J.; Macal, C. M.; St. Aubin, J.; Thimmapuram, P.; Bragen, M.; Hahn, J.; Karr, J.; Brigham, N.; Lacy, M. E.; Hampton, D.; Decision and Information Sciences; Procter & Gamble Co.

    2010-05-01

    Consumer markets have been studied in great depth, and many techniques have been used to represent them. These have included regression-based models, logit models, and theoretical market-level models, such as the NBD-Dirichlet approach. Although many important contributions and insights have resulted from studies that relied on these models, there is still a need for a model that could more holistically represent the interdependencies of the decisions made by consumers, retailers, and manufacturers. When the need is for a model that could be used repeatedly over time to support decisions in an industrial setting, it is particularly critical. Although some existing methods can, in principle, represent such complex interdependencies, their capabilities might be outstripped if they had to be used for industrial applications, because of the details this type of modeling requires. However, a complementary method - agent-based modeling - shows promise for addressing these issues. Agent-based models use business-driven rules for individuals (e.g., individual consumer rules for buying items, individual retailer rules for stocking items, or individual firm rules for advertizing items) to determine holistic, system-level outcomes (e.g., to determine if brand X's market share is increasing). We applied agent-based modeling to develop a multi-scale consumer market model. We then conducted calibration, verification, and validation tests of this model. The model was successfully applied by Procter & Gamble to several challenging business problems. In these situations, it directly influenced managerial decision making and produced substantial cost savings.

  3. Bayesian Network Based XP Process Modelling

    Directory of Open Access Journals (Sweden)

    Mohamed Abouelela

    2010-07-01

    Full Text Available A Bayesian Network based mathematical model has been used for modelling the Extreme Programming software development process. The model is capable of predicting the expected finish time and the expected defect rate for each XP release. Therefore, it can be used to determine the success/failure of any XP project. The model takes into account the effect of three XP practices, namely: Pair Programming, Test Driven Development and Onsite Customer practices. The model's predictions were validated against two case studies. Results show the precision of our model, especially in predicting the project finish time.
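
    The flavour of such a network can be conveyed with a toy computation: the probability of an on-time release conditioned on two of the practices, marginalizing over whether they are adopted. The structure and every number below are invented for illustration; the paper's network and probability tables are not reproduced.

      # P(on_time | pair programming, TDD): a hypothetical conditional probability table.
      cpt_on_time = {
          (True, True): 0.85, (True, False): 0.70,
          (False, True): 0.65, (False, False): 0.45,
      }
      p_pair, p_tdd = 0.6, 0.5   # assumed prior adoption probabilities

      # Marginalize over the (independent) parent practices.
      p_on_time = sum(
          cpt_on_time[(pair, tdd)]
          * (p_pair if pair else 1 - p_pair)
          * (p_tdd if tdd else 1 - p_tdd)
          for pair in (True, False) for tdd in (True, False)
      )
      print(round(p_on_time, 3))   # 0.685 with the numbers above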

  4. Workflow-Based Dynamic Enterprise Modeling

    Institute of Scientific and Technical Information of China (English)

    黄双喜; 范玉顺; 罗海滨; 林慧萍

    2002-01-01

    Traditional systems for enterprise modeling and business process control are often static and cannot adapt to the changing environment. This paper presents a workflow-based method to dynamically execute the enterprise model. This method gives an explicit representation of the business process logic and the relationships between the elements involved in the process. An execution-oriented integrated enterprise modeling system is proposed in combination with other enterprise views. The enterprise model can be established and executed dynamically in the actual environment due to the dynamic properties of the workflow model.

  5. Model Based Testing for Agent Systems

    Science.gov (United States)

    Zhang, Zhiyong; Thangarajah, John; Padgham, Lin

    Although agent technology is gaining world wide popularity, a hindrance to its uptake is the lack of proper testing mechanisms for agent based systems. While many traditional software testing methods can be generalized to agent systems, there are many aspects that are different and which require an understanding of the underlying agent paradigm. In this paper we present certain aspects of a testing framework that we have developed for agent based systems. The testing framework is a model based approach using the design models of the Prometheus agent development methodology. In this paper we focus on model based unit testing and identify the appropriate units, present mechanisms for generating suitable test cases and for determining the order in which the units are to be tested, present a brief overview of the unit testing process and an example. Although we use the design artefacts from Prometheus the approach is suitable for any plan and event based agent system.

  6. A probabilistic graphical model based stochastic input model construction

    International Nuclear Information System (INIS)

    Model reduction techniques have been widely used in modeling of high-dimensional stochastic input in uncertainty quantification tasks. However, the probabilistic modeling of random variables projected into reduced-order spaces presents a number of computational challenges. Due to the curse of dimensionality, the underlying dependence relationships between these random variables are difficult to capture. In this work, a probabilistic graphical model based approach is employed to learn the dependence by running a number of conditional independence tests using observation data. Thus a probabilistic model of the joint PDF is obtained and the PDF is factorized into a set of conditional distributions based on the dependence structure of the variables. The estimation of the joint PDF from data is then transformed to estimating conditional distributions under reduced dimensions. To improve the computational efficiency, a polynomial chaos expansion is further applied to represent the random field in terms of a set of standard random variables. This technique is combined with both linear and nonlinear model reduction methods. Numerical examples are presented to demonstrate the accuracy and efficiency of the probabilistic graphical model based stochastic input models. - Highlights: • Data-driven stochastic input models without the assumption of independence of the reduced random variables. • The problem is transformed to a Bayesian network structure learning problem. • Examples are given in flows in random media

  7. Grey-theory based intrusion detection model

    Institute of Scientific and Technical Information of China (English)

    Qin Boping; Zhou Xianwei; Yang Jun; Song Cunyi

    2006-01-01

    To solve the problem that current intrusion detection models need large-scale data for model formulation in real-time use, an intrusion detection system model based on grey theory (GTIDS) is presented. Grey theory has the merits of fewer requirements on the original data scale, less limitation on the distribution pattern and a simpler modeling algorithm. With these merits GTIDS constructs its model from partial time sequences for rapid detection of intrusive acts in a secure system. In this detection model the rates of false drops and false retrievals are effectively reduced through twice modeling and repeated detection of the target data. Furthermore, the GTIDS framework and the specific modeling algorithm are presented. The effectiveness of GTIDS is proved through emulated experiments comparing it with Snort and the next-generation intrusion detection expert system (NIDES) of SRI International.
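
    The "fewer requirements on the original data scale" merit is what the standard GM(1,1) grey model provides: a forecast built from a handful of observations. A minimal sketch follows, with illustrative data in place of real traffic measurements.

      import numpy as np

      x0 = np.array([52.0, 55.1, 57.4, 60.3, 63.2])   # short raw sequence
      x1 = np.cumsum(x0)                               # accumulated generating operation
      z1 = 0.5 * (x1[1:] + x1[:-1])                    # means of consecutive AGO terms

      # Least-squares fit of the whitened equation dx1/dt + a*x1 = b.
      B = np.column_stack([-z1, np.ones(len(z1))])
      a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]

      def forecast(k):
          # Predicted AGO value at step k, differenced back to the original scale.
          x1_hat = lambda j: (x0[0] - b / a) * np.exp(-a * j) + b / a
          return x1_hat(k) - x1_hat(k - 1)

      print(forecast(5))   # one-step-ahead forecast beyond the observed sequence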

  8. Econophysics of agent-based models

    CERN Document Server

    Aoyama, Hideaki; Chakrabarti, Bikas; Chakraborti, Anirban; Ghosh, Asim

    2014-01-01

    The primary goal of this book is to present the research findings and conclusions of physicists, economists, mathematicians and financial engineers working in the field of "Econophysics" who have undertaken agent-based modelling, comparison with empirical studies and related investigations. Most standard economic models assume the existence of the representative agent, who is “perfectly rational” and applies the utility maximization principle when taking action. One reason for this is the desire to keep models mathematically tractable: no tools are available to economists for solving non-linear models of heterogeneous adaptive agents without explicit optimization. In contrast, multi-agent models, which originated from statistical physics considerations, allow us to go beyond the prototype theories of traditional economics involving the representative agent. This book is based on the Econophys-Kolkata VII Workshop, at which many such modelling efforts were presented. In the book, leading researchers in the...

  9. Approximation Algorithms for Model-Based Diagnosis

    NARCIS (Netherlands)

    Feldman, A.B.

    2010-01-01

    Model-based diagnosis is an area of abductive inference that uses a system model, together with observations about system behavior, to isolate sets of faulty components (diagnoses) that explain the observed behavior, according to some minimality criterion. This thesis presents greedy approximation a

  10. Evaluating face trustworthiness: a model based approach

    OpenAIRE

    Todorov, Alexander; Baron, Sean G.; Oosterhof, Nikolaas N.

    2008-01-01

    Judgments of trustworthiness from faces determine basic approach/avoidance responses and approximate the valence evaluation of faces that runs across multiple person judgments. Here, based on trustworthiness judgments and using a computer model for face representation, we built a model for representing face trustworthiness (study 1). Using this model, we generated novel faces with an increased range of trustworthiness and used these faces as stimuli in a functional Magnetic Resonance Imaging ...

  11. Agent-Based Modeling in Systems Pharmacology.

    Science.gov (United States)

    Cosgrove, J; Butler, J; Alden, K; Read, M; Kumar, V; Cucurull-Sanchez, L; Timmis, J; Coles, M

    2015-11-01

    Modeling and simulation (M&S) techniques provide a platform for knowledge integration and hypothesis testing to gain insights into biological systems that would not be possible a priori. Agent-based modeling (ABM) is an M&S technique that focuses on describing individual components rather than homogenous populations. This tutorial introduces ABM to systems pharmacologists, using relevant case studies to highlight how ABM-specific strengths have yielded success in the area of preclinical mechanistic modeling. PMID:26783498

  12. Agent-based Models of Financial Markets

    OpenAIRE

    Samanidou, E.; E. Zschischang; Stauffer, D.; Lux, T.

    2007-01-01

    This review deals with several microscopic (``agent-based'') models of financial markets which have been studied by economists and physicists over the last decade: Kim-Markowitz, Levy-Levy-Solomon, Cont-Bouchaud, Solomon-Weisbuch, Lux-Marchesi, Donangelo-Sneppen and Solomon-Levy-Huang. After an overview of simulation approaches in financial economics, we first give a summary of the Donangelo-Sneppen model of monetary exchange and compare it with related models in economics literature. Our sel...

  13. PARTICIPATION BASED MODEL OF SHIP CREW MANAGEMENT

    OpenAIRE

    Bielić, Toni; Ivanišević, Dalibor; Gundić, Ana

    2014-01-01

    This paper analyses the participation-based model on board ship as a possibly optimal leadership model in the shipping industry, with an accent on the decision-making process. In the paper the authors have tried to define the master's behaviour model and management style, identifying the drawbacks and disadvantages of a vertical, pyramidal organization with the master at the top. The paper describes the efficiency of decision making within a team organization and the optimization of a ship's organisation by introduci...

  14. Model-based Abstraction of Data Provenance

    OpenAIRE

    Probst, Christian W.; Hansen, René Rydhof

    2014-01-01

    Identifying provenance of data provides insights to the origin of data and intermediate results, and has recently gained increased interest due to data-centric applications. In this work we extend a data-centric system view with actors handling the data and policies restricting actions. This extension is based on provenance analysis performed on system models. System models have been introduced to model and analyse spatial and organisational aspects of organisations, to identify, e.g., potent...

  15. Information modelling and knowledge bases XXV

    CERN Document Server

    Tokuda, T; Jaakkola, H

    2014-01-01

    Because of our ever increasing use of and reliance on technology and information systems, information modelling and knowledge bases continue to be important topics in those academic communities concerned with data handling and computer science. As the information itself becomes more complex, so do the levels of abstraction and the databases themselves. This book is part of the series Information Modelling and Knowledge Bases, which concentrates on a variety of themes in the important domains of conceptual modeling, design and specification of information systems, multimedia information modelin

  16. Modelling a Peroxidase-based Optical Biosensor

    Science.gov (United States)

    Baronas, Romas; Gaidamauskaite, Evelina; Kulys, Juozas

    2007-01-01

    The response of a peroxidase-based optical biosensor was modelled digitally. A mathematical model of the optical biosensor is based on a system of non-linear reaction-diffusion equations. The modelling biosensor comprises two compartments, an enzyme layer and an outer diffusion layer. The digital simulation was carried out using finite difference technique. The influence of the substrate concentration as well as of the thickness of both the enzyme and diffusion layers on the biosensor response was investigated. Calculations showed complex kinetics of the biosensor response, especially at low concentrations of the peroxidase and of the hydrogen peroxide.
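
    The numerical core of such a model is a finite-difference sweep over the layered geometry. The sketch below reduces it to a single enzyme layer in one dimension with Michaelis-Menten consumption; all parameter values are illustrative and the paper's two-compartment setup is not reproduced.

      import numpy as np

      D, Vmax, Km = 3e-10, 1e-4, 1e-3     # m^2/s, mol/(m^3 s), mol/m^3 (assumed)
      L, N, dt, steps = 1e-4, 50, 1e-4, 50000
      dx = L / N
      s = np.zeros(N + 1)                  # substrate concentration profile

      for _ in range(steps):               # explicit scheme; dt < dx**2/(2*D) holds here
          s_new = s.copy()
          # Interior nodes: diffusion plus Michaelis-Menten consumption.
          s_new[1:-1] = s[1:-1] + dt * (
              D * (s[2:] - 2 * s[1:-1] + s[:-2]) / dx**2
              - Vmax * s[1:-1] / (Km + s[1:-1])
          )
          s_new[0] = s_new[1]              # zero flux at the inner boundary
          s_new[-1] = 1e-2                 # bulk substrate at the outer boundary
          s = s_new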

  17. Néron Models and Base Change

    DEFF Research Database (Denmark)

    Halle, Lars Halvard; Nicaise, Johannes

    Presenting the first systematic treatment of the behavior of Néron models under ramified base change, this book can be read as an introduction to various subtle invariants and constructions related to Néron models of semi-abelian varieties, motivated by concrete research problems and complemented...... on Néron component groups, Edixhoven’s filtration and the base change conductor of Chai and Yu, and we study these invariants using various techniques such as models of curves, sheaves on Grothendieck sites and non-archimedean uniformization. We then apply our results to the study of motivic zeta functions...

  18. Neighborhood Mixture Model for Knowledge Base Completion

    OpenAIRE

    Nguyen, Dat Quoc; Sirts, Kairit; Qu, Lizhen; Johnson, Mark

    2016-01-01

    Knowledge bases are useful resources for many natural language processing tasks, however, they are far from complete. In this paper, we define a novel entity representation as a mixture of its neighborhood in the knowledge base and apply this technique on TransE-a well-known embedding model for knowledge base completion. Experimental results show that the neighborhood information significantly helps to improve the results of the TransE, leading to better performance than obtained by other sta...

  19. Multiagent-Based Model For ESCM

    OpenAIRE

    Delia MARINCAS

    2011-01-01

    Web based applications for Supply Chain Management (SCM) are now a necessity for every company in order to meet increasing customer demands, to face global competition and to make a profit. A multiagent-based approach is appropriate for eSCM because it shows many of the characteristics a SCM system should have. For this reason, we have proposed a multiagent-based eSCM model which configures a virtual SC and automates the SC activities: selling, purchasing, manufacturing, planning, inventory,...

  20. Energy based prediction models for building acoustics

    DEFF Research Database (Denmark)

    Brunskog, Jonas

    2012-01-01

    In order to reach robust and simplified yet accurate prediction models, energy based principles are commonly used in many fields of acoustics, especially in building acoustics. This includes simple energy flow models, the framework of statistical energy analysis (SEA) as well as more elaborated...... principles as, e.g., wave intensity analysis (WIA). The European standards for building acoustic predictions, the EN 12354 series, are based on energy flow and SEA principles. In the present paper, different energy based prediction models are discussed and critically reviewed. Special attention is placed...... on underlying basic assumptions, such as diffuse fields, high modal overlap, resonant field being dominant, etc., and the consequences of these in terms of limitations in the theory and in the practical use of the models....

  1. Flower solid modeling based on sketches

    Institute of Scientific and Technical Information of China (English)

    Zhan DING; Shu-chang XU; Xiu-zi YE; Yin ZHANG; San-yuan ZHANG

    2008-01-01

    In this paper we propose a method to model flowers of solid shape. Based on the method of Ijiri et al. (2005), we separate individual flower modeling and inflorescence modeling into structure and geometry modeling. We incorporate interactive editing gestures to allow the user to edit structure parameters freely on the structure diagram. Furthermore, we use free-hand sketching techniques to allow users to create and edit 3D geometrical elements freely and easily. The final step is to automatically merge all independent 3D geometrical elements into a single waterproof mesh. Our experiments show that this solid modeling approach is promising. Using our approach, novice users can create vivid flower models easily and freely. The generated flower model is waterproof. It can have applications in visualization, animation, gaming, and toys and decorations if printed out on 3D rapid prototyping devices.

  2. Modelling Carbon Nanotubes-Based Mediatorless Biosensor

    Directory of Open Access Journals (Sweden)

    Julija Razumiene

    2012-07-01

    Full Text Available This paper presents a mathematical model of carbon nanotubes-based mediatorless biosensor. The developed model is based on nonlinear non-stationary reaction-diffusion equations. The model involves four layers (compartments: a layer of enzyme solution entrapped on a terylene membrane, a layer of the single walled carbon nanotubes deposited on a perforated membrane, and an outer diffusion layer. The biosensor response and sensitivity are investigated by changing the model parameters with a special emphasis on the mediatorless transfer of the electrons in the layer of the enzyme-loaded carbon nanotubes. The numerical simulation at transient and steady state conditions was carried out using the finite difference technique. The mathematical model and the numerical solution were validated by experimental data. The obtained agreement between the simulation results and the experimental data was admissible at different concentrations of the substrate.

  3. Modelling carbon nanotubes-based mediatorless biosensor.

    Science.gov (United States)

    Baronas, Romas; Kulys, Juozas; Petrauskas, Karolis; Razumiene, Julija

    2012-01-01

    This paper presents a mathematical model of carbon nanotubes-based mediatorless biosensor. The developed model is based on nonlinear non-stationary reaction-diffusion equations. The model involves four layers (compartments): a layer of enzyme solution entrapped on a terylene membrane, a layer of the single walled carbon nanotubes deposited on a perforated membrane, and an outer diffusion layer. The biosensor response and sensitivity are investigated by changing the model parameters with a special emphasis on the mediatorless transfer of the electrons in the layer of the enzyme-loaded carbon nanotubes. The numerical simulation at transient and steady state conditions was carried out using the finite difference technique. The mathematical model and the numerical solution were validated by experimental data. The obtained agreement between the simulation results and the experimental data was admissible at different concentrations of the substrate. PMID:23012537

  4. Spatial disaggregation of complex soil map units at regional scale based on soil-landscape relationships

    Science.gov (United States)

    Vincent, Sébastien; Lemercier, Blandine; Berthier, Lionel; Walter, Christian

    2015-04-01

    Accurate soil information over large extents is essential for managing agronomical and environmental issues. Where it exists, information on soil is often sparse or available at a coarser resolution than required. Typically, the spatial distribution of soil at regional scale is represented as a set of polygons defining soil map units (SMU), each one describing several soil types that are not spatially delineated, and a semantic database describing these objects. Delineation of soil types within SMU, i.e. spatial disaggregation of SMU, improves the accuracy of soil information using legacy data. The aim of this study was to predict soil types by spatial disaggregation of SMU through a decision tree approach, considering expert knowledge on soil-landscape relationships embedded in soil databases. The DSMART (Disaggregation and Harmonization of Soil Map Units Through resampled Classification Trees) algorithm developed by Odgers et al. (2014) was used. It requires soil information, environmental covariates, and calibration samples to build and then extrapolate decision trees. To assign a soil type to a particular spatial position, a weighted random allocation approach is applied: each soil type in the SMU is weighted according to its assumed proportion of occurrence in the SMU. Thus soil-landscape relationships are not considered in the current version of DSMART. Expert rules on soil distribution considering the relief, parent material and wetland location were proposed to drive the allocation of soil types to sampled positions, in order to integrate the soil-landscape relationships. Semantic information about the spatial organization of soil types within SMU and exhaustive landscape descriptors were used. In the eastern part of Brittany (NW France), 171 soil types were described; their relative areas in the SMU were estimated, and geomorphological and geological contexts were recorded. The model predicted 144 soil types. An external validation was performed by comparing predicted
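
    The baseline allocation step that the expert rules extend can be sketched in a few lines: each sampled position in a soil map unit draws a soil type with probability equal to that type's assumed share of the unit. Unit names and shares below are invented.

      import numpy as np

      rng = np.random.default_rng(3)
      smu_composition = {"SMU_17": {"Luvisol": 0.55, "Cambisol": 0.30, "Gleysol": 0.15}}

      def allocate(smu, n_samples):
          # Weighted random allocation of soil types to sampled positions.
          types = list(smu_composition[smu])
          p = np.array([smu_composition[smu][t] for t in types])
          return rng.choice(types, size=n_samples, p=p / p.sum())

      print(allocate("SMU_17", 10))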

  5. Deformable surface modeling based on dual subdivision

    Institute of Scientific and Technical Information of China (English)

    WANG Huawei; SUN Hanqiu; QIN Kaihuai

    2005-01-01

    Based on dual Doo-Sabin subdivision and the corresponding parameterization, a modeling technique of deformable surfaces is presented in this paper. In the proposed model, all the dynamic parameters are computed in a unified way for both non-defective and defective subdivision matrices, and central differences are used to discretize the Lagrangian dynamics equation instead of backward differences. Moreover, a local scheme is developed to solve the dynamics equation approximately, thus the order of the linear equation is reduced greatly. Therefore, the proposed model is more efficient and faster than the existing dynamic models. It can be used for deformable surface design, interactive surface editing, medical imaging and simulation.

  6. Rule-based transformations for geometric modelling

    CERN Document Server

    Bellet, Thomas; Gall, Pascale Le; 10.4204/EPTCS.48.5

    2011-01-01

    The context of this paper is the use of formal methods for topology-based geometric modelling. Topology-based geometric modelling deals with objects of various dimensions and shapes. Usually, objects are defined by a graph-based topological data structure and by an embedding that associates each topological element (vertex, edge, face, etc.) with relevant data as their geometric shape (position, curve, surface, etc.) or application dedicated data (e.g. molecule concentration level in a biological context). We propose to define topology-based geometric objects as labelled graphs. The arc labelling defines the topological structure of the object whose topological consistency is then ensured by labelling constraints. Nodes have as many labels as there are different data kinds in the embedding. Labelling constraints ensure then that the embedding is consistent with the topological structure. Thus, topology-based geometric objects constitute a particular subclass of a category of labelled graphs in which nodes hav...

  7. Rule-based transformations for geometric modelling

    Directory of Open Access Journals (Sweden)

    Thomas Bellet

    2011-02-01

    Full Text Available The context of this paper is the use of formal methods for topology-based geometric modelling. Topology-based geometric modelling deals with objects of various dimensions and shapes. Usually, objects are defined by a graph-based topological data structure and by an embedding that associates each topological element (vertex, edge, face, etc.) with relevant data such as its geometric shape (position, curve, surface, etc.) or application dedicated data (e.g. molecule concentration level in a biological context). We propose to define topology-based geometric objects as labelled graphs. The arc labelling defines the topological structure of the object, whose topological consistency is then ensured by labelling constraints. Nodes have as many labels as there are different data kinds in the embedding. Labelling constraints then ensure that the embedding is consistent with the topological structure. Thus, topology-based geometric objects constitute a particular subclass of a category of labelled graphs in which nodes have multiple labels.

  8. A model evaluation checklist for process-based environmental models

    Science.gov (United States)

    Jackson-Blake, Leah

    2015-04-01

    the conceptual model on which it is based. In this study, a number of model structural shortcomings were identified, such as a lack of dissolved phosphorus transport via infiltration excess overland flow, potential discrepancies in the particulate phosphorus simulation and a lack of spatial granularity. (4) Conceptual challenges, as conceptual models on which predictive models are built are often outdated, having not kept up with new insights from monitoring and experiments. For example, soil solution dissolved phosphorus concentration in INCA-P is determined by the Freundlich adsorption isotherm, which could potentially be replaced using more recently-developed adsorption models that take additional soil properties into account. This checklist could be used to assist in identifying why model performance may be poor or unreliable. By providing a model evaluation framework, it could help prioritise which areas should be targeted to improve model performance or model credibility, whether that be through using alternative calibration techniques and statistics, improved data collection, improving or simplifying the model structure or updating the model to better represent current understanding of catchment processes.

  9. A stepwise-cluster microbial biomass inference model in food waste composting

    International Nuclear Information System (INIS)

    A stepwise-cluster microbial biomass inference (SMI) model was developed by introducing stepwise-cluster analysis (SCA) into composting process modeling to tackle the nonlinear relationships among state variables and microbial activities. The essence of SCA is to form a classification tree based on a series of cutting or mergence processes according to given statistical criteria. Eight runs of designed experiments in bench-scale reactors in a laboratory were constructed to demonstrate the feasibility of the proposed method. The results indicated that SMI could help establish a statistical relationship between state variables and composting microbial characteristics, where discrete and nonlinear complexities exist. Significance levels of cutting/merging were provided such that the accuracies of the developed forecasting trees were controllable. Through an attempted definition of input effects on the output in SMI, the effects of the state variables on thermophilic bacteria were ranked in descending order as: Time (day) > moisture content (%) > ash content (%, dry) > Lower Temperature (deg. C) > pH > NH4+-N (mg/Kg, dry) > Total N (%, dry) > Total C (%, dry); the effects on mesophilic bacteria were ordered as: Time > Upper Temperature (deg. C) > Total N > moisture content > NH4+-N > Total C > pH. This study made the first attempt at applying SCA to map the nonlinear and discrete relationships in composting processes.

  10. MEGen: A Physiologically Based Pharmacokinetic Model Generator

    Directory of Open Access Journals (Sweden)

    George D Loizou

    2011-11-01

    Full Text Available Physiologically based pharmacokinetic models are being used in an increasing number of different areas. These not only include the human safety assessment of pharmaceuticals, pesticides, biocides and environmental chemicals but also for food animal, wild mammal and avian risk assessment. The value of PBPK models is that they are tools for estimating tissue dosimetry by integrating in vitro and in vivo mechanistic, pharmacokinetic and toxicological information through their explicit mathematical description of important anatomical, physiological and biochemical determinants of chemical uptake, disposition and elimination. However, PBPK models are perceived as complex, data hungry, resource intensive and time consuming. In addition, model validation and verification are hindered by the relative complexity of the equations. To begin to address these issues a freely available web application for the rapid construction and documentation of bespoke PBPK models is under development. Here we present an overview of the current capabilities of MEGen, a model equation generator and parameter database and discuss future developments.
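
    The kind of equation such a generator emits can be illustrated with a single perfusion-limited compartment, where tissue uptake is set by blood flow and a tissue:blood partition coefficient; a full PBPK model chains many of these. All values below are illustrative, not MEGen output.

      from scipy.integrate import solve_ivp

      Q, V, P = 1.45, 1.8, 4.0     # blood flow (L/h), tissue volume (L), partition coefficient
      c_blood = 0.5                # constant arterial concentration (mg/L), assumed

      def tissue(t, c):
          # dC/dt = (Q/V) * (C_arterial - C_tissue / P)
          return (Q / V) * (c_blood - c[0] / P)

      sol = solve_ivp(tissue, (0.0, 24.0), [0.0], max_step=0.1)
      print(sol.y[0, -1])          # approaches P * c_blood = 2.0 mg/L at steady state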

  11. Development of Ensemble Model Based Water Demand Forecasting Model

    Science.gov (United States)

    Kwon, Hyun-Han; So, Byung-Jin; Kim, Seong-Hyeon; Kim, Byung-Seop

    2014-05-01

    The Smart Water Grid (SWG) concept has emerged globally over the last decade and has gained significant recognition in South Korea. In particular, there has been growing interest in water demand forecasting and optimal pump operation, and this has led to various studies regarding energy saving and improvement of water supply reliability. Existing water demand forecasting models fall into two groups in view of modeling and predicting behavior in time series. One considers embedded patterns such as seasonality, periodicity and trends, and the other is an autoregressive model using short-memory Markovian processes (Emmanuel et al., 2012). The main disadvantage of the abovementioned models is that there is a limit to the predictability of water demand at sub-daily scales because the system is nonlinear. In this regard, this study aims to develop a nonlinear ensemble model for hourly water demand forecasting which allows us to estimate uncertainties across different model classes. The proposed model consists of two parts. One is a multi-model scheme based on the combination of independent prediction models. The other is a cross-validation scheme, namely the bagging approach introduced by Breiman (1996), to derive weighting factors corresponding to the individual models. The individual forecasting models used in this study are linear regression, polynomial regression, multivariate adaptive regression splines (MARS) and support vector machines (SVM). The concepts are demonstrated through application to data observed at water plants at several locations in South Korea. Keywords: water demand, non-linear model, ensemble forecasting model, uncertainty. Acknowledgements: This subject is supported by the Korea Ministry of Environment as "Projects for Developing Eco-Innovation Technologies (GT-11-G-02-001-6)".
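
    A compact sketch of the two-part idea follows: fit two simple hourly-demand models, score them on bootstrap-resampled validation windows (the bagging part), and combine their forecasts with inverse-error weights. The models and data are invented stand-ins for the four regression models named above.

      import numpy as np

      rng = np.random.default_rng(4)
      hours = np.arange(24 * 60)
      demand = 100 + 20 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 3, hours.size)

      def model_a(h):              # daily harmonic fit
          return 100 + 20 * np.sin(2 * np.pi * h / 24)

      def model_b(h):              # persistence: same hour yesterday
          return demand[h - 24]

      # Bagging-style weights: resample validation indices, accumulate squared
      # errors, then weight each model by its inverse error.
      errs = np.zeros(2)
      for _ in range(50):
          idx = rng.choice(np.arange(24, hours.size), size=200)
          errs += [np.mean((demand[idx] - model_a(idx)) ** 2),
                   np.mean((demand[idx] - model_b(idx)) ** 2)]
      w = (1 / errs) / (1 / errs).sum()

      h_new = np.arange(hours.size - 24, hours.size)
      forecast = w[0] * model_a(h_new) + w[1] * model_b(h_new)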

  12. A contextual modeling approach for model-based recommender systems

    OpenAIRE

    Fernández-Tobías, Ignacio; Campos Soto, Pedro G.; Cantador, Iván; Díez, Fernando

    2013-01-01

    The final publication is available at Springer via http://dx.doi.org/10.1007/978-3-642-40643-0_5 Proceedings of 15th Conference of the Spanish Association for Artificial Intelligence, CAEPIA 2013, Madrid, Spain, September 17-20, 2013. In this paper we present a contextual modeling approach for model-based recommender systems that integrates and exploits both user preferences and contextual signals in a common vector space. Differently to previous work, we conduct a user study acquiring ...

  13. Graphical model construction based on evolutionary algorithms

    Institute of Scientific and Technical Information of China (English)

    Youlong YANG; Yan WU; Sanyang LIU

    2006-01-01

    Using Bayesian networks to model promising solutions from the current population of an evolutionary algorithm can ensure an efficient and intelligent search for the optimum. However, constructing a Bayesian network that fits a given dataset is an NP-hard problem that consumes massive computational resources. This paper develops a methodology for constructing a graphical model based on the Bayesian Dirichlet metric. Our approach is derived from a set of propositions and theorems obtained by researching the local metric relationships of networks matching the dataset. This paper presents an algorithm to construct a tree model from a set of potential solutions using the above approach. This method is important not only for evolutionary algorithms based on graphical models, but also for machine learning and data mining. The experimental results show that the exact theoretical results and the approximations match very well.

  14. Spatial interactions in agent-based modeling

    CERN Document Server

    Ausloos, Marcel; Merlone, Ugo

    2014-01-01

    Agent Based Modeling (ABM) has become a widespread approach to modeling complex interactions. In this chapter, after briefly summarizing some features of ABM, the different approaches to modeling spatial interactions are discussed. It is stressed that agents can interact either indirectly through a shared environment and/or directly with each other. In such an approach, higher-order variables such as commodity prices, population dynamics or even institutions are not exogenously specified but instead are seen as the results of interactions. It is highlighted in the chapter that the understanding of patterns emerging from such spatial interaction between agents is a key problem as much as their description through analytical or simulation means. The chapter reviews different approaches for modeling agents' behavior, taking into account either explicit spatial (lattice based) structures or networks. Some emphasis is placed on recent ABM as applied to the description of the dynamics of the geographical distribution o...

  15. Model Based Software Development: Issues & Challenges

    CERN Document Server

    basha, N Md Jubair; Rizwanullah, Mohammed

    2012-01-01

    One of the goals of software design is to model a system in such a way that it is easily understandable. Nowadays the tendency for software development is changing from manual coding to automatic code generation; it is becoming model-based. This is a response to the software crisis, in which the cost of hardware has decreased and conversely the cost of software development has increased sharply. The methodologies that allowed this change are model-based, thus relieving the human from detailed coding. Still there is a long way to achieve this goal, but work is being done worldwide to achieve this objective. This paper presents the drastic changes related to modeling and important challenging issues and techniques that recur in MBSD.

  16. Activity-based resource capability modeling

    Institute of Scientific and Technical Information of China (English)

    CHENG Shao-wu; XU Xiao-fei; WANG Gang; SUN Xue-dong

    2008-01-01

    To analyse and optimize an enterprise process in a wide scope, an activity-based method of modeling resource capabilities is presented. It models resource capabilities by means of the same structure as an activity; that is, resource capabilities are defined by input objects, actions and output objects. A set of activity-based resource capability modeling rules and matching rules between an activity and a resource are introduced. This method can be used to describe not only the capabilities of manufacturing tools, but also the capabilities of persons, applications, etc. It unifies the modeling of the capabilities of all kinds of resources in an enterprise and supports the optimization of the resource allocation of a process.
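    A toy sketch of the matching rule, with invented data structures: a capability, like an activity, is a triple of input objects, actions and output objects, and a resource matches an activity when it covers all three sets:

        from dataclasses import dataclass, field

        @dataclass
        class Capability:
            inputs: set = field(default_factory=set)
            actions: set = field(default_factory=set)
            outputs: set = field(default_factory=set)

        def resource_can_perform(activity: Capability, resource: Capability) -> bool:
            # A resource matches an activity if it accepts the activity's inputs,
            # supports its actions and can produce its outputs.
            return (activity.inputs <= resource.inputs
                    and activity.actions <= resource.actions
                    and activity.outputs <= resource.outputs)

        milling = Capability({"steel blank"}, {"mill"}, {"machined part"})
        machine = Capability({"steel blank", "alloy blank"}, {"mill", "drill"},
                             {"machined part"})
        assert resource_can_perform(milling, machine)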

  17. GIS-Based Hydrogeological-Parameter Modeling

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    A regression model is proposed to relate the variation of water well depth to topographic properties (area and slope), the variation of hydraulic conductivity, and the vertical decay factor. The implementation of this model in a GIS environment (ARC/INFO), based on known water data and a DEM, is used to estimate the variation of hydraulic conductivity and decay factor of different lithology units in a watershed context.

  18. Atom-Role-Based Access Control Model

    Science.gov (United States)

    Cai, Weihong; Huang, Richeng; Hou, Xiaoli; Wei, Gang; Xiao, Shui; Chen, Yindong

    Role-based access control (RBAC) has been widely recognized as an efficient access control model and is currently a hot research topic in information security. However, in large-scale enterprise application environments, the traditional RBAC model based on the role hierarchy has the following deficiencies: firstly, it is unable to reflect the role relationships in complicated cases effectively, which does not accord with practical applications; secondly, the senior role unconditionally inherits all permissions of the junior role, so a user under the supervisor role may accumulate all permissions, which easily causes the abuse of permissions and violates the least privilege principle, one of the main security principles. To deal with these problems, after analyzing permission types and role relationships, we proposed the concept of the atom role and built an atom-role-based access control model, called ATRBAC, by dividing the permission set of each regular role based on inheritance path relationships. Through application-specific analysis, this model can well meet the access control requirements.
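    The following toy sketch illustrates one way atom roles could be derived by partitioning permission sets, with invented role and permission names; the paper's inheritance-path rules are more elaborate:

        # Each distinct permission subset shared by the same group of roles
        # becomes one atom role, so a senior role holds only explicit atoms.
        role_permissions = {
            "clerk":      {"read_record", "create_record"},
            "supervisor": {"read_record", "create_record", "approve_record"},
        }

        def atomize(roles):
            atoms, owner_groups = {}, {}
            for role, perms in roles.items():
                for p in perms:
                    owner_groups.setdefault(p, frozenset(
                        r for r, ps in roles.items() if p in ps))
            for p, owners in owner_groups.items():
                atoms.setdefault(owners, set()).add(p)
            return atoms  # {roles sharing the atom: permissions in the atom}

        for owners, perms in atomize(role_permissions).items():
            print(sorted(owners), "->", sorted(perms))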

  19. Evolutionary modeling-based approach for model errors correction

    Science.gov (United States)

    Wan, S. Q.; He, W. P.; Wang, L.; Jiang, W.; Zhang, W.

    2012-08-01

    The inverse problem of using the information in historical data to estimate model errors is one of the frontier research topics in science. In this study, we investigate such a problem using the classic Lorenz (1963) equation as a prediction model and the Lorenz equation with a periodic evolutionary function as an accurate representation of reality to generate "observational data." On the basis of the intelligent features of evolutionary modeling (EM), including self-organization, self-adaptation and self-learning, the dynamic information contained in the historical data can be identified and extracted automatically by computer. Thereby, a new approach is proposed in the present paper to estimate model errors based on EM. Numerical tests demonstrate the ability of the new approach to correct model structural errors. In fact, it can actualize the combination of statistics and dynamics to a certain extent.
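    A sketch of the twin-experiment setup described above, assuming a simple sinusoidal form for the periodic evolutionary term (the paper's exact forcing may differ); the residual between "reality" and the model is the error signal EM would learn from:

        import numpy as np
        from scipy.integrate import solve_ivp

        SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0  # standard Lorenz-63 values

        def lorenz(t, s):  # prediction model: the classic Lorenz system
            x, y, z = s
            return [SIGMA * (y - x), x * (RHO - z) - y, x * y - BETA * z]

        def lorenz_forced(t, s, eps=2.0, omega=1.0):
            # "Reality": same dynamics plus a periodic evolutionary term
            # (our illustrative choice of a sinusoidal forcing).
            dx, dy, dz = lorenz(t, s)
            return [dx + eps * np.sin(omega * t), dy, dz]

        t_eval = np.linspace(0, 20, 2001)
        truth = solve_ivp(lorenz_forced, (0, 20), [1.0, 1.0, 1.0], t_eval=t_eval)
        model = solve_ivp(lorenz, (0, 20), [1.0, 1.0, 1.0], t_eval=t_eval)
        error = truth.y - model.y  # historical data from which model error is learned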

  20. Evolutionary modeling-based approach for model errors correction

    Directory of Open Access Journals (Sweden)

    S. Q. Wan

    2012-08-01

    Full Text Available The inverse problem of using the information in historical data to estimate model errors is one of the frontier research topics in science. In this study, we investigate such a problem using the classic Lorenz (1963) equation as a prediction model and the Lorenz equation with a periodic evolutionary function as an accurate representation of reality to generate "observational data."

    On the basis of the intelligent features of evolutionary modeling (EM), including self-organization, self-adaptation and self-learning, the dynamic information contained in the historical data can be identified and extracted automatically by computer. Thereby, a new approach is proposed in the present paper to estimate model errors based on EM. Numerical tests demonstrate the ability of the new approach to correct model structural errors. In fact, it can actualize the combination of statistics and dynamics to a certain extent.

  1. Model-based testing for embedded systems

    CERN Document Server

    Zander, Justyna; Mosterman, Pieter J

    2011-01-01

    What the experts have to say about Model-Based Testing for Embedded Systems: "This book is exactly what is needed at the exact right time in this fast-growing area. From its beginnings over 10 years ago of deriving tests from UML statecharts, model-based testing has matured into a topic with both breadth and depth. Testing embedded systems is a natural application of MBT, and this book hits the nail exactly on the head. Numerous topics are presented clearly, thoroughly, and concisely in this cutting-edge book. The authors are world-class leading experts in this area and teach us well-used

  2. A multivalued knowledge-base model

    CERN Document Server

    Achs, Agnes

    2010-01-01

    The basic aim of our study is to give a possible model for handling uncertain information. This model is worked out in the framework of DATALOG. First the concept of fuzzy Datalog is summarized, then its extensions to intuitionistic- and interval-valued fuzzy logic are given, and the concept of bipolar fuzzy Datalog is introduced. Based on these ideas, the concept of a multivalued knowledge-base is defined as a quadruple of background knowledge, a deduction mechanism, a connecting algorithm, and a function set of the program, which helps us determine the uncertainty levels of the results. Finally, a possible evaluation strategy is given.

  3. Physically based modeling and animation of tornado

    Institute of Scientific and Technical Information of China (English)

    LIU Shi-guang; WANG Zhang-ye; GONG Zheng; CHEN Fei-fei; PENG Qun-sheng

    2006-01-01

    Realistic modeling and rendering of dynamic tornado scenes is recognized as a challenging task for researchers in computer graphics. In this paper a new physically based method for simulating and animating tornado scenes is presented. We first propose a Two-Fluid model based on the physical theory of tornadoes, then we simulate the flow of the tornado and its interaction with surrounding objects such as debris. Taking the scattering and absorption of light by the participating media into account, the illumination effects of the tornado scene can be generated realistically. With the support of graphics hardware, various kinds of dynamic tornado scenes can be rendered at interactive rates.

  4. Application software development via model based design

    OpenAIRE

    Haapala, O. (Olli)

    2015-01-01

    This thesis set out to study the utilization of the MathWorks’ Simulink® program in model based application software development and its compatibility with the Vacon 100 inverter. The target was to identify all the problems related to everyday usage of this method and to create a white paper on how to execute a model based design to create Vacon 100 compatible system software. Before this thesis was started, there was very little knowledge of the compatibility of this method. However durin...

  5. A subgrid based approach for morphodynamic modelling

    Science.gov (United States)

    Volp, N. D.; van Prooijen, B. C.; Pietrzak, J. D.; Stelling, G. S.

    2016-07-01

    To improve the accuracy and the efficiency of morphodynamic simulations, we present a subgrid based approach for a morphodynamic model. This approach is well suited for areas characterized by sub-critical flow, like estuaries, coastal areas and lowland rivers. This new method uses a different grid resolution to compute the hydrodynamics and the morphodynamics. The hydrodynamic computations are carried out with a subgrid based, two-dimensional, depth-averaged model. This model uses a coarse computational grid in combination with a subgrid. The subgrid contains high resolution bathymetry and roughness information to compute volumes, friction and advection. The morphodynamic computations are carried out entirely on a high resolution grid, the bed grid. It is key to find a link between the information defined on the different grids in order to guarantee the feedback between the hydrodynamics and the morphodynamics. This link is made by using a new physics-based interpolation method. The method interpolates water levels and velocities from the coarse grid to the high resolution bed grid. The morphodynamic solution improves significantly when using the subgrid based method compared to a full coarse grid approach. The Exner equation is discretised with an upwind method based on the direction of the bed celerity. This ensures a stable solution for the Exner equation. By means of three examples, it is shown that the subgrid based approach offers a significant improvement at a minimal computational cost.
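    A minimal 1D illustration of the bed update described above, with an upwind difference chosen by the sign of the bed celerity; the porosity value and celerity estimate are illustrative assumptions:

        import numpy as np

        def exner_step(zb, q_s, dx, dt, porosity=0.4):
            """Update bed level zb given sediment transport q_s on the bed grid."""
            dzb = np.zeros_like(zb)
            # Bed celerity ~ dq_s/dzb; its sign picks the upwind direction
            # (crude finite-difference estimate, for illustration only).
            celerity = np.gradient(q_s) / (np.gradient(zb) + 1e-12)
            for i in range(1, len(zb) - 1):
                if celerity[i] >= 0:   # information travels downstream: backward diff
                    dqdx = (q_s[i] - q_s[i - 1]) / dx
                else:                  # information travels upstream: forward diff
                    dqdx = (q_s[i + 1] - q_s[i]) / dx
                dzb[i] = -dt / (1.0 - porosity) * dqdx
            return zb + dzb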

  6. Model Predictive Control based on Finite Impulse Response Models

    DEFF Research Database (Denmark)

    Prasath, Guru; Jørgensen, John Bagterp

    2008-01-01

    We develop a regularized l2 finite impulse response (FIR) predictive controller with input and input-rate constraints. Feedback is based on a simple constant output disturbance filter. The performance of the predictive controller in the face of plant-model mismatch is investigated by simulations ...
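    A hedged sketch of such a regularized l2 FIR predictive controller using the cvxpy modeling library; the prediction matrix, horizon and bounds are invented for illustration:

        import numpy as np
        import cvxpy as cp

        def fir_mpc(h, r, u_min, u_max, du_max, reg=0.1):
            """h: FIR impulse response; r: reference over the horizon."""
            N = len(r)
            u = cp.Variable(N)
            # Toeplitz-style prediction: y[k] = sum_j h[k-j] * u[j]
            H = np.array([[h[k - j] if 0 <= k - j < len(h) else 0.0
                           for j in range(N)] for k in range(N)])
            # l2 tracking cost plus regularization of the input rate.
            cost = cp.sum_squares(H @ u - r) + reg * cp.sum_squares(cp.diff(u))
            constraints = [u >= u_min, u <= u_max, cp.abs(cp.diff(u)) <= du_max]
            cp.Problem(cp.Minimize(cost), constraints).solve()
            return u.value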

  7. Incident Duration Modeling Using Flexible Parametric Hazard-Based Models

    Directory of Open Access Journals (Sweden)

    Ruimin Li

    2014-01-01

    Full Text Available Assessing and prioritizing the duration time and effects of traffic incidents on major roads present significant challenges for road network managers. This study examines the effect of numerous factors associated with various types of incidents on their duration and proposes an incident duration prediction model. Several parametric accelerated failure time hazard-based models were examined, including Weibull, log-logistic, log-normal, and generalized gamma, as well as all models with gamma heterogeneity and flexible parametric hazard-based models with degrees of freedom ranging from one to ten, by analyzing a traffic incident dataset obtained from the Incident Reporting and Dispatching System in Beijing in 2008. Results show that different factors significantly affect different incident time phases, whose best distributions were diverse. Given the best hazard-based models of each incident time phase, the prediction result can be reasonable for most incidents. The results of this study can aid traffic incident management agencies not only in implementing strategies that would reduce incident duration, and thus reduce congestion, secondary incidents, and the associated human and economic losses, but also in effectively predicting incident duration time.
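    For illustration, parametric AFT models like those compared in the study can be fitted and ranked by AIC with the lifelines library; the file and column names below are hypothetical:

        import pandas as pd
        from lifelines import (WeibullAFTFitter, LogNormalAFTFitter,
                               LogLogisticAFTFitter)

        df = pd.read_csv("incidents.csv")   # hypothetical data with covariates,
                                            # 'duration' (min) and 'cleared' (0/1)
        fitters = {
            "weibull":      WeibullAFTFitter(),
            "log-normal":   LogNormalAFTFitter(),
            "log-logistic": LogLogisticAFTFitter(),
        }
        for name, f in fitters.items():
            f.fit(df, duration_col="duration", event_col="cleared")
            print(name, "AIC =", round(f.AIC_, 1))  # lower AIC = better fit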

  8. Incident duration modeling using flexible parametric hazard-based models.

    Science.gov (United States)

    Li, Ruimin; Shang, Pan

    2014-01-01

    Assessing and prioritizing the duration time and effects of traffic incidents on major roads present significant challenges for road network managers. This study examines the effect of numerous factors associated with various types of incidents on their duration and proposes an incident duration prediction model. Several parametric accelerated failure time hazard-based models were examined, including Weibull, log-logistic, log-normal, and generalized gamma, as well as all models with gamma heterogeneity and flexible parametric hazard-based models with degrees of freedom ranging from one to ten, by analyzing a traffic incident dataset obtained from the Incident Reporting and Dispatching System in Beijing in 2008. Results show that different factors significantly affect different incident time phases, whose best distributions were diverse. Given the best hazard-based models of each incident time phase, the prediction result can be reasonable for most incidents. The results of this study can aid traffic incident management agencies not only in implementing strategies that would reduce incident duration, and thus reduce congestion, secondary incidents, and the associated human and economic losses, but also in effectively predicting incident duration time.

  9. Haptics-based dynamic implicit solid modeling.

    Science.gov (United States)

    Hua, Jing; Qin, Hong

    2004-01-01

    This paper systematically presents a novel, interactive solid modeling framework, Haptics-based Dynamic Implicit Solid Modeling, which is founded upon volumetric implicit functions and powerful physics-based modeling. In particular, we augment our modeling framework with a haptic mechanism in order to take advantage of additional realism associated with a 3D haptic interface. Our dynamic implicit solids are semi-algebraic sets of volumetric implicit functions and are governed by the principles of dynamics, hence responding to sculpting forces in a natural and predictable manner. In order to directly manipulate existing volumetric data sets as well as point clouds, we develop a hierarchical fitting algorithm to reconstruct and represent discrete data sets using our continuous implicit functions, which permit users to further design and edit those existing 3D models in real-time using a large variety of haptic and geometric toolkits, and visualize their interactive deformation at arbitrary resolution. The additional geometric and physical constraints afford more sophisticated control of the dynamic implicit solids. The versatility of our dynamic implicit modeling enables the user to easily modify both the geometry and the topology of modeled objects, while the inherent physical properties can offer an intuitive haptic interface for direct manipulation with force feedback. PMID:15794139

  10. A Conceptual Model for Ontology Based Learning

    Directory of Open Access Journals (Sweden)

    Touraj Banirostam

    2012-11-01

    Full Text Available The use of learning features in many fields, such as education, artificial intelligence, and multi-agent systems, has led to various definitions of this concept. In this article, the significant definitions of learning in these fields are presented and their key concepts are described. Using the features found in the different definitions of learning, an ontology for the concept of learning is presented, in which the main ontological concepts and their relations are represented. A conceptual model for learning based on the presented ontology is then proposed by means of model and modeling descriptions. The concepts of the presented definitions are shown in the proposed model, after which the model's functionality is discussed; twelve main characteristics are used to describe it. The learning ontology can also serve as a guide for modeling learning and can be useful in comparing different learning models, in that the key concepts required for a given learning model can be determined. Furthermore, an example based on the proposed ontology and definition features is explained.

  11. Model-Based Power Plant Master Control

    Energy Technology Data Exchange (ETDEWEB)

    Boman, Katarina; Thomas, Jean; Funkquist, Jonas

    2010-08-15

    The main goal of the project has been to evaluate the potential of a coordinated master control for a solid fuel power plant in terms of tracking capability, stability and robustness. The control strategy has been model-based predictive control (MPC) and the plant used in the case study has been the Vattenfall power plant Idbaecken in Nykoeping. A dynamic plant model based on nonlinear physical models was used to imitate the true plant in MATLAB/SIMULINK simulations. The basis for this model was already developed in previous Vattenfall internal projects, along with a simulation model of the existing control implementation with traditional PID controllers. The existing PID control is used as a reference performance, and it has been thoroughly studied and tuned in these previous Vattenfall internal projects. A turbine model was developed with characteristics based on the results of steady-state simulations of the plant using the software EBSILON. Using the derived model as a representative for the actual process, an MPC control strategy was developed using linearization and gain-scheduling. The control signal constraints (rate of change) and constraints on outputs were implemented to comply with plant constraints. After tuning the MPC control parameters, a number of simulation scenarios were performed to compare the MPC strategy with the existing PID control structure. The simulation scenarios also included cases highlighting the robustness properties of the MPC strategy. From the study, the main conclusions are: - The proposed Master MPC controller shows excellent set-point tracking performance even though the plant has strong interactions and non-linearity, and the controls and their rate of change are bounded. - The proposed Master MPC controller is robust, stable in the presence of disturbances and parameter variations. Even though the current study only considered a very small number of the possible disturbances and modelling errors, the considered cases are

  12. Classification tree analysis of factors affecting parking choices in Qatar

    OpenAIRE

    Shaaban, K.; Pande, A

    2015-01-01

    Qatar has experienced a significant population growth in the past decade. The growth has been accompanied by an increase in automobile ownership rates leading to parking problems especially in the capital city of Doha. The objective of this study was to find the factors affecting people's choice of parking in this rich developing country when different parking options are available. Two commercial centers located in the city of Doha, Qatar were selected for this study; the City Center mall an...
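    A classification tree for such choice data could be fitted along the following lines with scikit-learn; the file, feature and class names are invented for illustration:

        import pandas as pd
        from sklearn.tree import DecisionTreeClassifier, export_text

        df = pd.read_csv("parking_survey.csv")        # hypothetical survey data
        X = df[["trip_purpose", "duration_min", "cost_awareness"]]
        y = df["parking_choice"]                      # e.g. garage / surface / street

        Xd = pd.get_dummies(X)                        # one-hot encode categoricals
        tree = DecisionTreeClassifier(max_depth=4, min_samples_leaf=30)
        tree.fit(Xd, y)
        print(export_text(tree, feature_names=list(Xd.columns)))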

  13. Port-based modeling of mechatronic systems

    NARCIS (Netherlands)

    Breedveld, Peter C.

    2004-01-01

    Many engineering activities, including mechatronic design, require that a multidomain or ‘multi-physics’ system and its control system be designed as an integrated system. This contribution discusses the background and tools for a port-based approach to integrated modeling and simulation of physical

  14. Requirements engineering-based conceptual modeling

    NARCIS (Netherlands)

    Insfrán, E.; Pastor, O.; Wieringa, R.J.

    2002-01-01

    The software production process involves a set of phases where a clear relationship and smooth transitions between them should be introduced. In this paper, a requirements engineering-based conceptual modelling approach is introduced as a way to improve the quality of the software production process

  15. Agent Based Modelling for Social Simulation

    NARCIS (Netherlands)

    Smit, S.K.; Ubink, E.M.; Vecht, B. van der; Langley, D.J.

    2013-01-01

    This document is the result of an exploratory project looking into the status of, and opportunities for Agent Based Modelling (ABM) at TNO. The project focussed on ABM applications containing social interactions and human factors, which we termed ABM for social simulation (ABM4SS). During the course

  16. Sandboxes for Model-Based Inquiry

    Science.gov (United States)

    Brady, Corey; Holbert, Nathan; Soylu, Firat; Novak, Michael; Wilensky, Uri

    2015-01-01

    In this article, we introduce a class of constructionist learning environments that we call "Emergent Systems Sandboxes" ("ESSs"), which have served as a centerpiece of our recent work in developing curriculum to support scalable model-based learning in classroom settings. ESSs are a carefully specified form of virtual…

  17. What's Missing in Model-Based Teaching

    Science.gov (United States)

    Khan, Samia

    2011-01-01

    In this study, the author investigated how four science teachers employed model-based teaching (MBT) over a 1-year period. The purpose of the research was to develop a baseline of the fundamental and specific dimensions of MBT that are present and absent in science teaching. Teacher interviews, classroom observations, and pre and post-student…

  18. Introducing Waqf Based Takaful Model in India

    Directory of Open Access Journals (Sweden)

    Syed Ahmed Salman

    2014-03-01

    Full Text Available Objective – Waqf is a unique feature of the socioeconomic system of Islam in a multi-religious and developing country like India. India is a country rich in waqf assets. The history of waqf in India can be traced back 800 years. Most researchers suggest ways waqf can be used as a tool to mitigate the poverty of Muslims. India has the third largest Muslim population after Indonesia and Pakistan. However, the majority of Muslims belong to the low income group and they are in need of help. It is believed that waqf can be utilized for the betterment of the Indian Muslim community. Among the available uses of waqf assets, the main objective of this paper is to introduce a waqf based takaful model in India. In addition, how this proposed model can be adopted in India is highlighted.Methods – Library research is applied since this paper relies on secondary data, thoroughly reviewing the most relevant literature.Result – India, as a country rich in waqf assets, should fully utilize these resources to help Muslims through takaful.Conclusion – In this study, we have proposed a waqf based takaful model combining the concepts of mudarabah and wakalah for India. We recommend this model based on the background and situation of the country. Since we have not tested the viability of this model in India, future research should continue with this testing.Keywords: Waqf, Takaful, Poverty and India

  19. Agent Based Modeling as an Educational Tool

    Science.gov (United States)

    Fuller, J. H.; Johnson, R.; Castillo, V.

    2012-12-01

    Motivation is a key element in high school education. One way to improve motivation and provide content, while helping address critical thinking and problem solving skills, is to have students build and study agent based models in the classroom. This activity visually connects concepts with their applied mathematical representation. "Engaging students in constructing models may provide a bridge between frequently disconnected conceptual and mathematical forms of knowledge." (Levy and Wilensky, 2011) We wanted to discover the feasibility of implementing a model based curriculum in the classroom given current and anticipated core and content standards. (Accompanying figures: simulation using California GIS data; simulation of high school student lunch popularity using an aerial photograph over a terrain value map.)

  20. A Novel Template-Based Learning Model

    CERN Document Server

    Abolghasemi-Dahaghani, Mohammadreza; Nowroozi, Alireza

    2011-01-01

    This article presents a model which is capable of learning and abstracting new concepts based on comparing observations and finding the resemblance between the observations. In the model, new observations are compared with templates derived from previous experiences. In the first stage, objects are represented through a geometric description, used for finding the object boundaries, and a descriptor inspired by the human visual system, and are then fed into the model. Next, new observations are identified by comparing them with the previously learned templates and are used for producing new templates. The comparisons are made based on measures like Euclidean or correlation distance. A new template is created by applying an onion-peeling algorithm, which consecutively uses convex hulls made of the points representing the objects. If the new observation is remarkably similar to one of the observed categories, it is no longer util...
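    A toy sketch of the template comparison step: a new observation is assigned to an existing category when its distance to some template falls below a threshold, and otherwise seeds a new template (the threshold and vector representation are our assumptions):

        import numpy as np

        def classify_or_learn(observation, templates, threshold=0.5):
            """templates: list of feature vectors; observation: feature vector."""
            if templates:
                dists = [np.linalg.norm(observation - t) for t in templates]
                best = int(np.argmin(dists))
                if dists[best] < threshold:        # similar enough: recognized
                    return best, templates
            templates.append(observation.copy())   # novel: becomes a new template
            return len(templates) - 1, templates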

  1. Modeling Leaves Based on Real Image

    Institute of Scientific and Technical Information of China (English)

    CAO Yu-kun; LI Yun-feng; ZHU Qing-sheng; LIU Yin-bin

    2004-01-01

    Plants have complex structures. The shape of a plant component is vital for capturing the characteristics of a species. One of the challenges in computer graphics is to create the geometry of objects in an intuitive and direct way while allowing interactive manipulation of the resulting shapes. In this paper, an interactive method for modeling leaves based on real images is proposed, using biological data for individual plants. The modeling process begins with a one-dimensional analogue of implicit surfaces, from which a 2D silhouette of a leaf is generated based on image segmentation. The silhouette skeleton is thus obtained. Feature parameters of the leaf are extracted based on biologically experimental data, and the obtained leaf structure is then modified by comparing the synthetic result with the real leaf so as to make the leaf structure more realistic. Finally, the leaf mesh is constructed by sweeps.

  2. Multiagent-Based Model For ESCM

    Directory of Open Access Journals (Sweden)

    Delia MARINCAS

    2011-01-01

    Full Text Available Web based applications for Supply Chain Management (SCM) are now a necessity for every company in order to meet increasing customer demands, face global competition and make a profit. A multiagent-based approach is appropriate for eSCM because it exhibits many of the characteristics a SCM system should have. For this reason, we have proposed a multiagent-based eSCM model which configures a virtual SC and automates the SC activities: selling, purchasing, manufacturing, planning, inventory, etc. This model will allow better coordination of the supply chain network and will increase the effectiveness of the Web and intelligent technologies employed in eSCM software.

  3. Mesoscopic model of actin-based propulsion.

    Directory of Open Access Journals (Sweden)

    Jie Zhu

    Full Text Available Two theoretical models dominate current understanding of actin-based propulsion: the microscopic polymerization ratchet model predicts that growing and writhing actin filaments generate forces and movements, while the macroscopic elastic propulsion model suggests that deformation and stress of the growing actin gel are responsible for the propulsion. We examine both experimentally and computationally the 2D movement of ellipsoidal beads propelled by actin tails and show that neither of the two models can explain the observed bistability of the orientation of the beads. To explain the data, we develop a 2D hybrid mesoscopic model by reconciling these two models such that individual actin filaments undergoing nucleation, elongation, attachment, detachment and capping are embedded into the boundary of a node-spring viscoelastic network representing the macroscopic actin gel. Stochastic simulations of this 'in silico' actin network show that the combined effects of the macroscopic elastic deformation and microscopic ratchets can explain the observed bistable orientation of the actin-propelled ellipsoidal beads. To test the theory further, we analyze the observed distribution of the curvatures of the trajectories and show that the hybrid model's predictions fit the data. Finally, we demonstrate that the model can explain both concave-up and concave-down force-velocity relations for growing actin networks, depending on the characteristic time scale and network recoil. To summarize, we propose that both microscopic polymerization ratchets and macroscopic stresses of the deformable actin network are responsible for the force and movement generation.

  4. [Fast spectral modeling based on Voigt peaks].

    Science.gov (United States)

    Li, Jin-rong; Dai, Lian-kui

    2012-03-01

    Indirect hard modeling (IHM) is a recently introduced method for quantitative spectral analysis, applied to the analysis of nonlinear relations between mixture spectra and component concentrations. IHM is also an effective technique for analyzing components of mixtures with molecular interactions and strongly overlapping bands. Before establishing the regression model, IHM needs to model the measured spectrum as a sum of Voigt peaks. The precision of the spectral model has an immediate impact on the accuracy of the regression model. A spectrum often includes dozens or even hundreds of Voigt peaks, which means that spectral modeling is in fact an optimization problem of high dimensionality. Consequently, large computational overhead is needed, and the solution may not be numerically unique due to the ill-conditioning of the optimization problem. An improved spectral modeling method is presented in the present paper, which reduces the dimensionality of the optimization problem by determining the overlapped peaks in the spectrum. Experimental results show that spectral modeling based on the new method is more accurate and needs a much shorter running time than the conventional method. PMID:22582612
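    For illustration, a small sum of Voigt peaks can be fitted to a spectrum with SciPy as below; reducing the number of free peaks is what lowers the dimensionality discussed above (the synthetic data and initial guesses are ours):

        import numpy as np
        from scipy.optimize import curve_fit
        from scipy.special import voigt_profile

        def two_voigt(x, a1, c1, s1, g1, a2, c2, s2, g2):
            # Sum of two Voigt peaks: amplitude a, center c, Gaussian width s,
            # Lorentzian width g for each peak.
            return (a1 * voigt_profile(x - c1, s1, g1)
                    + a2 * voigt_profile(x - c2, s2, g2))

        x = np.linspace(0, 10, 500)
        y = two_voigt(x, 1.0, 4.0, 0.3, 0.2, 0.6, 6.0, 0.4, 0.1)
        y += np.random.default_rng(0).normal(0, 0.01, x.size)  # synthetic spectrum

        p0 = [1, 4, 0.5, 0.5, 1, 6, 0.5, 0.5]                  # initial guesses
        popt, _ = curve_fit(two_voigt, x, y, p0=p0)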

  5. Sandboxes for Model-Based Inquiry

    Science.gov (United States)

    Brady, Corey; Holbert, Nathan; Soylu, Firat; Novak, Michael; Wilensky, Uri

    2015-04-01

    In this article, we introduce a class of constructionist learning environments that we call Emergent Systems Sandboxes ( ESSs), which have served as a centerpiece of our recent work in developing curriculum to support scalable model-based learning in classroom settings. ESSs are a carefully specified form of virtual construction environment that support students in creating, exploring, and sharing computational models of dynamic systems that exhibit emergent phenomena. They provide learners with "entity"-level construction primitives that reflect an underlying scientific model. These primitives can be directly "painted" into a sandbox space, where they can then be combined, arranged, and manipulated to construct complex systems and explore the emergent properties of those systems. We argue that ESSs offer a means of addressing some of the key barriers to adopting rich, constructionist model-based inquiry approaches in science classrooms at scale. Situating the ESS in a large-scale science modeling curriculum we are implementing across the USA, we describe how the unique "entity-level" primitive design of an ESS facilitates knowledge system refinement at both an individual and social level, we describe how it supports flexible modeling practices by providing both continuous and discrete modes of executability, and we illustrate how it offers students a variety of opportunities for validating their qualitative understandings of emergent systems as they develop.

  6. Multivariate statistical modelling based on generalized linear models

    CERN Document Server

    Fahrmeir, Ludwig

    1994-01-01

    This book is concerned with the use of generalized linear models for univariate and multivariate regression analysis. Its emphasis is to provide a detailed introductory survey of the subject based on the analysis of real data drawn from a variety of subjects including the biological sciences, economics, and the social sciences. Where possible, technical details and proofs are deferred to an appendix in order to provide an accessible account for non-experts. Topics covered include: models for multi-categorical responses, model checking, time series and longitudinal data, random effects models, and state-space models. Throughout, the authors have taken great pains to discuss the underlying theoretical ideas in ways that relate well to the data at hand. As a result, numerous researchers whose work relies on the use of these models will find this an invaluable account to have on their desks. "The basic aim of the authors is to bring together and review a large part of recent advances in statistical modelling of m...

  7. A comprehensive theory-based transport model

    International Nuclear Information System (INIS)

    A new theory based transport model with comprehensive physics (trapping, general toroidal geometry, finite beta, collisions) has been developed. The core of the model is the new trapped-gyro-Landau-fluid (TGLF) equations which provide a fast and accurate approximation to the linear eigenmodes for gyrokinetic drift-wave instabilities (trapped ion and electron modes, ion and electron temperature gradient modes and kinetic ballooning modes). This new TGLF transport model removes the limitation of its predecessor GLF23 and is valid for the same conditions as the gyrokinetic equations. A theory-based philosophy is used in the model construction. The closure coefficients of the TGLF equations are fit to local kinetic theory to give accurate linear eigenmodes. The saturation model is fit to non-linear turbulence simulations. No fitting to experiment is done so applying the model to experiments is a true test of the theory it is approximating. The TGLF model unifies trapped and passing particles in a single set of gyrofluid equations. A model for the averaging of the Landau resonance by the trapped particles makes the equations work seamlessly over the whole drift-wave wavenumber range from trapped ion modes to electron temperature gradient modes. A fast eigenmode solution method enables unrestricted magnetic geometry. Electron-ion collisions and full electromagnetic fluctuations round out the physics. The linear eigenmodes have been benchmarked against comprehensive physics gyrokinetic calculations over a large range of plasma parameters. The deviation between the gyrokinetic and TGLF linear growth rates averages 11.4% in shifted circle geometry. The transport model uses the TGLF eigenmodes to compute quasilinear fluxes of energy and particles. A model for the saturated amplitude of the turbulence completes the calculation. The saturation model is constructed to fit a large set of nonlinear gyrokinetic turbulence simulations. The TGLF model is valid in new physical

  8. A comprehensive theory-based transport model

    International Nuclear Information System (INIS)

    Full text: A new theory based transport model with comprehensive physics (trapping, general toroidal geometry, finite beta, collisions) has been developed. The core of the model is the new trapped-gyro- Landau-fluid (TGLF) equations which provide a fast and accurate approximation to the linear eigenmodes for gyrokinetic drift-wave instabilities (trapped ion and electron modes, ion and electron temperature gradient modes and kinetic ballooning modes). This new TGLF transport model removes the limitation of its predecessor GLF23 and is valid for the same conditions as the gyrokinetic equations. A theory-based philosophy is used in the model construction. The closure coefficients of the TGLF equations are fit to local kinetic theory to give accurate linear eigenmodes. The saturation model is fit to non-linear turbulence simulations. No fitting to experiment is done so applying the model to experiments is a true test of the theory it is approximating. The TGLF model unifies trapped and passing particles in a single set of gyrofluid equations. A model for the averaging of the Landau resonance by the trapped particles makes the equations work seamlessly over the whole drift-wave wavenumber range from trapped ion modes to electron temperature gradient modes. A fast eigenmode solution method enables unrestricted magnetic geometry. Electron-ion collisions and full electromagnetic fluctuations round out the physics. The linear eigenmodes have been benchmarked against comprehensive physics gyrokinetic calculations over a large range of plasma parameters. The deviation between the gyrokinetic and TGLF linear growth rates averages 11.4% in shifted circle geometry. The transport model uses the TGLF eigenmodes to compute quasilinear fluxes of energy and particles. A model for the saturated amplitude of the turbulence completes the calculation. The saturation model is constructed to fit a large set of nonlinear gyrokinetic turbulence simulations. The TGLF model is valid in new

  9. Electrochemistry-based Battery Modeling for Prognostics

    Science.gov (United States)

    Daigle, Matthew J.; Kulkarni, Chetan Shrikant

    2013-01-01

    Batteries are used in a wide variety of applications. In recent years, they have become popular as a source of power for electric vehicles such as cars, unmanned aerial vehicles, and commercial passenger aircraft. In such application domains, it becomes crucial to both monitor battery health and performance and to predict end of discharge (EOD) and end of useful life (EOL) events. To implement such technologies, it is crucial to understand how batteries work and to capture that knowledge in the form of models that can be used by monitoring, diagnosis, and prognosis algorithms. In this work, we develop electrochemistry-based models of lithium-ion batteries that capture the significant electrochemical processes, are computationally efficient, capture the effects of aging, and are of suitable accuracy for reliable EOD prediction in a variety of usage profiles. This paper reports on the progress of such a model, with results demonstrating the model validity and accurate EOD predictions.

  10. An immune based dynamic intrusion detection model

    Institute of Scientific and Technical Information of China (English)

    LI Tao

    2005-01-01

    With the dynamic description method for self and antigen, and the concept of dynamic immune tolerance for lymphocytes in the network-security domain presented in this paper, a new immune based dynamic intrusion detection model (Idid) is proposed. In Idid, the dynamic models and the corresponding recursive equations of the lifecycle of mature lymphocytes and of immune memory are built. Therefore, the problem of the dynamic description of self and nonself in computer immune systems is solved, and the defect of the low efficiency of mature lymphocyte generation in traditional computer immune systems is overcome. Simulations of this model are performed, and the comparison experiment results show that the proposed dynamic intrusion detection model has better adaptability than the traditional methods.

  11. Model-based multiple patterning layout decomposition

    Science.gov (United States)

    Guo, Daifeng; Tian, Haitong; Du, Yuelin; Wong, Martin D. F.

    2015-10-01

    As one of the most promising next generation lithography technologies, multiple patterning lithography (MPL) plays an important role in the attempts to keep pace with the 10 nm technology node and beyond. With feature sizes continuing to shrink, it has become impossible to print dense layouts within one single exposure. As a result, MPL such as double patterning lithography (DPL) and triple patterning lithography (TPL) has been widely adopted. There is a large volume of literature on DPL/TPL layout decomposition, and the current approach is to formulate the problem as a classical graph-coloring problem: layout features (polygons) are represented by vertices in a graph G and there is an edge between two vertices if and only if the distance between the two corresponding features is less than a minimum distance threshold value dmin. The problem is to color the vertices of G using k colors (k = 2 for DPL, k = 3 for TPL) such that no two vertices connected by an edge are given the same color. This is a rule-based approach, which imposes a geometric distance as a minimum constraint to simply decompose polygons within that distance into different masks. It is not desirable in practice because this criterion cannot completely capture the behavior of the optics. For example, it lacks sufficient information such as the optical source characteristics and the effects between polygons outside the minimum distance. To remedy the deficiency, a model-based layout decomposition approach that bases the decomposition criteria on simulation results was first introduced at SPIE 2013 [1]. However, that algorithm [1] is based on simplified assumptions about the optical simulation model, and therefore its usage on real layouts is limited. Recently AMSL [2] also proposed a model-based approach to layout decomposition by iteratively simulating the layout, which requires excessive computational resources and may lead to sub-optimal solutions. The approach [2] also potentially generates too many stitches. In this
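    The rule-based baseline that the model-based approach improves upon reduces to graph k-coloring, sketched here with simple backtracking on an invented conflict graph:

        def k_color(vertices, edges, k):
            """Assign one of k masks to each vertex so no edge is monochromatic."""
            colors = {}
            def assign(i):
                if i == len(vertices):
                    return True
                v = vertices[i]
                # Colors already used by neighbors of v.
                used = {colors[u] for u, w in edges if w == v and u in colors}
                used |= {colors[w] for u, w in edges if u == v and w in colors}
                for c in range(k):
                    if c not in used:
                        colors[v] = c
                        if assign(i + 1):
                            return True
                        del colors[v]
                return False
            return colors if assign(0) else None

        # Conflict graph: features closer than dmin share an edge.
        verts = ["A", "B", "C", "D"]
        conflicts = [("A", "B"), ("B", "C"), ("C", "A"), ("C", "D")]
        print(k_color(verts, conflicts, k=3))   # triple patterning assignment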

  12. Investigating the relationship between hydrologic model parameters and physical catchment metrics for improved modeling in data-sparse regions

    Science.gov (United States)

    Marshall, L. A.; Weber, K.; Greenwood, M. C.; Smith, T. J.; Sharma, A.

    2013-12-01

    In regions with sparse data, hydrologic modelers often endeavor to transfer information from longer-term gauged catchments to those with limited data. In this approach, it is assumed that these gauged 'surrogates' can provide useful information for those ungauged catchments that are hydrologically similar. One recent method aims to pool catchments with similar hydrologic behavior so that models may be more convincingly applied to catchments without detailed observations. An ongoing concern, however, is how to identify catchments that behave similarly in terms of hydrologic processes and thus classify catchments in terms of their modeled behavior. In this study, we investigate the complex relationship between physical catchment characteristics, hydrologic signatures, and optimized hydrologic models for regions with sparse data. We make use of a data set of over 150 catchments located in southeast Australia with basic climatic and hydrologic time series and limited information on physical catchment characteristics. A conceptual rainfall-runoff model is calibrated for each of the catchments and hierarchical clustering is performed to link catchments based on their calibrated model parameters. We then aim to isolate the physical and spatial metrics that are common to each member of a given cluster with the ultimate goal of providing insight to the selection of gauged surrogates for ungauged watersheds. A Permutational Multivariate Analysis of Variance (perMANOVA) is performed to determine if significant differences exist between clusters according to certain physical and climatic catchment descriptors. We further analyze the data using a classification tree to determine the extent to which cluster membership can be predicted by basic catchment descriptors. Our results show support for the 'surrogate' technique for hydrologic regionalization by demonstrating that the clusters, though built using calibrated model parameters, are related to clear differences in the
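    The analysis pipeline described above might be sketched as follows, with placeholder arrays standing in for the calibrated parameters and catchment descriptors:

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from sklearn.tree import DecisionTreeClassifier

        params = np.random.rand(150, 6)        # calibrated parameters, 150 catchments
        descriptors = np.random.rand(150, 4)   # e.g. area, slope, rainfall, soil index

        Z = linkage(params, method="ward")     # hierarchical clustering on parameters
        clusters = fcluster(Z, t=5, criterion="maxclust")

        # Ask whether physical descriptors predict cluster membership.
        tree = DecisionTreeClassifier(max_depth=3).fit(descriptors, clusters)
        print("descriptors explain clusters with accuracy",
              round(tree.score(descriptors, clusters), 2))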

  13. 3-D model-based vehicle tracking.

    Science.gov (United States)

    Lou, Jianguang; Tan, Tieniu; Hu, Weiming; Yang, Hao; Maybank, Steven J

    2005-10-01

    This paper aims at tracking vehicles from monocular intensity image sequences and presents an efficient and robust approach to three-dimensional (3-D) model-based vehicle tracking. Under the weak perspective assumption and the ground-plane constraint, the movements of model projection in the two-dimensional image plane can be decomposed into two motions: translation and rotation. They are the results of the corresponding movements of 3-D translation on the ground plane (GP) and rotation around the normal of the GP, which can be determined separately. A new metric based on point-to-line segment distance is proposed to evaluate the similarity between an image region and an instantiation of a 3-D vehicle model under a given pose. Based on this, we provide an efficient pose refinement method to refine the vehicle's pose parameters. An improved EKF is also proposed to track and to predict vehicle motion with a precise kinematics model. Experimental results with both indoor and outdoor data show that the algorithm obtains desirable performance even under severe occlusion and clutter. PMID:16238061
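    The point-to-line-segment distance underlying the similarity metric can be written compactly; this minimal 2D version omits the aggregation over model edges that the full metric performs:

        import numpy as np

        def point_to_segment(p, a, b):
            """Distance from point p to segment ab (all 2D numpy arrays)."""
            ab, ap = b - a, p - a
            # Project p onto the segment, clamped to its endpoints.
            t = np.clip(np.dot(ap, ab) / np.dot(ab, ab), 0.0, 1.0)
            return np.linalg.norm(p - (a + t * ab))   # distance to closest point

        print(point_to_segment(np.array([1.0, 1.0]),
                               np.array([0.0, 0.0]), np.array([2.0, 0.0])))  # 1.0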

  14. Image-based modelling of organogenesis.

    Science.gov (United States)

    Iber, Dagmar; Karimaddini, Zahra; Ünal, Erkan

    2016-07-01

    One of the major challenges in biology concerns the integration of data across length and time scales into a consistent framework: how do macroscopic properties and functionalities arise from the molecular regulatory networks, and how can they change as a result of mutations? Morphogenesis provides an excellent model system to study how simple molecular networks robustly control complex processes on the macroscopic scale despite molecular noise, and how important functional variants can emerge from small genetic changes. Recent advancements in three-dimensional imaging technologies, computer algorithms and computer power now allow us to develop and analyse increasingly realistic models of biological control. Here, we present our pipeline for image-based modelling that includes the segmentation of images, the determination of displacement fields and the solution of systems of partial differential equations on the growing, embryonic domains. The development of suitable mathematical models, the data-based inference of parameter sets and the evaluation of competing models are still challenging, and current approaches are discussed. PMID:26510443

  15. Model transformation based information system modernization

    Directory of Open Access Journals (Sweden)

    Olegas Vasilecas

    2013-03-01

    Full Text Available Information systems become outdated increasingly quickly because of the rapidly changing business environment. Usually, small changes are not sufficient to adapt complex legacy information systems to changing business needs. New functionality should be installed with the requirement of putting business data at the smallest possible risk. Information system modernization problems are analyzed in this paper and a method for information system modernization is proposed. It involves transformation of program code into an abstract syntax tree metamodel (ASTM) and a model based transformation from ASTM into the knowledge discovery metamodel (KDM). The method is validated on an example for the SQL language.

  16. Soft sensor modeling based on Gaussian processes

    Institute of Scientific and Technical Information of China (English)

    XIONG Zhi-hua; HUANG Guo-hong; SHAO Hui-he

    2005-01-01

    In order to meet the demands of online optimal running, a novel soft sensor modeling approach based on Gaussian processes is proposed. The approach is moderately simple to implement and use without loss of performance. The model is trained by optimizing the hyperparameters using the scaled conjugate gradient algorithm, with the squared exponential covariance function employed. Experimental simulations on a real-world example from a refinery show the advantages of the soft sensor modeling approach. Meanwhile, the method opens new possibilities for the application of kernel methods to potential fields.
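    A sketch of such a GP soft sensor with a squared exponential (RBF) kernel, using scikit-learn; note that scikit-learn optimizes hyperparameters with L-BFGS rather than the scaled conjugate gradient used in the paper, and the data here are placeholders:

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        X = np.random.rand(200, 5)             # easy-to-measure process variables
        y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + 0.05 * np.random.randn(200)

        # Squared exponential covariance plus a noise term.
        kernel = RBF(length_scale=np.ones(5)) + WhiteKernel(noise_level=0.01)
        gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
        y_pred, y_std = gp.predict(X[:5], return_std=True)   # prediction + uncertainty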

  17. Modelling spatial association in pattern based land use simulation models.

    Science.gov (United States)

    Anputhas, Markandu; Janmaat, Johannus John A; Nichol, Craig F; Wei, Xiaohua Adam

    2016-10-01

    Pattern based land use models are widely used to forecast land use change. These models predict land use change using driving variables observed on the studied landscape. Many of these models have a limited capacity to account for interactions between neighbouring land parcels. Some modellers have used common spatial statistical measures to incorporate neighbour effects. However, these approaches were developed for continuous variables, while land use classifications are categorical. Neighbour interactions are also endogenous, changing as the land use patterns change. In this study we describe a single variable measure that captures aspects of neighbour interactions as reflected in the land use pattern. We use a stepwise updating process to demonstrate how dynamic updating of our measure impacts on model forecasts. We illustrate these results using the CLUE-S (Conversion of Land Use and its Effects at Small regional extent) system to forecast land use change for the Deep Creek watershed in the northern Okanagan Valley of British Columbia, Canada. Results establish that our measure improves model calibration and that ignoring changing spatial influences biases land use change forecasts. PMID:27420169

  18. History-based trust negotiation model

    Institute of Scientific and Technical Information of China (English)

    ZHAO Yi-zhu; ZHAO Yan-hua; LU Hong-wei

    2009-01-01

    Trust negotiation (TN) is an approach to establishing trust between strangers through the iterative disclosure of digital credentials. Speeding up subsequent negotiations between the same negotiators is a problem worthy of research. This paper introduces the concept of a visiting card and presents a history-based trust negotiation (HBTN) model. HBTN creates an account for a counterpart at the first negotiation and records the valid credentials that the counterpart disclosed during each trust negotiation in its historical information base (HIB). In subsequent negotiations, neither party needs to disclose those credentials again. HBTN speeds up subsequent negotiations between entities that interact with each other frequently without impairing privacy preservation.

  19. Item Modeling Concept Based on Multimedia Authoring

    Directory of Open Access Journals (Sweden)

    Janez Stergar

    2008-09-01

    Full Text Available In this paper a modern item design framework for computer based assessment, based on the Flash authoring environment, is introduced. Question design is discussed, and the multimedia authoring environment used for item modeling is emphasized. Item type templates are a structured means of collecting and storing item information that can be used to improve the efficiency and security of the innovative item design process. Templates can modernize item design and enhance and speed up the development process. Along with content creation, multimedia has vast potential for use in innovative testing. The introduced item design template is based on a taxonomy of innovative items, which have great potential for expanding the content areas and construct coverage of an assessment. The presented item design approach is based on GUIs – one for question design based on the implemented item design templates and one for user interaction tracking/retrieval. The concept of user interfaces based on Flash technology is discussed, as well as the implementation of the innovative item design forms with multimedia authoring. An innovative method for user interaction storage/retrieval based on PHP, extending Flash capabilities in the proposed framework, is also introduced.

  20. Graph based model to support nurses' work.

    Science.gov (United States)

    Benedik, Peter; Rajkovič, Uroš; Sušteršič, Olga; Prijatelj, Vesna; Rajkovič, Vladislav

    2014-01-01

    Health care is a knowledge-based community that critically depends on knowledge management activities in order to ensure quality. Nurses are primary stakeholders and need to ensure that their information and knowledge needs are being met in ways that enable them to improve the quality and efficiency of health care service delivery for all subjects of health care. This paper describes a system to help nurses create nursing care plans. It supports focusing a nurse's attention on those resources/solutions that are likely to be most relevant to their particular situation/problem in the nursing domain. The system is based on a multi-relational property graph, a flexible modeling construct. The graph allows modeling of the nursing domain (ontology) and of the indices that partition the domain into an efficient, searchable space, where the solution to a problem is seen as abstractly defined traversals through its vertices and edges. PMID:24943559
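    A toy property-graph sketch with networkx, using invented labels: concepts are vertices, typed edges are relations, and a short traversal surfaces interventions relevant to a diagnosis:

        import networkx as nx

        G = nx.MultiDiGraph()
        G.add_edge("impaired mobility", "fall risk", rel="increases")
        G.add_edge("fall risk", "bed alarm", rel="addressed_by")
        G.add_edge("fall risk", "assisted ambulation", rel="addressed_by")

        def interventions_for(graph, diagnosis):
            # Traverse diagnosis -> risk -> intervention along typed edges.
            found = []
            for _, risk, d in graph.out_edges(diagnosis, data=True):
                if d["rel"] == "increases":
                    for _, action, d2 in graph.out_edges(risk, data=True):
                        if d2["rel"] == "addressed_by":
                            found.append(action)
            return found

        print(interventions_for(G, "impaired mobility"))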

  1. Model Based Control of Refrigeration Systems

    DEFF Research Database (Denmark)

    Larsen, Lars Finn Sloth

    Today refrigeration control systems consist of a number of self-contained distributed control loops that during the past years have been optimized to obtain high performance of the individual subsystems, thus disregarding cross-couplings both dynamically and statically. The supervisory control...... for automation of these procedures, that is, to incorporate some "intelligence" in the control system, this project was started up. The main emphasis of this work has been on model based methods for system optimizing control in supermarket refrigeration systems. The idea of implementing a system optimizing......-couplings resulting in large disturbances. In supermarket refrigeration systems the temperature control in the refrigerated display cases is maintained by hysteresis controllers. Based on a model predictive hybrid framework, a novel approach for desynchronization is presented. The approach is applied...

  2. Agent Based Model of Livestock Movements

    Science.gov (United States)

    Miron, D. J.; Emelyanova, I. V.; Donald, G. E.; Garner, G. M.

    The modelling of livestock movements within Australia is of national importance for the purposes of the management and control of exotic disease spread, infrastructure development and the economic forecasting of livestock markets. In this paper an agent based model for forecasting livestock movements is presented. It models livestock movements from farm to farm through a saleyard. The decision of farmers to sell or buy cattle is often complex and involves many factors such as climate forecasts, commodity prices, the type of farm enterprise, the number of animals available and associated off-shore effects. In this model the farm agent's intelligence is implemented using a fuzzy decision tree that utilises two of these factors: the livestock price fetched at the last sale and the number of stock on the farm. On each iteration of the model, farms choose either to buy, sell or abstain from the market, thus creating an artificial supply and demand. The buyers and sellers then congregate at the saleyard, where livestock are auctioned using a second-price sealed bid. The price time series output by the model exhibits properties similar to those found in real livestock markets.
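    A compact sketch of the market step, with the fuzzy decision tree reduced to crisp thresholds on the two factors and a second-price sealed-bid auction at the saleyard (all numbers are illustrative):

        import random

        def farm_decision(last_price, stock, price_hi=100, stock_hi=500):
            # Crisp stand-in for the fuzzy decision tree on the two factors.
            if last_price > price_hi and stock > stock_hi:
                return "sell"
            if last_price < price_hi and stock < stock_hi:
                return "buy"
            return "abstain"

        def second_price_auction(bids):
            """bids: {buyer: sealed bid}. Winner pays the second-highest bid."""
            if len(bids) < 2:
                return None
            ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
            winner, (_, price) = ranked[0][0], ranked[1]
            return winner, price

        bids = {f"farm{i}": random.uniform(80, 120) for i in range(5)}
        print(second_price_auction(bids))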

  3. Model-Based Trace-Checking

    CERN Document Server

    Howard, Y; Gravell, A; Ferreira, C; Augusto, J C

    2011-01-01

    Trace analysis can be a useful way to discover problems in a program under test. Rather than writing a special purpose trace analysis tool, this paper proposes that traces can usefully be analysed by checking them against a formal model using a standard model-checker or else an animator for executable specifications. These techniques are illustrated using a Travel Agent case study implemented in J2EE. We added trace beans to this code that write trace information to a database. The traces are then extracted and converted into a form suitable for analysis by Spin, a popular model-checker, and Pro-B, a model-checker and animator for the B notation. This illustrates the technique, and also the fact that such a system can have a variety of models, in different notations, that capture different features. These experiments have demonstrated that model-based trace-checking is feasible. Future work is focussed on scaling up the approach to larger systems by increasing the level of automation.

  4. Ecosystem Based Business Model of Smart Grid

    OpenAIRE

    Lundgaard, Morten Raahauge; Ma, Zheng; Jørgensen, Bo Nørregaard

    2015-01-01

    This paper tries to investigate the ecosystem based business model in a smart grid infrastructure and the potential of value capture in the highly complex macro infrastructure such as smart grid. This paper proposes an alternative perspective to study the smart grid business ecosystem to support the infrastructural challenges, such as the interoperability of business components for smart grid. So far little research has explored the business ecosystem in the smart grid concept. The study on t...

  5. Market Segmentation Using Bayesian Model Based Clustering

    OpenAIRE

    Van Hattum, P.

    2009-01-01

    This dissertation deals with two basic problems in marketing, namely market segmentation, which is the grouping of persons who share common aspects, and market targeting, which is focusing your marketing efforts on one or more attractive market segments. For the grouping of persons who share common aspects, a Bayesian model based clustering approach is proposed such that it can be applied to data sets that are specifically used for market segmentation. The cluster algorithm can handle very l...
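
    The dissertation's Bayesian algorithm is not reproduced in this record; as a generic stand-in for the model-based clustering idea, the sketch below fits a Gaussian mixture by EM to synthetic two-segment data with scikit-learn:

```python
# Generic model-based clustering sketch (Gaussian mixture fitted by EM),
# standing in for the dissertation's Bayesian approach. Data are synthetic.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Two synthetic customer "segments" described by two numeric attributes.
X = np.vstack([rng.normal([0, 0], 1.0, (100, 2)),
               rng.normal([4, 4], 1.0, (100, 2))])

gm = GaussianMixture(n_components=2, random_state=0).fit(X)
segments = gm.predict(X)          # hard segment assignment per person
print(np.bincount(segments))      # segment sizes
```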

  6. Model based risk assessment - the CORAS framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune; Thunem, Atoosa P-J.

    2004-04-15

    Traditional risk analysis and assessment is based on failure-oriented models of the system. In contrast to this, model-based risk assessment (MBRA) utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The target models are then used as input sources for complementary risk analysis and assessment techniques, as well as a basis for the documentation of the assessment results. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tested with successful outcome through a series of seven trials within the telemedicine and e-commerce areas. The CORAS project in general, and the CORAS application of MBRA in particular, have contributed positively to the visibility of model-based risk assessment and thus to the disclosure of several potentials for further exploitation of various aspects within this important research field. In that connection, the CORAS methodology's possibilities for further improvement towards utilization in more complex architectures, and also in other application domains such as the nuclear field, can be addressed. The latter calls for adapting the framework to address nuclear standards such as IEC 60880 and IEC 61513. For this development we recommend applying a trial-driven approach within the nuclear field. The tool-supported approach for combining risk analysis and system development also fits well with the HRP proposal for developing an Integrated Design Environment (IDE) providing efficient methods and tools to support control room systems design. (Author)

  7. Genome Informed Trait-Based Models

    Science.gov (United States)

    Karaoz, U.; Cheng, Y.; Bouskill, N.; Tang, J.; Beller, H. R.; Brodie, E.; Riley, W. J.

    2013-12-01

    Trait-based approaches are powerful tools for representing microbial communities across both spatial and temporal scales within ecosystem models. Trait-based models (TBMs) represent the diversity of microbial taxa as stochastic assemblages with a distribution of traits constrained by trade-offs between these traits. Such representation, with its built-in stochasticity, allows the elucidation of the interactions between the microbes and their environment by reducing the complexity of microbial community diversity into a limited number of functional 'guilds' and letting them emerge across spatio-temporal scales. From the biogeochemical/ecosystem modeling perspective, the emergent properties of the microbial community can be directly translated into predictions of biogeochemical reaction rates and microbial biomass. The accuracy of TBMs depends on the identification of key traits of the microbial community members and on the parameterization of these traits. Current approaches to inform TBM parameterization are empirical (i.e., based on literature surveys). Advances in omic technologies (such as genomics, metagenomics, metatranscriptomics, and metaproteomics) pave the way to better initialize models that can be constrained in a generic or site-specific fashion. Here we describe the coupling of metagenomic data to the development of a TBM representing the dynamics of metabolic guilds from an organic carbon stimulated groundwater microbial community. Illumina paired-end metagenomic data were collected from the community as it transitioned successively through electron-accepting conditions (nitrate-, sulfate-, and Fe(III)-reducing), and used to inform estimates of growth rates and the distribution of metabolic pathways (i.e., aerobic and anaerobic oxidation, fermentation) across a spatially resolved TBM. We use this model to evaluate the emergence of different metabolisms and predict rates of biogeochemical processes over time. We compare our results to observational

  8. On Reading-Based Writing Instruction Model

    Institute of Scientific and Technical Information of China (English)

    李大艳; 王建安

    2012-01-01

    English writing is a complex, integrative process of comprehensive skills. Many students are still unable to write a coherent English paragraph after having learned English for many years at school. Helping college students improve their writing competence is a great challenge facing English teaching in China. Research on writing instruction abroad has flourished; in China, however, research in this field lags far behind. There is a great need to search for a more efficient writing instruction model that can serve well in the Chinese context. Enlightened by Krashen's input hypothesis and Swain's output hypothesis, the writer puts forward the Reading-Based Writing Instruction Model. This paper discusses the effectiveness of this model from different perspectives.

  9. Model based monitoring of stormwater runoff quality

    DEFF Research Database (Denmark)

    Birch, Heidi; Vezzaro, Luca; Mikkelsen, Peter Steen

    2012-01-01

    the information obtained about MPs discharged from the monitored system. A dynamic stormwater quality model was calibrated using MP data collected by volume-proportional and passive sampling in a storm drainage system in the outskirts of Copenhagen (Denmark) and a 10-year rain series was used to find annual......Monitoring of micropollutants (MP) in stormwater is essential to evaluate the impacts of stormwater on the receiving aquatic environment. The aim of this study was to investigate how different strategies for monitoring of stormwater quality (combination of model with field sampling) affect...... average and maximum event mean concentrations. Use of this model reduced the uncertainty of predicted annual average concentrations compared to a simple stochastic method based solely on data. The predicted annual average obtained by using passive sampler measurements (one month installation...

  10. Realistic face modeling based on multiple deformations

    Institute of Scientific and Technical Information of China (English)

    GONG Xun; WANG Guo-yin

    2007-01-01

    On the basis of the assumption that the human face belongs to a linear class, a multiple-deformation model is proposed to recover face shape from a few points on a single 2D image. Compared to the conventional methods, this study has the following advantages. First, the proposed modified 3D sparse deforming model is a noniterative approach that can compute global translation efficiently and accurately. Subsequently, the overfitting problem can be alleviated based on the proposed multiple deformation model. Finally, by keeping the main features, the texture generated is realistic. The comparison results show that this novel method outperforms the existing methods by using ground truth data and that realistic 3D faces can be recovered efficiently from a single photograph.

  11. Business Models for NFC based mobile payments

    Directory of Open Access Journals (Sweden)

    Johannes Sang Un Chae

    2015-01-01

    Purpose: The purpose of the paper is to develop a business model framework for NFC based mobile payment solutions consisting of four mutually interdependent components: the value service, value network, value architecture, and value finance. Design: Using a comparative case study method, the paper investigates Google Wallet and ISIS Mobile Wallet and their underlying business models. Findings: Google Wallet and ISIS Mobile Wallet are focusing on providing an enhanced customer experience with their mobile wallet through a multifaceted value proposition. The delivery of its offering requires cooperation from multiple stakeholders and the creation of an ecosystem. Furthermore, they focus on the scalability of their value propositions. Originality / value: The paper offers an applicable business model framework that allows practitioners and academics to study current and future mobile payment approaches.

  12. Business Models for NFC Based Mobile Payments

    DEFF Research Database (Denmark)

    Chae, Johannes Sang-Un; Hedman, Jonas

    2015-01-01

    from multiple stakeholders and the creation of an ecosystem. Furthermore, they focus on the scalability of their value propositions. Originality / value: The paper offers an applicable business model framework that allows practitioners and academics to study current and future mobile payment approaches.......Purpose: The purpose of the paper is to develop a business model framework for NFC based mobile payment solutions consisting of four mutually interdependent components: the value service, value network, value architecture, and value finance. Design: Using a comparative case study method, the paper...... investigates Google Wallet and ISIS Mobile Wallet and their underlying business models. Findings: Google Wallet and ISIS Mobile Wallet are focusing on providing an enhanced customer experience with their mobile wallet through a multifaceted value proposition. The delivery of its offering requires cooperation...

  13. Néron models and base change

    CERN Document Server

    Halle, Lars Halvard

    2016-01-01

    Presenting the first systematic treatment of the behavior of Néron models under ramified base change, this book can be read as an introduction to various subtle invariants and constructions related to Néron models of semi-abelian varieties, motivated by concrete research problems and complemented with explicit examples. Néron models of abelian and semi-abelian varieties have become an indispensable tool in algebraic and arithmetic geometry since Néron introduced them in his seminal 1964 paper. Applications range from the theory of heights in Diophantine geometry to Hodge theory. We focus specifically on Néron component groups, Edixhoven’s filtration and the base change conductor of Chai and Yu, and we study these invariants using various techniques such as models of curves, sheaves on Grothendieck sites and non-archimedean uniformization. We then apply our results to the study of motivic zeta functions of abelian varieties. The final chapter contains a list of challenging open questions. This book is a...

  14. Concept Tree Based Information Retrieval Model

    Directory of Open Access Journals (Sweden)

    Chunyan Yuan

    2014-05-01

    This paper proposes a novel concept-based query expansion technique named the Markov concept tree model (MCTM), which discovers term relationships through a concept tree deduced from a term Markov network. We address two important issues for query expansion: the selection and the weighting of expansion search terms. In contrast to earlier methods, queries are expanded by adding those terms that are most similar to the concept of the query, rather than by selecting terms that are similar to a single query term. Utilizing a Markov network constructed from the co-occurrence information of the terms in the collection, the method generates a concept tree for each original query term, removes the redundant and irrelevant nodes in the concept tree, and then adjusts the weight of the original query and the weight of the expansion terms based on a pruning algorithm. We use this model for query expansion and evaluate its effectiveness by examining the accuracy and robustness of the expansion methods. Compared with the baseline model, experiments on a standard dataset reveal that this method achieves better query quality
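
    A hedged sketch of the co-occurrence idea underlying MCTM (not the full tree construction or pruning steps): build term-term weights from document co-occurrence and expand a query term with its strongest neighbours. The toy corpus is invented:

```python
# Sketch of co-occurrence-based query expansion, the raw material from
# which MCTM builds its Markov network and concept trees.
from collections import defaultdict
from itertools import combinations

docs = [["solar", "panel", "energy"],
        ["solar", "energy", "storage"],
        ["battery", "storage", "energy"]]

# Symmetric co-occurrence weights over unordered term pairs.
cooc = defaultdict(float)
for doc in docs:
    for a, b in combinations(set(doc), 2):
        cooc[frozenset((a, b))] += 1.0

def expand(term, k=2):
    """Return the k terms most strongly co-occurring with `term`."""
    scores = {other: w for pair, w in cooc.items() if term in pair
              for other in pair - {term}}
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(expand("solar"))   # e.g. ['energy', 'panel']
```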

  15. Inference-based procedural modeling of solids

    KAUST Repository

    Biggers, Keith

    2011-11-01

    As virtual environments become larger and more complex, there is an increasing need for more automated construction algorithms to support the development process. We present an approach for modeling solids by combining prior examples with a simple sketch. Our algorithm uses an inference-based approach to incrementally fit patches together in a consistent fashion to define the boundary of an object. This algorithm samples and extracts surface patches from input models, and develops a Petri net structure that describes the relationship between patches along an imposed parameterization. Then, given a new parameterized line or curve, we use the Petri net to logically fit patches together in a manner consistent with the input model. This allows us to easily construct objects of varying sizes and configurations using arbitrary articulation, repetition, and interchanging of parts. The result of our process is a solid model representation of the constructed object that can be integrated into a simulation-based environment. © 2011 Elsevier Ltd. All rights reserved.

  16. Model-based vision for car following

    Science.gov (United States)

    Schneiderman, Henry; Nashman, Marilyn; Lumia, Ronald

    1993-08-01

    This paper describes a vision processing algorithm that supports autonomous car following. The algorithm visually tracks the position of a "lead vehicle" from the vantage of a pursuing "chase vehicle." The algorithm requires a 2-D model of the back of the lead vehicle. This model is composed of line segments corresponding to features that give rise to strong edges. There are seven sequential stages of computation: (1) extracting edge points; (2) associating extracted edge points with the model features; (3) determining the position of each model feature; (4) determining the model position; (5) updating the motion model of the object; (6) predicting the position of the object in the next image; (7) predicting the location of all object features from the prediction of the object position. All processing is confined to the 2-D image plane. The 2-D model location computed in this processing is used to determine the position of the lead vehicle with respect to a 3-D coordinate frame affixed to the chase vehicle. This algorithm has been used as part of a complete system to drive an autonomous vehicle, a High Mobility Multipurpose Wheeled Vehicle (HMMWV), such that it follows a lead vehicle at speeds up to 35 km/hr. The algorithm runs at an update rate of 15 Hertz and has a worst case computational delay of 128 ms. The algorithm is implemented under the NASA/NBS Standard Reference Model for Telerobotic Control System Architecture (NASREM) and runs on a dedicated vision processing engine and a VME-based multiprocessor system.
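
    Stages 4-7 amount to a predict/update tracking loop. The sketch below shows that loop with a simple constant-velocity motion model at the paper's 15 Hz update rate; the pixel measurements are fabricated for illustration:

```python
# Predict/update loop sketch: a constant-velocity motion model predicts
# where the lead vehicle's 2-D model will appear in the next frame.
def track(measurements, dt=1 / 15):        # 15 Hz update rate, as in the paper
    pos, vel = measurements[0], (0.0, 0.0)
    for meas in measurements[1:]:
        # Predict the object position in the next image (stage 6).
        pred = (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)
        # Update the motion model from the new measurement (stage 5).
        vel = ((meas[0] - pos[0]) / dt, (meas[1] - pos[1]) / dt)
        pos = meas
        print(f"predicted {pred}, observed {meas}")
    return pos, vel

track([(100.0, 50.0), (102.0, 50.5), (104.1, 51.0)])
```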

  17. Model-based explanation of plant knowledge

    Energy Technology Data Exchange (ETDEWEB)

    Huuskonen, P.J. [VTT Electronics, Oulu (Finland). Embedded Software

    1997-12-31

    This thesis deals with computer explanation of knowledge related to design and operation of industrial plants. The needs for explanation are motivated through case studies and literature reviews. A general framework for analysing plant explanations is presented. Prototypes demonstrate key mechanisms for implementing parts of the framework. Power plants, steel mills, paper factories, and high energy physics control systems are studied to set requirements for explanation. The main problems are seen to be either lack or abundance of information. Design knowledge in particular is found missing at plants. Support systems and automation should be enhanced with ways to explain plant knowledge to the plant staff. A framework is formulated for analysing explanations of plant knowledge. It consists of three parts: 1. a typology of explanation, organised by the class of knowledge (factual, functional, or strategic) and by the target of explanation (processes, automation, or support systems), 2. an identification of explanation tasks generic for the plant domain, and 3. an identification of essential model types for explanation (structural, behavioural, functional, and teleological). The tasks use the models to create the explanations of the given classes. Key mechanisms are discussed to implement the generic explanation tasks. Knowledge representations based on objects and their relations form a vocabulary to model and present plant knowledge. A particular class of models, means-end models, are used to explain plant knowledge. Explanations are generated through searches in the models. Hypertext is adopted to communicate explanations over dialogue based on context. The results are demonstrated in prototypes. The VICE prototype explains the reasoning of an expert system for diagnosis of rotating machines at power plants. The Justifier prototype explains design knowledge obtained from an object-oriented plant design tool. Enhanced access mechanisms into on-line documentation are

  18. MRF model and FRAME model-based unsupervised image segmentation

    Institute of Scientific and Technical Information of China (English)

    CHENG Bing; WANG Ying; ZHENG Nanning; JIA Xinchun; BIAN Zhengzhong

    2004-01-01

    This paper presents a method for unsupervised segmentation of images consisting of multiple textures. The images under study are modeled by a proposed hierarchical random field model, which has two layers. The first layer is modeled as a Markov Random Field (MRF) representing an unobservable region image, and the second layer uses the "Filters, Random fields And Maximum Entropy" (FRAME) model to represent the multiple textures which cover each region. Compared with the traditional Hierarchical Markov Random Field (HMRF), FRAME can use a bigger neighborhood system and model more complex patterns. The segmentation problem is formulated as Maximum a Posteriori (MAP) estimation according to the Bayesian rule. The iterated conditional modes (ICM) algorithm is carried out to find the solution of the MAP estimation. An algorithm based on the local entropy rate is proposed to simplify the estimation of the parameters of the MRF. The parameters of FRAME are estimated by the Expectation-Maximization (EM) algorithm. Finally, an experiment with synthesized and real images is given, which shows that the method can segment images with complex textures efficiently and is robust to noise.
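
    As a compact illustration of the ICM step (a deliberate stand-in for the paper's full MRF+FRAME hierarchy), the following sketch relabels pixels of a synthetic two-class image by minimizing a data term plus a Potts smoothness prior:

```python
# Minimal ICM sketch for MAP labeling under a Potts-style MRF prior.
import numpy as np

rng = np.random.default_rng(1)
img = np.concatenate([rng.normal(0, 1, (8, 16)), rng.normal(3, 1, (8, 16))])
labels = (img > 1.5).astype(int)          # crude initial segmentation
means, beta = np.array([0.0, 3.0]), 1.0   # class means, smoothness weight

for _ in range(5):                         # ICM sweeps
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            costs = []
            for k in range(2):
                data = (img[i, j] - means[k]) ** 2       # data term
                nb = [labels[x, y] for x, y in
                      ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                      if 0 <= x < img.shape[0] and 0 <= y < img.shape[1]]
                smooth = beta * sum(k != l for l in nb)   # Potts prior
                costs.append(data + smooth)
            labels[i, j] = int(np.argmin(costs))

print(labels.mean(axis=1).round(2))        # top rows near 0, bottom near 1
```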

  19. Family-Based Model Checking Without a Family-Based Model Checker

    DEFF Research Database (Denmark)

    Dimovski, Aleksandar; Al-Sibahi, Ahmad Salim; Brabrand, Claus;

    2015-01-01

    be used to model-check variational models using the standard version of (single system) SPIN. The abstractions are first defined as Galois connections on semantic domains. We then show how to translate them into syntactic source-to-source transformations on variational models. This allows the use of SPIN...... with all its accumulated optimizations for efficient verification of variational models without any knowledge about variability. We demonstrate the practicality of this method on several examples using both the SNIP (family based) and SPIN (single system) model checkers....

  20. Model based systems engineering for astronomical projects

    Science.gov (United States)

    Karban, R.; Andolfato, L.; Bristow, P.; Chiozzi, G.; Esselborn, M.; Schilling, M.; Schmid, C.; Sommer, H.; Zamparelli, M.

    2014-08-01

    Model Based Systems Engineering (MBSE) is an emerging field of systems engineering for which the System Modeling Language (SysML) is a key enabler for descriptive, prescriptive and predictive models. This paper surveys some of the capabilities, expectations and peculiarities of tools-assisted MBSE experienced in real-life astronomical projects. The examples range in depth and scope across a wide spectrum of applications (for example documentation, requirements, analysis, trade studies) and purposes (addressing a particular development need, or accompanying a project throughout many - if not all - its lifecycle phases, fostering reuse and minimizing ambiguity). From the beginnings of the Active Phasing Experiment, through VLT instrumentation, VLTI infrastructure, Telescope Control System for the E-ELT, until Wavefront Control for the E-ELT, we show how stepwise refinements of tools, processes and methods have provided tangible benefits to customary system engineering activities like requirement flow-down, design trade studies, interfaces definition, and validation, by means of a variety of approaches (like Model Checking, Simulation, Model Transformation) and methodologies (like OOSEM, State Analysis)

  1. New global ICT-based business models

    DEFF Research Database (Denmark)

    The New Global Business model (NEWGIBM) book describes the background, theory references, case studies, results and learning imparted by the NEWGIBM project, which is supported by ICT, to a research group during the period from 2005-2011. The book is a result of the efforts and the collaborative work between SMEs, consultancies and researchers across various lines of business, competences and research domains. The book commences with a theoretical discussion of the business model and its innovation literature and explains how this was a collaborative study by researchers from three Danish...... The NEWGIBM Cases Show? The Strategy Concept in Light of the Increased Importance of Innovative Business Models; Successful Implementation of Global BM Innovation; Globalisation Of ICT Based Business Models: Today And In 2020

  2. Grid based calibration of SWAT hydrological models

    Directory of Open Access Journals (Sweden)

    D. Gorgan

    2012-07-01

    The calibration and execution of large hydrological models such as SWAT (Soil and Water Assessment Tool), developed for large areas, high resolution and huge input data, require not only long execution times but also substantial computational resources. The SWAT hydrological model supports studies and predictions of the impact of land management practices on water, sediment, and agricultural chemical yields in complex watersheds. The paper presents the gSWAT application as a practical web solution for environmental specialists to calibrate extensive hydrological models and run scenarios, hiding the complex control of processes and heterogeneous resources across the grid-based high-performance computation infrastructure. The paper highlights the basic functionalities of the gSWAT platform and the features of the graphical user interface. The presentation covers the development of working sessions, interactive control of calibration, direct and basic editing of parameters, process monitoring, and graphical and interactive visualization of the results. The experiments performed on different SWAT models, and the results obtained, demonstrate the benefits brought by the grid parallel and distributed environment as a processing platform. All the instances of SWAT models used in the reported experiments have been developed through the enviroGRIDS project, targeting the Black Sea catchment area.

  3. Modeling oil production based on symbolic regression

    International Nuclear Information System (INIS)

    Numerous models have been proposed to forecast the future trends of oil production and almost all of them are based on some predefined assumptions with various uncertainties. In this study, we propose a novel data-driven approach that uses symbolic regression to model oil production. We validate our approach on both synthetic and real data, and the results prove that symbolic regression could effectively identify the true models beneath the oil production data and also make reliable predictions. Symbolic regression indicates that world oil production will peak in 2021, which broadly agrees with other techniques used by researchers. Our results also show that the rate of decline after the peak is almost half the rate of increase before the peak, and it takes nearly 12 years to drop 4% from the peak. These predictions are more optimistic than those in several other reports, and the smoother decline will provide the world, especially the developing countries, with more time to orchestrate mitigation plans. -- Highlights: •A data-driven approach has been shown to be effective at modeling the oil production. •The Hubbert model could be discovered automatically from data. •The peak of world oil production is predicted to appear in 2021. •The decline rate after peak is half of the increase rate before peak. •Oil production projected to decline 4% post-peak
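
    The study discovers the model form automatically via symbolic regression; as a shorthand illustration of the Hubbert-type curve it reports, this sketch fits a symmetric logistic-derivative production curve to invented data with SciPy (this is ordinary curve fitting, not symbolic regression itself):

```python
# Fit a Hubbert-type production curve to synthetic data; the peak-year
# estimate plays the role of the paper's 2021 prediction (data invented).
import numpy as np
from scipy.optimize import curve_fit

def hubbert(t, q_max, t_peak, w):
    """Symmetric Hubbert curve: peaks at t_peak with height q_max."""
    e = np.exp(-(t - t_peak) / w)
    return 4 * q_max * e / (1 + e) ** 2

t = np.arange(1950, 2020, dtype=float)
y = hubbert(t, 30.0, 2021.0, 15.0) \
    + np.random.default_rng(2).normal(0, 0.3, t.size)

(q_max, t_peak, w), _ = curve_fit(hubbert, t, y, p0=(25.0, 2000.0, 10.0))
print(f"estimated peak year: {t_peak:.1f}")
```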

  4. Physics-based models of the plasmasphere

    Energy Technology Data Exchange (ETDEWEB)

    Jordanova, Vania K [Los Alamos National Laboratory; Pierrard, Vivane [BELGIUM; Goldstein, Jerry [SWRI; Andr' e, Nicolas [ESTEC/ESA; Kotova, Galina A [SRI, RUSSIA; Lemaire, Joseph F [BELGIUM; Liemohn, Mike W [U OF MICHIGAN; Matsui, H [UNIV OF NEW HAMPSHIRE

    2008-01-01

    We describe recent progress in physics-based models of the plasmasphere using the fluid and the kinetic approaches. Global modeling of the dynamics and influence of the plasmasphere is presented. Results from global plasmasphere simulations are used to understand and quantify (i) the electric potential pattern and evolution during geomagnetic storms, and (ii) the influence of the plasmasphere on the excitation of electromagnetic ion cyclotron (EMIC) waves and precipitation of energetic ions in the inner magnetosphere. The interactions of the plasmasphere with the ionosphere and the other regions of the magnetosphere are pointed out. We show the results of simulations for the formation of the plasmapause and discuss the influence of plasmaspheric wind and of ultra low frequency (ULF) waves on the transport of plasmaspheric material. Theoretical formulations used to model the electric field and plasma distribution in the plasmasphere are given. Model predictions are compared to recent CLUSTER and IMAGE observations, but also to results of earlier models and satellite observations.

  5. Flow based vs. demand based energy-water modelling

    Science.gov (United States)

    Rozos, Evangelos; Nikolopoulos, Dionysis; Efstratiadis, Andreas; Koukouvinos, Antonios; Makropoulos, Christos

    2015-04-01

    The water flow in hydro-power generation systems is often used downstream to cover other types of demand, such as irrigation and water supply. However, the typical case is that the energy demand (operation of the hydro-power plant) and the water demand do not coincide. Furthermore, the water inflow into a reservoir is a stochastic process. Things become more complicated if renewable resources (wind turbines or photovoltaic panels) are included in the system. For this reason, the assessment and optimization of the operation of hydro-power systems are challenging tasks that require computer modelling. This modelling should not only simulate the water budget of the reservoirs and the energy production/consumption (pumped storage), but should also take into account the constraints imposed by the natural or artificial water network using a flow routing algorithm. HYDRONOMEAS, for example, uses an elegant mathematical approach (digraph) to calculate the flow in a water network based on: the demands (input timeseries), the water availability (simulated) and the capacity of the transmission components (properties of channels, rivers, pipes, etc.). The input timeseries of demand should be estimated by another model and linked to the corresponding network nodes. A model that could be used to estimate these timeseries is UWOT. UWOT is a bottom-up urban water cycle model that simulates the generation, aggregation and routing of water demand signals. In this study, we explore the potential of UWOT in simulating the operation of complex hydrosystems that include energy generation. The evident advantage of this approach is the use of a single model instead of one model for the estimation of demands and another for the system simulation. An application of UWOT to a large-scale system is attempted in mainland Greece, in an area extending over 130×170 km². The challenges, the peculiarities and the advantages of this approach are examined and critically discussed.

  6. Model-based Prognostics with Concurrent Damage Progression Processes

    Data.gov (United States)

    National Aeronautics and Space Administration — Model-based prognostics approaches rely on physics-based models that describe the behavior of systems and their components. These models must account for the...

  7. A Comparison of Filter-based Approaches for Model-based Prognostics

    Data.gov (United States)

    National Aeronautics and Space Administration — Model-based prognostics approaches use domain knowledge about a system and its failure modes through the use of physics-based models. Model-based prognosis is...

  8. Effect of species rarity on the accuracy of species distribution models for reptiles and amphibians in southern California

    Science.gov (United States)

    Franklin, J.; Wejnert, K.E.; Hathaway, S.A.; Rochester, C.J.; Fisher, R.N.

    2009-01-01

    Aim: Several studies have found that more accurate predictive models of species' occurrences can be developed for rarer species; however, one recent study found the relationship between range size and model performance to be an artefact of sample prevalence, that is, the proportion of presence versus absence observations in the data used to train the model. We examined the effect of model type, species rarity class, species' survey frequency, detectability and manipulated sample prevalence on the accuracy of distribution models developed for 30 reptile and amphibian species. Location: Coastal southern California, USA. Methods: Classification trees, generalized additive models and generalized linear models were developed using species presence and absence data from 420 locations. Model performance was measured using sensitivity, specificity and the area under the curve (AUC) of the receiver-operating characteristic (ROC) plot based on twofold cross-validation, or on bootstrapping. Predictors included climate, terrain, soil and vegetation variables. Species were assigned to rarity classes by experts. The data were sampled to generate subsets with varying ratios of presences and absences to test for the effect of sample prevalence. Join count statistics were used to characterize spatial dependence in the prediction errors. Results: Species in classes with higher rarity were more accurately predicted than common species, and this effect was independent of sample prevalence. Although positive spatial autocorrelation remained in the prediction errors, it was weaker than was observed in the species occurrence data. The differences in accuracy among model types were slight. Main conclusions: Using a variety of modelling methods, more accurate species distribution models were developed for rarer than for more common species. This was presumably because it is difficult to discriminate suitable from unsuitable habitat for habitat generalists, and not as an artefact of the
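
    The accuracy measures named here (sensitivity, specificity and ROC AUC under twofold cross-validation) can be illustrated on synthetic presence/absence data; the sketch below uses scikit-learn with a logistic regression as a stand-in for the paper's three model families:

```python
# Twofold cross-validated sensitivity, specificity and AUC on synthetic
# presence/absence data (classifier and data are illustrative stand-ins).
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix, roc_auc_score
from sklearn.model_selection import StratifiedKFold

# 420 "sites", presences the minority class (low sample prevalence).
X, y = make_classification(n_samples=420, weights=[0.8], random_state=0)

for train, test in StratifiedKFold(n_splits=2, shuffle=True,
                                   random_state=0).split(X, y):
    model = LogisticRegression(max_iter=1000).fit(X[train], y[train])
    p = model.predict_proba(X[test])[:, 1]
    tn, fp, fn, tp = confusion_matrix(y[test], (p > 0.5).astype(int)).ravel()
    print(f"sens={tp / (tp + fn):.2f}  spec={tn / (tn + fp):.2f}  "
          f"AUC={roc_auc_score(y[test], p):.2f}")
```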

  9. Model-based phase-shifting interferometer

    Science.gov (United States)

    Liu, Dong; Zhang, Lei; Shi, Tu; Yang, Yongying; Chong, Shiyao; Miao, Liang; Huang, Wei; Shen, Yibing; Bai, Jian

    2015-10-01

    A model-based phase-shifting interferometer (MPI) is developed, in which a novel calculation technique is proposed in place of the traditional complicated system structure, to achieve versatile, high-precision, quantitative surface tests. In the MPI, a partial null lens (PNL) is employed to implement the non-null test. With a set of alternative PNLs, similar to the transmission spheres in ZYGO interferometers, the MPI provides a flexible test for general spherical and aspherical surfaces. Based on modern computer modeling techniques, a reverse iterative optimizing construction (ROR) method is employed for the retrace error correction of the non-null test, as well as for figure error reconstruction. A self-compiled ray-tracing program is set up for accurate system modeling and reverse ray tracing. The surface figure error can then be easily extracted from the wavefront data in the form of Zernike polynomials by the ROR method. Experiments on spherical and aspherical tests are presented to validate the flexibility and accuracy. The test results are compared with those of a ZYGO interferometer (null tests), which demonstrates the high accuracy of the MPI. With such accuracy and flexibility, the MPI has large potential in modern optical shop testing.

  10. Human physiologically based pharmacokinetic model for propofol

    Directory of Open Access Journals (Sweden)

    Schnider Thomas W

    2005-04-01

    Background: Propofol is widely used for both short-term anesthesia and long-term sedation. It has unusual pharmacokinetics because of its high lipid solubility. The standard approach to describing the pharmacokinetics is by a multi-compartmental model. This paper presents the first detailed human physiologically based pharmacokinetic (PBPK) model for propofol. Methods: PKQuest, a freely distributed software routine (http://www.pkquest.com), was used for all the calculations. The "standard human" PBPK parameters developed in previous applications are used. It is assumed that blood and tissue binding is determined by simple partition into the tissue lipid, which is characterized by two previously determined sets of parameters: 1) the value of the propofol oil/water partition coefficient; 2) the lipid fraction in the blood and tissues. The model was fit to the individual experimental data of Schnider et al. (Anesthesiology, 1998; 88:1170), in which an initial bolus dose was followed 60 minutes later by a one-hour constant infusion. Results: The PBPK model provides a good description of the experimental data over a large range of input dosage, subject age and fat fraction. Only one adjustable parameter (the liver clearance) is required to describe the constant infusion phase for each individual subject. In order to fit the bolus injection phase, for 10 of the 24 subjects it was necessary to assume that a fraction of the bolus dose was sequestered and then slowly released from the lungs (characterized by two additional parameters). The average weighted residual error (WRE) of the PBPK model fit to both the bolus and infusion phases was 15%, similar to the WRE for just the constant infusion phase obtained by Schnider et al. using a 6-parameter NONMEM compartmental model. Conclusion: A PBPK model using standard human parameters and a simple description of tissue binding provides a good description of human propofol kinetics. The major advantage of a
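
    PKQuest itself is not shown in this record; to illustrate the flow-limited, partition-based structure of a PBPK model, here is a deliberately tiny three-compartment sketch. All volumes, flows, partition coefficients and the clearance value are invented, not the paper's parameters:

```python
# Toy flow-limited PBPK sketch: one blood pool and two tissues whose
# uptake is governed by blood flow and tissue:blood partition coefficients.
import numpy as np
from scipy.integrate import solve_ivp

V = {"blood": 5.0, "fat": 12.0, "lean": 40.0}   # litres (invented)
Q = {"fat": 0.4, "lean": 4.0}                    # L/min tissue blood flow
K = {"fat": 10.0, "lean": 1.5}                   # tissue:blood partition
CL = 1.8                                         # L/min hepatic clearance

def rhs(t, c):
    cb, cf, cl = c                               # blood, fat, lean conc.
    dcf = Q["fat"] * (cb - cf / K["fat"]) / V["fat"]
    dcl = Q["lean"] * (cb - cl / K["lean"]) / V["lean"]
    dcb = (-Q["fat"] * (cb - cf / K["fat"])
           - Q["lean"] * (cb - cl / K["lean"]) - CL * cb) / V["blood"]
    return [dcb, dcf, dcl]

sol = solve_ivp(rhs, (0, 60), [40.0, 0.0, 0.0], t_eval=np.linspace(0, 60, 7))
print(sol.y[0].round(2))   # blood concentration after a bolus, over 60 min
```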

  11. Trip Generation Model Based on Destination Attractiveness

    Institute of Scientific and Technical Information of China (English)

    YAO Liya; GUAN Hongzhi; YAN Hai

    2008-01-01

    Traditional trip generation forecasting methods use unified average trip generation rates to determine trip generation volumes in various traffic zones without considering the individual characteristics of each traffic zone. Therefore, the results can have significant errors. To reduce the forecasting error produced by uniform trip generation rates for different traffic zones, the behavior of each traveler was studied instead of the characteristics of the traffic zone. This paper gives a method for calculating the trip efficiency and the effect of traffic zones combined with a destination selection model based on disaggregate theory for trip generation. Beijing data is used with the trip generation method to predict trip volumes. The results show that the disaggregate model in this paper is more accurate than the traditional method. An analysis of the factors influencing traveler behavior and destination selection shows that the attractiveness of the traffic zone strongly affects the trip generation volume.

  12. Cloth Modeling Based on Particle System

    Institute of Scientific and Technical Information of China (English)

    钟跃崎; 王善元

    2001-01-01

    A physics-based particle system is employed for cloth modeling, supported by two basic algorithms: one is the construction of the internal and external forces acting on the particle system in terms of KES-F bending and shearing tests; the other is the collision algorithm, in which collision detection is carried out by means of bisection of the time step and the collision response is handled according to the empirical law for frictionless collision. With these algorithms, the geometric state of particles can be expressed as ordinary differential equations, which are numerically solved by fourth-order Runge-Kutta integration. Different draping figures of cotton fabric and wool fabric prove that such a particle system is suitable for 3D cloth modeling and simulation.
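
    The fourth-order Runge-Kutta step named here can be sketched directly; the following toy integrates a single particle's position and velocity under an invented spring-plus-gravity force (constants illustrative only):

```python
# Classic RK4 integration of particle state (position, velocity).
import numpy as np

K_SPRING, MASS, G = 50.0, 0.01, np.array([0.0, -9.81])

def deriv(state):
    pos, vel = state
    force = -K_SPRING * pos + MASS * G      # anchor spring + gravity
    return np.array([vel, force / MASS])    # d(pos)/dt, d(vel)/dt

def rk4_step(state, h):
    k1 = deriv(state)
    k2 = deriv(state + 0.5 * h * k1)
    k3 = deriv(state + 0.5 * h * k2)
    k4 = deriv(state + h * k3)
    return state + h * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0

state = np.array([[0.1, 0.0], [0.0, 0.0]])  # initial displacement, at rest
for _ in range(100):
    state = rk4_step(state, 1e-3)
print(state[0])                              # position after 0.1 s
```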

  13. CNEM: Cluster Based Network Evolution Model

    Directory of Open Access Journals (Sweden)

    Sarwat Nizamani

    2015-01-01

    This paper presents a network evolution model based on a clustering approach. The proposed approach depicts network evolution, demonstrating network formation from individual nodes to a fully evolved network. An agglomerative hierarchical clustering method is applied for the evolution of the network. In the paper, we present three case studies which show the evolution of networks from scratch. These case studies include: the terrorist network of the 9/11 incidents, the terrorist network of the WMD (Weapons of Mass Destruction) plot against France, and a network of tweets discussing a topic. The network of 9/11 is also used for evaluation, using other social network analysis methods, which show that the clusters created using the proposed model of network evolution are of good quality; thus the proposed method can be used by law enforcement agencies in order to further investigate criminal networks

  14. Agent based modeling in tactical wargaming

    Science.gov (United States)

    James, Alex; Hanratty, Timothy P.; Tuttle, Daniel C.; Coles, John B.

    2016-05-01

    Army staffs at division, brigade, and battalion levels often plan for contingency operations. As such, analysts consider the impact and potential consequences of actions taken. The Army Military Decision-Making Process (MDMP) dictates identification and evaluation of possible enemy courses of action; however, non-state actors often do not exhibit the same level and consistency of planned actions that the MDMP was originally designed to anticipate. The fourth MDMP step is a particular challenge, wargaming courses of action within the context of complex social-cultural behaviors. Agent-based Modeling (ABM) and its resulting emergent behavior is a potential solution to model terrain in terms of the human domain and improve the results and rigor of the traditional wargaming process.

  15. Agent-based modeling and simulation

    CERN Document Server

    Taylor, Simon

    2014-01-01

    Operational Research (OR) deals with the use of advanced analytical methods to support better decision-making. It is multidisciplinary with strong links to management science, decision science, computer science and many application areas such as engineering, manufacturing, commerce and healthcare. In the study of emergent behaviour in complex adaptive systems, Agent-based Modelling & Simulation (ABMS) is being used in many different domains such as healthcare, energy, evacuation, commerce, manufacturing and defense. This collection of articles presents a convenient introduction to ABMS with pa

  16. Model-based vision using geometric hashing

    Science.gov (United States)

    Akerman, Alexander, III; Patton, Ronald

    1991-04-01

    The Geometric Hashing technique developed by the NYU Courant Institute has been applied to various automatic target recognition applications. In particular, I-MATH has extended the hashing algorithm to perform automatic target recognition of synthetic aperture radar (SAR) imagery. For this application, the hashing is performed upon the geometric locations of dominant scatterers. In addition to being a robust model-based matching algorithm -- invariant under translation, scale, and 3D rotations of the target -- hashing is of particular utility because it can still perform effective matching when the target is partially obscured. Moreover, hashing is very amenable to a SIMD parallel processing architecture, and thus potentially realtime implementable.
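
    A compact sketch of 2-D geometric hashing, a toy stand-in for the SAR scatterer matcher: express every point in every ordered two-point basis, store the quantized coordinates in a hash table, and let a transformed, partially occluded view vote through the same table. The point layouts are invented:

```python
# 2-D geometric hashing sketch, translation/rotation/scale invariant via
# complex-number bases. Model and scene point sets are illustrative.
from collections import defaultdict
from itertools import permutations

def signatures(points):
    """Yield quantized coordinates of all points in every 2-point basis."""
    pts = [complex(x, y) for x, y in points]
    for i, j in permutations(range(len(pts)), 2):
        basis = pts[j] - pts[i]
        for k, p in enumerate(pts):
            if k not in (i, j):
                z = (p - pts[i]) / basis       # basis-relative coordinate
                yield (round(z.real, 1), round(z.imag, 1))

model = [(0, 0), (2, 0), (1, 1), (3, 2)]       # dominant scatterer layout
table = defaultdict(int)
for sig in signatures(model):
    table[sig] += 1

# A rotated, scaled, partially occluded view of the same layout still votes.
scene = [(10, 10), (10, 14), (8, 12)]          # subset under a similarity map
votes = sum(table[s] for s in signatures(scene))
print("votes:", votes)                          # > 0 indicates a match
```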

  17. Flow Modeling Based Wall Element Technique

    Directory of Open Access Journals (Sweden)

    Sabah Tamimi

    2012-08-01

    Two types of flow were examined, pressure flow and a combination of pressure and Couette flow, for confined turbulent flow, with a one-equation model used to depict the turbulent viscosity of confined flow in a smooth straight channel. A finite element technique based on a zone close to a solid wall was adopted for predicting the distribution of the pertinent variables in this zone, and was examined even in the case when the near-wall zone was extended away from the wall. The imposed technique has been validated and compares well with other techniques.

  18. Physics-Based Loop Surface Modeling

    Institute of Scientific and Technical Information of China (English)

    秦开怀; 常正义; 王华维; 李登高

    2002-01-01

    Strongly inspired by the research on physics-based dynamic models for surfaces, we propose a new method for precisely evaluating the dynamic parameters (mass, damping and stiffness matrices, and dynamic forces) for Loop surfaces without recursive subdivision, regardless of regular or irregular faces. It is shown that the thin-plate energy of Loop surfaces can be evaluated precisely and efficiently, even though there are extraordinary points in the initial meshes, unlike the previous dynamic Loop surface scheme. Hence, the new method presented for Loop surfaces is much more efficient than the previous schemes.

  19. SAT-Based Model Checking without Unrolling

    Science.gov (United States)

    Bradley, Aaron R.

    A new form of SAT-based symbolic model checking is described. Instead of unrolling the transition relation, it incrementally generates clauses that are inductive relative to (and augment) stepwise approximate reachability information. In this way, the algorithm gradually refines the property, eventually producing either an inductive strengthening of the property or a counterexample trace. Our experimental studies show that induction is a powerful tool for generalizing the unreachability of given error states: it can refine away many states at once, and it is effective at focusing the proof search on aspects of the transition system relevant to the property. Furthermore, the incremental structure of the algorithm lends itself to a parallel implementation.
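
    The core SAT query behind relative induction can be shown in a few lines. The sketch below is not the paper's full algorithm; it merely asks, with the z3 solver on an invented two-variable transition system, whether a clause c is inductive relative to a frame F, i.e. whether F ∧ c ∧ T ∧ ¬c′ is unsatisfiable:

```python
# Relative induction check: is clause c inductive relative to frame F?
# The transition system is a toy invented for illustration.
from z3 import Bools, BoolVal, And, Not, Or, Solver, unsat

x, y, xp, yp = Bools("x y xp yp")
T = And(xp == Or(x, y), yp == y)   # transition relation (x, y) -> (x', y')
F = BoolVal(True)                  # current frame (trivial over-approximation)
c = Or(x, Not(y))                  # candidate clause to push forward

s = Solver()
s.add(F, c, T, Not(Or(xp, Not(yp))))   # F and c and T and not(c')
print("inductive relative to F" if s.check() == unsat else "not inductive")
```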

  20. Quantum Mechanics Based Multiscale Modeling of Materials

    Science.gov (United States)

    Lu, Gang

    2013-03-01

    We present two quantum mechanics based multiscale approaches that can simulate extended defects in metals accurately and efficiently. The first approach (QCDFT) can treat multimillion atoms effectively via density functional theory (DFT). The method is an extension of the original quasicontinuum approach with DFT as its sole energetic formulation. The second method (QM/MM) has to do with quantum mechanics/molecular mechanics coupling based on the constrained density functional theory, which provides an exact framework for a self-consistent quantum mechanical embedding. Several important materials problems will be addressed using the multiscale modeling approaches, including hydrogen-assisted cracking in Al, magnetism-controlled dislocation properties in Fe and Si pipe diffusion along Al dislocation core. We acknowledge the support from the Office of Naval Research and the Army Research Office.

  1. Agent Based Modeling in Public Administration

    Directory of Open Access Journals (Sweden)

    Osman SEYHAN

    2013-06-01

    This study aims to explore the role of agent based modeling (ABM) as a simulation method in analyzing and formulating the policy making processes of modern public management, which is under the pressure of the information age and the socio-political demands of open societies. ABM is a simulative research method for understanding complex adaptive systems (cas) from the perspective of their constituent entities. In this study, by employing agent based computing and the NetLogo language, two case studies about organizational design and organizational risk analysis have been examined. Results revealed that ABM is an efficient platform for determining the optimum results from various scenarios in order to understand structures and processes of policy making in both organizational design and risk management. In the future, more research is needed on the role of ABM in understanding and making decisions on the future of cas, especially in conjunction with developments in computer technologies.

  2. Characterization and Architecture of Component Based Models

    Directory of Open Access Journals (Sweden)

    Er.Iqbaldeep kaur

    2011-01-01

    Component based Software Engineering (CBSE) is the most common term nowadays in the field of software development. The CBSE approach is based on the principle of 'Select and Use' rather than 'Design and Test' as in traditional software development methods. Since this trend of using and 'reusing' components is still in its developing stage, there are many advantages, as well as problems, that occur with the use of components. A series of papers is presented that covers various important and integral issues in the field concerned. This paper is an introductory study of the essential concepts, principles and steps that underlie the available commercialized models in CBD. This research work has a scope extending to component retrieval in repositories and their management, and implementing the results verification

  3. Integrated Semantic Similarity Model Based on Ontology

    Institute of Scientific and Technical Information of China (English)

    LIU Ya-Jun; ZHAO Yun

    2004-01-01

    To solve the problem of the inadequacy of semantic processing in intelligent question answering systems, an integrated semantic similarity model which calculates the semantic similarity using the geometric distance and information content is presented in this paper. With the help of the interrelationship between concepts, the information content of concepts and the strength of the edges in the ontology network, we can calculate the semantic similarity between two concepts and provide information for the further calculation of the semantic similarity between the user's question and the answers in the knowledge base. The results of the experiments on the prototype have shown that the semantic problem in natural language processing can also be solved with the help of the knowledge and the abundant semantic information in the ontology. More than 90% accuracy with less than 50 ms average searching time has been reached in the intelligent question answering prototype system based on ontology. The result is very satisfactory.
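
    One common way to realize an integrated measure of this kind is to mix a path-length (geometric) component with the information content of the lowest common subsumer; the sketch below does so over an invented four-node taxonomy, not the paper's ontology or weights:

```python
# Integrated similarity sketch: geometric (path-length) component blended
# with information content of the lowest common subsumer (LCS).
import math

parent = {"dog": "mammal", "cat": "mammal", "mammal": "animal", "animal": None}
freq = {"dog": 10, "cat": 12, "mammal": 30, "animal": 60}   # corpus counts
total = sum(freq.values())

def ancestors(c):
    out = []
    while c is not None:
        out.append(c)
        c = parent[c]
    return out

def similarity(a, b, alpha=0.5):
    pa, pb = ancestors(a), ancestors(b)
    lcs = next(c for c in pa if c in pb)        # lowest common subsumer
    path = pa.index(lcs) + pb.index(lcs)        # edge-count distance
    geo = 1.0 / (1.0 + path)                    # geometric component
    ic = -math.log(freq[lcs] / total)           # information content of LCS
    ic_norm = ic / -math.log(1 / total)         # scale IC into [0, 1]
    return alpha * geo + (1 - alpha) * ic_norm

print(round(similarity("dog", "cat"), 3))
```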

  4. Biologically based multistage modeling of radiation effects

    Energy Technology Data Exchange (ETDEWEB)

    William Hazelton; Suresh Moolgavkar; E. Georg Luebeck

    2005-08-30

    This past year we have made substantial progress in modeling the contribution of homeostatic regulation to low-dose radiation effects and carcinogenesis. We have worked to refine and apply our multistage carcinogenesis models to explicitly incorporate cell cycle states, simple and complex damage, checkpoint delay, slow and fast repair, differentiation, and apoptosis to study the effects of low-dose ionizing radiation in mouse intestinal crypts, as well as in other tissues. We have one paper accepted for publication in "Advances in Space Research", and another manuscript in preparation describing this work. I also wrote a chapter describing our combined cell-cycle and multistage carcinogenesis model that will be published in a book on stochastic carcinogenesis models edited by Wei-Yuan Tan. In addition, we organized and held a workshop on "Biologically Based Modeling of Human Health Effects of Low dose Ionizing Radiation", July 28-29, 2005 at Fred Hutchinson Cancer Research Center in Seattle, Washington. We had over 20 participants, including Mary Helen Barcellos-Hoff as keynote speaker, talks by most of the low-dose modelers in the DOE low-dose program, experimentalists including Les Redpath (and Mary Helen), Noelle Metting from DOE, and Tony Brooks. It appears that homeostatic regulation may be central to understanding low-dose radiation phenomena. The primary effects of ionizing radiation (IR) are cell killing, delayed cell cycling, and induction of mutations. However, homeostatic regulation causes cells that are killed or damaged by IR to eventually be replaced. Cells with an initiating mutation may have a replacement advantage, leading to clonal expansion of these initiated cells. Thus we have focused particularly on modeling effects that disturb homeostatic regulation as early steps in the carcinogenic process. There are two primary considerations that support our focus on homeostatic regulation. First, a number of

  5. Model based control of refrigeration systems

    Energy Technology Data Exchange (ETDEWEB)

    Sloth Larsen, L.F.

    2005-11-15

    The subject of this Ph.D. thesis is model based control of refrigeration systems. Model based control covers a variety of different types of control that incorporate mathematical models. In this thesis the main subject has therefore been restricted to system optimizing control. The optimizing control is divided into two layers, where the system oriented top layer deals with set-point optimizing control and the lower layer deals with dynamical optimizing control in the subsystems. The thesis has two main contributions, i.e. a novel approach for set-point optimization and a novel approach for desynchronization based on dynamical optimization. The focus in the development of the proposed set-point optimizing control has been on deriving a simple and general method that can easily be applied to various compositions of the same class of systems, such as refrigeration systems. The method is based on a set of parameter dependent static equations describing the considered process. By adapting the parameters to the given process, predicting the steady state and computing a steady state gradient of the cost function, the process can be driven continuously towards zero gradient, i.e. the optimum (if the cost function is convex). The method furthermore deals with system constraints by introducing barrier functions; hereby the best possible performance taking the given constraints into account can be obtained, e.g. under extreme operational conditions. The proposed method has been applied to a test refrigeration system, placed at Aalborg University, for minimization of the energy consumption. Here it was proved that by using general static parameter dependent system equations it was possible to drive the set-points close to the optimum and thus reduce the power consumption by up to 20%. In the dynamical optimizing layer the idea is to optimize the operation of the subsystems or groupings of subsystems that limit the obtainable system performance. In systems
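
    The set-point method described (drive the process along the negative steady-state cost gradient, with barrier functions enforcing constraints) can be caricatured in a few lines; the cost function, constraint and step size below are invented, not the thesis's equations:

```python
# Set-point optimization sketch: gradient descent on steady-state cost
# plus a log-barrier keeping the set-point inside its constraint.
import math

def cost(sp):                     # stand-in steady-state power cost
    return (sp - 4.0) ** 2 + 2.0

def barrier(sp, upper=6.0, mu=0.1):
    return -mu * math.log(upper - sp)    # grows sharply near sp = upper

def grad(f, sp, h=1e-5):          # central-difference steady-state gradient
    return (f(sp + h) - f(sp - h)) / (2 * h)

sp, step = 1.0, 0.1
for _ in range(200):              # drive continuously towards zero gradient
    g = grad(cost, sp) + grad(barrier, sp)
    sp -= step * g
print(round(sp, 3))               # settles near the constrained optimum
```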

  6. Models-Based Practice: Great White Hope or White Elephant?

    Science.gov (United States)

    Casey, Ashley

    2014-01-01

    Background: Many critical curriculum theorists in physical education have advocated a model- or models-based approach to teaching in the subject. This paper explores the literature base around models-based practice (MBP) and asks if this multi-models approach to curriculum planning has the potential to be the great white hope of pedagogical change…

  7. Evaluating face trustworthiness: a model based approach

    Science.gov (United States)

    Baron, Sean G.; Oosterhof, Nikolaas N.

    2008-01-01

    Judgments of trustworthiness from faces determine basic approach/avoidance responses and approximate the valence evaluation of faces that runs across multiple person judgments. Here, based on trustworthiness judgments and using a computer model for face representation, we built a model for representing face trustworthiness (study 1). Using this model, we generated novel faces with an increased range of trustworthiness and used these faces as stimuli in a functional Magnetic Resonance Imaging study (study 2). Although participants did not engage in explicit evaluation of the faces, the amygdala response changed as a function of face trustworthiness. An area in the right amygdala showed a negative linear response—as the untrustworthiness of faces increased so did the amygdala response. Areas in the left and right putamen, the latter area extended into the anterior insula, showed a similar negative linear response. The response in the left amygdala was quadratic—strongest for faces on both extremes of the trustworthiness dimension. The medial prefrontal cortex and precuneus also showed a quadratic response, but their response was strongest to faces in the middle range of the trustworthiness dimension. PMID:19015102

  8. Evaluating face trustworthiness: a model based approach.

    Science.gov (United States)

    Todorov, Alexander; Baron, Sean G; Oosterhof, Nikolaas N

    2008-06-01

    Judgments of trustworthiness from faces determine basic approach/avoidance responses and approximate the valence evaluation of faces that runs across multiple person judgments. Here, based on trustworthiness judgments and using a computer model for face representation, we built a model for representing face trustworthiness (study 1). Using this model, we generated novel faces with an increased range of trustworthiness and used these faces as stimuli in a functional Magnetic Resonance Imaging study (study 2). Although participants did not engage in explicit evaluation of the faces, the amygdala response changed as a function of face trustworthiness. An area in the right amygdala showed a negative linear response: as the untrustworthiness of faces increased, so did the amygdala response. Areas in the left and right putamen, the latter area extended into the anterior insula, showed a similar negative linear response. The response in the left amygdala was quadratic: strongest for faces on both extremes of the trustworthiness dimension. The medial prefrontal cortex and precuneus also showed a quadratic response, but their response was strongest to faces in the middle range of the trustworthiness dimension. PMID:19015102

  9. Intellectual Model-Based Configuration Management Conception

    Directory of Open Access Journals (Sweden)

    Bartusevics Arturs

    2014-07-01

    Software configuration management is one of the most important disciplines within a software development project; it helps control the software evolution process and allows including in the end product only tested and validated changes. To achieve this, configuration management completes certain tasks. Concrete tools are used for the technical implementation of these tasks, such as version control systems, continuous integration servers, compilers, etc. A correct configuration management process usually requires several tools, which mutually exchange information by generating various kinds of transfers. When the configuration management process is being introduced, there are often situations where tool installation is started, yet at that moment there is no general picture of the total process. The article offers a model-based configuration management concept, which foresees the development of an abstract model of the configuration management process that is later transformed into lower-abstraction-level models, with tools indicated to support the technical process. A solution of this kind allows a more rational introduction and configuration of tools

  10. Hazard identification based on plant functional modelling

    International Nuclear Information System (INIS)

    A major objective of the present work is to provide means for representing a process plant as a socio-technical system, so as to allow hazard identification at a high level. The method includes technical, human and organisational aspects and is intended to be used for plant level hazard identification so as to identify critical areas and the need for further analysis using existing methods. The first part of the method is the preparation of a plant functional model where a set of plant functions link together hardware, software, operations, work organisation and other safety related aspects of the plant. The basic principle of the functional modelling is that any aspect of the plant can be represented by an object (in the sense that this term is used in computer science) based upon an Intent (or goal); associated with each Intent are Methods, by which the Intent is realized, and Constraints, which limit the Intent. The Methods and Constraints can themselves be treated as objects and decomposed into lower-level Intents (hence the procedure is known as functional decomposition) so giving rise to a hierarchical, object-oriented structure. The plant level hazard identification is carried out on the plant functional model using the Concept Hazard Analysis method. In this, the user will be supported by checklists and keywords and the analysis is structured by pre-defined worksheets. The preparation of the plant functional model and the performance of the hazard identification can be carried out manually or with computer support. (au) (4 tabs., 10 ills., 7 refs.)

  11. Agent-based models of financial markets

    Science.gov (United States)

    Samanidou, E.; Zschischang, E.; Stauffer, D.; Lux, T.

    2007-03-01

    This review deals with several microscopic ('agent-based') models of financial markets which have been studied by economists and physicists over the last decade: Kim-Markowitz, Levy-Levy-Solomon, Cont-Bouchaud, Solomon-Weisbuch, Lux-Marchesi, Donangelo-Sneppen and Solomon-Levy-Huang. After an overview of simulation approaches in financial economics, we first give a summary of the Donangelo-Sneppen model of monetary exchange and compare it with related models in economics literature. Our selective review then outlines the main ingredients of some influential early models of multi-agent dynamics in financial markets (Kim-Markowitz, Levy-Levy-Solomon). As will be seen, these contributions draw their inspiration from the complex appearance of investors' interactions in real-life markets. Their main aim is to reproduce (and, thereby, provide possible explanations) for the spectacular bubbles and crashes seen in certain historical episodes, but they lack (like almost all the work before 1998 or so) a perspective in terms of the universal statistical features of financial time series. In fact, awareness of a set of such regularities (power-law tails of the distribution of returns, temporal scaling of volatility) only gradually appeared over the nineties. With the more precise description of the formerly relatively vague characteristics (e.g. moving from the notion of fat tails to the more concrete one of a power law with index around three), it became clear that financial market dynamics give rise to some kind of universal scaling law. Showing similarities with scaling laws for other systems with many interacting sub-units, an exploration of financial markets as multi-agent systems appeared to be a natural consequence. This topic has been pursued by quite a number of contributions appearing in both the physics and economics literature since the late nineties. From the wealth of different flavours of multi-agent models that have appeared up to now, we discuss the Cont

  12. Agent-based models of financial markets

    Energy Technology Data Exchange (ETDEWEB)

    Samanidou, E [Department of Economics, University of Kiel, Olshausenstrasse 40, D-24118 Kiel (Germany); Zschischang, E [HSH Nord Bank, Portfolio Mngmt. and Inv., Martensdamm 6, D-24103 Kiel (Germany); Stauffer, D [Institute for Theoretical Physics, Cologne University, D-50923 Koeln (Germany); Lux, T [Department of Economics, University of Kiel, Olshausenstrasse 40, D-24118 Kiel (Germany)

    2007-03-15

    This review deals with several microscopic ('agent-based') models of financial markets which have been studied by economists and physicists over the last decade: Kim-Markowitz, Levy-Levy-Solomon, Cont-Bouchaud, Solomon-Weisbuch, Lux-Marchesi, Donangelo-Sneppen and Solomon-Levy-Huang. After an overview of simulation approaches in financial economics, we first give a summary of the Donangelo-Sneppen model of monetary exchange and compare it with related models in economics literature. Our selective review then outlines the main ingredients of some influential early models of multi-agent dynamics in financial markets (Kim-Markowitz, Levy-Levy-Solomon). As will be seen, these contributions draw their inspiration from the complex appearance of investors' interactions in real-life markets. Their main aim is to reproduce (and, thereby, provide possible explanations) for the spectacular bubbles and crashes seen in certain historical episodes, but they lack (like almost all the work before 1998 or so) a perspective in terms of the universal statistical features of financial time series. In fact, awareness of a set of such regularities (power-law tails of the distribution of returns, temporal scaling of volatility) only gradually appeared over the nineties. With the more precise description of the formerly relatively vague characteristics (e.g. moving from the notion of fat tails to the more concrete one of a power law with index around three), it became clear that financial market dynamics give rise to some kind of universal scaling law. Showing similarities with scaling laws for other systems with many interacting sub-units, an exploration of financial markets as multi-agent systems appeared to be a natural consequence. This topic has been pursued by quite a number of contributions appearing in both the physics and economics literature since the late nineties. From the wealth of different flavours of multi-agent models that have appeared up to now, we
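
    The Cont-Bouchaud model mentioned in this review ties fat-tailed returns to herding: traders aggregate into random clusters that buy or sell as a single unit. The sketch below is my simplification of that idea, not the original percolation formulation: cluster sizes are drawn from a geometric distribution as a stand-in for percolation clusters, and every parameter name and value is illustrative.

    ```python
    # Hedged sketch of a Cont-Bouchaud-style herding model (simplified:
    # geometric cluster sizes instead of true percolation clusters).
    import numpy as np

    def herding_returns(n_agents=10_000, mean_cluster=50, p_trade=0.05,
                        steps=2_000, seed=1):
        rng = np.random.default_rng(seed)
        returns = np.empty(steps)
        for t in range(steps):
            # Partition the agents into random clusters.
            sizes, remaining = [], n_agents
            while remaining > 0:
                s = min(rng.geometric(1.0 / mean_cluster), remaining)
                sizes.append(s)
                remaining -= s
            sizes = np.array(sizes)
            # Each cluster buys (+1) or sells (-1) with probability p_trade each.
            u = rng.random(len(sizes))
            action = np.where(u < p_trade, 1, np.where(u < 2 * p_trade, -1, 0))
            returns[t] = (sizes * action).sum() / n_agents  # aggregate imbalance
        return returns

    r = herding_returns()
    print("excess-kurtosis proxy:", ((r - r.mean())**4).mean() / r.var()**2)
    ```

    Because the largest clusters occasionally trade together, the simulated return distribution develops the fat tails the review describes, even in this stripped-down variant.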

  13. An Attack Modeling Based on Colored Petri Net

    Institute of Scientific and Technical Information of China (English)

    ZHOU Shijie; QIN Zhiguang; ZHANG Feng; LIU Jinde

    2004-01-01

    A colored Petri net (CPN) based attack modeling approach is addressed. Compared with graph-based modeling, the CPN-based attack model is flexible enough to model Internet intrusions because of its static and dynamic features. The processes and rules for building a CPN-based attack model from an attack tree are also presented. In order to evaluate the risk of intrusion, some cost elements are added to the CPN-based attack modeling. This extended model is useful in intrusion detection and risk evaluation. Experience shows that it is easy to exploit the CPN-based attack modeling approach to provide controlling functions, such as intrusion response and intrusion defense. A case study given in this paper shows that the CPN-based attack model has many unique characteristics which the attack tree model lacks.

  14. Ecosystem Based Business Model of Smart Grid

    DEFF Research Database (Denmark)

    Lundgaard, Morten Raahauge; Ma, Zheng; Jørgensen, Bo Nørregaard

    2015-01-01

    This paper investigates the ecosystem-based business model in a smart grid infrastructure and the potential of value capture in a highly complex macro infrastructure such as the smart grid. It proposes an alternative perspective for studying the smart grid business ecosystem to support infrastructural challenges, such as the interoperability of business components for the smart grid. So far, little research has explored the business ecosystem in the smart grid concept; studying the smart grid with the theory of business ecosystems may open opportunities to understand market catalysts. This study contributes an understanding of the business ecosystem applicable to the smart grid. The smart grid infrastructure is an intricate business ecosystem with several intentions regarding what the value proposition should be and how to deliver it. The findings help to identify and capture value from markets.

  15. Prototype-based models in machine learning.

    Science.gov (United States)

    Biehl, Michael; Hammer, Barbara; Villmann, Thomas

    2016-01-01

    An overview is given of prototype-based models in machine learning. In this framework, observations, i.e., data, are stored in terms of typical representatives. Together with a suitable measure of similarity, the systems can be employed in the context of unsupervised and supervised analysis of potentially high-dimensional, complex datasets. We discuss basic schemes of competitive vector quantization as well as the so-called neural gas approach and Kohonen's topology-preserving self-organizing map. Supervised learning in prototype systems is exemplified in terms of learning vector quantization. Most frequently, the familiar Euclidean distance serves as a dissimilarity measure. We present extensions of the framework to nonstandard measures and give an introduction to the use of adaptive distances in relevance learning. PMID:26800334
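
    As a concrete illustration of the supervised side described above, here is a minimal sketch of the classic LVQ1 update rule, assuming the textbook formulation with Euclidean distance; the learning rate, epoch count and toy data are illustrative and not taken from the paper.

    ```python
    # Minimal LVQ1 sketch: prototypes move toward same-class samples
    # and away from different-class ones.
    import numpy as np

    def lvq1_train(X, y, prototypes, proto_labels, lr=0.05, epochs=20):
        """Train LVQ1 prototypes on data X with labels y."""
        P = prototypes.copy()
        for _ in range(epochs):
            for xi, yi in zip(X, y):
                j = np.argmin(np.linalg.norm(P - xi, axis=1))  # nearest prototype
                sign = 1.0 if proto_labels[j] == yi else -1.0  # attract or repel
                P[j] += sign * lr * (xi - P[j])
        return P

    # Toy usage: two Gaussian blobs, one prototype per class.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
    y = np.array([0] * 50 + [1] * 50)
    protos = lvq1_train(X, y, np.array([[0.5, 0.5], [3.5, 3.5]]), np.array([0, 1]))
    print(protos)  # prototypes settle near the class means
    ```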

  16. Model-based control of networked systems

    CERN Document Server

    Garcia, Eloy; Montestruque, Luis A

    2014-01-01

    This monograph introduces a class of networked control systems (NCS) called model-based networked control systems (MB-NCS) and presents various architectures and control strategies designed to improve the performance of NCS. The overall performance of NCS considers the appropriate use of network resources, particularly network bandwidth, in conjunction with the desired response of the system being controlled. The book begins with a detailed description of the basic MB-NCS architecture that provides stability conditions in terms of state feedback updates. It also covers typical problems in NCS such as network delays, network scheduling, and data quantization, as well as more general control problems such as output feedback control, nonlinear systems stabilization, and tracking control. Key features and topics include: Time-triggered and event-triggered feedback updates Stabilization of uncertain systems subject to time delays, quantization, and extended absence of feedback Optimal control analysis and ...

  17. Model based optimization of EMC input filters

    Energy Technology Data Exchange (ETDEWEB)

    Raggl, K; Kolar, J. W. [Swiss Federal Institute of Technology, Power Electronic Systems Laboratory, Zuerich (Switzerland); Nussbaumer, T. [Levitronix GmbH, Zuerich (Switzerland)

    2008-07-01

    Input filters of power converters for compliance with regulatory electromagnetic compatibility (EMC) standards are often over-dimensioned in practice due to a non-optimal selection of number of filter stages and/or the lack of solid volumetric models of the inductor cores. This paper presents a systematic filter design approach based on a specific filter attenuation requirement and volumetric component parameters. It is shown that a minimal volume can be found for a certain optimal number of filter stages for both the differential mode (DM) and common mode (CM) filter. The considerations are carried out exemplarily for an EMC input filter of a single phase power converter for the power levels of 100 W, 300 W, and 500 W. (author)

  18. An Agent-Based Monetary Production Simulation Model

    DEFF Research Database (Denmark)

    Bruun, Charlotte

    2006-01-01

    An Agent-Based Simulation Model Programmed in Objective Borland Pascal. Program and source code is downloadable.

  19. Multiple Damage Progression Paths in Model-based Prognostics

    Data.gov (United States)

    National Aeronautics and Space Administration — Model-based prognostics approaches employ domain knowledge about a system, its components, and how they fail through the use of physics-based models. Component...

  20. Distributed Damage Estimation for Prognostics based on Structural Model Decomposition

    Data.gov (United States)

    National Aeronautics and Space Administration — Model-based prognostics approaches capture system knowledge in the form of physics-based models of components that include how they fail. These methods consist of...

  1. Integrated Enterprise Modeling Method Based on Workflow Model and Multiviews

    Institute of Scientific and Technical Information of China (English)

    林慧苹; 范玉顺; 吴澄

    2001-01-01

    Many enterprise modeling methods are proposed to model the business process of enterprises and to implement CIM systems. But difficulties are still encountered when these methods are applied to CIM system design and implementation. This paper proposes a new integrated enterprise modeling methodology based on the workflow model. The system architecture and the integrated modeling environment are described with a new simulation strategy. The modeling process and the relationship between the workflow model and the views are discussed.

  2. Validating agent based models through virtual worlds.

    Energy Technology Data Exchange (ETDEWEB)

    Lakkaraju, Kiran; Whetzel, Jonathan H.; Lee, Jina; Bier, Asmeret Brooke; Cardona-Rivera, Rogelio E.; Bernstein, Jeremy Ray Rhythm

    2014-01-01

    As the US continues its vigilance against distributed, embedded threats, understanding the political and social structure of these groups becomes paramount for predicting and disrupting their attacks. Agent-based models (ABMs) serve as a powerful tool to study these groups. While the popularity of social network tools (e.g., Facebook, Twitter) has provided extensive communication data, there is a lack of fine-grained behavioral data with which to inform and validate existing ABMs. Virtual worlds, in particular massively multiplayer online games (MMOGs), where large numbers of people interact within a complex environment for long periods of time, provide an alternative source of data. These environments provide a rich social environment where players engage in a variety of activities observed between real-world groups: collaborating and/or competing with other groups, conducting battles for scarce resources, and trading in a market economy. Strategies employed by player groups surprisingly reflect those seen in present-day conflicts, where players use diplomacy or espionage as their means for accomplishing their goals. In this project, we propose to address the need for fine-grained behavioral data by acquiring and analyzing game data from a commercial MMOG, referred to within this report as Game X. The goals of this research were: (1) devising toolsets for analyzing virtual world data to better inform the rules that govern a social ABM and (2) exploring how virtual worlds could serve as a source of data to validate ABMs established for analogous real-world phenomena. During this research, we studied certain patterns of group behavior to complement social modeling efforts where a significant lack of detailed examples of observed phenomena exists. This report outlines our work examining group behaviors that underlie what we have termed the Expression-To-Action (E2A) problem: determining the changes in social contact that lead individuals/groups to engage in a particular behavior

  3. A model-based evaluation system of enterprise

    Institute of Scientific and Technical Information of China (English)

    Yan Junwei; Ye Yang; Wang Jian

    2005-01-01

    This paper analyses the architecture of enterprise modeling, proposes indicator selection principles and indicator decomposition methods, examines approaches to the evaluation of enterprise modeling and designs an AHP-based evaluation model. A model-based evaluation system of the enterprise is then presented to effectively evaluate the business model within the framework of enterprise modeling.

  4. Model Mapping Approach Based on Ontology Semantics

    Directory of Open Access Journals (Sweden)

    Jinkui Hou

    2013-09-01

    Full Text Available The mapping relations between different models are the foundation for model transformation in model-driven software development. On the basis of ontology semantics, model mappings between different levels are classified by using the structural semantics of modeling languages. The general definition process for mapping relations is explored, and the principles of structure mapping are proposed subsequently. The approach is further illustrated by the mapping relations from the class model of an object-oriented modeling language to C programming codes. The application research shows that the approach provides theoretical guidance for the realization of model mapping and can thus effectively support model-driven software development.

  5. Tree-based disease classification using protein data.

    Science.gov (United States)

    Zhu, Hongtu; Yu, Chang-Yung; Zhang, Heping

    2003-09-01

    A reliable and precise classification of diseases is essential for successful diagnosis and treatment. Using mass spectrometry from clinical specimens, scientists may identify protein variations among diseases and use this information to improve diagnosis. In this paper, we propose a novel procedure to classify disease status based on protein data from mass spectrometry. Our new tree-based algorithm consists of three steps: projection, selection and classification tree. The projection step aims to project all observations from specimens onto the same bases so that the projected data have fixed coordinates. Thus, for each specimen, we obtain a large vector of 'coefficients' on the same basis. The purpose of the selection step is data reduction, condensing the large vector from the projection step into a much lower-order informative vector. Finally, using these reduced vectors, we apply recursive partitioning to construct an informative classification tree. This method has been successfully applied to protein data provided by the Department of Radiology and Chemistry at Duke University.
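
    The three-step pipeline can be sketched with off-the-shelf stand-ins: PCA for the projection, univariate selection for the reduction, and CART for the classification tree. This is an assumption-laden illustration on synthetic data; the paper's actual basis projection for mass spectra is not reproduced here.

    ```python
    # Hedged sketch of a projection -> selection -> tree pipeline (scikit-learn
    # stand-ins for the paper's steps; data is synthetic).
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 500))      # stand-in for per-specimen coefficients
    y = rng.integers(0, 2, 100)
    X[y == 1, :10] += 1.0                # inject a weak class signal

    pipe = make_pipeline(PCA(n_components=30),                 # projection
                         SelectKBest(f_classif, k=10),         # selection
                         DecisionTreeClassifier(max_depth=3,   # classification tree
                                                random_state=0))
    print("CV accuracy:", cross_val_score(pipe, X, y, cv=5).mean())
    ```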

  6. Framework of Pattern Recognition Model Based on the Cognitive Psychology

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    According to the fundamental theory of the visual cognition mechanism and cognitive psychology, the visual pattern recognition model is introduced briefly. Three pattern recognition models, i.e. the template-based matching model, the prototype-based matching model and the feature-based matching model, are built and discussed separately. In addition, the influence of object background information and the visual focus point on the result of pattern recognition is also discussed with the example of recognition of fuzzy letters and figures.

  7. Embedding Task-Based Neural Models into a Connectome-Based Model of the Cerebral Cortex

    Science.gov (United States)

    Ulloa, Antonio; Horwitz, Barry

    2016-01-01

    A number of recent efforts have used large-scale, biologically realistic, neural models to help understand the neural basis for the patterns of activity observed in both resting state and task-related functional neural imaging data. An example of the former is The Virtual Brain (TVB) software platform, which allows one to apply large-scale neural modeling in a whole brain framework. TVB provides a set of structural connectomes of the human cerebral cortex, a collection of neural processing units for each connectome node, and various forward models that can convert simulated neural activity into a variety of functional brain imaging signals. In this paper, we demonstrate how to embed a previously or newly constructed task-based large-scale neural model into the TVB platform. We tested our method on a previously constructed large-scale neural model (LSNM) of visual object processing that consisted of interconnected neural populations that represent primary and secondary visual, inferotemporal, and prefrontal cortex. Some neural elements in the original model were “non-task-specific” (NS) neurons that served as noise generators to “task-specific” neurons that processed shapes during a delayed match-to-sample (DMS) task. We replaced the NS neurons with an anatomical TVB connectome model of the cerebral cortex comprising 998 regions of interest interconnected by white matter fiber tract weights. We embedded our LSNM of visual object processing into corresponding nodes within the TVB connectome. Reciprocal connections between TVB nodes and our task-based modules were included in this framework. We ran visual object processing simulations and showed that the TVB simulator successfully replaced the noise generation originally provided by NS neurons; i.e., the DMS tasks performed with the hybrid LSNM/TVB simulator generated equivalent neural and fMRI activity to that of the original task-based models. Additionally, we found partial agreement between the functional

  9. A Software Service Framework Model Based on Agent

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    This paper presents an agent-based software service framework model called ASF, and defines the basic concepts and structure of the ASF model. It also describes the management and process mechanisms in the ASF model.

  10. The research on Virtual Plants Growth Based on DLA Model

    Science.gov (United States)

    Zou, YunLan; Chai, Bencheng

    This article summarizes the separate evolutionary algorithm within the fractal Diffusion Limited Aggregation (DLA) model and puts forward a computer realization of virtual plant growth based on the DLA model. The method is implemented in the VB6.0 environment to achieve and verify plant growth based on the DLA model.
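
    For readers unfamiliar with DLA, a minimal on-lattice sketch follows, assuming the classic random-walker aggregation rule; the paper's VB6.0 realization and its plant-rendering step are not shown, and the launch/kill radii are the usual illustrative conventions.

    ```python
    # Hedged sketch of diffusion-limited aggregation (DLA) on a square lattice.
    import math
    import random

    def grow_dla(n_particles=200, seed=42):
        """Grow a DLA cluster; returns the set of occupied lattice sites."""
        random.seed(seed)
        cluster = {(0, 0)}                       # seed particle at the origin
        radius = 0                               # current cluster radius
        neigh = ((1, 0), (-1, 0), (0, 1), (0, -1))
        while len(cluster) < n_particles:
            # Launch a walker on a circle just outside the cluster.
            theta = random.uniform(0.0, 2.0 * math.pi)
            r0 = radius + 5
            x, y = round(r0 * math.cos(theta)), round(r0 * math.sin(theta))
            while True:
                if x * x + y * y > (r0 + 20) ** 2:
                    break                        # walker drifted away; relaunch
                if any((x + dx, y + dy) in cluster for dx, dy in neigh):
                    cluster.add((x, y))          # stick on contact with the cluster
                    radius = max(radius, int(math.hypot(x, y)))
                    break
                dx, dy = random.choice(neigh)
                x, y = x + dx, y + dy
        return cluster

    print(len(grow_dla()), "particles aggregated")
    ```

    The branching clusters this rule produces are what give DLA-based virtual plants their dendritic look; a renderer would simply draw the occupied sites.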

  11. Learning of Chemical Equilibrium through Modelling-Based Teaching

    Science.gov (United States)

    Maia, Poliana Flavia; Justi, Rosaria

    2009-01-01

    This paper presents and discusses students' learning process of chemical equilibrium from a modelling-based approach developed from the use of the "Model of Modelling" diagram. The investigation was conducted in a regular classroom (students 14-15 years old) and aimed at discussing how modelling-based teaching can contribute to students learning…

  12. Measurement-based load modeling: Theory and application

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    The load model is one of the most important elements in power system operation and control. However, owing to its complexity, load modeling is still an open and very difficult problem. Summarizing our work on measurement-based load modeling in China over more than twenty years, this paper systematically introduces the mathematical theory and applications of load modeling. The flow chart and algorithms for measurement-based load modeling are presented. A composite load model structure with 13 parameters is also proposed. Analysis results based on trajectory sensitivity theory indicate the importance of the load model parameters for the identification. Case studies show the accuracy of the presented measurement-based load model. The load model thus built has been validated by field measurements all over China. Future working directions on measurement-based load modeling are also discussed in the paper.
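
    The record does not spell out the 13-parameter composite structure. As a hedged illustration of what measurement-based fitting looks like, the sketch below fits the common ZIP polynomial load model to synthetic voltage/power measurements; the coefficients and noise level are invented for the example.

    ```python
    # Hedged sketch of measurement-based fitting of a ZIP load model:
    # P = P_nom * (a*V^2 + b*V + c), with V in per unit.
    import numpy as np
    from scipy.optimize import curve_fit

    def zip_load(v_pu, a, b, c, p_nom=1.0):
        return p_nom * (a * v_pu**2 + b * v_pu + c)

    # Synthetic "measurements" around nominal voltage (illustrative only).
    v = np.linspace(0.9, 1.1, 50)
    p_meas = zip_load(v, 0.4, 0.3, 0.3) + \
        np.random.default_rng(2).normal(0, 0.002, v.size)
    (a, b, c), _ = curve_fit(zip_load, v, p_meas, p0=[0.3, 0.3, 0.4])
    print(f"fitted ZIP coefficients: a={a:.2f}, b={b:.2f}, c={c:.2f}")
    ```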

  13. MRO CTX-based Digital Terrain Models

    Science.gov (United States)

    Dumke, Alexander

    2016-04-01

    In planetary surface sciences, digital terrain models (DTMs) are paramount when it comes to understanding and quantifying processes. In this contribution, an approach for the derivation of digital terrain models from stereo images of the NASA Mars Reconnaissance Orbiter (MRO) Context Camera (CTX) is described. CTX consists of a 350 mm focal length telescope and 5000 CCD sensor elements and is operated as a pushbroom camera. It acquires images with ~6 m/px over a swath width of ~30 km of the Mars surface [1]. Today, several approaches for the derivation of CTX DTMs exist [e.g. 2, 3, 4]. The approach discussed here is based on established software combined with proprietary software, as described below. The main processing task for the derivation of CTX stereo DTMs consists of six steps: (1) First, CTX images are radiometrically corrected using the ISIS software package [5]. (2) For selected CTX stereo images, exterior orientation data are extracted from reconstructed NAIF SPICE data [6]. (3) In the next step, High Resolution Stereo Camera (HRSC) DTMs [7, 8, 9] are used for the rectification of CTX stereo images to reduce the search area during image matching; HRSC DTMs are used because of their higher spatial resolution compared to MOLA DTMs. (4) The determination of coordinates of homologous points between stereo images, i.e. the stereo image matching process, consists of two steps: first, a cross-correlation to obtain approximate values, and secondly, their use in a least-squares matching (LSM) process in order to obtain subpixel positions. (5) The stereo matching results are then used to generate object points from forward ray intersections. (6) As a last step, the DTM raster generation is performed using software developed at the German Aerospace Center, Berlin, whereby only object points with an error smaller than a threshold value are used. References: [1] Malin, M. C. et al., 2007, JGR 112, doi:10.1029/2006JE002808 [2] Broxton, M. J. et al
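
    Step (4) of the pipeline pairs cross-correlation for approximate matches with least-squares matching for subpixel refinement. Below is a hedged sketch of the first stage only: normalized cross-correlation of a template patch searched along an image row. Window and search sizes are illustrative, and the synthetic shift stands in for real stereo disparity.

    ```python
    # Hedged sketch of normalized cross-correlation (NCC) patch matching.
    import numpy as np

    def ncc(a, b):
        a = a - a.mean()
        b = b - b.mean()
        d = np.sqrt((a * a).sum() * (b * b).sum())
        return (a * b).sum() / d if d > 0 else 0.0

    def match_along_row(left, right, row, col, half=7, search=40):
        """Find the column in `right` whose patch best matches (row, col) in `left`."""
        tpl = left[row - half:row + half + 1, col - half:col + half + 1]
        best_c, best_s = col, -1.0
        for c in range(max(half, col - search),
                       min(right.shape[1] - half, col + search)):
            s = ncc(tpl, right[row - half:row + half + 1, c - half:c + half + 1])
            if s > best_s:
                best_c, best_s = c, s
        return best_c, best_s

    # Toy usage with a synthetically shifted image.
    rng = np.random.default_rng(0)
    L = rng.random((64, 128))
    R = np.roll(L, 5, axis=1)                 # simulate a 5-pixel disparity
    print(match_along_row(L, R, row=32, col=60))  # expect column ~65, score ~1.0
    ```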

  14. A Bit Progress on Word—Based Language Model

    Institute of Scientific and Technical Information of China (English)

    陈勇; 陈国评

    2003-01-01

    A good language model is essential to a postprocessing algorithm for recognition systems. In the past, researchers have presented various language models, such as character-based language models, word-based language models, syntactical-rule language models, hybrid models, etc. The word N-gram model is by far an effective and efficient model, but one has to address the problem of data sparseness in establishing the model. Katz and Kneser et al. respectively presented effective remedies to solve this challenging problem. In this study, we proposed an improvement to their methods by incorporating Chinese language-specific information or Chinese word class information into the system.
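
    As a minimal illustration of the sparseness problem and one simple remedy, the sketch below implements a word bigram model with "stupid backoff" smoothing, a cruder cousin of the Katz and Kneser-Ney methods the paper cites; the toy corpus and backoff weight are illustrative.

    ```python
    # Hedged sketch of a bigram language model with stupid-backoff smoothing.
    from collections import Counter

    class BigramLM:
        def __init__(self, corpus, alpha=0.4):
            self.alpha = alpha
            self.uni = Counter(w for sent in corpus for w in sent)
            self.bi = Counter((a, b) for sent in corpus
                              for a, b in zip(sent, sent[1:]))
            self.total = sum(self.uni.values())

        def score(self, prev, word):
            """Backed-off bigram score (not a true probability)."""
            if self.bi[(prev, word)] > 0:
                return self.bi[(prev, word)] / self.uni[prev]
            return self.alpha * self.uni[word] / self.total  # unigram backoff

    corpus = [["the", "cat", "sat"], ["the", "dog", "sat"], ["the", "cat", "ran"]]
    lm = BigramLM(corpus)
    print(lm.score("the", "cat"))   # seen bigram: 2/3
    print(lm.score("dog", "ran"))   # unseen bigram: backs off to the unigram
    ```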

  15. GENETIC-BASED NUTRITION RECOMMENDATION MODEL

    Directory of Open Access Journals (Sweden)

    S. A.A. Fayoumi

    2014-01-01

    Full Text Available Evolutionary computing is the collective name for a range of problem-solving techniques based on principles of biological evolution, such as natural selection and genetic inheritance. These techniques are being widely applied to a variety of problems in many vital fields. Evolutionary Algorithms (EAs), which apply the principles of evolutionary computation, such as the genetic algorithm, particle swarm, ant colony and bees algorithms, play an important role in the decision-making process. EAs serve many fields that affect our life directly, such as medicine, engineering, transportation and communications. One of these vital fields is nutrition, which can be viewed from several points of view: medical, physical, social, environmental and psychological. This study presents a proposed model that shows how evolutionary computing generally, and the genetic algorithm specifically, as a powerful evolutionary algorithm, can be used to recommend an appropriate nutrition style, on the medical and physical sides only, to each person according to his/her personal and medical measurements.
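
    A minimal genetic-algorithm sketch in the spirit of the proposed model follows: a binary chromosome selects foods so that the total calories approach a person's target. The food list, target and fitness function are illustrative assumptions, not the paper's actual model.

    ```python
    # Hedged GA sketch: selection, one-point crossover, bit-flip mutation.
    import random

    FOODS = [("oats", 150), ("milk", 120), ("chicken", 250), ("rice", 200),
             ("salad", 80), ("fish", 220), ("fruit", 90), ("nuts", 180)]
    TARGET_KCAL = 700  # illustrative per-person target

    def fitness(chrom):
        kcal = sum(c * f[1] for c, f in zip(chrom, FOODS))
        return -abs(kcal - TARGET_KCAL)          # closer to target = fitter

    def evolve(pop_size=30, gens=60, p_mut=0.1):
        pop = [[random.randint(0, 1) for _ in FOODS] for _ in range(pop_size)]
        for _ in range(gens):
            pop.sort(key=fitness, reverse=True)
            parents = pop[: pop_size // 2]       # truncation selection
            children = []
            while len(children) < pop_size - len(parents):
                a, b = random.sample(parents, 2)
                cut = random.randrange(1, len(FOODS))   # one-point crossover
                child = a[:cut] + b[cut:]
                if random.random() < p_mut:             # bit-flip mutation
                    i = random.randrange(len(FOODS))
                    child[i] ^= 1
                children.append(child)
            pop = parents + children
        best = max(pop, key=fitness)
        return [f[0] for c, f in zip(best, FOODS) if c]

    print(evolve())  # a food combination near the calorie target
    ```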

  16. Model Based Analysis and Test Generation for Flight Software

    Science.gov (United States)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  17. Modeling dark fermentation for biohydrogen production: ADM1-based model vs. Gompertz model

    Energy Technology Data Exchange (ETDEWEB)

    Gadhamshetty, Venkataramana [Air Force Research Laboratory, Tyndall AFB, 139 Barnes Drive, Panama City, FL 32403 (United States); Arudchelvam, Yalini; Nirmalakhandan, Nagamany [Civil Engineering Department, New Mexico State University, Las Cruces, NM 88003 (United States); Johnson, David C. [Institute for Energy and Environment, New Mexico State University, Las Cruces, NM 88003 (United States)

    2010-01-15

    Biohydrogen production by dark fermentation in batch reactors was modeled using the Gompertz equation and a model based on the Anaerobic Digestion Model (ADM1). The ADM1 framework, which has been well accepted for modeling methane production by anaerobic digestion, was modified in this study for modeling hydrogen production. Experimental hydrogen production data from eight reactor configurations varying in pressure conditions, temperature, type and concentration of substrate, inocula source, and stirring conditions were used to evaluate the predictive abilities of the two modeling approaches. Although the quality of fit between the measured and fitted hydrogen evolution by the Gompertz equation was high in all the eight reactor configurations with r² ≈ 0.98, each configuration required a different set of model parameters, negating its utility as a general approach to predict hydrogen evolution. On the other hand, the ADM1-based model (ADM1BM) with predefined parameters was able to predict COD, cumulative hydrogen production, as well as volatile fatty acids production, albeit at a slightly lower quality of fit. Agreement between the experimental temporal hydrogen evolution data and the ADM1BM predictions was statistically significant with r² > 0.91 and p-value < 1E-04. Sensitivity analysis of the validated model revealed that hydrogen production was sensitive to only six parameters in the ADM1BM. (author)
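
    The modified Gompertz equation commonly used for cumulative hydrogen evolution is H(t) = P*exp(-exp(Rm*e/P*(lam - t) + 1)), with hydrogen production potential P, maximum rate Rm and lag time lam. A hedged fitting sketch on synthetic data (the true parameters and noise below are invented for the example):

    ```python
    # Hedged sketch: fit the modified Gompertz equation to cumulative H2 data.
    import numpy as np
    from scipy.optimize import curve_fit

    def gompertz(t, P, Rm, lam):
        return P * np.exp(-np.exp(Rm * np.e / P * (lam - t) + 1.0))

    # Synthetic data standing in for measured cumulative H2 (mL).
    t = np.linspace(0, 60, 30)
    h = gompertz(t, 120.0, 8.0, 6.0) + \
        np.random.default_rng(3).normal(0, 2, t.size)
    (P, Rm, lam), _ = curve_fit(gompertz, t, h, p0=[100.0, 5.0, 5.0])
    print(f"P={P:.1f} mL, Rm={Rm:.2f} mL/h, lambda={lam:.1f} h")
    ```

    The paper's point is visible even in this toy: the three fitted parameters describe one curve well, but a new reactor configuration would need a new fit, whereas a mechanistic ADM1-style model keeps one parameter set across configurations.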

  18. Computer Profiling Based Model for Investigation

    Directory of Open Access Journals (Sweden)

    Neeraj Choudhary

    2011-10-01

    Full Text Available Computer profiling is used for computer forensic analysis; this paper proposes and elaborates on a novel model for use in computer profiling, the computer profiling object model. The computer profiling object model is an information model which models a computer as objects with various attributes and inter-relationships. These together provide the information necessary for a human investigator or an automated reasoning engine to make judgments as to the probable usage and evidentiary value of a computer system. The computer profiling object model can be implemented so as to support automated analysis to provide an investigator with the information needed to decide whether manual analysis is required.

  19. Research of database-based modeling for mining management system

    Institute of Scientific and Technical Information of China (English)

    WU Hai-feng; JIN Zhi-xin; BAI Xi-jun

    2005-01-01

    This paper puts forward a method to construct simulation models automatically with database-based automatic modeling (DBAM) for mining systems. A standard simulation model linked with an open-cut automobile dispatch system was designed. The laws among them were analyzed and found, and a model maker was designed to realize the automatic programming of new model programs.

  20. A generic testing framework for agent-based simulation models

    OpenAIRE

    Gürcan, Önder; Dikenelli, Oguz; Bernon, Carole

    2013-01-01

    Agent-based modelling and simulation (ABMS) has received increasing attention during the last decade. However, the weak validation and verification of agent-based simulation models makes ABMS hard to trust. There is no comprehensive tool set for verification and validation of agent-based simulation models which demonstrates that inaccuracies exist and/or reveals the existing errors in the model. Moreover, on the practical side, many ABMS frameworks are in use. In this sen...

  1. Agent Based Modelling and Simulation of Social Processes

    OpenAIRE

    Armano Srbljinovic; Ognjen Skunca

    2003-01-01

    The paper provides an introduction to agent-based modelling and simulation of social processes. The reader is introduced to the worldview underlying agent-based models, some basic terminology, the basic properties of agent-based models, as well as to what one can and cannot expect from such models, particularly when they are applied to social-scientific investigation. Special attention is given to the issues of validation. Classification-ACM-1998: J.4 [Computer Applications]; Social and behavior...

  2. Modeling and simulation of complex systems a framework for efficient agent-based modeling and simulation

    CERN Document Server

    Siegfried, Robert

    2014-01-01

    Robert Siegfried presents a framework for efficient agent-based modeling and simulation of complex systems. He compares different approaches for describing structure and dynamics of agent-based models in detail. Based on this evaluation the author introduces the "General Reference Model for Agent-based Modeling and Simulation" (GRAMS). Furthermore he presents parallel and distributed simulation approaches for execution of agent-based models - from small scale to very large scale. The author shows how agent-based models may be executed by different simulation engines that utilize underlying hard

  3. Model based sustainable production of biomethane

    OpenAIRE

    Biernacki, Piotr

    2014-01-01

    The main intention of this dissertation was to evaluate sustainable production of biomethane with use of mathematical modelling. To achieve this goal, widely acknowledged models like Anaerobic Digestion Model No.1 (ADM1), describing anaerobic digestion, and electrolyte Non-Random Two Liquid Model (eNRTL), for gas purification, were utilized. The experimental results, batch anaerobic digestion of different substrates and carbon dioxide solubility in 2-(Ethylamino)ethanol, were used to determin...

  4. Agent Based Reasoning in Multilevel Flow Modeling

    DEFF Research Database (Denmark)

    Lind, Morten; Zhang, Xinxin

    2012-01-01

    Multilevel Flow Modeling (MFM) is a modeling method used for modeling complex industrial plants. Currently, MFM is supported by a standalone software tool called the MFM Workbench, which is equipped with causal-relation analysis and other functionalities. The aim of this paper is to offer a new design...

  5. On Process Modelling Using Physical Oriented And Phenomena Based Principles

    Directory of Open Access Journals (Sweden)

    Mihai Culea

    2000-12-01

    Full Text Available This work presents a modelling framework based on a phenomena description of the process. The approach is taken to ease the understanding and construction of process models in heterogeneous, possibly distributed, modelling and simulation environments. A simplified case study of a heat exchanger is considered, and the Modelica modelling language is used to check the proposed concept. The partial results are promising and the research effort will be extended into a computer-aided modelling environment based on phenomena.

  6. A Size-based Ecosystem Model

    DEFF Research Database (Denmark)

    Ravn-Jonsen, Lars

    Ecosystem management requires models that can link the ecosystem level to the operation level. This link can be created by an ecosystem production model. Because the function of the individual fish in the marine ecosystem, seen in a trophic context, is closely related to its size, the model groups fish according to size. The model summarises individual predation events into ecosystem-level properties, and thereby uses the law of conservation of mass as a framework. This paper provides the background, the conceptual model, basic assumptions, integration of fishing activities, mathematical completion, and a numeric implementation. Using two experiments, the model's ability to act as a tool for economic production analysis and regulation design testing is demonstrated. The presented model is the simplest possible and is built on the principles of (i) size, as the attribute that determines...

  7. A model-based multisensor data fusion knowledge management approach

    Science.gov (United States)

    Straub, Jeremy

    2014-06-01

    A variety of approaches exist for combining data from multiple sensors. The model-based approach combines data based on its support for or refutation of elements of the model, which in turn can be used to evaluate an experimental thesis. This paper presents a collection of algorithms for mapping various types of sensor data onto a thesis-based model and evaluating the truth or falsity of the thesis, based on the model. The use of this approach for autonomously arriving at findings and for prioritizing data is considered. Techniques for updating the model (instead of arriving at a true/false assertion) are also discussed.

  8. Integration of Simulink Models with Component-based Software Models

    DEFF Research Database (Denmark)

    Marian, Nicolae; Top, Søren

    2008-01-01

    of abstract system descriptions. Usually, in mechatronics systems, design proceeds by iterating model construction, model analysis, and model transformation. Constructing a MATLAB/Simulink model, plant and controller behavior is simulated using graphical blocks to represent mathematical and logical ... of MATLAB/Simulink blocks to COMDES software components, both for continuous and discrete behaviour, and the transformation of the software system into S-functions. The general aim of this work is the improvement of multi-disciplinary development of embedded systems with a focus on the relation...

  9. Agent-based modeling and systems dynamics model reproduction.

    Energy Technology Data Exchange (ETDEWEB)

    North, M. J.; Macal, C. M. (Decision and Information Sciences)

    2009-01-01

    Reproducibility is a pillar of the scientific endeavour. We view computer simulations as laboratories for electronic experimentation and therefore as tools for science. Recent studies have addressed model reproduction and found it to be surprisingly difficult to replicate published findings. There have been enough failed simulation replications to raise the question, 'can computer models be fully replicated?' This paper answers in the affirmative by reporting on a successful reproduction study using Mathematica, Repast and Swarm for the Beer Game supply chain model. The reproduction process was valuable because it demonstrated the original result's robustness across modelling methodologies and implementation environments.

  10. The study of snack behaviors and the application of classification trees to filter influencing factors among middle school students in Guangzhou

    Institute of Scientific and Technical Information of China (English)

    田唤; 马绍斌; 范存欣; 刘国宁; 陈然

    2011-01-01

    Objective: To understand the snack behaviors of junior middle school students in Guangzhou and to explore their influencing factors, so as to provide a basis for schools and relevant health departments to carry out snack education. Methods: Using convenience sampling, a questionnaire survey was conducted among all students of Tianxiu Junior Middle School in Guangzhou; the effects of awareness of the Consumer Guide to Children and Adolescents on Snacks and of gender on snack behaviors were compared, and a classification tree model was applied to filter the influencing factors. Results: The snack consumption rate among the students was 46.3%, and the formation rates of good snack behavior habits ranged from 12.1% to 72.6%. Students with different levels of awareness of the guide differed significantly in paying attention to snack nutrition (χ2=18.317, P<0.05), checking the shelf life of snacks (χ2=54.014, P<0.05), not eating snacks while surfing the Internet or watching TV (χ2=4.799, P<0.05), not eating snacks within 1-2 h before or after meals (χ2=56.147, P<0.05), and not replacing meals with snacks (χ2=7.635, P<0.05). Differences between genders were statistically significant for most snack behaviors (P<0.05). Classification tree analysis showed that parental attitude, weekly money available for snacks, awareness of the guide and snack packaging influenced the students' daily frequency of snack consumption. Conclusion: The snack consumption concepts and behaviors of junior middle school students still need improvement; strengthening multi-department cooperation and adopting multiple channels of publicity and education to reasonably guide students' snack selection and consumption is very important.

  11. A Chakra-Based Model of Group Development.

    Science.gov (United States)

    Gilchrist, Roger; Mikulas, William L.

    1993-01-01

    Describes a model for sequential stages of group development based on yogic chakra system. Compares chakra-based model with other models of group developmental stages. Using context of chakra system, specifies basic dynamic issues and leader interventions for each stage and discusses relationship of individual development to group process. (Author)

  12. The Gap of Current Agent Based Simulation Modeling Practices and Feasibility of a Generic Agent Based Simulation Model

    OpenAIRE

    Yim Ling Loo; Alicia Y.C. Tang; Azhana Ahmad

    2015-01-01

    Agent-based modeling has been evolving into an established approach for modeling simulation systems that are used to understand and predict certain real-life scenarios in specific domains. Past research that is domain-specific has caused repetitive building of new models from scratch and restricts replication and reuse because of the limited description of the models. This paper presents a review of the gaps between domain-specific agent-based simulation modeling and the recent practices of agent-based...

  13. Model-based software process improvement

    Science.gov (United States)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  14. Designing Network-based Business Model Ontology

    DEFF Research Database (Denmark)

    Hashemi Nekoo, Ali Reza; Ashourizadeh, Shayegheh; Zarei, Behrouz

    2015-01-01

    Survival in a dynamic environment is not achieved without a map. Scanning and monitoring of the market show business models to be a fruitful tool. But scholars believe that old-fashioned business models are dead, as they do not include the effects of the internet and networks. This paper ... such as shared-mental model and trust. However, it mostly covers previous business model elements. To confirm the applicability of this ontology, it has been implemented in a business angel network, showing how it works.

  15. Image based modeling of tumor growth.

    Science.gov (United States)

    Meghdadi, N; Soltani, M; Niroomand-Oscuii, H; Ghalichi, F

    2016-09-01

    Tumors are a main cause of morbidity and mortality worldwide. Despite the efforts of the clinical and research communities, little has been achieved in the past decades in terms of improving the treatment of aggressive tumors. Understanding the underlying mechanism of tumor growth and evaluating the effects of different therapies are valuable steps in predicting the survival time and improving the patients' quality of life. Several studies have been devoted to tumor growth modeling at different levels to improve the clinical outcome by predicting the results of specific treatments. Recent studies have proposed patient-specific models using clinical data usually obtained from clinical images and evaluating the effects of various therapies. The aim of this review is to highlight the imaging role in tumor growth modeling and provide a worthwhile reference for biomedical and mathematical researchers with respect to tumor modeling using the clinical data to develop personalized models of tumor growth and evaluating the effect of different therapies.

  16. A methodology to support multidisciplinary model-based water management

    NARCIS (Netherlands)

    Scholten, H.; Kassahun, A.; Refsgaard, J.C.; Kargas, Th.; Gavardinas, C.; Beulens, A.J.M.

    2007-01-01

    Quality assurance in model based water management is needed because of some frequently perceived shortcomings, e.g. a lack of mutual understanding between modelling team members, malpractice and a tendency of modellers to oversell model capabilities. Initiatives to support quality assurance focus on

  17. An approach for activity-based DEVS model specification

    DEFF Research Database (Denmark)

    Alshareef, Abdurrahman; Sarjoughian, Hessam S.; Zarrin, Bahram

    2016-01-01

    activity-based behavior modeling of parallel DEVS atomic models. We consider UML activities and actions as fundamental units of behavior modeling, especially in the presence of recent advances in the UML 2.5 specifications. We describe in detail how to approach activity modeling with a set of elemental...

  18. A Stock Pricing Model Based on Arithmetic Brownian Motion

    Institute of Scientific and Technical Information of China (English)

    YAN Yong-xin; HAN Wen-xiu

    2001-01-01

    This paper presents a new stock pricing model based on arithmetic Brownian motion. The model completely overcomes the shortcomings of the Gordon model. With the model, investors can estimate the stock value of surplus companies, deficit companies, zero-growth companies and bankrupt companies for long-term or short-term investment.
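
    A hedged sketch of the underlying process, S(t) = S0 + mu*t + sigma*W(t): unlike geometric models, arithmetic Brownian motion admits negative values, which is what lets such a model describe deficit and bankrupt companies. All parameters below are illustrative, not the paper's.

    ```python
    # Hedged sketch: simulate arithmetic Brownian motion paths.
    import numpy as np

    def abm_paths(s0, mu, sigma, T=1.0, steps=252, n_paths=10_000, seed=4):
        rng = np.random.default_rng(seed)
        dt = T / steps
        dW = rng.normal(0.0, np.sqrt(dt), (n_paths, steps))
        # S(t_k) = s0 + mu*t_k + sigma*W(t_k)
        return s0 + mu * dt * np.arange(1, steps + 1) + sigma * np.cumsum(dW, axis=1)

    paths = abm_paths(s0=10.0, mu=2.0, sigma=5.0)
    print("mean terminal value:", paths[:, -1].mean())     # ~ s0 + mu*T = 12
    print("fraction ending below zero:", (paths[:, -1] < 0).mean())
    ```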

  19. Agent-Based Modeling of Growth Processes

    Science.gov (United States)

    Abraham, Ralph

    2014-01-01

    Growth processes abound in nature, and are frequently the target of modeling exercises in the sciences. In this article we illustrate an agent-based approach to modeling, in the case of a single example from the social sciences: bullying.

  20. Computer-based modelling and analysis in engineering geology

    OpenAIRE

    Giles, David

    2014-01-01

    This body of work presents the research and publications undertaken under a general theme of computer-based modelling and analysis in engineering geology. Papers are presented on geotechnical data management, data interchange, Geographical Information Systems, surface modelling, geostatistical methods, risk-based modelling, knowledge-based systems, remote sensing in engineering geology and on the integration of computer applications into applied geoscience teaching. The work highlights my...

  1. Model Based Predictive Control of a Fully Parallel Robot

    OpenAIRE

    Vivas, Oscar Andrès; Poignet, Philippe

    2003-01-01

    This paper deals with an efficient application of model-based predictive control to parallel machines. A receding-horizon control strategy based on a simplified dynamic model is implemented. Experimental results are shown for the H4 robot, a fully parallel structure providing 3 degrees of freedom (dof) in translation and 1 dof in rotation. The model-based predictive control and the commonly used computed-torque control strategies are compared. The tracking performances and the robustness wi...
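
    A minimal receding-horizon sketch for a linear plant follows; it condenses the horizon predictions into one unconstrained least-squares problem and applies only the first input each step. The double-integrator dynamics stand in for the paper's simplified model, since the H4 robot's dynamics are not given in this record.

    ```python
    # Hedged sketch of unconstrained receding-horizon (predictive) control.
    import numpy as np

    def mpc_step(A, B, x, x_ref, N=10, qu=0.01):
        """Pick u0 minimizing sum ||x_k - x_ref||^2 + qu*||u_k||^2 over horizon N."""
        n, m = B.shape
        # Stacked prediction matrices: X = F x + G U.
        F = np.vstack([np.linalg.matrix_power(A, k + 1) for k in range(N)])
        G = np.zeros((N * n, N * m))
        for i in range(N):
            for j in range(i + 1):
                G[i*n:(i+1)*n, j*m:(j+1)*m] = np.linalg.matrix_power(A, i - j) @ B
        Xr = np.tile(x_ref, N)
        H = G.T @ G + qu * np.eye(N * m)
        U = np.linalg.solve(H, G.T @ (Xr - F @ x))   # least-squares input sequence
        return U[:m]                                 # apply only the first input

    # Toy double integrator driven to the origin.
    dt = 0.05
    A = np.array([[1.0, dt], [0.0, 1.0]])
    B = np.array([[0.5 * dt**2], [dt]])
    x = np.array([1.0, 0.0])
    for _ in range(100):
        x = A @ x + B @ mpc_step(A, B, x, np.zeros(2))
    print("final state:", x)  # close to the origin
    ```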

  2. Modeling amperometric biosensors based on allosteric enzymes

    Directory of Open Access Journals (Sweden)

    Liutauras Ričkus

    2013-09-01

    Full Text Available Computational modeling of a biosensor with an allosteric enzyme layer was investigated in this study. The operation of the biosensor is modeled using non-stationary reaction-diffusion equations. The model involves three regions: the allosteric enzyme layer, where the allosteric enzyme reactions as well as mass transport by diffusion take place; the diffusion region, where mass transport by diffusion and non-enzymatic reactions take place; and the convective region, in which the analyte concentration is maintained constant. The dependence of the biosensor response on the substrate concentration, the cooperativity coefficient and the thickness of the diffusion layer has been studied.
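
    A hedged 1-D sketch of the kind of computation described: substrate diffuses through the enzyme layer and is consumed by a Hill-type (allosteric) rate law, with the electrode current taken as proportional to the product flux at the electrode. The geometry, kinetic parameters and boundary conditions below are illustrative assumptions, not the paper's model.

    ```python
    # Hedged 1-D reaction-diffusion sketch with Hill (allosteric) kinetics.
    import numpy as np

    def biosensor_current(Vmax=1.0, K=0.5, n_hill=2.0, D=1e-2, S0=1.0,
                          L=1.0, nx=101, dt=2e-4, t_end=5.0):
        dx = L / (nx - 1)
        S = np.zeros(nx); S[-1] = S0   # substrate; bulk value held at x = L
        P = np.zeros(nx)               # product; consumed at the electrode x = 0
        for _ in range(int(t_end / dt)):
            rate = Vmax * S**n_hill / (K**n_hill + S**n_hill)  # Hill rate law
            for C, sign in ((S, -1.0), (P, +1.0)):   # substrate sink, product source
                lap = (C[2:] - 2 * C[1:-1] + C[:-2]) / dx**2
                C[1:-1] += dt * (D * lap + sign * rate[1:-1])  # explicit Euler step
            S[0] = S[1]                # zero substrate flux at the electrode
            P[0] = 0.0                 # product consumed electrochemically
            S[-1] = S0; P[-1] = 0.0    # bulk boundary conditions
        return D * (P[1] - P[0]) / dx  # current ~ product flux at the electrode

    print("quasi-steady current proxy:", biosensor_current())
    ```

    Sweeping S0 or n_hill in this toy reproduces the qualitative dependence the paper studies: the sigmoidal Hill term makes the response curve steeper as the cooperativity coefficient grows.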

  3. Statistical Model-Based Face Pose Estimation

    Institute of Scientific and Technical Information of China (English)

    GE Xinliang; YANG Jie; LI Feng; WANG Huahua

    2007-01-01

    A robust face pose estimation approach is proposed using a face shape statistical model, with pose parameters represented by trigonometric functions. The face shape statistical model is first built by analyzing face shapes from different people under varying poses. Shape alignment is vital in the process of building the statistical model. Then, six trigonometric functions are employed to represent the face pose parameters. Lastly, the mapping function between face image and face pose is constructed by linearly relating the different parameters. The proposed approach is able to estimate different face poses using a few face training samples. Experimental results are provided to demonstrate its efficiency and accuracy.

  4. Multimedia Data Modeling Based on Temporal Logic and XYZ System

    Institute of Scientific and Technical Information of China (English)

    MA Huadong; LIU Shenquan

    1999-01-01

    This paper proposes a new approach to modeling multimedia data. The new approach is a multimedia data model based on temporal logic and the XYZ System. It supports formal specifications in a multimedia system. Using this model, we can not only specify information units but also design and script a multimedia title in a unified framework. Based on this model, an interactive multimedia authoring environment has been developed.

  5. Simplified Atmospheric Dispersion Model and Model-Based Real Field Estimation System of Air Pollution

    Institute of Scientific and Technical Information of China (English)

    2015-01-01

    The atmospheric dispersion model has been well developed and applied in pollution emergencies and prediction. Based on the sophisticated air diffusion model, this paper proposes a simplified model and some optimizations regarding meteorological and geological conditions. The model is suitable for what is proposed as a Real Field Monitor and Estimation system. The principle of the simplified diffusion model and its optimization is studied. The design of the Real Field Monitor system based on this model and its fundamental implementation are introduced.
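
    The record does not give the simplified model's equations. As a hedged illustration of the kind of dispersion calculation involved, here is the classic Gaussian plume formula with a ground-reflection term; in practice the dispersion coefficients sy and sz come from stability-class curves evaluated at the downwind distance x, so the numbers below are illustrative.

    ```python
    # Hedged sketch of the classic Gaussian plume dispersion formula.
    import numpy as np

    def gaussian_plume(x, y, z, Q, u, H, sy, sz):
        """Concentration at (x, y, z): source rate Q (g/s), wind speed u (m/s),
        effective stack height H (m), dispersion coefficients sy, sz (m).
        x (downwind distance) is used in practice to evaluate sy(x), sz(x)."""
        lateral = np.exp(-y**2 / (2 * sy**2))
        vertical = (np.exp(-(z - H)**2 / (2 * sz**2)) +
                    np.exp(-(z + H)**2 / (2 * sz**2)))  # ground reflection term
        return Q / (2 * np.pi * u * sy * sz) * lateral * vertical

    # Ground-level, centerline concentration 1 km downwind (illustrative values).
    print(gaussian_plume(x=1000.0, y=0.0, z=0.0, Q=100.0, u=5.0, H=50.0,
                         sy=80.0, sz=40.0))
    ```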

  6. Image-Based Modeling of Plants and Trees

    CERN Document Server

    Kang, Sing Bang

    2009-01-01

    Plants and trees are among the most complex natural objects. Much work has been done attempting to model them, with varying degrees of success. In this book, we review the various approaches in computer graphics, which we categorize as rule-based, image-based, and sketch-based methods. We describe our approaches for modeling plants and trees using images. Image-based approaches have the distinct advantage that the resulting model inherits the realistic shape and complexity of a real plant or tree. We use different techniques for modeling plants (with relatively large leaves) and trees (with re

  7. Physics-Based Pneumatic Hammer Instability Model Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Florida Turbine Technologies (FTT) proposes to conduct research necessary to develop a physics-based pneumatic hammer instability model for hydrostatic bearings...

  8. Probabilistic reasoning for assembly-based 3D modeling

    KAUST Repository

    Chaudhuri, Siddhartha

    2011-01-01

    Assembly-based modeling is a promising approach to broadening the accessibility of 3D modeling. In assembly-based modeling, new models are assembled from shape components extracted from a database. A key challenge in assembly-based modeling is the identification of relevant components to be presented to the user. In this paper, we introduce a probabilistic reasoning approach to this problem. Given a repository of shapes, our approach learns a probabilistic graphical model that encodes semantic and geometric relationships among shape components. The probabilistic model is used to present components that are semantically and stylistically compatible with the 3D model that is being assembled. Our experiments indicate that the probabilistic model increases the relevance of presented components. © 2011 ACM.

  9. Demand forecast model based on CRM

    Science.gov (United States)

    Cai, Yuancui; Chen, Lichao

    2006-11-01

    With the day-by-day internalization of the management philosophy that regards the customer as the centre, forecasting customer demand becomes more and more important. In demand forecasting for customer relationship management, traditional forecast methods have great limitations because of the large uncertainty of demand; this calls for new modeling to meet the demands of development. In this paper, the notion is to forecast demand according to the characteristics of the potential customer and then to model on that basis. The model first depicts the customer by adopting uniform multiple indexes. Secondly, the model acquires characteristic customers on the basis of a data warehouse and data mining technology. Lastly, the most similar characteristic customer is found by comparison, and the demands of a new customer are forecast from that most similar characteristic customer.

  10. Models for Patch-Based Image Restoration

    Directory of Open Access Journals (Sweden)

    Petrovic Nemanja

    2009-01-01

    Full Text Available Abstract We present a supervised learning approach for object-category specific restoration, recognition, and segmentation of images which are blurred using an unknown kernel. The novelty of this work is a multilayer graphical model which unifies the low-level vision task of restoration and the high-level vision task of recognition in a cooperative framework. The graphical model is an interconnected two-layer Markov random field. The restoration layer accounts for the compatibility between sharp and blurred images and models the association between adjacent patches in the sharp image. The recognition layer encodes the entity class and its location in the underlying scene. The potentials are represented using nonparametric kernel densities and are learnt from training data. Inference is performed using nonparametric belief propagation. Experiments demonstrate the effectiveness of our model for the restoration and recognition of blurred license plates as well as face images.

  11. An event-based model for contracts

    Directory of Open Access Journals (Sweden)

    Tiziana Cimoli

    2013-02-01

    We introduce a basic model for contracts. Our model extends event structures with a new relation, which faithfully captures the circular dependencies among contract clauses. We establish whether an agreement exists which respects all the contracts at hand (i.e. whether all the dependencies can be resolved), and we detect the obligations of each participant. The main technical contribution is a correspondence between our model and a fragment of the contract logic PCL. More precisely, we show that the reachable events are exactly those which correspond to provable atoms in the logic. Despite this strong correspondence, our model improves on previous work on PCL by exhibiting a finer-grained notion of culpability, which takes into account the legitimate orderings of events.

  12. A Model-Based Privacy Compliance Checker

    OpenAIRE

    Siani Pearson; Damien Allison

    2009-01-01

    Increasingly, e-business organisations are coming under pressure to comply with a range of privacy legislation, policies and best practice. There is a clear need for high-level management and administrators to be able to assess, in a dynamic, customisable way, the degree to which their enterprise complies with these. We outline a solution to this problem in the form of a model-driven automated privacy process analysis and configuration checking system. This system models privacy compliance ...

  13. ALC: automated reduction of rule-based models

    Directory of Open Access Journals (Sweden)

    Gilles Ernst

    2008-10-01

    Background: Combinatorial complexity is a challenging problem for the modeling of cellular signal transduction since the association of a few proteins can give rise to an enormous number of feasible protein complexes. The layer-based approach is an approximative, but accurate method for the mathematical modeling of signaling systems with inherent combinatorial complexity. The number of variables in the simulation equations is highly reduced and the resulting dynamic models show a pronounced modularity. Layer-based modeling allows for the modeling of systems not accessible previously. Results: ALC (Automated Layer Construction) is a computer program that greatly simplifies the building of reduced modular models according to the layer-based approach. The model is defined using a simple but powerful rule-based syntax that supports the concepts of modularity and macrostates. ALC performs consistency checks on the model definition and provides the model output in different formats (C MEX, MATLAB, Mathematica and SBML) as ready-to-run simulation files. ALC also provides additional documentation files that simplify the publication or presentation of the models. The tool can be used offline or via a form on the ALC website. Conclusion: ALC allows for a simple rule-based generation of layer-based reduced models. The model files are given in different formats as ready-to-run simulation files.

  14. Comprehensive Equivalent Circuit Based Modeling and Model Based Management of Aged Lithium ion Batteries

    Science.gov (United States)

    Tong, Shijie

    Energy storage is one of society's grand challenges for the 21st century. Lithium ion batteries (LIBs) are widely used in mobile devices, transportation, and stationary energy storage due to falling costs combined with excellent power/energy density as well as cycle durability. The need for a battery management system (BMS) arises from a demand to improve cycle life, assure safety, and optimize full-pack performance. In this work, we propose a model-based on-line state of charge (SoC) and state of health (SoH) estimator for LIBs. The estimator incorporates a comprehensive Equivalent Circuit Model (ECM) as reference, an Extended Kalman Filter (EKF) as state observer, a Recursive Least Squares (RLS) algorithm as parameter identifier, and Parameter Varying Approach (PVA) based optimization algorithms for the parameter function regressions. The developed adaptive estimator was applied to a 10kW smart grid energy storage application using retired electric vehicle batteries. The estimator exhibits high numerical efficiency as well as excellent accuracy in estimating SoC and SoH. The estimator also provides a novel method to optimize the correlation between battery open circuit voltage (OCV) and SoC, which further improves state estimation accuracy.
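
    As a flavour of the RLS ingredient of such an estimator (a simplified stand-in, not the dissertation's implementation), the sketch below identifies the ohmic resistance of a first-order equivalent-circuit cell from simulated voltage/current samples; the cell parameters are invented:

        import numpy as np

        rng = np.random.default_rng(0)
        R_true, ocv = 0.05, 3.7                      # assumed cell parameters
        current = rng.uniform(0.5, 2.0, 200)         # discharge current samples (A)
        voltage = ocv - R_true * current + rng.normal(0, 1e-3, 200)

        # Scalar recursive least squares for y = theta * x, where
        # y = ocv - voltage (the IR drop) and x = current.
        theta, P, lam = 0.0, 1e3, 0.99               # estimate, covariance, forgetting
        for i, v in zip(current, voltage):
            y = ocv - v
            k = P * i / (lam + i * P * i)            # gain
            theta += k * (y - theta * i)             # update estimate
            P = (P - k * i * P) / lam                # update covariance
        print(f"estimated R = {theta:.4f} ohm (true {R_true})")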

  15. Model based process-product design and analysis

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    This paper gives a perspective on modelling and the important role it has within product-process design and analysis. Different modelling issues related to the development and application of systematic model-based solution approaches for product-process design are discussed, and the need for a hybrid...... model-based framework is highlighted. This framework should be able to manage knowledge-data, models, and associated methods and tools integrated with design work-flows and data-flows for specific product-process design problems. In particular, the framework needs to manage models of different types...

  16. Model-based Prognostics under Limited Sensing

    Data.gov (United States)

    National Aeronautics and Space Administration — Prognostics is crucial to providing reliable condition-based maintenance decisions. To obtain accurate predictions of component life, a variety of sensors are often...

  17. A PSEUDO RELEVANCE BASED IMAGE RETRIEVAL MODEL

    OpenAIRE

    Kamini Thakur; Preetika Saxena

    2015-01-01

    Image retrieval is a basic and common task nowadays. Content-based image retrieval (CBIR) is a popular approach in which the target image is retrieved based on useful features of the given image. CBIR is an active and fast-growing research area in both image processing and data mining. In marine ecosystems, the captured images have lower resolution and require transformation- and translation-invariant capabilities. Therefore, accurate image extraction according to the u...

  18. A conceptual data model coupling with physically-based distributed hydrological models based on catchment discretization schemas

    Science.gov (United States)

    Liu, Yuanming; Zhang, Wanchang; Zhang, Zhijie

    2015-11-01

    In hydrology, the data types, spatio-temporal scales and formats for physically-based distributed hydrological models and the distributed data or parameters may differ before significant data pre-processing, or may change during hydrological simulation run time. A data model is devoted to these problems for sophisticated numerical hydrological modeling procedures. In this paper, we propose a conceptual data model to interpret comprehensive, universal and complex water environmental entities. We also present an innovative integration methodology to couple the data model with physically-based distributed hydrological models (DHMs) based on catchment discretization schemas. The data model provides a reasonable framework for researchers to organize and pre-process water environmental spatio-temporal datasets. It also facilitates seamless and dynamic data flow, with hydrological response units (HRUs) at the core, between object-oriented databases and physically-based distributed hydrological models.

  19. An Emotional Agent Model Based on Granular Computing

    Directory of Open Access Journals (Sweden)

    Jun Hu

    2012-01-01

    Affective computing has very important significance for fulfilling intelligent information processing and harmonious communication between human beings and computers. A new model for emotional agents is proposed in this paper to give agents the ability to handle emotions, based on granular computing theory and the traditional BDI agent model. Firstly, a new emotion knowledge base based on granular computing for emotion expression is presented in the model. Secondly, a new emotional reasoning algorithm based on granular computing is proposed. Thirdly, a new emotional agent model based on granular computing is presented. Finally, based on the model, an emotional agent for patient assistance in a hospital is realized; experimental results show that it handles simple emotions efficiently.

  1. Mechanics and model-based control of advanced engineering systems

    CERN Document Server

    Irschik, Hans; Krommer, Michael

    2014-01-01

    Mechanics and Model-Based Control of Advanced Engineering Systems collects 32 contributions presented at the International Workshop on Advanced Dynamics and Model Based Control of Structures and Machines, which took place in St. Petersburg, Russia in July 2012. The workshop continued a series of international workshops, which started with a Japan-Austria Joint Workshop on Mechanics and Model Based Control of Smart Materials and Structures and a Russia-Austria Joint Workshop on Advanced Dynamics and Model Based Control of Structures and Machines. In the present volume, 10 full-length papers based on presentations from Russia, 9 from Austria, 8 from Japan, 3 from Italy, one from Germany and one from Taiwan are included, which represent the state of the art in the field of mechanics and model based control, with particular emphasis on the application of advanced structures and machines.

  2. Active Appearance Model Based Hand Gesture Recognition

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    This paper addresses the application of hand gesture recognition in monocular image sequences using the Active Appearance Model (AAM). For this work, the proposed algorithm is composed of constructing AAMs and fitting the models to the region of interest. In the training stage, the relative AAM is constructed according to the manually labeled feature points and the corresponding average feature is obtained. In the recognition stage, the hand gesture region is first segmented by skin and movement cues. Secondly, the models are fitted to the image that includes the hand gesture, and the relative features are extracted. Thirdly, classification is done by comparing the extracted features with the average features. 30 different gestures of Chinese sign language are used to test the effectiveness of the method. The experimental results indicate good performance of the algorithm.

  3. Model-Based Design of Biochemical Microreactors.

    Science.gov (United States)

    Elbinger, Tobias; Gahn, Markus; Neuss-Radu, Maria; Hante, Falk M; Voll, Lars M; Leugering, Günter; Knabner, Peter

    2016-01-01

    Mathematical modeling of biochemical pathways is an important resource in Synthetic Biology, as the predictive power of simulating synthetic pathways represents an important step in the design of synthetic metabolons. In this paper, we are concerned with the mathematical modeling, simulation, and optimization of metabolic processes in biochemical microreactors able to carry out enzymatic reactions and to exchange metabolites with their surrounding medium. The results of the reported modeling approach are incorporated in the design of the first microreactor prototypes that are under construction. These microreactors consist of compartments separated by membranes carrying specific transporters for the input of substrates and export of products. Inside the compartments of the reactor multienzyme complexes assembled on nano-beads by peptide adapters are used to carry out metabolic reactions. The spatially resolved mathematical model describing the ongoing processes consists of a system of diffusion equations together with boundary and initial conditions. The boundary conditions model the exchange of metabolites with the neighboring compartments and the reactions at the surface of the nano-beads carrying the multienzyme complexes. Efficient and accurate approaches for numerical simulation of the mathematical model and for optimal design of the microreactor are developed. As a proof-of-concept scenario, a synthetic pathway for the conversion of sucrose to glucose-6-phosphate (G6P) was chosen. In this context, the mathematical model is employed to compute the spatio-temporal distributions of the metabolite concentrations, as well as application relevant quantities like the outflow rate of G6P. These computations are performed for different scenarios, where the number of beads as well as their loading capacity are varied. The computed metabolite distributions show spatial patterns, which differ for different experimental arrangements. Furthermore, the total output of G6P

  4. Kalman filter-based gap conductance modeling

    International Nuclear Information System (INIS)

    Geometric and thermal property uncertainties contribute greatly to the problem of determining conductance within the fuel-clad gas gap of a nuclear fuel pin. Accurate conductance values are needed for power plant licensing transient analysis and for test analyses at research facilities. Recent work by Meek, Doerner, and Adams has shown that the use of Kalman filters to estimate gap conductance is a promising approach. A Kalman filter is simply a mathematical algorithm that employs available system measurements and assumed dynamic models to generate optimal system state vector estimates. This summary addresses another Kalman filter approach to gap conductance estimation and subsequent identification of an empirical conductance model.
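
    To make the Kalman-filter idea concrete, here is a minimal scalar filter tracking one slowly varying quantity (a stand-in for gap conductance) from noisy measurements; the random-walk dynamics and noise variances are assumed purely for illustration:

        import random

        random.seed(1)
        q, r = 1e-4, 0.04          # assumed process and measurement noise variances
        x_est, p = 0.0, 1.0        # initial state estimate and its variance
        true_state = 1.0           # hidden quantity the filter should recover

        for step in range(200):
            # Predict: random-walk model x_k = x_{k-1} + w, w ~ N(0, q)
            p += q
            # Measure: z = x + v, v ~ N(0, r)
            z = true_state + random.gauss(0.0, r ** 0.5)
            # Update: blend prediction and measurement via the Kalman gain
            k = p / (p + r)
            x_est += k * (z - x_est)
            p *= (1.0 - k)

        print(round(x_est, 3))     # close to 1.0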

  5. Automata-Based CSL Model Checking

    DEFF Research Database (Denmark)

    Zhang, Lijun; Jansen, David N.; Nielson, Flemming;

    2011-01-01

    For continuous-time Markov chains, the model-checking problem with respect to continuous-time stochastic logic (CSL) has been introduced and shown to be decidable by Aziz, Sanwal, Singhal and Brayton in 1996. The presented decision procedure, however, has exponential complexity. In this paper, we...... probability can then be approximated in polynomial time (using uniformization). This makes the present work the centerpiece of a broadly applicable full CSL model checker. Recently, the decision algorithm by Aziz et al. was shown to be incorrect in general. In fact, it works only for stratified CTMCs...

  6. Improved word-based language model

    Institute of Scientific and Technical Information of China (English)

    CHEN Yong(陈勇); CHAN Kwok-ping

    2004-01-01

    In order to construct a good language model for the postprocessing phase of a recognition system, a smoothing technique must be used to solve the data sparseness problem. Many smoothing techniques have been proposed in the past; among them, Katz's smoothing technique is well known. However, we found a weakness in Katz's smoothing technique. We improved this approach by incorporating a kind of special Chinese language information and Chinese word class information into the language model. We tested the new smoothing technique with a Chinese character recognition system. The experimental results showed that better performance can be achieved.
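
    As a toy illustration of the kind of smoothing discussed here (a generic absolute-discounting backoff sketch, not the authors' improved Katz variant), the snippet below estimates bigram probabilities that back off to unigram statistics for unseen pairs:

        from collections import Counter

        corpus = "the cat sat on the mat the cat ate".split()
        unigrams = Counter(corpus)
        bigrams = Counter(zip(corpus, corpus[1:]))
        total = sum(unigrams.values())
        D = 0.5                                       # discount per seen bigram

        def p_backoff(w1, w2):
            """P(w2 | w1) with absolute discounting and unigram backoff."""
            c1 = unigrams[w1]
            c12 = bigrams[(w1, w2)]
            if c12 > 0:
                return (c12 - D) / c1
            # Probability mass freed by discounting, redistributed via unigrams.
            seen = sum(1 for (a, _b) in bigrams if a == w1)
            alpha = D * seen / c1
            return alpha * unigrams[w2] / total

        print(p_backoff("the", "cat"))   # seen bigram
        print(p_backoff("cat", "on"))    # unseen pair -> backed-off estimate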

  7. Thermodynamics-based models of transcriptional regulation with gene sequence.

    Science.gov (United States)

    Wang, Shuqiang; Shen, Yanyan; Hu, Jinxing

    2015-12-01

    Quantitative models of gene regulatory activity have the potential to improve our mechanistic understanding of transcriptional regulation. However, the few models available today have been based on simplistic assumptions about the sequences being modeled or heuristic approximations of the underlying regulatory mechanisms. In this work, we have developed a thermodynamics-based model to predict gene expression driven by any DNA sequence. The proposed model relies on a continuous-time, differential-equation description of transcriptional dynamics. The sequence features of the promoter are exploited to derive the binding affinity, which is computed on the basis of statistical molecular thermodynamics. Experimental results show that the proposed model can effectively identify the activity levels of transcription factors and the regulatory parameters. Compared with previous models, the proposed model reveals more biological insight.
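
    A schematic sketch of the two ingredients described above, with invented parameter values: a thermodynamic (Boltzmann-weight) binding probability for a transcription factor, feeding a differential-equation description of expression dynamics:

        import math

        R, T = 1.987e-3, 298.0            # gas constant in kcal/(mol K), temperature

        def p_bound(conc, dG):
            """Equilibrium probability that the site is occupied (two-state model)."""
            w = conc * math.exp(-dG / (R * T))    # Boltzmann weight of the bound state
            return w / (1.0 + w)

        # dm/dt = beta * p_bound - gamma * m, integrated with explicit Euler.
        beta, gamma, m, dt = 2.0, 0.1, 0.0, 0.01
        activator, dG = 0.5, -2.0                 # assumed concentration and affinity
        for _ in range(10000):
            m += dt * (beta * p_bound(activator, dG) - gamma * m)
        print(round(m, 2))                        # steady state ~ beta * p / gamma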

  8. Gradient-based adaptation of continuous dynamic model structures

    Science.gov (United States)

    La Cava, William G.; Danai, Kourosh

    2016-01-01

    A gradient-based method of symbolic adaptation is introduced for a class of continuous dynamic models. The proposed model structure adaptation method starts with the first-principles model of the system and adapts its structure after adjusting its individual components in symbolic form. A key contribution of this work is its introduction of the model's parameter sensitivity as the measure of symbolic changes to the model. This measure, which is essential to defining the structural sensitivity of the model, not only accommodates algebraic evaluation of candidate models in lieu of more computationally expensive simulation-based evaluation, but also makes possible the implementation of gradient-based optimisation in symbolic adaptation. The proposed method is applied to models of several virtual and real-world systems that demonstrate its potential utility.

  9. Cognitive Modeling for Agent-Based Simulation of Child Maltreatment

    Science.gov (United States)

    Hu, Xiaolin; Puddy, Richard

    This paper extends previous work to develop cognitive modeling for agent-based simulation of child maltreatment (CM). The developed model is inspired by parental efficacy, parenting stress, and the theory of planned behavior. It provides an explanatory, process-oriented model of CM and incorporates causality relationships and feedback loops from different factors in the social ecology in order to simulate the dynamics of CM. We describe the model and present simulation results to demonstrate the features of this model.

  10. An Optimization Model Based on Game Theory

    Directory of Open Access Journals (Sweden)

    Yang Shi

    2014-04-01

    Game theory has a wide range of applications in economics, but it is seldom used in the field of computer science, especially in optimization algorithms. In this paper, we integrate game-theoretic thinking into optimization and propose a new optimization model which can be widely used in optimization processing. This optimization model comes in two types, called "complete consistency" and "partial consistency"; the partial-consistency type adds a disturbance strategy on the basis of complete consistency. When the model's consistency is satisfied, the Nash equilibrium of the optimization model is globally optimal; when the consistency is not met, the presence of the perturbation strategy can broaden the applicability of the algorithm. Basic experiments suggest that this optimization model has broad applicability and better performance, and it gives a new idea for some intractable problems in the field of artificial intelligence.
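
    To illustrate the game-theoretic framing, here is a minimal best-response iteration on an invented two-player coordination game; the paper's complete/partial consistency conditions are not reproduced here:

        import numpy as np

        # Payoff matrices: entry [i, j] is the payoff when player 1 plays i
        # and player 2 plays j. An invented coordination game.
        A = np.array([[3, 0], [0, 2]])   # player 1's payoffs
        B = np.array([[3, 0], [0, 2]])   # player 2's payoffs

        def best_response_dynamics(a, b, steps=20):
            i, j = 0, 1                              # arbitrary starting strategies
            for _ in range(steps):
                i = int(np.argmax(a[:, j]))          # player 1 responds to j
                j = int(np.argmax(b[i, :]))          # player 2 responds to i
            return i, j

        print(best_response_dynamics(A, B))          # (1, 1): a pure Nash equilibrium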

  11. A Qualitative Acceleration Model Based on Intervals

    Directory of Open Access Journals (Sweden)

    Ester MARTINEZ-MARTIN

    2013-08-01

    On the way to autonomous service robots, spatial reasoning plays a main role since it properly deals with problems involving uncertainty. In particular, we are interested in knowing people's pose to avoid collisions. With that aim, in this paper, we present a qualitative acceleration model for robotic applications including representation, reasoning and a practical application.

  12. Will Rule based BPM obliterate Process Models?

    NARCIS (Netherlands)

    Joosten, S.; Joosten, H.J.M.

    2007-01-01

    Business rules can be used directly for controlling business processes, without reference to a business process model. In this paper we propose to use business rules to specify both business processes and the software that supports them. Business rules expressed in smart mathematical notations bring

  13. Kinetic data base for combustion modeling

    Energy Technology Data Exchange (ETDEWEB)

    Tsang, W.; Herron, J.T. [National Institute of Standards and Technology, Gaithersburg, MD (United States)

    1993-12-01

    The aim of this work is to develop a set of evaluated rate constants for use in the simulation of hydrocarbon combustion. The approach has been to begin with the small molecules and then introduce larger species with the various structural elements that can be found in all hydrocarbon fuels and decomposition products. Currently, the data base contains most of the species present in combustion systems with up to four carbon atoms. Thus, practically all the structural groupings found in aliphatic compounds have now been captured. The direction of future work is the addition of aromatic compounds to the data base.

  14. (Re)configuration based on model generation

    CERN Document Server

    Friedrich, Gerhard; Falkner, Andreas A; Haselböck, Alois; Schenner, Gottfried; Schreiner, Herwig; 10.4204/EPTCS.65.3

    2011-01-01

    Reconfiguration is an important activity for companies selling configurable products or services which have a long life time. However, identification of a set of required changes in a legacy configuration is a hard problem, since even small changes in the requirements might imply significant modifications. In this paper we show a solution based on answer set programming, which is a logic-based knowledge representation formalism well suited for a compact description of (re)configuration problems. Its applicability is demonstrated on simple abstractions of several real-world scenarios. The evaluation of our solution on a set of benchmark instances derived from commercial (re)configuration problems shows its practical applicability.

  15. Model-reduced gradient-based history matching

    NARCIS (Netherlands)

    Kaleta, M.P.

    2011-01-01

    Since the world's energy demand increases every year, the oil & gas industry makes a continuous effort to improve fossil fuel recovery. Physics-based petroleum reservoir modeling and the closed-loop model-based reservoir management concept can play an important role here. In this concept, measured data...

  16. Software Reuse of Mobile Systems based on Modelling

    Directory of Open Access Journals (Sweden)

    Guo Ping

    2016-01-01

    This paper presents an architectural-style-based modelling approach for the architectural design and analysis of mobile systems. The approach is developed based on UML-like meta models and graph transformation techniques to support sound methodological principles, formal analysis and refinement. The approach could support mobile system development.

  17. Agent-based modelling of socio-technical systems

    CERN Document Server

    van Dam, Koen H; Lukszo, Zofia

    2012-01-01

    Here is a practical introduction to agent-based modelling of socio-technical systems, based on methodology developed at TU Delft, which has been deployed in a number of case studies. Offers theory, methods and practical steps for creating real-world models.

  18. An Active Learning Exercise for Introducing Agent-Based Modeling

    Science.gov (United States)

    Pinder, Jonathan P.

    2013-01-01

    Recent developments in agent-based modeling as a method of systems analysis and optimization indicate that students in business analytics need an introduction to the terminology, concepts, and framework of agent-based modeling. This article presents an active learning exercise for MBA students in business analytics that demonstrates agent-based…

  19. Towards automatic model based controller design for reconfigurable plants

    DEFF Research Database (Denmark)

    Michelsen, Axel Gottlieb; Stoustrup, Jakob; Izadi-Zamanabadi, Roozbeh

    2008-01-01

    This paper introduces model-based Plug and Play Process Control, a novel concept for process control, which allows a model-based control system to be reconfigured when a sensor or an actuator is plugged into a controlled process. The work reported in this paper focuses on composing a monolithic m...

  20. A Granular Computing Model Based on Tolerance relation

    Institute of Scientific and Technical Information of China (English)

    WANG Guo-yin; HU Feng; HUANG Hai; WU Yu

    2005-01-01

    Granular computing is a new intelligent computing theory based on the partition of problem concepts. Processing incomplete information systems directly is an important problem in rough set theory. In this paper, a granular computing model based on a tolerance relation for processing incomplete information systems is developed. Furthermore, a criterion for attribute necessity is proposed in this model.

  1. Category model to modeling the surface texture knowledge-base

    OpenAIRE

    Yan WANG; Scott, Paul J; Jiang, Xiang

    2006-01-01

    The next generations of Geometrical Product Specification (GPS) standards are considered too theoretical, abstract, complex and over-elaborate, and it is not easy for industry to understand and implement them efficiently in a short time. An intelligent knowledge-based system, "VirtualSurf", is being developed to solve this problem, particularly for surface texture knowledge, which is a critical part of GPS. This system will provide expert knowledge of surface texture to link...

  2. CSPBuilder - CSP based Scientific Workflow Modelling

    DEFF Research Database (Denmark)

    Friborg, Rune Møllegaard; Vinter, Brian

    2008-01-01

    This paper introduces a framework for building CSP based applications, targeted for clusters and next generation CPU designs. CPUs are produced with several cores today and every future CPU generation will feature increasingly more cores, resulting in a requirement for concurrency that has not pr...

  3. Knowledge-based geometric modeling in construction

    DEFF Research Database (Denmark)

    Bonev, Martin; Hvam, Lars

    2012-01-01

    a considerably high amount of their resources is required for designing and specifying the majority of their product assortment. As design decisions are hereby based on knowledge and experience about behaviour and applicability of construction techniques and materials for a predefined design situation, smart...

  4. STOCHASTIC ADAPTIVE SWITCHING CONTROL BASED ON MULTIPLE MODELS

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yanxia; GUO Lei

    2002-01-01

    It is well known that the transient behavior of traditional adaptive control may be very poor in general, and that adaptive control designed based on switching between multiple models is an intuitively appealing and practically feasible approach to improve transient performance. In this paper, we prove that for a typical class of linear systems disturbed by random noises, the multiple-model-based least-squares (LS) adaptive switching control is stable and convergent, and has the same convergence rate as that established for standard least-squares-based self-tuning regulators. Moreover, the mixed case combining adaptive models with fixed models is also considered.

  5. Geometric Feature Extraction and Model Reconstruction Based on Scattered Data

    Institute of Scientific and Technical Information of China (English)

    胡鑫; 习俊通; 金烨

    2004-01-01

    A method of 3D model reconstruction based on scattered point data in reverse engineering is presented here. The topological relationship of the scattered points was established first; then the data set was triangulated to reconstruct the mesh surface model. The curvatures of the cloud data were calculated based on the mesh surface, and the point data were segmented by an edge-based method. Each patch of data was fitted by a quadric or freeform surface, with the type of quadric surface decided automatically from the parameters; finally the whole CAD model was created. A mouse model example was employed to confirm the effectiveness of the algorithm.

  6. Warehouse Optimization Model Based on Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Guofeng Qin

    2013-01-01

    This paper takes the Bao Steel automated warehouse logistics system as an example. The premise is to keep the centre of gravity of stored goods below half of the shelf height. As a result, the time cost of getting or putting goods on the shelf is reduced, and the distance between goods of the same kind is also reduced. A multiobjective optimization model is constructed and a genetic algorithm is used to optimize it, yielding a locally optimal solution. Before optimization, the average time to get or put goods is 4.52996 s, and the average distance between goods of the same kind is 2.35318 m. After optimization, the average time is 4.28859 s and the average distance is 1.97366 m. From this analysis we can conclude that the model can improve the efficiency of cargo storage.
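
    A compact sketch of the genetic-algorithm step for this kind of slotting problem: permutations of item-to-slot assignments are evolved (here with a mutation-only loop) to minimize a weighted height-plus-distance objective; the slot data and weights are invented:

        import random

        random.seed(0)
        n = 8                                # items == slots: a permutation problem
        slot_height = [1, 1, 2, 2, 3, 3, 4, 4]
        slot_dist = [1, 2, 3, 4, 5, 6, 7, 8]
        freq = [8, 7, 6, 5, 4, 3, 2, 1]      # access frequency per item

        def cost(perm):
            # Frequently accessed items should sit in low, nearby slots.
            return sum(freq[item] * (slot_height[s] + slot_dist[s])
                       for s, item in enumerate(perm))

        def mutate(perm):
            a, b = random.sample(range(n), 2)
            child = perm[:]
            child[a], child[b] = child[b], child[a]   # swap two assignments
            return child

        pop = [random.sample(range(n), n) for _ in range(30)]
        for _ in range(200):
            pop.sort(key=cost)                         # elitist selection
            pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(20)]
        best = min(pop, key=cost)
        print(cost(best), best)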

  7. Model-based scenarios of Mediterranean droughts

    Directory of Open Access Journals (Sweden)

    M. Weiß

    2007-11-01

    This study examines the change in current 100-year hydrological drought frequencies in the Mediterranean in comparison to the 2070s as simulated by the global model WaterGAP. The analysis considers socio-economic and climate changes as indicated by the IPCC scenarios A2 and B2 and the global general circulation model ECHAM4. Under these conditions today's 100-year drought is estimated to occur 10 times more frequently in the future over a large part of the Northern Mediterranean while in North Africa, today's 100-year drought will occur less frequently. Water abstractions are shown to play a minor role in comparison to the impact of climate change, but can intensify the situation.

  8. Comparing models for quantitative risk assessment: an application to the European Registry of foreign body injuries in children.

    Science.gov (United States)

    Berchialla, Paola; Scarinzi, Cecilia; Snidero, Silvia; Gregori, Dario

    2016-08-01

    Risk assessment is the systematic study of decisions subject to uncertain consequences. Increasing interest has focused on modeling techniques such as Bayesian networks because of their capability of (1) combining, in a probabilistic framework, different types of evidence including both expert judgments and objective data; (2) overturning previous beliefs in the light of new information being received; and (3) making predictions even with incomplete data. In this work, we propose a comparison among Bayesian networks and other classical quantitative risk assessment techniques such as neural networks, classification trees, random forests and logistic regression models. Hybrid approaches, combining both classification trees and Bayesian networks, were also considered. Among Bayesian networks, a clear distinction is made between a purely data-driven approach and the combination of expert knowledge with objective data. The aim of this paper is to evaluate which of these models can best be applied, in the framework of quantitative risk assessment, to assess the safety of children who are exposed to the risk of inhalation/insertion/aspiration of consumer products. The issue of preventing injuries in children is of paramount importance, in particular where product design is involved: quantifying the risk associated with product characteristics can be of great use in informing product safety design regulation. Data from the European Registry of Foreign Body Injuries formed the starting evidence for risk assessment. Results showed that Bayesian networks offer both ease of interpretability and accuracy in making predictions, even if simpler models like logistic regression still performed well.
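
    As a flavour of such a model comparison, the sketch below fits a logistic regression and a classification tree to the same simulated binary-risk data and compares test accuracy (scikit-learn is assumed to be available; the data are synthetic, not the registry's):

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 3))                  # e.g. object size, shape, age
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, 500) > 0).astype(int)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        for model in (LogisticRegression(), DecisionTreeClassifier(max_depth=3)):
            acc = model.fit(X_tr, y_tr).score(X_te, y_te)
            print(type(model).__name__, round(acc, 3))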

  9. A method to manage the model base in DSS

    Institute of Scientific and Technical Information of China (English)

    孙成双; 李桂君

    2004-01-01

    How to manage and use models in DSS is a most important subject. Generally, it costs a lot of money and time to develop a model base management system in the development of DSS, and most such systems are simple in function or cannot be used efficiently in practice. Making use of the interfaces of professional computer software to develop a model base management system is an effective, applicable, and economical choice. This paper presents a method using MATLAB, well-known scientific computing software, as the development platform of a model base management system. The main functional framework of a MATLAB-based model base management system is discussed. Finally, its feasible application is illustrated in the field of construction projects.

  10. Robust speech features representation based on computational auditory model

    Institute of Scientific and Technical Information of China (English)

    LU Xugang; JIA Chuan; DANG Jianwu

    2004-01-01

    A speech signal processing and feature extraction method based on a computational auditory model is proposed. The computational model is based on psychological and physiological knowledge and digital signal processing methods. For each stage of the hearing perception system, there is a corresponding computational model to simulate its function, and based on this model speech features are extracted. In each stage, features at different levels are extracted. A further processing of the primary auditory spectrum based on lateral inhibition is proposed to extract much more robust speech features. All these features can be regarded as internal representations of speech stimulation in the hearing system. Robust speech recognition experiments were conducted to test the robustness of the features. Results show that the representations based on the proposed computational auditory model are robust representations of speech signals.

  11. A general model for membrane-based separation processes

    DEFF Research Database (Denmark)

    Soni, Vipasha; Abildskov, Jens; Jonsson, Gunnar Eigil;

    2009-01-01

    A separation process could be defined as a process that transforms a given mixture of chemicals into two or more compositionally distinct end-use products. One way to design these separation processes is to employ a model-based approach, where mathematical models that reliably predict the process...... behaviour will play an important role. In this paper, modelling of membrane-based processes for separation of gas and liquid mixtures are considered. Two general models, one for membrane-based liquid separation processes (with phase change) and another for membrane-based gas separation are presented. The...... modelling assumptions. Analyses of the generated models, together with their validation and application in process design/analysis are highlighted through several case studies....

  12. Adaptive Digital Image Watermarking Based on Combination of HVS Models

    Directory of Open Access Journals (Sweden)

    P. Foris

    2009-09-01

    In this paper two new blind adaptive digital watermarking methods of color images are presented. The adaptability is based on perceptual watermarking which exploits Human Visual System (HVS) models. The first method performs watermark embedding in the transform domain of DCT and the second method is based on DWT. Watermark is embedded into the transform domain of a chosen color image component in a selected color space. Both methods use a combination of HVS models to select perceptually significant transform coefficients and at the same time to determine the bounds of modification of selected coefficients. The final HVS model consists of three parts. The first part is the HVS model in the DCT (DWT) domain. The second part is the HVS model based on Region of Interest and finally the third part is the HVS model based on Noise Visibility Function. Watermark has a form of a real number sequence with normal distribution.

  13. Verifying Service Choreography Model Based on Description Logic

    Directory of Open Access Journals (Sweden)

    Minggang Yu

    2016-01-01

    The Web Services Choreography Description Language lacks a formal system to accurately express the semantics of service behaviors and verify the correctness of a service choreography model. This paper presents a new approach to choreography model verification based on Description Logic. A metamodel of service choreography is built to provide a conceptual framework to capture the formal syntax and semantics of service choreography. Based on the framework, a set of rules and constraints are defined in Description Logic for choreography model verification. To automate model verification, the UML-based service choreography model is transformed, by the given algorithms, into a DL-based ontology, so that the model properties can be verified by reasoning over the ontology with the help of a popular DL reasoner. A case study is given to demonstrate the applicability of the method. Furthermore, the work is compared with other related research.

  14. Copula-based bivariate binary response models

    OpenAIRE

    Winkelmann, Rainer

    2009-01-01

    The bivariate probit model is frequently used for estimating the effect of an endogenous binary regressor on a binary outcome variable. This paper discusses simple modifications that maintain the probit assumption for the marginal distributions while introducing non-normal dependence among the two variables using copulas. Simulation results and evidence from two applications, one on the effect of insurance status on ambulatory expenditure and one on the effect of completing high school on sub...

  15. Quality based strategy: modelling for lean manufacturing

    OpenAIRE

    Cruz Machado, Virgilio A.

    1994-01-01

    The research develops and applies an integrated methodology for creating a Lean Manufacturing Environment in a traditional industry: the Portuguese Textile and Clothing Industry. This is achieved by developing a modelling tool using quality as a basis of performance assessment. In the context of the textile industry specific research objectives were: to evaluate current and potential application of Lean Manufacturing; to determine current business performance assessment crit...

  16. Vision-based macroscopic pedestrian models

    OpenAIRE

    Degond, Pierre; Appert-Rolland, Cécile; Pettré, Julien; Theraulaz, Guy

    2013-01-01

    We propose a hierarchy of kinetic and macroscopic models for a system consisting of a large number of interacting pedestrians. The basic interaction rules are derived from earlier work where the dangerousness level of an interaction with another pedestrian is measured in terms of the derivative of the bearing angle (angle between the walking direction and the line connecting the two subjects) and of the time-to-interaction (time before reaching the closest distance between the two subjects). ...

  17. A Family of RBAC- Based Workflow Authorization Models

    Institute of Scientific and Technical Information of China (English)

    HONG Fan; XING Guang-lin

    2005-01-01

    A family of RBAC-based workflow authorization models, called RWAM, is proposed. RWAM consists of a basic model and other models constructed from the basic model. The basic model provides the notion of temporal permission, which means that a user can perform a certain operation on a task only for a time interval; this not only ensures that only authorized users can execute a task but also ensures that the authorization flow is synchronized with the workflow. The two advanced models of RWAM deal with role hierarchy and constraints, respectively. RWAM ranges from simple to complex and provides a general reference model for other research and development in this area.

  18. Fujisaki Model Based Intonation Modeling for Korean TTS System

    Science.gov (United States)

    Kim, Byeongchang; Lee, Jinsik; Lee, Gary Geunbae

    One of the enduring problems in developing a high-quality TTS (text-to-speech) system is pitch contour generation. Considering language-specific knowledge, an adjusted Fujisaki model for a Korean TTS system is introduced along with refined machine learning features. The results of quantitative and qualitative evaluations show the validity of our system: the accuracy of the phrase command prediction is 0.8928; the correlations of the predicted amplitudes of a phrase command and an accent command are 0.6644 and 0.6002, respectively; and our method achieved the level of "fair" naturalness (3.6) on a MOS scale for generated F0 curves.
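
    The Fujisaki model itself is compact enough to sketch: log F0 is a base value plus phrase components (impulse responses) and accent components (smoothed step responses). The command timings and amplitudes below are illustrative, not the paper's:

        import numpy as np

        def phrase(t, alpha=3.0):
            """Phrase control Gp(t) = alpha^2 t e^(-alpha t) for t >= 0, else 0."""
            tc = np.clip(t, 0.0, None)
            return alpha ** 2 * tc * np.exp(-alpha * tc)

        def accent(t, beta=20.0, gamma=0.9):
            """Accent control Ga(t) = min[1 - (1 + beta t) e^(-beta t), gamma]."""
            tc = np.clip(t, 0.0, None)
            return np.minimum(1 - (1 + beta * tc) * np.exp(-beta * tc), gamma)

        t = np.linspace(0, 2, 400)
        ln_f0 = (np.log(90.0)                                  # base frequency Fb
                 + 0.5 * phrase(t - 0.0)                       # one phrase command
                 + 0.4 * (accent(t - 0.3) - accent(t - 0.8)))  # one accent command
        print(round(np.exp(ln_f0).max(), 1), "Hz peak F0")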

  19. Characteristics of a Logistics-Based Business Model

    OpenAIRE

    Sandberg, Erik; Kihlén, Tobias; Abrahamsson, Mats

    2011-01-01

    In companies where excellence in logistics is decisive for the outperformance of competitors and logistics has an outspoken role for the strategy of the firm, there is present what we refer to here as a “logistics-based business model.” Based on a multiple case study of three Nordic retail companies, the purpose of this article is to explore the characteristics of such a logistics-based business model. As such, this research helps to provide structure to logistics-based business models and id...

  20. Solitonic Models Based on Quantum Groups and the Standard Model

    CERN Document Server

    Finkelstein, Robert J

    2010-01-01

    The idea that the elementary particles might have the symmetry of knots has had a long history. In any current formulation of this idea, however, the knot must be quantized. The present review is a summary of a small set of papers that began as an attempt to correlate the properties of quantized knots with the empirical properties of the elementary particles. As the ideas behind these papers have developed over a number of years the model has evolved, and this review is intended to present the model in its current form. The original picture of an elementary fermion as a solitonic knot of field, described by the trefoil representation of SUq(2), has expanded into its current form in which a knotted field is complementary to a composite structure composed of three or more preons that in turn are described by the fundamental representation of SLq(2). These complementary descriptions may be interpreted as describing single composite particles composed of three or more preons bound by a knotted field.

  1. Verification and Validation of Model-Based Autonomous Systems

    Science.gov (United States)

    Pecheur, Charles; Koga, Dennis (Technical Monitor)

    2001-01-01

    This paper presents a three year project (FY99 to FY01) on the verification and validation of model based autonomous systems. The topics include: 1) Project Profile; 2) Model-Based Autonomy; 3) The Livingstone MIR; 4) MPL2SMV; 5) Livingstone to SMV Translation; 6) Symbolic Model Checking; 7) From Livingstone Models to SMV Models; 8) Application In-Situ Propellant Production; 9) Closed-Loop Verification Principle; 10) Livingstone PathFinder (LPF); 11) Publications and Presentations; and 12) Future Directions. This paper is presented in viewgraph form.

  2. An XML-based information model for archaeological pottery

    Institute of Scientific and Technical Information of China (English)

    LIU De-zhi; RAZDAN Anshuman; SIMON Arleyn; BAE Myungsoo

    2005-01-01

    An information model is defined to support sharing scientific information on the Web for archaeological pottery. Apart from non-shape information, such as age, material, etc., the model also consists of shape information and shape feature information. Shape information is collected by laser scanner and geometric modelling techniques. Feature information is generated from shape information via feature extraction techniques. The model is used in an integrated storage, archival, and sketch-based query and retrieval system for 3D objects, Native American ceramic vessels. A novel aspect of the information model is that it is implemented entirely in XML and is designed for Web-based visual query and storage applications.

  3. Extending EMMS-based models to CFB boiler applications

    Institute of Scientific and Technical Information of China (English)

    Bona Lu; Nan Zhang; Wei Wang; Jinghai Li

    2012-01-01

    Recently, EMMS-based models are being widely applied in simulations of high-throughput circulating fluidized beds (CFBs) with fine particles. Their use for low-flux systems, such as the CFB boiler (CFBB), still remains unexplored. In this work, it has been found that the original definition of cluster diameter in the EMMS model is unsuitable for simulations of the CFB boiler with low solids flux. To remedy this, we propose a new model of cluster diameter. The EMMS-based drag model (EMMS/matrix model) with this revised cluster definition is validated through computational fluid dynamics (CFD) simulation of a CFB boiler.

  4. An Algebraic Dexter-Based Hypertext Reference Model

    CERN Document Server

    Mattick, Volker

    2009-01-01

    We present the first formal algebraic specification of a hypertext reference model. It is based on the well-known Dexter Hypertext Reference Model and includes modifications with respect to the development of hypertext since the WWW came up. Our hypertext model was developed as a product model with the aim to automatically support the design process and is extended to a model of hypertext-systems in order to be able to describe the state transitions in this process. While the specification should be easy to read for non-experts in algebraic specification, it guarantees a unique understanding and enables a close connection to logic-based development and verification.

  5. A VENSIM BASED ANALYSIS FOR SUPPLY CHAIN MODEL

    Directory of Open Access Journals (Sweden)

    Mohammad SHAMSUDDOHA

    2014-01-01

    The emphasis on the supply chain has increased in recent years in academic and industry circles. In this paper, a supply chain model is developed based on a case study of the poultry industry in the Vensim environment. System dynamics, supply chain, design science and case methods under a positivist and quantitative paradigm are studied to develop a simulation model. The objectives of this paper are to review the literature, develop a Vensim-based simulation supply chain model, and examine the model qualitatively and quantitatively. The model is also briefly discussed in relation to the forward, reverse and mainstream supply chains of the case.

  6. Attenuating wind turbine loads through model based individual pitch control

    DEFF Research Database (Denmark)

    Thomsen, Sven Creutz; Niemann, Hans Henrik; Poulsen, Niels Kjølstad

    2009-01-01

    In this paper we consider wind turbine load attenuation through model based control. Asymmetric loads caused by the wind field can be reduced by pitching the blades individually. To this end we investigate the use of stochastic models of the wind which can be included in a model based individual...... pitch controller design. In this way the variability of the wind can be estimated and compensated for by the controller. The wind turbine model is in general time-variant due to its rotational nature. For this reason the modeling and control is carried out in so-called multiblade coordinates...

  7. Online Knowledge-Based Model for Big Data Topic Extraction

    Science.gov (United States)

    Khan, Muhammad Taimoor; Durrani, Mehr; Khalid, Shehzad; Aziz, Furqan

    2016-01-01

    Lifelong machine learning (LML) models learn with experience, maintaining a knowledge base without user intervention. Unlike traditional single-domain models, they can easily scale up to explore big data. The existing LML models have high data dependency, consume more resources, and do not support streaming data. This paper proposes an online LML model (OAMC) to support streaming data with reduced data dependency. By engineering the knowledge base and introducing new knowledge features, the learning pattern of the model is improved for data arriving in pieces. OAMC improves accuracy, measured as topic coherence, by 7% for streaming data while reducing the processing cost by half. PMID:27195004

  8. On grey relation projection model based on projection pursuit

    Institute of Scientific and Technical Information of China (English)

    Wang Shuo; Yang Shanlin; Ma Xijun

    2008-01-01

    Multidimensional grey relation projection values can be synthesized into a one-dimensional projection value using a projection pursuit model. The larger the projection value is, the better the model. Thus, according to the projection value, the best model can be chosen from the model aggregation. Because projection pursuit modeling based on an accelerating genetic algorithm can simplify the implementation procedure of the projection pursuit technique and overcome its complex calculation as well as the difficulty in implementing its program, a new method can be obtained for choosing the best grey relation projection model based on the projection pursuit technique.
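
    The projection step is easy to make concrete: multidimensional grey relation values are collapsed onto a unit direction chosen to maximize a projection index. In the sketch below, a crude random search stands in for the accelerating genetic algorithm, and the data are invented:

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.random((20, 4))                 # 20 alternatives, 4 grey indexes

        def projection_index(a):
            return (X @ a).std()                # spread of the 1-D projection values

        best_a, best_q = None, -np.inf
        for _ in range(5000):                   # random search over unit directions
            a = rng.normal(size=4)
            a /= np.linalg.norm(a)
            q = projection_index(a)
            if q > best_q:
                best_a, best_q = a, q

        ranking = np.argsort(X @ best_a)[::-1]  # larger projection value = better
        print(round(float(best_q), 3), ranking[:3])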

  9. Understanding Elementary Astronomy by Making Drawing-Based Models

    Science.gov (United States)

    van Joolingen, W. R.; Aukes, Annika V.; Gijlers, H.; Bollen, L.

    2015-01-01

    Modeling is an important approach in the teaching and learning of science. In this study, we attempt to bring modeling within the reach of young children by creating the SimSketch modeling system, which is based on freehand drawings that can be turned into simulations. This system was used by 247 children (ages ranging from 7 to 15) to create a…

  10. Physiologically based kinetic modeling of the bioactivation of myristicin

    NARCIS (Netherlands)

    Al-Malahmeh, Amer J.; Al-Ajlouni, Abdelmajeed; Wesseling, Sebastiaan; Soffers, Ans E.M.F.; Al-Subeihi, A.; Kiwamoto, Reiko; Vervoort, Jacques; Rietjens, Ivonne M.C.M.

    2016-01-01

    The present study describes physiologically based kinetic (PBK) models for the alkenylbenzene myristicin that were developed by extension of the PBK models for the structurally related alkenylbenzene safrole in rat and human. The newly developed myristicin models revealed that the formation of the...

  11. IDEF method-based simulation model design and development framework

    Directory of Open Access Journals (Sweden)

    Ki-Young Jeong

    2009-09-01

    The purpose of this study is to provide an IDEF-method-based integrated framework for business process simulation models to reduce model development time by increasing communication and knowledge reusability during a simulation project. In this framework, simulation requirements are collected by a function modeling method (IDEF0) and a process modeling method (IDEF3). Based on these requirements, a common data model is constructed using the IDEF1X method. From this reusable data model, multiple simulation models are automatically generated using a database-driven simulation model development approach. The framework is claimed to help both the requirement collection and experimentation phases of a simulation project by improving system knowledge, model reusability, and maintainability through the systematic use of three descriptive IDEF methods and the features of relational database technologies. A complex semiconductor fabrication case study was used as a testbed to evaluate and illustrate the concepts and the framework. Two different simulation software products were used to develop and control the semiconductor model from the same knowledge base. The case study empirically showed that this framework can help improve simulation project processes by using IDEF-based descriptive models and relational database technology. The authors also concluded that this framework can easily be applied to other analytical model generation by separating the logic from the data.

  12. Case-Based Modeling for Learning Management and Interpersonal Skills

    Science.gov (United States)

    Lyons, Paul

    2008-01-01

    This article offers an introduction to case-based modeling (CBM) and a demonstration of the efficacy of this instructional model. CBM is grounded primarily in the concepts and theory of experiential learning, augmented by concepts of script creation. Although it is labor intensive, the model is one that has value for instruction in various…

  13. Integration Issues of an Ontology based Context Modelling Approach

    OpenAIRE

    Strang, Thomas; Linnhoff-Popien, Claudia; Frank, Korbinian

    2003-01-01

    In this paper we analyse the applicability of our ontology based context modelling approach, considering a range of use cases. After wrapping up the model and the Context Ontology Language (CoOL) derived from it, we introduce some interesting applications of the language, based on a scenario showing the challenges in context aware service interactions. We focus on two submodels of our model for context aware service interactions, namely Context Bindings and Context Obligations, and demonstrat...

  14. Model Based Bootstrap Methods for Interval Censored Data

    OpenAIRE

    Sen, Bodhisattva; Xu, Gongjun

    2013-01-01

    We investigate the performance of model based bootstrap methods for constructing point-wise confidence intervals around the survival function with interval censored data. We show that bootstrapping from the nonparametric maximum likelihood estimator of the survival function is inconsistent for both the current status and case 2 interval censoring models. A model based smoothed bootstrap procedure is proposed and shown to be consistent. In addition, simulation studies are conducted to illustra...

  15. Physics-Based Learning Models for Ship Hydrodynamics

    OpenAIRE

    Weymouth, Gabriel D.; Yue, Dick K.P.

    2014-01-01

    We present the concepts of physics-based learning models (PBLM) and their relevance and application to the field of ship hydrodynamics. The utility of physics-based learning is motivated by contrasting generic learning models for regression predictions, which do not presume any knowledge of the system other than the training data provided, with methods such as semi-empirical models, which incorporate physical insights along with data-fitting. PBLM provides a framework wherein intermediate mode...

  16. Container Terminal Operations Modeling through Multi agent based Simulation

    OpenAIRE

    Ayub, Yasir; Faruki, Usman

    2009-01-01

    This thesis aims to propose a multi-agent-based hierarchical model for the operations of container terminals. We have divided our model into four key agents that are involved in each sub-process. The proposed agent allocation policies are recommended for different situations that may occur at a container terminal. A software prototype is developed which implements the hierarchical model. This web-based application is used to simulate the various processes involved in the following ...

  17. Semiparametric theory based MIMO model and performance analysis

    Institute of Scientific and Technical Information of China (English)

    XU Fang-min; XU Xiao-dong; ZHANG Ping

    2007-01-01

    In this article, a new approach for modeling multi-input multi-output (MIMO) systems with unknown nonlinear interference is introduced. The semiparametric-theory-based MIMO model is established, and kernel estimation is applied to combat the nonlinear interference. Furthermore, we derive the MIMO capacity for these systems and explore the asymptotic properties of the new channel matrix via theoretical analysis. The simulation results show that the semiparametric-theory-based modeling and kernel estimation are valid for combating this kind of interference.

  18. Towards an Intelligent Project Based Organization Business Model

    OpenAIRE

    Alami Marrouni Oussama; Beidouri Zitouni; Bouksour Othmane

    2013-01-01

    The global economy is undergoing a recession that has made competition tougher and imposed a new business framework. Businesses have to shift from classical management approaches to an Intelligent Project Based Organization (IPBO) model that provides flexibility and agility. The IPBO model is intended to reinforce the proven advantages of the Project Based Organization (PBO) through the use of suitable Enterprise Intelligence (EI) systems. The goal of this paper is to propose an IPBO model that combin...

  19. Accurate Load Modeling Based on Analytic Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Zhenshu Wang

    2016-01-01

    Full Text Available Establishing an accurate load model is a critical problem in power system modeling, with significant implications for power system digital simulation and dynamic security analysis. The synthesis load model (SLM) considers the impact of the power distribution network and compensation capacitors, while the randomness of power load is more precisely described by the traction power system load model (TPSLM). On the basis of these two load models, a load modeling method that combines synthesis load with traction power load is proposed in this paper. This method uses the analytic hierarchy process (AHP) to connect the two load models: weight coefficients for the two models are calculated after formulating criteria and judgment matrices, and a synthesized model is then established from these weight coefficients. The effectiveness of the proposed method was examined through simulation. The results show that accurate load modeling based on AHP can effectively improve the accuracy of the load model and prove the validity of the method.
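
    To make the weighting step concrete, here is a minimal AHP sketch: weight coefficients are derived as the principal eigenvector of a pairwise judgment matrix, with a consistency-ratio check. The 3x3 judgment values are hypothetical, not the paper's criteria.

      import numpy as np

      RI = {3: 0.58, 4: 0.90, 5: 1.12}        # Saaty's random consistency index

      def ahp_weights(J):
          """Priority vector and consistency ratio of a reciprocal judgment matrix."""
          vals, vecs = np.linalg.eig(J)
          k = np.argmax(vals.real)            # principal eigenvalue lambda_max
          w = np.abs(vecs[:, k].real)
          w /= w.sum()                        # normalized weight coefficients
          n = J.shape[0]
          ci = (vals.real[k] - n) / (n - 1)   # consistency index
          cr = ci / RI[n] if n in RI else 0.0
          return w, cr

      # Hypothetical 3-criterion judgment matrix on Saaty's 1-9 scale
      J = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])
      w, cr = ahp_weights(J)
      print(w, cr)    # CR < 0.1 indicates acceptable consistency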

  20. A Technology-based Model for Learning

    Directory of Open Access Journals (Sweden)

    Michael Williams

    2004-12-01

    Full Text Available The Math Emporium, opened in 1997, is an open 7000-square-meter facility with 550+ workstations arranged in an array of widely spaced hexagonal "pods", designed to support group work while maintaining an academic air. We operate it 24/7, with math support personnel in attendance 12 hours per day. Students have access to online course resources at all times, from anywhere. We have used this unique asset to transform traditional classroom-based courses into technology-based learning programs that have no class meetings at all. The structure of the program is very different from the conventional one, with a new set of expectations and motivations. The results include more effective students, substantial cost savings, economies of scale and scope, and a streamlined process for creating new online courses.

  1. Towards a contract-based interoperation model

    OpenAIRE

    Fernández Peña, Félix Oscar; Willmott, Steven Nicolás

    2007-01-01

    Web Services-based solutions for interoperating processes are considered to be one of the most promising technologies for achieving truly interoperable functioning in open environments. In the last three years, specifications of agreements between resource/service providers and consumers in particular, as well as protocols for their negotiation, have been proposed as a possible solution for managing the resulting computing systems. In this report, the state of the art in the area of contr...

  2. Adopsi Model Competency Based Training dalam Kewirausahaan

    OpenAIRE

    I Ketut Santra

    2009-01-01

    The aim of the research is to improve the teaching method in the entrepreneurship subject. This research adopted competency based training (CBT) into entrepreneurship teaching. The major task in this research was to formulate and design the entrepreneurship competencies. Entrepreneurship competency is indicated by personal, strategic and situational, and business competence. Each entrepreneurship competence is described as subtopics of competence. After designing and formulating the game and simulat...

  3. A Time Based Blended Learning Model

    OpenAIRE

    Norberg, Anders; Dziuban, Charles D; Moskal, Patsy M

    2011-01-01

    Purpose – This paper seeks to outline a time-based strategy for blended learning that illustrates course design and delivery by framing students' learning opportunities in synchronous and asynchronous modalities. Design/methodology/approach – This paper deconstructs the evolving components of blended learning in order to identify changes induced by digital technologies for enhancing teaching and learning environments. Findings – This paper hypothesizes that blended learning may be traced back...

  4. Models of Quantitative Estimations: Rule-Based and Exemplar-Based Processes Compared

    Science.gov (United States)

    von Helversen, Bettina; Rieskamp, Jorg

    2009-01-01

    The cognitive processes underlying quantitative estimations vary. Past research has identified task-contingent changes between rule-based and exemplar-based processes (P. Juslin, L. Karlsson, & H. Olsson, 2008). B. von Helversen and J. Rieskamp (2008), however, proposed a simple rule-based model--the mapping model--that outperformed the exemplar…

  5. A Knowledge Representation Model for Video—Based Animation

    Institute of Scientific and Technical Information of China (English)

    劳志强; 潘云鹤

    1998-01-01

    In this paper, a brief survey of knowledge-based animation techniques is given. Then a VideoStream-based Knowledge Representation Model (VSKRM) for Joint Objects is presented, which includes the knowledge representation of Graphic Objects, Actions and VideoStreams. Next, a general description of the UI framework of a system is given based on the VSKRM model. Finally, a conclusion is reached.

  6. Agent-Based Modeling and Mapping of Manufacturing System

    Institute of Scientific and Technical Information of China (English)

    Z. Zhang

    2002-01-01

    Considering agent-based modeling and mapping in manufacturing systems, some system models are described in this paper, including: Domain Based Hierarchical Structure (DBHS), Cascading Agent Structure (CAS), Proximity Relation Structure (PRS), and Bus-based Network Structure (BNS). In DBHS, one sort of agents, called static agents, individually acts as Domain Agents, Resources Agents, UserInterface Agents and Gateway Agents, and the others, named mobile agents, are the brokers of task and ...

  7. Identity-based encryption with wildcards in the standard model

    Institute of Scientific and Technical Information of China (English)

    MING Yang; SHEN Xiao-qin; WANG Yu-min

    2009-01-01

    In this article, based on Chatterjee-Sarkar's hierarchical identity-based encryption (HIBE), a novel identity-based encryption with wildcards (WIBE) scheme is proposed and proven secure in the standard model (without random oracles). The proposed scheme is proven secure assuming that the decisional Bilinear Diffie-Hellman (DBDH) problem is hard. Compared with the Wa-WIBE scheme, which is secure in the standard model, our scheme has shorter common parameters and ciphertext length.

  9. Model-based design of integrated production systems: a review

    OpenAIRE

    Ould Sidi, Mohamed Mahmoud; Lescourret, Francoise

    2011-01-01

    Pest resistance and water pollution are major issues caused by the excessive use of pesticides in intensive agriculture. The concept of the integrated production system (IPS) has thus been designed to solve those issues and also to meet the need for better food quality and production. Methodologies such as agronomic diagnosis-based design, prototyping, and model-based design have been developed. Here we review the model-based design of IPS. We identify tools for the development of comprehensive m...

  10. Reduced model-based decision-making in schizophrenia.

    Science.gov (United States)

    Culbreth, Adam J; Westbrook, Andrew; Daw, Nathaniel D; Botvinick, Matthew; Barch, Deanna M

    2016-08-01

    Individuals with schizophrenia have a diminished ability to use reward history to adaptively guide behavior. However, tasks traditionally used to assess such deficits often rely on multiple cognitive and neural processes, leaving etiology unresolved. In the current study, we adopted recent computational formalisms of reinforcement learning to distinguish between model-based and model-free decision-making in hopes of specifying mechanisms associated with reinforcement-learning dysfunction in schizophrenia. Under this framework, decision-making is model-free to the extent that it relies solely on prior reward history, and model-based if it relies on prospective information such as motivational state, future consequences, and the likelihood of obtaining various outcomes. Model-based and model-free decision-making was assessed in 33 schizophrenia patients and 30 controls using a 2-stage 2-alternative forced choice task previously demonstrated to discern individual differences in reliance on the 2 forms of reinforcement-learning. We show that, compared with controls, schizophrenia patients demonstrate decreased reliance on model-based decision-making. Further, parameter estimates of model-based behavior correlate positively with IQ and working memory measures, suggesting that model-based deficits seen in schizophrenia may be partially explained by higher-order cognitive deficits. These findings demonstrate specific reinforcement-learning and decision-making deficits and thereby provide valuable insights for understanding disordered behavior in schizophrenia. (PsycINFO Database Record) PMID: 27175984
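
    For orientation, the sketch below contrasts the two value computations on a stylised 2-stage task: a model-free learner updates first-stage values from reward history alone, while a model-based learner evaluates them prospectively through a transition model. The transition probabilities, learning rate and reward structure are illustrative assumptions, not the study's task parameters.

      import numpy as np
      rng = np.random.default_rng(1)

      # Stylised two-stage task: action 0/1 leads to state A/B with p=0.7 ("common")
      T = np.array([[0.7, 0.3],        # P(second-stage state | action 0)
                    [0.3, 0.7]])       # P(second-stage state | action 1)
      q_mf = np.zeros(2)               # model-free values of first-stage actions
      v_stage2 = np.zeros(2)           # learned values of second-stage states
      alpha = 0.1
      p_reward = np.array([0.8, 0.2])  # latent reward probability of each state

      for trial in range(5000):
          a = rng.integers(2)                       # random exploration policy
          s = rng.choice(2, p=T[a])                 # second-stage state
          r = float(rng.random() < p_reward[s])     # probabilistic reward
          v_stage2[s] += alpha * (r - v_stage2[s])  # stage-2 TD update
          q_mf[a] += alpha * (r - q_mf[a])          # model-free: reward history only

      q_mb = T @ v_stage2   # model-based: prospective evaluation via the transition model
      print("model-free :", q_mf)
      print("model-based:", q_mb)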

  12. A Requirements Analysis Model Based on QFD

    Institute of Scientific and Technical Information of China (English)

    TANG Zhi-wei; Nelson K.H.Tang

    2004-01-01

    The enterprise resource planning (ERP) system has emerged to offer an integrated IT solution, and more and more enterprises are adopting it and regarding it as an important innovation. However, there is already evidence of high failure risk in ERP project implementation; one major reason is poor analysis of the requirements for system implementation. In this paper, the importance of requirements analysis for ERP project implementation is highlighted, and a requirements analysis model applying quality function deployment (QFD) is presented, which supports conducting requirements analysis for ERP projects.
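
    A toy illustration of the core QFD computation: weighted customer requirements are propagated through a relationship matrix to rank technical characteristics. All weights and ratings below are hypothetical.

      import numpy as np

      customer_weights = np.array([5, 3, 4])    # importance of each requirement
      # Relationship matrix: rows = requirements, cols = technical characteristics,
      # using the conventional 9/3/1 strength scale (0 = no relation)
      R = np.array([[9, 3, 0],
                    [1, 9, 3],
                    [0, 3, 9]])
      importance = customer_weights @ R          # technical importance scores
      ranking = importance.argsort()[::-1]
      print(importance, "-> priority order:", ranking)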

  13. Piecewise Linear Model-Based Image Enhancement

    Directory of Open Access Journals (Sweden)

    Fabrizio Russo

    2004-09-01

    Full Text Available A novel technique for the sharpening of noisy images is presented. The proposed enhancement system adopts a simple piecewise linear (PWL function in order to sharpen the image edges and to reduce the noise. Such effects can easily be controlled by varying two parameters only. The noise sensitivity of the operator is further decreased by means of an additional filtering step, which resorts to a nonlinear model too. Results of computer simulations show that the proposed sharpening system is simple and effective. The application of the method to contrast enhancement of color images is also discussed.
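
    A minimal sketch of PWL-controlled sharpening in the spirit of the abstract (not the authors' exact operator): a Laplacian edge signal passes through a piecewise linear function whose two parameters set a noise dead zone and a saturation level.

      import numpy as np

      def pwl(v, t1, t2):
          """Piecewise linear: zero below t1 (noise), linear ramp, clipped at t2."""
          out = np.clip((np.abs(v) - t1) / (t2 - t1), 0.0, 1.0) * t2
          return np.sign(v) * out

      def sharpen(img, t1=5.0, t2=40.0):
          # 4-neighbour Laplacian as the edge-sensitive correction signal
          lap = (4 * img
                 - np.roll(img, 1, 0) - np.roll(img, -1, 0)
                 - np.roll(img, 1, 1) - np.roll(img, -1, 1))
          return np.clip(img + pwl(lap, t1, t2), 0, 255)

      img = np.clip(np.add.outer(np.arange(64), np.arange(64)) * 2.0, 0, 255)
      print(sharpen(img).shape)   # (64, 64)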

  14. Artificial Neuron Modelling Based on Wave Shape

    Directory of Open Access Journals (Sweden)

    Kieran Greer

    2013-10-01

    Full Text Available This paper describes a new model for an artificial neural network processing unit, or neuron. It differs slightly from a traditional feedforward network in that it favours a mechanism of trying to match the wave-like 'shape' of the input with the shape of the output, rather than specific value error corrections. The expectation is that a best-fit shape can then be transposed into the desired output values more easily. This allows for notions of reinforcement through resonance and also the construction of synapses.

  15. An Efficient Semantic Model For Concept Based Clustering And Classification

    Directory of Open Access Journals (Sweden)

    SaiSindhu Bandaru

    2012-03-01

    Full Text Available Usually in text mining techniques, basic measures like the term frequency of a term (word or phrase) are computed to determine the importance of the term in the document. But with statistical analysis alone, the original semantics of the term may not carry its exact meaning. To overcome this problem, a new framework has been introduced which relies on a concept-based model and a synonym-based approach. The proposed model can efficiently find significant matching and related concepts between documents according to the concept-based and synonym-based approaches. Large sets of experiments using the proposed model on different datasets for clustering and classification were conducted. Experimental results demonstrate the substantial enhancement of clustering quality using sentence-based, document-based, corpus-based and combined-approach concept analysis. A new similarity measure has been proposed to find the similarity between a document and the existing clusters, which can be used in classifying the document against existing clusters.

  16. Lattice-based flow field modeling.

    Science.gov (United States)

    Wei, Xiaoming; Zhao, Ye; Fan, Zhe; Li, Wei; Qiu, Feng; Yoakum-Stover, Suzanne; Kaufman, Arie E

    2004-01-01

    We present an approach for simulating the natural dynamics that emerge from the interaction between a flow field and immersed objects. We model the flow field using the Lattice Boltzmann Model (LBM) with boundary conditions appropriate for moving objects and accelerate the computation on commodity graphics hardware (GPU) to achieve real-time performance. The boundary conditions mediate the exchange of momentum between the flow field and the moving objects resulting in forces exerted by the flow on the objects as well as the back-coupling on the flow. We demonstrate our approach using soap bubbles and a feather. The soap bubbles illustrate Fresnel reflection, reveal the dynamics of the unseen flow field in which they travel, and display spherical harmonics in their undulations. Our simulation allows the user to directly interact with the flow field to influence the dynamics in real time. The free feather flutters and gyrates in response to lift and drag forces created by its motion relative to the flow. Vortices are created as the free feather falls in an otherwise quiescent flow. PMID:15527053
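
    A bare-bones D2Q9 lattice Boltzmann collision-and-streaming step in NumPy, without the paper's GPU acceleration or moving-object boundary conditions; the relaxation time is an assumed value.

      import numpy as np

      # D2Q9 lattice: discrete velocities and weights
      c = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],[1,1],[-1,1],[-1,-1],[1,-1]])
      w = np.array([4/9] + [1/9]*4 + [1/36]*4)
      tau = 0.6                                   # BGK relaxation time (assumed)

      def equilibrium(rho, u):
          cu = np.einsum('qd,xyd->qxy', c, u)     # c_i . u per lattice direction
          usq = np.einsum('xyd,xyd->xy', u, u)
          return rho * w[:, None, None] * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

      def lbm_step(f):
          rho = f.sum(axis=0)                     # macroscopic density
          u = np.einsum('qd,qxy->xyd', c, f) / rho[..., None]   # velocity
          f += -(f - equilibrium(rho, u)) / tau   # BGK collision
          for q in range(9):                      # streaming along each velocity
              f[q] = np.roll(f[q], shift=c[q], axis=(0, 1))
          return f

      f = equilibrium(np.ones((32, 32)), np.zeros((32, 32, 2)))
      f = lbm_step(f)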

  17. Phase Correlation Based Iris Image Registration Model

    Institute of Scientific and Technical Information of China (English)

    Jun-Zhou Huang; Tie-Niu Tan; Li Ma; Yun-Hong Wang

    2005-01-01

    Iris recognition is one of the most reliable personal identification methods. In iris recognition systems, image registration is an important component. Accurately registering iris images leads to a higher recognition rate for an iris recognition system. This paper proposes a phase correlation based method for iris image registration with sub-pixel accuracy. Compared with existing methods, it is insensitive to image intensity and can compensate, to a certain extent, for the non-linear iris deformation caused by pupil movement. Experimental results show that the proposed algorithm has an encouraging performance.
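
    The core of phase correlation, shown here at integer-pixel resolution only (the paper adds sub-pixel accuracy and iris-specific handling): the normalised cross-power spectrum of two images has an inverse FFT that peaks at their relative translation.

      import numpy as np

      def phase_correlate(a, b):
          """Estimate the integer translation that maps image b onto image a."""
          F = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
          F /= np.abs(F) + 1e-12                  # keep phase, discard magnitude
          corr = np.fft.ifft2(F).real
          dy, dx = np.unravel_index(corr.argmax(), corr.shape)
          h, w = a.shape                          # map peak indices to signed shifts
          return (dy - h if dy > h // 2 else dy), (dx - w if dx > w // 2 else dx)

      rng = np.random.default_rng(0)
      img = rng.random((64, 64))
      shifted = np.roll(img, (5, -3), axis=(0, 1))
      print(phase_correlate(shifted, img))        # ~ (5, -3)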

  18. Model and Behavior-Based Robotic Goalkeeper

    DEFF Research Database (Denmark)

    Lausen, H.; Nielsen, J.; Nielsen, M.;

    2003-01-01

    This paper describes the design, implementation and test of a goalkeeper robot for the Middle-Size League of RoboCup. The goalkeeper task is implemented by a set of primitive tasks and behaviours coordinated by a 2-level hierarchical state machine. The primitive tasks concerning complex motion control are implemented by a non-linear control algorithm, adapted to the different task goals (e.g., following the ball or the robot posture) from local features extracted from images acquired by a catadioptric omni-directional vision system. Most robot parameters were designed based on simulations carried...

  19. Model-Driven Architecture for Agent-Based Systems

    Science.gov (United States)

    Gradanin, Denis; Singh, H. Lally; Bohner, Shawn A.; Hinchey, Michael G.

    2004-01-01

    The Model Driven Architecture (MDA) approach uses a platform-independent model to define system functionality, or requirements, using some specification language. The requirements are then translated to a platform-specific model for implementation. An agent architecture based on the human cognitive model of planning, the Cognitive Agent Architecture (Cougaar) is selected for the implementation platform. The resulting Cougaar MDA prescribes certain kinds of models to be used, how those models may be prepared and the relationships of the different kinds of models. Using the existing Cougaar architecture, the level of application composition is elevated from individual components to domain level model specifications in order to generate software artifacts. The software artifacts generation is based on a metamodel. Each component maps to a UML structured component which is then converted into multiple artifacts: Cougaar/Java code, documentation, and test cases.

  20. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    International Nuclear Information System (INIS)

    simulations are challenged by the distributed sources in each of the Corrective Action Units, by complex mass transfer processes, and by the size and complexity of the field-scale flow models. An efficient methodology utilizing particle tracking results and convolution integrals provides in situ concentrations appropriate for Monte Carlo analysis. Uncertainty in source releases and transport parameters including effective porosity, fracture apertures and spacing, matrix diffusion coefficients, sorption coefficients, and colloid load and mobility are considered. With the distributions of input uncertainties and output plume volumes, global analysis methods including stepwise regression, contingency table analysis, and classification tree analysis are used to develop sensitivity rankings of parameter uncertainties for each model considered, thus assisting a variety of decisions

  1. Neural-Based Models of Semiconductor Devices for SPICE Simulator

    OpenAIRE

    Hanene B. Hammouda; Mongia Mhiri; Zièd Gafsi; Kamel Besbes

    2008-01-01

    The paper addresses a simple and fast new approach to implementing Artificial Neural Network (ANN) models for the MOS transistor in SPICE. The proposed approach involves two steps: the modeling phase of the device by a NN, providing its input/output patterns, and the SPICE implementation process for the resulting model. Using the Taylor series expansion, a neural based small-signal model is derived. The reliability of our approach is validated through simulations of some circuits in DC and small-...

  2. Precise numerical modeling of next generation multimode fiber based links

    Science.gov (United States)

    Maksymiuk, L.; Stepniak, G.

    2015-12-01

    In order to numerically model modern multimode fiber based links, we are required to take into account modal and chromatic dispersion, profile dispersion and spectrally dependent coupling. In this paper we propose a complete numerical model which is not only precise but also versatile. In addition to the detailed mathematical description of the model, we also provide a set of numerical calculations performed with the model.

  3. Pervasive Computing Location-aware Model Based on Ontology

    Institute of Scientific and Technical Information of China (English)

    PU Fang; CAI Hai-bin; CAO Qi-ying; SUN Dao-qing; LI Tong

    2008-01-01

    In order to integrate heterogeneous location-aware systems into a pervasive computing environment, a novel pervasive computing location-aware model based on ontology is presented. A location-aware model ontology (LMO) is constructed. The location-aware model has the capabilities of sharing knowledge, reasoning and adjusting the usage policies of services dynamically, through a unified semantic location manner. Finally, the working process of the proposed location-aware model is explained by an application scenario.

  4. Variable selection in model-based discriminant analysis

    OpenAIRE

    Maugis, Cathy; Celeux, Gilles; Martin-Magniette, Marie-Laure

    2010-01-01

    A general methodology for selecting predictors for Gaussian generative classification models is presented. The problem is regarded as a model selection problem. Three different roles for each possible predictor are considered: a variable can be a relevant classification predictor or not, and the irrelevant classification variables can be linearly dependent on a part of the relevant predictors or independent variables. This variable selection model was inspired by the model-based clustering mo...

  5. Dynamic control of modern, network-based epidemic models

    OpenAIRE

    Sélley, Fanni; Besenyei, Ádám; Kiss, Istvan; Simon, Péter L

    2014-01-01

    In this paper we make the first steps to bridge the gap between classic control theory and modern, network-based epidemic models. In particular, we apply nonlinear model predictive control (NMPC) to a pairwise ODE model which we use to model a susceptible-infectious-susceptible (SIS) epidemic on non-trivial contact structures. While classic control of epidemics concentrates on aspects such as vaccination, quarantine and fast diagnosis, our novel setup allows us to deliver control by altering ...
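
    To make the controlled dynamics concrete, here is a forward-Euler simulation of the classical mean-field SIS model with a crude feedback control scaling the infection rate; this is a drastic simplification of the paper's pairwise ODE model and NMPC scheme, and all rates are assumed.

      beta, gamma = 0.5, 0.2            # infection / recovery rates (assumed)
      dt, T = 0.1, 100.0
      i = 0.05                          # initial infected fraction

      for step in range(int(T / dt)):
          u = 0.4 if i > 0.2 else 1.0   # crude feedback control: reduce contacts
          di = u * beta * i * (1 - i) - gamma * i   # SIS dynamics: dI/dt
          i += dt * di

      print(f"infected fraction at t={T}: {i:.3f}")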

  6. Deep Structured Energy Based Models for Anomaly Detection

    OpenAIRE

    Zhai, Shuangfei; Cheng, Yu; Lu, Weining; Zhang, Zhongfei

    2016-01-01

    In this paper, we attack the anomaly detection problem by directly modeling the data distribution with deep architectures. We propose deep structured energy based models (DSEBMs), where the energy function is the output of a deterministic deep neural network with structure. We develop novel model architectures to integrate EBMs with different types of data such as static data, sequential data, and spatial data, and apply appropriate model architectures to adapt to the data structure. Our trai...

  7. A hierarchy of heuristic-based models of crowd dynamics

    OpenAIRE

    Degond, Pierre; Appert-Rolland, Cécile; Moussaid, Mehdi; Pettré, Julien; Theraulaz, Guy

    2013-01-01

    We derive a hierarchy of kinetic and macroscopic models from a noisy variant of the heuristic behavioral Individual-Based Model of Moussaid et al, PNAS 2011, where the pedestrians are supposed to have constant speeds. This IBM supposes that the pedestrians seek the best compromise between navigation towards their target and collisions avoidance. We first propose a kinetic model for the probability distribution function of the pedestrians. Then, we derive fluid models and propose three differe...

  8. CREDIT SCORING MODELS WITH AUC MAXIMIZATION BASED ON WEIGHTED SVM

    OpenAIRE

    LIGANG ZHOU; KIN KEUNG LAI; JEROME YEN

    2009-01-01

    Credit scoring models are very important tools for financial institutions to make credit granting decisions. In the last few decades, many quantitative methods have been used for the development of credit scoring models, with a focus on maximizing classification accuracy. This paper proposes credit scoring models with area under the receiver operating characteristic curve (AUC) maximization based on the newly emerged support vector machine (SVM) techniques. Three main SVM models with differe...

  9. Business model for sensor-based fall recognition systems.

    Science.gov (United States)

    Fachinger, Uwe; Schöpke, Birte

    2014-01-01

    AAL systems require, in addition to sophisticated and reliable technology, adequate business models for their launch and sustainable establishment. This paper presents the basic features of alternative business models for a sensor-based fall recognition system which was developed within the context of the "Lower Saxony Research Network Design of Environments for Ageing" (GAL). The models were developed parallel to the R&D process with successive adaptation and concretization. An overview of the basic features (i.e. nine partial models) of the business model is given and the mutual exclusive alternatives for each partial model are presented. The partial models are interconnected and the combinations of compatible alternatives lead to consistent alternative business models. However, in the current state, only initial concepts of alternative business models can be deduced. The next step will be to gather additional information to work out more detailed models.

  10. UML statechart based rigorous modeling of real-time system

    Institute of Scientific and Technical Information of China (English)

    LAI Ming-zhi; YOU Jin-yuan

    2005-01-01

    Rigorous modeling can ensure correctness and reduce cost in embedded real-time system development, and software methods are needed for rigorous modeling of such systems. PVS is a formal method with precisely defined syntax and semantics; a system modeled by a PVS specification can be verified by tools. Combining the widely used UML with PVS, this paper provides a novel modeling and verification approach for embedded real-time systems. In this approach, we provide 1) a time-extended UML statechart for modeling the dynamic behavior of an embedded real-time system; 2) an approach to capture timed-automata-based semantics from a timed statechart; and 3) an algorithm to generate a finite state model expressed in a PVS specification for model checking. The benefits of our approach include flexibility and user friendliness in modeling, extensibility in formalization and verification content, and better performance. Time constraints are modeled and verified, which is a highlight of this paper.

  11. Distributed Maximality based CTL Model Checking

    Directory of Open Access Journals (Sweden)

    Djamel Eddine Saidouni

    2010-05-01

    Full Text Available In this paper we investigate an approach to performing distributed CTL model checking on a network of workstations using Kleene's three-valued logic. The state space is partitioned among the network nodes, and we represent the incomplete state spaces as Maximality-based Labeled Transition Systems (MLTS), which are able to express true concurrency. Each node executes the same algorithm in parallel for a given property on an incomplete MLTS, computing the set of states that satisfy the property and the set that fail it; the remaining states are assigned the third value, unknown, because the partial state space lacks the information needed for a precise answer concerning the complete state space. To resolve these unknowns, the nodes exchange the information needed to conclude the result for the complete state space. An experimental version of the algorithm is currently being implemented in the functional programming language Erlang.
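
    A tiny sketch of Kleene's strong three-valued connectives used to combine verdicts from partial state spaces; Python's None stands in for the unknown value (a representation choice of this sketch).

      def k_and(p, q):
          """Kleene strong conjunction over {True, False, None=unknown}."""
          if p is False or q is False:
              return False
          if p is True and q is True:
              return True
          return None

      def k_or(p, q):
          """Kleene strong disjunction."""
          if p is True or q is True:
              return True
          if p is False and q is False:
              return False
          return None

      def k_not(p):
          return None if p is None else not p

      print(k_and(True, None), k_or(False, None), k_not(None))  # None None None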

  12. Model-based satellite image fusion

    DEFF Research Database (Denmark)

    Aanæs, Henrik; Sveinsson, J. R.; Nielsen, Allan Aasbjerg;

    2008-01-01

    A method is proposed for pixel-level satellite image fusion derived directly from a model of the imaging sensor. By design, the proposed method is spectrally consistent. It is argued that the proposed method needs regularization, as is the case for any method for this problem. A framework for pixel neighborhood regularization is presented. This framework enables the formulation of the regularization in a way that corresponds well with our prior assumptions about the image data. The proposed method is validated and compared with other approaches on several data sets. Lastly, the intensity-hue-saturation method is revisited in order to gain additional insight into what implications spectral consistency has for an image fusion method.

  13. Repetition-based Interactive Facade Modeling

    KAUST Repository

    AlHalawani, Sawsan

    2012-07-01

    Modeling and reconstruction of urban environments has gained researchers' attention throughout the past few years. It spans a variety of directions across multiple disciplines such as image processing, computer graphics and computer vision, as well as architecture, geoscience and remote sensing. Having a virtual world of our real cities is very attractive for various purposes such as entertainment, engineering and government, among many others. In this thesis, we address the problem of processing a single façade image to acquire useful information that can be utilized to manipulate the façade and generate variations of façade images which can later be used for buildings' texturing. Typical façade structures exhibit a rectilinear distribution wherein windows and other elements are organized in a grid of horizontal and vertical repetitions of similar patterns. In the first part of this thesis, we propose an efficient algorithm that exploits information obtained from a single image to identify the distribution grid of the dominant elements, i.e. windows. This detection method is initially assisted by the user marking the dominant window, followed by an automatic process for identifying its repeated instances, which are used to define the structure grid. Given the distribution grid, we allow the user to interactively manipulate the façade by adding, deleting, resizing or repositioning the windows in order to generate new façade structures. Having this utility for interactive façade editing is very valuable for creating façade variations and generating new textures for building models. Ultimately, there is a wide range of interesting possibilities of interactions to be explored.

  14. Intelligent Transportation and Evacuation Planning A Modeling-Based Approach

    CERN Document Server

    Naser, Arab

    2012-01-01

    Intelligent Transportation and Evacuation Planning: A Modeling-Based Approach provides a new paradigm for evacuation planning strategies and techniques. Recently, evacuation planning and modeling have increasingly attracted interest among researchers as well as government officials. This interest stems from the recent catastrophic hurricanes and weather-related events that occurred in the southeastern United States (Hurricanes Katrina and Rita). The evacuation methods that were in place before and during the hurricanes did not work well and resulted in thousands of deaths. This book offers insights into the methods and techniques that allow for implementing mathematical-based, simulation-based, and integrated optimization and simulation-based engineering approaches for evacuation planning. This book also: comprehensively discusses the application of mathematical models for evacuation and intelligent transportation modeling; covers advanced methodologies in evacuation modeling and planning; discusses principles a...

  15. Scale-based spatial data model for GIS

    Institute of Scientific and Technical Information of China (English)

    WEI Zu-kuan

    2004-01-01

    Being the primary medium of geographical information and the elementary objects manipulated, almost all maps in existing GIS adopt the layer-based model to represent geographic information. However, it is difficult to extend a map represented in the layer-based model. Furthermore, in Web-based GIS, transmitting the spatial data for map viewing is slow. In this paper, to solve the problems above, we propose a new method for representing spatial data: the scale-based model. In this model we represent maps at three levels: scale-view, block, and spatial object, and organize the maps in a set of map layers, named Scale-Views, each associated with given scales. Lastly, a prototype Web-based GIS using the proposed spatial data representation is described briefly.

  16. Discrete Element Simulation of Asphalt Mastics Based on Burgers Model

    Institute of Scientific and Technical Information of China (English)

    LIU Yu; FENG Shi-rong; HU Xia-guang

    2007-01-01

    In order to investigate the viscoelastic performance of asphalt mastics, a micro-mechanical model for asphalt mastics was built by applying the Burgers model to discrete element simulation and constructing a Burgers contact model. A numerical simulation of creep tests was then conducted, and results from the simulation were compared with the analytical solution of the Burgers model. The comparison showed that the two results agreed well with each other, suggesting that a discrete element model based on the Burgers model can be employed in numerical simulation of asphalt mastics.
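
    The analytical creep solution that discrete element results are typically checked against has a closed form; a sketch under constant stress, with hypothetical parameter values:

      import numpy as np

      def burgers_creep_strain(t, sigma, E1, eta1, E2, eta2):
          """Creep strain of a Burgers model (Maxwell + Kelvin in series) under
          constant stress: elastic + viscous + delayed-elastic contributions."""
          return sigma * (1/E1 + t/eta1 + (1 - np.exp(-E2 * t / eta2)) / E2)

      t = np.linspace(0, 100, 5)   # seconds
      # Hypothetical parameters (moduli in MPa, viscosities in MPa*s)
      print(burgers_creep_strain(t, sigma=0.1, E1=50, eta1=5e3, E2=20, eta2=1e3))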

  17. GIS-BASED 1-D DIFFUSIVE WAVE OVERLAND FLOW MODEL

    Energy Technology Data Exchange (ETDEWEB)

    KALYANAPU, ALFRED [Los Alamos National Laboratory; MCPHERSON, TIMOTHY N. [Los Alamos National Laboratory; BURIAN, STEVEN J. [NON LANL

    2007-01-17

    This paper presents a GIS-based 1-d distributed overland flow model and summarizes an application to simulate a flood event. The model estimates infiltration using the Green-Ampt approach and routes excess rainfall using the 1-d diffusive wave approximation. The model was designed to use readily available topographic, soils, and land use/land cover data and rainfall predictions from a meteorological model. An assessment of model performance was performed for a small catchment and a large watershed, both in urban environments. Simulated runoff hydrographs were compared to observations for a selected set of validation events. Results confirmed the model provides reasonable predictions in a short period of time.
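
    A sketch of the Green-Ampt component: under ponded conditions, cumulative infiltration F(t) satisfies an implicit equation that a simple fixed-point iteration solves; the soil parameters below are textbook-style illustrations, not the model's calibrated values.

      import math

      def green_ampt_F(t, K, psi, dtheta, tol=1e-8):
          """Solve F - psi*dtheta*ln(1 + F/(psi*dtheta)) = K*t for cumulative
          infiltration F (ponded conditions) by fixed-point iteration."""
          pd = psi * dtheta
          F = K * t if K * t > 0 else tol       # initial guess
          while True:
              F_new = K * t + pd * math.log(1 + F / pd)
              if abs(F_new - F) < tol:
                  return F_new
              F = F_new

      # Illustrative silty soil: K [cm/h], suction head psi [cm], moisture deficit
      F = green_ampt_F(t=2.0, K=0.65, psi=16.7, dtheta=0.34)
      f = 0.65 * (1 + (16.7 * 0.34) / F)        # infiltration rate f = K(1 + pd/F)
      print(F, f)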

  18. Physics-Based Reactive Burn Model: Grain Size Effects

    Science.gov (United States)

    Lu, X.; Hamate, Y.; Horie, Y.

    2007-12-01

    We have been developing a physics-based reactive burn (PBRB) model, which was formulated based on the concept of a statistical hot spot cell. In the model, essential thermomechanics and physiochemical features are explicitly modeled. In this paper, we have extended the statistical hot spot model to explicitly describe the ignition and growth of hot spots. In particular, grain size effects are explicitly delineated through introduction of grain size-dependent, thickness of the hot-region, energy deposition criterion, and specific surface area. Besides the linear relationships between the run distance to detonation and the critical diameter with respect to the reciprocal specific surface area of heterogeneous explosives (HE), which is based on the original model and discussed in a parallel paper of this meeting, parametric studies have shown that the extended PBRB model can predict a non-monotonic variation of shock sensitivity with grain size, as observed by Moulard et al.

  19. CONCEPTUAL MODELING BASED ON LOGICAL EXPRESSION AND EVOLVEMENT

    Institute of Scientific and Technical Information of China (English)

    YI Guodong; ZHANG Shuyou; TAN Jianrong; JI Yangjian

    2007-01-01

    Aiming at the problem of abstract and polytype information modeling in product conceptual design, a method of conceptual modeling based on logical expression and evolvement is presented. Based on the logic expressions of the product conceptual design information, a function/logic/structure mapping model is set up. First, the function semantics is transformed into logical expressions through function/logic mapping. Second, the methods of logical evolvement are utilized to describe the function analysis, function/structure mapping and structure combination. Last, the logical structure scheme is transformed into geometrical sketch through logic/structure mapping. The conceptual design information and modeling process are described uniformly with logical methods in the model, and an effective method for computer aided conceptual design based on the model is implemented.

  20. Model Based Mission Assurance: Emerging Opportunities for Robotic Systems

    Science.gov (United States)

    Evans, John W.; DiVenti, Tony

    2016-01-01

    The emergence of Model Based Systems Engineering (MBSE) in a Model Based Engineering framework has created new opportunities to improve effectiveness and efficiency across the assurance functions. The MBSE environment supports not only system architecture development, but also Systems Safety, Reliability and Risk Analysis concurrently in the same framework. Linking to detailed design will further improve assurance capabilities to support failure avoidance and mitigation in flight systems. This is also leading to new assurance functions, including model assurance and management of uncertainty in the modeling environment. Further, assurance cases, structured hierarchical arguments or models, are emerging as a basis for supporting a comprehensive viewpoint in which to support Model Based Mission Assurance (MBMA).

  1. Rodent model of activity-based anorexia.

    Science.gov (United States)

    Carrera, Olaia; Fraga, Ángela; Pellón, Ricardo; Gutiérrez, Emilio

    2014-04-10

    Activity-based anorexia (ABA) consists of a procedure that involves the simultaneous exposure of animals to a restricted feeding schedule, while free access is allowed to an activity wheel. Under these conditions, animals show a progressive increase in wheel running, a reduced efficiency in food intake to compensate for their increased activity, and a severe progression of weight loss. Due to the parallelism with the clinical manifestations of anorexia nervosa including increased activity, reduced food intake and severe weight loss, the ABA procedure has been proposed as the best analog of human anorexia nervosa (AN). Thus, ABA research could both allow a better understanding of the mechanisms underlying AN and generate useful leads for treatment development in AN.

  2. Adopsi Model Competency Based Training dalam Kewirausahaan

    Directory of Open Access Journals (Sweden)

    I Ketut Santra

    2009-01-01

    Full Text Available The aim of the research is to improve the teaching method in the entrepreneurship subject. This research adopted competency based training (CBT) into entrepreneurship teaching. The major task in this research was to formulate and design the entrepreneurship competencies. Entrepreneurship competency is indicated by personal, strategic and situational, and business competence, each of which is described in subtopics of competence. After designing and formulating the games and simulations, the research continued by implementing the competency based training in a real class. The implementation of the CBT took one semester, from September 2006 to early February 2007. The lesson learnt from the implementation period is that CBT could improve student competence in the personal, situational and strategic, and business dimensions, all three of which are important for a successful entrepreneur; this is a sign of the application of "Kurikulum Berbasis Kompetensi" (the Indonesian competency-based curriculum). There is much evidence of the achievement of CBT in the entrepreneurship subject. First, the physical achievement: all of the students' business plans became real businesses, as documented by pictures of the students' actual businesses. Second, the theoretical achievement: the personal, situational-strategic and business competences statistically have a significant relation with business plan and real business quality. The effect of the personal, situational-strategic and business competences on business plan quality is 84.4%, and on real business quality 77.2%. This statistical evidence suggests that the redesign of the entrepreneurship subject is on the right track. The content of the entrepreneurial competences (personal, situational and strategic, and business competence) has an impact on students starting and running their own businesses.

  3. A review of urban residential choice models using agent-based modeling

    NARCIS (Netherlands)

    Huang, Qingxu; Parker, Dawn C.; Filatova, Tatiana; Sun, Shipeng

    2014-01-01

    Urban land-use modeling methods have experienced substantial improvements in the last several decades. With the advancement of urban land-use change theories and modeling techniques, a considerable number of models have been developed. The relatively young approach, agent-based modeling, provides ur

  4. Developing a Physiologically-Based Pharmacokinetic Model Knowledgebase in Support of Provisional Model Construction

    Science.gov (United States)

    Developing physiologically-based pharmacokinetic (PBPK) models for chemicals can be resource-intensive, as neither chemical-specific parameters nor in vivo pharmacokinetic data are easily available for model construction. Previously developed, well-parameterized, and thoroughly-v...

  5. When Does Model-Based Control Pay Off?

    Science.gov (United States)

    Kool, Wouter; Cushman, Fiery A; Gershman, Samuel J

    2016-08-01

    Many accounts of decision making and reinforcement learning posit the existence of two distinct systems that control choice: a fast, automatic system and a slow, deliberative system. Recent research formalizes this distinction by mapping these systems to "model-free" and "model-based" strategies in reinforcement learning. Model-free strategies are computationally cheap, but sometimes inaccurate, because action values can be accessed by inspecting a look-up table constructed through trial-and-error. In contrast, model-based strategies compute action values through planning in a causal model of the environment, which is more accurate but also more cognitively demanding. It is assumed that this trade-off between accuracy and computational demand plays an important role in the arbitration between the two strategies, but we show that the hallmark task for dissociating model-free and model-based strategies, as well as several related variants, do not embody such a trade-off. We describe five factors that reduce the effectiveness of the model-based strategy on these tasks by reducing its accuracy in estimating reward outcomes and decreasing the importance of its choices. Based on these observations, we describe a version of the task that formally and empirically obtains an accuracy-demand trade-off between model-free and model-based strategies. Moreover, we show that human participants spontaneously increase their reliance on model-based control on this task, compared to the original paradigm. Our novel task and our computational analyses may prove important in subsequent empirical investigations of how humans balance accuracy and demand. PMID:27564094

  6. Uncertainty Representation and Interpretation in Model-based Prognostics Algorithms based on Kalman Filter Estimation

    Data.gov (United States)

    National Aeronautics and Space Administration — This article discusses several aspects of uncertainty representation and management for model-based prognostics methodologies based on our experience with...
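
    For orientation, the predict/update cycle at the heart of Kalman-filter-based estimation, in one dimension with an assumed random-walk degradation model; the article's actual models and noise statistics are not reproduced here.

      def kalman_step(x, P, z, q=1e-4, r=1e-2):
          """One predict/update cycle of a 1-D Kalman filter.
          x, P: prior state estimate and variance; z: new measurement;
          q, r: process and measurement noise variances (assumed)."""
          # Predict (random-walk degradation model: x_k = x_{k-1} + w)
          P = P + q
          # Update
          K = P / (P + r)                 # Kalman gain
          x = x + K * (z - x)             # innovation-weighted correction
          P = (1 - K) * P
          return x, P

      x, P = 0.0, 1.0
      for z in [0.11, 0.09, 0.12, 0.10]:  # simulated health-index measurements
          x, P = kalman_step(x, P, z)
      print(x, P)                         # estimate near 0.10, shrinking variance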

  7. A Role-Based PMI Security Model for E-Government

    Institute of Scientific and Technical Information of China (English)

    WU Li-jun; SU Kai-le; YANG Zhi-hua

    2005-01-01

    We introduce the general AC (attribute certificate), the role specification AC and the role assignment AC. We discuss the role-based PMI (Privilege Management Infrastructure) architecture. A role-based PMI security model for E-government is researched by combining the role-based PMI with the PKI (Public Key Infrastructure) architecture. The model has the advantages of flexibility, convenience, less storage space and less network consumption, etc. We are going to use this security model in the E-government system.

  8. Interactive Coherence-Based Façade Modeling

    KAUST Repository

    Musialski, Przemyslaw

    2012-05-01

    We propose a novel interactive framework for modeling building facades from images. Our method is based on the notion of coherence-based editing which allows exploiting partial symmetries across the facade at any level of detail. The proposed workflow mixes manual interaction with automatic splitting and grouping operations based on unsupervised cluster analysis. In contrast to previous work, our approach leads to detailed 3d geometric models with up to several thousand regions per facade. We compare our modeling scheme to others and evaluate our approach in a user study with an experienced user and several novice users.

  9. Anchor-based English-Chinese Bilingual Chunk Alignment Model

    Institute of Scientific and Technical Information of China (English)

    WU We-lin; CHENG Chang-sheng; XU Liang-xian; LU Ru-zhan

    2005-01-01

    Chunk alignment for a bilingual corpus is the basis of Example-based Machine Translation. An anchor-based English-Chinese bilingual chunk alignment model and the corresponding alignment algorithm are presented in this paper. They can effectively overcome the sparse data problem due to the limited size of the bilingual corpus. In this model, chunk segmentation disambiguation is delayed until the alignment process, and hence the accuracy of chunk segmentation is improved. The experimental results demonstrate the feasibility and viability of this model.

  10. SDRAM-based packet buffer model for high speed switches

    DEFF Research Database (Denmark)

    Rasmussen, Anders; Ruepp, Sarah Renée; Berger, Michael Stübert;

    2011-01-01

    This article investigates how the performance of SDRAM-based packet buffering systems for high performance switches can be simulated using OPNET. In order to include the access-pattern-dependent performance of SDRAM modules in simulations, a custom SDRAM model is implemented in OPNET Modeler based on the specifications of a real-life DDR3-SDRAM chip. Based on this model, the performance of different schemes for optimizing the performance of such a packet buffer can be evaluated. The purpose of this study is to find efficient schemes for memory mapping of the packet queues and I/O traffic...

  11. Model-Based Integration and Interpretation of Data

    DEFF Research Database (Denmark)

    Petersen, Johannes

    2004-01-01

    Data integration and interpretation plays a crucial role in supervisory control. The paper defines a set of generic inference steps for the data integration and interpretation process based on a three-layer model of system representations. The three-layer model is used to clarify the combination of constraint and object-centered representations of the work domain, throwing new light on the basic principles underlying the data integration and interpretation process of Rasmussen's abstraction hierarchy as well as other model-based approaches combining constraint and object-centered representations. Based...

  12. Optical computing based on neuronal models

    Science.gov (United States)

    Farhat, Nabil H.

    1987-10-01

    Ever since the fit between what neural net models can offer (collective, iterative, nonlinear, robust, and fault-tolerant approach to information processing) and the inherent capabilities of optics (parallelism and massive interconnectivity) was first pointed out and the first optical associative memory demonstrated in 1985, work and interest in neuromorphic optical signal processing has been growing steadily. For example, work in optical associative memories is currently being conducted at several academic institutions (e.g., California Institute of Technology, University of Colorado, University of California-San Diego, Stanford University, University of Rochester, and the author's own institution the University of Pennsylvania) and at several industrial and governmental laboratories (e.g., Hughes Research Laboratories - Malibu, the Naval Research Laboratory, and the Jet Propulsion Laboratory). In these efforts, in addition to the vector matrix multiplication with thresholding and feedback scheme utilized in early implementations, an arsenal of sophisticated optical tools such as holographic storage, phase conjugate optics, and wavefront modulation and mixing are being drawn on to realize associative memory functions.

  13. Business Model Innovation and Competitive Imitation: The Case of Sponsor-Based Business Models

    OpenAIRE

    Casadesus-Masanell, Ramon; Zhu, Feng

    2013-01-01

    This paper provides the first formal model of business model innovation. Our analysis focuses on sponsor-based business model innovations where a firm monetizes its product through sponsors rather than setting prices to its customer base. We analyze strategic interactions between an innovative entrant and an incumbent where the incumbent may imitate the entrant's business model innovation once it is revealed. The results suggest that an entrant needs to strategically choose whether to reveal ...

  14. BP Network Based Users' Interest Model in Mining WWW Cache

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    By analyzing the WWW cache model, we put forward a user-interest description method based on fuzzy theory, and user-interest inference relations based on a BP (back propagation) neural network. With this method, users' interest in the WWW cache can be described, and the neural network of users' interest can be constructed by the positive spreading of interest and the negative spreading of errors. This neural network can then infer users' interest. This model is not a simple extension of the simple interest model, but an all-round improvement of the model and its related algorithm.

  15. Elastoplastic cup model for cement-based materials

    Directory of Open Access Journals (Sweden)

    Yan ZHANG

    2010-03-01

    Full Text Available Based on experimental data obtained from triaxial tests and a hydrostatic test, a cup model was formulated. Two plastic mechanisms, respectively a deviatoric shearing and a pore collapse, are taken into account. This model also considers the influence of confining pressure. In this paper, the calibration of the model is detailed and numerical simulations of the main mechanical behavior of cement paste over a large range of stress are described, showing good agreement with experimental results. The case study shows that this cup model has extensive applicability for cement-based materials and other quasi-brittle and high-porosity materials in a complex stress state.

  16. Numerical simulation of base flow with hot base bleed for two jet models

    Institute of Scientific and Technical Information of China (English)

    Wen-jie YU; Yong-gang YU; Bin NI

    2014-01-01

    In order to improve the benefits of base bleed in the base flow field, the base flow with hot base bleed for two jet models is studied. Two-dimensional axisymmetric Navier-Stokes equations are computed using a finite volume scheme. The base flow of a cylinder afterbody with base bleed is simulated. The simulation results are validated against the experimental data, and the experimental results are well reproduced. On this basis, the base flow fields with base bleed for a circular jet model and an annular jet model are investigated for injection temperatures from 830 K to 2200 K. The results show that the base pressure of the annular jet model is higher than that of the circular jet model as the injection parameter and the injection temperature change. For the circular jet model, the hot gases are concentrated in the vicinity of the base. For the annular jet model, the bleed gases flow directly into the shear layer, so the hot gases are concentrated in the shear layer. The latter temperature distribution is better for increasing the base pressure.

  17. Random sampling or geostatistical modelling? Choosing between design-based and model-based sampling strategies for soil (with discussion)

    NARCIS (Netherlands)

    Brus, D.J.; Gruijter, de J.J.

    1997-01-01

    Classical sampling theory has been repeatedly identified with classical statistics which assumes that data are identically and independently distributed. This explains the switch of many soil scientists from design-based sampling strategies, based on classical sampling theory, to the model-based app

  18. RPOA Model-Based Optimal Resource Provisioning

    Directory of Open Access Journals (Sweden)

    Noha El. Attar

    2014-01-01

    Full Text Available Optimal utilization of resources is the core of the provisioning process in cloud computing. Sometimes the local resources of a data center are not adequate to satisfy users' requirements, so providers need to create several data centers in different geographical areas around the world and spread users' applications across these resources to satisfy both service providers' and customers' QoS requirements. Considering the expansion of resources and applications, the transmission cost and time have to be treated as significant factors in the allocation process. Following the work of our previous paper, a Resource Provision Optimal Algorithm (RPOA) based on Particle Swarm Optimization (PSO) was introduced to find near-optimal resource utilization considering the customer budget and deadline time. This paper presents an enhancement to the RPOA algorithm to find near-optimal resource utilization considering the data transfer time and cost, in addition to the customer budget and deadline time, in the performance measurement.
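
    A minimal PSO loop of the kind RPOA builds on, here minimising a stand-in cost function; the swarm size, inertia and the cost function itself are illustrative assumptions, not the RPOA formulation.

      import numpy as np
      rng = np.random.default_rng(0)

      def cost(x):                      # stand-in for an allocation cost/time objective
          return np.sum(x**2, axis=1)

      n, dim, w, c1, c2 = 20, 4, 0.7, 1.5, 1.5
      x = rng.uniform(-5, 5, (n, dim))
      v = np.zeros((n, dim))
      pbest, pbest_f = x.copy(), cost(x)          # personal bests
      gbest = pbest[pbest_f.argmin()]             # global best

      for it in range(100):
          r1, r2 = rng.random((2, n, dim))
          v = w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x)   # velocity update
          x = x + v
          f = cost(x)
          improved = f < pbest_f
          pbest[improved], pbest_f[improved] = x[improved], f[improved]
          gbest = pbest[pbest_f.argmin()]

      print(gbest, pbest_f.min())       # near the optimum at the origin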

  19. Hybrid and adaptive meta-model-based global optimization

    Science.gov (United States)

    Gu, J.; Li, G. Y.; Dong, Z.

    2012-01-01

    As an efficient and robust technique for global optimization, meta-model-based search methods have been increasingly used in solving complex and computation intensive design optimization problems. In this work, a hybrid and adaptive meta-model-based global optimization method that can automatically select appropriate meta-modelling techniques during the search process to improve search efficiency is introduced. The search initially applies three representative meta-models concurrently. Progress towards a better performing model is then introduced by selecting sample data points adaptively according to the calculated values of the three meta-models to improve modelling accuracy and search efficiency. To demonstrate the superior performance of the new algorithm over existing search methods, the new method is tested using various benchmark global optimization problems and applied to a real industrial design optimization example involving vehicle crash simulation. The method is particularly suitable for design problems involving computation intensive, black-box analyses and simulations.

  20. Modelling Amperometric Biosensors Based on Chemically Modified Electrodes

    Science.gov (United States)

    Baronas, Romas; Kulys, Juozas

    2008-01-01

    The response of an amperometric biosensor based on a chemically modified electrode was modelled numerically. A mathematical model of the biosensor is based on a system of non-linear reaction-diffusion equations. The modelled biosensor comprises two compartments: an enzyme layer and an outer diffusion layer. In order to define the main governing parameters, the corresponding dimensionless mathematical model was derived. The digital simulation was carried out using the finite difference technique. The adequacy of the model was evaluated using analytical solutions known for very specific cases of the model parameters. By changing model parameters, the output results were numerically analyzed at transient and steady state conditions. The influence of the substrate and mediator concentrations, as well as of the thicknesses of the enzyme and diffusion layers, on the biosensor response was investigated. Calculations showed complex kinetics of the biosensor response, especially when the biosensor acts under a mixed limitation of diffusion and the enzyme interaction with the substrate.
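
    A sketch of a two-compartment finite-difference scheme of this kind, with illustrative parameter values (not the paper's): the enzyme layer consumes substrate with Michaelis-Menten kinetics, the outer layer only diffuses, and the response is read as the substrate flux at the electrode.

```python
import numpy as np

# Grid: enzyme layer (first d_enz points) + outer diffusion layer.
d_enz, d_dif = 40, 60
N = d_enz + d_dif
dx, dt = 1e-6, 5e-4                 # grid spacing [m], time step [s]
D = np.where(np.arange(N) < d_enz, 3e-10, 6e-10)   # diffusivity [m^2/s]
Vmax, Km, S0 = 1e-4, 1e-2, 1e-2     # MM kinetics [M/s, M], bulk conc. [M]

S = np.zeros(N)
S[-1] = S0                          # bulk substrate at the outer boundary
for _ in range(50000):
    lap = np.zeros(N)
    lap[1:-1] = (S[2:] - 2 * S[1:-1] + S[:-2]) / dx**2
    react = np.where(np.arange(N) < d_enz, Vmax * S / (Km + S), 0.0)
    S = S + dt * (D * lap - react)  # crude treatment of the layer interface
    S[0] = 0.0                      # electrode consumes substrate (Dirichlet)
    S[-1] = S0                      # bulk concentration held fixed

flux = D[0] * (S[1] - S[0]) / dx    # proportional to the biosensor current
print("steady-state substrate flux at the electrode:", flux)
```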

  1. A community-based framework for aquatic ecosystem models

    DEFF Research Database (Denmark)

    Trolle, Didde; Hamilton, D. P.; Hipsey, M. R.;

    2012-01-01

    Here, we communicate a point of departure in the development of aquatic ecosystem models, namely a new community-based framework, which supports an enhanced and transparent union between the collective expertise that exists in the communities of traditional ecologists and model developers. Through this framework, we aim to (i) advance collaboration within the aquatic ecosystem modelling community, (ii) enable increased use of models for research, policy and ecosystem-based management, (iii) facilitate a collective framework using common (standardised) code to ensure that model development is incremental, and (iv) avoid 're-inventing the wheel', thus accelerating improvements to aquatic ecosystem models. We intend to achieve this as a community that fosters interactions amongst ecologists and model developers. Further, we outline scientific topics recently articulated by the scientific community, which lend...

  2. Multi-level spherical moments based 3D model retrieval

    Institute of Scientific and Technical Information of China (English)

    LIU Wei; HE Yuan-jun

    2006-01-01

    In this paper a novel 3D model retrieval method is proposed that employs multi-level spherical moment analysis and relies on voxelization and spherical mapping of the 3D models. For a given polygon-soup 3D model, a pose normalization step is first performed to align the model into a canonical coordinate frame, so as to define the shape representation with respect to this orientation. Afterwards its exterior surface is rasterized into cubical voxel grids, and a series of homocentric spheres, with their centers superposed on the center of the voxel grid, cut the voxel grid into several spherical images. Finally, moments belonging to each sphere are computed, and the moments of all spheres constitute the descriptor of the model. Experiments showed that Euclidean distance based on this kind of feature vector can distinguish different 3D models well and that the 3D model retrieval system based on this algorithm yields satisfactory performance.
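
    A simplified sketch of such a pipeline, assuming pose normalization has already been done and using only the 0th-order moment (occupancy mean) per spherical shell; grid size, shell count and the point-cloud input are illustrative, not the paper's.

```python
import numpy as np

def descriptor(points, grid=32, shells=8):
    # Normalize into the unit cube, then voxelize the point set.
    p = (points - points.min(0)) / np.ptp(points, 0).max()
    vox = np.zeros((grid,) * 3)
    idx = np.clip((p * (grid - 1)).astype(int), 0, grid - 1)
    vox[idx[:, 0], idx[:, 1], idx[:, 2]] = 1.0
    # Radius of every voxel from the grid center.
    c = (grid - 1) / 2.0
    x, y, z = np.indices(vox.shape)
    r = np.sqrt((x - c) ** 2 + (y - c) ** 2 + (z - c) ** 2)
    # Homocentric shells cut the grid; take one moment per shell.
    edges = np.linspace(0.0, r.max() + 1e-9, shells + 1)
    feats = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        shell = vox[(r >= lo) & (r < hi)]
        feats.append(shell.mean() if shell.size else 0.0)
    return np.asarray(feats)

a = np.random.default_rng(1).normal(size=(500, 3))
b = np.random.default_rng(2).uniform(-1, 1, size=(500, 3))
print("Euclidean distance between descriptors:",
      np.linalg.norm(descriptor(a) - descriptor(b)))
```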

  3. Research on Bayesian Network Based User's Interest Model

    Institute of Scientific and Technical Information of China (English)

    ZHANG Weifeng; XU Baowen; CUI Zifeng; XU Lei

    2007-01-01

    Filtering and selectively retrieving the large amount of information on the Internet has very practical significance for improving the quality of users' access to information. On the basis of analyzing the existing users' interest models and some basic questions of users' interest (representation, derivation and identification of users' interest), a Bayesian network based users' interest model is given. In this model, a users' interest reduction algorithm based on the Markov Blanket model is used to reduce the interest noise, and then documents the users are and are not interested in are used to train the Bayesian network. Compared with simpler models, this model has advantages such as small space requirements, a simple reasoning method and a high recognition rate. The experimental results show that this model can more appropriately reflect the user's interest, and has higher performance and good usability.

  4. An Agent-Based Modeling for Pandemic Influenza in Egypt

    CERN Document Server

    Khalil, Khaled M; Nazmy, Taymour T; Salem, Abdel-Badeeh M

    2010-01-01

    Pandemic influenza has great potential to cause large and rapid increases in deaths and serious illness. The objective of this paper is to develop an agent-based model to simulate the spread of pandemic influenza (novel H1N1) in Egypt. The proposed multi-agent model is based on modeling individuals' interactions in a space-time context. The model involves different types of parameters, such as social agent attributes, the distribution of Egypt's population, and patterns of agents' interactions. Analysis of the modeling results leads to an understanding of the characteristics of the modeled pandemic, its transmission patterns, and the conditions under which an outbreak might occur. In addition, the proposed model is used to measure the effectiveness of different control strategies for intervening in the pandemic spread.
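
    A minimal agent-based epidemic sketch in the same spirit (random daily contacts, fixed transmission probability and infectious period); all parameters are invented for illustration and do not model Egypt or H1N1.

```python
import random

random.seed(42)
N, p_transmit, days_infectious = 10000, 0.03, 7
S, I, R = set(range(1, N)), {0}, set()   # agent 0 starts infected
clock = {0: 0}                           # days since each infection

for day in range(120):
    new_infections = set()
    for agent in I:
        for _ in range(10):              # ~10 random contacts per day
            other = random.randrange(N)
            if other in S and random.random() < p_transmit:
                new_infections.add(other)
    for a in new_infections:
        S.discard(a)
        clock[a] = 0
    recovered = {a for a in I if clock[a] >= days_infectious}
    for a in I:
        clock[a] += 1
    I = (I | new_infections) - recovered
    R |= recovered
    if not I:
        break

print(f"final epidemic size: {len(R) + len(I)} of {N}")
```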

  5. Trust Model Based on P2P Network

    Directory of Open Access Journals (Sweden)

    Guang Ouyang

    2013-09-01

    Full Text Available In order to address the defects of traditional trust models, research on a trust model based on P2P networks is proposed in this paper. Starting from the theoretical characteristics of P2P networks, it analyzes the trust mechanism and application model and sets up a new kind of trust model, which is of great help in improving the success rate of transactions; the trust model was then designed. Finally, a simulation experiment is made. The results show that the P2P trust model can inhibit and isolate malicious nodes more effectively than the alternatives, and that improving the P2P network environment allows the network to develop in a benign way.

  6. Image based 3D city modeling : Comparative study

    Science.gov (United States)

    Singh, S. P.; Jain, K.; Mandla, V. R.

    2014-06-01

    A 3D city model is a digital representation of the Earth's surface and its related objects, such as buildings, trees, vegetation, and man-made features belonging to an urban area. The demand for 3D city modeling is increasing rapidly for various engineering and non-engineering applications. Generally, four main image-based approaches are used for virtual 3D city model generation: sketch-based modeling, procedural grammar based modeling, close-range photogrammetry based modeling, and modeling based mainly on computer vision techniques. SketchUp, CityEngine, Photomodeler and Agisoft Photoscan are the main software packages representing these approaches respectively, each with different approaches and methods suitable for image-based 3D city modeling. A literature study shows that, to date, no complete comparative study of creating full 3D city models from images is available. This paper gives a comparative assessment of these four image-based 3D modeling approaches, based mainly on data acquisition methods, data processing techniques and output 3D model products. For this research work, the study area is the campus of the civil engineering department, Indian Institute of Technology, Roorkee (India); this 3D campus acts as a prototype for a city. The study also explains the various governing parameters, factors and work experiences, and gives a brief introduction to, and the strengths and weaknesses of, these four image-based techniques. Comments are also given on what each software package can and cannot do. Finally, the study concludes that each package has advantages and limitations, and that the choice of software depends on the user's requirements for the 3D project. For normal visualization projects, SketchUp is a good option. For 3D documentation records, Photomodeler gives good results. For Large city

  7. Parameter optimization in differential geometry based solvation models.

    Science.gov (United States)

    Wang, Bao; Wei, G W

    2015-10-01

    Differential geometry (DG) based solvation models are a new class of variational implicit solvent approaches that are able to avoid unphysical solvent-solute boundary definitions and associated geometric singularities, and dynamically couple polar and non-polar interactions in a self-consistent framework. Our earlier study indicates that the DG based non-polar solvation model outperforms other methods in non-polar solvation energy predictions. However, the DG based full solvation model has not shown its superiority in solvation analysis, due to its difficulty in parametrization, which must ensure the stability of the solution of strongly coupled nonlinear Laplace-Beltrami and Poisson-Boltzmann equations. In this work, we introduce new parameter learning algorithms based on perturbation and convex optimization theories to stabilize the numerical solution and thus achieve an optimal parametrization of the DG based solvation models. An interesting feature of the present DG based solvation model is that it provides accurate solvation free energy predictions for both polar and non-polar molecules in a unified formulation. Extensive numerical experiments demonstrate that the present DG based solvation model delivers some of the most accurate predictions of the solvation free energies for a large number of molecules.

  8. Attributes Enhanced Role-Based Access Control Model

    DEFF Research Database (Denmark)

    Mahmood Rajpoot, Qasim; Jensen, Christian D.; Krishnan, Ram

    2015-01-01

    Attribute-based access control (ABAC) and role-based access control (RBAC) are currently the two most popular access control models. Yet, they both have known limitations and offer features complementary to each other. Due to this fact, integration of RBAC and ABAC has recently emerged as an im...

  9. EPR-based material modelling of soils considering volume changes

    Science.gov (United States)

    Faramarzi, Asaad; Javadi, Akbar A.; Alani, Amir M.

    2012-11-01

    In this paper an approach is presented for developing material models for soils based on evolutionary polynomial regression (EPR), taking into account their volumetric behaviour. EPR is a recently developed hybrid data mining technique that searches for structured mathematical equations (representing the behaviour of a system) using a genetic algorithm and the least squares method. Stress-strain data from triaxial tests are used to train and develop EPR-based material models for soil. The developed models are compared with some of the well known conventional material models. In particular, the capability of the developed EPR models in predicting the volume change behaviour of soils is illustrated. It is also shown that the developed EPR-based material models can be incorporated in finite element (FE) analysis. Two geotechnical examples are presented to verify the developed EPR-based FE model (EPR-FEM). The results of the EPR-FEM are compared with those of a standard FEM where conventional constitutive models are used to describe the material behaviour. The results show that EPR-FEM can be successfully employed to analyse geotechnical engineering problems. The advantages of the proposed EPR models are highlighted.

  10. Error model identification of inertial navigation platform based on errors-in-variables model

    Institute of Scientific and Technical Information of China (English)

    Liu Ming; Liu Yu; Su Baoku

    2009-01-01

    Because the real input acceleration cannot be obtained during the error model identification of an inertial navigation platform, both the input and output data contain noise. In this case, the conventional regression model and the least squares (LS) method will result in bias. Based on the models of inertial navigation platform error and observation error, the errors-in-variables (EV) model and the total least squares (TLS) method are proposed to identify the error model of the inertial navigation platform. The estimation precision is improved and the result is better than that of the conventional regression-model-based LS method. The simulation results illustrate the effectiveness of the proposed method.
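
    A compact sketch of the TLS estimator for an errors-in-variables linear model, using the standard SVD construction on the augmented matrix [X | y]; the data here are synthetic and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
b_true = np.array([2.0, -1.0])
X_clean = rng.normal(size=(200, 2))
y_clean = X_clean @ b_true
X = X_clean + 0.1 * rng.normal(size=X_clean.shape)   # noisy inputs
y = y_clean + 0.1 * rng.normal(size=y_clean.shape)   # noisy outputs

# Ordinary least squares (biased when X itself is noisy):
b_ls = np.linalg.lstsq(X, y, rcond=None)[0]

# Total least squares: right-singular vector of [X | y] for the
# smallest singular value gives the solution up to scaling.
_, _, Vt = np.linalg.svd(np.column_stack([X, y]))
v = Vt[-1]
b_tls = -v[:-1] / v[-1]

print("LS :", b_ls)
print("TLS:", b_tls)
```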

  11. Probabilistic Model-Based Diagnosis for Electrical Power Systems

    Data.gov (United States)

    National Aeronautics and Space Administration — We present in this article a case study of the probabilistic approach to model-based diagnosis. Here, the diagnosed system is a real-world electrical power system,...

  12. RESEARCH ON VIRTUAL-PART-BASED CONNECTING ELEMENT MODELING

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    Based on an analysis of the inter-part characteristics, detail modifications and assembly relations of mechanical connecting elements, the idea of extending part feature modeling to inter-part feature modeling for assembly purposes is presented, and virtual-part-based connecting element modeling is proposed. During assembly modeling, base parts are modified by the Boolean subtraction between the virtual part and the part to be connected. A dynamic matching algorithm, based on a list database, is designed for dynamic extension and off-line editing of the connecting part and the virtual part, and the design rules of connecting elements are encapsulated in the virtual part. A prototype software module for rapid design of connecting elements is implemented on the self-developed CAD/CAM platform SuperMan.

  13. Model-based approach for elevator performance estimation

    Science.gov (United States)

    Esteban, E.; Salgado, O.; Iturrospe, A.; Isasa, I.

    2016-02-01

    In this paper, a dynamic model for an elevator installation is presented in the state space domain. The model comprises both the mechanical and the electrical subsystems, including the electrical machine and a closed-loop field oriented control. The proposed model is employed for monitoring the condition of the elevator installation. The adopted model-based approach for monitoring employs the Kalman filter as an observer. A Kalman observer estimates the elevator car acceleration, which determines the elevator ride quality, based solely on the machine control signature and the encoder signal. Five elevator key performance indicators are then calculated based on the estimated car acceleration. The proposed procedure is evaluated experimentally by comparing the key performance indicators calculated from the estimated car acceleration with the values obtained from actual acceleration measurements on a test bench. Finally, the proposed procedure is compared with the sliding mode observer.
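
    A sketch of the observer idea, assuming a simple kinematic state [position, velocity, acceleration] and a position-only (encoder-like) measurement; matrices, noise levels and the ride profile are illustrative, not the paper's elevator model.

```python
import numpy as np

dt = 0.01
F = np.array([[1, dt, dt**2 / 2],        # state: [position, velocity, accel]
              [0, 1, dt],
              [0, 0, 1]])
H = np.array([[1.0, 0.0, 0.0]])          # encoder measures position only
Q = np.diag([1e-8, 1e-6, 1e-3])          # process noise
R = np.array([[1e-6]])                   # encoder noise

x, P = np.zeros((3, 1)), np.eye(3)
rng = np.random.default_rng(0)
t = np.arange(0, 5, dt)
true_acc = np.where(t < 2, 0.8, np.where(t < 4, 0.0, -0.8))  # toy ride profile
true_vel = np.cumsum(true_acc) * dt
true_pos = np.cumsum(true_vel) * dt
meas = true_pos + 1e-3 * rng.normal(size=t.size)

est_acc = []
for z in meas:
    x = F @ x                                          # predict
    P = F @ P @ F.T + Q
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)       # Kalman gain
    x = x + K @ (np.array([[z]]) - H @ x)              # update
    P = (np.eye(3) - K @ H) @ P
    est_acc.append(float(x[2, 0]))

print("peak estimated acceleration:", max(est_acc))
```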

  14. A Model-based Avionic Prognostic Reasoner (MAPR)

    Data.gov (United States)

    National Aeronautics and Space Administration — The Model-based Avionic Prognostic Reasoner (MAPR) presented in this paper is an innovative solution for non-intrusively monitoring the state of health (SoH) and...

  15. Physics-Based Pneumatic Hammer Instability Model Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this project is to develop a physics-based pneumatic hammer instability model that accurately predicts the stability of hydrostatic bearings...

  16. Model-based Prognostics with Fixed-lag Particle Filters

    Data.gov (United States)

    National Aeronautics and Space Administration — Model-based prognostics exploits domain knowl- edge of the system, its components, and how they fail by casting the underlying physical phenom- ena in a...

  17. A Model-based Prognostics Approach Applied to Pneumatic Valves

    Data.gov (United States)

    National Aeronautics and Space Administration — Within the area of systems health management, the task of prognostics centers on predicting when components will fail. Model-based prognostics exploits domain...

  18. A Model-Based Prognostics Approach Applied to Pneumatic Valves

    Data.gov (United States)

    National Aeronautics and Space Administration — Within the area of systems health management, the task of prognostics centers on predicting when components will fail. Model-based prognostics exploits domain...

  19. Content-based network model with duplication and divergence

    Science.gov (United States)

    Şengün, Yasemin; Erzan, Ayşe

    2006-06-01

    We construct a minimal content-based realization of the duplication and divergence model of genomic networks introduced by Wagner [Proc. Natl. Acad. Sci. 91 (1994) 4387] and investigate the scaling properties of the directed degree distribution and clustering coefficient. We find that the content-based network exhibits crossover between two scaling regimes, with log-periodic oscillations for large degrees. These features are not present in the original gene duplication model, but inherent in the content-based model of Balcan and Erzan. The scaling form of the degree distribution of the content-based model turns out to be robust under duplication and divergence, with some re-adjustment of the scaling exponents, while the out-clustering coefficient goes over from a weak power-law dependence on the degree, to an exponential decay under mutations which include splitting and merging of strings.

  20. CDMBE: A Case Description Model Based on Evidence

    Directory of Open Access Journals (Sweden)

    Jianlin Zhu

    2015-01-01

    Full Text Available By combining the advantages of argument maps and Bayesian networks, a case description model based on evidence (CDMBE), which is suitable for the continental law system, is proposed to describe criminal cases. The model adopts credibility logic to reason quantitatively from the evidence. To be consistent with practical inference rules, five types of relationship and a set of rules are defined to calculate the credibility of assumptions based on the credibility and supportability of the related evidence. Experiments show that the model can capture users' reasoning in a diagram and that the results calculated from CDMBE are in line with those from a Bayesian model.

  1. Human insulin dynamics in women: a physiologically based model.

    Science.gov (United States)

    Weiss, Michael; Tura, Andrea; Kautzky-Willer, Alexandra; Pacini, Giovanni; D'Argenio, David Z

    2016-02-01

    Currently available models of insulin dynamics are mostly based on the classical compartmental structure and, thus, their physiological utility is limited. In this work, we describe the development of a physiologically based model and its application to data from 154 patients who underwent an insulin-modified intravenous glucose tolerance test (IM-IVGTT). To determine the time profile of endogenous insulin delivery without using C-peptide data and to evaluate the transcapillary transport of insulin, the hepatosplanchnic, renal, and peripheral beds were incorporated into the circulatory model as separate subsystems. Physiologically reasonable population mean estimates were obtained for all estimated model parameters, including plasma volume, interstitial volume of the peripheral circulation (mainly skeletal muscle), uptake clearance into the interstitial space, hepatic and renal clearance, as well as total insulin delivery into plasma. The results indicate that, at a population level, the proposed physiologically based model provides a useful description of insulin disposition, which allows for the assessment of muscle insulin uptake.

  2. Using Built-In Domain-Specific Modeling Support to Guide Model-Based Test Generation

    CERN Document Server

    Kanstrén, Teemu; 10.4204/EPTCS.80.5

    2012-01-01

    We present a model-based testing approach to support automated test generation with domain-specific concepts. This includes a language expert who is an expert at building test models and domain experts who are experts in the domain of the system under test. First, we provide a framework to support the language expert in building test models using a full (Java) programming language with the help of simple but powerful modeling elements of the framework. Second, based on the model built with this framework, the toolset automatically forms a domain-specific modeling language that can be used to further constrain and guide test generation from these models by a domain expert. This makes it possible to generate a large set of test cases covering the full model, chosen (constrained) parts of the model, or manually define specific test cases on top of the model while using concepts familiar to the domain experts.

  3. Using Built-In Domain-Specific Modeling Support to Guide Model-Based Test Generation

    Directory of Open Access Journals (Sweden)

    Teemu Kanstrén

    2012-02-01

    Full Text Available We present a model-based testing approach to support automated test generation with domain-specific concepts. This includes a language expert who is an expert at building test models and domain experts who are experts in the domain of the system under test. First, we provide a framework to support the language expert in building test models using a full (Java) programming language with the help of simple but powerful modeling elements of the framework. Second, based on the model built with this framework, the toolset automatically forms a domain-specific modeling language that can be used to further constrain and guide test generation from these models by a domain expert. This makes it possible to generate a large set of test cases covering the full model, chosen (constrained) parts of the model, or manually define specific test cases on top of the model while using concepts familiar to the domain experts.

  4. Lagrangian-based Hydrodynamic Model: Freeway Traffic Estimation

    OpenAIRE

    Han, Ke; Yao, Tao; Terry L. Friesz

    2012-01-01

    This paper is concerned with highway traffic estimation using traffic sensing data, in a Lagrangian-based modeling framework. We consider the Lighthill-Whitham-Richards (LWR) model (Lighthill and Whitham, 1955; Richards, 1956) in Lagrangian-coordinates, and provide rigorous mathematical results regarding the equivalence of viscosity solutions to the Hamilton-Jacobi equations in Eulerian and Lagrangian coordinates. We derive closed-form solutions to the Lagrangian-based Hamilton-Jacobi equatio...

  5. Articulated Pose Estimation Using Hierarchical Exemplar-Based Models

    OpenAIRE

    Liu, Jiongxin; Li, Yinxiao; Allen, Peter; Belhumeur, Peter

    2015-01-01

    Exemplar-based models have achieved great success on localizing the parts of semi-rigid objects. However, their efficacy on highly articulated objects such as humans is yet to be explored. Inspired by hierarchical object representation and recent application of Deep Convolutional Neural Networks (DCNNs) on human pose estimation, we propose a novel formulation that incorporates both hierarchical exemplar-based models and DCNNs in the spatial terms. Specifically, we obtain more expressive spati...

  6. An Efficient Semantic Model For Concept Based Clustering And Classification

    OpenAIRE

    SaiSindhu Bandaru; Dr. K B Madhuri

    2012-01-01

    Usually in text mining techniques the basic measures like term frequency of a term (word or phrase) is computed to compute the importance of the term in the document. But with statistical analysis, the original semantics of the term may not carry the exact meaning of the term. To overcome this problem, a new framework has been introduced which relies on concept based model and synonym based approach. The proposed model can efficiently find significant matching and related concepts between doc...

  7. Agent-based Models for Economic Policy Design

    OpenAIRE

    Dawid, Herbert; Neugart, Michael

    2010-01-01

    Agent-based simulation models are used by an increasing number of scholars as a tool for providing evaluations of economic policy measures and policy recommendations in complex environments. On the basis of recent work in this area we discuss the advantages of agent-based modeling for economic policy design and identify further needs to be addressed for strengthening this methodological approach as a basis for sound policy advice.

  8. A NEW DYNAMIC DEFENSE MODEL BASED ON ACTIVE DECEPTION

    Institute of Scientific and Technical Information of China (English)

    Gong Jing; Sun Zhixin; Gu Qiang

    2009-01-01

    Aiming at the defects of traditional passive deception models, this paper constructs a Decoy Platform based on Intelligent Agent (DPIA) to realize dynamic defense. The paper explores a new dynamic defense model based on active deception, introduces its architecture, and expatiates on communication methods and security guarantees in information transference. Simulation results show that the DPIA can attract hacker activity, lead abnormal traffic into itself, absorb a large amount of attack data, and ensure real network security.

  9. Business model analysis of interest-based social networking services

    OpenAIRE

    Lehtinen, Ilari

    2013-01-01

    Objectives of the study This research paper sets out to examine interest-based social networking services and the underlying business models that provide the logic for value creation, delivery, and capture. The objective of this paper is to uncover the common characteristics of interest-based social networking services' business models in order to understand the necessary building blocks that need to be present for a new service to function properly. Furthermore, it aims at giving manager...

  10. Discrete Discriminant analysis based on tree-structured graphical models

    DEFF Research Database (Denmark)

    Perez de la Cruz, Gonzalo; Eslava, Guillermina

    The purpose of this paper is to illustrate the potential use of discriminant analysis based on tree-structured graphical models for discrete variables. This is done by comparing its empirical performance using estimated error rates for real and simulated data. The results show that discriminant analysis based on tree-structured graphical models is a simple nonlinear method competitive with, and sometimes superior to, other well-known linear methods like those assuming mutual independence between variables and linear logistic regression.
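
    A sketch of the method's core on binary data, assuming the classic Chow-Liu construction (one maximum-mutual-information spanning tree per class) and classification by class-conditional tree likelihood; the synthetic data and estimator details are invented for the sketch, not taken from the paper.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree, breadth_first_order

def mutual_info(X, i, j):
    # Smoothed mutual information between binary columns i and j.
    mi = 0.0
    for a in (0, 1):
        for b in (0, 1):
            p_ab = ((X[:, i] == a) & (X[:, j] == b)).mean() + 1e-9
            p_a = (X[:, i] == a).mean() + 1e-9
            p_b = (X[:, j] == b).mean() + 1e-9
            mi += p_ab * np.log(p_ab / (p_a * p_b))
    return mi

def fit_tree(X):
    # Chow-Liu: maximum-MI spanning tree, rooted at variable 0.
    d = X.shape[1]
    W = np.zeros((d, d))
    for i in range(d):
        for j in range(i + 1, d):
            W[i, j] = mutual_info(X, i, j)
    mst = minimum_spanning_tree(-(W + 1e-12))
    order, parents = breadth_first_order(mst, 0, directed=False,
                                         return_predecessors=True)
    cpt = {}
    for i in order:
        p = parents[i]
        if p < 0:
            cpt[i] = (None, (X[:, i] == 1).mean())     # root marginal
        else:
            probs = []
            for b in (0, 1):
                rows = X[X[:, p] == b]
                probs.append((rows[:, i] == 1).mean() if len(rows) else 0.5)
            cpt[i] = (p, probs)                        # P(x_i = 1 | x_parent)
    return cpt

def loglik(cpt, x):
    ll = 0.0
    for i, (p, probs) in cpt.items():
        q = probs if p is None else probs[int(x[p])]
        q = min(max(q, 1e-6), 1 - 1e-6)
        ll += np.log(q if x[i] == 1 else 1 - q)
    return ll

rng = np.random.default_rng(0)
def sample(n, flip):
    # Five binary features copied from a hidden bit, each flipped w.p. `flip`.
    z = np.repeat(rng.integers(0, 2, size=(n, 1)), 5, axis=1)
    return np.where(rng.random((n, 5)) < flip, 1 - z, z)

X0, X1 = sample(300, 0.1), sample(300, 0.4)            # strong vs weak coupling
trees = [fit_tree(X0), fit_tree(X1)]
test = sample(200, 0.1)                                # drawn from class 0
pred = [int(loglik(trees[1], x) > loglik(trees[0], x)) for x in test]
print("fraction assigned to class 0:", 1 - np.mean(pred))
```

    Here the two synthetic classes share identical feature marginals and differ only in their dependence structure, which is precisely the situation where a tree-structured model can separate the classes while a mutual-independence model cannot.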

  11. TRAFFIC FLOW MODEL BASED ON CELLULAR AUTOMATION WITH ADAPTIVE DECELERATION

    OpenAIRE

    Shinkarev, A. A.

    2016-01-01

    This paper describes a continuation of the authors' work in the field of traffic flow mathematical models based on cellular automata theory. The refactored representation of the multifactorial traffic flow model based on cellular automata theory is used to represent the implementation of an adaptive deceleration step. The adaptive deceleration step allows a vehicle, in the case of a decelerating leader, to slow down smoothly rather than instantly. Concepts of the number of time steps without confli...
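
    A toy Nagel-Schreckenberg-style variant illustrating a smooth deceleration rule (braking by at most two cells per step) rather than instantaneous gap-matching, with a hard safety cap on the actual displacement to prevent collisions. This rule set is invented for illustration and is not the authors' model.

```python
import random

random.seed(1)
L, n_cars, v_max, p_slow, steps = 100, 20, 5, 0.2, 200
pos = sorted(random.sample(range(L), n_cars))   # ring road of L cells
vel = [0] * n_cars

for _ in range(steps):
    gaps = [(pos[(i + 1) % n_cars] - pos[i] - 1) % L for i in range(n_cars)]
    for i in range(n_cars):
        v = min(vel[i] + 1, v_max)              # acceleration toward v_max
        if v > gaps[i]:
            v = max(v - 2, 0)                   # adaptive braking: <= 2 cells/step
        if v > 0 and random.random() < p_slow:
            v -= 1                              # random slowdown
        vel[i] = v
    # Hard safety cap: never move further than the current gap.
    pos = [(pos[i] + min(vel[i], gaps[i])) % L for i in range(n_cars)]

print("mean speed:", sum(vel) / n_cars)
```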

  12. Petri Net-Based Workflow Access Control Model

    Institute of Scientific and Technical Information of China (English)

    陈卓; 骆婷; 石磊; 洪帆

    2004-01-01

    Access control is an important protection mechanism for information systems. This paper shows how to implement access control in workflow systems. We give a workflow access control model (WACM) based on several current access control models. The model supports role assignment and dynamic authorization. The paper defines the workflow using Petri nets: it first gives the definition and description of the workflow, and then analyzes the architecture of the workflow access control model (WACM). Finally, an example of an e-commerce workflow access control model is discussed in detail.

  13. A General Polygon-based Deformable Model for Object Recognition

    DEFF Research Database (Denmark)

    Jensen, Rune Fisker; Carstensen, Jens Michael

    1999-01-01

    We propose a general scheme for object localization and recognition based on a deformable model. The model combines shape and image properties by warping an arbitrary prototype intensity template according to the deformation in shape. The shape deformations are constrained by a probabilistic distribution, which, combined with a match of the warped intensity template and the image, forms the final criterion used for localization and recognition of a given object. The chosen representation gives the model the ability to model an almost arbitrary object. Beside the actual model, a full general scheme for...

  14. A Prototype-Based Resonance Model of Rhythm Categorization

    Directory of Open Access Journals (Sweden)

    Rasmus Bååth

    2014-10-01

    Full Text Available Categorization of rhythmic patterns is prevalent in musical practice, an example of this being the transcription of (possibly not strictly metrical) music into musical notation. In this article we implement a dynamical systems model of rhythm categorization based on the resonance theory of rhythm perception developed by Large (2010). This model is used to simulate the categorical choices of participants in two experiments of Desain and Honing (2003). The model accurately replicates the experimental data. Our results support resonance theory as a viable model of rhythm perception and show that by viewing rhythm perception as a dynamical system it is possible to model central properties of rhythm categorization.

  15. Intelligent Cost Modeling Based on Soft Computing for Avionics Systems

    Institute of Scientific and Technical Information of China (English)

    ZHU Li-li; LI Zhuang-sheng; XU Zong-ze

    2006-01-01

    In parametric cost estimating, objections to using statistical Cost Estimating Relationships (CERs) and parametric models include problems of low statistical significance due to limited data points, biases in the underlying data, and lack of robustness. Soft Computing (SC) technologies are used for building intelligent cost models. The SC models are systematically evaluated based on their training and prediction of the historical cost data of airborne avionics systems. Results indicating the strengths and weaknesses of each model are presented. In general, the intelligent cost models have higher prediction precision, better data adaptability, and stronger self-learning capability than the regression CERs.

  16. Model-Based Reconstructive Elasticity Imaging Using Ultrasound

    Directory of Open Access Journals (Sweden)

    Salavat R. Aglyamov

    2007-01-01

    Full Text Available Elasticity imaging is a reconstructive imaging technique where tissue motion in response to mechanical excitation is measured using modern imaging systems, and the estimated displacements are then used to reconstruct the spatial distribution of Young's modulus. Here we present an ultrasound elasticity imaging method that utilizes the model-based technique for Young's modulus reconstruction. Based on the geometry of the imaged object, only one axial component of the strain tensor is used. The numerical implementation of the method is highly efficient because the reconstruction is based on an analytic solution of the forward elastic problem. The model-based approach is illustrated using two potential clinical applications: differentiation of liver hemangioma and staging of deep venous thrombosis. Overall, these studies demonstrate that model-based reconstructive elasticity imaging can be used in applications where the geometry of the object and the surrounding tissue is somewhat known and certain assumptions about the pathology can be made.

  17. Electromagnetic Model and Image Reconstruction Algorithms Based on EIT System

    Institute of Scientific and Technical Information of China (English)

    CAO Zhang; WANG Huaxiang

    2006-01-01

    An intuitive 2D model of a circular electrical impedance tomography (EIT) sensor with small electrodes is established based on the theory of analytic functions. The validity of the model is proved using the result from the solution of the Laplace equation. Based on the model, which takes electrode distance into account and can be generalized to any simply connected region through a conformal transformation, suggestions for electrode optimization and an explanation of the ill-conditioning of the sensitivity matrix are provided. Image reconstruction algorithms based on the model are implemented to show its feasibility, using experimental data collected from the EIT system developed at Tianjin University. In a simulation with a human-chest-like configuration, electrical conductivity distributions are reconstructed using equi-potential backprojection (EBP) and Tikhonov regularization (TR) based on a conformal transformation of the model. The algorithms based on the model are suitable for online image reconstruction, and the reconstructed results are good in both size and position.

  18. The impact of design-based modeling instruction on seventh graders' spatial abilities and model-based argumentation

    Science.gov (United States)

    McConnell, William J.

    Due to the call of current science education reform for the integration of engineering practices within science classrooms, design-based instruction is receiving much attention in science education literature. Although some aspect of modeling is often included in well-known design-based instructional methods, it is not always a primary focus. The purpose of this study was to better understand how design-based instruction with an emphasis on scientific modeling might impact students' spatial abilities and their model-based argumentation abilities. In the following mixed-method multiple case study, seven seventh grade students attending a secular private school in the Mid-Atlantic region of the United States underwent an instructional intervention involving design-based instruction, modeling and argumentation. Through the course of a lesson involving students in exploring the interrelatedness of the environment and an animal's form and function, students created and used multiple forms of expressed models to assist them in model-based scientific argument. Pre/post data were collected through the use of The Purdue Spatial Visualization Test: Rotation, the Mental Rotation Test and interviews. Other data included a spatial activities survey, student artifacts in the form of models, notes, exit tickets, and video recordings of students throughout the intervention. Spatial abilities tests were analyzed using descriptive statistics while students' arguments were analyzed using the Instrument for the Analysis of Scientific Curricular Arguments and a behavior protocol. Models were analyzed using content analysis and interviews and all other data were coded and analyzed for emergent themes. Findings in the area of spatial abilities included increases in spatial reasoning for six out of seven participants, and an immense difference in the spatial challenges encountered by students when using CAD software instead of paper drawings to create models. Students perceived 3D printed

  19. Propositions for a PDF model based on fluid particle acceleration

    International Nuclear Information System (INIS)

    This paper describes theoretical propositions to model the acceleration of a fluid particle in a turbulent flow. Such a model is useful for the PDF approach to turbulent reactive flows as well as for the Lagrangian modelling of two-phase flows. The model developed here draws from ideas already put forward by Sawford, but which are generalized to the case of non-homogeneous flows. The model is built so as to revert continuously to Pope's model, which uses a Langevin equation for particle velocities, when the Reynolds number becomes very high. The derivation is based on the technique of fast variable elimination. This technique allows a careful analysis of the relations between different levels of modelling. It also allows certain problems to be addressed in a more rigorous way. In particular, application of this technique shows that models presently used can in principle simulate bubbly flows including the pressure-gradient and added-mass forces. (author)

  20. Model-based uncertainty in species range prediction

    DEFF Research Database (Denmark)

    Pearson, R. G.; Thuiller, Wilfried; Bastos Araujo, Miguel;

    2006-01-01

    Aim Many attempts to predict the potential range of species rely on environmental niche (or 'bioclimate envelope') modelling, yet the effects of using different niche-based methodologies require further investigation. Here we investigate the impact that the choice of model can have on predictions, identify key reasons why model output may differ and discuss the implications that model uncertainty has for policy-guiding applications. Location The Western Cape of South Africa. Methods We applied nine of the most widely used modelling techniques to model potential distributions under current and predicted future climate for four species (including two subspecies) of Proteaceae. Each model was built using an identical set of five input variables and distribution data for 3996 sampled sites. We compare model predictions by testing agreement between observed and simulated distributions for the present...

  1. Map-Based Channel Model for Urban Macrocell Propagation Scenarios

    Directory of Open Access Journals (Sweden)

    Jose F. Monserrat

    2015-01-01

    Full Text Available The evolution of LTE towards 5G has started and different research projects and institutions are in the process of verifying new technology components through simulations. Coordination between groups is strongly recommended and, in this sense, a common definition of test cases and simulation models is needed. The scope of this paper is to present a realistic channel model for urban macrocell scenarios. This model is map-based and takes into account the layout of buildings situated in the area under study. A detailed description of the model is given together with a comparison with other widely used channel models. The benchmark includes a measurement campaign in which the proposed model is shown to be much closer to the actual behavior of a cellular system. Particular attention is given to the outdoor component of the model, since it is here where the proposed approach is showing main difference with other previous models.

  2. MDA based-approach for UML Models Complete Comparison

    CERN Document Server

    Chaouni, Samia Benabdellah; Mouline, Salma

    2011-01-01

    If a modeling task is distributed, it will frequently be necessary to integrate models developed by different team members. Problems occur in the model integration step and particularly in the comparison phase of the integration. This issue has been discussed in several domains and for various models. However, previous approaches have not correctly handled the semantic comparison. In the current paper, we provide an MDA-based approach for model comparison which aims at comparing UML models. We develop a hybrid approach which takes into account syntactic, semantic and structural comparison aspects. For this purpose, we use the domain ontology as well as other resources such as dictionaries. We propose a decision support system which permits the user to validate (or not) the correspondences extracted in the comparison phase. For implementation, we propose an extension of the generic correspondence metamodel AMW in order to transform UML models to the correspondence model.

  3. MDA based-approach for UML Models Complete Comparison

    Directory of Open Access Journals (Sweden)

    Samia Benabdellah Chaouni

    2011-03-01

    Full Text Available If a modeling task is distributed, it will frequently be necessary to integrate models developed by different team members. Problems occur in the model integration step and particularly in the comparison phase of the integration. This issue has been discussed in several domains and for various models. However, previous approaches have not correctly handled the semantic comparison. In the current paper, we provide an MDA-based approach for model comparison which aims at comparing UML models. We develop a hybrid approach which takes into account syntactic, semantic and structural comparison aspects. For this purpose, we use the domain ontology as well as other resources such as dictionaries. We propose a decision support system which permits the user to validate (or not) the correspondences extracted in the comparison phase. For implementation, we propose an extension of the generic correspondence metamodel AMW in order to transform UML models to the correspondence model.

  4. Evaluation of Artificial Intelligence Based Models for Chemical Biodegradability Prediction

    Directory of Open Access Journals (Sweden)

    Aleksandar Sabljic

    2004-12-01

    Full Text Available This study presents a review of biodegradability modeling efforts, including a detailed assessment of two models developed using an artificial intelligence based methodology. Validation results for these models using an independent, quality-reviewed database demonstrate that the models perform well when compared, against the same data, to another commonly used biodegradability model. The ability of models induced by an artificial intelligence methodology to accommodate complex interactions in detailed systems, and the demonstrated reliability of the approach evaluated by this study, indicate that the methodology may have application in broadening the scope of biodegradability models. Given adequate data for the biodegradability of chemicals under environmental conditions, this may allow for the development of future models that include such things as surface interface impacts on biodegradability, for example.

  5. Model Checking-Based Testing of Web Applications

    Institute of Scientific and Technical Information of China (English)

    ZENG Hongwei; MIAO Huaikou

    2007-01-01

    A formal model representing the navigation behavior of a Web application as a Kripke structure is proposed, and an approach that applies model checking to test case generation is presented. The Object Relation Diagram, as the object model, is employed to describe the object structure of a Web application design and can be translated into the behavior model. A key problem of model checking-based test generation for a Web application is how to construct a set of trap properties intended to cause violations of model checking against the behavior model, yielding counterexamples that are used to construct the test sequences. We give an algorithm that derives trap properties from the object model with respect to node and edge coverage criteria.
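
    For illustration, a trap property for covering a node n of the navigation model can be phrased in temporal logic as below; the atomic proposition visited(n) is a hypothetical name, not taken from the paper. The model checker's counterexample to this (deliberately false) property is an execution that reaches n, which is then turned into a test sequence.

```latex
% Trap property asserting node n is never reached; its counterexample
% is a navigation path covering n.
\varphi_{\mathrm{trap}}(n) \;=\; \mathbf{AG}\, \neg\, \mathit{visited}(n)
```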

  6. A DSM-based framework for integrated function modelling

    DEFF Research Database (Denmark)

    Eisenbart, Boris; Gericke, Kilian; Blessing, Lucienne T. M.;

    2016-01-01

    Function modelling is proposed in the literature from different disciplines, in interdisciplinary approaches, and used in practice with the intention of facilitating system conceptualisation. However, function models across disciplines are largely diverse, addressing different function modelling perspectives and using different structures and forms for representing the contained information. This hampers the exchange of information between the models and poses particular challenges to joint modelling and shared comprehension between designers from different disciplines. This article proposes an integrated function modelling framework, which specifically aims at relating between the different function modelling perspectives prominently addressed in different disciplines. It uses interlinked matrices based on the concept of DSM and MDM in order to facilitate cross-disciplinary modelling and analysis...

  7. Building information modeling based on intelligent parametric technology

    Institute of Scientific and Technical Information of China (English)

    ZENG Xudong; TAN Jie

    2007-01-01

    In order to push forward the informatization process of the building industry, promote sustainable architectural design and enhance the competitiveness of China's building industry, the author studies building information modeling (BIM) based on intelligent parametric modeling technology. Building information modeling is a new technology in the field of computer aided architectural design, which contains not only geometric data, but also a great amount of engineering data throughout the lifecycle of a building. The author also compares BIM technology with two-dimensional CAD technology, and demonstrates the advantages and characteristics of intelligent parametric modeling technology. Building information modeling, which is based on intelligent parametric modeling technology, will certainly replace traditional computer aided architectural design and become the new driving force pushing forward China's building industry in this information age.

  8. Support vector machine-based multi-model predictive control

    Institute of Scientific and Technical Information of China (English)

    Zhejing BA; Youxian SUN

    2008-01-01

    In this paper, a support vector machine-based multi-model predictive control is proposed, in which SVM classification combines well with SVM regression. First, each working environment is modeled by SVM regression and the support vector machine network-based model predictive control (SVMN-MPC) algorithm corresponding to each environment is developed; then a multi-class SVM model is established to recognize multiple operating conditions. As for control, the current environment is identified by the multi-class SVM model and the corresponding SVMN-MPC controller is activated at each sampling instant. The proposed modeling, switching and controller design is demonstrated in simulation results.
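
    A sketch of the switching idea, assuming scikit-learn is available: a multi-class SVM recognizes the current operating regime and the matching SVM regression model is used as the one-step predictor. The full receding-horizon optimization of MPC is omitted, and the plant, features and regime boundary are synthetic inventions for the sketch; here the regime is made identifiable through a measured operating-condition variable.

```python
import numpy as np
from sklearn.svm import SVC, SVR

rng = np.random.default_rng(0)

def plant(y, u, w):
    # Two regimes with different one-step dynamics, selected by a measured
    # operating condition w (e.g., a load level).
    return 0.8 * y + 0.5 * u if w < 0 else 0.3 * y - 0.9 * u

X, Y = [], []
for _ in range(800):
    y, u, w = rng.uniform(-1, 1, 3)
    X.append([y, u, w])
    Y.append(plant(y, u, w))
X, Y = np.array(X), np.array(Y)
labels = (X[:, 2] >= 0).astype(int)

classifier = SVC().fit(X, labels)                   # regime recognition
models = [SVR().fit(X[labels == r], Y[labels == r]) for r in (0, 1)]

# At each sampling instant: identify the regime, then use its predictor.
state = np.array([[0.4, -0.2, 0.7]])                # current y, u, condition
regime = int(classifier.predict(state)[0])
print("regime:", regime,
      "predicted next output:", models[regime].predict(state)[0])
```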

  9. Improved Computational Model of Grid Cells Based on Column Structure

    Institute of Scientific and Technical Information of China (English)

    Yang Zhou; Dewei Wu; Weilong Li; Jia Du

    2016-01-01

    To simulate the firing pattern of biological grid cells, this paper presents an improved computational model of grid cells based on a column structure. In this model, the displacement along different directions is processed by a modulus operation, and the obtained remainder is associated with the firing rate of the grid cell. Compared with the original model, the improvements are that the base of the modulus operation is changed and that the firing rate in the firing field is encoded by a Gaussian-like function. Simulation validates that the firing pattern generated by the improved computational model is more consistent with biological characteristics than the original model. Besides, while the firing pattern is badly affected by cumulative positioning error, the computational model can still generate the regular hexagonal firing pattern when the real-time positioning results are corrected.
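
    A sketch of the modulus idea: project the 2D displacement onto three grid directions 60 degrees apart, wrap each projection with a modulo (the grid spacing), and map the remainders through a Gaussian-like bump to a firing rate. The spacing, width and direction choices are illustrative, not the paper's parameters.

```python
import numpy as np

spacing, sigma = 1.0, 0.12
angles = np.deg2rad([0, 60, 120])        # three preferred grid directions

def firing_rate(x, y):
    rate = 1.0
    for th in angles:
        proj = x * np.cos(th) + y * np.sin(th)
        rem = np.mod(proj, spacing)                 # remainder along direction
        d = np.minimum(rem, spacing - rem)          # distance to nearest field
        rate *= np.exp(-d**2 / (2 * sigma**2))      # Gaussian-like bump
    return rate

xs, ys = np.meshgrid(np.linspace(0, 3, 120), np.linspace(0, 3, 120))
rates = firing_rate(xs, ys)
print("peak rate:", rates.max(), "mean rate:", rates.mean())
```

    Because the three directions are 60 degrees apart and the third projection is a combination of the first two, the joint peaks fall on a triangular lattice, i.e. the hexagonal grid pattern.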

  10. Tutorial on agent-based modeling and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Macal, C. M.; North, M. J.; Decision and Information Sciences

    2005-01-01

    Agent-based modeling and simulation (ABMS) is a new approach to modeling systems comprised of autonomous, interacting agents. ABMS promises to have far-reaching effects on the way that businesses use computers to support decision-making and researchers use electronic laboratories to support their research. Some have gone so far as to contend that ABMS is a third way of doing science besides deductive and inductive reasoning. Computational advances have made possible a growing number of agent-based applications in a variety of fields. Applications range from modeling agent behavior in the stock market and supply chains, to predicting the spread of epidemics and the threat of bio-warfare, from modeling consumer behavior to understanding the fall of ancient civilizations, to name a few. This tutorial describes the theoretical and practical foundations of ABMS, identifies toolkits and methods for developing ABMS models, and provides some thoughts on the relationship between ABMS and traditional modeling techniques.

  11. Automated Decomposition of Model-based Learning Problems

    Science.gov (United States)

    Williams, Brian C.; Millar, Bill

    1996-01-01

    A new generation of sensor rich, massively distributed autonomous systems is being developed that has the potential for unprecedented performance, such as smart buildings, reconfigurable factories, adaptive traffic systems and remote earth ecosystem monitoring. To achieve high performance these massive systems will need to accurately model themselves and their environment from sensor information. Accomplishing this on a grand scale requires automating the art of large-scale modeling. This paper presents a formalization of decompositional model-based learning (DML), a method developed by observing a modeler's expertise at decomposing large scale model estimation tasks. The method exploits a striking analogy between learning and consistency-based diagnosis. Moriarty, an implementation of DML, has been applied to thermal modeling of a smart building, demonstrating a significant improvement in learning rate.

  12. Event-Based Corpuscular Model for Quantum Optics Experiments

    NARCIS (Netherlands)

    Michielsen, K.; Jin, F.; Raedt, H. De

    2011-01-01

    A corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one is presented. The event-based corpuscular model is shown to give a u

  13. ATTEND: Toward a Mindfulness-Based Bereavement Care Model

    Science.gov (United States)

    Cacciatore, Joanne; Flint, Melissa

    2012-01-01

    Few, if any, mindfulness-based bereavement care models exist. The ATTEND (attunement, trust, touch, egalitarianism, nuance, and death education) model is an interdisciplinary paradigm for providers, including physicians, social workers, therapists, nursing staff, and others. Using a case example to enhance the breadth and depth of understanding,…

  14. Model-Based GUI Testing Using Uppaal at Novo Nordisk

    DEFF Research Database (Denmark)

    H. Hjort, Ulrik; Rasmussen, Jacob Illum; Larsen, Kim Guldstrand;

    2009-01-01

    This paper details a collaboration between Aalborg University and Novo Nordisk in developing an automatic model-based test generation tool for system testing of the graphical user interface of a medical device on an embedded platform. The tool takes as input a UML state machine model and generates

  15. A Multinomial Model of Event-Based Prospective Memory

    Science.gov (United States)

    Smith, Rebekah E.; Bayen, Ute J.

    2004-01-01

    Prospective memory is remembering to perform an action in the future. The authors introduce the 1st formal model of event-based prospective memory, namely, a multinomial model that includes 2 separate parameters related to prospective memory processes. The 1st measures preparatory attentional processes, and the 2nd measures retrospective memory…

  16. Mental Models about Seismic Effects: Students' Profile Based Comparative Analysis

    Science.gov (United States)

    Moutinho, Sara; Moura, Rui; Vasconcelos, Clara

    2016-01-01

    Nowadays, meaningful learning takes a central role in science education and is based in mental models that allow the representation of the real world by individuals. Thus, it is essential to analyse the student's mental models by promoting an easier reconstruction of scientific knowledge, by allowing them to become consistent with the curricular…

  17. Model based decision support for planning of road maintenance

    NARCIS (Netherlands)

    Worm, J.M.; Harten, van A.

    1996-01-01

    In this article we describe a Decision Support Model, based on Operational Research methods, for the multi-period planning of maintenance of bituminous pavements. This model is a tool for the road manager to assist in generating an optimal maintenance plan for a road. Optimal means: minimising the N

  18. On infrastructure network design with agent-based modelling

    NARCIS (Netherlands)

    Chappin, E.J.L.; Heijnen, P.W.

    2014-01-01

    We have developed an agent-based model to optimize green-field network design in an industrial area. We aim to capture some of the deep uncertainties surrounding infrastructure design by modelling it and developing specific ant colony optimizations. Hence, we propose a variety of extensions to our exist
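
    A minimal ant-colony sketch of the general mechanism (not the authors' formulation): ants sample paths from a source to a sink with pheromone-weighted choices and reinforce short ones. The graph, weights and update constants are invented for illustration.

```python
import random

random.seed(0)
# Small weighted graph as an adjacency dict: node -> {neighbor: length}.
G = {0: {1: 2, 2: 2}, 1: {0: 2, 3: 3, 2: 1}, 2: {0: 2, 1: 1, 3: 4},
     3: {1: 3, 2: 4, 4: 1}, 4: {3: 1}}
src, dst = 0, 4
tau = {(u, v): 1.0 for u in G for v in G[u]}       # pheromone per edge
best, best_len = None, float("inf")

for _round in range(200):
    for _ant in range(10):
        node, path, length, seen = src, [src], 0.0, {src}
        while node != dst:
            options = [v for v in G[node] if v not in seen]
            if not options:
                length = float("inf")              # dead end: discard ant
                break
            weights = [tau[(node, v)] / G[node][v] for v in options]
            node = random.choices(options, weights)[0]
            length += G[path[-1]][node]
            path.append(node)
            seen.add(node)
        if length < best_len:
            best, best_len = path, length
        if length < float("inf"):
            for u, v in zip(path, path[1:]):       # deposit pheromone
                tau[(u, v)] += 1.0 / length
    for e in tau:                                  # evaporation
        tau[e] *= 0.9

print("best path:", best, "length:", best_len)
```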

  19. A Memory-Based Model of Hick's Law

    Science.gov (United States)

    Schneider, Darryl W.; Anderson, John R.

    2011-01-01

    We propose and evaluate a memory-based model of Hick's law, the approximately linear increase in choice reaction time with the logarithm of set size (the number of stimulus-response alternatives). According to the model, Hick's law reflects a combination of associative interference during retrieval from declarative memory and occasional savings…
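
    For reference, the standard form of Hick's law that the record describes, with intercept a and slope b fitted to data:

```latex
% Mean choice reaction time grows approximately linearly with the
% logarithm of the set size n (number of stimulus-response alternatives).
\mathrm{RT}(n) = a + b \log_2 n
```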

  20. Statistical virtual eye model based on wavefront aberration

    OpenAIRE

    Wang, Jie-Mei; Liu, Chun-Ling; Luo, Yi-Ning; Liu, Yi-Guang; Hu, Bing-Jie

    2012-01-01

    Wavefront aberration affects the quality of retinal image directly. This paper reviews the representation and reconstruction of wavefront aberration, as well as the construction of virtual eye model based on Zernike polynomial coefficients. In addition, the promising prospect of virtual eye model is emphasized.
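
    For reference, the standard Zernike expansion of the wavefront aberration over the unit pupil, the representation on which such a virtual eye model is built; c_i are the measured Zernike coefficients:

```latex
% Wavefront aberration W in polar pupil coordinates, expanded in
% Zernike polynomials Z_i with coefficients c_i.
W(\rho, \theta) = \sum_i c_i \, Z_i(\rho, \theta)
```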