Investigation of the Effect of Traffic Parameters on Road Hazard Using Classification Tree Model
Md. Mahmud Hasan
2012-09-01
This paper presents a method for the identification of hazardous situations on freeways. For this study, an approximately 18 km long section of the Eastern Freeway in Melbourne, Australia was selected as a test bed. Three categories of data, i.e. traffic, weather and accident records, were used for the analysis and modelling. A classification tree based model was developed for the crash risk probability. In formulating the models, it was found that weather conditions did not have a significant impact on accident occurrence, so the classification tree was built using two traffic indices only: traffic flow and vehicle speed. The formulated classification tree is able to identify possible hazard and non-hazard situations on the freeway. The outcome of the study will aid hazard mitigation strategies.
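As an illustration of the final model form, a minimal sketch using scikit-learn on synthetic data; the 1400 veh/h and 90 km/h split points in the labelling rule are invented for the example, not the paper's fitted thresholds:

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier, export_text

    rng = np.random.default_rng(0)
    flow = rng.uniform(200, 2000, 500)     # traffic flow, veh/h (synthetic)
    speed = rng.uniform(40, 110, 500)      # vehicle speed, km/h (synthetic)
    # Hypothetical labelling rule: dense, fast traffic is hazardous.
    hazard = ((flow > 1400) & (speed > 90)).astype(int)

    X = np.column_stack([flow, speed])
    tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, hazard)
    print(export_text(tree, feature_names=["flow", "speed"]))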
Bou Kheir, Rania; Greve, Mogens H; Bøcher, Peder K; Greve, Mette B; Larsen, René; McCloy, Keith
2010-05-01
Soil organic carbon (SOC) is one of the most important carbon stocks globally and has large potential to affect global climate. Distribution patterns of SOC in Denmark constitute a nation-wide baseline for studies on soil carbon changes (with respect to the Kyoto Protocol). This paper predicts and maps the geographic distribution of SOC across Denmark using remote sensing (RS), geographic information systems (GISs) and decision-tree modeling (un-pruned and pruned classification trees). Seventeen parameters, i.e. parent material, soil type, landscape type, elevation, slope gradient, slope aspect, mean curvature, plan curvature, profile curvature, flow accumulation, specific catchment area, tangent slope, tangent curvature, steady-state wetness index, Normalized Difference Vegetation Index (NDVI), Normalized Difference Wetness Index (NDWI) and Soil Color Index (SCI), were generated to statistically explain SOC field measurements in the area of interest (Denmark). A large number of tree-based classification models (588) were developed using (i) all of the parameters, (ii) all Digital Elevation Model (DEM) parameters only, (iii) the primary DEM parameters only, (iv) the remote sensing (RS) indices only, (v) selected pairs of parameters, (vi) soil type, parent material and landscape type only, and (vii) the parameters having a high impact on SOC distribution in built pruned trees. The three best classification tree models, i.e. those with the lowest misclassification error (ME) and the lowest number of nodes (N), are: (i) the tree (T1) combining all of the parameters (ME=29.5%; N=54); (ii) the tree (T2) based on the parent material, soil type and landscape type (ME=31.5%; N=14); and (iii) the tree (T3) constructed using parent material, soil type, landscape type, elevation, tangent slope and SCI (ME=30%; N=39). The SOC maps produced at 1:50,000 cartographic scale using these trees match closely, with coincidence values equal to 90.5% (Map T1...
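A minimal sketch of the un-pruned versus pruned comparison described above, using scikit-learn's cost-complexity pruning on synthetic stand-in data (the study's 17 soil and terrain predictors and its 588 fitted models are not reproduced; the feature count and the pruning-strength choice are illustrative assumptions):

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=1000, n_features=17, n_informative=8, random_state=1)
    Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=1)

    unpruned = DecisionTreeClassifier(random_state=1).fit(Xtr, ytr)
    path = unpruned.cost_complexity_pruning_path(Xtr, ytr)
    # Pick a strong pruning level from the computed alpha sequence (illustrative).
    pruned = DecisionTreeClassifier(random_state=1, ccp_alpha=path.ccp_alphas[-5]).fit(Xtr, ytr)

    for name, t in [("un-pruned", unpruned), ("pruned", pruned)]:
        me = 1 - t.score(Xte, yte)            # misclassification error (ME)
        print(f"{name}: ME={me:.1%}, nodes={t.tree_.node_count}")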
Ivana Đurđević Babić
2015-03-01
Student satisfaction with courses in academic institutions is an important issue and is recognized as a form of support in ensuring effective and quality education, as well as enhancing the student course experience. This paper investigates whether there is a connection between student satisfaction with courses and log data on student courses in a virtual learning environment. Furthermore, it explores whether a successful classification model for predicting student satisfaction with a course can be developed based on course log data, and compares the results obtained from the implemented methods. The research was conducted at the Faculty of Education in Osijek and included analysis of log data and course satisfaction on a sample of third- and fourth-year students. Multilayer Perceptron (MLP) neural networks with different activation functions, Radial Basis Function (RBF) neural networks, and classification tree models were developed, trained and tested in order to classify students into one of two categories of course satisfaction. Type I and type II errors, classification accuracy, and input variable importance were used for model comparison. The results indicate that a successful classification model can be created using the tested methods. The MLP model provides the highest average classification accuracy and the lowest tendency to misclassify students with a low level of course satisfaction, although a t-test for the difference in proportions showed that the difference in performance between the compared models is not statistically significant. Student involvement in forum discussions is recognized as a valuable predictor of student satisfaction with courses in all observed models.
Hung Cong
2008-01-01
Background: In Central Vietnam, forest malaria remains difficult to control due to the complex interactions between human, vector and environmental factors. Methods: Prior to a community-based intervention to assess the efficacy of long-lasting insecticidal hammocks, a complete census (18,646 individuals) and a baseline cross-sectional survey for determining malaria prevalence and related risk factors were carried out. Multivariate analysis using survey logistic regression was combined with a classification tree model (CART) to better define the relative importance of, and inter-relations between, the different risk factors. Results: The study population was mostly from the Ra-glai ethnic group (88%), with both low education and socio-economic status, and engaged mainly in forest activities (58%). The multivariate analysis confirmed forest activity, bed net use, ethnicity, age and education as risk factors for malaria infections, but could not handle multiple interactions. The CART analysis showed that the most important risk factor for malaria was the wealth category, the wealthiest group being much less infected (8.9%) than the lower and medium wealth categories (16.6%). In the former, forest activity and bed net use were the most determinant risk factors for malaria, while in the lower and medium wealth categories, insecticide-treated nets were most important, although the latter were less protective among Ra-glai people. Conclusion: The combination of CART and multivariate analysis constitutes a novel analytical approach, providing an accurate and dynamic picture of the main risk factors for malaria infection. Results show that the control of forest malaria remains an extremely complex task that has to address poverty-related risk factors such as education, ethnicity and housing conditions.
Krasteva, Vessela; Jekova, Irena; Leber, Remo; Schmid, Ramun; Abächerli, Roger
2015-01-01
This study presents a 2-stage heartbeat classifier of supraventricular (SVB) and ventricular (VB) beats. Stage 1 makes a computationally efficient classification of SVB beats, using a simple correlation threshold criterion for finding a close match with a predominant normal (reference) beat template. The non-matched beats are next subjected to measurement of 20 basic features, tracking the beat and reference template morphology and RR-variability, for subsequent refined classification into the SVB or VB class by Stage 2. Four linear classifiers are compared: cluster, fuzzy, linear discriminant analysis (LDA) and classification tree (CT), all subjected to iterative training for selection of the optimal feature space among an extended 210-sized set embodying interactive second-order effects between the 20 independent features. The optimization process minimizes at equal weight the false positives in the SVB class and the false negatives in the VB class. Training with the European ST-T, AHA, and MIT-BIH Supraventricular Arrhythmia databases found the best performance settings of all classification models: Cluster (30 features), Fuzzy (72 features), LDA (142 coefficients), CT (221 decision nodes), with the top-3 best-scored features being: normalized current RR-interval, higher/lower frequency content ratio, and beat-to-template correlation. Unbiased test-validation with the MIT-BIH Arrhythmia database rates the classifiers in descending order of their specificity for the SVB class: CT (99.9%), LDA (99.6%), Cluster (99.5%), Fuzzy (99.4%); sensitivity for ventricular ectopic beats as part of the VB class (commonly reported in published beat-classification studies): CT (96.7%), Fuzzy (94.4%), LDA (94.2%), Cluster (92.4%); positive predictivity: CT (99.2%), Cluster (93.6%), LDA (93.0%), Fuzzy (92.4%). CT has superior accuracy by 0.3–6.8 percentage points, with the advantage of easy model-complexity configuration by pruning a tree consisting of easily interpretable 'if-then' rules. PMID:26461492
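An illustrative sketch of the two-stage flow under stated assumptions (synthetic beats, a toy correlation threshold of 0.97, and toy Stage-2 features and labels; nothing here reproduces the paper's trained models):

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(2)
    template = np.sin(np.linspace(0, np.pi, 120))    # reference (normal) beat

    def make_beat(distorted):
        noise_sd = 0.45 if distorted else 0.05       # distorted beats are noisier
        return template + rng.normal(0, noise_sd, template.size)

    distorted = rng.integers(0, 2, 200).astype(bool)
    beats = np.array([make_beat(d) for d in distorted])

    # Stage 1: cheap correlation-threshold match against the template.
    corr = np.array([np.corrcoef(b, template)[0, 1] for b in beats])
    to_stage2 = corr < 0.97                          # only non-matched beats go on

    # Stage 2: refined classification of non-matched beats from simple features.
    feats = np.column_stack([corr[to_stage2], beats[to_stage2].std(axis=1)])
    ct = DecisionTreeClassifier(max_depth=3, random_state=2)
    ct.fit(feats, distorted[to_stage2])              # toy VB-vs-SVB stand-in labels
    print(f"{to_stage2.sum()} beats reached Stage 2; "
          f"tree accuracy on them: {ct.score(feats, distorted[to_stage2]):.2f}")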
Building classification trees to explain the radioactive contamination levels of the plants
Briand, B
2008-04-15
The objective of this thesis is the development of a method allowing the identification of factors leading to various radioactive contamination levels of plants. The methodology suggested is based on the use of a radioecological transfer model of the radionuclides through the environment (the A.S.T.R.A.L. computer code) and a classification-tree method. In particular, to avoid the instability problems of classification trees and to preserve the tree structure, a node-level stabilizing technique is used. Empirical comparisons are carried out between classification trees built by this method (called the R.E.N. method) and those obtained by the C.A.R.T. method. A similarity measure is defined to compare the structure of two classification trees. This measure is used to study the stabilizing performance of the R.E.N. method. The methodology suggested is applied to a simplified contamination scenario. From the results obtained, we can identify the main variables responsible for the various radioactive contamination levels of four leafy vegetables (lettuce, cabbage, spinach and leek). Some rules extracted from these classification trees could be usable in a post-accident context. (author)
Sensitivity of missing values in classification tree for large sample
Hasan, Norsida; Adam, Mohd Bakri; Mustapha, Norwati; Abu Bakar, Mohd Rizam
2012-05-01
Missing values either in predictor or in response variables are a very common problem in statistics and data mining. Cases with missing values are often ignored, which results in loss of information and possible bias. The objective of our research was to investigate the sensitivity of classification tree models to missing data in large samples. Data were obtained from one of the high-level educational institutions in Malaysia. Students' background data were randomly eliminated and a classification tree was used to predict students' degree classification. The results showed that for a large sample, the structure of the classification tree was sensitive to missing values, especially when the sample contained more than ten percent missing values.
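A sketch of this kind of sensitivity experiment under stated assumptions (synthetic data, mean imputation, and random elimination of values; the study's student records are not reproduced):

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.impute import SimpleImputer
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=5000, n_features=10, random_state=3)
    Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=3)
    rng = np.random.default_rng(3)

    for frac in (0.0, 0.05, 0.10, 0.20):
        Xm = Xtr.copy()
        mask = rng.random(Xm.shape) < frac          # randomly eliminate values
        Xm[mask] = np.nan
        model = make_pipeline(SimpleImputer(), DecisionTreeClassifier(random_state=3))
        acc = model.fit(Xm, ytr).score(Xte, yte)
        print(f"{frac:.0%} missing -> accuracy {acc:.3f}")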
Influence Measures for CART Classification Trees
Bar-Hen, Avner; Gey, Servane; Poggi, Jean-Michel
2015-01-01
This paper deals with measuring the influence of observations on the results obtained with CART classification trees. To define the influence of individuals on the analysis, we use influence functions to propose some general criteria to measure the sensitivity and robustness of the CART analysis. The proposals, based on jackknife trees, are organized around two lines: influence on predictions and influence on partitions. In addition, the analysis is extended to the pruned sequences of CART...
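A minimal jackknife-tree sketch consistent with the 'influence on predictions' line: refit with observation i removed and measure how far the tree's predictions move (synthetic data; the paper's exact influence functions are not reproduced):

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=200, n_features=5, random_state=4)
    full = DecisionTreeClassifier(random_state=4).fit(X, y).predict(X)

    influence = []
    for i in range(len(y)):
        keep = np.arange(len(y)) != i               # leave observation i out
        pred = DecisionTreeClassifier(random_state=4).fit(X[keep], y[keep]).predict(X)
        influence.append(np.mean(pred != full))     # share of changed predictions
    print("most influential observation:", int(np.argmax(influence)))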
Segmentation of Firms by Means of Classification Trees
Mirosława Lasek; Marek Pęczkowski
2002-01-01
The objective of the paper was to present the utility and applicability of the method of generating classification trees for the purposes of segmentation of firms by their economic standing, i.e. their financial and assets condition. The method of classification tree generation belongs to the group of 'data mining' methods that make it possible to discover, from large data sets, relationships and links among the data. The variables used to classify the firms were financial and assets indices, in...
Consensus of classification trees for skin sensitisation hazard prediction.
Asturiol, D; Casati, S; Worth, A
2016-10-01
Since March 2013, it has no longer been possible to market in the European Union (EU) cosmetics containing new ingredients tested on animals. Although several in silico alternatives are available and achievements have been made in the development and regulatory adoption of non-animal skin sensitisation tests, there is not yet a generally accepted approach for skin sensitisation assessment that would fully substitute for animal testing. The aim of this work was to build a defined approach (i.e. a predictive model based on readouts from various information sources that uses a fixed procedure for generating a prediction) for skin sensitisation hazard prediction (sensitiser/non-sensitiser) using Local Lymph Node Assay (LLNA) results as reference classifications. To derive the model, we built a dataset with high-quality data from in chemico (DPRA) and in vitro (KeratinoSens™ and h-CLAT) methods, complemented with predictions from several software packages. The modelling exercise showed that skin sensitisation hazard was better predicted by classification trees based on in silico predictions. The defined approach consists of a consensus of two classification trees that are based on descriptors accounting for protein reactivity and structural features. The model showed an accuracy of 0.93, sensitivity of 0.98, and specificity of 0.85 for 269 chemicals. In addition, the defined approach provides a measure of confidence associated with the prediction. PMID:27458072
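A hedged sketch of the consensus idea (synthetic descriptors split into a 'reactivity-like' and a 'structural' block, and an assumed 'flag if either tree flags' rule; the paper's actual trees and descriptors are not reproduced):

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.metrics import confusion_matrix
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=600, n_features=12, random_state=5)
    t1 = DecisionTreeClassifier(max_depth=4, random_state=5).fit(X[:, :6], y)   # reactivity-like block
    t2 = DecisionTreeClassifier(max_depth=4, random_state=5).fit(X[:, 6:], y)   # structural block

    # Assumed consensus rule: call a sensitiser if either tree does.
    consensus = t1.predict(X[:, :6]) | t2.predict(X[:, 6:])
    tn, fp, fn, tp = confusion_matrix(y, consensus).ravel()
    print(f"accuracy={(tp + tn) / len(y):.2f} sensitivity={tp / (tp + fn):.2f} "
          f"specificity={tn / (tn + fp):.2f}")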
Predictive Classification Trees
Dlugosz, Stephan; Müller-Funk, Ulrich
CART (Breiman et al., Classification and Regression Trees, Chapman and Hall, New York, 1984) and (exhaustive) CHAID (Kass, Appl Stat 29:119-127, 1980) figure prominently among the procedures actually used in data-based management, etc. CART is a well-established procedure that produces binary trees. CHAID, in contrast, admits multiple splits, a feature that allows the splitting variable to be exploited more extensively. On the other hand, that procedure depends on premises that are questionable in practical applications. This can be put down to the fact that CHAID relies on simultaneous chi-square and F-tests, respectively. The null distribution of the second test statistic, for instance, relies on a normality assumption that is not plausible in a data mining context. Moreover, none of these procedures, as implemented in SPSS, for instance, takes ordinal dependent variables into account. In the paper we suggest an alternative tree algorithm that: requires explanatory categorical variables...
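To make the CHAID-vs-CART contrast concrete, a one-step sketch that scores candidate categorical splits with the chi-squared test of independence that the text says CHAID relies on (synthetic data; this is not the full CHAID merge-and-split procedure):

    import numpy as np
    from scipy.stats import chi2_contingency

    rng = np.random.default_rng(6)
    y = rng.integers(0, 2, 1000)
    # One predictor genuinely associated with y, one pure noise.
    x_good = np.where(y == 1, rng.choice(list("AB"), 1000, p=[.8, .2]),
                      rng.choice(list("AB"), 1000, p=[.3, .7]))
    x_noise = rng.choice(list("ABC"), 1000)

    for name, x in [("informative", x_good), ("noise", x_noise)]:
        table = [[np.sum((x == c) & (y == k)) for k in (0, 1)] for c in np.unique(x)]
        chi2, p, _, _ = chi2_contingency(table)     # CHAID-style split criterion
        print(f"{name}: chi2={chi2:.1f}, p={p:.3g}")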
Predicting 'very poor' beach water quality gradings using classification tree.
Thoe, Wai; Choi, King Wah; Lee, Joseph Hun-wei
2016-02-01
A beach water quality prediction system has been developed in Hong Kong using multiple linear regression (MLR) models. However, linear models are found to be weak at capturing the infrequent 'very poor' water quality occasions when Escherichia coli (E. coli) concentration exceeds 610 counts/100 mL. This study uses a classification tree to increase the accuracy in predicting the 'very poor' water quality events at three Hong Kong beaches affected either by non-point source or point source pollution. Binary-output classification trees (to predict whether E. coli concentration exceeds 610 counts/100 mL) are developed over the periods before and after the implementation of the Harbour Area Treatment Scheme, when systematic changes in water quality were observed. Results show that classification trees can capture more 'very poor' events in both periods when compared to the corresponding linear models, with an increase in correct positives by an average of 20%. Classification trees are also developed at two beaches to predict the four-category Beach Water Quality Indices. They perform worse than the binary tree and give excessive false alarms of 'very poor' events. Finally, a combined modelling approach using both MLR model and classification tree is proposed to enhance the beach water quality prediction system for Hong Kong. PMID:26837834
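A minimal sketch of the binary-output formulation, assuming synthetic hydro-meteorological inputs and an invented E. coli generating rule; only the 610 counts/100 mL exceedance threshold is taken from the text:

    import numpy as np
    from sklearn.metrics import confusion_matrix
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(7)
    rainfall = rng.gamma(2.0, 5.0, 800)           # mm, synthetic
    salinity = rng.normal(30, 3, 800)             # psu, synthetic
    # Invented generating rule: wetter, fresher water drives E. coli up.
    ecoli = np.exp(3 + 0.2 * rainfall - 0.05 * salinity + rng.normal(0, 1, 800))
    y = (ecoli > 610).astype(int)                 # 'very poor' exceedance label

    X = np.column_stack([rainfall, salinity])
    ct = DecisionTreeClassifier(max_depth=3, class_weight="balanced").fit(X, y)
    tn, fp, fn, tp = confusion_matrix(y, ct.predict(X), labels=[0, 1]).ravel()
    print(f"correct positives: {tp}/{tp + fn}")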
Design of Radar Software Test Case Based on Classification Tree
职晓; 裴阿平; 张江华
2014-01-01
Owing to the ever-growing scale of software, it is less and less feasible in engineering practice to test every functional unit of modern radar software using common combinatorial coverage techniques. To address the deficiency that designing test cases with the classification tree method (CTM) generates a large number of redundant test cases, the orthogonal experimental design method is applied to the case set generated by CTM to simplify and optimize the tests, so as to improve testing efficiency. The experimental results show that test-case optimization based on orthogonal experimental design can effectively reduce redundant test cases and save test resources and cost, and has practical engineering value.
Ying Cai
2012-09-01
In previous attempts to identify aquatic vegetation from remotely-sensed images using classification trees (CT), the images used to apply CT models to different times or locations necessarily originated from the same satellite sensor as the original images used in model development, greatly limiting the application of CT. We have developed an effective normalization method to improve the robustness of CT models when applied to images originating from different sensors and dates. A total of 965 ground-truth samples of aquatic vegetation types were obtained in 2009 and 2010 in Taihu Lake, China. Using relevant spectral indices (SI) as classifiers, we manually developed a stable CT model structure and then applied a standard CT algorithm to obtain quantitative (optimal) thresholds from 2009 ground-truth data and images from Landsat7-ETM+, HJ-1B-CCD, Landsat5-TM and ALOS-AVNIR-2 sensors. Optimal CT thresholds produced average classification accuracies of 78.1%, 84.7% and 74.0% for emergent vegetation, floating-leaf vegetation and submerged vegetation, respectively. However, the optimal CT thresholds for different sensor images differed from each other, with an average relative variation (RV) of 6.40%. We developed and evaluated three new approaches to normalizing the images. The best-performing method (Method of 0.1% index scaling) normalized the SI images using tailored percentages of extreme pixel values. Using the images normalized by Method of 0.1% index scaling, CT models for a particular sensor in which thresholds were replaced by those from the models developed for images originating from other sensors provided average classification accuracies of 76.0%, 82.8% and 68.9% for emergent vegetation, floating-leaf vegetation and submerged vegetation, respectively. Applying the CT models developed for normalized 2009 images to 2010 images resulted in high classification (78.0%-93.3%) and overall (92.0%-93.1%) accuracies. Our...
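The best-performing normalization is described as scaling index images by tailored percentages of extreme pixel values; one plausible reading, sketched here, is a 0.1%/99.9% percentile stretch to [0, 1] (the exact formula used in the paper is an assumption):

    import numpy as np

    def normalize_index(img, tail=0.1):
        """Stretch an index image to [0, 1] between its tail/100-tail percentiles."""
        lo, hi = np.nanpercentile(img, [tail, 100 - tail])
        return np.clip((img - lo) / (hi - lo), 0.0, 1.0)

    # Example: an NDVI-like image from one sensor, with a few extreme pixels.
    rng = np.random.default_rng(8)
    ndvi = rng.normal(0.3, 0.2, (100, 100))
    ndvi[0, :3] = [-5, 5, 9]                      # outliers neutralized by the stretch
    print(normalize_index(ndvi).min(), normalize_index(ndvi).max())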
Plass, Julia; Fink, Paul; Schöning, Norbert; Augustin, Thomas
2015-01-01
In surveys, and most notably in election polls, undecided participants frequently constitute subgroups of their own with specific individual characteristics. While traditional survey methods and corresponding statistical models are inherently damned to neglect this valuable information, an ontic random set view provides us with the full power of the whole statistical modelling framework. We elaborate this idea for a multinomial logistic regression model (which can be derived as a discrete cho...
Prasetyo Utomo, Chandra
2011-06-01
Permeability is an important parameter connected with oil reservoirs. Predicting the permeability could save millions of dollars. Unfortunately, petroleum engineers have faced numerous challenges arriving at cost-efficient predictions. Much work has been carried out to solve this problem. The main challenge is to handle the high range of permeability in each reservoir. For about a hundred years, mathematicians and engineers have tried to deliver the best prediction models. However, none of them have produced satisfying results. In the last two decades, artificial intelligence models have been used. The current best model for permeability prediction is the extreme learning machine (ELM). It produces fairly good results but a clear explanation of the model is hard to come by because it is so complex. The aim of this research is to propose a way out of this complexity through the design of a hybrid intelligent model. In this proposal, the system combines classification and regression models to predict the permeability value, based on well log data. In order to handle the high range of permeability values, a classification tree is utilized. A benefit of this innovation is that the tree represents knowledge in a clear and succinct fashion and thereby avoids the complexity of all previous models. Finally, it is important to note that the ELM is used as the final predictor. Results demonstrate that this proposed hybrid model performs better when compared with support vector machines (SVM) and ELM in terms of correlation coefficient. Moreover, the classification tree model potentially leads to better communication among petroleum engineers concerning this important process and has wider implications for oil reservoir management efficiency.
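A sketch of the hybrid idea under stated assumptions: a classification tree first bins a synthetic log-permeability range, and a tiny random-projection ELM (random hidden layer, least-squares output weights) serves as the final regressor; nothing here reproduces the paper's well-log features or trained networks:

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(9)
    X = rng.normal(size=(500, 6))                            # synthetic well-log features
    logk = X @ rng.normal(size=6) + rng.normal(0, .3, 500)   # synthetic log-permeability
    bins = np.digitize(logk, [-1.0, 1.0])                    # low / medium / high class

    ct = DecisionTreeClassifier(max_depth=4, random_state=9).fit(X, bins)

    # Minimal ELM: fixed random hidden layer, output weights by least squares.
    W, b = rng.normal(size=(6, 50)), rng.normal(size=50)
    H = np.tanh(X @ W + b)
    beta, *_ = np.linalg.lstsq(H, logk, rcond=None)
    pred = H @ beta
    print("tree class counts:", np.bincount(ct.predict(X)),
          "| ELM corr:", round(np.corrcoef(pred, logk)[0, 1], 3))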
Chandra Prasetyo Utomo
2013-01-01
Permeability is an important parameter connected with oil reservoirs. In the last two decades, artificial intelligence models have been used. The current best model for permeability prediction is the extreme learning machine (ELM). It produces fairly good results but a clear explanation of the model is hard to come by because it is so complex. The aim of this research is to propose a way out of this complexity through the design of a hybrid intelligent model. The model combines classification and regression. In order to handle the high range of permeability values, a classification tree is utilized. ELM is used as the final predictor. Results demonstrate that this proposed model performs better when compared with support vector machines (SVM) and ELM in terms of correlation coefficient. Moreover, the classification tree model potentially leads to better communication among petroleum engineers and has wider implications for oil reservoir management efficiency.
Predicting battle outcomes with classification trees
Coban, Muzaffer.
2001-01-01
Historical combat data analysis is a way of understanding the factors affecting battle outcomes. Current studies mostly prefer simulations that are based on mathematical abstractions of battles. However, these abstractions emphasize objective variables, such as force ratio. Such models have very limited ability to capture important intangible factors like morale, leadership, and luck. Historical combat analysis provides a way to understand battles with the data taken from the actual battlefield...
This paper presents a computer-aided diagnosis technique for improving the accuracy of early diagnosis of Alzheimer-type dementia. The proposed methodology is based on the selection of voxels whose Welch's t-test statistic between the two classes (normal and Alzheimer images) is greater than a given threshold. The mean and standard deviation of intensity values are calculated for the selected voxels and chosen as feature vectors for two different classifiers: support vector machines with linear kernel and classification trees. The proposed methodology reaches greater than 95% accuracy in the classification task.
NA Xiaodong; ZHANG Shuqing; ZHANG Huaiqing; LI Xiaofeng; YU Huan; LIU Chunyue
2009-01-01
The main objective of this research is to determine the capacity of land cover classification combining spectral and textural features of Landsat TM imagery with ancillary geographical data in wetlands of the Sanjiang Plain, Heilongjiang Province, China. Semi-variograms and Z-test values were calculated to assess the separability of grey-level co-occurrence texture measures so as to maximize the difference between land cover types. The degree of spatial autocorrelation showed that window sizes of 3×3 and 11×11 pixels were most appropriate for Landsat TM image texture calculations. The texture analysis showed that the co-occurrence entropy, dissimilarity and variance texture measures, derived from the Landsat TM spectral bands and vegetation indices, provided the most significant statistical differentiation between land cover types. Subsequently, a Classification and Regression Tree (CART) algorithm was applied to three different combinations of predictors: 1) TM imagery alone (TM-only); 2) TM imagery plus image texture (TM+TXT model); and 3) all predictors including TM imagery, image texture and additional ancillary GIS information (TM+TXT+GIS model). Compared with traditional Maximum Likelihood Classification (MLC) supervised classification, the three classification tree predictive models reduced the overall error rate significantly. Image texture measures and ancillary geographical variables suppressed speckle noise effectively and noticeably reduced the classification error rate for marsh. For the classification tree model making use of all available predictors, the omission error rate was 12.90% and the commission error rate was 10.99% for marsh. The developed method is portable, relatively easy to implement and should be applicable in other settings and over larger extents.
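For the texture step, a sketch computing co-occurrence measures over one image window with scikit-image (recent versions spell the functions graycomatrix/graycoprops); the band, window size and grey-level count are illustrative assumptions:

    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    rng = np.random.default_rng(10)
    band = rng.integers(0, 64, (11, 11), dtype=np.uint8)   # 11x11 window, 64 grey levels

    glcm = graycomatrix(band, distances=[1], angles=[0], levels=64,
                        symmetric=True, normed=True)
    dissimilarity = graycoprops(glcm, "dissimilarity")[0, 0]
    variance = band.var()
    entropy = -np.sum(glcm * np.log2(glcm + 1e-12))        # co-occurrence entropy
    print(f"dissimilarity={dissimilarity:.2f} variance={variance:.1f} entropy={entropy:.2f}")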
A critical choice facing breast cancer patients is which surgical treatment – mastectomy or breast conserving surgery (BCS) – is most appropriate. Several studies have investigated factors that impact the type of surgery chosen, identifying features such as place of residence, age at diagnosis, tumor size, and socio-economic and racial/ethnic elements as relevant. Such assessment of 'propensity' is important in understanding issues such as the reported under-utilisation of BCS among women for whom such treatment was not contraindicated. Using Western Australian (WA) data, we further examine the factors associated with the type of surgical treatment for breast cancer using a classification tree approach. This approach deals naturally with complicated interactions between factors, and so allows flexible and interpretable models for treatment choice to be built that add to the current understanding of this complex decision process. Data was extracted from the WA Cancer Registry on women diagnosed with breast cancer in WA from 1990 to 2000. Subjects' treatment preferences were predicted from covariates using both classification trees and logistic regression. Tumor size was the primary determinant of patient choice, subjects with tumors smaller than 20 mm in diameter preferring BCS. For subjects with tumors greater than 20 mm in diameter, factors such as patient age, nodal status and tumor histology become relevant as predictors of patient choice. Classification trees perform as well as logistic regression for predicting patient choice, but are much easier to interpret for clinical use. The selected tree can inform clinicians' advice to patients.
Classification tree analysis of second neoplasms in survivors of childhood cancer
Todorovski Ljupčo; Jazbec Janez; Jereb Berta
2007-01-01
Background: Reports on childhood cancer survivors estimate that the cumulative probability of developing secondary neoplasms varies from 3.3% to 25% at 25 years from diagnosis, and that the risk of developing another cancer is several times greater than in the general population. Methods: In our retrospective study, we used the classification tree multivariate method on a group of 849 first-cancer survivors to identify childhood cancer patients with the greatest risk for development of secondary neoplasms. Results: In the observed group of patients, 34 developed a secondary neoplasm after treatment of the primary cancer. Analysis of parameters present at the treatment of the first cancer exposed two groups of patients at special risk for secondary neoplasm. The first are female patients treated for Hodgkin's disease at the age between 10 and 15 years, whose treatment included radiotherapy. The second group at special risk were male patients with acute lymphoblastic leukemia who were treated between 4.6 and 6.6 years of age. Conclusion: The risk groups identified in our study are similar to the results of studies that used more conventional approaches. The usefulness of our approach in studying the occurrence of second neoplasms should be confirmed in a larger-sample study, but the user-friendly presentation of results makes it attractive for further studies.
Reedbed monitoring using classification trees and SPOT-5 seasonal time series
Davranche, Aurélie; Poulin, Brigitte; Lefebvre, Gaëtan
2010-01-01
The Rhône river delta (Camargue), in the south of France, has lost 40,000 ha of natural areas, including 33,000 ha of wetlands, over the last 60 years, following the extension of agriculture, salt exploitation and industry. Reed development and density in Camargue marshes are influenced by physical factors such as salinity, water depth, and water level fluctuations, which have an effect on reflectance spectra. Classification trees applied to time series of SPOT-5 images appear to be a powerful and reli...
Baltzer, Pascal A.T. [Medical University Vienna, Department of Radiology, Vienna (Austria); Dietzel, Matthias [University hospital Erlangen, Department of Neuroradiology, Erlangen (Germany); Kaiser, Werner A. [University Hospital Jena, Institute of Diagnostic and Interventional Radiology 1, Jena (Germany)
2013-08-15
In the face of multiple available diagnostic criteria in MR-mammography (MRM), a practical algorithm for lesion classification is needed. Such an algorithm should be as simple as possible and include only important independent lesion features to differentiate benign from malignant lesions. This investigation aimed to develop a simple classification tree for differential diagnosis in MRM. A total of 1,084 lesions in standardised MRM with subsequent histological verification (648 malignant, 436 benign) were investigated. Seventeen lesion criteria were assessed by 2 readers in consensus. Classification analysis was performed using the chi-squared automatic interaction detection (CHAID) method. Results include the probability for malignancy for every descriptor combination in the classification tree. A classification tree incorporating 5 lesion descriptors with a depth of 3 ramifications (1, root sign; 2, delayed enhancement pattern; 3, border, internal enhancement and oedema) was calculated. Of all 1,084 lesions, 262 (40.4 %) and 106 (24.3 %) could be classified as malignant and benign with an accuracy above 95 %, respectively. Overall diagnostic accuracy was 88.4 %. The classification algorithm reduced the number of categorical descriptors from 17 to 5 (29.4 %), resulting in a high classification accuracy. More than one third of all lesions could be classified with accuracy above 95 %. (orig.)
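CHAID itself is not part of common Python libraries; the sketch below uses a CART-style tree only to illustrate the kind of readout the paper reports, i.e. a probability of malignancy for every terminal descriptor combination (all data synthetic):

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.tree import DecisionTreeClassifier

    # 1,084 synthetic 'lesions', roughly the benign/malignant mix reported above.
    X, y = make_classification(n_samples=1084, weights=[0.4], random_state=11)
    tree = DecisionTreeClassifier(max_depth=3, random_state=11).fit(X, y)

    leaf = tree.apply(X)                       # terminal node id per lesion
    for node in np.unique(leaf):
        yi = y[leaf == node]
        print(f"node {node}: n={yi.size}, P(malignant)={yi.mean():.2f}")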
Deconinck, Eric; Sacré, Pierre-Yves; Coomans, Danny; De Beer, Jacques
2012-01-01
Due to the extension of the internet, counterfeit drugs represent a growing threat to public health in developing countries, but also more and more in the industrialised world. In the literature, several analytical techniques have been applied in order to discriminate between genuine and counterfeit medicines. One thing all these techniques have in common is that they generate a huge amount of data, which is often difficult to interpret in order to see differences between the different samp...
Knowledge-Based Classification in Automated Soil Mapping
ZHOU BIN; WANG RENCHAO
2003-01-01
A machine-learning approach was developed for automated building of knowledge bases for soil resources mapping by using a classification tree to generate knowledge from training data. With this method, building a knowledge base for automated soil mapping was easier than using the conventional knowledge acquisition approach. The knowledge base built by the classification tree was used by the knowledge classifier to perform the soil type classification of Longyou County, Zhejiang Province, China, using Landsat TM bi-temporal images and GIS data. To evaluate the performance of the resultant knowledge bases, the classification results were compared to an existing soil map based on a field survey. The accuracy assessment and analysis of the resultant soil maps suggested that the knowledge bases built by the machine-learning method were of good quality for mapping the distribution model of soil classes over the study area.
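A minimal illustration of the rule-generation step, assuming scikit-learn as a stand-in for the paper's tooling and the iris data in place of Landsat TM bands and GIS layers; export_text prints the fitted tree as a knowledge base of 'if-then' rules:

    from sklearn.datasets import load_iris
    from sklearn.tree import DecisionTreeClassifier, export_text

    data = load_iris()
    tree = DecisionTreeClassifier(max_depth=2, random_state=12).fit(data.data, data.target)
    print(export_text(tree, feature_names=list(data.feature_names)))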
Melillo, Paolo; De Luca, Nicola; Bracale, Marcello; Pecchia, Leandro
2013-05-01
This study aims to develop an automatic classifier for risk assessment in patients suffering from congestive heart failure (CHF). The proposed classifier separates lower risk patients from higher risk ones, using standard long-term heart rate variability (HRV) measures. Patients are labeled as lower or higher risk according to the New York Heart Association (NYHA) classification. A retrospective analysis of two public Holter databases was performed, analyzing the data of 12 patients suffering from mild CHF (NYHA I and II), labeled as lower risk, and 32 suffering from severe CHF (NYHA III and IV), labeled as higher risk. Only patients with a fraction of total heartbeat intervals (RR) classified as normal-to-normal (NN) intervals (NN/RR) higher than 80% were selected as eligible, in order to have satisfactory signal quality. Classification and regression tree (CART) was employed to develop the classifiers. A total of 30 higher risk and 11 lower risk patients were included in the analysis. The proposed classification trees achieved a sensitivity and a specificity rate of 93.3% and 63.6%, respectively, in identifying higher risk patients. Finally, the rules obtained by CART are comprehensible and consistent with the consensus shown by previous studies that depressed HRV is a useful tool for risk assessment in patients suffering from CHF. PMID:24592473
Kritski Afrânio; Chaisson Richard E; Conde Marcus; Rezende Valéria MC; Soares Sérgio; Bastos Luiz; Mello Fernanda; Ruffino-Netto Antonio; Werneck Guilherme
2006-02-01
Background: Smear-negative pulmonary tuberculosis (SNPT) accounts for 30% of pulmonary tuberculosis cases reported yearly in Brazil. This study aimed to develop a prediction model for SNPT for outpatients in areas with scarce resources. Methods: The study enrolled 551 patients with clinical-radiological suspicion of SNPT in Rio de Janeiro, Brazil. The original data was divided into two equivalent samples for generation and validation of the prediction models. Symptoms, physical signs and chest X-rays were used for constructing logistic regression and classification and regression tree models. From the logistic regression, we generated a clinical and radiological prediction score. The area under the receiver operator characteristic curve, sensitivity, and specificity were used to evaluate the models' performance in both generation and validation samples. Results: It was possible to generate predictive models for SNPT with sensitivity ranging from 64% to 71% and specificity ranging from 58% to 76%. Conclusion: The results suggest that those models might be useful as screening tools for estimating the risk of SNPT, optimizing the utilization of more expensive tests, and avoiding the costs of unnecessary anti-tuberculosis treatment. Those models might be cost-effective tools in a health care network with a hierarchical distribution of scarce resources.
Santana Isabel
2011-08-01
Background: Dementia and cognitive impairment associated with aging are a major medical and social concern. Neuropsychological testing is a key element in the diagnostic procedures for Mild Cognitive Impairment (MCI), but presently has limited value in the prediction of progression to dementia. We advance the hypothesis that newer statistical classification methods derived from data mining and machine learning, like Neural Networks, Support Vector Machines and Random Forests, can improve the accuracy, sensitivity and specificity of predictions obtained from neuropsychological testing. Seven non-parametric classifiers derived from data mining methods (Multilayer Perceptron Neural Networks, Radial Basis Function Neural Networks, Support Vector Machines, CART, CHAID and QUEST Classification Trees, and Random Forests) were compared to three traditional classifiers (Linear Discriminant Analysis, Quadratic Discriminant Analysis and Logistic Regression) in terms of overall classification accuracy, specificity, sensitivity, area under the ROC curve and Press' Q. Model predictors were 10 neuropsychological tests currently used in the diagnosis of dementia. Statistical distributions of classification parameters obtained from a 5-fold cross-validation were compared using Friedman's nonparametric test. Results: Press' Q test showed that all classifiers performed better than chance alone (p ...). Conclusions: When taking into account sensitivity, specificity and overall classification accuracy, Random Forests and Linear Discriminant Analysis rank first among all the classifiers tested in prediction of dementia using several neuropsychological tests. These methods may be used to improve the accuracy, sensitivity and specificity of dementia predictions from neuropsychological testing.
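A sketch of the comparison design described above, with a reduced model list and synthetic data; the Friedman test is applied to fold-wise accuracies as in the paper:

    from scipy.stats import friedmanchisquare
    from sklearn.datasets import make_classification
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.neural_network import MLPClassifier
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=400, n_features=10, random_state=13)
    models = {
        "LDA": LinearDiscriminantAnalysis(),
        "CART": DecisionTreeClassifier(random_state=13),
        "MLP": MLPClassifier(max_iter=2000, random_state=13),
        "RF": RandomForestClassifier(random_state=13),
    }
    scores = {k: cross_val_score(m, X, y, cv=5) for k, m in models.items()}
    print({k: v.mean().round(3) for k, v in scores.items()})
    print("Friedman:", friedmanchisquare(*scores.values()))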
Avoiding overfit by restricted model search in tree-based EEG classification
Klaschka, Jan
The Hague: International Statistical Institute, 2012, s. 5077-5082. ISBN 978-90-73592-33-9. [ISI 2011. Session of the International Statistical Institute /58./. Dublin (IE), 21.08.2011-26.08.2011] R&D Projects: GA MŠk ME 949 Institutional research plan: CEZ:AV0Z10300504 Keywords : model search * electroencephalography * classification trees and forests * random forests Subject RIV: BB - Applied Statistics, Operational Research http://2011.isiproceedings.org/papers/950644.pdf
R. Bou Kheir
2010-06-01
Accurate information about organic/mineral soil occurrence is a prerequisite for many land resources management applications (including climate change mitigation). This paper aims at investigating the potential of using geomorphometrical analysis and decision tree modeling to predict the geographic distribution of hydromorphic organic landscapes in unsampled areas of Denmark. Nine primary (elevation, slope angle, slope aspect, plan curvature, profile curvature, tangent curvature, flow direction, flow accumulation, and specific catchment area) and one secondary (steady-state topographic wetness index) topographic parameters were generated from Digital Elevation Models (DEMs) acquired using airborne LIDAR (Light Detection and Ranging) systems. They were used along with existing digital data collected from other sources (soil type, geological substrate and landscape type) to explain organic/mineral field measurements in hydromorphic landscapes of the chosen Danish area. A large number of tree-based classification models (186) were developed using (1) all of the parameters, (2) the primary DEM-derived topographic (morphological/hydrological) parameters only, (3) selected pairs of parameters and (4) excluding each parameter one at a time from the potential pool of predictor parameters. The best classification tree model (with the lowest misclassification error and the smallest number of terminal nodes and predictor parameters) combined the steady-state topographic wetness index and soil type, and explained 68% of the variability in organic/mineral field measurements. The overall accuracy of the predictive organic/inorganic landscapes map produced (at 1:50,000 cartographic scale) using the best tree was estimated to be ca. 75%. The proposed classification-tree model is relatively simple, quick, realistic and practical, and it can be applied to other areas, thereby providing a tool to facilitate the implementation of pedological/hydrological plans for conservation...
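For reference, the steady-state topographic wetness index named above is conventionally computed as ln(a / tan(beta)), with a the specific catchment area and beta the local slope; a minimal grid sketch under that assumption:

    import numpy as np

    def wetness_index(sca, slope_deg, eps=1e-6):
        """TWI = ln(specific catchment area / tan(slope)); eps guards flat cells."""
        return np.log(sca / (np.tan(np.radians(slope_deg)) + eps))

    sca = np.array([[50.0, 400.0], [1200.0, 8000.0]])   # m^2/m, synthetic
    slope = np.array([[12.0, 6.0], [3.0, 0.5]])         # degrees, synthetic
    print(wetness_index(sca, slope).round(2))           # wetter cells score higher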
R. Bou Kheir
2010-01-01
Accurate information about soil organic carbon (SOC), presented in a spatial form, is a prerequisite for many land resources management applications (including climate change mitigation). This paper aims to investigate the potential of using geomorphometrical analysis and decision tree modeling to predict the geographic distribution of hydromorphic organic landscapes in unsampled areas of Denmark. Nine primary (elevation, slope angle, slope aspect, plan curvature, profile curvature, tangent curvature, flow direction, flow accumulation, and specific catchment area) and one secondary (steady-state topographic wetness index) topographic parameters were generated from Digital Elevation Models (DEMs) acquired using airborne LIDAR (Light Detection and Ranging) systems. They were used along with existing digital data collected from other sources (soil type, geological substrate and landscape type) to statistically explain SOC field measurements in hydromorphic landscapes of the chosen Danish area. A large number of tree-based classification models (186) were developed using (1) all of the parameters, (2) the primary DEM-derived topographic (morphological/hydrological) parameters only, (3) selected pairs of parameters and (4) excluding each parameter one at a time from the potential pool of predictor parameters. The best classification tree model (with the lowest misclassification error and the smallest number of terminal nodes and predictor parameters) combined the steady-state topographic wetness index and soil type, and explained 68% of the variability in field SOC measurements. The overall accuracy of the produced predictive SOC map (at 1:50,000 cartographic scale) using the best tree was estimated to be ca. 75%. The proposed classification-tree model is relatively simple, quick, realistic and practical, and it can be applied to other areas, thereby providing a tool to help with the implementation of pedological/hydrological plans for conservation and sustainable...
Isabel C. Pérez Hoyos
2016-04-01
Groundwater Dependent Ecosystems (GDEs) are increasingly threatened by humans' rising demand for water resources. Consequently, it is imperative to identify the location of GDEs to protect them. This paper develops a methodology to identify the probability of an ecosystem being groundwater dependent. Probabilities are obtained by modeling the relationship between the known locations of GDEs and factors influencing groundwater dependence, namely water table depth and climatic aridity index. Probabilities are derived for the state of Nevada, USA, using modeled water table depth and aridity index values obtained from the Global Aridity database. The selected model results from the performance comparison of classification trees (CT) and random forests (RF). Based on a threshold-independent accuracy measure, RF has a better ability to generate probability estimates. Considering a threshold that minimizes the misclassification rate for each model, RF also proves to be more accurate. Regarding training accuracy, performance measures such as accuracy, sensitivity, and specificity are higher for RF. For the test set, higher values of accuracy and kappa for CT highlight the fact that these measures are greatly affected by low prevalence. As shown for RF, the choice of the cutoff probability value has important consequences for model accuracy and the overall proportion of locations where GDEs are found.
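A hedged sketch of the CT-vs-RF comparison on a threshold-independent measure (ROC AUC as one such measure), with two synthetic predictors standing in for water table depth and aridity index and an imbalanced class prevalence:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    # Two predictors, echoing water-table depth and aridity index (synthetic),
    # with ~10% positives to mimic low prevalence of GDE locations.
    X, y = make_classification(n_samples=2000, n_features=2, n_informative=2,
                               n_redundant=0, weights=[0.9], random_state=14)
    Xtr, Xte, ytr, yte = train_test_split(X, y, stratify=y, random_state=14)

    for name, m in [("CT", DecisionTreeClassifier(max_depth=5, random_state=14)),
                    ("RF", RandomForestClassifier(random_state=14))]:
        p = m.fit(Xtr, ytr).predict_proba(Xte)[:, 1]   # P(groundwater dependent)
        print(name, "AUC:", round(roc_auc_score(yte, p), 3))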
Detection of colonic polyps in CT colonography is problematic due to complexities of polyp shape and the surface of the normal colon. Published results indicate the feasibility of computer-aided detection of polyps, but better classifiers are needed to improve specificity. In this paper we compare the classification results of two approaches: neural networks and recursive binary trees. As our starting point we collect surface geometry information from three-dimensional reconstruction of the colon, followed by a filter based on selected variables such as region density, Gaussian and average curvature, and sphericity. The filter returns sites that are candidate polyps, based on earlier work using detection thresholds, to which the neural nets or the binary trees are applied. A data set of 39 polyps from 3 to 25 mm in size was used in our investigation. For both neural nets and binary trees we use tenfold cross-validation to better estimate the true error rates. The backpropagation neural net with one hidden layer, trained with the Levenberg-Marquardt algorithm, achieved the best results: sensitivity 90% and specificity 95% with 16 false positives per study.
Mengersen Kerrie L
2011-07-01
Background: Strategies for cancer reduction and management are targeted at both individual and area levels. Area-level strategies require careful understanding of geographic differences in cancer incidence, in particular the association with factors such as socioeconomic status, ethnicity and accessibility. This study aimed to identify the complex interplay of area-level factors associated with high area-specific incidence of Australian priority cancers using a classification and regression tree (CART) approach. Methods: Area-specific smoothed standardised incidence ratios were estimated for priority-area cancers across 478 statistical local areas in Queensland, Australia (1998-2007, n = 186,075). For those cancers with significant spatial variation, CART models were used to identify whether area-level accessibility, socioeconomic status and ethnicity were associated with high area-specific incidence. Results: The accessibility of a person's residence had the most consistent association with the risk of cancer diagnosis across the specific cancers. Many cancers were likely to have high incidence in more urban areas, although male lung cancer and cervical cancer tended to have high incidence in more remote areas. The impact of socioeconomic status and ethnicity on these associations differed by type of cancer. Conclusions: These results highlight the complex interactions between accessibility, socioeconomic status and ethnicity in determining cancer incidence risk.
Diggle, Peter J
2007-01-01
Model-based geostatistics refers to the application of general statistical principles of modeling and inference to geostatistical problems. This volume provides a treatment of model-based geostatistics with an emphasis on statistical methods and applications. It also features analyses of datasets from a range of scientific contexts.
Staying Power of Churn Prediction Models
Risselada, Hans; Verhoef, Peter C.; Bijmolt, Tammo H. A.
2010-01-01
In this paper, we study the staying power of various churn prediction models. Staying power is defined as the predictive performance of a model in a number of periods after the estimation period. We examine two methods, logit models and classification trees, both with and without applying a bagging approach.
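A minimal sketch of the staying-power question under stated assumptions (synthetic data with artificial drift standing in for later periods; scikit-learn's BaggingClassifier supplies the bagging):

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import BaggingClassifier
    from sklearn.tree import DecisionTreeClassifier

    X0, y0 = make_classification(n_samples=1000, random_state=15)   # estimation period
    rng = np.random.default_rng(15)
    models = {"tree": DecisionTreeClassifier(random_state=15),
              "bagged tree": BaggingClassifier(DecisionTreeClassifier(), random_state=15)}
    for name, m in models.items():
        m.fit(X0, y0)
        for t in range(1, 4):                                       # later periods
            Xt = X0 + rng.normal(0, 0.1 * t, X0.shape)              # mild drift
            print(name, f"period +{t}: accuracy {m.score(Xt, y0):.3f}")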
Hector R. Wong
2015-12-01
The temporal version of the pediatric sepsis biomarker risk model (tPERSEVERE) estimates the risk of a complicated course in children with septic shock based on biomarker changes from days 1 to 3 of septic shock. We validated tPERSEVERE performance in a prospective cohort, with an a priori plan to redesign tPERSEVERE if it did not perform well. Biomarkers were measured in the validation cohort (n = 168) and study subjects were classified according to tPERSEVERE. To redesign tPERSEVERE, the validation cohort and the original derivation cohort (n = 299) were combined and randomly allocated to training (n = 374) and test (n = 93) sets. tPERSEVERE was redesigned using the training set and CART methodology. tPERSEVERE performed poorly in the validation cohort, with an area under the curve (AUC) of 0.67 (95% CI: 0.58–0.75). Failure analysis revealed potential confounders related to clinical characteristics. The redesigned tPERSEVERE model had an AUC of 0.83 (0.79–0.87) and a sensitivity of 93% (68–97) for estimating the risk of a complicated course. Similar performance was seen in the test set. The classification tree segregated patients into two broad endotypes of septic shock characterized by either excessive inflammation or immune suppression.
Heimann, Tobias; Delingette, Hervé
2011-01-01
This chapter starts with a brief introduction to model-based segmentation, explaining the basic concepts and different approaches. Subsequently, two segmentation approaches are presented in more detail: First, the method of deformable simplex meshes is described, explaining the special properties of the simplex mesh and the formulation of the internal forces. Common choices for image forces are presented, along with how to evolve the mesh to adapt to certain structures. Second, the method of point...
刘甲野; 马吉祥; 徐爱强; 付振涛; 贺桂顺; 贾崇奇; 于洋
2008-01-01
Objective: To explore the risk factors of hypertension and the high-risk population among adults aged ≥25 in the mid-western rural areas of Shandong province, and to provide evidence for the development of intervention measures. Methods: Subjects aged ≥25 were selected by a multi-stage stratified random sampling method. All participants were interviewed with a standard questionnaire and physically examined for height, weight, waist circumference, blood pressure and fasting plasma glucose (FPG). Classification tree analysis was employed to determine the risk factors of hypertension and the high-risk populations related to it. Results: The major risk factors for hypertension were age, abdominal obesity, overweight or obesity, family history and high blood sugar. The major high-risk populations included: a) the elderly; b) middle-aged people with high blood sugar, abdominal obesity/overweight, or family history; c) middle-aged people with both family history and abdominal obesity. Through classification tree analysis, the sensitivity, specificity and overall correct rates were 71.87%, 66.38% and 68.79%, respectively, on the 'learning sample', and 70.70%, 65.84% and 67.97%, respectively, on the 'testing sample'. Conclusion: Efforts on both weight and blood sugar reduction are common prevention measures for the general population. Different kinds of prevention and control measures should be taken according to the different risk factors present in the targeted high-risk population for hypertension. Community-based hypertension prevention and control measures should be integrated when targeting the population at high risk.
Rowe, Sidney E.
2010-01-01
In September 2007, the Engineering Directorate at the Marshall Space Flight Center (MSFC) created the Design System Focus Team (DSFT). MSFC was responsible for the in-house design and development of the Ares 1 Upper Stage and the Engineering Directorate was preparing to deploy a new electronic Configuration Management and Data Management System with the Design Data Management System (DDMS) based upon a Commercial Off The Shelf (COTS) Product Data Management (PDM) System. The DSFT was to establish standardized CAD practices and a new data life cycle for design data. Of special interest here, the design teams were to implement Model Based Definition (MBD) in support of the Upper Stage manufacturing contract. It is noted that this MBD does use partially dimensioned drawings for auxiliary information to the model. The design data lifecycle implemented several new release states to be used prior to formal release that allowed the models to move through a flow of progressive maturity. The DSFT identified some 17 Lessons Learned as outcomes of the standards development, pathfinder deployments and initial application to the Upper Stage design completion. Some of the high value examples are reviewed.
LSTM based Conversation Models
Luan, Yi; Ji, Yangfeng; Ostendorf, Mari
2016-01-01
In this paper, we present a conversational model that incorporates both context and participant role for two-party conversations. Different architectures are explored for integrating participant role and context information into a Long Short-term Memory (LSTM) language model. The conversational model can function as a language model or a language generation model. Experiments on the Ubuntu Dialog Corpus show that our model can capture multiple turn interaction between participants. The propos...
Ifenthaler, Dirk; Seel, Norbert M.
2013-01-01
In this paper, there will be a particular focus on mental models and their application to inductive reasoning within the realm of instruction. A basic assumption of this study is the observation that the construction of mental models and related reasoning is a slowly developing capability of cognitive systems that emerges effectively with proper…
Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui; Yenne, Britt; Vansickle, Larry; Ballantyne, Michael
1992-01-01
Domain-specific knowledge is required to create specifications, generate code, and understand existing systems. Our approach to automating software design is based on instantiating an application domain model with industry-specific knowledge and then using that model to achieve the operational goals of specification elicitation and verification, reverse engineering, and code generation. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model.
Model-based Software Engineering
Kindler, Ekkart
2010-01-01
The vision of model-based software engineering is to make models the main focus of software development and to automatically generate software from these models. Part of that idea works already today. But there are still difficulties when it comes to behaviour. Actually, there is no lack in models...
Principles of models based engineering
Dolin, R.M.; Hefele, J.
1996-11-01
This report describes a Models Based Engineering (MBE) philosophy and implementation strategy that has been developed at Los Alamos National Laboratory's Center for Advanced Engineering Technology. A major theme in this discussion is that models based engineering is an information management technology enabling the development of information driven engineering. Unlike other information management technologies, models based engineering encompasses the breadth of engineering information, from design intent through product definition to consumer application.
Model Construct Based Enterprise Model Architecture and Its Modeling Approach
Anonymous
2002-01-01
In order to support enterprise integration, a model-construct-based enterprise model architecture and its modeling approach are studied in this paper. First, the structural makeup and internal relationships of the enterprise model architecture are discussed. Then, the concept of a reusable model construct (MC), which belongs to the control view and can help to derive other views, is proposed. The modeling approach based on model constructs consists of three steps: reference model architecture synthesis, enterprise model customization, and system design and implementation. Following the MC-based modeling approach, a case study with the background of one-kind-product machinery manufacturing enterprises is illustrated. It is shown that the proposed model-construct-based enterprise model architecture and modeling approach are practical and efficient.
Methods of development fuzzy logic driven decision-support models in copper alloys processing
S. Kluska-Nawarecka
2010-01-01
Developing a diagnostic decision support system using a logical formalism other than bivalent logic, in particular fuzzy logic, allows inference from facts presented not as explicit numbers but described by linguistic variables such as "high level", "low temperature", "too much content", etc. Thanks to this, the inference process resembles human reasoning under the actual conditions of decision-making. Expert knowledge allows discovery of the functions describing the relationship between the classification of a set of objects and their characteristics, on the basis of which decision-making rules can be created for classifying new objects of so far unknown classification. This process can be automated. Experimental studies conducted on copper alloys provide large amounts of data. Processing of these data can be greatly accelerated by classification tree algorithms, which provide classes that can be used in a fuzzy inference model. Fuzzy logic also provides flexibility in allocating objects to classes on the basis of membership functions (which is similar to events in real-world conditions). Decision-making in foundry operations often relies on incomplete and ambiguous knowledge, so the conclusions drawn from the data and facts may be "to some extent" true, and the technologist has to determine what level of confidence is acceptable, although the degree of accuracy for specific criteria is defined by the membership function, which takes values from the interval [0, 1]. This paper describes the methodology and the process of developing fuzzy logic-based decision-making models from data preprocessed with classification trees, scoped to the diverse characteristics of copper alloys processing. Algorithms for the automatic classification of materials research data on copper alloys are clearly innovative and hold promise for practical applications in this area.
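To make the linguistic-variable idea concrete, here is a minimal sketch assuming trapezoidal membership functions; the variable (copper content) and all breakpoints are invented, and the paper's actual rule base is not reproduced.

```python
import numpy as np

def trapmf(x, a, b, c, d):
    """Trapezoidal membership function; returns degrees of membership in [0, 1]."""
    return np.clip(np.minimum((x - a) / max(b - a, 1e-9),
                              (d - x) / max(d - c, 1e-9)), 0.0, 1.0)

# Hypothetical linguistic terms for a copper-content variable (%); thresholds assumed.
copper = np.array([58.0, 61.5, 63.0, 66.0])
low  = trapmf(copper, 55, 55, 59, 62)   # "low content"
high = trapmf(copper, 60, 63, 70, 70)   # "high content"

# A tree-derived class can be assigned by the strongest membership, mirroring
# how classification-tree output feeds the fuzzy inference model.
for c, mu_lo, mu_hi in zip(copper, low, high):
    print(f"Cu={c:5.1f}%  mu_low={mu_lo:.2f}  mu_high={mu_hi:.2f}  ->",
          "low" if mu_lo >= mu_hi else "high")
```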
Graph Model Based Indoor Tracking
Jensen, Christian Søndergaard; Lu, Hua; Yang, Bin
2009-01-01
The tracking of the locations of moving objects in large indoor spaces is important, as it enables a range of applications related to, e.g., security and indoor navigation and guidance. This paper presents a graph model based approach to indoor tracking that offers a uniform data management...... infrastructure for different symbolic positioning technologies, e.g., Bluetooth and RFID. More specifically, the paper proposes a model of indoor space that comprises a base graph and mappings that represent the topology of indoor space at different levels. The resulting model can be used for one or several...... indoor positioning technologies. Focusing on RFID-based positioning, an RFID specific reader deployment graph model is built from the base graph model. This model is then used in several algorithms for constructing and refining trajectories from raw RFID readings. Empirical studies with implementations...
Cluster Based Text Classification Model
Nizamani, Sarwat; Memon, Nasrullah; Wiil, Uffe Kock
2011-01-01
We propose a cluster based classification model for suspicious email detection and other text classification tasks. The text classification tasks comprise many training examples that require a complex classification model. Using clusters for classification makes the model simpler and increases the...... classifier is trained on each cluster having reduced dimensionality and less number of examples. The experimental results show that the proposed model outperforms the existing classification models for the task of suspicious email detection and topic categorization on the Reuters-21578 and 20 Newsgroups...... datasets. Our model also outperforms A Decision Cluster Classification (ADCC) and the Decision Cluster Forest Classification (DCFC) models on the Reuters-21578 dataset....
Van Damme Pierre
2009-01-01
Background Until recently, mathematical models of person-to-person infectious disease transmission had to make assumptions on transmissions enabled by personal contacts by estimating the so-called WAIFW matrix. In order to better inform such estimates, a population-based contact survey was carried out in Belgium over the period March-May 2006. In contrast to other European surveys conducted simultaneously, each respondent recorded contacts over two days. Special attention was given to holiday periods, and to respondents with large numbers of professional contacts. Methods Participants kept a paper diary with information on their contacts over two different days. A contact was defined as a two-way conversation of at least three words in each other's proximity. The contact information included the age of the contact, gender, location, duration, frequency, and whether or not touching was involved. For data analysis, we used association rules and classification trees. Weighted generalized estimating equations were used to analyze contact frequency while accounting for the correlation between contacts reported on the two different days. A contact surface, expressing the average number of contacts between persons of different ages, was obtained by a bivariate smoothing approach and the relation to the so-called next-generation matrix was established. Results People mostly mixed with people of similar age, or with their offspring, their parents and their grandparents. By imputing professional contacts, the average number of daily contacts increased from 11.84 to 15.70. The number of reported contacts depended heavily on household size, class size for children and number of professional contacts for adults. Adults living with children had on average 2 daily contacts more than adults living without children. In the holiday period, the daily contact frequency for children and adolescents decreased by about 19%, while a similar observation
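A toy sketch of the survey's central object, the age-mixing contact matrix, built from diary-style records; the age bands, participants and contact lists are fabricated, and a real analysis would also apply the survey weighting the paper describes.

```python
import numpy as np

bins = np.array([18, 35, 60])            # assumed age-band edges: <18, 18-34, 35-59, 60+
part_age = np.array([8, 30, 70, 45])     # one entry per participant (fabricated)
diaries = {0: [9, 35], 1: [28, 5, 35], 2: [72], 3: [44]}  # contact ages per participant

K = len(bins) + 1
counts = np.zeros((K, K))
per_band = np.zeros(K)
for pid, contact_ages in diaries.items():
    i = np.digitize(part_age[pid], bins)
    per_band[i] += 1
    for a in contact_ages:
        counts[i, np.digitize(a, bins)] += 1

# Mean reported contacts from band i with band j; empty bands are left at zero.
mixing = np.divide(counts, per_band[:, None],
                   out=np.zeros_like(counts), where=per_band[:, None] > 0)
print(mixing)
```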
Base Flow Model Validation Project
National Aeronautics and Space Administration — The program focuses on turbulence modeling enhancements for predicting high-speed rocket base flows. A key component of the effort is the collection of...
Modeling Guru: Knowledge Base for NASA Modelers
Seablom, M. S.; Wojcik, G. S.; van Aartsen, B. H.
2009-05-01
Modeling Guru is an on-line knowledge-sharing resource for anyone involved with or interested in NASA's scientific models or High End Computing (HEC) systems. Developed and maintained by NASA's Software Integration and Visualization Office (SIVO) and the NASA Center for Computational Sciences (NCCS), Modeling Guru's combined forums and knowledge base for research and collaboration is becoming a repository for the accumulated expertise of NASA's scientific modeling and HEC communities. All NASA modelers and associates are encouraged to participate and provide knowledge about the models and systems so that other users may benefit from their experience. Modeling Guru is divided into a hierarchy of communities, each with its own set of forums and knowledge base documents. Current modeling communities include those for space science, land and atmospheric dynamics, atmospheric chemistry, and oceanography. In addition, there are communities focused on NCCS systems, HEC tools and libraries, and programming and scripting languages. Anyone may view most of the content on Modeling Guru (available at http://modelingguru.nasa.gov/), but you must log in to post messages and subscribe to community postings. The site offers a full range of "Web 2.0" features, including discussion forums, "wiki" document generation, document uploading, RSS feeds, search tools, blogs, email notification, and "breadcrumb" links. A discussion (a.k.a. forum "thread") is used to post comments, solicit feedback, or ask questions. If marked as a question, SIVO will monitor the thread, and normally respond within a day. Discussions can include embedded images, tables, and formatting through the use of the Rich Text Editor. Also, the user can add "Tags" to their thread to facilitate later searches. The "knowledge base" comprises documents that are used to capture and share expertise with others. The default "wiki" document lets users edit within the browser so others can easily collaborate on the
Bækgaard, Lars
2004-01-01
We present and discuss a modeling approach that supports event-based modeling of information and activity in information systems. Interacting human actors and IT-actors may carry out such activity. We use events to create meaningful relations between information structures and the related...
Modelling Gesture Based Ubiquitous Applications
Zacharia, Kurien; Varghese, Surekha Mariam
2011-01-01
A cost effective, gesture based modelling technique called Virtual Interactive Prototyping (VIP) is described in this paper. Prototyping is implemented by projecting a virtual model of the equipment to be prototyped. Users can interact with the virtual model like the original working equipment. For capturing and tracking the user's interactions with the model, image and sound processing techniques are used. VIP is a flexible and interactive prototyping method that has many applications in ubiquitous computing environments. Different commercial as well as socio-economic applications and the extension of VIP to interactive advertising are also discussed.
Sketch-based geologic modeling
Rood, M. P.; Jackson, M.; Hampson, G.; Brazil, E. V.; de Carvalho, F.; Coda, C.; Sousa, M. C.; Zhang, Z.; Geiger, S.
2015-12-01
Two-dimensional (2D) maps and cross-sections, and 3D conceptual models, are fundamental tools for understanding, communicating and modeling geology. Yet geologists lack dedicated and intuitive tools that allow rapid creation of such figures and models. Standard drawing packages produce only 2D figures that are not suitable for quantitative analysis. Geologic modeling packages can produce 3D models and are widely used in the groundwater and petroleum communities, but are often slow and non-intuitive to use, requiring the creation of a grid early in the modeling workflow and the use of geostatistical methods to populate the grid blocks with geologic information. We present an alternative approach to rapidly create figures and models using sketch-based interface and modelling (SBIM). We leverage methods widely adopted in other industries to prototype complex geometries and designs. The SBIM tool contains built-in geologic rules that constrain how sketched lines and surfaces interact. These rules are based on the logic of superposition and cross-cutting relationships that follow from rock-forming processes, including deposition, deformation, intrusion and modification by diagenesis or metamorphism. The approach allows rapid creation of multiple, geologically realistic, figures and models in 2D and 3D using a simple, intuitive interface. The user can sketch in plan- or cross-section view. Geologic rules are used to extrapolate sketched lines in real time to create 3D surfaces. Quantitative analysis can be carried out directly on the models. Alternatively, they can be output as simple figures or imported directly into other modeling tools. The software runs on a tablet PC and can be used in a variety of settings including the office, classroom and field. The speed and ease of use of SBIM enables multiple interpretations to be developed from limited data, uncertainty to be readily appraised, and figures and models to be rapidly updated to incorporate new data or concepts.
ElSalamouny, Ehab; Nielsen, Mogens; Sassone, Vladimiro
2010-01-01
Probabilistic trust has been adopted as an approach to taking security sensitive decisions in modern global computing environments. Existing probabilistic trust frameworks either assume fixed behaviour for the principals or incorporate the notion of ‘decay' as an ad hoc approach to cope with thei...... the major limitation of existing Beta trust model. We show the consistency of the HMM-based trust model and contrast it against the well known Beta trust model with the decay principle in terms of the estimation precision....
A model-based display is identified, discussed, and illustrated. The model used in the display is based upon the Rankine Cycle, a heat engine cycle. Plant process data from the loss of main and auxiliary feedwater event at the Davis-Besse Plant on June 9, 1985 is used to illustrate the display. The model used in the display fuses individual process variables into process functions. It also serves as a medium to communicate status of the process to human users. The human users may evaluate the goals of operation from the displayed process functions. Because of these display features, the user's cognitive workload is minimized. The opinions expressed herein are the author's personal ones and do not necessarily reflect criteria, requirements, and guidelines of the U.S. Nuclear Regulatory Commission
Running a nuclear power plant involves monitoring data provided by the installation's sensors. Operators and computerized systems then use these data to establish a diagnostic of the plant. However, the instrumentation system is complex, and is not immune to faults and failures. This paper presents a system for detecting sensor failures using a topological description of the installation and a set of component models. This model of the plant implicitly contains relations between sensor data. These relations must always be checked if all the components are functioning correctly. The failure detection task thus consists of checking these constraints. The constraints are extracted in two stages. Firstly, a qualitative model of their existence is built using structural analysis. Secondly, the models are formally handled according to the results of the structural analysis, in order to establish the constraints on the sensor data. This work constitutes an initial step in extending model-based diagnosis, as the information on which it is based is suspect. This work will be followed by surveillance of the detection system. When the instrumentation is assumed to be sound, the unverified constraints indicate errors on the plant model. (authors). 8 refs., 4 figs
Model-based requirements engineering
Holt, Jon
2012-01-01
This book provides a hands-on introduction to model-based requirements engineering and management by describing a set of views that form the basis for the approach. These views take into account each individual requirement in terms of its description, but then also provide each requirement with meaning by putting it into the correct 'context'. A requirement that has been put into a context is known as a 'use case' and may be based upon either stakeholders or levels of hierarchy in a system. Each use case must then be analysed and validated by defining a combination of scenarios and formal mathematica...
Model-based tomographic reconstruction
Chambers, David H.; Lehman, Sean K.; Goodman, Dennis M.
2012-06-26
A model-based approach to estimating wall positions for a building is developed and tested using simulated data. It borrows two techniques from geophysical inversion problems, layer stripping and stacking, and combines them with a model-based estimation algorithm that minimizes the mean-square error between the predicted signal and the data. The technique is designed to process multiple looks from an ultra wideband radar array. The processed signal is time-gated and each section processed to detect the presence of a wall and estimate its position, thickness, and material parameters. The floor plan of a building is determined by moving the array around the outside of the building. In this paper we describe how the stacking and layer stripping algorithms are combined and show the results from a simple numerical example of three parallel walls.
Differential Geometry Based Multiscale Models
Wei, Guo-Wei
2010-01-01
Large chemical and biological systems such as fuel cells, ion channels, molecular motors, and viruses are of great importance to the scientific community and public health. Typically, these complex systems in conjunction with their aquatic environment pose a fabulous challenge to theoretical description, simulation, and prediction. In this work, we propose a differential geometry based multiscale paradigm to model complex macromolecular systems, and to put macroscopic and microscopic descript...
Crowdsourcing Based 3D Modeling
Somogyi, A.; Barsi, A.; Molnar, B.; Lovas, T.
2016-06-01
Web-based photo albums that support organizing and viewing the users' images are widely used. These services provide a convenient solution for storing, editing and sharing images. In many cases, the users attach geotags to the images in order to enable using them e.g. in location based applications on social networks. Our paper discusses a procedure that collects open access images from a site frequently visited by tourists. Geotagged pictures showing the image of a sight or tourist attraction are selected and processed in photogrammetric processing software that produces the 3D model of the captured object. For the particular investigation we selected three attractions in Budapest. To assess the geometrical accuracy, we used laser scanner and DSLR as well as smart phone photography to derive reference values to enable verifying the spatial model obtained from the web-album images. The investigation shows how detailed and accurate models could be derived applying photogrammetric processing software, simply by using images of the community, without visiting the site.
An Agent Based Classification Model
Gu, Feng; Greensmith, Julie
2009-01-01
The major function of this model is to access the UCI Wisconsin Breast Cancer data-set[1] and classify the data items into two categories, which are normal and anomalous. This kind of classification can be referred to as anomaly detection, which discriminates anomalous behaviour from normal behaviour in computer systems. One popular solution for anomaly detection is Artificial Immune Systems (AIS). AIS are adaptive systems inspired by theoretical immunology and observed immune functions, principles and models which are applied to problem solving. The Dendritic Cell Algorithm (DCA)[2] is an AIS algorithm that is developed specifically for anomaly detection. It has been successfully applied to intrusion detection in computer security. It is believed that agent-based modelling is an ideal approach for implementing AIS, as intelligent agents could be the perfect representations of immune entities in AIS. This model evaluates the feasibility of re-implementing the DCA in an agent-based simulation environment...
Hibbard, Bill
2012-05-01
Orseau and Ring, as well as Dewey, have recently described problems, including self-delusion, with the behavior of agents using various definitions of utility functions. An agent's utility function is defined in terms of the agent's history of interactions with its environment. This paper argues, via two examples, that the behavior problems can be avoided by formulating the utility function in two steps: 1) inferring a model of the environment from interactions, and 2) computing utility as a function of the environment model. Basing a utility function on a model that the agent must learn implies that the utility function must initially be expressed in terms of specifications to be matched to structures in the learned model. These specifications constitute prior assumptions about the environment so this approach will not work with arbitrary environments. But the approach should work for agents designed by humans to act in the physical world. The paper also addresses the issue of self-modifying agents and shows that if provided with the possibility to modify their utility functions agents will not choose to do so, under some usual assumptions.
Trace-Based Code Generation for Model-Based Testing
Kanstrén, T.; Piel, E.; Gross, H.-G.
2009-01-01
Paper Submitted for review at the Eighth International Conference on Generative Programming and Component Engineering. Model-based testing can be a powerful means to generate test cases for the system under test. However, creating a useful model for model-based testing requires expertise in the (formal) modeling language of the used tool and the general concept of modeling the system under test for effective test generation. A commonly used modeling notation is to describe the model through a...
Business value modeling based on BPMN models
Masoumigoudarzi, Farahnaz
2014-01-01
In this study we clarify how to model and measure 'business values', as defined in a business context, in the business processes of a company, and we introduce different methods and select the one best suited to modeling the company's business values. These methods have been used by researchers in business analytics and by senior managers of many companies. The focus of this project is business value detection and modeling. The basis of this research is on BPM...
Optimal pricing decision model based on activity-based costing
王福胜; 常庆芳
2003-01-01
In order to find out the applicability of the optimal pricing decision model based on the conventional cost behavior model after activity-based costing has given a strong shock to the conventional cost behavior model and its assumptions, detailed analyses have been made using the activity-based cost behavior and cost-volume-profit analysis model. It is concluded from these analyses that the theory behind the construction of the optimal pricing decision model is still tenable under activity-based costing, but the conventional optimal pricing decision model must be modified as appropriate to the activity-based costing based cost behavior model and cost-volume-profit analysis model; an optimal pricing decision model is really a product pricing decision model constructed by following the economic principle of profit maximization.
Sensor-based interior modeling
Robots and remote systems will play crucial roles in future decontamination and decommissioning (D&D) of nuclear facilities. Many of these facilities, such as uranium enrichment plants, weapons assembly plants, research and production reactors, and fuel recycling facilities, are dormant; there is also an increasing number of commercial reactors whose useful lifetime is nearly over. To reduce worker exposure to radiation, occupational and other hazards associated with D&D tasks, robots will execute much of the work agenda. Traditional teleoperated systems rely on human understanding (based on information gathered by remote viewing cameras) of the work environment to safely control the remote equipment. However, removing the operator from the work site substantially reduces his efficiency and effectiveness. To approach the productivity of a human worker, tasks will be performed telerobotically, in which many aspects of task execution are delegated to robot controllers and other software. This paper describes a system that semi-automatically builds a virtual world for remote D&D operations by constructing 3-D models of a robot's work environment. Planar and quadric surface representations of objects typically found in nuclear facilities are generated from laser rangefinder data with a minimum of human interaction. The surface representations are then incorporated into a task space model that can be viewed and analyzed by the operator, accessed by motion planning and robot safeguarding algorithms, and ultimately used by the operator to instruct the robot at a level much higher than teleoperation
Memristor model based on fuzzy window function
Abdel-Kader, Rabab Farouk; Abuelenin, Sherif M.
2016-01-01
Memristor (memory-resistor) is the fourth passive circuit element. We introduce a memristor model based on a fuzzy logic window function. Fuzzy models are flexible, which enables the capture of the pinched hysteresis behavior of the memristor. The introduced fuzzy model avoids common problems associated with window-function based memristor models, such as the terminal state problem, and the symmetry issues. The model captures the memristor behavior with a simple rule-base which gives an insig...
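For orientation, a window-function memristor model can be sketched in a few lines; here the classical Joglekar-type window stands in for the paper's fuzzy rule-base window, and the device constants, drive signal and forward-Euler step are all assumptions.

```python
import numpy as np

Ron, Roff, k, p = 100.0, 16e3, 1e4, 2     # assumed device constants

def window(x):
    # Joglekar window: f(0) = f(1) = 0, suppressing state drift at the terminals.
    return 1 - (2*x - 1)**(2*p)

dt, T = 1e-4, 2.0
t = np.arange(0, T, dt)
v = np.sin(2*np.pi*1.0*t)                 # 1 Hz sinusoidal drive
x = 0.1                                   # initial state (doped fraction)
i = np.empty_like(t)
for n, vn in enumerate(v):
    R = Ron*x + Roff*(1 - x)              # state-dependent memristance
    i[n] = vn / R
    x = float(np.clip(x + dt*k*i[n]*window(x), 0.0, 1.0))

# The i-v trajectory passes through the origin each half cycle (pinched hysteresis).
print("current range (A):", i.min(), i.max())
```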
A respiratory alert model for the Shenandoah Valley, Virginia, USA.
Hondula, David M; Davis, Robert E; Knight, David B; Sitka, Luke J; Enfield, Kyle; Gawtry, Stephen B; Stenger, Phillip J; Deaton, Michael L; Normile, Caroline P; Lee, Temple R
2013-01-01
Respiratory morbidity (particularly COPD and asthma) can be influenced by short-term weather fluctuations that affect air quality and lung function. We developed a model to evaluate meteorological conditions associated with respiratory hospital admissions in the Shenandoah Valley of Virginia, USA. We generated ensembles of classification trees based on six years of respiratory-related hospital admissions (64,620 cases) and a suite of 83 potential environmental predictor variables. As our goal was to identify short-term weather linkages to high admission periods, the dependent variable was formulated as a binary classification of five-day moving average respiratory admission departures from the seasonal mean value. Accounting for seasonality removed the long-term apparent inverse relationship between temperature and admissions. We generated eight total models specific to the northern and southern portions of the valley for each season. All eight models demonstrate predictive skill (mean odds ratio = 3.635) when evaluated using a randomization procedure. The predictor variables selected by the ensembling algorithm vary across models, and both meteorological and air quality variables are included. In general, the models indicate complex linkages between respiratory health and environmental conditions that may be difficult to identify using more traditional approaches. PMID:22438053
Test case generation based on orthogonal table for software black-box testing
LIU Jiu-fu; YANG Zhong; YANG Zhen-xing; SUN Lin
2008-01-01
Software testing is an important means to assure software quality. This paper presents a practical, operational and highly efficient method to generate test cases for software testing. We discuss the identification of software specification categories and choices and construct a classification tree. Based on an orthogonal array, it is then easy to generate test cases. The number of test cases produced by this method is smaller than that of all combinations of the choices.
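A tiny sketch of the idea using the standard L4(2^3) orthogonal array; the specification categories and choices are invented for illustration.

```python
# Each row of L4 is one test case; each column is one two-level factor.
L4 = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]

factors = {  # hypothetical categories/choices from a classification tree
    "input_size": ["empty", "large"],
    "user_role":  ["guest", "admin"],
    "connection": ["offline", "online"],
}
names = list(factors)
for row in L4:
    case = {names[j]: factors[names[j]][level] for j, level in enumerate(row)}
    print(case)
# The 4 cases cover every pair of choices once, versus 2**3 = 8 exhaustive cases.
```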
Robins, Robert E.; Delisi, Donald P.
2008-01-01
In Robins and Delisi (2008), a linear decay model, a new IGE model by Sarpkaya (2006), and a series of APA-based models were scored using data from three airports. This report is a guide to the APA-based models.
CEAI: CCM based Email Authorship Identification Model
Nizamani, Sarwat; Memon, Nasrullah
2013-01-01
reveal that the proposed CCM-based email authorship identification model, along with the proposed feature set, outperforms the state-of-the-art support vector machine (SVM)-based models, as well as the models proposed by Iqbal et al. [1, 2]. The proposed model attains an accuracy rate of 94% for 10...
Mining geriatric assessment data for in-patient fall prediction models and high-risk subgroups
Marschollek Michael
2012-03-01
Background Hospital in-patient falls constitute a prominent problem in terms of costs and consequences. Geriatric institutions are most often affected, and common screening tools cannot predict in-patient falls consistently. Our objectives are to derive comprehensible fall risk classification models from a large data set of geriatric in-patients' assessment data and to evaluate their predictive performance (aim #1), and to identify high-risk subgroups from the data (aim #2). Methods A data set of n = 5,176 single in-patient episodes covering 1.5 years of admissions to a geriatric hospital was extracted from the hospital's database and matched with fall incident reports (n = 493). A classification tree model was induced using the C4.5 algorithm as well as a logistic regression model, and their predictive performance was evaluated. Furthermore, high-risk subgroups were identified from extracted classification rules with a support of more than 100 instances. Results The classification tree model showed an overall classification accuracy of 66%, with a sensitivity of 55.4%, a specificity of 67.1%, and positive and negative predictive values of 15% and 93.5% respectively. Five high-risk groups were identified, defined by high age, low Barthel index, cognitive impairment, multi-medication and co-morbidity. Conclusions Our results show that a little more than half of the fallers may be identified correctly by our model, but the positive predictive value is too low to be applicable. Non-fallers, on the other hand, may be sorted out with the model quite well. The high-risk subgroups and the risk factors identified (age, low ADL score, cognitive impairment, institutionalization, polypharmacy and co-morbidity) reflect domain knowledge and may be used to screen certain subgroups of patients with a high risk of falling. Classification models derived from a large data set using data mining methods can compete with current dedicated fall risk screening tools, yet lack
Advanced statistical models can help industry to design more economical and rational investment plans. Fault detection and diagnosis is an important problem in continuous hot dip galvanizing. Increasingly stringent quality requirements in the automotive industry also require ongoing efforts in process control to make processes more robust. Robust methods for estimating the quality of galvanized steel coils are an important tool for the comprehensive monitoring of the performance of the manufacturing process. This study applies different statistical regression models: generalized linear models, generalized additive models and classification trees to estimate the quality of galvanized steel coils on the basis of short time histories. The data, consisting of 48 galvanized steel coils, was divided into sets of conforming and nonconforming coils. Five variables were selected for monitoring the process: steel strip velocity and four bath temperatures. The present paper reports a comparative evaluation of statistical models for binary data using Receiver Operating Characteristic (ROC) curves. A ROC curve is a graph or a technique for visualizing, organizing and selecting classifiers based on their performance. The purpose of this paper is to examine their use in research to obtain the best model to predict defective steel coil probability. In relation to the work of other authors who only propose goodness of fit statistics, we should highlight one distinctive feature of the methodology presented here, which is the possibility of comparing the different models with ROC graphs which are based on model classification performance. Finally, the results are validated by bootstrap procedures.
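As a hedged sketch of ROC-based model comparison, the snippet below scores a logistic GLM and a classification tree on synthetic binary data (the GAM is omitted), with a simple bootstrap of the AUC standing in for the paper's bootstrap validation; the data and model settings are assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=600, n_features=5, random_state=1)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=1)

models = {
    "GLM (logistic)": LogisticRegression(max_iter=1000),
    "classification tree": DecisionTreeClassifier(max_depth=4, random_state=1),
}
rng = np.random.default_rng(0)
for name, m in models.items():
    proba = m.fit(Xtr, ytr).predict_proba(Xte)[:, 1]
    aucs = []
    for _ in range(200):                       # bootstrap resampling of the test set
        idx = rng.integers(0, len(yte), len(yte))
        if len(np.unique(yte[idx])) == 2:      # AUC needs both classes present
            aucs.append(roc_auc_score(yte[idx], proba[idx]))
    print(f"{name}: AUC = {np.mean(aucs):.3f} +/- {np.std(aucs):.3f}")
```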
Trace-Based Code Generation for Model-Based Testing
Kanstrén, T.; Piel, E.; Gross, H.-G.
2009-01-01
Paper Submitted for review at the Eighth International Conference on Generative Programming and Component Engineering. Model-based testing can be a powerful means to generate test cases for the system under test. However, creating a useful model for model-based testing requires expertise in the (fo
Rule-based decision making model
A rule-based decision making model is designed in the G2 environment. A theoretical and methodological frame for the model is composed and motivated. The rule-based decision making model is based on object-oriented modelling, knowledge engineering and decision theory. The idea of a safety objective tree is utilized. Advanced rule-based methodologies are applied. A general decision making model 'decision element' is constructed. The strategy planning of the decision element is based on e.g. value theory and utility theory. A hypothetical process model is built to give input data for the decision element. The basic principle of the object model in decision making is division into tasks. Probability models are used in characterizing component availabilities. Bayes' theorem is used to recalculate the probability figures when new information is obtained. The model includes simple learning features to save the solution path. A decision analytic interpretation is given to the decision making process. (author)
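The Bayesian recalculation step mentioned above lends itself to a compact sketch: assuming a conjugate Beta prior over a component's availability (the prior pseudo-counts are invented), each new observation updates the posterior in closed form.

```python
from dataclasses import dataclass

@dataclass
class BetaAvailability:
    alpha: float = 9.0   # prior pseudo-counts, i.e. ~0.9 prior mean availability
    beta: float = 1.0

    def update(self, available: bool) -> None:
        # With a conjugate Beta prior, Bayes' theorem reduces to count updates.
        if available:
            self.alpha += 1
        else:
            self.beta += 1

    @property
    def mean(self) -> float:
        return self.alpha / (self.alpha + self.beta)

component = BetaAvailability()
for observation in [True, True, False, True]:   # new information from the process
    component.update(observation)
    print(f"posterior mean availability = {component.mean:.3f}")
```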
Lawler, J.J.; Edwards, T.C.
2002-01-01
The ability to predict species occurrences quickly is often crucial for managers and conservation biologists with limited time and funds. We used measured associations with landscape patterns to build accurate predictive habitat models that were quickly and easily applied (i.e., required no additional data collection in the field to make predictions). We used classification trees (a nonparametric alternative to discriminant function analysis, logistic regression, and other generalized linear models) to model nesting habitat of red-naped sapsuckers (Sphyrapicus nuchalis), northern flickers (Colaptes auratus), tree swallows (Tachycineta bicolor), and mountain chickadees (Parus gambeli) in the Uinta Mountains of northeastern Utah, USA. We then tested the predictive capability of the models with independent data collected in the field the following year. The models built for the northern flicker, red-naped sapsucker, and tree swallow were relatively accurate (84%, 80%, and 75% nests correctly classified, respectively) compared to the models for the mountain chickadee (50% nests correctly classified). All four models were more selective than a null model that predicted habitat based solely on a gross association with aspen forests. We conclude that associations with landscape patterns can be used to build relatively accurate, easy to use, predictive models for some species. Our results stress, however, that both selecting the proper scale at which to assess landscape associations and empirically testing the models derived from those associations are crucial for building useful predictive models.
Electrical Compact Modeling of Graphene Base Transistors
Sébastien Frégonèse
2015-11-01
Following the recent development of the Graphene Base Transistor (GBT), a new electrical compact model for GBT devices is proposed. The transistor model includes the quantum capacitance model to obtain a self-consistent base potential. It also uses a versatile transfer current equation to be compatible with the different possible GBT configurations, and it accounts for high injection conditions thanks to a transit time based charge model. Finally, the developed large signal model has been implemented in Verilog-A code and can be used for simulation in a standard circuit design environment such as Cadence or ADS. This model has been verified using advanced numerical simulation.
EPR-based material modelling of soils
Faramarzi, Asaad; Alani, Amir M.
2013-04-01
In the past few decades, as a result of the rapid developments in computational software and hardware, alternative computer aided pattern recognition approaches have been introduced to modelling many engineering problems, including constitutive modelling of materials. The main idea behind pattern recognition systems is that they learn adaptively from experience and extract various discriminants, each appropriate for its purpose. In this work an approach is presented for developing material models for soils based on evolutionary polynomial regression (EPR). EPR is a recently developed hybrid data mining technique that searches for structured mathematical equations (representing the behaviour of a system) using genetic algorithm and the least squares method. Stress-strain data from triaxial tests are used to train and develop EPR-based material models for soil. The developed models are compared with some of the well-known conventional material models and it is shown that EPR-based models can provide a better prediction for the behaviour of soils. The main benefits of using EPR-based material models are that it provides a unified approach to constitutive modelling of all materials (i.e., all aspects of material behaviour can be implemented within a unified environment of an EPR model); it does not require any arbitrary choice of constitutive (mathematical) models. In EPR-based material models there are no material parameters to be identified. As the model is trained directly from experimental data, EPR-based material models are the shortest route from experimental research (data) to numerical modelling. Another advantage of EPR-based constitutive models is that as more experimental data become available, the quality of the EPR prediction can be improved by learning from the additional data, and therefore, the EPR model can become more effective and robust. The developed EPR-based material models can be incorporated in finite element (FE) analysis.
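The least-squares half of EPR can be sketched compactly: one candidate polynomial structure is fitted to synthetic triaxial-style stress-strain data by ordinary least squares. The genetic-algorithm search over candidate structures, which is the heart of EPR, is omitted, and the data and exponents are invented.

```python
import numpy as np

rng = np.random.default_rng(2)
strain = np.linspace(0.001, 0.05, 40)
stress = 120.0*strain/(1.0 + 8.0*strain) + rng.normal(0, 0.05, 40)  # synthetic data

# One candidate structure (of many EPR would try): stress = a0 + a1*eps + a2*eps^2
A = np.column_stack([np.ones_like(strain), strain, strain**2])
coef, *_ = np.linalg.lstsq(A, stress, rcond=None)
pred = A @ coef
print("coefficients:", np.round(coef, 3))
print("RMSE:", np.sqrt(np.mean((pred - stress)**2)))
```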
Kurtev, I.; Bézivin, J.; Jouault, F.; Valduriez, P.
2006-01-01
More than five years ago, the OMG proposed the Model Driven Architecture (MDA™) approach to deal with the separation of platform dependent and independent aspects in information systems. Since then, the initial idea of MDA evolved and Model Driven Engineering (MDE) is being increasingly promoted to
Coopersmith, Evan Joseph
regime curve data and facilitate the development of cluster-specific algorithms. Given the desire to enable intelligent decision-making at any location, this classification system is developed in a manner that will allow for classification anywhere in the U.S., even in an ungauged basin. Daily time series data from 428 catchments in the MOPEX database are analyzed to produce an empirical classification tree, partitioning the United States into regions of hydroclimatic similarity. In constructing a classification tree based upon 55 years of data, it is important to recognize the non-stationary nature of climate data. The shifts in climatic regimes will cause certain locations to shift their ultimate position within the classification tree, requiring decision-makers to alter land usage, farming practices, and equipment needs, and algorithms to adjust accordingly. This work adapts the classification model to address the issue of regime shifts over larger temporal scales and suggests how land-usage and farming protocol may vary from hydroclimatic shifts in decades to come. Finally, the generalizability of the hydroclimatic classification system is tested with a physically-based soil moisture model calibrated at several locations throughout the continental United States. The soil moisture model is calibrated at a given site and then applied with the same parameters at other sites within and outside the same hydroclimatic class. The model's performance deteriorates minimally if the calibration and validation locations are within the same hydroclimatic class, but deteriorates significantly if the calibration and validation sites are located in different hydroclimatic classes. These soil moisture estimates at the field scale are then further refined by the introduction of LiDAR elevation data, distinguishing faster-drying peaks and ridges from slower-drying valleys. The inclusion of LiDAR enabled multiple locations within the same field to be predicted accurately despite non
The Culture Based Model: Constructing a Model of Culture
Young, Patricia A.
2008-01-01
Recent trends reveal that models of culture aid in mapping the design and analysis of information and communication technologies. Therefore, models of culture are powerful tools to guide the building of instructional products and services. This research examines the construction of the culture based model (CBM), a model of culture that evolved…
Finite mixture models and model-based clustering
Volodymyr Melnykov
2010-01-01
Finite mixture models have a long history in statistics, having been used to model population heterogeneity, generalize distributional assumptions, and lately, for providing a convenient yet formal framework for clustering and classification. This paper provides a detailed review into mixture models and model-based clustering. Recent trends as well as open problems in the area are also discussed.
Probabilistic Model-Based Safety Analysis
Güdemann, Matthias; 10.4204/EPTCS.28.8
2010-01-01
Model-based safety analysis approaches aim at finding critical failure combinations by analysis of models of the whole system (i.e. software, hardware, failure modes and environment). The advantage of these methods compared to traditional approaches is that the analysis of the whole system gives more precise results. Only a few model-based approaches have been applied to answer quantitative questions in safety analysis, often limited to analysis of specific failure propagation models, limited types of failure modes or without system dynamics and behavior, as direct quantitative analysis uses large amounts of computing resources. New achievements in the domain of (probabilistic) model-checking now allow for overcoming this problem. This paper shows how functional models based on synchronous parallel semantics, which can be used for system design, implementation and qualitative safety analysis, can be directly re-used for (model-based) quantitative safety analysis. Accurate modeling of different types of proba...
P-Graph-based Workflow Modelling
József Tick
2007-01-01
Workflow modelling has been successfully introduced and implemented in several application fields. Therefore, its significance has increased dramatically. Several workflow modelling techniques have been published so far, out of which quite a number are widespread applications. For instance the Petri-Net-based modelling has become popular partly due to its graphical design and partly due to its correct mathematical background. The workflow modelling based on Unified Modelling Language is important...
Stochastic Modelling for Condition Based Maintenance
Han, Zehan
2015-01-01
This Master's thesis covers almost all aspects of Condition Based Maintenance (CBM). All objectives in Chapter 1 are met. The thesis comprises three main parts. The first part introduces the world of CBM to readers. This part presents data acquisition, data processing and databases, which are the foundation of CBM. It then highlights models, divided into physics based models, data-driven models and hybrid models, for diagnostic and prognostic use. Three promising diagnostic and p...
Base Flow Model Validation Project
National Aeronautics and Space Administration — The innovation is the systematic "building-block" validation of CFD/turbulence models employing a GUI driven CFD code (RPFM) and existing as well as new data sets...
Firm Based Trade Models and Turkish Economy
Nilüfer ARGIN
2015-12-01
Among all international trade models, only Firm Based Trade Models explain firms' actions and behavior in world trade. Firm Based Trade Models focus on the trade behavior of the individual firms that actually conduct intra-industry trade, and can truly explain the globalization process. These approaches also include multinational cooperation, supply chains and outsourcing. Our paper aims to explain and analyze Turkish exports in the context of Firm Based Trade Models. We use UNCTAD data on exports by SITC Rev 3 categorization to explain total exports and 255 products, and calculate the intensive and extensive margins of Turkish firms.
Distributed Prognostics Based on Structural Model Decomposition
National Aeronautics and Space Administration — Within systems health management, prognostics focuses on predicting the remaining useful life of a system. In the model-based prognostics paradigm, physics-based...
Jónsdóttir, Kristjana Ýr; Schmiegel, Jürgen; Jensen, Eva Bjørn Vedel
2008-01-01
In the present paper, we give a condensed review, for the nonspecialist reader, of a new modelling framework for spatio-temporal processes, based on Lévy theory. We show the potential of the approach in stochastic geometry and spatial statistics by studying Lévy-based growth modelling of planar...... objects. The growth models considered are spatio-temporal stochastic processes on the circle. As a by-product, flexible new models for space–time covariance functions on the circle are provided. An application of the Lévy-based growth models to tumour growth is discussed....
Traceability in Model-Based Testing
Mathew George
2012-11-01
The growing complexities of software and the demand for shorter time to market are two important challenges that face today’s IT industry. These challenges demand the increase of both productivity and quality of software. Model-based testing is a promising technique for meeting these challenges. Traceability modeling is a key issue and challenge in model-based testing. Relationships between the different models will help to navigate from one model to another, and trace back to the respective requirements and the design model when the test fails. In this paper, we present an approach for bridging the gaps between the different models in model-based testing. We propose relation definition markup language (RDML) for defining the relationships between models.
Hybrid data mining-regression for infrastructure risk assessment based on zero-inflated data
Infrastructure disaster risk assessment seeks to estimate the probability of a given customer or area losing service during a disaster, sometimes in conjunction with estimating the duration of each outage. This is often done on the basis of past data about the effects of similar events impacting the same or similar systems. In many situations this past performance data from infrastructure systems is zero-inflated; it has more zeros than can be appropriately modeled with standard probability distributions. The data are also often non-linear and exhibit threshold effects due to the complexities of infrastructure system performance. Standard zero-inflated statistical models such as zero-inflated Poisson and zero-inflated negative binomial regression models do not adequately capture these complexities. In this paper we develop a novel method that is a hybrid classification tree/regression method for complex, zero-inflated data sets. We investigate its predictive accuracy based on a large number of simulated data sets and then demonstrate its practical usefulness with an application to hurricane power outage risk assessment for a large utility based on actual data from the utility. While formulated for infrastructure disaster risk assessment, this method is promising for data-driven analysis for other situations with zero-inflated, complex data exhibiting response thresholds.
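A minimal sketch of the hybrid idea under stated assumptions: a classification tree first separates zero from nonzero outcomes, and a regressor fit only on the nonzero cases supplies the conditional magnitude. The simulated covariates and zero-inflated counts are fabricated, and the paper's actual algorithm is not reproduced.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)
X = rng.normal(size=(1500, 4))   # e.g. wind speed, rainfall, asset density, age
# Threshold effect: outages become likely only once X[:, 0] is large enough.
nonzero = rng.random(1500) < 1/(1 + np.exp(-(1.5*X[:, 0] - 1)))
y = np.where(nonzero, rng.poisson(np.exp(1 + 0.8*X[:, 1])), 0)

clf = DecisionTreeClassifier(max_depth=3).fit(X, y > 0)          # zero vs nonzero
reg = RandomForestRegressor(n_estimators=100, random_state=0).fit(X[y > 0], y[y > 0])

def predict(X_new):
    # Expected count = P(nonzero) * E[count | nonzero]
    return clf.predict_proba(X_new)[:, 1] * reg.predict(X_new)

print("predicted:", np.round(predict(X[:5]), 2), " observed:", y[:5])
```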
Hibbard, Bill
2011-01-01
At the recent AGI-11 Conference Orseau and Ring, and Dewey, described problems, including self-delusion, with the behavior of AIXI agents using various definitions of utility functions. An agent's utility function is defined in terms of the agent's history of interactions with its environment. This paper argues that the behavior problems can be avoided by formulating the utility function in two steps: 1) inferring a model of the environment from interactions, and 2) computing utility as a function of the environment model. The paper also argues that agents will not choose to modify their utility functions.
Li, Ruijiang; Jia, Xun; Zhao, Tianyu; Lamb, James; Yang, Deshan; Low, Daniel A; Jiang, Steve B
2010-01-01
Organ motion induced by respiration may cause clinically significant targeting errors and greatly degrade the effectiveness of conformal radiotherapy. It is therefore crucial to be able to model respiratory motion accurately. A recently proposed lung motion model based on principal component analysis (PCA) has been shown to be promising on a few patients. However, there is still a need to understand the underlying reason why it works. In this paper, we present a much deeper and detailed analysis of the PCA-based lung motion model. We provide the theoretical justification of the effectiveness of PCA in modeling lung motion. We also prove that under certain conditions, the PCA motion model is equivalent to 5D motion model, which is based on physiology and anatomy of the lung. The modeling power of PCA model was tested on clinical data and the average 3D error was found to be below 1 mm.
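The core of such a PCA motion model can be sketched in a few lines: displacement fields sampled over the breathing cycle are stacked into a matrix, reduced by SVD, and any phase is reconstructed from a few principal components. The voxel grid and amplitudes below are synthetic stand-ins for real deformation fields.

```python
import numpy as np

n_voxels, n_phases = 5000, 10
t = np.linspace(0, 2*np.pi, n_phases, endpoint=False)
amp = np.random.default_rng(4).uniform(0.5, 3.0, n_voxels)   # per-voxel amplitude (mm)
# Rows: breathing phases; columns: stacked voxel displacements.
D = np.outer(np.sin(t), amp) + 0.3*np.outer(np.sin(2*t), amp[::-1])

mean = D.mean(axis=0)
U, s, Vt = np.linalg.svd(D - mean, full_matrices=False)
k = 2                                   # keep the leading principal components
scores, basis = U[:, :k]*s[:k], Vt[:k]

recon = mean + scores @ basis
print(f"max reconstruction error with {k} PCs: {np.abs(recon - D).max():.2e} mm")
```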
An acoustical model based monitoring network
Wessels, P.W.; Basten, T.G.H.; Eerden, F.J.M. van der
2010-01-01
In this paper the approach for an acoustical model based monitoring network is demonstrated. This network is capable of reconstructing a noise map, based on the combination of measured sound levels and an acoustic model of the area. By pre-calculating the sound attenuation within the network the noi
Model Validation in Ontology Based Transformations
Jesús M. Almendros-Jiménez
2012-10-01
Model Driven Engineering (MDE) is an emerging approach of software engineering. MDE emphasizes the construction of models from which the implementation should be derived by applying model transformations. The Ontology Definition Meta-model (ODM) has been proposed as a profile for UML models of the Web Ontology Language (OWL). In this context, transformations of UML models can be mapped into ODM/OWL transformations. On the other hand, model validation is a crucial task in model transformation. Meta-modeling permits giving a syntactic structure to source and target models. However, semantic requirements have to be imposed on source and target models. A given transformation will be sound when source and target models fulfill the syntactic and semantic requirements. In this paper, we present an approach for model validation in ODM based transformations. Adopting a logic programming based transformational approach, we will show how it is possible to transform and validate models. Properties to be validated range from structural and semantic requirements of models (pre and post conditions) to properties of the transformation (invariants). The approach has been applied to a well-known example of model transformation: the Entity-Relationship (ER) to Relational Model (RM) transformation.
Agent-based pedestrian modelling
Batty, Michael
2003-01-01
When the focus of interest in geographical systems is at the very fine scale, at the level of streets and buildings for example, movement becomes central to simulations of how spatial activities are used and develop. Recent advances in computing power and the acquisition of fine scale digital data now mean that we are able to attempt to understand and predict such phenomena with the focus in spatial modelling changing to dynamic simulations of the individual and collective beha...
An Agent Based Classification Model
Gu, Feng; Aickelin, Uwe; Greensmith, Julie
2009-01-01
The major function of this model is to access the UCI Wisconsin Breast Cancer data-set[1] and classify the data items into two categories, which are normal and anomalous. This kind of classification can be referred to as anomaly detection, which discriminates anomalous behaviour from normal behaviour in computer systems. One popular solution for anomaly detection is Artificial Immune Systems (AIS). AIS are adaptive systems inspired by theoretical immunology and observed immune functions, p...
Model Based Control of Solidification
Furenes, Beathe
2009-01-01
The objective of this thesis is to develop models for use in the control of a solidification process. Solidification is the phase change from liquid to solid, and takes place in many important processes ranging from production engineering to solid-state physics. Often during solidification, undesired effects like e.g. variation of composition, microstructure, etc. occur. The solidification structure and its associated defects often persist throughout the subsequent operations, and thus good co...
Integration of Simulink Models with Component-based Software Models
Marian, Nicolae
2008-01-01
constructs and process flow, then software code is generated. A Simulink model is a representation of the design or implementation of a physical system that satisfies a set of requirements. A software component-based system aims to organize system architecture and behaviour as a means of computation...... constraints. COMDES (Component-based Design of Software for Distributed Embedded Systems) is such a component-based system framework developed by the software engineering group of Mads Clausen Institute for Product Innovation (MCI), University of Southern Denmark. Once specified, the software model has to be......Model based development aims to facilitate the development of embedded control systems by emphasizing the separation of the design level from the implementation level. Model based design involves the use of multiple models that represent different views of a system, having different semantics of...
A Multiple Model Approach to Modeling Based on LPF Algorithm
Anonymous
2001-01-01
Input-output data fitting methods are often used for unknown-structure nonlinear system modeling. Based on model-on-demand tactics, a multiple model approach to modeling for nonlinear systems is presented. The basic idea is to find, from vast historical system input-output data sets, some data sets matching the current working point, and then to develop a local model using the Local Polynomial Fitting (LPF) algorithm. As the working point changes, multiple local models are built, which realize exact modeling of the global system. Compared with other methods, the simulation results show good performance owing to the approach's simple, effective and reliable estimation.
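A compact sketch of the model-on-demand step, assuming Gaussian kernel weights around the current working point and a locally weighted linear fit as the local polynomial model; the input-output data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(5)
u = rng.uniform(-3, 3, 400)                    # historical inputs
y = np.sin(u) + 0.1*rng.normal(size=400)       # historical outputs

def local_fit(u_query, bandwidth=0.5):
    # Weight historical samples by closeness to the current working point.
    w = np.exp(-0.5*((u - u_query)/bandwidth)**2)
    A = np.column_stack([np.ones_like(u), u - u_query])
    WA = A * w[:, None]
    theta = np.linalg.solve(A.T @ WA, WA.T @ y)  # weighted least squares
    return theta[0]                              # local model value at the query

for q in (-2.0, 0.0, 1.5):
    print(f"u={q:+.1f}: local model {local_fit(q):+.3f}, truth {np.sin(q):+.3f}")
```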
Model-based internal wave processing
Candy, J.V.; Chambers, D.H.
1995-06-09
A model-based approach is proposed to solve the oceanic internal wave signal processing problem that is based on state-space representations of the normal-mode vertical velocity and plane wave horizontal velocity propagation models. It is shown that these representations can be utilized to spatially propagate the modal (depth) vertical velocity functions given the basic parameters (wave numbers, Brunt-Vaisala frequency profile etc.) developed from the solution of the associated boundary value problem as well as the horizontal velocity components. Based on this framework, investigations are made of model-based solutions to the signal enhancement problem for internal waves.
Tools for model-based security engineering: models vs. code
Jürjens, Jan; Yu, Yijun
2007-01-01
We present tools to support model-based security engineering on both the model and the code level. In the approach supported by these tools, one firstly specifies the security-critical part of the system (e.g. a crypto protocol) using the UML security extension UMLsec. The models are automatically verified for security properties using automated theorem provers. These are implemented within a framework that supports implementing verification routines, based on XMI output of the diagrams from ...
Navigation based on symbolic space models
Baras, Karolina; Moreira, Adriano; Meneses, Filipe
2010-01-01
Existing navigation systems are very appropriate for car navigation, but lack support for convenient pedestrian navigation and cannot be used indoors due to GPS limitations. In addition, the creation and the maintenance of the required models are costly and time consuming, and are usually based on proprietary data structures. In this paper we describe a navigation system based on a human inspired symbolic space model. We argue that symbolic space models are much easier...
P-Graph-based Workflow Modelling
József Tick
2007-03-01
Full Text Available Workflow modelling has been successfully introduced and implemented in several application fields, and its significance has therefore increased dramatically. Several workflow modelling techniques have been published so far, out of which quite a number are widespread applications. For instance, Petri-Net-based modelling has become popular partly due to its graphical design and partly due to its correct mathematical background. Workflow modelling based on the Unified Modelling Language is important because of its practical usage. This paper introduces and examines the workflow modelling technique based on the Process-graph as a possible new solution next to the already existing modelling techniques.
IP Network Management Model Based on NGOSS
ZHANG Jin-yu; LI Hong-hui; LIU Feng
2004-01-01
This paper addresses a management model for IP networks based on the Next Generation Operation Support System (NGOSS). It bases network management on all the operational actions of the ISP and provides QoS to user services by managing end-to-end Service Level Agreements (SLAs) along the whole path. Based on web and coordination technology, the paper gives an implementation architecture for this model.
Gradient-based model calibration with proxy-model assistance
Burrows, Wesley; Doherty, John
2016-02-01
Use of a proxy model in gradient-based calibration and uncertainty analysis of a complex groundwater model with large run times and problematic numerical behaviour is described. The methodology is general, and can be used with models of all types. The proxy model is based on a series of analytical functions that link all model outputs used in the calibration process to all parameters requiring estimation. In enforcing history-matching constraints during the calibration and post-calibration uncertainty analysis processes, the proxy model is run for the purposes of populating the Jacobian matrix, while the original model is run when testing parameter upgrades; the latter process is readily parallelized. Use of a proxy model in this fashion dramatically reduces the computational burden of complex model calibration and uncertainty analysis. At the same time, the effect of model numerical misbehaviour on calculation of local gradients is mitigated, thus allowing access to the benefits of gradient-based analysis where lack of integrity in finite-difference derivatives calculation would otherwise have impeded such access. Construction of a proxy model, and its subsequent use in calibration of a complex model, and in analysing the uncertainties of predictions made by that model, is implemented in the PEST suite.
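A minimal sketch of this division of labour, assuming hypothetical `expensive_model` and `proxy_jacobian` callables (a generic damped Gauss-Newton loop, not the PEST implementation itself):

```python
import numpy as np

def calibrate(expensive_model, proxy_jacobian, y_obs, p0, n_iter=20, lam=1e-2):
    """Damped Gauss-Newton history matching: the cheap proxy fills the
    Jacobian, while candidate parameter upgrades are tested on the full model."""
    p = p0.copy()
    best_phi = np.sum((expensive_model(p) - y_obs) ** 2)
    for _ in range(n_iter):
        J = proxy_jacobian(p)              # cheap: proxy populates the Jacobian
        r = y_obs - expensive_model(p)     # expensive: one full-model run
        dp = np.linalg.solve(J.T @ J + lam * np.eye(len(p)), J.T @ r)
        phi = np.sum((expensive_model(p + dp) - y_obs) ** 2)
        if phi < best_phi:                 # accept the upgrade, relax damping
            p, best_phi, lam = p + dp, phi, lam / 2
        else:                              # reject it, increase damping
            lam *= 10
    return p
```

The trial runs of `expensive_model(p + dp)` for different candidate steps are independent of one another, which is why the upgrade-testing stage parallelizes readily, as the abstract notes.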
Agent-based modeling and network dynamics
Namatame, Akira
2016-01-01
The book integrates agent-based modeling and network science. It is divided into three parts, namely, foundations, primary dynamics on and of social networks, and applications. The book begins with the network origin of agent-based models, known as cellular automata, and introduces a number of classic models, such as Schelling's segregation model and Axelrod's spatial game. The essence of the foundation part is the network-based agent-based models in which agents follow network-based decision rules. Under the influence of the substantial progress in network science in the late 1990s, these models have been extended from using lattices into using small-world networks, scale-free networks, etc. The book also shows that the modern network science mainly driven by game-theorists and sociophysicists has inspired agent-based social scientists to develop alternative formation algorithms, known as agent-based social networks. The book reviews a number of pioneering and representative models in this family. Upon the gi...
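As a concrete taste of the lattice-based models the book starts from, below is a minimal sketch of Schelling-style segregation on a 50x50 grid; the tolerance threshold and densities are illustrative choices, not values from the book.

```python
import numpy as np

rng = np.random.default_rng(0)
grid = rng.choice([0, 1, 2], size=(50, 50), p=[0.1, 0.45, 0.45])  # 0 = empty cell

def unhappy(g, i, j, threshold=0.375):
    """An agent is unhappy if too few occupied Moore neighbours share its type."""
    nbrs = g[max(i - 1, 0):i + 2, max(j - 1, 0):j + 2]
    same = np.sum(nbrs == g[i, j]) - 1           # exclude the agent itself
    occupied = np.sum(nbrs != 0) - 1
    return occupied > 0 and same / occupied < threshold

for _ in range(100_000):                         # random sequential updates
    i, j = rng.integers(50, size=2)
    if grid[i, j] != 0 and unhappy(grid, i, j):
        empties = np.argwhere(grid == 0)
        ei, ej = empties[rng.integers(len(empties))]
        grid[ei, ej], grid[i, j] = grid[i, j], 0  # relocate to a random empty cell
```

Even this mild preference rule drives the lattice toward strongly segregated clusters, the emergent outcome the model is famous for.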
Model-Based Clustering of Large Networks
Vu, Duy Quang; Schweinberger, Michael
2012-01-01
We describe a network clustering framework, based on finite mixture models, that can be applied to discrete-valued networks with hundreds of thousands of nodes and billions of edge variables. Relative to other recent model-based clustering work for networks, we introduce a more flexible modeling framework, improve the variational-approximation estimation algorithm, discuss and implement standard error estimation via a parametric bootstrap approach, and apply these methods to much larger datasets than those seen elsewhere in the literature. The more flexible modeling framework is achieved through introducing novel parameterizations of the model, giving varying degrees of parsimony, using exponential family models whose structure may be exploited in various theoretical and algorithmic ways. The algorithms, which we show how to adapt to the more complicated optimization requirements introduced by the constraints imposed by the novel parameterizations we propose, are based on variational generalized EM algorithms...
Simulation-based Manufacturing System Modeling
卫东; 金烨; 范秀敏; 严隽琪
2003-01-01
In recent years, computer simulation has proven to be a very advantageous technique for studying resource-constrained manufacturing systems. This paper presents an object-oriented simulation modeling method which combines the merits of traditional methods such as IDEF0 and Petri Nets. A four-layer-one-angle hierarchical modeling framework based on OOP is defined, and the modeling description of these layers is expounded, including hybrid production control modeling and human resource dispatch modeling. To validate the modeling method, a case study of an auto product line in a motor manufacturing company has been carried out.
Energy based prediction models for building acoustics
Brunskog, Jonas
2012-01-01
In order to reach robust and simplified yet accurate prediction models, energy based principles are commonly used in many fields of acoustics, especially in building acoustics. This includes simple energy flow models, the framework of statistical energy analysis (SEA), as well as more elaborate...... principles such as wave intensity analysis (WIA). The European standards for building acoustic predictions, the EN 12354 series, are based on energy flow and SEA principles. In the present paper, different energy based prediction models are discussed and critically reviewed. Special attention is placed on...
Model-based reasoning and large-knowledge bases
In such engineering fields as nuclear power plant engineering, technical information expressed in the form of schematics is frequently used. A new paradigm for model-based reasoning (MBR) and an AI tool called PLEXSYS (plant expert system) using this paradigm have been developed. PLEXSYS and the underlying paradigm are specifically designed to handle schematic drawings, by expressing drawings as models and supporting various sophisticated searches on these models. Two application systems have been constructed with PLEXSYS: one generates PLEXSYS models from existing CAD data files, and the other provides functions for nuclear power plant design support. Since the models can be generated from existing data resources, the design support system automatically has full access to a large-scale model or knowledge base representing actual nuclear power plants. (author)
Ground-Based Telescope Parametric Cost Model
Stahl, H. Philip; Rowell, Ginger Holmes
2004-01-01
A parametric cost model for ground-based telescopes is developed using multi-variable statistical analysis. The model includes both engineering and performance parameters. While diameter continues to be the dominant cost driver, other significant factors include primary mirror radius of curvature and diffraction limited wavelength. The model includes an explicit factor for primary mirror segmentation and/or duplication (i.e., multi-telescope phased-array systems). Additionally, single variable models based on aperture diameter are derived. This analysis indicates that recent mirror technology advances have indeed reduced the historical telescope cost curve.
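The single-variable models mentioned above are typically power laws of aperture diameter, cost ≈ a·D^b. A minimal sketch of fitting one by linear regression in log-log space (the diameter/cost pairs below are invented for illustration, not the paper's data):

```python
import numpy as np

# Hypothetical (aperture diameter in m, cost in M$) pairs, for illustration only
D = np.array([2.5, 3.5, 4.2, 6.5, 8.1, 10.0])
C = np.array([6.0, 14.0, 22.0, 70.0, 120.0, 210.0])

# Fit cost = a * D**b as a straight line in log-log space
b, log_a = np.polyfit(np.log(D), np.log(C), 1)
a = np.exp(log_a)
print(f"cost ~ {a:.2f} * D^{b:.2f}")  # the exponent b is the slope of the cost curve
```

A technology advance that "reduces the historical cost curve" shows up as a drop in a (and sometimes b) when the fit is repeated on newer telescopes.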
Software Testing Method Based on Model Comparison
XIE Xiao-dong; LU Yan-sheng; MAO Cheng-yin
2008-01-01
A model comparison based software testing method (MCST) is proposed. In this method, the requirements and programs of the software under test are transformed into the same form and described by the same model description language (MDL). The requirements are transformed into a specification model and the programs into an implementation model. The elements and structures of the two models are then compared, and the differences between them are obtained. Based on the differences, a test suite is generated. Different MDLs can be chosen for the software under test. The usage of two classical MDLs in MCST, the equivalence classes model and the extended finite state machine (EFSM) model, is described with example applications. The results show that the test suites generated by MCST are more efficient and smaller than those of some other testing methods, such as the path-coverage testing method, the object state diagram testing method, etc.
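A toy illustration of the comparison step, assuming both models have already been translated into a common form (here plain sets of state-machine transitions; the states and events are invented, and real EFSM comparison is richer than set difference):

```python
# Specification and implementation expressed in the same form:
# sets of (state, event, next_state) transitions
spec = {("idle", "coin", "ready"), ("ready", "push", "idle"),
        ("ready", "coin", "ready")}
impl = {("idle", "coin", "ready"), ("ready", "push", "idle")}

# The differences between the two models drive test generation
missing = spec - impl   # required behaviour absent from the implementation
extra   = impl - spec   # implemented behaviour never specified

for state, event, target in sorted(missing | extra):
    print(f"test: in state {state!r}, apply {event!r}, expect transition to {target!r}")
```

Because tests are generated only for the differences, the resulting suite can stay smaller than one derived from full path coverage.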
Sketch-based Interfaces and Modeling
Jorge, Joaquim
2011-01-01
The field of sketch-based interfaces and modeling (SBIM) is concerned with developing methods and techniques to enable users to interact with a computer through sketching - a simple, yet highly expressive medium. SBIM blends concepts from computer graphics, human-computer interaction, artificial intelligence, and machine learning. Recent improvements in hardware, coupled with new machine learning techniques for more accurate recognition, and more robust depth inferencing techniques for sketch-based modeling, have resulted in an explosion of both sketch-based interfaces and pen-based computing.
Location-based Modeling and Analysis: Tropos-based Approach
Ali, Raian; Dalpiaz, Fabiano; Giorgini, Paolo
2008-01-01
The continuous growth of interest in mobile applications makes the concept of location essential to design and develop software systems. Location-based software is supposed to be able to monitor the location and choose accordingly the most appropriate behavior. In this paper, we propose a novel conceptual framework to model and analyze location-based software. We mainly focus on the social facets of locations adopting concepts such as social actor, resource, and location-based behavior. Our a...
Model-based clustered-dot screening
Kim, Sang Ho
2006-01-01
I propose a halftone screen design method based on a human visual system model and the characteristics of the electro-photographic (EP) printer engine. Generally, screen design methods based on human visual models produce dispersed-dot type screens while design methods considering EP printer characteristics generate clustered-dot type screens. In this paper, I propose a cost function balancing the conflicting characteristics of the human visual system and the printer. By minimizing the obtained cost function, I design a model-based clustered-dot screen using a modified direct binary search algorithm. Experimental results demonstrate the superior quality of the model-based clustered-dot screen compared to a conventional clustered-dot screen.
Multiscale agent-based consumer market modeling.
North, M. J.; Macal, C. M.; St. Aubin, J.; Thimmapuram, P.; Bragen, M.; Hahn, J.; Karr, J.; Brigham, N.; Lacy, M. E.; Hampton, D.; Decision and Information Sciences; Procter & Gamble Co.
2010-05-01
Consumer markets have been studied in great depth, and many techniques have been used to represent them. These have included regression-based models, logit models, and theoretical market-level models, such as the NBD-Dirichlet approach. Although many important contributions and insights have resulted from studies that relied on these models, there is still a need for a model that could more holistically represent the interdependencies of the decisions made by consumers, retailers, and manufacturers. When the need is for a model that could be used repeatedly over time to support decisions in an industrial setting, it is particularly critical. Although some existing methods can, in principle, represent such complex interdependencies, their capabilities might be outstripped if they had to be used for industrial applications, because of the details this type of modeling requires. However, a complementary method - agent-based modeling - shows promise for addressing these issues. Agent-based models use business-driven rules for individuals (e.g., individual consumer rules for buying items, individual retailer rules for stocking items, or individual firm rules for advertising items) to determine holistic, system-level outcomes (e.g., to determine if brand X's market share is increasing). We applied agent-based modeling to develop a multi-scale consumer market model. We then conducted calibration, verification, and validation tests of this model. The model was successfully applied by Procter & Gamble to several challenging business problems. In these situations, it directly influenced managerial decision making and produced substantial cost savings.
Workflow-Based Dynamic Enterprise Modeling
黄双喜; 范玉顺; 罗海滨; 林慧萍
2002-01-01
Traditional systems for enterprise modeling and business process control are often static and cannot adapt to the changing environment. This paper presents a workflow-based method to dynamically execute the enterprise model. This method gives an explicit representation of the business process logic and the relationships between the elements involved in the process. An execution-oriented integrated enterprise modeling system is proposed in combination with other enterprise views. The enterprise model can be established and executed dynamically in the actual environment due to the dynamic properties of the workflow model.
Bayesian Network Based XP Process Modelling
Mohamed Abouelela
2010-07-01
Full Text Available A Bayesian Network based mathematical model has been used for modelling the Extreme Programming software development process. The model is capable of predicting the expected finish time and the expected defect rate for each XP release. Therefore, it can be used to determine the success/failure of any XP project. The model takes into account the effect of three XP practices, namely: Pair Programming, Test Driven Development and Onsite Customer practices. The model's predictions were validated against two case studies. Results show the precision of our model, especially in predicting the project finish time.
Econophysics of agent-based models
Aoyama, Hideaki; Chakrabarti, Bikas; Chakraborti, Anirban; Ghosh, Asim
2014-01-01
The primary goal of this book is to present the research findings and conclusions of physicists, economists, mathematicians and financial engineers working in the field of "Econophysics" who have undertaken agent-based modelling, comparison with empirical studies and related investigations. Most standard economic models assume the existence of the representative agent, who is “perfectly rational” and applies the utility maximization principle when taking action. One reason for this is the desire to keep models mathematically tractable: no tools are available to economists for solving non-linear models of heterogeneous adaptive agents without explicit optimization. In contrast, multi-agent models, which originated from statistical physics considerations, allow us to go beyond the prototype theories of traditional economics involving the representative agent. This book is based on the Econophys-Kolkata VII Workshop, at which many such modelling efforts were presented. In the book, leading researchers in the...
Evaluating face trustworthiness: a model based approach
Todorov, Alexander; Baron, Sean G.; Oosterhof, Nikolaas N.
2008-01-01
Judgments of trustworthiness from faces determine basic approach/avoidance responses and approximate the valence evaluation of faces that runs across multiple person judgments. Here, based on trustworthiness judgments and using a computer model for face representation, we built a model for representing face trustworthiness (study 1). Using this model, we generated novel faces with an increased range of trustworthiness and used these faces as stimuli in a functional Magnetic Resonance Imaging ...
Agent-Based Modeling in Systems Pharmacology.
Cosgrove, J; Butler, J; Alden, K; Read, M; Kumar, V; Cucurull-Sanchez, L; Timmis, J; Coles, M
2015-11-01
Modeling and simulation (M&S) techniques provide a platform for knowledge integration and hypothesis testing to gain insights into biological systems that would not be possible a priori. Agent-based modeling (ABM) is an M&S technique that focuses on describing individual components rather than homogenous populations. This tutorial introduces ABM to systems pharmacologists, using relevant case studies to highlight how ABM-specific strengths have yielded success in the area of preclinical mechanistic modeling. PMID:26783498
Agent-based Models of Financial Markets
Samanidou, E.; E. Zschischang; Stauffer, D.; Lux, T.
2007-01-01
This review deals with several microscopic (``agent-based'') models of financial markets which have been studied by economists and physicists over the last decade: Kim-Markowitz, Levy-Levy-Solomon, Cont-Bouchaud, Solomon-Weisbuch, Lux-Marchesi, Donangelo-Sneppen and Solomon-Levy-Huang. After an overview of simulation approaches in financial economics, we first give a summary of the Donangelo-Sneppen model of monetary exchange and compare it with related models in economics literature. Our sel...
Literature Survey on Model based Slicing
Sneh Krishna; Alekh Dwivedi
2014-01-01
Software testing is an activity which aims at evaluating a feature or capability of a system and determining whether it meets its required expectations. One way to ease this is the program slicing technique, which breaks large programs down into smaller ones; another is model based slicing, which breaks a large software architecture model down into smaller models at an early stage of the SDLC (Software Development Life Cycle). This is a novel methodology to extract the sub mode...
PARTICIPATION BASED MODEL OF SHIP CREW MANAGEMENT
Bielić, Toni; Ivanišević, Dalibor; Gundić, Ana
2014-01-01
This paper analyses the participation-based model on board ship as a possibly optimal leadership model in the shipping industry, with an accent on the decision-making process. The authors have tried to define the master's behaviour model and management style, identifying the drawbacks and disadvantages of a vertical, pyramidal organization with the master at the top. The paper describes the efficiency of decision making within a team organization and the optimization of a ship's organisation by introduci...
Model-based Abstraction of Data Provenance
Probst, Christian W.; Hansen, René Rydhof
2014-01-01
Identifying provenance of data provides insights to the origin of data and intermediate results, and has recently gained increased interest due to data-centric applications. In this work we extend a data-centric system view with actors handling the data and policies restricting actions. This extension is based on provenance analysis performed on system models. System models have been introduced to model and analyse spatial and organisational aspects of organisations, to identify, e.g., potent...
A probabilistic graphical model based stochastic input model construction
Model reduction techniques have been widely used in modeling of high-dimensional stochastic input in uncertainty quantification tasks. However, the probabilistic modeling of random variables projected into reduced-order spaces presents a number of computational challenges. Due to the curse of dimensionality, the underlying dependence relationships between these random variables are difficult to capture. In this work, a probabilistic graphical model based approach is employed to learn the dependence by running a number of conditional independence tests using observation data. Thus a probabilistic model of the joint PDF is obtained and the PDF is factorized into a set of conditional distributions based on the dependence structure of the variables. The estimation of the joint PDF from data is then transformed to estimating conditional distributions under reduced dimensions. To improve the computational efficiency, a polynomial chaos expansion is further applied to represent the random field in terms of a set of standard random variables. This technique is combined with both linear and nonlinear model reduction methods. Numerical examples are presented to demonstrate the accuracy and efficiency of the probabilistic graphical model based stochastic input models. - Highlights: • Data-driven stochastic input models without the assumption of independence of the reduced random variables. • The problem is transformed to a Bayesian network structure learning problem. • Examples are given in flows in random media
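A toy sketch of the kind of conditional-independence test that could drive such a factorization, using a Fisher-z test on partial correlations (the abstract does not name a specific test; this choice, and the synthetic data, are illustrative):

```python
import numpy as np
from scipy import stats

def ci_test(x, y, Z, alpha=0.05):
    """Fisher-z conditional-independence test: accept 'x independent of y
    given the columns of Z' when the partial correlation is insignificant."""
    def residual(v):                       # regress out the conditioning set
        A = np.column_stack([np.ones(len(v)), Z])
        return v - A @ np.linalg.lstsq(A, v, rcond=None)[0]
    r = np.corrcoef(residual(x), residual(y))[0, 1]
    z = 0.5 * np.log((1 + r) / (1 - r))    # Fisher transform of the partial corr.
    stat = abs(z) * np.sqrt(len(x) - Z.shape[1] - 3)
    return 2 * (1 - stats.norm.cdf(stat)) > alpha

rng = np.random.default_rng(1)
z = rng.normal(size=500)
x, y = z + rng.normal(size=500), 2 * z + rng.normal(size=500)
print(ci_test(x, y, z[:, None]))           # True: x and y are independent given z
```

Each accepted independence removes a dependence edge, so the joint PDF factorizes into lower-dimensional conditional distributions, as described above.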
Mineral resources estimation based on block modeling
Bargawa, Waterman Sulistyana; Amri, Nur Ali
2016-02-01
The estimation in this paper uses three kinds of block models: nearest neighbour polygon, inverse distance squared and ordinary kriging. These techniques are weighting schemes based on the principle that the block content is a linear combination of the grade data in the samples around the block being estimated. The case study is in the Pongkor area, a gold-silver resource believed to be shaped as quartz veins by a hydrothermal process of epithermal type. Resource modeling includes data entry, statistical and variography analysis, topographic and geological modeling, block model construction, estimation parameters, model presentation and tabulation of mineral resources. The skewed distribution is handled here by a robust semivariogram. The mineral resource classification generated in this model is based on an analysis of the kriging standard deviation and the number of samples used in the estimation of each block. The results are used to evaluate the performance of the OK and IDS estimators. Based on visual and statistical analysis, it is concluded that the OK model gives estimates closer to the data used for modeling.
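A minimal sketch of the inverse-distance-squared variant, the simplest of the three weighting schemes (the search radius, coordinates and grades below are invented for illustration):

```python
import numpy as np

def idw_block_grade(block_xy, sample_xy, sample_grade, power=2, radius=100.0):
    """Estimate a block grade as a linear combination of nearby sample grades,
    weighted by inverse distance raised to `power` (2 = inverse distance squared)."""
    d = np.linalg.norm(sample_xy - block_xy, axis=1)
    use = d < radius
    if not use.any():
        return np.nan                      # no samples inside the search radius
    w = 1.0 / np.maximum(d[use], 1e-6) ** power
    return np.sum(w * sample_grade[use]) / np.sum(w)

samples = np.array([[10.0, 20.0], [40.0, 35.0], [70.0, 60.0]])
grades = np.array([1.2, 0.8, 2.1])         # hypothetical Au grades, g/t
print(idw_block_grade(np.array([35.0, 30.0]), samples, grades))
```

Ordinary kriging replaces these deterministic weights with weights derived from the (robust) semivariogram, which also yields the kriging standard deviation used for classification.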
Modelling a Peroxidase-based Optical Biosensor
Baronas, Romas; Gaidamauskaite, Evelina; Kulys, Juozas
2007-01-01
The response of a peroxidase-based optical biosensor was modelled digitally. A mathematical model of the optical biosensor is based on a system of non-linear reaction-diffusion equations. The modelling biosensor comprises two compartments, an enzyme layer and an outer diffusion layer. The digital simulation was carried out using finite difference technique. The influence of the substrate concentration as well as of the thickness of both the enzyme and diffusion layers on the biosensor response was investigated. Calculations showed complex kinetics of the biosensor response, especially at low concentrations of the peroxidase and of the hydrogen peroxide.
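A minimal explicit finite-difference sketch of the kind of two-compartment reaction-diffusion computation described above (1-D substrate transport with Michaelis-Menten consumption in the enzyme layer; all parameter values are illustrative, not those of the paper, and the layer interface is handled crudely):

```python
import numpy as np

N = 100
d_e, d_d = 1e-4, 2e-4                        # enzyme / diffusion layer thickness (m)
x = np.linspace(0.0, d_e + d_d, N)
h = x[1] - x[0]
D = np.where(x < d_e, 3e-10, 6e-10)          # diffusivity in each layer (m^2/s)
Vmax, Km, S0 = 1e-4, 1e-2, 1.0               # kinetics and bulk substrate (mol/m^3)

S = np.zeros(N)
S[-1] = S0                                   # bulk side held at S0
dt = 0.4 * h**2 / D.max()                    # explicit stability limit
for _ in range(200_000):
    lap = (S[2:] - 2.0 * S[1:-1] + S[:-2]) / h**2
    react = np.where(x[1:-1] < d_e, Vmax * S[1:-1] / (Km + S[1:-1]), 0.0)
    S[1:-1] += dt * (D[1:-1] * lap - react)  # diffusion minus enzymatic consumption
    S[0] = S[1]                              # zero-flux boundary at the support
```

Varying S0, d_e and d_d in such a scheme reproduces the kind of layer-thickness and concentration studies the abstract describes.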
Information modelling and knowledge bases XXV
Tokuda, T; Jaakkola, H
2014-01-01
Because of our ever increasing use of and reliance on technology and information systems, information modelling and knowledge bases continue to be important topics in those academic communities concerned with data handling and computer science. As the information itself becomes more complex, so do the levels of abstraction and the databases themselves. This book is part of the series Information Modelling and Knowledge Bases, which concentrates on a variety of themes in the important domains of conceptual modeling, design and specification of information systems, multimedia information modelin
Neighborhood Mixture Model for Knowledge Base Completion
Nguyen, Dat Quoc; Sirts, Kairit; Qu, Lizhen; Johnson, Mark
2016-01-01
Knowledge bases are useful resources for many natural language processing tasks, however, they are far from complete. In this paper, we define a novel entity representation as a mixture of its neighborhood in the knowledge base and apply this technique on TransE-a well-known embedding model for knowledge base completion. Experimental results show that the neighborhood information significantly helps to improve the results of the TransE, leading to better performance than obtained by other sta...
Multiagent-Based Model For ESCM
Delia MARINCAS
2011-01-01
Web based applications for Supply Chain Management (SCM) are now a necessity for every company in order to meet the increasing customer demands, to face the global competition and to make profit. Multiagent-based approach is appropriate for eSCM because it shows many of the characteristics a SCM system should have. For this reason, we have proposed a multiagent-based eSCM model which configures a virtual SC, automates the SC activities: selling, purchasing, manufacturing, planning, inventory,...
Modelling carbon nanotubes-based mediatorless biosensor.
Baronas, Romas; Kulys, Juozas; Petrauskas, Karolis; Razumiene, Julija
2012-01-01
This paper presents a mathematical model of carbon nanotubes-based mediatorless biosensor. The developed model is based on nonlinear non-stationary reaction-diffusion equations. The model involves four layers (compartments): a layer of enzyme solution entrapped on a terylene membrane, a layer of the single walled carbon nanotubes deposited on a perforated membrane, and an outer diffusion layer. The biosensor response and sensitivity are investigated by changing the model parameters with a special emphasis on the mediatorless transfer of the electrons in the layer of the enzyme-loaded carbon nanotubes. The numerical simulation at transient and steady state conditions was carried out using the finite difference technique. The mathematical model and the numerical solution were validated by experimental data. The obtained agreement between the simulation results and the experimental data was admissible at different concentrations of the substrate. PMID:23012537
A stepwise-cluster microbial biomass inference model in food waste composting
A stepwise-cluster microbial biomass inference (SMI) model was developed by introducing stepwise-cluster analysis (SCA) into composting process modeling to tackle the nonlinear relationships among state variables and microbial activities. The essence of SCA is to form a classification tree based on a series of cutting or mergence processes according to given statistical criteria. Eight runs of designed experiments in bench-scale reactors in a laboratory were conducted to demonstrate the feasibility of the proposed method. The results indicated that SMI could help establish a statistical relationship between state variables and composting microbial characteristics, where discrete and nonlinear complexities exist. Significance levels of cutting/merging were provided such that the accuracies of the developed forecasting trees were controllable. Through an attempted definition of input effects on the output in SMI, the effects of the state variables on thermophilic bacteria were ranked in descending order as: Time (day) > moisture content (%) > ash content (%, dry) > Lower Temperature (deg. C) > pH > NH4+-N (mg/Kg, dry) > Total N (%, dry) > Total C (%, dry); the effects on mesophilic bacteria were ordered as: Time > Upper Temperature (deg. C) > Total N > moisture content > NH4+-N > Total C > pH. This study made the first attempt at applying SCA to mapping the nonlinear and discrete relationships in composting processes.
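SCA itself is not available in common libraries, but an ordinary regression tree gives a feel for statistically driven cutting and for ranking input effects. The sketch below uses scikit-learn's CART on synthetic composting-style data as a stand-in, not the SMI model itself:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
# Synthetic stand-ins for four state variables: time (d), moisture (%), ash (%), pH
X = rng.uniform([0, 40, 5, 6.0], [20, 70, 30, 8.5], size=(200, 4))
y = 5 + 0.3 * X[:, 0] + 0.1 * X[:, 1] + rng.normal(0, 0.5, 200)  # "biomass" response

tree = DecisionTreeRegressor(max_depth=4, min_samples_leaf=10).fit(X, y)
for name, imp in zip(["time", "moisture", "ash", "pH"], tree.feature_importances_):
    print(f"{name:>8}: {imp:.2f}")         # a rough analogue of 'input effects'
```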
MEGen: A Physiologically Based Pharmacokinetic Model Generator
George D. Loizou
2011-11-01
Full Text Available Physiologically based pharmacokinetic models are being used in an increasing number of different areas. These not only include the human safety assessment of pharmaceuticals, pesticides, biocides and environmental chemicals but also for food animal, wild mammal and avian risk assessment. The value of PBPK models is that they are tools for estimating tissue dosimetry by integrating in vitro and in vivo mechanistic, pharmacokinetic and toxicological information through their explicit mathematical description of important anatomical, physiological and biochemical determinants of chemical uptake, disposition and elimination. However, PBPK models are perceived as complex, data hungry, resource intensive and time consuming. In addition, model validation and verification are hindered by the relative complexity of the equations. To begin to address these issues a freely available web application for the rapid construction and documentation of bespoke PBPK models is under development. Here we present an overview of the current capabilities of MEGen, a model equation generator and parameter database and discuss future developments.
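At its core a PBPK model is a set of mass-balance ODEs over tissue compartments. A deliberately minimal flow-limited sketch follows (two tissues plus blood, i.v. bolus; all flows, volumes, partition coefficients and the clearance are invented for illustration and far simpler than anything MEGen would generate):

```python
from scipy.integrate import solve_ivp

Q = {"liver": 90.0, "rest": 260.0}               # tissue blood flows (L/h)
CO = sum(Q.values())                             # cardiac output
V = {"blood": 5.0, "liver": 1.8, "rest": 60.0}   # compartment volumes (L)
P = {"liver": 2.0, "rest": 1.2}                  # tissue:blood partition coefficients
CLint = 30.0                                     # hepatic intrinsic clearance (L/h)

def rhs(t, y):
    cb, cl, cr = y                               # blood, liver, rest-of-body conc.
    dcl = (Q["liver"] * (cb - cl / P["liver"])
           - CLint * cl / P["liver"]) / V["liver"]
    dcr = Q["rest"] * (cb - cr / P["rest"]) / V["rest"]
    venous = (Q["liver"] * cl / P["liver"] + Q["rest"] * cr / P["rest"]) / CO
    dcb = CO * (venous - cb) / V["blood"]        # mixed venous return
    return [dcb, dcl, dcr]

sol = solve_ivp(rhs, (0.0, 24.0), [10.0, 0.0, 0.0])  # 10 mg/L i.v. bolus at t = 0
print(sol.y[0, -1])                              # blood concentration after 24 h
```

A generator like MEGen automates writing exactly this kind of equation set, with many more compartments and documented parameter provenance.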
A model evaluation checklist for process-based environmental models
Jackson-Blake, Leah
2015-04-01
the conceptual model on which it is based. In this study, a number of model structural shortcomings were identified, such as a lack of dissolved phosphorus transport via infiltration excess overland flow, potential discrepancies in the particulate phosphorus simulation and a lack of spatial granularity. (4) Conceptual challenges, as conceptual models on which predictive models are built are often outdated, having not kept up with new insights from monitoring and experiments. For example, soil solution dissolved phosphorus concentration in INCA-P is determined by the Freundlich adsorption isotherm, which could potentially be replaced using more recently-developed adsorption models that take additional soil properties into account. This checklist could be used to assist in identifying why model performance may be poor or unreliable. By providing a model evaluation framework, it could help prioritise which areas should be targeted to improve model performance or model credibility, whether that be through using alternative calibration techniques and statistics, improved data collection, improving or simplifying the model structure or updating the model to better represent current understanding of catchment processes.
Spatial interactions in agent-based modeling
Ausloos, Marcel; Merlone, Ugo
2014-01-01
Agent Based Modeling (ABM) has become a widespread approach to model complex interactions. In this chapter after briefly summarizing some features of ABM the different approaches in modeling spatial interactions are discussed. It is stressed that agents can interact either indirectly through a shared environment and/or directly with each other. In such an approach, higher-order variables such as commodity prices, population dynamics or even institutions, are not exogenously specified but instead are seen as the results of interactions. It is highlighted in the chapter that the understanding of patterns emerging from such spatial interaction between agents is a key problem as much as their description through analytical or simulation means. The chapter reviews different approaches for modeling agents' behavior, taking into account either explicit spatial (lattice based) structures or networks. Some emphasis is placed on recent ABM as applied to the description of the dynamics of the geographical distribution o...
Model based Software Development: Issues & Challenges
Basha, N Md Jubair; Rizwanullah, Mohammed
2012-01-01
One of the goals of software design is to model a system in such a way that it is easily understandable. Nowadays the tendency for software development is changing from manual coding to automatic code generation; it is becoming model-based. This is a response to the software crisis, in which the cost of hardware has decreased and conversely the cost of software development has increased sharply. The methodologies that allowed this change are model-based, thus relieving the human from detailed coding. Still there is a long way to achieve this goal, but work is being done worldwide to achieve this objective. This paper presents the drastic changes related to modeling and important challenging issues and techniques that recur in MBSD.
Graphical model construction based on evolutionary algorithms
Youlong YANG; Yan WU; Sanyang LIU
2006-01-01
Using Bayesian networks to model promising solutions from the current population of an evolutionary algorithm can ensure an efficient and intelligent search for the optimum. However, constructing a Bayesian network that fits a given dataset is an NP-hard problem, and it also consumes massive computational resources. This paper develops a methodology for constructing a graphical model based on the Bayesian Dirichlet metric. Our approach is derived from a set of propositions and theorems obtained by researching the local metric relationship of networks matching the dataset. The paper presents an algorithm to construct a tree model from a set of potential solutions using the above approach. This method is important not only for evolutionary algorithms based on graphical models, but also for machine learning and data mining. The experimental results show that the exact theoretical results and the approximations match very well.
PV panel model based on datasheet values
Sera, Dezso; Teodorescu, Remus; Rodriguez, Pedro
This work presents the construction of a model for a PV panel using the single-diode five-parameter model, based exclusively on data-sheet parameters. The model takes into account the series and parallel (shunt) resistance of the panel. The equivalent circuit and the basic equations of the PV cell/panel in Standard Test Conditions (STC) are shown, as well as the extraction of the parameters from the data-sheet values. The temperature dependence of the cell dark saturation current is expressed with an alternative formula, which gives better correlation with the datasheet values of the power temperature dependence. Based on these equations, a PV panel model, which is able to predict the panel behaviour in different temperature and irradiance conditions, is built and tested.
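A minimal sketch of evaluating such a single-diode five-parameter model; the parameter values below are placeholders of the kind one would extract from a datasheet, not the paper's actual extraction procedure:

```python
import numpy as np
from scipy.optimize import brentq

Iph, I0, Rs, Rsh = 8.2, 5e-8, 0.35, 300.0   # photo/saturation currents (A), ohms
a = 2.4                                     # modified ideality factor n*Ns*Vt (V)

def current(V):
    """Solve I = Iph - I0*(exp((V+I*Rs)/a) - 1) - (V+I*Rs)/Rsh for the current."""
    f = lambda I: (Iph - I0 * (np.exp((V + I * Rs) / a) - 1.0)
                   - (V + I * Rs) / Rsh - I)
    return brentq(f, -2 * Iph, 2 * Iph)     # implicit equation, bracketed root

V = np.linspace(0.0, 45.0, 200)
I = np.array([current(v) for v in V])
P = V * I
print(f"Isc = {current(0.0):.2f} A, Pmp = {P.max():.0f} W at {V[P.argmax()]:.1f} V")
```

Because the diode equation is implicit in I, each point on the I-V curve is obtained by a root solve; temperature and irradiance dependence enter by rescaling Iph, I0 and a.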
A contextual modeling approach for model-based recommender systems
Fernández-Tobías, Ignacio; Campos Soto, Pedro G.; Cantador, Iván; Díez, Fernando
2013-01-01
The final publication is available at Springer via http://dx.doi.org/10.1007/978-3-642-40643-0_5 Proceedings of the 15th Conference of the Spanish Association for Artificial Intelligence, CAEPIA 2013, Madrid, Spain, September 17-20, 2013. In this paper we present a contextual modeling approach for model-based recommender systems that integrates and exploits both user preferences and contextual signals in a common vector space. In contrast to previous work, we conduct a user study acquiring ...
Efficient Textural Model-Based Mammogram Enhancement
Haindl, Michal; Remeš, Václav
Piscataway: IEEE, 2013, s. 522-523. ISBN 978-1-4799-1053-3. [2013 IEEE 26th International Symposium on Computer-Based Medical Systems (CBMS). Porto (PT), 20.06.2013-22.06.2013] Institutional support: RVO:67985556 Keywords : mammogram enhancement * autoregressive texture model * breast tissue modeling Subject RIV: BD - Theory of Information http://library.utia.cas.cz/separaty/2013/RO/haindl-0397607.pdf
Application software development via model based design
Haapala, O. (Olli)
2015-01-01
This thesis set out to study the utilization of MathWorks' Simulink® program in model based application software development and its compatibility with the Vacon 100 inverter. The target was to identify all the problems related to everyday usage of this method and create a white paper on how to execute a model based design to create Vacon 100 compatible system software. Before this thesis was started, there was very little knowledge of the compatibility of this method. However durin...
Physically based modeling and animation of tornado
LIU Shi-guang; WANG Zhang-ye; GONG Zheng; CHEN Fei-fei; PENG Qun-sheng
2006-01-01
Realistic modeling and rendering of dynamic tornado scenes is recognized as a challenging task for researchers of computer graphics. In this paper a new physically based method for simulating and animating tornado scenes is presented. We first propose a Two-Fluid model based on the physical theory of tornadoes, then we simulate the flow of the tornado and its interaction with surrounding objects such as debris. Taking the scattering and absorption of light by the participating media into account, the illumination effects of the tornado scene can be generated realistically. With the support of graphics hardware, various kinds of dynamic tornado scenes can be rendered at interactive rates.
Model-based testing for embedded systems
Zander, Justyna; Mosterman, Pieter J
2011-01-01
What the experts have to say about Model-Based Testing for Embedded Systems: "This book is exactly what is needed at the exact right time in this fast-growing area. From its beginnings over 10 years ago of deriving tests from UML statecharts, model-based testing has matured into a topic with both breadth and depth. Testing embedded systems is a natural application of MBT, and this book hits the nail exactly on the head. Numerous topics are presented clearly, thoroughly, and concisely in this cutting-edge book. The authors are world-class leading experts in this area and teach us well-used
A multivalued knowledge-base model
Achs, Agnes
2010-01-01
The basic aim of our study is to give a possible model for handling uncertain information. This model is worked out in the framework of DATALOG. First the concept of fuzzy Datalog is summarized, then its extensions to intuitionistic- and interval-valued fuzzy logic are given, and the concept of bipolar fuzzy Datalog is introduced. Based on these ideas, the concept of a multivalued knowledge-base is defined as a quadruple of background knowledge, a deduction mechanism, a connecting algorithm, and a function set of the program, which helps us determine the uncertainty levels of the results. Finally, a possible evaluation strategy is given.
Physiologically based synthetic models of hepatic disposition.
Hunt, C Anthony; Ropella, Glen E P; Yan, Li; Hung, Daniel Y; Roberts, Michael S
2006-12-01
Current physiologically based pharmacokinetic (PBPK) models are inductive. We present an additional, different approach that is based on the synthetic rather than the inductive approach to modeling and simulation. It relies on object-oriented programming. A model of the referent system in its experimental context is synthesized by assembling objects that represent components such as molecules, cells, aspects of tissue architecture, catheters, etc. The single pass perfused rat liver has been well described in evaluating hepatic drug pharmacokinetics (PK) and is the system on which we focus. In silico experiments begin with administration of objects representing actual compounds. Data are collected in a manner analogous to that in the referent PK experiments. The synthetic modeling method allows for recognition and representation of discrete event and discrete time processes, as well as heterogeneity in organization, function, and spatial effects. An application is developed for sucrose and antipyrine, administered separately and together. PBPK modeling has made extensive progress in characterizing abstracted PK properties but this has also been its limitation. Now, other important questions and possible extensions emerge. How are these PK properties and the observed behaviors generated? The inherent heuristic limitations of traditional models have hindered getting meaningful, detailed answers to such questions. Synthetic models of the type described here are specifically intended to help answer such questions. Analogous to wet-lab experimental models, they retain their applicability even when broken apart into sub-components. Having and applying this new class of models along with traditional PK modeling methods is expected to increase the productivity of pharmaceutical research at all levels that make use of modeling and simulation. PMID:17051440
A subgrid based approach for morphodynamic modelling
Volp, N. D.; van Prooijen, B. C.; Pietrzak, J. D.; Stelling, G. S.
2016-07-01
To improve the accuracy and the efficiency of morphodynamic simulations, we present a subgrid based approach for a morphodynamic model. This approach is well suited for areas characterized by sub-critical flow, like in estuaries, coastal areas and in low land rivers. This new method uses a different grid resolution to compute the hydrodynamics and the morphodynamics. The hydrodynamic computations are carried out with a subgrid based, two-dimensional, depth-averaged model. This model uses a coarse computational grid in combination with a subgrid. The subgrid contains high resolution bathymetry and roughness information to compute volumes, friction and advection. The morphodynamic computations are carried out entirely on a high resolution grid, the bed grid. It is key to find a link between the information defined on the different grids in order to guarantee the feedback between the hydrodynamics and the morphodynamics. This link is made by using a new physics-based interpolation method. The method interpolates water levels and velocities from the coarse grid to the high resolution bed grid. The morphodynamic solution improves significantly when using the subgrid based method compared to a full coarse grid approach. The Exner equation is discretised with an upwind method based on the direction of the bed celerity. This ensures a stable solution for the Exner equation. By means of three examples, it is shown that the subgrid based approach offers a significant improvement at a minimal computational cost.
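A sketch of the upwind Exner discretisation described above (1-D, explicit; the open-boundary treatment is a crude placeholder):

```python
import numpy as np

def exner_step(zb, qs, celerity, dx, dt, porosity=0.4):
    """One explicit update of the Exner equation
        (1 - porosity) * dzb/dt + dqs/dx = 0,
    where the one-sided difference direction follows the sign of the bed
    celerity, which keeps the bed update stable."""
    dqs = np.empty_like(qs)
    dqs[1:-1] = np.where(celerity[1:-1] > 0.0,
                         (qs[1:-1] - qs[:-2]) / dx,   # backward (upwind) difference
                         (qs[2:] - qs[1:-1]) / dx)    # forward difference
    dqs[0], dqs[-1] = dqs[1], dqs[-2]                 # crude open boundaries
    return zb - dt * dqs / (1.0 - porosity)
```

In the subgrid approach the arrays here would live on the high-resolution bed grid, with the sediment transport qs driven by water levels and velocities interpolated from the coarse hydrodynamic grid.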
Designing Network-based Business Model Ontology
Hashemi Nekoo, Ali Reza; Ashourizadeh, Shayegheh; Zarei, Behrouz
2015-01-01
This paper is going to propose an e-business model ontology from the network point of view and its application in the real world. The suggested ontology for network-based businesses is composed of individuals' characteristics and what kind of resources they own, as well as their connections and pre-conceptions of connections...... such as shared mental models and trust. However, it mostly covers previous business model elements. To confirm the applicability of this ontology, it has been implemented in a business angel network, showing how it works.
Physiologically based pharmacokinetic model for acetone.
Kumagai, S.; Matsunaga, I
1995-01-01
OBJECTIVE--This study aimed to develop a physiologically based pharmacokinetic model for acetone and to predict the kinetic behaviour of acetone in the human body with that model. METHODS--The model consists of eight tissue groups in which acetone can be distributed: the mucous layer of the inhaled air tract, the mucous layer of the exhaled air tract, a compartment for gas exchange (alveolus of the lung), a group of blood vessel rich tissues including the brain and heart, a group of tissues i...
Analysis of landslide hazard area in Ludian earthquake based on Random Forests
Xie, J.-C.; Liu, R.; Li, H.-W.; Lai, Z.-L.
2015-04-01
With the development of machine learning theory, more and more algorithms are being evaluated for seismic landslide assessment. After the Ludian earthquake, the research team combined the special geological structure of the Ludian area with seismic field exploration results, selecting slope (PODU), river distance (HL), fault distance (DC), seismic intensity (LD), the Digital Elevation Model (DEM) and the normalized difference vegetation index (NDVI) derived from remote sensing images as evaluation factors. The relationships among these factors are fuzzy, and the data suffer from heavy noise and high dimensionality, so we introduce the random forest algorithm to tolerate these difficulties and obtain an evaluation of the Ludian landslide areas. To verify the accuracy of the result, ROC graphs were used as the evaluation standard; the AUC covers an area of 0.918. Meanwhile, the random forest's generalization error rate, estimated by Out-Of-Bag (OOB) estimation, decreases to an ideal 0.08 as the number of classification trees increases. Studying the final landslide inversion results, the paper comes to the statistical conclusion that nearly 80% of all landslides and dilapidations lie in areas of high or moderate susceptibility, showing that the forecast results are reasonable.
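A sketch of the evaluation loop described above using scikit-learn; X stands for the stacked factor values (slope, river and fault distance, intensity, DEM, NDVI) and y for the landslide label, replaced here by synthetic stand-ins:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 6))             # synthetic stand-in for the 6 factors
y = rng.integers(0, 2, size=2000)          # synthetic landslide / no-landslide label

rf = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0)
rf.fit(X, y)
print("OOB error:", 1.0 - rf.oob_score_)   # generalization estimate, as in the paper
print("AUC:", roc_auc_score(y, rf.oob_decision_function_[:, 1]))
```

On these random labels the AUC hovers near 0.5; the paper's 0.918 reflects genuine predictive structure in the Ludian factors.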
Evolutionary modeling-based approach for model errors correction
S. Q. Wan
2012-08-01
Full Text Available The inverse problem of using the information in historical data to estimate model errors is one of the frontier research topics in the sciences. In this study, we investigate such a problem using the classic Lorenz (1963) equation as a prediction model and the Lorenz equation with a periodic evolutionary function as an accurate representation of reality to generate "observational data."
On the basis of the intelligent features of evolutionary modeling (EM), including self-organization, self-adaptation and self-learning, the dynamic information contained in the historical data can be identified and extracted by computer automatically. Thereby, a new approach is proposed in the present paper to estimate model errors based on EM. Numerical tests demonstrate the ability of the new approach to correct model structural errors. In fact, it actualizes a combination of statistics and dynamics to a certain extent.
Knowledge base system visualization reasoning model based on ICON
Chen, Deyun; Pei, Shujun; Quan, Zhiying
2005-03-01
Knowledge-base systems are one of the most promising branches of artificial intelligence for practical application. But the reasoning process of such a system is invisible, not visual, and users cannot intervene in it; for users the system is only a black box. This causes many users to take a suspicious attitude toward the conclusions drawn by the system, which means that even where the system has an explanation function, it is still not enough. If graph or image techniques are adopted to display the reasoning procedure interactively and dynamically, making the procedure visual, users can intervene in the reasoning procedure, which can greatly reduce their doubts; at the same time, this provides a method for integrity checking of the knowledge in the knowledge base. Reasoning visualization for a knowledge-base system therefore has a further meaning than general visualization. In this paper the visualization problem of the reasoning process for a knowledge-base system is presented on the basis of a formalized analysis of the ICON system, icon operations, and the syntax and semantics of statements; a reasoning model of the knowledge-base system with visual characteristics is established, and the model is used to perform integrity checks in a practical expert system and knowledge base, with good results.
Haptics-based dynamic implicit solid modeling.
Hua, Jing; Qin, Hong
2004-01-01
This paper systematically presents a novel, interactive solid modeling framework, Haptics-based Dynamic Implicit Solid Modeling, which is founded upon volumetric implicit functions and powerful physics-based modeling. In particular, we augment our modeling framework with a haptic mechanism in order to take advantage of additional realism associated with a 3D haptic interface. Our dynamic implicit solids are semi-algebraic sets of volumetric implicit functions and are governed by the principles of dynamics, hence responding to sculpting forces in a natural and predictable manner. In order to directly manipulate existing volumetric data sets as well as point clouds, we develop a hierarchical fitting algorithm to reconstruct and represent discrete data sets using our continuous implicit functions, which permit users to further design and edit those existing 3D models in real-time using a large variety of haptic and geometric toolkits, and visualize their interactive deformation at arbitrary resolution. The additional geometric and physical constraints afford more sophisticated control of the dynamic implicit solids. The versatility of our dynamic implicit modeling enables the user to easily modify both the geometry and the topology of modeled objects, while the inherent physical properties can offer an intuitive haptic interface for direct manipulation with force feedback. PMID:15794139
Model-Based Power Plant Master Control
Boman, Katarina; Thomas, Jean; Funkquist, Jonas
2010-08-15
The main goal of the project has been to evaluate the potential of a coordinated master control for a solid fuel power plant in terms of tracking capability, stability and robustness. The control strategy has been model-based predictive control (MPC) and the plant used in the case study has been the Vattenfall power plant Idbaecken in Nykoeping. A dynamic plant model based on nonlinear physical models was used to imitate the true plant in MATLAB/SIMULINK simulations. The basis for this model was already developed in previous Vattenfall internal projects, along with a simulation model of the existing control implementation with traditional PID controllers. The existing PID control is used as a reference performance, and it has been thoroughly studied and tuned in these previous Vattenfall internal projects. A turbine model was developed with characteristics based on the results of steady-state simulations of the plant using the software EBSILON. Using the derived model as a representative for the actual process, an MPC control strategy was developed using linearization and gain-scheduling. The control signal constraints (rate of change) and constraints on outputs were implemented to comply with plant constraints. After tuning the MPC control parameters, a number of simulation scenarios were performed to compare the MPC strategy with the existing PID control structure. The simulation scenarios also included cases highlighting the robustness properties of the MPC strategy. From the study, the main conclusions are: - The proposed Master MPC controller shows excellent set-point tracking performance even though the plant has strong interactions and non-linearity, and the controls and their rate of change are bounded. - The proposed Master MPC controller is robust, stable in the presence of disturbances and parameter variations. Even though the current study only considered a very small number of the possible disturbances and modelling errors, the considered cases are
Model Predictive Control based on Finite Impulse Response Models
Prasath, Guru; Jørgensen, John Bagterp
2008-01-01
We develop a regularized l2 finite impulse response (FIR) predictive controller with input and input-rate constraints. Feedback is based on a simple constant output disturbance filter. The performance of the predictive controller in the face of plant-model mismatch is investigated by simulations...
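A minimal sketch of a regularized l2 FIR predictive controller with the constant output-disturbance feedback described above; constraints are omitted for brevity (the paper handles input and input-rate constraints), and the FIR coefficients, horizon and weight are illustrative:

```python
import numpy as np

h = 0.8 ** np.arange(1, 31)                   # illustrative FIR coefficients h_1..h_n
n, N = len(h), 20                             # model length, prediction horizon

def fir_mpc_move(r, u_past, y_meas, lam=0.5):
    """Compute the next input. r: reference over the horizon (length N);
    u_past: last n inputs, oldest first; y_meas: current measured output."""
    d = y_meas - h @ u_past[::-1]             # constant output-disturbance estimate
    # free response: future inputs at zero, past inputs replayed, bias added
    free = np.array([h[i:] @ u_past[::-1][:n - i] for i in range(1, N + 1)]) + d
    # forced response: Phi[i-1, j] = h_{i-j}, the effect of future input u_{k+j}
    Phi = np.array([[h[i - j - 1] if 0 <= i - j - 1 < n else 0.0
                     for j in range(N)] for i in range(1, N + 1)])
    # regularized l2 problem: track r while penalizing input moves D u
    D = np.eye(N) - np.eye(N, k=-1)
    rhs = Phi.T @ (r - free) + lam * D.T @ (np.eye(N)[0] * u_past[-1])
    u = np.linalg.solve(Phi.T @ Phi + lam * D.T @ D, rhs)
    return u[0]                               # receding horizon: apply the first move
```

Because the predictor is a finite convolution of past inputs, no state observer is needed; the disturbance estimate d is the entire feedback path.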
Incident duration modeling using flexible parametric hazard-based models.
Li, Ruimin; Shang, Pan
2014-01-01
Assessing and prioritizing the duration time and effects of traffic incidents on major roads present significant challenges for road network managers. This study examines the effect of numerous factors associated with various types of incidents on their duration and proposes an incident duration prediction model. Several parametric accelerated failure time hazard-based models were examined, including Weibull, log-logistic, log-normal, and generalized gamma, as well as all models with gamma heterogeneity and flexible parametric hazard-based models with freedom ranging from one to ten, by analyzing a traffic incident dataset obtained from the Incident Reporting and Dispatching System in Beijing in 2008. Results show that different factors significantly affect different incident time phases, whose best distributions were diverse. Given the best hazard-based models of each incident time phase, the prediction result can be reasonable for most incidents. The results of this study can aid traffic incident management agencies not only in implementing strategies that would reduce incident duration, and thus reduce congestion, secondary incidents, and the associated human and economic losses, but also in effectively predicting incident duration time. PMID:25530753
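A minimal sketch of fitting one member of that family, a Weibull accelerated-failure-time model, by maximum likelihood on uncensored durations (the covariate and durations are synthetic; the study's models also include gamma heterogeneity and flexible baselines):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=300)              # e.g. 1 = incident blocks a lane
t = rng.weibull(1.5, size=300) * np.exp(3.0 + 0.6 * x)  # synthetic durations (min)

def negloglik(theta):
    """Weibull AFT: scale lambda_i = exp(b0 + b1*x_i), common shape k."""
    b0, b1, logk = theta
    k, lam = np.exp(logk), np.exp(b0 + b1 * x)
    return -np.sum(np.log(k) - k * np.log(lam) + (k - 1.0) * np.log(t)
                   - (t / lam) ** k)

fit = minimize(negloglik, x0=[0.0, 0.0, 0.0], method="Nelder-Mead")
b0, b1, logk = fit.x
print(f"shape k = {np.exp(logk):.2f}; "
      f"covariate multiplies duration by {np.exp(b1):.2f}")
```

In AFT form the exponentiated coefficient has a direct operational reading: here a lane-blocking incident stretches the expected duration by roughly the printed factor.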
Model Based Control of Reefer Container Systems
Sørensen, Kresten Kjær
This thesis is concerned with the development of model based control for the Star Cool refrigerated container (reefer) with the objective of reducing energy consumption. This project has been carried out under the Danish Industrial PhD programme and has been financed by Lodam together with the...
Port-based modeling of mechatronic systems
Breedveld, Peter C.
2004-01-01
Many engineering activities, including mechatronic design, require that a multidomain or ‘multi-physics’ system and its control system be designed as an integrated system. This contribution discusses the background and tools for a port-based approach to integrated modeling and simulation of physical
Agent Based Modelling for Social Simulation
Smit, S.K.; Ubink, E.M.; Vecht, B. van der; Langley, D.J.
2013-01-01
This document is the result of an exploratory project looking into the status of, and opportunities for, Agent Based Modelling (ABM) at TNO. The project focussed on ABM applications containing social interactions and human factors, which we termed ABM for social simulation (ABM4SS). During the course
What's Missing in Model-Based Teaching
Khan, Samia
2011-01-01
In this study, the author investigated how four science teachers employed model-based teaching (MBT) over a 1-year period. The purpose of the research was to develop a baseline of the fundamental and specific dimensions of MBT that are present and absent in science teaching. Teacher interviews, classroom observations, and pre and post-student…
Introducing Waqf Based Takaful Model in India
Syed Ahmed Salman
2014-03-01
Full Text Available Objective – Waqf is a unique feature of the socioeconomic system of Islam in a multi-religious and developing country like India. India is a rich country with waqf assets. The history of waqf in India can be traced back 800 years. Most researchers suggest how waqf can be used as a tool to mitigate the poverty of Muslims. India has the third highest Muslim population after Indonesia and Pakistan. However, the majority of Muslims belong to the low income group and they are in need of help. It is believed that waqf can be utilized for the betterment of the Indian Muslim community. Among the available uses of waqf assets, the main objective of this paper is to introduce a waqf based takaful model in India. In addition, how this proposed model can be adopted in India is highlighted. Methods – Library research is applied since this paper relies on secondary data, thoroughly reviewing the most relevant literature. Result – India, as a country rich in waqf assets, should fully utilize these resources to help Muslims through takaful. Conclusion – In this study, we have proposed a waqf based takaful model combining the concepts of mudarabah and wakalah for India. We recommend this model based on the background of the country and its situation. Since we have not tested the viability of this model in India, future research should continue with this testing. Keywords: Waqf, Takaful, Poverty, India
Unifying Model-Based and Reactive Programming within a Model-Based Executive
Williams, Brian C.; Gupta, Vineet; Norvig, Peter (Technical Monitor)
1999-01-01
Real-time, model-based deduction has recently emerged as a vital component in AI's tool box for developing highly autonomous reactive systems. Yet one of the current hurdles towards developing model-based reactive systems is the number of methods simultaneously employed, and their corresponding melange of programming and modeling languages. This paper offers an important step towards unification. We introduce RMPL, a rich modeling language that combines probabilistic, constraint-based modeling with reactive programming constructs, while offering a simple semantics in terms of hidden state Markov processes. We introduce probabilistic, hierarchical constraint automata (PHCA), which allow Markov processes to be expressed in a compact representation that preserves the modularity of RMPL programs. Finally, a model-based executive, called Reactive Burton, is described that exploits this compact encoding to perform efficient simulation, belief state update and control sequence generation.
A Novel Template-Based Learning Model
Abolghasemi-Dahaghani, Mohammadreza; Nowroozi, Alireza
2011-01-01
This article presents a model which is capable of learning and abstracting new concepts based on comparing observations and finding the resemblance between them. In the model, new observations are compared with templates derived from previous experiences. In the first stage, the objects are represented through a geometric description, which is used for finding the object boundaries, and a descriptor inspired by the human visual system, and then they are fed into the model. Next, the new observations are identified by comparing them with the previously learned templates and are used for producing new templates. The comparisons are made based on measures like Euclidean or correlation distance. A new template is created by applying an onion-peeling algorithm, which consecutively uses convex hulls made from the points representing the objects. If the new observation is remarkably similar to one of the observed categories, it is no longer util...
Agent Based Modeling as an Educational Tool
Fuller, J. H.; Johnson, R.; Castillo, V.
2012-12-01
Motivation is a key element in high school education. One way to improve motivation and provide content, while helping address critical thinking and problem solving skills, is to have students build and study agent based models in the classroom. This activity visually connects concepts with their applied mathematical representation. "Engaging students in constructing models may provide a bridge between frequently disconnected conceptual and mathematical forms of knowledge." (Levy and Wilensky, 2011) We wanted to discover the feasibility of implementing a model based curriculum in the classroom given current and anticipated core and content standards. (Figure captions: simulation using California GIS data; simulation of high school student lunch popularity using an aerial photograph over a terrain value map.)
Multiagent-Based Model For ESCM
Delia MARINCAS
2011-01-01
Full Text Available Web based applications for Supply Chain Management (SCM) are now a necessity for every company in order to meet increasing customer demands, to face global competition and to make a profit. A multiagent-based approach is appropriate for eSCM because it shows many of the characteristics a SCM system should have. For this reason, we have proposed a multiagent-based eSCM model which configures a virtual SC and automates the SC activities: selling, purchasing, manufacturing, planning, inventory, etc. This model will allow a better coordination of the supply chain network and will increase the effectiveness of the Web and intelligent technologies employed in eSCM software.
Modeling Leaves Based on Real Image
CAO Yu-kun; LI Yun-feng; ZHU Qing-sheng; LIU Yin-bin
2004-01-01
Plants have complex structures. The shape of a plant component is vital for capturing the characteristics of a species. One of the challenges in computer graphics is to create geometry of objects in an intuitive and direct way while allowing interactive manipulation of the resulting shapes. In this paper, an interactive method for modeling leaves based on a real image is proposed using biological data for individual plants. The modeling process begins with a one-dimensional analogue of implicit surfaces, from which a 2D silhouette of a leaf is generated based on image segmentation. The silhouette skeleton is thus obtained. Feature parameters of the leaf are extracted based on biologically experimental data, and the obtained leaf structure is then modified by comparing the synthetic result with the real leaf so as to make the leaf structure more realistic. Finally, the leaf mesh is constructed by sweeps.
[Fast spectral modeling based on Voigt peaks].
Li, Jin-rong; Dai, Lian-kui
2012-03-01
Indirect hard modeling (IHM) is a recently introduced method for quantitative spectral analysis, which is applied to the analysis of nonlinear relations between mixture spectra and component concentrations. In addition, IHM is an effective technique for analyzing components of mixtures with molecular interactions and strongly overlapping bands. Before establishing the regression model, IHM needs to model the measured spectrum as a sum of Voigt peaks. The precision of the spectral model has an immediate impact on the accuracy of the regression model. A spectrum often includes dozens or even hundreds of Voigt peaks, which means that spectral modeling is in fact a high-dimensional optimization problem. Consequently, a large computational overhead is needed, and the solution may not be numerically unique owing to the ill-conditioning of the optimization problem. An improved spectral modeling method is presented in this paper, which reduces the dimensionality of the optimization problem by determining the overlapped peaks in the spectrum. Experimental results show that spectral modeling based on the new method is more accurate and needs much shorter running time than the conventional method. PMID:22582612
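As a sketch of the underlying Voigt-sum spectral model (not the paper's improved dimensionality-reduction method), the snippet below fits a two-peak Voigt model to a synthetic spectrum with scipy; all peak parameters are invented.

```python
import numpy as np
from scipy.special import voigt_profile
from scipy.optimize import curve_fit

# Model a spectrum as a sum of two Voigt peaks (amplitude, center, Gaussian
# sigma, Lorentzian gamma per peak); real spectra may need dozens of peaks.
def spectrum(x, a1, c1, s1, g1, a2, c2, s2, g2):
    return (a1 * voigt_profile(x - c1, s1, g1)
            + a2 * voigt_profile(x - c2, s2, g2))

x = np.linspace(0, 10, 500)
rng = np.random.default_rng(0)
y = spectrum(x, 1.0, 3.0, 0.2, 0.1, 0.6, 6.5, 0.3, 0.05) \
    + 0.002 * rng.standard_normal(x.size)

p0 = [1, 3, 0.2, 0.1, 0.5, 6.5, 0.3, 0.1]   # initial guesses
popt, _ = curve_fit(spectrum, x, y, p0=p0)
print(np.round(popt, 3))
```

With hundreds of peaks the parameter vector grows fourfold per peak, which is the high-dimensional, ill-conditioned fit the abstract refers to.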
Mesoscopic model of actin-based propulsion.
Jie Zhu
Full Text Available Two theoretical models dominate current understanding of actin-based propulsion: the microscopic polymerization ratchet model predicts that growing and writhing actin filaments generate forces and movements, while the macroscopic elastic propulsion model suggests that deformation and stress of the growing actin gel are responsible for the propulsion. We examine both experimentally and computationally the 2D movement of ellipsoidal beads propelled by actin tails and show that neither of the two models can explain the observed bistability of the orientation of the beads. To explain the data, we develop a 2D hybrid mesoscopic model by reconciling these two models such that individual actin filaments undergoing nucleation, elongation, attachment, detachment and capping are embedded into the boundary of a node-spring viscoelastic network representing the macroscopic actin gel. Stochastic simulations of this 'in silico' actin network show that the combined effects of macroscopic elastic deformation and microscopic ratchets can explain the observed bistable orientation of the actin-propelled ellipsoidal beads. To test the theory further, we analyze the observed distribution of the curvatures of the trajectories and show that the hybrid model's predictions fit the data. Finally, we demonstrate that the model can explain both concave-up and concave-down force-velocity relations for growing actin networks, depending on the characteristic time scale and network recoil. To summarize, we propose that both microscopic polymerization ratchets and macroscopic stresses of the deformable actin network are responsible for force and movement generation.
Modeling acquaintance networks based on balance theory
Vukašinović Vida
2014-09-01
Full Text Available An acquaintance network is a social structure made up of a set of actors and the ties between them. These ties change dynamically as a consequence of incessant interactions between the actors. In this paper we introduce a social network model called the Interaction-Based (IB) model that involves well-known sociological principles. The connections between the actors and the strength of the connections are influenced by the continuous positive and negative interactions between the actors and, vice versa, future interactions are more likely to happen between actors that are connected with stronger ties. The model is also inspired by the social behavior of animal species, particularly that of ants in their colony. A model evaluation showed that the IB model turns out to be sparse. The model has a small diameter and an average path length that grows in proportion to the logarithm of the number of vertices. The clustering coefficient is relatively high, and its value stabilizes in larger networks. The degree distributions are slightly right-skewed. In the mature phase of the IB model, i.e., when the number of edges does not change significantly, most of the network properties do not change significantly either. The IB model was found to be the best of all the compared models in simulating the e-mail URV (University Rovira i Virgili of Tarragona) network because the properties of the IB model more closely matched those of the e-mail URV network than the other models.
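The network properties the evaluation reports (diameter, average path length, clustering coefficient, degree distribution) can be computed with networkx, as sketched below on a stand-in graph; the IB model itself is not reproduced here.

```python
import networkx as nx

# Stand-in for a generated acquaintance network; only the reported
# properties are computed, not the IB generation process.
G = nx.connected_watts_strogatz_graph(n=500, k=6, p=0.1, seed=1)

print("edges:", G.number_of_edges(), "on", G.number_of_nodes(), "nodes")
print("diameter:", nx.diameter(G))
print("average path length:", round(nx.average_shortest_path_length(G), 2))
print("clustering coefficient:", round(nx.average_clustering(G), 2))
degrees = [d for _, d in G.degree()]
print("max/mean degree:", max(degrees), round(sum(degrees) / len(degrees), 2))
```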
A comprehensive theory-based transport model
A new theory based transport model with comprehensive physics (trapping, general toroidal geometry, finite beta, collisions) has been developed. The core of the model is the new trapped-gyro-Landau-fluid (TGLF) equations which provide a fast and accurate approximation to the linear eigenmodes for gyrokinetic drift-wave instabilities (trapped ion and electron modes, ion and electron temperature gradient modes and kinetic ballooning modes). This new TGLF transport model removes the limitation of its predecessor GLF23 and is valid for the same conditions as the gyrokinetic equations. A theory-based philosophy is used in the model construction. The closure coefficients of the TGLF equations are fit to local kinetic theory to give accurate linear eigenmodes. The saturation model is fit to non-linear turbulence simulations. No fitting to experiment is done so applying the model to experiments is a true test of the theory it is approximating. The TGLF model unifies trapped and passing particles in a single set of gyrofluid equations. A model for the averaging of the Landau resonance by the trapped particles makes the equations work seamlessly over the whole drift-wave wavenumber range from trapped ion modes to electron temperature gradient modes. A fast eigenmode solution method enables unrestricted magnetic geometry. Electron-ion collisions and full electromagnetic fluctuations round out the physics. The linear eigenmodes have been benchmarked against comprehensive physics gyrokinetic calculations over a large range of plasma parameters. The deviation between the gyrokinetic and TGLF linear growth rates averages 11.4% in shifted circle geometry. The transport model uses the TGLF eigenmodes to compute quasilinear fluxes of energy and particles. A model for the saturated amplitude of the turbulence completes the calculation. The saturation model is constructed to fit a large set of nonlinear gyrokinetic turbulence simulations. The TGLF model is valid in new physical
A comprehensive theory-based transport model
Full text: A new theory based transport model with comprehensive physics (trapping, general toroidal geometry, finite beta, collisions) has been developed. The core of the model is the new trapped-gyro- Landau-fluid (TGLF) equations which provide a fast and accurate approximation to the linear eigenmodes for gyrokinetic drift-wave instabilities (trapped ion and electron modes, ion and electron temperature gradient modes and kinetic ballooning modes). This new TGLF transport model removes the limitation of its predecessor GLF23 and is valid for the same conditions as the gyrokinetic equations. A theory-based philosophy is used in the model construction. The closure coefficients of the TGLF equations are fit to local kinetic theory to give accurate linear eigenmodes. The saturation model is fit to non-linear turbulence simulations. No fitting to experiment is done so applying the model to experiments is a true test of the theory it is approximating. The TGLF model unifies trapped and passing particles in a single set of gyrofluid equations. A model for the averaging of the Landau resonance by the trapped particles makes the equations work seamlessly over the whole drift-wave wavenumber range from trapped ion modes to electron temperature gradient modes. A fast eigenmode solution method enables unrestricted magnetic geometry. Electron-ion collisions and full electromagnetic fluctuations round out the physics. The linear eigenmodes have been benchmarked against comprehensive physics gyrokinetic calculations over a large range of plasma parameters. The deviation between the gyrokinetic and TGLF linear growth rates averages 11.4% in shifted circle geometry. The transport model uses the TGLF eigenmodes to compute quasilinear fluxes of energy and particles. A model for the saturated amplitude of the turbulence completes the calculation. The saturation model is constructed to fit a large set of nonlinear gyrokinetic turbulence simulations. The TGLF model is valid in new
Electrochemistry-based Battery Modeling for Prognostics
Daigle, Matthew J.; Kulkarni, Chetan Shrikant
2013-01-01
Batteries are used in a wide variety of applications. In recent years, they have become popular as a source of power for electric vehicles such as cars, unmanned aerial vehicles, and commercial passenger aircraft. In such application domains, it becomes crucial to both monitor battery health and performance and to predict end of discharge (EOD) and end of useful life (EOL) events. To implement such technologies, it is crucial to understand how batteries work and to capture that knowledge in the form of models that can be used by monitoring, diagnosis, and prognosis algorithms. In this work, we develop electrochemistry-based models of lithium-ion batteries that capture the significant electrochemical processes, are computationally efficient, capture the effects of aging, and are of suitable accuracy for reliable EOD prediction in a variety of usage profiles. This paper reports on the progress of such a model, with results demonstrating the model validity and accurate EOD predictions.
An immune based dynamic intrusion detection model
LI Tao
2005-01-01
With the dynamic description method for self and antigen and the concept of dynamic immune tolerance for lymphocytes in the network-security domain presented in this paper, a new immune based dynamic intrusion detection model (Idid) is proposed. In Idid, the dynamic models and the corresponding recursive equations of the lifecycle of mature lymphocytes and of immune memory are built. Therefore, the problem of the dynamic description of self and nonself in computer immune systems is solved, and the defect of the low efficiency of mature lymphocyte generation in traditional computer immune systems is overcome. Simulations of this model are performed, and comparison experiment results show that the proposed dynamic intrusion detection model has better adaptability than the traditional methods.
Internet Supported Model Based Condition Monitoring
A. Lewlaski
2010-04-01
Full Text Available The importance of condition monitoring for preventive and predictive maintenance has increased through the use of system modelling. This modelling is carried out using the manufacturer's information. Data collection using data acquisition cards provides raw data to support system monitoring, especially when used through internet and network facilities, which makes it economically available to a larger number of users. A model based condition monitoring system which utilises the Internet is presented in this paper. The overall software/hardware for this system will be referred to as the MBCM portable unit here. It contains a standalone model of the system under consideration. The unit is capable of capturing signals directly from the system, saving them and producing these data in different formats for further analysis. In order to monitor the performance of the investigated system, the MBCM contains an embedded web-server that enables different signals to be monitored and captured locally, over a network, and via an internet connection.
Model-based multiple patterning layout decomposition
Guo, Daifeng; Tian, Haitong; Du, Yuelin; Wong, Martin D. F.
2015-10-01
As one of the most promising next generation lithography technologies, multiple patterning lithography (MPL) plays an important role in the attempts to keep pace with the 10 nm technology node and beyond. As feature sizes keep shrinking, it has become impossible to print dense layouts within one single exposure. As a result, MPL such as double patterning lithography (DPL) and triple patterning lithography (TPL) has been widely adopted. There is a large volume of literature on DPL/TPL layout decomposition, and the current approach is to formulate the problem as a classical graph-coloring problem: layout features (polygons) are represented by vertices in a graph G and there is an edge between two vertices if and only if the distance between the two corresponding features is less than a minimum distance threshold value dmin. The problem is to color the vertices of G using k colors (k = 2 for DPL, k = 3 for TPL) such that no two vertices connected by an edge are given the same color. This is a rule-based approach, which imposes a geometric distance as a minimum constraint to simply decompose polygons within that distance into different masks. It is not desirable in practice because this criterion cannot completely capture the behavior of the optics. For example, it lacks sufficient information such as the optical source characteristics and the effects between polygons outside the minimum distance. To remedy this deficiency, a model-based layout decomposition approach that bases the decomposition criteria on simulation results was first introduced at SPIE 2013.1 However, the algorithm1 relies on simplified assumptions about the optical simulation model and therefore its usage on real layouts is limited. Recently AMSL2 also proposed a model-based approach to layout decomposition that iteratively simulates the layout, which requires excessive computational resources and may lead to sub-optimal solutions. The approach2 also potentially generates too many stitches. In this
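A sketch of the rule-based baseline the abstract contrasts against: build a conflict graph from features closer than dmin and color it; the coordinates and threshold are invented, and greedy coloring is only a heuristic for the general k-coloring problem.

```python
from itertools import combinations
import networkx as nx

# Layout features as centroids (invented coordinates); two features conflict
# if they lie closer than the minimum coloring distance d_min.
features = {"p1": (0, 0), "p2": (1, 0), "p3": (0.5, 0.9), "p4": (3, 0)}
d_min = 1.5

G = nx.Graph()
G.add_nodes_from(features)
for (a, pa), (b, pb) in combinations(features.items(), 2):
    if ((pa[0] - pb[0]) ** 2 + (pa[1] - pb[1]) ** 2) ** 0.5 < d_min:
        G.add_edge(a, b)

# Greedy coloring: each color class becomes one mask (k = 2 for DPL, 3 for TPL).
coloring = nx.greedy_color(G, strategy="largest_first")
print(coloring)
print("masks needed:", max(coloring.values()) + 1)
```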
Multivariate statistical modelling based on generalized linear models
Fahrmeir, Ludwig
1994-01-01
This book is concerned with the use of generalized linear models for univariate and multivariate regression analysis. Its emphasis is to provide a detailed introductory survey of the subject based on the analysis of real data drawn from a variety of subjects including the biological sciences, economics, and the social sciences. Where possible, technical details and proofs are deferred to an appendix in order to provide an accessible account for non-experts. Topics covered include: models for multi-categorical responses, model checking, time series and longitudinal data, random effects models, and state-space models. Throughout, the authors have taken great pains to discuss the underlying theoretical ideas in ways that relate well to the data at hand. As a result, numerous researchers whose work relies on the use of these models will find this an invaluable account to have on their desks. "The basic aim of the authors is to bring together and review a large part of recent advances in statistical modelling of m...
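A small worked example of a generalized linear model of the kind the book covers, assuming the statsmodels library: logistic regression as a binomial GLM on simulated data.

```python
import numpy as np
import statsmodels.api as sm

# Simulate a binary response driven by two covariates through a logit link.
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 2))
p = 1 / (1 + np.exp(-(0.5 + 1.2 * X[:, 0] - 0.8 * X[:, 1])))
y = rng.binomial(1, p)

# Binomial family with the default logit link; add_constant gives an intercept.
model = sm.GLM(y, sm.add_constant(X), family=sm.families.Binomial())
result = model.fit()
print(result.summary())
```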
Image-based modelling of organogenesis.
Iber, Dagmar; Karimaddini, Zahra; Ünal, Erkan
2016-07-01
One of the major challenges in biology concerns the integration of data across length and time scales into a consistent framework: how do macroscopic properties and functionalities arise from the molecular regulatory networks, and how can they change as a result of mutations? Morphogenesis provides an excellent model system to study how simple molecular networks robustly control complex processes on the macroscopic scale despite molecular noise, and how important functional variants can emerge from small genetic changes. Recent advancements in three-dimensional imaging technologies, computer algorithms and computer power now allow us to develop and analyse increasingly realistic models of biological control. Here, we present our pipeline for image-based modelling that includes the segmentation of images, the determination of displacement fields and the solution of systems of partial differential equations on the growing, embryonic domains. The development of suitable mathematical models, the data-based inference of parameter sets and the evaluation of competing models are still challenging, and current approaches are discussed. PMID:26510443
Business Models for NFC Based Mobile Payments
Chae, Johannes Sang-Un; Hedman, Jonas
2015-01-01
Purpose: The purpose of the paper is to develop a business model framework for NFC based mobile payment solutions consisting of four mutually interdepended components: the value service, value network, value architecture, and value finance. Design: Using a comparative case study method, the paper...... from multiple stakeholders and the creation of an ecosystem. Furthermore, they focus on the scalability of their value propositions. Originality / value: The paper offers an applicable business model framework that allows practitioners and academics to study current and future mobile payment approaches....
Model-based Tomographic Reconstruction Literature Search
Chambers, D H; Lehman, S K
2005-11-30
In the process of preparing a proposal for internal research funding, a literature search was conducted on the subject of model-based tomographic reconstruction (MBTR). The purpose of the search was to ensure that the proposed research would not replicate any previous work. We found that the overwhelming majority of work on MBTR which used parameterized models of the object was theoretical in nature. Only three researchers had applied the technique to actual data. In this note, we summarize the findings of the literature search.
History-based trust negotiation model
ZHAO Yi-zhu; ZHAO Yan-hua; LU Hong-wei
2009-01-01
Trust negotiation (TN) is an approach to establish trust between strangers through the iterative disclosure of digital credentials. Speeding up subsequent negotiations between the same negotiators is a problem worthy of research. This paper introduces the concept of a visiting card, and presents a history-based trust negotiation (HBTN) model. HBTN creates an account for a counterpart at the first negotiation and records the valid credentials that the counterpart disclosed during each trust negotiation in its historical information base (HIB). In subsequent negotiations, no further credentials need to be disclosed by either party. HBTN speeds up subsequent negotiations between entities that interact with each other frequently without impairing privacy preservation.
Item Modeling Concept Based on Multimedia Authoring
Janez Stergar
2008-09-01
Full Text Available In this paper a modern item design framework for computer based assessment, built on the Flash authoring environment, is introduced. Question design is discussed and the multimedia authoring environment used for item modeling is emphasized. Item type templates are a structured means of collecting and storing item information that can be used to improve the efficiency and security of the innovative item design process. Templates can modernize item design and enhance and speed up the development process. Along with content creation, multimedia has vast potential for use in innovative testing. The introduced item design template is based on a taxonomy of innovative items which have great potential for expanding the content areas and construct coverage of an assessment. The presented item design approach is based on two GUIs: one for question design based on the implemented item design templates and one for user interaction tracking/retrieval. The concept of user interfaces based on Flash technology is discussed, as well as the implementation of the item design forms with multimedia authoring. An innovative method for user interaction storage/retrieval based on PHP, extending Flash capabilities in the proposed framework, is also introduced.
Modelling spatial association in pattern based land use simulation models.
Anputhas, Markandu; Janmaat, Johannus John A; Nichol, Craig F; Wei, Xiaohua Adam
2016-10-01
Pattern based land use models are widely used to forecast land use change. These models predict land use change using driving variables observed on the studied landscape. Many of these models have a limited capacity to account for interactions between neighbouring land parcels. Some modellers have used common spatial statistical measures to incorporate neighbour effects. However, these approaches were developed for continuous variables, while land use classifications are categorical. Neighbour interactions are also endogenous, changing as the land use patterns change. In this study we describe a single variable measure that captures aspects of neighbour interactions as reflected in the land use pattern. We use a stepwise updating process to demonstrate how dynamic updating of our measure impacts on model forecasts. We illustrate these results using the CLUE-S (Conversion of Land Use and its Effects at Small regional extent) system to forecast land use change for the Deep Creek watershed in the northern Okanagan Valley of British Columbia, Canada. Results establish that our measure improves model calibration and that ignoring changing spatial influences biases land use change forecasts. PMID:27420169
Graph based model to support nurses' work.
Benedik, Peter; Rajkovič, Uroš; Sušteršič, Olga; Prijatelj, Vesna; Rajkovič, Vladislav
2014-01-01
Health care is a knowledge-based community that critically depends on knowledge management activities in order to ensure quality. Nurses are primary stakeholders and need to ensure that their information and knowledge needs are being met in ways that enable them to improve the quality and efficiency of health care service delivery for all subjects of health care. This paper describes a system to help nurses create nursing care plans. It supports focusing a nurse's attention on those resources/solutions that are likely to be most relevant to their particular situation/problem in the nursing domain. The system is based on a multi-relational property graph, a flexible modeling construct. The graph allows modeling of a nursing domain (ontology) and of the indices that partition the domain into an efficient, searchable space where the solution to a problem is seen as abstractly defined traversals through its vertices and edges. PMID:24943559
Ma, Weiwei; Gong, Cailan; Hu, Yong; Li, Long; Meng, Peng
2015-10-01
Remote sensing technology has been broadly recognized for its convenience and efficiency in mapping vegetation, particularly in high-altitude and inaccessible areas where there is a lack of in-situ observations. In this study, Landsat Thematic Mapper (TM) images and Chinese environmental mitigation satellite CCD sensor (HJ-1 CCD) images, both at 30 m spatial resolution, were employed for identifying and monitoring vegetation types in an area of Western China, the Qinghai Lake Watershed (QHLW). A decision classification tree (DCT) algorithm using multiple characteristics, including seasonal TM/HJ-1 CCD time series data combined with a digital elevation model (DEM) dataset, and a supervised maximum likelihood classification (MLC) algorithm with a single-date TM image were applied to vegetation classification. The accuracy of the two algorithms was assessed using field observation data. Based on the produced vegetation classification maps, it was found that the DCT using multi-season data and geomorphologic parameters was superior to the MLC algorithm using a single-date image, improving the overall accuracy by 11.86% at the second class level and significantly reducing the "salt and pepper" noise. The DCT algorithm applied to TM/HJ-1 CCD time series data and geomorphologic parameters appears to be a valuable and reliable tool for monitoring vegetation at the first class level (5 vegetation classes) and the second class level (8 vegetation subclasses). The DCT algorithm using multiple characteristics might provide a theoretical basis and general approach to automatic extraction of vegetation types from remote sensing imagery over plateau areas.
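A hedged sketch of a decision-tree vegetation classifier in the spirit of the DCT approach, assuming scikit-learn; the per-pixel features (seasonal NDVI plus DEM-derived elevation and slope) and labels are simulated, not the QHLW data.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Invented per-pixel features: two seasonal NDVI values plus elevation and
# slope from a DEM; labels stand in for vegetation classes.
rng = np.random.default_rng(0)
X = np.column_stack([
    rng.uniform(0.0, 0.9, 1000),    # spring NDVI
    rng.uniform(0.0, 0.9, 1000),    # summer NDVI
    rng.uniform(3000, 4500, 1000),  # elevation (m)
    rng.uniform(0, 30, 1000),       # slope (degrees)
])
y = (X[:, 1] > 0.4).astype(int) + (X[:, 2] > 4000).astype(int)  # 3 toy classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X_tr, y_tr)
print("overall accuracy:", round(clf.score(X_te, y_te), 3))
```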
Requirements Engineering Model: Role Based Goal Oriented Model
Sandfreni; Surendro Ir. Kridanto
2016-01-01
Approaching requirements engineering from an intentional perspective is one of the lines of argument in the field of requirements engineering. That approach can explain the characteristics of the behavior of an actor. The use of Goal Based Workflow and the KAOS method in iStar modeling might help the system analyst to gain knowledge about the internal process inside each actor sequentially, such that the whole sequence of activities to achieve the goal is exposed clearly in those actors' internal...
Howard, Y; Gravell, A; Ferreira, C; Augusto, J C
2011-01-01
Trace analysis can be a useful way to discover problems in a program under test. Rather than writing a special purpose trace analysis tool, this paper proposes that traces can usefully be analysed by checking them against a formal model using a standard model-checker or else an animator for executable specifications. These techniques are illustrated using a Travel Agent case study implemented in J2EE. We added trace beans to this code that write trace information to a database. The traces are then extracted and converted into a form suitable for analysis by Spin, a popular model-checker, and Pro-B, a model-checker and animator for the B notation. This illustrates the technique, and also the fact that such a system can have a variety of models, in different notations, that capture different features. These experiments have demonstrated that model-based trace-checking is feasible. Future work is focussed on scaling up the approach to larger systems by increasing the level of automation.
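A toy version of model-based trace-checking, using a hand-written finite-state model instead of Spin or Pro-B: replay an event trace against allowed transitions and flag violations. The states and events are invented, not taken from the Travel Agent case study.

```python
# Allowed transitions of a tiny finite-state model: (state, event) -> next state.
MODEL = {
    ("idle", "search"): "browsing",
    ("browsing", "select"): "booking",
    ("booking", "pay"): "confirmed",
    ("booking", "cancel"): "idle",
}

def check_trace(trace, state="idle"):
    """Replay a recorded event trace and report the first model violation."""
    for i, event in enumerate(trace):
        nxt = MODEL.get((state, event))
        if nxt is None:
            return f"violation at step {i}: '{event}' not allowed in state '{state}'"
        state = nxt
    return f"trace accepted, final state '{state}'"

print(check_trace(["search", "select", "pay"]))
print(check_trace(["search", "pay"]))   # pays without selecting: violates the model
```

A real model-checker additionally explores all interleavings and temporal properties; this sketch only validates one recorded trace.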
Agent Based Model of Livestock Movements
Miron, D. J.; Emelyanova, I. V.; Donald, G. E.; Garner, G. M.
The modelling of livestock movements within Australia is of national importance for the purposes of the management and control of exotic disease spread, infrastructure development and the economic forecasting of livestock markets. In this paper an agent based model for the forecasting of livestock movements is presented. The model represents livestock movements from farm to farm through a saleyard. The decision of farmers to sell or buy cattle is often complex and involves many factors such as climate forecasts, commodity prices, the type of farm enterprise, the number of animals available and associated off-shore effects. In this model the farm agent's intelligence is implemented using a fuzzy decision tree that utilises two of these factors: the livestock price fetched at the last sale and the number of stock on the farm. On each iteration of the model farms choose either to buy, sell or abstain from the market, thus creating an artificial supply and demand. The buyers and sellers then congregate at the saleyard where livestock are auctioned using a second-price sealed bid. The price time series output by the model exhibits properties similar to those found in real livestock markets.
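A minimal sketch of the second-price sealed-bid (Vickrey) auction used to clear the saleyard in the model; the bidders and bids are invented.

```python
# Second-price sealed-bid auction: the highest bidder wins one lot but pays
# the second-highest bid, which makes truthful bidding a dominant strategy.
def second_price_auction(bids):
    """bids: {bidder: bid}. Returns (winner, price paid)."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price

bids = {"farm_a": 420.0, "farm_b": 395.0, "farm_c": 410.0}
winner, price = second_price_auction(bids)
print(winner, "wins and pays", price)   # farm_a wins and pays 410.0
```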
Ecosystem Based Business Model of Smart Grid
Lundgaard, Morten Raahauge; Ma, Zheng; Jørgensen, Bo Nørregaard
2015-01-01
This paper tries to investigate the ecosystem based business model in a smart grid infrastructure and the potential of value capture in the highly complex macro infrastructure such as smart grid. This paper proposes an alternative perspective to study the smart grid business ecosystem to support the infrastructural challenges, such as the interoperability of business components for smart grid. So far little research has explored the business ecosystem in the smart grid concept. The study on t...
Market Segmentation Using Bayesian Model Based Clustering
Van Hattum, P.
2009-01-01
This dissertation deals with two basic problems in marketing: market segmentation, which is the grouping of persons who share common aspects, and market targeting, which is focusing your marketing efforts on one or more attractive market segments. For the grouping of persons who share common aspects, a Bayesian model based clustering approach is proposed such that it can be applied to data sets that are specifically used for market segmentation. The cluster algorithm can handle very l...
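A hedged sketch of model-based clustering in the Gaussian case, assuming scikit-learn: each segment is a mixture component and BIC selects the number of segments. The dissertation's own Bayesian algorithm for marketing data is not reproduced here.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Simulated two-segment data; in model-based clustering each segment is one
# mixture component, and BIC trades off fit against model complexity.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(4, 1, (150, 2))])

best_k, best_bic = None, np.inf
for k in range(1, 6):
    gm = GaussianMixture(n_components=k, random_state=0).fit(X)
    bic = gm.bic(X)
    if bic < best_bic:
        best_k, best_bic = k, bic
print("segments chosen by BIC:", best_k)
```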
Enhancing the Combat ID Agent Based Model
Spaans, M.; Petiet, P.J.; Dean, D; Jackson, J.; Bradley, W.; Shan, L. Y.; Ka-Yoon, W.; Yongwei, D.W.; Kai, C.W.
2007-01-01
During previous Project Albert and International Data Farming Workshops (IDFW) and during discussions between Dstl and TNO, the suitability and feasibility of Agent Based Models (ABMs) to support research on Combat Identification (Combat ID) were examined. The objective of this research is to investigate the effect of (a large number of) different variations in Situation Awareness (SA), Target Identification (Target ID), Human Factors, and Tactics, Techniques, and Proce...
INVESTOR BASED PSYCHOLOGICAL DECISION MAKING MODEL
Mohd Alnajjar
2013-01-01
The significance of behavioral finance has been recognized. The intrinsic value of behavioral finance is to investigate the irrational attitudes of investors while making investment decisions. This study explores an investor based psychological decision making model to enhance understanding of the psychological decision making process of investors on the Islamabad Stock Exchange. A self-administered structured questionnaire was used for gathering data from 168 respondents (investors of ISE). Correla...
Building Information Modelling Incorporating Technology Based Assessment
Murphy, Maurice; Scott, Lloyd
2011-01-01
Building Information Modelling (BIM) is currently being developed as a virtual learning tool for construction and surveying students in the Dublin Institute of Technology. This advanced technology is also used to develop a technology based assessment practice for enhancing the learning environment of construction and surveying students. A theoretical design framework is presented in this paper, which combines advanced technology and assessment theory to create a virtual learning environment. ...
CEAI: CCM based Email Authorship Identification Model
Nizamani, Sarwat; Memon, Nasrullah
2013-01-01
In this paper we present a model for email authorship identification (EAI) by employing a Cluster-based Classification (CCM) technique. Traditionally, stylometric features have been successfully employed in various authorship analysis tasks; we extend the traditional feature-set to include some more interesting and effective features for email authorship identification (e.g. the last punctuation mark used in an email, the tendency of an author to use capitalization at the start of an email, o...
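Toy versions of two of the stylometric features the abstract names (the last punctuation mark used and the tendency to capitalize sentence starts), as a sketch of the feature-extraction step; the regular expressions and the example email are illustrative assumptions, not the paper's feature definitions.

```python
import re

def stylometric_features(text):
    """Extract two simple stylometric features from an email body."""
    punct = re.findall(r"[.!?,;:]", text)
    sentences = [s.strip() for s in re.split(r"[.!?]+", text) if s.strip()]
    capitalized = sum(1 for s in sentences if s[0].isupper())
    return {
        "last_punctuation": punct[-1] if punct else None,
        "capitalization_rate": capitalized / len(sentences) if sentences else 0.0,
    }

email = "hi John. thanks for the file! will check tomorrow"
print(stylometric_features(email))
# {'last_punctuation': '!', 'capitalization_rate': 0.0}
```

Feature vectors like these would then feed the cluster-based classification stage.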
Requirements engineering-based conceptual modeling
Insfrán, E.; O. Pastor; Wieringa, R.J.
2002-01-01
The software production process involves a set of phases where a clear relationship and smooth transitions between them should be introduced. In this paper, a requirements engineering-based conceptual modelling approach is introduced as a way to improve the quality of the software production process. The aim of this approach is to provide a set of techniques and methods to capture software requirements and to provide a way to move from requirements to a conceptual schema in a traceable way. T...
Genome Informed Trait-Based Models
Karaoz, U.; Cheng, Y.; Bouskill, N.; Tang, J.; Beller, H. R.; Brodie, E.; Riley, W. J.
2013-12-01
Trait-based approaches are powerful tools for representing microbial communities across both spatial and temporal scales within ecosystem models. Trait-based models (TBMs) represent the diversity of microbial taxa as stochastic assemblages with a distribution of traits constrained by trade-offs between these traits. Such representation with its built-in stochasticity allows the elucidation of the interactions between the microbes and their environment by reducing the complexity of microbial community diversity into a limited number of functional 'guilds' and letting them emerge across spatio-temporal scales. From the biogeochemical/ecosystem modeling perspective, the emergent properties of the microbial community can be directly translated into predictions of biogeochemical reaction rates and microbial biomass. The accuracy of TBMs depends on the identification of key traits of the microbial community members and on the parameterization of these traits. Current approaches to inform TBM parameterization are empirical (i.e., based on literature surveys). Advances in omic technologies (such as genomics, metagenomics, metatranscriptomics, and metaproteomics) pave the way to better initialize models that can be constrained in a generic or site-specific fashion. Here we describe the coupling of metagenomic data to the development of a TBM representing the dynamics of metabolic guilds from an organic carbon stimulated groundwater microbial community. Illumina paired-end metagenomic data were collected from the community as it transitioned successively through electron-accepting conditions (nitrate-, sulfate-, and Fe(III)-reducing), and used to inform estimates of growth rates and the distribution of metabolic pathways (i.e., aerobic and anaerobic oxidation, fermentation) across a spatially resolved TBM. We use this model to evaluate the emergence of different metabolisms and predict rates of biogeochemical processes over time. We compare our results to observational
Model based risk assessment - the CORAS framework
Gran, Bjoern Axel; Fredriksen, Rune; Thunem, Atoosa P-J.
2004-04-15
Traditional risk analysis and assessment is based on failure-oriented models of the system. In contrast to this, model-based risk assessment (MBRA) utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The target models are then used as input sources for complementary risk analysis and assessment techniques, as well as a basis for the documentation of the assessment results. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tested with successful outcome through a series of seven trials within the telemedicine and e-commerce areas. The CORAS project in general and the CORAS application of MBRA in particular have contributed positively to the visibility of model-based risk assessment and thus to the disclosure of several potentials for further exploitation of various aspects within this important research field. In that connection, the CORAS methodology's possibilities for further improvement towards utilization in more complex architectures and also in other application domains such as the nuclear field can be addressed. The latter calls for adapting the framework to address nuclear standards such as IEC 60880 and IEC 61513. For this development we recommend applying a trial-driven approach within the nuclear field. The tool-supported approach for combining risk analysis and system development also fits well with the HRP proposal for developing an Integrated Design Environment (IDE) providing efficient methods and tools to support control room systems design. (Author)
Realistic face modeling based on multiple deformations
GONG Xun; WANG Guo-yin
2007-01-01
On the basis of the assumption that the human face belongs to a linear class, a multiple-deformation model is proposed to recover face shape from a few points on a single 2D image. Compared to conventional methods, this study has the following advantages. First, the proposed modified 3D sparse deforming model is a noniterative approach that can compute global translation efficiently and accurately. Second, the overfitting problem can be alleviated based on the proposed multiple deformation model. Finally, by keeping the main features, the texture generated is realistic. Comparison results on ground truth data show that this novel method outperforms the existing methods and that realistic 3D faces can be recovered efficiently from a single photograph.
Model based monitoring of stormwater runoff quality
Birch, Heidi; Vezzaro, Luca; Mikkelsen, Peter Steen
2012-01-01
Monitoring of micropollutants (MP) in stormwater is essential to evaluate the impacts of stormwater on the receiving aquatic environment. The aim of this study was to investigate how different strategies for monitoring of stormwater quality (combination of model with field sampling) affect the information obtained about MPs discharged from the monitored system. A dynamic stormwater quality model was calibrated using MP data collected by volume-proportional and passive sampling in a storm drainage system in the outskirts of Copenhagen (Denmark), and a 10-year rain series was used to find annual average and maximum event mean concentrations. Use of this model reduced the uncertainty of predicted annual average concentrations compared to a simple stochastic method based solely on data. The predicted annual average obtained by using passive sampler measurements (one month installation) for...
On Reading-Based Writing Instruction Model
李大艳; 王建安
2012-01-01
English writing is a complex, integrative process involving comprehensive skills. A host of students are still unable to write a coherent English paragraph after having learned English for many years at school. Helping college students improve their writing competence is a great challenge facing English teaching in China. Research on writing teaching methods abroad has flourished; in China, however, research in this field lags far behind. There is a great need to search for a more efficient writing instruction model that can serve well in the Chinese context. Enlightened by Krashen's input hypothesis and Swain's output hypothesis, the writer puts forward the Reading-Based Writing Instruction Model. This paper aims to discuss the effectiveness of this model from different perspectives.
Business Models for NFC based mobile payments
Johannes Sang Un Chae
2015-01-01
Full Text Available Purpose: The purpose of the paper is to develop a business model framework for NFC based mobile payment solutions consisting of four mutually interdepended components: the value service, value network, value architecture, and value finance. Design: Using a comparative case study method, the paper investigates Google Wallet and ISIS Mobile Wallet and their underlying business models. Findings: Google Wallet and ISIS Mobile Wallet are focusing on providing an enhanced customer experience with their mobile wallet through a multifaceted value proposition. The delivery of its offering requires cooperation from multiple stakeholders and the creation of an ecosystem. Furthermore, they focus on the scalability of their value propositions. Originality / value: The paper offers an applicable business model framework that allows practitioners and academics to study current and future mobile payment approaches.
Model-based vision for space applications
Chaconas, Karen; Nashman, Marilyn; Lumia, Ronald
1992-01-01
This paper describes a method for tracking moving image features by combining spatial and temporal edge information with model based feature information. The algorithm updates the two-dimensional position of object features by correlating predicted model features with current image data. The results of the correlation process are used to compute an updated model. The algorithm makes use of a high temporal sampling rate with respect to spatial changes of the image features and operates in a real-time multiprocessing environment. Preliminary results demonstrate successful tracking for image feature velocities between 1.1 and 4.5 pixels every image frame. This work has applications for docking, assembly, retrieval of floating objects and a host of other space-related tasks.
Model-based target and background characterization
Mueller, Markus; Krueger, Wolfgang; Heinze, Norbert
2000-07-01
Up to now most approaches to target and background characterization (and exploitation) concentrate solely on the information given by pixels. In many cases this is a complex and unprofitable task. During the development of automatic exploitation algorithms the main goal is the optimization of certain performance parameters. These parameters are measured during test runs while applying one algorithm with one parameter set to images that consist of image domains with very different characteristics (targets and various types of background clutter). Model based geocoding and registration approaches provide means for utilizing the information stored in GIS (Geographical Information Systems). The geographical information stored in the various GIS layers can define ROE (Regions of Expectation) and may allow for dedicated algorithm parametrization and development. ROI (Region of Interest) detection algorithms (in most cases MMO (Man-Made Object) detection) use implicit target and/or background models. The ROI detection algorithms utilize gradient direction models that have to be matched with transformed image domain data. In most cases simple threshold calculations on the match results discriminate target object signatures from the background. The geocoding approaches extract line-like structures (street signatures) from the image domain and match the graph constellation against a vector model extracted from a GIS data base. Apart from geocoding, the algorithms can also be used for image-to-image registration (multi sensor and data fusion) and for the creation and validation of geographical maps.
Inference-based procedural modeling of solids
Biggers, Keith
2011-11-01
As virtual environments become larger and more complex, there is an increasing need for more automated construction algorithms to support the development process. We present an approach for modeling solids by combining prior examples with a simple sketch. Our algorithm uses an inference-based approach to incrementally fit patches together in a consistent fashion to define the boundary of an object. This algorithm samples and extracts surface patches from input models, and develops a Petri net structure that describes the relationship between patches along an imposed parameterization. Then, given a new parameterized line or curve, we use the Petri net to logically fit patches together in a manner consistent with the input model. This allows us to easily construct objects of varying sizes and configurations using arbitrary articulation, repetition, and interchanging of parts. The result of our process is a solid model representation of the constructed object that can be integrated into a simulation-based environment.
Halle, Lars Halvard
2016-01-01
Presenting the first systematic treatment of the behavior of Néron models under ramified base change, this book can be read as an introduction to various subtle invariants and constructions related to Néron models of semi-abelian varieties, motivated by concrete research problems and complemented with explicit examples. Néron models of abelian and semi-abelian varieties have become an indispensable tool in algebraic and arithmetic geometry since Néron introduced them in his seminal 1964 paper. Applications range from the theory of heights in Diophantine geometry to Hodge theory. We focus specifically on Néron component groups, Edixhoven’s filtration and the base change conductor of Chai and Yu, and we study these invariants using various techniques such as models of curves, sheaves on Grothendieck sites and non-archimedean uniformization. We then apply our results to the study of motivic zeta functions of abelian varieties. The final chapter contains a list of challenging open questions. This book is a...
Concept Tree Based Information Retrieval Model
Chunyan Yuan
2014-05-01
Full Text Available This paper proposes a novel concept-based query expansion technique named the Markov concept tree model (MCTM), which discovers term relationships through a concept tree deduced from a term Markov network. We address two important issues for query expansion: the selection and the weighting of expansion search terms. In contrast to earlier methods, queries are expanded by adding those terms that are most similar to the concept of the query, rather than selecting terms that are similar to single query terms. Utilizing a Markov network constructed from the co-occurrence information of the terms in the collection, the method generates a concept tree for each original query term, removes the redundant and irrelevant nodes in the concept tree, and then adjusts the weight of the original query and the weight of the expansion terms based on a pruning algorithm. We use this model for query expansion and evaluate its effectiveness by examining the accuracy and robustness of the expansion methods. Compared with the baseline model, experiments on a standard dataset reveal that this method can achieve better query quality.
Model-based explanation of plant knowledge
Huuskonen, P.J. [VTT Electronics, Oulu (Finland). Embedded Software]
1997-12-31
This thesis deals with computer explanation of knowledge related to design and operation of industrial plants. The needs for explanation are motivated through case studies and literature reviews. A general framework for analysing plant explanations is presented. Prototypes demonstrate key mechanisms for implementing parts of the framework. Power plants, steel mills, paper factories, and high energy physics control systems are studied to set requirements for explanation. The main problems are seen to be either lack or abundance of information. Design knowledge in particular is found missing at plants. Support systems and automation should be enhanced with ways to explain plant knowledge to the plant staff. A framework is formulated for analysing explanations of plant knowledge. It consists of three parts: 1. a typology of explanation, organised by the class of knowledge (factual, functional, or strategic) and by the target of explanation (processes, automation, or support systems), 2. an identification of explanation tasks generic for the plant domain, and 3. an identification of essential model types for explanation (structural, behavioural, functional, and teleological). The tasks use the models to create the explanations of the given classes. Key mechanisms are discussed to implement the generic explanation tasks. Knowledge representations based on objects and their relations form a vocabulary to model and present plant knowledge. A particular class of models, means-end models, are used to explain plant knowledge. Explanations are generated through searches in the models. Hypertext is adopted to communicate explanations over dialogue based on context. The results are demonstrated in prototypes. The VICE prototype explains the reasoning of an expert system for diagnosis of rotating machines at power plants. The Justifier prototype explains design knowledge obtained from an object-oriented plant design tool. Enhanced access mechanisms into on-line documentation are
Model-based vision for car following
Schneiderman, Henry; Nashman, Marilyn; Lumia, Ronald
1993-08-01
This paper describes a vision processing algorithm that supports autonomous car following. The algorithm visually tracks the position of a 'lead vehicle' from the vantage of a pursuing 'chase vehicle.' The algorithm requires a 2-D model of the back of the lead vehicle. This model is composed of line segments corresponding to features that give rise to strong edges. There are seven sequential stages of computation: (1) extracting edge points; (2) associating extracted edge points with the model features; (3) determining the position of each model feature; (4) determining the model position; (5) updating the motion model of the object; (6) predicting the position of the object in the next image; (7) predicting the location of all object features from the prediction of object position. All processing is confined to the 2-D image plane. The 2-D model location computed in this processing is used to determine the position of the lead vehicle with respect to a 3-D coordinate frame affixed to the chase vehicle. This algorithm has been used as part of a complete system to drive an autonomous vehicle, a High Mobility Multipurpose Wheeled Vehicle (HMMWV), such that it follows a lead vehicle at speeds up to 35 km/hr. The algorithm runs at an update rate of 15 Hertz and has a worst case computational delay of 128 ms. The algorithm is implemented under the NASA/NBS Standard Reference Model for Telerobotic Control System Architecture (NASREM) and runs on a dedicated vision processing engine and a VME-based multiprocessor system.
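A bare-bones stand-in for the predict/update loop of stages (4) through (7), using a simple alpha-beta (constant-velocity) filter rather than the paper's correlation-based tracker; the gains, the 15 Hz rate and the measurements are invented.

```python
# Alpha-beta tracker: estimate object position and velocity from noisy
# feature measurements, then predict where to look in the next frame.
def track(measurements, dt=1.0 / 15):      # 15 Hz update rate, as in the paper
    x = y = vx = vy = 0.0
    alpha, beta = 0.85, 0.005              # assumed smoothing gains
    for mx, my in measurements:
        # Predict the position from the current motion model.
        px, py = x + vx * dt, y + vy * dt
        # Update the state from the measured feature position.
        rx, ry = mx - px, my - py
        x, y = px + alpha * rx, py + alpha * ry
        vx, vy = vx + beta * rx / dt, vy + beta * ry / dt
    # Predicted position in the next frame (stage 6).
    return (x + vx * dt, y + vy * dt)

print(track([(100, 50), (102, 50), (104, 51), (106, 51)]))
```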
A Visual Attention Model Based Image Fusion
Rishabh Gupta
2013-12-01
Full Text Available To develop an efficient image fusion algorithm based on a visual attention model for images with distinct objects. Image fusion is the process of combining complementary information from multiple images of the same scene into a single image, so that the resultant image contains a more accurate description of the scene than any of the individual source images. The two basic fusion techniques are pixel-level and region-level fusion. Pixel-level fusion operates on each pixel separately; its common techniques are averaging, stationary wavelet transforms, discrete wavelet transforms and Principal Component Analysis (PCA). Because of its lower sensitivity to noise and mis-registration, region-level image fusion is an emerging approach in the field of multifocus image fusion. The most appreciated region-based approaches are multifocus image fusion using the concepts of focal connectivity and spatial frequency. These two methods work well on still images as well as on video frames as inputs. A new region-based technique is proposed for multifocus images having distinct objects. The method is based on visual attention models, and the results obtained are encouraging for input images with distinct objects. The results of the proposed method are evaluated using Tenengrad and extended spatial frequency as performance parameters, on several pairs of multifocus input images such as microscopic images, forensic images and video frames.
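As a rough illustration of region-level multifocus fusion driven by spatial frequency (one of the focus measures named above), the following hedged numpy sketch selects, block by block, the source whose spatial frequency is higher. The fixed square tiling and block size are assumptions, not the paper's method, and image dimensions are assumed to be multiples of the block size:

```python
import numpy as np

def spatial_frequency(block):
    # Row and column frequency of an image block (higher = sharper)
    rf = np.sqrt(np.mean(np.diff(block, axis=1) ** 2))
    cf = np.sqrt(np.mean(np.diff(block, axis=0) ** 2))
    return np.sqrt(rf ** 2 + cf ** 2)

def fuse_multifocus(img_a, img_b, block=16):
    """Block-wise multifocus fusion: keep the block with the higher
    spatial frequency, i.e. the one that is in better focus."""
    fused = img_a.copy()
    h, w = img_a.shape
    for i in range(0, h, block):
        for j in range(0, w, block):
            a = img_a[i:i + block, j:j + block]
            b = img_b[i:i + block, j:j + block]
            if spatial_frequency(b) > spatial_frequency(a):
                fused[i:i + block, j:j + block] = b
    return fused
```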
Model-Based Method for Sensor Validation
Vatan, Farrokh
2012-01-01
Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. As a result, these methods can only predict the most probable faulty sensors, subject to the initial probabilities defined for the failures. The method developed in this work takes a model-based approach and identifies the faulty sensors (if any) that can be logically inferred from the model of the system and the sensor readings (observations). The method is also more suitable for systems for which it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concept of analytical redundant relations (ARRs).
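To make the ARR idea concrete, here is a minimal, hypothetical sketch (not the developed algorithm): three redundancy relations over three sensors are evaluated against readings, and fault candidates are isolated by logical intersection of the violated relations. The sensor layout and tolerance are invented for illustration:

```python
# Hypothetical three-sensor example: s1 and s2 each measure x, s3 measures 2x.
# Each tuple encodes one analytical redundant relation (ARR) as a residual
# function that should be ~0 when the contributing sensors are healthy.
ARRS = [
    ("r1", lambda s: s["s1"] - s["s2"],     {"s1", "s2"}),
    ("r2", lambda s: 2 * s["s1"] - s["s3"], {"s1", "s3"}),
    ("r3", lambda s: 2 * s["s2"] - s["s3"], {"s2", "s3"}),
]

def isolate_faulty(readings, tol=1e-3):
    """Sensors appearing in every violated ARR and in no satisfied one
    are the logically consistent fault candidates."""
    violated = [sensors for _, resid, sensors in ARRS
                if abs(resid(readings)) > tol]
    satisfied = [sensors for _, resid, sensors in ARRS
                 if abs(resid(readings)) <= tol]
    if not violated:
        return set()
    candidates = set.intersection(*violated)
    for ok in satisfied:
        candidates -= ok
    return candidates

print(isolate_faulty({"s1": 1.0, "s2": 1.0, "s3": 2.5}))  # -> {'s3'}
```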
Flow based vs. demand based energy-water modelling
Rozos, Evangelos; Nikolopoulos, Dionysis; Efstratiadis, Andreas; Koukouvinos, Antonios; Makropoulos, Christos
2015-04-01
The water flow in hydro-power generation systems is often used downstream to cover other types of demand, such as irrigation and water supply. However, the energy demand (operation of the hydro-power plant) and the water demand typically do not coincide. Furthermore, the water inflow into a reservoir is a stochastic process. Things become more complicated if renewable resources (wind turbines or photovoltaic panels) are included in the system. For this reason, the assessment and optimization of the operation of hydro-power systems are challenging tasks that require computer modelling. This modelling should not only simulate the water budget of the reservoirs and the energy production/consumption (pumped storage), but should also take into account the constraints imposed by the natural or artificial water network, using a flow routing algorithm. HYDRONOMEAS, for example, uses an elegant mathematical approach (digraph) to calculate the flow in a water network based on: the demands (input timeseries), the water availability (simulated) and the capacity of the transmission components (properties of channels, rivers, pipes, etc.). The input timeseries of demand should be estimated by another model and linked to the corresponding network nodes. A model that can be used to estimate these timeseries is UWOT, a bottom-up urban water cycle model that simulates the generation, aggregation and routing of water demand signals. In this study, we explore the potential of UWOT in simulating the operation of complex hydrosystems that include energy generation. The evident advantage of this approach is the use of a single model instead of one model for demand estimation and another for system simulation. An application of UWOT to a large-scale system is attempted in mainland Greece, in an area extending over 130×170 km². The challenges, the peculiarities and the advantages of this approach are examined and critically discussed.
Grid based calibration of SWAT hydrological models
D. Gorgan
2012-07-01
Full Text Available The calibration and execution of large hydrological models, such as SWAT (Soil and Water Assessment Tool), developed for large areas, high resolution and huge input data, require not only long execution times but also substantial computational resources. The SWAT hydrological model supports studies and predictions of the impact of land management practices on water, sediment, and agricultural chemical yields in complex watersheds. The paper presents the gSWAT application as a practical web-based solution for environmental specialists to calibrate extensive hydrological models and to run scenarios, hiding the complex control of processes and heterogeneous resources across the grid-based high-performance computing infrastructure. The paper highlights the basic functionalities of the gSWAT platform and the features of the graphical user interface. The presentation covers the development of working sessions, interactive control of calibration, direct and basic editing of parameters, process monitoring, and graphical and interactive visualization of the results. The experiments performed on different SWAT models and the results obtained demonstrate the benefits brought by the grid parallel and distributed environment as a processing platform. All the instances of SWAT models used in the reported experiments have been developed through the enviroGRIDS project, targeting the Black Sea catchment area.
Physics-based models of the plasmasphere
Jordanova, Vania K [Los Alamos National Laboratory; Pierrard, Viviane [BELGIUM; Goldstein, Jerry [SWRI; André, Nicolas [ESTEC/ESA; Kotova, Galina A [SRI, RUSSIA; Lemaire, Joseph F [BELGIUM; Liemohn, Mike W [U OF MICHIGAN; Matsui, H [UNIV OF NEW HAMPSHIRE
2008-01-01
We describe recent progress in physics-based models of the plasmasphere using the fluid and the kinetic approaches. Global modeling of the dynamics and influence of the plasmasphere is presented. Results from global plasmasphere simulations are used to understand and quantify (i) the electric potential pattern and evolution during geomagnetic storms, and (ii) the influence of the plasmasphere on the excitation of electromagnetic ion cyclotron (EMIC) waves and precipitation of energetic ions in the inner magnetosphere. The interactions of the plasmasphere with the ionosphere and the other regions of the magnetosphere are pointed out. We show the results of simulations for the formation of the plasmapause and discuss the influence of plasmaspheric wind and of ultra low frequency (ULF) waves on the transport of plasmaspheric material. Theoretical formulations used to model the electric field and plasma distribution in the plasmasphere are given. Model predictions are compared to recent CLUSTER and IMAGE observations, but also to results of earlier models and satellite observations.
Model based systems engineering for astronomical projects
Karban, R.; Andolfato, L.; Bristow, P.; Chiozzi, G.; Esselborn, M.; Schilling, M.; Schmid, C.; Sommer, H.; Zamparelli, M.
2014-08-01
Model Based Systems Engineering (MBSE) is an emerging field of systems engineering for which the System Modeling Language (SysML) is a key enabler for descriptive, prescriptive and predictive models. This paper surveys some of the capabilities, expectations and peculiarities of tool-assisted MBSE experienced in real-life astronomical projects. The examples range in depth and scope across a wide spectrum of applications (for example documentation, requirements, analysis, trade studies) and purposes (addressing a particular development need, or accompanying a project throughout many - if not all - of its lifecycle phases, fostering reuse and minimizing ambiguity). From the beginnings of the Active Phasing Experiment, through VLT instrumentation, VLTI infrastructure and the Telescope Control System for the E-ELT, to Wavefront Control for the E-ELT, we show how stepwise refinements of tools, processes and methods have provided tangible benefits to customary system engineering activities like requirement flow-down, design trade studies, interfaces definition, and validation, by means of a variety of approaches (like Model Checking, Simulation, Model Transformation) and methodologies (like OOSEM, State Analysis).
Modeling oil production based on symbolic regression
Numerous models have been proposed to forecast the future trends of oil production and almost all of them are based on some predefined assumptions with various uncertainties. In this study, we propose a novel data-driven approach that uses symbolic regression to model oil production. We validate our approach on both synthetic and real data, and the results prove that symbolic regression can effectively identify the true models beneath the oil production data and also make reliable predictions. Symbolic regression indicates that world oil production will peak in 2021, which broadly agrees with other techniques used by researchers. Our results also show that the rate of decline after the peak is almost half the rate of increase before the peak, and it takes nearly 12 years to drop 4% from the peak. These predictions are more optimistic than those in several other reports, and the smoother decline will provide the world, especially the developing countries, with more time to orchestrate mitigation plans. -- Highlights:
•A data-driven approach has been shown to be effective at modeling oil production.
•The Hubbert model could be discovered automatically from data.
•The peak of world oil production is predicted to appear in 2021.
•The decline rate after the peak is half of the increase rate before the peak.
•Oil production is projected to decline 4% post-peak
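Full symbolic regression searches a space of expressions (typically with genetic programming), which is beyond a short sketch. As a minimal stand-in, the following assumes the Hubbert (logistic-derivative) form that the study reports being rediscovered from data, and fits its parameters to synthetic production data with scipy; all numbers are illustrative:

```python
import numpy as np
from scipy.optimize import curve_fit

def hubbert(t, q_peak, t_peak, width):
    """Hubbert curve: logistic-derivative production profile,
    parameterised so that q_peak is the peak production rate."""
    z = np.exp(-(t - t_peak) / width)
    return 4 * q_peak * z / (1 + z) ** 2

# Synthetic yearly production data (arbitrary units), peaking near t = 50
rng = np.random.default_rng(0)
t = np.arange(100, dtype=float)
q = hubbert(t, 30.0, 50.0, 12.0) + rng.normal(0, 0.3, t.size)

params, _ = curve_fit(hubbert, t, q, p0=(q.max(), t[q.argmax()], 10.0))
print("fitted peak year index: %.1f" % params[1])
```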
A Comparison of Filter-based Approaches for Model-based Prognostics
National Aeronautics and Space Administration — Model-based prognostics approaches use domain knowledge about a system and its failure modes through the use of physics-based models. Model-based prognosis is...
Franklin, J.; Wejnert, K.E.; Hathaway, S.A.; Rochester, C.J.; Fisher, R.N.
2009-01-01
Aim: Several studies have found that more accurate predictive models of species' occurrences can be developed for rarer species; however, one recent study found the relationship between range size and model performance to be an artefact of sample prevalence, that is, the proportion of presence versus absence observations in the data used to train the model. We examined the effect of model type, species rarity class, species' survey frequency, detectability and manipulated sample prevalence on the accuracy of distribution models developed for 30 reptile and amphibian species. Location: Coastal southern California, USA. Methods: Classification trees, generalized additive models and generalized linear models were developed using species presence and absence data from 420 locations. Model performance was measured using sensitivity, specificity and the area under the curve (AUC) of the receiver-operating characteristic (ROC) plot based on twofold cross-validation, or on bootstrapping. Predictors included climate, terrain, soil and vegetation variables. Species were assigned to rarity classes by experts. The data were sampled to generate subsets with varying ratios of presences and absences to test for the effect of sample prevalence. Join count statistics were used to characterize spatial dependence in the prediction errors. Results: Species in classes with higher rarity were more accurately predicted than common species, and this effect was independent of sample prevalence. Although positive spatial autocorrelation remained in the prediction errors, it was weaker than was observed in the species occurrence data. The differences in accuracy among model types were slight. Main conclusions: Using a variety of modelling methods, more accurate species distribution models were developed for rarer than for more common species. This was presumably because it is difficult to discriminate suitable from unsuitable habitat for habitat generalists, and not as an artefact of the
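A hedged sketch of the evaluation protocol described above (one of the three model types, scored with twofold cross-validated AUC) might look as follows; the data here are random stand-ins with low prevalence, not the herpetofauna survey data:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical stand-ins: 420 survey sites, a few environmental predictors,
# and a presence/absence response with low prevalence (a "rare" species)
X = rng.normal(size=(420, 5))                 # climate/terrain/soil/vegetation
y = (X[:, 0] + rng.normal(0, 0.5, 420) > 1.2).astype(int)
print("sample prevalence:", y.mean())

model = DecisionTreeClassifier(max_depth=4, random_state=0)
auc = cross_val_score(model, X, y, cv=2, scoring="roc_auc")   # twofold CV
print("AUC per fold:", auc)
```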
Model-based Prognostics with Concurrent Damage Progression Processes
National Aeronautics and Space Administration — Model-based prognostics approaches rely on physics-based models that describe the behavior of systems and their components. These models must account for the...
Model-based phase-shifting interferometer
Liu, Dong; Zhang, Lei; Shi, Tu; Yang, Yongying; Chong, Shiyao; Miao, Liang; Huang, Wei; Shen, Yibing; Bai, Jian
2015-10-01
A model-based phase-shifting interferometer (MPI) is developed, in which a novel calculation technique is proposed instead of the traditional complicated system structure, to achieve versatile, high-precision and quantitative surface tests. In the MPI, a partial null lens (PNL) is employed to implement the non-null test. With some alternative PNLs, similar to the transmission spheres in ZYGO interferometers, the MPI provides a flexible test for general spherical and aspherical surfaces. Based on modern computer modeling techniques, a reverse iterative optimizing construction (ROR) method is employed for the retrace error correction of the non-null test, as well as figure error reconstruction. A self-compiled ray-tracing program is set up for accurate system modeling and reverse ray tracing. The surface figure error can then be easily extracted from the wavefront data in the form of Zernike polynomials by the ROR method. Experiments on spherical and aspherical tests are presented to validate the flexibility and accuracy. The test results are compared with those of a ZYGO interferometer (null tests), which demonstrates the high accuracy of the MPI. With such accuracy and flexibility, the MPI possesses large potential in modern optical shop testing.
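Extracting figure error "in the form of Zernike polynomials" reduces, at its core, to a least-squares fit of a Zernike basis to wavefront samples. A minimal sketch with a four-term basis (normalisation omitted; synthetic data; not the ROR implementation) could read:

```python
import numpy as np

def zernike_basis(rho, theta):
    """First four Zernike terms on the unit disk:
    piston, x-tilt, y-tilt, defocus (normalisation omitted for brevity)."""
    return np.column_stack([
        np.ones_like(rho),
        rho * np.cos(theta),
        rho * np.sin(theta),
        2 * rho ** 2 - 1,
    ])

# Synthetic wavefront samples inside the pupil
rng = np.random.default_rng(1)
rho = np.sqrt(rng.uniform(0, 1, 2000))       # uniform sampling over the disk
theta = rng.uniform(0, 2 * np.pi, 2000)
true_coeffs = np.array([0.0, 0.05, -0.02, 0.30])   # waves
w = zernike_basis(rho, theta) @ true_coeffs

# Least-squares extraction of the figure error coefficients
coeffs, *_ = np.linalg.lstsq(zernike_basis(rho, theta), w, rcond=None)
print(coeffs)   # recovers true_coeffs
```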
Human physiologically based pharmacokinetic model for propofol
Schnider Thomas W
2005-04-01
Full Text Available Abstract Background Propofol is widely used for both short-term anesthesia and long-term sedation. It has unusual pharmacokinetics because of its high lipid solubility. The standard approach to describing the pharmacokinetics is by a multi-compartmental model. This paper presents the first detailed human physiologically based pharmacokinetic (PBPK) model for propofol. Methods PKQuest, a freely distributed software routine (http://www.pkquest.com), was used for all the calculations. The "standard human" PBPK parameters developed in previous applications are used. It is assumed that the blood and tissue binding is determined by simple partition into the tissue lipid, which is characterized by two previously determined sets of parameters: (1) the value of the propofol oil/water partition coefficient; (2) the lipid fraction in the blood and tissues. The model was fit to the individual experimental data of Schnider et al. (Anesthesiology 1998; 88:1170), in which an initial bolus dose was followed 60 minutes later by a one-hour constant infusion. Results The PBPK model provides a good description of the experimental data over a large range of input dosage, subject age and fat fraction. Only one adjustable parameter (the liver clearance) is required to describe the constant infusion phase for each individual subject. In order to fit the bolus injection phase, for 10 of the 24 subjects it was necessary to assume that a fraction of the bolus dose was sequestered and then slowly released from the lungs (characterized by two additional parameters). The average weighted residual error (WRE) of the PBPK model fit to both the bolus and infusion phases was 15%, similar to the WRE for just the constant infusion phase obtained by Schnider et al. using a 6-parameter NONMEM compartmental model. Conclusion A PBPK model using standard human parameters and a simple description of tissue binding provides a good description of human propofol kinetics. The major advantage of a
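As a generic illustration of the flow-limited PBPK structure (not PKQuest, and not the paper's fitted parameters), a two-tissue sketch with a single clearance term can be written as a small ODE system:

```python
from scipy.integrate import solve_ivp

# Illustrative flow-limited PBPK parameters (invented, not the paper's values)
Q = {"fat": 0.35, "lean": 4.0}                  # tissue blood flows, L/min
V = {"blood": 5.0, "fat": 15.0, "lean": 40.0}   # volumes, L
P = {"fat": 30.0, "lean": 3.0}                  # tissue:blood partition coefficients
CL = 1.8                                        # liver clearance, L/min (per-subject)

def pbpk(t, y):
    c_b, c_fat, c_lean = y
    dfat = Q["fat"] * (c_b - c_fat / P["fat"]) / V["fat"]
    dlean = Q["lean"] * (c_b - c_lean / P["lean"]) / V["lean"]
    db = (Q["fat"] * (c_fat / P["fat"] - c_b)
          + Q["lean"] * (c_lean / P["lean"] - c_b)
          - CL * c_b) / V["blood"]
    return [db, dfat, dlean]

bolus = 140.0   # mg, instantaneous bolus into blood
sol = solve_ivp(pbpk, [0, 120], [bolus / V["blood"], 0, 0], max_step=0.5)
print(sol.y[0, -1])   # blood concentration at t = 120 min
```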
New global ICT-based business models
The New Global Business model (NEWGIBM) book describes the background, theory references, case studies, results and learning imparted by the NEWGIBM project, which is supported by ICT, to a research group during the period from 2005-2011. The book is a result of the efforts and the collaborative ...... NEWGIBM Cases Show? The Strategy Concept in Light of the Increased Importance of Innovative Business Models Successful Implementation of Global BM Innovation Globalisation Of ICT Based Business Models: Today And In 2020 ...... The NEWGIBM book serves as a part of the final evaluation and documentation of the NEWGIBM project and is supported by results from the following projects: M-commerce, Global Innovation, Global Ebusiness & M-commerce, The Blue Ocean project, International Center for Innovation and Women in Business, NEFFICS......
Trip Generation Model Based on Destination Attractiveness
YAO Liya; GUAN Hongzhi; YAN Hai
2008-01-01
Traditional trip generation forecasting methods use unified average trip generation rates to determine trip generation volumes in various traffic zones without considering the individual characteristics of each traffic zone. Therefore, the results can have significant errors. To reduce the forecasting error produced by uniform trip generation rates for different traffic zones, the behavior of each traveler was studied instead of the characteristics of the traffic zone. This paper gives a method for calculating the trip efficiency and the effect of traffic zones combined with a destination selection model based on disaggregate theory for trip generation. Beijing data is used with the trip generation method to predict trip volumes. The results show that the disaggregate model in this paper is more accurate than the traditional method. An analysis of the factors influencing traveler behavior and destination selection shows that the attractiveness of the traffic zone strongly affects the trip generation volume.
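A disaggregate destination-choice step of the kind described can be sketched as a multinomial logit, in which utility grows with zone attractiveness and decays with travel cost; the functional form and coefficients below are illustrative assumptions, not the paper's estimated model:

```python
import numpy as np

def destination_choice_probs(attractiveness, travel_cost, beta=0.1, gamma=1.0):
    """Multinomial-logit destination choice: utility rises with zone
    attractiveness and falls with travel cost."""
    utility = gamma * np.log(attractiveness) - beta * travel_cost
    expu = np.exp(utility - utility.max())    # numerically stable softmax
    return expu / expu.sum()

attractiveness = np.array([120.0, 80.0, 40.0])   # e.g. jobs/shops per zone
cost = np.array([10.0, 5.0, 2.0])                # minutes of travel
p = destination_choice_probs(attractiveness, cost)
print(1000 * p)   # expected distribution of 1000 trips from one origin
```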
Cloth Modeling Based on Particle System
钟跃崎; 王善元
2001-01-01
A physics-based particle system is employed for cloth modeling, supported by two basic algorithms: one is the construction of the internal and external forces acting on the particle system in terms of KES-F bending and shearing tests, and the other is the collision algorithm, in which collision detection is carried out by means of bisection of the time step and the collision response is handled according to the empirical law for frictionless collision. With these algorithms, the geometric state of particles can be expressed as ordinary differential equations, which are numerically solved by fourth-order Runge-Kutta integration. Different draping figures of cotton fabric and wool fabric prove that such a particle system is suitable for 3D cloth modeling and simulation.
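The force construction and RK4 integration described above can be illustrated with a minimal mass-spring strip, a 1-D stand-in for a cloth mesh (without the KES-F-derived coefficients or the collision algorithm); all constants are illustrative:

```python
import numpy as np

n = 10                                     # particles in a hanging chain
k, rest, m, g = 200.0, 0.1, 0.01, 9.81     # stiffness, rest length, mass, gravity

def forces(x):
    f = np.zeros_like(x)
    f[:, 1] -= m * g                       # gravity on every particle
    d = x[1:] - x[:-1]                     # spring vectors between neighbours
    length = np.linalg.norm(d, axis=1, keepdims=True)
    fs = k * (length - rest) * d / length  # linear spring force
    f[:-1] += fs
    f[1:] -= fs
    return f

def deriv(state):
    x, v = state
    a = forces(x) / m
    a[0] = 0                               # particle 0 is pinned
    return np.array([v, a])

x0 = np.column_stack([np.arange(n) * rest, np.zeros(n)])
state = np.array([x0, np.zeros((n, 2))])   # positions and velocities
h = 1e-3
for _ in range(1000):                      # fourth-order Runge-Kutta steps
    k1 = deriv(state)
    k2 = deriv(state + h / 2 * k1)
    k3 = deriv(state + h / 2 * k2)
    k4 = deriv(state + h * k3)
    state = state + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
print(state[0][-1])                        # free-end position after 1 s
```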
Adaptive Model-Based Mammogram Enhancement
Haindl, Michal; Remeš, Václav
Los Alamitos, USA: IEEE Computer Society CPS, 2014 - (Yetongnon, K.; Dipanda, A.; Chbeir, R.), s. 65-72 ISBN 978-1-4799-7978-3. [Tenth International Conference on Signal-Image Technology & Internet-Based Systems (SITIS 2014). Marrakech (MA), 23.11.2014-27.11.2014] R&D Projects: GA ČR(CZ) GA14-10911S Institutional support: RVO:67985556 Keywords : mammography * image enhancement * MRF * textural models Subject RIV: BD - Theory of Information http://library.utia.cas.cz/separaty/2014/RO/haindl-0436549.pdf
Agent-based modeling and simulation
Taylor, Simon
2014-01-01
Operational Research (OR) deals with the use of advanced analytical methods to support better decision-making. It is multidisciplinary with strong links to management science, decision science, computer science and many application areas such as engineering, manufacturing, commerce and healthcare. In the study of emergent behaviour in complex adaptive systems, Agent-based Modelling & Simulation (ABMS) is being used in many different domains such as healthcare, energy, evacuation, commerce, manufacturing and defense. This collection of articles presents a convenient introduction to ABMS with pa
SAT-Based Model Checking without Unrolling
Bradley, Aaron R.
A new form of SAT-based symbolic model checking is described. Instead of unrolling the transition relation, it incrementally generates clauses that are inductive relative to (and augment) stepwise approximate reachability information. In this way, the algorithm gradually refines the property, eventually producing either an inductive strengthening of the property or a counterexample trace. Our experimental studies show that induction is a powerful tool for generalizing the unreachability of given error states: it can refine away many states at once, and it is effective at focusing the proof search on aspects of the transition system relevant to the property. Furthermore, the incremental structure of the algorithm lends itself to a parallel implementation.
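The core SAT query of this approach asks whether a clause c is inductive relative to the current frame F, i.e. whether F ∧ c ∧ T ∧ ¬c′ is unsatisfiable. A toy sketch of just this query, using the python-sat package and a hand-encoded one-latch transition system (all encodings hypothetical, far from a full model checker), could be:

```python
from pysat.solvers import Glucose3

def prime(lit):
    # Shift the unprimed variable 1 to the primed variable 2
    # (valid only for this single-latch toy system)
    return lit + 1 if lit > 0 else lit - 1

def inductive_relative(frame, trans, clause):
    """IC3-style query: clause c is inductive relative to frame F
    iff F /\\ c /\\ T /\\ ~c' is unsatisfiable (no step can leave c)."""
    with Glucose3(bootstrap_with=frame + trans + [clause]) as s:
        # ~c' is the conjunction of the negated primed literals of c,
        # which fits naturally as solver assumptions
        return not s.solve(assumptions=[-prime(l) for l in clause])

# Toy transition relation for one latch x: x' = x, as CNF over x=1, x'=2
T = [[-1, 2], [1, -2]]
c = [-1]          # candidate clause: "x is false"
F = []            # empty frame (only the transition constraints)
print(inductive_relative(F, T, c))   # True: if x=0 now, x=0 at the next step
```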
Flow Modeling Based Wall Element Technique
Sabah Tamimi
2012-08-01
Full Text Available Two types of flow were examined: pressure-driven flow, and a combination of pressure-driven and Couette flow, both cases of confined turbulent flow in a smooth straight channel. A one-equation model was used to depict the turbulent viscosity of the confined flow, and a finite element technique based on a zone close to a solid wall was adopted for predicting the distribution of the pertinent variables in this zone. The technique was examined even in the case where the near-wall zone was extended away from the wall. The imposed technique has been validated and compares well with other techniques.
Knowledge-based geometric modeling in construction
Bonev, Martin; Hvam, Lars
2012-01-01
considerably high amount of their resources is required for designing and specifying the majority of their product assortment. As design decisions are hereby based on knowledge and experience about the behaviour and applicability of construction techniques and materials for a predefined design situation, smart...... tools need to be developed to support these activities. In order to achieve a higher degree of design automation, this study proposes a framework for using configuration systems within the CAD environment, together with suitable geometric modeling techniques, on the example of a Danish manufacturer for......
Compositional Testing for FSM-Based Models
Bilal Kanso
2014-06-01
Full Text Available The contribution of this paper is threefold: first, it defines a framework for modelling component-based systems, as well as a formalization of integration rules to combine their behaviour; this is based on finite state machines (FSMs). Second, it studies compositional conformance testing, i.e. checking whether an implementation made of conforming components combined with integration operators conforms to its specification. Third, it shows that the correctness of the global system can be obtained by testing the components involved in it against the projection of the global specification onto the specifications of the components. This result is useful for building adequate test purposes for testing components, taking into account the system into which they are plugged.
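One minimal instance of an integration rule is the synchronous product of two Mealy machines, which can be sketched as follows; the dict-based representation and the example machines are assumptions for illustration, not the paper's formalism:

```python
from itertools import product

def compose(delta_a, delta_b, inputs):
    """Synchronous product of two Mealy machines, each given as a dict
    (state, input) -> (next_state, output). Both read the same input;
    the product output pairs the component outputs."""
    states_a = {s for s, _ in delta_a}
    states_b = {s for s, _ in delta_b}
    delta = {}
    for (sa, sb), x in product(product(states_a, states_b), inputs):
        na, oa = delta_a[(sa, x)]
        nb, ob = delta_b[(sb, x)]
        delta[((sa, sb), x)] = ((na, nb), (oa, ob))
    return delta

# Two single-bit components: a toggler and an echo machine
toggler = {("s0", 0): ("s0", 0), ("s0", 1): ("s1", 1),
           ("s1", 0): ("s1", 1), ("s1", 1): ("s0", 0)}
echo = {("e", 0): ("e", 0), ("e", 1): ("e", 1)}

system = compose(toggler, echo, inputs=(0, 1))
print(system[(("s0", "e"), 1)])   # -> (('s1', 'e'), (1, 1))
```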
Agent Based Modeling in Public Administration
Osman SEYHAN
2013-06-01
Full Text Available This study aims to explore the role of agent-based modeling (ABM) as a simulation method in analyzing and formulating the policy-making processes of modern public management, which is under the pressure of the information age and the socio-political demands of open societies. ABM is a simulative research method for understanding complex adaptive systems (CAS) from the perspective of their constituent entities. In this study, by employing agent-based computing and the NetLogo language, two case studies about organizational design and organizational risk analyses have been examined. Results revealed that ABM is an efficient platform for determining the optimum results from various scenarios, in order to understand structures and processes of policy making in both organizational design and risk management. In the future, more research is needed on the role of ABM in understanding and making decisions about the future of CAS, especially in conjunction with developments in computer technologies.
Integrated Semantic Similarity Model Based on Ontology
LIU Ya-Jun; ZHAO Yun
2004-01-01
To solve the problem of the inadequacy of semantic processing in intelligent question answering systems, an integrated semantic similarity model which calculates the semantic similarity using geometric distance and information content is presented in this paper. With the help of the interrelationships between concepts, the information content of concepts and the strength of the edges in the ontology network, we can calculate the semantic similarity between two concepts and provide information for the further calculation of the semantic similarity between a user's question and the answers in the knowledge base. The results of the experiments on the prototype have shown that the semantic problem in natural language processing can also be solved with the help of the knowledge and the abundant semantic information in an ontology. More than 90% accuracy with less than 50 ms average searching time has been reached in the intelligent question answering prototype system based on ontology. The results are very satisfactory.
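A toy sketch of such an integrated measure, blending edge-count distance with the information content of the lowest common subsumer over a hypothetical five-concept ontology (not the paper's ontology, corpus or weights), might be:

```python
import math

# Hypothetical toy ontology: child -> parent, with corpus frequencies
PARENT = {"dog": "mammal", "cat": "mammal", "mammal": "animal",
          "bird": "animal", "animal": None}
FREQ = {"dog": 30, "cat": 25, "bird": 20, "mammal": 60, "animal": 100}
TOTAL = FREQ["animal"]
MAX_IC = -math.log(min(FREQ.values()) / TOTAL)

def ancestors(c):
    chain = [c]
    while PARENT[c] is not None:
        c = PARENT[c]
        chain.append(c)
    return chain

def ic(c):
    # Information content: rarer concepts are more informative
    return -math.log(FREQ[c] / TOTAL)

def similarity(a, b, w=0.5):
    """Blend of (i) inverse geometric distance in the ontology graph and
    (ii) normalised information content of the lowest common subsumer."""
    anc_a, anc_b = ancestors(a), ancestors(b)
    lcs = next(c for c in anc_a if c in anc_b)   # lowest common subsumer
    dist = anc_a.index(lcs) + anc_b.index(lcs)   # edge-count distance
    return w / (1.0 + dist) + (1 - w) * ic(lcs) / MAX_IC

print(similarity("dog", "cat"))    # higher: share the 'mammal' subsumer
print(similarity("dog", "bird"))   # lower: only share the root 'animal'
```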
Quantum Mechanics Based Multiscale Modeling of Materials
Lu, Gang
2013-03-01
We present two quantum mechanics based multiscale approaches that can simulate extended defects in metals accurately and efficiently. The first approach (QCDFT) can treat multimillion-atom systems effectively via density functional theory (DFT). The method is an extension of the original quasicontinuum approach with DFT as its sole energetic formulation. The second method (QM/MM) concerns quantum mechanics/molecular mechanics coupling based on constrained density functional theory, which provides an exact framework for a self-consistent quantum mechanical embedding. Several important materials problems will be addressed using the multiscale modeling approaches, including hydrogen-assisted cracking in Al, magnetism-controlled dislocation properties in Fe, and Si pipe diffusion along the Al dislocation core. We acknowledge the support from the Office of Naval Research and the Army Research Office.
Python-Based Applications for Hydrogeological Modeling
Khambhammettu, P.
2013-12-01
Python is a general-purpose, high-level programming language whose design philosophy emphasizes code readability. Add-on packages supporting fast array computation (numpy), plotting (matplotlib) and scientific/mathematical functions (scipy) have resulted in a powerful ecosystem for scientists interested in exploratory data analysis, high-performance computing and data visualization. Three examples are provided to demonstrate the applicability of the Python environment in hydrogeological applications. Python programs were used to model an aquifer test and estimate aquifer parameters at a Superfund site. The aquifer test conducted at a Groundwater Circulation Well was modeled with the Python/FORTRAN-based TTIM Analytic Element Code. The aquifer parameters were estimated with PEST such that a good match was produced between the simulated and observed drawdowns. Python scripts were written to interface with PEST and visualize the results. A convolution-based approach was used to estimate source concentration histories based on observed concentrations at receptor locations. Unit Response Functions (URFs) that relate the receptor concentrations to a unit release at the source were derived with the ATRANS code. The impact of any releases at the source could then be estimated by convolving the source release history with the URFs. Python scripts were written to compute and visualize receptor concentrations for user-specified source histories. The framework provided a simple and elegant way to test various hypotheses about the site. A Python/FORTRAN-based program TYPECURVEGRID-Py was developed to compute and visualize groundwater elevations and drawdown through time in response to a regional uniform hydraulic gradient and the influence of pumping wells, using either the Theis solution for a fully-confined aquifer or the Hantush-Jacob solution for a leaky confined aquifer. The program supports an arbitrary number of wells that can operate according to arbitrary schedules. The
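For concreteness, the Theis drawdown computation mentioned above takes only a few lines in this ecosystem; the parameter values below are illustrative, and this is a generic sketch rather than the TYPECURVEGRID-Py code:

```python
import numpy as np
from scipy.special import exp1
import matplotlib.pyplot as plt

def theis_drawdown(r, t, Q, T, S):
    """Theis (1935) drawdown for a fully confined aquifer:
    s = Q/(4*pi*T) * W(u),  u = r^2*S/(4*T*t),  W = exponential integral E1."""
    u = r ** 2 * S / (4.0 * T * t)
    return Q / (4.0 * np.pi * T) * exp1(u)

t = np.logspace(-2, 1, 100)   # days
s = theis_drawdown(r=50.0, t=t, Q=500.0, T=250.0, S=2e-4)   # m, m3/d, m2/d

plt.semilogx(t, s)
plt.xlabel("time (days)")
plt.ylabel("drawdown (m)")
plt.show()
```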
Model based control of refrigeration systems
Sloth Larsen, L.F.
2005-11-15
The subject of this Ph.D. thesis is model-based control of refrigeration systems. Model-based control covers a variety of control types that incorporate mathematical models. In this thesis the main subject has therefore been restricted to system-optimizing control. The optimizing control is divided into two layers, where the system-oriented top layer deals with set-point-optimizing control and the lower layer deals with dynamical optimizing control in the subsystems. The thesis has two main contributions, i.e. a novel approach for set-point optimization and a novel approach for desynchronization based on dynamical optimization. The focus in the development of the proposed set-point-optimizing control has been on deriving a simple and general method that can easily be applied to various compositions of the same class of systems, such as refrigeration systems. The method is based on a set of parameter-dependent static equations describing the considered process. By adapting the parameters to the given process, predicting the steady state and computing a steady-state gradient of the cost function, the process can be driven continuously towards zero gradient, i.e. the optimum (if the cost function is convex). The method furthermore deals with system constraints by introducing barrier functions; hereby the best possible performance taking the given constraints into account can be obtained, e.g. under extreme operational conditions. The proposed method has been applied on a test refrigeration system, placed at Aalborg University, for minimization of the energy consumption. Here it was proved that by using general static parameter-dependent system equations it was possible to drive the set-points close to the optimum and thus reduce the power consumption by up to 20%. In the dynamical optimizing layer the idea is to optimize the operation of the subsystems or the groupings of subsystems that limit the obtainable system performance. In systems
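The set-point layer's idea, following the steady-state gradient of the cost augmented with a barrier for the constraints, can be caricatured in a few lines; the cost, constraint and step sizes below are toy assumptions, not the thesis's refrigeration model:

```python
def optimize_setpoint(grad, x0, g, mu=0.1, lr=0.05, steps=200):
    """Continuously drive a set-point towards zero gradient of the cost
    plus a log-barrier -mu*ln(-g(x)) enforcing the constraint g(x) < 0."""
    x = float(x0)
    for _ in range(steps):
        # Barrier gradient is -mu*g'(x)/g(x); g'(x) = 1 for this g
        x -= lr * (grad(x) - mu / g(x))
    return x

# Toy energy cost (x - 2)^2 with unconstrained optimum at x = 2,
# constrained to x < 1.5 (g(x) = x - 1.5 must stay negative)
grad = lambda x: 2.0 * (x - 2.0)
g = lambda x: x - 1.5
print(optimize_setpoint(grad, x0=0.0, g=g))   # settles just below 1.5
```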
Biologically based multistage modeling of radiation effects
William Hazelton; Suresh Moolgavkar; E. Georg Luebeck
2005-08-30
This past year we have made substantial progress in modeling the contribution of homeostatic regulation to low-dose radiation effects and carcinogenesis. We have worked to refine and apply our multistage carcinogenesis models to explicitly incorporate cell cycle states, simple and complex damage, checkpoint delay, slow and fast repair, differentiation, and apoptosis to study the effects of low-dose ionizing radiation in mouse intestinal crypts, as well as in other tissues. We have one paper accepted for publication in "Advances in Space Research", and another manuscript in preparation describing this work. I also wrote a chapter describing our combined cell-cycle and multistage carcinogenesis model that will be published in a book on stochastic carcinogenesis models edited by Wei-Yuan Tan. In addition, we organized and held a workshop on "Biologically Based Modeling of Human Health Effects of Low Dose Ionizing Radiation", July 28-29, 2005 at Fred Hutchinson Cancer Research Center in Seattle, Washington. We had over 20 participants, including Mary Helen Barcellos-Hoff as keynote speaker, talks by most of the low-dose modelers in the DOE low-dose program, experimentalists including Les Redpath (and Mary Helen), Noelle Metting from DOE, and Tony Brooks. It appears that homeostatic regulation may be central to understanding low-dose radiation phenomena. The primary effects of ionizing radiation (IR) are cell killing, delayed cell cycling, and induction of mutations. However, homeostatic regulation causes cells that are killed or damaged by IR to eventually be replaced. Cells with an initiating mutation may have a replacement advantage, leading to clonal expansion of these initiated cells. Thus we have focused particularly on modeling effects that disturb homeostatic regulation as early steps in the carcinogenic process. There are two primary considerations that support our focus on homeostatic regulation. First, a number of
GLDAS Land Surface Models based Aridity Indices
Pande, S.; Ghazanfari, S.
2011-12-01
Identification of dryland areas is crucial to guide policy aimed at intervening in water-stressed areas and addressing their perennial livelihood or food insecurity. Aridity indices based on spatially relative conditions, such as the NCEP aridity index, allow cross-comparison of dry conditions between sites. The NCEP aridity index is based on the ratio of annual precipitation (supply) to annual potential evaporation (demand). Such an index ignores the subannual-scale competition between evaporation and drainage functions, as well as rainfall and temperature regimes, which determines the partitioning of the annual precipitation supply between the two competing demands of evaporation and runoff. Here we introduce aridity indices based on these additional considerations, using soil moisture time series for the past three decades from three Land Surface Models (LSMs), and compare them with the NCEP index. We analyze global monthly soil moisture time series (385 months) at 1 x 1 degree spatial resolution as modeled by three GLDAS LSMs - VIC, MOSAIC and NOAH. The first eigenvector from Empirical Orthogonal Function (EOF) analysis is extracted, as it is the most dominant spatial template of global soil moisture conditions. The frequency with which this dominant soil moisture mode at a location is not exceeded by other locations is calculated and used as our proposed aridity index. An area is indexed drier (relative to other areas in the world) if its frequency of nonexceedance is lower. The EOF analysis reveals that the first eigenvector explains approximately 32%, 43% and 47% of the variance explained by the first 385 eigenvectors for VIC, MOSAIC and NOAH, respectively. The temporal coefficients associated with it for all three LSMs show seasonality, with a jump in trend around the year 1999 for NOAH and MOSAIC. The VIC aridity index displays the pattern most closely resembling that of NCEP, though all LSM-based indices isolate dominant dryland areas. However, all three LSMs identify some parts of
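The EOF step can be sketched with a plain SVD; the matrix below is a random stand-in for the (months x grid cells) GLDAS soil-moisture anomalies, and the rank-percentile at the end mirrors the proposed nonexceedance index (EOF sign conventions left aside):

```python
import numpy as np

# Hypothetical stand-in for a (time x gridcell) soil-moisture anomaly matrix
rng = np.random.default_rng(0)
sm = rng.normal(size=(385, 500))     # 385 months, 500 grid cells
sm -= sm.mean(axis=0)                # remove the temporal mean

# EOF analysis via SVD: rows of vt are spatial patterns (eigenvectors);
# u[:, k] * s[k] are the temporal coefficients of mode k
u, s, vt = np.linalg.svd(sm, full_matrices=False)
var_frac = s ** 2 / np.sum(s ** 2)
print("variance explained by EOF-1: %.1f%%" % (100 * var_frac[0]))

# Rank-percentile of each cell's loading on the dominant mode: the fraction
# of other cells that do not exceed it (lower = relatively drier end)
eof1 = vt[0]
rank = eof1.argsort().argsort()
aridity_index = rank / (eof1.size - 1.0)
print(aridity_index[:5])
```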
Models-Based Practice: Great White Hope or White Elephant?
Casey, Ashley
2014-01-01
Background: Many critical curriculum theorists in physical education have advocated a model- or models-based approach to teaching in the subject. This paper explores the literature base around models-based practice (MBP) and asks if this multi-models approach to curriculum planning has the potential to be the great white hope of pedagogical change…