WorldWideScience

Sample records for classification-tree based models

  1. Predictive mapping of soil organic carbon in wet cultivated lands using classification-tree based models

    DEFF Research Database (Denmark)

    Kheir, Rania Bou; Greve, Mogens Humlekrog; Bøcher, Peder Klith;

    2010-01-01

    Soil organic carbon (SOC) is one of the most important carbon stocks globally and has large potential to affect global climate. Distribution patterns of SOC in Denmark constitute a nation-wide baseline for studies on soil carbon changes (with respect to the Kyoto protocol). This paper predicts and maps the geographic distribution of SOC across Denmark using remote sensing (RS), geographic information systems (GISs) and decision-tree modeling (un-pruned and pruned classification trees). Seventeen parameters, i.e. parent material, soil type, landscape type, elevation, slope gradient, slope aspect, mean curvature, plan curvature, profile curvature, flow accumulation, specific catchment area, tangent slope, tangent curvature, steady-state wetness index, Normalized Difference Vegetation Index (NDVI), Normalized Difference Wetness Index (NDWI) and Soil Color Index (SCI) were generated to...

  2. Investigation of the Effect of Traffic Parameters on Road Hazard Using Classification Tree Model

    Directory of Open Access Journals (Sweden)

    Md. Mahmud Hasan

    2012-09-01

    This paper presents a method for the identification of hazardous situations on freeways. For this study, an approximately 18 km long section of the Eastern Freeway in Melbourne, Australia was selected as a test bed. Three categories of data, i.e. traffic, weather and accident records, were used for the analysis and modelling. A classification tree based model was developed to estimate crash risk probability. In formulating the models, it was found that weather conditions did not have a significant impact on accident occurrence, so the classification tree was built using only two traffic indices: traffic flow and vehicle speed. The formulated classification tree is able to identify possible hazard and non-hazard situations on the freeway. The outcome of the study will aid hazard mitigation strategies.
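    The record above reduces the classifier to a tree over just two traffic indices. A minimal sketch of that idea in Python with scikit-learn, assuming synthetic data and an invented labelling rule (the study derived hazard labels from Eastern Freeway accident records):

```python
# Sketch of a two-feature hazard tree: classify freeway situations as
# hazard / non-hazard from traffic flow and vehicle speed only.
# Data ranges and the labelling rule below are hypothetical.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
flow = rng.uniform(200, 2200, 500)        # veh/h, hypothetical range
speed = rng.uniform(40, 110, 500)         # km/h, hypothetical range
# assumed toy rule: high flow combined with low speed is hazardous
hazard = ((flow > 1500) & (speed < 70)).astype(int)

X = np.column_stack([flow, speed])
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, hazard)
print(export_text(tree, feature_names=["traffic_flow", "vehicle_speed"]))
```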

  3. Predictive mapping of soil organic carbon in wet cultivated lands using classification-tree based models: the case study of Denmark.

    Science.gov (United States)

    Bou Kheir, Rania; Greve, Mogens H; Bøcher, Peder K; Greve, Mette B; Larsen, René; McCloy, Keith

    2010-05-01

    Soil organic carbon (SOC) is one of the most important carbon stocks globally and has large potential to affect global climate. Distribution patterns of SOC in Denmark constitute a nation-wide baseline for studies on soil carbon changes (with respect to the Kyoto protocol). This paper predicts and maps the geographic distribution of SOC across Denmark using remote sensing (RS), geographic information systems (GISs) and decision-tree modeling (un-pruned and pruned classification trees). Seventeen parameters, i.e. parent material, soil type, landscape type, elevation, slope gradient, slope aspect, mean curvature, plan curvature, profile curvature, flow accumulation, specific catchment area, tangent slope, tangent curvature, steady-state wetness index, Normalized Difference Vegetation Index (NDVI), Normalized Difference Wetness Index (NDWI) and Soil Color Index (SCI) were generated to statistically explain SOC field measurements in the area of interest (Denmark). A large number of tree-based classification models (588) were developed using (i) all of the parameters, (ii) all Digital Elevation Model (DEM) parameters only, (iii) the primary DEM parameters only, (iv) the remote sensing (RS) indices only, (v) selected pairs of parameters, (vi) soil type, parent material and landscape type only, and (vii) the parameters having a high impact on SOC distribution in built pruned trees. The three best classification tree models, with both the lowest misclassification error (ME) and the lowest number of nodes (N), are: (i) the tree (T1) combining all of the parameters (ME=29.5%; N=54); (ii) the tree (T2) based on the parent material, soil type and landscape type (ME=31.5%; N=14); and (iii) the tree (T3) constructed using parent material, soil type, landscape type, elevation, tangent slope and SCI (ME=30%; N=39). The SOC maps produced at 1:50,000 cartographic scale using these trees are highly matched, with coincidence values equal to 90.5% (Map T1…
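    The un-pruned versus pruned comparison by misclassification error (ME) and node count (N) reported above maps naturally onto cost-complexity pruning. A minimal sketch on synthetic stand-in data (the feature count echoes the study's 17 parameters; everything else is invented):

```python
# Sketch: un-pruned vs cost-complexity-pruned classification trees,
# reporting misclassification error (ME) and node count (N) as the
# SOC study does. Synthetic stand-in data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=17, n_informative=8,
                           n_classes=3, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

unpruned = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)

# Pick a pruning strength from the cost-complexity path, then refit.
path = unpruned.cost_complexity_pruning_path(X_tr, y_tr)
alpha = path.ccp_alphas[len(path.ccp_alphas) // 2]   # mid-path alpha, illustrative
pruned = DecisionTreeClassifier(random_state=0, ccp_alpha=alpha).fit(X_tr, y_tr)

for name, tree in [("un-pruned", unpruned), ("pruned", pruned)]:
    me = 1 - tree.score(X_te, y_te)                  # misclassification error
    print(f"{name}: ME={me:.1%}, N={tree.tree_.node_count}")
```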

  4. Predicting student satisfaction with courses based on log data from a virtual learning environment – a neural network and classification tree model

    Directory of Open Access Journals (Sweden)

    Ivana Đurđević Babić

    2015-03-01

    Student satisfaction with courses in academic institutions is an important issue and is recognized as a form of support in ensuring effective and quality education, as well as enhancing the student course experience. This paper investigates whether there is a connection between student satisfaction with courses and log data on student courses in a virtual learning environment. Furthermore, it explores whether a successful classification model for predicting student satisfaction with courses can be developed based on course log data, and compares the results obtained from the implemented methods. The research was conducted at the Faculty of Education in Osijek and included analysis of log data and course satisfaction on a sample of third and fourth year students. Multilayer Perceptron (MLP) neural networks with different activation functions, Radial Basis Function (RBF) neural networks, and classification tree models were developed, trained and tested in order to classify students into one of two categories of course satisfaction. Classification accuracy, type I and type II errors, and input variable importance were used for model comparison. The results indicate that a successful classification model can be created using the tested methods. The MLP model provides the highest average classification accuracy and the lowest preference in misclassification of students with a low level of course satisfaction, although a t-test for the difference in proportions showed that the difference in performance between the compared models is not statistically significant. Student involvement in forum discussions is recognized as a valuable predictor of student satisfaction with courses in all observed models.
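    A compact way to reproduce this kind of comparison — an MLP against a classification tree, scored by accuracy and type I/type II errors — is sketched below on synthetic stand-ins for the VLE log-data features; the architecture and settings are illustrative only:

```python
# Sketch comparing an MLP and a classification tree on a binary
# satisfaction label, reporting accuracy plus type I / type II error
# counts. Features are synthetic stand-ins for VLE log data.
from sklearn.datasets import make_classification
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, n_features=12, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=1)

models = {
    "MLP": MLPClassifier(hidden_layer_sizes=(20,), activation="tanh",
                         max_iter=2000, random_state=1),
    "tree": DecisionTreeClassifier(max_depth=4, random_state=1),
}
for name, model in models.items():
    y_hat = model.fit(X_tr, y_tr).predict(X_te)
    tn, fp, fn, tp = confusion_matrix(y_te, y_hat).ravel()
    acc = (tp + tn) / len(y_te)
    print(f"{name}: accuracy={acc:.2f}, type I errors={fp}, type II errors={fn}")
```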

  5. Predicting student satisfaction with courses based on log data from a virtual learning environment – a neural network and classification tree model

    OpenAIRE

    Ivana Đurđević Babić

    2015-01-01

    Student satisfaction with courses in academic institutions is an important issue and is recognized as a form of support in ensuring effective and quality education, as well as enhancing student course experience. This paper investigates whether there is a connection between student satisfaction with courses and log data on student courses in a virtual learning environment. Furthermore, it explores whether a successful classification model for predicting student satisfaction with course can be...

  6. Malaria in central Vietnam: analysis of risk factors by multivariate analysis and classification tree models

    Directory of Open Access Journals (Sweden)

    Hung Cong

    2008-01-01

    Abstract Background In Central Vietnam, forest malaria remains difficult to control due to the complex interactions between human, vector and environmental factors. Methods Prior to a community-based intervention to assess the efficacy of long-lasting insecticidal hammocks, a complete census (18,646 individuals) and a baseline cross-sectional survey for determining malaria prevalence and related risk factors were carried out. Multivariate analysis using survey logistic regression was combined with a classification tree model (CART) to better define the relative importance of, and inter-relations between, the different risk factors. Results The study population was mostly from the Ra-glai ethnic group (88%), with both low education and socio-economic status, and engaged mainly in forest activities (58%). The multivariate analysis confirmed forest activity, bed net use, ethnicity, age and education as risk factors for malaria infections, but could not handle multiple interactions. The CART analysis showed that the most important risk factor for malaria was the wealth category, the wealthiest group being much less infected (8.9%) than the lower and medium wealth categories (16.6%). In the former, forest activity and bed net use were the most important determinants of malaria risk, while in the lower and medium wealth categories, insecticide treated nets mattered most, although they were less protective among Ra-glai people. Conclusion The combination of CART and multivariate analysis constitutes a novel analytical approach, providing an accurate and dynamic picture of the main risk factors for malaria infection. Results show that the control of forest malaria remains an extremely complex task that has to address poverty-related risk factors such as education, ethnicity and housing conditions.

  7. Superiority of Classification Tree versus Cluster, Fuzzy and Discriminant Models in a Heartbeat Classification System

    Science.gov (United States)

    Krasteva, Vessela; Jekova, Irena; Leber, Remo; Schmid, Ramun; Abächerli, Roger

    2015-01-01

    This study presents a 2-stage heartbeat classifier of supraventricular (SVB) and ventricular (VB) beats. Stage 1 makes a computationally efficient classification of SVB beats, using a simple correlation threshold criterion to find a close match with a predominant normal (reference) beat template. The non-matched beats are next subjected to measurement of 20 basic features, tracking the beat and reference template morphology and RR-variability, for subsequent refined classification into the SVB or VB class by Stage 2. Four linear classifiers are compared: cluster, fuzzy, linear discriminant analysis (LDA) and classification tree (CT), all subjected to iterative training for selection of the optimal feature space among an extended 210-sized set embodying interactive second-order effects between the 20 independent features. The optimization process minimizes at equal weight the false positives in the SVB class and the false negatives in the VB class. Training with the European ST-T, AHA and MIT-BIH Supraventricular Arrhythmia databases found the best performance settings of all classification models: Cluster (30 features), Fuzzy (72 features), LDA (142 coefficients), CT (221 decision nodes), with the top-3 best scored features being normalized current RR-interval, higher/lower frequency content ratio, and beat-to-template correlation. Unbiased test-validation with the MIT-BIH Arrhythmia database rates the classifiers in descending order of their specificity for the SVB class: CT (99.9%), LDA (99.6%), Cluster (99.5%), Fuzzy (99.4%); sensitivity for ventricular ectopic beats as part of the VB class (commonly reported in published beat-classification studies): CT (96.7%), Fuzzy (94.4%), LDA (94.2%), Cluster (92.4%); positive predictivity: CT (99.2%), Cluster (93.6%), LDA (93.0%), Fuzzy (92.4%). CT has superior accuracy by 0.3–6.8 percentage points, with the advantage of easy model-complexity configuration by pruning a tree that consists of easily interpretable 'if-then' rules. PMID:26461492
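    The two-stage structure described above (a cheap correlation screen, then a feature-based tree for the leftovers) can be sketched as follows; the beat shapes, the 0.98 correlation threshold and the two Stage-2 features are all hypothetical stand-ins for the paper's template matching and 20-feature space:

```python
# Two-stage beat classification sketch: Stage 1 accepts beats that
# correlate strongly with a reference (normal) template; only the
# non-matched beats go to a Stage 2 feature-based tree.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(2)
template = np.sin(np.linspace(0, 2 * np.pi, 50))      # reference (normal) beat

def corr(a, b):
    return np.corrcoef(a, b)[0, 1]

# Synthetic beats: rows 0-79 are noisy copies of the template (normal),
# rows 80-99 are shifted/scaled copies standing in for ventricular beats.
normal = template + rng.normal(0, 0.05, (80, 50))
ectopic = np.roll(template, 10) * 1.4 + rng.normal(0, 0.05, (20, 50))
beats = np.vstack([normal, ectopic])

# Stage 1: computationally cheap correlation screen against the template.
matched = np.array([corr(b, template) > 0.98 for b in beats])
print(f"Stage 1 matches {matched.sum()} beats to the normal template")

# Stage 2: refined classification of the non-matched beats on two simple
# features (amplitude and template correlation stand in for the paper's
# 20-feature space with second-order interactions).
rest = beats[~matched]
feats = np.column_stack([rest.max(axis=1),
                         [corr(b, template) for b in rest]])
labels = (np.flatnonzero(~matched) >= 80).astype(int)   # toy ground truth
stage2 = DecisionTreeClassifier(max_depth=2, random_state=0).fit(feats, labels)
print("Stage 2 training accuracy:", stage2.score(feats, labels))
```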

  8. Building classification trees to explain the radioactive contamination levels of the plants

    International Nuclear Information System (INIS)

    The objective of this thesis is the development of a method allowing the identification of factors leading to various radioactive contamination levels of plants. The methodology suggested is based on the use of a radioecological model of radionuclide transfer through the environment (the A.S.T.R.A.L. computer code) and a classification-tree method. In particular, to avoid the instability problems of classification trees and to preserve the tree structure, a node-level stabilizing technique is used. Empirical comparisons are carried out between classification trees built by this method (called the R.E.N. method) and those obtained by the C.A.R.T. method. A similarity measure is defined to compare the structure of two classification trees. This measure is used to study the stabilizing performance of the R.E.N. method. The methodology suggested is applied to a simplified contamination scenario. From the results obtained, we can identify the main variables responsible for the various radioactive contamination levels of four leafy vegetables (lettuce, cabbage, spinach and leek). Some rules extracted from these classification trees are usable in a post-accident context. (author)

  9. Sensitivity of missing values in classification tree for large sample

    Science.gov (United States)

    Hasan, Norsida; Adam, Mohd Bakri; Mustapha, Norwati; Abu Bakar, Mohd Rizam

    2012-05-01

    Missing values, either in predictor or in response variables, are a very common problem in statistics and data mining. Cases with missing values are often ignored, which results in loss of information and possible bias. The objective of our research was to investigate the sensitivity of classification tree models to missing data in large samples. Data were obtained from one of the high-level educational institutions in Malaysia. Students' background data were randomly eliminated and a classification tree was used to predict students' degree classification. The results showed that for large samples, the structure of the classification tree was sensitive to missing values, especially for samples containing more than ten percent missing values.
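    The sensitivity experiment generalises to a simple loop: blank out a growing share of values, impute, refit, and compare tree structure. A sketch under those assumptions (synthetic data; mean imputation is just one of several possible handling strategies):

```python
# Sketch: randomly blank out an increasing share of predictor values,
# impute, refit the tree, and watch how its structure (node count,
# root split) drifts. Data are synthetic.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.impute import SimpleImputer
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=5000, n_features=10, random_state=3)
rng = np.random.default_rng(3)

for frac in [0.0, 0.05, 0.10, 0.20]:
    Xm = X.copy()
    mask = rng.random(Xm.shape) < frac      # positions to blank out
    Xm[mask] = np.nan
    Xi = SimpleImputer(strategy="mean").fit_transform(Xm)
    tree = DecisionTreeClassifier(max_depth=5, random_state=3).fit(Xi, y)
    root = tree.tree_.feature[0]            # feature index used at the root
    print(f"{frac:.0%} missing: nodes={tree.tree_.node_count}, "
          f"root split on x{root}")
```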

  10. Influence Measures for CART Classification Trees

    OpenAIRE

    Bar-Hen, Avner; Gey, Servane; Poggi, Jean-Michel

    2015-01-01

    This paper deals with measuring the influence of observations on the results obtained with CART classification trees. To define the influence of individuals on the analysis, we use influence functions to propose general criteria for measuring the sensitivity of the CART analysis and its robustness. The proposals, based on jackknife trees, are organized around two lines: influence on predictions and influence on partitions. In addition, the analysis is extended to the pruned sequences of CA...
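    The jackknife-tree idea for influence on predictions can be sketched directly: refit the tree with each observation left out and count how many predictions change. The criterion below (changed predictions on the training points) is a simplified stand-in for the paper's influence functions:

```python
# Sketch of a jackknife-tree influence measure: drop one observation,
# refit, and count how many predictions change relative to the full tree.
# Large counts flag influential observations.
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=120, n_features=5, random_state=4)
full = DecisionTreeClassifier(max_depth=3, random_state=4).fit(X, y)
base_pred = full.predict(X)

influence = []
for i in range(len(X)):
    keep = [j for j in range(len(X)) if j != i]      # leave observation i out
    t_i = DecisionTreeClassifier(max_depth=3, random_state=4).fit(X[keep], y[keep])
    influence.append((t_i.predict(X) != base_pred).sum())

top = sorted(range(len(X)), key=lambda i: influence[i], reverse=True)[:5]
print("most influential observations:", top)
```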

  11. Segmentation of Firms by Means of Classification Trees

    OpenAIRE

    Mirosława Lasek; Marek Pęczkowski

    2002-01-01

    The objective of the paper was to present the utility and applicability of the method of generating classification trees for the purposes of segmentation of firms by their economic standing, i.e. their financial and assets condition. The method of classification tree generation belongs to the group of 'data mining' methods that make it possible to discover, based on large data sets, relationships and links among the data. Variables used to classify the firms were financial and assets indices, in...

  12. Consensus of classification trees for skin sensitisation hazard prediction.

    Science.gov (United States)

    Asturiol, D; Casati, S; Worth, A

    2016-10-01

    Since March 2013, it has no longer been possible to market cosmetics containing new ingredients tested on animals in the European Union (EU). Although several in silico alternatives are available and achievements have been made in the development and regulatory adoption of skin sensitisation non-animal tests, there is not yet a generally accepted approach for skin sensitisation assessment that would fully substitute the need for animal testing. The aim of this work was to build a defined approach (i.e. a predictive model based on readouts from various information sources that uses a fixed procedure for generating a prediction) for skin sensitisation hazard prediction (sensitiser/non-sensitiser) using Local Lymph Node Assay (LLNA) results as reference classifications. To derive the model, we built a dataset with high quality data from in chemico (DPRA) and in vitro (KeratinoSens™ and h-CLAT) methods, complemented with predictions from several software packages. The modelling exercise showed that skin sensitisation hazard was better predicted by classification trees based on in silico predictions. The defined approach consists of a consensus of two classification trees that are based on descriptors accounting for protein reactivity and structural features. The model showed an accuracy of 0.93, sensitivity of 0.98, and specificity of 0.85 for 269 chemicals. In addition, the defined approach provides a measure of confidence associated with the prediction. PMID:27458072
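    The consensus-of-two-trees structure can be sketched as follows; the split of descriptors into a 'reactivity' block and a 'structural' block, the tie-breaking rule and the data are all assumptions for illustration:

```python
# Sketch of a two-tree consensus: each tree votes sensitiser/non-sensitiser
# from its own descriptor block, and agreement between the trees doubles
# as a confidence flag. Data and the tie rule are invented.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=269, n_features=10, random_state=5)
react, struct = X[:, :5], X[:, 5:]          # pretend descriptor blocks

t1 = DecisionTreeClassifier(max_depth=3, random_state=5).fit(react, y)
t2 = DecisionTreeClassifier(max_depth=3, random_state=5).fit(struct, y)

p1, p2 = t1.predict(react), t2.predict(struct)
consensus = np.where(p1 == p2, p1, 1)       # assumed tie rule: call sensitiser
confident = (p1 == p2)                      # agreement as a confidence flag
print(f"agreement on {confident.mean():.0%} of chemicals")
print(f"consensus accuracy: {(consensus == y).mean():.2f}")
```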

  13. Predictive Classification Trees

    Science.gov (United States)

    Dlugosz, Stephan; Müller-Funk, Ulrich

    CART (Breiman et al., Classification and Regression Trees, Chapman and Hall, New York, 1984) and (exhaustive) CHAID (Kass, Appl Stat 29:119-127, 1980) figure prominently among the procedures actually used in data-based management, etc. CART is a well-established procedure that produces binary trees. CHAID, in contrast, admits multiple splits, a feature that allows the splitting variable to be exploited more extensively. On the other hand, that procedure depends on premises that are questionable in practical applications. This can be put down to the fact that CHAID relies on simultaneous chi-squared and F-tests, respectively. The null distribution of the second test statistic, for instance, relies on a normality assumption that is not plausible in a data mining context. Moreover, neither of these procedures - as implemented in SPSS, for instance - takes ordinal dependent variables into account. In the paper we suggest an alternative tree algorithm that: requires explanatory categorical variables...
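    The chi-squared machinery that CHAID builds on is easy to show in miniature: cross-tabulate each categorical predictor against the target and compare p-values. The toy sketch below omits CHAID's category merging and Bonferroni adjustment:

```python
# Sketch of the chi-squared split test behind CHAID: for each categorical
# predictor, cross-tabulate it against the target and take the smallest
# p-value as the split candidate. Toy data only.
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(6)
y = rng.integers(0, 2, 400)
# x_good is associated with y; x_noise is not
x_good = np.where(y == 1, rng.choice(["a", "b"], 400, p=[0.8, 0.2]),
                  rng.choice(["a", "b"], 400, p=[0.3, 0.7]))
x_noise = rng.choice(["u", "v", "w"], 400)

for name, x in [("x_good", x_good), ("x_noise", x_noise)]:
    table = np.array([[np.sum((x == c) & (y == k)) for k in (0, 1)]
                      for c in np.unique(x)])
    chi2, p, dof, _ = chi2_contingency(table)
    print(f"{name}: chi2={chi2:.1f}, p={p:.3g}")
```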

  14. Predicting 'very poor' beach water quality gradings using classification tree.

    Science.gov (United States)

    Thoe, Wai; Choi, King Wah; Lee, Joseph Hun-wei

    2016-02-01

    A beach water quality prediction system has been developed in Hong Kong using multiple linear regression (MLR) models. However, linear models are found to be weak at capturing the infrequent 'very poor' water quality occasions when Escherichia coli (E. coli) concentration exceeds 610 counts/100 mL. This study uses a classification tree to increase the accuracy in predicting the 'very poor' water quality events at three Hong Kong beaches affected either by non-point source or point source pollution. Binary-output classification trees (to predict whether E. coli concentration exceeds 610 counts/100 mL) are developed over the periods before and after the implementation of the Harbour Area Treatment Scheme, when systematic changes in water quality were observed. Results show that classification trees can capture more 'very poor' events in both periods when compared to the corresponding linear models, with an increase in correct positives by an average of 20%. Classification trees are also developed at two beaches to predict the four-category Beach Water Quality Indices. They perform worse than the binary tree and give excessive false alarms of 'very poor' events. Finally, a combined modelling approach using both MLR model and classification tree is proposed to enhance the beach water quality prediction system for Hong Kong. PMID:26837834
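    The binary-output tree described above amounts to thresholding E. coli at 610 counts/100 mL and fitting a tree to the predictors. A sketch with invented hydro-meteorological stand-ins (the real models used beach-specific variables):

```python
# Sketch of the binary-output tree: binarise E. coli against the 610
# counts/100 mL 'very poor' threshold and fit a tree on predictors.
# All variables and the generating rule are synthetic stand-ins.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(7)
n = 800
rainfall = rng.gamma(2.0, 10.0, n)           # mm, hypothetical predictor
salinity = rng.normal(30, 3, n)              # psu, hypothetical predictor
# toy generating rule: heavy rain and low salinity push E. coli up
ecoli = np.exp(3 + 0.05 * rainfall - 0.08 * (salinity - 30)
               + rng.normal(0, 1, n))

very_poor = (ecoli > 610).astype(int)        # the binary target
X = np.column_stack([rainfall, salinity])
tree = DecisionTreeClassifier(max_depth=3, class_weight="balanced",
                              random_state=7).fit(X, very_poor)

pred = tree.predict(X)
hits = ((pred == 1) & (very_poor == 1)).sum()
print(f"captured {hits} of {very_poor.sum()} 'very poor' events")
```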

  15. Design of Radar Software Test Case Based on Classification Tree

    Institute of Scientific and Technical Information of China (English)

    职晓; 裴阿平; 张江华

    2014-01-01

    As software grows in scale, it is less and less feasible in engineering practice to test every functional unit of modern radar software using common combinatorial testing techniques. To address the deficiency that the classification tree method (CTM) generates a large number of redundant test cases, the orthogonal experimental design method is applied to the case set generated by CTM to simplify and optimize it, thereby improving testing efficiency. The experimental results show that test case optimization based on orthogonal experimental design can effectively reduce redundant test cases and save testing resources and cost. It has practical value in engineering.

  16. A Method for Application of Classification Tree Models to Map Aquatic Vegetation Using Remotely Sensed Images from Different Sensors and Dates

    Directory of Open Access Journals (Sweden)

    Ying Cai

    2012-09-01

    In previous attempts to identify aquatic vegetation from remotely-sensed images using classification trees (CT), the images used to apply CT models to different times or locations necessarily originated from the same satellite sensor as the original images used in model development, greatly limiting the application of CT. We have developed an effective normalization method to improve the robustness of CT models when applied to images originating from different sensors and dates. A total of 965 ground-truth samples of aquatic vegetation types were obtained in 2009 and 2010 in Taihu Lake, China. Using relevant spectral indices (SI) as classifiers, we manually developed a stable CT model structure and then applied a standard CT algorithm to obtain quantitative (optimal) thresholds from 2009 ground-truth data and images from the Landsat7-ETM+, HJ-1B-CCD, Landsat5-TM and ALOS-AVNIR-2 sensors. Optimal CT thresholds produced average classification accuracies of 78.1%, 84.7% and 74.0% for emergent vegetation, floating-leaf vegetation and submerged vegetation, respectively. However, the optimal CT thresholds for the different sensor images differed from each other, with an average relative variation (RV) of 6.40%. We developed and evaluated three new approaches to normalizing the images. The best-performing method (0.1% index scaling) normalized the SI images using tailored percentages of extreme pixel values. Using the images normalized by 0.1% index scaling, CT models for a particular sensor in which thresholds were replaced by those from the models developed for images originating from other sensors provided average classification accuracies of 76.0%, 82.8% and 68.9% for emergent vegetation, floating-leaf vegetation and submerged vegetation, respectively. Applying the CT models developed for normalized 2009 images to 2010 images resulted in high classification (78.0%–93.3%) and overall (92.0%–93.1%) accuracies. Our...
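    The '0.1% index scaling' normalization can be read as percentile clipping plus rescaling, applied per spectral-index image. A sketch of that interpretation on synthetic arrays (the exact definition in the paper may differ in detail):

```python
# Sketch of percentile-based index normalisation: clip each spectral-index
# image at its 0.1% / 99.9% pixel-value percentiles and rescale to [0, 1],
# so thresholds learned on one sensor transfer better to another.
import numpy as np

def scale_index(img, tail=0.1):
    lo, hi = np.percentile(img, [tail, 100 - tail])
    return np.clip((img - lo) / (hi - lo), 0.0, 1.0)

rng = np.random.default_rng(8)
ndvi_sensor_a = rng.normal(0.45, 0.20, (300, 300))   # e.g. a Landsat-like NDVI
ndvi_sensor_b = rng.normal(0.38, 0.25, (300, 300))   # e.g. a second sensor

for name, img in [("A", ndvi_sensor_a), ("B", ndvi_sensor_b)]:
    s = scale_index(img)
    print(f"sensor {name}: scaled range [{s.min():.2f}, {s.max():.2f}], "
          f"median {np.median(s):.2f}")
```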

  17. Building classification trees to explain the radioactive contamination levels of the plants; Construction d'arbres de discrimination pour expliquer les niveaux de contamination radioactive des vegetaux

    Energy Technology Data Exchange (ETDEWEB)

    Briand, B

    2008-04-15

    The objective of this thesis is the development of a method allowing the identification of factors leading to various radioactive contamination levels of plants. The methodology suggested is based on the use of a radioecological model of radionuclide transfer through the environment (the A.S.T.R.A.L. computer code) and a classification-tree method. In particular, to avoid the instability problems of classification trees and to preserve the tree structure, a node-level stabilizing technique is used. Empirical comparisons are carried out between classification trees built by this method (called the R.E.N. method) and those obtained by the C.A.R.T. method. A similarity measure is defined to compare the structure of two classification trees. This measure is used to study the stabilizing performance of the R.E.N. method. The methodology suggested is applied to a simplified contamination scenario. From the results obtained, we can identify the main variables responsible for the various radioactive contamination levels of four leafy vegetables (lettuce, cabbage, spinach and leek). Some rules extracted from these classification trees are usable in a post-accident context. (author)

  18. Statistical Modelling in Surveys without Neglecting "The Undecided": Multinomial Logistic Regression Models and Imprecise Classification Trees under Ontic Data Imprecision - extended version

    OpenAIRE

    Plass, Julia; Fink, Paul; Schöning, Norbert; Augustin, Thomas

    2015-01-01

    In surveys, and most notably in election polls, undecided participants frequently constitute subgroups of their own with specific individual characteristics. While traditional survey methods and corresponding statistical models are inherently damned to neglect this valuable information, an ontic random set view provides us with the full power of the whole statistical modelling framework. We elaborate this idea for a multinomial logistic regression model (which can be derived as a discrete cho...

  19. The Hybrid of Classification Tree and Extreme Learning Machine for Permeability Prediction in Oil Reservoir

    KAUST Repository

    Prasetyo Utomo, Chandra

    2011-06-01

    Permeability is an important parameter connected with oil reservoirs. Predicting permeability could save millions of dollars. Unfortunately, petroleum engineers have faced numerous challenges in arriving at cost-efficient predictions. Much work has been carried out to solve this problem. The main challenge is to handle the high range of permeability in each reservoir. For about a hundred years, mathematicians and engineers have tried to deliver the best prediction models. However, none of them has produced satisfying results. In the last two decades, artificial intelligence models have been used. The current best model in permeability prediction is the extreme learning machine (ELM). It produces fairly good results, but a clear explanation of the model is hard to come by because it is so complex. The aim of this research is to propose a way out of this complexity through the design of a hybrid intelligent model. In this proposal, the system combines classification and regression models to predict the permeability value. These are based on well log data. In order to handle the high range of the permeability value, a classification tree is utilized. A benefit of this innovation is that the tree represents knowledge in a clear and succinct fashion and thereby avoids the complexity of all previous models. Finally, it is important to note that the ELM is used as the final predictor. Results demonstrate that this proposed hybrid model performs better when compared with support vector machines (SVM) and ELM in terms of correlation coefficient. Moreover, the classification tree model potentially leads to better communication among petroleum engineers concerning this important process and has wider implications for oil reservoir management efficiency.
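    The hybrid can be sketched end to end: a tree assigns each sample a coarse permeability class, and a small ELM — a random hidden layer with a least-squares read-out — regresses within that class. Everything below (features, class cut-points, hidden-layer size) is an invented stand-in for the study's well-log setup:

```python
# Sketch of the hybrid: a classification tree assigns each sample to a
# coarse permeability class, then a per-class ELM regresses permeability.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(9)
X = rng.normal(size=(500, 6))                         # toy well-log features
log_perm = X @ rng.normal(size=6) + rng.normal(0, 0.3, 500)
perm_class = np.digitize(log_perm, [-1.0, 1.0])       # low / mid / high bins

def elm_fit(X, y, n_hidden=40):
    W = rng.normal(size=(X.shape[1], n_hidden))       # random hidden layer
    beta = np.linalg.lstsq(np.tanh(X @ W), y, rcond=None)[0]
    return W, beta                                    # least-squares read-out

def elm_predict(X, W, beta):
    return np.tanh(X @ W) @ beta

tree = DecisionTreeClassifier(max_depth=3, random_state=9).fit(X, perm_class)
elms = {c: elm_fit(X[perm_class == c], log_perm[perm_class == c])
        for c in np.unique(perm_class)}

cls = tree.predict(X)
pred = np.array([elm_predict(x[None, :], *elms[c])[0] for x, c in zip(X, cls)])
print("correlation coefficient:", np.corrcoef(pred, log_perm)[0, 1].round(3))
```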

  20. The Hybrid of Classification Tree and Extreme Learning Machine for Permeability Prediction in Oil Reservoir

    Directory of Open Access Journals (Sweden)

    Chandra Prasetyo Utomo

    2013-01-01

    Permeability is an important parameter connected with oil reservoirs. In the last two decades, artificial intelligence models have been used. The current best model in permeability prediction is the extreme learning machine (ELM). It produces fairly good results, but a clear explanation of the model is hard to come by because it is so complex. The aim of this research is to propose a way out of this complexity through the design of a hybrid intelligent model. The model combines classification and regression. In order to handle the high range of the permeability value, a classification tree is utilized. ELM is used as the final predictor. Results demonstrate that this proposed model performs better when compared with support vector machines (SVM) and ELM in terms of correlation coefficient. Moreover, the classification tree model potentially leads to better communication among petroleum engineers and has wider implications for oil reservoir management efficiency.

  1. Predicting battle outcomes with classification trees

    OpenAIRE

    Coban, Muzaffer.

    2001-01-01

    Historical combat data analysis is a way of understanding the factors affecting battle outcomes. Current studies mostly prefer simulations that are based on mathematical abstractions of battles. However, these abstractions emphasize objective variables, such as force ratio, and the models have very limited ability to capture important intangible factors like morale, leadership, and luck. Historical combat analysis provides a way to understand battles with the data taken from the actual battlefield...

  2. Computer-aided diagnosis of Alzheimer's disease using support vector machines and classification trees

    International Nuclear Information System (INIS)

    This paper presents a computer-aided diagnosis technique for improving the accuracy of early diagnosis of Alzheimer-type dementia. The proposed methodology is based on the selection of voxels whose Welch's t-test statistic between the two classes, normal and Alzheimer images, is greater than a given threshold. The mean and standard deviation of intensity values are calculated for the selected voxels and used as feature vectors for two different classifiers: support vector machines with a linear kernel, and classification trees. The proposed methodology reaches greater than 95% accuracy in the classification task.
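    The pipeline above — Welch's t-test voxel screening, mean/standard-deviation features, then a linear SVM or a tree — can be sketched as follows; the image dimensions, effect size and t threshold are assumptions:

```python
# Sketch of the voxel-selection step: keep voxels whose Welch's t statistic
# between groups exceeds a threshold, summarise them by mean and standard
# deviation, and feed those two features to a linear SVM and a tree.
import numpy as np
from scipy.stats import ttest_ind
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(10)
n_subj, n_vox = 60, 2000
y = np.repeat([0, 1], n_subj // 2)                    # 0 = normal, 1 = AD
images = rng.normal(0, 1, (n_subj, n_vox))            # synthetic "scans"
images[y == 1, :100] += 1.2                           # toy "affected" voxels

t, _ = ttest_ind(images[y == 0], images[y == 1], equal_var=False)  # Welch
selected = np.abs(t) > 3.0                            # assumed threshold
feats = np.column_stack([images[:, selected].mean(axis=1),
                         images[:, selected].std(axis=1)])

for name, clf in [("linear SVM", SVC(kernel="linear")),
                  ("tree", DecisionTreeClassifier(max_depth=3, random_state=0))]:
    print(name, "training accuracy:", clf.fit(feats, y).score(feats, y))
```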

  3. Integrating TM and Ancillary Geographical Data with Classification Trees for Land Cover Classification of Marsh Area

    Institute of Scientific and Technical Information of China (English)

    NA Xiaodong; ZHANG Shuqing; ZHANG Huaiqing; LI Xiaofeng; YU Huan; LIU Chunyue

    2009-01-01

    The main objective of this research is to determine the capacity of land cover classification combining spectral and textural features of Landsat TM imagery with ancillary geographical data in wetlands of the Sanjiang Plain, Heilongjiang Province, China. Semi-variograms and Z-test values were calculated to assess the separability of grey-level co-occurrence texture measures and to maximize the difference between land cover types. The degree of spatial autocorrelation showed that window sizes of 3×3 pixels and 11×11 pixels were most appropriate for Landsat TM image texture calculations. The texture analysis showed that the co-occurrence entropy, dissimilarity and variance texture measures, derived from the Landsat TM spectral bands and vegetation indices, provided the most significant statistical differentiation between land cover types. Subsequently, a Classification and Regression Tree (CART) algorithm was applied to three different combinations of predictors: 1) TM imagery alone (TM-only); 2) TM imagery plus image texture (TM+TXT model); and 3) all predictors including TM imagery, image texture and additional ancillary GIS information (TM+TXT+GIS model). Compared with traditional Maximum Likelihood Classification (MLC) supervised classification, all three classification tree predictive models reduced the overall error rate significantly. Image texture measures and ancillary geographical variables suppressed speckle noise effectively and markedly reduced the classification error rate for marsh. For the classification tree model making use of all available predictors, the omission error rate was 12.90% and the commission error rate was 10.99% for marsh. The developed method is portable, relatively easy to implement and should be applicable in other settings and over larger extents.

  4. Mastectomy or breast conserving surgery? Factors affecting type of surgical treatment for breast cancer – a classification tree approach

    International Nuclear Information System (INIS)

    A critical choice facing breast cancer patients is which surgical treatment – mastectomy or breast conserving surgery (BCS) – is most appropriate. Several studies have investigated factors that impact the type of surgery chosen, identifying features such as place of residence, age at diagnosis, tumor size, and socio-economic and racial/ethnic elements as relevant. Such assessment of 'propensity' is important in understanding issues such as the reported under-utilisation of BCS among women for whom such treatment was not contraindicated. Using Western Australian (WA) data, we further examine the factors associated with the type of surgical treatment for breast cancer using a classification tree approach. This approach deals naturally with complicated interactions between factors, and so allows flexible and interpretable models for treatment choice to be built that add to the current understanding of this complex decision process. Data were extracted from the WA Cancer Registry on women diagnosed with breast cancer in WA from 1990 to 2000. Subjects' treatment preferences were predicted from covariates using both classification trees and logistic regression. Tumor size was the primary determinant of patient choice, with subjects whose tumors were smaller than 20 mm in diameter preferring BCS. For subjects with tumors greater than 20 mm in diameter, factors such as patient age, nodal status, and tumor histology become relevant as predictors of patient choice. Classification trees perform as well as logistic regression for predicting patient choice, but are much easier to interpret for clinical use. The selected tree can inform clinicians' advice to patients.

  5. Classification tree analysis of second neoplasms in survivors of childhood cancer

    OpenAIRE

    Todorovski Ljupčo; Jazbec Janez; Jereb Berta

    2007-01-01

    Abstract Background Reports on childhood cancer survivors estimate that the cumulative probability of developing secondary neoplasms varies from 3.3% to 25% at 25 years from diagnosis, and that the risk of developing another cancer is several times greater than in the general population. Methods In our retrospective study, we have used the classification tree multivariate method on a group of 849 first cancer survivors, to identify childhood cancer patients with the greatest risk for development of secondar...

  6. Reedbed monitoring using classification trees and SPOT-5 seasonal time series

    OpenAIRE

    Davranche, Aurélie; Poulin, Brigitte; Lefebvre, Gaëtan

    2010-01-01

    The Rhône river delta (Camargue) in the south of France has lost 40,000 ha of natural areas, including 33,000 ha of wetlands, over the last 60 years, following the extension of agriculture, salt exploitation and industry. Reed development and density in Camargue marshes are influenced by physical factors such as salinity, water depth, and water level fluctuations, which have an effect on reflectance spectra. Classification trees applied to time series of SPOT-5 images appear as a powerful and reli...

  7. Classification tree analysis of second neoplasms in survivors of childhood cancer

    International Nuclear Information System (INIS)

    Reports on childhood cancer survivors estimate that the cumulative probability of developing secondary neoplasms varies from 3.3% to 25% at 25 years from diagnosis, and that the risk of developing another cancer is several times greater than in the general population. In our retrospective study, we have used the classification tree multivariate method on a group of 849 first cancer survivors to identify childhood cancer patients with the greatest risk for development of secondary neoplasms. In the observed group of patients, 34 developed a secondary neoplasm after treatment of their primary cancer. Analysis of parameters present at the treatment of the first cancer exposed two groups of patients at special risk for secondary neoplasms. The first comprises female patients treated for Hodgkin's disease between the ages of 10 and 15 years, whose treatment included radiotherapy. The second group at special risk comprises male patients with acute lymphoblastic leukemia who were treated between 4.6 and 6.6 years of age. The risk groups identified in our study are similar to the results of studies that used more conventional approaches. The usefulness of our approach in studying the occurrence of second neoplasms should be confirmed in a larger sample study, but the user-friendly presentation of results makes it attractive for further studies

  8. A simple and robust classification tree for differentiation between benign and malignant lesions in MR-mammography

    Energy Technology Data Exchange (ETDEWEB)

    Baltzer, Pascal A.T. [Medical University Vienna, Department of Radiology, Vienna (Austria); Dietzel, Matthias [University hospital Erlangen, Department of Neuroradiology, Erlangen (Germany); Kaiser, Werner A. [University Hospital Jena, Institute of Diagnostic and Interventional Radiology 1, Jena (Germany)

    2013-08-15

    In the face of multiple available diagnostic criteria in MR-mammography (MRM), a practical algorithm for lesion classification is needed. Such an algorithm should be as simple as possible and include only important independent lesion features to differentiate benign from malignant lesions. This investigation aimed to develop a simple classification tree for differential diagnosis in MRM. A total of 1,084 lesions in standardised MRM with subsequent histological verification (648 malignant, 436 benign) were investigated. Seventeen lesion criteria were assessed by 2 readers in consensus. Classification analysis was performed using the chi-squared automatic interaction detection (CHAID) method. Results include the probability of malignancy for every descriptor combination in the classification tree. A classification tree incorporating 5 lesion descriptors with a depth of 3 ramifications (1, root sign; 2, delayed enhancement pattern; 3, border, internal enhancement and oedema) was calculated. Of all 1,084 lesions, 262 (40.4%) and 106 (24.3%) could be classified as malignant and benign, respectively, with an accuracy above 95%. Overall diagnostic accuracy was 88.4%. The classification algorithm reduced the number of categorical descriptors from 17 to 5 (29.4%), resulting in a high classification accuracy. More than one third of all lesions could be classified with accuracy above 95%. (orig.)

  9. Classification trees based on infrared spectroscopic data to discriminate between genuine and counterfeit medicines

    OpenAIRE

    Deconinck, Eric; Sacré, Pierre-Yves; Coomans, Danny; De Beer, Jacques

    2012-01-01

    Due to the expansion of the internet, counterfeit drugs represent a growing threat to public health, in developing countries but increasingly also in the industrialised world. In the literature, several analytical techniques have been applied in order to discriminate between genuine and counterfeit medicines. One thing all these techniques have in common is that they generate a huge amount of data, which is often difficult to interpret in order to see differences between the different samp...

  10. Knowledge-Based Classification in Automated Soil Mapping

    Institute of Scientific and Technical Information of China (English)

    ZHOU BIN; WANG RENCHAO

    2003-01-01

    A machine-learning approach was developed for automated building of knowledge bases for soil resources mapping, using a classification tree to generate knowledge from training data. With this method, building a knowledge base for automated soil mapping was easier than using the conventional knowledge acquisition approach. The knowledge base built by the classification tree was used by the knowledge classifier to perform soil type classification of Longyou County, Zhejiang Province, China, using Landsat TM bi-temporal images and GIS data. To evaluate the performance of the resultant knowledge bases, the classification results were compared to an existing soil map based on a field survey. The accuracy assessment and analysis of the resultant soil maps suggested that the knowledge bases built by the machine-learning method were of good quality for mapping the distribution of soil classes over the study area.

  11. The Hybrid of Classification Tree and Extreme Learning Machine for Permeability Prediction in Oil Reservoir

    OpenAIRE

    Chandra Prasetyo Utomo

    2013-01-01

    Permeability is an important parameter connected with oil reservoir. In the last two decades, artificial intelligence models have been used. The current best prediction model in permeability prediction is extreme learning machine (ELM). It produces fairly good results but a clear explanation of the model is hard to come by because it is so complex. The aim of this research is to propose a way out of this complexity through the design of a hybrid intelligent model. The model combines classific...

  12. Classification tree for risk assessment in patients suffering from congestive heart failure via long-term heart rate variability.

    Science.gov (United States)

    Melillo, Paolo; De Luca, Nicola; Bracale, Marcello; Pecchia, Leandro

    2013-05-01

    This study aims to develop an automatic classifier for risk assessment in patients suffering from congestive heart failure (CHF). The proposed classifier separates lower risk patients from higher risk ones, using standard long-term heart rate variability (HRV) measures. Patients are labeled as lower or higher risk according to the New York Heart Association (NYHA) classification. A retrospective analysis of two public Holter databases was performed, analyzing the data of 12 patients suffering from mild CHF (NYHA I and II), labeled as lower risk, and 32 suffering from severe CHF (NYHA III and IV), labeled as higher risk. Only patients with a fraction of total heartbeat intervals (RR) classified as normal-to-normal (NN) intervals (NN/RR) higher than 80% were selected as eligible, in order to ensure satisfactory signal quality. Classification and regression tree (CART) analysis was employed to develop the classifiers. A total of 30 higher risk and 11 lower risk patients were included in the analysis. The proposed classification trees achieved a sensitivity and a specificity rate of 93.3% and 63.6%, respectively, in identifying higher risk patients. Finally, the rules obtained by CART are comprehensible and consistent with the consensus shown by previous studies that depressed HRV is a useful tool for risk assessment in patients suffering from CHF. PMID:24592473

  13. Predicting smear negative pulmonary tuberculosis with classification trees and logistic regression: a cross-sectional study

    Directory of Open Access Journals (Sweden)

    Kritski Afrânio

    2006-02-01

    Abstract Background Smear negative pulmonary tuberculosis (SNPT) accounts for 30% of pulmonary tuberculosis cases reported yearly in Brazil. This study aimed to develop a prediction model for SNPT for outpatients in areas with scarce resources. Methods The study enrolled 551 patients with clinical-radiological suspicion of SNPT in Rio de Janeiro, Brazil. The original data were divided into two equivalent samples for generation and validation of the prediction models. Symptoms, physical signs and chest X-rays were used for constructing logistic regression and classification and regression tree models. From the logistic regression, we generated a clinical and radiological prediction score. The area under the receiver operating characteristic curve, sensitivity, and specificity were used to evaluate the models' performance in both generation and validation samples. Results It was possible to generate predictive models for SNPT with sensitivity ranging from 64% to 71% and specificity ranging from 58% to 76%. Conclusion The results suggest that those models might be useful as screening tools for estimating the risk of SNPT, optimizing the utilization of more expensive tests, and avoiding the costs of unnecessary anti-tuberculosis treatment. Those models might be cost-effective tools in a health care network with hierarchical distribution of scarce resources.

  14. Predicting smear negative pulmonary tuberculosis with classification trees and logistic regression: a cross-sectional study

    OpenAIRE

    Kritski Afrânio; Chaisson Richard E; Conde Marcus; Rezende Valéria MC; Soares Sérgio; Bastos Luiz; Mello Fernanda; Ruffino-Netto Antonio; Werneck Guilherme

    2006-01-01

    Abstract Background Smear negative pulmonary tuberculosis (SNPT) accounts for 30% of pulmonary tuberculosis cases reported yearly in Brazil. This study aimed to develop a prediction model for SNPT for outpatients in areas with scarce resources. Methods The study enrolled 551 patients with clinical-radiological suspicion of SNPT, in Rio de Janeiro, Brazil. The original data was divided into two equivalent samples for generation and validation of the prediction models. Symptoms, physical signs ...

  15. Data mining methods in the prediction of Dementia: A real-data comparison of the accuracy, sensitivity and specificity of linear discriminant analysis, logistic regression, neural networks, support vector machines, classification trees and random forests

    Directory of Open Access Journals (Sweden)

    Santana Isabel

    2011-08-01

    Abstract Background Dementia and cognitive impairment associated with aging are a major medical and social concern. Neuropsychological testing is a key element in the diagnostic procedures for Mild Cognitive Impairment (MCI), but presently has limited value in the prediction of progression to dementia. We advance the hypothesis that newer statistical classification methods derived from data mining and machine learning, like Neural Networks, Support Vector Machines and Random Forests, can improve the accuracy, sensitivity and specificity of predictions obtained from neuropsychological testing. Seven non-parametric classifiers derived from data mining methods (Multilayer Perceptron Neural Networks, Radial Basis Function Neural Networks, Support Vector Machines, CART, CHAID and QUEST Classification Trees, and Random Forests) were compared to three traditional classifiers (Linear Discriminant Analysis, Quadratic Discriminant Analysis and Logistic Regression) in terms of overall classification accuracy, specificity, sensitivity, area under the ROC curve and Press' Q. Model predictors were 10 neuropsychological tests currently used in the diagnosis of dementia. Statistical distributions of classification parameters obtained from a 5-fold cross-validation were compared using Friedman's nonparametric test. Results Press' Q test showed that all classifiers performed better than chance alone. Conclusions When taking into account sensitivity, specificity and overall classification accuracy, Random Forests and Linear Discriminant Analysis rank first among all the classifiers tested in the prediction of dementia using several neuropsychological tests. These methods may be used to improve the accuracy, sensitivity and specificity of predictions of dementia from neuropsychological testing.

  16. Avoiding overfit by restricted model search in tree-based EEG classification

    Czech Academy of Sciences Publication Activity Database

    Klaschka, Jan

    The Hague: International Statistical Institute, 2012, s. 5077-5082. ISBN 978-90-73592-33-9. [ISI 2011. Session of the International Statistical Institute /58./. Dublin (IE), 21.08.2011-26.08.2011] R&D Projects: GA MŠk ME 949 Institutional research plan: CEZ:AV0Z10300504 Keywords : model search * electroencephalography * classification trees and forests * random forests Subject RIV: BB - Applied Statistics, Operational Research http://2011.isiproceedings.org/papers/950644.pdf

  17. The application of GIS based decision-tree models for generating the spatial distribution of hydromorphic organic landscapes in relation to digital terrain data

    Directory of Open Access Journals (Sweden)

    R. Bou Kheir

    2010-06-01

    Accurate information about organic/mineral soil occurrence is a prerequisite for many land resources management applications (including climate change mitigation). This paper aims at investigating the potential of using geomorphometrical analysis and decision tree modeling to predict the geographic distribution of hydromorphic organic landscapes in unsampled areas in Denmark. Nine primary topographic parameters (elevation, slope angle, slope aspect, plan curvature, profile curvature, tangent curvature, flow direction, flow accumulation, and specific catchment area) and one secondary parameter (the steady-state topographic wetness index) were generated from Digital Elevation Models (DEMs) acquired using airborne LIDAR (Light Detection and Ranging) systems. They were used along with existing digital data collected from other sources (soil type, geological substrate and landscape type) to explain organic/mineral field measurements in hydromorphic landscapes of the chosen Danish area. A large number of tree-based classification models (186) were developed using (1) all of the parameters, (2) the primary DEM-derived topographic (morphological/hydrological) parameters only, (3) selected pairs of parameters, and (4) excluding each parameter one at a time from the potential pool of predictor parameters. The best classification tree model (with the lowest misclassification error and the smallest number of terminal nodes and predictor parameters) combined the steady-state topographic wetness index and soil type, and explained 68% of the variability in organic/mineral field measurements. The overall accuracy of the predictive organic/inorganic landscapes map produced (at 1:50 000 cartographic scale) using the best tree was estimated to be ca. 75%. The proposed classification-tree model is relatively simple, quick, realistic and practical, and it can be applied to other areas, thereby providing a tool to facilitate the implementation of pedological/hydrological plans for conservation...

  18. The application of GIS based decision-tree models for generating the spatial distribution of hydromorphic organic landscapes in relation to digital terrain data

    Directory of Open Access Journals (Sweden)

    R. Bou Kheir

    2010-01-01

    Accurate information about soil organic carbon (SOC), presented in spatial form, is a prerequisite for many land resources management applications (including climate change mitigation). This paper aims to investigate the potential of using geomorphometrical analysis and decision tree modeling to predict the geographic distribution of hydromorphic organic landscapes in unsampled areas in Denmark. Nine primary topographic parameters (elevation, slope angle, slope aspect, plan curvature, profile curvature, tangent curvature, flow direction, flow accumulation, and specific catchment area) and one secondary parameter (the steady-state topographic wetness index) were generated from Digital Elevation Models (DEMs) acquired using airborne LIDAR (Light Detection and Ranging) systems. They were used along with existing digital data collected from other sources (soil type, geological substrate and landscape type) to statistically explain SOC field measurements in hydromorphic landscapes of the chosen Danish area. A large number of tree-based classification models (186) were developed using (1) all of the parameters, (2) the primary DEM-derived topographic (morphological/hydrological) parameters only, (3) selected pairs of parameters, and (4) excluding each parameter one at a time from the potential pool of predictor parameters. The best classification tree model (with the lowest misclassification error and the smallest number of terminal nodes and predictor parameters) combined the steady-state topographic wetness index and soil type, and explained 68% of the variability in field SOC measurements. The overall accuracy of the produced predictive SOC map (at 1:50 000 cartographic scale) using the best tree was estimated to be ca. 75%. The proposed classification-tree model is relatively simple, quick, realistic and practical, and it can be applied to other areas, thereby providing a tool to help with the implementation of pedological/hydrological plans for conservation and sustainable...

  19. Estimating the Probability of Vegetation to Be Groundwater Dependent Based on the Evaluation of Tree Models

    Directory of Open Access Journals (Sweden)

    Isabel C. Pérez Hoyos

    2016-04-01

    Groundwater Dependent Ecosystems (GDEs) are increasingly threatened by humans' rising demand for water resources. Consequently, it is imperative to identify the location of GDEs in order to protect them. This paper develops a methodology to estimate the probability that an ecosystem is groundwater dependent. Probabilities are obtained by modeling the relationship between the known locations of GDEs and factors influencing groundwater dependence, namely water table depth and climatic aridity index. Probabilities are derived for the state of Nevada, USA, using modeled water table depth and aridity index values obtained from the Global Aridity database. The selected model results from a performance comparison of classification trees (CT) and random forests (RF). Based on a threshold-independent accuracy measure, RF has a better ability to generate probability estimates. Considering a threshold that minimizes the misclassification rate for each model, RF also proves to be more accurate. Regarding training accuracy, performance measures such as accuracy, sensitivity, and specificity are higher for RF. For the test set, higher values of accuracy and kappa for CT highlight the fact that these measures are greatly affected by low prevalence. As shown for RF, the choice of the cutoff probability value has important consequences for model accuracy and the overall proportion of locations where GDEs are found.
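    The CT-versus-RF comparison on probability estimates can be sketched with a threshold-independent AUC plus an accuracy at a chosen cutoff; the two synthetic features stand in for water table depth and aridity index, and the class imbalance mimics the low prevalence noted above:

```python
# Sketch: compare a single classification tree and a random forest on
# probability estimates (threshold-independent AUC) and on accuracy at a
# chosen cutoff. Features are synthetic stand-ins for water table depth
# and aridity index; the 90/10 class weights mimic low prevalence.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=2, n_informative=2,
                           n_redundant=0, weights=[0.9, 0.1], random_state=11)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=11)

for name, clf in [("CT", DecisionTreeClassifier(max_depth=4, random_state=11)),
                  ("RF", RandomForestClassifier(n_estimators=200,
                                                random_state=11))]:
    p = clf.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
    cutoff = 0.5                          # the cutoff choice matters, as noted
    acc = ((p > cutoff).astype(int) == y_te).mean()
    print(f"{name}: AUC={roc_auc_score(y_te, p):.3f}, acc@{cutoff}={acc:.3f}")
```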

  20. Computer-assisted detection of colonic polyps with CT colonography using neural networks and binary classification trees

    International Nuclear Information System (INIS)

    Detection of colonic polyps in CT colonography is problematic due to complexities of polyp shape and the surface of the normal colon. Published results indicate the feasibility of computer-aided detection of polyps, but better classifiers are needed to improve specificity. In this paper we compare the classification results of two approaches: neural networks and recursive binary trees. As our starting point we collect surface geometry information from three-dimensional reconstruction of the colon, followed by a filter based on selected variables such as region density, Gaussian and average curvature, and sphericity. The filter returns sites that are candidate polyps, based on earlier work using detection thresholds, to which the neural nets or the binary trees are applied. A data set of 39 polyps from 3 to 25 mm in size was used in our investigation. For both the neural nets and the binary trees we use tenfold cross-validation to better estimate the true error rates. The backpropagation neural net with one hidden layer trained with the Levenberg-Marquardt algorithm achieved the best results: sensitivity 90% and specificity 95% with 16 false positives per study

  1. Identification of area-level influences on regions of high cancer incidence in Queensland, Australia: a classification tree approach

    Directory of Open Access Journals (Sweden)

    Mengersen Kerrie L

    2011-07-01

    Abstract Background Strategies for cancer reduction and management are targeted at both individual and area levels. Area-level strategies require careful understanding of geographic differences in cancer incidence, in particular the association with factors such as socioeconomic status, ethnicity and accessibility. This study aimed to identify the complex interplay of area-level factors associated with high area-specific incidence of Australian priority cancers using a classification and regression tree (CART) approach. Methods Area-specific smoothed standardised incidence ratios were estimated for priority-area cancers across 478 statistical local areas in Queensland, Australia (1998–2007, n = 186,075). For those cancers with significant spatial variation, CART models were used to identify whether area-level accessibility, socioeconomic status and ethnicity were associated with high area-specific incidence. Results The accessibility of a person's residence had the most consistent association with the risk of cancer diagnosis across the specific cancers. Many cancers were likely to have high incidence in more urban areas, although male lung cancer and cervical cancer tended to have high incidence in more remote areas. The impact of socioeconomic status and ethnicity on these associations differed by type of cancer. Conclusions These results highlight the complex interactions between accessibility, socioeconomic status and ethnicity in determining cancer incidence risk.

  2. Model-based geostatistics

    CERN Document Server

    Diggle, Peter J

    2007-01-01

    Model-based geostatistics refers to the application of general statistical principles of modeling and inference to geostatistical problems. This volume provides a treatment of model-based geostatistics and emphasizes on statistical methods and applications. It also features analyses of datasets from a range of scientific contexts.

  3. Staying Power of Churn Prediction Models

    NARCIS (Netherlands)

    Risselada, Hans; Verhoef, Peter C.; Bijmolt, Tammo H. A.

    2010-01-01

    In this paper, we study the staying power of various churn prediction models. Staying power is defined as the predictive performance of a model in a number of periods after the estimation period. We examine two methods, logit models and classification trees, both with and without applying a bagging

  4. Prospective Testing and Redesign of a Temporal Biomarker Based Risk Model for Patients With Septic Shock: Implications for Septic Shock Biology

    Directory of Open Access Journals (Sweden)

    Hector R. Wong

    2015-12-01

    Full Text Available The temporal version of the pediatric sepsis biomarker risk model (tPERSEVERE) estimates the risk of a complicated course in children with septic shock based on biomarker changes from days 1 to 3 of septic shock. We validated tPERSEVERE performance in a prospective cohort, with an a priori plan to redesign tPERSEVERE if it did not perform well. Biomarkers were measured in the validation cohort (n = 168) and study subjects were classified according to tPERSEVERE. To redesign tPERSEVERE, the validation cohort and the original derivation cohort (n = 299) were combined and randomly allocated to training (n = 374) and test (n = 93) sets. tPERSEVERE was redesigned using the training set and CART methodology. tPERSEVERE performed poorly in the validation cohort, with an area under the curve (AUC) of 0.67 (95% CI: 0.58–0.75). Failure analysis revealed potential confounders related to clinical characteristics. The redesigned tPERSEVERE model had an AUC of 0.83 (0.79–0.87) and a sensitivity of 93% (68–97%) for estimating the risk of a complicated course. Similar performance was seen in the test set. The classification tree segregated patients into two broad endotypes of septic shock characterized by either excessive inflammation or immune suppression.
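
    A minimal sketch of the validation arithmetic: an AUC with a bootstrap confidence interval, plus sensitivity at a chosen risk cutoff. The risk scores, outcomes and the 0.5 cutoff are simulated placeholders; tPERSEVERE's actual decision rules are not reproduced here.

```python
# Bootstrap CI for AUC and cutoff-based sensitivity on simulated risk scores.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
n = 168
risk = rng.uniform(0, 1, n)                   # model-estimated risk (synthetic)
outcome = rng.binomial(1, 0.3 + 0.4 * risk)   # complicated course (synthetic)

aucs = [roc_auc_score(outcome[idx], risk[idx])
        for idx in (rng.integers(0, n, n) for _ in range(1000))]
print("AUC %.2f (95%% CI %.2f-%.2f)" % (roc_auc_score(outcome, risk),
                                        np.percentile(aucs, 2.5),
                                        np.percentile(aucs, 97.5)))

pred = risk >= 0.5                            # hypothetical risk cutoff
sens = (pred & (outcome == 1)).sum() / (outcome == 1).sum()
print("sensitivity at 0.5 cutoff:", round(sens, 2))
```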

  5. Model-based segmentation

    OpenAIRE

    Heimann, Tobias; Delingette, Hervé

    2011-01-01

    This chapter starts with a brief introduction into model-based segmentation, explaining the basic concepts and different approaches. Subsequently, two segmentation approaches are presented in more detail: First, the method of deformable simplex meshes is described, explaining the special properties of the simplex mesh and the formulation of the internal forces. Common choices for image forces are presented, and how to evolve the mesh to adapt to certain structures. Second, the method of point...

  6. Study on the risk factors of hypertension among rural residents in mid-west areas of Shandong province, using the classification tree analysis methodology

    Institute of Scientific and Technical Information of China (English)

    刘甲野; 马吉祥; 徐爱强; 付振涛; 贺桂顺; 贾崇奇; 于洋

    2008-01-01

    Objective To explore the risk factors of hypertension and the high-risk population among adults aged ≥25 in the mid-western rural areas of Shandong province, and to provide evidence for the development of intervention measures. Methods Subjects aged ≥25 were selected by a multi-stage stratified random sampling method. All participants were interviewed with a standard questionnaire and physically examined for height, weight, waist circumference, blood pressure and fasting plasma glucose (FPG). Classification tree analysis was employed to determine the risk factors of hypertension and the high-risk populations related to it. Results The major risk factors of hypertension were age, abdominal obesity, overweight or obesity, family history and high blood sugar. The major high-risk populations included: a) the elderly; b) middle-aged people with high blood sugar, abdominal obesity/overweight, or family history; c) middle-aged people with both family history and abdominal obesity. For the classification tree analysis, sensitivity, specificity and overall correct rates were 71.87%, 66.38% and 68.79%, respectively, on the 'learning sample', and 70.70%, 65.84% and 67.97%, respectively, on the 'testing sample'. Conclusion Efforts on both weight and blood sugar reduction are common prevention measures for the general population. Different kinds of prevention and control measures should be taken according to the different risk factors present in the targeted high-risk population of hypertension. Community-based hypertension prevention and control measures should be integrated when targeting the population at high risk.
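
    For reference, the reported accuracy measures (sensitivity, specificity, overall correct rate) follow directly from a confusion matrix, as the short sketch below shows. The counts used here are invented for demonstration and are not the study's table.

```python
# Sensitivity, specificity and overall correct rate from a confusion matrix.
import numpy as np

# rows: true class (hypertensive, normotensive); cols: predicted class
cm = np.array([[718, 281],    # hypertensive: true positives, false negatives
               [336, 663]])   # normotensive: false positives, true negatives
tp, fn = cm[0]
fp, tn = cm[1]
print("sensitivity:", round(tp / (tp + fn), 4))
print("specificity:", round(tn / (tn + fp), 4))
print("overall correct rate:", round((tp + tn) / cm.sum(), 4))
```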

  7. Model Based Definition

    Science.gov (United States)

    Rowe, Sidney E.

    2010-01-01

    In September 2007, the Engineering Directorate at the Marshall Space Flight Center (MSFC) created the Design System Focus Team (DSFT). MSFC was responsible for the in-house design and development of the Ares 1 Upper Stage and the Engineering Directorate was preparing to deploy a new electronic Configuration Management and Data Management System with the Design Data Management System (DDMS) based upon a Commercial Off The Shelf (COTS) Product Data Management (PDM) System. The DSFT was to establish standardized CAD practices and a new data life cycle for design data. Of special interest here, the design teams were to implement Model Based Definition (MBD) in support of the Upper Stage manufacturing contract. It is noted that this MBD does use partially dimensioned drawings for auxiliary information to the model. The design data lifecycle implemented several new release states to be used prior to formal release that allowed the models to move through a flow of progressive maturity. The DSFT identified some 17 Lessons Learned as outcomes of the standards development, pathfinder deployments and initial application to the Upper Stage design completion. Some of the high value examples are reviewed.

  8. LSTM based Conversation Models

    OpenAIRE

    Luan, Yi; Ji, Yangfeng; Ostendorf, Mari

    2016-01-01

    In this paper, we present a conversational model that incorporates both context and participant role for two-party conversations. Different architectures are explored for integrating participant role and context information into a Long Short-term Memory (LSTM) language model. The conversational model can function as a language model or a language generation model. Experiments on the Ubuntu Dialog Corpus show that our model can capture multiple turn interaction between participants. The propos...

  9. Model-Based Reasoning

    Science.gov (United States)

    Ifenthaler, Dirk; Seel, Norbert M.

    2013-01-01

    In this paper, there will be a particular focus on mental models and their application to inductive reasoning within the realm of instruction. A basic assumption of this study is the observation that the construction of mental models and related reasoning is a slowly developing capability of cognitive systems that emerges effectively with proper…

  10. Model-based software design

    Science.gov (United States)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui; Yenne, Britt; Vansickle, Larry; Ballantyne, Michael

    1992-01-01

    Domain-specific knowledge is required to create specifications, generate code, and understand existing systems. Our approach to automating software design is based on instantiating an application domain model with industry-specific knowledge and then using that model to achieve the operational goals of specification elicitation and verification, reverse engineering, and code generation. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model.

  11. Model-based Software Engineering

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    2010-01-01

    The vision of model-based software engineering is to make models the main focus of software development and to automatically generate software from these models. Part of that idea works already today. But, there are still difficulties when it comes to behaviour. Actually, there is no lack in models...

  12. Principles of models based engineering

    Energy Technology Data Exchange (ETDEWEB)

    Dolin, R.M.; Hefele, J.

    1996-11-01

    This report describes a Models Based Engineering (MBE) philosophy and implementation strategy that has been developed at Los Alamos National Laboratory's Center for Advanced Engineering Technology. A major theme in this discussion is that models based engineering is an information management technology enabling the development of information driven engineering. Unlike other information management technologies, models based engineering encompasses the breadth of engineering information, from design intent through product definition to consumer application.

  13. Model Construct Based Enterprise Model Architecture and Its Modeling Approach

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    In order to support enterprise integration, a kind of model construct based enterprise model architecture and its modeling approach are studied in this paper. First, the structural makeup and internal relationships of the enterprise model architecture are discussed. Then, the concept of the reusable model construct (MC), which belongs to the control view and can help to derive other views, is proposed. The modeling approach based on model constructs consists of three steps: reference model architecture synthesis, enterprise model customization, and system design and implementation. Following the MC-based modeling approach, a case study with the background of one-kind-product machinery manufacturing enterprises is illustrated. It is shown that the proposed model construct based enterprise model architecture and modeling approach are practical and efficient.

  14. Methods of development fuzzy logic driven decision-support models in copper alloys processing

    Directory of Open Access Journals (Sweden)

    S. Kluska-Nawarecka

    2010-01-01

    Full Text Available Developing a diagnostic decision support system using a logical formalism other than two-valued logic, in particular fuzzy logic, allows inference from facts presented not as explicit numbers but as linguistic variables such as "high level", "low temperature", "too much content", etc. Thanks to this, the inference process resembles human reasoning under real decision-making conditions. Expert knowledge makes it possible to discover the functions describing the relationship between the classification of a set of objects and their characteristics, on the basis of which decision rules can be created for classifying new objects whose classification is so far unknown. This process can be automated. Experimental studies conducted on copper alloys provide large amounts of data. Processing of these data can be greatly accelerated by classification tree algorithms, which provide classes that can be used in a fuzzy inference model. Fuzzy logic also provides flexibility in allocating objects to classes on the basis of membership functions (which is similar to events in real-world conditions). Decision-making in foundry operations often requires reliance on incomplete and ambiguous knowledge, hence the conclusions drawn from the data and facts may be true only "to some extent", and the technologist has to determine what level of confidence is acceptable; the degree of accuracy for specific criteria is defined by a membership function, which takes values from the interval [0, 1]. This paper describes the methodology and the process of developing fuzzy-logic-based decision-making models from data preprocessed with classification trees, scoped to the needs of the diverse characteristics of copper alloy processing. Algorithms for the automatic classification of materials research data on copper alloys are innovative and hold promise for practical applications in this area.
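
    A sketch of the membership-function idea described above: a numeric property is mapped to linguistic classes ("low", "medium", "high") by functions taking values in [0, 1]. The property name and the triangular breakpoints below are illustrative assumptions, not values from the paper.

```python
# Triangular fuzzy membership functions assigning a hardness value to
# linguistic classes with degrees of membership in [0, 1].
import numpy as np

def triangular(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def memberships(hardness):
    # Breakpoints are invented for illustration.
    return {"low": triangular(hardness, 0, 40, 80),
            "medium": triangular(hardness, 40, 80, 120),
            "high": triangular(hardness, 80, 120, 160)}

m = memberships(72.0)
print(m)                                  # degrees of membership, not 0/1 labels
print("best class:", max(m, key=m.get))  # crisp class if one is needed
```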

  15. Graph Model Based Indoor Tracking

    DEFF Research Database (Denmark)

    Jensen, Christian Søndergaard; Lu, Hua; Yang, Bin

    2009-01-01

    The tracking of the locations of moving objects in large indoor spaces is important, as it enables a range of applications related to, e.g., security and indoor navigation and guidance. This paper presents a graph model based approach to indoor tracking that offers a uniform data management...... infrastructure for different symbolic positioning technologies, e.g., Bluetooth and RFID. More specifically, the paper proposes a model of indoor space that comprises a base graph and mappings that represent the topology of indoor space at different levels. The resulting model can be used for one or several...... indoor positioning technologies. Focusing on RFID-based positioning, an RFID specific reader deployment graph model is built from the base graph model. This model is then used in several algorithms for constructing and refining trajectories from raw RFID readings. Empirical studies with implementations...

  16. Cluster Based Text Classification Model

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah; Wiil, Uffe Kock

    2011-01-01

    We propose a cluster based classification model for suspicious email detection and other text classification tasks. The text classification tasks comprise many training examples that require a complex classification model. Using clusters for classification makes the model simpler and increases the...... classifier is trained on each cluster having reduced dimensionality and less number of examples. The experimental results show that the proposed model outperforms the existing classification models for the task of suspicious email detection and topic categorization on the Reuters-21578 and 20 Newsgroups...... datasets. Our model also outperforms A Decision Cluster Classification (ADCC) and the Decision Cluster Forest Classification (DCFC) models on the Reuters-21578 dataset....

  17. Mining social mixing patterns for infectious disease models based on a two-day population survey in Belgium

    Directory of Open Access Journals (Sweden)

    Van Damme Pierre

    2009-01-01

    Full Text Available Abstract Background Until recently, mathematical models of person-to-person infectious disease transmission had to make assumptions on transmissions enabled by personal contacts by estimating the so-called WAIFW-matrix. In order to better inform such estimates, a population based contact survey has been carried out in Belgium over the period March-May 2006. In contrast to other European surveys conducted simultaneously, each respondent recorded contacts over two days. Special attention was given to holiday periods, and to respondents with large numbers of professional contacts. Methods Participants kept a paper diary with information on their contacts over two different days. A contact was defined as a two-way conversation of at least three words in each other's proximity. The contact information included the age of the contact, gender, location, duration, frequency, and whether or not touching was involved. For data analysis, we used association rules and classification trees. Weighted generalized estimating equations were used to analyze contact frequency while accounting for the correlation between contacts reported on the two different days. A contact surface, expressing the average number of contacts between persons of different ages, was obtained by a bivariate smoothing approach and the relation to the so-called next-generation matrix was established. Results People mostly mixed with people of similar age, or with their offspring, their parents and their grandparents. By imputing professional contacts, the average number of daily contacts increased from 11.84 to 15.70. The number of reported contacts depended heavily on household size, class size for children, and number of professional contacts for adults. Adults living with children had on average 2 more daily contacts than adults living without children. In the holiday period, the daily contact frequency for children and adolescents decreased by about 19%, while a similar observation
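
    A toy sketch of the contact-surface idea: tabulate reported contacts by participant age and contact age, then smooth the matrix to estimate average mixing between age pairs. The data are simulated, and a simple Gaussian filter stands in for the paper's bivariate smoothing approach.

```python
# Simulated age-by-age contact counts smoothed into a contact surface.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(5)
part_age = rng.integers(0, 80, 5000)
# Assumption: assortative mixing, contacts cluster near the participant's age.
contact_age = np.clip(part_age + rng.normal(0, 8, 5000).astype(int), 0, 79)

counts, _, _ = np.histogram2d(part_age, contact_age, bins=80,
                              range=[[0, 80], [0, 80]])
surface = gaussian_filter(counts, sigma=2)   # smoothed contact surface
print(surface.shape, round(surface.max(), 2))
```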

  18. Base Flow Model Validation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The program focuses on turbulence modeling enhancements for predicting high-speed rocket base flows. A key component of the effort is the collection of...

  19. Modeling Guru: Knowledge Base for NASA Modelers

    Science.gov (United States)

    Seablom, M. S.; Wojcik, G. S.; van Aartsen, B. H.

    2009-05-01

    Modeling Guru is an on-line knowledge-sharing resource for anyone involved with or interested in NASA's scientific models or High End Computing (HEC) systems. Developed and maintained by the NASA's Software Integration and Visualization Office (SIVO) and the NASA Center for Computational Sciences (NCCS), Modeling Guru's combined forums and knowledge base for research and collaboration is becoming a repository for the accumulated expertise of NASA's scientific modeling and HEC communities. All NASA modelers and associates are encouraged to participate and provide knowledge about the models and systems so that other users may benefit from their experience. Modeling Guru is divided into a hierarchy of communities, each with its own set of forums and knowledge base documents. Current modeling communities include those for space science, land and atmospheric dynamics, atmospheric chemistry, and oceanography. In addition, there are communities focused on NCCS systems, HEC tools and libraries, and programming and scripting languages. Anyone may view most of the content on Modeling Guru (available at http://modelingguru.nasa.gov/), but you must log in to post messages and subscribe to community postings. The site offers a full range of "Web 2.0" features, including discussion forums, "wiki" document generation, document uploading, RSS feeds, search tools, blogs, email notification, and "breadcrumb" links. A discussion (a.k.a. forum "thread") is used to post comments, solicit feedback, or ask questions. If marked as a question, SIVO will monitor the thread, and normally respond within a day. Discussions can include embedded images, tables, and formatting through the use of the Rich Text Editor. Also, the user can add "Tags" to their thread to facilitate later searches. The "knowledge base" is comprised of documents that are used to capture and share expertise with others. The default "wiki" document lets users edit within the browser so others can easily collaborate on the

  20. Event-Based Activity Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2004-01-01

    We present and discuss a modeling approach that supports event-based modeling of information and activity in information systems. Interacting human actors and IT-actors may carry out such activity. We use events to create meaningful relations between information structures and the related...

  1. Modelling Gesture Based Ubiquitous Applications

    CERN Document Server

    Zacharia, Kurien; Varghese, Surekha Mariam

    2011-01-01

    A cost-effective, gesture-based modelling technique called Virtual Interactive Prototyping (VIP) is described in this paper. Prototyping is implemented by projecting a virtual model of the equipment to be prototyped. Users can interact with the virtual model like the original working equipment. For capturing and tracking the user's interactions with the model, image and sound processing techniques are used. VIP is a flexible and interactive prototyping method that has many applications in ubiquitous computing environments. Different commercial as well as socio-economic applications, and an extension of VIP to interactive advertising, are also discussed.

  2. Sketch-based geologic modeling

    Science.gov (United States)

    Rood, M. P.; Jackson, M.; Hampson, G.; Brazil, E. V.; de Carvalho, F.; Coda, C.; Sousa, M. C.; Zhang, Z.; Geiger, S.

    2015-12-01

    Two-dimensional (2D) maps and cross-sections, and 3D conceptual models, are fundamental tools for understanding, communicating and modeling geology. Yet geologists lack dedicated and intuitive tools that allow rapid creation of such figures and models. Standard drawing packages produce only 2D figures that are not suitable for quantitative analysis. Geologic modeling packages can produce 3D models and are widely used in the groundwater and petroleum communities, but are often slow and non-intuitive to use, requiring the creation of a grid early in the modeling workflow and the use of geostatistical methods to populate the grid blocks with geologic information. We present an alternative approach to rapidly create figures and models using sketch-based interface and modelling (SBIM). We leverage methods widely adopted in other industries to prototype complex geometries and designs. The SBIM tool contains built-in geologic rules that constrain how sketched lines and surfaces interact. These rules are based on the logic of superposition and cross-cutting relationships that follow from rock-forming processes, including deposition, deformation, intrusion and modification by diagenesis or metamorphism. The approach allows rapid creation of multiple, geologically realistic, figures and models in 2D and 3D using a simple, intuitive interface. The user can sketch in plan- or cross-section view. Geologic rules are used to extrapolate sketched lines in real time to create 3D surfaces. Quantitative analysis can be carried out directly on the models. Alternatively, they can be output as simple figures or imported directly into other modeling tools. The software runs on a tablet PC and can be used in a variety of settings including the office, classroom and field. The speed and ease of use of SBIM enables multiple interpretations to be developed from limited data, uncertainty to be readily appraised, and figures and models to be rapidly updated to incorporate new data or concepts.

  3. HMM-based Trust Model

    DEFF Research Database (Denmark)

    ElSalamouny, Ehab; Nielsen, Mogens; Sassone, Vladimiro

    2010-01-01

    Probabilistic trust has been adopted as an approach to taking security sensitive decisions in modern global computing environments. Existing probabilistic trust frameworks either assume fixed behaviour for the principals or incorporate the notion of ‘decay' as an ad hoc approach to cope with thei...... the major limitation of existing Beta trust model. We show the consistency of the HMM-based trust model and contrast it against the well known Beta trust model with the decay principle in terms of the estimation precision....

  4. A model-based display

    International Nuclear Information System (INIS)

    A model-based display is identified, discussed, and illustrated. The model used in the display is based upon the Rankine Cycle, a heat engine cycle. Plant process data from the loss of main and auxiliary feedwater event at the Davis-Besse Plant on June 9, 1985 is used to illustrate the display. The model used in the display fuses individual process variables into process functions. It also serves as a medium to communicate status of the process to human users. The human users may evaluate the goals of operation from the displayed process functions. Because of these display features, the user's cognitive workload is minimized. The opinions expressed herein are the author's personal ones and do not necessarily reflect criteria, requirements, and guidelines of the U.S. Nuclear Regulatory Commission

  5. Model-based sensor diagnosis

    International Nuclear Information System (INIS)

    Running a nuclear power plant involves monitoring data provided by the installation's sensors. Operators and computerized systems then use these data to establish a diagnostic of the plant. However, the instrumentation system is complex, and is not immune to faults and failures. This paper presents a system for detecting sensor failures using a topological description of the installation and a set of component models. This model of the plant implicitly contains relations between sensor data. These relations must always be checked if all the components are functioning correctly. The failure detection task thus consists of checking these constraints. The constraints are extracted in two stages. Firstly, a qualitative model of their existence is built using structural analysis. Secondly, the models are formally handled according to the results of the structural analysis, in order to establish the constraints on the sensor data. This work constitutes an initial step in extending model-based diagnosis, as the information on which it is based is suspect. This work will be followed by surveillance of the detection system. When the instrumentation is assumed to be sound, the unverified constraints indicate errors on the plant model. (authors). 8 refs., 4 figs

  6. Model-based requirements engineering

    CERN Document Server

    Holt, Jon

    2012-01-01

    This book provides a hands-on introduction to model-based requirements engineering and management by describing a set of views that form the basis for the approach. These views take into account each individual requirement in terms of its description, but then also provide each requirement with meaning by putting it into the correct 'context'. A requirement that has been put into a context is known as a 'use case' and may be based upon either stakeholders or levels of hierarchy in a system. Each use case must then be analysed and validated by defining a combination of scenarios and formal mathematica

  7. Model-based tomographic reconstruction

    Science.gov (United States)

    Chambers, David H.; Lehman, Sean K.; Goodman, Dennis M.

    2012-06-26

    A model-based approach to estimating wall positions for a building is developed and tested using simulated data. It borrows two techniques from geophysical inversion problems, layer stripping and stacking, and combines them with a model-based estimation algorithm that minimizes the mean-square error between the predicted signal and the data. The technique is designed to process multiple looks from an ultra wideband radar array. The processed signal is time-gated and each section processed to detect the presence of a wall and estimate its position, thickness, and material parameters. The floor plan of a building is determined by moving the array around the outside of the building. In this paper we describe how the stacking and layer stripping algorithms are combined and show the results from a simple numerical example of three parallel walls.

  8. Differential Geometry Based Multiscale Models

    OpenAIRE

    Wei, Guo-Wei

    2010-01-01

    Large chemical and biological systems such as fuel cells, ion channels, molecular motors, and viruses are of great importance to the scientific community and public health. Typically, these complex systems in conjunction with their aquatic environment pose a fabulous challenge to theoretical description, simulation, and prediction. In this work, we propose a differential geometry based multiscale paradigm to model complex macromolecular systems, and to put macroscopic and microscopic descript...

  9. Crowdsourcing Based 3d Modeling

    Science.gov (United States)

    Somogyi, A.; Barsi, A.; Molnar, B.; Lovas, T.

    2016-06-01

    Web-based photo albums that support organizing and viewing the users' images are widely used. These services provide a convenient solution for storing, editing and sharing images. In many cases, the users attach geotags to the images in order to enable using them e.g. in location based applications on social networks. Our paper discusses a procedure that collects open access images from a site frequently visited by tourists. Geotagged pictures showing the image of a sight or tourist attraction are selected and processed in photogrammetric processing software that produces the 3D model of the captured object. For the particular investigation we selected three attractions in Budapest. To assess the geometrical accuracy, we used laser scanner and DSLR as well as smart phone photography to derive reference values to enable verifying the spatial model obtained from the web-album images. The investigation shows how detailed and accurate models could be derived applying photogrammetric processing software, simply by using images of the community, without visiting the site.

  10. An Agent Based Classification Model

    CERN Document Server

    Gu, Feng; Greensmith, Julie

    2009-01-01

    The major function of this model is to access the UCI Wisconsin Breast Cancer data-set[1] and classify the data items into two categories, which are normal and anomalous. This kind of classification can be referred to as anomaly detection, which discriminates anomalous behaviour from normal behaviour in computer systems. One popular solution for anomaly detection is Artificial Immune Systems (AIS). AIS are adaptive systems inspired by theoretical immunology and observed immune functions, principles and models which are applied to problem solving. The Dendritic Cell Algorithm (DCA)[2] is an AIS algorithm that is developed specifically for anomaly detection. It has been successfully applied to intrusion detection in computer security. It is believed that agent-based modelling is an ideal approach for implementing AIS, as intelligent agents could be the perfect representations of immune entities in AIS. This model evaluates the feasibility of re-implementing the DCA in an agent-based simulation environment ...

  11. Model-based Utility Functions

    Science.gov (United States)

    Hibbard, Bill

    2012-05-01

    Orseau and Ring, as well as Dewey, have recently described problems, including self-delusion, with the behavior of agents using various definitions of utility functions. An agent's utility function is defined in terms of the agent's history of interactions with its environment. This paper argues, via two examples, that the behavior problems can be avoided by formulating the utility function in two steps: 1) inferring a model of the environment from interactions, and 2) computing utility as a function of the environment model. Basing a utility function on a model that the agent must learn implies that the utility function must initially be expressed in terms of specifications to be matched to structures in the learned model. These specifications constitute prior assumptions about the environment so this approach will not work with arbitrary environments. But the approach should work for agents designed by humans to act in the physical world. The paper also addresses the issue of self-modifying agents and shows that if provided with the possibility to modify their utility functions agents will not choose to do so, under some usual assumptions.

  12. Trace-Based Code Generation for Model-Based Testing

    OpenAIRE

    Kanstrén, T.; Piel, E.; Gross, H.-G.

    2009-01-01

    Paper Submitted for review at the Eighth International Conference on Generative Programming and Component Engineering. Model-based testing can be a powerful means to generate test cases for the system under test. However, creating a useful model for model-based testing requires expertise in the (formal) modeling language of the used tool and the general concept of modeling the system under test for effective test generation. A commonly used modeling notation is to describe the model through a...

  13. Business value modeling based on BPMN models

    OpenAIRE

    Masoumigoudarzi, Farahnaz

    2014-01-01

    In this study we will try to clarify the explanation of modeling and measuring 'Business Values', as it is defined in business context, in the business processes of a company and introduce different methods and select the one which is best for modeling the company's business values. These methods have been used by researchers in business analytics and senior managers of many companies. The focus in this project is business value detection and modeling. The basis of this research is on BPM...

  14. Optimal pricing decision model based on activity-based costing

    Institute of Scientific and Technical Information of China (English)

    王福胜; 常庆芳

    2003-01-01

    In order to find out the applicability of the optimal pricing decision model based on the conventional cost behavior model after activity-based costing has given a strong shock to the conventional cost behavior model and its assumptions, detailed analyses have been made using the activity-based cost behavior and cost-volume-profit analysis model. It is concluded from these analyses that the theory behind the construction of the optimal pricing decision model is still tenable under activity-based costing, but the conventional optimal pricing decision model must be modified as appropriate to the activity-based-costing-based cost behavior model and cost-volume-profit analysis model; an optimal pricing decision model is really a product pricing decision model constructed by following the economic principle of maximizing profit.

  15. Sensor-based interior modeling

    International Nuclear Information System (INIS)

    Robots and remote systems will play crucial roles in future decontamination and decommissioning (D&D) of nuclear facilities. Many of these facilities, such as uranium enrichment plants, weapons assembly plants, research and production reactors, and fuel recycling facilities, are dormant; there is also an increasing number of commercial reactors whose useful lifetime is nearly over. To reduce worker exposure to radiation, occupational and other hazards associated with D&D tasks, robots will execute much of the work agenda. Traditional teleoperated systems rely on human understanding (based on information gathered by remote viewing cameras) of the work environment to safely control the remote equipment. However, removing the operator from the work site substantially reduces his efficiency and effectiveness. To approach the productivity of a human worker, tasks will be performed telerobotically, in which many aspects of task execution are delegated to robot controllers and other software. This paper describes a system that semi-automatically builds a virtual world for remote D&D operations by constructing 3-D models of a robot's work environment. Planar and quadric surface representations of objects typically found in nuclear facilities are generated from laser rangefinder data with a minimum of human interaction. The surface representations are then incorporated into a task space model that can be viewed and analyzed by the operator, accessed by motion planning and robot safeguarding algorithms, and ultimately used by the operator to instruct the robot at a level much higher than teleoperation

  16. Memristor model based on fuzzy window function

    OpenAIRE

    Abdel-Kader, Rabab Farouk; Abuelenin, Sherif M.

    2016-01-01

    Memristor (memory-resistor) is the fourth passive circuit element. We introduce a memristor model based on a fuzzy logic window function. Fuzzy models are flexible, which enables the capture of the pinched hysteresis behavior of the memristor. The introduced fuzzy model avoids common problems associated with window-function based memristor models, such as the terminal state problem, and the symmetry issues. The model captures the memristor behavior with a simple rule-base which gives an insig...

  17. A respiratory alert model for the Shenandoah Valley, Virginia, USA.

    Science.gov (United States)

    Hondula, David M; Davis, Robert E; Knight, David B; Sitka, Luke J; Enfield, Kyle; Gawtry, Stephen B; Stenger, Phillip J; Deaton, Michael L; Normile, Caroline P; Lee, Temple R

    2013-01-01

    Respiratory morbidity (particularly COPD and asthma) can be influenced by short-term weather fluctuations that affect air quality and lung function. We developed a model to evaluate meteorological conditions associated with respiratory hospital admissions in the Shenandoah Valley of Virginia, USA. We generated ensembles of classification trees based on six years of respiratory-related hospital admissions (64,620 cases) and a suite of 83 potential environmental predictor variables. As our goal was to identify short-term weather linkages to high admission periods, the dependent variable was formulated as a binary classification of five-day moving average respiratory admission departures from the seasonal mean value. Accounting for seasonality removed the long-term apparent inverse relationship between temperature and admissions. We generated eight total models specific to the northern and southern portions of the valley for each season. All eight models demonstrate predictive skill (mean odds ratio = 3.635) when evaluated using a randomization procedure. The predictor variables selected by the ensembling algorithm vary across models, and both meteorological and air quality variables are included. In general, the models indicate complex linkages between respiratory health and environmental conditions that may be difficult to identify using more traditional approaches. PMID:22438053
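
    A hedged sketch of the dependent-variable construction and the tree ensemble: a five-day moving average of daily admissions is expressed as a binary departure above a seasonal mean, then modeled with bagged classification trees. Admissions, predictors and the simple mean used in place of a true seasonal baseline are all simulated assumptions.

```python
# Binary five-day moving-average departures modeled with bagged trees.
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(11)
days = 2190  # six years of daily counts (synthetic)
admissions = rng.poisson(30, days).astype(float)
ma5 = np.convolve(admissions, np.ones(5) / 5, mode="valid")
season = ma5.mean()                      # crude stand-in for a seasonal mean
high = (ma5 > season).astype(int)        # binary departure classification

X = rng.normal(size=(len(high), 10))     # placeholder weather/air-quality vars
model = BaggingClassifier(DecisionTreeClassifier(max_depth=4),
                          n_estimators=100, random_state=11).fit(X, high)
print("training accuracy:", round(model.score(X, high), 3))
```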

  18. Test case generation based on orthogonal table for software black-box testing

    Institute of Scientific and Technical Information of China (English)

    LIU Jiu-fu; YANG Zhong; YANG Zhen-xing; SUN Lin

    2008-01-01

    Software testing is an important means to assure software quality. This paper presents a practicable method to generate test cases for software testing that is operational and highly efficient. We discuss the identification of software specification categories and choices and construct a classification tree. Based on an orthogonal array, it is easy to generate test cases. The number of test cases produced by this method is smaller than the number of all combinations of the choices.
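
    A minimal sketch of orthogonal-array test selection: an L4(2^3) array covers every pairwise combination of three two-level factors in 4 cases instead of all 8. The factor names and levels below are invented for illustration; a real classification tree would supply them from the specification.

```python
# Generate test cases from an L4(2^3) orthogonal array over invented factors.
L4 = [(0, 0, 0),
      (0, 1, 1),
      (1, 0, 1),
      (1, 1, 0)]

factors = {
    "user_type": ["guest", "admin"],
    "input_size": ["empty", "large"],
    "network": ["online", "offline"],
}

names = list(factors)
for i, row in enumerate(L4, 1):
    case = {names[j]: factors[names[j]][level] for j, level in enumerate(row)}
    print(f"test case {i}: {case}")
```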

  19. Guide to APA-Based Models

    Science.gov (United States)

    Robins, Robert E.; Delisi, Donald P.

    2008-01-01

    In Robins and Delisi (2008), a linear decay model, a new IGE model by Sarpkaya (2006), and a series of APA-Based models were scored using data from three airports. This report is a guide to the APA-based models.

  20. CEAI: CCM based Email Authorship Identification Model

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah

    2013-01-01

    reveal that the proposed CCM-based email authorship identification model, along with the proposed feature set, outperforms the state-of-the-art support vector machine (SVM)-based models, as well as the models proposed by Iqbal et al. [1, 2]. The proposed model attains an accuracy rate of 94% for 10...

  1. Mining geriatric assessment data for in-patient fall prediction models and high-risk subgroups

    Directory of Open Access Journals (Sweden)

    Marschollek Michael

    2012-03-01

    Full Text Available Abstract Background Hospital in-patient falls constitute a prominent problem in terms of costs and consequences. Geriatric institutions are most often affected, and common screening tools cannot predict in-patient falls consistently. Our objectives are to derive comprehensible fall risk classification models from a large data set of geriatric in-patients' assessment data and to evaluate their predictive performance (aim #1), and to identify high-risk subgroups from the data (aim #2). Methods A data set of n = 5,176 single in-patient episodes covering 1.5 years of admissions to a geriatric hospital was extracted from the hospital's database and matched with fall incident reports (n = 493). A classification tree model was induced using the C4.5 algorithm, as well as a logistic regression model, and their predictive performance was evaluated. Furthermore, high-risk subgroups were identified from extracted classification rules with a support of more than 100 instances. Results The classification tree model showed an overall classification accuracy of 66%, with a sensitivity of 55.4%, a specificity of 67.1%, and positive and negative predictive values of 15% and 93.5%, respectively. Five high-risk groups were identified, defined by high age, low Barthel index, cognitive impairment, multi-medication and co-morbidity. Conclusions Our results show that a little more than half of the fallers may be identified correctly by our model, but the positive predictive value is too low to be applicable. Non-fallers, on the other hand, may be sorted out with the model quite well. The high-risk subgroups and the risk factors identified (age, low ADL score, cognitive impairment, institutionalization, polypharmacy and co-morbidity) reflect domain knowledge and may be used to screen certain subgroups of patients with a high risk of falling. Classification models derived from a large data set using data mining methods can compete with current dedicated fall risk screening tools, yet lack
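
    An illustrative computation of the reported measures (sensitivity, specificity, PPV, NPV) for a tree classifier. scikit-learn's entropy-based CART stands in for C4.5, and the geriatric assessment data below are simulated, not the study's records.

```python
# Tree classifier on simulated assessment data, evaluated with the four
# measures reported above.
import numpy as np
from sklearn.metrics import confusion_matrix
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(21)
n = 5176
age = rng.uniform(65, 100, n)
barthel = rng.uniform(0, 100, n)
# Assumed risk rule: very old age and low Barthel index raise fall risk.
p_fall = 0.03 + 0.1 * (age > 85) + 0.1 * (barthel < 40)
fall = rng.binomial(1, p_fall)

X = np.column_stack([age, barthel])
tree = DecisionTreeClassifier(criterion="entropy", max_depth=4,
                              class_weight="balanced").fit(X, fall)
tn, fp, fn, tp = confusion_matrix(fall, tree.predict(X)).ravel()
print("sensitivity:", round(tp / (tp + fn), 3))
print("specificity:", round(tn / (tn + fp), 3))
print("PPV:", round(tp / (tp + fp), 3), "NPV:", round(tn / (tn + fn), 3))
```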

  2. Fault diagnosis and comparing risk for the steel coil manufacturing process using statistical models for binary data

    International Nuclear Information System (INIS)

    Advanced statistical models can help industry to design more economical and rational investment plans. Fault detection and diagnosis is an important problem in continuous hot dip galvanizing. Increasingly stringent quality requirements in the automotive industry also require ongoing efforts in process control to make processes more robust. Robust methods for estimating the quality of galvanized steel coils are an important tool for the comprehensive monitoring of the performance of the manufacturing process. This study applies different statistical regression models: generalized linear models, generalized additive models and classification trees to estimate the quality of galvanized steel coils on the basis of short time histories. The data, consisting of 48 galvanized steel coils, were divided into sets of conforming and nonconforming coils. Five variables were selected for monitoring the process: steel strip velocity and four bath temperatures. The present paper reports a comparative evaluation of statistical models for binary data using Receiver Operating Characteristic (ROC) curves. A ROC curve is a technique for visualizing, organizing and selecting classifiers based on their performance. The purpose of this paper is to examine their use in research to obtain the best model for predicting defective steel coil probability. In contrast to the work of other authors, who only propose goodness-of-fit statistics, one distinctive feature of the methodology presented here is the possibility of comparing the different models with ROC graphs, which are based on model classification performance. Finally, the results are validated by bootstrap procedures.
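
    A sketch of the ROC-based comparison for two of the model families: a logistic regression (a GLM) and a classification tree, compared by test-set AUC. GAMs are omitted here (they would need statsmodels or pyGAM), and the coil features and defect rule are simulated assumptions.

```python
# GLM (logistic regression) vs. classification tree compared by ROC AUC.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(42)
n = 480  # simulated coils (the study had 48; more are used here for stability)
strip_velocity = rng.normal(100, 10, n)
bath_temp = rng.normal(460, 5, (n, 4))       # four bath temperatures
X = np.column_stack([strip_velocity, bath_temp])
nonconforming = rng.binomial(1, 1 / (1 + np.exp(-(strip_velocity - 100) / 5)))

X_tr, X_te, y_tr, y_te = train_test_split(X, nonconforming, random_state=42)
for name, m in [("GLM (logistic)", LogisticRegression(max_iter=1000)),
                ("classification tree", DecisionTreeClassifier(max_depth=3))]:
    auc = roc_auc_score(y_te, m.fit(X_tr, y_tr).predict_proba(X_te)[:, 1])
    print(f"{name}: AUC = {auc:.3f}")
```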

  3. Trace-Based Code Generation for Model-Based Testing

    NARCIS (Netherlands)

    Kanstrén, T.; Piel, E.; Gross, H.-G.

    2009-01-01

    Paper Submitted for review at the Eighth International Conference on Generative Programming and Component Engineering. Model-based testing can be a powerful means to generate test cases for the system under test. However, creating a useful model for model-based testing requires expertise in the (fo

  4. Rule-based decision making model

    International Nuclear Information System (INIS)

    A rule-based decision making model is designed in the G2 environment. A theoretical and methodological frame for the model is composed and motivated. The rule-based decision making model is based on object-oriented modelling, knowledge engineering and decision theory. The idea of a safety objective tree is utilized. Advanced rule-based methodologies are applied. A general decision making model, the 'decision element', is constructed. The strategy planning of the decision element is based on, e.g., value theory and utility theory. A hypothetical process model is built to give input data for the decision element. The basic principle of the object model in decision making is division into tasks. Probability models are used in characterizing component availabilities. Bayes' theorem is used to recalculate the probability figures when new information is obtained. The model includes simple learning features to save the solution path. A decision analytic interpretation is given to the decision making process. (author)

  5. Landscape patterns as habitat predictors: Building and testing models for cavity-nesting birds in the Uinta Mountains of Utah, USA

    Science.gov (United States)

    Lawler, J.J.; Edwards, T.C.

    2002-01-01

    The ability to predict species occurrences quickly is often crucial for managers and conservation biologists with limited time and funds. We used measured associations with landscape patterns to build accurate predictive habitat models that were quickly and easily applied (i.e., required no additional data collection in the field to make predictions). We used classification trees (a nonparametric alternative to discriminant function analysis, logistic regression, and other generalized linear models) to model nesting habitat of red-naped sapsuckers (Sphyrapicus nuchalis), northern flickers (Colaptes auratus), tree swallows (Tachycineta bicolor), and mountain chickadees (Parus gambeli) in the Uinta Mountains of northeastern Utah, USA. We then tested the predictive capability of the models with independent data collected in the field the following year. The models built for the northern flicker, red-naped sapsucker, and tree swallow were relatively accurate (84%, 80%, and 75% nests correctly classified, respectively) compared to the models for the mountain chickadee (50% nests correctly classified). All four models were more selective than a null model that predicted habitat based solely on a gross association with aspen forests. We conclude that associations with landscape patterns can be used to build relatively accurate, easy to use, predictive models for some species. Our results stress, however, that both selecting the proper scale at which to assess landscape associations and empirically testing the models derived from those associations are crucial for building useful predictive models.
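
    A toy sketch of the build-then-test design: a classification tree is fitted to one year's nest observations and evaluated on independent data from the following year. The landscape metrics, the aspen-based nesting rule and the nest labels below are simulated placeholders.

```python
# Fit a tree on one year's data, test on the next year's independent data.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(8)

def sample_year(n):
    aspen_pct = rng.uniform(0, 100, n)      # % aspen cover around the point
    edge_density = rng.uniform(0, 50, n)
    nest = rng.binomial(1, 1 / (1 + np.exp(-(aspen_pct - 50) / 10)))
    return np.column_stack([aspen_pct, edge_density]), nest

X1, y1 = sample_year(300)   # model-building year
X2, y2 = sample_year(200)   # independent field data, following year
tree = DecisionTreeClassifier(max_depth=3, random_state=8).fit(X1, y1)
print("independent-year accuracy:", round(tree.score(X2, y2), 3))
```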

  6. Electrical Compact Modeling of Graphene Base Transistors

    Directory of Open Access Journals (Sweden)

    Sébastien Frégonèse

    2015-11-01

    Full Text Available Following the recent development of the Graphene Base Transistor (GBT), a new electrical compact model for GBT devices is proposed. The transistor model includes the quantum capacitance model to obtain a self-consistent base potential. It also uses a versatile transfer current equation to be compatible with the different possible GBT configurations, and it accounts for high injection conditions thanks to a transit-time-based charge model. Finally, the developed large signal model has been implemented in Verilog-A code and can be used for simulation in a standard circuit design environment such as Cadence or ADS. This model has been verified using advanced numerical simulation.

  7. EPR-based material modelling of soils

    Science.gov (United States)

    Faramarzi, Asaad; Alani, Amir M.

    2013-04-01

    In the past few decades, as a result of the rapid developments in computational software and hardware, alternative computer-aided pattern recognition approaches have been introduced for modelling many engineering problems, including constitutive modelling of materials. The main idea behind pattern recognition systems is that they learn adaptively from experience and extract various discriminants, each appropriate for its purpose. In this work an approach is presented for developing material models for soils based on evolutionary polynomial regression (EPR). EPR is a recently developed hybrid data mining technique that searches for structured mathematical equations (representing the behaviour of a system) using a genetic algorithm and the least squares method. Stress-strain data from triaxial tests are used to train and develop EPR-based material models for soil. The developed models are compared with some of the well-known conventional material models, and it is shown that EPR-based models can provide a better prediction of the behaviour of soils. The main benefits of using EPR-based material models are that they provide a unified approach to constitutive modelling of all materials (i.e., all aspects of material behaviour can be implemented within the unified environment of an EPR model) and they do not require any arbitrary choice of constitutive (mathematical) models. In EPR-based material models there are no material parameters to be identified. As the model is trained directly on experimental data, EPR-based material models are the shortest route from experimental research (data) to numerical modelling. Another advantage of EPR-based constitutive models is that as more experimental data become available, the quality of the EPR prediction can be improved by learning from the additional data; therefore, the EPR model can become more effective and robust. The developed EPR-based material models can be incorporated in finite element (FE) analysis.
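
    A highly simplified sketch of the EPR idea: search over candidate polynomial term structures, fit each by least squares, and keep the best-fitting structure. A real EPR uses a genetic algorithm; an exhaustive search over exponent pairs stands in here, and the synthetic stress-strain relation is invented.

```python
# Structure search + least-squares fitting, a toy stand-in for EPR.
import itertools
import numpy as np

rng = np.random.default_rng(2)
strain = rng.uniform(0, 0.1, 200)
# Synthetic "triaxial" data with a known polynomial form plus noise.
stress = 50 * strain - 120 * strain**2 + rng.normal(0, 0.05, 200)

best = None
for exps in itertools.combinations(range(1, 5), 2):   # pairs of exponents
    A = np.column_stack([strain**e for e in exps])
    coef, *_ = np.linalg.lstsq(A, stress, rcond=None)
    sse = ((A @ coef - stress) ** 2).sum()             # sum of squared errors
    if best is None or sse < best[0]:
        best = (sse, exps, coef)

print("best structure: stress = " +
      " + ".join(f"{c:.1f}*strain^{e}" for c, e in zip(best[2], best[1])))
```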

  8. Model-based DSL frameworks

    NARCIS (Netherlands)

    Kurtev, I.; Bézivin, J.; Jouault, F.; Valduriez, P.

    2006-01-01

    More than five years ago, the OMG proposed the Model Driven Architecture (MDA™) approach to deal with the separation of platform dependent and independent aspects in information systems. Since then, the initial idea of MDA evolved and Model Driven Engineering (MDE) is being increasingly promoted to

  9. Data-driven modeling of hydroclimatic trends and soil moisture: Multi-scale data integration and decision support

    Science.gov (United States)

    Coopersmith, Evan Joseph

    regime curve data and facilitate the development of cluster-specific algorithms. Given the desire to enable intelligent decision-making at any location, this classification system is developed in a manner that will allow for classification anywhere in the U.S., even in an ungauged basin. Daily time series data from 428 catchments in the MOPEX database are analyzed to produce an empirical classification tree, partitioning the United States into regions of hydroclimatic similarity. In constructing a classification tree based upon 55 years of data, it is important to recognize the non-stationary nature of climate data. The shifts in climatic regimes will cause certain locations to shift their ultimate position within the classification tree, requiring decision-makers to alter land usage, farming practices, and equipment needs, and algorithms to adjust accordingly. This work adapts the classification model to address the issue of regime shifts over larger temporal scales and suggests how land-usage and farming protocol may vary from hydroclimatic shifts in decades to come. Finally, the generalizability of the hydroclimatic classification system is tested with a physically-based soil moisture model calibrated at several locations throughout the continental United States. The soil moisture model is calibrated at a given site and then applied with the same parameters at other sites within and outside the same hydroclimatic class. The model's performance deteriorates minimally if the calibration and validation location are within the same hydroclimatic class, but deteriorates significantly if the calibration and validates sites are located in different hydroclimatic classes. These soil moisture estimates at the field scale are then further refined by the introduction of LiDAR elevation data, distinguishing faster-drying peaks and ridges from slower-drying valleys. The inclusion of LiDAR enabled multiple locations within the same field to be predicted accurately despite non

  10. The Culture Based Model: Constructing a Model of Culture

    Science.gov (United States)

    Young, Patricia A.

    2008-01-01

    Recent trends reveal that models of culture aid in mapping the design and analysis of information and communication technologies. Therefore, models of culture are powerful tools to guide the building of instructional products and services. This research examines the construction of the culture based model (CBM), a model of culture that evolved…

  11. Finite mixture models and model-based clustering

    Directory of Open Access Journals (Sweden)

    Volodymyr Melnykov

    2010-01-01

    Full Text Available Finite mixture models have a long history in statistics, having been used to model population heterogeneity, to generalize distributional assumptions, and, lately, to provide a convenient yet formal framework for clustering and classification. This paper provides a detailed review of mixture models and model-based clustering. Recent trends as well as open problems in the area are also discussed.
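
    A brief sketch of model-based clustering with a finite Gaussian mixture: components are fitted by EM, and each point receives both a hard assignment and soft posterior memberships. The two-component synthetic data are assumptions for illustration.

```python
# Model-based clustering with a two-component Gaussian mixture.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(4)
X = np.vstack([rng.normal(0, 1, (150, 2)),
               rng.normal(4, 1, (150, 2))])

gmm = GaussianMixture(n_components=2, random_state=4).fit(X)
labels = gmm.predict(X)         # hard cluster assignments
post = gmm.predict_proba(X)     # soft (posterior) memberships
print("BIC:", round(gmm.bic(X), 1), "| cluster sizes:", np.bincount(labels))
```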

  12. Probabilistic Model-Based Safety Analysis

    CERN Document Server

    Güdemann, Matthias; 10.4204/EPTCS.28.8

    2010-01-01

    Model-based safety analysis approaches aim at finding critical failure combinations by analysis of models of the whole system (i.e. software, hardware, failure modes and environment). The advantage of these methods compared to traditional approaches is that the analysis of the whole system gives more precise results. Only few model-based approaches have been applied to answer quantitative questions in safety analysis, often limited to the analysis of specific failure propagation models or limited types of failure modes, or lacking system dynamics and behavior, as direct quantitative analysis uses large amounts of computing resources. New achievements in the domain of (probabilistic) model-checking now allow for overcoming this problem. This paper shows how functional models based on synchronous parallel semantics, which can be used for system design, implementation and qualitative safety analysis, can be directly re-used for (model-based) quantitative safety analysis. Accurate modeling of different types of proba...

  13. P-Graph-based Workflow Modelling

    OpenAIRE

    József Tick

    2007-01-01

    Workflow modelling has been successfully introduced and implemented in several application fields. Therefore, its significance has increased dramatically. Several workflow modelling techniques have been published so far, out of which quite a number are widespread applications. For instance the Petri-Net-based modelling has become popular partly due to its graphical design and partly due to its correct mathematical background. The workflow modelling based on Unified Modelling Language is important...

  14. Stochastic Modelling for Condition Based Maintenance

    OpenAIRE

    Han, Zehan

    2015-01-01

    This Master's thesis covers almost all aspects of Condition Based Maintenance (CBM). All objectives in Chapter 1 are met. The thesis is mainly comprised of three parts. First part introduces the world of CBM to readers. This part presents data acquisition, data processing and databases, which are the foundation to CBM. Then it highlights models which are divided into physics based models, data-driven models and hybrid models, for diagnostic and prognostic use. Three promising diagnostic and p...

  15. Base Flow Model Validation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation is the systematic "building-block" validation of CFD/turbulence models employing a GUI driven CFD code (RPFM) and existing as well as new data sets...

  16. Firm Based Trade Models and Turkish Economy

    Directory of Open Access Journals (Sweden)

    Nilüfer ARGIN

    2015-12-01

    Full Text Available Among all international trade models, only Firm Based Trade Models explain firms' actions and behavior in world trade. Firm Based Trade Models focus on the trade behavior of the individual firms that actually conduct intra-industry trade, and can explain the globalization process accurately. These approaches also cover multinational corporations, supply chains and outsourcing. Our paper aims to explain and analyze Turkish exports in the context of Firm Based Trade Models. We use UNCTAD data on exports by SITC Rev. 3 categorization to analyze total exports and 255 products and to calculate the intensive and extensive margins of Turkish firms.

  17. Distributed Prognostics Based on Structural Model Decomposition

    Data.gov (United States)

    National Aeronautics and Space Administration — Within systems health management, prognostics focuses on predicting the remaining useful life of a system. In the model-based prognostics paradigm, physics-based...

  18. Lévy-based growth models

    DEFF Research Database (Denmark)

    Jónsdóttir, Kristjana Ýr; Schmiegel, Jürgen; Jensen, Eva Bjørn Vedel

    2008-01-01

    In the present paper, we give a condensed review, for the nonspecialist reader, of a new modelling framework for spatio-temporal processes, based on Lévy theory. We show the potential of the approach in stochastic geometry and spatial statistics by studying Lévy-based growth modelling of planar...... objects. The growth models considered are spatio-temporal stochastic processes on the circle. As a by-product, flexible new models for space–time covariance functions on the circle are provided. An application of the Lévy-based growth models to tumour growth is discussed....

  19. Traceability in Model-Based Testing

    Directory of Open Access Journals (Sweden)

    Mathew George

    2012-11-01

    Full Text Available The growing complexities of software and the demand for shorter time to market are two important challenges that face today’s IT industry. These challenges demand the increase of both productivity and quality of software. Model-based testing is a promising technique for meeting these challenges. Traceability modeling is a key issue and challenge in model-based testing. Relationships between the different models will help to navigate from one model to another, and trace back to the respective requirements and the design model when a test fails. In this paper, we present an approach for bridging the gaps between the different models in model-based testing. We propose the relation definition markup language (RDML) for defining the relationships between models.

  20. Hybrid data mining-regression for infrastructure risk assessment based on zero-inflated data

    International Nuclear Information System (INIS)

    Infrastructure disaster risk assessment seeks to estimate the probability of a given customer or area losing service during a disaster, sometimes in conjunction with estimating the duration of each outage. This is often done on the basis of past data about the effects of similar events impacting the same or similar systems. In many situations this past performance data from infrastructure systems is zero-inflated; it has more zeros than can be appropriately modeled with standard probability distributions. The data are also often non-linear and exhibit threshold effects due to the complexities of infrastructure system performance. Standard zero-inflated statistical models such as zero-inflated Poisson and zero-inflated negative binomial regression models do not adequately capture these complexities. In this paper we develop a novel method that is a hybrid classification tree/regression method for complex, zero-inflated data sets. We investigate its predictive accuracy based on a large number of simulated data sets and then demonstrate its practical usefulness with an application to hurricane power outage risk assessment for a large utility based on actual data from the utility. While formulated for infrastructure disaster risk assessment, this method is promising for data-driven analysis for other situations with zero-inflated, complex data exhibiting response thresholds.
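
    The hybrid idea described above can be illustrated with a hedged two-stage sketch: a classification tree separates zeros from non-zeros, and a count regression models the positive part. This is an analogue of the general approach, not the paper's exact method; the data, features and use of scikit-learn are assumptions.

```python
# Hedged sketch of a hybrid tree/regression scheme for zero-inflated data.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import PoissonRegressor

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(1000, 3))      # e.g. wind speed, rain, asset age
latent = 3.0 * (X[:, 0] > 0.6)             # threshold effect in the system
y = rng.poisson(latent)                    # many exact zeros

clf = DecisionTreeClassifier(max_depth=3).fit(X, y > 0)   # zero vs. non-zero
mask = y > 0
reg = PoissonRegressor().fit(X[mask], y[mask])            # positive counts only

# Prediction: P(non-zero) times the mean of the positive-part model.
y_hat = clf.predict_proba(X)[:, 1] * reg.predict(X)
print("mean predicted outages:", y_hat.mean())
```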

  1. Model-based Utility Functions

    CERN Document Server

    Hibbard, Bill

    2011-01-01

    At the recent AGI-11 Conference Orseau and Ring, and Dewey, described problems, including self-delusion, with the behavior of AIXI agents using various definitions of utility functions. An agent's utility function is defined in terms of the agent's history of interactions with its environment. This paper argues that the behavior problems can be avoided by formulating the utility function in two steps: 1) inferring a model of the environment from interactions, and 2) computing utility as a function of the environment model. The paper also argues that agents will not choose to modify their utility functions.

  2. PCA-based lung motion model

    CERN Document Server

    Li, Ruijiang; Jia, Xun; Zhao, Tianyu; Lamb, James; Yang, Deshan; Low, Daniel A; Jiang, Steve B

    2010-01-01

    Organ motion induced by respiration may cause clinically significant targeting errors and greatly degrade the effectiveness of conformal radiotherapy. It is therefore crucial to be able to model respiratory motion accurately. A recently proposed lung motion model based on principal component analysis (PCA) has been shown to be promising on a few patients. However, there is still a need to understand the underlying reason why it works. In this paper, we present a much deeper and detailed analysis of the PCA-based lung motion model. We provide the theoretical justification of the effectiveness of PCA in modeling lung motion. We also prove that under certain conditions, the PCA motion model is equivalent to the 5D motion model, which is based on the physiology and anatomy of the lung. The modeling power of the PCA model was tested on clinical data and the average 3D error was found to be below 1 mm.
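
    As a hedged illustration of how PCA compresses respiratory motion, the sketch below builds synthetic displacement fields driven by two temporal signals and recovers a two-component representation; the data and dimensions are illustrative, not the clinical data of the record.

```python
# Illustrative sketch: low-dimensional PCA representation of motion fields.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
n_phases, n_voxels = 10, 5000
t = np.linspace(0, 2 * np.pi, n_phases)
basis = rng.normal(size=(2, n_voxels))     # two fixed spatial modes
# Displacement fields = smooth temporal signals times spatial modes + noise.
fields = np.outer(np.sin(t), basis[0]) + np.outer(np.cos(2 * t), basis[1])
fields += 0.01 * rng.normal(size=fields.shape)

pca = PCA(n_components=2).fit(fields)
scores = pca.transform(fields)             # low-dimensional motion state
recon = pca.inverse_transform(scores)      # reconstructed fields
print("explained variance:", pca.explained_variance_ratio_)
print("max reconstruction error:", np.abs(recon - fields).max())
```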

  3. An acoustical model based monitoring network

    NARCIS (Netherlands)

    Wessels, P.W.; Basten, T.G.H.; Eerden, F.J.M. van der

    2010-01-01

    In this paper the approach for an acoustical model based monitoring network is demonstrated. This network is capable of reconstructing a noise map, based on the combination of measured sound levels and an acoustic model of the area. By pre-calculating the sound attenuation within the network the noi

  4. Model Validation in Ontology Based Transformations

    Directory of Open Access Journals (Sweden)

    Jesús M. Almendros-Jiménez

    2012-10-01

    Full Text Available Model Driven Engineering (MDE) is an emerging approach to software engineering. MDE emphasizes the construction of models from which the implementation is derived by applying model transformations. The Ontology Definition Meta-model (ODM) has been proposed as a profile for UML models of the Web Ontology Language (OWL). In this context, transformations of UML models can be mapped into ODM/OWL transformations. On the other hand, model validation is a crucial task in model transformation. Meta-modeling makes it possible to give a syntactic structure to source and target models. However, semantic requirements also have to be imposed on source and target models. A given transformation is sound when the source and target models fulfill the syntactic and semantic requirements. In this paper, we present an approach for model validation in ODM based transformations. Adopting a logic programming based transformational approach, we show how it is possible to transform and validate models. The properties to be validated range from structural and semantic requirements of the models (pre and post conditions) to properties of the transformation itself (invariants). The approach has been applied to a well-known example of model transformation: the Entity-Relationship (ER) to Relational Model (RM) transformation.

  5. Agent-based pedestrian modelling

    OpenAIRE

    Batty, Michael

    2003-01-01

    When the focus of interest in geographical systems is at the very fine scale, at the level of streets and buildings for example, movement becomes central to simulations of how spatial activities are used and develop. Recent advances in computing power and the acquisition of fine scale digital data now mean that we are able to attempt to understand and predict such phenomena with the focus in spatial modelling changing to dynamic simulations of the individual and collective beha...

  6. An Agent Based Classification Model

    OpenAIRE

    Gu, Feng; Aickelin, Uwe; Greensmith, Julie

    2009-01-01

    The major function of this model is to access the UCI Wisconsin Breast Cancer data-set[1] and classify the data items into two categories, which are normal and anomalous. This kind of classification can be referred to as anomaly detection, which discriminates anomalous behaviour from normal behaviour in computer systems. One popular solution for anomaly detection is Artificial Immune Systems (AIS). AIS are adaptive systems inspired by theoretical immunology and observed immune functions, p...

  7. Model Based Control of Solidification

    OpenAIRE

    Furenes, Beathe

    2009-01-01

    The objective of this thesis is to develop models for use in the control of a solidification process. Solidification is the phase change from liquid to solid, and takes place in many important processes ranging from production engineering to solid-state physics. Often during solidification, undesired effects like e.g. variation of composition, microstructure, etc. occur. The solidification structure and its associated defects often persist throughout the subsequent operations, and thus good co...

  8. Integration of Simulink Models with Component-based Software Models

    DEFF Research Database (Denmark)

    Marian, Nicolae

    2008-01-01

    Model based development aims to facilitate the development of embedded control systems by emphasizing the separation of the design level from the implementation level. Model based design involves the use of multiple models that represent different views of a system, having different semantics of... constructs and process flow, then software code is generated. A Simulink model is a representation of the design or implementation of a physical system that satisfies a set of requirements. A software component-based system aims to organize system architecture and behaviour as a means of computation... constraints. COMDES (Component-based Design of Software for Distributed Embedded Systems) is such a component-based system framework developed by the software engineering group of Mads Clausen Institute for Product Innovation (MCI), University of Southern Denmark. Once specified, the software model has to be...

  9. A Multiple Model Approach to Modeling Based on LPF Algorithm

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Input-output data fitting methods are often used for unknown-structure nonlinear system modeling. Based on model-on-demand tactics, a multiple model approach to modeling for nonlinear systems is presented. The basic idea is to find, from vast historical system input-output data sets, the data sets matching the current working point, and then to develop a local model using the Local Polynomial Fitting (LPF) algorithm. As the working point changes, multiple local models are built, which together realize exact modeling of the global system. Compared with other methods, the simulation results show good performance: the estimation is simple, effective and reliable.
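
    A minimal sketch of the model-on-demand idea, assuming a Gaussian kernel and a fixed bandwidth (both illustrative choices): for each query point, a local degree-1 polynomial is fitted to nearby historical input-output data by weighted least squares.

```python
# Hedged sketch of local polynomial fitting around a query point.
import numpy as np

def local_linear_predict(X, y, x_query, bandwidth=0.2):
    """Local degree-1 polynomial fit with a Gaussian distance kernel."""
    d = np.linalg.norm(X - x_query, axis=1)
    w = np.exp(-(d / bandwidth) ** 2)            # nearby points weigh more
    A = np.hstack([np.ones((len(X), 1)), X])     # design matrix [1, x]
    sw = np.sqrt(w)                              # weighted least squares
    beta, *_ = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)
    return np.concatenate([[1.0], x_query]) @ beta

rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, size=(500, 1))            # historical inputs
y = np.sin(3 * X[:, 0]) + 0.05 * rng.normal(size=500)  # nonlinear system
print(local_linear_predict(X, y, np.array([0.5])))     # local model output
```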

  10. Model-based internal wave processing

    Energy Technology Data Exchange (ETDEWEB)

    Candy, J.V.; Chambers, D.H.

    1995-06-09

    A model-based approach is proposed to solve the oceanic internal wave signal processing problem that is based on state-space representations of the normal-mode vertical velocity and plane wave horizontal velocity propagation models. It is shown that these representations can be utilized to spatially propagate the modal (depth) vertical velocity functions given the basic parameters (wave numbers, Brunt-Vaisala frequency profile etc.) developed from the solution of the associated boundary value problem as well as the horizontal velocity components. Based on this framework, investigations are made of model-based solutions to the signal enhancement problem for internal waves.

  11. Tools for model-based security engineering: models vs. code

    OpenAIRE

    Jürjens, Jan; Yu, Yijun

    2007-01-01

    We present tools to support model-based security engineering on both the model and the code level. In the approach supported by these tools, one firstly specifies the security-critical part of the system (e.g. a crypto protocol) using the UML security extension UMLsec. The models are automatically verified for security properties using automated theorem provers. These are implemented within a framework that supports implementing verification routines, based on XMI output of the diagrams from ...

  12. Modelling a Peroxidase-based Optical Biosensor

    OpenAIRE

    Juozas Kulys; Evelina Gaidamauskaitė; Romas Baronas

    2007-01-01

    The response of a peroxidase-based optical biosensor was modelled digitally. A mathematical model of the optical biosensor is based on a system of non-linear reaction-diffusion equations. The modelling biosensor comprises two compartments, an enzyme layer and an outer diffusion layer. The digital simulation was carried out using finite difference technique. The influence of the substrate concentration as well as of the thickness of both the enzyme and diffusion layers on the biosensor respons...

  13. Navigation based on symbolic space models

    OpenAIRE

    Baras, Karolina; Moreira, Adriano; Meneses, Filipe

    2010-01-01

    Existing navigation systems are very appropriate for car navigation, but lack support for convenient pedestrian navigation and cannot be used indoors due to GPS limitations. In addition, the creation and the maintenance of the required models are costly and time consuming, and are usually based on proprietary data structures. In this paper we describe a navigation system based on a human inspired symbolic space model. We argue that symbolic space models are much easier...

  14. P-Graph-based Workflow Modelling

    Directory of Open Access Journals (Sweden)

    József Tick

    2007-03-01

    Full Text Available Workflow modelling has been successfully introduced and implemented in several application fields. Therefore, its significance has increased dramatically. Several workflow modelling techniques have been published so far, out of which quite a number are widespread applications. For instance the Petri-Net-based modelling has become popular partly due to its graphical design and partly due to its correct mathematical background. The workflow modelling based on Unified Modelling Language is important because of its practical usage. This paper introduces and examines the workflow modelling technique based on the Process-graph as a possible new solution next to the already existing modelling techniques.

  15. IP Network Management Model Based on NGOSS

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jin-yu; LI Hong-hui; LIU Feng

    2004-01-01

    This paper addresses a management model for IP networks based on the Next Generation Operation Support System (NGOSS). It bases network management on all the operational actions of the ISP and provides QoS to user services by managing end-to-end Service Level Agreements (SLAs) along the whole path. Based on web and coordination technology, this paper gives an implementation architecture for this model.

  16. Gradient-based model calibration with proxy-model assistance

    Science.gov (United States)

    Burrows, Wesley; Doherty, John

    2016-02-01

    Use of a proxy model in gradient-based calibration and uncertainty analysis of a complex groundwater model with large run times and problematic numerical behaviour is described. The methodology is general, and can be used with models of all types. The proxy model is based on a series of analytical functions that link all model outputs used in the calibration process to all parameters requiring estimation. In enforcing history-matching constraints during the calibration and post-calibration uncertainty analysis processes, the proxy model is run for the purposes of populating the Jacobian matrix, while the original model is run when testing parameter upgrades; the latter process is readily parallelized. Use of a proxy model in this fashion dramatically reduces the computational burden of complex model calibration and uncertainty analysis. At the same time, the effect of model numerical misbehaviour on calculation of local gradients is mitigated, thus allowing access to the benefits of gradient-based analysis where lack of integrity in finite-difference derivatives calculation would otherwise have impeded such access. Construction of a proxy model, and its subsequent use in calibration of a complex model, and in analysing the uncertainties of predictions made by that model, is implemented in the PEST suite.
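
    A conceptual sketch of the division of labour described above, with toy stand-in functions (all names and values are assumptions, not PEST code): the cheap proxy supplies finite-difference Jacobian entries, while the expensive model is only run to compute residuals and to test each parameter upgrade.

```python
# Hedged sketch: proxy-assisted Gauss-Newton calibration.
import numpy as np

def expensive_model(p):                  # stand-in for the slow simulator
    return np.array([p[0] + 0.1 * p[0] ** 2, 2.0 * p[0]])

def proxy_model(p):                      # cheap analytic surrogate
    return np.array([p[0], 2.0 * p[0]])

obs = expensive_model(np.array([1.0]))   # synthetic "history" to match
p = np.array([0.0])
for _ in range(10):
    eps = 1e-6                           # finite-difference perturbation
    J = ((proxy_model(p + eps) - proxy_model(p)) / eps).reshape(-1, 1)
    r = obs - expensive_model(p)         # residual uses the real model
    step, *_ = np.linalg.lstsq(J, r, rcond=None)   # Gauss-Newton upgrade
    while np.linalg.norm(expensive_model(p + step) - obs) > np.linalg.norm(r):
        step *= 0.5                      # damp upgrades the real model rejects
    p = p + step
print("calibrated parameter:", p)        # approaches the true value 1.0
```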

  17. Model-based Abstraction of Data Provenance

    DEFF Research Database (Denmark)

    Probst, Christian W.; Hansen, René Rydhof

    2014-01-01

    Identifying provenance of data provides insights to the origin of data and intermediate results, and has recently gained increased interest due to data-centric applications. In this work we extend a data-centric system view with actors handling the data and policies restricting actions... This extension is based on provenance analysis performed on system models. System models have been introduced to model and analyse spatial and organisational aspects of organisations, to identify, e.g., potential insider threats. Both the models and analyses are naturally modular; models can be combined... to bigger models, and the analyses adapt accordingly. Our approach extends provenance both with the origin of data, the actors and processes involved in the handling of data, and policies applied while doing so. The model and corresponding analyses are based on a formal model of spatial and organisational......

  18. Agent-based modeling and network dynamics

    CERN Document Server

    Namatame, Akira

    2016-01-01

    The book integrates agent-based modeling and network science. It is divided into three parts, namely, foundations, primary dynamics on and of social networks, and applications. The book begins with the network origin of agent-based models, known as cellular automata, and introduces a number of classic models, such as Schelling’s segregation model and Axelrod’s spatial game. The essence of the foundation part is the network-based agent-based models in which agents follow network-based decision rules. Under the influence of the substantial progress in network science in the late 1990s, these models have been extended from using lattices into using small-world networks, scale-free networks, etc. The book also shows that the modern network science mainly driven by game-theorists and sociophysicists has inspired agent-based social scientists to develop alternative formation algorithms, known as agent-based social networks. The book reviews a number of pioneering and representative models in this family. Upon the gi...
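
    As a hedged illustration of the classic lattice models the book introduces, below is a minimal Schelling segregation sketch; the grid size, empty fraction and tolerance threshold are illustrative choices.

```python
# Minimal Schelling segregation sketch on a lattice.
import numpy as np

rng = np.random.default_rng(4)
N, threshold = 30, 0.5
grid = rng.choice([0, 1, 2], size=(N, N), p=[0.1, 0.45, 0.45])  # 0 = empty

def unhappy_cells(g):
    cells = []
    for i in range(N):
        for j in range(N):
            if g[i, j] == 0:
                continue
            neigh = g[max(i - 1, 0):i + 2, max(j - 1, 0):j + 2]
            same = np.count_nonzero(neigh == g[i, j]) - 1   # exclude self
            occupied = np.count_nonzero(neigh) - 1
            if occupied and same / occupied < threshold:
                cells.append((i, j))
    return cells

for _ in range(50):                       # relocation dynamics
    movers = unhappy_cells(grid)
    if not movers:
        break
    i, j = movers[rng.integers(len(movers))]
    empties = np.argwhere(grid == 0)
    k, l = empties[rng.integers(len(empties))]
    grid[k, l], grid[i, j] = grid[i, j], 0   # move to a random empty cell
print("unhappy agents remaining:", len(unhappy_cells(grid)))
```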

  19. Model-Based Clustering of Large Networks

    CERN Document Server

    Vu, Duy Quang; Schweinberger, Michael

    2012-01-01

    We describe a network clustering framework, based on finite mixture models, that can be applied to discrete-valued networks with hundreds of thousands of nodes and billions of edge variables. Relative to other recent model-based clustering work for networks, we introduce a more flexible modeling framework, improve the variational-approximation estimation algorithm, discuss and implement standard error estimation via a parametric bootstrap approach, and apply these methods to much larger datasets than those seen elsewhere in the literature. The more flexible modeling framework is achieved through introducing novel parameterizations of the model, giving varying degrees of parsimony, using exponential family models whose structure may be exploited in various theoretical and algorithmic ways. The algorithms, which we show how to adapt to the more complicated optimization requirements introduced by the constraints imposed by the novel parameterizations we propose, are based on variational generalized EM algorithms...

  20. Simulation-based Manufacturing System Modeling

    Institute of Scientific and Technical Information of China (English)

    卫东; 金烨; 范秀敏; 严隽琪

    2003-01-01

    In recent years, computer simulation has proved to be a very advantageous technique for researching resource-constrained manufacturing systems. This paper presents an object-oriented simulation modeling method, which combines the merits of traditional methods such as IDEF0 and Petri Nets. A four-layer-one-angle hierarchical modeling framework based on OOP is defined, and the modeling description of these layers is expounded, e.g. hybrid production control modeling and human resource dispatch modeling. To validate the modeling method, a case study of an auto-product line in a motor manufacturing company has been carried out.

  1. Energy based prediction models for building acoustics

    DEFF Research Database (Denmark)

    Brunskog, Jonas

    2012-01-01

    In order to reach robust and simplified yet accurate prediction models, energy based principles are commonly used in many fields of acoustics, especially in building acoustics. This includes simple energy flow models, the framework of statistical energy analysis (SEA) as well as more elaborated...... principles as, e.g., wave intensity analysis (WIA). The European standards for building acoustic predictions, the EN 12354 series, are based on energy flow and SEA principles. In the present paper, different energy based prediction models are discussed and critically reviewed. Special attention is placed on...

  2. Model-based reasoning and large-knowledge bases

    International Nuclear Information System (INIS)

    In such engineering fields as nuclear power plant engineering, technical information expressed in the form of schematics is frequently used. A new paradigm for model-based reasoning (MBR) and an AI tool called PLEXSYS (plant expert system) using this paradigm have been developed. PLEXSYS and the underlying paradigm are specifically designed to handle schematic drawings, by expressing drawings as models and supporting various sophisticated searches on these models. Two application systems have been constructed with PLEXSYS: one generates PLEXSYS models from existing CAD data files, and the other provides functions for nuclear power plant design support. Since the models can be generated from existing data resources, the design support system automatically has full access to a large-scale model or knowledge base representing actual nuclear power plants. (author)

  3. Ground-Based Telescope Parametric Cost Model

    Science.gov (United States)

    Stahl, H. Philip; Rowell, Ginger Holmes

    2004-01-01

    A parametric cost model for ground-based telescopes is developed using multi-variable statistical analysis. The model includes both engineering and performance parameters. While diameter continues to be the dominant cost driver, other significant factors include primary mirror radius of curvature and diffraction limited wavelength. The model includes an explicit factor for primary mirror segmentation and/or duplication (i.e., multi-telescope phased-array systems). Additionally, single variable models based on aperture diameter are derived. This analysis indicates that recent mirror technology advances have indeed reduced the historical telescope cost curve.

  4. Software Testing Method Based on Model Comparison

    Institute of Scientific and Technical Information of China (English)

    XIE Xiao-dong; LU Yan-sheng; MAO Cheng-yin

    2008-01-01

    A model comparison based software testing method (MCST) is proposed. In this method, the requirements and programs of the software under test are transformed into the same form and described by the same model description language (MDL). Then, the requirements are transformed into a specification model and the programs into an implementation model. Thus, the elements and structures of the two models are compared, and the differences between them are obtained. Based on these differences, a test suite is generated. Different MDLs can be chosen for the software under test. The usages of two classical MDLs in MCST, the equivalence classes model and the extended finite state machine (EFSM) model, are described with example applications. The results show that the test suites generated by MCST are more efficient and smaller than those of some other testing methods, such as the path-coverage testing method, the object state diagram testing method, etc.

  5. Sketch-based Interfaces and Modeling

    CERN Document Server

    Jorge, Joaquim

    2011-01-01

    The field of sketch-based interfaces and modeling (SBIM) is concerned with developing methods and techniques to enable users to interact with a computer through sketching - a simple, yet highly expressive medium. SBIM blends concepts from computer graphics, human-computer interaction, artificial intelligence, and machine learning. Recent improvements in hardware, coupled with new machine learning techniques for more accurate recognition, and more robust depth inferencing techniques for sketch-based modeling, have resulted in an explosion of both sketch-based interfaces and pen-based computing

  6. Location-based Modeling and Analysis: Tropos-based Approach

    OpenAIRE

    Ali, Raian; Dalpiaz, Fabiano; Giorgini, Paolo

    2008-01-01

    The continuous growth of interest in mobile applications makes the concept of location essential to design and develop software systems. Location-based software is supposed to be able to monitor the location and choose accordingly the most appropriate behavior. In this paper, we propose a novel conceptual framework to model and analyze location-based software. We mainly focus on the social facets of locations adopting concepts such as social actor, resource, and location-based behavior. Our a...

  7. Model-based clustered-dot screening

    Science.gov (United States)

    Kim, Sang Ho

    2006-01-01

    I propose a halftone screen design method based on a human visual system model and the characteristics of the electro-photographic (EP) printer engine. Generally, screen design methods based on human visual models produce dispersed-dot type screens while design methods considering EP printer characteristics generate clustered-dot type screens. In this paper, I propose a cost function balancing the conflicting characteristics of the human visual system and the printer. By minimizing the obtained cost function, I design a model-based clustered-dot screen using a modified direct binary search algorithm. Experimental results demonstrate the superior quality of the model-based clustered-dot screen compared to a conventional clustered-dot screen.

  8. Multiscale agent-based consumer market modeling.

    Energy Technology Data Exchange (ETDEWEB)

    North, M. J.; Macal, C. M.; St. Aubin, J.; Thimmapuram, P.; Bragen, M.; Hahn, J.; Karr, J.; Brigham, N.; Lacy, M. E.; Hampton, D.; Decision and Information Sciences; Procter & Gamble Co.

    2010-05-01

    Consumer markets have been studied in great depth, and many techniques have been used to represent them. These have included regression-based models, logit models, and theoretical market-level models, such as the NBD-Dirichlet approach. Although many important contributions and insights have resulted from studies that relied on these models, there is still a need for a model that could more holistically represent the interdependencies of the decisions made by consumers, retailers, and manufacturers. When the need is for a model that could be used repeatedly over time to support decisions in an industrial setting, it is particularly critical. Although some existing methods can, in principle, represent such complex interdependencies, their capabilities might be outstripped if they had to be used for industrial applications, because of the details this type of modeling requires. However, a complementary method - agent-based modeling - shows promise for addressing these issues. Agent-based models use business-driven rules for individuals (e.g., individual consumer rules for buying items, individual retailer rules for stocking items, or individual firm rules for advertising items) to determine holistic, system-level outcomes (e.g., to determine if brand X's market share is increasing). We applied agent-based modeling to develop a multi-scale consumer market model. We then conducted calibration, verification, and validation tests of this model. The model was successfully applied by Procter & Gamble to several challenging business problems. In these situations, it directly influenced managerial decision making and produced substantial cost savings.

  9. Workflow-Based Dynamic Enterprise Modeling

    Institute of Scientific and Technical Information of China (English)

    黄双喜; 范玉顺; 罗海滨; 林慧萍

    2002-01-01

    Traditional systems for enterprise modeling and business process control are often static and cannot adapt to the changing environment. This paper presents a workflow-based method to dynamically execute the enterprise model. This method gives an explicit representation of the business process logic and the relationships between the elements involved in the process. An execution-oriented integrated enterprise modeling system is proposed in combination with other enterprise views. The enterprise model can be established and executed dynamically in the actual environment due to the dynamic properties of the workflow model.

  10. Bayesian Network Based XP Process Modelling

    Directory of Open Access Journals (Sweden)

    Mohamed Abouelela

    2010-07-01

    Full Text Available A Bayesian Network based mathematical model has been used for modelling the Extreme Programming software development process. The model is capable of predicting the expected finish time and the expected defect rate for each XP release. Therefore, it can be used to determine the success/failure of any XP project. The model takes into account the effect of three XP practices, namely: Pair Programming, Test Driven Development and Onsite Customer practices. The model’s predictions were validated against two case studies. Results show the precision of our model especially in predicting the project finish time.

  11. Econophysics of agent-based models

    CERN Document Server

    Aoyama, Hideaki; Chakrabarti, Bikas; Chakraborti, Anirban; Ghosh, Asim

    2014-01-01

    The primary goal of this book is to present the research findings and conclusions of physicists, economists, mathematicians and financial engineers working in the field of "Econophysics" who have undertaken agent-based modelling, comparison with empirical studies and related investigations. Most standard economic models assume the existence of the representative agent, who is “perfectly rational” and applies the utility maximization principle when taking action. One reason for this is the desire to keep models mathematically tractable: no tools are available to economists for solving non-linear models of heterogeneous adaptive agents without explicit optimization. In contrast, multi-agent models, which originated from statistical physics considerations, allow us to go beyond the prototype theories of traditional economics involving the representative agent. This book is based on the Econophys-Kolkata VII Workshop, at which many such modelling efforts were presented. In the book, leading researchers in the...

  12. Evaluating face trustworthiness: a model based approach

    OpenAIRE

    Todorov, Alexander; Baron, Sean G.; Oosterhof, Nikolaas N.

    2008-01-01

    Judgments of trustworthiness from faces determine basic approach/avoidance responses and approximate the valence evaluation of faces that runs across multiple person judgments. Here, based on trustworthiness judgments and using a computer model for face representation, we built a model for representing face trustworthiness (study 1). Using this model, we generated novel faces with an increased range of trustworthiness and used these faces as stimuli in a functional Magnetic Resonance Imaging ...

  13. Agent-Based Modeling in Systems Pharmacology.

    Science.gov (United States)

    Cosgrove, J; Butler, J; Alden, K; Read, M; Kumar, V; Cucurull-Sanchez, L; Timmis, J; Coles, M

    2015-11-01

    Modeling and simulation (M&S) techniques provide a platform for knowledge integration and hypothesis testing to gain insights into biological systems that would not be possible a priori. Agent-based modeling (ABM) is an M&S technique that focuses on describing individual components rather than homogenous populations. This tutorial introduces ABM to systems pharmacologists, using relevant case studies to highlight how ABM-specific strengths have yielded success in the area of preclinical mechanistic modeling. PMID:26783498

  14. Agent-based Models of Financial Markets

    OpenAIRE

    Samanidou, E.; E. Zschischang; Stauffer, D.; Lux, T.

    2007-01-01

    This review deals with several microscopic (``agent-based'') models of financial markets which have been studied by economists and physicists over the last decade: Kim-Markowitz, Levy-Levy-Solomon, Cont-Bouchaud, Solomon-Weisbuch, Lux-Marchesi, Donangelo-Sneppen and Solomon-Levy-Huang. After an overview of simulation approaches in financial economics, we first give a summary of the Donangelo-Sneppen model of monetary exchange and compare it with related models in economics literature. Our sel...

  15. Literature Survey on Model based Slicing

    OpenAIRE

    Sneh Krishna*,; Alekh Dwivedi

    2014-01-01

    Software testing is an activity which aims at evaluating a feature or capability of a system and determining whether it meets its required expectations. One way to ease this is the program slicing technique, which breaks large programs down into smaller ones; another is model based slicing, which breaks the large software architecture model down into smaller models at an early stage of the SDLC (Software Development Life Cycle). This is a novel methodology to extract the sub mode...

  16. PARTICIPATION BASED MODEL OF SHIP CREW MANAGEMENT

    OpenAIRE

    Bielić, Toni; Ivanišević, Dalibor; Gundić, Ana

    2014-01-01

    This paper analyses the participation-based model on board ship as a possibly optimal leadership model in the shipping industry, with an accent on the decision-making process. In the paper the authors have tried to define the master’s behaviour model and management style, identifying the drawbacks and disadvantages of a vertical, pyramidal organization with the master at the top. The paper describes the efficiency of decision making within a team organization and the optimization of a ship’s organisation by introduci...

  17. Model-based Abstraction of Data Provenance

    OpenAIRE

    Probst, Christian W.; Hansen, René Rydhof

    2014-01-01

    Identifying provenance of data provides insights to the origin of data and intermediate results, and has recently gained increased interest due to data-centric applications. In this work we extend a data-centric system view with actors handling the data and policies restricting actions. This extension is based on provenance analysis performed on system models. System models have been introduced to model and analyse spatial and organisational aspects of organisations, to identify, e.g., potent...

  18. A probabilistic graphical model based stochastic input model construction

    International Nuclear Information System (INIS)

    Model reduction techniques have been widely used in modeling of high-dimensional stochastic input in uncertainty quantification tasks. However, the probabilistic modeling of random variables projected into reduced-order spaces presents a number of computational challenges. Due to the curse of dimensionality, the underlying dependence relationships between these random variables are difficult to capture. In this work, a probabilistic graphical model based approach is employed to learn the dependence by running a number of conditional independence tests using observation data. Thus a probabilistic model of the joint PDF is obtained and the PDF is factorized into a set of conditional distributions based on the dependence structure of the variables. The estimation of the joint PDF from data is then transformed to estimating conditional distributions under reduced dimensions. To improve the computational efficiency, a polynomial chaos expansion is further applied to represent the random field in terms of a set of standard random variables. This technique is combined with both linear and nonlinear model reduction methods. Numerical examples are presented to demonstrate the accuracy and efficiency of the probabilistic graphical model based stochastic input models. - Highlights: • Data-driven stochastic input models without the assumption of independence of the reduced random variables. • The problem is transformed to a Bayesian network structure learning problem. • Examples are given in flows in random media

  19. Mineral resources estimation based on block modeling

    Science.gov (United States)

    Bargawa, Waterman Sulistyana; Amri, Nur Ali

    2016-02-01

    The estimation in this paper uses three kinds of block models: nearest neighbor polygon, inverse distance squared and ordinary kriging. The techniques are weighting schemes based on the principle that block content is a linear combination of the grade data of the samples around the block being estimated. The case study is in the Pongkor area, where gold-silver resources, allegedly shaped as quartz veins by a hydrothermal process of the epithermal type, are modeled. Resource modeling includes data entry, statistical and variography analysis of the topography and geological model, block model construction, estimation parameters, model presentation and tabulation of mineral resources. The skewed distribution is handled here by a robust semivariogram. The mineral resource classification generated in this model is based on an analysis of the kriging standard deviation and the number of samples used in the estimation of each block. The research results are used to evaluate the performance of the OK and IDS estimators. Based on the visual and statistical analysis, it is concluded that the OK model gives estimates closer to the data used for modeling.
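
    A hedged sketch of the inverse-distance-squared (IDS) weighting scheme mentioned above: each block grade is a weighted linear combination of nearby sample grades. The coordinates, grades, search radius and power are synthetic stand-ins, not the Pongkor data.

```python
# Hedged sketch of inverse-distance-squared block estimation.
import numpy as np

rng = np.random.default_rng(5)
samples_xy = rng.uniform(0, 100, size=(50, 2))        # sample locations (m)
grades = rng.lognormal(mean=0.0, sigma=0.8, size=50)  # skewed grades (g/t)

def ids_estimate(block_center, power=2.0, max_radius=30.0):
    d = np.linalg.norm(samples_xy - block_center, axis=1)
    near = d < max_radius                             # search neighbourhood
    if not near.any():
        return np.nan                                 # unclassified block
    w = 1.0 / np.maximum(d[near], 1e-6) ** power      # inverse-distance weights
    return np.sum(w * grades[near]) / np.sum(w)       # weighted mean grade

print("block estimate:", ids_estimate(np.array([50.0, 50.0])))
```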

  20. Modelling a Peroxidase-based Optical Biosensor

    Science.gov (United States)

    Baronas, Romas; Gaidamauskaite, Evelina; Kulys, Juozas

    2007-01-01

    The response of a peroxidase-based optical biosensor was modelled digitally. A mathematical model of the optical biosensor is based on a system of non-linear reaction-diffusion equations. The modelling biosensor comprises two compartments, an enzyme layer and an outer diffusion layer. The digital simulation was carried out using finite difference technique. The influence of the substrate concentration as well as of the thickness of both the enzyme and diffusion layers on the biosensor response was investigated. Calculations showed complex kinetics of the biosensor response, especially at low concentrations of the peroxidase and of the hydrogen peroxide.
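
    In the spirit of the two-compartment reaction-diffusion model described above, the hedged sketch below integrates one substrate through an enzyme layer (with Michaelis-Menten consumption) and an outer diffusion layer by explicit finite differences; every parameter value is an illustrative assumption, not taken from the paper.

```python
# Hedged 1-D finite-difference sketch of a two-layer biosensor model.
import numpy as np

n, L = 100, 1e-4                     # grid points, total thickness (m)
dx = L / (n - 1)
x = np.linspace(0, L, n)
enzyme = x < 0.5 * L                 # inner half is the enzyme layer
D = np.where(enzyme, 3e-10, 6e-10)   # diffusion coefficients (m^2/s)
Vmax, Km, S_bulk = 1e-3, 1e-2, 1.0   # kinetics and bulk concentration
dt = 0.4 * dx**2 / D.max()           # explicit-scheme stability limit

S = np.zeros(n)
for _ in range(20000):
    lap = (np.roll(S, -1) - 2 * S + np.roll(S, 1)) / dx**2
    react = np.where(enzyme, Vmax * S / (Km + S), 0.0)   # Michaelis-Menten
    S = S + dt * (D * lap - react)
    S[0] = 0.0                       # detector boundary acts as a sink
    S[-1] = S_bulk                   # outer boundary at bulk concentration
print("substrate flux at detector:", D[0] * (S[1] - S[0]) / dx)
```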

  1. Information modelling and knowledge bases XXV

    CERN Document Server

    Tokuda, T; Jaakkola, H

    2014-01-01

    Because of our ever increasing use of and reliance on technology and information systems, information modelling and knowledge bases continue to be important topics in those academic communities concerned with data handling and computer science. As the information itself becomes more complex, so do the levels of abstraction and the databases themselves. This book is part of the series Information Modelling and Knowledge Bases, which concentrates on a variety of themes in the important domains of conceptual modeling, design and specification of information systems, multimedia information modelin

  2. Neighborhood Mixture Model for Knowledge Base Completion

    OpenAIRE

    Nguyen, Dat Quoc; Sirts, Kairit; Qu, Lizhen; Johnson, Mark

    2016-01-01

    Knowledge bases are useful resources for many natural language processing tasks, however, they are far from complete. In this paper, we define a novel entity representation as a mixture of its neighborhood in the knowledge base and apply this technique on TransE-a well-known embedding model for knowledge base completion. Experimental results show that the neighborhood information significantly helps to improve the results of the TransE, leading to better performance than obtained by other sta...

  3. Multiagent-Based Model For ESCM

    OpenAIRE

    Delia MARINCAS

    2011-01-01

    Web based applications for Supply Chain Management (SCM) are now a necessity for every company in order to meet increasing customer demands, face global competition and make a profit. A multiagent-based approach is appropriate for eSCM because it shows many of the characteristics an SCM system should have. For this reason, we have proposed a multiagent-based eSCM model which configures a virtual SC and automates the SC activities: selling, purchasing, manufacturing, planning, inventory,...

  4. Modelling carbon nanotubes-based mediatorless biosensor.

    Science.gov (United States)

    Baronas, Romas; Kulys, Juozas; Petrauskas, Karolis; Razumiene, Julija

    2012-01-01

    This paper presents a mathematical model of carbon nanotubes-based mediatorless biosensor. The developed model is based on nonlinear non-stationary reaction-diffusion equations. The model involves four layers (compartments): a layer of enzyme solution entrapped on a terylene membrane, a layer of the single walled carbon nanotubes deposited on a perforated membrane, and an outer diffusion layer. The biosensor response and sensitivity are investigated by changing the model parameters with a special emphasis on the mediatorless transfer of the electrons in the layer of the enzyme-loaded carbon nanotubes. The numerical simulation at transient and steady state conditions was carried out using the finite difference technique. The mathematical model and the numerical solution were validated by experimental data. The obtained agreement between the simulation results and the experimental data was admissible at different concentrations of the substrate. PMID:23012537

  5. Modelling Carbon Nanotubes-Based Mediatorless Biosensor

    Directory of Open Access Journals (Sweden)

    Julija Razumiene

    2012-07-01

    Full Text Available This paper presents a mathematical model of carbon nanotubes-based mediatorless biosensor. The developed model is based on nonlinear non-stationary reaction-diffusion equations. The model involves four layers (compartments): a layer of enzyme solution entrapped on a terylene membrane, a layer of the single walled carbon nanotubes deposited on a perforated membrane, and an outer diffusion layer. The biosensor response and sensitivity are investigated by changing the model parameters with a special emphasis on the mediatorless transfer of the electrons in the layer of the enzyme-loaded carbon nanotubes. The numerical simulation at transient and steady state conditions was carried out using the finite difference technique. The mathematical model and the numerical solution were validated by experimental data. The obtained agreement between the simulation results and the experimental data was admissible at different concentrations of the substrate.

  6. A stepwise-cluster microbial biomass inference model in food waste composting

    International Nuclear Information System (INIS)

    A stepwise-cluster microbial biomass inference (SMI) model was developed by introducing stepwise-cluster analysis (SCA) into composting process modeling to tackle the nonlinear relationships among state variables and microbial activities. The essence of SCA is to form a classification tree through a series of cutting or mergence steps according to given statistical criteria. Eight runs of designed experiments in bench-scale reactors in a laboratory were conducted to demonstrate the feasibility of the proposed method. The results indicated that SMI could help establish a statistical relationship between state variables and composting microbial characteristics, where discrete and nonlinear complexities exist. Significance levels of cutting/merging were provided such that the accuracies of the developed forecasting trees were controllable. Through an attempted definition of input effects on the output in SMI, the effects of the state variables on thermophilic bacteria were ranked in descending order as: Time (day) > moisture content (%) > ash content (%, dry) > Lower Temperature (°C) > pH > NH4+-N (mg/kg, dry) > Total N (%, dry) > Total C (%, dry); the effects on mesophilic bacteria were ordered as: Time > Upper Temperature (°C) > Total N > moisture content > NH4+-N > Total C > pH. This study made the first attempt at applying SCA to map the nonlinear and discrete relationships in composting processes.
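
    SCA itself is a specialized procedure; as a loose, hedged analogue of mapping state variables to microbial activity with a tree, the sketch below fits a regression tree to synthetic composting data and reads off variable importances. None of the data or rankings come from the study.

```python
# Hedged analogue: a regression tree over composting state variables.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(6)
n = 300
time_d = rng.uniform(0, 30, n)        # composting time (day)
moisture = rng.uniform(40, 70, n)     # moisture content (%)
temp = rng.uniform(20, 65, n)         # temperature (°C)
X = np.column_stack([time_d, moisture, temp])
# Nonlinear, threshold-style response, as described for composting systems.
y = 1e6 * np.exp(-0.5 * ((temp - 55) / 6) ** 2) * (time_d > 5) \
    + 1e4 * rng.normal(1, 0.05, n)

tree = DecisionTreeRegressor(max_depth=4).fit(X, y)
print(dict(zip(["time", "moisture", "temperature"],
               tree.feature_importances_)))
```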

  7. MEGen: A Physiologically Based Pharmacokinetic Model Generator

    Directory of Open Access Journals (Sweden)

    George D. Loizou

    2011-11-01

    Full Text Available Physiologically based pharmacokinetic models are being used in an increasing number of different areas. These not only include the human safety assessment of pharmaceuticals, pesticides, biocides and environmental chemicals but also for food animal, wild mammal and avian risk assessment. The value of PBPK models is that they are tools for estimating tissue dosimetry by integrating in vitro and in vivo mechanistic, pharmacokinetic and toxicological information through their explicit mathematical description of important anatomical, physiological and biochemical determinants of chemical uptake, disposition and elimination. However, PBPK models are perceived as complex, data hungry, resource intensive and time consuming. In addition, model validation and verification are hindered by the relative complexity of the equations. To begin to address these issues a freely available web application for the rapid construction and documentation of bespoke PBPK models is under development. Here we present an overview of the current capabilities of MEGen, a model equation generator and parameter database and discuss future developments.
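
    As a hedged, minimal illustration of what a PBPK model is mathematically (not MEGen output), the sketch below writes two perfusion-limited tissue compartments plus blood as mass-balance ODEs; the structure and all parameter values are illustrative assumptions.

```python
# Minimal hedged PBPK-style sketch: flow-limited compartments as ODEs.
import numpy as np
from scipy.integrate import solve_ivp

Q = {"liver": 90.0, "rest": 210.0}    # blood flows (L/h)
V = {"liver": 1.8, "rest": 35.0}      # tissue volumes (L)
P = {"liver": 2.0, "rest": 1.5}       # tissue:blood partition coefficients
V_blood, CL_h = 5.0, 30.0             # blood volume (L), hepatic clearance (L/h)

def pbpk(t, y):
    c_liver, c_rest, c_blood = y
    dl = (Q["liver"] * (c_blood - c_liver / P["liver"])
          - CL_h * c_liver / P["liver"]) / V["liver"]
    dr = Q["rest"] * (c_blood - c_rest / P["rest"]) / V["rest"]
    db = (Q["liver"] * c_liver / P["liver"] + Q["rest"] * c_rest / P["rest"]
          - (Q["liver"] + Q["rest"]) * c_blood) / V_blood
    return [dl, dr, db]

sol = solve_ivp(pbpk, (0, 24), y0=[0.0, 0.0, 10.0])   # IV bolus into blood
print("blood concentration at 24 h:", sol.y[2, -1])
```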

  8. A model evaluation checklist for process-based environmental models

    Science.gov (United States)

    Jackson-Blake, Leah

    2015-04-01

    the conceptual model on which it is based. In this study, a number of model structural shortcomings were identified, such as a lack of dissolved phosphorus transport via infiltration excess overland flow, potential discrepancies in the particulate phosphorus simulation and a lack of spatial granularity. (4) Conceptual challenges, as conceptual models on which predictive models are built are often outdated, having not kept up with new insights from monitoring and experiments. For example, soil solution dissolved phosphorus concentration in INCA-P is determined by the Freundlich adsorption isotherm, which could potentially be replaced using more recently-developed adsorption models that take additional soil properties into account. This checklist could be used to assist in identifying why model performance may be poor or unreliable. By providing a model evaluation framework, it could help prioritise which areas should be targeted to improve model performance or model credibility, whether that be through using alternative calibration techniques and statistics, improved data collection, improving or simplifying the model structure or updating the model to better represent current understanding of catchment processes.

  9. Spatial interactions in agent-based modeling

    CERN Document Server

    Ausloos, Marcel; Merlone, Ugo

    2014-01-01

    Agent Based Modeling (ABM) has become a widespread approach to model complex interactions. In this chapter after briefly summarizing some features of ABM the different approaches in modeling spatial interactions are discussed. It is stressed that agents can interact either indirectly through a shared environment and/or directly with each other. In such an approach, higher-order variables such as commodity prices, population dynamics or even institutions, are not exogenously specified but instead are seen as the results of interactions. It is highlighted in the chapter that the understanding of patterns emerging from such spatial interaction between agents is a key problem as much as their description through analytical or simulation means. The chapter reviews different approaches for modeling agents' behavior, taking into account either explicit spatial (lattice based) structures or networks. Some emphasis is placed on recent ABM as applied to the description of the dynamics of the geographical distribution o...

  10. Model based Software Development: Issues & Challenges

    CERN Document Server

    Basha, N Md Jubair; Rizwanullah, Mohammed

    2012-01-01

    One of the goals of software design is to model a system in such a way that it is easily understandable. Nowadays the tendency in software development is changing from manual coding to automatic code generation; it is becoming model-based. This is a response to the software crisis, in which the cost of hardware has decreased and, conversely, the cost of software development has increased sharply. The methodologies that allowed this change are model-based, thus relieving the human from detailed coding. There is still a long way to go to achieve this goal, but work is being done worldwide to achieve this objective. This paper presents the drastic changes related to modeling and the important challenges and techniques that recur in MBSD.

  11. Graphical model construction based on evolutionary algorithms

    Institute of Scientific and Technical Information of China (English)

    Youlong YANG; Yan WU; Sanyang LIU

    2006-01-01

    Using Bayesian networks to model promising solutions from the current population of an evolutionary algorithm can ensure an efficient and intelligent search for the optimum. However, constructing a Bayesian network that fits a given dataset is an NP-hard problem that also consumes massive computational resources. This paper develops a methodology for constructing a graphical model based on the Bayesian Dirichlet metric. Our approach is derived from a set of propositions and theorems obtained by researching the local metric relationship of networks matching the dataset. This paper presents an algorithm to construct a tree model from a set of potential solutions using the above approach. This method is important not only for evolutionary algorithms based on graphical models, but also for machine learning and data mining. The experimental results show that the exact theoretical results and the approximations match very well.

  12. PV panel model based on datasheet values

    DEFF Research Database (Denmark)

    Sera, Dezso; Teodorescu, Remus; Rodriguez, Pedro

    This work presents the construction of a model for a PV panel using the single-diode five-parameter model, based exclusively on data-sheet parameters. The model takes into account the series and parallel (shunt) resistance of the panel. The equivalent circuit and the basic equations of the PV cell....../panel in Standard Test Conditions (STC) are shown, as well as the parameters extraction from the data-sheet values. The temperature dependence of the cell dark saturation current is expressed with an alternative formula, which gives better correlation with the datasheet values of the power temperature...... dependence. Based on these equations, a PV panel model, which is able to predict the panel behavior in different temperature and irradiance conditions, is built and tested....
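
    A hedged sketch of the single-diode five-parameter model: with assumed parameter values standing in for the ones the paper extracts from datasheet numbers, the implicit I-V equation is solved numerically for the current at each voltage.

```python
# Hedged single-diode five-parameter PV panel sketch (assumed parameters).
import numpy as np
from scipy.optimize import brentq

Iph, I0, n_cells = 5.0, 4e-7, 72   # photocurrent (A), dark saturation (A), cells
Rs, Rsh, a = 0.4, 300.0, 1.3       # series/shunt resistance (ohm), ideality
Vt = n_cells * a * 0.0259          # thermal voltage of the whole panel (V)

def current(V):
    """Solve I = Iph - I0*(exp((V+I*Rs)/Vt)-1) - (V+I*Rs)/Rsh for I."""
    f = lambda I: Iph - I0 * np.expm1((V + I * Rs) / Vt) \
                  - (V + I * Rs) / Rsh - I
    return brentq(f, -2.0, Iph + 1.0)      # bracketed root of the implicit law

for V in np.linspace(0.0, 35.0, 8):        # sweep below open-circuit voltage
    print(f"V = {V:5.1f} V  I = {current(V):6.3f} A")
```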

  13. A contextual modeling approach for model-based recommender systems

    OpenAIRE

    Fernández-Tobías, Ignacio; Campos Soto, Pedro G.; Cantador, Iván; Díez, Fernando

    2013-01-01

    The final publication is available at Springer via http://dx.doi.org/10.1007/978-3-642-40643-0_5 Proceedings of 15th Conference of the Spanish Association for Artificial Intelligence, CAEPIA 2013, Madrid, Spain, September 17-20, 2013. In this paper we present a contextual modeling approach for model-based recommender systems that integrates and exploits both user preferences and contextual signals in a common vector space. Differently to previous work, we conduct a user study acquiring ...

  14. Efficient Textural Model-Based Mammogram Enhancement

    Czech Academy of Sciences Publication Activity Database

    Haindl, Michal; Remeš, Václav

    Piscataway: IEEE, 2013, s. 522-523. ISBN 978-1-4799-1053-3. [2013 IEEE 26th International Symposium on Computer-Based Medical Systems (CBMS). Porto (PT), 20.06.2013-22.06.2013] Institutional support: RVO:67985556 Keywords : mammogram enhancement * autoregressive texture model * breast tissue modeling Subject RIV: BD - Theory of Information http://library.utia.cas.cz/separaty/2013/RO/haindl-0397607.pdf

  15. Application software development via model based design

    OpenAIRE

    Haapala, O. (Olli)

    2015-01-01

    This thesis was set to study the utilization of the MathWorks’ Simulink® program in model based application software development and its compatibility with the Vacon 100 inverter. The target was to identify all the problems related to everyday usage of this method and create a white paper of how to execute a model based design to create a Vacon 100 compatible system software. Before this thesis was started, there was very little knowledge of the compatibility of this method. However durin...

  16. Physically based modeling and animation of tornado

    Institute of Scientific and Technical Information of China (English)

    LIU Shi-guang; WANG Zhang-ye; GONG Zheng; CHEN Fei-fei; PENG Qun-sheng

    2006-01-01

    Realistic modeling and rendering of dynamic tornado scenes is recognized as a challenging task for researchers in computer graphics. In this paper a new physically based method for simulating and animating tornado scenes is presented. We first propose a Two-Fluid model based on the physical theory of tornadoes, then we simulate the flow of the tornado and its interaction with surrounding objects such as debris. Taking the scattering and absorption of light by the participating media into account, the illumination effects of the tornado scene can be generated realistically. With the support of graphics hardware, various kinds of dynamic tornado scenes can be rendered at interactive rates.

  17. Model-based testing for embedded systems

    CERN Document Server

    Zander, Justyna; Mosterman, Pieter J

    2011-01-01

    What the experts have to say about Model-Based Testing for Embedded Systems: "This book is exactly what is needed at the exact right time in this fast-growing area. From its beginnings over 10 years ago of deriving tests from UML statecharts, model-based testing has matured into a topic with both breadth and depth. Testing embedded systems is a natural application of MBT, and this book hits the nail exactly on the head. Numerous topics are presented clearly, thoroughly, and concisely in this cutting-edge book. The authors are world-class leading experts in this area and teach us well-used

  18. A multivalued knowledge-base model

    CERN Document Server

    Achs, Agnes

    2010-01-01

    The basic aim of our study is to give a possible model for handling uncertain information. This model is worked out in the framework of DATALOG. First the concept of fuzzy Datalog is summarized, then its extensions to intuitionistic- and interval-valued fuzzy logic are given, and the concept of bipolar fuzzy Datalog is introduced. Based on these ideas the concept of a multivalued knowledge-base is defined as a quadruple of background knowledge, a deduction mechanism, a connecting algorithm, and a function set of the program, which together help us determine the uncertainty levels of the results. Finally a possible evaluation strategy is given.

  19. Physiologically based synthetic models of hepatic disposition.

    Science.gov (United States)

    Hunt, C Anthony; Ropella, Glen E P; Yan, Li; Hung, Daniel Y; Roberts, Michael S

    2006-12-01

    Current physiologically based pharmacokinetic (PBPK) models are inductive. We present an additional, different approach that is based on the synthetic rather than the inductive approach to modeling and simulation. It relies on object-oriented programming. A model of the referent system in its experimental context is synthesized by assembling objects that represent components such as molecules, cells, aspects of tissue architecture, catheters, etc. The single pass perfused rat liver has been well described in evaluating hepatic drug pharmacokinetics (PK) and is the system on which we focus. In silico experiments begin with administration of objects representing actual compounds. Data are collected in a manner analogous to that in the referent PK experiments. The synthetic modeling method allows for recognition and representation of discrete event and discrete time processes, as well as heterogeneity in organization, function, and spatial effects. An application is developed for sucrose and antipyrine, administered separately and together. PBPK modeling has made extensive progress in characterizing abstracted PK properties but this has also been its limitation. Now, other important questions and possible extensions emerge. How are these PK properties and the observed behaviors generated? The inherent heuristic limitations of traditional models have hindered getting meaningful, detailed answers to such questions. Synthetic models of the type described here are specifically intended to help answer such questions. Analogous to wet-lab experimental models, they retain their applicability even when broken apart into sub-components. Having and applying this new class of models along with traditional PK modeling methods is expected to increase the productivity of pharmaceutical research at all levels that make use of modeling and simulation. PMID:17051440

  20. A subgrid based approach for morphodynamic modelling

    Science.gov (United States)

    Volp, N. D.; van Prooijen, B. C.; Pietrzak, J. D.; Stelling, G. S.

    2016-07-01

    To improve the accuracy and the efficiency of morphodynamic simulations, we present a subgrid based approach for a morphodynamic model. This approach is well suited for areas characterized by sub-critical flow, such as estuaries, coastal areas and lowland rivers. This new method uses a different grid resolution to compute the hydrodynamics and the morphodynamics. The hydrodynamic computations are carried out with a subgrid based, two-dimensional, depth-averaged model. This model uses a coarse computational grid in combination with a subgrid. The subgrid contains high resolution bathymetry and roughness information to compute volumes, friction and advection. The morphodynamic computations are carried out entirely on a high resolution grid, the bed grid. It is key to find a link between the information defined on the different grids in order to guarantee the feedback between the hydrodynamics and the morphodynamics. This link is made by using a new physics-based interpolation method. The method interpolates water levels and velocities from the coarse grid to the high resolution bed grid. The morphodynamic solution improves significantly when using the subgrid based method compared to a full coarse grid approach. The Exner equation is discretised with an upwind method based on the direction of the bed celerity. This ensures a stable solution for the Exner equation. By means of three examples, it is shown that the subgrid based approach offers a significant improvement at a minimal computational cost.
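
    A rough illustration of the upwind idea mentioned above (not the paper's actual scheme) is sketched below: one explicit step of the 1D Exner equation, with the differencing direction chosen from the sign of a locally estimated bed celerity. The grid, flux array and celerity estimate are all assumptions made for the example.

    ```python
    import numpy as np

    def exner_upwind_step(z, q_s, dx, dt, porosity=0.4):
        """One explicit step of the 1D Exner equation
        (1 - p) dz/dt + dq_s/dx = 0, with the upwind direction
        chosen by the sign of the local bed celerity ~ dq_s/dz."""
        dzdt = np.zeros_like(z)
        # Crude, purely illustrative celerity estimate from finite differences.
        celerity = np.gradient(q_s) / (np.gradient(z) + 1e-12)
        for i in range(1, len(z) - 1):
            if celerity[i] >= 0.0:    # bed wave moves downstream: backward difference
                dqdx = (q_s[i] - q_s[i - 1]) / dx
            else:                     # bed wave moves upstream: forward difference
                dqdx = (q_s[i + 1] - q_s[i]) / dx
            dzdt[i] = -dqdx / (1.0 - porosity)
        return z + dt * dzdt
    ```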

  1. Designing Network-based Business Model Ontology

    DEFF Research Database (Denmark)

    Hashemi Nekoo, Ali Reza; Ashourizadeh, Shayegheh; Zarei, Behrouz

    2015-01-01

    is going to propose e-business model ontology from the network point of view and its application in the real world. The suggested ontology for network-based businesses is composed of individuals' characteristics and what kind of resources they own. Also, their connections and pre-conceptions of connections...... such as shared-mental model and trust. However, it mostly covers previous business model elements. To confirm the applicability of this ontology, it has been implemented in a business angel network and showed how it works....

  2. Physiologically based pharmacokinetic model for acetone.

    OpenAIRE

    Kumagai, S.; Matsunaga, I

    1995-01-01

    OBJECTIVE--This study aimed to develop a physiologically based pharmacokinetic model for acetone and to predict the kinetic behaviour of acetone in the human body with that model. METHODS--The model consists of eight tissue groups in which acetone can be distributed: the mucous layer of the inhaled air tract, the mucous layer of the exhaled air tract, a compartment for gas exchange (alveolus of the lung), a group of blood vessel rich tissues including the brain and heart, a group of tissues i...
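
    The compartment structure described above maps naturally onto flow-limited mass-balance ODEs. As a hedged illustration (not the paper's calibrated acetone model), the sketch below integrates a single generic tissue compartment; the flow, volume, partition coefficient and arterial concentration profile are invented.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    def tissue_ode(t, c, Q, V, P, c_arterial):
        # Standard flow-limited PBPK form: dC/dt = (Q/V) * (C_art - C/P),
        # with Q = blood flow, V = tissue volume, P = tissue:blood partition.
        return Q / V * (c_arterial(t) - c / P)

    c_art = lambda t: 1.0 * np.exp(-0.05 * t)   # illustrative arterial input
    sol = solve_ivp(tissue_ode, (0.0, 120.0), [0.0],
                    args=(1.2, 5.0, 0.8, c_art), max_step=1.0)
    print(sol.y[0, -1])   # tissue concentration at t = 120 min
    ```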

  3. Analysis of landslide hazard area in Ludian earthquake based on Random Forests

    Science.gov (United States)

    Xie, J.-C.; Liu, R.; Li, H.-W.; Lai, Z.-L.

    2015-04-01

    With the development of machine learning theory, more and more algorithms are being evaluated for seismic landslide assessment. After the Ludian earthquake, the research team combined the special geological structure of the Ludian area with the results of seismic field exploration, selecting slope (PODU), river distance (HL), fault distance (DC), seismic intensity (LD), the digital elevation model (DEM), and the normalized difference vegetation index (NDVI) derived from remote sensing images as evaluation factors. Because the relationships among these factors are fuzzy and the data are noisy and high-dimensional, we introduce the random forest algorithm to tolerate these difficulties and obtain an evaluation of the Ludian landslide areas. To verify the accuracy of the result, ROC graphs were used as the evaluation standard: the AUC reaches 0.918, while the random forest's generalization error rate, estimated with Out-Of-Bag (OOB) samples, decreases with the number of classification trees to an acceptable 0.08. Studying the final landslide inversion results, the paper reaches the statistical conclusion that nearly 80% of all landslides and collapses lie in areas of high or moderate susceptibility, showing that the forecast results are reasonable.
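
    For readers who want the workflow in code, the sketch below trains a random forest on a synthetic stand-in for the six evaluation factors and reports the Out-Of-Bag estimate and ROC AUC cited above; the data and hyperparameters are placeholders, not the study's dataset.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import roc_auc_score

    # Placeholder feature matrix standing in for slope (PODU), river distance (HL),
    # fault distance (DC), seismic intensity (LD), DEM elevation and NDVI.
    rng = np.random.default_rng(0)
    X = rng.random((1000, 6))
    y = (X[:, 0] + X[:, 3] + 0.3 * rng.standard_normal(1000) > 1.0).astype(int)

    # oob_score=True yields the Out-Of-Bag generalization estimate the abstract cites.
    rf = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0)
    rf.fit(X, y)
    print("OOB accuracy:", rf.oob_score_)
    print("AUC:", roc_auc_score(y, rf.predict_proba(X)[:, 1]))
    ```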

  4. Evolutionary modeling-based approach for model errors correction

    Directory of Open Access Journals (Sweden)

    S. Q. Wan

    2012-08-01

    Full Text Available The inverse problem of using the information in historical data to estimate model errors is a frontier research topic in science. In this study, we investigate such a problem using the classic Lorenz (1963) equations as the prediction model and the Lorenz equations with an added periodic evolutionary function as an accurate representation of reality to generate "observational data."

    On the basis of the intelligent features of evolutionary modeling (EM), including self-organization, self-adaptation and self-learning, the dynamic information contained in the historical data can be identified and extracted automatically by computer. Thereby, a new approach to estimating model errors based on EM is proposed in the present paper. Numerical tests demonstrate the ability of the new approach to correct model structural errors; in fact, it realizes a combination of statistics and dynamics to a certain extent.
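
    The experimental setup is easy to reproduce: the classic Lorenz (1963) equations serve as the prediction model, and the same equations with an added periodic term stand in for reality. The sketch below generates such model-error "observations"; the forcing amplitude and frequency are illustrative assumptions, and the EM step itself is not shown.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0, eps=0.0, omega=0.0):
        x, y, z = s
        # eps * sin(omega * t) is the periodic "evolutionary" term that the
        # prediction model (eps = 0) lacks.
        return [sigma * (y - x), x * (rho - z) - y + eps * np.sin(omega * t),
                x * y - beta * z]

    truth = solve_ivp(lorenz, (0.0, 20.0), [1.0, 1.0, 1.0],
                      args=(10.0, 28.0, 8.0 / 3.0, 2.0, 0.5), dense_output=True)
    model = solve_ivp(lorenz, (0.0, 20.0), [1.0, 1.0, 1.0], dense_output=True)
    t = np.linspace(0.0, 20.0, 400)
    error = truth.sol(t) - model.sol(t)   # "observed" model error to be learned
    ```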

  5. Knowledge base system visualization reasoning model based on ICON

    Science.gov (United States)

    Chen, Deyun; Pei, Shujun; Quan, Zhiying

    2005-03-01

    Knowledge-base systems are among the most promising branches of artificial intelligence for practical application. However, the reasoning process of such a system is invisible, not visual, and users cannot intervene in it, so for users the system is only a black box. This leads many users to view the conclusions drawn by the system with suspicion; even when the system has an explanation function, it does not go far enough. If graph or image techniques are adopted to display the reasoning procedure interactively and dynamically, making the procedure visual, users can intervene in the reasoning, which can greatly reduce users' misgivings, and at the same time it provides a method for integrity checking of the knowledge in the knowledge base. Therefore, reasoning visualization of a knowledge-base system has a deeper meaning than general visualization. In this paper, the visualization problem of the reasoning process for a knowledge-base system is addressed on the basis of a formalized analysis of the ICON system, icon operations, and the syntax and semantics of statements; a reasoning model of the knowledge-base system with visual characteristics is established, and the model is used for integrity checking in a practical expert system and knowledge base, with good results.

  6. Haptics-based dynamic implicit solid modeling.

    Science.gov (United States)

    Hua, Jing; Qin, Hong

    2004-01-01

    This paper systematically presents a novel, interactive solid modeling framework, Haptics-based Dynamic Implicit Solid Modeling, which is founded upon volumetric implicit functions and powerful physics-based modeling. In particular, we augment our modeling framework with a haptic mechanism in order to take advantage of additional realism associated with a 3D haptic interface. Our dynamic implicit solids are semi-algebraic sets of volumetric implicit functions and are governed by the principles of dynamics, hence responding to sculpting forces in a natural and predictable manner. In order to directly manipulate existing volumetric data sets as well as point clouds, we develop a hierarchical fitting algorithm to reconstruct and represent discrete data sets using our continuous implicit functions, which permit users to further design and edit those existing 3D models in real-time using a large variety of haptic and geometric toolkits, and visualize their interactive deformation at arbitrary resolution. The additional geometric and physical constraints afford more sophisticated control of the dynamic implicit solids. The versatility of our dynamic implicit modeling enables the user to easily modify both the geometry and the topology of modeled objects, while the inherent physical properties can offer an intuitive haptic interface for direct manipulation with force feedback. PMID:15794139

  7. Model-Based Power Plant Master Control

    Energy Technology Data Exchange (ETDEWEB)

    Boman, Katarina; Thomas, Jean; Funkquist, Jonas

    2010-08-15

    The main goal of the project has been to evaluate the potential of a coordinated master control for a solid fuel power plant in terms of tracking capability, stability and robustness. The control strategy has been model-based predictive control (MPC) and the plant used in the case study has been the Vattenfall power plant Idbaecken in Nykoeping. A dynamic plant model based on nonlinear physical models was used to imitate the true plant in MATLAB/SIMULINK simulations. The basis for this model was already developed in previous Vattenfall internal projects, along with a simulation model of the existing control implementation with traditional PID controllers. The existing PID control is used as a reference performance, and it has been thoroughly studied and tuned in these previous Vattenfall internal projects. A turbine model was developed with characteristics based on the results of steady-state simulations of the plant using the software EBSILON. Using the derived model as a representative for the actual process, an MPC control strategy was developed using linearization and gain-scheduling. The control signal constraints (rate of change) and constraints on outputs were implemented to comply with plant constraints. After tuning the MPC control parameters, a number of simulation scenarios were performed to compare the MPC strategy with the existing PID control structure. The simulation scenarios also included cases highlighting the robustness properties of the MPC strategy. From the study, the main conclusions are: - The proposed Master MPC controller shows excellent set-point tracking performance even though the plant has strong interactions and non-linearity, and the controls and their rate of change are bounded. - The proposed Master MPC controller is robust, stable in the presence of disturbances and parameter variations. Even though the current study only considered a very small number of the possible disturbances and modelling errors, the considered cases are

  8. Model Predictive Control based on Finite Impulse Response Models

    DEFF Research Database (Denmark)

    Prasath, Guru; Jørgensen, John Bagterp

    2008-01-01

    We develop a regularized l2 finite impulse response (FIR) predictive controller with input and input-rate constraints. Feedback is based on a simple constant output disturbance filter. The performance of the predictive controller in the face of plant-model mismatch is investigated by simulations...
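
    A minimal sketch of the FIR prediction underlying such a controller is given below: future outputs over the horizon are formed by convolving past and planned inputs with the impulse-response coefficients. The regularization, constraints and disturbance filter of the paper are omitted, and all names are illustrative.

    ```python
    import numpy as np

    def fir_predict(h, u_past, u_future):
        """Predict outputs over the horizon from an FIR model
        y(k) = sum_i h[i] * u(k - i); requires len(u_past) >= len(h) - 1."""
        u = np.concatenate([u_past, u_future])
        n, N, m = len(h), len(u_future), len(u_past)
        y = np.empty(N)
        for k in range(N):
            # The n most recent inputs ending at the k-th future sample,
            # reversed so that window[i] = u(k - i).
            window = u[m + k - n + 1 : m + k + 1][::-1]
            y[k] = h @ window
        return y

    h = 0.5 ** np.arange(5)                    # illustrative impulse response
    print(fir_predict(h, np.ones(4), np.ones(3)))
    ```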

  9. Incident duration modeling using flexible parametric hazard-based models.

    Science.gov (United States)

    Li, Ruimin; Shang, Pan

    2014-01-01

    Assessing and prioritizing the duration time and effects of traffic incidents on major roads present significant challenges for road network managers. This study examines the effect of numerous factors associated with various types of incidents on their duration and proposes an incident duration prediction model. Several parametric accelerated failure time hazard-based models were examined, including Weibull, log-logistic, log-normal, and generalized gamma, as well as all models with gamma heterogeneity and flexible parametric hazard-based models with degrees of freedom ranging from one to ten, by analyzing a traffic incident dataset obtained from the Incident Reporting and Dispatching System in Beijing in 2008. Results show that different factors significantly affect different incident time phases, whose best-fitting distributions were diverse. Given the best hazard-based model of each incident time phase, the prediction result can be reasonable for most incidents. The results of this study can aid traffic incident management agencies not only in implementing strategies that would reduce incident duration, and thus reduce congestion, secondary incidents, and the associated human and economic losses, but also in effectively predicting incident duration time. PMID:25530753
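
    As a hedged illustration of the accelerated failure time approach the study examines, the sketch below fits a Weibull AFT model to a toy incident table with the lifelines library; all records and covariates are invented for the example.

    ```python
    import pandas as pd
    from lifelines import WeibullAFTFitter

    # Invented incident records: duration (minutes), a clearance flag
    # (0 = still ongoing, i.e. censored) and two covariates.
    df = pd.DataFrame({
        "duration":      [35, 60, 18, 95, 42, 120, 27, 55, 73, 38],
        "cleared":       [1, 1, 1, 0, 1, 1, 1, 1, 0, 1],
        "lanes_blocked": [1, 2, 1, 3, 1, 2, 1, 2, 3, 1],
        "injury":        [0, 1, 0, 1, 0, 0, 1, 1, 0, 0],
    })

    aft = WeibullAFTFitter()
    aft.fit(df, duration_col="duration", event_col="cleared")
    aft.print_summary()   # coefficients act multiplicatively on duration
    ```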

  10. Incident Duration Modeling Using Flexible Parametric Hazard-Based Models

    Directory of Open Access Journals (Sweden)

    Ruimin Li

    2014-01-01

    Full Text Available Assessing and prioritizing the duration time and effects of traffic incidents on major roads present significant challenges for road network managers. This study examines the effect of numerous factors associated with various types of incidents on their duration and proposes an incident duration prediction model. Several parametric accelerated failure time hazard-based models were examined, including Weibull, log-logistic, log-normal, and generalized gamma, as well as all models with gamma heterogeneity and flexible parametric hazard-based models with degrees of freedom ranging from one to ten, by analyzing a traffic incident dataset obtained from the Incident Reporting and Dispatching System in Beijing in 2008. Results show that different factors significantly affect different incident time phases, whose best-fitting distributions were diverse. Given the best hazard-based model of each incident time phase, the prediction result can be reasonable for most incidents. The results of this study can aid traffic incident management agencies not only in implementing strategies that would reduce incident duration, and thus reduce congestion, secondary incidents, and the associated human and economic losses, but also in effectively predicting incident duration time.

  11. Model Based Control of Reefer Container Systems

    DEFF Research Database (Denmark)

    Sørensen, Kresten Kjær

    This thesis is concerned with the development of model based control for the Star Cool refrigerated container (reefer) with the objective of reducing energy consumption. This project has been carried out under the Danish Industrial PhD programme and has been financed by Lodam together with the...

  12. Port-based modeling of mechatronic systems

    NARCIS (Netherlands)

    Breedveld, Peter C.

    2004-01-01

    Many engineering activities, including mechatronic design, require that a multidomain or ‘multi-physics’ system and its control system be designed as an integrated system. This contribution discusses the background and tools for a port-based approach to integrated modeling and simulation of physical

  13. Agent Based Modelling for Social Simulation

    NARCIS (Netherlands)

    Smit, S.K.; Ubink, E.M.; Vecht, B. van der; Langley, D.J.

    2013-01-01

    This document is the result of an exploratory project looking into the status of, and opportunities for Agent Based Modelling (ABM) at TNO. The project focussed on ABM applications containing social interactions and human factors, which we termed ABM for social simulation (ABM4SS). During the course

  14. What's Missing in Model-Based Teaching

    Science.gov (United States)

    Khan, Samia

    2011-01-01

    In this study, the author investigated how four science teachers employed model-based teaching (MBT) over a 1-year period. The purpose of the research was to develop a baseline of the fundamental and specific dimensions of MBT that are present and absent in science teaching. Teacher interviews, classroom observations, and pre and post-student…

  15. Introducing Waqf Based Takaful Model in India

    Directory of Open Access Journals (Sweden)

    Syed Ahmed Salman

    2014-03-01

    Full Text Available Objective – Waqf is a unique feature of the socioeconomic system of Islam in a multi-religious and developing country like India. India is a country rich in waqf assets, and the history of waqf in India can be traced back 800 years. Most researchers suggest how waqf can be used as a tool to mitigate the poverty of Muslims. India has the third largest Muslim population after Indonesia and Pakistan. However, the majority of Muslims belong to the low-income group and they are in need of help. It is believed that waqf can be utilized for the betterment of the Indian Muslim community. Among the available uses of waqf assets, the main objective of this paper is to introduce a waqf-based takaful model in India. In addition, how this proposed model can be adopted in India is highlighted. Methods – Library research is applied, since this paper relies on secondary data, by thoroughly reviewing the most relevant literature. Result – India, as a country rich in waqf assets, should fully utilize these resources to help Muslims through takaful. Conclusion – In this study, we propose a waqf-based takaful model combining the concepts of mudarabah and wakalah for India. We recommend this model based on the background and situation of the country. Since we have not tested the viability of this model in India, future research should continue with this testing. Keywords: Waqf, Takaful, Poverty, India

  16. Unifying Model-Based and Reactive Programming within a Model-Based Executive

    Science.gov (United States)

    Williams, Brian C.; Gupta, Vineet; Norvig, Peter (Technical Monitor)

    1999-01-01

    Real-time, model-based, deduction has recently emerged as a vital component in AI's tool box for developing highly autonomous reactive systems. Yet one of the current hurdles towards developing model-based reactive systems is the number of methods simultaneously employed, and their corresponding melange of programming and modeling languages. This paper offers an important step towards unification. We introduce RMPL, a rich modeling language that combines probabilistic, constraint-based modeling with reactive programming constructs, while offering a simple semantics in terms of hidden state Markov processes. We introduce probabilistic, hierarchical constraint automata (PHCA), which allow Markov processes to be expressed in a compact representation that preserves the modularity of RMPL programs. Finally, a model-based executive, called Reactive Burton, is described that exploits this compact encoding to perform efficient simulation, belief state update and control sequence generation.

  17. A Novel Template-Based Learning Model

    CERN Document Server

    Abolghasemi-Dahaghani, Mohammadreza; Nowroozi, Alireza

    2011-01-01

    This article presents a model capable of learning and abstracting new concepts by comparing observations and finding the resemblance between them. In the model, new observations are compared with templates that have been derived from previous experience. In the first stage, the objects are represented through a geometric description, which is used for finding the object boundaries, and a descriptor inspired by the human visual system, and then they are fed into the model. Next, the new observations are identified by comparing them with the previously learned templates and are used for producing new templates. The comparisons are made with measures such as Euclidean or correlation distance. The new template is created by applying an onion-peeling algorithm, which consecutively uses the convex hulls made by the points representing the objects. If the new observation is remarkably similar to one of the observed categories, it is no longer util...

  18. Agent Based Modeling as an Educational Tool

    Science.gov (United States)

    Fuller, J. H.; Johnson, R.; Castillo, V.

    2012-12-01

    Motivation is a key element in high school education. One way to improve motivation and provide content, while helping address critical thinking and problem solving skills, is to have students build and study agent based models in the classroom. This activity visually connects concepts with their applied mathematical representation. "Engaging students in constructing models may provide a bridge between frequently disconnected conceptual and mathematical forms of knowledge." (Levy and Wilensky, 2011) We wanted to discover the feasibility of implementing a model based curriculum in the classroom given current and anticipated core and content standards. (Figure captions: simulation using California GIS data; simulation of high school student lunch popularity using an aerial photograph on top of a terrain value map.)

  19. Multiagent-Based Model For ESCM

    Directory of Open Access Journals (Sweden)

    Delia MARINCAS

    2011-01-01

    Full Text Available Web based applications for Supply Chain Management (SCM are now a necessity for every company in order to meet the increasing customer demands, to face the global competition and to make profit. Multiagent-based approach is appropriate for eSCM because it shows many of the characteristics a SCM system should have. For this reason, we have proposed a multiagent-based eSCM model which configures a virtual SC, automates the SC activities: selling, purchasing, manufacturing, planning, inventory, etc. This model will allow a better coordination of the supply chain network and will increase the effectiveness of Web and intelligent technologies employed in eSCM software.

  20. Modeling Leaves Based on Real Image

    Institute of Scientific and Technical Information of China (English)

    CAO Yu-kun; LI Yun-feng; ZHU Qing-sheng; LIU Yin-bin

    2004-01-01

    Plants have complex structures. The shape of a plant component is vital for capturing the characteristics of a species. One of the challenges in computer graphics is to create geometry of objects in an intuitive and direct way while allowing interactive manipulation of the resulting shapes. In this paper, an interactive method for modeling leaves based on real images is proposed using biological data for individual plants. The modeling process begins with a one-dimensional analogue of implicit surfaces, from which a 2D silhouette of a leaf is generated based on image segmentation. The silhouette skeleton is thus obtained. Feature parameters of the leaf are extracted based on biologically experimental data, and the obtained leaf structure is then modified by comparing the synthetic result with the real leaf so as to make the leaf structure more realistic. Finally, the leaf mesh is constructed by sweeps.

  1. [Fast spectral modeling based on Voigt peaks].

    Science.gov (United States)

    Li, Jin-rong; Dai, Lian-kui

    2012-03-01

    Indirect hard modeling (IHM) is a recently introduced method for quantitative spectral analysis, applied to the analysis of the nonlinear relation between a mixture spectrum and component concentrations. In addition, IHM is an effective technique for analyzing components of mixtures with molecular interactions and strongly overlapping bands. Before the regression model is established, IHM needs to model the measured spectrum as a sum of Voigt peaks. The precision of the spectral model has an immediate impact on the accuracy of the regression model. A spectrum often includes dozens or even hundreds of Voigt peaks, which means that spectral modeling is in fact an optimization problem of high dimensionality. Consequently, a large computational overhead is needed, and the solution may not be numerically unique because the optimization problem is ill-conditioned. An improved spectral modeling method is presented in the present paper, which reduces the dimensionality of the optimization problem by identifying the overlapped peaks in the spectrum. Experimental results show that spectral modeling based on the new method is more accurate and needs a much shorter running time than the conventional method. PMID:22582612
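
    To make the sum-of-Voigt-peaks step concrete, the sketch below fits a two-peak synthetic spectrum by nonlinear least squares using SciPy's Voigt profile. It illustrates the optimization problem the paper describes, not the paper's improved method; the peak count, parameters and noise level are assumptions.

    ```python
    import numpy as np
    from scipy.special import voigt_profile
    from scipy.optimize import least_squares

    def spectrum_model(params, x, n_peaks):
        """Spectrum as a sum of Voigt peaks; params packs
        (center, amplitude, sigma, gamma) for each peak."""
        y = np.zeros_like(x)
        for c, a, s, g in params.reshape(n_peaks, 4):
            y += a * voigt_profile(x - c, s, g)
        return y

    x = np.linspace(0.0, 10.0, 500)
    true = np.array([3.0, 1.0, 0.2, 0.10, 6.0, 0.7, 0.3, 0.05])
    y_obs = spectrum_model(true, x, 2) \
        + 0.01 * np.random.default_rng(1).standard_normal(x.size)

    fit = least_squares(lambda p: spectrum_model(p, x, 2) - y_obs,
                        x0=[2.8, 0.8, 0.3, 0.1, 6.2, 0.5, 0.2, 0.1],
                        bounds=(1e-6, 20.0))   # keep widths/amplitudes positive
    print(fit.x.reshape(2, 4))
    ```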

  2. Mesoscopic model of actin-based propulsion.

    Directory of Open Access Journals (Sweden)

    Jie Zhu

    Full Text Available Two theoretical models dominate current understanding of actin-based propulsion: the microscopic polymerization ratchet model predicts that growing and writhing actin filaments generate forces and movements, while the macroscopic elastic propulsion model suggests that deformation and stress of the growing actin gel are responsible for the propulsion. We examine both experimentally and computationally the 2D movement of ellipsoidal beads propelled by actin tails and show that neither of the two models can explain the observed bistability of the orientation of the beads. To explain the data, we develop a 2D hybrid mesoscopic model by reconciling these two models such that individual actin filaments undergoing nucleation, elongation, attachment, detachment and capping are embedded into the boundary of a node-spring viscoelastic network representing the macroscopic actin gel. Stochastic simulations of this 'in silico' actin network show that the combined effects of the macroscopic elastic deformation and microscopic ratchets can explain the observed bistable orientation of the actin-propelled ellipsoidal beads. To test the theory further, we analyze the observed distribution of the curvatures of the trajectories and show that the hybrid model's predictions fit the data. Finally, we demonstrate that the model can explain both concave-up and concave-down force-velocity relations for growing actin networks depending on the characteristic time scale and network recoil. To summarize, we propose that both microscopic polymerization ratchets and macroscopic stresses of the deformable actin network are responsible for the force and movement generation.

  3. Modeling acquaintance networks based on balance theory

    Directory of Open Access Journals (Sweden)

    Vukašinović Vida

    2014-09-01

    Full Text Available An acquaintance network is a social structure made up of a set of actors and the ties between them. These ties change dynamically as a consequence of incessant interactions between the actors. In this paper we introduce a social network model called the Interaction-Based (IB model that involves well-known sociological principles. The connections between the actors and the strength of the connections are influenced by the continuous positive and negative interactions between the actors and, vice versa, the future interactions are more likely to happen between the actors that are connected with stronger ties. The model is also inspired by the social behavior of animal species, particularly that of ants in their colony. A model evaluation showed that the IB model turned out to be sparse. The model has a small diameter and an average path length that grows in proportion to the logarithm of the number of vertices. The clustering coefficient is relatively high, and its value stabilizes in larger networks. The degree distributions are slightly right-skewed. In the mature phase of the IB model, i.e., when the number of edges does not change significantly, most of the network properties do not change significantly either. The IB model was found to be the best of all the compared models in simulating the e-mail URV (University Rovira i Virgili of Tarragona) network because the properties of the IB model more closely matched those of the e-mail URV network than the other models.

  4. A comprehensive theory-based transport model

    International Nuclear Information System (INIS)

    A new theory based transport model with comprehensive physics (trapping, general toroidal geometry, finite beta, collisions) has been developed. The core of the model is the new trapped-gyro-Landau-fluid (TGLF) equations which provide a fast and accurate approximation to the linear eigenmodes for gyrokinetic drift-wave instabilities (trapped ion and electron modes, ion and electron temperature gradient modes and kinetic ballooning modes). This new TGLF transport model removes the limitation of its predecessor GLF23 and is valid for the same conditions as the gyrokinetic equations. A theory-based philosophy is used in the model construction. The closure coefficients of the TGLF equations are fit to local kinetic theory to give accurate linear eigenmodes. The saturation model is fit to non-linear turbulence simulations. No fitting to experiment is done so applying the model to experiments is a true test of the theory it is approximating. The TGLF model unifies trapped and passing particles in a single set of gyrofluid equations. A model for the averaging of the Landau resonance by the trapped particles makes the equations work seamlessly over the whole drift-wave wavenumber range from trapped ion modes to electron temperature gradient modes. A fast eigenmode solution method enables unrestricted magnetic geometry. Electron-ion collisions and full electromagnetic fluctuations round out the physics. The linear eigenmodes have been benchmarked against comprehensive physics gyrokinetic calculations over a large range of plasma parameters. The deviation between the gyrokinetic and TGLF linear growth rates averages 11.4% in shifted circle geometry. The transport model uses the TGLF eigenmodes to compute quasilinear fluxes of energy and particles. A model for the saturated amplitude of the turbulence completes the calculation. The saturation model is constructed to fit a large set of nonlinear gyrokinetic turbulence simulations. The TGLF model is valid in new physical

  5. A comprehensive theory-based transport model

    International Nuclear Information System (INIS)

    Full text: A new theory based transport model with comprehensive physics (trapping, general toroidal geometry, finite beta, collisions) has been developed. The core of the model is the new trapped-gyro- Landau-fluid (TGLF) equations which provide a fast and accurate approximation to the linear eigenmodes for gyrokinetic drift-wave instabilities (trapped ion and electron modes, ion and electron temperature gradient modes and kinetic ballooning modes). This new TGLF transport model removes the limitation of its predecessor GLF23 and is valid for the same conditions as the gyrokinetic equations. A theory-based philosophy is used in the model construction. The closure coefficients of the TGLF equations are fit to local kinetic theory to give accurate linear eigenmodes. The saturation model is fit to non-linear turbulence simulations. No fitting to experiment is done so applying the model to experiments is a true test of the theory it is approximating. The TGLF model unifies trapped and passing particles in a single set of gyrofluid equations. A model for the averaging of the Landau resonance by the trapped particles makes the equations work seamlessly over the whole drift-wave wavenumber range from trapped ion modes to electron temperature gradient modes. A fast eigenmode solution method enables unrestricted magnetic geometry. Electron-ion collisions and full electromagnetic fluctuations round out the physics. The linear eigenmodes have been benchmarked against comprehensive physics gyrokinetic calculations over a large range of plasma parameters. The deviation between the gyrokinetic and TGLF linear growth rates averages 11.4% in shifted circle geometry. The transport model uses the TGLF eigenmodes to compute quasilinear fluxes of energy and particles. A model for the saturated amplitude of the turbulence completes the calculation. The saturation model is constructed to fit a large set of nonlinear gyrokinetic turbulence simulations. The TGLF model is valid in new

  6. Electrochemistry-based Battery Modeling for Prognostics

    Science.gov (United States)

    Daigle, Matthew J.; Kulkarni, Chetan Shrikant

    2013-01-01

    Batteries are used in a wide variety of applications. In recent years, they have become popular as a source of power for electric vehicles such as cars, unmanned aerial vehicles, and commercial passenger aircraft. In such application domains, it becomes crucial to both monitor battery health and performance and to predict end of discharge (EOD) and end of useful life (EOL) events. To implement such technologies, it is crucial to understand how batteries work and to capture that knowledge in the form of models that can be used by monitoring, diagnosis, and prognosis algorithms. In this work, we develop electrochemistry-based models of lithium-ion batteries that capture the significant electrochemical processes, are computationally efficient, capture the effects of aging, and are of suitable accuracy for reliable EOD prediction in a variety of usage profiles. This paper reports on the progress of such a model, with results demonstrating the model validity and accurate EOD predictions.
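
    End-of-discharge prediction of the kind described can be illustrated by simulating a battery model forward under a load until the terminal voltage hits cutoff. The sketch below uses a deliberately crude equivalent circuit (linear open-circuit voltage plus ohmic drop) purely as a stand-in for the paper's electrochemistry-based model; every parameter is an assumption.

    ```python
    def predict_eod(capacity_ah, soc0, current_a, v_cutoff=3.0, dt_s=1.0,
                    r_ohm=0.05, ocv=lambda s: 3.2 + 0.9 * s):
        """Seconds until end of discharge: step the state of charge forward
        under a constant current until the modeled terminal voltage reaches
        the cutoff voltage."""
        soc, t = soc0, 0.0
        while soc > 0.0 and ocv(soc) - current_a * r_ohm > v_cutoff:
            soc -= current_a * dt_s / (capacity_ah * 3600.0)   # coulomb counting
            t += dt_s
        return t

    # A 2.2 Ah cell under a 2 A load: roughly 1.1 h to end of discharge.
    print(predict_eod(capacity_ah=2.2, soc0=1.0, current_a=2.0))
    ```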

  7. An immune based dynamic intrusion detection model

    Institute of Scientific and Technical Information of China (English)

    LI Tao

    2005-01-01

    With the dynamic description method for self and antigen, and the concept of dynamic immune tolerance for lymphocytes in network-security domain presented in this paper, a new immune based dynamic intrusion detection model (Idid) is proposed. In Idid, the dynamic models and the corresponding recursive equations of the lifecycle of mature lymphocytes, and the immune memory are built. Therefore, the problem of the dynamic description of self and nonself in computer immune systems is solved, and the defect of the low efficiency of mature lymphocyte generating in traditional computer immune systems is overcome. Simulations of this model are performed, and the comparison experiment results show that the proposed dynamic intrusion detection model has a better adaptability than the traditional methods.

  8. Internet Supported Model Based Condition Monitoring

    Directory of Open Access Journals (Sweden)

    A Lewlaski

    2010-04-01

    Full Text Available The importance of condition monitoring for preventive and predictive maintenance has increased through the use of system modelling. This modelling is carried out using the manufacturer's information. Data collection using data acquisition cards provides raw data to support system monitoring, especially when used through internet and network facilities, which make it economically available to a larger number of users. A model based condition monitoring system which utilises the Internet is presented in this paper. The overall software/hardware for this system will be referred to as the MBCM portable unit here. It contains a standalone model of the system under consideration. The unit is capable of capturing signals directly from the system, saving them and producing these data in different formats for further analysis. In order to monitor the performance of the investigated system, the MBCM contains an embedded web-server to enable different signals to be monitored and captured locally, over a network, and via an internet connection.

  9. Model-based multiple patterning layout decomposition

    Science.gov (United States)

    Guo, Daifeng; Tian, Haitong; Du, Yuelin; Wong, Martin D. F.

    2015-10-01

    As one of the most promising next-generation lithography technologies, multiple patterning lithography (MPL) plays an important role in the attempt to keep pace with the 10 nm technology node and beyond. With feature sizes continuing to shrink, it has become impossible to print dense layouts within one single exposure. As a result, MPL such as double patterning lithography (DPL) and triple patterning lithography (TPL) has been widely adopted. There is a large volume of literature on DPL/TPL layout decomposition, and the current approach is to formulate the problem as a classical graph-coloring problem: layout features (polygons) are represented by vertices in a graph G and there is an edge between two vertices if and only if the distance between the two corresponding features is less than a minimum distance threshold value dmin. The problem is to color the vertices of G using k colors (k = 2 for DPL, k = 3 for TPL) such that no two vertices connected by an edge are given the same color. This is a rule-based approach, which imposes a geometric distance as a minimum constraint and simply decomposes polygons within that distance into different masks. It is not desirable in practice because this criterion cannot completely capture the behavior of the optics; for example, it lacks sufficient information such as the optical source characteristics and the effects between polygons beyond the minimum distance. To remedy the deficiency, a model-based layout decomposition approach that bases the decomposition criteria on simulation results was first introduced at SPIE 2013 [1]. However, that algorithm [1] is based on simplified assumptions about the optical simulation model and therefore its usage on real layouts is limited. Recently AMSL [2] also proposed a model-based approach to layout decomposition that iteratively simulates the layout, which requires excessive computational resources and may lead to sub-optimal solutions. The approach [2] also potentially generates too many stitches. In this
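
    The rule-based formulation the abstract starts from is straightforward to reproduce. The sketch below builds the conflict graph from pairwise feature distances and colors it greedily; greedy coloring is only a heuristic, and a model-based variant would replace the distance test with simulation results. The coordinates and d_min value are invented.

    ```python
    from itertools import combinations
    import networkx as nx

    # Invented feature centroids; an edge joins features closer than d_min.
    features = {"A": (0, 0), "B": (1, 0), "C": (0, 1), "D": (5, 5)}
    d_min = 1.5

    G = nx.Graph()
    G.add_nodes_from(features)
    for (u, pu), (v, pv) in combinations(features.items(), 2):
        if (pu[0] - pv[0]) ** 2 + (pu[1] - pv[1]) ** 2 < d_min ** 2:
            G.add_edge(u, v)

    # Greedy coloring; for DPL check that 2 colors suffice, TPL allows 3.
    coloring = nx.greedy_color(G, strategy="largest_first")
    print(coloring, "-> masks needed:", max(coloring.values()) + 1)
    ```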

  10. Multivariate statistical modelling based on generalized linear models

    CERN Document Server

    Fahrmeir, Ludwig

    1994-01-01

    This book is concerned with the use of generalized linear models for univariate and multivariate regression analysis. Its emphasis is to provide a detailed introductory survey of the subject based on the analysis of real data drawn from a variety of subjects including the biological sciences, economics, and the social sciences. Where possible, technical details and proofs are deferred to an appendix in order to provide an accessible account for non-experts. Topics covered include: models for multi-categorical responses, model checking, time series and longitudinal data, random effects models, and state-space models. Throughout, the authors have taken great pains to discuss the underlying theoretical ideas in ways that relate well to the data at hand. As a result, numerous researchers whose work relies on the use of these models will find this an invaluable account to have on their desks. "The basic aim of the authors is to bring together and review a large part of recent advances in statistical modelling of m...
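
    As a minimal example of the model family the book covers, the sketch below fits a logistic (binomial-family) GLM to synthetic data with statsmodels; the data and coefficients are invented.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    X = sm.add_constant(rng.standard_normal((200, 2)))
    p = 1.0 / (1.0 + np.exp(-(0.5 + 1.0 * X[:, 1] - 0.7 * X[:, 2])))
    y = rng.binomial(1, p)

    # Binary response with a logit link, the canonical binomial GLM.
    model = sm.GLM(y, X, family=sm.families.Binomial())
    print(model.fit().summary())
    ```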

  11. Image-based modelling of organogenesis.

    Science.gov (United States)

    Iber, Dagmar; Karimaddini, Zahra; Ünal, Erkan

    2016-07-01

    One of the major challenges in biology concerns the integration of data across length and time scales into a consistent framework: how do macroscopic properties and functionalities arise from the molecular regulatory networks, and how can they change as a result of mutations? Morphogenesis provides an excellent model system to study how simple molecular networks robustly control complex processes on the macroscopic scale despite molecular noise, and how important functional variants can emerge from small genetic changes. Recent advancements in three-dimensional imaging technologies, computer algorithms and computer power now allow us to develop and analyse increasingly realistic models of biological control. Here, we present our pipeline for image-based modelling that includes the segmentation of images, the determination of displacement fields and the solution of systems of partial differential equations on the growing, embryonic domains. The development of suitable mathematical models, the data-based inference of parameter sets and the evaluation of competing models are still challenging, and current approaches are discussed. PMID:26510443

  12. Business Models for NFC Based Mobile Payments

    DEFF Research Database (Denmark)

    Chae, Johannes Sang-Un; Hedman, Jonas

    2015-01-01

    Purpose: The purpose of the paper is to develop a business model framework for NFC based mobile payment solutions consisting of four mutually interdepended components: the value service, value network, value architecture, and value finance. Design: Using a comparative case study method, the paper...... from multiple stakeholders and the creation of an ecosystem. Furthermore, they focus on the scalability of their value propositions. Originality / value: The paper offers an applicable business model framework that allows practitioners and academics to study current and future mobile payment approaches....

  13. Model-based Tomographic Reconstruction Literature Search

    Energy Technology Data Exchange (ETDEWEB)

    Chambers, D H; Lehman, S K

    2005-11-30

    In the process of preparing a proposal for internal research funding, a literature search was conducted on the subject of model-based tomographic reconstruction (MBTR). The purpose of the search was to ensure that the proposed research would not replicate any previous work. We found that the overwhelming majority of work on MBTR which used parameterized models of the object was theoretical in nature. Only three researchers had applied the technique to actual data. In this note, we summarize the findings of the literature search.

  14. History-based trust negotiation model

    Institute of Scientific and Technical Information of China (English)

    ZHAO Yi-zhu; ZHAO Yan-hua; LU Hong-wei

    2009-01-01

    Trust negotiation (TN) is an approach to establish trust between strangers through iterative disclosure of digital credentials. Speeding up subsequent negotiations between the same negotiators is a problem worthy of research. This paper introduces the concept of a visiting card and presents a history-based trust negotiation (HBTN) model. HBTN creates an account for a counterpart at the first negotiation and records the valid credentials that the counterpart disclosed during each trust negotiation in its historical information base (HIB). In subsequent negotiations, neither party needs to disclose those credentials again. HBTN speeds up subsequent negotiations between entities that interact with each other frequently without impairing privacy preservation.
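
    A toy sketch of the historical-information-base idea follows: credentials verified in earlier sessions are recorded per counterpart, so later negotiations skip re-disclosure. The class and method names are invented, and the real model's accounts, validity checking and privacy machinery are omitted.

    ```python
    class HistoryBasedNegotiator:
        """Minimal HIB sketch: remember which credentials a counterpart
        has already validly disclosed in earlier negotiations."""

        def __init__(self):
            self.hib = {}   # counterpart id -> set of verified credentials

        def negotiate(self, counterpart_id, required, disclosed):
            known = self.hib.setdefault(counterpart_id, set())
            still_needed = required - known        # history covers the rest
            known |= still_needed & disclosed      # record new disclosures
            return still_needed <= disclosed       # True if trust established

    n = HistoryBasedNegotiator()
    print(n.negotiate("alice", {"employee_id", "license"},
                      {"employee_id", "license"}))      # True, full disclosure
    print(n.negotiate("alice", {"employee_id"}, set()))  # True, nothing re-sent
    ```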

  15. Item Modeling Concept Based on Multimedia Authoring

    Directory of Open Access Journals (Sweden)

    Janez Stergar

    2008-09-01

    Full Text Available In this paper a modern item design framework for computer based assessment, built on the Flash authoring environment, is introduced. Question design is discussed, and the multimedia authoring environment used for item modeling is emphasized. Item type templates are a structured means of collecting and storing item information that can be used to improve the efficiency and security of the innovative item design process. Templates can modernize item design and enhance and speed up the development process. Along with content creation, multimedia has vast potential for use in innovative testing. The introduced item design template is based on a taxonomy of innovative items, which has great potential for expanding the content areas and construct coverage of an assessment. The presented item design approach is based on two GUIs: one for question design based on the implemented item design templates and one for user-interaction tracking and retrieval. The concept of user interfaces based on Flash technology is discussed, as well as the implementation of the innovative item design forms with multimedia authoring. An innovative method for user-interaction storage and retrieval based on PHP, extending Flash's capabilities in the proposed framework, is also introduced.

  16. Modelling spatial association in pattern based land use simulation models.

    Science.gov (United States)

    Anputhas, Markandu; Janmaat, Johannus John A; Nichol, Craig F; Wei, Xiaohua Adam

    2016-10-01

    Pattern based land use models are widely used to forecast land use change. These models predict land use change using driving variables observed on the studied landscape. Many of these models have a limited capacity to account for interactions between neighbouring land parcels. Some modellers have used common spatial statistical measures to incorporate neighbour effects. However, these approaches were developed for continuous variables, while land use classifications are categorical. Neighbour interactions are also endogenous, changing as the land use patterns change. In this study we describe a single variable measure that captures aspects of neighbour interactions as reflected in the land use pattern. We use a stepwise updating process to demonstrate how dynamic updating of our measure impacts on model forecasts. We illustrate these results using the CLUE-S (Conversion of Land Use and its Effects at Small regional extent) system to forecast land use change for the Deep Creek watershed in the northern Okanagan Valley of British Columbia, Canada. Results establish that our measure improves model calibration and that ignoring changing spatial influences biases land use change forecasts. PMID:27420169
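
    The paper's single-variable measure is not reproduced here; as one plausible stand-in, the sketch below computes, for every raster cell, the share of neighbouring cells carrying a given land-use class, a quantity that would be re-computed after each simulation step to keep the neighbour effect endogenous. The window shape and wrap-around boundaries are simplifying assumptions.

    ```python
    import numpy as np

    def neighbour_share(grid, cls, radius=1):
        """Share of cells within a square window (centre excluded) that
        carry land-use class `cls`; edges wrap around for simplicity."""
        same = (grid == cls).astype(float)
        out = np.zeros_like(same)
        count = 0
        for di in range(-radius, radius + 1):
            for dj in range(-radius, radius + 1):
                if di == 0 and dj == 0:
                    continue
                out += np.roll(np.roll(same, di, axis=0), dj, axis=1)
                count += 1
        return out / count

    land_use = np.random.default_rng(0).integers(0, 3, (50, 50))
    urban_neigh = neighbour_share(land_use, cls=2)   # update every step
    ```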

  17. Graph based model to support nurses' work.

    Science.gov (United States)

    Benedik, Peter; Rajkovič, Uroš; Sušteršič, Olga; Prijatelj, Vesna; Rajkovič, Vladislav

    2014-01-01

    Health care is a knowledge-based community that critically depends on knowledge management activities in order to ensure quality. Nurses are primary stakeholders and need to ensure that their information and knowledge needs are met in ways that enable them to improve the quality and efficiency of health care service delivery for all subjects of health care. This paper describes a system that helps nurses create nursing care plans. It supports focusing a nurse's attention on those resources and solutions that are likely to be most relevant to their particular situation or problem in the nursing domain. The system is based on a multi-relational property graph, a flexible modeling construct. The graph allows modeling a nursing domain (ontology) and the indices that partition the domain into an efficient, searchable space, where the solution to a problem is seen as abstractly defined traversals through its vertices and edges. PMID:24943559
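
    A multi-relational property graph of the kind described can be mocked up with networkx: nodes carry a kind property, edges carry a rel property, and a care-plan suggestion becomes a typed traversal. The domain terms below are invented examples, not the system's actual ontology.

    ```python
    import networkx as nx

    G = nx.MultiDiGraph()
    G.add_node("pressure_ulcer", kind="diagnosis")
    G.add_node("reposition_q2h", kind="intervention")
    G.add_node("skin_intact", kind="outcome")
    G.add_edge("pressure_ulcer", "reposition_q2h", rel="addressed_by")
    G.add_edge("reposition_q2h", "skin_intact", rel="aims_at")

    def interventions_for(g, diagnosis):
        # A care-plan suggestion as a traversal over typed edges.
        return [v for _, v, d in g.out_edges(diagnosis, data=True)
                if d.get("rel") == "addressed_by"]

    print(interventions_for(G, "pressure_ulcer"))
    ```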

  18. A multi-characteristic based algorithm for classifying vegetation in a plateau area: Qinghai Lake watershed, northwestern China

    Science.gov (United States)

    Ma, Weiwei; Gong, Cailan; Hu, Yong; Li, Long; Meng, Peng

    2015-10-01

    Remote sensing technology has been broadly recognized for its convenience and efficiency in mapping vegetation, particularly in high-altitude and inaccessible areas that lack in-situ observations. In this study, Landsat Thematic Mapper (TM) images and Chinese environmental mitigation satellite CCD sensor (HJ-1 CCD) images, both at 30 m spatial resolution, were employed for identifying and monitoring vegetation types in an area of western China, the Qinghai Lake watershed (QHLW). A decision classification tree (DCT) algorithm using multiple characteristics, including seasonal TM/HJ-1 CCD time series data combined with a digital elevation model (DEM) dataset, and a supervised maximum likelihood classification (MLC) algorithm using a single-date TM image were applied to vegetation classification. The accuracy of the two algorithms was assessed using field observation data. Based on the vegetation classification maps produced, it was found that the DCT using multi-season data and geomorphologic parameters was superior to the MLC algorithm using a single-date image, improving the overall accuracy by 11.86% at the second class level and significantly reducing the "salt and pepper" noise. The DCT algorithm applied to TM/HJ-1 CCD time series data and geomorphologic parameters appeared to be a valuable and reliable tool for monitoring vegetation at the first class level (5 vegetation classes) and the second class level (8 vegetation subclasses). The DCT algorithm using multiple characteristics might provide a theoretical basis and general approach to automatic extraction of vegetation types from remote sensing imagery over plateau areas.
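
    As scaffolding for the DCT workflow described above, the sketch below trains and cross-validates a decision-tree classifier on a stand-in feature matrix (seasonal spectral indices plus DEM-derived terrain variables); real per-pixel features and labels would replace the random placeholders.

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.model_selection import cross_val_score

    # Placeholder per-pixel features: e.g. spring/summer/autumn NDVI plus
    # elevation, slope and aspect; labels are 5 first-level classes.
    rng = np.random.default_rng(0)
    X = rng.random((2000, 6))
    y = rng.integers(0, 5, 2000)

    dct = DecisionTreeClassifier(max_depth=8, min_samples_leaf=20, random_state=0)
    print("mean CV accuracy:", cross_val_score(dct, X, y, cv=5).mean())
    ```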

  19. Requirements Engineering Model: Role Based Goal Oriented Model

    OpenAIRE

    Sandfreni; Surendro Ir. Kridanto

    2016-01-01

    Requirements engineering through an intentional perspective is one of the approaches that appear in the field of requirements engineering. That approach can explain the characteristics of the behavior of an actor. The use of Goal-Based Workflow and the KAOS method in iStar modeling might help the system analyst to gain knowledge about the internal process inside each actor sequentially, such that the whole sequence of activities to achieve the goal is exposed clearly in those actor's internal...

  20. Model-Based Trace-Checking

    CERN Document Server

    Howard, Y; Gravell, A; Ferreira, C; Augusto, J C

    2011-01-01

    Trace analysis can be a useful way to discover problems in a program under test. Rather than writing a special purpose trace analysis tool, this paper proposes that traces can usefully be analysed by checking them against a formal model using a standard model-checker or else an animator for executable specifications. These techniques are illustrated using a Travel Agent case study implemented in J2EE. We added trace beans to this code that write trace information to a database. The traces are then extracted and converted into a form suitable for analysis by Spin, a popular model-checker, and Pro-B, a model-checker and animator for the B notation. This illustrates the technique, and also the fact that such a system can have a variety of models, in different notations, that capture different features. These experiments have demonstrated that model-based trace-checking is feasible. Future work is focussed on scaling up the approach to larger systems by increasing the level of automation.

  1. Agent Based Model of Livestock Movements

    Science.gov (United States)

    Miron, D. J.; Emelyanova, I. V.; Donald, G. E.; Garner, G. M.

    The modelling of livestock movements within Australia is of national importance for the purposes of the management and control of exotic disease spread, infrastructure development and the economic forecasting of livestock markets. In this paper an agent based model for the forecasting of livestock movements is presented. This models livestock movements from farm to farm through a saleyard. The decision of farmers to sell or buy cattle is often complex and involves many factors such as climate forecast, commodity prices, the type of farm enterprise, the number of animals available and associated off-shore effects. In this model the farm agent's intelligence is implemented using a fuzzy decision tree that utilises two of these factors. These two factors are the livestock price fetched at the last sale and the number of stock on the farm. On each iteration of the model farms choose either to buy, sell or abstain from the market thus creating an artificial supply and demand. The buyers and sellers then congregate at the saleyard where livestock are auctioned using a second price sealed bid. The price time series output by the model exhibits properties similar to those found in real livestock markets.
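
    The saleyard mechanism named above, a second-price sealed bid, is compact enough to sketch directly; the bidder names and amounts below are invented.

    ```python
    def second_price_auction(bids):
        """Sealed-bid second-price (Vickrey) auction: the highest bidder
        wins the lot but pays the second-highest bid."""
        if len(bids) < 2:
            raise ValueError("need at least two bidders")
        ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
        winner, price = ranked[0][0], ranked[1][1]
        return winner, price

    print(second_price_auction({"farm_a": 710, "farm_b": 695, "farm_c": 640}))
    # -> ('farm_a', 695): farm_a wins at farm_b's bid
    ```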

  2. Ecosystem Based Business Model of Smart Grid

    OpenAIRE

    Lundgaard, Morten Raahauge; Ma, Zheng; Jørgensen, Bo Nørregaard

    2015-01-01

    This paper tries to investigate the ecosystem based business model in a smart grid infrastructure and the potential of value capture in the highly complex macro infrastructure such as smart grid. This paper proposes an alternative perspective to study the smart grid business ecosystem to support the infrastructural challenges, such as the interoperability of business components for smart grid. So far little research has explored the business ecosystem in the smart grid concept. The study on t...

  3. Market Segmentation Using Bayesian Model Based Clustering

    OpenAIRE

    Van Hattum, P.

    2009-01-01

    This dissertation deals with two basic problems in marketing, that are market segmentation, which is the grouping of persons who share common aspects, and market targeting, which is focusing your marketing efforts on one or more attractive market segments. For the grouping of persons who share common aspects a Bayesian model based clustering approach is proposed such that it can be applied to data sets that are specifically used for market segmentation. The cluster algorithm can handle very l...

  4. Enhancing the Combat ID Agent Based Model

    OpenAIRE

    Spaans, M.; Petiet, P.J.; Dean, D; Jackson, J.; Bradley, W.; Shan, L. Y.; Ka-Yoon, W.; Yongwei, D.W.; Kai, C.W.

    2007-01-01

    During previous Project Albert and International Data Farming Workshops (IDFW), and during discussions between Dstl and TNO, the suitability and feasibility of Agent Based Models (ABMs) to support research on Combat Identification (Combat ID) were examined. The objective of this research is to investigate the effect of (a large number of) different variations in Situation Awareness (SA), Target Identification (Target ID), Human Factors, and Tactics, Techniques, and Proce...

  5. INVESTOR BASED PSYCHOLOGICAL DECISION MAKING MODEL

    OpenAIRE

    Mohd Alnajjar

    2013-01-01

    The significance of behavioral finance has been recognized. The intrinsic value of behavioral finance lies in investigating the irrational attitudes of investors while they make investment decisions. This study explores an investor-based psychological decision-making model for enhancing the understanding of the psychological decision-making process of investors in the Islamabad Stock Exchange. A self-administered structured questionnaire is used for gathering data from 168 respondents (investors of ISE). Correla...

  6. Building Information Modelling Incorporating Technology Based Assessment

    OpenAIRE

    Murphy, Maurice; Scott, Lloyd

    2011-01-01

    Building Information Modelling (BIM) is currently being developed as a virtual learning tool for construction and surveying students in the Dublin Institute of Technology. This advanced technology is also used to develop a technology based assessment practice for enhancing the learning environment of construction and surveying students. A theoretical design framework is presented in this paper, which combines advanced technology and assessment theory to create a virtual learning environment. ...

  7. CEAI: CCM based Email Authorship Identification Model

    OpenAIRE

    Nizamani, Sarwat; Memon, Nasrullah

    2013-01-01

    In this paper we present a model for email authorship identification (EAI) by employing a Cluster-based Classification (CCM) technique. Traditionally, stylometric features have been successfully employed in various authorship analysis tasks; we extend the traditional feature-set to include some more interesting and effective features for email authorship identification (e.g. the last punctuation mark used in an email, the tendency of an author to use capitalization at the start of an email, o...

  8. Requirements engineering-based conceptual modeling

    OpenAIRE

    Insfrán, E.; O. Pastor; Wieringa, R.J.

    2002-01-01

    The software production process involves a set of phases where a clear relationship and smooth transitions between them should be introduced. In this paper, a requirements engineering-based conceptual modelling approach is introduced as a way to improve the quality of the software production process. The aim of this approach is to provide a set of techniques and methods to capture software requirements and to provide a way to move from requirements to a conceptual schema in a traceable way. T...

  9. Genome Informed Trait-Based Models

    Science.gov (United States)

    Karaoz, U.; Cheng, Y.; Bouskill, N.; Tang, J.; Beller, H. R.; Brodie, E.; Riley, W. J.

    2013-12-01

    Trait-based approaches are powerful tools for representing microbial communities across both spatial and temporal scales within ecosystem models. Trait-based models (TBMs) represent the diversity of microbial taxa as stochastic assemblages with a distribution of traits constrained by trade-offs between these traits. Such representation with its built-in stochasticity allows the elucidation of the interactions between the microbes and their environment by reducing the complexity of microbial community diversity into a limited number of functional 'guilds' and letting them emerge across spatio-temporal scales. From the biogeochemical/ecosystem modeling perspective, the emergent properties of the microbial community could be directly translated into predictions of biogeochemical reaction rates and microbial biomass. The accuracy of TBMs depends on the identification of key traits of the microbial community members and on the parameterization of these traits. Current approaches to inform TBM parameterization are empirical (i.e., based on literature surveys). Advances in omic technologies (such as genomics, metagenomics, metatranscriptomics, and metaproteomics) pave the way to better-initialize models that can be constrained in a generic or site-specific fashion. Here we describe the coupling of metagenomic data to the development of a TBM representing the dynamics of metabolic guilds from an organic carbon stimulated groundwater microbial community. Illumina paired-end metagenomic data were collected from the community as it transitioned successively through electron-accepting conditions (nitrate-, sulfate-, and Fe(III)-reducing), and used to inform estimates of growth rates and the distribution of metabolic pathways (i.e., aerobic and anaerobic oxidation, fermentation) across a spatially resolved TBM. We use this model to evaluate the emergence of different metabolisms and predict rates of biogeochemical processes over time. We compare our results to observational

  10. Model based risk assessment - the CORAS framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune; Thunem, Atoosa P-J.

    2004-04-15

    Traditional risk analysis and assessment is based on failure-oriented models of the system. In contrast, model-based risk assessment (MBRA) utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The target models are then used as input sources for complementary risk analysis and assessment techniques, as well as a basis for the documentation of the assessment results. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tested with successful outcomes in a series of seven trials within the telemedicine and e-commerce areas. The CORAS project in general, and the CORAS application of MBRA in particular, have contributed positively to the visibility of model-based risk assessment and thus to the disclosure of several potentials for further exploitation of various aspects within this important research field. In that connection, the CORAS methodology's possibilities for further improvement towards utilization in more complex architectures, and also in other application domains such as the nuclear field, can be addressed. The latter calls for adapting the framework to address nuclear standards such as IEC 60880 and IEC 61513. For this development we recommend applying a trial-driven approach within the nuclear field. The tool-supported approach for combining risk analysis and system development also fits well with the HRP proposal for developing an Integrated Design Environment (IDE) providing efficient methods and tools to support control room systems design. (Author)

  11. Realistic face modeling based on multiple deformations

    Institute of Scientific and Technical Information of China (English)

    GONG Xun; WANG Guo-yin

    2007-01-01

    On the basis of the assumption that the human face belongs to a linear class, a multiple-deformation model is proposed to recover face shape from a few points on a single 2D image. Compared to conventional methods, this study has the following advantages. First, the proposed modified 3D sparse deforming model is a noniterative approach that can compute global translation efficiently and accurately. Second, the overfitting problem can be alleviated based on the proposed multiple-deformation model. Finally, by keeping the main features, the texture generated is realistic. Comparison results on ground truth data show that this novel method outperforms existing methods and that realistic 3D faces can be recovered efficiently from a single photograph.

  12. Model based monitoring of stormwater runoff quality

    DEFF Research Database (Denmark)

    Birch, Heidi; Vezzaro, Luca; Mikkelsen, Peter Steen

    2012-01-01

    Monitoring of micropollutants (MP) in stormwater is essential to evaluate the impacts of stormwater on the receiving aquatic environment. The aim of this study was to investigate how different strategies for monitoring of stormwater quality (combination of model with field sampling) affect the information obtained about MPs discharged from the monitored system. A dynamic stormwater quality model was calibrated using MP data collected by volume-proportional and passive sampling in a storm drainage system in the outskirts of Copenhagen (Denmark), and a 10-year rain series was used to find annual average and maximum event mean concentrations. Use of this model reduced the uncertainty of predicted annual average concentrations compared to a simple stochastic method based solely on data. The predicted annual average obtained by using passive sampler measurements (one month installation) for ...

  13. On Reading-Based Writing Instruction Model

    Institute of Scientific and Technical Information of China (English)

    李大艳; 王建安

    2012-01-01

    English writing is a complex, integrative process involving comprehensive skills. A host of students are still unable to write a coherent English paragraph after having learned English for many years at school, so helping college students improve their writing competence is a great challenge facing English teaching in China. Research on writing instruction abroad has flourished; in China, however, research in this field lags far behind, and there is a great need for a more effective writing instruction model that serves the Chinese context well. Enlightened by Krashen's input hypothesis and Swain's output hypothesis, the writer puts forward the Reading-Based Writing Instruction Model. This paper discusses the effectiveness of this model from different perspectives.

  14. Business Models for NFC based mobile payments

    Directory of Open Access Journals (Sweden)

    Johannes Sang Un Chae

    2015-01-01

    Purpose: The purpose of the paper is to develop a business model framework for NFC-based mobile payment solutions consisting of four mutually interdependent components: the value service, value network, value architecture, and value finance. Design: Using a comparative case study method, the paper investigates Google Wallet and ISIS Mobile Wallet and their underlying business models. Findings: Google Wallet and ISIS Mobile Wallet focus on providing an enhanced customer experience with their mobile wallets through a multifaceted value proposition. The delivery of their offerings requires cooperation from multiple stakeholders and the creation of an ecosystem. Furthermore, they focus on the scalability of their value propositions. Originality / value: The paper offers an applicable business model framework that allows practitioners and academics to study current and future mobile payment approaches.

  15. Model-based vision for space applications

    Science.gov (United States)

    Chaconas, Karen; Nashman, Marilyn; Lumia, Ronald

    1992-01-01

    This paper describes a method for tracking moving image features by combining spatial and temporal edge information with model-based feature information. The algorithm updates the two-dimensional position of object features by correlating predicted model features with current image data. The results of the correlation process are used to compute an updated model. The algorithm makes use of a high temporal sampling rate with respect to spatial changes of the image features and operates in a real-time multiprocessing environment. Preliminary results demonstrate successful tracking for image feature velocities between 1.1 and 4.5 pixels per image frame. This work has applications for docking, assembly, retrieval of floating objects and a host of other space-related tasks.

  16. Model-based target and background characterization

    Science.gov (United States)

    Mueller, Markus; Krueger, Wolfgang; Heinze, Norbert

    2000-07-01

    Up to now, most approaches to target and background characterization (and exploitation) concentrate solely on the information given by pixels. In many cases this is a complex and unprofitable task. During the development of automatic exploitation algorithms the main goal is the optimization of certain performance parameters. These parameters are measured during test runs while applying one algorithm with one parameter set to images that consist of image domains with very different characteristics (targets and various types of background clutter). Model-based geocoding and registration approaches provide means for utilizing the information stored in GIS (Geographical Information Systems). The geographical information stored in the various GIS layers can define ROE (Regions of Expectation) and may allow for dedicated algorithm parametrization and development. ROI (Region of Interest) detection algorithms (in most cases MMO (Man-Made Object) detection) use implicit target and/or background models. The ROI detection algorithms utilize gradient direction models that have to be matched with transformed image domain data; in most cases, simple threshold calculations on the match results discriminate target object signatures from the background. The geocoding approaches extract line-like structures (street signatures) from the image domain and match the graph constellation against a vector model extracted from a GIS database. Apart from geocoding, the algorithms can also be used for image-to-image registration (multi-sensor and data fusion) and may be used for the creation and validation of geographical maps.

  17. Inference-based procedural modeling of solids

    KAUST Repository

    Biggers, Keith

    2011-11-01

    As virtual environments become larger and more complex, there is an increasing need for more automated construction algorithms to support the development process. We present an approach for modeling solids by combining prior examples with a simple sketch. Our algorithm uses an inference-based approach to incrementally fit patches together in a consistent fashion to define the boundary of an object. This algorithm samples and extracts surface patches from input models, and develops a Petri net structure that describes the relationship between patches along an imposed parameterization. Then, given a new parameterized line or curve, we use the Petri net to logically fit patches together in a manner consistent with the input model. This allows us to easily construct objects of varying sizes and configurations using arbitrary articulation, repetition, and interchanging of parts. The result of our process is a solid model representation of the constructed object that can be integrated into a simulation-based environment. © 2011 Elsevier Ltd. All rights reserved.

  18. Néron models and base change

    CERN Document Server

    Halle, Lars Halvard

    2016-01-01

    Presenting the first systematic treatment of the behavior of Néron models under ramified base change, this book can be read as an introduction to various subtle invariants and constructions related to Néron models of semi-abelian varieties, motivated by concrete research problems and complemented with explicit examples. Néron models of abelian and semi-abelian varieties have become an indispensable tool in algebraic and arithmetic geometry since Néron introduced them in his seminal 1964 paper. Applications range from the theory of heights in Diophantine geometry to Hodge theory. We focus specifically on Néron component groups, Edixhoven’s filtration and the base change conductor of Chai and Yu, and we study these invariants using various techniques such as models of curves, sheaves on Grothendieck sites and non-archimedean uniformization. We then apply our results to the study of motivic zeta functions of abelian varieties. The final chapter contains a list of challenging open questions. This book is a...

  19. Concept Tree Based Information Retrieval Model

    Directory of Open Access Journals (Sweden)

    Chunyan Yuan

    2014-05-01

    This paper proposes a novel concept-based query expansion technique named the Markov concept tree model (MCTM), which discovers term relationships through a concept tree deduced from a term Markov network. We address two important issues for query expansion: the selection and the weighting of expansion search terms. In contrast to earlier methods, queries are expanded by adding those terms that are most similar to the concept of the query, rather than by selecting terms that are similar to a single query term. Utilizing a Markov network constructed from the co-occurrence information of the terms in the collection, the method generates a concept tree for each original query term, removes the redundant and irrelevant nodes in the concept tree, and then adjusts the weight of the original query terms and of the expansion terms based on a pruning algorithm. We use this model for query expansion and evaluate its effectiveness by examining the accuracy and robustness of the expansion methods. Experiments on a standard dataset reveal that, compared with the baseline model, this method achieves better query quality.
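
    The tree construction and pruning are specific to the paper, but the raw ingredient, term co-occurrence statistics used to pick expansion terms, can be sketched briefly. Everything below (the toy corpus, function names, and k) is illustrative only and is not the MCTM algorithm itself:

        from collections import Counter
        from itertools import combinations

        docs = [["markov", "network", "query", "expansion"],
                ["concept", "tree", "query", "term"],
                ["markov", "concept", "tree", "model"]]

        # Count how often each unordered term pair co-occurs in a document.
        cooc = Counter()
        for doc in docs:
            for a, b in combinations(sorted(set(doc)), 2):
                cooc[frozenset((a, b))] += 1

        def expand(term, k=2):
            """Return the k terms that co-occur most often with `term`."""
            scores = {}
            for pair, n in cooc.items():
                if term in pair:
                    other = next(iter(pair - {term}))
                    scores[other] = scores.get(other, 0) + n
            return sorted(scores, key=scores.get, reverse=True)[:k]

        print(expand("query"))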

  20. Model-based explanation of plant knowledge

    Energy Technology Data Exchange (ETDEWEB)

    Huuskonen, P.J. [VTT Electronics, Oulu (Finland). Embedded Software

    1997-12-31

    This thesis deals with computer explanation of knowledge related to design and operation of industrial plants. The needs for explanation are motivated through case studies and literature reviews. A general framework for analysing plant explanations is presented. Prototypes demonstrate key mechanisms for implementing parts of the framework. Power plants, steel mills, paper factories, and high energy physics control systems are studied to set requirements for explanation. The main problems are seen to be either lack or abundance of information. Design knowledge in particular is found missing at plants. Support systems and automation should be enhanced with ways to explain plant knowledge to the plant staff. A framework is formulated for analysing explanations of plant knowledge. It consists of three parts: 1. a typology of explanation, organised by the class of knowledge (factual, functional, or strategic) and by the target of explanation (processes, automation, or support systems), 2. an identification of explanation tasks generic for the plant domain, and 3. an identification of essential model types for explanation (structural, behavioural, functional, and teleological). The tasks use the models to create the explanations of the given classes. Key mechanisms are discussed to implement the generic explanation tasks. Knowledge representations based on objects and their relations form a vocabulary to model and present plant knowledge. A particular class of models, means-end models, are used to explain plant knowledge. Explanations are generated through searches in the models. Hypertext is adopted to communicate explanations over dialogue based on context. The results are demonstrated in prototypes. The VICE prototype explains the reasoning of an expert system for diagnosis of rotating machines at power plants. The Justifier prototype explains design knowledge obtained from an object-oriented plant design tool. Enhanced access mechanisms into on-line documentation are

  1. Model-based vision for car following

    Science.gov (United States)

    Schneiderman, Henry; Nashman, Marilyn; Lumia, Ronald

    1993-08-01

    This paper describes a vision processing algorithm that supports autonomous car following. The algorithm visually tracks the position of a 'lead vehicle' from the vantage of a pursuing 'chase vehicle'. The algorithm requires a 2-D model of the back of the lead vehicle. This model is composed of line segments corresponding to features that give rise to strong edges. There are seven sequential stages of computation: (1) Extracting edge points; (2) Associating extracted edge points with the model features; (3) Determining the position of each model feature; (4) Determining the model position; (5) Updating the motion model of the object; (6) Predicting the position of the object in the next image; (7) Predicting the location of all object features from the prediction of object position. All processing is confined to the 2-D image plane. The 2-D model location computed in this processing is used to determine the position of the lead vehicle with respect to a 3-D coordinate frame affixed to the chase vehicle. This algorithm has been used as part of a complete system to drive an autonomous vehicle, a High Mobility Multipurpose Wheeled Vehicle (HMMWV), such that it follows a lead vehicle at speeds up to 35 km/hr. The algorithm runs at an update rate of 15 Hertz and has a worst-case computational delay of 128 ms. The algorithm is implemented under the NASA/NBS Standard Reference Model for Telerobotic Control System Architecture (NASREM) and runs on a dedicated vision processing engine and a VME-based multiprocessor system.
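
    The seven stages above amount to a predict-match-update cycle in the image plane. A minimal sketch, assuming a constant-velocity motion model with alpha-beta filter gains (invented values); the matching step is stubbed with noise, standing in for the edge correlation of stages 1-4:

        import numpy as np

        pos = np.array([320.0, 240.0])   # model position in the image (pixels)
        vel = np.array([0.0, 0.0])       # image-plane velocity (pixels/frame)
        alpha, beta = 0.85, 0.005        # alpha-beta filter gains (assumed)

        def match_model(predicted):
            """Placeholder for edge-based correlation; returns a measured position."""
            return predicted + np.random.normal(0.0, 1.0, size=2)

        for frame in range(5):
            predicted = pos + vel               # predict position in next image
            measured = match_model(predicted)   # locate model features in the image
            residual = measured - predicted
            pos = predicted + alpha * residual  # update the motion model
            vel = vel + beta * residual
            print(f"frame {frame}: position {pos.round(2)}")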

  2. A Visual Attention Model Based Image Fusion

    Directory of Open Access Journals (Sweden)

    Rishabh Gupta

    2013-12-01

    The aim is to develop an efficient image fusion algorithm based on a visual attention model for images with distinct objects. Image fusion is a process of combining complementary information from multiple images of the same scene into one image, so that the resultant image contains a more accurate description of the scene than any of the individual source images. The two basic fusion techniques are pixel-level and region-level fusion. Pixel-level fusion operates on each pixel separately; the various pixel-level techniques are averaging, stationary wavelet transforms, discrete wavelet transforms, and Principal Component Analysis (PCA). Because of its lower sensitivity to noise and mis-registration, region-level image fusion is an emerging approach in the field of multifocus image fusion. The most appreciated region-based approaches are multifocus image fusion using the concepts of focal connectivity and of spatial frequency; both work well on still images as well as on video frames as inputs. A new region-based technique is proposed here for multifocus images with distinct objects. The method is based on visual attention models, and the results obtained are encouraging for input images with distinct objects. The proposed method's results are evaluated using Tenengrad and extended spatial frequency as performance measures, on several pairs of multifocus input images such as microscopic images, forensic images and video frames.
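
    The spatial-frequency idea referenced above can be sketched compactly: split two registered source images into blocks and keep, per block, the one with the higher spatial frequency. The block size and the SF definition below follow common usage, not necessarily the authors' exact formulation:

        import numpy as np

        def spatial_frequency(block):
            rf = np.sqrt(np.mean(np.diff(block, axis=1) ** 2))  # row frequency
            cf = np.sqrt(np.mean(np.diff(block, axis=0) ** 2))  # column frequency
            return np.sqrt(rf ** 2 + cf ** 2)

        def fuse(img_a, img_b, bs=8):
            """Block-wise multifocus fusion: keep the sharper block of the two."""
            out = np.empty_like(img_a)
            for i in range(0, img_a.shape[0], bs):
                for j in range(0, img_a.shape[1], bs):
                    a = img_a[i:i + bs, j:j + bs]
                    b = img_b[i:i + bs, j:j + bs]
                    out[i:i + bs, j:j + bs] = (
                        a if spatial_frequency(a) >= spatial_frequency(b) else b)
            return out

        a = np.random.rand(64, 64); b = np.random.rand(64, 64)
        print(fuse(a, b).shape)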

  3. Model-Based Method for Sensor Validation

    Science.gov (United States)

    Vatan, Farrokh

    2012-01-01

    Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. Such methods can only predict the most probable faulty sensors, subject to the initial probabilities defined for the failures. The method developed in this work takes a model-based approach and provides the faulty sensors (if any) that can be logically inferred from the model of the system and the sensor readings (observations). The method is also more suitable for systems where it is hard, or even impossible, to find the probability functions of the system. The method starts from a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concept of analytical redundancy relations (ARRs).
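
    A toy illustration of the ARR idea (sensor names, relations and tolerance are invented for the example): each relation links a subset of sensors, and intersecting the sensor sets of the violated relations isolates the faulty sensor logically, with no failure probabilities involved:

        # Two illustrative ARRs: a flow balance and a duplicated sensor.
        ARRS = {
            "mass_balance": (("q1", "q2", "q_total"),
                             lambda r: r["q1"] + r["q2"] - r["q_total"]),
            "duplicate_q1": (("q1", "q1b"),
                             lambda r: r["q1"] - r["q1b"]),
        }

        def suspects(readings, tol=0.05):
            """Sensors implicated by every violated redundancy relation."""
            violated = [set(sensors) for sensors, residual in ARRS.values()
                        if abs(residual(readings)) > tol]
            return set.intersection(*violated) if violated else set()

        # q1 disagrees with both relations it appears in -> logically faulty.
        print(suspects({"q1": 1.5, "q1b": 1.0, "q2": 2.0, "q_total": 3.0}))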

  4. Flow based vs. demand based energy-water modelling

    Science.gov (United States)

    Rozos, Evangelos; Nikolopoulos, Dionysis; Efstratiadis, Andreas; Koukouvinos, Antonios; Makropoulos, Christos

    2015-04-01

    The water flow in hydro-power generation systems is often used downstream to cover other types of demand, such as irrigation and water supply. However, the typical case is that the energy demand (operation of the hydro-power plant) and the water demand do not coincide. Furthermore, the water inflow into a reservoir is a stochastic process. Things become more complicated if renewable resources (wind turbines or photovoltaic panels) are included in the system. For these reasons, the assessment and optimization of the operation of hydro-power systems are challenging tasks that require computer modelling. This modelling should not only simulate the water budget of the reservoirs and the energy production/consumption (pumped storage), but should also take into account the constraints imposed by the natural or artificial water network using a flow routing algorithm. HYDRONOMEAS, for example, uses an elegant mathematical approach (a digraph) to calculate the flow in a water network based on: the demands (input time series), the water availability (simulated) and the capacity of the transmission components (properties of channels, rivers, pipes, etc.). The input time series of demand should be estimated by another model and linked to the corresponding network nodes. A model that could be used to estimate these time series is UWOT, a bottom-up urban water cycle model that simulates the generation, aggregation and routing of water demand signals. In this study, we explore the potential of UWOT in simulating the operation of complex hydrosystems that include energy generation. The evident advantage of this approach is the use of a single model instead of one model for the estimation of demands and another for the system simulation. An application of UWOT to a large-scale system is attempted in mainland Greece, in an area extending over 130×170 km². The challenges, the peculiarities and the advantages of this approach are examined and critically discussed.

  5. Grid based calibration of SWAT hydrological models

    Directory of Open Access Journals (Sweden)

    D. Gorgan

    2012-07-01

    The calibration and execution of large hydrological models such as SWAT (soil and water assessment tool), developed for large areas, high resolution and huge input data, require not only long execution times but also high computation resources. The SWAT hydrological model supports studies and predictions of the impact of land management practices on water, sediment, and agricultural chemical yields in complex watersheds. The paper presents the gSWAT application as a practical web-based solution for environmental specialists to calibrate extensive hydrological models and to run scenarios, by hiding the complex control of processes and heterogeneous resources across the grid-based high-computation infrastructure. The paper highlights the basic functionalities of the gSWAT platform and the features of the graphical user interface. The presentation is concerned with the development of working sessions, interactive control of calibration, direct and basic editing of parameters, process monitoring, and graphical and interactive visualization of the results. The experiments performed on different SWAT models and the obtained results demonstrate the benefits brought by the grid parallel and distributed environment as a solution for the processing platform. All the instances of SWAT models used in the reported experiments have been developed through the enviroGRIDS project, targeting the Black Sea catchment area.

  6. Physics-based models of the plasmasphere

    Energy Technology Data Exchange (ETDEWEB)

    Jordanova, Vania K [Los Alamos National Laboratory; Pierrard, Vivane [BELGIUM; Goldstein, Jerry [SWRI; Andr' e, Nicolas [ESTEC/ESA; Kotova, Galina A [SRI, RUSSIA; Lemaire, Joseph F [BELGIUM; Liemohn, Mike W [U OF MICHIGAN; Matsui, H [UNIV OF NEW HAMPSHIRE

    2008-01-01

    We describe recent progress in physics-based models of the plasmasphere using the fluid and the kinetic approaches. Global modeling of the dynamics and influence of the plasmasphere is presented. Results from global plasmasphere simulations are used to understand and quantify (i) the electric potential pattern and evolution during geomagnetic storms, and (ii) the influence of the plasmasphere on the excitation of electromagnetic ion cyclotron (EMIC) waves and precipitation of energetic ions in the inner magnetosphere. The interactions of the plasmasphere with the ionosphere and the other regions of the magnetosphere are pointed out. We show the results of simulations for the formation of the plasmapause and discuss the influence of plasmaspheric wind and of ultra low frequency (ULF) waves for transport of plasmaspheric material. Theoretical formulations used to model the electric field and plasma distribution in the plasmasphere are given. Model predictions are compared to recent CLUSTER and IMAGE observations, but also to results of earlier models and satellite observations.

  7. Model based systems engineering for astronomical projects

    Science.gov (United States)

    Karban, R.; Andolfato, L.; Bristow, P.; Chiozzi, G.; Esselborn, M.; Schilling, M.; Schmid, C.; Sommer, H.; Zamparelli, M.

    2014-08-01

    Model Based Systems Engineering (MBSE) is an emerging field of systems engineering for which the System Modeling Language (SysML) is a key enabler for descriptive, prescriptive and predictive models. This paper surveys some of the capabilities, expectations and peculiarities of tools-assisted MBSE experienced in real-life astronomical projects. The examples range in depth and scope across a wide spectrum of applications (for example documentation, requirements, analysis, trade studies) and purposes (addressing a particular development need, or accompanying a project throughout many - if not all - its lifecycle phases, fostering reuse and minimizing ambiguity). From the beginnings of the Active Phasing Experiment, through VLT instrumentation, VLTI infrastructure, Telescope Control System for the E-ELT, until Wavefront Control for the E-ELT, we show how stepwise refinements of tools, processes and methods have provided tangible benefits to customary system engineering activities like requirement flow-down, design trade studies, interfaces definition, and validation, by means of a variety of approaches (like Model Checking, Simulation, Model Transformation) and methodologies (like OOSEM, State Analysis)

  8. Modeling oil production based on symbolic regression

    International Nuclear Information System (INIS)

    Numerous models have been proposed to forecast the future trends of oil production and almost all of them are based on some predefined assumptions with various uncertainties. In this study, we propose a novel data-driven approach that uses symbolic regression to model oil production. We validate our approach on both synthetic and real data, and the results prove that symbolic regression could effectively identify the true models beneath the oil production data and also make reliable predictions. Symbolic regression indicates that world oil production will peak in 2021, which broadly agrees with other techniques used by researchers. Our results also show that the rate of decline after the peak is almost half the rate of increase before the peak, and it takes nearly 12 years to drop 4% from the peak. These predictions are more optimistic than those in several other reports, and the smoother decline will provide the world, especially the developing countries, with more time to orchestrate mitigation plans. -- Highlights: •A data-driven approach has been shown to be effective at modeling the oil production. •The Hubbert model could be discovered automatically from data. •The peak of world oil production is predicted to appear in 2021. •The decline rate after peak is half of the increase rate before peak. •Oil production projected to decline 4% post-peak
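
    The symbolic-regression machinery itself needs a genetic-programming library, but the end point reported above, a Hubbert (logistic-derivative) production curve, can be illustrated with an ordinary least-squares fit. The data and parameters below are synthetic, not the study's:

        import numpy as np
        from scipy.optimize import curve_fit

        def hubbert(t, q_max, t_peak, width):
            # Logistic-derivative curve, normalized so peak production = q_max.
            x = np.exp(-(t - t_peak) / width)
            return 4.0 * q_max * x / (1.0 + x) ** 2

        years = np.arange(1950, 2020)
        np.random.seed(0)
        data = hubbert(years, 30.0, 2021.0, 15.0) + np.random.normal(0.0, 0.5, years.size)

        params, _ = curve_fit(hubbert, years, data, p0=(25.0, 2000.0, 10.0))
        print("estimated peak year: %.1f" % params[1])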

  9. A Comparison of Filter-based Approaches for Model-based Prognostics

    Data.gov (United States)

    National Aeronautics and Space Administration — Model-based prognostics approaches use domain knowledge about a system and its failure modes through the use of physics-based models. Model-based prognosis is...

  10. Effect of species rarity on the accuracy of species distribution models for reptiles and amphibians in southern California

    Science.gov (United States)

    Franklin, J.; Wejnert, K.E.; Hathaway, S.A.; Rochester, C.J.; Fisher, R.N.

    2009-01-01

    Aim: Several studies have found that more accurate predictive models of species' occurrences can be developed for rarer species; however, one recent study found the relationship between range size and model performance to be an artefact of sample prevalence, that is, the proportion of presence versus absence observations in the data used to train the model. We examined the effect of model type, species rarity class, species' survey frequency, detectability and manipulated sample prevalence on the accuracy of distribution models developed for 30 reptile and amphibian species. Location: Coastal southern California, USA. Methods: Classification trees, generalized additive models and generalized linear models were developed using species presence and absence data from 420 locations. Model performance was measured using sensitivity, specificity and the area under the curve (AUC) of the receiver-operating characteristic (ROC) plot based on twofold cross-validation, or on bootstrapping. Predictors included climate, terrain, soil and vegetation variables. Species were assigned to rarity classes by experts. The data were sampled to generate subsets with varying ratios of presences and absences to test for the effect of sample prevalence. Join count statistics were used to characterize spatial dependence in the prediction errors. Results: Species in classes with higher rarity were more accurately predicted than common species, and this effect was independent of sample prevalence. Although positive spatial autocorrelation remained in the prediction errors, it was weaker than was observed in the species occurrence data. The differences in accuracy among model types were slight. Main conclusions: Using a variety of modelling methods, more accurate species distribution models were developed for rarer than for more common species. This was presumably because it is difficult to discriminate suitable from unsuitable habitat for habitat generalists, and not as an artefact of the
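
    The evaluation design (a classification tree scored by twofold cross-validated ROC AUC) is easy to re-create in outline. The sketch below uses synthetic presence/absence data with a chosen sample prevalence in place of the herpetofauna survey data; all settings are illustrative:

        from sklearn.datasets import make_classification
        from sklearn.model_selection import cross_val_score
        from sklearn.tree import DecisionTreeClassifier

        # weights controls sample prevalence (here 20% presences, a "rarer" species).
        X, y = make_classification(n_samples=420, n_features=8, weights=[0.8],
                                   random_state=0)
        auc = cross_val_score(DecisionTreeClassifier(max_depth=4, random_state=0),
                              X, y, cv=2, scoring="roc_auc")
        print("twofold AUC: %.3f +/- %.3f" % (auc.mean(), auc.std()))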

  11. Model-based Prognostics with Concurrent Damage Progression Processes

    Data.gov (United States)

    National Aeronautics and Space Administration — Model-based prognostics approaches rely on physics-based models that describe the behavior of systems and their components. These models must account for the...

  12. Model-based phase-shifting interferometer

    Science.gov (United States)

    Liu, Dong; Zhang, Lei; Shi, Tu; Yang, Yongying; Chong, Shiyao; Miao, Liang; Huang, Wei; Shen, Yibing; Bai, Jian

    2015-10-01

    A model-based phase-shifting interferometer (MPI) is developed, in which a novel calculation technique is proposed instead of the traditionally complicated system structure, to achieve versatile, high-precision, quantitative surface tests. In the MPI, a partial null lens (PNL) is employed to implement the non-null test. With a set of alternative PNLs, similar to the transmission spheres in ZYGO interferometers, the MPI provides a flexible test for general spherical and aspherical surfaces. Based on modern computer modeling techniques, a reverse iterative optimizing construction (ROR) method is employed for the retrace error correction of the non-null test, as well as for figure error reconstruction. A self-compiled ray-tracing program is set up for accurate system modeling and reverse ray tracing. The surface figure error can then be easily extracted from the wavefront data in the form of Zernike polynomials by the ROR method. Experiments on spherical and aspherical tests are presented to validate the flexibility and accuracy. The test results are compared with those of a Zygo interferometer (null tests), which demonstrates the high accuracy of the MPI. With such accuracy and flexibility, the MPI possesses large potential in modern optical shop testing.

  13. Human physiologically based pharmacokinetic model for propofol

    Directory of Open Access Journals (Sweden)

    Schnider Thomas W

    2005-04-01

    Abstract Background: Propofol is widely used for both short-term anesthesia and long-term sedation. It has unusual pharmacokinetics because of its high lipid solubility. The standard approach to describing the pharmacokinetics is by a multi-compartmental model. This paper presents the first detailed human physiologically based pharmacokinetic (PBPK) model for propofol. Methods: PKQuest, a freely distributed software routine (http://www.pkquest.com), was used for all the calculations. The "standard human" PBPK parameters developed in previous applications are used. It is assumed that the blood and tissue binding is determined by simple partition into the tissue lipid, which is characterized by two previously determined sets of parameters: (1) the value of the propofol oil/water partition coefficient; (2) the lipid fraction in the blood and tissues. The model was fit to the individual experimental data of Schnider et al. (Anesthesiology, 1998; 88:1170), in which an initial bolus dose was followed 60 minutes later by a one-hour constant infusion. Results: The PBPK model provides a good description of the experimental data over a large range of input dosage, subject age and fat fraction. Only one adjustable parameter (the liver clearance) is required to describe the constant infusion phase for each individual subject. In order to fit the bolus injection phase, for 10 of the 24 subjects it was necessary to assume that a fraction of the bolus dose was sequestered and then slowly released from the lungs (characterized by two additional parameters). The average weighted residual error (WRE) of the PBPK model fit to both the bolus and infusion phases was 15%, similar to the WRE for just the constant infusion phase obtained by Schnider et al. using a 6-parameter NONMEM compartmental model. Conclusion: A PBPK model using standard human parameters and a simple description of tissue binding provides a good description of human propofol kinetics. The major advantage of a
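
    A full PBPK model resolves individual tissues, but the bolus-plus-infusion dosing pattern described in the Methods can be illustrated with a generic two-compartment sketch. All rate constants and doses below are invented, not the study's:

        from scipy.integrate import solve_ivp

        k10, k12, k21 = 0.12, 0.06, 0.04   # elimination / exchange rates (1/min)
        infusion = lambda t: 10.0 if 60.0 <= t <= 120.0 else 0.0  # mg/min

        def rhs(t, y):
            c, p = y                        # central and peripheral amounts (mg)
            dc = infusion(t) - (k10 + k12) * c + k21 * p
            dp = k12 * c - k21 * p
            return [dc, dp]

        # 200 mg bolus at t=0, then a one-hour infusion starting at t=60 min.
        sol = solve_ivp(rhs, (0.0, 180.0), y0=[200.0, 0.0], max_step=1.0)
        print("central amount at t=180 min: %.1f mg" % sol.y[0, -1])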

  14. New global ICT-based business models

    DEFF Research Database (Denmark)

    The New Global Business Model (NEWGIBM) book describes the background, theory references, case studies, results and learning imparted by the NEWGIBM project, which is supported by ICT, to a research group during the period from 2005-2011. The book is a result of the efforts and the collaborative work of the project partners. It serves as a part of the final evaluation and documentation of the NEWGIBM project and is supported by results from the following projects: M-commerce, Global Innovation, Global Ebusiness & M-commerce, The Blue Ocean project, International Center for Innovation and Women in Business, and NEFFICS. Topics covered include: What Do The NEWGIBM Cases Show?; The Strategy Concept in Light of the Increased Importance of Innovative Business Models; Successful Implementation of Global BM Innovation; and Globalisation Of ICT Based Business Models: Today And In 2020.

  15. Trip Generation Model Based on Destination Attractiveness

    Institute of Scientific and Technical Information of China (English)

    YAO Liya; GUAN Hongzhi; YAN Hai

    2008-01-01

    Traditional trip generation forecasting methods use unified average trip generation rates to determine trip generation volumes in various traffic zones without considering the individual characteristics of each traffic zone. Therefore, the results can have significant errors. To reduce the forecasting error produced by uniform trip generation rates for different traffic zones, the behavior of each traveler was studied instead of the characteristics of the traffic zone. This paper gives a method for calculating the trip efficiency and the effect of traffic zones combined with a destination selection model based on disaggregate theory for trip generation. Beijing data is used with the trip generation method to predict trip volumes. The results show that the disaggregate model in this paper is more accurate than the traditional method. An analysis of the factors influencing traveler behavior and destination selection shows that the attractiveness of the traffic zone strongly affects the trip generation volume.

  16. Cloth Modeling Based on Particle System

    Institute of Scientific and Technical Information of China (English)

    钟跃崎; 王善元

    2001-01-01

    A physics-based particle system is employed for cloth modeling, supported by two basic algorithms: one constructs the internal and external forces acting on the particle system in terms of KES-F bending and shearing tests; the other handles collisions, with collision detection carried out by bisection of the time step and collision response handled according to the empirical law for frictionless collision. With these algorithms, the geometric state of the particles can be expressed as ordinary differential equations, which are numerically solved by fourth-order Runge-Kutta integration. Different draping figures of cotton fabric and wool fabric prove that such a particle system is suitable for 3D cloth modeling and simulation.
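
    The numerical core described above, a fourth-order Runge-Kutta step advancing the particle-state ODE, can be sketched as follows; a single damped spring stands in for the KES-F-derived cloth forces, and all constants are illustrative:

        import numpy as np

        k, c, m = 50.0, 0.5, 0.01          # stiffness, damping, particle mass

        def deriv(state):
            x, v = state                   # displacement and velocity
            return np.array([v, (-k * x - c * v) / m])

        def rk4_step(state, dt):
            k1 = deriv(state)
            k2 = deriv(state + 0.5 * dt * k1)
            k3 = deriv(state + 0.5 * dt * k2)
            k4 = deriv(state + dt * k3)
            return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

        state = np.array([0.01, 0.0])      # initial displacement (m), velocity (m/s)
        for _ in range(1000):
            state = rk4_step(state, 1e-4)
        print("displacement after 0.1 s: %.5f m" % state[0])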

  17. Adaptive Model-Based Mammogram Enhancement

    Czech Academy of Sciences Publication Activity Database

    Haindl, Michal; Remeš, Václav

    Los Alamitos, USA: IEEE Computer Society CPS, 2014 - (Yetongno, K.; Dipanda, A.; Chbeir, R.), s. 65-72 ISBN 978-1-4799-7978-3. [Tenth International Conference on Signal-Image Technology & Internet-Based Systems (SITIS 2014). Marrakech (MA), 23.11.2014-27.11.2014] R&D Projects: GA ČR(CZ) GA14-10911S Institutional support: RVO:67985556 Keywords : mammography * image enhancement * MRF * textural models Subject RIV: BD - Theory of Information http://library.utia.cas.cz/separaty/2014/RO/haindl-0436549.pdf

  18. Agent-based modeling and simulation

    CERN Document Server

    Taylor, Simon

    2014-01-01

    Operational Research (OR) deals with the use of advanced analytical methods to support better decision-making. It is multidisciplinary with strong links to management science, decision science, computer science and many application areas such as engineering, manufacturing, commerce and healthcare. In the study of emergent behaviour in complex adaptive systems, Agent-based Modelling & Simulation (ABMS) is being used in many different domains such as healthcare, energy, evacuation, commerce, manufacturing and defense. This collection of articles presents a convenient introduction to ABMS with pa

  19. SAT-Based Model Checking without Unrolling

    Science.gov (United States)

    Bradley, Aaron R.

    A new form of SAT-based symbolic model checking is described. Instead of unrolling the transition relation, it incrementally generates clauses that are inductive relative to (and augment) stepwise approximate reachability information. In this way, the algorithm gradually refines the property, eventually producing either an inductive strengthening of the property or a counterexample trace. Our experimental studies show that induction is a powerful tool for generalizing the unreachability of given error states: it can refine away many states at once, and it is effective at focusing the proof search on aspects of the transition system relevant to the property. Furthermore, the incremental structure of the algorithm lends itself to a parallel implementation.

  20. Flow Modeling Based Wall Element Technique

    Directory of Open Access Journals (Sweden)

    Sabah Tamimi

    2012-08-01

    Two types of flow were examined: pressure-driven flow, and a combination of pressure-driven and Couette flow, for confined turbulent flow in a smooth straight channel. A one-equation model was used to depict the turbulent viscosity of the confined flow, and a finite element technique based on a zone close to a solid wall was adopted for predicting the distribution of the pertinent variables in this zone, including the case where the near-wall zone was extended away from the wall. The imposed technique has been validated and compares well with other techniques.

  1. Knowledge-based geometric modeling in construction

    DEFF Research Database (Denmark)

    Bonev, Martin; Hvam, Lars

    2012-01-01

    A considerably high amount of their resources is required for designing and specifying the majority of their product assortment. As design decisions are hereby based on knowledge and experience about the behaviour and applicability of construction techniques and materials for a predefined design situation, smart tools need to be developed to support these activities. In order to achieve a higher degree of design automation, this study proposes a framework for using configuration systems within the CAD environment together with suitable geometric modeling techniques, on the example of a Danish manufacturer ...

  2. Compositional Testing for FSM-Based Models

    Directory of Open Access Journals (Sweden)

    Bilal Kanso

    2014-06-01

    The contribution of this paper is threefold: first, it defines a framework for modelling component-based systems, as well as a formalization of integration rules to combine their behaviour; this is based on finite state machines (FSM). Second, it studies compositional conformance testing, i.e. checking whether an implementation made of conforming components combined with integration operators conforms to its specification. Third, it shows that the correctness of the global system can be established by testing its components against the projection of the global specification onto the specifications of those components. This result is useful for building adequate test purposes for testing components while taking into account the system into which they are plugged.

  3. Agent Based Modeling in Public Administration

    Directory of Open Access Journals (Sweden)

    Osman SEYHAN

    2013-06-01

    This study aims to explore the role of agent-based modeling (ABM) as a simulation method in analyzing and formulating policy-making processes and modern public management, which is under the pressure of the information age and the socio-political demands of open societies. ABM is a simulative research method for understanding complex adaptive systems (CAS) from the perspective of their constituent entities. In this study, employing agent-based computing and the NetLogo language, two case studies, on organizational design and organizational risk analysis, have been examined. Results revealed that ABM is an efficient platform for determining the optimum results from various scenarios, in order to understand the structures and processes of policy making in both organizational design and risk management. In the future, more research is needed on the role of ABM in understanding, and making decisions about, the future of complex adaptive systems, especially in conjunction with developments in computer technologies.

  4. Integrated Semantic Similarity Model Based on Ontology

    Institute of Scientific and Technical Information of China (English)

    LIU Ya-Jun; ZHAO Yun

    2004-01-01

    To solve the problem of the inadequacy of semantic processing in intelligent question answering systems, an integrated semantic similarity model which calculates the semantic similarity using geometric distance and information content is presented in this paper. With the help of the interrelationships between concepts, the information content of concepts and the strength of the edges in the ontology network, we can calculate the semantic similarity between two concepts and provide information for the further calculation of the semantic similarity between a user's question and the answers in the knowledge base. The results of the experiments on the prototype have shown that the semantic problem in natural language processing can also be solved with the help of the knowledge and the abundant semantic information in an ontology. More than 90% accuracy, with less than 50 ms average searching time, has been reached in the intelligent question answering prototype system based on ontology. The results are very satisfactory.

  5. Quantum Mechanics Based Multiscale Modeling of Materials

    Science.gov (United States)

    Lu, Gang

    2013-03-01

    We present two quantum mechanics based multiscale approaches that can simulate extended defects in metals accurately and efficiently. The first approach (QCDFT) can treat multimillion atoms effectively via density functional theory (DFT). The method is an extension of the original quasicontinuum approach with DFT as its sole energetic formulation. The second method (QM/MM) concerns quantum mechanics/molecular mechanics coupling based on constrained density functional theory, which provides an exact framework for self-consistent quantum mechanical embedding. Several important materials problems will be addressed using these multiscale modeling approaches, including hydrogen-assisted cracking in Al, magnetism-controlled dislocation properties in Fe, and Si pipe diffusion along the Al dislocation core. We acknowledge the support of the Office of Naval Research and the Army Research Office.

  6. Python-Based Applications for Hydrogeological Modeling

    Science.gov (United States)

    Khambhammettu, P.

    2013-12-01

    Python is a general-purpose, high-level programming language whose design philosophy emphasizes code readability. Add-on packages supporting fast array computation (numpy), plotting (matplotlib), and scientific/mathematical functions (scipy) have resulted in a powerful ecosystem for scientists interested in exploratory data analysis, high-performance computing and data visualization. Three examples are provided to demonstrate the applicability of the Python environment in hydrogeological applications. Python programs were used to model an aquifer test and estimate aquifer parameters at a Superfund site. The aquifer test conducted at a Groundwater Circulation Well was modeled with the Python/FORTRAN-based TTIM Analytic Element Code. The aquifer parameters were estimated with PEST such that a good match was produced between the simulated and observed drawdowns. Python scripts were written to interface with PEST and visualize the results. A convolution-based approach was used to estimate source concentration histories based on observed concentrations at receptor locations. Unit Response Functions (URFs) that relate the receptor concentrations to a unit release at the source were derived with the ATRANS code. The impact of any releases at the source could then be estimated by convolving the source release history with the URFs. Python scripts were written to compute and visualize receptor concentrations for user-specified source histories. The framework provided a simple and elegant way to test various hypotheses about the site. A Python/FORTRAN-based program, TYPECURVEGRID-Py, was developed to compute and visualize groundwater elevations and drawdown through time in response to a regional uniform hydraulic gradient and the influence of pumping wells, using either the Theis solution for a fully-confined aquifer or the Hantush-Jacob solution for a leaky confined aquifer. The program supports an arbitrary number of wells that can operate according to arbitrary schedules. The
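
    The convolution step described above is a one-liner in numpy; the unit response function below is a made-up decaying pulse standing in for an ATRANS-derived URF:

        import numpy as np

        months = 120
        urf = np.exp(-np.arange(months) / 24.0) * 0.01   # response to a unit release
        release = np.zeros(months)
        release[12:36] = 5.0                             # source active, months 12-35

        # Receptor concentration = source history convolved with the URF.
        concentration = np.convolve(release, urf)[:months]
        print("peak receptor concentration: %.3f" % concentration.max())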

  7. Model based control of refrigeration systems

    Energy Technology Data Exchange (ETDEWEB)

    Sloth Larsen, L.F.

    2005-11-15

    The subject of this Ph.D. thesis is model-based control of refrigeration systems. Model-based control covers a variety of control types that incorporate mathematical models; in this thesis the main subject has therefore been restricted to system-optimizing control. The optimizing control is divided into two layers, where the system-oriented top layer deals with set-point optimizing control and the lower layer deals with dynamical optimizing control in the subsystems. The thesis has two main contributions, i.e. a novel approach for set-point optimization and a novel approach for desynchronization based on dynamical optimization. The focus in the development of the proposed set-point optimizing control has been on deriving a simple and general method that can easily be applied to various compositions of the same class of systems, such as refrigeration systems. The method is based on a set of parameter-dependent static equations describing the considered process. By adapting the parameters to the given process, predicting the steady state and computing a steady-state gradient of the cost function, the process can be driven continuously towards zero gradient, i.e. the optimum (if the cost function is convex). The method furthermore deals with system constraints by introducing barrier functions, whereby the best possible performance taking the given constraints into account can be obtained, e.g. under extreme operational conditions. The proposed method has been applied to a test refrigeration system, placed at Aalborg University, for minimization of the energy consumption. Here it was proved that by using general static parameter-dependent system equations it was possible to drive the set-points close to the optimum and thus reduce the power consumption by up to 20%. In the dynamical optimizing layer the idea is to optimize the operation of the subsystems, or groupings of subsystems, that limit the obtainable system performance. In systems
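
    A minimal sketch of the set-point optimizing idea: steepest descent towards a zero steady-state gradient, with a logarithmic barrier enforcing a constraint. The cost function and bound below are invented stand-ins for the adapted static process equations:

        import numpy as np

        cost = lambda x: (x - 3.0) ** 2             # steady-state power cost (toy)
        barrier = lambda x: -0.1 * np.log(2.5 - x)  # keeps the set-point x < 2.5

        def grad(f, x, h=1e-6):
            """Central-difference gradient of a scalar function."""
            return (f(x + h) - f(x - h)) / (2 * h)

        x = 1.0                                     # initial set-point
        for _ in range(200):
            x -= 0.05 * (grad(cost, x) + grad(barrier, x))
        print("set-point driven to x = %.3f (constraint: x < 2.5)" % x)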

  8. Biologically based multistage modeling of radiation effects

    Energy Technology Data Exchange (ETDEWEB)

    William Hazelton; Suresh Moolgavkar; E. Georg Luebeck

    2005-08-30

    This past year we have made substantial progress in modeling the contribution of homeostatic regulation to low-dose radiation effects and carcinogenesis. We have worked to refine and apply our multistage carcinogenesis models to explicitly incorporate cell cycle states, simple and complex damage, checkpoint delay, slow and fast repair, differentiation, and apoptosis to study the effects of low-dose ionizing radiation in mouse intestinal crypts, as well as in other tissues. We have one paper accepted for publication in "Advances in Space Research", and another manuscript in preparation describing this work. I also wrote a chapter describing our combined cell-cycle and multistage carcinogenesis model that will be published in a book on stochastic carcinogenesis models edited by Wei-Yuan Tan. In addition, we organized and held a workshop on "Biologically Based Modeling of Human Health Effects of Low dose Ionizing Radiation", July 28-29, 2005 at Fred Hutchinson Cancer Research Center in Seattle, Washington. We had over 20 participants, including Mary Helen Barcellos-Hoff as keynote speaker, talks by most of the low-dose modelers in the DOE low-dose program, experimentalists including Les Redpath (and Mary Helen), Noelle Metting from DOE, and Tony Brooks. It appears that homeostatic regulation may be central to understanding low-dose radiation phenomena. The primary effects of ionizing radiation (IR) are cell killing, delayed cell cycling, and induction of mutations. However, homeostatic regulation causes cells that are killed or damaged by IR to eventually be replaced. Cells with an initiating mutation may have a replacement advantage, leading to clonal expansion of these initiated cells. Thus we have focused particularly on modeling effects that disturb homeostatic regulation as early steps in the carcinogenic process. There are two primary considerations that support our focus on homeostatic regulation. First, a number of

  9. GLDAS Land Surface Models based Aridity Indices

    Science.gov (United States)

    Pande, S.; Ghazanfari, S.

    2011-12-01

    Identification of dryland areas is crucial to guide policy aimed at intervening in water-stressed areas and addressing their perennial livelihood or food insecurity. Aridity indices based on spatially relative soil moisture conditions, such as the NCEP aridity index, allow cross comparison of dry conditions between sites. The NCEP aridity index is based on the ratio of annual precipitation (supply) to annual potential evaporation (demand). Such an index ignores subannual-scale competition between evaporation and drainage functions as well as rainfall and temperature regimes, which determine the partitioning of the annual precipitation supply between the two competing demands of evaporation and runoff. We here introduce aridity indices based on these additional considerations, using soil moisture time series for the past three decades from three Land Surface Models (LSMs), and compare them with the NCEP index. We analyze global monthly soil moisture time series (385 months) at 1 x 1 degree spatial resolution as modeled by three GLDAS LSMs: VIC, MOSAIC and NOAH. The first eigenvector from an Empirical Orthogonal Function (EOF) analysis is extracted, as it is the most dominant spatial template of global soil moisture conditions. The frequency with which a location's value of this dominant soil moisture mode is not exceeded by other locations is calculated and used as our proposed aridity index. An area is indexed drier (relative to other areas in the world) if its frequency of nonexceedence is lower. The EOF analysis reveals that the first eigenvector explains approximately 32%, 43% and 47% of the variance explained by the first 385 eigenvectors for VIC, MOSAIC and NOAH, respectively. The temporal coefficients associated with it show seasonality for all three LSMs, with a jump in trend around the year 1999 for NOAH and MOSAIC. The VIC aridity index displays the pattern most closely resembling that of NCEP, though all LSM-based indices isolate dominant dryland areas. However, all three LSMs identify some parts of
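
    The two computational steps, extracting the leading EOF and ranking locations by nonexceedence frequency, can be sketched on synthetic data. The rank-based index below is our toy reading of the nonexceedence frequency, not the authors' exact computation:

        import numpy as np

        rng = np.random.default_rng(0)
        sm = rng.random((385, 50))                 # 385 months x 50 grid cells (toy)

        anom = sm - sm.mean(axis=0)                # remove the temporal mean
        u, s, vt = np.linalg.svd(anom, full_matrices=False)
        eof1 = vt[0]                               # dominant spatial pattern
        # (EOF sign is arbitrary; a real analysis would fix the orientation.)
        explained = s[0] ** 2 / np.sum(s ** 2)
        print("variance explained by EOF1: %.1f%%" % (100 * explained))

        # Aridity index: fraction of other cells whose EOF1 loading does not
        # exceed this cell's loading; lower values = relatively drier.
        index = np.argsort(np.argsort(eof1)) / (eof1.size - 1)
        print("driest cell (toy data):", index.argmin())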

  10. Models-Based Practice: Great White Hope or White Elephant?

    Science.gov (United States)

    Casey, Ashley

    2014-01-01

    Background: Many critical curriculum theorists in physical education have advocated a model- or models-based approach to teaching in the subject. This paper explores the literature base around models-based practice (MBP) and asks if this multi-models approach to curriculum planning has the potential to be the great white hope of pedagogical change…

  11. Evaluating face trustworthiness: a model based approach

    Science.gov (United States)

    Baron, Sean G.; Oosterhof, Nikolaas N.

    2008-01-01

    Judgments of trustworthiness from faces determine basic approach/avoidance responses and approximate the valence evaluation of faces that runs across multiple person judgments. Here, based on trustworthiness judgments and using a computer model for face representation, we built a model for representing face trustworthiness (study 1). Using this model, we generated novel faces with an increased range of trustworthiness and used these faces as stimuli in a functional Magnetic Resonance Imaging study (study 2). Although participants did not engage in explicit evaluation of the faces, the amygdala response changed as a function of face trustworthiness. An area in the right amygdala showed a negative linear response—as the untrustworthiness of faces increased so did the amygdala response. Areas in the left and right putamen, the latter area extended into the anterior insula, showed a similar negative linear response. The response in the left amygdala was quadratic—strongest for faces on both extremes of the trustworthiness dimension. The medial prefrontal cortex and precuneus also showed a quadratic response, but their response was strongest to faces in the middle range of the trustworthiness dimension. PMID:19015102

  12. Hazard identification based on plant functional modelling

    International Nuclear Information System (INIS)

    A major objective of the present work is to provide means for representing a process plant as a socio-technical system, so as to allow hazard identification at a high level. The method includes technical, human and organisational aspects and is intended to be used for plant level hazard identification so as to identify critical areas and the need for further analysis using existing methods. The first part of the method is the preparation of a plant functional model where a set of plant functions link together hardware, software, operations, work organisation and other safety related aspects of the plant. The basic principle of the functional modelling is that any aspect of the plant can be represented by an object (in the sense that this term is used in computer science) based upon an Intent (or goal); associated with each Intent are Methods, by which the Intent is realized, and Constraints, which limit the Intent. The Methods and Constraints can themselves be treated as objects and decomposed into lower-level Intents (hence the procedure is known as functional decomposition) so giving rise to a hierarchical, object-oriented structure. The plant level hazard identification is carried out on the plant functional model using the Concept Hazard Analysis method. In this, the user will be supported by checklists and keywords and the analysis is structured by pre-defined worksheets. The preparation of the plant functional model and the performance of the hazard identification can be carried out manually or with computer support. (au) (4 tabs., 10 ills., 7 refs.)

  13. Agent-based models of financial markets

    International Nuclear Information System (INIS)

    This review deals with several microscopic ('agent-based') models of financial markets which have been studied by economists and physicists over the last decade: Kim-Markowitz, Levy-Levy-Solomon, Cont-Bouchaud, Solomon-Weisbuch, Lux-Marchesi, Donangelo-Sneppen and Solomon-Levy-Huang. After an overview of simulation approaches in financial economics, we first give a summary of the Donangelo-Sneppen model of monetary exchange and compare it with related models in economics literature. Our selective review then outlines the main ingredients of some influential early models of multi-agent dynamics in financial markets (Kim-Markowitz, Levy-Levy-Solomon). As will be seen, these contributions draw their inspiration from the complex appearance of investors' interactions in real-life markets. Their main aim is to reproduce (and, thereby, provide possible explanations) for the spectacular bubbles and crashes seen in certain historical episodes, but they lack (like almost all the work before 1998 or so) a perspective in terms of the universal statistical features of financial time series. In fact, awareness of a set of such regularities (power-law tails of the distribution of returns, temporal scaling of volatility) only gradually appeared over the nineties. With the more precise description of the formerly relatively vague characteristics (e.g. moving from the notion of fat tails to the more concrete one of a power law with index around three), it became clear that financial market dynamics give rise to some kind of universal scaling law. Showing similarities with scaling laws for other systems with many interacting sub-units, an exploration of financial markets as multi-agent systems appeared to be a natural consequence. This topic has been pursued by quite a number of contributions appearing in both the physics and economics literature since the late nineties. From the wealth of different flavours of multi-agent models that have appeared up to now, we discuss the Cont

  14. Agent-based models of financial markets

    Science.gov (United States)

    Samanidou, E.; Zschischang, E.; Stauffer, D.; Lux, T.

    2007-03-01

    This review deals with several microscopic ('agent-based') models of financial markets which have been studied by economists and physicists over the last decade: Kim-Markowitz, Levy-Levy-Solomon, Cont-Bouchaud, Solomon-Weisbuch, Lux-Marchesi, Donangelo-Sneppen and Solomon-Levy-Huang. After an overview of simulation approaches in financial economics, we first give a summary of the Donangelo-Sneppen model of monetary exchange and compare it with related models in economics literature. Our selective review then outlines the main ingredients of some influential early models of multi-agent dynamics in financial markets (Kim-Markowitz, Levy-Levy-Solomon). As will be seen, these contributions draw their inspiration from the complex appearance of investors' interactions in real-life markets. Their main aim is to reproduce (and, thereby, provide possible explanations) for the spectacular bubbles and crashes seen in certain historical episodes, but they lack (like almost all the work before 1998 or so) a perspective in terms of the universal statistical features of financial time series. In fact, awareness of a set of such regularities (power-law tails of the distribution of returns, temporal scaling of volatility) only gradually appeared over the nineties. With the more precise description of the formerly relatively vague characteristics (e.g. moving from the notion of fat tails to the more concrete one of a power law with index around three), it became clear that financial market dynamics give rise to some kind of universal scaling law. Showing similarities with scaling laws for other systems with many interacting sub-units, an exploration of financial markets as multi-agent systems appeared to be a natural consequence. This topic has been pursued by quite a number of contributions appearing in both the physics and economics literature since the late nineties. From the wealth of different flavours of multi-agent models that have appeared up to now, we discuss the Cont

  15. Agent-based models of financial markets

    Energy Technology Data Exchange (ETDEWEB)

    Samanidou, E [Department of Economics, University of Kiel, Olshausenstrasse 40, D-24118 Kiel (Germany); Zschischang, E [HSH Nord Bank, Portfolio Mngmt. and Inv., Martensdamm 6, D-24103 Kiel (Germany); Stauffer, D [Institute for Theoretical Physics, Cologne University, D-50923 Koeln (Germany); Lux, T [Department of Economics, University of Kiel, Olshausenstrasse 40, D-24118 Kiel (Germany)

    2007-03-15

    This review deals with several microscopic ('agent-based') models of financial markets which have been studied by economists and physicists over the last decade: Kim-Markowitz, Levy-Levy-Solomon, Cont-Bouchaud, Solomon-Weisbuch, Lux-Marchesi, Donangelo-Sneppen and Solomon-Levy-Huang. After an overview of simulation approaches in financial economics, we first give a summary of the Donangelo-Sneppen model of monetary exchange and compare it with related models in economics literature. Our selective review then outlines the main ingredients of some influential early models of multi-agent dynamics in financial markets (Kim-Markowitz, Levy-Levy-Solomon). As will be seen, these contributions draw their inspiration from the complex appearance of investors' interactions in real-life markets. Their main aim is to reproduce (and, thereby, provide possible explanations) for the spectacular bubbles and crashes seen in certain historical episodes, but they lack (like almost all the work before 1998 or so) a perspective in terms of the universal statistical features of financial time series. In fact, awareness of a set of such regularities (power-law tails of the distribution of returns, temporal scaling of volatility) only gradually appeared over the nineties. With the more precise description of the formerly relatively vague characteristics (e.g. moving from the notion of fat tails to the more concrete one of a power law with index around three), it became clear that financial market dynamics give rise to some kind of universal scaling law. Showing similarities with scaling laws for other systems with many interacting sub-units, an exploration of financial markets as multi-agent systems appeared to be a natural consequence. This topic has been pursued by quite a number of contributions appearing in both the physics and economics literature since the late nineties. From the wealth of different flavours of multi-agent models that have appeared up to now, we

  16. The Use of Modeling-Based Text to Improve Students' Modeling Competencies

    Science.gov (United States)

    Jong, Jing-Ping; Chiu, Mei-Hung; Chung, Shiao-Lan

    2015-01-01

    This study investigated the effects of a modeling-based text on 10th graders' modeling competencies. Fifteen 10th graders read a researcher-developed modeling-based science text on the ideal gas law that included explicit descriptions and representations of modeling processes (i.e., model selection, model construction, model validation, model…

  17. Model Based Control of Refrigeration Systems

    DEFF Research Database (Denmark)

    Larsen, Lars Finn Sloth

    of the supermarket refrigeration systems therefore greatly relies on a human operator to detect and accommodate failures, and to optimize system performance under varying operational condition. Today these functions are maintained by monitoring centres located all over the world. Initiated by the growing need...... for automation of these procedures, that is to incorporate some "intelligence" in the control system, this project was started up. The main emphasis of this work has been on model based methods for system optimizing control in supermarket refrigeration systems. The idea of implementing a system optimizing...... optimizing the steady state operation "set-point optimizing control" and a part optimizing dynamic behaviour of the system "dynamical optimizing control". A novel approach for set-point optimization will be presented. The general idea is to use a prediction of the steady state, for computation of the cost...

  18. Model based optimization of EMC input filters

    Energy Technology Data Exchange (ETDEWEB)

    Raggl, K; Kolar, J. W. [Swiss Federal Institute of Technology, Power Electronic Systems Laboratory, Zuerich (Switzerland); Nussbaumer, T. [Levitronix GmbH, Zuerich (Switzerland)

    2008-07-01

    Input filters of power converters for compliance with regulatory electromagnetic compatibility (EMC) standards are often over-dimensioned in practice due to a non-optimal selection of number of filter stages and/or the lack of solid volumetric models of the inductor cores. This paper presents a systematic filter design approach based on a specific filter attenuation requirement and volumetric component parameters. It is shown that a minimal volume can be found for a certain optimal number of filter stages for both the differential mode (DM) and common mode (CM) filter. The considerations are carried out exemplarily for an EMC input filter of a single phase power converter for the power levels of 100 W, 300 W, and 500 W. (author)
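
    The stage-count trade-off described above can be illustrated numerically. In the hedged sketch below, an n-stage LC filter is assumed to roll off at roughly 40 dB per decade per stage, and per-stage volume is assumed inversely proportional to the corner frequency; the attenuation requirement, design frequency and volume coefficient are invented values, not the paper's fitted component database.

    ```python
    # Hedged sketch: more LC stages relax the per-stage corner frequency but
    # add component count, so total volume has a minimum at some stage count.
    import math

    A_req = 100.0   # required attenuation at f_d, in dB (assumed)
    f_d = 150e3     # design/switching frequency, Hz (assumed)
    k_vol = 1e3     # per-stage volume coefficient, cm^3 * Hz (assumed)

    def total_volume(n_stages: int) -> float:
        # An n-stage LC filter attenuates ~40*n dB/decade above its corner,
        # so the corner may sit closer to f_d as n grows.
        f_corner = f_d * 10 ** (-A_req / (40.0 * n_stages))
        return n_stages * k_vol / f_corner  # assumed per-stage volume ~ 1/f_corner

    volumes = {n: total_volume(n) for n in range(1, 10)}
    best = min(volumes, key=volumes.get)
    for n, v in volumes.items():
        print(f"{n} stage(s): {v:8.3f} cm^3")
    print(f"minimal volume at n = {best}")
    ```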

  19. Model-based control of networked systems

    CERN Document Server

    Garcia, Eloy; Montestruque, Luis A

    2014-01-01

    This monograph introduces a class of networked control systems (NCS) called model-based networked control systems (MB-NCS) and presents various architectures and control strategies designed to improve the performance of NCS. The overall performance of NCS considers the appropriate use of network resources, particularly network bandwidth, in conjunction with the desired response of the system being controlled.   The book begins with a detailed description of the basic MB-NCS architecture that provides stability conditions in terms of state feedback updates. It also covers typical problems in NCS such as network delays, network scheduling, and data quantization, as well as more general control problems such as output feedback control, nonlinear systems stabilization, and tracking control.   Key features and topics include: Time-triggered and event-triggered feedback updates Stabilization of uncertain systems subject to time delays, quantization, and extended absence of feedback Optimal control analysis and ...
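
    The core MB-NCS idea, propagating a plant model at the controller between network updates and resetting it when a measurement arrives, can be sketched in a few lines. The matrices, gains and update period below are illustrative assumptions, not taken from the monograph.

    ```python
    import numpy as np

    # Minimal discrete-time sketch of a model-based networked control loop:
    # between updates the controller propagates a model estimate x_hat;
    # every h steps the sensor transmits the true state and x_hat is reset.
    A = np.array([[1.01, 0.10], [0.00, 0.98]])      # true plant (slightly unstable)
    A_hat = np.array([[1.00, 0.10], [0.00, 0.97]])  # controller's (mismatched) model
    B = np.array([[0.0], [0.1]])
    K = np.array([[-2.0, -6.0]])                    # stabilizing state feedback

    x = np.array([[1.0], [0.0]])    # plant state
    x_hat = x.copy()                # model state held at the controller
    h = 10                          # network update period, in steps (assumed)

    for k in range(100):
        if k % h == 0:
            x_hat = x.copy()           # feedback update arrives over the network
        u = K @ x_hat                  # control computed from the model state
        x = A @ x + B @ u              # plant evolves
        x_hat = A_hat @ x_hat + B @ u  # model runs open-loop between updates

    print("final state norm:", float(np.linalg.norm(x)))
    ```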

  20. Prototype-based models in machine learning.

    Science.gov (United States)

    Biehl, Michael; Hammer, Barbara; Villmann, Thomas

    2016-01-01

    An overview is given of prototype-based models in machine learning. In this framework, observations, i.e., data, are stored in terms of typical representatives. Together with a suitable measure of similarity, the systems can be employed in the context of unsupervised and supervised analysis of potentially high-dimensional, complex datasets. We discuss basic schemes of competitive vector quantization as well as the so-called neural gas approach and Kohonen's topology-preserving self-organizing map. Supervised learning in prototype systems is exemplified in terms of learning vector quantization. Most frequently, the familiar Euclidean distance serves as a dissimilarity measure. We present extensions of the framework to nonstandard measures and give an introduction to the use of adaptive distances in relevance learning. PMID:26800334
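
    Learning vector quantization, named above as the supervised prototype scheme, can be sketched compactly: the winning prototype is attracted to same-class samples and repelled from differently labeled ones. The toy data, initialization and learning rate below are assumptions, and only the basic LVQ1 update with a Euclidean distance is shown.

    ```python
    import numpy as np

    # Minimal LVQ1 sketch on two Gaussian blobs; one prototype per class.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(3, 0.5, (50, 2))])
    y = np.array([0] * 50 + [1] * 50)

    protos = np.array([[0.5, 0.5], [2.5, 2.5]])   # initial prototypes (assumed)
    proto_labels = np.array([0, 1])
    eta = 0.05                                    # learning rate (assumed)

    for epoch in range(20):
        for xi, yi in zip(X, y):
            j = np.argmin(np.linalg.norm(protos - xi, axis=1))  # winner
            sign = 1.0 if proto_labels[j] == yi else -1.0       # attract or repel
            protos[j] += sign * eta * (xi - protos[j])

    pred = proto_labels[np.argmin(
        np.linalg.norm(X[:, None, :] - protos[None, :, :], axis=2), axis=1)]
    print("training accuracy:", (pred == y).mean())
    ```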

  1. Ecosystem Based Business Model of Smart Grid

    DEFF Research Database (Denmark)

    Lundgaard, Morten Raahauge; Ma, Zheng; Jørgensen, Bo Nørregaard

    2015-01-01

    This paper tries to investigate the ecosystem based business model in a smart grid infrastructure and the potential of value capture in the highly complex macro infrastructure such as smart grid. This paper proposes an alternative perspective to study the smart grid business ecosystem to support...... the infrastructural challenges, such as the interoperability of business components for smart grid. So far little research has explored the business ecosystem in the smart grid concept. The study on the smart grid with the theory of business ecosystem may open opportunities to understand market...... catalysts. This study contributes an understanding of business ecosystem applicable for smart grid. Smart grid infrastructure is an intricate business ecosystem, which has several intentions to deliver the value proposition and what it should be. The findings help to identify and capture value from markets....

  2. Comparing model predictions for ecosystem-based management

    DEFF Research Database (Denmark)

    Jacobsen, Nis Sand; Essington, Timothy E.; Andersen, Ken Haste

    2016-01-01

    Ecosystem modeling is becoming an integral part of fisheries management, but there is a need to identify differences between predictions derived from models employed for scientific and management purposes. Here, we compared two models: a biomass-based food-web model (Ecopath with Ecosim (EwE)) and...... predictions, underscoring the importance of incorporating knowledge of model assumptions and limitation, possibly through using model ensembles, when providing model-based scientific advice to policy makers....

  3. Validating agent based models through virtual worlds.

    Energy Technology Data Exchange (ETDEWEB)

    Lakkaraju, Kiran; Whetzel, Jonathan H.; Lee, Jina; Bier, Asmeret Brooke; Cardona-Rivera, Rogelio E.; Bernstein, Jeremy Ray Rhythm

    2014-01-01

    As the US continues its vigilance against distributed, embedded threats, understanding the political and social structure of these groups becomes paramount for predicting and disrupting their attacks. Agent-based models (ABMs) serve as a powerful tool to study these groups. While the popularity of social network tools (e.g., Facebook, Twitter) has provided extensive communication data, there is a lack of fine-grained behavioral data with which to inform and validate existing ABMs. Virtual worlds, in particular massively multiplayer online games (MMOG), where large numbers of people interact within a complex environment for long periods of time provide an alternative source of data. These environments provide a rich social environment where players engage in a variety of activities observed between real-world groups: collaborating and/or competing with other groups, conducting battles for scarce resources, and trading in a market economy. Strategies employed by player groups surprisingly reflect those seen in present-day conflicts, where players use diplomacy or espionage as their means for accomplishing their goals. In this project, we propose to address the need for fine-grained behavioral data by acquiring and analyzing game data from a commercial MMOG, referred to within this report as Game X. The goals of this research were: (1) devising toolsets for analyzing virtual world data to better inform the rules that govern a social ABM and (2) exploring how virtual worlds could serve as a source of data to validate ABMs established for analogous real-world phenomena. During this research, we studied certain patterns of group behavior to complement social modeling efforts where a significant lack of detailed examples of observed phenomena exists. This report outlines our work examining group behaviors that underlie what we have termed the Expression-To-Action (E2A) problem: determining the changes in social contact that lead individuals/groups to engage in a particular behavior

  4. Multiple Damage Progression Paths in Model-based Prognostics

    Data.gov (United States)

    National Aeronautics and Space Administration — Model-based prognostics approaches employ domain knowledge about a system, its components, and how they fail through the use of physics-based models. Component...

  5. Distributed Damage Estimation for Prognostics based on Structural Model Decomposition

    Data.gov (United States)

    National Aeronautics and Space Administration — Model-based prognostics approaches capture system knowledge in the form of physics-based models of components that include how they fail. These methods consist of...

  6. An Agent-Based Monetary Production Simulation Model

    DEFF Research Database (Denmark)

    Bruun, Charlotte

    2006-01-01

    An Agent-Based Simulation Model Programmed in Objective Borland Pascal. Program and source code is downloadable...

  7. A model-based evaluation system of enterprise

    Institute of Scientific and Technical Information of China (English)

    Yan Junwei; Ye Yang; Wang Jian

    2005-01-01

    This paper analyses the architecture of enterprise modeling, proposes indicator selection principles and indicator decomposition methods, examines the approaches to the evaluation of enterprise modeling and designs an evaluation model of AHP. Then a model-based evaluation system of enterprise is presented to effectively evaluate the business model in the framework of enterprise modeling.

  8. Embedding Task-Based Neural Models into a Connectome-Based Model of the Cerebral Cortex

    Science.gov (United States)

    Ulloa, Antonio; Horwitz, Barry

    2016-01-01

    A number of recent efforts have used large-scale, biologically realistic, neural models to help understand the neural basis for the patterns of activity observed in both resting state and task-related functional neural imaging data. An example of the former is The Virtual Brain (TVB) software platform, which allows one to apply large-scale neural modeling in a whole brain framework. TVB provides a set of structural connectomes of the human cerebral cortex, a collection of neural processing units for each connectome node, and various forward models that can convert simulated neural activity into a variety of functional brain imaging signals. In this paper, we demonstrate how to embed a previously or newly constructed task-based large-scale neural model into the TVB platform. We tested our method on a previously constructed large-scale neural model (LSNM) of visual object processing that consisted of interconnected neural populations that represent primary and secondary visual, inferotemporal, and prefrontal cortex. Some neural elements in the original model were “non-task-specific” (NS) neurons that served as noise generators to “task-specific” neurons that processed shapes during a delayed match-to-sample (DMS) task. We replaced the NS neurons with an anatomical TVB connectome model of the cerebral cortex comprising 998 regions of interest interconnected by white matter fiber tract weights. We embedded our LSNM of visual object processing into corresponding nodes within the TVB connectome. Reciprocal connections between TVB nodes and our task-based modules were included in this framework. We ran visual object processing simulations and showed that the TVB simulator successfully replaced the noise generation originally provided by NS neurons; i.e., the DMS tasks performed with the hybrid LSNM/TVB simulator generated equivalent neural and fMRI activity to that of the original task-based models. Additionally, we found partial agreement between the functional

  9. Embedding Task-Based Neural Models into a Connectome-Based Model of the Cerebral Cortex.

    Science.gov (United States)

    Ulloa, Antonio; Horwitz, Barry

    2016-01-01

    A number of recent efforts have used large-scale, biologically realistic, neural models to help understand the neural basis for the patterns of activity observed in both resting state and task-related functional neural imaging data. An example of the former is The Virtual Brain (TVB) software platform, which allows one to apply large-scale neural modeling in a whole brain framework. TVB provides a set of structural connectomes of the human cerebral cortex, a collection of neural processing units for each connectome node, and various forward models that can convert simulated neural activity into a variety of functional brain imaging signals. In this paper, we demonstrate how to embed a previously or newly constructed task-based large-scale neural model into the TVB platform. We tested our method on a previously constructed large-scale neural model (LSNM) of visual object processing that consisted of interconnected neural populations that represent primary and secondary visual, inferotemporal, and prefrontal cortex. Some neural elements in the original model were "non-task-specific" (NS) neurons that served as noise generators to "task-specific" neurons that processed shapes during a delayed match-to-sample (DMS) task. We replaced the NS neurons with an anatomical TVB connectome model of the cerebral cortex comprising 998 regions of interest interconnected by white matter fiber tract weights. We embedded our LSNM of visual object processing into corresponding nodes within the TVB connectome. Reciprocal connections between TVB nodes and our task-based modules were included in this framework. We ran visual object processing simulations and showed that the TVB simulator successfully replaced the noise generation originally provided by NS neurons; i.e., the DMS tasks performed with the hybrid LSNM/TVB simulator generated equivalent neural and fMRI activity to that of the original task-based models. Additionally, we found partial agreement between the functional

  10. Framework of Pattern Recognition Model Based on the Cognitive Psychology

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    According to the fundamental theory of the visual cognition mechanism and cognitive psychology, the visual pattern recognition model is introduced briefly. Three pattern recognition models, i.e. the template-based matching model, the prototype-based matching model and the feature-based matching model, are built and discussed separately. In addition, the influence of object background information and the visual focus point on the result of pattern recognition is also discussed, with the example of recognizing fuzzy letters and figures.

  11. Process Based Modelling of Phosphorus Losses from Arable Land

    OpenAIRE

    Ekstrand, Sam; Wallenberg, Peter; Djodjic, Faruk

    2010-01-01

    Improved understanding of temporal and spatial Phosphorus (P) discharge variations is needed for improved modelling and prioritisation of abatement strategies that take into account local conditions. This study aims to develop modelling of agricultural Phosphorus losses with improved spatial and temporal resolution, and to compare the accuracy of a detailed process-based model with a rainfall-runoff coefficient-based model. The process-based SWAT model (Soil and Water Assessment Tool)...

  12. Mutation-based Model Synthesis in Model Driven Engineering

    OpenAIRE

    Sen, Sagar; Baudry, Benoit

    2006-01-01

    With the increasing use of models for software development and the emergence of model-driven engineering, it has become important to build accurate and precise models that present certain characteristics. Model transformation testing is a domain that requires generating a large number of models that satisfy coverage properties (cover the code of the transformation or the structure of the metamodel). However, manually building a set of models to test a transformation is a tedious task and havi...

  13. Image segmentation based on adaptive mixture model

    International Nuclear Information System (INIS)

    As an important research field, image segmentation has attracted considerable attention. The classical geodesic active contour (GAC) model tends to produce fake edges in smooth regions, while the Chan–Vese (CV) model cannot effectively detect images with holes and obtain the precise boundary. To address the above issues, this paper proposes an adaptive mixture model synthesizing the GAC model and the CV model by a weight function. According to image characteristics, the proposed model can adaptively adjust the weight function. In this way, the model exploits the advantages of the GAC model in regions with rich textures or edges, while exploiting the advantages of the CV model in smooth local regions. Moreover, the proposed model is extended to vector-valued images. Through experiments, it is verified that the proposed model obtains better results than the traditional models. (paper)
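
    The weighting idea can be sketched as follows: derive a per-pixel weight from local gradient strength so that an edge-driven (GAC-type) force dominates near edges and a region-driven (CV-type) force dominates in smooth areas. The specific weight function and the placeholder force fields below are illustrative assumptions, not the paper's formulation.

    ```python
    import numpy as np

    # Hedged sketch of adaptively blending an edge-driven and a region-driven
    # evolution force via a gradient-based weight. The weight function is an
    # illustrative choice; k controls where the transition happens.
    def blended_force(image: np.ndarray, f_gac: np.ndarray, f_cv: np.ndarray,
                      k: float = 10.0) -> np.ndarray:
        gy, gx = np.gradient(image.astype(float))
        grad_mag = np.hypot(gx, gy)
        w = grad_mag / (grad_mag + k)   # ~1 near strong edges, ~0 in flat regions
        return w * f_gac + (1.0 - w) * f_cv

    # Toy usage: a step image, with placeholder unit forces for each model.
    img = np.zeros((8, 8)); img[:, 4:] = 255.0
    force = blended_force(img, f_gac=np.ones_like(img), f_cv=-np.ones_like(img))
    print(force.round(2))
    ```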

  14. Integrated Enterprise Modeling Method Based on Workflow Model and Multiviews

    Institute of Scientific and Technical Information of China (English)

    林慧苹; 范玉顺; 吴澄

    2001-01-01

    Many enterprise modeling methods are proposed to model the business process of enterprises and to implement CIM systems. But difficulties are still encountered when these methods are applied to the CIM system design and implementation. This paper proposes a new integrated enterprise modeling methodology based on the workflow model. The system architecture and the integrated modeling environment are described with a new simulation strategy. The modeling process and the relationship between the workflow model and the views are discussed.

  15. Video-Based Modeling: Differential Effects due to Treatment Protocol

    Science.gov (United States)

    Mason, Rose A.; Ganz, Jennifer B.; Parker, Richard I.; Boles, Margot B.; Davis, Heather S.; Rispoli, Mandy J.

    2013-01-01

    Identifying evidence-based practices for individuals with disabilities requires specification of procedural implementation. Video-based modeling (VBM), consisting of both video self-modeling and video modeling with others as model (VMO), is one class of interventions that has frequently been explored in the literature. However, current information…

  16. The research on Virtual Plants Growth Based on DLA Model

    Science.gov (United States)

    Zou, YunLan; Chai, Bencheng

    This article summarizes the separate evolutionary algorithm within the fractal algorithm of the Diffusion Limited Aggregation model (i.e. the DLA model) and puts forward a computer realization of virtual plant growth based on the DLA model. The method is implemented in the VB6.0 environment to achieve and verify plant growth based on the DLA model.
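
    For reference, the classic DLA procedure the abstract builds on can be sketched directly: random walkers are released and stick to the growing cluster on contact, producing the branching structures used to mimic plant growth. Grid size, walker count and the torus topology below are toy choices (the original work uses the VB6.0 environment, not Python).

    ```python
    import numpy as np

    # Classic diffusion-limited aggregation sketch on a small periodic grid.
    rng = np.random.default_rng(1)
    N = 61
    grid = np.zeros((N, N), dtype=bool)
    grid[N // 2, N // 2] = True            # seed at the centre

    steps = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    for _ in range(300):                   # number of walkers (assumed)
        x, y = rng.integers(0, N, 2)       # launch from a random cell
        while True:
            dx, dy = steps[rng.integers(4)]
            x, y = (x + dx) % N, (y + dy) % N    # periodic random walk
            neigh = [((x + i) % N, (y + j) % N) for i, j in steps]
            if any(grid[p] for p in neigh):      # touched the cluster: stick
                grid[x, y] = True
                break

    print("cluster size:", grid.sum())
    ```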

  17. MRO CTX-based Digital Terrain Models

    Science.gov (United States)

    Dumke, Alexander

    2016-04-01

    In planetary surface sciences, digital terrain models (DTM) are paramount when it comes to understanding and quantifying processes. In this contribution an approach for the derivation of digital terrain models from stereo images of the NASA Mars Reconnaissance Orbiter (MRO) Context Camera (CTX) is described. CTX consists of a 350 mm focal length telescope and 5000 CCD sensor elements and is operated as a pushbroom camera. It acquires images with ~6 m/px over a swath width of ~30 km of the Mars surface [1]. Today, several approaches for the derivation of CTX DTMs exist [e. g. 2, 3, 4]. The approach discussed here is based on established software packages and combines them with proprietary software as described below. The main processing task for the derivation of CTX stereo DTMs is based on six steps: (1) First, CTX images are radiometrically corrected using the ISIS software package [5]. (2) For selected CTX stereo images, exterior orientation data from reconstructed NAIF SPICE data are extracted [6]. (3) In the next step High Resolution Stereo Camera (HRSC) DTMs [7, 8, 9] are used for the rectification of CTX stereo images to reduce the search area during the image matching. Here, HRSC DTMs are used due to their higher spatial resolution when compared to MOLA DTMs. (4) The determination of coordinates of homologous points between stereo images, i.e. the stereo image matching process, consists of two steps: first, a cross-correlation to obtain approximate values and secondly, their use in a least-square matching (LSM) process in order to obtain subpixel positions. (5) The stereo matching results are then used to generate object points from forward ray intersections. (6) As a last step, the DTM-raster generation is performed using software developed at the German Aerospace Center, Berlin; only object points whose error is below a threshold value are used. References: [1] Malin, M. C. et al., 2007, JGR 112, doi:10.1029/2006JE002808 [2] Broxton, M. J. et al
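
    Step (4) above can be illustrated in one dimension: normalized cross-correlation yields an integer disparity, and a parabola fitted through the correlation peak refines it to sub-pixel precision (a lightweight stand-in for the full least-squares matching stage). The signals, window size and search range below are assumptions.

    ```python
    import numpy as np

    # 1-D sketch of correlation-based matching with sub-pixel refinement.
    def ncc(a: np.ndarray, b: np.ndarray) -> float:
        a = (a - a.mean()) / (a.std() + 1e-9)
        b = (b - b.mean()) / (b.std() + 1e-9)
        return float((a * b).mean())

    rng = np.random.default_rng(0)
    left = rng.normal(size=200)
    true_shift = 7
    right = np.roll(left, true_shift)      # synthetic "stereo partner"

    w, centre = 15, 100                    # window half-width, pixel of interest
    ref = left[centre - w: centre + w + 1]
    scores = [ncc(ref, right[centre + d - w: centre + d + w + 1])
              for d in range(0, 15)]       # disparity search range (assumed)

    d0 = int(np.argmax(scores))
    # Parabolic interpolation around the peak gives a sub-pixel estimate.
    if 0 < d0 < len(scores) - 1:
        y0, y1, y2 = scores[d0 - 1], scores[d0], scores[d0 + 1]
        d_sub = d0 + 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)
    else:
        d_sub = float(d0)
    print("estimated disparity:", d_sub, "(true:", true_shift, ")")
    ```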

  18. Measurement-based load modeling: Theory and application

    Institute of Scientific and Technical Information of China (English)

    MA; Jin; HAN; Dong; HE; RenMu

    2007-01-01

    The load model is one of the most important elements in power system operation and control. However, owing to its complexity, load modeling is still an open and very difficult problem. Summarizing our work on measurement-based load modeling in China for more than twenty years, this paper systematically introduces the mathematical theory and applications regarding load modeling. The flow chart and algorithms for measurement-based load modeling are presented. A composite load model structure with 13 parameters is also proposed. Analysis results based on trajectory sensitivity theory indicate the importance of the load model parameters for the identification. Case studies show the accuracy of the presented measurement-based load model. The load model thus built has been validated by field measurements all over China. Future working directions on measurement-based load modeling are also discussed in the paper.
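
    The fitting principle behind measurement-based identification can be illustrated with a far simpler static exponential load model than the paper's 13-parameter composite structure: P = P0 * (V / V0)**alpha fitted to voltage/power samples by least squares. The data below are synthetic and the model choice is an assumption for illustration only.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Identify the parameters of a static exponential load model from
    # (synthetic) voltage/power measurements.
    V0 = 1.0  # nominal voltage, per unit

    def load_model(v, p0, alpha):
        return p0 * (v / V0) ** alpha

    rng = np.random.default_rng(0)
    v_meas = rng.uniform(0.9, 1.1, 50)                          # voltages (pu)
    p_meas = load_model(v_meas, 1.5, 1.8) + rng.normal(0, 0.01, 50)

    (p0_hat, alpha_hat), _ = curve_fit(load_model, v_meas, p_meas, p0=[1.0, 1.0])
    print(f"identified P0 = {p0_hat:.3f}, alpha = {alpha_hat:.3f}")
    ```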

  19. Audio Modeling based on Delayed Sinusoids

    OpenAIRE

    Boyer, Remy; Abed-Meraim, Karim

    2004-01-01

    In this work, we present an evolution of the DDS (Damped & Delayed Sinusoidal) model introduced within the framework of the general signal modeling. This model is named the Partial Damped & Delayed Sinusoidal (PDDS) model and takes into account a single time delay parameter for a set (sum) of damped sinusoids. This modification is more consistent with the transient audio modeling problem. We show the validity of this approach by comparison with the well-known EDS (Exponentially Damped Sinu...

  20. A Bit Progress on Word—Based Language Model

    Institute of Scientific and Technical Information of China (English)

    陈勇; 陈国评

    2003-01-01

    A good language model is essential to a postprocessing algorithm for recognition systems. In the past, researchers have presented various language models, such as character-based language models, word-based language models, syntactical-rule language models, hybrid models, etc. The word N-gram model is by far an effective and efficient model, but one has to address the problem of data sparseness in establishing the model. Katz and Kneser et al. respectively presented effective remedies to solve this challenging problem. In this study, we propose an improvement to their methods by incorporating Chinese language-specific information or Chinese word class information into the system.
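
    The data-sparseness remedy can be sketched with a word-bigram model that discounts seen bigrams and backs off to unigrams, in the spirit of the Katz approach mentioned above. The real method uses Good-Turing discounting and careful normalization; the fixed discount and toy corpus here are simplifying assumptions.

    ```python
    from collections import Counter

    # Simplified bigram model with absolute-discount back-off to unigrams.
    corpus = "the cat sat on the mat the cat ate".split()
    unigrams = Counter(corpus)
    bigrams = Counter(zip(corpus, corpus[1:]))
    D = 0.5                                # fixed discount (assumed)
    N = sum(unigrams.values())

    def prob(w_prev: str, w: str) -> float:
        c_big = bigrams[(w_prev, w)]
        c_uni = unigrams[w_prev]
        if c_big > 0:                      # discounted bigram estimate
            return (c_big - D) / c_uni
        # Back off to the unigram distribution, scaled by the left-over mass.
        seen = [b for b in bigrams if b[0] == w_prev]
        leftover = D * len(seen) / c_uni if c_uni else 1.0
        return leftover * unigrams[w] / N

    print(prob("the", "cat"))   # seen bigram: discounted relative frequency
    print(prob("the", "sat"))   # unseen bigram: backed-off estimate
    ```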

  1. Research of database-based modeling for mining management system

    Institute of Scientific and Technical Information of China (English)

    WU Hai-feng; JIN Zhi-xin; BAI Xi-jun

    2005-01-01

    A method is put forward to construct simulation models automatically through database-based automatic modeling (DBAM) for mining systems. A standard simulation model linked with an open-cut automobile dispatch system was designed. The regularities among the models were analyzed and identified, and a model maker was designed to realize automatic programming of new model programs.

  2. Model-based clustering using copulas with applications

    OpenAIRE

    Kosmidis, Ioannis; Karlis, Dimitris

    2014-01-01

    The majority of model-based clustering techniques is based on multivariate Normal models and their variants. In this paper copulas are used for the construction of flexible families of models for clustering applications. The use of copulas in model-based clustering offers two direct advantages over current methods: i) the appropriate choice of copulas provides the ability to obtain a range of exotic shapes for the clusters, and ii) the explicit choice of marginal distributions for the cluster...

  3. A generic testing framework for agent-based simulation models

    OpenAIRE

    Gürcan, Önder; Dikenelli, Oguz; Bernon, Carole

    2013-01-01

    Agent-based modelling and simulation (ABMS) has received increasing attention during the last decade. However, the weak validation and verification of agent-based simulation models makes ABMS hard to trust. There is no comprehensive tool set for verification and validation of agent-based simulation models, which demonstrates that inaccuracies exist and/or reveals the existing errors in the model. Moreover, on the practical side, many ABMS frameworks are in use. In this sen...

  4. Agent Based Modelling and Simulation of Social Processes

    OpenAIRE

    Armano Srbljinovic; Ognjen Skunca

    2003-01-01

    The paper provides an introduction to agent-based modelling and simulation of social processes. The reader is introduced to the worldview underlying agent-based models, some basic terminology, basic properties of agent-based models, as well as to what one can and cannot expect from such models, particularly when they are applied to social-scientific investigation. Special attention is given to the issues of validation. Classification-ACM-1998: J.4 [Computer Applications]; Social and behavior...

  5. Model based sustainable production of biomethane

    OpenAIRE

    Biernacki, Piotr

    2014-01-01

    The main intention of this dissertation was to evaluate sustainable production of biomethane with use of mathematical modelling. To achieve this goal, widely acknowledged models like Anaerobic Digestion Model No.1 (ADM1), describing anaerobic digestion, and electrolyte Non-Random Two Liquid Model (eNRTL), for gas purification, were utilized. The experimental results, batch anaerobic digestion of different substrates and carbon dioxide solubility in 2-(Ethylamino)ethanol, were used to determin...

  6. Likelihood-Based Climate Model Evaluation

    Science.gov (United States)

    Braverman, Amy; Cressie, Noel; Teixeira, Joao

    2012-01-01

    Climate models are deterministic, mathematical descriptions of the physics of climate. Confidence in predictions of future climate is increased if the physics are verifiably correct. A necessary (but not sufficient) condition is that past and present climate be simulated well. We quantify the likelihood that a (summary statistic computed from a) set of observations arises from a physical system with the characteristics captured by a model-generated time series. Given a prior on models, we can go further and obtain a posterior distribution of models given the observations.
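
    The procedure can be phrased in a few lines: use an ensemble of model-generated series to estimate the sampling distribution of a summary statistic, then evaluate the likelihood of the observed statistic under it. The Gaussian density and the synthetic "model output" below are illustrative assumptions.

    ```python
    import numpy as np

    # Likelihood of an observed summary statistic under a model-implied
    # sampling distribution, estimated from an ensemble of model runs.
    rng = np.random.default_rng(0)
    model_runs = rng.normal(14.0, 0.3, size=(500, 120))  # e.g. monthly means, degC
    obs = rng.normal(14.1, 0.3, size=120)                # observations

    stat = lambda x: x.mean(axis=-1)                     # chosen summary statistic
    sims = stat(model_runs)
    mu, sd = sims.mean(), sims.std(ddof=1)

    s_obs = stat(obs)
    loglik = -0.5 * np.log(2 * np.pi * sd**2) - 0.5 * ((s_obs - mu) / sd) ** 2
    print(f"observed stat {s_obs:.3f}, model {mu:.3f} +/- {sd:.3f}, loglik {loglik:.2f}")
    ```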

  7. Agent Based Reasoning in Multilevel Flow Modeling

    DEFF Research Database (Denmark)

    Lind, Morten; Zhang, Xinxin

    2012-01-01

    Multilevel Flow Modeling (MFM) is a modeling method used for modeling complex industrial plants. Currently, MFM is supported with a standalone software tool called MFM Workbench, which is equipped with causal-relation analysis and other functionalities. The aim of this paper is to offer a new design...

  8. Integration of Simulink Models with Component-based Software Models

    DEFF Research Database (Denmark)

    Marian, Nicolae; Top, Søren

    2008-01-01

    abstract system descriptions. Usually, in mechatronics systems, design proceeds by iterating model construction, model analysis, and model transformation. Constructing a MATLAB/Simulink model, a plant and controller behavior is simulated using graphical blocks to represent mathematical and logical...... analyzed. One way of doing that is to integrate in wrapper files the model back into Simulink S-functions, and use its extensive simulation features, thus allowing an early exploration of the possible design choices over multiple disciplines. The paper describes a safe translation of a restricted set of...

  9. Dopamine enhances model-based over model-free choice behavior.

    OpenAIRE

    Wunderlich, K; Smittenaar, P.; Dolan, R J

    2012-01-01

    Decision making is often considered to arise out of contributions from a model-free habitual system and a model-based goal-directed system. Here, we investigated the effect of a dopamine manipulation on the degree to which either system contributes to instrumental behavior in a two-stage Markov decision task, which has been shown to discriminate model-free from model-based control. We found increased dopamine levels promote model-based over model-free choice.

  10. On Process Modelling Using Physical Oriented And Phenomena Based Principles

    Directory of Open Access Journals (Sweden)

    Mihai Culea

    2000-12-01

    This work presents a modelling framework based on a phenomena description of the process. The approach is intended to make process models easy to understand and construct in heterogeneous, possibly distributed, modelling and simulation environments. A simplified case study of a heat exchanger is considered, and the Modelica modelling language is used to check the proposed concept. The partial results are promising, and the research effort will be extended into a computer-aided, phenomena-based modelling environment.

  11. A Size-based Ecosystem Model

    DEFF Research Database (Denmark)

    Ravn-Jonsen, Lars

    fish according to size. The model summarises individual predation events into ecosystem level properties, and thereby uses the law of conversation of mass as a framework. This paper provides the background, the conceptual model, basic assumptions, integration of fishing activities, mathematical...... completion, and a numeric implementation. Using two experiments, the model's ability to act as tool for economic production analysis and regulation design testing is demonstrated. The presented model is the simplest possible and is built on the principles of (i) size, as the attribute that determines the...... Ecosystem Management requires models that can link the ecosystem level to the operation level. This link can be created by an ecosystem production model. Because the function of the individual fish in the marine ecosystem, seen in trophic context, is closely related to its size, the model groups...

  12. Modeling and simulation of complex systems a framework for efficient agent-based modeling and simulation

    CERN Document Server

    Siegfried, Robert

    2014-01-01

    Robert Siegfried presents a framework for efficient agent-based modeling and simulation of complex systems. He compares different approaches for describing structure and dynamics of agent-based models in detail. Based on this evaluation the author introduces the "General Reference Model for Agent-based Modeling and Simulation" (GRAMS). Furthermore he presents parallel and distributed simulation approaches for execution of agent-based models -from small scale to very large scale. The author shows how agent-based models may be executed by different simulation engines that utilize underlying hard

  13. A model-based multisensor data fusion knowledge management approach

    Science.gov (United States)

    Straub, Jeremy

    2014-06-01

    A variety of approaches exist for combining data from multiple sensors. The model-based approach combines data based on its support for or refutation of elements of the model which in turn can be used to evaluate an experimental thesis. This paper presents a collection of algorithms for mapping various types of sensor data onto a thesis-based model and evaluating the truth or falsity of the thesis, based on the model. The use of this approach for autonomously arriving at findings and for prioritizing data are considered. Techniques for updating the model (instead of arriving at a true/false assertion) are also discussed.

  14. A Chakra-Based Model of Group Development.

    Science.gov (United States)

    Gilchrist, Roger; Mikulas, William L.

    1993-01-01

    Describes a model for sequential stages of group development based on yogic chakra system. Compares chakra-based model with other models of group developmental stages. Using context of chakra system, specifies basic dynamic issues and leader interventions for each stage and discusses relationship of individual development to group process. (Author)

  15. A Training Model for School-Based Decision Making.

    Science.gov (United States)

    Horgan, Dianne D.

    The development of a comprehensive training model designed specifically for school-based decision making is discussed in this report, with a focus on teaching relevant skills and when to utilize them. Loosely based on Vroom and Yetton's 1973 model of participative decision making, the model is characterized by a general-to-specific continuum and…

  16. Attenuating wind turbine loads through model based individual pitch control

    DEFF Research Database (Denmark)

    Thomsen, Sven Creutz; Niemann, Hans Henrik; Poulsen, Niels Kjølstad

    2009-01-01

    In this paper we consider wind turbine load attenuation through model based control. Asymmetric loads caused by the wind field can be reduced by pitching the blades individually. To this end we investigate the use of stochastic models of the wind which can be included in a model based individual...

  17. The Gap of Current Agent Based Simulation Modeling Practices and Feasibility of a Generic Agent Based Simulation Model

    OpenAIRE

    Yim Ling Loo; Alicia Y.C. Tang; Azhana Ahmad

    2015-01-01

    Agent-based modeling has evolved into an established approach for modeling simulation systems that are used to understand and predict certain real-life scenarios in specific domains. Past research, being domain-specific, has caused repetitive building of new models from scratch and restricts replication and reuse because of the limited description of the models. This paper presents a review of gaps between domain-specific agent-based simulation modeling and the recent practices of agent-based...

  18. Evaluating Model-based Trees in Practice

    OpenAIRE

    Zeileis, Achim; Hothorn, Torsten; Hornik, Kurt

    2006-01-01

    A recently suggested algorithm for recursive partitioning of statistical models (Zeileis, Hothorn and Hornik, 2005), such as models estimated by maximum likelihood or least squares, is evaluated in practice. The general algorithm is applied to linear regression, logistic regression and survival regression in economic and medical regression problems. Furthermore, its performance with respect to prediction quality and model complexity is compared in a benchmark study with a large...

  19. Student Modeling Based on Problem Solving Times

    Science.gov (United States)

    Pelánek, Radek; Jarušek, Petr

    2015-01-01

    Student modeling in intelligent tutoring systems is mostly concerned with modeling correctness of students' answers. As interactive problem solving activities become increasingly common in educational systems, it is useful to focus also on timing information associated with problem solving. We argue that the focus on timing is natural for certain…

  20. A Stock Pricing Model Based on Arithmetic Brown Motion

    Institute of Scientific and Technical Information of China (English)

    YAN Yong-xin; HAN Wen-xiu

    2001-01-01

    This paper presents a new stock pricing model based on arithmetic Brownian motion. The model completely overcomes the shortcomings of the Gordon model. With the model, investors can estimate the stock value of surplus companies, deficit companies, zero-growth companies and bankrupt companies in long-term or short-term investment.
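
    Under arithmetic Brownian motion the price follows S_t = S_0 + mu*t + sigma*W_t, which remains well defined for zero-growth or deficit companies because S_t may take any real value (unlike geometric or Gordon-type formulations). A minimal simulation sketch, with invented parameters:

    ```python
    import numpy as np

    # Simulate terminal prices under arithmetic Brownian motion and compare
    # the sample mean with the model's exact expectation S0 + mu*T.
    rng = np.random.default_rng(0)
    S0, mu, sigma = 20.0, -0.5, 2.0     # deficit company: negative drift (assumed)
    T, n_steps, n_paths = 1.0, 252, 10000
    dt = T / n_steps

    increments = mu * dt + sigma * np.sqrt(dt) * rng.standard_normal((n_paths, n_steps))
    S_T = S0 + increments.sum(axis=1)

    print("expected price :", S0 + mu * T)
    print("simulated mean :", S_T.mean().round(3))
    ```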

  1. Agent-Based Modeling of Growth Processes

    Science.gov (United States)

    Abraham, Ralph

    2014-01-01

    Growth processes abound in nature, and are frequently the target of modeling exercises in the sciences. In this article we illustrate an agent-based approach to modeling, in the case of a single example from the social sciences: bullying.

  2. Statistical Model-Based Face Pose Estimation

    Institute of Scientific and Technical Information of China (English)

    GE Xinliang; YANG Jie; LI Feng; WANG Huahua

    2007-01-01

    A robust face pose estimation approach is proposed that uses a face shape statistical model, with pose parameters represented by trigonometric functions. The face shape statistical model is firstly built by analyzing the face shapes from different people under varying poses. The shape alignment is vital in the process of building the statistical model. Then, six trigonometric functions are employed to represent the face pose parameters. Lastly, the mapping function is constructed between face image and face pose by linearly relating different parameters. The proposed approach is able to estimate different face poses using a few face training samples. Experimental results are provided to demonstrate its efficiency and accuracy.

  3. Modeling amperometric biosensors based on allosteric enzymes

    Directory of Open Access Journals (Sweden)

    Liutauras Ričkus

    2013-09-01

    Computational modeling of a biosensor with an allosteric enzyme layer was investigated in this study. The operation of the biosensor is modeled using non-stationary reaction-diffusion equations. The model involves three regions: the allosteric enzyme layer, where the allosteric enzyme reactions as well as mass transport by diffusion take place; the diffusion region, where mass transport by diffusion and non-enzymatic reactions take place; and the convective region, in which the analyte concentration is maintained constant. The dependence of the biosensor response on the substrate concentration, the cooperativity coefficient and the thickness of the diffusion layer has been studied.
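
    A minimal sketch of the enzyme-layer equation, assuming a Hill-type rate law as a stand-in for the allosteric kinetics and a first-generation sensor in which the substrate is consumed directly at the electrode; all parameter values are invented, and only a single layer is modeled.

    ```python
    import numpy as np

    # Explicit finite-difference scheme for
    #   dS/dt = D * d2S/dx2 - Vmax * S**h / (K**h + S**h),
    # with S = 0 at the electrode (x = 0) and bulk concentration at x = L.
    D, Vmax, K, h = 3e-6, 1e-4, 0.1, 2.0   # cm^2/s, mol/(cm^3 s), mol/cm^3, -
    L, nx = 0.01, 50                        # layer thickness (cm), grid points
    dx = L / (nx - 1)
    dt = 0.4 * dx**2 / D                    # within the explicit stability limit
    S = np.zeros(nx)
    S_bulk = 0.2

    for _ in range(20000):
        lap = (S[2:] - 2 * S[1:-1] + S[:-2]) / dx**2
        react = Vmax * S[1:-1]**h / (K**h + S[1:-1]**h)
        S[1:-1] += dt * (D * lap - react)
        S[0] = 0.0        # substrate consumed at the electrode surface
        S[-1] = S_bulk    # well-stirred bulk at the outer boundary

    # The amperometric response is proportional to the flux at the electrode.
    print("electrode flux ~", D * (S[1] - S[0]) / dx)
    ```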

  4. Model-Based Development of Control Systems for Forestry Cranes

    OpenAIRE

    Pedro La Hera; Daniel Ortíz Morales

    2015-01-01

    Model-based methods are used in industry for prototyping concepts based on mathematical models. With our forest industry partners, we have established a model-based workflow for rapid development of motion control systems for forestry cranes. Applying this working method, we can verify control algorithms, both theoretically and practically. This paper is an example of this workflow and presents four topics related to the application of nonlinear control theory. The first topic presents th...

  5. Individual-Based Modelling of Bacterial Ecologies and Evolution

    OpenAIRE

    Saunders, J R; Q. H. Wu; Paton, R. C.; Vlachos, C.; Gregory, R.

    2004-01-01

    This paper presents two approaches to the individual-based modelling of bacterial ecologies and evolution using computational tools. The first approach is a fine-grained model that is based on networks of interactivity between computational objects representing genes and proteins. The second approach is a coarser-grained, agent-based model, which is designed to explore the evolvability of adaptive behavioural strategies in artificial bacteria represented by learning classifier systems. The st...

  6. Model Based Predictive Control of a Fully Parallel Robot

    OpenAIRE

    Vivas, Oscar Andrès; Poignet, Philippe

    2003-01-01

    This paper deals with an efficient application of a model based predictive control in parallel machines. A receding horizon control strategy based on a simplified dynamic model is implemented. Experimental results are shown for the H4 robot, a fully parallel structure providing 3 degrees of freedom (dof) in translation and 1 dof in rotation. The model based predictive control and the commonly used computed torque control strategies are compared. The tracking performances and the robustness wi...

  7. Computer-based modelling and analysis in engineering geology

    OpenAIRE

    Giles, David

    2014-01-01

    This body of work presents the research and publications undertaken under a general theme of computer-based modelling and analysis in engineering geology. Papers are presented on geotechnical data management, data interchange, Geographical Information Systems, surface modelling, geostatistical methods, risk-based modelling, knowledge-based systems, remote sensing in engineering geology and on the integration of computer applications into applied geoscience teaching. The work highlights my...

  8. Multimedia Data Modeling Based on Temporal Logic and XYZ System

    Institute of Scientific and Technical Information of China (English)

    MA Huadong; LIU Shenquan

    1999-01-01

    This paper proposes a new approach to modeling multimedia data. The new approach is a multimedia data model based on temporal logic and the XYZ System. It supports formal specifications in a multimedia system. Using this model, we can not only specify information units but also design and script a multimedia title in a unified framework. Based on this model, an interactive multimedia authoring environment has been developed.

  9. The fractional volatility model: An agent-based interpretation

    Science.gov (United States)

    Vilela Mendes, R.

    2008-06-01

    Based on the criteria of mathematical simplicity and consistency with empirical market data, a model with volatility driven by fractional noise has been constructed which provides a fairly accurate mathematical parametrization of the data. Here, some features of the model are reviewed and extended to account for leverage effects. Using agent-based models, one tries to find which agent strategies and (or) properties of the financial institutions might be responsible for the features of the fractional volatility model.
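
    A sketch of volatility driven by fractional noise: fractional Gaussian noise is drawn exactly via the Cholesky factor of its covariance, cumulated into a fractional Brownian motion path, and exponentiated into a volatility process. The Hurst exponent and scaling parameters below are illustrative, not the paper's fitted values.

    ```python
    import numpy as np

    # Volatility driven by fractional Brownian motion (fBm).
    rng = np.random.default_rng(0)
    n, H = 500, 0.8                  # steps, Hurst exponent (> 0.5: persistence)

    # Stationary covariance of fractional Gaussian noise increments.
    k = np.arange(n)
    gamma = 0.5 * (np.abs(k + 1)**(2*H) + np.abs(k - 1)**(2*H) - 2*np.abs(k)**(2*H))
    cov = gamma[np.abs(k[:, None] - k[None, :])]
    fgn = np.linalg.cholesky(cov) @ rng.standard_normal(n)
    fbm = np.cumsum(fgn)             # fBm path

    sigma0, beta = 0.2, 0.1          # scaling parameters (assumed)
    vol = sigma0 * np.exp(beta * fbm)          # log-volatility driven by fBm
    returns = vol * rng.standard_normal(n)     # returns with fractional volatility
    print("volatility range:", vol.min().round(3), "-", vol.max().round(3))
    ```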

  10. An event-based model for contracts

    Directory of Open Access Journals (Sweden)

    Tiziana Cimoli

    2013-02-01

    We introduce a basic model for contracts. Our model extends event structures with a new relation, which faithfully captures the circular dependencies among contract clauses. We establish whether an agreement exists which respects all the contracts at hand (i.e. all the dependencies can be resolved), and we detect the obligations of each participant. The main technical contribution is a correspondence between our model and a fragment of the contract logic PCL. More precisely, we show that the reachable events are exactly those which correspond to provable atoms in the logic. Despite this strong correspondence, our model improves previous work on PCL by exhibiting a finer-grained notion of culpability, which takes into account the legitimate orderings of events.

  11. Models for Patch-Based Image Restoration

    Directory of Open Access Journals (Sweden)

    Mithun Das Gupta

    2009-01-01

    We present a supervised learning approach for object-category specific restoration, recognition, and segmentation of images which are blurred using an unknown kernel. The novelty of this work is a multilayer graphical model which unifies the low-level vision task of restoration and the high-level vision task of recognition in a cooperative framework. The graphical model is an interconnected two-layer Markov random field. The restoration layer accounts for the compatibility between sharp and blurred images and models the association between adjacent patches in the sharp image. The recognition layer encodes the entity class and its location in the underlying scene. The potentials are represented using nonparametric kernel densities and are learnt from training data. Inference is performed using nonparametric belief propagation. Experiments demonstrate the effectiveness of our model for the restoration and recognition of blurred license plates as well as face images.

  12. Models for Patch-Based Image Restoration

    Directory of Open Access Journals (Sweden)

    Petrovic Nemanja

    2009-01-01

    We present a supervised learning approach for object-category specific restoration, recognition, and segmentation of images which are blurred using an unknown kernel. The novelty of this work is a multilayer graphical model which unifies the low-level vision task of restoration and the high-level vision task of recognition in a cooperative framework. The graphical model is an interconnected two-layer Markov random field. The restoration layer accounts for the compatibility between sharp and blurred images and models the association between adjacent patches in the sharp image. The recognition layer encodes the entity class and its location in the underlying scene. The potentials are represented using nonparametric kernel densities and are learnt from training data. Inference is performed using nonparametric belief propagation. Experiments demonstrate the effectiveness of our model for the restoration and recognition of blurred license plates as well as face images.

  13. Demand forecast model based on CRM

    Science.gov (United States)

    Cai, Yuancui; Chen, Lichao

    2006-11-01

    As customer-centred management thinking becomes increasingly internalized, forecasting customer demand becomes more and more important. In demand forecasting for customer relationship management, traditional forecast methods are severely limited by the large uncertainty of demand, which calls for new models that meet the demands of development. In this paper, the notion is to forecast demand according to the characteristics of the potential customer and to build the model on that basis. The model first describes each customer by a uniform set of multiple indexes. Secondly, it acquires characteristic customers on the basis of a data warehouse and data-mining technology. Finally, the most similar characteristic customer is found by comparison, and the demand of a new customer is forecast from that most similar characteristic customer.
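
    The "most similar characteristic customer" step can be sketched as a nearest-neighbour lookup over normalized index vectors. The features and data below are invented for illustration; in practice they would come from the data warehouse and data-mining steps described above.

    ```python
    import numpy as np

    # Forecast a new customer's demand by reusing the demand of the most
    # similar existing customer in (normalized) feature space.
    customers = np.array([   # [age, income (k), past orders] -- toy indexes
        [25, 40, 3],
        [40, 90, 12],
        [33, 60, 7],
        [55, 120, 20],
    ], dtype=float)
    demand = np.array([2.0, 9.0, 5.0, 15.0])   # observed yearly demand

    def forecast(new_customer: np.ndarray) -> float:
        # Normalize features so no single index dominates the distance.
        mu, sd = customers.mean(axis=0), customers.std(axis=0)
        z = (customers - mu) / sd
        z_new = (new_customer - mu) / sd
        nearest = np.argmin(np.linalg.norm(z - z_new, axis=1))
        return demand[nearest]

    print("forecast demand:", forecast(np.array([38.0, 85.0, 10.0])))
    ```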

  14. Physics-Based Pneumatic Hammer Instability Model Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Florida Turbine Technologies (FTT) proposes to conduct research necessary to develop a physics-based pneumatic hammer instability model for hydrostatic bearings...

  15. Image-Based Modeling of Plants and Trees

    CERN Document Server

    Kang, Sing Bang

    2009-01-01

    Plants and trees are among the most complex natural objects. Much work has been done attempting to model them, with varying degrees of success. In this book, we review the various approaches in computer graphics, which we categorize as rule-based, image-based, and sketch-based methods. We describe our approaches for modeling plants and trees using images. Image-based approaches have the distinct advantage that the resulting model inherits the realistic shape and complexity of a real plant or tree. We use different techniques for modeling plants (with relatively large leaves) and trees (with re

  16. A Model-Based Privacy Compliance Checker

    OpenAIRE

    Siani Pearson; Damien Allison

    2009-01-01

    Increasingly, e-business organisations are coming under pressure to be compliant to a range of privacy legislation, policies and best practice. There is a clear need for high-level management and administrators to be able to assess in a dynamic, customisable way the degree to which their enterprise complies with these. We outline a solution to this problem in the form of a model-driven automated privacy process analysis and configuration checking system. This system models privacy compliance ...

  17. Model-based immunization information routing.

    OpenAIRE

    Wang, D.; Jenders, R. A.

    2000-01-01

    We have developed a model for clinical information routing within an immunization registry. Components in this model include partners, contents and mechanisms. Partners are classified into senders, receivers and intermediates. Contents are classified into core contents and management information. Mechanisms are classified into topological control, temporal control, process control and communication channel control. Immunization reminders, forecasts and recalls in e-mail, fax and regular mail ...

  18. Probabilistic reasoning for assembly-based 3D modeling

    KAUST Repository

    Chaudhuri, Siddhartha

    2011-01-01

    Assembly-based modeling is a promising approach to broadening the accessibility of 3D modeling. In assembly-based modeling, new models are assembled from shape components extracted from a database. A key challenge in assembly-based modeling is the identification of relevant components to be presented to the user. In this paper, we introduce a probabilistic reasoning approach to this problem. Given a repository of shapes, our approach learns a probabilistic graphical model that encodes semantic and geometric relationships among shape components. The probabilistic model is used to present components that are semantically and stylistically compatible with the 3D model that is being assembled. Our experiments indicate that the probabilistic model increases the relevance of presented components. © 2011 ACM.

  19. Model-based Prognostics under Limited Sensing

    Data.gov (United States)

    National Aeronautics and Space Administration — Prognostics is crucial to providing reliable condition-based maintenance decisions. To obtain accurate predictions of component life, a variety of sensors are often...

  20. A PSEUDO RELEVANCE BASED IMAGE RETRIEVAL MODEL

    OpenAIRE

    Kamini Thakur; Preetika Saxena

    2015-01-01

    Image retrieval is a basic requirement and common task nowadays. Content-based image retrieval (CBIR) is a popular image retrieval approach in which the target image is retrieved based on useful features of the given image. CBIR is an active and fast-growing research area in both image processing and data mining. In marine ecosystems the captured images have lower resolution and call for transformation- and translation-invariant capabilities. Therefore, accurate image extraction according to the u...

  1. Model based process-product design and analysis

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    This paper gives a perspective on modelling and the important role it has within product-process design and analysis. Different modelling issues related to development and application of systematic model-based solution approaches for product-process design is discussed and the need for a hybrid...... model-based framework is highlighted. This framework should be able to manage knowledge-data, models, and associated methods and tools integrated with design work-flows and data-flows for specific product-process design problems. In particular, the framework needs to manage models of different types...

  2. An Emotional Agent Model Based on Granular Computing

    Directory of Open Access Journals (Sweden)

    Jun Hu

    2012-01-01

    Full Text Available Affective computing has a very important significance for fulfilling intelligent information processing and harmonious communication between human beings and computers. A new emotional agent model is proposed in this paper to give agents the ability to handle emotions, based on granular computing theory and the traditional BDI agent model. Firstly, a new emotion knowledge base based on granular computing for emotion expression is presented in the model. Secondly, a new emotional reasoning algorithm based on granular computing is proposed. Thirdly, a new emotional agent model based on granular computing is presented. Finally, based on the model, an emotional agent for patient assistance in hospital is realized; experimental results show that it is efficient in handling simple emotions.

  3. Mechanics and model-based control of advanced engineering systems

    CERN Document Server

    Irschik, Hans; Krommer, Michael

    2014-01-01

    Mechanics and Model-Based Control of Advanced Engineering Systems collects 32 contributions presented at the International Workshop on Advanced Dynamics and Model Based Control of Structures and Machines, which took place in St. Petersburg, Russia in July 2012. The workshop continued a series of international workshops, which started with a Japan-Austria Joint Workshop on Mechanics and Model Based Control of Smart Materials and Structures and a Russia-Austria Joint Workshop on Advanced Dynamics and Model Based Control of Structures and Machines. In the present volume, 10 full-length papers based on presentations from Russia, 9 from Austria, 8 from Japan, 3 from Italy, one from Germany and one from Taiwan are included, which represent the state of the art in the field of mechanics and model based control, with particular emphasis on the application of advanced structures and machines.

  4. A conceptual data model coupling with physically-based distributed hydrological models based on catchment discretization schemas

    Science.gov (United States)

    Liu, Yuanming; Zhang, Wanchang; Zhang, Zhijie

    2015-11-01

    In hydrology, the data types, spatio-temporal scales and formats for physically-based distributed hydrological models and the distributed data or parameters may be different before significant data pre-processing or may change during hydrological simulation run time. A data model is devoted to these problems for sophisticated numerical hydrological modeling procedures. In this paper, we propose a conceptual data model to interpret the comprehensive, universal and complex water environmental entities. We also present an innovative integration methodology to couple the data model with physically-based distributed hydrological models (DHMs) based on catchment discretization schemas. The data model provides a reasonable framework for researchers to organize and pre-process water environmental spatio-temporal datasets. It also facilitates a seamless, fluid and dynamic data flow between object-oriented databases and physically-based distributed hydrological models, with hydrological response units (HRUs) as the core.

  5. Active Appearance Model Based Hand Gesture Recognition

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    This paper addresses the application of hand gesture recognition in monocular image sequences using the Active Appearance Model (AAM). The proposed algorithm is composed of constructing AAMs and fitting the models to the region of interest. In the training stage, the AAM is constructed according to manually labeled feature points and the corresponding average features are obtained. In the recognition stage, the hand gesture region is first segmented by skin and movement cues. Secondly, the models are fitted to the image that includes the hand gesture, and the relative features are extracted. Thirdly, classification is done by comparing the extracted features with the average features. 30 different gestures of Chinese sign language are applied to test the effectiveness of the method. The experimental results indicate good performance of the algorithm.
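    A minimal sketch of the recognition stage described above: once AAM fitting has produced a feature vector for the segmented hand region, the gesture is assigned to the class whose stored average feature is closest. The feature dimension, gesture names and random data are illustrative assumptions; the AAM construction and fitting steps themselves are not reproduced here.

```python
import numpy as np

def classify(feature, class_means):
    """Nearest-mean classification over AAM-derived feature vectors."""
    dists = {g: np.linalg.norm(feature - mu) for g, mu in class_means.items()}
    return min(dists, key=dists.get)

# class_means would hold the average features learned from the manually
# labeled training images; random vectors stand in for them here.
rng = np.random.default_rng(0)
class_means = {g: rng.normal(size=20) for g in ("one", "two", "fist")}
query = class_means["two"] + rng.normal(0.0, 0.1, size=20)  # noisy test feature
print(classify(query, class_means))  # -> "two"
```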

  6. Model-Based Design of Biochemical Microreactors.

    Science.gov (United States)

    Elbinger, Tobias; Gahn, Markus; Neuss-Radu, Maria; Hante, Falk M; Voll, Lars M; Leugering, Günter; Knabner, Peter

    2016-01-01

    Mathematical modeling of biochemical pathways is an important resource in Synthetic Biology, as the predictive power of simulating synthetic pathways represents an important step in the design of synthetic metabolons. In this paper, we are concerned with the mathematical modeling, simulation, and optimization of metabolic processes in biochemical microreactors able to carry out enzymatic reactions and to exchange metabolites with their surrounding medium. The results of the reported modeling approach are incorporated in the design of the first microreactor prototypes that are under construction. These microreactors consist of compartments separated by membranes carrying specific transporters for the input of substrates and export of products. Inside the compartments of the reactor multienzyme complexes assembled on nano-beads by peptide adapters are used to carry out metabolic reactions. The spatially resolved mathematical model describing the ongoing processes consists of a system of diffusion equations together with boundary and initial conditions. The boundary conditions model the exchange of metabolites with the neighboring compartments and the reactions at the surface of the nano-beads carrying the multienzyme complexes. Efficient and accurate approaches for numerical simulation of the mathematical model and for optimal design of the microreactor are developed. As a proof-of-concept scenario, a synthetic pathway for the conversion of sucrose to glucose-6-phosphate (G6P) was chosen. In this context, the mathematical model is employed to compute the spatio-temporal distributions of the metabolite concentrations, as well as application relevant quantities like the outflow rate of G6P. These computations are performed for different scenarios, where the number of beads as well as their loading capacity are varied. The computed metabolite distributions show spatial patterns, which differ for different experimental arrangements. Furthermore, the total output of G6P

  7. Kalman filter-based gap conductance modeling

    International Nuclear Information System (INIS)

    Geometric and thermal property uncertainties contribute greatly to the problem of determining conductance within the fuel-clad gas gap of a nuclear fuel pin. Accurate conductance values are needed for power plant licensing transient analysis and for test analyses at research facilities. Recent work by Meek, Doerner, and Adams has shown that use of Kalman filters to estimate gap conductance is a promising approach. A Kalman filter is simply a mathematical algorithm that employs available system measurements and assumed dynamic models to generate optimal system state vector estimates. This summary addresses another Kalman filter approach to gap conductance estimation and subsequent identification of an empirical conductance model
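    The core of any such estimator is the predict/update recursion of the Kalman filter. A minimal scalar sketch, assuming a random-walk model for the gap conductance and a direct noisy measurement of it (both simplifying assumptions for illustration, not the models of the summarized work):

```python
import numpy as np

def kalman_estimate(measurements, h0=1000.0, P0=1e4, Q=1.0, R=25.0):
    """Scalar filter for h_k = h_{k-1} + w_k (var Q), z_k = h_k + v_k (var R)."""
    h, P = h0, P0
    estimates = []
    for z in measurements:
        P = P + Q                    # predict: uncertainty grows by Q
        K = P / (P + R)              # Kalman gain
        h = h + K * (z - h)          # update with the innovation
        P = (1.0 - K) * P
        estimates.append(h)
    return estimates

rng = np.random.default_rng(0)
z = 1200.0 + rng.normal(0.0, 5.0, size=50)   # noisy conductance observations
print(round(kalman_estimate(z)[-1], 1))      # converges toward 1200
```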

  8. Flood forecasting for River Mekong with data-based models

    Science.gov (United States)

    Shahzad, Khurram M.; Plate, Erich J.

    2014-09-01

    In many regions of the world, the task of flood forecasting is made difficult because only a limited database is available for generating a suitable forecast model. This paper demonstrates that in such cases parsimonious data-based hydrological models for flood forecasting can be developed if the special conditions of climate and topography are used to advantage. As an example, the middle reach of River Mekong in South East Asia is considered, where a database of discharges from seven gaging stations on the river and 31 rainfall stations on the subcatchments between gaging stations is available for model calibration. Special conditions existing for River Mekong are identified and used in developing first a network connecting all discharge gages and then models for forecasting discharge increments between gaging stations. Our final forecast model (Model 3) is a linear combination of two structurally different basic models: a model (Model 1) using linear regressions for forecasting discharge increments, and a model (Model 2) using rainfall-runoff models. Although the model based on linear regressions works reasonably well for short times, better results are obtained with rainfall-runoff modeling. However, forecast accuracy of Model 2 is limited by the quality of rainfall forecasts. For best results, both models are combined by taking weighted averages to form Model 3. Model quality is assessed by means of both persistence index PI and standard deviation of forecast error.
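    The combination step and the quality measure lend themselves to a compact sketch: Model 3 is a weighted average of the two basic forecasts, and the persistence index PI compares a model against the naive forecast that simply persists the last observation. The weight, lead time and data below are illustrative stand-ins for the calibrated values of the study.

```python
import numpy as np

def persistence_index(obs, pred, lead=1):
    """PI = 1 - SSE(model) / SSE(forecast that persists the last observation)."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    sse_model = np.sum((obs[lead:] - pred[lead:]) ** 2)
    sse_naive = np.sum((obs[lead:] - obs[:-lead]) ** 2)
    return 1.0 - sse_model / sse_naive

def combined_forecast(pred_regression, pred_rr, w=0.4):
    """Model 3 idea: weighted average of two structurally different forecasts."""
    return w * np.asarray(pred_regression) + (1.0 - w) * np.asarray(pred_rr)

obs = [100.0, 110.0, 130.0, 125.0, 140.0]
m3 = combined_forecast([105, 112, 128, 124, 138], [98, 115, 131, 127, 142])
print(round(persistence_index(obs, m3), 3))
```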

  9. Gradient-based adaptation of continuous dynamic model structures

    Science.gov (United States)

    La Cava, William G.; Danai, Kourosh

    2016-01-01

    A gradient-based method of symbolic adaptation is introduced for a class of continuous dynamic models. The proposed model structure adaptation method starts with the first-principles model of the system and adapts its structure after adjusting its individual components in symbolic form. A key contribution of this work is its introduction of the model's parameter sensitivity as the measure of symbolic changes to the model. This measure, which is essential to defining the structural sensitivity of the model, not only accommodates algebraic evaluation of candidate models in lieu of more computationally expensive simulation-based evaluation, but also makes possible the implementation of gradient-based optimisation in symbolic adaptation. The proposed method is applied to models of several virtual and real-world systems that demonstrate its potential utility.

  10. A model-based control system concept

    International Nuclear Information System (INIS)

    This paper presents an overview of a new concept for DCSs developed within the KBRTCS (Knowledge-Based Real-Time Control Systems) project performed between 1988 and 1991 as a part of the Swedish IT4 programme. The partners of the project have been the Department of Automatic Control at Lund University, Asea Brown Boveri, and during parts of the project, SattControl and TeleLogic. The aim of the project has been to develop a concept for future generations of DCSs based on a plant database containing a description of the plant together with the control system. The database is object-based and supports multiple views of an object. A demonstrator is presented where a DCS system of this type is emulated. The demonstrator contains a number of control, monitoring, and diagnosis applications that execute in real time against a simulation of the Steritherm sterilization process. (25 refs.)

  11. An Optimization Model Based on Game Theory

    Directory of Open Access Journals (Sweden)

    Yang Shi

    2014-04-01

    Full Text Available Game theory has a wide range of applications in economics, but it is seldom used in the field of computer science, especially in optimization algorithms. In this paper, we integrate game-theoretic thinking into optimization and propose a new optimization model that can be widely used in optimization processing. The model comes in two types, called “the complete consistency” and “the partial consistency”, where the partial consistency adds a disturbance strategy on the basis of the complete consistency. When the model’s consistency is satisfied, the Nash equilibrium of the optimization model is the global optimum; when the consistency is not met, the presence of the perturbation strategy can broaden the applicability of the algorithm. Basic experiments suggest that this optimization model has broad applicability and better performance, and it gives a new idea for some intractable problems in the field of artificial intelligence

  12. MATLAB Based PCM Modeling and Simulation

    OpenAIRE

    Yongchao Jin; Hong Liang; Weiwei Feng; Qiong Wang

    2013-01-01

    PCM is a key technology of digital communication and has been widely used, especially in optical fiber communication, digital microwave communication and satellite communication. We model PCM communication systems with the pulse code system by programming, conduct computer simulation in MATLAB, and analyze the performance of linear PCM and logarithmic PCM.
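    The contrast between linear and logarithmic PCM can be sketched in a few lines (Python is used here instead of MATLAB; the mu = 255 companding law and 8-bit depth are conventional illustrative choices, not details from the paper):

```python
import numpy as np

MU = 255.0  # standard mu-law constant (assumed, not taken from the paper)

def mu_compress(x):
    return np.sign(x) * np.log1p(MU * np.abs(x)) / np.log1p(MU)

def mu_expand(y):
    return np.sign(y) * np.expm1(np.abs(y) * np.log1p(MU)) / MU

def quantize(x, bits=8):
    levels = 2 ** bits
    return np.round((x + 1) / 2 * (levels - 1)) / (levels - 1) * 2 - 1

def snr_db(x, x_hat):
    return 10 * np.log10(np.sum(x ** 2) / np.sum((x - x_hat) ** 2))

t = np.linspace(0.0, 1.0, 8000, endpoint=False)
x = 0.05 * np.sin(2 * np.pi * 100 * t)           # low-amplitude test tone
linear = quantize(x)                              # linear PCM
logpcm = mu_expand(quantize(mu_compress(x)))      # logarithmic PCM
print(snr_db(x, linear), snr_db(x, logpcm))       # companding wins at low level
```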

  13. CSPBuilder - CSP based Scientific Workflow Modelling

    DEFF Research Database (Denmark)

    Friborg, Rune Møllegaard; Vinter, Brian

    previously been called for. The framework is CSP presented as a scientific workflow model, specialized for scientific computing applications. The purpose of the framework is to enable scientists to exploit large parallel computation resources, which has previously been hard due to the difficulty of concurrent...

  14. Automata-Based CSL Model Checking

    DEFF Research Database (Denmark)

    Zhang, Lijun; Jansen, David N.; Nielson, Flemming; Hermanns, Holger

    For continuous-time Markov chains, the model-checking problem with respect to continuous-time stochastic logic (CSL) has been introduced and shown to be decidable by Aziz, Sanwal, Singhal and Brayton in 1996. The presented decision procedure, however, has exponential complexity. In this paper, we...

  15. Model based monitoring of stormwater runoff quality

    DEFF Research Database (Denmark)

    Birch, Heidi; Vezzaro, Luca; Mikkelsen, Peter Steen

    2012-01-01

    Monitoring of micropollutants (MP) in stormwater is essential to evaluate the impacts of stormwater on the receiving aquatic environment. The aim of this study was to investigate how different strategies for monitoring of stormwater quality (combination of model with field sampling) affect the in...

  16. A Qualitative Acceleration Model Based on Intervals

    Directory of Open Access Journals (Sweden)

    Ester MARTINEZ-MARTIN

    2013-08-01

    Full Text Available On the way to autonomous service robots, spatial reasoning plays a main role since it properly deals with problems involving uncertainty. In particular, we are interested in knowing people's pose to avoid collisions. With that aim, in this paper, we present a qualitative acceleration model for robotic applications including representation, reasoning and a practical application.

  17. Baryonic masses based on the NJL model

    International Nuclear Information System (INIS)

    We employ the Nambu-Jona-Lasinio model to determine the vacuum pressure on the quarks in a baryon and hence their density inside. Then we estimate the baryonic masses by implementing the local density approximation for the mean-field quark energies obtained in a uniform and isotropic system. We obtain a fair agreement with the experimental masses. (orig.)

  18. Cognitive Modeling for Agent-Based Simulation of Child Maltreatment

    Science.gov (United States)

    Hu, Xiaolin; Puddy, Richard

    This paper extends previous work to develop cognitive modeling for agent-based simulation of child maltreatment (CM). The developed model is inspired by parental efficacy, parenting stress, and the theory of planned behavior. It provides an explanatory, process-oriented model of CM and incorporates causal relationships and feedback loops from different factors in the social ecology in order to simulate the dynamics of CM. We describe the model and present simulation results to demonstrate its features.

  19. Detecting influential observations in a model-based cluster analysis

    OpenAIRE

    Bruckers, L.; Molenberghs, G; Verbeke, G; Geys, H.

    2016-01-01

    Finite mixture models have been used to model population heterogeneity and to relax distributional assumptions. These models are also convenient tools for clustering and classification of complex data such as, for example, repeated-measurements data. The performance of model-based clustering algorithms is sensitive to influential and outlying observations. Methods for identifying outliers in a finite mixture model have been described in the literature. Approaches to identify influential obser...

  20. Kinetic data base for combustion modeling

    Energy Technology Data Exchange (ETDEWEB)

    Tsang, W.; Herron, J.T. [National Institute of Standards and Technology, Gaithersburg, MD (United States)

    1993-12-01

    The aim of this work is to develop a set of evaluated rate constants for use in the simulation of hydrocarbon combustion. The approach has been to begin with the small molecules and then introduce larger species with the various structural elements that can be found in all hydrocarbon fuels and decomposition products. Currently, the data base contains most of the species present in combustion systems with up to four carbon atoms. Thus, practically all the structural groupings found in aliphatic compounds have now been captured. The direction of future work is the addition of aromatic compounds to the data base.

  1. (Re)configuration based on model generation

    CERN Document Server

    Friedrich, Gerhard; Falkner, Andreas A; Haselböck, Alois; Schenner, Gottfried; Schreiner, Herwig; 10.4204/EPTCS.65.3

    2011-01-01

    Reconfiguration is an important activity for companies selling configurable products or services which have a long life time. However, identification of a set of required changes in a legacy configuration is a hard problem, since even small changes in the requirements might imply significant modifications. In this paper we show a solution based on answer set programming, which is a logic-based knowledge representation formalism well suited for a compact description of (re)configuration problems. Its applicability is demonstrated on simple abstractions of several real-world scenarios. The evaluation of our solution on a set of benchmark instances derived from commercial (re)configuration problems shows its practical applicability.

  2. Modeling Mass Spectrometry Based Protein Analysis

    OpenAIRE

    Eriksson, Jan; Fenyö, David

    2011-01-01

    The success of mass spectrometry based proteomics depends on efficient methods for data analysis. These methods require a detailed understanding of the information value of the data. Here, we describe how the information value can be elucidated by performing simulations using synthetic data.

  3. Agent-based modelling of socio-technical systems

    CERN Document Server

    van Dam, Koen H; Lukszo, Zofia

    2012-01-01

    Here is a practical introduction to agent-based modelling of socio-technical systems, based on methodology developed at TU Delft, which has been deployed in a number of case studies. Offers theory, methods and practical steps for creating real-world models.

  4. Group Contribution Based Process Flowsheet Synthesis, Design and Modelling

    DEFF Research Database (Denmark)

    Gani, Rafiqul; d'Anterroches, Loïc

    This paper presents a process-group-contribution method to model, simulate and synthesize a flowsheet. The process-group based representation of a flowsheet together with a process "property" model are presented. The process-group based synthesis method is developed on the basis of the computer

  5. Facilitating Change to a Problem-based Model

    DEFF Research Database (Denmark)

    Kolmos, Anette

    The paper presents the barriers which arise during the change process from a traditional educational system to a problem-based educational model....

  6. An Active Learning Exercise for Introducing Agent-Based Modeling

    Science.gov (United States)

    Pinder, Jonathan P.

    2013-01-01

    Recent developments in agent-based modeling as a method of systems analysis and optimization indicate that students in business analytics need an introduction to the terminology, concepts, and framework of agent-based modeling. This article presents an active learning exercise for MBA students in business analytics that demonstrates agent-based…

  7. Formal model based methodology for developing software for nuclear applications

    International Nuclear Information System (INIS)

    The approach used in model based design is to build a model of the system in a graphical/textual language. In the older model based design approach, the correctness of the model is usually established by simulation. Simulation, which is analogous to testing, cannot guarantee that the design meets the system requirements under all possible scenarios. This is, however, possible if the modeling language is based on formal semantics, so that the developed model can be subjected to formal verification of properties based on the specification. The verified model can then be translated into an implementation through a reliable/verified code generator, thereby reducing the necessity of low-level testing. Such a methodology is admissible as per the guidelines of the IEC60880 standard applicable to software used in computer based systems performing category A functions in nuclear power plants, and would also be acceptable for category B functions. In this article, we present our experience in the implementation and formal verification of important controllers used in the process control system of a nuclear reactor. We have used the SCADE (Safety Critical System Analysis and Design Environment) environment to model the controllers. The modeling language used in SCADE is based on the synchronous dataflow model of computation. A set of safety properties has been verified using formal verification techniques

  8. Warehouse Optimization Model Based on Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Guofeng Qin

    2013-01-01

    Full Text Available This paper takes the Bao Steel logistics automated warehouse system as an example. The premise is to keep the focus (center of gravity) of the shelf below half of the shelf height. As a result, the time cost of getting or putting goods on the shelf is reduced, and the distance between goods of the same kind is also reduced. We construct a multiobjective optimization model and use a genetic algorithm to optimize it, obtaining a locally optimal solution. Before optimization, the average time cost of getting or putting goods is 4.52996 s, and the average distance between goods of the same kind is 2.35318 m. After optimization, the average time cost is 4.28859 s, and the average distance is 1.97366 m. From the analysis we can draw the conclusion that this model can improve the efficiency of cargo storage.
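    A toy sketch of the genetic-algorithm slotting idea: place high-turnover goods on low shelf levels and keep goods of the same kind close together. The encoding, fitness weights and operators are illustrative assumptions, not the Bao Steel model.

```python
import random

random.seed(1)
N = 12
turnover = [random.random() for _ in range(N)]   # access frequency of each item
kind = [i % 3 for i in range(N)]                 # three kinds of goods
height = [s // 4 for s in range(N)]              # shelf level of each slot (0..2)

def cost(perm):
    """perm[slot] = index of the item stored in that slot."""
    time_cost = sum(turnover[perm[s]] * height[s] for s in range(N))
    dist_cost = 0.0
    for k in range(3):                           # spread of each kind's slots
        slots = [s for s in range(N) if kind[perm[s]] == k]
        dist_cost += max(slots) - min(slots)
    return time_cost + 0.1 * dist_cost

def mutate(perm):
    a, b = random.sample(range(N), 2)            # swap the contents of two slots
    child = perm[:]
    child[a], child[b] = child[b], child[a]
    return child

pop = [random.sample(range(N), N) for _ in range(30)]
for _ in range(200):
    pop.sort(key=cost)                           # elitist selection
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(20)]
print(min(map(cost, pop)))                       # cost of the best slotting found
```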

  9. Probabilistic mixture-based image modelling

    Czech Academy of Sciences Publication Activity Database

    Haindl, Michal; Havlíček, Vojtěch; Grim, Jiří

    2011-01-01

    Roč. 47, č. 3 (2011), s. 482-500. ISSN 0023-5954 R&D Projects: GA MŠk 1M0572; GA ČR GA102/08/0593 Grant ostatní: CESNET(CZ) 387/2010; GA MŠk(CZ) 2C06019; GA ČR(CZ) GA103/11/0335 Institutional research plan: CEZ:AV0Z10750506 Keywords : BTF texture modelling * discrete distribution mixtures * Bernoulli mixture * Gaussian mixture * multi-spectral texture modelling Subject RIV: BD - Theory of Information Impact factor: 0.454, year: 2011 http://library.utia.cas.cz/separaty/2011/RO/haindl-0360244.pdf

  10. Nonlinear system modeling based on experimental data

    Energy Technology Data Exchange (ETDEWEB)

    PAEZ,THOMAS L.; HUNTER,NORMAN F.

    2000-02-02

    The canonical variate analysis technique is used in this investigation, along with a data transformation algorithm, to identify a system in a transform space. The transformation algorithm involves the preprocessing of measured excitation/response data with a zero-memory-nonlinear transform, specifically, the Rosenblatt transform. This transform approximately maps the measured excitation and response data from its own space into the space of uncorrelated, standard normal random variates. Following this transform, it is appropriate to model the excitation/response relation as linear since Gaussian inputs excite Gaussian responses in linear structures. The linear model is identified in the transform space using the canonical variate analysis approach, and system responses in the original space are predicted using inverse Rosenblatt transformation. An example is presented.
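    The Gaussianization step can be sketched with a marginal probability-integral transform: map each measured channel through its empirical CDF, then through the inverse standard-normal CDF. The full Rosenblatt transform also conditions each variable on the preceding ones; this marginal version is a simplification for illustration.

```python
import numpy as np
from scipy.stats import norm

def gaussianize(x):
    """Map samples to approximately standard-normal variates."""
    ranks = np.argsort(np.argsort(x)) + 1       # ranks 1..n
    u = ranks / (len(x) + 1.0)                  # empirical CDF values in (0, 1)
    return norm.ppf(u)                          # inverse standard-normal CDF

rng = np.random.default_rng(0)
x = rng.exponential(size=1000)                  # strongly non-Gaussian input
z = gaussianize(x)
print(round(z.mean(), 3), round(z.std(), 3))    # close to 0 and 1
```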

  11. Text document classification based on mixture models

    Czech Academy of Sciences Publication Activity Database

    Novovičová, Jana; Malík, Antonín

    2004-01-01

    Roč. 40, č. 3 (2004), s. 293-304. ISSN 0023-5954 R&D Projects: GA AV ČR IAA2075302; GA ČR GA102/03/0049; GA AV ČR KSK1019101 Institutional research plan: CEZ:AV0Z1075907 Keywords : text classification * text categorization * multinomial mixture model Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.224, year: 2004

  12. Copula-based bivariate binary response models

    OpenAIRE

    Winkelmann, Rainer

    2009-01-01

    The bivariate probit model is frequently used for estimating the effect of an endogenous binary regressor on a binary outcome variable. This paper discusses simple modifications that maintain the probit assumption for the marginal distributions while introducing non-normal dependence among the two variables using copulas. Simulation results and evidence from two applications, one on the effect of insurance status on ambulatory expenditure and one on the effect of completing high school on sub...

  13. Vision-based macroscopic pedestrian models

    OpenAIRE

    Degond, Pierre; Appert-Rolland, Cécile; Pettré, Julien; Theraulaz, Guy

    2013-01-01

    We propose a hierarchy of kinetic and macroscopic models for a system consisting of a large number of interacting pedestrians. The basic interaction rules are derived from earlier work where the dangerousness level of an interaction with another pedestrian is measured in terms of the derivative of the bearing angle (angle between the walking direction and the line connecting the two subjects) and of the time-to-interaction (time before reaching the closest distance between the two subjects). ...

  14. Topological Modeling Based Diagnostic Tests Selection

    OpenAIRE

    Eriņš, M

    2014-01-01

    This article covers the process of software testing. Test management and creation methods are described within the scope of the research. The process of test selection through several stages of project development is discussed and practical examples of appliance are given for the test organization and decision making with the help of topological models of software. The criteria of test ranging are described within scope of each of the testing levels. The paper indicates the use of topological...

  15. Topological Modeling Based Diagnostic Tests Selection

    OpenAIRE

    Erins, Matiss

    2015-01-01

    This article covers the process of software testing. Test management and creation methods are described within the scope of the research. The process of test selection through several stages of project development is discussed and practical examples of appliance are given for the test organization and decision making with the help of topological models of software. The criteria of test ranging are described within scope of each of the testing levels. The paper indicates the use of topological...

  16. News-Based Group Modeling and Forecasting

    OpenAIRE

    Zhang, Wenbin; Skiena, Steven

    2014-01-01

    In this paper, we study news group modeling and forecasting methods using quantitative data generated by our large-scale natural language processing (NLP) text analysis system. A news group is a set of news entities, like top U.S. cities, governors, senators, golfers, or movie actors. Our fame distribution analysis of news groups shows that log-normal and power-law distributions generally could describe news groups in many aspects. We use several real news groups including cities, politicians...

  17. Model-based satellite image fusion

    DEFF Research Database (Denmark)

    Aanæs, Henrik; Sveinsson, J. R.; Nielsen, Allan Aasbjerg;

    2008-01-01

    A method is proposed for pixel-level satellite image fusion derived directly from a model of the imaging sensor. By design, the proposed method is spectrally consistent. It is argued that the proposed method needs regularization, as is the case for any method for this problem. A framework for pixel......-hue-saturation method is revisited in order to gain additional insight into what implications the spectral consistency has for an image fusion method....

  18. Quality based strategy: modelling for lean manufacturing

    OpenAIRE

    Cruz Machado, Virgilio A.

    1994-01-01

    The research develops and applies an integrated methodology for creating a Lean Manufacturing Environment in a traditional industry: the Portuguese Textile and Clothing Industry. This is achieved by developing a modelling tool using quality as a basis of performance assessment. In the context of the textile industry specific research objectives were: to evaluate current and potential application of Lean Manufacturing; to determine current business performance assessment crit...

  19. STOCHASTIC ADAPTIVE SWITCHING CONTROL BASED ON MULTIPLE MODELS

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yanxia; GUO Lei

    2002-01-01

    It is well known that the transient behaviors of traditional adaptive control may be very poor in general, and that adaptive control designed based on switching between multiple models is an intuitively appealing and practically feasible approach to improve the transient performance. In this paper, we shall prove that for a typical class of linear systems disturbed by random noises, the multiple model based least-squares (LS) adaptive switching control is stable and convergent, and has the same convergence rate as that established for the standard least-squares-based self-tuning regulators. Moreover, the mixed case combining adaptive models with fixed models is also considered.

  20. Geometric Feature Extraction and Model Reconstruction Based on Scattered Data

    Institute of Scientific and Technical Information of China (English)

    胡鑫; 习俊通; 金烨

    2004-01-01

    A method of 3D model reconstruction based on scattered point data in reverse engineering is presented here. The topological relationship of the scattered points was established first, then the data set was triangulated to reconstruct the mesh surface model. The curvatures of the cloud data were calculated based on the mesh surface, and the point data were segmented by an edge-based method; every patch of data was fitted by a quadric or freeform surface, with the type of quadric surface decided automatically from its parameters; at last the whole CAD model was created. An example of a mouse model was employed to confirm the effectiveness of the algorithm.

  1. A PC-based graphical simulator for physiological pharmacokinetic models.

    Science.gov (United States)

    Wada, D R; Stanski, D R; Ebling, W F

    1995-04-01

    Since many intravenous anesthetic drugs alter blood flows, physiologically-based pharmacokinetic models describing drug disposition may be time-varying. Using the commercially available programming software MATLAB, a platform to simulate time-varying physiological pharmacokinetic models was developed. The platform is based upon a library of pharmacokinetic blocks which mimic physiological structure. The blocks can be linked together flexibly to form models for different drugs. Because of MATLAB's additional numerical capabilities (e.g. non-linear optimization), the platform provides a complete graphical microcomputer-based tool for physiologic pharmacokinetic modeling. PMID:7656558
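    A minimal sketch of the compartment-block idea (in Python rather than MATLAB): a blood and a tissue compartment exchange drug through a flow-limited transfer whose blood flow varies in time, which is what makes the model time-varying. Volumes, flows and dosing are invented for illustration and are not drug-specific values from the paper.

```python
import numpy as np

def simulate(T=60.0, dt=0.01, dose=10.0):
    V = {"blood": 5.0, "tissue": 30.0}              # compartment volumes (L)
    C = {"blood": dose / V["blood"], "tissue": 0.0} # bolus dose into blood
    trace = []
    for k in range(int(T / dt)):
        t = k * dt
        Q = 1.0 + 0.5 * np.exp(-t / 10.0)           # time-varying blood flow (L/min)
        transfer = Q * (C["blood"] - C["tissue"])   # flow-limited exchange
        C["blood"] += dt * (-transfer) / V["blood"]
        C["tissue"] += dt * transfer / V["tissue"]
        trace.append((t, C["blood"]))
    return trace

print(simulate()[-1])  # blood concentration at t = 60 min
```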

  2. A method to manage the model base in DSS

    Institute of Scientific and Technical Information of China (English)

    孙成双; 李桂君

    2004-01-01

    How to manage and use models in a DSS is one of the most important subjects. Generally, it costs a lot of money and time to develop a model base management system in the development of a DSS, and most such systems are simple in function or cannot be used efficiently in practice. Making use of the interfaces of professional computer software to develop a model base management system is a very effective, applicable, and economical choice. This paper presents a method of using MATLAB, a well-known numerical computing package, as the development platform of a model base management system. The main functional framework of a MATLAB-based model base management system is discussed. Finally, its feasible application is illustrated in the field of construction projects.

  3. Robust speech features representation based on computational auditory model

    Institute of Scientific and Technical Information of China (English)

    LU Xugang; JIA Chuan; DANG Jianwu

    2004-01-01

    A speech signal processing and feature extraction method based on a computational auditory model is proposed. The computational model is based on psychological and physiological knowledge and digital signal processing methods. For each stage of the hearing perception system, there is a corresponding computational model to simulate its function, and speech features at different levels are extracted in each stage. A further processing step for the primary auditory spectrum, based on lateral inhibition, is proposed to extract much more robust speech features. All these features can be regarded as internal representations of the speech stimulus in the hearing system. Robust speech recognition experiments were conducted to test the robustness of the features. Results show that the representations based on the proposed computational auditory model are robust representations of speech signals.
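    The lateral-inhibition step mentioned above amounts to center-surround sharpening across frequency channels: each channel is boosted while its neighbors are subtracted. A sketch with an illustrative kernel (the weights are an assumption, not the paper's):

```python
import numpy as np

def lateral_inhibition(spectrum, s=0.4):
    """Center-surround sharpening; the kernel weights sum to one."""
    kernel = np.array([-s, 1.0 + 2.0 * s, -s])
    return np.convolve(spectrum, kernel, mode="same")

channels = np.exp(-0.5 * ((np.arange(32) - 12) / 3.0) ** 2)  # smooth formant peak
print(lateral_inhibition(channels).round(2))                 # sharpened peak
```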

  4. A general model for membrane-based separation processes

    DEFF Research Database (Denmark)

    Soni, Vipasha; Abildskov, Jens; Jonsson, Gunnar Eigil;

    2009-01-01

    A separation process could be defined as a process that transforms a given mixture of chemicals into two or more compositionally distinct end-use products. One way to design these separation processes is to employ a model-based approach, where mathematical models that reliably predict the process...... behaviour will play an important role. In this paper, modelling of membrane-based processes for separation of gas and liquid mixtures are considered. Two general models, one for membrane-based liquid separation processes (with phase change) and another for membrane-based gas separation are presented. The...... modelling assumptions. Analyses of the generated models, together with their validation and application in process design/analysis are highlighted through several case studies....

  5. Adaptive Digital Image Watermarking Based on Combination of HVS Models

    Directory of Open Access Journals (Sweden)

    P. Foris

    2009-09-01

    Full Text Available In this paper two new blind adaptive digital watermarking methods for color images are presented. The adaptability is based on perceptual watermarking which exploits Human Visual System (HVS) models. The first method performs watermark embedding in the DCT transform domain and the second method is based on the DWT. The watermark is embedded into the transform domain of a chosen color image component in a selected color space. Both methods use a combination of HVS models to select perceptually significant transform coefficients and at the same time to determine the bounds of modification of the selected coefficients. The final HVS model consists of three parts: the HVS model in the DCT (DWT) domain, the HVS model based on the Region of Interest, and the HVS model based on the Noise Visibility Function. The watermark has the form of a real-number sequence with normal distribution.
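    A minimal sketch of perceptually weighted DCT-domain embedding: a normally distributed watermark sequence is added to selected coefficients, scaled by a visibility bound. The combined HVS model of the paper (DCT/DWT masking, Region of Interest, Noise Visibility Function) is collapsed here into a single constant bound alpha and a crude coefficient selection rule, purely for brevity.

```python
import numpy as np
from scipy.fft import dctn, idctn

def embed(image, key=42, alpha=2.0, n_coeffs=1000):
    """Add a normal watermark to selected DCT coefficients of a gray image."""
    coeffs = dctn(image.astype(float), norm="ortho")
    flat = coeffs.ravel()                       # view into coeffs
    # keep strong coefficients, skipping the 100 largest (low frequencies),
    # as a crude stand-in for HVS-based coefficient selection
    idx = np.argsort(np.abs(flat))[::-1][100:100 + n_coeffs]
    w = np.random.default_rng(key).normal(size=n_coeffs)
    flat[idx] += alpha * w                      # alpha plays the visibility bound
    return idctn(coeffs, norm="ortho"), w, idx

img = np.random.default_rng(0).uniform(0, 255, (64, 64))
marked, w, idx = embed(img)
print(float(np.abs(marked - img).max()))        # embedding distortion
```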

  6. Category model to modeling the surface texture knowledge-base

    OpenAIRE

    Yan WANG; Scott, Paul J; Jiang, Xiang

    2006-01-01

    The next generations of Geometrical Product Specification (GPS) standards are considered to be too theoretical, abstract, complex and over-elaborate, and it is not easy for industry to understand and implement them efficiently in a short time. An intelligent knowledge-based system, “VirtualSurf”, is being developed to solve this problem, particularly for surface texture knowledge, which is a critical part of GPS. This system will provide expert knowledge of surface texture to link...

  7. Characteristics of a Logistics-Based Business Model

    OpenAIRE

    Sandberg, Erik; Kihlén, Tobias; Abrahamsson, Mats

    2011-01-01

    In companies where excellence in logistics is decisive for the outperformance of competitors and logistics has an outspoken role for the strategy of the firm, there is present what we refer to here as a “logistics-based business model.” Based on a multiple case study of three Nordic retail companies, the purpose of this article is to explore the characteristics of such a logistics-based business model. As such, this research helps to provide structure to logistics-based business models and id...

  8. A Technology-based Model for Learning

    Directory of Open Access Journals (Sweden)

    Michael Williams

    2004-12-01

    Full Text Available The Math Emporium, opened in 1997, is an open 7000-square-meter facility with 550+ workstations arranged in an array of widely spaced hexagonal "pods", designed to support group work while maintaining an academic air. We operate it 24/7, with math support personnel in attendance 12 hours per day. Students have access to online course resources at all times, from anywhere. We have used this unique asset to transform traditional classroom-based courses into technology-based learning programs that have no class meetings at all. The structure of the program is very different from the conventional one, with a new set of expectations and motivations. The results include more effective students, substantial cost savings, economies of scale and scope, and a streamlined process for creating new online courses.

  9. Ray-Based Reflectance Model for Diffraction

    CERN Document Server

    Cuypers, Tom; Haber, Tom; Bekaert, Philippe; Raskar, Ramesh

    2011-01-01

    We present a novel method of simulating wave effects in graphics using ray-based renderers with a new function: the Wave BSDF (Bidirectional Scattering Distribution Function). Reflections from neighboring surface patches represented by local BSDFs are mutually independent. However, in many surfaces with wavelength-scale microstructures, interference and diffraction require a joint analysis of reflected wavefronts from neighboring patches. We demonstrate a simple method to compute the BSDF for the entire microstructure, which can be used independently for each patch. This allows us to use traditional ray-based rendering pipelines to synthesize wave effects of light and sound. We exploit the Wigner Distribution Function (WDF) to create transmissive, reflective, and emissive BSDFs for various diffraction phenomena in a physically accurate way. In contrast to previous methods for computing interference, we circumvent the need to explicitly keep track of the phase of the wave by using BSDFs that include positiv...

  10. Towards a contract-based interoperation model

    OpenAIRE

    Fernández Peña, Félix Oscar; Willmott, Steven Nicolás

    2007-01-01

    Web Services-based solutions for interoperating processes are considered to be one of the most promising technologies for achieving truly interoperable functioning in open environments. In the last three years, the specification in particular of agreements between resource / service providers and consumers, as well as protocols for their negotiation have been proposed as a possible solution for managing the resulting computing systems. In this report, the state of the art in the area of contr...

  11. A Time Based Blended Learning Model

    OpenAIRE

    Norberg, Anders; Dziuban, Charles D; Moskal, Patsy M

    2011-01-01

    Purpose – This paper seeks to outline a time-based strategy for blended learning that illustrates course design and delivery by framing students' learning opportunities in synchronous and asynchronous modalities. Design/methodology/approach – This paper deconstructs the evolving components of blended learning in order to identify changes induced by digital technologies for enhancing teaching and learning environments. Findings – This paper hypothesizes that blended learning may be traced back...

  12. Adopsi Model Competency Based Training dalam Kewirausahaan

    OpenAIRE

    I Ketut Santra

    2009-01-01

    The aim of the research is to improve the teaching method in the entrepreneurship subject. This research adopted competency based training (CBT) into entrepreneurship. The major task in this research was to formulate and design the entrepreneurship competencies. Entrepreneurship competency is indicated by personal, strategic, situational and business competence. Each entrepreneurship competence is described in sub-topics of competence. After designing and formulating the game and simulat...

  13. On a Flow-Based Paradigm in Modeling and Programming

    OpenAIRE

    Sabah Al-Fedaghi

    2015-01-01

    In computer science, the concept of flow is reflected in many terms such as data flow, control flow, message flow, information flow, and so forth. Many fields of study utilize the notion, including programming, communication (e.g., Shannon-Weaver communication model), software modeling, artificial intelligence, and knowledge representation. This paper focuses on two approaches that explicitly assert a flow-based paradigm: flow-based programming (FBP) and flowthing modeling (FM). The first is ...

  14. Model Based Bootstrap Methods for Interval Censored Data

    OpenAIRE

    Sen, Bodhisattva; Xu, Gongjun

    2013-01-01

    We investigate the performance of model based bootstrap methods for constructing point-wise confidence intervals around the survival function with interval censored data. We show that bootstrapping from the nonparametric maximum likelihood estimator of the survival function is inconsistent for both the current status and case 2 interval censoring models. A model based smoothed bootstrap procedure is proposed and shown to be consistent. In addition, simulation studies are conducted to illustra...

  15. Integration Issues of an Ontology based Context Modelling Approach

    OpenAIRE

    Strang, Thomas; Linnhoff-Popien, Claudia; Frank, Korbinian

    2003-01-01

    In this paper we analyse the applicability of our ontology based context modelling approach, considering a range of use cases. After wrapping up the model and the Context Ontology Language (CoOL) derived from it, we introduce some interesting applications of the language, based on a scenario showing the challenges in context aware service interactions. We focus on two submodels of our model for context aware service interactions, namely Context Bindings and Context Obligations, and demonstrat...

  16. Study of Teaching Model based on Cooperative Learning

    OpenAIRE

    Jing-qin SU; Fei-xue HUANG

    2010-01-01

    Cooperative learning is now a popular teaching method around the world. This paper first discusses the teaching model based on cooperative learning, then analyzes the advantages of cooperative learning, and at last proposes the steps for carrying out cooperative learning. It is necessary to introduce the teaching model based on cooperative learning into teaching for training software talents in China.
    Key words: Cooperative Learning; Training Model; Teach...

  17. A VENSIM BASED ANALYSIS FOR SUPPLY CHAIN MODEL

    OpenAIRE

    Mohammad SHAMSUDDOHA; Alexandru Mircea NEDELEA

    2013-01-01

    The emphasis on supply chain has increased in recent years among academic and industry circles. In this paper, a supply chain model will be developed based on a case study of the poultry industry under the Vensim environment. System dynamics, supply chain, design science and case method under positivist and quantitative paradigm will be studied to develop a simulation model. The objectives of this paper are to review literature, develop a Vensim based simulation supply chain model, and examin...

  18. A new physics based SPICE model for NPT IGBT

    OpenAIRE

    Rui Filipe Marques Chibante; Armando Luís Sousa Araújo; Adriano da Silva Carvalho

    2003-01-01

    A physics based, Non-Punch-Through, Insulated Gate Bipolar Transistor (NPT-IGBT) model is presented, as well as its porting into the available circuit simulator SPICE. The developed model results in a system of ODEs, from which the time/space hole/electron distribution is obtained, and is based on the solution of the ambipolar diffusion equation (ADE) through a variational formulation, with posterior implementation using one-dimensional simplex finite elements. Other parts of the device are modeled using stan...

  19. Physics-Based Learning Models for Ship Hydrodynamics

    OpenAIRE

    Weymouth, Gabriel D.; Yue, Dick K.P.

    2014-01-01

    We present the concepts of physics-based learning models (PBLM) and their relevance and application to the field of ship hydrodynamics. The utility of physics-based learning is motivated by contrasting generic learning models for regression predictions, which do not presume any knowledge of the system other than the training data provided, with methods such as semi-empirical models, which incorporate physical insights along with data-fitting. PBLM provides a framework wherein intermediate mode...

  20. Nonlinear Hamiltonian modelling of magnetic shape memory alloy based actuators.

    OpenAIRE

    Gauthier, Jean-Yves; Hubert, Arnaud; Abadie, Joël; Chaillet, Nicolas; Lexcellent, Christian

    2008-01-01

    This paper proposes an application of the Lagrangian formalism and its Hamiltonian extension to design, model and control a mechatronic system using Magnetic Shape Memory Alloys. To this aim, an original dynamical modelling of a Magnetic Shape Memory Alloy based actuator is presented. Energy-based techniques are used to obtain a coherent modelling of the magnetic, mechanical and thermodynamic phenomena. The Lagrangian formalism, well suited in such a case, is introduced and used to take int...

  1. Semiparametric theory based MIMO model and performance analysis

    Institute of Scientific and Technical Information of China (English)

    XU Fang-min; XU Xiao-dong; ZHANG Ping

    2007-01-01

    In this article, a new approach for modeling multi-input multi-output (MIMO) systems with unknown nonlinear interference is introduced. The semiparametric theory based MIMO model is established, and kernel estimation is applied to combat the nonlinear interference. Furthermore, we derive the MIMO capacity for these systems and explore the asymptotic properties of the new channel matrix via theoretical analysis. The simulation results show that the semiparametric theory based modeling and kernel estimation are valid for combating this kind of interference.
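    The kernel-estimation ingredient can be sketched with a Nadaraya-Watson smoother, which estimates an unknown nonlinear component nonparametrically; the Gaussian kernel, bandwidth and data below are illustrative choices, not those of the article.

```python
import numpy as np

def nadaraya_watson(x_grid, x, y, h=0.2):
    """Gaussian-kernel estimate of E[y | x] on the points in x_grid."""
    w = np.exp(-0.5 * ((x_grid[:, None] - x[None, :]) / h) ** 2)
    return (w @ y) / w.sum(axis=1)

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.1, 200)   # nonlinear term + noise
grid = np.linspace(0.0, 1.0, 5)
print(nadaraya_watson(grid, x, y).round(2))             # recovered nonlinearity
```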

  2. Fuzzy model-based control of a nuclear reactor

    International Nuclear Information System (INIS)

    The fuzzy model-based control of a nuclear power reactor is an emerging research topic world-wide. SCK-CEN is dealing with this research in a preliminary stage, including two aspects, namely fuzzy control and fuzzy modelling. The aim is to combine both methodologies in contrast to conventional model-based PID control techniques, and to state advantages of including fuzzy parameters as safety and operator feedback. This paper summarizes the general scheme of this new research project

  3. Container Terminal Operations Modeling through Multi agent based Simulation

    OpenAIRE

    Ayub, Yasir; Faruki, Usman

    2009-01-01

    This thesis aims to propose a multi-agent based hierarchical model for the operations of container terminals. We have divided our model into four key agents that are involved in each sub-process. The proposed agent allocation policies are recommended for different situations that may occur at a container terminal. A software prototype has been developed which implements the hierarchical model. This web based application is used to simulate the various processes involved in the following ...

  4. Criteria for folding in structure-based models of proteins

    OpenAIRE

    Wołek, Karol; Cieplak, Marek

    2016-01-01

    In structure-based models of proteins, one often assumes that folding is accomplished when all contacts are established. This assumption may frequently lead to a conceptual problem that folding takes place in a temperature region of very low thermodynamic stability, especially when the contact map used is too sparse. We consider six different structure-based models and show that allowing for a small, but model-dependent, percentage of the native contacts not being established boosts the foldi...

  5. Model based control of dynamic atomic force microscope

    International Nuclear Information System (INIS)

    A model-based robust control approach is proposed that significantly improves the imaging bandwidth of dynamic mode atomic force microscopy. A model for cantilever oscillation amplitude and phase dynamics is derived and used for the control design. In particular, the control design is based on a linearized model and robust H∞ control theory. This design yields a significant improvement over conventional proportional-integral designs, as verified by experiments

  6. Modelling catchment hydrology within a GIS based SVAT-model framework

    OpenAIRE

    Ludwig, R.; Mauser, W

    2000-01-01

    The physically-based soil-vegetation-atmosphere-transfer model PROMET (PRocess-Oriented Model for Evapo Transpiration) developed at the Institute of Geography, University of Munich, is applied to the Ammer basin (approx. 600 km²) in the alpine foreland of Germany. The hourly actual evapotranspiration rate is calculated for a 14-year time series. A rainfall-runoff model, based on an enhanced distributed TOPMODEL structure, is linked to the SVAT-model in order to provide a hydrological model c...

  7. Towards automatic model based controller design for reconfigurable plants

    DEFF Research Database (Denmark)

    Michelsen, Axel Gottlieb; Stoustrup, Jakob; Izadi-Zamanabadi, Roozbeh

    2008-01-01

    This paper introduces model-based Plug and Play Process Control, a novel concept for process control, which allows a model-based control system to be reconfigured when a sensor or an actuator is plugged into a controlled process. The work reported in this paper focuses on composing a monolithic...... model from models of a process to be controlled and the actuators and sensors connected to the process, and propagation of tuning criteria from these sub-models, thereby accommodating automatic controller synthesis using existing methods. The developed method is successfully tested on an industrial case...

  8. Verification and Validation of Model-Based Autonomous Systems

    Science.gov (United States)

    Pecheur, Charles; Koga, Dennis (Technical Monitor)

    2001-01-01

    This paper presents a three year project (FY99 to FY01) on the verification and validation of model based autonomous systems. The topics include: 1) Project Profile; 2) Model-Based Autonomy; 3) The Livingstone MIR; 4) MPL2SMV; 5) Livingstone to SMV Translation; 6) Symbolic Model Checking; 7) From Livingstone Models to SMV Models; 8) Application In-Situ Propellant Production; 9) Closed-Loop Verification Principle; 10) Livingstone PathFinder (LPF); 11) Publications and Presentations; and 12) Future Directions. This paper is presented in viewgraph form.

  9. A VENSIM BASED ANALYSIS FOR SUPPLY CHAIN MODEL

    Directory of Open Access Journals (Sweden)

    Mohammad SHAMSUDDOHA

    2014-01-01

    Full Text Available The emphasis on supply chain has increased in recent years among academic and industry circles. In this paper, a supply chain model is developed based on a case study of the poultry industry under the Vensim environment. System dynamics, supply chain, design science and the case method under a positivist and quantitative paradigm are studied to develop a simulation model. The objectives of this paper are to review the literature, develop a Vensim-based simulation supply chain model, and examine the model qualitatively and quantitatively. The model is also briefly discussed in relation to the forward, reverse and mainstream supply chains of the case.
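    The stock-and-flow logic that a Vensim model encodes reduces to rate equations integrated over time. A toy Euler-integration sketch of an inventory stock with a forward production flow and a small reverse (returns) flow; the rates, target and return fraction are invented, not the case-study values.

```python
# One-stock system dynamics model integrated with simple Euler steps.
dt, T = 0.25, 52.0                                # weekly steps over one year
stock, demand, target = 100.0, 40.0, 120.0
t = 0.0
while t < T:
    production = demand + 0.2 * (target - stock)  # replenish toward target
    returns = 0.05 * demand                       # reverse supply chain flow
    stock += dt * (production + returns - demand)
    t += dt
print(round(stock, 2))                            # inventory settles near target
```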

  10. On grey relation projection model based on projection pursuit

    Institute of Scientific and Technical Information of China (English)

    Wang Shuo; Yang Shanlin; Ma Xijun

    2008-01-01

    Multidimensional grey relation projection values can be synthesized into a one-dimensional projection value using a projection pursuit model. The larger the projection value is, the better the model. Thus, according to the projection value, the best one can be chosen from the model aggregation. Because projection pursuit modeling based on an accelerating genetic algorithm can simplify the implementation procedure of the projection pursuit technique and overcome its complex calculation as well as the difficulty of implementing its program, a new method can be obtained for choosing the best grey relation projection model based on the projection pursuit technique.
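    The projection step itself is compact: grey relational coefficients of each alternative are projected onto a unit direction vector, and a projection pursuit index scores candidate directions. In this sketch a crude random search stands in for the accelerating genetic algorithm of the paper, and the coefficient matrix is random:

```python
import numpy as np

rng = np.random.default_rng(0)
R = rng.uniform(0.3, 1.0, size=(8, 4))   # grey relational coefficient matrix

def projection_values(a):
    a = a / np.linalg.norm(a)            # unit projection direction
    return R @ a

def pp_index(a):
    """Classic index: overall spread times local density of projections."""
    z = projection_values(a)
    s = z.std()
    d = np.abs(z[:, None] - z[None, :])
    r = 0.1 * s                          # local density window
    return s * np.sum((r - d) * (d < r))

best = max((rng.uniform(0.0, 1.0, 4) for _ in range(2000)), key=pp_index)
print(projection_values(best).round(3))  # larger value = better alternative
```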

  11. Extending EMMS-based models to CFB boiler applications

    Institute of Scientific and Technical Information of China (English)

    Bona Lu; Nan Zhang; Wei Wang; Jinghai Li

    2012-01-01

    Recently, EMMS-based models have been widely applied in simulations of high-throughput circulating fluidized beds (CFBs) with fine particles. Their use for low-flux systems, such as the CFB boiler (CFBB), still remains unexplored. In this work, it has been found that the original definition of cluster diameter in the EMMS model is unsuitable for simulations of the CFB boiler with low solids flux. To remedy this, we propose a new model of cluster diameter. The EMMS-based drag model (EMMS/matrix model) with this revised cluster definition is validated through computational fluid dynamics (CFD) simulation of a CFB boiler.

  12. An XML-based information model for archaeological pottery

    Institute of Scientific and Technical Information of China (English)

    LIU De-zhi; RAZDAN Anshuman; SIMON Arleyn; BAE Myungsoo

    2005-01-01

    An information model is defined to support sharing scientific information on the Web for archaeological pottery. Apart from non-shape information, such as age, material, etc., the model also consists of shape information and shape feature information. Shape information is collected by laser scanner and geometric modelling techniques. Feature information is generated from shape information via feature extraction techniques. The model is used in an integrated storage, archival, and sketch-based query and retrieval system for 3D objects, Native American ceramic vessels. A novel aspect of the information model is that it is totally implemented with XML and is designed for Web-based visual query and storage applications.

  13. Visual Data Recognition and Modeling Based on Local Markovian Models

    Czech Academy of Sciences Publication Activity Database

    Haindl, Michal

    Vol. 14. London: Springer London, 2012 - (Florack, L.; Duits, R.; Jongbloed, G.; Lieshout, M.; Davies, L.), s. 241-259. (Computational Imaging and Vision). ISBN 978-1-4471-2353-8 R&D Projects: GA MŠk 1M0572; GA ČR GAP103/11/0335; GA ČR GA102/08/0593 Grant ostatní: CESNET(CZ) 387/2010 Institutional research plan: CEZ:AV0Z10750506 Keywords : Markov random fields * image modeling * image recognition Subject RIV: BD - Theory of Information http://library.utia.cas.cz/separaty/2012/RO/haindl-0374142.pdf

  14. Fujisaki Model Based Intonation Modeling for Korean TTS System

    Science.gov (United States)

    Kim, Byeongchang; Lee, Jinsik; Lee, Gary Geunbae

    One of the enduring problems in developing a high-quality TTS (text-to-speech) system is pitch contour generation. Taking language-specific knowledge into account, an adjusted Fujisaki model for a Korean TTS system is introduced along with refined machine-learning features. The results of quantitative and qualitative evaluations show the validity of our system: the accuracy of phrase command prediction is 0.8928; the correlations of the predicted amplitudes of a phrase command and an accent command are 0.6644 and 0.6002, respectively; our method achieved a "fair" level of naturalness (3.6) on a MOS scale for generated F0 curves.
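
    For reference, the Fujisaki model represents the log-F0 contour as a baseline plus superposed phrase and accent components. In its standard general formulation (not the Korean-specific adjustments above),

        ln F_0(t) = ln F_b + \sum_{i=1}^{I} A_{p,i} G_p(t - T_{0i}) + \sum_{j=1}^{J} A_{a,j} [ G_a(t - T_{1j}) - G_a(t - T_{2j}) ]

    where G_p(t) = \alpha^2 t e^{-\alpha t} is the phrase control response and G_a(t) = \min[ 1 - (1 + \beta t) e^{-\beta t}, \gamma ] the accent control response, both zero for t < 0. The phrase and accent command amplitudes A_{p,i} and A_{a,j} are the quantities whose prediction accuracies are reported above.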

  15. IDEF method-based simulation model design and development framework

    Directory of Open Access Journals (Sweden)

    Ki-Young Jeong

    2009-09-01

    Full Text Available The purpose of this study is to provide an IDEF method-based integrated framework for business process simulation models that reduces model development time by increasing communication and knowledge reusability during a simulation project. In this framework, simulation requirements are collected by a function modeling method (IDEF0) and a process modeling method (IDEF3). Based on these requirements, a common data model is constructed using the IDEF1X method. From this reusable data model, multiple simulation models are automatically generated using a database-driven simulation model development approach. The framework is claimed to help both the requirements-collection and experimentation phases of a simulation project by improving system knowledge, model reusability, and maintainability through the systematic use of three descriptive IDEF methods and the features of relational database technologies. A complex semiconductor fabrication case study was used as a testbed to evaluate and illustrate the concepts and the framework. Two different simulation software products were used to develop and control the semiconductor model from the same knowledge base. The case study empirically showed that this framework could help improve simulation project processes by using IDEF-based descriptive models and relational database technology. The authors also concluded that this framework could easily be applied to the generation of other analytical models by separating the logic from the data.

  16. Weibull Parameters Estimation Based on Physics of Failure Model

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Sørensen, John Dalsgaard

    2012-01-01

    Reliability estimation procedures are discussed for the example of fatigue development in solder joints using a physics of failure model. The accumulated damage is estimated based on the physics of failure model, the Rainflow counting algorithm and Miner's rule. A threshold model is used for de...
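
    As an illustration of the damage-accumulation step only (Rainflow counting itself is omitted), a minimal sketch assuming a Basquin-type S-N curve N = C * S^(-m) with hypothetical constants:

        import numpy as np

        # Hypothetical S-N curve constants (illustrative only, not from the paper)
        C, m = 1e12, 3.0

        def cycles_to_failure(stress_range):
            # Basquin-type S-N curve: allowable cycles N = C * S^(-m)
            return C * stress_range ** (-m)

        def miner_damage(stress_ranges, counts):
            # Miner's rule: damage is the sum of applied over allowable cycles.
            stress_ranges = np.asarray(stress_ranges, float)
            counts = np.asarray(counts, float)
            return float(np.sum(counts / cycles_to_failure(stress_ranges)))

        # Example: cycle ranges and counts as produced by a Rainflow count
        damage = miner_damage([40.0, 80.0, 120.0], [1e5, 2e4, 5e2])
        # Failure is predicted when accumulated damage reaches 1.0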

  17. Understanding Elementary Astronomy by Making Drawing-Based Models

    Science.gov (United States)

    van Joolingen, W. R.; Aukes, Annika V.; Gijlers, H.; Bollen, L.

    2015-01-01

    Modeling is an important approach in the teaching and learning of science. In this study, we attempt to bring modeling within the reach of young children by creating the SimSketch modeling system, which is based on freehand drawings that can be turned into simulations. This system was used by 247 children (ages ranging from 7 to 15) to create a…

  18. Accurate Load Modeling Based on Analytic Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Zhenshu Wang

    2016-01-01

    Full Text Available Establishing an accurate load model is a critical problem in power system modeling, with significant implications for power system digital simulation and dynamic security analysis. The synthesis load model (SLM) considers the impact of the power distribution network and compensation capacitors, while the randomness of power load is more precisely described by the traction power system load model (TPSLM). On the basis of these two load models, a load modeling method that combines synthesis load with traction power load is proposed in this paper. This method uses the analytic hierarchy process (AHP) to combine the two load models. Weight coefficients of the two models can be calculated after formulating criteria and judgment matrices, and a synthesis model is then established from the weight coefficients. The effectiveness of the proposed method was examined through simulation. The results show that accurate load modeling based on AHP can effectively improve the accuracy of the load model and prove the validity of this method.
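
    As a sketch of the AHP weighting step described above, the weight coefficients can be taken as the normalized principal eigenvector of the judgment (pairwise-comparison) matrix; the 2x2 judgment value below is illustrative, not from the paper:

        import numpy as np

        def ahp_weights(judgment):
            # Principal eigenvector of the pairwise-comparison matrix,
            # normalized so the weights sum to 1.
            vals, vecs = np.linalg.eig(np.asarray(judgment, float))
            v = np.abs(vecs[:, np.argmax(vals.real)].real)
            return v / v.sum()

        # Illustrative judgment: SLM judged twice as important as TPSLM
        J = [[1.0, 2.0],
             [0.5, 1.0]]
        w_slm, w_tpslm = ahp_weights(J)
        # Composite load model: P = w_slm * P_slm + w_tpslm * P_tpslm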

  19. A Requirements Analysis Model Based on QFD

    Institute of Scientific and Technical Information of China (English)

    TANG Zhi-wei; Nelson K.H.Tang

    2004-01-01

    The enterprise resource planning (ERP) system has emerged to offer an integrated IT solution, and more and more enterprises are adopting it and regarding it as an important innovation. However, there is already evidence of high failure risk in ERP project implementation; one major reason is poor analysis of the requirements for system implementation. In this paper, the importance of requirements analysis for ERP project implementation is highlighted, and a requirements analysis model applying quality function deployment (QFD) is presented, which supports requirements analysis for ERP projects.

  20. A Knowledge Representation Model for Video—Based Animation

    Institute of Scientific and Technical Information of China (English)

    劳志强; 潘云鹤

    1998-01-01

    In this paper, a brief survey of knowledge-based animation techniques is given. Then a VideoStream-based Knowledge Representation Model (VSKRM) for Joint Objects is presented, which includes the knowledge representation of: Graphic Object, Action and VideoStream. Next, a general description of the UI framework of a system based on the VSKRM model is given. Finally, a conclusion is reached.

  1. Model-based design of integrated production systems: a review

    OpenAIRE

    Ould Sidi, Mohamed Mahmoud; Lescourret, Francoise

    2011-01-01

    Pest resistance and water pollution are major issues caused by the excessive use of pesticides in intensive agriculture. The concept of the integrated production system (IPS) has thus been designed to solve those issues and to meet the need for better food quality and production. Methodologies such as agronomic diagnosis-based design, prototyping, and model-based design have been developed. Here we review the model-based design of IPS. We identify tools for the development of comprehensive m...

  2. Parametric Modeling for Globoidal Cam Based and NC Machining

    OpenAIRE

    Haitao Liu; Zhibin Chang; Lei Zhang; Wei Jiang; Guangguo Zhang

    2013-01-01

    The design and machining of globoidal cams is difficult because the shape of the working surface is complex and the curved surface is undevelopable in space. For these reasons, we propose a parametric modeling method for globoidal cams. Through secondary development, we realized parameterized modeling of the globoidal cam. Based on the globular indexing cam NC milling machine available in the lab, we research the programming technology and analyse the defects of conventional p...

  3. Agent-based Modeling and Mapping of Manufacturing System

    Institute of Scientific and Technical Information of China (English)

    Z. Zhang

    2002-01-01

    Considering agent-based modeling and mapping in manufacturing systems, this paper describes several system models, including: Domain Based Hierarchical Structure (DBHS), Cascading Agent Structure (CAS), Proximity Relation Structure (PRS), and Bus-based Network Structure (BNS). In DBHS, one sort of agent individually delegates Domain Agents, Resources Agents, UserInterface Agents and Gateway Agents, and the other is a broker of tasks and process flow. Static agents representing...

  4. Agent-Based Modeling and Mapping of Manufacturing System

    Institute of Scientific and Technical Information of China (English)

    Z. Zhang

    2002-01-01

    Considering agent-based modeling and mapping in manufacturing systems, some system models are described in this paper, including: Domain Based Hierarchical Structure (DBHS), Cascading Agent Structure (CAS), Proximity Relation Structure (PRS), and Bus-based Network Structure (BNS). In DBHS, one sort of agents, called static agents, individually acts as Domain Agents, Resources Agents, UserInterface Agents and Gateway Agents. And the others, named mobile agents, are the brokers of task and ...

  5. Reduced model-based decision-making in schizophrenia.

    Science.gov (United States)

    Culbreth, Adam J; Westbrook, Andrew; Daw, Nathaniel D; Botvinick, Matthew; Barch, Deanna M

    2016-08-01

    Individuals with schizophrenia have a diminished ability to use reward history to adaptively guide behavior. However, tasks traditionally used to assess such deficits often rely on multiple cognitive and neural processes, leaving etiology unresolved. In the current study, we adopted recent computational formalisms of reinforcement learning to distinguish between model-based and model-free decision-making in hopes of specifying mechanisms associated with reinforcement-learning dysfunction in schizophrenia. Under this framework, decision-making is model-free to the extent that it relies solely on prior reward history, and model-based if it relies on prospective information such as motivational state, future consequences, and the likelihood of obtaining various outcomes. Model-based and model-free decision-making was assessed in 33 schizophrenia patients and 30 controls using a 2-stage 2-alternative forced choice task previously demonstrated to discern individual differences in reliance on the 2 forms of reinforcement-learning. We show that, compared with controls, schizophrenia patients demonstrate decreased reliance on model-based decision-making. Further, parameter estimates of model-based behavior correlate positively with IQ and working memory measures, suggesting that model-based deficits seen in schizophrenia may be partially explained by higher-order cognitive deficits. These findings demonstrate specific reinforcement-learning and decision-making deficits and thereby provide valuable insights for understanding disordered behavior in schizophrenia. PMID:27175984
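
    In the computational framework referenced here, reliance on the two systems is commonly formalized as a weighted mixture of model-based and model-free action values at the first stage of the task; a schematic of that mixing step (all names illustrative):

        def hybrid_value(q_mb, q_mf, w):
            # w in [0, 1] is the degree of model-based control;
            # lower w corresponds to reduced model-based decision-making.
            return w * q_mb + (1.0 - w) * q_mf

        # First-stage values for two actions, given each system's estimates
        q = [hybrid_value(mb, mf, w=0.4) for mb, mf in [(0.7, 0.2), (0.5, 0.6)]]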

  6. Model and Behavior-Based Robotic Goalkeeper

    DEFF Research Database (Denmark)

    Lausen, H.; Nielsen, J.; Nielsen, M.;

    2003-01-01

    This paper describes the design, implementation and test of a goalkeeper robot for the Middle-Size League of RoboCup. The goalkeeper task is implemented by a set of primitive tasks and behaviours coordinated by a 2-level hierarchical state machine. The primitive tasks concerning complex motion control are implemented by a non-linear control algorithm, adapted to the different task goals (e.g., follow the ball or the robot posture), from local features extracted from images acquired by a catadioptric omni-directional vision system. Most robot parameters were designed based on simulations carried...

  7. Phase Correlation Based Iris Image Registration Model

    Institute of Scientific and Technical Information of China (English)

    Jun-Zhou Huang; Tie-Niu Tan; Li Ma; Yun-Hong Wang

    2005-01-01

    Iris recognition is one of the most reliable personal identification methods. In iris recognition systems, image registration is an important component. Accurately registering iris images leads to a higher recognition rate for an iris recognition system. This paper proposes a phase correlation based method for iris image registration with sub-pixel accuracy. Compared with existing methods, it is insensitive to image intensity and can compensate, to a certain extent, for the non-linear iris deformation caused by pupil movement. Experimental results show that the proposed algorithm has an encouraging performance.
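
    Independently of the paper's sub-pixel refinement and deformation compensation, basic phase correlation recovers a translation from the peak of the inverse-transformed, normalized cross-power spectrum; a minimal integer-pixel sketch:

        import numpy as np

        def phase_correlation_shift(a, b):
            # Estimate the integer translation mapping image b onto image a.
            Fa, Fb = np.fft.fft2(a), np.fft.fft2(b)
            cross = Fa * np.conj(Fb)
            # Normalized cross-power spectrum; epsilon guards against division by zero
            r = np.fft.ifft2(cross / (np.abs(cross) + 1e-12))
            peak = np.unravel_index(np.argmax(np.abs(r)), r.shape)
            # Indices past the midpoint correspond to negative shifts (FFT wrap-around)
            return tuple(int(p) if p <= s // 2 else int(p) - s
                         for p, s in zip(peak, r.shape))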

  8. An Efficient Semantic Model For Concept Based Clustering And Classification

    Directory of Open Access Journals (Sweden)

    SaiSindhu Bandaru

    2012-03-01

    Full Text Available Usually in text mining techniques, basic measures like the term frequency of a term (word or phrase) are computed to assess the importance of the term in the document. But with statistical analysis alone, the original semantics of the term may not carry its exact meaning. To overcome this problem, a new framework has been introduced which relies on a concept-based model and a synonym-based approach. The proposed model can efficiently find significant matching and related concepts between documents according to the concept-based and synonym-based approaches. Large sets of experiments using the proposed model on different datasets in clustering and classification were conducted. Experimental results demonstrate the substantial enhancement of clustering quality using sentence-based, document-based, corpus-based and combined-approach concept analysis. A new similarity measure has been proposed to find the similarity between a document and the existing clusters, which can be used in classifying the document with respect to existing clusters.

  9. Characteristics-based modelling of flow problems

    International Nuclear Information System (INIS)

    The method of characteristics is an exact way to proceed to the solution of hyperbolic partial differential equations. The numerical solutions, however, are obtained in the fixed computational grid where interpolations of values between the mesh points cause numerical errors. The Piecewise Linear Interpolation Method, PLIM, the utilization of which is based on the method of characteristics, has been developed to overcome these deficiencies. The thesis concentrates on the computer simulation of the two-phase flow. The main topics studied are: (1) the PLIM method has been applied to study the validity of the numerical scheme through solving various flow problems to achieve knowledge for the further development of the method, (2) the mathematical and physical validity and applicability of the two-phase flow equations based on the SFAV (Separation of the two-phase Flow According to Velocities) approach has been studied, and (3) The SFAV approach has been further developed for particular cases such as stratified horizontal two-phase flow. (63 refs., 4 figs.)

  10. Model-Driven Architecture for Agent-Based Systems

    Science.gov (United States)

    Gradanin, Denis; Singh, H. Lally; Bohner, Shawn A.; Hinchey, Michael G.

    2004-01-01

    The Model Driven Architecture (MDA) approach uses a platform-independent model to define system functionality, or requirements, using some specification language. The requirements are then translated to a platform-specific model for implementation. An agent architecture based on the human cognitive model of planning, the Cognitive Agent Architecture (Cougaar), is selected as the implementation platform. The resulting Cougaar MDA prescribes certain kinds of models to be used, how those models may be prepared, and the relationships of the different kinds of models. Using the existing Cougaar architecture, the level of application composition is elevated from individual components to domain-level model specifications in order to generate software artifacts. The generation of software artifacts is based on a metamodel. Each component maps to a UML structured component, which is then converted into multiple artifacts: Cougaar/Java code, documentation, and test cases.

  11. Volatility clustering in agent based market models

    Science.gov (United States)

    Giardina, Irene; Bouchaud, Jean-Philippe

    2003-06-01

    We define and study a market model, where agents have different strategies among which they can choose, according to their relative profitability, with the possibility of not participating in the market. The price is updated according to the excess demand, and the wealth of the agents is properly accounted for. Only two parameters play a significant role: one describes the impact of trading on the price, and the other describes the propensity of agents to be trend following or contrarian. We observe three different regimes, depending on the value of these two parameters: an oscillating phase with bubbles and crashes, an intermittent phase and a stable ‘rational’ market phase. The statistics of price changes in the intermittent phase resembles that of real price changes, with small linear correlations, fat tails and long-range volatility clustering. We discuss how the time dependence of these two parameters spontaneously drives the system in the intermittent region.
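
    Schematically, such models couple the price to excess demand through a market-impact (liquidity) parameter; the update rule below is a generic illustration of that mechanism, not the authors' exact specification:

        import numpy as np

        def update_price(price, demand, supply, liquidity):
            # Excess demand pushes the log-price up; excess supply pushes it down.
            # `liquidity` controls the impact of trading on the price.
            return price * np.exp((demand - supply) / liquidity)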

  12. Repetition-based Interactive Facade Modeling

    KAUST Repository

    AlHalawani, Sawsan

    2012-07-01

    Modeling and reconstruction of urban environments has gained researchers' attention throughout the past few years. It spreads in a variety of directions across multiple disciplines such as image processing, computer graphics and computer vision, as well as architecture, geoscience and remote sensing. Having a virtual world of our real cities is very attractive for various purposes such as entertainment, engineering and government, among many others. In this thesis, we address the problem of processing a single façade image to acquire useful information that can be utilized to manipulate the façade and generate variations of façade images which can later be used for texturing buildings. Typical façade structures exhibit a rectilinear distribution wherein windows and other elements are organized in a grid of horizontal and vertical repetitions of similar patterns. In the first part of this thesis, we propose an efficient algorithm that exploits information obtained from a single image to identify the distribution grid of the dominant elements, i.e. windows. This detection method is initially assisted by the user marking the dominant window, followed by an automatic process for identifying its repeated instances, which are used to define the structure grid. Given the distribution grid, we allow the user to interactively manipulate the façade by adding, deleting, resizing or repositioning the windows in order to generate new façade structures. Having this utility for interactive façade editing is very valuable for creating façade variations and generating new textures for building models. Ultimately, there is a wide range of interesting possibilities of interactions to be explored.

  13. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    International Nuclear Information System (INIS)

    simulations are challenged by the distributed sources in each of the Corrective Action Units, by complex mass transfer processes, and by the size and complexity of the field-scale flow models. An efficient methodology utilizing particle tracking results and convolution integrals provides in situ concentrations appropriate for Monte Carlo analysis. Uncertainty in source releases and transport parameters including effective porosity, fracture apertures and spacing, matrix diffusion coefficients, sorption coefficients, and colloid load and mobility are considered. With the distributions of input uncertainties and output plume volumes, global analysis methods including stepwise regression, contingency table analysis, and classification tree analysis are used to develop sensitivity rankings of parameter uncertainties for each model considered, thus assisting a variety of decisions

  14. Neural-Based Models of Semiconductor Devices for SPICE Simulator

    OpenAIRE

    Hanene B. Hammouda; Mongia Mhiri; Zièd Gafsi; Kamel Besbes

    2008-01-01

    The paper addresses a simple and fast new approach to implement Artificial Neural Networks (ANN) models for the MOS transistor into SPICE. The proposed approach involves two steps, the modeling phase of the device by NN providing its input/output patterns, and the SPICE implementation process of the resulting model. Using the Taylor series expansion, a neural based small-signal model is derived. The reliability of our approach is validated through simulations of some circuits in DC and small-...

  15. A hierarchy of heuristic-based models of crowd dynamics

    OpenAIRE

    Degond, Pierre; Appert-Rolland, Cécile; Moussaid, Mehdi; Pettré, Julien; Theraulaz, Guy

    2013-01-01

    We derive a hierarchy of kinetic and macroscopic models from a noisy variant of the heuristic behavioral Individual-Based Model of Moussaid et al, PNAS 2011, where the pedestrians are supposed to have constant speeds. This IBM supposes that the pedestrians seek the best compromise between navigation towards their target and collisions avoidance. We first propose a kinetic model for the probability distribution function of the pedestrians. Then, we derive fluid models and propose three differe...

  16. Dynamic control of modern, network-based epidemic models

    OpenAIRE

    Sélley, Fanni; Besenyei, Ádám; Kiss, Istvan; Simon, Péter L

    2014-01-01

    In this paper we make the first steps to bridge the gap between classic control theory and modern, network-based epidemic models. In particular, we apply nonlinear model predictive control (NMPC) to a pairwise ODE model which we use to model a susceptible-infectious-susceptible (SIS) epidemic on non-trivial contact structures. While classic control of epidemics concentrates on aspects such as vaccination, quarantine and fast diagnosis, our novel setup allows us to deliver control by altering ...

  17. Proposing a Cloud-based Operational Model for IT Infrastructure

    OpenAIRE

    Lapin, Denis

    2014-01-01

    This Master’s Thesis investigates the possible use of the Infrastructure as a Service (IaaS) service model in the case company and proposes a new cloud-service-based operational model for it. IaaS is the service model in which IT infrastructure is delivered to customers over the network by cloud service providers. The proposed operational model aims to address the needs for improving business agility and cost efficiency, as commissioned by the IT Infrastructure department of the...

  18. Model-Based Gait Enrolment in Real-World Imagery

    OpenAIRE

    Wagg, David K; Nixon, Mark S.

    2003-01-01

    We present a model-based approach to gait extraction that is capable of reliable operation on real-world imagery. Hierarchies of shape and motion are employed to yield relatively modest computational demands, avoiding the high-dimensional search spaces associated with complex models. Anatomical data is used to generate shape models consistent with normal human body proportions. Mean gait data is used to create prototype gait motion models, which are adapted to fit individual subjects. Accurac...

  19. Control volume based modelling of compressible flow in reciprocating machines

    DEFF Research Database (Denmark)

    Andersen, Stig Kildegård; Thomsen, Per Grove; Carlsen, Henrik

    2004-01-01

    An approach to modelling unsteady compressible flow that is primarily one dimensional is presented. The approach was developed for creating distributed models of machines with reciprocating pistons but it is not limited to this application. The approach is based on the integral form of the unstea...... presented. The capabilities of the modelling approach are illustrated with a solution to a Stirling engine model that provides results in good agreement with experimental data. Keywords: Simulation, compressible flow, one dimensional, control volume....

  20. Precise numerical modeling of next generation multimode fiber based links

    Science.gov (United States)

    Maksymiuk, L.; Stepniak, G.

    2015-12-01

    In order to numerically model modern multimode fiber based links, we are required to take into account modal and chromatic dispersion, profile dispersion and spectrally dependent coupling. In this paper we propose a complete numerical model which is not only precise but also versatile. In addition to a detailed mathematical description of the model, we also provide a set of numerical calculations performed with the model.

  1. Deep Structured Energy Based Models for Anomaly Detection

    OpenAIRE

    Zhai, Shuangfei; Cheng, Yu; Lu, Weining; Zhang, Zhongfei

    2016-01-01

    In this paper, we attack the anomaly detection problem by directly modeling the data distribution with deep architectures. We propose deep structured energy based models (DSEBMs), where the energy function is the output of a deterministic deep neural network with structure. We develop novel model architectures to integrate EBMs with different types of data such as static data, sequential data, and spatial data, and apply appropriate model architectures to adapt to the data structure. Our trai...

  2. Numerical Model of Radical Photopolymerization Based on Interdiffusion

    OpenAIRE

    Shuhei Yoshida; Yosuke Takahata; Shuma Horiuchi; Hiroyuki Kurata; Manabu Yamamoto

    2014-01-01

    An accurate reaction model is required to analyze the characteristics of photopolymers. For this purpose, we propose a numerical model for radical photopolymerization. In the proposed model, elementary reactions such as initiation, propagation, and termination are considered, and we assume interdiffusion for each component in the material. We analyzed the diffraction characteristics of a radical photopolymer based on the proposed interdiffusion model with the beam propagation method. Moreover...

  3. CREDIT SCORING MODELS WITH AUC MAXIMIZATION BASED ON WEIGHTED SVM

    OpenAIRE

    LIGANG ZHOU; KIN KEUNG LAI; JEROME YEN

    2009-01-01

    Credit scoring models are very important tools for financial institutions to make credit granting decisions. In the last few decades, many quantitative methods have been used for the development of credit scoring models with focus on maximizing classification accuracy. This paper proposes the credit scoring models with the area under receiver operating characteristics curve (AUC) maximization based on the new emerged support vector machines (SVM) techniques. Three main SVM models with differe...

  4. Pervasive Computing Location-aware Model Based on Ontology

    Institute of Scientific and Technical Information of China (English)

    PU Fang; CAI Hai-bin; CAO Qi-ying; SUN Dao-qing; LI Tong

    2008-01-01

    In order to integrate heterogeneous location-aware systems into a pervasive computing environment, a novel pervasive computing location-aware model based on ontology is presented. A location-aware model ontology (LMO) is constructed. The location-aware model has the capabilities of sharing knowledge, reasoning and adjusting the usage policies of services dynamically in a unified semantic location manner. Finally, the working process of the proposed location-aware model is explained by an application scenario.

  5. Variable selection in model-based discriminant analysis

    OpenAIRE

    Maugis, Cathy; Celeux, Gilles; Martin-Magniette, Marie-Laure

    2010-01-01

    A general methodology for selecting predictors for Gaussian generative classification models is presented. The problem is regarded as a model selection problem. Three different roles for each possible predictor are considered: a variable can be a relevant classification predictor or not, and the irrelevant classification variables can be linearly dependent on a part of the relevant predictors or independent variables. This variable selection model was inspired by the model-based clustering mo...

  6. Intelligent Transportation and Evacuation Planning A Modeling-Based Approach

    CERN Document Server

    Naser, Arab

    2012-01-01

    Intelligent Transportation and Evacuation Planning: A Modeling-Based Approach provides a new paradigm for evacuation planning strategies and techniques. Recently, evacuation planning and modeling have increasingly attracted interest among researchers as well as government officials. This interest stems from the recent catastrophic hurricanes and weather-related events that occurred in the southeastern United States (Hurricanes Katrina and Rita). The evacuation methods that were in place before and during the hurricanes did not work well and resulted in thousands of deaths. This book offers insights into the methods and techniques that allow for implementing mathematics-based, simulation-based, and integrated optimization and simulation-based engineering approaches for evacuation planning. This book also: comprehensively discusses the application of mathematical models for evacuation and intelligent transportation modeling; covers advanced methodologies in evacuation modeling and planning; discusses principles a...

  7. Consequences of spatial autocorrelation for niche-based models

    DEFF Research Database (Denmark)

    Segurado, P.; Araújo, Miguel B.; Kunin, W. E.

    2006-01-01

    1. Spatial autocorrelation is an important source of bias in most spatial analyses. We explored the bias introduced by spatial autocorrelation on the explanatory and predictive power of species' distribution models, and make recommendations for dealing with the problem. 2. Analyses were based... of significance based on randomizations were obtained. 3. Spatial autocorrelation was shown to represent a serious problem for niche-based species' distribution models. Significance values were found to be inflated up to 90-fold. 4. In general, GAM and CTA performed better than GLM, although all three methods... autocorrelated variables, these need to be adjusted. The reliability and value of niche-based distribution models for management and other applied ecology purposes can be improved if certain techniques and procedures, such as the null model approach recommended in this study, are implemented during the model...

  8. Adoption of the Competency-Based Training Model in Entrepreneurship

    Directory of Open Access Journals (Sweden)

    I Ketut Santra

    2009-01-01

    Full Text Available The aim of the research is to improve the teaching method in the entrepreneurship subject. This research adopted competency-based training (CBT) for entrepreneurship. The major task in this research was to formulate and design entrepreneurship competencies. Entrepreneurship competency is indicated by Personal, Strategic and Situational, and Business competence, and each of these competences is broken down into sub-topics. After designing and formulating the game and simulation, the research continued with implementing the competency-based training in a real class. The implementation took one semester, starting in September 2006 and ending in early February 2007. The lesson learnt from the implementation period is that CBT can improve student competence in the Personal, Situational-Strategic and Business areas, all three of which are important for a successful entrepreneur; it is a sign of the application of "Kurikulum Berbasis Kompetensi" (the competency-based curriculum). There is ample evidence of the achievement of CBT in the entrepreneurship subject. First, physical achievement: all of the students' business plans became real businesses, as documented by pictures of the students' actual businesses. Second, theoretical achievement: the Personal, Situational-Strategic and Business competences have statistically significant relations with Business Plan and even Real Business quality. The effect of the Personal, Situational-Strategic and Business competences on Business Plan quality is 84.4%, and on Real Business quality 77.2%. The statistical evidence suggests that the redesign of the entrepreneurship subject was the right way to go. The content of the entrepreneurial competences (Personal, Situational-Strategic and Business) has an impact on students conducting and running their own businesses.

  9. Convex-based void filling method for CAD-based Monte Carlo geometry modeling

    International Nuclear Information System (INIS)

    Highlights: • We present a new void filling method named CVF for CAD based MC geometry modeling. • We describe convex based void description based and quality-based space subdivision. • The results showed improvements provided by CVF for both modeling and MC calculation efficiency. - Abstract: CAD based automatic geometry modeling tools have been widely applied to generate Monte Carlo (MC) calculation geometry for complex systems according to CAD models. Automatic void filling is one of the main functions in the CAD based MC geometry modeling tools, because the void space between parts in CAD models is traditionally not modeled while MC codes such as MCNP need all the problem space to be described. A dedicated void filling method, named Convex-based Void Filling (CVF), is proposed in this study for efficient void filling and concise void descriptions. The method subdivides all the problem space into disjointed regions using Quality based Subdivision (QS) and describes the void space in each region with complementary descriptions of the convex volumes intersecting with that region. It has been implemented in SuperMC/MCAM, the Multiple-Physics Coupling Analysis Modeling Program, and tested on International Thermonuclear Experimental Reactor (ITER) Alite model. The results showed that the new method reduced both automatic modeling time and MC calculation time

  10. Uncertainty Representation and Interpretation in Model-based Prognostics Algorithms based on Kalman Filter Estimation

    Data.gov (United States)

    National Aeronautics and Space Administration — This article discusses several aspects of uncertainty representation and management for model-based prognostics methodologies based on our experience with...

  11. DEVELOPMENT MODEL OF PATISSERIE PROJECT-BASED LEARNING

    Directory of Open Access Journals (Sweden)

    Ana Ana

    2013-02-01

    Full Text Available The study aims to find a model of patisserie project-based learning with a production approach that can improve the effectiveness of patisserie learning. The Delphi technique, Cohen's kappa and percentages of agreement were used to assess the model of patisserie project-based learning. Data collection techniques employed in the study were questionnaires, checklist worksheets, observation, and interview sheets. Subjects were 13 lecturers with expertise in food and nutrition and 91 students of the Food and Nutrition Program from UPI, UNY, UNESA and UM. The results show that the model of patisserie project-based learning with a production approach can be used in patisserie learning, with a fairly high level of agreement (an average Cohen's kappa coefficient of 0.78). The model was implemented very well, as shown by an average implementation of 94.77%. Thus, the model of patisserie project-based learning with a production approach can be used to improve the quality of patisserie learning.

  12. Automated extraction of knowledge for model-based diagnostics

    Science.gov (United States)

    Gonzalez, Avelino J.; Myler, Harley R.; Towhidnejad, Massood; Mckenzie, Frederic D.; Kladke, Robin R.

    1990-01-01

    The concept of accessing computer aided design (CAD) design databases and extracting a process model automatically is investigated as a possible source for the generation of knowledge bases for model-based reasoning systems. The resulting system, referred to as automated knowledge generation (AKG), uses an object-oriented programming structure and constraint techniques as well as internal database of component descriptions to generate a frame-based structure that describes the model. The procedure has been designed to be general enough to be easily coupled to CAD systems that feature a database capable of providing label and connectivity data from the drawn system. The AKG system is capable of defining knowledge bases in formats required by various model-based reasoning tools.

  13. Scale-based spatial data model for GIS

    Institute of Scientific and Technical Information of China (English)

    WEI Zu-kuan

    2004-01-01

    Being the primary medium of geographical information and the elementary objects manipulated, almost all maps in existing GIS adopt the layer-based model to represent geographic information. However, it is difficult to extend a map represented in the layer-based model. Furthermore, in Web-based GIS, it is slow to transmit the spatial data for map viewing. In this paper, to solve the problems above, we propose a new method for representing spatial data: the scale-based model. In this model we represent maps at three levels: scale-view, block, and spatial object, and organize the maps in a set of map layers, named Scale-Views, each associated with given scales. Lastly, a prototype Web-based GIS using the proposed spatial data representation is described briefly.

  14. Physics-Based Reactive Burn Model: Grain Size Effects

    Science.gov (United States)

    Lu, X.; Hamate, Y.; Horie, Y.

    2007-12-01

    We have been developing a physics-based reactive burn (PBRB) model, which was formulated based on the concept of a statistical hot spot cell. In the model, essential thermomechanical and physiochemical features are explicitly modeled. In this paper, we have extended the statistical hot spot model to explicitly describe the ignition and growth of hot spots. In particular, grain size effects are explicitly delineated through the introduction of a grain-size-dependent thickness of the hot region, an energy deposition criterion, and specific surface area. Besides the linear relationships of the run distance to detonation and the critical diameter with respect to the reciprocal specific surface area of heterogeneous explosives (HE), which are based on the original model and discussed in a parallel paper at this meeting, parametric studies have shown that the extended PBRB model can predict a non-monotonic variation of shock sensitivity with grain size, as observed by Moulard et al.

  15. CONCEPTUAL MODELING BASED ON LOGICAL EXPRESSION AND EVOLVEMENT

    Institute of Scientific and Technical Information of China (English)

    YI Guodong; ZHANG Shuyou; TAN Jianrong; JI Yangjian

    2007-01-01

    Aiming at the problem of modeling abstract and polytype information in product conceptual design, a method of conceptual modeling based on logical expression and evolvement is presented. Based on the logical expressions of product conceptual design information, a function/logic/structure mapping model is set up. First, the function semantics is transformed into logical expressions through function/logic mapping. Second, methods of logical evolvement are utilized to describe the function analysis, function/structure mapping and structure combination. Last, the logical structure scheme is transformed into a geometrical sketch through logic/structure mapping. The conceptual design information and modeling process are described uniformly with logical methods in the model, and an effective method for computer-aided conceptual design based on the model is implemented.

  16. Model Based Mission Assurance: Emerging Opportunities for Robotic Systems

    Science.gov (United States)

    Evans, John W.; DiVenti, Tony

    2016-01-01

    The emergence of Model Based Systems Engineering (MBSE) in a Model Based Engineering framework has created new opportunities to improve effectiveness and efficiency across the assurance functions. The MBSE environment supports not only system architecture development, but also Systems Safety, Reliability and Risk Analysis concurrently in the same framework. Linking to detailed design will further improve assurance capabilities to support failure avoidance and mitigation in flight systems. This is also leading to new assurance functions, including model assurance and management of uncertainty in the modeling environment. Further, assurance cases, structured hierarchical arguments or models, are emerging as a basis for supporting a comprehensive viewpoint in which to support Model Based Mission Assurance (MBMA).

  17. CAD Model Retrieval Based on Graduated Assignment Algorithm

    Science.gov (United States)

    Tao, Songqiao

    2015-06-01

    A retrieval approach for CAD models based on the graduated assignment algorithm is proposed in this paper. First, CAD models are transformed into face adjacency graphs (FAGs). Second, the vertex compatibility matrix and edge compatibility matrix between the FAGs of the query and data models are calculated, and the similarity metric for the two compared models is established from their compatibility matrices, which serves as the optimization objective function for selecting the vertex mapping matrix M between the two models. Finally, Sinkhorn's alternating normalization of M's rows and columns is adopted to find the optimal vertex mapping matrix M. Experimental results have shown that the proposed approach supports CAD model retrieval.
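
    The Sinkhorn step named above alternately rescales rows and columns until the match matrix is approximately doubly stochastic; a minimal sketch, assuming a strictly positive matrix (illustrative only, not the paper's full graduated assignment loop):

        import numpy as np

        def sinkhorn_normalize(M, n_iters=100, tol=1e-9):
            # Alternately normalize rows and columns of a positive matrix M
            # until it is (approximately) doubly stochastic.
            M = np.asarray(M, dtype=float)
            for _ in range(n_iters):
                prev = M.copy()
                M = M / M.sum(axis=1, keepdims=True)   # row normalization
                M = M / M.sum(axis=0, keepdims=True)   # column normalization
                if np.abs(M - prev).max() < tol:
                    break
            return M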

  18. Discrete Element Simulation of Asphalt Mastics Based on Burgers Model

    Institute of Scientific and Technical Information of China (English)

    LIU Yu; FENG Shi-rong; HU Xia-guang

    2007-01-01

    In order to investigate the viscoelastic performance of asphalt mastics, a micro-mechanical model for asphalt mastics was built by applying the Burgers model to discrete element simulation and constructing a Burgers contact model. Then a numerical simulation of creep tests was conducted, and results from the simulation were compared with the analytical solution of the Burgers model. The comparison showed that the two results agreed well with each other, suggesting that a discrete element model based on the Burgers model can be employed in numerical simulation for asphalt mastics.
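
    For reference, the Burgers (four-element) model underlying the contact law has a closed-form creep response under constant stress \sigma_0, against which the discrete element results can be checked:

        \varepsilon(t) = \sigma_0 [ 1/E_1 + t/\eta_1 + (1/E_2)(1 - e^{-E_2 t / \eta_2}) ]

    where E_1 and \eta_1 are the Maxwell spring and dashpot constants and E_2 and \eta_2 the Kelvin-Voigt pair.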

  19. GIS-BASED 1-D DIFFUSIVE WAVE OVERLAND FLOW MODEL

    Energy Technology Data Exchange (ETDEWEB)

    KALYANAPU, ALFRED [Los Alamos National Laboratory; MCPHERSON, TIMOTHY N. [Los Alamos National Laboratory; BURIAN, STEVEN J. [NON LANL

    2007-01-17

    This paper presents a GIS-based 1-d distributed overland flow model and summarizes an application to simulate a flood event. The model estimates infiltration using the Green-Ampt approach and routes excess rainfall using the 1-d diffusive wave approximation. The model was designed to use readily available topographic, soils, and land use/land cover data and rainfall predictions from a meteorological model. An assessment of model performance was performed for a small catchment and a large watershed, both in urban environments. Simulated runoff hydrographs were compared to observations for a selected set of validation events. Results confirmed the model provides reasonable predictions in a short period of time.
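
    As a sketch of the Green-Ampt step mentioned above (one explicit time step; all symbols generic, not the paper's notation): infiltration capacity falls as cumulative infiltrated depth grows, and rainfall in excess of capacity becomes runoff.

        def green_ampt_step(F, rain, dt, K, psi, dtheta):
            # Green-Ampt infiltration capacity: f = K * (1 + psi * dtheta / F).
            # K: saturated hydraulic conductivity, psi: wetting-front suction head,
            # dtheta: soil moisture deficit, F: cumulative infiltrated depth.
            f_cap = K * (1.0 + psi * dtheta / max(F, 1e-6))
            f = min(f_cap, rain)             # actual rate limited by rainfall intensity
            return F + f * dt, (rain - f)    # new cumulative depth, excess (runoff) rate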

  20. When Does Model-Based Control Pay Off?

    Science.gov (United States)

    Kool, Wouter; Cushman, Fiery A; Gershman, Samuel J

    2016-08-01

    Many accounts of decision making and reinforcement learning posit the existence of two distinct systems that control choice: a fast, automatic system and a slow, deliberative system. Recent research formalizes this distinction by mapping these systems to "model-free" and "model-based" strategies in reinforcement learning. Model-free strategies are computationally cheap, but sometimes inaccurate, because action values can be accessed by inspecting a look-up table constructed through trial-and-error. In contrast, model-based strategies compute action values through planning in a causal model of the environment, which is more accurate but also more cognitively demanding. It is assumed that this trade-off between accuracy and computational demand plays an important role in the arbitration between the two strategies, but we show that the hallmark task for dissociating model-free and model-based strategies, as well as several related variants, do not embody such a trade-off. We describe five factors that reduce the effectiveness of the model-based strategy on these tasks by reducing its accuracy in estimating reward outcomes and decreasing the importance of its choices. Based on these observations, we describe a version of the task that formally and empirically obtains an accuracy-demand trade-off between model-free and model-based strategies. Moreover, we show that human participants spontaneously increase their reliance on model-based control on this task, compared to the original paradigm. Our novel task and our computational analyses may prove important in subsequent empirical investigations of how humans balance accuracy and demand. PMID:27564094

  1. Numerical simulation of base flow with hot base bleed for two jet models

    Institute of Scientific and Technical Information of China (English)

    Wen-jie YU; Yong-gang YU; Bin NI

    2014-01-01

    In order to improve the benefits of base bleed in the base flow field, base flow with hot base bleed for two jet models is studied. Two-dimensional axisymmetric Navier-Stokes equations are computed using a finite volume scheme. The base flow of a cylinder afterbody with base bleed is simulated. The simulation results are validated against the experimental data, and the experimental results are well reproduced. On this basis, the base flow fields with base bleed for a circular jet model and an annular jet model are investigated for injection temperatures from 830 K to 2200 K. The results show that the base pressure of the annular jet model is higher than that of the circular jet model across the range of injection parameter and injection temperature. For the circular jet model, the hot gases are concentrated in the vicinity of the base. For the annular jet model, the bleed gases flow directly into the shear layer, so the hot gases are concentrated in the shear layer. The latter temperature distribution is better for increasing the base pressure.

  2. Optical computing based on neuronal models

    Science.gov (United States)

    Farhat, Nabil H.

    1987-10-01

    Ever since the fit between what neural net models can offer (collective, iterative, nonlinear, robust, and fault-tolerant approach to information processing) and the inherent capabilities of optics (parallelism and massive interconnectivity) was first pointed out and the first optical associative memory demonstrated in 1985, work and interest in neuromorphic optical signal processing has been growing steadily. For example, work in optical associative memories is currently being conducted at several academic institutions (e.g., California Institute of Technology, University of Colorado, University of California-San Diego, Stanford University, University of Rochester, and the author's own institution the University of Pennsylvania) and at several industrial and governmental laboratories (e.g., Hughes Research Laboratories - Malibu, the Naval Research Laboratory, and the Jet Propulsion Laboratory). In these efforts, in addition to the vector matrix multiplication with thresholding and feedback scheme utilized in early implementations, an arsenal of sophisticated optical tools such as holographic storage, phase conjugate optics, and wavefront modulation and mixing are being drawn on to realize associative memory functions.

  3. Evaluating performances of simplified physically based landslide susceptibility models.

    Science.gov (United States)

    Capparelli, Giovanna; Formetta, Giuseppe; Versace, Pasquale

    2015-04-01

    Rainfall-induced shallow landslides cause significant damage, including loss of life and property. Predicting locations susceptible to shallow landslides is a complex task that involves many disciplines: hydrology, geotechnical science, geomorphology, and statistics. Usually two main approaches are used to accomplish this task: statistical or physically based models. This paper presents a package of GIS-based models for landslide susceptibility analysis. It was integrated into the NewAge-JGrass hydrological model using the Object Modeling System (OMS) modeling framework. The package includes three simplified physically based models for landslide susceptibility analysis (M1, M2, and M3) and a component for model verification. It computes eight goodness-of-fit (GOF) indices by comparing model results and measurement data pixel by pixel. Moreover, the package's integration into NewAge-JGrass allows the use of other components, such as geographic information system tools to manage input-output processes, and automatic calibration algorithms to estimate model parameters. The system offers the possibility to investigate and fairly compare the quality and robustness of models and model parameters, according to a procedure that includes: i) model parameter estimation by optimizing each GOF index separately, ii) model evaluation in the ROC plane using each optimal parameter set, and iii) GOF robustness evaluation by assessing sensitivity to input parameter variation. This procedure was repeated for all three models. The system was applied to a case study in Calabria (Italy) along the Salerno-Reggio Calabria highway, between Cosenza and the Altilia municipality. The analysis showed that, among all the optimized indices and all three models, Average Index (AI) optimization coupled with model M3 is the best modeling solution for our test case. This research was funded by PON Project No. 01_01503 "Integrated Systems for Hydrogeological Risk

  4. A Role- Based PMI Security Model for E- Government

    Institute of Scientific and Technical Information of China (English)

    WU Li-jun; SU Kai-le; YANG Zhi-hua

    2005-01-01

    We introduce the general AC (attribute certificate), the role specification AC and the role assignment AC. We discuss the role-based PMI (Privilege Management Infrastructure) architecture. A role-based PMI security model for E-government is developed by combining role-based PMI with PKI (Public Key Infrastructure) architecture. The model has the advantages of flexibility, convenience, less storage space and less network consumption, etc. We are going to use this security model in the E-government system.

  5. Interactive Coherence-Based Façade Modeling

    KAUST Repository

    Musialski, Przemyslaw

    2012-05-01

    We propose a novel interactive framework for modeling building facades from images. Our method is based on the notion of coherence-based editing which allows exploiting partial symmetries across the facade at any level of detail. The proposed workflow mixes manual interaction with automatic splitting and grouping operations based on unsupervised cluster analysis. In contrast to previous work, our approach leads to detailed 3d geometric models with up to several thousand regions per facade. We compare our modeling scheme to others and evaluate our approach in a user study with an experienced user and several novice users.

  6. Model-Based Integration and Interpretation of Data

    DEFF Research Database (Denmark)

    Petersen, Johannes

    2004-01-01

    Data integration and interpretation plays a crucial role in supervisory control. The paper defines a set of generic inference steps for the data integration and interpretation process based on a three-layer model of system representations. The three-layer model is used to clarify the combination...... of constraint and object-centered representations of the work domain throwing new light on the basic principles underlying the data integration and interpretation process of Rasmussen's abstraction hierarchy as well as other model-based approaches combining constraint and object-centered representations. Based...

  7. Patch-Based Generative Shape Model and MDL Model Selection for Statistical Analysis of Archipelagos

    DEFF Research Database (Denmark)

    Ganz, Melanie; Nielsen, Mads; Brandt, Sami

    2010-01-01

    patch-based dictionary for possible shapes, (2) building up a time-homogeneous Markov model to model the neighbourhood correlations between the patches, and (3) automatic selection of the model complexity by the minimum description length principle. The generative shape model is proposed as a...

  8. Developing a Physiologically-Based Pharmacokinetic Model Knowledgebase in Support of Provisional Model Construction

    Science.gov (United States)

    Developing physiologically-based pharmacokinetic (PBPK) models for chemicals can be resource-intensive, as neither chemical-specific parameters nor in vivo pharmacokinetic data are easily available for model construction. Previously developed, well-parameterized, and thoroughly-v...

  9. Elastoplastic cup model for cement-based materials

    Directory of Open Access Journals (Sweden)

    Yan ZHANG

    2010-03-01

    Full Text Available Based on experimental data obtained from triaxial tests and a hydrostatic test, a cup model was formulated. Two plastic mechanisms, respectively a deviatoric shearing and a pore collapse, are taken into account. This model also considers the influence of confining pressure. In this paper, the calibration of the model is detailed and numerical simulations of the main mechanical behavior of cement paste over a large range of stress are described, showing good agreement with experimental results. The case study shows that this cup model has extensive applicability for cement-based materials and other quasi-brittle and high-porosity materials in a complex stress state.

  10. BP Network Based Users' Interest Model in Mining WWW Cache

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    By analyzing the WWW cache model, we put forward a user-interest description method based on fuzzy theory and user-interest inferential relations based on a BP (back propagation) neural network. With this method, users' interest in the WWW cache can be described, and the neural network of users' interest can be constructed by forward propagation of interest and backpropagation of errors. This neural network can then infer users' interest. This model is not a simple extension of the simple interest model, but an all-round improvement of the model and its related algorithms.

  11. Modelling Amperometric Biosensors Based on Chemically Modified Electrodes

    Science.gov (United States)

    Baronas, Romas; Kulys, Juozas

    2008-01-01

    The response of an amperometric biosensor based on a chemically modified electrode was modelled numerically. A mathematical model of the biosensor is based on a system of non-linear reaction-diffusion equations. The modelled biosensor comprises two compartments: an enzyme layer and an outer diffusion layer. In order to define the main governing parameters, the corresponding dimensionless mathematical model was derived. The digital simulation was carried out using the finite difference technique. The adequacy of the model was evaluated using analytical solutions known for very specific cases of the model parameters. By changing model parameters, the output results were numerically analyzed at transient and steady state conditions. The influence of the substrate and mediator concentrations, as well as of the thicknesses of the enzyme and diffusion layers, on the biosensor response was investigated. Calculations showed complex kinetics of the biosensor response, especially when the biosensor acts under a mixed limitation of diffusion and the enzyme interaction with the substrate.
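
    Reaction-diffusion models of this kind typically couple Fickian diffusion with Michaelis-Menten enzyme kinetics in the enzyme layer; a representative substrate equation (a generic form, not necessarily the paper's exact system) is

        \partial S / \partial t = D_S \, \partial^2 S / \partial x^2 - V_{max} S / (K_M + S)

    with an analogous equation of opposite reaction sign for the product, whose flux at the electrode surface determines the amperometric current.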

  12. An Agent-Based Modeling for Pandemic Influenza in Egypt

    CERN Document Server

    Khalil, Khaled M; Nazmy, Taymour T; Salem, Abdel-Badeeh M

    2010-01-01

    Pandemic influenza has great potential to cause large and rapid increases in deaths and serious illness. The objective of this paper is to develop an agent-based model to simulate the spread of pandemic influenza (novel H1N1) in Egypt. The proposed multi-agent model is based on modeling individuals' interactions in a space-time context. The model involves different types of parameters, such as social agent attributes, the distribution of Egypt's population, and patterns of agents' interactions. Analysis of the modeling results leads to an understanding of the characteristics of the modeled pandemic, transmission patterns, and the conditions under which an outbreak might occur. In addition, the proposed model is used to measure the effectiveness of different control strategies to intervene in the pandemic spread.
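
    A minimal agent-based sketch in the spirit of the record: agents contact a random set of others each day and transmit with fixed probability. Population size, contact rate, transmission probability and infectious period are assumptions; the paper's spatial structure and attribute heterogeneity are omitted.

        import random

        random.seed(1)
        N, P_TRANSMIT, CONTACTS, T_INFECTIOUS = 10000, 0.05, 15, 7
        state = ["S"] * N                       # susceptible / infected / recovered
        days_left = [0] * N
        for i in random.sample(range(N), 10):   # seed ten initial cases
            state[i], days_left[i] = "I", T_INFECTIOUS

        for day in range(120):
            for i in [j for j in range(N) if state[j] == "I"]:
                for j in random.sample(range(N), CONTACTS):  # random mixing stands in for space
                    if state[j] == "S" and random.random() < P_TRANSMIT:
                        state[j], days_left[j] = "I", T_INFECTIOUS
                days_left[i] -= 1
                if days_left[i] == 0:
                    state[i] = "R"
            if day % 30 == 0:
                print(day, state.count("S"), state.count("I"), state.count("R"))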

  13. Research on Bayesian Network Based User's Interest Model

    Institute of Scientific and Technical Information of China (English)

    ZHANG Weifeng; XU Baowen; CUI Zifeng; XU Lei

    2007-01-01

    Filtering and selectively retrieving the vast amount of information on the Internet is of great practical significance for improving the quality of the information users access. On the basis of an analysis of existing user-interest models and of some basic questions of user interest (representation, derivation and identification), a Bayesian network based user-interest model is given. In this model, a user-interest reduction algorithm based on the Markov Blanket model is used to reduce interest noise, and then documents the user is and is not interested in are used to train the Bayesian network. Compared to simpler models, this model has advantages such as small space requirements, a simple reasoning method and a high recognition rate. Experimental results show that this model reflects users' interests more appropriately, and has higher performance and good usability.

  14. Map-based model of the cardiac action potential

    International Nuclear Information System (INIS)

    A simple, computationally efficient model capable of replicating the basic features of the cardiac cell action potential is proposed. The model is a four-dimensional map and demonstrates good correspondence with real cardiac cells. Various regimes of cardiac activity that can be reproduced by the proposed model are shown. Bifurcation mechanisms of the transitions between these regimes are explained using phase space analysis. The dynamics of 1D and 2D lattices of coupled maps, which model the behavior of electrically connected cells, is discussed in the context of synchronization theory. -- Highlights: → Recent models based on experimental data are complicated to analyse and simulate. → A simplified map-based model of the cardiac cell is constructed. → The model is capable of replicating different types of cardiac activity. → The spatio-temporal dynamics of ensembles of coupled maps are investigated. → The results are analysed in the context of biophysical processes in the myocardium.
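
    The record's four-dimensional map is not given in the abstract; as a stand-in, the sketch below iterates the two-dimensional Rulkov map, a classic map-based model of excitable-cell dynamics, to illustrate how such map-based cell models are simulated. Parameters are illustrative.

        alpha, mu, sigma = 4.3, 0.001, -1.1   # illustrative parameter choice
        x, y = -1.0, -3.0
        spikes = 0
        for _ in range(5000):
            x, y = alpha / (1.0 + x * x) + y, y - mu * (x - sigma)  # simultaneous map update
            if x > 0.0:                       # count threshold crossings as spikes
                spikes += 1
        print("fraction of iterates above threshold:", spikes / 5000)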

  15. An Ontology-Based Framework for Modeling User Behavior

    DEFF Research Database (Denmark)

    Razmerita, Liana

    2011-01-01

    This paper focuses on the role of user modeling and semantically enhanced representations for personalization. This paper presents a generic Ontology-based User Modeling framework (OntobUMf), its components, and its associated user modeling processes. This framework models the behavior of the users...... characteristics of the users interacting with the system. Concrete examples of how OntobUMf is used in the context of a Knowledge Management (KM) System are provided. This paper discusses some of the implications of ontology-based user modeling for semantically enhanced KM and, in particular, for personal KM. The...... results of this research may contribute to the development of other frameworks for modeling user behavior, other semantically enhanced user modeling frameworks, or other semantically enhanced information systems....

  16. Modelling catchment hydrology within a GIS based SVAT-model framework

    OpenAIRE

    Ludwig, R.; Mauser, W.

    2000-01-01

    The physically-based soil-vegetation-atmosphere-transfer model PROMET (PRocess-Oriented Model for Evapo Transpiration) developed at the Institute of Geography, University of Munich, is applied to the Ammer basin (approx. 600 km2) in the alpine foreland of Germany. The hourly actual evapotranspiration rate is calculated for a 14-year time series. A rainfall-runoff model, based on an enhanced distributed TOPMODEL structure, is linked to the SVAT-model in order to provide a hydrologica...
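
    TOPMODEL-type structures such as the one linked here are driven by the topographic wetness index ln(a / tan(beta)), computed per grid cell from the specific catchment area a and the local slope beta. A minimal sketch with made-up cell values:

        import math

        cells = [(120.0, 2.0), (800.0, 5.0), (4000.0, 1.0)]  # (catchment area, slope in deg), assumed
        for a, slope_deg in cells:
            twi = math.log(a / math.tan(math.radians(slope_deg)))
            print(f"a = {a:6.0f}  slope = {slope_deg:3.0f} deg  TWI = {twi:5.2f}")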

  17. Business Model Innovation and Competitive Imitation: The Case of Sponsor-Based Business Models

    OpenAIRE

    Casadesus-Masanell, Ramon; Zhu, Feng

    2013-01-01

    This paper provides the first formal model of business model innovation. Our analysis focuses on sponsor-based business model innovations where a firm monetizes its product through sponsors rather than setting prices to its customer base. We analyze strategic interactions between an innovative entrant and an incumbent where the incumbent may imitate the entrant's business model innovation once it is revealed. The results suggest that an entrant needs to strategically choose whether to reveal ...

  18. Image based 3D city modeling : Comparative study

    Science.gov (United States)

    Singh, S. P.; Jain, K.; Mandla, V. R.

    2014-06-01

    A 3D city model is a digital representation of the Earth's surface and its related objects such as buildings, trees, vegetation, and man-made features belonging to urban areas. The demand for 3D city modeling is increasing rapidly for various engineering and non-engineering applications. Generally, four main image-based approaches are used to generate virtual 3D city models: sketch-based modeling, procedural-grammar-based modeling, close-range-photogrammetry-based modeling, and modeling based mainly on computer vision techniques. SketchUp, CityEngine, Photomodeler and Agisoft Photoscan are the main software packages representing these approaches, respectively, and they take different approaches and methods to image-based 3D city modeling. A literature review shows that, to date, no comprehensive comparative study on creating complete 3D city models from images is available. This paper gives a comparative assessment of these four image-based 3D modeling approaches, based mainly on data acquisition methods, data processing techniques and output 3D model products. The study area for this work is the campus of the civil engineering department, Indian Institute of Technology, Roorkee (India); this 3D campus acts as a prototype for a city. The study also explains various governing parameters, factors and practical experiences, gives a brief introduction to the strengths and weaknesses of the four image-based techniques, and offers comments on what can and cannot be done with each package. The study concludes that each package has its own advantages and limitations, and that the choice of software depends on the user's requirements for the 3D project. For a normal visualization project, SketchUp is a good option. For 3D documentation records, Photomodeler gives good results. For Large city

  19. Variance-based sensitivity indices for models with dependent inputs

    International Nuclear Information System (INIS)

    Computational models are intensively used in engineering for risk analysis and the prediction of future outcomes, and uncertainty and sensitivity analyses are of great help for these purposes. Although several methods exist to perform variance-based sensitivity analysis of model output with independent inputs, only a few have been proposed in the literature for the case of dependent inputs. This is explained by the fact that the theoretical framework for the independent case is settled and a univocal set of variance-based sensitivity indices is defined. In the present work, we propose a set of variance-based sensitivity indices to perform sensitivity analysis of models with dependent inputs. These measures allow us to distinguish between the mutual dependent contribution and the independent contribution of an input to the model response variance. Their definition relies on a specific orthogonalisation of the inputs and ANOVA representations of the model output. In the applications, we show the interest of the new sensitivity indices in a model-simplification setting. - Highlights: ► Uncertainty and sensitivity analyses are of great help in engineering. ► Several methods exist to perform variance-based sensitivity analysis of model output with independent inputs. ► We define a set of variance-based sensitivity indices for models with dependent inputs. ► Inputs' mutual contributions are distinguished from their independent contributions. ► Analytical and computational tests are performed and discussed.
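
    For the baseline independent-input case that the record generalizes, first-order variance-based (Sobol') indices can be estimated by Monte Carlo. A minimal sketch with an arbitrary test function, using the standard pick-freeze construction:

        import numpy as np

        rng = np.random.default_rng(0)
        f = lambda x: x[:, 0] + 2.0 * x[:, 1] + 0.5 * x[:, 0] * x[:, 2]  # arbitrary test model

        n, d = 200000, 3
        A, B = rng.random((n, d)), rng.random((n, d))
        yA, yB = f(A), f(B)
        V = np.var(yA)
        for i in range(d):
            ABi = A.copy()
            ABi[:, i] = B[:, i]                    # vary only input i between the two samples
            Si = np.mean(yB * (f(ABi) - yA)) / V   # standard first-order estimator
            print(f"S_{i + 1} = {Si:.3f}")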

  20. Parameter optimization in differential geometry based solvation models.

    Science.gov (United States)

    Wang, Bao; Wei, G W

    2015-10-01

    Differential geometry (DG) based solvation models are a new class of variational implicit solvent approaches that are able to avoid unphysical solvent-solute boundary definitions and associated geometric singularities, and dynamically couple polar and non-polar interactions in a self-consistent framework. Our earlier study indicates that the DG based non-polar solvation model outperforms other methods in non-polar solvation energy predictions. However, the DG based full solvation model has not shown its superiority in solvation analysis, due to its difficulty in parametrization, which must ensure the stability of the solution of the strongly coupled nonlinear Laplace-Beltrami and Poisson-Boltzmann equations. In this work, we introduce new parameter learning algorithms based on perturbation and convex optimization theories to stabilize the numerical solution and thus achieve an optimal parametrization of the DG based solvation models. An interesting feature of the present DG based solvation model is that it provides accurate solvation free energy predictions for both polar and non-polar molecules in a unified formulation. Extensive numerical experiments demonstrate that the present DG based solvation model delivers some of the most accurate predictions of the solvation free energies for a large number of molecules. PMID:26450304

  1. EPR-based material modelling of soils considering volume changes

    Science.gov (United States)

    Faramarzi, Asaad; Javadi, Akbar A.; Alani, Amir M.

    2012-11-01

    In this paper an approach is presented for developing material models for soils based on evolutionary polynomial regression (EPR), taking into account their volumetric behaviour. EPR is a recently developed hybrid data-mining technique that searches for structured mathematical equations (representing the behaviour of a system) using a genetic algorithm and the least squares method. Stress-strain data from triaxial tests are used to train and develop EPR-based material models for soil. The developed models are compared with some well-known conventional material models. In particular, the capability of the developed EPR models in predicting the volume change behaviour of soils is illustrated. It is also shown that the developed EPR-based material models can be incorporated in finite element (FE) analysis. Two geotechnical examples are presented to verify the developed EPR-based FE model (EPR-FEM). The results of the EPR-FEM are compared with those of a standard FEM where conventional constitutive models are used to describe the material behaviour. The results show that EPR-FEM can be successfully employed to analyse geotechnical engineering problems. The advantages of the proposed EPR models are highlighted.
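
    EPR couples a genetic algorithm (searching which polynomial terms to keep) with least squares (fitting their coefficients). The sketch below shows only the least-squares half on a fixed, assumed set of candidate terms and synthetic stress-strain data; the GA search over term structures is omitted.

        import numpy as np

        rng = np.random.default_rng(0)
        strain = rng.uniform(0.0, 0.05, 200)
        p_conf = rng.uniform(50.0, 300.0, 200)             # confining pressure (assumed units)
        stress = (800.0 * strain - 4000.0 * strain**2
                  + 0.3 * p_conf * strain
                  + rng.normal(0.0, 0.5, 200))             # synthetic "triaxial" data

        # candidate terms an EPR run might have selected
        terms = np.column_stack([strain, strain**2, p_conf * strain])
        coeffs, *_ = np.linalg.lstsq(terms, stress, rcond=None)
        print("fitted coefficients:", np.round(coeffs, 2)) # near [800, -4000, 0.3]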

  2. A Model-Based Prognostics Approach Applied to Pneumatic Valves

    Data.gov (United States)

    National Aeronautics and Space Administration — Within the area of systems health management, the task of prognostics centers on predicting when components will fail. Model-based prognostics exploits domain...

  3. A Model-based Prognostics Approach Applied to Pneumatic Valves

    Data.gov (United States)

    National Aeronautics and Space Administration — Within the area of systems health management, the task of prognostics centers on predicting when components will fail. Model-based prognostics exploits domain...

  4. Model-based Prognostics with Fixed-lag Particle Filters

    Data.gov (United States)

    National Aeronautics and Space Administration — Model-based prognostics exploits domain knowledge of the system, its components, and how they fail by casting the underlying physical phenomena in a...

  5. A model-data based systems approach to process intensification

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    is to apply a model-based synthesis method to systematically generate and evaluate alternatives in the first stage and an experiment-model based validation in the second stage. In this way, the search for alternatives is done very quickly, reliably and systematically over a wide range, while resources...... are preserved for focused validation of only the promising candidates in the second-stage. This approach, however, would be limited to intensification based on “known” unit operations, unless the PI process synthesis/design is considered at a lower level of aggregation, namely the phenomena level. That is......, the model-based synthesis method must employ models at lower levels of aggregation and through combination rules for phenomena, generate (synthesize) new intensified unit operations. An efficient solution procedure for the synthesis problem is needed to tackle the potentially large number of options...

  6. Probabilistic Model-Based Diagnosis for Electrical Power Systems

    Data.gov (United States)

    National Aeronautics and Space Administration — We present in this article a case study of the probabilistic approach to model-based diagnosis. Here, the diagnosed system is a real-world electrical power system,...

  7. RESEARCH ON VIRTUAL-PART-BASED CONNECTING ELEMENT MODELING

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    Based on an analysis of the inter-part characteristics, detail modifications and assembly relations of mechanical connecting elements, the idea of extending part-level feature modeling to inter-part feature modeling for assembly purposes is presented, and virtual-part-based connecting element modeling is proposed. During assembly modeling, base parts are modified by a Boolean subtraction between the virtual part and the part to be connected. A dynamic matching algorithm based on a list database is designed for the dynamic extension and off-line editing of connecting parts and virtual parts, and the design rules of connecting elements are encapsulated by the virtual part. A prototype software module for the rapid design of connecting elements is implemented on the self-developed CAD/CAM platform SuperMan.

  8. Content-based network model with duplication and divergence

    Science.gov (United States)

    Şengün, Yasemin; Erzan, Ayşe

    2006-06-01

    We construct a minimal content-based realization of the duplication and divergence model of genomic networks introduced by Wagner [Proc. Natl. Acad. Sci. 91 (1994) 4387] and investigate the scaling properties of the directed degree distribution and clustering coefficient. We find that the content-based network exhibits crossover between two scaling regimes, with log-periodic oscillations for large degrees. These features are not present in the original gene duplication model, but inherent in the content-based model of Balcan and Erzan. The scaling form of the degree distribution of the content-based model turns out to be robust under duplication and divergence, with some re-adjustment of the scaling exponents, while the out-clustering coefficient goes over from a weak power-law dependence on the degree, to an exponential decay under mutations which include splitting and merging of strings.

  9. Model-based approach for elevator performance estimation

    Science.gov (United States)

    Esteban, E.; Salgado, O.; Iturrospe, A.; Isasa, I.

    2016-02-01

    In this paper, a dynamic model for an elevator installation is presented in the state space domain. The model comprises both the mechanical and the electrical subsystems, including the electrical machine and a closed-loop field-oriented control. The proposed model is employed for monitoring the condition of the elevator installation. The adopted model-based approach employs the Kalman filter as an observer, which estimates the elevator car acceleration, the quantity that determines the elevator ride quality, based solely on the machine control signature and the encoder signal. Five elevator key performance indicators are then calculated based on the estimated car acceleration. The proposed procedure is experimentally evaluated by comparing the key performance indicators calculated from the estimated car acceleration with the values obtained from actual acceleration measurements in a test bench. Finally, the proposed procedure is compared with a sliding mode observer.
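
    A hedged sketch of the observer idea: a linear Kalman filter estimating car velocity from a noisy position encoder. The constant-velocity process model, the noise covariances and the 0.5 m/s test motion are assumptions, not values from the paper.

        import numpy as np

        dt = 0.01
        F = np.array([[1.0, dt], [0.0, 1.0]])   # state: [position, velocity]
        H = np.array([[1.0, 0.0]])              # encoder measures position only
        Q = np.diag([1e-6, 1e-4])               # process noise (assumed)
        R = np.array([[1e-4]])                  # encoder noise (assumed)

        x, P = np.zeros((2, 1)), np.eye(2)
        rng = np.random.default_rng(0)
        true_pos = 0.0
        for _ in range(500):
            true_pos += 0.5 * dt                # car moving at 0.5 m/s
            z = true_pos + rng.normal(0.0, 1e-2)
            x, P = F @ x, F @ P @ F.T + Q                    # predict
            K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)     # Kalman gain
            x = x + K @ (np.array([[z]]) - H @ x)            # update
            P = (np.eye(2) - K @ H) @ P
        print("estimated velocity =", round(float(x[1, 0]), 3), "m/s")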

  10. Physics-Based Pneumatic Hammer Instability Model Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this project is to develop a physics-based pneumatic hammer instability model that accurately predicts the stability of hydrostatic bearings...

  11. A Model-based Avionic Prognostic Reasoner (MAPR)

    Data.gov (United States)

    National Aeronautics and Space Administration — The Model-based Avionic Prognostic Reasoner (MAPR) presented in this paper is an innovative solution for non-intrusively monitoring the state of health (SoH) and...

  12. Similar words analysis based on POS-CBOW language model

    Directory of Open Access Journals (Sweden)

    Dongru RUAN

    2015-10-01

    Full Text Available Similar words analysis is one of the important aspects in the field of natural language processing, and it has important research and application value in text classification, machine translation and information recommendation. Focusing on the features of Sina Weibo's short texts, this paper presents a language model named POS-CBOW, a continuous bag-of-words language model with a filtering layer and a part-of-speech tagging layer. The proposed approach can adjust word-vector similarity according to cosine similarity and the word vectors' part-of-speech metrics. It can also filter the similar-word sets on the basis of a statistical analysis model. The experimental results show that the similar words analysis algorithm based on the proposed POS-CBOW language model is better than that based on the traditional CBOW language model.
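
    The similarity adjustment described here starts from plain cosine similarity between word vectors. A minimal sketch with toy 4-dimensional vectors (not embeddings trained on Weibo data):

        import numpy as np

        vecs = {                               # toy embeddings (assumed values)
            "happy": np.array([0.9, 0.1, 0.3, 0.0]),
            "glad":  np.array([0.8, 0.2, 0.35, 0.05]),
            "table": np.array([0.0, 0.9, 0.1, 0.7]),
        }
        cos = lambda a, b: float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
        for w in ("glad", "table"):
            print("happy ~", w, "=", round(cos(vecs["happy"], vecs[w]), 3))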

  13. Establishing a model for evidence based collection management

    OpenAIRE

    Koufogiannakis, Denise

    2007-01-01

    Establishing a model for evidence based collection management Question: How can collection managers and selectors structure their practice so that collection decisions are more evidence based? Can a model be established to provide a framework for decision making in a large academic institution? What questions need to be answered and what sources of information are most appropriate? Where does one begin to find useful information and how can it make a difference in day-to-day work? Sett...

  14. An Efficient Semantic Model For Concept Based Clustering And Classification

    OpenAIRE

    SaiSindhu Bandaru; Dr. K B Madhuri

    2012-01-01

    Usually in text mining techniques, basic measures like the term frequency of a term (word or phrase) are computed to determine the importance of the term in the document. But such statistical analysis may not capture the original semantics, i.e. the exact meaning, of the term. To overcome this problem, a new framework has been introduced which relies on a concept-based model and a synonym-based approach. The proposed model can efficiently find significant matching and related concepts between doc...

  15. Articulated Pose Estimation Using Hierarchical Exemplar-Based Models

    OpenAIRE

    Liu, Jiongxin; Li, Yinxiao; Allen, Peter; Belhumeur, Peter

    2015-01-01

    Exemplar-based models have achieved great success on localizing the parts of semi-rigid objects. However, their efficacy on highly articulated objects such as humans is yet to be explored. Inspired by hierarchical object representation and recent application of Deep Convolutional Neural Networks (DCNNs) on human pose estimation, we propose a novel formulation that incorporates both hierarchical exemplar-based models and DCNNs in the spatial terms. Specifically, we obtain more expressive spati...

  16. Agent-based Models for Economic Policy Design

    OpenAIRE

    Dawid, Herbert; Neugart, Michael

    2010-01-01

    Agent-based simulation models are used by an increasing number of scholars as a tool for providing evaluations of economic policy measures and policy recommendations in complex environments. On the basis of recent work in this area we discuss the advantages of agent-based modeling for economic policy design and identify further needs to be addressed for strengthening this methodological approach as a basis for sound policy advice.

  17. Modeling And Implementation Of Cognitive-Based Supervision and Assistance

    OpenAIRE

    Fu, Xingguang; Mosebach, Henning Hajo; Gamrad, Dennis; Lemmer, Karsten; Söffker, Dirk

    2009-01-01

    This contribution presents firstly the implementation of an automated cognitive-based supervision concept of a real vehicle. The concept employs a Situation-Operator-Modelling (SOM) approach as a representational level to model and formalize the logic of interaction between driver, vehicle and environment based on sensor and video data. The programmed implementation is realized by a Java-Application which can be connected with the ViewCar, the DLR experimental vehicle equipped with specialize...

  18. A NEW DYNAMIC DEFENSE MODEL BASED ON ACTIVE DECEPTION

    Institute of Scientific and Technical Information of China (English)

    Gong Jing; Sun Zhixin; Gu Qiang

    2009-01-01

    Aiming at the limitations of traditional passive deception models, this paper constructs a Decoy Platform based on Intelligent Agent (DPIA) to realize dynamic defense. The paper explores a new dynamic defense model based on active deception, introduces its architecture, and expatiates on the communication methods and security guarantees in information transfer. Simulation results show that the DPIA can effectively attract hacker activity, lead abnormal traffic into itself, absorb a large amount of attack data, and ensure real network security.

  19. Business model analysis of interest-based social networking services

    OpenAIRE

    Lehtinen, Ilari

    2013-01-01

    Objectives of the study This research paper sets out to examine interest-based social networking services and the underlying business models that provide the logic for value creation, delivery, and capture. The objective of this paper is to uncover the common characteristics of interest-based social networking services' business models in order to understand the necessary building blocks that need to be present for a new service to function properly. Furthermore, it aims at giving manager...

  20. TRAFFIC FLOW MODEL BASED ON CELLULAR AUTOMATA WITH ADAPTIVE DECELERATION

    OpenAIRE

    Shinkarev, A. A.

    2016-01-01

    This paper describes a continuation of the authors' work in the field of traffic flow mathematical models based on cellular automata theory. The refactored representation of the multifactorial traffic flow model based on cellular automata theory is used to represent the implementation of an adaptive deceleration step. When the leader decelerates, the adaptive deceleration step allows a vehicle to slow down smoothly rather than instantly. Concepts of the number of time steps without confli...
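
    For orientation, here is a minimal cellular automaton of the Nagel-Schreckenberg family, the model class the record refactors. The authors' adaptive deceleration step is not reproduced; this baseline brakes instantly to the gap, which is exactly the behaviour that step smooths. All parameters are arbitrary.

        import random

        random.seed(0)
        L_ROAD, V_MAX, P_SLOW, N_CARS = 100, 5, 0.2, 20
        pos = sorted(random.sample(range(L_ROAD), N_CARS))  # ring road
        vel = [0] * N_CARS

        for step in range(200):
            gaps = [(pos[(i + 1) % N_CARS] - pos[i] - 1) % L_ROAD for i in range(N_CARS)]
            for i in range(N_CARS):
                v = min(vel[i] + 1, V_MAX)               # 1. accelerate
                v = min(v, gaps[i])                      # 2. brake instantly to the gap
                if v > 0 and random.random() < P_SLOW:   # 3. random dawdling
                    v -= 1
                vel[i] = v
            pos = [(pos[i] + vel[i]) % L_ROAD for i in range(N_CARS)]

        print("mean speed after 200 steps:", sum(vel) / N_CARS)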

  1. GIS-based modeling of runoff source areas and pathways

    OpenAIRE

    N. J. Kuhn; Zhu, H.

    2002-01-01

    The application of runoff models that rely on calibration to future land use and climate conditions is restricted to situations where the reaction of Hydrologic Response Units to environmental change is known. This limitation and the ensuing uncertainty of model results can be avoided when a risk-based approach to landscape and runoff analysis is taken. GIS-based landscape analysis provides the possibility of assessing the risks associated with non-linear responses ...

  2. Discrete Discriminant analysis based on tree-structured graphical models

    DEFF Research Database (Denmark)

    Perez de la Cruz, Gonzalo; Eslava, Guillermina

    The purpose of this paper is to illustrate the potential use of discriminant analysis based on tree-structured graphical models for discrete variables. This is done by comparing its empirical performance using estimated error rates for real and simulated data. The results show that discriminant a...... analysis based on tree-structured graphical models is a simple nonlinear method competitive with, and sometimes superior to, other well-known linear methods like those assuming mutual independence between variables and linear logistic regression....

  3. Lagrangian-based Hydrodynamic Model: Freeway Traffic Estimation

    OpenAIRE

    Han, Ke; Yao, Tao; Terry L. Friesz

    2012-01-01

    This paper is concerned with highway traffic estimation using traffic sensing data, in a Lagrangian-based modeling framework. We consider the Lighthill-Whitham-Richards (LWR) model (Lighthill and Whitham, 1955; Richards, 1956) in Lagrangian-coordinates, and provide rigorous mathematical results regarding the equivalence of viscosity solutions to the Hamilton-Jacobi equations in Eulerian and Lagrangian coordinates. We derive closed-form solutions to the Lagrangian-based Hamilton-Jacobi equatio...
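
    For reference, the Eulerian starting point of this framework is the scalar LWR conservation law; the Greenshields closure shown is one common assumption rather than something the record fixes:

        \partial_t \rho + \partial_x \bigl( \rho \, v(\rho) \bigr) = 0,
        \qquad
        v(\rho) = v_f \left( 1 - \frac{\rho}{\rho_{\max}} \right)

    Here \rho(t, x) is the traffic density, v_f the free-flow speed and \rho_{\max} the jam density; the Hamilton-Jacobi form analyzed in the paper is obtained by integrating this conservation law in space.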

  4. Error model identification of inertial navigation platform based on errors-in-variables model

    Institute of Scientific and Technical Information of China (English)

    Liu Ming; Liu Yu; Su Baoku

    2009-01-01

    Because the real input acceleration cannot be obtained during the error model identification of an inertial navigation platform, both the input and output data contain noise. In this case, the conventional regression model and the least squares (LS) method will result in bias. Based on the models of inertial navigation platform error and observation error, the errors-in-variables (EV) model and the total least squares (TLS) method are proposed to identify the error model of the inertial navigation platform. The estimation precision is improved, and the result is better than that of the conventional regression model with the LS method. The simulation results illustrate the effectiveness of the proposed method.
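
    A minimal sketch of the errors-in-variables idea: total least squares attributes noise to both the regressors and the observations, avoiding the attenuation bias of ordinary least squares. Data are synthetic; the SVD construction is the standard TLS solution for equal noise levels in all columns.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 500
        A_true = rng.normal(size=(n, 2))
        x_true = np.array([1.5, -0.7])
        A = A_true + rng.normal(0.0, 0.3, (n, 2))        # noisy "inputs"
        b = A_true @ x_true + rng.normal(0.0, 0.3, n)    # noisy "outputs"

        # TLS: smallest right singular vector of the augmented matrix [A | b]
        _, _, Vt = np.linalg.svd(np.column_stack([A, b]))
        v = Vt[-1]
        x_tls = -v[:2] / v[2]
        x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
        print("LS :", np.round(x_ls, 3))                 # attenuated towards zero
        print("TLS:", np.round(x_tls, 3))                # closer to the true [1.5, -0.7]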

  5. Model-Based Reconstructive Elasticity Imaging Using Ultrasound

    Directory of Open Access Journals (Sweden)

    Salavat R. Aglyamov

    2007-01-01

    Full Text Available Elasticity imaging is a reconstructive imaging technique where tissue motion in response to mechanical excitation is measured using modern imaging systems, and the estimated displacements are then used to reconstruct the spatial distribution of Young's modulus. Here we present an ultrasound elasticity imaging method that utilizes the model-based technique for Young's modulus reconstruction. Based on the geometry of the imaged object, only one axial component of the strain tensor is used. The numerical implementation of the method is highly efficient because the reconstruction is based on an analytic solution of the forward elastic problem. The model-based approach is illustrated using two potential clinical applications: differentiation of liver hemangioma and staging of deep venous thrombosis. Overall, these studies demonstrate that model-based reconstructive elasticity imaging can be used in applications where the geometry of the object and the surrounding tissue is somewhat known and certain assumptions about the pathology can be made.

  6. A General Polygon-based Deformable Model for Object Recognition

    DEFF Research Database (Denmark)

    Jensen, Rune Fisker; Carstensen, Jens Michael

    1999-01-01

    We propose a general scheme for object localization and recognition based on a deformable model. The model combines shape and image properties by warping an arbitrary prototype intensity template according to the deformation in shape. The shape deformations are constrained by a probabilistic...... distribution, which combined with a match of the warped intensity template and the image form the final criteria used for localization and recognition of a given object. The chosen representation gives the model an ability to model an almost arbitrary object. Besides the actual model a full general scheme for...

  7. Intelligent Cost Modeling Based on Soft Computing for Avionics Systems

    Institute of Scientific and Technical Information of China (English)

    ZHU Li-li; LI Zhuang-sheng; XU Zong-ze

    2006-01-01

    In parametric cost estimating, objections to using statistical Cost Estimating Relationships (CERs) and parametric models include problems of low statistical significance due to limited data points, biases in the underlying data, and lack of robustness. Soft Computing (SC) technologies are used for building intelligent cost models. The SC models are systematically evaluated based on their training and prediction of the historical cost data of airborne avionics systems. Results indicating the strengths and weaknesses of each model are presented. In general, the intelligent cost models have higher prediction precision, better data adaptability, and stronger self-learning capability than the regression CERs.

  8. Electromagnetic Model and Image Reconstruction Algorithms Based on EIT System

    Institute of Scientific and Technical Information of China (English)

    CAO Zhang; WANG Huaxiang

    2006-01-01

    An intuitive 2D model of a circular electrical impedance tomography (EIT) sensor with small electrodes is established based on the theory of analytic functions. The model is validated against the solution of the Laplace equation. Based on the model, which takes electrode distance into account and can be generalized to a sensor with any simply connected region through a conformal transformation, suggestions for electrode optimization and an explanation of the ill-conditioned sensitivity matrix are provided. Image reconstruction algorithms based on the model are implemented to show its feasibility, using experimental data collected from the EIT system developed at Tianjin University. In a simulation with a human-chest-like configuration, electrical conductivity distributions are reconstructed using equi-potential backprojection (EBP) and Tikhonov regularization (TR) based on a conformal transformation of the model. The algorithms based on the model are suitable for online image reconstruction, and the reconstructed results are good in both size and position.
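
    A hedged sketch of the regularization step: EIT reconstruction is ill-conditioned, so a linearized problem Ax = b is solved with Tikhonov regularization, x = (A^T A + lam I)^(-1) A^T b. The random matrix below stands in for a real sensitivity matrix; sizes and the regularization weight are assumptions.

        import numpy as np

        rng = np.random.default_rng(0)
        A = rng.normal(size=(32, 100))          # 32 boundary measurements, 100 pixels
        x_true = np.zeros(100)
        x_true[40:45] = 1.0                     # a small conductivity perturbation
        b = A @ x_true + rng.normal(0.0, 0.01, 32)

        lam = 1.0                               # regularization weight (assumed)
        x_rec = np.linalg.solve(A.T @ A + lam * np.eye(100), A.T @ b)
        print("reconstruction peaks at pixel", int(np.argmax(x_rec)))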

  9. The impact of design-based modeling instruction on seventh graders' spatial abilities and model-based argumentation

    Science.gov (United States)

    McConnell, William J.

    Due to the call of current science education reform for the integration of engineering practices within science classrooms, design-based instruction is receiving much attention in science education literature. Although some aspect of modeling is often included in well-known design-based instructional methods, it is not always a primary focus. The purpose of this study was to better understand how design-based instruction with an emphasis on scientific modeling might impact students' spatial abilities and their model-based argumentation abilities. In the following mixed-method multiple case study, seven seventh grade students attending a secular private school in the Mid-Atlantic region of the United States underwent an instructional intervention involving design-based instruction, modeling and argumentation. Through the course of a lesson involving students in exploring the interrelatedness of the environment and an animal's form and function, students created and used multiple forms of expressed models to assist them in model-based scientific argument. Pre/post data were collected through the use of The Purdue Spatial Visualization Test: Rotation, the Mental Rotation Test and interviews. Other data included a spatial activities survey, student artifacts in the form of models, notes, exit tickets, and video recordings of students throughout the intervention. Spatial abilities tests were analyzed using descriptive statistics while students' arguments were analyzed using the Instrument for the Analysis of Scientific Curricular Arguments and a behavior protocol. Models were analyzed using content analysis and interviews and all other data were coded and analyzed for emergent themes. Findings in the area of spatial abilities included increases in spatial reasoning for six out of seven participants, and an immense difference in the spatial challenges encountered by students when using CAD software instead of paper drawings to create models. Students perceived 3D printed

  10. Using Built-In Domain-Specific Modeling Support to Guide Model-Based Test Generation

    CERN Document Server

    Kanstrén, Teemu; 10.4204/EPTCS.80.5

    2012-01-01

    We present a model-based testing approach to support automated test generation with domain-specific concepts. This includes a language expert who is an expert at building test models and domain experts who are experts in the domain of the system under test. First, we provide a framework to support the language expert in building test models using a full (Java) programming language with the help of simple but powerful modeling elements of the framework. Second, based on the model built with this framework, the toolset automatically forms a domain-specific modeling language that can be used to further constrain and guide test generation from these models by a domain expert. This makes it possible to generate a large set of test cases covering the full model, chosen (constrained) parts of the model, or manually define specific test cases on top of the model while using concepts familiar to the domain experts.

  11. Using Built-In Domain-Specific Modeling Support to Guide Model-Based Test Generation

    Directory of Open Access Journals (Sweden)

    Teemu Kanstrén

    2012-02-01

    Full Text Available We present a model-based testing approach to support automated test generation with domain-specific concepts. This includes a language expert who is an expert at building test models and domain experts who are experts in the domain of the system under test. First, we provide a framework to support the language expert in building test models using a full (Java) programming language with the help of simple but powerful modeling elements of the framework. Second, based on the model built with this framework, the toolset automatically forms a domain-specific modeling language that can be used to further constrain and guide test generation from these models by a domain expert. This makes it possible to generate a large set of test cases covering the full model, chosen (constrained) parts of the model, or manually define specific test cases on top of the model while using concepts familiar to the domain experts.

  12. Support vector machine-based multi-model predictive control

    Institute of Scientific and Technical Information of China (English)

    Zhejing BA; Youxian SUN

    2008-01-01

    In this paper, a support vector machine-based multi-model predictive control is proposed, in which SVM classification combines well with SVM regression. First, each working environment is modeled by SVM regression, and the support vector machine network-based model predictive control (SVMN-MPC) algorithm corresponding to each environment is developed; then a multi-class SVM model is established to recognize multiple operating conditions. For control, the current environment is identified by the multi-class SVM model, and the corresponding SVMN-MPC controller is activated at each sampling instant. The proposed modeling, switching and controller design is demonstrated in simulation results.
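
    A minimal sketch of the switching idea: a multi-class SVM recognizes the operating condition, and a per-condition SVM regressor (standing in for the per-condition predictive model) is then consulted. Data are synthetic, and scikit-learn is assumed to be available.

        import numpy as np
        from sklearn.svm import SVC, SVR

        rng = np.random.default_rng(0)
        X0 = rng.normal(0.0, 1.0, (100, 2))            # two "environments"
        X1 = rng.normal(4.0, 1.0, (100, 2))
        X = np.vstack([X0, X1])
        cond = np.array([0] * 100 + [1] * 100)
        y = np.where(cond == 0, X[:, 0] * 2.0, -X[:, 0] + 3.0)  # different local dynamics

        classifier = SVC().fit(X, cond)                # operating-condition recognizer
        regressors = [SVR().fit(X[cond == c], y[cond == c]) for c in (0, 1)]

        x_now = np.array([[3.8, 4.2]])                 # current measurement
        c_now = int(classifier.predict(x_now)[0])      # identify the environment
        print("condition", c_now, "-> prediction",
              float(regressors[c_now].predict(x_now)[0]))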

  13. Tutorial on agent-based modeling and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Macal, C. M.; North, M. J.; Decision and Information Sciences

    2005-01-01

    Agent-based modeling and simulation (ABMS) is a new approach to modeling systems comprised of autonomous, interacting agents. ABMS promises to have far-reaching effects on the way that businesses use computers to support decision-making and researchers use electronic laboratories to support their research. Some have gone so far as to contend that ABMS is a third way of doing science besides deductive and inductive reasoning. Computational advances have made possible a growing number of agent-based applications in a variety of fields. Applications range from modeling agent behavior in the stock market and supply chains, to predicting the spread of epidemics and the threat of bio-warfare, from modeling consumer behavior to understanding the fall of ancient civilizations, to name a few. This tutorial describes the theoretical and practical foundations of ABMS, identifies toolkits and methods for developing ABMS models, and provides some thoughts on the relationship between ABMS and traditional modeling techniques.

  14. Automated Decomposition of Model-based Learning Problems

    Science.gov (United States)

    Williams, Brian C.; Millar, Bill

    1996-01-01

    A new generation of sensor rich, massively distributed autonomous systems is being developed that has the potential for unprecedented performance, such as smart buildings, reconfigurable factories, adaptive traffic systems and remote earth ecosystem monitoring. To achieve high performance these massive systems will need to accurately model themselves and their environment from sensor information. Accomplishing this on a grand scale requires automating the art of large-scale modeling. This paper presents a formalization of decompositional model-based learning (DML), a method developed by observing a modeler's expertise at decomposing large-scale model estimation tasks. The method exploits a striking analogy between learning and consistency-based diagnosis. Moriarty, an implementation of DML, has been applied to thermal modeling of a smart building, demonstrating a significant improvement in learning rate.

  15. Building information modeling based on intelligent parametric technology

    Institute of Scientific and Technical Information of China (English)

    ZENG Xudong; TAN Jie

    2007-01-01

    In order to advance the informatization of the building industry, promote sustainable architectural design and enhance the competitiveness of China's building industry, the author studies building information modeling (BIM) based on intelligent parametric modeling technology. Building information modeling is a new technology in the field of computer-aided architectural design, which contains not only geometric data but also a large amount of engineering data covering the whole lifecycle of a building. The author also compares BIM technology with two-dimensional CAD technology, and demonstrates the advantages and characteristics of intelligent parametric modeling technology. Building information modeling based on intelligent parametric modeling technology will certainly replace traditional computer-aided architectural design and become the new driving force pushing forward China's building industry in this information age.

  16. Evaluation of Artificial Intelligence Based Models for Chemical Biodegradability Prediction

    Directory of Open Access Journals (Sweden)

    Aleksandar Sabljic

    2004-12-01

    Full Text Available This study presents a review of biodegradability modeling efforts, including a detailed assessment of two models developed using an artificial-intelligence-based methodology. Validation results for these models, obtained using an independent, quality-reviewed database, demonstrate that the models perform well when compared to another commonly used biodegradability model against the same data. The ability of models induced by an artificial intelligence methodology to accommodate complex interactions in detailed systems, and the demonstrated reliability of the approach evaluated by this study, indicate that the methodology may have application in broadening the scope of biodegradability models. Given adequate data on the biodegradability of chemicals under environmental conditions, this may allow for the development of future models that include, for example, surface-interface impacts on biodegradability.

  17. A community-based framework for aquatic ecosystem models

    DEFF Research Database (Denmark)

    Trolle, Didde; Hamilton, D. P.; Hipsey, M. R.; Bolding, Karsten; Bruggeman, J.; Mooij, W. M.; Janse, J. H.; Nielsen, A.; Jeppesen, E.; Elliott, J. A.; Makler-Pick, V.; Petzoldt, T.; Rinke, K.; Flindt, M. R.; Arhonditsis, G. B.; Gal, G.; Bjerring, R.; Tominaga, K.; Hoen, J.; Downing, A. S.; Marques, D. M.; Fragoso, C. R.; Sondergaard, M.; Hanson, P. C.

    2012-01-01

    , and (viii) avoid 're-inventing the wheel', thus accelerating improvements to aquatic ecosystem models. We intend to achieve this as a community that fosters interactions amongst ecologists and model developers. Further, we outline scientific topics recently articulated by the scientific community...... a literature survey, we document the growing importance of numerical aquatic ecosystem models while also noting the difficulties, up until now, of the aquatic scientific community to make significant advances in these models during the past two decades. Through a common forum for aquatic ecosystem...... modellers we aim to (i) advance collaboration within the aquatic ecosystem modelling community, (ii) enable increased use of models for research, policy and ecosystem-based management, (iii) facilitate a collective framework using common (standardised) code to ensure that model development is incremental...

  18. Propositions for a PDF model based on fluid particle acceleration

    International Nuclear Information System (INIS)

    This paper describes theoretical propositions to model the acceleration of a fluid particle in a turbulent flow. Such a model is useful for the PDF approach to turbulent reactive flows as well as for the Lagrangian modelling of two-phase flows. The model developed here draws on ideas already put forward by Sawford, generalized here to the case of non-homogeneous flows. The model is built so as to revert continuously to Pope's model, which uses a Langevin equation for particle velocities, when the Reynolds number becomes very high. The derivation is based on the technique of fast variable elimination. This technique allows a careful analysis of the relations between different levels of modelling, and it makes it possible to address certain problems more rigorously. In particular, application of this technique shows that models presently used can in principle simulate bubbly flows including the pressure-gradient and added-mass forces. (author)
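
    In Langevin-type PDF models of the kind discussed here, the simplest homogeneous case reduces the particle velocity to an Ornstein-Uhlenbeck process. A minimal Euler-Maruyama sketch with an assumed Lagrangian time scale and velocity variance (not values from the paper):

        import numpy as np

        rng = np.random.default_rng(0)
        T_L, sigma, dt, n = 0.5, 1.0, 1e-3, 50000   # time scale, velocity std (assumed)
        u = np.zeros(n)
        for k in range(n - 1):
            dW = rng.normal(0.0, np.sqrt(dt))       # Wiener increment
            u[k + 1] = u[k] - (u[k] / T_L) * dt + np.sqrt(2.0 * sigma**2 / T_L) * dW
        print("sample velocity variance:", np.var(u))   # should approach sigma^2 = 1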

  19. Model Checking-Based Testing of Web Applications

    Institute of Scientific and Technical Information of China (English)

    ZENG Hongwei; MIAO Huaikou

    2007-01-01

    A formal model representing the navigation behavior of a Web application as a Kripke structure is proposed, and an approach that applies model checking to test case generation is presented. The Object Relation Diagram is employed as the object model to describe the object structure of a Web application design, and it can be translated into the behavior model. A key problem of model checking-based test generation for a Web application is how to construct a set of trap properties that are intended to cause violations during model checking against the behavior model, so that counterexamples are output and used to construct the test sequences. We give an algorithm that derives trap properties from the object model with respect to node and edge coverage criteria.

  20. Map-Based Channel Model for Urban Macrocell Propagation Scenarios

    Directory of Open Access Journals (Sweden)

    Jose F. Monserrat

    2015-01-01

    Full Text Available The evolution of LTE towards 5G has started, and different research projects and institutions are in the process of verifying new technology components through simulations. Coordination between groups is strongly recommended and, in this sense, a common definition of test cases and simulation models is needed. The scope of this paper is to present a realistic channel model for urban macrocell scenarios. This model is map-based and takes into account the layout of buildings situated in the area under study. A detailed description of the model is given, together with a comparison with other widely used channel models. The benchmark includes a measurement campaign in which the proposed model is shown to be much closer to the actual behavior of a cellular system. Particular attention is given to the outdoor component of the model, since it is here that the proposed approach differs most from previous models.