WorldWideScience

Sample records for soft computing approach

  1. Hybrid soft computing approaches research and applications

    CERN Document Server

    Dutta, Paramartha; Chakraborty, Susanta

    2016-01-01

    The book provides a platform for dealing with the flaws and failings of the soft computing paradigm through its different manifestations. The chapters highlight the necessity of hybrid soft computing methodology in general, with emphasis on several application perspectives in particular. Typical examples include (a) Study of Economic Load Dispatch by Various Hybrid Optimization Techniques, (b) An Application of Color Magnetic Resonance Brain Image Segmentation by ParaOptiMUSIG Activation Function, (c) Hybrid Rough-PSO Approach in Remote Sensing Imagery Analysis, (d) A Study and Analysis of Hybrid Intelligent Techniques for Breast Cancer Detection using Breast Thermograms, and (e) Hybridization of 2D-3D Images for Human Face Recognition. The elaborate findings of the chapters showcase the hybrid soft computing paradigm in the field of intelligent computing.

  2. Microarray-based cancer prediction using soft computing approach.

    Science.gov (United States)

    Wang, Xiaosheng; Gotoh, Osamu

    2009-05-26

    One of the difficulties in using gene expression profiles to predict cancer is how to effectively select a few informative genes to construct accurate prediction models from thousands or tens of thousands of genes. We screen highly discriminative genes and gene pairs to create simple prediction models involving single genes or gene pairs, on the basis of a soft computing approach and rough set theory. Accurate cancer prediction is obtained when we apply the simple prediction models to four cancer gene expression datasets: CNS tumor, colon tumor, lung cancer and DLBCL. Some genes closely correlated with the pathogenesis of specific or general cancers are identified. In contrast with other models, our models are simple, effective and robust. Moreover, our models are interpretable because they are based on decision rules. Our results demonstrate that very simple models may perform well on molecular cancer prediction, and that important gene markers of cancer can be detected if the gene selection approach is chosen reasonably.
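
    The record above builds interpretable one-gene decision rules ranked by discriminative power. A minimal sketch of that idea under stated assumptions: synthetic data stands in for the microarray profiles, and the threshold-rule score below is an illustrative stand-in for the authors' rough-set procedure.

    ```python
    # Rank single genes by how well a one-threshold decision rule separates
    # two tumor classes; keep the top genes as simple one-gene predictors.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 500))        # 60 samples x 500 genes (stand-in data)
    X[:30, 42] += 2.0                     # make gene 42 informative for class 0
    y = np.array([0] * 30 + [1] * 30)     # binary tumor labels

    def rule_accuracy(expr, labels):
        """Best accuracy of a rule 'predict 1 iff expression > t' over all cut points."""
        order = np.argsort(expr)
        expr, labels = expr[order], labels[order]
        best = 0.0
        for t in (expr[:-1] + expr[1:]) / 2:          # candidate thresholds
            pred = (expr > t).astype(int)
            acc = max((pred == labels).mean(), (pred != labels).mean())  # either rule direction
            best = max(best, acc)
        return best

    scores = np.array([rule_accuracy(X[:, g], y) for g in range(X.shape[1])])
    top = np.argsort(scores)[::-1][:5]
    print("most discriminative genes:", top, "accuracies:", scores[top].round(2))
    ```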

  3. Vehicular traffic noise prediction using soft computing approach.

    Science.gov (United States)

    Singh, Daljeet; Nigam, S P; Agrawal, V P; Kumar, Maneek

    2016-12-01

    A new approach for the development of vehicular traffic noise prediction models is presented. Four different soft computing methods, namely Generalized Linear Models, Decision Trees, Random Forests and Neural Networks, have been used to develop models to predict the hourly equivalent continuous sound pressure level, Leq, at different locations in the city of Patiala, India. The input variables include the traffic volume per hour, the percentage of heavy vehicles and the average speed of vehicles. The performance of the four models is compared on the basis of the coefficient of determination, mean square error and accuracy. 10-fold cross validation is performed to check the stability of the Random Forest model, which gave the best results, and a t-test is performed to check the fit of the model to the field data.
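
    A minimal sketch of the best-performing setup reported above: a Random Forest predicting hourly Leq from traffic volume, percentage of heavy vehicles and average speed, checked with 10-fold cross validation. The synthetic data merely stands in for the Patiala field measurements.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    n = 300
    volume = rng.uniform(100, 3000, n)          # vehicles per hour
    heavy = rng.uniform(0, 40, n)               # % heavy vehicles
    speed = rng.uniform(20, 80, n)              # km/h
    # stand-in Leq (dB) with a plausible dependence on the three inputs
    leq = 10 * np.log10(volume) + 0.1 * heavy - 0.05 * speed + rng.normal(0, 1, n)

    X = np.column_stack([volume, heavy, speed])
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    r2 = cross_val_score(model, X, leq, cv=10, scoring="r2")   # 10-fold CV as in the paper
    print(f"10-fold CV R^2: {r2.mean():.3f} +/- {r2.std():.3f}")
    ```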

  4. Role of Soft Computing Approaches in HealthCare Domain: A Mini Review.

    Science.gov (United States)

    Gambhir, Shalini; Malik, Sanjay Kumar; Kumar, Yugal

    2016-12-01

    In the present era, soft computing approaches play a vital role in solving many kinds of problems and provide promising solutions. Owing to their popularity, these approaches have also been applied to healthcare data for effectively diagnosing diseases, obtaining better results than traditional approaches. Soft computing approaches can adapt themselves to the problem domain and maintain a good balance between exploration and exploitation. These aspects make them powerful, reliable and efficient, and thus suitable and competent for healthcare data. The first objective of this review paper is to identify the various soft computing approaches used for diagnosing and predicting diseases. The second objective is to identify the various diseases to which these approaches are applied. The third objective is to categorize the soft computing approaches for clinical support systems. In the literature, a large number of soft computing approaches have been applied for effectively diagnosing and predicting diseases from healthcare data, among them particle swarm optimization, genetic algorithms, artificial neural networks and support vector machines. A detailed discussion of these approaches is presented in the literature section. This work summarizes the soft computing approaches used in the healthcare domain in the last decade. These approaches are grouped into five categories based on methodology: classification-model-based systems, expert systems, fuzzy and neuro-fuzzy systems, rule-based systems and case-based systems. Many techniques are discussed within these categories, and all of them are also summarized in tables. This work also focuses on the accuracy rate of soft computing techniques, and tabular information is provided for

  5. The soft computing-based approach to investigate allergic diseases: a systematic review.

    Science.gov (United States)

    Tartarisco, Gennaro; Tonacci, Alessandro; Minciullo, Paola Lucia; Billeci, Lucia; Pioggia, Giovanni; Incorvaia, Cristoforo; Gangemi, Sebastiano

    2017-01-01

    Early recognition of inflammatory markers and their relation to asthma, adverse drug reactions, allergic rhinitis, atopic dermatitis and other allergic diseases is an important goal in allergy. The vast majority of studies in the literature are based on classic statistical methods; however, developments in computational techniques such as soft computing-based approaches hold new promise in this field. The aim of this manuscript is to systematically review the main soft computing-based techniques, such as artificial neural networks, support vector machines, Bayesian networks and fuzzy logic, and to investigate their performance in the field of allergic diseases. The review was conducted following PRISMA guidelines, and the protocol was registered in the PROSPERO database (CRD42016038894). The search was performed on PubMed and ScienceDirect, covering the period from September 1, 1990 through April 19, 2016. The review included 27 studies related to allergic diseases and soft computing performance. We observed promising results, with an overall accuracy of 86.5%, mainly focused on asthmatic disease. The review reveals that soft computing-based approaches are suitable for big data analysis and can be very powerful, especially when dealing with uncertainty and poorly characterized parameters. Furthermore, they can provide valuable support in the case of missing data and entangled cause-effect relationships, which make it difficult to assess the evolution of disease. Although most works deal with asthma, we believe the soft computing approach could be a real breakthrough and foster new insights into other allergic diseases as well.

  6. An Integrated Soft Computing Approach to Hughes Syndrome Risk Assessment.

    Science.gov (United States)

    Vilhena, João; Rosário Martins, M; Vicente, Henrique; Grañeda, José M; Caldeira, Filomena; Gusmão, Rodrigo; Neves, João; Neves, José

    2017-03-01

    The AntiPhospholipid Syndrome (APS) is an acquired autoimmune disorder induced by high levels of antiphospholipid antibodies, whose clinical manifestations include arterial and venous thrombosis as well as pregnancy-related complications and morbidity. This autoimmune hypercoagulable state, usually known as Hughes syndrome, has severe consequences for patients, being one of the main causes of thrombotic disorders and death. Prevention is therefore essential, which requires an awareness of how likely a patient is to develop the syndrome. Despite the updated antiphospholipid syndrome classification, the diagnosis remains difficult to establish. Additional research on clinically relevant antibodies and standardization of their quantification are required in order to improve antiphospholipid syndrome risk assessment. Thus, this work focuses on the development of a diagnosis decision support system in terms of a formal agenda built on a Logic Programming approach to knowledge representation and reasoning, complemented with a computational framework based on Artificial Neural Networks. The proposed model improves the diagnosis, properly classifying the patients who actually present this pathology (sensitivity higher than 85%) as well as classifying the absence of APS (specificity close to 95%).
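
    The record reports the ANN component's sensitivity and specificity. A hedged sketch of that evaluation only, with synthetic features standing in for the antibody panel; nothing here reproduces the paper's Logic Programming knowledge base.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import confusion_matrix

    rng = np.random.default_rng(2)
    X = rng.normal(size=(400, 6))                 # stand-in antibody/lab features
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, 400) > 0).astype(int)

    Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
    clf = MLPClassifier(hidden_layer_sizes=(10,), solver="lbfgs",
                        max_iter=2000, random_state=0).fit(Xtr, ytr)

    # sensitivity = TP / (TP + FN), specificity = TN / (TN + FP)
    tn, fp, fn, tp = confusion_matrix(yte, clf.predict(Xte)).ravel()
    print(f"sensitivity = {tp / (tp + fn):.2f}, specificity = {tn / (tn + fp):.2f}")
    ```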

  7. A Soft Computing Approach to Kidney Diseases Evaluation.

    Science.gov (United States)

    Neves, José; Martins, M Rosário; Vilhena, João; Neves, João; Gomes, Sabino; Abelha, António; Machado, José; Vicente, Henrique

    2015-10-01

    Renal failure means that one's kidneys have unexpectedly stopped functioning, i.e., once chronic disease is exposed, the presence or degree of kidney dysfunction and its progression must be assessed, and the underlying syndrome has to be diagnosed. Although the patient's history and physical examination may denote good practice, some key information has to be obtained from the evaluation of the glomerular filtration rate and the analysis of serum biomarkers. Indeed, chronic kidney disease denotes abnormal kidney function and/or structure, and there is evidence that treatment may avoid or delay its progression, either by reducing or preventing the development of some associated complications, namely hypertension, obesity, diabetes mellitus, and cardiovascular complications. Acute kidney injury appears abruptly, with a rapid deterioration of renal function, but is often reversible if it is recognized early and treated promptly. In both situations, i.e., acute kidney injury and chronic kidney disease, early intervention can significantly improve the prognosis. The assessment of these pathologies is therefore mandatory, although it is hard to accomplish with traditional methodologies and existing problem-solving tools. Hence, this work focuses on the development of a hybrid decision support system, in terms of its knowledge representation and reasoning procedures, based on Logic Programming, that allows one to consider incomplete, unknown, and even contradictory information, complemented with an approach to computing centered on Artificial Neural Networks, in order to weigh the Degree-of-Confidence that one has in such a happening. The present study involved 558 patients with an average age of 51.7 years, and chronic kidney disease was observed in 175 cases. The dataset comprises twenty-four variables, grouped into five main categories. The proposed model showed a good performance in the diagnosis of chronic kidney disease, since the

  8. Current Trend Towards Using Soft Computing Approaches to Phase Synchronization in Communication Systems

    Science.gov (United States)

    Drake, Jeffrey T.; Prasad, Nadipuram R.

    1999-01-01

    This paper surveys recent advances in communications that utilize soft computing approaches to phase synchronization. Soft computing, as opposed to hard computing, is a collection of complementary methodologies that act in producing the most desirable control, decision, or estimation strategies. Recently, the communications area has explored the use of the principal constituents of soft computing, namely, fuzzy logic, neural networks, and genetic algorithms, for modeling, control, and most recently for the estimation of phase in phase-coherent communications. If the receiver in a digital communications system is phase-coherent, as is often the case, phase synchronization is required. Synchronization thus requires estimation and/or control at the receiver of an unknown or random phase offset.

  9. A Hybrid Soft Computing Approach for Subset Problems

    Directory of Open Access Journals (Sweden)

    Broderick Crawford

    2013-01-01

    Subset problems (set partitioning, packing, and covering) are formal models for many practical optimization problems. A set partitioning problem determines how the items in one set (S) can be partitioned into smaller subsets: all items in S must be contained in exactly one partition. Related problems are set packing (all items must be contained in at most one partition) and set covering (all items must be contained in at least one partition). Here, we present a hybrid solver based on ant colony optimization (ACO) combined with arc consistency for solving this kind of problem. ACO is a swarm intelligence metaheuristic inspired by the foraging behavior of ants; it can solve complex combinatorial problems for which traditional mathematical techniques may fail. In constraint programming, on the other hand, the solving process of Constraint Satisfaction Problems can dramatically reduce the search space by means of arc consistency, enforcing constraint consistency either prior to or during search. Our hybrid approach was tested with set covering and set partitioning benchmark datasets, and the performance of ACO improved when this filtering technique was embedded in its constructive phase.
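
    A toy version of the constructive ACO phase described above, applied to a small random set covering instance. The pheromone update and parameters are illustrative choices, not the authors' exact solver, and the arc consistency filtering is omitted.

    ```python
    import random

    random.seed(3)
    UNIVERSE = set(range(12))
    SUBSETS = [frozenset(random.sample(range(12), random.randint(2, 5))) for _ in range(20)]
    missing = UNIVERSE - frozenset().union(*SUBSETS)
    SUBSETS += [frozenset([e]) for e in missing]      # guarantee a feasible cover exists
    COST = [len(s) + random.random() for s in SUBSETS]

    def build_solution(tau):
        """One ant: repeatedly pick a subset with prob ~ pheromone * coverage/cost."""
        uncovered, chosen = set(UNIVERSE), []
        while uncovered:
            cand = [i for i in range(len(SUBSETS)) if SUBSETS[i] & uncovered]
            weights = [tau[i] * len(SUBSETS[i] & uncovered) / COST[i] for i in cand]
            i = random.choices(cand, weights=weights)[0]
            chosen.append(i)
            uncovered -= SUBSETS[i]
        return chosen

    tau = [1.0] * len(SUBSETS)                        # pheromone per subset
    best, best_cost = None, float("inf")
    for _ in range(50):                               # iterations
        for _ in range(10):                           # ants per iteration
            sol = build_solution(tau)
            cost = sum(COST[i] for i in sol)
            if cost < best_cost:
                best, best_cost = sol, cost
        tau = [0.9 * t for t in tau]                  # evaporation
        for i in best:                                # deposit on best-so-far solution
            tau[i] += 1.0 / best_cost
    print("best cover cost:", round(best_cost, 2), "subsets:", sorted(best))
    ```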

  10. Driving profile modeling and recognition based on soft computing approach.

    Science.gov (United States)

    Wahab, Abdul; Quek, Chai; Tan, Chin Keong; Takeda, Kazuya

    2009-04-01

    Advancements in biometrics-based authentication have led to its increasing prominence and are being incorporated into everyday tasks. Existing vehicle security systems rely only on alarms or smart cards as forms of protection. A biometric driver recognition system utilizing driving behaviors is a highly novel and personalized approach and could be incorporated into existing vehicle security systems to form a multimodal identification system and offer a greater degree of multilevel protection. In this paper, detailed studies have been conducted to model individual driving behavior in order to identify features that may be efficiently and effectively used to profile each driver. Feature extraction techniques based on Gaussian mixture models (GMMs) are proposed and implemented. Features extracted from the accelerator and brake pedal pressure were then used as inputs to a fuzzy neural network (FNN) system to ascertain the identity of the driver. Two fuzzy neural networks, namely, the evolving fuzzy neural network (EFuNN) and the adaptive network-based fuzzy inference system (ANFIS), are used to demonstrate the viability of the two proposed feature extraction techniques. The performances were compared against an artificial neural network (NN) implementation using the multilayer perceptron (MLP) network and a statistical method based on the GMM. Extensive testing was conducted and the results show great potential in the use of the FNN for real-time driver identification and verification. In addition, the profiling of driver behaviors has numerous other potential applications for use by law enforcement and companies dealing with buses and truck drivers.
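
    A brief sketch of the GMM feature-extraction idea: fit a Gaussian mixture to each driver's pedal-pressure signal and use the ordered mixture parameters as the profile fed to the FNN. The signals here are synthetic stand-ins for the real accelerator and brake data.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(4)

    def driver_profile(pedal_signal, n_components=3):
        """Return GMM means and weights fitted to a 1-D pedal-pressure time series."""
        gmm = GaussianMixture(n_components=n_components, random_state=0)
        gmm.fit(pedal_signal.reshape(-1, 1))
        order = np.argsort(gmm.means_.ravel())        # canonical ordering of components
        return np.concatenate([gmm.means_.ravel()[order], gmm.weights_[order]])

    driver_a = rng.normal([0.2, 0.5, 0.8], 0.05, size=(500, 3)).ravel()  # gentle driver
    driver_b = rng.normal([0.1, 0.6, 1.0], 0.05, size=(500, 3)).ravel()  # aggressive driver
    print("profile A:", driver_profile(driver_a).round(2))
    print("profile B:", driver_profile(driver_b).round(2))
    ```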

  11. Soft and hard computing approaches for real-time prediction of currents in a tide-dominated coastal area

    Digital Repository Service at National Institute of Oceanography (India)

    Charhate, S.B.; Deo, M.C.; SanilKumar, V.

    Owing to the complex real sea conditions, such methods may not always yield satisfactory results. This paper discusses a few alternative approaches based on the soft computing tools of artificial neural networks (ANNs) and genetic programming (GP...

  12. Engineering applications of soft computing

    CERN Document Server

    Díaz-Cortés, Margarita-Arimatea; Rojas, Raúl

    2017-01-01

    This book bridges the gap between Soft Computing techniques and their applications to complex engineering problems. In each chapter we endeavor to explain the basic ideas behind the proposed applications in an accessible format for readers who may not possess a background in some of the fields. Therefore, engineers or practitioners who are not familiar with Soft Computing methods will appreciate that the techniques discussed go beyond simple theoretical tools, since they have been adapted to solve significant problems that commonly arise in such areas. At the same time, the book will show members of the Soft Computing community how engineering problems are now being solved and handled with the help of intelligent approaches. Highlighting new applications and implementations of Soft Computing approaches in various engineering contexts, the book is divided into 12 chapters. Further, it has been structured so that each chapter can be read independently of the others.

  13. Soft computing approach for reliability optimization: State-of-the-art survey

    International Nuclear Information System (INIS)

    Gen, Mitsuo; Yun, Young Su

    2006-01-01

    In the broadest sense, reliability is a measure of performance of systems. As systems have grown more complex, the consequences of their unreliable behavior have become severe in terms of cost, effort, lives, etc., and the interest in assessing system reliability and the need for improving the reliability of products and systems have become very important. Most solution methods for reliability optimization assume that systems have redundancy components in series and/or parallel systems and that alternative designs are available. Reliability optimization problems concentrate on the optimal allocation of redundancy components and the optimal selection of alternative designs to meet system requirements. In the past two decades, numerous reliability optimization techniques have been proposed. Generally, these techniques can be classified as linear programming, dynamic programming, integer programming, geometric programming, heuristic methods, the Lagrangian multiplier method and so on. A Genetic Algorithm (GA), as a soft computing approach, is a powerful tool for solving various reliability optimization problems. In this paper, we briefly survey GA-based approaches for various reliability optimization problems, such as reliability optimization of redundant systems, reliability optimization with alternative design, reliability optimization with time-dependent reliability, reliability optimization with interval coefficients, bicriteria reliability optimization, and reliability optimization with fuzzy goals. We also introduce hybrid approaches combining GA with fuzzy logic, neural networks and other conventional search techniques. Finally, we present experiments on various reliability optimization problems using the hybrid GA approach.
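
    A compact genetic algorithm for the canonical problem in this survey, redundancy allocation: choose how many parallel components to place in each series subsystem to maximize system reliability under a cost budget. Instance data and GA settings are illustrative, not taken from the paper.

    ```python
    import random

    random.seed(5)
    R = [0.80, 0.85, 0.90, 0.75]        # component reliabilities per subsystem
    C = [3, 4, 2, 5]                    # component costs
    BUDGET, NMAX = 40, 5

    def fitness(n):
        cost = sum(c * x for c, x in zip(C, n))
        if cost > BUDGET:
            return 0.0                  # death penalty for infeasible chromosomes
        rel = 1.0
        for r, x in zip(R, n):
            rel *= 1 - (1 - r) ** x     # parallel redundancy in a series system
        return rel

    pop = [[random.randint(1, NMAX) for _ in R] for _ in range(30)]
    for _ in range(200):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:10]              # truncation selection of the fittest
        children = []
        while len(children) < 20:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(R))        # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.3:                # mutation
                i = random.randrange(len(R))
                child[i] = random.randint(1, NMAX)
            children.append(child)
        pop = parents + children
    best = max(pop, key=fitness)
    print("allocation:", best, "reliability:", round(fitness(best), 4))
    ```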

  14. A Computational Modeling Approach for Investigating Soft Tissue Balancing in Bicruciate Retaining Knee Arthroplasty

    Directory of Open Access Journals (Sweden)

    Shahram Amiri

    2012-01-01

    Although bicruciate retaining knee arthroplasty has shown improved function and patient satisfaction compared to other designs of total knee replacement, it remains a technically demanding option for treating severe cases of arthritic knees. One of the main challenges in bicruciate retaining arthroplasty is proper balancing of the soft tissue during surgery. In this study, the biomechanics of soft tissue balancing was investigated using a validated computational model of the knee joint with high-fidelity definitions of the soft tissue structures, along with a Taguchi method for design of experiments. The model was used to simulate intraoperative balancing of soft tissue structures following the combinations suggested by an orthogonal array design. The results were used to quantify the corresponding effects on the laxity of the joint under anterior-posterior, internal-external, and varus-valgus loads. These effects were ranked for each ligament bundle to identify the components of laxity most sensitive to the corresponding surgical modifications. The resulting sensitivity map for all the ligament bundles determined the components of laxity most suitable for examination during intraoperative balancing of the soft tissue. Ultimately, a sequence for intraoperative soft tissue balancing was suggested for bicruciate retaining knee arthroplasty.

  15. A Computational Modeling Approach for Investigating Soft Tissue Balancing in Bicruciate Retaining Knee Arthroplasty

    Science.gov (United States)

    Amiri, Shahram; Wilson, David R.

    2012-01-01

    Although bicruciate retaining knee arthroplasty has shown improved function and patient satisfaction compared to other designs of total knee replacement, it remains a technically demanding option for treating severe cases of arthritic knees. One of the main challenges in bicruciate retaining arthroplasty is proper balancing of the soft tissue during surgery. In this study, the biomechanics of soft tissue balancing was investigated using a validated computational model of the knee joint with high-fidelity definitions of the soft tissue structures, along with a Taguchi method for design of experiments. The model was used to simulate intraoperative balancing of soft tissue structures following the combinations suggested by an orthogonal array design. The results were used to quantify the corresponding effects on the laxity of the joint under anterior-posterior, internal-external, and varus-valgus loads. These effects were ranked for each ligament bundle to identify the components of laxity most sensitive to the corresponding surgical modifications. The resulting sensitivity map for all the ligament bundles determined the components of laxity most suitable for examination during intraoperative balancing of the soft tissue. Ultimately, a sequence for intraoperative soft tissue balancing was suggested for bicruciate retaining knee arthroplasty. PMID:23082090

  16. Evaluating six soft approaches

    DEFF Research Database (Denmark)

    Sørensen, Lene; Vidal, Rene Victor Valqui

    2006-01-01

    ’s interactive planning principles to be supported by soft approaches in carrying out the principles in action. These six soft approaches are suitable for supporting various steps of the strategy development and planning process. These are the SWOT analysis, the Future Workshop, the Scenario methodology, Strategic Option Development and Analysis, Strategic Choice Approach and Soft Systems Methodology. Evaluations of each methodology are carried out using a conceptual framework in which the organisation, the result, the process and the technology of the specific approach are taken into consideration. Using...

  17. Evaluating Six Soft Approaches

    DEFF Research Database (Denmark)

    Sørensen, Lene Tolstrup; Valqui Vidal, René Victor

    2008-01-01

    's interactive planning principles to be supported by soft approaches in carrying out the principles in action. These six soft approaches are suitable for supporting various steps of the strategy development and planning process. These are the SWOT analysis, the Future Workshop, the Scenario methodology, Strategic Option Development and Analysis, Strategic Choice Approach and Soft Systems Methodology. Evaluations of each methodology are carried out using a conceptual framework in which the organisation, the result, the process and the technology of the specific approach are taken into consideration. Using...

  18. Seismic Response Prediction of Buildings with Base Isolation Using Advanced Soft Computing Approaches

    Directory of Open Access Journals (Sweden)

    Mosbeh R. Kaloop

    2017-01-01

    Modeling the response of structures under seismic loads is an important factor in civil engineering, as it crucially affects the design and management of structures, especially in high-risk areas. In this study, novel applications of advanced soft computing techniques are utilized for predicting the behavior of centrically braced frame (CBF) buildings with a lead-rubber bearing (LRB) isolation system under ground motion effects. These techniques include the least square support vector machine (LSSVM), wavelet neural networks (WNN), and the adaptive neuro-fuzzy inference system (ANFIS), along with wavelet denoising. The simulation of a 2D frame model and eight ground motions are considered in this study to evaluate the prediction models. The comparison results indicate that the least square support vector machine is superior to the other techniques in estimating the behavior of smart structures.

  19. Assessing the suitability of soft computing approaches for forest fires prediction

    Directory of Open Access Journals (Sweden)

    Samaher Al_Janabi

    2018-07-01

    Forest fires are one of the main environmental hazards, with many negative consequences for different aspects of life. Therefore, early prediction, fast detection and rapid action are the key elements for controlling such phenomena and saving lives. In this work, 517 entries recorded at different times for Montesinho Natural Park (MNP) in Portugal were used to determine the best predictor for detecting forest fires. Principal component analysis (PCA) was applied to find the critical patterns, and the particle swarm optimization (PSO) technique was used to segment the fire regions (clusters). In the next stage, five soft computing (SC) techniques based on neural networks were used in parallel to identify the one giving the most accurate and optimal predictions of forest fires, namely: cascade correlation network (CCN), multilayer perceptron neural network (MPNN), polynomial neural network (PNN), radial basis function (RBF) and support vector machine (SVM). In the final stage, the predictors and their performance were evaluated based on five quality measures: root mean squared error (RMSE), mean squared error (MSE), relative absolute error (RAE), mean absolute error (MAE) and information gain (IG). The results indicate that the SVM technique was more effective and efficient than the RBF, MPNN, PNN and CCN predictors, providing more precise predictions with smaller estimation error. The obtained results confirm that SVM improves prediction accuracy and is suitable for forest fire prediction compared to the other methods. Keywords: Forest fires, Soft computing, Prediction, Principal component analysis, Particle swarm optimization, Cascade correlation network, Multilayer perceptron neural network, Polynomial neural networks, Radial basis function, Support vector machine
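
    A sketch of the winning pipeline's outer stages under stated assumptions: PCA for pattern extraction, an SVM regressor, and four of the paper's error measures (RMSE, MSE, MAE, RAE). Synthetic data replaces the 517 MNP records, and the PSO clustering stage is omitted.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.svm import SVR
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(6)
    X = rng.normal(size=(517, 10))                               # stand-in meteorological features
    y = np.abs(X[:, 0] * 2 + X[:, 1] + rng.normal(0, 0.3, 517))  # stand-in fire-size target

    Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
    model = make_pipeline(PCA(n_components=5), SVR(C=10.0)).fit(Xtr, ytr)
    pred = model.predict(Xte)

    mse = np.mean((pred - yte) ** 2)
    mae = np.mean(np.abs(pred - yte))
    rae = np.abs(pred - yte).sum() / np.abs(yte - yte.mean()).sum()  # relative absolute error
    print(f"RMSE={np.sqrt(mse):.3f} MSE={mse:.3f} MAE={mae:.3f} RAE={rae:.3f}")
    ```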

  1. Soft computing based on hierarchical evaluation approach and criteria interdependencies for energy decision-making problems: A case study

    International Nuclear Information System (INIS)

    Gitinavard, Hossein; Mousavi, S. Meysam; Vahdani, Behnam

    2017-01-01

    In numerous real-world energy decision problems, decision makers encounter complex environments in which imprecise data and uncertain information make it difficult to reach an appropriate decision. In this paper, a new soft computing group decision-making approach is introduced based on a novel compromise ranking method and interval-valued hesitant fuzzy sets (IVHFSs) for energy decision-making problems under multiple criteria. In the proposed approach, the assessment information is provided by energy experts or decision makers in terms of interval-valued hesitant fuzzy elements under incomplete criteria weights. A new ranking index based on the interval-valued hesitant fuzzy Hamming distance measure is presented to prioritize energy candidates, and criteria weights are computed by an extended maximizing deviation method that considers the experts' judgments about the relative importance of each criterion. Also, the decision making trial and evaluation laboratory (DEMATEL) method is extended to an IVHF environment to compute the interdependencies between and within the selected criteria in the hierarchical structure. To demonstrate the applicability of the presented approach, a case study and a practical example are provided for renewable energy and energy policy selection problems with hierarchical structure and criteria interdependencies. The computational results are compared with a fuzzy decision-making method from the recent literature on several comparison parameters to show the advantages and constraints of the proposed approach. Finally, a sensitivity analysis examines the effect of different criteria weights on the ranking results, showing the robustness or sensitivity of the proposed soft computing approach with respect to the relative importance of criteria. - Highlights: • Introducing a novel interval-valued hesitant fuzzy compromise ranking method. • Presenting
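
    A minimal sketch of one ingredient, assuming the common definition of the Hamming distance between equal-length interval-valued hesitant fuzzy elements (IVHFEs); the paper's actual ranking index and data may differ, and the candidate assessments below are hypothetical.

    ```python
    # An IVHFE is a set of membership intervals; one common Hamming distance
    # between two equal-length IVHFEs averages the interval end-point differences.
    def ivhf_hamming(a, b):
        """a, b: sorted lists of (lower, upper) membership intervals, same length."""
        assert len(a) == len(b)
        return sum(abs(l1 - l2) + abs(u1 - u2)
                   for (l1, u1), (l2, u2) in zip(a, b)) / (2 * len(a))

    ideal = [(0.8, 0.9), (0.9, 1.0)]                 # hypothetical ideal assessment
    candidates = {
        "wind":  [(0.6, 0.8), (0.7, 0.9)],
        "solar": [(0.5, 0.6), (0.6, 0.8)],
        "hydro": [(0.7, 0.9), (0.8, 0.95)],
    }
    # Rank energy alternatives by closeness to the ideal (smaller distance = better).
    for name, e in sorted(candidates.items(), key=lambda kv: ivhf_hamming(kv[1], ideal)):
        print(f"{name}: distance to ideal = {ivhf_hamming(e, ideal):.3f}")
    ```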

  2. Soft Computing Approach to Evaluate and Predict Blast-Induced Ground Vibration

    Science.gov (United States)

    Khandelwal, Manoj

    2010-05-01

    the same excavation site, different predictors give different values of safe PPV vis-à-vis safe charge per delay. There is no uniformity in the results given by the different predictors. All vibration predictor equations have their own site-specific constants, so they cannot be used in a generalized way with confidence and zero risk. To overcome this limitation, soft computing tools like the artificial neural network (ANN) have attracted attention because of their ability to learn from previously acquired patterns. An ANN is a highly interconnected network of a large number of processing elements called neurons, in an architecture inspired by the brain; it can be massively parallel and is hence said to exhibit parallel distributed processing. Once the network has been trained with a sufficient number of sample data sets, it can make reliable and trustworthy predictions, on the basis of its previous learning, about the output related to new input data of a similar pattern. This paper deals with the application of an ANN for the prediction of ground vibration, taking into consideration the maximum charge per delay and the distance between the blast face and the monitoring point. To investigate the appropriateness of this approach, the ANN predictions have also been compared with other vibration predictor equations.
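
    A sketch contrasting the two predictor families discussed above: a conventional site-specific equation of the form PPV = K (D/sqrt(Q))^(-B), fitted by log-log regression, and an ANN trained on the same two inputs. The field data are simulated, and the constants are illustrative.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(7)
    Q = rng.uniform(50, 500, 200)                  # max charge per delay (kg)
    D = rng.uniform(100, 1000, 200)                # distance to monitoring point (m)
    sd = D / np.sqrt(Q)                            # scaled distance
    ppv = 400 * sd ** -1.4 * rng.lognormal(0, 0.1, 200)   # synthetic PPV (mm/s)

    # conventional predictor: fit log(PPV) = log(K) - B * log(sd)
    slope, logK = np.polyfit(np.log(sd), np.log(ppv), 1)
    print(f"fitted site constants: K = {np.exp(logK):.1f}, B = {-slope:.2f}")

    # ANN predictor learning the same mapping from (log Q, log D)
    Xin = np.column_stack([np.log(Q), np.log(D)])
    ann = MLPRegressor(hidden_layer_sizes=(16,), solver="lbfgs",
                       max_iter=5000, random_state=0).fit(Xin, np.log(ppv))
    print("ANN log-PPV R^2:", round(ann.score(Xin, np.log(ppv)), 3))
    ```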

  3. A Soft Computing Based Approach Using Modified Selection Strategy for Feature Reduction of Medical Systems

    Directory of Open Access Journals (Sweden)

    Kursat Zuhtuogullari

    2013-01-01

    Systems with high-dimensional input spaces require long processing times and large memory usage. Most attribute selection algorithms suffer from input dimension limits and information storage problems. These problems are eliminated by means of the developed feature reduction software, which uses a new modified selection mechanism that adds solution candidates from the middle region. The hybrid system software is constructed for reducing the input attributes of systems with a large number of input variables. The designed software also supports the roulette wheel selection mechanism, and linear order crossover is used as the recombination operator. In genetic algorithm based soft computing methods, locking into local solutions is also a problem, which is eliminated by the developed software. Faster and more effective results are obtained in the test procedures. Twelve input variables of the urological system have been reduced to reducts (reduced input attributes) with seven, six, and five elements. It can be seen from the obtained results that the developed software with modified selection has advantages in memory allocation, execution time, classification accuracy, sensitivity, and specificity when compared with the other reduction algorithms on the urological test data.

  4. A soft computing based approach using modified selection strategy for feature reduction of medical systems.

    Science.gov (United States)

    Zuhtuogullari, Kursat; Allahverdi, Novruz; Arikan, Nihat

    2013-01-01

    Systems with high-dimensional input spaces require long processing times and large memory usage. Most attribute selection algorithms suffer from input dimension limits and information storage problems. These problems are eliminated by means of the developed feature reduction software, which uses a new modified selection mechanism that adds solution candidates from the middle region. The hybrid system software is constructed for reducing the input attributes of systems with a large number of input variables. The designed software also supports the roulette wheel selection mechanism, and linear order crossover is used as the recombination operator. In genetic algorithm based soft computing methods, locking into local solutions is also a problem, which is eliminated by the developed software. Faster and more effective results are obtained in the test procedures. Twelve input variables of the urological system have been reduced to reducts (reduced input attributes) with seven, six, and five elements. It can be seen from the obtained results that the developed software with modified selection has advantages in memory allocation, execution time, classification accuracy, sensitivity, and specificity when compared with the other reduction algorithms on the urological test data.
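
    A hedged sketch of two operators named in the abstract, roulette wheel selection and linear order crossover (LOX) on permutation-coded chromosomes. The chromosomes and fitness values are illustrative, and the modified "middle region" candidate injection is not reproduced here.

    ```python
    import random

    random.seed(8)

    def roulette(population, fitnesses):
        """Pick one individual with probability proportional to its fitness."""
        return random.choices(population, weights=fitnesses)[0]

    def linear_order_crossover(a, b):
        """Copy a slice from parent a; fill the rest with b's genes in b's order."""
        i, j = sorted(random.sample(range(len(a)), 2))
        middle = a[i:j]
        rest = [g for g in b if g not in middle]
        return rest[:i] + middle + rest[i:]

    pop = [random.sample(range(12), 12) for _ in range(6)]   # 12 candidate attributes
    fit = [random.uniform(0.5, 1.0) for _ in pop]            # stand-in reduct quality
    child = linear_order_crossover(roulette(pop, fit), roulette(pop, fit))
    print("child chromosome:", child)
    ```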

  5. Soft computing approach to 3D lung nodule segmentation in CT.

    Science.gov (United States)

    Badura, P; Pietka, E

    2014-10-01

    This paper presents a novel, multilevel approach to the segmentation of various types of pulmonary nodules in computed tomography studies. It is based on two branches of computational intelligence: fuzzy connectedness (FC) and evolutionary computation. First, the image and auxiliary data are prepared for the 3D FC analysis during the first stage of the algorithm: mask generation. Its main goal is to handle specific types of nodules connected to the pleura or vessels. It consists of basic image processing operations as well as dedicated routines for specific cases of nodules. Evolutionary computation is performed on the image and seed points in order to shorten the FC analysis and improve its accuracy. After the FC application, the remaining vessels are removed during the postprocessing stage. The method has been validated using the first dataset of studies acquired and described by the Lung Image Database Consortium (LIDC) and by its latest release, the LIDC-IDRI (Image Database Resource Initiative) database.

  6. Hardware for soft computing and soft computing for hardware

    CERN Document Server

    Nedjah, Nadia

    2014-01-01

    Single and Multi-Objective Evolutionary Computation (MOEA), Genetic Algorithms (GAs), Artificial Neural Networks (ANNs), Fuzzy Controllers (FCs), Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO) are becoming omnipresent in almost every intelligent system design. Unfortunately, the application of the majority of these techniques is complex and requires a huge computational effort to yield useful and practical results. Therefore, dedicated hardware for evolutionary, neural and fuzzy computation is a key issue for designers. With the spread of reconfigurable hardware such as FPGAs, digital as well as analog hardware implementations of such computation become cost-effective. The idea behind this book is to offer a variety of hardware designs for soft computing techniques that can be embedded in any final product, and to introduce the successful application of soft computing techniques to the many hard problems encountered during the design of embedded hardware. Reconfigurable em...

  7. Soft computing in computer and information science

    CERN Document Server

    Fray, Imed; Pejaś, Jerzy

    2015-01-01

    This book presents a carefully selected and reviewed collection of papers presented during the 19th Advanced Computer Systems conference, ACS-2014. From its beginning, the Advanced Computer Systems conference concentrated on methods and algorithms of artificial intelligence. Later years brought new areas of interest in technical informatics related to soft computing and more technological aspects of computer science, such as multimedia and computer graphics, software engineering, web systems, information security and safety, and project management. These topics are represented in the present book under the categories Artificial Intelligence, Design of Information and Multimedia Systems, Information Technology Security and Software Technologies.

  8. Soft computing for business intelligence

    CERN Document Server

    Pérez, Rafael; Cobo, Angel; Marx, Jorge; Valdés, Ariel

    2014-01-01

    The book Soft Computing for Business Intelligence is the remarkable output of a program based on the idea of joint trans-disciplinary research as supported by the Eureka Iberoamerica Network and the University of Oldenburg. It contains twenty-seven papers allocated to three sections: Soft Computing, Business Intelligence and Knowledge Discovery, and Knowledge Management and Decision Making. Although the contents touch different domains, they are similar insofar as they follow the BI principle "Observation and Analysis" while keeping a practically oriented theoretical eye on sound methodologies, like Fuzzy Logic, Compensatory Fuzzy Logic (CFL), Rough Sets and other soft computing elements. The book tears down the traditional focus on business, and extends Business Intelligence techniques in an impressive way to a broad range of fields like medicine, environment, wind farming, social collaboration and interaction, car sharing and sustainability.

  9. Advance Trends in Soft Computing

    CERN Document Server

    Kreinovich, Vladik; Kacprzyk, Janusz; WCSC 2013

    2014-01-01

    This book is the proceedings of the 3rd World Conference on Soft Computing (WCSC), which was held in San Antonio, TX, USA, on December 16-18, 2013. It presents state-of-the-art theory and applications of soft computing together with an in-depth discussion of current and future challenges in the field, providing readers with a 360 degree view on soft computing. Topics range from fuzzy sets to fuzzy logic, fuzzy mathematics, neuro-fuzzy systems, fuzzy control, decision making in fuzzy environments, image processing and many more. The book is dedicated to Lotfi A. Zadeh, a renowned specialist in signal analysis and control systems research, who proposed the idea of fuzzy sets, in which an element may have a partial membership, in the early 1960s, followed by the idea of fuzzy logic, in which a statement can be true only to a certain degree, with degrees described by numbers in the interval [0,1]. The performance of fuzzy systems can often be improved with the help of optimization techniques, e.g. evolutionary co...

  10. A soft computing-based approach to optimise queuing-inventory control problem

    Science.gov (United States)

    Alaghebandha, Mohammad; Hajipour, Vahid

    2015-04-01

    In this paper, a multi-product continuous review inventory control problem within a batch arrival queuing approach (MQr/M/1) is developed to find the optimal quantities of maximum inventory. The objective function is to minimise the sum of ordering, holding and shortage costs under warehouse space, service level, and expected lost-sales shortage cost constraints, from the retailer and warehouse viewpoints. Since the proposed model is NP-hard, an efficient imperialist competitive algorithm (ICA) is proposed to solve it. To validate the proposed ICA, both a genetic algorithm and a simulated annealing algorithm are utilised. In order to determine the best values of the algorithm parameters that result in a better solution, a fine-tuning procedure is executed. Finally, the performance of the proposed ICA is analysed using some numerical illustrations.

  11. A Soft Computing Approach to Crack Detection and Impact Source Identification with Field-Programmable Gate Array Implementation

    Directory of Open Access Journals (Sweden)

    Arati M. Dixit

    2013-01-01

    Real-time nondestructive testing (NDT) for crack detection and impact source identification (CDISI) has attracted researchers from diverse areas, as is apparent from the current literature. CDISI has usually been performed by visual assessment of waveforms generated by a standard data acquisition system. In this paper we suggest an automation of CDISI for metal armor plates using a soft computing approach, developing a fuzzy inference system to effectively deal with this problem. It is also advantageous to develop a chip that can contribute towards real-time CDISI. The objective of this paper is to report on efforts to develop an automated CDISI procedure and to formulate a technique such that the proposed method can be easily implemented on a chip. The CDISI fuzzy inference system is developed using MATLAB's fuzzy logic toolbox. A VLSI circuit for CDISI is developed on the basis of the fuzzy logic model using Verilog, a hardware description language (HDL). The Xilinx ISE WebPACK 9.1i is used for design, synthesis, implementation, and verification. The CDISI field-programmable gate array (FPGA) implementation is done using Xilinx's Spartan 3 FPGA. SynaptiCAD's Verilog simulators, VeriLogger PRO and ModelSim, are used as the software simulation and debug environment.
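
    A pure-Python, Mamdani-style fuzzy inference sketch in the spirit of the CDISI system. The input features, membership functions and rules are invented for illustration; the paper's actual rule base lives in MATLAB's fuzzy logic toolbox and is ported to Verilog.

    ```python
    def tri(x, a, b, c):
        """Triangular membership function with support [a, c] and peak at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def infer(amplitude, freq_ratio):
        # fuzzify inputs (domains normalised to [0, 1])
        amp_low, amp_high = tri(amplitude, -0.5, 0.0, 0.6), tri(amplitude, 0.4, 1.0, 1.5)
        fr_low, fr_high = tri(freq_ratio, -0.5, 0.0, 0.6), tri(freq_ratio, 0.4, 1.0, 1.5)
        # illustrative rule base: a crack is 'likely' when both features are high
        likely = min(amp_high, fr_high)
        maybe = max(min(amp_high, fr_low), min(amp_low, fr_high))
        unlikely = min(amp_low, fr_low)
        # Mamdani aggregation + centroid defuzzification on a discretised output
        xs = [i / 100 for i in range(101)]
        def agg(x):
            return max(min(unlikely, tri(x, -0.5, 0.0, 0.5)),
                       min(maybe, tri(x, 0.2, 0.5, 0.8)),
                       min(likely, tri(x, 0.5, 1.0, 1.5)))
        num = sum(x * agg(x) for x in xs)
        den = sum(agg(x) for x in xs) or 1.0
        return num / den

    print("crack indication:", round(infer(amplitude=0.9, freq_ratio=0.8), 2))
    ```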

  12. Evaluation of laser cutting process with auxiliary gas pressure by soft computing approach

    Science.gov (United States)

    Lazov, Lyubomir; Nikolić, Vlastimir; Jovic, Srdjan; Milovančević, Miloš; Deneva, Heristina; Teirumenieka, Erika; Arsic, Nebojsa

    2018-06-01

    Evaluation of the optimal laser cutting parameters is very important for high cut quality. Laser cutting is a highly nonlinear process with many parameters, which is the main challenge in the optimization process. Data mining is one of the most versatile methodologies for laser cutting process optimization. A support vector regression (SVR) procedure is implemented, since it is a versatile and robust technique for highly nonlinear data regression. The goal of this study was to determine the optimal laser cutting parameters that ensure robust conditions for minimizing average surface roughness. Three cutting parameters were used in the investigation: the cutting speed, the laser power, and the assist gas pressure. A TruLaser 1030 technological system was used as the laser source, with nitrogen as the assist gas. Prediction accuracy was very high according to the coefficient of determination (R2) and root mean square error (RMSE): R2 = 0.9975 and RMSE = 0.0337. Therefore the data mining approach can be used effectively to determine the optimal conditions of the laser cutting process.
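
    A minimal SVR sketch matching the study's setup: predict average surface roughness from cutting speed, laser power and assist gas pressure, reporting R2 and RMSE. The data below are simulated, not the TruLaser 1030 measurements, and the hyperparameters are illustrative.

    ```python
    import numpy as np
    from sklearn.svm import SVR
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(9)
    speed = rng.uniform(1, 10, 120)             # cutting speed (m/min)
    power = rng.uniform(1, 3, 120)              # laser power (kW)
    pressure = rng.uniform(6, 16, 120)          # nitrogen assist gas pressure (bar)
    ra = 2.0 + 0.3 * speed - 0.5 * power - 0.05 * pressure + rng.normal(0, 0.1, 120)

    X = np.column_stack([speed, power, pressure])
    pred = cross_val_predict(SVR(C=10.0, epsilon=0.05), X, ra, cv=5)
    rmse = np.sqrt(np.mean((pred - ra) ** 2))
    r2 = 1 - np.sum((pred - ra) ** 2) / np.sum((ra - ra.mean()) ** 2)
    print(f"R^2 = {r2:.4f}, RMSE = {rmse:.4f}")
    ```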

  13. Genetic networks and soft computing.

    Science.gov (United States)

    Mitra, Sushmita; Das, Ranajit; Hayashi, Yoichi

    2011-01-01

    The analysis of gene regulatory networks provides enormous information on various fundamental cellular processes involving growth, development, hormone secretion, and cellular communication. Their extraction from available gene expression profiles is a challenging problem. Such reverse engineering of genetic networks offers insight into cellular activity toward prediction of adverse effects of new drugs or possible identification of new drug targets. Tasks such as classification, clustering, and feature selection enable efficient mining of knowledge about gene interactions in the form of networks. It is known that biological data is prone to different kinds of noise and ambiguity. Soft computing tools, such as fuzzy sets, evolutionary strategies, and neurocomputing, have been found to be helpful in providing low-cost, acceptable solutions in the presence of various types of uncertainties. In this paper, we survey the role of these soft methodologies and their hybridizations, for the purpose of generating genetic networks.

  14. Soft Computing Methods for Disulfide Connectivity Prediction.

    Science.gov (United States)

    Márquez-Chamorro, Alfonso E; Aguilar-Ruiz, Jesús S

    2015-01-01

    The problem of protein structure prediction (PSP) is one of the main challenges in structural bioinformatics. To tackle this problem, PSP can be divided into several subproblems. One of these subproblems is the prediction of disulfide bonds. The disulfide connectivity prediction problem consists in identifying which nonadjacent cysteines would be cross-linked from all possible candidates. Determining the disulfide bond connectivity between the cysteines of a protein is desirable as a previous step of the 3D PSP, as the protein conformational search space is highly reduced. The most representative soft computing approaches for the disulfide bonds connectivity prediction problem of the last decade are summarized in this paper. Certain aspects, such as the different methodologies based on soft computing approaches (artificial neural network or support vector machine) or features of the algorithms, are used for the classification of these methods.

  15. Soft computing techniques in engineering applications

    CERN Document Server

    Zhong, Baojiang

    2014-01-01

    Soft computing techniques, which are based on the information processing of biological systems, are now massively used in the areas of pattern recognition, prediction and planning, as well as acting on the environment. Strictly speaking, soft computing is not a body of homogeneous concepts and techniques; rather, it is an amalgamation of distinct methods that conform to its guiding principle. At present, the main aim of soft computing is to exploit the tolerance for imprecision and uncertainty to achieve tractability, robustness and low solution cost. The principal constituents of soft computing techniques are probabilistic reasoning, fuzzy logic, neuro-computing, genetic algorithms, belief networks, chaotic systems, and learning theory. This book covers contributions from various authors demonstrating the use of soft computing techniques in various engineering applications.

  16. 4th World Conference on Soft Computing

    CERN Document Server

    Abbasov, Ali; Yager, Ronald; Shahbazova, Shahnaz; Reformat, Marek

    2016-01-01

    This book reports on advanced theories and cutting-edge applications in the field of soft computing. The individual chapters, written by leading researchers, are based on contributions presented during the 4th World Conference on Soft Computing, held May 25-27, 2014, in Berkeley. The book covers a wealth of key topics in soft computing, focusing on both fundamental aspects and applications. The former include fuzzy mathematics, type-2 fuzzy sets, evolutionary-based optimization, aggregation and neural networks, while the latter include soft computing in data analysis, image processing, decision-making, classification, series prediction, economics, control, and modeling. By providing readers with a timely, authoritative view on the field, and by discussing thought-provoking developments and challenges, the book will foster new research directions in the diverse areas of soft computing.

  17. New Concepts and Applications in Soft Computing

    CERN Document Server

    Fodor, János; Várkonyi-Kóczy, Annamária

    2013-01-01

    The book provides a sample of research on the innovative theory and applications of soft computing paradigms. The idea of Soft Computing was initiated in 1981, when Professor Zadeh published his first paper on soft data analysis, and has constantly evolved ever since. Professor Zadeh defined Soft Computing as the fusion of the fields of fuzzy logic (FL), neural network theory (NN) and probabilistic reasoning (PR), with the latter subsuming belief networks, evolutionary computing including DNA computing, chaos theory and parts of learning theory, into one multidisciplinary system. As Zadeh said, the essence of soft computing is that, unlike traditional hard computing, it is aimed at an accommodation with the pervasive imprecision of the real world. Thus, the guiding principle of soft computing is to exploit the tolerance for imprecision, uncertainty and partial truth to achieve tractability, robustness, low solution cost and better rapport with reality. ...

  18. Soft computing methods for geoidal height transformation

    Science.gov (United States)

    Akyilmaz, O.; Özlüdemir, M. T.; Ayan, T.; Çelik, R. N.

    2009-07-01

    Soft computing techniques, such as fuzzy logic and artificial neural network (ANN) approaches, have enabled researchers to create precise models for use in many scientific and engineering applications. Applications that can be employed in geodetic studies include the estimation of earth rotation parameters and the determination of mean sea level changes. Another important field of geodesy in which these computing techniques can be applied is geoidal height transformation. We report here our use of a conventional polynomial model, the Adaptive Network-based Fuzzy (or in some publications, Adaptive Neuro-Fuzzy) Inference System (ANFIS), an ANN and a modified ANN approach to approximate geoid heights. These approximation models have been tested on a number of test points. The results obtained through the transformation processes from ellipsoidal heights into local levelling heights have also been compared.
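
    A sketch of the conventional polynomial baseline that the soft computing models are compared against: fit the geoid undulation N = h - H (ellipsoidal minus levelled height) as a low-order polynomial surface by least squares. The points and coefficients below are synthetic and illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(10)
    x, y = rng.uniform(-1, 1, 80), rng.uniform(-1, 1, 80)   # normalised local coordinates
    # stand-in geoid undulations (m) over the test area
    N = 35.0 + 0.8 * x - 0.5 * y + 0.2 * x * y + 0.1 * x**2 + rng.normal(0, 0.01, 80)

    # second-order polynomial surface: N ~ c0 + c1*x + c2*y + c3*xy + c4*x^2 + c5*y^2
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])  # design matrix
    coef, *_ = np.linalg.lstsq(A, N, rcond=None)
    resid = N - A @ coef
    print("coefficients:", coef.round(3))
    print("RMS residual (m):", np.sqrt(np.mean(resid**2)).round(4))
    ```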

  19. Practical applications of soft computing in engineering

    CERN Document Server

    2001-01-01

    Soft computing has been presented not only through theoretical developments but also through a large variety of realistic applications to consumer products and industrial systems. The application of soft computing provides the opportunity to integrate human-like vagueness and real-life uncertainty into an otherwise hard computer program. This book highlights some of the recent developments in practical applications of soft computing to engineering problems. All the chapters have been carefully designed and revised by international experts to achieve wide but in-depth coverage. Contents: Au

  20. Fuzzy logic, neural networks, and soft computing

    Science.gov (United States)

    Zadeh, Lotfi A.

    1994-01-01

    The past few years have witnessed a rapid growth of interest in a cluster of modes of modeling and computation which may be described collectively as soft computing. The distinguishing characteristic of soft computing is that its primary aims are to achieve tractability, robustness, low cost, and high MIQ (machine intelligence quotient) through an exploitation of the tolerance for imprecision and uncertainty. Thus, in soft computing what is usually sought is an approximate solution to a precisely formulated problem or, more typically, an approximate solution to an imprecisely formulated problem. A simple case in point is the problem of parking a car. Generally, humans can park a car rather easily because the final position of the car is not specified exactly. If it were specified to within, say, a few millimeters and a fraction of a degree, it would take hours or days of maneuvering and precise measurements of distance and angular position to solve the problem. What this simple example points to is the fact that, in general, high precision carries a high cost. The challenge, then, is to exploit the tolerance for imprecision by devising methods of computation which lead to an acceptable solution at low cost. By its nature, soft computing is much closer to human reasoning than the traditional modes of computation. At this juncture, the major components of soft computing are fuzzy logic (FL), neural network theory (NN), and probabilistic reasoning techniques (PR), including genetic algorithms, chaos theory, and part of learning theory. Increasingly, these techniques are used in combination to achieve significant improvement in performance and adaptability. Among the important application areas for soft computing are control systems, expert systems, data compression techniques, image processing, and decision support systems. It may be argued that it is soft computing, rather than the traditional hard computing, that should be viewed as the foundation for artificial

  1. The role of soft computing in intelligent machines.

    Science.gov (United States)

    de Silva, Clarence W

    2003-08-15

    An intelligent machine relies on computational intelligence in generating its intelligent behaviour. This requires a knowledge system in which representation and processing of knowledge are central functions. Approximation is a 'soft' concept, and the capability to approximate for the purposes of comparison, pattern recognition, reasoning, and decision making is a manifestation of intelligence. This paper examines the use of soft computing in intelligent machines. Soft computing is an important branch of computational intelligence, where fuzzy logic, probability theory, neural networks, and genetic algorithms are synergistically used to mimic the reasoning and decision making of a human. This paper explores several important characteristics and capabilities of machines that exhibit intelligent behaviour. It presents a general structure for an intelligent machine, giving particular emphasis to its primary components, such as sensors, actuators, controllers, and the communication backbone, and their interaction. The role of soft computing within the overall system is discussed. Common techniques and approaches that will be useful in the development of an intelligent machine are introduced, and the main steps in the development of an intelligent machine for practical use are given. An industrial machine, which employs the concepts of soft computing in its operation, is presented, and one aspect of intelligent tuning, which is incorporated into the machine, is illustrated.

  2. International Conference on Soft Computing Systems

    CERN Document Server

    Panigrahi, Bijaya

    2016-01-01

    The book is a collection of high-quality peer-reviewed research papers presented in International Conference on Soft Computing Systems (ICSCS 2015) held at Noorul Islam Centre for Higher Education, Chennai, India. These research papers provide the latest developments in the emerging areas of Soft Computing in Engineering and Technology. The book is organized in two volumes and discusses a wide variety of industrial, engineering and scientific applications of the emerging techniques. It presents invited papers from the inventors/originators of new applications and advanced technologies.

  3. 6th International Workshop Soft Computing Applications

    CERN Document Server

    Jain, Lakhmi; Kovačević, Branko

    2016-01-01

    These volumes constitute the Proceedings of the 6th International Workshop on Soft Computing Applications, or SOFA 2014, held on 24-26 July 2014 in Timisoara, Romania. This edition was organized by the University of Belgrade, Serbia in conjunction with the Romanian Society of Control Engineering and Technical Informatics (SRAIT) - Arad Section, The General Association of Engineers in Romania - Arad Section, the Institute of Computer Science, Iasi Branch of the Romanian Academy, and the IEEE Romanian Section. The Soft Computing concept was introduced by Lotfi Zadeh in 1991 and serves to highlight the emergence of computing methodologies in which the accent is on exploiting the tolerance for imprecision and uncertainty to achieve tractability, robustness and low solution cost. Soft computing facilitates the use of fuzzy logic, neurocomputing, evolutionary computing and probabilistic computing in combination, leading to the concept of hybrid intelligent systems. The combination of ...

  4. Discussion on Soft Computing at FLINS '96

    NARCIS (Netherlands)

    Ruan, D.; Wal, A.J. van der

    1998-01-01

    This is a report on the discussion about soft computing (SC) during FLINS'96. The discussion is based on the five questions formulated by X. Li, viz. (1) What is SC? (2) What are the characteristics of SC? (3) What are the principal achievements of SC? (4) What are the typical problems of SC and

  5. The separate universe approach to soft limits

    Energy Technology Data Exchange (ETDEWEB)

    Kenton, Zachary; Mulryne, David J., E-mail: z.a.kenton@qmul.ac.uk, E-mail: d.mulryne@qmul.ac.uk [School of Physics and Astronomy, Queen Mary University of London, Mile End Road, London, E1 4NS (United Kingdom)

    2016-10-01

    We develop a formalism for calculating soft limits of n-point inflationary correlation functions using separate universe techniques. Our method naturally allows for multiple fields and leads to an elegant diagrammatic approach. As an application we focus on the trispectrum produced by inflation with multiple light fields, giving explicit formulae for all possible single- and double-soft limits. We also investigate consistency relations and present an infinite tower of inequalities between soft correlation functions which generalise the Suyama-Yamaguchi inequality.
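
    For orientation, the Suyama-Yamaguchi inequality that this tower generalises is usually quoted in the standard form below, where f_NL parametrises the squeezed limit of the bispectrum and τ_NL the collapsed limit of the trispectrum; exact normalisation conventions vary between papers.

    ```latex
    % Standard statement of the Suyama-Yamaguchi inequality (conventions vary):
    \tau_{\mathrm{NL}} \;\ge\; \left(\frac{6}{5}\, f_{\mathrm{NL}}\right)^{2}
    ```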

  6. Research Update: Computational materials discovery in soft matter

    Directory of Open Access Journals (Sweden)

    Tristan Bereau

    2016-05-01

    Soft matter embodies a wide range of materials, which all share the common characteristic of weak interaction energies determining their supramolecular structure. This complicates structure-property predictions and hampers the direct application of data-driven approaches to their modeling. We present several aspects in which these methods play a role in designing soft-matter materials: drug design as well as information-driven computer simulations, e.g., histogram reweighting. We also discuss recent examples of rational design of soft-matter materials fostered by physical insight and assisted by data-driven approaches. We foresee the combination of data-driven and physical approaches as a promising strategy to move the field forward.
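
    As a concrete instance of the information-driven reuse of simulation data mentioned in this record, single-histogram reweighting estimates an observable at a nearby inverse temperature β' from samples (E_i, A_i) generated at β. The textbook form is:

    ```latex
    % Single-histogram reweighting of samples from beta to a nearby beta':
    \langle A \rangle_{\beta'}
      = \frac{\sum_i A_i \, e^{-(\beta' - \beta) E_i}}
             {\sum_i e^{-(\beta' - \beta) E_i}}
    ```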

  7. Phoneme-based speech segmentation using hybrid soft computing framework

    CERN Document Server

    Sarma, Mousmita

    2014-01-01

    The book discusses intelligent system design using soft computing and similar systems and their interdisciplinary applications. It also focuses on the recent trends to use soft computing as a versatile tool for designing a host of decision support systems.

  8. Soft Computing Methods in Design of Superalloys

    Science.gov (United States)

    Cios, K. J.; Berke, L.; Vary, A.; Sharma, S.

    1996-01-01

    Soft computing techniques of neural networks and genetic algorithms are used in the design of superalloys. The cyclic oxidation attack parameter K_a, generated from tests at NASA Lewis Research Center, is modelled as a function of the superalloy chemistry and test temperature using a neural network. This model is then used in conjunction with a genetic algorithm to obtain an optimized superalloy composition resulting in low K_a values.
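
    The two-stage pattern this record describes, a neural-network surrogate for K_a searched by a genetic algorithm, can be sketched as follows. The dataset, the four input features and the toy ka() function are invented stand-ins for illustration, not the NASA data:

    ```python
    # Hedged sketch of the surrogate + genetic-algorithm pattern above.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)

    def ka(x):
        # Toy stand-in for the cyclic oxidation attack parameter.
        return ((x - 0.3) ** 2).sum(axis=-1)

    X = rng.random((500, 4))        # e.g. three alloying fractions + temperature
    y = ka(X)
    surrogate = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000).fit(X, y)

    pop = rng.random((40, 4))       # candidate compositions
    for generation in range(30):
        fitness = surrogate.predict(pop)
        parents = pop[np.argsort(fitness)[:10]]             # truncation selection
        children = parents[rng.integers(0, 10, 40)]         # clone parents
        pop = np.clip(children + rng.normal(0, 0.05, (40, 4)), 0, 1)  # mutate
    best = pop[np.argmin(surrogate.predict(pop))]
    print("composition with lowest predicted K_a:", best.round(3))
    ```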

  9. RNA secondary structure prediction using soft computing.

    Science.gov (United States)

    Ray, Shubhra Sankar; Pal, Sankar K

    2013-01-01

    Prediction of RNA structure is invaluable in creating new drugs and understanding genetic diseases. Several deterministic algorithms and soft computing-based techniques have been developed for more than a decade to determine the structure from a known RNA sequence. Soft computing gained importance with the need to get approximate solutions for RNA sequences by considering the issues related with kinetic effects, cotranscriptional folding, and estimation of certain energy parameters. A brief description of some of the soft computing-based techniques, developed for RNA secondary structure prediction, is presented along with their relevance. The basic concepts of RNA and its different structural elements like helix, bulge, hairpin loop, internal loop, and multiloop are described. These are followed by different methodologies, employing genetic algorithms, artificial neural networks, and fuzzy logic. The role of various metaheuristics, like simulated annealing, particle swarm optimization, ant colony optimization, and tabu search is also discussed. A relative comparison among different techniques, in predicting 12 known RNA secondary structures, is presented, as an example. Future challenging issues are then mentioned.
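
    As a toy illustration of the metaheuristics surveyed here, the sketch below runs simulated annealing over non-crossing base pairs; the move set and the energy (simply the negative pair count) are drastic simplifications of real thermodynamic models:

    ```python
    # Toy simulated-annealing fold: rewards complementary, non-crossing pairs.
    import math
    import random

    random.seed(4)
    seq = "GGGAAAUCCCGCAAGCG"
    comp = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"), ("G", "U"), ("U", "G")}

    def valid(i, j, pairs):
        if (seq[i], seq[j]) not in comp or j - i < 4:  # complementarity, hairpin size
            return False
        for a, b in pairs:                   # no shared bases, no pseudoknots
            if {i, j} & {a, b} or a < i < b < j or i < a < j < b:
                return False
        return True

    pairs, T = set(), 2.0
    for step in range(5000):
        cand = set(pairs)
        i, j = sorted(random.sample(range(len(seq)), 2))
        if (i, j) in cand:
            cand.remove((i, j))              # removal move
        elif valid(i, j, cand):
            cand.add((i, j))                 # addition move
        dE = len(pairs) - len(cand)          # energy = -(number of pairs)
        if dE <= 0 or random.random() < math.exp(-dE / T):
            pairs = cand
        T *= 0.999                           # geometric cooling
    print("predicted pairs:", sorted(pairs))
    ```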

  10. A Systematic Approach for Soft Sensor Development

    DEFF Research Database (Denmark)

    Lin, Bao; Recke, Bodil; Renaudat, Philippe

    2007-01-01

    This paper presents a systematic approach based on robust statistical techniques for development of a data-driven soft sensor, which is an important component of the process analytical technology (PAT) and is essential for effective quality control. The data quality is obviously of essential significance for a data-driven soft sensor. Therefore, preprocessing procedures for process measurements are described in detail. First, a template is defined based on one or more key process variables to handle missing data related to severe operation interruptions. Second, a univariate, followed … reveal the effectiveness of the systematic framework in deriving data-driven soft sensors that provide reasonably reliable one-step-ahead predictions.
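
    The details of the univariate screening step are elided in this record; as a hedged illustration of what such a step can look like, the sketch below uses a rolling-median (Hampel-style) filter to flag isolated spikes in a single process variable:

    ```python
    # Illustrative univariate outlier screen; not the paper's exact procedure.
    import numpy as np

    def hampel(x, half_window=5, n_sigmas=3.0):
        flags = np.zeros(len(x), dtype=bool)
        for i in range(len(x)):
            lo, hi = max(0, i - half_window), min(len(x), i + half_window + 1)
            med = np.median(x[lo:hi])
            mad = 1.4826 * np.median(np.abs(x[lo:hi] - med))  # robust sigma
            flags[i] = abs(x[i] - med) > n_sigmas * mad
        return flags

    signal = np.sin(np.linspace(0, 6, 200))
    signal[[50, 120]] += 3.0                       # injected spikes
    print("outliers at:", np.where(hampel(signal))[0])
    ```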

  11. 22nd International Conference on Soft Computing

    CERN Document Server

    2017-01-01

    This proceedings book contains a collection of selected accepted papers from the Mendel conference held in Brno, Czech Republic, in June 2016. It contains three chapters, which present recent advances in soft computing, including intelligent image processing. The Mendel conference was established in 1995 and is named after the scientist and Augustinian priest Gregor J. Mendel, who discovered the famous Laws of Heredity. The main aim of the conference is to create a regular possibility for students, academics and researchers to exchange ideas and novel research methods on a yearly basis.

  12. Soft Computing Applications : Proceedings of the 5th International Workshop Soft Computing Applications

    CERN Document Server

    Fodor, János; Várkonyi-Kóczy, Annamária; Dombi, Joszef; Jain, Lakhmi

    2013-01-01

    This volume contains the Proceedings of the 5th International Workshop on Soft Computing Applications (SOFA 2012). The book covers a broad spectrum of soft computing techniques, theoretical and practical applications employing knowledge and intelligence to find solutions for world industrial, economic and medical problems. The combination of such intelligent systems tools and a large number of applications introduces a need for a synergy of scientific and technological disciplines in order to show the great potential of soft computing in all domains. The conference papers included in these proceedings, published post conference, were grouped into the following areas of research: Soft Computing and Fusion Algorithms in Biometrics; Fuzzy Theory, Control and Applications; Modelling and Control Applications; Steps towa...

  13. Soft computing techniques toward modeling the water supplies of Cyprus.

    Science.gov (United States)

    Iliadis, L; Maris, F; Tachos, S

    2011-10-01

    This research effort aims at the application of soft computing techniques to water resources management. More specifically, the target is the development of reliable soft computing models capable of estimating the water supply for the case of the "Germasogeia" mountainous watersheds in Cyprus. Initially, ε-Regression Support Vector Machines (ε-RSVM) and fuzzy weighted ε-RSVM models that accept five input parameters have been developed. At the same time, reliable artificial neural networks have been developed to perform the same job. The 5-fold cross validation approach has been employed in order to eliminate bad local behaviors and to produce a more representative training data set. Thus, the fuzzy weighted Support Vector Regression (SVR) combined with the fuzzy partition has been employed in an effort to enhance the quality of the results. Several rational and reliable models have been produced that can enhance the efficiency of water policy designers. Copyright © 2011 Elsevier Ltd. All rights reserved.
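
    The core of such a workflow, an ε-SVR assessed with 5-fold cross validation, looks roughly like the sketch below; the five synthetic inputs stand in for the watershed variables, and the paper's fuzzy weighting is omitted:

    ```python
    # Illustrative epsilon-SVR with 5-fold cross validation on synthetic data.
    import numpy as np
    from sklearn.svm import SVR
    from sklearn.model_selection import cross_val_score, KFold
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(1)
    X = rng.random((200, 5))                 # e.g. rainfall, temperature, ...
    y = X @ np.array([2.0, -1.0, 0.5, 0.0, 1.5]) + rng.normal(0, 0.1, 200)

    model = make_pipeline(StandardScaler(), SVR(kernel="rbf", epsilon=0.05, C=10.0))
    scores = cross_val_score(model, X, y,
                             cv=KFold(5, shuffle=True, random_state=1),
                             scoring="r2")
    print("5-fold R^2:", scores.round(3))
    ```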

  14. Optimal reliability allocation for large software projects through soft computing techniques

    DEFF Research Database (Denmark)

    Madsen, Henrik; Albeanu, Grigore; Popentiu-Vladicescu, Florin

    2012-01-01

    … or maximizing the system reliability subject to budget constraints. These kinds of optimization problems were considered both in deterministic and stochastic frameworks in the literature. Recently, the intuitionistic-fuzzy optimization approach was considered as a successful soft computing modelling approach. Firstly, a review of existing soft computing approaches to optimization is given. The main section extends the results, considering self-organizing migrating algorithms for solving intuitionistic-fuzzy optimization problems attached to complex fault-tolerant software architectures which proved

  15. Soft matter approaches to food structuring

    NARCIS (Netherlands)

    Sman, van der R.G.M.

    2012-01-01

    We give an overview of the many opportunities that arise from approaching food structuring from the perspective of soft matter physics. This branch of physics employs concepts that build upon the seminal work of van der Waals, such as free volume, the mean field, and effective temperatures. All

  16. 2nd International Conference on Soft Computing and Data Mining

    CERN Document Server

    Ghazali, Rozaida; Nawi, Nazri; Deris, Mustafa

    2017-01-01

    This book provides a comprehensive introduction and practical look at the concepts and techniques readers need to get the most out of their data in real-world, large-scale data mining projects. It also guides readers through the data-analytic thinking necessary for extracting useful knowledge and business value from the data. The book is based on the Soft Computing and Data Mining (SCDM-16) conference, which was held in Bandung, Indonesia on August 18th–20th 2016 to discuss the state of the art in soft computing techniques, and offer participants sufficient knowledge to tackle a wide range of complex systems. The scope of the conference is reflected in the book, which presents a balance of soft computing techniques and data mining approaches. The two constituents are introduced to the reader systematically and brought together using different combinations of applications and practices. It offers engineers, data analysts, practitioners, scientists and managers the insights into the concepts, tools and techni...

  17. 6th International Conference on Soft Computing for Problem Solving

    CERN Document Server

    Bansal, Jagdish; Das, Kedar; Lal, Arvind; Garg, Harish; Nagar, Atulya; Pant, Millie

    2017-01-01

    This two-volume book gathers the proceedings of the Sixth International Conference on Soft Computing for Problem Solving (SocProS 2016), offering a collection of research papers presented during the conference at Thapar University, Patiala, India. Providing a veritable treasure trove for scientists and researchers working in the field of soft computing, it highlights the latest developments in the broad area of “Computational Intelligence” and explores both theoretical and practical aspects using fuzzy logic, artificial neural networks, evolutionary algorithms, swarm intelligence, soft computing, computational intelligence, etc.

  18. Recent developments and new directions in soft computing

    CERN Document Server

    Abbasov, Ali; Yager, Ronald; Shahbazova, Shahnaz; Reformat, Marek

    2014-01-01

    The book reports on the latest advances and challenges of soft computing. It gathers original scientific contributions written by top scientists in the field, covering theories, methods and applications in a number of research areas related to soft computing, such as decision-making, probabilistic reasoning, image processing, control, neural networks and data analysis.

  19. SoftLab: A Soft-Computing Software for Experimental Research with Commercialization Aspects

    Science.gov (United States)

    Akbarzadeh-T, M.-R.; Shaikh, T. S.; Ren, J.; Hubbell, Rob; Kumbla, K. K.; Jamshidi, M

    1998-01-01

    SoftLab is a software environment for research and development in intelligent modeling/control using soft-computing paradigms such as fuzzy logic, neural networks, genetic algorithms, and genetic programs. SoftLab addresses the inadequacies of the existing soft-computing software by supporting comprehensive multidisciplinary functionalities from management tools to engineering systems. Furthermore, the built-in features help the user process/analyze information more efficiently by a friendly yet powerful interface, and will allow the user to specify user-specific processing modules, hence adding to the standard configuration of the software environment.

  20. Complex system modelling and control through intelligent soft computations

    CERN Document Server

    Azar, Ahmad

    2015-01-01

    The book offers a snapshot of the theories and applications of soft computing in the area of complex systems modeling and control. It presents the most important findings discussed during the 5th International Conference on Modelling, Identification and Control, held in Cairo, from August 31-September 2, 2013. The book consists of twenty-nine selected contributions, which have been thoroughly reviewed and extended before their inclusion in the volume. The different chapters, written by active researchers in the field, report on both current theories and important applications of soft-computing. Besides providing the readers with soft-computing fundamentals, and soft-computing based inductive methodologies/algorithms, the book also discusses key industrial soft-computing applications, as well as multidisciplinary solutions developed for a variety of purposes, like windup control, waste management, security issues, biomedical applications and many others. It is a perfect reference guide for graduate students, r...

  1. Advances in soft computing, intelligent robotics and control

    CERN Document Server

    Fullér, Robert

    2014-01-01

    Soft computing, intelligent robotics and control are at the core of contemporary engineering interest. Essential characteristics of soft computing methods are the ability to handle vague information, to apply human-like reasoning, their learning capability, and ease of application. Soft computing techniques are widely applied in the control of dynamic systems, including mobile robots. The present volume is a collection of 20 chapters written by respectable experts of the fields, addressing various theoretical and practical aspects in soft computing, intelligent robotics and control. The first part of the book concerns issues of intelligent robotics, including robust fixed point transformation design, experimental verification of the input-output feedback linearization of a differentially driven mobile robot, and applying kinematic synthesis to micro electro-mechanical systems design. The second part of the book is devoted to fundamental aspects of soft computing. This includes practical aspects of fuzzy rule ...

  2. 4th International Conference on Quantitative Logic and Soft Computing

    CERN Document Server

    Chen, Shui-Li; Wang, San-Min; Li, Yong-Ming

    2017-01-01

    This book is the proceedings of the Fourth International Conference on Quantitative Logic and Soft Computing (QLSC2016), held 14-17 October 2016 at Zhejiang Sci-Tech University, Hangzhou, China. It includes 61 papers, of which 5 are plenary talks (3 abstracts and 2 full-length talks). QLSC2016 was the fourth in a series of conferences on Quantitative Logic and Soft Computing. This conference was a major symposium for scientists, engineers and practitioners to present their updated results, ideas, developments and applications in all areas of quantitative logic and soft computing. The book aims to strengthen relations between industry research laboratories and universities worldwide in fields such as: (1) Quantitative Logic and Uncertainty Logic; (2) Automata and Quantification of Software; (3) Fuzzy Connectives and Fuzzy Reasoning; (4) Fuzzy Logical Algebras; (5) Artificial Intelligence and Soft Computing; (6) Fuzzy Sets Theory and Applications.

  3. Thermal sensation prediction by soft computing methodology.

    Science.gov (United States)

    Jović, Srđan; Arsić, Nebojša; Vilimonović, Jovana; Petković, Dalibor

    2016-12-01

    Thermal comfort in open urban areas is a very important factor from an environmental point of view. It is therefore necessary to meet demands for suitable thermal comfort during urban planning and design. Thermal comfort can be modeled based on climatic parameters and other factors, which vary throughout the year and the day. There is therefore a need to establish an algorithm for thermal comfort prediction from the input variables, and the prediction results could be used for planning the times of usage of urban areas. Since this is a highly nonlinear task, a soft computing methodology was applied in this investigation in order to predict thermal comfort. The main goal was to apply an extreme learning machine (ELM) for forecasting of physiological equivalent temperature (PET) values. Temperature, pressure, wind speed and irradiance were used as inputs. The prediction results are compared with some benchmark models. Based on the results, ELM can be used effectively in forecasting of PET. Copyright © 2016 Elsevier Ltd. All rights reserved.
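
    An extreme learning machine is a single-hidden-layer network whose input weights are drawn at random and left fixed, so only the output weights are fitted, by ordinary least squares. A minimal sketch, with synthetic data in place of the study's measurements:

    ```python
    # Minimal extreme learning machine (ELM) regression sketch.
    import numpy as np

    rng = np.random.default_rng(2)
    X = rng.random((300, 4))           # temperature, pressure, wind, irradiance
    y = 18 + 10 * X[:, 0] - 3 * X[:, 2] + rng.normal(0, 0.2, 300)  # toy PET target

    n_hidden = 50
    W = rng.normal(size=(4, n_hidden))           # random, untrained input weights
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                       # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None) # output weights by least squares

    pred = np.tanh(X @ W + b) @ beta
    print("training RMSE:", np.sqrt(np.mean((pred - y) ** 2)))
    ```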

  4. Soft Computing Techniques in Vision Science

    CERN Document Server

    Yang, Yeon-Mo

    2012-01-01

    This Special Edited Volume takes a unique approach towards computational solutions for the emerging field of study called Vision Science. Optics, ophthalmology, and optical science have long pursued the optimization of configurations of optical systems, surveillance cameras and other nano-optical devices using nanoscience and technology, yet these systems still fall short on the computational side of achieving the pinnacle of the human vision system. In this edited volume much attention has been given to addressing the coupling issues between computational science and vision studies. It is a comprehensive collection of research works addressing various related areas of Vision Science like visual perception and the visual system, cognitive psychology, neuroscience, psychophysics and ophthalmology, linguistic relativity, color vision etc. This issue carries some of the latest developments in the form of research articles and presentations. The volume is rich in content, with technical tools ...

  5. Understanding soft condensed matter via modeling and computation

    CERN Document Server

    Shi, An-Chang

    2011-01-01

    All living organisms consist of soft matter. For this reason alone, it is important to be able to understand and predict the structural and dynamical properties of soft materials such as polymers, surfactants, colloids, granular matter and liquids crystals. To achieve a better understanding of soft matter, three different approaches have to be integrated: experiment, theory and simulation. This book focuses on the third approach - but always in the context of the other two.

  6. Developing a multimodal biometric authentication system using soft computing methods.

    Science.gov (United States)

    Malcangi, Mario

    2015-01-01

    Robust personal authentication is becoming ever more important in computer-based applications. Among a variety of methods, biometrics offers several advantages, mainly in embedded system applications. Hard and soft multi-biometrics, combined with hard and soft computing methods, can be applied to improve the personal authentication process and to generalize its applicability. This chapter describes the embedded implementation of a multi-biometric (voiceprint and fingerprint) multimodal identification system based on hard computing methods (DSP) for feature extraction and matching, an artificial neural network (ANN) for soft feature pattern matching, and a fuzzy logic engine (FLE) for data fusion and decision.
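
    The fuzzy-logic decision stage can be illustrated by a hand-rolled, three-rule Mamdani-style fusion of two matcher scores; the membership functions, rules and output weights below are invented for illustration and are not taken from the chapter:

    ```python
    # Toy fuzzy fusion of two matcher scores in [0, 1].
    def tri(x, a, b, c):
        """Triangular membership function."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def fuse(voice, finger):
        low = lambda s: tri(s, -0.5, 0.0, 0.5)
        high = lambda s: tri(s, 0.5, 1.0, 1.5)
        # Rule strengths (fuzzy AND = min, fuzzy OR = max).
        accept = min(high(voice), high(finger))          # both strong -> accept
        review = max(min(high(voice), low(finger)),      # disagreement -> review
                     min(low(voice), high(finger)))
        reject = min(low(voice), low(finger))            # both weak -> reject
        # Weighted centroid over outputs {reject: 0, review: 0.5, accept: 1};
        # the "or 1.0" guards the degenerate all-zero case.
        total = accept + review + reject or 1.0
        return (1.0 * accept + 0.5 * review + 0.0 * reject) / total

    print(fuse(0.9, 0.8), fuse(0.9, 0.2), fuse(0.1, 0.2))  # 1.0, 0.5, 0.0
    ```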

  7. Detecting Soft Errors in Stencil based Computations

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, V. [Univ. of Utah, Salt Lake City, UT (United States); Gopalkrishnan, G. [Univ. of Utah, Salt Lake City, UT (United States); Bronevetsky, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-05-06

    Given the growing emphasis on system resilience, it is important to develop software-level error detectors that help trap hardware-level faults with reasonable accuracy while minimizing false alarms as well as the performance overhead introduced. We present a technique that approaches this idea by taking stencil computations as our target, and synthesizing detectors based on machine learning. In particular, we employ linear regression to generate computationally inexpensive models which form the basis for error detection. Our technique has been incorporated into a new open-source library called SORREL. In addition to reporting encouraging experimental results, we demonstrate techniques that help reduce the size of training data. We also discuss the efficacy of various detectors synthesized, as well as our future plans.
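
    The underlying idea, learning a cheap linear model of what a stencil point should be given its neighbours and flagging large residuals, can be sketched as follows; this illustrates the technique only and is not the SORREL library itself:

    ```python
    # Regression-based soft-error detector sketch for a 3-point stencil.
    import numpy as np

    rng = np.random.default_rng(3)
    u = np.cumsum(rng.normal(0, 0.01, 1024)) + np.sin(np.linspace(0, 8, 1024))

    # Features: left/right neighbours; target: centre point of the stencil.
    X = np.stack([u[:-2], u[2:]], axis=1)
    y = u[1:-1]
    coef, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], y, rcond=None)

    corrupted = u.copy()
    corrupted[400] += 0.5                    # injected bit-flip-like fault
    Xc = np.c_[corrupted[:-2], corrupted[2:], np.ones(len(X))]
    residual = np.abs(Xc @ coef - corrupted[1:-1])
    # Large residuals cluster around the corrupted point (as value or neighbour).
    print("suspect indices:", np.where(residual > 5 * residual.std())[0] + 1)
    ```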

  8. Fuzzy systems and soft computing in nuclear engineering

    International Nuclear Information System (INIS)

    Ruan, D.

    2000-01-01

    This book is an organized edited collection of twenty-one contributed chapters covering nuclear engineering applications of fuzzy systems, neural networks, genetic algorithms and other soft computing techniques. All chapters are either updated reviews or original contributions by leading researchers, written exclusively for this volume. The volume highlights the advantages of applying fuzzy systems and soft computing in nuclear engineering, which can be viewed as complementary to traditional methods. As a result, fuzzy sets and soft computing provide a powerful tool for solving intricate problems pertaining to nuclear engineering. Each chapter of the book is self-contained and also indicates future research directions on the application of fuzzy systems and soft computing in nuclear engineering. (orig.)

  9. Data mining in soft computing framework: a survey.

    Science.gov (United States)

    Mitra, S; Pal, S K; Mitra, P

    2002-01-01

    The present article provides a survey of the available literature on data mining using soft computing. A categorization has been provided based on the different soft computing tools and their hybridizations used, the data mining function implemented, and the preference criterion selected by the model. The utility of the different soft computing methodologies is highlighted. Generally fuzzy sets are suitable for handling the issues related to understandability of patterns, incomplete/noisy data, mixed media information and human interaction, and can provide approximate solutions faster. Neural networks are nonparametric, robust, and exhibit good learning and generalization capabilities in data-rich environments. Genetic algorithms provide efficient search algorithms to select a model, from mixed media data, based on some preference criterion/objective function. Rough sets are suitable for handling different types of uncertainty in data. Some challenges to data mining and the application of soft computing methodologies are indicated. An extensive bibliography is also included.

  10. Soft computing trends in nuclear energy system

    International Nuclear Information System (INIS)

    Paramasivan, B.

    2012-01-01

    In spite of the many advancements in the power and energy sector over the last two decades, its ability to deliver quality power with due consideration for planning, coordination, marketing, safety, stability, optimality and reliability is still believed to remain critical. Though it appears simple from the outside, the internal structure of large scale power systems is so complex that event management and decision making requires formidable preliminary preparation, which worsens further in the presence of uncertainties and contingencies. These aspects have attracted several researchers to carry out continued research in this field, and their valued contributions have been significantly helping newcomers in understanding the evolutionary growth in this sector, starting from phenomena, tools and methodologies to strategies, so as to ensure smooth, stable, safe, reliable and economic operation. The usage of soft computing would accelerate interaction between the energy and technology research community with an aim to foster unified development in the next generation. Monitoring the mechanical impact of a loose (detached or drifting) part in the reactor coolant system of a nuclear power plant is one of the essential functions for operation and maintenance of the plant. Large data tables are generated during this monitoring process. This data can be 'mined' to reveal latent patterns of interest to operation and maintenance. Rough set theory has been applied successfully to data mining. It can be used in the nuclear power industry and elsewhere to identify classes in datasets, find dependencies in relations and discover rules which are hidden in databases. An important role may be played by nuclear energy, provided that major safety, waste and proliferation issues affecting current nuclear reactors are satisfactorily addressed. In this respect, a large effort has been under way for a few years towards the development of advanced nuclear systems that would use

  11. Verifying Stability of Dynamic Soft-Computing Systems

    Science.gov (United States)

    Wen, Wu; Napolitano, Marcello; Callahan, John

    1997-01-01

    Soft computing is a general term for algorithms that learn from human knowledge and mimic human skills. Examples of such algorithms are fuzzy inference systems and neural networks. Many applications, especially in control engineering, have demonstrated their appropriateness in building intelligent systems that are flexible and robust. Although recent research has shown that certain classes of neuro-fuzzy controllers can be proven bounded and stable, such proofs are implementation dependent and difficult to apply to the design and validation process. Many practitioners adopt the trial and error approach for system validation or resort to exhaustive testing using prototypes. In this paper, we describe our on-going research towards establishing the necessary theoretical foundation as well as building practical tools for the verification and validation of soft-computing systems. A unified model for general neuro-fuzzy systems is adopted. Classic non-linear system control theory and recent results of its application to neuro-fuzzy systems are incorporated and applied to the unified model. It is hoped that general tools can be developed to help the designer visualize and manipulate the regions of stability and boundedness, much the same way Bode plots and root locus plots have helped conventional control design and validation.

  12. Use of Soft Computing Technologies For Rocket Engine Control

    Science.gov (United States)

    Trevino, Luis C.; Olcmen, Semih; Polites, Michael

    2003-01-01

    The problem to be addressed in this paper is to explore how the use of Soft Computing Technologies (SCT) could be employed to further improve overall engine system reliability and performance. Specifically, this will be presented by enhancing rocket engine control and engine health management (EHM) using SCT coupled with conventional control technologies, and the sound software engineering practices used in Marshall's Flight Software Group. The principal goals are to improve software management, software development time and maintenance, processor execution, fault tolerance and mitigation, and nonlinear control in power level transitions. The intent is not to discuss any shortcomings of existing engine control and EHM methodologies, but to provide alternative design choices for control, EHM, implementation, performance, and sustaining engineering. The approaches outlined in this paper will require knowledge in the fields of rocket engine propulsion, software engineering for embedded systems, and soft computing technologies (i.e., neural networks, fuzzy logic, and Bayesian belief networks), much of which is presented in this paper. The first targeted demonstration rocket engine platform is the MC-1 (formerly FASTRAC Engine), which is simulated with hardware and software in the Marshall Avionics & Software Testbed laboratory that

  13. Experimental and Computational Techniques in Soft Condensed Matter Physics

    Science.gov (United States)

    Olafsen, Jeffrey

    2010-09-01

    1. Microscopy of soft materials Eric R. Weeks; 2. Computational methods to study jammed Systems Carl F. Schrek and Corey S. O'Hern; 3. Soft random solids: particulate gels, compressed emulsions and hybrid materials Anthony D. Dinsmore; 4. Langmuir monolayers Michael Dennin; 5. Computer modeling of granular rheology Leonardo E. Silbert; 6. Rheological and microrheological measurements of soft condensed matter John R. de Bruyn and Felix K. Oppong; 7. Particle-based measurement techniques for soft matter Nicholas T. Ouellette; 8. Cellular automata models of granular flow G. William Baxter; 9. Photoelastic materials Brian Utter; 10. Image acquisition and analysis in soft condensed matter Jeffrey S. Olafsen; 11. Structure and patterns in bacterial colonies Nicholas C. Darnton.

  14. 4th International Conference on Soft Computing for Problem Solving

    CERN Document Server

    Deep, Kusum; Pant, Millie; Bansal, Jagdish; Nagar, Atulya

    2015-01-01

    This two volume book is based on the research papers presented at the 4th International Conference on Soft Computing for Problem Solving (SocProS 2014) and covers a variety of topics, including mathematical modelling, image processing, optimization methods, swarm intelligence, evolutionary algorithms, fuzzy logic, neural networks, forecasting, medical and healthcare, data mining, etc. Mainly the emphasis is on Soft Computing and its applications in diverse areas. The prime objective of this book is to familiarize the reader with the latest scientific developments in various fields of Science, Engineering and Technology and is directed to the researchers and scientists engaged in various real-world applications of ‘Soft Computing’.

  15. Soft computing integrating evolutionary, neural, and fuzzy systems

    CERN Document Server

    Tettamanzi, Andrea

    2001-01-01

    Soft computing encompasses various computational methodologies, which, unlike conventional algorithms, are tolerant of imprecision, uncertainty, and partial truth. Soft computing technologies offer adaptability as a characteristic feature and thus permit the tracking of a problem through a changing environment. Besides some recent developments in areas like rough sets and probabilistic networks, fuzzy logic, evolutionary algorithms, and artificial neural networks are core ingredients of soft computing, which are all bio-inspired and can easily be combined synergetically. This book presents a well-balanced integration of fuzzy logic, evolutionary computing, and neural information processing. The three constituents are introduced to the reader systematically and brought together in differentiated combinations step by step. The text was developed from courses given by the authors and offers numerous illustrations as

  16. Hybrid soft computing systems for electromyographic signals analysis: a review

    Science.gov (United States)

    2014-01-01

    The electromyographic (EMG) signal is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for such purposes. A hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural networks, support vector machines, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis. PMID:24490979

  17. Optical character recognition systems for different languages with soft computing

    CERN Document Server

    Chaudhuri, Arindam; Badelia, Pratixa; K Ghosh, Soumya

    2017-01-01

    The book offers a comprehensive survey of soft-computing models for optical character recognition systems. The various techniques, including fuzzy and rough sets, artificial neural networks and genetic algorithms, are tested using real texts written in different languages, such as English, French, German, Latin, Hindi and Gujarati, extracted from publicly available datasets. The simulation studies, which are reported in detail here, show that soft-computing based modeling of OCR systems performs consistently better than traditional models. Mainly intended as a state-of-the-art survey for postgraduates and researchers in pattern recognition, optical character recognition and soft computing, this book will also be useful for professionals in computer vision and image processing dealing with different issues related to optical character recognition.

  18. Hybrid soft computing systems for electromyographic signals analysis: a review.

    Science.gov (United States)

    Xie, Hong-Bo; Guo, Tianruo; Bai, Siwei; Dokos, Socrates

    2014-02-03

    The electromyographic (EMG) signal is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for such purposes. A hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural networks, support vector machines, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis.

  19. Soft Tissue Biomechanical Modeling for Computer Assisted Surgery

    CERN Document Server

    2012-01-01

    This volume focuses on the biomechanical modeling of biological tissues in the context of Computer Assisted Surgery (CAS). More specifically, deformable soft tissues are addressed since they are the subject of the most recent developments in this field. The pioneering works on this CAS topic date from the 1980s, with applications in orthopaedics and biomechanical models of bones. More recently, however, biomechanical models of soft tissues have been proposed since most of the human body is made of soft organs that can be deformed by the surgical gesture. Such models are much more complicated to handle since the tissues can be subject to large deformations (non-linear geometrical framework) as well as complex stress/strain relationships (non-linear mechanical framework). Part 1 of the volume presents biomechanical models that have been developed in a CAS context and used during surgery. This is particularly new since most of the soft tissues models already proposed concern Computer Assisted Planning, with ...

  20. Intelligent systems and soft computing for nuclear science and industry

    International Nuclear Information System (INIS)

    Ruan, D.; D'hondt, P.; Govaerts, P.; Kerre, E.E.

    1996-01-01

    The second international workshop on Fuzzy Logic and Intelligent Technologies in Nuclear Science (FLINS) addresses topics related to intelligent systems and soft computing for nuclear science and industry. The proceedings contain 52 papers in different fields such as radiation protection, nuclear safety (human factors and reliability), safeguards, nuclear reactor control, production processes in the fuel cycle, dismantling, waste and disposal, decision making, and nuclear reactor control. A clear link is made between theory and applications of fuzzy logic such as neural networks, expert systems, robotics, man-machine interfaces, and decision-support techniques by using modern and advanced technologies and tools. The papers are grouped in three sections. The first section (Soft computing techniques) deals with basic tools to treat fuzzy logic, neural networks, genetic algorithms, decision-making, and software used for general soft-computing aspects. The second section (Intelligent engineering systems) includes contributions on engineering problems such as knowledge-based engineering, expert systems, process control integration, diagnosis, measurements, and interpretation by soft computing. The third section (Nuclear applications) focusses on the application of soft computing and intelligent systems in nuclear science and industry

  1. Soft Computing Techniques for the Protein Folding Problem on High Performance Computing Architectures.

    Science.gov (United States)

    Llanes, Antonio; Muñoz, Andrés; Bueno-Crespo, Andrés; García-Valverde, Teresa; Sánchez, Antonia; Arcas-Túnez, Francisco; Pérez-Sánchez, Horacio; Cecilia, José M

    2016-01-01

    The protein-folding problem has been extensively studied during the last fifty years. Understanding the dynamics of the global shape of a protein and its influence on biological function can help us to discover new and more effective drugs for diseases of pharmacological relevance. Different computational approaches have been developed by different researchers in order to foresee the three-dimensional arrangement of the atoms of proteins from their sequences. However, the computational complexity of this problem makes mandatory the search for new models, novel algorithmic strategies and hardware platforms that provide solutions in a reasonable time frame. In this review we present past and recent trends regarding protein folding simulations from both perspectives, hardware and software. Of particular interest to us are both the use of inexact solutions to this computationally hard problem and the question of which hardware platforms have been used for running these kinds of soft computing techniques.

  2. The use of time-of-flight camera for navigating robots in computer-aided surgery: monitoring the soft tissue envelope of minimally invasive hip approach in a cadaver study.

    Science.gov (United States)

    Putzer, David; Klug, Sebastian; Moctezuma, Jose Luis; Nogler, Michael

    2014-12-01

    Time-of-flight (TOF) cameras can guide surgical robots or provide soft tissue information for augmented reality in the medical field. In this study, a method to automatically track the soft tissue envelope of a minimally invasive hip approach in a cadaver study is described. An algorithm for the TOF camera was developed and 30 measurements on 8 surgical situs (direct anterior approach) were carried out. The results were compared to a manual measurement of the soft tissue envelope. The TOF camera showed an overall recognition rate of the soft tissue envelope of 75%. On comparing the results from the algorithm with the manual measurements, a significant difference was found (P > .005). In this preliminary study, we have presented a method for automatically recognizing the soft tissue envelope of the surgical field in a real-time application. Further improvements could result in a robotic navigation device for minimally invasive hip surgery. © The Author(s) 2014.

  3. A Simulation-Based Soft Error Estimation Methodology for Computer Systems

    OpenAIRE

    Sugihara, Makoto; Ishihara, Tohru; Hashimoto, Koji; Muroyama, Masanori

    2006-01-01

    This paper proposes a simulation-based soft error estimation methodology for computer systems. Accumulating soft error rates (SERs) of all memories in a computer system results in pessimistic soft error estimation. This is because memory cells are used spatially and temporally and not all soft errors in them make the computer system faulty. Our soft-error estimation methodology considers the locations and the timings of soft errors occurring at every level of memory hierarchy and estimates th...
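
    The key observation, that only the interval during which a value is live in memory contributes to system-level failures, can be illustrated with a back-of-the-envelope sketch; the rates and lifetimes below are invented:

    ```python
    # Timing-aware vs. naive soft error rate (SER) accounting, toy numbers.
    RAW_SER = 1e-9           # upsets per bit-second for this memory (assumed)
    WORD_BITS = 32

    # (write_time, last_read_time) for a few live values, in seconds.
    lifetimes = [(0.0, 0.4), (0.1, 0.9), (0.5, 0.6)]

    vulnerable_time = sum(r - w for w, r in lifetimes)
    effective_ser = RAW_SER * WORD_BITS * vulnerable_time
    naive_ser = RAW_SER * WORD_BITS * len(lifetimes) * 1.0   # whole 1 s window
    print(f"naive: {naive_ser:.2e}, timing-aware: {effective_ser:.2e} upsets")
    ```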

  4. The First International Conference on Soft Computing and Data Mining

    CERN Document Server

    Ghazali, Rozaida; Deris, Mustafa

    2014-01-01

    This book constitutes the refereed proceedings of the First International Conference on Soft Computing and Data Mining, SCDM 2014, held at Universiti Tun Hussein Onn Malaysia, June 16th-18th, 2014. The 65 revised full papers presented in this book were carefully reviewed and selected from 145 submissions, and organized into two main topical sections: Data Mining and Soft Computing. The goal of this book is to provide both theoretical concepts and, especially, practical techniques in these exciting fields of soft computing and data mining, ready to be applied in real-world applications. The exchange of views pertaining to future research directions to be taken in this field and the resultant dissemination of the latest research findings make this work of immense value to all those having an interest in the topics covered.

  5. Soft computing for fault diagnosis in power plants

    International Nuclear Information System (INIS)

    Ciftcioglu, O.; Turkcan, E.

    1998-01-01

    Considering the advancements in AI technology, there arises a new concept known as soft computing. It can be defined as the processing of uncertain information with AI methods, referring explicitly to methods using neural networks, fuzzy logic and evolutionary algorithms. In this respect, soft computing is a new dimension in information processing technology, in which linguistic information can also be processed, in contrast with the classical stochastic and deterministic treatments of data. On the one hand it can process uncertain/incomplete information, and on the other hand it can deal with the non-linearity of large-scale systems, where uncertainty is particularly relevant with respect to linguistic information and incompleteness is related to fault tolerance in fault diagnosis. In this perspective, the potential role of soft computing in power plant operation is presented. (author)

  6. A comparative analysis of soft computing techniques for gene prediction.

    Science.gov (United States)

    Goel, Neelam; Singh, Shailendra; Aseri, Trilok Chand

    2013-07-01

    The rapid growth of genomic sequence data for both human and nonhuman species has made analyzing these sequences, especially predicting genes in them, very important, and this is currently the focus of many research efforts. Besides its scientific interest in the molecular biology and genomics community, gene prediction is of considerable importance in human health and medicine. A variety of gene prediction techniques have been developed for eukaryotes over the past few years. This article reviews and analyzes the application of certain soft computing techniques in gene prediction. First, the problem of gene prediction and its challenges are described. These are followed by different soft computing techniques along with their application to gene prediction. In addition, a comparative analysis of different soft computing techniques for gene prediction is given. Finally some limitations of the current research activities and future research directions are provided. Copyright © 2013 Elsevier Inc. All rights reserved.

  7. Soft computing in green and renewable energy systems

    Energy Technology Data Exchange (ETDEWEB)

    Gopalakrishnan, Kasthurirangan [Iowa State Univ., Ames, IA (United States). Iowa Bioeconomy Inst.; US Department of Energy, Ames, IA (United States). Ames Lab; Kalogirou, Soteris [Cyprus Univ. of Technology, Limassol (Cyprus). Dept. of Mechanical Engineering and Materials Sciences and Engineering; Khaitan, Siddhartha Kumar (eds.) [Iowa State Univ. of Science and Technology, Ames, IA (United States). Dept. of Electrical Engineering and Computer Engineering

    2011-07-01

    Soft Computing in Green and Renewable Energy Systems provides a practical introduction to the application of soft computing techniques and hybrid intelligent systems for designing, modeling, characterizing, optimizing, forecasting, and performance prediction of green and renewable energy systems. Research is proceeding at jet speed on renewable energy (energy derived from natural resources such as sunlight, wind, tides, rain, geothermal heat, biomass, hydrogen, etc.) as policy makers, researchers, economists, and world agencies have joined forces in finding alternative sustainable energy solutions to current critical environmental, economic, and social issues. The innovative models, environmentally benign processes, data analytics, etc. employed in renewable energy systems are computationally-intensive, non-linear and complex as well as involve a high degree of uncertainty. Soft computing technologies, such as fuzzy sets and systems, neural science and systems, evolutionary algorithms and genetic programming, and machine learning, are ideal in handling the noise, imprecision, and uncertainty in the data, and yet achieve robust, low-cost solutions. As a result, intelligent and soft computing paradigms are finding increasing applications in the study of renewable energy systems. Researchers, practitioners, undergraduate and graduate students engaged in the study of renewable energy systems will find this book very useful. (orig.)

  8. Discrete Cosserat Approach for Multi-Section Soft Robots Dynamics

    OpenAIRE

    Renda, Federico; Boyer, Frederic; Dias, Jorge; Seneviratne, Lakmal

    2017-01-01

    In spite of recent progress, soft robotics still suffers from a lack of a unified modeling framework. Nowadays, the most adopted model for the design and control of soft robots is the piece-wise constant curvature model, with its consolidated benefits and drawbacks. In this work, an alternative model for multisection soft robot dynamics is presented, based on a discrete Cosserat approach, which not only takes into account shear and torsional deformations, essential to cope with out-of-plane e...

  9. Soft Computing in Construction Information Technology

    NARCIS (Netherlands)

    Ciftcioglu, O.; Durmisevic, S.; Sariyildiz, S.

    2001-01-01

    Over the last decade, civil engineering has shown a rapidly growing interest in the application of neurally inspired computing techniques. The motive for this interest was the promise of certain information processing characteristics, which are similar, to some extent, to those of the human brain. The

  10. Osteotomy simulation and soft tissue prediction using computer tomography scans

    International Nuclear Information System (INIS)

    Teschner, M.; Girod, S.; Girod, B.

    1999-01-01

    In this paper, a system is presented that can be used to simulate osteotomies of the skull and to estimate the resulting soft tissue changes. Thus, the three-dimensional, photorealistic, postoperative appearance of a patient can be assessed. The system is based on a computer tomography scan and a photorealistic laser scan of the patient's face. In order to predict the postoperative appearance of a patient, the soft tissue must follow the movement of the underlying bone. In this paper, a multi-layer soft tissue model is proposed that is based on springs. It incorporates features like skin turgor, gravity and sliding bone contact. The prediction of soft tissue changes due to bone realignments is computed using a very efficient and robust optimization method. The system can handle individual patient data sets and has been tested with several clinical cases. (author)
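
    A minimal flavour of such a spring-based prediction, with a single tissue layer and none of the turgor, gravity or sliding-contact features of the paper's model, is a relaxation loop in which each skin node is pulled towards its displaced bone anchor and its neighbours:

    ```python
    # Toy 2-D mass-spring relaxation for soft-tissue shape prediction.
    import numpy as np

    bone = np.array([[float(x), 0.0] for x in range(5)])  # bone anchor points
    skin = np.array([[float(x), 1.0] for x in range(5)])  # overlying surface nodes
    bone[2:, 1] += 0.4                                    # simulated bone advancement

    k_bone, k_skin, step = 1.0, 0.5, 0.2
    for _ in range(1000):                                 # iterative relaxation
        for i in range(len(skin)):
            # Spring to the bone anchor, rest offset (0, 1).
            force = k_bone * (bone[i] + [0.0, 1.0] - skin[i])
            for j in (i - 1, i + 1):                      # neighbour springs,
                if 0 <= j < len(skin):                    # rest vector (j-i, 0)
                    force += k_skin * (skin[j] - skin[i] - [float(j - i), 0.0])
            skin[i] += step * force
    print(np.round(skin, 3))                              # predicted tissue shape
    ```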

  11. 5th International Conference on Soft Computing for Problem Solving

    CERN Document Server

    Deep, Kusum; Bansal, Jagdish; Nagar, Atulya; Das, Kedar

    2016-01-01

    This two volume book is based on the research papers presented at the 5th International Conference on Soft Computing for Problem Solving (SocProS 2015) and covers a variety of topics, including mathematical modelling, image processing, optimization methods, swarm intelligence, evolutionary algorithms, fuzzy logic, neural networks, forecasting, medical and health care, data mining, etc. Mainly the emphasis is on Soft Computing and its applications in diverse areas. The prime objective of this book is to familiarize the reader with the latest scientific developments in various fields of Science, Engineering and Technology and is directed to the researchers and scientists engaged in various real-world applications of ‘Soft Computing’.

  12. Water demand forecasting: review of soft computing methods.

    Science.gov (United States)

    Ghalehkhondabi, Iman; Ardjmand, Ehsan; Young, William A; Weckman, Gary R

    2017-07-01

    Demand forecasting plays a vital role in resource management for governments and private companies. Considering the scarcity of water and its inherent constraints, demand management and forecasting in this domain are critically important. Several soft computing techniques have been developed over the last few decades for water demand forecasting. This study focuses on soft computing methods of water consumption forecasting published between 2005 and 2015. These methods include artificial neural networks (ANNs), fuzzy and neuro-fuzzy models, support vector machines, metaheuristics, and system dynamics. Furthermore, it was discussed that, while ANNs have been superior in many short-term forecasting cases, it is still very difficult to pick a single method as the overall best. According to the literature, various methods and their hybrids are applied to water demand forecasting. However, it seems soft computing has a lot more to contribute to water demand forecasting. These contribution areas include, but are not limited to, various ANN architectures, unsupervised methods, deep learning, various metaheuristics, and ensemble methods. Moreover, it is found that soft computing methods are mainly used for short-term demand forecasting.

  13. Darwinian Spacecraft: Soft Computing Strategies Breeding Better, Faster Cheaper

    Science.gov (United States)

    Noever, David A.; Baskaran, Subbiah

    1999-01-01

    Computers can create and test endless lists of candidate solutions to a particular problem, a process called "soft computing." This process uses statistical comparables, neural networks, genetic algorithms, fuzzy variables in uncertain environments, and flexible machine learning to create systems that increase spacecraft robustness and support metric evaluation. These concepts will allow for the development of spacecraft that can perform missions at lower cost.

  14. A new paradigm of knowledge engineering by soft computing

    CERN Document Server

    Ding, Liya

    2001-01-01

    Soft computing (SC) consists of several computing paradigms, including neural networks, fuzzy set theory, approximate reasoning, and derivative-free optimization methods such as genetic algorithms. The integration of those constituent methodologies forms the core of SC. In addition, the synergy allows SC to incorporate human knowledge effectively, deal with imprecision and uncertainty, and learn to adapt to unknown or changing environments for better performance. Together with other modern technologies, SC and its applications exert unprecedented influence on intelligent systems that mimic hum

  15. Prediction of Software Reliability using Bio Inspired Soft Computing Techniques.

    Science.gov (United States)

    Diwaker, Chander; Tomar, Pradeep; Poonia, Ramesh C; Singh, Vijander

    2018-04-10

    Many models have been developed for predicting software reliability. These reliability models are restricted to particular types of methodologies and restricted numbers of parameters. There are a number of techniques and methodologies that may be used for reliability prediction. There is a need to focus on parameter selection when estimating reliability, since the reliability of a system may increase or decrease depending on the parameters used. Thus there is a need to identify the factors that heavily affect the reliability of the system. Nowadays, reusability is widely used in various areas of research. Reusability is the basis of Component-Based Systems (CBS). Cost, time and human skill can be saved using Component-Based Software Engineering (CBSE) concepts. CBSE metrics may be used to assess which techniques are more suitable for estimating system reliability. Soft computing is used for small as well as large-scale problems where it is difficult to find accurate results due to uncertainty or randomness. Several possibilities are available for applying soft computing techniques to medicine-related problems: clinical medicine uses fuzzy logic and neural network methodologies significantly, while basic medical science most frequently and preferably uses neural networks and genetic algorithms. There is considerable interest shown by medical scientists in using the various soft computing methodologies in the genetics, physiology, radiology, cardiology and neurology disciplines. CBSE encourages users to reuse past and existing software when making new products, providing quality with a saving of time, memory space, and money. This paper focuses on the assessment of commonly used soft computing techniques such as Genetic Algorithms (GA), Neural Networks (NN), Fuzzy Logic, Support Vector Machines (SVM), Ant Colony Optimization (ACO), Particle Swarm Optimization (PSO), and Artificial Bee Colony (ABC). This paper presents the working of soft computing

  16. Renormalization group approach to soft gluon resummation

    International Nuclear Information System (INIS)

    Forte, Stefano; Ridolfi, Giovanni

    2003-01-01

    We present a simple proof of the all-order exponentiation of soft logarithmic corrections to hard processes in perturbative QCD. Our argument is based on proving that all large logs in the soft limit can be expressed in terms of a single dimensionful variable, and then using the renormalization group to resum them. Beyond the next-to-leading log level, our result is somewhat less predictive than previous all-order resummation formulae, but it does not rely on non-standard factorization, and it is thus possibly more general. We use our result to settle issues of convergence of the resummed series, we discuss scheme dependence at the resummed level, and we provide explicit resummed expressions in various factorization schemes
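
    For context, all-order soft resummations of this type are conventionally organised in Mellin-moment space as an exponent built from functions of a single variable λ; g_1 collects the leading logs and g_2 the next-to-leading logs mentioned above (normalisations vary between schemes, which is part of what the paper examines):

    ```latex
    % Generic structure of the resummed exponent in Mellin space:
    \ln \Delta(N, \alpha_s) = \ln N \, g_1(\lambda) + g_2(\lambda)
        + \alpha_s \, g_3(\lambda) + \cdots,
    \qquad \lambda = \beta_0 \, \alpha_s \ln N
    ```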

  17. Second International Conference on Soft Computing for Problem Solving

    CERN Document Server

    Nagar, Atulya; Deep, Kusum; Pant, Millie; Bansal, Jagdish; Ray, Kanad; Gupta, Umesh

    2014-01-01

    The present book is based on the research papers presented in the International Conference on Soft Computing for Problem Solving (SocProS 2012), held at JK Lakshmipat University, Jaipur, India. This book provides the latest developments in the area of soft computing and covers a variety of topics, including mathematical modeling, image processing, optimization, swarm intelligence, evolutionary algorithms, fuzzy logic, neural networks, forecasting, data mining, etc. The objective of the book is to familiarize the reader with the latest scientific developments that are taking place in various fields and the latest sophisticated problem solving tools that are being developed to deal with the complex and intricate problems that are otherwise difficult to solve by the usual and traditional methods. The book is directed to the researchers and scientists engaged in various fields of Science and Technology.

  18. Third International Conference on Soft Computing for Problem Solving

    CERN Document Server

    Deep, Kusum; Nagar, Atulya; Bansal, Jagdish

    2014-01-01

    The present book is based on the research papers presented in the 3rd International Conference on Soft Computing for Problem Solving (SocProS 2013), held as a part of the golden jubilee celebrations of the Saharanpur Campus of IIT Roorkee, at the Noida Campus of Indian Institute of Technology Roorkee, India. This book is divided into two volumes and covers a variety of topics including mathematical modelling, image processing, optimization, swarm intelligence, evolutionary algorithms, fuzzy logic, neural networks, forecasting, medical and health care, data mining etc. Particular emphasis is laid on soft computing and its application to diverse fields. The prime objective of the book is to familiarize the reader with the latest scientific developments that are taking place in various fields and the latest sophisticated problem solving tools that are being developed to deal with the complex and intricate problems, which are otherwise difficult to solve by the usual and traditional methods. The book is directed ...

  19. International Conference on Soft Computing Techniques and Engineering Application

    CERN Document Server

    Li, Xiaolong

    2014-01-01

    The main objective of ICSCTEA 2013 is to provide a platform for researchers, engineers and academicians from all over the world to present their research results and development activities in soft computing techniques and engineering applications. This conference provides opportunities for them to exchange new ideas and application experiences face to face, to establish business or research relations and to find global partners for future collaboration.

  20. Soft Computing: A New Informatics Paradigm, or a Fashionable Slogan? (Soft Computing. Nové informatické paradigma, nebo módní slogan?)

    Czech Academy of Sciences Publication Activity Database

    Hájek, Petr

    2000-01-01

    Vol. 79, No. 12 (2000), pp. 683-685. ISSN 0042-4544. Institutional research plan: AV0Z1030915. Keywords: soft computing * fuzzy computing * neural computing * generic computing. Subject RIV: BA - General Mathematics

  1. Soft Computing in Information Communication Technology Volume 2

    CERN Document Server

    2012-01-01

    This book is a collection of the accepted papers concerning soft computing in information communication technology. The resultant dissemination of the latest research results, and the exchanges of views concerning the future research directions to be taken in this field makes the work of immense value to all those having an interest in the topics covered. The present book represents a cooperative effort to seek out the best strategies for effecting improvements in the quality and the reliability of Fuzzy Logic, Machine Learning, Cryptography, Pattern Recognition, Bioinformatics, Biomedical Engineering, Advancements in ICT.

  2. Modeling soft factors in computer-based wargames

    Science.gov (United States)

    Alexander, Steven M.; Ross, David O.; Vinarskai, Jonathan S.; Farr, Steven D.

    2002-07-01

    Computer-based wargames have seen much improvement in recent years due to rapid increases in computing power. Because these games have been developed for the entertainment industry, most of these advances have centered on the graphics, sound, and user interfaces integrated into these wargames with less attention paid to the game's fidelity. However, for a wargame to be useful to the military, it must closely approximate as many of the elements of war as possible. Among the elements that are typically not modeled or are poorly modeled in nearly all military computer-based wargames are systematic effects, command and control, intelligence, morale, training, and other human and political factors. These aspects of war, with the possible exception of systematic effects, are individually modeled quite well in many board-based commercial wargames. The work described in this paper focuses on incorporating these elements from the board-based games into a computer-based wargame. This paper will also address the modeling and simulation of the systemic paralysis of an adversary that is implied by the concept of Effects Based Operations (EBO). Combining the fidelity of current commercial board wargames with the speed, ease of use, and advanced visualization of the computer can significantly improve the effectiveness of military decision making and education. Once in place, the process of converting board wargames concepts to computer wargames will allow the infusion of soft factors into military training and planning.

  3. Advanced soft computing diagnosis method for tumour grading.

    Science.gov (United States)

    Papageorgiou, E I; Spyridonos, P P; Stylios, C D; Ravazoula, P; Groumpos, P P; Nikiforidis, G N

    2006-01-01

    To develop an advanced diagnostic method for urinary bladder tumour grading, a novel soft computing modelling methodology based on the augmentation of fuzzy cognitive maps (FCMs) with the unsupervised active Hebbian learning (AHL) algorithm is applied. One hundred and twenty-eight cases of urinary bladder cancer were retrieved from the archives of the Department of Histopathology, University Hospital of Patras, Greece. All tumours had been characterized according to the classical World Health Organization (WHO) grading system. To design the FCM model for tumour grading, three expert histopathologists defined the main histopathological features (concepts) and their impact on grade characterization. The resulting FCM model consisted of nine concepts: eight concepts represented the main histopathological features for tumour grading, and the ninth represented the tumour grade itself. To increase the classification ability of the FCM model, the AHL algorithm was applied to adjust the weights of the FCM. The proposed FCM grading model achieved a classification accuracy of 72.5%, 74.42% and 95.55% for tumours of grades I, II and III, respectively. An advanced computerized method to support the tumour grade diagnosis decision was thus proposed and developed. The novelty of the method lies in employing the soft computing method of FCMs to represent specialized knowledge on histopathology and in augmenting the FCMs' ability using an unsupervised learning algorithm, the AHL. The proposed method performs with reasonably high accuracy compared to other existing methods and at the same time meets the physicians' requirements for transparency and explicability.
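
    As a rough illustration of the inference step in such a model (a minimal Python sketch with invented weights, not the authors' implementation), a fuzzy cognitive map repeatedly updates each concept's activation from the weighted activations of the others and squashes the result with a sigmoid; the AHL algorithm then adapts the weight matrix itself:

        import numpy as np

        def sigmoid(x):
            return 1.0 / (1.0 + np.exp(-x))

        def fcm_step(A, W):
            # A: concept activations (n,); W[j, i]: influence of concept j on concept i.
            return sigmoid(A + A @ W)

        n = 9                                  # eight histopathological features + the grade concept
        rng = np.random.default_rng(0)
        W = rng.uniform(-1.0, 1.0, (n, n))     # placeholder weights; experts and AHL supply real ones
        np.fill_diagonal(W, 0.0)               # concepts do not influence themselves

        A = np.full(n, 0.5)
        for _ in range(20):                    # iterate until the map settles
            A = fcm_step(A, W)
        print("tumour-grade concept activation:", A[-1])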

  4. International Conference on Soft Computing in Information Communication Technology

    CERN Document Server

    Soft Computing in Information Communication Technology

    2012-01-01

      This is a collection of the accepted papers concerning soft computing in information communication technology. All accepted papers were subjected to strict peer review by two expert referees. The resultant dissemination of the latest research results, and the exchanges of views concerning the future research directions to be taken in this field, makes the work of immense value to all those having an interest in the topics covered. The present book represents a cooperative effort to seek out the best strategies for effecting improvements in the quality and the reliability of Neural Networks, Swarm Intelligence, Evolutionary Computing, Image Processing, Internet Security, Data Security, Data Mining, Network Security and Protection of Data and Cyber Laws. Our sincere appreciation and thanks go to these authors for their contributions to this conference. I hope you can gain lots of useful information from the book.

  5. Application of Soft Computing in Coherent Communications Phase Synchronization

    Science.gov (United States)

    Drake, Jeffrey T.; Prasad, Nadipuram R.

    2000-01-01

    The use of soft computing techniques in coherent communications phase synchronization provides an alternative to analytical or hard computing methods. This paper discusses a novel use of Adaptive Neuro-Fuzzy Inference Systems (ANFIS) for phase synchronization in coherent communications systems utilizing Multiple Phase Shift Keying (M-PSK) modulation. A brief overview of the M-PSK digital communications bandpass modulation technique is presented and its requisite need for phase synchronization is discussed. We briefly describe the hybrid platform developed by Jang that incorporates fuzzy/neural structures, namely the Adaptive Neuro-Fuzzy Inference System (ANFIS). We then discuss the application of ANFIS to phase estimation for M-PSK. The modeling of both explicit and implicit phase estimation schemes for M-PSK symbols with unknown structure is discussed. Performance results from simulation of the above scheme are presented.
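
    For context, a minimal Python sketch of the classical non-data-aided estimator that such an ANFIS scheme is an alternative to (this is the standard M-th power method, not the paper's approach): raising the received symbols to the M-th power collapses the constellation and exposes M times the carrier phase.

        import numpy as np

        M = 8                                          # 8-PSK
        rng = np.random.default_rng(1)
        symbols = np.exp(2j * np.pi * rng.integers(0, M, 1000) / M)

        true_offset = 0.3                              # unknown carrier phase (rad)
        rx = symbols * np.exp(1j * true_offset)
        rx += 0.05 * (rng.standard_normal(1000) + 1j * rng.standard_normal(1000))

        # The estimate is only defined modulo 2*pi/M (the usual M-fold ambiguity).
        est = np.angle(np.mean(rx ** M)) / M
        print(f"estimated offset: {est:.3f} rad (true: {true_offset})")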

  6. Soft computing in design and manufacturing of advanced materials

    Science.gov (United States)

    Cios, Krzysztof J.; Baaklini, George Y; Vary, Alex

    1993-01-01

    The potential of fuzzy sets and neural networks, often referred to as soft computing, for aiding in all aspects of manufacturing of advanced materials like ceramics is addressed. In design and manufacturing of advanced materials, it is desirable to find which of the many processing variables contribute most to the desired properties of the material. There is also interest in real time quality control of parameters that govern material properties during processing stages. The concepts of fuzzy sets and neural networks are briefly introduced and it is shown how they can be used in the design and manufacturing processes. These two computational methods are alternatives to other methods such as the Taguchi method. The two methods are demonstrated by using data collected at NASA Lewis Research Center. Future research directions are also discussed.

  7. Pattern recognition algorithms for data mining scalability, knowledge discovery and soft granular computing

    CERN Document Server

    Pal, Sankar K

    2004-01-01

    Pattern Recognition Algorithms for Data Mining addresses different pattern recognition (PR) tasks in a unified framework with both theoretical and experimental results. Tasks covered include data condensation, feature selection, case generation, clustering/classification, and rule generation and evaluation. This volume presents various theories, methodologies, and algorithms, using both classical approaches and hybrid paradigms. The authors emphasize large datasets with overlapping, intractable, or nonlinear boundary classes, and datasets that demonstrate granular computing in soft frameworks.Organized into eight chapters, the book begins with an introduction to PR, data mining, and knowledge discovery concepts. The authors analyze the tasks of multi-scale data condensation and dimensionality reduction, then explore the problem of learning with support vector machine (SVM). They conclude by highlighting the significance of granular computing for different mining tasks in a soft paradigm.

  8. Technical Development and Application of Soft Computing in Agricultural and Biological Engineering

    Science.gov (United States)

    Soft computing is a set of “inexact” computing techniques, which are able to model and analyze very complex problems. For these complex problems, more conventional methods have not been able to produce cost-effective, analytical, or complete solutions. Soft computing has been extensively studied and...

  9. Development of Soft Computing and Applications in Agricultural and Biological Engineering

    Science.gov (United States)

    Soft computing is a set of “inexact” computing techniques, which are able to model and analyze very complex problems. For these complex problems, more conventional methods have not been able to produce cost-effective, analytical, or complete solutions. Soft computing has been extensively studied and...

  10. Wearable Intrinsically Soft, Stretchable, Flexible Devices for Memories and Computing.

    Science.gov (United States)

    Rajan, Krishna; Garofalo, Erik; Chiolerio, Alessandro

    2018-01-27

    A recent trend in the development of high mass consumption electron devices is towards electronic textiles (e-textiles), smart wearable devices, smart clothes, and flexible or printable electronics. Intrinsically soft, stretchable, flexible, Wearable Memories and Computing devices (WMCs) bring us closer to sci-fi scenarios, where future electronic systems are totally integrated in our everyday outfits and help us in achieving a higher comfort level, interacting for us with other digital devices such as smartphones and domotics, or with analog devices, such as our brain/peripheral nervous system. WMC will enable each of us to contribute to open and big data systems as individual nodes, providing real-time information about physical and environmental parameters (including air pollution monitoring, sound and light pollution, chemical or radioactive fallout alert, network availability, and so on). Furthermore, WMC could be directly connected to human brain and enable extremely fast operation and unprecedented interface complexity, directly mapping the continuous states available to biological systems. This review focuses on recent advances in nanotechnology and materials science and pays particular attention to any result and promising technology to enable intrinsically soft, stretchable, flexible WMC.

  11. Computer vision and soft computing for automatic skull-face overlay in craniofacial superimposition.

    Science.gov (United States)

    Campomanes-Álvarez, B Rosario; Ibáñez, O; Navarro, F; Alemán, I; Botella, M; Damas, S; Cordón, O

    2014-12-01

    Craniofacial superimposition can provide evidence to support that some human skeletal remains belong or not to a missing person. It involves the process of overlaying a skull with a number of ante mortem images of an individual and the analysis of their morphological correspondence. Within the craniofacial superimposition process, the skull-face overlay stage just focuses on achieving the best possible overlay of the skull and a single ante mortem image of the suspect. Although craniofacial superimposition has been in use for over a century, skull-face overlay is still applied by means of a trial-and-error approach without an automatic method. Practitioners finish the process once they consider that a good enough overlay has been attained. Hence, skull-face overlay is a very challenging, subjective, error prone, and time consuming part of the whole process. Though the numerical assessment of the method quality has not been achieved yet, computer vision and soft computing arise as powerful tools to automate it, dramatically reducing the time taken by the expert and obtaining an unbiased overlay result. In this manuscript, we justify and analyze the use of these techniques to properly model the skull-face overlay problem. We also present the automatic technical procedure we have developed using these computational methods and show the four overlays obtained in two craniofacial superimposition cases. This automatic procedure can be thus considered as a tool to aid forensic anthropologists to develop the skull-face overlay, automating and avoiding subjectivity of the most tedious task within craniofacial superimposition. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  12. Soft Real-Time PID Control on a VME Computer

    Science.gov (United States)

    Karayan, Vahag; Sander, Stanley; Cageao, Richard

    2007-01-01

    microPID (uPID) is a computer program for real-time proportional + integral + derivative (PID) control of a translation stage in a Fourier-transform ultraviolet spectrometer. microPID implements a PID control loop over a position profile at a sampling rate of 8 kHz (sampling period 125 microseconds). The software runs in a stripped-down Linux operating system on a VersaModule Eurocard (VME) computer operating with a real-time priority queue, using an embedded controller, a 16-bit digital-to-analog converter (D/A) board, and a laser-positioning board (LPB). microPID consists of three main parts: (1) VME device-driver routines, (2) software that administers a custom protocol for serial communication with a control computer, and (3) a loop section that obtains the current position from an LPB-driver routine, calculates the ideal position from the profile, and calculates a new voltage command by use of an embedded PID routine, all within each sampling period. The voltage command is sent to the D/A board to control the stage. microPID uses special kernel headers to obtain microsecond timing resolution. Inasmuch as microPID implements a single-threaded process and all other processes are disabled, the Linux operating system acts as a soft real-time system.
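
    The control law in the loop section is the textbook discrete PID; a minimal Python sketch of one loop iteration (the gains and the profile are placeholders, not values from the flight software):

        DT = 125e-6                      # 8 kHz sampling period, as in the article
        KP, KI, KD = 2.0, 50.0, 1e-4     # placeholder gains

        integral, prev_err = 0.0, 0.0

        def pid_step(ideal_pos, measured_pos):
            # One iteration, run once per 125-microsecond sampling period.
            global integral, prev_err
            err = ideal_pos - measured_pos
            integral += err * DT
            derivative = (err - prev_err) / DT
            prev_err = err
            return KP * err + KI * integral + KD * derivative  # voltage command for the D/A board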

  13. Crops: a green approach toward self-assembled soft materials.

    Science.gov (United States)

    Vemula, Praveen Kumar; John, George

    2008-06-01

    ... Importantly, an enzyme-triggered drug-delivery model for hydrophobic drugs was demonstrated by using these supramolecularly assembled hydrogels. Following a similar biocatalytic approach, vitamin C amphiphiles were synthesized with different hydrocarbon chain lengths, and their ability to self-assemble into molecular gels and liquid crystals has been studied in detail. Such biobased soft materials were successfully used to develop novel organic-inorganic hybrid materials by in situ synthesis of metal nanoparticles. The self-assembled soft materials were characterized by several spectroscopic techniques (UV-visible, infrared, and fluorescence spectrophotometry), as well as by microscopic methods, including polarized optical, confocal, scanning, and transmission electron microscopy, and by thermal analysis. The molecular packing of the hierarchically assembled bilayer membranes was fully elucidated by X-ray analysis. We envision that the results summarized in this Account will encourage interdisciplinary collaboration between scientists in the fields of organic synthesis, soft materials research, and green chemistry to develop functional materials from underutilized crop-based renewable feedstock, with innovation driven both by material needs and environmentally benign design principles.

  14. Applications of the soft computing in the automated history matching

    Energy Technology Data Exchange (ETDEWEB)

    Silva, P.C.; Maschio, C.; Schiozer, D.J. [Unicamp (Brazil)

    2006-07-01

    Reservoir management is a research field in petroleum engineering that optimizes reservoir performance based on environmental, political, economic and technological criteria. Reservoir simulation is based on geological models that simulate fluid flow. Models must be constantly corrected to yield the observed production behaviour. The process of history matching is controlled by the comparison of production data, well test data and measured data from simulations. Parametrization, objective function analysis, sensitivity analysis and uncertainty analysis are important steps in history matching. One of the main challenges facing automated history matching is to develop algorithms that find the optimal solution in multidimensional search spaces. Optimization algorithms can be either global optimizers, which work with noisy multi-modal functions, or local optimizers, which cannot. The problem with global optimizers is the very large number of function calls, which is an inconvenience due to the long reservoir simulation time. For that reason, techniques such as least squares, thin-plate splines, kriging and artificial neural networks (ANN) have been used as substitutes for reservoir simulators. This paper described the use of optimization algorithms to find the optimal solution in automated history matching. Several ANNs were used, including the generalized regression neural network, a fuzzy system with subtractive clustering, and a radial basis network. The UNIPAR soft computing method was used along with a modified Hooke-Jeeves optimization method. Two case studies with synthetic and real reservoirs are examined. It was concluded that the combination of global and local optimization has the potential to improve the history matching process and that the use of substitute models can reduce computational effort. 15 refs., 11 figs.
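
    As a sketch of the local-search half of that combination, a plain Hooke-Jeeves pattern search in Python (the paper uses a modified variant, and the quadratic objective below is a stand-in for the history-matching misfit):

        import numpy as np

        def hooke_jeeves(f, x0, step=0.5, shrink=0.5, tol=1e-6, max_iter=1000):
            x = np.asarray(x0, dtype=float)
            fx = f(x)
            for _ in range(max_iter):
                # Exploratory move: probe each coordinate in both directions.
                xt, ft = x.copy(), fx
                for i in range(len(x)):
                    for d in (step, -step):
                        trial = xt.copy()
                        trial[i] += d
                        ftrial = f(trial)
                        if ftrial < ft:
                            xt, ft = trial, ftrial
                            break
                if ft < fx:
                    # Pattern move: jump further along the successful direction.
                    xp = 2 * xt - x
                    x, fx = (xp, f(xp)) if f(xp) < ft else (xt, ft)
                else:
                    step *= shrink       # no improvement: refine the mesh
                    if step < tol:
                        break
            return x, fx

        misfit = lambda p: (p[0] - 1.0) ** 2 + 10 * (p[1] + 2.0) ** 2   # stand-in objective
        print(hooke_jeeves(misfit, [0.0, 0.0]))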

  15. A programming approach to computability

    CERN Document Server

    Kfoury, A J; Arbib, Michael A

    1982-01-01

    Computability theory is at the heart of theoretical computer science. Yet, ironically, many of its basic results were discovered by mathematical logicians prior to the development of the first stored-program computer. As a result, many texts on computability theory strike today's computer science students as far removed from their concerns. To remedy this, we base our approach to computability on the language of while-programs, a lean subset of PASCAL, and postpone consideration of such classic models as Turing machines, string-rewriting systems, and μ-recursive functions till the final chapter. Moreover, we balance the presentation of unsolvability results such as the unsolvability of the Halting Problem with a presentation of the positive results of modern programming methodology, including the use of proof rules, and the denotational semantics of programs. Computer science seeks to provide a scientific basis for the study of information processing, the solution of problems by algorithms, and the design ...

  16. A Wireless Sensor Network with Soft Computing Localization Techniques for Track Cycling Applications.

    Science.gov (United States)

    Gharghan, Sadik Kamel; Nordin, Rosdiadee; Ismail, Mahamod

    2016-08-06

    In this paper, we propose two soft computing localization techniques for wireless sensor networks (WSNs). The two techniques, the Adaptive Neuro-Fuzzy Inference System (ANFIS) and the Artificial Neural Network (ANN), focus on a range-based localization method which relies on the measurement of the received signal strength indicator (RSSI) from the three ZigBee anchor nodes distributed throughout the track cycling field. The soft computing techniques aim to estimate the distance between bicycles moving on the cycle track for outdoor and indoor velodromes. In the first approach the ANFIS was considered, whereas in the second approach the ANN was hybridized individually with three optimization algorithms, namely Particle Swarm Optimization (PSO), the Gravitational Search Algorithm (GSA), and the Backtracking Search Algorithm (BSA). The results revealed that the hybrid GSA-ANN outperforms the other methods adopted in this paper in terms of localization and distance estimation accuracy. The hybrid GSA-ANN achieves a mean absolute distance estimation error of 0.02 m and 0.2 m for outdoor and indoor velodromes, respectively.
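
    The range-based step that these models learn can be compared with the usual log-distance path-loss inversion, shown here as a minimal Python baseline (the calibration constants are invented, not taken from the paper):

        def rssi_to_distance(rssi_dbm, rssi_at_d0=-45.0, d0=1.0, path_loss_exp=2.2):
            # Log-distance model: RSSI(d) = RSSI(d0) - 10 * n * log10(d / d0).
            return d0 * 10 ** ((rssi_at_d0 - rssi_dbm) / (10 * path_loss_exp))

        print(rssi_to_distance(-60.0))   # distance estimate in metres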

  17. What is Soft Computing? Bridging Gaps for 21st Century Science!

    Directory of Open Access Journals (Sweden)

    Rudolf Seising

    2010-06-01

    Full Text Available This contribution offers historical and philosophical reflections on the role of Soft Computing in the 21st century. Referring to Magdalena's article in this issue, this paper considers the aspects of mixtures of techniques, the opposite pair "Hard Computing" and "Soft Computing", and Computational Intelligence. From the historical perspective the paper goes back to three articles by Warren Weaver that appeared after World War II. A concentrated study of these papers helps to understand that Soft Computing will be able to play a key role in the future development of science and technology.

  18. 17th Online World Conference on Soft Computing in Industrial Applications

    CERN Document Server

    Krömer, Pavel; Köppen, Mario; Schaefer, Gerald

    2014-01-01

    This volume of Advances in Intelligent Systems and Computing contains accepted papers presented at WSC17, the 17th Online World Conference on Soft Computing in Industrial Applications, held from December 2012 to January 2013 on the Internet. WSC17 continues a successful series of scientific events started over a decade ago by the World Federation of Soft Computing. It brought together researchers from over the world interested in the ever advancing state of the art in the field. Continuous technological improvements make this online forum a viable gathering format for a world class conference. The aim of WSC17 was to disseminate excellent research results and contribute to building a global network of scientists interested in both theoretical foundations and practical applications of soft computing.   The 2012 edition of the Online World Conference on Soft Computing in Industrial Applications consisted of general track and special session on Continuous Features Discretization for Anomaly Intrusion Detectors...

  19. Computer architecture a quantitative approach

    CERN Document Server

    Hennessy, John L

    2019-01-01

    Computer Architecture: A Quantitative Approach, Sixth Edition has been considered essential reading by instructors, students and practitioners of computer design for over 20 years. The sixth edition of this classic textbook is fully revised with the latest developments in processor and system architecture. It now features examples from the RISC-V (RISC Five) instruction set architecture, a modern RISC instruction set developed and designed to be a free and openly adoptable standard. It also includes a new chapter on domain-specific architectures and an updated chapter on warehouse-scale computing that features the first public information on Google's newest WSC. True to its original mission of demystifying computer architecture, this edition continues the longstanding tradition of focusing on areas where the most exciting computing innovation is happening, while always keeping an emphasis on good engineering design.

  20. Critical Data Analysis Precedes Soft Computing Of Medical Data

    DEFF Research Database (Denmark)

    Keyserlingk, Diedrich Graf von; Jantzen, Jan; Berks, G.

    2000-01-01

    ... extracted. The factors had different relationships (loadings) to the symptoms. Although the factors were gained only by computations, they seemed to express some modular features of the language disturbances. This phenomenon, that factors represent superior aspects of data, is well known in factor analysis ... the deficits in communication. Sets of symptoms corresponding to the traditional symptoms in Broca and Wernicke aphasia may be represented in the factors, but the factor itself does not represent a syndrome. It is assumed that this kind of data analysis shows a new approach to the understanding of language ...

  1. Computational approaches to energy materials

    CERN Document Server

    Catlow, Richard; Walsh, Aron

    2013-01-01

    The development of materials for clean and efficient energy generation and storage is one of the most rapidly developing, multi-disciplinary areas of contemporary science, driven primarily by concerns over global warming, diminishing fossil-fuel reserves, the need for energy security, and increasing consumer demand for portable electronics. Computational methods are now an integral and indispensable part of the materials characterisation and development process.   Computational Approaches to Energy Materials presents a detailed survey of current computational techniques for the ...

  2. An Approach to the Concept of Soft Vietoris Topology

    Directory of Open Access Journals (Sweden)

    Izzettin Demir

    2016-10-01

    Full Text Available In the present paper, we study the Vietoris topology in the context of soft set. Firstly, we investigate some aspects of first countability in the soft Vietoris topology. Then, we obtain some properties about its second countability.

  3. Exploiting short-term memory in soft body dynamics as a computational resource.

    Science.gov (United States)

    Nakajima, K; Li, T; Hauser, H; Pfeifer, R

    2014-11-06

    Soft materials are not only highly deformable, but they also possess rich and diverse body dynamics. Soft body dynamics exhibit a variety of properties, including nonlinearity, elasticity and potentially infinitely many degrees of freedom. Here, we demonstrate that such soft body dynamics can be employed to conduct certain types of computation. Using body dynamics generated from a soft silicone arm, we show that they can be exploited to emulate functions that require memory and to embed robust closed-loop control into the arm. Our results suggest that soft body dynamics have a short-term memory and can serve as a computational resource. This finding paves the way towards exploiting passive body dynamics for control of a large class of underactuated systems. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
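
    The computational scheme here is that of physical reservoir computing: the body provides a high-dimensional nonlinear state, and only a linear readout is trained. A toy Python sketch, with a random recurrent network standing in for the silicone arm's sensor readings (the paper uses real bending-sensor data):

        import numpy as np

        rng = np.random.default_rng(2)
        T, n = 2000, 100
        u = rng.uniform(-1, 1, T)                 # input stream driving the "body"

        W = 0.05 * rng.standard_normal((n, n))    # stand-in for the soft-body dynamics
        w_in = rng.standard_normal(n)
        x = np.zeros((T, n))
        for t in range(1, T):
            x[t] = np.tanh(W @ x[t - 1] + w_in * u[t])

        # Short-term-memory task: reconstruct the input from 5 steps earlier
        # by training nothing but a linear readout on the states.
        delay = 5
        X, y = x[delay:], u[:-delay]
        w_out, *_ = np.linalg.lstsq(X, y, rcond=None)
        print("readout MSE:", np.mean((X @ w_out - y) ** 2))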

  4. Real-time simulation of biological soft tissues: a PGD approach.

    Science.gov (United States)

    Niroomandi, S; González, D; Alfaro, I; Bordeu, F; Leygue, A; Cueto, E; Chinesta, F

    2013-05-01

    We introduce here a novel approach for the numerical simulation of nonlinear, hyperelastic soft tissues at kilohertz feedback rates necessary for haptic rendering. This approach is based upon the use of proper generalized decomposition techniques, a generalization of PODs. Proper generalized decomposition techniques can be considered as a means of a priori model order reduction and provides a physics-based meta-model without the need for prior computer experiments. The suggested strategy is thus composed of an offline phase, in which a general meta-model is computed, and an online evaluation phase in which the results are obtained at real time. Results are provided that show the potential of the proposed technique, together with some benchmark test that shows the accuracy of the method. Copyright © 2013 John Wiley & Sons, Ltd.
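
    The heart of the PGD meta-model is a separated representation of the solution: rather than solving anew for each parameter value, the offline phase builds a sum of products of low-dimensional modes, schematically (in LaTeX notation)

        u(\mathbf{x}, p_1, \dots, p_k) \approx \sum_{i=1}^{N} F_i(\mathbf{x}) \prod_{j=1}^{k} G_i^{j}(p_j)

    so that the online phase only evaluates and sums precomputed low-dimensional functions, which is what makes kilohertz feedback rates attainable.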

  5. Soft gluon approach for diffractive photoproduction of J/ψ

    International Nuclear Information System (INIS)

    Ma, J.P.; Xu Jiasheng

    2002-01-01

    We study diffractive photoproduction of J/ψ by taking the charm quark as a heavy quark. A description of the nonperturbative effect related to J/ψ can be made by using NRQCD. In the forward region of the kinematics, the interaction between the cc-bar pair and the initial hadron is due to the exchange of soft gluons. The effect of the exchange can be studied by using the expansion in the inverse of the quark mass m_c. At the leading order we find that the nonperturbative effect related to the initial hadron is represented by a matrix element of field strength operators, which are separated in the moving direction of J/ψ in spacetime. The S-matrix element is then obtained without using perturbative QCD and the results are not based on any model. Corrections to the results can be systematically added. Keeping the dominant contribution of the S-matrix element in the large energy limit, we find that the imaginary part of the S-matrix element is related to the gluon distribution for x→0 under a reasonable assumption; the real part can be obtained with another approximation or with a dispersion relation. Our approach differs from previous approaches, and our results differ from those obtained with them. The differences are discussed in detail. A comparison with experiment is also made and a qualitative agreement is found. (author)

  6. Use of Soft Computing Technologies for a Qualitative and Reliable Engine Control System for Propulsion Systems

    Science.gov (United States)

    Trevino, Luis; Brown, Terry; Crumbley, R. T. (Technical Monitor)

    2001-01-01

    The problem addressed in this paper is to explore how Soft Computing Technologies (SCT) could be employed to improve overall vehicle system safety, reliability, and rocket engine performance through development of a qualitative and reliable engine control system (QRECS). Specifically, this is addressed by enhancing rocket engine control using SCT, innovative data mining tools, and sound software engineering practices used in Marshall's Flight Software Group (FSG). The principal goals for addressing the issue of quality are to improve software management, software development time, software maintenance, processor execution, fault tolerance and mitigation, and nonlinear control in power level transitions. The intent is not to discuss any shortcomings of existing engine control methodologies, but to provide alternative design choices for control, implementation, performance, and sustaining engineering, all relative to addressing the issue of reliability. The approaches outlined in this paper require knowledge in the fields of rocket engine propulsion (system level), software engineering for embedded flight software systems, and soft computing technologies (i.e., neural networks, fuzzy logic, data mining, and Bayesian belief networks), some of which are briefed in this paper. For this effort, the targeted demonstration rocket engine testbed is the MC-1 engine (formerly FASTRAC), which is simulated with hardware and software in the Marshall Avionics & Software Testbed (MAST) laboratory that currently resides at NASA's Marshall Space Flight Center, building 4476, and is managed by the Avionics Department. A brief plan of action for designing, developing, implementing, and testing a Phase One effort for QRECS is given, along with expected results. Phase One will focus on development of a Smart Start Engine Module and a Mainstage Engine Module for proper engine start and mainstage engine operations. The overall intent is to demonstrate that by ...

  7. Image Analysis Based on Soft Computing and Applied on Space Shuttle During the Liftoff Process

    Science.gov (United States)

    Dominquez, Jesus A.; Klinko, Steve J.

    2007-01-01

    Imaging techniques based on Soft Computing (SC) and developed at Kennedy Space Center (KSC) have been implemented on a variety of prototype applications related to the safe operation of the Space Shuttle during the liftoff process. These SC-based prototype applications include detection and tracking of moving Foreign Object Debris (FOD) during the Space Shuttle liftoff, visual anomaly detection on slidewires used in the emergency egress system for the Space Shuttle at the launch pad, and visual detection of distant birds approaching the Space Shuttle launch pad. This SC-based image analysis capability developed at KSC was also used to analyze images acquired during the accident of the Space Shuttle Columbia and to estimate the trajectory and velocity of the foam that caused the accident.

  8. Soft timing closure for soft programmable logic cores: The ARGen approach

    OpenAIRE

    Bollengier , Théotime; Lagadec , Loïc; Najem , Mohamad; Le Lann , Jean-Christophe; Guilloux , Pierre

    2017-01-01

    Reconfigurable cores support post-release updates which shortens time-to-market while extending circuits’ lifespan. Reconfigurable cores can be provided as hard cores (ASIC) or soft cores (RTL). Soft reconfigurable cores outperform hard reconfigurable cores by preserving the ASIC synthesis flow, at the cost of lowering scalability but also exacerbating timing closure issues. This article tackles these two issues and introduces the ARGen generator that produces scalable...

  9. Adult soft tissue sarcomas: conventional therapies and molecularly targeted approaches.

    Science.gov (United States)

    Mocellin, Simone; Rossi, Carlo R; Brandes, Alba; Nitti, Donato

    2006-02-01

    The therapeutic approach to soft tissue sarcomas (STS) has evolved over the past two decades based on the results from randomized controlled trials, which are guiding physicians in the treatment decision-making process. Despite significant improvements in the control of local disease, a significant number of patients ultimately die of recurrent/metastatic disease following radical surgery due to a lack of effective adjuvant treatments. In addition, the characteristic chemoresistance of STS has compromised the therapeutic value of conventional antineoplastic agents in cases of unresectable advanced/metastatic disease. Therefore, novel therapeutic strategies are urgently needed to improve the prognosis of patients with STS. Recent advances in STS biology are paving the way to the development of molecularly targeted therapeutic strategies, the efficacy of which relies not only on the knowledge of the molecular mechanisms underlying cancer development/progression but also on the personalization of the therapeutic regimen according to the molecular features of individual tumours. In this work, we review the state-of-the-art of conventional treatments for STS and summarize the most promising findings in the development of molecularly targeted therapeutic approaches.

  10. Web mining in soft computing framework: relevance, state of the art and future directions.

    Science.gov (United States)

    Pal, S K; Talwar, V; Mitra, P

    2002-01-01

    The paper summarizes the different characteristics of Web data, the basic components of Web mining and its different types, and the current state of the art. The reason for considering Web mining a separate field from data mining is explained. The limitations of some of the existing Web mining methods and tools are enunciated, and the significance of soft computing (comprising fuzzy logic (FL), artificial neural networks (ANNs), genetic algorithms (GAs), and rough sets (RSs)) is highlighted. A survey of the existing literature on "soft Web mining" is provided along with the commercially available systems. The prospective areas of Web mining where the application of soft computing needs immediate attention are outlined with justification. Scope for future research in developing "soft Web mining" systems is explained. An extensive bibliography is also provided.

  11. Problems and Issues in Using Computer- Based Support Tools to Enhance 'Soft' Systems Methodologies

    Directory of Open Access Journals (Sweden)

    Mark Stansfield

    2001-11-01

    Full Text Available This paper explores the issue of whether computer-based support tools can enhance the use of 'soft' systems methodologies as applied to real-world problem situations. Although work has been carried out by a number of researchers in applying computer-based technology to concepts and methodologies relating to 'soft' systems thinking such as Soft Systems Methodology (SSM), such attempts appear to be still in their infancy and have not been applied widely to real-world problem situations. This paper will highlight some of the problems that may be encountered in attempting to develop computer-based support tools for 'soft' systems methodologies. Particular attention will be paid to an attempt by the author to develop a computer-based support tool for a particular 'soft' systems method of inquiry known as the Appreciative Inquiry Method, which is based upon Vickers' notion of 'appreciation' (Vickers, 1965) and Checkland's SSM (Checkland, 1981). The final part of the paper will explore some of the lessons learnt from developing and applying the computer-based support tool to a real-world problem situation, as well as considering the feasibility of developing computer-based support tools for 'soft' systems methodologies. This paper will put forward the point that a mixture of manual and computer-based tools should be employed to allow a methodology to be used in an unconstrained manner, while the benefits provided by computer-based technology should be utilised in supporting and enhancing the more mundane and structured tasks.

  12. Computer tomography for rare soft tissue tumours of the extremities

    International Nuclear Information System (INIS)

    Boettger, E.; Semerak, M.; Stoltze, D.; Rossak, K.

    1979-01-01

    Five patients with undiagnosed soft tissue masses in the extremities were examined, and in two a pathological diagnosis could be made. One was an extensive, invasive fibroma (desmoid), 22 cm long, which could be followed from the thigh almost into the pelvis. It was sharply demarcated from the surrounding muscles and of higher density. The second case was a 12 cm long cavernous haemangioma in the semimembranosus muscle. This was originally hypodense, but showed a marked increase in density after the administration of contrast. (orig.) [de]

  13. MOBILE CLOUD COMPUTING APPLIED TO HEALTHCARE APPROACH

    OpenAIRE

    Omar AlSheikSalem

    2016-01-01

    In the past few years it became clear that mobile cloud computing was established by integrating mobile computing and cloud computing, gaining in both storage space and processing speed. Integrating healthcare applications and services is one of the vast data approaches that can be adapted to mobile cloud computing. This work proposes a framework for global healthcare computing that combines mobile computing and cloud computing. This approach makes it possible to integrate all of ...

  14. On the possibility of non-invasive multilayer temperature estimation using soft-computing methods.

    Science.gov (United States)

    Teixeira, C A; Pereira, W C A; Ruano, A E; Ruano, M Graça

    2010-01-01

    This work reports original results on the possibility of non-invasive temperature estimation (NITE) in a multilayered phantom by applying soft-computing methods. The existence of reliable non-invasive temperature estimator models would improve the safety and efficacy of thermal therapies, which would lead to broader acceptance of this kind of therapy. Several approaches based on medical imaging technologies have been proposed, with magnetic resonance imaging (MRI) identified as the only one achieving acceptable temperature resolution for hyperthermia purposes. However, the intrinsic characteristics of MRI (e.g., high instrumentation cost) led us to use backscattered ultrasound (BSU) instead. Among the different BSU features, temporal echo-shifts have received major attention. These shifts are due to changes in the speed of sound and expansion of the medium. The originality of this work involves two aspects: the estimator model itself is original (based on soft-computing methods), and its application to temperature estimation in a three-layer phantom has not previously been reported in the literature. In this work a three-layer (non-homogeneous) phantom was developed. The two external layers were composed of (in % of weight): 86.5% degassed water, 11% glycerin and 2.5% agar-agar. The intermediate layer was obtained by adding graphite powder, in the amount of 2% of the water weight, to the above composition. The phantom was developed to have attenuation and speed of sound similar to in vivo muscle, according to the literature. BSU signals were collected and cumulative temporal echo-shifts computed. These shifts and past temperature values were then considered as possible estimator inputs. A soft-computing methodology was applied to look for appropriate multilayered temperature estimators. The methodology involves radial-basis-function neural networks (RBFNN) with structure optimized by the multi-objective genetic algorithm (MOGA). In this work 40 operating conditions were ...
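
    A minimal Python sketch of the RBFNN regression at the core of such an estimator (structure fixed by hand here, whereas the paper selects it with the MOGA; the data are synthetic):

        import numpy as np

        rng = np.random.default_rng(3)
        X = rng.uniform(-1, 1, (200, 2))           # e.g. cumulative echo-shift and past temperature
        y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1]    # synthetic "temperature" target

        centers = X[rng.choice(len(X), 15, replace=False)]   # 15 Gaussian units
        width = 0.5

        def design(X):
            d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
            return np.exp(-d2 / (2 * width ** 2))

        Phi = design(X)
        w, *_ = np.linalg.lstsq(Phi, y, rcond=None)          # linear output weights
        print("training RMSE:", np.sqrt(np.mean((Phi @ w - y) ** 2)))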

  15. Towards a cyber-physical era: soft computing framework based multi-sensor array for water quality monitoring

    Directory of Open Access Journals (Sweden)

    J. Bhardwaj

    2018-02-01

    Full Text Available New concepts and techniques are replacing traditional methods of water quality parameter measurement systems. This paper introduces a cyber-physical system (CPS approach for water quality assessment in a distribution network. Cyber-physical systems with embedded sensors, processors and actuators can be designed to sense and interact with the water environment. The proposed CPS is comprised of sensing framework integrated with five different water quality parameter sensor nodes and soft computing framework for computational modelling. Soft computing framework utilizes the applications of Python for user interface and fuzzy sciences for decision making. Introduction of multiple sensors in a water distribution network generates a huge number of data matrices, which are sometimes highly complex, difficult to understand and convoluted for effective decision making. Therefore, the proposed system framework also intends to simplify the complexity of obtained sensor data matrices and to support decision making for water engineers through a soft computing framework. The target of this proposed research is to provide a simple and efficient method to identify and detect presence of contamination in a water distribution network using applications of CPS.

  16. Towards a cyber-physical era: soft computing framework based multi-sensor array for water quality monitoring

    Science.gov (United States)

    Bhardwaj, Jyotirmoy; Gupta, Karunesh K.; Gupta, Rajiv

    2018-02-01

    New concepts and techniques are replacing traditional methods of water quality parameter measurement systems. This paper introduces a cyber-physical system (CPS) approach for water quality assessment in a distribution network. Cyber-physical systems with embedded sensors, processors and actuators can be designed to sense and interact with the water environment. The proposed CPS is comprised of sensing framework integrated with five different water quality parameter sensor nodes and soft computing framework for computational modelling. Soft computing framework utilizes the applications of Python for user interface and fuzzy sciences for decision making. Introduction of multiple sensors in a water distribution network generates a huge number of data matrices, which are sometimes highly complex, difficult to understand and convoluted for effective decision making. Therefore, the proposed system framework also intends to simplify the complexity of obtained sensor data matrices and to support decision making for water engineers through a soft computing framework. The target of this proposed research is to provide a simple and efficient method to identify and detect presence of contamination in a water distribution network using applications of CPS.
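
    To make the fuzzy decision-making side of such a framework concrete, a toy Python rule of the kind it describes (the membership shapes and thresholds are invented for illustration):

        import numpy as np

        def tri(x, a, b, c):
            # Triangular membership function rising on [a, b] and falling on [b, c].
            return float(np.clip(min((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0))

        def contamination_alert(turbidity_ntu, ph):
            high_turbidity = tri(turbidity_ntu, 3.0, 8.0, 13.0)
            bad_ph = max(tri(ph, 4.0, 5.5, 6.5), tri(ph, 8.5, 9.5, 11.0))
            # Rule: IF turbidity is high AND pH is out of range THEN alert (min as AND).
            return min(high_turbidity, bad_ph)

        print(contamination_alert(turbidity_ntu=9.0, ph=9.2))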

  17. Computer Networks A Systems Approach

    CERN Document Server

    Peterson, Larry L

    2011-01-01

    This best-selling and classic book teaches you the key principles of computer networks with examples drawn from the real world of network and protocol design. Using the Internet as the primary example, the authors explain various protocols and networking technologies. Their systems-oriented approach encourages you to think about how individual network components fit into a larger, complex system of interactions. Whatever your perspective, whether it be that of an application developer, network administrator, or a designer of network equipment or protocols, you will come away with a "big picture" ...

  18. Applications of soft computing in time series forecasting simulation and modeling techniques

    CERN Document Server

    Singh, Pritpal

    2016-01-01

    This book reports on an in-depth study of fuzzy time series (FTS) modeling. It reviews and summarizes previous research work in FTS modeling and also provides a brief introduction to other soft-computing techniques, such as artificial neural networks (ANNs), rough sets (RS) and evolutionary computing (EC), focusing on how these techniques can be integrated into different phases of the FTS modeling approach. In particular, the book describes novel methods resulting from the hybridization of FTS modeling approaches with neural networks and particle swarm optimization. It also demonstrates how a new ANN-based model can be successfully applied in the context of predicting Indian summer monsoon rainfall. Thanks to its easy-to-read style and the clear explanations of the models, the book can be used as a concise yet comprehensive reference guide to fuzzy time series modeling, and will be valuable not only for graduate students, but also for researchers and professionals working for academic, business and governmental ...

  19. A Study on Soft Computing Applications in I and C Systems of Nuclear Power Plant

    International Nuclear Information System (INIS)

    Kang, H. T.; Chung, H. Y.

    2006-01-01

    In the paper, the application of soft computing to nuclear power plants (NPP) is discussed. Soft computing techniques such as neural networks (NN), fuzzy logic controllers (FLC), and genetic algorithms (GA), and/or their hybrids, will be a new frontier for the development of instrument and control (I and C) systems in NPPs. The applications include several fields, for example, the diagnosis of system transients, optimal data selection in NNs, and intelligent control. Combining two or more of these techniques into a hybrid system is more efficient. The concepts of FLC, NN, and GA are presented in Section 2. The applications of soft computing used in NPPs are presented in Section 3

  20. Computational model of soft tissues in the human upper airway.

    Science.gov (United States)

    Pelteret, J-P V; Reddy, B D

    2012-01-01

    This paper presents a three-dimensional finite element model of the tongue and surrounding soft tissues with potential application to the study of sleep apnoea and of linguistics and speech therapy. The anatomical data was obtained from the Visible Human Project, and the underlying histological data was also extracted and incorporated into the model. Hyperelastic constitutive models were used to describe the material behaviour, and material incompressibility was accounted for. An active Hill three-element muscle model was used to represent the muscular tissue of the tongue. The neural stimulus for each muscle group was determined through the use of a genetic algorithm-based neural control model. The fundamental behaviour of the tongue under gravitational and breathing-induced loading is investigated. It is demonstrated that, when a time-dependent loading is applied to the tongue, the neural model is able to control the position of the tongue and produce a physiologically realistic response for the genioglossus.
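
    The active Hill-type element mentioned above is commonly written (one standard form, in LaTeX notation; the paper's exact parametrization may differ) as

        F_m = F_{\max} \left[ a(t)\, f_L(\ell_{CE})\, f_V(v_{CE}) + f_{PE}(\ell_{CE}) \right]

    where a(t) is the neural activation supplied by the control model, f_L and f_V are the force-length and force-velocity relations of the contractile element, and f_{PE} is the passive parallel-elastic contribution.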

  1. Microscope self-calibration based on micro laser line imaging and soft computing algorithms

    Science.gov (United States)

    Apolinar Muñoz Rodríguez, J.

    2018-06-01

    A technique to perform microscope self-calibration via micro laser line and soft computing algorithms is presented. In this technique, the microscope vision parameters are computed by means of soft computing algorithms based on laser line projection. To implement the self-calibration, a microscope vision system is constructed by means of a CCD camera and a 38 μm laser line. From this arrangement, the microscope vision parameters are represented via Bezier approximation networks, which are accomplished through the laser line position. In this procedure, a genetic algorithm determines the microscope vision parameters by means of laser line imaging. Also, the approximation networks compute the three-dimensional vision by means of the laser line position. Additionally, the soft computing algorithms re-calibrate the vision parameters when the microscope vision system is modified during the vision task. The proposed self-calibration improves accuracy of the traditional microscope calibration, which is accomplished via external references to the microscope system. The capability of the self-calibration based on soft computing algorithms is determined by means of the calibration accuracy and the micro-scale measurement error. This contribution is corroborated by an evaluation based on the accuracy of the traditional microscope calibration.
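
    A minimal Python sketch of the Bezier evaluation underlying such approximation networks (de Casteljau's algorithm; the control points are placeholders for values the genetic algorithm would fit from the laser line position):

        import numpy as np

        def bezier(t, control_points):
            # de Casteljau: repeatedly interpolate between neighbouring control points.
            pts = np.asarray(control_points, dtype=float)
            while len(pts) > 1:
                pts = (1 - t) * pts[:-1] + t * pts[1:]
            return pts[0]

        ctrl = [0.0, 1.8, 2.4, 3.1]      # placeholder mapping of line position to depth (mm)
        print(bezier(0.35, ctrl))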

  2. SOFT COMPUTING SINGLE HIDDEN LAYER MODELS FOR SHELF LIFE PREDICTION OF BURFI

    Directory of Open Access Journals (Sweden)

    Sumit Goyal

    2012-05-01

    Full Text Available Burfi is an extremely popular sweetmeat, which is prepared by desiccating standardized water buffalo milk. Soft computing feedforward single-layer models were developed for predicting the shelf life of burfi stored at 30°C. The data of the product relating to moisture, titratable acidity, free fatty acids, tyrosine, and peroxide value were used as input variables, and the overall acceptability score as the output variable. The results showed excellent agreement between the experimental and the predicted data, suggesting that the developed soft computing model can alternatively be used for predicting the shelf life of burfi.
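
    A sketch of such a single-hidden-layer feedforward predictor, assuming scikit-learn is available (the measurements and scores below are invented; the paper's data and network settings are not reproduced):

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        # Columns: moisture, titratable acidity, free fatty acids, tyrosine, peroxide value.
        X = np.array([[16.1, 0.32, 0.55, 42.0, 0.9],
                      [17.0, 0.41, 0.70, 55.0, 1.4],
                      [18.2, 0.55, 0.92, 71.0, 2.1]])
        y = np.array([8.1, 6.9, 5.2])    # overall acceptability scores (invented)

        model = MLPRegressor(hidden_layer_sizes=(5,), max_iter=5000, random_state=0)
        model.fit(X, y)
        print(model.predict(X[:1]))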

  3. Computational Modeling for Enhancing Soft Tissue Image Guided Surgery: An Application in Neurosurgery.

    Science.gov (United States)

    Miga, Michael I

    2016-01-01

    With the recent advances in computing, the opportunities to translate computational models to more integrated roles in patient treatment are expanding at an exciting rate. One area of considerable development has been directed towards correcting soft tissue deformation within image guided neurosurgery applications. This review captures the efforts that have been undertaken towards enhancing neuronavigation by the integration of soft tissue biomechanical models, imaging and sensing technologies, and algorithmic developments. In addition, the review speaks to the evolving role of modeling frameworks within surgery and concludes with some future directions beyond neurosurgical applications.

  4. SoftWAXS: a computational tool for modeling wide-angle X-ray solution scattering from biomolecules.

    Science.gov (United States)

    Bardhan, Jaydeep; Park, Sanghyun; Makowski, Lee

    2009-10-01

    This paper describes a computational approach to estimating wide-angle X-ray solution scattering (WAXS) from proteins, which has been implemented in a computer program called SoftWAXS. The accuracy and efficiency of SoftWAXS are analyzed for analytically solvable model problems as well as for proteins. Key features of the approach include a numerical procedure for performing the required spherical averaging and explicit representation of the solute-solvent boundary and the surface of the hydration layer. These features allow the Fourier transform of the excluded volume and hydration layer to be computed directly and with high accuracy. This approach will allow future investigation of different treatments of the electron density in the hydration shell. Numerical results illustrate the differences between this approach to modeling the excluded volume and a widely used model that treats the excluded-volume function as a sum of Gaussians representing the individual atomic excluded volumes. Comparison of the results obtained here with those from explicit-solvent molecular dynamics clarifies shortcomings inherent to the representation of solvent as a time-averaged electron-density profile. In addition, an assessment is made of how the calculated scattering patterns depend on input parameters such as the solute-atom radii, the width of the hydration shell and the hydration-layer contrast. These results suggest that obtaining predictive calculations of high-resolution WAXS patterns may require sophisticated treatments of solvent.
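
    For contrast with the numerical spherical averaging described above, the exact orientational average for a rigid set of point scatterers is the classical Debye formula; a direct O(N^2) Python sketch with dummy coordinates and form factors:

        import numpy as np

        def debye_intensity(q, coords, f):
            # I(q) = sum_ij f_i f_j sin(q r_ij) / (q r_ij); diagonal terms tend to f_i^2.
            r = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
            qr = q * r
            sinc = np.where(qr > 0, np.sin(qr) / np.where(qr > 0, qr, 1.0), 1.0)
            return float(f @ sinc @ f)

        rng = np.random.default_rng(4)
        coords = rng.normal(scale=10.0, size=(50, 3))   # dummy atomic positions (angstrom)
        f = np.ones(50)                                  # dummy form factors
        print(debye_intensity(0.25, coords, f))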

  5. Monitoring asthma control in children with allergies by soft computing of lung function and exhaled nitric oxide.

    Science.gov (United States)

    Pifferi, Massimo; Bush, Andrew; Pioggia, Giovanni; Di Cicco, Maria; Chinellato, Iolanda; Bodini, Alessandro; Macchia, Pierantonio; Boner, Attilio L

    2011-02-01

    Asthma control is emphasized by new guidelines but remains poor in many children. Evaluation of control relies on subjective patient recall and may be overestimated by health-care professionals. This study assessed the value of spirometry and fractional exhaled nitric oxide (FeNO) measurements, used alone or in combination, in models developed by a machine learning approach in the objective classification of asthma control according to Global Initiative for Asthma guidelines and tested the model in a second group of children with asthma. Fifty-three children with persistent atopic asthma underwent two to six evaluations of asthma control, including spirometry and FeNO. Soft computing evaluation was performed by means of artificial neural networks and principal component analysis. The model was then tested in a cross-sectional study in an additional 77 children with allergic asthma. The machine learning method was not able to distinguish different levels of control using either spirometry or FeNO values alone. However, their use in combination modeled by soft computing was able to discriminate levels of asthma control. In particular, the model is able to recognize all children with uncontrolled asthma and correctly identify 99.0% of children with totally controlled asthma. In the cross-sectional study, the model prospectively identified correctly all the uncontrolled children and 79.6% of the controlled children. Soft computing analysis of spirometry and FeNO allows objective categorization of asthma control status.
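
    In outline, the soft-computing pipeline combines dimensionality reduction with a neural classifier; a Python sketch assuming scikit-learn, with synthetic features and labels (not the study's data, and the authors' exact network differs):

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(5)
        X = rng.normal(size=(130, 6))      # spirometry indices plus FeNO per visit (synthetic)
        y = rng.integers(0, 3, 130)        # 0 uncontrolled, 1 partly, 2 totally controlled

        clf = make_pipeline(PCA(n_components=3),
                            MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000,
                                          random_state=0))
        clf.fit(X, y)
        print(clf.predict(X[:5]))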

  6. Computational approach to Riemann surfaces

    CERN Document Server

    Klein, Christian

    2011-01-01

    This volume offers a well-structured overview of existent computational approaches to Riemann surfaces and those currently in development. The authors of the contributions represent the groups providing publically available numerical codes in this field. Thus this volume illustrates which software tools are available and how they can be used in practice. In addition examples for solutions to partial differential equations and in surface theory are presented. The intended audience of this book is twofold. It can be used as a textbook for a graduate course in numerics of Riemann surfaces, in which case the standard undergraduate background, i.e., calculus and linear algebra, is required. In particular, no knowledge of the theory of Riemann surfaces is expected; the necessary background in this theory is contained in the Introduction chapter. At the same time, this book is also intended for specialists in geometry and mathematical physics applying the theory of Riemann surfaces in their research. It is the first...

  7. Fuzzy multiple linear regression: A computational approach

    Science.gov (United States)

    Juang, C. H.; Huang, X. H.; Fleming, J. W.

    1992-01-01

    This paper presents a new computational approach for performing fuzzy regression. In contrast to Bardossy's approach, the new approach, while dealing with fuzzy variables, closely follows the conventional regression technique. In this approach, treatment of fuzzy input is more 'computational' than 'symbolic.' The following sections first outline the formulation of the new approach, then deal with the implementation and computational scheme, and this is followed by examples to illustrate the new procedure.
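
    To give the flavour of treating fuzzy input 'computationally', one common simplification represents each fuzzy observation as a symmetric triangular number (center, spread) and fits both components by ordinary least squares. This is an illustrative sketch under that assumption, not the paper's specific formulation:

    ```python
    import numpy as np

    # fuzzy observations y ~ (center c, spread s) at crisp inputs X
    X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0]])
    c = np.array([2.1, 3.9, 6.2, 8.0, 9.8])        # centers
    s = np.array([0.3, 0.4, 0.5, 0.5, 0.6])        # half-widths

    A = np.hstack([np.ones((len(X), 1)), X])       # design matrix, intercept
    beta_c, *_ = np.linalg.lstsq(A, c, rcond=None) # regression on centers
    beta_s, *_ = np.linalg.lstsq(A, s, rcond=None) # regression on spreads

    x_new = np.array([1.0, 6.0])                   # [1, x] for x = 6
    print("predicted center:", x_new @ beta_c)
    print("predicted spread:", x_new @ beta_s)
    ```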

  8. Computational biomechanics for medicine new approaches and new applications

    CERN Document Server

    Miller, Karol; Wittek, Adam; Nielsen, Poul

    2015-01-01

    The Computational Biomechanics for Medicine titles provide an opportunity for specialists in computational biomechanics to present their latest methodologies and advancements. This volume comprises twelve of the newest approaches and applications of computational biomechanics, from researchers in Australia, New Zealand, USA, France, Spain and Switzerland. Some of the interesting topics discussed are: real-time simulations; growth and remodelling of soft tissues; inverse and meshless solutions; medical image analysis; and patient-specific solid mechanics simulations. One of the greatest challenges facing the computational engineering community is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. We hope the research presented within this book series will contribute to overcoming this grand challenge.

  9. Family planning and the labor sector: soft-sell approach.

    Science.gov (United States)

    Teston, R C

    1981-01-01

    Dr. Cesar T. San Pedro, the director of the company clinic at the Dole Philippines plantation in South Cotabato in Region 11, has been pressing the management to initiate a comprehensive family planning program for their 10,000 workers. San Pedro wants the Ministry of Labor and Employment (MOLE) to enforce its population program. The situation at Dole is one that requires an arbiter. Since 1977, there has not been a Population/Family Planning Officer (PFPO) for the area, and it is not possible to monitor closely whether the qualified firms are following the labor code and providing family planning services to their employees. Susan B. Dedel, executive director of the PFPO, has reported that the office has sought to endear its program to the private sector by showing that family planning is also profitable for the firm. This "soft-sell" approach has been the hallmark of the MOLE-PFPO since it began in 1975 as a joint project of the Commission on Population (POPCOM), the United Nations Fund for Population Activities (UNFPA), and the International Labor Organization (ILO). Some critics have argued that this liberal style of implementation is short-selling the program. They point out that the Labor Code of 1973 requires all establishments with at least 200 employees to have a free in-plant family planning program which includes clinic care, paid motivators, and volunteer population workers. The critics seem, at first glance, to have the statistics on their side. In its 5 years of operation, the PFPO has convinced only 137,000 workers to accept family planning. This is quite low, since of the 1.2 million employed by the covered firms, 800,000 are eligible for the MOLE program. Much of the weakness of the implementation is said to be due to the slow activation of the Labor-Management Coordinating Committees (LMCC). The critics maintain that because of the liberal enforcement of Department Order No. 9, the recalcitrant firms see no reason to comply. Dedel claims that the program is on the

  10. Interior spatial layout with soft objectives using evolutionary computation

    NARCIS (Netherlands)

    Chatzikonstantinou, I.; Bengisu, E.

    2016-01-01

    This paper presents the design problem of furniture arrangement in a residential interior living space, and addresses it by means of evolutionary computation. Interior arrangement is an important and interesting problem that occurs commonly when designing living spaces. It entails determining the

  11. Efficient Buffer Capacity and Scheduler Setting Computation for Soft Real-Time Stream Processing Applications

    NARCIS (Netherlands)

    Bekooij, Marco; Bekooij, Marco Jan Gerrit; Wiggers, M.H.; van Meerbergen, Jef

    2007-01-01

    Soft real-time applications that process data streams can often be intuitively described as dataflow process networks. In this paper we present a novel analysis technique to compute conservative estimates of the required buffer capacities in such process networks. With the same analysis technique

  12. Prediction of scour caused by 2D horizontal jets using soft computing techniques

    Directory of Open Access Journals (Sweden)

    Masoud Karbasi

    2017-12-01

    This paper presents the application of five soft-computing techniques, namely artificial neural networks, support vector regression, gene expression programming, the group method of data handling (GMDH) neural network and the adaptive-network-based fuzzy inference system, to predict the maximum scour hole depth downstream of a sluice gate. The input parameters affecting the scour depth are the sediment size and its gradation, apron length, sluice gate opening, jet Froude number and the tailwater depth. Six non-dimensional parameters were derived to define a functional relationship between the input and output variables. Published data from experimental studies were used. The results of the soft-computing techniques were compared with empirical and regression-based equations and proved superior to them. Comparison of the soft-computing techniques showed that the accuracy of the ANN model is higher than that of the other models (RMSE = 0.869). A new GEP-based equation was proposed.
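
    As a minimal sketch of one of the compared models, an ANN regressor can map the six non-dimensional parameters to relative scour depth; the random arrays below are placeholders for the published experimental data, and the network size is an assumption:

    ```python
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.neural_network import MLPRegressor
    from sklearn.metrics import mean_squared_error

    # X: six non-dimensional inputs (jet Froude number, relative apron
    # length, relative tailwater depth, sediment gradation, ...);
    # y: relative scour depth
    rng = np.random.default_rng(1)
    X, y = rng.random((200, 6)), rng.random(200)   # placeholder data

    model = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(10,),
                                       max_iter=5000, random_state=1))
    model.fit(X, y)
    print("training RMSE:", mean_squared_error(y, model.predict(X)) ** 0.5)
    ```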

  13. Development of Fuzzy Logic and Soft Computing Methodologies

    Science.gov (United States)

    Zadeh, L. A.; Yager, R.

    1999-01-01

    Our earlier research on computing with words (CW) has led to a new direction in fuzzy logic which points to a major enlargement of the role of natural languages in information processing, decision analysis and control. This direction is based on the methodology of computing with words and embodies a new theory which is referred to as the computational theory of perceptions (CTP). An important feature of this theory is that it can be added to any existing theory - especially to probability theory, decision analysis, and control - and enhance the ability of the theory to deal with real-world problems in which the decision-relevant information is a mixture of measurements and perceptions. The new direction is centered on an old concept - the concept of a perception - a concept which plays a central role in human cognition. The ability to reason with perceptions - perceptions of time, distance, force, direction, shape, intent, likelihood, truth and other attributes of physical and mental objects - underlies the remarkable human capability to perform a wide variety of physical and mental tasks without any measurements and any computations. Everyday examples of such tasks are parking a car, driving in city traffic, cooking a meal, playing golf and summarizing a story. Perceptions are intrinsically imprecise. Imprecision of perceptions reflects the finite ability of sensory organs and, ultimately, the brain to resolve detail and store information. More concretely, perceptions are both fuzzy and granular, or, for short, f-granular. Perceptions are f-granular in the sense that: (a) the boundaries of perceived classes are not sharply defined; and (b) the elements of classes are grouped into granules, with a granule being a clump of elements drawn together by indistinguishability, similarity, proximity or functionality. F-granularity of perceptions may be viewed as a human way of achieving data compression. In large measure, scientific progress has been, and continues to be

  14. Measurement of facial soft tissues thickness using 3D computed tomographic images

    International Nuclear Information System (INIS)

    Jeong, Ho Gul; Kim, Kee Deog; Shin, Dong Won; Hu, Kyung Seok; Lee, Jae Bum; Park, Hyok; Park, Chang Seo; Han, Seung Ho

    2006-01-01

    To evaluate the accuracy and reliability of a program to measure facial soft tissue thickness on 3D computed tomographic images, by comparison with direct measurement. One cadaver was scanned with a helical CT with 3 mm slice thickness and 3 mm/sec table speed. The acquired data were reconstructed with a 1.5 mm reconstruction interval and the images were transferred to a personal computer. The facial soft tissue thickness was measured using a newly developed program on the 3D images. For direct measurement, the cadaver was cut with a bone cutter and a ruler was placed above the cut side. The procedure was followed by taking pictures of the facial soft tissues with a high-resolution digital camera. The measurements were then made on the photographic images and repeated ten times. A repeated measures analysis of variance was adopted to compare and analyze the measurements resulting from the two different methods. Comparison according to the areas was analyzed by the Mann-Whitney test. There were no statistically significant differences between the direct measurements and those using the 3D images (p>0.05). There were statistical differences in the measurements at 17 points, but all the points except 2 showed a mean difference of 0.5 mm or less. The newly developed software program to measure facial soft tissue thickness using 3D images was accurate enough to allow easier measurement of facial soft tissue thickness in forensic science and anthropology.

  15. Measurement of facial soft tissues thickness using 3D computed tomographic images

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Ho Gul; Kim, Kee Deog; Shin, Dong Won; Hu, Kyung Seok; Lee, Jae Bum; Park, Hyok; Park, Chang Seo [Yonsei Univ. Hospital, Seoul (Korea, Republic of); Han, Seung Ho [Catholic Univ. of Korea, Seoul (Korea, Republic of)

    2006-03-15

    To evaluate the accuracy and reliability of a program to measure facial soft tissue thickness on 3D computed tomographic images, by comparison with direct measurement. One cadaver was scanned with a helical CT with 3 mm slice thickness and 3 mm/sec table speed. The acquired data were reconstructed with a 1.5 mm reconstruction interval and the images were transferred to a personal computer. The facial soft tissue thickness was measured using a newly developed program on the 3D images. For direct measurement, the cadaver was cut with a bone cutter and a ruler was placed above the cut side. The procedure was followed by taking pictures of the facial soft tissues with a high-resolution digital camera. The measurements were then made on the photographic images and repeated ten times. A repeated measures analysis of variance was adopted to compare and analyze the measurements resulting from the two different methods. Comparison according to the areas was analyzed by the Mann-Whitney test. There were no statistically significant differences between the direct measurements and those using the 3D images (p>0.05). There were statistical differences in the measurements at 17 points, but all the points except 2 showed a mean difference of 0.5 mm or less. The newly developed software program to measure facial soft tissue thickness using 3D images was accurate enough to allow easier measurement of facial soft tissue thickness in forensic science and anthropology.

  16. A Case for Soft Error Detection and Correction in Computational Chemistry.

    Science.gov (United States)

    van Dam, Hubertus J J; Vishnu, Abhinav; de Jong, Wibe A

    2013-09-10

    High performance computing platforms are expected to deliver 10^18 floating-point operations per second by the year 2022 through the deployment of millions of cores. Even if every core is highly reliable, the sheer number of them means that the mean time between failures will become so short that most application runs will suffer at least one fault. In particular, soft errors caused by intermittent incorrect behavior of the hardware are a concern, as they lead to silent data corruption. In this paper we investigate the impact of soft errors on optimization algorithms, using Hartree-Fock as a particular example. Optimization algorithms iteratively reduce the error in the initial guess to reach the intended solution. Therefore they may intuitively appear to be resilient to soft errors. Our results show that this is true for soft errors of small magnitudes but not for large errors. We suggest error detection and correction mechanisms for different classes of data structures. The results obtained with these mechanisms indicate that we can correct more than 95% of the soft errors at moderate increases in the computational cost.
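
    A generic flavour of such mechanisms is checksum protection of matrix data: a stored row of column sums reveals a silently corrupted element, and can restore it once the element is located. This is a textbook checksum illustration, not the paper's exact scheme:

    ```python
    import numpy as np

    def add_checksums(A):
        """Append a row of column sums to detect later corruption."""
        return np.vstack([A, A.sum(axis=0)])

    def find_corrupt_columns(Ac, tol=1e-8):
        body, stored = Ac[:-1], Ac[-1]
        return np.where(np.abs(body.sum(axis=0) - stored) > tol)[0]

    A = np.arange(12.0).reshape(3, 4)
    Ac = add_checksums(A)
    Ac[1, 2] += 0.5                        # inject a silent error
    bad = find_corrupt_columns(Ac)
    print("corrupted columns:", bad)       # -> [2]
    # correction: rebuild the element from the stored column sum
    # (locating the row in general also needs row checksums; here row 1
    # is known from the injection)
    Ac[1, bad[0]] = Ac[-1, bad[0]] - (Ac[:-1, bad[0]].sum() - Ac[1, bad[0]])
    print(np.allclose(Ac[:-1], A))         # True: error corrected
    ```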

  17. Soft condensed matter approach to cooking of meat

    NARCIS (Netherlands)

    Sman, van der R.G.M.

    2007-01-01

    We have viewed the cooking of meat from the perspective of soft condensed matter physics and posit that the moisture transport during cooking can be described by the Flory-Rehner theory of swelling/shrinking polymer gels. This theory contains the essential physics to describe the transport of liquid moisture due to

  18. The Soft Constraints Hypothesis: A Rational Analysis Approach to Resource Allocation for Interactive Behavior

    National Research Council Canada - National Science Library

    Gray, Wayne D; Sims, Chris R; Schoelles, Michael J; Fu, Wai-Tat

    2006-01-01

    Soft constraints hypothesis (SCH) is a rational analysis approach that holds that the mixture of perceptual-motor and cognitive resources allocated for interactive behavior is adjusted based on temporal cost-benefit tradeoff...

  19. Computed Tomography and Magnetic Resonance Imaging of Myoepithelioma in the Soft Palate: A Case Report

    International Nuclear Information System (INIS)

    Lim, Hun Cheol; Yu, In Kyu; Park, Mi Ja; Jang, Dong Sik

    2011-01-01

    We report the appearance of a myoepithelioma arising from the minor salivary glands in the soft palate, as observed on computed tomography (CT) and magnetic resonance imaging (MRI). On CT, the tumor was round with a smooth, partially lobulated contour and slight marginal contrast enhancement. On T1-weighted images, the mass had heterogeneous iso-signal intensity compared to the pharyngeal muscle. Additionally, the tumor had heterogeneously high T2 signal intensity with heterogeneously strong enhancement on the Gd-enhanced T1-weighted image. Radiologists should consider myoepithelioma in the radiological differential diagnosis of soft palate tumors.

  20. Determining flexor-tendon repair techniques via soft computing

    Science.gov (United States)

    Johnson, M.; Firoozbakhsh, K.; Moniem, M.; Jamshidi, M.

    2001-01-01

    An SC-based multi-objective decision-making method for determining the optimal flexor-tendon repair technique from experimental and clinical survey data, and with variable circumstances, was presented. Results were compared with those from the Taguchi method. Using the Taguchi method results in the need to perform ad-hoc decisions when the outcomes for individual objectives are contradictory to a particular preference or circumstance, whereas the SC-based multi-objective technique provides a rigorous straightforward computational process in which changing preferences and importance of differing objectives are easily accommodated. Also, adding more objectives is straightforward and easily accomplished. The use of fuzzy-set representations of information categories provides insight into their performance throughout the range of their universe of discourse. The ability of the technique to provide a "best" medical decision given a particular physician, hospital, patient, situation, and other criteria was also demonstrated.
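
    A minimal sketch of the SC-style multi-objective selection: fuzzy scores for each candidate technique are aggregated with preference weights, and shifting preferences or adding objectives is just a change to the score matrix or the weight vector. All labels, scores and weights below are illustrative, not the paper's data:

    ```python
    import numpy as np

    # rows: candidate repair techniques (hypothetical labels); columns:
    # fuzzy scores in [0, 1] for objectives such as repair strength,
    # gap resistance and ease of use
    techniques = ["2-strand", "4-strand", "6-strand"]
    scores = np.array([[0.6, 0.5, 0.9],
                       [0.8, 0.7, 0.6],
                       [0.9, 0.9, 0.3]])
    weights = np.array([0.5, 0.3, 0.2])    # importance of each objective

    overall = scores @ weights             # weighted-average aggregation
    best = techniques[int(np.argmax(overall))]
    print(dict(zip(techniques, overall.round(3))), "->", best)
    ```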

  1. 3D-SoftChip: A Novel Architecture for Next-Generation Adaptive Computing Systems

    Directory of Open Access Journals (Sweden)

    Lee Mike Myung-Ok

    2006-01-01

    This paper introduces a novel architecture for next-generation adaptive computing systems, which we term 3D-SoftChip. The 3D-SoftChip is a 3-dimensional (3D) vertically integrated adaptive computing system combining state-of-the-art processing and 3D interconnection technology. It comprises the vertical integration of two chips (a configurable array processor and an intelligent configurable switch) through an indium bump interconnection array (IBIA). The configurable array processor (CAP) is an array of heterogeneous processing elements (PEs), while the intelligent configurable switch (ICS) comprises a switch block, 32-bit dedicated RISC processor for control, on-chip program/data memory, data frame buffer, along with a direct memory access (DMA) controller. This paper introduces the novel 3D-SoftChip architecture for real-time communication and multimedia signal processing as a next-generation computing system. The paper further describes the advanced HW/SW codesign and verification methodology, including high-level system modeling of the 3D-SoftChip using SystemC, being used to determine the optimum hardware specification in the early design stage.

  2. Computer Architecture A Quantitative Approach

    CERN Document Server

    Hennessy, John L

    2011-01-01

    The computing world today is in the middle of a revolution: mobile clients and cloud computing have emerged as the dominant paradigms driving programming and hardware innovation today. The Fifth Edition of Computer Architecture focuses on this dramatic shift, exploring the ways in which software and technology in the cloud are accessed by cell phones, tablets, laptops, and other mobile computing devices. Each chapter includes two real-world examples, one mobile and one datacenter, to illustrate this revolutionary change.Updated to cover the mobile computing revolutionEmphasizes the two most im

  3. Detection of Failure in Asynchronous Motor Using Soft Computing Method

    Science.gov (United States)

    Vinoth Kumar, K.; Sony, Kevin; Achenkunju John, Alan; Kuriakose, Anto; John, Ano P.

    2018-04-01

    This paper investigates stator short winding failures of asynchronous motors and their effects on the motor current spectrum. A fuzzy logic approach, i.e., a model-based technique, can help detect asynchronous motor failure: fuzzy logic resembles human reasoning in that it enables linguistic inference from vague data. A dynamic model of the asynchronous motor is developed by means of a fuzzy logic classifier to investigate stator inter-turn failure as well as open-phase failure. A hardware implementation was carried out with LabVIEW for the online monitoring of faults.
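
    A toy version of such a fuzzy classifier is sketched below: triangular memberships over two current-spectrum features feed a single min (AND) rule for the degree of fault. The feature names and thresholds are assumptions for illustration only:

    ```python
    def tri(x, a, b, c):
        """Triangular membership function with support [a, c], peak at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def fault_degree(neg_seq_current, current_unbalance):
        """Degree of stator inter-turn fault from two per-unit features."""
        high_i2 = tri(neg_seq_current, 0.02, 0.10, 0.20)
        high_unb = tri(current_unbalance, 0.05, 0.15, 0.30)
        # rule: IF I2 is high AND unbalance is high THEN fault is likely
        return min(high_i2, high_unb)

    print(fault_degree(0.01, 0.04))   # healthy machine -> 0.0
    print(fault_degree(0.10, 0.15))   # faulty machine  -> 1.0
    ```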

  4. The hierarchical expert tuning of PID controllers using tools of soft computing.

    Science.gov (United States)

    Karray, F; Gueaieb, W; Al-Sharhan, S

    2002-01-01

    We present soft computing-based results pertaining to the hierarchical tuning process of PID controllers located within the control loop of a class of nonlinear systems. The results are compared with PID controllers implemented either in a stand-alone scheme or as part of a conventional gain scheduling structure. This work is motivated by the increasing need in industry to design highly reliable and efficient controllers for dealing with the regulation and tracking capabilities of complex processes characterized by nonlinearities and possibly time-varying parameters. The soft computing-based controllers proposed are hybrid in nature in that they integrate, within a well-defined hierarchical structure, the benefits of hard algorithmic controllers with those having supervisory capabilities. The controllers proposed also have the distinct features of learning and auto-tuning without the need for tedious and computationally extensive online system identification schemes.
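
    A stripped-down sketch of the hierarchical idea: a supervisory fuzzy layer rescales the proportional gain of an ordinary PID step according to the error magnitude. A single illustrative rule stands in for what would be a full supervisory rule base:

    ```python
    def fuzzy_gain(error, base_kp=1.0):
        """Supervisory layer: raise Kp as |error| becomes 'large'."""
        e = abs(error)
        large = min(max((e - 0.2) / 0.8, 0.0), 1.0)   # ramp membership
        return base_kp * (1.0 + large)                # up to 2x base gain

    def pid_step(error, integ, prev_err, dt, ki=0.1, kd=0.05):
        """One PID update with the fuzzily scheduled proportional gain."""
        kp = fuzzy_gain(error)
        integ += error * dt
        deriv = (error - prev_err) / dt
        return kp * error + ki * integ + kd * deriv, integ

    u, integ = pid_step(error=0.9, integ=0.0, prev_err=1.0, dt=0.01)
    print(u)
    ```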

  5. A multi-modal approach to soft systems methodology

    OpenAIRE

    Bergvall-Kåreborn, Birgitta

    2002-01-01

    The main aim of my research is to explore ways of enriching Soft Systems Methodology by developing intellectual tools that can help designers to conceptualise, create and evaluate different design alternatives. This directs the focus on the methodology’s modelling phase even though some ideas related to analysis also will be presented. In order to realize this objective the study proposes the following supplements. Firstly, a framework of 15 modalities (knowledge areas) is suggested as a supp...

  6. Clinical usefulness of facial soft tissues thickness measurement using 3D computed tomographic images

    International Nuclear Information System (INIS)

    Jeong, Ho Gul; Kim, Kee Deog; Hu, Kyung Seok; Lee, Jae Bum; Park, Hyok; Han, Seung Ho; Choi, Seong Ho; Kim, Chong Kwan; Park, Chang Seo

    2006-01-01

    To evaluate the clinical usefulness of facial soft tissue thickness measurement using 3D computed tomographic images. One cadaver with sound facial soft tissues was chosen for the study. The cadaver was scanned with a helical CT under the following scanning protocols for slice thickness and table speed: 3 mm and 3 mm/sec, 5 mm and 5 mm/sec, 7 mm and 7 mm/sec. The acquired data were reconstructed with 1.5, 2.5 and 3.5 mm reconstruction intervals, respectively, and the images were transferred to a personal computer. Using a program developed to measure facial soft tissue thickness on 3D images, the facial soft tissue thickness was measured. After the measurement was repeated ten times, a repeated measures analysis of variance (ANOVA) was adopted to compare and analyze the measurements from the three scanning protocols. Comparison according to the areas was analyzed by the Mann-Whitney test. There were no statistically significant intraobserver differences in the measurements of the facial soft tissue thickness using the three scanning protocols (p>0.05). There were no statistically significant differences between measurements at the 3 mm slice thickness and those at the 5 mm and 7 mm slice thicknesses (p>0.05). There were statistical differences at 14 of the total 30 measured points for the 5 mm slice thickness and at 22 for the 7 mm slice thickness. Facial soft tissue thickness measurement using 3D images of 7 mm slice thickness is clinically acceptable, but 5 mm slice thickness is recommended for more accurate measurement.

  7. Inference of Cancer-specific Gene Regulatory Networks Using Soft Computing Rules

    Directory of Open Access Journals (Sweden)

    Xiaosheng Wang

    2010-03-01

    Perturbations of gene regulatory networks are essentially responsible for oncogenesis. Therefore, inferring the gene regulatory networks is a key step to overcoming cancer. In this work, we propose a method for inferring directed gene regulatory networks based on soft computing rules, which can identify important cause-effect regulatory relations of gene expression. First, we identify important genes associated with a specific cancer (colon cancer) using a supervised learning approach. Next, we reconstruct the gene regulatory networks by inferring the regulatory relations among the identified genes, and their regulated relations by other genes within the genome. We obtain two meaningful findings. One is that upregulated genes are regulated by more genes than downregulated ones, while downregulated genes regulate more genes than upregulated ones. The other is that tumor suppressors suppress tumor activators and activate other tumor suppressors strongly, while tumor activators activate other tumor activators and suppress tumor suppressors weakly, indicating the robustness of biological systems. These findings provide valuable insights into the pathogenesis of cancer.

  8. Wind turbine power coefficient estimation by soft computing methodologies: Comparative study

    International Nuclear Information System (INIS)

    Shamshirband, Shahaboddin; Petković, Dalibor; Saboohi, Hadi; Anuar, Nor Badrul; Inayat, Irum; Akib, Shatirah; Ćojbašić, Žarko; Nikolić, Vlastimir; Mat Kiah, Miss Laiha; Gani, Abdullah

    2014-01-01

    Highlights: • Variable speed operation of wind turbine to increase power generation. • Changeability and fluctuation of wind has to be accounted for. • To build an effective prediction model of the wind turbine power coefficient. • The impact of the variation in the blade pitch angle and tip speed ratio. • Support vector regression applied as the predictive methodology. - Abstract: Wind energy has become a large contender of traditional fossil fuel energy, particularly with the successful operation of multi-megawatt sized wind turbines. However, reasonable wind speed is not adequately sustainable everywhere to build an economical wind farm. In wind energy conversion systems, one of the operational problems is the changeability and fluctuation of wind. In most cases, wind speed can vacillate rapidly. Hence, the quality of produced energy becomes an important problem in wind energy conversion plants. Several control techniques have been applied to improve the quality of power generated from wind turbines. In this study, the polynomial and radial basis function (RBF) kernels are applied as the kernel function of support vector regression (SVR) to estimate the optimal power coefficient value of the wind turbines. Instead of minimizing the observed training error, SVR_poly and SVR_rbf attempt to minimize the generalization error bound so as to achieve generalized performance. The experimental results show that an improvement in predictive accuracy and capability of generalization can be achieved by the SVR approach compared to other soft computing methodologies.
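
    A minimal scikit-learn sketch of the SVR_poly versus SVR_rbf comparison; the (blade pitch angle, tip speed ratio) inputs and the power coefficient targets below are random placeholders for the turbine data:

    ```python
    import numpy as np
    from sklearn.svm import SVR

    rng = np.random.default_rng(2)
    X = np.column_stack([rng.uniform(0, 20, 300),    # pitch angle [deg]
                         rng.uniform(2, 12, 300)])   # tip speed ratio
    cp = rng.uniform(0.1, 0.48, 300)                 # placeholder Cp values

    svr_poly = SVR(kernel="poly", degree=3, C=10.0).fit(X, cp)
    svr_rbf = SVR(kernel="rbf", gamma="scale", C=10.0).fit(X, cp)

    query = np.array([[5.0, 8.0]])                   # pitch 5 deg, TSR 8
    print(svr_poly.predict(query), svr_rbf.predict(query))
    ```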

  9. An appraisal of wind speed distribution prediction by soft computing methodologies: A comparative study

    International Nuclear Information System (INIS)

    Petković, Dalibor; Shamshirband, Shahaboddin; Anuar, Nor Badrul; Saboohi, Hadi; Abdul Wahab, Ainuddin Wahid; Protić, Milan; Zalnezhad, Erfan; Mirhashemi, Seyed Mohammad Amin

    2014-01-01

    Highlights: • Probabilistic distribution functions of wind speed. • Two-parameter Weibull probability distribution. • To build an effective prediction model of the distribution of wind speed. • Support vector regression application as probability function for wind speed. - Abstract: The probabilistic distribution of wind speed is among the more significant wind characteristics in examining wind energy potential and the performance of wind energy conversion systems. When the wind speed probability distribution is known, the wind energy distribution can be easily obtained. Therefore, the probability distribution of wind speed is a very important piece of information required in assessing wind energy potential. For this reason, a large number of studies have been established concerning the use of a variety of probability density functions to describe wind speed frequency distributions. Although the two-parameter Weibull distribution comprises a widely used and accepted method, solving the function is very challenging. In this study, the polynomial and radial basis function (RBF) kernels are applied as the kernel function of support vector regression (SVR) to estimate the two parameters of the Weibull distribution function according to previously established analytical methods. Rather than minimizing the observed training error, SVR_poly and SVR_rbf attempt to minimize the generalization error bound, so as to achieve generalized performance. According to the experimental results, enhanced predictive accuracy and capability of generalization can be achieved using the SVR approach compared to other soft computing methodologies.
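
    For reference, once wind speed samples are in hand the two Weibull parameters can also be fitted directly, e.g. with scipy; in the paper the SVR machinery replaces this fitting step, not the distribution itself. The synthetic speeds below stand in for measured data:

    ```python
    import math
    from scipy import stats

    # synthetic hourly wind speeds [m/s]; weibull_min with floc=0 is the
    # standard two-parameter form used in wind resource assessment
    speeds = stats.weibull_min.rvs(c=2.0, scale=7.0, size=5000, random_state=3)
    k, _, c = stats.weibull_min.fit(speeds, floc=0)
    print(f"shape k = {k:.2f}, scale c = {c:.2f} m/s")  # close to 2.0, 7.0

    # with k and c known, the wind speed density is explicit
    v = 8.0
    pdf = (k / c) * (v / c) ** (k - 1) * math.exp(-((v / c) ** k))
    print(pdf, stats.weibull_min.pdf(v, k, scale=c))    # the two agree
    ```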

  10. Inference of cancer-specific gene regulatory networks using soft computing rules.

    Science.gov (United States)

    Wang, Xiaosheng; Gotoh, Osamu

    2010-03-24

    Perturbations of gene regulatory networks are essentially responsible for oncogenesis. Therefore, inferring the gene regulatory networks is a key step to overcoming cancer. In this work, we propose a method for inferring directed gene regulatory networks based on soft computing rules, which can identify important cause-effect regulatory relations of gene expression. First, we identify important genes associated with a specific cancer (colon cancer) using a supervised learning approach. Next, we reconstruct the gene regulatory networks by inferring the regulatory relations among the identified genes, and their regulated relations by other genes within the genome. We obtain two meaningful findings. One is that upregulated genes are regulated by more genes than downregulated ones, while downregulated genes regulate more genes than upregulated ones. The other one is that tumor suppressors suppress tumor activators and activate other tumor suppressors strongly, while tumor activators activate other tumor activators and suppress tumor suppressors weakly, indicating the robustness of biological systems. These findings provide valuable insights into the pathogenesis of cancer.

  11. Development of Semi-Automatic Lathe by using Intelligent Soft Computing Technique

    Science.gov (United States)

    Sakthi, S.; Niresh, J.; Vignesh, K.; Anand Raj, G.

    2018-03-01

    This paper discusses the enhancement of a conventional lathe machine to a semi-automated lathe machine by implementing a soft computing method. In the present scenario, the lathe machine plays a vital role in the engineering division of the manufacturing industry. While manual lathe machines are economical, their accuracy and efficiency are not up to the mark. On the other hand, CNC machines provide the desired accuracy and efficiency, but require a huge capital investment. To overcome this situation, a semi-automated approach to the conventional lathe machine is developed by fitting stepper motors to the horizontal and vertical drives, controlled by an Arduino UNO microcontroller. Based on the input parameters of the lathe operation, the Arduino code is generated and transferred to the UNO board. Thus, upgrading from manual to semi-automatic lathe machines can significantly increase accuracy and efficiency while, at the same time, keeping a check on investment cost, and consequently provide a much needed escalation to the manufacturing industry.
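
    A hypothetical host-side sketch of the workflow described: the operation parameters are converted to step counts and sent to the UNO board over USB serial using pyserial. The port name, baud rate, "S<axis><steps>" command format and steps-per-mm figure are all assumptions, not the paper's actual protocol:

    ```python
    import serial  # pyserial

    STEPS_PER_MM = 80  # depends on leadscrew pitch and motor (assumed)

    def send_cut(port, axis, travel_mm):
        """Convert a travel distance to steps and send it to the Arduino."""
        steps = int(travel_mm * STEPS_PER_MM)
        with serial.Serial(port, 9600, timeout=2) as ser:
            ser.write(f"S{axis}{steps}\n".encode())
            return ser.readline().decode().strip()   # e.g. an "OK" reply

    print(send_cut("/dev/ttyUSB0", "X", 12.5))
    ```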

  12. Modeling rainfall-runoff process using soft computing techniques

    Science.gov (United States)

    Kisi, Ozgur; Shiri, Jalal; Tombul, Mustafa

    2013-02-01

    Rainfall-runoff process was modeled for a small catchment in Turkey, using 4 years (1987-1991) of measurements of independent variables of rainfall and runoff values. The models used in the study were Artificial Neural Networks (ANNs), Adaptive Neuro-Fuzzy Inference System (ANFIS) and Gene Expression Programming (GEP) which are Artificial Intelligence (AI) approaches. The applied models were trained and tested using various combinations of the independent variables. The goodness of fit for the model was evaluated in terms of the coefficient of determination (R2), root mean square error (RMSE), mean absolute error (MAE), coefficient of efficiency (CE) and scatter index (SI). A comparison was also made between these models and traditional Multi Linear Regression (MLR) model. The study provides evidence that GEP (with RMSE=17.82 l/s, MAE=6.61 l/s, CE=0.72 and R2=0.978) is capable of modeling rainfall-runoff process and is a viable alternative to other applied artificial intelligence and MLR time-series methods.
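
    The goodness-of-fit measures used to rank the models are easy to compute directly; a small helper is sketched below, with placeholder observed and simulated runoff values:

    ```python
    import numpy as np

    def goodness_of_fit(obs, sim):
        """RMSE, MAE, Nash-Sutcliffe coefficient of efficiency (CE) and
        scatter index (SI) for observed vs simulated runoff."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        err = sim - obs
        rmse = np.sqrt(np.mean(err ** 2))
        mae = np.mean(np.abs(err))
        ce = 1.0 - np.sum(err ** 2) / np.sum((obs - obs.mean()) ** 2)
        si = rmse / obs.mean()
        return rmse, mae, ce, si

    obs = [12.0, 30.0, 55.0, 21.0, 8.0]    # observed runoff [l/s]
    sim = [14.0, 27.0, 58.0, 19.0, 9.0]    # model output [l/s]
    print(goodness_of_fit(obs, sim))
    ```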

  13. A Novel Approach to Determine the Prevalence of Type of Soft Palate Using Digital Intraoral Impression

    Directory of Open Access Journals (Sweden)

    Saurabh Chaturvedi

    2017-01-01

    Aim. To determine the prevalence of type of soft palate in the targeted population. Materials and Methods. The study was conducted using computer technology in dentistry: an intraoral digital scanner and a 3D analysis software tool. 100 patients selected from the outpatient clinics were divided into two groups based on the ages of 20–40 years and 41–60 years, with an equal ratio of males and females. Each selected patient's maxillary arch was scanned with the intraoral scanner; the images so obtained were sectioned in anteroposterior cross section and, with the 3D analysis software, the angulation between the hard and soft palate was determined. Results. The prevalence of type II soft palate (angulation between hard and soft palate between 10 and 45 degrees) was highest: 60% in group 1 and 44% in group 2. The difference between genders was statistically significant, with a p value <0.05 in both groups, although females had higher angulation compared to males in all classes of both groups. Conclusions. In the targeted population of Aseer Province, Saudi Arabia, type II soft palate was more common, with higher soft palate angulation among females. Advanced age had no effect on the type of soft palate in the region.

  14. A Novel Approach to Determine the Prevalence of Type of Soft Palate Using Digital Intraoral Impression

    Science.gov (United States)

    Khaled Addas, Mohamed; Al Humaidi, Abdullah Saad Ali; Al Qahtani, Abdulrazaq Mohammed; Al Qahtani, Mubarak Daghash

    2017-01-01

    Aim To determine the prevalence of type of soft palate in the targeted population. Materials and Methods The study was conducted using computer technology in dentistry: an intraoral digital scanner and a 3D analysis software tool. 100 patients selected from the outpatient clinics were divided into two groups based on the ages of 20–40 years and 41–60 years, with an equal ratio of males and females. Each selected patient's maxillary arch was scanned with the intraoral scanner; the images so obtained were sectioned in anteroposterior cross section and, with the 3D analysis software, the angulation between the hard and soft palate was determined. Results The prevalence of type II soft palate (angulation between hard and soft palate between 10 and 45 degrees) was highest: 60% in group 1 and 44% in group 2. The difference between genders was statistically significant, with a p value <0.05 in both groups, although females had higher angulation compared to males in all classes of both groups. Conclusions In the targeted population of Aseer Province, Saudi Arabia, type II soft palate was more common, with higher soft palate angulation among females. Advanced age had no effect on the type of soft palate in the region. PMID:28951740

  15. MANAGEMENT APPROACH BETWEEN BUSINESS CLUSTER SUCCESS AND SOFT LEADER CHARACTERISTICS

    Directory of Open Access Journals (Sweden)

    Robert Lippert

    2014-05-01

    One of the potential avenues of economic growth lies in furthering the development of business clusters. By linking the complementary competencies of profit-oriented enterprises, NGOs, universities, research institutes and local authorities, innovation potential and productivity are significantly increased. The present study investigates a specific and challenging managerial activity, the role of the cluster manager. The aim of the research is to reveal the intrinsic motivation of cluster operations and to demonstrate the importance of the manager in efficient and sustainable operation. An empirical study has been conducted involving cluster managers and member organisations through an extensive questionnaire survey in Hungary. First, determinant factors of cluster success were identified. Using these factors, namely the operational activity of the cluster as well as the satisfaction of the members in the fields of innovation and productivity, a new continuous three-dimensional maturity model has been introduced to evaluate cluster success. Mapping the soft factors, organisational culture and leadership roles have been assessed by applying the Competing Values Framework method. The results of the research depict the correlation found between soft leader characteristics and cluster success.

  16. A soft double regularization approach to parametric blind image deconvolution.

    Science.gov (United States)

    Chen, Li; Yap, Kim-Hui

    2005-05-01

    This paper proposes a blind image deconvolution scheme based on soft integration of parametric blur structures. Conventional blind image deconvolution methods encounter a difficult dilemma of either imposing stringent and inflexible preconditions on the problem formulation or experiencing poor restoration results due to lack of information. This paper attempts to address this issue by assessing the relevance of parametric blur information, and incorporating the knowledge into the parametric double regularization (PDR) scheme. The PDR method assumes that the actual blur satisfies up to a certain degree of parametric structure, as there are many well-known parametric blurs in practical applications. Further, it can be tailored flexibly to include other blur types if some prior parametric knowledge of the blur is available. A manifold soft parametric modeling technique is proposed to generate the blur manifolds, and estimate the fuzzy blur structure. The PDR scheme involves the development of the meaningful cost function, the estimation of blur support and structure, and the optimization of the cost function. Experimental results show that it is effective in restoring degraded images under different environments.
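
    For background, the single-regularizer baseline that double-regularization schemes build on is classical frequency-domain deconvolution with a quadratic penalty. The toy impulse image and box PSF below are illustrative, and this is not the PDR method itself:

    ```python
    import numpy as np

    def tikhonov_deconvolve(blurred, psf, alpha=1e-2):
        """Frequency-domain deconvolution with one quadratic regularizer:
        F = conj(H) G / (|H|^2 + alpha)."""
        H = np.fft.fft2(psf, s=blurred.shape)
        G = np.fft.fft2(blurred)
        return np.real(np.fft.ifft2(np.conj(H) * G / (np.abs(H) ** 2 + alpha)))

    # blur an impulse with a 3x3 box PSF (circular convolution), then restore
    img = np.zeros((32, 32))
    img[16, 16] = 1.0
    psf = np.ones((3, 3)) / 9.0
    blurred = np.real(np.fft.ifft2(np.fft.fft2(img)
                                   * np.fft.fft2(psf, s=img.shape)))
    restored = tikhonov_deconvolve(blurred, psf)
    print(np.unravel_index(restored.argmax(), restored.shape))  # (16, 16)
    ```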

  17. A new ChainMail approach for real-time soft tissue simulation.

    Science.gov (United States)

    Zhang, Jinao; Zhong, Yongmin; Smith, Julian; Gu, Chengfan

    2016-07-03

    This paper presents a new ChainMail method for real-time soft tissue simulation. This method enables the use of different material properties for chain elements to accommodate various materials. Based on the ChainMail bounding region, a new time-saving scheme is developed to improve computational efficiency for isotropic materials. The proposed method also conserves volume and strain energy. Experimental results demonstrate that the proposed ChainMail method can not only accommodate isotropic, anisotropic and heterogeneous materials but also model incompressibility and relaxation behaviors of soft tissues. Further, the proposed method can achieve real-time computational performance.
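
    The defining ChainMail mechanic is that a displaced element drags its neighbours only when inter-element link bounds are violated, so propagation stops early inside the bounding region. A one-dimensional toy sketch is given below, with per-element bounds standing in for different material properties; the paper's method is three-dimensional and additionally conserves volume and strain energy:

    ```python
    def chainmail_1d(x, moved, new_pos, lo, hi):
        """Move one node and propagate rightwards until the per-link
        min/max length bounds (lo, hi) are all satisfied."""
        x = list(x)
        x[moved] = new_pos
        for i in range(moved + 1, len(x)):
            gap = x[i] - x[i - 1]
            if gap < lo[i - 1]:
                x[i] = x[i - 1] + lo[i - 1]
            elif gap > hi[i - 1]:
                x[i] = x[i - 1] + hi[i - 1]
            else:
                break                      # bounds satisfied: stop early
        return x

    x = [0.0, 1.0, 2.0, 3.0]
    print(chainmail_1d(x, 0, 1.5, lo=[0.5] * 3, hi=[1.5] * 3))
    # -> [1.5, 2.0, 2.5, 3.0]: the disturbance dies out at the last link
    ```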

  18. Proceedings of the International Conference on Soft Computing for Problem Solving

    CERN Document Server

    Nagar, Atulya; Pant, Millie; Bansal, Jagdish

    2012-01-01

    The present book is based on the research papers presented in the International Conference on Soft Computing for Problem Solving (SocProS 2011), held at Roorkee, India. This book is divided into two volumes and covers a variety of topics, including mathematical modeling, image processing, optimization, swarm intelligence, evolutionary algorithms, fuzzy logic, neural networks, forecasting, data mining etc. Particular emphasis is laid on Soft Computing and its application to diverse fields. The prime objective of the book is to familiarize the reader with the latest scientific developments that are taking place in various fields and the latest sophisticated problem solving tools that are being developed to deal with the complex and intricate problems that are otherwise difficult to solve by the usual and traditional methods. The book is directed to the researchers and scientists engaged in various fields of Science and Technology.

  19. Soft drink effects on sensorimotor rhythm brain computer interface performance and resting-state spectral power.

    Science.gov (United States)

    Mundahl, John; Jianjun Meng; He, Jeffrey; Bin He

    2016-08-01

    Brain-computer interface (BCI) systems allow users to directly control computers and other machines by modulating their brain waves. In the present study, we investigated the effect of soft drinks on resting state (RS) EEG signals and BCI control. Eight healthy human volunteers each participated in three sessions of BCI cursor tasks and resting state EEG. During each session, the subjects drank an unlabeled soft drink with either sugar, caffeine, or neither ingredient. A comparison of resting state spectral power shows a substantial decrease in alpha and beta power after caffeine consumption relative to control. Despite attenuation of the frequency range used for the control signal, average BCI performance after caffeine was the same as control. Our work provides a useful characterization of the effect of caffeine, the world's most popular stimulant, on brain signal frequencies and of its impact on BCI performance.
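
    The alpha and beta band powers at the centre of this comparison can be computed from resting-state EEG with a standard Welch periodogram; the synthetic signal below stands in for recorded data:

    ```python
    import numpy as np
    from scipy.signal import welch

    fs = 250.0                                  # EEG sampling rate [Hz]
    rng = np.random.default_rng(4)
    eeg = rng.normal(size=int(60 * fs))         # 1 min placeholder EEG

    f, psd = welch(eeg, fs=fs, nperseg=int(4 * fs))

    def band_power(f, psd, lo, hi):
        """Integrate the PSD over a frequency band (trapezoidal rule)."""
        m = (f >= lo) & (f <= hi)
        return np.trapz(psd[m], f[m])

    alpha = band_power(f, psd, 8.0, 13.0)       # band used for SMR control
    beta = band_power(f, psd, 13.0, 30.0)
    print(alpha, beta)
    ```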

  1. Controller Design of DFIG Based Wind Turbine by Using Evolutionary Soft Computational Techniques

    Directory of Open Access Journals (Sweden)

    O. P. Bharti

    2017-06-01

    This manuscript illustrates the controller design for a doubly fed induction generator (DFIG) based variable speed wind turbine using a bio-inspired scheme. The methodology is based on exploiting two proficient swarm intelligence based evolutionary soft computing procedures: the particle swarm optimization (PSO) and bacterial foraging optimization (BFO) techniques are employed to design the controller intended for the small damping plant of the DFIG. A wind energy overview and the DFIG operating principle, along with the equivalent circuit model, are adequately discussed in this paper. The controller designs for the DFIG-based WECS using PSO and BFO are described comparatively in detail. The responses of the DFIG system regarding terminal voltage, current, active-reactive power, and DC-link voltage are slightly improved with the evolutionary soft computing procedures. Lastly, the obtained output is compared with a standard technique for performance improvement of the DFIG-based wind energy conversion system.
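
    A bare-bones PSO of the kind used for such gain tuning is sketched below; the quadratic stand-in cost replaces what would really be a simulation of the DFIG control loop (for example an integral-of-squared-error criterion over a disturbance response):

    ```python
    import numpy as np

    def pso(cost, lo, hi, n=20, iters=50, w=0.7, c1=1.5, c2=1.5):
        """Minimal particle swarm optimizer over box bounds [lo, hi]."""
        rng = np.random.default_rng(5)
        x = rng.uniform(lo, hi, (n, len(lo)))
        v = np.zeros_like(x)
        pbest, pcost = x.copy(), np.array([cost(p) for p in x])
        for _ in range(iters):
            g = pbest[pcost.argmin()]                   # global best
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
            x = np.clip(x + v, lo, hi)
            c = np.array([cost(p) for p in x])
            better = c < pcost
            pbest[better], pcost[better] = x[better], c[better]
        return pbest[pcost.argmin()], pcost.min()

    # stand-in cost for controller gains [kp, ki]; optimum at (1.2, 0.4)
    cost = lambda g: (g[0] - 1.2) ** 2 + (g[1] - 0.4) ** 2
    gains, best = pso(cost, np.array([0.0, 0.0]), np.array([5.0, 5.0]))
    print(gains, best)
    ```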

  2. What is computation : An epistemic approach

    NARCIS (Netherlands)

    Wiedermann, Jiří; van Leeuwen, Jan

    2015-01-01

    Traditionally, computations are seen as processes that transform information. Definitions of computation subsequently concentrate on a description of the mechanisms that lead to such processes. The bottleneck of this approach is twofold. First, it leads to a definition of computation that is too

  3. Claudio Moraga a passion for multi-valued logic and soft computing

    CERN Document Server

    Allende-Cid, Héctor

    2017-01-01

    The book is an authoritative collection of contributions by leading experts on the topics of fuzzy logic, multi-valued logic and neural networks. Originally written as an homage to Claudio Moraga, seen by his colleagues as an example of concentration, discipline and passion for science, the book also represents a timely reference guide for advanced students and researchers in the field of soft computing and multiple-valued logic.

  4. Assessment of traffic noise levels in urban areas using different soft computing techniques.

    Science.gov (United States)

    Tomić, J; Bogojević, N; Pljakić, M; Šumarac-Pavlović, D

    2016-10-01

    Available traffic noise prediction models are usually based on regression analysis of experimental data, and this paper presents the application of soft computing techniques in traffic noise prediction. Two mathematical models are proposed and their predictions are compared to data collected by traffic noise monitoring in urban areas, as well as to predictions of commonly used traffic noise models. The results show that application of evolutionary algorithms and neural networks may improve process of development, as well as accuracy of traffic noise prediction.

  5. Soft computing analysis of the possible correlation between temporal and energy release patterns in seismic activity

    Science.gov (United States)

    Konstantaras, Anthony; Katsifarakis, Emmanouil; Artzouxaltzis, Xristos; Makris, John; Vallianatos, Filippos; Varley, Martin

    2010-05-01

    This paper is a preliminary investigation of the possible correlation of temporal and energy release patterns of seismic activity involving the preparation processes of consecutive sizeable seismic events [1,2]. The background idea is that during periods of low-level seismic activity, stress processes in the crust accumulate energy at the seismogenic area whilst larger seismic events act as a decongesting mechanism releasing considerable energy [3,4]. A dynamic algorithm is being developed aiming to identify and cluster pre- and post-seismic events relative to the main earthquake, following on research carried out by Zubkov [5] and Dobrovolsky [6,7]. This clustering technique, along with energy release equations dependent on Richter's scale [8,9], allows an estimate to be drawn regarding the amount of energy being released by the seismic sequence. The above approach is being implemented as a monitoring tool to investigate the behaviour of the underlying energy management system by introducing this information to various neural [10,11] and soft computing models [1,12,13,14]. The incorporation of intelligent systems aims towards the detection and simulation of the possible relationship between energy release patterns and time-intervals among consecutive sizeable earthquakes [1,15]. Anticipated successful training of the imported intelligent systems may result in a real-time, on-line processing methodology [1,16] capable of dynamically approximating the time-interval between the latest and the next forthcoming sizeable seismic event by monitoring the energy release process in a specific seismogenic area. Indexing terms: pattern recognition, long-term earthquake precursors, neural networks, soft computing, earthquake occurrence intervals References [1] Konstantaras A., Vallianatos F., Varley M.R. and Makris J. P.: 'Soft computing modelling of seismicity in the southern Hellenic arc', IEEE Geoscience and Remote Sensing Letters, vol. 5 (3), pp. 323-327, 2008 [2] Eneva M. and
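
    For the energy-release side of the monitoring idea, one common form of the magnitude-energy relations cited above ([8,9]) is the Gutenberg-Richter formula; the event sequence below is hypothetical:

    ```python
    import numpy as np

    def radiated_energy_joules(magnitude):
        """Gutenberg-Richter energy-magnitude relation,
        log10(E) = 4.8 + 1.5 * M, with E in joules."""
        return 10.0 ** (4.8 + 1.5 * np.asarray(magnitude, dtype=float))

    # cumulative energy released by a hypothetical seismic sequence
    sequence = [3.2, 3.8, 4.1, 5.6, 3.5]
    energies = radiated_energy_joules(sequence)
    print(energies.sum())        # dominated by the M 5.6 event
    ```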

  6. Digital dissection - using contrast-enhanced computed tomography scanning to elucidate hard- and soft-tissue anatomy in the Common Buzzard Buteo buteo.

    Science.gov (United States)

    Lautenschlager, Stephan; Bright, Jen A; Rayfield, Emily J

    2014-04-01

    Gross dissection has a long history as a tool for the study of human or animal soft- and hard-tissue anatomy. However, apart from being a time-consuming and invasive method, dissection is often unsuitable for very small specimens and often cannot capture the spatial relationships of individual soft-tissue structures. The handful of comprehensive studies on avian anatomy using traditional dissection techniques focus nearly exclusively on domestic birds, whereas raptorial birds, and in particular their cranial soft tissues, are essentially absent from the literature. Here, we digitally dissect, identify, and document the soft-tissue anatomy of the Common Buzzard (Buteo buteo) in detail, using the new approach of contrast-enhanced computed tomography using Lugol's iodine. The architecture of different muscle systems (adductor, depressor, ocular, hyoid, neck musculature), neurovascular, and other soft-tissue structures is three-dimensionally visualised and described in unprecedented detail. The three-dimensional model is further presented as an interactive PDF to facilitate the dissemination and accessibility of anatomical data. Due to the digital nature of the data derived from the computed tomography scanning and segmentation processes, these methods hold the potential for further computational analyses beyond descriptive and illustrative purposes. © 2013 The Authors. Journal of Anatomy published by John Wiley & Sons Ltd on behalf of Anatomical Society.

  7. In vivo X-Ray Computed Tomographic Imaging of Soft Tissue with Native, Intravenous, or Oral Contrast

    Science.gov (United States)

    Wathen, Connor A.; Foje, Nathan; van Avermaete, Tony; Miramontes, Bernadette; Chapaman, Sarah E.; Sasser, Todd A.; Kannan, Raghuraman; Gerstler, Steven; Leevy, W. Matthew

    2013-01-01

    X-ray Computed Tomography (CT) is one of the most commonly utilized anatomical imaging modalities for both research and clinical purposes. CT combines high-resolution, three-dimensional data with relatively fast acquisition to provide a solid platform for non-invasive human or specimen imaging. The primary limitation of CT is its inability to distinguish many soft tissues based on native contrast. While bone has high contrast within a CT image due to the material density of its calcium phosphate, soft tissue is less dense and many soft tissues are homogeneous in density. This presents a challenge in distinguishing one type of soft tissue from another. A couple of exceptions include the lungs as well as fat, both of which have unique densities owing to the presence of air or bulk hydrocarbons, respectively. In order to facilitate X-ray CT imaging of other structures, a range of contrast agents have been developed to selectively identify and visualize the anatomical properties of individual tissues. Most agents incorporate atoms like iodine, gold, or barium because of their ability to absorb X-rays, and thus impart contrast to a given organ system. Here we review the strategies available to visualize lung, fat, brain, kidney, spleen, vasculature, gastrointestinal tract, and liver tissues of living mice using either innate contrast, or commercial injectable or ingestible agents with selective perfusion. Further, we demonstrate how each of these approaches will facilitate the non-invasive, longitudinal, in vivo imaging of pre-clinical disease models at each anatomical site. PMID:23711461

  8. Application of Soft Computing Techniques and Multiple Regression Models for CBR prediction of Soils

    Directory of Open Access Journals (Sweden)

    Fatimah Khaleel Ibrahim

    2017-08-01

    Soft computing techniques such as Artificial Neural Networks (ANN) have improved predictive capability and have found application in geotechnical engineering. The aim of this research is to utilize soft computing techniques and Multiple Regression Models (MLR) for forecasting the California Bearing Ratio (CBR) of soil from its index properties. The CBR of soil can be predicted from various soil characterization parameters with the assistance of MLR and ANN methods. The database was collected in the laboratory by conducting tests on 86 soil samples gathered from different projects in Basrah districts. Data gained from the experimental results were used in the regression models and in the soft computing technique using artificial neural networks. The liquid limit, plasticity index, modified compaction test and the CBR test were determined. In this work, different ANN and MLR models were formulated with different collections of inputs in order to recognize their significance in the prediction of CBR. The strengths of the developed models were examined in terms of the regression coefficient (R2), relative error (RE%) and mean square error (MSE) values. From the results of this paper, it was noticed that all the proposed ANN models perform better than the MLR model. In particular, an ANN model with all input parameters reveals better outcomes than the other ANN models.
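
    A compact sketch of the MLR-versus-ANN comparison with scikit-learn; the index-property features and CBR targets below are random placeholders for the 86 laboratory samples:

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.neural_network import MLPRegressor
    from sklearn.metrics import r2_score, mean_squared_error

    # inputs: liquid limit, plasticity index, max dry density, optimum
    # moisture content; target: CBR (%)
    rng = np.random.default_rng(6)
    X, y = rng.random((86, 4)), rng.random(86) * 20   # placeholder data

    mlr = LinearRegression().fit(X, y)
    ann = MLPRegressor(hidden_layer_sizes=(6,), max_iter=5000,
                       random_state=6).fit(X, y)
    for name, m in [("MLR", mlr), ("ANN", ann)]:
        p = m.predict(X)
        print(name, "R2 =", round(r2_score(y, p), 3),
              "MSE =", round(mean_squared_error(y, p), 3))
    ```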

  9. Integrative approaches to computational biomedicine

    Science.gov (United States)

    Coveney, Peter V.; Diaz-Zuccarini, Vanessa; Graf, Norbert; Hunter, Peter; Kohl, Peter; Tegner, Jesper; Viceconti, Marco

    2013-01-01

    The new discipline of computational biomedicine is concerned with the application of computer-based techniques and particularly modelling and simulation to human health. Since 2007, this discipline has been synonymous, in Europe, with the name given to the European Union's ambitious investment in integrating these techniques with the eventual aim of modelling the human body as a whole: the virtual physiological human. This programme and its successors are expected, over the next decades, to transform the study and practice of healthcare, moving it towards the priorities known as ‘4P's’: predictive, preventative, personalized and participatory medicine.

  10. Computed tomography of the soft tissues of the shoulder. Pt. 3

    International Nuclear Information System (INIS)

    Dihlmann, W.; Bandick, J.

    1988-01-01

    Computed tomography of the soft tissue of the shoulder in cases of calcifying tendinitis of the rotator cuff provides the following information: 1. Localisation of the calcium deposits within the rotator cuff. 2. Contours and density of the calcium deposits correlated with the clinical findings as described by Uhthoff et al. Ill-defined contours and non-homogeneous deposits are associated with more severe clinical features. 3. Computed tomography shows that apatite particles, which are not visible radiologically, may penetrate into the shoulder joint and produce synovitis with an effusion. This is of importance in local therapy. (orig.) [de

  11. Infinitesimal symmetries: a computational approach

    International Nuclear Information System (INIS)

    Kersten, P.H.M.

    1985-01-01

    This thesis is concerned with computational aspects of the determination of infinitesimal symmetries and Lie-Baecklund transformations of differential equations. Moreover, some problems are calculated explicitly. A brief introduction to some concepts in the theory of symmetries and Lie-Baecklund transformations, relevant for this thesis, is given. The mathematical formalism is briefly reviewed. The jet bundle formulation is chosen, in which, by its algebraic nature, objects can be described very precisely. Consequently it is appropriate for implementation. A number of procedures are discussed which enable such computations to be carried out with the help of a computer. These computations are very extensive in practice. The Lie algebras of infinitesimal symmetries of a number of differential equations in mathematical physics are established and some of their applications are discussed, i.e., the Maxwell equations, the nonlinear diffusion equation, the nonlinear Schroedinger equation, nonlinear Dirac equations and the self-dual SU(2) Yang-Mills equations. Lie-Baecklund transformations of Burgers' equation, the classical Boussinesq equation and the Massive Thirring Model are determined. Furthermore, nonlocal Lie-Baecklund transformations of the last equation are derived. (orig.)
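
    As a toy illustration of checking a symmetry computationally (far simpler than the jet bundle machinery of the thesis), sympy can verify that the scaling (x, t) -> (lam*x, lam^2*t) maps solutions of the linear diffusion equation to solutions:

    ```python
    import sympy as sp

    x, t, lam = sp.symbols('x t lam', positive=True)

    u = sp.exp(-t) * sp.sin(x)     # one solution of u_t = u_xx
    v = u.subs({x: lam * x, t: lam**2 * t}, simultaneous=True)

    heat = lambda w: sp.diff(w, t) - sp.diff(w, x, 2)
    print(sp.simplify(heat(u)))    # 0: u solves the diffusion equation
    print(sp.simplify(heat(v)))    # 0: scaled field is again a solution
    ```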

  12. Computational approach in zeolite science

    NARCIS (Netherlands)

    Pidko, E.A.; Santen, van R.A.; Chester, A.W.; Derouane, E.G.

    2009-01-01

    This chapter presents an overview of different computational methods and their application to various fields of zeolite chemistry. We will discuss static lattice methods based on interatomic potentials to predict zeolite structures and topologies, Monte Carlo simulations for the investigation of

  13. Challenges in Soft Computing: Case Study with Louisville MSD CSO Modeling

    Science.gov (United States)

    Ormsbee, L.; Tufail, M.

    2005-12-01

    The principal constituents of soft computing include fuzzy logic, neural computing, evolutionary computation, machine learning, and probabilistic reasoning. There are numerous applications of these constituents (both individually and in combinations of two or more) in the area of water resources and environmental systems. These range from the development of data-driven models to optimal control strategies that assist in a more informed and intelligent decision-making process. Availability of data is critical to such applications, and scarce data may lead to models that do not represent the response function over the entire domain. At the same time, too much data has a tendency to lead to over-constraining of the problem. This paper will describe the application of a subset of these soft computing techniques (neural computing and genetic algorithms) to the Beargrass Creek watershed in Louisville, Kentucky. The applications include the development of inductive models as substitutes for more complex process-based models to predict the water quality of key constituents (such as dissolved oxygen) and their use in an optimization framework for optimal load reductions; a toy sketch of this surrogate-plus-optimizer pairing follows below. Such a process will facilitate the development of total maximum daily loads for the impaired water bodies in the watershed. Some of the challenges faced in this application include 1) uncertainty in data sets, 2) model application, and 3) development of cause-and-effect relationships between water quality constituents and watershed parameters through the use of inductive models. The paper will discuss these challenges and how they affect the desired goals of the project.
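
    The pairing of an inductive surrogate with an evolutionary optimizer can be illustrated with a toy sketch. Everything here is an assumption for illustration: the quadratic function stands in for the project's trained neural model, and the GA settings are arbitrary.

        import random

        def surrogate_do(reductions):
            """Stand-in for a trained inductive model: predicted dissolved
            oxygen response to per-constituent load reductions in [0, 1]."""
            return sum(r - 0.7 * r * r for r in reductions)  # diminishing returns

        def evolve(n_vars=3, pop_size=40, generations=60):
            pop = [[random.random() for _ in range(n_vars)] for _ in range(pop_size)]
            for _ in range(generations):
                nxt = []
                while len(nxt) < pop_size:
                    # tournament selection of two parents
                    a, b = (max(random.sample(pop, 3), key=surrogate_do)
                            for _ in range(2))
                    cut = random.randrange(1, n_vars)
                    child = a[:cut] + b[cut:]            # one-point crossover
                    if random.random() < 0.1:            # mutation
                        child[random.randrange(n_vars)] = random.random()
                    nxt.append(child)
                pop = nxt
            return max(pop, key=surrogate_do)

        print(evolve())  # tends toward ~0.71 per constituent for this surrogate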

  14. A virtual surgical training system that simulates cutting of soft tissue using a modified pre-computed elastic model.

    Science.gov (United States)

    Toe, Kyaw Kyar; Huang, Weimin; Yang, Tao; Duan, Yuping; Zhou, Jiayin; Su, Yi; Teo, Soo-Kng; Kumar, Selvaraj Senthil; Lim, Calvin Chi-Wan; Chui, Chee Kong; Chang, Stephen

    2015-08-01

    This work presents a surgical training system that incorporates a cutting operation on soft tissue, simulated with a modified pre-computed linear elastic model in the Simulation Open Framework Architecture (SOFA) environment. A pre-computed linear elastic model for simulating soft tissue deformation involves computing the compliance matrix a priori from the topological information of the mesh. While this process may require a few minutes to several hours, depending on the number of vertices in the mesh, it needs to be computed only once and then allows real-time computation of the subsequent soft tissue deformation. However, as the compliance matrix is based on the initial topology of the mesh, it does not allow any topological changes during simulation, such as cutting or tearing of the mesh. This work proposes a way to modify the pre-computed data by correcting the topological connectivity in the compliance matrix, without re-computing the compliance matrix, which is computationally expensive.
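
    The pre-computation trade-off is easy to see in a minimal sketch (a dense stand-in stiffness matrix, not the authors' SOFA implementation or their connectivity-correction step):

        import numpy as np

        n = 300                          # 3 x number of vertices (toy size)
        A = np.random.rand(n, n)
        K = A @ A.T + n * np.eye(n)      # stand-in symmetric positive definite stiffness

        C = np.linalg.inv(K)             # compliance matrix: expensive, computed once

        def deform(forces):
            """Real-time step: nodal displacements from applied nodal forces."""
            return C @ forces

        u = deform(np.random.rand(n))

    In a real FEM mesh K is large and sparse, so forming C is the costly step; one conventional route to cheap topology updates (assumed here for context, not necessarily the paper's method) is a low-rank correction of C via the Sherman-Morrison-Woodbury identity rather than a full re-inversion.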

  15. Computation of stress on the surface of a soft homogeneous arbitrarily shaped particle.

    Science.gov (United States)

    Yang, Minglin; Ren, Kuan Fang; Wu, Yueqian; Sheng, Xinqing

    2014-04-01

    Prediction of the stress on the surface of an arbitrarily shaped particle of soft material is essential in the study of the elastic properties of particles probed with optical force. It is also necessary in the manipulation and sorting of small particles with optical tweezers, since a regularly shaped particle, such as a sphere, may be deformed under the nonuniform optical stress on its surface. The stress profile on a spherical or small spheroidal soft particle trapped by shaped beams has been studied, but little work on computing the surface stress of an irregularly shaped particle has been reported. In this paper we apply the surface integral equation with the multilevel fast multipole algorithm to compute the surface stress on soft, homogeneous, arbitrarily shaped particles. The comparison of the computed stress profile with that predicted by the generalized Lorenz-Mie theory, for a water droplet of diameter equal to 51 wavelengths in a focused Gaussian beam, shows that the precision of our method is very good. Stress profiles on spheroids with different aspect ratios are then computed, with the particles illuminated by a Gaussian beam of different waist radii at different incidences. A physical analysis of the mechanism of optical stress is given with the help of our recently developed vectorial complex ray model. It is found that the maximum of the stress profile on the surface of prolate spheroids is determined not only by the reflected and refracted rays (orders p=0,1) but also by rays undergoing one or two internal reflections, where they focus. A computational study of the stress on the surface of a biconcave cell-like particle, a typical application in the life sciences, is also undertaken.

  16. MRT letter: Contrast-enhanced computed tomographic imaging of soft callus formation in fracture healing.

    Science.gov (United States)

    Hayward, Lauren Nicole Miller; de Bakker, Chantal Marie-Jeanne; Lusic, Hrvoje; Gerstenfeld, Louis Charles; Grinstaff, Mark W; Morgan, Elise Feng-I

    2012-01-01

    Formation of a cartilaginous soft callus at the site of a bone fracture is a pivotal stage in the healing process. Noninvasive, or even nondestructive, imaging of soft callus formation can be an important tool in experimental and pre-clinical studies of fracture repair. However, the low X-ray attenuation of cartilage renders the soft callus nearly invisible in radiographs. This study utilized a recently developed, cationic, iodinated contrast agent in conjunction with micro-computed tomography to identify cartilage in fracture calluses in the femora of C57BL/6J and C3H/HeJ mice. Fracture calluses were scanned before and after incubation in the contrast agent. The set of pre-incubation images was registered against and then subtracted from the set of post-incubation images, resulting in a three-dimensional map of the locations of cartilage in the callus, as labeled by the contrast agent. This map was then compared to histology from a previous study. The results showed that the locations where the contrast agent collected in relatively high concentrations were similar to those of the cartilage. The contrast agent also identified a significant difference between the two strains of mice in the percentage of the callus occupied by cartilage, indicating that this method of contrast-enhanced computed tomography may be an effective technique for nondestructive, early evaluation of fracture healing. Copyright © 2011 Wiley Periodicals, Inc.
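
    The register-then-subtract workflow can be sketched with SimpleITK as below. The file names, the rigid (Euler) transform choice and the registration settings are assumptions for illustration, since the study's software is not specified.

        import SimpleITK as sitk

        post = sitk.ReadImage("callus_post_incubation.nii", sitk.sitkFloat32)
        pre = sitk.ReadImage("callus_pre_incubation.nii", sitk.sitkFloat32)

        reg = sitk.ImageRegistrationMethod()
        reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
        reg.SetOptimizerAsRegularStepGradientDescent(
            learningRate=1.0, minStep=1e-4, numberOfIterations=200)
        reg.SetInitialTransform(sitk.CenteredTransformInitializer(
            post, pre, sitk.Euler3DTransform(),
            sitk.CenteredTransformInitializerFilter.GEOMETRY))
        reg.SetInterpolator(sitk.sitkLinear)
        tx = reg.Execute(post, pre)

        # Resample the pre-incubation scan into the post-incubation frame and
        # subtract: the remaining signal maps contrast-agent (cartilage) uptake.
        pre_aligned = sitk.Resample(pre, post, tx, sitk.sitkLinear, 0.0)
        cartilage_map = sitk.Subtract(post, pre_aligned)
        sitk.WriteImage(cartilage_map, "cartilage_map.nii")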

  17. Softly Broken Lepton Numbers: an Approach to Maximal Neutrino Mixing

    International Nuclear Information System (INIS)

    Grimus, W.; Lavoura, L.

    2001-01-01

    We discuss models where the U(1) symmetries of lepton numbers are responsible for maximal neutrino mixing. We pay particular attention to an extension of the Standard Model (SM) with three right-handed neutrino singlets in which we require that the three lepton numbers L_e, L_μ, and L_τ be separately conserved in the Yukawa couplings, but assume that they are softly broken by the Majorana mass matrix M_R of the neutrino singlets. In this framework, where lepton-number breaking occurs at a scale much higher than the electroweak scale, deviations from family lepton number conservation are calculable, i.e., finite, and lepton mixing stems exclusively from M_R. We show that in this framework either maximal atmospheric neutrino mixing or maximal solar neutrino mixing or both can be imposed by invoking symmetries. In this way those maximal mixings are stable against radiative corrections. The model which achieves maximal (or nearly maximal) solar neutrino mixing assumes that there are two different scales in M_R and that the lepton number L̄ = L_e − L_μ − L_τ is conserved in between them. We work out the difference between this model and the conventional scenario where (approximate) L̄ invariance is imposed directly on the mass matrix of the light neutrinos. (author)

  18. Development of fuzzy air quality index using soft computing approach.

    Science.gov (United States)

    Mandal, T; Gorai, A K; Pathak, G

    2012-10-01

    Proper assessment of the air quality status of an atmosphere based on limited observations is an essential task for meeting the goals of environmental management. A number of classification methods are available for estimating the changing status of air quality. However, discrepancies frequently arise from the air quality criteria employed and from the vagueness or fuzziness embedded in the decision-making output values. Owing to this inherent imprecision, difficulties always exist in conventional methodologies such as the air quality index when describing integrated air quality conditions with respect to various pollutant parameters and times of exposure. In recent years, fuzzy logic-based methods have been demonstrated to be appropriate for addressing uncertainty and subjectivity in environmental issues. In the present study, a methodology based on fuzzy inference systems (FIS) to assess air quality is proposed. This paper presents a comparative study assessing the status of air quality using the fuzzy logic technique and the conventional technique. The findings clearly indicate that the FIS may successfully harmonize inherent discrepancies and interpret complex conditions.
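
    A toy fuzzy inference system in this spirit can be written with scikit-fuzzy. The pollutants, universes, membership functions and rules below are illustrative assumptions, not the authors' FIS.

        import numpy as np
        import skfuzzy as fuzz
        from skfuzzy import control as ctrl

        pm10 = ctrl.Antecedent(np.arange(0, 501, 1), "pm10")   # ug/m3
        so2 = ctrl.Antecedent(np.arange(0, 201, 1), "so2")     # ug/m3
        aqi = ctrl.Consequent(np.arange(0, 101, 1), "aqi")

        pm10.automf(3, names=["low", "medium", "high"])
        so2.automf(3, names=["low", "medium", "high"])
        aqi["good"] = fuzz.trimf(aqi.universe, [0, 0, 50])
        aqi["moderate"] = fuzz.trimf(aqi.universe, [25, 50, 75])
        aqi["poor"] = fuzz.trimf(aqi.universe, [50, 100, 100])

        rules = [
            ctrl.Rule(pm10["low"] & so2["low"], aqi["good"]),
            ctrl.Rule(pm10["medium"] | so2["medium"], aqi["moderate"]),
            ctrl.Rule(pm10["high"] | so2["high"], aqi["poor"]),
        ]
        sim = ctrl.ControlSystemSimulation(ctrl.ControlSystem(rules))
        sim.input["pm10"] = 180
        sim.input["so2"] = 40
        sim.compute()
        print(sim.output["aqi"])   # crisp index after defuzzification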

  19. Computer Architecture A Quantitative Approach

    CERN Document Server

    Hennessy, John L

    2007-01-01

    The era of seemingly unlimited growth in processor performance is over: single chip architectures can no longer overcome the performance limitations imposed by the power they consume and the heat they generate. Today, Intel and other semiconductor firms are abandoning the single fast processor model in favor of multi-core microprocessors--chips that combine two or more processors in a single package. In the fourth edition of Computer Architecture, the authors focus on this historic shift, increasing their coverage of multiprocessors and exploring the most effective ways of achieving parallelism.

  20. Computed tomography in the evaluation of soft tissue tumors. Report in 124 cases

    Energy Technology Data Exchange (ETDEWEB)

    Torricelli, P; Calo, M; Boriani, S; De Santis, G

    1986-01-01

    In order to evaluate the role of Computed Tomography (CT) in predicting the nature, staging and follow-up of soft-tissue tumors, the authors examined by CT 124 patients with soft tissue neoplasms who later underwent surgery (116 cases) or fine needle biopsy (8 cases). Comparison between CT and surgical or anatomical results showed that CT was able to correctly predict the benignancy or malignancy of the masses in 76% of cases, but it was very seldom able to allow a histological prediction. On the other hand, CT was found to be a very useful tool for the pre-therapeutic staging and follow-up of the tumors, because it gave much diagnostic information which influenced therapeutic choices and strategies. 39 refs.

  1. Soft Electronics Enabled Ergonomic Human-Computer Interaction for Swallowing Training

    Science.gov (United States)

    Lee, Yongkuk; Nicholls, Benjamin; Sup Lee, Dong; Chen, Yanfei; Chun, Youngjae; Siang Ang, Chee; Yeo, Woon-Hong

    2017-04-01

    We introduce a skin-friendly electronic system that enables human-computer interaction (HCI) for swallowing training in dysphagia rehabilitation. For an ergonomic HCI, we utilize a soft, highly compliant (“skin-like”) electrode, which addresses critical issues of an existing rigid and planar electrode combined with a problematic conductive electrolyte and adhesive pad. The skin-like electrode offers a highly conformal, user-comfortable interaction with the skin for long-term wearable, high-fidelity recording of swallowing electromyograms on the chin. Mechanics modeling and experimental quantification capture the ultra-elastic mechanical characteristics of an open-mesh microstructured sensor conjugated with an elastomeric membrane. Systematic in vivo studies investigate the functionality of the soft electronics for HCI-enabled swallowing training, which includes the application of a biofeedback system to detect swallowing behavior. The collection of results demonstrates clinical feasibility of the ergonomic electronics in HCI-driven rehabilitation for patients with swallowing disorders.

  2. Learning and geometry computational approaches

    CERN Document Server

    Smith, Carl

    1996-01-01

    The field of computational learning theory arose out of the desire to formally understand the process of learning. As potential applications to artificial intelligence became apparent, the new field grew rapidly. The learning of geometric objects became a natural area of study. The possibility of using learning techniques to compensate for unsolvability provided an attraction for individuals with an immediate need to solve such difficult problems. Researchers at the Center for Night Vision were interested in solving the problem of interpreting data produced by a variety of sensors. Current vision techniques, which have a strong geometric component, can be used to extract features. However, these techniques fall short of useful recognition of the sensed objects. One potential solution is to incorporate learning techniques into the geometric manipulation of sensor data. As a first step toward realizing such a solution, the Systems Research Center at the University of Maryland, in conjunction with the C...

  3. Quantum Computing: a Quantum Group Approach

    OpenAIRE

    Wang, Zhenghan

    2013-01-01

    There is compelling theoretical evidence that quantum physics will change the face of information science. Exciting progress has been made during the last two decades towards the building of a large scale quantum computer. A quantum group approach stands out as a promising route to this holy grail, and provides hope that we may have quantum computers in our future.

  4. Cloud computing methods and practical approaches

    CERN Document Server

    Mahmood, Zaigham

    2013-01-01

    This book presents both state-of-the-art research developments and practical guidance on approaches, technologies and frameworks for the emerging cloud paradigm. Topics and features: presents the state of the art in cloud technologies, infrastructures, and service delivery and deployment models; discusses relevant theoretical frameworks, practical approaches and suggested methodologies; offers guidance and best practices for the development of cloud-based services and infrastructures, and examines management aspects of cloud computing; reviews consumer perspectives on mobile cloud computing an

  5. A program to compute the soft Robinson-Foulds distance between phylogenetic networks.

    Science.gov (United States)

    Lu, Bingxin; Zhang, Louxin; Leong, Hon Wai

    2017-03-14

    Over the past two decades, phylogenetic networks have been studied to model reticulate evolutionary events. The relationships among phylogenetic networks, phylogenetic trees and clusters serve as the basis for the reconstruction and comparison of phylogenetic networks. To understand these relationships, two problems are raised: the tree containment problem, which asks whether a phylogenetic tree is displayed in a phylogenetic network, and the cluster containment problem, which asks whether a cluster is represented at a node in a phylogenetic network. Both problems are NP-complete. A fast exponential-time algorithm for the cluster containment problem on arbitrary networks is developed and implemented in C. The resulting program is further extended into a computer program for fast computation of the Soft Robinson-Foulds distance between phylogenetic networks. Two computer programs are thus developed to facilitate the reconstruction and validation of phylogenetic network models in evolutionary and comparative genomics. Our simulation tests indicated that they are fast enough for use in practice. Additionally, our simulation data show that the distribution of the Soft Robinson-Foulds distance between phylogenetic networks is unlikely to be normal.
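
    As background for the distance being generalized here, the hard Robinson-Foulds distance between two trees is half the symmetric difference of their cluster sets, as in this small sketch; the soft variant on networks additionally requires solving the cluster containment problem for each cluster. The clusters are given directly as frozensets of taxa for illustration.

        def rf_distance(clusters_a, clusters_b):
            """Half the symmetric difference of two cluster collections."""
            return len(clusters_a ^ clusters_b) / 2

        t1 = {frozenset("AB"), frozenset("ABC"), frozenset("ABCD")}
        t2 = {frozenset("AB"), frozenset("CD"), frozenset("ABCD")}
        print(rf_distance(t1, t2))  # 1.0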

  6. Image Analysis via Soft Computing: Prototype Applications at NASA KSC and Product Commercialization

    Science.gov (United States)

    Dominguez, Jesus A.; Klinko, Steve

    2011-01-01

    This slide presentation reviews the use of "soft computing", which differs from "hard computing" in being more tolerant of imprecision, partial truth, uncertainty, and approximation, and its use in image analysis. Soft computing provides flexible information processing to handle real-life ambiguous situations and to achieve tractability, robustness, low solution cost, and a closer resemblance to human decision making. Several systems are or have been developed: Fuzzy Reasoning Edge Detection (FRED), Fuzzy Reasoning Adaptive Thresholding (FRAT), image enhancement techniques, and visual/pattern recognition. These systems are compared with examples that show the effectiveness of each. The NASA applications reviewed are: Real-Time (RT) Anomaly Detection, Real-Time (RT) Moving Debris Detection and the Columbia investigation. The RT anomaly detection reviewed the case of a damaged cable for the emergency egress system. The use of these techniques is further illustrated in the Columbia investigation with the location and detection of foam debris. There are several applications in commercial usage: image enhancement, human screening and privacy protection, visual inspection, 3D heart visualization, tumor detection and X-ray image enhancement.
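
    A minimal sketch of fuzzy-reasoning edge detection in the spirit of FRED (purely illustrative; NASA's actual algorithm is not reproduced here) replaces a hard gradient threshold with a membership function that assigns each pixel a degree of "edgeness":

        import numpy as np

        def fuzzy_edges(img, center=0.2, steepness=25.0):
            gy, gx = np.gradient(img.astype(float))
            grad = np.hypot(gx, gy)
            grad /= grad.max() + 1e-12      # normalize gradient to [0, 1]
            # sigmoid membership function: degree of "edgeness" per pixel
            return 1.0 / (1.0 + np.exp(-steepness * (grad - center)))

        img = np.zeros((64, 64))
        img[:, 32:] = 1.0                   # toy step image
        edges = fuzzy_edges(img)            # values in (0, 1), not a hard mask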

  7. Computed tomography in soft-tissue lesions of the hand and forearm

    International Nuclear Information System (INIS)

    Schmitt, R.; Warmuth-Metz, M.; Lucas, D.; Feyerabend, T.; Schindler, G.; Lanz, U.

    1990-01-01

    Computed tomography was carried out in 32 patients with clinically equivocal soft-tissue lesions of the hand (24 times) and forearm (8 times). The CT scans were performed with the patients in standard positions; thin slices and zoom technique were used. All soft-tissue tumors were correctly diagnosed with regard to localization, size and infiltration of the surrounding tissue. The histological diagnosis was correct in tendon-sheath proliferations, deposits caused by metabolic disorders, epithelial and ganglion cysts, hemangiomas, lipomas and in one schwannoma. A malignancy was suspected and was proven to be correct in two cases. False-positive diagnoses of a malignant soft-tissue tumor were made in one case of an aggressive fibromatosis, in a rapidly progressive, ossifying myositis, and three times in the presence of postoperative scar tissue following the resection of a sarcoma. Finally, a case of proliferative myositis regarded as semimalignant was underrated by CT. The hand surgeon considered CT diagnostics to be very helpful in planning operations in an anatomically complex organ such as the hand. (orig.) [de

  8. Rough set soft computing cancer classification and network: one stone, two birds.

    Science.gov (United States)

    Zhang, Yue

    2010-07-15

    Gene expression profiling provides tremendous information to help unravel the complexity of cancer. The selection of the most informative genes from huge noise for cancer classification has taken centre stage, along with predicting the function of such identified genes and the construction of direct gene regulatory networks at different system levels with a tuneable parameter. A new study by Wang and Gotoh described a novel Variable Precision Rough Sets-rooted robust soft computing method to successfully address these problems and has yielded some new insights. The significance of this progress and its perspectives will be discussed in this article.

  9. Speed challenge: a case for hardware implementation in soft-computing

    Science.gov (United States)

    Daud, T.; Stoica, A.; Duong, T.; Keymeulen, D.; Zebulum, R.; Thomas, T.; Thakoor, A.

    2000-01-01

    For over a decade, JPL has been actively involved in soft computing research on theory, architecture, applications, and electronics hardware. The driving force in all our research activities, in addition to the potential promise of this enabling technology, has been the creation of a niche that imparts an orders-of-magnitude speed advantage through implementation in parallel processing hardware, with algorithms made especially suitable for hardware implementation. We review our work on neural networks, fuzzy logic, and evolvable hardware with selected application examples requiring real-time response capabilities.

  10. Live theater on a virtual stage: incorporating soft skills and teamwork in computer graphics education.

    Science.gov (United States)

    Schweppe, M; Geigel, J

    2011-01-01

    Industry has increasingly emphasized the need for "soft" or interpersonal skills development and team-building experience in the college curriculum. Here, we discuss our experiences with providing such opportunities via a collaborative project called the Virtual Theater. In this joint project between the Rochester Institute of Technology's School of Design and Department of Computer Science, the goal is to enable live performance in a virtual space with participants in different physical locales. Students work in teams, collaborating with other students in and out of their disciplines.

  11. Cognitive Approaches for Medicine in Cloud Computing.

    Science.gov (United States)

    Ogiela, Urszula; Takizawa, Makoto; Ogiela, Lidia

    2018-03-03

    This paper will present the application potential of the cognitive approach to data interpretation, with special reference to medical areas. The possibilities of using the meaning-based approach to data description and analysis are proposed for data analysis tasks in Cloud Computing. The methods of cognitive data management in Cloud Computing are aimed at supporting the processes of protecting data against unauthorised takeover, and they serve to enhance the data management processes. The outcome of the proposed tasks will be the definition of algorithms for the execution of meaning-based data interpretation processes in safe Cloud Computing. • We propose cognitive methods for data description. • We propose techniques for securing data in Cloud Computing. • The application of cognitive approaches to medicine is described.

  12. A Soft OR Approach to Fostering Systems Thinking: SODA Maps plus Joint Analytical Process

    Science.gov (United States)

    Wang, Shouhong; Wang, Hai

    2016-01-01

    Higher order thinking skills are important for managers. Systems thinking is an important type of higher order thinking in business education. This article investigates a soft Operations Research approach to teaching and learning systems thinking. It outlines the integrative use of Strategic Options Development and Analysis maps for visualizing…

  13. A soft-contact model for computing safety margins in human prehension.

    Science.gov (United States)

    Singh, Tarkeshwar; Ambike, Satyajit

    2017-10-01

    The soft human digit tip forms contact with grasped objects over a finite area and applies a moment about an axis normal to the area. These moments are important for ensuring stability during precision grasping. However, the contribution of these moments to grasp stability is rarely investigated in prehension studies. The more popular hard-contact model assumes that the digits exert a force vector but no free moment on the grasped object. Many sensorimotor studies use this model and show that humans estimate friction coefficients to scale the normal force to grasp objects stably, i.e. the smoother the surface, the tighter the grasp. The difference between the applied normal force and the minimal normal force needed to prevent slipping is called safety margin and this index is widely used as a measure of grasp planning. Here, we define and quantify safety margin using a more realistic contact model that allows digits to apply both forces and moments. Specifically, we adapt a soft-contact model from robotics and demonstrate that the safety margin thus computed is a more accurate and robust index of grasp planning than its hard-contact variant. Previously, we have used the soft-contact model to propose two indices of grasp planning that show how humans account for the shape and inertial properties of an object. A soft-contact based safety margin offers complementary insights by quantifying how humans may account for surface properties of the object and skin tissue during grasp planning and execution. Copyright © 2017 Elsevier B.V. All rights reserved.
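
    The two safety-margin definitions can be contrasted in a small sketch. The elliptical friction limit surface with torsional capacity e·mu·f_n used below is a common soft-contact approximation from the robotics literature, assumed here for illustration; it is not necessarily the exact model adapted in the paper.

        import math

        def safety_margin_hard(f_n, f_t, mu):
            # hard contact: only the slip condition |f_t| <= mu * f_n matters
            return f_n - abs(f_t) / mu

        def safety_margin_soft(f_n, f_t, m_n, mu, e):
            # elliptical limit surface coupling tangential force f_t and the
            # free moment m_n about the contact normal (torsional capacity
            # e*mu*f_n): (f_t/(mu*f_n))**2 + (m_n/(e*mu*f_n))**2 = 1.
            # Solving for the minimal normal force that keeps (f_t, m_n)
            # inside the surface gives:
            f_n_min = math.sqrt(f_t**2 + (m_n / e)**2) / mu
            return f_n - f_n_min

        print(safety_margin_hard(5.0, 1.2, 0.6))              # ignores the moment
        print(safety_margin_soft(5.0, 1.2, 0.01, 0.6, 0.005))  # smaller margin

    For the same applied forces, the soft-contact margin is smaller whenever a free moment must also be resisted, which illustrates why ignoring the moment can overestimate grasp security.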

  14. CGC/saturation approach for soft interactions at high energy: survival probability of central exclusive production

    Energy Technology Data Exchange (ETDEWEB)

    Gotsman, E.; Maor, U. [Tel Aviv University, Department of Particle Physics, Raymond and Beverly Sackler Faculty of Exact Science, School of Physics and Astronomy, Tel Aviv (Israel); Levin, E. [Tel Aviv University, Department of Particle Physics, Raymond and Beverly Sackler Faculty of Exact Science, School of Physics and Astronomy, Tel Aviv (Israel); Universidad Tecnica Federico Santa Maria, Departemento de Fisica, Centro Cientifico-Tecnologico de Valparaiso, Valparaiso (Chile)

    2016-04-15

    We estimate the value of the survival probability for central exclusive production in a model which is based on the CGC/saturation approach. Hard and soft processes are described in the same framework. At LHC energies, we obtain a small value for the survival probability. The source of the small value is the impact parameter dependence of the hard amplitude. Our model has successfully described a large body of soft data: elastic, inelastic and diffractive cross sections, inclusive production and rapidity correlations, as well as the t-dependence of deep inelastic diffractive production of vector mesons. (orig.)

  15. Toward exascale computing through neuromorphic approaches.

    Energy Technology Data Exchange (ETDEWEB)

    James, Conrad D.

    2010-09-01

    While individual neurons function at relatively low firing rates, naturally-occurring nervous systems not only surpass manmade systems in computing power, but accomplish this feat using relatively little energy. It is asserted that the next major breakthrough in computing power will be achieved through application of neuromorphic approaches that mimic the mechanisms by which neural systems integrate and store massive quantities of data for real-time decision making. The proposed LDRD provides a conceptual foundation for SNL to make unique advances toward exascale computing. First, a team consisting of experts from the HPC, MESA, cognitive and biological sciences and nanotechnology domains will be coordinated to conduct an exercise with the outcome being a concept for applying neuromorphic computing to achieve exascale computing. It is anticipated that this concept will involve innovative extension and integration of SNL capabilities in MicroFab, material sciences, high-performance computing, and modeling and simulation of neural processes/systems.

  16. In vivo X-Ray Computed Tomographic Imaging of Soft Tissue with Native, Intravenous, or Oral Contrast

    Directory of Open Access Journals (Sweden)

    W. Matthew Leevy

    2013-05-01

    Full Text Available X-ray Computed Tomography (CT) is one of the most commonly utilized anatomical imaging modalities for both research and clinical purposes. CT combines high-resolution, three-dimensional data with relatively fast acquisition to provide a solid platform for non-invasive human or specimen imaging. The primary limitation of CT is its inability to distinguish many soft tissues based on native contrast. While bone has high contrast within a CT image due to the material density of its calcium phosphate, soft tissues are less dense and many are homogeneous in density. This presents a challenge in distinguishing one type of soft tissue from another. A couple of exceptions include the lungs and fat, both of which have unique densities owing to the presence of air or bulk hydrocarbons, respectively. In order to facilitate X-ray CT imaging of other structures, a range of contrast agents have been developed to selectively identify and visualize the anatomical properties of individual tissues. Most agents incorporate atoms like iodine, gold, or barium because of their ability to absorb X-rays, and thus impart contrast to a given organ system. Here we review the strategies available to visualize lung, fat, brain, kidney, spleen, vasculature, gastrointestinal tract, and liver tissues of living mice using either innate contrast or commercial injectable or ingestible agents with selective perfusion. Further, we demonstrate how each of these approaches will facilitate the non-invasive, longitudinal, in vivo imaging of pre-clinical disease models at each anatomical site.

  17. Soft Computing Optimizer For Intelligent Control Systems Design: The Structure And Applications

    Directory of Open Access Journals (Sweden)

    Sergey A. Panfilov

    2003-10-01

    Full Text Available Soft Computing Optimizer (SCO), a new software tool for the design of robust intelligent control systems, is described. It is based on a hybrid methodology of soft computing and stochastic simulation, and it takes as input measured or simulated data about the modeled system. SCO is used to design an optimal fuzzy inference system which approximates the random behavior of the control object with a certain accuracy. The task of constructing the fuzzy inference system is reduced to subtasks such as forming the linguistic variables for each input and output variable, creating the rule base, optimizing the rule base, and refining the parameters of the membership functions. Each subtask is solved by a corresponding genetic algorithm with an appropriate fitness function. The result of applying SCO is the Knowledge Base of a Fuzzy Controller, which contains the value information about the developed fuzzy inference system. This value information can be downloaded into the actual fuzzy controller to perform online fuzzy control. Simulation results for robust fuzzy control of nonlinear dynamic systems and experimental results of an application to automotive semi-active suspension control are demonstrated.

  18. Effects of Soft Drinks on Resting State EEG and Brain-Computer Interface Performance.

    Science.gov (United States)

    Meng, Jianjun; Mundahl, John; Streitz, Taylor; Maile, Kaitlin; Gulachek, Nicholas; He, Jeffrey; He, Bin

    2017-01-01

    Motor imagery-based (MI-based) brain-computer interface (BCI) using electroencephalography (EEG) allows users to directly control a computer or external device by modulating and decoding the brain waves. A variety of factors could potentially affect the performance of BCI, such as the health status of subjects or the environment. In this study, we investigated the effects of soft drinks and regular coffee on EEG signals in the resting state and on the performance of MI-based BCI. Twenty-six healthy human subjects participated in three or four BCI sessions with a resting period in each session. During each session, the subjects drank an unlabeled soft drink with either sugar (Caffeine Free Coca-Cola), caffeine (Diet Coke), or neither ingredient (Caffeine Free Diet Coke), or a regular coffee if there was a fourth session. The resting-state spectral power in each condition was compared; the analysis showed that powers in the alpha and beta bands after caffeine consumption were decreased substantially compared to the control and sugar conditions. Although the attenuation of power in the frequency range used for the online BCI control signal was shown, group-averaged online BCI performance after consuming caffeine was similar to that in the other conditions. This work, for the first time, shows the effect of caffeine and sugar intake on online BCI performance and the resting-state brain signal.

  19. Russian Approach to Soft Power Promotion: Conceptual Approaches in Foreign Policy

    Directory of Open Access Journals (Sweden)

    Yulia Nikitina

    2014-01-01

    Full Text Available Foreign policy is one of the instruments for promoting the soft power of a state. According to Joseph Nye, civil society is the main source of a state's international attractiveness. The article analyses how Russian official foreign policy documents present the interaction between the state and civil society in order to promote Russian soft power. At the present stage, Russian civil society is perceived by state structures as an instrument and not a source of soft power. The article also analyses political values and models of development as elements of soft power as they are presented in official documents. Russia has a coherent normative model of regional development for the post-Soviet space. For the global level, Russia formulates rules of behavior that it would like to see in the international arena, but it does not formulate how Russian or regional post-Soviet models of development can contribute to world development.

  20. Multi-GPU Jacobian accelerated computing for soft-field tomography

    International Nuclear Information System (INIS)

    Borsic, A; Attardo, E A; Halter, R J

    2012-01-01

    Image reconstruction in soft-field tomography is based on an inverse problem formulation, where a forward model is fitted to the data. In medical applications, where the anatomy presents complex shapes, it is common to use finite element models (FEMs) to represent the volume of interest and solve a partial differential equation that models the physics of the system. Over the last decade, there has been a shifting interest from 2D modeling to 3D modeling, as the underlying physics of most problems are 3D. Although the increased computational power of modern computers allows working with much larger FEM models, the computational time required to reconstruct 3D images on a fine 3D FEM model can be significant, on the order of hours. For example, in electrical impedance tomography (EIT) applications using a dense 3D FEM mesh with half a million elements, a single reconstruction iteration takes approximately 15–20 min with optimized routines running on a modern multi-core PC. It is desirable to accelerate image reconstruction to enable researchers to more easily and rapidly explore data and reconstruction parameters. Furthermore, providing high-speed reconstructions is essential for some promising clinical application of EIT. For 3D problems, 70% of the computing time is spent building the Jacobian matrix, and 25% of the time in forward solving. In this work, we focus on accelerating the Jacobian computation by using single and multiple GPUs. First, we discuss an optimized implementation on a modern multi-core PC architecture and show how computing time is bounded by the CPU-to-memory bandwidth; this factor limits the rate at which data can be fetched by the CPU. Gains associated with the use of multiple CPU cores are minimal, since data operands cannot be fetched fast enough to saturate the processing power of even a single CPU core. GPUs have much faster memory bandwidths compared to CPUs and better parallelism. We are able to obtain acceleration factors of 20 times

  1. Multi-GPU Jacobian accelerated computing for soft-field tomography.

    Science.gov (United States)

    Borsic, A; Attardo, E A; Halter, R J

    2012-10-01

    Image reconstruction in soft-field tomography is based on an inverse problem formulation, where a forward model is fitted to the data. In medical applications, where the anatomy presents complex shapes, it is common to use finite element models (FEMs) to represent the volume of interest and solve a partial differential equation that models the physics of the system. Over the last decade, there has been a shifting interest from 2D modeling to 3D modeling, as the underlying physics of most problems are 3D. Although the increased computational power of modern computers allows working with much larger FEM models, the computational time required to reconstruct 3D images on a fine 3D FEM model can be significant, on the order of hours. For example, in electrical impedance tomography (EIT) applications using a dense 3D FEM mesh with half a million elements, a single reconstruction iteration takes approximately 15-20 min with optimized routines running on a modern multi-core PC. It is desirable to accelerate image reconstruction to enable researchers to more easily and rapidly explore data and reconstruction parameters. Furthermore, providing high-speed reconstructions is essential for some promising clinical application of EIT. For 3D problems, 70% of the computing time is spent building the Jacobian matrix, and 25% of the time in forward solving. In this work, we focus on accelerating the Jacobian computation by using single and multiple GPUs. First, we discuss an optimized implementation on a modern multi-core PC architecture and show how computing time is bounded by the CPU-to-memory bandwidth; this factor limits the rate at which data can be fetched by the CPU. Gains associated with the use of multiple CPU cores are minimal, since data operands cannot be fetched fast enough to saturate the processing power of even a single CPU core. GPUs have much faster memory bandwidths compared to CPUs and better parallelism. We are able to obtain acceleration factors of 20
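
    A conceptual sketch of the GPU-side Jacobian assembly described above is given below using CuPy, whose NumPy-style API runs the same array code on a GPU. The shapes, the random stand-in gradient fields, and the adjoint-style formula J[d, m, e] = -vol[e] * (grad_u[d,e] . grad_v[m,e]) are illustrative assumptions rather than the authors' implementation; a real solver would supply per-element gradients of the drive and measurement potentials from the FEM forward solution.

        import cupy as cp

        n_drive, n_meas, n_elem = 32, 32, 100_000
        grad_u = cp.random.rand(n_drive, n_elem, 3, dtype=cp.float32)  # drive fields
        grad_v = cp.random.rand(n_meas, n_elem, 3, dtype=cp.float32)   # measurement fields
        vol = cp.random.rand(n_elem, dtype=cp.float32)                 # element volumes

        # One einsum keeps all work on the GPU and avoids the CPU-memory
        # bandwidth bottleneck discussed in the abstract.
        J = -cp.einsum("dek,mek,e->dme", grad_u, grad_v, vol)
        J = J.reshape(n_drive * n_meas, n_elem)   # rows: drive-measure pairs
        cp.cuda.Stream.null.synchronize()         # wait for the GPU to finish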

  2. A New Screening Methodology for Improved Oil Recovery Processes Using Soft-Computing Techniques

    Science.gov (United States)

    Parada, Claudia; Ertekin, Turgay

    2010-05-01

    able to recognize the strong correlation between the displacement mechanism and the reservoir characteristics, as they effectively forecast hydrocarbon production for different types of reservoirs undergoing diverse recovery processes. The artificial neural networks are able to capture the similarities between different displacement mechanisms, as the same network architecture is successfully applied in both CO2 and N2 injection. The neuro-simulation application tool is built within a graphical user interface to facilitate the display of the results. The developed soft-computing tool offers an innovative approach to designing a variety of efficient and feasible IOR processes by using artificial intelligence. The tool provides appropriate guidelines to the reservoir engineer, facilitates the appraisal of diverse field development strategies for oil reservoirs, and helps to reduce the number of scenarios evaluated with conventional reservoir simulation.

  3. Finding-specific display presets for computed radiography soft-copy reading.

    Science.gov (United States)

    Andriole, K P; Gould, R G; Webb, W R

    1999-05-01

    Much work has been done to optimize the display of cross-sectional modality imaging examinations for soft-copy reading (i.e., window/level tissue presets, and format presentations such as tile and stack modes, four-on-one, nine-on-one, etc). Less attention has been paid to the display of digital forms of the conventional projection x-ray. The purpose of this study is to assess the utility of providing presets for computed radiography (CR) soft-copy display based not on window/level settings, but on processing applied to the image, optimized for visualization of specific findings, pathologies, etc (e.g., pneumothorax, tumor, tube location). It is felt that digital display of CR images based on finding-specific processing presets has the potential to: speed the reading of digital projection x-ray examinations on soft copy; improve diagnostic efficacy; standardize display across examination type, clinical scenario, important key findings, and significant negatives; facilitate image comparison; and improve confidence in and acceptance of soft-copy reading. Clinical chest images are acquired using an Agfa-Gevaert (Mortsel, Belgium) ADC 70 CR scanner and Fuji (Stamford, CT) 9000 and AC2 CR scanners. Those demonstrating pertinent findings are transferred over the clinical picture archiving and communications system (PACS) network to a research image-processing station (Agfa PS5000), where the optimal image-processing settings per finding, pathologic category, etc, are developed in conjunction with a thoracic radiologist by manipulating the multiscale image contrast amplification (Agfa MUSICA) algorithm parameters. Soft-copy display of images processed with finding-specific settings is compared with the standard default image presentation for 50 cases of each category. Comparison is scored using a 5-point scale, with the positive scale denoting that the standard presentation is preferred over the finding-specific processing, and the negative scale denoting the finding

  4. Computational fluid dynamics a practical approach

    CERN Document Server

    Tu, Jiyuan; Liu, Chaoqun

    2018-01-01

    Computational Fluid Dynamics: A Practical Approach, Third Edition, is an introduction to CFD fundamentals and commercial CFD software to solve engineering problems. The book is designed for a wide variety of engineering students new to CFD, and for practicing engineers learning CFD for the first time. Combining an appropriate level of mathematical background, worked examples, computer screen shots, and step-by-step processes, this book walks the reader through modeling and computing, as well as interpreting CFD results. This new edition has been updated throughout, with new content and improved figures, examples and problems.

  5. Computational neuropharmacology: dynamical approaches in drug discovery.

    Science.gov (United States)

    Aradi, Ildiko; Erdi, Péter

    2006-05-01

    Computational approaches that adopt dynamical models are widely accepted in basic and clinical neuroscience research as indispensable tools with which to understand normal and pathological neuronal mechanisms. Although computer-aided techniques have been used in pharmaceutical research (e.g. in structure- and ligand-based drug design), the power of dynamical models has not yet been exploited in drug discovery. We suggest that dynamical system theory and computational neuroscience--integrated with well-established, conventional molecular and electrophysiological methods--offer a broad perspective in drug discovery and in the search for novel targets and strategies for the treatment of neurological and psychiatric diseases.

  6. MR imaging of soft tissue alterations after total hip arthroplasty: comparison of classic surgical approaches

    Energy Technology Data Exchange (ETDEWEB)

    Agten, Christoph A.; Sutter, Reto; Pfirrmann, Christian W.A. [Balgrist University Hospital, Radiology, Zurich (Switzerland); University of Zurich, Faculty of Medicine, Zurich (Switzerland); Dora, Claudio [Balgrist University Hospital, Orthopedic Surgery, Zurich (Switzerland); University of Zurich, Faculty of Medicine, Zurich (Switzerland)

    2017-03-15

    To compare soft-tissue changes after total hip arthroplasty with posterior, direct-lateral, anterolateral, or anterior surgical approaches. MRI studies of 120 patients after primary total hip arthroplasty (30 per approach) were included. Each MRI was assessed by two readers regarding identification of surgical access, fatty muscle atrophy (Goutallier classification), tendon quality (0 = normal, 1 = tendinopathy, 2 = partial tear, 3 = avulsion), and fluid collections. Readers were blinded to the surgical approach. Surgical access was correctly identified in all cases. The direct lateral approach showed the highest Goutallier grades and tendon damage for the gluteus minimus muscle (2.07-2.67 and 2.00-2.77; p = 0.017 and p = 0.001 for readers 1 and 2, respectively) and tendon (2.30/1.67; p < 0.0005 for reader 1/2), and the lateral portion of the gluteus medius tendon (2.77/2.20; p < 0.0005 for reader 1/2). The posterior approach showed the highest Goutallier grades and tendon damage for the external rotator muscles (1.97-2.67 and 1.57-2.40; p < 0.0005-0.006 for reader 1/2) and tendons (1.41-2.45 and 1.93-2.76; p < 0.0005 for reader 1/2). The anterolateral and anterior approaches showed less soft tissue damage. Fluid collections showed no differences between the approaches. MRI is well suited to identify surgical approaches after THA. The anterior and anterolateral approaches showed less soft tissue damage compared to the posterior and direct lateral approaches. (orig.)

  7. MR imaging of soft tissue alterations after total hip arthroplasty: comparison of classic surgical approaches

    International Nuclear Information System (INIS)

    Agten, Christoph A.; Sutter, Reto; Pfirrmann, Christian W.A.; Dora, Claudio

    2017-01-01

    To compare soft-tissue changes after total hip arthroplasty with posterior, direct-lateral, anterolateral, or anterior surgical approaches. MRI studies of 120 patients after primary total hip arthroplasty (30 per approach) were included. Each MRI was assessed by two readers regarding identification of surgical access, fatty muscle atrophy (Goutallier classification), tendon quality (0 = normal, 1 = tendinopathy, 2 = partial tear, 3 = avulsion), and fluid collections. Readers were blinded to the surgical approach. Surgical access was correctly identified in all cases. The direct lateral approach showed the highest Goutallier grades and tendon damage for the gluteus minimus muscle (2.07-2.67 and 2.00-2.77; p = 0.017 and p = 0.001 for readers 1 and 2, respectively) and tendon (2.30/1.67; p < 0.0005 for reader 1/2), and the lateral portion of the gluteus medius tendon (2.77/2.20; p < 0.0005 for reader 1/2). The posterior approach showed the highest Goutallier grades and tendon damage for the external rotator muscles (1.97-2.67 and 1.57-2.40; p < 0.0005-0.006 for reader 1/2) and tendons (1.41-2.45 and 1.93-2.76; p < 0.0005 for reader 1/2). The anterolateral and anterior approaches showed less soft tissue damage. Fluid collections showed no differences between the approaches. MRI is well suited to identify surgical approaches after THA. The anterior and anterolateral approaches showed less soft tissue damage compared to the posterior and direct lateral approaches. (orig.)

  8. Granular, soft and fuzzy approaches for intelligent systems dedicated to professor Ronald R. Yager

    CERN Document Server

    Filev, Dimitar; Beliakov, Gleb

    2017-01-01

    This book offers a comprehensive report on the state-of-the art in the broadly-intended field of “intelligent systems”. After introducing key theoretical issues, it describes a number of promising models for data and system analysis, decision making, and control. It discusses important theories, including possibility theory, the Dempster-Shafer theory, the theory of approximate reasoning, as well as computing with words, together with novel applications in various areas, such as information aggregation and fusion, linguistic data summarization, participatory learning, systems modeling, and many others. By presenting the methods in their application contexts, the book shows how granular computing, soft computing and fuzzy logic techniques can provide novel, efficient solutions to real-world problems. It is dedicated to Professor Ronald R. Yager for his great scientific and scholarly achievements, and for his long-lasting service to the fuzzy logic, and the artificial and computational intelligence communit...

  9. Computer networking a top-down approach

    CERN Document Server

    Kurose, James

    2017-01-01

    Unique among computer networking texts, the Seventh Edition of the popular Computer Networking: A Top Down Approach builds on the author’s long tradition of teaching this complex subject through a layered approach in a “top-down manner.” The text works its way from the application layer down toward the physical layer, motivating readers by exposing them to important concepts early in their study of networking. Focusing on the Internet and the fundamentally important issues of networking, this text provides an excellent foundation for readers interested in computer science and electrical engineering, without requiring extensive knowledge of programming or mathematics. The Seventh Edition has been updated to reflect the most important and exciting recent advances in networking.

  10. Design approach of soft control system for implementation of advanced MMI in KNGR

    International Nuclear Information System (INIS)

    Kim, J. K.; Choi, M. J.; Choe, I. N.

    1999-01-01

    To overcome the inherent inflexibility of spatially dedicated man-machine interfaces (MMI) in conventional control rooms, computer-based MMI technologies, along with a compact workstation concept, are adopted in the KNGR control room target design. In order to achieve the compact workstation design, a large number of spatially dedicated control switches and manual/auto stations in a traditional control room have to be replaced by a few common multi-function devices. These control devices, the so-called Soft Control System, consist of personal-computer-based Flat Panel Display (FPD) devices with touch-sensitive screens which provide the control MMI for the component selected among a number of plant components. The Soft Control System is an MMI device that allows control of continuous and discrete control devices from a single panel device. It allows a standard interface device to assume the role of numerous control switches and analog control devices via software configuration. This has the advantage of allowing operator access to all plant controls from a single compact workstation. (author)

  11. CT evaluation of soft tissue and muscle infection and inflammation: A systematic compartmental approach

    Energy Technology Data Exchange (ETDEWEB)

    Beauchamp, N.J. Jr. [Dept. of Radiology, and Radiological Science, The Johns Hopkins Medical Institutions, Baltimore, MD (United States); Scott, W.W. Jr. [Dept. of Radiology, and Radiological Science, The Johns Hopkins Medical Institutions, Baltimore, MD (United States); Gottlieb, L.M. [Dept. of Surgery, The Johns Hopkins Medical Institutions, Baltimore, MD (United States); Fishman, E.K. [Dept. of Surgery, The Johns Hopkins Medical Institutions, Baltimore, MD (United States)

    1995-07-01

    This essay presents a systematic approach to the evaluation of soft tissue and muscle infection by defining the various pathologic processes and then illustrating them through a series of CT studies with corresponding schematic diagrams. The specific processes discussed are cellulitis, lymphangitis/lymphedema, necrotizing fasciitis, myositis/myonecrosis, and abscess. Key points in the differential diagnosis of these entities are discussed and illustrated. The clinical management of the specific pathologic processes is also discussed. (orig./MG)

  12. CT evaluation of soft tissue and muscle infection and inflammation: A systematic compartmental approach

    International Nuclear Information System (INIS)

    Beauchamp, N.J. Jr.; Scott, W.W. Jr.; Gottlieb, L.M.; Fishman, E.K.

    1995-01-01

    This essay presents a systematic approach to the evaluation of soft tissue and muscle infection by defining the various pathologic processes and then illustrating them through a series of CT studies with corresponding schematic diagrams. The specific processes discussed are cellulitis, lymphangitis/lymphedema, necrotizing fasciitis, myositis/myonecrosis, and abscess. Key points in the differential diagnosis of these entities are discussed and illustrated. The clinical management of the specific pathologic processes is also discussed. (orig./MG)

  13. The potential of soft computing methods in NPP instrumentation and control

    International Nuclear Information System (INIS)

    Hampel, R.; Chaker, N.; Kaestner, W.; Traichel, A.; Wagenknecht, M.; Gocht, U.

    2002-01-01

    The methods of signal processing by soft computing include the application of fuzzy logic, synthetic neural networks, and evolutionary algorithms. The article contains an outline of the objectives and results of the application of fuzzy logic and methods of synthetic neural networks in nuclear measurement and control. The special requirements to be met by the software in safety-related areas with respect to reliability, evaluation, and validation are described. Possible uses may be in off-line applications in modeling, simulation, and reliability analysis as well as in on-line applications (real-time systems) for instrumentation and control. Safety-related aspects of signal processing are described and analyzed for the fuzzy logic and synthetic neural network concepts. Applications are covered in selected examples. (orig.)

  14. Soft Computing Methods for Microwave and Millimeter-Wave Design Problems

    CERN Document Server

    Chauhan, Narendra; Mittal, Ankush

    2012-01-01

    The growing commercial market of the microwave/millimeter-wave industry over the past decade has led to an explosion of interest in, and opportunities for, the design and development of microwave components. The design of most microwave components requires the use of commercially available electromagnetic (EM) simulation tools for their analysis. In the design process, the simulations are carried out by varying the design parameters until the desired response is obtained. The optimization of design parameters by manual searching is a cumbersome and time-consuming process. Soft computing methods such as Genetic Algorithms (GA), Artificial Neural Networks (ANN) and Fuzzy Logic (FL) have been widely used by EM researchers for microwave design over the last decade. The aim of these methods is to tolerate imprecision, uncertainty, and approximation in order to achieve robust, low-cost solutions in a small time frame. Modeling and optimization are essential parts of, and powerful tools for, microwave/millimeter-wave design. This boo...

  15. River suspended sediment estimation by climatic variables implication: Comparative study among soft computing techniques

    Science.gov (United States)

    Kisi, Ozgur; Shiri, Jalal

    2012-06-01

    Estimating the sediment volume carried by a river is an important issue in water resources engineering. This paper compares the accuracy of three different soft computing methods, Artificial Neural Networks (ANNs), the Adaptive Neuro-Fuzzy Inference System (ANFIS), and Gene Expression Programming (GEP), in estimating daily suspended sediment concentration in rivers by using hydro-meteorological data. The daily rainfall, streamflow and suspended sediment concentration data from the Eel River near Dos Rios, California, USA, are used as a case study. The comparison results indicate that the GEP model performs better than the other models in daily suspended sediment concentration estimation for the particular data sets used in this study. The Levenberg-Marquardt, conjugate gradient and gradient descent training algorithms were used for the ANN models. Of the three algorithms, the conjugate gradient algorithm was found to be better than the others.
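
    A symbolic-regression baseline in the spirit of GEP can be set up with the gplearn library (an assumption for illustration; the paper's GEP software is not named), on synthetic rainfall/streamflow data standing in for the Eel River series:

        import numpy as np
        from gplearn.genetic import SymbolicRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        rain, flow = rng.random(1000), rng.random(1000)
        # synthetic suspended sediment concentration with a known relationship
        ssc = 3.0 * flow**1.5 + 0.5 * rain + rng.normal(0, 0.05, 1000)

        X = np.column_stack([rain, flow])
        X_tr, X_te, y_tr, y_te = train_test_split(X, ssc, random_state=0)

        gp = SymbolicRegressor(population_size=2000, generations=20,
                               function_set=("add", "sub", "mul", "div", "sqrt"),
                               random_state=0)
        gp.fit(X_tr, y_tr)
        print(gp._program)           # evolved closed-form expression
        print(gp.score(X_te, y_te))  # R^2 on held-out days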

  16. Enhancing performance of next generation FSO communication systems using soft computing-based predictions.

    Science.gov (United States)

    Kazaura, Kamugisha; Omae, Kazunori; Suzuki, Toshiji; Matsumoto, Mitsuji; Mutafungwa, Edward; Korhonen, Timo O; Murakami, Tadaaki; Takahashi, Koichi; Matsumoto, Hideki; Wakamori, Kazuhiko; Arimoto, Yoshinori

    2006-06-12

    The deterioration and deformation of a free-space optical beam wave-front as it propagates through the atmosphere can reduce the link availability and may introduce burst errors, thus degrading the performance of the system. We investigate the suitability of soft-computing (SC) based tools for improving the performance of free-space optical (FSO) communications systems. The SC-based tools are used for the prediction of key parameters of a FSO communications system. Measured data collected from an experimental FSO communication system are used as training and testing data for a proposed multi-layer neural network predictor (MNNP) used to predict future parameter values. The predicted parameters are essential for reducing transmission errors by improving the antenna's accuracy in tracking data beams, which is particularly important during periods of strong atmospheric turbulence. The parameter values predicted using the proposed tool show acceptable conformity with the original measurements.

  17. Prediction of monthly regional groundwater levels through hybrid soft-computing techniques

    Science.gov (United States)

    Chang, Fi-John; Chang, Li-Chiu; Huang, Chien-Wei; Kao, I.-Feng

    2016-10-01

    Groundwater systems are intrinsically heterogeneous with dynamic temporal-spatial patterns, which causes great difficulty in quantifying their complex processes, while reliable predictions of regional groundwater levels are commonly needed for managing water resources to ensure proper service of water demands within a region. In this study, we propose a novel and flexible soft-computing technique that can effectively extract the complex high-dimensional input-output patterns of basin-wide groundwater-aquifer systems in an adaptive manner. The soft-computing models combine the Self-Organizing Map (SOM) and the Nonlinear Autoregressive with Exogenous Inputs (NARX) network for predicting monthly regional groundwater levels based on hydrologic forcing data. The SOM classifies the temporal-spatial patterns of regional groundwater levels, the NARX predicts the mean of regional groundwater levels for adjusting the selected SOM, and Kriging interpolates the predictions of the adjusted SOM onto finer grids of locations, so that a monthly regional groundwater level map can be obtained. The Zhuoshui River basin in Taiwan served as the case study, and its monthly data sets collected from 203 groundwater stations, 32 rainfall stations and 6 flow stations between 2000 and 2013 were used for modelling purposes. The results demonstrate that the hybrid SOM-NARX model can reliably predict monthly basin-wide groundwater levels with high correlations (R2 > 0.9 in both training and testing cases). The proposed methodology presents a milestone in modelling regional environmental issues and offers an insightful and promising way to predict monthly basin-wide groundwater levels, which is beneficial to authorities for sustainable water resources management.
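    As background for the SOM stage described above, here is a minimal self-organizing map written in plain NumPy, trained on synthetic stand-ins for monthly station vectors; the grid size, decay schedules and data are illustrative assumptions, not the paper's configuration.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    data = rng.normal(size=(500, 12))      # 500 "months" x 12 "stations" (hypothetical)
    grid_h, grid_w, dim = 4, 4, data.shape[1]
    weights = rng.normal(size=(grid_h, grid_w, dim))
    coords = np.stack(np.meshgrid(np.arange(grid_h), np.arange(grid_w),
                                  indexing="ij"), axis=-1)

    def winner(x):
        """Return grid coordinates of the best-matching unit for sample x."""
        d = np.linalg.norm(weights - x, axis=2)
        return np.unravel_index(np.argmin(d), d.shape)

    for t, x in enumerate(data[rng.permutation(len(data))]):
        lr = 0.5 * np.exp(-t / len(data))        # decaying learning rate
        sigma = 2.0 * np.exp(-t / len(data))     # decaying neighbourhood radius
        w = winner(x)
        dist2 = ((coords - np.array(w)) ** 2).sum(axis=-1)
        h = np.exp(-dist2 / (2 * sigma ** 2))[..., None]  # neighbourhood kernel
        weights += lr * h * (x - weights)        # pull units toward the sample

    print("cluster of first month:", winner(data[0]))
    ```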

  18. APPLICATION OF SOFT COMPUTING TECHNIQUES FOR PREDICTING COOLING TIME REQUIRED DROPPING INITIAL TEMPERATURE OF MASS CONCRETE

    Directory of Open Access Journals (Sweden)

    Santosh Bhattarai

    2017-07-01

    Full Text Available Minimizing thermal cracks in mass concrete at an early age can be achieved by removing the hydration heat as quickly as possible within the initial cooling period, before the next lift is placed. Knowing the time needed to remove the hydration heat within the initial cooling period helps in making effective and efficient decisions on the temperature control plan in advance. The thermal properties of the concrete, the water cooling parameters and the construction parameter are the most influential factors involved in the process, and the relationships between these parameters are non-linear, complicated and not well understood. Some attempts have been made to understand and formulate the relationship taking account of the thermal properties of concrete and the cooling water parameters. Thus, in this study, an effort has been made to formulate the relationship taking account of the thermal properties of concrete, the water cooling parameters and the construction parameter, with the help of two soft computing techniques, namely Genetic Programming (GP, using the software "Eureqa") and an Artificial Neural Network (ANN). The relationships were developed from data available from a recently constructed high double-curvature concrete arch dam. The value of R for the relationship between the predicted and real cooling time is 0.8822 for the GP model and 0.9146 for the ANN model. The relative impact of the input parameters on the target parameter was evaluated through sensitivity analysis, and the results reveal that the construction parameter influences the target parameter significantly. Furthermore, during the testing phase of the proposed models with an independent data set, the absolute and relative errors were significantly low, which indicates that the prediction power of the employed soft computing techniques is satisfactory as compared to the measured data.

  19. Advanced computational approaches to biomedical engineering

    CERN Document Server

    Saha, Punam K; Basu, Subhadip

    2014-01-01

    There has been rapid growth in biomedical engineering in recent decades, given advancements in medical imaging and physiological modelling and sensing systems, coupled with immense growth in computational and network technology, analytic approaches, visualization and virtual-reality, man-machine interaction and automation. Biomedical engineering involves applying engineering principles to the medical and biological sciences and it comprises several topics including biomedicine, medical imaging, physiological modelling and sensing, instrumentation, real-time systems, automation and control, sig

  20. Soft sensors with white- and black-box approaches for a wastewater treatment process

    Directory of Open Access Journals (Sweden)

    D. Zyngier

    2000-12-01

    Full Text Available The increasing degradation of water resources makes it necessary to monitor and control process variables that may disturb the environment but which may be very difficult to measure directly, either because no physical sensors are available or because these are too expensive. In this work, two soft sensors are proposed for monitoring the concentrations of nitrate (NO3-) and ammonium (NH4+) ions and of carbonaceous matter (CM) during the nitrification of wastewater. One of them is based on reintegration of a process model to estimate NO3- and NH4+ and on a feedforward neural network to estimate CM. The other estimator is based on Stacked Neural Networks (SNN), an approach that provides the predictor with robustness. After simulation, both soft sensors were implemented in an experimental unit using FIX MMI (Intellution, Inc.) automation software as an interface between the process and MATLAB 5.1 (The MathWorks, Inc.) software.

  1. Soft systems methodology as a systemic approach to nuclear safety management

    Energy Technology Data Exchange (ETDEWEB)

    Vieira Neto, Antonio S.; Guilhen, Sabine N.; Rubin, Gerson A.; Caldeira Filho, Jose S.; Camargo, Iara M.C., E-mail: asvneto@ipen.br, E-mail: snguilhen@ipen.br, E-mail: garubin@ipen.br, E-mail: jscaldeira@ipen.br, E-mail: icamargo@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNE-SP), Sao Paulo, SP (Brazil)

    2017-07-01

    The safety approach currently adopted by nuclear installations is built almost exclusively upon analytical methodologies based mainly on the belief that the properties of a system, such as its safety, are given by its constituent parts. This approach, however, does not properly address the complex dynamic interactions between technical, human and organizational factors occurring within and outside the organization. After the accident at the Fukushima Daiichi nuclear power plant in March 2011, experts of the International Atomic Energy Agency (IAEA) recommended a systemic approach as a complementary perspective on nuclear safety. The aim of this paper is to present an overview of the systems thinking approach and its potential use for structuring the socio-technical problems involved in the safety of nuclear installations, highlighting the methodologies related to soft systems thinking, in particular the Soft Systems Methodology (SSM). The implementation of a systemic approach may thus result in a more holistic picture of the system, by accounting for the complex dynamic interactions between technical, human and organizational factors. (author)

  2. Soft systems methodology as a systemic approach to nuclear safety management

    International Nuclear Information System (INIS)

    Vieira Neto, Antonio S.; Guilhen, Sabine N.; Rubin, Gerson A.; Caldeira Filho, Jose S.; Camargo, Iara M.C.

    2017-01-01

    The safety approach currently adopted by nuclear installations is built almost exclusively upon analytical methodologies based mainly on the belief that the properties of a system, such as its safety, are given by its constituent parts. This approach, however, does not properly address the complex dynamic interactions between technical, human and organizational factors occurring within and outside the organization. After the accident at the Fukushima Daiichi nuclear power plant in March 2011, experts of the International Atomic Energy Agency (IAEA) recommended a systemic approach as a complementary perspective on nuclear safety. The aim of this paper is to present an overview of the systems thinking approach and its potential use for structuring the socio-technical problems involved in the safety of nuclear installations, highlighting the methodologies related to soft systems thinking, in particular the Soft Systems Methodology (SSM). The implementation of a systemic approach may thus result in a more holistic picture of the system, by accounting for the complex dynamic interactions between technical, human and organizational factors. (author)

  3. Computational Approaches to Nucleic Acid Origami.

    Science.gov (United States)

    Jabbari, Hosna; Aminpour, Maral; Montemagno, Carlo

    2015-10-12

    Recent advances in experimental DNA origami have dramatically expanded the horizon of DNA nanotechnology. Complex 3D suprastructures have been designed and developed using DNA origami, with applications in biomaterial science, nanomedicine, nanorobotics, and molecular computation. Ribonucleic acid (RNA) origami has recently been realized as a new approach. Similar to DNA, RNA molecules can be designed to form complex 3D structures through complementary base pairings. RNA origami structures are, however, more compact and more thermodynamically stable due to RNA's non-canonical base pairing and tertiary interactions. Despite these advantages, the development of RNA origami lags far behind that of DNA origami. Furthermore, although computational methods have proven to be effective in designing DNA and RNA origami structures and in their evaluation, advances in computational nucleic acid origami are even more limited. In this paper, we review major milestones in experimental and computational DNA and RNA origami and present current challenges in these fields. We believe collaboration between experimental nanotechnologists and computer scientists is critical for advancing these new research paradigms.

  4. Using soft-X-ray energy spectrum to measure electronic temperature Te and primary research with computer data processing

    International Nuclear Information System (INIS)

    Wang Jingyao; Zhang Guangyang

    1993-01-01

    The authors report the application of the SCORPIO-2000 computer detection system on a nuclear fusion device to measure the energy spectrum of soft X-rays, from which the plasma electron temperature was calculated. The data in the 1-4 keV soft X-ray energy range were processed systematically. The program was written mostly in FORTRAN, with only one subroutine (SUBSB) written in assembly language. The program worked reliably, with convenient operation and easy correction of the data. The calculated results and the resulting diagram agree with expectations.

  5. Introducing Computational Approaches in Intermediate Mechanics

    Science.gov (United States)

    Cook, David M.

    2006-12-01

    In the winter of 2003, we at Lawrence University moved Lagrangian mechanics and rigid body dynamics from a required sophomore course to an elective junior/senior course, freeing 40% of the time for computational approaches to ordinary differential equations (trajectory problems, the large-amplitude pendulum, non-linear dynamics); evaluation of integrals (finding centers of mass and moment of inertia tensors, calculating gravitational potentials for various sources); finding eigenvalues and eigenvectors of matrices (diagonalizing the moment of inertia tensor, finding principal axes); and generating graphical displays of computed results. Further, students begin to use LaTeX to prepare some of their submitted problem solutions. Placed in the middle of the sophomore year, this course provides the background that permits faculty members, as appropriate, to assign computer-based exercises in subsequent courses. Further, students are encouraged to use our Computational Physics Laboratory on their own initiative whenever that use seems appropriate. (Curricular development supported in part by the W. M. Keck Foundation, the National Science Foundation, and Lawrence University.)

  6. Interacting electrons theory and computational approaches

    CERN Document Server

    Martin, Richard M; Ceperley, David M

    2016-01-01

    Recent progress in the theory and computation of electronic structure is bringing an unprecedented level of capability for research. Many-body methods are becoming essential tools vital for quantitative calculations and understanding materials phenomena in physics, chemistry, materials science and other fields. This book provides a unified exposition of the most-used tools: many-body perturbation theory, dynamical mean field theory and quantum Monte Carlo simulations. Each topic is introduced with a less technical overview for a broad readership, followed by in-depth descriptions and mathematical formulation. Practical guidelines, illustrations and exercises are chosen to enable readers to appreciate the complementary approaches, their relationships, and the advantages and disadvantages of each method. This book is designed for graduate students and researchers who want to use and understand these advanced computational tools, get a broad overview, and acquire a basis for participating in new developments.

  7. Computational approaches to analogical reasoning current trends

    CERN Document Server

    Richard, Gilles

    2014-01-01

    Analogical reasoning is known as a powerful mode for drawing plausible conclusions and solving problems. It has been the topic of a huge number of works by philosophers, anthropologists, linguists, psychologists, and computer scientists. As such, it has been early studied in artificial intelligence, with a particular renewal of interest in the last decade. The present volume provides a structured view of current research trends on computational approaches to analogical reasoning. It starts with an overview of the field, with an extensive bibliography. The 14 collected contributions cover a large scope of issues. First, the use of analogical proportions and analogies is explained and discussed in various natural language processing problems, as well as in automated deduction. Then, different formal frameworks for handling analogies are presented, dealing with case-based reasoning, heuristic-driven theory projection, commonsense reasoning about incomplete rule bases, logical proportions induced by similarity an...

  8. CGC/saturation approach for soft interactions at high energy: long range rapidity correlations

    International Nuclear Information System (INIS)

    Gotsman, E.; Maor, U.; Levin, E.

    2015-01-01

    In this paper we continue our program to construct a model for high energy soft interactions that is based on the CGC/saturation approach. The main result of this paper is that we have discovered a mechanism that leads to large long range rapidity correlations and results in large values of the correlation function R(y₁, y₂) ≥ 1, which is independent of y₁ and y₂. Such a behavior of the correlation function provides strong support for the idea that at high energies the system of partons that is produced is not only dense but also has strong attractive forces acting between the partons. (orig.)

  9. Soft Computing Technique and Conventional Controller for Conical Tank Level Control

    Directory of Open Access Journals (Sweden)

    Sudharsana Vijayan

    2016-03-01

    Full Text Available In many process industries the control of liquid level is mandatory, but the control of nonlinear processes is difficult. Many process industries use conical tanks because their non-linear shape provides better drainage for solid mixtures, slurries and viscous liquids. Control of the conical tank level is therefore a challenging task due to its non-linearity and continually varying cross-section, which produce a square-root relationship between the controlled variable (level) and the manipulated variable (flow rate). The main objective is to implement a suitable controller for the conical tank system to maintain the desired level. System identification of the non-linear process is done using black-box modelling, and the process is found to be a first order plus dead time (FOPDT) model. In this paper it is proposed to obtain the mathematical model of a conical tank system, to study the system using its block diagram, and then to compare a soft computing technique (a fuzzy controller) with a conventional controller.
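    A brief simulation makes the nonlinearity concrete: the cross-sectional area grows with the square of the level while the outflow follows a square-root law, so a fixed-gain PI loop sees a very different plant at different operating points. All constants below are illustrative assumptions, not values from the paper.

    ```python
    import numpy as np

    R, H = 0.5, 1.0            # tank top radius and height (m), assumed
    k_out = 0.01               # outflow coefficient: Q_out = k_out * sqrt(h)
    Kp, Ki = 0.02, 0.005       # hand-tuned PI gains for this sketch
    dt, setpoint = 0.01, 0.5   # time step (s) and level setpoint (m)

    h, integral = 0.1, 0.0
    for _ in range(20000):     # 200 s of simulated time
        error = setpoint - h
        integral += error * dt
        q_in = max(0.0, Kp * error + Ki * integral)  # inflow, the manipulated variable
        area = np.pi * (R * h / H) ** 2              # cross-section varies with level
        dh = (q_in - k_out * np.sqrt(h)) / max(area, 1e-6)
        h = max(1e-6, h + dh * dt)                   # explicit Euler step

    print(f"level after 200 s: {h:.3f} m (setpoint {setpoint} m)")
    ```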

  10. Intelligent Soft Computing on Forex: Exchange Rates Forecasting with Hybrid Radial Basis Neural Network.

    Science.gov (United States)

    Falat, Lukas; Marcek, Dusan; Durisova, Maria

    2016-01-01

    This paper deals with the application of quantitative soft computing prediction models in the financial domain, as reliable and accurate prediction models can be very helpful in the management decision-making process. The authors suggest a new hybrid neural network which is a combination of the standard RBF neural network, a genetic algorithm, and a moving average. The moving average is intended to enhance the outputs of the network using the error part of the original neural network. The authors test the suggested model on high-frequency time series data of USD/CAD and examine its ability to forecast exchange rate values for a horizon of one day. To determine the forecasting efficiency, they perform a comparative statistical out-of-sample analysis of the tested model against autoregressive models and the standard neural network. They also incorporate a genetic algorithm as an optimizing technique for adapting the parameters of the ANN, which is then compared with standard backpropagation and backpropagation combined with the K-means clustering algorithm. Finally, the authors find that their suggested hybrid neural network is able to produce more accurate forecasts than the standard models and can help reduce the risk of making bad decisions in the decision-making process.
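    The hybrid idea, an RBF network whose forecast is corrected using its own recent errors, can be sketched as follows on synthetic data. The genetic-algorithm tuning of the network parameters is omitted, and the constant correction below is a crude stand-in for the moving-average scheme the authors describe.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    t = np.arange(600)
    rate = 1.3 + 0.05 * np.sin(t / 30) + rng.normal(0, 0.005, len(t))  # fake FX series

    lag = 5                                   # predict next value from 5 lags
    X = np.stack([rate[i:i + lag] for i in range(len(rate) - lag)])
    y = rate[lag:]
    split = 500
    X_tr, y_tr, X_te, y_te = X[:split], y[:split], X[split:], y[split:]

    centers = X_tr[rng.choice(len(X_tr), 20, replace=False)]  # random RBF centers
    width = np.mean([np.linalg.norm(a - b) for a in centers for b in centers]) + 1e-9

    def design(X):
        """Gaussian RBF design matrix: one column per center."""
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        return np.exp(-(d / width) ** 2)

    W, *_ = np.linalg.lstsq(design(X_tr), y_tr, rcond=None)   # output weights

    pred = design(X_te) @ W
    resid = y_tr - design(X_tr) @ W
    ma_correction = resid[-10:].mean()        # average of the most recent errors
    print("RMSE plain RBF:", np.sqrt(np.mean((y_te - pred) ** 2)))
    print("RMSE RBF + MA :", np.sqrt(np.mean((y_te - (pred + ma_correction)) ** 2)))
    ```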

  11. Intelligent Soft Computing on Forex: Exchange Rates Forecasting with Hybrid Radial Basis Neural Network

    Directory of Open Access Journals (Sweden)

    Lukas Falat

    2016-01-01

    Full Text Available This paper deals with the application of quantitative soft computing prediction models in the financial domain, as reliable and accurate prediction models can be very helpful in the management decision-making process. The authors suggest a new hybrid neural network which is a combination of the standard RBF neural network, a genetic algorithm, and a moving average. The moving average is intended to enhance the outputs of the network using the error part of the original neural network. The authors test the suggested model on high-frequency time series data of USD/CAD and examine its ability to forecast exchange rate values for a horizon of one day. To determine the forecasting efficiency, they perform a comparative statistical out-of-sample analysis of the tested model against autoregressive models and the standard neural network. They also incorporate a genetic algorithm as an optimizing technique for adapting the parameters of the ANN, which is then compared with standard backpropagation and backpropagation combined with the K-means clustering algorithm. Finally, the authors find that their suggested hybrid neural network is able to produce more accurate forecasts than the standard models and can help reduce the risk of making bad decisions in the decision-making process.

  12. Temperature-based estimation of global solar radiation using soft computing methodologies

    Science.gov (United States)

    Mohammadi, Kasra; Shamshirband, Shahaboddin; Danesh, Amir Seyed; Abdullah, Mohd Shahidan; Zamani, Mazdak

    2016-07-01

    Precise knowledge of solar radiation is essential in different technological and scientific applications of solar energy. Temperature-based estimation of global solar radiation is appealing owing to the broad availability of measured air temperatures. In this study, the potential of soft computing techniques is evaluated to estimate daily horizontal global solar radiation (DHGSR) from measured maximum, minimum, and average air temperatures (Tmax, Tmin, and Tavg) in an Iranian city. For this purpose, a comparative evaluation of three methodologies, the adaptive neuro-fuzzy inference system (ANFIS), radial basis function support vector regression (SVR-rbf), and polynomial basis function support vector regression (SVR-poly), is performed. Five combinations of Tmax, Tmin, and Tavg serve as inputs to develop the ANFIS, SVR-rbf, and SVR-poly models. The attained results show that all models provide favorable accuracy. For all techniques, the highest accuracies are achieved by models (5), which use Tmax − Tmin and Tmax as inputs. According to the statistical results, SVR-rbf outperforms SVR-poly and ANFIS. For SVR-rbf (5), the mean absolute bias error, root mean square error, and correlation coefficient are 1.1931 MJ/m2, 2.0716 MJ/m2, and 0.9380, respectively. The results confirm that SVR-rbf can be used efficiently to estimate DHGSR from air temperatures.
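    A hedged sketch of this comparison uses scikit-learn's SVR with RBF and polynomial kernels on input combination (5), i.e. Tmax − Tmin and Tmax. The data are synthetic, generated from a Hargreaves-like relation assumed purely for illustration; nothing here reproduces the paper's measurements.

    ```python
    import numpy as np
    from sklearn.svm import SVR
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import mean_absolute_error

    rng = np.random.default_rng(3)
    n = 800
    tmax = rng.uniform(10, 40, n)                    # daily maximum temperature (C)
    tmin = tmax - rng.uniform(5, 15, n)              # daily minimum temperature (C)
    # Hypothetical Hargreaves-like relation: radiation grows with sqrt(Tmax - Tmin).
    dhgsr = 0.16 * np.sqrt(tmax - tmin) * (tmax + 10) + rng.normal(0, 1, n)

    X = np.column_stack([tmax - tmin, tmax])         # input combination (5)
    X_tr, X_te, y_tr, y_te = train_test_split(X, dhgsr, random_state=0)

    for kernel in ("rbf", "poly"):
        model = make_pipeline(StandardScaler(), SVR(kernel=kernel, C=10.0))
        model.fit(X_tr, y_tr)
        mae = mean_absolute_error(y_te, model.predict(X_te))
        print(f"SVR-{kernel}: MAE = {mae:.3f} MJ/m^2")
    ```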

  13. COOBBO: A Novel Opposition-Based Soft Computing Algorithm for TSP Problems

    Directory of Open Access Journals (Sweden)

    Qingzheng Xu

    2014-12-01

    Full Text Available In this paper, we propose a novel definition of an opposite path. Its core feature is that the sequence of candidate paths and the distances between adjacent nodes in the tour are considered simultaneously. In a sense, the candidate path and its corresponding opposite path have the same (or at least similar) distance to the optimal path in the current population. Based on an accepted framework for employing opposition-based learning, Oppositional Biogeography-Based Optimization using the Current Optimum, called the COOBBO algorithm, is introduced to solve traveling salesman problems. We demonstrate its performance on eight benchmark problems and compare it with other optimization algorithms. Simulation results illustrate that the excellent performance of our proposed algorithm is attributed to the distinct definition of the opposite path. In addition, its great strength lies in exploitation for enhancing solution accuracy, not exploration for improving population diversity. Finally, by comparing different versions of COOBBO, another conclusion is that each successful opposition-based soft computing algorithm needs to adjust and maintain a good balance between backward adjacent nodes and forward adjacent nodes.
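    The abstract does not fully specify the opposite-path construction for tours, but the underlying principle is classical opposition-based learning: for a candidate x in [a, b], also evaluate the opposite point a + b − x and keep the fitter of the two. A minimal continuous-variable sketch of that principle (not the paper's permutation variant):

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    a, b = -5.0, 5.0                             # search bounds, illustrative
    f = lambda x: np.sum(x ** 2, axis=-1)        # sphere function to minimize

    pop = rng.uniform(a, b, size=(20, 3))        # random initial population
    opp = a + b - pop                            # opposite population
    both = np.vstack([pop, opp])
    best = both[np.argsort(f(both))[:20]]        # keep the fitter half
    print("best initial fitness:", f(best[0]))
    ```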

  14. Intelligent Soft Computing on Forex: Exchange Rates Forecasting with Hybrid Radial Basis Neural Network

    Science.gov (United States)

    Marcek, Dusan; Durisova, Maria

    2016-01-01

    This paper deals with the application of quantitative soft computing prediction models in the financial domain, as reliable and accurate prediction models can be very helpful in the management decision-making process. The authors suggest a new hybrid neural network which is a combination of the standard RBF neural network, a genetic algorithm, and a moving average. The moving average is intended to enhance the outputs of the network using the error part of the original neural network. The authors test the suggested model on high-frequency time series data of USD/CAD and examine its ability to forecast exchange rate values for a horizon of one day. To determine the forecasting efficiency, they perform a comparative statistical out-of-sample analysis of the tested model against autoregressive models and the standard neural network. They also incorporate a genetic algorithm as an optimizing technique for adapting the parameters of the ANN, which is then compared with standard backpropagation and backpropagation combined with the K-means clustering algorithm. Finally, the authors find that their suggested hybrid neural network is able to produce more accurate forecasts than the standard models and can help reduce the risk of making bad decisions in the decision-making process. PMID:26977450

  15. Monitoring the Microgravity Environment Quality On-Board the International Space Station Using Soft Computing Techniques

    Science.gov (United States)

    Jules, Kenol; Lin, Paul P.

    2001-01-01

    This paper presents an artificial intelligence monitoring system developed by the NASA Glenn Principal Investigator Microgravity Services project to help the principal investigator teams identify the primary vibratory disturbance sources that are active, at any moment in time, on-board the International Space Station, which might impact the microgravity environment their experiments are exposed to. From the Principal Investigator Microgravity Services' web site, the principal investigator teams can monitor via a graphical display, in near real time, which event(s) is/are on, such as crew activities, pumps, fans, centrifuges, compressors, crew exercise, platform structural modes, etc., and decide whether or not to run their experiments based on the acceleration environment associated with a specific event. This monitoring system is focused primarily on detecting the vibratory disturbance sources, but could be used as well to detect some of the transient disturbance sources, depending on the event's duration. The system has built-in capability to detect both known and unknown vibratory disturbance sources. Several soft computing techniques such as Kohonen's Self-Organizing Feature Map, Learning Vector Quantization, Back-Propagation Neural Networks, and Fuzzy Logic were used to design the system.

  16. Risk assessment through drinking water pathway via uncertainty modeling of contaminant transport using soft computing

    International Nuclear Information System (INIS)

    Datta, D.; Ranade, A.K.; Pandey, M.; Sathyabama, N.; Kumar, Brij

    2012-01-01

    The basic objective of an environmental impact assessment (EIA) is to build guidelines to reduce the associated risk or mitigate the consequences of a reactor accident at its source: to prevent deterministic health effects and to reduce the risk of stochastic health effects (e.g., cancer and severe hereditary effects) as much as reasonably achievable by implementing protective actions in accordance with IAEA guidance (IAEA Safety Series No. 115, 1996). Since the measure of exposure is the basic tool for taking appropriate risk-reduction decisions, EIA is traditionally expressed in terms of radiation exposure to members of the public. However, the models used to estimate the exposure received by members of the public are governed by parameters, some of which are deterministic with relative uncertainty and some of which are stochastic as well as imprecise (insufficient knowledge). In a mixed environment of this type, it is essential to assess the uncertainty of a model in order to estimate the bounds of public exposure and support decisions during a nuclear or radiological emergency. With this in view, a soft computing technique, evidence theory based assessment of model parameters, is applied to compute the risk or exposure to members of the public. A possible exposure pathway to members of the public in the aquatic food stream is drinking water. Accordingly, this paper presents the uncertainty analysis of exposure via uncertainty analysis of the contaminated water. Evidence theory expresses the uncertainty in terms of a lower bound (belief measure) and an upper bound of exposure (plausibility measure). In this work EIA is presented using evidence theory, and a data fusion technique is used to aggregate knowledge on the uncertain information. Uncertainty of concentration and exposure is expressed as a belief-plausibility interval.
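    The belief/plausibility bounds mentioned above can be illustrated with a toy Dempster-Shafer computation: a basic probability assignment over concentration intervals yields a lower bound (belief) and an upper bound (plausibility) for the event "concentration below a safe limit". All masses and limits below are invented for illustration.

    ```python
    intervals = {          # focal elements: concentration intervals (Bq/L) -> mass
        (0.0, 1.0): 0.5,
        (0.5, 2.0): 0.3,
        (1.5, 3.0): 0.2,
    }
    threshold = 1.5        # hypothetical safe limit

    # Belief: total mass of intervals lying entirely below the limit (certainly safe).
    bel = sum(m for (lo, hi), m in intervals.items() if hi <= threshold)
    # Plausibility: total mass of intervals that at least overlap the safe region.
    pl = sum(m for (lo, hi), m in intervals.items() if lo < threshold)

    print(f"Bel(safe) = {bel:.2f}  <=  Pl(safe) = {pl:.2f}")
    ```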

  17. Soft computing modelling of moisture sorption isotherms of milk-foxtail millet powder and determination of thermodynamic properties.

    Science.gov (United States)

    Simha, H V Vikram; Pushpadass, Heartwin A; Franklin, Magdaline Eljeeva Emerald; Kumar, P Arun; Manimala, K

    2016-06-01

    Moisture sorption isotherms of spray-dried milk-foxtail millet powder were determined at 10, 25 and 40 °C. Sorption data was fitted using classical and soft-computing approaches. The isotherms were of type II, and equilibrium moisture content (EMC) was temperature dependent. The BET monolayer moisture content decreased from 3.30 to 2.67 % as temperature increased from 10 to 40 °C. Amongst the classical models, Ferro-Fontan gave the best fit of EMC-aw data. However, the Sugeno-type adaptive neuro-fuzzy inference system (ANFIS) with generalized bell-shaped membership function performed better than artificial neural network and classical models with RMSE as low as 0.0099. The isosteric heat of sorption decreased from 150.32 kJ mol(-1) at 1 % moisture content to 44.11 kJ mol(-1) at 15 % moisture. The enthalpy-entropy compensation theory was validated, and the isokinetic and harmonic mean temperatures were determined as 333.1 and 297.5 K, respectively.

  18. Accuracy and reliability of facial soft tissue depth measurements using cone beam computer tomography

    NARCIS (Netherlands)

    Fourie, Zacharias; Damstra, Janalt; Gerrits, Pieter; Ren, Yijin

    2010-01-01

    It is important to have accurate and reliable measurements of soft tissue thickness for specific landmarks of the face and scalp when producing a facial reconstruction. In the past, several methods have been created to measure facial soft tissue thickness (FSTT) in cadavers and in the living. The

  19. User involvement in the design of human-computer interactions: some similarities and differences between design approaches

    NARCIS (Netherlands)

    Bekker, M.M.; Long, J.B.

    1998-01-01

    This paper presents a general review of user involvement in the design of human-computer interactions, as advocated by a selection of different approaches to design. The selection comprises User-Centred Design, Participatory Design, Socio-Technical Design, Soft Systems Methodology, and Joint

  20. A Dynamic Approach to the Analysis of Soft Power in International Relations

    Directory of Open Access Journals (Sweden)

    Chi Zhang

    2013-12-01

    Full Text Available This article discusses soft power in international relations and the soft power of China’s foreign policy in recent years. After presenting a critique of the soft power theory developed by Joseph S. Nye, the paper provides an alternative interpretation of soft power. The author proposes a dynamic analysis of soft power in international relations, and argues that whether a power resource is soft or hard depends on the perceptions and feelings of various actors in specific situations. Due to the varying degrees of acceptance, power can be divided into hard power, soft power and bargaining power. An analysis should look at the soft or hard effectiveness of a power resource from three perspectives: horizontally, vertically and relatively. Recently, the soft power of China’s foreign policy and international behavior has mainly been manifested in multilateralism, economic diplomacy and a good-neighborly policy.

  1. Evidence of soft bound behaviour in analogue memristive devices for neuromorphic computing.

    Science.gov (United States)

    Frascaroli, Jacopo; Brivio, Stefano; Covi, Erika; Spiga, Sabina

    2018-05-08

    The development of devices that can modulate their conductance under the application of electrical stimuli constitutes a fundamental step towards the realization of synaptic connectivity in neural networks. Optimization of synaptic functionality requires an understanding of the analogue conductance update under different programming conditions. Moreover, properties of physical devices such as bounded conductance values and state-dependent modulation should be considered, as they affect the storage capacity and performance of the network. This work provides a study of the conductance dynamics produced by identical pulses as a function of the programming parameters in an HfO2 memristive device. The application of a phenomenological model that considers a soft approach to the conductance boundaries allows the identification of different operation regimes and quantification of the conductance modulation in the analogue region. The device's non-linear switching kinetics is recognized as the physical origin of the transition between different dynamics and motivates the crucial trade-off between degree of analogue modulation and memory window. Different kinetics for the processes of conductance increase and decrease account for the programming asymmetry of the device. The identification of programming trade-offs, together with an evaluation of device variations, provides a guideline for the optimization of analogue programming in view of hardware implementation of neural networks.
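    A minimal sketch of the soft-bound update rule this class of models uses: each identical pulse changes the conductance in proportion to its distance from the boundary, so steps shrink as the device saturates, and separate potentiation/depression rates capture the programming asymmetry. Parameters are illustrative, not fitted to the HfO2 device.

    ```python
    g_min, g_max = 0.0, 1.0           # conductance bounds (normalized)
    alpha_p, alpha_d = 0.10, 0.15     # asymmetric potentiation/depression rates

    g = 0.1
    trace = []
    for _ in range(50):               # 50 identical potentiation pulses
        g += alpha_p * (g_max - g)    # soft approach to the upper bound
        trace.append(g)
    for _ in range(50):               # 50 identical depression pulses
        g -= alpha_d * (g - g_min)    # soft approach to the lower bound
        trace.append(g)

    print(f"after potentiation: {trace[49]:.3f}, after depression: {trace[-1]:.3f}")
    ```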

  2. Automatic Generation of Agents using Reusable Soft Computing Code Libraries to develop Multi Agent System for Healthcare

    OpenAIRE

    Priti Srinivas Sajja

    2015-01-01

    This paper illustrates an architecture for a multi-agent system in the healthcare domain. The architecture is generic and designed in the form of multiple layers. One of the layers of the architecture contains many proactive, co-operative and intelligent agents such as a resource management agent, query agent, pattern detection agent and patient management agent. Another layer of the architecture is a collection of libraries to auto-generate code for agents using soft computing techni...

  3. Impact of Computed Tomography Image Quality on Image-Guided Radiation Therapy Based on Soft Tissue Registration

    International Nuclear Information System (INIS)

    Morrow, Natalya V.; Lawton, Colleen A.; Qi, X. Sharon; Li, X. Allen

    2012-01-01

    Purpose: In image-guided radiation therapy (IGRT), different computed tomography (CT) modalities with varying image quality are being used to correct for interfractional variations in patient set-up and anatomy changes, thereby reducing clinical target volume to the planning target volume (CTV-to-PTV) margins. We explore how CT image quality affects patient repositioning and CTV-to-PTV margins in soft tissue registration-based IGRT for prostate cancer patients. Methods and Materials: Four CT-based IGRT modalities used for prostate RT were considered in this study: MV fan beam CT (MVFBCT) (Tomotherapy), MV cone beam CT (MVCBCT) (MVision; Siemens), kV fan beam CT (kVFBCT) (CTVision, Siemens), and kV cone beam CT (kVCBCT) (Synergy; Elekta). Daily shifts were determined by manual registration to achieve the best soft tissue agreement. Effect of image quality on patient repositioning was determined by statistical analysis of daily shifts for 136 patients (34 per modality). Inter- and intraobserver variability of soft tissue registration was evaluated based on the registration of a representative scan for each CT modality with its corresponding planning scan. Results: Superior image quality with the kVFBCT resulted in reduced uncertainty in soft tissue registration during IGRT compared with other image modalities for IGRT. The largest interobserver variations of soft tissue registration were 1.1 mm, 2.5 mm, 2.6 mm, and 3.2 mm for kVFBCT, kVCBCT, MVFBCT, and MVCBCT, respectively. Conclusions: Image quality adversely affects the reproducibility of soft tissue-based registration for IGRT and necessitates a careful consideration of residual uncertainties in determining different CTV-to-PTV margins for IGRT using different image modalities.

  4. Impact of Computed Tomography Image Quality on Image-Guided Radiation Therapy Based on Soft Tissue Registration

    Energy Technology Data Exchange (ETDEWEB)

    Morrow, Natalya V.; Lawton, Colleen A. [Department of Radiation Oncology, Medical College of Wisconsin, Milwaukee, Wisconsin (United States); Qi, X. Sharon [Department of Radiation Oncology, University of Colorado Denver, Denver, Colorado (United States); Li, X. Allen, E-mail: ali@mcw.edu [Department of Radiation Oncology, Medical College of Wisconsin, Milwaukee, Wisconsin (United States)

    2012-04-01

    Purpose: In image-guided radiation therapy (IGRT), different computed tomography (CT) modalities with varying image quality are being used to correct for interfractional variations in patient set-up and anatomy changes, thereby reducing clinical target volume to the planning target volume (CTV-to-PTV) margins. We explore how CT image quality affects patient repositioning and CTV-to-PTV margins in soft tissue registration-based IGRT for prostate cancer patients. Methods and Materials: Four CT-based IGRT modalities used for prostate RT were considered in this study: MV fan beam CT (MVFBCT) (Tomotherapy), MV cone beam CT (MVCBCT) (MVision; Siemens), kV fan beam CT (kVFBCT) (CTVision, Siemens), and kV cone beam CT (kVCBCT) (Synergy; Elekta). Daily shifts were determined by manual registration to achieve the best soft tissue agreement. Effect of image quality on patient repositioning was determined by statistical analysis of daily shifts for 136 patients (34 per modality). Inter- and intraobserver variability of soft tissue registration was evaluated based on the registration of a representative scan for each CT modality with its corresponding planning scan. Results: Superior image quality with the kVFBCT resulted in reduced uncertainty in soft tissue registration during IGRT compared with other image modalities for IGRT. The largest interobserver variations of soft tissue registration were 1.1 mm, 2.5 mm, 2.6 mm, and 3.2 mm for kVFBCT, kVCBCT, MVFBCT, and MVCBCT, respectively. Conclusions: Image quality adversely affects the reproducibility of soft tissue-based registration for IGRT and necessitates a careful consideration of residual uncertainties in determining different CTV-to-PTV margins for IGRT using different image modalities.

  5. Novel computational approaches characterizing knee physiotherapy

    Directory of Open Access Journals (Sweden)

    Wangdo Kim

    2014-01-01

    Full Text Available A knee joint’s longevity depends on the proper integration of structural components in an axial alignment. If just one of the components is abnormally off-axis, the biomechanical system fails, resulting in arthritis. The complexity of various failures in the knee joint has led orthopedic surgeons to select total knee replacement as a primary treatment. In many cases, this means sacrificing much of an otherwise normal joint. Here, we review novel computational approaches to describe knee physiotherapy by introducing a new dimension of foot loading to the knee axis alignment producing an improved functional status of the patient. New physiotherapeutic applications are then possible by aligning foot loading with the functional axis of the knee joint during the treatment of patients with osteoarthritis.

  6. Music Genre Classification Systems - A Computational Approach

    DEFF Research Database (Denmark)

    Ahrendt, Peter

    2006-01-01

    Automatic music genre classification is the classification of a piece of music into its corresponding genre (such as jazz or rock) by a computer. It is considered to be a cornerstone of the research area Music Information Retrieval (MIR) and closely linked to the other areas in MIR. It is thought that MIR will be a key element in the processing, searching and retrieval of digital music in the near future. This dissertation is concerned with music genre classification systems and in particular systems which use the raw audio signal as input to estimate the corresponding genre. This is in contrast to systems which use e.g. a symbolic representation or textual information about the music. The approach to music genre classification systems has here been system-oriented. In other words, all the different aspects of the systems have been considered and it is emphasized that the systems should

  7. Soft-computing base analyses of the relationship between annoyance and coping with noise and odor.

    Science.gov (United States)

    Botteldooren, Dick; Lercher, Peter

    2004-06-01

    The majority of research on annoyance as an important impact of noise, odor, and other stressors on man has regarded the person as a passive receptor. It is, however, recognized that this person is an active participant trying to alter a troubled person-environment relationship or to sustain a desirable one; coping has to be incorporated. This is of particular importance in changing exposure situations. For large populations, a lot of insight can be gained by looking at average effects only; to investigate changes in annoyance and the effects of coping, the individual or small group has to be studied. Then it becomes imperative to recognize the inherent vagueness in perception and human behavior. Fortunately, tools have been developed over the past decades that allow doing this in a mathematically precise way. These tools are sometimes referred to by the common label soft computing, hence the title of this paper. This work revealed different styles of coping, both by blind clustering and by (fuzzy) logical aggregation of different actions reported in a survey. The relationship between annoyance and the intensity of coping it generates was quantified, after it was recognized that the possibility for coping is created by the presence of the stressor rather than by the actual fact of coping. It was further shown that refinement of this relationship is possible if a person can be identified as a coper. This personal factor can be extracted from a known reaction to one stressor and used for predicting coping intensity and style in another situation. The effect of coping on a perceived change in annoyance is quantified by a set of fuzzy linguistic rules. This closes the loop that is responsible for at least some of the dynamics of the response to a stressor. This work thus provides all the essential building blocks for designing models of annoyance in changing environments.
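    To make the "fuzzy linguistic rules" concrete, here is a toy Mamdani-style rule of the kind such a model might contain, e.g. "IF annoyance is high AND coping is weak THEN annoyance increases". The membership functions, scales and rule weight are invented for illustration and are not taken from the study.

    ```python
    import numpy as np

    def tri(x, a, b, c):
        """Triangular membership function peaking at b on support [a, c]."""
        return np.maximum(np.minimum((x - a) / (b - a + 1e-9),
                                     (c - x) / (c - b + 1e-9)), 0.0)

    annoyance, coping = 7.5, 2.0                 # hypothetical scores on 0-10 scales

    high_annoyance = tri(annoyance, 5, 10, 15)   # "high" (saturating above 10)
    weak_coping = tri(coping, -5, 0, 5)          # "weak"

    # Mamdani AND = min; the rule's firing strength scales the predicted change.
    firing = min(high_annoyance, weak_coping)
    predicted_change = firing * 1.0              # +1 unit at full activation
    print(f"rule activation = {firing:.2f}, "
          f"predicted annoyance change = {predicted_change:+.2f}")
    ```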

  8. A computational approach to animal breeding.

    Science.gov (United States)

    Berger-Wolf, Tanya Y; Moore, Cristopher; Saia, Jared

    2007-02-07

    We propose a computational model of mating strategies for controlled animal breeding programs. A mating strategy in a controlled breeding program is a heuristic with some optimization criteria as a goal. Thus, it is appropriate to use the computational tools available for analysis of optimization heuristics. In this paper, we propose the first discrete model of the controlled animal breeding problem and analyse heuristics for two possible objectives: (1) breeding for maximum diversity and (2) breeding a target individual. These two goals are representative of conservation biology and agricultural livestock management, respectively. We evaluate several mating strategies and provide upper and lower bounds for the expected number of matings. While the population parameters may vary and can change the actual number of matings for a particular strategy, the order of magnitude of the number of expected matings and the relative competitiveness of the mating heuristics remains the same. Thus, our simple discrete model of the animal breeding problem provides a novel viable and robust approach to designing and comparing breeding strategies in captive populations.

  9. Computation within the auxiliary field approach

    International Nuclear Information System (INIS)

    Baeurle, S.A.

    2003-01-01

    Recently, the classical auxiliary field methodology has been developed as a new simulation technique for performing calculations within the framework of classical statistical mechanics. Since the approach suffers from a sign problem, a judicious choice of the sampling algorithm, allowing fast statistical convergence and efficient generation of field configurations, is of fundamental importance for a successful simulation. In this paper we focus on the computational aspects of this simulation methodology. We introduce two different types of algorithms, the single-move auxiliary field Metropolis Monte Carlo algorithm and two new classes of force-based algorithms, which enable multiple-move propagation. In addition, to further optimize the sampling, we describe a preconditioning scheme which permits each field degree of freedom to be treated individually with regard to its evolution through the auxiliary field configuration space. Finally, we demonstrate the validity and assess the competitiveness of these algorithms on a representative practical example. We believe that they may also provide an interesting possibility for enhancing the computational efficiency of other auxiliary field methodologies.

  10. A computational approach to negative priming

    Science.gov (United States)

    Schrobsdorff, H.; Ihrke, M.; Kabisch, B.; Behrendt, J.; Hasselhorn, M.; Herrmann, J. Michael

    2007-09-01

    Priming is characterized by a sensitivity of reaction times to the sequence of stimuli in psychophysical experiments. The reduction of the reaction time observed in positive priming is well-known and experimentally understood (Scarborough et al., J. Exp. Psycholol: Hum. Percept. Perform., 3, pp. 1-17, 1977). Negative priming—the opposite effect—is experimentally less tangible (Fox, Psychonom. Bull. Rev., 2, pp. 145-173, 1995). The dependence on subtle parameter changes (such as response-stimulus interval) usually varies. The sensitivity of the negative priming effect bears great potential for applications in research in fields such as memory, selective attention, and ageing effects. We develop and analyse a computational realization, CISAM, of a recent psychological model for action decision making, the ISAM (Kabisch, PhD thesis, Friedrich-Schiller-Universitat, 2003), which is sensitive to priming conditions. With the dynamical systems approach of the CISAM, we show that a single adaptive threshold mechanism is sufficient to explain both positive and negative priming effects. This is achieved by comparing results obtained by the computational modelling with experimental data from our laboratory. The implementation provides a rich base from which testable predictions can be derived, e.g. with respect to hitherto untested stimulus combinations (e.g. single-object trials).

  11. An assessment on the use of bivariate, multivariate and soft computing techniques for collapse susceptibility in GIS environ

    Science.gov (United States)

    Yilmaz, Işik; Marschalko, Marian; Bednarik, Martin

    2013-04-01

    The paper presented herein compares and discusses the use of bivariate, multivariate and soft computing techniques for collapse susceptibility modelling. Conditional probability (CP), logistic regression (LR) and artificial neural network (ANN) models, representing the bivariate, multivariate and soft computing techniques respectively, were used in GIS-based collapse susceptibility mapping of an area in the Sivas basin (Turkey). Collapse-related factors, directly or indirectly related to the causes of collapse occurrence, such as distance from faults, slope angle and aspect, topographical elevation, distance from drainage, topographic wetness index (TWI), stream power index (SPI), Normalized Difference Vegetation Index (NDVI) as a measure of vegetation cover, and distance from roads and settlements, were used in the collapse susceptibility analyses. In the last stage of the analyses, collapse susceptibility maps were produced from the models and compared by means of their validation results. The Area Under Curve (AUC) values showed that, although the map obtained from the soft computing (ANN) model appears more accurate than the others, the accuracies of all three models can be evaluated as relatively similar. The results also showed that conditional probability is a useful method for the preparation of collapse susceptibility maps and is highly compatible with GIS operating features.

  12. Modeling of Groundwater Resources Heavy Metals Concentration Using Soft Computing Methods: Application of Different Types of Artificial Neural Networks

    Directory of Open Access Journals (Sweden)

    Meysam Alizamir

    2017-09-01

    Full Text Available Nowadays, groundwater resources play a vital role as a source of drinking water in arid and semiarid regions, and forecasting of pollutant content in these resources is very important. Therefore, this study aimed to compare two soft computing methods for modeling Cd, Pb and Zn concentrations in the groundwater resources of Asadabad Plain, Western Iran. The relative accuracy of several soft computing models, namely the multi-layer perceptron (MLP) and radial basis function (RBF), for forecasting heavy metals concentration was investigated. In addition, the Levenberg-Marquardt, gradient descent and conjugate gradient training algorithms were utilized for the MLP models. The ANN models for this study were developed using the MATLAB R2014 software program. The MLP performed better than the other models for heavy metals concentration estimation. The simulation results revealed that the MLP model was able to model heavy metals concentration in groundwater resources favorably, and it can generally be utilized effectively in environmental applications and water quality estimation. In addition, of the three algorithms, Levenberg-Marquardt was better than the others. This study proposed soft computing modeling techniques for the prediction and estimation of heavy metals concentration in the groundwater resources of Asadabad Plain. Based on data collected from the plain, MLP and RBF models were developed for each heavy metal. The MLP can be utilized effectively for predicting heavy metals concentration in the groundwater resources of Asadabad Plain.

  13. An ELM Based Online Soft Sensing Approach for Alumina Concentration Detection

    Directory of Open Access Journals (Sweden)

    Sen Zhang

    2015-01-01

    Full Text Available The concentration of alumina in the electrolyte is of great significance during the production of aluminum; it may affect the stability of the aluminum reduction cell and the current efficiency. However, the concentration of alumina is hard to detect online because of the special circumstances in the aluminum reduction cell. At present, there is a lack of fast and accurate soft sensing methods for alumina concentration, and existing methods cannot meet the needs of online measurement. In this paper, a novel soft sensing method based on a modified extreme learning machine (MELM) for online measurement of the alumina concentration is proposed. The modified ELM algorithm is based on enhanced random search, which is called incremental extreme learning machine in some references. It randomly chooses the input weights and analytically determines the output weights without manual intervention. The simulation results show that the approach can give more accurate estimations of alumina concentration with faster learning speed compared with other methods such as BP and SVM.
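    The ELM recipe described above is simple enough to sketch directly: draw random input weights, keep them fixed, and solve the output weights analytically with a pseudoinverse. The data below are synthetic stand-ins for the cell measurements a soft sensor would use, and the network size is an arbitrary choice.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n, d, hidden = 400, 6, 40
    X = rng.normal(size=(n, d))                       # cell signals (hypothetical)
    y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + rng.normal(0, 0.05, n)  # target proxy

    W_in = rng.normal(size=(d, hidden))               # random input weights, never trained
    b = rng.normal(size=hidden)                       # random biases
    H = np.tanh(X @ W_in + b)                         # hidden-layer activations
    beta = np.linalg.pinv(H) @ y                      # analytic output weights

    rmse = np.sqrt(np.mean((H @ beta - y) ** 2))
    print(f"training RMSE: {rmse:.4f}")
    ```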

  14. SoftAR: visually manipulating haptic softness perception in spatial augmented reality.

    Science.gov (United States)

    Punpongsanon, Parinya; Iwai, Daisuke; Sato, Kosuke

    2015-11-01

    We present SoftAR, a novel spatial augmented reality (AR) technique based on a pseudo-haptics mechanism that visually manipulates the sense of softness perceived by a user pushing a soft physical object. Considering the limitations of projection-based approaches that change only the surface appearance of a physical object, we propose two projection visual effects, i.e., surface deformation effect (SDE) and body appearance effect (BAE), on the basis of the observations of humans pushing physical objects. The SDE visualizes a two-dimensional deformation of the object surface with a controlled softness parameter, and BAE changes the color of the pushing hand. Through psychophysical experiments, we confirm that the SDE can manipulate softness perception such that the participant perceives significantly greater softness than the actual softness. Furthermore, fBAE, in which BAE is applied only for the finger area, significantly enhances manipulation of the perception of softness. We create a computational model that estimates perceived softness when SDE+fBAE is applied. We construct a prototype SoftAR system in which two application frameworks are implemented. The softness adjustment allows a user to adjust the softness parameter of a physical object, and the softness transfer allows the user to replace the softness with that of another object.

  15. Blueprinting Approach in Support of Cloud Computing

    Directory of Open Access Journals (Sweden)

    Willem-Jan van den Heuvel

    2012-03-01

    Full Text Available Current cloud service offerings, i.e., Software-as-a-Service (SaaS), Platform-as-a-Service (PaaS) and Infrastructure-as-a-Service (IaaS) offerings, are often provided as monolithic, one-size-fits-all solutions and give little or no room for customization. This limits the ability of Service-based Application (SBA) developers to configure and syndicate offerings from multiple SaaS, PaaS, and IaaS providers to address their application requirements. Furthermore, combining different independent cloud services necessitates a uniform description format that facilitates design, customization, and composition. Cloud Blueprinting is a novel approach that allows SBA developers to easily design, configure and deploy virtual SBA payloads on virtual machines and resource pools in the cloud. We propose the Blueprint concept as a uniform abstract description of cloud service offerings that may cross different cloud computing layers, i.e., SaaS, PaaS and IaaS. To support developers with SBA design and development in the cloud, this paper introduces a formal Blueprint Template for unambiguously describing a blueprint, as well as a Blueprint Lifecycle that guides developers through the manipulation, composition and deployment of different blueprints for an SBA. Finally, the empirical evaluation of the blueprinting approach within an EC FP7 project is reported and an associated blueprint prototype implementation is presented.

  16. Pauli principle in the soft-photon approach to proton-proton bremsstrahlung

    NARCIS (Netherlands)

    Liou, MK; Timmermans, R; Gibson, BF

    1996-01-01

    A relativistic and manifestly gauge-invariant soft-photon amplitude, which is consistent with the soft-photon theorem and satisfies the Pauli principle, is derived for the proton-proton bremsstrahlung process. This soft-photon amplitude is the first two-u-two-t special amplitude to satisfy all

  17. STANDARDISED CLINICAL EXAMINATION OF SOFT-TISSUE PAIN IN PATIENTS WITH HIP DYSPLASIA USING THE CLINICAL ENTITIES APPROACH

    DEFF Research Database (Denmark)

    Jacobsen, Julie Sandell; Hölmich, Per; Thorborg, Kristian

    2016-01-01

    Introduction In patients with symptomatic hip dysplasia soft-tissue pain may be a prevalent condition that might affect the outcome of periacetabular osteotomy (PAO). However, the distribution of soft-tissue pain in hip dysplasia has never been examined systematically using a standardised and reliable protocol. The aim of this study was to investigate five clinical entities in 100 patients with hip dysplasia using the clinical entities approach, identifying the anatomic location of soft-tissue pain. The first 50 patients are presented in this paper. Material and Methods Fifty patients (10 males ... prevalence in the iliopsoas and the hip abductors. This indicates that patients with hip dysplasia also experience pain related to the surrounding soft-tissues, and not only from the hip joint. References (1) Holmich P, Holmich LR, Bjerg AM. Clinical examination of athletes with groin pain: an intraobserver...

  18. Computed tomography of the soft tissues of the shoulder. Pt. 3. Calcifying tendinitis of the rotator cuff

    Energy Technology Data Exchange (ETDEWEB)

    Dihlmann, W.; Bandick, J.

    1988-01-01

    Computed tomography of the soft tissue of the shoulder in cases of calcifying tendinitis of the rotator cuff provides the following information: 1. Localisation of the calcium deposits within the rotator cuff. 2. Contours and density of the calcium deposits correlated with the clinical findings as described by Uhthoff et al. Ill-defined contours and non-homogeneous deposits are associated with more severe clinical features. 3. Computed tomography shows that apatite particles, which are not visible radiologically, may penetrate into the shoulder joint and produce synovitis with an effusion. This is of importance in local therapy.

  19. CGC/saturation approach for soft interactions at high energy: long range rapidity correlations

    Energy Technology Data Exchange (ETDEWEB)

    Gotsman, E.; Maor, U. [Tel Aviv University, Department of Particle Physics, School of Physics and Astronomy, Raymond and Beverly Sackler Faculty of Exact Science, Tel Aviv (Israel); Levin, E. [Tel Aviv University, Department of Particle Physics, School of Physics and Astronomy, Raymond and Beverly Sackler Faculty of Exact Science, Tel Aviv (Israel); Universidad Tecnica Federico Santa Maria and Centro Cientifico- Tecnologico de Valparaiso, Departemento de Fisica, Valparaiso (Chile)

    2015-11-15

    In this paper we continue our program to construct a model for high energy soft interactions that is based on the CGC/saturation approach. The main result of this paper is that we have discovered a mechanism that leads to large long range rapidity correlations and results in large values of the correlation function R(y₁, y₂) ≥ 1, which is independent of y₁ and y₂. Such a behavior of the correlation function provides strong support for the idea that at high energies the system of partons that is produced is not only dense but also has strong attractive forces acting between the partons. (orig.)

  20. [A new approach to clinical and laboratory diagnosis of systemic and local soft tissue infections].

    Science.gov (United States)

    Barkhatova, N A

    2009-01-01

    Dynamic measurements of blood TNF-α, IL-1RA, CRP, oligopeptide, and lactoferrin levels in patients with systemic and local soft tissue infections revealed a direct correlation between them, which allowed these indicators to be used for the diagnosis of systemic infections. Results of clinical and laboratory analyses provided a basis for distinguishing short-term systemic inflammatory response syndrome from sepsis and for developing relevant diagnostic criteria. Sepsis combined with systemic inflammatory response syndrome persisting for more than 72 hours after the onset of adequate therapy was characterized by CRP levels > 30 mg/l, oligopeptides > 0.34 U, lactoferrin > 1900 ng/ml, and TNF-α > 6 pg/ml; IL-1RA … systemic inflammatory response syndrome lasting less than 72 hours was associated with lower TNF-α, CRP, oligopeptide, and lactoferrin levels, with IL-1RA > 1500 pg/ml. This new approach to early diagnosis of systemic infections makes it possible to optimize treatment and thereby enhance its efficiency.

  1. Devising tissue ingrowth metrics: a contribution to the computational characterization of engineered soft tissue healing.

    Science.gov (United States)

    Alves, Antoine; Attik, Nina; Bayon, Yves; Royet, Elodie; Wirth, Carine; Bourges, Xavier; Piat, Alexis; Dolmazon, Gaëlle; Clermont, Gaëlle; Boutrand, Jean-Pierre; Grosgogeat, Brigitte; Gritsch, Kerstin

    2018-03-14

    The paradigm shift brought about by the expansion of tissue engineering and regenerative medicine away from the use of biomaterials currently calls into question the value of histopathologic methods in the evaluation of biological changes. To date, the available tools of evaluation are not fully consistent and satisfactory for these advanced therapies. We have developed a new, simple and inexpensive quantitative digital approach that provides key metrics for structural and compositional characterization of the regenerated tissues. For example, the metrics provide the tissue ingrowth rate (TIR), which integrates two separate indicators, the cell ingrowth rate (CIR) and the total collagen content (TCC), as featured in the equation TIR% = CIR% + TCC%. Moreover, a subset of quantitative indicators describing the directional organization of the collagen (relating structure and mechanical function of tissues), the ratio of collagen I to collagen III (remodeling quality) and the optical anisotropy property of the collagen (maturity indicator) was automatically assessed as well. Using an image analyzer, all metrics were extracted from only two serial sections stained with either Feulgen & Rossenbeck (cell specific) or Picrosirius Red F3BA (collagen specific). To validate this new procedure, three-dimensional (3D) scaffolds were intraperitoneally implanted in healthy and in diabetic rats. It was hypothesized that, quantitatively, the healing tissue would be significantly delayed and of poor quality in diabetic rats in comparison to healthy rats. In addition, a chemically modified 3D scaffold was similarly implanted in a third group of healthy rats with the assumption that modulation of the ingrown tissue would be quantitatively present in comparison to the 3D scaffold-healthy group. After 21 days of implantation, both hypotheses were verified by use of this novel computerized approach. When the two methods were run in parallel, the quantitative results revealed fine details and
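
    Since the quoted relation TIR% = CIR% + TCC% is plain arithmetic over area fractions measured on the two stained serial sections, it can be sketched in a few lines. The sketch below assumes the Feulgen & Rossenbeck and Picrosirius Red images have already been segmented into boolean masks, which is the part the authors' image analyzer actually performs:

```python
import numpy as np

def area_fraction_percent(mask: np.ndarray) -> float:
    """Percentage of pixels flagged positive in a boolean mask."""
    return 100.0 * float(np.mean(mask))

def tissue_ingrowth_rate(cell_mask: np.ndarray, collagen_mask: np.ndarray) -> float:
    """TIR% = CIR% + TCC%, per the metric quoted in the abstract."""
    cir = area_fraction_percent(cell_mask)      # cell ingrowth rate, CIR%
    tcc = area_fraction_percent(collagen_mask)  # total collagen content, TCC%
    return cir + tcc
```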

  2. Description of EMX computer code. System for measuring soft X rays

    International Nuclear Information System (INIS)

    Marty, D.A.; Smeulders, P.; Launois, D.

    1978-07-01

    After briefly describing the system for measuring soft X rays installed on TFR 600, the objectives and principles of the EMX calculation programme are presented. The model is divided into two distinct parts. The aim of EMX 1, the first part, is to build the soft X ray image of a plasma with varied characteristics, seen through a given collimation system (in this case a slit). The aim of EMX 2, the second part, is to filter the previously built soft X ray image through the system of absorbents belonging to the measuring system and to calculate the currents generated by each detector viewing a plasma chord. The first calculation results are commented on and discussed [fr]
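
    The EMX source itself is not reproduced here, but the two stages the abstract describes (EMX 1: chord-integrated emission seen through a slit; EMX 2: attenuation by absorbents and conversion to a detector current) reduce to a line integral and a multiplicative transmission factor. A schematic sketch, with an assumed Gaussian emissivity profile and made-up filter constants:

```python
import numpy as np

def chord_brightness(impact_parameter: float, minor_radius: float = 0.2,
                     n: int = 500) -> float:
    """EMX 1 (schematic): integrate an assumed emissivity along one chord."""
    s = np.linspace(-minor_radius, minor_radius, n)   # position along the chord
    r = np.hypot(impact_parameter, s)                 # distance from plasma axis
    emissivity = np.exp(-(r / (0.4 * minor_radius)) ** 2)  # hypothetical profile
    return float(np.sum(emissivity) * (s[1] - s[0]))  # crude line integral

def detector_current(brightness: float, foil_transmission: float = 0.3,
                     responsivity: float = 1e-6) -> float:
    """EMX 2 (schematic): absorbent filtering, then photocurrent conversion."""
    return brightness * foil_transmission * responsivity
```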

  3. a Recursive Approach to Compute Normal Forms

    Science.gov (United States)

    HSU, L.; MIN, L. J.; FAVRETTO, L.

    2001-06-01

    Normal forms are instrumental in the analysis of dynamical systems described by ordinary differential equations, particularly when singularities close to a bifurcation are to be characterized. However, the computation of a normal form up to an arbitrary order is numerically hard. This paper focuses on the computer programming of some recursive formulas developed earlier to compute higher order normal forms. A computer program to reduce the system to its normal form on a center manifold is developed using the Maple symbolic language. However, it should be stressed that the program relies essentially on recursive numerical computations, while symbolic calculations are used only for minor tasks. Some strategies are proposed to save computation time. Examples are presented to illustrate the application of the program to obtain high order normalization or to handle systems with large dimension.

  4. Soft brain-machine interfaces for assistive robotics: A novel control approach.

    Science.gov (United States)

    Schiatti, Lucia; Tessadori, Jacopo; Barresi, Giacinto; Mattos, Leonardo S; Ajoudani, Arash

    2017-07-01

    Robotic systems offer the possibility of improving the life quality of people with severe motor disabilities, enhancing the individual's degree of independence and interaction with the external environment. In this direction, the operator's residual functions must be exploited for the control of the robot movements and the underlying dynamic interaction through intuitive and effective human-robot interfaces. Towards this end, this work aims at exploring the potential of a novel Soft Brain-Machine Interface (BMI), suitable for dynamic execution of remote manipulation tasks for a wide range of patients. The interface is composed of an eye-tracking system, for an intuitive and reliable control of a robotic arm system's trajectories, and a Brain-Computer Interface (BCI) unit, for the control of the robot Cartesian stiffness, which determines the interaction forces between the robot and environment. The latter control is achieved by estimating in real-time a unidimensional index from user's electroencephalographic (EEG) signals, which provides the probability of a neutral or active state. This estimated state is then translated into a stiffness value for the robotic arm, allowing a reliable modulation of the robot's impedance. A preliminary evaluation of this hybrid interface concept provided evidence on the effective execution of tasks with dynamic uncertainties, demonstrating the great potential of this control method in BMI applications for self-service and clinical care.
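
    The stiffness-control principle described above (a unidimensional EEG-derived probability of an active versus neutral state translated into a Cartesian stiffness command) can be sketched as a simple smoothed mapping. The stiffness range and smoothing factor below are illustrative assumptions, not the authors' parameters:

```python
class StiffnessMapper:
    """Map a BCI 'active state' probability in [0, 1] to robot stiffness."""

    def __init__(self, k_min: float = 200.0, k_max: float = 1200.0,
                 alpha: float = 0.2):
        self.k_min, self.k_max, self.alpha = k_min, k_max, alpha  # N/m range
        self.k = k_min  # start compliant

    def update(self, p_active: float) -> float:
        p = min(max(p_active, 0.0), 1.0)
        target = self.k_min + p * (self.k_max - self.k_min)
        self.k += self.alpha * (target - self.k)  # low-pass for smooth changes
        return self.k

mapper = StiffnessMapper()
for p in (0.1, 0.8, 0.8):           # EEG classifier outputs per time step
    print(round(mapper.update(p)))  # stiffness command sent to the arm
```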

  5. Percutaneous computed tomography-guided core needle biopsy of soft tissue tumors: results and correlation with surgical specimen analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chojniak, Rubens; Grigio, Henrique Ramos; Bitencourt, Almir Galvao Vieira; Pinto, Paula Nicole Vieira; Tyng, Chiang J.; Cunha, Isabela Werneck da; Aguiar Junior, Samuel; Lopes, Ademar, E-mail: chojniak@uol.com.br [Hospital A.C. Camargo, Sao Paulo, SP (Brazil)

    2012-09-15

    Objective: To evaluate the efficacy of percutaneous computed tomography (CT)-guided core needle biopsy of soft tissue tumors in obtaining appropriate samples for histological analysis, and compare its diagnosis with the results of the surgical pathology as available. Materials and Methods: The authors reviewed medical records, imaging and histological reports of 262 patients with soft-tissue tumors submitted to CT-guided core needle biopsy in an oncologic reference center between 2003 and 2009. Results: Appropriate samples were obtained in 215 (82.1%) out of the 262 patients. The most prevalent tumors were sarcomas (38.6%), metastatic carcinomas (28.8%), benign mesenchymal tumors (20.5%) and lymphomas (9.3%). Histological grading was feasible in 92.8% of sarcoma patients, with the majority of them (77.9%) being classified as high grade tumors. Out of the total sample, 116 patients (44.3%) underwent surgical excision and diagnosis confirmation. Core biopsy demonstrated 94.6% accuracy in the identification of sarcomas, with 96.4% sensitivity and 89.5% specificity. A significant intermethod agreement about histological grading was observed between core biopsy and surgical resection (p < 0.001; kappa = 0.75). Conclusion: CT-guided core needle biopsy demonstrated a high diagnostic accuracy in the evaluation of soft tissue tumors as well as in the histological grading of sarcomas, allowing an appropriate therapeutic planning (author)
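
    The reported performance figures (sensitivity, specificity, accuracy and Cohen's kappa) all follow from standard confusion-matrix arithmetic; a minimal sketch of those computations, with no attempt to reproduce the study's actual counts:

```python
def diagnostic_stats(tp: int, fp: int, fn: int, tn: int):
    """Sensitivity, specificity, accuracy and Cohen's kappa from counts."""
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / n
    # Cohen's kappa: observed agreement corrected for chance agreement.
    p_e = ((tp + fp) / n) * ((tp + fn) / n) + ((fn + tn) / n) * ((fp + tn) / n)
    kappa = (accuracy - p_e) / (1 - p_e)
    return sensitivity, specificity, accuracy, kappa
```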

  6. Evaluation of the Respimat Soft Mist Inhaler using a concurrent CFD and in vitro approach.

    Science.gov (United States)

    Worth Longest, P; Hindle, Michael

    2009-06-01

    The Respimat Soft Mist Inhaler is reported to generate an aerosol with low spray momentum and a small droplet size. However, the transport characteristics of the Respimat aerosol are not well understood. The objective of this study was to characterize the transport and deposition of an aerosol emitted from the Respimat inhaler using a combination of computational fluid dynamics (CFD) modeling and in vitro experiments. Deposition of the Respimat aerosol was assessed in the inhaler mouthpiece (MP), a standard induction port (IP), and a more realistic mouth-throat (MT) geometry at an inhalation flow rate of 30 L/min. Aerosols were generated using an albuterol sulfate (0.6%) solution, and the drug deposition was quantified using both in vitro experiments and a CFD model of the Respimat inhaler. Laser diffraction experiments were used to determine the initial polydisperse aerosol size distribution. It was found that the aerosol generated from the highly complex process of jet collision and breakup could be approximated in the model using effective spray conditions. Computational predictions of deposition fractions agreed well with in vitro results for both the IP (within 20% error) and MT (within 10% error) geometries. The experimental results indicated that the deposition fraction of drug in the MP ranged from 27 to 29% and accounted for a majority of total drug loss. Based on the CFD solution, high MP deposition was due to a recirculating flow pattern that surrounded the aerosol spray and entrained a significant number of small droplets. In contrast, deposition of the Respimat aerosol in both the IP (4.2%) and MT (7.4%) geometries was relatively low. Results of this study indicate that modifications to the current Respimat MP and control of specific patient variables may significantly reduce deposition in the device and may decrease high oropharyngeal drug loss observed in vivo.

  7. An Analytical Framework for Soft and Hard Data Fusion: A Dempster-Shafer Belief Theoretic Approach

    Science.gov (United States)

    2012-08-01

    generated soft data, such as HUMINT (HUMan INTelligence), OSINT (Open Source INTelligence) and COMINT (COMmunications INTelligence), are fundamentally … human intelligence (HUMINT), open source intelligence (OSINT), and communications intelligence (COMINT), which is human communications derived from … respectively). The sources correspond to selected intelligence disciplines described in [9]. HUMINT and OSINT sources provide mostly soft
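
    The core of a Dempster-Shafer fusion engine for such soft and hard sources is Dempster's rule of combination; a minimal sketch over a toy frame of discernment (the sources and mass assignments below are invented for illustration, not taken from the report):

```python
from itertools import product

def dempster_combine(m1: dict, m2: dict) -> dict:
    """Dempster's rule: intersect focal elements, renormalize the conflict."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict between sources")
    return {h: w / (1.0 - conflict) for h, w in combined.items()}

# Toy example: a vague soft (HUMINT-like) report and a sharper hard report.
soft = {frozenset({"hostile"}): 0.6, frozenset({"hostile", "neutral"}): 0.4}
hard = {frozenset({"hostile"}): 0.7, frozenset({"neutral"}): 0.3}
print(dempster_combine(soft, hard))
```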

  8. Computer networks ISE a systems approach

    CERN Document Server

    Peterson, Larry L

    2007-01-01

    Computer Networks, 4E is the only introductory computer networking book written by authors who have had first-hand experience with many of the protocols discussed in the book, who have actually designed some of them as well, and who are still actively designing the computer networks today. This newly revised edition continues to provide an enduring, practical understanding of networks and their building blocks through rich, example-based instruction. The authors' focus is on the why of network design, not just the specifications comprising today's systems but how key technologies and p

  9. CREATIVE APPROACHES TO COMPUTER SCIENCE EDUCATION

    Directory of Open Access Journals (Sweden)

    V. B. Raspopov

    2010-04-01

    Using the example of the PPS «Toolbox of multimedia lessons "For Children About Chopin"», we demonstrate the possibility of involving creative students in developing software packages for educational purposes. Similar projects can be assigned to school and college students studying computer science and informatics, and implemented under the teachers' supervision as advanced assignments or thesis projects within a high school IT or Computer Science course or a college course of Applied Scientific Research, or as part of preparation for students' participation in the Computer Science or IT competitions of the Youth Academy of Sciences (MAN in Russian and Ukrainian).

  10. Computer science approach to quantum control

    International Nuclear Information System (INIS)

    Janzing, D.

    2006-01-01

    Whereas it is obvious that every computation process is a physical process, it has hardly been recognized that many complex physical processes bear similarities to computation processes. This is in particular true for the control of physical systems on the nanoscopic level: usually the system can only be accessed via a rather limited set of elementary control operations, and for many purposes only a concatenation of a large number of these basic operations will implement the desired process. This concatenation is in many cases quite similar to building complex programs from elementary steps, and principles for designing algorithms may thus be a paradigm for designing control processes. For instance, one can decrease the temperature of one part of a molecule by transferring its heat to the remaining part, where it is then dissipated to the environment. But the implementation of such a process involves a complex sequence of electromagnetic pulses. This work considers several hypothetical control processes on the nanoscopic level and shows their analogy to computation processes. We show that measuring certain types of quantum observables is such a complex task that every instrument able to perform it would necessarily be an extremely powerful computer. Likewise, the implementation of a heat engine on the nanoscale requires processing the heat in a way that is similar to information processing, and it can be shown that heat engines with maximal efficiency would be powerful computers, too. In the same way as problems in computer science can be classified by complexity classes, we can also classify control problems according to their complexity. Moreover, we directly relate these complexity classes for control problems to the classes in computer science. Unifying notions of complexity in computer science and physics therefore has two aspects: on the one hand, computer science methods help to analyze the complexity of physical processes. On the other hand, reasonable

  11. Rational design of mesoporous metals and related nanomaterials by a soft-template approach.

    Science.gov (United States)

    Yamauchi, Yusuke; Kuroda, Kazuyuki

    2008-04-07

    We review recent developments in the preparation of mesoporous metals and related metal-based nanomaterials. Among the many types of mesoporous materials, mesoporous metals hold promise for a wide range of potential applications, such as in electronic devices, magnetic recording media, and metal catalysts, owing to their metallic frameworks. Mesoporous metals with highly ordered networks and narrow pore-size distributions have traditionally been produced by using mesoporous silica as a hard template. This method involves the formation of an original template followed by deposition of metals within the mesopores and subsequent removal of the template. Another synthetic method is the direct-template approach from lyotropic liquid crystals (LLCs) made of nonionic surfactants at high concentrations. Direct-template synthesis creates a novel avenue for the production of mesoporous metals as well as related metal-based nanomaterials. Many mesoporous metals have been prepared by the chemical or electrochemical reduction of metal salts dissolved in aqueous LLC domains. As a soft template, LLCs are more versatile and therefore more advantageous than hard templates. It is possible to produce various nanostructures (e.g., lamellar, 2D hexagonal (p6mm), and 3D cubic (Ia-3d)), nanoparticles, and nanotubes simply by controlling the composition of the reaction bath.

  12. Liquid crystal templating as an approach to spatially and temporally organise soft matter.

    Science.gov (United States)

    van der Asdonk, Pim; Kouwer, Paul H J

    2017-10-02

    Chemistry quickly moves from a molecular science to a systems science. This requires spatial and temporal control over the organisation of molecules and molecular assemblies. Whilst Nature almost by default (transiently) organises her components at multiple different length scales, scientists struggle to realise even relatively straightforward patterns. In the past decades, supramolecular chemistry has taught us the rules to precisely engineer molecular assembly at the nanometre scale. At higher length scales, however, we are bound to top-down nanotechnology techniques to realise order. For soft, biological matter, many of these top-down techniques come with serious limitations since the molecules generally show low susceptibilities to the applied stimuli. A new method is based on liquid crystal templating. In this hierarchical approach, a liquid crystalline host serves as the scaffold to order polymers or assemblies. Being a liquid crystal, the host material can be ordered at many different length scales and on top of that, is highly susceptible to many external stimuli, which can even be used to manipulate the liquid crystal organisation in time. As a result, we anticipate large control over the organisation of the materials inside the liquid crystalline host. Recently, liquid crystal templating was also realised in water. This suddenly makes this tool highly applicable to start organising more delicate biological materials or even small organisms. We review the scope and limitations of liquid crystal templating and look out to where the technique may lead us.

  13. Comparative Metabolomics Approach Detects Stress-Specific Responses during Coral Bleaching in Soft Corals.

    Science.gov (United States)

    Farag, Mohamed A; Meyer, Achim; Ali, Sara E; Salem, Mohamed A; Giavalisco, Patrick; Westphal, Hildegard; Wessjohann, Ludger A

    2018-06-01

    Chronic exposure to ocean acidification and elevated sea-surface temperatures poses significant stress to marine ecosystems. This in turn necessitates costly acclimation responses in corals in both the symbiont and host, with a reorganization of cell metabolism and structure. A large-scale untargeted metabolomics approach comprising gas chromatography mass spectrometry (GC-MS) and ultraperformance liquid chromatography coupled to high resolution mass spectrometry (UPLC-MS) was applied to profile the metabolite composition of the soft coral Sarcophyton ehrenbergi and its dinoflagellate symbiont. Metabolite profiling compared ambient conditions with the response to simulated climate change stressors and with the sister species S. glaucum. Among ∼300 monitored metabolites, 13 metabolites were modulated. Incubation experiments providing four selected upregulated metabolites (alanine, GABA, nicotinic acid, and proline) in the culturing water failed to mitigate the bleaching response under temperature-induced stress, despite their known ability to mitigate heat stress in plants or animals. Thus, the results hint at metabolite accumulation as a marker of heat stress. This study provides the first detailed map of metabolic pathway transitions in corals in response to different environmental stresses, accounting for the superior thermal tolerance of S. ehrenbergi versus S. glaucum, which can ultimately help maintain a viable symbiosis and mitigate coral bleaching.

  14. COGNITIVE COMPUTER GRAPHICS AS A MEANS OF "SOFT" MODELING IN PROBLEMS OF RESTORATION OF FUNCTIONS OF TWO VARIABLES

    Directory of Open Access Journals (Sweden)

    A.N. Khomchenko

    2016-08-01

    The paper considers the problem of bicubic interpolation on a finite element of the serendipity family. Using cognitive-graphical analysis, the rigid model of Ergatoudis, Irons and Zienkiewicz (1968) is compared with alternative models obtained by three methods: direct geometric design, weighted averaging of the basis polynomials, and systematic generation of bases (an advanced Taylor procedure). The emphasis is placed on the phenomenon of "gravitational repulsion" (the Zienkiewicz paradox). The causes of physically inadequate spectra of nodal loads on serendipity elements of higher orders are investigated. Soft modeling allows us to build many serendipity elements of bicubic interpolation, without even needing to know the exact form of the rigid model. Different interpretations of the integral characteristics of the basis polynomials are offered: geometrical, physical, and probabilistic. A soft model in the theory of interpolation of functions of two variables means a model amenable to change through the choice of basis. Such changes are excluded in the family of Lagrangian finite elements of higher orders (hard modeling). Standard models of the serendipity family (Zienkiewicz) were also rigid. It was found that the "responsibility" for the rigidity of the serendipity model rests on ruled surfaces (of zero Gaussian curvature), conoids, which predominate in the basis set. Cognitive portraits of the zero lines of standard serendipity surfaces suggested that, in order to "soften" the serendipity pattern, conoids should be replaced by surfaces of alternating Gaussian curvature. The article shows alternative (soft) bases of serendipity models. The work is devoted to solving scientific and technological problems aimed at the creation, dissemination and use of cognitive computer graphics in teaching and learning. The results are of interest to students of the specialties "Computer Science and Information Technologies", "System Analysis", and "Software Engineering", as well as
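
    For orientation, the rigid (standard) bicubic serendipity basis that the soft alternatives are measured against is the classical 12-node element of Ergatoudis, Irons and Zienkiewicz; on the reference square [-1, 1]² its shape functions are usually written as follows (quoted from standard finite element texts, not from this paper):

```latex
% Corner nodes (\xi_i = \pm 1,\ \eta_i = \pm 1):
N_i(\xi,\eta) = \tfrac{1}{32}(1+\xi\xi_i)(1+\eta\eta_i)\bigl[9(\xi^2+\eta^2)-10\bigr]
% Edge nodes (e.g.\ \xi_i = \pm 1,\ \eta_i = \pm\tfrac{1}{3}):
N_i(\xi,\eta) = \tfrac{9}{32}(1+\xi\xi_i)(1-\eta^2)(1+9\eta\eta_i)
```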

  15. Computational and Experimental Approaches to Visual Aesthetics

    Science.gov (United States)

    Brachmann, Anselm; Redies, Christoph

    2017-01-01

    Aesthetics has been the subject of long-standing debates by philosophers and psychologists alike. In psychology, it is generally agreed that aesthetic experience results from an interaction between perception, cognition, and emotion. By experimental means, this triad has been studied in the field of experimental aesthetics, which aims to gain a better understanding of how aesthetic experience relates to fundamental principles of human visual perception and brain processes. Recently, researchers in computer vision have also gained interest in the topic, giving rise to the field of computational aesthetics. With computing hardware and methodology developing at a high pace, the modeling of perceptually relevant aspects of aesthetic stimuli has huge potential. In this review, we present an overview of recent developments in computational aesthetics and how they relate to experimental studies. In the first part, we cover topics such as the prediction of ratings, style and artist identification, as well as computational methods in art history, such as the detection of influences among artists or forgeries. We also describe currently used computational algorithms, such as classifiers and deep neural networks. In the second part, we summarize results from the field of experimental aesthetics and cover several isolated image properties that are believed to have an effect on the aesthetic appeal of visual stimuli. Their relation to each other and to findings from computational aesthetics are discussed. Moreover, we compare the strategies in the two fields of research and suggest that both fields would greatly profit from a joint research effort. We hope to encourage researchers from both disciplines to work more closely together in order to understand visual aesthetics from an integrated point of view. PMID:29184491

  16. General Biology and Current Management Approaches of Soft Scale Pests (Hemiptera: Coccidae).

    Science.gov (United States)

    Camacho, Ernesto Robayo; Chong, Juang-Horng

    We summarize the economic importance, biology, and management of soft scales, focusing on pests of agricultural, horticultural, and silvicultural crops in outdoor production systems and urban landscapes. We also provide summaries on voltinism, crawler emergence timing, and predictive models for crawler emergence to assist in developing soft scale management programs. Phloem-feeding soft scale pests cause direct (e.g., injuries to plant tissues and removal of nutrients) and indirect damage (e.g., reduction in photosynthesis and aesthetic value by honeydew and sooty mold). Variations in life cycle, reproduction, fecundity, and behavior exist among congenerics due to host, environmental, climatic, and geographical variations. Sampling of soft scale pests involves sighting the insects or their damage, and assessing their abundance. Crawlers of most univoltine species emerge in the spring and the summer. Degree-day models and plant phenological indicators help determine the initiation of sampling and treatment against crawlers (the life stage most vulnerable to contact insecticides). The efficacy of cultural management tactics, such as fertilization, pruning, and irrigation, in reducing soft scale abundance is poorly documented. A large number of parasitoids and predators attack soft scale populations in the field; therefore, natural enemy conservation by using selective insecticides is important. Systemic insecticides provide greater flexibility in application method and timing, and have longer residual longevity than contact insecticides. Application timing of contact insecticides that coincides with crawler emergence is most effective in reducing soft scale abundance.
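
    The degree-day models mentioned for timing crawler emergence boil down to accumulating daily heat units above a base temperature until a target sum is reached; a minimal sketch using the simple averaging method, where the base temperature and degree-day target are placeholders rather than values from the review:

```python
def daily_degree_days(t_max: float, t_min: float, t_base: float) -> float:
    """Heat units for one day by the simple averaging method."""
    return max((t_max + t_min) / 2.0 - t_base, 0.0)

def predicted_emergence_day(daily_temps, t_base=10.0, dd_target=300.0):
    """First day on which accumulated degree-days reach the target."""
    total = 0.0
    for day, (t_max, t_min) in enumerate(daily_temps, start=1):
        total += daily_degree_days(t_max, t_min, t_base)
        if total >= dd_target:
            return day
    return None  # target not reached within the temperature record
```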

  17. Multifractal Analysis of Seismically Induced Soft-Sediment Deformation Structures Imaged by X-Ray Computed Tomography

    Science.gov (United States)

    Nakashima, Yoshito; Komatsubara, Junko

    Unconsolidated soft sediments deform and mix complexly by seismically induced fluidization. Such geological soft-sediment deformation structures (SSDSs) recorded in boring cores were imaged by X-ray computed tomography (CT), which enables visualization of the inhomogeneous spatial distribution of iron-bearing mineral grains as strong X-ray absorbers in the deformed strata. Multifractal analysis was applied to the two-dimensional (2D) CT images with various degrees of deformation and mixing. The results show that the distribution of the iron-bearing mineral grains is multifractal for less deformed/mixed strata and almost monofractal for fully mixed (i.e. almost homogenized) strata. Computer simulations of deformation of real and synthetic digital images were performed using the egg-beater flow model. The simulations successfully reproduced the transformation from the multifractal spectra into almost monofractal spectra (i.e. almost convergence on a single point) with an increase in deformation/mixing intensity. The present study demonstrates that multifractal analysis coupled with X-ray CT and the mixing flow model is useful to quantify the complexity of seismically induced SSDSs, standing as a novel method for the evaluation of cores for seismic risk assessment.
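
    The multifractal analysis itself can be sketched with the standard box-counting (moment) method: tile the image at several box sizes, form the partition function of the box measures, and read the mass exponents tau(q) off the log-log slopes. A near-monofractal image gives tau(q) close to linear in q. This is the generic estimator, not necessarily the authors' exact implementation:

```python
import numpy as np

def mass_exponents(image, qs=(-2.0, -1.0, 0.5, 2.0, 4.0),
                   sizes=(2, 4, 8, 16, 32)):
    """Estimate multifractal mass exponents tau(q) of a 2D intensity map."""
    img = np.asarray(image, dtype=float)
    taus = {}
    for q in qs:
        log_z, log_eps = [], []
        for s in sizes:
            h, w = (img.shape[0] // s) * s, (img.shape[1] // s) * s
            crop = img[:h, :w]
            boxes = crop.reshape(h // s, s, w // s, s).sum(axis=(1, 3))
            p = boxes[boxes > 0] / crop.sum()      # box measures
            log_z.append(np.log(np.sum(p ** q)))   # partition function
            log_eps.append(np.log(s / img.shape[0]))
        taus[q] = np.polyfit(log_eps, log_z, 1)[0]  # slope = tau(q)
    return taus
```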

  18. Imaging of musculoskeletal soft tissue infections

    Energy Technology Data Exchange (ETDEWEB)

    Turecki, Marcin B.; Taljanovic, Mihra S.; Holden, Dean A.; Hunter, Tim B.; Rogers, Lee F. [University of Arizona HSC, Department of Radiology, Tucson, AZ (United States); Stubbs, Alana Y. [Southern Arizona VA Health Care System, Department of Radiology, Tucson, AZ (United States); Graham, Anna R. [University of Arizona HSC, Department of Pathology, Tucson, AZ (United States)

    2010-10-15

    Prompt and appropriate imaging work-up of the various musculoskeletal soft tissue infections aids early diagnosis and treatment and decreases the risk of complications resulting from misdiagnosis or delayed diagnosis. The signs and symptoms of musculoskeletal soft tissue infections can be nonspecific, making it clinically difficult to distinguish between disease processes and the extent of disease. Magnetic resonance imaging (MRI) is the imaging modality of choice in the evaluation of soft tissue infections. Computed tomography (CT), ultrasound, radiography and nuclear medicine studies are considered ancillary. This manuscript illustrates representative images of superficial and deep soft tissue infections such as infectious cellulitis, superficial and deep fasciitis, including the necrotizing fasciitis, pyomyositis/soft tissue abscess, septic bursitis and tenosynovitis on different imaging modalities, with emphasis on MRI. Typical histopathologic findings of soft tissue infections are also presented. The imaging approach described in the manuscript is based on relevant literature and authors' personal experience and everyday practice. (orig.)

  19. Imaging of musculoskeletal soft tissue infections

    International Nuclear Information System (INIS)

    Turecki, Marcin B.; Taljanovic, Mihra S.; Holden, Dean A.; Hunter, Tim B.; Rogers, Lee F.; Stubbs, Alana Y.; Graham, Anna R.

    2010-01-01

    Prompt and appropriate imaging work-up of the various musculoskeletal soft tissue infections aids early diagnosis and treatment and decreases the risk of complications resulting from misdiagnosis or delayed diagnosis. The signs and symptoms of musculoskeletal soft tissue infections can be nonspecific, making it clinically difficult to distinguish between disease processes and the extent of disease. Magnetic resonance imaging (MRI) is the imaging modality of choice in the evaluation of soft tissue infections. Computed tomography (CT), ultrasound, radiography and nuclear medicine studies are considered ancillary. This manuscript illustrates representative images of superficial and deep soft tissue infections such as infectious cellulitis, superficial and deep fasciitis, including the necrotizing fasciitis, pyomyositis/soft tissue abscess, septic bursitis and tenosynovitis on different imaging modalities, with emphasis on MRI. Typical histopathologic findings of soft tissue infections are also presented. The imaging approach described in the manuscript is based on relevant literature and authors' personal experience and everyday practice. (orig.)

  20. Computational Approaches to Chemical Hazard Assessment

    Science.gov (United States)

    Luechtefeld, Thomas; Hartung, Thomas

    2018-01-01

    Computational prediction of toxicity has reached new heights as a result of decades of growth in the magnitude and diversity of biological data. Public packages for statistics and machine learning make model creation faster. New theory in machine learning and cheminformatics enables integration of chemical structure, toxicogenomics, simulated and physical data in the prediction of chemical health hazards, and other toxicological information. Our earlier publications have characterized a toxicological dataset of unprecedented scale resulting from the European REACH legislation (Registration Evaluation Authorisation and Restriction of Chemicals). These publications dove into potential use cases for regulatory data and some models for exploiting this data. This article analyzes the options for the identification and categorization of chemicals, moves on to the derivation of descriptive features for chemicals, discusses different kinds of targets modeled in computational toxicology, and ends with a high-level perspective of the algorithms used to create computational toxicology models. PMID:29101769

  1. Uncertainty in biology a computational modeling approach

    CERN Document Server

    Gomez-Cabrero, David

    2016-01-01

    Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling allows us to reduce, refine and replace animal experimentation as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book addresses four main issues related to the building and validation of computational models of biomedical processes: (i) model establishment under uncertainty; (ii) model selection and parameter fitting; (iii) sensitivity analysis and model adaptation; and (iv) model predictions under uncertainty. In each of the abovementioned areas, the book discusses a number of key techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate stude...

  2. Soft-error tolerance and energy consumption evaluation of embedded computer with magnetic random access memory in practical systems using computer simulations

    Science.gov (United States)

    Nebashi, Ryusuke; Sakimura, Noboru; Sugibayashi, Tadahiko

    2017-08-01

    We evaluated the soft-error tolerance and energy consumption of an embedded computer with magnetic random access memory (MRAM) using two computer simulators. One is a central processing unit (CPU) simulator of a typical embedded computer system. We simulated the radiation-induced single-event-upset (SEU) probability in a spin-transfer-torque MRAM cell and also the failure rate of a typical embedded computer due to SEU errors in its main memory. The other is a delay tolerant network (DTN) system simulator. It simulates the power dissipation of the system's wireless sensor network nodes using a revised CPU simulator and a network simulator. We demonstrated that the SEU effect on the embedded computer with 1 Gbit MRAM-based working memory is less than 1 failure in time (FIT, i.e., one failure per 10⁹ device-hours). We also demonstrated that the energy consumption of the DTN sensor node with MRAM-based working memory can be reduced to 1/11. These results indicate that MRAM-based working memory enhances the disaster tolerance of embedded computers.

  3. A Computing Method to Determine the Performance of an Ionic Liquid Gel Soft Actuator.

    Science.gov (United States)

    He, Bin; Zhang, Chenghong; Zhou, Yanmin; Wang, Zhipeng

    2018-01-01

    A new type of soft actuator material, an ionic liquid gel (ILG) consisting of BMIMBF₄, HEMA, DEAP, and ZrO₂, is polymerized into a gel state under ultraviolet (UV) light irradiation. In this paper, we first propose that the ILG conforms to the assumptions of hyperelastic theory and that the Mooney-Rivlin model can be used to study its properties. Under the five-parameter and nine-parameter Mooney-Rivlin models, the formulas for the calculation of the uniaxial tensile stress, plane uniform tensile stress, and 3D directional stress are deduced. The five-parameter and nine-parameter Mooney-Rivlin models of the ILG with a ZrO₂ content of 3 wt% were obtained by uniaxial tensile testing, with parameters denoted as c₁₀, c₀₁, c₂₀, c₁₁, and c₀₂ and as c₁₀, c₀₁, c₂₀, c₁₁, c₀₂, c₃₀, c₂₁, c₁₂, and c₀₃, respectively. Through the analysis and comparison of the uniaxial tensile stress between the calculated and experimental data, the error between the stress data calculated from the five-parameter Mooney-Rivlin model and the experimental data is less than 0.51%, and the error between the stress data calculated from the nine-parameter Mooney-Rivlin model and the experimental data is no more than 8.87%. Hence, our work presents a feasible and credible formula for the calculation of the stress of the ILG. This work opens a new path to assess the performance of a soft actuator composed of an ILG and will contribute to the optimized design of soft robots.
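
    For reference, the incompressible five-parameter Mooney-Rivlin form named above has the textbook strain-energy density and uniaxial stress written out below (standard hyperelasticity, quoted here for context; the paper's own derivation may differ in notation):

```latex
W = c_{10}(I_1-3) + c_{01}(I_2-3) + c_{20}(I_1-3)^2
    + c_{11}(I_1-3)(I_2-3) + c_{02}(I_2-3)^2,
\qquad I_1 = \lambda^2 + \frac{2}{\lambda}, \quad I_2 = 2\lambda + \frac{1}{\lambda^2},
```

    and for an incompressible specimen under uniaxial stretch the nominal (engineering) stress follows as

```latex
\sigma_{\mathrm{eng}} = 2\left(\lambda - \lambda^{-2}\right)
\left(\frac{\partial W}{\partial I_1} + \frac{1}{\lambda}\,\frac{\partial W}{\partial I_2}\right).
```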

  4. Approaching Engagement towards Human-Engaged Computing

    DEFF Research Database (Denmark)

    Niksirat, Kavous Salehzadeh; Sarcar, Sayan; Sun, Huatong

    2018-01-01

    Debates regarding the nature and role of HCI research and practice have intensified in recent years, given the ever increasingly intertwined relations between humans and technologies. The framework of Human-Engaged Computing (HEC) was proposed and developed over a series of scholarly workshops to...

  5. Computational and mathematical approaches to societal transitions

    NARCIS (Netherlands)

    J.S. Timmermans (Jos); F. Squazzoni (Flaminio); J. de Haan (Hans)

    2008-01-01

    After an introduction of the theoretical framework and concepts of transition studies, this article gives an overview of how structural change in social systems has been studied from various disciplinary perspectives. This overview first leads to the conclusion that computational and

  6. Heterogeneous Computing in Economics: A Simplified Approach

    DEFF Research Database (Denmark)

    Dziubinski, Matt P.; Grassi, Stefano

    This paper shows the potential of heterogeneous computing in solving dynamic equilibrium models in economics. We illustrate the power and simplicity of the C++ Accelerated Massive Parallelism recently introduced by Microsoft. Starting from the same exercise as Aldrich et al. (2011) we document a ...

  7. A Constructive Induction Approach to Computer Immunology

    Science.gov (United States)

    1999-03-01

    [LVM98] Lamont, Gary B., David A. Van Veldhuizen, and Robert E. Marmelstein, A Distributed Architecture for a Self-Adaptive Computer Virus … Artificial Intelligence, Herndon, VA, 1995. [MVL98] Marmelstein, Robert E., David A. Van Veldhuizen, and Gary B. Lamont, Modeling & Analysis

  8. Using soft computing techniques to predict corrected air permeability using Thomeer parameters, air porosity and grain density

    Science.gov (United States)

    Nooruddin, Hasan A.; Anifowose, Fatai; Abdulraheem, Abdulazeez

    2014-03-01

    Soft computing techniques have recently become very popular in the oil industry. A number of computational intelligence-based predictive methods have been widely applied in the industry with high prediction capabilities. Some of the popular methods include feed-forward neural networks, radial basis function networks, generalized regression neural networks, functional networks, support vector regression and adaptive network fuzzy inference systems. A comparative study among the most popular soft computing techniques is presented using a large dataset published in the literature describing multimodal pore systems in the Arab D formation. The inputs to the models are air porosity, grain density, and Thomeer parameters obtained using mercury injection capillary pressure profiles. Corrected air permeability is the target variable. Applying the developed permeability models in a recent reservoir characterization workflow ensures consistency between micro- and macro-scale information, represented mainly by the Thomeer parameters and absolute permeability. The dataset was divided into two parts, with 80% of the data used for training and 20% for testing. The target permeability variable was transformed to the logarithmic scale as a pre-processing step and to show better correlations with the input variables. Statistical and graphical analyses of the results, including permeability cross-plots and detailed error measures, were carried out. In general, the comparative study showed very close results among the developed models. The feed-forward neural network permeability model showed the lowest average relative error, average absolute relative error, standard deviation of error and root mean square error, making it the best model for such problems. The adaptive network fuzzy inference system also showed very good results.
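
    Of the listed techniques, the feed-forward neural network variant can be sketched compactly with scikit-learn; the synthetic data below merely stands in for the Arab D dataset (air porosity, grain density and Thomeer parameters as inputs, log-transformed permeability as the target, 80/20 split as described):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.random((500, 5))                # placeholder input columns
log_k = 3.0 * X[:, 0] - 1.0 + 0.1 * rng.standard_normal(500)  # log10 permeability

X_tr, X_te, y_tr, y_te = train_test_split(X, log_k, test_size=0.2,
                                          random_state=0)
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
)
model.fit(X_tr, y_tr)
print("held-out R^2 on log-permeability:", model.score(X_te, y_te))
```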

  9. Advances in Soft Matter Mechanics

    CERN Document Server

    Li, Shaofan

    2012-01-01

    "Advances in Soft Matter Mechanics" is a compilation and selection of recent works in soft matter mechanics by a group of active researchers in the field. The main objectives of this book are first to disseminate the latest developments in soft matter mechanics in the field of applied and computational mechanics, and second to introduce soft matter mechanics as a sub-discipline of soft matter physics. As an important branch of soft matter physics, soft matter mechanics has developed rapidly in recent years. A number of the novel approaches discussed in this book are unique, such as the coarse grained finite element method for modeling colloidal adhesion, entropic elasticity, meshfree simulations of liquid crystal elastomers, simulations of DNA, etc. The book is intended for researchers and graduate students in the field of mechanics, condensed matter physics and biomaterials. Dr. Shaofan Li is a professor of the University of California-Berkeley, U.S.A; Dr. Bohua Sun is a professor of Cape Peninsula Universit...

  10. Computational and Game-Theoretic Approaches for Modeling Bounded Rationality

    NARCIS (Netherlands)

    L. Waltman (Ludo)

    2011-01-01

    This thesis studies various computational and game-theoretic approaches to economic modeling. Unlike traditional approaches to economic modeling, the approaches studied in this thesis do not rely on the assumption that economic agents behave in a fully rational way. Instead, economic

  11. Computational Thinking and Practice - A Generic Approach to Computing in Danish High Schools

    DEFF Research Database (Denmark)

    Caspersen, Michael E.; Nowack, Palle

    2014-01-01

    Internationally, there is a growing awareness of the necessity of providing relevant computing education in schools, particularly high schools. We present a new and generic approach to Computing in Danish High Schools based on a conceptual framework derived from ideas related to computational thinking...

  12. Machine learning and computer vision approaches for phenotypic profiling.

    Science.gov (United States)

    Grys, Ben T; Lo, Dara S; Sahin, Nil; Kraus, Oren Z; Morris, Quaid; Boone, Charles; Andrews, Brenda J

    2017-01-02

    With recent advances in high-throughput, automated microscopy, there has been an increased demand for effective computational strategies to analyze large-scale, image-based data. To this end, computer vision approaches have been applied to cell segmentation and feature extraction, whereas machine-learning approaches have been developed to aid in phenotypic classification and clustering of data acquired from biological images. Here, we provide an overview of the commonly used computer vision and machine-learning methods for generating and categorizing phenotypic profiles, highlighting the general biological utility of each approach. © 2017 Grys et al.

  13. Material parameter identification and inverse problems in soft tissue biomechanics

    CERN Document Server

    Evans, Sam

    2017-01-01

    The articles in this book review hybrid experimental-computational methods applied to soft tissues which have been developed by worldwide specialists in the field. People developing computational models of soft tissues and organs will find solutions for calibrating the material parameters of their models; people performing tests on soft tissues will learn what to extract from the data and how to use these data for their models and people worried about the complexity of the biomechanical behavior of soft tissues will find relevant approaches to address this complexity.

  14. Computational Approaches to Simulation and Optimization of Global Aircraft Trajectories

    Science.gov (United States)

    Ng, Hok Kwan; Sridhar, Banavar

    2016-01-01

    This study examines three possible approaches to improving the speed of generating wind-optimal routes for air traffic at the national or global level. They are: (a) using the resources of a supercomputer, (b) running the computations on multiple commercially available computers, and (c) implementing those same algorithms into NASA's Future ATM Concepts Evaluation Tool (FACET); all three are compared to a standard implementation run on a single CPU. Wind-optimal aircraft trajectories are computed using global air traffic schedules. The run time and wait time on the supercomputer for trajectory optimization using various numbers of CPUs ranging from 80 to 10,240 units are compared with the total computational time for running the same computation on a single desktop computer and on multiple commercially available computers, for potential computational enhancement through parallel processing on the computer clusters. This study also re-implements the trajectory optimization algorithm for further reduction of computational time through algorithm modifications and integrates that with FACET to facilitate the use of the new features, which calculate time-optimal routes between worldwide airport pairs in a wind field, for use with existing FACET applications. The implementations of the trajectory optimization algorithms use the MATLAB, Python, and Java programming languages. The performance evaluations are done by comparing computational efficiencies, based on the potential application of optimized trajectories. The paper shows that in the absence of special privileges on a supercomputer, a cluster of commercially available computers provides a feasible approach for national and global air traffic system studies.

  15. Computational approach to large quantum dynamical problems

    International Nuclear Information System (INIS)

    Friesner, R.A.; Brunet, J.P.; Wyatt, R.E.; Leforestier, C.; Binkley, S.

    1987-01-01

    The organizational structure is described for a new program that permits computations on a variety of quantum mechanical problems in chemical dynamics and spectroscopy. Particular attention is devoted to developing and using algorithms that exploit the capabilities of current vector supercomputers. A key component in this procedure is the recursive transformation of the large sparse Hamiltonian matrix into a much smaller tridiagonal matrix. An application to time-dependent laser molecule energy transfer is presented. Rate of energy deposition in the multimode molecule for systematic variations in the molecular intermode coupling parameters is emphasized
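
    The "recursive transformation of the large sparse Hamiltonian matrix into a much smaller tridiagonal matrix" is the Lanczos recursion; a minimal dense-matrix sketch (a production code would use sparse matrix-vector products and guard against breakdown and loss of orthogonality):

```python
import numpy as np

def lanczos(H: np.ndarray, v0: np.ndarray, m: int):
    """Reduce a symmetric matrix H to an m x m tridiagonal T."""
    n = v0.size
    V = np.zeros((n, m))                 # Lanczos vectors, column by column
    alpha, beta = np.zeros(m), np.zeros(m - 1)
    v = v0 / np.linalg.norm(v0)
    V[:, 0] = v
    w = H @ v
    alpha[0] = v @ w
    w = w - alpha[0] * v
    for j in range(1, m):
        beta[j - 1] = np.linalg.norm(w)  # breakdown if this hits zero
        v = w / beta[j - 1]
        V[:, j] = v
        w = H @ v - beta[j - 1] * V[:, j - 1]
        alpha[j] = v @ w
        w = w - alpha[j] * v
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    return T, V  # eigenvalues of T approximate extremal eigenvalues of H
```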

  16. A complex network approach to cloud computing

    International Nuclear Information System (INIS)

    Travieso, Gonzalo; Ruggiero, Carlos Antônio; Bruno, Odemir Martinez; Costa, Luciano da Fontoura

    2016-01-01

    Cloud computing has become an important means to speed up computing. One problem heavily influencing the performance of such systems is the choice of nodes as servers responsible for executing the clients' tasks. In this article we report how complex networks can be used to model this problem. More specifically, we investigate the performance of processing in cloud systems underlaid by Erdős–Rényi (ER) and Barabási-Albert (BA) topologies containing two servers. Cloud networks involving two communities, not necessarily of the same size, are also considered in our analysis. The performance of each configuration is quantified in terms of the cost of communication between the client and the nearest server, and the balance of the distribution of tasks between the two servers. Regarding the latter, the ER topology provides better performance than the BA for smaller average degrees and the opposite behaviour for larger average degrees. With respect to cost, smaller values are found in the BA topology irrespective of the average degree. In addition, we also verified that it is easier to find good servers in ER than in BA networks. Surprisingly, balance and cost are not much affected by the presence of communities. However, for a well-defined community network, we found that it is important to assign each server to a different community so as to achieve better performance.
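
    Both quantities in the abstract, communication cost (distance from each client to its nearest server) and task balance between the two servers, are easy to reproduce on small synthetic graphs; a sketch with networkx, using arbitrary sizes and degrees rather than the paper's parameters:

```python
import networkx as nx

def cost_and_balance(G, s1, s2):
    """Mean client-to-nearest-server distance and the share routed to s1."""
    d1 = nx.single_source_shortest_path_length(G, s1)
    d2 = nx.single_source_shortest_path_length(G, s2)
    dists, to_s1 = [], 0
    for node in G:
        if node in (s1, s2):
            continue
        a, b = d1.get(node), d2.get(node)
        reachable = [d for d in (a, b) if d is not None]
        if not reachable:
            continue  # client cannot reach either server
        dists.append(min(reachable))
        if a is not None and (b is None or a <= b):
            to_s1 += 1
    return sum(dists) / len(dists), to_s1 / len(dists)

G_er = nx.erdos_renyi_graph(200, 0.04, seed=1)   # ER topology
G_ba = nx.barabasi_albert_graph(200, 4, seed=1)  # BA topology
print("ER cost/balance:", cost_and_balance(G_er, 0, 1))
print("BA cost/balance:", cost_and_balance(G_ba, 0, 1))
```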

  17. A Biogeotechnical approach to Stabilize Soft Marine Soil with a Microbial Organic Material called Biopolymer

    Science.gov (United States)

    Chang, I.; Cho, G. C.; Kwon, Y. M.; Im, J.

    2017-12-01

    The importance of and demand for offshore and coastal area development are increasing due to the shortage of usable land and the need for access to valuable marine resources. However, most coastal soils are soft sediments, mainly composed of fines (silt and clay) and having high water and organic contents, which induce complicated mechanical and geochemical behaviors and can be insufficient from a geotechnical engineering perspective. Soil stabilization procedures are therefore required for these soft sediments, regardless of the intended use of the site. One of the most common soft soil stabilization methods is using ordinary cement as a soil strengthening binder. However, the use of cement in marine environments is reported to raise environmental concerns such as a pH increase and the accompanying disturbance of the marine ecosystem. Therefore, a new environmentally friendly treatment material is needed for coastal and offshore soils. In this study, a biopolymer material produced by microbes is introduced to enhance the physical behavior of a soft tidal flat sediment, considering the biopolymer rheology, soil mineralogy, and chemical properties of marine water. The biopolymer material used in this study forms inter-particle bonds promoted through cation bridges, where the cations are provided by marine water. Moreover, biopolymer treatment renders a unique stress-strain relationship in soft soils. The mechanical stiffness (M) increases instantly in the presence of the biopolymer, while the time-dependent settlement behavior (consolidation) shows a large delay due to the viscous biopolymer hydrogels in pore spaces.

  18. Soft, chewable gelatin-based pharmaceutical oral formulations: a technical approach.

    Science.gov (United States)

    Dille, Morten J; Hattrem, Magnus N; Draget, Kurt I

    2018-06-01

    Hard tablets and capsules for oral drug delivery cause problems for people experiencing dysphagia. This work describes the formulation and properties of a gelatin-based, self-preserved, soft chewable tablet as an alternative and novel drug delivery format. Gelatin (8.8-10% in 24.7-29% water) constituted the matrix of the soft, semi-solid tablets. Three different pharmaceuticals (ibuprofen 10%, acetaminophen 15%, and meloxicam 1.5%) were tested in this formulation. Microbial stability was controlled by lowering the water activity with a mixture of sorbitol and xylitol (45.6-55%). Rheological properties were tested applying small-strain oscillation measurements. Taste masking of ibuprofen soft-chew tablets was achieved by keeping the ibuprofen insoluble at pH 4.5 and keeping the processing temperature below the crystalline-to-amorphous transition temperature. Soft-chew formulations showed good stability for all three pharmaceuticals (up to 24 months), and the ibuprofen-containing formulation exhibited dissolution comparable to a standard oral tablet as well as good microbial stability. The rheological properties of the ibuprofen/gelatin formulation had the fingerprint of a true gelatin gel, albeit with higher moduli and melting temperature. The results suggest that easy-to-swallow and well taste-masked soft chewable tablet formulations with extended shelf life are within reach for several active pharmaceutical ingredients (APIs).

  19. The fundamentals of computational intelligence system approach

    CERN Document Server

    Zgurovsky, Mikhail Z

    2017-01-01

    This monograph is dedicated to the systematic presentation of the main trends, technologies and methods of computational intelligence (CI). The book pays particular attention to important novel CI technologies: fuzzy logic (FL) systems and fuzzy neural networks (FNN). Different FNN, including a new class of FNN, cascade neo-fuzzy neural networks, are considered, and their training algorithms are described and analyzed. The applications of FNN to forecasting in macroeconomics and at stock markets are examined. The book presents the problem of portfolio optimization under uncertainty, a novel theory of fuzzy portfolio optimization free of the drawbacks of the classical Markowitz model, as well as an application to portfolio optimization at Ukrainian, Russian and American stock exchanges. The book also presents the problem of forecasting corporate bankruptcy risk under incomplete and fuzzy information, as well as new methods based on fuzzy set theory and fuzzy neural networks and the results of their application for bankruptcy ris...

  20. Investigation of coupling of magnetohydrodynamic modes by soft x-ray computer tomography on the WT-3 tokamak

    International Nuclear Information System (INIS)

    Yoshimura, Satoru; Maekawa, Takashi; Terumichi, Yasushi

    2002-01-01

    The internal structure of the stationary m=1 and m=2 modes in an ohmic heating plasma and the double m=1 mode structure in a lower hybrid current drive plasma are investigated on the WT-3 tokamak [Maehara et al., Nucl. Fusion 38, 39 (1998)] using computer tomography after the application of the singular value decomposition to the soft x-ray signals. The results show that, in both cases, two coexisting modes have the same frequency and have a fixed mutual phase relation, indicating that two modes are coupled and rotate as one body in the toroidal direction. It is found that the mutual inductance of two loops of helical current filaments for producing magnetic islands always takes the maximum at the experimentally observed positions of two-mode structures. This result means not only that the electromagnetic coupling of two current loops is at the maximum, but also that the two loops are in the dynamically stable position
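
    The SVD pre-processing step applied to the soft x-ray signals separates the data matrix (chords x time) into paired spatial and temporal modes, which is what exposes the fixed phase relation between the two MHD modes; a minimal sketch on synthetic signals (geometry, frequency and amplitudes are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1e-3, 400)        # 1 ms of time samples
chords = np.linspace(-1.0, 1.0, 32)    # normalized chord positions
f = 5e3                                # common rotation frequency (Hz)

# Two coupled modes rotating as one body: same frequency, fixed phase offset.
mode1 = np.outer(np.sin(np.pi * chords), np.cos(2 * np.pi * f * t))
mode2 = 0.5 * np.outer(np.sin(2 * np.pi * chords),
                       np.cos(2 * np.pi * f * t + 0.7))
X = mode1 + mode2 + 0.05 * rng.standard_normal((32, 400))

U, s, Vt = np.linalg.svd(X, full_matrices=False)
print("leading singular values:", np.round(s[:4], 2))
# U[:, k] is the k-th spatial profile; Vt[k] is its time evolution.
```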

  1. Application of Soft Computing Tools for Wave Prediction at Specific Locations in the Arabian Sea Using Moored Buoy Observations

    Directory of Open Access Journals (Sweden)

    J. Vimala

    2012-12-01

    The knowledge of design and operational values of significant wave heights is perhaps the single most important input needed in ocean engineering studies. Conventionally, such information is obtained using classical statistical analysis and stochastic methods. As the causative variables are innumerable and the underlying physics is too complicated, the results obtained from numerical models may not always be very satisfactory. Soft computing tools like Artificial Neural Networks (ANN) and Adaptive Network-based Fuzzy Inference Systems (ANFIS) may therefore be useful to predict significant wave heights in some situations. The study is aimed at forecasting significant wave height values in real time over a period of 24 h at certain locations in Indian seas using ANN and ANFIS models. The data for the work were collected by the National Institute of Ocean Technology, Chennai. It was found that the predictions of wave heights can be made by both methods with comparable efficiency.

  2. Q-P Wave traveltime computation by an iterative approach

    KAUST Repository

    Ma, Xuxin; Alkhalifah, Tariq Ali

    2013-01-01

    In this work, we present a new approach to computing anisotropic traveltimes based on successively solving elliptically isotropic traveltime problems. The method shows good accuracy and is very simple to implement.

  3. Integration of case study approach, project design and computer ...

    African Journals Online (AJOL)

    Integration of case study approach, project design and computer modeling in managerial accounting education ... Journal of Fundamental and Applied Sciences ... in the Laboratory of Management Accounting and Controlling Systems at the ...

  4. Fractal approach to computer-analytical modelling of tree crown

    International Nuclear Information System (INIS)

    Berezovskaya, F.S.; Karev, G.P.; Kisliuk, O.F.; Khlebopros, R.G.; Tcelniker, Yu.L.

    1993-09-01

    In this paper we discuss three approaches to modeling tree crown development. These approaches are experimental (i.e. regression), theoretical (i.e. analytical) and simulation (i.e. computer) modeling. The assumption common to all three is that a tree can be regarded as a fractal object, i.e. a collection of self-similar objects that combines the properties of two- and three-dimensional bodies. We show that a fractal measure of the crown can be used as the link between the mathematical models of crown growth and of light propagation through the canopy. The computer approach makes it possible to visualize crown development and to calibrate the model on experimental data. In the paper, the different stages of the above-mentioned approaches are described. The experimental data for spruce, a description of the computer system for modeling, and a variant of the computer model are presented. (author). 9 refs, 4 figs
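
    To make the fractal framing concrete, the sketch below grows a toy self-similar crown by recursive branching and reports the similarity dimension implied by its branching rule. It is a minimal sketch: the branch count, length ratio, and angles are illustrative assumptions, not the paper's calibrated spruce parameters.

```python
# Hedged sketch: a minimal recursive branching model of a tree crown as a
# fractal object. Parameters are illustrative, not from the paper.
import math

def branch(x, y, angle, length, depth, segments):
    """Recursively grow two child branches per node, collecting line segments."""
    if depth == 0 or length < 0.01:
        return
    x2 = x + length * math.cos(angle)
    y2 = y + length * math.sin(angle)
    segments.append(((x, y), (x2, y2)))
    for da in (-0.35, 0.35):                  # branching angles (rad)
        branch(x2, y2, angle + da, 0.7 * length, depth - 1, segments)

segments = []
branch(0.0, 0.0, math.pi / 2, 1.0, depth=8, segments=segments)
# With N self-similar branches of length ratio r per node, the similarity
# dimension is D = log(N) / log(1/r); here D = log(2)/log(1/0.7) ~ 1.94.
print(len(segments), "segments; D ~", math.log(2) / math.log(1 / 0.7))
```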

  5. Soft, curved electrode systems capable of integration on the auricle as a persistent brain–computer interface

    Science.gov (United States)

    Norton, James J. S.; Lee, Dong Sup; Lee, Jung Woo; Lee, Woosik; Kwon, Ohjin; Won, Phillip; Jung, Sung-Young; Cheng, Huanyu; Jeong, Jae-Woong; Akce, Abdullah; Umunna, Stephen; Na, Ilyoun; Kwon, Yong Ho; Wang, Xiao-Qi; Liu, ZhuangJian; Paik, Ungyu; Huang, Yonggang; Bretl, Timothy; Yeo, Woon-Hong; Rogers, John A.

    2015-01-01

    Recent advances in electrodes for noninvasive recording of electroencephalograms expand opportunities for collecting such data for diagnosis of neurological disorders and brain–computer interfaces. Existing technologies, however, cannot be used effectively in continuous, uninterrupted modes for more than a few days due to irritation and irreversible degradation in the electrical and mechanical properties of the skin interface. Here we introduce a soft, foldable collection of electrodes in open, fractal mesh geometries that can mount directly and chronically on the complex surface topology of the auricle and the mastoid, to provide high-fidelity and long-term capture of electroencephalograms in ways that avoid any significant thermal, electrical, or mechanical loading of the skin. Experimental and computational studies establish the fundamental aspects of the bending and stretching mechanics that enable this type of intimate integration on the highly irregular and textured surfaces of the auricle. Cell-level tests and thermal imaging studies establish the biocompatibility and wearability of such systems, with examples of high-quality measurements over periods of 2 wk with devices that remain mounted throughout daily activities including vigorous exercise, swimming, sleeping, and bathing. Demonstrations include a text speller with a steady-state visually evoked potential-based brain–computer interface and elicitation of an event-related potential (P300 wave). PMID:25775550

  7. Bioinspired Computational Approach to Missing Value Estimation

    Directory of Open Access Journals (Sweden)

    Israel Edem Agbehadji

    2018-01-01

    Missing data occur when values of variables in a dataset are not stored. Estimating these missing values is a significant step during the data-cleansing phase of a big data management approach. Missing data may be due to nonresponse or omitted entries. If missing data are not handled properly, the results of data analysis may be inaccurate. Whereas a traditional method such as the maximum likelihood method extrapolates missing values, this paper proposes a bioinspired method based on the behavior of birds, specifically the Kestrel bird. This paper describes the behavior and characteristics of the Kestrel bird, a bioinspired approach, in modeling an algorithm to estimate missing values. The proposed algorithm (KSA) was compared with the WSAMP, Firefly, and BAT algorithms. The results were evaluated using the mean absolute error (MAE). Statistical tests (the Wilcoxon signed-rank test and the Friedman test) were conducted to assess the performance of the algorithms. The results of the Wilcoxon test indicate that time does not have a significant effect on performance and that the difference in quality of estimation between the paired algorithms was significant; the results of the Friedman test ranked KSA as the best evolutionary algorithm.
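
    The evaluation protocol implied here (mask known values, impute them, score with MAE) is easy to reproduce. The sketch below uses plain mean imputation as a stand-in estimator, since the KSA search dynamics are not reproduced; the data and masking rate are illustrative assumptions.

```python
# Hedged sketch: MAE-based evaluation of a missing-value estimator, with
# mean imputation standing in for KSA. Data and masking rate are synthetic.
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(50.0, 10.0, size=(200, 5))          # synthetic complete dataset
mask = rng.random(data.shape) < 0.1                   # hide 10% of the entries

observed = np.where(mask, np.nan, data)
col_means = np.nanmean(observed, axis=0)              # the "imputation model"
imputed = np.where(mask, col_means, observed)         # fill hidden cells

mae = np.mean(np.abs(imputed[mask] - data[mask]))     # score only the hidden cells
print(f"MAE of mean imputation: {mae:.3f}")
```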

  8. Computational fluid dynamics in ventilation: Practical approach

    Science.gov (United States)

    Fontaine, J. R.

    The potential of computational fluid dynamics (CFD) for conceiving ventilation systems is shown through the simulation of five practical cases. The following examples are considered: capture of pollutants over a surface-treating tank equipped with a unilateral suction slot in the presence of a disturbing air draft opposed to the suction; dispersion of solid aerosols inside fume cupboards; performance comparison of two general ventilation systems in a silkscreen printing workshop; ventilation of a large open painting area; and oil fog removal inside a mechanical engineering workshop. Whereas the first two problems are analyzed through two-dimensional numerical simulations, the three other cases require three-dimensional modeling. For the surface-treating-tank case, numerical results are compared to laboratory experimental data. All simulations are carried out using EOL, a CFD software package specially devised to deal with air quality problems in industrial ventilated premises. It contains many analysis tools for interpreting the results in terms familiar to the industrial hygienist. Much experimental work has been undertaken to validate the predictions of EOL for ventilation flows.

  9. "No zone" approach in penetrating neck trauma reduces unnecessary computed tomography angiography and negative explorations.

    Science.gov (United States)

    Ibraheem, Kareem; Khan, Muhammad; Rhee, Peter; Azim, Asad; O'Keeffe, Terence; Tang, Andrew; Kulvatunyou, Narong; Joseph, Bellal

    2018-01-01

    The most recent management guidelines advocate computed tomography angiography (CTA) for any suspected vascular or aero-digestive injury in all zones and give zone II injuries special consideration. We hypothesized that physical examination can safely guide CTA use in a "no zone" approach. An 8-year retrospective analysis of all adult trauma patients with penetrating neck trauma (PNT) was performed. We included all patients in whom the platysma was violated. Patients were classified into three groups: hard signs, soft signs, and asymptomatic. CTA use, positive CTA findings (contrast extravasation, dissection, or intimal flap), and operative details were reported. The primary outcomes were positive CTA and therapeutic neck exploration (TNE), defined by repair of major vascular or aero-digestive injuries. A total of 337 patients with PNT met the inclusion criteria. Eighty-two patients had hard signs, and all of them went to the operating room, of whom 59 (72%) had TNE. One hundred fifty-six patients had soft signs, of whom CTA was performed in 121 (78%), with positive findings in 12 (10%); the remaining 35 (22%) underwent initial neck exploration, of which 14 (40%) were therapeutic, yielding a high rate of negative exploration. Ninety-nine patients were asymptomatic, of whom CTA was performed in 79 (80%), with positive findings in 3 (4%); however, none of these patients required TNE. On subanalysis based on symptoms, there was no difference in the rate of TNE between the neck zones in patients with hard signs (P = 0.23) or soft signs (P = 0.51). Regardless of the zone of injury, asymptomatic patients did not require TNE. Physical examination, regardless of the zone of injury, should be the primary guide to CTA or TNE in patients with PNT. Following traditional zone-based guidelines can result in unnecessary negative explorations in patients with soft signs and may need rethinking.

  10. A timed-automata approach for critical path detection in a soft real-time application

    NARCIS (Netherlands)

    Yildiz, Bugra Mehmet; Bockisch, Christoph; Rensink, Arend; Aksit, Mehmet

    In this paper, we report preliminary ideas from our project called “Time Performance Improvement With Parallel Processing Systems” (TIPS). In the TIPS project, we plan to take advantage of multi-core platforms for performance improvement by parallelizing a complex soft real-time application. In

  11. Out-of-equilibrium self-assembly approaches for new soft materials

    NARCIS (Netherlands)

    Hendriksen, W.E.

    2015-01-01

    Living creatures consist, to an important extent, of soft materials, such as skin, organs and cells, that are formed out of equilibrium by the self-assembly of molecular building blocks. Natural materials are continuously active, with dynamic processes occurring, such as growth, shrinkage and

  12. Multi-Attribute Decision-Making Method Based on Neutrosophic Soft Rough Information

    Directory of Open Access Journals (Sweden)

    Muhammad Akram

    2018-03-01

    Soft sets (SSs), neutrosophic sets (NSs), and rough sets (RSs) are different mathematical models for handling uncertainties, but they are mutually related. In this research paper, we introduce the notions of soft rough neutrosophic sets (SRNSs) and neutrosophic soft rough sets (NSRSs) as hybrid models for soft computing. We describe a mathematical approach to handling decision-making problems in view of NSRSs. We also present an efficient algorithm based on our proposed hybrid model to solve decision-making problems.

  13. Convergence Analysis of a Class of Computational Intelligence Approaches

    Directory of Open Access Journals (Sweden)

    Junfeng Chen

    2013-01-01

    Computational intelligence approaches form a relatively new interdisciplinary field of research with many promising application areas. Although computational intelligence approaches have gained huge popularity, their convergence is difficult to analyze. In this paper, a computational model is built for a class of computational intelligence approaches represented by the canonical forms of genetic algorithms, ant colony optimization, and particle swarm optimization, in order to describe the common features of these algorithms. Two quantification indices, the variation rate and the progress rate, are then defined to indicate, respectively, the variety and the optimality of the solution sets generated in the search process of the model. Moreover, we give four types of probabilistic convergence for the solution-set updating sequences, and their relations are discussed. Finally, sufficient conditions are derived for the almost sure weak convergence and the almost sure strong convergence of the model by introducing martingale theory into the Markov chain analysis.
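
    For readers unfamiliar with the algorithm class being analyzed, the sketch below implements the canonical particle swarm optimization update, one member of that class. The inertia and acceleration coefficients and the sphere objective are textbook defaults, not values taken from the paper.

```python
# Hedged sketch: canonical PSO, one of the algorithms whose convergence the
# record above studies. Coefficients and objective are illustrative.
import numpy as np

rng = np.random.default_rng(2)
f = lambda x: np.sum(x**2, axis=1)            # sphere function to minimize

N, D, W, C1, C2 = 30, 5, 0.7, 1.5, 1.5
x = rng.uniform(-5, 5, (N, D))
v = np.zeros((N, D))
pbest, pbest_f = x.copy(), f(x)

for _ in range(200):
    g = pbest[np.argmin(pbest_f)]             # global best position
    r1, r2 = rng.random((N, D)), rng.random((N, D))
    v = W * v + C1 * r1 * (pbest - x) + C2 * r2 * (g - x)
    x = x + v
    fx = f(x)
    improved = fx < pbest_f
    pbest[improved], pbest_f[improved] = x[improved], fx[improved]

print("best value found:", pbest_f.min())
```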

  14. A New Soft Computing Method for K-Harmonic Means Clustering.

    Science.gov (United States)

    Yeh, Wei-Chang; Jiang, Yunzhi; Chen, Yee-Fen; Chen, Zhe

    2016-01-01

    The K-harmonic means (KHM) clustering algorithm is a clustering method that groups data such that the sum of the harmonic averages of the distances between each entity and all cluster centroids is minimized. Because it is less sensitive to initialization than K-means (KM), many researchers have recently been attracted to studying KHM. In this study, the proposed iSSO-KHM is based on an improved simplified swarm optimization (iSSO) and integrates a variable neighborhood search (VNS) for KHM clustering. As evidence of the utility of the proposed iSSO-KHM, we present extensive computational results on eight benchmark problems. The computational results support the superiority of the proposed iSSO-KHM over previously developed algorithms across all experiments from the literature.
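
    The objective being minimized is simple to state directly from its definition, as the sketch below does. The data and centroids are illustrative, and the swarm-based search of the paper is not reproduced.

```python
# Hedged sketch: the K-harmonic means (KHM) objective -- the sum over points
# of the harmonic average of distances to all K centroids.
import numpy as np

def khm_objective(X, centroids, p=2.0, eps=1e-12):
    """KHM(X, C) = sum_i K / sum_k (1 / ||x_i - c_k||^p)."""
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)  # (n, K)
    return np.sum(centroids.shape[0] / np.sum(1.0 / (d**p + eps), axis=1))

rng = np.random.default_rng(3)
X = rng.normal(size=(100, 2))                 # illustrative data
C = rng.normal(size=(3, 2))                   # illustrative centroids
print("KHM objective:", khm_objective(X, C))
```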

  15. Computing in the presence of soft bit errors. [caused by single event upset on spacecraft

    Science.gov (United States)

    Rasmussen, R. D.

    1984-01-01

    It is shown that single-event upsets (SEUs) due to cosmic rays are a significant source of single-bit errors in spacecraft computers. The physical mechanism of SEU, electron-hole generation by means of linear energy transfer (LET), is discussed with reference to the results of a study of environmental effects on the computer systems of the Galileo spacecraft. Techniques for making software more tolerant of cosmic ray effects are considered, including: reducing the number of registers used by the software; continuity testing of variables; redundant execution of major procedures for error detection; and encoding state variables to detect single-bit changes. Attention is also given to design modifications which may reduce the cosmic ray exposure of on-board hardware. These modifications include: shielding components operating in LEO; removing low-power Schottky parts; and the use of CMOS diodes. The SEU parameters of different electronic components are listed in a table.
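
    One of the software-hardening idioms named above, encoding state variables to detect single-bit changes, can be illustrated in a few lines. The sketch below stores a value alongside its bitwise complement so that any single upset in either word breaks the invariant; this is a generic illustration, not the Galileo flight code.

```python
# Hedged sketch: an encoded state variable that detects single-bit changes.
# A value is stored with its bitwise complement; a flip in either word
# violates the XOR invariant.
class EncodedState:
    def __init__(self, value: int):
        self.value = value & 0xFFFFFFFF
        self.check = ~value & 0xFFFFFFFF      # redundant complement copy

    def read(self) -> int:
        if (self.value ^ self.check) != 0xFFFFFFFF:
            raise RuntimeError("single-event upset detected in state variable")
        return self.value

s = EncodedState(0xCAFE)
s.value ^= 1 << 7                             # simulate a bit flip from an SEU
try:
    s.read()
except RuntimeError as e:
    print(e)
```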

  16. A methodological approach to assessing alveolar ridge preservation procedures in humans: soft tissue profile.

    Science.gov (United States)

    Vanhoutte, Vanessa; Rompen, Eric; Lecloux, Geoffrey; Rues, Stefan; Schmitter, Marc; Lambert, France

    2014-03-01

    The aesthetic results of implant restoration in the anterior maxilla are particularly related to the soft tissue profile. Although socket preservation techniques appear to reduce bone remodelling after tooth extraction, there are still few investigations assessing the external soft tissue profile after such procedures. The goal of this study was to describe an accurate technique for evaluating soft tissue contour changes after socket preservation procedures. The secondary objective was to apply the newly developed measuring method to a specific socket preservation using a "saddled" connective tissue graft combined with the insertion of slowly resorbable biomaterials into the socket. A total of 14 patients needing tooth replacement in the aesthetic region were included to receive a socket preservation procedure using a connective tissue graft. Impressions were taken before tooth extraction (baseline) and at 2, 4, and 12 weeks after the procedure. The corresponding plaster casts were scanned, and the evolution of the soft tissue profile relative to the baseline situation was assessed using imaging software. The measuring technique allowed the soft tissue profiles to be assessed accurately at different levels of the alveolar process. The insertion of a saddled connective tissue graft appeared to compensate for the horizontal and vertical bone remodelling after a socket preservation procedure in most regions of the alveolar crest. After 12 weeks, the only significant change was located in the more cervical and central region of the alveolar process and reached a median drop of 0.62 mm from baseline. Within the limitations of this study, we found that a saddled connective tissue graft combined with a socket preservation procedure could almost completely counteract the bone remodelling in terms of the external soft tissue profile. The minor changes found in the cervical region might disappear with the emergence profile of the prosthodontic components. The described

  17. CT discography for cervical soft disc hernia

    Energy Technology Data Exchange (ETDEWEB)

    Iwasa, Kenichi; Mizutani, Shigeru; Morimoto, Hiroyuki; Yamada, Hidehito; Iwasa, Satoru

    1985-03-01

    In this study the effectiveness of computed tomographic discography (CTD) in diagnosing cervical soft disc hernia was evaluated. Twenty-five intervertebral discs in 15 cases of cervical soft disc hernia were examined with discography and then a CT scan. The results of the CT scan were as follows: three discs were protruded, 12 discs were prolapsed, six discs were extruded, and four discs were sequestrated. The findings were helpful in localizing soft disc hernias as median or posterolateral. They were also valuable in classifying the types of hernia and in planning surgical approaches.

  18. An Integrated Computer-Aided Approach for Environmental Studies

    DEFF Research Database (Denmark)

    Gani, Rafiqul; Chen, Fei; Jaksland, Cecilia

    1997-01-01

    A general framework for an integrated computer-aided approach to solve process design, control, and environmental problems simultaneously is presented. Physicochemical properties and their relationships to the molecular structure play an important role in the proposed integrated approach. The scope and applicability of the integrated approach are highlighted through examples involving estimation of properties and environmental pollution prevention. The importance of mixture effects on some environmentally important properties is also demonstrated.

  19. Soft-information flipping approach in multi-head multi-track BPMR systems

    Science.gov (United States)

    Warisarn, C.; Busyatras, W.; Myint, L. M. M.

    2018-05-01

    Inter-track interference is one of the most severe impairments in bit-patterned media recording (BPMR) systems. This impairment can be effectively handled by a modulation code and a multi-head array jointly processing multiple tracks; however, such a modulation constraint has never been utilized to improve the soft information. Therefore, this paper proposes the utilization of modulation codes whose encoded constraint defines the criteria for soft-information flipping during a three-track data detection process. Moreover, we also investigate the optimal offset position of the read heads to provide the greatest improvement in system performance. The simulation results indicate that the proposed systems, with and without position jitter, are significantly superior to uncoded systems.

  20. Subtidal soft-bottom macroinvertebrate communities of the Canary Islands. An ecological approach

    OpenAIRE

    Monterroso, Oscar; Riera, Rodrigo; Núñez, Jorge

    2012-01-01

    The Canarian archipelago is characterized by a mosaic of soft bottoms, such as Cymodocea nodosa meadows, Caulerpa spp. meadows, maërl bottoms, sabellid fields and bare sandy seabeds, hosting various macroinfaunal communities. Vegetated habitats (e.g. Cymodocea and Caulerpa) maintain more diverse communities than non-vegetated seabeds. The results indicated that Caulerpa meadows and, to a lesser extent, Cymodocea nodosa and sabellid fields are the richest and most diverse ecosystems in th...

  1. A Cryogenic 1 GSa/s, Soft-Core FPGA ADC for Quantum Computing Applications

    NARCIS (Netherlands)

    Homulle, H.A.R.; Charbon, E.E.E.

    2016-01-01

    We propose an analog-to-digital converter (ADC) architecture, implemented in an FPGA, that is fully reconfigurable and easy to calibrate. This approach allows the design to be altered, according to the system requirements, with simple modifications to the firmware. Therefore it can be used in a wide

  2. REAL TIME PULVERISED COAL FLOW SOFT SENSOR FOR THERMAL POWER PLANTS USING EVOLUTIONARY COMPUTATION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    B. Raja Singh

    2015-01-01

    The pulverised coal preparation system (coal mills) is the heart of coal-fired power plants. The complex nature of a milling process, together with the complex interactions between coal quality and mill conditions, leads to immense difficulties in obtaining an effective mathematical model of the milling process. In this paper, vertical spindle coal mills (bowl mills), which are widely used in coal-fired power plants, are considered for the model development, and the pulverised fuel flow rate is computed using the model. For the steady-state coal mill model development, plant measurements such as air flow rate, differential pressure across the mill, etc., are considered as inputs/outputs. The mathematical model is derived from analysis of energy, heat and mass balances. An evolutionary computation technique is adopted to identify the unknown model parameters using on-line plant data. Validation results indicate that this model is accurate enough to represent the whole process of steady-state coal mill dynamics. This coal mill model is being implemented on-line in a 210 MW thermal power plant and the results obtained are compared with plant data. The model is found to be accurate and robust and will work well in power plants for system monitoring. Therefore, the model can be used for online monitoring, fault detection, and control to improve the efficiency of combustion.
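
    The parameter-identification step described above, fitting unknown model parameters to plant data with an evolutionary algorithm, can be sketched generically. Below, a toy first-order step response and SciPy's differential evolution stand in for the mill model and the authors' specific technique; all names and values are illustrative assumptions.

```python
# Hedged sketch: evolutionary identification of unknown model parameters
# from measured data. The step-response model and noise level are toys.
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(4)
t = np.linspace(0, 10, 200)
true_k, true_tau = 2.0, 1.5
measured = true_k * (1 - np.exp(-t / true_tau)) + 0.05 * rng.standard_normal(t.size)

def sse(params):
    k, tau = params
    model = k * (1 - np.exp(-t / tau))        # assumed first-order response
    return np.sum((model - measured) ** 2)    # sum of squared errors

result = differential_evolution(sse, bounds=[(0.1, 10.0), (0.1, 10.0)], seed=0)
print("identified k, tau:", result.x)
```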

  3. Soft x-ray continuum radiation transmitted through metallic filters: An analytical approach to fast electron temperature measurements

    International Nuclear Information System (INIS)

    Delgado-Aparicio, L.; Hill, K.; Bitter, M.; Tritz, K.; Kramer, T.; Stutman, D.; Finkenthal, M.

    2010-01-01

    A new set of analytic formulas describes the transmission of soft x-ray continuum radiation through a metallic foil for application to fast electron temperature measurements in fusion plasmas. This novel approach shows good agreement with numerical calculations over a wide range of plasma temperatures, in contrast with the solutions obtained when the transmission is approximated by a single Heaviside function [S. von Goeler et al., Rev. Sci. Instrum. 70, 599 (1999)]. The new analytic formulas can improve the interpretation of experimental results and thus contribute to obtaining fast temperature measurements in between intermittent Thomson scattering data.
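
    The measurement principle such filter-transmission formulas support is the two-foil ratio technique: continuum emission ~exp(-E/Te) is viewed through foils of different thicknesses, and the signal ratio is a monotonic function of Te that can be inverted. The sketch below demonstrates that inversion numerically; the power-law absorption coefficient and foil thicknesses are illustrative assumptions, not the paper's analytic formulas.

```python
# Hedged sketch: two-foil ratio estimate of electron temperature from
# filtered continuum radiation. Attenuation model and foils are toys.
import numpy as np

E = np.linspace(0.5, 30.0, 3000)                 # photon energy grid (keV)
mu = 5.0 * (2.0 / E) ** 3                        # toy photoelectric attenuation

def signal(Te, thickness):
    """Detected continuum flux through a foil of the given thickness."""
    return np.trapz(np.exp(-E / Te) * np.exp(-mu * thickness), E)

def infer_Te(ratio_measured, thicknesses=(1.0, 3.0)):
    Te_grid = np.linspace(0.2, 10.0, 500)        # candidate temperatures (keV)
    ratios = np.array([signal(Te, thicknesses[1]) / signal(Te, thicknesses[0])
                       for Te in Te_grid])
    return Te_grid[np.argmin(np.abs(ratios - ratio_measured))]

true_Te = 2.5
r = signal(true_Te, 3.0) / signal(true_Te, 1.0)
print("inferred Te (keV):", infer_Te(r))
```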

  4. Digging deeper on "deep" learning: A computational ecology approach.

    Science.gov (United States)

    Buscema, Massimo; Sacco, Pier Luigi

    2017-01-01

    We propose an alternative approach to "deep" learning that is based on computational ecologies of structurally diverse artificial neural networks, and on dynamic associative memory responses to stimuli. Rather than focusing on massive computation of many different examples of a single situation, we opt for model-based learning and adaptive flexibility. Cross-fertilization of learning processes across multiple domains is the fundamental feature of human intelligence that must inform "new" artificial intelligence.

  5. Computational experiment approach to advanced secondary mathematics curriculum

    CERN Document Server

    Abramovich, Sergei

    2014-01-01

    This book promotes the experimental mathematics approach in the context of secondary mathematics curriculum by exploring mathematical models depending on parameters that were typically considered advanced in the pre-digital education era. This approach, by drawing on the power of computers to perform numerical computations and graphical constructions, stimulates formal learning of mathematics through making sense of a computational experiment. It allows one (in the spirit of Freudenthal) to bridge serious mathematical content and contemporary teaching practice. In other words, the notion of teaching experiment can be extended to include a true mathematical experiment. When used appropriately, the approach creates conditions for collateral learning (in the spirit of Dewey) to occur including the development of skills important for engineering applications of mathematics. In the context of a mathematics teacher education program, this book addresses a call for the preparation of teachers capable of utilizing mo...

  6. A Hybrid Soft-computing Method for Image Analysis of Digital Plantar Scanners

    Science.gov (United States)

    Razjouyan, Javad; Khayat, Omid; Siahi, Mehdi; Mansouri, Ali Alizadeh

    2013-01-01

    Digital foot scanners have been developed in recent years to provide anthropometrists with a digital image of the insole together with pressure distribution and anthropometric information. In this paper, a hybrid algorithm combining the gray-level spatial correlation (GLSC) histogram and Shanbag entropy is presented for the analysis of scanned foot images. An evolutionary algorithm is also employed to find the optimum parameters of GLSC and the transform function of the membership values. The resulting binary (thresholded) images then undergo anthropometric measurements, taking into account the scale factor from pixel size to metric scale. The proposed method is finally applied to plantar images obtained by scanning the feet of randomly selected subjects with a foot scanner system, the experimental setup described in the paper. The running computation time and the effects of the GLSC parameters are investigated in the simulation results. PMID:24083133

  8. Soft-tissue perineurioma of the retroperitoneum in a 63-year-old man, computed tomography and magnetic resonance imaging findings: a case report

    Directory of Open Access Journals (Sweden)

    Yasumoto Mayumi

    2010-08-01

    Introduction: Soft-tissue perineuriomas are rare benign peripheral nerve sheath tumors arising in the subcutis of the extremities and trunks of young patients. To our knowledge, this is the first presentation of the computed tomography and magnetic resonance imaging of a soft-tissue perineurioma in the retroperitoneum with pathologic correlation. Case presentation: A 63-year-old Japanese man was referred for assessment of high blood pressure. Abdominal computed tomography and magnetic resonance imaging showed a well-defined, gradually enhancing tumor without focal degeneration or hemorrhage adjacent to the pancreatic body. Tumor excision with distal pancreatectomy and splenectomy was performed, as a malignant tumor of pancreatic origin could not be ruled out. No recurrence has been noted in the 16 months since the operation. Pathologic examination of the tumor revealed a soft-tissue perineurioma of the retroperitoneum. Conclusion: Although the definitive diagnosis of soft-tissue perineurioma requires biopsy and evaluation of immunohistochemical reactivity, the computed tomography and magnetic resonance imaging findings described in this report suggest inclusion of this rare tumor in the differential diagnosis when such findings occur in the retroperitoneum.

  9. Application of soft computing based hybrid models in hydrological variables modeling: a comprehensive review

    Science.gov (United States)

    Fahimi, Farzad; Yaseen, Zaher Mundher; El-shafie, Ahmed

    2017-05-01

    Since the middle of the twentieth century, artificial intelligence (AI) models have been used widely for engineering and science problems. Water resource variable modeling and prediction are among the most challenging issues in water engineering. The artificial neural network (ANN) is a common approach used to tackle this problem with viable and efficient models. Numerous ANN models have been successfully developed to achieve more accurate results. In the current review, different ANN models in water resource applications and hydrological variable predictions are reviewed and outlined. In addition, recent hybrid models and their structures, input preprocessing, and optimization techniques are discussed, and the results are compared with similar previous studies. Moreover, to achieve a comprehensive view of the literature, many articles that applied ANN models together with other techniques are included. Consequently, the coupling procedure, model evaluation, and performance comparison of hybrid models with conventional ANN models are assessed, as are the taxonomy and structures of hybrid ANN models. Finally, current challenges and recommendations for future research are indicated, and new hybrid approaches are proposed.

  10. Fuzzy classification for strawberry diseases-infection using machine vision and soft-computing techniques

    Science.gov (United States)

    Altıparmak, Hamit; Al Shahadat, Mohamad; Kiani, Ehsan; Dimililer, Kamil

    2018-04-01

    Robotic agriculture requires smart and workable techniques to substitute machine intelligence for human intelligence. Strawberry is one of the important Mediterranean products, and enhancing its productivity requires modern, machine-based methods. Whereas a human identifies disease-infected leaves by eye, the machine should also be capable of vision-based disease identification. The objective of this paper is to verify in practice the applicability of a new computer-vision method for discriminating between healthy and disease-infected strawberry leaves that requires neither neural networks nor time-consuming training. The proposed method was tested under outdoor lighting conditions using a regular DSLR camera without any particular lens. Since a human brain only approximates the type and degree of infection, a fuzzy decision maker classifies the leaves over the images captured on-site, with the same properties as human vision. Optimizing the fuzzy parameters for a typical strawberry production area at a summer mid-day in Cyprus produced 96% accuracy for segmented iron deficiency and 93% accuracy for segmentation, using a typical human instant-classification approximation as the benchmark, holding higher accuracy than a human-eye identifier.
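
    A fuzzy decision maker of this kind can be sketched with triangular membership functions over a color feature. Below, hue-based memberships score how "healthy" versus "infected" a leaf region looks; the breakpoints are illustrative assumptions, not the optimized values from the Cyprus study.

```python
# Hedged sketch: a minimal fuzzy classifier over mean leaf hue. Membership
# breakpoints are illustrative, not the paper's optimized parameters.
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify_leaf(mean_hue_deg: float) -> str:
    healthy = tri(mean_hue_deg, 70, 110, 150)     # green hues
    infected = tri(mean_hue_deg, 20, 50, 80)      # yellow/brown hues
    return "healthy" if healthy >= infected else "infected"

for hue in (95.0, 45.0, 75.0):
    print(hue, "->", classify_leaf(hue))
```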

  11. Computer based approach to fatigue analysis and design

    International Nuclear Information System (INIS)

    Comstock, T.R.; Bernard, T.; Nieb, J.

    1979-01-01

    An approach is presented which uses a minicomputer-based system for data acquisition, analysis and graphic display in support of fatigue life estimation and design. Procedures are developed for identifying and eliminating damaging events due to the overall duty cycle, forced vibration and structural dynamic characteristics. Two case histories, weld failures in heavy vehicles and low-cycle fan blade failures, are discussed to illustrate the overall approach. (orig.)

  12. Automatic bone detection and soft tissue aware ultrasound-CT registration for computer-aided orthopedic surgery.

    Science.gov (United States)

    Wein, Wolfgang; Karamalis, Athanasios; Baumgartner, Adrian; Navab, Nassir

    2015-06-01

    The transfer of preoperative CT data into the tracking system coordinates within an operating room is of high interest for computer-aided orthopedic surgery. In this work, we introduce a solution for intra-operative ultrasound-CT registration of bones. We have developed methods for fully automatic real-time bone detection in ultrasound images and global automatic registration to CT. The bone detection algorithm uses a novel bone-specific feature descriptor and was thoroughly evaluated on both in-vivo and ex-vivo data. A global optimization strategy aligns the bone surface, followed by a soft tissue aware intensity-based registration to provide higher local registration accuracy. We evaluated the system on femur, tibia and fibula anatomy in a cadaver study with human legs, in which magnetically tracked bone markers were implanted to yield ground-truth information. An overall median system error of 3.7 mm was achieved on 11 datasets. Global and fully automatic registration of bones acquired with ultrasound to CT is feasible, with bone detection and tracking operating in real time for immediate feedback to the surgeon.
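
    The core geometric step in any such pipeline is rigid alignment of two point sets. The sketch below shows the classic Kabsch/SVD solution for points with known correspondences; it is a simplified stand-in for the paper's global search and soft-tissue-aware refinement, with synthetic points as assumptions.

```python
# Hedged sketch: rigid point-set alignment (Kabsch/SVD) mapping ultrasound
# surface points onto CT surface points with known correspondences.
import numpy as np

def rigid_align(P, Q):
    """Return R, t minimizing ||R @ P_i + t - Q_i|| over corresponding rows."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cQ - R @ cP

rng = np.random.default_rng(5)
P = rng.normal(size=(50, 3))                      # detected ultrasound points
R_true = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)
Q = P @ R_true.T + np.array([10.0, -2.0, 3.0])    # same points in the CT frame
R, t = rigid_align(P, Q)
print("max residual:", np.abs(P @ R.T + t - Q).max())
```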

  13. Strength development in concrete with wood ash blended cement and use of soft computing models to predict strength parameters.

    Science.gov (United States)

    Chowdhury, S; Maniar, A; Suganya, O M

    2015-11-01

    In this study, Wood Ash (WA) prepared from the uncontrolled burning of saw dust is evaluated for its suitability as a partial cement replacement in conventional concrete. The saw dust was acquired from a wood polishing unit. The physical, chemical and mineralogical characteristics of WA are presented and analyzed. The strength parameters (compressive strength, split tensile strength and flexural strength) of concrete with blended WA cement are evaluated and studied. Two different water-to-binder ratios (0.4 and 0.45) and five different replacement percentages of WA (5%, 10%, 15%, 18% and 20%), including control specimens for both water-to-cement ratios, are considered. Results of compressive strength, split tensile strength and flexural strength tests showed that the strength properties of the concrete mixtures decreased marginally with increasing wood ash content, but strength increased at later ages. The XRD test results and chemical analysis of WA showed that it contains amorphous silica and thus can be used as a cement-replacing material. Through the analysis of the results obtained in this study, it was concluded that WA can be blended with cement without adversely affecting the strength properties of concrete. Also, using the Support Vector Machine (SVM), a relatively new statistical learning method, strength parameters were predicted by developing a suitable model, and as a result the application of soft computing in structural engineering has been successfully presented in this research paper.
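
    The SVM prediction step can be sketched as a small regression problem over mix-design variables. The synthetic data below (water-binder ratio, wood-ash percentage, curing age) and model settings are illustrative assumptions, not the study's measurements or tuned model.

```python
# Hedged sketch: predicting concrete strength with support vector regression.
# The synthetic mix-design data stand in for the study's test results.
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(6)
n = 120
wb = rng.choice([0.40, 0.45], n)                  # water-to-binder ratio
ash = rng.choice([0.0, 5, 10, 15, 18, 20], n)     # wood ash replacement (%)
age = rng.choice([7, 28, 90], n)                  # curing age (days)
strength = 55 - 40 * (wb - 0.4) - 0.3 * ash + 6 * np.log(age) + rng.normal(0, 1.5, n)

X = np.column_stack([wb, ash, age])
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(X, strength)
print("28-day prediction, w/b=0.40, 10% ash:", model.predict([[0.40, 10, 28]])[0])
```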

  15. Estimation of Rivers Dissolved Solids (TDS) by Soft Computing (Case Study: Upstream of Boukan Dam)

    Directory of Open Access Journals (Sweden)

    S. Zaman Zad Ghavidel

    2017-01-01

    layer with five inputs, one hidden layer, and an output layer with three and two neurons for the Anyan and Safakhaneh hydrometric stations, respectively. As with the ANN, the ANFIS-SC5 model had the best performance; the ANFIS with radii values of 0.4 and 0.7 has the highest R and the lowest RMSE for the Anyan and Safakhaneh hydrometric stations, respectively. Various GEP models were developed using input combinations similar to those of the ANN and ANFIS models. Comparing the GEP5 estimates with the measured data for the test stage demonstrates a high generalization capacity of the model, with relatively low error and high correlation. From the scatter plots it is clearly seen that the GEP5 predictions are closer to the corresponding measured TDS than those of the other models, and the coefficient a of the best-fit straight line (assuming the equation y = ax) in the scatter plots is closer to 1 for GEP5 than for the other models. In addition, gene expression programming yielded mathematical relationships for the Anyan and Safakhaneh stations with correlation coefficients of 0.962 and 0.971 and root-mean-square errors of 12.82 and 29.08, respectively, for predicting total dissolved solids (TDS) in the rivers located upstream of the dam. The obtained results showed the efficiency of the applied models in simulating the nonlinear behavior of TDS variations in terms of performance indices. Overall, the GEP model outperformed the other models. For all applied models, the best result was obtained with input combination (5), including HCO3, Ca, Na, Q and Mg. The results were also tested using a t-test at the 95% significance level to verify the robustness of the models. Comparison results indicated that the poorest model in TDS simulation was the ANN, especially in the test period. The observed relationship between residuals and model-computed TDS values shows complete independence and random distribution. It is further supported by the respective

  16. A Novel Approach for Prediction of Industrial Catalyst Deactivation Using Soft Sensor Modeling

    Directory of Open Access Journals (Sweden)

    Hamed Gharehbaghi

    2016-06-01

    Soft sensors are used for fault detection and for prediction of process variables in chemical processing units for which online measurement is difficult. The present study addresses soft sensor design and identification for the deactivation of a zeolite catalyst in an industrial-scale fixed bed reactor based on process data. The two main reactions are disproportionation (DP) and transalkylation (TA), which convert toluene and C9 aromatics into xylenes and benzene. Two models are considered based on the mass conservation around the reactor. The model parameters are estimated following the data-based modeling (DBM) philosophy and the state-dependent parameter (SDP) method. In the SDP method, the parameters are assumed to be a function of the system states. The results show that the catalyst activity during the period under study has an approximately monotonic trend. Identification of the system clearly shows that the xylene concentration has a determining role in the conversion of the reactions. The activation energies for the DP and TA reactions are found to be 43.8 and 18 kJ/mol, respectively. The model prediction is in good agreement with the observed industrial data.

  17. Developing students' worksheets applying soft skill-based scientific approach for improving building engineering students' competencies in vocational high schools

    Science.gov (United States)

    Suparno, Sudomo, Rahardjo, Boedi

    2017-09-01

    Experts and practitioners agree that the quality of vocational high schools needs to be greatly improved. Many construction services have voiced their dissatisfaction with today's low-quality vocational high school graduates. The low quality of graduates is closely related to the quality of the teaching and learning process, particularly the teaching materials. In their efforts to improve the quality of vocational high school education, the government has implemented Curriculum 2013 (K13) and supplied teaching materials. However, according to the monitoring and evaluation done by the Directorate of Vocational High School, Directorate General of Secondary Education (2014), the provision of tasks for students in the teaching materials was totally inadequate. Therefore, to enhance the quality and the results of the instructional process, students' worksheets should be provided that can stimulate and improve students' problem-solving skills and soft skills. For such worksheets to meet academic requirements, their development needs to be in accordance with an innovative learning approach, namely the soft skill-based scientific approach.

  18. Computer and Internet Addiction: Analysis and Classification of Approaches

    Directory of Open Access Journals (Sweden)

    Zaretskaya O.V.

    2017-08-01

    A theoretical analysis of modern research on the problem of computer and Internet addiction is carried out. The main features of the different approaches are outlined. An attempt is made to systematize the research conducted and to classify the scientific approaches to the problem of Internet addiction. The author distinguishes nosological, cognitive-behavioral, socio-psychological and dialectical approaches. She justifies the need to use an approach that corresponds to the essence, goals and tasks of social psychology when researching the problem of Internet addiction and dependent behavior in general. In the opinion of the author, the dialectical approach integrates the experience of research within the framework of the socio-psychological approach and focuses on the observed inconsistencies in the phenomenon of Internet addiction, namely the compensatory nature of Internet activity, whereby people who immerse themselves in the Internet are in a dysfunctional life situation.

  19. Time series analysis of reference crop evapotranspiration using soft computing techniques for Ganjam District, Odisha, India

    Science.gov (United States)

    Patra, S. R.

    2017-12-01

    minimization principle. The reliability of these computational models was analysed in light of the simulation results, and it was found that the SVM model produced the best results among the three. Future research should extend the validation data set and check the validity of these results in different areas using hybrid intelligence techniques.

  20. Comparative Analysis of Soft Computing Models in Prediction of Bending Rigidity of Cotton Woven Fabrics

    Science.gov (United States)

    Guruprasad, R.; Behera, B. K.

    2015-10-01

    Quantitative prediction of fabric mechanical properties is an essential requirement for the design engineering of textile and apparel products. In this work, the possibility of predicting the bending rigidity of cotton woven fabrics has been explored with the application of an Artificial Neural Network (ANN) and two hybrid methodologies, namely neuro-genetic modeling and Adaptive Neuro-Fuzzy Inference System (ANFIS) modeling. For this purpose, a set of cotton woven grey fabrics was desized, scoured and relaxed. The fabrics were then conditioned and tested for bending properties. With the database thus created, a neural network model was first developed using back propagation as the learning algorithm. The second model was developed by applying a hybrid learning strategy, in which a genetic algorithm was first used to optimize the number of neurons and the connection weights of the neural network. The genetic-algorithm-optimized network structure was then further trained using the back propagation algorithm. In the third model, an ANFIS modeling approach was attempted to map the input-output data. The prediction performances of the models were compared and a sensitivity analysis is reported. The results show that the predictions of the neuro-genetic and ANFIS models were better than those of the back propagation neural network model.

  1. Pedagogical Approaches to Teaching with Computer Simulations in Science Education

    NARCIS (Netherlands)

    Rutten, N.P.G.; van der Veen, Johan (CTIT); van Joolingen, Wouter; McBride, Ron; Searson, Michael

    2013-01-01

    For this study we interviewed 24 physics teachers about their opinions on teaching with computer simulations. The purpose of this study is to investigate whether it is possible to distinguish different types of teaching approaches. Our results indicate the existence of two types. The first type is

  2. Protocol Processing for 100 Gbit/s and Beyond - A Soft Real-Time Approach in Hardware and Software

    Science.gov (United States)

    Büchner, Steffen; Lopacinski, Lukasz; Kraemer, Rolf; Nolte, Jörg

    2017-09-01

    Protocol processing for 100 Gbit/s wireless communication stresses all parts of a communication system to the utmost. The efficient use of upcoming transmission technology at 100 Gbit/s and beyond requires rethinking the way protocols are processed by the communication endpoints. This paper summarizes the achievements of the project End2End100. We present a comprehensive soft real-time stream processing approach that allows the protocol designer to develop, analyze, and plan scalable protocols for ultra-high data rates of 100 Gbit/s and beyond. Furthermore, we present an ultra-low-power, adaptable, and massively parallelized FEC (Forward Error Correction) scheme that detects and corrects bit errors at line rate with an energy consumption between 1 pJ/bit and 13 pJ/bit. The evaluation results discussed in this publication show that our comprehensive approach allows end-to-end communication with very low protocol processing overhead.
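
    To put the quoted energy figures in perspective, energy per bit converts directly to power at line rate (P = E_bit × rate). The short calculation below simply restates the record's 1-13 pJ/bit range at 100 Gbit/s; no other figures are taken from the paper.

```python
# Hedged sketch: converting the record's FEC energy-per-bit range to power
# dissipation at line rate.
LINE_RATE = 100e9                                 # bits per second

for e_bit in (1e-12, 13e-12):                     # joules per bit
    print(f"{e_bit * 1e12:>4.0f} pJ/bit -> {e_bit * LINE_RATE:.1f} W at 100 Gbit/s")
# 1 pJ/bit -> 0.1 W; 13 pJ/bit -> 1.3 W
```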

  3. Identifying Opportunities for Decision Support Systems in Support of Regional Resource Use Planning: An Approach Through Soft Systems Methodology.

    Science.gov (United States)

    Zhu; Dale

    2000-10-01

    Regional resource use planning relies on key regional stakeholder groups using, and having equitable access to, appropriate social, economic, and environmental information and assessment tools. Decision support systems (DSS) can improve stakeholder access to such information and analysis tools. Regional resource use planning, however, is a complex process involving multiple issues, multiple assessment criteria, multiple stakeholders, and multiple values. There is a need for an approach to DSS development that can assist in understanding and modeling complex problem situations in regional resource use, so that areas where DSSs could provide effective support can be identified and the user requirements can be well established. This paper presents an approach based on the soft systems methodology for identifying DSS opportunities for regional resource use planning, taking the Central Highlands Region of Queensland, Australia, as a case study.

  4. Cloud Computing - A Unified Approach for Surveillance Issues

    Science.gov (United States)

    Rachana, C. R.; Banu, Reshma, Dr.; Ahammed, G. F. Ali, Dr.; Parameshachari, B. D., Dr.

    2017-08-01

    Cloud computing describes highly scalable resources provided as an external service via the Internet on a pay-per-use basis. From the economic point of view, the main attractiveness of cloud computing is that users only use what they need, and only pay for what they actually use. Resources are available for access from the cloud at any time, and from any location, through networks. Cloud computing is gradually replacing the traditional Information Technology infrastructure. Securing data is one of the leading concerns and biggest issues for cloud computing. Privacy of information is always a crucial point, especially when an individual's personal or sensitive information is being stored by the organization. It is indeed true that today, cloud authorization systems are not robust enough. This paper presents a unified approach for analyzing the various security issues and techniques to overcome the challenges in the cloud environment.

  5. Computer Forensics for Graduate Accountants: A Motivational Curriculum Design Approach

    Directory of Open Access Journals (Sweden)

    Grover Kearns

    2010-06-01

    Computer forensics involves the investigation of digital sources to acquire evidence that can be used in a court of law. It can also be used to identify and respond to threats to hosts and systems. Accountants use computer forensics to investigate computer crime or misuse, theft of trade secrets, theft or destruction of intellectual property, and fraud. Educating accountants in the use of forensic tools is a goal of the AICPA (American Institute of Certified Public Accountants). Accounting students, however, may not view information technology as vital to their career paths and need motivation to acquire forensic knowledge and skills. This paper presents a curriculum design methodology for teaching graduate accounting students computer forensics. The methodology is tested using the students' perceptions of the success of the methodology and of their acquisition of forensic knowledge and skills. An important component of the pedagogical approach is the use of an annotated list of over 50 forensic web-based tools.

  6. Cloud computing approaches to accelerate drug discovery value chain.

    Science.gov (United States)

    Garg, Vibhav; Arora, Suchir; Gupta, Chitra

    2011-12-01

    Continued advancements in technology have helped high-throughput screening (HTS) evolve from a linear to a parallel approach by performing system-level screening. Advanced experimental methods used for HTS at various steps of drug discovery (i.e., target identification, target validation, lead identification and lead validation) can generate data on the order of terabytes. As a consequence, there is a pressing need to store, manage, mine and analyze this data to identify informational tags. This need in turn challenges computer scientists to offer matching hardware and software infrastructure while managing the varying degrees of computational power required. Therefore, the potential of "on-demand hardware" and "Software as a Service (SaaS)" delivery mechanisms cannot be denied. This on-demand computing, largely referred to as cloud computing, is now transforming drug discovery research. The integration of cloud computing with parallel computing is also expanding its footprint in the life sciences community. Speed, efficiency and cost effectiveness have made cloud computing a 'good to have' tool for researchers, providing them significant flexibility and allowing them to focus on the 'what' of science and not the 'how'. Once it reaches maturity, a Discovery Cloud would fit best to manage drug discovery and clinical development data generated using advanced HTS techniques, hence supporting the vision of personalized medicine.

  7. Soft-tissue injuries of the fingertip: methods of evaluation and treatment. An algorithmic approach.

    Science.gov (United States)

    Lemmon, Joshua A; Janis, Jeffrey E; Rohrich, Rod J

    2008-09-01

    After studying this article, the participant should be able to: 1. Understand the anatomy of the fingertip. 2. Describe the methods of evaluating fingertip injuries. 3. Discuss reconstructive options for various tip injuries. The fingertip is the most commonly injured part of the hand, and therefore fingertip injuries are among the most frequent injuries that plastic surgeons are asked to treat. Although microsurgical techniques have enabled replantation of even very distal tip amputations, it is relatively uncommon that a distal tip injury will be appropriate for replantation. In the event that replantation is not pursued, options for distal tip soft-tissue reconstruction must be considered. This review presents a straightforward method for evaluating fingertip injuries and provides an algorithm for fingertip reconstruction.

  8. Soft systems methodology and the ecosystem approach: a system study of the Cooum River and environs in Chennai, India.

    Science.gov (United States)

    Bunch, Martin J

    2003-02-01

    This paper discusses the integration of soft systems methodology (SSM) within an ecosystem approach in research to support rehabilitation and management of the Cooum River and environs in Chennai, India. The Cooum is an extremely polluted urban stream. Its management is complicated by high rates of population growth, poverty, uncontrolled urban development, jurisdictional conflicts, institutional culture, flat topography, tidal action, blockage of the river mouth, and monsoon flooding. The situation is characterized by basic uncertainty about the main processes and activities, and about the nature of relationships among actors and elements in the system. SSM is an approach for dealing with messy or ill-structured problematic situations involving human activity. In this work, SSM contributed techniques (such as the "rich picture" and "CATWOE" tools) to the description of the Cooum situation as a socioecological system and informed the approach itself at a theoretical level. The application of three general phases of SSM is discussed in the context of the Cooum River research: (1) problem definition and exploration of the problem situation, (2) development of conceptual models of relevant systems, and (3) the use of these to generate insight and stimulate debate about desirable and feasible change. Its use here gives weight to the statement by others that SSM would be a particularly appropriate methodology with which to operationalize the ecosystem approach. As well as informing efforts at management of the Cooum system, this work led the way to exploring an adaptive ecosystem approach more broadly for management of the urban environment for human health in Chennai.

  9. A computational approach to chemical etiologies of diabetes

    DEFF Research Database (Denmark)

    Audouze, Karine Marie Laure; Brunak, Søren; Grandjean, Philippe

    2013-01-01

    Computational meta-analysis can link environmental chemicals to genes and proteins involved in human diseases, thereby elucidating possible etiologies and pathogeneses of non-communicable diseases. We used an integrated computational systems biology approach to examine possible pathogenetic linkages in type 2 diabetes (T2D) through genome-wide associations, disease similarities, and published empirical evidence. Ten environmental chemicals were found to be potentially linked to T2D; the highest scores were observed for arsenic, 2,3,7,8-tetrachlorodibenzo-p-dioxin, and hexachlorobenzene…

  10. The value of computed tomography in the diagnosis of the rotator cuff tears, and bone and soft tissue tumors

    International Nuclear Information System (INIS)

    Yoh, Sansen

    1984-01-01

    The usefulness of computed tomography (CT) in the diagnosis of rotator cuff tear was assessed. The rotator cuff could not be visualized in detail by CT unless contrast material was introduced into the joint cavity. CT arthrography was performed on 21 cases of rotator cuff tears. The most detailed information was obtained when a relatively low concentration of contrast material (3.25% Angiografin) filled the joint cavity and the shoulder joint was maximally rotated outward at the side. CT arthrography proved to be the most reliable method for assessing the extent and location of rotator cuff tears, providing conclusive evidence for diagnosis and management in 89% of the patients studied. The usefulness of CT in the diagnosis of bone and soft tissue tumors was also assessed. CT examination provided unique preoperative information suggesting more precise histological characteristics and anatomical localization of the lesion. Contrast enhancement (CE), when used, proved helpful in predicting the nature of tumors. CE by intra-arterial infusion or intravenous bolus injection of contrast material during the scan was more useful than CE by intravenous drip infusion. Information regarding change of tumor size, CT number and CE provided appropriate indicators that directly corresponded to the responsiveness of the tumor to the chemotherapy and radiotherapy performed. Preoperative ABC classification of the tumor, based on information about its size, location, definition and anatomical relation to vital structures (neural, vascular, and visceral), was done using CT. The classification corresponded clearly to the treatment required for the patients. (author)

  11. A comparison between ten advanced and soft computing models for groundwater qanat potential assessment in Iran using R and GIS

    Science.gov (United States)

    Naghibi, Seyed Amir; Pourghasemi, Hamid Reza; Abbaspour, Karim

    2018-02-01

    Considering the unstable condition of water resources in Iran and many other countries in arid and semi-arid regions, groundwater studies are very important. The aim of this study is therefore to model groundwater potential, using qanat locations as indicators and ten advanced and soft computing models, in the Beheshtabad Watershed, Iran. A qanat is a man-made underground construction which gathers groundwater at higher altitudes and transmits it to lowland areas where it can be used for different purposes. First, the locations of the qanats were detected using extensive field surveys. These qanats were divided into two datasets: training (70%) and validation (30%). Then, 14 influence factors depicting the region's physical, morphological, lithological, and hydrological features were identified to model groundwater potential. Linear discriminant analysis (LDA), quadratic discriminant analysis (QDA), flexible discriminant analysis (FDA), penalized discriminant analysis (PDA), boosted regression tree (BRT), random forest (RF), artificial neural network (ANN), K-nearest neighbor (KNN), multivariate adaptive regression splines (MARS), and support vector machine (SVM) models were applied in R scripts to produce groundwater potential maps. To evaluate the performance of the developed models, the ROC curve and the kappa index were used. According to the results, RF had the best performance, followed by the SVM and BRT models. Our results showed that qanat locations can serve as a good indicator of groundwater potential. Furthermore, altitude, slope, plan curvature, and profile curvature were found to be the most important influence factors, while lithology, land use, and slope aspect were the least significant. The methodology in the current study could be used by land use and terrestrial planners and water resource managers to reduce the costs of groundwater resource discovery.
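
    As a rough illustration of the evaluation loop described above, the sketch below trains one of the ten models (a random forest) and scores it with the ROC AUC and kappa index; the data, feature count, and split are synthetic placeholders, not the study's dataset.

        # Sketch: train/validate one of the ten models (random forest) on a
        # synthetic qanat presence/absence dataset with 14 influence factors.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.metrics import roc_auc_score, cohen_kappa_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        X = rng.random((500, 14))          # 14 influence factors per location
        y = rng.integers(0, 2, 500)        # 1 = qanat present, 0 = absent

        # 70/30 split mirrors the paper's training/validation partition
        X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3, random_state=0)

        rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)
        prob = rf.predict_proba(X_va)[:, 1]           # groundwater potential values
        print("AUC  :", roc_auc_score(y_va, prob))    # ROC-based evaluation
        print("kappa:", cohen_kappa_score(y_va, rf.predict(X_va)))
        print("most important factors:", np.argsort(rf.feature_importances_)[::-1][:4])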

  12. Approach to skin and soft tissue infections in non-HIV immunocompromised hosts.

    Science.gov (United States)

    Burke, Victoria E; Lopez, Fred A

    2017-08-01

    Skin and soft tissue infections are frequent contributors to morbidity and mortality in the immunocompromised host. This article reviews the changing epidemiology and clinical manifestations of the most common cutaneous pathogens in non-HIV immunocompromised hosts, including patients with solid organ transplants, stem cell transplants, solid tumors, hematologic malignancies, and receiving chronic immunosuppressive therapy for inflammatory disorders. Defects in the innate or adaptive immune response can predispose the immunocompromised host to certain cutaneous infections in a predictive fashion. Cutaneous lesions in patients with neutrophil defects are commonly due to bacteria, Candida, or invasive molds. Skin lesions in patients with cellular or humoral immunodeficiencies can be due to encapsulated bacteria, Nocardia, mycobacteria, endemic fungal infections, herpesviruses, or parasites. Skin lesions may reflect primary inoculation or, more commonly, disseminated infection. Tissue samples for microscopy, culture, and histopathology are critical to making an accurate diagnosis given the nonspecific and heterogeneous appearance of these skin lesions due to a blunted immune response. As the population of non-HIV immunosuppressed hosts expands with advances in medical therapies, the frequency and variety of cutaneous diseases in these hosts will increase.

  13. CGC/saturation approach for soft interactions at high energy: Inclusive production

    International Nuclear Information System (INIS)

    Gotsman, E.; Levin, E.; Maor, U.

    2015-01-01

    In this letter we demonstrate that our dipole model is successful in describing inclusive production within the same framework as diffractive physics. We believe that this achievement stems from the fact that our approach incorporates the positive features of the Reggeon approach and CGC/saturation effective theory, for high energy QCD.

  14. CGC/saturation approach for soft interactions at high energy: Inclusive production

    Energy Technology Data Exchange (ETDEWEB)

    Gotsman, E., E-mail: gotsman@post.tau.ac.il [Department of Particle Physics, School of Physics and Astronomy, Raymond and Beverly Sackler Faculty of Exact Science, Tel Aviv University, Tel Aviv, 69978 (Israel); Levin, E., E-mail: leving@post.tau.ac.il [Department of Particle Physics, School of Physics and Astronomy, Raymond and Beverly Sackler Faculty of Exact Science, Tel Aviv University, Tel Aviv, 69978 (Israel); Departemento de Física, Universidad Técnica Federico Santa María, and Centro Científico-Tecnológico de Valparaíso, Avda. Espana 1680, Casilla 110-V, Valparaíso (Chile); Maor, U., E-mail: maor@post.tau.ac.il [Department of Particle Physics, School of Physics and Astronomy, Raymond and Beverly Sackler Faculty of Exact Science, Tel Aviv University, Tel Aviv, 69978 (Israel)

    2015-06-30

    In this letter we demonstrate that our dipole model is successful in describing inclusive production within the same framework as diffractive physics. We believe that this achievement stems from the fact that our approach incorporates the positive features of the Reggeon approach and CGC/saturation effective theory, for high energy QCD.

  15. Quantitative determination of additive Chlorantraniliprole in Abamectin preparation: Investigation of bootstrapping soft shrinkage approach by mid-infrared spectroscopy

    Science.gov (United States)

    Yan, Hong; Song, Xiangzhong; Tian, Kuangda; Chen, Yilin; Xiong, Yanmei; Min, Shungeng

    2018-02-01

    A novel method based on mid-infrared (MIR) spectroscopy, which enables the determination of Chlorantraniliprole in Abamectin within minutes, is proposed. We further evaluate the prediction ability of four wavelength selection methods: the bootstrapping soft shrinkage approach (BOSS), Monte Carlo uninformative variable elimination (MCUVE), genetic algorithm partial least squares (GA-PLS) and competitive adaptive reweighted sampling (CARS). The results showed that the BOSS method obtained the lowest root mean squared error of cross validation (RMSECV) (0.0245) and root mean squared error of prediction (RMSEP) (0.0271), as well as the highest coefficient of determination of cross-validation (Q²cv) (0.9998) and coefficient of determination of the test set (Q²test) (0.9989). This demonstrates that mid-infrared spectroscopy can be used to detect Chlorantraniliprole in Abamectin conveniently, and that a suitable wavelength selection method (here BOSS) is essential for component spectral analysis.
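
    For reference, the figures of merit quoted above follow the standard chemometric definitions (notation generic, with ŷ the predicted and y the reference value over n test samples):

        \mathrm{RMSEP} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(\hat{y}_i - y_i\right)^2},
        \qquad
        Q^2 = 1 - \frac{\sum_{i}\left(\hat{y}_i - y_i\right)^2}{\sum_{i}\left(y_i - \bar{y}\right)^2}

    RMSECV is the same root-mean-square error computed over cross-validation predictions rather than an independent test set.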

  16. Soft System Methodology as a Tool to Understand Issues of Governmental Affordable Housing Programme of India: A Case Study Approach

    Science.gov (United States)

    Ghosh, Sukanya; Roy, Souvanic; Sanyal, Manas Kumar

    2016-09-01

    With the help of a case study, the article has explored current practices of implementation of governmental affordable housing programme for urban poor in a slum of India. This work shows that the issues associated with the problems of governmental affordable housing programme has to be addressed to with a suitable methodology as complexities are not only dealing with quantitative data but qualitative data also. The Hard System Methodologies (HSM), which is conventionally applied to address the issues, deals with real and known problems which can be directly solved. Since most of the issues of affordable housing programme as found in the case study are subjective and complex in nature, Soft System Methodology (SSM) has been tried for better representation from subjective points of views. The article explored drawing of Rich Picture as an SSM approach for better understanding and analysing complex issues and constraints of affordable housing programme so that further exploration of the issues is possible.

  17. WSRC approach to validation of criticality safety computer codes

    International Nuclear Information System (INIS)

    Finch, D.R.; Mincey, J.F.

    1991-01-01

    Recent hardware and operating system changes at the Westinghouse Savannah River Site (WSRC) have necessitated review of the validation of the JOSHUA criticality safety computer codes. As part of the planning for this effort, a policy for validation of JOSHUA and other criticality safety codes has been developed, and it is illustrated here by the steps being taken at WSRC. The objective in validating a specific computational method is to reliably correlate its calculated neutron multiplication factor (k-eff) with known values over a well-defined set of neutronic conditions. Said another way, such correlations should be (1) repeatable, (2) demonstrated with defined confidence, and (3) valid over an identified range of neutronic conditions (area of applicability). The general approach to validation of computational methods at WSRC must encompass a large number of diverse types of fissile material processes in different operations. Special problems arise in validating computational methods when very few experiments are available (such as for enriched uranium systems whose principal second isotope is ²³⁶U). To cover all process conditions at WSRC, a broad validation approach has been used. Broad validation is based upon calculation of many experiments to span all possible ranges of reflection, nuclide concentrations, moderation ratios, etc. Narrow validation, in comparison, relies on calculations of a few experiments very near anticipated worst-case process conditions. The methods and problems of broad validation are discussed.

  18. Archiving Software Systems: Approaches to Preserve Computational Capabilities

    Science.gov (United States)

    King, T. A.

    2014-12-01

    A great deal of effort is made to preserve scientific data, not only because data is knowledge but because it is often costly to acquire and is sometimes collected under unique circumstances. Another part of the science enterprise is the development of software to process and analyze the data. Developed software is also a large investment and worthy of preservation. However, the long-term preservation of software presents some challenges. Software often requires a specific technology stack to operate, which can include software, operating system and hardware dependencies. One past approach to preserving computational capabilities is to maintain ancient hardware long past its typical viability. On an archive horizon of 100 years, this is not feasible. Another approach is to archive source code. While this can preserve details of the implementation and algorithms, it may not be possible to reproduce the technology stack needed to compile and run the resulting applications. This forward-looking dilemma has a solution. Technology used to create clouds and process big data can also be used to archive and preserve computational capabilities. We explore how basic hardware, virtual machines, containers and appropriate metadata can be used to preserve computational capabilities and to archive functional software systems. In conjunction with data archives, this provides scientists with both the data and the capability to reproduce the processing and analysis used to generate past scientific results.

  19. A SURVEY ON DOCUMENT CLUSTERING APPROACH FOR COMPUTER FORENSIC ANALYSIS

    OpenAIRE

    Monika Raghuvanshi*, Rahul Patel

    2016-01-01

    In a forensic analysis, large numbers of files are examined. Much of the information is in unstructured format, so the analysis is quite a difficult task for the computer forensic examiner. Performing forensic analysis of documents within a limited period of time therefore requires a special approach such as document clustering. This paper reviews different document clustering algorithms and methodologies, for example K-means, K-medoid, single link, complete link and average link, in accordance…
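
    As a small illustration of the surveyed family of methods, the sketch below clusters a toy corpus with TF-IDF features and K-means; the corpus and cluster count are placeholders.

        # Sketch: K-means document clustering over TF-IDF vectors, one of the
        # algorithm families surveyed above.
        from sklearn.cluster import KMeans
        from sklearn.feature_extraction.text import TfidfVectorizer

        docs = ["invoice for wire transfer", "meeting notes project alpha",
                "wire transfer confirmation", "alpha project schedule"]

        X = TfidfVectorizer(stop_words="english").fit_transform(docs)
        km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
        for label, doc in zip(km.labels_, docs):
            print(label, doc)   # documents grouped for triage by the examiner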

  20. Soft tissue deformation for surgical simulation: a position-based dynamics approach.

    Science.gov (United States)

    Camara, Mafalda; Mayer, Erik; Darzi, Ara; Pratt, Philip

    2016-06-01

    To assist the rehearsal and planning of robot-assisted partial nephrectomy, a real-time simulation platform is presented that allows surgeons to visualise and interact with rapidly constructed patient-specific biomechanical models of the anatomical regions of interest. Coupled to a framework for volumetric deformation, the platform furthermore simulates intracorporeal 2D ultrasound image acquisition, using preoperative imaging as the data source. This not only facilitates the planning of optimal transducer trajectories and viewpoints, but can also act as a validation context for manually operated freehand 3D acquisitions and reconstructions. The simulation platform was implemented within the GPU-accelerated NVIDIA FleX position-based dynamics framework. In order to validate the model and determine material properties and other simulation parameter values, a porcine kidney with embedded fiducial beads was CT-scanned and segmented. Acquisitions for the rest position and three different levels of probe-induced deformation were collected. Optimal values of the cluster stiffness coefficients were determined for a range of different particle radii, where the objective function comprised the mean distance error between real and simulated fiducial positions over the sequence of deformations. The mean fiducial error at each deformation stage was found to be compatible with the level of ultrasound probe calibration error typically observed in clinical practice. Furthermore, the simulation exhibited unconditional stability on account of its use of clustered shape-matching constraints. A novel position-based dynamics implementation of soft tissue deformation has been shown to facilitate several desirable simulation characteristics: real-time performance, unconditional stability, rapid model construction enabling patient-specific behaviour and accuracy with respect to reference CT images.
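
    A minimal sketch of the parameter-fitting step described above, with a hypothetical simulate_fiducials() standing in for a FleX position-based-dynamics run: the objective is the mean distance between real and simulated fiducial positions, minimized over the cluster stiffness coefficient.

        import numpy as np
        from scipy.optimize import minimize_scalar

        # 4 deformation stages, 10 fiducial beads, 3 coordinates (placeholder data)
        real = np.random.default_rng(1).random((4, 10, 3))

        def simulate_fiducials(stiffness):
            # hypothetical stand-in for the position-based-dynamics simulation
            return real + (1.0 - stiffness) * 0.01

        def mean_fiducial_error(stiffness):
            sim = simulate_fiducials(stiffness)
            return np.linalg.norm(sim - real, axis=-1).mean()  # mean bead distance

        res = minimize_scalar(mean_fiducial_error, bounds=(0.0, 1.0), method="bounded")
        print("optimal cluster stiffness:", res.x)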

  1. Oral soft tissue infections: causes, therapeutic approaches and microbiological spectrum with focus on antibiotic treatment.

    Science.gov (United States)

    Götz, Carolin; Reinhart, Edeltraud; Wolff, Klaus-Dietrich; Kolk, Andreas

    2015-11-01

    Intraoral soft tissue infections (OSTI) are a common problem in dentistry and oral surgery. These abscesses are mostly exacerbated dental infections (OIDC), and some emerge as postoperative infections (POI) after tooth extraction (OITR) or apicoectomy (OIRR). The main aim of this study was to compare OIDC with POI, especially looking at the bacteria involved. An additional question was, therefore, whether different antibiotic treatments should be used for OSTI of differing aetiologies. The impact of third molars on OSTI was evaluated, and the rates of POI after removal of third molars were specified. Patient data were collected from the patients' medical records and the results were statistically evaluated with SPSS (SPSS version 21.0; SPSS, IBM; Chicago, IL, USA). The inclusion criterion was the outpatient treatment of a patient with an exacerbated oral infection; the exclusion criteria were an early stage of infiltration without abscess formation and a need for inpatient treatment. Exacerbated periapical infections, especially in the molar region, were the commonest cause of OIDC. In the OITR group, mandibular tooth removal was the commonest factor (p=0.016). Remarkably, retained lower wisdom teeth accounted for a significant number of cases in the OITR group (p=0.022). In our study we could not find differences between the causal bacteria in patients with OIDC and POI. Given the resistance rates, we conclude that amoxicillin combined with clavulanic acid seems to be the antibiotic standard for exacerbated intraoral infections, independent of their aetiology. Copyright © 2015 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  2. A new approach for assessing the wear resistance of soft ductile materials

    International Nuclear Information System (INIS)

    Zaid, A.I.O.; Banna, M.A.E.

    2007-01-01

    Aluminum and its alloys are among the most versatile and attractive metallic materials and have been used for many decades in many engineering applications, especially in the automobile and aerospace industries, due to their high strength-to-weight ratio, thermal conductivity, electrical conductivity, and corrosion and wear resistance. Wear is the loss of material from a surface caused by interaction with another material. The main mechanisms of interaction are applied loads and relative motion, which can cause adhesion and/or abrasion, all of which lead to material loss. Therefore, most of the suggested methods, theoretical and empirical, for estimating the wear resistance of a material are based on mass loss, irrespective of the material or the type of wear present. Experimental observations reveal that in some situations, especially for soft and ductile materials, the tested specimen shows little or no mass loss while its dimensions and shape suffer plastic deformation, which causes more damage than mass loss. A similar phenomenon, observed during electric spot welding of aluminum and zinc-coated steels at the area beneath the electrode, where plastic deformation causes an increase in area that reduces the current density, is also discussed in the paper. The amount of plastic deformation, even when mentioned in some publications, has been neglected in assessing wear resistance. In this paper, a model based on the plastic deformation at the worn end together with the mass loss is put forward and discussed. The model was tested qualitatively using commercially pure aluminum of 99.97% purity, both in the as-supplied condition and in grain-refined conditions obtained with additions such as titanium and titanium plus boron, which are normally used in industry for improving hardness and mechanical behavior. The wear tests were carried out under different loads and speeds (the main parameters in assessing wear resistance) and the data was used for…

  3. Subtidal soft-bottom macroinvertebrate communities of the Canary Islands. An ecological approach

    Directory of Open Access Journals (Sweden)

    Oscar Monterroso

    2012-03-01

    The Canarian archipelago is characterized by a mosaic of soft bottoms such as Cymodocea nodosa meadows, Caulerpa spp. meadows, maërl bottoms, sabellid fields and bare sandy seabeds, hosting various macroinfaunal communities. Vegetated habitats (e.g. Cymodocea and Caulerpa) maintain more diverse communities than the non-vegetated seabeds. The results indicated that Caulerpa meadows and, to a lesser extent, Cymodocea nodosa meadows and sabellid fields are the richest and most diverse ecosystems in the study area. Moreover, biodiversity differences among islands could be detected, with maximum values on the eastern islands (Lanzarote and Gran Canaria) and lowest values on the western ones (La Palma).

  4. Computational alloy design of (Co1-xNix)88Zr7B4Cu1 nanocomposite soft magnets

    Science.gov (United States)

    Dong, B.; Healy, J.; Lan, S.; Daniil, M.; Willard, M. A.

    2018-05-01

    The dependence of coercivity on composition is an important factor for establishing optimized soft magnetic properties. In this study, we have used the random anisotropy and coherent rotation models to estimate the variation of coercivity with composition in (Co1-xNix)88Zr7B4Cu1 nanocomposite alloys. Our calculations show that the magnetoelastic anisotropy contribution to coercivity dominates for Ni-rich compositions (x > 0.5). A small range of compositions (0.65 < x < 0.75) is predicted to result in low values of coercivity (<10 A/m). To validate this prediction, (Co1-xNix)88Zr7B4Cu1 nanocomposites in this range were prepared by melt spinning followed by 3600 s isothermal annealing at the primary crystallization peak temperature (~673 K). Hysteresis loops were measured using vibrating sample magnetometry at room temperature, and saturation magnetostriction was measured using a strain-gage-based magnetostrictometer. Moderately small coercivities (30-40 A/m) and magnetostrictions (3-4 ppm) were measured for samples with 0.685 < x < 0.725. Our measured coercivity had a minimum value of 32 A/m at x = 0.725, a shift in composition of about 5 at% toward higher Ni content, without the anticipated low value of coercivity. Several reasons for the inaccuracy of this approach are described, including: ignored contributions from the amorphous phase (especially to the magnetoelastic anisotropy), composition segregation during crystallization leading to unpredictable compositional shifts in the prediction, and the general observation that predicting minimum coercivity from minimal combined anisotropies shows unexplained deviations even in far less complicated materials.
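
    For context, the random anisotropy model invoked above is usually written in Herzer's textbook form for grain sizes D below the exchange length (K1: magnetocrystalline anisotropy constant, A: exchange stiffness, Js: saturation polarization, pc: a prefactor of order unity); this is the generic form, not necessarily the exact expression used by the authors:

        \langle K \rangle \approx \frac{K_1^{4} D^{6}}{A^{3}},
        \qquad
        H_c \approx p_c \, \frac{\langle K \rangle}{J_s}

    The steep D⁶ dependence is what makes nanocrystalline grain sizes magnetically soft.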

  5. Probabilistic Damage Characterization Using the Computationally-Efficient Bayesian Approach

    Science.gov (United States)

    Warner, James E.; Hochhalter, Jacob D.

    2016-01-01

    This work presents a computationally-efficient approach for damage determination that quantifies uncertainty in the provided diagnosis. Given strain sensor data that are polluted with measurement errors, Bayesian inference is used to estimate the location, size, and orientation of damage. This approach uses Bayes' Theorem to combine any prior knowledge an analyst may have about the nature of the damage with information provided implicitly by the strain sensor data to form a posterior probability distribution over possible damage states. The unknown damage parameters are then estimated based on samples drawn numerically from this distribution using a Markov Chain Monte Carlo (MCMC) sampling algorithm. Several modifications are made to the traditional Bayesian inference approach to provide significant computational speedup. First, an efficient surrogate model is constructed using sparse grid interpolation to replace a costly finite element model that must otherwise be evaluated for each sample drawn with MCMC. Next, the standard Bayesian posterior distribution is modified using a weighted likelihood formulation, which is shown to improve the convergence of the sampling process. Finally, a robust MCMC algorithm, Delayed Rejection Adaptive Metropolis (DRAM), is adopted to sample the probability distribution more efficiently. Numerical examples demonstrate that the proposed framework effectively provides damage estimates with uncertainty quantification and can yield orders of magnitude speedup over standard Bayesian approaches.
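
    A minimal sketch of the core machinery: plain random-walk Metropolis sampling with a Gaussian likelihood. DRAM, the weighted likelihood, and the sparse-grid surrogate are omitted, and the one-parameter forward model here is a hypothetical stand-in for a strain model.

        import numpy as np

        rng = np.random.default_rng(0)
        sensors = np.linspace(0.5, 1.5, 8)
        surrogate = lambda theta: theta * sensors      # stand-in strain response
        sigma, theta_true = 0.05, 2.0
        data = surrogate(theta_true) + sigma * rng.normal(size=8)  # noisy sensors

        def log_post(theta):                           # flat prior on [0, 4]
            if not 0.0 <= theta <= 4.0:
                return -np.inf
            r = data - surrogate(theta)
            return -0.5 * (r @ r) / sigma**2

        samples, theta = [], 1.0
        for _ in range(20000):
            prop = theta + 0.1 * rng.normal()          # random-walk proposal
            if np.log(rng.random()) < log_post(prop) - log_post(theta):
                theta = prop                           # Metropolis accept
            samples.append(theta)
        burned = samples[5000:]                        # discard burn-in
        print("posterior mean / std:", np.mean(burned), np.std(burned))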

  6. Prediction of BP Reactivity to Talking Using Hybrid Soft Computing Approaches

    Directory of Open Access Journals (Sweden)

    Gurmanik Kaur

    2014-01-01

    High blood pressure (BP) is associated with an increased risk of cardiovascular diseases. Therefore, optimal precision in measurement of BP is appropriate in clinical and research studies. In this work, anthropometric characteristics including age, height, weight, body mass index (BMI), and arm circumference (AC) were used as independent predictor variables for the prediction of BP reactivity to talking. Principal component analysis (PCA) was fused with artificial neural network (ANN), adaptive neurofuzzy inference system (ANFIS), and least square-support vector machine (LS-SVM) models to remove the multicollinearity effect among anthropometric predictor variables. The statistical tests in terms of coefficient of determination (R²), root mean square error (RMSE), and mean absolute percentage error (MAPE) revealed that the PCA-based LS-SVM (PCA-LS-SVM) model produced a more efficient prediction of BP reactivity than the other models. This assessment presents the importance and advantages of PCA-fused prediction models for the prediction of biological variables.

  7. Prediction of BP reactivity to talking using hybrid soft computing approaches.

    Science.gov (United States)

    Kaur, Gurmanik; Arora, Ajat Shatru; Jain, Vijender Kumar

    2014-01-01

    High blood pressure (BP) is associated with an increased risk of cardiovascular diseases. Therefore, optimal precision in measurement of BP is appropriate in clinical and research studies. In this work, anthropometric characteristics including age, height, weight, body mass index (BMI), and arm circumference (AC) were used as independent predictor variables for the prediction of BP reactivity to talking. Principal component analysis (PCA) was fused with artificial neural network (ANN), adaptive neurofuzzy inference system (ANFIS), and least square-support vector machine (LS-SVM) models to remove the multicollinearity effect among anthropometric predictor variables. The statistical tests in terms of coefficient of determination (R²), root mean square error (RMSE), and mean absolute percentage error (MAPE) revealed that the PCA-based LS-SVM (PCA-LS-SVM) model produced a more efficient prediction of BP reactivity than the other models. This assessment presents the importance and advantages of PCA-fused prediction models for the prediction of biological variables.
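
    The PCA-fused pipeline is easy to prototype. The sketch below uses scikit-learn's SVR as a stand-in for LS-SVM (which scikit-learn does not provide), with synthetic placeholder data for the five anthropometric predictors:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.metrics import mean_squared_error, r2_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVR

        rng = np.random.default_rng(0)
        X = rng.normal(size=(100, 5))      # age, height, weight, BMI, AC (synthetic)
        y = 50 + X @ np.array([2.0, 1.0, 0.5, 0.3, 0.1]) \
            + rng.normal(scale=0.2, size=100)          # BP reactivity (synthetic)

        # PCA before the kernel regressor removes predictor multicollinearity
        model = make_pipeline(StandardScaler(), PCA(n_components=3),
                              SVR(kernel="rbf", C=10.0))
        model.fit(X[:80], y[:80])
        pred = model.predict(X[80:])
        rmse = mean_squared_error(y[80:], pred) ** 0.5
        mape = np.mean(np.abs((y[80:] - pred) / y[80:])) * 100
        print(f"R2={r2_score(y[80:], pred):.3f}  RMSE={rmse:.3f}  MAPE={mape:.1f}%")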

  8. Computational Approaches for Integrative Analysis of the Metabolome and Microbiome

    Directory of Open Access Journals (Sweden)

    Jasmine Chong

    2017-11-01

    The study of the microbiome, the totality of all microbes inhabiting the host or an environmental niche, has experienced exponential growth over the past few years. The microbiome contributes functional genes and metabolites, and is an important factor for maintaining health. In this context, metabolomics is increasingly applied to complement sequencing-based approaches (marker genes or shotgun metagenomics) to enable resolution of microbiome-conferred functionalities associated with health. However, analyzing the resulting multi-omics data remains a significant challenge in current microbiome studies. In this review, we provide an overview of different computational approaches that have been used in recent years for integrative analysis of metabolome and microbiome data, ranging from statistical correlation analysis to metabolic network-based modeling approaches. Throughout the process, we strive to present a unified conceptual framework for multi-omics integration and interpretation, as well as point out potential future directions.

  9. A Computer Vision Approach to Identify Einstein Rings and Arcs

    Science.gov (United States)

    Lee, Chien-Hsiu

    2017-03-01

    Einstein rings are rare gems of strong lensing phenomena; the ring images can be used to probe the underlying lens gravitational potential at all position angles, tightly constraining the lens mass profile. In addition, the magnified images also enable us to probe high-z galaxies with enhanced resolution and signal-to-noise ratios. However, only a handful of Einstein rings have been reported, either from serendipitous discoveries or from visual inspections of hundreds of thousands of massive galaxies or galaxy clusters. In the era of large sky surveys, an automated approach to identify ring patterns in the big data to come is in high demand. Here, we present an Einstein ring recognition approach based on computer vision techniques. The workhorse is the circle Hough transform, which recognises circular patterns or arcs in images. We propose a two-tier approach: first pre-select massive galaxies associated with multiple blue objects as possible lenses, then use the Hough transform to identify circular patterns. As a proof of concept, we apply our approach to SDSS, with high completeness, albeit with low purity. We also apply our approach to other lenses in the DES, HSC-SSP, and UltraVISTA surveys, illustrating the versatility of our approach.
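
    The second tier maps directly onto a standard library call; below is a toy sketch with OpenCV's circle Hough transform, where a synthetic image stands in for a survey cutout of a lens candidate:

        import cv2
        import numpy as np

        img = np.zeros((128, 128), dtype=np.uint8)
        cv2.circle(img, (64, 64), 30, 255, 2)          # toy "Einstein ring"
        img = cv2.GaussianBlur(img, (5, 5), 0)

        # HoughCircles returns (x, y, r) triples for detected circular patterns
        circles = cv2.HoughCircles(img, cv2.HOUGH_GRADIENT, dp=1, minDist=40,
                                   param1=100, param2=20,
                                   minRadius=10, maxRadius=60)
        if circles is not None:
            for x, y, r in np.round(circles[0]).astype(int):
                print(f"ring candidate at ({x}, {y}), radius {r} px")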

  10. SPINET: A Parallel Computing Approach to Spine Simulations

    Directory of Open Access Journals (Sweden)

    Peter G. Kropf

    1996-01-01

    Research in scientific programming enables us to realize more and more complex applications, while, on the other hand, application-driven demands on computing methods and power are continuously growing. Therefore, interdisciplinary approaches are becoming more widely used. The interdisciplinary SPINET project presented in this article applies modern scientific computing tools to biomechanical simulations: parallel computing and symbolic and modern functional programming. The target application is the human spine. Simulations of the spine help us to investigate and better understand the mechanisms of back pain and spinal injury. Two approaches have been used: the first uses the finite element method for high-performance simulations of static biomechanical models, and the second generates a simulation development tool for experimenting with different dynamic models. A finite element program for static analysis has been parallelized for the MUSIC machine. To solve the sparse system of linear equations, a conjugate gradient solver (iterative method) and a frontal solver (direct method) have been implemented. The preprocessor required for the frontal solver is written in the modern functional programming language SML, the solver itself in C, thus exploiting the characteristic advantages of both functional and imperative programming. The speedup analysis of both solvers shows very satisfactory results for this irregular problem. A mixed symbolic-numeric environment for rigid body system simulations is also presented. It automatically generates C code from a problem specification expressed in the Lagrange formalism using Maple.
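
    For readers unfamiliar with the iterative solver mentioned, a bare-bones conjugate gradient for a symmetric positive-definite system K u = f looks as follows (a generic textbook sketch, not SPINET's parallel implementation):

        import numpy as np

        def conjugate_gradient(K, f, tol=1e-10, max_iter=1000):
            u = np.zeros_like(f)
            r = f - K @ u                  # residual
            p = r.copy()                   # search direction
            rs = r @ r
            for _ in range(max_iter):
                Kp = K @ p
                alpha = rs / (p @ Kp)      # optimal step along p
                u += alpha * p
                r -= alpha * Kp
                rs_new = r @ r
                if np.sqrt(rs_new) < tol:
                    break
                p = r + (rs_new / rs) * p  # K-conjugate update of direction
                rs = rs_new
            return u

        A = np.array([[4.0, 1.0], [1.0, 3.0]])   # toy SPD "stiffness" matrix
        print(conjugate_gradient(A, np.array([1.0, 2.0])))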

  11. Computer-oriented approach to fault-tree construction

    International Nuclear Information System (INIS)

    Salem, S.L.; Apostolakis, G.E.; Okrent, D.

    1976-11-01

    A methodology for systematically constructing fault trees for general complex systems is developed and applied, via the Computer Automated Tree (CAT) program, to several systems. A means of representing component behavior by decision tables is presented. The method developed allows the modeling of components with various combinations of electrical, fluid and mechanical inputs and outputs. Each component can have multiple internal failure mechanisms which combine with the states of the inputs to produce the appropriate output states. The generality of this approach allows not only the modeling of hardware, but human actions and interactions as well. A procedure for constructing and editing fault trees, either manually or by computer, is described. The techniques employed result in a complete fault tree, in standard form, suitable for analysis by current computer codes. Methods of describing the system, defining boundary conditions and specifying complex TOP events are developed in order to set up the initial configuration for which the fault tree is to be constructed. The approach used allows rapid modifications of the decision tables and systems to facilitate the analysis and comparison of various refinements and changes in the system configuration and component modeling

  12. Understanding Plant Nitrogen Metabolism through Metabolomics and Computational Approaches

    Directory of Open Access Journals (Sweden)

    Perrin H. Beatty

    2016-10-01

    A comprehensive understanding of plant metabolism could provide a direct mechanism for improving nitrogen use efficiency (NUE) in crops. One of the major barriers to achieving this outcome is our poor understanding of the complex metabolic networks, physiological factors, and signaling mechanisms that affect NUE in agricultural settings. However, an exciting collection of computational and experimental approaches has begun to elucidate whole-plant nitrogen usage and provides an avenue for connecting nitrogen-related phenotypes to genes. Herein, we describe how metabolomics, computational models of metabolism, and flux balance analysis have been harnessed to advance our understanding of plant nitrogen metabolism. We introduce a model describing the complex flow of nitrogen through crops in a real-world agricultural setting and describe how experimental metabolomics data, such as isotope labeling rates and analyses of nutrient uptake, can be used to refine these models. In summary, the metabolomics/computational approach offers an exciting mechanism for understanding NUE that may ultimately lead to more effective crop management and engineered plants with higher yields.
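
    Flux balance analysis, mentioned above, reduces to a linear program over the steady-state flux vector v (standard formulation; S is the stoichiometric matrix and c encodes the objective, e.g. biomass production or nitrogen assimilation):

        \max_{v} \; c^{\mathsf{T}} v
        \quad \text{subject to} \quad
        S\,v = 0, \qquad v_{\min} \le v \le v_{\max}

    The steady-state constraint S v = 0 enforces mass balance on every metabolite, while the bounds encode reaction reversibility and measured uptake rates.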

  13. A comparative approach to closed-loop computation.

    Science.gov (United States)

    Roth, E; Sponberg, S; Cowan, N J

    2014-04-01

    Neural computation is inescapably closed-loop: the nervous system processes sensory signals to shape motor output, and motor output consequently shapes sensory input. Technological advances have enabled neuroscientists to close, open, and alter feedback loops in a wide range of experimental preparations. The experimental capability of manipulating the topology (that is, how information can flow between subsystems) provides new opportunities to understand the mechanisms and computations underlying behavior. These experiments encompass a spectrum of approaches, from fully open-loop, restrained preparations to the fully closed-loop character of free behavior. Control theory and system identification provide a clear computational framework for relating these experimental approaches. We describe recent progress and new directions for translating experiments at one level in this spectrum to predictions at another level. Operating across this spectrum can reveal new understanding of how low-level neural mechanisms relate to high-level function during closed-loop behavior. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. A soft lithographic approach to fabricate InAs nanowire field-effect transistors

    DEFF Research Database (Denmark)

    Madsen, Morten; Lee, S. H.; Shin, S.-H.

    2018-01-01

    …a top-down approach and an epitaxial layer transfer process, using MBE-grown ultrathin InAs as a source wafer. The width of the InAs nanowires was controlled using solvent-assisted nanoscale embossing (SANE), descumming, and etching processes. By optimizing these processes, NWs with a width less than 50 nm were obtained…

  15. A New Approach to Studying Biological and Soft Materials Using Focused Ion Beam Scanning Electron Microscopy (FIB SEM)

    International Nuclear Information System (INIS)

    Stokes, D J; Morrissey, F; Lich, B H

    2006-01-01

    Over the last decade techniques such as confocal light microscopy, in combination with fluorescent labelling, have helped biologists and life scientists to study biological architectures at tissue and cell level in great detail. Meanwhile, obtaining information at very small length scales is possible with the combination of sample preparation techniques and transmission electron microscopy (TEM) or scanning transmission electron microscopy (STEM). Scanning electron microscopy (SEM) is well known for the determination of surface characteristics and morphology. However, the desire to understand the three-dimensional relationships of meso-scale hierarchies has led to the development of advanced microscopy techniques, giving a further complementary approach. A focused ion beam (FIB) can be used as a nano-scalpel and hence allows us to reveal internal microstructure in a site-specific manner. Whilst FIB instruments have been used to study and verify the three-dimensional architecture of man-made materials, SEM and FIB technologies have now been brought together in a single instrument, representing a powerful combination for the study of biological specimens and soft materials. We demonstrate the use of FIB SEM to study three-dimensional relationships across a range of length scales and materials, from small-scale cellular structures to the larger-scale interactions between biomedical materials and tissues. FIB cutting of heterogeneous mixtures of hard and soft materials, resulting in a uniform cross-section, has proved to be of particular value, since classical preparation methods tend to introduce artefacts. Furthermore, by appropriate selection, we can sequentially cross-section to create a series of 'slices' at specific intervals. 3D reconstruction software can then be used to volume-render information from the 2D slices, enabling us to immediately see the spatial relationships between microstructural components.

  16. A fully digital approach to replicate peri-implant soft tissue contours and emergence profile in the esthetic zone.

    Science.gov (United States)

    Monaco, Carlo; Evangelisti, Edoardo; Scotti, Roberto; Mignani, Giuseppe; Zucchelli, Giovanni

    2016-12-01

    This short communication reports on a novel digital technique, designated the "Fully Digital Technique" (FDT), for taking the impression of the peri-implant soft tissue and emergence profile with an intraoral scanner, digitally capturing both the three-dimensional position of the implant platform and the coronal and gingival parts of the provisional retained restoration. A first intraoral digital impression, which generated a standard triangulation language file (STL1), was taken using a standardized implant scanbody to detect the position of the implant. A second digital impression (STL2), with the provisional retained restoration in situ, was performed in two steps: the first part of the scan captured all details of the vestibular and palatal sides of the provisional retained restoration and the adjacent teeth. The provisional retained restoration was then unscrewed, and the subgingival part of the restoration was scanned directly out of the mouth to determine its subgingival shape. STL1 and STL2 were imported into imaging software and superimposed using the "best fit" algorithm to achieve a new merged file (STL3) with the 3D implant position, the peri-implant mucosa, and the emergence profile. The merged file was used to design the CAD/CAM customized abutment and to produce a stereolithographic model by 3D printing. The STL superimposition of digital impressions of the implant position and the provisional retained restoration constitutes a novel technique to obtain a single STL file with the implant position and its peri-implant mucosal architecture. FDT is a rapid digital approach for capturing all information on the peri-implant soft tissue and emergence profile directly from the provisional retained restoration. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  17. Facial soft-tissue asymmetry in three-dimensional cone-beam computed tomography images of children with surgically corrected unilateral clefts.

    Science.gov (United States)

    Starbuck, John Marlow; Ghoneima, Ahmed; Kula, Katherine

    2014-03-01

    Cleft lip with or without cleft palate (CL/P) is a relatively common craniofacial malformation involving bony and soft-tissue disruptions of the nasolabial and dentoalveolar regions. The combination of CL/P and subsequent craniofacial surgeries to close the cleft and improve appearance of the cutaneous upper lip and nose can cause scarring and muscle pull, possibly resulting in soft-tissue depth asymmetries across the face. We tested the hypothesis that tissue depths in children with unilateral CL/P exhibit differences in symmetry across the sides of the face. Twenty-eight tissue depths were measured on cone-beam computed tomography images of children with unilateral CL/P (n = 55), aged 7 to 17 years, using Dolphin software (version 11.5). Significant differences in tissue depth symmetry were found around the cutaneous upper lip and nose in patients with unilateral CL/P.

  18. Computing dispersion curves of elastic/viscoelastic transversely-isotropic bone plates coupled with soft tissue and marrow using semi-analytical finite element (SAFE) method.

    Science.gov (United States)

    Nguyen, Vu-Hieu; Tran, Tho N H T; Sacchi, Mauricio D; Naili, Salah; Le, Lawrence H

    2017-08-01

    We present a semi-analytical finite element (SAFE) scheme for accurately computing the velocity dispersion and attenuation in a trilayered system consisting of a transversely-isotropic (TI) cortical bone plate sandwiched between the soft tissue and marrow layers. The soft tissue and marrow are mimicked by two fluid layers of finite thickness. A Kelvin-Voigt model accounts for the absorption of all three biological domains. The simulated dispersion curves are validated by the results from the commercial software DISPERSE and published literature. Finally, the algorithm is applied to a viscoelastic trilayered TI bone model to interpret the guided modes of an ex-vivo experimental data set from a bone phantom. Copyright © 2017 Elsevier Ltd. All rights reserved.
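
    For orientation, a Kelvin-Voigt material adds a viscous term to the elastic stress, which for time-harmonic waves is equivalent to a complex stiffness; this is the standard one-dimensional form, not the full anisotropic tensor version used in the paper:

        \sigma = C\,\varepsilon + \eta\,\dot{\varepsilon}
        \quad \Longrightarrow \quad
        C^{*}(\omega) = C + i\,\omega\,\eta

    The imaginary part is what produces the attenuation of the guided modes computed by the SAFE scheme.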

  19. Computational approaches in the design of synthetic receptors - A review.

    Science.gov (United States)

    Cowen, Todd; Karim, Kal; Piletsky, Sergey

    2016-09-14

    The rational design of molecularly imprinted polymers (MIPs) has been a major contributor to their reputation as "plastic antibodies": high-affinity, robust synthetic receptors which can be optimally designed and produced at a much lower cost than their biological equivalents. Computational design has become a routine procedure in the production of MIPs, and has led to major advances in functional monomer screening, selection of cross-linker and solvent, optimisation of monomer(s)-template ratio and selectivity analysis. In this review the various computational methods will be discussed with reference to all the relevant literature published since the end of 2013, with each article described by the target molecule, the computational approach applied (whether molecular mechanics/molecular dynamics, semi-empirical quantum mechanics, ab initio quantum mechanics (Hartree-Fock, Møller-Plesset, etc.) or DFT) and the purpose for which it was used. Detailed analysis is given to novel techniques including analysis of polymer binding sites, the use of novel screening programs and simulation of the MIP polymerisation reaction. Further advances in molecular modelling and computational design of synthetic receptors in particular will have a serious impact on the future of nanotechnology and biotechnology, permitting the further translation of MIPs into the realms of analytics and medical technology. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. Analytical and computational approaches to define the Aspergillus niger secretome

    Energy Technology Data Exchange (ETDEWEB)

    Tsang, Adrian; Butler, Gregory D.; Powlowski, Justin; Panisko, Ellen A.; Baker, Scott E.

    2009-03-01

    We used computational and mass spectrometric approaches to characterize the Aspergillus niger secretome. The 11,200 gene models predicted in the genome of A. niger strain ATCC 1015 were the data source for the analysis. Depending on the computational methods used, 691 to 881 proteins were predicted to be secreted proteins. We cultured A. niger in six different media and analyzed the extracellular proteins produced using mass spectrometry. A total of 222 proteins were identified, with 39 proteins expressed under all six conditions and 74 proteins expressed under only one condition. The secreted proteins identified by mass spectrometry were used to guide the correction of about 20 gene models. Additional analysis focused on extracellular enzymes of interest for biomass processing. Of the 63 glycoside hydrolases predicted to be capable of hydrolyzing cellulose, hemicellulose or pectin, 94% of the exo-acting enzymes and only 18% of the endo-acting enzymes were experimentally detected.

  1. Identifying Pathogenicity Islands in Bacterial Pathogenomics Using Computational Approaches

    Directory of Open Access Journals (Sweden)

    Dongsheng Che

    2014-01-01

    High-throughput sequencing technologies have made it possible to study bacteria through analyzing their genome sequences. For instance, comparative genome sequence analyses can reveal phenomena such as gene loss, gene gain, or gene exchange in a genome. By analyzing pathogenic bacterial genomes, we can discover that pathogenic genomic regions in many pathogenic bacteria are horizontally transferred from other bacteria; these regions are also known as pathogenicity islands (PAIs). PAIs have some detectable properties, such as having different genomic signatures than the rest of the host genome and containing mobility genes so that they can be integrated into the host genome. In this review, we discuss various pathogenicity-island-associated features and current computational approaches for the identification of PAIs. Existing pathogenicity island databases and related computational resources are also discussed, so that researchers may find them useful for studies of bacterial evolution and pathogenicity mechanisms.

  2. Fast reactor safety and computational thermo-fluid dynamics approaches

    International Nuclear Information System (INIS)

    Ninokata, Hisashi; Shimizu, Takeshi

    1993-01-01

    This article provides a brief description of the safety principles on which liquid metal cooled fast breeder reactors (LMFBRs) are based and the roles of computation in safety practices. A number of thermohydraulics models have been developed to date that successfully describe several of the important types of fluid and material motion encountered in the analysis of postulated accidents in LMFBRs. Most of these models use a mixture of implicit and explicit numerical solution techniques in solving a set of conservation equations formulated in Eulerian coordinates, with special techniques included for specific situations. Typical computational thermo-fluid dynamics approaches are discussed, in particular for analyses of the physical phenomena relevant to fuel subassembly thermohydraulics design and those that involve describing the motion of molten materials in the core over a large scale. (orig.)

  3. Benchmarking of computer codes and approaches for modeling exposure scenarios

    International Nuclear Information System (INIS)

    Seitz, R.R.; Rittmann, P.D.; Wood, M.I.; Cook, J.R.

    1994-08-01

    The US Department of Energy Headquarters established a performance assessment task team (PATT) to integrate the activities of DOE sites that are preparing performance assessments for the disposal of newly generated low-level waste. The PATT chartered a subteam with the task of comparing computer codes and exposure scenarios used for dose calculations in performance assessments. This report documents the efforts of the subteam. Computer codes considered in the comparison include GENII, PATHRAE-EPA, MICROSHIELD, and ISOSHLD. Calculations were also conducted using spreadsheets to provide a comparison at the most fundamental level. Calculations and modeling approaches are compared for unit radionuclide concentrations in water and soil for the ingestion, inhalation, and external dose pathways. Over 30 tables comparing inputs and results are provided

  4. The Soft Constraints Hypothesis: A Rational Analysis Approach to Resource Allocation for Interactive Behavior

    Science.gov (United States)

    2006-01-01

    Published in Psychological Review, 2006, Vol. 113, No. 3, 461-482 (DOI: 10.1037/0033-295X.113.3.461). The work combines an analysis approach from signal-detection theorists (Geisler, 2003; Macmillan & Creelman, 2004) with rational analysis (Anderson, 1990, 1991) to present an…

  5. A hollow sphere soft lithography approach for long-term hanging drop methods.

    Science.gov (United States)

    Lee, Won Gu; Ortmann, Daniel; Hancock, Matthew J; Bae, Hojae; Khademhosseini, Ali

    2010-04-01

    In conventional hanging drop (HD) methods, embryonic stem cell aggregates or embryoid bodies (EBs) are often maintained in small inverted droplets. Gravity limits the volumes of these droplets to less than 50 µL, and hence such cell cultures can only be sustained for a few days without frequent media changes. Here we present a new approach to performing long-term HD methods (10-15 days) that can provide larger media reservoirs in a HD format to maintain more consistent culture media conditions. To implement this approach, we fabricated hollow sphere (HS) structures by injecting liquid drops into noncured poly(dimethylsiloxane) mixtures. These structures served as cell culture chambers with large media volumes (500 µL in each sphere) where EBs could grow without media depletion. The results showed that the sizes of the EBs cultured in the HS structures in a long-term HD format were approximately twice those of conventional HD methods after 10 days in culture. Further, HS cultures showed multilineage differentiation, similar to EBs cultured in the HD method. Due to its ease of fabrication and enhanced features, this approach may be of potential benefit as a stem cell culture method for regenerative medicine.

  6. Exploiting the Dynamics of Soft Materials for Machine Learning.

    Science.gov (United States)

    Nakajima, Kohei; Hauser, Helmut; Li, Tao; Pfeifer, Rolf

    2018-06-01

    Soft materials are increasingly utilized for various purposes in many engineering applications. These materials have been shown to perform a number of functions that were previously difficult to implement using rigid materials. Here, we argue that the diverse dynamics generated by actuating soft materials can be effectively used for machine learning purposes. This is demonstrated using a soft silicone arm through a technique of multiplexing, which enables the rich transient dynamics of the soft materials to be fully exploited as a computational resource. The computational performance of the soft silicone arm is examined through two standard benchmark tasks. Results show that the soft arm compares well to or even outperforms conventional machine learning techniques under multiple conditions. We then demonstrate that this system can be used for the sensory time series prediction problem for the soft arm itself, which suggests its immediate applicability to a real-world machine learning problem. Our approach, on the one hand, represents a radical departure from traditional computational methods, whereas on the other hand, it fits nicely into a more general perspective of computation by way of exploiting the properties of physical materials in the real world.
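
    The scheme in the abstract is an instance of reservoir computing: the rich transient dynamics provide features, and only a linear readout is trained. The sketch below uses a simulated random dynamical system as a stand-in for the soft silicone arm, with a ridge-regression readout on the reservoir states; the delayed-input target is a placeholder task.

        import numpy as np

        rng = np.random.default_rng(0)
        T, n = 2000, 100
        W = rng.normal(size=(n, n))
        W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # echo-state scaling
        w_in = rng.normal(size=n)

        u = rng.uniform(-1, 1, T)              # input stream ("motor commands")
        x = np.zeros(n)
        states = np.empty((T, n))
        for t in range(T):
            x = np.tanh(W @ x + w_in * u[t])   # nonlinear transient dynamics
            states[t] = x

        y = np.roll(u, 2)                      # target: 2-step-delayed input
        A, b = states[100:], y[100:]           # discard washout transient
        w_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(n), A.T @ b)  # ridge readout
        print("train MSE:", np.mean((A @ w_out - b) ** 2))

    Only w_out is trained; the "body" (here W) is fixed, which is exactly what lets a physical soft arm replace the simulated dynamics.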

  7. Approaching multiphase flows from the perspective of computational fluid dynamics

    International Nuclear Information System (INIS)

    Banas, A.O.

    1992-01-01

    Thermalhydraulic simulation methodologies based on subchannel and porous-medium concepts are briefly reviewed and contrasted with the general approach of Computational Fluid Dynamics (CFD). An outline of the advanced CFD methods for single-phase turbulent flows is followed by a short discussion of the unified formulation of averaged equations for turbulent and multiphase flows. Some of the recent applications of CFD at Chalk River Laboratories are discussed, and the complementary role of CFD with regard to the established thermalhydraulic methods of analysis is indicated. (author). 8 refs

  8. Applications of Soft Union Sets in the Ring Theory

    Directory of Open Access Journals (Sweden)

    Yongwei Yang

    2013-01-01

    …Through discussing quotient soft subsets, an approach for constructing quotient soft union rings is given. Finally, isomorphism theorems of (λ, μ)-soft union rings related to invariant soft sets are discussed.

  9. Bayesian Multi-Energy Computed Tomography reconstruction approaches based on decomposition models

    International Nuclear Information System (INIS)

    Cai, Caifang

    2013-01-01

    Multi-Energy Computed Tomography (MECT) makes it possible to obtain multiple fractions of basis materials without segmentation. In medical applications, one is the soft-tissue-equivalent water fraction and the other is the hard-matter-equivalent bone fraction. Practical MECT measurements are usually obtained with polychromatic X-ray beams. Existing reconstruction approaches based on linear forward models that do not account for the beam polychromaticity fail to estimate the correct decomposition fractions and result in beam-hardening artifacts (BHA). The existing BHA correction approaches either need to refer to calibration measurements or suffer from the noise amplification caused by the negative-log pre-processing and the water/bone separation problem. To overcome these problems, statistical DECT reconstruction approaches based on non-linear forward models that account for the beam polychromaticity show great potential for producing accurate fraction images. This work proposes a full-spectral Bayesian reconstruction approach which allows the reconstruction of high-quality fraction images from ordinary polychromatic measurements. The approach is based on a Gaussian noise model with unknown variance assigned directly to the projections, without taking the negative log. Within the Bayesian framework, the decomposition fractions and observation variance are estimated using the joint Maximum A Posteriori (MAP) estimation method. Thanks to an adaptive prior model assigned to the variance, the joint estimation problem simplifies into a single estimation problem, transforming the joint MAP estimation into a minimization problem with a non-quadratic cost function. To solve it, the use of a monotone Conjugate Gradient (CG) algorithm with suboptimal descent steps is proposed. The performance of the proposed approach is analyzed with both simulated and experimental data. The results show that the proposed Bayesian approach is robust to noise and materials. It is also…
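
    In generic notation (not the thesis' exact formulation), the joint MAP criterion described above has the form:

        (\hat{x}, \hat{\sigma}^2) = \arg\max_{x,\,\sigma^2} \;
        p(y \mid x, \sigma^2)\, p(x)\, p(\sigma^2),
        \qquad
        p(y \mid x, \sigma^2) = \mathcal{N}\!\left(y;\, h(x),\, \sigma^2 I\right)

    Here y denotes the raw (non-log) projections and h(x) the non-linear polychromatic forward model; the adaptive prior p(σ²) is what allows the joint problem to collapse into a single non-quadratic minimization in x.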

  10. Stochastic Computational Approach for Complex Nonlinear Ordinary Differential Equations

    International Nuclear Information System (INIS)

    Khan, Junaid Ali; Raja, Muhammad Asif Zahoor; Qureshi, Ijaz Mansoor

    2011-01-01

    We present an evolutionary computational approach for the solution of nonlinear ordinary differential equations (NLODEs). The mathematical model is a feed-forward artificial neural network that defines an unsupervised error. The networks are trained by a hybrid intelligent algorithm combining global search by a genetic algorithm with local search by a pattern-search technique. The applicability of this approach ranges from single-order NLODEs to systems of coupled differential equations. We illustrate the method by solving a variety of model problems and present comparisons with solutions obtained by exact methods and classical numerical methods. Unlike other numerical techniques of comparable accuracy, the solution is provided on a continuous finite time interval. With the advent of neuroprocessors and digital signal processors, the method becomes particularly interesting due to the expected essential gains in execution speed. (general)
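
    The trial-solution construction can be sketched as follows for dy/dx = -y, y(0) = 1. SciPy's differential evolution (a global evolutionary search with a default local polish) stands in for the paper's genetic-algorithm plus pattern-search hybrid; network size and bounds are illustrative choices.

      # Neural trial solution with an unsupervised ODE-residual error.
      import numpy as np
      from scipy.optimize import differential_evolution

      xs = np.linspace(0.0, 1.0, 20)             # collocation points on [0, 1]

      def unpack(p):                             # 3 hidden tanh units
          return p[0:3], p[3:6], p[6:9]          # weights, biases, output weights

      def residual_error(p):
          w, b, v = unpack(p)
          z = np.tanh(np.outer(xs, w) + b)       # hidden activations, (20, 3)
          N = z @ v
          dN = ((1.0 - z**2) * w) @ v            # dN/dx via the chain rule
          y = 1.0 + xs * N                       # trial form enforces y(0) = 1
          dy = N + xs * dN
          return np.mean((dy + y) ** 2)          # unsupervised residual of y' = -y

      res = differential_evolution(residual_error, bounds=[(-3, 3)] * 9,
                                   seed=0, tol=1e-10, polish=True)
      w, b, v = unpack(res.x)
      y_hat = 1.0 + xs * (np.tanh(np.outer(xs, w) + b) @ v)
      print("max |error| vs exp(-x):", np.abs(y_hat - np.exp(-xs)).max())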

  11. A Dynamic Bayesian Approach to Computational Laban Shape Quality Analysis

    Directory of Open Access Journals (Sweden)

    Dilip Swaminathan

    2009-01-01

    … kinesiology. LMA (especially Effort/Shape) emphasizes how internal feelings and intentions govern the patterning of movement throughout the whole body. As we argue, a complex understanding of intention via LMA is necessary for human-computer interaction to become embodied in ways that resemble interaction in the physical world. We thus introduce a novel, flexible Bayesian fusion approach for identifying LMA Shape qualities from raw motion-capture data in real time. The method uses a dynamic Bayesian network (DBN) to fuse movement features across the body and across time and, as we discuss, can be readily adapted for low-cost video. It has delivered excellent performance in preliminary studies comprising improvisatory movements. Our approach has been incorporated in Response, a mixed-reality environment where users interact via natural, full-body human movement and enhance their bodily-kinesthetic awareness through immersive sound and light feedback, with applications to kinesiology training, Parkinson's patient rehabilitation, interactive dance, and many other areas.

  12. Error characterization for asynchronous computations: Proxy equation approach

    Science.gov (United States)

    Sallai, Gabriella; Mittal, Ankita; Girimaji, Sharath

    2017-11-01

    Numerical techniques for asynchronous fluid-flow simulations are currently under development to enable efficient utilization of massively parallel computers. These numerical approaches attempt to solve the time evolution of transport equations accurately using spatial information at different time levels. The truncation error of asynchronous methods can be divided into two parts: delay-dependent (EA), or asynchronous, error and delay-independent (ES), or synchronous, error. The focus of this study is a specific asynchronous-error mitigation technique called the proxy-equation approach. The aim is to examine these errors as a function of the characteristic wavelength of the solution. Mitigation of asynchronous effects requires that the asynchronous error be smaller than the synchronous truncation error. For a simple convection-diffusion equation, proxy-equation error analysis identifies a critical initial wavenumber, λc. At smaller wavenumbers, synchronous errors are larger than asynchronous errors. We examine various approaches to increase the value of λc in order to improve the range of applicability of the proxy-equation approach.
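
    A toy numerical experiment in the same spirit (not the proxy-equation method itself) compares the error of a synchronous finite-difference scheme against a variant whose "processor boundary" points read one-step-stale neighbor values, across initial wavenumbers, for a periodic convection-diffusion equation. Grid sizes, stencils, and parameters are illustrative assumptions.

      # Synchronous vs one-step-delayed stencil error per initial wavenumber k,
      # for u_t + c u_x = nu u_xx with exact solution exp(-nu k^2 t) sin(k(x - c t)).
      import numpy as np

      c, nu = 1.0, 0.02
      N, L = 128, 2 * np.pi
      dx, dt, steps = L / N, 0.01, 200
      x = np.arange(N) * dx
      edges = np.arange(0, N, N // 4)            # fake processor-boundary points

      def step(u, u_for_neighbors):
          # central differences; edge points read possibly stale neighbor data
          up, um = np.roll(u, -1), np.roll(u, 1)
          up[edges] = np.roll(u_for_neighbors, -1)[edges]
          um[edges] = np.roll(u_for_neighbors, 1)[edges]
          return u + dt * (-c * (up - um) / (2 * dx)
                           + nu * (up - 2 * u + um) / dx**2)

      for k in (1, 2, 8):
          u_sync = np.sin(k * x); u_async = np.sin(k * x); u_old = u_async.copy()
          for _ in range(steps):
              u_sync = step(u_sync, u_sync)                    # fresh neighbors
              u_async, u_old = step(u_async, u_old), u_async   # delayed neighbors
          t = steps * dt
          exact = np.exp(-nu * k**2 * t) * np.sin(k * (x - c * t))
          print(k, np.abs(u_sync - exact).max(), np.abs(u_async - exact).max())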

  13. Glass transition of soft colloids

    Science.gov (United States)

    Philippe, Adrian-Marie; Truzzolillo, Domenico; Galvan-Myoshi, Julian; Dieudonné-George, Philippe; Trappe, Véronique; Berthier, Ludovic; Cipelletti, Luca

    2018-04-01

    We explore the glassy dynamics of soft colloids using microgels and charged particles interacting by steric and screened Coulomb interactions, respectively. In the supercooled regime, the structural relaxation time τα of both systems grows steeply with volume fraction, reminiscent of the behavior of colloidal hard spheres. Computer simulations confirm that the growth of τα on approaching the glass transition is independent of particle softness. By contrast, softness becomes relevant at very large packing fractions, when the system falls out of equilibrium. In this nonequilibrium regime, τα depends surprisingly weakly on packing fraction, and time correlation functions exhibit a compressed exponential decay consistent with stress-driven relaxation. The transition to this novel regime coincides with the onset of an anomalous decrease in local order with increasing density, typical of ultrasoft systems. We propose that these peculiar dynamics result from the combination of the nonequilibrium aging dynamics expected in the glassy state and the tendency of colloids interacting through soft potentials to refluidize at high packing fractions.
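
    The compressed exponential decay mentioned above is easy to quantify by fitting a correlation function to f(t) = exp[-(t/τ)^β]; a fitted β > 1 is the signature of compressed (stress-driven) rather than stretched decay. The data in this sketch are synthetic placeholders.

      # Fit a compressed-exponential relaxation; beta > 1 => compressed decay.
      import numpy as np
      from scipy.optimize import curve_fit

      def compressed_exp(t, tau, beta):
          return np.exp(-(t / tau) ** beta)

      t = np.logspace(-1, 2, 40)
      f_obs = (compressed_exp(t, 10.0, 1.5)
               + 0.01 * np.random.default_rng(0).normal(size=t.size))

      (tau, beta), _ = curve_fit(compressed_exp, t, f_obs, p0=(1.0, 1.0))
      print(f"tau = {tau:.2f}, beta = {beta:.2f}")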

  14. Crowd Computing as a Cooperation Problem: An Evolutionary Approach

    Science.gov (United States)

    Christoforou, Evgenia; Fernández Anta, Antonio; Georgiou, Chryssis; Mosteiro, Miguel A.; Sánchez, Angel

    2013-05-01

    Cooperation is one of the socio-economic issues that has received the most attention from the physics community. The problem has mostly been considered by studying games such as the Prisoner's Dilemma or the Public Goods Game. Here, we take a step forward by studying cooperation in the context of crowd computing. We introduce a model loosely based on principal-agent theory in which people (workers) contribute to the solution of a distributed problem by computing answers and reporting to the problem proposer (master). To go beyond classical approaches involving the concept of Nash equilibrium, we work in an evolutionary framework in which both the master and the workers update their behavior through reinforcement learning. Using a Markov chain approach, we show theoretically that under certain, not very restrictive, conditions the master can ensure the reliability of the answer resulting from the process. We then study the model by numerical simulations, finding that convergence, meaning that the system reaches a point at which it always produces reliable answers, may in general be much faster than the upper bounds given by the theoretical calculation. We also discuss the effects of the master's level of tolerance to defectors, about which the theory does not provide information. The discussion shows that the system works even with very large tolerances. We conclude with a discussion of our results and possible directions to carry this research further.
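
    A minimal aspiration-based reinforcement-learning version of such a master-worker game can be simulated as follows. The payoff values, audit rule, and update rule are illustrative assumptions, not the paper's exact parameterization.

      # Workers learn whether to cheat; the master audits with probability pA.
      import numpy as np

      rng = np.random.default_rng(0)
      n, rounds = 9, 3000
      R, C, F = 1.0, 0.1, 1.0        # payment, computing cost, fine when caught
      pA = 0.3                       # master's audit probability (fixed here)
      asp, lr = 0.5, 0.05            # aspiration level and learning rate

      q = np.full(n, 0.5)            # per-worker probability of cheating
      for _ in range(rounds):
          cheat = rng.random(n) < q
          audited = rng.random() < pA
          payoff = np.where(cheat,
                            np.where(audited, -F, R),  # cheater: fined if audited
                            R - C)                     # honest: paid minus cost
          # aspiration-based update: reinforce actions that beat the aspiration
          delta = lr * (payoff - asp)
          q = np.clip(np.where(cheat, q + delta, q - delta), 0.001, 0.999)

      print("mean cheating probability after learning:", q.mean().round(3))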

  15. Novel computational approaches for the analysis of cosmic magnetic fields

    Energy Technology Data Exchange (ETDEWEB)

    Saveliev, Andrey [Universitaet Hamburg, Hamburg (Germany); Keldysh Institut, Moskau (Russian Federation)

    2016-07-01

    In order to give a consistent picture of cosmic, i.e. galactic and extragalactic, magnetic fields, different approaches are possible and often even necessary. Here we present three of them: First, a semi-analytic analysis of the time evolution of primordial magnetic fields, from which their properties and, subsequently, the nature of present-day intergalactic magnetic fields may be deduced. Second, the use of high-performance computing infrastructure by developing powerful algorithms for (magneto-)hydrodynamic simulations and applying them to astrophysical problems. We are currently developing a code which applies kinetic schemes in massively parallel computing on high-performance multiprocessor systems in a new way to calculate both hydro- and electrodynamic quantities. Finally, as a third approach, astroparticle physics might be used, as magnetic fields leave imprints of their properties on charged particles traversing them. Here we focus on electromagnetic cascades by developing software based on CRPropa which simulates the propagation of particles from such cascades through the intergalactic medium in three dimensions. This may in particular be used to obtain information about the helicity of extragalactic magnetic fields.

  16. Computing Optimal Stochastic Portfolio Execution Strategies: A Parametric Approach Using Simulations

    Science.gov (United States)

    Moazeni, Somayeh; Coleman, Thomas F.; Li, Yuying

    2010-09-01

    Computing optimal stochastic portfolio execution strategies under appropriate risk consideration presents a great computational challenge. We investigate a parametric approach for computing optimal stochastic strategies using Monte Carlo simulations. This approach reduces computational complexity by computing coefficients for a parametric representation of a stochastic dynamic strategy based on static optimization. Constraints can similarly be handled using appropriate penalty functions. We illustrate the proposed approach by minimizing the expected execution cost and the Conditional Value-at-Risk (CVaR).
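
    The Monte Carlo evaluation inside such a parametric approach can be sketched as follows: for a one-parameter family of trade schedules, estimate the expected execution cost and its CVaR from sampled price paths. The linear temporary-impact model and all numbers are illustrative assumptions, not the paper's setup.

      # Expected cost and CVaR of a parametric execution schedule, by sampling.
      import numpy as np

      rng = np.random.default_rng(0)
      X, T, n_paths = 1.0e6, 10, 20000       # shares to sell, periods, samples
      sigma, eta = 0.02, 2.5e-7              # price volatility, temporary impact

      def exec_cost(kappa):
          """Cost samples for an exponential schedule n_t ~ exp(-kappa t)."""
          w = np.exp(-kappa * np.arange(T)); n_t = X * w / w.sum()
          remaining = X - np.cumsum(n_t) + n_t        # pre-trade holdings per period
          noise = rng.normal(0, sigma, (n_paths, T))  # price moves per period
          # temporary-impact cost plus exposure of unsold shares to price moves
          return (eta * n_t**2).sum() + (noise * remaining).sum(axis=1)

      def cvar(samples, alpha=0.95):                  # mean of the worst tail
          tail = np.sort(samples)[int(alpha * len(samples)):]
          return tail.mean()

      for kappa in (0.1, 0.5, 1.0):
          c = exec_cost(kappa)
          print(kappa, "E[cost] =", c.mean().round(2), "CVaR95 =", cvar(c).round(2))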

  17. Theoretical modeling of electroosmotic flow in soft microchannels: A variational approach applied to the rectangular geometry

    Science.gov (United States)

    Sadeghi, Arman

    2018-03-01

    Modeling of fluid flow in polyelectrolyte layer (PEL)-grafted microchannels is challenging due to their two-layer nature. Hence, the pertinent studies are limited only to circular and slit geometries for which matching the solutions for inside and outside the PEL is simple. In this paper, a simple variational-based approach is presented for the modeling of fully developed electroosmotic flow in PEL-grafted microchannels by which the whole fluidic area is considered as a single porous medium of variable properties. The model is capable of being applied to microchannels of a complex cross-sectional area. As an application of the method, it is applied to a rectangular microchannel of uniform PEL properties. It is shown that modeling a rectangular channel as a slit may lead to considerable overestimation of the mean velocity especially when both the PEL and electric double layer (EDL) are thick. It is also demonstrated that the mean velocity is an increasing function of the fixed charge density and PEL thickness and a decreasing function of the EDL thickness and PEL friction coefficient. The influence of the PEL thickness on the mean velocity, however, vanishes when both the PEL thickness and friction coefficient are sufficiently high.

  18. Augmenting Tertiary Students' Soft Skills Via Multiple Intelligences Instructional Approach: Literature Courses in Focus

    Directory of Open Access Journals (Sweden)

    El Sherief Eman

    2017-01-01

    The second half of the twentieth century witnessed an unprecedented increase in the number of students joining higher education (UNESCO, 2001). Currently, the number of students at Saudi universities and colleges exceeds one million, compared with 7,000 in 1970 (Royal Embassy of Saudi Arabia, Washington). Such an enormous body of learners in higher education is diverse enough to embrace distinct learning styles and an assorted repertoire of backgrounds, prior knowledge, experiences, and perspectives; at the same time, they presumably share a common aspiration: securing a suitable post in the labor market upon graduation and subsequently being capable of acting competently in a highly competitive workplace environment. A bundle of potentialities and skills is patently vital for a graduate to reach such a prospect. In the conventional undergraduate paradigm of education, such skills were given no heed, being postponed to the post-graduation phase. The present paper postulates the merits of deploying the Multiple Intelligences theory as a project-based approach within literature classes in higher education: a strategy geared towards reigniting students' engagement, nurturing their critical-thinking capabilities, sustaining their individual dispositions, molding them into inquiry-seekers, and ultimately engendering life-long, autonomous learners well armed with the skills needed to traverse the rigorous competition in the future labor market.

  19. The Development of Red Chili Agribusiness Cluster with a Soft System Methodology (SSM) Approach in Garut, West Java

    Directory of Open Access Journals (Sweden)

    Sri Ayu Andayani

    2016-12-01

    Red chili is one of the commodities with high price fluctuation, and it influences inflation. This happens due to the unsustainable supply of red chili from the central production centers to the market. Bank Indonesia (the central bank) initiated a cluster system to support price control and regional economic growth. In this regard, the study was conducted in Garut regency, one of the centers of red chili production in West Java used for cluster development, where there are still many obstacles along the way. This paper aims to describe the problems that cause unsustainable production and systemically affect industrial supplies, and to analyze existing partnerships that maintain the continuity of supply as an alternative solution. The study was designed qualitatively with the case-study method through a systems approach, namely soft systems methodology (SSM). The results show that the problems in the red chili cluster range from production planning to delays in the sales payment process, which are systemically interlinked, and that collaboration among the actors has not been optimally implemented. The study offers solutions to these problems in accordance with the change formulation of SSM, with industrial emphasis on fairness, transparency, and integrated optimization under the principle of production sustainability, involving all stakeholders through participative collaboration to maintain continuity of production.

  20. Limb sparing approach: Adjuvant radiation therapy in adults with intermediate or high-grade limb soft tissue sarcoma

    International Nuclear Information System (INIS)

    Merimsky, Ofer; Soyfer, Vjacheslav; Kovner, Felix; Bickels, Jacob; Issakov, Josephine; Flusser, Gideon; Meller, Isaac; Ofer, Oded; Kollender, Yehuda

    2005-01-01

    Background: Limb soft tissue sarcomas (STS) are currently treated with limb-sparing surgery (LSS) followed by radiation therapy (RT). Patients and methods: Between October 1994 and October 2002, 133 adult patients with intermediate- or high-grade limb STS were treated by LSS+RT. Results: RT-related toxicity was manageable, with a low rate of severe effects. At a 4-year median follow-up, there were 48 recurrences of any type: 23 isolated local failures and 35 cases of systemic spread without local failure. DFS and OS were influenced by disease stage (II vs I), primary site (upper vs lower limb), MPNST vs other types, induction therapy vs no induction, adequate resection vs marginal resection or involved margins, and good vs bad response to induction therapy. Patient's age and sex, tumor depth, acute or late toxicity of RT, and the interval between the date of definitive surgery and the start of RT did not affect DFS or OS. Conclusions: The RT protocol is applicable in the era of complicated, expensive and time-consuming 3D therapy. Our results of LSS+RT in adults with limb HG STS are satisfactory.

  1. Soft matter assemblies as nanomedicine platforms for cancer chemotherapy: a journey from market products towards novel approaches.

    Science.gov (United States)

    Jäger, Eliézer; Giacomelli, Fernando C

    2015-01-01

    The current review aims to outline the likely medical applications of nanotechnology and the potential of the emerging field of nanomedicine. Nanomedicine can be defined as the investigation area encompassing the design of diagnostics and therapeutics at the nanoscale, including nanobots, nanobiosensors, nanoparticles and other nanodevices, for the remediation, prevention and diagnosis of a variety of illnesses. The ultimate goal of nanomedicine is to improve patient quality of life. Because nanomedicine includes the rational design of an enormous number of nanotechnology-based products focused on miscellaneous diseases, a variety of nanomaterials can be employed. Therefore, this review will focus on recent advances in the manufacture of soft matter-based nanomedicines specifically designed to improve diagnostics and cancer chemotherapy efficacy. Liposomes, polymer-drug conjugates, drug-loaded block copolymer micelles and biodegradable polymeric nanoparticles are particularly highlighted, with emphasis on current investigations and potential novel approaches towards overcoming the remaining challenges in the field, as well as on formulations in clinical trials and marketed products.

  2. Suggested Approaches to the Measurement of Computer Anxiety.

    Science.gov (United States)

    Toris, Carol

    Psychologists can gain insight into human behavior by examining what people feel about, know about, and do with, computers. Two extreme reactions to computers are computer phobia, or anxiety, and computer addiction, or "hacking". A four-part questionnaire was developed to measure computer anxiety. The first part is a projective technique which…

  3. Computational Diagnostic: A Novel Approach to View Medical Data.

    Energy Technology Data Exchange (ETDEWEB)

    Mane, K. K. (Ketan Kirtiraj); Börner, K. (Katy)

    2007-01-01

    A transition from traditional paper-based medical records to electronic health records is largely underway. The use of electronic records offers tremendous potential to personalize patient diagnosis and treatment. In this paper, we discuss a computational diagnostic tool that uses digital medical records to help doctors gain better insight into a patient's medical condition. The paper details the different interactive features of the tool, which offer the potential to practice evidence-based medicine and advance patient diagnosis practices. The healthcare industry is a constantly evolving domain. Research from this domain is often translated into a better understanding of different medical conditions. This new knowledge often contributes towards improved diagnosis and treatment solutions for patients. But the healthcare industry has lagged in reaping the immediate benefits of this new knowledge, as it still adheres to the traditional paper-based approach to keeping track of medical records. Recently, however, there has been a drive promoting a transition towards the electronic health record (EHR). An EHR stores patient medical records in digital format and offers the potential to replace paper health records. Earlier attempts at an EHR replicated the paper layout on the screen, represented the medical history of a patient in a graphical time-series format, or offered interactive visualization with 2D/3D images generated from an imaging device. But an EHR can be much more than just an 'electronic view' of the paper record or a collection of images from an imaging device. In this paper, we present an EHR called 'Computational Diagnostic Tool', which provides a novel computational approach to looking at patient medical data. The developed EHR system is knowledge-driven and acts as a clinical decision support tool. The EHR tool provides two visual views of the medical data. Dynamic interaction with the data is supported to help doctors practice evidence-based decisions and make judicious

  4. Solvent effect on indocyanine dyes: A computational approach

    International Nuclear Information System (INIS)

    Bertolino, Chiara A.; Ferrari, Anna M.; Barolo, Claudia; Viscardi, Guido; Caputo, Giuseppe; Coluccia, Salvatore

    2006-01-01

    The solvatochromic behaviour of a series of indocyanine dyes (Dyes I-VIII) was investigated by quantum chemical calculations. The effects of the polymethine chain length and of the indolenine structure were satisfactorily reproduced by semiempirical Pariser-Parr-Pople (PPP) calculations. The solvatochromism of 3,3,3',3'-tetramethyl-N,N'-diethylindocarbocyanine iodide (Dye I) was investigated in depth within the ab initio time-dependent density functional theory (TD-DFT) approach. Dye I undergoes non-polar solvation, and a linear correlation was identified between absorption shifts and refractive index. Computed absorption λmax and oscillator strengths obtained by TD-DFT are in good agreement with the experimental data.
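
    Diagnosing non-polar solvation of this kind typically amounts to a linear regression of the absorption maxima against a refractive-index function. The sketch below uses the Lorenz-Lorenz form f(n) = (n² - 1)/(n² + 2); both that choice and the data points are illustrative placeholders, not values from the paper.

      # Linear fit of absorption maxima vs a refractive-index function.
      import numpy as np

      n = np.array([1.333, 1.359, 1.423, 1.497, 1.501])        # hypothetical solvents
      lam_max = np.array([546.0, 547.1, 549.8, 552.9, 553.1])  # synthetic nm values

      f = (n**2 - 1) / (n**2 + 2)
      slope, intercept = np.polyfit(f, lam_max, 1)
      r = np.corrcoef(f, lam_max)[0, 1]
      print(f"lam_max = {slope:.1f} * f(n) + {intercept:.1f} nm,  r = {r:.3f}")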

  5. Systems approaches to computational modeling of the oral microbiome

    Directory of Open Access Journals (Sweden)

    Dimiter V. Dimitrov

    2013-07-01

    Current microbiome research has generated tremendous amounts of data providing snapshots of molecular activity in a variety of organisms, environments, and cell types. However, turning this knowledge into a whole-system-level understanding of pathways and processes has proven to be a challenging task. In this review we highlight the applicability of bioinformatics and visualization techniques to large collections of data in order to better understand diet-oral microbiome-host mucosal transcriptome interactions. In particular we focus on the systems biology of Porphyromonas gingivalis in the context of high-throughput computational methods tightly integrated with translational systems medicine. These approaches have applications ranging from basic research, where we can direct specific laboratory experiments in model organisms and cell cultures, to human disease, where we can validate new mechanisms and biomarkers for the prevention and treatment of chronic disorders.

  6. A computational approach to mechanistic and predictive toxicology of pesticides

    DEFF Research Database (Denmark)

    Kongsbak, Kristine Grønning; Vinggaard, Anne Marie; Hadrup, Niels

    2014-01-01

    Emerging challenges of managing and interpreting large amounts of complex biological data have given rise to the growing field of computational biology. We investigated the applicability of an integrated systems-toxicology approach on five selected pesticides to get an overview of their modes of action in humans, to group them according to their modes of action, and to hypothesize on their potential effects on human health. We extracted human proteins associated with prochloraz, tebuconazole, epoxiconazole, procymidone, and mancozeb and enriched each protein set using a high-confidence human … and procymidone exerted their effects mainly via interference with steroidogenesis and nuclear receptors. Prochloraz was associated with a large number of human diseases, and together with tebuconazole showed several significant associations with Testicular Dysgenesis Syndrome. Mancozeb showed a differential mode

  7. An Organic Computing Approach to Self-organising Robot Ensembles

    Directory of Open Access Journals (Sweden)

    Sebastian Albrecht von Mammen

    2016-11-01

    Similar to the Autonomic Computing initiative, which has mainly been advancing techniques for self-optimisation focussing on computing systems and infrastructures, Organic Computing (OC) has been driving the development of system design concepts and algorithms for self-adaptive systems at large. Examples of application domains include, for instance, traffic management and control, cloud services, communication protocols, and robotic systems. Such an OC system typically consists of a potentially large set of autonomous and self-managed entities, where each entity acts with a local decision horizon. By means of cooperation of the individual entities, the behaviour of the entire ensemble system is derived. In this article, we present our work on how autonomous, adaptive robot ensembles can benefit from OC technology. Our elaborations are aligned with the different layers of an observer/controller framework which provides the foundation for the individuals' adaptivity at system design level. Relying on an extended Learning Classifier System (XCS) in combination with adequate simulation techniques, this basic system design empowers robot individuals to improve their individual and collaborative performance, e.g. by adapting to changing goals and conditions. Not only for the sake of generalisability, but also because of its enormous transformative potential, we stage our research in the domain of robot ensembles that are typically composed of several quad-rotors and that organise themselves to fulfil spatial tasks such as maintenance of building facades or the collaborative search for mobile targets. Our elaborations detail the architectural concept, provide examples of individual self-optimisation as well as of the optimisation of collaborative efforts, and we show how the user can control the ensembles at multiple levels of abstraction. We conclude with a summary of our approach and an outlook on possible future steps.

  8. A computational approach to climate science education with CLIMLAB

    Science.gov (United States)

    Rose, B. E. J.

    2017-12-01

    CLIMLAB is a Python-based software toolkit for interactive, process-oriented climate modeling for use in education and research. It is motivated by the need for simpler tools and more reproducible workflows with which to "fill in the gaps" between blackboard-level theory and the results of comprehensive climate models. With CLIMLAB you can interactively mix and match physical model components, or combine simpler process models together into a more comprehensive model. I use CLIMLAB in the classroom to put models in the hands of students (undergraduate and graduate), and emphasize a hierarchical, process-oriented approach to understanding the key emergent properties of the climate system. CLIMLAB is equally a tool for climate research, where the same needs exist for more robust, process-based understanding and reproducible computational results. I will give an overview of CLIMLAB and an update on recent developments, including: (i) a full-featured, well-documented, interactive implementation of a widely used radiation model (RRTM); (ii) packaging with conda-forge for compiler-free (and hassle-free!) installation on Mac, Windows and Linux; (iii) interfacing with xarray for I/O and graphics with gridded model data; and (iv) a rich and growing collection of examples and self-computing lecture notes in Jupyter notebook format.
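
    To give a flavor of the workflow described, a minimal climlab session might look like the following. The calls (climlab.EBM, integrate_years, global_mean) follow the package's documented interface as best recalled here; consult the climlab documentation if the API has changed.

      # Build and run a simple 1D energy balance model with climlab.
      import climlab

      # A pre-configured meridional EBM: a hierarchy of coupled subprocesses
      # (radiation, albedo, heat diffusion) assembled into one model object.
      ebm = climlab.EBM(num_lat=90)
      print(ebm)                          # lists the attached subprocesses

      ebm.integrate_years(5.0)            # step forward to near-equilibrium
      print("global mean surface temperature:", climlab.global_mean(ebm.Ts))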

  9. Holiday fun with soft gluons

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Emissions of soft gluons from energetic particles play an important role in collider processes. While the basic physics of soft emissions is simple, it gives rise to a variety of interesting and intricate phenomena (non-global logs, Glauber phases, super-leading logs, factorization breaking). After an introduction, I will review progress in resummation methods such as Soft-Collinear Effective Theory driven by a better understanding of soft emissions. I will also show some new results for computations of soft-gluon effects in gap-between-jets and isolation-cone cross sections.

  10. Towards scalable quantum communication and computation: Novel approaches and realizations

    Science.gov (United States)

    Jiang, Liang

    Quantum information science involves exploration of fundamental laws of quantum mechanics for information processing tasks. This thesis presents several new approaches towards scalable quantum information processing. First, we consider a hybrid approach to scalable quantum computation, based on an optically connected network of few-qubit quantum registers. Specifically, we develop a novel scheme for scalable quantum computation that is robust against various imperfections. To justify that nitrogen-vacancy (NV) color centers in diamond can be a promising realization of the few-qubit quantum register, we show how to isolate a few proximal nuclear spins from the rest of the environment and use them for the quantum register. We also demonstrate experimentally that the nuclear spin coherence is only weakly perturbed under optical illumination, which allows us to implement quantum logical operations that use the nuclear spins to assist the repetitive-readout of the electronic spin. Using this technique, we demonstrate more than two-fold improvement in signal-to-noise ratio. Apart from direct application to enhance the sensitivity of the NV-based nano-magnetometer, this experiment represents an important step towards the realization of robust quantum information processors using electronic and nuclear spin qubits. We then study realizations of quantum repeaters for long distance quantum communication. Specifically, we develop an efficient scheme for quantum repeaters based on atomic ensembles. We use dynamic programming to optimize various quantum repeater protocols. In addition, we propose a new protocol of quantum repeater with encoding, which efficiently uses local resources (about 100 qubits) to identify and correct errors, to achieve fast one-way quantum communication over long distances. Finally, we explore quantum systems with topological order. Such systems can exhibit remarkable phenomena such as quasiparticles with anyonic statistics and have been proposed as

  11. A computational approach to finding novel targets for existing drugs.

    Directory of Open Access Journals (Sweden)

    Yvonne Y Li

    2011-09-01

    Repositioning existing drugs for new therapeutic uses is an efficient approach to drug discovery. We have developed a computational drug-repositioning pipeline to perform large-scale molecular docking of small-molecule drugs against protein drug targets, in order to map the drug-target interaction space and find novel interactions. Our method emphasizes removing false-positive interaction predictions using criteria from known-interaction docking, consensus scoring, and specificity. In all, our database contains 252 human protein drug targets that we classify as reliable-for-docking, as well as 4621 approved and experimental small-molecule drugs from DrugBank. These were cross-docked, then filtered through stringent scoring criteria to select top drug-target interactions. In particular, we used MAPK14 and the kinase inhibitor BIM-8 as examples where our stringent thresholds enriched the predicted drug-target interactions with known interactions up to 20 times compared to standard score thresholds. We validated nilotinib as a potent MAPK14 inhibitor in vitro (IC50 = 40 nM), suggesting a potential use for this drug in treating inflammatory diseases. The published literature indicated experimental evidence for 31 of the top predicted interactions, highlighting the promising nature of our approach. Novel interactions discovered may lead to a drug being repositioned as a therapeutic treatment for its off-target's associated disease, added insight into the drug's mechanism of action, and added insight into the drug's side effects.

  12. Computer-Aided Approaches for Targeting HIVgp41

    Directory of Open Access Journals (Sweden)

    William J. Allen

    2012-08-01

    Virus-cell fusion is the primary means by which the human immunodeficiency virus-1 (HIV) delivers its genetic material into the human T-cell host. Fusion is mediated in large part by the viral glycoprotein 41 (gp41), which advances through four distinct conformational states: (i) native, (ii) pre-hairpin intermediate, (iii) fusion active (fusogenic), and (iv) post-fusion. The pre-hairpin intermediate is a particularly attractive step for therapeutic intervention given that the gp41 N-terminal heptad repeat (NHR) and C-terminal heptad repeat (CHR) domains are transiently exposed prior to the formation of a six-helix bundle required for fusion. Most peptide-based inhibitors, including the FDA-approved drug T20, target the intermediate, and there are significant efforts to develop small-molecule alternatives. Here, we review current approaches to studying interactions of inhibitors with gp41 with an emphasis on atomic-level computer modeling methods including molecular dynamics, free energy analysis, and docking. Atomistic modeling yields a unique level of structural and energetic detail, complementary to experimental approaches, which will be important for the design of improved next-generation anti-HIV drugs.

  13. Computed tomography of the lung. A pattern approach. 2. ed.

    International Nuclear Information System (INIS)

    Verschakelen, Johny A.; Wever, Walter de

    2018-01-01

    Computed Tomography of the Lung: A Pattern Approach aims to enable the reader to recognize and understand the CT signs of lung diseases and diseases with pulmonary involvement as a sound basis for diagnosis. After an introductory chapter, basic anatomy and its relevance to the interpretation of CT appearances is discussed. Advice is then provided on how to approach a CT scan of the lungs, and the different distribution and appearance patterns of disease are described. Subsequent chapters focus on the nature of these patterns, identify which diseases give rise to them, and explain how to differentiate between the diseases. The concluding chapter presents a large number of typical and less typical cases that will help the reader to practice application of the knowledge gained from the earlier chapters. Since the first edition, the book has been adapted and updated, with the inclusion of many new figures and case studies. It will be an invaluable asset both for radiologists and pulmonologists in training and for more experienced specialists wishing to update their knowledge.

  14. Optical computing - an alternate approach to trigger processing

    International Nuclear Information System (INIS)

    Cleland, W.E.

    1981-01-01

    The enormous rate-reduction factors required by most ISABELLE experiments suggest that we should examine every conceivable approach to trigger processing. One approach that has not received much attention from high-energy physicists is optical data processing. The past few years have seen rapid advances in optoelectronic technology, stimulated mainly by the military and the communications industry. An intriguing question is whether one can utilize this technology, together with the optical computing techniques developed over the past two decades, to build a rapid trigger processor for high-energy physics experiments. Optical data processing is a method for performing a few very specialized operations on data which is inherently two-dimensional. Typical operations are the formation of convolution or correlation integrals between the input data and information stored in the processor in the form of an optical filter. Optical processors are classed as coherent or incoherent, according to the spatial coherence of the input wavefront. Typically, in a coherent processor a laser beam is modulated with a photographic transparency which represents the input data. In an incoherent processor, the input may be an incoherently illuminated transparency, but self-luminous objects, such as an oscilloscope trace, have also been used. We consider here an incoherent processor in which the input data is converted into an optical wavefront through the excitation of an array of point sources - either light-emitting diodes or injection lasers
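
    The correlation integral at the heart of such a processor has a direct digital analogue, which helps make the idea concrete: correlate a two-dimensional hit pattern with a stored template (the "optical filter") and look for a peak. Pattern and template below are synthetic.

      # Digital analogue of the optical correlation operation.
      import numpy as np
      from scipy.signal import correlate2d

      rng = np.random.default_rng(0)
      image = (rng.random((64, 64)) < 0.02).astype(float)   # sparse detector hits
      track = np.eye(5)                                     # template: a short diagonal track
      image[30:35, 40:45] += track                          # bury a real track in the noise

      score = correlate2d(image, track, mode='same')        # correlation integral
      peak = np.unravel_index(score.argmax(), score.shape)
      print("template center found near:", peak)            # approximately (32, 42)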

  15. COMPUTER APPROACHES TO WHEAT HIGH-THROUGHPUT PHENOTYPING

    Directory of Open Access Journals (Sweden)

    Afonnikov D.

    2012-08-01

    The growing need for rapid and accurate approaches to large-scale assessment of phenotypic characters in plants is becoming increasingly obvious in studies of the relationships between genotype and phenotype. This need is due to the advent of high-throughput methods for the analysis of genomes. Nowadays, any genetic experiment involves data on thousands or tens of thousands of plants. Traditional ways of assessing most phenotypic characteristics (those relying on the eye, the touch, the ruler) are of little use on samples of such sizes. Modern approaches seek to take advantage of automated phenotyping, which provides much more rapid data acquisition, higher accuracy in the assessment of phenotypic features, measurement of new parameters of these features, and the exclusion of human subjectivity from the process. Additionally, automation allows measurement data to be rapidly loaded into computer databases, which reduces data processing time. In this work, we present the WheatPGE information system designed to solve the problem of integrating genotypic and phenotypic data and parameters of the environment, as well as to analyze the relationships between genotype and phenotype in wheat. The system is used to consolidate miscellaneous data on a plant for storing and processing various morphological traits and genotypes of wheat plants as well as data on various environmental factors. The system is available at www.wheatdb.org. Its potential in genetic experiments has been demonstrated in high-throughput phenotyping of wheat leaf pubescence.

  16. An evolutionary computation approach to examine functional brain plasticity

    Directory of Open Access Journals (Sweden)

    Arnab eRoy

    2016-04-01

    One common research goal in systems neuroscience is to understand how the functional relationship between a pair of regions of interest (ROIs) evolves over time. Examining neural connectivity in this way is well suited to the study of developmental processes, learning, and even recovery or treatment designs in response to injury. For most fMRI-based studies, the strength of the functional relationship between two ROIs is defined as the correlation between the average signals representing each region. The drawback of this approach is that much information is lost by averaging heterogeneous voxels, and therefore functional relationships between an ROI pair that evolve at a spatial scale much finer than the ROIs remain undetected. To address this shortcoming, we introduce a novel evolutionary computation (EC) based voxel-level procedure to examine functional plasticity between an investigator-defined ROI pair by simultaneously using subject-specific BOLD-fMRI data collected from two sessions separated by a finite duration of time. This data-driven procedure detects a sub-region composed of spatially connected voxels from each ROI (a so-called sub-regional pair) such that the pair shows a significant gain or loss of functional relationship strength across the two time points. The procedure is recursive and iteratively finds all statistically significant sub-regional pairs within the ROIs. Using this approach, we examine functional plasticity between the default mode network (DMN) and the executive control network (ECN) during recovery from traumatic brain injury (TBI); the study includes 14 TBI and 12 healthy control subjects. We demonstrate that the EC-based procedure is able to detect functional plasticity where a traditional averaging-based approach fails. The subject-specific plasticity estimates obtained using the EC procedure are highly consistent across multiple runs. Group-level analyses using these plasticity estimates showed an increase in
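
    A deliberately simplified toy version of this search is sketched below: a (1+1) evolutionary strategy mutates voxel-membership masks of two ROIs to maximize the change in mean-signal correlation between two sessions. The spatial-connectivity constraint, the recursive splitting, and the significance testing of the actual procedure are omitted, and the data are synthetic.

      # Toy evolutionary search for sub-region pairs with maximal plasticity.
      import numpy as np

      rng = np.random.default_rng(0)
      V, T = 30, 120                                   # voxels per ROI, time points
      s1 = [rng.normal(size=(V, T)) for _ in range(2)] # session 1: ROI A, ROI B
      s2 = [rng.normal(size=(V, T)) for _ in range(2)] # session 2

      def corr(maskA, maskB, sess):
          a = sess[0][maskA].mean(0); b = sess[1][maskB].mean(0)
          return np.corrcoef(a, b)[0, 1]

      def fitness(masks):
          mA, mB = masks
          if mA.sum() < 3 or mB.sum() < 3:             # require a minimal sub-region
              return -np.inf
          return abs(corr(mA, mB, s2) - corr(mA, mB, s1))

      masks = [rng.random(V) < 0.5, rng.random(V) < 0.5]
      best = fitness(masks)
      for _ in range(2000):                            # (1+1) evolutionary strategy
          trial = [m.copy() for m in masks]
          which, j = rng.integers(2), rng.integers(V)
          trial[which][j] = ~trial[which][j]           # flip one voxel's membership
          f = fitness(trial)
          if f >= best:
              masks, best = trial, f
      print("max |correlation gain/loss| found:", round(best, 3))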

  17. A machine-learning approach for computation of fractional flow reserve from coronary computed tomography.

    Science.gov (United States)

    Itu, Lucian; Rapaka, Saikiran; Passerini, Tiziano; Georgescu, Bogdan; Schwemmer, Chris; Schoebinger, Max; Flohr, Thomas; Sharma, Puneet; Comaniciu, Dorin

    2016-07-01

    Fractional flow reserve (FFR) is a functional index quantifying the severity of coronary artery lesions and is clinically obtained using an invasive, catheter-based measurement. Recently, physics-based models have shown great promise in being able to noninvasively estimate FFR from patient-specific anatomical information, e.g., obtained from computed tomography scans of the heart and the coronary arteries. However, these models have high computational demand, limiting their clinical adoption. In this paper, we present a machine-learning-based model for predicting FFR as an alternative to physics-based approaches. The model is trained on a large database of synthetically generated coronary anatomies, where the target values are computed using the physics-based model. The trained model predicts FFR at each point along the centerline of the coronary tree, and its performance was assessed by comparing the predictions against physics-based computations and against invasively measured FFR for 87 patients and 125 lesions in total. Correlation between machine-learning and physics-based predictions was excellent (0.9994, P < 0.001). Against invasive measurements, the machine-learning algorithm achieved a sensitivity of 81.6%, a specificity of 83.9%, and an accuracy of 83.2%; the correlation was 0.729 (P < 0.001), enabling near-real-time assessment of FFR. Average execution time went down from 196.3 ± 78.5 s for the CFD model to ∼2.4 ± 0.44 s for the machine-learning model on a workstation with a 3.4-GHz Intel i7 8-core processor. Copyright © 2016 the American Physiological Society.
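
    The surrogate-modeling strategy described here can be sketched schematically: generate synthetic anatomies, label them with a (here mocked) physics-based model, and train a fast regressor on the results. The features and the mock "CFD" formula below are invented stand-ins, not the paper's descriptors or model.

      # Train a fast surrogate for an expensive physics-based FFR computation.
      import numpy as np
      from sklearn.ensemble import GradientBoostingRegressor
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      n = 5000
      stenosis = rng.uniform(0.0, 0.9, n)          # fractional diameter reduction
      length = rng.uniform(5, 40, n)               # lesion length, mm
      flow = rng.uniform(0.5, 3.0, n)              # resting flow scale

      def mock_cfd_ffr(s, L, q):
          # placeholder for the expensive physics-based computation
          dp = q * (0.02 * L) * (1 / (1 - s) ** 2 - 1) + 0.05 * q**2 * s
          return np.clip(1.0 - dp / 10.0, 0.0, 1.0)

      X = np.column_stack([stenosis, length, flow])
      y = mock_cfd_ffr(stenosis, length, flow) + rng.normal(0, 0.01, n)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      model = GradientBoostingRegressor().fit(X_tr, y_tr)
      pred = model.predict(X_te)
      print("correlation with physics-based labels:",
            np.corrcoef(pred, y_te)[0, 1].round(4))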

  18. A Representation-Theoretic Approach to Reversible Computation with Applications

    DEFF Research Database (Denmark)

    Maniotis, Andreas Milton

    Reversible computing is a sub-discipline of computer science that helps to understand the foundations of the interplay between physics, algebra, and logic in the context of computation. Its subjects of study are computational devices and abstract models of computation that satisfy the constraint of information conservation. Such machine models, which are known as reversible models of computation, have been examined both from a theoretical perspective and from an engineering perspective. While a bundle of many isolated successful findings and applications concerning reversible computing exists, there is still no uniform and consistent theory that is general in the sense of giving a model-independent account to the field.

  19. Soft leptogenesis

    International Nuclear Information System (INIS)

    D'Ambrosio, Giancarlo; Giudice, Gian F.; Raidal, Martti

    2003-01-01

    We study 'soft leptogenesis', a new mechanism of leptogenesis which does not require flavour mixing among the right-handed neutrinos. Supersymmetry soft-breaking terms give a small mass splitting between the CP-even and CP-odd right-handed sneutrino states of a single generation and provide a CP-violating phase sufficient to generate a lepton asymmetry. The mechanism is successful if the lepton-violating soft bilinear coupling is unconventionally (but not unnaturally) small. The values of the right-handed neutrino masses predicted by soft leptogenesis can be low enough to evade the cosmological gravitino problem

  20. A computationally efficient approach for template matching-based ...

    Indian Academy of Sciences (India)

    In this paper, a new computationally efficient image registration method is … the proposed method requires less computational time as compared to traditional methods.

  1. Comparison of distal soft-tissue procedures combined with a distal chevron osteotomy for moderate to severe hallux valgus: first web-space versus transarticular approach.

    Science.gov (United States)

    Park, Yu-Bok; Lee, Keun-Bae; Kim, Sung-Kyu; Seon, Jong-Keun; Lee, Jun-Young

    2013-11-06

    There are two surgical approaches to distal soft-tissue procedures for the correction of hallux valgus: the dorsal first web-space approach and the medial transarticular approach. The purpose of this study was to compare the outcomes achieved after use of either of these approaches combined with a distal chevron osteotomy in patients with moderate to severe hallux valgus. One hundred and twenty-two female patients (122 feet) who underwent a distal chevron osteotomy as part of a distal soft-tissue procedure for the treatment of symptomatic unilateral moderate to severe hallux valgus constituted the study cohort. The 122 feet were randomly divided into two groups: a dorsal first web-space approach (group D; sixty feet) and a medial transarticular approach (group M; sixty-two feet). The clinical and radiographic results of the two groups were compared at a mean follow-up of thirty-eight months. The American Orthopaedic Foot & Ankle Society (AOFAS) hallux metatarsophalangeal-interphalangeal scores improved from a mean (and standard deviation) of 55.5 ± 12.8 points preoperatively to 93.5 ± 6.3 points at the final follow-up in group D, and from 54.9 ± 12.6 points preoperatively to 93.6 ± 6.2 points at the final follow-up in group M. The mean hallux valgus angle in groups D and M was reduced from 32.2° ± 6.3° and 33.1° ± 8.4° preoperatively to 10.5° ± 5.5° and 9.9° ± 5.5°, respectively, at the final follow-up. The mean first intermetatarsal angle in groups D and M was reduced from 15.0° ± 2.8° and 15.3° ± 2.7° preoperatively to 6.5° ± 2.2° and 6.3° ± 2.4°, respectively, at the final follow-up. The clinical and radiographic outcomes were not significantly different between the two groups. The final clinical and radiographic outcomes of the two approaches to distal soft-tissue procedures were comparable and equally successful. Accordingly, the results of this study suggest that the medial transarticular

  2. Computational approach for a pair of bubble coalescence process

    International Nuclear Information System (INIS)

    Nurul Hasan; Zalinawati binti Zakaria

    2011-01-01

    The coalescence of bubbles is of great value in mineral recovery and the oil industry. In this paper, two co-axial bubbles rising in a cylinder are modelled to study bubble coalescence for four computational test cases. The Reynolds number (Re) is chosen between 8.50 and 10, the Bond number (Bo) between ~4.25 and 50, and the Morton number (M) between 0.0125 and 14.7. The viscosity ratio (μr) and density ratio (ρr) of liquid to bubble are kept constant (100 and 850, respectively). It was found that the Bo number has a significant effect on the coalescence process for constant Re, μr and ρr. The bubble-bubble distance over time was validated against published experimental data. The results show that the VOF approach can model these phenomena accurately. The surface tension was changed to alter Bo, and the density of the fluids to alter Re and M, keeping μr and ρr the same. It was found that for lower Bo, bubble coalescence is slower and the pocket at the lower part of the leading bubble is less concave (pointing downward), which is supported by the experimental data.
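
    For reference, the dimensionless groups quoted above follow from the fluid properties and bubble scale; the small helper below computes them using their standard definitions. The property values are placeholders, not the paper's cases.

      # Dimensionless groups characterizing a rising bubble.
      def bubble_numbers(rho_l, mu_l, sigma, g, d, u):
          """Reynolds, Bond (Eotvos) and Morton numbers (liquid density dominant)."""
          Re = rho_l * u * d / mu_l              # inertia vs viscosity
          Bo = rho_l * g * d**2 / sigma          # gravity vs surface tension
          Mo = g * mu_l**4 / (rho_l * sigma**3)  # fluid-property group
          return Re, Bo, Mo

      Re, Bo, Mo = bubble_numbers(rho_l=1000.0, mu_l=0.5, sigma=0.07,
                                  g=9.81, d=0.01, u=0.25)
      print(f"Re = {Re:.2f}, Bo = {Bo:.1f}, Mo = {Mo:.4f}")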

  3. Computational Approach for Epitaxial Polymorph Stabilization through Substrate Selection

    Energy Technology Data Exchange (ETDEWEB)

    Ding, Hong; Dwaraknath, Shyam S.; Garten, Lauren; Ndione, Paul; Ginley, David; Persson, Kristin A.

    2016-05-25

    With the ultimate goal of finding new polymorphs through targeted synthesis conditions and techniques, we outline a computational framework to select optimal substrates for epitaxial growth using first-principles calculations of formation energies, elastic strain energy, and topological information. To demonstrate the approach, we study the stabilization of metastable VO2 compounds, which provide a rich chemical and structural polymorph space. We find that common polymorph statistics, lattice matching, and energy-above-hull considerations recommend homostructural growth on TiO2 substrates, where the VO2 brookite phase would preferentially be grown on the a-c TiO2 brookite plane while the columbite and anatase structures favor the a-b plane on the respective TiO2 phases. Overall, we find that a model which incorporates a geometric unit-cell area match between the substrate and the target film as well as the resulting strain energy density of the film provides qualitative agreement with experimental observations for the heterostructural growth of known VO2 polymorphs: rutile, A and B phases. The minimal interfacial geometry matching and estimated strain energy criteria provide several suggestions of substrates and substrate-film orientations for the heterostructural growth of the hitherto hypothetical anatase, brookite, and columbite polymorphs. These criteria serve as preliminary guidance for experimental efforts to stabilize new materials and/or polymorphs through epitaxy. The current screening algorithm is being integrated within the Materials Project online framework, and the data are hence publicly available.
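
    A back-of-envelope version of the two screening criteria, lattice misfit and the resulting strain energy density of a coherently strained film, is sketched below. The lattice constants and elastic moduli are placeholders, and the biaxial-modulus expression M = E/(1 - ν) is a standard thin-film approximation, not the paper's full first-principles treatment.

      # Lattice misfit and biaxial strain-energy density of a coherent film.
      def misfit_strain(a_substrate, a_film):
          return (a_substrate - a_film) / a_film

      def strain_energy_density(eps, youngs_modulus, poisson):
          """Energy per unit volume for equal biaxial strain: U = E/(1-nu) * eps^2."""
          return youngs_modulus / (1.0 - poisson) * eps**2

      eps = misfit_strain(a_substrate=4.59, a_film=4.55)                 # angstroms
      U = strain_energy_density(eps, youngs_modulus=200e9, poisson=0.3)  # Pa
      print(f"misfit = {eps:.3%}, strain energy density = {U / 1e6:.1f} MJ/m^3")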

  4. Simulation of the soft-landing and adsorption of C₆₀ molecules on a graphite substrate and computation of their scanning-tunnelling-microscopy-like images

    Energy Technology Data Exchange (ETDEWEB)

    Rafii-Tabar, H. [Computational Nano-Science Research Group, Centre for Numerical Modelling and Process Analysis, School of Computing and Mathematical Sciences, University of Greenwich, Greenwich, London (United Kingdom); Jurczyszyn, L.; Stankiewicz, B. [Institute of Experimental Physics, University of Wroclaw, Wroclaw (Poland)

    2000-07-03

    A constant-temperature molecular dynamics (MD) simulation was performed to model the soft-landing and adsorption of C₆₀ molecules on a graphite substrate, with the C₆₀s treated as soft molecules and released individually towards the substrate. The intra-molecular and intra-planar covalent bonding interactions were modelled by very accurate many-body potentials, and the non-bonding forces were derived from various pairwise potentials. The simulation extended over 1.6 million time steps, covering a significant period of 160 picoseconds. The final alignment of the molecules on the surface agrees closely with that observed in an experiment based on scanning tunnelling microscopy (STM) on the same system, performed at room temperature and under ultrahigh-vacuum (UHV) conditions. Using a tungsten tip in a constant-current imaging mode, we have also computed the STM-like images of one of the adsorbed molecules using a formulation of the STM tunnelling current based on Keldysh's non-equilibrium Green function formalism. Our aim has been to search for tip-induced states, which were speculated, on the basis of another STM-based experiment performed in air, to form one of the possible origins of the extra features purported to have been observed in that experiment. We have not obtained any such states. (author)

  5. An Educational Approach to Computationally Modeling Dynamical Systems

    Science.gov (United States)

    Chodroff, Leah; O'Neal, Tim M.; Long, David A.; Hemkin, Sheryl

    2009-01-01

    Chemists have used computational science methodologies for a number of decades and their utility continues to be unabated. For this reason we developed an advanced lab in computational chemistry in which students gain understanding of general strengths and weaknesses of computation-based chemistry by working through a specific research problem.…

  6. Teaching Pervasive Computing to CS Freshmen: A Multidisciplinary Approach

    NARCIS (Netherlands)

    Silvis-Cividjian, Natalia

    2015-01-01

    Pervasive Computing is a growing area in research and commercial reality. Despite this extensive growth, there is no clear consensus on how and when to teach it to students. We report on an innovative attempt to teach this subject to first year Computer Science students. Our course combines computer

  7. Results of computer assisted mini-incision subvastus approach for total knee arthroplasty.

    Science.gov (United States)

    Turajane, Thana; Larbpaiboonpong, Viroj; Kongtharvonskul, Jatupon; Maungsiri, Samart

    2009-12-01

    The mini-incision subvastus approach is a soft-tissue-preserving approach to the knee. Its advantages include reduced blood loss, reduced pain, self-directed rehabilitation and faster recovery. However, whether its visualization, component alignment, and blood preservation suffice to achieve better outcomes and prevent early failure of Total Knee Arthroplasty (TKA) has been debated. Computer navigation has been introduced to improve alignment and reduce blood loss. The purpose of this study was to evaluate the short-term outcomes of the computer-assisted mini-incision subvastus approach for Total Knee Arthroplasty (CMS-TKA). A prospective case series of the initial 80 patients who underwent CMS-TKA from January 2007 to October 2008 was carried out. The patients' conditions were classified into 2 groups: simple OA knee (varus deformity less than 15 degrees, BMI less than 20%, no associated deformities; group 1) and complex deformity (varus deformity more than 15 degrees, BMI more than 20%, associated flexion contracture; group 2). There were 59 patients in group 1 and 21 patients in group 2. Of the 80 knees, 38 were on the left and 42 on the right. The results of CMS-TKA [mean (range), group 1 : group 2] were as follows: incision length, 10.88 (8-13) : 11.92 (10-14); operation time, 118 (111.88-125.12) : 131 (119.29-143.71) minutes; lateral releases, 0 in both groups; postoperative range of motion, 94.5 (90-100) : 95.25 (90-105) degrees in flexion and 1.75 (0-5) : 1.5 (0-5) degrees in extension; blood loss in 24 hours, 489.09 (414.7-563.48) : 520 (503.46-636.54) ml; blood transfusion, 1 (0-1) unit in both groups; preoperative tibiofemoral angle, varus 4 (0-10) : varus 17.14 (15.7-18.5) degrees; postoperative tibiofemoral angle, valgus 1.38 (0-4) : valgus 2.85 (2.1-3.5) degrees; tibiofemoral angle outlier (85% both

  8. Human Computation An Integrated Approach to Learning from the Crowd

    CERN Document Server

    Law, Edith

    2011-01-01

    Human computation is a new and evolving research area that centers around harnessing human intelligence to solve computational problems that are beyond the scope of existing Artificial Intelligence (AI) algorithms. With the growth of the Web, human computation systems can now leverage the abilities of an unprecedented number of people via the Web to perform complex computation. Various genres of human computation applications exist today. Games with a purpose (e.g., the ESP Game) specifically target online gamers who generate useful data (e.g., image tags) while playing an enjoy

  9. An exploratory study of contrast agents for soft tissue visualization by means of high resolution X-ray computed tomography imaging.

    Science.gov (United States)

    Pauwels, E; Van Loo, D; Cornillie, P; Brabant, L; Van Hoorebeke, L

    2013-04-01

    High-resolution X-ray computed tomography (CT), or microCT, is a promising and already widely used technique in various scientific fields. It also has great potential for histological purposes. Although microCT has proven to be a valuable technique for imaging bone structures, the visualization of soft-tissue structures is still an important challenge due to their low inherent X-ray contrast. One way to achieve contrast enhancement is to use contrast agents. However, contrary to light and electron microscopy, knowledge about contrast agents and staining procedures is limited for X-ray CT. The purpose of this paper is to identify useful X-ray contrast agents for soft-tissue visualization which can be applied in a simple way and are also suited to samples larger than (1 cm)³. In total, 28 chemical substances were investigated. All chemicals were applied in the form of concentrated aqueous solutions in which the samples were immersed. First, strips of green bacon were stained to evaluate contrast enhancement between muscle and adipose tissue. It was also tested whether the contrast agents remained fixed in the tissue after staining by re-immersing the samples in water. Based on the results, 12 contrast agents were selected for further testing on postmortem mouse hind legs, containing a variety of different tissues, including muscle, fat, bone, cartilage and tendons. It was evaluated whether the contrast agents allowed a clearer distinction between the different soft-tissue structures present. Finally, penetration depth was also measured. Of the 28 substances, 26 resulted in contrast enhancement between muscle and adipose tissue in the bacon strips. Mercury(II) chloride (HgCl2), phosphotungstic acid (PTA), phosphomolybdic acid (PMA) and ammonium orthomolybdate ((NH4)2MoO4) remained fixed after re-immersion in water. The penetration tests showed that potassium iodide (KI) and sodium tungstate can be most efficiently used for large samples of the order

  10. A new approach in development of data flow control and investigation system for computer networks

    International Nuclear Information System (INIS)

    Frolov, I.; Vaguine, A.; Silin, A.

    1992-01-01

    This paper describes a new approach to the development of a data flow control and investigation system for computer networks. The approach was developed and applied at the Moscow Radiotechnical Institute for control and investigation of the Institute's computer network, and it allowed us to solve the network's current problems successfully. A description of our approach is presented below, along with the most interesting results of our work. (author)

  11. From Soft Sculpture to Soft Robotics: Retracing a Physical Aesthetics of Bio-Morphic Softness

    DEFF Research Database (Denmark)

    Jørgensen, Jonas

    2017-01-01

    Soft robotics has in the past decade emerged as a growing subfield of technical robotics research, distinguishable by its bio-inspired design strategies, interest in morphological computation, and interdisciplinary combination of insights from engineering, computer science, biology and material science. Recently, soft robotics technology has also started to make its way into art, design, and architecture. This paper attempts to think an aesthetics of softness and the life-like through an artistic tradition deeply imbricated with an interrogation of softness and its physical substrates, namely the soft sculpture that started proliferating in the late 1960s. Critical descriptions of these works, interestingly, frequently emphasize their similarities with living organisms and bodies as a central tenet of their aesthetics. The paper seeks to articulate aspects of a contiguity between softness...

  12. Soft systems methodology as a potential approach to understanding non-motorised transport users in South Africa

    CSIR Research Space (South Africa)

    Van Rooyen, CE

    2016-07-01

    Full Text Available The purpose of this paper is to show the potential of using systems thinking, and more particularly Soft Systems Methodology (SSM), as a practical and beneficial instrument to guide BEPDPs through the ongoing learning process of understanding NMT users and their specific...

  13. Interdisciplinary approach to enhance the esthetics of maxillary anterior region using soft- and hard-tissue ridge augmentation in conjunction with a fixed partial prosthesis.

    Science.gov (United States)

    Khetarpal, Shaleen; Chouksey, Ajay; Bele, Anand; Vishnoi, Rahul

    2018-01-01

    Favorable esthetics is one of the most important treatment outcomes in dentistry, and interdisciplinary approaches are often required to achieve it. Ridge deficiencies can be corrected for both soft- and hard-tissue discrepancies. To overcome such defects, not only are a variety of prosthetic options at our disposal, but several periodontal plastic surgical techniques are available as well. Various techniques have been described and revised over the years to correct ridge defects. For enhancing soft-tissue contours in the anterior region, the subepithelial connective tissue graft is the treatment of choice. A combination of alloplastic bone graft in adjunct to connective tissue graft optimizes ridge augmentation and minimizes defects. The present case report describes the use of a vascular interpositional connective tissue graft in combination with an alloplastic bone graft for correction of a Seibert's Class III ridge deficiency, followed by a fixed partial prosthesis to achieve a better esthetic outcome.

  14. Interdisciplinary approach to enhance the esthetics of maxillary anterior region using soft- and hard-tissue ridge augmentation in conjunction with a fixed partial prosthesis

    Directory of Open Access Journals (Sweden)

    Shaleen Khetarpal

    2018-01-01

    Full Text Available Favorable esthetics is one of the most important treatment outcomes in dentistry, and interdisciplinary approaches are often required to achieve it. Ridge deficiencies can be corrected for both soft- and hard-tissue discrepancies. To overcome such defects, not only are a variety of prosthetic options at our disposal, but several periodontal plastic surgical techniques are available as well. Various techniques have been described and revised over the years to correct ridge defects. For enhancing soft-tissue contours in the anterior region, the subepithelial connective tissue graft is the treatment of choice. A combination of alloplastic bone graft in adjunct to connective tissue graft optimizes ridge augmentation and minimizes defects. The present case report describes the use of a vascular interpositional connective tissue graft in combination with an alloplastic bone graft for correction of a Seibert's Class III ridge deficiency, followed by a fixed partial prosthesis to achieve a better esthetic outcome.

  15. A comparison of monthly precipitation point estimates at 6 locations in Iran using integration of soft computing methods and GARCH time series model

    Science.gov (United States)

    Mehdizadeh, Saeid; Behmanesh, Javad; Khalili, Keivan

    2017-11-01

    Precipitation plays an important role in determining the climate of a region. Precise estimation of precipitation is required to manage and plan water resources, as well as other related applications such as hydrology, climatology, meteorology and agriculture. Time series of hydrologic variables such as precipitation are composed of deterministic and stochastic parts. Despite this fact, the stochastic part of the precipitation data is not usually considered in the modeling of the precipitation process. As an innovation, the present study introduces three new hybrid models by integrating soft computing methods including multivariate adaptive regression splines (MARS), Bayesian networks (BN) and gene expression programming (GEP) with a time series model, namely generalized autoregressive conditional heteroscedasticity (GARCH), for modeling of the monthly precipitation. For this purpose, the deterministic (obtained by soft computing methods) and stochastic (obtained by the GARCH time series model) parts are combined with each other. To carry out this research, monthly precipitation data of the Babolsar, Bandar Anzali, Gorgan, Ramsar, Tehran and Urmia stations, with different climates in Iran, were used for the period 1965-2014. Root mean square error (RMSE), relative root mean square error (RRMSE), mean absolute error (MAE) and the coefficient of determination (R2) were employed to evaluate the performance of the conventional/single MARS, BN and GEP models, as well as the proposed MARS-GARCH, BN-GARCH and GEP-GARCH hybrid models. It was found that the proposed novel models are more precise than the single MARS, BN and GEP models. Overall, MARS-GARCH and BN-GARCH models yielded better accuracy than GEP-GARCH. The results of the present study confirmed the suitability of the proposed methodology for precise modeling of precipitation.
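
    As a concrete illustration of the evaluation criteria named in this record, the sketch below computes RMSE, RRMSE, MAE and R2 for an observed/predicted pair using the standard formulas, and notes schematically how the deterministic and GARCH parts would combine; all names are illustrative, not from the paper.

    ```python
    import numpy as np

    def evaluation_metrics(obs, pred):
        """RMSE, RRMSE, MAE and R2, computed with the standard formulas."""
        obs, pred = np.asarray(obs, float), np.asarray(pred, float)
        resid = obs - pred
        rmse = np.sqrt(np.mean(resid ** 2))
        rrmse = rmse / np.mean(obs)        # RMSE relative to the mean observation
        mae = np.mean(np.abs(resid))
        r2 = 1.0 - np.sum(resid ** 2) / np.sum((obs - obs.mean()) ** 2)
        return {"RMSE": rmse, "RRMSE": rrmse, "MAE": mae, "R2": r2}

    # Hybrid idea in one line: a deterministic model (MARS/BN/GEP) predicts the
    # mean, and a GARCH model fitted to its residuals supplies the stochastic part:
    # final_prediction = deterministic_part + garch_forecast_of_residuals
    print(evaluation_metrics([2.0, 0.5, 3.1], [1.8, 0.7, 3.0]))
    ```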

  16. Diffusible iodine-based contrast-enhanced computed tomography (diceCT): an emerging tool for rapid, high-resolution, 3-D imaging of metazoan soft tissues.

    Science.gov (United States)

    Gignac, Paul M; Kley, Nathan J; Clarke, Julia A; Colbert, Matthew W; Morhardt, Ashley C; Cerio, Donald; Cost, Ian N; Cox, Philip G; Daza, Juan D; Early, Catherine M; Echols, M Scott; Henkelman, R Mark; Herdina, A Nele; Holliday, Casey M; Li, Zhiheng; Mahlow, Kristin; Merchant, Samer; Müller, Johannes; Orsbon, Courtney P; Paluh, Daniel J; Thies, Monte L; Tsai, Henry P; Witmer, Lawrence M

    2016-06-01

    Morphologists have historically had to rely on destructive procedures to visualize the three-dimensional (3-D) anatomy of animals. More recently, however, non-destructive techniques have come to the forefront. These include X-ray computed tomography (CT), which has been used most commonly to examine the mineralized, hard-tissue anatomy of living and fossil metazoans. One relatively new and potentially transformative aspect of current CT-based research is the use of chemical agents to render visible, and differentiate between, soft-tissue structures in X-ray images. Specifically, iodine has emerged as one of the most widely used of these contrast agents among animal morphologists due to its ease of handling, cost effectiveness, and differential affinities for major types of soft tissues. The rapid adoption of iodine-based contrast agents has resulted in a proliferation of distinct specimen preparations and scanning parameter choices, as well as an increasing variety of imaging hardware and software preferences. Here we provide a critical review of the recent contributions to iodine-based, contrast-enhanced CT research to enable researchers just beginning to employ contrast enhancement to make sense of this complex new landscape of methodologies. We provide a detailed summary of recent case studies, assess factors that govern success at each step of the specimen storage, preparation, and imaging processes, and make recommendations for standardizing both techniques and reporting practices. Finally, we discuss potential cutting-edge applications of diffusible iodine-based contrast-enhanced computed tomography (diceCT) and the issues that must still be overcome to facilitate the broader adoption of diceCT going forward. © 2016 The Authors. Journal of Anatomy published by John Wiley & Sons Ltd on behalf of Anatomical Society.

  17. Mutations that Cause Human Disease: A Computational/Experimental Approach

    Energy Technology Data Exchange (ETDEWEB)

    Beernink, P; Barsky, D; Pesavento, B

    2006-01-11

    International genome sequencing projects have produced billions of nucleotides (letters) of DNA sequence data, including the complete genome sequences of 74 organisms. These genome sequences have created many new scientific opportunities, including the ability to identify sequence variations among individuals within a species. These genetic differences, which are known as single nucleotide polymorphisms (SNPs), are particularly important in understanding the genetic basis for disease susceptibility. Since the report of the complete human genome sequence, over two million human SNPs have been identified, including a large-scale comparison of an entire chromosome from twenty individuals. Of the protein coding SNPs (cSNPs), approximately half lead to a single amino acid change in the encoded protein (non-synonymous coding SNPs). Most of these changes are functionally silent, while the remainder negatively impact the protein and sometimes cause human disease. To date, over 550 SNPs have been found to cause single locus (monogenic) diseases and many others have been associated with polygenic diseases. SNPs have been linked to specific human diseases, including late-onset Parkinson disease, autism, rheumatoid arthritis and cancer. The ability to predict accurately the effects of these SNPs on protein function would represent a major advance toward understanding these diseases. To date, several attempts have been made to predict the effects of such mutations. The most successful of these is a computational approach called "Sorting Intolerant From Tolerant" (SIFT). This method uses sequence conservation among many similar proteins to predict which residues in a protein are functionally important. However, this method suffers from several limitations. First, a query sequence must have a sufficient number of relatives to infer sequence conservation. Second, this method does not make use of or provide any information on protein structure, which
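
    SIFT's central ingredient is per-position sequence conservation. The toy sketch below scores alignment columns by normalized Shannon entropy; it is a generic stand-in for that idea, not the SIFT code, and the four fragment sequences are hypothetical.

    ```python
    import numpy as np
    from collections import Counter

    def column_conservation(alignment):
        """Score each column of a protein alignment by 1 - normalized Shannon
        entropy: 1.0 means perfectly conserved (likely functionally important)."""
        scores = []
        for i in range(len(alignment[0])):
            column = [seq[i] for seq in alignment]
            counts = Counter(column)
            freqs = np.array([c / len(column) for c in counts.values()])
            entropy = -np.sum(freqs * np.log2(freqs))
            max_entropy = np.log2(min(len(column), 20))  # 20 amino acids
            scores.append(1.0 - entropy / max_entropy)
        return scores

    # Toy alignment of 4 homologous fragments (hypothetical sequences):
    aln = ["MKVLA", "MKVIA", "MKVLS", "MKVMA"]
    print(column_conservation(aln))  # M, K, V conserved; last two columns vary
    ```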

  18. A computational intelligence approach to the Mars Precision Landing problem

    Science.gov (United States)

    Birge, Brian Kent, III

    Various proposed Mars missions, such as the Mars Sample Return Mission (MRSR) and the Mars Smart Lander (MSL), require precise re-entry terminal position and velocity states. This is to achieve mission objectives including rendezvous with a previously landed mission, or reaching a particular geographic landmark. The current state-of-the-art footprint is on the order of kilometers. For this research a Mars Precision Landing is achieved with a landed footprint of no more than 100 meters, for a set of initial entry conditions representing worst-guess dispersions. Obstacles to reducing the landed footprint include trajectory dispersions due to initial atmospheric entry conditions (entry angle, parachute deployment height, etc.), environment (wind, atmospheric density, etc.), parachute deployment dynamics, unavoidable injection error (propagated error from launch on), etc. Weather and atmospheric models have been developed. Three descent scenarios have been examined. First, terminal re-entry is achieved via a ballistic parachute with concurrent thrusting events while on the parachute, followed by a gravity turn. Second, terminal re-entry is achieved via a ballistic parachute followed by a gravity turn to hover and then thrust vectoring to the desired location. Third, a guided parafoil approach followed by vectored thrusting to reach terminal velocity is examined. The guided parafoil is determined to be the best architecture. The purpose of this study is to examine the feasibility of using a computational intelligence strategy to facilitate precision planetary re-entry, specifically to take an approach that is somewhat more intuitive and less rigid, and see where it leads. The test problems used for all research are variations on proposed Mars landing mission scenarios developed by NASA. A relatively recent method of evolutionary computation is Particle Swarm Optimization (PSO), which can be considered to be in the same general class as Genetic Algorithms. An improvement over
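
    Since the abstract names Particle Swarm Optimization, here is a minimal textbook global-best PSO in Python, applied to a convex test function as a stand-in for the landing-cost objective; parameters and bounds are illustrative, not the thesis code.

    ```python
    import numpy as np

    def pso(f, dim=2, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
        """Minimal global-best PSO (generic textbook form)."""
        rng = np.random.default_rng(seed)
        x = rng.uniform(-5, 5, (n_particles, dim))   # positions
        v = np.zeros_like(x)                         # velocities
        pbest, pbest_val = x.copy(), np.apply_along_axis(f, 1, x)
        gbest = pbest[pbest_val.argmin()].copy()
        for _ in range(iters):
            r1, r2 = rng.random((2, n_particles, dim))
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
            x = x + v
            vals = np.apply_along_axis(f, 1, x)
            better = vals < pbest_val
            pbest[better], pbest_val[better] = x[better], vals[better]
            gbest = pbest[pbest_val.argmin()].copy()
        return gbest, pbest_val.min()

    # Sanity check on a convex test function (stand-in for the landing cost):
    print(pso(lambda p: np.sum(p ** 2)))  # converges near the origin
    ```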

  19. Computer Forensics for Graduate Accountants: A Motivational Curriculum Design Approach

    OpenAIRE

    Grover Kearns

    2010-01-01

    Computer forensics involves the investigation of digital sources to acquire evidence that can be used in a court of law. It can also be used to identify and respond to threats to hosts and systems. Accountants use computer forensics to investigate computer crime or misuse, theft of trade secrets, theft of or destruction of intellectual property, and fraud. Education of accountants to use forensic tools is a goal of the AICPA (American Institute of Certified Public Accountants). Accounting stu...

  20. Soft tissue-preserving computer-aided impression: a novel concept using ultrasonic 3D-scanning.

    Science.gov (United States)

    Vollborn, Thorsten; Habor, Daniel; Pekam, Fabrice Chuembou; Heger, Stefan; Marotti, Juliana; Reich, Sven; Wolfart, Stefan; Tinschert, Joachim; Radermacher, Klaus

    2014-01-01

    Subgingival preparations are often affected by blood and saliva during impression taking, regardless of whether one is using compound impression techniques or intraoral digital scanning methods. The latter are currently based on optical principles and therefore also need clean and dry surfaces. In contrast, ultrasonic waves are able to non-invasively penetrate gingiva, saliva, and blood, leading to decisive advantages, as cleaning and drying of the oral cavity becomes unnecessary. In addition, the application of ultrasound may facilitate the detection of subgingival structures without invasive manipulation, thereby reducing the risk of secondary infection and treatment time, and increasing patient comfort. Ultrasound devices commonly available for medical application and for the testing of materials are only suitable to a limited extent, as their resolution, precision, and design do not fulfill the requirements for intraoral scanning. The aim of this article is to describe the development of a novel ultrasound technology that enables soft tissue-preserving digital impressions of preparations for the CAD/CAM-based production of dental prostheses. The concept and development of the high-resolution ultrasound technique and the corresponding intraoral scanning system, as well as the integration into the CAD/CAM process chain, is presented.

  1. Effective action of softly broken supersymmetric theories

    International Nuclear Information System (INIS)

    Groot Nibbelink, S.; Nyawelo, T.S.

    2006-12-01

    We study the renormalization of (softly) broken supersymmetric theories at the one loop level in detail. We perform this analysis in a superspace approach in which the supersymmetry breaking interactions are parameterized using spurion insertions. We comment on the uniqueness of this parameterization. We compute the one loop renormalization of such theories by calculating superspace vacuum graphs with multiple spurion insertions. To perform this computation efficiently we develop algebraic properties of spurion operators, which naturally arise because the spurions are often surrounded by superspace projection operators. Our results are general apart from the restrictions that higher super covariant derivative terms and some finite effects due to non-commutativity of superfield dependent mass matrices are ignored. One of the soft potentials induces renormalization of the Kaehler potential. (author)

  2. Overview of Computer Simulation Modeling Approaches and Methods

    Science.gov (United States)

    Robert E. Manning; Robert M. Itami; David N. Cole; Randy Gimblett

    2005-01-01

    The field of simulation modeling has grown greatly with recent advances in computer hardware and software. Much of this work has involved large scientific and industrial applications for which substantial financial resources are available. However, advances in object-oriented programming and simulation methodology, concurrent with dramatic increases in computer...

  3. Gesture Recognition by Computer Vision: An Integral Approach

    NARCIS (Netherlands)

    Lichtenauer, J.F.

    2009-01-01

    The fundamental objective of this Ph.D. thesis is to gain more insight into what is involved in the practical application of a computer vision system, when the conditions of use cannot be controlled completely. The basic assumption is that research on isolated aspects of computer vision often leads

  4. Thermodynamic and relative approach to compute glass-forming ...

    Indian Academy of Sciences (India)

    models) characteristic: the isobaric heat capacity (Cp) of oxides, and execute a mathematical treatment of oxides thermodynamic data. We note this coefficient as thermodynamical relative glass-forming ability (ThRGFA) and formulate a model to compute it. Computed values of 2nd, 3rd, 4th and 5th period metal oxides ...

  5. An approach to quantum-computational hydrologic inverse analysis.

    Science.gov (United States)

    O'Malley, Daniel

    2018-05-02

    Making predictions about flow and transport in an aquifer requires knowledge of the heterogeneous properties of the aquifer such as permeability. Computational methods for inverse analysis are commonly used to infer these properties from quantities that are more readily observable such as hydraulic head. We present a method for computational inverse analysis that utilizes a type of quantum computer called a quantum annealer. While quantum computing is in an early stage compared to classical computing, we demonstrate that it is sufficiently developed that it can be used to solve certain subsurface flow problems. We utilize a D-Wave 2X quantum annealer to solve 1D and 2D hydrologic inverse problems that, while small by modern standards, are similar in size and sometimes larger than hydrologic inverse problems that were solved with early classical computers. Our results and the rapid progress being made with quantum computing hardware indicate that the era of quantum-computational hydrology may not be too far in the future.
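
    Quantum annealers natively minimize quadratic unconstrained binary optimization (QUBO) problems. The sketch below shows the generic mapping of a tiny binary least-squares inverse problem to QUBO form, solved here by brute force where an annealer would sample low-energy states; the matrix A and data b are made up, and this is not the paper's D-Wave workflow.

    ```python
    import itertools
    import numpy as np

    # Map min ||A x - b||^2 over x in {0,1}^n to QUBO form min x^T Q x,
    # the native input of a quantum annealer.
    A = np.array([[1.0, 2.0, 0.0],
                  [0.0, 1.0, 1.0]])
    b = np.array([3.0, 1.0])

    Q = A.T @ A                                      # quadratic couplings
    Q[np.diag_indices_from(Q)] += -2.0 * (A.T @ b)   # linear terms: x_i^2 == x_i

    # Brute-force the 2^n cases here; an annealer samples low-energy x instead.
    best = min(itertools.product([0, 1], repeat=3),
               key=lambda x: np.array(x) @ Q @ np.array(x))
    print(best, np.sum((A @ np.array(best) - b) ** 2))  # (1, 1, 0), residual 0
    ```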

  6. Reading Emotion From Mouse Cursor Motions: Affective Computing Approach.

    Science.gov (United States)

    Yamauchi, Takashi; Xiao, Kunchen

    2018-04-01

    Affective computing research has advanced emotion recognition systems using facial expressions, voices, gaits, and physiological signals, yet these methods are often impractical. This study integrates mouse cursor motion analysis into affective computing and investigates the idea that movements of the computer cursor can provide information about the emotion of the computer user. We extracted 16-26 trajectory features during a choice-reaching task and examined the link between emotion and cursor motions. Positive or negative emotions were induced in participants by music, film clips, or emotional pictures, and participants reported their emotions with questionnaires. Our 10-fold cross-validation analysis shows that statistical models formed from "known" participants (training data) could predict nearly 10%-20% of the variance of the positive affect and attentiveness ratings of "unknown" participants, suggesting that cursor movement patterns such as the area under curve and direction change help infer the emotions of computer users. Copyright © 2017 Cognitive Science Society, Inc.
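
    Two of the trajectory features named in the abstract, area under curve and direction change, can be computed generically from a sampled cursor path; the sketch below is one plausible formulation, not the authors' feature code.

    ```python
    import numpy as np

    def cursor_features(xs, ys):
        """Area under curve (deviation from the straight start-to-end chord)
        and number of horizontal direction changes of a cursor trajectory."""
        xs, ys = np.asarray(xs, float), np.asarray(ys, float)
        p0 = np.array([xs[0], ys[0]])
        chord = np.array([xs[-1], ys[-1]]) - p0
        # Perpendicular distance of each sample from the start-end chord.
        dev = ((xs - p0[0]) * chord[1] - (ys - p0[1]) * chord[0])
        dev = np.abs(dev) / np.linalg.norm(chord)
        auc = np.sum((dev[1:] + dev[:-1]) / 2)       # trapezoid rule, unit spacing
        flips = np.sum(np.diff(np.sign(np.diff(xs))) != 0)  # x-direction changes
        return {"auc": auc, "direction_changes": int(flips)}

    # Example: a slightly curved reach from (0, 0) towards (100, 100).
    t = np.linspace(0, 1, 50)
    print(cursor_features(100 * t, 100 * t ** 2))
    ```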

  7. Human-Computer Interfaces for Wearable Computers: A Systematic Approach to Development and Evaluation

    OpenAIRE

    Witt, Hendrik

    2007-01-01

    The research presented in this thesis examines user interfaces for wearable computers.Wearable computers are a special kind of mobile computers that can be worn on the body. Furthermore, they integrate themselves even more seamlessly into different activities than a mobile phone or a personal digital assistant can.The thesis investigates the development and evaluation of user interfaces for wearable computers. In particular, it presents fundamental research results as well as supporting softw...

  8. What Computational Approaches Should be Taught for Physics?

    Science.gov (United States)

    Landau, Rubin

    2005-03-01

    The standard Computational Physics courses are designed for upper-level physics majors who already have some computational skills. We believe that it is important for first-year physics students to learn modern computing techniques that will be useful throughout their college careers, even before they have learned the math and science required for Computational Physics. Teaching such Introductory Scientific Computing courses requires that some choices be made as to what subjects and computer languages will be taught. Our survey of colleagues active in Computational Physics and Physics Education shows no predominant choice, with strong positions taken for the compiled languages Java, C, C++ and Fortran90, as well as for problem-solving environments like Maple and Mathematica. Over the last seven years we have developed an Introductory course and have written up those courses as textbooks for others to use. We will describe our model of using both a problem-solving environment and a compiled language. The developed materials are available in both Maple and Mathematica, and Java and Fortran90 (Princeton University Press, to be published; www.physics.orst.edu/~rubin/IntroBook/).

  9. Computer Tutors: An Innovative Approach to Computer Literacy. Part I: The Early Stages.

    Science.gov (United States)

    Targ, Joan

    1981-01-01

    In Part I of this two-part article, the author describes the evolution of the Computer Tutor project in Palo Alto, California, and the strategies she incorporated into a successful student-taught computer literacy program. Journal availability: Educational Computer, P.O. Box 535, Cupertino, CA 95015. (Editor/SJL)

  10. Methodical Approaches to Teaching of Computer Modeling in Computer Science Course

    Science.gov (United States)

    Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina

    2015-01-01

    The purpose of this study was to justify a technique for forming representations of modeling methodology in computer science lessons. The necessity of studying computer modeling lies in the fact that current trends toward strengthening the general-education and worldview functions of computer science define the necessity of additional research on the…

  11. PREFACE: 1st International Workshop on Theoretical and Computational Physics: Condensed Matter, Soft Matter and Materials Physics & 38th National Conference on Theoretical Physics

    Science.gov (United States)

    2014-09-01

    This volume contains selected papers presented at the 38th National Conference on Theoretical Physics (NCTP-38) and the 1st International Workshop on Theoretical and Computational Physics: Condensed Matter, Soft Matter and Materials Physics (IWTCP-1). Both the conference and the workshop were held from 29 July to 1 August 2013 in the Pullman hotel, Da Nang, Vietnam. The IWTCP-1 was a new activity of the Vietnamese Theoretical Physics Society (VTPS) organized in association with the 38th National Conference on Theoretical Physics (NCTP-38), the most well-known annual scientific forum dedicated to the dissemination of the latest developments in the field of theoretical physics within the country. The IWTCP-1 was also an External Activity of the Asia Pacific Center for Theoretical Physics (APCTP). The overriding goal of the IWTCP is to provide an international forum for scientists and engineers from academia to share ideas, problems and solutions relating to the recent advances in theoretical physics as well as in computational physics. The main IWTCP motivation is to foster scientific exchange between the Vietnamese theoretical and computational physics community and worldwide scientists, as well as to promote a high standard of research and education activities for young physicists in the country. About 110 participants from 10 countries took part in the conference and the workshop. 4 invited talks, 18 oral contributions and 46 posters were presented at the conference. In the workshop we had one keynote lecture and 9 invited talks presented by international experts in the fields of theoretical and computational physics, together with 14 oral and 33 poster contributions. The proceedings were edited by Nguyen Tri Lan, Trinh Xuan Hoang, and Nguyen Ai Viet. We would like to thank all invited speakers, participants and sponsors for making the conference and the workshop successful. Nguyen Ai Viet, Chair of NCTP-38 and IWTCP-1

  12. Food Sustainable Model Development: An ANP Approach to Prioritize Sustainable Factors in the Romanian Natural Soft Drinks Industry Context

    Directory of Open Access Journals (Sweden)

    Răzvan Cătalin Dobrea

    2015-07-01

    Full Text Available The latest developments in natural soft drinks in the Romanian market signal significant changes in consumers' perceptions of the sustainability concept. While the necessity of preserving natural resources and ensuring a decent level of healthiness seems to be steadily embraced by Romanian society, the lack of sufficiently long time series to acknowledge this shift renders a traditional econometric validation of these recent trends in economic thinking impossible. The large number of European-funded projects for upgrading technology in the Romanian natural soft drinks sector raises the question of whether a learning-by-doing effect has dispersed into Romanian managers' investment decision making from the perspective of both economic and food sustainability. This paper presents the construction and evaluation of an Analytical Network Process (ANP) market share model, which emerged from extended in-depth interviews with 10 managers of the main Romanian natural soft drinks producers. This model differs from traditional market share ANP models, since concepts like food and economic sustainability were considered as significant driving factors. The coincidence between the estimated market share and the actual one, expressed by Saaty's compatibility index, validates the model and offers comparative numerical weights of importance for food and economic sustainability.
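
    The numerical core of an ANP model is extracting priority weights from pairwise comparison matrices via Saaty's principal eigenvector. A minimal sketch follows, with an illustrative judgment matrix that is not from the paper.

    ```python
    import numpy as np

    # Priority weights from a pairwise comparison matrix via Saaty's principal
    # eigenvector method, the computation underlying AHP/ANP supermatrices.
    # The three criteria and the judgments below are hypothetical examples.
    M = np.array([[1.0, 3.0, 5.0],    # economic sustainability
                  [1/3, 1.0, 2.0],    # food sustainability
                  [1/5, 1/2, 1.0]])   # brand awareness

    eigvals, eigvecs = np.linalg.eig(M)
    k = np.argmax(eigvals.real)                      # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                     # normalized priorities
    ci = (eigvals[k].real - len(M)) / (len(M) - 1)   # Saaty consistency index
    print("weights:", w.round(3), "CI:", round(ci, 3))
    ```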

  13. A statistical approach to estimating effects of performance shaping factors on human error probabilities of soft controls

    International Nuclear Information System (INIS)

    Kim, Yochan; Park, Jinkyun; Jung, Wondea; Jang, Inseok; Hyun Seong, Poong

    2015-01-01

    Despite recent efforts toward data collection for supporting human reliability analysis, there remains a lack of empirical basis for determining the effects of performance shaping factors (PSFs) on human error probabilities (HEPs). To enhance the empirical basis regarding the effects of the PSFs, a statistical methodology using logistic regression and stepwise variable selection was proposed, and the effects of the PSFs on HEPs related to soft controls were estimated with this methodology. For this estimation, more than 600 human error opportunities related to soft controls in a computerized control room were obtained through laboratory experiments. From the eight PSF surrogates and combinations of these variables, procedure quality, practice level, and operation type were identified as significant factors for screen switch and mode conversion errors. The contributions of these significant factors to HEPs were also estimated in terms of a multiplicative form. The usefulness and limitations of the experimental data and the techniques employed are discussed herein, and we believe that the logistic regression and stepwise variable selection methods will provide a way to estimate the effects of PSFs on HEPs in an objective manner. - Highlights: • It is necessary to develop an empirical basis for the effects of the PSFs on the HEPs. • A statistical method using logistic regression and variable selection was proposed. • The effects of PSFs on the HEPs of soft controls were empirically investigated. • The significant factors were identified and their effects were estimated
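
    A minimal sketch of the statistical machinery the abstract describes: logistic regression with forward stepwise variable selection by AIC, and multiplicative effects read off as exp(beta), using statsmodels on synthetic stand-in data. The PSF names and effect sizes are illustrative, not the experimental dataset.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    # Synthetic stand-in: binary error outcomes with three candidate PSF surrogates.
    X = pd.DataFrame({"procedure_quality": rng.integers(0, 2, 600),
                      "practice_level":    rng.integers(0, 3, 600),
                      "operation_type":    rng.integers(0, 2, 600)})
    logit = -2.0 + 1.2 * X["procedure_quality"] - 0.6 * X["practice_level"]
    y = (rng.random(600) < 1 / (1 + np.exp(-logit))).astype(int)

    def forward_stepwise(X, y):
        """Greedy forward selection of predictors by AIC for a logistic model."""
        chosen, remaining = [], list(X.columns)
        best_aic = sm.Logit(y, np.ones(len(y))).fit(disp=0).aic
        while remaining:
            aics = {c: sm.Logit(y, sm.add_constant(X[chosen + [c]])).fit(disp=0).aic
                    for c in remaining}
            cand = min(aics, key=aics.get)
            if aics[cand] >= best_aic:
                break
            best_aic = aics[cand]
            chosen.append(cand)
            remaining.remove(cand)
        return chosen

    selected = forward_stepwise(X, y)
    fit = sm.Logit(y, sm.add_constant(X[selected])).fit(disp=0)
    print(selected, np.exp(fit.params))  # exp(beta): multiplicative effect on odds
    ```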

  14. A Human-Centred Tangible approach to learning Computational Thinking

    Directory of Open Access Journals (Sweden)

    Tommaso Turchi

    2016-08-01

    Full Text Available Computational Thinking has recently become a focus of many teaching and research domains; it encapsulates those thinking skills integral to solving complex problems using a computer, thus being widely applicable in our society. It is influencing research across many disciplines and also coming into the limelight of education, mostly thanks to public initiatives such as the Hour of Code. In this paper we present our arguments for promoting Computational Thinking in education through the Human-centred paradigm of Tangible End-User Development, namely by exploiting objects whose interactions with the physical environment are mapped to digital actions performed on the system.

  15. An introduction to statistical computing a simulation-based approach

    CERN Document Server

    Voss, Jochen

    2014-01-01

    A comprehensive introduction to sampling-based methods in statistical computing The use of computers in mathematics and statistics has opened up a wide range of techniques for studying otherwise intractable problems.  Sampling-based simulation techniques are now an invaluable tool for exploring statistical models.  This book gives a comprehensive introduction to the exciting area of sampling-based methods. An Introduction to Statistical Computing introduces the classical topics of random number generation and Monte Carlo methods.  It also includes some advanced met
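
    In the spirit of the book's topic, a classic sampling-based computation: Monte Carlo estimation of pi, with a standard error computed from the same draws.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 1_000_000
    pts = rng.random((n, 2))                 # uniform points in the unit square
    hits = (pts ** 2).sum(axis=1) < 1.0      # inside the quarter circle
    est = 4 * hits.mean()
    se = 4 * hits.std(ddof=1) / np.sqrt(n)   # Monte Carlo standard error
    print(f"pi ~ {est:.4f} +/- {se:.4f}")
    ```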

  16. Implementation and Analysis for APR1400 Soft Control System

    International Nuclear Information System (INIS)

    2015-01-01

    Due to the rapid advancement of digital technology, the definite technical advantages of digital control systems over analog control systems are accelerating the implementation of advanced distributed digital control systems in nuclear power plants. One of the major advantages of a digital control system is the capability for soft control. The design of the Soft Control System for the Man-Machine Interface System (MMIS) of the Advanced Power Reactor 1400 (APR1400) is based on fully digital technologies to enhance reliability, operability and maintainability. A computer-based compact workstation has been adopted in the APR1400 Main Control Room (MCR) to provide a convenient working environment. This paper introduces the approaches and methodologies of the Soft Control System for the Advanced Control Room (ACR). The paper also explains the major design features for operation and display of the Soft Control System and its implementation to cope with regulatory requirements. (authors)

  17. Computer aided display of multiple soft tissue anatomical surfaces for simultaneous structural and area-dose appreciation in 3D radiation therapy planning

    International Nuclear Information System (INIS)

    Moore, C.J.; Mott, D.J.; Wilkinson, J.M.

    1987-01-01

    For radiotherapy applications a 3D display that includes soft tissues is required, but the presentation of all anatomical structures is often unnecessary and potentially confusing. A tumour volume and a small number of critical organs, usually embedded within other soft tissue anatomy, are likely to be all that can be clearly displayed in a 3D format. The inclusion of dose data (in the form of isodose lines or surfaces) adds to the complication of any 3D display. A solution to this problem is to incorporate the presentation of the dose distribution into the technique used to provide the illusion of 3D. This illusion can be provided either by depth cueing or by the hypothetical illumination of spatially defined object surfaces. The dose distribution from irradiation fields or, in the case of brachytherapy, from radioactive sources, can be regarded as a source of illumination for the tumour and critical organs. The intensity of illumination at any point on a tissue surface represents the dose at that point. Such an approach also allows the variation of dose over a given surface (and by extension, over the corresponding volume) to be quantified using histogram techniques. This may be of value in analysing and comparing techniques in which vulnerable tissue surfaces are irradiated. The planning of intracavitary treatments for cervical cancer is one application which might benefit from the display approach described above. Here the variation of dose over the mucosal surfaces of the bladder and the rectum is of particular interest, since dose-related morbidity has often been reported following these treatments. 7 refs.; 8 figs
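
    The histogram technique mentioned above can be sketched directly: given a dose sample and a patch area for each point on an organ surface, accumulate the area receiving each dose band. All values below are illustrative placeholders.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    dose = rng.uniform(20, 60, 500)    # dose (Gy) sampled at 500 surface patches
    area = rng.uniform(0.5, 1.5, 500)  # area (mm^2) of each patch

    bins = np.linspace(0, 80, 17)
    hist, _ = np.histogram(dose, bins=bins, weights=area)  # area per dose band
    cum = hist[::-1].cumsum()[::-1]    # area receiving at least each bin's dose
    for lo, a in zip(bins[:-1], cum):
        if a > 0:
            print(f">= {lo:4.0f} Gy: {a:7.1f} mm^2")
    ```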

  18. Towards an Approach of Semantic Access Control for Cloud Computing

    Science.gov (United States)

    Hu, Luokai; Ying, Shi; Jia, Xiangyang; Zhao, Kai

    With the development of cloud computing, the mutual understandability among distributed Access Control Policies (ACPs) has become an important issue in the security field of cloud computing. Semantic Web technology provides a solution to the semantic interoperability of heterogeneous applications. In this paper, we analyze existing access control methods and present a new Semantic Access Control Policy Language (SACPL) for describing ACPs in cloud computing environments. An Access Control Oriented Ontology System (ACOOS) is designed as the semantic basis of SACPL. The ontology-based SACPL language can effectively solve the interoperability issue of distributed ACPs. This study enriches research on applying Semantic Web technology in the security field, and provides a new way of thinking about access control in cloud computing.
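
    SACPL and ACOOS are the paper's own constructs, so the sketch below only illustrates, in generic attribute-based form, the kind of decision an access-control engine makes once distributed policies share one vocabulary; all names are hypothetical.

    ```python
    from dataclasses import dataclass

    # Schematic attribute-based policy check (generic, not the SACPL engine).
    @dataclass
    class Policy:
        effect: str      # "permit" or "deny"
        role: str        # required subject role
        resource: str    # resource identifier
        action: str      # e.g. "read", "write"

    def decide(policies, role, resource, action):
        """First applicable policy wins; default deny."""
        for p in policies:
            if (p.role, p.resource, p.action) == (role, resource, action):
                return p.effect
        return "deny"

    policies = [Policy("permit", "tenant-admin", "vm-42", "read"),
                Policy("deny",   "guest",        "vm-42", "read")]
    print(decide(policies, "tenant-admin", "vm-42", "read"))  # -> permit
    ```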

  19. Diffusive Wave Approximation to the Shallow Water Equations: Computational Approach

    KAUST Repository

    Collier, Nathan; Radwan, Hany; Dalcin, Lisandro; Calo, Victor M.

    2011-01-01

    We discuss the use of time adaptivity applied to the one dimensional diffusive wave approximation to the shallow water equations. A simple and computationally economical error estimator is discussed which enables time-step size adaptivity
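
    The flavor of time-step adaptivity driven by a cheap error estimator can be shown on an ODE with generic step doubling: compare one full Euler step against two half steps and let their difference steer dt. This is a schematic stand-in, not the paper's estimator for the diffusive wave equation.

    ```python
    import numpy as np

    def adaptive_euler(f, u0, t_end, dt=1e-2, tol=1e-5):
        """Step-doubling time adaptivity: the full-step vs. two-half-steps
        difference is a cheap local error estimate that grows or shrinks dt."""
        t, u, dts = 0.0, np.asarray(u0, float), []
        while t < t_end:
            dt = min(dt, t_end - t)
            full = u + dt * f(t, u)
            half = u + 0.5 * dt * f(t, u)
            two_half = half + 0.5 * dt * f(t + 0.5 * dt, half)
            err = np.max(np.abs(two_half - full))
            if err <= tol:                 # accept the more accurate half-steps
                t, u = t + dt, two_half
                dts.append(dt)
            dt *= 0.9 * min(2.0, max(0.2, (tol / max(err, 1e-16)) ** 0.5))
        return u, dts

    u, dts = adaptive_euler(lambda t, u: -u, 1.0, 5.0)
    print(u, min(dts), max(dts))  # steps stretch as the solution flattens
    ```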

  20. Development of Computer Science Disciplines - A Social Network Analysis Approach

    OpenAIRE

    Pham, Manh Cuong; Klamma, Ralf; Jarke, Matthias

    2011-01-01

    In contrast to many other scientific disciplines, computer science treats conference publications as an important publication venue. Conferences have the advantage of providing fast publication of papers and of bringing researchers together to present and discuss their work with peers. Previous work on knowledge mapping focused on mapping all of science or a particular domain based on the ISI-published JCR (Journal Citation Report). Although this data covers most important journals, it lacks computer science conference and ...

  1. New Approaches to Quantum Computing using Nuclear Magnetic Resonance Spectroscopy

    International Nuclear Information System (INIS)

    Colvin, M; Krishnan, V V

    2003-01-01

    The power of a quantum computer (QC) relies on the fundamental concept of superposition in quantum mechanics, which allows an inherent large-scale parallelization of computation. In a QC, binary information embodied in a quantum system, such as the spin degrees of freedom of a spin-1/2 particle, forms the qubits (quantum mechanical bits), over which appropriate logical gates perform the computation. In classical computers, the basic unit of information is the bit, which can take a value of either 0 or 1. Bits are connected together by logic gates to form logic circuits to implement complex logical operations. The expansion of modern computers has been driven by the development of faster, smaller and cheaper logic gates. As the size of the logic gates shrinks toward atomic dimensions, the performance of such a system is no longer classical but is instead governed by quantum mechanics. Quantum computers offer the potentially superior prospect of solving computational problems that are intractable for classical computers, such as efficient database searches and cryptography. A variety of algorithms have been developed recently, most notably Shor's algorithm for factorizing long numbers into prime factors in polynomial time and Grover's quantum search algorithm. These algorithms were of only theoretical interest until recently, when several methods were proposed to build an experimental QC. These methods include trapped ions, cavity-QED, coupled quantum dots, Josephson junctions, spin resonance transistors, linear optics and nuclear magnetic resonance. Nuclear magnetic resonance (NMR) is uniquely capable of constructing small QCs, and several algorithms have been implemented successfully. NMR-QC differs from other implementations in one important way: it is not a single QC, but a statistical ensemble of them. Thus, quantum computing based on NMR is considered ensemble quantum computing. In NMR quantum computing, the spins with
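
    The elementary gates mentioned in these records can be simulated with plain linear algebra; applying a Hadamard and then a CNOT to |00> yields the entangled Bell state.

    ```python
    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
    I = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],                 # control = first qubit
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    psi = np.kron(np.array([1, 0]), np.array([1, 0]))  # |00>
    psi = CNOT @ np.kron(H, I) @ psi
    print(psi)  # [0.707, 0, 0, 0.707] = (|00> + |11>)/sqrt(2)
    ```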

  2. Soft Robotics.

    Science.gov (United States)

    Whitesides, George M

    2018-04-09

    This description of "soft robotics" is not intended to be a conventional review, in the sense of a comprehensive technical summary of a developing field. Rather, its objective is to describe soft robotics as a new field-one that offers opportunities to chemists and materials scientists who like to make "things" and to work with macroscopic objects that move and exert force. It will give one (personal) view of what soft actuators and robots are, and how this class of soft devices fits into the more highly developed field of conventional "hard" robotics. It will also suggest how and why soft robotics is more than simply a minor technical "tweak" on hard robotics and propose a unique role for chemistry, and materials science, in this field. Soft robotics is, at its core, intellectually and technologically different from hard robotics, both because it has different objectives and uses and because it relies on the properties of materials to assume many of the roles played by sensors, actuators, and controllers in hard robotics. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Soft lubrication

    Science.gov (United States)

    Skotheim, Jan; Mahadevan, Laksminarayanan

    2004-11-01

    We study the lubrication of fluid-immersed soft interfaces and show that elastic deformation couples tangential and normal forces and thus generates lift. We consider materials that deform easily, due to either geometry (e.g. a shell) or constitutive properties (e.g. a gel or a rubber), so that the effects of pressure and temperature on the fluid properties may be neglected. Four different system geometries are considered: a rigid cylinder moving tangentially to a soft layer coating a rigid substrate; a soft cylinder moving tangentially to a rigid substrate; a cylindrical shell moving tangentially to a rigid substrate; and finally a journal bearing coated with a thin soft layer, which, being a conforming contact, allows us to gauge the influence of contact geometry. In addition, for the particular case of a soft layer coating a rigid substrate we consider both elastic and poroelastic material responses. For all these cases we find the same generic behavior: there is an optimal combination of geometric and material parameters that maximizes the dimensionless normal force as a function of the softness.

  4. Demand side management scheme in smart grid with cloud computing approach using stochastic dynamic programming

    Directory of Open Access Journals (Sweden)

    S. Sofana Reka

    2016-09-01

    Full Text Available This paper proposes a cloud computing framework in a smart grid environment, creating a small integrated energy hub that supports real-time computing for handling large volumes of data. A stochastic programming model is developed with a cloud computing scheme for effective demand side management (DSM) in the smart grid. Simulation results are obtained using a GUI interface and the Gurobi optimizer in Matlab in order to reduce electricity demand by creating energy networks in a smart hub approach.
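
    A toy version of the stochastic DSM decision such a framework optimizes: choose the start hour of a deferrable load to minimize expected cost over price scenarios. The prices, probabilities and load below are invented for illustration and are unrelated to the paper's model.

    ```python
    import numpy as np

    scenarios = np.array([[5, 4, 3, 2, 2, 3, 6, 8],    # cents/kWh, scenario A
                          [5, 5, 4, 4, 3, 3, 7, 9]])   # scenario B
    probs = np.array([0.6, 0.4])                       # scenario probabilities
    load_kw, duration = 1.5, 2                         # 2-hour deferrable load

    expected_price = probs @ scenarios                 # scenario-weighted prices
    costs = [load_kw * expected_price[h:h + duration].sum()
             for h in range(len(expected_price) - duration + 1)]
    best = int(np.argmin(costs))
    print(f"run appliance at hours {best}-{best + duration - 1}, "
          f"expected cost {min(costs):.2f} cents")
    ```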

  5. Soft tissue tumors - imaging methods

    International Nuclear Information System (INIS)

    Arlart, I.P.

    1985-01-01

    Imaging methods play an important diagnostic role in soft tissue tumors, concerning the preoperative evaluation of localization, size, topographic relationships, dignity (benign versus malignant nature), and metastatic disease. The present paper gives an overview of the diagnostic methods available today, such as ultrasound, thermography, roentgenographic plain films and xeroradiography, radionuclide methods, computed tomography, lymphography, angiography, and magnetic resonance imaging. Besides sonography, computed tomography in particular has the greatest diagnostic value in soft tissue tumors. The significance of the recently developed magnetic resonance imaging method cannot yet be assessed. (orig.) [de]

  6. Autonomous undulatory serpentine locomotion utilizing body dynamics of a fluidic soft robot

    International Nuclear Information System (INIS)

    Onal, Cagdas D; Rus, Daniela

    2013-01-01

    Soft robotics offers the unique promise of creating inherently safe and adaptive systems. These systems bring man-made machines closer to the natural capabilities of biological systems. An important requirement for self-contained soft mobile robots is an on-board power source. In this paper, we present an approach to creating a bio-inspired soft robotic snake that can undulate in a similar way to its biological counterpart, using pressure for actuation power, without human intervention. With this approach, we develop an autonomous soft snake robot with on-board actuation, power, computation and control capabilities. The robot consists of four bidirectional fluidic elastomer actuators in series, which create a traveling curvature wave from head to tail along its body. Passive wheels between segments generate the necessary frictional anisotropy for forward locomotion. It takes 14 h to build the soft robotic snake, which can attain an average locomotion speed of 19 mm/s. (paper)
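
    The head-to-tail traveling curvature wave can be written down directly: each of the four bidirectional segments receives a phase-shifted sinusoidal (serpenoid) command. The amplitude and frequency below are illustrative, not the robot's actual values.

    ```python
    import numpy as np

    n_segments, amp, freq = 4, 40.0, 0.5   # segments, degrees, Hz (illustrative)
    phase_lag = 2 * np.pi / n_segments     # one full wave along the body

    def segment_angles(t):
        """Commanded bending angle of each segment at time t (seconds);
        the phase lag makes the curvature wave travel head to tail."""
        k = np.arange(n_segments)
        return amp * np.sin(2 * np.pi * freq * t - k * phase_lag)

    for t in np.linspace(0, 2, 5):
        print(f"t={t:.1f}s", segment_angles(t).round(1))
    ```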

  7. Soft Interfaces

    International Nuclear Information System (INIS)

    Strzalkowski, Ireneusz

    1997-01-01

    This book presents an extended form of the 1994 Dirac Memorial Lecture delivered by Pierre Gilles de Gennes at Cambridge University. The main task of the presentation is to show the beauty and richness of the structural forms and phenomena observed at soft interfaces between two media. They are much more complex than the forms and phenomena existing in each phase separately. The problems discussed include both traditional, classical techniques, such as the contact angle in static and dynamic partial wetting, and the latest research methodology, like 'environmental' scanning electron microscopes. The book is not a systematic lecture on phenomena but can be considered a compact set of essays on topics which particularly fascinate the author. The continuum theory widely used in the book is based on a deep molecular approach. The author is particularly interested in a broad-minded rheology of liquid systems at interfaces, with specific emphasis on polymer melts. To study this, the author has developed a special methodology called anemometry near walls. The second main topic presented in the book is the problem of adhesion. Molecular processes, energy transformations and electrostatic interactions are included in an interesting discussion of the many aspects of the principles of adhesion. The third topic concerns welding between two polymer surfaces, such as A/A and A/B interfaces. Of great worth is the presentation of various unsolved, open problems. The kind of topics and the brevity of description indicate that this book is intended for a well-prepared reader. However, for any reader it will present an interesting picture of how many mysterious processes are acting in the surrounding world and how these phenomena are perceived by a Nobel Laureate, who won that prize mainly for his investigations in this field. (book review)

  8. A simulation model for visitors’ thermal comfort at urban public squares using non-probabilistic binary-linear classifier through soft-computing methodologies

    International Nuclear Information System (INIS)

    Kariminia, Shahab; Shamshirband, Shahaboddin; Hashim, Roslan; Saberi, Ahmadreza; Petković, Dalibor; Roy, Chandrabhushan; Motamedi, Shervin

    2016-01-01

    Outdoor life in cities is declining because recent rapid urbanisation has proceeded without climate-responsive urban design concepts. The resulting inadvertent climatic modifications have imposed considerable demand on urban energy resources. It is important to provide a comfortable ambient climate at open urban squares, and researchers need to predict the comfort conditions at such outdoor spaces. The main objective of this study is to predict visitors' outdoor comfort indices by using a developed computational model termed SVM-WAVELET (Support Vector Machines combined with a Discrete Wavelet Transform algorithm). For data collection, a field study was conducted in downtown Isfahan, Iran (51°41′ E, 32°37′ N), which has hot and arid summers. Based on different environmental elements, four separate locations were monitored across two public squares. Meteorological data were measured while simultaneously surveying the visitors' thermal sensations. From the subjects' thermal feeling and their characteristics, their level of comfort was estimated. Further, the adapted computational model was used to estimate the visitors' thermal sensations in terms of thermal comfort indices. The SVM-WAVELET results indicate that the R2 values for the input parameters, including thermal sensation, PMV (the predicted mean vote), PET (physiologically equivalent temperature), SET (standard effective temperature) and Tmrt (mean radiant temperature), were estimated at 0.482, 0.943, 0.988, 0.969 and 0.840, respectively. - Highlights: • To explore the visitors' thermal sensation at urban public squares. • This article introduces findings of outdoor comfort prediction. • The developed SVM-WAVELET soft-computing technique was used. • SVM-WAVELET estimation results are more reliable and accurate.
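
    A generic wavelet-feature-plus-SVM regression pipeline in the spirit of SVM-WAVELET, using PyWavelets and scikit-learn on synthetic placeholder data (not the Isfahan survey); the wavelet, level and model settings are illustrative assumptions.

    ```python
    import numpy as np
    import pywt
    from sklearn.svm import SVR

    rng = np.random.default_rng(0)
    signals = rng.normal(25, 4, size=(120, 32))   # e.g. short temperature series
    comfort = signals.mean(axis=1) * 0.1 + rng.normal(0, 0.2, 120)

    def dwt_features(sig, wavelet="db4", level=2):
        """Flatten the discrete wavelet decomposition into a feature vector."""
        return np.concatenate(pywt.wavedec(sig, wavelet, level=level))

    X = np.array([dwt_features(s) for s in signals])
    model = SVR(kernel="rbf", C=10.0).fit(X[:100], comfort[:100])
    pred = model.predict(X[100:])
    print("holdout correlation:", np.corrcoef(pred, comfort[100:])[0, 1].round(3))
    ```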

  9. Computer aided approach for qualitative risk assessment of engineered systems

    International Nuclear Information System (INIS)

    Crowley, W.K.; Arendt, J.S.; Fussell, J.B.; Rooney, J.J.; Wagner, D.P.

    1978-01-01

    This paper outlines a computer aided methodology for determining the relative contributions of various subsystems and components to the total risk associated with an engineered system. Major contributors to overall task risk are identified through comparison of an expected frequency density function with an established risk criterion. Contributions that are inconsistently high are also identified. The results from this analysis are useful for directing efforts for improving system safety and performance. An analysis of uranium hexafluoride handling risk at a gaseous diffusion uranium enrichment plant using a preliminary version of the computer program EXCON is briefly described and illustrated

  10. Environmental sciences and computations: a modular data based systems approach

    International Nuclear Information System (INIS)

    Crawford, T.V.; Bailey, C.E.

    1975-07-01

    A major computer code for environmental calculations is under development at the Savannah River Laboratory. The primary aim is to develop a flexible, efficient capability to calculate, for all significant pathways, the dose to man resulting from releases of radionuclides from the Savannah River Plant and from other existing and potential radioactive sources in the southeastern United States. The environmental sciences programs at SRP are described, with emphasis on the development of the calculational system. It is being developed as a modular data-based system within the framework of the larger JOSHUA Computer System, which provides data management, terminal, and job execution facilities. (U.S.)

  11. Computer assisted pyeloplasty in children the retroperitoneal approach

    DEFF Research Database (Denmark)

    Olsen, L H; Jorgensen, T M

    2004-01-01

    PURPOSE: We describe the first series of computer assisted retroperitoneoscopic pyeloplasty in children using the Da Vinci Surgical System (Intuitive Surgical, Inc., Mountain View, California) with regard to setup, method, operation time, complications and preliminary outcome. The small space ... with the Da Vinci Surgical System. With the patient in a lateral semiprone position, the retroperitoneal space was developed by blunt and balloon dissection. Three ports were placed for the computer assisted system and 1 for assistance. Pyeloplasty was performed with the mounted system placed behind

  12. Numerical Methods for Stochastic Computations A Spectral Method Approach

    CERN Document Server

    Xiu, Dongbin

    2010-01-01

    The first graduate-level textbook to focus on fundamental aspects of numerical methods for stochastic computations, this book describes the class of numerical methods based on generalized polynomial chaos (gPC). These fast, efficient, and accurate methods are an extension of the classical spectral methods to high-dimensional random spaces. Designed to simulate complex systems subject to random inputs, these methods are widely used in many areas of computer science and engineering. The book introduces polynomial approximation theory and probability theory; describes the basic theory of gPC meth

  13. Thermodynamic and relative approach to compute glass-forming

    Indian Academy of Sciences (India)

    This study deals with the evaluation of the glass-forming ability (GFA) of oxides and is a critical reading of the Sun and Rawson thermodynamic approach to quantifying this aptitude. Both approaches are adequate but ambiguous regarding the behaviour of some oxides (tendency to amorphization or crystallization). Indeed, ZrO2 and ...

  14. Towards a cyber-physical era: soft computing framework based multi-sensor array for water quality monitoring

    OpenAIRE

    Bhardwaj, Jyotirmoy; Gupta, Karunesh K.; Gupta, Rajiv

    2018-01-01

    New concepts and techniques are replacing traditional methods of measuring water quality parameters. This paper introduces a cyber-physical system (CPS) approach for water quality assessment in a distribution network. Cyber-physical systems with embedded sensors, processors and actuators can be designed to sense and interact with the water environment. The proposed CPS comprises a sensing framework integrated with five different water quality parameter sensor no...

  15. Computer Adaptive Testing, Big Data and Algorithmic Approaches to Education

    Science.gov (United States)

    Thompson, Greg

    2017-01-01

    This article critically considers the promise of computer adaptive testing (CAT) and digital data to provide better and quicker data that will improve the quality, efficiency and effectiveness of schooling. In particular, it uses the case of the Australian NAPLAN test that will become an online, adaptive test from 2016. The article argues that…

  16. A Cellular Automata Approach to Computer Vision and Image Processing.

    Science.gov (United States)

    1980-09-01


  17. New approach for virtual machines consolidation in heterogeneous computing systems

    Czech Academy of Sciences Publication Activity Database

    Fesl, Jan; Cehák, J.; Doležalová, Marie; Janeček, J.

    2016-01-01

    Vol. 9, No. 12 (2016), pp. 321-332 ISSN 1738-9968 Institutional support: RVO:60077344 Keywords: consolidation * virtual machine * distributed Subject RIV: JD - Computer Applications, Robotics http://www.sersc.org/journals/IJHIT/vol9_no12_2016/29.pdf

  18. Simulation of quantum computation : A deterministic event-based approach

    NARCIS (Netherlands)

    Michielsen, K; De Raedt, K; De Raedt, H

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  19. Simulation of Quantum Computation : A Deterministic Event-Based Approach

    NARCIS (Netherlands)

    Michielsen, K.; Raedt, K. De; Raedt, H. De

    2005-01-01

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  20. Computational approaches to cognition: the bottom-up view.

    Science.gov (United States)

    Koch, C

    1993-04-01

    How can higher-level aspects of cognition, such as figure-ground segregation, object recognition, selective focal attention and ultimately even awareness, be implemented at the level of synapses and neurons? A number of theoretical studies emerging from the connectionist and computational neuroscience communities are starting to address these issues using neurally plausible models.