WorldWideScience

Sample records for non-traditional machining techniques

  1. NON-TRADITIONAL MACHINING PROCESS SELECTION - AN INTEGRATED APPROACH

    Directory of Open Access Journals (Sweden)

    Manish Kumar Roy

    2017-03-01

    The growing demand for harder and difficult-to-machine materials like titanium, Inconel and high-strength temperature-resistant (HSTR) alloys, coupled with the need for high accuracy and the desired surface finish, has left us entangled in a large pool of non-traditional machining (NTM) processes, and selecting a particular NTM process for a specific task turns out to be a complicated job. Meticulous selection of an NTM process involves many criteria, and hence a multi-criteria decision-making (MCDM) method is used to solve such problems. To simplify the selection process for the decision maker, an integrated method of fuzzy analytic hierarchy process (FAHP) with quality function deployment (QFD) has been implemented for finding the relative significance of the different technical requirements. Subsequently, grey relational analysis (GRA) has been applied to rank the alternatives, and it was found that electrochemical machining (ECM) overrules the other NTM processes. A problem already existing in the literature has been picked up for the numerical illustration. The results obtained in the present research study are comparable with the existing literature, and sensitivity analysis indicates the robustness of the proposed model.
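
    For illustration, the ranking step can be sketched with grey relational analysis (GRA) on a toy decision matrix; the alternatives, criteria and weights below are hypothetical assumptions, not data from the paper (a minimal Python sketch):

        import numpy as np

        # Hypothetical decision matrix: rows = NTM processes, columns = criteria
        # (assumed beneficial, larger is better); weights are illustrative.
        X = np.array([[0.7, 0.6, 0.8],   # e.g. ECM
                      [0.5, 0.9, 0.4],   # e.g. EDM
                      [0.6, 0.5, 0.7]])  # e.g. USM
        weights = np.array([0.5, 0.3, 0.2])
        zeta = 0.5  # distinguishing coefficient, conventionally 0.5

        # 1. Normalise each criterion to [0, 1].
        norm = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))

        # 2. Deviation from the ideal (all-ones) reference sequence.
        delta = np.abs(1.0 - norm)

        # 3. Grey relational coefficients and weighted grey relational grades.
        coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
        grades = coeff @ weights

        # 4. Rank alternatives: the highest grade is the preferred NTM process.
        print(np.argsort(-grades))  # indices sorted from best to worst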

  2. Developing an efficient decision support system for non-traditional machine selection: an application of MOORA and MOOSRA

    Directory of Open Access Journals (Sweden)

    Asis Sarkar

    2015-01-01

    The purpose of this paper is to find an efficient decision support method for non-traditional machine selection. It analyzes potential non-traditional machine selection attributes with the relatively new MCDM approaches of MOORA and MOOSRA. These methods are adopted to handle the subjective evaluation of information collected from an expert group. An example case study is presented for better understanding of the selection module, which can be effectively applied to any other decision-making scenario. The method is not only computationally simple, easily comprehensible, and robust, but can also accommodate numerous subjective attributes. The rankings are expected to provide good guidance to the managers of an organization in selecting a feasible non-traditional machine. They should also provide good insight for non-traditional machine manufacturers and may encourage further research work on non-traditional machine selection.
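
    A minimal sketch of how MOORA and MOOSRA rank alternatives from a vector-normalised decision matrix; the machines, attribute values and weights are illustrative assumptions, not data from the study:

        import numpy as np

        # Hypothetical decision matrix: rows = candidate non-traditional machines,
        # columns = attributes. The first two attributes are beneficial, the last is a cost.
        X = np.array([[30.0, 0.80, 12.0],
                      [45.0, 0.65, 18.0],
                      [38.0, 0.72, 15.0]])
        beneficial = np.array([True, True, False])
        w = np.array([0.4, 0.4, 0.2])  # illustrative weights

        # MOORA: vector normalisation, then weighted (beneficial - non-beneficial) sums.
        N = X / np.sqrt((X ** 2).sum(axis=0))
        moora_score = (w * N)[:, beneficial].sum(axis=1) - (w * N)[:, ~beneficial].sum(axis=1)

        # MOOSRA: ratio of the weighted beneficial sum to the weighted non-beneficial sum.
        moosra_score = (w * N)[:, beneficial].sum(axis=1) / (w * N)[:, ~beneficial].sum(axis=1)

        print("MOORA ranking :", np.argsort(-moora_score))
        print("MOOSRA ranking:", np.argsort(-moosra_score))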

  3. Application of PROMETHEE-GAIA method for non-traditional machining processes selection

    Directory of Open Access Journals (Sweden)

    Prasad Karande

    2012-10-01

    With ever-increasing demand for manufactured products of hard alloys and metals with high surface finish and complex shape geometry, more interest is now being paid to non-traditional machining (NTM) processes, where energy in its direct form is used to remove material from the workpiece surface. Compared to conventional machining processes, NTM processes possess almost unlimited capabilities, and there is a strong belief that the use of NTM processes will keep increasing in a diverse range of applications. The presence of a large number of NTM processes with complex characteristics and capabilities, together with the lack of experts in the NTM process selection domain, calls for the development of a structured approach to NTM process selection for a given machining application. Past researchers have attempted to solve NTM process selection problems using various complex mathematical approaches which often require profound knowledge of mathematics/artificial intelligence on the part of process engineers. In this paper, four NTM process selection problems are solved using an integrated PROMETHEE (preference ranking organization method for enrichment evaluation) and GAIA (geometrical analysis for interactive aid) method, which acts as a visual decision aid for process engineers. The observed results are quite satisfactory and exactly match the expected solutions.
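
    A minimal sketch of the PROMETHEE II net-flow ranking that underlies the method (GAIA adds a principal-components visualisation of the same flows); the evaluation matrix, weights and preference threshold below are hypothetical, not taken from the paper:

        import numpy as np

        # Hypothetical evaluation matrix: rows = NTM processes, columns = criteria
        # (all assumed beneficial). A linear preference function with threshold p
        # is used for every criterion; weights are illustrative.
        X = np.array([[7.0, 5.0, 8.0],
                      [6.0, 9.0, 4.0],
                      [8.0, 6.0, 6.0]])
        w = np.array([0.5, 0.3, 0.2])
        p = 3.0  # preference threshold

        m = X.shape[0]
        pi = np.zeros((m, m))          # aggregated preference of a over b
        for a in range(m):
            for b in range(m):
                d = X[a] - X[b]                       # criterion-wise differences
                pref = np.clip(d / p, 0.0, 1.0)       # linear preference function
                pi[a, b] = (w * pref).sum()

        phi_plus = pi.sum(axis=1) / (m - 1)           # leaving flow
        phi_minus = pi.sum(axis=0) / (m - 1)          # entering flow
        phi = phi_plus - phi_minus                    # PROMETHEE II net flow
        print(np.argsort(-phi))                       # ranking, best first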

  4. Machining of composite materials. I - Traditional methods. II - Non-traditional methods

    Science.gov (United States)

    Abrate, S.; Walton, D. A.

    Traditional and nontraditional methods for machining organic-matrix and metal-matrix composites are reviewed. Such traditional procedures as drilling, cutting, sawing, routing, and grinding are discussed together with the damage introduced into composites by these manipulations. Particular attention is given to new, nontraditional methods, including laser, water-jet, electrodischarge, electrochemical spark, and ultrasonic machining, showing that these methods often speed up cutting and improve surface quality. Moreover, it is sometimes possible to use the new methods in cases where traditional methods are ineffective.

  5. Airfoil shape optimization using non-traditional optimization technique and its validation

    Directory of Open Access Journals (Sweden)

    R. Mukesh

    2014-07-01

    Computational fluid dynamics (CFD) is one of the computer-based solution methods most widely employed in aerospace engineering. The computational power and time required to carry out the analysis increase as the fidelity of the analysis increases. Aerodynamic shape optimization has become a vital part of aircraft design in recent years. Generally, to optimize an airfoil we first have to describe it, and that requires at least a hundred points of x and y coordinates; optimizing an airfoil with such a large number of coordinates is really difficult. Nowadays many different parameterization schemes, such as B-spline and PARSEC, are used to describe a general airfoil. The main goal of these parameterization schemes is to reduce the number of needed parameters to as few as possible while controlling the important aerodynamic features effectively. Here the work has been done on the PARSEC geometry representation method. The objective of this work is to describe a general airfoil using twelve parameters by representing its shape as a polynomial function. We have also applied a Genetic Algorithm to optimize the aerodynamic characteristics of a general airfoil for specific conditions. A MATLAB program has been developed to implement PARSEC, the panel technique, and the Genetic Algorithm. This program has been tested on a standard NACA 2411 airfoil and optimized to improve its coefficient of lift. Pressure distributions and coefficients of lift for the airfoil geometries have been calculated using the panel method. The optimized airfoil has an improved coefficient of lift compared to the original one, and it is validated using wind tunnel data.
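
    A hedged sketch of the two building blocks described above: the PARSEC shape polynomial and a bare-bones genetic algorithm. The coefficient values are arbitrary, and the fitness function is a placeholder standing in for the panel-method lift computation used in the paper:

        import numpy as np

        def parsec_surface(coeffs, x):
            """PARSEC shape polynomial: y(x) = sum_k a_k * x**(k - 1/2), k = 1..6.
            In the full method the six coefficients per surface are solved from the
            geometric PARSEC parameters (leading-edge radius, crest location, etc.)."""
            k = np.arange(1, 7)
            return (coeffs[None, :] * x[:, None] ** (k - 0.5)).sum(axis=1)

        def fitness(individual):
            """Placeholder objective. In the paper the lift coefficient comes from a
            panel-method solver; here a dummy smooth function stands in for it."""
            return -np.sum((individual - 0.1) ** 2)

        # A bare-bones genetic algorithm: each individual holds 12 shape coefficients
        # (6 upper + 6 lower). Selection, crossover and mutation are deliberately simple.
        rng = np.random.default_rng(0)
        pop = rng.normal(0.0, 0.2, size=(40, 12))
        for generation in range(100):
            scores = np.array([fitness(ind) for ind in pop])
            parents = pop[np.argsort(-scores)[:20]]                    # truncation selection
            mates = parents[rng.integers(0, 20, size=20)]
            cut = rng.integers(1, 12)
            children = np.hstack([parents[:, :cut], mates[:, cut:]])   # one-point crossover
            children += rng.normal(0.0, 0.02, children.shape)          # mutation
            pop = np.vstack([parents, children])

        best = pop[np.argmax([fitness(ind) for ind in pop])]
        x = np.linspace(0.0, 1.0, 101)
        print(parsec_surface(best[:6], x)[:5])   # first few upper-surface ordinates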

  6. Technique for Machining Glass

    Science.gov (United States)

    Rice, S. H.

    1982-01-01

    A process for machining glass with conventional carbide tools requires a small quantity of a lubricant intended for aluminum, applied to the area of glass to be machined. A carbide tool is then placed against the workpiece with light pressure. The tool is raised periodically to clear the work of glass dust and particles. Additional lubricant is applied as it is displaced.

  7. Non-traditional inheritance

    International Nuclear Information System (INIS)

    Hall, J.G.

    1992-01-01

    In the last few years, several non-traditional forms of inheritance have been recognized. These include mosaicism, cytoplasmic inheritance, uniparental disomy, imprinting, amplification/anticipation, and somatic recombination. Genomic imprinting (GI) is the dependence of the phenotype on the sex of the transmitting parent. GI in humans seems to involve growth, behaviour, and survival in utero. The detailed mechanism of genomic imprinting is not known, but it seems that some process is involved in turning a gene off; this probably involves two genes, one of which produces a product that turns a gene off, and the gene that is itself turned off. The process of imprinting (turning off) may be associated with methylation. Erasure of imprinting can occur, and seems to be associated with meiosis. 10 refs

  8. Machining technique prevents undercutting in tensile specimens

    Science.gov (United States)

    Moscater, R. E.; Royster, D. M.

    1968-01-01

    A machining technique prevents undercutting at the test section of tensile specimens when the four corners of the reduced section are machined. The specimen is made with a gradual taper in the test section, so that the width at the center of the tensile specimen is less than the width at the four corners of the reduced section.

  9. Machine learning techniques in optical communication

    DEFF Research Database (Denmark)

    Zibar, Darko; Piels, Molly; Jones, Rasmus Thomas

    2016-01-01

    Machine learning techniques relevant for nonlinearity mitigation, carrier recovery, and nanoscale device characterization are reviewed and employed. Markov Chain Monte Carlo in combination with Bayesian filtering is employed within the nonlinear state-space framework and demonstrated for parameter...

  10. Machine learning techniques in optical communication

    DEFF Research Database (Denmark)

    Zibar, Darko; Piels, Molly; Jones, Rasmus Thomas

    2015-01-01

    Techniques from the machine learning community are reviewed and employed for laser characterization, signal detection in the presence of nonlinear phase noise, and nonlinearity mitigation. Bayesian filtering and expectation maximization are employed within nonlinear state-space framework...

  11. Machine Learning Techniques in Clinical Vision Sciences.

    Science.gov (United States)

    Caixinha, Miguel; Nunes, Sandrina

    2017-01-01

    This review presents and discusses the contribution of machine learning techniques for diagnosis and disease monitoring in the context of clinical vision science. Many ocular diseases leading to blindness can be halted or delayed when detected and treated at their earliest stages. With the recent developments in diagnostic devices, imaging and genomics, new sources of data for early disease detection and patients' management are now available. Machine learning techniques emerged in the biomedical sciences as clinical decision-support techniques to improve sensitivity and specificity of disease detection and monitoring, adding objectivity to the clinical decision-making process. This manuscript presents a review of multimodal ocular disease diagnosis and monitoring based on machine learning approaches. In the first section, the technical issues related to the different machine learning approaches are presented. Machine learning techniques are used to automatically recognize complex patterns in a given dataset. These techniques allow creating homogeneous groups (unsupervised learning), or creating a classifier predicting group membership of new cases (supervised learning), when a group label is available for each case. To ensure good performance of the machine learning techniques on a given dataset, all possible sources of bias should be removed or minimized. For that, the representativeness of the input dataset for the true population should be confirmed, the noise should be removed, the missing data should be treated and the data dimensionality (i.e., the number of parameters/features and the number of cases in the dataset) should be adjusted. The application of machine learning techniques in ocular disease diagnosis and monitoring is presented and discussed in the second section of this manuscript. To show the clinical benefits of machine learning in clinical vision sciences, several examples are presented in glaucoma, age-related macular degeneration

  12. MACHINE LEARNING TECHNIQUES USED IN BIG DATA

    Directory of Open Access Journals (Sweden)

    STEFANIA LOREDANA NITA

    2016-07-01

    The classical tools used in data analysis are not enough to benefit from all the advantages of big data. The amount of information is too large for a complete investigation, and the possible connections and relations between data could be missed, because it is difficult or even impossible to verify all assumptions about the information. Machine learning is a great solution for finding concealed correlations or relationships between data, because it runs at scale and works very well with large data sets. The more data we have, the more useful the machine learning algorithm becomes, because it “learns” from the existing data and applies the found rules to new entries. In this paper, we present some machine learning algorithms and techniques used in big data.

  13. Machine learning techniques in dialogue act recognition

    Directory of Open Access Journals (Sweden)

    Mark Fišel

    2007-05-01

    This report addresses dialogue acts, their existing applications and techniques for automatically recognizing them, in Estonia as well as elsewhere. Three main applications are described: in dialogue systems to determine the intention of the speaker, in dialogue systems with machine translation to resolve ambiguities between possible translation variants, and in speech recognition to reduce the word recognition error rate. Several recognition techniques are described at a surface level: how they work and how they are trained. A summary of the corresponding representation methods is provided for each technique. The paper also includes examples of applying the techniques to dialogue act recognition. The author comes to the conclusion that, using the current evaluation metric, it is impossible to compare dialogue act recognition techniques when these are applied to different dialogue act tag sets. Dialogue acts remain an open research area, with space and need for developing new recognition techniques and methods of evaluation.

  14. Machine learning techniques for optical communication system optimization

    DEFF Research Database (Denmark)

    Zibar, Darko; Wass, Jesper; Thrane, Jakob

    In this paper, machine learning techniques relevant to optical communication are presented and discussed. The focus is on applying machine learning tools to optical performance monitoring and performance prediction.

  15. CRDM motion analysis using machine learning technique

    International Nuclear Information System (INIS)

    Nishimura, Takuya; Nakayama, Hiroyuki; Saitoh, Mayumi; Yaguchi, Seiji

    2017-01-01

    The magnetic jack type Control Rod Drive Mechanism (CRDM) for pressurized water reactor (PWR) plants operates control rods in response to electrical signals from the reactor control system. CRDM operability is evaluated by quantifying the armature's closed/opened response time, i.e. the interval between the coil energizing/de-energizing points and the armature closed/opened points. MHI has already developed an automatic CRDM motion analysis and applied it to actual plants. However, CRDM operational data vary widely depending on characteristics such as plant condition and plant. In the existing motion analysis, applying a single analysis technique to all plant conditions and plants raises an issue of analysis accuracy. In this study, MHI investigated motion analysis using machine learning (Random Forests), which flexibly accommodates the wide variation in CRDM operational data and improves analysis accuracy. (author)

  16. New technique of machining high precision mirror surface press roller

    Science.gov (United States)

    Hongsen, Deng

    1991-03-01

    Machining high-precision, corrosion- and abrasion-resistant mirror-surface press rollers has long been one of the key techniques that production enterprises, as well as the machining and manufacturing sectors of the plastics, papermaking, rubber, film, and chip industries, have sought to resolve. In October 1984, a new combined machining technique of metal brush coating, abrasive-belt grinding, and buffing was used to conduct nearly 20 experiments. In January 1985, a pair of middle-convex, high-precision mirror-surface press rollers was successfully machined. The technical process is described.

  17. Micro machining techniques commonly used in manufacturing field

    Directory of Open Access Journals (Sweden)

    Adem Çiçek

    2011-06-01

    Developing technology and the need for high-precision parts in the manufacturing industry have given rise to micro-machining. Machine tools and workpieces are miniaturized through micro-machining, and material and power consumption are reduced to a minimum. Highly productive use of resources and time can be obtained through this rapidly growing industry around the world. In this paper, different micro-machining techniques are classified by reviewing the investigations recently performed in the field of micro-machining, and their contributions to the manufacturing process are discussed.

  18. Data Mining Practical Machine Learning Tools and Techniques

    CERN Document Server

    Witten, Ian H; Hall, Mark A

    2011-01-01

    Data Mining: Practical Machine Learning Tools and Techniques offers a thorough grounding in machine learning concepts as well as practical advice on applying machine learning tools and techniques in real-world data mining situations. This highly anticipated third edition of the most acclaimed work on data mining and machine learning will teach you everything you need to know about preparing inputs, interpreting outputs, evaluating results, and the algorithmic methods at the heart of successful data mining. Thorough updates reflect the technical changes and modernizations that have taken place

  19. The Watergate Seminar: Non-Traditional College

    Science.gov (United States)

    Harrison, Joseph

    1977-01-01

    Empire State College, a non-traditional unit of the State University of New York, conducted a series of seminars related to the Watergate Affair, in which 23 students took part in an intensive exploration of Watergate. It provided a tremendous opportunity for insight into the concept of seminars within a college. (Author)

  20. Welfare of non-traditional pets.

    Science.gov (United States)

    Schuppli, C A; Fraser, D; Bacon, H J

    2014-04-01

    The keeping of non-traditional or 'exotic' pets has been growing in popularity worldwide. In addition to the typical welfare challenges of keeping more traditional pet species like dogs and cats, ensuring the welfare of non-traditional pets is complicated by factors such as lack of knowledge, difficulties meeting requirements in the home and where and how animals are obtained. This paper uses examples of different species to highlight three major welfare concerns: ensuring that pets under our care i) function well biologically, ii) are free from negative psychological states and able to experience normal pleasures, and iii) lead reasonably natural lives. The keeping of non-traditional pets also raises ethical concerns about whether the animal poses any danger to others (e.g. transmission of zoonotic diseases) and whether the animal might cause environmental damage (e.g. invading non-native habitats when released). The authors used these considerations to create a checklist, which identifies and organises the various concerns that may arise over keeping non-traditional species as pets. An inability to address these concerns raises questions about how to mitigate them or even whether or not certain species should be kept as pets at all. Thus, the authors propose five categories, which range from relatively unproblematic pet species to species whose keeping poses unacceptable risks to the animals, to humans, or to the environment. This approach to the evaluation and categorisation of species could provide a constructive basis for advocacy and regulatory actions.

  1. BENCHMARKING MACHINE LEARNING TECHNIQUES FOR SOFTWARE DEFECT DETECTION

    OpenAIRE

    Saiqa Aleem; Luiz Fernando Capretz; Faheem Ahmed

    2015-01-01

    Machine learning approaches are good at solving problems for which little information is available. In most cases, software domain problems can be characterized as a learning process that depends on various circumstances and changes accordingly. A predictive model is constructed using machine learning approaches to classify modules into defective and non-defective. Machine learning techniques help developers to retrieve useful information after the classification and enable them to analyse data...

  2. Prostate Cancer Probability Prediction By Machine Learning Technique.

    Science.gov (United States)

    Jović, Srđan; Miljković, Milica; Ivanović, Miljan; Šaranović, Milena; Arsić, Milena

    2017-11-26

    The main goal of the study was to explore the possibility of prostate cancer prediction by machine learning techniques. In order to improve the survival probability of prostate cancer patients, it is essential to build suitable prediction models for prostate cancer. If a relevant prediction of prostate cancer can be made, it is easy to design suitable treatment based on the prediction results. Machine learning techniques are the most common techniques for the creation of predictive models. Therefore, in this study several machine learning techniques were applied and compared. The obtained results were analyzed and discussed. It was concluded that machine learning techniques could be used for the relevant prediction of prostate cancer.

  3. A Comparative Analysis of Machine Learning Techniques for Credit Scoring

    OpenAIRE

    Nwulu, Nnamdi; Oroja, Shola; İlkan, Mustafa

    2012-01-01

    Credit scoring has become an oft-researched topic in light of the increasing volatility of the global economy and the recent world financial crisis. Amidst the many methods used for credit scoring, machine learning techniques are becoming increasingly popular due to their efficient and accurate nature and relative simplicity. Furthermore, machine learning techniques minimize the risk of human bias and error and maximize speed, as they are able to perform computation...

  4. Comparison of Machine Learning Techniques for Target Detection

    NARCIS (Netherlands)

    Vink, J.P.; Haan, G. de

    2013-01-01

    This paper focuses on machine learning techniques for real-time detection. Although many supervised learning techniques have been described in the literature, no technique always performs best. Several comparative studies are available, but have not always been performed carefully, leading to invalid

  5. Modelling tick abundance using machine learning techniques and satellite imagery

    DEFF Research Database (Denmark)

    Kjær, Lene Jung; Korslund, L.; Kjelland, V.

    satellite images to run Boosted Regression Tree machine learning algorithms to predict overall distribution (presence/absence of ticks) and relative tick abundance of nymphs and larvae in southern Scandinavia. For nymphs, the predicted abundance had a positive correlation with observed abundance...... the predicted distribution of larvae was mostly even throughout Denmark, it was primarily around the coastlines in Norway and Sweden. Abundance was fairly low overall except in some fragmented patches corresponding to forested habitats in the region. Machine learning techniques allow us to predict for larger...... the collected ticks for pathogens and using the same machine learning techniques to develop prevalence maps of the ScandTick region....

  6. Machine learning techniques to examine large patient databases.

    Science.gov (United States)

    Meyfroidt, Geert; Güiza, Fabian; Ramon, Jan; Bruynooghe, Maurice

    2009-03-01

    Computerization in healthcare in general, and in the operating room (OR) and intensive care unit (ICU) in particular, is on the rise. This leads to large patient databases, with specific properties. Machine learning techniques are able to examine and to extract knowledge from large databases in an automatic way. Although the number of potential applications for these techniques in medicine is large, few medical doctors are familiar with their methodology, advantages and pitfalls. A general overview of machine learning techniques, with a more detailed discussion of some of these algorithms, is presented in this review.

  7. A REVIEW OF STUDIES ON MACHINE LEARNING TECHNIQUES

    OpenAIRE

    Yogesh Singh; Pradeep Kumar Bhatia; Omprakash Sangwan

    2007-01-01

    This paper provides an extensive review of studies related to expert estimation of software development using Machine-Learning Techniques (MLT). Machine learning, in this new era, is demonstrating the promise of producing consistently accurate estimates. A machine learning system effectively “learns” how to estimate from a training set of completed projects. The main goal and contribution of the review is to support the research on expert estimation, i.e. to ease other researchers for r...

  8. Advanced Techniques for Monitoring, Simulation and Optimization of Machining Processes

    OpenAIRE

    Keshari, Anupam

    2011-01-01

    In today’s manufacturing industry, pressure for productivity, higher quality and cost saving is heavier than ever. Surviving in today’s highly competitive world is not an easy task, contemporary technology updates and heavy investments are needed in state of the art machinery and modern cutting tool systems. If the machining resources are underutilized, feasible techniques are needed to utilize resources efficiently. The new enhancements in the machine tools sector have enabled opportunit...

  9. Comparison of machine milk out and calf nursing techniques for ...

    African Journals Online (AJOL)

    Milk yields by machine milk out and calf nursing techniques were estimated monthly from April to August in 24 three-year-old, two-breed cross cows. Overall, average milk yield estimates were 16 - 18 lb/day by machine milk out and 12.79 lb/day by calf nursing, with a difference of 3.91 lb/day. The two methods were similar ...

  10. Machine monitoring via current signature analysis techniques

    International Nuclear Information System (INIS)

    Smith, S.F.; Castleberry, K.N.; Nowlin, C.H.

    1992-01-01

    A significant need in the effort to provide increased production quality is improved plant equipment monitoring capability. Unfortunately, in today's tight economy, even such monitoring instrumentation must be implemented in a recognizably cost-effective manner. By analyzing the electric current drawn by motors, actuators, and other line-powered industrial equipment, significant insights into the operation of the movers, the driven equipment, and even the power source can be obtained. The generic term 'current signature analysis' (CSA) has been coined to describe several techniques for extracting useful equipment or process monitoring information from the electrical power feed system. A patented method developed at Oak Ridge National Laboratory is described which recognizes the presence of line-current modulation produced by motors and actuators driving varying loads. The in-situ application of applicable linear demodulation techniques to the analysis of numerous motor-driven systems is also discussed. The use of high-quality amplitude- and angle-demodulation circuitry has permitted remote status monitoring of several types of medium- and high-power gas compressors in US DOE facilities driven by 3-phase induction motors rated from 100 to 3,500 hp, both with and without intervening speed increasers. Flow characteristics of the compressors, including various forms of abnormal behavior such as surging and rotating stall, produce at the output of the specialized detectors specific time and frequency signatures which can be easily identified for monitoring, control, and fault-prevention purposes. The resultant data are similar in form to information obtained via standard vibration-sensing techniques and can be analyzed using essentially identical methods. In addition, other machinery such as refrigeration compressors, brine pumps, vacuum pumps, fans, and electric motors have been characterized
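
    As a small illustration of the amplitude-demodulation idea behind current signature analysis, the sketch below recovers a simulated 7 Hz load modulation from a 60 Hz line current using a Hilbert-transform envelope; the signal and frequencies are synthetic examples, not data from the ORNL work:

        import numpy as np
        from scipy.signal import hilbert

        # Illustrative current signature: a 60 Hz supply current whose amplitude is
        # modulated at 7 Hz by a varying mechanical load. The Hilbert transform gives
        # the analytic signal, whose magnitude is the demodulated load signature.
        fs = 5000.0
        t = np.arange(0, 2.0, 1.0 / fs)
        load = 1.0 + 0.1 * np.sin(2 * np.pi * 7.0 * t)        # simulated load variation
        current = load * np.sin(2 * np.pi * 60.0 * t)          # modulated line current

        envelope = np.abs(hilbert(current))                    # amplitude demodulation

        # The spectrum of the envelope exposes the 7 Hz load signature directly,
        # without the 60 Hz carrier.
        spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
        freqs = np.fft.rfftfreq(envelope.size, 1.0 / fs)
        print(freqs[np.argmax(spectrum)])                      # expected near 7 Hz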

  11. Analysing CMS transfers using Machine Learning techniques

    CERN Document Server

    Diotalevi, Tommaso

    2016-01-01

    LHC experiments transfer more than 10 PB/week between all grid sites using the FTS transfer service. In particular, CMS manages almost 5 PB/week of FTS transfers with PhEDEx (Physics Experiment Data Export). FTS sends metrics about each transfer (e.g. transfer rate, duration, size) to a central HDFS storage at CERN. The work done during these three months as a Summer Student involved the use of ML techniques, using a CMS framework called DCAFPilot, to process this new data and generate predictions of transfer latencies on all links between Grid sites. This analysis will provide, as a future service, the necessary information to proactively identify and possibly fix latency-affected transfers over the WLCG.

  12. Vector machine techniques for modeling of seismic liquefaction data

    Directory of Open Access Journals (Sweden)

    Pijush Samui

    2014-06-01

    This article employs three soft computing techniques, Support Vector Machine (SVM), Least Square Support Vector Machine (LSSVM) and Relevance Vector Machine (RVM), for prediction of the liquefaction susceptibility of soil. SVM and LSSVM are based on the structural risk minimization (SRM) principle, which seeks to minimize an upper bound on the generalization error consisting of the sum of the training error and a confidence interval. RVM is a sparse Bayesian kernel machine. SVM, LSSVM and RVM have been used as classification tools. The developed SVM, LSSVM and RVM give equations for the prediction of liquefaction susceptibility of soil. A comparative study has been carried out between the developed SVM, LSSVM and RVM models. The results indicate that the developed SVM gives the best performance for prediction of the liquefaction susceptibility of soil.
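
    A minimal sketch of using a support vector machine as a liquefaction classifier in the spirit described above; the feature columns, values and labels are synthetic assumptions, not the soil records used in the article:

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline

        # Hypothetical liquefaction records: the columns could stand for quantities such
        # as a cyclic stress ratio and a penetration-test value (values are synthetic);
        # labels: 1 = liquefied, 0 = not liquefied.
        X = np.array([[0.30, 8.0], [0.25, 12.0], [0.40, 6.0], [0.10, 25.0],
                      [0.15, 20.0], [0.35, 9.0], [0.12, 30.0], [0.28, 11.0]])
        y = np.array([1, 1, 1, 0, 0, 1, 0, 0])

        # A radial-basis-function SVM classifier; scaling the inputs first is standard practice.
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
        clf.fit(X, y)

        # Predict liquefaction susceptibility for a new soil record.
        print(clf.predict([[0.22, 15.0]]))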

  13. Technique for Increasing Accuracy of Positioning System of Machine Tools

    Directory of Open Access Journals (Sweden)

    Sh. Ji

    2014-01-01

    The aim of the research is to improve the accuracy of the positioning and processing system by using a technique for optimization of the pressure diagrams of the guides in machine tools. Machining quality is directly related to machine accuracy, which characterizes the degree to which the various errors of the machine affect the result. The accuracy of the positioning system is one of the most significant machining characteristics, allowing the accuracy of processed parts to be evaluated. The literature describes the working area of the machine layout as rather informative for characterizing the effect of the positioning system on the macro-geometry of the part surfaces to be processed. To enhance the static accuracy of the studied machine, two groups of measures are possible in principle. One of them aims at a decrease of the cutting force component which produces the overturning moments on the slider. The other group of measures is related to changing the sizes of the guide facets, which may lead to a change of their profile. The study was based on mathematical modelling and optimization of the cutting zone coordinates, and a formula was derived to determine the surface pressure on the guides. The selected optimization parameters are the cutting force vectors and the dimensions of the slides and guides. The obtained results show that a technique for optimization of the coordinates in the cutting zone is necessary to increase processing accuracy. The research has established that, to define the optimal coordinates of the cutting zone, the sizes of the slides and the values and coordinates of the applied forces have to be changed, equalizing the pressures and improving the accuracy of the positioning system of machine tools. Force vectors are applied at different points of the workspace, pressure diagrams that take into account the changes in the positioning system parameters are found, and the pressure diagrams are equalized to provide the highest accuracy of the machine tools.

  14. Machine Learning Techniques in Optimal Design

    Science.gov (United States)

    Cerbone, Giuseppe

    1992-01-01

    Many important applications can be formalized as constrained optimization tasks. For example, we are studying the engineering domain of two-dimensional (2-D) structural design. In this task, the goal is to design a structure of minimum weight that bears a set of loads. A solution to a design problem in which there is a single load (L) and two stationary support points (S1 and S2), consisting of four members, E1, E2, E3, and E4, that connect the load to the support points, is discussed. In principle, optimal solutions to problems of this kind can be found by numerical optimization techniques. However, in practice [Vanderplaats, 1984] these methods are slow and they can produce different local solutions whose quality (ratio to the global optimum) varies with the choice of starting points. Hence, their applicability to real-world problems is severely restricted. To overcome these limitations, we propose to augment numerical optimization by first performing a symbolic compilation stage to produce: (a) objective functions that are faster to evaluate and that depend less on the choice of the starting point and (b) selection rules that associate problem instances with a set of recommended solutions. These goals are accomplished by successive specializations of the problem class and of the associated objective functions. In the end, this process reduces the problem to a collection of independent functions that are fast to evaluate, that can be differentiated symbolically, and that represent smaller regions of the overall search space. However, the specialization process can produce a large number of sub-problems. This is overcome by inductively deriving selection rules which associate problems with small sets of specialized independent sub-problems. Each set of candidate solutions is chosen to minimize a cost function which expresses the tradeoff between the quality of the solution that can be obtained from the sub-problem and the time it takes to produce it. The overall solution

  15. Detecting Mislabeled Data Using Supervised Machine Learning Techniques

    NARCIS (Netherlands)

    Poel, Mannes; Schmorrow, Dylan D.; Fidopiastis, Cali M.

    2017-01-01

    A lot of data sets, gathered for instance during user experiments, are contaminated with noise. Some noise in the measured features is not much of a problem; it even increases the performance of many Machine Learning (ML) techniques. But for noise in the labels (mislabeled data) the situation is

  16. Adapting Virtual Machine Techniques for Seamless Aspect Support

    NARCIS (Netherlands)

    Bockisch, Christoph; Arnold, Matthew; Dinkelaker, Tom; Mezini, Mira

    2006-01-01

    Current approaches to compiling aspect-oriented programs are inefficient. This inefficiency has negative effects on the productivity of the development process and is especially prohibitive for dynamic aspect deployment. In this work, we present how well-known virtual machine techniques can be used

  17. Machine learning techniques to examine large patient databases

    OpenAIRE

    Meyfroidt, Geert; Guiza Grandas, Fabian; Ramon, Jan; Bruynooghe, Maurice

    2009-01-01

    Computerization in health care in general, and in the operating room (OR) and intensive care unit (ICU) in particular, is on the rise. This leads to large patient databases, with specific properties. Machine learning techniques are able to examine and to extract knowledge from large databases in an automatic way. Although the number of potential applications for these techniques in medicine is large, few medical doctors are familiar with their methodology, advantages and pitfalls. A general o...

  18. Contemporary machine learning: techniques for practitioners in the physical sciences

    Science.gov (United States)

    Spears, Brian

    2017-10-01

    Machine learning is the science of using computers to find relationships in data without explicitly knowing or programming those relationships in advance. Often without realizing it, we employ machine learning every day as we use our phones or drive our cars. Over the last few years, machine learning has found increasingly broad application in the physical sciences. This most often involves building a model relationship between a dependent, measurable output and an associated set of controllable, but complicated, independent inputs. The methods are applicable both to experimental observations and to databases of simulated output from large, detailed numerical simulations. In this tutorial, we will present an overview of current tools and techniques in machine learning - a jumping-off point for researchers interested in using machine learning to advance their work. We will discuss supervised learning techniques for modeling complicated functions, beginning with familiar regression schemes, then advancing to more sophisticated decision trees, modern neural networks, and deep learning methods. Next, we will cover unsupervised learning and techniques for reducing the dimensionality of input spaces and for clustering data. We'll show example applications from both magnetic and inertial confinement fusion. Along the way, we will describe methods for practitioners to help ensure that their models generalize from their training data to as-yet-unseen test data. We will finally point out some limitations to modern machine learning and speculate on some ways that practitioners from the physical sciences may be particularly suited to help. This work was performed by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  19. Data mining practical machine learning tools and techniques

    CERN Document Server

    Witten, Ian H

    2005-01-01

    As with any burgeoning technology that enjoys commercial attention, the use of data mining is surrounded by a great deal of hype. Exaggerated reports tell of secrets that can be uncovered by setting algorithms loose on oceans of data. But there is no magic in machine learning, no hidden power, no alchemy. Instead there is an identifiable body of practical techniques that can extract useful information from raw data. This book describes these techniques and shows how they work. The book is a major revision of the first edition that appeared in 1999. While the basic core remains the same

  20. Using Machine Learning Techniques in the Analysis of Oceanographic Data

    Science.gov (United States)

    Falcinelli, K. E.; Abuomar, S.

    2017-12-01

    Acoustic Doppler Current Profilers (ADCPs) are oceanographic tools capable of collecting large amounts of current profile data. Using unsupervised machine learning techniques such as principal component analysis, fuzzy c-means clustering, and self-organizing maps, patterns and trends in an ADCP dataset are found. Cluster validity algorithms such as visual assessment of cluster tendency and clustering index are used to determine the optimal number of clusters in the ADCP dataset. These techniques prove to be useful in analysis of ADCP data and demonstrate potential for future use in other oceanographic applications.
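
    As a rough illustration of the unsupervised workflow described above, the sketch below applies PCA followed by clustering and a validity index to synthetic profile data; KMeans and the silhouette score are used here as generic stand-ins for the fuzzy c-means and clustering-index steps, and the data are not ADCP measurements:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler
        from sklearn.cluster import KMeans
        from sklearn.metrics import silhouette_score

        # Synthetic stand-in for ADCP current profiles: each row is one time step,
        # each column a depth-bin velocity.
        rng = np.random.default_rng(1)
        profiles = np.vstack([rng.normal(0.2, 0.05, (50, 20)),
                              rng.normal(-0.1, 0.05, (50, 20))])

        scaled = StandardScaler().fit_transform(profiles)
        components = PCA(n_components=3).fit_transform(scaled)   # dimensionality reduction

        # Choose the number of clusters with a simple validity index.
        for k in (2, 3, 4):
            labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(components)
            print(k, silhouette_score(components, labels))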

  1. Imaging and machine learning techniques for diagnosis of Alzheimer's disease.

    Science.gov (United States)

    Mirzaei, Golrokh; Adeli, Anahita; Adeli, Hojjat

    2016-12-01

    Alzheimer's disease (AD) is a common health problem in elderly people. There has been considerable research toward the diagnosis and early detection of this disease in the past decade. The sensitivity of biomarkers and the accuracy of the detection techniques have been defined to be the key to an accurate diagnosis. This paper presents a state-of-the-art review of the research performed on the diagnosis of AD based on imaging and machine learning techniques. Different segmentation and machine learning techniques used for the diagnosis of AD are reviewed including thresholding, supervised and unsupervised learning, probabilistic techniques, Atlas-based approaches, and fusion of different image modalities. More recent and powerful classification techniques such as the enhanced probabilistic neural network of Ahmadlou and Adeli should be investigated with the goal of improving the diagnosis accuracy. A combination of different image modalities can help improve the diagnosis accuracy rate. Research is needed on the combination of modalities to discover multi-modal biomarkers.

  2. Analysis of Machine Learning Techniques for Heart Failure Readmissions.

    Science.gov (United States)

    Mortazavi, Bobak J; Downing, Nicholas S; Bucholz, Emily M; Dharmarajan, Kumar; Manhapra, Ajay; Li, Shu-Xia; Negahban, Sahand N; Krumholz, Harlan M

    2016-11-01

    The current ability to predict readmissions in patients with heart failure is modest at best. It is unclear whether machine learning techniques that address higher dimensional, nonlinear relationships among variables would enhance prediction. We sought to compare the effectiveness of several machine learning algorithms for predicting readmissions. Using data from the Telemonitoring to Improve Heart Failure Outcomes trial, we compared the effectiveness of random forests, boosting, random forests combined hierarchically with support vector machines or logistic regression (LR), and Poisson regression against traditional LR to predict 30- and 180-day all-cause readmissions and readmissions because of heart failure. We randomly selected 50% of patients for a derivation set, and a validation set comprised the remaining patients, validated using 100 bootstrapped iterations. We compared C statistics for discrimination and distributions of observed outcomes in risk deciles for predictive range. In 30-day all-cause readmission prediction, the best performing machine learning model, random forests, provided a 17.8% improvement over LR (mean C statistics, 0.628 and 0.533, respectively). For readmissions because of heart failure, boosting improved the C statistic by 24.9% over LR (mean C statistic 0.678 and 0.543, respectively). For 30-day all-cause readmission, the observed readmission rates in the lowest and highest deciles of predicted risk with random forests (7.8% and 26.2%, respectively) showed a much wider separation than LR (14.2% and 16.4%, respectively). Machine learning methods improved the prediction of readmission after hospitalization for heart failure compared with LR and provided the greatest predictive range in observed readmission rates. © 2016 American Heart Association, Inc.
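
    A minimal sketch of the model-comparison step described above, computing the C statistic (ROC AUC) of logistic regression and a random forest on a held-out validation set; the data are synthetic and merely stand in for the trial dataset:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import roc_auc_score
        from sklearn.datasets import make_classification

        # Synthetic, imbalanced classification data standing in for readmission records.
        X, y = make_classification(n_samples=2000, n_features=20, weights=[0.75],
                                   random_state=0)
        X_dev, X_val, y_dev, y_val = train_test_split(X, y, test_size=0.5, random_state=0)

        # Fit each model on the derivation set and compare C statistics on validation data.
        for name, model in [("logistic regression", LogisticRegression(max_iter=1000)),
                            ("random forest", RandomForestClassifier(n_estimators=300,
                                                                     random_state=0))]:
            model.fit(X_dev, y_dev)
            prob = model.predict_proba(X_val)[:, 1]
            print(name, "C statistic:", round(roc_auc_score(y_val, prob), 3))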

  3. Comparative Performance Analysis of Machine Learning Techniques for Software Bug Detection

    OpenAIRE

    Saiqa Aleem; Luiz Fernando Capretz; Faheem Ahmed

    2015-01-01

    Machine learning techniques can be used to analyse data from different perspectives and enable developers to retrieve useful information. Machine learning techniques are proven to be useful in terms of software bug prediction. In this paper, a comparative performance analysis of different machine learning techniques is explored for software bug prediction on publicly available data sets. Results showed most of the mac ...

  4. Dropout Prediction in E-Learning Courses through the Combination of Machine Learning Techniques

    Science.gov (United States)

    Lykourentzou, Ioanna; Giannoukos, Ioannis; Nikolopoulos, Vassilis; Mpardis, George; Loumos, Vassili

    2009-01-01

    In this paper, a dropout prediction method for e-learning courses, based on three popular machine learning techniques and detailed student data, is proposed. The machine learning techniques used are feed-forward neural networks, support vector machines and probabilistic ensemble simplified fuzzy ARTMAP. Since a single technique may fail to…

  5. Detection and Classification of Baleen Whale Foraging Calls Combining Pattern Recognition and Machine Learning Techniques

    Science.gov (United States)

    2016-12-01

    Detection and Classification of Baleen Whale Foraging Calls Combining Pattern Recognition and Machine Learning Techniques, by Ho-Chun Huang, December 2016. ... using a machine learning technique, a logistic regression classifier. These algorithms have been trained and evaluated using the Detection

  6. Development of an evaluation technique for human-machine interface

    International Nuclear Information System (INIS)

    Min, Dae Hwan; Koo, Sang Hui; Ahn, Won Yeong; Ryu, Yeong Shin

    1997-07-01

    The purpose of this study is two-fold : firstly to establish an evaluation technique for HMI(Human Machine Interface) in NPPs(Nuclear Power Plants) and secondly to develop an architecture of a support system which can be used for the evaluation of HMI. In order to establish an evaluation technique, this study conducted literature review on basic theories of cognitive science studies and summarized the cognitive characteristics of humans. This study also surveyed evaluation techniques of HMI in general, and reviewed studies on the evaluation of HMI in NPPs. On the basis of this survey, the study established a procedure for the evaluation of HMI in NPPs in Korea and laid a foundation for empirical verification

  7. Development of an evaluation technique for human-machine interface

    Energy Technology Data Exchange (ETDEWEB)

    Min, Dae Hwan; Koo, Sang Hui; Ahn, Won Yeong; Ryu, Yeong Shin [Korea Univ., Seoul (Korea, Republic of)

    1997-07-15

    The purpose of this study is two-fold : firstly to establish an evaluation technique for HMI(Human Machine Interface) in NPPs(Nuclear Power Plants) and secondly to develop an architecture of a support system which can be used for the evaluation of HMI. In order to establish an evaluation technique, this study conducted literature review on basic theories of cognitive science studies and summarized the cognitive characteristics of humans. This study also surveyed evaluation techniques of HMI in general, and reviewed studies on the evaluation of HMI in NPPs. On the basis of this survey, the study established a procedure for the evaluation of HMI in NPPs in Korea and laid a foundation for empirical verification.

  8. Classification of Phishing Email Using Random Forest Machine Learning Technique

    OpenAIRE

    Akinyelu, Andronicus A.; Adewumi, Aderemi O.

    2013-01-01

    Phishing is one of the major challenges faced by the world of e-commerce today. Thanks to phishing attacks, billions of dollars have been lost by many companies and individuals. In 2012, an online report put the loss due to phishing attack at about $1.5 billion. This global impact of phishing attacks will continue to be on the increase and thus requires more efficient phishing detection techniques to curb the menace. This paper investigates and reports the use of random forest machine learnin...

  9. Applying machine learning classification techniques to automate sky object cataloguing

    Science.gov (United States)

    Fayyad, Usama M.; Doyle, Richard J.; Weir, W. Nick; Djorgovski, Stanislav

    1993-08-01

    We describe the application of Artificial Intelligence machine learning techniques to the development of an automated tool for the reduction of a large scientific data set. The 2nd Mt. Palomar Northern Sky Survey is nearly completed. This survey provides comprehensive coverage of the northern celestial hemisphere in the form of photographic plates. The plates are being transformed into digitized images whose quality will probably not be surpassed in the next ten to twenty years. The images are expected to contain on the order of 10^7 galaxies and 10^8 stars. Astronomers wish to determine which of these sky objects belong to various classes of galaxies and stars. Unfortunately, the size of this data set precludes analysis in an exclusively manual fashion. Our approach is to develop a software system which integrates the functions of independently developed techniques for image processing and data classification. Digitized sky images are passed through image processing routines to identify sky objects and to extract a set of features for each object. These routines are used to help select a useful set of attributes for classifying sky objects. Then GID3 (Generalized ID3) and O-B Tree, two inductive learning techniques, learn classification decision trees from examples. These classifiers will then be applied to new data. The development process is highly interactive, with astronomer input playing a vital role. Astronomers refine the feature set used to construct sky object descriptions, and evaluate the performance of the automated classification technique on new data. This paper gives an overview of the machine learning techniques with an emphasis on their general applicability, describes the details of our specific application, and reports the initial encouraging results. The results indicate that our machine learning approach is well-suited to the problem. The primary benefit of the approach is increased data reduction throughput. Another benefit is
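
    A minimal sketch of the classification stage: feature vectors produced by the image-processing step are used to train a decision tree that separates stars from galaxies. A standard CART tree from scikit-learn stands in for the GID3 and O-B Tree learners, and the features and values are synthetic:

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.model_selection import train_test_split

        # Hypothetical feature vectors of the kind produced by the image-processing stage
        # (e.g. brightness, ellipticity, concentration); labels 0 = star, 1 = galaxy.
        rng = np.random.default_rng(0)
        stars = np.column_stack([rng.normal(1.0, 0.2, 200),    # bright
                                 rng.normal(0.1, 0.05, 200),   # nearly round
                                 rng.normal(0.8, 0.1, 200)])   # concentrated
        galaxies = np.column_stack([rng.normal(0.5, 0.2, 200),
                                    rng.normal(0.4, 0.1, 200),
                                    rng.normal(0.4, 0.1, 200)])
        X = np.vstack([stars, galaxies])
        y = np.array([0] * 200 + [1] * 200)

        # Learn a classification decision tree from examples and test it on held-out data.
        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
        tree = DecisionTreeClassifier(max_depth=4).fit(X_train, y_train)
        print("held-out accuracy:", tree.score(X_test, y_test))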

  10. An authority-based machines intelligence measurement technique with implementation to human-machine collaboration navigation in the nuclear installation

    International Nuclear Information System (INIS)

    Nugroho, Djoko Hari

    2003-01-01

    An authority-based machine intelligence measurement technique with implementation for human-machine collaboration navigation in a nuclear installation is presented. A technique for measuring the machine intelligence quotient (MIQ) has been developed, based on an authority approach for decision making across 4 sequential operation tasks and an 8-degree scale of automation. The index of machine intelligence is important as a design goal to establish intelligence superiority among products. The technique is mostly beneficial for predicting the autonomy level of the system. Moreover, the technique is implemented for the human-machine collaboration navigation configuration in the nuclear installation. It can be concluded that the MIQ of the system is 26, on a scale from 4 up to 32.

  11. Machine-learning techniques applied to antibacterial drug discovery.

    Science.gov (United States)

    Durrant, Jacob D; Amaro, Rommie E

    2015-01-01

    The emergence of drug-resistant bacteria threatens to revert humanity back to the preantibiotic era. Even now, multidrug-resistant bacterial infections annually result in millions of hospital days, billions in healthcare costs, and, most importantly, tens of thousands of lives lost. As many pharmaceutical companies have abandoned antibiotic development in search of more lucrative therapeutics, academic researchers are uniquely positioned to fill the pipeline. Traditional high-throughput screens and lead-optimization efforts are expensive and labor intensive. Computer-aided drug-discovery techniques, which are cheaper and faster, can accelerate the identification of novel antibiotics, leading to improved hit rates and faster transitions to preclinical and clinical testing. The current review describes two machine-learning techniques, neural networks and decision trees, that have been used to identify experimentally validated antibiotics. We conclude by describing the future directions of this exciting field. © 2015 John Wiley & Sons A/S.

  12. Influence of different cooling and lubrication techniques on material machinability in machining:

    OpenAIRE

    Cica, Djordje; Globočki-Lakić, Gordana; Kramar, Davorin; Sredanović, Branislav

    2013-01-01

    In this paper, a novel approach to the definition of universal machinability is presented. The machinability model is based on analysing the vector of the cutting process performance. The machinability of C45E steel was analysed and evaluated according to the developed machinability definition. As the machinability criteria, cutting force, intensity of tool wear and surface roughness were used. Analysis of machinability was performed using different cooling and lubrication conditions: convent...

  13. Toward accelerating landslide mapping with interactive machine learning techniques

    Science.gov (United States)

    Stumpf, André; Lachiche, Nicolas; Malet, Jean-Philippe; Kerle, Norman; Puissant, Anne

    2013-04-01

    Despite important advances in the development of more automated methods for landslide mapping from optical remote sensing images, the elaboration of inventory maps after major triggering events still remains a tedious task. Image classification with expert-defined rules typically still requires significant manual labour for the elaboration and adaptation of rule sets for each particular case. Machine learning algorithms, on the contrary, have the ability to learn and identify complex image patterns from labelled examples but may require relatively large amounts of training data. In order to reduce the amount of required training data, active learning has evolved as a key concept to guide the sampling for applications such as document classification, genetics and remote sensing. The general underlying idea of most active learning approaches is to initialize a machine learning model with a small training set, and to subsequently exploit the model state and/or the data structure to iteratively select the most valuable samples that should be labelled by the user and added to the training set. With relatively few queries and labelled samples, an active learning strategy should ideally yield at least the same accuracy as an equivalent classifier trained with many randomly selected samples. Our study was dedicated to the development of an active learning approach for landslide mapping from VHR remote sensing images, with special consideration of the spatial distribution of the samples. The developed approach is a region-based query heuristic that enables the user's attention to be guided towards a few compact spatial batches rather than distributed points, resulting in time savings of 50% and more compared to standard active learning techniques. The approach was tested with multi-temporal and multi-sensor satellite images capturing recent large-scale triggering events in Brazil and China, and demonstrated balanced user's and producer's accuracies between 74% and 80%. The assessment also
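
    A minimal sketch of the pool-based active-learning loop described above, using uncertainty sampling with a logistic classifier on synthetic data; the real approach queries compact spatial batches of image segments rather than the single points used here:

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.datasets import make_classification

        # Synthetic pool of unlabelled samples plus a small initial training set.
        X_pool, y_pool = make_classification(n_samples=1000, n_features=10, random_state=0)
        labelled = list(range(10))                    # small initial training set
        unlabelled = [i for i in range(1000) if i not in labelled]

        model = LogisticRegression(max_iter=1000)
        for _ in range(20):                           # 20 query rounds
            model.fit(X_pool[labelled], y_pool[labelled])
            prob = model.predict_proba(X_pool[unlabelled])[:, 1]
            # Query the sample the classifier is least certain about (prob closest to 0.5).
            query = unlabelled[int(np.argmin(np.abs(prob - 0.5)))]
            labelled.append(query)                    # the "user" provides its label
            unlabelled.remove(query)

        print("training-set size:", len(labelled),
              "accuracy on the rest:", model.score(X_pool[unlabelled], y_pool[unlabelled]))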

  14. Estimation of Alpine Skier Posture Using Machine Learning Techniques

    Science.gov (United States)

    Nemec, Bojan; Petrič, Tadej; Babič, Jan; Supej, Matej

    2014-01-01

    High precision Global Navigation Satellite System (GNSS) measurements are becoming more and more popular in alpine skiing due to the relatively undemanding setup and excellent performance. However, GNSS provides only single-point measurements that are defined with the antenna placed typically behind the skier's neck. A key issue is how to estimate other more relevant parameters of the skier's body, like the center of mass (COM) and ski trajectories. Previously, these parameters were estimated by modeling the skier's body with an inverted-pendulum model that oversimplified the skier's body. In this study, we propose two machine learning methods that overcome this shortcoming and estimate COM and skis trajectories based on a more faithful approximation of the skier's body with nine degrees-of-freedom. The first method utilizes a well-established approach of artificial neural networks, while the second method is based on a state-of-the-art statistical generalization method. Both methods were evaluated using the reference measurements obtained on a typical giant slalom course and compared with the inverted-pendulum method. Our results outperform the results of commonly used inverted-pendulum methods and demonstrate the applicability of machine learning techniques in biomechanical measurements of alpine skiing. PMID:25313492

  15. Estimation of alpine skier posture using machine learning techniques.

    Science.gov (United States)

    Nemec, Bojan; Petrič, Tadej; Babič, Jan; Supej, Matej

    2014-10-13

    High precision Global Navigation Satellite System (GNSS) measurements are becoming more and more popular in alpine skiing due to the relatively undemanding setup and excellent performance. However, GNSS provides only single-point measurements that are defined with the antenna placed typically behind the skier's neck. A key issue is how to estimate other more relevant parameters of the skier's body, like the center of mass (COM) and ski trajectories. Previously, these parameters were estimated by modeling the skier's body with an inverted-pendulum model that oversimplified the skier's body. In this study, we propose two machine learning methods that overcome this shortcoming and estimate COM and skis trajectories based on a more faithful approximation of the skier's body with nine degrees-of-freedom. The first method utilizes a well-established approach of artificial neural networks, while the second method is based on a state-of-the-art statistical generalization method. Both methods were evaluated using the reference measurements obtained on a typical giant slalom course and compared with the inverted-pendulum method. Our results outperform the results of commonly used inverted-pendulum methods and demonstrate the applicability of machine learning techniques in biomechanical measurements of alpine skiing.

  16. Prediction of brain tumor progression using a machine learning technique

    Science.gov (United States)

    Shen, Yuzhong; Banerjee, Debrup; Li, Jiang; Chandler, Adam; Shen, Yufei; McKenzie, Frederic D.; Wang, Jihong

    2010-03-01

    A machine learning technique is presented for assessing brain tumor progression by exploring six patients' complete MRI records scanned during their visits in the past two years. There are ten MRI series, including diffusion tensor image (DTI), for each visit. After registering all series to the corresponding DTI scan at the first visit, annotated normal and tumor regions were overlaid. Intensity values of each pixel inside the annotated regions were then extracted across all of the ten MRI series to compose a 10-dimensional vector. Each feature vector falls into one of three categories: normal, tumor, and normal but progressed to tumor at a later time. In this preliminary study, we focused on the trend of brain tumor progression during three consecutive visits, i.e., visits A, B, and C. A machine learning algorithm was trained using the data containing information from visit A to visit B, and the trained model was used to predict tumor progression from visit A to visit C. Preliminary results showed that prediction of brain tumor progression is feasible. An average of 80.9% pixel-wise accuracy was achieved for tumor progression prediction at visit C.
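
    The abstract specifies 10-dimensional pixel feature vectors and three classes but does not name the learning algorithm; the sketch below shows one plausible setup (a random forest is an assumption, as are the data shapes), training on pixels labelled from visits A and B and predicting visit C.

```python
# Illustrative three-class pixel classification on 10-dimensional MRI feature
# vectors; the random forest and the data shapes are assumptions, data are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(2)
X_ab = rng.random((8000, 10))          # pixel vectors labelled from visits A and B
y_ab = rng.integers(0, 3, size=8000)   # 0 = normal, 1 = tumor, 2 = progressed to tumor later
X_c = rng.random((2000, 10))           # pixel vectors from visit C
y_c = rng.integers(0, 3, size=2000)

clf = RandomForestClassifier(n_estimators=300, random_state=2).fit(X_ab, y_ab)
print("pixel-wise accuracy at visit C:", accuracy_score(y_c, clf.predict(X_c)))
```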

  17. Classification of Phishing Email Using Random Forest Machine Learning Technique

    Directory of Open Access Journals (Sweden)

    Andronicus A. Akinyelu

    2014-01-01

    Full Text Available Phishing is one of the major challenges faced by the world of e-commerce today. Owing to phishing attacks, billions of dollars have been lost by many companies and individuals. In 2012, an online report put the loss due to phishing attacks at about $1.5 billion. This global impact of phishing attacks will continue to be on the increase and thus requires more efficient phishing detection techniques to curb the menace. This paper investigates and reports the use of the random forest machine learning algorithm in the classification of phishing attacks, with the major objective of developing an improved phishing email classifier with better prediction accuracy and fewer features. From a dataset consisting of 2000 phishing and ham emails, a set of prominent phishing email features (identified from the literature) were extracted and used by the machine learning algorithm, with a resulting classification accuracy of 99.7% and low false negative (FN) and false positive (FP) rates.
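
    As a companion to the record above, the following sketch shows a random-forest e-mail classifier evaluated with cross-validation; the binary feature encoding and the toy data are assumptions and do not reproduce the feature set of the cited study.

```python
# Illustrative random-forest phishing classifier on binary e-mail features
# (e.g. presence of an IP-based URL); features and labels are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
X = rng.integers(0, 2, size=(2000, 9))   # nine binary features per e-mail
y = rng.integers(0, 2, size=2000)        # 1 = phishing, 0 = ham

clf = RandomForestClassifier(n_estimators=100, random_state=3)
print("10-fold cross-validated accuracy:", cross_val_score(clf, X, y, cv=10).mean())
```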

  18. Estimation of Alpine Skier Posture Using Machine Learning Techniques

    Directory of Open Access Journals (Sweden)

    Bojan Nemec

    2014-10-01

    Full Text Available High precision Global Navigation Satellite System (GNSS) measurements are becoming more and more popular in alpine skiing due to the relatively undemanding setup and excellent performance. However, GNSS provides only single-point measurements that are defined with the antenna placed typically behind the skier’s neck. A key issue is how to estimate other more relevant parameters of the skier’s body, like the center of mass (COM) and ski trajectories. Previously, these parameters were estimated by modeling the skier’s body with an inverted-pendulum model that oversimplified the skier’s body. In this study, we propose two machine learning methods that overcome this shortcoming and estimate COM and ski trajectories based on a more faithful approximation of the skier’s body with nine degrees-of-freedom. The first method utilizes a well-established approach of artificial neural networks, while the second method is based on a state-of-the-art statistical generalization method. Both methods were evaluated using the reference measurements obtained on a typical giant slalom course and compared with the inverted-pendulum method. Our results outperform the results of commonly used inverted-pendulum methods and demonstrate the applicability of machine learning techniques in biomechanical measurements of alpine skiing.

  19. Modern machine learning techniques and their applications in cartoon animation research

    CERN Document Server

    Yu, Jun

    2013-01-01

    The integration of machine learning techniques and cartoon animation research is fast becoming a hot topic. This book helps readers learn the latest machine learning techniques, including patch alignment framework; spectral clustering, graph cuts, and convex relaxation; ensemble manifold learning; multiple kernel learning; multiview subspace learning; and multiview distance metric learning. It then presents the applications of these modern machine learning techniques in cartoon animation research. With these techniques, users can efficiently utilize the cartoon materials to generate animations

  20. Using support vector machines in the multivariate state estimation technique

    International Nuclear Information System (INIS)

    Zavaljevski, N.; Gross, K.C.

    1999-01-01

    One approach to validate nuclear power plant (NPP) signals makes use of pattern recognition techniques. This approach often assumes that there is a set of signal prototypes that are continuously compared with the actual sensor signals. These signal prototypes are often computed based on empirical models with little or no knowledge about physical processes. A common problem of all data-based models is their limited ability to make predictions on the basis of available training data. Another problem is related to suboptimal training algorithms. Both of these potential shortcomings with conventional approaches to signal validation and sensor operability validation are successfully resolved by adopting a recently proposed learning paradigm called the support vector machine (SVM). The work presented here is a novel application of SVM for data-based modeling of system state variables in an NPP, integrated with a nonlinear, nonparametric technique called the multivariate state estimation technique (MSET), an algorithm developed at Argonne National Laboratory for a wide range of nuclear plant applications
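
    The record combines SVM-based modelling with MSET for signal validation; the sketch below shows only the generic ingredient of regressing one sensor from correlated neighbours with support vector regression and flagging large residuals. The signals are synthetic, and this is not the MSET algorithm itself.

```python
# Illustrative support vector regression of one redundant sensor signal from its
# correlated neighbours; synthetic signals, not the SVM/MSET scheme of the record.
import numpy as np
from sklearn.svm import SVR

t = np.linspace(0.0, 10.0, 1000)
noise = 0.01 * np.random.default_rng(4).normal(size=t.size)
s1 = np.sin(t) + noise                 # sensor to be validated
s2 = np.sin(t + 0.1)                   # correlated sensor
s3 = 0.5 * np.sin(t) + 0.2             # correlated sensor

X = np.column_stack([s2, s3])
model = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X, s1)
residual = s1 - model.predict(X)       # persistently large residuals would flag a drifting sensor
print("max residual:", np.abs(residual).max())
```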

  1. Machine Learning Techniques for Arterial Pressure Waveform Analysis

    Directory of Open Access Journals (Sweden)

    João Cardoso

    2013-05-01

    Full Text Available The Arterial Pressure Waveform (APW) can provide essential information about arterial wall integrity and arterial stiffness. Most APW analysis frameworks individually process each hemodynamic parameter and do not evaluate inter-dependencies in the overall pulse morphology. The key contribution of this work is the use of machine learning algorithms to deal with vectorized features extracted from APW. With this purpose, we follow a five-step evaluation methodology: (1) a custom-designed, non-invasive, electromechanical device was used in the data collection from 50 subjects; (2) the acquired position and amplitude of onset, Systolic Peak (SP), Point of Inflection (Pi) and Dicrotic Wave (DW) were used for the computation of some morphological attributes; (3) pre-processing work on the datasets was performed in order to reduce the number of input features and increase the model accuracy by selecting the most relevant ones; (4) classification of the dataset was carried out using four different machine learning algorithms: Random Forest, BayesNet (probabilistic), J48 (decision tree) and RIPPER (rule-based induction); and (5) we evaluate the trained models, using the majority-voting system, comparatively to the respective calculated Augmentation Index (AIx). Classification algorithms have proved to be efficient; in particular, Random Forest has shown good accuracy (96.95%) and a high area under the curve (AUC) of a Receiver Operating Characteristic (ROC) curve (0.961). Finally, during validation tests, a correlation between high risk labels, retrieved from the multi-parametric approach, and positive AIx values was verified. This approach allows for designing new hemodynamic morphology vectors and techniques for multiple APW analysis, thus improving the arterial pulse understanding, especially when compared to traditional single-parameter analysis, where the failure in one parameter measurement component, such as Pi, can jeopardize the whole evaluation.
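
    The Weka learners named in the record (Random Forest, BayesNet, J48, RIPPER) with majority voting can be approximated in scikit-learn as sketched below; the substituted estimators, the feature layout and the labels are assumptions for illustration only.

```python
# Illustrative hard-voting ensemble over APW-style morphological features.
# BayesNet/J48/RIPPER are approximated by scikit-learn counterparts; data are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
X = rng.random((500, 8))                      # e.g. positions/amplitudes of onset, SP, Pi and DW
y = (X[:, 1] - X[:, 3] > 0.1).astype(int)     # stand-in for a low/high risk label

vote = VotingClassifier(
    estimators=[("rf", RandomForestClassifier(random_state=5)),
                ("nb", GaussianNB()),
                ("dt", DecisionTreeClassifier(random_state=5))],
    voting="hard")
print("5-fold cross-validated accuracy:", cross_val_score(vote, X, y, cv=5).mean())
```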

  2. Using machine learning techniques to differentiate acute coronary syndrome

    Directory of Open Access Journals (Sweden)

    Sougand Setareh

    2015-02-01

    Full Text Available Background: Acute coronary syndrome (ACS) is an unstable and dynamic process that includes unstable angina, ST elevation myocardial infarction, and non-ST elevation myocardial infarction. Despite recent technological advances in the early diagnosis of ACS, differentiating between different types of coronary diseases in the early hours of admission is controversial. The present study aimed to accurately differentiate between various coronary events, using machine learning techniques. Such methods, as a subset of artificial intelligence, include algorithms that allow computers to learn and play a major role in treatment decisions. Methods: 1902 patients diagnosed with ACS and admitted to hospital were selected according to the Euro Heart Survey on ACS. Patients were classified based on the J48 decision tree. A bagging aggregation algorithm was implemented to increase the efficiency of the algorithm. Results: The performance of the classifiers was estimated and compared based on their accuracy computed from the confusion matrix. The accuracy rates of the decision tree and the bagging algorithm were calculated to be 91.74% and 92.53%, respectively. Conclusion: The proposed methods used in this study proved to have the ability to identify various types of ACS. In addition, using the confusion matrix, an acceptable number of subjects with acute coronary syndrome were identified in each class.
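
    The J48-plus-bagging setup described above can be approximated in scikit-learn with bagged decision trees, as sketched below; the patient features and class labels are synthetic placeholders.

```python
# Illustrative bagging of decision trees (approximating the J48 + bagging setup);
# clinical features and ACS class labels are synthetic placeholders.
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)
X = rng.random((1902, 12))              # clinical features per patient
y = rng.integers(0, 3, size=1902)       # stand-in labels for the three ACS classes

bagged = BaggingClassifier(n_estimators=50, random_state=6)   # default base learner is a decision tree
print("5-fold cross-validated accuracy:", cross_val_score(bagged, X, y, cv=5).mean())
```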

  3. Machine learning techniques for energy optimization in mobile embedded systems

    Science.gov (United States)

    Donohoo, Brad Kyoshi

    Mobile smartphones and other portable battery operated embedded systems (PDAs, tablets) are pervasive computing devices that have emerged in recent years as essential instruments for communication, business, and social interactions. While performance, capabilities, and design are all important considerations when purchasing a mobile device, a long battery lifetime is one of the most desirable attributes. Battery technology and capacity has improved over the years, but it still cannot keep pace with the power consumption demands of today's mobile devices. This key limiter has led to a strong research emphasis on extending battery lifetime by minimizing energy consumption, primarily using software optimizations. This thesis presents two strategies that attempt to optimize mobile device energy consumption with negligible impact on user perception and quality of service (QoS). The first strategy proposes an application and user interaction aware middleware framework that takes advantage of user idle time between interaction events of the foreground application to optimize CPU and screen backlight energy consumption. The framework dynamically classifies mobile device applications based on their received interaction patterns, then invokes a number of different power management algorithms to adjust processor frequency and screen backlight levels accordingly. The second strategy proposes the usage of machine learning techniques to learn a user's mobile device usage pattern pertaining to spatiotemporal and device contexts, and then predict energy-optimal data and location interface configurations. By learning where and when a mobile device user uses certain power-hungry interfaces (3G, WiFi, and GPS), the techniques, which include variants of linear discriminant analysis, linear logistic regression, non-linear logistic regression, and k-nearest neighbor, are able to dynamically turn off unnecessary interfaces at runtime in order to save energy.
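
    The second strategy above predicts energy-optimal interface configurations from spatiotemporal context with techniques such as k-nearest neighbour; the sketch below illustrates that idea with an assumed context encoding and hypothetical configuration classes.

```python
# Illustrative k-nearest-neighbour prediction of an energy-optimal interface
# configuration from context features; encoding and classes are assumptions.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
X = rng.random((3000, 4))           # e.g. hour of day, day of week, coarse location, motion state
y = rng.integers(0, 4, size=3000)   # stand-in classes: 0 all off, 1 WiFi, 2 3G, 3 WiFi+GPS

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=7)
knn = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)
print("held-out accuracy:", knn.score(X_te, y_te))
```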

  4. Comparing the techniques of defining the synchronous machine load angle

    Science.gov (United States)

    Kovalenko, P. Y.; Moiseichenkov, A. N.

    2017-07-01

    The low-frequency oscillations are natural for power systems and may arise due to both small variations of load and large disturbance. The effect of slight load changes may significantly differ for cases of low-magnitude permanent oscillations, which may be considered acceptable, and unstable oscillations, which may lead to a major system emergency. The existing trend of increasing the capacity of long-range power transmission has led to the situation where inter-area oscillations may appear underdamped or even rising in terms of magnitude. Effective oscillations detection with the corresponding countermeasures along with eliminating the prerequisites leading to the oscillations is a guarantee of minimizing their negative consequences. Therefore, it is of crucial importance to perform continuous monitoring which is to provide the information on the “source” of oscillations - a generator or a group of generators, which do not contribute to the oscillations damping or even support their development. The algorithm of quantitative estimation of synchronous generators participation in low-frequency oscillations damping based on synchronized phasor measurements has been proposed previously. It implies utilizing the concept of synchronizing power as a measure of the capability of the machine to maintain synchronous operation. The load angle of the generator is necessary to define the value of the synchronizing power and since the direct measurement of the load angle is generally not available the techniques of its derivation have been developed. The comparison of these techniques is presented with the estimation of the adopted assumptions effect on the synchronizing power evaluation results.

  5. A critical survey of live virtual machine migration techniques

    Directory of Open Access Journals (Sweden)

    Anita Choudhary

    2017-11-01

    Full Text Available Abstract Virtualization techniques effectively handle the growing demand for computing, storage, and communication resources in large-scale Cloud Data Centers (CDC). They help to achieve different resource management objectives like load balancing, online system maintenance, proactive fault tolerance, power management, and resource sharing through Virtual Machine (VM) migration. VM migration is a resource-intensive procedure as VMs continuously demand appropriate CPU cycles, cache memory, memory capacity, and communication bandwidth. Therefore, this process degrades the performance of running applications and adversely affects the efficiency of the data centers, particularly when Service Level Agreements (SLAs) and critical business objectives are to be met. Live VM migration is frequently used because it preserves the availability of the application service while migration is performed. In this paper, we make an exhaustive survey of the literature on live VM migration and analyze the various proposed mechanisms. We first classify the types of Live VM migration (single, multiple and hybrid). Next, we categorize VM migration techniques based on duplication mechanisms (replication, de-duplication, redundancy, and compression) and awareness of context (dependency, soft page, dirty page, and page fault), and evaluate the various Live VM migration techniques. We discuss various performance metrics like application service downtime, total migration time and amount of data transferred. CPU, memory and storage data is transferred during the process of VM migration and we identify the category of data that needs to be transferred in each case. We present a brief discussion on security threats in live VM migration and categorize them into three different classes (control plane, data plane, and migration module). We also explain the security requirements and existing solutions to mitigate possible attacks. Specific gaps are identified and the research challenges in improving

  6. Development of techniques to enhance man/machine communication

    Science.gov (United States)

    Targ, R.; Cole, P.; Puthoff, H.

    1974-01-01

    A four-state random stimulus generator, considered to function as an ESP teaching machine was used to investigate an approach to facilitating interactions between man and machines. A subject tries to guess in which of four states the machine is. The machine offers the user feedback and reinforcement as to the correctness of his choice. Using this machine, 148 volunteer subjects were screened under various protocols. Several whose learning slope and/or mean score departed significantly from chance expectation were identified. Direct physiological evidence of perception of remote stimuli not presented to any known sense of the percipient using electroencephalographic (EEG) output when a light was flashed in a distant room was also studied.

  7. Machine-Learning Techniques for the Determination of Attrition of Forces Due to Atmospheric Conditions

    Science.gov (United States)

    2018-02-01

    US Army Research Laboratory report ARL-TR-8304 (February 2018), covering October 2015–September 2017: Machine-Learning Techniques for the Determination of Attrition of Forces Due to Atmospheric Conditions.

  8. Reaching the Non-Traditional Stopout Population: A Segmentation Approach

    Science.gov (United States)

    Schatzel, Kim; Callahan, Thomas; Scott, Crystal J.; Davis, Timothy

    2011-01-01

    An estimated 21% of 25-34-year-olds in the United States, about eight million individuals, have attended college and quit before completing a degree. These non-traditional students may or may not return to college. Those who return to college are referred to as stopouts, whereas those who do not return are referred to as stayouts. In the face of…

  9. Transitioning Non-Traditional Students to an Undergraduate Business Program

    Science.gov (United States)

    Bailey, April E.; Marsh, Michael T.

    2010-01-01

    This paper reports the experiences of non-traditional students in a specially designed section of a seminar course that was primarily designed for first-year traditional business students. The College of Business's BSN101, Foundations of Business Administration (FBA), is designed to serve as a course to assist students with transitioning into the…

  10. Shallow water bathymetry mapping using Support Vector Machine (SVM) technique and multispectral imagery

    NARCIS (Netherlands)

    Misra, Ankita; Vojinovic, Zoran; Ramakrishnan, Balaji; Luijendijk, Arjen; Ranasinghe, Roshanka

    2018-01-01

    Satellite imagery along with image processing techniques prove to be efficient tools for bathymetry retrieval as they provide time and cost-effective alternatives to traditional methods of water depth estimation. In this article, a nonlinear machine learning technique of Support Vector Machine (SVM)

  11. SPAM CLASSIFICATION BASED ON SUPERVISED LEARNING USING MACHINE LEARNING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    T. Hamsapriya

    2011-12-01

    Full Text Available E-mail is one of the most popular and frequently used ways of communication due to its worldwide accessibility, relatively fast message transfer, and low sending cost. The flaws in the e-mail protocols and the increasing amount of electronic business and financial transactions directly contribute to the increase in e-mail-based threats. Email spam is one of the major problems of today’s Internet, bringing financial damage to companies and annoying individual users. Spam emails are invading users without their consent and filling their mail boxes. They consume more network capacity as well as time in checking and deleting spam mails. The vast majority of Internet users are outspoken in their disdain for spam, although enough of them respond to commercial offers that spam remains a viable source of income to spammers. While most users want to do the right thing to avoid and get rid of spam, they need clear and simple guidelines on how to behave. In spite of all the measures taken to eliminate spam, it has not yet been eradicated. Moreover, when the countermeasures are over-sensitive, even legitimate emails will be eliminated. Among the approaches developed to stop spam, filtering is one of the most important techniques. Much research in spam filtering has centered on the more sophisticated classifier-related issues. In recent years, machine learning for spam classification has become an important research issue. The proposed work explores and identifies the use of different learning algorithms for classifying spam messages from e-mail. A comparative analysis among the algorithms has also been presented.
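
    The record compares several learning algorithms for spam filtering without naming one in particular; the sketch below shows a common bag-of-words baseline (multinomial naive Bayes) purely as an illustration, with toy messages.

```python
# Illustrative bag-of-words spam filter with multinomial naive Bayes; toy data,
# shown only as one common baseline among the algorithms the record compares.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

emails = ["win a free prize now", "meeting agenda for monday",
          "cheap loans click here", "project status update attached"]
labels = [1, 0, 1, 0]                       # 1 = spam, 0 = legitimate

spam_filter = make_pipeline(CountVectorizer(), MultinomialNB())
spam_filter.fit(emails, labels)
print(spam_filter.predict(["free loans, click now"]))   # expected: [1]
```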

  12. Machine throughput improvement achieved using innovative control technique

    International Nuclear Information System (INIS)

    Sharma, V.; Acharya, S.; Mittal, K.C.

    2012-01-01

    In any type of fully or semi-automatic machine, the control system plays an important role. On the one hand, the control system has to consider human psychology, the intelligence required of an operator, and the attention needed from him. On the other hand, the complexity of the control also has to be understood well before designing a control system that can be handled comfortably and safely by the operator. As far as user experience and comfort are concerned, the design of the control system GUI is vital. Considering these two aspects related to the user of the machine, it is evident that control system design is very important because it has to accommodate human behaviour and the skill sets required/available, as well as the capability of the machine under its control. An intelligently designed control system can enhance the productivity of the machine. (author)

  13. Machine learning techniques applied to system characterization and equalization

    DEFF Research Database (Denmark)

    Zibar, Darko; Thrane, Jakob; Wass, Jesper

    2016-01-01

    Linear signal processing algorithms are effective in combating linear fibre channel impairments. We demonstrate the ability of machine learning algorithms to combat nonlinear fibre channel impairments and perform parameter extraction from directly detected signals.

  14. Energy and non-traditional security (NTS) in Asia

    Energy Technology Data Exchange (ETDEWEB)

    Caballero-Anthony, Mely [Nanyang Technological Univ., Singapore (SG). Centre for Non-Traditional Security (NTS) Studies; Chang, Youngho [Nanyang Technological Univ., Singapore (Singapore). Division of Economics; Putra, Nur Azha (eds.) [National Univ. of Singapore (Singapore). Energy Security Division

    2012-07-01

    Traditional notions of security are premised on the primacy of state security. In relation to energy security, traditional policy thinking has focused on ensuring supply without much emphasis on socioeconomic and environmental impacts. Non-traditional security (NTS) scholars argue that threats to human security have become increasingly prominent since the end of the Cold War, and that it is thus critical to adopt a holistic and multidisciplinary approach in addressing rising energy needs. This volume represents the perspectives of scholars from across Asia, looking at diverse aspects of energy security through a non-traditional security lens. The issues covered include environmental and socioeconomic impacts, the role of the market, the role of civil society, energy sustainability and policy trends in the ASEAN region.

  15. Machine Learning Techniques Applied to Profile Mobile Banking Users in India

    OpenAIRE

    M. Carr; V. Ravi; G. Sridharan Reddy; D. Veranna

    2013-01-01

    This paper profiles mobile banking users using machine learning techniques, viz. Decision Tree, Logistic Regression, Multilayer Perceptron, and SVM, to test a research model with fourteen independent variables and a dependent variable (adoption). A survey was conducted and the results were analysed using these techniques. Using Decision Trees, the profile of the mobile banking adopter was identified. Comparing the different machine learning techniques, it was found that Decision Trees out...

  16. Machine learning in Python essential techniques for predictive analysis

    CERN Document Server

    Bowles, Michael

    2015-01-01

    Learn a simpler and more effective way to analyze data and predict outcomes with Python Machine Learning in Python shows you how to successfully analyze data using only two core machine learning algorithms, and how to apply them using Python. By focusing on two algorithm families that effectively predict outcomes, this book is able to provide full descriptions of the mechanisms at work, and the examples that illustrate the machinery with specific, hackable code. The algorithms are explained in simple terms with no complex math and applied using Python, with guidance on algorithm selection, d

  17. Optimization of machining techniques – A retrospective and ...

    Indian Academy of Sciences (India)

    pass is more economical than the double-pass, and when the depth of cut rises above this break-even point, double-pass is better. Carbide .... tuning algorithm for multi-dimensional motion control of a computer numerical control machine tool.

  18. Phishtest: Measuring the Impact of Email Headers on the Predictive Accuracy of Machine Learning Techniques

    Science.gov (United States)

    Tout, Hicham

    2013-01-01

    The majority of documented phishing attacks have been carried out by email, yet few studies have measured the impact of email headers on the predictive accuracy of machine learning techniques in detecting email phishing attacks. Research has shown that the inclusion of a limited subset of email headers as features in training machine learning…

  19. Predicting breast screening attendance using machine learning techniques.

    Science.gov (United States)

    Baskaran, Vikraman; Guergachi, Aziz; Bali, Rajeev K; Naguib, Raouf N G

    2011-03-01

    Machine learning-based prediction has been effectively applied for many healthcare applications. Predicting breast screening attendance using machine learning (prior to the actual mammogram) is a new field. This paper presents new predictor attributes for such an algorithm. It describes a new hybrid algorithm that relies on back-propagation and radial basis function-based neural networks for prediction. The algorithm has been developed in an open source-based environment. The algorithm was tested on a 13-year dataset (1995-2008). This paper compares the algorithm and validates its accuracy and efficiency with different platforms. Nearly 80% accuracy and 88% positive predictive value and sensitivity were recorded for the algorithm. The results were encouraging; 40-50% of negative predictive value and specificity warrant further work. Preliminary results were promising and provided ample amount of reasons for testing the algorithm on a larger scale.

  20. Machine learning and evolutionary techniques in interplanetary trajectory design

    OpenAIRE

    Izzo, Dario; Sprague, Christopher; Tailor, Dharmesh

    2018-01-01

    After providing a brief historical overview on the synergies between artificial intelligence research, in the areas of evolutionary computations and machine learning, and the optimal design of interplanetary trajectories, we propose and study the use of deep artificial neural networks to represent, on-board, the optimal guidance profile of an interplanetary mission. The results, limited to the chosen test case of an Earth-Mars orbital transfer, extend the findings made previously for landing ...

  1. A comparison of machine learning techniques for predicting downstream acid mine drainage

    CSIR Research Space (South Africa)

    van Zyl, TL

    2014-07-01

    Full Text Available windowing approach over historical values to generate a prediction for the current value. We evaluate a number of Machine Learning techniques as regressors, including Support Vector Regression, Random Forests, Stochastic Gradient Descent Regression, Linear...

  2. Kernel-based machine learning techniques for infrasound signal classification

    Science.gov (United States)

    Tuma, Matthias; Igel, Christian; Mialle, Pierrick

    2014-05-01

    Infrasound monitoring is one of four remote sensing technologies continuously employed by the CTBTO Preparatory Commission. The CTBTO's infrasound network is designed to monitor the Earth for potential evidence of atmospheric or shallow underground nuclear explosions. Upon completion, it will comprise 60 infrasound array stations distributed around the globe, of which 47 were certified in January 2014. Three stages can be identified in CTBTO infrasound data processing: automated processing at the level of single array stations, automated processing at the level of the overall global network, and interactive review by human analysts. At station level, the cross correlation-based PMCC algorithm is used for initial detection of coherent wavefronts. It produces estimates for trace velocity and azimuth of incoming wavefronts, as well as other descriptive features characterizing a signal. Detected arrivals are then categorized into potentially treaty-relevant versus noise-type signals by a rule-based expert system. This corresponds to a binary classification task at the level of station processing. In addition, incoming signals may be grouped according to their travel path in the atmosphere. The present work investigates automatic classification of infrasound arrivals by kernel-based pattern recognition methods. It aims to explore the potential of state-of-the-art machine learning methods vis-a-vis the current rule-based and task-tailored expert system. To this purpose, we first address the compilation of a representative, labeled reference benchmark dataset as a prerequisite for both classifier training and evaluation. Data representation is based on features extracted by the CTBTO's PMCC algorithm. As classifiers, we employ support vector machines (SVMs) in a supervised learning setting. Different SVM kernel functions are used and adapted through different hyperparameter optimization routines. The resulting performance is compared to several baseline classifiers. All

  3. Machine Learning Techniques for Prediction of Early Childhood Obesity.

    Science.gov (United States)

    Dugan, T M; Mukhopadhyay, S; Carroll, A; Downs, S

    2015-01-01

    This paper aims to predict childhood obesity after age two, using only data collected prior to the second birthday by a clinical decision support system called CHICA. Analyses of six different machine learning methods: RandomTree, RandomForest, J48, ID3, Naïve Bayes, and Bayes trained on CHICA data show that an accurate, sensitive model can be created. Of the methods analyzed, the ID3 model trained on the CHICA dataset proved the best overall performance with accuracy of 85% and sensitivity of 89%. Additionally, the ID3 model had a positive predictive value of 84% and a negative predictive value of 88%. The structure of the tree also gives insight into the strongest predictors of future obesity in children. Many of the strongest predictors seen in the ID3 modeling of the CHICA dataset have been independently validated in the literature as correlated with obesity, thereby supporting the validity of the model. This study demonstrated that data from a production clinical decision support system can be used to build an accurate machine learning model to predict obesity in children after age two.

  4. Exploring Genome-Wide Expression Profiles Using Machine Learning Techniques.

    Science.gov (United States)

    Kebschull, Moritz; Papapanou, Panos N

    2017-01-01

    Although contemporary high-throughput -omics methods produce high-dimensional data, the resulting wealth of information is difficult to assess using traditional statistical procedures. Machine learning methods facilitate the detection of additional patterns, beyond the mere identification of lists of features that differ between groups. Here, we demonstrate the utility of (1) supervised classification algorithms in class validation, and (2) unsupervised clustering in class discovery. We use data from our previous work that described the transcriptional profiles of gingival tissue samples obtained from subjects suffering from chronic or aggressive periodontitis (1) to test whether the two diagnostic entities were also characterized by differences on the molecular level, and (2) to search for a novel, alternative classification of periodontitis based on the tissue transcriptomes. Using machine learning technology, we provide evidence for diagnostic imprecision in the currently accepted classification of periodontitis, and demonstrate that a novel, alternative classification based on differences in gingival tissue transcriptomes is feasible. The outlined procedures allow for the unbiased interrogation of high-dimensional datasets for characteristic underlying classes, and are applicable to a broad range of -omics data.

  5. Application of machine learning techniques to lepton energy reconstruction in water Cherenkov detectors

    Science.gov (United States)

    Drakopoulou, E.; Cowan, G. A.; Needham, M. D.; Playfer, S.; Taani, M.

    2018-04-01

    The application of machine learning techniques to the reconstruction of lepton energies in water Cherenkov detectors is discussed and illustrated for TITUS, a proposed intermediate detector for the Hyper-Kamiokande experiment. It is found that applying these techniques leads to an improvement of more than 50% in the energy resolution for all lepton energies compared to an approach based upon lookup tables. Machine learning techniques can be easily applied to different detector configurations and the results are comparable to likelihood-function based techniques that are currently used.

  6. The impact of machine learning techniques in the study of bipolar disorder: A systematic review.

    Science.gov (United States)

    Librenza-Garcia, Diego; Kotzian, Bruno Jaskulski; Yang, Jessica; Mwangi, Benson; Cao, Bo; Pereira Lima, Luiza Nunes; Bermudez, Mariane Bagatin; Boeira, Manuela Vianna; Kapczinski, Flávio; Passos, Ives Cavalcante

    2017-09-01

    Machine learning techniques provide new methods to predict diagnosis and clinical outcomes at an individual level. We aim to review the existing literature on the use of machine learning techniques in the assessment of subjects with bipolar disorder. We systematically searched PubMed, Embase and Web of Science for articles published in any language up to January 2017. We found 757 abstracts and included 51 studies in our review. Most of the included studies used multiple levels of biological data to distinguish the diagnosis of bipolar disorder from other psychiatric disorders or healthy controls. We also found studies that assessed the prediction of clinical outcomes and studies using unsupervised machine learning to build more consistent clinical phenotypes of bipolar disorder. We concluded that given the clinical heterogeneity of samples of patients with BD, machine learning techniques may provide clinicians and researchers with important insights in fields such as diagnosis, personalized treatment and prognosis orientation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Machine learning techniques for gait biometric recognition using the ground reaction force

    CERN Document Server

    Mason, James Eric; Woungang, Isaac

    2016-01-01

    This book focuses on how machine learning techniques can be used to analyze and make use of one particular category of behavioral biometrics known as the gait biometric. A comprehensive Ground Reaction Force (GRF)-based Gait Biometrics Recognition framework is proposed and validated by experiments. In addition, an in-depth analysis of existing recognition techniques that are best suited for performing footstep GRF-based person recognition is also proposed, as well as a comparison of feature extractors, normalizers, and classifiers configurations that were never directly compared with one another in any previous GRF recognition research. Finally, a detailed theoretical overview of many existing machine learning techniques is presented, leading to a proposal of two novel data processing techniques developed specifically for the purpose of gait biometric recognition using GRF. This book · introduces novel machine-learning-based temporal normalization techniques · bridges research gaps concerning the effect of ...

  8. Feasibility of Applying Controllable Lubrication Techniques to Reciprocating Machines

    DEFF Research Database (Denmark)

    Pulido, Edgar Estupinan

    The use of active lubrication in journal bearings helps to enhance the thin fluid films by increasing the fluid film thickness and consequently reducing viscous friction losses and vibrations. One refers to active lubrication when conventional hydrodynamic lubrication is combined with dynamically modified hydrostatic lubrication. In this case, the hydrostatic lubrication is modified by injecting oil at controllable pressures, through orifices circumferentially located around the bearing surface. In order to study the performance of journal bearings of reciprocating machines, operating under conventional lubrication conditions, a mathematical model of a reciprocating mechanism connected to a rigid/flexible rotor via thin fluid films was developed. The mathematical model involves the use of multibody dynamics theory for the modelling of the reciprocating mechanism (rigid bodies), finite elements

  9. Use of machine learning techniques for modeling of snow depth

    Directory of Open Access Journals (Sweden)

    G. V. Ayzel

    2017-01-01

    Full Text Available Snow exerts a significant regulating effect on the land hydrological cycle since it controls the intensity of heat and water exchange between the soil-vegetation cover and the atmosphere. Estimating spring flood runoff or rain floods on mountainous rivers requires understanding of the snow cover dynamics on a watershed. In our work, the problem of snow cover depth modeling is addressed using both available databases of hydro-meteorological observations and easily accessible scientific software that allows complete reproduction of the investigation results and further development of this theme by the scientific community. In this research we used daily observational data on the snow cover and surface meteorological parameters, obtained at three stations situated in different geographical regions: Col de Porte (France), Sodankylä (Finland), and Snoqualmie Pass (USA). Statistical modeling of the snow cover depth is based on a set of freely distributed, present-day machine learning models: Decision Trees, Adaptive Boosting, Gradient Boosting. It is demonstrated that combining modern machine learning methods with available meteorological data provides good accuracy of snow cover modeling. The best results of snow cover depth modeling for every investigated site were obtained by the ensemble method of gradient boosting over decision trees; this model reproduces well both the accumulation and the melting periods of the snow cover. The purposeful character of the learning process for gradient-boosting models, their ensemble character, and the use of a test sample in the learning procedure make this type of model a good and sustainable research tool. The results obtained can be used for estimating snow cover characteristics for river basins where hydro-meteorological information is absent or insufficient.
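
    The best-performing model in the record is gradient boosting over decision trees; the sketch below shows such a regressor applied to snow depth with assumed meteorological predictors and synthetic data.

```python
# Illustrative gradient-boosting regression of daily snow depth from surface
# meteorology; predictors and their encoding are assumptions, data are synthetic.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(8)
X = rng.random((4000, 5))   # e.g. air temperature, precipitation, humidity, wind, day of year
y = 50 * X[:, 1] - 30 * X[:, 0] + 5 * X[:, 4] + rng.normal(scale=2.0, size=4000)  # synthetic depth (cm)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=8)
gbr = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05, random_state=8)
gbr.fit(X_tr, y_tr)
print("R^2 on held-out days:", gbr.score(X_te, y_te))
```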

  10. Fabrication of titanium implant-retained restorations with nontraditional machining techniques.

    Science.gov (United States)

    Schmitt, S M; Chance, D A

    1995-01-01

    Traditional laboratory techniques are being supplemented by modern precision technologies to solve complex restorative problems. Electrical discharge machining combined with laser scanning and computer aided design-computer aided manufacturing can create very precise restorations without the lost wax method. A laser scanner is used to create a three-dimensional polyline data model that can then be converted into a stereolithography file format for output to a stereolithography apparatus or other rapid prototyping device. A stereolithography-generated model is used to create an electric discharge machining electrode via copper electroforming. This electrode is used to machine dental restorations from an ingot of titanium, bypassing the conventional lost wax casting process. Retaining screw access holes are machined using conventional drilling procedures, but could be accomplished with electric discharge machining if desired. Other rapid prototyping technologies are briefly discussed.

  11. Adult video content detection using Machine Learning Techniques

    OpenAIRE

    Torres Ochoa, Victor Manuel

    2012-01-01

    Automatic adult video detection is a problem of interest for many organizations around the globe aiming to restrict the availability of potentially harmful material for young audiences. Most of the existing techniques are a mere extension of the image categorization problem. In the present work we employ video genre classification techniques applied specifically to adult content detection by considering cinematic principles. Shot structure and camera motion in the temporal domain are used ...

  12. Exploring machine-learning-based control plane intrusion detection techniques in software defined optical networks

    Science.gov (United States)

    Zhang, Huibin; Wang, Yuqiao; Chen, Haoran; Zhao, Yongli; Zhang, Jie

    2017-12-01

    In software defined optical networks (SDON), the centralized control plane may encounter numerous intrusion threats that compromise the security level of provisioned services. In this paper, the issue of control plane security is studied and two machine-learning-based control plane intrusion detection techniques are proposed for SDON with properly selected features such as bandwidth, route length, etc. We validate the feasibility and efficiency of the proposed techniques by simulations. Results show that an accuracy of 83% for intrusion detection can be achieved with the proposed machine-learning-based control plane intrusion detection techniques.

  13. Application of Artificial Intelligence Techniques for the Control of the Asynchronous Machine

    Directory of Open Access Journals (Sweden)

    F. Khammar

    2016-01-01

    Full Text Available The induction machine has experienced growing success for two decades, gradually replacing DC and synchronous machines in many industrial applications. This paper is devoted to the study of advanced methods applied to the control of the asynchronous machine in order to obtain a high-performance control system. While the criteria for response time, overshoot, and static error can be met by conventional control techniques, the criterion of robustness remains a challenge for researchers. This criterion can be satisfied only by applying advanced control techniques. After mathematical modeling of the asynchronous machine, the paper defines control strategies based on the orientation of the rotor flux. The results of the different simulation tests highlight the robustness properties of the proposed algorithms and are used to compare the different control strategies.

  14. Nuclear forensics of a non-traditional sample: Neptunium

    International Nuclear Information System (INIS)

    Doyle, Jamie L.; Schwartz, Daniel; Tandon, Lav

    2016-01-01

    Recent nuclear forensics cases have focused primarily on plutonium (Pu) and uranium (U) materials. By definition however, nuclear forensics can apply to any diverted nuclear material. This includes neptunium (Np), an internationally safeguarded material like Pu and U, that could offer a nuclear security concern if significant quantities were found outside of regulatory control. This case study couples scanning electron microscopy (SEM) with quantitative analysis using newly developed specialized software, to evaluate a non-traditional nuclear forensic sample of Np. Here, the results of the morphological analyses were compared with another Np sample of known pedigree, as well as other traditional actinide materials in order to determine potential processing and point-of-origin

  15. Qualitative parameters of non-traditional types of vegetables

    Directory of Open Access Journals (Sweden)

    Eva Kudrnáčová

    2015-08-01

    Full Text Available The main aim of this study was to determine selected quality indicators of non-traditional types of leafy vegetables. Mizuna (Brassica rapa japonica), Chinese mustard (Brassica juncea), edible chrysanthemum (Chrysanthemum coronarium) and arugula (Eruca sativa) were among the selected species of vegetables. During the one-year experiment, spring and autumn sowing was carried out for these species of vegetables. The measured quality parameters were the content of nitrates and ascorbic acid. Sampling was done in the morning and, in the laboratory, the samples were further processed according to the type of determination. To determine the content of nitrates and ascorbic acid, leaves were removed from the plants. A filtrate from the leaves was then prepared. Determination of nitrates and ascorbic acid was carried out using special test strips and the device Rqflex plus 10. The results of measurement of both sowing varieties were compared. The total nitrate content was up to 22% higher in plants sown in the autumn, except for edible chrysanthemum (Chrysanthemum coronarium). The highest content was recorded in arugula (Eruca sativa), which was recently included in the studies of the European Union and for which nitrate limits have been set. Overall, the nitrate content ranged from 221 to 334 mg/kg in spring varieties and from 249 to 384 mg/kg in autumn varieties. The ascorbic acid content was very high in Chinese mustard (Brassica juncea), edible chrysanthemum (Chrysanthemum coronarium) and arugula (Eruca sativa) in both spring and autumn varieties. Values of ascorbic acid ranged from 839 mg/kg in the autumn sowing up to 2909 mg/kg in the spring sowing. These non-traditional types of leafy vegetables could be included among the important sources of vitamin C in the future.

  16. Down syndrome detection from facial photographs using machine learning techniques

    Science.gov (United States)

    Zhao, Qian; Rosenbaum, Kenneth; Sze, Raymond; Zand, Dina; Summar, Marshall; Linguraru, Marius George

    2013-02-01

    Down syndrome is the most commonly occurring chromosomal condition; one in every 691 babies in United States is born with it. Patients with Down syndrome have an increased risk for heart defects, respiratory and hearing problems and the early detection of the syndrome is fundamental for managing the disease. Clinically, facial appearance is an important indicator in diagnosing Down syndrome and it paves the way for computer-aided diagnosis based on facial image analysis. In this study, we propose a novel method to detect Down syndrome using photography for computer-assisted image-based facial dysmorphology. Geometric features based on facial anatomical landmarks, local texture features based on the Contourlet transform and local binary pattern are investigated to represent facial characteristics. Then a support vector machine classifier is used to discriminate normal and abnormal cases; accuracy, precision and recall are used to evaluate the method. The comparison among the geometric, local texture and combined features was performed using the leave-one-out validation. Our method achieved 97.92% accuracy with high precision and recall for the combined features; the detection results were higher than using only geometric or texture features. The promising results indicate that our method has the potential for automated assessment for Down syndrome from simple, noninvasive imaging data.
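
    The record uses an SVM on combined geometric and texture features with leave-one-out validation; the sketch below reproduces that evaluation pattern with an assumed feature dimensionality and synthetic labels.

```python
# Illustrative SVM with leave-one-out validation on combined geometric + texture
# feature vectors; dimensionality, kernel and labels are assumptions.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(9)
X = rng.random((60, 40))             # concatenated geometric and local-texture features per face
y = rng.integers(0, 2, size=60)      # stand-in labels: 1 = Down syndrome, 0 = control

svm = SVC(kernel="rbf", C=1.0, gamma="scale")
print("leave-one-out accuracy:", cross_val_score(svm, X, y, cv=LeaveOneOut()).mean())
```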

  17. A machine vision identification technique from range images

    Science.gov (United States)

    Kehtarnavaz, N.; Mohan, S.

    1988-01-01

    An orientation-independent identification technique from three-dimensional surface maps or range images is developed. Given the range image of an object, it is decomposed into orientation-independent patches using the sign of Gaussian curvature. A relational graph is then set up such that a node represents a patch and an edge represents the adjacency of two patches. The identification of the object is achieved by matching its graph representation to a number of model graphs. The matching is performed by employing the best-first search strategy. Examples of real range images show the merit of the technique.

  18. A technique to identify some typical radio frequency interference using support vector machine

    Science.gov (United States)

    Wang, Yuanchao; Li, Mingtao; Li, Dawei; Zheng, Jianhua

    2017-07-01

    In this paper, we present a technique to automatically identify some typical radio frequency interference in pulsar surveys using a support vector machine. The technique has been tested on survey candidates. In these experiments, to obtain features for the SVM, we use principal component analysis for mosaic plots, with a classification accuracy of 96.9%, while we use mathematical morphology operations for smog plots and horizontal-stripe plots, with a classification accuracy of 86%. The technique is simple, highly accurate and useful.
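
    For the mosaic plots the record extracts SVM features with principal component analysis; a minimal PCA-plus-SVM pipeline of that kind is sketched below, with placeholder image sizes, component counts and labels.

```python
# Illustrative PCA + SVM pipeline for labelling candidate plots as RFI or not;
# image size, number of components and labels are placeholders.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(10)
X = rng.random((400, 64 * 64))       # flattened candidate plots
y = rng.integers(0, 2, size=400)     # stand-in labels: 1 = RFI, 0 = genuine candidate

clf = make_pipeline(PCA(n_components=20), SVC(kernel="rbf"))
print("5-fold cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```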

  19. Performance Evaluation of Eleven-Phase Induction Machine with Different PWM Techniques

    Directory of Open Access Journals (Sweden)

    M.I. Masoud

    2015-06-01

    Full Text Available Multiphase induction machines are used extensively in low and medium voltage (MV) drives. In MV drives, power switches have a limitation associated with switching frequency. This paper is a comparative study of the eleven-phase induction machine’s performance when used as a prototype and fed with sinusoidal pulse-width modulation (SPWM) at a low switching frequency, selective harmonic elimination (SHE), and single-pulse modulation (SPM) techniques. The comparison is based on voltage/frequency control, with the same phase voltage applied to the machine terminals for all of the above techniques. The comparative study covers torque ripple, stator and harmonic currents, and motor efficiency.

  20. Complex technique for studying the machine part wear

    International Nuclear Information System (INIS)

    Grishko, V.A.; Zhushma, V.F.

    1981-01-01

    A technique to determine the wear of steel parts in rolling-sliding contact with circulating lubrication is suggested. The functional diagram of the experimental device and structural diagrams of the equipment used to register the wear of the tested samples and to form the lubricating layer between them are considered. Results of testing three couples of disc samples and the data characterizing the dependence of sample wear on the value of contact stress are presented. The peculiarity of the device used is the synchronous registration of the lubricating layer formation at the point of contact and of the part mass loss over time, realized respectively through the discharge voltage across the lubricating layer and the intensity of radiation from neutron-activated wear products. On the basis of the investigation, the conclusion is made that MEhF-1 oil has a greater antiwear effectiveness than the universal TAD-17 1 oil used presently [ru

  1. Prediction of drug synergy in cancer using ensemble-based machine learning techniques

    Science.gov (United States)

    Singh, Harpreet; Rana, Prashant Singh; Singh, Urvinder

    2018-04-01

    Drug synergy prediction plays a significant role in the medical field for inhibiting specific cancer agents. It can be developed as a pre-processing tool for therapeutic successes. Examination of different drug-drug interactions can be done via the drug synergy score. It needs efficient regression-based machine learning approaches to minimize the prediction errors. Numerous machine learning techniques such as neural networks, support vector machines, random forests, LASSO, Elastic Nets, etc., have been used in the past to realize the requirement mentioned above. However, these techniques individually do not provide significant accuracy in predicting the drug synergy score. Therefore, the primary objective of this paper is to design a neuro-fuzzy-based ensembling approach. To achieve this, nine well-known machine learning techniques have been implemented by considering the drug synergy data. Based on the accuracy of each model, four techniques with high accuracy are selected to develop an ensemble-based machine learning model. These models are Random forest, Fuzzy Rules Using Genetic Cooperative-Competitive Learning method (GFS.GCCL), Adaptive-Network-Based Fuzzy Inference System (ANFIS) and Dynamic Evolving Neural-Fuzzy Inference System method (DENFIS). Ensembling is achieved by evaluating the biased weighted aggregation (i.e. adding more weight to the models with higher prediction scores) of the data predicted by the selected models. The proposed and existing machine learning techniques have been evaluated on drug synergy score data. The comparative analysis reveals that the proposed method outperforms others in terms of accuracy, root mean square error and coefficient of correlation.
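
    The biased weighted aggregation described above can be illustrated with a simple accuracy-weighted ensemble of regressors, as sketched below; the base models, weighting rule and data are placeholders rather than the exact neuro-fuzzy ensemble of the record.

```python
# Illustrative accuracy-weighted ensemble of regressors (more weight to models
# with higher cross-validated scores); placeholder models and synthetic data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score, train_test_split

rng = np.random.default_rng(11)
X = rng.random((800, 20))                                      # drug-pair / cell-line features
y = 2 * X[:, 0] - X[:, 1] + rng.normal(scale=0.1, size=800)    # synthetic synergy score

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=11)
models = [RandomForestRegressor(random_state=11), GradientBoostingRegressor(random_state=11), Ridge()]
scores = np.array([cross_val_score(m, X_tr, y_tr, cv=5, scoring="r2").mean() for m in models])
weights = np.clip(scores, 0, None) / np.clip(scores, 0, None).sum()   # biased towards better models

preds = np.array([m.fit(X_tr, y_tr).predict(X_te) for m in models])
ensemble = weights @ preds
print("ensemble RMSE:", np.sqrt(np.mean((ensemble - y_te) ** 2)))
```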

  2. The Gritty: Grit and Non-traditional Doctoral Student Success

    Directory of Open Access Journals (Sweden)

    Ted M. Cross

    2014-07-01

    Full Text Available As higher education is changing to reach larger numbers of students via online modalities, the issue of student attrition and other measures of student success become increasingly important. While research has focused largely on undergraduate online students, less has been done in the area of online non-traditional doctoral student success, particularly from the student trait perspective. The concept of grit, passion and persistence for long-term goals, has been identified as an important element of the successful attainment of long-term goals. As doctoral education is a long-term goal, the purpose of this study was to examine the impact of doctoral student grit scores on student success. Success was measured by examining current student GPA and other factors. Significant relationships were found between grit and current student GPA, grit and the average number of hours students spent on their program of study weekly, and grit and age. The results of this research may be important for informing how doctoral education is structured and how students might be better prepared for doctoral work.

  3. An element search ant colony technique for solving virtual machine placement problem

    Science.gov (United States)

    Srija, J.; Rani John, Rose; Kanaga, Grace Mary, Dr.

    2017-09-01

    Data centres in the cloud environment play a key role in providing infrastructure for ubiquitous computing, pervasive computing, mobile computing, etc. This computing paradigm tries to utilize the available resources in order to provide services. Hence, maintaining high resource utilization without wasting power has become a challenging task for researchers. In this paper we propose a direct-guidance ant colony system for effective mapping of virtual machines to physical machines with maximal resource utilization and minimal power consumption. The proposed algorithm has been compared with an existing ant colony approach for the virtual machine placement problem and is shown to provide better results than the existing technique.
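
    As a rough illustration of the kind of probabilistic, pheromone-guided construction step an ant colony system uses for placement, the Python sketch below builds one VM-to-host mapping; the function, the utilisation heuristic and the data are illustrative assumptions and not the authors' algorithm.

      import random

      def ant_place_vms(vm_cpu, host_cpu, pheromone, alpha=1.0, beta=2.0):
          """One ant constructs a VM-to-host mapping.

          Placement probability is proportional to pheromone^alpha * heuristic^beta,
          where the heuristic favours hosts that end up tightly packed (high utilisation).
          """
          free = list(host_cpu)
          mapping = {}
          for v, demand in enumerate(vm_cpu):
              feasible = [h for h in range(len(host_cpu)) if free[h] >= demand]
              if not feasible:
                  return None                               # this ant failed to place all VMs
              scores = []
              for h in feasible:
                  util_after = (host_cpu[h] - free[h] + demand) / host_cpu[h]
                  scores.append((pheromone[v][h] ** alpha) * (util_after ** beta))
              r, acc = random.uniform(0, sum(scores)), 0.0
              for h, s in zip(feasible, scores):
                  acc += s
                  if acc >= r:
                      mapping[v] = h
                      free[h] -= demand
                      break
          return mapping

      vm_cpu, host_cpu = [2, 4, 3, 1], [8, 6]                # toy demands and capacities
      pheromone = [[1.0] * len(host_cpu) for _ in vm_cpu]    # uniform initial trails
      print(ant_place_vms(vm_cpu, host_cpu, pheromone))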

  4. Uncertainty analysis in rainfall-runoff modelling : Application of machine learning techniques

    NARCIS (Netherlands)

    Shrestha, D.l.

    2009-01-01

    This thesis presents powerful machine learning (ML) techniques to build predictive models of uncertainty with application to hydrological models. Two different methods are developed and tested. First one focuses on parameter uncertainty analysis by emulating the results of Monte Carlo simulations of

  5. Uncertainty Analysis in Rainfall-Runoff Modelling: Application of Machine Learning Techniques

    NARCIS (Netherlands)

    Shrestha, D.L.

    2009-01-01

    This thesis presents powerful machine learning (ML) techniques to build predictive models of uncertainty with application to hydrological models. Two different methods are developed and tested. First one focuses on parameter uncertainty analysis by emulating the results of Monte Carlo simulations of

  6. Exploring Machine Learning Techniques Using Patient Interactions in Online Health Forums to Classify Drug Safety

    Science.gov (United States)

    Chee, Brant Wah Kwong

    2011-01-01

    This dissertation explores the use of personal health messages collected from online message forums to predict drug safety using natural language processing and machine learning techniques. Drug safety is defined as any drug with an active safety alert from the US Food and Drug Administration (FDA). It is believed that this is the first…

  7. Classification of the Regional Ionospheric Disturbance Based on Machine Learning Techniques

    Science.gov (United States)

    Terzi, Merve Begum; Arikan, Orhan; Karatay, Secil; Arikan, Feza; Gulyaeva, Tamara

    2016-08-01

    In this study, Total Electron Content (TEC) estimated from GPS receivers is used to model the regional and local variability that differs from global activity along with solar and geomagnetic indices. For the automated classification of regional disturbances, a classification technique based on a robust machine learning method that has found widespread use, the Support Vector Machine (SVM), is proposed. The performance of the developed classification technique is demonstrated for the midlatitude ionosphere over Anatolia using TEC estimates generated from GPS data provided by the Turkish National Permanent GPS Network (TNPGN-Active) for the solar maximum year of 2011. By applying the developed classification technique to Global Ionospheric Map (GIM) TEC data, which is provided by the NASA Jet Propulsion Laboratory (JPL), it is shown that SVM can be a suitable learning method to detect anomalies in TEC variations.
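
    A minimal sketch of this kind of SVM classification with scikit-learn is shown below; the feature matrix, labels and hyperparameters are placeholders rather than the study's actual TEC-derived inputs.

      import numpy as np
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 6))          # placeholder daily TEC-derived features
      y = rng.integers(0, 2, size=200)       # placeholder quiet(0)/disturbed(1) labels

      clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
      print(cross_val_score(clf, X, y, cv=5).mean())   # cross-validated classification accuracy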

  8. Machine learning techniques accurately classify microbial communities by bacterial vaginosis characteristics.

    Directory of Open Access Journals (Sweden)

    Daniel Beck

    Full Text Available Microbial communities are important to human health. Bacterial vaginosis (BV) is a disease associated with the vaginal microbiome. While the causes of BV are unknown, the microbial community in the vagina appears to play a role. We use three different machine-learning techniques to classify microbial communities into BV categories. These three techniques include genetic programming (GP), random forests (RF), and logistic regression (LR). We evaluate the classification accuracy of each of these techniques on two different datasets. We then deconstruct the classification models to identify important features of the microbial community. We found that the classification models produced by the machine learning techniques obtained accuracies above 90% for Nugent score BV and above 80% for Amsel criteria BV. While the classification models identify largely different sets of important features, the shared features often agree with past research.
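
    The sketch below shows, with placeholder data, how one of the cited classifiers (a random forest) can both classify community profiles and expose the most informative features; the array contents and column meanings are illustrative assumptions, not the study's datasets.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      X = np.random.rand(100, 30)            # placeholder: relative taxon abundances per sample
      y = np.random.randint(0, 2, 100)       # placeholder: BV-negative(0) / BV-positive(1) labels

      rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
      top = np.argsort(rf.feature_importances_)[::-1][:5]
      print("most informative taxa (column indices):", top)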

  9. Teaching Climate Science in Non-traditional Classrooms

    Science.gov (United States)

    Strybos, J.

    2015-12-01

    San Antonio College is the oldest, largest and most centrally located campus of the Alamo Colleges, a network of five community colleges based around San Antonio, Texas, with a headcount enrollment of approximately 20,000 students. The student population is diverse in ethnicity, age and income, and the Colleges understand that they play a salient role in educating their students on the foreseen impacts of climate change. This presentation will discuss the key investment Alamo Colleges has adopted to incorporate sustainability and climate science into non-traditional classrooms. The established courses that cover climate-related course material have historically had low enrollments. One of the most significant challenges is informing the student population of the value of this class both in their academic career and in their personal lives. By hosting these lessons in hands-on simulations and demonstrations that are accessible and understandable to students of any age, pursuing any major, we have found an exciting way to teach all students about climate change and identify solutions. San Antonio College (SAC) hosts the Bill R. Sinkin Eco Centro Community Center, completed in early 2014, which serves as an environmental hub for Alamo Colleges' staff and students as well as the San Antonio community. The center actively engages staff and faculty during training days in sustainability by presenting information on Eco Centro and personal sustainability habits, and by inviting faculty to bring their classes for a tour and a sustainability primer for students. The Centro has hosted professors from diverse disciplines that include Architecture, Psychology, Engineering, Science, English, Fine Arts, and International Studies, who bring their classes to the center to learn about energy, water conservation, landscaping, and green building. Additionally, Eco Centro encourages and assists students with research projects, including a solar-hydroponic project currently under development with the support

  10. Non-traditional Stable Isotope Systematics of Seafloor Hydrothermal Systems

    Science.gov (United States)

    Rouxel, O. J.

    2009-05-01

    Seafloor hydrothermal activity at mid-ocean ridges is one of the fundamental processes controlling the chemistry of the oceans and the altered oceanic crust. Past studies have demonstrated the complexity and diversity of seafloor hydrothermal systems and have highlighted the importance of subsurface environments in controlling the composition of hydrothermal fluids and mineralization types. Traditionally, the behavior of metals in seafloor hydrothermal systems has been investigated by integrating results from laboratory studies, theoretical models, mineralogy and fluid and mineral chemistry. Isotope ratios of various metals and metalloids, such as Fe, Cu, Zn, Se, Cd and Sb, have recently provided new approaches for the study of seafloor hydrothermal systems. Despite these initial investigations, the cause of the isotopic variability of these elements remains poorly constrained. We have little understanding of the isotope variations between vent types (black or white smokers) as well as the influence of source rock composition (basalt, felsic or ultrabasic rocks) and alteration types. Here, I will review and present new results of metal isotope systematics of seafloor hydrothermal systems, in particular: (1) determination of empirical isotope fractionation factors for Zn, Fe and Cu-isotopes through isotopic analysis of mono-mineralic sulfide grains lining the internal chimney wall in contact with hydrothermal fluid; (2) comparison of Fe- and Cu-isotope signatures of vent fluids from mid-oceanic and back-arc hydrothermal fields, spanning wide ranges of pH, temperature, metal concentrations and contributions of magmatic fluids enriched in SO2. Ultimately, the use of complementary non-traditional stable isotope systems may help identify and constrain the complex interactions between fluids, minerals, and organisms in seafloor hydrothermal systems.

  11. Prediction of mortality after radical cystectomy for bladder cancer by machine learning techniques.

    Science.gov (United States)

    Wang, Guanjin; Lam, Kin-Man; Deng, Zhaohong; Choi, Kup-Sze

    2015-08-01

    Bladder cancer is a common genitourinary malignancy. For muscle-invasive bladder cancer, surgical removal of the bladder, i.e. radical cystectomy, is in general the definitive treatment which, unfortunately, carries significant morbidity and mortality. Accurate prediction of the mortality of radical cystectomy is therefore needed. Statistical methods have conventionally been used for this purpose, despite the complex interactions of high-dimensional medical data. Machine learning has emerged as a promising technique for handling high-dimensional data, with increasing application in clinical decision support, e.g. cancer prediction and prognosis. Its ability to reveal the hidden nonlinear interactions and interpretable rules between dependent and independent variables is favorable for constructing models of effective generalization performance. In this paper, seven machine learning methods are utilized to predict the 5-year mortality of radical cystectomy, including the back-propagation neural network (BPN), radial basis function network (RBFN), extreme learning machine (ELM), regularized ELM (RELM), support vector machine (SVM), naive Bayes (NB) classifier and k-nearest neighbour (KNN), on a clinicopathological dataset of 117 patients of the urology unit of a hospital in Hong Kong. The experimental results indicate that RELM achieved the highest average prediction accuracy of 0.8 at a fast learning speed. The research findings demonstrate the potential of applying machine learning techniques to support clinical decision making. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Precipitates Segmentation from Scanning Electron Microscope Images through Machine Learning Techniques

    OpenAIRE

    João P. Papa; Clayton R. Pereira; Victor H. C. de Albuquerque; Cleiton C. Silva; Alexandre X. Falcão; João Manuel R. S. Tavares

    2011-01-01

    The presence of precipitates in metallic materials affects their durability, resistance and mechanical properties. Hence, their automatic identification by image processing and machine learning techniques may lead to reliable and efficient assessments of the materials. In this paper, we introduce four widely used supervised pattern recognition techniques to accomplish metallic precipitates segmentation in scanning electron microscope images from dissimilar welding on a Hastelloy C-276 alloy: Supp...

  13. ISOLATED SPEECH RECOGNITION SYSTEM FOR TAMIL LANGUAGE USING STATISTICAL PATTERN MATCHING AND MACHINE LEARNING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    VIMALA C.

    2015-05-01

    Full Text Available In recent years, speech technology has become a vital part of our daily lives. Various techniques have been proposed for developing Automatic Speech Recognition (ASR) systems and have achieved great success in many applications. Among them, template matching techniques like Dynamic Time Warping (DTW), statistical pattern matching techniques such as the Hidden Markov Model (HMM) and Gaussian Mixture Models (GMM), and machine learning techniques such as Neural Networks (NN), Support Vector Machines (SVM), and Decision Trees (DT) are most popular. The main objective of this paper is to design and develop a speaker-independent isolated speech recognition system for the Tamil language using the above speech recognition techniques. The background of ASR systems, the steps involved in ASR, the merits and demerits of the conventional and machine learning algorithms, and the observations made from the experiments are presented in this paper. For the developed system, the highest word recognition accuracy is achieved with the HMM technique: it offered 100% accuracy during training and 97.92% during testing.
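
    Of the template matching techniques listed above, DTW is compact enough to sketch directly. The Python example below computes a standard DTW distance between two 1-D feature sequences; the sequences are illustrative, and a real ASR front end would compare frames of MFCC vectors rather than scalars.

      import numpy as np

      def dtw_distance(a, b):
          """Classic dynamic time warping distance between two 1-D feature sequences."""
          n, m = len(a), len(b)
          D = np.full((n + 1, m + 1), np.inf)
          D[0, 0] = 0.0
          for i in range(1, n + 1):
              for j in range(1, m + 1):
                  cost = abs(a[i - 1] - b[j - 1])            # local distance between frames
                  D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
          return D[n, m]

      print(dtw_distance([1, 2, 3, 4], [1, 1, 2, 3, 4, 4]))  # small alignment cost for similar shapes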

  14. Big data - modelling of midges in Europe using machine learning techniques and satellite imagery

    DEFF Research Database (Denmark)

    Cuellar, Ana Carolina; Kjær, Lene Jung; Skovgaard, Henrik

    2017-01-01

    coordinates of each trap, start and end dates of trapping. We used 120 environmental predictor variables together with Random Forest machine learning algorithms to predict the overall species distribution (probability of occurrence) and monthly abundance in Europe. We generated maps for every month...... and the Obsoletus group, although abundance was generally higher for a longer period of time for C. imicola than for the Obsoletus group. Using machine learning techniques, we were able to model the spatial distribution in Europe for C. imicola and the Obsoletus group in terms of abundance and suitability...

  15. Process acceptance and adjustment techniques for Swiss automatic screw machine parts. Final report

    International Nuclear Information System (INIS)

    Robb, J.M.

    1976-01-01

    Product tolerance requirements for small, cylindrical piece parts produced on Swiss automatic screw machines have progressed to the reliability limits of inspection equipment. The miniature size, configuration, and tolerance requirements (plus or minus 0.0001 in.) (0.00254 mm) of these parts preclude the use of screening techniques to accept product or adjust processes during setup and production runs; therefore, existing means of product acceptance and process adjustment must be refined or new techniques must be developed. The purpose of this endeavor has been to determine the benefits gained through the implementation of a process acceptance technique (PAT) in Swiss automatic screw machine processes. PAT is a statistical approach developed for the purpose of accepting product and centering processes for parts produced by selected, controlled processes. Through this endeavor a determination has been made of the conditions under which PAT can benefit a controlled process and of some specific types of screw machine processes to which PAT could be applied. However, it was also determined that PAT, if used indiscriminately, may become a record-keeping burden when applied to more than one dimension at a given machining operation.

  16. A Computer Program for Simplifying Incompletely Specified Sequential Machines Using the Paull and Unger Technique

    Science.gov (United States)

    Ebersole, M. M.; Lecoq, P. E.

    1968-01-01

    This report presents a description of a computer program mechanized to perform the Paull and Unger process of simplifying incompletely specified sequential machines. An understanding of the process, as given in Ref. 3, is a prerequisite to the use of the techniques presented in this report. This process has specific application in the design of asynchronous digital machines and was used in the design of operational support equipment for the Mariner 1966 central computer and sequencer. A typical sequential machine design problem is presented to show where the Paull and Unger process has application. A description of the Paull and Unger process, together with a description of the computer algorithms used to develop the program mechanization, is presented. Several examples are used to clarify the Paull and Unger process and the computer algorithms. Program flow diagrams, program listings, and program user operating procedures are included as appendixes.

  17. Non-traditional shape GFRP rebars for concrete reinforcement

    Science.gov (United States)

    Claure, Guillermo G.

    The use of glass-fiber-reinforced-polymer (GFRP) composites as internal reinforcement (rebars) for concrete structures has proven to be an alternative to traditional steel reinforcement due to significant advantages such as magnetic transparency and, most importantly, corrosion resistance equating to durability and structural life extension. In recent years, the number of projects specifying GFRP reinforcement has increased dramatically leading the construction industry towards more sustainable practices. Typically, GFRP rebars are similar to their steel counterparts having external deformations or surface enhancements designed to develop bond to concrete, as well as having solid circular cross-sections; but lately, the worldwide composites industry has taken advantage of the pultrusion process developing GFRP rebars with non-traditional cross-sectional shapes destined to optimize their mechanical, physical, and environmental attributes. Recently, circular GFRP rebars with a hollow-core have also become available. They offer advantages such as a larger surface area for improved bond, and the use of the effective cross-sectional area that is engaged to carry load since fibers at the center of a solid cross-section are generally not fully engaged. For a complete understanding of GFRP rebar physical properties, a study on material characterization regarding a quantitative cross-sectional area analysis of different GFRP rebars was undertaken with a sample population of 190 GFRP specimens with rebar denomination ranging from #2 to #6 and with different cross-sectional shapes and surface deformations manufactured by five pultruders from around the world. The water displacement method was applied as a feasible and reliable way to conduct the investigation. In addition to developing a repeatable protocol for measuring cross-sectional area, the objectives of establishing critical statistical information related to the test methodology and recommending improvements to

  18. Electric-Discharge Machining Techniques for Evaluating Tritium Effects on Materials

    International Nuclear Information System (INIS)

    Morgan, M.J.

    2003-01-01

    In this investigation, new ways to evaluate the long-term effects of tritium on the structural properties of components were developed. Electric-discharge machining (EDM) techniques for cutting tensile and fracture toughness samples from tritium-exposed regions of returned reservoirs were demonstrated. An existing electric discharge machine was used to cut sub-size tensile and fracture toughness samples from the inside surfaces of reservoir mockups. Tensile properties from the EDM tensile samples were similar to those measured using full-size samples cut from similar stock. Although the existing equipment could not be used for machining tritium-exposed hardware, off-the-shelf EDM units are available that could. With the right equipment and the required radiological controls in place, similar machining and testing techniques could be used to directly measure the effects of tritium on the properties of material cut from reservoir returns. Stress-strain properties from tritium-exposed reservoirs would improve finite element modeling of reservoir performance because the data would be representative of the true state of the reservoir material in the field. Tensile data from samples cut directly from reservoirs would also complement existing shelf storage and burst test data of the Life Storage Program and help answer questions about a specific reservoir's processing history and properties.

  19. Impact of corpus domain for sentiment classification: An evaluation study using supervised machine learning techniques

    Science.gov (United States)

    Karsi, Redouane; Zaim, Mounia; El Alami, Jamila

    2017-07-01

    Thanks to the development of the internet, a large community now has the possibility to communicate and express its opinions and preferences through multiple media such as blogs, forums, social networks and e-commerce sites. It has become clear that opinions published on the web are a very valuable source for decision-making, so a rapidly growing field of research called “sentiment analysis” has emerged to address the problem of automatically determining the polarity (positive, negative, neutral, …) of textual opinions. People expressing themselves in a particular domain often use domain-specific language expressions; thus, building a classifier that performs well across different domains is a challenging problem. The purpose of this paper is to evaluate the impact of domain on sentiment classification when using machine learning techniques. In our study three popular machine learning techniques, Support Vector Machines (SVM), Naive Bayes and K-nearest neighbors (KNN), were applied to datasets collected from different domains. Experimental results show that Support Vector Machines outperform the other classifiers in all domains, achieving at least 74.75% accuracy with a standard deviation of 4.08.

  20. Statistical and Machine-Learning Data Mining Techniques for Better Predictive Modeling and Analysis of Big Data

    CERN Document Server

    Ratner, Bruce

    2011-01-01

    The second edition of a bestseller, Statistical and Machine-Learning Data Mining: Techniques for Better Predictive Modeling and Analysis of Big Data is still the only book, to date, to distinguish between statistical data mining and machine-learning data mining. The first edition, titled Statistical Modeling and Analysis for Database Marketing: Effective Techniques for Mining Big Data, contained 17 chapters of innovative and practical statistical data mining techniques. In this second edition, renamed to reflect the increased coverage of machine-learning data mining techniques, the author has

  1. Approximate multi-state reliability expressions using a new machine learning technique

    International Nuclear Information System (INIS)

    Rocco S, Claudio M.; Muselli, Marco

    2005-01-01

    The machine-learning-based methodology, previously proposed by the authors for approximating binary reliability expressions, is now extended to develop a new algorithm, based on the procedure of Hamming Clustering, which is capable of dealing with multi-state systems and any success criterion. The proposed technique is presented in detail and verified on literature cases: experimental results show that the new algorithm yields excellent predictions.

  2. Non-traditional Sensor Tasking for SSA: A Case Study

    Science.gov (United States)

    Herz, A.; Herz, E.; Center, K.; Martinez, I.; Favero, N.; Clark, C.; Therien, W.; Jeffries, M.

    Industry has recognized that maintaining SSA of the orbital environment going forward is too challenging for the government alone. Consequently, there are a significant number of commercial activities in various stages of development standing up novel sensors and sensor networks to assist in SSA gathering and dissemination. Use of these systems will allow government and military operators to focus on the most sensitive space control issues while allocating routine or lower-priority data gathering responsibility to the commercial side. The fact that there will be multiple (perhaps many) commercial sensor capabilities available in this new operational model begets the need for a common access solution. Absent a central access point to assert data needs, optimized use of all commercial sensor resources is not possible and the opportunity for coordinated collections satisfying overarching SSA-elevating objectives is lost. Orbit Logic is maturing its Heimdall Web system - an architecture facilitating “data requestor” perspectives (allowing government operations centers to assert SSA data gathering objectives) and “sensor operator” perspectives (through which multiple sensors of varying phenomenology and capability are integrated via machine-to-machine interfaces). When requestors submit their needs, Heimdall’s planning engine determines tasking schedules across all sensors, optimizing their use via an SSA-specific figure-of-merit. ExoAnalytic was a key partner in refining the sensor operator interfaces, working with Orbit Logic through specific details of sensor tasking schedule delivery and the return of observation data. Scant preparation on both sides preceded several integration exercises (walk-then-run style), which culminated in successful demonstration of the ability to supply optimized schedules for routine public catalog data collection and then adapt sensor tasking schedules in real time upon receipt of urgent data collection requests. This paper will provide a

  3. Machine Learning Techniques for Modelling Short Term Land-Use Change

    Directory of Open Access Journals (Sweden)

    Mileva Samardžić-Petrović

    2017-11-01

    Full Text Available The representation of land use change (LUC) is often achieved by using data-driven methods that include machine learning (ML) techniques. The main objectives of this research study are to implement three ML techniques, Decision Trees (DT), Neural Networks (NN), and Support Vector Machines (SVM), for LUC modeling, in order to compare these three ML techniques and to find the appropriate data representation. The ML techniques are applied to the case study of LUC in three municipalities of the City of Belgrade, the Republic of Serbia, using historical geospatial data sets and considering nine land use classes. The ML models were built and assessed using two different time intervals. The information gain ranking technique and the recursive attribute elimination procedure were implemented to find the most informative attributes that were related to LUC in the study area. The results indicate that all three ML techniques can be used effectively for short-term forecasting of LUC, but the SVM achieved the highest agreement of predicted changes.
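
    The two attribute-ranking steps mentioned above have close analogues in scikit-learn, sketched below on placeholder data; the mutual-information scorer stands in for information gain ranking and RFE for recursive attribute elimination, so this is an illustration under those assumptions rather than the authors' exact procedure.

      import numpy as np
      from sklearn.feature_selection import mutual_info_classif, RFE
      from sklearn.tree import DecisionTreeClassifier

      X = np.random.rand(300, 12)            # placeholder geospatial attributes per cell
      y = np.random.randint(0, 9, 300)       # placeholder land-use class per cell (nine classes)

      info_gain = mutual_info_classif(X, y)                              # information-gain-style ranking
      rfe = RFE(DecisionTreeClassifier(), n_features_to_select=5).fit(X, y)  # recursive elimination
      print(np.argsort(info_gain)[::-1][:5], np.flatnonzero(rfe.support_))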

  4. Wind Power Ramp Events Prediction with Hybrid Machine Learning Regression Techniques and Reanalysis Data

    Directory of Open Access Journals (Sweden)

    Laura Cornejo-Bueno

    2017-11-01

    Full Text Available Wind Power Ramp Events (WPREs) are large fluctuations of wind power in a short time interval, which lead to strong, undesirable variations in the electric power produced by a wind farm. Their accurate prediction is important in the effort to efficiently integrate wind energy into the electric system without considerably affecting its stability, robustness and resilience. In this paper, we tackle the problem of predicting WPREs by applying Machine Learning (ML) regression techniques. Our approach consists of using variables from atmospheric reanalysis data as predictive inputs for the learning machine, which opens the possibility of hybridizing numerical-physical weather models with ML techniques for WPRE prediction in real systems. Specifically, we have explored the feasibility of a number of state-of-the-art ML regression techniques, such as support vector regression, artificial neural networks (multi-layer perceptrons and extreme learning machines) and Gaussian processes, to solve the problem. Furthermore, the ERA-Interim reanalysis from the European Center for Medium-Range Weather Forecasts is used in this paper because of its accuracy and high resolution (in both the spatial and temporal domains). Aiming to validate the feasibility of our prediction approach, we have carried out an extensive experimental study using real data from three wind farms in Spain, discussing the performance of the different ML regression techniques tested on this wind power ramp event prediction problem.
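
    As a minimal sketch of one of the regression techniques named above, the snippet below trains a support vector regressor on placeholder reanalysis-style predictors and reports a cross-validated RMSE; the data, feature count and hyperparameters are illustrative assumptions, not the paper's setup.

      import numpy as np
      from sklearn.svm import SVR
      from sklearn.preprocessing import StandardScaler
      from sklearn.pipeline import make_pipeline
      from sklearn.model_selection import cross_val_score

      X = np.random.rand(500, 10)            # placeholder reanalysis variables at a wind farm
      y = np.random.rand(500)                # placeholder ramp-function target

      model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.01))
      print(cross_val_score(model, X, y, cv=5, scoring="neg_root_mean_squared_error").mean())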

  5. Optimization of Coolant Technique Conditions for Machining A319 Aluminium Alloy Using Response Surface Method (RSM)

    Science.gov (United States)

    Zainal Ariffin, S.; Razlan, A.; Ali, M. Mohd; Efendee, A. M.; Rahman, M. M.

    2018-03-01

    Background/Objectives: The paper discusses the optimum cutting parameters under different coolant technique conditions (1.0 mm nozzle orifice, wet and dry) to optimize surface roughness, temperature and tool wear in the machining process based on the selected setting parameters. The selected cutting parameters for this study were the cutting speed, feed rate, depth of cut and coolant technique condition. Methods/Statistical Analysis: Experiments were conducted and investigated based on Design of Experiments (DOE) with the Response Surface Method. The research on aggressive machining of aluminium alloy A319 for automotive applications is an effort to understand the machining concept, which is widely used in a variety of manufacturing industries, especially the automotive industry. Findings: The results show that surface roughness, temperature and tool wear, the dominant failure modes, increase during machining when the 1.0 mm nozzle orifice is used, and that the coolant technique can also help minimize built-up edge on the A319. The exploration of surface roughness, productivity and the optimization of cutting speed in the technical and commercial aspects of manufacturing A319 automotive components is discussed as further work. Applications/Improvements: The research results are also beneficial in minimizing the costs incurred and improving the productivity of manufacturing firms. According to the mathematical model and equations generated by CCD-based RSM, experiments were performed, and it was found that the coolant condition technique using the selected nozzle size reduces tool wear, surface roughness and temperature. The results have been analyzed and optimization has been carried out for selecting the cutting parameters, showing that the effectiveness and efficiency of the system can be identified, which helps to solve potential problems.
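
    To make the CCD/RSM step concrete, the sketch below fits a generic second-order response-surface model to coded factor settings with scikit-learn; the data, factor coding and predicted response are illustrative placeholders, not the study's measurements or its fitted model.

      import numpy as np
      from sklearn.preprocessing import PolynomialFeatures
      from sklearn.linear_model import LinearRegression
      from sklearn.pipeline import make_pipeline

      X = np.random.rand(20, 3)              # coded cutting speed, feed rate, depth of cut
      y = np.random.rand(20)                 # measured response, e.g. surface roughness Ra

      # Second-order RSM model: intercept, linear, interaction and squared terms.
      rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X, y)
      print(rsm.predict([[0.5, 0.5, 0.5]]))  # predicted Ra at a candidate parameter setting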

  6. Machine learning techniques applied to the determination of road suitability for the transportation of dangerous substances.

    Science.gov (United States)

    Matías, J M; Taboada, J; Ordóñez, C; Nieto, P G

    2007-08-17

    This article describes a methodology to model the degree of remedial action required to make short stretches of a roadway suitable for dangerous goods transport (DGT), particularly pollutant substances, using different variables associated with the characteristics of each segment. Thirty-one factors determining the impact of an accident on a particular stretch of road were identified and subdivided into two major groups: accident probability factors and accident severity factors. Given the number of factors determining the state of a particular road segment, the only viable statistical methods for implementing the model were machine learning techniques, such as multilayer perceptron networks (MLPs), classification trees (CARTs) and support vector machines (SVMs). The results produced by these techniques on a test sample were more favourable than those produced by traditional discriminant analysis, irrespective of whether dimensionality reduction techniques were applied. The best results were obtained using SVMs specifically adapted to ordinal data. This technique takes advantage of the ordinal information contained in the data without penalising the computational load. Furthermore, the technique permits the estimation of the utility function that is latent in expert knowledge.

  7. Andragogical Teaching Methods to Enhance Non-Traditional Student Classroom Engagement

    Science.gov (United States)

    Allen, Pamela; Withey, Paul; Lawton, Deb; Aquino, Carlos Tasso

    2016-01-01

    The aim of this study was to provide a reflection of current trends in higher education, identify some of the changes in student behavior, and potential identification of non-traditional classroom facilitation with the purpose of strengthening active learning and use of technology in the classroom. Non-traditional teaching is emerging in the form…

  8. Exploring Non-Traditional Learning Methods in Virtual and Real-World Environments

    Science.gov (United States)

    Lukman, Rebeka; Krajnc, Majda

    2012-01-01

    This paper identifies the commonalities and differences within non-traditional learning methods regarding virtual and real-world environments. The non-traditional learning methods in real-world have been introduced within the following courses: Process Balances, Process Calculation, and Process Synthesis, and within the virtual environment through…

  9. Prostate cancer detection using machine learning techniques by employing combination of features extracting strategies.

    Science.gov (United States)

    Hussain, Lal; Ahmed, Adeel; Saeed, Sharjil; Rathore, Saima; Awan, Imtiaz Ahmed; Shah, Saeed Arif; Majid, Abdul; Idris, Adnan; Awan, Anees Ahmed

    2018-02-06

    Prostate cancer is the second leading cause of cancer deaths among men. Early detection can effectively reduce the mortality caused by prostate cancer. The high resolution and multiresolution nature of prostate MRIs require proper diagnostic systems and tools. In the past, researchers developed computer-aided diagnosis (CAD) systems that help the radiologist to detect abnormalities. In this research paper, we have employed novel machine learning techniques such as a Bayesian approach, Support Vector Machine (SVM) kernels (polynomial, radial basis function (RBF) and Gaussian) and Decision Trees for detecting prostate cancer. Moreover, different feature extracting strategies are proposed to improve the detection performance. The feature extracting strategies are based on texture, morphological, scale invariant feature transform (SIFT), and elliptic Fourier descriptor (EFD) features. The performance was evaluated based on single as well as combined features using machine learning classification techniques. Cross validation (jack-knife k-fold) was performed and performance was evaluated in terms of the receiver operating characteristic (ROC) curve, specificity, sensitivity, positive predictive value (PPV), negative predictive value (NPV) and false positive rate (FPR). Based on single feature extracting strategies, the SVM Gaussian kernel gives the highest accuracy of 98.34% with an AUC of 0.999, while, using combined feature extracting strategies, the SVM Gaussian kernel with texture + morphological and EFDs + morphological features gives the highest accuracy of 99.71% and an AUC of 1.00.
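
    A minimal sketch of combining two feature sets and evaluating an SVM with k-fold cross-validation is given below; the arrays, feature dimensions and labels are illustrative placeholders rather than the study's MRI-derived texture or morphological features.

      import numpy as np
      from sklearn.svm import SVC
      from sklearn.model_selection import cross_val_score

      texture = np.random.rand(120, 40)          # placeholder texture features per image
      morph = np.random.rand(120, 10)            # placeholder morphological features per image
      y = np.random.randint(0, 2, 120)           # placeholder benign(0) / malignant(1) labels

      X = np.hstack([texture, morph])            # combined feature-extracting strategy
      clf = SVC(kernel="rbf", gamma="scale")     # Gaussian (RBF) kernel SVM
      print(cross_val_score(clf, X, y, cv=10, scoring="roc_auc").mean())   # k-fold AUC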

  10. LARA. Localization of an automatized refueling machine by acoustical sounding in breeder reactors - implementation of artificial intelligence techniques

    International Nuclear Information System (INIS)

    Lhuillier, C.; Malvache, P.

    1987-01-01

    The automatic control of the machine which handles the nuclear subassemblies in fast neutron reactors requires autonomous perception and decision tools. An acoustical device allows the machine to position itself in the work area. Artificial intelligence techniques are implemented to interpret the data: pattern recognition and scene analysis. The localization process is managed by an expert system. 6 refs.; 8 figs

  11. Classification of Cytochrome P450 1A2 Inhibitors and Non-Inhibitors by Machine Learning Techniques

    DEFF Research Database (Denmark)

    Vasanthanathan, Poongavanam; Taboureau, Olivier; Oostenbrink, Chris

    2009-01-01

    of CYP1A2 inhibitors and non-inhibitors. Training and test sets consisted of about 400 and 7000 compounds, respectively. Various machine learning techniques, like binary QSAR, support vector machine (SVM), random forest, k-nearest neighbors (kNN), and decision tree methods were used to develop

  12. Analysis for Non-Traditional Security Challenges: Methods and Tools

    Science.gov (United States)

    2006-11-20

    [Abstract text not recoverable from the extraction; only acronym-list fragments from the report front matter survive: Course of Action; COCOM Combatant Commander; COI Community of Interest; CPB Cultural Preparation of the Battlefield; CPM Critical Path Method; DARPA Defense...; Secretary of Defense (Program Analysis and Evaluation); PACOM United States Pacific Command; PERT Program Evaluation Review Technique; PMESII Political...]

  13. Digital Mayhem 3D machine techniques where inspiration, techniques and digital art meet

    CERN Document Server

    Evans, Duncan

    2014-01-01

    From Icy Tundras to Desert savannahs, master the art of landscape and environment design for 2D and 3D digital content. Make it rain, shower your digital scene with a snow storm or develop a believable urban scene with a critical eye for modeling, lighting and composition. Move beyond the limitations of gallery style coffee table books with Digital Mayhem: 3D Landscapes-offering leading professional techniques, groundbreaking inspiration, and artistic mastery from some of the greatest digital artists. More than just a gallery book - each artist has written a breakdown overview, with supporting

  14. Analysis of machining and machine tools

    CERN Document Server

    Liang, Steven Y

    2016-01-01

    This book delivers the fundamental science and mechanics of machining and machine tools by presenting systematic and quantitative knowledge in the form of process mechanics and physics. It gives readers a solid command of machining science and engineering, and familiarizes them with the geometry and functionality requirements of creating parts and components in today’s markets. The authors address traditional machining topics, such as: single and multiple point cutting processes; grinding; components accuracy and metrology; shear stress in cutting; cutting temperature and analysis; chatter. They also address non-traditional machining, such as: electrical discharge machining; electrochemical machining; laser and electron beam machining. A chapter on biomedical machining is also included. This book is appropriate for advanced undergraduate and graduate mechanical engineering students, manufacturing engineers, and researchers. Each chapter contains examples, exercises and their solutions, and homework problems that re...

  15. Engagement techniques and playing level impact the biomechanical demands on rugby forwards during machine-based scrummaging

    OpenAIRE

    Preatoni, Ezio; Stokes, Keith A.; England, Michael E.; Trewartha, Grant

    2014-01-01

    Objectives: This cross-sectional study investigated the factors that may influence the physical loading on rugby forwards performing a scrum by studying the biomechanics of machine-based scrummaging under different engagement techniques and playing levels. Methods: 34 forward packs from six playing levels performed repetitions of five different types of engagement techniques against an instrumented scrum machine under realistic training conditions. Applied forces and body movements were recorded...

  16. The influence of cooling techniques on cutting forces and surface roughness during cryogenic machining of titanium alloys

    Directory of Open Access Journals (Sweden)

    Wstawska Iwona

    2016-12-01

    Full Text Available Titanium alloys are among the materials extensively used in the aerospace industry due to their excellent properties of high specific strength and corrosion resistance. On the other hand, they also present problems in that titanium alloys are extremely difficult materials to machine. In addition, the cost associated with titanium machining is high due to lower cutting velocities and shorter tool life. The main objective of this work is a comparison of different cooling techniques during cryogenic machining of titanium alloys. The analysis revealed that the applied cooling technique has a significant influence on cutting force and surface roughness (Ra) values. Furthermore, in all cases a positive influence of cryogenic machining on selected aspects was observed after turning and milling of titanium alloys. This work can also serve as the starting point for further research related to the analysis of cutting forces and surface roughness during cryogenic machining of titanium alloys.

  17. Prediction of lung cancer patient survival via supervised machine learning classification techniques.

    Science.gov (United States)

    Lynch, Chip M; Abdollahi, Behnaz; Fuqua, Joshua D; de Carlo, Alexandra R; Bartholomai, James A; Balgemann, Rayeanne N; van Berkel, Victor H; Frieboes, Hermann B

    2017-12-01

    Outcomes for cancer patients have been previously estimated by applying various machine learning techniques to large datasets such as the Surveillance, Epidemiology, and End Results (SEER) program database. For lung cancer in particular, it is not well understood which types of techniques yield more predictive information, and which data attributes should be used in order to determine this information. In this study, a number of supervised learning techniques are applied to the SEER database to classify lung cancer patients in terms of survival, including linear regression, Decision Trees, Gradient Boosting Machines (GBM), Support Vector Machines (SVM), and a custom ensemble. Key data attributes in applying these methods include tumor grade, tumor size, gender, age, stage, and number of primaries, with the goal of enabling comparison of predictive power between the various methods. The prediction is treated as a continuous target, rather than a classification into categories, as a first step towards improving survival prediction. The results show that the predicted values agree with actual values for low to moderate survival times, which constitute the majority of the data. The best performing technique was the custom ensemble with a Root Mean Square Error (RMSE) value of 15.05. The most influential model within the custom ensemble was GBM, while Decision Trees may be inapplicable as they had too few discrete outputs. The results further show that among the five individual models generated, the most accurate was GBM with an RMSE value of 15.32. Although SVM underperformed with an RMSE value of 15.82, statistical analysis singles out the SVM as the only model that generated a distinctive output. The results of the models are consistent with a classical Cox proportional hazards model used as a reference technique. We conclude that application of these supervised learning techniques to lung cancer data in the SEER database may be of use to estimate patient survival time.
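
    The sketch below illustrates the kind of regression-plus-RMSE evaluation described above using a gradient boosting regressor on placeholder data; the attribute set, dataset and train/test split are assumptions for illustration, not the SEER data or the authors' custom ensemble.

      import numpy as np
      from sklearn.ensemble import GradientBoostingRegressor
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import mean_squared_error

      X = np.random.rand(1000, 6)            # placeholder attributes (grade, size, age, stage, ...)
      y = np.random.rand(1000) * 60          # placeholder survival time in months

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      gbm = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)
      rmse = mean_squared_error(y_te, gbm.predict(X_te)) ** 0.5    # root mean square error
      print(f"RMSE: {rmse:.2f}")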

  18. OVERVIEW OF WORK PIECE TEMPERATURE MEASUREMENT TECHNIQUES FOR MACHINING OF Ti6Al4V#

    Directory of Open Access Journals (Sweden)

    P.J.T. Conradie

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: Ti6Al4V is one of the most widely used titanium alloys in aerospace applications, but its machining remains a challenge. Comprehensive research has been done in the past, mainly investigating tool failure of various materials. Less research has been done to investigate the thermal effect of machining on work piece quality, including fatigue performance. Temperature measurement is considered to be a key enabling technology. This study presents an overview of current temperature measurement techniques for machined and tool surfaces. Two categories of methods were investigated: slower contact methods, and faster optical methods. Optical fibre two-colour pyrometry experiments are reported that demonstrate the technique’s adequate response time. The infrared camera temperature measurement experiments synchronised temperature measurement with visual observation, aimed at mechanism analysis. The results corresponded with the literature.

    AFRIKAANSE OPSOMMING (translated): Ti6Al4V is one of the most popular aerospace alloys, but its machining is a challenge. Existing research covers tool wear comprehensively. The thermal effect of machining on workpiece integrity, including fatigue life, has however received far less coverage. Temperature measurement, which is investigated in this study, is regarded as a key technology. Two categories of methods were investigated, namely slow contact methods and optical methods with fast response, which make the measurement of transient phenomena possible. Experimental work including both optical fibre two-colour pyrometry and thermal camera techniques demonstrates that these techniques are suitable for the required research.

  19. A relevance vector machine technique for the automatic detection of clustered microcalcifications (Honorable Mention Poster Award)

    Science.gov (United States)

    Wei, Liyang; Yang, Yongyi; Nishikawa, Robert M.

    2005-04-01

    Microcalcification (MC) clusters in mammograms can be important early signs of breast cancer in women. Accurate detection of MC clusters is an important but challenging problem. In this paper, we propose the use of a recently developed machine learning technique -- relevance vector machine (RVM) -- for automatic detection of MCs in digitized mammograms. RVM is based on Bayesian estimation theory, and as a feature it can yield a decision function that depends on only a very small number of so-called relevance vectors. We formulate MC detection as a supervised-learning problem, and use RVM to classify if an MC object is present or not at each location in a mammogram image. MC clusters are then identified by grouping the detected MC objects. The proposed method is tested using a database of 141 clinical mammograms, and compared with a support vector machine (SVM) classifier which we developed previously. The detection performance is evaluated using the free-response receiver operating characteristic (FROC) curves. It is demonstrated that the RVM classifier matches closely with the SVM classifier in detection performance, and does so with a much sparser kernel representation than the SVM classifier. Consequently, the RVM classifier greatly reduces the computational complexity, making it more suitable for real-time processing of MC clusters in mammograms.

  20. Particle identification at LHCb: new calibration techniques and machine learning classification algorithms

    CERN Document Server

    CERN. Geneva

    2018-01-01

    Particle identification (PID) plays a crucial role in LHCb analyses. Combining information from LHCb subdetectors allows one to distinguish between various species of long-lived charged and neutral particles. PID performance directly affects the sensitivity of most LHCb measurements. Advanced multivariate approaches are used at LHCb to obtain the best PID performance and control systematic uncertainties. This talk highlights recent developments in PID that use innovative machine learning techniques, as well as novel data-driven approaches which ensure that PID performance is well reproduced in simulation.

  1. Like a rolling stone: non-traditional spaces of adult education

    Directory of Open Access Journals (Sweden)

    Emilio Lucio-Villegas

    2016-04-01

    Full Text Available In this article, I try to explore the squeezing concept of adult education that provides a kind of identity to the field characterised by vagueness, diversity and the links to social justice. This diversity is also present when talking about the participants in the process. After presenting the concept of adult education, I explore three different experiences that I have referred to as non-traditional spaces of adult education. In the conclusion, I consider that the diversity, the production of knowledge, and the role of both teacher and learners are essential to define non-traditional spaces and non-traditional participants in adult education.

  2. An overview of non-traditional nuclear threats

    International Nuclear Information System (INIS)

    Geelhood, B.D.; Wogman, N.A.

    2005-01-01

    In view of the terrorist threats to the United States, the country needs to consider new vectors and weapons related to nuclear and radiological threats against our homeland. The traditional threat vectors, missiles and bombers, have expanded to include threats arriving through the flow of commerce. The new commerce-related vectors include: sea cargo, truck cargo, rail cargo, air cargo, and passenger transport. The types of weapons have also expanded beyond nuclear warheads to include radiation dispersal devices (RDD) or 'dirty' bombs. The consequences of these nuclear and radiological threats are both economic and life threatening. The defense against undesirable materials entering our borders involves extensive radiation monitoring at ports of entry. The radiation and other signatures of potential nuclear and radiological threats are examined along with potential sensors to discover undesirable items in the flow of commerce. Techniques to improve radiation detection are considered. A strategy of primary and secondary screening is proposed to rapidly clear most cargo and carefully examine suspect cargo. (author)

  3. Social Capital of Non-Traditional Students at a German University. Do Traditional and Non-Traditional Students Access Different Social Resources?

    Science.gov (United States)

    Brändle, Tobias; Häuberer, Julia

    2015-01-01

    Social capital is of particular value for the acquisition of education. Not only does it prevent scholars from dropping out but it improves the educational achievement. The paper focuses on access to social resources by traditional and non-traditional students at a German university and asks if there are group differences considering this…

  4. GPR Signal Characterization for Automated Landmine and UXO Detection Based on Machine Learning Techniques

    Directory of Open Access Journals (Sweden)

    Xavier Núñez-Nieto

    2014-10-01

    Full Text Available Landmine clearance is an ongoing problem that currently affects millions of people around the world. This study evaluates the effectiveness of ground penetrating radar (GPR) in demining and unexploded ordnance detection using 2.3-GHz and 1-GHz high-frequency antennas. An automated detection tool based on machine learning techniques is also presented with the aim of automatically detecting underground explosive artifacts. A GPR survey was conducted on a designed scenario that included the most commonly buried items in historic battle fields, such as mines, projectiles and mortar grenades. The buried targets were identified using both frequencies, although the higher vertical resolution provided by the 2.3-GHz antenna allowed for better recognition of the reflection patterns. The targets were also detected automatically using machine learning techniques. Neural networks and logistic regression algorithms were shown to be able to discriminate between potential targets and clutter. The neural network had the most success, with accuracies ranging from 89% to 92% for the 1-GHz and 2.3-GHz antennas, respectively.
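
    The two classifiers named above are sketched below on placeholder trace features to show the target-versus-clutter discrimination step; the feature vectors, labels and network size are illustrative assumptions, not the study's GPR data or trained models.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.neural_network import MLPClassifier
      from sklearn.model_selection import cross_val_score

      X = np.random.rand(300, 8)             # placeholder features extracted from GPR traces
      y = np.random.randint(0, 2, 300)       # placeholder clutter(0) / target(1) labels

      for name, clf in [("logistic regression", LogisticRegression(max_iter=1000)),
                        ("neural network", MLPClassifier(hidden_layer_sizes=(16,),
                                                         max_iter=2000, random_state=0))]:
          print(name, cross_val_score(clf, X, y, cv=5).mean())   # cross-validated accuracy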

  5. Renewable energy sources. Non-traditional actors on the international market

    International Nuclear Information System (INIS)

    1999-01-01

    Five of Sweden's technical attachés have investigated the activity of non-traditional actors within the field of renewable energy sources. The countries studied are the USA, Japan, France, Germany and Great Britain.

  6. Robot Guidance Using Machine Vision Techniques in Industrial Environments: A Comparative Review

    Directory of Open Access Journals (Sweden)

    Luis Pérez

    2016-03-01

    Full Text Available In the factory of the future, most of the operations will be done by autonomous robots that need visual feedback to move around the working space avoiding obstacles, to work collaboratively with humans, to identify and locate the working parts, to complete the information provided by other sensors to improve their positioning accuracy, etc. Different vision techniques, such as photogrammetry, stereo vision, structured light, time of flight and laser triangulation, among others, are widely used for inspection and quality control processes in the industry and now for robot guidance. Choosing which type of vision system to use is highly dependent on the parts that need to be located or measured. Thus, in this paper a comparative review of different machine vision techniques for robot guidance is presented. This work analyzes accuracy, range and weight of the sensors, safety, processing time and environmental influences. Researchers and developers can take it as a background information for their future works.

  7. Robot Guidance Using Machine Vision Techniques in Industrial Environments: A Comparative Review.

    Science.gov (United States)

    Pérez, Luis; Rodríguez, Íñigo; Rodríguez, Nuria; Usamentiaga, Rubén; García, Daniel F

    2016-03-05

    In the factory of the future, most of the operations will be done by autonomous robots that need visual feedback to move around the working space avoiding obstacles, to work collaboratively with humans, to identify and locate the working parts, to complete the information provided by other sensors to improve their positioning accuracy, etc. Different vision techniques, such as photogrammetry, stereo vision, structured light, time of flight and laser triangulation, among others, are widely used for inspection and quality control processes in the industry and now for robot guidance. Choosing which type of vision system to use is highly dependent on the parts that need to be located or measured. Thus, in this paper a comparative review of different machine vision techniques for robot guidance is presented. This work analyzes accuracy, range and weight of the sensors, safety, processing time and environmental influences. Researchers and developers can take it as a background information for their future works.

  8. Efficiency improvement of the maximum power point tracking for PV systems using support vector machine technique

    Science.gov (United States)

    Kareim, Ameer A.; Mansor, Muhamad Bin

    2013-06-01

    The aim of this paper is to improve the efficiency of maximum power point tracking (MPPT) for PV systems. A Support Vector Machine (SVM) approach was proposed to realize the MPPT controller. The theoretical, perturbation and observation (P&O), and incremental conductance (IC) algorithms were used for comparison with the proposed SVM algorithm. MATLAB models for the PV module and for the theoretical, SVM, P&O, and IC algorithms were implemented. The improved MPPT uses the SVM method to predict the optimum voltage of the PV system in order to extract the maximum power point (MPP). The SVM technique uses two inputs, the solar radiation and the ambient temperature of the modeled PV module. The results show that the proposed SVM technique has a lower Root Mean Square Error (RMSE) and higher efficiency than the P&O and IC methods.
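
    The sketch below shows one way an SVM regressor can map irradiance and ambient temperature to a predicted MPP voltage; the training relation is a toy approximation (not a validated PV model or the paper's MATLAB model), so the numbers are purely illustrative.

      import numpy as np
      from sklearn.svm import SVR

      # Synthetic training pairs; in practice these would come from a PV model or measurements.
      G = np.random.uniform(200, 1000, 200)      # irradiance, W/m^2
      T = np.random.uniform(10, 45, 200)         # ambient temperature, deg C
      Vmpp = 30.0 - 0.08 * (T - 25) + 0.001 * (G - 1000)   # toy MPP-voltage relation (assumption)

      model = SVR(kernel="rbf", C=100.0).fit(np.column_stack([G, T]), Vmpp)
      print(model.predict([[800.0, 25.0]]))      # predicted optimum operating voltage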

  9. An Innovative System for the Efficient and Effective Treatment of Non-Traditional Waters for Reuse in Thermoelectric Power Generation

    Energy Technology Data Exchange (ETDEWEB)

    John Rodgers; James Castle

    2008-08-31

    This study assessed opportunities for improving water quality associated with coal-fired power generation including the use of non-traditional waters for cooling, innovative technology for recovering and reusing water within power plants, novel approaches for the removal of trace inorganic compounds from ash pond effluents, and novel approaches for removing biocides from cooling tower blowdown. This research evaluated specifically designed pilot-scale constructed wetland systems for treatment of targeted constituents in non-traditional waters for reuse in thermoelectric power generation and other purposes. The overall objective of this project was to decrease targeted constituents in non-traditional waters to achieve reuse criteria or discharge limitations established by the National Pollutant Discharge Elimination System (NPDES) and Clean Water Act (CWA). The six original project objectives were completed, and results are presented in this final technical report. These objectives included identification of targeted constituents for treatment in four non-traditional water sources, determination of reuse or discharge criteria for treatment, design of constructed wetland treatment systems for these non-traditional waters, and measurement of treatment of targeted constituents in non-traditional waters, as well as determination of the suitability of the treated non-traditional waters for reuse or discharge to receiving aquatic systems. The four non-traditional waters used to accomplish these objectives were ash basin water, cooling water, flue gas desulfurization (FGD) water, and produced water. The contaminants of concern identified in ash basin waters were arsenic, chromium, copper, mercury, selenium, and zinc. Contaminants of concern in cooling waters included free oxidants (chlorine, bromine, and peroxides), copper, lead, zinc, pH, and total dissolved solids. FGD waters contained contaminants of concern including arsenic, boron, chlorides, selenium, mercury

  10. The Revival of Non-Traditional State Actors' Interests in Africa

    DEFF Research Database (Denmark)

    Kragelund, Peter

    2012-01-01

    Africa’s external relations are currently undergoing major changes. Non-traditional state actors like China and India are reviving their ties with African economies and thereby affecting power relations between African states and traditional partners. Meanwhile, high commodity prices and improved...... for African economies this is largely a consequence of the increased availability of external finance - and not just from non-traditional state actors....

  11. Conceptualisation of learning satisfaction experienced by non-traditional learners in Singapore

    OpenAIRE

    Khiat, Henry

    2013-01-01

    This study uncovered the different factors that make up the learning satisfaction of non-traditional learners in Singapore. Data was collected from a component of the student evaluation exercise in a Singapore university in 2011. A mixed-methods approach was adopted in the analysis. The study found that non-traditional learners’ learning satisfaction can be generally grouped into four main categories: a) Desirable Learning Deliverables; b) Directed Learning Related Factors; c) Lecturer/Tutor...

  12. Novel Machine Learning-Based Techniques for Efficient Resource Allocation in Next Generation Wireless Networks

    KAUST Repository

    AlQuerm, Ismail A.

    2018-02-21

    There is a large demand for applications of high data rates in wireless networks. These networks are becoming more complex and challenging to manage due to the heterogeneity of users and applications specifically in sophisticated networks such as the upcoming 5G. Energy efficiency in the future 5G network is one of the essential problems that needs consideration due to the interference and heterogeneity of the network topology. Smart resource allocation, environmental adaptivity, user-awareness and energy efficiency are essential features in the future networks. It is important to support these features at different networks topologies with various applications. Cognitive radio has been found to be the paradigm that is able to satisfy the above requirements. It is a very interdisciplinary topic that incorporates flexible system architectures, machine learning, context awareness and cooperative networking. Mitola’s vision about cognitive radio intended to build context-sensitive smart radios that are able to adapt to the wireless environment conditions while maintaining quality of service support for different applications. Artificial intelligence techniques including heuristics algorithms and machine learning are the shining tools that are employed to serve the new vision of cognitive radio. In addition, these techniques show a potential to be utilized in an efficient resource allocation for the upcoming 5G networks’ structures such as heterogeneous multi-tier 5G networks and heterogeneous cloud radio access networks due to their capability to allocate resources according to real-time data analytics. In this thesis, we study cognitive radio from a system point of view focusing closely on architectures, artificial intelligence techniques that can enable intelligent radio resource allocation and efficient radio parameters reconfiguration. We propose a modular cognitive resource management architecture, which facilitates a development of flexible control for

  13. Classification of breast tumour using electrical impedance and machine learning techniques

    International Nuclear Information System (INIS)

    Amin, Abdullah Al; Parvin, Shahnaj; Kadir, M A; Tahmid, Tasmia; Alam, S Kaisar; Siddique-e Rabbani, K

    2014-01-01

    When a breast lump is detected through palpation, mammography or ultrasonography, the final test for characterization of the tumour, whether it is malignant or benign, is biopsy. This is invasive and carries hazards associated with any surgical procedures. The present work was undertaken to study the feasibility for such characterization using non-invasive electrical impedance measurements and machine learning techniques. Because of changes in cell morphology of malignant and benign tumours, changes are expected in impedance at a fixed frequency, and versus frequency of measurement. Tetrapolar impedance measurement (TPIM) using four electrodes at the corners of a square region of sides 4 cm was used for zone localization. Data of impedance in two orthogonal directions, measured at 5 and 200 kHz from 19 subjects, and their respective slopes with frequency were subjected to machine learning procedures through the use of feature plots. These patients had single or multiple tumours of various types in one or both breasts, and four of them had malignant tumours, as diagnosed by core biopsy. Although size and depth of the tumours are expected to affect the measurements, this preliminary work ignored these effects. Selecting 12 features from the above measurements, feature plots were drawn for the 19 patients, which displayed considerable overlap between malignant and benign cases. However, based on observed qualitative trend of the measured values, when all the feature values were divided by respective ages, the two types of tumours separated out reasonably well. Using K-NN classification method the results obtained are, positive prediction value: 60%, negative prediction value: 93%, sensitivity: 75%, specificity: 87% and efficacy: 84%, which are very good for such a test on a small sample size. Study on a larger sample is expected to give confidence in this technique, and further improvement of the technique may have the ability to replace biopsy. (paper)
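
    The classification step described above (impedance-derived features divided by patient age, then K-NN) can be sketched as follows. The feature matrix and labels are placeholders rather than the study's measurements; only the age-normalisation and K-NN pattern is illustrated, and leave-one-out validation is an assumption suited to the small sample size.

```python
# Hedged sketch of the age-normalised K-NN classification described above.
# The 12 features below are random placeholders for the impedance measures.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.metrics import confusion_matrix

n_patients = 19
rng = np.random.default_rng(1)
features = rng.normal(size=(n_patients, 12))        # placeholder impedance features
age = rng.integers(25, 70, n_patients).reshape(-1, 1)
y = np.zeros(n_patients, dtype=int)
y[:4] = 1                                           # 4 malignant cases, as in the paper

X = features / age                                  # divide every feature by patient age
knn = KNeighborsClassifier(n_neighbors=3)
# Leave-one-out cross-validation (an assumption here) suits such a small sample.
y_pred = cross_val_predict(knn, X, y, cv=LeaveOneOut())
print(confusion_matrix(y, y_pred))
```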

  14. Classification of breast tumour using electrical impedance and machine learning techniques.

    Science.gov (United States)

    Al Amin, Abdullah; Parvin, Shahnaj; Kadir, M A; Tahmid, Tasmia; Alam, S Kaisar; Siddique-e Rabbani, K

    2014-06-01

    When a breast lump is detected through palpation, mammography or ultrasonography, the final test for characterization of the tumour, whether it is malignant or benign, is biopsy. This is invasive and carries hazards associated with any surgical procedures. The present work was undertaken to study the feasibility for such characterization using non-invasive electrical impedance measurements and machine learning techniques. Because of changes in cell morphology of malignant and benign tumours, changes are expected in impedance at a fixed frequency, and versus frequency of measurement. Tetrapolar impedance measurement (TPIM) using four electrodes at the corners of a square region of sides 4 cm was used for zone localization. Data of impedance in two orthogonal directions, measured at 5 and 200 kHz from 19 subjects, and their respective slopes with frequency were subjected to machine learning procedures through the use of feature plots. These patients had single or multiple tumours of various types in one or both breasts, and four of them had malignant tumours, as diagnosed by core biopsy. Although size and depth of the tumours are expected to affect the measurements, this preliminary work ignored these effects. Selecting 12 features from the above measurements, feature plots were drawn for the 19 patients, which displayed considerable overlap between malignant and benign cases. However, based on observed qualitative trend of the measured values, when all the feature values were divided by respective ages, the two types of tumours separated out reasonably well. Using K-NN classification method the results obtained are, positive prediction value: 60%, negative prediction value: 93%, sensitivity: 75%, specificity: 87% and efficacy: 84%, which are very good for such a test on a small sample size. Study on a larger sample is expected to give confidence in this technique, and further improvement of the technique may have the ability to replace biopsy.

  15. Sensorless Speed/Torque Control of DC Machine Using Artificial Neural Network Technique

    Directory of Open Access Journals (Sweden)

    Rakan Kh. Antar

    2017-12-01

    Full Text Available In this paper, an Artificial Neural Network (ANN) technique is implemented to improve the speed and torque control of a separately excited DC machine drive. The speed and torque sensorless scheme based on the ANN is estimated adaptively. The proposed controller is designed to estimate rotor speed and mechanical load torque as a Model Reference Adaptive System (MRAS) method for the DC machine. The DC drive system consists of a four-quadrant DC/DC chopper with MOSFET transistors, the ANN, logic gates and routing circuits. The DC drive circuit is designed, evaluated and modeled in Matlab/Simulink in the forward and reverse operation modes as a motor and generator, respectively. The DC drive system is simulated at different speed values (±1200 rpm) and mechanical torques (±7 N.m) in steady-state and dynamic conditions. The simulation results illustrate the effectiveness of the proposed controller without speed or torque sensors.
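
    The paper's sensorless estimator is an MRAS scheme built around an ANN in Simulink; a minimal, hypothetical stand-in using scikit-learn's MLPRegressor to map armature voltage and current to rotor speed is sketched below. The machine constants and "measurements" are illustrative assumptions, not the authors' drive model.

```python
# Hedged sketch: an MLP estimating DC-machine speed from armature voltage/current.
# Data is generated from a crude steady-state back-EMF model (assumption),
# not from the paper's Simulink drive system.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
v_a = rng.uniform(-240, 240, 2000)                  # armature voltage [V]
i_a = rng.uniform(-20, 20, 2000)                    # armature current [A]
R_a, k_e = 0.5, 0.18                                # illustrative machine constants
speed_rpm = (v_a - R_a * i_a) / k_e * 60 / (2 * np.pi)

X = np.column_stack([v_a, i_a])
ann = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=2000, random_state=0)
ann.fit(X[:1500], speed_rpm[:1500])
print("speed estimate [rpm]:", ann.predict([[220.0, 5.0]])[0])
```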

  16. A hybrid stock trading framework integrating technical analysis with machine learning techniques

    Directory of Open Access Journals (Sweden)

    Rajashree Dash

    2016-03-01

    Full Text Available In this paper, a novel decision support system using a computationally efficient functional link artificial neural network (CEFLANN) and a set of rules is proposed to generate trading decisions more effectively. Here the problem of stock trading decision prediction is articulated as a classification problem with three class values representing the buy, hold and sell signals. The CEFLANN network used in the decision support system produces a set of continuous trading signals within the range 0–1 by analyzing the nonlinear relationship that exists among a few popular technical indicators. Further, the output trading signals are used to track the trend and to produce the trading decision based on that trend using some trading rules. The novelty of the approach lies in generating profitable stock trading decision points through the integration of the learning ability of the CEFLANN neural network with technical analysis rules. For assessing the potential use of the proposed method, the model performance is also compared with some other machine learning techniques such as the Support Vector Machine (SVM), Naive Bayesian model, K nearest neighbor model (KNN) and Decision Tree (DT) model.
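
    The framework above converts the network's continuous trading signal in [0, 1] into buy/hold/sell decisions through trading rules. The abstract does not spell out those rules; the sketch below illustrates one plausible thresholding rule, with the 0.7/0.3 cut-offs being assumptions rather than the paper's values.

```python
# Hedged sketch: mapping a continuous trading signal in [0, 1] to buy/hold/sell.
# The 0.7 / 0.3 thresholds are illustrative assumptions, not the paper's rules.
from typing import List

def decisions_from_signal(signal: List[float],
                          buy_threshold: float = 0.7,
                          sell_threshold: float = 0.3) -> List[str]:
    """Convert each continuous trading-signal value into a discrete decision."""
    out = []
    for s in signal:
        if s >= buy_threshold:
            out.append("buy")
        elif s <= sell_threshold:
            out.append("sell")
        else:
            out.append("hold")
    return out

print(decisions_from_signal([0.82, 0.55, 0.12, 0.71]))
# ['buy', 'hold', 'sell', 'buy']
```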

  17. Controlling the Adhesion of Superhydrophobic Surfaces Using Electrolyte Jet Machining Techniques.

    Science.gov (United States)

    Yang, Xiaolong; Liu, Xin; Lu, Yao; Zhou, Shining; Gao, Mingqian; Song, Jinlong; Xu, Wenji

    2016-04-05

    Patterns with controllable adhesion on superhydrophobic areas have various biomedical and chemical applications. The electrolyte jet machining technique (EJM), an electrochemical machining method, was first exploited to construct dimples with various profiles on a superhydrophobic Al alloy surface using different processing parameters. Sliding angles of water droplets on those dimples first increased and then stabilized at a certain value with increasing processing time or applied voltage of the EJM, indicating that surfaces with different adhesion forces could be obtained by regulating the processing parameters. The contact angle hysteresis and the adhesion force that restricts the droplet from sliding off were investigated through experiments. The results show that the adhesion force could be well described using the classical Furmidge equation. On account of this controllable adhesion force, water droplets could either be firmly pinned to the surface, forming various patterns, or slide off at designed tilting angles at specified positions on a superhydrophobic surface. Such dimples on superhydrophobic surfaces can be applied in water harvesting, biochemical analysis and lab-on-chip devices.
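
    The abstract refers to the classical Furmidge equation for the lateral retention force that keeps a droplet pinned on a tilted surface. For reference, a commonly quoted form of that relation is given below (symbols as usually defined; the paper's exact notation and any prefactor may differ).

```latex
% Furmidge relation: retention force on a droplet of contact width w, where
% \gamma_{LV} is the liquid-vapour surface tension, \theta_R and \theta_A are the
% receding and advancing contact angles, and k is a dimensionless prefactor.
% At the critical sliding (tilting) angle \alpha, gravity balances this force:
F = k\, w\, \gamma_{LV}\,\bigl(\cos\theta_{R} - \cos\theta_{A}\bigr) = m g \sin\alpha
```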

  18. Heart Failure: Diagnosis, Severity Estimation and Prediction of Adverse Events Through Machine Learning Techniques

    Directory of Open Access Journals (Sweden)

    Evanthia E. Tripoliti

    2017-01-01

    Full Text Available Heart failure is a serious condition with high prevalence (about 2% in the adult population in developed countries, and more than 8% in patients older than 75 years). About 3–5% of hospital admissions are linked with heart failure incidents. Heart failure is the first cause of admission by healthcare professionals in their clinical practice. The costs are very high, reaching up to 2% of the total health costs in the developed countries. Building an effective disease management strategy requires analysis of large amount of data, early detection of the disease, assessment of the severity and early prediction of adverse events. This will inhibit the progression of the disease, will improve the quality of life of the patients and will reduce the associated medical costs. Toward this direction machine learning techniques have been employed. The aim of this paper is to present the state-of-the-art of the machine learning methodologies applied for the assessment of heart failure. More specifically, models predicting the presence, estimating the subtype, assessing the severity of heart failure and predicting the presence of adverse events, such as destabilizations, re-hospitalizations, and mortality are presented. According to the authors' knowledge, it is the first time that such a comprehensive review, focusing on all aspects of the management of heart failure, is presented.

  19. A comparison of machine learning techniques for survival prediction in breast cancer.

    Science.gov (United States)

    Vanneschi, Leonardo; Farinaccio, Antonella; Mauri, Giancarlo; Antoniotti, Mauro; Provero, Paolo; Giacobini, Mario

    2011-05-11

    The ability to accurately classify cancer patients into risk classes, i.e. to predict the outcome of the pathology on an individual basis, is a key ingredient in making therapeutic decisions. In recent years gene expression data have been successfully used to complement the clinical and histological criteria traditionally used in such prediction. Many "gene expression signatures" have been developed, i.e. sets of genes whose expression values in a tumor can be used to predict the outcome of the pathology. Here we investigate the use of several machine learning techniques to classify breast cancer patients using one of such signatures, the well established 70-gene signature. We show that Genetic Programming performs significantly better than Support Vector Machines, Multilayered Perceptrons and Random Forests in classifying patients from the NKI breast cancer dataset, and comparably to the scoring-based method originally proposed by the authors of the 70-gene signature. Furthermore, Genetic Programming is able to perform an automatic feature selection. Since the performance of Genetic Programming is likely to be improvable compared to the out-of-the-box approach used here, and given the biological insight potentially provided by the Genetic Programming solutions, we conclude that Genetic Programming methods are worth further investigation as a tool for cancer patient classification based on gene expression data.
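
    A minimal sketch of the kind of cross-validated comparison described above is given below, using scikit-learn's SVM, multilayer perceptron and random forest on placeholder expression data (70 random features standing in for the 70-gene signature, with random labels instead of the NKI cohort). The Genetic Programming classifier that the study found best is not part of scikit-learn and is omitted here; only the comparison scaffolding is shown.

```python
# Hedged sketch: cross-validated comparison of classifiers on placeholder
# gene-expression data. Data and labels are synthetic, not the NKI dataset.
import numpy as np
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 70))      # placeholder expression values
y = rng.integers(0, 2, 200)         # placeholder good/poor prognosis labels

models = {
    "SVM": SVC(kernel="rbf"),
    "MLP": MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000),
    "Random forest": RandomForestClassifier(n_estimators=200, random_state=0),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```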

  20. A comparison of machine learning techniques for survival prediction in breast cancer

    Directory of Open Access Journals (Sweden)

    Vanneschi Leonardo

    2011-05-01

    Full Text Available Abstract Background The ability to accurately classify cancer patients into risk classes, i.e. to predict the outcome of the pathology on an individual basis, is a key ingredient in making therapeutic decisions. In recent years gene expression data have been successfully used to complement the clinical and histological criteria traditionally used in such prediction. Many "gene expression signatures" have been developed, i.e. sets of genes whose expression values in a tumor can be used to predict the outcome of the pathology. Here we investigate the use of several machine learning techniques to classify breast cancer patients using one of such signatures, the well established 70-gene signature. Results We show that Genetic Programming performs significantly better than Support Vector Machines, Multilayered Perceptrons and Random Forests in classifying patients from the NKI breast cancer dataset, and comparably to the scoring-based method originally proposed by the authors of the 70-gene signature. Furthermore, Genetic Programming is able to perform an automatic feature selection. Conclusions Since the performance of Genetic Programming is likely to be improvable compared to the out-of-the-box approach used here, and given the biological insight potentially provided by the Genetic Programming solutions, we conclude that Genetic Programming methods are worth further investigation as a tool for cancer patient classification based on gene expression data.

  1. Influence of Heartwood on Wood Density and Pulp Properties Explained by Machine Learning Techniques

    Directory of Open Access Journals (Sweden)

    Carla Iglesias

    2017-01-01

    Full Text Available The aim of this work is to develop a tool to predict some pulp properties, e.g., pulp yield, Kappa number, ISO brightness (ISO 2470:2008), fiber length and fiber width, using the sapwood and heartwood proportion in the raw material. For this purpose, Acacia melanoxylon trees were collected from four sites in Portugal. The percentage of sapwood and heartwood, the area and the stem eccentricity (in N-S and E-W directions) were measured on transversal stem sections of A. melanoxylon R. Br. The relative position of the samples with respect to the total tree height was also considered as an input variable. Different configurations were tested until the maximum correlation coefficient was achieved. A classical mathematical technique (multiple linear regression) and machine learning methods (classification and regression trees, multi-layer perceptron and support vector machines) were tested. Classification and regression trees (CART) was the most accurate model for the prediction of pulp ISO brightness (R = 0.85). The other parameters could be predicted with fair results (R = 0.64–0.75) by CART. Hence, the proportion of heartwood and sapwood is a relevant parameter for pulping and pulp properties, and should be taken as a quality trait when assessing a pulpwood resource.
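
    A minimal sketch of the CART regression idea described above is shown below, with heartwood proportion and relative sample height as inputs and ISO brightness as the target. The data and the underlying relation are synthetic assumptions, not the A. melanoxylon measurements.

```python
# Hedged sketch: a regression tree (CART) predicting pulp ISO brightness from
# heartwood proportion and relative sample height. Data is synthetic.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
heartwood_pct = rng.uniform(10, 80, 300)
relative_height = rng.uniform(0, 1, 300)
# Toy relation: brightness decreases with heartwood proportion (assumption).
brightness = 60 - 0.15 * heartwood_pct + 2 * relative_height + rng.normal(0, 1, 300)

X = np.column_stack([heartwood_pct, relative_height])
X_tr, X_te, y_tr, y_te = train_test_split(X, brightness, random_state=0)
cart = DecisionTreeRegressor(max_depth=4, random_state=0)
cart.fit(X_tr, y_tr)
print("R^2 on test split:", round(cart.score(X_te, y_te), 3))
```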

  2. Controlling the Adhesion of Superhydrophobic Surfaces Using Electrolyte Jet Machining Techniques

    Science.gov (United States)

    Yang, Xiaolong; Liu, Xin; Lu, Yao; Zhou, Shining; Gao, Mingqian; Song, Jinlong; Xu, Wenji

    2016-01-01

    Patterns with controllable adhesion on superhydrophobic areas have various biomedical and chemical applications. The electrolyte jet machining technique (EJM), an electrochemical machining method, was first exploited to construct dimples with various profiles on a superhydrophobic Al alloy surface using different processing parameters. Sliding angles of water droplets on those dimples first increased and then stabilized at a certain value with increasing processing time or applied voltage of the EJM, indicating that surfaces with different adhesion forces could be obtained by regulating the processing parameters. The contact angle hysteresis and the adhesion force that restricts the droplet from sliding off were investigated through experiments. The results show that the adhesion force could be well described using the classical Furmidge equation. On account of this controllable adhesion force, water droplets could either be firmly pinned to the surface, forming various patterns, or slide off at designed tilting angles at specified positions on a superhydrophobic surface. Such dimples on superhydrophobic surfaces can be applied in water harvesting, biochemical analysis and lab-on-chip devices. PMID:27046771

  3. Empirical analysis of the efficient use of geometric error identification in a machine tool by tracking measurement techniques

    International Nuclear Information System (INIS)

    Aguado, S; Santolaria, J; Samper, D; Velazquez, J; Aguilar, J J

    2016-01-01

    Volumetric verification is becoming increasingly accepted as a suitable technique with which to improve machine tool accuracy. In the same way, the use of laser trackers to obtain machine error information using the new Active Target motorised retro-reflector allows the verification of all types of machine tool throughout their workspaces. Non-linear optimisation methods and machine tool kinematic models are the mainstays of this technique. Whereas the latter provide the relationship between the nominal coordinates, the geometric errors of the machine and laser tracker measurement, the former reduces the combined influence of geometric errors by obtaining their approximation functions. However, within these two procedures, several factors affect the scope of the produced verification results. The present paper focuses on the analysis of the adequacy of commercial measurement techniques using laser trackers and the new motorised retro-reflector in a real milling machine. An examination is also made regarding the influence of the optimisation sequence defined by the identification strategy, as well as the impact of the number of measured points in relation to the employed regression functions. (paper)

  4. Hybrid machine learning technique for forecasting Dhaka stock market timing decisions.

    Science.gov (United States)

    Banik, Shipra; Khodadad Khan, A F M; Anwer, Mohammad

    2014-01-01

    Forecasting the stock market has been a difficult job for applied researchers owing to the nature of the data, which is very noisy and time varying. Nevertheless, a number of empirical studies have addressed this problem, and researchers have effectively applied machine learning techniques to forecast the stock market. This paper studies stock prediction for the use of investors. Investors typically incur losses because of uncertain investment objectives and ill-informed asset choices. This paper proposes a rough set model, a neural network model, and a hybrid neural network and rough set model to find the optimal buy and sell points of a share on the Dhaka Stock Exchange. Experimental findings demonstrate that our proposed hybrid model has higher precision than the single rough set model and the neural network model. We believe these findings will help stock investors decide on optimal buy and/or sell times on the Dhaka Stock Exchange.

  5. Application of machine learning techniques to analyse the effects of physical exercise in ventricular fibrillation.

    Science.gov (United States)

    Caravaca, Juan; Soria-Olivas, Emilio; Bataller, Manuel; Serrano, Antonio J; Such-Miquel, Luis; Vila-Francés, Joan; Guerrero, Juan F

    2014-02-01

    This work presents the application of machine learning techniques to analyse the influence of physical exercise on the physiological properties of the heart during ventricular fibrillation. To this end, different kinds of classifiers (linear and neural models) are used to discriminate between trained and sedentary rabbit hearts. The use of those classifiers in combination with a wrapper feature selection algorithm allows knowledge to be extracted about the most relevant features of the problem. The obtained results show that neural models outperform linear classifiers (better performance indices and better dimensionality reduction). The most relevant features for describing the benefits of physical exercise are those related to myocardial heterogeneity, mean activation rate and activation complexity. © 2013 Published by Elsevier Ltd.
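
    The abstract combines classifiers with a wrapper feature-selection algorithm to find the most relevant electrophysiological features. A minimal sketch of that pattern, using scikit-learn's SequentialFeatureSelector around a small neural classifier on placeholder features, is shown below; the specific wrapper, feature names and data used in the paper are not stated in the abstract, so everything here is illustrative.

```python
# Hedged sketch: wrapper feature selection around a classifier, standing in for
# the trained-vs-sedentary heart classification described above. Data is synthetic.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.feature_selection import SequentialFeatureSelector

rng = np.random.default_rng(5)
feature_names = np.array(["heterogeneity", "mean_activation_rate",
                          "activation_complexity", "dominant_freq", "amplitude"])
X = rng.normal(size=(120, 5))                 # placeholder VF-signal features
y = rng.integers(0, 2, 120)                   # 0 = sedentary, 1 = trained (placeholder)

clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=1000, random_state=0)
selector = SequentialFeatureSelector(clf, n_features_to_select=3, cv=3)
selector.fit(X, y)
print("selected features:", feature_names[selector.get_support()])
```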

  6. Model-based orientation-independent 3-D machine vision techniques

    Science.gov (United States)

    De Figueiredo, R. J. P.; Kehtarnavaz, N.

    1988-01-01

    Orientation-dependent techniques for the identification of a three-dimensional object by a machine vision system are presented in two parts. In the first part, the data consist of intensity images of polyhedral objects obtained by a single camera, while in the second part, the data consist of range images of curved objects obtained by a laser scanner. In both cases, the attributed graph representation of the object surface is used to drive the respective algorithm. In this representation, a graph node represents a surface patch and a link represents the adjacency between two patches. The attributes assigned to nodes are moment invariants of the corresponding face for polyhedral objects. For range images, the Gaussian curvature is used as a segmentation criterion for providing symbolic shape attributes. Identification is achieved by an efficient graph-matching algorithm used to match the graph obtained from the data to a subgraph of one of the model graphs stored in the computer memory.

  7. Experimental Machine Vision System for Training Students in Virtual Instrumentation Techniques

    Directory of Open Access Journals (Sweden)

    Rodica Holonec

    2011-10-01

    Full Text Available The aim of this paper is to present the main techniques in the design and construction of a complex machine vision system in order to train electrical engineering students in using virtual instrumentation. The proposed test bench realizes an automatic adjustment of some electrical circuit parameters on a belt conveyor. The students can learn how to combine mechanics, electronics, electrical engineering, and image acquisition and processing in order to solve the proposed application. After the system implementation, the students are asked to present in which way they can modify or extend the system for an industrial environment regarding the automatic adjustment of electric parameters or the calibration of different types of sensors (of distance, of proximity, etc.) without the intervention of the human factor in the process.

  8. Modeling, Control and Analyze of Multi-Machine Drive Systems using Bond Graph Technique

    Directory of Open Access Journals (Sweden)

    J. Belhadj

    2006-03-01

    Full Text Available In this paper, a system-viewpoint method has been investigated to study and analyze complex systems using the Bond Graph technique. These systems are multi-machine, multi-inverter systems based on the Induction Machine (IM), widely used in industries such as rolling mills, textiles, and railway traction. These systems are multi-domain and multi-time-scale, and present very strong internal and external couplings, with non-linearity characterized by a high model order. The classical study with an analytic model is difficult to manipulate and is limited to some performance aspects. In this study, a “systemic approach” is presented to design these kinds of systems, using an energetic representation based on the Bond Graph formalism. Three types of multi-machine systems are studied with their control strategies. The modeling is carried out by Bond Graph and the results are discussed to show the performance of this methodology.

  9. Markerless gating for lung cancer radiotherapy based on machine learning techniques

    International Nuclear Information System (INIS)

    Lin Tong; Li Ruijiang; Tang Xiaoli; Jiang, Steve B; Dy, Jennifer G

    2009-01-01

    In lung cancer radiotherapy, radiation to a mobile target can be delivered by respiratory gating, for which we need to know whether the target is inside or outside a predefined gating window at any time point during the treatment. This can be achieved by tracking one or more fiducial markers implanted inside or near the target, either fluoroscopically or electromagnetically. However, the clinical implementation of marker tracking is limited for lung cancer radiotherapy mainly due to the risk of pneumothorax. Therefore, gating without implanted fiducial markers is a promising clinical direction. We have developed several template-matching methods for fluoroscopic marker-less gating. Recently, we have modeled the gating problem as a binary pattern classification problem, in which principal component analysis (PCA) and support vector machine (SVM) are combined to perform the classification task. Following the same framework, we investigated different combinations of dimensionality reduction techniques (PCA and four nonlinear manifold learning methods) and two machine learning classification methods (artificial neural networks-ANN and SVM). Performance was evaluated on ten fluoroscopic image sequences of nine lung cancer patients. We found that among all combinations of dimensionality reduction techniques and classification methods, PCA combined with either ANN or SVM achieved a better performance than the other nonlinear manifold learning methods. ANN when combined with PCA achieves a better performance than SVM in terms of classification accuracy and recall rate, although the target coverage is similar for the two classification methods. Furthermore, the running time for both ANN and SVM with PCA is within tolerance for real-time applications. Overall, ANN combined with PCA is a better candidate than other combinations we investigated in this work for real-time gated radiotherapy.
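
    The gating framework described above treats each fluoroscopic frame as a binary pattern-classification problem: reduce dimensionality with PCA, then decide beam-on/beam-off with an SVM or ANN. A minimal sketch of the PCA+SVM variant on placeholder image vectors is given below; frame sizes, component counts and labels are assumptions, not the patients' data.

```python
# Hedged sketch: PCA followed by an SVM classifying fluoroscopic frames as
# "target inside gating window" (1) or not (0). Frames here are random vectors.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.pipeline import Pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)
frames = rng.normal(size=(400, 64 * 64))   # placeholder flattened fluoroscopic frames
labels = rng.integers(0, 2, 400)           # placeholder beam-on/beam-off labels

gating_clf = Pipeline([
    ("pca", PCA(n_components=20)),         # dimensionality reduction
    ("svm", SVC(kernel="rbf", C=1.0)),     # binary gating decision
])
print("CV accuracy:", cross_val_score(gating_clf, frames, labels, cv=5).mean())
```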

  10. Development of Experimental Setup of Metal Rapid Prototyping Machine using Selective Laser Sintering Technique

    Science.gov (United States)

    Patil, S. N.; Mulay, A. V.; Ahuja, B. B.

    2018-04-01

    Unlike traditional manufacturing processes, additive manufacturing, as rapid prototyping, allows designers to produce parts that were previously considered too complex to make economically. The shift is taking place from plastic prototypes to fully functional metallic parts produced by direct deposition of metallic powders, as the produced parts can be directly used for the desired purpose. This work is directed towards the development of an experimental setup of a metal rapid prototyping machine using selective laser sintering, and studies the various parameters which play an important role in metal rapid prototyping using the SLS technique. The machine structure is mainly divided into three main categories, namely, (1) Z-movement of bed and table, (2) X-Y movement arrangement for LASER movements and (3) feeder mechanism. The Z-movement of the bed is controlled by using a lead screw, bevel gear pair and stepper motor, which will maintain the accuracy of the layer thickness. X-Y movements are controlled using a timing belt and stepper motors for precise movements of the LASER source. A feeder mechanism is then developed to control the uniformity of the metal powder layer thickness. Simultaneously, a study is carried out for the selection of material. Various types of metal powders can be used for metal RP, such as a single metal powder, a mixture of two metal powders, and a combination of metal and polymer powders. The conclusion leads to the use of a mixture of two metal powders to minimize problems such as balling effect and porosity. The developed system can be validated by conducting various experiments on the manufactured part to check its mechanical and metallurgical properties. After studying the results of these experiments, various process parameters such as LASER properties (power, speed, etc.) and material properties (grain size, structure, etc.) will be optimized. This work is mainly focused on the design and development of a cost-effective experimental setup for metal rapid prototyping using the SLS technique which will give the feel of

  11. Multivariate Time Series Forecasting of Crude Palm Oil Price Using Machine Learning Techniques

    Science.gov (United States)

    Kanchymalay, Kasturi; Salim, N.; Sukprasert, Anupong; Krishnan, Ramesh; Raba'ah Hashim, Ummi

    2017-08-01

    The aim of this paper was to study the correlation between the crude palm oil (CPO) price, selected vegetable oil prices (such as soybean oil, coconut oil, olive oil, rapeseed oil and sunflower oil), the crude oil price and the monthly exchange rate. Comparative analysis was then performed on CPO price forecasting results using machine learning techniques. Monthly CPO prices, selected vegetable oil prices, crude oil prices and monthly exchange rate data from January 1987 to February 2017 were utilized. Preliminary analysis showed a positive and high correlation between the CPO price and the soybean oil price, and also between the CPO price and the crude oil price. Experiments were conducted using multi-layer perceptron, support vector regression and Holt-Winters exponential smoothing techniques. The results were assessed using the criteria of root mean square error (RMSE), mean absolute error (MAE), mean absolute percentage error (MAPE) and direction of accuracy (DA). Among these three techniques, support vector regression (SVR) with the sequential minimal optimization (SMO) algorithm showed relatively better results compared to the multi-layer perceptron and Holt-Winters exponential smoothing methods.
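
    A minimal sketch of the multivariate SVR forecast and the error metrics (RMSE, MAE, MAPE) named above is given below. The input columns are placeholders for the soybean-oil, crude-oil and exchange-rate series, and the data and coefficients are synthetic assumptions rather than the 1987–2017 series used in the paper.

```python
# Hedged sketch: SVR forecasting a CPO-like price from related series, evaluated
# with RMSE, MAE and MAPE. All series here are synthetic placeholders.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import mean_squared_error, mean_absolute_error

rng = np.random.default_rng(7)
n = 360                                           # ~30 years of monthly data
soybean = rng.uniform(400, 900, n)
crude_oil = rng.uniform(30, 120, n)
exchange_rate = rng.uniform(3.0, 4.5, n)
cpo = 0.6 * soybean + 2.0 * crude_oil + 50 * exchange_rate + rng.normal(0, 20, n)

X = np.column_stack([soybean, crude_oil, exchange_rate])
split = 300
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0))
model.fit(X[:split], cpo[:split])
pred = model.predict(X[split:])

rmse = np.sqrt(mean_squared_error(cpo[split:], pred))
mae = mean_absolute_error(cpo[split:], pred)
mape = np.mean(np.abs((cpo[split:] - pred) / cpo[split:])) * 100
print(f"RMSE={rmse:.1f}  MAE={mae:.1f}  MAPE={mape:.1f}%")
```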

  12. Machine Learning Techniques for Optical Performance Monitoring from Directly Detected PDM-QAM Signals

    DEFF Research Database (Denmark)

    Thrane, Jakob; Wass, Jesper; Piels, Molly

    2017-01-01

    Linear signal processing algorithms are effective in dealing with linear transmission channel and linear signal detection, while the nonlinear signal processing algorithms, from the machine learning community, are effective in dealing with nonlinear transmission channel and nonlinear signal...... detection. In this paper, a brief overview of the various machine learning methods and their application in optical communication is presented and discussed. Moreover, supervised machine learning methods, such as neural networks and support vector machine, are experimentally demonstrated for in-band optical...

  13. Student Media Usage Patterns and Non-Traditional Learning in Higher Education

    Directory of Open Access Journals (Sweden)

    Olaf Zawacki-Richter

    2015-04-01

    Full Text Available A total of 2,338 students at German universities participated in a survey, which investigated media usage patterns of so-called traditional and non-traditional students (Schuetze & Wolter, 2003). The students provided information on the digital devices that they own or have access to, and on their usage of media and e-learning tools and services for their learning. A distinction was made between external, formal and internal, informal tools and services. Based on the students’ responses, a typology of media usage patterns was established by means of a latent class analysis (LCA). Four types or profiles of media usage patterns were identified. These types were labeled entertainment users, peripheral users, advanced users and instrumental users. Among non-traditional students, the proportion of instrumental users was rather high. Based on the usage patterns of traditional and non-traditional students, implications for media selection in the instructional design process are outlined in the paper.

  14. Increasing diversity in international education: Programming for non-traditional students through an alternative curriculum model

    Directory of Open Access Journals (Sweden)

    Rebecca A. Clothey

    2016-05-01

    Full Text Available This paper looks at an alternative curriculum model for study abroad designed specifically to address some of the needs of non-traditional students enrolled in an online education program. In order to meet the needs of non-traditional students and provide quality international programming for them, it is necessary first to understand their restraints to studying abroad, and then to design alternative educational models that can address these challenges. The paper describes the challenges of balancing the need to create quality international learning opportunities for education students, with the limitations faced by non-traditional online adult learners who have families and full-time jobs. It is based on an action research case study of two study abroad programs implemented for online students at a northeastern four-year research-one institution of higher education.

  15. Role of non-traditional locations for seasonal flu vaccination: Empirical evidence and evaluation.

    Science.gov (United States)

    Kim, Namhoon; Mountain, Travis P

    2017-05-19

    This study investigated the role of non-traditional locations in the decision to vaccinate for seasonal flu. We measured individuals' preferred location for seasonal flu vaccination by examining the National H1N1 Flu Survey (NHFS) conducted from late 2009 to early 2010. Our econometric model estimated the probabilities of possible choices by varying individual characteristics, and predicted the way in which the probabilities are expected to change given the specific covariates of interest. From this estimation, we observed that non-traditional locations significantly influenced the vaccination of certain individuals, such as those who are high-income, educated, White, employed, and living in a metropolitan statistical area (MSA), by increasing the coverage. Thus, based on the empirical evidence, our study suggested that supporting non-traditional locations for vaccination could be effective in increasing vaccination coverage. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Machine learning techniques in disease forecasting: a case study on rice blast prediction

    Directory of Open Access Journals (Sweden)

    Kapoor Amar S

    2006-11-01

    Full Text Available Abstract Background Diverse modeling approaches, viz. neural networks and multiple regression, have been followed to date for disease prediction in plant populations. However, due to their inability to predict the value of unknown data points and their longer training times, there is a need to exploit new prediction software for a better understanding of plant-pathogen-environment relationships. Further, there is no online tool available which can help plant researchers or farmers in the timely application of control measures. This paper introduces a new prediction approach based on support vector machines for developing weather-based prediction models of plant diseases. Results Six significant weather variables were selected as predictor variables. Two series of models (cross-location and cross-year) were developed and validated using a five-fold cross validation procedure. For cross-year models, the conventional multiple regression (REG) approach achieved an average correlation coefficient (r) of 0.50, which increased to 0.60, and the percent mean absolute error (%MAE) decreased from 65.42 to 52.24 when a back-propagation neural network (BPNN) was used. With a generalized regression neural network (GRNN), the r increased to 0.70 and the %MAE also improved to 46.30, which further increased to r = 0.77 and %MAE = 36.66 when a support vector machine (SVM) based method was used. Similarly, cross-location validation achieved r = 0.48, 0.56 and 0.66 using REG, BPNN and GRNN respectively, with their corresponding %MAE as 77.54, 66.11 and 58.26. The SVM-based method outperformed all the three approaches by further increasing r to 0.74 with an improvement in %MAE to 44.12. Overall, this SVM-based prediction approach will open new vistas in the area of forecasting plant diseases of various crops. Conclusion Our case study demonstrated that SVM is better than existing machine learning techniques and conventional REG approaches in forecasting plant diseases. In this direction, we have also

  17. To be or not to be: the importance of attendance in integrated physiology teaching using non-traditional approaches

    Directory of Open Access Journals (Sweden)

    Garrido Concepción

    2011-09-01

    Full Text Available Abstract Background There is increasing use of non-traditional methods like problem-based learning, team-working and several other active-learning techniques in Physiology teaching. While several studies have investigated the impact of class attendance on the academic performance in traditional teaching, there is limited information regarding whether the new modalities are especially sensitive to this factor. Methods Here, we performed a comparative study between a control group receiving information through traditional methods and an experimental group submitted to new methodologies in Physiology teaching. Results We found that while mean examination scores were similar in the control and the experimental groups, a different picture emerges when the data are organized according to four categorical attendance levels. In the experimental group, scores were not different between the 1st and the 2nd exams (P = 0.429) nor between the 2nd and the 3rd exams (P = 0.225) for students that never or poorly attend classes, in contrast to the control group (P Conclusion We suggest that class attendance is critical for learning using non-traditional methods.

  18. To be or not to be: the importance of attendance in integrated physiology teaching using non-traditional approaches.

    Science.gov (United States)

    Gal, Beatriz; Busturia, Ignacio; Garrido, Concepción

    2011-09-14

    There is increasing use of non-traditional methods like problem-based learning, team-working and several other active-learning techniques in Physiology teaching. While several studies have investigated the impact of class attendance on the academic performance in traditional teaching, there is limited information regarding whether the new modalities are especially sensitive to this factor. Here, we performed a comparative study between a control group receiving information through traditional methods and an experimental group submitted to new methodologies in Physiology teaching. We found that while mean examination scores were similar in the control and the experimental groups, a different picture emerges when the data are organized according to four categorical attendance levels. In the experimental group, scores were not different between the 1st and the 2nd exams (P = 0.429) nor between the 2nd and the 3rd exams (P = 0.225) for students that never or poorly attend classes, in contrast to the control group (P attending students versus the absentees was maximal in the experimental versus the control group all along the different exams and in the final score. We suggest that class attendance is critical for learning using non-traditional methods.

  19. A methodology for automated CPA extraction using liver biopsy image analysis and machine learning techniques.

    Science.gov (United States)

    Tsipouras, Markos G; Giannakeas, Nikolaos; Tzallas, Alexandros T; Tsianou, Zoe E; Manousou, Pinelopi; Hall, Andrew; Tsoulos, Ioannis; Tsianos, Epameinondas

    2017-03-01

    Collagen proportional area (CPA) extraction in liver biopsy images provides the degree of fibrosis expansion in liver tissue, which is the most characteristic histological alteration in hepatitis C virus (HCV). Assessment of the fibrotic tissue is currently based on semiquantitative staging scores such as Ishak and Metavir. Since its introduction as a fibrotic tissue assessment technique, CPA calculation based on image analysis techniques has proven to be more accurate than semiquantitative scores. However, CPA has yet to reach everyday clinical practice, since the lack of standardized and robust methods for computerized image analysis for CPA assessment have proven to be a major limitation. The current work introduces a three-stage fully automated methodology for CPA extraction based on machine learning techniques. Specifically, clustering algorithms have been employed for background-tissue separation, as well as for fibrosis detection in liver tissue regions, in the first and the third stage of the methodology, respectively. Due to the existence of several types of tissue regions in the image (such as blood clots, muscle tissue, structural collagen, etc.), classification algorithms have been employed to identify liver tissue regions and exclude all other non-liver tissue regions from CPA computation. For the evaluation of the methodology, 79 liver biopsy images have been employed, obtaining 1.31% mean absolute CPA error, with 0.923 concordance correlation coefficient. The proposed methodology is designed to (i) avoid manual threshold-based and region selection processes, widely used in similar approaches presented in the literature, and (ii) minimize CPA calculation time. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
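
    The three-stage pipeline above clusters pixels to separate background from tissue, classifies tissue regions, and then detects fibrotic (collagen) pixels, from which CPA is the ratio of collagen area to tissue area. A minimal sketch of that computation with k-means on a placeholder stained-image array is shown below; the stain colour rules and cluster counts are assumptions, and the tissue-classification stage is omitted.

```python
# Hedged sketch: k-means based collagen proportional area (CPA) on a synthetic
# RGB biopsy-like image. CPA = collagen pixels / tissue pixels.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(8)
image = rng.uniform(0, 1, size=(100, 100, 3))      # placeholder stained image
pixels = image.reshape(-1, 3)

# Stage 1: background vs tissue (2 clusters); assume the brighter cluster is background.
bg_tissue = KMeans(n_clusters=2, n_init=10, random_state=0).fit(pixels)
brightness = bg_tissue.cluster_centers_.mean(axis=1)
tissue_mask = bg_tissue.labels_ != brightness.argmax()
tissue_pixels = pixels[tissue_mask]

# Stage 3: within tissue, separate collagen-stained from other tissue (2 clusters);
# assume the redder cluster corresponds to collagen (illustrative rule only).
fib = KMeans(n_clusters=2, n_init=10, random_state=0).fit(tissue_pixels)
redness = fib.cluster_centers_[:, 0] - fib.cluster_centers_[:, 1]
collagen_mask = fib.labels_ == redness.argmax()

cpa = collagen_mask.sum() / tissue_mask.sum() * 100
print(f"CPA = {cpa:.2f}%")
```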

  20. A Study of Information-Seeking Behavior of South Florida Non-Traditional College Students

    Science.gov (United States)

    Argov, Sharon R.

    2017-01-01

    Sense-making theory involves a sociological/communications science approach to the need of students to make sense of their environment and the information they are seeking to find. With many digital resources to choose from, non-traditional students often have difficulty finding the best resource for their assignments, opting for the easiest path…

  1. Testing Algorithmic Skills in Traditional and Non-Traditional Programming Environments

    Science.gov (United States)

    Csernoch, Mária; Biró, Piroska; Máth, János; Abari, Kálmán

    2015-01-01

    The Testing Algorithmic and Application Skills (TAaAS) project was launched in the 2011/2012 academic year to test first year students of Informatics, focusing on their algorithmic skills in traditional and non-traditional programming environments, and on the transference of their knowledge of Informatics from secondary to tertiary education. The…

  2. Non-Traditional Students and Critical Pedagogy: Transformative Practice and the Teaching of Criminal Law

    Science.gov (United States)

    Menis, Susanna

    2017-01-01

    This article explores the practical implication of adopting critical pedagogy, and more specifically critical legal pedagogy, in the teaching of non-traditional students in higher education context. It is based on the teaching of criminal law at Birkbeck School of Law, addressing learning tasks which have been designed to enhance students'…

  3. Non-Traditional Socio-Mathematical Norms in Undergraduate Real Analysis

    Science.gov (United States)

    Dawkins, Paul Christian

    2009-01-01

    This study builds upon the framework of classroom norms (Cobb, Wood, & Yackel, 1993) and socio-mathematical norms (Cobb & Yackel, 1996) to understand how non-traditional socio-mathematical norms influence student reasoning and transitions to advanced mathematical thinking in undergraduate real analysis. The research involves a qualitative…

  4. State-of-the-art report on non-traditional traffic counting methods

    Science.gov (United States)

    2001-10-01

    The purpose of this report is to look at the state-of-the-art of non-traditional traffic counting methods. This is done through a three-fold approach that includes an assessment of currently available technology, a survey of State Department of Trans...

  5. Export contracts for non-traditional products: Chayote from Costa Rica

    NARCIS (Netherlands)

    Saénz, F.; Ruben, R.

    2004-01-01

    This paper focuses on the determinants of market and contract choice for non-traditional crops and the possibilities for involving local producers in global agro-food chains through delivery relationships with packers and brokers. Main attention is given to the importance of quality for entering the

  6. Women into Non-Traditional Sectors: Addressing Gender Segregation in the Northern Ireland Workplace

    Science.gov (United States)

    Potter, Michael; Hill, Myrtle

    2009-01-01

    The horizontal segregation of the workforce along gender lines tends to assign women to lower paid, lower status employment. Consequently, schemes to address segregation have focused on preparing women to enter non-traditional occupations through training and development processes. This article examines models to encourage women into…

  7. Then and Now: The Evolution of a Non-Traditional Institution.

    Science.gov (United States)

    Heiner, Harold G.

    The Whatcom Community College District (WCCD) was created by the Washington State legislature in 1967. With no money available to construct a traditional campus and no evidence that there was a sufficient number of potential students in any case, a non-traditional institution was designed, operating out of rented and borrowed classrooms throughout…

  8. Non-traditional whispering gallery modes inside microspheres visualized with Fourier analysis

    NARCIS (Netherlands)

    Chang, Lantian; Timmermans, Frank; Otto, Cees

    2017-01-01

    Non-traditional whispering gallery modes are studied in a glass microsphere. Geometrical ray tracing is used to explain and calculate these modes. Thermal emission and Raman scattering are used as an internal light source to excite these modes inside the glass microsphere. The thermal and Raman

  9. Student Media Usage Patterns and Non-Traditional Learning in Higher Education

    Science.gov (United States)

    Zawacki-Richter, Olaf; Müskens, Wolfgang; Krause, Ulrike; Alturki, Uthman; Aldraiweesh, Ahmed

    2015-01-01

    A total of 2,338 students at German universities participated in a survey, which investigated media usage patterns of so-called traditional and non-traditional students (Schuetze & Wolter, 2003). The students provided information on the digital devices that they own or have access to, and on their usage of media and e-learning tools and…

  10. Access to and Use of Export Market Information by Non- Traditional ...

    African Journals Online (AJOL)

    Ghana has traditionally depended on a number of export commodities such as cocoa, timber, gold and diamonds for its economic and social development. Recent economic policies of government have aimed to expand the country's exports to include non-traditional exports such as horticultural products, textiles, fishery ...

  11. Motivational Orientations of Non-Traditional Adult Students to Enroll in a Degree-Seeking Program

    Science.gov (United States)

    Francois, Emmanuel Jean

    2014-01-01

    The purpose of this research was to investigate the motivational orientations of non-traditional adult students to enroll in a degree-seeking program based on their academic goal. The Education Participation Scale (EPS) was used to measure the motivational orientations of participants. Professional advancement, cognitive interest, and educational…

  12. Food Challenge: Serving Up 4-H to Non-Traditional Audiences

    Science.gov (United States)

    Dodd, Sara; Follmer-Reece, Holly E.; Kostina-Ritchey, Erin; Reyna, Roxanna

    2015-01-01

    This article describes a novel approach for introducing 4-H to non-traditional/diverse audiences using 4-H Food Challenge. Set in a low SES and minority-serving rural school, Food Challenge was presented during the school day to all 7th grade students, with almost half voluntarily participating in an after-school club component. Program design…

  13. Enhancing Critical Thinking Skills and Writing Skills through the Variation in Non-Traditional Writing Task

    Science.gov (United States)

    Sinaga, Parlindungan; Feranie, Shelly

    2017-01-01

    The research aims to identify the impacts of embedding non-traditional writing tasks within the course of modern physics conducted to the students of Physics Education and Physics Study Programs. It employed a quasi-experimental method with the pretest-posttest control group design. The used instruments were tests on conceptual mastery, tests on…

  14. Machine learning and statistical techniques : an application to the prediction of insolvency in Spanish non-life insurance companies

    OpenAIRE

    Díaz, Zuleyka; Segovia, María Jesús; Fernández, José

    2005-01-01

    Prediction of insurance companies insolvency has arisen as an important problem in the field of financial research. Most methods applied in the past to tackle this issue are traditional statistical techniques which use financial ratios as explicative variables. However, these variables often do not satisfy statistical assumptions, which complicates the application of the mentioned methods. In this paper, a comparative study of the performance of two non-parametric machine learning techniques ...

  15. A comparison of machine learning techniques for detection of drug target articles.

    Science.gov (United States)

    Danger, Roxana; Segura-Bedmar, Isabel; Martínez, Paloma; Rosso, Paolo

    2010-12-01

    Important progress in treating diseases has been possible thanks to the identification of drug targets. Drug targets are the molecular structures whose abnormal activity, associated to a disease, can be modified by drugs, improving the health of patients. Pharmaceutical industry needs to give priority to their identification and validation in order to reduce the long and costly drug development times. In the last two decades, our knowledge about drugs, their mechanisms of action and drug targets has rapidly increased. Nevertheless, most of this knowledge is hidden in millions of medical articles and textbooks. Extracting knowledge from this large amount of unstructured information is a laborious job, even for human experts. Drug target articles identification, a crucial first step toward the automatic extraction of information from texts, constitutes the aim of this paper. A comparison of several machine learning techniques has been performed in order to obtain a satisfactory classifier for detecting drug target articles using semantic information from biomedical resources such as the Unified Medical Language System. The best result has been achieved by a Fuzzy Lattice Reasoning classifier, which reaches 98% of ROC area measure. Copyright © 2010 Elsevier Inc. All rights reserved.

  16. Online laboratory evaluation of seeding-machine application by an acoustic technique

    Directory of Open Access Journals (Sweden)

    Hadi Karimi

    2015-03-01

    Full Text Available Researchers and planter manufacturers have been working closely to develop an automated system for evaluating the performance of seeding. In the present study, an innovative use of an acoustic signal for the laboratory evaluation of seeding-machine application is described. The seed detection technique of the proposed system was based on the rising voltage value that a microphone sensed at each impact of a seed on a steel plate. Online determination of seed spacing was done with a script written in MATLAB. To evaluate the acoustic system with desired seed spacings, a testing rig was designed. Seeds of wheat, corn and pelleted tomato were used as experimental material. Typical seed patterns were positioned manually on a belt stand with different spacing patterns. When the belt was running, the seeds falling from the end point of the belt impacted the steel plate, and their acoustic signal was sensed by the microphone. At each impact, the data was processed and the spacing between the seeds was automatically obtained. The coefficient of determination between the data gathered from the belt system and the corresponding seed spacings measured with the acoustic system in all runs was about 0.98. This strong correlation indicates that the acoustic system worked well in determining the seed spacing.
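
    The detection logic described above (a rising microphone voltage at every seed impact, then spacing computed from the time between impacts and the belt speed) can be sketched as follows. The original script was written in MATLAB; this is a hypothetical Python equivalent using scipy's peak detection on a synthetic signal, with the sampling rate, threshold and belt speed all being assumptions.

```python
# Hedged sketch: detect seed impacts as voltage peaks and convert the time
# between impacts into seed spacing. Signal, sampling rate and belt speed are
# illustrative assumptions, not the paper's setup.
import numpy as np
from scipy.signal import find_peaks

fs = 10_000                 # sampling rate [Hz] (assumption)
belt_speed = 0.5            # belt speed [m/s] (assumption)

# Synthetic microphone signal: noise plus sharp spikes at each seed impact.
t = np.arange(0, 2.0, 1 / fs)
signal = 0.02 * np.random.default_rng(9).normal(size=t.size)
impact_times = [0.20, 0.55, 0.90, 1.30, 1.70]
for ti in impact_times:
    signal[int(ti * fs)] += 1.0

peaks, _ = find_peaks(signal, height=0.5, distance=int(0.05 * fs))
spacing_m = np.diff(peaks) / fs * belt_speed
print("detected spacings [cm]:", np.round(spacing_m * 100, 1))
```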

  17. Taxi Time Prediction at Charlotte Airport Using Fast-Time Simulation and Machine Learning Techniques

    Science.gov (United States)

    Lee, Hanbong

    2016-01-01

    Accurate taxi time prediction is required for enabling efficient runway scheduling that can increase runway throughput and reduce taxi times and fuel consumptions on the airport surface. Currently NASA and American Airlines are jointly developing a decision-support tool called Spot and Runway Departure Advisor (SARDA) that assists airport ramp controllers to make gate pushback decisions and improve the overall efficiency of airport surface traffic. In this presentation, we propose to use Linear Optimized Sequencing (LINOS), a discrete-event fast-time simulation tool, to predict taxi times and provide the estimates to the runway scheduler in real-time airport operations. To assess its prediction accuracy, we also introduce a data-driven analytical method using machine learning techniques. These two taxi time prediction methods are evaluated with actual taxi time data obtained from the SARDA human-in-the-loop (HITL) simulation for Charlotte Douglas International Airport (CLT) using various performance measurement metrics. Based on the taxi time prediction results, we also discuss how the prediction accuracy can be affected by the operational complexity at this airport and how we can improve the fast time simulation model before implementing it with an airport scheduling algorithm in a real-time environment.

  18. Online laboratory evaluation of seeding-machine application by an acoustic technique

    Energy Technology Data Exchange (ETDEWEB)

    Karimi, H.; Navid, H.; Mahmoudi, A.

    2015-07-01

    Researchers and planter manufacturers have been working closely to develop an automated system for evaluating seeding performance. In the present study, an innovative use of an acoustic signal for laboratory evaluation of seeding-machine application is described. The seed detection technique of the proposed system was based on the rising voltage value that a microphone sensed at each impact of a seed on a steel plate. Seed spacing was determined online with a script written in MATLAB. To evaluate the acoustic system against desired seed spacings, a testing rig was designed. Seeds of wheat, corn and pelleted tomato were used as experimental material. Typical seed patterns were positioned manually on a belt stand with different spacing patterns. When the belt was running, the seeds falling from the end of the belt struck the steel plate, and their acoustic signal was sensed by the microphone. For each impact, the data were processed and the spacing between seeds was obtained automatically. The coefficient of determination between the spacings set on the belt system and the corresponding spacings measured with the acoustic system was about 0.98 in all runs. This strong correlation indicates that the acoustic system worked well in determining seed spacing. (Author)

  19. Estimating gypsum requirement under no-till based on machine learning technique

    Directory of Open Access Journals (Sweden)

    Alaine Margarete Guimarães

    Full Text Available Chemical stratification, including of pH, occurs under no-till systems, with higher levels forming from the soil surface towards the deeper layers. Subsoil acidity is a yield-limiting factor. Gypsum has been suggested when subsoil acidity limits crop root growth, i.e., when the calcium (Ca) level is low and/or the aluminum (Al) level is toxic in the subsoil layers. However, there are doubts about the most efficient methods to estimate the gypsum requirement. This study was carried out to develop numerical models to estimate the gypsum requirement in soils under no-till systems using machine learning techniques. Computational analyses of the dataset were performed applying the M5'Rules algorithm, which is based on regression models. The dataset comprised soil chemical properties collected from experiments under no-till that received gypsum rates on the soil surface, sampled throughout eight years after the application, in Southern Brazil. The results showed that the numerical models generated by the rule-induction M5'Rules algorithm were useful for estimating gypsum requirements under no-till. The models showed that Ca saturation in the effective cation exchange capacity (ECEC) was a more important attribute than Al saturation for estimating gypsum requirement in no-till soils.
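
    M5'Rules is a Weka rule-induction algorithm; as a loose, hypothetical stand-in for the kind of model described above, a shallow regression tree relating soil chemistry to gypsum requirement can be sketched in Python. The column names, the synthetic relationship and the units are assumptions, not the study's data.

```python
# Hypothetical stand-in for the M5'Rules experiment: a regression tree relating
# soil chemistry (column names are assumptions) to gypsum requirement.
import numpy as np
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(1)
n = 200
soil = pd.DataFrame({
    "ca_saturation_ecec": rng.uniform(20, 80, n),   # % of ECEC occupied by Ca
    "al_saturation": rng.uniform(0, 30, n),         # % Al saturation
    "ph_cacl2": rng.uniform(4.0, 6.0, n),
})
# Toy target: gypsum need falls as Ca saturation rises (mirrors the paper's finding)
gypsum_mg_ha = (8 - 0.08 * soil["ca_saturation_ecec"]
                + 0.05 * soil["al_saturation"]
                + rng.normal(0, 0.3, n))

model = DecisionTreeRegressor(max_depth=3, random_state=0)   # rule-like, interpretable splits
scores = cross_val_score(model, soil, gypsum_mg_ha, cv=5, scoring="r2")
print(f"Cross-validated R2: {scores.mean():.2f}")

model.fit(soil, gypsum_mg_ha)
print(dict(zip(soil.columns, np.round(model.feature_importances_, 2))))
```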

  20. Evaluating machine-learning techniques for recruitment forecasting of seven North East Atlantic fish species

    KAUST Repository

    Fernandes, José Antonio

    2015-01-01

    The effect of different factors (spawning biomass, environmental conditions) on recruitment is a subject of great importance in the management of fisheries, recovery plans and scenario exploration. In this study, recently proposed supervised classification techniques, tested by the machine-learning community, are applied to forecast the recruitment of seven fish species of the North East Atlantic (anchovy, sardine, mackerel, horse mackerel, hake, blue whiting and albacore), using spawning, environmental and climatic data. In addition, the use of the probabilistic flexible naive Bayes classifier (FNBC) is proposed as a modelling approach in order to reduce uncertainty for fisheries management purposes. These improvements aim to provide better probability estimates of each possible outcome (low, medium and high recruitment) based on kernel density estimation, which is crucial for informed management decision making under high uncertainty. Finally, a comparison between goodness-of-fit and generalization power is provided, in order to assess the reliability of the final forecasting models. It is found that in most cases the proposed methodology provides useful information for management, whereas the case of horse mackerel is an example of the limitations of the approach. The proposed improvements allow for a better probabilistic estimation of the different scenarios, i.e. they reduce the uncertainty in the provided forecasts.

  1. Advancing Research in Second Language Writing through Computational Tools and Machine Learning Techniques: A Research Agenda

    Science.gov (United States)

    Crossley, Scott A.

    2013-01-01

    This paper provides an agenda for replication studies focusing on second language (L2) writing and the use of natural language processing (NLP) tools and machine learning algorithms. Specifically, it introduces a range of the available NLP tools and machine learning algorithms and demonstrates how these could be used to replicate seminal studies…

  2. Fuzzy support vector machine: an efficient rule-based classification technique for microarrays.

    Science.gov (United States)

    Hajiloo, Mohsen; Rabiee, Hamid R; Anooshahpour, Mahdi

    2013-01-01

    The abundance of gene expression microarray data has led to the development of machine learning algorithms applicable for tackling disease diagnosis, disease prognosis, and treatment selection problems. However, these algorithms often produce classifiers with weaknesses in terms of accuracy, robustness, and interpretability. This paper introduces the fuzzy support vector machine, a learning algorithm based on the combination of fuzzy classifiers and kernel machines for microarray classification. Experimental results on public leukemia, prostate, and colon cancer datasets show that the fuzzy support vector machine, applied in combination with filter or wrapper feature selection methods, develops a robust model with higher accuracy than conventional microarray classification models such as the support vector machine, artificial neural network, decision trees, k nearest neighbors, and diagonal linear discriminant analysis. Furthermore, the interpretable rule base inferred from the fuzzy support vector machine helps extract biological knowledge from microarray data. The fuzzy support vector machine, as a new classification model with high generalization power, robustness, and good interpretability, seems to be a promising tool for gene expression microarray classification.
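
    A simplified sketch of the filter-plus-kernel-machine comparison described above might look like the following; it uses a plain linear SVM (the paper's fuzzy SVM adds a fuzzy rule base on top) and synthetic microarray-shaped data, so every number here is an assumption.

```python
# Sketch of the filter-plus-SVM setup on microarray-like data (synthetic here):
# select the top-k genes by ANOVA F-score, then train an SVM inside cross-validation.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# ~60 samples x 2000 "genes", a typical microarray shape
X, y = make_classification(n_samples=60, n_features=2000, n_informative=20,
                           random_state=0)

pipeline = make_pipeline(
    StandardScaler(),
    SelectKBest(f_classif, k=50),   # filter feature selection
    SVC(kernel="linear", C=1.0),    # plain SVM; the paper's fuzzy SVM adds a fuzzy rule base
)
acc = cross_val_score(pipeline, X, y, cv=5)
print(f"Accuracy: {acc.mean():.2f} +/- {acc.std():.2f}")
```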

  3. Feature-Free Activity Classification of Inertial Sensor Data With Machine Vision Techniques: Method, Development, and Evaluation.

    Science.gov (United States)

    Dominguez Veiga, Jose Juan; O'Reilly, Martin; Whelan, Darragh; Caulfield, Brian; Ward, Tomas E

    2017-08-04

    Inertial sensors are one of the most commonly used sources of data for human activity recognition (HAR) and exercise detection (ED) tasks. The time series produced by these sensors are generally analyzed through numerical methods. Machine learning techniques such as random forests or support vector machines are popular in this field for classification efforts, but they need to be supported through the isolation of a potentially large number of additionally crafted features derived from the raw data. This feature preprocessing step can involve nontrivial digital signal processing (DSP) techniques. However, in many cases, the researchers interested in this type of activity recognition problem do not possess the necessary technical background for this feature-set development. The study aimed to present a novel application of established machine vision methods to provide interested researchers with an easier entry path into the HAR and ED fields. This can be achieved by removing the need for deep DSP skills through the use of transfer learning, that is, by using a pretrained convolutional neural network (CNN) developed for machine vision purposes for the exercise classification effort. The new method should simply require researchers to generate plots of the signals that they would like to build classifiers with, store them as images, and then place them in folders according to their training label before retraining the network. We applied a CNN, an established machine vision technique, to the task of ED. Tensorflow, a high-level framework for machine learning, was used to facilitate infrastructure needs. Simple time series plots generated directly from accelerometer and gyroscope signals are used to retrain an openly available neural network (Inception), originally developed for machine vision tasks. Data from 82 healthy volunteers, performing 5 different exercises while wearing a lumbar-worn inertial measurement unit (IMU), was collected. The ability of the
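
    A condensed, untested sketch of the plot-and-retrain workflow described above is given below: each labelled IMU window is saved as an image in a folder named after its exercise, and only the head of a pretrained CNN is retrained. The folder layout, window shape and hyperparameters are assumptions, and MobileNetV2 is used here for brevity where the paper retrained Inception.

```python
# Condensed sketch of the plot-and-retrain idea: save each IMU window as an image in
# a folder named after its exercise label, then retrain only the head of a pretrained
# CNN. Folder layout and hyperparameters are assumptions; the paper used Inception.
import matplotlib
matplotlib.use("Agg")
import matplotlib.pyplot as plt
import numpy as np
import tensorflow as tf

def save_window_as_plot(window, path):
    """window: (n_samples, 6) accelerometer + gyroscope channels."""
    fig, ax = plt.subplots(figsize=(2.24, 2.24), dpi=100)
    ax.plot(window)
    ax.axis("off")
    fig.savefig(path, bbox_inches="tight", pad_inches=0)
    plt.close(fig)

# Example: save_window_as_plot(np.random.randn(250, 6), "plots/squat/rep_001.png")

# Assumes a "plots" directory with one sub-folder of images per exercise label
train_ds = tf.keras.utils.image_dataset_from_directory(
    "plots", image_size=(224, 224), batch_size=32)

base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False                                # keep the machine-vision features frozen

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),   # MobileNetV2 expects inputs in [-1, 1]
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(5, activation="softmax"),       # 5 exercises in the study
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=5)
```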

  4. Gender, ethnic, age, and relationship differences in non-traditional college student alcohol consumption: a tri-ethnic study.

    Science.gov (United States)

    Babb, Stephanie; Stewart, Cynthia; Bachman, Christine

    2012-01-01

    Group differences in four aspects of alcohol consumption behaviors were examined in non-traditional college students (N = 1092; 828 women and 264 men) attending a large, non-residential, urban university. Findings demonstrated several differences between traditional and non-traditional students' drinking behaviors. Specifically, non-traditional students are more likely to abstain; Caucasians are more apt to drink in isolation and experience negative social consequences of drinking; Hispanic and African American women control their alcohol consumption better; and African American men are more likely to experience antisocial consequences due to drinking. These findings have implications for education and prevention efforts targeting non-traditional college students.

  5. Rethinking energy security in Asia. A non-traditional view of human security

    Energy Technology Data Exchange (ETDEWEB)

    Caballero-Anthony, Mely [Nanyang Technological Univ., Singapore (SG). Centre for Non-Traditional Security (NTS) Studies; Chang, Youngho [Nanyang Technological Univ., Singapore (Singapore). Division of Economics; Putra, Nur Azha (eds.) [National Univ. of Singapore (Singapore). Energy Security Division

    2012-07-01

    Traditional notions of security are premised on the primacy of state security. In relation to energy security, traditional policy thinking has focused on ensuring supply without much emphasis on socioeconomic and environmental impacts. Non-traditional security (NTS) scholars argue that threats to human security have become increasingly prominent since the end of the Cold War, and that it is thus critical to adopt a holistic and multidisciplinary approach in addressing rising energy needs. This volume represents the perspectives of scholars from across Asia, looking at diverse aspects of energy security through a non-traditional security lens. The issues covered include environmental and socioeconomic impacts, the role of the market, the role of civil society, energy sustainability and policy trends in the ASEAN region.

  6. Application of Machine Learning Techniques for Amplitude and Phase Noise Characterization

    DEFF Research Database (Denmark)

    Zibar, Darko; de Carvalho, Luis Henrique Hecker; Piels, Molly

    2015-01-01

    In this paper, tools from machine learning community, such as Bayesian filtering and expectation maximization parameter estimation, are presented and employed for laser amplitude and phase noise characterization. We show that phase noise estimation based on Bayesian filtering outperforms...

  7. Applying a Machine Learning Technique to Classification of Japanese Pressure Patterns

    OpenAIRE

    Kimura, H; Kawashima, H; Kusaka, H; Kitagawa, H

    2009-01-01

    In climate research, pressure patterns are often very important. When climatologists need to know the days of a specific pressure pattern, for example "low pressure in Western areas of Japan and high pressure in Eastern areas of Japan (Japanese winter-type weather)," they have to visually check a huge number of surface weather charts. To overcome this problem, we propose an automatic classification system using a support vector machine (SVM), which is a machine-learning method. We attempted...

  8. TECHNOLOGICAL ASPECTS OF PRODUCTION OF THE CANDIED FRUITS FROM NON-TRADITIONAL RAW MATERIAL

    OpenAIRE

    I. R. Belenkaya; Ya. A. Golinskaya

    2016-01-01

    The article analyses the candied fruit market in Ukraine and describes the main technological operations pertaining to processing of non-traditional candied products – celery and parsnip roots. Darkening of the roots surface caused by the enzyme oxidation is one of the problems arising when processing white roots, which leads to worse marketable condition of the product. To prevent darkening, the developed technology provides for soaking raw material in 1% citric acid solution immediately after p...

  9. Cardiac sound murmurs classification with autoregressive spectral analysis and multi-support vector machine technique.

    Science.gov (United States)

    Choi, Samjin; Jiang, Zhongwei

    2010-01-01

    In this paper, a novel cardiac sound spectral analysis method using the normalized autoregressive power spectral density (NAR-PSD) curve with the support vector machine (SVM) technique is proposed for classifying cardiac sound murmurs. A total of 489 cardiac sound signals, with 196 normal and 293 abnormal sound cases acquired from six healthy volunteers and 34 patients, were tested. Normal sound signals were recorded by our self-produced wireless electric stethoscope system; the subjects selected had no history of other heart complications. Abnormal sound signals were grouped into six heart valvular disorders: atrial fibrillation, aortic insufficiency, aortic stenosis, mitral regurgitation, mitral stenosis and split sounds. These abnormal subjects also had no other coexisting heart valvular disorder. Considering the morphological characteristics of the power spectral density of the heart sounds in the frequency domain, we propose two important diagnostic features, Fmax and Fwidth, which describe the maximum peak of the NAR-PSD curve and the frequency width between the crossing points of the NAR-PSD curve at a selected threshold value (THV), respectively. Furthermore, a two-dimensional representation on (Fmax, Fwidth) is introduced. The proposed cardiac sound spectral envelope curve method is validated by some case studies. Then, the SVM technique is employed as a classification tool to identify the cardiac sounds by the extracted diagnostic features. To detect abnormality of heart sounds and to discriminate the heart murmurs, multi-SVM classifiers composed of six SVM modules are considered and designed. A data set was used to validate the classification performance of each multi-SVM module. As a result, the accuracies of the six SVM modules used for detection of abnormality and classification of the six heart disorders were 71-98.9% for THVs of 10-90% and 81.2-99.6% for THVs of 10-50%, respectively. With the proposed cardiac sound

  10. Applying machine-learning techniques to Twitter data for automatic hazard-event classification.

    Science.gov (United States)

    Filgueira, R.; Bee, E. J.; Diaz-Doce, D.; Poole, J., Sr.; Singh, A.

    2017-12-01

    The constant flow of information offered by tweets provides valuable information about all sorts of events at a high temporal and spatial resolution. Over the past year we have been analyzing geological hazards/phenomena in real time, such as earthquakes, volcanic eruptions, landslides, floods or the aurora, as part of the GeoSocial project, by geo-locating tweets filtered by keywords in a web map. However, not all the filtered tweets are related to hazard/phenomenon events. This work explores two classification techniques for automatic hazard-event categorization based on tweets about the "Aurora". First, tweets were filtered using aurora-related keywords, removing stop words and selecting the ones written in English. For classifying the remaining tweets into "aurora-event" or "no-aurora-event" categories, we compared two state-of-the-art techniques: Support Vector Machine (SVM) and Deep Convolutional Neural Network (CNN) algorithms. Both approaches belong to the family of supervised learning algorithms, which make predictions based on a labelled training dataset. Therefore, we created a training dataset by tagging 1200 tweets into the two categories. We compared the performance of four different linear classifiers (Linear Regression, Logistic Regression, Multinomial Naïve Bayes and Stochastic Gradient Descent) provided by the Scikit-Learn library, using our training dataset to build the classifier. The results showed that Logistic Regression (LR) achieved the best accuracy (87%). So, we selected the LR classifier to categorise a large collection of tweets using the "dispel4py" framework. Later, we developed a CNN classifier, where the first layer embeds words into low-dimensional vectors. The next layer performs convolutions over the embedded word vectors. Results from the convolutional layer are max-pooled into a long feature vector, which is classified using a softmax layer. The CNN's accuracy
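
    A hedged sketch of the first branch of the comparison (TF-IDF features with the logistic regression classifier that the abstract reports as most accurate) is shown below; the example tweets and labels are placeholders for the 1200-tweet training set, not data from the project.

```python
# Sketch of the linear-classifier branch of the comparison: TF-IDF features with a
# logistic regression classifier. The tweets and labels below are placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

tweets = [
    "amazing aurora over Edinburgh tonight, sky is glowing green",
    "aurora borealis visible right now, get outside!",
    "just named my cat Aurora, she is adorable",
    "listening to the band Aurora on repeat",
] * 50
labels = [1, 1, 0, 0] * 50     # 1 = aurora-event, 0 = no-aurora-event

classifier = make_pipeline(
    TfidfVectorizer(stop_words="english", ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
accuracy = cross_val_score(classifier, tweets, labels, cv=5)
print(f"Cross-validated accuracy: {accuracy.mean():.2f}")
```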

  11. Non-traditional risk factors for atherosclerotic disease: A review for emergency physicians.

    Science.gov (United States)

    Long, Adrianna; Long, Brit; Koyfman, Alex

    2018-03-01

    Acute coronary syndrome (ACS) is a life-threatening disease frequently managed in the Emergency Department (ED). Risk factors such as age, hypertension, diabetes mellitus, obesity, and smoking are classically associated with atherosclerosis and ACS. This review evaluates non-traditional risk factors for atherosclerotic disease and seeks to inform physicians of their potential danger, particularly in vulnerable patient populations. Traditional risk factors are commonly utilized in the evaluation of patients with concern for ACS and acute myocardial infarction (AMI), though these may not be as useful for individual patient assessment. Heart disease accounts for a significant number of deaths in the U.S. Awareness of disease presentation and risk factors is important; however, several non-traditional risk factors are associated with atherosclerosis. Vasculitides, as well as immunologic medications used to treat these patients, increase atherosclerosis. Specific types of cancer and some therapies used to treat cancer are associated with atherosclerosis development and cardiovascular disease (CVD). Heavy alcohol use increases atherosclerosis and risk of AMI. Pregnancy also increases risk of AMI. Patients with HIV develop atherosclerosis at higher rates, and antiretroviral therapy predisposes patients to early development of coronary disease. Infections such as pneumonia and sepsis, associated with elevated inflammation, increase rate of ACS events during illness and throughout the one-year period after diagnosis of infection. Several non-traditional factors are associated with increased risk of atherosclerosis and ACS. Knowledge of these risk factors is important in the ED to minimize the potential of missing ACS. Published by Elsevier Inc.

  12. In vitro biological characterization of macroporous 3D Bonelike structures prepared through a 3D machining technique

    Energy Technology Data Exchange (ETDEWEB)

    Laranjeira, M.S.; Dias, A.G. [INEB - Instituto de Engenharia Biomedica, Divisao de Biomateriais, Universidade do Porto, Rua do Campo Alegre, 823, 4150-180 Porto (Portugal); Santos, J.D. [INEB - Instituto de Engenharia Biomedica, Divisao de Biomateriais, Universidade do Porto, Rua do Campo Alegre, 823, 4150-180 Porto (Portugal); Universidade do Porto, Faculdade de Engenharia, Departamento de Engenharia Metalurgica e Materiais, Rua Dr. Roberto Frias, 4200-465 Porto - Portugal (Portugal); Fernandes, M.H., E-mail: mhrf@portugalmail.pt [Universidade do Porto, Faculdade de Medicina Dentaria, Laboratorio de Farmacologia e Biocompatibilidade Celular, Rua Dr. Manuel Pereira da Silva, 4200-392 Porto (Portugal)

    2009-04-30

    3D bioactive macroporous structures were prepared using a 3D machining technique. A virtual 3D structure model was created and a computer numerically controlled (CNC) milling device machined Bonelike samples. The resulting structures showed a reproducible macroporosity and an interconnective structure. Macropore size after sintering was approximately 2000 µm. In vitro testing using human bone marrow stroma showed that cells were able to adhere and proliferate on the 3D structures' surface and migrate into all macropore channels. In addition, these cells were able to differentiate, since mineralized globular structures associated with the cell layer were identified. The results obtained showed that 3D structures of Bonelike successfully allow cell migration into all macropores, and allow human bone marrow stromal cells to proliferate and differentiate. This innovative technique may be considered a step forward in the preparation of 3D interconnective macroporous structures that allow bone ingrowth while maintaining mechanical integrity.

  13. Laser machining of advanced materials

    CERN Document Server

    Dahotre, Narendra B

    2011-01-01

    Advanced materials: Introduction; Applications; Structural ceramics; Biomaterials; Composites; Intermetallics. Machining of advanced materials: Introduction; Fabrication techniques; Mechanical machining; Chemical machining (CM); Electrical machining; Radiation machining; Hybrid machining. Laser machining: Introduction; Absorption of laser energy and multiple reflections; Thermal effects. Laser machining of structural ceramics: Introdu

  14. Moving beyond regression techniques in cardiovascular risk prediction: applying machine learning to address analytic challenges.

    Science.gov (United States)

    Goldstein, Benjamin A; Navar, Ann Marie; Carter, Rickey E

    2017-06-14

    Risk prediction plays an important role in clinical cardiology research. Traditionally, most risk models have been based on regression models. While useful and robust, these statistical methods are limited to using a small number of predictors which operate in the same way on everyone, and uniformly throughout their range. The purpose of this review is to illustrate the use of machine-learning methods for the development of risk prediction models. Typically presented as black-box approaches, most machine-learning methods are aimed at solving particular challenges that arise in data analysis that are not well addressed by typical regression approaches. To illustrate these challenges, as well as how different methods can address them, we consider trying to predict mortality after diagnosis of acute myocardial infarction. We use data derived from our institution's electronic health record and abstract data on 13 regularly measured laboratory markers. We walk through different challenges that arise in modelling these data and then introduce different machine-learning approaches. Finally, we discuss general issues in the application of machine-learning methods including tuning parameters, loss functions, variable importance, and missing data. Overall, this review serves as an introduction for those working on risk modelling to approach the diffuse field of machine learning. © The Author 2016. Published by Oxford University Press on behalf of the European Society of Cardiology.

  15. Development and Performance Evaluation of Manually and Motorized Operated Melon Shelling Machine using Impact Technique

    Directory of Open Access Journals (Sweden)

    H. D. Olusegun

    2009-01-01

    Full Text Available Melon shelling in most parts of the world is usually done manually by hand, and like all other manual operations it is time consuming and strenuous. The design and construction of a manually and motor-operated melon shelling machine using the impact method was carried out in order to meet the domestic, commercial and industrial requirements of melon for food processing. Two of the main cultivars of melon found in the Western part of Nigeria, Bara and Serewe, can be shelled properly by this machine. The machine is made up of three sections, namely the hopper, the shelling chamber (which consists of the shelling disc and the shaft), and the gear system. The machine was made from locally sourced materials and it can be used in both urban and rural areas, even where there is no power supply. The percentage of melon shelled in either manual or motorized operation, in two successive runs for the two types of melon (Bara and Serewe), was found to be above eighty percent (80%), and the shelling efficiency of the machine is above 68%.

  16. The Impact of Intrinsic and Extrinsic Motivation on the Academic Achievement of Non-Traditional Undergraduate Students

    Science.gov (United States)

    Arce, Alma Lorenia

    2017-01-01

    Non-traditional students have become a growing component of the student population in today's college systems. Research has shown that non-traditional students are less likely to achieve academically and complete their degree programs compared to traditional students. The purpose of this quantitative, correlational study was to investigate the…

  17. Support vector machines to model presence/absence of Alburnus alburnus alborella (Teleostea, Cyprinidae) in North-Western Italy: comparison with other machine learning techniques.

    Science.gov (United States)

    Tirelli, Tina; Gamba, Marco; Pessani, Daniela

    2012-01-01

    Alburnus alburnus alborella is a fish species native to northern Italy. It has suffered a very sharp decrease in population over the last 20 years due to human impact. Therefore, it was selected for reintroduction projects. In this research project, support vector machines (SVM) were tested as possible tools for building reliable models of presence/absence of the species. A system of 198 sites located along the rivers of Piedmont in North-Western Italy was investigated. At each site, 19 physical-chemical and environmental variables were measured. We verified that performance did not improve after feature selection but, instead, slightly decreased (from Correctly Classified Instances [CCI]=84.34 and Cohen's k [k]=0.69 to CCI=82.81 and k=0.66). However, feature selection is crucial in identifying the relevant features for the presence/absence of the species. We then compared SVM performance with decision trees (DTs) and artificial neural networks (ANNs) built using the same dataset. SVMs outperformed DTs (CCI=81.39 and k=0.63) but not ANNs (CCI=83.03 and k=0.66), showing that SVMs and ANNs are the best performing models, proving that their application in freshwater management is more promising than traditional and other machine-learning techniques. Copyright © 2012 Académie des sciences. Published by Elsevier SAS. All rights reserved.

  18. PSO-based support vector machine with cuckoo search technique for clinical disease diagnoses.

    Science.gov (United States)

    Liu, Xiaoyong; Fu, Hui

    2014-01-01

    Disease diagnosis is conducted with a machine learning method. We have proposed a novel machine learning method that hybridizes the support vector machine (SVM), particle swarm optimization (PSO), and cuckoo search (CS). The new method consists of two stages: firstly, a CS-based approach for parameter optimization of the SVM is developed to find good initial parameters of the kernel function, and then PSO is applied to continue the SVM training and find the best parameters of the SVM. Experimental results indicate that the proposed CS-PSO-SVM model achieves better classification accuracy and F-measure than PSO-SVM and GA-SVM. Therefore, we can conclude that our proposed method is very efficient compared to the previously reported algorithms.
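
    The PSO stage of the idea can be sketched as a small particle swarm searching over the SVM kernel parameters (log10 C, log10 gamma) with cross-validated accuracy as the fitness; the cuckoo-search warm start is omitted and the dataset, swarm size and coefficients below are assumptions.

```python
# Minimal particle swarm search over SVM hyperparameters (log10 C, log10 gamma),
# sketching the PSO stage of the CS-PSO-SVM idea; the cuckoo-search warm start
# is omitted and all constants below are assumptions.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

def fitness(pos):
    C, gamma = 10.0 ** pos[0], 10.0 ** pos[1]
    return cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=3).mean()

rng = np.random.default_rng(0)
n_particles, n_iter = 10, 15
bounds = np.array([[-2.0, 3.0], [-5.0, 1.0]])           # search ranges for log10 C, log10 gamma
pos = rng.uniform(bounds[:, 0], bounds[:, 1], (n_particles, 2))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_val.argmax()].copy()

w, c1, c2 = 0.7, 1.5, 1.5                               # inertia and acceleration coefficients
for _ in range(n_iter):
    r1, r2 = rng.random((n_particles, 2)), rng.random((n_particles, 2))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, bounds[:, 0], bounds[:, 1])
    vals = np.array([fitness(p) for p in pos])
    improved = vals > pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmax()].copy()

print(f"Best C={10**gbest[0]:.3g}, gamma={10**gbest[1]:.3g}, "
      f"CV accuracy={pbest_val.max():.3f}")
```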

  19. ADAPTING HYBRID MACHINE TRANSLATION TECHNIQUES FOR CROSS-LANGUAGE TEXT RETRIEVAL SYSTEM

    Directory of Open Access Journals (Sweden)

    P. ISWARYA

    2017-03-01

    Full Text Available This research work aims at developing a Tamil-to-English cross-language text retrieval system using a hybrid machine translation approach. The hybrid machine translation system is a combination of rule-based and statistical approaches. In existing word-by-word translation systems there are many issues, some of which are ambiguity, out-of-vocabulary words, word inflections, and improper sentence structure. To handle these issues, the proposed architecture is designed in such a way that it contains an improved part-of-speech tagger, a machine learning based morphological analyser, a collocation based word sense disambiguation procedure, a semantic dictionary, tense markers with gerund ending rules, and a two-pass transliteration algorithm. From the experimental results it is clear that the proposed Tamil query based translation system achieves significantly better translation quality than the existing system, and reaches 95.88% of monolingual performance.

  20. Accuracy comparison among different machine learning techniques for detecting malicious codes

    Science.gov (United States)

    Narang, Komal

    2016-03-01

    In this paper, a machine learning based model for malware detection is proposed. It can detect newly released malware, i.e. zero-day attacks, by analyzing operation codes on the Android operating system. The accuracy of Naïve Bayes, Support Vector Machine (SVM) and Neural Network classifiers for detecting malicious code has been compared for the proposed model. In the experiment, 400 benign files, 100 system files and 500 malicious files were used to construct the model. The model yields its best accuracy, 88.9%, when the neural network is used as the classifier, and achieved 95% sensitivity and 82.8% specificity.
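
    As a loose illustration of the comparison (not the paper's Android opcode dataset), opcode sequences can be vectorized as n-gram counts and the three classifiers scored under the same cross-validation; the opcode strings and labels below are placeholders.

```python
# Sketch of the comparison described above: opcode sequences are turned into
# n-gram count vectors and three classifiers are scored on the same folds.
# The opcode strings and labels are placeholders, not real malware data.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import MultinomialNB
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

opcode_sequences = [
    "mov push call mov ret",
    "push push call jmp ret",
    "xor mov call call jmp",
    "mov mov add ret",
] * 50
labels = [1, 1, 0, 0] * 50      # 1 = malicious, 0 = benign

models = {
    "Naive Bayes": MultinomialNB(),
    "SVM": LinearSVC(),
    "Neural Network": MLPClassifier(hidden_layer_sizes=(32,), max_iter=500),
}
for name, model in models.items():
    pipeline = make_pipeline(CountVectorizer(ngram_range=(1, 2)), model)
    acc = cross_val_score(pipeline, opcode_sequences, labels, cv=5)
    print(f"{name}: accuracy = {acc.mean():.2f}")
```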

  1. Non-invasive estimate of blood glucose and blood pressure from a photoplethysmograph by means of machine learning techniques.

    Science.gov (United States)

    Monte-Moreno, Enric

    2011-10-01

    This work presents a system for a simultaneous non-invasive estimate of the blood glucose level (BGL) and the systolic (SBP) and diastolic (DBP) blood pressure, using a photoplethysmograph (PPG) and machine learning techniques. The method is independent of the person whose values are being measured and does not need calibration over time or subjects. The architecture of the system consists of a photoplethysmograph sensor, an activity detection module, a signal processing module that extracts features from the PPG waveform, and a machine learning algorithm that estimates the SBP, DBP and BGL values. The idea that underlies the system is that there is a functional relationship between the shape of the PPG waveform and the blood pressure and glucose levels. As described in this paper, we tested this method on 410 individuals without performing any personalized calibration. The results were computed after cross validation. The machine learning techniques tested were: ridge linear regression, a multilayer perceptron neural network, support vector machines and random forests. The best results were obtained with the random forest technique. The resulting coefficients of determination for reference vs. prediction were R² = 0.91 for SBP, 0.89 for DBP, and 0.90 for BGL. For the glucose estimation, the distribution of the points on a Clarke error grid placed 87.7% of points in zone A, 10.3% in zone B, and 1.9% in zone D. Blood pressure values complied with the grade B protocol of the British Hypertension Society. An effective system for estimating blood glucose and blood pressure from a photoplethysmograph is presented. The main advantage of the system is that for clinical use it complies with the grade B protocol of the British Hypertension Society for blood pressure, and only in 1.9% of cases did it fail to detect hypoglycemia or hyperglycemia. Copyright © 2011 Elsevier B.V. All rights reserved.
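
    The best-performing configuration reported above (random forest regression from PPG-derived features, scored by the coefficient of determination) can be sketched as follows; the feature names, the synthetic target and the sample size are assumptions standing in for the paper's waveform features and measured values.

```python
# Sketch of the best-performing configuration: random forest regression from PPG
# waveform features to SBP, with R^2 from cross-validation. Feature names are
# assumptions; the paper derives many more features from the PPG shape.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n = 400                                   # roughly the number of subjects in the study
features = pd.DataFrame({
    "pulse_rate_hz": rng.uniform(0.8, 2.0, n),
    "rise_time_s": rng.uniform(0.08, 0.25, n),
    "pulse_width_s": rng.uniform(0.2, 0.6, n),
    "reflection_index": rng.uniform(0.2, 0.9, n),
})
# Toy target standing in for measured systolic blood pressure
sbp = (90 + 30 * features["reflection_index"] + 20 * features["rise_time_s"]
       + rng.normal(0, 3, n))

model = RandomForestRegressor(n_estimators=300, random_state=0)
r2 = cross_val_score(model, features, sbp, cv=5, scoring="r2")
print(f"Cross-validated R^2 for SBP: {r2.mean():.2f}")
```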

  2. Applying machine learning and image feature extraction techniques to the problem of cerebral aneurysm rupture

    Directory of Open Access Journals (Sweden)

    Steren Chabert

    2017-01-01

    Full Text Available Cerebral aneurysm is a cerebrovascular disorder characterized by a bulging in a weak area in the wall of an artery that supplies blood to the brain. It is relevant to understand the mechanisms leading to the formation of aneurysms, their growth and, more importantly, their rupture. The purpose of this study is to investigate the impact on aneurysm rupture of the combination of different parameters, instead of focusing on only one factor at a time as is frequently found in the literature, using machine learning and feature extraction techniques. This discussion is relevant in the context of the complex decision that physicians have to make when choosing which therapy to apply, as each intervention bears its own risks and implies the use of a complex ensemble of resources (human resources, operating rooms, etc.) in hospitals that are always under very high workload. This project was raised within our current working team, composed of an interventional neuroradiologist, a radiologic technologist, informatics engineers and biomedical engineers, from the Valparaiso public hospital, Hospital Carlos van Buren, and from Universidad de Valparaíso – Facultad de Ingeniería and Facultad de Medicina. This team has been working together over the last few years, and is now participating in the implementation of an "interdisciplinary platform for innovation in health", as part of a bigger project led by Universidad de Valparaiso (PMI UVA1402). It is relevant to emphasize that this project is made feasible by the existence of this network between physicians and engineers, and by the existence of data already registered in an orderly manner, structured and recorded in digital format. The present proposal arises from the observation in the current literature that existing indicators, whether based on morphological description of the aneurysm, or on characterization of biomechanical or other factors, were shown not to provide sufficient information in order

  3. Engagement techniques and playing level impact the biomechanical demands on rugby forwards during machine-based scrummaging.

    Science.gov (United States)

    Preatoni, Ezio; Stokes, Keith A; England, Michael E; Trewartha, Grant

    2015-04-01

    This cross-sectional study investigated the factors that may influence the physical loading on rugby forwards performing a scrum by studying the biomechanics of machine-based scrummaging under different engagement techniques and playing levels. 34 forward packs from six playing levels performed repetitions of five different types of engagement techniques against an instrumented scrum machine under realistic training conditions. Applied forces and body movements were recorded in three orthogonal directions. The modification of the engagement technique altered the load acting on players. These changes were in a similar direction and of similar magnitude irrespective of the playing level. Reducing the dynamics of the initial engagement through a fold-in procedure decreased the peak compression force, the peak downward force and the engagement speed in excess of 30%. For example, peak compression (horizontal) forces in the professional teams changed from 16.5 (baseline technique) to 8.6 kN (fold-in procedure). The fold-in technique also reduced the occurrence of combined high forces and head-trunk misalignment during the absorption of the impact, which was used as a measure of potential hazard, by more than 30%. Reducing the initial impact did not decrease the ability of the teams to produce sustained compression forces. De-emphasising the initial impact against the scrum machine decreased the mechanical stresses acting on forward players and may benefit players' welfare by reducing the hazard factors that may induce chronic degeneration of the spine. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  4. Elevating Virtual Machine Introspection for Fine-Grained Process Monitoring: Techniques and Applications

    Science.gov (United States)

    Srinivasan, Deepa

    2013-01-01

    Recent rapid malware growth has exposed the limitations of traditional in-host malware-defense systems and motivated the development of secure virtualization-based solutions. By running vulnerable systems as virtual machines (VMs) and moving security software from inside VMs to the outside, the out-of-VM solutions securely isolate the anti-malware…

  5. Discrimination of plant root zone water status in greenhouse production based on phenotyping and machine learning techniques.

    Science.gov (United States)

    Guo, Doudou; Juan, Jiaxiang; Chang, Liying; Zhang, Jingjin; Huang, Danfeng

    2017-08-15

    Plant-based sensing of water stress can provide a sensitive and direct reference for precision irrigation systems in greenhouses. However, plant information acquisition, interpretation, and systematic application remain insufficient. This study developed a discrimination method for plant root zone water status in greenhouses by integrating phenotyping and machine learning techniques. Pakchoi plants were used and treated at three root zone moisture levels: 40%, 60%, and 80% relative water content. Three classification models, Random Forest (RF), Neural Network (NN), and Support Vector Machine (SVM), were developed and validated in different scenarios, with overall accuracy over 90% for all. The SVM model had the highest value, but it required the longest training time. All models had accuracy over 85% in all scenarios, and more stable performance was observed for the RF model. The simplified SVM model, developed from the top five most contributing traits, had the largest accuracy reduction, 29.5%, while the simplified RF and NN models still maintained approximately 80%. For real-world application, factors such as operation cost, precision requirement, and system reaction time should be considered together in model selection. Our work shows it is promising to discriminate plant root zone water status by implementing phenotyping and machine learning techniques for precision irrigation management.

  6. Machining process influence on the chip form and surface roughness by neuro-fuzzy technique

    Science.gov (United States)

    Anicic, Obrad; Jović, Srđan; Aksić, Danilo; Skulić, Aleksandar; Nedić, Bogdan

    2017-04-01

    The main aim of the study was to analyze the influence of six machining parameters on chip shape formation and surface roughness during turning of steel 30CrNiMo8. The three components of the cutting force were used as inputs together with cutting speed, feed rate, and depth of cut. It is crucial for engineers to use optimal machining parameters to get the best results and to keep the machining process under tight control; therefore, there is a need to find the machining parameters for an optimal machining procedure. An adaptive neuro-fuzzy inference system (ANFIS) was used to estimate the influence of the inputs on chip shape formation and surface roughness. According to the results, the cutting force in the direction of the depth of cut has the highest influence on the chip form; the testing error for this input was 0.2562. This cutting force determines the depth of cut. The depth of cut has the highest influence on the surface roughness; here, the testing error for the cutting force in the direction of the depth of cut was 5.2753. Generally, the depth of cut and the cutting force that produces it are the most dominant factors for chip form and surface roughness; any small change in the depth of cut, or in the cutting force that provides it, could drastically affect the chip form or the surface roughness of the workpiece material.

  7. NON-TRADITIONAL SPORTS AT SCHOOL. BENEFITS FOR PHYSICAL AND MOTOR DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    AMADOR J. LARA-SÁNCHEZ

    2010-12-01

    Full Text Available Physical Education teachers have been using some very classic team sports, like football, basketball, handball, volleyball, etc. for many years in order to develop their education work at school. As a consequence of this, the benefits of this kind of activities on Physical Education lessons have not been as notable as we might have expected, since, even if they are increasing, their development and application are still low. There are many and very varied new non-traditional sports that have emerged and extended across Spain in recent years. To mention an example, we could refer to a newly created non-traditional sport such as kin-ball. This sport was created for the purpose of achieving a way to combine several factors such as health, team-work and competitiveness. Three teams of four players each participate. This way, every player can participate to a great extent in all the moves of the match, for each of them must defend one area of their half in order to achieve a common objective. Besides, kin-ball helps to develop motor skills at school in an easy way; that is, coordination, balance and perception. There is a large variety of non-traditional games and sports that are similar to kin-ball, such as floorball, intercrosse, mazaball, tchoukball, ultimate, indiaca, shuttleball... All of them show many physical, psychic and social advantages, and can help us to make the Physical Education teaching-learning process more motivating, acquiring the recreational component that it showed some years ago and which has now disappeared.

  8. Book review: OF OTHER THOUGHTS: NON-TRADITIONAL WAYS TO THE DOCTORATE

    Directory of Open Access Journals (Sweden)

    Johan Verbeke

    2014-12-01

    Full Text Available Research paradigms in the fields of architecture and arts have been developing and changing during the last decade. Part of this development is a shift to include design work and artistic work into the knowledge processes of doctoral work. This work evidently also needs supervision. At the same time doctoral degrees have been developing in relation to indigenous ways of thinking. The book Other Thoughts: Non-Traditional Ways to the Doctorate discusses the challenges one is facing, either as a PhD student or as a supervisor, when doing or supervising a PhD in a less established field.

  9. Applying a Machine Learning Technique to Classification of Japanese Pressure Patterns

    Directory of Open Access Journals (Sweden)

    H Kimura

    2009-04-01

    Full Text Available In climate research, pressure patterns are often very important. When climatologists need to know the days of a specific pressure pattern, for example "low pressure in Western areas of Japan and high pressure in Eastern areas of Japan (Japanese winter-type weather)," they have to visually check a huge number of surface weather charts. To overcome this problem, we propose an automatic classification system using a support vector machine (SVM), which is a machine-learning method. We attempted to classify pressure patterns into two classes: "winter type" and "non-winter type". For both training datasets and test datasets, we used the JRA-25 dataset from 1981 to 2000. An experimental evaluation showed that our method obtained an F-measure greater than 0.8. We noted that variations in the results were based on differences in the training datasets.
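
    A hedged sketch of the classification step is given below: each day's sea-level pressure grid is flattened into a feature vector and an SVM separates winter-type from non-winter-type days, scored by F-measure as in the abstract. The gridded data here is synthetic, not JRA-25.

```python
# Sketch of the classification step: each day's pressure grid is flattened into a
# feature vector and an SVM separates "winter type" from "non-winter type" days,
# scored by F-measure. The gridded data here is synthetic, not reanalysis data.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(3)
n_days, grid = 600, (10, 12)                 # toy stand-in for daily pressure fields
pressure = rng.normal(1013, 8, (n_days, *grid))
labels = rng.integers(0, 2, n_days)          # 1 = winter type, 0 = non-winter type
# Give winter-type days a west-high / east-low gradient so the pattern is learnable
pressure[labels == 1, :, :6] += 6
pressure[labels == 1, :, 6:] -= 6

X = pressure.reshape(n_days, -1)             # flatten each daily grid to one vector
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
f1 = cross_val_score(clf, X, labels, cv=5, scoring="f1")
print(f"Cross-validated F-measure: {f1.mean():.2f}")
```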

  10. Use of Advanced Machine-Learning Techniques for Non-Invasive Monitoring of Hemorrhage

    Science.gov (United States)

    2010-04-01

    for humans is confounded by species differences (e.g., quadruped vs. biped), particularly regarding blood pressure regulation, and the presence of...based, robot navigation system for our machine-learning framework. This system builds linear and nonlinear density models in real-time and is able to... robot navigation system. This approach was used to analyze continuous, physiological waveform data from 28 healthy humans exposed to progressive

  11. A Quantitative Proteomics Approach to Clinical Research with Non-Traditional Samples

    Science.gov (United States)

    Licier, Rígel; Miranda, Eric; Serrano, Horacio

    2016-01-01

    The proper handling of samples to be analyzed by mass spectrometry (MS) can guarantee excellent results and a greater depth of analysis when working in quantitative proteomics. This is critical when trying to assess non-traditional sources such as ear wax, saliva, vitreous humor, aqueous humor, tears, nipple aspirate fluid, breast milk/colostrum, cervical-vaginal fluid, nasal secretions, bronco-alveolar lavage fluid, and stools. We intend to provide the investigator with relevant aspects of quantitative proteomics and to recognize the most recent clinical research work conducted with atypical samples and analyzed by quantitative proteomics. Having as reference the most recent and different approaches used with non-traditional sources allows us to compare new strategies in the development of novel experimental models. On the other hand, these references help us to contribute significantly to the understanding of the proportions of proteins in different proteomes of clinical interest and may lead to potential advances in the emerging field of precision medicine. PMID:28248241

  12. A Quantitative Proteomics Approach to Clinical Research with Non-Traditional Samples.

    Science.gov (United States)

    Licier, Rígel; Miranda, Eric; Serrano, Horacio

    2016-10-17

    The proper handling of samples to be analyzed by mass spectrometry (MS) can guarantee excellent results and a greater depth of analysis when working in quantitative proteomics. This is critical when trying to assess non-traditional sources such as ear wax, saliva, vitreous humor, aqueous humor, tears, nipple aspirate fluid, breast milk/colostrum, cervical-vaginal fluid, nasal secretions, bronco-alveolar lavage fluid, and stools. We intend to provide the investigator with relevant aspects of quantitative proteomics and to recognize the most recent clinical research work conducted with atypical samples and analyzed by quantitative proteomics. Having as reference the most recent and different approaches used with non-traditional sources allows us to compare new strategies in the development of novel experimental models. On the other hand, these references help us to contribute significantly to the understanding of the proportions of proteins in different proteomes of clinical interest and may lead to potential advances in the emerging field of precision medicine.

  13. Injury survey of a non-traditional 'soft-edged' trampoline designed to lower equipment hazards.

    Science.gov (United States)

    Eager, David B; Scarrott, Carl; Nixon, Jim; Alexander, Keith

    2013-01-01

    In Australia trampolines contribute one quarter of all childhood play equipment injuries. The objective of this study was to gather and evaluate injury data from a non-traditional, 'soft-edged', consumer trampoline, where the design aimed to minimise injuries from the equipment and from falling off. The manufacturer of the non-traditional trampoline provided the University of Technology Sydney with their Australian customer database. The study involved surveys in Queensland and New South Wales, between May 2007 and March 2010. Initially injury data was gathered by a phone interview pilot study, then in the full study, through an email survey. The 3817 respondents were the carers of child users of the 'soft-edge' trampolines. Responses were compared with Australian and US emergency department data. In both countries the proportion of injuries caused by the equipment and falling off was compared with the proportion caused by the jumpers to themselves or each other. The comparisons showed a significantly lower proportion resulted from falling-off or hitting the equipment for this design when compared to traditional trampolines, both in Australia and the US. This research concludes that equipment-induced and falling-off injuries, the more severe injuries on traditional trampolines, can be significantly reduced with appropriate trampoline design.

  14. Enhancing Critical Thinking Skills and Writing Skills through the Variation in Non-Traditional Writing Task

    Directory of Open Access Journals (Sweden)

    Parlindungan Sinaga

    2017-04-01

    Full Text Available The research aims to identify the impacts of embedding non-traditional writing tasks within a modern physics course taught to students of the Physics Education and Physics study programs. It employed a quasi-experimental method with the pretest-posttest control group design. The instruments used were tests of conceptual mastery, tests of critical thinking skills, and a writing assessment rubric. The data were analyzed by determining the percentages of average normalized gains, Cohen's d, and correlational analysis. Based on the results of data analysis, it is found that the different treatments in the non-traditional writing tasks given to the students of the Physics Education and Physics programs have the following impacts: 1. There was a significant difference in the increase of conceptual mastery and critical thinking skills; 2. There was a difference in the writing quality of the students of the Physics Education and Physics programs; 3. There was a correlation between writing quality and conceptual mastery with a high degree of relationship, and a correlation between writing quality and critical thinking skills with a low degree of relationship; 4. Increased conceptual understanding was influenced by the writing domain.

  15. A data-based technique for monitoring of wound rotor induction machines: A simulation study

    KAUST Repository

    Harrou, Fouzi

    2016-05-09

    Detecting faults in induction machines is crucial for the safe operation of these machines. The aim of this paper is to present a statistical fault detection methodology for the detection of faults in three-phase wound rotor induction machines (WRIM). The proposed fault detection approach is based on the use of principal component analysis (PCA). However, conventional PCA-based detection indices, such as the T² and Q statistics, are not well suited to detect small faults because these indices only use information from the most recent available samples. Detection of small faults is one of the most crucial and challenging tasks in the area of fault detection and diagnosis. In this paper, a new statistical system monitoring strategy is proposed for detecting changes resulting from small shifts in several variables associated with the WRIM. The proposed approach combines PCA modeling with the exponentially weighted moving average (EWMA) control scheme. In the proposed approach, the EWMA control scheme is applied to the ignored principal components to detect the presence of faults. The performance of the proposed method is compared with those of the traditional PCA-based fault detection indices. The simulation results clearly show the effectiveness of the proposed method over the conventional ones, especially in the presence of faults with small magnitudes.
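
    The monitoring idea can be sketched as follows: fit PCA on healthy-operation data, form a residual (Q-like) statistic from the ignored components, and run an EWMA chart on it, raising an alarm when the chart exceeds its control limit. The smoothing factor, limits and toy data below are assumptions, not the paper's WRIM model.

```python
# Sketch of PCA-plus-EWMA monitoring: fit PCA on healthy data, compute a residual
# statistic from the ignored (minor) components, and track it with an EWMA chart;
# an alarm is raised when the EWMA leaves its control limit. Constants are assumptions.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
n_train, n_vars = 500, 8
train = rng.normal(0, 1, (n_train, n_vars))          # healthy machine measurements (toy)

pca = PCA(n_components=3).fit(train)                  # retain 3 PCs, "ignore" the rest
def residual_stat(x):
    """Squared reconstruction error from the ignored components (Q-like statistic)."""
    recon = pca.inverse_transform(pca.transform(x))
    return np.sum((x - recon) ** 2, axis=1)

lam = 0.2                                             # EWMA smoothing factor
q_train = residual_stat(train)
mu, sigma = q_train.mean(), q_train.std()
ucl = mu + 3 * sigma * np.sqrt(lam / (2 - lam))       # asymptotic upper control limit

# New data with a small fault (slow drift in one variable) injected halfway through
test = rng.normal(0, 1, (200, n_vars))
test[100:, 0] += np.linspace(0, 1.5, 100)
ewma, z = [], mu
for q in residual_stat(test):
    z = lam * q + (1 - lam) * z
    ewma.append(z)
alarms = np.where(np.array(ewma) > ucl)[0]
print("First alarm at sample:", alarms[0] if alarms.size else "none")
```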

  16. The Identification of Hunger Behaviour of Lates Calcarifer through the Integration of Image Processing Technique and Support Vector Machine

    Science.gov (United States)

    Taha, Z.; Razman, M. A. M.; Adnan, F. A.; Ghani, A. S. Abdul; Majeed, A. P. P. Abdul; Musa, R. M.; Sallehudin, M. F.; Mukai, Y.

    2018-03-01

    Fish hunger behaviour is one of the important elements in determining the fish feeding routine, especially for farmed fishes. Inaccurate feeding routines (under-feeding or over-feeding) lead fish to die and thus reduce the total production of fish. The excess food which is not eaten by the fish dissolves in the water and thus reduces the water quality (the oxygen content of the water is reduced). The reduction of oxygen (water quality) leads fish to die and, in some cases, may lead to fish diseases. This study correlates Barramundi fish-school behaviour with hunger condition through the integration of an image processing technique with machine learning. The behaviour is clustered with respect to the position of the centre of gravity of the school of fish prior to feeding, during feeding and after feeding. The clustered fish behaviour is then classified by means of a machine learning technique, namely the support vector machine (SVM). It has been shown from the study that the fine Gaussian variant of the SVM is able to provide a reasonably accurate classification of fish feeding behaviour, with a classification accuracy of 79.7%. The proposed integration technique may increase the usefulness of the captured data and thus better differentiate the various behaviours of farmed fishes.

  17. Estimating Fractional Shrub Cover Using Simulated EnMAP Data: A Comparison of Three Machine Learning Regression Techniques

    Directory of Open Access Journals (Sweden)

    Marcel Schwieder

    2014-04-01

    Full Text Available Anthropogenic interventions in natural and semi-natural ecosystems often lead to substantial changes in their functioning and may ultimately threaten ecosystem service provision. It is, therefore, necessary to monitor these changes in order to understand their impacts and to support management decisions that help ensure sustainability. Remote sensing has proven to be a valuable tool for these purposes, and especially hyperspectral sensors are expected to provide valuable data for the quantitative characterization of land change processes. In this study, simulated EnMAP data were used for mapping shrub cover fractions along a gradient of shrub encroachment, in a study region in southern Portugal. We compared three machine learning regression techniques: Support Vector Regression (SVR), Random Forest Regression (RF), and Partial Least Squares Regression (PLSR). Additionally, we compared the influence of training sample size on the prediction performance. All techniques showed reasonably good results when trained with large samples, while SVR always outperformed the other algorithms. The best model was applied to produce a fractional shrub cover map for the whole study area. The predicted patterns revealed a gradient of shrub cover between regions affected by special agricultural management schemes for nature protection and areas without land use incentives. Our results highlight the value of EnMAP data in combination with machine learning regression techniques for monitoring gradual land change processes.
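
    The three-way comparison can be sketched with a common cross-validation protocol, as below; the simulated spectra, band count and hyperparameters are assumptions and only illustrate how SVR, RF and PLSR would be scored side by side.

```python
# Sketch of the three-way regression comparison on simulated spectra: each pixel's
# reflectance spectrum predicts fractional shrub cover under the same CV protocol.
# Data shapes and hyperparameters are assumptions (EnMAP has roughly 240 bands).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(5)
n_samples, n_bands = 300, 240
spectra = rng.uniform(0, 1, (n_samples, n_bands))
shrub_fraction = np.clip(spectra[:, 40:60].mean(axis=1)
                         + rng.normal(0, 0.05, n_samples), 0, 1)

models = {
    "SVR": make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.01)),
    "RF": RandomForestRegressor(n_estimators=300, random_state=0),
    "PLSR": PLSRegression(n_components=10),
}
for name, model in models.items():
    r2 = cross_val_score(model, spectra, shrub_fraction, cv=5, scoring="r2")
    print(f"{name}: R^2 = {r2.mean():.2f}")
```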

  18. From analog timers to the era of machine learning: The case of the transient hot-wire technique

    Science.gov (United States)

    Assael, Yannis M.; Antoniadis, Konstantinos D.; Assael, Marc J.

    2017-07-01

    In this work, we demonstrate how interdisciplinary knowledge can provide solutions to elusive challenges and advance science. As an example, we used the application of the transient hot-wire (THW) technique to the measurement of the thermal conductivity of solids. To obtain a solution of the equations by FEM, about 10 h were required. By employing tools from the fields of machine learning and computer science, namely (a) automating the manual pipeline using a custom framework, (b) using Bayesian optimisation efficiently to estimate the optimal thermal property values, and (c) applying further task-specific optimisations, this time was reduced to 3 min, which is acceptable, and thus the technique can be used more easily.
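
    An illustrative Bayesian-optimisation loop of the kind mentioned above is sketched below, using a Gaussian process surrogate with expected improvement to minimise a cheap stand-in for the expensive FEM-vs-experiment misfit; the objective, bounds and iteration budget are assumptions, not the authors' setup.

```python
# Illustrative Bayesian optimisation: a Gaussian process surrogate plus expected
# improvement minimises an expensive model-vs-measurement misfit. The 1-D objective
# below is a cheap stand-in for the FEM evaluation; all names are assumptions.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def misfit(conductivity):
    """Stand-in for the expensive FEM-vs-experiment residual."""
    return (conductivity - 1.7) ** 2 + 0.01 * np.sin(8 * conductivity)

bounds = (0.5, 3.0)                            # plausible conductivity range, W/(m K)
rng = np.random.default_rng(0)
X = rng.uniform(*bounds, 4).reshape(-1, 1)     # a few initial "FEM runs"
y = misfit(X).ravel()

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
grid = np.linspace(*bounds, 500).reshape(-1, 1)

for _ in range(15):                            # each iteration replaces one costly FEM solve
    gp.fit(X, y)
    mu, sigma = gp.predict(grid, return_std=True)
    best = y.min()
    imp = best - mu                            # improvement over the best misfit so far
    z = imp / (sigma + 1e-9)
    ei = imp * norm.cdf(z) + sigma * norm.pdf(z)   # expected improvement
    x_next = grid[np.argmax(ei)].reshape(1, -1)
    X = np.vstack([X, x_next])
    y = np.append(y, misfit(x_next).ravel())

print(f"Estimated conductivity: {X[np.argmin(y), 0]:.3f} W/(m K)")
```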

  19. Technique to reduce the shaft torque stress at an induction machine

    Directory of Open Access Journals (Sweden)

    Adrian Tulbure

    2005-10-01

    Full Text Available For the active attenuation of load stress in the drive shaft, the control system should receive the instantaneous shaft torque value as an input signal. In this context, an intelligent observer for the shaft torque of a mains-operated induction machine, able to respond to variations of the LIF (Load Input Function) [1], must be developed. Extensive computer simulations prove the effectiveness of the proposed solution. In order to obtain a practical validation, the simulated regulator has been designed and tested at the Institute of Electrical Engineering in Clausthal/Germany [2]. This paper contains the following parts: developing the mathematical model, practical realisation, simulations and measurements, evaluating the control solutions, and conclusions.

  20. Iterative optimization techniques using man-machine interaction for university timetabling problems.

    Science.gov (United States)

    Shimazaki, Syunsuke; Sakakibara, Kazutoshi; Matsumoto, Takuya

    2015-01-01

    We focus on a timetabling problem for university makeup classes and construct a scheduling system based on man-machine interaction, which makes it possible to reveal the essential and additional information of the problem domain. In this problem, makeup classes requested by the lecturers have to be scheduled to specified time slots under hard and soft constraints, e.g., the schedules of the lecturers and the students. A constraint-based scheduling model is newly introduced, and several parameters of the model are settled through repeated evaluation of solutions by the operators. Through numerical experiments with actual data, the potential of the proposed approach is examined.
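    A minimal sketch of this kind of constraint-based assignment is given below; it is a hypothetical simplification, not the authors' system. Makeup-class requests are assigned greedily to time slots that satisfy a hard constraint (no lecturer double-booking), and soft-constraint penalties are tallied so an operator could inspect and re-weight them interactively; the request data and penalty values are invented placeholders.

        # Hypothetical makeup-class requests: lecturer, hard-feasible slots,
        # and a soft penalty per slot (e.g. student schedule clashes).
        requests = [
            {"lecturer": "A", "feasible": {1, 2, 3}, "penalty": {1: 0, 2: 2, 3: 1}},
            {"lecturer": "B", "feasible": {2, 3},    "penalty": {2: 1, 3: 0}},
            {"lecturer": "A", "feasible": {1, 3},    "penalty": {1: 3, 3: 0}},
        ]

        def schedule(requests):
            """Greedy assignment: pick the feasible slot with the lowest soft penalty,
            never double-booking a lecturer in the same slot (hard constraint)."""
            busy = set()                      # (lecturer, slot) pairs already used
            plan, total_penalty = [], 0
            for i, req in enumerate(requests):
                options = [s for s in sorted(req["feasible"], key=lambda s: req["penalty"][s])
                           if (req["lecturer"], s) not in busy]
                if not options:
                    plan.append((i, None))    # unassigned: needs operator intervention
                    continue
                slot = options[0]
                busy.add((req["lecturer"], slot))
                plan.append((i, slot))
                total_penalty += req["penalty"][slot]
            return plan, total_penalty

        print(schedule(requests))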

  1. Optimization of fuel exchange machine operation for boiling water reactors using an artificial intelligence technique

    International Nuclear Information System (INIS)

    Sekimizu, K.; Araki, T.; Tatemichi, S.I.

    1987-01-01

    Optimization of fuel assembly exchange machine movements during a periodic refueling outage is discussed. The fuel assembly movements during a fuel shuffling were examined, and it was found that they consist of two different movement sequences: one is the "PATH," which begins at a discharged fuel assembly and terminates at a fresh fuel assembly, and the other is the "LOOP," where fuel assemblies circulate in the core. It is also shown that fuel-loading patterns during the fuel shuffling can be expressed by the state of each PATH, which is the number of elements already accomplished in the PATH actions. Based on this fact, a scheme to determine a fuel assembly movement sequence within the constraints was formulated using the artificial intelligence language PROLOG. An additional merit of the scheme is that it can simultaneously evaluate fuel assembly movements due to control rod and local power range monitor exchanges, in addition to normal fuel shuffling. Fuel assembly movements for fuel shuffling in a 540-MW(electric) boiling water reactor power plant were calculated with this scheme. It is also shown that the true optimization to minimize the fuel exchange machine movements would be costly to obtain due to the number of alternatives that would need to be evaluated. However, a method to obtain a quasi-optimum solution is suggested.
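    The PATH/LOOP decomposition described above can be made concrete with a small sketch. Below, a hypothetical shuffle plan maps each core position to the source of its next assembly ('FRESH' for a new assembly); positions whose current assembly is discharged start a PATH, and the remaining moves form LOOPs (cycles). This illustrates only the decomposition idea, not the PROLOG scheme of the paper, and the position names are invented.

        # into[p] = core position whose assembly moves into p, or 'FRESH' for a new assembly
        into = {"P1": "P2", "P2": "FRESH", "P3": "P4", "P4": "P3", "P5": "P6", "P6": "P5"}
        discharged = {"P1"}   # positions whose current assembly leaves the core (each starts a PATH)

        def decompose(into, discharged):
            paths, used = [], set()
            for start in discharged:             # a PATH: discharged slot ... filled by a fresh assembly
                chain, p = [start], start
                while into.get(p) != "FRESH":
                    p = into[p]
                    chain.append(p)
                    used.add(p)
                chain.append("FRESH")
                paths.append(chain)
            loops = []
            for p in into:                       # remaining moves circulate in LOOPs (cycles)
                if p in used or p in discharged or into[p] == "FRESH":
                    continue
                cycle, q = [], p
                while q not in used:
                    used.add(q)
                    cycle.append(q)
                    q = into[q]
                loops.append(cycle)
            return paths, loops

        print(decompose(into, discharged))   # ([['P1', 'P2', 'FRESH']], [['P3', 'P4'], ['P5', 'P6']])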

  2. Exploration of machine learning techniques in predicting multiple sclerosis disease course.

    Science.gov (United States)

    Zhao, Yijun; Healy, Brian C; Rotstein, Dalia; Guttmann, Charles R G; Bakshi, Rohit; Weiner, Howard L; Brodley, Carla E; Chitnis, Tanuja

    2017-01-01

    To explore the value of machine learning methods for predicting multiple sclerosis disease course. 1693 CLIMB study patients were classified as having increased EDSS ≥1.5 (worsening) or not (non-worsening) at up to five years after the baseline visit. Support vector machines (SVM) were used to build the classifier, and compared to logistic regression (LR) using demographic, clinical and MRI data obtained at years one and two to predict EDSS at five-year follow-up. Baseline data alone provided little predictive value. Clinical observation for one year improved overall SVM sensitivity to 62% and specificity to 65% in predicting worsening cases. The addition of one year MRI data improved sensitivity to 71% and specificity to 68%. Use of non-uniform misclassification costs in the SVM model, weighting towards increased sensitivity, improved predictions (up to 86%). Sensitivity, specificity, and overall accuracy improved minimally with additional follow-up data. Predictions improved within specific groups defined by baseline EDSS. LR performed more poorly than SVM in most cases. Race, family history of MS, and brain parenchymal fraction ranked highly as predictors of the non-worsening group. Brain T2 lesion volume ranked highly as predictive of the worsening group. SVM incorporating short-term clinical and brain MRI data, class imbalance corrective measures, and classification costs may be a promising means to predict MS disease course, and for selection of patients suitable for more aggressive treatment regimens.
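    The "non-uniform misclassification costs" mentioned above correspond to class weighting in a standard SVM. The snippet below is a generic sketch on synthetic data (not the CLIMB cohort), showing how up-weighting the minority "worsening" class raises sensitivity at some cost in specificity; the weight value of 4 is an assumption.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.model_selection import train_test_split
        from sklearn.svm import SVC
        from sklearn.metrics import confusion_matrix

        # Synthetic stand-in for clinical/MRI features; class 1 = "worsening" (minority)
        X, y = make_classification(n_samples=1000, n_features=20, weights=[0.8, 0.2], random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

        for weights in (None, {0: 1, 1: 4}):   # None = uniform costs; {1: 4} penalises missed worsening cases
            clf = SVC(kernel="rbf", class_weight=weights).fit(X_tr, y_tr)
            tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
            sens, spec = tp / (tp + fn), tn / (tn + fp)
            print(f"class_weight={weights}: sensitivity={sens:.2f}, specificity={spec:.2f}")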

  3. Automatic segmentation of airway tree based on local intensity filter and machine learning technique in 3D chest CT volume.

    Science.gov (United States)

    Meng, Qier; Kitasaka, Takayuki; Nimura, Yukitaka; Oda, Masahiro; Ueno, Junji; Mori, Kensaku

    2017-02-01

    Airway segmentation plays an important role in analyzing chest computed tomography (CT) volumes for computerized lung cancer detection, emphysema diagnosis and pre- and intra-operative bronchoscope navigation. However, obtaining a complete 3D airway tree structure from a CT volume is quite a challenging task. Several researchers have proposed automated airway segmentation algorithms based mainly on region growing and machine learning techniques. However, these methods fail to detect the peripheral bronchial branches, which results in a large amount of leakage. This paper presents a novel approach for more accurate extraction of the complex airway tree. The proposed segmentation method is composed of three steps. First, Hessian analysis is utilized to enhance the tube-like structures in CT volumes; then, an adaptive multiscale cavity enhancement filter is employed to detect the cavity-like structures with different radii. In the second step, a support vector machine is utilized to remove the false positive (FP) regions from the result obtained in the previous step. Finally, the graph-cut algorithm is used to refine the candidate voxels to form an integrated airway tree. A test dataset including 50 standard-dose chest CT volumes was used for evaluating our proposed method. The average extraction rate was about 79.1 % with a significantly decreased FP rate. A new method of airway segmentation based on local intensity structure and a machine learning technique was developed. The method was shown to be feasible for airway segmentation in a computer-aided diagnosis system for a lung and bronchoscope guidance system.
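    The Hessian-based tube enhancement used in the first step above can be illustrated with a small 2-D analogue. The sketch below is a simplification (not the authors' multiscale cavity filter): it computes Gaussian second derivatives with SciPy and forms a simple bright-tube response from the Hessian eigenvalues; the scale and the response formula are assumptions.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def tube_response(image, sigma=2.0):
            """Crude 2-D bright-tube response from Hessian eigenvalues at one scale."""
            # Second derivatives via Gaussian derivative filters (axis 0 = rows/y, axis 1 = cols/x)
            Hxx = gaussian_filter(image, sigma, order=(0, 2))
            Hyy = gaussian_filter(image, sigma, order=(2, 0))
            Hxy = gaussian_filter(image, sigma, order=(1, 1))
            # Eigenvalues of the 2x2 Hessian at every pixel
            tmp = np.sqrt(((Hxx - Hyy) / 2.0) ** 2 + Hxy ** 2)
            l1 = (Hxx + Hyy) / 2.0 + tmp
            l2 = (Hxx + Hyy) / 2.0 - tmp
            # Bright tubes: one strongly negative eigenvalue, the other near zero
            lam = np.minimum(l1, l2)
            return np.where(lam < 0, -lam, 0.0)

        # Synthetic image: a bright horizontal line on low-amplitude noise
        img = 0.1 * np.random.default_rng(0).random((64, 64))
        img[32, 10:54] += 1.0
        resp = tube_response(img)
        print(resp[32, 30] > resp[10, 30])   # True: response is stronger on the tube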

  4. Non-Traditional Security: The Case of Water Security in the Mekong Subregion

    Directory of Open Access Journals (Sweden)

    Haefner, Andrea

    2013-09-01

    Full Text Available In the first decade of the twenty-first century, Non-Traditional Security (NTS) challenges are of rising importance due to their increasing impact on daily life and broader national interests. This paper focuses on the Mekong Region as an important subregion due to its significance for more than 70 million people living directly on the river banks and its importance for the economic development of the six riparian countries. This paper investigates NTS challenges in the Mekong Subregion with a focus on environmental challenges and argues that NTS challenges are of increasing importance in the region and will increase in the future. Whereas economic growth is crucial for improving livelihoods on the Mekong River and for the overall economic performance of the riparian states, environmental protection cannot be disregarded, as doing so would have a devastating impact on the subregion and the wider region in the future.

  5. Non-Traditional Systemic Treatments for Diabetic Retinopathy: An Evidence-Based Review

    Science.gov (United States)

    Simó, Rafael; Ballarini, Stefania; Cunha-Vaz, José; Ji, Linong; Haller, Hermann; Zimmet, Paul; Wong, Tien Y.

    2015-01-01

    The rapid escalation in the global prevalence of diabetes, with more than 30% of those affected being afflicted with diabetic retinopathy (DR), means it is likely that associated vision-threatening conditions will also rise substantially. This means that new therapeutic approaches need to be found that go beyond the current standards of diabetic care, and which are effective in the early stages of the disease. In recent decades several new pharmacological agents have been investigated for their effectiveness in preventing the appearance and progression of DR or in reversing DR; some with limited success while others appear promising. This up-to-date critical review of non-traditional systemic treatments for DR is based on the published evidence in MEDLINE spanning 1980-December 2014. It discusses a number of therapeutic options, paying particular attention to the mechanisms of action and the clinical evidence for the use of renin-angiotensin system blockade, fenofibrate and calcium dobesilate monohydrate in DR. PMID:25989912

  6. ASEAN’s Environmental Challenges and Non-Traditional Security Cooperation: Towards a Regional Peacekeeping Force?

    Directory of Open Access Journals (Sweden)

    Henning Borchers

    2014-06-01

    Full Text Available This article reflects on the prospect for an ASEAN peacekeeping force and regional security cooperation. I argue that progress on ‘soft’ security issues stands to facilitate a slow deepening of ‘hard’ security cooperation at the ASEAN level. Governments of ASEAN member states are still reluctant to develop a regional mechanism for conflict resolution, which they perceive to be a challenge to the norms of non-interference and state sovereignty. Yet, these norms are subject to dynamic shifts in the security environment that regional governments now have to manage. The establishment of mechanisms to address politically less controversial non-traditional security issues such as environmental challenges stands to further develop and consolidate military-to-military ties and deepen political trust among member states. An ASEAN standby force for emergency response and disaster relief has become a politically acceptable initiative and could set the stage for the development of an ASEAN peacekeeping force.

  7. The Potential Role of Non-Traditional Donors' Aid in Africa

    DEFF Research Database (Denmark)

    Kragelund, Peter

    dictators, providing aid with ‘no strings attached’ thereby undermining the development efforts of the traditional donors. This paper questions this dichotomous view and instead argues that the re-emergence of non-traditional donors may affect African development efforts positively as well as negatively...... capacity in African countries in order to increase the positive developmental effects of the additional flows of development assistance. Moreover, there are some opportunities for collaboration among the two groups of donors provided that ownership of the interventions is on African hands. This could...... enhance mutual understanding among the developing partners and potentially pave the way for a dialogue based on positive development experiences rather than uninformed critique....

  8. Extraction and properties of starches from the non-traditional vegetables Yam and Taro

    Energy Technology Data Exchange (ETDEWEB)

    Andrade, Luan Alberto; Barbosa, Natalia Alves; Pereira, Joelma, E-mail: luandrade87@yahoo.com.br [Universidade Federal de Lavras (UFLA), Lavras, MG (Brazil)

    2017-04-15

    The objective of this study was to assess the chemical, physical, morphological, crystalline and thermal properties of starch from two non-traditional vegetables, yam and taro. The analyses included percent proximate composition, amylose and mineral contents, water absorption capacity, absolute density, morphological properties, X-ray diffractometry, thermal properties, pasting properties and infrared spectrum. The extracted starch exhibited a high purity level with low lipid, fiber and ash contents. The electron micrographs suggested that the taro starch granules were smaller than the yam starch granules. The results for the experimental conditions used in this study indicated that the studied starches differed, especially in amylose content, granule size, degree of crystallinity and crystalline pattern. Due to the high amylose content of yam starch, this type of starch can be used for film preparation, whereas the taro starch can be used as a fat substitute due to its small granule size. (author)

  9. Climate trends in a non-traditional high quality wine producing region

    Directory of Open Access Journals (Sweden)

    Ludmila Bardin-Camparotto

    2014-09-01

    Full Text Available Global warming may put pressure on some of the world's highest-quality wine producing regions. This fact indicates the need to evaluate the presence of climate change in non-traditional wine producing regions of the globe. Therefore, the goals of this study were to detect trends in rainfall and air temperature series obtained from three locations of the eastern part of the State of São Paulo, Brazil (a non-traditional high quality wine producing region) and to evaluate the effect of the detected climate trends on agrometeorological indices frequently used to indicate suitable areas for wine production. The trend analyses were applied to maximum and minimum air temperature series, rainfall series and to the following agrometeorological parameters: heliothermal index, cool night index and growing degree-days. These three indices were selected due to their previous use in studies that address the effect of regional climate conditions on the general wine style. The analyses took into account the grape phenological aspects for both summer and winter growing seasons. The results found in this study support the hypothesis of the presence of climate trends in the wine producing regions of the eastern part of the State of São Paulo, Brazil. These trends are mostly linked to changes in the minimum air temperature. The results also reveal a shortening in the duration of grapevine phenological phases and a change to warmer conditions during the ripening months. These changes are consistent with the climate changes observed in other wine producing regions of the world and may negatively affect the wine production of the eastern part of the State of São Paulo.
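    Of the agrometeorological indices named above, growing degree-days is the simplest to compute. The sketch below is a generic illustration using daily maximum and minimum temperatures and an assumed 10 °C base temperature commonly used for grapevines; it is not the exact formulation or data of the study, and the temperature values are placeholders.

        import numpy as np

        def growing_degree_days(tmax, tmin, base=10.0):
            """Sum of daily mean temperature above a base temperature (degree-C days)."""
            tmean = (np.asarray(tmax) + np.asarray(tmin)) / 2.0
            return np.sum(np.clip(tmean - base, 0.0, None))

        # Hypothetical week of daily temperatures (deg C)
        tmax = [28, 30, 27, 25, 31, 29, 26]
        tmin = [15, 17, 14, 13, 18, 16, 14]
        print(f"GDD over the period: {growing_degree_days(tmax, tmin):.1f}")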

  10. A bit of both science and economics: a non-traditional STEM identity narrative

    Science.gov (United States)

    Mark, Sheron L.

    2017-10-01

    Black males, as one non-dominant population, remain underrepresented and less successful in science, technology, engineering, and mathematics (STEM). Researchers focused on non-dominant populations are advised against generalizations and to examine cultural intersections (i.e. race, ethnicity, gender, and more) and also to explore cases of success, in addition to cases of under-achievement and underrepresentation. This study has focused on one African American male, Randy, who expressed high-achieving STEM career goals in computer science and engineering. Furthermore, recognizing that culture and identity development underlie STEM engagement and persistence, this long-term case study focused on how Randy developed a STEM identity during the course of the study and the implications of that process for his STEM career exploration. Étienne Wenger's (1999) communities-of-practice (CoP) was employed as a theoretical framework and, in doing so, (1) the informal STEM program in which Randy participated was characterized as a STEM-for-social-justice CoP and (2) Randy participated in ways that consistently utilized an "economics" lens from beyond the boundaries of the CoP. In doing so, Randy functioned as a broker within the CoP and developed a non-traditional STEM identity-in-practice which integrated STEM, "economics", and community engagement. Randy's STEM identity-in-practice is discussed in terms of the contextual factors that support scientific identity development (Hazari et al. in J Res Sci Teach 47:978-1003, 2010), the importance of recognizing and supporting the development of holistic and non-traditional STEM identities, especially for diverse populations in STEM, and the implications of this new understanding of Randy's STEM identity for his long-term STEM career exploration.

  11. Cathedral outreach: student-led workshops for school curriculum enhancement in non-traditional environments

    Science.gov (United States)

    Posner, Matthew T.; Jantzen, Alexander; van Putten, Lieke D.; Ravagli, Andrea; Donko, Andrei L.; Soper, Nathan; Wong, Nicholas H. L.; John, Pearl V.

    2017-08-01

    Universities in the United Kingdom have been driven to work with a larger pool of potential students than just the more traditional student (middle-class white male), in order to tackle the widely accepted skills shortage in the fields of science, technology, engineering and mathematics (STEM), whilst honoring their commitment to fair access to higher education. Student-led outreach programs have contributed significantly to this drive. Two such programs run by postgraduate students at the University of Southampton are the Lightwave Roadshow and Southampton Accelerate!, which focus on photonics and particle physics, respectively. The program ambassadors have developed activities to enhance areas of the national curriculum through presenting fundamental physical sciences and their applications to optics and photonics research. The activities have benefitted significantly from investment from international organizations, such as SPIE, OSA and the IEEE Photonics Society, and UK research councils, in conjunction with university recruitment and outreach strategies. New partnerships have been formed to expand outreach programs to work in non-traditional environments to challenge stereotypes of scientists. This paper presents two case studies of collaboration with education learning centers at Salisbury Cathedral and Winchester Cathedral. The paper outlines workshops and shows developed for pupils aged 6-14 years (UK key stages 2-4) on the electromagnetic spectrum, particle physics, telecommunications and the human eye using a combination of readily obtainable items, hand-built kits and elements from the EYEST Photonics Explorer kit. The activities are interactive to stimulate learning through active participation, complement the UK national curriculum and link the themes of science with the non-traditional setting of a cathedral. We present methods to evaluate the impact of the activity and tools to obtain qualitative feedback for continual program improvement. We also

  12. Assessing readiness for self-directed learning within a non-traditional nursing cohort.

    Science.gov (United States)

    Phillips, Brian N; Turnbull, Beverley J; He, Flora X

    2015-03-01

    Increasing deregulation of the Australian tertiary system has led to changes in the entry behaviours anticipated in non-traditional student cohorts. Many nursing students are returning to formal studies later in their lives seeking a career change. Accessibility and flexible study paths make external study increasingly attractive. However, external studies require a level of commitment and willingness to develop self-direction and a capacity for resilience. This study sought to elicit the level of self-directed learning readiness (SDLR) among undergraduate nursing students currently enrolled at a bachelor level, and to determine what differences existed in the levels of SDLR in relation to age, gender, academic year, and previous qualifications. An online survey questionnaire was utilised based on the Self-directed Learning Readiness Scale for Nursing Education. In contrast to earlier work, the participant profile in this study was predominantly non-traditional and captured participants from all three years of the nursing programme. Results found no significant age or gender differences. First year students demonstrated lower levels of self-directed learning readiness. However, unexpected results were demonstrated in the survey subscales in relation to previous qualifications. Participants who already held post-graduate qualifications showed lower scores for Self-Management than those who held diploma qualifications, while students who already held a bachelor's degree had the highest scores in Desire for Learning. The study findings suggest that universities should not assume that SDL capability is dependent on mature age or length of exposure to tertiary study. Copyright © 2014 Elsevier Ltd. All rights reserved.

  13. Diffusion of non-traditional cookstoves across western Honduras: A social network analysis

    International Nuclear Information System (INIS)

    Ramirez, Sebastian; Dwivedi, Puneet; Ghilardi, Adrian; Bailis, Robert

    2014-01-01

    A third of the world's population uses inefficient biomass stoves, contributing to severe health problems, forest degradation, and climate change. Clean-burning, fuel-efficient, non-traditional cookstoves (NTCS) are a promising solution; however, numerous projects fail during the diffusion process. We use social network analysis to reveal patterns driving a successful stove intervention in western Honduras. The intervention lacks formal marketing, but has spread across a wide area in just a few years. To understand the process, we map the social network of active community members (ACMs) who drove diffusion across a large swath of the country. We find that most ACMs heard about stoves twice before sharing information about them with others and introducing the stove into their own communities. On average, the social distance between ACMs and the project team is 3 degrees of separation. Both men and women are critical to the diffusion process, but men tend to communicate over longer distances, while women principally communicate over shorter distances. Government officials are also crucial to diffusion. Understanding how information moves through social networks and across geographic space allows us to theorize how knowledge about beneficial technologies spreads in the absence of formal marketing and inform policies for NTCS deployment worldwide. - Highlights: • We build a chain of referrals to track the spread of information about non-traditional cookstoves. • We find differences among gender and occupations that should inform policy. • People hear about the stoves twice before becoming suppliers of information. • Government officials play a substantial role in the diffusion. • Males play a leading role in diffusion over long distances, females over short distances.

  14. Prediction Model for Health-Related Quality of Life of Elderly with Chronic Diseases using Machine Learning Techniques.

    Science.gov (United States)

    Lee, Soo-Kyoung; Son, Youn-Jung; Kim, Jeongeun; Kim, Hong-Gee; Lee, Jae-Il; Kang, Bo-Yeong; Cho, Hyeon-Sung; Lee, Sungin

    2014-04-01

    The purposes of this study were to identify the factors that affect the health-related quality of life (HRQoL) of the elderly with chronic diseases and to subsequently develop from such factors a prediction model to help identify HRQoL risk groups that require intervention. We analyzed a set of secondary data regarding 716 individuals extracted from the Korea National Health and Nutrition Examination Survey from 2008 to 2010. The statistical packages SPSS and MATLAB were used for data analysis and development of the prediction model. The algorithms used in the study were the following: stepwise logistic regression (SLR) analysis and machine learning (ML) techniques, such as decision tree, random forest, and support vector machine methods. Five factors with statistical significance were identified for HRQoL in the elderly with chronic diseases: 'monthly income', 'diagnosis of chronic disease', 'depression', 'discomfort', and 'perceived health status.' The SLR analysis showed the best performance with accuracy = 0.93 and F-score = 0.49. The results of this study provide essential materials that will help formulate personalized health management strategies and develop intervention programs towards the improvement of the HRQoL for elderly people with chronic diseases. Our study is, to the best of our knowledge, the first attempt to identify the influencing factors and to apply prediction models for the HRQoL of the elderly with chronic diseases by using ML techniques as an alternative and complement to the traditional statistical approaches.
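    The model comparison described above, logistic regression against tree-based and SVM classifiers on tabular survey-style data, can be sketched generically as below. The data are synthetic placeholders, stepwise variable selection is approximated here by plain logistic regression, and accuracy and F1 are reported with cross-validation; none of this reproduces the study's actual dataset or tuning.

        from sklearn.datasets import make_classification
        from sklearn.linear_model import LogisticRegression
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_validate

        # Placeholder for survey-style predictors (income, depression score, ...) and an HRQoL risk label
        X, y = make_classification(n_samples=716, n_features=10, n_informative=5, random_state=0)

        models = {
            "LogisticRegression": LogisticRegression(max_iter=1000),
            "DecisionTree": DecisionTreeClassifier(random_state=0),
            "RandomForest": RandomForestClassifier(n_estimators=200, random_state=0),
            "SVM": SVC(),
        }
        for name, model in models.items():
            scores = cross_validate(model, X, y, cv=5, scoring=["accuracy", "f1"])
            print(f"{name}: accuracy={scores['test_accuracy'].mean():.2f}, "
                  f"F1={scores['test_f1'].mean():.2f}")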

  15. Automatic migraine classification via feature selection committee and machine learning techniques over imaging and questionnaire data.

    Science.gov (United States)

    Garcia-Chimeno, Yolanda; Garcia-Zapirain, Begonya; Gomez-Beldarrain, Marian; Fernandez-Ruanova, Begonya; Garcia-Monco, Juan Carlos

    2017-04-13

    Feature selection methods are commonly used to identify subsets of relevant features to facilitate the construction of models for classification, yet little is known about how feature selection methods perform in diffusion tensor images (DTIs). In this study, feature selection and machine learning classification methods were tested for the purpose of automating diagnosis of migraines using both DTIs and questionnaire answers related to emotion and cognition - factors that influence pain perception. We selected 52 adult subjects for the study, divided into three groups: a control group (15), subjects with sporadic migraine (19) and subjects with chronic migraine and medication overuse (18). These subjects underwent magnetic resonance imaging with diffusion tensor sequences to assess the white matter pathway integrity of the regions of interest involved in pain and emotion. The tests also gathered data about pathology. The DTI images and test results were then introduced into feature selection algorithms (Gradient Tree Boosting, L1-based, Random Forest and Univariate) to reduce the features of the first dataset, and classification algorithms (SVM (Support Vector Machine), Boosting (Adaboost) and Naive Bayes) to perform classification of the migraine groups. Moreover, we implement a committee method based on the feature selection algorithms to improve the classification accuracy. When classifying the migraine groups, the greatest improvements in accuracy were made using the proposed committee-based feature selection method. Using this approach, the accuracy of classification into three types improved from 67 to 93% when using the Naive Bayes classifier, from 90 to 95% with the support vector machine classifier, and from 93 to 94% with boosting. The features determined to be most useful for classification are related to pain, analgesics and the left uncinate region of the brain (connected with pain and emotions). The proposed feature selection committee method improved the performance of migraine diagnosis.
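    The committee idea above, pooling several feature-selection methods and keeping features that most of them agree on, can be sketched as follows. This is a generic illustration on synthetic data, not the authors' pipeline; the three selectors, the vote threshold and the final classifier are assumptions.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.feature_selection import SelectKBest, f_classif, mutual_info_classif
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        X, y = make_classification(n_samples=200, n_features=40, n_informative=8, random_state=0)
        k = 10

        # Three selectors each vote for their top-k features
        votes = np.zeros(X.shape[1])
        for scores in (
            SelectKBest(f_classif, k=k).fit(X, y).scores_,
            SelectKBest(mutual_info_classif, k=k).fit(X, y).scores_,
            RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y).feature_importances_,
        ):
            votes[np.argsort(scores)[-k:]] += 1

        committee = np.where(votes >= 2)[0]   # keep features chosen by at least two selectors
        print("committee features:", committee)
        print("cross-validated accuracy:", cross_val_score(SVC(), X[:, committee], y, cv=5).mean())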

  16. Estimating Global Seafloor Total Organic Carbon Using a Machine Learning Technique and Its Relevance to Methane Hydrates

    Science.gov (United States)

    Lee, T. R.; Wood, W. T.; Dale, J.

    2017-12-01

    Empirical and theoretical models of sub-seafloor organic matter transformation, degradation and methanogenesis require estimates of initial seafloor total organic carbon (TOC). This subsurface methane, under the appropriate geophysical and geochemical conditions may manifest as methane hydrate deposits. Despite the importance of seafloor TOC, actual observations of TOC in the world's oceans are sparse and large regions of the seafloor yet remain unmeasured. To provide an estimate in areas where observations are limited or non-existent, we have implemented interpolation techniques that rely on existing data sets. Recent geospatial analyses have provided accurate accounts of global geophysical and geochemical properties (e.g. crustal heat flow, seafloor biomass, porosity) through machine learning interpolation techniques. These techniques find correlations between the desired quantity (in this case TOC) and other quantities (predictors, e.g. bathymetry, distance from coast, etc.) that are more widely known. Predictions (with uncertainties) of seafloor TOC in regions lacking direct observations are made based on the correlations. Global distribution of seafloor TOC at 1 x 1 arc-degree resolution was estimated from a dataset of seafloor TOC compiled by Seiter et al. [2004] and a non-parametric (i.e. data-driven) machine learning algorithm, specifically k-nearest neighbors (KNN). Built-in predictor selection and a ten-fold validation technique generated statistically optimal estimates of seafloor TOC and uncertainties. In addition, inexperience was estimated. Inexperience is effectively the distance in parameter space to the single nearest neighbor, and it indicates geographic locations where future data collection would most benefit prediction accuracy. These improved geospatial estimates of TOC in data deficient areas will provide new constraints on methane production and subsequent methane hydrate accumulation.
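    A minimal version of the data-driven interpolation described above can be sketched with k-nearest-neighbour regression and ten-fold validation; the snippet also reports the distance to the single nearest training sample in predictor space as a rough "inexperience" indicator. The data, predictors and value of k are placeholders, not the compiled TOC dataset.

        import numpy as np
        from sklearn.neighbors import KNeighborsRegressor, NearestNeighbors
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        # Placeholder predictors (e.g. bathymetry, distance from coast, ...) and seafloor TOC (%)
        X = rng.random((300, 4))
        y = 2.0 * X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.standard_normal(300)

        knn = KNeighborsRegressor(n_neighbors=5, weights="distance")
        r2 = cross_val_score(knn, X, y, cv=10, scoring="r2")       # ten-fold validation
        print(f"mean cross-validated R^2: {r2.mean():.3f}")

        # "Inexperience": distance in predictor space to the single nearest observation
        knn.fit(X, y)
        X_new = rng.random((5, 4))                                 # unmeasured locations
        dist, _ = NearestNeighbors(n_neighbors=1).fit(X).kneighbors(X_new)
        print("inexperience (nearest-neighbour distance):", np.round(dist.ravel(), 3))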

  17. Classification of fMRI resting-state maps using machine learning techniques: A comparative study

    Science.gov (United States)

    Gallos, Ioannis; Siettos, Constantinos

    2017-11-01

    We compare the efficiency of Principal Component Analysis (PCA) and nonlinear manifold learning algorithms (ISOMAP and Diffusion Maps) for classifying brain maps between groups of schizophrenia patients and healthy controls from fMRI scans acquired during a resting-state experiment. After a standard pre-processing pipeline, we applied spatial Independent Component Analysis (ICA) to reduce (a) noise and (b) the spatial-temporal dimensionality of the fMRI maps. On the cross-correlation matrix of the ICA components, we applied PCA, ISOMAP and Diffusion Maps to find an embedded low-dimensional space. Finally, support vector machine (SVM) and k-NN algorithms were used to evaluate the performance of the algorithms in classifying between the two groups.
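    The comparison above, linear versus nonlinear dimensionality reduction followed by SVM or k-NN classification, can be outlined with scikit-learn as below. The sketch uses synthetic feature vectors in place of ICA cross-correlation matrices and omits diffusion maps (not available in scikit-learn); the embedding dimensions and classifier settings are assumptions.

        from sklearn.datasets import make_classification
        from sklearn.decomposition import PCA
        from sklearn.manifold import Isomap
        from sklearn.pipeline import make_pipeline
        from sklearn.svm import SVC
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.model_selection import cross_val_score

        # Placeholder for vectorised ICA cross-correlation features per subject
        X, y = make_classification(n_samples=120, n_features=300, n_informative=15, random_state=0)

        embeddings = {"PCA": PCA(n_components=10), "ISOMAP": Isomap(n_neighbors=8, n_components=10)}
        classifiers = {"SVM": SVC(kernel="linear"), "k-NN": KNeighborsClassifier(n_neighbors=5)}

        for e_name, emb in embeddings.items():
            for c_name, clf in classifiers.items():
                pipe = make_pipeline(emb, clf)                 # embed, then classify
                acc = cross_val_score(pipe, X, y, cv=5).mean()
                print(f"{e_name} + {c_name}: accuracy = {acc:.2f}")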

  18. Detecting Mental States by Machine Learning Techniques: The Berlin Brain-Computer Interface

    Science.gov (United States)

    Blankertz, Benjamin; Tangermann, Michael; Vidaurre, Carmen; Dickhaus, Thorsten; Sannelli, Claudia; Popescu, Florin; Fazli, Siamac; Danóczy, Márton; Curio, Gabriel; Müller, Klaus-Robert

    The Berlin Brain-Computer Interface (BBCI) uses a machine learning approach to extract user-specific patterns from high-dimensional EEG features optimized for revealing the user's mental state. Classical BCI applications are brain-actuated tools for patients such as prostheses (see Section 4.1) or mental text entry systems ([1] and see [2-5] for an overview on BCI). In these applications, the BBCI uses natural motor skills of the users and specifically tailored pattern recognition algorithms for detecting the user's intent. But beyond rehabilitation, there is a wide range of possible applications in which BCI technology is used to monitor other mental states, often even covert ones (see also [6] in the fMRI realm). While this field is still largely unexplored, two examples from our studies are presented in Sections 4.3 and 4.4.

  19. Analysis and design of machine learning techniques evolutionary solutions for regression, prediction, and control problems

    CERN Document Server

    Stalph, Patrick

    2014-01-01

    Manipulating or grasping objects seems like a trivial task for humans, as these are motor skills of everyday life. Nevertheless, motor skills are not easy to learn for humans and this is also an active research topic in robotics. However, most solutions are optimized for industrial applications and, thus, few are plausible explanations for human learning. The fundamental challenge that motivates Patrick Stalph originates from cognitive science: How do humans learn their motor skills? The author makes a connection between robotics and the cognitive sciences by analyzing motor skill learning using implementations that could be found in the human brain – at least to some extent. Therefore, three suitable machine learning algorithms are selected – algorithms that are plausible from a cognitive viewpoint and feasible for the roboticist. The power and scalability of those algorithms is evaluated in theoretical simulations and more realistic scenarios with the iCub humanoid robot. Convincing results confirm the...

  20. 3D Cloud Field Prediction using A-Train Data and Machine Learning Techniques

    Science.gov (United States)

    Johnson, C. L.

    2017-12-01

    Validation of cloud process parameterizations used in global climate models (GCMs) would greatly benefit from observed 3D cloud fields at the size comparable to that of a GCM grid cell. For the highest resolution simulations, surface grid cells are on the order of 100 km by 100 km. CloudSat/CALIPSO data provides 1 km width of detailed vertical cloud fraction profile (CFP) and liquid and ice water content (LWC/IWC). This work utilizes four machine learning algorithms to create nonlinear regressions of CFP, LWC, and IWC data using radiances, surface type and location of measurement as predictors and applies the regression equations to off-track locations generating 3D cloud fields for 100 km by 100 km domains. The CERES-CloudSat-CALIPSO-MODIS (C3M) merged data set for February 2007 is used. Support Vector Machines, Artificial Neural Networks, Gaussian Processes and Decision Trees are trained on 1000 km of continuous C3M data. Accuracy is computed using existing vertical profiles that are excluded from the training data and occur within 100 km of the training data. Accuracy of the four algorithms is compared. Average accuracy for one day of predicted data is 86% for the most successful algorithm. The methodology for training the algorithms, determining valid prediction regions and applying the equations off-track is discussed. Predicted 3D cloud fields are provided as inputs to the Ed4 NASA LaRC Fu-Liou radiative transfer code and resulting TOA radiances compared to observed CERES/MODIS radiances. Differences in computed radiances using predicted profiles and observed radiances are compared.

  1. Trace elements and naturally occurring radioactive materials in 'Non-traditional fertilizers' used in Ghana

    International Nuclear Information System (INIS)

    Assibey, E. O.

    2013-07-01

    Fertilizers have been implicated for being contaminated with toxic trace elements and naturally occurring radioactive materials (NORMs) even though they are an indispensable component of our agriculture. This phenomenon of contamination has been investigated and established world-wide in various forms of fertilizers (i.e., granular or 'traditional' type and liquid/powder or 'non-traditional' type). In Ghana, the crop sub-sector has seen a gradual rise in the importation and use of 'non-traditional fertilizers' which are applied to both the foliar parts and roots of plants. This notwithstanding, research on fertilizers has been largely skewed towards the 'traditional' types, focusing principally on the subjects of yield, effects of application and their quality. This study was, therefore, undertaken to bridge the knowledge gap by investigating the levels of trace elements and NORMs found in the 'non-traditional' fertilizers used in Ghana. The principal objective of the study was to investigate the suitability of the 'non-traditional fertilizers' for agricultural purposes with respect to trace element and NORM contamination. Atomic Absorption Spectrometry and Instrumental Neutron Activation Analysis were employed to determine the trace element (Cu, Zn, Fe, Na, Al, Br, Ni, Cd, As, Hg, Co, Pb, La, Mn, Si, Ca, Cl, S, K, Ba and V) and NORM (238U, 232Th and 40K) concentrations in thirty-nine (39) fertilizer samples taken from two major agro-input hubs in the country (Kumasi-Kejetia and Accra). Multivariate statistical analyses (cluster analysis, principal component analysis and Pearson's correlation) were applied to the data obtained in order to identify possible sources of contamination, investigate sample/parameter affinities and groupings, and for fingerprinting. The toxic trace element concentrations determined in all samples were found to be in the order Fe>Cu>Co>Cd>Cr>Ni>Pb>As>Hg. The study found most of the trace elements determined to be within limits set

  2. Identifying tropical dry forests extent and succession via the use of machine learning techniques

    Science.gov (United States)

    Li, Wei; Cao, Sen; Campos-Vargas, Carlos; Sanchez-Azofeifa, Arturo

    2017-12-01

    Information on ecosystem services as a function of the successional stage of secondary tropical dry forests (TDFs) is scarce and limited. Secondary TDF succession is defined as regrowth following complete forest clearance for cattle ranching or agricultural activities. In the context of large conservation initiatives, the identification of the extent, structure and composition of secondary TDFs can serve as a key element for estimating the effectiveness of such initiatives. As such, in this study we evaluate the use of a Hyperspectral MAPper (HyMap) dataset and a waveform LIDAR dataset for characterization of different levels of intra-secondary forest stages at the Santa Rosa National Park (SRNP) Environmental Monitoring Super Site located in Costa Rica. Specifically, a multi-task learning based machine learning classifier (MLC-MTL) is employed on the first shortwave infrared band (SWIR1) of HyMap in order to identify the variability of aboveground biomass of secondary TDFs along a successional gradient. Our paper recognizes that the process of ecological succession is not deterministic but a combination of transitional forest types along a stochastic path that depends on ecological, edaphic, land use, and micro-meteorological conditions, and our results provide a new way to obtain the spatial distribution of three main types of TDF successional stages.

  3. Non-Invasive Blood Pressure Estimation from ECG Using Machine Learning Techniques.

    Science.gov (United States)

    Simjanoska, Monika; Gjoreski, Martin; Gams, Matjaž; Madevska Bogdanova, Ana

    2018-04-11

    Blood pressure (BP) measurements have been used widely in clinical and private environments. Recently, the use of ECG monitors has proliferated; however, they are not enabled with BP estimation. We have developed a method for BP estimation using only electrocardiogram (ECG) signals. Raw ECG data are filtered and segmented, and, following this, a complexity analysis is performed for feature extraction. Then, a machine-learning method is applied, combining a stacking-based classification module and a regression module for building systolic BP (SBP), diastolic BP (DBP), and mean arterial pressure (MAP) predictive models. In addition, the method allows a probability distribution-based calibration to adapt the models to a particular user. Using ECG recordings from 51 different subjects, 3129 30-s ECG segments are constructed, and seven features are extracted. Using a train-validation-test evaluation, the method achieves a mean absolute error (MAE) of 8.64 mmHg for SBP, 18.20 mmHg for DBP, and 13.52 mmHg for the MAP prediction. When models are calibrated, the MAE decreases to 7.72 mmHg for SBP, 9.45 mmHg for DBP and 8.13 mmHg for MAP. The experimental results indicate that, when a probability distribution-based calibration is used, the proposed method can achieve results close to those of a certified medical device for BP estimation.

  4. Automated Classification of Heritage Buildings for As-Built Bim Using Machine Learning Techniques

    Science.gov (United States)

    Bassier, M.; Vergauwen, M.; Van Genechten, B.

    2017-08-01

    Semantically rich three-dimensional models such as Building Information Models (BIMs) are increasingly used in digital heritage. They provide the required information to varying stakeholders during the different stages of the historic building's life cycle, which is crucial in the conservation process. The creation of as-built BIM models is based on point cloud data. However, manually interpreting this data is labour intensive and often leads to misinterpretations. By automatically classifying the point cloud, the information can be processed more efficiently. A key aspect in this automated scan-to-BIM process is the classification of building objects. In this research we look to automatically recognise elements in existing buildings to create compact semantic information models. Our algorithm efficiently extracts the main structural components such as floors, ceilings, roofs, walls and beams despite the presence of significant clutter and occlusions. More specifically, Support Vector Machines (SVM) are proposed for the classification. The algorithm is evaluated using real data from a variety of existing buildings. The results prove that the classifier used recognizes the objects with both high precision and recall. As a result, entire data sets are reliably labelled at once. The approach enables experts to better document and process heritage assets.

  5. Multivariate Cross-Classification: Applying machine learning techniques to characterize abstraction in neural representations

    Directory of Open Access Journals (Sweden)

    Jonas eKaplan

    2015-03-01

    Full Text Available Here we highlight an emerging trend in the use of machine learning classifiers to test for abstraction across patterns of neural activity. When a classifier algorithm is trained on data from one cognitive context, and tested on data from another, conclusions can be drawn about the role of a given brain region in representing information that abstracts across those cognitive contexts. We call this kind of analysis Multivariate Cross-Classification (MVCC), and review several domains where it has recently made an impact. MVCC has been important in establishing correspondences among neural patterns across cognitive domains, including motor-perception matching and cross-sensory matching. It has been used to test for similarity between neural patterns evoked by perception and those generated from memory. Other work has used MVCC to investigate the similarity of representations for semantic categories across different kinds of stimulus presentation, and in the presence of different cognitive demands. We use these examples to demonstrate the power of MVCC as a tool for investigating neural abstraction and discuss some important methodological issues related to its application.
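    The core MVCC move, training on data from one cognitive context and testing on data from another, reduces to a simple cross-context train/test split. The sketch below is a generic illustration with synthetic "pattern" data, not a neuroimaging pipeline; the context names, the shared class offset and the classifier are assumptions.

        import numpy as np
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)

        def make_context(n, shift):
            """Two classes of noisy voxel patterns; `shift` mimics a change of cognitive context."""
            X0 = rng.standard_normal((n, 50)) + shift
            X1 = rng.standard_normal((n, 50)) + shift + 1.0   # class offset shared across contexts
            return np.vstack([X0, X1]), np.array([0] * n + [1] * n)

        X_perception, y_perception = make_context(60, shift=0.0)
        X_memory, y_memory = make_context(60, shift=0.3)

        clf = SVC(kernel="linear").fit(X_perception, y_perception)   # train in context A
        acc = clf.score(X_memory, y_memory)                          # test in context B
        print(f"cross-classification accuracy: {acc:.2f}")           # above chance suggests abstraction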

  6. Prediction of Driver's Intention of Lane Change by Augmenting Sensor Information Using Machine Learning Techniques.

    Science.gov (United States)

    Kim, Il-Hwan; Bong, Jae-Hwan; Park, Jooyoung; Park, Shinsuk

    2017-06-10

    Driver assistance systems have become a major safety feature of modern passenger vehicles. The advanced driver assistance system (ADAS) is one of the active safety systems to improve the vehicle control performance and, thus, the safety of the driver and the passengers. To use the ADAS for lane change control, rapid and correct detection of the driver's intention is essential. This study proposes a novel preprocessing algorithm for the ADAS to improve the accuracy in classifying the driver's intention for lane change by augmenting basic measurements from conventional on-board sensors. The information on the vehicle states and the road surface condition is augmented by using artificial neural network (ANN) models, and the augmented information is fed to a support vector machine (SVM) to detect the driver's intention with high accuracy. The feasibility of the developed algorithm was tested through driving simulator experiments. The results show that the classification accuracy for the driver's intention can be improved by providing an SVM model with sufficient driving information augmented by using ANN models of vehicle dynamics.

  7. Machine learning techniques for breast cancer computer aided diagnosis using different image modalities: A systematic review.

    Science.gov (United States)

    Yassin, Nisreen I R; Omran, Shaimaa; El Houby, Enas M F; Allam, Hemat

    2018-03-01

    The high incidence of breast cancer in women has increased significantly in recent years. Physician experience of diagnosing and detecting breast cancer can be assisted by using computerized feature extraction and classification algorithms. This paper presents the conduct and results of a systematic review (SR) that aims to investigate the state of the art regarding computer aided diagnosis/detection (CAD) systems for breast cancer. The SR was conducted using a comprehensive selection of scientific databases as reference sources, allowing access to diverse publications in the field. The scientific databases used are Springer Link (SL), Science Direct (SD), IEEE Xplore Digital Library, and PubMed. Inclusion and exclusion criteria were defined and applied to each retrieved work to select those of interest. From 320 studies retrieved, 154 studies were included. However, the scope of this research is limited to scientific and academic works and excludes commercial interests. This survey provides a general analysis of the current status of CAD systems according to the used image modalities and the machine learning based classifiers. Potential research studies have been discussed to create more objective and efficient CAD systems. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. submitter Studies of CMS data access patterns with machine learning techniques

    CERN Document Server

    De Luca, Silvia

    This thesis presents a study of the Grid data access patterns in distributed analysis in the CMS experiment at the LHC accelerator. This study ranges from the deep analysis of the historical patterns of access to the most relevant data types in CMS, to the exploitation of a supervised Machine Learning classification system to set up a machinery able to eventually predict future data access patterns - i.e. the so-called dataset “popularity” of the CMS datasets on the Grid - with focus on specific data types. All the CMS workflows run on the Worldwide LHC Computing Grid (WLCG) computing centers (Tiers), and in particular the distributed analysis system sustains hundreds of users and applications submitted every day. These applications (or “jobs”) access different data types hosted on disk storage systems at a large set of WLCG Tiers. The detailed study of how this data is accessed, in terms of data types, hosting Tiers, and different time periods, allows us to gain precious insight into storage occupancy ove...

  9. A Real-Time Interference Monitoring Technique for GNSS Based on a Twin Support Vector Machine Method

    Directory of Open Access Journals (Sweden)

    Wutao Li

    2016-03-01

    Full Text Available Interferences can severely degrade the performance of Global Navigation Satellite System (GNSS) receivers. As the first step of any GNSS anti-interference measures, interference monitoring for GNSS is essential and necessary. Since interference monitoring can be considered as a classification problem, a real-time interference monitoring technique based on the Twin Support Vector Machine (TWSVM) is proposed in this paper. A TWSVM model is established, and the TWSVM is solved by the Least Squares Twin Support Vector Machine (LSTWSVM) algorithm. The interference monitoring indicators are analyzed to extract features from the interfered GNSS signals. The experimental results show that the chosen observations can be used as the interference monitoring indicators. The interference monitoring performance of the proposed method is verified by using the GPS L1 C/A code signal and comparing it with that of a standard SVM. The experimental results indicate that the TWSVM-based interference monitoring is much faster than the conventional SVM. Furthermore, the training time of TWSVM is on the millisecond (ms) level and the monitoring time is on the microsecond (μs) level, which makes the proposed approach usable in practical interference monitoring applications.

  10. Using machine learning techniques and genomic/proteomic information from known databases for defining relevant features for PPI classification.

    Science.gov (United States)

    Urquiza, J M; Rojas, I; Pomares, H; Herrera, J; Florido, J P; Valenzuela, O; Cepero, M

    2012-06-01

    In modern proteomics, prediction of protein-protein interactions (PPIs) is a key research line, as these interactions take part in most essential biological processes. In this paper, a new approach is proposed to PPI data classification based on the extraction of genomic and proteomic information from well-known databases and the incorporation of semantic measures. This approach is carried out through the application of data mining techniques and provides very accurate models with high levels of sensitivity and specificity in the classification of PPIs. The well-known support vector machine paradigm is used to learn the models, which will also return a new confidence score which may help expert researchers to filter out and validate new external PPIs. One of the most-widely analyzed organisms, yeast, will be studied. We processed a very high-confidence dataset by extracting up to 26 specific features obtained from the chosen databases, half of them calculated using two new similarity measures proposed in this paper. Then, by applying a filter-wrapper algorithm for feature selection, we obtained a final set composed of the eight most relevant features for predicting PPIs, which was validated by a ROC analysis. The prediction capability of the support vector machine model using these eight features was tested through the evaluation of the predictions obtained in a set of external experimental, computational, and literature-collected datasets. Copyright © 2012 Elsevier Ltd. All rights reserved.

  11. A Real-Time Interference Monitoring Technique for GNSS Based on a Twin Support Vector Machine Method.

    Science.gov (United States)

    Li, Wutao; Huang, Zhigang; Lang, Rongling; Qin, Honglei; Zhou, Kai; Cao, Yongbin

    2016-03-04

    Interferences can severely degrade the performance of Global Navigation Satellite System (GNSS) receivers. As the first step of any GNSS anti-interference measures, interference monitoring for GNSS is essential and necessary. Since interference monitoring can be considered as a classification problem, a real-time interference monitoring technique based on the Twin Support Vector Machine (TWSVM) is proposed in this paper. A TWSVM model is established, and the TWSVM is solved by the Least Squares Twin Support Vector Machine (LSTWSVM) algorithm. The interference monitoring indicators are analyzed to extract features from the interfered GNSS signals. The experimental results show that the chosen observations can be used as the interference monitoring indicators. The interference monitoring performance of the proposed method is verified by using the GPS L1 C/A code signal and comparing it with that of a standard SVM. The experimental results indicate that the TWSVM-based interference monitoring is much faster than the conventional SVM. Furthermore, the training time of TWSVM is on the millisecond (ms) level and the monitoring time is on the microsecond (μs) level, which makes the proposed approach usable in practical interference monitoring applications.

  12. Reverse engineering smart card malware using side channel analysis with machine learning techniques

    CSIR Research Space (South Africa)

    Djonon Tsague, Hippolyte

    2016-12-01

    Full Text Available by evaluating its power consumption only. Besides well-studied methods from side channel analysis, we apply a combination of dimensionality reduction techniques in the form of PCA and LDA models to compress the large amount of data generated while preserving...

  13. Machine learning techniques for diabetic macular edema (DME) classification on SD-OCT images.

    Science.gov (United States)

    Alsaih, Khaled; Lemaitre, Guillaume; Rastgoo, Mojdeh; Massich, Joan; Sidibé, Désiré; Meriaudeau, Fabrice

    2017-06-07

    Spectral domain optical coherence tomography (SD-OCT) is the most widely used imaging equipment in ophthalmology to detect diabetic macular edema (DME). Indeed, it offers an accurate visualization of the morphology of the retina as well as the retinal layers. The dataset used in this study has been acquired by the Singapore Eye Research Institute (SERI), using a CIRRUS TM (Carl Zeiss Meditec, Inc., Dublin, CA, USA) SD-OCT device. The dataset consists of 32 OCT volumes (16 DME and 16 normal cases). Each volume contains 128 B-scans with a resolution of 1024 px × 512 px, resulting in more than 3800 images being processed. All SD-OCT volumes are read and assessed by trained graders and identified as normal or DME cases based on evaluation of retinal thickening, hard exudates, intraretinal cystoid space formation, and subretinal fluid. Within the DME sub-set, a large number of lesions has been selected to create a rather complete and diverse DME dataset. This paper presents an automatic classification framework for SD-OCT volumes in order to identify DME versus normal volumes. In this regard, a generic pipeline including pre-processing, feature detection, feature representation, and classification was investigated. More precisely, extraction of histogram of oriented gradients and local binary pattern (LBP) features within a multiresolution approach is used, as well as principal component analysis (PCA) and bag of words (BoW) representations. Besides comparing individual and combined features, different representation approaches and different classifiers are evaluated. The best results are obtained for LBP[Formula: see text] vectors while represented and classified using PCA and a linear support vector machine (SVM), leading to a sensitivity (SE) and specificity (SP) of 87.5 and 87.5%, respectively.

  14. Automated analysis of retinal imaging using machine learning techniques for computer vision.

    Science.gov (United States)

    De Fauw, Jeffrey; Keane, Pearse; Tomasev, Nenad; Visentin, Daniel; van den Driessche, George; Johnson, Mike; Hughes, Cian O; Chu, Carlton; Ledsam, Joseph; Back, Trevor; Peto, Tunde; Rees, Geraint; Montgomery, Hugh; Raine, Rosalind; Ronneberger, Olaf; Cornebise, Julien

    2016-01-01

    There are almost two million people in the United Kingdom living with sight loss, including around 360,000 people who are registered as blind or partially sighted. Sight threatening diseases, such as diabetic retinopathy and age related macular degeneration, have contributed to the 40% increase in outpatient attendances in the last decade but are amenable to early detection and monitoring. With early and appropriate intervention, blindness may be prevented in many cases. Ophthalmic imaging provides a way to diagnose and objectively assess the progression of a number of pathologies including neovascular ("wet") age-related macular degeneration (wet AMD) and diabetic retinopathy. Two methods of imaging are commonly used: digital photographs of the fundus (the 'back' of the eye) and Optical Coherence Tomography (OCT), a modality that uses light waves in a similar way to how ultrasound uses sound waves. Changes in population demographics and expectations and the changing pattern of chronic diseases create a rising demand for such imaging. Meanwhile, interrogation of such images is time consuming, costly, and prone to human error. The application of novel analysis methods may provide a solution to these challenges. This research will focus on applying novel machine learning algorithms to automatic analysis of both digital fundus photographs and OCT in Moorfields Eye Hospital NHS Foundation Trust patients. Through analysis of the images used in ophthalmology, along with relevant clinical and demographic information, DeepMind Health will investigate the feasibility of automated grading of digital fundus photographs and OCT and provide novel quantitative measures for specific disease features and for monitoring the therapeutic success.

  15. River suspended sediment modelling using the CART model: A comparative study of machine learning techniques.

    Science.gov (United States)

    Choubin, Bahram; Darabi, Hamid; Rahmati, Omid; Sajedi-Hosseini, Farzaneh; Kløve, Bjørn

    2018-02-15

    Suspended sediment load (SSL) modelling is an important issue in integrated environmental and water resources management, as sediment affects water quality and aquatic habitats. Although classification and regression tree (CART) algorithms have been applied successfully to ecological and geomorphological modelling, their applicability to SSL estimation in rivers has not yet been investigated. In this study, we evaluated use of a CART model to estimate SSL based on hydro-meteorological data. We also compared the accuracy of the CART model with that of the four most commonly used models for time series modelling of SSL, i.e. adaptive neuro-fuzzy inference system (ANFIS), multi-layer perceptron (MLP) neural network and two kernels of support vector machines (RBF-SVM and P-SVM). The models were calibrated using river discharge, stage, rainfall and monthly SSL data for the Kareh-Sang River gauging station in the Haraz watershed in northern Iran, where sediment transport is a considerable issue. In addition, different combinations of input data with various time lags were explored to estimate SSL. The best input combination was identified through trial and error, percent bias (PBIAS), Taylor diagrams and violin plots for each model. For evaluating the capability of the models, different statistics such as Nash-Sutcliffe efficiency (NSE), Kling-Gupta efficiency (KGE) and percent bias (PBIAS) were used. The results showed that the CART model performed best in predicting SSL (NSE=0.77, KGE=0.8, PBIAS<±15), followed by RBF-SVM (NSE=0.68, KGE=0.72, PBIAS<±15). Thus the CART model can be a helpful tool in basins where hydro-meteorological data are readily available. Copyright © 2017 Elsevier B.V. All rights reserved.
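    The evaluation statistics named above have compact definitions, reproduced in the sketch below as a generic helper. Sign conventions for PBIAS vary between authors, and the KGE form shown is the common 2009 formulation; neither is necessarily the exact variant used in the study, and the observed/simulated values are placeholders.

        import numpy as np

        def nse(obs, sim):
            """Nash-Sutcliffe efficiency."""
            obs, sim = np.asarray(obs, float), np.asarray(sim, float)
            return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

        def kge(obs, sim):
            """Kling-Gupta efficiency (2009 form)."""
            obs, sim = np.asarray(obs, float), np.asarray(sim, float)
            r = np.corrcoef(obs, sim)[0, 1]
            alpha = sim.std() / obs.std()      # variability ratio
            beta = sim.mean() / obs.mean()     # bias ratio
            return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

        def pbias(obs, sim):
            """Percent bias (one common sign convention)."""
            obs, sim = np.asarray(obs, float), np.asarray(sim, float)
            return 100.0 * np.sum(obs - sim) / np.sum(obs)

        obs = [120, 85, 60, 200, 150]   # hypothetical monthly SSL observations
        sim = [110, 90, 55, 210, 140]   # hypothetical model predictions
        print(f"NSE={nse(obs, sim):.2f}, KGE={kge(obs, sim):.2f}, PBIAS={pbias(obs, sim):.1f}%")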

  16. Translation of Untranslatable Words — Integration of Lexical Approximation and Phrase-Table Extension Techniques into Statistical Machine Translation

    Science.gov (United States)

    Paul, Michael; Arora, Karunesh; Sumita, Eiichiro

    This paper proposes a method for handling out-of-vocabulary (OOV) words that cannot be translated using conventional phrase-based statistical machine translation (SMT) systems. For a given OOV word, lexical approximation techniques are utilized to identify spelling and inflectional word variants that occur in the training data. All OOV words in the source sentence are then replaced with appropriate word variants found in the training corpus, thus reducing the number of OOV words in the input. Moreover, in order to increase the coverage of such word translations, the SMT translation model is extended by adding new phrase translations for all source language words that do not have a single-word entry in the original phrase-table but only appear in the context of larger phrases. The effectiveness of the proposed methods is investigated for the translation of Hindi to English, Chinese, and Japanese.
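
    The lexical-approximation idea can be sketched roughly as follows: for an OOV token, look up spelling variants (by string similarity) and inflectional variants (here, crudely, by a shared prefix) in the training vocabulary. The vocabulary, thresholds and heuristics are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch of lexical approximation for OOV words: find spelling and
# inflectional variants of an OOV token in the training vocabulary.
import difflib

train_vocab = {"translate", "translated", "translation", "machine", "learning"}

def variants(oov, vocab, n=3, cutoff=0.75):
    vocab = list(vocab)
    # Spelling variants via string similarity (difflib's SequenceMatcher ratio).
    spelling = difflib.get_close_matches(oov, vocab, n=n, cutoff=cutoff)
    # Crude stand-in for inflectional variants: shared prefix of 5 characters.
    inflection = [w for w in vocab if w[:5] == oov[:5] and w != oov]
    return sorted(set(spelling) | set(inflection))

print(variants("translaton", train_vocab))
print(variants("translating", train_vocab))
```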

  17. Application of a support vector machine algorithm to the safety precaution technique of medium-low pressure gas regulators

    Science.gov (United States)

    Hao, Xuejun; An, Xaioran; Wu, Bo; He, Shaoping

    2018-02-01

    In the gas pipeline system, safe operation of a gas regulator determines the stability of the fuel gas supply, yet the safety precaution system for medium-low pressure gas regulators in the Beijing Gas Group is not yet mature; therefore, optimization of the safety precaution technique has important social and economic significance. In this paper, according to the running status of the medium-low pressure gas regulator in the SCADA system, a new method for gas regulator safety precaution based on the support vector machine (SVM) is presented. This method takes the gas regulator outlet pressure data as input variables of the SVM model and the fault categories and degrees as output variables, which is expected to effectively enhance the precaution accuracy as well as save significant manpower and material resources.
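
    A minimal sketch of the idea, assuming simple window statistics of the outlet pressure as inputs and a binary normal/fault label as output; the feature definitions, pressure values and SVM settings below are invented for illustration and are not taken from the SCADA system described above.

```python
# Hedged sketch: an SVM mapping outlet-pressure window statistics to a
# fault label. Features, values and thresholds are fabricated placeholders.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
# Each row: statistics of one outlet-pressure window (mean, std, drift).
normal = np.column_stack([rng.normal(2.0, 0.02, 200),
                          rng.normal(0.01, 0.003, 200),
                          rng.normal(0.0, 0.005, 200)])
fault = np.column_stack([rng.normal(1.7, 0.10, 200),
                         rng.normal(0.05, 0.010, 200),
                         rng.normal(-0.03, 0.010, 200)])
X = np.vstack([normal, fault])
y = np.array([0] * 200 + [1] * 200)   # 0 = normal, 1 = fault

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale"))
clf.fit(X, y)
print(clf.predict([[1.72, 0.06, -0.04]]))  # a window resembling the fault class
```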

  18. Uncertainty quantification and integration of machine learning techniques for predicting acid rock drainage chemistry: a probability bounds approach.

    Science.gov (United States)

    Betrie, Getnet D; Sadiq, Rehan; Morin, Kevin A; Tesfamariam, Solomon

    2014-08-15

    Acid rock drainage (ARD) is a major pollution problem globally that has adversely impacted the environment. Identification and quantification of uncertainties are integral parts of ARD assessment and risk mitigation; however, previous studies on predicting ARD drainage chemistry have not fully addressed issues of uncertainty. In this study, artificial neural networks (ANN) and support vector machines (SVM) are used for the prediction of ARD drainage chemistry, and their predictive uncertainties are quantified using probability bounds analysis. Furthermore, the predictions of ANN and SVM are integrated using four aggregation methods to improve on their individual predictions. The results of this study showed that ANN performed better than SVM in enveloping the observed concentrations. In addition, integrating the predictions of ANN and SVM using the aggregation methods improved on the predictions of the individual techniques. Copyright © 2014 Elsevier B.V. All rights reserved.
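
    The aggregation step can be illustrated with a toy example: predictions from an ANN and an SVM for the same samples are combined by a simple min/max envelope and an equal-weight average. These two rules merely stand in for the four aggregation methods used in the study, and the numbers are fabricated.

```python
# Illustrative aggregation of two models' predictions for a drainage
# chemistry variable; the values below are invented, not study results.
import numpy as np

ann_pred = np.array([4.2, 5.1, 6.0, 7.4])   # e.g. predicted sulphate (mg/L)
svm_pred = np.array([3.9, 5.6, 5.7, 8.0])

envelope_lo = np.minimum(ann_pred, svm_pred)   # lower bound of the envelope
envelope_hi = np.maximum(ann_pred, svm_pred)   # upper bound of the envelope
average = (ann_pred + svm_pred) / 2.0          # naive equal-weight average

for lo, hi, avg in zip(envelope_lo, envelope_hi, average):
    print(f"[{lo:.1f}, {hi:.1f}]  mean={avg:.1f}")
```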

  19. Best Practices for Fatigue Risk Management in Non-Traditional Shiftwork

    Science.gov (United States)

    Flynn-Evans, Erin E.

    2016-01-01

    Fatigue risk management programs provide effective tools to mitigate fatigue among shift workers. Although such programs are effective for typical shiftwork scenarios, where individuals of equal skill level can be divided into shifts to cover 24 hour operations, traditional programs are not sufficient for managing sleep loss among individuals with unique skill sets, in occupations where non-traditional schedules are required. Such operations are prevalent at NASA and in other high stress occupations, including among airline pilots, military personnel, and expeditioners. These types of operations require fatigue risk management programs tailored to the specific requirements of the mission. Without appropriately tailored fatigue risk management, such operations can lead to an elevated risk of operational failure, disintegration of teamwork, and increased risk of accidents and incidents. In order to design schedules for such operations, schedule planners must evaluate the impact of a given operation on circadian misalignment, acute sleep loss, chronic sleep loss and sleep inertia. In addition, individual-level factors such as morningness-eveningness preference and sleep disorders should be considered. After the impact of each of these factors has been identified, scheduling teams can design schedules that meet operational requirements, while also minimizing fatigue.

  20. TECHNOLOGICAL ASPECTS OF PRODUCTION OF THE CANDIED FRUITS FROM NON-TRADITIONAL RAW MATERIAL

    Directory of Open Access Journals (Sweden)

    I. R. Belenkaya

    2016-08-01

    Full Text Available The article analyses the candied fruit market in Ukraine and describes the main technological operations pertaining to processing of non-traditional candied products – celery and parsnip roots. Darkening of the roots surface caused by the enzyme oxidation is one of the problems arising when processing white roots, which leads to worse marketable condition of the product. To prevent darkening, the developed technology provides for soaking raw material in 1% citric acid solution immediately after peeling. To improve the diffusion and osmotic processes and to soften roots before boiling in sugar syrup, the steam blanching has been applied. The constructed Gantt diagram proves that the developed technology can shorten the candied fruit cooking period. The biochemical indicators of the obtained new products have been studied. It was established that the candied fruit possess the appropriate physical and chemical indicators and original organoleptic properties resulting in a demand by consumers. The results of the taste evaluation of the experimental specimen confirmed a high quality of the products.

  1. Correlation of the leucocyte count with traditional and non-traditional components of metabolic syndrome.

    Science.gov (United States)

    Su, Bai-Yu; Tian, Chun-Feng; Gao, Bu-Lang; Tong, Yu-Hong; Zhao, Xu-Hong; Zheng, Ying

    2016-11-01

    To investigate the correlation of the white blood cell (WBC) and its subtype counts with the traditional and non-traditional components of the metabolic syndrome. Between January 2012 and December 2013, 18,222 people were enrolled in this study. The height, weight, body mass index (BMI) and blood pressure were measured, and blood samples were tested for all subjects after an overnight fast. The count of WBC and its subtypes, total cholesterol, triglyceride, high density lipoprotein (HDL), low-density lipoprotein, aminotransferases, fibrinogen, uric acid, and fasting blood glucose were all assessed. Metabolic syndrome was found in 2502 of 18,222 healthy Chinese people (16.41%). The prevalence of metabolic syndrome differed significantly by sex, at 22.61% for men. With increase of the WBC count, BMI, systolic and diastolic pressure, fasting blood glucose, triglyceride, glutamic-oxaloacetic transaminase, glutamic-pyruvic transaminase, glutamyltranspeptidase, blood urea nitrogen, fibrinogen and uric acid all went up significantly, while creatinine remained relatively stable. After adjustment for age, sex, alcohol drinking and education, the metabolic components of obesity, hypertension, diabetes and hyperlipidemia rose significantly with increasing WBC count. Aminotransferases, fibrinogen and uric acid all significantly increased with increased WBC count in a dose-dependent manner. Increased counts of the total WBC and its subtypes are positively associated with the presence of metabolic syndrome.

  2. Results of application of some non-traditional restoration methods at North Bohemian mine locations

    International Nuclear Information System (INIS)

    Cablik Vladimir; Rehor Michal; Lang Tomas; Fecko Peter

    2008-01-01

    The importance of brown coal stems from the growing need for energy in the Czech Republic. It is nowadays the single significant fossil raw material without which our state would become fully dependent on the import of energy sources. More than 70 % of mined brown coal comes from the North Bohemian Basin these days. Open cast brown coal mining has led to large-scale damage to the landscape, which is why reclamation work has recently become important on principle. The difficulty of reclaiming North Bohemian localities lies in the extremely unfavourable properties of the rocks spread over most of the dump bodies. These rocks are mechanically unstable under wind and water erosion and acquire undesirable, acidic characteristics through the influence of SO3 and Al ions released by weathering. Limiting the influence of weathering, amending the chemistry and physical composition of the top rock strata, and defining the required amount of fertilizable rock have been accomplished successfully in recent years as suitable methods have been applied. The presented article includes the characteristics of the important phytotoxic areas and the methodology of their reclamation, mainly based on the application of suitable fertilizable rocks. Some tentatively used non-traditional methods were evaluated, e.g. the application of power plant stabilizer and ash. The paper assesses the success rate of the reclamation methods. The results are documented with the long-term monitoring of physical, mineralogical, chemical and pedological parameters of rocks in the testing areas.

  3. Reaching Non-Traditional and Under-Served Communities through Global Astronomy Month Programs

    Science.gov (United States)

    Simmons, Michael

    2013-01-01

    Global Astronomy Month (GAM), organized each year by Astronomers Without Borders (AWB), has become the world's largest annual celebration of astronomy. Launched as a follow-up to the unprecedented success of the 100 Hours of Astronomy Cornerstone Project of IYA2009, GAM quickly attracted not only traditional partners in astronomy and space science outreach, but also unusual partners from very different fields. GAM's third annual edition, GAM2012, included worldwide programs for the sight-impaired, astronomy in the arts, and other non-traditional programs. The special planetarium program, OPTICKS, combined elements such as Moonbounce (sending images to the Moon and back) and artistic elements in a unique presentation of the heavens. Programs were developed to present the heavens to the sight-impaired as well. The Cosmic Concert, in which a new musical piece is composed each year, combined with background images of celestial objects, and presented during GAM, has become an annual event. Several astronomy themed art video projects were presented online. AWB's Astropoetry Blog held a very successful contest during GAM2012 that attracted more than 70 entries from 17 countries. Students were engaged by participation in special GAM campaigns of the International Asteroid Search Campaign. AWB and GAM have both developed into platforms where innovative programs can develop, and interdisciplinary collaborations can flourish. As AWB's largest program, GAM brings the audience and resources that provide a boost for these new types of programs. Examples, lessons learned, new projects, and plans for the future of AWB and GAM will be presented.

  4. Current breathomics--a review on data pre-processing techniques and machine learning in metabolomics breath analysis.

    Science.gov (United States)

    Smolinska, A; Hauschild, A-Ch; Fijten, R R R; Dallinga, J W; Baumbach, J; van Schooten, F J

    2014-06-01

    We define breathomics as the metabolomics study of exhaled air. It is a strongly emerging metabolomics research field that mainly focuses on health-related volatile organic compounds (VOCs). Since the amount of these compounds varies with health status, breathomics holds great promise to deliver non-invasive diagnostic tools. Thus, the main aim of breathomics is to find patterns of VOCs related to abnormal (for instance inflammatory) metabolic processes occurring in the human body. Recently, analytical methods for measuring VOCs in exhaled air with high resolution and high throughput have been extensively developed. Yet, the application of machine learning methods for fingerprinting VOC profiles in breathomics is still in its infancy. Therefore, in this paper, we describe the current state of the art in data pre-processing and multivariate analysis of breathomics data. We start with the detailed pre-processing pipelines for breathomics data obtained from gas-chromatography mass spectrometry and an ion-mobility spectrometer coupled to multi-capillary columns. The outcome of data pre-processing is a matrix containing the relative abundances of a set of VOCs for a group of patients under different conditions (e.g. disease stage, treatment). Independently of the utilized analytical method, the most important question, 'which VOCs are discriminatory?', remains the same. Answers can be given by several modern machine learning techniques (multivariate statistics) and, therefore, are the focus of this paper. We demonstrate the advantages as well as the drawbacks of such techniques. We aim to help the community to understand how to profit from a particular method. In parallel, we hope to make the community aware of the existing data fusion methods, as yet unresearched in breathomics.
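
    As a rough sketch of the 'which VOCs are discriminatory?' step, one common multivariate approach (not necessarily one of those reviewed above) is to train a classifier on the pre-processed abundance matrix and inspect feature importances; the matrix and labels below are synthetic.

```python
# Sketch of ranking discriminatory VOCs from a pre-processed breathomics
# matrix (rows = subjects, columns = relative VOC abundances); data are fake.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_subjects, n_vocs = 60, 200
X = rng.lognormal(size=(n_subjects, n_vocs))
y = np.array([0] * 30 + [1] * 30)              # e.g. control vs. disease
X[y == 1, :5] *= 1.8                           # make the first 5 VOCs informative

rf = RandomForestClassifier(n_estimators=500, random_state=0)
print("CV accuracy:", cross_val_score(rf, X, y, cv=5).mean())
rf.fit(X, y)
top = np.argsort(rf.feature_importances_)[::-1][:5]
print("Most discriminatory VOC indices:", top)
```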

  5. New technique for iridodialysis correction: Single-knot sewing-machine suture.

    Science.gov (United States)

    Silva, João Luis; Póvoa, João; Lobo, Conceição; Murta, Joaquim

    2016-04-01

    Iridodialysis is a common occurrence after trauma and can be the source of considerable morbidity for the patients. Several options to repair iridodialysis are described in the literature. We present a new technique-a single-thread single-knot suture. This simple approach does not require special material and uses a single thread and a single knot, avoiding the need for using multiple sutures or placing multiple knots. We used this technique in several patients, and it appears to be an effective alternative to iridodialysis repair. None of the authors has a financial or proprietary interest in any material or method mentioned. Copyright © 2016 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  6. A New Profile Learning Model for Recommendation System based on Machine Learning Technique

    Directory of Open Access Journals (Sweden)

    Shereen H. Ali

    2016-03-01

    Full Text Available Recommender systems (RSs) have been used to successfully address the information overload problem by providing personalized and targeted recommendations to the end users. RSs are software tools and techniques providing suggestions for items to be of use to a user; hence, they typically apply techniques and methodologies from Data Mining. The main contribution of this paper is to introduce a new user profile learning model to promote the recommendation accuracy of vertical recommendation systems. The proposed profile learning model employs the vertical classifier that has been used in the multi-classification module of the Intelligent Adaptive Vertical Recommendation (IAVR) system to discover the user’s area of interest, and then build the user’s profile accordingly. Experimental results have proven the effectiveness of the proposed profile learning model, which accordingly will promote the recommendation accuracy.

  7. Machine learning techniques for medical diagnosis of diabetes using iris images.

    Science.gov (United States)

    Samant, Piyush; Agarwal, Ravinder

    2018-04-01

    Complementary and alternative medicine techniques have shown their potential for the treatment and diagnosis of chronic diseases like diabetes, arthritis etc. At the same time, digital image processing techniques for disease diagnosis form a reliable and rapidly growing field in biomedical engineering. The proposed model is an attempt to evaluate the diagnostic validity of an old complementary and alternative medicine technique, iridology, for the diagnosis of type-2 diabetes using soft computing methods. The investigation was performed over a closed group of 338 subjects in total (180 diabetic and 158 non-diabetic). Infra-red images of both eyes were captured simultaneously. The region of interest from the iris image was cropped as the zone corresponding to the position of the pancreas according to the iridology chart. Statistical, texture and discrete wavelet transformation features were extracted from the region of interest. The results show a best classification accuracy of 89.63%, obtained with the RF classifier. Maximum specificity and sensitivity were observed as 0.9687 and 0.988, respectively. The results reveal the effectiveness and diagnostic significance of the proposed model for non-invasive and automatic diabetes diagnosis. Copyright © 2018 Elsevier B.V. All rights reserved.
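
    A hedged sketch of such a pipeline is given below: crop a fixed region of interest from the iris image, compute a few statistical features and classify with a random forest. The crop coordinates, features and images are placeholders; the study's actual feature set (statistical, texture and wavelet features) and classifier settings are not reproduced here.

```python
# Illustrative iris-ROI feature extraction plus random forest classification.
# Images, coordinates and labels below are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def roi_features(iris_img, r0=60, r1=90, c0=120, c1=160):
    """Statistical features of a rectangular region of interest."""
    roi = iris_img[r0:r1, c0:c1].astype(float)
    grad = np.abs(np.diff(roi, axis=1)).mean()        # crude texture measure
    return [roi.mean(), roi.std(), roi.min(), roi.max(), grad]

rng = np.random.default_rng(3)
images = rng.integers(0, 256, size=(100, 200, 300))   # placeholder IR images
labels = rng.integers(0, 2, size=100)                 # 0 = non-diabetic, 1 = diabetic

X = np.array([roi_features(img) for img in images])
clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, labels)
print(clf.predict(X[:5]))
```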

  8. Traditional vs. non-traditional healing for minor and major morbidities in India: uses, cost and quality comparisons.

    Science.gov (United States)

    Singh, Ashish; Madhavan, Harilal

    2015-09-01

    To examine the uses, cost and quality of care of traditional healing for short-term morbidities and major morbidities in India and to compare them with the non-traditional healing. We used data from a nationally representative survey, the India Human Development Survey (2004-2005) and descriptive as well as bivariate analyses for the examination. Use of traditional healing is much less common than use of non-traditional healing in both rural and urban areas and across all socio-economic and demographic characteristics; it is slightly more common in rural than urban areas for short-term morbidities. Use of traditional healing is relatively more frequent for cataract (especially in rural areas), leprosy, asthma, polio, paralysis, epilepsy and mental illnesses; its total cost of care and mean waiting time (in the health facility) are substantially lower than for non-traditional healing. Among patients who use both traditional and non-traditional healing, a relatively higher proportion use traditional healing complemented by non-traditional healing for short-term illnesses, but vice versa for major morbidities. This is the first study which has investigated at the national level the uses, complementarities, cost and quality aspects of traditional and non-traditional healing in India. Traditional healing is more affordable and pro-poor. Relatively higher use of traditional healing in patients from poorly educated as well as poor households and suffering from diseases, such as, epilepsy and mental illnesses; and higher demand for traditional healing for the above diseases highlight the need for research/policy reorientation in India. © 2015 John Wiley & Sons Ltd.

  9. Tool wear monitoring by machine learning techniques and singular spectrum analysis

    Science.gov (United States)

    Kilundu, Bovic; Dehombreux, Pierre; Chiementin, Xavier

    2011-01-01

    This paper explores the use of data mining techniques for tool condition monitoring in metal cutting. Pseudo-local singular spectrum analysis (SSA) is performed on vibration signals measured on the toolholder. This is coupled to a band-pass filter to allow definition and extraction of features which are sensitive to tool wear. These features are defined, in some frequency bands, from sums of Fourier coefficients of reconstructed and residual signals obtained by SSA. This study highlights two important aspects: strong relevance of information in high frequency vibration components and benefits of the combination of SSA and band-pass filtering to get rid of useless components (noise).
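
    The SSA step can be sketched as follows: embed the vibration signal in a trajectory (Hankel) matrix, decompose it by SVD, and split the signal into a reconstructed part (leading components) and a residual, from which band-limited features could then be derived. Window length and component count are arbitrary choices for illustration, not those of the study.

```python
# Minimal singular spectrum analysis (SSA) sketch: trajectory-matrix SVD,
# then reconstruction and residual via diagonal averaging.
import numpy as np

def ssa_split(x, window=50, n_components=5):
    n = len(x)
    k = n - window + 1
    traj = np.column_stack([x[i:i + window] for i in range(k)])  # Hankel matrix
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    lead = (u[:, :n_components] * s[:n_components]) @ vt[:n_components]
    recon = np.zeros(n)
    counts = np.zeros(n)
    for j in range(k):                      # diagonal averaging (hankelisation)
        recon[j:j + window] += lead[:, j]
        counts[j:j + window] += 1
    recon /= counts
    return recon, x - recon                 # reconstructed signal and residual

t = np.linspace(0, 1, 1000)
signal = np.sin(2 * np.pi * 50 * t) + 0.3 * np.random.randn(1000)
recon, resid = ssa_split(signal)
print(recon.shape, resid.shape)
```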

  10. Egg volume prediction using machine vision technique based on pappus theorem and artificial neural network.

    Science.gov (United States)

    Soltani, Mahmoud; Omid, Mahmoud; Alimardani, Reza

    2015-05-01

    Egg size is one of the important properties of an egg that is judged by customers. Accordingly, in egg sorting and grading, the size of eggs must be considered. In this research, a new method of egg volume prediction was proposed that does not require measuring the egg's weight. An accurate and efficient image processing algorithm was designed and implemented for computing the major and minor diameters of eggs. Two methods of egg size modeling were developed. In the first method, a mathematical model was proposed based on the Pappus theorem. In the second method, an Artificial Neural Network (ANN) technique was used to estimate egg volume. The egg volumes determined by these methods were compared statistically with actual values. For the mathematical modeling, the R(2), mean absolute error and maximum absolute error values were obtained as 0.99, 0.59 cm(3) and 1.69 cm(3), respectively. To determine the best ANN, R(2) test and RMSEtest were used as selection criteria. The best ANN topology was 2-28-1, which had an R(2) test and RMSEtest of 0.992 and 0.66, respectively. After system calibration, the proposed models were evaluated. The results indicated that the mathematical modeling yielded more satisfying results, so this technique was selected for egg size determination.
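
    A minimal sketch of a Pappus-type volume estimate is shown below: given a profile r(x) extracted from the image, the volume of the solid of revolution follows from Pappus's centroid theorem. The elliptical profile and the diameters are invented for the example; this is not the authors' exact model.

```python
# Pappus-theorem volume of a solid of revolution from a profile r(x).
import numpy as np

def trapz(y, x):
    """Simple trapezoidal integration (kept local, independent of numpy version)."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def pappus_volume(x, r):
    """Pappus's centroid theorem: V = 2*pi*d*A, where A is the area of the
    half cross-section bounded by r(x) and d is the distance of its centroid
    from the rotation (major) axis."""
    area = trapz(r, x)
    centroid = trapz(0.5 * r * r, x) / area
    return 2.0 * np.pi * centroid * area    # equivalent to pi * integral of r^2 dx

# Synthetic profile: an ellipse with major diameter 6.0 cm, minor diameter 4.5 cm.
L, W = 6.0, 4.5
x = np.linspace(-L / 2, L / 2, 2001)
r = (W / 2) * np.sqrt(1 - (2 * x / L) ** 2)

print(f"Pappus estimate:  {pappus_volume(x, r):.2f} cm^3")
print(f"Spheroid formula: {np.pi / 6 * L * W ** 2:.2f} cm^3")  # (pi/6)*L*W^2 check
```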

  11. Applying machine learning techniques to the identification of late-onset hypogonadism in elderly men.

    Science.gov (United States)

    Lu, Ti; Hu, Ya-Han; Tsai, Chih-Fong; Liu, Shih-Ping; Chen, Pei-Ling

    2016-01-01

    In the diagnosis of late-onset hypogonadism (LOH), the Androgen Deficiency in the Aging Male (ADAM) questionnaire or Aging Males' Symptoms (AMS) scale can be used to assess related symptoms. Subsequently, blood tests are used to measure serum testosterone levels. However, results obtained using ADAM and AMS have revealed no significant correlations between ADAM and AMS scores and LOH, and the rate of misclassification is high. Recently, many studies have reported significant associations between clinical conditions such as the metabolic syndrome, obesity, lower urinary tract symptoms, and LOH. In this study, we sampled 772 clinical cases of men who completed both a health checkup and two questionnaires (ADAM and AMS). The data were obtained from the largest medical center in Taiwan. Two well-known classification techniques, the decision tree (DT) and logistic regression, were used to construct LOH prediction models on the basis of the aforementioned features. The results indicate that although the sensitivity of ADAM is the highest (0.878), it has the lowest specificity (0.099), which implies that ADAM overestimates LOH occurrence. In addition, DT combined with the AdaBoost technique (AdaBoost DT) has the second highest sensitivity (0.861) and specificity (0.842), resulting in having the best accuracy (0.851) among all classifiers. AdaBoost DT can provide robust predictions that will aid clinical decisions and can help medical staff in accurately assessing the possibilities of LOH occurrence.
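
    The "AdaBoost DT" classifier can be sketched with scikit-learn as below (scikit-learn >= 1.2 is assumed for the `estimator` keyword); the clinical features and the LOH label are synthetic stand-ins for the study's health-checkup and questionnaire data.

```python
# Boosted decision tree ("AdaBoost DT") for a binary LOH label; data are fake.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import recall_score

rng = np.random.default_rng(7)
n = 772
X = np.column_stack([rng.normal(55, 10, n),     # age
                     rng.normal(26, 4, n),      # BMI
                     rng.normal(15, 8, n)])     # e.g. AMS score
y = (0.05 * X[:, 0] + 0.1 * X[:, 1] + 0.02 * X[:, 2]
     + rng.normal(0, 1, n) > 6.5).astype(int)   # synthetic LOH label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
clf = AdaBoostClassifier(estimator=DecisionTreeClassifier(max_depth=3),
                         n_estimators=200, random_state=0).fit(X_tr, y_tr)
pred = clf.predict(X_te)
print("sensitivity:", recall_score(y_te, pred),
      "specificity:", recall_score(y_te, pred, pos_label=0))
```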

  12. Applying machine learning techniques for forecasting flexibility of virtual power plants

    DEFF Research Database (Denmark)

    MacDougall, Pamela; Kosek, Anna Magdalena; Bindner, Henrik W.

    2016-01-01

    Previous and existing evaluations of available flexibility using small device demand response have typically been done with detailed information of end-user systems. With these large numbers, having lower level information has both privacy and computational limitations. We propose a black box...... hidden layer artificial neural network (ANN). Both techniques are used to model a relationship between the aggregator portfolio state and requested ramp power to the longevity of the delivered flexibility. Using validated individual household models, a smart controlled aggregated virtual power plant...... is simulated. A hierarchical market-based supply-demand matching control mechanism is used to steer the heating devices in the virtual power plant. For both the training and validation set of clusters, a random number of households, between 200 and 2000, is generated with day ahead profile scaled accordingly...

  13. A Geoscience Workforce Model for Non-Geoscience and Non-Traditional STEM Students

    Science.gov (United States)

    Liou-Mark, J.; Blake, R.; Norouzi, H.; Vladutescu, D. V.; Yuen-Lau, L.

    2016-12-01

    The Summit on the Future of Geoscience Undergraduate Education has recently identified key professional skills, competencies, and conceptual understanding necessary in the development of undergraduate geoscience students (American Geosciences Institute, 2015). Through a comprehensive study involving a diverse range of the geoscience academic and employer community, the following professional scientist skills were rated highly important: 1) critical thinking/problem solving skills; 2) effective communication; 3) ability to access and integrate information; 4) strong quantitative skills; and 5) ability to work in interdisciplinary/cross cultural teams. Based on the findings of the study above, the New York City College of Technology (City Tech) has created a one-year intensive training program that focusses on the development of technical and non-technical geoscience skills for non-geoscience, non-traditional STEM students. Although City Tech does not offer geoscience degrees, the primary goal of the program is to create an unconventional pathway for under-represented minority STEM students to enter, participate, and compete in the geoscience workforce. The selected cohort of STEM students engage in year-round activities that include a geoscience course, enrichment training workshops, networking sessions, leadership development, research experiences, and summer internships at federal, local, and private geoscience facilities. These carefully designed programmatic elements provide both the geoscience knowledge and the non-technical professional skills that are essential for the geoscience workforce. Moreover, by executing this alternate, robust geoscience workforce model that attracts and prepares underrepresented minorities for geoscience careers, this unique pathway opens another corridor that helps to ameliorate the dire plight of the geoscience workforce shortage. This project is supported by NSF IUSE GEOPATH Grant # 1540721.

  14. Traditional and non-traditional uses of Mitragynine (Kratom): A survey of the literature.

    Science.gov (United States)

    Singh, Darshan; Narayanan, Suresh; Vicknasingam, Balasingam

    2016-09-01

    The objective of the paper was to highlight the differences in the traditional and non-traditional users of kratom in the South East Asian and Western contexts. A literature survey of published kratom studies among humans was conducted. Forty published studies relevant to the objective were reviewed. Apart from the differences in the sources of supply, patterns of use and social acceptability of kratom within these two regions, the most interesting finding is its evolution to a recreational drug in both settings and the severity of the adverse effects of kratom use reported in the West. While several cases of toxicity and death have emerged in the West, such reports have been non-existent in South East Asia where kratom has had a longer history of use. We highlight the possible reasons for this as discussed in the literature. More importantly, it should be borne in mind that the individual clinical case-reports emerging from the West that link kratom use to adverse reactions or fatalities frequently pertained to kratom used together with other substances. Therefore, there is a danger of these reports being used to strengthen the case for legal sanction against kratom. This would be unfortunate since the experiences from South East Asia suggest considerable potential for therapeutic use among people who use drugs. Despite its addictive properties, reported side-effects and its tendency to be used as a recreational drug, more scientific clinical human studies are necessary to determine its potential therapeutic value. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. The C1q family of proteins: insights into the emerging non-traditional functions

    Directory of Open Access Journals (Sweden)

    Berhane Ghebrehiwet

    2012-04-01

    Full Text Available Research conducted over the past 20 years has helped us unravel not only the hidden structural and functional subtleties of human C1q, but also has catapulted the molecule from a mere recognition unit of the classical pathway to a well-recognized molecular sensor of damage-modified self or non-self antigens. Thus, C1q is involved in a rapidly expanding list of pathological disorders—including autoimmunity, trophoblast migration, preeclampsia and cancer. The results of two recent reports are provided to underscore the critical role C1q plays in health and disease. First is the observation by Singh and colleagues showing that pregnant C1q-/- mice recapitulate the key features of human preeclampsia that correlate with increased fetal death. Treatment of the C1q-/- mice with pravastatin restored trophoblast invasiveness, placental blood flow, and angiogenic balance and, thus, prevented the onset of preeclampsia. Second is the report by Hong et al., which showed that C1q can induce apoptosis of prostate cancer cells by activating the tumor suppressor molecule WW-domain containing oxidoreductase (WWOX or WOX1) and destabilizing cell adhesion. Downregulation of C1q, on the other hand, enhanced prostate hyperplasia and cancer formation due to failure of WOX1 activation. Recent evidence also shows that C1q belongs to a structurally and functionally related TNFα-like family of proteins that may have arisen from a common ancestral gene. Therefore C1q not only shares the diverse functions with the TNF family of proteins, but also explains why C1q has retained some of its ancestral cytokine-like activities. This review is intended to highlight some of the structural and functional aspects of C1q by underscoring the growing list of its non-traditional functions.

  16. Improving the vector auto regression technique for time-series link prediction by using support vector machine

    Directory of Open Access Journals (Sweden)

    Co Jan Miles

    2016-01-01

    Full Text Available Predicting links between the nodes of a graph has become an important Data Mining task because of its direct applications to biology, social networking, communication surveillance, and other domains. Recent literature in time-series link prediction has shown that the Vector Auto Regression (VAR) technique is one of the most accurate for this problem. In this study, we apply the Support Vector Machine (SVM) to improve the VAR technique that uses an unweighted adjacency matrix along with 5 matrices: Common Neighbor (CN), Adamic-Adar (AA), Jaccard’s Coefficient (JC), Preferential Attachment (PA), and Resource Allocation Index (RA). A DBLP dataset covering the years from 2003 until 2013 was collected and transformed into time-sliced graph representations. The appropriate matrices were computed from these graphs, mapped to the feature space, and then used to build baseline VAR models with lag of 2 and some corresponding SVM classifiers. Using the Area Under the Receiver Operating Characteristic Curve (AUC-ROC) as the main fitness metric, the average result of 82.04% for the VAR was improved to 84.78% with SVM. Additional experiments to handle the highly imbalanced dataset by oversampling with SMOTE and undersampling with K-means clusters, however, did not improve the average AUC-ROC of the baseline SVM.
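
    The five topological indices listed above can be computed directly with networkx, as in the sketch below; the toy graph and candidate pairs are illustrative, and in the study such per-pair scores would be assembled into the time-sliced matrices that feed the VAR baseline and the SVM.

```python
# Computing CN, AA, JC, PA and RA scores for candidate node pairs.
import networkx as nx

G = nx.Graph()
G.add_edges_from([("a", "b"), ("a", "c"), ("b", "c"), ("c", "d"), ("d", "e")])
pairs = [("a", "d"), ("b", "e")]                 # unlinked candidate pairs

cn = {(u, v): len(list(nx.common_neighbors(G, u, v))) for u, v in pairs}
aa = {(u, v): s for u, v, s in nx.adamic_adar_index(G, pairs)}
jc = {(u, v): s for u, v, s in nx.jaccard_coefficient(G, pairs)}
pa = {(u, v): s for u, v, s in nx.preferential_attachment(G, pairs)}
ra = {(u, v): s for u, v, s in nx.resource_allocation_index(G, pairs)}

for p in pairs:
    print(p, "CN=", cn[p], "AA=%.2f" % aa[p], "JC=%.2f" % jc[p],
          "PA=", pa[p], "RA=%.2f" % ra[p])
```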

  17. Hair analysis by means of laser induced breakdown spectroscopy technique and support vector machine model for diagnosing addiction

    Directory of Open Access Journals (Sweden)

    M Vahid Dastjerdi

    2018-02-01

    Full Text Available Along with the development of laboratory methods for diagnosing addiction, concealment methods, either physical or chemical, for creating false results have also progressed. In this research, based on the Laser Induced Breakdown Spectroscopy (LIBS) technique and the analysis of hair from addicted and normal people, we propose a new method to overcome problems in conventional methods and reduce the possibility of cheating in the process of diagnosing addiction. For this purpose, we first sampled hair from 17 normal and addicted people and recorded 5 spectra for each sample, 170 spectra overall. After analyzing the recorded LIBS spectra and detecting the atomic and ionic lines as well as molecular bands, the relative intensities of the emission lines of Aluminum to Calcium (Al/Ca) and Aluminum to Sodium (Al/Na) were selected as the input variables for the Support Vector Machine (SVM) model. Radial basis and polynomial kernel functions and a linear function were chosen for classifying the data in the SVM model. The results of this research showed that by combining the LIBS technique and SVM one can distinguish addicted persons with a precision of 100%. Because of several advantages of LIBS, such as high-speed analysis and portability, this method can be used individually or together with available methods as an automatic method for diagnosing addiction through hair analysis.
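
    A hedged sketch of the classification step, assuming the two line-intensity ratios (Al/Ca and Al/Na) as features and an RBF-kernel SVM; the ratio values are fabricated and the SVM settings are not those of the study.

```python
# SVM on two LIBS intensity-ratio features; data are synthetic placeholders.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(11)
normal = np.column_stack([rng.normal(0.8, 0.1, 85), rng.normal(1.1, 0.15, 85)])
addicted = np.column_stack([rng.normal(1.3, 0.1, 85), rng.normal(1.7, 0.15, 85)])
X = np.vstack([normal, addicted])            # columns: Al/Ca, Al/Na
y = np.array([0] * 85 + [1] * 85)

svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=5, gamma="scale"))
print("5-fold accuracy:", cross_val_score(svm, X, y, cv=5).mean())
```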

  18. GAPscreener: An automatic tool for screening human genetic association literature in PubMed using the support vector machine technique

    Directory of Open Access Journals (Sweden)

    Khoury Muin J

    2008-04-01

    Full Text Available Background: Synthesis of data from published human genetic association studies is a critical step in the translation of human genome discoveries into health applications. Although genetic association studies account for a substantial proportion of the abstracts in PubMed, identifying them with standard queries is not always accurate or efficient. Further automating the literature-screening process can reduce the burden of a labor-intensive and time-consuming traditional literature search. The Support Vector Machine (SVM), a well-established machine learning technique, has been successful in classifying text, including biomedical literature. The GAPscreener, a free SVM-based software tool, can be used to assist in screening PubMed abstracts for human genetic association studies. Results: The data source for this research was the HuGE Navigator, formerly known as the HuGE Pub Lit database. Weighted SVM feature selection based on a keyword list obtained by the two-way z score method demonstrated the best screening performance, achieving 97.5% recall, 98.3% specificity and 31.9% precision in performance testing. Compared with the traditional screening process based on a complex PubMed query, the SVM tool reduced by about 90% the number of abstracts requiring individual review by the database curator. The tool also ascertained 47 articles that were missed by the traditional literature screening process during the 4-week test period. We examined the literature on genetic associations with preterm birth as an example. Compared with the traditional, manual process, the GAPscreener both reduced effort and improved accuracy. Conclusion: GAPscreener is the first free SVM-based application available for screening the human genetic association literature in PubMed with high recall and specificity. The user-friendly graphical user interface makes this a practical, stand-alone application. The software can be downloaded at no charge.
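
    The SVM screening idea can be illustrated (this is not the actual GAPscreener implementation) with a TF-IDF representation of abstracts and a linear SVM that flags likely genetic association studies; the toy abstracts and labels are invented.

```python
# Minimal SVM-based abstract screening sketch with TF-IDF features.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

abstracts = [
    "polymorphism rs123 associated with increased risk of preterm birth",
    "genotype frequencies of the APOE variant in a case-control cohort",
    "surgical outcomes of laparoscopic mesh fixation in ventral hernia",
    "a randomized trial of exercise therapy for chronic low back pain",
]
labels = [1, 1, 0, 0]          # 1 = human genetic association study

screener = make_pipeline(TfidfVectorizer(ngram_range=(1, 2), min_df=1),
                         LinearSVC())
screener.fit(abstracts, labels)
print(screener.predict(["association of a SNP genotype with asthma risk"]))
```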

  19. Evaluating the Tool Wear Rate in Ultrasonic Machining of Titanium using Design of Experiments Approach

    OpenAIRE

    Jatinder Kumar; Vinod Kumar

    2011-01-01

    Ultrasonic machining (USM) is a non-traditional machining process being widely used for commercial machining of brittle and fragile materials such as glass, ceramics and semiconductor materials. However, USM could be a viable alternative for machining a tough material such as titanium; and this aspect needs to be explored through experimental research. This investigation is focused on exploring the use of ultrasonic machining for commercial machining of pure titanium (AST...

  20. Sewing machine technique for laparoscopic mesh fixation in intra-peritoneal on-lay mesh.

    Science.gov (United States)

    Dastoor, Khojasteh Sam; Balsara, Kaiomarz P; Gazi, Asif Y

    2018-01-01

    Mesh fixation in laparoscopic ventral hernia repair is accomplished using tacks or tacks with transfascial sutures. This is a painful operation, and the pain is believed to be due mainly to the transfascial sutures. We describe a method of transfascial suturing which fixes the mesh securely and probably causes less pain. Up to six ports may be necessary, three on each side. A suitable-sized mesh is used and fixed with tacks all around. A 20G spinal needle is passed from the skin through one corner of the mesh. A 0 prolene suture is passed through into the peritoneum. With the prolene within, the needle is withdrawn above the anterior rectus sheath and passed again at an angle into the abdomen just outside the mesh. A loop of prolene is thus created which is tied under vision using intra-corporeal knotting. This method gives a secure mesh fixation and causes less pain than conventional methods. This technique is easy to learn but needs expertise in intra-corporeal knotting.

  1. Comparison of machine learning techniques to predict all-cause mortality using fitness data: the Henry Ford ExercIse Testing (FIT) project.

    Science.gov (United States)

    Sakr, Sherif; Elshawi, Radwa; Ahmed, Amjad M; Qureshi, Waqas T; Brawner, Clinton A; Keteyian, Steven J; Blaha, Michael J; Al-Mallah, Mouaz H

    2017-12-19

    Prior studies have demonstrated that cardiorespiratory fitness (CRF) is a strong marker of cardiovascular health. Machine learning (ML) can enhance the prediction of outcomes through classification techniques that classify the data into predetermined categories. The aim of this study is to present an evaluation and comparison of how machine learning techniques can be applied on medical records of cardiorespiratory fitness and how the various techniques differ in terms of capabilities of predicting medical outcomes (e.g. mortality). We use data of 34,212 patients free of known coronary artery disease or heart failure who underwent clinician-referred exercise treadmill stress testing at Henry Ford Health Systems between 1991 and 2009 and had a complete 10-year follow-up. Seven machine learning classification techniques were evaluated: Decision Tree (DT), Support Vector Machine (SVM), Artificial Neural Networks (ANN), Naïve Bayesian Classifier (BC), Bayesian Network (BN), K-Nearest Neighbor (KNN) and Random Forest (RF). To handle the imbalanced dataset, the Synthetic Minority Over-Sampling Technique (SMOTE) was used. Two sets of experiments were conducted with and without the SMOTE sampling technique. On average over different evaluation metrics, the SVM classifier showed the lowest performance while other models like BN, BC and DT performed better. The RF classifier showed the best performance (AUC = 0.97) among all models trained using the SMOTE sampling. The results show that various ML techniques can vary significantly in terms of their performance for the different evaluation metrics. It is also not necessarily the case that a more complex ML model achieves higher prediction accuracy. The prediction performance of all models trained with SMOTE is much better than the performance of models trained without SMOTE. The study shows the potential of machine learning methods for predicting all-cause mortality using cardiorespiratory fitness data.
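
    The SMOTE-plus-classifier workflow can be sketched as below on a synthetic imbalanced dataset; the features, outcome and model settings are placeholders, since the FIT cohort data are not reproduced here. The `imbalanced-learn` package provides the SMOTE implementation.

```python
# SMOTE oversampling of the training split followed by a random forest,
# evaluated with ROC AUC on held-out data; all data are synthetic.
import numpy as np
from imblearn.over_sampling import SMOTE
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)
n = 5000
X = rng.normal(size=(n, 6))                      # e.g. age, METs, HR, BP, ...
y = (X[:, 1] - 0.5 * X[:, 0] + rng.normal(0, 1, n) > 2.5).astype(int)  # rare outcome

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
X_res, y_res = SMOTE(random_state=0).fit_resample(X_tr, y_tr)  # balance classes

rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_res, y_res)
auc = roc_auc_score(y_te, rf.predict_proba(X_te)[:, 1])
print(f"AUC on held-out data: {auc:.2f}")
```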

  2. An exploration of on-line access by non-traditional students in higher education: a case study.

    Science.gov (United States)

    Dearnley, Chris; Dunn, Ginny; Watson, Sue

    2006-07-01

    The nature of Higher Education (HE) has seen many changes throughout the last decade. The agenda for widening participation in HE has led to an increase in the number of students with a broader range of educational backgrounds. At the same time there has been a surge in the development of digitalisation and the convergence of computing and telecommunications technologies available for use in education. This paper discusses the outcomes of a case study, conducted in a School of Health Studies within a northern English University, which identified the extent to which 'non-traditional' students access on-line learning facilities, such as virtual learning environments and library networks, and what factors enhanced or formed barriers to access. 'Non-traditional' students, for the purpose of this study, were defined as mature students who were returning to higher education after a considerable break. The outcomes indicated that skill deficit is a major obstacle for many 'non-traditional' students. The paper explores this issue in depth and suggests potential ways forward for the delivery of technology supported learning for 'non-traditional' students in Higher Education.

  3. Connecting Bourdieu, Winnicott, and Honneth: Understanding the Experiences of Non-Traditional Learners through an Interdisciplinary Lens

    Science.gov (United States)

    West, Linden; Fleming, Ted; Finnegan, Fergal

    2013-01-01

    This paper connects Bourdieu's concepts of habitus, dispositions and capital with a psychosocial analysis of how Winnicott's psychoanalysis and Honneth's recognition theory can be of importance in understanding how and why non-traditional students remain in higher education. Understanding power relations in an interdisciplinary way makes…

  4. Motivations for Participation in Higher Education: Narratives of Non-Traditional Students at Makerere University in Uganda

    Science.gov (United States)

    Tumuheki, Peace Buhwamatsiko; Zeelen, Jacques; Openjuru, George Ladaah

    2016-01-01

    The objective of this qualitative study was to establish motivations for participation of non-traditional students (NTS) in university education. The findings are drawn from empirical data collected from 15 unstructured in-depth interviews with NTS of the School of Computing and Informatics Technology at Makerere University, and analysed with the…

  5. Motivations for participation in higher education: narratives of non-traditional students at Makerere University in Uganda

    NARCIS (Netherlands)

    Tumuheki, Peace; Zeelen, Jacques; Openjuru, George L.

    2016-01-01

    The objective of this qualitative study was to establish motivations for participation of non-traditional students (NTS) in university education. The findings are drawn from empirical data collected from 15 unstructured in-depth interviews with NTS of the School of Computing and Informatics

  6. Student learning or the student experience: the shift from traditional to non-traditional faculty in higher education

    Directory of Open Access Journals (Sweden)

    Carlos Tasso Eira de Aquino

    2016-10-01

    Full Text Available Trends in higher education indicate transformations from teachers to facilitators, mentors, or coaches. New classroom management requires diverse teaching methods for a changing population. Non-traditional students require non-traditional faculty. Higher education operates similarly to a traditional corporation, but competes for students, faculty, and funding to sustain daily operations and improve academic ranking among peers (Pak, 2013). This growing phenomenon suggests the need for faculty to transform the existing educational culture, ensuring the ability to attract and retain students. Transitions from student learning to the student experience and increasing student satisfaction scores are influencing facilitation in the classroom. On-line facilitation methods are transforming to include teamwork, interactive tutorials, media, and extending beyond group discussion. Faculty should be required to provide more facilitation, coaching, and mentoring, with the shifting roles resulting in transitions from traditional faculty to faculty-coach and faculty-mentor. The non-traditional adult student may require a more hands-on guidance approach and may not be as self-directed as the adult learning theory proposes. This topic is important to individuals that support creation of new knowledge related to non-traditional adult learning models.

  7. Assessing Changes in Medical Student Attitudes toward Non-Traditional Human Sexual Behaviors Using a Confidential Audience Response System

    Science.gov (United States)

    Tucker, Phebe; Candler, Chris; Hamm, Robert M.; Smith, E. Michael; Hudson, Joseph C.

    2010-01-01

    Medical students encountering patients with unfamiliar, unconventional sexual practices may have attitudes that can affect open communication during sexual history-taking. We measured changes in first-year US medical student attitudes toward 22 non-traditional sexual behaviors before and after exposure to human sexuality instruction. An…

  8. How the Girl Choosing Technology Became the Symbol of the Non-Traditional Pupil's Choice in Sweden

    Science.gov (United States)

    Hedlin, Maria

    2011-01-01

    The purpose of this article is to elucidate how the girl who chooses technology came to be the symbol of the non-traditional pupil's choice in Sweden. In the early 1960s it was hoped that girls would enter workshop training and then commit themselves to engineering mechanics jobs at a time when Sweden was characterised by economic growth which was…

  9. "Some People Might Say I'm Thriving but?…": Non-Traditional Students' Experiences of University

    Science.gov (United States)

    Meuleman, Anna-Maria; Garrett, Robyne; Wrench, Alison; King, Sharron

    2015-01-01

    The expansion of neo-liberal policies framing higher education has contributed to an increase in participation rates of students from non-traditional backgrounds. While an increase of a wider range of students might be seen as contributing to a more just and equitable higher education system, research has shown that broadening entry points does…

  10. Work-School Conflict and Coping Strategies: Perceptions of Taiwanese Non-Traditional Students in Technological and Vocational Colleges

    Science.gov (United States)

    Chen, Ching-Yi; Fischer, Jerome; Biller, Ernie

    2009-01-01

    This study was designed to measure non-traditional students' perceptions of role conflicts between work and school and their subsequent coping strategies, and to determine factors relevant to both role conflict and coping. A survey was developed and implemented to investigate the continuing education issues. Results were based on 485…

  11. CLASSIFICATION AND RANKING OF FERMI LAT GAMMA-RAY SOURCES FROM THE 3FGL CATALOG USING MACHINE LEARNING TECHNIQUES

    Energy Technology Data Exchange (ETDEWEB)

    Saz Parkinson, P. M. [Department of Physics, The University of Hong Kong, Pokfulam Road, Hong Kong (China); Xu, H.; Yu, P. L. H. [Department of Statistics and Actuarial Science, The University of Hong Kong, Pokfulam Road, Hong Kong (China); Salvetti, D.; Marelli, M. [INAF—Istituto di Astrofisica Spaziale e Fisica Cosmica Milano, via E. Bassini 15, I-20133, Milano (Italy); Falcone, A. D. [Department of Astronomy and Astrophysics, The Pennsylvania State University, University Park, PA 16802 (United States)

    2016-03-20

    We apply a number of statistical and machine learning techniques to classify and rank gamma-ray sources from the Third Fermi Large Area Telescope Source Catalog (3FGL), according to their likelihood of falling into the two major classes of gamma-ray emitters: pulsars (PSR) or active galactic nuclei (AGNs). Using 1904 3FGL sources that have been identified/associated with AGNs (1738) and PSR (166), we train (using 70% of our sample) and test (using 30%) our algorithms and find that the best overall accuracy (>96%) is obtained with the Random Forest (RF) technique, while using a logistic regression (LR) algorithm results in only marginally lower accuracy. We apply the same techniques on a subsample of 142 known gamma-ray pulsars to classify them into two major subcategories: young (YNG) and millisecond pulsars (MSP). Once more, the RF algorithm has the best overall accuracy (∼90%), while a boosted LR analysis comes a close second. We apply our two best models (RF and LR) to the entire 3FGL catalog, providing predictions on the likely nature of unassociated sources, including the likely type of pulsar (YNG or MSP). We also use our predictions to shed light on the possible nature of some gamma-ray sources with known associations (e.g., binaries, supernova remnants/pulsar wind nebulae). Finally, we provide a list of plausible X-ray counterparts for some pulsar candidates, obtained using Swift, Chandra, and XMM. The results of our study will be of interest both for in-depth follow-up searches (e.g., pulsar) at various wavelengths and for broader population studies.
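
    A rough sketch of the PSR-versus-AGN step is shown below: a random forest trained on a 70/30 split of associated sources. The three features and the generated values are placeholders standing in for the 3FGL catalog columns actually used.

```python
# Random forest PSR-vs-AGN classification on a 70/30 split; features are fake.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(3)
n_agn, n_psr = 1738, 166
# Placeholder features, e.g. spectral curvature, variability index, flux.
agn = np.column_stack([rng.normal(0.5, 0.3, n_agn),
                       rng.normal(80, 40, n_agn),
                       rng.lognormal(0, 1, n_agn)])
psr = np.column_stack([rng.normal(1.5, 0.3, n_psr),
                       rng.normal(45, 10, n_psr),
                       rng.lognormal(0, 1, n_psr)])
X = np.vstack([agn, psr])
y = np.array(["AGN"] * n_agn + ["PSR"] * n_psr)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          stratify=y, random_state=0)
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)
print("test accuracy:", accuracy_score(y_te, rf.predict(X_te)))
```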

  12. CLASSIFICATION AND RANKING OF FERMI LAT GAMMA-RAY SOURCES FROM THE 3FGL CATALOG USING MACHINE LEARNING TECHNIQUES

    International Nuclear Information System (INIS)

    Saz Parkinson, P. M.; Xu, H.; Yu, P. L. H.; Salvetti, D.; Marelli, M.; Falcone, A. D.

    2016-01-01

    We apply a number of statistical and machine learning techniques to classify and rank gamma-ray sources from the Third Fermi Large Area Telescope Source Catalog (3FGL), according to their likelihood of falling into the two major classes of gamma-ray emitters: pulsars (PSR) or active galactic nuclei (AGNs). Using 1904 3FGL sources that have been identified/associated with AGNs (1738) and PSR (166), we train (using 70% of our sample) and test (using 30%) our algorithms and find that the best overall accuracy (>96%) is obtained with the Random Forest (RF) technique, while using a logistic regression (LR) algorithm results in only marginally lower accuracy. We apply the same techniques on a subsample of 142 known gamma-ray pulsars to classify them into two major subcategories: young (YNG) and millisecond pulsars (MSP). Once more, the RF algorithm has the best overall accuracy (∼90%), while a boosted LR analysis comes a close second. We apply our two best models (RF and LR) to the entire 3FGL catalog, providing predictions on the likely nature of unassociated sources, including the likely type of pulsar (YNG or MSP). We also use our predictions to shed light on the possible nature of some gamma-ray sources with known associations (e.g., binaries, supernova remnants/pulsar wind nebulae). Finally, we provide a list of plausible X-ray counterparts for some pulsar candidates, obtained using Swift, Chandra, and XMM. The results of our study will be of interest both for in-depth follow-up searches (e.g., pulsar) at various wavelengths and for broader population studies

  13. Glycaemic indices and non-traditional biochemical cardiovascular disease markers in a diabetic population in Nigeria

    International Nuclear Information System (INIS)

    Okeoghene, O.A.; Azenabor, A.

    2011-01-01

    Objective: To determine the frequency of hyperfibrinogenaemia, elevated C-reactive protein, hyperuricaemia and elevated lipoprotein A in a clinic population of patients with type 2 Diabetes mellitus (DM) compared with healthy controls; and determine the interrelationship between fasting plasma glucose levels and indices of long-term glycaemic control (fructosamine and glycosylated haemoglobin) in DM. Study Design: Cross-sectional, analytical study. Place and Duration of Study: The study was conducted at the Lagos State University Teaching Hospital, Ikeja, from April to June 2009. Methodology: A total of 200 patients with type 2 DM and 100 age- and gender-matched healthy controls were recruited for the study. Glycaemic control was assessed using fasting blood glucose, fructosamine and glycosylated haemoglobin levels. The non-traditional risk factors studied included C-reactive protein (CRP), Lipoprotein a (Lpa), serum uric acid (SUA), microalbuminuria and fibrinogen. Mann-Whitney, chi-square and Pearson's correlation tests were used for analysis as applicable. Results: Hyperfibrinogenaemia, elevated CRP, Lpa, microalbuminuria and hyperuricaemia were present in 3.5%, 65%, 12%, 6% and 57% respectively in type 2 DM. The mean levels of these CV risk factors were significantly higher in subjects with type 2 DM than in the control subjects. There was a positive and significant correlation between HbA1c and FBS (r=0.46, p=0.0001) and HbA1c and fructosamine (r=0.49, p=0.0001). All studied CVS risk factors were related to indices of glycaemic control, which were found to be interrelated. Fasting blood glucose significantly correlated with both HbA1c and fructosamine, but HbA1c showed better correlation to FPG than fructosamine (r=0.51 vs. 0.32). Conclusion: Glycosylated haemoglobin and fasting plasma glucose, but not fructosamine, are significantly associated with microalbuminuria, fibrinogen, SUA and CRP in type 2 DM. HbA1c was found to be better than fructosamine in

  14. A case study of non-traditional students re-entry into college physics and engineering

    Science.gov (United States)

    Langton, Stewart Gordon

    Two groups of students in introductory physics courses of an Access Program for engineering technologies were the subjects of this study. Students with a wide range of academic histories and abilities were enrolled in the program; many of the students were re-entry and academically unprepared for post-secondary education. Five years of historical data were evaluated to use as a benchmark for revised instruction. Data were gathered to describe the pre-course academic state of the students and their academic progress during two physics courses. Additional information was used to search for factors that might constrain academic success and as feedback for the instructional methods. The data were interpreted to regulate constructivist design features for the physics courses. The Engineering Technology Access Program was introduced to meet the demand from non-traditional students who sought admission to two-year engineering technology programs but did not meet normal academic requirements. The duration of the Access Program was two terms for electronic and computer engineering students and three terms for civil and mechanical engineering students. The sequence of mathematics and physics courses was different for the two groups. The Civil/Mechanical students enrolled in their first mathematics course before undertaking their first physics course. The first mathematics and physics courses for the Electronics students were concurrent. Academic success in the two groups was affected by this difference. Over a five-year period the success rate of students graduating with a technology diploma was approximately twenty-five percent. Results from this study indicate that it was possible to reduce the very high attrition in the combined Access/Technology Programs. While the success rate for the Electronics students increased to 38%, the rate for the Civil/Mechanical students increased dramatically to 77%. It is likely that several factors, related to the extra term in the Access

  15. Predicting the academic success of architecture students by pre-enrolment requirement: using machine-learning techniques

    Directory of Open Access Journals (Sweden)

    Ralph Olusola Aluko

    2016-12-01

    Full Text Available In recent years, there has been an increase in the number of applicants seeking admission into architecture programmes. As expected, prior academic performance (also referred to as pre-enrolment requirement) is a major factor considered during the process of selecting applicants. In the present study, machine learning models were used to predict academic success of architecture students based on information provided in prior academic performance. Two modeling techniques, namely K-nearest neighbour (k-NN) and linear discriminant analysis were applied in the study. It was found that K-nearest neighbour (k-NN) outperforms the linear discriminant analysis model in terms of accuracy. In addition, grades obtained in mathematics (at ordinary level examinations) had a significant impact on the academic success of undergraduate architecture students. This paper makes a modest contribution to the ongoing discussion on the relationship between prior academic performance and academic success of undergraduate students by evaluating this proposition. One of the issues that emerges from these findings is that prior academic performance can be used as a predictor of academic success in undergraduate architecture programmes. Overall, the developed k-NN model can serve as a valuable tool during the process of selecting new intakes into undergraduate architecture programmes in Nigeria.
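
    The k-NN versus linear discriminant analysis comparison can be sketched as follows; the pre-enrolment grades and the success label are synthetic, and the cross-validation setup is an assumption rather than the study's protocol.

```python
# Comparing k-NN and LDA on synthetic pre-enrolment records.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(9)
n = 400
maths = rng.integers(1, 10, n)          # e.g. ordinary-level mathematics grade
english = rng.integers(1, 10, n)
physics = rng.integers(1, 10, n)
X = np.column_stack([maths, english, physics]).astype(float)
y = (0.6 * maths + 0.2 * english + 0.2 * physics
     + rng.normal(0, 1, n) > 5).astype(int)   # 1 = academically successful

for name, model in [("k-NN", KNeighborsClassifier(n_neighbors=5)),
                    ("LDA", LinearDiscriminantAnalysis())]:
    print(name, "accuracy:", cross_val_score(model, X, y, cv=5).mean().round(2))
```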

  16. Machine learning techniques for the optimization of joint replacements: Application to a short-stem hip implant.

    Science.gov (United States)

    Cilla, Myriam; Borgiani, Edoardo; Martínez, Javier; Duda, Georg N; Checa, Sara

    2017-01-01

    Today, different implant designs exist in the market; however, there is not a clear understanding of which are the best implant design parameters to achieve optimal mechanical conditions. Therefore, the aim of this project was to investigate if the geometry of a commercial short stem hip prosthesis can be further optimized to reduce stress shielding effects and achieve better short-stemmed implant performance. To reach this aim, the potential of machine learning techniques combined with parametric Finite Element analysis was used. The selected implant geometrical parameters were: total stem length (L), thickness on the lateral (R1) and medial (R2) sides, and the distance between the implant neck and the central stem surface (D). The results show that the total stem length was not the only parameter playing a role in stress shielding. An optimized implant should aim for a decreased stem length and a reduced length of the surface in contact with the bone. The two radiuses that characterize the stem width at the distal cross-section in contact with the bone were less influential in the reduction of stress shielding compared with the other two parameters; but they also play a role, with thinner stems presenting better results.
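
    The coupling of machine learning with the parametric finite-element study can be illustrated by a surrogate-model workflow: a regressor is trained on simulated designs (L, R1, R2, D versus a stress-shielding score) and then minimised over the design space. File names, parameter bounds and the choice of a Gaussian-process surrogate below are assumptions, not the authors' pipeline.

      # Illustrative sketch only: a surrogate trained on parametric FE results is
      # minimised to suggest a low-stress-shielding design. Inputs are assumed.
      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import Matern
      from scipy.optimize import minimize

      X_fe = np.loadtxt("fe_design_points.csv", delimiter=",")        # assumed (L, R1, R2, D) rows
      y_fe = np.loadtxt("fe_stress_shielding.csv", delimiter=",")     # assumed FE stress-shielding score

      surrogate = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
      surrogate.fit(X_fe, y_fe)

      bounds = [(60.0, 120.0), (4.0, 10.0), (4.0, 10.0), (20.0, 40.0)]  # assumed mm ranges
      x0 = np.array([b[0] + 0.5 * (b[1] - b[0]) for b in bounds])       # start at the midpoint

      res = minimize(lambda x: surrogate.predict(x.reshape(1, -1))[0], x0, bounds=bounds)
      print("Suggested design (L, R1, R2, D):", res.x)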

  17. Recognition of Mould Colony on Unhulled Paddy Based on Computer Vision using Conventional Machine-learning and Deep Learning Techniques.

    Science.gov (United States)

    Sun, Ke; Wang, Zhengjie; Tu, Kang; Wang, Shaojin; Pan, Leiqing

    2016-11-29

    To investigate the potential of conventional and deep learning techniques to recognize the species and distribution of mould in unhulled paddy, samples were inoculated and cultivated with five species of mould, and sample images were captured. The mould recognition methods were built using support vector machine (SVM), back-propagation neural network (BPNN), convolutional neural network (CNN), and deep belief network (DBN) models. An accuracy rate of 100% was achieved by using the DBN model to identify the mould species in the sample images based on selected colour-histogram parameters, followed by the SVM and BPNN models. A pitch segmentation recognition method combined with different classification models was developed to recognize the mould colony areas in the image. The accuracy rates of the SVM and CNN models for pitch classification were approximately 90% and were higher than those of the BPNN and DBN models. The CNN and DBN models showed quicker calculation speeds for recognizing all of the pitches segmented from a single sample image. Finally, an efficient uniform CNN pitch classification model for all five types of sample images was built. This work compares multiple classification models and provides feasible recognition methods for mouldy unhulled paddy recognition.
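
    To make the SVM step concrete, the following minimal Python sketch classifies mould species from pre-computed colour-histogram features; the input files and label encoding are assumptions for illustration, not the authors' data or code.

      # Minimal sketch, assuming pre-computed colour-histogram features per sample image.
      import numpy as np
      from sklearn.model_selection import cross_val_score
      from sklearn.svm import SVC
      from sklearn.preprocessing import StandardScaler
      from sklearn.pipeline import make_pipeline

      X = np.load("colour_histograms.npy")   # assumed shape (n_samples, n_bins)
      y = np.load("mould_species.npy")       # assumed integer labels for the five species

      clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
      scores = cross_val_score(clf, X, y, cv=5)
      print("Mean cross-validated accuracy:", scores.mean())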

  18. Machine learning techniques for the optimization of joint replacements: Application to a short-stem hip implant.

    Directory of Open Access Journals (Sweden)

    Myriam Cilla

    Full Text Available Today, different implant designs exist in the market; however, there is not a clear understanding of which are the best implant design parameters to achieve mechanical optimal conditions. Therefore, the aim of this project was to investigate if the geometry of a commercial short stem hip prosthesis can be further optimized to reduce stress shielding effects and achieve better short-stemmed implant performance. To reach this aim, the potential of machine learning techniques combined with parametric Finite Element analysis was used. The selected implant geometrical parameters were: total stem length (L, thickness in the lateral (R1 and medial (R2 and the distance between the implant neck and the central stem surface (D. The results show that the total stem length was not the only parameter playing a role in stress shielding. An optimized implant should aim for a decreased stem length and a reduced length of the surface in contact with the bone. The two radiuses that characterize the stem width at the distal cross-section in contact with the bone were less influential in the reduction of stress shielding compared with the other two parameters; but they also play a role where thinner stems present better results.

  19. Combining optimization and machine learning techniques for genome-wide prediction of human cell cycle-regulated genes.

    Science.gov (United States)

    De Santis, Marianna; Rinaldi, Francesco; Falcone, Emmanuela; Lucidi, Stefano; Piaggio, Giulia; Gurtner, Aymone; Farina, Lorenzo

    2014-01-15

    The identification of cell cycle-regulated genes through the cyclicity of messenger RNAs in genome-wide studies is a difficult task due to the presence of internal and external noise in microarray data. Moreover, the analysis is also complicated by the loss of synchrony occurring in cell cycle experiments, which often results in additional background noise. To overcome these problems, here we propose the LEON (LEarning and OptimizatioN) algorithm, able to characterize the 'cyclicity degree' of a gene expression time profile using a two-step cascade procedure. The first step identifies a potentially cyclic behavior by means of a Support Vector Machine trained with a reliable set of positive and negative examples. The second step selects those genes having peak timing consistency along two cell cycles by means of a non-linear optimization technique using radial basis functions. To prove the effectiveness of our combined approach, we use recently published human fibroblasts cell cycle data and, performing in vivo experiments, we demonstrate that our computational strategy is able not only to confirm well-known cell cycle-regulated genes, but also to predict not yet identified ones. All scripts for implementation can be obtained on request.
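
    A rough sketch of the first (classification) step of such a two-step procedure is given below, under assumed inputs: an SVM is trained on known cyclic and non-cyclic expression time profiles and then scores all genes. The second, RBF-based peak-timing optimization step of LEON is not reproduced here.

      # Sketch of the classification step only; file names are assumptions.
      import numpy as np
      from sklearn.svm import SVC

      profiles_pos = np.load("known_cyclic_profiles.npy")     # assumed (n_pos, n_timepoints)
      profiles_neg = np.load("known_noncyclic_profiles.npy")  # assumed (n_neg, n_timepoints)
      all_profiles = np.load("genomewide_profiles.npy")       # assumed (n_genes, n_timepoints)

      X_train = np.vstack([profiles_pos, profiles_neg])
      y_train = np.r_[np.ones(len(profiles_pos)), np.zeros(len(profiles_neg))]

      svm = SVC(kernel="rbf", probability=True).fit(X_train, y_train)
      cyclicity_score = svm.predict_proba(all_profiles)[:, 1]   # candidate 'cyclicity degree'
      candidates = np.argsort(cyclicity_score)[::-1][:500]      # top-ranked genes for step two
      print("Top candidate gene indices:", candidates[:10])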

  20. Recognition of Mould Colony on Unhulled Paddy Based on Computer Vision using Conventional Machine-learning and Deep Learning Techniques

    Science.gov (United States)

    Sun, Ke; Wang, Zhengjie; Tu, Kang; Wang, Shaojin; Pan, Leiqing

    2016-11-01

    To investigate the potential of conventional and deep learning techniques to recognize the species and distribution of mould in unhulled paddy, samples were inoculated and cultivated with five species of mould, and sample images were captured. The mould recognition methods were built using support vector machine (SVM), back-propagation neural network (BPNN), convolutional neural network (CNN), and deep belief network (DBN) models. An accuracy rate of 100% was achieved by using the DBN model to identify the mould species in the sample images based on selected colour-histogram parameters, followed by the SVM and BPNN models. A pitch segmentation recognition method combined with different classification models was developed to recognize the mould colony areas in the image. The accuracy rates of the SVM and CNN models for pitch classification were approximately 90% and were higher than those of the BPNN and DBN models. The CNN and DBN models showed quicker calculation speeds for recognizing all of the pitches segmented from a single sample image. Finally, an efficient uniform CNN pitch classification model for all five types of sample images was built. This work compares multiple classification models and provides feasible recognition methods for mouldy unhulled paddy recognition.

  1. Rainfall Prediction of Indian Peninsula: Comparison of Time Series Based Approach and Predictor Based Approach using Machine Learning Techniques

    Science.gov (United States)

    Dash, Y.; Mishra, S. K.; Panigrahi, B. K.

    2017-12-01

    Prediction of the northeast/post-monsoon rainfall that occurs during October, November and December (OND) over the Indian peninsula is a challenging task due to the dynamic nature of the uncertain, chaotic climate. It is imperative to elucidate this issue by examining the performance of different machine learning (ML) approaches. The prime objective of this research is to compare between a) statistical prediction using historical rainfall observations and global atmosphere-ocean predictors like Sea Surface Temperature (SST) and Sea Level Pressure (SLP) and b) empirical prediction based on a time series analysis of past rainfall data without using any other predictors. Initially, ML techniques have been applied on SST and SLP data (1948-2014) obtained from the NCEP/NCAR reanalysis monthly means provided by the NOAA ESRL PSD. Later, this study investigated the applicability of ML methods using the OND rainfall time series for 1948-2014 and forecasted up to 2018. The predicted values of the aforementioned methods were verified using observed time series data collected from the Indian Institute of Tropical Meteorology, and the results revealed good performance of the ML algorithms with minimal error scores. Thus, it is found that both statistical and empirical methods are useful for long-range climatic projections.
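
    The two compared strategies can be illustrated with the following hedged Python sketch: (a) a predictor-based regressor trained on SST/SLP features, and (b) a purely autoregressive model of the OND rainfall series. File names, lag length and model choices are assumptions, not the authors' configuration.

      # Hedged illustration of the two forecasting strategies compared above.
      import numpy as np
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.linear_model import LinearRegression

      rain = np.loadtxt("ond_rainfall_1948_2014.txt")        # assumed yearly OND rainfall series
      predictors = np.load("sst_slp_features.npy")           # assumed (n_years, n_features), aligned to 'rain'

      # (a) Predictor-based: SST/SLP features of each year -> that year's OND rainfall.
      model_a = RandomForestRegressor(n_estimators=300, random_state=0)
      model_a.fit(predictors[:-5], rain[:-5])
      print("Predictor-based forecasts:", model_a.predict(predictors[-5:]))

      # (b) Time-series-based: rainfall of the previous p years -> next year's rainfall.
      p = 5
      X_lag = np.array([rain[i - p:i] for i in range(p, len(rain))])
      y_lag = rain[p:]
      model_b = LinearRegression().fit(X_lag[:-5], y_lag[:-5])
      print("Autoregressive forecasts:", model_b.predict(X_lag[-5:]))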

  2. Machine Learning

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Machine learning, which builds on ideas in computer science, statistics, and optimization, focuses on developing algorithms to identify patterns and regularities in data, and using these learned patterns to make predictions on new observations. Boosted by its industrial and commercial applications, the field of machine learning is quickly evolving and expanding. Recent advances have seen great success in the realms of computer vision, natural language processing, and broadly in data science. Many of these techniques have already been applied in particle physics, for instance for particle identification, detector monitoring, and the optimization of computer resources. Modern machine learning approaches, such as deep learning, are only just beginning to be applied to the analysis of High Energy Physics data to approach more and more complex problems. These classes will review the framework behind machine learning and discuss recent developments in the field.

  3. Measurements of the neutron dose equivalent for various radiation qualities, treatment machines and delivery techniques in radiation therapy.

    Science.gov (United States)

    Hälg, R A; Besserer, J; Boschung, M; Mayer, S; Lomax, A J; Schneider, U

    2014-05-21

    In radiation therapy, high energy photon and proton beams cause the production of secondary neutrons. This leads to an unwanted dose contribution, which can be considerable for tissues outside of the target volume regarding the long term health of cancer patients. Due to the high biological effectiveness of neutrons in regards to cancer induction, small neutron doses can be important. This study quantified the neutron doses for different radiation therapy modalities. Most of the reports in the literature used neutron dose measurements free in air or on the surface of phantoms to estimate the amount of neutron dose to the patient. In this study, dose measurements were performed in terms of neutron dose equivalent inside an anthropomorphic phantom. The neutron dose equivalent was determined using track etch detectors as a function of the distance to the isocenter, as well as for radiation sensitive organs. The dose distributions were compared with respect to treatment techniques (3D-conformal, volumetric modulated arc therapy and intensity-modulated radiation therapy for photons; spot scanning and passive scattering for protons), therapy machines (Varian, Elekta and Siemens linear accelerators) and radiation quality (photons and protons). The neutron dose equivalent varied between 0.002 and 3 mSv per treatment gray over all measurements. Only small differences were found when comparing treatment techniques, but substantial differences were observed between the linear accelerator models. The neutron dose equivalent for proton therapy was higher than for photons in general and in particular for double-scattered protons. The overall neutron dose equivalent measured in this study was an order of magnitude lower than the stray dose of a treatment using 6 MV photons, suggesting that the contribution of the secondary neutron dose equivalent to the integral dose of a radiotherapy patient is small.

  4. CAD/CAM machining Vs pre-sintering in-lab fabrication techniques of Y-TZP ceramic specimens: Effects on their mechanical fatigue behavior.

    Science.gov (United States)

    Zucuni, C P; Guilardi, L F; Fraga, S; May, L G; Pereira, G K R; Valandro, L F

    2017-07-01

    This study evaluated the effects of different pre-sintering fabrication processing techniques of Y-TZP ceramic (CAD/CAM vs. in-lab), considering surface characteristics and mechanical performance outcomes. Pre-sintered discs of Y-TZP ceramic (IPS e.max ZirCAD, Ivoclar Vivadent) were produced using different pre-sintering fabrication processing techniques: Machined - milling with a CAD/CAM system; Polished - fabrication using a cutting device followed by polishing (600 and 1200 SiC papers); Xfine - fabrication using a cutting machine followed by grinding with an extra-fine diamond bur (grit size 30 μm); Fine - fabrication using a cutting machine followed by grinding with a fine diamond bur (grit size 46 μm); SiC - fabrication using a cutting machine followed by grinding with 220 SiC paper. Afterwards, the discs were sintered and submitted to roughness (n=35), surface topography (n=2), phase transformation (n=2), biaxial flexural strength (n=20), and biaxial flexural fatigue strength (fatigue limit) (n=15) analyses. No monoclinic-phase content was observed for any of the processing techniques. Obtaining a surface with characteristics similar to those of CAD/CAM milling was essential for observing similar mechanical performance. In this sense, grinding with a fine diamond bur before sintering (Fine group) best mimicked CAD/CAM milling. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Non-Traditional Wives With Traditional Husbands: Gender Ideology and Husband-to-Wife Physical Violence in Chinese Society.

    Science.gov (United States)

    Cheung, Adam Ka-Lok; Choi, Susanne Yuk-Ping

    2016-03-03

    Feminist scholars have argued that husband gender traditionalism is one of the root causes of spousal violence against women. Using couple-level data from Hong Kong (N = 871 couples), this article argues that a second mechanism-couple gender value mismatch-also explains husband-to-wife physical assault. Our findings show that a husband's gender traditionalism is positively associated with husband-to-wife physical assault only when the husband is coupled with a wife who has non-traditional gender attitudes. Similarly, egalitarian gender attitudes in wives are positively associated with husband-to-wife physical assault only when a non-traditional wife is coupled with a traditional husband. © The Author(s) 2016.

  6. New Paradigms for the Study of Ocular Alphaherpesvirus Infections: Insights into the Use of Non-Traditional Host Model Systems

    Directory of Open Access Journals (Sweden)

    Matthew R. Pennington

    2017-11-01

    Full Text Available Ocular herpesviruses, most notably human alphaherpesvirus 1 (HSV-1, canid alphaherpesvirus 1 (CHV-1 and felid alphaherpesvirus 1 (FHV-1, infect and cause severe disease that may lead to blindness. CHV-1 and FHV-1 have a pathogenesis and induce clinical disease in their hosts that is similar to HSV-1 ocular infections in humans, suggesting that infection of dogs and cats with CHV-1 and FHV-1, respectively, can be used as a comparative natural host model of herpesvirus-induced ocular disease. In this review, we discuss both strengths and limitations of the various available model systems to study ocular herpesvirus infection, with a focus on the use of these non-traditional virus-natural host models. Recent work has demonstrated the robustness and reproducibility of experimental ocular herpesvirus infections in dogs and cats, and, therefore, these non-traditional models can provide additional insights into the pathogenesis of ocular herpesvirus infections.

  7. "Too big to fail" or "Too non-traditional to fail"?: The determinants of banks' systemic importance

    OpenAIRE

    Moore, Kyle; Zhou, Chen

    2013-01-01

    This paper empirically analyzes the determinants of banks' systemic importance. In constructing a measure on the systemic importance of financial institutions we find that size is a leading determinant. This confirms the usual "Too big to fail'' argument. Nevertheless, banks with size above a sufficiently high level have equal systemic importance. In addition to size, we find that the extent to which banks engage in non-traditional banking activities is also positively related to ...

  8. An analysis of the relationship between learning satisfaction and academic achievement of non-traditional learners in Singapore

    OpenAIRE

    Henry Khiat

    2013-01-01

    This study investigated the relationship between the learning satisfaction and academic achievement of non-traditional learners in Singapore. Data were collected from 880 students through a component of the student evaluation exercise in a Singapore university in 2011. A mixed-methods approach was adopted in the analysis. Spearman’s rho indicated a weak but significant correlation between academic achievement and learning satisfaction. Coding and categorization of the q...

  9. Coping with the energy crisis: Impact assessment and potentials of non-traditional renewable energy in rural Kyrgyzstan

    International Nuclear Information System (INIS)

    Liu, Melisande F.M.; Pistorius, Till

    2012-01-01

    The Kyrgyz energy sector is characterised by a dramatic energy crisis that has deprived a substantial part of the population from access to energy. Non-traditional renewable energy sources have emerged as a promising alternative in providing basic energy services to the rural poor. Based on qualitative interview data from local households and project planners, this study sets out to assess impacts, limitations and barriers of non-traditional renewable energy projects in rural areas in Kyrgyzstan. This study argues that recent renewable energy efforts from multilateral international agencies, the private sector, and nongovernmental organisations exhibit great potential in creating tangible benefits and improving basic energy services, but have so far been inefficient in establishing and replicating sustainable and long-term energy solutions. Existing practices need to be improved by attaching greater importance to the capacities and real needs of the rural poor. The guidance of integrated programmes and policies along with alternative financing schemes and awareness-raising are urgently needed to leverage local success stories and to facilitate a sustainable energy development in rural Kyrgyzstan. - Highlights: ► We examine 11 rural households and 5 project planners in rural Kyrgyzstan. ► We assess impacts of non-traditional renewable energies compared with conventional fuels. ► Renewable energies exhibit a range of tangible benefits for rural users. ► Limitations concern performance, durability, repair, acceptance, finance and policy. ► Renewable energy is a promising alternative for rural households in Kyrgyzstan.

  10. An investigation of penetrant techniques for detection of machining-induced surface-breaking cracks on monolithic ceramics

    Energy Technology Data Exchange (ETDEWEB)

    Forster, G.A.; Ellingson, W.A.

    1996-02-01

    The purpose of this effort was to evaluate penetrant methods for their ability to detect surface-breaking cracks in monolithic ceramic materials with an emphasis on detection of cracks generated by machining. There are two basic penetrant types, visible and fluorescent. The visible penetrant method is usually augmented by powder developers and cracks detected can be seen in visible light. Cracks detected by fluorescent penetrant are visible only under ultraviolet light used with or without a developer. The developer is basically a powder that wicks up penetrant from a crack to make it more observable. Although fluorescent penetrants were recommended in the literature survey conducted early in this effort, visible penetrants and two non-standard techniques, a capillary gaseous diffusion method under development at the Institute of Chemical Physics in Moscow, and the "statiflux" method which involves use of electrically charged particles, were also investigated. SiAlON ring specimens (1 in. diameter, 3/4 in. wide) which had been subjected to different thermal-shock cycles were used for these tests. The capillary gaseous diffusion method is based on ammonia; the detector is a specially impregnated paper much like litmus paper. As expected, visible dye penetrants offered no detection sensitivity for tight, surface-breaking cracks in ceramics. Although the non-standard statiflux method showed promise on high-crack-density specimens, it was ineffective on limited-crack-density specimens. The fluorescent penetrant method was superior for surface-breaking crack detection, but successful application of this procedure depends greatly on the skill of the user. Two presently available high-sensitivity fluorescent penetrants were then evaluated for detection of microcracks on Si3N4 and SiC from different suppliers. Although 50X optical magnification may be sufficient for many applications, 200X magnification provides excellent detectability.

  11. Estimating photometric redshifts for X-ray sources in the X-ATLAS field using machine-learning techniques

    Science.gov (United States)

    Mountrichas, G.; Corral, A.; Masoura, V. A.; Georgantopoulos, I.; Ruiz, A.; Georgakakis, A.; Carrera, F. J.; Fotopoulou, S.

    2017-12-01

    We present photometric redshifts for 1031 X-ray sources in the X-ATLAS field using the machine-learning technique TPZ. X-ATLAS covers 7.1 deg2 observed with XMM-Newton within the Science Demonstration Phase of the H-ATLAS field, making it one of the largest contiguous areas of the sky with both XMM-Newton and Herschel coverage. All of the sources have available SDSS photometry, while 810 additionally have mid-IR and/or near-IR photometry. A spectroscopic sample of 5157 sources primarily in the XMM/XXL field, but also from several X-ray surveys and the SDSS DR13 redshift catalogue, was used to train the algorithm. Our analysis reveals that the algorithm performs best when the sources are split, based on their optical morphology, into point-like and extended sources. Optical photometry alone is not enough to estimate accurate photometric redshifts, but the results greatly improve when at least mid-IR photometry is added in the training process. In particular, our measurements show that the estimated photometric redshifts for the X-ray sources of the training sample have a normalized absolute median deviation, nmad ≈ 0.06, and a percentage of outliers, η = 10-14%, depending upon whether the sources are extended or point like. Our final catalogue contains photometric redshifts for 933 out of the 1031 X-ray sources with a median redshift of 0.9. The table of the photometric redshifts is only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/608/A39
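
    TPZ itself is a dedicated tree-based photo-z code; the sketch below is only a simplified analogue of the idea, training a random-forest regressor on a spectroscopic sample and evaluating the normalized median absolute deviation quoted above. The catalogue and column names are hypothetical.

      # Simplified tree-ensemble analogue of a photometric-redshift estimator.
      import numpy as np
      import pandas as pd
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.model_selection import train_test_split

      cat = pd.read_csv("xray_training_catalogue.csv")   # assumed training catalogue
      features = ["u", "g", "r", "i", "z", "w1", "w2"]   # assumed optical + mid-IR magnitudes
      X_train, X_test, z_train, z_test = train_test_split(
          cat[features], cat["z_spec"], test_size=0.25, random_state=0)

      rf = RandomForestRegressor(n_estimators=500, random_state=0).fit(X_train, z_train)
      z_phot = rf.predict(X_test)

      # Normalised median absolute deviation, the quality metric quoted above.
      dz = (z_phot - z_test) / (1.0 + z_test)
      nmad = 1.48 * np.median(np.abs(dz - np.median(dz)))
      print("sigma_NMAD:", nmad)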

  12. Historical and Epistemological Reflections on the Culture of Machines around the Renaissance: How Science and Technique Work?

    Directory of Open Access Journals (Sweden)

    Raffaele Pisano

    2014-10-01

    Full Text Available This paper is divided into two parts, this being the first one. The second is entitled ‘Historical and Epistemological Reflections on the Culture of Machines around Renaissance: Machines, Machineries and Perpetual Motion’ and will be published in Acta Baltica Historiae et Philosophiae Scientiarum in 2015. Based on our recent studies, we provide here a historical and epistemological overview of the role played by machines and machineries. Ours is an epistemological thesis based on a series of historical examples to show that the relations between theoretical science and the construction of machines cannot be taken for granted, a priori. Our analysis is mainly based on the culture of machines between the 15th and 17th centuries, namely the epoch of the Late Renaissance and the Early Modern Age. This is the period of the scientific revolution, and the age offers abundant material for research into the relations between theoretical science and the construction of machines. However, to prove our epistemological thesis, we will also exploit examples of machines built in other historical periods. Particularly, a discussion concerning the relationship between science theory and the development of science art crafts produced by non-recognized scientists in a certain historical time is presented. The main questions are: when and why did the tension between science (physics, mathematics and geometry) give rise to a new scientific approach to applied disciplines such as studies on machines and machineries? What kind of science was used (if at all) for designing machines and machineries? Was science at the time a necessary precondition to build a machine? In the first part we will focus on the difference between Aristotelian-Euclidean and Archimedean approaches and we will outline the heritage of these two different approaches in late medieval and Renaissance science. In the second part, we will apply our reconstructions to some historical and epistemological

  13. Application of machine learning techniques for solving real world business problems : the case study - target marketing of insurance policies

    OpenAIRE

    Juozenaite, Ineta

    2018-01-01

    The concept of machine learning has been around for decades, but it is now becoming more and more popular, not only in business but everywhere else as well, because of the increased amount of data, cheaper data storage, and more powerful and affordable computational processing. The complexity of the business environment leads companies to use data-driven decision making to work more efficiently. The most common machine learning methods, like Logistic Regression, Decision Tree, Artificial Neural...
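
    As a minimal illustration of the kind of model listed above, the sketch below scores customers for a policy-marketing campaign with logistic regression; the data file and feature names are hypothetical, not the study's dataset.

      # Hypothetical target-marketing sketch: estimate purchase propensity per customer.
      import pandas as pd
      from sklearn.model_selection import train_test_split
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      customers = pd.read_csv("policy_marketing_data.csv")      # assumed customer table
      features = ["age", "income", "num_existing_policies"]     # assumed predictors
      X_train, X_test, y_train, y_test = train_test_split(
          customers[features], customers["bought_policy"], test_size=0.3, random_state=0)

      model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
      propensity = model.predict_proba(X_test)[:, 1]            # probability of purchase
      print("ROC AUC:", roc_auc_score(y_test, propensity))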

  14. What Makes a Student Non-Traditional? A Comparison of Students over and under Age 25 in Online, Accelerated Psychology Courses

    Science.gov (United States)

    Tilley, Brian P.

    2014-01-01

    The growing proportion of non-traditional students, very commonly defined as students over the age of 25 (though other features vary from study to study) necessitates more studies with this increasingly relevant group participating. Recently, the growth of non-traditional universities such as those offering predominantly online, accelerated…

  15. Frame by frame stop motion non-traditional approaches to stop motion animation

    CERN Document Server

    Gasek, Tom

    2011-01-01

    In a world that is dominated by computer images, alternative stop motion techniques like pixilation, time-lapse photography and down-shooting techniques combined with new technologies offer a new, tangible and exciting approach to animation. With over 25 years professional experience, industry veteran, Tom Gasek presents a comprehensive guide to stop motion animation without the focus on puppetry or model animation. With tips, tricks and hands-on exercises, Frame by Frame will help both experienced and novice filmmakers get the most effective results from this underutilized branch of animation

  16. A study on ultra-precision machining technique for Al6061-T6 to fabricate space infrared optics

    Science.gov (United States)

    Ryu, Geun-man; Lee, Gil-jae; Hyun, Sang-won; Sung, Ha-yeong; Chung, Euisik; Kim, Geon-hee

    2014-08-01

    In this paper, analysis of variance on designed experiments with full factorial design was applied to determine the optimized machining parameters for ultra-precision fabrication of the secondary aspheric mirror, which is one of the key elements of the space cryogenic infrared optics. A single point diamond turning machine (SPDTM, Nanotech 4μpL Moore) was adopted to fabricate the material, AL6061-T6, and the three machining parameters of cutting speed, feed rate and depth of cut were selected. With several randomly assigned experimental conditions, the surface roughness of each condition was measured by a non-contact optical profiler (NT2000; Veeco). As a result of analysis using Minitab, the optimum cutting condition was determined as follows: cutting speed 122 m/min, feed rate 3 mm/min and depth of cut 1 μm. Finally, a 120 mm diameter aspheric secondary mirror was attached to a specially designed jig using a mixture of paraffin and wax and successfully fabricated under the optimum machining parameters. The profile of the machined surface was measured by a high-accuracy 3-D profilometer (UA3P; Panasonic) and we obtained geometrical errors of 30.6 nm (RMS) and 262.4 nm (PV), which satisfy the requirements of the space cryogenic infrared optics.
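
    The designed-experiment analysis described above can be sketched in a few lines; the run table, factor names and response column are assumptions, and the real study used Minitab rather than Python.

      # Full-factorial ANOVA sketch on assumed diamond-turning run data.
      import pandas as pd
      import statsmodels.api as sm
      from statsmodels.formula.api import ols

      # Assumed table: one row per run with factor levels and measured roughness Ra (nm).
      runs = pd.read_csv("spdt_full_factorial_runs.csv")

      model = ols("Ra ~ C(cutting_speed) + C(feed_rate) + C(depth_of_cut)", data=runs).fit()
      print(sm.stats.anova_lm(model, typ=2))

      # The level combination with the lowest mean roughness suggests the optimum setting.
      print(runs.groupby(["cutting_speed", "feed_rate", "depth_of_cut"])["Ra"].mean().idxmin())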

  17. Multidisciplinary Approaches to Coastal Adaptation - Applying Machine Learning Techniques to assess coastal risk in Latin America and the Caribbean

    Science.gov (United States)

    Calil, J.

    2016-12-01

    The global population, currently at 7.3 billion, is increasing by nearly 230,000 people every day. As the world's population grows to an estimated 11.2 billion by 2100, the number of people living in low elevation areas, exposed to coastal hazards, is continuing to increase. In 2013, 22 million people were displaced by extreme weather events, with 37 events displacing at least 100,000 people each. Losses from natural disasters and disaster risk are determined by a complex interaction between physical hazards and the vulnerability of a society or social-ecological system, and its exposure to such hazards. Impacts from coastal hazards depend on the number of people, value of assets, and presence of critical resources in harm's way. Moreover, coastal risks are amplified by challenging socioeconomic dynamics, including ill-advised urban development, income inequality, and poverty level. Our results demonstrate that in Latin America and the Caribbean (LAC), more than half a million people live in areas where coastal hazards, exposure (of people, assets and ecosystems), and poverty converge, creating the ideal conditions for a perfect storm. In order to identify the population at greatest risk to coastal hazards in LAC, and in response to a growing demand for multidisciplinary coastal adaptation approaches, this study employs a combination of machine learning clustering techniques (K-Means and Self Organizing Maps), and a spatial index, to assess coastal risks on a comparative scale. Data for more than 13,000 coastal locations in LAC were collected and allocated into three categories: (1) Coastal Hazards (including storm surge, wave energy and El Niño); (2) Geographic Exposure (including population, agriculture, and ecosystems); and (3) Vulnerability (including income inequality, infant mortality rate and malnutrition). This study identified hotspots of coastal vulnerability, the key drivers of coastal risk at each geographic location. Our results provide important
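
    The clustering step can be illustrated with a short K-Means sketch over standardized hazard, exposure and vulnerability indicators; the input table and indicator names are placeholders, not the study's actual variables.

      # K-Means sketch: group coastal locations into risk clusters from assumed indicators.
      import pandas as pd
      from sklearn.preprocessing import StandardScaler
      from sklearn.cluster import KMeans

      lac = pd.read_csv("lac_coastal_indicators.csv")    # assumed per-location table
      indicators = ["storm_surge", "wave_energy", "population",
                    "income_inequality", "infant_mortality"]
      X = StandardScaler().fit_transform(lac[indicators])

      kmeans = KMeans(n_clusters=5, n_init=10, random_state=0).fit(X)
      lac["risk_cluster"] = kmeans.labels_
      print(lac.groupby("risk_cluster")[indicators].mean())   # profile of each cluster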

  18. Supervised machine learning techniques to predict binding affinity. A study for cyclin-dependent kinase 2.

    Science.gov (United States)

    de Ávila, Maurício Boff; Xavier, Mariana Morrone; Pintro, Val Oliveira; de Azevedo, Walter Filgueira

    2017-12-09

    Here we report the development of a machine-learning model to predict binding affinity based on the crystallographic structures of protein-ligand complexes. We used an ensemble of crystallographic structures (resolution better than 1.5 Å) for which half-maximal inhibitory concentration (IC50) data are available. Polynomial scoring functions were built using as explanatory variables the energy terms present in the MolDock and PLANTS scoring functions. Prediction performance was tested and the supervised machine learning models showed improvement in prediction power when compared with the PLANTS and MolDock scoring functions. In addition, the machine-learning model was applied to predict the binding affinity of CDK2, and showed a better performance when compared with AutoDock4, AutoDock Vina, MolDock, and PLANTS scores. Copyright © 2017 Elsevier Inc. All rights reserved.
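
    A hedged sketch of such a polynomial scoring function is shown below: docking energy terms are expanded into polynomial features and regressed against pIC50. The energy-term columns are placeholders, not the exact MolDock/PLANTS terms.

      # Polynomial scoring-function sketch on assumed energy-term data.
      import numpy as np
      import pandas as pd
      from sklearn.preprocessing import PolynomialFeatures
      from sklearn.linear_model import Ridge
      from sklearn.pipeline import make_pipeline
      from sklearn.model_selection import cross_val_score

      data = pd.read_csv("complexes_energy_terms.csv")                # assumed per-complex table
      X = data[["vdw", "hbond", "electrostatic", "torsional"]]        # assumed explanatory terms
      y = -np.log10(data["ic50_molar"])                               # pIC50 as the target

      model = make_pipeline(PolynomialFeatures(degree=2), Ridge(alpha=1.0))
      r2 = cross_val_score(model, X, y, cv=5, scoring="r2")
      print("Cross-validated R^2:", r2.mean())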

  19. LINGUISTIC ANALYSIS FOR THE BELARUSIAN CORPUS WITH THE APPLICATION OF NATURAL LANGUAGE PROCESSING AND MACHINE LEARNING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Yu. S. Hetsevich

    2017-01-01

    Full Text Available The article focuses on the problems existing in text-to-speech synthesis. Different morphological, lexical and syntactical elements were localized with the help of the Belarusian unit of the NooJ program. The types of errors that occur in Belarusian texts were analyzed and corrected. A language model and a part-of-speech tagging model were built. Natural language processing of the Belarusian corpus was carried out with the developed machine-learning algorithm. The precision of the developed machine-learning models was 80–90 %. The dictionary was enriched with new words for further use in Belarusian speech-synthesis systems.

  20. Traditional and non-traditional treatments for autism spectrum disorder with seizures: an on-line survey

    Directory of Open Access Journals (Sweden)

    Sreenivasula Swapna

    2011-05-01

    Full Text Available Abstract Background Despite the high prevalence of seizure, epilepsy and abnormal electroencephalograms in individuals with autism spectrum disorder (ASD, there is little information regarding the relative effectiveness of treatments for seizures in the ASD population. In order to determine the effectiveness of traditional and non-traditional treatments for improving seizures and influencing other clinical factor relevant to ASD, we developed a comprehensive on-line seizure survey. Methods Announcements (by email and websites by ASD support groups asked parents of children with ASD to complete the on-line surveys. Survey responders choose one of two surveys to complete: a survey about treatments for individuals with ASD and clinical or subclinical seizures or abnormal electroencephalograms, or a control survey for individuals with ASD without clinical or subclinical seizures or abnormal electroencephalograms. Survey responders rated the perceived effect of traditional antiepileptic drug (AED, non-AED seizure treatments and non-traditional ASD treatments on seizures and other clinical factors (sleep, communication, behavior, attention and mood, and listed up to three treatment side effects. Results Responses were obtained concerning 733 children with seizures and 290 controls. In general, AEDs were perceived to improve seizures but worsened other clinical factors for children with clinical seizure. Valproic acid, lamotrigine, levetiracetam and ethosuximide were perceived to improve seizures the most and worsen other clinical factors the least out of all AEDs in children with clinical seizures. Traditional non-AED seizure and non-traditional treatments, as a group, were perceived to improve other clinical factors and seizures but the perceived improvement in seizures was significantly less than that reported for AEDs. Certain traditional non-AED treatments, particularly the ketogenic diet, were perceived to improve both seizures and other clinical

  1. PROSPECTS OF INTRODUCTION OF NON-TRADITIONAL FRUIT BERRY AND VEGETABLE CROPS IN THE CONDITIONS OF DAGESTAN

    Directory of Open Access Journals (Sweden)

    M. S. Gins

    2014-01-01

    Full Text Available On June 9-13, 2014, Makhachkala hosted the XI International scientific and methodological conference «Introduction, conservation and use of biological diversity of cultivated plants», organized by FGBNU VNIISSOK, the Dagestan Research Institute for Agriculture and GBS DSC RAS. The conference was attended by scientists from Russia, the CIS and other countries. The conference highlighted Dagestan as a prime location for the cultivation of both traditional and non-traditional plants with a high content of biologically active substances, as well as a testing ground for resistance trials, owing to its combination of mountain and plain zones.

  2. Monitoring changes in soil carbon resulting from intensive production, a non-traditional agricultural methodology.

    Energy Technology Data Exchange (ETDEWEB)

    Dwyer, Brian P.

    2013-03-01

    New Mexico State University and a group of New Mexico farmers are evaluating an innovative agricultural technique they call Intensive Production (IP). In contrast to conventional agricultural practice, IP uses intercropping, green fallowing, and the application of soil amendments and soil microbial inocula to sequester carbon as plant biomass, resulting in improved soil quality. Sandia National Laboratories' role was to identify a non-invasive, cost-effective technology to monitor soil carbon changes. A technological review indicated that Laser Induced Breakdown Spectroscopy (LIBS) best met the farmers' objectives. Sandia partnered with Los Alamos National Laboratory (LANL) to analyze the farmers' test plots using a portable LIBS developed at LANL. Real-time LIBS field sample analysis was conducted and grab samples were collected for laboratory comparison. The field and laboratory results correlated well, implying strong potential for LIBS as an economical field-scale analytical tool for the analysis of elements such as carbon, nitrogen, and phosphate.

  3. Intramedullary nail fixation of non-traditional fractures: Clavicle, forearm, fibula.

    Science.gov (United States)

    Dehghan, Niloofar; Schemitsch, Emil H

    2017-06-01

    Locked intramedullary fixation is a well-established technique for managing long-bone fractures. While intramedullary nail fixation of diaphyseal fractures in the femur, tibia, and humerus is well established, the same is not true for other fractures. Fractures of the clavicle, forearm and ankle are traditionally treated with plate-and-screw fixation. In some cases, fixation with an intramedullary device is possible and may be advantageous. However, there is a concern regarding a lack of rotational stability and fracture shortening. While a new generation of locked intramedullary devices for fractures of the clavicle, forearm and fibula has recently become available, the outcomes are not yet as reliable as fixation with plates and screws. Further research in this area is warranted, with high-quality comparative studies to investigate the outcomes and indications of intramedullary nail fixation of these fractures compared with plate-and-screw fixation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Automated Sample Preparation for Radiogenic and Non-Traditional Metal Isotopes: Removing an Analytical Barrier for High Sample Throughput

    Science.gov (United States)

    Field, M. Paul; Romaniello, Stephen; Gordon, Gwyneth W.; Anbar, Ariel D.; Herrmann, Achim; Martinez-Boti, Miguel A.; Anagnostou, Eleni; Foster, Gavin L.

    2014-05-01

    MC-ICP-MS has dramatically improved the analytical throughput for high-precision radiogenic and non-traditional isotope ratio measurements, compared to TIMS. The generation of large data sets, however, remains hampered by the tedious manual drip chromatography required for sample purification. A new, automated chromatography system reduces this laboratory bottleneck and expands the utility of high-precision isotope analyses in applications where large data sets are required: geochemistry, forensic anthropology, nuclear forensics, medical research and food authentication. We have developed protocols to automate ion exchange purification for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U) using the new prepFAST-MC™ (ESI, Omaha, Nebraska). The system is not only inert (all-fluoropolymer flow paths), but is also very flexible and can easily accommodate different resins, samples, and reagent types. When programmed, precise and accurate user-defined volumes and flow rates are implemented to automatically load samples, wash the column, condition the column and elute fractions. Unattended, the automated, low-pressure ion exchange chromatography system can process up to 60 samples overnight. Excellent reproducibility, reliability and recovery, with low blanks and carry-over for samples in a variety of matrices, have been demonstrated to give accurate and precise isotopic ratios within analytical error for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U). This illustrates the potential of the new prepFAST-MC™ (ESI, Omaha, Nebraska) as a powerful tool in radiogenic and non-traditional isotope research.

  5. Automatic Classification of Sub-Techniques in Classical Cross-Country Skiing Using a Machine Learning Algorithm on Micro-Sensor Data

    Directory of Open Access Journals (Sweden)

    Ole Marius Hoel Rindal

    2017-12-01

    Full Text Available The automatic classification of sub-techniques in classical cross-country skiing provides unique possibilities for analyzing the biomechanical aspects of outdoor skiing. This is currently possible due to the miniaturization and flexibility of wearable inertial measurement units (IMUs) that allow researchers to bring the laboratory to the field. In this study, we aimed to optimize the accuracy of the automatic classification of classical cross-country skiing sub-techniques by using two IMUs attached to the skier’s arm and chest together with a machine learning algorithm. The novelty of our approach is the reliable detection of individual cycles using a gyroscope on the skier’s arm, while a neural network machine learning algorithm robustly classifies each cycle to a sub-technique using sensor data from an accelerometer on the chest. In this study, 24 datasets from 10 different participants were separated into the categories training-, validation- and test-data. Overall, we achieved a classification accuracy of 93.9% on the test-data. Furthermore, we illustrate how an accurate classification of sub-techniques can be combined with data from standard sports equipment including position, altitude, speed and heart rate measuring systems. Combining this information has the potential to provide novel insight into physiological and biomechanical aspects valuable to coaches, athletes and researchers.
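
    A rough sketch of this two-sensor pipeline is given below, under stated assumptions (signal files, peak-detection thresholds and the feature set are placeholders): detect one peak per cycle in the arm gyroscope signal, summarize the chest accelerometer within each cycle, and classify cycles with a small neural network.

      # Illustrative cycle-detection and classification sketch; inputs are assumed.
      import numpy as np
      from scipy.signal import find_peaks
      from sklearn.neural_network import MLPClassifier

      gyro_arm = np.load("gyro_arm.npy")        # assumed 1-D angular-rate signal
      acc_chest = np.load("acc_chest.npy")      # assumed (n_samples, 3) accelerations
      labels = np.load("cycle_labels.npy")      # assumed one sub-technique label per cycle

      # Detect one peak per poling/kick cycle in the arm gyroscope signal.
      peaks, _ = find_peaks(gyro_arm, distance=100, height=1.0)

      def cycle_features(start, stop):
          seg = acc_chest[start:stop]
          return np.concatenate([seg.mean(axis=0), seg.std(axis=0),
                                 seg.min(axis=0), seg.max(axis=0)])

      X = np.array([cycle_features(peaks[i], peaks[i + 1]) for i in range(len(peaks) - 1)])
      clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
      clf.fit(X, labels[: len(X)])              # assumes labels align with detected cycles
      print("Training accuracy:", clf.score(X, labels[: len(X)]))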

  6. Automatic Classification of Sub-Techniques in Classical Cross-Country Skiing Using a Machine Learning Algorithm on Micro-Sensor Data.

    Science.gov (United States)

    Rindal, Ole Marius Hoel; Seeberg, Trine M; Tjønnås, Johannes; Haugnes, Pål; Sandbakk, Øyvind

    2017-12-28

    The automatic classification of sub-techniques in classical cross-country skiing provides unique possibilities for analyzing the biomechanical aspects of outdoor skiing. This is currently possible due to the miniaturization and flexibility of wearable inertial measurement units (IMUs) that allow researchers to bring the laboratory to the field. In this study, we aimed to optimize the accuracy of the automatic classification of classical cross-country skiing sub-techniques by using two IMUs attached to the skier's arm and chest together with a machine learning algorithm. The novelty of our approach is the reliable detection of individual cycles using a gyroscope on the skier's arm, while a neural network machine learning algorithm robustly classifies each cycle to a sub-technique using sensor data from an accelerometer on the chest. In this study, 24 datasets from 10 different participants were separated into the categories training-, validation- and test-data. Overall, we achieved a classification accuracy of 93.9% on the test-data. Furthermore, we illustrate how an accurate classification of sub-techniques can be combined with data from standard sports equipment including position, altitude, speed and heart rate measuring systems. Combining this information has the potential to provide novel insight into physiological and biomechanical aspects valuable to coaches, athletes and researchers.

  7. The Changing Face of the Former Soviet Cities: Elucidated by Remote Sensing and Machine Learning Techniques

    Science.gov (United States)

    Poghosyan, Armen

    2017-04-01

    Although remote sensing of urbanization has emerged as a powerful tool to acquire critical knowledge about urban growth and its effects on global environmental change, the human-environment interface, and environmentally sustainable urban development, there is a lack of studies utilizing remote sensing techniques to investigate urbanization trends in the Post-Soviet states. The unique challenges accompanying urbanization in the Post-Soviet republics, combined with the expected robust urban growth in developing countries over the next several decades, highlight the critical need for a quantitative assessment of the urban dynamics in the former Soviet states as they navigate towards a free market democracy. This study uses a total of 32 Level-1 precision terrain corrected (L1T) Landsat scenes with 30-m resolution as well as further auxiliary population and economic data for ten cities distributed in nine former Soviet republics to quantify the urbanization patterns in the Post-Soviet region. Land cover in each urban center of this study was classified by using a Support Vector Machine (SVM) learning algorithm, with overall accuracies ranging from 87 % to 97 % for 29 classification maps over three time steps during the past twenty-five years, in order to estimate quantities, trends and drivers of urban growth in the study area. The results demonstrated several spatial and temporal urbanization patterns observed across the Post-Soviet states; based on urban expansion rates, the cities can be divided into two groups, fast-growing and slow-growing urban centers. The relatively fast-growing urban centers have an average urban expansion rate of about 2.8 % per year, whereas the slow-growing cities have an average urban expansion rate of about 1.0 % per year. The total area of new land converted to urban environment ranged from as low as 26 km2 to as high as 780 km2 for the ten cities over the 1990 - 2015 period, while the overall urban land increase ranged from 11.3 % to 96

  8. [Application of infrared spectroscopy technique to protein content fast measurement in milk powder based on support vector machines].

    Science.gov (United States)

    Wu, Di; Cao, Fang; Feng, Shui-Juan; He, Yong

    2008-05-01

    In the present study, the JASCO Model FTIR-4 000 fourier transform infrared spectrometer (Japan) was used, with a valid range of 7 800-350 cm(-1). Seven brands of milk powder were bought in a local supermarket. Milk powder was compressed into a uniform tablet with a diameter of 5 mm and a thickness of 2 mm, and then scanned by the spectrometer. Each sample was scanned 40 times and the data were averaged. About 60 samples were measured for each brand, and data for 409 samples were obtained. NIRS analysis was based on the range of 4 000 to 6 666 cm(-1), while MIRS analysis was between 400 and 4 000 cm(-1). The protein content was determined by kjeldahl method and the factor 6.38 was used to convert the nitrogen values to protein. The protein content value is the weight of protein per 100 g of milk powder. The NIR data of the milk powder exhibited slight differences. Univariate analysis was not really appropriate for analyzing the data sets. From NIRS region, it could be observed that the trend of different curves is similar. The one around 4 312 cm(-1) embodies the vibration of protein. From MIRS region, it could be determined that there are many differences between transmission value curves. Two troughs around 1 545 and 1 656 cm(-1) stand for the vibration of amide I and II bands of protein. The smoothing way of Savitzky-Golay with 3 segments and zero polynomials and multiplicative scatter correction (MSC) were applied for denoising. First 8 important principle components (PCs), which were obtained from principle component analysis (PCA), were the optimal input feature subset. Least-squares support vector machines was applied to build the protein prediction model based on infrared spectral transmission value. The prediction result was better than that of traditional PLS regression model as the determination coefficient for prediction (R(p)2) is 0.951 7 and root mean square error for prediction (RMSEP) is 0.520 201. These indicate that LS-SVM is a powerful tool for
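
    The modelling chain can be sketched as follows, with kernel ridge regression standing in for LS-SVM (the two are closely related least-squares kernel methods) and with assumed file names and preprocessing settings rather than the exact parameters used in the study.

      # Hedged sketch: smoothing, PCA to the first principal components, kernel regression.
      import numpy as np
      from scipy.signal import savgol_filter
      from sklearn.decomposition import PCA
      from sklearn.kernel_ridge import KernelRidge
      from sklearn.model_selection import cross_val_score

      spectra = np.load("milk_powder_spectra.npy")   # assumed (n_samples, n_wavenumbers)
      protein = np.load("protein_kjeldahl.npy")      # assumed reference values (g/100 g)

      smoothed = savgol_filter(spectra, window_length=11, polyorder=2, axis=1)   # assumed settings
      scores = PCA(n_components=8).fit_transform(smoothed)                       # first 8 PCs as inputs

      model = KernelRidge(kernel="rbf", alpha=1e-3, gamma=1e-2)
      r2 = cross_val_score(model, scores, protein, cv=5, scoring="r2")
      print("Cross-validated R^2:", r2.mean())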

  9. Machine Learning

    Science.gov (United States)

    Hoffmann, Achim; Mahidadia, Ashesh

    The purpose of this chapter is to present fundamental ideas and techniques of machine learning suitable for the field of this book, i.e., for automated scientific discovery. The chapter focuses on those symbolic machine learning methods, which produce results that are suitable to be interpreted and understood by humans. This is particularly important in the context of automated scientific discovery as the scientific theories to be produced by machines are usually meant to be interpreted by humans. This chapter contains some of the most influential ideas and concepts in machine learning research to give the reader a basic insight into the field. After the introduction in Sect. 1, general ideas of how learning problems can be framed are given in Sect. 2. The section provides useful perspectives to better understand what learning algorithms actually do. Section 3 presents the Version space model which is an early learning algorithm as well as a conceptual framework, that provides important insight into the general mechanisms behind most learning algorithms. In section 4, a family of learning algorithms, the AQ family for learning classification rules is presented. The AQ family belongs to the early approaches in machine learning. The next, Sect. 5 presents the basic principles of decision tree learners. Decision tree learners belong to the most influential class of inductive learning algorithms today. Finally, a more recent group of learning systems are presented in Sect. 6, which learn relational concepts within the framework of logic programming. This is a particularly interesting group of learning systems since the framework allows also to incorporate background knowledge which may assist in generalisation. Section 7 discusses Association Rules - a technique that comes from the related field of Data mining. Section 8 presents the basic idea of the Naive Bayesian Classifier. While this is a very popular learning technique, the learning result is not well suited for
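
    As a small, self-contained example of the symbolic, human-readable models the chapter emphasises, the sketch below learns a decision tree on the standard Iris benchmark and prints it as explicit if-then rules; the dataset is chosen purely for illustration.

      # Decision-tree example printed as readable rules.
      from sklearn.datasets import load_iris
      from sklearn.tree import DecisionTreeClassifier, export_text

      iris = load_iris()
      tree = DecisionTreeClassifier(max_depth=3, random_state=0)
      tree.fit(iris.data, iris.target)

      # The learned model can be inspected as explicit if-then rules.
      print(export_text(tree, feature_names=list(iris.feature_names)))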

  10. Predictability of machine learning techniques to forecast the trends of market index prices: Hypothesis testing for the Korean stock markets.

    Science.gov (United States)

    Pyo, Sujin; Lee, Jaewook; Cha, Mincheol; Jang, Huisu

    2017-01-01

    The prediction of the trends of stocks and index prices is one of the important issues to market participants. Investors have set trading or fiscal strategies based on the trends, and considerable research in various academic fields has been studied to forecast financial markets. This study predicts the trends of the Korea Composite Stock Price Index 200 (KOSPI 200) prices using nonparametric machine learning models: artificial neural network, support vector machines with polynomial and radial basis function kernels. In addition, this study states controversial issues and tests hypotheses about the issues. Accordingly, our results are inconsistent with those of the precedent research, which are generally considered to have high prediction performance. Moreover, Google Trends proved that they are not effective factors in predicting the KOSPI 200 index prices in our frameworks. Furthermore, the ensemble methods did not improve the accuracy of the prediction.
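
    A hedged sketch of this model family is given below: next-day index direction is classified from lagged log-returns with an RBF-kernel SVM. The price file and feature construction are assumptions, not the authors' exact setup.

      # Directional-trend classification sketch on assumed daily closing prices.
      import numpy as np
      from sklearn.svm import SVC
      from sklearn.preprocessing import StandardScaler
      from sklearn.pipeline import make_pipeline

      prices = np.loadtxt("kospi200_close.txt")             # assumed daily closes
      returns = np.diff(np.log(prices))

      p = 5                                                  # number of lagged returns used as features
      X = np.array([returns[i - p:i] for i in range(p, len(returns))])
      y = (returns[p:] > 0).astype(int)                      # 1 = up, 0 = down

      split = int(0.8 * len(X))                              # chronological train/test split
      clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
      clf.fit(X[:split], y[:split])
      print("Out-of-sample directional accuracy:", clf.score(X[split:], y[split:]))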

  11. Predictability of machine learning techniques to forecast the trends of market index prices: Hypothesis testing for the Korean stock markets.

    Directory of Open Access Journals (Sweden)

    Sujin Pyo

    Full Text Available The prediction of the trends of stocks and index prices is one of the important issues to market participants. Investors have set trading or fiscal strategies based on the trends, and considerable research in various academic fields has been studied to forecast financial markets. This study predicts the trends of the Korea Composite Stock Price Index 200 (KOSPI 200) prices using nonparametric machine learning models: artificial neural network, support vector machines with polynomial and radial basis function kernels. In addition, this study states controversial issues and tests hypotheses about the issues. Accordingly, our results are inconsistent with those of the precedent research, which are generally considered to have high prediction performance. Moreover, Google Trends proved that they are not effective factors in predicting the KOSPI 200 index prices in our frameworks. Furthermore, the ensemble methods did not improve the accuracy of the prediction.

  12. Machine tool

    International Nuclear Information System (INIS)

    Kang, Myeong Sun

    1981-01-01

    This book covers machine tools, including the cutting process and machining by cutting, the theory of cutting (such as tool angle and chip formation), cutting tools such as milling cutters and drills, and a summary and introduction of the following machine elements and machines: spindle drive and feed drive, pivot and pivot bearing, frame, guide way and table, drilling machine, boring machine, shaper and planer, milling machine, machine tools for precision finishing such as the lapping machine and super-finishing machine, and gear cutter.

  13. An evaluation of machine processing techniques of ERTS-1 data for user applications. [urban land use and soil association mapping in Indiana

    Science.gov (United States)

    Landgrebe, D.

    1974-01-01

    A broad study is described to evaluate a set of machine analysis and processing techniques applied to ERTS-1 data. Based on the analysis results in urban land use analysis and soil association mapping together with previously reported results in general earth surface feature identification and crop species classification, a profile of general applicability of this procedure is beginning to emerge. Put in the hands of a user who knows well the information needed from the data and also is familiar with the region to be analyzed it appears that significantly useful information can be generated by these methods. When supported by preprocessing techniques such as the geometric correction and temporal registration capabilities, final products readily useable by user agencies appear possible. In parallel with application, through further research, there is much potential for further development of these techniques both with regard to providing higher performance and in new situations not yet studied.

  14. Parametric optimization of ultrasonic machining process using gravitational search and fireworks algorithms

    Directory of Open Access Journals (Sweden)

    Debkalpa Goswami

    2015-03-01

    Full Text Available Ultrasonic machining (USM) is a mechanical material removal process used to erode holes and cavities in hard or brittle workpieces by using shaped tools, high-frequency mechanical motion and an abrasive slurry. Unlike other non-traditional machining processes, such as laser beam and electrical discharge machining, the USM process does not thermally damage the workpiece or introduce significant levels of residual stress, which is important for survival of materials in service. To achieve enhanced machining performance and better machined job characteristics, it is often required to determine the optimal control parameter settings of a USM process. The earlier mathematical approaches for parametric optimization of USM processes have mostly yielded near-optimal or sub-optimal solutions. In this paper, two almost unexplored non-conventional optimization techniques, i.e. gravitational search algorithm (GSA) and fireworks algorithm (FWA), are applied for parametric optimization of USM processes. The optimization performance of these two algorithms is compared with that of other popular population-based algorithms, and the effects of their algorithm parameters on the derived optimal solutions and computational speed are also investigated. It is observed that FWA provides the best optimal results for the considered USM processes.
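
    Gravitational search treats candidate parameter settings as agents that attract one another with fitness-derived "masses". The compact sketch below minimises a placeholder USM objective over amplitude, frequency and grit size; the objective, bounds and algorithm constants are illustrative assumptions, not the regression models or settings used in the paper.

      # Compact gravitational search algorithm (GSA) sketch for USM parameter optimization.
      import numpy as np

      def gsa_minimize(obj, bounds, n_agents=30, n_iter=200, g0=100.0, alpha=20.0, seed=0):
          rng = np.random.default_rng(seed)
          lo, hi = np.array(bounds).T
          dim = len(bounds)
          x = rng.uniform(lo, hi, size=(n_agents, dim))
          v = np.zeros_like(x)
          best_x, best_f = None, np.inf

          for t in range(n_iter):
              fit = np.array([obj(xi) for xi in x])
              if fit.min() < best_f:
                  best_f, best_x = fit.min(), x[fit.argmin()].copy()

              # Masses: better (lower) fitness -> larger mass.
              worst, best = fit.max(), fit.min()
              m = (fit - worst) / (best - worst + 1e-12)
              M = m / (m.sum() + 1e-12)

              G = g0 * np.exp(-alpha * t / n_iter)              # decaying gravitational constant
              kbest = max(1, int(n_agents * (1 - t / n_iter)))  # shrinking elite set
              elite = np.argsort(fit)[:kbest]

              acc = np.zeros_like(x)
              for i in range(n_agents):
                  for j in elite:
                      if i == j:
                          continue
                      r = np.linalg.norm(x[i] - x[j]) + 1e-12
                      acc[i] += rng.random(dim) * G * M[j] * (x[j] - x[i]) / r

              v = rng.random((n_agents, dim)) * v + acc
              x = np.clip(x + v, lo, hi)
          return best_x, best_f

      # Placeholder objective: trade-off between a notional material removal rate and
      # surface roughness as functions of amplitude, frequency and grit size.
      def usm_objective(p):
          amplitude, frequency, grit = p
          mrr = amplitude * frequency / grit
          roughness = 0.01 * amplitude * grit
          return -mrr + 5.0 * roughness

      best_params, best_val = gsa_minimize(usm_objective, [(10, 50), (18, 25), (100, 600)])
      print("Best USM parameters:", best_params, "objective:", best_val)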

  15. Non-Traditional Security Threats in the Border Areas: Terrorism, Piracy, Environmental Degradation in Southeast Asian Maritime Domain

    Science.gov (United States)

    Dabova, E. L.

    2013-11-01

    In addition to facilitating peaceful trade and economic development, sovereign territory, territorial waters and international waters are being used by various criminal groups that pose threats to governments, businesses and the civilian population in Southeast Asia. Non-state criminal maritime activities did not receive appropriate attention, as they were overshadowed by traditional military security challenges. Yet more and more frequently, non-traditional actors challenge lines of communication, jeopardize access to strategic resources, complicate traditional defence tasks, and harm the environment. Understanding the nature of non-traditional threats, and the ways to combat them, requires international legal, historical and political science analysis within a united problem-oriented approach. A fair critique of pure interest-, power- and knowledge-based theories of regime formation was developed by E. K. Leonard [1], who explained the evolution of the international system from the global governance perspective. The present study is based on the premise that pure nation-state approaches are incapable of providing a theoretical ground for addressing the growing influence of international criminal networks in Southeast Asia. From an international relations theory perspective, the author of this study agrees with D. Snidal [2] that the hegemonic stability theory has "limits" and is insufficient in describing modern challenges to a sustainable international security regime, including non-traditional threats, where collective action is more efficient from an interest and capability standpoint. At the same time, the author of this study does not share the viewpoint on the "marginalization" [3] of international law in the current international order due to its fragmentation and regionalization [4] and "global power shifts" [5]. The United Nations, as a global institution at the top of the vertical hierarchy of international legal order, and the EU as an example of "self-contained" regime along

  16. Precision machine design

    CERN Document Server

    Slocum, Alexander H

    1992-01-01

    This book is a comprehensive engineering exploration of all aspects of precision machine design - both component and system design considerations for precision machines. It addresses theoretical analysis and practical implementation, providing many real-world design case studies as well as numerous examples of existing components and their characteristics. Fast becoming a classic, this book includes examples of analysis techniques along with the philosophy of the solution method. It explores the physics of errors in machines, how such knowledge can be used to build an error budget for a machine, and how error budgets can be used to design more accurate machines.

  17. Defect-originated magnetism in carbon-based and non-traditional inorganic compounds: A new class of magnetic materials

    Science.gov (United States)

    Andriotis, A. N.; Sheetz, R. M.; Richter, E.; Menon, M.

    2005-11-01

    Magnetism in organic and non-traditional inorganic materials (NTIMs) is a fascinating phenomenon from both scientific and technological perspectives. The recent experimental discovery of ferromagnetism in organic C60-based polymers has challenged the traditional concepts of the origin of magnetism. Although the nature of the s-p magnetism of the C60-based polymers has been distinguished from the nature of the newly observed magnetism in NTIMs, a defect-based picture of magnetism is found to provide a common thread connecting all these materials. As shown in the present work, this magnetism can be considered a generalized form of the well-known McConnell model, thereby providing a unified classification of these magnetic materials and elucidating its common origin with d-ferromagnetism.

  18. [Integration and demonstration of key techniques in surveillance and forecast of schistosomiasis in Jiangsu Province III Development of a machine simultaneously integrating mechanized environmental cleaning and automatic mollusciciding].

    Science.gov (United States)

    Wang, Fu-biao; Ma, Yu-cai; Sun, Le-ping; Hong, Qing-biao; Gao, Yang; Zhang, Chang-lin; Du, Guang-lin; Lu, Da-qin; Sun, Zhi-yong; Wang, Wei; Dai, Jian-rong; Liang, You-sheng

    2016-02-01

    To develop a machine simultaneously integrating mechanized environmental cleaning and automatic mollusciciding and to evaluate its effectiveness in field application, so as to provide a novel Oncomelania hupensis snail control technique for the large-scale marshlands. The machine, which is suitable for use in complex marshland areas, was developed according to mechanization and automation principles and was used for O. hupensis snail control in the marshland. The effect of the machine on environmental cleaning and ploughing was evaluated, and the distribution of living snails was observed in various soil layers following ploughing. The snail control effects of ploughing alone and ploughing followed by mollusciciding were compared. The machine could simultaneously complete the procedures of cutting down vegetation and chopping it into pieces, ploughing, and snail control by spraying niclosamide. After ploughing, the constituent ratios of living snails were 36.31%, 25.60%, 22.62% and 15.48% in the soil layers at depths of 0-5, 6-10, 11-15 and 16-20 cm, respectively, and 61.91% of living snails were found in the 0-10 cm soil layers. Seven and fifteen days after the experiment, the mortality rates of snails were 9.38% and 8.29% in the plough alone group, and 63.04% and 80.70% in the plough + mollusciciding group, respectively (χ²₇ d = 42.74, χ²₁₅ d = 155.56, both P values < 0.01). Snail density in the plough + mollusciciding group decreased by 64.92% and 93.60%, respectively, and the decrease rate of snail density was approximately 30% higher in the plough + mollusciciding group than in the plough alone group. The machine simultaneously integrating mechanized environmental cleaning and automatic mollusciciding achieves the integration of mechanical environmental cleaning and automatic niclosamide spraying in complex marshland areas, which provides a novel technique for field snail control in the large-scale marshlands.

  19. Customer Characteristics and Shopping Patterns Associated with Healthy and Unhealthy Purchases at Small and Non-traditional Food Stores.

    Science.gov (United States)

    Lenk, Kathleen M; Caspi, Caitlin E; Harnack, Lisa; Laska, Melissa N

    2018-02-01

    Small and non-traditional food stores (e.g., corner stores) are often the most accessible source of food for residents of lower-income urban neighborhoods in the U.S. Although healthy options are often limited at these stores, little is known about customers who purchase healthy, versus less healthy, foods/beverages in these venues. We conducted 661 customer intercept interviews at 105 stores (corner stores, gas marts, pharmacies, dollar stores) in Minneapolis/St. Paul, Minnesota, assessing all food and beverage items purchased. We defined three categories of "healthy" and four categories of "unhealthy" purchases. Interviews assessed customer characteristics [e.g., demographics, body mass index (BMI)]. We examined associations between healthy versus unhealthy purchase categories and customer characteristics. Overall, 11% of customers purchased ≥1 serving of healthy foods/beverages in one or more of the three categories: 8% purchased fruits/vegetables, 2% whole grains, and 1% non-/low-fat dairy. Seventy-one percent of customers purchased ≥1 serving of unhealthy foods/beverages in one or more of four categories: 46% purchased sugar-sweetened beverages, 17% savory snacks, 15% candy, and 13% sweet baked goods. Male (vs. female) customers, those with lower education levels, and those who reported shopping at the store for convenience (vs. other reasons) were less likely to purchase fruits/vegetables. Unhealthy purchases were more common among customers with a BMI ≥30 kg/m² (vs. lower BMI). Results suggest intervention opportunities to increase healthy purchases at small and non-traditional food stores, particularly interventions aimed at male residents, those with lower education levels and residents living close to the store.

  20. Current breathomics-a review on data pre-processing techniques and machine learning in metabolomics breath analysis

    DEFF Research Database (Denmark)

    Smolinska, A.; Hauschild, A. C.; Fijten, R. R. R.

    2014-01-01

    ... been extensively developed. Yet, the application of machine learning methods for fingerprinting VOC profiles in breathomics is still in its infancy. Therefore, in this paper, we describe the current state of the art in data pre-processing and multivariate analysis of breathomics data. We start with the detailed pre-processing pipelines for breathomics data obtained from gas-chromatography mass spectrometry and an ion-mobility spectrometer coupled to multi-capillary columns. The outcome of data pre-processing is a matrix containing the relative abundances of a set of VOCs for a group of patients under ...

  1. Comprehensive Evaluation of Machine Learning Techniques for Estimating the Responses of Carbon Fluxes to Climatic Forces in Different Terrestrial Ecosystems

    Directory of Open Access Journals (Sweden)

    Xianming Dou

    2018-02-01

    Full Text Available Accurately estimating the carbon budgets in terrestrial ecosystems, ranging from flux towers to regional or global scales, is particularly crucial for diagnosing past and future climate change. This research investigated the feasibility of two comparatively advanced machine learning approaches, namely the adaptive neuro-fuzzy inference system (ANFIS) and the extreme learning machine (ELM), for reproducing terrestrial carbon fluxes in five different types of ecosystems. Traditional artificial neural network (ANN) and support vector machine (SVM) models were also utilized as reliable benchmarks to measure the generalization ability of these models according to the following statistical metrics: coefficient of determination (R2), index of agreement (IA), root mean square error (RMSE), and mean absolute error (MAE). In addition, we attempted to explore the responses of all methods to their corresponding intrinsic parameters in terms of generalization performance. It was found that both the newly proposed ELM and ANFIS models achieved highly satisfactory estimates and were comparable to the ANN and SVM models. The modeling ability of each approach depended upon its respective internal parameters. For example, the SVM model with the radial basis kernel function produced the most accurate estimates and performed substantially better than the SVM models with the polynomial and sigmoid functions. Furthermore, a remarkable difference was found in the estimation accuracy among different carbon fluxes. Specifically, in the forest ecosystem (CA-Obs site), the optimal ANN model obtained slightly higher performance for gross primary productivity, with R2 = 0.9622, IA = 0.9836, RMSE = 0.6548 g C m−2 day−1, and MAE = 0.4220 g C m−2 day−1, compared with, respectively, 0.9554, 0.9845, 0.4280 g C m−2 day−1, and 0.2944 g C m−2 day−1 for ecosystem respiration and 0.8292, 0.9306, 0.6165 g C m−2 day−1, and 0.4407 g C m−2 day−1 for net ecosystem exchange
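
    The four goodness-of-fit statistics named above can be computed directly from paired observed and modelled flux series. The sketch below is a hedged illustration in Python; note that R2 is written here in the Nash-Sutcliffe style, while the paper may instead use the squared correlation coefficient, and IA follows Willmott's standard definition.

```python
# Compute R2, index of agreement (IA), RMSE and MAE for observed vs. modelled
# carbon fluxes. Input values below are illustrative placeholders.
import numpy as np

def flux_metrics(obs, pred):
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    resid = obs - pred
    rmse = np.sqrt(np.mean(resid ** 2))
    mae = np.mean(np.abs(resid))
    r2 = 1.0 - np.sum(resid ** 2) / np.sum((obs - obs.mean()) ** 2)
    # Willmott's index of agreement
    ia = 1.0 - np.sum(resid ** 2) / np.sum(
        (np.abs(pred - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
    return {"R2": r2, "IA": ia, "RMSE": rmse, "MAE": mae}

print(flux_metrics([1.2, 0.8, 1.5, 0.3], [1.1, 0.9, 1.3, 0.4]))
```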

  2. Evaluation of bent-crystal x-ray backlighting and microscopy techniques for the Sandia Z machine.

    Science.gov (United States)

    Sinars, Daniel B; Bennett, Guy R; Wenger, David F; Cuneo, Michael E; Porter, John L

    2003-07-01

    X-ray backlighting and microscopy systems for the 1-10 keV range based on spherically or toroidally bent crystals are discussed. These systems are ideal for use on the Sandia Z machine, a megajoule-class x-ray facility. Near-normal-incidence crystal microscopy systems have been shown to be more efficient than pinhole cameras with the same spatial resolution and magnification [Appl. Opt. 37, 1784 (1998)]. We show that high-resolution (≤10 μm) x-ray backlighting systems using bent crystals can be more efficient than analogous point-projection imaging systems. Examples of bent-crystal backlighting results that demonstrate 10-μm resolution over a 20-mm field of view are presented.

  3. Machine learning in image steganalysis

    CERN Document Server

    Schaathun, Hans Georg

    2012-01-01

    "The only book to look at steganalysis from the perspective of machine learning theory, and to apply the common technique of machine learning to the particular field of steganalysis; ideal for people working in both disciplines"--

  4. SU-F-I-73: Surface Dose from KV Diagnostic Beams From An On-Board Imager On a Linac Machine Using Different Imaging Techniques and Filters

    Energy Technology Data Exchange (ETDEWEB)

    Ali, I; Hossain, S; Syzek, E; Ahmad, S [University of Oklahoma Health Sciences Center, Department of Radiation Oncology, Oklahoma City, OK (United States)

    2016-06-15

    Purpose: To quantitatively investigate the surface dose deposited in patients imaged with a kV on-board imager mounted on a radiotherapy machine using different clinical imaging techniques and filters. Methods: A high-sensitivity photon diode, mounted on top of a phantom setup, is used to measure the surface dose on the central axis and at an off-axis point. The dose is measured for different imaging techniques that include: AP-Pelvis, AP-Head, AP-Abdomen, AP-Thorax, and Extremity. The dose measurements from these imaging techniques are combined with various filtering techniques that include: no filter (open field), half-fan bowtie (HF), full-fan bowtie (FF) and Cu-plate filters. The relative surface dose for different imaging and filtering techniques is evaluated quantitatively by the ratio of the dose relative to the Cu-plate filter. Results: The lowest surface dose is deposited with the Cu-plate filter. The highest surface dose results from open fields without a filter and is nearly a factor of 8–30 larger than the corresponding imaging technique with the Cu-plate filter. The AP-Abdomen technique delivers the largest surface dose, nearly 2.7 times larger than the AP-Head technique. The smallest surface dose is obtained from the Extremity imaging technique. Imaging with bowtie filters decreases the surface dose by nearly 33% in comparison with the open field. The surface doses deposited with the HF and FF bowtie filters are within a few percent of each other. The image quality of the radiographic images obtained from the different filtering techniques is similar because the Cu-plate eliminates low-energy photons. The HF and FF bowtie filters generate intensity gradients in the radiographs, which affects image quality in the different imaging techniques. Conclusion: Surface dose from kV imaging decreases significantly with the Cu-plate and bowtie filters compared to imaging without filters using open-field beams. The use of the Cu-plate filter does not affect

  5. Machine-assisted verification of latent fingerprints: first results for nondestructive contact-less optical acquisition techniques with a CWL sensor

    Science.gov (United States)

    Hildebrandt, Mario; Kiltz, Stefan; Krapyvskyy, Dmytro; Dittmann, Jana; Vielhauer, Claus; Leich, Marcus

    2011-11-01

    A machine-assisted analysis of traces from crime scenes might be possible with the advent of new high-resolution non-destructive contact-less acquisition techniques for latent fingerprints. This requires reliable techniques for the automatic extraction of fingerprint features from latent and exemplar fingerprints for matching purposes using pattern recognition approaches. Therefore, we evaluate the NIST Biometric Image Software for the feature extraction and verification of contact-lessly acquired latent fingerprints to determine potential error rates. Our exemplary test setup includes 30 latent fingerprints from 5 people in two test sets that are acquired from different surfaces using a chromatic white light sensor. The first test set includes 20 fingerprints on two different surfaces. It is used to determine the feature extraction performance. The second test set includes one latent fingerprint on 10 different surfaces and an exemplar fingerprint to determine the verification performance. This utilized sensing technique does not require a physical or chemical visibility enhancement of the fingerprint residue, thus the original trace remains unaltered for further investigations. No particular feature extraction and verification techniques have been applied to such data, yet. Hence, we see the need for appropriate algorithms that are suitable to support forensic investigations.

  6. Machining of uranium and uranium alloys

    International Nuclear Information System (INIS)

    Morris, T.O.

    1981-01-01

    Uranium and uranium alloys can be readily machined by conventional methods in the standard machine shop when proper safety and operating techniques are used. Material properties that affect machining processes and recommended machining parameters are discussed. Safety procedures and precautions necessary in machining uranium and uranium alloys are also covered. 30 figures

  7. Use of Machine Learning Techniques for Identification of Robust Teleconnections to East African Rainfall Variability in Observations and Models

    Science.gov (United States)

    Roberts, J. Brent; Robertson, Franklin R.; Funk, Chris

    2014-01-01

    Providing advance warning of East African rainfall variations is a particular focus of several groups, including those participating in the Famine Early Warning Systems Network. Both seasonal and long-term model projections of climate variability are being used to examine the societal impacts of hydrometeorological variability on seasonal to interannual and longer time scales. The NASA / USAID SERVIR project, which leverages satellite and modeling-based resources for environmental decision making in developing nations, is focusing on the evaluation of both seasonal and climate model projections to develop downscaled scenarios for use in impact modeling. The utility of these projections relies on the ability of current models to capture the embedded relationships between East African rainfall and evolving forcing within the coupled ocean-atmosphere-land climate system. Previous studies have posited relationships between variations in El Niño, the Walker circulation, Pacific decadal variability (PDV), and anthropogenic forcing. This study applies machine learning methods (e.g., clustering, probabilistic graphical models, nonlinear PCA) to observational datasets in an attempt to expose the importance of local and remote forcing mechanisms of East African rainfall variability. The ability of the NASA Goddard Earth Observing System (GEOS5) coupled model to capture the associated relationships will be evaluated using Coupled Model Intercomparison Project Phase 5 (CMIP5) simulations.
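
    The abstract lists clustering among its machine learning methods without detailing the workflow; the following is a hedged sketch of how clustering might relate a rainfall anomaly series to remote forcing indices. All series here are synthetic stand-ins, and the choice of k-means with three clusters is an assumption for illustration only.

```python
# Illustrative clustering of seasonal rainfall anomalies together with
# ENSO-like and Pacific-decadal-like indices; data are synthetic.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
n_years = 60
enso = rng.normal(0, 1, n_years)                              # stand-in Nino3.4-type index
pdv = rng.normal(0, 1, n_years)                               # stand-in Pacific decadal index
rain = 0.6 * enso - 0.3 * pdv + rng.normal(0, 0.5, n_years)   # toy OND rainfall anomaly

X = np.column_stack([enso, pdv, rain])
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
for k in range(3):
    sel = labels == k
    print(f"cluster {k}: mean ENSO = {enso[sel].mean():+.2f}, mean rain = {rain[sel].mean():+.2f}")
```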

  8. A new technique for rapid assessment of eutrophication status of coastal waters using a support vector machine

    Science.gov (United States)

    Kong, Xianyu; Che, Xiaowei; Su, Rongguo; Zhang, Chuansong; Yao, Qingzhen; Shi, Xiaoyong

    2017-05-01

    There is an urgent need to develop efficient evaluation tools that use easily measured variables to make rapid and timely eutrophication assessments, which are important for marine health management and for implementing eutrophication monitoring programs. In this study, an approach for rapidly assessing the eutrophication status of coastal waters from three easily measured parameters (turbidity, chlorophyll a and dissolved oxygen) was developed using a grid search (GS)-optimized support vector machine (SVM), with trophic index TRIX classification results as the reference. With the optimized penalty parameter C = 64 and kernel parameter γ = 1, the classification accuracy rates reached 89.3% for the training data, 88.3% for the cross-validation, and 88.5% for the validation dataset. Because the developed approach only uses three easy-to-measure variables, its application could facilitate the rapid assessment of the eutrophication status of coastal waters, resulting in potential cost savings in marine monitoring programs and assisting in the provision of timely advice for marine management.
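
    A grid-search-optimized SVM of the kind described above can be sketched as follows, assuming scikit-learn. The three input columns correspond to the named predictors (turbidity, chlorophyll a, dissolved oxygen), the labels stand in for TRIX-derived trophic classes, and both the data and the grid values are placeholders rather than the study's dataset.

```python
# Grid-search-tuned RBF-kernel SVM on three easily measured water-quality
# predictors; all values here are dummies for illustration.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import Pipeline
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 3))            # columns: turbidity, chl-a, DO (standardized dummies)
y = rng.integers(0, 3, size=300)         # stand-in TRIX trophic classes

pipe = Pipeline([("scale", StandardScaler()), ("svc", SVC(kernel="rbf"))])
grid = {"svc__C": [1, 4, 16, 64, 256], "svc__gamma": [0.25, 0.5, 1, 2]}
search = GridSearchCV(pipe, grid, cv=10).fit(X, y)
print(search.best_params_, search.best_score_)   # the paper reports C = 64, gamma = 1 as optimal
```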

  9. Prediction of Driver’s Intention of Lane Change by Augmenting Sensor Information Using Machine Learning Techniques

    Science.gov (United States)

    Kim, Il-Hwan; Bong, Jae-Hwan; Park, Jooyoung; Park, Shinsuk

    2017-01-01

    Driver assistance systems have become a major safety feature of modern passenger vehicles. The advanced driver assistance system (ADAS) is one of the active safety systems used to improve vehicle control performance and, thus, the safety of the driver and the passengers. To use the ADAS for lane change control, rapid and correct detection of the driver’s intention is essential. This study proposes a novel preprocessing algorithm for the ADAS to improve the accuracy in classifying the driver’s intention for lane change by augmenting basic measurements from conventional on-board sensors. The information on the vehicle states and the road surface condition is augmented by using artificial neural network (ANN) models, and the augmented information is fed to a support vector machine (SVM) to detect the driver’s intention with high accuracy. The feasibility of the developed algorithm was tested through driving simulator experiments. The results show that the classification accuracy for the driver’s intention can be improved by providing an SVM model with sufficient driving information augmented by using ANN models of vehicle dynamics. PMID:28604582

  10. Prediction of Driver’s Intention of Lane Change by Augmenting Sensor Information Using Machine Learning Techniques

    Directory of Open Access Journals (Sweden)

    Il-Hwan Kim

    2017-06-01

    Full Text Available Driver assistance systems have become a major safety feature of modern passenger vehicles. The advanced driver assistance system (ADAS) is one of the active safety systems used to improve vehicle control performance and, thus, the safety of the driver and the passengers. To use the ADAS for lane change control, rapid and correct detection of the driver’s intention is essential. This study proposes a novel preprocessing algorithm for the ADAS to improve the accuracy in classifying the driver’s intention for lane change by augmenting basic measurements from conventional on-board sensors. The information on the vehicle states and the road surface condition is augmented by using artificial neural network (ANN) models, and the augmented information is fed to a support vector machine (SVM) to detect the driver’s intention with high accuracy. The feasibility of the developed algorithm was tested through driving simulator experiments. The results show that the classification accuracy for the driver’s intention can be improved by providing an SVM model with sufficient driving information augmented by using ANN models of vehicle dynamics.
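
    The two-stage idea described above, a neural network that augments raw sensor measurements with estimated vehicle-state quantities and an SVM that then classifies the driver's intention, can be sketched as follows with scikit-learn. The signals, network size, and labeling rule are synthetic assumptions, not the authors' simulator data or architecture.

```python
# ANN feature augmentation followed by SVM intention classification (sketch).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
raw = rng.normal(size=(1000, 6))                 # e.g. steering angle, yaw rate, speed, ... (toy)
hidden_state = raw @ rng.normal(size=(6, 2))     # unobserved vehicle states the ANN estimates
intention = (hidden_state[:, 0] + 0.5 * raw[:, 0] > 0).astype(int)   # 1 = lane change (toy rule)

X_tr, X_te, s_tr, s_te, y_tr, y_te = train_test_split(raw, hidden_state, intention, random_state=0)

ann = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0).fit(X_tr, s_tr)
aug_tr = np.hstack([X_tr, ann.predict(X_tr)])    # raw measurements + ANN-estimated states
aug_te = np.hstack([X_te, ann.predict(X_te)])

svm = SVC(kernel="rbf").fit(aug_tr, y_tr)
print("intention classification accuracy:", svm.score(aug_te, y_te))
```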

  11. Is demography destiny? Application of machine learning techniques to accurately predict population health outcomes from a minimal demographic dataset.

    Directory of Open Access Journals (Sweden)

    Wei Luo

    Full Text Available For years, we have relied on population surveys to keep track of regional public health statistics, including the prevalence of non-communicable diseases. Because of the cost and limitations of such surveys, we often do not have up-to-date data on the health outcomes of a region. In this paper, we examined the feasibility of inferring regional health outcomes from socio-demographic data that are widely available and timely updated through national censuses and community surveys. Using data for 50 American states (excluding Washington DC) from 2007 to 2012, we constructed a machine-learning model to predict the prevalence of six non-communicable disease (NCD) outcomes (four NCDs and two major clinical risk factors), based on population socio-demographic characteristics from the American Community Survey. We found that regional prevalence estimates for non-communicable diseases can be reasonably predicted. The predictions were highly correlated with the observed data, in both the states included in the derivation model (median correlation 0.88) and those excluded from the development for use as a completely separate validation sample (median correlation 0.85), demonstrating that the model had sufficient external validity to make good predictions, based on demographics alone, for areas not included in the model development. This highlights both the utility of this sophisticated approach to model development, and the vital importance of simple socio-demographic characteristics as both indicators and determinants of chronic disease.

  12. A self-centering active probing technique for kinematic parameter identification and verification of articulated arm coordinate measuring machines

    Science.gov (United States)

    Santolaria, J.; Brau, A.; Velázquez, J.; Aguilar, J. J.

    2010-05-01

    A crucial task in the procedure of identifying the parameters of a kinematic model of an articulated arm coordinate measuring machine (AACMM) or robot arm is the process of capturing data. In this paper a data capturing method is analyzed that uses a self-centering active probe, which drastically reduces the capture time and the required number of positions of the gauge as compared with the usual standard and manufacturer methods. The mathematical models of the self-centering active probe and the AACMM are explained, as well as the mathematical model that links the AACMM global reference system to the probe reference system. We present a self-calibration method that allows us to determine a homogeneous transformation matrix relating the probe's reference system to the AACMM last reference system from the probing of a single sphere. In addition, a comparison between a self-centering passive probe and a self-centering active probe is carried out to show the advantages of the latter in the procedures of kinematic parameter identification and verification of the AACMM.

  13. Is demography destiny? Application of machine learning techniques to accurately predict population health outcomes from a minimal demographic dataset.

    Science.gov (United States)

    Luo, Wei; Nguyen, Thin; Nichols, Melanie; Tran, Truyen; Rana, Santu; Gupta, Sunil; Phung, Dinh; Venkatesh, Svetha; Allender, Steve

    2015-01-01

    For years, we have relied on population surveys to keep track of regional public health statistics, including the prevalence of non-communicable diseases. Because of the cost and limitations of such surveys, we often do not have the up-to-date data on health outcomes of a region. In this paper, we examined the feasibility of inferring regional health outcomes from socio-demographic data that are widely available and timely updated through national censuses and community surveys. Using data for 50 American states (excluding Washington DC) from 2007 to 2012, we constructed a machine-learning model to predict the prevalence of six non-communicable disease (NCD) outcomes (four NCDs and two major clinical risk factors), based on population socio-demographic characteristics from the American Community Survey. We found that regional prevalence estimates for non-communicable diseases can be reasonably predicted. The predictions were highly correlated with the observed data, in both the states included in the derivation model (median correlation 0.88) and those excluded from the development for use as a completely separated validation sample (median correlation 0.85), demonstrating that the model had sufficient external validity to make good predictions, based on demographics alone, for areas not included in the model development. This highlights both the utility of this sophisticated approach to model development, and the vital importance of simple socio-demographic characteristics as both indicators and determinants of chronic disease.
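
    The study design described above, predicting state-level prevalence from census-style socio-demographic features and checking external validity on states held out of model development, can be sketched roughly as follows. The regressor, features, and numbers are assumptions for illustration; the paper reports median correlations across several outcomes rather than the single correlation computed here.

```python
# Predict a state-level prevalence from socio-demographic predictors and
# evaluate on fully held-out states; all data are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n_states, n_features = 50, 12
demo = rng.normal(size=(n_states, n_features))                    # socio-demographic predictors
prevalence = 10 + 2 * demo[:, 0] - demo[:, 3] + rng.normal(0, 0.5, n_states)

held_out = rng.choice(n_states, size=10, replace=False)           # states excluded from development
train = np.setdiff1d(np.arange(n_states), held_out)

model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(demo[train], prevalence[train])
pred = model.predict(demo[held_out])
corr = np.corrcoef(pred, prevalence[held_out])[0, 1]
print("correlation on held-out states:", round(corr, 3))
```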

  14. Machining heavy plastic sections

    Science.gov (United States)

    Stalkup, O. M.

    1967-01-01

    A machining technique produces consistently satisfactory plane-parallel optical surfaces for plexiglass pressure windows required to support a photographic study of liquid rocket combustion processes. The surfaces are machined and polished to the required tolerances and show no degradation from stress relaxation over periods as long as 6 months.

  15. Prevalence of chronic kidney disease of non-traditional causes in patients on hemodialysis in southwest Guatemala.

    Science.gov (United States)

    Laux, Timothy S; Barnoya, Joaquin; Cipriano, Ever; Herrera, Erick; Lopez, Noemi; Polo, Vicente Sanchez; Rothstein, Marcos

    2016-04-01

    Objective: To document the prevalence of patients on hemodialysis in southwestern Guatemala who have chronic kidney disease (CKD) of non-traditional causes (CKDnT). Methods: This cross-sectional descriptive study interviewed patients on hemodialysis at the Instituto Guatemalteco de Seguridad Social about their health and occupational history. Laboratory serum, urine and vital sign data at the initiation of hemodialysis were obtained from chart reviews. Patients were classified according to whether they had hypertension or obesity or neither. The proportion of patients with and without these traditional CKD risk factors was recorded, and the association between demographic and occupational factors and a lack of traditional CKD risk factors was analyzed using multivariate logistic regression. Results: Of 242 total patients (including 171 non-diabetics) enrolled in hemodialysis in southwestern Guatemala, 45 (18.6% of total patients and 26.3% of non-diabetics) lacked traditional CKD risk factors. While an agricultural work history was common, only travel time greater than 30 minutes and age less than 50 years were significantly associated with CKD in the absence of traditional risk factors. Individuals without such risk factors lived throughout southwestern Guatemala's five departments. Conclusions: The prevalence of CKDnT appears to be much lower in this sample of patients receiving hemodialysis in southwestern Guatemala than in hospitalized patients in El Salvador. It has yet to be determined whether the prevalence is higher in the general population and in patients on peritoneal dialysis.

  16. Educational Programs for Graduate Level Learners and Professionals - National Radio Astronomy Observatory National and International Non-Traditional Exchange Program

    Science.gov (United States)

    Wingate, Lory Mitchell

    2017-01-01

    The National Radio Astronomy Observatory’s (NRAO) National and International Non-Traditional Exchange (NINE) Program teaches concepts of project management and systems engineering to chosen participants within a nine-week program held at NRAO in New Mexico. Participants are typically graduate-level students or professionals. Participation in the NINE Program is through a competitive process. The program includes a hands-on service project designed to increase the participants' knowledge of radio astronomy. The approach demonstrates clearly to the learner the positive net effects of following methodical approaches to achieving optimal science results. The NINE Program teaches participants important sustainable skills associated with constructing, operating and maintaining radio astronomy observatories. NINE Program learners are expected to return to their host sites and implement the program in their own location as a NINE Hub. This requires forming a committed relationship (through a formal Letter of Agreement), establishing a site location, and developing a program that takes into consideration the needs of the community they represent. The anticipated outcome of this program is worldwide partnerships with fast-growing radio astronomy communities designed to facilitate the exchange of staff and the mentoring of under-represented groups of learners, thereby developing a strong pipeline of global talent to construct, operate and maintain radio astronomy observatories.

  17. Forming the Priorities of Non-Traditional Renewable Energy Technologies in Ukraine from the Position of their Life Cycle the Analysis

    Directory of Open Access Journals (Sweden)

    Dyuzhev Viktor G.

    2015-03-01

    Full Text Available A mechanism is offered for forming priorities among non-traditional renewable energy technologies by considering the complex of ecological and technogenic as well as socio-economic interaction. Priority directions are identified through optimization of the expenditure chain of the cycle "output - transportation - processing - accumulation - consumption - utilization of energy resources". A comparative analysis of the typical stages of the "output - processing - consumption" cycle is conducted for traditional fuel energy resources and for those based on non-traditional renewable energy. The problems of developing a measurement system to assess the risk of using traditional and non-traditional renewable energy are also considered.

  18. Estimation of the Impacts of Non-Oil Traditional and Non-Traditional Export Sectors on Non-Oil Export of Azerbaijan

    Directory of Open Access Journals (Sweden)

    Nicat Hagverdiyev

    2016-12-01

    Full Text Available The significant share of the oil sector in Azerbaijan's export portfolio necessitates the promotion of non-oil exports. This study analyzes whether the commodities that account for the main share (more than 70%) of non-oil exports are traditional or non-traditional areas, using the so-called commodity-specific cumulative export experience function for the 1995-2015 time frame. Then, the impact of traditional and non-traditional exports on non-oil GDP is investigated employing an econometric model. The results of the study, based on 16 non-oil commodities, show that cotton, tobacco, and the production of mechanical devices are traditional sectors in non-oil exports. The estimation results of the model indicate that both traditional and non-traditional non-oil export sectors have an economically and statistically significant impact on non-oil GDP.

  19. The effect of educational attainment levels on use of non-traditional health information resources: Findings from the Canadian survey of experiences with primary health care

    Directory of Open Access Journals (Sweden)

    Sean Hardiman

    2015-12-01

    Full Text Available Canadian provincial governments have made significant investments in nurse advice telephone lines and Internet resources as non-traditional options to reduce emergency department visits and improve access to health care for the population. However, little is known about the characteristics of users of these services, and who chooses to use them first, before accessing other sources of health advice. Additionally, individuals with lower levels of education tend to be late adopters of technology and have inconsistent utilization of health services. The purpose of the study is to examine the effect of educational attainment levels on the use of non-traditional health information sources first, before other more conventional sources of health information. The study utilized Canadian Survey of Experiences with Primary Health Care (CSE-PHC) 2007-2008 survey data. Logistic regression models were constructed to examine the relationship between use of non-traditional health information sources first and educational attainment, adjusted for confounders. Relative to someone with less than secondary education, individuals with secondary education (OR = 4.30, 95% CI: 2.44–7.59) and individuals with post-secondary education (OR = 4.91, 95% CI: 2.78–8.67) had significantly greater odds of using non-traditional health information sources first. These findings suggest that educational attainment has a significant effect on the use of non-traditional health information sources first. Future providers of non-traditional health information sources, especially in the design of future eHealth tools and consideration of eHealth literacy, should consider these results in the development and implementation of their communications strategies to maximize the reach of their services.
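
    Odds ratios like those quoted above are typically obtained by exponentiating logistic regression coefficients. The sketch below, assuming statsmodels and a synthetic stand-in for the CSE-PHC survey, illustrates that calculation; variable names, prevalences, and the unadjusted model are assumptions, since the study adjusted for confounders not shown here.

```python
# Logistic regression of "used a non-traditional source first" on education,
# with exponentiated coefficients giving odds ratios. Data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
edu = rng.choice(["less_than_secondary", "secondary", "post_secondary"], size=n)
logit_p = {"less_than_secondary": -2.0, "secondary": -0.6, "post_secondary": -0.4}
p = 1 / (1 + np.exp(-np.array([logit_p[e] for e in edu])))
df = pd.DataFrame({"nontrad_first": rng.binomial(1, p), "education": edu})

model = smf.logit(
    "nontrad_first ~ C(education, Treatment(reference='less_than_secondary'))", data=df
).fit(disp=0)
print(np.exp(model.params))          # odds ratios relative to less-than-secondary education
print(np.exp(model.conf_int()))      # 95% confidence intervals for the ORs
```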

  20. Mapping Urban Areas with Integration of DMSP/OLS Nighttime Light and MODIS Data Using Machine Learning Techniques

    Directory of Open Access Journals (Sweden)

    Wenlong Jing

    2015-09-01

    Full Text Available Mapping urban areas at global and regional scales is an urgent and crucial task for detecting urbanization and human activities throughout the world and is useful for discerning the influence of urban expansion upon the ecosystem and the surrounding environment. DMSP-OLS stable nighttime lights have provided an effective way to monitor human activities on a global scale. Threshold-based algorithms have been widely used for extracting urban areas and estimating urban expansion, but the accuracy can decrease because of the empirical and subjective selection of threshold values. This paper proposes an approach for extracting urban areas by integrating DMSP-OLS stable nighttime lights and MODIS data, utilizing training sample datasets selected from DMSP-OLS and MODIS NDVI based on several simple strategies. Four classification algorithms were implemented for comparison: classification and regression trees (CART), k-nearest-neighbors (k-NN), support vector machine (SVM), and random forests (RF). A case study was carried out on the eastern part of China, covering 99 cities and 1,027,700 km². The classification results were validated using an independent land cover dataset and then compared with an existing contextual classification method. The results showed that the new method can achieve results with comparable accuracies, and is easier to implement and less sensitive to the initial thresholds than the contextual method. Among the four classifiers implemented, RF achieved the most stable results and the highest average Kappa, while CART produced highly overestimated results compared to the other three classifiers. Although k-NN and SVM tended to produce similar accuracy, less-bright areas around the urban cores seemed to be ignored when using SVM, which led to the underestimation of urban areas. Furthermore, quantity assessment showed that the results produced by k-NN, SVM, and RF exhibited better agreement in larger cities and low
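
    The four-classifier comparison described above can be sketched with scikit-learn as follows, scoring each model with Cohen's kappa against reference labels. The two features stand in for nighttime-light radiance and NDVI; the toy labeling rule and sample sizes are assumptions, not the paper's data.

```python
# Compare CART, k-NN, SVM and RF on synthetic pixel features, scored by kappa.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 2))                       # columns: DMSP-OLS radiance, MODIS NDVI (toy)
y = ((X[:, 0] > 0.3) & (X[:, 1] < 0.0)).astype(int)  # 1 = urban (toy rule)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
models = {
    "CART": DecisionTreeClassifier(random_state=0),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    "SVM": SVC(kernel="rbf"),
    "RF": RandomForestClassifier(n_estimators=200, random_state=0),
}
for name, clf in models.items():
    kappa = cohen_kappa_score(y_te, clf.fit(X_tr, y_tr).predict(X_te))
    print(f"{name}: kappa = {kappa:.3f}")
```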

  1. Machine learning techniques to select Be star candidates. An application in the OGLE-IV Gaia south ecliptic pole field

    Science.gov (United States)

    Pérez-Ortiz, M. F.; García-Varela, A.; Quiroz, A. J.; Sabogal, B. E.; Hernández, J.

    2017-09-01

    Context. Optical and infrared variability surveys produce a large number of high-quality light curves. Statistical pattern recognition methods have provided competitive solutions for variable star classification at a relatively low computational cost. In order to perform supervised classification, a set of features is proposed and used to train an automatic classification system. Quantities related to the magnitude density of the light curves and their Fourier coefficients have been chosen as features in previous studies. However, some of these features are not robust to the presence of outliers, and the calculation of Fourier coefficients is computationally expensive for large data sets. Aims: We propose and evaluate the performance of a new robust set of features using supervised classifiers in order to look for new Be star candidates in the OGLE-IV Gaia south ecliptic pole field. Methods: We calculated the proposed set of features for six types of variable stars and also for a set of Be star candidates reported in the literature. We evaluated the performance of these features using classification trees and random forests, along with the K-nearest neighbours, support vector machines, and gradient boosted trees methods. We tuned the classifiers with 10-fold cross-validation and grid search. We then validated the performance of the best classifier on a set of OGLE-IV light curves and applied it to find new Be star candidates. Results: The random forest classifier outperformed the others. By using the random forest classifier and colour criteria we found 50 Be star candidates in the direction of the Gaia south ecliptic pole field, four of which have infrared colours consistent with Herbig Ae/Be stars. Conclusions: Supervised methods are very useful for obtaining preliminary samples of variable stars extracted from large databases. As usual, the stars classified as Be star candidates must be checked for their colours and spectroscopic characteristics
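
    A rough sketch of the workflow follows: compute outlier-robust summary statistics from each light curve and tune a random forest with a 10-fold cross-validated grid search. The particular robust statistics used here (median, IQR, MAD, quartile skewness) are a guess at what "robust features" could look like, not the exact set the paper proposes, and the light curves are synthetic.

```python
# Robust light-curve features plus a grid-searched random forest (sketch).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

def robust_features(mag):
    q25, q50, q75 = np.percentile(mag, [25, 50, 75])
    return [q50, q75 - q25, np.median(np.abs(mag - q50)),      # median, IQR, MAD
            (q75 + q25 - 2 * q50) / (q75 - q25 + 1e-9)]        # quartile skewness

rng = np.random.default_rng(0)
curves = [rng.normal(15, 0.1 + 0.4 * (i % 2), size=200) for i in range(400)]
X = np.array([robust_features(c) for c in curves])
y = np.array([i % 2 for i in range(400)])                      # 1 = Be-star-like variability (toy)

grid = {"n_estimators": [100, 300], "max_depth": [None, 5, 10]}
search = GridSearchCV(RandomForestClassifier(random_state=0), grid, cv=10).fit(X, y)
print(search.best_params_, search.best_score_)
```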

  2. Meter-scale Urban Land Cover Mapping for EPA EnviroAtlas Using Machine Learning and OBIA Remote Sensing Techniques

    Science.gov (United States)

    Pilant, A. N.; Baynes, J.; Dannenberg, M.; Riegel, J.; Rudder, C.; Endres, K.

    2013-12-01

    US EPA EnviroAtlas is an online collection of tools and resources that provides geospatial data, maps, research, and analysis on the relationships between nature, people, health, and the economy (http://www.epa.gov/research/enviroatlas/index.htm). Using EnviroAtlas, you can see and explore information related to the benefits (e.g., ecosystem services) that humans receive from nature, including clean air, clean and plentiful water, natural hazard mitigation, biodiversity conservation, food, fuel, and materials, recreational opportunities, and cultural and aesthetic value. EPA developed several urban land cover maps at very high spatial resolution (one-meter pixel size) for a portion of EnviroAtlas devoted to urban studies. This urban mapping effort supported analysis of relations among land cover, human health and demographics at the US Census Block Group level. Supervised classification of 2010 USDA NAIP (National Agricultural Imagery Program) digital aerial photos produced eight-class land cover maps for several cities, including Durham, NC, Portland, ME, Tampa, FL, New Bedford, MA, Pittsburgh, PA, Portland, OR, and Milwaukee, WI. Semi-automated feature extraction methods were used to classify the NAIP imagery: genetic algorithms/machine learning, random forest, and object-based image analysis (OBIA). In this presentation we describe the image processing and fuzzy accuracy assessment methods used, and report on some sustainability and ecosystem service metrics computed using this land cover as input (e.g., carbon sequestration from USFS iTREE model; health and demographics in relation to road buffer forest width). We also discuss the land cover classification schema (a modified Anderson Level 1 after the National Land Cover Data (NLCD)), and offer some observations on lessons learned. [Figure: Meter-scale urban land cover in Portland, OR, overlaid on a NAIP aerial photo; streets, buildings and individual trees are identifiable.]

  3. Automated analysis of retinal imaging using machine learning techniques for computer vision [version 1; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Jeffrey De Fauw

    2016-07-01

    Full Text Available There are almost two million people in the United Kingdom living with sight loss, including around 360,000 people who are registered as blind or partially sighted. Sight-threatening diseases, such as diabetic retinopathy and age-related macular degeneration, have contributed to the 40% increase in outpatient attendances in the last decade but are amenable to early detection and monitoring. With early and appropriate intervention, blindness may be prevented in many cases. Ophthalmic imaging provides a way to diagnose and objectively assess the progression of a number of pathologies, including neovascular (“wet”) age-related macular degeneration (wet AMD) and diabetic retinopathy. Two methods of imaging are commonly used: digital photographs of the fundus (the ‘back’ of the eye) and Optical Coherence Tomography (OCT), a modality that uses light waves in a similar way to how ultrasound uses sound waves. Changes in population demographics and expectations and the changing pattern of chronic diseases create a rising demand for such imaging. Meanwhile, interrogation of such images is time consuming, costly, and prone to human error. The application of novel analysis methods may provide a solution to these challenges. This research will focus on applying novel machine learning algorithms to automatic analysis of both digital fundus photographs and OCT in Moorfields Eye Hospital NHS Foundation Trust patients. Through analysis of the images used in ophthalmology, along with relevant clinical and demographic information, Google DeepMind Health will investigate the feasibility of automated grading of digital fundus photographs and OCT and provide novel quantitative measures for specific disease features and for monitoring the therapeutic success.

  4. Automated analysis of retinal imaging using machine learning techniques for computer vision [version 2; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Jeffrey De Fauw

    2017-06-01

    Full Text Available There are almost two million people in the United Kingdom living with sight loss, including around 360,000 people who are registered as blind or partially sighted. Sight-threatening diseases, such as diabetic retinopathy and age-related macular degeneration, have contributed to the 40% increase in outpatient attendances in the last decade but are amenable to early detection and monitoring. With early and appropriate intervention, blindness may be prevented in many cases. Ophthalmic imaging provides a way to diagnose and objectively assess the progression of a number of pathologies, including neovascular (“wet”) age-related macular degeneration (wet AMD) and diabetic retinopathy. Two methods of imaging are commonly used: digital photographs of the fundus (the ‘back’ of the eye) and Optical Coherence Tomography (OCT), a modality that uses light waves in a similar way to how ultrasound uses sound waves. Changes in population demographics and expectations and the changing pattern of chronic diseases create a rising demand for such imaging. Meanwhile, interrogation of such images is time consuming, costly, and prone to human error. The application of novel analysis methods may provide a solution to these challenges. This research will focus on applying novel machine learning algorithms to automatic analysis of both digital fundus photographs and OCT in Moorfields Eye Hospital NHS Foundation Trust patients. Through analysis of the images used in ophthalmology, along with relevant clinical and demographic information, DeepMind Health will investigate the feasibility of automated grading of digital fundus photographs and OCT and provide novel quantitative measures for specific disease features and for monitoring the therapeutic success.

  5. A Benchmark for Banks’ Strategy in Online Presence – An Innovative Approach Based on Elements of Search Engine Optimization (SEO) and Machine Learning Techniques

    Directory of Open Access Journals (Sweden)

    Camelia Elena CIOLAC

    2011-06-01

    Full Text Available This paper aims to offer a new decision tool to assist banks in evaluating the efficiency of their Internet presence and in planning IT investments towards gaining better Internet popularity. The methodology used in this paper goes beyond simple website interface analysis and uses web crawling as a source for collecting website performance data and the web technologies and servers employed. The paper complements this technical perspective with a proposed scorecard used to assess the efforts of banks in Internet presence, reflecting the banks’ commitment to the Internet as a distribution channel. An innovative approach based on machine learning techniques, the k-nearest neighbor algorithm, is proposed by the author to estimate the Internet popularity that a bank is likely to achieve based on its size and efforts in Internet presence.
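
    The core estimator named above, k-nearest neighbours over bank size and an Internet-presence effort score, can be sketched as follows. The two input features, the choice of k, and the popularity target are illustrative assumptions rather than the paper's scorecard or crawled data.

```python
# k-NN regression of expected Internet popularity from bank size and effort score.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
size = rng.uniform(1, 100, 60)                     # bank size (arbitrary units, toy)
effort = rng.uniform(0, 10, 60)                    # Internet-presence effort score (toy)
popularity = 0.4 * size + 3.0 * effort + rng.normal(0, 5, 60)   # e.g. traffic-derived score (toy)

X = np.column_stack([size, effort])
knn = make_pipeline(StandardScaler(), KNeighborsRegressor(n_neighbors=5)).fit(X, popularity)
print(knn.predict([[55.0, 7.5]]))                  # expected popularity for a hypothetical bank
```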

  6. Cells, Agents, and Support Vectors in Interaction - Modeling Urban Sprawl based on Machine Learning and Artificial Intelligence Techniques in a Post-Industrial Region

    Science.gov (United States)

    Rienow, A.; Menz, G.

    2015-12-01

    Since the beginning of the millennium, artificial intelligence techniques such as cellular automata (CA) and multi-agent systems (MAS) have been incorporated into land-system simulations to address the complex challenges of transitions in urban areas as open, dynamic systems. The study presents a hybrid modeling approach for modeling the two antagonistic processes of urban sprawl and urban decline at once. The simulation power of support vector machines (SVM), cellular automata (CA) and multi-agent systems (MAS) is integrated into one modeling framework and applied to the largest agglomeration of Central Europe: the Ruhr. A modified version of SLEUTH (short for Slope, Land-use, Exclusion, Urban, Transport, and Hillshade) functions as the CA component. SLEUTH makes use of historic urban land-use data sets and growth coefficients for the purpose of modeling physical urban expansion. The machine learning algorithm of SVM is applied in order to enhance SLEUTH. Thus, the stochastic variability of the CA is reduced and information about the human and ecological forces driving the local suitability of urban sprawl is incorporated. Subsequently, the supported CA is coupled with the MAS ReHoSh (Residential Mobility and the Housing Market of Shrinking City Systems). The MAS models population patterns, housing prices, and housing demand in shrinking regions based on interactions between household and city agents. Semi-explicit urban weights are introduced as a possibility of modeling from and to the pixel simultaneously. Three scenarios of changing housing preferences reveal the urban development of the region in terms of quantity and location. They reflect the dissemination of sustainable thinking among stakeholders versus the steady dream of owning a house in sub- and exurban areas. Additionally, the outcomes are transferred into a digital petri dish reflecting a synthetic environment with perfect conditions of growth. Hence, the generic growth elements affecting the future
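
    The CA component of such a framework can be illustrated with a minimal neighbourhood-growth rule: a cell urbanizes with a probability that rises with the number of already-urban neighbours, weighted by a local suitability surface (standing in here for the SVM-derived suitability the abstract mentions). This is a generic sketch, not SLEUTH's actual coefficients or transition rules.

```python
# Minimal cellular-automaton urban growth step with a suitability weight.
import numpy as np

def grow(urban, suitability, p_spread=0.12, seed=0):
    rng = np.random.default_rng(seed)
    padded = np.pad(urban, 1)
    shifts = [np.roll(np.roll(padded, i, 0), j, 1) for i in (-1, 0, 1) for j in (-1, 0, 1)]
    neigh = sum(s[1:-1, 1:-1] for s in shifts) - urban        # 8-neighbour urban count
    p = p_spread * neigh * suitability                        # edge-growth probability
    return urban | (rng.random(urban.shape) < p)

rng = np.random.default_rng(1)
urban = np.zeros((50, 50), dtype=bool)
urban[25, 25] = True                                          # seed settlement
suitability = rng.uniform(0.2, 1.0, (50, 50))                 # stand-in for learned suitability
for step in range(20):
    urban = grow(urban, suitability, seed=step)
print("urban cells after 20 steps:", int(urban.sum()))
```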

  7. Manufacturing of mortars and concretes non-traditionals, by Portland cement, metakaoline and gypsum (15.05%

    Directory of Open Access Journals (Sweden)

    Talero, R.

    1999-12-01

    Full Text Available In a thorough previous research (1), it appeared that the creation, evolution and development of the values of compressive mechanical strength (CS) and flexural strength (FS), measured in 1x1x6 cm specimens of mortar type ASTM C 452-68 (2), manufactured with ordinary Portland cement P-1 (14.11% C3A) or PY-6 (0.00% C3A), metakaolin and gypsum (CaSO4∙2H2O) - or ternary cements, TC -, were similar to the ones commonly developed in mortars and concretes of OPC. This paper presents the experimental results obtained from non-traditional mortars and concretes prepared with such ternary cements, TC, with the Portland cement/metakaolin mass ratios being 80/20, 70/30 and 60/40. Finally, the behaviour of these cements against gypsum attack has also been determined, using the following parameters: increase in length (ΔL%), compressive (CS) and flexural (FS) strengths, and ultrasound energy (UE). Experimental results obtained from these non-traditional mortars and concretes show an increase in length (ΔL), in CS and FS, and in UE values when metakaolin is added.

    In a previous exhaustive investigation (1), it was found that the creation, evolution and development of the values of compressive mechanical strength (CS) and flexural strength (FS), provided by 1x1x6 cm specimens of 1:2.75 gypsum-bearing mortar of the ASTM C 452-68 type (2) that had been prepared with Ottawa sand, Portland cement P-1 (14.11% C3A) or PY-6 (0.00% C3A), metakaolin and gypsum (CaSO4∙2H2O), was similar to that commonly developed by traditional Portland cement mortars and concretes. The present work presents the experimental results obtained from non-traditional mortars and concretes prepared with these ternary cements, TC, with the tested Portland cement/metakaolin mass percentages being: 80/20, 70

  8. Dialysis enrollment patterns in Guatemala: evidence of the chronic kidney disease of non-traditional causes epidemic in Mesoamerica.

    Science.gov (United States)

    Laux, Timothy S; Barnoya, Joaquin; Guerrero, Douglas R; Rothstein, Marcos

    2015-04-14

    In western Nicaragua and El Salvador, chronic kidney disease (CKD) is highly prevalent and generally affects young, male, agricultural (usually sugar cane) workers without the established CKD risk factors. It is yet unknown whether the prevalence of this CKD of non-traditional causes (CKDnT) extends to the northernmost Central American country, Guatemala. Therefore, we sought to compare dialysis enrollment rates by region, municipality, sex, daily temperature, and agricultural production in Guatemala and assess whether there is a similar CKDnT distribution pattern as in Nicaragua and El Salvador. The National Center for Chronic Kidney Disease Treatment (Unidad Nacional de Atención al Enfermo Renal Crónico) is the largest provider of dialysis in Guatemala. We used population, Human Development Index, literacy, and agricultural databases to assess the geographic, economic, and educational correlations with the National Center for Chronic Kidney Disease Treatment's hemodialysis and peritoneal dialysis enrollment database. Enrollment rates (per 100,000 inhabitants) were compared by region and mapped for comparison to regional agricultural and daytime temperature data. The distributions of men and women enrolled in dialysis were compared by region using Fisher's exact tests. Spearman's rank correlation coefficients were calculated. Dialysis enrollment is higher in the Southwest than in the rest of the country, and Southwest enrollees are more likely (p < 0.01) to be male (57.8%) than those in the rest of the country (49.3%). Dialysis enrollment positively correlates with the Human Development Index and literacy rates. These correlations are weaker in the agricultural regions (predominantly sugar cane) of southwest Guatemala. In Guatemala, CKDnT incidence may have a similar geographic distribution as in Nicaragua and El Salvador (higher in the high-temperature and sugar-cane-growing regions). Therefore, it is likely that the CKDnT epidemic extends throughout the Mesoamerican region.

  9. Border to Beltway: A Formative Field Exchange Program between Two Community Colleges for Non-Traditional Students

    Science.gov (United States)

    Villalobos, J. I.; Bentley, C.

    2014-12-01

    Community college students account for over 40% of all undergraduates in the US, as well as the majority of minority students attending undergraduate courses. Given issues in the geosciences such as being the least diverse of all major STEM fields, an increasing number of retiring geoscientists, and projected geoscience job growth not matching the number of geoscience graduates, the geoscience community needs to look at community colleges as a solution to these issues. A key factor for students entering and excelling in the geosciences is the opportunity for formative undergraduate field experiences. Formative field experiences go beyond one-day field excursions by incorporating field projects, interactive learning, and community building between participants in regions students are unfamiliar with. Unfortunately, these types of formative experiences often require logistics and resources that are not available or known to community college faculty. In order to build a framework for implementing formative field experiences by community colleges, a two-week "field exchange" between two community colleges with different geological, social, and cultural settings was conducted. Supported with a supplemental grant from NSF, the "Border to Beltway" program provided 11 students from El Paso Community College and another 13 from Northern Virginia Community College with two one-week regional geology field trips: first, to West Texas in March 2014, and second, to the mid-Atlantic region in May 2014. Students were selected based on academic standing, non-traditional (minority, female, over 35, veteran) status, and interest in geology. Qualitative data collected from participants regarding the implementation of the field exchange include: student perception of geology before and after the exchange, challenges students faced in the field or traveling for the first time, quantity and quality of projects given, and working with others from different backgrounds. Data regarding planning

  10. Design of rotating electrical machines

    CERN Document Server

    Pyrhonen , Juha; Hrabovcova , Valeria

    2013-01-01

    In one complete volume, this essential reference presents an in-depth overview of the theoretical principles and techniques of electrical machine design. This timely new edition offers up-to-date theory and guidelines for the design of electrical machines, taking into account recent advances in permanent magnet machines as well as synchronous reluctance machines. New coverage includes: brand new material on the ecological impact of motors, covering the eco-design principles of rotating electrical machines; an expanded section on the design of permanent magnet synchronous machines, now repo

  11. The Role of Personality and Motivation in Predicting Early College Academic Success in Non-Traditional Students at a Hispanic-Serving Institution

    Science.gov (United States)

    Kaufman, James C.; Agars, Mark D.; Lopez-Wagner, Muriel C.

    2008-01-01

    Non-cognitive factors represent a chance to learn more about how to help students succeed in early college experiences. This study examined personality and motivation as predictors of first-quarter GPA in a sample of 315 non-traditional undergraduates at a Hispanic-serving institution. Our results provide support for the importance of high levels…

  12. The Effects of a Non-Traditional Strength Training Program on the Health-Related Fitness Outcomes of Youth Strength Training Participants

    Science.gov (United States)

    Cowan, Wendy; Foster, Byron

    2009-01-01

    The purpose of this study was to determine the extent to which a non-traditional strength training program will impact the health-related fitness of youth. Researchers hypothesized that the strengthening program would positively affect the fitness outcomes. Participant physical education classes incorporated strengthening exercises three days…

  13. Review of traditional and non-traditional medicinal genetic resources in the USDA, ARS, PGRCU collection evaluated for flavonoid concentrations and anthocyanin indexes

    Science.gov (United States)

    Non-traditional medicinal species include velvetleaf (Abutilon theophrasti Medik.), Desmodium species, and Teramnus labialis (L.f.) Spreng., while the traditional species is roselle (Hibiscus sabdariffa L.). There is a need to identify plant sources of flavonoids and anthocyanins since they have s...

  14. Motivations for participation in higher education: narratives of non-traditional students at Makerere University in Uganda, International Journal of Lifelong Education

    NARCIS (Netherlands)

    Tumuheki, Peace Buhwamatsiko; Zeelen, Jacobus; Openjuru, George L.

    2016-01-01

    The objective of this qualitative study was to establish motivations for participation of non-traditional students (NTS) in university education. The findings are drawn from empirical data collected from 15 unstructured in-depth interviews with NTS of the School of Computing and Informatics

  15. Comparison of a traditional and non-traditional residential care facility for persons living with dementia and the impact of the environment on occupational engagement.

    Science.gov (United States)

    Richards, Kieva; D'Cruz, Rachel; Harman, Suzanne; Stagnitti, Karen

    2015-12-01

    Dementia residential facilities can be described as traditional or non-traditional facilities. Non-traditional facilities aim to utilise principles of environmental design to create a milieu that supports persons experiencing cognitive decline. This study aimed to compare these two environments in rural Australia, and their influence on residents' occupational engagement. The Residential Environment Impact Survey (REIS) was used and consists of: a walk-through of the facility; activity observation; interviews with residents and employees. Thirteen residents were observed and four employees interviewed. Resident interviews did not occur given the population diagnosis of moderate to severe dementia. Descriptive data from the walk-through and activity observation were analysed for potential opportunities of occupational engagement. Interviews were thematically analysed to discern perception of occupational engagement of residents within their facility. Both facilities provided opportunities for occupational engagement. However, the non-traditional facility provided additional opportunities through employee interactions and features of the physical environment. Interviews revealed six themes: Comfortable environment; roles and responsibilities; getting to know the resident; more stimulation can elicit increased engagement; the home-like experience and environmental layout. These themes coupled with the features of the environment provided insight into the complexity of occupational engagement within this population. This study emphasises the influence of the physical and social environment on occupational engagement opportunities. A non-traditional dementia facility maximises these opportunities and can support development of best-practice guidelines within this population. © 2015 Occupational Therapy Australia.

  16. Supporting Online, Non-Traditional Students through the Introduction of Effective E-Learning Tools in a Pre-University Tertiary Enabling Programme

    Science.gov (United States)

    Lambrinidis, George

    2014-01-01

    The increasing number of external students enrolling at Charles Darwin University has led to the university investing in new technologies to provide better support for students studying online. Many students, however, come from non-traditional backgrounds and lack some of the skills and confidence to participate successfully in an e-learning…

  17. Prevalence of E. coli, Salmonella spp. and L. monocytogenes in non-traditional irrigation waters in the Mid-Atlantic U.S.: a conserve project

    Science.gov (United States)

    Introduction: Surface and non-traditional irrigation water (SNIW) sources can increase irrigation water supplies without consuming potable water. However, these sources must be evaluated for enteric pathogens that could adulterate crops intended for human consumption and comply with Food Safety ...

  18. Detecting Neolithic Burial Mounds from LiDAR-Derived Elevation Data Using a Multi-Scale Approach and Machine Learning Techniques

    Directory of Open Access Journals (Sweden)

    Alexandre Guyot

    2018-02-01

    Full Text Available Airborne LiDAR technology is widely used in archaeology and over the past decade has emerged as an accurate tool to describe anthropogenic landforms. Archaeological features are traditionally emphasised on a LiDAR-derived Digital Terrain Model (DTM) using multiple Visualisation Techniques (VTs), and occasionally aided by automated feature detection or classification techniques. Such an approach offers limited results when applied to heterogeneous structures (different sizes, morphologies), which is often the case for archaeological remains that have been altered throughout the ages. This study proposes to overcome these limitations by developing a multi-scale analysis of topographic position combined with supervised machine learning algorithms (Random Forest). Rather than highlighting individual topographic anomalies, the multi-scalar approach allows archaeological features to be examined not only as individual objects, but within their broader spatial context. This innovative and straightforward method provides two levels of results: a composite image of topographic surface structure and a probability map of the presence of archaeological structures. The method was developed to detect and characterise megalithic funeral structures in the region of Carnac, the Bay of Quiberon, and the Gulf of Morbihan (France), which is currently considered for inclusion on the UNESCO World Heritage List. As a result, known archaeological sites have successfully been geo-referenced with a greater accuracy than before (even when located under dense vegetation), and a ground check confirmed the identification of a previously unknown Neolithic burial mound in the commune of Carnac.
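
    As a rough illustration of the approach described above, the sketch below derives a multi-scale topographic position stack from a DTM and feeds it to a Random Forest to produce a per-pixel probability map. It is only a minimal sketch assuming hypothetical dtm and labels arrays; the paper's exact feature set, window sizes and training data are not reproduced.

    import numpy as np
    from scipy.ndimage import uniform_filter
    from sklearn.ensemble import RandomForestClassifier

    def multiscale_tpi(dtm, radii=(5, 15, 45)):
        # Topographic position at several window sizes (in pixels):
        # elevation minus the local mean elevation at each scale.
        return np.stack([dtm - uniform_filter(dtm, size=2 * r + 1) for r in radii], axis=-1)

    def train_and_map(dtm, labels):
        # dtm: 2-D elevation array; labels: same shape with 1 = mound,
        # 0 = background, -1 = unlabelled (hypothetical stand-ins for the LiDAR data).
        feats = multiscale_tpi(dtm)
        X = feats.reshape(-1, feats.shape[-1])
        y = labels.ravel()
        mask = y >= 0                      # train only on labelled pixels
        clf = RandomForestClassifier(n_estimators=200, n_jobs=-1, random_state=0)
        clf.fit(X[mask], y[mask])
        # probability of the "mound" class for every pixel -> probability map
        return clf.predict_proba(X)[:, 1].reshape(dtm.shape)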

  19. A novel hybrid model for air quality index forecasting based on two-phase decomposition technique and modified extreme learning machine.

    Science.gov (United States)

    Wang, Deyun; Wei, Shuai; Luo, Hongyuan; Yue, Chenqiang; Grunder, Olivier

    2017-02-15

    The randomness, non-stationarity and irregularity of air quality index (AQI) series make AQI forecasting difficult. To enhance forecast accuracy, a novel hybrid forecasting model combining a two-phase decomposition technique and an extreme learning machine (ELM) optimized by the differential evolution (DE) algorithm is developed for AQI forecasting in this paper. In phase I, the complementary ensemble empirical mode decomposition (CEEMD) is utilized to decompose the AQI series into a set of intrinsic mode functions (IMFs) with different frequencies; in phase II, in order to further handle the high frequency IMFs which will increase the forecast difficulty, variational mode decomposition (VMD) is employed to decompose the high frequency IMFs into a number of variational modes (VMs). Then, the ELM model optimized by the DE algorithm is applied to forecast all the IMFs and VMs. Finally, the forecast value of each high frequency IMF is obtained by adding up the forecast results of all corresponding VMs, and the forecast series of AQI is obtained by aggregating the forecast results of all IMFs. To verify and validate the proposed model, two daily AQI series from July 1, 2014 to June 30, 2016, collected from Beijing and Shanghai in China, are taken as the test cases to conduct the empirical study. The experimental results show that the proposed hybrid model based on the two-phase decomposition technique is remarkably superior to all other considered models for its higher forecast accuracy. Copyright © 2016 Elsevier B.V. All rights reserved.
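
    The core learner in the pipeline above is the extreme learning machine. The following minimal numpy sketch shows an ELM (random hidden layer, least-squares output weights) used to forecast one decomposed sub-series one step ahead; the CEEMD/VMD decomposition and the DE parameter optimization are deliberately omitted, and all array names are illustrative.

    import numpy as np

    class ELM:
        # Minimal extreme learning machine: random hidden layer, least-squares output.
        def __init__(self, n_hidden=30, seed=0):
            self.n_hidden = n_hidden
            self.rng = np.random.default_rng(seed)

        def fit(self, X, y):
            self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
            self.b = self.rng.normal(size=self.n_hidden)
            H = np.tanh(X @ self.W + self.b)      # hidden-layer activations
            self.beta = np.linalg.pinv(H) @ y     # output weights by least squares
            return self

        def predict(self, X):
            return np.tanh(X @ self.W + self.b) @ self.beta

    def forecast_subseries(series, lag=6):
        # One-step-ahead forecast of a single IMF or VM from its previous 'lag' values.
        X = np.array([series[i:i + lag] for i in range(len(series) - lag)])
        y = np.asarray(series[lag:])
        model = ELM().fit(X[:-1], y[:-1])         # hold out the most recent sample
        return model.predict(X[-1:])[0]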

  20. Accumulation of non-traditional risk factors for coronary heart disease is associated with incident coronary heart disease hospitalization and death.

    Directory of Open Access Journals (Sweden)

    Lindsay M K Wallace

    Full Text Available Assessing multiple traditional risk factors improves prediction for late-life diseases, including coronary heart disease (CHD). It appears that non-traditional risk factors can also predict risk. The objective was to investigate contributions of non-traditional risk factors to coronary heart disease risk using a deficit accumulation approach. Community-dwelling adults with no known history of CHD (n = 2195, mean age 46.9±18.7 years, 51.8% women) participated in the 1995 Nova Scotia Health Survey. Three risk factor indices were constructed to quantify the proportion of deficits present in individuals: 1) a 17-item Non-Traditional Risk Factor Index (e.g. sinusitis, arthritis); 2) a 9-item Traditional Risk Factor Index (e.g. hypertension, diabetes); and 3) a frailty index (25 items combined from the other two index measures). Ten-year risks of CHD events (defined as CHD-related hospitalization and CHD-related mortality) were evaluated. The Non-Traditional Risk Factor Index, made up of health deficits unrelated to CHD, was independently associated with incident CHD events over 10 years after controlling for age, sex, and the Traditional Risk Factor Index [adjusted (adj.) Hazard Ratio (HR) = 1.31; Confidence Interval (CI) 1.14-1.51]. When all health deficits, both those related and unrelated to CHD, were included in a frailty index, the corresponding adjusted hazard ratio was 1.61 (CI 1.40-1.85). Both traditional and non-traditional risk factor indices are independently associated with incident CHD events. CHD risk assessment may benefit from consideration of general health information as well as from traditional risk factors.
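
    The deficit accumulation idea itself is straightforward: each index is simply the proportion of listed deficits a person exhibits. The short sketch below illustrates that calculation with pandas; the item lists are placeholders for the paper's 17 non-traditional and 9 traditional deficits, and the survival analysis behind the hazard ratios is not reproduced.

    import pandas as pd

    def deficit_index(df, items):
        # Proportion of the listed deficits present (items coded 0/1),
        # ignoring missing entries for a given person.
        return df[items].mean(axis=1)

    # Placeholder item lists; the study used 17 non-traditional and 9 traditional items.
    nontrad_items = ["sinusitis", "arthritis"]
    trad_items = ["hypertension", "diabetes"]

    # df = pd.DataFrame(...)  # one row per participant, deficits coded 0/1
    # df["nontrad_index"] = deficit_index(df, nontrad_items)
    # df["trad_index"] = deficit_index(df, trad_items)
    # df["frailty_index"] = deficit_index(df, nontrad_items + trad_items)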

  1. Diagnostics and Control of Natural Gas-Fired furnaces via Flame Image Analysis using Machine Vision & Artificial Intelligence Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Shahla Keyvan

    2005-12-01

    A new approach for the detection of real-time properties of flames is used in this project to develop improved diagnostics and controls for natural gas fired furnaces. The system utilizes video images along with advanced image analysis and artificial intelligence techniques to provide virtual sensors in a stand-alone expert shell environment. One of the sensors is a flame sensor encompassing a flame detector and a flame analyzer to provide combustion status. The flame detector can identify any burner that has not fired in a multi-burner furnace. Another sensor is a 3-D temperature profiler. One important aspect of combustion control is product quality. The 3-D temperature profiler of this on-line system is intended to provide a tool for a better temperature control in a furnace to improve product quality. In summary, this on-line diagnostic and control system offers great potential for improving furnace thermal efficiency, lowering NOx and carbon monoxide emissions, and improving product quality. The system is applicable in natural gas-fired furnaces in the glass industry and reheating furnaces used in steel and forging industries.

  2. Toward Bulk Synchronous Parallel-Based Machine Learning Techniques for Anomaly Detection in High-Speed Big Data Networks

    Directory of Open Access Journals (Sweden)

    Kamran Siddique

    2017-09-01

    Full Text Available Anomaly detection systems, also known as intrusion detection systems (IDSs), continuously monitor network traffic aiming to identify malicious actions. Extensive research has been conducted to build efficient IDSs emphasizing two essential characteristics. The first is concerned with finding optimal feature selection, while the other deals with employing robust classification schemes. However, the advent of big data concepts in the anomaly detection domain and the appearance of sophisticated network attacks in the modern era require some fundamental methodological revisions to develop IDSs. Therefore, we first identify two more significant characteristics in addition to the ones mentioned above. These refer to the need for employing specialized big data processing frameworks and utilizing appropriate datasets for validating a system's performance, which is largely overlooked in existing studies. Afterwards, we set out to develop an anomaly detection system that comprehensively follows these four identified characteristics, i.e., the proposed system (i) performs feature ranking and selection using information gain and automated branch-and-bound algorithms, respectively; (ii) employs logistic regression and extreme gradient boosting techniques for classification; (iii) introduces bulk synchronous parallel processing to meet the computational requirements of high-speed big data networks; and (iv) uses the Information Security Centre of Excellence of the University of New Brunswick real-time contemporary dataset for performance evaluation. We present experimental results that verify the efficacy of the proposed system.
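
    A compact single-machine sketch of steps (i) and (ii) is shown below: mutual information is used as an information-gain-style ranking, and scikit-learn's gradient boosting stands in for extreme gradient boosting. The branch-and-bound search, the bulk synchronous parallel framework and the dataset itself are not reproduced, and the arrays X and y are assumed to hold pre-extracted flow features and attack labels.

    import numpy as np
    from sklearn.feature_selection import mutual_info_classif
    from sklearn.linear_model import LogisticRegression
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import train_test_split

    def rank_and_classify(X, y, n_features=10):
        # 1) rank features by mutual information (an information-gain analogue)
        scores = mutual_info_classif(X, y, random_state=0)
        top = np.argsort(scores)[::-1][:n_features]
        X_tr, X_te, y_tr, y_te = train_test_split(
            X[:, top], y, test_size=0.3, stratify=y, random_state=0)
        # 2) the two classifier families used in the proposed pipeline
        models = {
            "logistic_regression": LogisticRegression(max_iter=1000),
            "gradient_boosting": GradientBoostingClassifier(),
        }
        return {name: m.fit(X_tr, y_tr).score(X_te, y_te) for name, m in models.items()}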

  3. Empirical radio propagation model for DTV applied to non-homogeneous paths and different climates using machine learning techniques.

    Science.gov (United States)

    Gomes, Igor Ruiz; Gomes, Cristiane Ruiz; Gomes, Herminio Simões; Cavalcante, Gervásio Protásio Dos Santos

    2018-01-01

    The establishment and improvement of transmission systems rely on models that take into account (among other factors) the geographical features of the region, as these can lead to signal degradation. This is particularly important in Brazil, where there is a great diversity of scenery and climates. This article proposes an outdoor empirical radio propagation model for the Ultra High Frequency (UHF) band that estimates received power values and can be applied to non-homogeneous paths and different climates, the latter being an innovative feature for the UHF band. Different artificial intelligence techniques were chosen on a theoretical and computational basis and made it possible to introduce, organize and describe quantitative and qualitative data quickly and efficiently, and thus determine the received power in a wide range of settings and climates. The proposed model was applied to a city in the Amazon region with heterogeneous paths, wooded urban areas and fractions of freshwater, among other factors. Measurement campaigns were conducted to obtain signal data from two digital TV stations in the metropolitan area of the city of Belém, in the State of Pará, to design, compare and validate the model. The results are consistent, since the model shows a clear difference between the two seasons of the studied year and small RMS errors in all the cases studied.
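
    The abstract does not name the specific learner, so the sketch below uses a random forest regressor purely as a stand-in: received power (dBm) is regressed on hypothetical path descriptors such as distance and land-cover fractions along the path, plus a wet/dry season flag to capture the climate dependence, with a cross-validated RMS error as the evaluation measure.

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_score

    # X columns (hypothetical): distance_km, frac_forest, frac_water, frac_urban,
    # wet_season (0/1); y: measured received power in dBm.
    def fit_propagation_model(X, y):
        model = RandomForestRegressor(n_estimators=300, random_state=0)
        rmse = -cross_val_score(model, X, y,
                                scoring="neg_root_mean_squared_error", cv=5)
        model.fit(X, y)
        return model, rmse.mean()       # fitted model and cross-validated RMS error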

  4. Development of automatic surveillance of animal behaviour and welfare using image analysis and machine learned segmentation technique.

    Science.gov (United States)

    Nilsson, M; Herlin, A H; Ardö, H; Guzhva, O; Åström, K; Bergsten, C

    2015-11-01

    In this paper the feasibility of extracting the proportion of pigs located in different areas of a pig pen by an advanced image analysis technique is explored and discussed for possible applications. For example, pigs generally locate themselves in the wet dunging area at high ambient temperatures in order to avoid heat stress, as wetting the body surface is the major path to dissipate heat by evaporation. Thus, the portions of pigs in the dunging area and resting area, respectively, could be used as an indicator of failure to control the climate in the pig environment, as pigs are not supposed to rest in the dunging area. The computer vision methodology utilizes a learning-based segmentation approach using several features extracted from the image. The learning-based approach applied uses extended state-of-the-art features in combination with a structured prediction framework based on a logistic regression solver using elastic net regularization. In addition, the method is able to produce a probability per pixel rather than form a hard decision. This overcomes some of the limitations found in a setup using grey-scale information only. The pig pen is a difficult imaging environment because of challenging lighting conditions like shadows, poor lighting and poor contrast between pig and background. In order to test practical conditions, a pen containing nine young pigs was filmed from a top-view perspective by an Axis M3006 camera with a resolution of 640 × 480 in three 10-min sessions under different lighting conditions. The results indicate that a learning-based method improves, in comparison with greyscale methods, the possibility to reliably identify the proportions of pigs in different areas of the pen. Pigs with a changed behaviour (location) in the pen may indicate changed climate conditions. Changed individual behaviour may also indicate inferior health or acute illness.
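
    The classification stage described above can be sketched with scikit-learn's elastic-net logistic regression, which directly yields the per-pixel probabilities mentioned in the abstract. The feature matrix is a hypothetical stand-in for the paper's extended feature set; only the classifier choice (logistic regression with elastic net regularization) follows the text.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def fit_pixel_classifier(X, y):
        # X: one row of image features per pixel; y: 1 = pig, 0 = background,
        # taken from hand-labelled frames (both hypothetical here).
        clf = LogisticRegression(penalty="elasticnet", solver="saga",
                                 l1_ratio=0.5, C=1.0, max_iter=2000)
        return clf.fit(X, y)

    def pig_probability_map(clf, frame_features, height, width):
        # Per-pixel probability of "pig" rather than a hard 0/1 decision.
        proba = clf.predict_proba(frame_features)[:, 1]
        return proba.reshape(height, width)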

  5. Surgical robotics beyond enhanced dexterity instrumentation: a survey of machine learning techniques and their role in intelligent and autonomous surgical actions.

    Science.gov (United States)

    Kassahun, Yohannes; Yu, Bingbin; Tibebu, Abraham Temesgen; Stoyanov, Danail; Giannarou, Stamatia; Metzen, Jan Hendrik; Vander Poorten, Emmanuel

    2016-04-01

    Advances in technology and computing play an increasingly important role in the evolution of modern surgical techniques and paradigms. This article reviews the current role of machine learning (ML) techniques in the context of surgery with a focus on surgical robotics (SR). Also, we provide a perspective on the future possibilities for enhancing the effectiveness of procedures by integrating ML in the operating room. The review is focused on ML techniques directly applied to surgery, surgical robotics, surgical training and assessment. The widespread use of ML methods in diagnosis and medical image computing is beyond the scope of the review. Searches were performed on PubMed and IEEE Xplore using combinations of keywords: ML, surgery, robotics, surgical and medical robotics, skill learning, skill analysis and learning to perceive. Studies making use of ML methods in the context of surgery are increasingly being reported. In particular, there is an increasing interest in using ML for developing tools to understand and model surgical skill and competence or to extract surgical workflow. Many researchers are beginning to integrate this understanding into the control of recent surgical robots and devices. ML is an expanding field. It is popular as it allows efficient processing of vast amounts of data for interpretation and real-time decision making. Already widely used in imaging and diagnosis, it is believed that ML will also play an important role in surgery and interventional treatments. In particular, ML could become a game changer in the conception of cognitive surgical robots. Such robots, endowed with cognitive skills, would assist the surgical team also on a cognitive level, such as possibly lowering the mental load of the team. For example, ML could help extract surgical skill, learned through demonstration by human experts, and could transfer this to robotic skills. Such intelligent surgical assistance would significantly surpass the state of the art in surgical

  6. Patient-related quality assurance with different combinations of treatment planning systems, techniques, and machines. A multi-institutional survey

    Energy Technology Data Exchange (ETDEWEB)

    Steiniger, Beatrice; Schwedas, Michael; Weibert, Kirsten; Wiezorek, Tilo [University Hospital Jena, Department of Radiation Oncology, Jena (Germany); Berger, Rene [SRH Hospital Gera, Department of Radiation Oncology, Gera (Germany); Eilzer, Sabine [Martin-Luther-Hospital, Radiation Therapy, Berlin (Germany); Kornhuber, Christine [University Hospital Halle, Department of Radiation Oncology, Halle (Saale) (Germany); Lorenz, Kathleen [Hospital of Chemnitz, Department for Radiation Oncology, Chemnitz (Germany); Peil, Torsten [MVZ Center for Radiation Oncology Halle GmbH, Halle (Saale) (Germany); Reiffenstuhl, Carsten [University Hospital Carl Gustav Carus, Department of Radiation Oncology, Dresden (Germany); Schilz, Johannes [Helios Hospital Erfurt, Department of Radiation Oncology, Erfurt (Germany); Schroeder, Dirk [SRH Central Hospital Suhl, Department of Radiation Oncology, Suhl (Germany); Pensold, Stephanie [Community Hospital Dresden-Friedrichstadt, Department of Radiation Oncology, Dresden (Germany); Walke, Mathias [Otto-von-Guericke University Magdeburg, Department of Radiation Oncology, Magdeburg (Germany); Wolf, Ulrich [University Hospital Leipzig, Department of Radiation Oncology, Leipzig (Germany)

    2017-01-15

    This project compares the different patient-related quality assurance systems for intensity-modulated radiation therapy (IMRT) and volumetric-modulated arc therapy (VMAT) techniques currently used in the central Germany area with an independent measuring system. The participating institutions generated 21 treatment plans with different combinations of treatment planning systems (TPS) and linear accelerators (LINAC) for the QUASIMODO (Quality ASsurance of Intensity MODulated radiation Oncology) patient model. The plans were exposed to the ArcCHECK measuring system (Sun Nuclear Corporation, Melbourne, FL, USA). The dose distributions were analyzed using the corresponding software and a point dose measured at the isocenter with an ionization chamber. According to the generally used criteria of a 10% threshold, 3% difference, and 3 mm distance, the majority of plans investigated showed a gamma index exceeding 95%. Only one plan did not fulfill the criteria and three of the plans did not comply with the commonly accepted tolerance level of ±3% in point dose measurement. Using only one of the two examined methods for patient-related quality assurance is not sufficiently conclusive in all cases. (orig.) [German original, translated:] As part of this project, the different patient-related quality assurance systems for intensity-modulated radiotherapy (IMRT) and volumetric-modulated arc therapy (VMAT) currently used in the central Germany area were to be compared with an independent measuring system. The participating institutions calculated a total of 21 treatment plans with different planning systems (TPS) and linear accelerators (LINAC) for the QUASIMODO (Quality ASsurance of Intensity MODulated radiation Oncology) patient model, which were then transferred to the ArcCHECK phantom (Sun Nuclear Corporation, Melbourne, FL, USA) and irradiated. For the evaluation, both a point measurement at the isocenter and the dose distribution in the diode plane of the
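
    For orientation, the 3%/3 mm criterion mentioned above can be evaluated with a brute-force global gamma comparison; the sketch below is a minimal 2-D version for small dose grids and is not the optimised search used by commercial QA software. Grid spacing, criteria and the 10% low-dose threshold are exposed as parameters.

    import numpy as np

    def gamma_pass_rate(ref, ev, spacing_mm=1.0, dd=0.03, dta_mm=3.0, cutoff=0.10):
        # Brute-force global 2-D gamma: ref and ev are dose arrays on the same grid.
        ys, xs = np.indices(ref.shape)
        coords = np.stack([ys.ravel(), xs.ravel()], axis=1) * spacing_mm
        ev_flat = ev.ravel()
        dose_norm = dd * ref.max()              # global dose-difference criterion
        gammas = []
        for point, dose in zip(coords, ref.ravel()):
            if dose < cutoff * ref.max():       # skip low-dose points
                continue
            dist_term = ((coords - point) ** 2).sum(axis=1) / dta_mm ** 2
            dose_term = (ev_flat - dose) ** 2 / dose_norm ** 2
            gammas.append(np.sqrt((dist_term + dose_term).min()))
        return 100.0 * (np.array(gammas) <= 1.0).mean()   # percentage of points passing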

  7. Sustainable machining

    CERN Document Server

    2017-01-01

    This book provides an overview of current sustainable machining. Its chapters cover the concept in economic, social and environmental dimensions. It provides the reader with proper ways to handle several pollutants produced during the machining process. The book is useful at both undergraduate and postgraduate levels and is of interest to all those working with manufacturing and machining technology.

  8. Machine learning in healthcare informatics

    CERN Document Server

    Acharya, U; Dua, Prerna

    2014-01-01

    The book is a unique effort to represent a variety of techniques designed to represent, enhance, and empower multi-disciplinary and multi-institutional machine learning research in healthcare informatics. The book provides a unique compendium of current and emerging machine learning paradigms for healthcare informatics and reflects the diversity, complexity and the depth and breadth of this multi-disciplinary area. The integrated, panoramic view of data and machine learning techniques can provide an opportunity for novel clinical insights and discoveries.

  9. Spatial prediction of landslide susceptibility using an adaptive neuro-fuzzy inference system combined with frequency ratio, generalized additive model, and support vector machine techniques

    Science.gov (United States)

    Chen, Wei; Pourghasemi, Hamid Reza; Panahi, Mahdi; Kornejady, Aiding; Wang, Jiale; Xie, Xiaoshen; Cao, Shubo

    2017-11-01

    The spatial prediction of landslide susceptibility is an important prerequisite for the analysis of landslide hazards and risks in any area. This research uses three data mining techniques, namely an adaptive neuro-fuzzy inference system combined with frequency ratio (ANFIS-FR), a generalized additive model (GAM), and a support vector machine (SVM), for landslide susceptibility mapping in Hanyuan County, China. In the first step, in accordance with a review of the previous literature, twelve conditioning factors, including slope aspect, altitude, slope angle, topographic wetness index (TWI), plan curvature, profile curvature, distance to rivers, distance to faults, distance to roads, land use, normalized difference vegetation index (NDVI), and lithology, were selected. In the second step, a collinearity test and correlation analysis between the conditioning factors and landslides were applied. In the third step, we used three advanced methods, namely, ANFIS-FR, GAM, and SVM, for landslide susceptibility modeling. Subsequently, the accuracy of their results was validated using a receiver operating characteristic curve. The results showed that all three models have good prediction capabilities, while the SVM model has the highest prediction rate of 0.875, followed by the ANFIS-FR and GAM models with prediction rates of 0.851 and 0.846, respectively. Thus, the landslide susceptibility maps produced in the study area can be applied for management of hazards and risks in landslide-prone Hanyuan County.
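
    Of the three models, the SVM step and its ROC validation are the simplest to illustrate; a minimal scikit-learn sketch is given below, assuming a hypothetical matrix X of the twelve conditioning factors per mapped location and a binary landslide inventory y. The ANFIS-FR and GAM models and the collinearity screening are not reproduced.

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score

    def landslide_svm(X, y):
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                                  stratify=y, random_state=0)
        model = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
        model.fit(X_tr, y_tr)
        susceptibility = model.predict_proba(X_te)[:, 1]   # susceptibility scores
        return model, roc_auc_score(y_te, susceptibility)  # validated by ROC AUC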

  10. A Novel Flavour Tagging Algorithm using Machine Learning Techniques and a Precision Measurement of the $B^0 - \\overline{B^0}$ Oscillation Frequency at the LHCb Experiment

    CERN Document Server

    Kreplin, Katharina

    This thesis presents a novel flavour tagging algorithm using machine learning techniques and a precision measurement of the $B^0 -\\overline{B^0}$ oscillation frequency $\\Delta m_d$ using semileptonic $B^0$ decays. The LHC Run I data set is used which corresponds to $3 \\textrm{fb}^{-1}$ of data taken by the LHCb experiment at a center-of-mass energy of 7 TeV and 8 TeV. The performance of flavour tagging algorithms, exploiting the $b\\bar{b}$ pair production and the $b$ quark hadronization, is relatively low at the LHC due to the large amount of soft QCD background in inelastic proton-proton collisions. The standard approach is a cut-based selection of particles, whose charges are correlated to the production flavour of the $B$ meson. The novel tagging algorithm classifies the particles using an artificial neural network (ANN). It assigns higher weights to particles, which are likely to be correlated to the $b$ flavour. A second ANN combines the particles with the highest weights to derive the tagging decision. ...

  11. Identification of temporal variations in mental workload using locally-linear-embedding-based EEG feature reduction and support-vector-machine-based clustering and classification techniques.

    Science.gov (United States)

    Yin, Zhong; Zhang, Jianhua

    2014-07-01

    Identifying abnormal changes of mental workload (MWL) over time is crucial for preventing accidents due to cognitive overload and inattention of human operators in safety-critical human-machine systems. It is known that various neuroimaging technologies can be used to identify MWL variations. In order to classify MWL into a few discrete levels using representative MWL indicators and small-sized training samples, a novel EEG-based approach combining locally linear embedding (LLE), support vector clustering (SVC) and support vector data description (SVDD) techniques is proposed and evaluated using experimentally measured data. The MWL indicators from different cortical regions are first elicited by using the LLE technique. Then, the SVC approach is used to find the clusters of these MWL indicators and thereby to detect MWL variations. It is shown that the clusters can be interpreted as the binary class MWL. Furthermore, a trained binary SVDD classifier is shown to be capable of detecting slight variations of those indicators. By combining the two schemes, a SVC-SVDD framework is proposed, where the clear-cut (smaller) cluster is detected by SVC first and then a subsequent SVDD model is utilized to divide the overlapped (larger) cluster into two classes. Finally, three-class MWL levels (low, normal and high) can be identified automatically. The experimental data analysis results are compared with those of several existing methods. It has been demonstrated that the proposed framework can lead to acceptable computational accuracy and has the advantages of both unsupervised and supervised training strategies. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
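
    A rough sketch of the first and last stages is given below: LLE reduces EEG-derived features to low-dimensional MWL indicators, and a one-class SVM draws a boundary around them. The one-class SVM is only a stand-in for the SVDD step (scikit-learn has no built-in support vector clustering or SVDD), and the feature matrix is hypothetical.

    import numpy as np
    from sklearn.manifold import LocallyLinearEmbedding
    from sklearn.svm import OneClassSVM

    def reduce_and_describe(X, n_components=3):
        # X: one row per EEG time window of pre-computed features (hypothetical).
        # 1) locally linear embedding -> low-dimensional MWL indicators
        lle = LocallyLinearEmbedding(n_neighbors=12, n_components=n_components)
        Z = lle.fit_transform(X)
        # 2) a one-class boundary around the indicators, in place of the SVDD model
        boundary = OneClassSVM(kernel="rbf", nu=0.1, gamma="scale").fit(Z)
        flags = boundary.predict(Z)        # +1 inside the description, -1 outside
        return Z, boundary, flags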

  12. Creativity in Machine Learning

    OpenAIRE

    Thoma, Martin

    2016-01-01

    Recent machine learning techniques can be modified to produce creative results. Those results did not exist before; they are not a trivial combination of the data which was fed into the machine learning system. The obtained results come in multiple forms: as images, as text and as audio. This paper gives a high-level overview of how they are created and gives some examples. It is meant to be a summary of the current work and to give people who are new to machine learning some starting points.

  13. Association between proximity to and coverage of traditional fast-food restaurants and non-traditional fast-food outlets and fast-food consumption among rural adults

    OpenAIRE

    Sharkey, Joseph R; Johnson, Cassandra M; Dean, Wesley R; Horel, Scott A

    2011-01-01

    Abstract Objective The objective of this study is to examine the relationship between residential exposure to fast-food entrées, using two measures of potential spatial access: proximity (distance to the nearest location) and coverage (number of different locations), and weekly consumption of fast-food meals. Methods Traditional fast-food restaurants and non-traditional fast-food outlets, such as convenience stores, supermarkets, and grocery stores, from the 2006 Brazos Valley Food Environmen...

  14. The development of damage identification methods for buildings with image recognition and machine learning techniques utilizing aerial photographs of the 2016 Kumamoto earthquake

    Science.gov (United States)

    Shohei, N.; Nakamura, H.; Fujiwara, H.; Naoichi, M.; Hiromitsu, T.

    2017-12-01

    It is important for authorities' investigation and decision-making to obtain an overview of the damage situation immediately after an earthquake by utilizing photographs shot from an airplane. In the case of the 2016 Kumamoto earthquake, we acquired more than 1,800 orthographic projection photographs adjacent to the damaged areas. These photos were taken between April 16th and 19th by airplane; we then classified the damage to all buildings into 4 levels and organized the results as approximately 296,000 GIS records corresponding to the fundamental geospatial data published by the Geospatial Information Authority of Japan. These data were organized through the effort of hundreds of engineers. However, relying on human effort alone is not considered practical for more extensive disasters such as a Nankai Trough earthquake. We have therefore been developing an automatic damage identification method utilizing image recognition and machine learning techniques. First, we extracted training data for more than 10,000 buildings, divided evenly among the 4 damage levels. With these training data, we raster-scan the entire images over a range of scanning windows and clip patch images that represent each damage level. Using these patch images, we have been developing discriminant models in two ways. One is a model using a Support Vector Machine (SVM): a feature vector is first extracted from each patch image; with these vectors, word histograms are computed following the Bag of Visual Words (BoVW) approach, and the boundaries between damage grades are then classified by the SVM. The other is a model using a multi-layered neural network: the network is designed, patch images and visually judged damage levels are input, and the learning parameters are optimized with the error backpropagation method. Using both discriminant models, we are going to discriminate damage levels in each patch and then create the image that shows
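
    The first (BoVW plus SVM) model can be sketched in a simplified form: raw grey-level patches are quantised into a k-means visual vocabulary, each image becomes a word histogram, and a linear SVM separates the damage grades. The feature extractor is deliberately simplified, and the images and labels are hypothetical stand-ins for the aerial patch data.

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.svm import LinearSVC

    def extract_patches(img, size=8, step=8):
        # Flattened grey-level patches sampled on a regular grid.
        return np.array([img[r:r + size, c:c + size].ravel()
                         for r in range(0, img.shape[0] - size + 1, step)
                         for c in range(0, img.shape[1] - size + 1, step)], dtype=float)

    def bovw_histogram(img, vocab):
        words = vocab.predict(extract_patches(img))
        hist = np.bincount(words, minlength=vocab.n_clusters).astype(float)
        return hist / hist.sum()

    def train_damage_classifier(images, labels, n_words=64):
        # images: list of 2-D grey-scale building patches; labels: damage grade each.
        vocab = KMeans(n_clusters=n_words, n_init=10, random_state=0)
        vocab.fit(np.vstack([extract_patches(im) for im in images]))
        X = np.array([bovw_histogram(im, vocab) for im in images])
        return vocab, LinearSVC().fit(X, labels)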

  15. Management Styles and Techniques: Machines.

    Science.gov (United States)

    Steffen, Susan Swords

    1987-01-01

    Describes the management approach used to introduce new technologies in an academic library and the success of this approach. The management of change is discussed in terms of communication between management and personnel; management style; development of a system; staff development; and sensitivity to staff reactions and needs. (CLB)

  16. A Measure of Precision Regarding Procedural Tasks of Non-Traditional, Adult Learners In an Immersive Virtual Learning Environment

    OpenAIRE

    Mary Leah Caillier Coco, Ph.D.

    2015-01-01

    Immersive Virtual Learning Environments (IVLEs) are an increasingly popular tool used extensively in modern training techniques, but little is known about the learning transfer process accompanying curricula based on these methods and the ability of the technique to teach procedures. As an advantageous instrument in teaching and developing psychomotor and spatial activities, further research and assessment into the strengths and applications of IVLE in training activities is needed to evaluat...

  17. The perfect storm of information: combining traditional and non-traditional data sources for public health situational awareness during hurricane response.

    Science.gov (United States)

    Bennett, Kelly J; Olsen, Jennifer M; Harris, Sara; Mekaru, Sumiko; Livinski, Alicia A; Brownstein, John S

    2013-12-16

    Hurricane Isaac made landfall in southeastern Louisiana in late August 2012, resulting in extensive storm surge and inland flooding. As the lead federal agency responsible for medical and public health response and recovery coordination, the Department of Health and Human Services (HHS) must have situational awareness to prepare for and address state and local requests for assistance following hurricanes. Both traditional and non-traditional data have been used to improve situational awareness in fields like disease surveillance and seismology. This study investigated whether non-traditional data (i.e., tweets and news reports) fill a void in traditional data reporting during hurricane response, as well as whether non-traditional data improve the timeliness for reporting identified HHS Essential Elements of Information (EEI). HHS EEIs provided the information collection guidance, and when the information indicated there was a potential public health threat, an event was identified and categorized within the larger scope of overall Hurricane Isaac situational awareness. Tweets, news reports, press releases, and federal situation reports during Hurricane Isaac response were analyzed for information about EEIs. Data that pertained to the same EEI were linked together and given a unique event identification number to enable more detailed analysis of source content. Reports of sixteen unique events were examined for types of data sources reporting on the event and timeliness of the reports. Of these sixteen unique events identified, six were reported by only a single data source, four were reported by two data sources, four were reported by three data sources, and two were reported by four or more data sources. For five of the events where news tweets were one of multiple sources of information about an event, the tweet occurred prior to the news report, press release, local government/emergency management tweet, and federal situation report. In all circumstances where

  18. Machine Learning for Security

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Applied statistics, aka ‘Machine Learning’, offers a wealth of techniques for answering security questions. It’s a much hyped topic in the big data world, with many companies now providing machine learning as a service. This talk will demystify these techniques, explain the math, and demonstrate their application to security problems. The presentation will include how-to’s on classifying malware, looking into encrypted tunnels, and finding botnets in DNS data. About the speaker Josiah is a security researcher with HP TippingPoint DVLabs Research Group. He has over 15 years of professional software development experience. Josiah used to do AI, with work focused on graph theory, search, and deductive inference on large knowledge bases. As rules only get you so far, he moved from AI to using machine learning techniques identifying failure modes in email traffic. There followed digressions into clustered data storage and later integrated control systems. Current ...

  19. Simple machines

    CERN Document Server

    Graybill, George

    2007-01-01

    Just how simple are simple machines? With our ready-to-use resource, they are simple to teach and easy to learn! Chock-full of information and activities, we begin with a look at force, motion and work, and examples of simple machines in daily life are given. With this background, we move on to different kinds of simple machines including: Levers, Inclined Planes, Wedges, Screws, Pulleys, and Wheels and Axles. An exploration of some compound machines follows, such as the can opener. Our resource is a real time-saver, as all the reading passages and student activities are provided. Presented in s

  20. Machine Learning and Radiology

    Science.gov (United States)

    Wang, Shijun; Summers, Ronald M.

    2012-01-01

    In this paper, we give a short introduction to machine learning and survey its applications in radiology. We focused on six categories of applications in radiology: medical image segmentation, registration, computer aided detection and diagnosis, brain function or activity analysis and neurological disease diagnosis from fMR images, content-based image retrieval systems for CT or MRI images, and text analysis of radiology reports using natural language processing (NLP) and natural language understanding (NLU). This survey shows that machine learning plays a key role in many radiology applications. Machine learning identifies complex patterns automatically and helps radiologists make intelligent decisions on radiology data such as conventional radiographs, CT, MRI, and PET images and radiology reports. In many applications, the performance of machine learning-based automatic detection and diagnosis systems has shown to be comparable to that of a well-trained and experienced radiologist. Technology development in machine learning and radiology will benefit from each other in the long run. Key contributions and common characteristics of machine learning techniques in radiology are discussed. We also discuss the problem of translating machine learning applications to the radiology clinical setting, including advantages and potential barriers. PMID:22465077

  1. Machine learning and radiology.

    Science.gov (United States)

    Wang, Shijun; Summers, Ronald M

    2012-07-01

    In this paper, we give a short introduction to machine learning and survey its applications in radiology. We focused on six categories of applications in radiology: medical image segmentation, registration, computer aided detection and diagnosis, brain function or activity analysis and neurological disease diagnosis from fMR images, content-based image retrieval systems for CT or MRI images, and text analysis of radiology reports using natural language processing (NLP) and natural language understanding (NLU). This survey shows that machine learning plays a key role in many radiology applications. Machine learning identifies complex patterns automatically and helps radiologists make intelligent decisions on radiology data such as conventional radiographs, CT, MRI, and PET images and radiology reports. In many applications, the performance of machine learning-based automatic detection and diagnosis systems has shown to be comparable to that of a well-trained and experienced radiologist. Technology development in machine learning and radiology will benefit from each other in the long run. Key contributions and common characteristics of machine learning techniques in radiology are discussed. We also discuss the problem of translating machine learning applications to the radiology clinical setting, including advantages and potential barriers. Copyright © 2012. Published by Elsevier B.V.

  2. Into the Bowels of Depression: Unravelling Medical Symptoms Associated with Depression by Applying Machine-Learning Techniques to a Community Based Population Sample

    Science.gov (United States)

    Dipnall, Joanna F.

    2016-01-01

    Background Depression is commonly comorbid with many other somatic diseases and symptoms. Identification of individuals in clusters with comorbid symptoms may reveal new pathophysiological mechanisms and treatment targets. The aim of this research was to combine machine-learning (ML) algorithms with traditional regression techniques by utilising self-reported medical symptoms to identify and describe clusters of individuals with increased rates of depression from a large cross-sectional community based population epidemiological study. Methods A multi-staged methodology utilising ML and traditional statistical techniques was performed using the community based population National Health and Nutrition Examination Study (2009–2010) (N = 3,922). A Self-organised Mapping (SOM) ML algorithm, combined with hierarchical clustering, was performed to create participant clusters based on 68 medical symptoms. Binary logistic regression, controlling for sociodemographic confounders, was used to then identify the key clusters of participants with higher levels of depression (PHQ-9≥10, n = 377). Finally, a Multiple Additive Regression Tree boosted ML algorithm was run to identify the important medical symptoms for each key cluster within 17 broad categories: heart, liver, thyroid, respiratory, diabetes, arthritis, fractures and osteoporosis, skeletal pain, blood pressure, blood transfusion, cholesterol, vision, hearing, psoriasis, weight, bowels and urinary. Results Five clusters of participants, based on medical symptoms, were identified to have significantly increased rates of depression compared to the cluster with the lowest rate: odds ratios ranged from 2.24 (95% CI 1.56, 3.24) to 6.33 (95% CI 1.67, 24.02). The ML boosted regression algorithm identified three key medical condition categories as being significantly more common in these clusters: bowel, pain and urinary symptoms. Bowel-related symptoms was found to dominate the relative importance of symptoms within the
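
    The final stage, ranking which symptoms matter within a cluster, can be sketched with scikit-learn's gradient boosting as a stand-in for the Multiple Additive Regression Tree algorithm; the symptom table and cluster membership vector below are hypothetical placeholders for the NHANES-derived data.

    import numpy as np
    import pandas as pd
    from sklearn.ensemble import GradientBoostingClassifier

    def symptom_importance(symptoms, in_cluster, top=10):
        # symptoms: DataFrame with one 0/1 column per self-reported symptom;
        # in_cluster: 1 if the participant belongs to the high-depression cluster.
        model = GradientBoostingClassifier(random_state=0)
        model.fit(symptoms.values, in_cluster)
        imp = pd.Series(model.feature_importances_, index=symptoms.columns)
        return imp.sort_values(ascending=False).head(top)   # most informative symptoms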

  3. Machine musicianship

    Science.gov (United States)

    Rowe, Robert

    2002-05-01

    The training of musicians begins by teaching basic musical concepts, a collection of knowledge commonly known as musicianship. Computer programs designed to implement musical skills (e.g., to make sense of what they hear, perform music expressively, or compose convincing pieces) can similarly benefit from access to a fundamental level of musicianship. Recent research in music cognition, artificial intelligence, and music theory has produced a repertoire of techniques that can make the behavior of computer programs more musical. Many of these were presented in a recently published book/CD-ROM entitled Machine Musicianship. For use in interactive music systems, we are interested in those which are fast enough to run in real time and that need only make reference to the material as it appears in sequence. This talk will review several applications that are able to identify the tonal center of musical material during performance. Beyond this specific task, the design of real-time algorithmic listening through the concurrent operation of several connected analyzers is examined. The presentation includes discussion of a library of C++ objects that can be combined to perform interactive listening and a demonstration of their capability.

  4. Face machines

    Energy Technology Data Exchange (ETDEWEB)

    Hindle, D.

    1999-06-01

    The article surveys the latest equipment available from the world's manufacturers of a range of machines for tunnelling. These are grouped under the headings: excavators; impact hammers; road headers; and shields and tunnel boring machines. Products of thirty manufacturers are referred to. Addresses and fax numbers of companies are supplied. 5 tabs., 13 photos.

  5. Electric machine

    Science.gov (United States)

    El-Refaie, Ayman Mohamed Fawzi [Niskayuna, NY; Reddy, Patel Bhageerath [Madison, WI

    2012-07-17

    An interior permanent magnet electric machine is disclosed. The interior permanent magnet electric machine comprises a rotor comprising a plurality of radially placed magnets each having a proximal end and a distal end, wherein each magnet comprises a plurality of magnetic segments and at least one magnetic segment towards the distal end comprises a high resistivity magnetic material.

  6. Transductive and matched-pair machine learning for difficult target detection problems

    Science.gov (United States)

    Theiler, James

    2014-06-01

    This paper will describe the application of two non-traditional kinds of machine learning (transductive machine learning and the more recently proposed matched-pair machine learning) to the target detection problem. The approach combines explicit domain knowledge to model the target signal with a more agnostic machine-learning approach to characterize the background. The concept is illustrated with simulated data from an elliptically-contoured background distribution, on which a subpixel target of known spectral signature but unknown spatial extent has been implanted.

  7. Machine learning with R cookbook

    CERN Document Server

    Chiu, Yu-Wei

    2015-01-01

    If you want to learn how to use R for machine learning and gain insights from your data, then this book is ideal for you. Regardless of your level of experience, this book covers the basics of applying R to machine learning through to advanced techniques. While it is helpful if you are familiar with basic programming or machine learning concepts, you do not require prior experience to benefit from this book.

  8. Contribution of non-traditional lipid profiles to reduced glomerular filtration rate in H-type hypertension population of rural China.

    Science.gov (United States)

    Wang, Haoyu; Li, Zhao; Guo, Xiaofan; Chen, Yintao; Chen, Shuang; Tian, Yichen; Sun, Yingxian

    2018-05-01

    Despite current interest in the unfavourable impact of non-traditional lipid profiles on cardiovascular disease, information regarding their relation to reduced glomerular filtration rate (GFR) in the H-type hypertension population has not been systematically elucidated. Analyses were based upon a cross-sectional study of 3259 participants with H-type hypertension who underwent assessment of biochemical, anthropometric and blood pressure values. Reduced GFR was considered if meeting estimated GFR <60 ml/min/1.73 m2. A stepwise multivariate regression analysis indicated that non-traditional lipid parameters remained independent determinants of estimated GFR (all p < .001). In multivariable models, we observed a 50%, 51%, 31%, and 24% higher risk for decreased GFR with each SD increment in TC/HDL-C, TG/HDL-C, LDL-C/HDL-C ratios and non-HDL-C levels, respectively. The highest quartiles of the TC/HDL-C, TG/HDL-C and LDL-C/HDL-C ratios carried reduced GFR odds (confidence intervals) of 5.50 (2.50 to 12.09), 6.63 (2.58 to 17.05) and 2.22 (1.15 to 4.29), respectively. The relative independent contribution of non-traditional lipid profiles, as indexed by the TC/HDL-C, TG/HDL-C and LDL-C/HDL-C ratios and non-HDL-C, towards reduced GFR places research evidence at the heart of lipoprotein-mediated renal injury and supports a clinical and public health recommendation for reducing the burden of chronic kidney disease. KEY MESSAGES Non-traditional lipid profiles have been linked with the occurrence of cardiovascular disease, but the effect of non-traditional lipid profiles on reduced GFR risk in the H-type hypertension population has not been specifically established. A greater emphasis of this study resides in the intrinsic value of the TC/HDL-C, TG/HDL-C and LDL-C/HDL-C ratios and non-HDL-C, which integrate atherogenic and anti-atherogenic lipid molecules, to predict the risk of reduced GFR among the H-type hypertension population and provide

  9. An improved excitation control technique of three-phase induction machine operating as dual winding generator for micro-wind domestic application

    International Nuclear Information System (INIS)

    Chatterjee, Arunava; Chatterjee, Debashis

    2015-01-01

    Highlights: • A three-phase induction machine working as single phase generator is studied. • The generator is assisted by an inverter and photovoltaic panel for excitation. • Proposed control involves operating the machine as balanced two-phase generator. • Torque pulsations associated with unbalanced phase currents are minimized. • The generator can be used for grid-isolated micro-wind power generation. - Abstract: Single-phase generation schemes are widely utilized for harnessing wind power in remote and grid secluded applications. This paper presents a novel control methodology for a three-phase induction machine working as a single-phase dual winding induction generator. Three-phase induction machines providing single-phase output with proper control strategy can be beneficial in grid secluded micro-wind energy conversion systems compared to single-phase induction generators. Three-phase induction machines operating in single-phase mode are mostly excited asymmetrically to provide single-phase power leading to unbalanced current flow in the stator windings causing heating and insulation breakdown. The asymmetrical excitation also initiates torque pulsations which results in additional stress and vibration at the machine shaft and bearings degrading the machine performance. The proposed control is chiefly aimed to minimize this unbalance. The variable excitation required for the proposed generator is provided through a single-phase inverter with photovoltaic panels. The suitability for such a generator along with its control is tested with appropriate simulations and experimental results. The induction generator with the proposed control strategy is expected to be useful in remote and grid isolated households as a standalone source of single-phase electrical power

  10. The quest for knowledge transfer efficacy: blended teaching, online and in-class, with consideration of learning typologies for non-traditional and traditional students

    Directory of Open Access Journals (Sweden)

    Judy Rouse Van Doorn

    2014-04-01

    Full Text Available The pedagogical paradigm shift in higher education to 24-hour learning environments composed of teaching delivery methods of online courses, blended/hybrid formats, and face-to-face (f2f) classes is increasing access to global, lifelong learning. Online degrees have been offered at 62.4% of 2,800 colleges and universities. Students can now design flexible, life-balanced course schedules. Higher knowledge transfer rates may exist with blended course formats with online quizzes and valuable class time set for Socratic, quality discussions and creative team presentations. Research indicates that younger, traditional students exhibit heightened performance goal orientations and prefer entertaining professors who are funny, whereas non-traditional students exhibit mastery profiles and prefer courses taught by flexible, yet organized, professors. A 5-year study found that amongst 51,000 students taking both f2f and online courses, higher online failure rates occurred. Competing life roles for non-traditional students and reading and writing needs for at-risk students suggest that performance may be better if programs are started in f2f courses. Models on effective knowledge transfer consider the planning process, delivery methods, and workplace application, but a gap exists for identifying the diversity of learner needs. Higher education enrollments are being compromised with lower online retention rates. Therefore, the main purpose of this review is to delineate disparate learning styles and present a typology for the learning needs of traditional and non-traditional students. Secondly, psychology as a science may need more rigorous curriculum markers like mapping APA guidelines to knowledge objectives, critical assignments, and student learning outcomes (SLOs) (e.g. online rubric assessments for scoring APA style critical thinking essays on selected New York Times books). Efficacious knowledge transfer to diverse, 21st century students should be the

  11. The quest for knowledge transfer efficacy: blended teaching, online and in-class, with consideration of learning typologies for non-traditional and traditional students.

    Science.gov (United States)

    Van Doorn, Judy R; Van Doorn, John D

    2014-01-01

    The pedagogical paradigm shift in higher education to 24-h learning environments composed of teaching delivery methods of online courses, blended/hybrid formats, and face-to-face (f2f) classes is increasing access to global, lifelong learning. Online degrees have been offered at 62.4% of 2800 colleges and universities. Students can now design flexible, life-balanced course schedules. Higher knowledge transfer rates may exist with blended course formats with online quizzes and valuable class time set for Socratic, quality discussions and creative team presentations. Research indicates that younger, traditional students exhibit heightened performance goal orientations and prefer entertaining professors who are funny, whereas non-traditional students exhibit mastery profiles and prefer courses taught by flexible, yet organized, professors. A 5-year study found that amongst 51,000 students taking both f2f and online courses, higher online failure rates occurred. Competing life roles for non-traditional students and reading and writing needs for at-risk students suggest that performance may be better if programs are started in f2f courses. Models on effective knowledge transfer consider the planning process, delivery methods, and workplace application, but a gap exists for identifying the diversity of learner needs. Higher education enrollments are being compromised with lower online retention rates. Therefore, the main purpose of this review is to delineate disparate learning styles and present a typology for the learning needs of traditional and non-traditional students. Secondly, psychology as a science may need more rigorous curriculum markers like mapping APA guidelines to knowledge objectives, critical assignments, and student learning outcomes (SLOs) (e.g., online rubric assessments for scoring APA style critical thinking essays on selected New York Times books). Efficacious knowledge transfer to diverse, 21st century students should be the Academy's focus.

  12. The quest for knowledge transfer efficacy: blended teaching, online and in-class, with consideration of learning typologies for non-traditional and traditional students

    Science.gov (United States)

    Van Doorn, Judy R.; Van Doorn, John D.

    2014-01-01

    The pedagogical paradigm shift in higher education to 24-h learning environments composed of teaching delivery methods of online courses, blended/hybrid formats, and face-to-face (f2f) classes is increasing access to global, lifelong learning. Online degrees have been offered at 62.4% of 2800 colleges and universities. Students can now design flexible, life-balanced course schedules. Higher knowledge transfer rates may exist with blended course formats with online quizzes and valuable class time set for Socratic, quality discussions and creative team presentations. Research indicates that younger, traditional students exhibit heightened performance goal orientations and prefer entertaining professors who are funny, whereas non-traditional students exhibit mastery profiles and prefer courses taught by flexible, yet organized, professors. A 5-year study found that amongst 51,000 students taking both f2f and online courses, higher online failure rates occurred. Competing life roles for non-traditional students and reading and writing needs for at-risk students suggest that performance may be better if programs are started in f2f courses. Models on effective knowledge transfer consider the planning process, delivery methods, and workplace application, but a gap exists for identifying the diversity of learner needs. Higher education enrollments are being compromised with lower online retention rates. Therefore, the main purpose of this review is to delineate disparate learning styles and present a typology for the learning needs of traditional and non-traditional students. Secondly, psychology as a science may need more rigorous curriculum markers like mapping APA guidelines to knowledge objectives, critical assignments, and student learning outcomes (SLOs) (e.g., online rubric assessments for scoring APA style critical thinking essays on selected New York Times books). Efficacious knowledge transfer to diverse, 21st century students should be the Academy's focus. PMID

  13. Practical recommendations for the implementation of health technologies to enhance physical fitness of students in extracurricular classes during non-traditional gymnastics

    Directory of Open Access Journals (Sweden)

    E.V. Fomenko

    2014-07-01

    Full Text Available Purpose: to develop practical recommendations for extracurricular classes in non-traditional kinds of gymnastics, in order to improve the organization of physical education by teachers in schools. Material: 358 students took part in the experiment; the available literature data were analyzed. Results: a comparative analysis of students' physical fitness was carried out and practical recommendations for non-traditional gymnastics classes were formulated. A significant interest in physical education classes was observed. It was found that the main way of improving students' physical education may be fostering the need to strengthen health by means of fitness aerobics, shaping and Pilates. Conclusions: the identified problems need to be structured and appropriate solutions developed.

  14. The Machine within the Machine

    CERN Multimedia

    Katarina Anthony

    2014-01-01

    Although Virtual Machines are widespread across CERN, you probably won't have heard of them unless you work for an experiment. Virtual machines - known as VMs - allow you to create a separate machine within your own, so you can run Linux on your Mac, or Windows on your Linux - whatever combination you need.   Using a CERN Virtual Machine, Linux analysis software runs on a MacBook. When it comes to LHC data, one of the primary issues collaborations face is the diversity of computing environments among collaborators spread across the world. What if an institute cannot run the analysis software because they use different operating systems? "That's where the CernVM project comes in," says Gerardo Ganis, PH-SFT staff member and leader of the CernVM project. "We were able to respond to experimentalists' concerns by providing a virtual machine package that could be used to run experiment software. This way, no matter what hardware they have ...

  15. A meta-analysis of the effects of non-traditional teaching methods on the critical thinking abilities of nursing students.

    Science.gov (United States)

    Lee, JuHee; Lee, Yoonju; Gong, SaeLom; Bae, Juyeon; Choi, Moonki

    2016-09-15

    A scientific framework is important in designing curricula and evaluating students in the field of education and clinical practice. The purpose of this study was to examine the effectiveness of non-traditional educational methods on critical thinking skills. A systematic review approach was applied. Studies published in peer-reviewed journals from January 2001 to December 2014 were searched using electronic databases and major education journals. A meta-analysis was performed using Review Manager 5.2. Across the included studies, the California Critical Thinking Dispositions Inventory (CCTDI) and the California Critical Thinking Skills Test (CCTST) were used to assess the effectiveness of critical thinking in the meta-analysis. The eight CCTDI datasets showed that non-traditional teaching methods (i.e., no lectures) were more effective compared to control groups (standardized mean difference [SMD]: 0.42, 95% confidence interval [CI]: 0.26-0.57, p < 0.05). The teaching and learning methods in these studies also had significantly greater effects when compared to the control groups (SMD: 0.29, 95% CI: 0.10-0.48, p = 0.003). This research showed that new teaching and learning methods designed to improve critical thinking were generally effective at enhancing critical thinking dispositions.
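
    The pooled effects above are standardized mean differences with 95% confidence intervals. As a hedged illustration of how a single study's SMD and CI can be computed from group summary statistics (the review itself used Review Manager 5.2; the numbers below are invented, not taken from the included studies), a minimal Python sketch:

```python
import math

def smd_with_ci(m1, s1, n1, m2, s2, n2):
    """Cohen's d (standardized mean difference) with an approximate 95% CI.

    m*, s*, n* are the mean, standard deviation and sample size of the
    intervention (1) and control (2) groups. Values used below are
    illustrative, not taken from the reviewed studies.
    """
    pooled_sd = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / pooled_sd
    # Large-sample approximation to the sampling variance of d.
    var_d = (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))
    half_width = 1.96 * math.sqrt(var_d)
    return d, (d - half_width, d + half_width)

# Hypothetical example: critical-thinking scores in two groups of 40 students.
print(smd_with_ci(m1=305.0, s1=28.0, n1=40, m2=293.0, s2=30.0, n2=40))
```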

  16. Comments on "Gender, ethnic, age, and relationship differences in non-traditional college student alcohol consumption: a tri-ethnic study".

    Science.gov (United States)

    Stolberg, Victor B

    2012-01-01

    The purpose of these comments is to serve as a reaction to an article by Stephanie Babb, Cynthia Stewart, and Christine Bachman of the University of Houston-Downtown. The article is ambitiously titled "Gender, Ethnic, Age, and Relationship Differences in Non-Traditional College Students' Alcohol Consumption: A Tri-Ethnic Study" and is published in this issue of the Journal of Ethnicity in Substance Abuse. These comments are not intended to be a definitive response to all of the possible points raised by the authors of the article; rather they are reflective of the personal views of an addiction professional who has been active in the field for several years, particularly involved with efforts directed at substance use by non-traditional college students, and who has published previously on related topics. It is only possible to react to a few specific issues raised by the article; another commentator or a peer reviewer would probably address a myriad other areas. Indeed, several other topics of concern could have been addressed, but I felt it prudent and hopefully more productive to keep my comments more narrowly focused on some of the matters that seemed more pressing.

  17. Synthesis, Structure, and Magnetism of Tris(amide) {Ln[N(SiMe3)2]3}1- Complexes of the Non-Traditional +2 Lanthanide Ions.

    Science.gov (United States)

    Ryan, Austin Jack; Darago, Lucy E; Balasubramini, Sree Ganesh; Chen, Guo P; Ziller, Joseph W; Furche, Filipp; Long, Jeffrey R; Evans, William J

    2018-02-28

    A new series of Ln2+ complexes has been synthesized that overturns two previous generalizations in rare-earth metal reduction chemistry: that amide ligands do not form isolable complexes of the highly-reducing non-traditional Ln2+ ions and that yttrium is a good model for the late lanthanides in these reductive reactions. Reduction of Ln(NR2)3 (R = SiMe3) complexes in THF under Ar with M = K or Rb in the presence of 2.2.2-cryptand (crypt) forms crystallographically-characterizable [M(crypt)][Ln(NR2)3] complexes not only for the traditional Tm2+ ion and the configurational crossover ions, Nd2+ and Dy2+, but also for the non-traditional Gd2+, Tb2+, Ho2+, and Er2+ ions. Crystallographic data as well as UV-visible, magnetic susceptibility, and density functional theory studies are consistent with the accessibility of 4fn5d1 configurations for Ln2+ ions in this tris(silylamide) ligand environment. The Dy2+ complex, [K(crypt)][Dy(NR2)3], has a higher magnetic moment than previously observed for any monometallic complex: 11.67 µB. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Monel Machining

    Science.gov (United States)

    1983-01-01

    Castle Industries, Inc. is a small machine shop manufacturing replacement plumbing repair parts, such as faucet, tub and ballcock seats. Therese Castley, president of Castle, decided to introduce Monel because it offered a chance to improve competitiveness and expand the product line. Before expanding, Castley sought NERAC assistance on Monel technology. NERAC (New England Research Application Center) provided an information package which proved very helpful. The NASA database was included in NERAC's search and yielded a wealth of information on machining Monel.

  19. Computer vision and machine learning for archaeology

    NARCIS (Netherlands)

    van der Maaten, L.J.P.; Boon, P.; Lange, G.; Paijmans, J.J.; Postma, E.

    2006-01-01

    Until now, computer vision and machine learning techniques barely contributed to the archaeological domain. The use of these techniques can support archaeologists in their assessment and classification of archaeological finds. The paper illustrates the use of computer vision techniques for

  20. Electrical machines diagnosis

    CERN Document Server

    Trigeassou, Jean-Claude

    2013-01-01

    Monitoring and diagnosis of electrical machine faults is a scientific and economic issue which is motivated by objectives for reliability and serviceability in electrical drives.This book provides a survey of the techniques used to detect the faults occurring in electrical drives: electrical, thermal and mechanical faults of the electrical machine, faults of the static converter and faults of the energy storage unit.Diagnosis of faults occurring in electrical drives is an essential part of a global monitoring system used to improve reliability and serviceability. This diagnosis is perf

  1. NICeSim: an open-source simulator based on machine learning techniques to support medical research on prenatal and perinatal care decision making.

    Science.gov (United States)

    Cerqueira, Fabio Ribeiro; Ferreira, Tiago Geraldo; de Paiva Oliveira, Alcione; Augusto, Douglas Adriano; Krempser, Eduardo; Corrêa Barbosa, Helio José; do Carmo Castro Franceschini, Sylvia; de Freitas, Brunnella Alcantara Chagas; Gomes, Andreia Patricia; Siqueira-Batista, Rodrigo

    2014-11-01

    This paper describes NICeSim, an open-source simulator that uses machine learning (ML) techniques to aid health professionals to better understand the treatment and prognosis of premature newborns. The application was developed and tested using data collected in a Brazilian hospital. The available data were used to feed an ML pipeline that was designed to create a simulator capable of predicting the outcome (death probability) for newborns admitted to neonatal intensive care units. However, unlike previous scoring systems, our computational tool is not intended to be used at the patient's bedside, although it is possible. Our primary goal is to deliver a computational system to aid medical research in understanding the correlation of key variables with the studied outcome so that new standards can be established for future clinical decisions. In the implemented simulation environment, the values of key attributes can be changed using a user-friendly interface, where the impact of each change on the outcome is immediately reported, allowing a quantitative analysis, in addition to a qualitative investigation, and delivering a totally interactive computational tool that facilitates hypothesis construction and testing. Our statistical experiments showed that the resulting model for death prediction could achieve an accuracy of 86.7% and an area under the receiver operating characteristic curve of 0.84 for the positive class. Using this model, three physicians and a neonatal nutritionist performed simulations with key variables correlated with chance of death. The results indicated important tendencies for the effect of each variable and the combination of variables on prognosis. We could also observe values of gestational age and birth weight for which a low Apgar score and the occurrence of respiratory distress syndrome (RDS) could be less or more severe. For instance, we have noticed that for a newborn with 2000 g or more, the occurrence of RDS is far less problematic
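
    The abstract reports an ML model evaluated by accuracy and area under the ROC curve for a binary death outcome. The sketch below is a minimal, hypothetical version of that kind of train-and-evaluate step in Python with scikit-learn; the predictor names and data are synthetic placeholders, not the NICeSim variables or results.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-ins for clinical predictors (gestational age, birth weight,
# 5-minute Apgar score, RDS flag); placeholders only, not the study's data.
n = 1000
X = np.column_stack([
    rng.normal(32, 4, n),      # gestational age (weeks)
    rng.normal(1800, 600, n),  # birth weight (g)
    rng.integers(0, 11, n),    # 5-minute Apgar score
    rng.integers(0, 2, n),     # respiratory distress syndrome (0/1)
])
# Synthetic outcome loosely related to the predictors.
logit = -0.002 * X[:, 1] + 1.5 * X[:, 3] - 0.2 * X[:, 2] + 2.0
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

pred = model.predict(X_te)
prob = model.predict_proba(X_te)[:, 1]
print("accuracy:", accuracy_score(y_te, pred))
print("ROC AUC :", roc_auc_score(y_te, prob))
```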

  2. Machine learning in virtual screening.

    Science.gov (United States)

    Melville, James L; Burke, Edmund K; Hirst, Jonathan D

    2009-05-01

    In this review, we highlight recent applications of machine learning to virtual screening, focusing on the use of supervised techniques to train statistical learning algorithms to prioritize databases of molecules as active against a particular protein target. Both ligand-based similarity searching and structure-based docking have benefited from machine learning algorithms, including naïve Bayesian classifiers, support vector machines, neural networks, and decision trees, as well as more traditional regression techniques. Effective application of these methodologies requires an appreciation of data preparation, validation, optimization, and search methodologies, and we also survey developments in these areas.
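
    In ligand-based virtual screening of the kind surveyed here, a classifier trained on known actives and inactives is used to rank a compound database so the most promising molecules are tested first. A minimal, hedged sketch with a naïve Bayesian classifier on toy binary fingerprints (all data synthetic, not from any real screen):

```python
import numpy as np
from sklearn.naive_bayes import BernoulliNB

rng = np.random.default_rng(1)

# Toy binary "fingerprints" for a training set of known actives/inactives.
X_train = rng.integers(0, 2, size=(200, 64))
y_train = rng.integers(0, 2, size=200)          # 1 = active against the target

# A database of candidate molecules to prioritize.
X_db = rng.integers(0, 2, size=(1000, 64))

clf = BernoulliNB().fit(X_train, y_train)
scores = clf.predict_proba(X_db)[:, 1]          # predicted probability of activity

# Rank the database so the most promising candidates are screened first.
ranked = np.argsort(scores)[::-1]
print("top 10 candidate indices:", ranked[:10])
```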

  3. Dynamics of cyclic machines

    CERN Document Server

    Vulfson, Iosif

    2015-01-01

    This book focuses on modern methods of oscillation analysis in machines, including cyclic action mechanisms (linkages, cams, steppers, etc.). It presents schematization techniques and mathematical descriptions of oscillating systems, taking into account the variability of the parameters and nonlinearities, engineering evaluations of dynamic errors, and oscillation suppression methods. The majority of the book is devoted to the development of new methods of dynamic analysis and synthesis for cyclic machines that form regular oscillatory systems with multiple duplicate modules.  There are also sections examining aspects of general engineering interest (nonlinear dissipative forces, systems with non-stationary constraints, impacts and pseudo-impacts in clearances, etc.)  The examples in the book are based on the widely used results of theoretical and experimental studies as well as engineering calculations carried out in relation to machines used in the textile, light, polygraphic and other industries. Particu...

  4. Machines and Metaphors

    Directory of Open Access Journals (Sweden)

    Ángel Martínez García-Posada

    2016-10-01

    Full Text Available The edition La ley del reloj. Arquitectura, máquinas y cultura moderna (Cátedra, Madrid, 2016 registers the useful paradox of the analogy between architecture and technique. Its author, the architect Eduardo Prieto, also a philosopher, professor and writer, acknowledges the obvious distance from machines to buildings, so great that it can only be solved using strange comparisons, since architecture does not move nor are the machines habitable, however throughout the book, from the origin of the metaphor of the machine, with clarity in his essay and enlightening erudition, he points out with certainty some concomitances of high interest, drawing throughout history a beautiful cartography of the fruitful encounter between organics and mechanics.

  5. Automatic classification of written descriptions by healthy adults: An overview of the application of natural language processing and machine learning techniques to clinical discourse analysis.

    Science.gov (United States)

    Toledo, Cíntia Matsuda; Cunha, Andre; Scarton, Carolina; Aluísio, Sandra

    2014-01-01

    Discourse production is an important aspect in the evaluation of brain-injured individuals. We believe that studies comparing the performance of brain-injured subjects with that of healthy controls must use groups with compatible education. A pioneering application of machine learning methods using Brazilian Portuguese for clinical purposes is described, highlighting education as an important variable in the Brazilian scenario. The aims were to describe how to: (i) develop machine learning classifiers using features generated by natural language processing tools to distinguish descriptions produced by healthy individuals into classes based on their years of education; and (ii) automatically identify the features that best distinguish the groups. The approach proposed here extracts linguistic features automatically from the written descriptions with the aid of two Natural Language Processing tools: Coh-Metrix-Port and AIC. It also includes nine task-specific features (three new ones, two extracted manually, besides description time; type of scene described - simple or complex; presentation order - which type of picture was described first; and age). In this study, the descriptions by 144 of the subjects studied in Toledo 18 were used, which included 200 healthy Brazilians of both genders. A Support Vector Machine (SVM) with a radial basis function (RBF) kernel is the most recommended approach for the binary classification of our data, classifying three of the four initial classes. CfsSubsetEval (CFS) is a strong candidate to replace manual feature selection methods.
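
    The classification step described above (an RBF-kernel SVM following automatic feature selection) can be sketched roughly as follows; the generic filter-style selector stands in for CfsSubsetEval, and the feature matrix is a synthetic placeholder rather than the Coh-Metrix-Port/AIC output:

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(2)

# Placeholder linguistic features for 144 written descriptions and a binary
# education-level label; synthetic, not the study's data.
X = rng.normal(size=(144, 40))
y = rng.integers(0, 2, size=144)

# RBF-kernel SVM preceded by a simple filter-style feature selector.
model = make_pipeline(
    StandardScaler(),
    SelectKBest(mutual_info_classif, k=15),
    SVC(kernel="rbf", C=1.0, gamma="scale"),
)
print("cross-validated accuracy:", cross_val_score(model, X, y, cv=5).mean())
```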

  6. Automatic classification of written descriptions by healthy adults: An overview of the application of natural language processing and machine learning techniques to clinical discourse analysis

    Directory of Open Access Journals (Sweden)

    Cíntia Matsuda Toledo

    Full Text Available Discourse production is an important aspect in the evaluation of brain-injured individuals. We believe that studies comparing the performance of brain-injured subjects with that of healthy controls must use groups with compatible education. A pioneering application of machine learning methods using Brazilian Portuguese for clinical purposes is described, highlighting education as an important variable in the Brazilian scenario. OBJECTIVE: The aims were to describe how to: (i) develop machine learning classifiers using features generated by natural language processing tools to distinguish descriptions produced by healthy individuals into classes based on their years of education; and (ii) automatically identify the features that best distinguish the groups. METHODS: The approach proposed here extracts linguistic features automatically from the written descriptions with the aid of two Natural Language Processing tools: Coh-Metrix-Port and AIC. It also includes nine task-specific features (three new ones, two extracted manually, besides description time; type of scene described - simple or complex; presentation order - which type of picture was described first; and age). In this study, the descriptions by 144 of the subjects studied in Toledo18 were used, which included 200 healthy Brazilians of both genders. RESULTS AND CONCLUSION: A Support Vector Machine (SVM) with a radial basis function (RBF) kernel is the most recommended approach for the binary classification of our data, classifying three of the four initial classes. CfsSubsetEval (CFS) is a strong candidate to replace manual feature selection methods.

  7. Machine Protection

    CERN Document Server

    Zerlauth, Markus; Wenninger, Jörg

    2012-01-01

    The present architecture of the machine protection system is recalled and the performance of the associated systems during the 2011 run is briefly summarized. An analysis of the causes of beam dumps as well as an assessment of the dependability of the machine protection systems (MPS) itself is presented. Emphasis will be given to events that risked exposing parts of the machine to damage. Further improvements and mitigations of potential holes in the protection systems will be evaluated along with their impact on the 2012 run. The role of rMPP during the various operational phases (commissioning, intensity ramp up, MDs...) will be discussed along with a proposal for the intensity ramp up for the start of beam operation in 2012.

  8. Technologies for the Synthesis of mRNA-Encoding Libraries and Discovery of Bioactive Natural Product-Inspired Non-Traditional Macrocyclic Peptides

    Directory of Open Access Journals (Sweden)

    Hiroaki Suga

    2013-03-01

    Full Text Available In this review, we discuss emerging technologies for drug discovery, which yield novel molecular scaffolds based on natural product-inspired non-traditional peptides expressed using the translation machinery. Unlike natural products, these technologies allow for constructing mRNA-encoding libraries of macrocyclic peptides containing non-canonical sidechains and N-methyl-modified backbones. The complexity of sequence space in such libraries reaches as high as a trillion (>10^12), affording initial hits of high affinity ligands against protein targets. Although this article comprehensively covers several related technologies, we discuss in greater detail the technical development and advantages of the Random non-standard Peptide Integration Discovery (RaPID) system, including the recent identification of inhibitors against various therapeutic targets.
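
    The quoted library complexity follows from simple combinatorics: with B allowed building blocks at each of L randomized positions, the library contains B^L sequences. A tiny illustrative calculation (generic numbers, not the exact RaPID library design):

```python
# Library diversity for a macrocyclic peptide with L randomized positions,
# each drawn from B building blocks (canonical plus non-canonical monomers).
def library_size(num_building_blocks: int, randomized_positions: int) -> int:
    return num_building_blocks ** randomized_positions

# e.g., 12 randomized positions with 10 allowed monomers already reaches 10^12.
print(library_size(10, 12))   # 1000000000000
```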

  9. Tuition fees and funding – barriers for non-traditional students? First results from the international research project Opening Universities for Lifelong Learning (OPULL)

    DEFF Research Database (Denmark)

    Moissidis, Sonja; Schwarz, Jochen; Yndigegn, Carsten

    2011-01-01

    Project OPULL – Opening Universities for Lifelong Learning – is undertaking research into ways of opening up higher education to vocationally qualified and experienced target groups in four European countries. Open university models in Germany, Denmark, Finland and the United Kingdom are being investigated in three research phases between 2009 and 2012 with the aim of identifying critical success factors for building open universities for Europe. This paper presents the first phase, in which educational systems in the participant countries have been mapped and interviews with lifelong learning experts undertaken. The current situation and perspectives in each country together with critical issues on how fees and funding influence higher education access for non-traditional students in these countries are discussed and explored through the interview evidence. The initial findings of the first...

  10. The 21.5-kDa isoform of myelin basic protein has a non-traditional PY-nuclear-localization signal

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Graham S.T.; Seymour, Lauren V. [Molecular and Cellular Biology, University of Guelph, Guelph, Ontario (Canada); Boggs, Joan M. [Molecular Structure and Function, Research Institute, Hospital for Sick Children, and Laboratory Medicine and Pathobiology, University of Toronto, Toronto, Ontario (Canada); Harauz, George, E-mail: gharauz@uoguelph.ca [Molecular and Cellular Biology, University of Guelph, Guelph, Ontario (Canada)

    2012-06-15

    Highlights: ► Full-length 21.5-kDa MBP isoform is translocated to the nucleus. ► We hypothesized that the exon-II-encoded sequence contained the NLS. ► We mutated this sequence in RFP-tagged constructs and transfected N19-cells. ► Abolition of two key positively-charged residues resulted in loss of nuclear-trafficking. ► The 21.5-kDa isoform of classic MBP contains a non-traditional PY-NLS. -- Abstract: The predominant 18.5-kDa classic myelin basic protein (MBP) is mainly responsible for compaction of the myelin sheath in the central nervous system, but is multifunctional, having numerous interactions with Ca2+-calmodulin, actin, tubulin, and SH3-domains, and can tether these proteins to a lipid membrane in vitro. The full-length 21.5-kDa MBP isoform has an additional 26 residues encoded by exon-II of the classic gene, which causes it to be trafficked to the nucleus of oligodendrocytes (OLGs). We have performed site-directed mutagenesis of selected residues within this segment in red fluorescent protein (RFP)-tagged constructs, which were then transfected into the immortalized N19-OLG cell line to view protein localization using epifluorescence microscopy. We found that 21.5-kDa MBP contains two non-traditional PY-nuclear-localization signals, and that arginine and lysine residues within these motifs were involved in subcellular trafficking of this protein to the nucleus, where it may have functional roles during myelinogenesis.

  11. Opportunities for development of non-traditional hydrocarbon resources in the Timan-North Ural region, taking into account ecosystem services

    Directory of Open Access Journals (Sweden)

    I. G. Burtseva

    2017-12-01

    Full Text Available The authors formulate the definition of non-traditional resources from geological-genetic, technological and economic viewpoints. The authors present a detailed assessment of the resource potential of non-traditional hydrocarbon raw material in the Timan-Severouralsk region, including hydrocarbons in the deposits of the domanic type, methane of coal seams, liquid and gaseous hydrocarbons potentially extracted from black, brown coal and combustible shales. The authors also show the main directions of industrial use of coal and oil shales. The assessment of the resource potential of hydrocarbon raw materials in the deposits of the domanic type varies widely; the recoverable resources may amount to about 1 billion tons. Bituminous coals with a high volatile yield have the highest degree of conversion to liquid hydrocarbons, and brown and black coals with a low degree of metamorphism usually serve for the production of combustible gas and primary resin. The paper describes the option of developing oil shale deposits as a possible investment project. The determined components and overall values of the economic effect from the implementation of the projects under consideration allow us to estimate that the payback period of investments does not exceed seven years. There is also a social effect: the creation of an additional 550 jobs in the operation of the quarry and about 700 jobs in the enrichment and processing of oil shales. The estimated annual volume of output is 25–30 billion rubles, and the volume of tax revenues is up to 100 billion rubles. The authors evaluated ecosystem services in the territories of potential industrial development of coal and oil shale deposits; identified the beneficiaries of the benefits from the use of environmental services and the possibility of calculating payments.

  12. Postgraduates' perceptions of preparedness for work as a doctor and making future career decisions: support for rural, non-traditional medical schools.

    Science.gov (United States)

    Eley, D S

    2010-08-01

    The intern year is a critical time for making career decisions and gaining confidence in clinical skills, communication and teamwork practices; this justifies an interest in junior doctors' perceptions of their level of preparedness for hospital work. This study explored Australian junior doctors' perspectives regarding the transition from student to doctor roles, their preparation as medical undergraduates within either traditional metropolitan schools or smaller, outer metropolitan-based (rural) programs such as Rural Clinical Schools (RCS), and the educational environment they experienced in their internship. A qualitative cross-sectional design used semi-structured interviews with postgraduate year one and two junior doctors (9 females and 11 males) within teaching hospitals in Queensland Australia. Interview questions focussed on four major content areas: preparedness for hospital work, undergraduate training, building confidence and career advice. Data were analyzed using a framework method to identify and explore major themes. Junior doctors who spent undergraduate years training at smaller, non-traditional medical schools felt more confident and better prepared at internship. More hands-on experience as students, more patient contact and a better grounding in basic sciences were felt by interns to be ideal for building confidence. Junior doctors perceived a general lack of career guidance in both undergraduate and postgraduate teaching environments to help them with the transition from the student to junior doctor roles. Findings are congruent with studies that have confirmed student opinion on the higher quality of undergraduate medical training outside a traditional metropolitan-based program, such as a RCS. The serious shortage of doctors in rural and remote Australia makes these findings particularly relevant. It will be important to gain a better understanding of how smaller non-traditional medical programs build confidence and feelings of work

  13. Association between proximity to and coverage of traditional fast-food restaurants and non-traditional fast-food outlets and fast-food consumption among rural adults.

    Science.gov (United States)

    Sharkey, Joseph R; Johnson, Cassandra M; Dean, Wesley R; Horel, Scott A

    2011-05-20

    The objective of this study is to examine the relationship between residential exposure to fast-food entrées, using two measures of potential spatial access: proximity (distance to the nearest location) and coverage (number of different locations), and weekly consumption of fast-food meals. Traditional fast-food restaurants and non-traditional fast-food outlets, such as convenience stores, supermarkets, and grocery stores, from the 2006 Brazos Valley Food Environment Project were linked with individual participants (n = 1409) who completed the nutrition module in the 2006 Brazos Valley Community Health Assessment. Increased age, poverty, increased distance to the nearest fast food, and increased number of different traditional fast-food restaurants, non-traditional fast-food outlets, or fast-food opportunities were associated with less frequent weekly consumption of fast-food meals. The interaction of gender and proximity (distance) or coverage (number) indicated that the association of proximity to or coverage of fast-food locations on fast-food consumption was greater among women and opposite of independent effects. Results provide impetus for identifying and understanding the complex relationship between access to all fast-food opportunities, rather than to traditional fast-food restaurants alone, and fast-food consumption. The results indicate the importance of further examining the complex interaction of gender and distance in rural areas and particularly in fast-food consumption. Furthermore, this study emphasizes the need for health promotion and policy efforts to consider all sources of fast-food as part of promoting healthful food choices.
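
    Once residences and outlets are geocoded, the two access measures are straightforward to compute: proximity is the distance to the nearest outlet and coverage is the count of outlets within a chosen buffer. A simplified sketch with synthetic planar coordinates (illustrative only, not the Brazos Valley data or the study's exact distance definition):

```python
import numpy as np

rng = np.random.default_rng(5)

homes = rng.uniform(0, 10_000, size=(5, 2))      # residence coordinates (m)
outlets = rng.uniform(0, 10_000, size=(40, 2))   # fast-food outlet coordinates (m)

# Pairwise Euclidean distances between each home and each outlet.
d = np.linalg.norm(homes[:, None, :] - outlets[None, :, :], axis=2)

proximity = d.min(axis=1)                 # distance to the nearest outlet
coverage = (d <= 3_000).sum(axis=1)       # number of outlets within a 3 km buffer

for p, c in zip(proximity, coverage):
    print(f"nearest outlet: {p:7.0f} m, outlets within 3 km: {c}")
```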

  14. Association between proximity to and coverage of traditional fast-food restaurants and non-traditional fast-food outlets and fast-food consumption among rural adults

    Directory of Open Access Journals (Sweden)

    Horel Scott A

    2011-05-01

    Full Text Available Abstract Objective The objective of this study is to examine the relationship between residential exposure to fast-food entrées, using two measures of potential spatial access: proximity (distance to the nearest location) and coverage (number of different locations), and weekly consumption of fast-food meals. Methods Traditional fast-food restaurants and non-traditional fast-food outlets, such as convenience stores, supermarkets, and grocery stores, from the 2006 Brazos Valley Food Environment Project were linked with individual participants (n = 1409) who completed the nutrition module in the 2006 Brazos Valley Community Health Assessment. Results Increased age, poverty, increased distance to the nearest fast food, and increased number of different traditional fast-food restaurants, non-traditional fast-food outlets, or fast-food opportunities were associated with less frequent weekly consumption of fast-food meals. The interaction of gender and proximity (distance) or coverage (number) indicated that the association of proximity to or coverage of fast-food locations on fast-food consumption was greater among women and opposite of independent effects. Conclusions Results provide impetus for identifying and understanding the complex relationship between access to all fast-food opportunities, rather than to traditional fast-food restaurants alone, and fast-food consumption. The results indicate the importance of further examining the complex interaction of gender and distance in rural areas and particularly in fast-food consumption. Furthermore, this study emphasizes the need for health promotion and policy efforts to consider all sources of fast-food as part of promoting healthful food choices.

  15. Association between proximity to and coverage of traditional fast-food restaurants and non-traditional fast-food outlets and fast-food consumption among rural adults

    Science.gov (United States)

    2011-01-01

    Objective The objective of this study is to examine the relationship between residential exposure to fast-food entrées, using two measures of potential spatial access: proximity (distance to the nearest location) and coverage (number of different locations), and weekly consumption of fast-food meals. Methods Traditional fast-food restaurants and non-traditional fast-food outlets, such as convenience stores, supermarkets, and grocery stores, from the 2006 Brazos Valley Food Environment Project were linked with individual participants (n = 1409) who completed the nutrition module in the 2006 Brazos Valley Community Health Assessment. Results Increased age, poverty, increased distance to the nearest fast food, and increased number of different traditional fast-food restaurants, non-traditional fast-food outlets, or fast-food opportunities were associated with less frequent weekly consumption of fast-food meals. The interaction of gender and proximity (distance) or coverage (number) indicated that the association of proximity to or coverage of fast-food locations on fast-food consumption was greater among women and opposite of independent effects. Conclusions Results provide impetus for identifying and understanding the complex relationship between access to all fast-food opportunities, rather than to traditional fast-food restaurants alone, and fast-food consumption. The results indicate the importance of further examining the complex interaction of gender and distance in rural areas and particularly in fast-food consumption. Furthermore, this study emphasizes the need for health promotion and policy efforts to consider all sources of fast-food as part of promoting healthful food choices. PMID:21599955

  16. non-traditional vegetable production

    African Journals Online (AJOL)

    ern Region is determined by fertilizer and insecticide use. It is rec- ... the supply of fertilizer and insecticides in the market are readily available and ..... "Farmers' Technology, Economic Performance and Relative Economic Efficiency of Country Bean Growers." Bangladesh Journal of Agricultural Economics. M1): 85-96.

  17. Non-traditional Infrasound Deployment

    Science.gov (United States)

    McKenna, M. H.; McComas, S.; Simpson, C. P.; Diaz-Alvarez, H.; Costley, R. D.; Hayward, C.; Golden, P.; Endress, A.

    2017-12-01

    Historically, infrasound arrays have been deployed in rural environments where anthropological noise sources are limited. As interest in monitoring low energy sources at local distances grows in the infrasound community, it will be vital to understand how to monitor infrasound sources in an urban environment. Arrays deployed in urban centers have to overcome the decreased signal-to-noise ratio and reduced amount of real estate available to deploy an array. To advance the understanding of monitoring infrasound sources in urban environments, local and regional infrasound arrays were deployed on building rooftops on the campus at Southern Methodist University (SMU), and data were collected for one seasonal cycle. The data were evaluated for structural source signals (continuous-wave packets), and when a signal was identified, the back azimuth to the source was determined through frequency-wavenumber analysis. This information was used to identify hypothesized structural sources; these sources were verified through direct measurement and dynamic structural analysis modeling. In addition to the rooftop arrays, a camouflaged infrasound sensor was installed on the SMU campus and evaluated to determine its effectiveness for wind noise reduction. Permission to publish was granted by Director, Geotechnical and Structures Laboratory.

  18. Machine rates for selected forest harvesting machines

    Science.gov (United States)

    R.W. Brinker; J. Kinard; Robert Rummer; B. Lanford

    2002-01-01

    Very little new literature has been published on the subject of machine rates and machine cost analysis since 1989 when the Alabama Agricultural Experiment Station Circular 296, Machine Rates for Selected Forest Harvesting Machines, was originally published. Many machines discussed in the original publication have undergone substantial changes in various aspects, not...

  19. Representational Machines

    DEFF Research Database (Denmark)

    to the enterprises of the medium. This is the subject of Representational Machines: How photography enlists the workings of institutional technologies in search of establishing new iconic and social spaces. Together, the contributions to this edited volume span historical epochs, social environments, technological...

  20. Precision Machining

    Indian Academy of Sciences (India)

    Panasonic's new laser printer using Precitech's machine tools described by Davis in this issue. The need to use natural and synthetic diamonds, a development that uses Bridgman's invention of high pressure presses of over 40 kbars is now surpassed by diamond deposition techniques highlighted by Erik Bauer and his ...

  1. Machine testning

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo

    This document is used in connection with a laboratory exercise of 3 hours' duration as a part of the course GEOMETRICAL METROLOGY AND MACHINE TESTING. The exercise includes a series of tests carried out by the student on a conventional and a numerically controlled lathe, respectively. This document...

  2. Machine performance assessment and enhancement for a hexapod machine

    Energy Technology Data Exchange (ETDEWEB)

    Mou, J.I. [Arizona State Univ., Tempe, AZ (United States); King, C. [Sandia National Labs., Livermore, CA (United States). Integrated Manufacturing Systems Center

    1998-03-19

    The focus of this study is to develop a sensor fused process modeling and control methodology to model, assess, and then enhance the performance of a hexapod machine for precision product realization. A deterministic modeling technique was used to derive models for machine performance assessment and enhancement. Sensor fusion methodology was adopted to identify the parameters of the derived models. Empirical models and computational algorithms were also derived and implemented to model, assess, and then enhance the machine performance. The developed sensor fusion algorithms can be implemented on a PC-based open architecture controller to receive information from various sensors, assess the status of the process, determine the proper action, and deliver the command to actuators for task execution. This will enhance a hexapod machine's capability to produce workpieces within the imposed dimensional tolerances.

  3. Nontraditional machining processes research advances

    CERN Document Server

    2013-01-01

    Nontraditional machining employs processes that remove material by various methods involving thermal, electrical, chemical and mechanical energy or even combinations of these. Nontraditional Machining Processes covers recent research and development in techniques and processes which focus on achieving high accuracies and good surface finishes, parts machined without burrs or residual stresses especially with materials that cannot be machined by conventional methods. With applications to the automotive, aircraft and mould and die industries, Nontraditional Machining Processes explores different aspects and processes through dedicated chapters. The seven chapters explore recent research into a range of topics including laser assisted manufacturing, abrasive water jet milling and hybrid processes. Students and researchers will find the practical examples and new processes useful for both reference and for developing further processes. Industry professionals and materials engineers will also find Nontraditional M...

  4. Considerations upon the Machine Learning Technologies

    Directory of Open Access Journals (Sweden)

    Alin Munteanu

    2006-01-01

    Full Text Available Artificial intelligence offers superior techniques and methods by which problems from diverse domains may find an optimal solution. The Machine Learning technologies refer to the domain of artificial intelligence that aims to develop techniques allowing computers to “learn”. Some systems based on Machine Learning technologies tend to eliminate the need for human intelligence, while others adopt a man-machine collaborative approach.

  5. Charging machine

    International Nuclear Information System (INIS)

    Medlin, J.B.

    1976-01-01

    A charging machine for loading fuel slugs into the process tubes of a nuclear reactor includes a tubular housing connected to the process tube, a charging trough connected to the other end of the tubular housing, a device for loading the charging trough with a group of fuel slugs, means for equalizing the coolant pressure in the charging trough with the pressure in the process tubes, means for pushing the group of fuel slugs into the process tube and a latch and a seal engaging the last object in the group of fuel slugs to prevent the fuel slugs from being ejected from the process tube when the pusher is removed and to prevent pressure liquid from entering the charging machine. 3 claims, 11 drawing figures

  6. Electric machines

    CERN Document Server

    Gross, Charles A

    2006-01-01

    BASIC ELECTROMAGNETIC CONCEPTS: Basic Magnetic Concepts; Magnetically Linear Systems: Magnetic Circuits; Voltage, Current, and Magnetic Field Interactions; Magnetic Properties of Materials; Nonlinear Magnetic Circuit Analysis; Permanent Magnets; Superconducting Magnets; The Fundamental Translational EM Machine; The Fundamental Rotational EM Machine; Multiwinding EM Systems; Leakage Flux; The Concept of Ratings in EM Systems; Summary; Problems. TRANSFORMERS: The Ideal n-Winding Transformer; Transformer Ratings and Per-Unit Scaling; The Nonideal Three-Winding Transformer; The Nonideal Two-Winding Transformer; Transformer Efficiency and Voltage Regulation; Practical Considerations; The Autotransformer; Operation of Transformers in Three-Phase Environments; Sequence Circuit Models for Three-Phase Transformer Analysis; Harmonics in Transformers; Summary; Problems. BASIC MECHANICAL CONSIDERATIONS: Some General Perspectives; Efficiency; Load Torque-Speed Characteristics; Mass Polar Moment of Inertia; Gearing; Operating Modes; Translational Systems; A Comprehensive Example: The Elevator; P...

  7. Genesis machines

    CERN Document Server

    Amos, Martyn

    2014-01-01

    Silicon chips are out. Today's scientists are using real, wet, squishy, living biology to build the next generation of computers. Cells, gels and DNA strands are the 'wetware' of the twenty-first century. Much smaller and more intelligent, these organic computers open up revolutionary possibilities. Tracing the history of computing and revealing a brave new world to come, Genesis Machines describes how this new technology will change the way we think not just about computers - but about life itself.

  8. Non-traditional CD4+CD25-CD69+ regulatory T cells are correlated to leukemia relapse after allogeneic hematopoietic stem cell transplantation.

    Science.gov (United States)

    Zhao, Xiao-su; Wang, Xu-hua; Zhao, Xiang-yu; Chang, Ying-jun; Xu, Lan-ping; Zhang, Xiao-hui; Huang, Xiao-jun

    2014-07-01

    Non-traditional CD4+CD25-CD69+ T cells were found to be involved in disease progression in tumor-bearing mouse models and cancer patients recently. We attempted to define whether this subset of T cells were related to leukemia relapse after allogeneic hematopoietic cell transplantation (allo-HSCT). The frequency of CD4+CD25-CD69+ T cells among the CD4+ T cell population from the bone marrow of relapsed patients, patients with positive minimal residual disease (MRD+) and healthy donors was examined by flow cytometry. The CD4+CD25-CD69+ T cells were also stained with the intracellular markers to determine the cytokine (TGF-β, IL-2 and IL-10) secretion. The results showed that the frequency of CD4+CD25-CD69 + T cells was markedly increased in patients in the relapsed group and the MRD + group compared to the healthy donor group. The percentage of this subset of T cells was significantly decreased after effective intervention treatment. We also analyzed the reconstitution of CD4+CD25-CD69+ T cells at various time points after allo-HSCT, and the results showed that this subset of T cells reconstituted rapidly and reached a relatively higher level at +60 d in patients compared to controls. The incidence of either MRD+ or relapse in patients with a high frequency of CD4+CD25-CD69+ T cells (>7%) was significantly higher than that of patients with a low frequency of CD4+CD25-CD69+ T cells at +60 d, +90 d and +270 d after transplant. However, our preliminary data indicated that CD4+CD25-CD69+ T cells may not exert immunoregulatory function via cytokine secretion. This study provides the first clinical evidence of a correlation between non-traditional CD4+CD25-CD69+ Tregs and leukemia relapse after allo-HSCT and suggests that exploration of new methods of adoptive immunotherapy may be beneficial. Further research related to regulatory mechanism behind this phenomenon would be necessary.

  9. Adapting machine learning techniques to censored time-to-event health record data: A general-purpose approach using inverse probability of censoring weighting.

    Science.gov (United States)

    Vock, David M; Wolfson, Julian; Bandyopadhyay, Sunayan; Adomavicius, Gediminas; Johnson, Paul E; Vazquez-Benitez, Gabriela; O'Connor, Patrick J

    2016-06-01

    Models for predicting the probability of experiencing various health outcomes or adverse events over a certain time frame (e.g., having a heart attack in the next 5 years) based on individual patient characteristics are important tools for managing patient care. Electronic health data (EHD) are appealing sources of training data because they provide access to large amounts of rich individual-level data from present-day patient populations. However, because EHD are derived by extracting information from administrative and clinical databases, some fraction of subjects will not be under observation for the entire time frame over which one wants to make predictions; this loss to follow-up is often due to disenrollment from the health system. For subjects without complete follow-up, whether or not they experienced the adverse event is unknown, and in statistical terms the event time is said to be right-censored. Most machine learning approaches to the problem have been relatively ad hoc; for example, common approaches for handling observations in which the event status is unknown include (1) discarding those observations, (2) treating them as non-events, (3) splitting those observations into two observations: one where the event occurs and one where the event does not. In this paper, we present a general-purpose approach to account for right-censored outcomes using inverse probability of censoring weighting (IPCW). We illustrate how IPCW can easily be incorporated into a number of existing machine learning algorithms used to mine big health care data including Bayesian networks, k-nearest neighbors, decision trees, and generalized additive models. We then show that our approach leads to better calibrated predictions than the three ad hoc approaches when applied to predicting the 5-year risk of experiencing a cardiovascular adverse event, using EHD from a large U.S. Midwestern healthcare system. Copyright © 2016 Elsevier Inc. All rights reserved.
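
    The core of the approach is to drop subjects whose outcome status at the prediction horizon is unknown and re-weight the remaining subjects by the inverse probability of remaining uncensored, after which any standard learner can be trained. A minimal, hedged sketch of that weighting step (Kaplan-Meier estimate of the censoring distribution via the lifelines package; variable names and data are illustrative, not the paper's):

```python
import numpy as np
from lifelines import KaplanMeierFitter
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)

# Synthetic cohort: follow-up time, event indicator (1 = event observed).
n = 2000
time = rng.exponential(6.0, n)                 # years of follow-up
event = rng.integers(0, 2, n)                  # 1 = adverse event, 0 = censored
x = rng.normal(size=(n, 3))                    # baseline covariates
horizon = 5.0                                  # predict events within 5 years

# Estimate the censoring survival function G(t) = P(still uncensored at t).
km_censor = KaplanMeierFitter().fit(time, event_observed=1 - event)

# Subjects are usable if they had the event before the horizon, or were still
# under observation at the horizon; the rest are dropped but re-weighted for.
had_event = (event == 1) & (time <= horizon)
followed_up = time >= horizon
usable = had_event | followed_up
y = had_event[usable].astype(int)

# IPCW weight: 1 / G(min(event time, horizon)) for each usable subject.
t_eval = np.minimum(time[usable], horizon)
g = km_censor.survival_function_at_times(t_eval).to_numpy()
weights = 1.0 / np.clip(g, 1e-6, None)

model = LogisticRegression().fit(x[usable], y, sample_weight=weights)
print("predicted 5-year risk (first 5 subjects):", model.predict_proba(x[:5])[:, 1])
```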

  10. Sneaky Submarine Landslides, and how to Quantify them: A Case Study from the Mississippi River Delta Front Contrasting Geophysical and Machine Learning Techniques

    Science.gov (United States)

    Obelcz, J.; Xu, K.; Bentley, S. J.; Wood, W. T.; Georgiou, I. Y.; Maloney, J. M.; Miner, M. D.

    2017-12-01

    The highly publicized subsidence and decline of the Mississippi River Delta Front's (MRDF) subaerial section has recently precipitated studies of the subaqueous MRDF to assess whether it too is subsiding and regressing landward. These studies have largely focused on the area offshore the most active current distributary of the Mississippi River, Southwest Pass, during a decade (post-Hurricane Rita 2005-2014) of relatively quiescent Gulf of Mexico hurricane activity. Utilizing repeat swath bathymetric surveys, it was determined that submarine landslides not associated with major (category ≥ 3) hurricane passage are important drivers of downslope sediment transport on the MRDF. Volumetrically, sediment flux downslope without major hurricane influence is approximately half that during a given hurricane-influenced year (5.5 x 10^5 and 1.1 x 10^6 m^3, respectively). This finding is notable and warrants comparison with other settings to assess the global impact on the source-to-sink budget of small but frequent landslides, but the resource-intensive repeat geophysical surveys required make it a prohibitive option at the margin and global scale. One option to quantify small-scale submarine slope failures while reducing required data acquisition is to utilize machine learning algorithms (MLAs) to intelligently estimate the occurrence and magnitude of submarine landslides based on correlated physical and geological parameters. Here, the MRDF volumetric changes described above are parsed into training and validation data, and physical and geological parameters associated with slope failure (such as porosity, steep slopes, high rates of sedimentation, and presence of gas in pore water) known from prior coring and seafloor mapping expeditions serve as potential predictive variables. The resulting submarine landslide spatial distribution and magnitude maps output by the MLAs are compared to those obtained through geophysical surveys, providing a proof of concept that machine learning can
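
    The proposed shortcut is to learn a mapping from seafloor parameters to landslide occurrence and magnitude so that repeat surveys are needed less often. The sketch below is one hypothetical form such a predictive step could take (a random forest on synthetic stand-ins for slope, sedimentation rate, porosity and pore gas; none of it is the study's data or its chosen algorithm):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)

# Synthetic seafloor cells with stand-in predictors.
n = 500
slope = rng.uniform(0.1, 4.0, n)          # seafloor slope (degrees)
sed_rate = rng.uniform(0.5, 10.0, n)      # sedimentation rate (cm/yr)
porosity = rng.uniform(0.4, 0.8, n)
gas = rng.integers(0, 2, n)               # free gas in pore water (0/1)
X = np.column_stack([slope, sed_rate, porosity, gas])

# Synthetic "eroded volume per cell" loosely tied to the predictors.
y = 1e3 * (0.5 * slope + 0.3 * sed_rate + 2.0 * gas) * rng.lognormal(0, 0.3, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print("R^2 on held-out cells:", round(model.score(X_te, y_te), 2))
```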

  11. Machine Learning of Musical Gestures

    OpenAIRE

    Caramiaux, Baptiste; Tanaka, Atau

    2013-01-01

    We present an overview of machine learning (ML) techniques and their application in interactive music and new digital instruments design. We first give to the non-specialist reader an introduction to two ML tasks, classification and regression, that are particularly relevant for gestural interaction. We then present a review of the literature in current NIME research that uses ML in musical gesture analysis and gestural sound control. We describe the ways in which machine learning is useful for cre...

  12. Laser machining of explosives

    Science.gov (United States)

    Perry, Michael D.; Stuart, Brent C.; Banks, Paul S.; Myers, Booth R.; Sefcik, Joseph A.

    2000-01-01

    The invention consists of a method for machining (cutting, drilling, sculpting) of explosives (e.g., TNT, TATB, PETN, RDX, etc.). By using pulses of a duration in the range of 5 femtoseconds to 50 picoseconds, extremely precise and rapid machining can be achieved with essentially no heat or shock affected zone. In this method, material is removed by a nonthermal mechanism. A combination of multiphoton and collisional ionization creates a critical density plasma in a time scale much shorter than electron kinetic energy is transferred to the lattice. The resulting plasma is far from thermal equilibrium. The material is in essence converted from its initial solid-state directly into a fully ionized plasma on a time scale too short for thermal equilibrium to be established with the lattice. As a result, there is negligible heat conduction beyond the region removed resulting in negligible thermal stress or shock to the material beyond a few microns from the laser machined surface. Hydrodynamic expansion of the plasma eliminates the need for any ancillary techniques to remove material and produces extremely high quality machined surfaces. There is no detonation or deflagration of the explosive in the process and the material which is removed is rendered inert.

  13. Lot quality assurance sampling for monitoring coverage and quality of a targeted condom social marketing programme in traditional and non-traditional outlets in India

    Science.gov (United States)

    Piot, Bram; Navin, Deepa; Krishnan, Nattu; Bhardwaj, Ashish; Sharma, Vivek; Marjara, Pritpal

    2010-01-01

    Objectives This study reports on the results of a large-scale targeted condom social marketing campaign in and around areas where female sex workers are present. The paper also describes the method that was used for the routine monitoring of condom availability in these sites. Methods The lot quality assurance sampling (LQAS) method was used for the assessment of the geographical coverage and quality of coverage of condoms in target areas in four states and along selected national highways in India, as part of Avahan, the India AIDS initiative. Results A significant general increase in condom availability was observed in the intervention area between 2005 and 2008. High coverage rates were gradually achieved through an extensive network of pharmacies and particularly of non-traditional outlets, whereas traditional outlets were instrumental in providing large volumes of condoms. Conclusion LQAS is seen as a valuable tool for the routine monitoring of the geographical coverage and of the quality of delivery systems of condoms and of health products and services in general. With a relatively small sample size, easy data collection procedures and simple analytical methods, it was possible to inform decision-makers regularly on progress towards coverage targets. PMID:20167732
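
    LQAS classifies each supervision area ("lot") by sampling a small number of outlets and comparing the count stocking condoms against a pre-computed decision threshold. A hedged sketch of how such a threshold can be derived from binomial error probabilities (the coverage benchmarks and risk levels below are illustrative, not Avahan's):

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def lqas_rule(n, upper=0.8, lower=0.5, alpha=0.10, beta=0.10):
    """Smallest threshold d such that a lot is 'accepted' when at least d of
    n sampled outlets stock condoms, keeping the risk of accepting a poor lot
    (true coverage <= lower) below beta and the risk of rejecting a good lot
    (true coverage >= upper) below alpha. Returns None if no d satisfies both."""
    for d in range(n + 1):
        reject_good = binom_cdf(d - 1, n, upper)        # < d successes | good lot
        accept_poor = 1 - binom_cdf(d - 1, n, lower)    # >= d successes | poor lot
        if reject_good <= alpha and accept_poor <= beta:
            return d
    return None

# Example: sample 19 outlets per lot; how many must stock condoms to pass?
print(lqas_rule(n=19))
```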

  14. Empirical research evaluating the effects of non-traditional approaches to enhancing sleep in typical and clinical children and young people.

    Science.gov (United States)

    France, Karyn G; McLay, Laurie K; Hunter, Jolene E; France, Madeline L S

    2018-06-01

    This paper examines the effects of non-traditional (non-behavioural and non-prescription pharmaceutical) approaches to sleep in children and young people (0-18 y). A systematic search identified 79 studies that met inclusion criteria. Seventeen percent of the studies were rated as having a conclusive level of evidence, forty-two percent with preponderant evidence and forty-one percent with only suggestive evidence. There were promising indications, with certain populations only, for aromatherapy, ketogenic diets, an elimination diet (few foods diet), elimination of cow's milk, avoidance of caffeine, tryptophan with adenosine and uridine, omega-3 and omega-6, valerian, music, osteopathic manipulation and white noise. Bright light therapy and massage returned some positive results. All of these interventions warrant further, more rigorous research. There was limited or no evidence to support acupressure or acupuncture, other diets or dietary supplements, exercise or weighted blankets. Caution is needed in interpreting some studies because poorer quality studies were more likely to return positive results. Suggestions are made for the improvement of large and smaller scale research, especially conceptualization around multiple physiological measures of sleep and the adoption of research methods which are of use in clinical settings. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Eating habits of a population undergoing a rapid dietary transition: portion sizes of traditional and non-traditional foods and beverages consumed by Inuit adults in Nunavut, Canada.

    Science.gov (United States)

    Sheehy, Tony; Roache, Cindy; Sharma, Sangita

    2013-06-02

    To determine the portion sizes of traditional and non-traditional foods being consumed by Inuit adults in three remote communities in Nunavut, Canada. A cross-sectional study was carried out between June and October, 2008. Trained field workers collected dietary data using a culturally appropriate, validated quantitative food frequency questionnaire (QFFQ) developed specifically for the study population. Caribou, muktuk (whale blubber and skin) and Arctic char (salmon family) were the most commonly consumed traditional foods; mean portion sizes for traditional foods ranged from 10 g for fermented seal fat to 424 g for fried caribou. Fried bannock and white bread were consumed by >85% of participants; mean portion sizes for these foods were 189 g and 70 g, respectively. Sugar-sweetened beverages and energy-dense, nutrient-poor foods were also widely consumed. Mean portion sizes for regular pop and sweetened juices with added sugar were 663 g and 572 g, respectively. Mean portion sizes for potato chips, pilot biscuits, cakes, chocolate and cookies were 59 g, 59 g, 106 g, 59 g, and 46 g, respectively. The present study provides further evidence of the nutrition transition that is occurring among Inuit in the Canadian Arctic. It also highlights a number of foods and beverages that could be targeted in future nutritional intervention programs aimed at obesity and diet-related chronic disease prevention in these and other Inuit communities.

  16. Teachers' views of using e-learning for non-traditional students in higher education across three disciplines [nursing, chemistry and management] at a time of massification and increased diversity in higher education.

    Science.gov (United States)

    Allan, Helen T; O'Driscoll, Mike; Simpson, Vikki; Shawe, Jill

    2013-09-01

    The expansion of the higher educational sector in the United Kingdom over the last two decades to meet political aspirations of the successive governments and popular demand for participation in the sector (the Widening Participation Agenda) has overlapped with the introduction of e-learning. This paper describes teachers' views of using e-learning for non-traditional students in higher education across three disciplines [nursing, chemistry and management] at a time of massification and increased diversity in higher education. Design: a three-phase, mixed-methods study; this paper reports findings from phase two of the study. Setting: one university in England. Participants: higher education teachers teaching on the nursing, chemistry and management programmes. Methods: focus groups with these teachers. Findings from these data show that teachers across the programmes have limited knowledge of whether students are non-traditional or what category of non-traditional status they might be in. Such knowledge as they have does not seem to influence the tailoring of teaching and learning for non-traditional students. Teachers in chemistry and nursing want more support from the university to improve their use of e-learning, as did teachers in management but to a lesser extent. Our conclusions confirm other studies in the field outside nursing which suggest that non-traditional students' learning needs have not been considered meaningfully in the development of e-learning strategies in universities. We suggest that this may be because teachers have been required to develop e-learning at the same time as they cope with the massification of, and widening participation in, higher education. The findings are of particular importance to nurse educators given the high number of non-traditional students on nursing programmes. Copyright © 2012 Elsevier Ltd. All rights reserved.

  17. Shear machines

    International Nuclear Information System (INIS)

    Astill, M.; Sunderland, A.; Waine, M.G.

    1980-01-01

    A shear machine for irradiated nuclear fuel elements has a replaceable shear assembly comprising a fuel element support block, a shear blade support and a clamp assembly which holds the fuel element to be sheared in contact with the support block. A first clamp member contacts the fuel element remote from the shear blade and a second clamp member contacts the fuel element adjacent the shear blade and is advanced towards the support block during shearing to compensate for any compression of the fuel element caused by the shear blade (U.K.)

  18. Environmentally Friendly Machining

    CERN Document Server

    Dixit, U S; Davim, J Paulo

    2012-01-01

    Environment-Friendly Machining provides an in-depth overview of environmentally-friendly machining processes, covering numerous different types of machining in order to identify which practice is the most environmentally sustainable. The book discusses three systems at length: machining with minimal cutting fluid, air-cooled machining and dry machining. Also covered is a way to conserve energy during machining processes, along with useful data and detailed descriptions for developing and utilizing the most efficient modern machining tools. Researchers and engineers looking for sustainable machining solutions will find Environment-Friendly Machining to be a useful volume.

  19. Electricity of machine tool

    International Nuclear Information System (INIS)

    Gijeon media editorial department

    1977-10-01

    This book is divided into three parts. The first part deals with electrical machines, ranging from generators to motors, the motor as a power source of the machine tool, and electrical components for machine tools such as main-circuit switches, automatic switches, knife switches and push buttons, snap switches, protection devices, timers, solenoids, and rectifiers. The second part handles wiring diagrams, covering the basic electrical circuits of machine tools and the wiring diagrams of machines such as milling machines, planers and grinding machines. The third part introduces fault diagnosis of machines, giving practical solutions based on the diagnosis and the diagnostic method using voltage and resistance measurements with a tester.

  20. Machine capability index evaluation of machining center

    International Nuclear Information System (INIS)

    Hong, Won Pyo

    2013-01-01

    Recently, there has been an increasing need to produce more precise products, with only the smallest deviations from a defined target value. Machine capability is the ability of a machine tool to produce parts within the tolerance interval. Capability indices are a statistical way of describing how well a product is machined compared to defined target values and tolerances. Currently, there is no standardized way to acquire a machine capability value. This paper describes how machine capability indices are evaluated in machining centers. After the machining of specimens, straightness, roundness and positioning accuracy were measured using a CMM (coordinate measuring machine). These measured values and defined tolerances were used to evaluate the machine capability index. It will be useful for the industry to have standardized ways to choose and calculate machine capability indices.
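
    As a minimal sketch of what such an index calculation looks like, the snippet below computes the widely used Cm and Cmk indices from a set of measured deviations and a tolerance interval; the formulas mirror the usual Cp/Cpk definitions and the sample data are invented for illustration, not taken from the study above.

        # Machine capability indices for one measured feature (illustrative only).
        import statistics

        def machine_capability(measurements, lower_tol, upper_tol):
            """Return (Cm, Cmk): tolerance width vs. the 6-sigma process spread,
            and the same comparison penalised for an off-centre mean."""
            mean = statistics.mean(measurements)
            sigma = statistics.stdev(measurements)
            cm = (upper_tol - lower_tol) / (6 * sigma)
            cmk = min(upper_tol - mean, mean - lower_tol) / (3 * sigma)
            return cm, cmk

        # Hypothetical positioning errors of a hole centre, tolerance +/- 0.02 mm.
        errors_mm = [0.004, -0.002, 0.006, 0.001, -0.003, 0.005, 0.002, -0.001]
        print(machine_capability(errors_mm, lower_tol=-0.02, upper_tol=0.02))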

  1. Predictive analysis of the influence of the chemical composition and pre-processing regimen on structural properties of steel alloys using machine learning techniques

    Science.gov (United States)

    Krishnamurthy, Narayanan; Maddali, Siddharth; Romanov, Vyacheslav; Hawk, Jeffrey

    We present some structural properties of multi-component steel alloys as predicted by a random forest machine-learning model. These non-parametric models are trained on high-dimensional data sets defined by features such as chemical composition, pre-processing temperatures and environmental influences, the latter of which are based upon standardized testing procedures for tensile, creep and rupture properties as defined by the American Society for Testing and Materials (ASTM). We quantify the goodness of fit of these models as well as the inferred relative importance of each of these features, all with a conveniently defined metric and scale. The models are tested with synthetic data points, generated subject to the appropriate mathematical constraints for the various features. By this we highlight possible trends in the increase or degradation of the structural properties with perturbations in the features of importance. This work is presented as part of the Data Science Initiative at the National Energy Technology Laboratory, directed specifically towards the computational design of steel alloys.
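
    The sketch below shows the general shape of such a random-forest fit and the feature-importance readout it provides; the composition and temperature features, the synthetic strength values and the model settings are invented for illustration and are not the NETL data set or model described above.

        # Illustrative random-forest regression on alloy-style features.
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(0)
        n = 200
        X = np.column_stack([
            rng.uniform(0.05, 0.50, n),   # wt% carbon (hypothetical)
            rng.uniform(8.0, 12.0, n),    # wt% chromium (hypothetical)
            rng.uniform(950, 1100, n),    # normalising temperature, deg C (hypothetical)
        ])
        # Synthetic "yield strength" with noise, purely for demonstration.
        y = 400 + 600 * X[:, 0] + 15 * X[:, 1] - 0.1 * X[:, 2] + rng.normal(0, 10, n)

        model = RandomForestRegressor(n_estimators=300, oob_score=True, random_state=0)
        model.fit(X, y)

        print("out-of-bag R^2:", round(model.oob_score_, 3))
        for name, importance in zip(["C", "Cr", "T_norm"], model.feature_importances_):
            print(name, round(importance, 3))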

  2. Machine Protection

    CERN Document Server

    Schmidt, R

    2014-01-01

    The protection of accelerator equipment is as old as accelerator technology and was for many years related to high-power equipment. Examples are the protection of powering equipment from overheating (magnets, power converters, high-current cables), of superconducting magnets from damage after a quench and of klystrons. The protection of equipment from beam accidents is more recent. It is related to the increasing beam power of high-power proton accelerators such as ISIS, SNS, ESS and the PSI cyclotron, to the emission of synchrotron light by electron–positron accelerators and FELs, and to the increase of energy stored in the beam (in particular for hadron colliders such as LHC). Designing a machine protection system requires an excellent understanding of accelerator physics and operation to anticipate possible failures that could lead to damage. Machine protection includes beam and equipment monitoring, a system to safely stop beam operation (e.g. dumping the beam or stopping the beam at low energy) and an ...

  3. Machine Learning for Medical Imaging.

    Science.gov (United States)

    Erickson, Bradley J; Korfiatis, Panagiotis; Akkus, Zeynettin; Kline, Timothy L

    2017-01-01

    Machine learning is a technique for recognizing patterns that can be applied to medical images. Although it is a powerful tool that can help in rendering medical diagnoses, it can be misapplied. Machine learning typically begins with the machine learning algorithm system computing the image features that are believed to be of importance in making the prediction or diagnosis of interest. The machine learning algorithm system then identifies the best combination of these image features for classifying the image or computing some metric for the given image region. There are several methods that can be used, each with different strengths and weaknesses. There are open-source versions of most of these machine learning methods that make them easy to try and apply to images. Several metrics for measuring the performance of an algorithm exist; however, one must be aware of the possible associated pitfalls that can result in misleading metrics. More recently, deep learning has started to be used; this method has the benefit that it does not require image feature identification and calculation as a first step; rather, features are identified as part of the learning process. Machine learning has been used in medical imaging and will have a greater influence in the future. Those working in medical imaging must be aware of how machine learning works. © RSNA, 2017.
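
    A toy version of the feature-then-classifier workflow described above might look as follows; the synthetic "images", the hand-crafted features and the logistic-regression classifier are stand-ins chosen for brevity, not methods advocated in the article.

        # Toy feature-extraction + classification pipeline on synthetic images.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)

        def extract_features(image):
            # Hand-crafted features: mean intensity, intensity spread, edge-like energy.
            grad = np.abs(np.diff(image, axis=0)).mean()
            return [image.mean(), image.std(), grad]

        # Synthetic "lesion" images are brighter and noisier than "normal" ones.
        images = [rng.normal(0.6, 0.2, (32, 32)) for _ in range(50)] + \
                 [rng.normal(0.4, 0.1, (32, 32)) for _ in range(50)]
        labels = np.array([1] * 50 + [0] * 50)

        X = np.array([extract_features(im) for im in images])
        scores = cross_val_score(LogisticRegression(max_iter=1000), X, labels, cv=5)
        print("cross-validated accuracy:", scores.mean().round(3))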

  5. Classroom Assessment Techniques: A Literature Review

    Science.gov (United States)

    DiCarlo, Kristen; Cooper, Lori

    2014-01-01

    Effective classroom assessment techniques are directly linked to course objectives and proposed outcomes. Results within formative and summative assessments have been studied in the online learning environment as educators seek to meet objectives with respect to student success in the non-traditional setting. Online classroom assessment techniques…

  6. Representational Machines

    DEFF Research Database (Denmark)

    Petersson, Dag; Dahlgren, Anna; Vestberg, Nina Lager

    Photography not only represents space. Space is produced photographically. Since its inception in the 19th century, photography has brought to light a vast array of represented subjects. Always situated in some spatial order, photographic representations have been operatively underpinned by social, technical, and institutional mechanisms. Geographically, bodily, and geometrically, the camera has positioned its subjects in social structures and hierarchies, in recognizable localities, and in iconic depth constructions which, although they show remarkable variation, nevertheless belong specifically to the enterprises of the medium. This is the subject of Representational Machines: how photography enlists the workings of institutional technologies in search of establishing new iconic and social spaces. Together, the contributions to this edited volume span historical epochs, social environments, technological...

  7. Optimization of machining techniques–A retrospective and literature ...

    Indian Academy of Sciences (India)

    In this paper an attempt is made to review the literature on optimizing machining parameters in turning processes. Various conventional techniques employed for machining optimization include geometric programming, geometric plus linear programming, goal programming, sequential unconstrained minimization technique, ...

  8. Analysis of aerosol emission and hazard evaluation of electrical discharge machining (EDM) process.

    Science.gov (United States)

    Jose, Mathew; Sivapirakasam, S P; Surianarayanan, M

    2010-01-01

    The safety and environmental aspects of a manufacturing process are important due to increased environmental regulations and life quality. In this paper, the concentration of aerosols in the breathing zone of the operator of Electrical Discharge Machining (EDM), a commonly used non-traditional manufacturing process, is presented. The pattern of aerosol emissions from this process with varying process parameters such as peak current, pulse duration, dielectric flushing pressure and the level of dielectric was evaluated. Further, the HAZOP technique was employed to identify the inherent safety aspects and fire risk of the EDM process under different working conditions. The analysis of aerosol exposure showed that the aerosol concentration increased with increasing peak current, pulse duration and dielectric level, and decreased with increasing flushing pressure. It was also found that at higher values of peak current (7 A) and pulse duration (520 μs), the concentration of aerosols at the breathing zone of the operator was above the permissible exposure limit value for respirable particulates (5 mg/m³). HAZOP study of the EDM process showed that this process is vulnerable to fire and explosion hazards. A detailed discussion on preventing the fire and explosion hazard is presented in this paper. The emission and risk of fire of the EDM process can be minimized by selecting proper process parameters and employing an appropriate control strategy.

  9. Research on the proficient machine system. Theoretical part; Jukutatsu machine system no chosa kenkyu. Rironhen

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-03-01

    The basic theory of the proficient machine system to be developed was studied. Important proficient techniques in manufacturing industries are becoming extinct because of insufficient succession to the next generation. The proficient machine system was proposed to cope with this situation. This machine system includes the mechanism for progress and evolution of techniques and sensibilities so as to be adaptable to environmental changes by learning and recognizing various motions such as work and process. Consequently, the basic research fields are composed of thought, learning, perception and action. This machine requires not only designed fixed functions but also introduction of the same proficient concept as human beings, to be adaptable to changes in situation, purpose, time and the machine's complexity. This report explains in detail the basic concept, system principle, approaching procedure and practical elemental technologies of the proficient machine system, and also describes the future prospect. 133 refs., 110 figs., 7 tabs.

  10. Rotating Drive for Electrical-Arc Machining

    Science.gov (United States)

    Fransen, C. D.

    1986-01-01

    Rotating drive improves quality of holes made by electrical-arc machining. Mechanism (Uni-tek, rotary head, or equivalent) attached to electrical-arc system. Drive rotates electrode as though it were mechanical drill, while an arc disintegrates metal in workpiece, thereby creating hole. Rotating electrode method often used in electric-discharge machining. NASA innovation is application of technique to electrical-arc machining.

  11. MLZ: Machine Learning for photo-Z

    Science.gov (United States)

    Carrasco Kind, Matias; Brunner, Robert

    2014-03-01

    The parallel Python framework MLZ (Machine Learning and photo-Z) computes fast and robust photometric redshift PDFs using machine learning algorithms. It uses a supervised technique with prediction trees and random forests, through TPZ, that can be used for a regression or a classification problem, or an unsupervised method with self-organizing maps and a random atlas, called SOMz. These machine learning implementations can be efficiently combined into a more powerful one, resulting in robust and accurate probability distributions for photometric redshifts.
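
    For readers unfamiliar with tree-based photo-z estimation, the sketch below shows the underlying idea (a random-forest regression over broad-band magnitudes, with the per-tree scatter serving as a crude PDF); it uses plain scikit-learn and synthetic data, and is not the MLZ/TPZ/SOMz API itself.

        # Generic random-forest photo-z sketch (not the MLZ API; data are synthetic).
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(2)
        n = 1000
        mags = rng.uniform(18, 25, (n, 5))                      # five broad-band magnitudes
        z_true = np.clip(0.05 * (mags[:, 0] - 18)
                         + rng.normal(0, 0.02, n), 0, None)     # toy magnitude-redshift relation

        forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(mags, z_true)

        # A crude redshift PDF for one galaxy: the spread of the per-tree predictions.
        galaxy = mags[:1]
        per_tree = np.array([tree.predict(galaxy)[0] for tree in forest.estimators_])
        print("point estimate:", per_tree.mean().round(3), "scatter:", per_tree.std().round(3))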

  12. MLnet report: training in Europe on machine learning

    OpenAIRE

    Ellebrecht, Mario; Morik, Katharina

    1999-01-01

    Machine learning techniques offer opportunities for a variety of applications and the theory of machine learning investigates problems that are of interest for other fields of computer science (e.g., complexity theory, logic programming, pattern recognition). However, the impacts of machine learning can only be recognized by those who know the techniques and are able to apply them. Hence, teaching machine learning is necessary before this field can diversify computer science. In order ...

  13. Heavy metals, arsenic, and pesticide contamination in an area with high incidence of chronic kidney disease of non-traditional causes in El Salvador

    Science.gov (United States)

    Lopez, D. A.; Ribó, A.; Quinteros, E.; Mejia, R.; Jovel, R.; VanDervort, D.; Orantes, C. M.

    2013-12-01

    Chronic kidney disease of non-traditional causes is epidemic in Central America, Southern Mexico and other regions of the world such as Sri Lanka, where the origin of the illness is attributed to exposure to agrochemicals and arsenic in soils and groundwater. In Central America, several causes have been suggested for this illness, including high ambient temperatures and chronic dehydration, and toxic effects of agrochemicals. Previous research using step-wise multivariate regression in El Salvador found a statistically significant correlation between the spatial distribution of the number of sick people per thousand inhabitants and the percent area cultivated with sugar cane, cotton, and beans, and maximum ambient temperature, with sugar cane cultivation as the most significant factor. This study aims to investigate the possible effects of agricultural activities on the occurrence of this illness by looking at heavy metal, arsenic and pesticide contamination in soil, water and sediments of a community located in the Bajo Lempa region (Ciudad Romero, El Salvador) and heavily affected by this illness. The Bajo Lempa region is close to the Lempa River delta, on the Pacific coast. Ground and surface water, sediment and soil samples were collected in the village where the patients live and in the agricultural areas where they work. With respect to the heavy metals, lead and cadmium were detected in the soils but below the standards for cultivated soils; however, they were not detected in the majority of surface and groundwater samples. Of the inorganic contaminants, arsenic was present in most soil, sediment, and water samples, with some concentrations considerably higher than the standards for cultivated lands and drinking water. Statistically different concentrations in soils were found for the village soils and the cultivated soils, with arsenic higher in the cultivated soils. For the pesticides, results show significant pollution of soil and groundwater by organochlorine pesticides

  14. A feasibility study for NOn-Traditional providers to support the management of Elderly People with Anxiety and Depression: The NOTEPAD study Protocol.

    Science.gov (United States)

    Burroughs, Heather; Bartlam, Bernadette; Ray, Mo; Kingstone, Tom; Shepherd, Tom; Ogollah, Reuben; Proctor, Janine; Waheed, Waquas; Bower, Peter; Bullock, Peter; Lovell, Karina; Gilbody, Simon; Bailey, Della; Butler-Whalley, Stephanie; Chew-Graham, Carolyn

    2018-03-07

    Anxiety and depression are common among older people, with up to 20% reporting such symptoms, and the prevalence increases with co-morbid chronic physical health problems. Access to treatment for anxiety and depression in this population is poor due to a combination of factors at the level of patient, practitioner and healthcare system. There is evidence to suggest that older people with anxiety and/or depression may benefit both from one-to-one interventions and group social or educational activities, which reduce loneliness, are participatory and offer some activity. Non-traditional providers (support workers) working within third-sector (voluntary) organisations are a valuable source of expertise within the community but are under-utilised by primary care practitioners. Such a resource could increase access to care, and be less stigmatising and more acceptable for older people. The study is in three phases and this paper describes the protocol for phase III, which will evaluate the feasibility of recruiting general practices and patients into the study, and determine whether support workers can deliver the intervention to older people with sufficient fidelity and whether this approach is acceptable to patients, general practitioners and the third-sector providers. Phase III of the NOTEPAD study is a randomised controlled trial (RCT) that is individually randomised. It will recruit participants from approximately six general practices in the UK. In total, 100 participants aged 65 years and over who score 10 or more on PHQ9 or GAD7 for anxiety or depression will be recruited and randomised to the intervention or usual general practice care. A mixed methods approach will be used and follow-up will be conducted 12 weeks post-randomisation. This study will inform the design and methods of a future full-scale RCT. ISRCTN, ID: ISRCTN16318986. Registered 10 November 2016. The ISRCTN registration is in line with the World Health Organization Trial Registration Data Set

  15. Higgs Machine Learning Challenge 2014

    CERN Multimedia

    Olivier, A-P; Bourdarios, C ; LAL / Orsay; Goldfarb, S ; University of Michigan

    2014-01-01

    High Energy Physics (HEP) has been using Machine Learning (ML) techniques such as boosted decision trees and neural nets since the 90s. These techniques are now routinely used for difficult tasks such as the Higgs boson search. Nevertheless, formal connections between the two research fields are rather scarce, with some exceptions such as the AppStat group at LAL, founded in 2006. In collaboration with INRIA, AppStat promotes interdisciplinary research on machine learning, computational statistics, and high-energy particle and astroparticle physics. We are now exploring new ways to improve the cross-fertilization of the two fields by setting up a data challenge, following the footsteps of, among others, the astrophysics community (dark matter and galaxy zoo challenges) and neurobiology (connectomics and decoding the human brain). The organization committee consists of ATLAS physicists and machine learning researchers. The Challenge will run from Monday 12th to September 2014.
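
    To make the boosted-decision-tree idea concrete, here is a minimal signal-versus-background sketch using scikit-learn; the two "kinematic" features and the event sample are synthetic toys, not the actual HiggsML challenge data or the ATLAS analysis.

        # Toy boosted-decision-tree classification of signal vs. background events.
        import numpy as np
        from sklearn.ensemble import GradientBoostingClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(3)
        n = 2000
        signal = np.column_stack([rng.normal(125, 10, n), rng.normal(2.0, 0.5, n)])
        background = np.column_stack([rng.exponential(60, n), rng.normal(1.0, 0.7, n)])

        X = np.vstack([signal, background])
        y = np.array([1] * n + [0] * n)
        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

        bdt = GradientBoostingClassifier(n_estimators=200, max_depth=3).fit(X_train, y_train)
        print("test accuracy:", bdt.score(X_test, y_test).round(3))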

  16. VIRTUAL MODELING OF A NUMERICAL CONTROL MACHINE TOOL USED FOR COMPLEX MACHINING OPERATIONS

    Directory of Open Access Journals (Sweden)

    POPESCU Adrian

    2015-11-01

    Full Text Available This paper presents the 3D virtual model of the numerical control machine Modustar 100 in terms of its machine elements. This is a CNC machine of modular construction, with all components allowing assembly in various configurations. The paper focuses on the design, by means of CATIA v5, of the subassemblies specific to the numerically controlled axes, which contain the drive kinematic chains of the translation modules that provide motion along the X, Y and Z axes. Machine tool development for high-speed and highly precise cutting demands the employment of advanced simulation techniques, which is reflected in the total development cost of the machine.

  17. Effect of machining fluid on the process performance of wire electrical discharge machining of nanocomposite ceramic

    Directory of Open Access Journals (Sweden)

    Zhang Chengmao

    2015-01-01

    Full Text Available Wire electrical discharge machining (WEDM) promises to be an effective and economical technique for the production of tools and parts from conducting ceramic blanks. However, the manufacturing of nanocomposite ceramic blanks with these processes is a long and costly process. This paper presents a new process of machining nanocomposite ceramics using WEDM. WEDM uses a water-based emulsion, polyvinyl alcohol and distilled water as the machining fluid. The machining fluid is a primary factor that affects the material removal rate and surface quality of WEDM. The effects of the emulsion concentration, the polyvinyl alcohol concentration and distilled water as the machining fluid on the process performance have been investigated.

  18. Addiction Machines

    Directory of Open Access Journals (Sweden)

    James Godley

    2011-10-01

    Full Text Available Entry into the crypt William Burroughs shared with his mother opened and shut around a failed re-enactment of William Tell’s shot through the prop placed upon a loved one’s head. The accidental killing of his wife Joan completed the installation of the addictation machine that spun melancholia as manic dissemination. An early encryptment to which was added the audio portion of abuse deposited an undeliverable message in WB. William could never tell, although his corpus bears the inscription of this impossibility as another form of possibility. James Godley is currently a doctoral candidate in English at SUNY Buffalo, where he studies psychoanalysis, Continental philosophy, and nineteenth-century literature and poetry (British and American). His work on the concept of mourning and “the dead” in Freudian and Lacanian approaches to psychoanalytic thought and in Gothic literature has also spawned an essay on zombie porn. Since entering the Academy of Fine Arts Karlsruhe in 2007, Valentin Hennig has studied in the classes of Silvia Bächli, Claudio Moser, and Corinne Wasmuht. In 2010 he spent a semester at the Dresden Academy of Fine Arts. His work has been shown in group exhibitions in Freiburg and Karlsruhe.

  19. Machine Translation in Post-Contemporary Era

    Science.gov (United States)

    Lin, Grace Hui Chin

    2010-01-01

    This article, focusing on translation techniques via personal computer or laptop, reports on updated artificial intelligence progress before 2010. Based on interpretations of, and information on, the field of MT [Machine Translation] from Yorick Wilks' book, "Machine Translation, Its scope and limits," this paper displays understandable theoretical frameworks…

  20. The effectiveness of using non-traditional teaching methods to prepare student health care professionals for the delivery of mental state examination: a systematic review.

    Science.gov (United States)

    Xie, Huiting; Liu, Lei; Wang, Jia; Joon, Kum Eng; Parasuram, Rajni; Gunasekaran, Jamuna; Poh, Chee Lien

    2015-08-14

    With the evolution of education, there has been a shift from the use of traditional teaching methods, such as didactic or rote teaching, towards non-traditional teaching methods, such as viewing of role plays, simulation, live interviews and the use of virtual environments. Mental state examination is an essential competency for all student healthcare professionals. If mental state examination is not taught effectively enough for learners to comprehend its concepts and interpret the findings correctly, it could lead to serious repercussions and subsequently impact on the clinical care provided for patients with mental health conditions, for example through incorrect assessment of suicidal ideation. However, the methods for teaching mental state examination vary widely between countries, academic institutions and clinical settings. This systematic review aimed to identify and synthesize the best available evidence on effective teaching methods used to prepare student health care professionals for the delivery of mental state examination. This review considered evidence from primary quantitative studies, published in English, which address the effectiveness of a chosen method used for the teaching of mental state examination, including studies that measure learner outcomes, i.e. improved knowledge and skills, self-confidence and learners' satisfaction. A three-step search strategy was undertaken in this review to search for articles published in English from the inception of the databases to December 2014. An initial search of MEDLINE and CINAHL was undertaken to identify keywords. Secondly, the keywords identified were used to search electronic databases, namely CINAHL, Medline, Cochrane Central Register of Controlled Trials, Ovid, PsycINFO and ProQuest Dissertations & Theses. Thirdly, reference lists of the articles identified in the second stage were searched for other relevant studies. Studies selected were assessed by two independent reviewers for methodological