WorldWideScience

Sample records for adaptive techniques applied

  1. Applying perceptual and adaptive learning techniques for teaching introductory histopathology

    Sally Krasne

    2013-01-01

    Background: Medical students are expected to master the ability to interpret histopathologic images, a difficult and time-consuming process. A major problem is the issue of transferring information learned from one example of a particular pathology to a new example. Recent advances in cognitive science have identified new approaches to address this problem. Methods: We adapted a new approach for enhancing pattern recognition of basic pathologic processes in skin histopathology images that utilizes perceptual learning techniques, allowing learners to see relevant structure in novel cases, along with adaptive learning algorithms that space and sequence the different categories (e.g. diagnoses) that appear during a learning session based on each learner's accuracy and response time (RT). We developed a perceptual and adaptive learning module (PALM) that utilized 261 unique images of cell injury, inflammation, neoplasia, or normal histology at low and high magnification. Accuracy and RT were tracked and integrated into a "Score" that reflected students' rapid recognition of the pathologies, and pre- and post-tests were given to assess effectiveness. Results: Accuracy, RT and Scores improved significantly from the pre- to the post-test, with Scores showing much greater improvement than accuracy alone. Delayed post-tests with previously unseen cases, given after 6-7 weeks, showed a decline in accuracy relative to the post-test for 1st-year students, but not significantly so for 2nd-year students. However, the delayed post-test scores maintained a significant and large improvement relative to those of the pre-test for both 1st- and 2nd-year students, suggesting good retention of pattern recognition. Student evaluations were very favorable. Conclusion: A web-based learning module based on the principles of cognitive science showed evidence of improved recognition of histopathology patterns by medical students.
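    The spacing-and-sequencing idea can be illustrated with a small priority rule: categories the learner answers slowly or inaccurately are scheduled more often. This is a hypothetical simplification sketched in Python — the abstract does not give PALM's actual algorithm, and the category names, RT weighting and scheduler below are invented for illustration:

```python
class AdaptiveScheduler:
    """Toy adaptive sequencer: per-category accuracy and response time (RT)
    are combined into a priority, and the weakest category is shown next."""

    def __init__(self, categories):
        self.stats = {c: {"correct": 0, "trials": 0, "rt_sum": 0.0}
                      for c in categories}

    def record(self, category, correct, rt):
        s = self.stats[category]
        s["trials"] += 1
        s["correct"] += int(correct)
        s["rt_sum"] += rt

    def priority(self, category):
        s = self.stats[category]
        if s["trials"] == 0:
            return float("inf")                   # unseen categories come first
        accuracy = s["correct"] / s["trials"]
        mean_rt = s["rt_sum"] / s["trials"]
        return (1.0 - accuracy) + mean_rt / 10.0  # arbitrary RT weighting

    def next_category(self):
        return max(self.stats, key=self.priority)

sched = AdaptiveScheduler(["inflammation", "neoplasia", "normal"])
sched.record("inflammation", correct=True, rt=2.0)   # fast and right
sched.record("neoplasia", correct=False, rt=8.0)     # slow and wrong
print(sched.next_category())  # -> normal (never seen, so it is shown first)
```

    Once every category has been seen, the slow or inaccurate ones ("neoplasia" here) dominate the queue, which is the adaptive spacing effect the module exploits.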

  2. Acceptance and Mindfulness Techniques as Applied to Refugee and Ethnic Minority Populations with PTSD: Examples from "Culturally Adapted CBT"

    Hinton, Devon E.; Pich, Vuth; Hofmann, Stefan G.; Otto, Michael W.

    2013-01-01

    In this article we illustrate how we utilize acceptance and mindfulness techniques in our treatment (Culturally Adapted CBT, or CA-CBT) for traumatized refugees and ethnic minority populations. We present a Nodal Network Model (NNM) of Affect to explain the treatment's emphasis on body-centered mindfulness techniques and its focus on psychological…

  3. Applying contemporary statistical techniques

    Wilcox, Rand R

    2003-01-01

    Applying Contemporary Statistical Techniques explains why traditional statistical methods are often inadequate or outdated when applied to modern problems. Wilcox demonstrates how new and more powerful techniques address these problems far more effectively, making these modern robust methods understandable, practical, and easily accessible.
    * Assumes no previous training in statistics
    * Explains how and why modern statistical methods provide more accurate results than conventional methods
    * Covers the latest developments on multiple comparisons
    * Includes recent advanc

  4. Applied ALARA techniques

    The presentation focuses on some of the time-proven and new technologies being used to accomplish radiological work. These techniques can be applied at nuclear facilities to reduce radiation doses and protect the environment. The last reactor plants and processing facilities were shut down and Hanford was given a new mission: to put the facilities in a safe condition, decontaminate them, and prepare them for decommissioning. The skills that were necessary to operate these facilities are different from the skills needed today to clean up Hanford. Workers were not familiar with many of the tools, equipment, and materials needed to accomplish the new mission, which includes cleanup of contaminated areas in and around all the facilities, recovery of reactor fuel from spent fuel pools, and removal of millions of gallons of highly radioactive waste from 177 underground tanks. In addition, this work has to be done with a reduced number of workers and a smaller budget. At Hanford, facilities contain a myriad of radioactive isotopes located inside plant systems, underground tanks, and the soil. As cleanup work at Hanford began, it became obvious early on that, in order to get workers to apply ALARA and use new tools and equipment to accomplish the radiological work, it was necessary to plan the work in advance and get radiological control and/or ALARA committee personnel involved early in the planning process. Emphasis was placed on applying ALARA techniques to reduce dose, limit contamination spread and minimize the amount of radioactive waste generated. Progress on the cleanup has been steady, and Hanford workers have learned to use different types of engineered controls and ALARA techniques to perform radiological work. The purpose of this presentation is to share the lessons learned on how Hanford is accomplishing radiological work

  5. Adaptive cancellation techniques

    1983-11-01

    An adaptive signal canceller has been evaluated for the enhancement of pulse signal reception during the transmission of a high-power ECM jamming signal. The canceller design is based on the use of DRFM (Digital RF Memory) technology as part of an adaptive multiple-tapped delay line. The study includes analysis of the relationship between tap spacing and waveform bandwidth, a survey of related documents in the areas of sidelobe cancellers, transversal equalizers, and adaptive filters, and derivation of control equations and corresponding control processes. The simulation of the overall processes included geometric analysis of the multibeam transmitting antenna, multiple reflection sources and the receiving antenna; waveforms, tap spacings and bandwidths; and alternate control algorithms. Conclusions are provided regarding practical system control algorithms, design characteristics and limitations.
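    The adaptive tapped-delay-line idea can be sketched with a standard LMS (least-mean-squares) update. This is a generic textbook sketch, not the report's actual control equations, and the signals below are invented for illustration:

```python
import math

def lms_cancel(reference, primary, n_taps=4, mu=0.05):
    """Subtract an adaptively filtered copy of `reference` from `primary`;
    the residual is returned and also drives the tap-weight update (LMS)."""
    w = [0.0] * n_taps
    residual = []
    for n in range(len(primary)):
        x = [reference[n - k] if n - k >= 0 else 0.0 for k in range(n_taps)]
        y = sum(wi * xi for wi, xi in zip(w, x))            # canceller output
        e = primary[n] - y                                  # what remains
        w = [wi + 2 * mu * e * xi for wi, xi in zip(w, x)]  # LMS weight update
        residual.append(e)
    return residual

# The interfering component of `primary` is a delayed, scaled copy of the
# transmitted `reference`, so the canceller should drive the residual to zero.
ref = [math.sin(0.3 * n) for n in range(500)]
pri = [0.8 * ref[n - 2] if n >= 2 else 0.0 for n in range(500)]
residual = lms_cancel(ref, pri)
print(abs(residual[-1]))  # small after convergence
```

    The tap weights converge toward a delayed, scaled copy of the reference path (here w[2] -> 0.8), which is exactly what a multiple-tapped delay line canceller has to learn.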

  6. Techniques of English Textbooks Adaptation

    张婧雯; 杨竞欧

    2014-01-01

    This essay aims to help English teachers evaluate and adapt current English textbooks. According to the different levels and majors of their students, English teachers can enhance the teaching materials and their teaching skills. This paper provides several useful techniques for teachers to use in evaluating and adapting teaching materials.

  7. Adaptive Educational Software by Applying Reinforcement Learning

    Abdellah BENNANE

    2013-01-01

    The introduction of intelligence into teaching software is the subject of this paper. In the software elaboration process, learning techniques are used to adapt the teaching software to the characteristics of the student. Generally, artificial intelligence techniques such as reinforcement learning and Bayesian networks are used to adapt the system to the internal and external conditions of the environment, and to allow the system to interact efficiently with its potential users. The intention...
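    As a minimal illustration of reinforcement learning in a tutoring loop, here is a tabular Q-learning sketch on a toy "mastery chain". The states, actions and rewards are invented for illustration; the paper's actual pedagogical model is not specified in this abstract:

```python
import random

def q_learning(n_states, n_actions, step, episodes=200,
               alpha=0.5, gamma=0.9, eps=0.1):
    """Tabular Q-learning with epsilon-greedy exploration."""
    Q = [[0.0] * n_actions for _ in range(n_states)]
    for _ in range(episodes):
        s, done = 0, False
        while not done:
            if random.random() < eps:
                a = random.randrange(n_actions)                   # explore
            else:
                a = max(range(n_actions), key=lambda x: Q[s][x])  # exploit
            s2, r, done = step(s, a)
            Q[s][a] += alpha * (r + gamma * max(Q[s2]) * (not done) - Q[s][a])
            s = s2
    return Q

# Toy tutoring chain: states are mastery levels 0..3; action 1 ("adapted
# exercise") advances the student, action 0 ("generic exercise") does not;
# reaching level 3 ends the session with reward 1.
def step(s, a):
    s2 = min(s + 1, 3) if a == 1 else s
    return s2, (1.0 if s2 == 3 else 0.0), s2 == 3

random.seed(0)
Q = q_learning(4, 2, step)
policy = [max(range(2), key=lambda a: Q[s][a]) for s in range(3)]
print(policy)  # -> [1, 1, 1]: the adapted exercise is preferred everywhere
```

    The learned policy is the adaptive behaviour: the system discovers which action moves this student forward, purely from reward feedback.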

  8. Adaptive Control Applied to Financial Market Data

    Šindelář, Jan; Kárný, Miroslav

    Strasbourg cedex: European Science Foundation, 2007, s. 1-6. [Advanced Mathematical Methods for Finance. Vídeň (AT), 17.09.2007-22.09.2007] R&D Projects: GA MŠk(CZ) 2C06001 Institutional research plan: CEZ:AV0Z10750506 Keywords : bayesian statistics * portfolio optimization * finance * adaptive control Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2007/si/sindelar-adaptive control applied to financial market data.pdf

  9. Adaptive Control Using Residual Mode Filters Applied to Wind Turbines

    Frost, Susan A.; Balas, Mark J.

    2011-01-01

    Many dynamic systems containing a large number of modes can benefit from adaptive control techniques, which are well suited to applications that have unknown parameters and poorly known operating conditions. In this paper, we focus on a model reference direct adaptive control approach that has been extended to handle adaptive rejection of persistent disturbances. We extend this adaptive control theory to accommodate problematic modal subsystems of a plant that inhibit the adaptive controller by causing the open-loop plant to be non-minimum phase. We will augment the adaptive controller using a Residual Mode Filter (RMF) to compensate for problematic modal subsystems, thereby allowing the system to satisfy the requirements for the adaptive controller to have guaranteed convergence and bounded gains. We apply these theoretical results to design an adaptive collective pitch controller for a high-fidelity simulation of a utility-scale, variable-speed wind turbine that has minimum phase zeros.

  10. Adaptive multiresolution computations applied to detonations

    Roussel, Olivier

    2015-01-01

    A space-time adaptive method is presented for the reactive Euler equations describing chemically reacting gas flow, where a two-species model is used for the chemistry. The governing equations are discretized with a finite volume method, and dynamic space adaptivity is introduced using multiresolution analysis. Strang's time-splitting method is applied so that stiff problems can be considered while keeping the method explicit. For time adaptivity, an improved Runge-Kutta-Fehlberg scheme is used. Applications deal with detonation problems in one and two space dimensions. A comparison of the adaptive scheme with reference computations on a regular grid allows assessment of the accuracy and the computational efficiency, in terms of CPU time and memory requirements.
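    Strang splitting itself is easy to show on a toy problem: du/dt = f(u) + g(u) is advanced by a half step of g, a full step of f, and another half step of g. In the sketch below both sub-operators are linear and commute, so the splitting happens to be exact; the paper applies the same pattern to convection plus stiff chemistry, which is an assumption-free illustration of the scheme only, not of the reactive Euler solver:

```python
import math

def strang_step(u, dt, f_step, g_step):
    """One Strang-split step for du/dt = f(u) + g(u): half step of g,
    full step of f, half step of g (second-order accurate in dt)."""
    u = g_step(u, dt / 2)   # e.g. stiff reaction sub-step
    u = f_step(u, dt)       # e.g. convection sub-step
    u = g_step(u, dt / 2)
    return u

# Toy split of du/dt = -u - 10u, each sub-step solved exactly.
f_step = lambda u, dt: u * math.exp(-dt)
g_step = lambda u, dt: u * math.exp(-10 * dt)

u, dt = 1.0, 0.01
for _ in range(100):
    u = strang_step(u, dt, f_step, g_step)
print(abs(u - math.exp(-11.0)))  # essentially zero for this commuting pair
```

    In the real solver the g sub-step would be a stiff ODE integrator for the chemistry and the f sub-step the explicit finite volume update, which is what keeps the overall method explicit yet stable.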

  11. New Adaptive Optics Technique Demonstrated

    2007-03-01

    First ever Multi-Conjugate Adaptive Optics at the VLT Achieves First Light On the evening of 25 March 2007, the Multi-Conjugate Adaptive Optics Demonstrator (MAD) achieved First Light at the Visitor Focus of Melipal, the third Unit Telescope of the Very Large Telescope (VLT). MAD allowed the scientists to obtain images corrected for the blurring effect of atmospheric turbulence over the full 2x2 arcminute field of view. This world premiere shows the promise of a crucial technology for Extremely Large Telescopes. ESO PR Photo 19a/07: The MCAO Demonstrator. Telescopes on the ground suffer from the blurring effect induced by atmospheric turbulence. This turbulence causes the stars to twinkle in a way which delights the poets but frustrates the astronomers, since it blurs the fine details of the images. However, with Adaptive Optics (AO) techniques, this major drawback can be overcome so that the telescope produces images that are as sharp as theoretically possible, i.e., approaching space conditions. Adaptive Optics systems work by means of a computer-controlled deformable mirror (DM) that counteracts the image distortion induced by atmospheric turbulence. The correction is computed in real time from image data obtained by a 'wavefront sensor' (a special camera) at very high speed, many hundreds of times each second. The concept is not new. Already in 1989, the first Adaptive Optics system ever built for Astronomy (aptly named "COME-ON") was installed on the 3.6-m telescope at the ESO La Silla Observatory, as the early fruit of a highly successful continuing collaboration between ESO and French research institutes (ONERA and Observatoire de Paris). Ten years ago, ESO initiated an Adaptive Optics program to serve the needs of its frontline VLT project. Today, the Paranal Observatory is without any doubt one of the most advanced of its kind with respect to AO, with no less than 7 systems currently installed (NACO, SINFONI, CRIRES and

  12. Adaptive Control Applied to Financial Market Data

    Šindelář, Jan; Kárný, Miroslav

    Vol. I. Praha : Matfyz press, 2007, s. 1-6. ISBN 978-80-7378-023-4. [Week of Doctoral Students 2007. Praha (CZ), 05.06.2007-08.06.2007] R&D Projects: GA MŠk(CZ) 2C06001 Institutional research plan: CEZ:AV0Z10750506 Keywords : bayesian statistics * finance * financial engineering * stochastic control Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2007/si/sindelar-adaptive control applied to financial market data.pdf

  13. Development of applied optical techniques

    This report presents the status of research on laser applications at KAERI. A compact portable laser fluorometer detecting uranium dissolved in aqueous solution was built. The laser-induced fluorescence of uranium was detected with a photomultiplier tube. A delayed gate circuit and an integrating circuit were used to process the electrical signal. A small nitrogen laser was used to excite the uranium. The detection limit is about 0.1 ppb. The effect of various acidic solutions was investigated. A standard addition technique was incorporated to improve the measuring accuracy. This instrument can be used for safety inspection of workers in nuclear fuel cycle facilities. (Author)

  14. Development of applied optical techniques

    The objective of this project is to improve laser application techniques in the nuclear industry. A small, light and portable laser-induced fluorometer was developed. It was designed to compensate for inner filter and quenching effects by on-line data processing during analysis of uranium in aqueous solution. A computer interface improves the accuracy and data processing capabilities of the instrument. Its detection limit is as low as 0.1 ppb of uranium. It is ready for use in routine chemical analysis. Feasible applications, such as uranium level monitoring in discards from a reconversion plant or fuel fabrication plant, were considered with minor modification of the instrument. It will be used to study trace analysis of rare-earth elements. The IRMPD of CHF3 was carried out and the effects of buffer gases such as Ar, N2 and SF6 were investigated. The IRMPD rate increased with increasing pressure of the reactant and buffer gases. The pressure effect of the reactant CHF3 below 0.1 Torr showed the opposite behaviour. It was considered that competition between the quenching effect and the rotational hole-filling effect during intermolecular collisions plays a major role in this low-pressure region. The applications of holography in nuclear fuel cycle facilities were surveyed and analyzed. Also, experimental apparatus such as an Ar ion laser, various kinds of holographic films and several optical components were prepared. (Author)

  15. Computational optimization techniques applied to microgrids planning

    Gamarra, Carlos; Guerrero, Josep M.

    2015-01-01

    …appear along the planning process. In this context, the technical literature about optimization techniques applied to microgrid planning has been reviewed, and guidelines for innovative planning methodologies focused on economic feasibility can be defined. Finally, some trending techniques and new...

  16. Adaptive techniques in electrical impedance tomography reconstruction

    We present an adaptive algorithm for solving the inverse problem in electrical impedance tomography. To strike a balance between the accuracy of the reconstructed images and the computational efficiency of the forward and inverse solvers, we propose to combine an adaptive mesh refinement technique with the adaptive Kaczmarz method. The iterative algorithm adaptively generates the optimal current patterns and a locally-refined mesh given the conductivity estimate and solves for the unknown conductivity distribution with the block Kaczmarz update step. Simulation and experimental results with numerical analysis demonstrate the accuracy and the efficiency of the proposed algorithm. (paper)
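    The Kaczmarz update at the core of such a solver is compact: the estimate is projected onto the hyperplane of one linearised measurement row at a time. Below is a plain row-by-row sketch; the paper uses a block variant together with adaptive current patterns and mesh refinement, which are omitted here, and the tiny system is invented for illustration:

```python
def kaczmarz(J, b, sweeps=50):
    """Solve J x = b by cyclically projecting x onto each row's hyperplane."""
    x = [0.0] * len(J[0])
    for _ in range(sweeps):
        for row, bi in zip(J, b):
            dot = sum(r * xi for r, xi in zip(row, x))
            norm2 = sum(r * r for r in row)
            c = (bi - dot) / norm2              # signed distance to hyperplane
            x = [xi + c * r for xi, r in zip(x, row)]
    return x

# Tiny consistent system standing in for a linearised sensitivity matrix.
J = [[2.0, 1.0], [1.0, 3.0]]
b = [3.0, 4.0]          # exact solution is x = [1, 1]
x = kaczmarz(J, b)
print(x)  # converges to [1.0, 1.0]
```

    For consistent systems the iteration converges geometrically; the adaptive parts of the algorithm above choose which "rows" (current patterns) to measure and where to refine the mesh.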

  17. A novel online adaptive time delay identification technique

    Bayrak, Alper; Tatlicioglu, Enver

    2016-05-01

    Time delay is a phenomenon which is common in signal processing, communication, control applications, etc. What makes time delay attractive as a research topic is that it is a commonly faced problem in many systems. A literature search on time-delay identification highlights the fact that most studies have focused on numerical solutions. In this study, a novel online adaptive time-delay identification technique is proposed. This technique is based on an adaptive update law derived through a minimum-maximum strategy, applied here for the first time to time-delay identification. In the design of the adaptive identification law, Lyapunov-based stability analysis techniques are utilised. Several numerical simulations were conducted with Matlab/Simulink to evaluate the performance of the proposed technique. It is numerically demonstrated that the proposed technique works efficiently in identifying both constant and disturbed time delays, and is also robust to measurement noise.
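    For orientation, the most basic numerical approach the authors contrast with is something like a cross-correlation search over candidate delays. This stand-in is not the paper's adaptive minimum-maximum law, and the signals are invented:

```python
import math

def estimate_delay(x, y, max_delay):
    """Return the integer delay d maximising the correlation of x[n-d] with y[n]."""
    def corr(d):
        return sum(x[n - d] * y[n] for n in range(d, len(y)))
    return max(range(max_delay + 1), key=corr)

x = [math.sin(0.2 * n) for n in range(300)]
y = [x[n - 7] if n >= 7 else 0.0 for n in range(300)]   # x delayed by 7 samples
print(estimate_delay(x, y, 20))  # -> 7
```

    An adaptive identification law differs from this batch search in that it refines a continuous delay estimate online, sample by sample, with stability guarantees.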

  18. Applying Machine Learning Techniques to ASP Solving

    Maratea, Marco; Pulina, Luca; Ricca, Francesco

    2012-01-01

    Having in mind the task of improving the solving methods for Answer Set Programming (ASP), there are two usual ways to reach this goal: (i) extending state-of-the-art techniques and ASP solvers, or (ii) designing a new ASP solver from scratch. An alternative to these trends is to build on top of state-of-the-art solvers, and to apply machine learning techniques for automatically choosing the “best” available solver on a per-instance basis. In this paper we pursue this latter direction. ...
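    Per-instance selection can be sketched with a nearest-neighbour rule: pick the solver that was fastest on the most similar training instance. The features and runtimes below are invented, and 1-NN is only a stand-in for whatever learned model a real selector uses (clasp and dlv are actual ASP solvers, but these timings are not real measurements):

```python
def select_solver(features, training_set):
    """Choose the solver that was fastest on the nearest training instance."""
    def dist2(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b))
    nearest = min(training_set, key=lambda inst: dist2(inst["features"], features))
    return min(nearest["runtimes"], key=nearest["runtimes"].get)

# Hypothetical instances described by [number of rules, constraint density].
training_set = [
    {"features": [100, 0.9], "runtimes": {"clasp": 1.2, "dlv": 7.5}},
    {"features": [90000, 0.1], "runtimes": {"clasp": 30.0, "dlv": 4.0}},
]
print(select_solver([80000, 0.15], training_set))  # -> dlv
```

    The whole approach stands or falls on cheap-to-compute instance features that correlate with solver performance, which is what the paper's learning step provides.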

  19. Adaptive Response Surface Techniques in Reliability Estimation

    Enevoldsen, I.; Faber, M. H.; Sørensen, John Dalsgaard

    1993-01-01

    Problems are considered in connection with estimation of the reliability of a component modelled by a limit state function including noise or first-order discontinuities. A gradient-free adaptive response surface algorithm is developed. The algorithm applies second-order polynomial surfaces determined from central composite designs. In a two-phase algorithm, the second-order surface is adjusted to the domain of the most likely failure point, and both FORM and SORM estimates are obtained. The algorithm is implemented as a safeguard algorithm so that non-converged solutions are avoided. Furthermore, a...

  20. Image reconstruction techniques applied to nuclear mass models

    Morales, Irving O.; Isacker, P. Van; Velazquez, V.; Barea, J.; Mendoza-Temis, J.; Vieyra, J. C. López; Hirsch, J. G.; Frank, A.

    2010-02-01

    A new procedure is presented that combines well-known nuclear models with image reconstruction techniques. A color-coded image is built by taking the differences between measured masses and the predictions given by the different theoretical models. This image is viewed as part of a larger array in the (N,Z) plane, where unknown nuclear masses are hidden, covered by a “mask.” We apply a suitably adapted deconvolution algorithm, used in astronomical observations, to “open the window” and see the rest of the pattern. We show that it is possible to improve significantly mass predictions in regions not too far from measured nuclear masses.
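    A well-known astronomy-style iterative deconvolution of this general kind is Richardson-Lucy; the 1-D sketch below blurs a spike and then re-concentrates it. This is an illustrative stand-in only: the authors' actual algorithm, point-spread function and masking scheme are not given in the abstract:

```python
# 1-D Richardson-Lucy deconvolution sketch with edge-clamped convolution.
def convolve(x, psf):
    half = len(psf) // 2
    n = len(x)
    return [sum(psf[k] * x[min(max(i + k - half, 0), n - 1)]
                for k in range(len(psf))) for i in range(n)]

def richardson_lucy(observed, psf, iters=200):
    est = [1.0] * len(observed)          # flat, nonnegative initial guess
    psf_rev = psf[::-1]
    for _ in range(iters):
        blurred = convolve(est, psf)
        ratio = [o / max(b, 1e-12) for o, b in zip(observed, blurred)]
        est = [e * c for e, c in zip(est, convolve(ratio, psf_rev))]
    return est

# Blur a single spike and recover it: the deconvolved estimate
# re-concentrates the signal at the original position.
true = [0.0, 0.0, 4.0, 0.0, 0.0]
psf = [0.25, 0.5, 0.25]
observed = convolve(true, psf)           # [0, 1, 2, 1, 0]
est = richardson_lucy(observed, psf)
print(est.index(max(est)))  # -> 2, the original spike position
```

    In the mass-model application the "image" is the residual between measured and predicted masses on the (N,Z) grid, and the masked region plays the role of missing pixels.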

  2. Adaptive Robotic Systems Design in University of Applied Sciences

    Gunsing Jos; Gijselhart Fons; Hagemans Nyke; Jonkers Hans; Kivits Eric; Klijn Peter; Kapteijns Bart; Kroeske Diederich; Langen Hans; Oerlemans Bart; Oostindie Jan; van Stuijvenberg Joost

    2016-01-01

    In the industry for highly specialized machine building (small series with high variety and high complexity) and in healthcare a demand for adaptive robotics is rapidly coming up. Technically skilled people are not always available in sufficient numbers. A lot of know how with respect to the required technologies is available but successful adaptive robotic system designs are still rare. In our research at the university of applied sciences we incorporate new available technologies in our edu...

  3. Ordering operator technique applied to open systems

    The normal ordering technique and the coherent representation are used to describe the evolution of an open system consisting of a single oscillator linearly coupled with an infinite number of reservoir oscillators, and it is shown how to include dissipation and obtain the exponential decay. (Author)

  4. Applying Mixed Methods Techniques in Strategic Planning

    Voorhees, Richard A.

    2008-01-01

    In its most basic form, strategic planning is a process of anticipating change, identifying new opportunities, and executing strategy. The use of mixed methods, blending quantitative and qualitative analytical techniques and data, in the process of assembling a strategic plan can help to ensure a successful outcome. In this article, the author…

  5. Parameter Identification and Adaptive Control Applied to the Inverted Pendulum

    Carlos A. Saldarriaga-Cortés

    2012-06-01

    This paper presents a methodology for implementing adaptive control of the inverted pendulum system, which uses the recursive least-squares method for the identification of a dynamic digital model of the plant and then, with its estimated parameters, tunes a pole-placement controller in real time. The plant to be used is an unstable and nonlinear system. This fact, combined with the adaptive controller characteristics, allows the obtained results to be extended to a great variety of systems. The results show that the above methodology was implemented satisfactorily in terms of estimation, stability and control of such a system. It was established that adaptive techniques perform properly even in systems with complex features such as nonlinearity and instability.
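    The identification half of such a scheme can be sketched with textbook recursive least squares (RLS) on a known first-order plant. This is a generic sketch, not the authors' implementation, and the plant parameters (a = 0.8, b = 0.5) are invented:

```python
import random

def rls_identify(us, ys, lam=0.99):
    """Estimate (a, b) in y[k] = a*y[k-1] + b*u[k-1] by recursive least squares."""
    theta = [0.0, 0.0]                       # parameter estimate [a, b]
    P = [[1000.0, 0.0], [0.0, 1000.0]]       # large initial covariance
    for k in range(1, len(ys)):
        phi = [ys[k - 1], us[k - 1]]         # regressor
        Pphi = [sum(P[i][j] * phi[j] for j in range(2)) for i in range(2)]
        denom = lam + sum(phi[i] * Pphi[i] for i in range(2))
        K = [p / denom for p in Pphi]        # gain
        err = ys[k] - sum(t * p for t, p in zip(theta, phi))
        theta = [t + Ki * err for t, Ki in zip(theta, K)]
        P = [[(P[i][j] - K[i] * Pphi[j]) / lam for j in range(2)]
             for i in range(2)]              # covariance update with forgetting
    return theta

# Simulate the known plant and recover its parameters from input/output data.
random.seed(1)
us = [random.uniform(-1, 1) for _ in range(200)]
ys = [0.0]
for k in range(1, 200):
    ys.append(0.8 * ys[k - 1] + 0.5 * us[k - 1])
a, b = rls_identify(us, ys)
print(round(a, 3), round(b, 3))  # -> 0.8 0.5
```

    In the adaptive controller the recovered (a, b) would be fed, at each sampling instant, into a pole-placement design that moves the closed-loop poles to the desired locations.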

  6. Digital Speckle Technique Applied to Flow Visualization

    2000-01-01

    Digital speckle technique uses a laser, a CCD camera, and digital processing to generate interference fringes at the television framing rate. Its most obvious advantage is that neither darkroom facilities nor photographic wet chemical processing is required. In addition, it can be used in harsh engineering environments. This paper discusses the strengths and weaknesses of three digital speckle methodologies. (1) Digital speckle pattern interferometry (DSPI) uses an optical polarization phase shifter for visualization and measurement of the density field in a flow field. (2) Digital shearing speckle interferometry (DSSI) utilizes speckle-shearing interferometry in addition to optical polarization phase shifting. (3) Digital speckle photography (DSP) with computer reconstruction. The discussion describes the concepts, the principles and the experimental arrangements with some experimental results. The investigation shows that these three digital speckle techniques provide an excellent method for visualizing flow fields and for measuring density distributions in fluid mechanics and thermal flows.

  7. Modern NDT techniques applied to composite parts

    There are many Non-Destructive Testing (NDT) techniques used in qualifying composite parts. Composite materials pose a significant challenge for defect detection, since they are non-homogeneous and anisotropic in nature. Throughout their life cycle, composites are susceptible to the formation of many defects such as delamination, matrix cracking, fiber fracture, fiber pullout and impact damage. Various NDT methods are used for qualifying composite parts, such as Ultrasonic, Radiographic, and Eddy Current testing. However, the latest techniques are Infra-Red Thermography, Neutron Radiography, Optical Holography, X-ray Computed Tomography and Acoustic Microscopy. This paper deals with each of these methods and their suitability for different kinds of composites. (author)

  8. Neutron contrast techniques applied to oxide glasses

    Neutron scattering with isotopic substitution, particularly first and second difference methods, are proving to be excellent techniques for studies of the structure of oxide glasses. Several examples are given in which the measurements provide information that is difficult or impossible to obtain otherwise, for example, accurate, detailed distributions of first- to third-neighbours of Ca, Cu or Ni in silicate and phosphate glasses. In favourable cases, it is also possible to measure, directly, Ca-Ca and Ni-Ni first- and second-neighbour distributions. The relevance of complementary techniques, XAFS, differential anomalous x-ray scattering, x-ray scattering from glasses containing elements of high atomic numbers, is also discussed. (author). 6 figs., 11 refs

  9. Innovative techniques applied to ABWR project engineering

    General Electric's (GE) Advanced Boiling Water Reactor (ABWR) project is characterised by the use of new production methods and tools, a document configuration system that was defined from the outset, and wide-ranging, smooth communications. The project also had a large number of participating companies from different cities in the US (San Jose, San Francisco, Kansas City, Washington), Mexico (Veracruz) and Spain (Madrid). One of the basic requirements applicable to advanced nuclear power plant projects is the need for an Information Management System (IMS) which shall be valid for the entire life of the plant, which means that all the documentation must be available in electronic format. The basic engineering tool for the ABWR project is POWRTRAK, a computer application developed by Black and Veatch (B and V). POWRTRAK comprises a single database, in which each datum is stored in only one place and used in real time. It consists of various modules, some of which are associated with technical data and the generation of diagrams (CASES, an application used to generate piping and instrumentation, logic and electric wiring diagrams), a three-dimensional electronic mock-up, planning, purchasing management, etc. GE adapted the Odesta Document Management System (ODMS) commercial application to its documentation filing/control needs. In this system all the documentation produced in the project is filed in both native and universal formats (PDF). (Author)

  10. Basic principles of applied nuclear techniques

    The technological applications of radioactive isotopes and radiation in South Africa have grown steadily since the first consignment of man-made radioisotopes reached this country in 1948. By the end of 1975 there were 412 authorised non-medical organisations (327 industries) using hundreds of sealed sources as well as their fair share of the thousands of radioisotope consignments, annually either imported or produced locally (mainly for medical purposes). Consequently, it is necessary for South African technologists to understand the principles of radioactivity in order to appreciate the industrial applications of nuclear techniques

  11. Applying Cooperative Techniques in Teaching Problem Solving

    Krisztina Barczi

    2013-12-01

    Teaching how to solve problems – from solving simple equations to solving difficult competition tasks – has been one of the greatest challenges of mathematics education for many years. Trying to find an effective method is an important educational task. Among others, the question arises as to whether a method in which students help each other might be useful. The present article describes part of an experiment that was designed to determine the effects of cooperative teaching techniques on the development of problem-solving skills.

  12. Applying Adapted Big Five Teamwork Theory to Agile Software Development

    Strode, Diane

    2016-01-01

    Teamwork is a central tenet of agile software development, and various teamwork theories partially explain teamwork in that context. Big Five teamwork theory is one of the most influential teamwork theories, but prior research shows that the team leadership concept in this theory is not applicable to agile software development. This paper applies an adapted form of Big Five teamwork theory to cases of agile software development. Three independent cases were drawn from a single organisation....

  13. Nuclear analytical techniques applied to forensic chemistry

    Gunshot residues produced by firing guns are mainly composed of visible particles. The individual characterization of these particles allows distinguishing those containing heavy metals, originating from gunshot residues, from those having a different origin or history. In this work, the results obtained from the study of gunshot residue particles collected from hands are presented. The aim of the analysis is to establish whether a person has fired a gun or has been in contact with one after the shot was produced. As reference samples, particles collected from the hands of persons engaged in different activities were studied for comparison. The complete study was based on the application of nuclear analytical techniques such as Scanning Electron Microscopy, Energy Dispersive X-Ray Electron Probe Microanalysis and Graphite Furnace Atomic Absorption Spectrometry. The assays can be completed within times compatible with forensic requirements. (author)

  14. Fast digitizing techniques applied to scintillation detectors

    A 200 MHz 12-bit fast transient recorder card has been used for the digitization of pulses from photomultipliers coupled to organic scintillation detectors. Two modes of operation have been developed at ENEA-Frascati: a) continuous acquisition up to a maximum duration of ∼ 1.3 s corresponding to the full on-board memory (256 MSamples) of the card: in this mode, all scintillation events are recorded; b) non-continuous acquisition in which digitization is triggered by those scintillaton events whose amplitude is above a threshold value: the digitizing interval after each trigger can be set according to the typical decay time of the scintillation events; longer acquisition durations (>1.3 s) can be reached, although with dead time (needed for data storage) which depends on the incoming event rate. Several important features are provided by this novel digital approach: high count rate operation, pulse shape analysis, post-experiment data re-processing, pile-up identification and treatment. In particular, NE213 scintillators have been successfully used with this system for measurements in mixed neutron (n) and gamma (γ) radiation fields from fusion plasmas: separation between γ and neutron events is made by means of a dedicated software comparing the pulse charge integrated in two different time intervals and simultaneous neutron and γ pulse height spectra can be recorded at total count rates in the MHz range. It has been demonstrated that, for scintillation detection applications, 12-bit fast transient recorder cards offer improved performance with respect to analogue hardware; other radiation detectors where pulse identification or high count rate is required might also benefit from such digitizing techniques

  15. Adaptive resource allocation architecture applied to line tracking

    Owen, Mark W.; Pace, Donald W.

    2000-04-01

    Recent research has demonstrated the benefits of a multiple hypothesis, multiple model sonar line tracking solution, achieved at significant computational cost. We have developed an adaptive architecture that trades computational resources for algorithm complexity based on environmental conditions. A Fuzzy Logic Rule-Based approach is applied to adaptively assign algorithmic resources to meet system requirements. The resources allocated by the Fuzzy Logic algorithm include (1) the number of hypotheses permitted (yielding multi-hypothesis and single-hypothesis modes), (2) the number of signal models to use (yielding an interacting multiple model capability), (3) a new track likelihood for hypothesis generation, (4) track attribute evaluator activation (for signal to noise ratio, frequency bandwidth, and others), and (5) adaptive cluster threshold control. Algorithm allocation is driven by a comparison of current throughput rates to a desired real time rate. The Fuzzy Logic Controlled (FLC) line tracker, a single hypothesis line tracker, and a multiple hypothesis line tracker are compared on real sonar data. System resource usage results demonstrate the utility of the FLC line tracker.
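    The throughput-driven allocation can be sketched with a small fuzzy-style rule base: compare current throughput to the required real-time rate and shrink the hypothesis budget when the tracker falls behind. The membership functions, budgets and rule weights below are hypothetical, not those of the cited system:

```python
def allocate_hypotheses(throughput, required, max_hyps=50):
    """Fuzzy-style budget rule: behind -> prune to 1 hypothesis,
    on pace -> mid budget, ahead -> full multi-hypothesis mode."""
    ratio = throughput / required
    # clamped triangular memberships in "behind", "ok", "ahead"
    behind = max(0.0, min(1.0, (1.0 - ratio) / 0.5))
    ahead = max(0.0, min(1.0, (ratio - 1.0) / 0.5))
    ok = max(0.0, 1.0 - behind - ahead)
    # weighted-average defuzzification of the three rule outputs
    budget = (behind * 1 + ok * (max_hyps // 2) + ahead * max_hyps) \
             / (behind + ok + ahead)
    return max(1, round(budget))

print(allocate_hypotheses(40.0, 100.0))   # far behind real time -> 1
print(allocate_hypotheses(100.0, 100.0))  # on pace -> mid budget
print(allocate_hypotheses(200.0, 100.0))  # ahead -> full multi-hypothesis
```

    Between the crisp regimes the budget degrades gracefully, which is exactly why a fuzzy controller suits this trade of algorithm complexity against real-time throughput.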

  16. ADAPTIVE LIFTING BASED IMAGE COMPRESSION SCHEME WITH PARTICLE SWARM OPTIMIZATION TECHNIQUE

    Nishat Kanvel; S. Letitia; Elwin Chandra Monie

    2010-01-01

    This paper presents an adaptive lifting scheme with a Particle Swarm Optimization technique for image compression. Particle Swarm Optimization is used to improve the accuracy of the prediction function used in the lifting scheme. The scheme is applied to image compression, and parameters such as PSNR, compression ratio and the visual quality of the image are calculated. The proposed scheme is compared with existing methods.
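    The core idea can be sketched as follows: PSO searches for predictor coefficients that minimize the lifting prediction error. This is a generic PSO on a toy two-tap predictor and a synthetic signal, not the authors' exact variant or filter structure.

```python
import random

# Minimal particle swarm optimizer (inertia + cognitive + social terms).
def pso_minimize(cost, dim, n_particles=20, iters=100, seed=0):
    rng = random.Random(seed)
    pos = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_cost = [cost(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_cost[i])
    gbest, gbest_cost = pbest[g][:], pbest_cost[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.4 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.4 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            c = cost(pos[i])
            if c < pbest_cost[i]:
                pbest[i], pbest_cost[i] = pos[i][:], c
                if c < gbest_cost:
                    gbest, gbest_cost = pos[i][:], c
    return gbest, gbest_cost

# Toy use: fit a 2-tap lifting predictor  x[2k+1] ~ w0*x[2k] + w1*x[2k+2]
signal = [float(i % 7) for i in range(64)]
def prediction_error(w):
    return sum((signal[2*k+1] - w[0]*signal[2*k] - w[1]*signal[2*k+2]) ** 2
               for k in range((len(signal) - 2) // 2))

w, err = pso_minimize(prediction_error, dim=2)
```

    In an adaptive lifting codec, the optimized predictor shrinks the detail coefficients, which is what ultimately improves PSNR at a given compression ratio.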

  17. Adaptive Robotic Systems Design in University of Applied Sciences

    Gunsing Jos

    2016-01-01

    Full Text Available In the industry for highly specialized machine building (small series with high variety and high complexity) and in healthcare, a demand for adaptive robotics is rapidly emerging. Technically skilled people are not always available in sufficient numbers. A lot of know-how with respect to the required technologies is available, but successful adaptive robotic system designs are still rare. In our research at the university of applied sciences we incorporate newly available technologies in our education courses by way of research projects; in these projects, students investigate the application possibilities of new technologies together with companies and teachers. Thus we are able to transfer knowledge to the students, including an innovation-oriented attitude and skills. In recent years we developed several industrial bin-picking applications for logistics and machining factories with different types of 3D vision. Force-feedback gripping has also been developed, including slip sensing. Especially for healthcare robotics we developed a so-called twisted-wire actuator, which is very compact, in combination with an underactuated gripper manufactured in one piece in polyurethane. We work on modeling and testing the functions of these designs, but also on complete demonstrator systems. Since the number of disciplines involved in complex product and machine design increases rapidly, we pay a lot of attention to systems engineering methods. Apart from the classical engineering disciplines like mechanical, electrical, software and mechatronics engineering, for adaptive robotics more and more disciplines like industrial product design, communication … multimedia design and of course physics and even art are to be involved, depending on the specific application to be designed.
Design tools like V-model, agile/scrum and design-approaches to obtain the best set of requirements are being implemented in the engineering studies from

  18. Adapted G-mode Clustering Method applied to Asteroid Taxonomy

    Hasselmann, Pedro H.; Carvano, Jorge M.; Lazzaro, D.

    2013-11-01

    The original G-mode was a clustering method developed by A. I. Gavrishin in the late 60's for geochemical classification of rocks, but it was also applied to asteroid photometry, cosmic rays, lunar samples and planetary science spectroscopy data. In this work, we used an adapted version to classify the asteroid photometry from the SDSS Moving Objects Catalog. The method works by identifying normal distributions in a multidimensional space of variables. The identification starts by locating a set of points with the smallest mutual distance in the sample, which is a problem when the data are not planar. Here we present a modified version of the G-mode algorithm, previously written in FORTRAN 77, reimplemented in Python 2.7 using the NumPy, SciPy and Matplotlib packages. NumPy was used for array and matrix manipulation and Matplotlib for plot control. SciPy played an important role in speeding up G-mode: scipy.spatial.distance.mahalanobis was chosen as the distance estimator and numpy.histogramdd was applied to find the initial seeds from which clusters evolve. SciPy was also used to quickly produce dendrograms showing the distances among clusters. Finally, results for asteroid taxonomy and tests for different sample sizes and implementations are presented.
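    The membership decision at the heart of a G-mode-style step can be sketched with the same Mahalanobis distance the abstract mentions (computed here directly with NumPy; the 3-sigma cut and the synthetic data are illustrative, not the algorithm's actual critical value).

```python
import numpy as np

# G-mode-style membership test: measure the Mahalanobis distance of sample
# points from a candidate cluster modeled as a normal distribution in the
# space of (colour-like) variables.
def mahalanobis_dist(x, mean, cov_inv):
    d = x - mean
    return float(np.sqrt(d @ cov_inv @ d))

rng = np.random.default_rng(0)
cluster = rng.normal([0.0, 0.0], 0.1, size=(200, 2))   # tight synthetic cluster
mean = cluster.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(cluster, rowvar=False))

member = np.array([0.05, -0.02])    # point consistent with the cluster
outlier = np.array([1.0, 1.0])      # point far outside it
# Points within ~3 sigma would be absorbed into the cluster; others rejected.
```

    Iterating this test (absorb points below the cut, re-estimate mean and covariance, repeat) grows each cluster from its initial seed.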

  19. Comparison of optimization techniques applied to nuclear fuel reload design

    In this work a comparison of three optimization techniques is presented, applied to the design of fuel reloads for boiling water reactors. In short, the techniques were applied to the design of a reload for an 18-month equilibrium cycle of the Laguna Verde nuclear power plant. The techniques used were Genetic Algorithms, Tabu Search and Neural Networks, all applied under the same conditions. A comparison of the quality of the results and the computational resources required to obtain them indicates that Tabu Search achieves the best results but at a very large computational cost, while the neural network obtains acceptable results at low computational cost. In addition to this comparison, this work presents a summary of the work carried out on fuel reload optimization from the 1960s to the present. (Author)

  20. ARTIFICIAL INTELLIGENCE PLANNING TECHNIQUES FOR ADAPTIVE VIRTUAL COURSE CONSTRUCTION

    NÉSTOR DARÍO DUQUE; DEMETRIO ARTURO OVALLE

    2011-01-01

    This paper aims at presenting a planning model for adapting the behavior of virtual courses based on artificial intelligence techniques, in particular using not only a multi-agent system approach, but also artificial intelligence planning methods. The design and implementation of the system by means of a pedagogical multi-agent approach and the definition of a framework to specify the adaptation strategy allow us to incorporate several pedagogical and technological approaches that are in acco...

  1. Radar Range Sidelobe Reduction Using Adaptive Pulse Compression Technique

    Li, Lihua; Coon, Michael; McLinden, Matthew

    2013-01-01

    Pulse compression has been widely used in radars so that low-power, long RF pulses can be transmitted, rather than a high-power short pulse. Pulse compression radars offer a number of advantages over high-power short-pulse radars, such as no need for high-power RF circuitry, no need for high-voltage electronics, compact size and light weight, better range resolution, and better reliability. However, the range sidelobes associated with pulse compression have prevented the use of this technique on spaceborne radars, since surface returns detected by range sidelobes may mask the returns from a nearby weak cloud or precipitation particles. Research on adaptive pulse compression was carried out utilizing a field-programmable gate array (FPGA) waveform generation board and a radar transceiver simulator. The results have shown significant improvements in pulse compression sidelobe performance. Microwave and millimeter-wave radars present many technological challenges for Earth and planetary science applications. Traditional tube-based radars use high-voltage power supplies/modulators and high-power RF transmitters; therefore, these radars usually have large size, heavy weight, and reliability issues for space and airborne platforms. Pulse compression technology has provided a path toward meeting many of these radar challenges. Recent advances in digital waveform generation, digital receivers, and solid-state power amplifiers have opened a new era for applying pulse compression to the development of compact and high-performance airborne and spaceborne remote sensing radars. The primary objective of this innovative effort is to develop and test a new pulse compression technique to achieve ultra-low range sidelobes so that this technique can be applied to spaceborne, airborne, and ground-based remote sensing radars to meet future science requirements. By using digital waveform generation, digital receiver, and solid-state power amplifier technologies, this improved pulse compression
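    The basic (non-adaptive) pulse-compression step can be sketched as a matched filter: correlating the received signal with the transmitted chirp collapses the long pulse into a narrow peak at the target delay, while the residual sidelobes around that peak are exactly what the adaptive technique above works to suppress. All waveform parameters here are arbitrary.

```python
import numpy as np

# Matched-filter pulse compression of a linear-FM (chirp) waveform.
n = 128
t = np.arange(n)
chirp = np.exp(1j * np.pi * (t / n) * t)       # long, low-power LFM pulse

delay = 300
rx = np.zeros(1024, dtype=complex)
rx[delay:delay + n] = chirp                    # echo from a single point target

# np.correlate conjugates its second argument, so this is the matched filter;
# the n-sample pulse compresses to a peak of height n at the target delay.
compressed = np.abs(np.correlate(rx, chirp, mode="valid"))
peak = int(np.argmax(compressed))
```

    Adaptive pulse compression replaces this fixed matched filter with a filter re-estimated from the data to drive the sidelobes far below the matched-filter level.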

  2. An adaptive envelope spectrum technique for bearing fault detection

    In this work, an adaptive envelope spectrum (AES) technique is proposed for bearing fault detection, especially for analyzing signals with transient events. The proposed AES technique first decomposes the signal using empirical mode decomposition to formulate the representative intrinsic mode functions (IMFs), and then a novel IMF reconstruction method is proposed based on a correlation analysis of the envelope spectra. The reconstructed signal is post-processed by using an adaptive filter to enhance impulsive signatures, where the filter length is optimized by the proposed sparsity analysis technique. Bearing health conditions are diagnosed by examining bearing characteristic frequency information in the envelope power spectrum. The effectiveness of the proposed fault detection technique is verified by a series of experimental tests corresponding to different bearing conditions. (paper)
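    The envelope-spectrum core of such a diagnosis can be sketched without the EMD/adaptive-filter stages: an amplitude-modulated carrier mimics periodic bearing impacts, and the FFT of the signal envelope (obtained here via an FFT-based Hilbert transform, not the paper's pipeline) reveals the fault frequency. Frequencies and depths are made up.

```python
import numpy as np

# Synthetic "bearing" signal: a 400 Hz resonance amplitude-modulated at the
# 30 Hz fault frequency.
fs = 2000.0
t = np.arange(0, 1.0, 1 / fs)
fault_hz, carrier_hz = 30.0, 400.0
x = (1 + 0.8 * np.cos(2 * np.pi * fault_hz * t)) * np.cos(2 * np.pi * carrier_hz * t)

# Analytic signal via an FFT-based Hilbert transform (numpy-only).
X = np.fft.fft(x)
h = np.zeros(len(x))
h[0] = 1.0
h[1:len(x) // 2] = 2.0
h[len(x) // 2] = 1.0
envelope = np.abs(np.fft.ifft(X * h))

# The spectrum of the (mean-removed) envelope peaks at the fault frequency.
spec = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(len(envelope), 1 / fs)
peak_hz = float(freqs[np.argmax(spec)])
```

    In practice the peak would be compared against the bearing characteristic frequencies (BPFO, BPFI, BSF, FTF) to identify which component is damaged.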

  3. OFFLINE HANDWRITTEN SIGNATURE IDENTIFICATION USING ADAPTIVE WINDOW POSITIONING TECHNIQUES

    Ghazali Sulong

    2014-10-01

    Full Text Available To address the challenge of offline handwritten signature identification, we have proposed the use of an Adaptive Window Positioning technique which focuses not just on the meaning of the handwritten signature but also on the individuality of the writer. This innovative technique divides the handwritten signature into 13 small windows of size nxn (13x13). This size should be large enough to contain ample information about the style of the author and small enough to ensure a good identification performance. The process was tested with a GPDS dataset containing 4870 signature samples from 90 different writers by comparing the robust features of the test signature with those of the user's signature using an appropriate classifier. Experimental results reveal that the adaptive window positioning technique is an efficient and reliable method for accurate signature feature extraction in the identification of offline handwritten signatures. This technique can also be used to detect signatures signed under emotional duress.
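    The windowing step can be sketched as follows: a binarized signature image is split into fixed-size windows whose ink densities form a per-writer feature vector. The 13x13 window size follows the abstract; the fixed grid placement and the density feature are simplifications of the adaptive positioning actually proposed.

```python
import numpy as np

# Split a binarized signature image into win x win windows and keep the
# per-window ink density as a simple feature vector.
def window_features(img, win=13):
    h, w = img.shape
    feats = []
    for r in range(0, h - win + 1, win):
        for c in range(0, w - win + 1, win):
            feats.append(img[r:r + win, c:c + win].mean())
    return np.array(feats)

sig = np.zeros((39, 39))
sig[13:26, 13:26] = 1.0            # toy "signature": a central ink blob
f = window_features(sig)           # 3x3 grid of ink densities
```

    A classifier would then compare such feature vectors between a questioned signature and the enrolled reference signatures of each writer.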

  4. Adapted Cuing Technique for Use in Treatment of Dyspraxia.

    Klick, Susan L.

    1985-01-01

    The Adapted Cuing Technique (ACT) was created to accompany oral stimulus presentation in treatment of dyspraxia. ACT is consistent with current treatment theory, emphasizing patterns of articulatory movement, manner of production, and multimodality facilitation. A case study describes the use of ACT in the treatment of a five-year-old child.…

  5. Adaptive Landmark-Based Navigation System Using Learning Techniques

    Zeidan, Bassel; Dasgupta, Sakyasingha; Wörgötter, Florentin;

    2014-01-01

    . Inspired by this, we develop an adaptive landmark-based navigation system based on sequential reinforcement learning. In addition, correlation-based learning is also integrated into the system to improve learning performance. The proposed system has been applied to simulated simple wheeled and more complex...

  6. Parameter Identification and Adaptive Control Applied to the Inverted Pendulum

    Carlos A. Saldarriaga-Cortés; Víctor D. Correa-Ramírez; Didier Giraldo-Buitrago

    2012-01-01

    This paper presents a methodology to implement adaptive control of the inverted pendulum system, which uses the recursive least squares method for the identification of a discrete-time dynamic model of the plant and then, with its estimated parameters, tunes a pole-placement controller in real time. The plant to be used is an unstable and nonlinear system. This fact, combined with the adaptive controller characteristics, allows the obtained results to be extended to a great variety of systems. The ...
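    The identification stage ("recursive square minimum" is more commonly rendered as recursive least squares, RLS) can be sketched on a toy first-order plant; the plant parameters and forgetting-factor choice below are illustrative, not the paper's.

```python
import numpy as np

# One RLS update: theta is the parameter estimate, P the covariance,
# phi the regressor, y the measured output, lam the forgetting factor.
def rls_step(theta, P, phi, y, lam=1.0):
    K = P @ phi / (lam + phi @ P @ phi)          # gain vector
    theta = theta + K * (y - phi @ theta)        # correct by prediction error
    P = (P - np.outer(K, phi @ P)) / lam         # covariance update
    return theta, P

# Identify y[k] = a*y[k-1] + b*u[k-1] from input/output data (noiseless toy).
a_true, b_true = 0.8, 0.5
theta = np.zeros(2)
P = np.eye(2) * 100.0
y_prev = 0.0
rng = np.random.default_rng(0)
for _ in range(200):
    u = rng.uniform(-1, 1)
    y = a_true * y_prev + b_true * u
    theta, P = rls_step(theta, P, np.array([y_prev, u]), y)
    y_prev = y
```

    In the adaptive-control loop, the freshly estimated (a, b) would be fed to the pole-placement design at every sample to retune the controller.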

  7. Object oriented programming techniques applied to device access and control

    In this paper a model, called the device server model, has been presented for solving the problem of device access and control faced by all control systems. Object Oriented Programming techniques were used to achieve a powerful yet flexible solution. The model provides a solution to the problem which hides device dependencies. It defines a software framework which has to be respected by implementors of device classes - this is very useful for developing groupware. The decision to implement remote access in the root class means that device servers can be easily integrated in a distributed control system. Many of the advantages and features of the device server model are due to the adoption of OOP techniques. The main conclusions that can be drawn from this paper are that (1) the device access and control problem is well suited to being solved with OOP techniques, and (2) OOP techniques offer a distinct advantage over traditional programming techniques for solving the device access problem. (J.P.N.)
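    The pattern can be sketched as an abstract root class that fixes the framework (and, in the paper's model, would also host the remote access), with concrete device classes hiding their device dependencies behind a uniform command interface. Class and command names here are hypothetical.

```python
from abc import ABC, abstractmethod

class Device(ABC):
    """Root class of the device-server framework every device class respects."""
    def __init__(self, name):
        self.name = name

    def command(self, cmd, *args):
        # Uniform entry point: dispatch to a do_<cmd> handler if the
        # concrete device class provides one.
        handler = getattr(self, "do_" + cmd, None)
        if handler is None:
            raise ValueError(f"{self.name}: unknown command {cmd!r}")
        return handler(*args)

    @abstractmethod
    def do_state(self):
        ...

class PowerSupply(Device):
    """Concrete device class: hides the hardware behind do_* handlers."""
    def __init__(self, name):
        super().__init__(name)
        self._current = 0.0

    def do_state(self):
        return "ON" if self._current > 0 else "OFF"

    def do_set_current(self, amps):
        self._current = amps
        return self._current

ps = PowerSupply("ps-1")
ps.command("set_current", 3.5)
```

    Clients only ever speak the `command` protocol, so swapping hardware means writing a new subclass, not touching the callers.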

  8. Sensor Data Qualification Technique Applied to Gas Turbine Engines

    Csank, Jeffrey T.; Simon, Donald L.

    2013-01-01

    This paper applies a previously developed sensor data qualification technique to a commercial aircraft engine simulation known as the Commercial Modular Aero-Propulsion System Simulation 40,000 (C-MAPSS40k). The sensor data qualification technique is designed to detect, isolate, and accommodate faulty sensor measurements. It features sensor networks, which group various sensors together and relies on an empirically derived analytical model to relate the sensor measurements. Relationships between all member sensors of the network are analyzed to detect and isolate any faulty sensor within the network.

  9. Adaptive feedback linearization applied to steering of ships

    Thor I. Fossen

    1993-10-01

    Full Text Available This paper describes the application of feedback linearization to automatic steering of ships. The flexibility of the design procedure allows the autopilot to be optimized for both course-keeping and course-changing manoeuvres. Direct adaptive versions of both the course-keeping and turning controller are derived. The advantages of the adaptive controllers are improved performance and reduced fuel consumption. The application of nonlinear control theory also allows the designer to compensate for nonlinearities in the control design in a systematic manner.

  10. Technique applied in electrical power distribution for Satellite Launch Vehicle

    João Maurício Rosário

    2010-09-01

    Full Text Available The Satellite Launch Vehicle electrical network, which is currently being developed in Brazil, is sub-divided for analysis into the following parts: Service Electrical Network, Controlling Electrical Network, Safety Electrical Network and Telemetry Electrical Network. During the pre-launching and launching phases, these electrical networks are associated electrically and mechanically with the structure of the vehicle. In order to succeed in the integration of these electrical networks it is necessary to employ electrical power distribution techniques appropriate to Launch Vehicle systems. This work presents the most important techniques to be considered in the characterization of the electrical power supply applied to Launch Vehicle systems. Such techniques are primarily designed to allow the electrical networks, when subjected to a single-phase fault to ground, to be capable of maintaining the power supply to the loads.

  11. Strategies in edge plasma simulation using adaptive dynamic nodalization techniques

    A wide span of steady-state and transient edge plasma process simulation problems require accurate discretization techniques and can then be treated with Finite Element (FE) and Finite Volume (FV) methods. The software used here to meet these meshing requirements is a 2D finite element grid generator. It makes it possible to produce adaptive unstructured grids that take into consideration the flux surface characteristics. To comply with the common mesh handling features of FE/FV packages, some options have been added to the basic generation tool. These enhancements include quadrilateral meshes without non-regular transition elements, obtained by substituting them with transition constructions consisting of regular quadrilateral elements. Furthermore, triangular grids can be created with one edge parallel to the magnetic field and modified by the basic adaptation/realignment techniques. Enhanced code operation properties and processing capabilities are expected. (author)

  12. An Adaptive Hybrid Multiprocessor technique for bioinformatics sequence alignment

    Bonny, Talal

    2012-07-28

    Sequence alignment algorithms such as the Smith-Waterman algorithm are among the most important applications in the development of bioinformatics. Sequence alignment algorithms must process large amounts of data, which may take a long time. Here, we introduce our Adaptive Hybrid Multiprocessor technique to accelerate the implementation of the Smith-Waterman algorithm. Our technique utilizes both the graphics processing unit (GPU) and the central processing unit (CPU). It adapts the implementation to the number of CPUs given as input by efficiently distributing the workload between the processing units. Using existing resources (GPU and CPU) in an efficient way is a novel approach. The peak performance achieved for the platforms GPU + CPU, GPU + 2CPUs, and GPU + 3CPUs is 10.4 GCUPS, 13.7 GCUPS, and 18.6 GCUPS, respectively (with a query length of 511 amino acids). © 2010 IEEE.
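    For reference, the Smith-Waterman local-alignment recurrence being accelerated can be written in a few lines. This plain-CPU sketch computes only the optimal local score (the paper's contribution is distributing many such alignments across the GPU and CPUs, not the recurrence itself); the scoring parameters are typical defaults, not the paper's.

```python
# Smith-Waterman local alignment score with linear gap penalty:
#   H[i][j] = max(0, diag + s(a_i, b_j), H[i-1][j] + gap, H[i][j-1] + gap)
def smith_waterman(a, b, match=2, mismatch=-1, gap=-1):
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
            best = max(best, H[i][j])
    return best
```

    Because each database sequence is aligned independently against the query, the workload splits naturally into batches that can be handed to the GPU and to each CPU in proportion to their throughput.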

  13. Adaptive spectral identification techniques in presence of undetected non linearities

    Cella, G; Guidi, G M

    2002-01-01

    The standard procedure for the detection of gravitational wave signals from coalescing binaries is based on Wiener filtering with an appropriate bank of template filters. This is the optimal procedure under the hypothesis of additive, stationary Gaussian noise. We study the possibility of improving the detection efficiency with a class of adaptive spectral identification techniques, analyzing their effect in the presence of non-stationarities and undetected non-linearities in the noise

  14. Load Cell Response Correction Using Analog Adaptive Techniques

    Jafaripanah, Mehdi; Al-Hashimi, Bashir; White, Neil M.

    2003-01-01

    Load cell response correction can be used to speed up the process of measurement. This paper investigates the application of analog adaptive techniques in load cell response correction. The load cell is a sensor with an oscillatory output in which the measurand contributes to response parameters. Thus, a compensation filter needs to track variation in measurand whereas a simple, fixed filter is only valid at one load value. To facilitate this investigation, computer models for the load cell a...

  15. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all of the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
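    The contrast between a deterministic and a probabilistic analysis can be sketched with plain Monte Carlo sampling on a toy power-capability model. The model, parameter values and distributions below are entirely invented; the actual SPACE model and Glenn's fast probabilistic methods are far more sophisticated than brute-force sampling.

```python
import random

# Toy power-capability model: margin = generation - demand (watts).
def power_margin(solar_eff, degradation, demand_w):
    return 100_000 * solar_eff * (1 - degradation) - demand_w

# Propagate input uncertainties: sample each uncertain input from an
# assumed distribution instead of using a single deterministic value.
rng = random.Random(42)
samples = [
    power_margin(rng.gauss(0.29, 0.01),      # array efficiency
                 rng.gauss(0.05, 0.005),     # degradation fraction
                 rng.gauss(20_000, 1_000))   # housekeeping demand
    for _ in range(10_000)
]
mean_margin = sum(samples) / len(samples)
p_shortfall = sum(s < 0 for s in samples) / len(samples)
```

    A deterministic run would report only the single nominal margin; the probabilistic run additionally yields its spread and the probability of a power shortfall.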

  16. Highly charged ion beam applied to lithography technique.

    Momota, Sadao; Nojiri, Yoichi; Taniguchi, Jun; Miyamoto, Iwao; Morita, Noboru; Kawasegi, Noritaka

    2008-02-01

    In various fields of nanotechnology, the importance of nanoscale three-dimensional (3D) structures is increasing. In order to develop an efficient process to fabricate nanoscale 3D structures, we have applied highly charged ion (HCI) beams to the ion-beam lithography (IBL) technique. Ar-ion beams with various charge states (1+ to 9+) were applied to fabricate spin on glass (SOG) and Si by means of the IBL technique. The Ar ions were prepared by a facility built at Kochi University of Technology, which includes an electron cyclotron resonance ion source (NANOGAN, 10 GHz). IBL fabrication was performed as a function of not only the charge state but also the energy and the dose of Ar ions. The present results show that the application of an Ar(9+) beam reduces the etching time for SOG and enhances the etching depth compared with those observed with Ar ions in lower charge states. Considering the high-energy deposition of HCI at a surface, the former phenomena can be understood consistently. Also, the latter phenomena can be understood based on anomalously deep structural changes, which are remarkable for glasses. Furthermore, it has also been shown that the etching depth can be easily controlled with the kinetic energy of the Ar ions. These results show the possibilities of the IBL technique with HCI beams in the field of nanoscale 3D fabrication. PMID:18315242

  17. Development and verification of unstructured adaptive mesh technique with edge compatibility

    In the design study of the large-sized sodium-cooled fast reactor (JSFR), one key issue is suppression of gas entrainment (GE) phenomena at a gas-liquid interface. Therefore, the authors have developed a high-precision CFD algorithm to evaluate the GE phenomena accurately. The CFD algorithm has been developed on unstructured meshes to establish accurate modeling of the JSFR system. For two-phase interfacial flow simulations, a high-precision volume-of-fluid algorithm is employed. It was confirmed that the developed CFD algorithm could reproduce the GE phenomena in a simple GE experiment. Recently, the authors have developed an important technique for the simulation of the GE phenomena in JSFR, namely an unstructured adaptive mesh technique which can apply fine cells dynamically to the region where the GE occurs in JSFR. In this paper, as a part of that development, a two-dimensional unstructured adaptive mesh technique is discussed. In the two-dimensional adaptive mesh technique, each cell is refined isotropically to reduce distortions of the mesh. In addition, connection cells are formed to eliminate the edge incompatibility between refined and non-refined cells. The two-dimensional unstructured adaptive mesh technique is verified by solving the well-known lid-driven cavity flow problem. As a result, the two-dimensional unstructured adaptive mesh technique succeeds in providing a high-precision solution, even though a poor-quality distorted initial mesh is employed. In addition, the simulation error on the two-dimensional unstructured adaptive mesh is much less than the error on a structured mesh with a larger number of cells. (author)

  18. Inexpensive rf modeling and analysis techniques as applied to cyclotrons

    A review and expansion of the circuit analogy method of modeling and analysing multiconductor TEM mode rf resonators is described. This method was used to predict the performance of the NSCL K500 and K1200 cyclotron resonators and the results compared well to the measured performance. The method is currently being applied as the initial stage of the design process to optimize the performance of the rf resonators for a proposed K250 cyclotron for medical applications. Although this technique requires an experienced rf modeller, the input files tend to be simple and small, the software is very inexpensive or free, and the computer runtimes are nearly instantaneous

  19. Three-dimensional integrated CAE system applying computer graphic technique

    A three-dimensional CAE system for nuclear power plant design is presented. This system utilizes high-speed computer graphic techniques for the plant design review, and an integrated engineering database for handling the large amount of nuclear power plant engineering data in a unified data format. Applying this system makes it possible to construct a nuclear power plant using only computer data from the basic design phase to the manufacturing phase, and it increases the productivity and reliability of the nuclear power plants. (author)

  20. An adaptive technique for a redundant-sensor navigation system.

    Chien, T.-T.

    1972-01-01

    An on-line adaptive technique is developed to provide a self-contained redundant-sensor navigation system with the capability to utilize its full potential in reliability and performance. This adaptive system is structured as a multistage stochastic process of detection, identification, and compensation. It is shown that the detection system can be effectively constructed on the basis of a design value, specified by mission requirements, of the unknown parameter in the actual system, and of a degradation mode in the form of a constant bias jump. A suboptimal detection system based on Wald's sequential analysis is developed using the concepts of information value and information feedback. The developed system is easily implemented, and demonstrates a performance remarkably close to that of the optimal nonlinear detection system. An invariant transformation is derived to eliminate the effect of nuisance parameters such that the ambiguous identification system can be reduced to a set of disjoint simple hypothesis tests. By applying a technique of decoupled bias estimation in the compensation system, the adaptive system can be operated without any complicated reorganization.
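    Wald's sequential analysis for the constant-bias-jump degradation mode can be sketched as a sequential probability ratio test (SPRT) on sensor residuals (H0: zero-mean vs. H1: mean mu, known sigma). The thresholds follow Wald's standard approximations; mu, sigma and the error rates are illustrative design values, not the paper's.

```python
import math

# Sequential probability ratio test for a constant bias jump of size mu in
# Gaussian residuals with standard deviation sigma.
def sprt(samples, mu=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    A = math.log((1 - beta) / alpha)    # declare "degraded" when LLR >= A
    B = math.log(beta / (1 - alpha))    # declare "healthy"  when LLR <= B
    llr = 0.0
    for n, x in enumerate(samples, 1):
        # Gaussian log-likelihood-ratio increment for H1 (mean mu) vs H0 (mean 0)
        llr += (mu / sigma ** 2) * (x - mu / 2)
        if llr >= A:
            return "degraded", n
        if llr <= B:
            return "healthy", n
    return "undecided", len(samples)

# Residuals from a sensor whose bias has jumped to ~mu:
residuals = [0.9, 1.1, 1.0, 0.8, 1.2, 1.0, 1.1, 0.9, 1.0, 1.0, 1.0, 1.0]
verdict, n_used = sprt(residuals)
```

    The appeal of the sequential test is exactly what the abstract exploits: it decides as soon as the evidence suffices, using far fewer samples on average than a fixed-sample test at the same error rates.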

  1. Self-Adaptive Stepsize Search Applied to Optimal Structural Design

    Nolle, L.; Bland, J. A.

    Structural engineering often involves the design of space frames that are required to resist predefined external forces without exhibiting plastic deformation. The weight of the structure and hence the weight of its constituent members has to be as low as possible for economical reasons without violating any of the load constraints. Design spaces are usually vast and the computational costs for analyzing a single design are usually high. Therefore, not every possible design can be evaluated for real-world problems. In this work, a standard structural design problem, the 25-bar problem, has been solved using self-adaptive stepsize search (SASS), a relatively new search heuristic. This algorithm has only one control parameter and therefore overcomes the drawback of modern search heuristics, i.e. the need to first find a set of optimum control parameter settings for the problem at hand. In this work, SASS outperforms simulated annealing, genetic algorithms, tabu search and ant colony optimization.
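    The single-control-parameter idea can be sketched as a hill climber whose step size adapts itself: shrink after a failed move, widen after a successful one. This is a generic caricature minimizing a toy convex surrogate of member weight, not the published SASS algorithm or the 25-bar objective.

```python
import random

# Self-adaptive stepsize search sketch: the only control parameter is the
# shrink factor applied to the step size after an unsuccessful move.
def sass_minimize(cost, x0, shrink=0.9, iters=3000, seed=0):
    rng = random.Random(seed)
    x, fx, step = list(x0), cost(x0), 1.0
    for _ in range(iters):
        cand = [xi + step * rng.uniform(-1, 1) for xi in x]
        fc = cost(cand)
        if fc < fx:
            x, fx = cand, fc
            step /= shrink        # success: widen the search
        else:
            step *= shrink        # failure: narrow the search
    return x, fx

# Toy "structure": a convex stand-in for total member weight.
best, fbest = sass_minimize(lambda v: (v[0] - 3) ** 2 + (v[1] + 1) ** 2,
                            [0.0, 0.0])
```

    In the structural setting, `cost` would run the frame analysis and return the total weight, with load-constraint violations penalized.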

  2. Experiments on Adaptive Techniques for Host-Based Intrusion Detection

    This research explores four experiments of adaptive host-based intrusion detection (ID) techniques in an attempt to develop systems that can detect novel exploits. The technique considered to have the most potential is adaptive critic designs (ACDs) because of their utilization of reinforcement learning, which allows learning exploits that are difficult to pinpoint in sensor data. Preliminary results of ID using an ACD, an Elman recurrent neural network, and a statistical anomaly detection technique demonstrate an ability to learn to distinguish between clean and exploit data. We used the Solaris Basic Security Module (BSM) as a data source and performed considerable preprocessing on the raw data. A detection approach called generalized signature-based ID is recommended as a middle ground between signature-based ID, which has an inability to detect novel exploits, and anomaly detection, which detects too many events including events that are not exploits. The primary results of the ID experiments demonstrate the use of custom data for generalized signature-based intrusion detection and the ability of neural network-based systems to learn in this application environment

  3. Statistical mapping techniques applied to radio-immunoscintigraphy

    Full text: Serial image analysis techniques such as kinetic analysis with probability mapping have been successfully applied to radio-immunoscintigraphic localization of occult tumours. Lesion detection by these statistical methods is predicated on a decrease in vascular activity over time in comparison with an incremental increase in tumour uptake of radiolabelled antibody on serial images. We have refined the kinetic analysis technique by introduction of weighted error determination and correlation with regional masking for application to serial SPET images as well as planar studies. Six patients undergoing radioimmunoscintigraphy for localization of radiographically occult recurrence or metastases of colon cancer were imaged within 30 min and at 3, 6 and 24 h following intravenous administration of 99Tcm-anti-CEA antibody (CEA Scan Immunomedics). Statistical mapping comprising setting of correlation parameters, subtraction of correlated images and visualization and analysis of statistical maps was performed. We found that changing weights in least square correlation improved delineation of target from background activity. The introduction of regional masking to compensate for the changing pattern of activity in the kidneys and bladder also facilitated correlation of serial images. These statistical mapping techniques were applied to SPET images with CT co-registration for accurate anatomical localization of lesions. The probability of CEA-secreting tumour recurrence or metastasis was expressed as two levels of confidence set arbitrarily as 0.05 (1.96 S.D.) and 0.001 (3.291 S.D.) in respect of CT co-registered SPET 3 and 6 h images of thorax, abdomen and pelvis

  4. Hyperspectral-imaging-based techniques applied to wheat kernels characterization

    Serranti, Silvia; Cesare, Daniela; Bonifazi, Giuseppe

    2012-05-01

    Single kernels of durum wheat have been analyzed by hyperspectral imaging (HSI). Such an approach is based on the utilization of an integrated hardware and software architecture able to digitally capture and handle spectra as an image sequence, as they result along a pre-defined alignment on a properly energized surface sample. The study was aimed at investigating the possibility of applying HSI techniques for the classification of different types of wheat kernels: vitreous, yellow berry and fusarium-damaged. Reflectance spectra of selected wheat kernels of the three typologies have been acquired by a laboratory device equipped with an HSI system working in the near infrared field (1000-1700 nm). The hypercubes were analyzed applying principal component analysis (PCA) to reduce the high dimensionality of the data and to select some effective wavelengths. Partial least squares discriminant analysis (PLS-DA) was applied for classification of the three wheat typologies. The study demonstrated that good classification results were obtained not only considering the entire investigated wavelength range, but also selecting only four optimal wavelengths (1104, 1384, 1454 and 1650 nm) out of 121. The developed procedures based on HSI can be utilized for quality control purposes or for the definition of innovative sorting logics of wheat.
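    The PCA step on the unfolded hypercube can be sketched via SVD on mean-centred spectra (rows = kernels, columns = the 121 wavelengths). The random data below stand in for real reflectance spectra; the PLS-DA classification stage is not shown.

```python
import numpy as np

# PCA via SVD: returns the first n_components scores and loadings of the
# mean-centred data matrix X (rows = samples, columns = wavelengths).
def pca(X, n_components):
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T, Vt[:n_components]

rng = np.random.default_rng(0)
spectra = rng.normal(size=(50, 121))     # 50 kernels x 121 wavelengths (synthetic)
scores, loadings = pca(spectra, n_components=4)
```

    Inspecting the largest-magnitude entries of each loading vector is one common way to nominate "effective wavelengths" of the kind the study reduced to four.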

  5. Automation of assertion testing - Grid and adaptive techniques

    Andrews, D. M.

    1985-01-01

    Assertions can be used to automate the process of testing software. Two methods for automating the generation of input test data are described in this paper. One method selects the input values of variables at regular intervals in a 'grid'. The other, adaptive testing, uses assertion violations as a measure of errors detected and generates new test cases based on test results. The important features of assertion testing are that: it can be used throughout the entire testing cycle; it provides automatic notification of error conditions; and it can be used with automatic input generation techniques which eliminate the subjectivity in choosing test data.
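    The grid method can be sketched directly: sweep each input variable at regular intervals and record which input combinations violate an assertion embedded in the code under test. The driver and the toy function below are illustrative, not the paper's tooling.

```python
# Grid-based assertion testing: enumerate inputs on a regular grid over the
# given (low, high) ranges and collect the combinations that trigger an
# AssertionError in the function under test.
def grid_test(func, ranges, steps=5):
    violations = []

    def sweep(prefix, rest):
        if not rest:
            try:
                func(*prefix)
            except AssertionError:
                violations.append(tuple(prefix))
            return
        lo, hi = rest[0]
        for k in range(steps):
            sweep(prefix + [lo + k * (hi - lo) / (steps - 1)], rest[1:])

    sweep([], list(ranges))
    return violations

# Function under test with an embedded assertion:
def safe_divide(a, b):
    assert b != 0, "division by zero"
    return a / b

bad = grid_test(safe_divide, [(-1.0, 1.0), (-1.0, 1.0)])
```

    The adaptive variant described above would go further: instead of a fixed grid, it would generate new test cases concentrated around inputs like these that already produced violations.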

  6. Active Learning Techniques Applied to an Interdisciplinary Mineral Resources Course.

    Aird, H. M.

    2015-12-01

    An interdisciplinary active learning course was introduced at the University of Puget Sound entitled 'Mineral Resources and the Environment'. Various formative assessment and active learning techniques that have been effective in other courses were adapted and implemented to improve student learning, increase retention and broaden knowledge and understanding of course material. This was an elective course targeted towards upper-level undergraduate geology and environmental majors. The course provided an introduction to the mineral resources industry, discussing geological, environmental, societal and economic aspects, legislation and the processes involved in exploration, extraction, processing, reclamation/remediation and recycling of products. Lectures and associated weekly labs were linked in subject matter; relevant readings from the recent scientific literature were assigned and discussed in the second lecture of the week. Peer-based learning was facilitated through weekly reading assignments with peer-led discussions and through group research projects, in addition to in-class exercises such as debates. Writing and research skills were developed through student groups designing, carrying out and reporting on their own semester-long research projects around the lasting effects of the historical Ruston Smelter on the biology and water systems of Tacoma. The writing of their mini grant proposals and final project reports was carried out in stages to allow for feedback before the deadline. Speakers from industry were invited to share their specialist knowledge as guest lecturers, and students were encouraged to interact with them, with a view to employment opportunities. Formative assessment techniques included jigsaw exercises, gallery walks, placemat surveys, think pair share and take-home point summaries. 
Summative assessment included discussion leadership, exams, homework, group projects, in-class exercises, field trips, and pre-discussion reading exercises.

  7. Conceptualizing urban adaptation to climate change: Findings from an applied adaptation assessment framework

    Johnson, Katie; BREIL, MARGARETHA

    2012-01-01

    Urban areas have particular sensitivities to climate change, and therefore adaptation to a warming planet represents a challenging new issue for urban policy makers in both the developed and developing world. Further to climate mitigation strategies implemented in various cities over the past 20 years, more recent efforts of urban management have also included actions taken to adapt to increasing temperatures, sea level and extreme events. Through the examination and comparison of seven citie...

  8. Applying machine learning classification techniques to automate sky object cataloguing

    Fayyad, Usama M.; Doyle, Richard J.; Weir, W. Nick; Djorgovski, Stanislav

    1993-08-01

    We describe the application of artificial intelligence machine learning techniques to the development of an automated tool for the reduction of a large scientific data set. The 2nd Mt. Palomar Northern Sky Survey is nearly completed. This survey provides comprehensive coverage of the northern celestial hemisphere in the form of photographic plates. The plates are being transformed into digitized images whose quality will probably not be surpassed in the next ten to twenty years. The images are expected to contain on the order of 10^7 galaxies and 10^8 stars. Astronomers wish to determine which of these sky objects belong to various classes of galaxies and stars. Unfortunately, the size of this data set precludes analysis in an exclusively manual fashion. Our approach is to develop a software system which integrates the functions of independently developed techniques for image processing and data classification. Digitized sky images are passed through image processing routines to identify sky objects and to extract a set of features for each object. These routines are used to help select a useful set of attributes for classifying sky objects. Then GID3 (Generalized ID3) and O-B Tree, two inductive learning techniques, learn classification decision trees from examples. These classifiers will then be applied to new data. This development process is highly interactive, with astronomer input playing a vital role. Astronomers refine the feature set used to construct sky object descriptions, and evaluate the performance of the automated classification technique on new data. This paper gives an overview of the machine learning techniques with an emphasis on their general applicability, describes the details of our specific application, and reports the initial encouraging results. The results indicate that our machine learning approach is well-suited to the problem. The primary benefit of the approach is increased data reduction throughput. Another benefit is
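
The decision-tree induction underlying GID3 extends classic ID3. As a hedged illustration of the base technique only (not GID3 or O-B Tree themselves), the sketch below implements minimal ID3 with information gain on hypothetical categorical sky-object features; the feature names and labels are invented for the example.

```python
import math
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def best_attribute(rows, labels, attrs):
    """Pick the attribute with the largest information gain (ID3 criterion)."""
    base = entropy(labels)
    def gain(a):
        g = base
        for v in set(r[a] for r in rows):
            sub = [l for r, l in zip(rows, labels) if r[a] == v]
            g -= len(sub) / len(labels) * entropy(sub)
        return g
    return max(attrs, key=gain)

def id3(rows, labels, attrs):
    if len(set(labels)) == 1:
        return labels[0]
    if not attrs:
        return Counter(labels).most_common(1)[0][0]
    a = best_attribute(rows, labels, attrs)
    branches = {}
    for v in set(r[a] for r in rows):
        idx = [i for i, r in enumerate(rows) if r[a] == v]
        branches[v] = id3([rows[i] for i in idx], [labels[i] for i in idx],
                          [x for x in attrs if x != a])
    return (a, branches)

def classify(node, row):
    while isinstance(node, tuple):
        a, branches = node
        node = branches[row[a]]
    return node

# Hypothetical sky-object training examples: brightness and ellipticity
rows = [{"bright": "high", "ellip": "low"}, {"bright": "high", "ellip": "high"},
        {"bright": "low", "ellip": "low"}, {"bright": "low", "ellip": "high"}]
labels = ["star", "galaxy", "star", "galaxy"]
tree = id3(rows, labels, ["bright", "ellip"])
```

On this toy set, ellipticity has full information gain, so the learned tree splits on it alone.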

  9. Adaptive Communication Techniques for the Internet of Things

    Peng Du

    2013-03-01

    The vision for the Internet of Things (IoT) demands that material objects acquire communications and computation capabilities and become able to automatically identify themselves through standard protocols and open systems, using the Internet as their foundation. Yet, several challenges still must be addressed for this vision to become a reality. A core ingredient in such development is the ability of heterogeneous devices to communicate adaptively so as to make the best of limited spectrum availability and cope with competition, which is inevitable as more and more objects connect to the system. This survey provides an overview of current developments in this area, placing emphasis on wireless sensor networks that can provide IoT capabilities for material objects and on techniques that can be used in the context of systems employing low-power versions of the Internet Protocol (IP) stack. The survey introduces a conceptual model that facilitates the identification of opportunities for adaptation in each layer of the network stack. After a detailed discussion of specific approaches applicable to particular layers, we consider how sharing information across layers can facilitate further adaptation. We conclude with a discussion of future research directions.

  10. A New Local Adaptive Thresholding Technique in Binarization

    Singh, T Romen; Singh, O Imocha; Sinam, Tejmani; Singh, Kh Manglem

    2012-01-01

    Image binarization is the process of separating pixel values into two groups: white as background and black as foreground. Thresholding plays a major role in the binarization of images. Thresholding can be categorized into global thresholding and local thresholding. In images with a uniform contrast distribution of background and foreground, such as document images, global thresholding is more appropriate. In degraded document images, where considerable background noise or variation in contrast and illumination exists, many pixels cannot easily be classified as foreground or background; in such cases, binarization with local thresholding is more appropriate. This paper describes a locally adaptive thresholding technique that removes background by using the local mean and mean deviation. Normally the local mean computational time depends on the window size. Our technique uses an integral sum image as a prior processing step to calculate the local mean. It does not involve calculations of standard deviations as in other ...
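
The integral-image trick is what makes the local mean cheap: each window sum costs four lookups regardless of window size. The sketch below is a simplified local-mean binarizer in that spirit (it is not the paper's exact formula, which also uses mean deviation; the bias term `k` and demo image are assumptions).

```python
import numpy as np

def local_mean_binarize(img, w=3, k=0.0):
    """Binarize with a local-mean threshold over a (2w+1)x(2w+1) window.
    The integral (summed-area) image makes every window sum O(1), so the
    cost no longer grows with the window size."""
    img = img.astype(np.int64)
    h, wd = img.shape
    ii = np.zeros((h + 1, wd + 1), dtype=np.int64)
    ii[1:, 1:] = img.cumsum(axis=0).cumsum(axis=1)   # integral image
    out = np.zeros((h, wd), dtype=np.uint8)
    for y in range(h):
        y0, y1 = max(0, y - w), min(h, y + w + 1)
        for x in range(wd):
            x0, x1 = max(0, x - w), min(wd, x + w + 1)
            window_sum = ii[y1, x1] - ii[y0, x1] - ii[y1, x0] + ii[y0, x0]
            mean = window_sum / ((y1 - y0) * (x1 - x0))
            out[y, x] = 255 if img[y, x] > mean * (1 + k) else 0
    return out

# Demo: a flat background with one bright pixel; only that pixel survives
demo = np.full((9, 9), 100)
demo[4, 4] = 200
binary = local_mean_binarize(demo, w=2)
```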

  11. Image analysis technique applied to lock-exchange gravity currents

    Nogueira, Helena I. S.; Adduce, Claudia; Alves, Elsa; Franca, Mário J.

    2013-04-01

    An image analysis technique is used to estimate the two-dimensional instantaneous density field of unsteady gravity currents produced by full-depth lock-release of saline water. An experiment reproducing a gravity current was performed in a 3.0 m long, 0.20 m wide and 0.30 m deep Perspex flume with horizontal smooth bed and recorded with a 25 Hz CCD video camera under controlled light conditions. Using dye concentration as a tracer, a calibration procedure was established for each pixel in the image relating the amount of dye uniformly distributed in the tank and the greyscale values in the corresponding images. The results are evaluated and corrected by applying the mass conservation principle within the experimental tank. The procedure is a simple way to assess the time-varying density distribution within the gravity current, allowing the investigation of gravity current dynamics and mixing processes.
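
The per-pixel calibration idea can be sketched as a least-squares line per pixel mapping greyscale to dye concentration. The data below are synthetic stand-ins for the calibration frames, and the mass-conservation correction step described in the abstract is omitted; this is only the calibration core.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical calibration data: frames of the flume filled with known uniform
# dye concentrations; each pixel gets its own greyscale -> concentration line.
concentrations = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
h, w = 4, 5
gain = 1.0 + 0.1 * rng.standard_normal((h, w))        # per-pixel response spread
cal_images = 200.0 - 150.0 * concentrations[:, None, None] * gain

# Fit concentration = a*grey + b independently for every pixel
G = cal_images.reshape(len(concentrations), -1)        # (n_levels, n_pixels)
A = np.stack([G, np.ones_like(G)], axis=-1)            # (n_levels, n_pixels, 2)
coeffs = np.array([np.linalg.lstsq(A[:, p], concentrations, rcond=None)[0]
                   for p in range(h * w)]).reshape(h, w, 2)

def grey_to_concentration(image):
    """Map an instantaneous greyscale frame to a 2-D concentration field."""
    return coeffs[..., 0] * image + coeffs[..., 1]
```

Applying the map to one of the calibration frames recovers the known uniform concentration, which is a useful sanity check before processing experiment frames.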

  12. A Competency-Based Guided-Learning Algorithm Applied on Adaptively Guiding E-Learning

    Hsu, Wei-Chih; Li, Cheng-Hsiu

    2015-01-01

    This paper presents a new algorithm called competency-based guided-learning algorithm (CBGLA), which can be applied on adaptively guiding e-learning. Computational process analysis and mathematical derivation of competency-based learning (CBL) were used to develop the CBGLA. The proposed algorithm could generate an effective adaptively guiding…

  13. Three-dimensional region-based adaptive image processing techniques for volume visualization applications

    de Deus Lopes, Roseli; Zuffo, Marcelo K.; Rangayyan, Rangaraj M.

    1996-04-01

    Recent advances in three-dimensional (3D) imaging techniques have expanded the scope of applications of volume visualization to many areas such as medical imaging, scientific visualization, robotic vision, and virtual reality. Advanced image filtering, enhancement, and analysis techniques are being developed in parallel in the field of digital image processing. Although these fields have many aspects in common, it appears that many of the latest developments in image processing are not being applied to the fullest extent possible in visualization. It is common to encounter rather simple and elementary image pre-processing operations in visualization and 3D imaging applications. The purpose of this paper is to present an overview of selected topics from recent developments in adaptive image processing and to demonstrate or suggest their applications in volume visualization. The techniques include adaptive noise removal; improvement of contrast and visibility of objects; space-variant deblurring and restoration; segmentation-based lossless coding for data compression; and perception-based measures for analysis, enhancement, and rendering. The techniques share the common base of identification of adaptive regions by region growing, which lends them a perceptual basis related to the human visual system. Preliminary results obtained with some of the techniques implemented so far are used to illustrate the concepts involved, and to indicate the potential performance capabilities of the methods.
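
Region growing, the common base of these techniques, can be sketched minimally: start from a seed and absorb 4-neighbours whose intensity stays within a tolerance of the running region mean. The criterion and demo image are illustrative assumptions, not the paper's specific algorithm.

```python
import numpy as np
from collections import deque

def region_grow(img, seed, tol):
    """Grow a region from `seed`, adding 4-neighbours whose intensity lies
    within `tol` of the running region mean (adaptive region identification)."""
    h, w = img.shape
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    total, count = float(img[seed]), 1
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < h and 0 <= nx < w and not mask[ny, nx]:
                if abs(float(img[ny, nx]) - total / count) <= tol:
                    mask[ny, nx] = True
                    total += float(img[ny, nx])
                    count += 1
                    queue.append((ny, nx))
    return mask

# Demo on a synthetic two-region image: growth stops at the intensity edge
demo = np.zeros((5, 5))
demo[:, :3] = 100.0
region = region_grow(demo, (0, 0), tol=5.0)
```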

  14. Dust tracking techniques applied to the STARDUST facility: First results

    Malizia, A., E-mail: malizia@ing.uniroma2.it [Associazione EURATOM-ENEA, Department of Industrial Engineering, University of Rome “Tor Vergata”, Via del Politecnico 1, 00133 Rome (Italy); Camplani, M. [Grupo de Tratamiento de Imágenes, E.T.S.I de Telecomunicación, Universidad Politécnica de Madrid (Spain); Gelfusa, M. [Associazione EURATOM-ENEA, Department of Industrial Engineering, University of Rome “Tor Vergata”, Via del Politecnico 1, 00133 Rome (Italy); Lupelli, I. [Associazione EURATOM-ENEA, Department of Industrial Engineering, University of Rome “Tor Vergata”, Via del Politecnico 1, 00133 Rome (Italy); EURATOM/CCFE Association, Culham Science Centre, Abingdon (United Kingdom); Richetta, M.; Antonelli, L.; Conetta, F.; Scarpellini, D.; Carestia, M.; Peluso, E.; Bellecci, C. [Associazione EURATOM-ENEA, Department of Industrial Engineering, University of Rome “Tor Vergata”, Via del Politecnico 1, 00133 Rome (Italy); Salgado, L. [Grupo de Tratamiento de Imágenes, E.T.S.I de Telecomunicación, Universidad Politécnica de Madrid (Spain); Video Processing and Understanding Laboratory, Universidad Autónoma de Madrid (Spain); Gaudio, P. [Associazione EURATOM-ENEA, Department of Industrial Engineering, University of Rome “Tor Vergata”, Via del Politecnico 1, 00133 Rome (Italy)

    2014-10-15

    Highlights: •Use of an experimental facility, STARDUST, to analyze the dust resuspension problem inside the tokamak in case of loss of vacuum accident. •PIV technique implementation to track the dust during a LOVA reproduction inside STARDUST. •Data imaging techniques to analyze the dust velocity field: first results and data discussion. -- Abstract: An important issue related to future nuclear fusion reactors fueled with deuterium and tritium is the creation of large amounts of dust due to several mechanisms (disruptions, ELMs and VDEs). The dust size expected in nuclear fusion experiments (such as ITER) is in the order of microns (between 0.1 and 1000 μm). Almost the total amount of this dust remains in the vacuum vessel (VV). This radiological dust can re-suspend in case of LOVA (loss of vacuum accident), and these phenomena can cause explosions and serious damage to the health of the operators and to the integrity of the device. The authors have developed a facility, STARDUST, in order to reproduce thermo-fluid-dynamic conditions comparable to those expected inside the VV of the next generation of experiments, such as ITER, in case of LOVA. The dust used inside the STARDUST facility has particle sizes and physical characteristics comparable with those created inside the VV of nuclear fusion experiments. In this facility an experimental campaign has been conducted with the purpose of tracking the dust re-suspended at low pressurization rates (comparable to those expected in case of LOVA in ITER and suggested by the General Safety and Security Report ITER-GSSR) using a fast camera with a frame rate from 1000 to 10,000 images per second. The velocity fields of the mobilized dust are derived from the imaging of a two-dimensional slice of the flow illuminated by an optically adapted laser beam. The aim of this work is to demonstrate the possibility of dust tracking by means of image processing with the objective of determining the velocity field values.
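
The core PIV step behind such velocity fields is locating the peak of the cross-correlation between corresponding interrogation windows of two frames. The sketch below uses FFT-based circular correlation on synthetic frames (the data and window handling are assumptions; a real PIV pipeline adds windowing, sub-pixel peak fitting and outlier validation).

```python
import numpy as np

def piv_displacement(win_a, win_b):
    """Displacement between two interrogation windows from the peak of their
    FFT-based cross-correlation: the basic building block of a PIV analysis."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    corr = np.real(np.fft.ifft2(np.fft.fft2(b) * np.conj(np.fft.fft2(a))))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Unwrap indices above N/2 to negative displacements
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

rng = np.random.default_rng(7)
frame_a = rng.standard_normal((32, 32))                # synthetic particle image
frame_b = np.roll(frame_a, (3, -2), axis=(0, 1))       # pattern moved 3 down, 2 left
dy, dx = piv_displacement(frame_a, frame_b)
```

Dividing the recovered pixel displacement by the inter-frame time and the image scale gives the local velocity vector.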

  16. Analytical techniques applied to study cultural heritage objects

    Rizzutto, M.A.; Curado, J.F.; Bernardes, S.; Campos, P.H.O.V.; Kajiya, E.A.M.; Silva, T.F.; Rodrigues, C.L.; Moro, M.; Tabacniks, M.; Added, N., E-mail: rizzutto@if.usp.br [Universidade de Sao Paulo (USP), SP (Brazil). Instituto de Fisica

    2015-07-01

    The scientific study of artistic and cultural heritage objects has been routinely performed in Europe and the United States for decades. In Brazil this research area is growing, mainly through the use of physical and chemical characterization methods. Since 2003 the Group of Applied Physics with Particle Accelerators of the Physics Institute of the University of Sao Paulo (GFAA-IF) has been working with various methodologies for material characterization and analysis of cultural objects, initially using ion beam analysis performed with Particle Induced X-Ray Emission (PIXE), Rutherford Backscattering (RBS) and, recently, Ion Beam Induced Luminescence (IBIL) for the determination of the elements and chemical compounds in the surface layers. These techniques are widely used in the Laboratory of Materials Analysis with Ion Beams (LAMFI-USP). Recently, the GFAA expanded the studies to other possibilities of analysis enabled by imaging techniques that, coupled with elemental and compositional characterization, provide a better understanding of the materials and techniques used in the creative process during the manufacture of objects. The imaging analysis, mainly used to examine and document artistic and cultural heritage objects, is performed through images with visible light, infrared reflectography (IR), fluorescence with ultraviolet radiation (UV), tangential light and digital radiography. Expanding the possibilities of analysis further, new capabilities were added using portable equipment such as Energy Dispersive X-Ray Fluorescence (ED-XRF) and Raman Spectroscopy that can be used for analysis 'in situ' at the museums. The results of these analyses are providing valuable information on the manufacturing process and have provided new information on objects from different University of Sao Paulo museums. To improve this arsenal for cultural heritage analysis, a 3D robotic stage was recently constructed for the precise positioning of samples in the external beam setup

  18. Two fiber optics communication adapters apply to the control system of HIRFL-CSR

    The authors introduce two kinds of fiber-optic communication adapters developed for the HIRFL-CSR project, covering the design of the two adapters, their operating principle, hardware construction and fields of application, and describing how equipment with a standard RS232 or RS485 interface can be controlled over long distances through the adapters. Replacing the RS485 bus with optical fiber and the 485-Fiber Adapter solved the problem of communication interference. The control requirements of the national large-scale science project HIRFL-CSR are fulfilled. (authors)

  19. Applying advanced digital signal processing techniques in industrial radioisotopes applications

    Radioisotopes can be used to obtain signals or images in order to recognize the information inside industrial systems. The main problems of using these techniques are the difficulty of identifying the obtained signals or images and the need for skilled experts to interpret the output data of these applications. At present, the interpretation of the output data from these applications is performed mainly manually, depending heavily on the skills and experience of trained operators. This process is time-consuming and the results typically suffer from inconsistency and errors. The objective of the thesis is to apply advanced digital signal processing techniques to improve the treatment and interpretation of the output data from different Industrial Radioisotope Applications (IRA). This thesis focuses on two IRAs: the Residence Time Distribution (RTD) measurement and the defect inspection of welded pipes using a gamma source (gamma radiography). In the RTD measurement application, this thesis presents methods for signal pre-processing and modeling of the RTD signals. Simulation results are presented for two case studies. The first case study is a laboratory experiment for measuring the RTD in a water flow rig. The second case study is an experiment for measuring the RTD in a phosphate production unit. The thesis proposes an approach for RTD signal identification in the presence of noise. In this approach, after signal processing, the Mel Frequency Cepstral Coefficients (MFCCs) and polynomial coefficients are extracted from the processed signal or from one of its transforms. The Discrete Wavelet Transform (DWT), Discrete Cosine Transform (DCT), and Discrete Sine Transform (DST) have been tested and compared for efficient feature extraction. Neural networks have been used for matching of the extracted features. Furthermore, the Power Density Spectrum (PDS) of the RTD signal has also been used instead of the discrete
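
The cepstral-plus-polynomial feature idea can be sketched in simplified form. The snippet below skips the mel filterbank of true MFCCs (so it computes plain cepstral coefficients via a DCT of the log power spectrum) and uses an invented gamma-shaped RTD curve; both simplifications are assumptions, not the thesis procedure.

```python
import numpy as np

def dct2(x):
    """Orthonormal DCT-II, the transform used to turn log-spectral energies
    into cepstral coefficients."""
    n = len(x)
    k = np.arange(n)[:, None]
    basis = np.cos(np.pi * (2 * np.arange(n) + 1) * k / (2 * n))
    out = basis @ x
    out[0] *= np.sqrt(1.0 / n)
    out[1:] *= np.sqrt(2.0 / n)
    return out

def rtd_features(signal, n_cep=8, n_poly=3):
    """Cepstral + polynomial feature vector for an RTD curve (simplified:
    log power spectrum -> DCT, no mel filterbank)."""
    spec = np.abs(np.fft.rfft(signal)) ** 2
    cep = dct2(np.log(spec + 1e-12))[:n_cep]
    t = np.linspace(0.0, 1.0, len(signal))
    poly = np.polyfit(t, signal, n_poly)       # n_poly + 1 coefficients
    return np.concatenate([cep, poly])

# Hypothetical RTD curve: gamma-like impulse response of a flow system
t = np.linspace(0, 10, 256)
rtd = t ** 2 * np.exp(-2 * t)
features = rtd_features(rtd)
```

The fixed-length feature vector (here 8 cepstral + 4 polynomial values) is what would then be fed to a neural network for matching.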

  20. ESR dating technique applied to Pleistocene Corals (Barbados Island)

    In this work we applied the ESR (Electron Spin Resonance) dating technique to a coral from Barbados island. After a preliminary purification treatment, coral samples were milled and separated into different granulometry groups. Powder samples with granulometries between 125-250 μm and 250-500 μm were irradiated at the Calliope 60Co radioisotope source (R.C. ENEA-Casaccia) at doses between 10 and 3300 Gy, and their radiation-induced ESR signals were measured by a Bruker EMS104 spectrometer. The signal/noise ratio turned out to be highest for the granulometry between 250-500 μm, and consequently the paleo-curve was constructed using the ESR signals related to this granulometry. The paleo-curve was fitted with the exponential growth function y = a - b·e^(-cx), which describes the behaviour of the curve well, including the saturation region. Extrapolating the paleo-dose and knowing the annual dose (999±79 μGy/y), we calculated a coral age of 156±12 ky, in good agreement with results obtained on corals from the same region by other authors
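
The age computation itself is a one-line inversion of the growth curve followed by a division by the annual dose. The curve parameters below are invented for illustration (only the annual dose is the value quoted in the record); the real parameters come from fitting the irradiated-aliquot signals.

```python
import numpy as np

# Illustrative growth-curve parameters for y = a - b*exp(-c*D); NOT the paper's
a, b, c = 10.0, 9.0, 0.002          # signal units, signal units, 1/Gy

def esr_signal(dose_gy):
    return a - b * np.exp(-c * dose_gy)

# Invert the saturating growth curve at the natural-signal intensity to obtain
# the paleo-dose, then divide by the annual dose to obtain the age
y_natural = esr_signal(155.8)                     # pretend measured natural signal
paleodose = -np.log((a - y_natural) / b) / c      # Gy
annual_dose = 999e-6                              # Gy/year (value from the record)
age_ky = paleodose / annual_dose / 1e3
```

With a paleo-dose near 156 Gy and an annual dose of 999 μGy/y, the age comes out near 156 ky, matching the scale of the reported result.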

  1. Sensor Web Dynamic Measurement Techniques and Adaptive Observing Strategies

    Talabac, Stephen J.

    2004-01-01

    Sensor Web observing systems may have the potential to significantly improve our ability to monitor, understand, and predict the evolution of rapidly evolving, transient, or variable environmental features and events. This improvement will come about by integrating novel data collection techniques, new or improved instruments, emerging communications technologies and protocols, sensor mark-up languages, and interoperable planning and scheduling systems. In contrast to today's observing systems, "event-driven" sensor webs will synthesize real- or near-real time measurements and information from other platforms and then react by reconfiguring the platforms and instruments to invoke new measurement modes and adaptive observation strategies. Similarly, "model-driven" sensor webs will utilize environmental prediction models to initiate targeted sensor measurements or to use a new observing strategy. The sensor web concept contrasts with today's data collection techniques and observing system operations concepts where independent measurements are made by remote sensing and in situ platforms that do not share, and therefore cannot act upon, potentially useful complementary sensor measurement data and platform state information. This presentation describes NASA's view of event-driven and model-driven Sensor Webs and highlights several research and development activities at the Goddard Space Flight Center.

  2. Adaptive array technique for differential-phase reflectometry in QUEST

    Idei, H., E-mail: idei@triam.kyushu-u.ac.jp; Hanada, K.; Zushi, H. [Research Institute for Applied Mechanics, Kyushu Univ., Kasuga, 816-8560 Japan (Japan); Nagata, K.; Mishra, K.; Itado, T.; Akimoto, R. [Interdisciplinary Grad. School of Eng. Sci., Kyushu Univ., Kasuga, 816-8580 Japan (Japan); Yamamoto, M. K. [Research Institute for Sustainable Humanosphere, Kyoto Univ., Uji, 611-0011 Japan (Japan)

    2014-11-15

    A Phased Array Antenna (PAA) was considered for the launching and receiving antennae in reflectometry to attain good directivity in the applied microwave range. A well-focused beam was obtained in a launching-antenna application, and differential-phase evolution was properly measured using a metal reflector plate in the proof-of-principle experiment at low-power test facilities. Differential-phase evolution was also evaluated using the PAA in the Q-shu University Experiment with Steady State Spherical Tokamak (QUEST). A beam-forming technique was applied in receiving phased-array antenna measurements. In the QUEST device, which should be considered a large oversized cavity, a standing-wave effect was significantly observed with perturbed phase evolution. A new approach using the derivative of the measured field with respect to the propagating wavenumber was proposed to eliminate the standing-wave effect.
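
Beam forming with a phased array rests on applying a progressive phase shift across the elements. The sketch below computes the far-field array factor of a generic uniform linear array (element count, spacing and steering angle are arbitrary assumptions, unrelated to the QUEST hardware) and confirms the main lobe lands at the steered angle.

```python
import numpy as np

def array_factor(n, d_over_lambda, steer_deg, theta_deg):
    """Normalized far-field array factor of an n-element uniform linear array
    with progressive phase shifts steering the main lobe to steer_deg."""
    theta = np.radians(theta_deg)
    steer = np.radians(steer_deg)
    k_d = 2 * np.pi * d_over_lambda
    elements = np.arange(n)
    phase = k_d * elements[:, None] * (np.sin(theta)[None, :] - np.sin(steer))
    return np.abs(np.exp(1j * phase).sum(axis=0)) / n

angles = np.linspace(-90, 90, 721)                 # 0.25 degree grid
af = array_factor(8, 0.5, 20.0, angles)            # 8 elements, half-wave spacing
peak_angle = angles[np.argmax(af)]
```

Half-wavelength spacing keeps grating lobes out of the visible range, so the maximum is unique at the steering angle.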

  3. PIXE and PDMS techniques applied to environmental control

    Airborne particles containing metals are one of the main sources of worker and environmental exposure during mineral mining and milling processes. In order to evaluate the risk to workers and to the environment from mineral processing, it is necessary to determine the concentration and the kinetics of the particles. Furthermore, the chemical composition, particle size and elemental mass concentration in the fine fraction of the aerosol are necessary for evaluation of the risk. Mineral sands are processed to obtain rutile (TiO2), ilmenite (TiFeO3), zircon (ZrSiO4) and monazite ((RE)PO4) concentrates. The aim of this work was to apply the PIXE (Particle Induced X-ray Emission) and PDMS (Plasma Desorption Mass Spectrometry) methods to characterize mineral dust particles generated at this plant. The mass spectrum of positive ions of cerium oxide shows that the cerium is associated with oxygen (CeOn). Compounds of thorium (ThO2 and ThSiO4), Sr, Ca and Zr were also observed in this spectrum. The positive-ion mass spectrum of the monazite concentrate shows that Th was associated with oxygen (ThOn) and Ce was associated with (POn); it also shows compounds of other rare earths such as La, Nd and Y. Ions of ZrSiO3, TiO2 and TiFeO3 present in the mass spectra indicate that the monazite concentrate contains zircon, rutile and ilmenite. Compounds of Cl, Ca, Mn, V, Cu, Zn and Pb were also identified in the mass spectrum. This study shows that PIXE and PDMS techniques can be used as complementary methods for aerosol analysis. (author)

  4. Indirect techniques for adaptive input-output linearization of non-linear systems

    Teel, Andrew; Kadiyala, Raja; Kokotovic, Peter; Sastry, Shankar

    1991-01-01

    A technique of indirect adaptive control based on certainty equivalence for input-output linearization of nonlinear systems is proven convergent. It does not suffer from the overparameterization drawbacks of direct adaptive control techniques on the same plant. This paper also contains a semi-indirect adaptive controller which has several attractive features of both the direct and indirect schemes.
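
The certainty-equivalence idea can be illustrated on a toy scalar plant (this example, its gains and its identifier update are illustrative assumptions, far simpler than the paper's multivariable scheme): an identifier estimates the unknown parameter from the prediction error, and the controller cancels the estimated nonlinearity as if the estimate were exact.

```python
# Toy plant: x' = theta_true * x**3 + u, with theta_true unknown to the controller
theta_true = 2.0
theta_hat = 0.0          # initial parameter estimate
gamma, k = 5.0, 4.0      # adaptation gain, linear feedback gain (assumed values)
x, dt = 1.0, 1e-3

for _ in range(20000):   # 20 s of Euler integration
    # Certainty-equivalence linearizing control: cancel the ESTIMATED nonlinearity
    u = -theta_hat * x**3 - k * x
    x_dot = theta_true * x**3 + u
    # Indirect (identifier-based) update: gradient step on the prediction error
    x_dot_pred = theta_hat * x**3 + u
    theta_hat += gamma * (x_dot - x_dot_pred) * x**3 * dt
    x += x_dot * dt
```

The state is driven to zero even though the estimate need not converge to the true parameter without persistent excitation, which is the usual caveat for such schemes.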

  5. Applied Taxonomy Techniques Intended for Strenuous Random Forest Robustness

    Tarannum A. Bloch

    2011-11-01

    Globalization and economic trade have shifted the scrutiny of facts from data to knowledge. To this end, data mining techniques have been employed in numerous real-world applications. This paper presents an appraisal of assorted data mining techniques on diverse data sets. Of the many data mining techniques available for prediction and classification, this article covers the most prominent: J48, random forest, Naïve Bayes, AdaBoostM1 and Bagging. Experimental results demonstrate the robustness of the random forest classifier by computing the accuracy, weighted average ROC value and kappa statistic on various data sets
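
The bootstrap-aggregation mechanism behind bagging and random forests can be demonstrated from scratch (this is a generic illustration on synthetic data, not the paper's WEKA experiments): many depth-1 trees, each fit on a bootstrap resample, vote on the test set.

```python
import numpy as np

rng = np.random.default_rng(42)

def make_data(n):
    """Synthetic two-class data with a circular decision boundary."""
    X = rng.uniform(-1, 1, (n, 2))
    y = ((X[:, 0] ** 2 + X[:, 1] ** 2) < 0.5).astype(int)
    return X, y

def fit_stump(X, y):
    """Best single-feature threshold split (a depth-1 decision tree)."""
    best = (0, 0.0, 0, -1.0)                # feature, threshold, polarity, accuracy
    for f in range(X.shape[1]):
        for t in np.quantile(X[:, f], np.linspace(0.05, 0.95, 19)):
            for pol in (0, 1):
                pred = (X[:, f] > t).astype(int) ^ pol
                acc = float((pred == y).mean())
                if acc > best[3]:
                    best = (f, float(t), pol, acc)
    return best[:3]

def stump_predict(stump, X):
    f, t, pol = stump
    return (X[:, f] > t).astype(int) ^ pol

X, y = make_data(400)
X_test, y_test = make_data(400)

single = fit_stump(X, y)
stumps = []                                 # bagging: one stump per bootstrap sample
for _ in range(25):
    idx = rng.integers(0, len(X), len(X))
    stumps.append(fit_stump(X[idx], y[idx]))

votes = np.mean([stump_predict(s, X_test) for s in stumps], axis=0)
acc_single = float((stump_predict(single, X_test) == y_test).mean())
acc_bagged = float(((votes > 0.5).astype(int) == y_test).mean())
```

A full random forest additionally randomizes the feature subset at each split; bootstrap diversity alone already shows the voting mechanism.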

  6. GPS-based ionospheric tomography with a constrained adaptive simultaneous algebraic reconstruction technique

    Wen Debao; Zhang Xiao; Tong Yangjin; Zhang Guangsheng; Zhang Min; Leng Rusong

    2015-03-01

    In this paper, a constrained adaptive simultaneous algebraic reconstruction technique (CASART) is presented to obtain high-quality reconstructions from insufficient projections. In the new method, a Gaussian weighting function is introduced to constrain the tomography system, exploiting the continuous, smooth variation of ionospheric electron density (IED) among neighbouring voxels. This resolves the dependence on initial values for voxels not traversed by any GPS ray. A numerical simulation scheme is devised to validate the feasibility of the new algorithm, and comparisons are made to demonstrate its superiority. Finally, actual GPS observations are applied to further validate the feasibility and superiority of the new algorithm.
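
The underlying SART iteration is easy to sketch. The snippet below is plain, unconstrained SART on a toy 2x2 voxel "tomography" problem (the ray geometry is invented); the paper's contribution, the Gaussian-weighted smoothness constraint and adaptivity, is not implemented here.

```python
import numpy as np

def sart(A, b, n_iter=200, relax=0.5):
    """Simultaneous Algebraic Reconstruction Technique: every voxel is updated
    each sweep using all rays at once, with row/column-sum normalization."""
    m, n = A.shape
    x = np.zeros(n)
    col_sum = A.sum(axis=0); col_sum[col_sum == 0] = 1.0
    row_sum = A.sum(axis=1); row_sum[row_sum == 0] = 1.0
    for _ in range(n_iter):
        residual = (b - A @ x) / row_sum          # per-ray normalized residual
        x += relax * (A.T @ residual) / col_sum   # distribute back to voxels
    return x

# Toy problem: 4 voxels, 5 rays (row sums, column sums and one diagonal)
x_true = np.array([1.0, 2.0, 3.0, 4.0])
A = np.array([[1, 1, 0, 0],
              [0, 0, 1, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 0, 1]], dtype=float)
b = A @ x_true                                    # simulated projections
x_rec = sart(A, b)
```

Because this toy system has full column rank, the iteration converges to the true voxel values; with insufficient projections, as in ionospheric tomography, extra constraints like CASART's become necessary.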

  7. GLOBAL COMMUNICATION TECHNIQUES TO BE APPLIED BY MULTINATIONAL COMPANIES

    Alexandru Ionescu; Nicoleta Rossela Dumitru

    2011-01-01

    Global communication is based on a very clear basic principle: in a company, everything communicates. Each expression of communication should be considered a vital element of the enterprise's identity and personality. Global communication is also developed from the company's history and heritage, its culture and its future. Rooted in each project's ambition, global communication identifies and integrates the core values that will allow the company to grow and adapt to fast environmental changes....

  8. Multicriterial Evaluation of Applying Japanese Management Concepts, Methods and Techniques

    Podobiński, Mateusz

    2014-01-01

    Japanese management concepts, methods and techniques refer to work organization and improvements to companies’ functioning. They appear in numerous Polish companies, especially in the manufacturing ones. Cultural differences are a major impediment in their implementation. Nevertheless, the advantages of using Japanese management concepts, methods and techniques motivate the management to implement them in the company. The author shows research results, which refer to advanta...

  10. Photoacoustic technique applied to the study of skin and leather

    Vargas, M.; Varela, J.; Hernández, L.; González, A.

    1998-08-01

    In this paper the photoacoustic technique is used in bull skin for the determination of thermal and optical properties as a function of the tanning process steps. Our results show that the photoacoustic technique is sensitive to the study of physical changes in this kind of material due to the tanning process.

  11. APPLYING ARTIFICIAL INTELLIGENCE TECHNIQUES TO HUMAN-COMPUTER INTERFACES

    Sonnenwald, Diane H.

    1988-01-01

    A description is given of UIMS (User Interface Management System), a system using a variety of artificial intelligence techniques to build knowledge-based user interfaces combining functionality and information from a variety of computer systems that maintain, test, and configure customer telephone and data networks. Three artificial intelligence (AI) techniques used in UIMS are discussed, namely frame representation, object-oriented programming languages, and rule-based systems. The UIMS architecture is presented, and the structure of the UIMS is explained in terms of the AI techniques.

  12. Manifold learning techniques and model reduction applied to dissipative PDEs

    Sonday, Benjamin E.; Singer, Amit; Gear, C. William; Kevrekidis, Ioannis G.

    2010-01-01

    We link nonlinear manifold learning techniques for data analysis/compression with model reduction techniques for evolution equations with time scale separation. In particular, we demonstrate a "nonlinear extension" of the POD-Galerkin approach to obtaining reduced dynamic models of dissipative evolution equations. The approach is illustrated through a reaction-diffusion PDE, and the performance of different simulators on the full and the reduced models is compared. We also discuss the relati...
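
    The linear POD step that the paper extends nonlinearly can be sketched in a few lines: collect solution snapshots, take an SVD, and keep the leading spatial modes as a reduced basis (synthetic snapshots below, not the paper's reaction-diffusion data):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)                 # snapshot times
xgrid = np.linspace(0.0, 1.0, 64)              # spatial grid
# synthetic snapshots: two decaying spatial modes plus a little noise
snapshots = (np.exp(-t)[:, None] * np.sin(np.pi * xgrid)
             + 0.3 * np.exp(-3.0 * t)[:, None] * np.sin(2.0 * np.pi * xgrid)
             + 1e-3 * rng.standard_normal((200, 64)))

# POD: SVD of the mean-centred snapshot matrix
U, s, Vt = np.linalg.svd(snapshots - snapshots.mean(axis=0), full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
n_modes = int(np.searchsorted(energy, 0.999) + 1)  # modes holding 99.9% energy
basis = Vt[:n_modes]                               # reduced spatial basis
```

    Galerkin projection of the PDE onto `basis` then yields the reduced dynamic model; the manifold-learning extension replaces this linear basis with a nonlinear parametrization.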

  13. Metamodeling Techniques Applied to the Design of Reconfigurable Control Applications

    Luca Ferrarini

    2008-02-01

    Full Text Available In order to realize autonomous manufacturing systems in environments characterized by high dynamics and high task complexity, it is necessary to improve control system modelling and performance. This requires the use of better and reusable abstractions. In this paper, we explore metamodel techniques as a foundation for the solution of this problem. The increasing popularity of model-driven approaches and a new generation of tools to support metamodel techniques are changing the software engineering landscape, boosting the adoption of new methodologies for control application development.

  14. Applying Web Usage Mining for Personalizing Hyperlinks in Web-Based Adaptive Educational Systems

    Romero, Cristobal; Ventura, Sebastian; Zafra, Amelia; de Bra, Paul

    2009-01-01

    Nowadays, the application of Web mining techniques in e-learning and Web-based adaptive educational systems is increasing exponentially. In this paper, we propose an advanced architecture for a personalization system to facilitate Web mining. A specific Web mining tool is developed and a recommender engine is integrated into the AHA! system in…

  15. Flipped Classroom Adapted to the ARCS Model of Motivation and Applied to a Physics Course

    Asiksoy, Gülsüm; Özdamli, Fezile

    2016-01-01

    This study aims to determine the effect of the flipped classroom approach, adapted to Keller's ARCS (Attention, Relevance, Confidence and Satisfaction) motivation model and applied to a physics course, on students' achievement, motivation and self-sufficiency. The study involved 66 students divided into two classes of a physics course. The…

  16. Software factory techniques applied to Process Control at CERN

    Dutour, MD

    2007-01-01

    The CERN Large Hadron Collider (LHC) requires constant monitoring and control of large numbers of parameters to guarantee operational conditions. For this purpose, a methodology called UNICOS (UNIfied Industrial COntrols Systems) has been implemented to standardize the design of process control applications. To further accelerate the development of these applications, we migrated our existing UNICOS tooling suite toward a software factory in charge of assembling project, domain and technical information seamlessly into deployable PLC (Programmable Logic Controller) – SCADA (Supervisory Control And Data Acquisition) systems. This software factory delivers consistently high quality by reducing human error and repetitive tasks, and adapts to user specifications in a cost-efficient way. Hence, this production tool is designed to encapsulate and hide the PLC and SCADA target platforms, enabling the experts to focus on the business model rather than specific syntaxes and grammars. Based on industry standard software...

  17. X-diffraction technique applied for nano system metrology

    Applications of nanomaterials are growing fast in all industrial sectors, creating a strong need for metrology and standardization in the nanomaterial area. The great potential of the X-ray diffraction technique in this field is illustrated with examples from metals, metal oxides and pharmaceuticals.
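
    A standard calculation in XRD-based nanometrology is the Scherrer estimate of crystallite size from peak broadening; the numbers below are illustrative only, not from the record above:

```python
import math

def scherrer_size(wavelength_nm, fwhm_deg, two_theta_deg, K=0.9):
    """Crystallite size D = K*lambda / (beta * cos(theta)), beta in radians."""
    beta = math.radians(fwhm_deg)          # peak width (FWHM) in radians
    theta = math.radians(two_theta_deg / 2)
    return K * wavelength_nm / (beta * math.cos(theta))

# Cu K-alpha radiation, an 0.5-degree-wide peak at 2theta = 38.2 degrees
D = scherrer_size(0.15406, 0.5, 38.2)      # crystallite size in nm
```

    In practice the measured width must first be corrected for instrumental broadening, which this sketch omits.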

  18. Flash radiographic technique applied to fuel injector sprays

    A flash radiographic technique, using 50 ns exposure times, was used to study the pattern and density distribution of a fuel injector spray. The experimental apparatus and method are described. An 85 kVp flash x-ray generator, designed and fabricated at the Lawrence Livermore Laboratory, is utilized. Radiographic images, recorded on standard x-ray films, are digitized and computer processed

  19. A practical application and implementation of adaptive techniques using neural networks in autoreclose protection and system control

    Gardiner, I.P.

    1997-12-31

    Reyrolle Protection have carried out research in conjunction with Bath University into applying adaptive techniques to autoreclose schemes, and have produced an algorithm based on an artificial neural network which can recognise when it is "safe to reclose" and when it is "unsafe to reclose". The algorithm examines the induced voltage on the faulted phase and, by applying pattern-recognition techniques, determines when the secondary arc extinguishes. Significant operational advantages can now be realised using this technology, resulting in changes to existing operational philosophy. Conventional autoreclose relays applied to the system have followed the philosophy of "reclose to restore the system", but with this adaptive approach a progression can now be made to "reclose only if safe to do so". The main requirement remains to protect the investment, i.e. the system, by reducing damaging shocks and voltage dips and maintaining continuity of supply. The adaptive technique can be incorporated into a variety of schemes which will further this goal in comparison with conventional autoreclose. (Author)
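
    The classification idea can be sketched with a single logistic unit trained on synthetic features of the faulted-phase voltage; the real scheme uses a neural network on measured waveforms, and both the features and the data here are invented stand-ins:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400
# synthetic features of the faulted-phase induced voltage:
# column 0 ~ residual-voltage magnitude, column 1 ~ distortion measure
safe = np.column_stack([rng.normal(0.8, 0.1, n), rng.normal(0.1, 0.05, n)])
unsafe = np.column_stack([rng.normal(0.3, 0.1, n), rng.normal(0.5, 0.1, n)])
X = np.vstack([safe, unsafe])
y = np.concatenate([np.ones(n), np.zeros(n)])    # 1 = "safe to reclose"

w, b = np.zeros(2), 0.0
for _ in range(500):                             # plain gradient descent
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))       # logistic unit
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * np.mean(p - y)
acc = np.mean(((1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5) == y)
```

    The operational point is the decision itself: block reclosing until the classifier reports the secondary arc has extinguished.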

  20. Technology Assessment of Dust Suppression Techniques Applied During Structural Demolition

    Boudreaux, J.F.; Ebadian, M.A.; Williams, P.T.; Dua, S.K.

    1998-10-20

    Hanford, Fernald, Savannah River, and other sites are currently reviewing technologies that can be implemented to demolish buildings in a cost-effective manner. In order to demolish a structure properly and, at the same time, minimize the amount of dust generated from a given technology, an evaluation must be conducted to choose the most appropriate dust suppression technology given site-specific conditions. Thus, the purpose of this research, which was carried out at the Hemispheric Center for Environmental Technology (HCET) at Florida International University, was to conduct an experimental study of dust aerosol abatement (dust suppression) methods as applied to nuclear D and D. This experimental study targeted the problem of dust suppression during the demolition of nuclear facilities. The resulting data were employed to assist in the development of mathematical correlations that can be applied to predict dust generation during structural demolition.

  1. Applying Website Usability Testing Techniques to Promote E-services

    Abdel Nasser H. Zaied; Hassan, Mohamed M.; Islam S. Mohamed

    2015-01-01

    In this competitive world, websites are considered to be a key aspect of any organization's competitiveness. In addition to visual esthetics, the usability of a website is a strong determinant of users' satisfaction and pleasure. However, a lack of appropriate techniques and attributes for measuring usability may constrain the usefulness of a website. To address this issue, we conduct a statistical study to evaluate the usability levels of e-learning and e-training websites based on human (user) p...

  2. Signal Processing Techniques Applied in RFI Mitigation of Radio Astronomy

    Sixiu Wang; Zhengwen Sun; Weixia Wang; Liangquan Jia

    2012-01-01

    Radio broadcast and telecommunications are present at different power levels everywhere on Earth. Radio Frequency Interference (RFI) substantially limits the sensitivity of existing radio telescopes in several frequency bands and may prove to be an even greater obstacle for the next generation of telescopes (or arrays) to overcome. A variety of RFI detection and mitigation techniques have been developed in recent years. This study describes various signal processing methods of RFI mitigation in radi...

  3. Traffic visualization - applying information visualization techniques to enhance traffic planning

    Picozzi, Matteo; Verdezoto, Nervo; Pouke, Matti; Vatjus-Anttila, Jarkko; Quigley, Aaron John

    2013-01-01

    In this paper, we present a space-time visualization to provide a city's decision-makers with the ability to analyse and uncover important "city events" in an understandable manner for city planning activities. An interactive Web mashup visualization is presented that integrates several visualization techniques to give a rapid overview of traffic data. We illustrate our approach as a case study for traffic visualization systems, using datasets from the city of Oulu that can be extended to other city...

  5. Ion beam analysis techniques applied to large scale pollution studies

    Cohen, D.D.; Bailey, G.; Martin, J.; Garton, D.; Noorman, H.; Stelcer, E.; Johnson, P. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1993-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analyse the thousands of filter papers a year that may originate from a large scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie has established a large area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 μm particle diameter cut off and runs for 24 hours every Sunday and Wednesday using one Gillman 25mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator based IBA techniques are used at ANSTO, to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  6. Applying a Splitting Technique to Estimate Electrical Grid Reliability

    Wadman, Wander; Crommelin, Daan; Frank, Jason; Pasupathy, R.; Kim, S.-H.; Tolk, A.; Hill, R; Kuhl, M.E.

    2013-01-01

    As intermittent renewable energy penetrates electrical power grids more and more, assessing grid reliability is of increasing concern for grid operators. Monte Carlo simulation is a robust and popular technique to estimate indices for grid reliability, but the involved computational intensity may be too high for typical reliability analyses. We show that various reliability indices can be expressed as expectations depending on the rare event probability of a so-called power curtailment, and e...
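
    The splitting idea can be illustrated on a toy rare-event problem: estimate the probability that a short Gaussian random walk (standing in for a power-curtailment process) ever exceeds a high level, by restarting trajectories from an intermediate level. This is a generic fixed-splitting sketch, not the paper's grid model:

```python
import numpy as np

rng = np.random.default_rng(0)
STEPS, MID, TOP, N, R = 10, 3.0, 6.0, 2000, 20   # illustrative parameters

def walk_to(level, pos, steps):
    """Advance a Gaussian random walk; report whether it crosses `level`."""
    if pos >= level:
        return True, pos, steps
    for k in range(steps):
        pos += rng.standard_normal()
        if pos >= level:
            return True, pos, steps - k - 1
    return False, pos, 0

# stage 1: which walks reach the intermediate level MID at all?
stage1 = [walk_to(MID, 0.0, STEPS) for _ in range(N)]
hits1 = [(pos, left) for ok, pos, left in stage1 if ok]
# stage 2: split every success into R continuations toward the rare level TOP
hits2 = sum(walk_to(TOP, pos, left)[0] for pos, left in hits1 for _ in range(R))
p_hat = (len(hits1) / N) * (hits2 / (R * len(hits1))) if hits1 else 0.0
```

    Relative to naive Monte Carlo, effort is concentrated on trajectories that already made progress toward the rare event, which is what reduces the variance.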

  7. Enhanced nonlinear iterative techniques applied to a nonequilibrium plasma flow

    The authors study the application of enhanced nonlinear iterative methods to the steady-state solution of a system of two-dimensional convection-diffusion-reaction partial differential equations that describe the partially ionized plasma flow in the boundary layer of a tokamak fusion reactor. This system of equations is characterized by multiple time and spatial scales and contains highly anisotropic transport coefficients due to a strong imposed magnetic field. They use Newton's method to linearize the nonlinear system of equations resulting from an implicit, finite volume discretization of the governing partial differential equations, on a staggered Cartesian mesh. The resulting linear systems are neither symmetric nor positive definite, and are poorly conditioned. Preconditioned Krylov iterative techniques are employed to solve these linear systems. They investigate both a modified and a matrix-free Newton-Krylov implementation, with the goal of reducing CPU cost associated with the numerical formation of the Jacobian. A combination of a damped iteration, mesh sequencing, and a pseudotransient continuation technique is used to enhance global nonlinear convergence and CPU efficiency. GMRES is employed as the Krylov method with incomplete lower-upper (ILU) factorization preconditioning. The goal is to construct a combination of nonlinear and linear iterative techniques for this complex physical problem that optimizes trade-offs between robustness, CPU time, memory requirements, and code complexity. It is shown that a mesh sequencing implementation provides significant CPU savings for fine grid calculations. Performance comparisons of modified Newton-Krylov and matrix-free Newton-Krylov algorithms will be presented

  8. Image analysis technique applied to lock-exchange gravity currents

    Nogueira, Helena; Adduce, Claudia; Alves, Elsa; Franca, Mário Jorge Rodrigues Pereira da

    2013-01-01

    An image analysis technique is used to estimate the two-dimensional instantaneous density field of unsteady gravity currents produced by full-depth lock-release of saline water. An experiment reproducing a gravity current was performed in a 3.0 m long, 0.20 m wide and 0.30 m deep Perspex flume with horizontal smooth bed and recorded with a 25 Hz CCD video camera under controlled light conditions. Using dye concentration as a tracer, a calibration procedure was established for each pixel in th...
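
    The per-pixel calibration step can be sketched as follows: known dye concentrations are related to recorded intensities for one pixel, and the fitted curve is then inverted for experimental images. The Beer-Lambert-like synthetic data below are illustrative, not the experiment's calibration values:

```python
import numpy as np

# calibration solutions of known dye concentration (g/l) and the intensity
# one pixel records for each (synthetic exponential attenuation)
conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
intensity = 200.0 * np.exp(-0.4 * conc)

# fit concentration as a linear function of log-intensity for this pixel
coeff = np.polyfit(np.log(intensity), conc, 1)

# invert a recorded experimental intensity back to concentration
measured = 200.0 * np.exp(-0.4 * 1.7)                 # unknown sample pixel
estimated_conc = np.polyval(coeff, np.log(measured))  # recovered concentration
```

    Repeating this fit independently for every pixel compensates for non-uniform lighting, which is why the calibration is established pixel by pixel.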

  9. Unconventional Coding Technique Applied to Multi-Level Polarization Modulation

    Rutigliano, G. G.; Betti, S.; Perrone, P.

    2016-05-01

    A new technique is proposed to improve information confidentiality in optical-fiber communications without bandwidth consumption. A pseudorandom vectorial sequence was generated by a dynamic system algorithm and used to codify a multi-level polarization modulation based on the Stokes vector. Optical-fiber birefringence, usually considered as a disturbance, was exploited to obfuscate the signal transmission. At the receiver end, the same pseudorandom sequence was generated and used to decode the multi-level polarization modulated signal. The proposed scheme, working at the physical layer, provides strong information security without introducing complex processing and thus latency.
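
    The scrambling principle can be sketched abstractly: a pseudorandom sequence, reproducible at the receiver from a shared seed, shifts each M-level symbol, and the receiver regenerates the sequence to invert the shift. This toy sketch ignores the optical layer (Stokes vectors, birefringence) entirely:

```python
import numpy as np

M = 8                                     # number of polarization levels
seed = 1234                               # shared secret seed
symbols = np.random.default_rng(7).integers(0, M, 50)     # message symbols

# transmitter: pseudorandom pad shifts each multi-level symbol
pad = np.random.default_rng(seed).integers(0, M, symbols.size)
transmitted = (symbols + pad) % M         # obfuscated multi-level signal

# receiver: regenerate the same pad from the seed and undo the shift
pad_rx = np.random.default_rng(seed).integers(0, M, symbols.size)
recovered = (transmitted - pad_rx) % M
```

    In the proposed system the fibre's birefringence adds a further physical-layer transformation on top of this algebraic scrambling.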

  10. Applying Supervised Opinion Mining Techniques on Online User Reviews

    Ion SMEUREANU

    2012-01-01

    Full Text Available In recent years, the spectacular development of web technologies has led to an enormous quantity of user-generated information in online systems. This large amount of information on web platforms makes them viable as data sources for applications based on opinion mining and sentiment analysis. The paper proposes an algorithm for detecting sentiment in movie user reviews, based on a naive Bayes classifier. We analyse the opinion mining domain, the techniques used in sentiment analysis and their applicability. We implemented the proposed algorithm, tested its performance, and suggested directions for development.
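
    A minimal naive Bayes sentiment classifier in this spirit can be built with scikit-learn on a tiny hand-made corpus (the paper used real movie reviews and its own implementation):

```python
# Bag-of-words features + multinomial naive Bayes for polarity detection.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

train = ["great movie loved it", "wonderful acting and story",
         "terrible plot waste of time", "boring and awful film"]
labels = ["pos", "pos", "neg", "neg"]

vec = CountVectorizer()
clf = MultinomialNB().fit(vec.fit_transform(train), labels)
pred = clf.predict(vec.transform(["loved the wonderful story",
                                  "awful boring waste"]))
```

    Laplace smoothing (the classifier's default) keeps unseen words from zeroing out the class probabilities.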

  11. Neutron activation: an invaluable technique for teaching applied radiation

    This experiment introduces students to the important method of neutron activation. A sample of aluminium was irradiated with neutrons from an isotropic 241Am-Be source. Using γ-ray spectroscopy, two radionuclide products were identified as 27Mg and 28Al. Applying a cadmium cut-off filter and an optimum irradiation time of 45 min, the half-life of 27Mg was determined as 9.46±0.50 min. The half-life of the 28Al radionuclide was determined as 2.28±0.10 min using a polythene moderator and an optimum irradiation time of 10 min. (author)
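
    The half-life extraction can be sketched numerically: for a single radionuclide, ln(counts) is linear in time, and the fitted slope gives the decay constant. The counts below are synthetic stand-ins for the 27Mg measurements:

```python
import numpy as np

t = np.arange(0.0, 40.0, 2.0)               # counting times in minutes
true_lam = np.log(2) / 9.46                 # decay constant for T1/2 = 9.46 min
counts = 5000.0 * np.exp(-true_lam * t)     # idealized (noise-free) count rates

# fit ln(counts) vs t; the slope is -lambda, so T1/2 = ln 2 / lambda
slope, intercept = np.polyfit(t, np.log(counts), 1)
half_life = np.log(2) / -slope              # recovered half-life in minutes
```

    With real counting data the points scatter, and the quoted uncertainty (here ±0.50 min) comes from the fit statistics.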

  12. Nuclear Techniques Applied For Optimizing Irrigation In Vegetable Cultivation

    Optimizing irrigation in vegetable cultivation has been carried out based on the water use efficiency (WUE) parameter. The experiment was conducted with Chinese cabbage planted on alluvial soil, comparing traditional furrow irrigation with drip irrigation scheduled and limited in the amount of irrigated water according to the water balance. Soil moisture and crop evapotranspiration (ET) were monitored using, respectively, a neutron probe (NP, model PB 205, FielTech, Japan) and a meteorological station installed in the field. The NP was calibrated directly in the field from the measured count ratio (Rn) and gravimetrically determined soil moisture. Crop productivity in each experiment was determined as the total biological yield (Ybio) and the edible yield (YE) harvested, and WUE was estimated as the ratio of productivity to the amount of irrigated water, in units of kg.m-3. Experimental results showed that drip irrigation could save 10-19% of water compared to furrow irrigation, depending on the cultivation season. WUE was thus improved up to 1.4 times, whether estimated from YE or Ybio. The drip irrigation with scheduling technique could be transferred to semiarid areas of Vietnam, not only for vegetables but also for fruit, e.g. grapes in the southern central part of the country. (author)
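
    The WUE comparison itself is simple arithmetic: yield divided by irrigated water, in kg per cubic metre. The numbers below are illustrative, not the study's measurements:

```python
# Water use efficiency as defined in the study: WUE = yield / irrigated water.
def wue(yield_kg, water_m3):
    return yield_kg / water_m3

furrow = wue(4200.0, 1500.0)    # traditional furrow irrigation (illustrative)
drip = wue(4400.0, 1250.0)      # drip irrigation with scheduling (illustrative)
improvement = drip / furrow     # ratio of efficiencies
```

    The same calculation applies whether productivity is taken as edible yield (YE) or total biological yield (Ybio).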

  13. Neutrongraphy technique applied to the narcotics and terrorism enforcement

    Among the several non-destructive assay methods that may be used for the detection of both drugs and explosives, those based on nuclear techniques have demonstrated the qualities essential for an efficient detection system. Because they use highly penetrating radiation, these techniques allow large numbers of samples to be inspected quickly, sensitively and specifically, with automatic decision-making. This work aims to show the potential of neutron radiography and computed tomography for the detection of drugs and explosives even when they are concealed by heavy materials. In the radiographic assays with thermal neutrons, samples of powder cocaine and explosives were inspected, both bare and concealed by several materials. The samples were irradiated for 30 minutes in the J-9 channel of the Argonauta research reactor of the IEN/CNEN in a neutron flux of 2.5 × 105 n/cm2.s. Two gadolinium converter sheets, each 25 μm thick, were used with a Kodak Industrex A5 photographic plate. A comparative analysis among experimental and simulated tomographic images obtained with X-rays and with fast and thermal neutrons is presented; thermal neutron tomography proved to be the best. (author)

  14. Quantitative Portfolio Optimization Techniques Applied to the Brazilian Stock Market

    André Alves Portela Santos

    2012-09-01

    Full Text Available In this paper we assess the out-of-sample performance of two alternative quantitative portfolio optimization techniques - mean-variance and minimum variance optimization - and compare their performance with respect to a naive 1/N (or equally-weighted) portfolio and also to the market portfolio given by the Ibovespa. We focus on short selling-constrained portfolios and consider alternative estimators for the covariance matrices: the sample covariance matrix, RiskMetrics, and three covariance estimators proposed by Ledoit and Wolf (2003), Ledoit and Wolf (2004a) and Ledoit and Wolf (2004b). Taking into account alternative portfolio re-balancing frequencies, we compute out-of-sample performance statistics which indicate that the quantitative approaches delivered improved results in terms of lower portfolio volatility and better risk-adjusted returns. Moreover, the use of more sophisticated estimators for the covariance matrix generated optimal portfolios with lower turnover over time.
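
    The core minimum-variance step has a closed form when only the fully-invested constraint is imposed: w = Σ⁻¹1 / (1ᵀΣ⁻¹1). The paper additionally constrains short selling (which requires a quadratic program); this unconstrained sketch with an invented covariance matrix shows the basic computation:

```python
import numpy as np

# illustrative 3-asset covariance matrix (not estimated from Ibovespa data)
Sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.16]])
ones = np.ones(3)

w = np.linalg.solve(Sigma, ones)   # Sigma^{-1} 1
w /= w.sum()                       # normalize: minimum-variance weights
port_var = w @ Sigma @ w           # resulting portfolio variance
```

    By construction this variance cannot exceed that of any single asset, since holding one asset is itself a feasible portfolio.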

  15. Innovative Visualization Techniques applied to a Flood Scenario

    Falcão, António; Ho, Quan; Lopes, Pedro; Malamud, Bruce D.; Ribeiro, Rita; Jern, Mikael

    2013-04-01

    The large and ever-increasing amounts of multi-dimensional, time-varying and geospatial digital information from multiple sources represent a major challenge for today's analysts. We present a set of visualization techniques that can be used for the interactive analysis of geo-referenced and time-sampled data sets, providing an integrated mechanism that aids the user to collaboratively explore, present and communicate visually complex and dynamic data. Here we present these concepts in the context of a 4-hour flood scenario from Lisbon in 2010, with data that includes measures of water column (flood height) every 10 minutes at a 4.5 m x 4.5 m resolution, topography, building damage, building information, and online base maps. Techniques we use include web-based linked views, multiple charts, map layers and storytelling. We explain in more detail two of these that are not currently in common use for the visualization of data: storytelling and web-based linked views. Visual storytelling is a method for providing a guided but interactive process of visualizing data, allowing more engaging data exploration through interactive web-enabled visualizations. Within storytelling, a snapshot mechanism helps the author of a story to highlight data views of particular interest and subsequently share or guide others within the data analysis process. This allows a particular person to select relevant attributes for a snapshot, such as highlighted regions for comparisons, time step, class values for the colour legend, etc., and provide a snapshot of the current application state, which can then be shared as a hyperlink and recreated by someone else. Since data can be embedded within this snapshot, it is possible to interactively visualize and manipulate it.
The second technique, web-based linked views, includes multiple windows which interactively respond to the user selections, so that when selecting an object and changing it one window, it will automatically update in all the other

  16. Object Detection Techniques Applied on Mobile Robot Semantic Navigation

    Carlos Astua

    2014-04-01

    Full Text Available The future of robotics predicts that robots will integrate themselves more every day with human beings and their environments. To achieve this integration, robots need to acquire information about the environment and its objects. There is a great need for algorithms that provide robots with this sort of skill, from locating the objects needed to accomplish a task to treating those objects as information about the environment. This paper presents a way to provide mobile robots with the ability to detect objects for semantic navigation, using current trends in robotics in a way that can be exported to other platforms. Two methods to detect objects are proposed, contour detection and a descriptor-based technique, and both are combined to overcome their respective limitations. Finally, the code is tested on a real robot to prove its accuracy and efficiency.
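
    A much-simplified stand-in for the contour/blob detection stage can be written with a threshold and a flood fill; a real robot pipeline would use OpenCV's contour extraction plus a feature descriptor, as the paper does:

```python
import numpy as np

def detect_objects(img, thresh=0.5):
    """Threshold an image and return a bounding box per connected object."""
    mask = img > thresh
    seen = np.zeros_like(mask, dtype=bool)
    boxes = []
    for i, j in zip(*np.nonzero(mask)):
        if seen[i, j]:
            continue
        stack, pix = [(i, j)], []
        seen[i, j] = True
        while stack:                      # iterative 4-neighbour flood fill
            r, c = stack.pop()
            pix.append((r, c))
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rr, cc = r + dr, c + dc
                if (0 <= rr < mask.shape[0] and 0 <= cc < mask.shape[1]
                        and mask[rr, cc] and not seen[rr, cc]):
                    seen[rr, cc] = True
                    stack.append((rr, cc))
        rows, cols = zip(*pix)
        boxes.append((min(rows), min(cols), max(rows), max(cols)))
    return boxes

img = np.zeros((20, 20))
img[2:6, 3:8] = 1.0                       # synthetic object 1
img[10:15, 12:18] = 1.0                   # synthetic object 2
boxes = detect_objects(img)
```

    Each detected region would then be handed to a descriptor matcher to decide what the object is, which is where the two proposed methods complement each other.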

  17. Applying Business Process Modeling Techniques: Case Study

    Bartosz Marcinkowski

    2010-12-01

    Full Text Available The selection and proper application of business process modeling methods and techniques have a significant impact on organizational improvement capabilities as well as on a proper understanding of the functionality of the information systems that support the activity of the organization. A number of business process modeling notations have been implemented in practice in recent decades. The most significant notations include ARIS, Business Process Modeling Notation (OMG BPMN) and several Unified Modeling Language (OMG UML) extensions. This paper assesses whether one of the most flexible and strictly standardized contemporary business process modeling notations, i.e. the Rational UML Profile for Business Modeling, enables business analysts to prepare business models that are all-embracing and understandable by all the stakeholders. After the introduction, the methodology of the research is discussed. The following section presents selected case study results. The paper is concluded with a summary.

  18. Signal Processing Techniques Applied in RFI Mitigation of Radio Astronomy

    Sixiu Wang

    2012-08-01

    Full Text Available Radio broadcast and telecommunications are present at different power levels everywhere on Earth. Radio Frequency Interference (RFI) substantially limits the sensitivity of existing radio telescopes in several frequency bands and may prove to be an even greater obstacle for the next generation of telescopes (or arrays) to overcome. A variety of RFI detection and mitigation techniques have been developed in recent years. This study describes various signal processing methods of RFI mitigation in radio astronomy, and chooses time-frequency domain cancellation to eliminate certain interference and effectively improve the signal-to-noise ratio in pulsar observations. Finally, RFI mitigation research and implementations in Chinese radio astronomy are also presented.
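
    A minimal time-frequency excision sketch (one member of the family of methods surveyed, not the paper's cancellation algorithm): compute a spectrogram, flag cells whose power greatly exceeds the median, and zero them before further processing:

```python
import numpy as np

rng = np.random.default_rng(0)
fs, n = 1024, 8192
t = np.arange(n) / fs
signal = rng.standard_normal(n)                 # broadband "sky" noise
signal += 5.0 * np.sin(2 * np.pi * 200.0 * t)   # narrow-band RFI tone at 200 Hz

frames = signal.reshape(-1, 256)                # crude non-overlapping STFT
spec = np.abs(np.fft.rfft(frames, axis=1)) ** 2
threshold = 10.0 * np.median(spec)              # robust power threshold
flagged = spec > threshold                      # time-frequency RFI mask
cleaned = np.where(flagged, 0.0, spec)          # excise flagged cells
```

    The median-based threshold is robust because strong but sparse RFI barely moves the median of the spectrogram.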

  19. Applying Data Privacy Techniques on Tabular Data in Uganda

    Mivule, Kato

    2011-01-01

    The growth of Information Technology (IT) in Africa has led to an increase in the utilization of communication networks for data transactions across the continent. A growing number of entities in the private sector, academia, and government have deployed the Internet as a medium to transact in data, routinely posting statistical and non-statistical data online and thereby making many in Africa increasingly dependent on the Internet for data transactions. In the country of Uganda, exponential growth in data transactions has presented a new challenge: what is the most efficient way to implement data privacy? This article discusses data privacy challenges faced by the country of Uganda and the implementation of data privacy techniques for published tabular data. We make the case for data privacy, survey concepts of data privacy, and describe implementations that could be employed to provide data privacy in Uganda.
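
    One simple tabular-privacy technique of the kind surveyed is generalization of quasi-identifiers before publication, so individual rows are no longer unique. The records and groupings below are invented for illustration, not from the article:

```python
# Generalize quasi-identifiers: exact age -> ten-year band,
# district -> coarse region; the sensitive attribute is left as-is.
def generalize(row):
    age, district, diagnosis = row
    age_band = f"{(age // 10) * 10}-{(age // 10) * 10 + 9}"
    region = "Central Region" if district in ("Kampala", "Wakiso") else "Other"
    return (age_band, region, diagnosis)

raw = [(34, "Kampala", "malaria"),      # hypothetical microdata rows
       (37, "Wakiso", "typhoid"),
       (52, "Gulu", "malaria")]
published = [generalize(r) for r in raw]
```

    Stronger guarantees (k-anonymity, noise addition) build on the same idea of coarsening or perturbing identifying attributes before release.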

  20. Considerations in applying on-line IC techniques to BWR's

    Ion chromatography (IC) has moved from its traditional role as a laboratory analytical tool to a real-time, dynamic, on-line measurement device for following ppb and sub-ppb concentrations of deleterious impurities in nuclear power plants. The Electric Power Research Institute (EPRI), individual utilities, and industry have all played significant roles in effecting the transition. This paper highlights considerations and the evolution of current on-line ion chromatography systems. The first applications of on-line techniques were demonstrated by General Electric (GE) under EPRI sponsorship at the Rancho Seco (1980), Calvert Cliffs, and McGuire nuclear units, primarily for diagnostic purposes. Today, on-line IC applications have expanded to include process control and routine plant monitoring. Current on-line ICs are innovative in design, promote operational simplicity, are modular for simplified maintenance and repair, and use field-proven components which enhance reliability. Conductivity detection with electronic or chemical suppression and spectrometric detection techniques are intermixed in applications. Remote multi-point sample systems have addressed memory effects. Early applications measured ionic species in the part-per-billion range; today, reliable part-per-trillion measurements are common for on-line systems, meeting the challenge of EPRI guideline requirements. Today's on-line ICs, with programmed sampling systems, monitor fluid streams throughout a power plant, supplying data that can be trended, stored and retrieved easily. The on-line IC has come of age; many technical challenges were overcome to achieve today's IC.

  1. Technology Assessment of Dust Suppression Techniques applied During Structural Demolition

    Boudreaux, J.F.; Ebadian, M.A.; Dua, S.K.

    1997-08-06

    Hanford, Fernald, Savannah River, and other sites are currently reviewing technologies that can be implemented to demolish buildings in a cost-effective manner. In order to demolish a structure and, at the same time, minimize the amount of dust generated by a given technology, an evaluation must be conducted to choose the most appropriate dust suppression technology. Thus, the purpose of this research, which was conducted by the Hemispheric Center for Environmental Technology (HCET) at Florida International University (FIU), was to perform an experimental study of dust aerosol abatement (dust suppression) methods as applied to nuclear D and D. This experimental study specifically targeted the problem of dust suppression during demolition. The resulting data were used in the development of mathematical correlations that can be applied to structural demolition. In Fiscal Year 1996 (FY96), the effectiveness of different dust-suppressing agents was investigated for different types of concrete blocks. Initial tests were conducted in a broad particle size range. In Fiscal Year 1997 (FY97), additional tests were performed in the size range in which most of the particles were detected. Since particle distribution is an important parameter for predicting deposition in various compartments of the human respiratory tract, various tests were aimed at determining the particle size distribution of the airborne dust particles. The effectiveness of dust-suppressing agents for particles of various sizes was studied. Instead of conducting experiments on various types of blocks, it was thought prudent to carry out additional tests on blocks of the same type. Several refinements were also incorporated in the test procedures and data acquisition system used in FY96.

  2. Optical Trapping Techniques Applied to the Study of Cell Membranes

    Morss, Andrew J.

    Optical tweezers allow for manipulating micron-sized objects using pN-level optical forces. In this work, we use an optical trapping setup to aid in three separate experiments, all related to the physics of the cellular membrane. In the first experiment, in conjunction with Brian Henslee, we use optical tweezers to allow for precise positioning and control of cells in suspension to evaluate the cell size dependence of electroporation. Theory predicts that all cells porate at a transmembrane potential V_TM of roughly 1 V. The Schwan equation predicts that the transmembrane potential depends linearly on the cell radius r, thus predicting that cells should porate at threshold electric fields that go as 1/r. The threshold field required to induce poration is determined by applying a low-voltage pulse to the cell and then applying additional pulses of greater and greater magnitude, checking for poration at each step using propidium iodide dye. We find that, contrary to expectations, cells do not porate at a constant value of the transmembrane potential but at a constant value of the electric field, which we find to be 692 V/cm for K562 cells. Delivering precise dosages of nanoparticles into cells is important for assessing the toxicity of nanoparticles and for genetic research. In the second experiment, we conduct nano-electroporation, a novel method of applying precise doses of transfection agents to cells, by using optical tweezers in conjunction with a confocal microscope to manipulate cells into contact with 100 nm wide nanochannels. This work was done in collaboration with Pouyan Boukany of Dr. Lee's group. The small cross-sectional area of these nanochannels means that the electric field within them is extremely large, 60 MV/m, which allows them to electrophoretically drive transfection agents into the cell. We find that nano-electroporation results in excellent dose control (to within 10% in our experiments) compared to bulk electroporation. We also find that
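
    The radius scaling discussed above can be made concrete with a small sketch, assuming the steady-state form of the Schwan relation, V_TM = 1.5 * E * r * cos(theta); the two cell radii below are illustrative values, not measurements from the study:

```python
import math

def schwan_vtm(field_v_per_cm, radius_um, theta_rad=0.0):
    """Steady-state Schwan relation: V_TM = 1.5 * E * r * cos(theta)."""
    e_v_per_m = field_v_per_cm * 100.0   # V/cm -> V/m
    r_m = radius_um * 1e-6               # micrometres -> metres
    return 1.5 * e_v_per_m * r_m * math.cos(theta_rad)

E_threshold = 692.0  # V/cm, the constant threshold reported for K562 cells

# Induced pole potential at that field for two hypothetical radii; under
# the classical ~1 V poration criterion these would have to be equal.
for r_um in (8.0, 12.0):
    print(f"r = {r_um:4.1f} um -> V_TM = {schwan_vtm(E_threshold, r_um):.2f} V")
```

    At the constant 692 V/cm threshold reported above, the induced pole potential differs between the two radii, which is exactly why a constant-field threshold contradicts the classical ~1 V criterion.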

  3. Beaconless adaptive-optics technique for HEL beam control

    Khizhnyak, Anatoliy; Markov, Vladimir

    2016-05-01

    Effective performance of forthcoming laser systems capable of power delivery on a distant target requires an adaptive optics system to correct atmospheric perturbations on the laser beam. The turbulence-induced effects are responsible for beam wobbling, wandering, and intensity scintillation, resulting in degradation of the beam quality and power density on the target. Adaptive optics methods are used to compensate for these negative effects. In its turn, operation of the AOS system requires a reference wave that can be generated by the beacon on the target. This report discusses a beaconless approach for wavefront correction with its performance based on the detection of the target-scattered light. Postprocessing of the beacon-generated light field enables retrieval and detailed characterization of the turbulence-perturbed wavefront -data that is essential to control the adaptive optics module of a high-power laser system.

  4. Adaptive Remote-Sensing Techniques Implementing Swarms of Mobile Agents

    Cameron, S.M.; Loubriel, G.M.; Robinett, R.D. III; Stantz, K.M.; Trahan, M.W.; Wagner, J.S.

    1999-04-01

    This paper focuses on our recent work at Sandia National Laboratories toward engineering a physics-based swarm of mobile vehicles for distributed sensing applications. Our goal is to coordinate a sensor array that optimizes sensor coverage and multivariate signal analysis by implementing artificial intelligence and evolutionary computational techniques. These intelligent control systems integrate both globally operating decision-making systems and locally cooperative information-sharing modes using genetically-trained neural networks. Once trained, neural networks have the ability to enhance real-time operational responses to dynamical environments, such as obstacle avoidance, responding to prevailing wind patterns, and overcoming other natural obscurants or interferences (jammers). The swarm realizes a collective set of sensor neurons with simple properties, incorporating interactions based on basic community rules (potential fields) and complex interconnecting functions based on various neural network architectures. Therefore, the swarm is capable of redundant heterogeneous measurements, which furnish an additional degree of robustness and fault tolerance not afforded by conventional systems, while accomplishing such cognitive tasks as generalization, error correction, pattern recognition, and sensor fusion. The robotic platforms could be equipped with specialized sensor devices including transmit/receive dipole antennas, chemical or biological sniffers in combination with recognition analysis tools, communication modulators, and laser diodes. Our group has been studying the collective behavior of an autonomous, multi-agent system applied to emerging threat applications. To accomplish such tasks, research in the fields of robotics, sensor technology, and swarms is being conducted within an integrated program.
Mission scenarios under consideration include ground penetrating impulse radar (GPR) for detection of underground structures, airborne systems, and plume
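
    The "basic community rules (potential fields)" idea can be illustrated with a minimal sketch: each agent feels an attraction toward a shared goal and a pairwise repulsion from its neighbors. The gains, agent count, and explicit Euler update below are illustrative assumptions, not the Sandia models:

```python
import numpy as np

def potential_field_step(positions, goal, dt=0.1,
                         k_attract=1.0, k_repel=0.5, softening=1e-3):
    """One explicit-Euler update of a toy potential-field swarm."""
    forces = k_attract * (goal - positions)          # attraction to the goal
    n = len(positions)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            d = positions[i] - positions[j]
            dist3 = np.linalg.norm(d) ** 3 + softening
            forces[i] = forces[i] + k_repel * d / dist3   # pairwise repulsion
    return positions + dt * forces

rng = np.random.default_rng(0)
pos = rng.uniform(-5, 5, size=(6, 2))    # six agents scattered at random
goal = np.array([0.0, 0.0])
for _ in range(200):
    pos = potential_field_step(pos, goal)
print(np.round(pos, 2))
```

    With these rules the swarm's center of mass converges on the goal while the repulsion keeps the agents dispersed, a crude stand-in for coordinated sensor coverage.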

  5. Adaptive Remote-Sensing Techniques Implementing Swarms of Mobile Agents

    Asher, R.B.; Cameron, S.M.; Loubriel, G.M.; Robinett, R.D.; Stantz, K.M.; Trahan, M.W.; Wagner, J.S.

    1998-11-25

    In many situations, stand-off remote-sensing and hazard-interdiction techniques over realistic operational areas are often impractical and difficult to characterize. An alternative approach is to implement an adaptively deployable array of sensitive agent-specific devices. Our group has been studying the collective behavior of an autonomous, multi-agent system applied to chem/bio detection and related emerging threat applications. The current physics-based models we are using coordinate a sensor array for multivariate signal optimization and coverage as realized by a swarm of robots or mobile vehicles. These intelligent control systems integrate globally operating decision-making systems and locally cooperative learning neural networks to enhance real-time operational responses to dynamical environments, examples of which include obstacle avoidance, responding to prevailing wind patterns, and overcoming other natural obscurants or interferences. Collectively, sensor neurons with simple properties, interacting according to basic community rules, can accomplish complex interconnecting functions such as generalization, error correction, pattern recognition, sensor fusion, and localization. Neural nets provide a greater degree of robustness and fault tolerance than conventional systems in that minor variations or imperfections do not impair performance. The robotic platforms would be equipped with sensor devices that perform optical detection of biologicals in combination with multivariate chemical analysis tools based on genetic and neural network algorithms, laser-diode LIDAR analysis, ultra-wideband short-pulsed transmitting and receiving antennas, thermal imaging sensors, and optical communication technology providing robust data throughput pathways. Mission scenarios under consideration include ground penetrating radar (GPR) for detection of underground structures, airborne systems, and plume migration and mitigation. We will describe our

  6. Remote sensing techniques applied to seismic vulnerability assessment

    Juan Arranz, Jose; Torres, Yolanda; Hahgi, Azade; Gaspar-Escribano, Jorge

    2016-04-01

    Advances in remote sensing and photogrammetry techniques have increased the degree of accuracy and resolution in the record of the earth's surface. This has expanded the range of possible applications of these data. In this research, we have used these data to document the construction characteristics of the urban environment of Lorca, Spain. An exposure database has been created with the gathered information to be used in seismic vulnerability assessment. To this end, we have used data from photogrammetric flights of different periods, using orthorectified images in both the visible and infrared spectra. Furthermore, the analysis is completed using LiDAR data. From the combination of these data, it has been possible to delineate the building footprints and characterize the constructions with attributes such as the approximate date of construction, area, type of roof and even building materials. To carry out the calculation, we have developed different algorithms to compare images from different times, segment images, classify LiDAR data, and use the infrared data in order to remove vegetation or to compute roof surfaces with height, tilt and spectral fingerprint values. In addition, the accuracy of our results has been validated with ground truth data. Keywords: LiDAR, remote sensing, seismic vulnerability, Lorca
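
    The vegetation-removal step that uses the infrared data can be sketched with a standard NDVI threshold on red and near-infrared bands; the pixel values and the 0.3 threshold below are illustrative assumptions, not the paper's calibration:

```python
import numpy as np

def ndvi_mask(red, nir, threshold=0.3):
    """True where a pixel looks like vegetation (NDVI above threshold)."""
    ndvi = (nir - red) / (nir + red + 1e-9)   # small epsilon avoids 0/0
    return ndvi > threshold

# Four pixels: two vegetated (NIR well above red), two built-up.
red = np.array([[0.10, 0.30], [0.05, 0.25]])
nir = np.array([[0.50, 0.32], [0.40, 0.26]])
print(ndvi_mask(red, nir))
```

    Masking out the pixels flagged by such a test leaves the built surfaces from which footprints and roof attributes are extracted.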

  7. Applying Website Usability Testing Techniques to Promote E-services

    Abdel Nasser H. Zaied

    2015-09-01

    Full Text Available In this competitive world, websites are considered a key aspect of any organization's competitiveness. In addition to visual esthetics, the usability of a website is a strong determinant of user satisfaction and pleasure. However, a lack of appropriate techniques and attributes for measuring usability may constrain the usefulness of a website. To address this issue, we conduct a statistical study to evaluate the usability levels of e-learning and e-training websites based on human (user) perception. The questionnaire is implemented as a user-based tool; visitors of a website can use it to evaluate the usability of the website. The results showed that, from the students' point of view, personalization is the most important criterion for the use of e-learning websites, while from the experts' point of view, accessibility is the most important criterion. The results also indicated that experienced respondents were satisfied with the usability attributes of the e-learning websites they accessed for their learning purposes, while inexperienced students expressed their perception of the importance of the usability attributes for accessing e-learning websites. When combining and comparing both findings, it is evident that all the attributes yielded satisfaction and were felt to be important.

  8. Digital prototyping technique applied for redesigning plastic products

    Pop, A.; Andrei, A.

    2015-11-01

    After products have been on the market for some time, they often need to be redesigned to meet new market requirements. New products are generally derived from similar but outdated products. Redesigning a product is an important part of the production and development process. The purpose of this paper is to show that using modern technology like Digital Prototyping in industry is an effective way to produce new products. This paper tries to demonstrate and highlight the effectiveness of the concept of Digital Prototyping, both to reduce the design time of a new product and to reduce the costs required for implementing this step. The results of this paper show that using Digital Prototyping techniques to design a new product from the mould of an existing one available on the market offers a significant reduction in manufacturing time and cost. The ability to simulate and test a new product with modern CAD-CAM programs in all aspects of production (design of the 3D model, simulation of structural resistance, analysis of the injection process and beautification) offers a helpful tool for engineers. The whole process can be realised by one skilled engineer very quickly and effectively.

  9. Semantic Data And Visualization Techniques Applied To Geologic Field Mapping

    Houser, P. I. Q.; Royo-Leon, M.; Munoz, R.; Estrada, E.; Villanueva-Rosales, N.; Pennington, D. D.

    2015-12-01

    Geologic field mapping involves the use of technology before, during, and after visiting a site. Geologists utilize hardware such as Global Positioning Systems (GPS) connected to mobile computing platforms such as tablets, with software such as ESRI's ArcPad, to produce maps and figures for a final analysis and report. Handwritten field notes contain important information and drawings or sketches of specific areas within the field study. Our goal is to collect and geo-tag final and raw field data into a cyber-infrastructure environment with an ontology that allows for large-scale data processing, visualization, sharing, and searching, aiding in connecting field research with prior research in the same area and/or aiding with experiment replication. Online searches of a specific field area return results such as weather data from NOAA and QuakeML seismic data from USGS. These results can then be saved to a field mobile device and searched while in the field where there is no Internet connection. To accomplish this we created the GeoField ontology service using the Web Ontology Language (OWL) and the Protégé software. Advanced queries on the dataset can be made using reasoning capabilities that go beyond those of a standard database service. These improvements include the automated discovery of data relevant to a specific field site and visualization techniques aimed at enhancing analysis and collaboration while in the field by draping data over mobile views of the site using augmented reality. A case study is being performed at the University of Texas at El Paso's Indio Mountains Research Station located near Van Horn, Texas, an active multi-disciplinary field study site. The user can interactively move the camera around the study site and view their data digitally. Geologists can check their data against the site in real time and improve collaboration, as both parties have the same interactive view of the data.

  10. Sterile insect technique applied to Queensland fruit fly

    The Sterile Insect Technique (SIT) aims to suppress or eradicate pest populations by flooding wild populations with sterile males. To control fruit fly, millions of flies of both sexes are mass-reared at the Gosford Post-Harvest laboratory near Sydney, mixed with sawdust and fluorescent dye at the pupal stage and transported to ANSTO, where they are exposed to a low dose of 70-75 Gy of gamma radiation from a Cobalt-60 source. Following irradiation, the pupae are transported to the release site in plastic sleeves and then transferred to large plastic garbage bins for hatching. These bins are held at 30 deg. C to synchronise hatching, and flies are released 48-72 hours after hatching begins. In most cases these bins are placed among fruit trees in the form of an 800 metre grid. This maximises survival of the emerging flies, which are released on an almost daily basis. Progress of the SIT program is monitored by collecting flies from traps dotted all over the infested site. The ratio of sterile to wild flies can be determined because the sterile flies are coated with the fluorescent dust, which can be seen under ultraviolet light. If the SIT program is successful, entomologists will trap a high proportion of sterile flies relative to wild flies, and this should result in a clear reduction in maggot infestations. Surveillance, quarantine, and trapping activities continue for 8 or 9 months to check for any surviving pockets of infestation. If any are found, the SIT program is reactivated. These programs demonstrated that SIT is an efficient and environmentally friendly non-chemical control method for eradicating outbreaks or suppressing fruit fly populations in important fruit growing areas.

  11. Applying data mining techniques to improve diagnosis in neonatal jaundice

    Ferreira Duarte

    2012-12-01

    Full Text Available Abstract Background Hyperbilirubinemia is emerging as an increasingly common problem in newborns due to decreasing hospital lengths of stay after birth. Jaundice is the most common disease of the newborn and, although benign in most cases, it can lead to severe neurological consequences if poorly evaluated. In different areas of medicine, data mining has contributed to improving the results obtained with other methodologies. Hence, the aim of this study was to improve the diagnosis of neonatal jaundice with the application of data mining techniques. Methods This study followed the phases of the Cross Industry Standard Process for Data Mining model as its methodology. This observational study was performed at the Obstetrics Department of a central hospital (Centro Hospitalar Tâmega e Sousa – EPE), from February to March of 2011. A total of 227 healthy newborn infants with 35 or more weeks of gestation were enrolled in the study. Over 70 variables were collected and analyzed. Also, transcutaneous bilirubin levels were measured from birth to hospital discharge with maximum time intervals of 8 hours between measurements, using a noninvasive bilirubinometer. Different attribute subsets were used to train and test classification models using algorithms included in the Weka data mining software, such as decision trees (J48) and neural networks (multilayer perceptron). The accuracy results were compared with the traditional methods for prediction of hyperbilirubinemia. Results The application of different classification algorithms to the collected data allowed predicting subsequent hyperbilirubinemia with high accuracy. In particular, at 24 hours of life, the accuracy for the prediction of hyperbilirubinemia was 89%. The best results were obtained using the following algorithms: naive Bayes, multilayer perceptron and simple logistic. Conclusions The findings of our study indicate that new approaches, such as data mining, may support
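
    A rough sketch of the classification step is below, using scikit-learn's CART tree and multilayer perceptron as stand-ins for Weka's J48 and multilayer perceptron. The data are synthetic (the feature names, distributions, and label rule are invented for illustration); only the cohort size matches the study:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(42)
n = 227  # same cohort size as the study; the data itself is synthetic

# Hypothetical predictors: gestational age (weeks), birth weight (g),
# transcutaneous bilirubin at 24 h (mg/dL).
X = np.column_stack([
    rng.uniform(35, 42, n),
    rng.normal(3300, 450, n),
    rng.uniform(2, 12, n),
])
# Synthetic label loosely tied to the 24 h bilirubin measurement.
y = (X[:, 2] + rng.normal(0, 1.5, n) > 8).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
models = {
    "J48-like tree": DecisionTreeClassifier(max_depth=4, random_state=0),
    "MLP": make_pipeline(StandardScaler(),
                         MLPClassifier(hidden_layer_sizes=(16,),
                                       max_iter=2000, random_state=0)),
}
accuracies = {}
for name, model in models.items():
    accuracies[name] = model.fit(X_tr, y_tr).score(X_te, y_te)
    print(name, f"accuracy = {accuracies[name]:.2f}")
```

    On real data the workflow is the same: build attribute subsets, fit several classifiers, and compare held-out accuracy against the traditional predictors.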

  12. A hybrid adaptive large neighborhood search algorithm applied to a lot-sizing problem

    Muller, Laurent Flindt; Spoorendonk, Simon

    2010-01-01

    This paper presents a hybrid of a general heuristic framework, which has been successfully applied to vehicle routing problems, and a general-purpose MIP solver. The framework uses local search and an adaptive procedure which chooses between a set of large neighborhoods to be searched. A mixed integer programming solver and its built-in feasibility heuristics are used to search a neighborhood for improving solutions. The general reoptimization approach used for repairing solutions is specifically ...
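
    The adaptive layer of such a framework can be sketched as follows: neighborhood weights are updated from observed success, and a roulette-wheel draw picks the next neighborhood. The move operators below are toy stand-ins on a toy objective, not the paper's MIP-based repair:

```python
import random

def alns_minimize(x0, neighborhoods, objective, iters=500, reaction=0.2):
    """Hill-climbing ALNS skeleton with roulette-wheel neighborhood choice."""
    x_best = x_cur = x0
    f_best = f_cur = objective(x0)
    weights = [1.0] * len(neighborhoods)
    rnd = random.Random(7)
    for _ in range(iters):
        i = rnd.choices(range(len(neighborhoods)), weights=weights)[0]
        x_new = neighborhoods[i](x_cur, rnd)
        f_new = objective(x_new)
        score = 0.0
        if f_new < f_best:
            x_best, f_best, score = x_new, f_new, 2.0   # reward a new best
        if f_new < f_cur:
            x_cur, f_cur = x_new, f_new
            score = max(score, 1.0)                     # reward an improvement
        # Exponential smoothing of the chosen neighborhood's weight,
        # floored so every neighborhood stays reachable.
        weights[i] = max((1 - reaction) * weights[i] + reaction * score, 0.05)
    return x_best, f_best

# Toy stand-in for a lot-sizing objective: sum of squares over integers.
def small_move(x, rnd):
    y = list(x)
    j = rnd.randrange(len(y))
    y[j] += rnd.choice([-1, 1])
    return y

def big_move(x, rnd):
    return [v + rnd.choice([-2, -1, 0, 1, 2]) for v in x]

best, f = alns_minimize([5, -7, 3, 9], [small_move, big_move],
                        lambda x: sum(v * v for v in x))
print("best =", best, "objective =", f)
```

    In the paper's setting the "neighborhoods" would be destroy operators and the repair step would be delegated to the MIP solver's feasibility heuristics.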

  13. Correction of respiratory motion for IMRT using aperture adaptive technique and visual guidance: A feasibility study

    Intensity-modulated radiation therapy (IMRT) utilizes a nonuniform beam profile to deliver precise radiation doses to a tumor while minimizing radiation exposure to surrounding normal tissues. However, intrafraction organ motion distorts the dose distribution and leads to significant dosimetric errors. In this research, we applied an aperture adaptive technique with a visual guiding system to tackle the problem of respiratory motion. A homemade computer program showing a cyclic moving pattern was projected onto the ceiling to visually help patients adjust their respiratory patterns. Once the respiratory motion becomes regular, the leaf sequence can be synchronized with the target motion. An oscillator was employed to simulate the patient's breathing pattern. Two simple fields and one IMRT field were measured to verify the accuracy. Preliminary results showed that after appropriate training, the amplitude and duration of a volunteer's breathing can be well controlled with the visual guiding system. The sharp dose gradient at the edge of the radiation fields was successfully restored. The maximum dosimetric error in the IMRT field was significantly decreased from 63% to 3%. We conclude that the aperture adaptive technique with the visual guiding system can be an inexpensive and feasible alternative without compromising delivery efficiency in clinical practice

  14. Techniques for valuing adaptive capacity in flood risk management

    Brisley, Rachel; Wylde, Richard; Lamb, Rob; Cooper, Jonathan; Sayers, Paul; Hall, Jim

    2015-01-01

    Flood and coastal erosion risk management has always faced the challenge of decision making in the face of multiple uncertainties relating to the climate, the economy and society. Traditionally, this has been addressed by adopting a precautionary approach that seeks to protect against a reasonable worst case. However, a managed adaptive approach can offer advantages. The benefits include improved resilience to negative changes, enabling opportunities from positive changes and greater cost eff...

  15. Interesting Metrics Based Adaptive Prediction Technique for Knowledge Discovery

    G. Anbukkarasy; N. Sairam

    2013-01-01

    Prediction is considered an important means of inferring future results from existing information. Decision tree methodology is widely used for prediction, but it is not efficient at handling large, heterogeneous, or multi-featured data sources. Hence, an adaptive prediction method is proposed by combining the statistical analysis approach of data mining methods with the decision tree prediction methodology. So when dealing with large and multi-server ba...

  16. Adaptive Communication Techniques for the Internet of Things

    Peng Du; George Roussos

    2013-01-01

    The vision for the Internet of Things (IoT) demands that material objects acquire communications and computation capabilities and become able to automatically identify themselves through standard protocols and open systems, using the Internet as their foundation. Yet, several challenges still must be addressed for this vision to become a reality. A core ingredient in such development is the ability of heterogeneous devices to communicate adaptively so as to make the best of limited spectrum a...

  17. Experimental Investigation on Adaptive Robust Controller Designs Applied to Constrained Manipulators

    Marco H. Terra

    2013-04-01

    Full Text Available In this paper, two interlaced studies are presented. The first is directed to the design and construction of a dynamic 3D force/moment sensor. The device is applied to provide a feedback signal of forces and moments exerted by the robotic end-effector. This development has become an alternative solution to the existing multi-axis load cell based on static force and moment sensors. The second one shows an experimental investigation on the performance of four different adaptive nonlinear H∞ control methods applied to a constrained manipulator subject to uncertainties in the model and external disturbances. Coordinated position and force control is evaluated. Adaptive procedures are based on neural networks and fuzzy systems applied in two different modeling strategies. The first modeling strategy requires a well-known nominal model for the robot, so that the intelligent systems are applied only to estimate the effects of uncertainties, unmodeled dynamics and external disturbances. The second strategy considers that the robot model is completely unknown and, therefore, intelligent systems are used to estimate these dynamics. A comparative study is conducted based on experimental implementations performed with an actual planar manipulator and with the dynamic force sensor developed for this purpose.

  18. Learning Rate Updating Methods Applied to Adaptive Fuzzy Equalizers for Broadband Power Line Communications

    Ribeiro Moisés V

    2004-01-01

    Full Text Available This paper introduces adaptive fuzzy equalizers with variable step size for broadband power line (PL) communications. Based on delta-bar-delta and local Lipschitz estimation updating rules, and feedforward and decision-feedback approaches, we propose singleton and nonsingleton fuzzy equalizers with variable step size to cope with the intersymbol interference (ISI) effects of PL channels and the severity of the impulse noise generated by appliances and nonlinear loads connected to low-voltage power grids. The computed results show that the convergence rates of the proposed equalizers are higher than those attained by the traditional adaptive fuzzy equalizers introduced by J. M. Mendel and his students. Additionally, some interesting BER curves reveal that the proposed techniques are efficient for mitigating the above-mentioned impairments.
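
    The delta-bar-delta idea, a per-weight step size that grows when successive gradient signs agree and shrinks when they flip, can be sketched on a plain LMS linear equalizer (the paper applies it to fuzzy equalizers; the constants and the toy channel below are illustrative assumptions):

```python
import numpy as np

def dbd_lms(x, d, taps=5, kappa=1e-3, phi=0.5, theta=0.7,
            mu0=0.01, mu_max=0.05):
    """LMS equalizer whose per-tap step sizes follow a delta-bar-delta rule."""
    w = np.zeros(taps)
    mu = np.full(taps, mu0)       # one adaptive step size per tap
    bar_delta = np.zeros(taps)    # exponentially smoothed gradient trace
    err = np.empty(len(x) - taps + 1)
    for n in range(taps - 1, len(x)):
        u = x[n - taps + 1:n + 1][::-1]   # regressor, newest sample first
        e = d[n] - w @ u
        delta = e * u                      # instantaneous gradient estimate
        agree = delta * bar_delta
        # Grow additively on consistent gradient signs, shrink
        # multiplicatively on sign flips (delta-bar-delta rule).
        mu = np.where(agree > 0, np.minimum(mu + kappa, mu_max),
                      np.where(agree < 0, mu * phi, mu))
        w = w + mu * delta
        bar_delta = (1 - theta) * delta + theta * bar_delta
        err[n - taps + 1] = e
    return w, err

rng = np.random.default_rng(1)
s = rng.choice([-1.0, 1.0], size=4000)   # BPSK-like training symbols
channel = np.array([1.0, 0.4, -0.2])     # toy minimum-phase ISI channel
x = np.convolve(s, channel)[:len(s)]
w, err = dbd_lms(x, s)
print("MSE, first 200 samples:", round(float(np.mean(err[:200] ** 2)), 3))
print("MSE, last 200 samples: ", round(float(np.mean(err[-200:] ** 2)), 3))
```

    The same step-size bookkeeping carries over to a fuzzy equalizer by applying it to each consequent parameter instead of each FIR tap.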

  19. Adapted strategic planning model applied to small business: a case study in the fitness area

    Eduarda Tirelli Hennig

    2012-06-01

    Full Text Available Strategic planning is an important management tool in the corporate scenario and should not be restricted to big companies. However, this kind of planning process in small businesses may need special adaptations due to their own characteristics. This paper aims to identify and adapt existing models of strategic planning to the scenario of a small business in the fitness area. Initially, a comparative study is conducted among models of different authors to identify their phases and activities. Then, it is defined which of these phases and activities should be present in a model to be used in a small business. That model was applied to a Pilates studio; it involves the establishment of an organizational identity, an environmental analysis, as well as the definition of strategic goals, strategies and actions to reach them. Finally, benefits to the organization could be identified, as well as hurdles in the implementation of the tool.

  20. Fast Spectral Velocity Estimation Using Adaptive Techniques: In-Vivo Results

    Gran, Fredrik; Jakobsson, Andreas; Udesen, Jesper; Jensen, Jørgen Arendt

    Adaptive spectral estimation techniques are known to provide good spectral resolution and contrast even when the observation window (OW) is very short. In this paper two adaptive techniques are tested and compared to the averaged periodogram (Welch) for blood velocity estimation. The Blood Power spectral Capon (BPC) method is based on a standard minimum variance technique adapted to account for both averaging over slow-time and depth. The Blood Amplitude and Phase Estimation technique (BAPES) is based on finding a set of matched filters (one for each velocity component of interest), filtering the blood process over slow-time, and averaging over depth to find the power spectral density estimate. In this paper, the two adaptive methods are explained, and performance is assessed in controlled steady-flow experiments and in-vivo measurements. The three methods were tested on a circulating flow ...

  1. Improving Skill Development: An Exploratory Study Comparing a Philosophical and an Applied Ethical Analysis Technique

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-01-01

    This exploratory study compares and contrasts two types of critical thinking techniques; one is a philosophical and the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT that a recent media article raised to demonstrate their ability to develop the ethical analysis skills of…

  2. A hybrid adaptive large neighborhood search algorithm applied to a lot-sizing problem

    Muller, Laurent Flindt; Spoorendonk, Simon

    This paper presents a hybrid of a general heuristic framework that has been successfully applied to vehicle routing problems and a general purpose MIP solver. The framework uses local search and an adaptive procedure which chooses between a set of large neighborhoods to be searched. A mixed integer programming solver and its built-in feasibility heuristics are used to search a neighborhood for improving solutions. The general reoptimization approach used for repairing solutions is specifically suited for combinatorial problems where it may be hard to otherwise design operations to define a neighborhood

  3. An adaptive range-query optimization technique with distributed replicas

    Sayar Ahmet; Pierce Marlon; Fox C.Geoffrey

    2014-01-01

    Replication is an approach often used to speed up the execution of queries submitted to a large dataset. A compile-time/run-time approach is presented for minimizing the response time of 2-dimensional range queries when distributed replicas of a dataset exist. The aim is to partition the query payload (and its range) into subsets and distribute those to the replica nodes in a way that minimizes the client's response time. However, since the query size and the distribution characteristics of the data (dense and sparse regions) in varying ranges are not known a priori, performing efficient load balancing and parallel processing over the unpredictable workload is difficult. A technique based on the creation and manipulation of dynamic spatial indexes for query payload estimation in distributed queries is proposed. The effectiveness of this technique is demonstrated on queries for the analysis of archived earthquake-generated seismic data records.
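
    The payload-estimation idea can be sketched with a coarse grid histogram standing in for the dynamic spatial index: the query range is cut so that each replica receives roughly equal estimated payload. The dataset, bin count, and replica count below are illustrative assumptions:

```python
import numpy as np

def build_index(points, bins=128, extent=(0.0, 1.0)):
    """Coarse 1D histogram over x: the payload estimate per bin."""
    hist, edges = np.histogram(points[:, 0], bins=bins, range=extent)
    return hist, edges

def partition_query(x_lo, x_hi, hist, edges, n_replicas):
    """Cut [x_lo, x_hi] into n_replicas subranges of roughly equal payload."""
    centers = (edges[:-1] + edges[1:]) / 2
    load = np.where((centers >= x_lo) & (centers <= x_hi), hist, 0)
    cum = np.cumsum(load.astype(float))
    total = cum[-1]
    cuts = [x_lo]
    for k in range(1, n_replicas):
        idx = int(np.searchsorted(cum, total * k / n_replicas))
        cuts.append(float(edges[idx + 1]))   # cut at the next bin edge
    cuts.append(x_hi)
    return list(zip(cuts[:-1], cuts[1:]))

rng = np.random.default_rng(5)
# Skewed dataset: a dense cluster near x = 0.8 over a sparse background.
pts = np.concatenate([
    rng.uniform(0, 1, size=(2000, 2)),
    np.column_stack([rng.normal(0.8, 0.03, 8000), rng.uniform(0, 1, 8000)]),
])
hist, edges = build_index(pts)
parts = partition_query(0.0, 1.0, hist, edges, n_replicas=4)
for lo, hi in parts:
    est = int(np.sum((pts[:, 0] >= lo) & (pts[:, 0] < hi)))
    print(f"[{lo:.3f}, {hi:.3f}) -> ~{est} points")
```

    On skewed data the cuts become narrow around the dense region, which is the load-balancing behavior a uniform split of the range would miss.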

  4. Adaptive Ant Colony Clustering Method Applied to Finding Closely Communicating Community

    Yan Liu

    2012-02-01

    Full Text Available The investigation of community structures in networks is an important issue in many domains and disciplines. A closely communicating community differs from the traditional notion of community, which emphasizes structure or context. Our previous method placed more emphasis on the feasibility of applying an ant colony algorithm to community detection; however, the essence of a closely communicating community was not described clearly. In this paper, the definition of a closely communicating community is put forward first, its four features are described, and corresponding methods are introduced to compute the feature values between each pair of nodes. Meanwhile, pair propinquity and local propinquity are put forward and used to guide the ants' decisions. Building on the previous work, the closely communicating community detection method is improved in four aspects of adaptive adjustment: entropy-based weight modulation; combining historical paths and random wandering to select the next coordinate; a strategy of forced unloading; and adaptive change of the ants' eyesight. The selection of parameter values is discussed in the experiments section, and the results also reveal the improvement of our algorithm in adaptive adjustment.

  5. Applying Web Usability Techniques to Assess Student Awareness of Library Web Resources

    Krueger, Janice; Ray, Ron L.; Knight, Lorrie

    2004-01-01

    The authors adapted Web usability techniques to assess student awareness of their library's Web site. Students performed search tasks using a Web browser. Approaches were categorized according to a student's preference for, and success with, the library's Web resources. Forty-five percent of the students utilized the library's Web site as first…

  6. Robust breathing signal extraction from cone beam CT projections based on adaptive and global optimization techniques.

    Chao, Ming; Wei, Jie; Li, Tianfang; Yuan, Yading; Rosenzweig, Kenneth E; Lo, Yeh-Chi

    2016-04-21

    We present a study of extracting respiratory signals from cone beam computed tomography (CBCT) projections within the framework of the Amsterdam Shroud (AS) technique. Acquired prior to the radiotherapy treatment, CBCT projections were preprocessed for contrast enhancement by converting the original intensity images to attenuation images, from which the AS image was created. An adaptive robust z-normalization filtering was applied to further augment the weak oscillating structures locally. From the enhanced AS image, the respiratory signal was extracted using a two-step optimization approach to effectively reveal the large-scale regularity of the breathing signals. CBCT projection images from five patients acquired with the Varian Onboard Imager on the Clinac iX System Linear Accelerator (Varian Medical Systems, Palo Alto, CA) were employed to assess the proposed technique. Stable breathing signals could be reliably extracted using the proposed algorithm. Reference waveforms obtained using an air bellows belt (Philips Medical Systems, Cleveland, OH) were exported and compared to the AS-based signals. The average error for the enrolled patients between the estimated breaths per minute (bpm) and the reference waveform bpm was as low as -0.07, with a standard deviation of 1.58. The new algorithm outperformed the original AS technique for all patients by 8.5% to 30%. The impact of gantry rotation on the breathing signal was assessed with data acquired with a Quasar phantom (Modus Medical Devices Inc., London, Canada) and found to be minimal on the signal frequency. The new technique developed in this work will provide a practical solution to rendering markerless breathing signals from CBCT projections for thoracic and abdominal patients. PMID:27008349
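The robust z-normalization step mentioned above can be illustrated with a median/MAD variant applied per row of the AS image. This is a global per-row sketch under assumed conventions; the paper's filter is adaptive and local, and the function name is hypothetical.

```python
import numpy as np

def robust_z_normalize(as_image, eps=1e-9):
    """Row-wise robust z-normalization of an Amsterdam Shroud image:
    centre each row by its median and scale by its MAD, so weak
    oscillating structures are amplified while outliers and slowly
    varying background have limited influence on the scaling."""
    med = np.median(as_image, axis=1, keepdims=True)
    mad = np.median(np.abs(as_image - med), axis=1, keepdims=True)
    # 1.4826 makes the MAD a consistent sigma estimate for Gaussian data
    return (as_image - med) / (1.4826 * mad + eps)
```

Because the median and MAD ignore extreme values, a single bright artifact in a row does not flatten the normalized oscillation the way a mean/standard-deviation z-score would.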

  7. Robust breathing signal extraction from cone beam CT projections based on adaptive and global optimization techniques

    Chao, Ming; Wei, Jie; Li, Tianfang; Yuan, Yading; Rosenzweig, Kenneth E.; Lo, Yeh-Chi

    2016-04-01

    We present a study of extracting respiratory signals from cone beam computed tomography (CBCT) projections within the framework of the Amsterdam Shroud (AS) technique. Acquired prior to the radiotherapy treatment, CBCT projections were preprocessed for contrast enhancement by converting the original intensity images to attenuation images, from which the AS image was created. An adaptive robust z-normalization filtering was applied to further augment the weak oscillating structures locally. From the enhanced AS image, the respiratory signal was extracted using a two-step optimization approach to effectively reveal the large-scale regularity of the breathing signals. CBCT projection images from five patients acquired with the Varian Onboard Imager on the Clinac iX System Linear Accelerator (Varian Medical Systems, Palo Alto, CA) were employed to assess the proposed technique. Stable breathing signals could be reliably extracted using the proposed algorithm. Reference waveforms obtained using an air bellows belt (Philips Medical Systems, Cleveland, OH) were exported and compared to the AS-based signals. The average error for the enrolled patients between the estimated breaths per minute (bpm) and the reference waveform bpm was as low as -0.07, with a standard deviation of 1.58. The new algorithm outperformed the original AS technique for all patients by 8.5% to 30%. The impact of gantry rotation on the breathing signal was assessed with data acquired with a Quasar phantom (Modus Medical Devices Inc., London, Canada) and found to be minimal on the signal frequency. The new technique developed in this work will provide a practical solution to rendering markerless breathing signals from CBCT projections for thoracic and abdominal patients.

  8. An adaptive laser beam shaping technique based on a genetic algorithm

    Ping Yang; Yuan Liu; Wei Yang; Minwu Ao; Shijie Hu; Bing Xu; Wenhan Jiang

    2007-01-01

    A new adaptive beam intensity shaping technique based on the combination of a 19-element piezoelectric deformable mirror (DM) and a global genetic algorithm is presented. This technique adaptively adjusts the voltages of the 19 actuators on the DM to reduce the difference between the target beam shape and the actual beam shape. Numerical simulations and experimental results show that, within the stroke range of the DM, this technique can be used to create given beam intensity profiles on the focal plane.
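The voltage-optimization loop described above can be sketched with a toy model. Everything here is an illustrative assumption, not the paper's optics: each of the 19 actuators is modeled as a Gaussian influence function on a 1-D intensity profile, and a simple elitist genetic algorithm searches the voltage vector that best reproduces a target profile.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy deformable-mirror model (assumed): each actuator contributes a
# Gaussian influence function to the focal-plane intensity profile.
n_act, n_pix = 19, 64
x = np.linspace(-1.0, 1.0, n_pix)
centers = np.linspace(-1.0, 1.0, n_act)
influence = np.exp(-((x[None, :] - centers[:, None]) ** 2) / 0.02)
target = np.exp(-x ** 2 / 0.1)  # desired intensity profile

def fitness(v):
    """Negative squared error between produced and target beam shapes."""
    return -float(np.sum((v @ influence - target) ** 2))

def ga_shape(pop_size=60, gens=200, sigma=0.05):
    """Simple elitist GA over actuator voltages (a sketch, not the paper's GA)."""
    pop = rng.uniform(-1, 1, (pop_size, n_act))
    for _ in range(gens):
        order = np.argsort([fitness(v) for v in pop])[::-1]
        elite = pop[order[: pop_size // 2]]           # keep the better half
        pa = elite[rng.integers(len(elite), size=pop_size)]
        pb = elite[rng.integers(len(elite), size=pop_size)]
        mask = rng.random((pop_size, n_act)) < 0.5    # uniform crossover
        pop = np.where(mask, pa, pb) + rng.normal(0, sigma, (pop_size, n_act))
        pop[0] = elite[0]                             # elitism: best survives unmutated
    return pop[0]

best_v = ga_shape()
```

The GA needs no gradient of the beam-shape error, which is why this family of methods suits hardware-in-the-loop shaping where only measured intensities are available.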

  9. Adaptive Input-Output Linearization Technique for Robust Speed Control of Brushless DC Motor

    Kim, Kyeong Hwa; Baik, In Cheol; Kim, Hyun Soo; Youn, Myung Joong [Korea Advance Institute of Science and Technology, Taejon (Korea, Republic of)

    1997-06-01

    An adaptive input-output linearization technique for robust speed control of a brushless DC (BLDC) motor is presented. By using this technique, the nonlinear motor model can be effectively linearized in Brunovsky canonical form, and the desired speed dynamics can be obtained based on the linearized model. This control technique, however, gives an undesirable output performance under mismatch of the system parameters and load conditions caused by incomplete linearization. For a robust output response, the controller parameters are estimated by a model reference adaptive technique in which the disturbance torque and flux linkage are estimated. The adaptation laws are derived by Popov's hyperstability theory and the positivity concept. The proposed control scheme is implemented on a BLDC motor using the DSP TMS320C30, and its effectiveness is verified through comparative simulations and experiments. (author). 14 refs., 12 figs., 1 tab.

  10. (Costing) The adaptation of product cost estimation techniques to estimate the cost of service.

    Huang, Estelle; Newnes, Linda B; Parry, Glenn

    2011-01-01

    This paper presents an approach to ascertain whether product cost estimating techniques can be adapted for use in estimating the costs of providing a service. The research methodology adopted consists of a critique and analysis of the literature to ascertain how current cost estimation techniques are used. The analysis of the cost estimation techniques provides knowledge of cost estimation, in particular for products and services, with advantages and drawbacks defined. Th...

  11. Applying Analytical Derivative and Sparse Matrix Techniques to Large-Scale Process Optimization Problems

    2000-01-01

    The performance of analytical derivative and sparse matrix techniques applied to a traditional dense sequential quadratic programming (SQP) is studied, and the strategy utilizing those techniques is also presented. Computational results on two typical chemical optimization problems demonstrate significant enhancement in efficiency, which shows this strategy is promising and suitable for large-scale process optimization problems.

  12. Applying Analytical Derivative and Sparse Matrix Techniques to Large-Scale Process Optimization Problems

    钟卫涛; 邵之江; 张余岳; 钱积新

    2000-01-01

    The performance of analytical derivative and sparse matrix techniques applied to a traditional dense sequential quadratic programming (SQP) is studied, and the strategy utilizing those techniques is also presented.Computational results on two typical chemical optimization problems demonstrate significant enhancement in efficiency, which shows this strategy is promising and suitable for large-scale process optimization problems.

  13. Adaptive and model-based control theory applied to convectively unstable flows

    Fabbiane, N; Bagheri, S; Henningson, D S

    2014-01-01

    Research on active control for the delay of laminar-turbulent transition in boundary layers has made significant progress in the last two decades, but the employed strategies have been many and dispersed. Using one framework, we review model-based techniques, such as linear-quadratic regulators, and model-free adaptive methods, such as least-mean-square filters. The former are supported by an elegant and powerful theoretical basis, whereas the latter may provide a more practical approach in the presence of complex disturbance environments that are difficult to model. We compare the methods with a particular focus on efficiency, practicability and robustness to uncertainties. Each step is exemplified on the one-dimensional linearized Kuramoto-Sivashinsky equation, which shows many similarities with the initial linear stages of the transition process of the flow over a flat plate. The source code for the examples is also provided.
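The model-free side of the comparison above centres on the least-mean-squares (LMS) filter. A minimal sketch of the standard LMS update, here applied to identifying an unknown FIR response rather than to the flow-control setting of the paper:

```python
import numpy as np

def lms_filter(x, d, n_taps=4, mu=0.05):
    """Least-mean-squares adaptive FIR filter: the weights w are nudged
    along the instantaneous error gradient so that the filter output
    w . [x_k, x_{k-1}, ...] tracks the desired signal d_k."""
    w = np.zeros(n_taps)
    y = np.zeros(len(x))
    for k in range(n_taps - 1, len(x)):
        u = x[k - n_taps + 1:k + 1][::-1]   # newest sample first
        y[k] = w @ u
        e = d[k] - y[k]                     # instantaneous error
        w += 2 * mu * e * u                 # stochastic-gradient step
    return y, w

# System identification: recover an unknown 4-tap FIR response from data
rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
h = np.array([0.5, -0.3, 0.1, 0.05])
d = np.convolve(x, h)[: len(x)]
y, w = lms_filter(x, d)
```

No model of the plant enters the update, which is exactly the practical appeal the review attributes to adaptive methods in hard-to-model disturbance environments.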

  14. Adaptive Readout Technique For A Sixteen Channel Peak Sensing ADC In the FERA Format

    An adaptive, variable block-size readout technique for use with multiple, sixteen-channel CAMAC ADCs with a FERA-bus readout has been developed and designed. It can be used to read data from experiments with or without coincidence, i.e. singles, without having to change the readout protocol. Details of the implementation are discussed and initial results are presented. Further applications of the adaptive readout are also discussed

  15. Adaptive Pointing Design and Evaluation of a Precision Enhancing Technique for Absolute Pointing Devices

    König, Werner A.; Gerken, Jens; Dierdorf, Stefan; Reiterer, Harald

    2009-01-01

    We present Adaptive Pointing, a novel approach to addressing the common problem of accuracy when using absolute pointing devices for distant interaction. First, we discuss extensively some related work concerning the problem-domain of pointing accuracy when using absolute or relative pointing devices. As a result, we introduce a novel classification scheme to more clearly discriminate between different approaches. Second, the Adaptive Pointing technique is presented and described in detail. ...

  16. Comparison of nerve stimulator and ultrasonography as the techniques applied for brachial plexus anesthesia

    2011-01-01

    Background: Brachial plexus block is useful for upper extremity surgery, and many techniques are available. The aim of our study was to compare the efficacy of axillary brachial plexus block using an ultrasound (US) technique with the peripheral nerve stimulation (PNS) technique. Methods: 60 patients scheduled for surgery of the forearm or hand were randomly allocated into two groups (n = 30 per group). US was applied for Group 1 and PNS for Group 2. The quality and the onset of the sensorial and motor bl...

  17. Applying BI Techniques To Improve Decision Making And Provide Knowledge Based Management

    Alexandra Maria Ioana FLOREA

    2015-07-01

    Full Text Available The paper focuses on BI techniques, and especially data mining algorithms, that can support and improve the decision-making process, with applications within the financial sector. We consider data mining techniques to be the more efficient, and thus we applied several supervised and unsupervised learning algorithms. The case study in which these algorithms have been implemented regards the activity of a banking institution, with a focus on the management of lending activities.

  18. Comparison of different automatic adaptive threshold selection techniques for estimating discharge from river width

    Elmi, Omid; Javad Tourian, Mohammad; Sneeuw, Nico

    2015-04-01

    River discharge monitoring is critical for, e.g., water resource planning, climate change studies and hazard monitoring. River discharge has been measured at in situ gauges for more than a century. Despite various attempts, some basins are still ungauged. Moreover, a reduction in the number of worldwide gauging stations increases the interest in employing remote sensing data for river discharge monitoring. Finding an empirical relationship between simultaneous in situ measurements of discharge and river widths derived from satellite imagery has been introduced as a straightforward remote sensing alternative. Classifying water and land in an image is the primary task in defining the river width. Water appears dark in the near-infrared and infrared bands of satellite images; as a result, low values in the histogram usually represent the water content. Hence, applying a threshold to the image histogram and separating it into two classes is one of the most efficient techniques to build a water mask. Despite its simple definition, finding the appropriate threshold value in each image is the most critical issue. The threshold varies due to changes in the water level, river extent, atmosphere, sunlight radiation and onboard calibration of the satellite over time. These complexities in water body classification are the main source of error in river width estimation. In this study, we look for the most efficient adaptive threshold algorithm to estimate river discharge. To do this, all cloud-free MODIS images coincident with the in situ measurements are collected. Next, a number of automatic threshold selection techniques are employed to generate different dynamic water masks. Then, for each of them, a separate empirical relationship between river widths and discharge measurements is determined. Through these empirical relationships, we estimate river discharge at the gauge and then validate our results against in situ measurements and also
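One classic member of the family of automatic threshold selection techniques compared in such studies is Otsu's method. The sketch below applies it to a synthetic bimodal scene; the scene, function name and parameters are illustrative assumptions, not the study's data or code.

```python
import numpy as np

def otsu_threshold(image, n_bins=256):
    """Otsu's automatic threshold: choose the gray level that maximises
    the between-class variance of the image histogram."""
    hist, edges = np.histogram(image, bins=n_bins)
    p = hist / hist.sum()
    levels = 0.5 * (edges[:-1] + edges[1:])   # bin mid-points
    w0 = np.cumsum(p)                         # probability of the "dark" class
    m = np.cumsum(p * levels)                 # cumulative first moment
    mt = m[-1]                                # global mean
    w1 = 1.0 - w0
    valid = (w0 > 0) & (w1 > 0)
    between = np.zeros(n_bins)
    between[valid] = (mt * w0[valid] - m[valid]) ** 2 / (w0[valid] * w1[valid])
    return levels[np.argmax(between)]

# Toy near-infrared scene: dark water pixels vs brighter land pixels
rng = np.random.default_rng(1)
img = np.concatenate([rng.normal(0.2, 0.05, 5000),   # water
                      rng.normal(0.7, 0.05, 5000)])  # land
t = otsu_threshold(img)
water_mask = img < t
```

Because the threshold is recomputed from each image's own histogram, it adapts to changes in illumination and water level, which is the property the comparison above evaluates across methods.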

  19. Neural and fuzzy computation techniques for playout delay adaptation in VoIP networks.

    Ranganathan, Mohan Krishna; Kilmartin, Liam

    2005-09-01

    Playout delay adaptation algorithms are often used in real-time voice communication over packet-switched networks to counteract the effects of network jitter at the receiver. Whilst the conventional algorithms developed for silence-suppressed speech transmission focused on preserving the relative temporal structure of speech frames/packets within a talkspurt (intertalkspurt adaptation), more recently developed algorithms strive to achieve better quality by allowing for playout delay adaptation within a talkspurt (intratalkspurt adaptation). The adaptation algorithms, both intertalkspurt and intratalkspurt based, rely on short-term estimation of the characteristics of the network delay that will be experienced by upcoming voice packets. The use of novel neural networks and fuzzy systems as estimators of network delay characteristics is presented in this paper. Their performance is analyzed in comparison with a number of traditional techniques for both inter- and intratalkspurt adaptation paradigms. The design of a novel fuzzy trend analyzer system (FTAS) for network delay trend analysis and its usage in intratalkspurt playout delay adaptation are presented in greater detail. The performance of the proposed mechanism is analyzed based on measured Internet delays. Index terms: fuzzy delay trend analysis, intertalkspurt, intratalkspurt, multilayer perceptrons (MLPs), network delay estimation, playout buffering, playout delay adaptation, time delay neural networks (TDNNs), voice over Internet protocol (VoIP). PMID:16252825
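The traditional intertalkspurt techniques such estimators are compared against are typically autoregressive trackers in the style of Ramjee et al. A minimal sketch, with the conventional parameter values assumed for illustration:

```python
def playout_delay(delays, alpha=0.998002, beta=4.0):
    """Classic autoregressive playout-delay estimator: track a running
    mean d and variation v of the observed network delay, and schedule
    playout at d + beta * v so that most jittered packets still arrive
    before their playout instant."""
    d, v = delays[0], 0.0
    out = []
    for n in delays:
        d = alpha * d + (1 - alpha) * n           # smoothed delay estimate
        v = alpha * v + (1 - alpha) * abs(d - n)  # smoothed delay variation
        out.append(d + beta * v)
    return out
```

The neural and fuzzy estimators discussed in the paper replace this fixed linear recursion with learned, nonlinear short-term predictors of the delay characteristics.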

  20. Optimal control techniques for the adaptive optics system of the LBT

    Agapito, G.; Quiros-Pacheco, F.; Tesi, P.; Esposito, S.; Xompero, M.

    2008-07-01

    In this paper we discuss the application of different optimal control techniques for the adaptive optics system of the LBT telescope, which comprises a pyramid wavefront sensor and an adaptive secondary mirror. We have studied the application of both the Kalman and the H∞ filter to estimate the temporal evolution of the phase perturbations due to atmospheric turbulence and telescope vibrations. We have evaluated the performance of these control techniques with numerical simulations in preparation for the laboratory tests that will be carried out in the Arcetri laboratories.
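The Kalman-filter side of the comparison can be illustrated in one dimension. This is a toy stand-in for the estimation problem described above, with a slowly varying disturbance modelled as an AR(1) process and all numerical values assumed for illustration:

```python
import numpy as np

def kalman_track(measurements, a=0.99, q=0.01, r=0.1):
    """Scalar Kalman filter for a disturbance modelled as an AR(1) process
    x_k = a * x_{k-1} + w_k, observed in noise with variance r."""
    x, p = 0.0, 1.0
    out = []
    for y in measurements:
        x, p = a * x, a * a * p + q      # predict
        k = p / (p + r)                  # Kalman gain
        x += k * (y - x)                 # update with the innovation
        p *= (1 - k)
        out.append(x)
    return np.array(out)

# Track a synthetic AR(1) disturbance through noisy measurements
rng = np.random.default_rng(0)
truth = np.zeros(1000)
for i in range(1, 1000):
    truth[i] = 0.99 * truth[i - 1] + rng.normal(0, 0.1)
meas = truth + rng.normal(0, 0.1 ** 0.5, 1000)
est = kalman_track(meas)
```

The gain balances the model's prediction against each noisy measurement, which is why such filters can reject both turbulence-like drift and vibration-like measurement noise better than a raw sensor readout.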

  1. Constrained Optimization Based on Hybrid Evolutionary Algorithm and Adaptive Constraint-Handling Technique

    Wang, Yong; Cai, Zixing; Zhou, Yuren; Fan, Zhun

    2009-01-01

    A novel approach to deal with numerical and engineering constrained optimization problems, which incorporates a hybrid evolutionary algorithm and an adaptive constraint-handling technique, is presented in this paper. The hybrid evolutionary algorithm simultaneously uses simplex crossover and two...... four well-known constrained design problems verify the effectiveness and efficiency of the proposed method. The experimental results show that integrating the hybrid evolutionary algorithm with the adaptive constraint-handling technique is beneficial, and the proposed method achieves competitive...... performance with respect to some other state-of-the-art approaches in constrained evolutionary optimization....

  2. An efficient Video Segmentation Algorithm with Real time Adaptive Threshold Technique

    Yasira Beevi C P

    2009-12-01

    Full Text Available Automatic video segmentation plays an important role in real-time MPEG-4 encoding systems. This paper presents a video segmentation algorithm for an MPEG-4 camera system with change detection, background registration and real-time adaptive threshold techniques. The algorithm gives satisfying segmentation results with a low computation load. Besides, it has a shadow cancellation mode, which can deal with lighting changes and shadow effects. Furthermore, the algorithm implements real-time adaptive threshold techniques by which the parameters can be determined automatically.
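One step of the change-detection pipeline above can be sketched as follows. The specific update rules (a mean-plus-lambda-sigma threshold and a fixed stable-frame count for background registration) are assumptions for illustration, not the paper's exact formulas:

```python
import numpy as np

def segment(frame, prev, background, bg_count, fd_threshold, lam=2.0, stable=30):
    """One step of change-detection segmentation with background
    registration and an adaptive frame-difference threshold."""
    diff = np.abs(frame.astype(int) - prev.astype(int))
    changed = diff > fd_threshold
    # adapt the threshold to the statistics of the current difference image
    fd_threshold = diff.mean() + lam * diff.std()
    # pixels unchanged for `stable` consecutive frames join the background
    bg_count = np.where(changed, 0, bg_count + 1)
    background = np.where(bg_count >= stable, frame, background)
    # object mask: pixels differing from the registered background
    object_mask = np.abs(frame.astype(int) - background.astype(int)) > fd_threshold
    return object_mask, background, bg_count, fd_threshold

# A static frame produces an empty object mask
f = np.full((4, 4), 100, dtype=np.uint8)
mask, bg, cnt, thr = segment(f, f, f.copy(), np.zeros((4, 4), dtype=int), 10.0)
```

Recomputing the threshold every frame is what lets the parameters be "determined automatically" rather than hand-tuned per scene.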

  3. Research on key techniques of virtual reality applied in mining industry

    LIAO Jun; LU Guo-bin

    2009-01-01

    Starting from the applications of virtual reality technology in many fields, this paper introduces the basic concepts, system types and related technical developments of virtual reality, summarizes its current applications in the mining industry, and examines the core software and hardware techniques involved, especially the optimization of the setup of various 3D modelling techniques, realizing a virtual scene that can be explored in real time with stereoscopic display. It then puts forward software and hardware solutions for applying virtual reality to the mining industry that can satisfy demands at different levels and in different aspects. Finally, it shows a promising prospect for virtual reality techniques applied in the mining industry.

  4. Raviart–Thomas-type sources adapted to applied EEG and MEG: implementation and results

    This paper studies numerically electroencephalography and magnetoencephalography (EEG and MEG), two non-invasive imaging modalities in which external measurements of the electric potential and the magnetic field are, respectively, utilized to reconstruct the primary current density (neuronal activity) of the human brain. The focus is on adapting a Raviart–Thomas-type source model to meet the needs of EEG and MEG applications. The goal is to construct a model that provides an accurate approximation of dipole source currents and can be flexibly applied to different reconstruction strategies as well as to realistic computation geometries. The finite element method is applied in the simulation of the data. Least-squares fit interpolation is used to establish Cartesian source directions, which guarantee that the recovered current field is minimally dependent on the underlying finite element mesh. Implementation is explained in detail and made accessible, e.g., by using quadrature-free formulae and the Gaussian one-point rule in numerical integration. Numerical results are presented concerning, for example, the iterative alternating sequential inverse algorithm as well as resolution, smoothness and local refinement of the finite element mesh. Both spherical and pseudo-realistic head models, as well as real MEG data, are utilized in the numerical experiments. (paper)

  5. Raviart-Thomas-type sources adapted to applied EEG and MEG: implementation and results

    Pursiainen, S.

    2012-06-01

    This paper studies numerically electroencephalography and magnetoencephalography (EEG and MEG), two non-invasive imaging modalities in which external measurements of the electric potential and the magnetic field are, respectively, utilized to reconstruct the primary current density (neuronal activity) of the human brain. The focus is on adapting a Raviart-Thomas-type source model to meet the needs of EEG and MEG applications. The goal is to construct a model that provides an accurate approximation of dipole source currents and can be flexibly applied to different reconstruction strategies as well as to realistic computation geometries. The finite element method is applied in the simulation of the data. Least-squares fit interpolation is used to establish Cartesian source directions, which guarantee that the recovered current field is minimally dependent on the underlying finite element mesh. Implementation is explained in detail and made accessible, e.g., by using quadrature-free formulae and the Gaussian one-point rule in numerical integration. Numerical results are presented concerning, for example, the iterative alternating sequential inverse algorithm as well as resolution, smoothness and local refinement of the finite element mesh. Both spherical and pseudo-realistic head models, as well as real MEG data, are utilized in the numerical experiments.

  6. Investigation about the efficiency of the bioaugmentation technique when applied to diesel oil contaminated soils

    Adriano Pinto Mariano

    2009-10-01

    Full Text Available This work investigated the efficiency of the bioaugmentation technique when applied to diesel oil contaminated soils collected at three service stations. Batch biodegradation experiments were carried out in Bartha biometer flasks (250 mL) used to measure the microbial CO2 production. Biodegradation efficiency was also measured by quantifying the concentration of hydrocarbons. In addition to the biodegradation experiments, the capability of the studied cultures and the native microorganisms to biodegrade diesel oil purchased from a local service station was verified using a technique based on the redox indicator 2,6-dichlorophenol indophenol (DCPIP). Results obtained with this test showed that the inocula used in the biodegradation experiments were able to degrade the diesel oil, and the tests carried out with the native microorganisms indicated that these soils had a microbiota adapted to degrade the hydrocarbons. In general, no gain was obtained with the addition of microorganisms, and even negative effects were observed in the biodegradation experiments.

  7. Goal-based angular adaptivity applied to the spherical harmonics discretisation of the neutral particle transport equation

    Highlights: • A variable order spherical harmonics scheme is presented. • An adaptive process is proposed to automatically refine the angular resolution. • A regular error estimator and a goal-based error estimator are presented. • The adaptive methods are applied to fixed source and eigenvalue problems. • Adaptive methods give more accurate solutions than uniform angular resolution. - Abstract: A variable order spherical harmonics scheme has been described and employed for the solution of the neutral particle transport equation. The scheme is specifically described with application within the inner-element sub-grid scale finite element spatial discretisation. The angular resolution is variable across both the spatial and energy dimensions. That is, the order of the spherical harmonic expansion may differ at each node of the mesh for each energy group. The variable order scheme has been used to develop adaptive methods for the angular resolution of the particle transport phase-space. Two types of adaptive method have been developed and applied to examples. The first is regular adaptivity, in which the error in the solution over the entire domain is minimised. The second is goal-based adaptivity, in which the error in a specified functional is minimised. The methods were applied to fixed source and eigenvalue examples. Both methods demonstrate improved accuracy for a given number of degrees of freedom in the angular discretisation.

  8. Simulation of energy saving potential of a centralized HVAC system in an academic building using adaptive cooling technique

    Highlights: • We have simulated and validated the cooling loads of a multi-zone academic building, in a tropical region. • We have analyzed the effect of occupancy patterns on the cooling loads. • An adaptive cooling technique has been utilized to minimize the energy usage of the HVAC system. • The results are promising and show energy savings in the range of 20–30%. - Abstract: Application of adaptive comfort temperatures as room temperature set points can potentially reduce the energy usage of the HVAC system during cooling and heating periods. The savings are mainly due to a higher indoor temperature set point during hot periods and a lower indoor temperature set point during cold periods than the recommended values. Numerous works have been carried out to show how much energy can be saved during cooling and heating periods by applying adaptive comfort temperatures. The previous work, however, focused on a continuous cooling load as found in many office and residential buildings. Therefore, this paper aims to simulate the energy saving potential for a glazed academic building in the tropical Malaysian climate by developing an adaptive cooling technique. A building simulation program (TRNSYS) was used to model the building and simulate the cooling load characteristics using the current and proposed techniques. Two experimental measurements were conducted and the results were used to validate the model. Finally, the cooling load characteristics of the academic building using the current and proposed techniques were compared, and the results showed that an annual energy saving potential of as much as 305,150 kWh can be achieved.

  9. Frequency and Spatial Domains Adaptive-based Enhancement Technique for Thermal Infrared Images

    Debasis Chaudhuri

    2014-09-01

    Full Text Available A low-contrast, noisy image limits the amount of information conveyed to the user. With the proliferation of digital imagery and computer interfaces between man and machine, it is now viable to consider digitally enhancing the image before presenting it to the user, thus increasing the information throughput. With better contrast, target detection and discrimination can be improved. The paper presents a sequence of filtering operations in the frequency and spatial domains to improve the quality of thermal infrared (IR) images. Basically, two filters, a homomorphic filter followed by an adaptive Gaussian filter, are applied to improve the quality of the thermal IR images. We have systematically evaluated the algorithm on a variety of images and carefully compared it with the techniques presented in the literature. We performed an evaluation of three filter banks, homomorphic, Gaussian 5×5 and the proposed method, and we have seen that the proposed method yields the optimal PSNR for all the thermal images. The results demonstrate that the proposed algorithm is efficient for the enhancement of thermal IR images. Defence Science Journal, Vol. 64, No. 5, September 2014, pp. 451-457, DOI: http://dx.doi.org/10.14429/dsj.64.6873
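The homomorphic stage of the filter chain above can be sketched as follows. The parameter values and filter shape are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def homomorphic_filter(img, gamma_l=0.5, gamma_h=2.0, c=1.0, d0=0.1):
    """Homomorphic filtering: the log transform turns the
    illumination x reflectance image model into a sum, and a Gaussian
    high-emphasis filter in the Fourier domain compresses low-frequency
    illumination (gamma_l < 1) while boosting high-frequency
    reflectance detail (gamma_h > 1)."""
    z = np.log1p(img.astype(float))
    Z = np.fft.fftshift(np.fft.fft2(z))
    u = np.fft.fftshift(np.fft.fftfreq(img.shape[0]))[:, None]
    v = np.fft.fftshift(np.fft.fftfreq(img.shape[1]))[None, :]
    d2 = u ** 2 + v ** 2                      # squared distance from DC
    H = (gamma_h - gamma_l) * (1 - np.exp(-c * d2 / d0 ** 2)) + gamma_l
    out = np.real(np.fft.ifft2(np.fft.ifftshift(H * Z)))
    return np.expm1(out)                      # back from the log domain

# A vertical illumination gradient is flattened while detail is preserved
img = np.outer(np.linspace(1.0, 10.0, 32), np.ones(32))
enhanced = homomorphic_filter(img)
```

In the full chain described above, an adaptive Gaussian smoothing stage would follow in the spatial domain to suppress the noise that the high-emphasis filter amplifies.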

  10. A SELF-ADAPTIVE TECHNIQUE FOR A KIND OF NONLINEAR CONJUGATE GRADIENT METHODS

    王丽平

    2004-01-01

    Conjugate gradient methods are a class of important methods for unconstrained optimization, especially when the dimension is large. In 2001, Dai and Liao proposed a new conjugacy condition, based on which two nonlinear conjugate gradient methods were constructed. Using the trust-region idea, this paper gives a self-adaptive technique for the two methods. The numerical results show that this technique works well for the given nonlinear optimization test problems.

  11. An Approach for Automatic Generation of Adaptive Hypermedia in Education with Multilingual Knowledge Discovery Techniques

    Alfonseca, Enrique; Rodriguez, Pilar; Perez, Diana

    2007-01-01

    This work describes a framework that combines techniques from Adaptive Hypermedia and Natural Language processing in order to create, in a fully automated way, on-line information systems from linear texts in electronic format, such as textbooks. The process is divided into two steps: an "off-line" processing step, which analyses the source text,…

  12. Adaptations in physiology and propulsion techniques during the initial phase of learning manual wheelchair propulsion

    de Groot, S; Veeger, H E J; Hollander, A P; van der Woude, L H V

    2003-01-01

    OBJECTIVE: The purpose of this study was to analyze adaptations in gross mechanical efficiency and wheelchair propulsion technique in novice able-bodied subjects during the initial phase of learning hand-rim wheelchair propulsion. DESIGN: Nine able-bodied subjects performed three 4-min practice bloc

  13. Database 'catalogue of techniques applied to materials and products of nuclear engineering'

    The database 'Catalogue of techniques applied to materials and products of nuclear engineering' (IS MERI) was developed to provide informational support for SSC RF RIAR and other enterprises in scientific investigations. This database contains information on the techniques used at RF Minatom enterprises for the investigation of reactor material properties. The main purpose of this system is to assess the current status of the reactor materials science experimental base for further planning of experimental activities and improvement of methodical support. (author)

  14. Data Mining E-protokol - Applying data mining techniques on student absence

    Shrestha, Amardip; Bro Lilleås, Lauge; Hansen, Asbjørn

    2014-01-01

    The scope of this project is to explore the possibilities of applying data mining techniques to discover new knowledge about student absenteeism in primary school. The research consists of analyzing a large dataset collected through the digital protocol system E-protokol. The data mining techniques used for the analysis involve clustering, classification and association rule mining, which are applied using the machine learning toolset WEKA. The findings include a number of suggestions ...

  15. Sweep as a Generic Pruning Technique Applied to the Non-Overlapping Rectangles Constraint

    Beldiceanu, Nicolas; Carlsson, Mats

    2001-01-01

    We first present a generic pruning technique which aggregates several constraints sharing some variables. The method is derived from an idea called "sweep" which is extensively used in computational geometry. A first benefit of this technique comes from the fact that it can be applied to several families of global constraints. A second main advantage is that it does not lead to any memory consumption problem, since it only requires temporary memory that can be reclaimed after each invocat...

  16. Development of Promising Insulating Oil and Applied Techniques of EHD, ER·MR

    Hanaoka, Ryoichi

    The development of environment-friendly insulating liquids has attracted attention for the new design of oil-filled power apparatus, such as transformers, from the viewpoint of environmental protection. Dielectric liquids can also be widely applied to various fields concerned with the electromagnetic field. This article introduces the recent trend in promising new vegetable-based oils as electrical insulation, and EHD pumping, ER fluid and MR fluid as applied techniques of dielectric liquids.

  17. A Novel Adaptive Elite-Based Particle Swarm Optimization Applied to VAR Optimization in Electric Power Systems

    2014-01-01

    Particle swarm optimization (PSO) has been successfully applied to solve many practical engineering problems. However, more efficient strategies are needed to coordinate global and local searches in the solution space when the studied problem is extremely nonlinear and highly dimensional. This work proposes a novel adaptive elite-based PSO approach. The adaptive elite strategies involve the following two tasks: (1) appending the mean search to the original approach and (2) pruning/cloning par...

  18. Goal-based angular adaptivity applied to a wavelet-based discretisation of the neutral particle transport equation

    A method for applying goal-based adaptive methods to the angular resolution of the neutral particle transport equation is presented. The methods are applied to an octahedral wavelet discretisation of the spherical angular domain which allows for anisotropic resolution. The angular resolution is adapted across both the spatial and energy dimensions. The spatial domain is discretised using an inner-element sub-grid scale finite element method. The goal-based adaptive methods optimise the angular discretisation to minimise the error in a specific functional of the solution. The goal-based error estimators require the solution of an adjoint system to determine the importance to the specified functional. The error estimators and the novel methods to calculate them are described. Several examples are presented to demonstrate the effectiveness of the methods. It is shown that the methods can significantly reduce the number of unknowns and computational time required to obtain a given error. The novelty of the work is the use of goal-based adaptive methods to obtain anisotropic resolution in the angular domain for solving the transport equation. -- Highlights: •Wavelet angular discretisation used to solve transport equation. •Adaptive method developed for the wavelet discretisation. •Anisotropic angular resolution demonstrated through the adaptive method. •Adaptive method provides improvements in computational efficiency

  19. Goal-based angular adaptivity applied to a wavelet-based discretisation of the neutral particle transport equation

    Goffin, Mark A., E-mail: mark.a.goffin@gmail.com [Applied Modelling and Computation Group, Department of Earth Science and Engineering, Imperial College London, London, SW7 2AZ (United Kingdom); Buchan, Andrew G.; Dargaville, Steven; Pain, Christopher C. [Applied Modelling and Computation Group, Department of Earth Science and Engineering, Imperial College London, London, SW7 2AZ (United Kingdom); Smith, Paul N. [ANSWERS Software Service, AMEC, Kimmeridge House, Dorset Green Technology Park, Winfrith Newburgh, Dorchester, Dorset, DT2 8ZB (United Kingdom); Smedley-Stevenson, Richard P. [AWE, Aldermaston, Reading, RG7 4PR (United Kingdom)

    2015-01-15

    A method for applying goal-based adaptive methods to the angular resolution of the neutral particle transport equation is presented. The methods are applied to an octahedral wavelet discretisation of the spherical angular domain which allows for anisotropic resolution. The angular resolution is adapted across both the spatial and energy dimensions. The spatial domain is discretised using an inner-element sub-grid scale finite element method. The goal-based adaptive methods optimise the angular discretisation to minimise the error in a specific functional of the solution. The goal-based error estimators require the solution of an adjoint system to determine the importance to the specified functional. The error estimators and the novel methods to calculate them are described. Several examples are presented to demonstrate the effectiveness of the methods. It is shown that the methods can significantly reduce the number of unknowns and computational time required to obtain a given error. The novelty of the work is the use of goal-based adaptive methods to obtain anisotropic resolution in the angular domain for solving the transport equation. -- Highlights: •Wavelet angular discretisation used to solve transport equation. •Adaptive method developed for the wavelet discretisation. •Anisotropic angular resolution demonstrated through the adaptive method. •Adaptive method provides improvements in computational efficiency.

  20. Dual Adaptive Filtering by Optimal Projection Applied to Filter Muscle Artifacts on EEG and Comparative Study

    Samuel Boudet

    2014-01-01

    Full Text Available Muscle artifacts constitute one of the major problems in electroencephalogram (EEG) examinations, particularly for the diagnosis of epilepsy, where pathological rhythms occur within the same frequency bands as those of artifacts. This paper proposes the dual adaptive filtering by optimal projection (DAFOP) method to automatically remove artifacts while preserving true cerebral signals. DAFOP is a two-step method. The first step consists of applying the common spatial pattern (CSP) method to two frequency windows to identify the slowest components, which are considered cerebral sources. The two frequency windows are defined by optimizing convolutional filters. The second step consists of using a regression method to reconstruct the signal independently within various frequency windows. This method was evaluated by two neurologists on a selection of 114 pages with muscle artifacts, from 20 clinical recordings of awake and sleeping adults, exhibiting pathological signals and epileptic seizures. A blind comparison was then conducted with the canonical correlation analysis (CCA) method and conventional low-pass filtering at 30 Hz. The filtering rate was 84.3% for muscle artifacts, with a 6.4% reduction of cerebral signals even for the fastest waves. DAFOP was found to be significantly more efficient than CCA and 30 Hz filters. The DAFOP method is fast and automatic and can be easily used in clinical EEG recordings.

  1. Assessment of Multi-Joint Coordination and Adaptation in Standing Balance: A Novel Device and System Identification Technique.

    Engelhart, Denise; Schouten, Alfred C; Aarts, Ronald G K M; van der Kooij, Herman

    2015-11-01

    The ankles and hips play an important role in maintaining standing balance, and the coordination between joints adapts with task and conditions, like the disturbance magnitude and type, and changes with age. Assessment of multi-joint coordination requires the application of multiple continuous and independent disturbances and closed-loop system identification techniques (CLSIT). This paper presents a novel device, the double inverted pendulum perturbator (DIPP), which can apply disturbing forces at the hip level and between the shoulder blades. In addition to the disturbances, the device can provide force fields to study adaptation of multi-joint coordination. The performance of the DIPP and a novel CLSIT was assessed by identifying a system with known mechanical properties and by model simulations. A double inverted pendulum was successfully identified, while force fields were able to keep the pendulum upright. The estimated dynamics were similar to the theoretically derived dynamics. The DIPP has a sufficient bandwidth of 7 Hz to identify multi-joint coordination dynamics. An experiment with human subjects, where a stabilizing force field was rendered at the hip (1500 N/m), showed that subjects adapt by lowering their control actions around the ankles. The stiffness from upper and lower segment motion to ankle torque dropped by 30% and 48%, respectively. Our methods make it possible to study (pathological) changes in multi-joint coordination as well as the adaptive capacity to maintain standing balance. PMID:25423654

  2. Strategies and techniques of communication and public relations applied to non-profit sector

    Ioana – Julieta Josan

    2010-05-01

    Full Text Available The aim of this paper is to summarize the strategies and techniques of communication and public relations applied to the non-profit sector. The approach of the paper is to identify the most appropriate strategies and techniques that the non-profit sector can use to accomplish its objectives, to highlight specific differences between the strategies and techniques of the profit and non-profit sectors, and to identify potential communication and public relations actions in order to increase visibility among the target audience, create brand awareness and turn the target audience's perception of the non-profit sector into positive brand sentiment.

  3. Applying Modern Techniques and Carrying Out English Extracurricular Activities: On the Model United Nations Activity

    Xu Xiaoyu; Wang Jian

    2004-01-01

    This paper introduces the extracurricular activity of the Model United Nations in Northwestern Polytechnical University (NPU), focusing on the application of modern techniques in the activity and the pedagogical theories applied in it. An interview and questionnaire survey reveal the influence of the Model United Nations.

  4. An Adaptive Technique for a Redundant-Sensor Navigation System. Ph.D. Thesis

    Chien, T. T.

    1972-01-01

    An on-line adaptive technique is developed to provide a self-contained redundant-sensor navigation system with a capability to utilize its full potentiality in reliability and performance. The gyro navigation system is modeled as a Gauss-Markov process, with degradation modes defined as changes in characteristics specified by parameters associated with the model. The adaptive system is formulated as a multistage stochastic process: (1) a detection system, (2) an identification system and (3) a compensation system. It is shown that the sufficient statistic for the partially observable process in the detection and identification system is the posterior measure of the state of degradation, conditioned on the measurement history.

  5. Multi-Level Adaptive Techniques (MLAT) for singular-perturbation problems

    Brandt, A.

    1978-01-01

    The multilevel (multigrid) adaptive technique, a general strategy of solving continuous problems by cycling between coarser and finer levels of discretization is described. It provides very fast general solvers, together with adaptive, nearly optimal discretization schemes. In the process, boundary layers are automatically either resolved or skipped, depending on a control function which expresses the computational goal. The global error decreases exponentially as a function of the overall computational work, in a uniform rate independent of the magnitude of the singular-perturbation terms. The key is high-order uniformly stable difference equations, and uniformly smoothing relaxation schemes.
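    The cycling between coarse and fine levels described above can be illustrated with a minimal two-grid cycle for the 1-D Poisson problem -u'' = f; the smoother, grid sizes and cycle counts below are illustrative choices, not the MLAT algorithm itself:

```python
import numpy as np

def jacobi(u, f, h, sweeps, w=2/3):
    """Weighted Jacobi smoothing for -u'' = f with Dirichlet boundaries."""
    for _ in range(sweeps):
        u[1:-1] = (1 - w) * u[1:-1] + w * 0.5 * (u[:-2] + u[2:] + h**2 * f[1:-1])
    return u

def two_grid(u, f, h, n_pre=3, n_post=3):
    """One cycle: pre-smooth, restrict residual, coarse solve, correct, post-smooth."""
    u = jacobi(u, f, h, n_pre)
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] - (2 * u[1:-1] - u[:-2] - u[2:]) / h**2
    rc = r[::2].copy()                       # injection onto the coarse grid
    nc = len(rc) - 1                         # coarse intervals, spacing 2h
    A = (np.diag(2.0 * np.ones(nc - 1))
         - np.diag(np.ones(nc - 2), 1)
         - np.diag(np.ones(nc - 2), -1)) / (2 * h) ** 2
    ec = np.zeros_like(rc)
    ec[1:-1] = np.linalg.solve(A, rc[1:-1])  # exact coarse-grid correction
    e = np.interp(np.arange(len(u)), np.arange(0, len(u), 2), ec)  # prolongate
    return jacobi(u + e, f, h, n_post)

n = 64
h = 1.0 / n
x = np.linspace(0.0, 1.0, n + 1)
f = np.pi**2 * np.sin(np.pi * x)             # exact solution: sin(pi x)
u = np.zeros(n + 1)
for _ in range(10):
    u = two_grid(u, f, h)
err = np.max(np.abs(u - np.sin(np.pi * x)))
```

    After a handful of cycles the algebraic error falls below the discretisation error, which is the behaviour the abstract describes: error decreasing at a rate independent of the remaining fine-grid work.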

  6. Difficulties applying recent blind source separation techniques to EEG and MEG

    Knuth, Kevin H

    2015-01-01

    High temporal resolution measurements of human brain activity can be performed by recording the electric potentials on the scalp surface (electroencephalography, EEG), or by recording the magnetic fields near the surface of the head (magnetoencephalography, MEG). The analysis of the data is problematic due to the fact that multiple neural generators may be simultaneously active and the potentials and magnetic fields from these sources are superimposed on the detectors. It is highly desirable to un-mix the data into signals representing the behaviors of the original individual generators. This general problem is called blind source separation and several recent techniques utilizing maximum entropy, minimum mutual information, and maximum likelihood estimation have been applied. These techniques have had much success in separating signals such as natural sounds or speech, but appear to be ineffective when applied to EEG or MEG signals. Many of these techniques implicitly assume that the source distributions hav...
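    As an illustration of the un-mixing problem these techniques address, the sketch below separates two synthetic signals with a bare-bones symmetric FastICA (tanh nonlinearity); the sources and mixing matrix are invented, and this is not the EEG/MEG pipeline the record discusses:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
s = np.vstack([np.sin(2 * np.pi * t),            # source 1: sinusoid
               np.sign(np.sin(3 * np.pi * t))])  # source 2: square wave
A = np.array([[1.0, 0.6], [0.5, 1.0]])           # unknown mixing matrix
x = A @ s                                        # superimposed detector signals

# whiten: zero mean, identity covariance
x = x - x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(x))
z = E @ np.diag(d ** -0.5) @ E.T @ x

# symmetric FastICA fixed-point iterations with g = tanh
W = rng.normal(size=(2, 2))
for _ in range(100):
    g = np.tanh(W @ z)
    W = (g @ z.T) / z.shape[1] - np.diag((1 - g ** 2).mean(axis=1)) @ W
    U, _, Vt = np.linalg.svd(W)
    W = U @ Vt               # symmetric decorrelation: (W W^T)^(-1/2) W
y = W @ z                    # recovered sources (up to order and sign)
```

    For clean synthetic mixtures the recovered components correlate almost perfectly with the originals; the record's point is precisely that real EEG/MEG data violate the assumptions that make this work so well here.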

  7. ADAPTATION OF CRACK GROWTH DETECTION TECHNIQUES TO US MATERIAL TEST REACTORS

    A. Joseph Palmer; Sebastien P. Teysseyre; Kurt L. Davis; Gordon Kohse; Yakov Ostrovsky; David M. Carpenter; Joy L. Rempe

    2015-04-01

    A key component in evaluating the ability of Light Water Reactors to operate beyond 60 years is characterizing the degradation of materials exposed to radiation and various water chemistries. Of particular concern is the response of reactor materials to Irradiation Assisted Stress Corrosion Cracking (IASCC). Some test reactors outside the United States, such as the Halden Boiling Water Reactor (HBWR), have developed techniques to measure crack growth propagation during irradiation. The basic approach is to use a custom-designed compact loading mechanism to stress the specimen during irradiation, while the crack in the specimen is monitored in-situ using the Direct Current Potential Drop (DCPD) method. In 2012 the US Department of Energy commissioned the Idaho National Laboratory and the MIT Nuclear Reactor Laboratory (MIT NRL) to take the basic concepts developed at the HBWR and adapt them to a test rig capable of conducting in-pile IASCC tests in US Material Test Reactors. The first two and a half years of the project consisted of designing and testing the loader mechanism, testing individual components of the in-pile rig and electronic support equipment, and autoclave testing of the rig design prior to insertion in the MIT Reactor. The load was applied to the specimen by means of a scissor-like mechanism, actuated by a miniature metal bellows driven by pneumatic pressure and sized to fit within the small in-core irradiation volume. In addition to the loader design, technical challenges included developing robust connections to the specimen for the applied current and voltage measurements, appropriate ceramic insulating materials that can endure the LWR environment, dealing with the high electromagnetic noise environment of a reactor core at full power, and accommodating material property changes in the specimen, due primarily to fast neutron damage, which change the specimen resistance without additional crack growth. The project culminated with an in

  8. How to Apply Student-centered Teaching Techniques in a Large Class

    李焱

    2008-01-01

    It is very common to have a class of 50 or more students in Chinese schools, and teaching a foreign language effectively to a large class is really hard work. In order to change the teacher-centered teaching model into a student-centered one, teachers should keep students' needs, interests, and learning styles in mind, apply several kinds of teaching techniques, organize different classroom activities, and encourage, praise and appreciate both students' success and the learning process all the time. If teachers place more responsibility in the hands of students and serve as "presenter or facilitator of knowledge" instead of "source of all knowledge", they can greatly motivate students to learn the language in a very active, cooperative and effective way. After all, people learn by doing, not only by watching and listening.

  9. Measurement of the magnitude of force applied by students when learning a mobilisation technique

    E. Smit

    2003-02-01

    Full Text Available Passive accessory intervertebral movements (PAIVM's) are frequently used by physiotherapists in the assessment and management of patients. Studies investigating the reliability of passive mobilisation techniques have shown conflicting results. Therefore, standardisation of PAIVM's is essential for research and teaching purposes, which could result in better clinical management. In order to standardise graded passive mobilisation techniques, a reliable, easy-to-use, objective measurement tool must be used. The aim of this study was to determine whether it is necessary to quantify the magnitude of force applied when teaching a grade I central posteroanterior (PA) mobilisation technique (according to Maitland) on the cervical spine. An objective measurement tool (FlexiForceTM) was used to determine the consistency of force applied by third and fourth year physiotherapy students while performing this technique. Twenty third- and 20 fourth year physiotherapy students (n=40) were randomly selected. Each subject performed a grade I central PA on sensors placed on C6 for 25 seconds. The average maximum grade I force applied by the third year students was significantly higher than the force applied by the fourth year students (p=0.034). There was a significantly larger variation in applied force among third years (p=0.00043). The results indicate that the current teaching method is insufficient to ensure inter-therapist reliability amongst students, emphasising the need for an objective measurement tool to be used for teaching students. The measurement tool used in this study is economical, easily applied and is an efficient method of measuring the magnitude of force. Further research is needed to demonstrate the reliability and validity of the tool to assist teaching and research in a clinical setting.

  10. Prediction of radical scavenging activities of anthocyanins applying adaptive neuro-fuzzy inference system (ANFIS) with quantum chemical descriptors.

    Jhin, Changho; Hwang, Keum Taek

    2014-01-01

    The radical scavenging activity of anthocyanins is well known, but only a few studies have been conducted using a quantum chemical approach. The adaptive neuro-fuzzy inference system (ANFIS) is an effective technique for solving problems with uncertainty. The purpose of this study was to construct and evaluate quantitative structure-activity relationship (QSAR) models for predicting radical scavenging activities of anthocyanins with good prediction efficiency. ANFIS-applied QSAR models were developed by using quantum chemical descriptors of anthocyanins calculated by semi-empirical PM6 and PM7 methods. Electron affinity (A) and electronegativity (χ) of the flavylium cation, and ionization potential (I) of the quinoidal base were significantly correlated with radical scavenging activities of anthocyanins. These descriptors were used as independent variables for the QSAR models. ANFIS models with two triangular-shaped input fuzzy functions for each independent variable were constructed and optimized by 100 learning epochs. The constructed models using descriptors calculated by both PM6 and PM7 had good prediction efficiency, with Q-square of 0.82 and 0.86, respectively. PMID:25153627
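    The inference machinery behind such models can be illustrated with a single-input, two-rule first-order Sugeno system built from triangular membership functions, the same ingredients an ANFIS layer uses; the breakpoints and consequent parameters below are illustrative, not the fitted PM6/PM7 model:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def sugeno(x, p):
    """First-order Sugeno inference with two triangular antecedents
    (forward pass only; the parameters p are illustrative, not fitted)."""
    w1 = tri(x, -1.0, 0.0, 1.0)      # firing strength of rule 1 ("low" descriptor)
    w2 = tri(x, 0.0, 1.0, 2.0)       # firing strength of rule 2 ("high" descriptor)
    f1 = p[0] * x + p[1]             # rule 1 consequent (linear)
    f2 = p[2] * x + p[3]             # rule 2 consequent (linear)
    return (w1 * f1 + w2 * f2) / (w1 + w2)   # weighted average defuzzification

y = sugeno(0.5, [1.0, 0.0, -1.0, 2.0])
```

    ANFIS training would tune the membership breakpoints and the consequent coefficients from data over the stated 100 epochs; here they are fixed so the forward pass can be checked by hand (at x = 0.5 both rules fire with strength 0.5, giving y = 1.0).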

  11. Development of ultrasound Doppler velocimetry technique applying cross-correlation processing

    The ultrasound Doppler velocimetry (UDV) technique, which applies the Doppler effect, has been developed for measuring velocity distributions of sodium flow. As the Doppler shift frequency is proportional to the velocity of microparticles carried by the flowing liquid, it is possible to evaluate velocity distributions of the flowing liquid from the Doppler shift frequency. In this report, a technique applying cross-correlation processing is proposed to derive the Doppler shift frequency from the echoes of ultrasonic pulses. Verification studies of the proposed technique are conducted based on simulated echoes and actual echoes in water tests. Main results are as follows: (1) As the result of verification studies conducted on the simulated echoes, the relative error estimated by the proposed technique is about 1 percent. (2) The proposed technique is an effective measure for the reduction of noise signals. (3) The velocity distributions of water flowing in a pipe are evaluated in the experiments. The velocity distributions evaluated by the proposed technique are almost equivalent to those of turbulent flow evaluated by the 1/7th power law. (author)
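    The core idea of correlating successive echoes can be sketched with two simulated pulses: between pulse repetitions a scatterer moves, so its echo arrives later by 2v/(PRF·c), and cross-correlation recovers that shift. All numbers (sound speed, sampling rate, PRF, pulse shape) are invented for illustration:

```python
import numpy as np

c = 1500.0        # speed of sound in the liquid [m/s] (illustrative)
fs = 50e6         # sampling rate [Hz]
prf = 5e3         # pulse repetition frequency [Hz]
v_true = 1.2      # scatterer velocity along the beam [m/s]

t = np.arange(2048) / fs
pulse = lambda tau: np.exp(-((t - tau) / 2e-7) ** 2)   # Gaussian echo envelope

# between successive pulses the scatterer moves v/prf metres,
# so the round-trip echo arrives 2*v/(prf*c) seconds later
tau0 = 8e-6
echo1 = pulse(tau0)
echo2 = pulse(tau0 + 2 * v_true / (prf * c))

# cross-correlate the two echoes to find the inter-pulse time shift
xc = np.correlate(echo2, echo1, mode="full")
lag = xc.argmax() - (len(t) - 1)          # shift in samples
v_est = lag / fs * prf * c / 2.0          # back out the velocity
```

    In this noise-free sketch the shift is an exact number of samples, so the estimate is exact; the record's contribution is making this kind of estimate robust against the noise present in real echoes.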

  12. Coastal Adaptation to Climate Change. Can the IPCC Technical Guidelines be applied?

    This paper evaluates the IPCC Technical Guidelines for Assessing Climate Change Impacts and Adaptations with respect to the guidance offered for coastal-adaptation assessment. It appears that the IPCC Technical Guidelines focus strongly on implementation. Both conceptual and empirical information are used in this paper to show that coastal adaptation embraces more than selecting one of the 'technical' options to respond to sea-level rise (retreat, accommodate or protect). Coastal adaptation is a more complex and iterative process with a series of policy cycles. To be effective, an expanded adaptation framework involving four steps is suggested, including (1) information collection and awareness raising; (2) planning and design; (3) implementation, and (4) monitoring and evaluation. The incomplete coverage of these four steps in existing coastal-adaptation assessments constrains the development of adaptation strategies that are supported by the relevant actors and integrated into existing management. Researchers and policy-makers are recommended to work together to establish a framework for adaptation that is integrated within current coastal management processes and practices and takes a broader view on the subject. 46 refs

  13. Adaptive Fuzzy Output-Feedback Method Applied to Fin Control for Time-Delay Ship Roll Stabilization

    Rui Bai

    2014-01-01

    Full Text Available The stabilization of ship roll by a fin control system is considered in this paper. Assuming that the angular velocity in roll cannot be measured, an adaptive fuzzy output-feedback control is investigated. The fuzzy logic system is used to approximate the uncertain term of the controlled system, and a fuzzy state observer is designed to estimate the unmeasured states. By utilizing the fuzzy state observer and combining the adaptive backstepping technique with adaptive fuzzy control design, an observer-based adaptive fuzzy output-feedback control approach is developed. It is proved that the proposed control approach can guarantee that all the signals in the closed-loop system are semiglobally uniformly ultimately bounded (SGUUB), and that the control strategy is effective in decreasing the roll motion. Simulation results are included to illustrate the effectiveness of the proposed approach.

  14. Optimization technique applied to interpretation of experimental data and research of constitutive laws

    The feasibility of an identification technique applied to the one-dimensional numerical analysis of the split-Hopkinson pressure bar experiment is demonstrated. A general 1-D elastic-plastic-viscoplastic computer program was written to give an adequate solution for the elastic-plastic-viscoplastic response of a pressure bar subjected to a general Heaviside step loading function in time applied over one end of the bar. Special emphasis is placed on the response of the specimen during the first microseconds, where no equilibrium conditions can be stated. During this transient phase, discontinuity conditions related to wave propagation are encountered and must be carefully taken into account. Having derived an adequate numerical model, the Pontryagin identification technique has been applied in such a way that the unknowns are physical parameters. The solutions depend mainly on the selection of a class of proper eigen objective functionals (cost functions) which may be combined so as to obtain a convenient numerical objective function. A number of significant questions arising in the choice of parameter adjustment algorithms are discussed. In particular, this technique leads to a two-point boundary value problem which has been solved using an iterative gradient-like technique usually referred to as a double operator gradient method. This method combines the classical Fletcher-Powell technique and a partial quadratic technique with automatic parameter step size selection, and is much more efficient than the usual ones. Numerical experimentation with simulated data was performed to test the accuracy and stability of the identification algorithm and to determine the most adequate type and quantity of data for estimation purposes.

  15. Modelling the effects of the sterile insect technique applied to Eldana saccharina Walker in sugarcane

    L Potgieter

    2012-12-01

    Full Text Available A mathematical model is formulated for the population dynamics of an Eldana saccharina Walker infestation of sugarcane under the influence of partially sterile released insects. The model describes the population growth of and interaction between normal and sterile E.saccharina moths in a temporally variable, but spatially homogeneous environment. The model consists of a deterministic system of difference equations subject to strictly positive initial data. The primary objective of this model is to determine suitable parameters in terms of which the above population growth and interaction may be quantified and according to which E.saccharina infestation levels and the associated sugarcane damage may be measured. Although many models have been formulated in the past describing the sterile insect technique, few of these models describe the technique for Lepidopteran species with more than one life stage and where F1-sterility is relevant. In addition, none of these models consider the technique when fully sterile females and partially sterile males are being released. The model formulated is also the first to describe the technique applied specifically to E.saccharina, and to consider the economic viability of applying the technique to this species. Pertinent decision support is provided to farm managers in terms of the best timing for releases, release ratios and release frequencies.
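    The flavour of such difference-equation models can be seen in a minimal Knipling-style sketch with a single life stage and fully sterile releases; this is a deliberate simplification of the multi-stage, partial-sterility model described above, with invented parameter values:

```python
def simulate(n0, lam, S, generations):
    """Wild population under a constant release of S sterile insects per
    generation: a wild insect mates fertilely with probability N/(N + S),
    so N_{t+1} = lam * N_t * N_t / (N_t + S)."""
    history = [n0]
    for _ in range(generations):
        n = history[-1]
        history.append(lam * n * n / (n + S))
    return history

wild = simulate(n0=1000.0, lam=5.0, S=0.0, generations=10)       # no releases
suppressed = simulate(n0=1000.0, lam=5.0, S=10000.0, generations=10)
```

    With no releases the population grows geometrically; with a sufficiently large release ratio the fertile-mating probability collapses and the population is driven toward extinction within a few generations, which is the qualitative behaviour the fuller model quantifies for E. saccharina.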

  16. Renormalization techniques applied to the study of density of states in disordered systems

    A general scheme for real space renormalization of formal scattering theory is presented and applied to the calculation of density of states (DOS) in some finite width systems. This technique is extended in a self-consistent way, to the treatment of disordered and partially ordered chains. Numerical results of moments and DOS are presented in comparison with previous calculations. In addition, a self-consistent theory for the magnetic order problem in a Hubbard chain is derived and a parametric transition is observed. Properties of localization of the electronic states in disordered chains are studied through various decimation averaging techniques and using numerical simulations. (author)

  17. Phase-shifting technique applied to circular harmonic-based joint transform correlator

    2000-01-01

    The phase-shifting technique is applied to the circular harmonic expansion-based joint transform correlator. Computer simulation has shown that the light efficiency and the discrimination capability are greatly enhanced, and the full rotation invariance is preserved after the phase-shifting technique has been used. A rotation-invariant optical pattern recognition with high discrimination capability and high light efficiency is obtained. The influence of the additive noise on the performance of the correlator is also investigated. However, the anti-noise capability of this kind of correlator still needs improving.

  18. Best Available Technique (BAT) assessment applied to ACR-1000 waste and heavy water management systems

    The ACR-1000 design is the next evolution of the proven CANDU reactor design. One of the key objectives for this project was to systematically apply the As Low As Reasonably Achievable (ALARA) principle to the reactor design. The ACR design team selected the Best Available Technique (BAT) assessment for this purpose to document decisions made during the design of each ACR-1000 waste and heavy water management systems. This paper describes the steps in the BAT assessment that has been applied to the ACR-1000 design. (author)

  19. Observer-Based Control Techniques for the LBT Adaptive Optics under Telescope Vibrations

    Agapito, Guido; Quirós-Pacheco, Fernando; Tesi, Pietro; Riccardi, Armando; Esposito, Simone

    2011-01-01

    This paper addresses the application of observer-based control techniques for the adaptive optics system of the LBT telescope. In such a context, attention is focused on the use of Kalman and H∞ filters to estimate the temporal evolution of phase perturbations due to the atmospheric turbulence and the telescope vibrations acting on tip/tilt modes. We shall present preliminary laboratory experiments carried out at the Osservatorio Astrofisico di Arcetri using the Kalman filter.
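    The role of the Kalman filter in such a scheme, estimating a slowly evolving tip/tilt perturbation from noisy measurements, can be sketched with a scalar example; the AR(1) disturbance model and noise levels are invented, not the LBT vibration model:

```python
import numpy as np

rng = np.random.default_rng(0)
a, q, r = 0.995, 0.01, 0.2     # AR(1) coefficient, process and measurement noise
n = 5000

# invented tip/tilt perturbation: a slowly drifting AR(1) process, measured noisily
x = np.zeros(n)
for k in range(1, n):
    x[k] = a * x[k - 1] + np.sqrt(q) * rng.normal()
y = x + np.sqrt(r) * rng.normal(size=n)

# scalar Kalman filter for the model x_k = a x_{k-1} + w_k,  y_k = x_k + v_k
xh, P = 0.0, 1.0
est = np.zeros(n)
for k in range(n):
    xh, P = a * xh, a * a * P + q          # predict
    K = P / (P + r)                        # Kalman gain
    xh = xh + K * (y[k] - xh)              # update with the measurement
    P = (1 - K) * P
    est[k] = xh

mse_raw = np.mean((y - x) ** 2)            # error of using raw measurements
mse_kf = np.mean((est - x) ** 2)           # error after Kalman filtering
```

    Because the disturbance is strongly correlated in time, the filter's estimate tracks it with a much smaller error than the raw measurements; the paper's contribution lies in building such state-space models for atmospheric turbulence and telescope vibrations on tip/tilt modes.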

  20. Micro combined heat and power home supply: Prospective and adaptive management achieved by computational intelligence techniques

    Matics, Jens; Krost, Gerhard [University of Duisburg-Essen, Bismarckstrasse 81, D-47057 Duisburg (Germany)

    2008-11-15

    Micro combined heat and power (CHP) systems for single residential buildings are seen as advantageous to combine both decentralized power supply and rather high overall efficiency. The latter presupposes flexible and adaptive plant management which has to mediate between energy cost minimization and user comfort aspects. This is achieved by use of computational intelligence (CI) techniques; structure and performance of the management system are shown. (author)

  1. An Adaptive Image Enhancement Technique by Combining Cuckoo Search and Particle Swarm Optimization Algorithm

    Zhiwei Ye; Mingwei Wang; Zhengbing Hu; Wei Liu

    2015-01-01

    Image enhancement is an important procedure in image processing and analysis. This paper presents a new technique using a modified measure and a blend of cuckoo search and particle swarm optimization (CS-PSO) to enhance low-contrast images adaptively. In this way, contrast enhancement is obtained by global transformation of the input intensities; it employs the incomplete Beta function as the transformation function and a novel criterion for measuring image quality considering three fa...
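    The transformation-function idea can be sketched by mapping normalized grey levels through a regularized incomplete Beta function; the (a, b) shape parameters are exactly what a swarm optimizer such as CS-PSO would search over, and the values and ramp image below are invented for illustration:

```python
import numpy as np

def incomplete_beta(x, a, b, steps=2000):
    """Regularized incomplete Beta function via simple numerical integration."""
    t = np.linspace(1e-9, 1 - 1e-9, steps)
    pdf = t ** (a - 1) * (1 - t) ** (b - 1)
    cdf = np.cumsum(pdf)
    cdf /= cdf[-1]                # normalize so the mapping spans [0, 1]
    return np.interp(x, t, cdf)

# grey levels normalized to [0, 1]; (a, b) would be chosen by the optimizer
img = np.linspace(0.3, 0.7, 11)             # a low-contrast ramp
enhanced = incomplete_beta(img, a=2.0, b=2.0)
```

    With a = b = 2 the mapping is the S-curve 3x² - 2x³, which stretches mid-range intensities apart; the optimizer's job in the paper is to pick (a, b) maximizing the image-quality criterion.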

  2. Adaptive Finite Elements for Systems of PDEs: Software Concepts, Multi-level Techniques and Parallelization

    Vey, Simon

    2008-01-01

    In the recent past, the field of scientific computing has become more and more important for scientific as well as industrial research, playing a role comparable to those of experiment and theory. This success of computational methods in scientific and engineering research is, next to the enormous improvement of computer hardware, to a large extent due to contributions from applied mathematicians, who have developed algorithms which make real-life applications feasible. Examples are adaptive ...

  3. A Novel Approach of Harris Corner Detection of Noisy Images using Adaptive Wavelet Thresholding Technique

    Dey, Nilanjan; Nandi, Pradipti; Barman, Nilanjana

    2012-01-01

In this paper we propose a method of corner detection for obtaining features required to track and recognize objects within a noisy image. Corner detection in noisy images is a challenging task in image processing, as natural images often get corrupted by noise during acquisition and transmission. Since corner detection on such noisy images does not provide the desired results, de-noising is required; an adaptive wavelet thresholding approach is applied for this purpose.
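
    The pipeline described above (wavelet-domain soft thresholding followed by a Harris corner response) can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it assumes a single-level Haar transform in place of the paper's unspecified wavelet, a 3x3 box window for the Harris structure tensor, and a synthetic test image.

```python
import numpy as np

def haar_denoise(img, thresh):
    """One-level 2D Haar transform, soft-threshold the detail bands, invert."""
    a = img[0::2, 0::2]; b = img[0::2, 1::2]
    c = img[1::2, 0::2]; d = img[1::2, 1::2]
    LL = (a + b + c + d) / 4.0
    LH = (a + b - c - d) / 4.0
    HL = (a - b + c - d) / 4.0
    HH = (a - b - c + d) / 4.0
    soft = lambda w: np.sign(w) * np.maximum(np.abs(w) - thresh, 0.0)
    LH, HL, HH = soft(LH), soft(HL), soft(HH)
    out = np.empty_like(img, dtype=float)
    out[0::2, 0::2] = LL + LH + HL + HH
    out[0::2, 1::2] = LL + LH - HL - HH
    out[1::2, 0::2] = LL - LH + HL - HH
    out[1::2, 1::2] = LL - LH - HL + HH
    return out

def box3(m):
    """3x3 box filter with edge padding."""
    p = np.pad(m, 1, mode='edge')
    h, w = m.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0

def harris_response(img, k=0.05):
    """Harris response det(M) - k*trace(M)^2 from smoothed gradient products."""
    Iy, Ix = np.gradient(img)
    Sxx, Syy, Sxy = box3(Ix * Ix), box3(Iy * Iy), box3(Ix * Iy)
    return Sxx * Syy - Sxy ** 2 - k * (Sxx + Syy) ** 2

# Synthetic test: a bright square corrupted by Gaussian noise
rng = np.random.default_rng(0)
clean = np.zeros((16, 16)); clean[4:12, 4:12] = 1.0
noisy = clean + 0.05 * rng.standard_normal(clean.shape)

R = harris_response(haar_denoise(noisy, 0.05))
```

    The response at the square's corner should dominate both a flat region and an edge midpoint, which is the property corner trackers rely on.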

  4. Sustainable Modular Adaptive Redundancy Technique Emphasizing Partial Reconfiguration for Reduced Power Consumption

    R. Al-Haddad

    2011-01-01

As reconfigurable devices' capacities and the complexity of applications that use them increase, the need for self-reliance of deployed systems becomes increasingly prominent. Organic computing paradigms have been proposed for fault-tolerant systems because they promote behaviors that allow complex digital systems to adapt and survive in demanding environments. In this paper, we develop a sustainable modular adaptive redundancy technique (SMART) composed of a two-layered organic system. The hardware layer is implemented on a Xilinx Virtex-4 Field Programmable Gate Array (FPGA) to provide self-repair using a novel approach called the reconfigurable adaptive redundancy system (RARS). The software layer supervises the organic activities on the FPGA and extends the self-healing capabilities through application-independent, intrinsic, and evolutionary repair techniques that leverage the benefits of dynamic partial reconfiguration (PR). SMART was evaluated using a Sobel edge-detection application and was shown to tolerate stressful sequences of injected transient and permanent faults while reducing dynamic power consumption by 30% compared to conventional triple modular redundancy (TMR) techniques, with nominal impact on the fault-tolerance capabilities. Moreover, PR is employed to keep the system online while under repair and also to reduce repair time. Experiments have shown a 27.48% decrease in repair time when PR is employed compared to the full-bitstream configuration case.

  5. An adaptive image enhancement technique by combining cuckoo search and particle swarm optimization algorithm.

    Ye, Zhiwei; Wang, Mingwei; Hu, Zhengbing; Liu, Wei

    2015-01-01

Image enhancement is an important procedure of image processing and analysis. This paper presents a new technique using a modified measure and a blend of cuckoo search and particle swarm optimization (CS-PSO) to enhance low-contrast images adaptively. Contrast enhancement is obtained by global transformation of the input intensities; it employs the incomplete Beta function as the transformation function and a novel criterion for measuring image quality that considers three factors: threshold, entropy value, and gray-level probability density of the image. The enhancement process is a nonlinear optimization problem with several constraints. CS-PSO is utilized to maximize the objective fitness criterion in order to enhance the contrast and detail in an image by adapting the parameters of a novel extension to a local enhancement technique. The performance of the proposed method has been compared with existing techniques such as linear contrast stretching, histogram equalization, and evolutionary-computing-based image enhancement methods such as the backtracking search algorithm, differential search algorithm, genetic algorithm, and particle swarm optimization, in terms of processing time and image quality. Experimental results demonstrate that the proposed method is robust and adaptive and exhibits better performance than the other methods considered. PMID:25784928
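
    A minimal sketch of the core idea, not the paper's method: the incomplete Beta function (computed here by simple numerical integration) serves as the global intensity transform, and a bare-bones PSO (standing in for the hybrid CS-PSO) searches its two shape parameters; plain gray-level standard deviation stands in for the paper's three-factor quality criterion. All parameter ranges and constants are illustrative assumptions.

```python
import math, random

def betainc(a, b, x, n=200):
    """Regularized incomplete Beta I_x(a, b) via trapezoidal integration (assumes a, b >= 1)."""
    if x <= 0.0: return 0.0
    if x >= 1.0: return 1.0
    f = lambda t: t ** (a - 1.0) * (1.0 - t) ** (b - 1.0)
    def trap(lo, hi):
        h = (hi - lo) / n
        return h * (0.5 * (f(lo) + f(hi)) + sum(f(lo + i * h) for i in range(1, n)))
    num, rest = trap(0.0, x), trap(x, 1.0)
    return num / (num + rest)

def enhance(img, a, b):
    """Global transform: normalize intensities to [0, 1], then map through I_x(a, b)."""
    lo, hi = min(img), max(img)
    return [betainc(a, b, (v - lo) / (hi - lo)) for v in img]

def contrast(img):  # stand-in fitness: gray-level standard deviation
    m = sum(img) / len(img)
    return math.sqrt(sum((v - m) ** 2 for v in img) / len(img))

def pso(img, iters=20, swarm=8, seed=1):
    """Minimal PSO over the (a, b) shape parameters, maximizing contrast."""
    rnd = random.Random(seed)
    pos = [[rnd.uniform(1.0, 10.0), rnd.uniform(1.0, 10.0)] for _ in range(swarm)]
    vel = [[0.0, 0.0] for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    pfit = [contrast(enhance(img, *p)) for p in pos]
    g = max(range(swarm), key=lambda i: pfit[i])
    gbest, gfit = pbest[g][:], pfit[g]
    for _ in range(iters):
        for i in range(swarm):
            for d in range(2):
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * rnd.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * rnd.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(10.0, max(1.0, pos[i][d] + vel[i][d]))
            f = contrast(enhance(img, *pos[i]))
            if f > pfit[i]:
                pbest[i], pfit[i] = pos[i][:], f
                if f > gfit:
                    gbest, gfit = pos[i][:], f
    return gbest, gfit

low_contrast = [0.4 + 0.2 * i / 15 for i in range(16)]   # intensities squeezed into [0.4, 0.6]
(best_a, best_b), best_fit = pso(low_contrast)
```

    Because the transform is monotone, intensity ordering is preserved while the histogram is stretched toward the full range.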

  6. Applying Advocacy Skills in Tumultuous Times: Adaptive Capacity of Insuring America's Children Grantees

    Jung Y. Kim; Victoria Peebles; Christopher A. Trenholm

    2012-01-01

    In 2007, the David and Lucile Packard Foundation established Insuring America's Children to help secure health care for all children. A new brief and executive summary document how state-based grantees responded and adapted to unprecedented changes in children's coverage over the past several years. As state economic and political contexts shifted, grantees adapted strategic partnerships, assuming new and expanded leadership roles in state-based coalitions and forging partnerships with nontra...

  7. Co-evolutionary and Reinforcement Learning Techniques Applied to Computer Go players

    Zela Moraya, Wester Edison

    2013-01-01

The objective of this thesis is to model processes from nature, such as evolution and co-evolution, and to propose techniques that ensure these learning processes actually take place and are useful for solving complex problems such as the game of Go. Go is an ancient and very complex game with simple rules, and it remains a challenge for artificial intelligence. This dissertation covers some approaches that have been applied to this problem, proposing to solve it using competiti...

  8. The digital geometric phase technique applied to the deformation evaluation of MEMS devices

Quantitative evaluation of the structure deformation of microfabricated electromechanical systems is of importance for the design and functional control of microsystems. In this investigation, a novel digital geometric phase technique was developed to meet the deformation evaluation requirements of microelectromechanical systems (MEMS). The technique is performed on the basis of regular artificial lattices instead of a natural atom lattice. Regular artificial lattices with a pitch ranging from micrometer to nanometer are directly fabricated on the measured surface of MEMS devices using a focused ion beam (FIB). Phase information can be obtained from the Bragg-filtered images after fast Fourier transform (FFT) and inverse fast Fourier transform (IFFT) of the scanning electron microscope (SEM) images. Then the in-plane displacement field and the local strain field related to the phase information can be evaluated. The obtained results show that the technique can be well applied to deformation measurement with nanometer sensitivity and to stiction force estimation of a MEMS device.
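
    The phase-to-displacement step can be illustrated on a 1D analogue. The sketch below is an assumption-laden toy, not the authors' procedure: a sinusoidal "lattice" with pitch p carries an imposed displacement field, the FFT is masked around the carrier frequency (the 1D analogue of Bragg filtering), and the residual phase recovers the displacement via u = -phi*p/(2*pi).

```python
import numpy as np

N = 1024
p = 16.0                                    # lattice pitch, in pixels
x = np.arange(N)
u_true = 0.5 * np.sin(2 * np.pi * x / N)    # smooth imposed displacement field (amplitude 0.5 px)
s = np.cos(2 * np.pi * (x - u_true) / p)    # deformed "lattice" signal (1D image row)

# Bragg-style filter: keep only the band around the +carrier peak in the FFT
S = np.fft.fft(s)
f = np.fft.fftfreq(N)
mask = (f > 0.5 / p) & (f < 1.5 / p)        # window around the carrier frequency 1/p
g = np.fft.ifft(S * mask)                   # complex, analytic-like filtered signal

phi = np.angle(g * np.exp(-2j * np.pi * x / p))  # subtract the carrier phase
u_rec = -phi * p / (2 * np.pi)                   # geometric-phase displacement relation

err = float(np.max(np.abs(u_rec - u_true)))
```

    The local strain would follow as the gradient of `u_rec`; with sub-pixel displacements the recovery is accurate to a few hundredths of a pixel, which is the origin of the nanometer sensitivity claimed for real lattices.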

  9. GPU peer-to-peer techniques applied to a cluster interconnect

    Ammendola, Roberto; Biagioni, Andrea; Bisson, Mauro; Fatica, Massimiliano; Frezza, Ottorino; Cicero, Francesca Lo; Lonardo, Alessandro; Mastrostefano, Enrico; Paolucci, Pier Stanislao; Rossetti, Davide; Simula, Francesco; Tosoratto, Laura; Vicini, Piero

    2013-01-01

Modern GPUs support special protocols to exchange data directly across the PCI Express bus. While these protocols could be used to reduce GPU data transmission times, basically by avoiding staging to host memory, they require specific hardware features which are not available on current generation network adapters. In this paper we describe the architectural modifications required to implement peer-to-peer access to NVIDIA Fermi- and Kepler-class GPUs on an FPGA-based cluster interconnect. In addition, the current software implementation, which integrates this feature by minimally extending the RDMA programming model, is discussed, as well as some issues raised while employing it in a higher-level API like MPI. Finally, the current limits of the technique are studied by analyzing the performance improvements on low-level benchmarks and on two GPU-accelerated applications, showing when and how they benefit from the GPU peer-to-peer method.

  10. The block adaptive multigrid method applied to the solution of the Euler equations

    Pantelelis, Nikos

    1993-01-01

In the present study, a scheme capable of solving complex nonlinear systems of equations quickly and robustly is presented. The Block Adaptive Multigrid (BAM) solution method offers multigrid acceleration and adaptive grid refinement based on the prediction of the solution error. The proposed solution method was used with an implicit upwind Euler solver for the solution of complex transonic flows around airfoils. Very fast results were obtained (an 18-fold acceleration of the solution) using one fourth of the volumes of a global grid with the same solution accuracy for two test cases.

  11. Wire-mesh and ultrasound techniques applied for the characterization of gas-liquid slug flow

    Ofuchi, Cesar Y.; Sieczkowski, Wytila Chagas; Neves Junior, Flavio; Arruda, Lucia V.R.; Morales, Rigoberto E.M.; Amaral, Carlos E.F.; Silva, Marco J. da [Federal University of Technology of Parana, Curitiba, PR (Brazil)], e-mails: ofuchi@utfpr.edu.br, wytila@utfpr.edu.br, neves@utfpr.edu.br, lvrarruda@utfpr.edu.br, rmorales@utfpr.edu.br, camaral@utfpr.edu.br, mdasilva@utfpr.edu.br

    2010-07-01

    Gas-liquid two-phase flows are found in a broad range of industrial applications, such as chemical, petrochemical and nuclear industries and quite often determine the efficiency and safety of process and plants. Several experimental techniques have been proposed and applied to measure and quantify two-phase flows so far. In this experimental study the wire-mesh sensor and an ultrasound technique are used and comparatively evaluated to study two-phase slug flows in horizontal pipes. The wire-mesh is an imaging technique and thus appropriated for scientific studies while ultrasound-based technique is robust and non-intrusive and hence well suited for industrial applications. Based on the measured raw data it is possible to extract some specific slug flow parameters of interest such as mean void fraction and characteristic frequency. The experiments were performed in the Thermal Sciences Laboratory (LACIT) at UTFPR, Brazil, in which an experimental two-phase flow loop is available. The experimental flow loop comprises a horizontal acrylic pipe of 26 mm diameter and 9 m length. Water and air were used to produce the two phase flow under controlled conditions. The results show good agreement between the techniques. (author)

  12. Comparison between different techniques applied to quartz CPO determination in granitoid mylonites

    Fazio, Eugenio; Punturo, Rosalda; Cirrincione, Rosolino; Kern, Hartmut; Wenk, Hans-Rudolph; Pezzino, Antonino; Goswami, Shalini; Mamtani, Manish

    2016-04-01

Since the second half of the last century, several techniques have been adopted to resolve the crystallographic preferred orientation (CPO) of the major minerals constituting crustal and mantle rocks. To this aim, many efforts have been made to increase the accuracy of such analytical devices as well as to progressively reduce the time needed to perform microstructural analysis. It is worth noting that many of these microstructural studies deal with quartz CPO because of the wide occurrence of this mineral phase in crustal rocks as well as its quite simple chemical composition. In the present work, four different techniques were applied to determine the CPOs of dynamically recrystallized quartz domains from naturally deformed rocks collected from a ductile crustal-scale shear zone, in order to compare their advantages and limitations. The selected Alpine shear zone is located in the Aspromonte Massif (Calabrian Peloritani Orogen, southern Italy) and comprises granitoid lithotypes. The adopted methods range from the "classical" universal stage (US) to the image analysis technique (CIP), electron back-scattered diffraction (EBSD), and time-of-flight neutron diffraction (TOF). When compared, bulk texture pole figures obtained by means of these different techniques show a good correlation. Advances in the analytical techniques used for microstructural investigations are outlined by discussing the quartz CPO results presented in this study.

  13. Pulsed laser deposition: the road to hybrid nanocomposites coatings and novel pulsed laser adaptive technique.

    Serbezov, Valery

    2013-01-01

The applications of Pulsed Laser Deposition (PLD) for producing nanoparticles, nanostructures and nanocomposite coatings based on recently developed laser ablation techniques, and the convergence of these techniques, are reviewed. The problems of in situ synthesis of hybrid inorganic-organic nanocomposite coatings by these techniques are discussed. A novel modification of PLD called the Pulsed Laser Adaptive Deposition (PLAD) technique is presented, and the in situ synthesized inorganic/organic nanocomposite coatings produced from Magnesium (Mg) alloy/Rhodamine B and Mg alloy/Desoximetasone by PLAD are described. The trends, applications and future development of the discussed patented methods based on laser ablation technologies for producing hybrid nanocomposite coatings are also covered in this review. PMID:22747717

  14. Applying Content-Based Image Retrieval Techniques to Provide New Services for Tourism Industry

    Zobeir Raisi

    2014-09-01

The aim of this paper is to use the network and Internet, applying content-based image retrieval (CBIR) techniques, to provide new services for the tourism industry. The assumption is that when a tourist encounters an interesting subject, he or she can take an image of it with a handheld device and send it to the server as the query image for CBIR. In the server, images similar to the query are retrieved and the results are returned to the handheld device to be shown in a web browser. The tourist can then access useful information about the subject by clicking on one of the retrieved images. For this purpose, a tourism database was created, and several content-based image retrieval techniques were selected and applied to it. Among these techniques, the 'Edge Histogram Descriptor (EHD)' and 'Color Layout Descriptor (CLD)' algorithms have better retrieval performance than the others. By combining and modifying these two methods, a new CBIR algorithm is proposed for this application. Simulation results show a high retrieval performance for the proposed algorithm.
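
    The retrieval loop itself can be sketched with a deliberately simplified descriptor. The code below is a toy illustration, not the paper's EHD/CLD combination: a plain gray-level histogram stands in for the MPEG-7 descriptors, and the image names and data are invented.

```python
def histogram(img, bins=8):
    """Normalized gray-level histogram - a stand-in for MPEG-7 EHD/CLD descriptors."""
    h = [0] * bins
    for row in img:
        for v in row:
            h[min(bins - 1, int(v * bins))] += 1
    total = sum(h)
    return [c / total for c in h]

def l1(p, q):
    """L1 distance between two descriptors."""
    return sum(abs(a - b) for a, b in zip(p, q))

def retrieve(query, database, k=3):
    """Rank database images by descriptor distance to the query (ascending)."""
    qh = histogram(query)
    return sorted(database, key=lambda name: l1(qh, histogram(database[name])))[:k]

# Invented miniature database of 4x4 "images"
database = {
    "dark":    [[0.10] * 4 for _ in range(4)],
    "bright":  [[0.90] * 4 for _ in range(4)],
    "midtone": [[0.50] * 4 for _ in range(4)],
}
query = [[0.12] * 4 for _ in range(4)]   # a slightly different shot of the dark scene
ranked = retrieve(query, database)
```

    The server-side service described in the abstract is exactly this loop, with real descriptors and the ranked image list returned to the tourist's handheld browser.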

  15. Automatic Domain Adaptation of Word Sense Disambiguation Based on Sublanguage Semantic Schemata Applied to Clinical Narrative

    Patterson, Olga

    2012-01-01

    Domain adaptation of natural language processing systems is challenging because it requires human expertise. While manual effort is effective in creating a high quality knowledge base, it is expensive and time consuming. Clinical text adds another layer of complexity to the task due to privacy and confidentiality restrictions that hinder the…

  16. Action research for climate change adaptation : Developing and applying knowledge for governance

    Buuren, van A.; Eshuis, J.; Vliet, van M.

    2015-01-01

    Governments all over the world are struggling with the question of how to adapt to climate change. They need information not only about the issue and its possible consequences, but also about feasible governance strategies and instruments to combat it. At the same time, scientists from different soc

  17. Adaptive bone-remodeling theory applied to prosthetic-design analysis

    R. Huiskes (Rik); H.H. Weinans (Harrie); H.J. Grootenboer; M. Dalstra; B. Fudala; T.J. Slooff

    1987-01-01

The subject of this article is the development and application of computer-simulation methods to predict stress-related adaptive bone remodeling, in accordance with 'Wolff's Law'. These models are based on the Finite Element Method (FEM) in combination with numerical formulations of adap

  18. A spectral identification technique for adaptive attitude control and pointing of the Space Telescope

    Teuber, D. L.

    1976-01-01

The Space Telescope is a 2.4 m class aperture optical telescope having near-diffraction-limited performance. It will be placed into earth orbit by 1980 via the Space Shuttle. The problem considered is how to achieve negligible degradation of the astronomy imaging capability (to 0.005 arc second) due to smearing by pointing motions during observations. Initially, pointing instability sources were identified, and a linear stability analysis was used to assess the magnitude of elastic body modes and to design the control system compensation regions necessary for subsequent adaptive control. A spectral identification technique for this adaptive attitude control and pointing has been investigated that will alleviate requirements for comprehensive dynamic ground testing. Typical all-digital simulation results describing motions of the telescope line of sight are presented.

  19. ADAPTING E-COURSES USING DATA MINING TECHNIQUES - PDCA APPROACH AND QUALITY SPIRAL

    Marija Blagojevic

    2013-09-01

This paper presents an approach to adapting e-courses based on an original PDCA (Plan, Do, Check, Act) platform and quality spiral. An algorithm for the adaptation of e-courses was proposed and implemented in the Moodle Learning Management System at the Faculty of Technical Sciences, Cacak. The approach is primarily based on improving LMS (Learning Management Systems) or e-learning systems by modifying the electronic structure of the courses according to predicted user behaviour patterns. The prediction of user behaviour patterns was done using data mining techniques. Future research will focus on modelling the continuous improvement of the original system based on the evaluation results gathered at the end of each PDCA cycle. Additionally, future work will aim at evaluating the effects of the system based on the achievements and positive feedback of the users.
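
    One common form of such behaviour-pattern prediction is a first-order transition model over activity logs. The sketch below is a hypothetical illustration (activity names and sessions are invented), not the paper's data mining procedure:

```python
from collections import Counter, defaultdict

def transition_model(sessions):
    """First-order transition counts between course activities, across all sessions."""
    model = defaultdict(Counter)
    for s in sessions:
        for cur, nxt in zip(s, s[1:]):
            model[cur][nxt] += 1
    return model

def predict_next(model, current):
    """Most frequent successor of the current activity, or None if unseen."""
    if current not in model:
        return None
    return model[current].most_common(1)[0][0]

# Hypothetical Moodle access-log sessions
sessions = [
    ["lesson", "quiz", "forum"],
    ["lesson", "quiz", "lesson", "quiz"],
    ["quiz", "forum", "lesson"],
]
model = transition_model(sessions)
```

    A course-adaptation step in a PDCA cycle could then, for example, pre-load or reorder the activity predicted for each learner and check the effect in the next evaluation round.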

  20. Adaptive MIMO Fuzzy Compensate Fuzzy Sliding Mode Algorithm: Applied to Second Order Nonlinear System

    Farzin Piltan, N. Sulaiman, Payman Ferdosali, Mehdi Rashidi, Zahra Tajpeikar

    2011-12-01

This research focuses on proposed adaptive fuzzy sliding mode algorithms with the adaptation laws derived in the Lyapunov sense. The stability of the closed-loop system is proved mathematically based on the Lyapunov method. The adaptive MIMO fuzzy compensate fuzzy sliding mode method designs a MIMO fuzzy system to compensate for the model uncertainties of the system, and chattering is also solved by a linear saturation method. Since there is no tuning method to adjust the premise part of the fuzzy rules, we present a scheme to online tune the consequence part of the fuzzy rules. Classical sliding mode control is robust to model uncertainties and external disturbances. A sliding mode method with a switching control law guarantees the stability of the certain and/or uncertain system, but the addition of the switching control law introduces chattering into the system. One way to reduce or eliminate chattering is to insert a boundary layer around the sliding surface. Classical sliding mode control has difficulty in handling unstructured model uncertainties. One can overcome this problem by combining a sliding mode controller and artificial intelligence (e.g. fuzzy logic). To approximate a time-varying nonlinear dynamic system, a fuzzy system requires a large fuzzy rule base. This large number of fuzzy rules causes a high computation load. The addition of an adaptive law to a fuzzy sliding mode controller to online tune the parameters of the fuzzy rules in use ensures a moderate computational load. The adaptive laws in this algorithm are designed based on the Lyapunov stability theorem. Asymptotic stability of the closed-loop system is also proved in the sense of Lyapunov.
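
    The chattering-reduction idea at the core of this abstract, replacing the discontinuous switching term with a boundary-layer saturation, can be shown on a toy second-order plant. This is a classical (non-fuzzy, non-adaptive) sketch under assumed gains, not the paper's MIMO fuzzy design:

```python
import math

def sat(z):
    """Boundary-layer saturation replacing the discontinuous sign() switching term."""
    return max(-1.0, min(1.0, z))

def simulate(T=10.0, dt=1e-3, lam=2.0, K=2.0, phi=0.1):
    """Regulate x -> 0 for x'' = sin(x) + u; sin(x) plays an unmodeled disturbance
    with |sin(x)| <= 1 < K, so the switching gain dominates the uncertainty."""
    x, v = 1.0, 0.0
    for _ in range(int(T / dt)):
        s = v + lam * x                    # sliding surface s = x' + lam*x
        u = -lam * v - K * sat(s / phi)    # boundary-layer sliding-mode law
        a = math.sin(x) + u                # true plant acceleration
        x += v * dt
        v += a * dt
    return x, v

xf, vf = simulate()
```

    Outside the layer, s' = sin(x) - K*sat(s/phi) drives |s| down since K > 1; inside the layer the control is continuous (no chattering) and the state decays along s ~ 0, ending within a small residual set of order phi/lam.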

  1. An adaptive p-refinement strategy applied to nodal expansion method in 3D Cartesian geometry

Highlights: • An adaptive p-refinement approach is developed and implemented successfully in ACNEM. • The proposed strategy enhances the accuracy with regard to the uniform zeroth order solution. • Improvement of results is gained with less computation time relative to the uniform high order solution. - Abstract: The aim of this work is to develop a coarse mesh treatment strategy using an adaptive polynomial (p) refinement approach for the average current nodal expansion method in order to solve the neutron diffusion equation. For performing the adaptive solution process, a posteriori error estimation, i.e. the flux gradient, has been utilized for finding probable numerical errors. High net leakage in a node represents a flux gradient between neighboring nodes and may indicate the source of errors in the coarse mesh calculation. Therefore, the relative Cartesian directional net leakage of nodes is considered as an assessment criterion for mesh refinement in a sub-domain. In our proposed approach, the zeroth order nodal expansion solution is used along coarse meshes as large as fuel assemblies to treat neutron populations. Coarse nodes with high directional net leakage may be chosen for implementing higher order polynomial expansion in the corresponding direction, i.e. the X and/or Y and/or Z Cartesian directions. Using this strategy, the computational cost and time are reduced relative to the uniform high order polynomial solution. In order to demonstrate the efficiency of this approach, a computer program, APNEC, Adaptive P-refinement Nodal Expansion Code, has been developed for solving the neutron diffusion equation using various orders of the average current nodal expansion method in 3D rectangular geometry. Some well-known benchmarks are investigated to compare the uniform and adaptive solutions. Results demonstrate the superiority of our proposed strategy in enhancing the accuracy of solution without using uniform high order solution throughout the domain and

  2. Adaptive Neuro-Fuzzy Inference System Applied QSAR with Quantum Chemical Descriptors for Predicting Radical Scavenging Activities of Carotenoids

    Changho Jhin; Keum Taek Hwang

    2015-01-01

One of the physiological characteristics of carotenoids is their radical scavenging activity. In this study, the relationship between the radical scavenging activities and quantum chemical descriptors of carotenoids was determined. Adaptive neuro-fuzzy inference system (ANFIS)-applied quantitative structure-activity relationship (QSAR) models were also developed for predicting and comparing the radical scavenging activities of carotenoids. Semi-empirical PM6 and PM7 quantum chemical calculations were...

  3. The adaptation of GDL motion recognition system to sport and rehabilitation techniques analysis.

    Hachaj, Tomasz; Ogiela, Marek R

    2016-06-01

The main novelty of this paper is presenting the adaptation of the Gesture Description Language (GDL) methodology to sport and rehabilitation data analysis and classification. In this paper we show that the Lua language can be successfully used for adapting the GDL classifier to those tasks. The newly applied scripting language allows easy extension and integration of the classifier with other software technologies and applications. The obtained execution speed allows using the methodology in real-time motion capture data processing, where the capture frequency ranges from 100 Hz to even 500 Hz depending on the number of features or classes to be calculated and recognized. Due to this fact the proposed methodology can be used with high-end motion capture systems. We anticipate that using this novel, efficient and effective method will greatly help both sports trainers and physiotherapists in their practice. The proposed approach can be directly applied to motion capture data kinematics analysis (evaluation of motion without regard to the forces that cause that motion). The ability to apply pattern recognition methods to GDL descriptions can be utilized in virtual reality environments and used for sport training or rehabilitation treatment. PMID:27106581

  4. Software engineering techniques applied to agricultural systems an object-oriented and UML approach

    Papajorgji, Petraq J

    2014-01-01

Software Engineering Techniques Applied to Agricultural Systems presents cutting-edge software engineering techniques for designing and implementing better agricultural software systems based on the object-oriented paradigm and the Unified Modeling Language (UML). The focus is on the presentation of rigorous step-by-step approaches for modeling flexible agricultural and environmental systems, starting with a conceptual diagram representing elements of the system and their relationships. Furthermore, diagrams such as sequence and collaboration diagrams are used to explain the dynamic and static aspects of the software system. This second edition includes: a new chapter on Object Constraint Language (OCL), a new section dedicated to the Model-View-Controller (MVC) design pattern, new chapters presenting details of two MDA-based tools – the Virtual Enterprise and Olivia Nova, and a new chapter with exercises on conceptual modeling. It may be highly useful to undergraduate and graduate students as t...

  5. Applied methods and techniques for mechatronic systems modelling, identification and control

    Zhu, Quanmin; Cheng, Lei; Wang, Yongji; Zhao, Dongya

    2014-01-01

    Applied Methods and Techniques for Mechatronic Systems brings together the relevant studies in mechatronic systems with the latest research from interdisciplinary theoretical studies, computational algorithm development and exemplary applications. Readers can easily tailor the techniques in this book to accommodate their ad hoc applications. The clear structure of each paper, background - motivation - quantitative development (equations) - case studies/illustration/tutorial (curve, table, etc.) is also helpful. It is mainly aimed at graduate students, professors and academic researchers in related fields, but it will also be helpful to engineers and scientists from industry. Lei Liu is a lecturer at Huazhong University of Science and Technology (HUST), China; Quanmin Zhu is a professor at University of the West of England, UK; Lei Cheng is an associate professor at Wuhan University of Science and Technology, China; Yongji Wang is a professor at HUST; Dongya Zhao is an associate professor at China University o...

Guidelines for depth data collection in rivers when applying interpolation techniques (kriging) for river restoration

    M. Rivas-Casado

    2007-05-01

River restoration appraisal requires the implementation of monitoring programmes that assess the river site before and after the restoration project. However, little work has yet been done to design effective and efficient sampling strategies. Three main variables need to be considered when designing monitoring programmes: space, time and scale. The aim of this paper is to describe the methodology applied to analyse the variation of depth in space, scale and time so that more comprehensive monitoring programmes can be developed. Geostatistical techniques were applied to study the spatial dimension (sampling strategy and density), spectral analysis was used to study the scale at which depth shows cyclic patterns, and descriptive statistics were used to assess the temporal variation. A brief set of guidelines is summarised in the conclusion.
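
    The kriging step itself is compact. Below is a minimal ordinary-kriging sketch with an assumed exponential variogram and invented depth samples; it also shows the exact-interpolation property that makes sample placement, rather than the estimator, the critical design choice:

```python
import numpy as np

def variogram(h, sill=1.0, rng=20.0):
    """Assumed exponential variogram model (sill and range are illustrative)."""
    return sill * (1.0 - np.exp(-3.0 * np.abs(h) / rng))

def ordinary_kriging(xs, zs, x0):
    """Ordinary kriging of one point: variogram system bordered by the
    unbiasedness constraint (weights sum to 1, Lagrange multiplier in the last row)."""
    n = len(xs)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = variogram(xs[:, None] - xs[None, :])
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = variogram(x0 - xs)
    w = np.linalg.solve(A, b)
    return float(w[:n] @ zs)

xs = np.array([0.0, 5.0, 12.0, 20.0, 30.0])   # invented sampled chainage along the reach (m)
zs = np.array([0.8, 1.1, 1.6, 1.2, 0.9])      # invented depth observations (m)
z_at_sample = ordinary_kriging(xs, zs, 12.0)  # at a data point: exact interpolation
z_between   = ordinary_kriging(xs, zs, 8.0)   # between data points: weighted estimate
```

    Because kriging honours the data exactly, the quality of the interpolated depth surface is governed by where and how densely the cross-sections are sampled, which is precisely what the paper's guidelines address.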

  7. The two-step electrochemical etching technique applied for polycarbonate track etched detectors

The two-step electrochemical etching technique was optimized by varying the electrical field strength and applied to the polycarbonate track detector Makrofol DE for neutron dosimetry and radon monitoring, using an electric field strength of 26.7 kV·cm-1. In comparison with the previously applied combination of chemical and electrochemical etching, the neutron response was increased, above a threshold energy of about 1.5 MeV, by a factor of 3 to a value of 52 tracks·cm-2·mSv-1. The background track density and its standard deviation of (6±2) cm-2 allow the detection of about 0.1 mSv. The alpha energy range was extended from an alpha window of about 1.5 MeV to an alpha energy range of 0.5 to 4 MeV. (author)

  8. Investigation of finite difference recession computation techniques applied to a nonlinear recession problem

This report presents comparisons of results of five implicit and explicit finite difference recession computation techniques with results from a more accurate "benchmark" solution applied to a simple one-dimensional nonlinear ablation problem. In the comparison problem a semi-infinite solid is subjected to a constant heat flux at its surface and the rate of recession is controlled by the solid material's latent heat of fusion. All thermal properties are assumed constant. The five finite difference methods include three front-node-dropping schemes, a back-node-dropping scheme, and a method in which the ablation problem is embedded in an inverse heat conduction problem and no nodes are dropped. Constancy of thermal properties and the semi-infinite and one-dimensional nature of the problem at hand are not necessary assumptions in applying the methods studied to more general problems. The best of the methods studied will be incorporated into APL's Standard Heat Transfer Program.
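
    The flavour of a front-node-dropping scheme can be shown on an even simpler limit of the comparison problem: constant surface flux with all incoming energy going into latent heat (solid pre-heated to the ablation temperature, conduction neglected), so the exact recession is s(t) = q*t/(rho*L). All values below are illustrative, not from the report:

```python
# Toy front-node-dropping recession computation
q   = 5.0e5    # surface heat flux, W/m^2      (illustrative)
rho = 2000.0   # density, kg/m^3               (illustrative)
L   = 1.0e6    # latent heat of fusion, J/kg   (illustrative)
dx  = 1.0e-4   # node spacing, m
dt  = 1.0e-3   # time step, s
t_end = 2.0    # simulated time, s

energy = 0.0   # energy banked toward consuming the current front node
dropped = 0    # number of front nodes removed so far

for _ in range(int(round(t_end / dt))):
    energy += q * dt                    # flux absorbed during this step
    while energy >= rho * L * dx:       # enough latent heat to ablate one full cell
        energy -= rho * L * dx
        dropped += 1                    # drop the front node; the grid recedes by dx

s_numeric = dropped * dx
s_exact = q * t_end / (rho * L)         # closed-form recession for this limit
```

    The numerical front position lags the exact one by at most one cell width, which is exactly the discretization error the report's node-dropping comparisons quantify in the full conduction problem.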

  9. Contact Nd:YAG Laser Technique Applied To Head And Neck Reconstructive Surgery

    Nobori, Takuo; Miyazaki, Yasuhiro; Moriyama, Ichiro; Sannikorn, Phakdee; Ohyama, Masaru

    1989-09-01

The contact Nd:YAG laser system with a ceramic tip was applied to head and neck reconstructive surgery. Plastic surgery was performed in 78 patients with head and neck diseases during the past 11 years. Since 1984, reconstructive surgery was performed in 60 of these patients, and in 45 of them (75%) the contact Nd:YAG laser was used. Using this laser technique, intraoperative bleeding was reduced to half that of the conventional procedure.

  10. The Service Laboratory - A GTZ-BgVV project: Health protection through adapted veterinary diagnostic techniques

The customary diagnostic methods of today have been developed in industrialized countries. High costs for personnel resulted in a trend towards automation and prefabricated test kits. Consequently, these techniques are not sufficiently adapted to local conditions in developing countries, where, as a rule, skilled and ancillary staff is available whereas foreign currency reserves for purchasing laboratory equipment and material from abroad are rather limited. Furthermore, the training of personnel from developing countries has usually been oriented towards the non-transferable standards and methods of industrialized countries. This leads to a long-term dependence of the diagnostic services on external funding. A diagnostic technology adapted to the specific local conditions of developing countries is needed to overcome this situation. The project activities concentrate on serological diagnostic work. Here, basic knowledge of the common diagnostic techniques and their set-up for specific diseases, methods for the production of related reagents (antigens, antibodies, conjugates, complement, etc.) and cleaning procedures for the reuse of 'one way' plastic material is spread by training programmes, specific publications and information leaflets. For two of the more complex test procedures, the most frequently quoted prescribed test for international trade, the CFT, and the increasingly important ELISA (OIE, Manual of Standards for Diagnostic Techniques, Paris, 1992), we have calculated the cost reduction potential of adaptation through self-production of reagents and reuse of plastic materials. Material costs per microtitre test plate for the diagnosis of brucellosis can be reduced from US $3.79 to 0.82 for the CFT and from US $3.88 to 1.13 for the ELISA. In comparison, commercial ELISA kits cost about US $80 to 90 per plate (e.g. Bommeli, IDEXX, Boehringer)

11. Grid-based Moment Tensor Inversion Technique Applied to Earthquakes Offshore of Northeast Taiwan

    Cheng, H.; Lee, S.; Ma, K.

    2010-12-01

We use a grid-based moment tensor inversion technique and broadband continuous recordings to monitor, in real time, earthquakes offshore of northeast Taiwan. The moment tensor inversion technique and a grid search scheme are applied to obtain the source parameters, including the hypocenter, moment magnitude, and focal mechanism. In Taiwan, routine moment tensor solutions are reported by the CWB (Central Weather Bureau) and BATS (Broadband Array in Taiwan for Seismology), both of which require some lag time to obtain the event time and location before doing CMT (Centroid Moment Tensor) analysis. By using the grid-based moment tensor inversion technique, the event location and focal mechanism can be obtained simultaneously within about two minutes of the occurrence of the earthquake. The inversion procedure is based on a 1-D Green’s functions database calculated by the frequency-wavenumber (f-k) method. The offshore region of northeast Taiwan was taken as our first test area, covering 121.5E to 123E, 23.5N to 25N, and depths down to 136 km. A 3D grid system is set over this study area with an average grid size of 10 x 10 x 10 km3. We compare our results with past earthquakes from 2008 to 2010 that had been analyzed by BATS CMT. We also compare the event times detected by GridMT with the CWB earthquake reports. The results indicate that the grid-based moment tensor inversion system is efficient and practical for real-time monitoring of local seismic activity. Our long-term goal is to use the GridMT technique with fully 3-D Green’s functions for the whole of Taiwan in the future.
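
    The grid-search logic can be miniaturized: precompute synthetics at every grid node (a stand-in for the 1-D Green's function database) and take the node minimizing the misfit to the observations. The toy below matches only travel times in a uniform half-space, with an invented station geometry:

```python
import math

stations = [(0.0, 0.0), (50.0, 0.0), (0.0, 50.0), (50.0, 50.0)]  # invented station coords, km
v = 6.0  # km/s, assumed uniform velocity (stand-in for the 1-D Green's database)

def travel_times(src):
    """Straight-ray travel times from a source to every station."""
    return [math.hypot(src[0] - sx, src[1] - sy) / v for sx, sy in stations]

# "Observed" arrivals generated from a true source that sits off the grid nodes
true_src = (21.0, 33.0)
obs = travel_times(true_src)

# Coarse 10 km grid, analogous to the paper's first-pass 10 x 10 x 10 km3 nodes
grid = [(x, y) for x in range(0, 51, 10) for y in range(0, 51, 10)]

def misfit(node):
    """L2 misfit between observed and precomputed synthetic arrivals at a node."""
    return sum((o - s) ** 2 for o, s in zip(obs, travel_times(node)))

best = min(grid, key=misfit)
```

    In the real system the grid search scans full waveform fits against stored Green's functions rather than travel times, but the "best node wins" structure, and the refinement around it, is the same.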

  12. The Subarray MVDR Beamformer: A Space-Time Adaptive Processor Applied to Active Sonar

    Bezanson, Leverett Guidroz

The research for this thesis was performed mainly at the NATO Undersea Research Centre, now named the Centre for Maritime Research and Experimentation (CMRE). The purpose of the research was to improve the detection of underwater targets in the littoral ocean when using active sonar. Currently these detections are made by towed line arrays using a delay-and-sum beamformer for bearing measurements and noise suppression. This method of beamforming can suffer from the reverberation that is commonly present in the littoral environment. A proposed solution is to use an adaptive beamformer, which can attenuate reverberation and increase the bearing resolution. Adaptive beamforming algorithms have existed for a long time but typically are not used in the active case because of the limited amount of observable data available for adaptation. This deficiency is caused by the conflicting requirements of high Doppler resolution for target detection and small time windows for building up full-rank covariance estimates. The algorithms are also sensitive to the bearing-estimate errors that commonly occur in active sonar systems. Recently it has been proposed to overcome these limitations through the use of reduced-beamspace adaptive beamforming. The Subarray MVDR beamformer is analyzed both against simulated data and against experimental data collected by CMRE during the GLINT/NGAS11 experiment in 2011. Simulation results indicate that the Subarray MVDR beamformer rejects interfering signals that are not effectively attenuated by conventional beamforming. The application of the Subarray MVDR beamformer to the experimental data shows that the Doppler spread of the reverberation ridge is reduced and the bearing resolution improved. The signal-to-noise ratio calculated at the target location also shows improvement. These calculated and observed performance metrics indicate an improvement of detection in reverberation noise.
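The MVDR (Capon) weights at the heart of the method take the standard form w = R^-1 d / (d^H R^-1 d): unit gain toward the look direction, minimum output power elsewhere. A minimal element-space sketch, assuming an ideal half-wavelength line array rather than CMRE's subarray implementation:

```python
import numpy as np

def mvdr_weights(R, d):
    """MVDR weights: distortionless toward steering vector d, minimum power."""
    Rinv_d = np.linalg.solve(R, d)
    return Rinv_d / (d.conj() @ Rinv_d)

def steering(n, theta, spacing=0.5):
    """Plane-wave steering vector; element spacing in wavelengths."""
    k = np.arange(n)
    return np.exp(2j * np.pi * spacing * k * np.sin(theta))

n = 8
d = steering(n, 0.0)                      # look direction: broadside
interferer = steering(n, np.deg2rad(20))  # loud interferer at 20 degrees
R = 100 * np.outer(interferer, interferer.conj()) + np.eye(n)  # covariance
w = mvdr_weights(R, d)
print(abs(w.conj() @ d))           # 1.0: the distortionless constraint holds
print(abs(w.conj() @ interferer))  # near zero: interferer placed in a null
```

The practical difficulty the thesis addresses is estimating R: a full-rank estimate needs at least n snapshots, which is why reducing the adaptive dimension (subarrays/beamspace) helps in the data-starved active case.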

  13. Applying Agile Requirements Engineering Approach for Re-engineering & Changes in existing Brownfield Adaptive Systems

    Masood, Abdullah; Ali, M. Asim

    2014-01-01

Requirements Engineering (RE) is a key activity in the development of software systems and is concerned with the identification of the goals of stakeholders and their elaboration into precise statements of desired services and behavior. This research describes an Agile Requirements Engineering approach for re-engineering and changes in an existing Brownfield adaptive system. The approach has a few modifications that can be used as part of a SCRUM development process for re-engineering and changes. The ...

  14. Applying computer adaptive testing to optimize online assessment of suicidal behavior: a simulation study.

    Beurs, D.P. de; Vries, A.L.M. de; Groot, M.H. de; Keijser, J. de; Kerkhof, A.J.F.M.

    2014-01-01

Background: The Internet is used increasingly for both suicide research and prevention. To optimize online assessment of suicidal patients, there is a need for short, good-quality tools to assess elevated risk of future suicidal behavior. Computer adaptive testing (CAT) can be used to reduce response burden and improve accuracy, and to make the available pencil-and-paper tools more appropriate for online administration. Objective: The aim was to test whether an item response-based computer adaptiv...

  15. Adaptively Reevaluated Bayesian Localization (ARBL): A novel technique for radiological source localization

    Miller, Erin A.; Robinson, Sean M.; Anderson, Kevin K.; McCall, Jonathon D.; Prinke, Amanda M.; Webster, Jennifer B.; Seifert, Carolyn E.

    2015-06-01

    We present a novel technique for the localization of radiological sources in urban or rural environments from an aerial platform. The technique is based on a Bayesian approach to localization, in which measured count rates in a time series are compared with predicted count rates from a series of pre-calculated test sources to define likelihood. This technique is expanded by using a localized treatment with a limited field of view (FOV), coupled with a likelihood ratio reevaluation, allowing for real-time computation on commodity hardware for arbitrarily complex detector models and terrain. In particular, detectors with inherent asymmetry of response (such as those employing internal collimation or self-shielding for enhanced directional awareness) are leveraged by this approach to provide improved localization. Results from the localization technique are shown for simulated flight data using monolithic as well as directionally-aware detector models, and the capability of the methodology to locate radioisotopes is estimated for several test cases. This localization technique is shown to facilitate urban search by allowing quick and adaptive estimates of source location, in many cases from a single flyover near a source. In particular, this method represents a significant advancement from earlier methods like full-field Bayesian likelihood, which is not generally fast enough to allow for broad-field search in real time, and highest-net-counts estimation, which has a localization error that depends strongly on flight path and cannot generally operate without exhaustive search.

  16. Adaptively Reevaluated Bayesian Localization (ARBL). A Novel Technique for Radiological Source Localization

    Miller, Erin A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Robinson, Sean M. [Pacific Northwest National Lab. (PNNL), Seattle, WA (United States); Anderson, Kevin K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McCall, Jonathon D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Prinke, Amanda M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Webster, Jennifer B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Seifert, Carolyn E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-01-19

Here we present a novel technique for the localization of radiological sources in urban or rural environments from an aerial platform. The technique is based on a Bayesian approach to localization, in which measured count rates in a time series are compared with predicted count rates from a series of pre-calculated test sources to define likelihood. Furthermore, this technique is expanded by using a localized treatment with a limited field of view (FOV), coupled with a likelihood ratio reevaluation, allowing for real-time computation on commodity hardware for arbitrarily complex detector models and terrain. In particular, detectors with inherent asymmetry of response (such as those employing internal collimation or self-shielding for enhanced directional awareness) are leveraged by this approach to provide improved localization. Our results from the localization technique are shown for simulated flight data using monolithic as well as directionally-aware detector models, and the capability of the methodology to locate radioisotopes is estimated for several test cases. This localization technique is shown to facilitate urban search by allowing quick and adaptive estimates of source location, in many cases from a single flyover near a source. In particular, this method represents a significant advancement from earlier methods like full-field Bayesian likelihood, which is not generally fast enough to allow for broad-field search in real time, and highest-net-counts estimation, which has a localization error that depends strongly on flight path and cannot generally operate without exhaustive search.

  18. Applying contact to individual silicon nanowires using a dielectrophoresis (DEP)-based technique

One major challenge for the technological use of nanostructures is the control of their electrical and optoelectronic properties. For that purpose, extensive research into electrical characterization, and therefore a fast and reliable way of contacting these structures, is needed. Here, we report on a new dielectrophoresis (DEP)-based technique which enables sufficient and reliable contact to be applied to individual nanostructures, such as semiconducting nanowires (NWs), easily and without the need for lithography. The DEP contacting technique presented in this article requires no high-tech equipment and can be monitored in situ with an optical microscope. In the presented experiments, individual SiNWs are trapped and subsequently welded between two photolithographically pre-patterned electrodes by applying varying AC voltages to the electrodes. To prove the quality of these contacts, I–V curves, photoresponse, and photoconductivity of a single SiNW were measured. Furthermore, the measured dependence of the photoconductivity on the wavelength of the illuminating light was compared with calculations predicting the absorption spectra of an individual SiNW.

  19. Applying contact to individual silicon nanowires using a dielectrophoresis (DEP)-based technique

    Leiterer, Christian, E-mail: christian.leiterer@gmail.com [Institute of Photonic Technology (Germany); Broenstrup, Gerald [Max-Planck-Institute for the Science of Light (Germany); Jahr, Norbert; Urban, Matthias; Arnold, Cornelia; Christiansen, Silke; Fritzsche, Wolfgang [Institute of Photonic Technology (Germany)

    2013-05-15

One major challenge for the technological use of nanostructures is the control of their electrical and optoelectronic properties. For that purpose, extensive research into electrical characterization, and therefore a fast and reliable way of contacting these structures, is needed. Here, we report on a new dielectrophoresis (DEP)-based technique which enables sufficient and reliable contact to be applied to individual nanostructures, such as semiconducting nanowires (NWs), easily and without the need for lithography. The DEP contacting technique presented in this article requires no high-tech equipment and can be monitored in situ with an optical microscope. In the presented experiments, individual SiNWs are trapped and subsequently welded between two photolithographically pre-patterned electrodes by applying varying AC voltages to the electrodes. To prove the quality of these contacts, I-V curves, photoresponse, and photoconductivity of a single SiNW were measured. Furthermore, the measured dependence of the photoconductivity on the wavelength of the illuminating light was compared with calculations predicting the absorption spectra of an individual SiNW.

  20. Adaptive Finite Element Modeling Techniques for the Poisson-Boltzmann Equation

    Holst, Michael; Yu, Zeyun; Zhou, Yongcheng; Zhu, Yunrong

    2010-01-01

    We develop an efficient and reliable adaptive finite element method (AFEM) for the nonlinear Poisson-Boltzmann equation (PBE). We first examine the regularization technique of Chen, Holst, and Xu; this technique made possible the first a priori pointwise estimates and the first complete solution and approximation theory for the Poisson-Boltzmann equation. It also made possible the first provably convergent discretization of the PBE, and allowed for the development of a provably convergent AFEM for the PBE. However, in practice the regularization turns out to be numerically ill-conditioned. In this article, we examine a second regularization, and establish a number of basic results to ensure that the new approach produces the same mathematical advantages of the original regularization, without the ill-conditioning property. We then design an AFEM scheme based on the new regularized problem, and show that the resulting AFEM scheme is accurate and reliable, by proving a contraction result for the error. This res...
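For context, the nonlinear PBE and the splitting-based regularization of Chen, Holst, and Xu referred to above can be written in a standard dimensionless form (notation is ours; the article's precise formulation may differ):

```latex
% Nonlinear Poisson-Boltzmann equation with singular point-charge sources:
-\nabla\cdot\bigl(\epsilon\,\nabla u\bigr) + \bar\kappa^{2}\sinh u
  \;=\; \sum_{i=1}^{N} q_i\,\delta(x - x_i) \quad \text{in } \Omega .

% Regularization: split u = G + u^{r}, where G is the Coulomb potential
% solving  -\epsilon_m \Delta G = \sum_i q_i \delta(x - x_i).
% The regular part u^{r} then satisfies an equation with smooth data:
-\nabla\cdot\bigl(\epsilon\,\nabla u^{r}\bigr)
  + \bar\kappa^{2}\sinh\bigl(u^{r} + G\bigr)
  \;=\; \nabla\cdot\bigl((\epsilon-\epsilon_m)\,\nabla G\bigr),
\qquad u^{r} = g - G \ \text{on } \partial\Omega .
```

The point of the splitting is that the delta-function sources are absorbed into the analytically known G, so the finite element method only ever approximates the smooth remainder u^r.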

  1. Blind Adaptive Subcarrier Combining Technique for MC-CDMA Receiver in Mobile Rayleigh Channel

    Shakya, Indu; Stipidis, Elias

    2011-01-01

A new subcarrier combining technique is proposed for an MC-CDMA receiver in a mobile Rayleigh fading channel. It exploits the structure formed by repeating the spreading sequences of users on different subcarriers to simultaneously suppress multiple access interference (MAI) and provide implicit channel tracking, without any knowledge of the channel amplitudes or training sequences. This is achieved by adaptively weighting each subcarrier in each symbol period, employing a simple gradient descent algorithm to meet the constant modulus (CM) criterion with judicious selection of the step size. Improved BER and user-capacity performance are shown, at a similar complexity of order O(N), compared with conventional maximum ratio combining and equal gain combining techniques, even under high channel Doppler rates.
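The constant-modulus gradient descent can be sketched for a single subcarrier weight (a minimal single-user illustration of the CM criterion only; the paper's receiver additionally handles despreading across subcarriers and MAI suppression):

```python
import numpy as np

def cma_step(w, x, mu):
    """One constant-modulus update: minimize (|y|^2 - 1)^2 with y = conj(w)*x."""
    y = np.conj(w) * x
    grad = (np.abs(y) ** 2 - 1.0) * np.conj(y) * x   # stochastic gradient in w
    return w - mu * grad

rng = np.random.default_rng(2)
h = 0.5 * np.exp(1j * 0.7)            # unknown flat fade on one subcarrier
w = 1.0 + 0j                          # combiner weight, adapted blindly
for _ in range(2000):
    s = rng.choice([1, -1]) + 0j      # BPSK symbol, |s| = 1
    w = cma_step(w, h * s, mu=0.05)
print(abs(np.conj(w) * h))            # about 1: the fade magnitude is equalized
```

No training symbols or channel estimates are used; the |y| = 1 property of the transmitted constellation alone drives the adaptation, which is what makes the scheme blind.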

  2. Innovative image processing techniques applied to the thermographic inspection of PFC with SATIR facility

The components used in fusion devices, especially high-heat-flux Plasma Facing Components (PFC), have to withstand heat fluxes in the range of 10-20 MW/m2. They therefore require high reliability, which can only be guaranteed by accurate Non-Destructive Examination (NDE). The SATIR test bed operating at Commissariat a l'Energie Atomique (CEA) Cadarache performs NDE using a transient infrared thermography sequence which compares the thermal response of a tested element to that of a reference element assumed to be defect-free. The control parameter is called DTrefmax. In this paper, we present two innovative image processing techniques for the SATIR signal that allow the qualification of a component without any reference element. The first method is based on a spatial image autocorrelation, and the second on the resolution of an Inverse Heat Conduction Problem (IHCP) using a BEM (Boundary Element Method) technique. After a validation step performed on numerical data, these two methods were applied to SATIR experimental data. The results show that both techniques allow accurate defect detection without using a reference tile. They can be used in addition to DTrefmax for the qualification of plasma facing components.
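The appeal of reference-free detection is that a defect stands out against the statistics of the tested image itself. As a loose numerical illustration of that idea (our sketch, not the SATIR autocorrelation or IHCP algorithms; the image, defect, and threshold are synthetic), a localized thermal anomaly can be flagged with robust statistics alone:

```python
import numpy as np

rng = np.random.default_rng(3)
image = rng.normal(100.0, 1.0, size=(64, 64))   # nominal thermal response map
image[30:34, 40:44] += 10.0                     # synthetic defect: hot spot

med = np.median(image)
mad = np.median(np.abs(image - med))            # robust spread estimate
z = (image - med) / (1.4826 * mad)              # MAD-based z-score
defect_mask = z > 5.0                           # flag strong local deviations

rows, cols = np.nonzero(defect_mask)
print(rows.min(), rows.max(), cols.min(), cols.max())   # defect bounding box
```

Because the median and MAD come from the tested tile itself, no defect-free reference element is needed, which is the same practical advantage the two SATIR methods provide.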

  3. Applied research on air pollution using nuclear-related analytical techniques

A co-ordinated research programme (CRP) on applied research on air pollution using nuclear-related techniques is a global CRP which will run from 1992-1996, and will build upon the experience gained by the Agency from the laboratory support that it has been providing for several years to BAPMoN - the Background Air Pollution Monitoring Network programme organized under the auspices of the World Meteorological Organization. The purpose of this CRP is to promote the use of nuclear analytical techniques in air pollution studies, e.g. NAA, XRF, and PIXE, for the analysis of toxic and other trace elements in suspended particulate matter (including air filter samples), rainwater and fog-water samples, and in biological indicators of air pollution (e.g. lichens and mosses). The main purposes of the core programme are (i) to support the use of nuclear and nuclear-related analytical techniques for practically oriented research and monitoring studies on air pollution, (ii) to identify major sources of air pollution affecting each of the participating countries, with particular reference to toxic heavy metals, and (iii) to obtain comparative data on pollution levels in areas of high pollution (e.g. a city centre or a populated area downwind of a large pollution source) and low pollution (e.g. rural areas). This document reports the discussions held during the first Research Co-ordination Meeting (RCM) for the CRP, which took place at the IAEA Headquarters in Vienna. Refs, figs and tabs

  4. Interferometric Techniques Apply to Gemona (Friuli-Italy) Area as Tool for Structural Analysis.

    Sternai, P.; Calcagni, L.; Crippa, B.

    2009-04-01

We suggest a possible exploitation of radar interferometry for estimating many features of the brittle deformation occurring at the very surface of the Earth, such as the length of the dislocation front, the total amount of dislocation, and the dislocation rate over the time interval considered. Interferometric techniques allow obtaining highly reliable vertical velocity values of the order of 1 mm/yr, with a maximum resolution of 80 m2. The values obtained always refer to the temporal interval considered, which depends on the availability of SAR images. We demonstrate that it is possible to see the evolution and behaviour of the main tectonic lineaments of the considered area even over short periods of time (a few years). We describe the results of a procedure to calculate terrain motion velocity on highly correlated pixels of an area near Gemona (Friuli, Northern Italy), and then present some considerations, based on three successful examples of the analysis, on how to exploit these results in a structural-geological description of the area. The versatility of the technique, the large dimensions of the area that can be analyzed (10,000 km2), and the high precision and reliability of the results make radar interferometry a powerful tool not only for monitoring the dislocation occurring at the surface, but also for obtaining important information on the structural evolution of mountain belts that is otherwise very difficult to recognize.

  5. Micropillar compression technique applied to micron-scale mudstone elasto-plastic deformation.

    Michael, Joseph Richard; Chidsey, Thomas (Utah Geological Survey, Salt Lake City, UT); Heath, Jason E.; Dewers, Thomas A.; Boyce, Brad Lee; Buchheit, Thomas Edward

    2010-12-01

Mudstone mechanical testing is often limited by poor core recovery and by sample size, preservation, and preparation issues, which can lead to sampling bias, damage, and time-dependent effects. A micropillar compression technique, originally developed by Uchic et al. (2004), is applied here to elasto-plastic deformation of small volumes of mudstone, in the range of cubic microns. This study examines the behavior of the Gothic shale, the basal unit of the Ismay zone of the Pennsylvanian Paradox Formation and a potential shale-gas play in southeastern Utah, USA. Micropillars 5 microns in diameter and 10 microns in length are precision-manufactured using an ion-milling method. Characterization of samples is carried out using dual focused-ion/scanning-electron beam imaging of nano-scale pores and of the distribution of matrix clay, quartz, and pore-filling organics; laser scanning confocal microscopy (LSCM) 3D imaging of natural fractures; and gas permeability, among other techniques. Compression testing of micropillars under load control is performed using two different nanoindenter techniques. Deformation of cores 0.5 cm in diameter by 1 cm in length is carried out and visualized with a microscope loading stage and laser scanning confocal microscopy. Axisymmetric multistage compression testing and multi-stress-path testing are carried out using 2.54 cm plugs. Discussion of the results addresses the size of representative elementary volumes applicable to continuum-scale mudstone deformation, anisotropy, and size-scale plasticity effects. Other issues include fabrication-induced damage, alignment, and the influence of the substrate.
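For the pillar geometry quoted above, converting indenter load and displacement to engineering stress and strain is simple uniaxial arithmetic (a back-of-envelope sketch; the load and displacement values below are invented for illustration, not measured data):

```python
import math

# Cylindrical micropillar dimensions quoted in the abstract.
diameter = 5e-6      # m
length = 10e-6       # m
area = math.pi * (diameter / 2) ** 2   # cross-sectional area

def stress_strain(load_N, displacement_m):
    """Engineering stress (Pa) and strain for uniaxial pillar compression."""
    return load_N / area, displacement_m / length

sigma, eps = stress_strain(2e-3, 0.2e-6)   # e.g. 2 mN load, 200 nm shortening
print(f"{sigma / 1e6:.0f} MPa at {eps:.0%} strain")
```

The tiny cross-section (about 2e-11 m2) is why millinewton-scale nanoindenter loads suffice to reach the stresses relevant to mudstone plasticity.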

  6. Photothermal Techniques Applied to the Thermal Characterization of l-Cysteine Nanofluids

    Alvarado, E. Maldonado; Ramón-Gallegos, E.; Jiménez Pérez, J. L.; Cruz-Orea, A.; Hernández Rosas, J.

    2013-05-01

Thermal-diffusivity (D) and thermal-effusivity (e) measurements were carried out on l-cysteine nanofluids (l-cysteine in combination with Au nanoparticles and with protoporphyrin IX (PpIX)) by using thermal lens spectrometry (TLS) and photopyroelectric (PPE) techniques. The TLS technique was used in the mode-mismatched experimental configuration to obtain the thermal diffusivity of the samples. The sample thermal effusivity (e), on the other hand, was obtained by using the PPE technique, in which the temperature variation of a sample exposed to modulated radiation is measured with a pyroelectric sensor. From the obtained thermal-diffusivity and thermal-effusivity values, the thermal conductivity and specific heat capacity of the sample were calculated. The obtained thermal parameters were compared with those of water. The results of this study could be applied to the detection of tumors by using the l-cysteine in combination with Au nanoparticles and PpIX nanofluid, referred to as the conjugate in this study.
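The derived quantities follow from the standard identities k = e*sqrt(D) and rho*c = e/sqrt(D). A minimal sketch using textbook water values as the comparison baseline (the numbers are standard reference figures, not the paper's measured data):

```python
import math

def conductivity(D, e):
    """Thermal conductivity k = e * sqrt(D)."""
    return e * math.sqrt(D)

def volumetric_heat_capacity(D, e):
    """Volumetric heat capacity rho*c = e / sqrt(D)."""
    return e / math.sqrt(D)

D_water = 1.43e-7   # thermal diffusivity of water, m^2/s
e_water = 1580.0    # thermal effusivity of water, W s^0.5 m^-2 K^-1

k = conductivity(D_water, e_water)                   # ~0.6 W m^-1 K^-1
rho_c = volumetric_heat_capacity(D_water, e_water)   # ~4.2e6 J m^-3 K^-1
print(round(k, 2), f"{rho_c:.2e}")
```

Recovering the accepted water values (k about 0.6 W/m/K, rho*c about 4.2 MJ/m3/K) from D and e is exactly the kind of consistency check the comparison with water provides.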

  7. Adaptive technique for matching the spectral response in skin lesions' images

    Pavlova, P.; Borisova, E.; Pavlova, E.; Avramov, L.

    2015-03-01

The suggested technique is a subsequent stage of data extraction from diffuse reflectance spectra and images of diseased tissue, with the final aim of skin cancer diagnostics. Our previous work allowed us to extract patterns for some types of skin cancer as a ratio between spectra obtained from healthy and diseased tissue in the 380-780 nm region. The authenticity of the patterns depends on the tested point within the area of the lesion, so the resulting diagnosis can also be assigned only with some probability. In this work, two adaptations are implemented to localize the pixels of the lesion image whose reflectance spectrum corresponds to a pattern. The first adapts the standard to the individual patient, and the second translates the white-point basis of the spectrum to the relative white point of the image. Since the reflectance spectra and the image pixels refer to different white points, a correction of the compared colours is needed. The latter is done using a standard method for chromatic adaptation. The technique follows the steps below: calculation of the colorimetric XYZ parameters for the initial white point, fixed by the reflectance spectrum from healthy tissue; calculation of the XYZ parameters for the destination white point on the basis of an image of non-diseased tissue; transformation of the XYZ parameters of the test spectrum by the obtained matrix; and conversion of the XYZ parameters of the test spectrum to RGB values according to sRGB. Finally, the pixels of the lesion image whose colour corresponds to the test spectrum and a particular diagnostic pattern are marked with a specific colour.
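A standard chromatic-adaptation transform of the kind described can be sketched with the linear Bradford method (our illustration; the abstract does not state which CAT the authors used, and the two white points below are generic placeholders, not tissue measurements):

```python
import numpy as np

# Bradford matrix: maps XYZ into a sharpened "cone response" space.
M = np.array([[ 0.8951,  0.2664, -0.1614],
              [-0.7502,  1.7135,  0.0367],
              [ 0.0389, -0.0685,  1.0296]])

def adapt(xyz, white_src, white_dst):
    """Map an XYZ colour seen under white_src to its appearance under white_dst."""
    lms_src = M @ white_src
    lms_dst = M @ white_dst
    A = np.linalg.inv(M) @ np.diag(lms_dst / lms_src) @ M   # adaptation matrix
    return A @ xyz

white_spec = np.array([0.9505, 1.0, 1.0888])   # placeholder: spectrum white point
white_img  = np.array([1.0985, 1.0, 0.3558])   # placeholder: image white point
print(adapt(white_spec, white_spec, white_img))  # source white maps onto target white
```

By construction the source white point maps exactly onto the destination white point, which is the property the spectrum-to-image colour comparison relies on.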

  8. Adaptive clutter rejection filters for airborne Doppler weather radar applied to the detection of low altitude windshear

    Keel, Byron M.

    1989-01-01

An optimum adaptive clutter rejection filter for use with airborne Doppler weather radar is presented. The radar system is being designed to operate at low altitudes for the detection of windshear in an airport terminal area, where ground clutter returns may mask the weather return. The coefficients of the adaptive clutter rejection filter are obtained using a complex form of a square-root-normalized recursive least squares lattice estimation algorithm, which models the clutter return data as an autoregressive process. The normalized lattice-structure implementation of the adaptive modeling process for determining the filter coefficients ensures that the resulting coefficients yield a stable filter, and offers possible fixed-point implementation. A 10th-order FIR clutter rejection filter indexed by geographical location is designed through autoregressive modeling of simulated clutter data. Filtered data, containing simulated dry-microburst and clutter returns, are analyzed using pulse-pair estimation techniques. To measure the ability of the clutter rejection filters to remove the clutter, results are compared to pulse-pair estimates of windspeed within a simulated dry microburst without clutter. In the filter evaluation process, post-filtered pulse-pair width estimates and power levels are also used to measure the effectiveness of the filters. The results support the use of an adaptive clutter rejection filter for reducing the clutter-induced bias in pulse-pair estimates of windspeed.
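The pulse-pair estimator used to score the filters recovers mean radial velocity from the phase of the lag-one autocorrelation of the complex (I/Q) returns: v = -lambda/(4*pi*T) * arg(R(T)). A minimal sketch on an ideal noise-free return (the wavelength, PRT, and velocity are illustrative, not the radar's actual parameters):

```python
import numpy as np

def pulse_pair_velocity(z, prt, wavelength):
    """Pulse-pair mean velocity from complex samples z at pulse interval prt."""
    r1 = np.mean(z[1:] * np.conj(z[:-1]))            # lag-1 autocorrelation
    return -wavelength / (4 * np.pi * prt) * np.angle(r1)

wavelength = 0.032   # m (X band, assumed)
prt = 1e-3           # s, pulse repetition time
v_true = -6.0        # m/s, approaching flow (within the +/-8 m/s Nyquist limit)

n = np.arange(256)
z = np.exp(-1j * 4 * np.pi * v_true * prt * n / wavelength)  # ideal return
print(round(pulse_pair_velocity(z, prt, wavelength), 2))
```

Because the estimate comes from a phase, strong zero-Doppler clutter biases it toward zero; that bias is exactly what the adaptive clutter rejection filter is designed to remove before pulse-pair processing.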

  9. Array model interpolation and subband iterative adaptive filters applied to beamforming-based acoustic echo cancellation.

    Bai, Mingsian R; Chi, Li-Wen; Liang, Li-Huang; Lo, Yi-Yang

    2016-02-01

In this paper, an evolutionary exposition is given of enhancement strategies for acoustic echo cancellers (AECs). A fixed beamformer (FBF) is utilized to focus on the near-end speaker while suppressing the echo from the far end. In reality, the array steering vector can differ considerably from the ideal free-field plane-wave model. Therefore, an experimental procedure is developed to interpolate a practical array model from the measured frequency responses. Subband (SB) filtering with polyphase implementation is exploited to accelerate the cancellation process. A generalized sidelobe canceller (GSC), composed of an FBF and an adaptive blocking module, is combined with the AEC to maximize cancellation performance. Another enhancement is an internal iteration (IIT) procedure that enables efficient convergence of the adaptive SB filters within a sample time. Objective tests in terms of echo return loss enhancement (ERLE), perceptual evaluation of speech quality (PESQ), and word recognition rate for automatic speech recognition (ASR), as well as subjective listening tests, are conducted to validate the proposed AEC approaches. The results show that the GSC-SB-AEC-IIT approach attains the highest ERLE without speech quality degradation, even in double-talk scenarios. PMID:26936567
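The internal-iteration idea, repeating the adaptive update several times on the same sample to speed convergence, can be sketched with a toy full-band NLMS echo canceller (our simplified illustration; the paper applies this inside subband filters behind a beamformer, and the echo path and signals below are synthetic):

```python
import numpy as np

def nlms_cancel(far, mic, order=16, mu=0.5, inner=3, eps=1e-8):
    """NLMS echo canceller with `inner` internal iterations per sample."""
    w = np.zeros(order)
    err = np.zeros(len(mic))
    for n in range(order, len(mic)):
        x = far[n - order + 1:n + 1][::-1]    # most recent far-end samples
        for _ in range(inner):                # internal iterations (IIT)
            e = mic[n] - w @ x
            w += mu * e * x / (x @ x + eps)   # normalized LMS update
        err[n] = mic[n] - w @ x               # residual echo after adaptation
    return err, w

rng = np.random.default_rng(4)
far = rng.standard_normal(4000)               # far-end (loudspeaker) signal
h = np.array([0.8, -0.4, 0.2, 0.1])           # unknown echo path
mic = np.convolve(far, h)[:4000]              # microphone: echo only, no near end
err, w = nlms_cancel(far, mic)
erle = 10 * np.log10(np.mean(mic[2000:] ** 2) / np.mean(err[2000:] ** 2))
print(round(erle, 1), "dB ERLE")
```

ERLE here is the same metric quoted in the paper's objective tests: the ratio of echo power before and after cancellation, in dB.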

  10. Possibilities of joint application of adaptive optics technique and nonlinear optical phase conjugation to compensate for turbulent distortions

    Lukin, V. P.; Kanev, F. Yu; Kulagin, O. V.

    2016-05-01

    The efficiency of integrating the nonlinear optical technique based on forming a reverse wavefront and the conventional adaptive optics into a unified complex (for example, for adaptive focusing of quasi-cw laser radiation) is demonstrated. Nonlinear optical phase conjugation may provide more exact information about the phase fluctuations in the corrected wavefront in comparison with the adaptive optics methods. At the same time, the conventional methods of adaptive optics provide an efficient control of a laser beam projected onto a target for a rather long time.