WorldWideScience

Sample records for sophisticated processing techniques

  1. Sophisticated Players and Sophisticated Agents

    NARCIS (Netherlands)

    Rustichini, A.

    1998-01-01

    A sophisticated player is an individual who takes the actions of the opponents in a strategic situation as determined by the decisions of rational opponents, and acts accordingly. A sophisticated agent is rational in the choice of his action, but ignores the fact that he is part of a strategic

  2. Background field removal technique using regularization enabled sophisticated harmonic artifact reduction for phase data with varying kernel sizes.

    Science.gov (United States)

    Kan, Hirohito; Kasai, Harumasa; Arai, Nobuyuki; Kunitomo, Hiroshi; Hirose, Yasujiro; Shibamoto, Yuta

    2016-09-01

    An effective background field removal technique is desired for more accurate quantitative susceptibility mapping (QSM) prior to dipole inversion. The aim of this study was to evaluate the accuracy of the regularization enabled sophisticated harmonic artifact reduction for phase data with varying spherical kernel sizes (REV-SHARP) method using a three-dimensional head phantom and human brain data. The proposed REV-SHARP method used the spherical mean value operation and Tikhonov regularization in the deconvolution process, with kernel sizes varying from 2 to 14 mm. The kernel sizes were gradually reduced, similar to the SHARP with varying spherical kernel (VSHARP) method. We determined the relative errors and the relationships between the true and estimated local fields for REV-SHARP, VSHARP, projection onto dipole fields (PDF), and regularization enabled SHARP (RESHARP). Human experiments were also conducted using REV-SHARP, VSHARP, PDF, and RESHARP. The relative errors in the numerical phantom study were 0.386, 0.448, 0.838, and 0.452 for REV-SHARP, VSHARP, PDF, and RESHARP, respectively. REV-SHARP exhibited the highest correlation between the true and estimated local fields. The linear regression slopes were 1.005, 1.124, 0.988, and 0.536 for REV-SHARP, VSHARP, PDF, and RESHARP in regions of interest on the three-dimensional head phantom. In the human experiments, no obvious artifact-related errors were present in REV-SHARP. The proposed REV-SHARP is a new method that combines a variable spherical kernel size with Tikhonov regularization. This technique might enable more accurate background field removal and help achieve better QSM accuracy.
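
    The regularized deconvolution at the heart of RESHARP and REV-SHARP can be illustrated in one dimension. The sketch below is an illustrative stand-in only: a 1-D box kernel plays the role of the spherical mean value operation, and the noise level and regularization weight are assumed values, not those of the paper.

```python
import numpy as np

# 1-D sketch of Tikhonov-regularized deconvolution (illustrative stand-in for
# the SMV deconvolution step; the real method works in 3-D).
rng = np.random.default_rng(0)
n = 256
x = np.zeros(n); x[100:140] = 1.0            # "true local field"
k = np.zeros(n); k[:8] = 1.0 / 8.0           # smoothing kernel (SMV analogue)
K = np.fft.fft(k)
y = np.real(np.fft.ifft(K * np.fft.fft(x))) + 0.01 * rng.standard_normal(n)
Y = np.fft.fft(y)

lam = 1e-2                                    # Tikhonov regularization weight
x_tik = np.real(np.fft.ifft(np.conj(K) * Y / (np.abs(K) ** 2 + lam)))
# Naive inverse filter for comparison: wildly amplifies noise near kernel zeros.
x_naive = np.real(np.fft.ifft(Y / np.where(np.abs(K) > 1e-12, K, 1e-12)))

err_tik = np.linalg.norm(x_tik - x) / np.linalg.norm(x)
err_naive = np.linalg.norm(x_naive - x) / np.linalg.norm(x)
print(round(err_tik, 3), round(err_naive, 3))
```

    The regularization term `lam` caps the noise amplification at frequencies where the kernel is nearly zero, which is exactly why the relative errors reported above favor the regularized variants.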

  3. Novel food processing techniques

    Directory of Open Access Journals (Sweden)

    Vesna Lelas

    2006-12-01

    Recently, many investigations have focused on the development of novel mild food processing techniques with the aim of obtaining high-quality food products. It is also presumed that they could substitute some of the traditional processes in the food industry. The investigations are primarily directed at the use of high hydrostatic pressure, ultrasound, tribomechanical micronization, microwaves, and pulsed electric fields. Scientific research indicates that the application of some of these processes in a particular food industry can yield many benefits: significant energy savings, shortened process duration, mild thermal conditions, and food products with better sensory characteristics and higher nutritional value. As some of these techniques also act at the molecular level, changing the conformation, structure and electrical potential of organic as well as inorganic materials, some functional properties of these components may improve. Common characteristics of all of these techniques are treatment at ambient or insignificantly higher temperatures and short processing times (1 to 10 minutes). High hydrostatic pressure applied to various foodstuffs can destroy some microorganisms and successfully modify molecular conformation, and consequently improve the functional properties of foods. At the same time it acts positively on food products intended for freezing. Tribomechanical treatment causes micronization of various solid materials, resulting in nanoparticles and changes in the structure and electrical potential of molecules; significant improvement of some rheological and functional properties of materials can therefore occur. Ultrasound treatment has proved to be a potentially very successful food processing technique. It can be used as a pretreatment to drying (it decreases drying time and improves the functional properties of food) and as an extraction process for various components

  4. In Praise of the Sophists.

    Science.gov (United States)

    Gibson, Walker

    1993-01-01

    Discusses the thinking of the Greek Sophist philosophers, particularly Gorgias and Protagoras, and their importance and relevance for contemporary English instructors. Considers the problem of language as signs of reality in the context of Sophist philosophy. (HB)

  5. Cumulative Dominance and Probabilistic Sophistication

    NARCIS (Netherlands)

    Wakker, P.P.; Sarin, R.H.

    2000-01-01

    Machina & Schmeidler (Econometrica, 60, 1992) gave preference conditions for probabilistic sophistication, i.e. decision making where uncertainty can be expressed in terms of (subjective) probabilities without commitment to expected utility maximization. This note shows that simpler and more general

  6. Business Process Customization using Process Merging Techniques

    NARCIS (Netherlands)

    Bulanov, Pavel; Lazovik, Alexander; Aiello, Marco

    2012-01-01

    One of the important applications of service composition techniques lies in the field of business process management. Essentially, a business process can be considered a composition of services. Such a composition is usually prepared by domain experts, and many tasks still have to be performed manually. These

  7. Cognitive Load and Strategic Sophistication

    OpenAIRE

    Allred, Sarah; Duffy, Sean; Smith, John

    2013-01-01

    We study the relationship between the cognitive load manipulation and strategic sophistication. The cognitive load manipulation is designed to reduce the subject's cognitive resources that are available for deliberation on a choice. In our experiment, subjects are placed under a large cognitive load (given a difficult number to remember) or a low cognitive load (given a number which is not difficult to remember). Subsequently, the subjects play a one-shot game, and then they are asked to recall...

  8. STOCK EXCHANGE LISTING INDUCES SOPHISTICATION OF CAPITAL BUDGETING

    Directory of Open Access Journals (Sweden)

    Wesley Mendes-da-Silva

    2014-08-01

    This article compares capital budgeting techniques employed in listed and unlisted companies in Brazil. We surveyed the Chief Financial Officers (CFOs) of 398 listed companies and 300 large unlisted companies. Based on 91 respondents, the results suggest that the CFOs of listed companies tend to use less simplistic methods (for example, NPV and CAPM) more often, and that CFOs of unlisted companies are less likely to estimate the cost of equity, despite leading large companies. These findings indicate that stock exchange listing may require greater sophistication of the capital budgeting process.
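
    The two techniques the surveyed CFOs were asked about reduce to short formulas. A minimal sketch with made-up numbers (none of the figures below come from the article):

```python
# CAPM cost of equity and NPV, the two "less simplistic" methods named above.
# All inputs are hypothetical illustrations.

def capm_cost_of_equity(rf, beta, rm):
    """CAPM: r_e = r_f + beta * (r_m - r_f)."""
    return rf + beta * (rm - rf)

def npv(rate, cashflows):
    """Net present value, with cashflows[0] occurring at t = 0."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

r_e = capm_cost_of_equity(rf=0.05, beta=1.2, rm=0.11)   # 0.05 + 1.2*0.06 = 0.122
project = [-1000.0, 400.0, 400.0, 400.0, 400.0]
print(round(r_e, 3), round(npv(r_e, project), 2))
```

    Discounting at a CAPM-derived rate rather than a flat hurdle rate is precisely the kind of sophistication the survey measures.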

  9. Improvements in techniques and processes

    International Nuclear Information System (INIS)

    Cairon, B.; Nolin, D.

    2003-01-01

    The paper presents the deconstruction and decontamination techniques used at COGEMA-La Hague for the dismantling and decontamination of the UP2 400 plant. Underwater interventions, particularly interventions from the edge of the pool, are described, where significant radiological constraints due to the presence of fuel are observed. The underwater fuel operations were undertaken to recover pieces of UNGG fuel and miscellaneous technological waste under 5 m of water and with reduced visibility. Here, remote works implying reduced dosimetry and increased security were carried out. Specific issues concerning tools and procedures are addressed as follows: pendulous telescopic tool holder on runway channel 215.40; HP cutting under water; cutting machine set up in the facility; suction of sludge; gripping and handling system for the slider and lid; dredging the sludge; tests in the facility; control console; shock absorbing units; moving the shock absorbing mattresses using slings; decontamination of large areas of stainless steel walls; cutting bulky parts in air; cutting a tubular structure under water; compacting the drums; concrete skinning using skinning machines; concrete skinning using the BRH hydraulic rock breaker; concrete skinning using shot blasting; dismantling the process cell using the 'ATENA' remote power carrier; removing openings through dry core sample drilling; removing openings through demolition

  10. Sophisticating a naive Liapunov function

    International Nuclear Information System (INIS)

    Smith, D.; Lewins, J.D.

    1985-01-01

    The art of the direct method of Liapunov to determine system stability is to construct a suitable Liapunov or V function, where V is to be positive definite (PD) and to shrink to a center, which may conveniently be chosen as the origin, and where its time derivative V̇ is negative definite (ND). One aid to the art is to solve an approximation to the system equations in order to provide a candidate V function. It can happen, however, that V̇ is not strictly ND but vanishes at a finite number of isolated points. Naively, one anticipates that stability has been demonstrated, since the trajectory of the system at such points is only momentarily tangential and immediately enters a region of inward directed trajectories. To demonstrate stability rigorously requires the construction of a sophisticated Liapunov function from what can be called the naive original choice. In this paper, the authors demonstrate the method of perturbing the naive function in the context of the well-known second-order oscillator and then apply the method to a more complicated problem based on a prompt jump model for a nuclear fission reactor
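
    For the standard second-order oscillator, the perturbation idea can be sketched as follows (a textbook construction assuming unit damping and stiffness, not the authors' exact derivation):

```latex
% Damped oscillator \ddot{x} + \dot{x} + x = 0, written with y = \dot{x}.
% Naive (energy-like) choice:
V(x,y) = \tfrac{1}{2}\left(x^2 + y^2\right), \qquad
\dot{V} = xy + y(-x - y) = -y^2 \le 0,
% so \dot{V} vanishes on the whole line y = 0 and is not strictly ND.
% Sophisticated (perturbed) choice, for small \varepsilon > 0:
V_\varepsilon(x,y) = \tfrac{1}{2}\left(x^2 + y^2\right) + \varepsilon x y, \qquad
\dot{V}_\varepsilon = -\varepsilon x^2 - \varepsilon x y - (1-\varepsilon) y^2,
% which is strictly negative definite whenever
% \varepsilon^2 < 4\,\varepsilon(1-\varepsilon), i.e. 0 < \varepsilon < 4/5.
```

    The cross term εxy is exactly the kind of perturbation of the naive function the paper refers to: it makes V̇ strictly negative definite while keeping V positive definite for small ε.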

  11. A Comparative of business process modelling techniques

    Science.gov (United States)

    Tangkawarow, I. R. H. T.; Waworuntu, J.

    2016-04-01

    Today there are many business process modelling techniques. This article investigates the differences between them: for each technique, the definition and the structure are explained. The paper presents a comparative analysis of some popular business process modelling techniques, based on two criteria: notation, and how each technique works when implemented in Somerleyton Animal Park. Each technique is summarized with its advantages and disadvantages. The final conclusion recommends business process modelling techniques that are easy to use, and serves as a basis for evaluating further modelling techniques.

  12. Applicability of statistical process control techniques

    NARCIS (Netherlands)

    Schippers, W.A.J.

    1998-01-01

    This paper concerns the application of Process Control Techniques (PCTs) for the improvement of the technical performance of discrete production processes. Successful applications of these techniques, such as Statistical Process Control Techniques (SPC), can be found in the literature. However, some

  13. Pension fund sophistication and investment policy

    NARCIS (Netherlands)

    de Dreu, J.|info:eu-repo/dai/nl/364537906; Bikker, J.A.|info:eu-repo/dai/nl/06912261X

    This paper assesses the sophistication of pension funds’ investment policies using data on 748 Dutch pension funds during the 1999–2006 period. We develop three indicators of sophistication: gross rounding of investment choices, investments in alternative sophisticated asset classes and ‘home bias’.

  14. Radiotracer techniques in mineral processing

    International Nuclear Information System (INIS)

    Przewlocki, K.

    1991-01-01

    The value of the smelter metal content in currently exploited polymetallic ores mostly does not exceed 2%. Before metallurgical treatment, the ore must pass through a concentration process. The beneficiation process usually starts with the comminution of excavated material and terminates with the flotation and drying of the concentrate. These operations consume vast quantities of energy. To be economically justified, the process requires optimization and, if possible, automatic control. Radioactive tracers were found to be useful in the identification of particular technological subsystems and their subsequent optimization, and a great deal of experience has been gathered in this field so far. The industrial radiotracer test (RTT) is carried out using very sensitive multidetector recording systems with digital data acquisition capabilities. The optimization strategy consists of periodically adjusting the technological process and the set points of controlled variables according to certain improvement procedures. If computer facilities are available, data interpretation and calibration of the mathematical models describing the technological process itself can be performed on the spot. This significantly accelerates the whole procedure, as the RTT may be repeated for particular system configurations. The procedure of plant optimization by means of RTT is illustrated in the paper using the example of the copper ore enrichment process, assuming that it is representative of the whole mineral industry. Identification by RTT of the three main operations involved in the ore enrichment process, namely comminution, flotation and granular classification, is discussed in detail as particular case studies. In reference to this, it is also shown how the technological process can be adjusted to be most efficient. (author). 14 refs, 7 figs
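
    An RTT response curve is typically reduced to a few summary numbers, the most common being the mean residence time (MRT) of the tracer. A minimal sketch on a synthetic response curve (the tanks-in-series shape and time constant are illustrative assumptions, not plant data):

```python
import numpy as np

# Mean residence time from a tracer response C(t):
#   MRT = integral(t * C(t) dt) / integral(C(t) dt)
t = np.linspace(0.0, 120.0, 1201)            # minutes, uniform grid
tau = 10.0                                   # assumed time constant
C = (t / tau**2) * np.exp(-t / tau)          # gamma-shaped response, mean 2*tau
mrt = np.sum(t * C) / np.sum(C)              # uniform spacing: the dt cancels
print(round(mrt, 1))  # → 20.0
```

    Comparing the measured MRT (and higher moments of the residence time distribution) against the model prediction is what allows the set points to be adjusted between repeated RTTs.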

  15. Sophisticated fuel handling system evolved

    International Nuclear Information System (INIS)

    Ross, D.A.

    1988-01-01

    The control systems at Sellafield fuel handling plant are described. The requirements called for built-in diagnostic features as well as the ability to handle a large sequencing application. Speed was also important; responses better than 50 ms were required. The control systems are used to automate operations within each of the three main process caves - two Magnox fuel decanners and an advanced gas-cooled reactor fuel dismantler. The fuel route within the fuel handling plant is illustrated and described. ASPIC (Automated Sequence Package for Industrial Control), which was developed as a controller for the plant processes, is described. (U.K.)

  16. Natural language processing techniques for automatic test ...

    African Journals Online (AJOL)

    Natural language processing techniques for automatic test question generation using discourse connectives. Journal of Computer Science and Its Application.

  17. Probabilistic evaluation of process model matching techniques

    NARCIS (Netherlands)

    Kuss, Elena; Leopold, Henrik; van der Aa, Han; Stuckenschmidt, Heiner; Reijers, Hajo A.

    2016-01-01

    Process model matching refers to the automatic identification of corresponding activities between two process models. It represents the basis for many advanced process model analysis techniques such as the identification of similar process parts or process model search. A central problem is how to

  18. A measurement technique for counting processes

    International Nuclear Information System (INIS)

    Cantoni, V.; Pavia Univ.; De Lotto, I.; Valenziano, F.

    1980-01-01

    A technique for the estimation of the first- and second-order properties of a stationary counting process is presented here which uses standard instruments for the analysis of a continuous stationary random signal. (orig.)
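
    The first- and second-order properties in question can be estimated from binned event counts. A minimal sketch for a homogeneous Poisson process (the rate, duration and bin width are illustrative assumptions):

```python
import numpy as np

# Estimate intensity (first order) and the Fano factor (a second-order
# summary) of a stationary counting process from binned counts.
rng = np.random.default_rng(1)
rate, T, dt = 5.0, 2000.0, 1.0               # events/s, duration, bin width
n_events = rng.poisson(rate * T)
times = np.sort(rng.uniform(0.0, T, n_events))
counts, _ = np.histogram(times, bins=int(T / dt), range=(0.0, T))

mean_rate = counts.mean() / dt               # first-order: intensity estimate
fano = counts.var() / counts.mean()          # second-order: ~1 for Poisson
print(round(mean_rate, 2), round(fano, 2))
```

    A Fano factor away from 1 would indicate clustering or regularity, i.e. second-order structure beyond the Poisson case.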

  19. Multibeam swath bathymetry signal processing techniques

    Digital Repository Service at National Institute of Oceanography (India)

    Ranade, G.; Sudhakar, T.

    Mathematical advances, and advances in real-time signal processing techniques in recent times, have considerably improved the state of the art in bathymetry systems. These improvements have helped in developing high resolution swath...

  20. The value of multivariate model sophistication

    DEFF Research Database (Denmark)

    Rombouts, Jeroen; Stentoft, Lars; Violante, Francesco

    2014-01-01

    We assess the predictive accuracies of a large number of multivariate volatility models in terms of pricing options on the Dow Jones Industrial Average. We measure the value of model sophistication in terms of dollar losses by considering a set of 444 multivariate models that differ in their specification. In addition to investigating the value of model sophistication in terms of dollar losses directly, we also use the model confidence set approach to statistically infer the set of models that delivers the best pricing performances.

  1. Image processing techniques for remote sensing data

    Digital Repository Service at National Institute of Oceanography (India)

    RameshKumar, M.R.

    interpretation and for processing of scene data for autonomous machine perception. The techniques of digital image processing are used for automatic character/pattern recognition, industrial robots for product assembly and inspection, military reconnaissance... and spatial co-ordinates into discrete components. The mathematical concepts involved are sampling and transform theory. Two-dimensional transforms are used for image enhancement, restoration, encoding and description too. The main objective of the image...

  2. Advances in process research by radionuclide techniques

    International Nuclear Information System (INIS)

    Merz, A.; Vogg, H.

    1978-01-01

    Modifications and transformations of materials and their technical implementation in process systems require movement of materials. Radionuclide techniques can greatly help in understanding and describing these mechanisms. The specialized measuring technique is demonstrated by three examples selected from various fields of process technology. Radioactive tracer studies performed on a rotary kiln helped, inter alia, to establish a subdivision into process zones and to pinpoint areas of dust generation. Mixing and feeding actions were studied in a twin screw extruder equipped with a special screw and mixer disk arrangement. Tracer experiments conducted in two secondary settling basins indicate the differences in the mechanisms of movement of the aqueous phase if the mean residence time and the residence time distribution may be influenced not only by hydraulic loads, but also by design variants of the overflow flumes. (orig./HP) [de

  3. Reasoning about objects using process calculus techniques

    DEFF Research Database (Denmark)

    Kleist, Josva

    This thesis investigates the applicability of techniques known from the world of process calculi to reason about properties of object-oriented programs. The investigation is performed upon a small object-oriented language, the Sigma-calculus of Abadi and Cardelli. The investigation is twofold: We...-calculus turns out to be insufficient. Based on our experiences, we present a translation of a typed imperative Sigma-calculus, which looks promising. We are able to provide simple proofs of the equivalence of different Sigma-calculus objects using this translation. We use a labelled transition system adapted... to the Sigma-calculus to investigate the use of process calculi techniques directly on the Sigma-calculus. The results obtained are of a fairly theoretical nature. We investigate the connection between the operational and denotational semantics for a typed functional Sigma-calculus. The result is that Abadi...

  4. Policy and process of innovation in techniques

    International Nuclear Information System (INIS)

    Kim, In Su; Lee, Jin Ju

    1982-09-01

    This book discusses the policy and process of innovation in techniques. It covers an introduction; a macroscopic analysis of the development of science and technology, including an analysis of existing research on systems for developing science and technology and a new development system for science and technology; and a macroscopic analysis of the development of science and technology in Korea. It also examines the innovation of technology in Korean industry from various angles.

  5. Actinide recovery techniques utilizing electromechanical processes

    International Nuclear Information System (INIS)

    Westphal, B.R.; Benedict, R.W.

    1994-01-01

    Under certain conditions, the separation of actinides using electromechanical techniques may be an effective means of residue processing. The separation of granular mixtures of actinides and other materials is based on appreciable differences in the magnetic and electrical properties of the actinide elements. In addition, the high density of actinides, particularly uranium and plutonium, may render a simultaneous separation based on mutually complementary parameters. Both high intensity magnetic separation and electrostatic separation have been investigated for the concentration of an actinide waste stream. Waste stream constituents include an actinide metal alloy and broken quartz shards. The investigation of these techniques is in support of the Integral Fast Reactor (IFR) concept currently being developed at Argonne National Laboratory under the auspices of the Department of Energy

  7. Particle Handling Techniques in Microchemical Processes

    Directory of Open Access Journals (Sweden)

    Brian S. Flowers

    2012-08-01

    The manipulation of particulates in microfluidics is a challenge that continues to impact applications ranging from fine chemicals manufacturing to the materials and life sciences. Heterogeneous operations carried out in microreactors involve high surface-to-volume characteristics that minimize heat and mass transport resistances, offering precise control of the reaction conditions. Considerable advances have been made towards the engineering of techniques that control particles in microscale laminar flow, yet there remain tremendous opportunities for improvements in the area of chemical processing. Strategies that have been developed to successfully advance systems involving heterogeneous materials are reviewed, and an outlook is provided in the context of the challenges of continuous flow fine chemical processes.

  8. Does underground storage still require sophisticated studies?

    International Nuclear Information System (INIS)

    Marsily, G. de

    1997-01-01

    Most countries agree on the necessity of burying high- or medium-level wastes in geological layers situated a few hundred meters below ground level. The advantages and disadvantages of different types of rock, such as salt, clay, granite and volcanic material, are examined. Sophisticated studies are conducted to determine the best geological confinement, but questions arise about the time for which safety must be ensured. France has chosen three possible sites. These sites are geologically described in the article. The final site will be proposed after a testing phase of about five years in an underground facility. (A.C.)

  9. Flame analysis using image processing techniques

    Science.gov (United States)

    Her Jie, Albert Chang; Zamli, Ahmad Faizal Ahmad; Zulazlan Shah Zulkifli, Ahmad; Yee, Joanne Lim Mun; Lim, Mooktzeng

    2018-04-01

    This paper presents image processing techniques using fuzzy logic and a neural network approach to perform flame analysis. Flame diagnostics is important in industry for extracting relevant information from flame images. Experimental tests were carried out in a model industrial burner at different flow rates. Flame features such as luminous and spectral parameters are extracted using image processing and the Fast Fourier Transform (FFT). Flame images are acquired using a FLIR infrared camera. Non-linearities such as thermoacoustic oscillations and background noise affect the stability of the flame, and flame velocity is one of the important characteristics that determine stability. In this paper, an image processing method is proposed to determine flame velocity. A power spectral density (PSD) graph is a good tool for vibration analysis, from which flame stability can be approximated; however, a more intelligent diagnostic system is needed to determine flame stability automatically. In this paper, flame features at different flow rates are compared and analyzed. The selected flame features are used as inputs to the proposed fuzzy inference system to determine flame stability. A neural network is used to test the performance of the fuzzy inference system.
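
    The PSD step of such an analysis can be sketched on a synthetic intensity trace, with a single dominant tone standing in for a thermoacoustic oscillation (the 120 Hz frequency, sampling rate and noise level are assumptions, not values from the paper):

```python
import numpy as np

# Locate the dominant oscillation frequency of a flame-intensity signal
# from its one-sided periodogram (a simple PSD estimate).
fs = 1000.0                                      # sampling rate, Hz (assumed)
t = np.arange(2000) / fs                         # 2 s of samples
rng = np.random.default_rng(2)
signal = np.sin(2 * np.pi * 120.0 * t) + 0.5 * rng.standard_normal(t.size)

spec = np.abs(np.fft.rfft(signal)) ** 2 / t.size  # periodogram
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
peak_hz = freqs[np.argmax(spec[1:]) + 1]          # skip the DC bin
print(peak_hz)  # → 120.0
```

    A sharp, stable peak suggests a coherent thermoacoustic mode; a broad or wandering peak is the kind of feature the fuzzy inference system would map to an unstable flame.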

  10. Surface analytical techniques applied to minerals processing

    International Nuclear Information System (INIS)

    Smart, R.St.C.

    1991-01-01

    An understanding of the chemical and physical forms of the chemically altered layers on the surfaces of base metal sulphides, particularly in the form of hydroxides, oxyhydroxides and oxides, and the changes that occur in them during minerals processing, lies at the core of a complete description of flotation chemistry. This paper reviews the application of a variety of surface-sensitive techniques and methodologies to the study of surface layers on single minerals, mixed minerals, synthetic ores and real ores. Evidence from combined XPS/SAM/SEM studies has provided images and analyses of three forms of oxide, oxyhydroxide and hydroxide products on the surfaces of single sulphide minerals, mineral mixtures and complex sulphide ores. 4 refs., 2 tabs., 4 figs

  11. Application of Electroporation Technique in Biofuel Processing

    Directory of Open Access Journals (Sweden)

    Yousuf Abu

    2017-01-01

    Biofuel production is mostly based on fermentation, which requires fermentable sugar as a nutrient for microbial growth. Lignocellulosic biomass (LCB) represents the most attractive low-cost feedstock for biofuel production and is now arousing great interest. The cellulose embedded in the lignin matrix has an insoluble, highly crystalline structure, so it is difficult to hydrolyze into fermentable sugar or cell protein. On the other hand, microbial lipids have been studied as a substitute for plant oils or animal fat to produce biodiesel, and it is still a great challenge to extract maximum lipid from microbial cells (yeast, fungi, algae) while investing minimum energy. Electroporation (EP) of LCB results in a significant increase in cell conductivity and permeability caused by the application of an external electric field. EP is required to alter the size and structure of the biomass, to reduce the cellulose crystallinity, and to increase its porosity as well as change its chemical composition, so that hydrolysis of the carbohydrate fraction to monomeric sugars can be achieved rapidly and with greater yields. Furthermore, EP has great potential to disrupt microbial cell walls within a few seconds to bring the intracellular materials (lipids) out into solution. Therefore, this study aims to describe the challenges and prospects of applying the EP technique in biofuel processing.

  12. Application of on-line analytical processing technique in accelerator

    International Nuclear Information System (INIS)

    Xie Dong; Li Weimin; He Duohui; Liu Gongfa; Xuan Ke

    2005-01-01

    A method for applying the on-line analytical processing technique in accelerators is described, covering data pre-processing, the construction of the data warehouse, and on-line analytical processing. (authors)

  13. Process efficiency in isotope techniques by microelectronics

    International Nuclear Information System (INIS)

    Ziegler, J.; Boelke, L.; Nagli, G.; Schaelicke, W.

    1987-01-01

    The wide application of tracer techniques as a diagnostic tool in nuclear medicine requires the dosing of radioactive tracer solutions in a volume range of 0.5 to 10 ml with a high degree of precision. Two types of computer-controlled filling and closing machines for injection bottles are described. Safe handling of radioactive solutions is given special consideration. (author)

  14. Criminal Network Investigation: Processes, Tools, and Techniques

    DEFF Research Database (Denmark)

    Petersen, Rasmus Rosenqvist

    Criminal network investigations such as police investigations, intelligence analysis, and investigative journalism involve a range of complex knowledge management processes and tasks. Criminal network investigators collect, process, and analyze information related to a specific target to create... an important challenge for criminal network investigation, despite the massive attention it receives from research and media. Challenges such as the investigation process, the context of the investigation, human factors such as thinking and creativity, and political decisions and legal laws could all mean the success or failure of criminal network investigations. Information, process, and human factors are challenges we find to be addressable by software system support. Based on those...

  15. Image processing techniques for digital orthophotoquad production

    Science.gov (United States)

    Hood, Joy J.; Ladner, L. J.; Champion, Richard A.

    1989-01-01

    Orthophotographs have long been recognized for their value as supplements or alternatives to standard maps. Recent trends towards digital cartography have resulted in efforts by the US Geological Survey to develop a digital orthophotoquad production system. Digital image files were created by scanning color infrared photographs on a microdensitometer. Rectification techniques were applied to remove tilt and relief displacement, thereby creating digital orthophotos. Image mosaicking software was then used to join the rectified images, producing digital orthophotos in quadrangle format.

  16. Digital image processing techniques in archaeology

    Digital Repository Service at National Institute of Oceanography (India)

    Santanam, K.; Vaithiyanathan, R.; Tripati, S.

    Digital image processing involves the manipulation and interpretation of digital images with the aid of a computer. This form of remote sensing actually began in the 1960s with a limited number of researchers analysing multispectral scanner data...

  17. A general software reliability process simulation technique

    Science.gov (United States)

    Tausworthe, Robert C.

    1991-01-01

    The structure and rationale of the generalized software reliability process, together with the design and implementation of a computer program that simulates this process are described. Given assumed parameters of a particular project, the users of this program are able to generate simulated status timelines of work products, numbers of injected anomalies, and the progress of testing, fault isolation, repair, validation, and retest. Such timelines are useful in comparison with actual timeline data, for validating the project input parameters, and for providing data for researchers in reliability prediction modeling.
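    The kind of simulated status timeline described above can be sketched with a toy Monte Carlo model. The rates and delays below are invented placeholders, not parameters from Tausworthe's simulator: anomalies are injected as a homogeneous Poisson process and each is repaired after an exponentially distributed delay.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented project parameters (placeholders, not from the paper):
INJECTION_RATE = 3.0      # anomalies injected per week
MEAN_REPAIR_DELAY = 2.0   # mean weeks from injection to repair
WEEKS = 52

# Injection times: homogeneous Poisson process over the project timeline.
n_injected = rng.poisson(INJECTION_RATE * WEEKS)
inject_t = np.sort(rng.uniform(0.0, WEEKS, n_injected))

# Each anomaly is repaired after an exponentially distributed delay.
repair_t = inject_t + rng.exponential(MEAN_REPAIR_DELAY, n_injected)

# Weekly status timeline: cumulative injected, repaired, and still-open counts.
grid = np.arange(WEEKS + 1)
injected = np.searchsorted(inject_t, grid)
repaired = (repair_t[None, :] <= grid[:, None]).sum(axis=1)
open_anomalies = injected - repaired
```

    Comparing such simulated timelines against actual project timeline data is the validation step the abstract describes.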

  18. Processing data collected from radiometric experiments by multivariate technique

    International Nuclear Information System (INIS)

    Urbanski, P.; Kowalska, E.; Machaj, B.; Jakowiuk, A.

    2005-01-01

    Multivariate techniques applied for processing data collected from radiometric experiments can provide more efficient extraction of the information contained in the spectra. Several techniques are considered: (i) multivariate calibration using Partial Least Square Regression and Artificial Neural Networks, (ii) standardization of the spectra, (iii) smoothing of collected spectra, where the autocorrelation function and the bootstrap were used for the assessment of the processed data, (iv) image processing using Principal Component Analysis. Application of these techniques is illustrated with examples of some industrial applications. (author)
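    As a minimal sketch of the multivariate-calibration idea (item i), the following implements single-response partial least squares with the NIPALS algorithm in plain NumPy; the synthetic "spectra" and the component count are illustrative assumptions, not data from the paper.

```python
import numpy as np

def pls1_fit(X, y, n_components, tol=1e-12):
    """Single-response partial least squares fitted with NIPALS."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc
        norm = np.linalg.norm(w)
        if norm < tol:                 # response residual fully explained
            break
        w /= norm
        t = Xc @ w                     # scores
        tt = t @ t
        p = Xc.T @ t / tt              # X loadings
        qk = yc @ t / tt               # y loading
        Xc = Xc - np.outer(t, p)       # deflate X and y
        yc = yc - qk * t
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.solve(P.T @ W, q)  # regression vector
    return x_mean, y_mean, B

def pls1_predict(model, X):
    x_mean, y_mean, B = model
    return y_mean + (X - x_mean) @ B

# Synthetic "spectra": 60 samples x 25 channels; the response depends
# linearly on three channels (invented data, not from the paper).
rng = np.random.default_rng(1)
X = rng.normal(size=(60, 25))
y = X[:, 0] + 2.0 * X[:, 1] + 3.0 * X[:, 2]
model = pls1_fit(X, y, n_components=25)
y_hat = pls1_predict(model, X)
```

    With a noiseless linear response and enough components the fit reproduces the training response essentially exactly, which is a convenient sanity check on the implementation.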

  19. Comparison of process estimation techniques for on-line calibration monitoring

    International Nuclear Information System (INIS)

    Shumaker, B. D.; Hashemian, H. M.; Morton, G. W.

    2006-01-01

    The goal of on-line calibration monitoring is to reduce the number of unnecessary calibrations performed each refueling cycle on pressure, level, and flow transmitters in nuclear power plants. The effort requires a baseline for determining calibration drift and thereby the need for a calibration. There are two ways to establish the baseline: averaging and modeling. Averaging techniques have proven to be highly successful in applications where there are a large number of redundant transmitters; but for systems with little or no redundancy, averaging methods are not always reliable. That is, for non-redundant transmitters, more sophisticated process estimation techniques are needed to augment or replace the averaging techniques. This paper explores three well-known process estimation techniques, namely Independent Component Analysis (ICA), Auto-Associative Neural Networks (AANN), and Auto-Associative Kernel Regression (AAKR). Using experience and data from an operating nuclear plant, the paper presents an evaluation of the effectiveness of these methods in detecting transmitter drift in actual plant conditions. (authors)
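    Of the three methods, AAKR is the simplest to sketch: the "correct" value of a query vector is estimated as a kernel-weighted average of fault-free memory vectors, and a large residual on one channel flags drift. The three-channel toy data below are an assumption for illustration, not plant data.

```python
import numpy as np

def aakr_estimate(memory, query, bandwidth):
    """Estimate the fault-free sensor vector for `query` as a Gaussian-kernel
    weighted average of historical fault-free `memory` vectors."""
    d2 = ((memory - query) ** 2).sum(axis=1)
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))
    return (w[:, None] * memory).sum(axis=0) / w.sum()

# Fault-free memory: three mutually consistent channels driven by a single
# process variable s (invented data for illustration).
s = np.linspace(0.0, 1.0, 201)
memory = np.column_stack([s, 2.0 * s, 3.0 * s])

# Query taken at s = 0.5, except channel 1 has drifted upward by +0.3.
query = np.array([0.5, 1.3, 1.5])
estimate = aakr_estimate(memory, query, bandwidth=0.1)
residual = query - estimate   # the largest residual flags the drifted channel
```

    Because the estimate is pulled toward internally consistent memory vectors, the drifted channel carries most of the residual even though all three channels contribute to the distance.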

  20. Does Investors' Sophistication Affect Persistence and Pricing of Discretionary Accruals?

    OpenAIRE

    Lanfeng Kao

    2007-01-01

    This paper examines whether the sophistication of market investors influences management's strategy on discretionary accounting choice, and thus changes the persistence of discretionary accruals. The results show that the persistence of discretionary accruals for firms face with naive investors is lower than that for firms face with sophisticated investors. The results also demonstrate that sophisticated investors indeed incorporate the implications of current earnings components into future ...

  1. Wavelet processing techniques for digital mammography

    Science.gov (United States)

    Laine, Andrew F.; Song, Shuwu

    1992-09-01

    This paper introduces a novel approach for accomplishing mammographic feature analysis through multiresolution representations. We show that efficient (nonredundant) representations may be identified from digital mammography and used to enhance specific mammographic features within a continuum of scale space. The multiresolution decomposition of wavelet transforms provides a natural hierarchy in which to embed an interactive paradigm for accomplishing scale space feature analysis. Similar to traditional coarse-to-fine matching strategies, the radiologist may first choose to look for coarse features (e.g., dominant mass) within low frequency levels of a wavelet transform and later examine finer features (e.g., microcalcifications) at higher frequency levels. In addition, features may be extracted by applying geometric constraints within each level of the transform. Choosing wavelets (or analyzing functions) that are simultaneously localized in both space and frequency results in a powerful methodology for image analysis. Multiresolution and orientation selectivity, known biological mechanisms in primate vision, are ingrained in wavelet representations and inspire the techniques presented in this paper. Our approach includes local analysis of complete multiscale representations. Mammograms are reconstructed from wavelet representations, enhanced by linear, exponential and constant weight functions through scale space. By improving the visualization of breast pathology we can improve the chances of early detection of breast cancers (improve quality) while requiring less time to evaluate mammograms for most patients (lower costs).
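    A minimal version of the enhancement idea, using a one-level orthonormal Haar transform in NumPy instead of the paper's full multiresolution machinery (the gain factor and the 8x8 test image are illustrative assumptions):

```python
import numpy as np

def haar2d(img):
    """One level of an orthonormal 2-D Haar transform (image sides even)."""
    s = np.sqrt(2.0)
    lo = (img[:, 0::2] + img[:, 1::2]) / s   # column pairs
    hi = (img[:, 0::2] - img[:, 1::2]) / s
    ll = (lo[0::2] + lo[1::2]) / s           # then row pairs
    lh = (lo[0::2] - lo[1::2]) / s
    hl = (hi[0::2] + hi[1::2]) / s
    hh = (hi[0::2] - hi[1::2]) / s
    return ll, lh, hl, hh

def ihaar2d(ll, lh, hl, hh):
    """Inverse of haar2d."""
    s = np.sqrt(2.0)
    n, m = ll.shape
    lo = np.empty((2 * n, m)); hi = np.empty((2 * n, m))
    lo[0::2], lo[1::2] = (ll + lh) / s, (ll - lh) / s
    hi[0::2], hi[1::2] = (hl + hh) / s, (hl - hh) / s
    img = np.empty((2 * n, 2 * m))
    img[:, 0::2], img[:, 1::2] = (lo + hi) / s, (lo - hi) / s
    return img

rng = np.random.default_rng(2)
img = rng.random((8, 8))
ll, lh, hl, hh = haar2d(img)
recon = ihaar2d(ll, lh, hl, hh)            # perfect reconstruction

gain = 2.0                                 # amplify the detail (fine) bands
enhanced = ihaar2d(ll, gain * lh, gain * hl, gain * hh)
```

    Amplifying only the detail bands sharpens fine structure while leaving the coarse approximation, and hence the overall mean intensity, untouched.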

  2. A probabilistic evaluation procedure for process model matching techniques

    NARCIS (Netherlands)

    Kuss, Elena; Leopold, Henrik; van der Aa, Han; Stuckenschmidt, Heiner; Reijers, Hajo A.

    2018-01-01

    Process model matching refers to the automatic identification of corresponding activities between two process models. It represents the basis for many advanced process model analysis techniques such as the identification of similar process parts or process model search. A central problem is how to

  3. Computer processing techniques in digital radiography research

    International Nuclear Information System (INIS)

    Pickens, D.R.; Kugel, J.A.; Waddill, W.B.; Smith, G.D.; Martin, V.N.; Price, R.R.; James, A.E. Jr.

    1985-01-01

    In the Department of Radiology and Radiological Sciences, Vanderbilt University Medical Center, and the Center for Medical Imaging Research, Nashville, TN, there are several activities which are designed to increase the information available from film-screen acquisition as well as from direct digital acquisition of radiographic information. Two of the projects involve altering the display of images after acquisition, either to remove artifacts present as a result of the acquisition process or to change the manner in which the image is displayed to improve the perception of details in the image. These two projects use methods which can be applied to any type of digital image, but are being implemented with images digitized from conventional x-ray film. One of these research endeavors involves mathematical alteration of the image to correct for motion artifacts or registration errors between images that will be subtracted. Another applies well-known image processing methods to digital radiographic images to improve the image contrast and enhance subtle details in the image. A third project involves the use of dual energy imaging with a digital radiography system to reconstruct images which demonstrate either soft tissue details or the osseous structures. These projects are discussed in greater detail in the following sections of this communication
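    The dual-energy reconstruction mentioned in the third project can be illustrated by weighted log-domain subtraction. The attenuation coefficients below are invented round numbers, not measured values; the point is only that choosing the weight as the ratio of one material's coefficients cancels that material exactly.

```python
import numpy as np

# Invented mass-attenuation coefficients at a low and a high tube energy
# (illustrative round numbers, not measured values).
MU_SOFT = {"low": 0.25, "high": 0.18}
MU_BONE = {"low": 0.60, "high": 0.30}

rng = np.random.default_rng(3)
soft = rng.uniform(5.0, 15.0, size=(4, 4))     # soft-tissue thickness map
bone = np.zeros((4, 4)); bone[1:3, 1:3] = 2.0  # bone only in the centre

def log_image(energy):
    """-log of the transmitted intensity: linear in material thicknesses."""
    return MU_SOFT[energy] * soft + MU_BONE[energy] * bone

low, high = log_image("low"), log_image("high")

# The weight ratio of bone coefficients cancels bone exactly, leaving a
# soft-tissue-only image; swapping the roles isolates the osseous structures.
w_bone = MU_BONE["low"] / MU_BONE["high"]
soft_only = low - w_bone * high
w_soft = MU_SOFT["low"] / MU_SOFT["high"]
bone_only = low - w_soft * high
```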

  4. Discovering Process Reference Models from Process Variants Using Clustering Techniques

    NARCIS (Netherlands)

    Li, C.; Reichert, M.U.; Wombacher, Andreas

    2008-01-01

    In today's dynamic business world, success of an enterprise increasingly depends on its ability to react to changes in a quick and flexible way. In response to this need, process-aware information systems (PAIS) emerged, which support the modeling, orchestration and monitoring of business processes

  5. Automatically Assessing Lexical Sophistication: Indices, Tools, Findings, and Application

    Science.gov (United States)

    Kyle, Kristopher; Crossley, Scott A.

    2015-01-01

    This study explores the construct of lexical sophistication and its applications for measuring second language lexical and speaking proficiency. In doing so, the study introduces the Tool for the Automatic Analysis of LExical Sophistication (TAALES), which calculates text scores for 135 classic and newly developed lexical indices related to word…

  6. The Impact of Financial Sophistication on Adjustable Rate Mortgage Ownership

    Science.gov (United States)

    Smith, Hyrum; Finke, Michael S.; Huston, Sandra J.

    2011-01-01

    The influence of a financial sophistication scale on adjustable-rate mortgage (ARM) borrowing is explored. Descriptive statistics and regression analysis using recent data from the Survey of Consumer Finances reveal that ARM borrowing is driven by both the least and most financially sophisticated households but for different reasons. Less…

  7. The role of sophisticated accounting system in strategy management

    OpenAIRE

    Naranjo Gil, David

    2004-01-01

    Organizations are designing more sophisticated accounting information systems to meet the strategic goals and enhance their performance. This study examines the effect of accounting information system design on the performance of organizations pursuing different strategic priorities. The alignment between sophisticated accounting information systems and organizational strategy is analyzed. The enabling effect of the accounting information system on performance is also examined. Relationships ...

  8. Probabilistic Sophistication, Second Order Stochastic Dominance, and Uncertainty Aversion

    OpenAIRE

    Simone Cerreia-Vioglio; Fabio Maccheroni; Massimo Marinacci; Luigi Montrucchio

    2010-01-01

    We study the interplay of probabilistic sophistication, second order stochastic dominance, and uncertainty aversion, three fundamental notions in choice under uncertainty. In particular, our main result, Theorem 2, characterizes uncertainty averse preferences that satisfy second order stochastic dominance, as well as uncertainty averse preferences that are probabilistically sophisticated.

  9. The First Sophists and the Uses of History.

    Science.gov (United States)

    Jarratt, Susan C.

    1987-01-01

    Reviews the history of intellectual views on the Greek sophists in three phases: (1) their disparagement by Plato and Aristotle as the morally disgraceful "other"; (2) nineteenth century British positivists' reappraisal of these relativists as ethically and scientifically superior; and (3) twentieth century versions of the sophists as…

  10. Evaluation of EMG processing techniques using Information Theory.

    Science.gov (United States)

    Farfán, Fernando D; Politti, Julio C; Felice, Carmelo J

    2010-11-12

    Electromyographic signals can be used in the biomedical engineering and/or rehabilitation fields as potential sources of control for prosthetics and orthotics. In such applications, digital processing techniques are necessary to follow efficiently and effectively the changes in the physiological characteristics produced by a muscular contraction. In this paper, two methods based on information theory are proposed to evaluate the processing techniques. These methods determine the amount of information that a processing technique is able to extract from EMG signals. The processing techniques evaluated with these methods were: absolute mean value (AMV), RMS values, variance values (VAR) and difference absolute mean value (DAMV). EMG signals from the middle deltoid during abduction and adduction movements of the arm in the scapular plane were registered, for static and dynamic contractions. The optimal window length (segmentation), abduction and adduction movements and inter-electrode distance were also analyzed. Using the optimal segmentation (200 ms and 300 ms in static and dynamic contractions, respectively) the best processing techniques were: RMS, AMV and VAR in static contractions, and only the RMS in dynamic contractions. Using the RMS of the EMG signal, variations in the amount of information between the abduction and adduction movements were observed. Although the evaluation methods proposed here were applied to standard processing techniques, these methods can also be considered as alternative tools to evaluate new processing techniques in different areas of electrophysiology.
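    The four amplitude estimators compared in the study are one-liners; a sketch with a toy alternating signal follows (the window length and signal are illustrative, not the recorded deltoid EMG):

```python
import numpy as np

def emg_features(x):
    """The windowed EMG amplitude estimators named in the study."""
    amv = np.mean(np.abs(x))            # absolute mean value
    rms = np.sqrt(np.mean(x ** 2))      # root mean square
    var = np.var(x)                     # variance
    damv = np.mean(np.abs(np.diff(x)))  # difference absolute mean value
    return amv, rms, var, damv

def windowed(x, win):
    """Apply the estimators over non-overlapping windows of `win` samples."""
    n = len(x) // win
    return np.array([emg_features(x[i * win:(i + 1) * win]) for i in range(n)])

# Toy check on an alternating unit signal.
x = np.array([1.0, -1.0, 1.0, -1.0])
amv, rms, var, damv = emg_features(x)
```

    In practice the window length corresponds to the optimal segmentation the paper reports (200 ms static, 300 ms dynamic, converted to samples at the recording rate).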

  11. Evaluation of EMG processing techniques using Information Theory

    Directory of Open Access Journals (Sweden)

    Felice Carmelo J

    2010-11-01

    Background: Electromyographic signals can be used in the biomedical engineering and/or rehabilitation fields as potential sources of control for prosthetics and orthotics. In such applications, digital processing techniques are necessary to follow efficiently and effectively the changes in the physiological characteristics produced by a muscular contraction. In this paper, two methods based on information theory are proposed to evaluate the processing techniques. Methods: These methods determine the amount of information that a processing technique is able to extract from EMG signals. The processing techniques evaluated with these methods were: absolute mean value (AMV), RMS values, variance values (VAR) and difference absolute mean value (DAMV). EMG signals from the middle deltoid during abduction and adduction movements of the arm in the scapular plane were registered, for static and dynamic contractions. The optimal window length (segmentation), abduction and adduction movements and inter-electrode distance were also analyzed. Results: Using the optimal segmentation (200 ms and 300 ms in static and dynamic contractions, respectively) the best processing techniques were: RMS, AMV and VAR in static contractions, and only the RMS in dynamic contractions. Using the RMS of the EMG signal, variations in the amount of information between the abduction and adduction movements were observed. Conclusions: Although the evaluation methods proposed here were applied to standard processing techniques, these methods can also be considered as alternative tools to evaluate new processing techniques in different areas of electrophysiology.

  12. Assessment of Process Monitoring Techniques for Pyroprocessing Safeguards

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Y. E.; Kim, C. M.; Yim, M. S. [KAIST, Daejeon (Korea, Republic of)

    2016-05-15

    PM technologies can be used to inspect normal/off-normal operation with various data obtained from facility operations in real time to meet safeguards objectives. To support the use of PM technologies for the purpose of pyroprocessing safeguards, this study aims at identifying technologies that could be useful for PM purposes and evaluating their applicability to a pyroprocessing facility. This paper describes the development of assessment criteria to evaluate the practicality of candidate technologies for PM based on a variety of requirements and considerations. By using the developed assessment criteria, the application of technologies in the oxide reduction process was assessed as a test case example. Research is necessary to validate the criteria according to the needs of each unit process, perhaps based on expert elicitation and/or international collaboration with other expert organization(s). These advanced assessment criteria will serve as a useful guideline for selecting appropriate candidate PM technologies for pyroprocessing safeguards. Based on the results of using these evaluation criteria, the optimum technologies can be successfully selected for use at a large-scale pyroprocessing facility.

  13. A new processing technique for airborne gamma-ray data

    DEFF Research Database (Denmark)

    Hovgaard, Jens

    1997-01-01

    The mathematical-statistical background for a new technique for processing gamma-ray spectra is presented. The technique - Noise Adjusted Singular Value Decomposition - decomposes a set of gamma-ray spectra into a few basic spectra - the spectral components. The spectral components can be proce...
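    A rough sketch of the NASVD idea: scale each channel by an estimate of its Poisson noise (the square root of the mean spectrum), truncate the SVD to a few components, and rescale. The synthetic two-component spectra below are an assumption for illustration.

```python
import numpy as np

def nasvd_denoise(spectra, n_components):
    """Noise Adjusted SVD sketch: noise-normalize channels, truncate the
    SVD to the leading spectral components, then undo the scaling."""
    mean_spec = spectra.mean(axis=0)
    scale = np.sqrt(np.maximum(mean_spec, 1e-12))  # Poisson noise estimate
    U, s, Vt = np.linalg.svd(spectra / scale, full_matrices=False)
    s[n_components:] = 0.0                         # keep a few components
    return (U * s) @ Vt * scale

# Two fixed spectral shapes mixed in varying proportions (rank-2 data).
channels = np.arange(64)
shape1 = np.exp(-0.5 * ((channels - 20) / 4.0) ** 2) + 0.2
shape2 = np.exp(-0.5 * ((channels - 45) / 6.0) ** 2) + 0.1
mix = np.column_stack([np.linspace(1, 3, 30), np.linspace(3, 1, 30)])
spectra = mix @ np.vstack([shape1, shape2])

denoised = nasvd_denoise(spectra, n_components=2)
```

    On noiseless rank-2 data the two-component reconstruction is exact; on real airborne survey data the discarded components carry mostly counting noise.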

  14. Techniques and software architectures for medical visualisation and image processing

    NARCIS (Netherlands)

    Botha, C.P.

    2005-01-01

    This thesis presents a flexible software platform for medical visualisation and image processing, a technique for the segmentation of the shoulder skeleton from CT data and three techniques that make contributions to the field of direct volume rendering. Our primary goal was to investigate the use

  15. Noncontaminating technique for making holes in existing process systems

    Science.gov (United States)

    Hecker, T. P.; Czapor, H. P.; Giordano, S. M.

    1972-01-01

    Technique is developed for making cleanly-contoured holes in assembled process systems without introducing chips or other contaminants into system. Technique uses portable equipment and does not require dismantling of system. Method was tested on Inconel, stainless steel, ASTMA-53, and Hastelloy X in all positions.

  16. Effects of processing techniques on the radioactive contamination of food

    International Nuclear Information System (INIS)

    Bovard, P.; Delmas, J.; Grauby, A.

    Following contamination of cultures of rice, grapes and various vegetables by 90Sr and 137Cs, the effect of processing and cooking techniques on the contamination of the foodstuffs was investigated [fr]

  17. PAUL AND SOPHISTIC RHETORIC: A PERSPECTIVE ON HIS ...

    African Journals Online (AJOL)

    use of modern rhetorical theories but analyses the letter in terms of the clas- ..... If a critical reader would have had the traditional anti-sophistic arsenal ..... pressions and that 'rhetoric' is mainly a matter of communicating these thoughts.

  18. Sophistication and Performance of Italian Agri‐food Exports

    Directory of Open Access Journals (Sweden)

    Anna Carbone

    2012-06-01

    Nonprice competition is increasingly important in world food markets. Recently, the expression 'export sophistication' has been introduced in the economic literature to refer to a wide set of attributes that increase product value. An index has been proposed to measure sophistication in an indirect way through the per capita GDP of exporting countries (Lall et al., 2006; Hausmann et al., 2007). The paper applies the sophistication measure to the Italian food export sector, starting from an analysis of trends and performance of Italian food exports. An original way to disentangle different components in the temporal variation of the sophistication index is also proposed. Results show that the sophistication index offers original insights on recent trends in world food exports and with respect to Italian core food exports.
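    The index can be sketched in a few lines: PRODY is the export-share-weighted average GDP per capita of a product's exporters, and a country's export sophistication (EXPY) is the share-weighted average PRODY of its export basket. The two-country, two-product numbers below are invented for illustration.

```python
import numpy as np

# Export values (rows: countries A, B; columns: products 1, 2) and GDP per
# capita -- all numbers invented for illustration.
exports = np.array([[8.0, 0.0],    # country A exports only product 1
                    [0.0, 5.0]])   # country B exports only product 2
gdp_pc = np.array([10_000.0, 40_000.0])

shares = exports / exports.sum(axis=1, keepdims=True)  # s_ck: product share
prody = (shares / shares.sum(axis=0)).T @ gdp_pc       # sophistication per product
expy = shares @ prody                                  # sophistication per country
```

    With fully specialized exporters each product inherits its exporter's GDP per capita as its PRODY, and each country's EXPY equals the PRODY of its sole export.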

  19. Obfuscation, Learning, and the Evolution of Investor Sophistication

    OpenAIRE

    Bruce Ian Carlin; Gustavo Manso

    2011-01-01

    Investor sophistication has lagged behind the growing complexity of retail financial markets. To explore this, we develop a dynamic model to study the interaction between obfuscation and investor sophistication in mutual fund markets. Taking into account different learning mechanisms within the investor population, we characterize the optimal timing of obfuscation for financial institutions who offer retail products. We show that educational initiatives that are directed to facilitate learnin...

  20. Evaluation of stabilization techniques for ion implant processing

    Science.gov (United States)

    Ross, Matthew F.; Wong, Selmer S.; Minter, Jason P.; Marlowe, Trey; Narcy, Mark E.; Livesay, William R.

    1999-06-01

    With the integration of high current ion implant processing into volume CMOS manufacturing, the need for photoresist stabilization to achieve a stable ion implant process is critical. This study compares electron beam stabilization, a non-thermal process, with more traditional thermal stabilization techniques such as hot plate baking and vacuum oven processing. The electron beam processing is carried out in a flood exposure system with no active heating of the wafer. These stabilization techniques are applied to typical ion implant processes that might be found in a CMOS production process flow. The stabilization processes are applied to a 1.1 micrometers thick PFI-38A i-line photoresist film prior to ion implant processing. Post stabilization CD variation is detailed with respect to wall slope and feature integrity. SEM photographs detail the effects of the stabilization technique on photoresist features. The thermal stability of the photoresist is shown for different levels of stabilization and post stabilization thermal cycling. Thermal flow stability of the photoresist is detailed via SEM photographs. A significant improvement in thermal stability is achieved with the electron beam process, such that photoresist features are stable to temperatures in excess of 200 degrees C. Ion implant processing parameters are evaluated and compared for the different stabilization methods. Ion implant system end-station chamber pressure is detailed as a function of ion implant process and stabilization condition. The ion implant process conditions are detailed for varying factors such as ion current, energy, and total dose. A reduction in the ion implant systems end-station chamber pressure is achieved with the electron beam stabilization process over the other techniques considered. This reduction in end-station chamber pressure is shown to provide a reduction in total process time for a given ion implant dose. Improvements in the ion implant process are detailed across

  1. Digital processing optical transmission and coherent receiving techniques

    CERN Document Server

    Binh, Le Nguyen

    2013-01-01

    With coherent mixing in the optical domain and processing in the digital domain, advanced receiving techniques employing ultra-high speed sampling rates have progressed tremendously over the last few years. These advances have brought coherent reception systems for lightwave-carried information to the next stage, resulting in ultra-high capacity global internetworking. Digital Processing: Optical Transmission and Coherent Receiving Techniques describes modern coherent receiving techniques for optical transmission and aspects of modern digital optical communications in the most basic lines. The

  2. Roman sophisticated surface modification methods to manufacture silver counterfeited coins

    Science.gov (United States)

    Ingo, G. M.; Riccucci, C.; Faraldi, F.; Pascucci, M.; Messina, E.; Fierro, G.; Di Carlo, G.

    2017-11-01

    By means of the combined use of X-ray photoelectron spectroscopy (XPS), optical microscopy (OM) and scanning electron microscopy (SEM) coupled with energy dispersive X-ray spectroscopy (EDS), the surface and subsurface chemical and metallurgical features of silver counterfeited Roman Republican coins are investigated to decipher some aspects of the manufacturing methods and to evaluate the technological ability of the Roman metallurgists to produce thin silver coatings. The results demonstrate that over 2000 years ago important advances in the technology of thin-layer deposition on metal substrates were attained by the Romans. The ancient metallurgists produced counterfeited coins by combining sophisticated micro-plating methods and tailored surface chemical modification based on the mercury-silvering process. The results reveal that the Romans were able to systematically manipulate alloys, chemically and metallurgically, at the micro scale to produce adherent precious-metal layers with a uniform thickness of up to a few micrometers. The results converge to reveal that the production of forgeries was aimed primarily at saving expensive metal as far as possible, allowing profitable large-scale production at a lower cost. The driving forces could have been a lack of precious metals, an unexpected need to circulate coins for trade, and/or a combination of social, political and economic factors that required a change in the money supply. Finally, some information on corrosion products has been obtained that is useful for selecting materials and methods for the conservation of these important witnesses of technology and economy.

  3. Automated synthesis of image processing procedures using AI planning techniques

    Science.gov (United States)

    Chien, Steve; Mortensen, Helen

    1994-01-01

    This paper describes the Multimission VICAR (Video Image Communication and Retrieval) Planner (MVP) (Chien 1994) system, which uses artificial intelligence planning techniques (Iwasaki & Friedland, 1985, Pemberthy & Weld, 1992, Stefik, 1981) to automatically construct executable complex image processing procedures (using models of the smaller constituent image processing subprograms) in response to image processing requests made to the JPL Multimission Image Processing Laboratory (MIPL). The MVP system allows the user to specify the image processing requirements in terms of the various types of correction required. Given this information, MVP derives unspecified required processing steps and determines appropriate image processing programs and parameters to achieve the specified image processing goals. This information is output as an executable image processing program which can then be executed to fill the processing request.

  4. Hazardous Waste Landfill Siting using GIS Technique and Analytical Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Ozeair Abessi

    2010-07-01

    Disposal of the large amount of hazardous waste generated in power plants has always received communities' and authorities' attention. In this paper, using a site screening method and the Analytical Hierarchy Process (AHP), a sophisticated approach for siting hazardous waste landfills in large areas is presented. This approach demonstrates how evaluation criteria such as physical, socio-economical, technical and environmental factors, and their regulatory sub-criteria, can be introduced into an overlay technique to screen a limited number of appropriate zones in the area. Then, in order to find the optimal site among the primarily screened sites, a Multiple Criteria Decision Making (MCDM) method is recommended for the hierarchy computations of the process. Using the introduced method, an accurate siting procedure for environmental planning of landfills in an area is enabled. In this study the approach was utilized for the disposal of hazardous wastes of the Shahid Rajaee thermal power plant, located in Qazvin province in the west central part of Iran. As a result of this study, 10 suitable zones were first screened in the area; then, using the analytical hierarchy process, a site near the power plant was chosen as the optimal site for landfilling of hazardous wastes in Qazvin province.
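    The hierarchy computations of AHP reduce to extracting priority weights from a pairwise comparison matrix via its principal eigenvector and checking Saaty's consistency ratio. The criteria matrix below is a hypothetical example, not the study's actual judgments.

```python
import numpy as np

RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}  # Saaty's random indices

def ahp_weights(pairwise):
    """Priority weights via the principal eigenvector, plus the consistency
    ratio CR = ((lambda_max - n) / (n - 1)) / RI[n]."""
    n = pairwise.shape[0]
    vals, vecs = np.linalg.eig(pairwise)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w = w / w.sum()
    ci = (vals[k].real - n) / (n - 1)
    cr = ci / RI[n] if RI[n] > 0 else 0.0
    return w, cr

# Hypothetical pairwise judgments over three siting criteria
# (perfectly consistent, so CR should be ~0; CR < 0.1 is the usual threshold).
A = np.array([[1.0,  2.0, 4.0],
              [0.5,  1.0, 2.0],
              [0.25, 0.5, 1.0]])
weights, cr = ahp_weights(A)
```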

  5. A Document Imaging Technique for Implementing Electronic Loan Approval Process

    Directory of Open Access Journals (Sweden)

    J. Manikandan

    2015-04-01

    Image processing is one of the leading technologies of computer applications. Image processing is a type of signal processing: the input to an image processor is an image or video frame, and the output is an image or a subset of an image [1]. Computer graphics and computer vision processes use image processing techniques. Image processing systems are used in various environments such as medicine, computer-aided design (CAD), research, crime investigation and military fields. In this paper, we propose a document image processing technique for establishing an electronic loan approval process (E-LAP) [2]. The loan approval process has been a tedious process; the E-LAP system attempts to reduce its complexity. Customers log in to fill in the loan application form online with all details and submit the form. The loan department then processes the submitted form and sends an acknowledgement mail via E-LAP to the requesting customer with the list of documents required for loan approval [3]. The customer can then upload scanned copies of all required documents. All interaction between customer and bank takes place using the E-LAP system.

  6. Library of sophisticated functions for analysis of nuclear spectra

    Science.gov (United States)

    Morháč, Miroslav; Matoušek, Vladislav

    2009-10-01

    In this paper we present a compact library for the analysis of nuclear spectra. The library consists of sophisticated functions for background elimination, smoothing, peak searching, deconvolution, and peak fitting. The functions can process one- and two-dimensional spectra. The software described in the paper comprises a number of conventional as well as newly developed methods needed to analyze experimental data. Program summary Program title: SpecAnalysLib 1.1 Catalogue identifier: AEDZ_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEDZ_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 42 154 No. of bytes in distributed program, including test data, etc.: 2 379 437 Distribution format: tar.gz Programming language: C++ Computer: Pentium 3 PC 2.4 GHz or higher, Borland C++ Builder v. 6. A precompiled Windows version is included in the distribution package Operating system: Windows 32 bit versions RAM: 10 MB Word size: 32 bits Classification: 17.6 Nature of problem: The demand for advanced, highly effective experimental data analysis functions is enormous. The library package represents one approach to giving physicists the possibility of using advanced routines simply by calling them from their own programs. SpecAnalysLib is a collection of functions for the analysis of one- and two-parameter γ-ray spectra, but they can be used for other types of data as well. The library consists of sophisticated functions for background elimination, smoothing, peak searching, deconvolution, and peak fitting. Solution method: The algorithms of background estimation are based on the Sensitive Nonlinear Iterative Peak (SNIP) clipping algorithm. The smoothing algorithms are based on the convolution of the original data with several types of filters and algorithms based on discrete
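    The background-elimination step can be sketched with a simplified SNIP-style clipping loop; note the full algorithm also applies a log-log-sqrt transform before clipping, which this sketch omits, and the flat-plus-peak spectrum is a toy example.

```python
import numpy as np

def snip_background(spectrum, max_window):
    """SNIP-style background estimate: iteratively clip each channel to the
    average of its neighbours at growing distances p = 1..max_window."""
    b = np.asarray(spectrum, dtype=float).copy()
    n = len(b)
    for p in range(1, max_window + 1):
        clipped = b.copy()
        clipped[p:n - p] = np.minimum(b[p:n - p],
                                      (b[:n - 2 * p] + b[2 * p:]) / 2.0)
        b = clipped
    return b

# Flat 10-count background with one narrow peak at channel 25.
spec = np.full(50, 10.0)
spec[25] += 100.0
background = snip_background(spec, max_window=8)
net = spec - background        # background-subtracted peak
```

    Because the clipping window grows wider than the peak, the peak is flattened out of the background estimate while the flat continuum is preserved exactly.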

  7. Processing techniques for data from the GKSS pressure suppression experiments

    International Nuclear Information System (INIS)

    Holman, G.S.; McCauley, E.W.

    1980-01-01

    This report describes techniques developed at LLNL for processing data from large-scale steam condensation experiments being performed by the GKSS Research Center in the Federal Republic of Germany. In particular, the computer code GKPLOT, a special evaluation program for generating time-history plots and numerical output files of GKSS data, is discussed together with tape-handling techniques to unblock the data to a form compatible with the LLNL Octopus computer network. Using these data processing techniques, we have provided a convenient means of independently examining and analyzing a very extensive data base for steam condensation phenomena. In addition, the techniques developed for handling the GKSS data are applicable to the treatment of similar, but perhaps differently structured, experimental data sets.

  8. Signal processing techniques for sodium boiling noise detection

    International Nuclear Information System (INIS)

    1989-05-01

    At the Specialists' Meeting on Sodium Boiling Detection organized by the International Working Group on Fast Reactors (IWGFR) of the International Atomic Energy Agency at Chester in the United Kingdom in 1981 various methods of detecting sodium boiling were reported. But, it was not possible to make a comparative assessment of these methods because the signal condition in each experiment was different from others. That is why participants of this meeting recommended that a benchmark test should be carried out in order to evaluate and compare signal processing methods for boiling detection. Organization of the Co-ordinated Research Programme (CRP) on signal processing techniques for sodium boiling noise detection was also recommended at the 16th meeting of the IWGFR. The CRP on Signal Processing Techniques for Sodium Boiling Noise Detection was set up in 1984. Eight laboratories from six countries have agreed to participate in this CRP. The overall objective of the programme was the development of reliable on-line signal processing techniques which could be used for the detection of sodium boiling in an LMFBR core. During the first stage of the programme a number of existing processing techniques used by different countries have been compared and evaluated. In the course of further work, an algorithm for implementation of this sodium boiling detection system in the nuclear reactor will be developed. It was also considered that the acoustic signal processing techniques developed for boiling detection could well make a useful contribution to other acoustic applications in the reactor. This publication consists of two parts. Part I is the final report of the co-ordinated research programme on signal processing techniques for sodium boiling noise detection. Part II contains two introductory papers and 20 papers presented at four research co-ordination meetings since 1985. A separate abstract was prepared for each of these 22 papers. Refs, figs and tabs

  9. Risk-assessment techniques and the reactor licensing process

    International Nuclear Information System (INIS)

    Levine, S.

    1979-01-01

    A brief description of the Reactor Safety Study (WASH-1400), concentrating on the engineering aspects of the contributions to reactor accident risks, is followed by some comments on how we have applied the insights and techniques developed in this study to prepare a program to improve the safety of nuclear power plants. Some new work we are just beginning on the application of risk-assessment techniques to stabilize the reactor licensing process is also discussed

  10. Classification of alarm processing techniques and human performance issues

    International Nuclear Information System (INIS)

    Kim, I.S.; O'Hara, J.M.

    1993-01-01

    Human factors reviews indicate that conventional alarm systems, based on the 'one sensor, one alarm' approach, have many human engineering deficiencies; a paramount example is too many alarms during major disturbances. In an effort to resolve these deficiencies, various alarm processing systems have been developed using different techniques. To ensure their contribution to operational safety, the impacts of those systems on operating crew performance should be carefully evaluated. This paper briefly reviews some of the human factors research issues associated with alarm processing techniques and then discusses a framework with which to classify the techniques. The dimensions of this framework can be used to explore the effects of alarm processing systems on human performance

  11. Classification of alarm processing techniques and human performance issues

    Energy Technology Data Exchange (ETDEWEB)

    Kim, I.S.; O'Hara, J.M.

    1993-01-01

    Human factors reviews indicate that conventional alarm systems, based on the 'one sensor, one alarm' approach, have many human engineering deficiencies; a paramount example is too many alarms during major disturbances. In an effort to resolve these deficiencies, various alarm processing systems have been developed using different techniques. To ensure their contribution to operational safety, the impacts of those systems on operating crew performance should be carefully evaluated. This paper briefly reviews some of the human factors research issues associated with alarm processing techniques and then discusses a framework with which to classify the techniques. The dimensions of this framework can be used to explore the effects of alarm processing systems on human performance.

  12. Classification of alarm processing techniques and human performance issues

    Energy Technology Data Exchange (ETDEWEB)

    Kim, I.S.; O'Hara, J.M.

    1993-05-01

    Human factors reviews indicate that conventional alarm systems, based on the 'one sensor, one alarm' approach, have many human engineering deficiencies; a paramount example is too many alarms during major disturbances. In an effort to resolve these deficiencies, various alarm processing systems have been developed using different techniques. To ensure their contribution to operational safety, the impacts of those systems on operating crew performance should be carefully evaluated. This paper briefly reviews some of the human factors research issues associated with alarm processing techniques and then discusses a framework with which to classify the techniques. The dimensions of this framework can be used to explore the effects of alarm processing systems on human performance.

  13. Financial Literacy and Financial Sophistication in the Older Population

    Science.gov (United States)

    Lusardi, Annamaria; Mitchell, Olivia S.; Curto, Vilsa

    2017-01-01

    Using a special-purpose module implemented in the Health and Retirement Study, we evaluate financial sophistication in the American population over the age of 50. We combine several financial literacy questions into an overall index to highlight which questions best capture financial sophistication and examine the sensitivity of financial literacy responses to framing effects. Results show that many older respondents are not financially sophisticated: they fail to grasp essential aspects of risk diversification, asset valuation, portfolio choice, and investment fees. Subgroups with notable deficits include women, the least educated, non-Whites, and those over age 75. In view of the fact that retirees increasingly must take on responsibility for their own retirement security, such meager levels of knowledge have potentially serious and negative implications. PMID:28553191

  14. The conceptualization and measurement of cognitive health sophistication.

    Science.gov (United States)

    Bodie, Graham D; Collins, William B; Jensen, Jakob D; Davis, Lashara A; Guntzviller, Lisa M; King, Andy J

    2013-01-01

    This article develops a conceptualization and measure of cognitive health sophistication--the complexity of an individual's conceptual knowledge about health. Study 1 provides initial validity evidence for the measure--the Healthy-Unhealthy Other Instrument--by showing its association with other cognitive health constructs indicative of higher health sophistication. Study 2 presents data from a sample of low-income adults to provide evidence that the measure does not depend heavily on health-related vocabulary or ethnicity. Results from both studies suggest that the Healthy-Unhealthy Other Instrument can be used to capture variability in the sophistication or complexity of an individual's health-related schematic structures on the basis of responses to two simple open-ended questions. Methodological advantages of the Healthy-Unhealthy Other Instrument and suggestions for future research are highlighted in the discussion.

  15. Financial Literacy and Financial Sophistication in the Older Population.

    Science.gov (United States)

    Lusardi, Annamaria; Mitchell, Olivia S; Curto, Vilsa

    2014-10-01

    Using a special-purpose module implemented in the Health and Retirement Study, we evaluate financial sophistication in the American population over the age of 50. We combine several financial literacy questions into an overall index to highlight which questions best capture financial sophistication and examine the sensitivity of financial literacy responses to framing effects. Results show that many older respondents are not financially sophisticated: they fail to grasp essential aspects of risk diversification, asset valuation, portfolio choice, and investment fees. Subgroups with notable deficits include women, the least educated, non-Whites, and those over age 75. In view of the fact that retirees increasingly must take on responsibility for their own retirement security, such meager levels of knowledge have potentially serious and negative implications.

  16. Future trends in power plant process computer techniques

    International Nuclear Information System (INIS)

    Dettloff, K.

    1975-01-01

    The development of new process computer concepts has advanced in great strides, in three areas: hardware, software, and the application concept. In hardware, new computers with new peripherals, e.g. colour layer equipment, have been developed. In software, a decisive step has been made in the area of 'automation software'. Through these components, a step forward has also been made on the question of incorporating the process computer into the structure of the overall power plant control system. (orig./LH) [de

  17. Development of process diagnostic techniques for piping and equipment

    International Nuclear Information System (INIS)

    Yotsutsuji, Mitoshi

    1987-01-01

    What is required in order to use the facilities of a plant over a long period with confidence is to quantitatively grasp their present condition and to take the necessary measures beforehand. For this purpose, diagnostic techniques that quickly and accurately detect the condition of facilities are necessary, and the development of process diagnostic techniques has been desired. The process diagnostic techniques discussed here are those for diagnosing the contamination, clogging and performance of towers, tanks, heat exchangers and other equipment. In 1982, Idemitsu Engineering Co. developed a simplified diagnostic instrument for detecting the state of fouling in piping: a gamma-ray transmission device named the Scale Checker. By further improving it, process diagnostic techniques for piping and equipment were developed. This report describes the course of development and examination, the principle of detection, the constitution, and the remodelling of the Scale Checker. As cases of process diagnosis in plant facilities, the diagnosis of clogging in process piping and the diagnosis of the performance of a distillation tower were carried out; the contents and results of these cases are explained. (Kako, I.)
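
    The gamma-ray transmission principle behind an instrument like the Scale Checker reduces to Beer-Lambert attenuation: counts through a fouled pipe fall off exponentially with deposit thickness. A minimal sketch (the attenuation coefficient and count values below are invented illustration values, not the instrument's calibration):

```python
import math

# Gamma-ray transmission gauging: I = I0 * exp(-mu * x), so the deposit
# thickness x follows from the ratio of clean-pipe to fouled-pipe counts.
# mu and the counts are invented illustration values.
mu_scale = 0.25      # effective attenuation coefficient of the deposit, 1/cm
I0 = 12000.0         # counts through the clean (baseline) pipe section
I = 10450.0          # counts through the fouled pipe section

x = math.log(I0 / I) / mu_scale   # apparent scale thickness, cm
print(f"estimated deposit thickness = {x:.2f} cm")
```

    In practice the baseline counts and the effective attenuation coefficient would come from calibration on a known-clean section of the same piping.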

  18. Alarm processing system using AI techniques for nuclear power plant

    International Nuclear Information System (INIS)

    Yang, Joon On; Chang, Soon Heung

    1990-01-01

    An alarm processing system (APS) has been developed using artificial intelligence (AI) techniques. The alarms of nuclear power plants (NPPs) are classified into generalized and special alarms; generalized alarms are further classified into global and local alarms. For each type of alarm, specific processing rules are applied to filter and suppress unnecessary and potentially misleading alarms. Local processing is based on 'model-based reasoning'. Global and special alarms are processed using general cause-consequence check rules. The priorities of alarms are determined according to the plant state and the consistencies between them
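
    The cause-consequence suppression and plant-state-dependent prioritization described above can be sketched as a small rule engine. Everything below (alarm names, plant states, rules, priorities) is a hypothetical illustration, not the actual APS rule base:

```python
# Hypothetical alarm set: each alarm has an id and a type
ALARMS = [
    {"id": "RCS_LOW_FLOW",  "type": "global"},
    {"id": "PZR_LOW_PRESS", "type": "global"},
    {"id": "PUMP1_TRIP",    "type": "local"},
    {"id": "TEST_LAMP",     "type": "special"},
]

# cause-consequence rule: a pump trip explains (and so suppresses)
# the low-flow alarm it necessarily causes
CAUSE_CONSEQUENCE = {"PUMP1_TRIP": ["RCS_LOW_FLOW"]}

def process(alarms, plant_state="power_operation"):
    """Suppress explained consequences, then prioritize what remains."""
    suppressed = set()
    for alarm in alarms:
        for consequence in CAUSE_CONSEQUENCE.get(alarm["id"], []):
            suppressed.add(consequence)
    prioritized = []
    for alarm in alarms:
        if alarm["id"] in suppressed:
            continue  # filtered out: already explained by its cause
        # crude priority rule: global alarms first during power operation
        priority = 1 if alarm["type"] == "global" and \
            plant_state == "power_operation" else 2
        prioritized.append((priority, alarm["id"]))
    return sorted(prioritized)

print(process(ALARMS))
```

    A real APS would drive the suppression from a plant model rather than a static table, but the filter-then-prioritize flow is the same.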

  19. STUDY OF ELECTROPOLIMERIZATION PROCESSES OF PYRROLE BY CYCLIC VOLTAMMETRIC TECHNIQUE

    Directory of Open Access Journals (Sweden)

    Adhitasari Suratman

    2010-06-01

    Electropolymerization processes and the electrochemical properties of polypyrrole as an electroactive polymer have been studied by the cyclic voltammetric technique. Pyrrole was electropolymerized to form polypyrrole in a water-based solvent containing sodium perchlorate as supporting electrolyte at several pH values. The pH of the solutions was varied using Britton-Robinson buffer. The results showed that the oxidation potential limit of the electropolymerization of pyrrole was 1220 mV vs the Ag/AgCl reference electrode. The cyclic voltammetric response of the polypyrrole membrane prepared by electropolymerization of pyrrole at a scanning rate of 100 mV/s was stable, while electropolymerization of pyrrole carried out at various pH values showed that the best condition was in the pH range 2-6. Keywords: polypyrrole, electropolymer, voltammetric technique

  20. Experimental data processing techniques by a personal computer

    International Nuclear Information System (INIS)

    Matsuura, Kiyokata; Tsuda, Kenzo; Abe, Yoshihiko; Kojima, Tsuyoshi; Nishikawa, Akira; Shimura, Hitoshi; Hyodo, Hiromi; Yamagishi, Shigeru.

    1989-01-01

    A personal computer (16-bit, about 1 MB of memory) can be used at low cost for experimental data processing. This report surveys the important techniques for A/D and D/A conversion, display, storage and transfer of experimental data. Items to be considered in the software are also discussed. Practical software programmed in BASIC and Assembler is given as examples. We present some techniques to speed up processing in BASIC and show that a system combining BASIC and Assembler is useful in practical experiments. System performance, such as processing speed and flexibility in setting operating conditions, depends strongly on the programming language. We tested processing speed for some typical programming languages: BASIC (interpreter), C, FORTRAN and Assembler. For calculation, FORTRAN has the best performance, comparable to or better than Assembler even on a personal computer. (author)

  1. New developments in techniques for information processing in radionuclide imaging

    International Nuclear Information System (INIS)

    Di Paola, R.; Todd-Pokropek, A.E.; CEA, 91 - Orsay

    1981-01-01

    Processing of scintigraphic data has passed through different stages in the past fifteen years. After a 'euphoric' era, when large off-line computer facilities were used to process very low-quality rectilinear scan pictures, a much more critical period followed the introduction of on-line minicomputer systems to acquire, process and visualize scintillation camera data. A selection of some of the available techniques that could improve the extraction of information from routine scintigraphic examinations is presented; tomography has been excluded. As examples, the different techniques of (a) inhomogeneity correction of camera response and (b) respiratory motion correction are used to show one evolutionary process in the use of computer systems. Filtering has long been the major area of research in scintigraphic image processing, yet only very simple (usually smoothing) filters are widely distributed. Little use of more 'powerful' filters on clinical data has been made, and very few serious evaluations have been published. Nevertheless, the number of installed minicomputer and microprocessor systems is increasing rapidly, although in general they perform tasks other than filtering. The reasons for this (relative) failure are examined. Some 'new' techniques of image processing are presented. The compression of scintigraphic information is important because of the expected need, in the near future, to handle large numbers of static pictures, as in dynamic and tomographic studies. For dynamic information processing, the present methodology has been narrowed to those techniques that permit the entire 'data space' to be manipulated (as opposed to curve fitting after region-of-interest definition). 'Functional' imaging was the first step in this process; 'factor analysis' could be the next. The results obtained by various research laboratories are reviewed. (author)
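
    As a concrete instance of (a), inhomogeneity correction of camera response is commonly performed by dividing each study image by a normalized flood-field acquisition. A sketch on synthetic data (the sensitivity map, image size and count levels are invented):

```python
import numpy as np

# Synthetic detector: a uniform true activity seen through a spatially
# varying sensitivity (the camera inhomogeneity we want to remove).
rng = np.random.default_rng(3)
true_activity = np.full((64, 64), 100.0)
sensitivity = 1.0 + 0.1 * rng.normal(size=(64, 64))  # detector response map

flood = 1000.0 * sensitivity          # acquisition of a uniform flood source
study = true_activity * sensitivity   # patient acquisition, distorted the same way

# normalized correction matrix from the flood image
correction = flood.mean() / flood
corrected = study * correction        # sensitivity cancels out

print("CV before:", round(float(study.std() / study.mean()), 4))
print("CV after: ", round(float(corrected.std() / corrected.mean()), 4))
```

    On real data the flood acquisition is noisy, so it is usually smoothed before the division; the sketch omits that step.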

  2. Statistic techniques of process control for MTR type

    International Nuclear Information System (INIS)

    Oliveira, F.S.; Ferrufino, F.B.J.; Santos, G.R.T.; Lima, R.M.

    2002-01-01

    This work aims at introducing some improvements in the fabrication of MTR-type fuel plates by applying statistical process control techniques. The work was divided into four steps, whose data were analyzed: fabrication of U3O8 fuel plates; fabrication of U3Si2 fuel plates; rolling of small lots of fuel plates; and application of statistical tools and standard specifications to perform a comparative study of these processes. (author)
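
    A typical statistical process control tool for such fabrication data is the Shewhart X-bar chart: subgroup means are compared against control limits derived from the average within-subgroup range. A sketch on simulated plate-thickness measurements (the nominal value, subgroup scheme and data are invented, not the paper's):

```python
import numpy as np

# 20 subgroups of 5 simulated fuel-plate thickness measurements (mm)
rng = np.random.default_rng(0)
subgroups = rng.normal(loc=1.52, scale=0.01, size=(20, 5))

xbar = subgroups.mean(axis=1)            # subgroup means
rbar = np.ptp(subgroups, axis=1).mean()  # average subgroup range
A2 = 0.577                               # tabulated X-bar chart constant, n = 5

center = xbar.mean()
ucl = center + A2 * rbar                 # upper control limit
lcl = center - A2 * rbar                 # lower control limit

out_of_control = np.where((xbar > ucl) | (xbar < lcl))[0]
print(f"center = {center:.4f} mm, UCL = {ucl:.4f}, LCL = {lcl:.4f}")
print("out-of-control subgroups:", out_of_control.tolist())
```

    Points outside the limits signal an assignable cause in the rolling or pressing step; the same construction applies to any of the four process stages above.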

  3. Technical and economic benefits of nuclear techniques in ore processing

    International Nuclear Information System (INIS)

    1989-08-01

    This report is the outcome of an Advisory Group Meeting organized by the Agency and hosted by the Institute of Physics and Nuclear Techniques, the Academy of Mining and Metallurgy in Krakow, Poland. The purpose of the meeting was to assess the technical and economic benefits of applying nuclear techniques in the ore processing industry. Nucleonic control systems and nuclear on-line analytical techniques, as well as radioisotope tracer tests and their applications in metallic ore processing, coal production and cement fabrication, were discussed. This report contains a summary and the presentations on nuclear techniques for process control made at this meeting. Using a number of case histories as examples, it illustrates the technical and economic benefits obtainable from the installation of nuclear process control instrumentation. It is expected to be useful for everybody dealing with ore and coal production, but especially for administrative personnel and engineers who plan and implement national development programmes related to mineral resources. Refs, figs and tabs

  4. A Monte Carlo Sampling Technique for Multi-phonon Processes

    Energy Technology Data Exchange (ETDEWEB)

    Hoegberg, Thure

    1961-12-15

    A sampling technique for selecting scattering angle and energy gain in Monte Carlo calculations of neutron thermalization is described. It is supposed that the scattering is separated into processes involving different numbers of phonons. The number of phonons involved is first determined. Scattering angle and energy gain are then chosen by using special properties of the multi-phonon term.
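
    The two-stage sampling idea, first drawing the number of phonons involved, then the scattering angle and energy gain for that term, can be sketched as follows. The phonon-term weights and the per-term angle/energy distributions below are invented placeholders, not Hoegberg's actual kernels:

```python
import random

def sample_scattering(weights, phonon_energy=0.025, rng=random.Random(42)):
    """Stage 1: pick the number of phonons n from its term weights.
    Stage 2: draw scattering angle and energy gain for that n-phonon term."""
    total = sum(weights)
    u = rng.random() * total
    n, acc = 0, 0.0
    for i, w in enumerate(weights):       # inverse-CDF draw over terms
        acc += w
        if u <= acc:
            n = i
            break
    # placeholder per-term sampling: isotropic angle, energy gain
    # proportional to the number of phonons exchanged
    mu = 2.0 * rng.random() - 1.0         # cosine of scattering angle
    energy_gain = n * phonon_energy       # eV, schematic only
    return n, mu, energy_gain

weights = [0.5, 0.3, 0.15, 0.05]          # 0-, 1-, 2-, 3-phonon term weights
samples = [sample_scattering(weights) for _ in range(10000)]

frac_elastic = sum(1 for n, _, _ in samples if n == 0) / len(samples)
print("fraction of 0-phonon events:", round(frac_elastic, 3))
```

    In the actual method, the stage-2 distributions exploit the special analytic properties of each multi-phonon term rather than the uniform placeholders used here.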

  5. Harmonizing the Writing Process with Music Training Techniques

    Science.gov (United States)

    Riecken, Nancy

    2009-01-01

    Can music help students become better thinkers and writers? Over the past three years, the author has incorporated some basic music training techniques in her classrooms to help her teach the writing process to students who would otherwise click her off. The students have developed clearer thinking and organizational skills, and have increased…

  6. Finding the Fabulous Few: Why Your Program Needs Sophisticated Research.

    Science.gov (United States)

    Pfizenmaier, Emily

    1981-01-01

    Fund raising, it is argued, needs sophisticated prospect research. Professional prospect researchers play an important role in helping to identify prospective donors and also in helping to stimulate interest in gift giving. A sample of an individual work-up on a donor and a bibliography are provided. (MLW)

  7. Procles the Carthaginian: A North African Sophist in Pausanias’ Periegesis

    Directory of Open Access Journals (Sweden)

    Juan Pablo Sánchez Hernández

    2010-11-01

    Procles, cited by Pausanias (in the imperfect tense) for a display in Rome and for an opinion about Pyrrhus of Epirus, was probably not a historian of Hellenistic date but a contemporary sophist whom Pausanias encountered in person in Rome.

  8. SMEs and new ventures need business model sophistication

    DEFF Research Database (Denmark)

    Kesting, Peter; Günzel-Jensen, Franziska

    2015-01-01

    , and Spreadshirt, this article develops a framework that introduces five business model sophistication strategies: (1) uncover additional functions of your product, (2) identify strategic benefits for third parties, (3) take advantage of economies of scope, (4) utilize cross-selling opportunities, and (5) involve...

  9. Process mining techniques: an application to time management

    Science.gov (United States)

    Khowaja, Ali Raza

    2018-04-01

    People must ensure that all of their work is completed within a given time and to an acceptable quality. To realize the potential of process mining, one needs to understand all of these processes in detail. Personal information and communication has always been a prominent issue on the internet, but information and communication tools in everyday life also cover daily schedules, location analysis, environmental analysis and, more generally, social media applications. These systems make data available not only for data analysis through event logs, but also for process analysis combining environmental and location analysis. Process mining can exploit these real-life processes with the help of the event logs already available in such datasets, through user-censored or user-labelled data. These processes can be used to redesign a user's flow and to understand the processes in more detail. One way to increase the quality of the processes we go through in our daily lives is to look closely at each process and, after analysing it, make changes to obtain better results. Concretely, we applied process mining techniques to a dataset of seven subjects collected in Korea. The paper comments on the efficiency of the processes in the event logs with respect to time management.
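
    A first process mining step on such an event log is extracting how long each activity takes per case. A sketch on an invented daily-schedule log; the only structural assumption is the usual event-log triple of case id, activity and timestamp:

```python
from collections import defaultdict
from datetime import datetime

# Invented mini event log: (case id, activity, timestamp)
log = [
    ("day1", "wake_up", "2018-04-02 07:00"),
    ("day1", "commute", "2018-04-02 08:00"),
    ("day1", "work",    "2018-04-02 09:00"),
    ("day1", "sleep",   "2018-04-02 22:00"),
    ("day2", "wake_up", "2018-04-03 07:30"),
    ("day2", "commute", "2018-04-03 08:15"),
    ("day2", "work",    "2018-04-03 09:00"),
    ("day2", "sleep",   "2018-04-03 23:00"),
]

def activity_durations(log):
    """Average hours spent in each activity, taken as the gap until the
    next event of the same case (the last event of a case has no gap)."""
    fmt = "%Y-%m-%d %H:%M"
    by_case = defaultdict(list)
    for case, act, ts in log:
        by_case[case].append((datetime.strptime(ts, fmt), act))
    durations = defaultdict(list)
    for events in by_case.values():
        events.sort()
        for (t0, act), (t1, _) in zip(events, events[1:]):
            durations[act].append((t1 - t0).total_seconds() / 3600.0)
    return {act: sum(v) / len(v) for act, v in durations.items()}

print(activity_durations(log))
```

    Real studies would read an XES file and go on to discover the control-flow model; this duration table is the time-management view the abstract refers to.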

  10. Comparative study of resist stabilization techniques for metal etch processing

    Science.gov (United States)

    Becker, Gerry; Ross, Matthew F.; Wong, Selmer S.; Minter, Jason P.; Marlowe, Trey; Livesay, William R.

    1999-06-01

    This study investigates resist stabilization techniques as they are applied to a metal etch application. The techniques compared are conventional deep-UV/thermal stabilization, or UV bake, and electron beam stabilization. The electron beam tool used in this study, an ElectronCure system from AlliedSignal Inc., Electron Vision Group, utilizes a flood electron source and a non-thermal process. These stabilization techniques are compared with respect to a metal etch process. In this study, two types of resist are considered for stabilization and etch: a g/i-line resist, Shipley SPR-3012, and an advanced i-line resist, Shipley SPR 955-Cm. For each of these resists, the effects of stabilization on resist features are evaluated by post-stabilization SEM analysis. Etch selectivity in all cases is evaluated by using a timed metal etch and measuring resist remaining relative to total metal thickness etched. Etch selectivity is presented as a function of stabilization condition. Analyses of the effects of the type of stabilization on this method of selectivity measurement are also presented. SEM analysis was also performed on the features after a complete etch process and is detailed as a function of stabilization condition. Post-etch cleaning is also an important factor impacted by pre-etch resist stabilization. Results of post-etch cleaning are presented for both stabilization methods. SEM inspection is also detailed for the metal features after resist removal processing.
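
    The timed-etch selectivity measurement described, resist remaining relative to total metal thickness etched, is simple arithmetic once the pre- and post-etch thicknesses are known. A sketch with invented thicknesses (not values from the study):

```python
# Invented illustration values for a timed metal etch
initial_resist_nm = 900.0     # resist thickness before etch
resist_remaining_nm = 650.0   # resist thickness after the timed etch
metal_etched_nm = 500.0       # metal thickness removed in the same time

resist_consumed = initial_resist_nm - resist_remaining_nm
selectivity = metal_etched_nm / resist_consumed  # metal removed per resist lost
print(f"etch selectivity (metal:resist) = {selectivity:.1f}:1")
```

    A higher ratio means the stabilized resist survived the etch better, which is the figure of merit the two stabilization methods are compared on.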

  11. Development of food preservation and processing techniques by radiation

    Energy Technology Data Exchange (ETDEWEB)

    Byun, Myung Woo; Lee, Ju Woon; Kim, Dong Ho [KAERI, Taejon (Korea, Republic of); Yook, Hong Sun [Chungnam National Univ., Taejon (Korea, Republic of); Kim, Hak Soo [Sogang Univ., Seoul (Korea, Republic of); Lee, Cherl Ho; Park, Hyun Jin [Korea Univ., Seoul (Korea, Republic of); Kang, Il Jun [Hallym Univ., Chuncheon (Korea, Republic of); Kwon, Jung Ho [Kyungbook National Univ., Taegu (Korea, Republic of)

    2002-05-01

    To secure national food resources, we studied the development of energy-saving food processing and preservation technologies, the establishment of methods to improve national health and safety through the development of alternatives to chemical treatments, and the foundation of the production of hygienic food and public-health-related products by irradiation technology. Results at the current stage are as follows. Processing techniques for low-salt fermented fish using gamma irradiation were developed, and the superiority of irradiation to conventional food processing methods was established. A processing technique for value-added functional materials for the manufacture of food or public health products using combined RT/BT/NT technology was developed. The basic theories were established for technology to reduce toxic or undesirable compounds in food, such as allergens or carcinogens. Methods for exterminating quarantine organisms in herbs/spices were established, and the quality evaluation and detection conditions for quarantine treatment were set. From studies on a 'program of public understanding' based on the safety of gamma-irradiated food, information for public relations on enlarging consumer acceptance/implementation and the peaceful use of nuclear energy was secured. The results of this research project will contribute to the competitiveness of the domestic food industry and export market. They are also expected to improve public health by preventing food-borne diseases and to enhance the national economy and industry by increasing direct/indirect productivity.

  12. Development of food preservation and processing techniques by radiation

    International Nuclear Information System (INIS)

    Byun, Myung Woo; Lee, Ju Woon; Kim, Dong Ho; Yook, Hong Sun; Kim, Hak Soo; Lee, Cherl Ho; Park, Hyun Jin; Kang, Il Jun; Kwon, Jung Ho

    2002-05-01

    To secure national food resources, we studied the development of energy-saving food processing and preservation technologies, the establishment of methods to improve national health and safety through the development of alternatives to chemical treatments, and the foundation of the production of hygienic food and public-health-related products by irradiation technology. Results at the current stage are as follows. Processing techniques for low-salt fermented fish using gamma irradiation were developed, and the superiority of irradiation to conventional food processing methods was established. A processing technique for value-added functional materials for the manufacture of food or public health products using combined RT/BT/NT technology was developed. The basic theories were established for technology to reduce toxic or undesirable compounds in food, such as allergens or carcinogens. Methods for exterminating quarantine organisms in herbs/spices were established, and the quality evaluation and detection conditions for quarantine treatment were set. From studies on a 'program of public understanding' based on the safety of gamma-irradiated food, information for public relations on enlarging consumer acceptance/implementation and the peaceful use of nuclear energy was secured. The results of this research project will contribute to the competitiveness of the domestic food industry and export market. They are also expected to improve public health by preventing food-borne diseases and to enhance the national economy and industry by increasing direct/indirect productivity.


  14. Spectroscopic analysis technique for arc-welding process control

    Science.gov (United States)

    Mirapeix, Jesús; Cobo, Adolfo; Conde, Olga; Quintela, María Ángeles; López-Higuera, José-Miguel

    2005-09-01

    The spectroscopic analysis of the light emitted by thermal plasmas has found many applications, from chemical analysis to monitoring and control of industrial processes. In particular, it has been demonstrated that the analysis of the thermal plasma generated during arc or laser welding can supply information about the process and, thus, about the quality of the weld. In some critical applications (e.g. the aerospace sector), early, real-time detection of defects in the weld seam (oxidation, porosity, lack of penetration, ...) is highly desirable as it can reduce expensive non-destructive testing (NDT). Among other techniques, full spectroscopic analysis of the plasma emission is known to offer rich information about the process itself, but it is also very demanding in terms of real-time implementation. In this paper, we propose a technique for the analysis of the plasma emission spectrum that is able to detect, in real time, changes in the process parameters that could lead to the formation of defects in the weld seam. It is based on the estimation of the electronic temperature of the plasma through the analysis of the emission peaks from multiple atomic species. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, we employ the LPO (Linear Phase Operator) sub-pixel algorithm to accurately estimate the central wavelength of the peaks (allowing an automatic identification of each atomic species) and cubic-spline interpolation of the noisy data to obtain the intensity and width of the peaks. Experimental tests on TIG welding, using fiber-optic capture of light and a low-cost CCD-based spectrometer, show that some typical defects can be easily detected and identified with this technique, whose typical processing time for multiple peak analysis is less than 20 ms running on a conventional PC.
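
    Electron temperature estimation from the emission peaks of several lines of one species is classically done with a Boltzmann plot: the slope of ln(I·λ/(g·A)) versus upper-level energy E_k equals -1/(k_B·T_e). A self-checking sketch on synthetic line intensities; the four line constants are invented, and this is the generic textbook estimator, not necessarily the paper's exact implementation:

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

# invented line data: (wavelength nm, g_k, A_ki, upper-level energy E_k in eV)
lines = [(371.9, 11, 1.62, 3.33), (382.0, 9, 6.67, 4.10),
         (404.6, 9, 8.62, 4.55), (438.3, 11, 5.00, 4.31)]

def synth_intensity(lam, g, A, E, T):
    """Relative line intensity under local thermodynamic equilibrium."""
    return (g * A / lam) * math.exp(-E / (K_B * T))

T_true = 10000.0  # K, the temperature the fit should recover
x = [E for (_, _, _, E) in lines]
y = [math.log(synth_intensity(lam, g, A, E, T_true) * lam / (g * A))
     for (lam, g, A, E) in lines]

# least-squares slope of ln(I*lambda/(g*A)) against E_k
n = len(x)
xm, ym = sum(x) / n, sum(y) / n
slope_num = sum((xi - xm) * (yi - ym) for xi, yi in zip(x, y))
slope_den = sum((xi - xm) ** 2 for xi in x)
slope = slope_num / slope_den

T_est = -1.0 / (K_B * slope)  # slope = -1/(k_B * T_e)
print(f"estimated T_e = {T_est:.0f} K")
```

    With measured spectra the intensities carry noise, which is where the sub-pixel peak location and spline interpolation of the paper earn their keep before this fit is applied.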

  15. Fractional Processes and Fractional-Order Signal Processing Techniques and Applications

    CERN Document Server

    Sheng, Hu; Qiu, TianShuang

    2012-01-01

    Fractional processes are widely found in science, technology and engineering systems. In Fractional Processes and Fractional-order Signal Processing, some complex random signals, characterized by the presence of a heavy-tailed distribution or non-negligible dependence between distant observations (local and long memory), are introduced and examined from the ‘fractional’ perspective using simulation, fractional-order modeling and filtering and realization of fractional-order systems. These fractional-order signal processing (FOSP) techniques are based on fractional calculus, the fractional Fourier transform and fractional lower-order moments. Fractional Processes and Fractional-order Signal Processing: • presents fractional processes of fixed, variable and distributed order studied as the output of fractional-order differential systems; • introduces FOSP techniques and the fractional signals and fractional systems point of view; • details real-world-application examples of FOSP techniques to demonstr...

  16. Evaluation of alternative drying techniques for the earthworm flour processing

    Directory of Open Access Journals (Sweden)

    Laura Suárez Hernández

    2016-01-01

    Production of earthworm flour includes several steps, among which the most critical is the drying process, due to factors such as time and energy requirements. In addition, the information available about this process is quite limited. This work therefore evaluated four drying techniques likely to be implemented by earthworm farmers (lombricultores): sun drying, oven drying, tunnel drying and microwave-assisted drying. Drying kinetics were obtained for all techniques, and specific parameters were evaluated: drying tray material (stainless steel and ceramic) for sun drying; microwave power (30 %, 50 % and 80 %) and amount of material to be dried (72 and 100 g) for microwave-assisted drying; temperature (50, 65, 90 and 100 °C) for oven drying; and temperature (50 and 63 °C) and air speed (2.9 to 3.6 m/s) for tunnel drying. It was determined that the most efficient technique is the drying tunnel, because it combines heat transfer by conduction and convection and enables control of the operating parameters. Finally, nutritional analyses were performed on samples obtained by each drying technique. The crude protein content for sun drying, microwave-assisted drying, oven drying and tunnel drying was 66.36 %, 67.91 %, 60.35 % and 62.33 %, respectively, indicating that the drying method and operating parameters do not significantly affect the crude protein content.
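
    Drying kinetics like those measured here are often summarized with the thin-layer Lewis/Newton model, MR = exp(-k·t), where MR is the moisture ratio and k the drying constant. A fitting sketch on an invented moisture-ratio series (not data from the paper):

```python
import math

# invented drying-kinetics data: time (h) and moisture ratio MR
t = [0, 1, 2, 3, 4, 5, 6]
mr = [1.00, 0.74, 0.55, 0.41, 0.30, 0.22, 0.17]

# linearize the Lewis model: ln(MR) = -k * t, least squares through the origin
num = sum(ti * math.log(mi) for ti, mi in zip(t, mr))
den = sum(ti * ti for ti in t)
k = -num / den                   # drying constant, 1/h
half_life = math.log(2) / k      # time for the moisture ratio to halve
print(f"k = {k:.3f} 1/h, moisture half-life = {half_life:.2f} h")
```

    Comparing the fitted k across sun, oven, tunnel and microwave runs gives a single-number summary of which technique dries fastest under each parameter setting.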

  17. Detection of Glaucoma Using Image Processing Techniques: A Critique.

    Science.gov (United States)

    Kumar, B Naveen; Chauhan, R P; Dahiya, Nidhi

    2018-01-01

    The primary objective of this article is to present a summary of different types of image processing methods employed for the detection of glaucoma, a serious eye disease. Glaucoma affects the optic nerve in which retinal ganglion cells become dead, and this leads to loss of vision. The principal cause is the increase in intraocular pressure, which occurs in open-angle and angle-closure glaucoma, the two major types affecting the optic nerve. In the early stages of glaucoma, no perceptible symptoms appear. As the disease progresses, vision starts to become hazy, leading to blindness. Therefore, early detection of glaucoma is needed for prevention. Manual analysis of ophthalmic images is fairly time-consuming and accuracy depends on the expertise of the professionals. Automatic analysis of retinal images is an important tool. Automation aids in the detection, diagnosis, and prevention of risks associated with the disease. Fundus images obtained from a fundus camera have been used for the analysis. Requisite pre-processing techniques have been applied to the image and, depending upon the technique, various classifiers have been used to detect glaucoma. The techniques mentioned in the present review have certain advantages and disadvantages. Based on this study, one can determine which technique provides an optimum result.

  18. Recent Advances in Techniques for Hyperspectral Image Processing

    Science.gov (United States)

    Plaza, Antonio; Benediktsson, Jon Atli; Boardman, Joseph W.; Brazile, Jason; Bruzzone, Lorenzo; Camps-Valls, Gustavo; Chanussot, Jocelyn; Fauvel, Mathieu; Gamba, Paolo; Gualtieri, Anthony; hide

    2009-01-01

    Imaging spectroscopy, also known as hyperspectral imaging, has been transformed in less than 30 years from a sparse research tool into a commodity product available to a broad user community. Currently, there is a need for standardized data processing techniques able to take into account the special properties of hyperspectral data. In this paper, we provide a seminal view on recent advances in techniques for hyperspectral image processing. Our main focus is on the design of techniques able to deal with the high-dimensional nature of the data and to integrate the spatial and spectral information. Performance of the discussed techniques is evaluated in different analysis scenarios. To satisfy time-critical constraints in specific applications, we also develop efficient parallel implementations of some of the discussed algorithms. Combined, these parts provide an excellent snapshot of the state of the art in those areas, and offer a thoughtful perspective on future potentials and emerging challenges in the design of robust hyperspectral imaging algorithms.
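    One baseline technique in this family is the spectral angle mapper, which classifies each pixel by the angle between its spectrum and a set of reference spectra. A minimal sketch (illustrative only, not from the paper; the toy spectra are made up):

```python
import math

def spectral_angle(a, b):
    """Angle (radians) between two spectra; insensitive to illumination scale."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

def classify(pixel, references):
    """Return the label of the reference spectrum with the smallest angle."""
    return min(references, key=lambda label: spectral_angle(pixel, references[label]))

# Toy 4-band reference spectra (hypothetical).
refs = {"vegetation": [0.1, 0.2, 0.6, 0.8], "soil": [0.4, 0.45, 0.5, 0.5]}
label = classify([0.2, 0.4, 1.2, 1.6], refs)  # a brighter vegetation spectrum
```

    Because the measure is an angle, a pixel that is simply a brighter or darker copy of a reference still maps to that reference, which is why the method is popular as a first-pass classifier.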

  19. Using of Natural Language Processing Techniques in Suicide Research

    Directory of Open Access Journals (Sweden)

    Azam Orooji

    2017-09-01

    Full Text Available It is estimated that each year many people, most of whom are teenagers and young adults, die by suicide worldwide. Suicide receives special attention, with many countries developing national strategies for prevention. Since most medical information is available in text form, preventing the growing trend of suicide in communities requires analyzing various textual resources, such as patient records, information on the web, and questionnaires. For this purpose, this study systematically reviews recent studies related to the use of natural language processing techniques in the area of the health of people who have completed suicide or are at risk. After electronically searching the PubMed and ScienceDirect databases and review of the articles by two reviewers, 21 articles matched the inclusion criteria. This study revealed that, if a suitable data set is available, natural language processing techniques are well suited for various types of suicide-related research.

  20. Simulation of land mine detection processes using nuclear techniques

    International Nuclear Information System (INIS)

    Aziz, M.

    2005-01-01

    Computer models were designed to study the processes of land mine detection using nuclear techniques. Parameters that affect the detection were analyzed. Mines of different masses at different depths in the soil are considered using two types of sources, a 252Cf source and a 14 MeV neutron source. The capability to differentiate between mines and other objects such as concrete, iron, wood, aluminum, water and polyethylene was analyzed and studied.

  1. Comparison and Evaluation of Various Tritium Decontamination Techniques and Processes

    International Nuclear Information System (INIS)

    Gentile, C.A.; Langish, S.W.; Skinner, C.H.; Ciebiera, L.P.

    2004-01-01

    In support of fusion energy development, various techniques and processes have been developed over the past two decades for the removal and decontamination of tritium from a variety of items, surfaces, and components. Tritium decontamination, by chemical, physical, mechanical, or a combination of these methods, is driven by two underlying motivational forces. The first of these motivational forces is safety. Safety is paramount to the established culture associated with fusion energy. The second of these motivational forces is cost. In all aspects, less tritium contamination equals lower operational and disposal costs. This paper will discuss and evaluate the various processes employed for tritium removal and decontamination

  2. Comparison and Evaluation of Various Tritium Decontamination Techniques and Processes

    International Nuclear Information System (INIS)

    Gentile, C.A.; Langish, S.W.; Skinner, C.H.; Ciebiera, L.P.

    2005-01-01

    In support of fusion energy development, various techniques and processes have been developed over the past two decades for the removal and decontamination of tritium from a variety of items, surfaces, and components. Tritium decontamination, by chemical, physical, mechanical, or a combination of these methods, is driven by two underlying motivational forces. The first of these motivational forces is safety. Safety is paramount to the established culture associated with fusion energy. The second of these motivational forces is cost. In all aspects, less tritium contamination equals lower operational and disposal costs. This paper will discuss and evaluate the various processes employed for tritium removal and decontamination

  3. Development Strategies for Tourism Destinations: Tourism Sophistication vs. Resource Investments

    OpenAIRE

    Rainer Andergassen; Guido Candela

    2010-01-01

    This paper investigates the effectiveness of development strategies for tourism destinations. We argue that resource investments unambiguously increase tourism revenues and that increasing the degree of tourism sophistication, that is, increasing the variety of tourism-related goods and services, increases tourism activity and decreases the perceived quality of the destination's resource endowment, leading to an ambiguous effect on tourism revenues. We disentangle these two effects and charact...

  4. Sophisticated visualization algorithms for analysis of multidimensional experimental nuclear spectra

    International Nuclear Information System (INIS)

    Morhac, M.; Kliman, J.; Matousek, V.; Turzo, I.

    2004-01-01

    This paper describes graphical models for the visualization of 2-, 3- and 4-dimensional scalar data used in the nuclear data acquisition, processing and visualization system developed at the Institute of Physics, Slovak Academy of Sciences. It focuses on the presentation of nuclear spectra (histograms). However, it can also be successfully applied to the visualization of arrays of other data types. In the paper we present conventional as well as newly developed surface and volume rendering visualization techniques (Authors)

  5. Development of laser materials processing and laser metrology techniques

    International Nuclear Information System (INIS)

    Kim, Cheol Jung; Chung, Chin Man; Kim, Jeong Mook; Kim, Min Suk; Kim, Kwang Suk; Baik, Sung Hoon; Kim, Seong Ouk; Park, Seung Kyu

    1997-09-01

    The applications of remote laser materials processing and metrology have been investigated in the nuclear industry since the invention of the laser, because remote operation can reduce the risks to workers in hostile environments. The objective of this project is the development of laser materials processing and metrology techniques for repair and inspection to improve the safety of nuclear power plants. As to repair, we developed our own laser sleeve welding head and innovative optical laser weld monitoring techniques to control the sleeve welding process. Furthermore, we designed and fabricated an 800 W Nd:YAG laser system and a 150 W excimer laser system for high power laser materials processing in the nuclear industry, such as cladding and decontamination. As to inspection, we developed an ESPI and a laser triangulation 3-D profile measurement system for defect detection which can complement ECT and UT inspections. We also developed a scanning laser vibrometer for remote vibration measurement of large structures and tested its performance. (author). 58 refs., 16 tabs., 137 figs

  6. Fluid Structure Interaction Techniques For Extrusion And Mixing Processes

    Science.gov (United States)

    Valette, Rudy; Vergnes, Bruno; Coupez, Thierry

    2007-05-01

    This work focuses on the development of numerical techniques devoted to the simulation of mixing processes of complex fluids such as twin-screw extrusion or batch mixing. In mixing process simulation, the absence of symmetry of the moving boundaries (the screws or the rotors) implies that their rigid body motion has to be taken into account by a special treatment. We therefore use a mesh immersion technique (MIT), which consists in using a P1+/P1-based (MINI-element) mixed finite element method for solving the velocity-pressure problem and then solving the problem in the whole barrel cavity by imposing a rigid motion (rotation) on nodes located inside the so-called immersed domain, each sub-domain (screw, rotor) being represented by a surface CAD mesh (or its mathematical equation in simple cases). The independent meshes are immersed into a unique background computational mesh by computing the distance function to their boundaries. Intersections of meshes are accounted for, which allows computing a fill factor usable as in the VOF methodology. This technique, combined with the use of parallel computing, allows computing the time-dependent flow of generalized Newtonian fluids, including yield stress fluids, in a complex system such as a twin-screw extruder, including moving free surfaces, which are treated by a level-set and Hamilton-Jacobi method.
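    The distance-function step of such a mesh immersion technique can be illustrated with a toy example: nodes of a background grid are flagged by the signed distance to an immersed boundary (here an analytic circle standing in for a rotor cross-section), and a fill factor is computed in the VOF-style bookkeeping described above. This is purely a sketch under simplifying assumptions, not the authors' solver:

```python
import math

def signed_distance_circle(x, y, cx, cy, r):
    """Signed distance to a circular immersed boundary (< 0 inside)."""
    return math.hypot(x - cx, y - cy) - r

def fill_factor(n, cx, cy, r):
    """Fraction of n*n cell-centred nodes on [0,1]^2 inside the immersed domain."""
    inside = 0
    for i in range(n):
        for j in range(n):
            x, y = (i + 0.5) / n, (j + 0.5) / n
            if signed_distance_circle(x, y, cx, cy, r) < 0.0:
                inside += 1
    return inside / (n * n)

# A hypothetical rotor cross-section of radius 0.25 centred in the unit cavity:
phi = fill_factor(200, 0.5, 0.5, 0.25)  # should approach pi * 0.25**2
```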

  7. Removable partial denture alloys processed by laser-sintering technique.

    Science.gov (United States)

    Alageel, Omar; Abdallah, Mohamed-Nur; Alsheghri, Ammar; Song, Jun; Caron, Eric; Tamimi, Faleh

    2018-04-01

    Removable partial dentures (RPDs) are traditionally made using a casting technique. New additive manufacturing processes based on laser sintering have been developed for quick fabrication of RPD metal frameworks at low cost. The objective of this study was to characterize the mechanical, physical, and biocompatibility properties of RPD cobalt-chromium (Co-Cr) alloys produced by two laser-sintering systems and compare them to those prepared using traditional casting methods. The laser-sintered Co-Cr alloys were processed by the selective laser-sintering (SLS) method and the direct metal laser-sintering (DMLS) method using the Phenix system (L-1) and the EOS system (L-2), respectively. The L-1 and L-2 techniques were 8 and 3.5 times more precise than the casting (CC) technique. Both laser-sintered and cast alloys were biocompatible. In conclusion, laser-sintered alloys are more precise and present better mechanical and fatigue properties than cast alloys for RPDs. © 2017 Wiley Periodicals, Inc. J Biomed Mater Res Part B: Appl Biomater, 106B: 1174-1185, 2018.

  8. Recent developments in numerical simulation techniques of thermal recovery processes

    Energy Technology Data Exchange (ETDEWEB)

    Tamim, M. [Bangladesh University of Engineering and Technology, Bangladesh (Bangladesh); Abou-Kassem, J.H. [Chemical and Petroleum Engineering Department, UAE University, Al-Ain 17555 (United Arab Emirates); Farouq Ali, S.M. [University of Alberta, Alberta (Canada)

    2000-05-01

    Numerical simulation of thermal processes (steam flooding, steam stimulation, SAGD, in-situ combustion, electrical heating, etc.) is an integral part of a thermal project design. The general tendency in the last 10 years has been to use commercial simulators. During the last decade, only a few new models have been reported in the literature. More work has been done to modify and refine solutions to existing problems to improve the efficiency of simulators. The paper discusses some of the recent developments in simulation techniques of thermal processes such as grid refinement, grid orientation, effect of temperature on relative permeability, mathematical models, and solution methods. The various aspects of simulation discussed here promote better understanding of the problems encountered in the simulation of thermal processes and will be of value to both simulator users and developers.

  9. Applying advanced digital signal processing techniques in industrial radioisotopes applications

    International Nuclear Information System (INIS)

    Mahmoud, H.K.A.E.

    2012-01-01

    Radioisotopes can be used to obtain signals or images in order to recognize the information inside industrial systems. The main problems with using these techniques are the difficulty of identifying the obtained signals or images and the requirement of skilled experts for the interpretation of the output data of these applications. Currently, the interpretation of the output data from these applications is performed mainly manually, depending heavily on the skills and experience of trained operators. This process is time-consuming and the results typically suffer from inconsistency and errors. The objective of the thesis is to apply advanced digital signal processing techniques to improve the treatment and interpretation of the output data from different Industrial Radioisotopes Applications (IRA). This thesis focuses on two IRAs: the Residence Time Distribution (RTD) measurement and the defect inspection of welded pipes using a gamma source (gamma radiography). In the RTD measurement application, this thesis presents methods for signal pre-processing and modeling of the RTD signals. Simulation results have been presented for two case studies. The first case study is a laboratory experiment for measuring the RTD in a water flow rig. The second case study is an experiment for measuring the RTD in a phosphate production unit. The thesis proposes an approach for RTD signal identification in the presence of noise. In this approach, after signal processing, the Mel Frequency Cepstral Coefficients (MFCCs) and polynomial coefficients are extracted from the processed signal or from one of its transforms. The Discrete Wavelet Transform (DWT), Discrete Cosine Transform (DCT), and Discrete Sine Transform (DST) have been tested and compared for efficient feature extraction. Neural networks have been used for matching of the extracted features. Furthermore, the Power Density Spectrum (PDS) of the RTD signal has also been used instead of the discrete
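    The cepstral-feature step described above can be approximated in miniature: take the log-magnitude spectrum of an RTD-like signal and decorrelate it with a DCT, keeping the first few coefficients. The sketch below is a simplified cepstral extractor on a plain DFT, not the thesis pipeline; the signal lengths and coefficient counts are arbitrary:

```python
import cmath
import math

def log_spectrum(signal):
    """Log-magnitude DFT of a short signal (naive O(N^2) transform)."""
    n = len(signal)
    mags = []
    for k in range(n // 2):
        s = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        mags.append(math.log(abs(s) + 1e-12))
    return mags

def cepstral_features(signal, n_coeffs=4):
    """DCT-II of the log spectrum: compact, decorrelated features."""
    spec = log_spectrum(signal)
    k = len(spec)
    return [sum(spec[i] * math.cos(math.pi * m * (i + 0.5) / k) for i in range(k))
            for m in range(n_coeffs)]

# Two synthetic RTD-like decay curves with different mixing behaviour:
fast = [math.exp(-0.5 * t) for t in range(32)]
slow = [math.exp(-0.1 * t) for t in range(32)]
d_same = sum((a - b) ** 2 for a, b in zip(cepstral_features(fast), cepstral_features(fast)))
d_diff = sum((a - b) ** 2 for a, b in zip(cepstral_features(fast), cepstral_features(slow)))
```

    In a matching stage such as the neural network mentioned above, the feature distance would drive the identification decision; here it simply separates the two synthetic curves.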

  10. Monitoring of Lactic Fermentation Process by Ultrasonic Technique

    Science.gov (United States)

    Alouache, B.; Touat, A.; Boutkedjirt, T.; Bennamane, A.

    Non-destructive control using ultrasound techniques has become of great importance in the food industry. In this work, ultrasound has been used for quality control and monitoring of the fermentation stages of yogurt, which is a highly consumed product. In contrast to physico-chemical methods, where the measurement instruments are introduced directly into the sample, ultrasound techniques have the advantage of being non-destructive and contactless, thus reducing the risk of contamination. Results obtained in this study using ultrasound seem to be in good agreement with those obtained by physico-chemical methods such as acidity measurement using a pH-meter. This leads us to conclude that the ultrasound method may be an alternative for hygienic monitoring of the yoghurt fermentation process.

  11. Implications of a ''Noisy'' observer to data processing techniques

    International Nuclear Information System (INIS)

    Goodenough, D.J.; Metz, C.E.

    1975-01-01

    It is attempted to show how an internal noise source (dark light and threshold jitter) would tend to explain experimental data concerning the visual detection of noise-limited signals in diagnostic imaging. The interesting conclusion can be drawn that internal noise sets the upper limit to the utility of data processing techniques designed to reduce image noise. Moreover, there should be instances where contrast enhancement techniques may be far more useful to the human observer than corresponding reductions in noise amplitude, especially at high count rates (sigma/sub p/ less than or equal to sigma/sub D/). Then too, the limitations imposed on the human observer by an internal noise source may point towards the need for additional methods (e.g. computer/microdensitometer) of interpreting images of high photon density so that the highest possible signal to noise ratio might be obtained

  12. Liquid argon TPC signal formation, signal processing and reconstruction techniques

    Science.gov (United States)

    Baller, B.

    2017-07-01

    This document describes a reconstruction chain that was developed for the ArgoNeuT and MicroBooNE experiments at Fermilab. These experiments study accelerator neutrino interactions that occur in a Liquid Argon Time Projection Chamber. Reconstructing the properties of particles produced in these interactions benefits from knowledge of the micro-physics processes that affect the creation and transport of ionization electrons to the readout system. A wire signal deconvolution technique was developed to convert wire signals to a standard form for hit reconstruction, to remove artifacts in the electronics chain and to remove coherent noise. A unique clustering algorithm reconstructs line-like trajectories and vertices in two dimensions, which are then matched to create 3D objects. These techniques and algorithms are available to all experiments that use the LArSoft suite of software.
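    The wire-response deconvolution idea can be sketched with a tiny frequency-domain example: divide the measured waveform's spectrum by the response spectrum, with a small regularization term to keep near-zero bins from blowing up. This is a generic Wiener-style sketch, not the LArSoft implementation; the sizes and response shape are invented:

```python
import cmath
import math

def dft(x, sign=-1):
    """Naive O(N^2) discrete Fourier transform (sign=+1 for the inverse kernel)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(sign * 2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

def deconvolve(measured, response, lam=1e-9):
    """Regularized spectral division: X = Y * conj(R) / (|R|^2 + lam)."""
    y, r = dft(measured), dft(response)
    x = [yk * rk.conjugate() / (abs(rk) ** 2 + lam) for yk, rk in zip(y, r)]
    n = len(x)
    return [(v / n).real for v in dft(x, sign=+1)]  # inverse DFT, scaled

# Hypothetical ionization signal: two unit charge depositions.
true = [0.0] * 16
true[3] = true[9] = 1.0
resp = [1.0, 0.5, 0.25] + [0.0] * 13  # invented electronics response
# Circularly convolve the true signal with the response to fake a wire waveform.
wave = [sum(true[(t - s) % 16] * resp[s] for s in range(16)) for t in range(16)]
recovered = deconvolve(wave, resp)
```

    With circular convolution the recovery is essentially exact; on real data the regularization constant trades resolution against noise amplification.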

  13. Musical Sophistication and the Effect of Complexity on Auditory Discrimination in Finnish Speakers

    Science.gov (United States)

    Dawson, Caitlin; Aalto, Daniel; Šimko, Juraj; Vainio, Martti; Tervaniemi, Mari

    2017-01-01

    Musical experiences and native language are both known to affect auditory processing. The present work aims to disentangle the influences of native language phonology and musicality on behavioral and subcortical sound feature processing in a population of musically diverse Finnish speakers as well as to investigate the specificity of enhancement from musical training. Finnish speakers are highly sensitive to duration cues since in Finnish, vowel and consonant duration determine word meaning. Using a correlational approach with a set of behavioral sound feature discrimination tasks, brainstem recordings, and a musical sophistication questionnaire, we find no evidence for an association between musical sophistication and more precise duration processing in Finnish speakers either in the auditory brainstem response or in behavioral tasks, but they do show an enhanced pitch discrimination compared to Finnish speakers with less musical experience and show greater duration modulation in a complex task. These results are consistent with a ceiling effect set for certain sound features which corresponds to the phonology of the native language, leaving an opportunity for music experience-based enhancement of sound features not explicitly encoded in the language (such as pitch, which is not explicitly encoded in Finnish). Finally, the pattern of duration modulation in more musically sophisticated Finnish speakers suggests integrated feature processing for greater efficiency in a real world musical situation. These results have implications for research into the specificity of plasticity in the auditory system as well as to the effects of interaction of specific language features with musical experiences. PMID:28450829

  14. Musical Sophistication and the Effect of Complexity on Auditory Discrimination in Finnish Speakers.

    Science.gov (United States)

    Dawson, Caitlin; Aalto, Daniel; Šimko, Juraj; Vainio, Martti; Tervaniemi, Mari

    2017-01-01

    Musical experiences and native language are both known to affect auditory processing. The present work aims to disentangle the influences of native language phonology and musicality on behavioral and subcortical sound feature processing in a population of musically diverse Finnish speakers as well as to investigate the specificity of enhancement from musical training. Finnish speakers are highly sensitive to duration cues since in Finnish, vowel and consonant duration determine word meaning. Using a correlational approach with a set of behavioral sound feature discrimination tasks, brainstem recordings, and a musical sophistication questionnaire, we find no evidence for an association between musical sophistication and more precise duration processing in Finnish speakers either in the auditory brainstem response or in behavioral tasks, but they do show an enhanced pitch discrimination compared to Finnish speakers with less musical experience and show greater duration modulation in a complex task. These results are consistent with a ceiling effect set for certain sound features which corresponds to the phonology of the native language, leaving an opportunity for music experience-based enhancement of sound features not explicitly encoded in the language (such as pitch, which is not explicitly encoded in Finnish). Finally, the pattern of duration modulation in more musically sophisticated Finnish speakers suggests integrated feature processing for greater efficiency in a real world musical situation. These results have implications for research into the specificity of plasticity in the auditory system as well as to the effects of interaction of specific language features with musical experiences.

  15. Expert system and process optimization techniques for real-time monitoring and control of plasma processes

    Science.gov (United States)

    Cheng, Jie; Qian, Zhaogang; Irani, Keki B.; Etemad, Hossein; Elta, Michael E.

    1991-03-01

    To meet the ever-increasing demand of the rapidly growing semiconductor manufacturing industry, it is critical to have a comprehensive methodology integrating techniques for process optimization, real-time monitoring and adaptive process control. To this end we have accomplished an integrated knowledge-based approach combining the latest expert system technology, machine learning methods and traditional statistical process control (SPC) techniques. This knowledge-based approach is advantageous in that it makes it possible for the tasks of process optimization and adaptive control to be performed consistently and predictably. Furthermore, this approach can be used to construct high-level, qualitative descriptions of processes and thus make the process behavior easy to monitor, predict and control. Two software packages, RIST (Rule Induction and Statistical Testing) and KARSM (Knowledge Acquisition from Response Surface Methodology), have been developed and incorporated with two commercially available packages: G2 (a real-time expert system) and ULTRAMAX (a tool for sequential process optimization).

  16. COCONUT WATER VINEGAR: NEW ALTERNATIVE WITH IMPROVED PROCESSING TECHNIQUE

    Directory of Open Access Journals (Sweden)

    MUHAMMAD ANAS OTHAMAN

    2014-06-01

    Full Text Available Vinegar is a condiment made from various sugary and starchy materials by alcoholic and subsequent acetic fermentation. Vinegar can be produced via different methods and from various types of raw material. A new alternative substrate for vinegar production, namely mature coconut water, has been tested and was compared with two common substrates, coconut sap and pineapple juice. Substrates such as sap and juices have been found to have a high amount of total soluble solids, corresponding to a high sugar content of more than 14 °Brix. Therefore, both substrates could be used directly for vinegar production without the addition of other carbon sources. However, coconut water, which showed a low Brix value, needed to be adjusted to 14 °Brix by adding sucrose prior to the fermentation process. Substrates fermented with Saccharomyces cerevisiae yielded 7-8 % alcohol within 7-10 days of aerobic incubation at room temperature. The alcoholic medium was then used as a seed broth for acetic fermentation with Acetobacter aceti as inoculum and fermented for approximately 2 months to obtain at least 4 % acetic acid. The effect of inoculum size and the implementation of a back-slopping technique were investigated to improve the processing method for coconut water vinegar production. The results show that a 10 % inoculum size was the best for acetic acid fermentation and that the back-slopping technique helped to reduce the process time of coconut water vinegar production.

  17. Electrochemical Techniques in Textile Processes and Wastewater Treatment

    Directory of Open Access Journals (Sweden)

    Mireia Sala

    2012-01-01

    Full Text Available The textile industry uses electrochemical techniques both in textile processes (such as manufacturing fibers, dyeing processes, and decolorizing fabrics) and in wastewater treatment (color removal). Electrochemical reduction reactions are mostly used in sulfur and vat dyeing, but in some cases they are applied to effluent discoloration. However, the main applications of electrochemical treatments in the textile sector are based on oxidation reactions. Most electrochemical oxidation processes involve indirect reactions which imply the generation of hypochlorite or hydroxyl radicals in situ. These electrogenerated species are able to bleach indigo-dyed denim fabrics and to degrade dyes in wastewater in order to achieve effluent color removal. The aim of this paper is to review the electrochemical techniques applied in the textile industry. In particular, they are an efficient method to remove color from textile effluents. Reuse of the discolored effluent is possible, which implies an important saving of salt and water (i.e., by means of the “UVEC Cell”).

  18. Congestion estimation technique in the optical network unit registration process.

    Science.gov (United States)

    Kim, Geunyong; Yoo, Hark; Lee, Dongsoo; Kim, Youngsun; Lim, Hyuk

    2016-07-01

    We present a congestion estimation technique (CET) to estimate the optical network unit (ONU) registration success ratio for the ONU registration process in passive optical networks. An optical line terminal (OLT) estimates the number of collided ONUs via the proposed scheme during the serial number state. The OLT can obtain congestion level among ONUs to be registered such that this information may be exploited to change the size of a quiet window to decrease the collision probability. We verified the efficiency of the proposed method through simulation and experimental results.
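    The estimator's core idea can be sketched probabilistically: if n ONUs each pick a random delay in a quiet window of W slots, the expected number of collision-free responses is n(1 - 1/W)^(n-1), and the observed success count lets the OLT invert this to gauge congestion. The toy model below is a simplified slotted sketch, not the paper's exact scheme:

```python
def expected_successes(n, w):
    """Expected ONUs whose random slot (out of w) is chosen by nobody else."""
    return n * (1.0 - 1.0 / w) ** (n - 1)

def estimate_onus(observed_successes, w):
    """Invert the model over n = 1..w-1 to estimate the number of contenders."""
    best_n, best_err = 1, float("inf")
    for n in range(1, w):
        err = abs(expected_successes(n, w) - observed_successes)
        if err < best_err:
            best_n, best_err = n, err
    return best_n

# With a 100-slot quiet window and ~9.1 collision-free registrations observed,
# the model points to about 10 contending ONUs.
n_est = estimate_onus(9.135, 100)
```

    The OLT could then widen the quiet window when the estimate is high, lowering the collision probability on the next registration round.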

  19. The New Toxicology of Sophisticated Materials: Nanotoxicology and Beyond

    Science.gov (United States)

    Maynard, Andrew D.; Warheit, David B.; Philbert, Martin A.

    2011-01-01

    It has long been recognized that the physical form of materials can mediate their toxicity—the health impacts of asbestiform materials, industrial aerosols, and ambient particulate matter are prime examples. Yet over the past 20 years, toxicology research has suggested complex and previously unrecognized associations between material physicochemistry at the nanoscale and biological interactions. With the rapid rise of the field of nanotechnology and the design and production of increasingly complex nanoscale materials, it has become ever more important to understand how the physical form and chemical composition of these materials interact synergistically to determine toxicity. As a result, a new field of research has emerged—nanotoxicology. Research within this field is highlighting the importance of material physicochemical properties in how dose is understood, how materials are characterized in a manner that enables quantitative data interpretation and comparison, and how materials move within, interact with, and are transformed by biological systems. Yet many of the substances that are the focus of current nanotoxicology studies are relatively simple materials that are at the vanguard of a new era of complex materials. Over the next 50 years, there will be a need to understand the toxicology of increasingly sophisticated materials that exhibit novel, dynamic and multifaceted functionality. If the toxicology community is to meet the challenge of ensuring the safe use of this new generation of substances, it will need to move beyond “nano” toxicology and toward a new toxicology of sophisticated materials. Here, we present a brief overview of the current state of the science on the toxicology of nanoscale materials and focus on three emerging toxicology-based challenges presented by sophisticated materials that will become increasingly important over the next 50 years: identifying relevant materials for study, physicochemical characterization, and

  20. Strategic sophistication of individuals and teams. Experimental evidence

    Science.gov (United States)

    Sutter, Matthias; Czermak, Simon; Feri, Francesco

    2013-01-01

    Many important decisions require strategic sophistication. We examine experimentally whether teams act more strategically than individuals. We let individuals and teams make choices in simple games, and also elicit first- and second-order beliefs. We find that teams play the Nash equilibrium strategy significantly more often, and their choices are more often a best response to stated first order beliefs. Distributional preferences make equilibrium play less likely. Using a mixture model, the estimated probability to play strategically is 62% for teams, but only 40% for individuals. A model of noisy introspection reveals that teams differ from individuals in higher order beliefs. PMID:24926100

  1. Few remarks on chiral theories with sophisticated topology

    International Nuclear Information System (INIS)

    Golo, V.L.; Perelomov, A.M.

    1978-01-01

    Two classes of two-dimensional Euclidean chiral field theories are singled out: 1) the field phi(x) takes values in a compact Hermitian symmetric space; 2) the field phi(x) takes values in an orbit of the adjoint representation of a compact Lie group. The theories have sophisticated topological and rich analytical structures. They are considered with the help of topological invariants (topological charges). Explicit formulae for the topological charges are indicated, and the lower bound estimate for the action is given

  2. Some fuzzy techniques for staff selection process: A survey

    Science.gov (United States)

    Md Saad, R.; Ahmad, M. Z.; Abu, M. S.; Jusoh, M. S.

    2013-04-01

    With a high level of business competition, it is vital to have flexible staff who are able to adapt themselves to work circumstances. However, the staff selection process is not an easy task to solve, even in a simplified version containing only a single criterion and a homogeneous skill. When multiple criteria and various skills are involved, the problem becomes much more complicated. In addition, some information cannot be measured precisely. This is patently obvious when dealing with opinions, thoughts, feelings, beliefs, etc. One possible tool to handle this issue is fuzzy set theory. Therefore, the objective of this paper is to review the existing fuzzy techniques for solving the staff selection problem. It classifies several existing research methods and identifies areas where there is a gap and further research is needed. Finally, this paper concludes by suggesting new ideas for future research based on the gaps identified.
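As a rough illustration of the kind of fuzzy technique such surveys cover, candidates can be scored by mapping linguistic ratings to triangular fuzzy numbers. The rating scale, criterion weights, and centroid defuzzification below are assumptions made for this example, not a method taken from any particular reviewed paper.

```python
# Illustrative sketch: fuzzy multi-criteria scoring of job candidates.

def defuzzify(tfn):
    """Centroid of a triangular fuzzy number (a, b, c)."""
    a, b, c = tfn
    return (a + b + c) / 3.0

# Hypothetical linguistic ratings mapped to triangular fuzzy numbers on [0, 1].
RATINGS = {
    "poor":      (0.00, 0.00, 0.25),
    "fair":      (0.00, 0.25, 0.50),
    "good":      (0.25, 0.50, 0.75),
    "very_good": (0.50, 0.75, 1.00),
    "excellent": (0.75, 1.00, 1.00),
}

def candidate_score(ratings, weights):
    """Weighted average of defuzzified per-criterion ratings."""
    total = sum(weights.values())
    return sum(weights[c] * defuzzify(RATINGS[r]) for c, r in ratings.items()) / total

weights = {"experience": 0.5, "communication": 0.3, "adaptability": 0.2}
alice = candidate_score(
    {"experience": "good", "communication": "excellent", "adaptability": "fair"}, weights)
bob = candidate_score(
    {"experience": "very_good", "communication": "good", "adaptability": "good"}, weights)
print(alice, bob)   # ranking: bob > alice
```

Fuzzier variants keep the whole triangular number through the aggregation and only defuzzify (or rank fuzzy numbers directly) at the end.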

  3. Processing of combustible radioactive waste using incineration techniques

    International Nuclear Information System (INIS)

    Maestas, E.

    1981-01-01

    Among the OECD Nuclear Energy Agency Member countries numerous incineration concepts are being studied as potential methods for conditioning alpha-bearing and other types of combustible radioactive waste. The common objective of these different processes is volume reduction and the transformation of the waste to a more acceptable waste form. Because the combustion processes reduce the mass and volume of waste to a form which is generally more inert than the feed material, the resulting waste can be more uniformly compatible with safe handling, packaging, storage and/or disposal techniques. The number of different combustion processes designed and operating specifically for alpha-bearing wastes is somewhat small compared with those for non-alpha radioactive wastes; however, research and development is under way in a number of countries to develop and improve alpha incinerators. This paper provides an overview of most alpha-incineration concepts in operation or under development in OECD/NEA Member countries. The special features of each concept are briefly discussed. A table containing characteristic data of incinerators is presented so that a comparison of the major programmes can be made. The table includes the incinerator name and location, process type, capacity throughput, operational status and application. (author)

  4. Processing Techniques and Applications of Silk Hydrogels in Bioengineering

    Directory of Open Access Journals (Sweden)

    Michael Floren

    2016-09-01

    Hydrogels are an attractive class of tunable material platforms that, given their structural and functional likeness to biological environments, have a diversity of applications in bioengineering. Several polymers, natural and synthetic, can be used, the material selection being based on the required functional characteristics of the prepared hydrogels. Silk fibroin (SF) is an attractive natural polymer owing to its excellent processability, biocompatibility, controlled degradation, mechanical properties and tunable formats, and is a good candidate for the fabrication of hydrogels. Tremendous effort has been made to control the structural and functional characteristics of silk hydrogels, integrating novel biological features with advanced processing techniques, to develop the next generation of functional SF hydrogels. Here, we review the several processing methods developed to prepare advanced SF hydrogel formats, emphasizing a bottom-up approach beginning with critical structural characteristics of silk proteins and their behavior under specific gelation environments. Additionally, the preparation of SF hydrogel blends and other advanced formats is also discussed. We conclude with a brief description of the utility of SF hydrogels in relevant bioengineering applications.

  5. Measurement of pharyngeal sensory cortical processing: technique and physiologic implications

    Directory of Open Access Journals (Sweden)

    Ringelstein E Bernd

    2009-07-01

    Background: Dysphagia is a major complication of different diseases affecting both the central and peripheral nervous systems. Pharyngeal sensory impairment is one of the main features of neurogenic dysphagia. An objective technique to examine the cortical processing of pharyngeal sensory input would therefore be a helpful diagnostic tool in this context. We developed a simple paradigm to apply pneumatic stimulation to both sides of the pharyngeal wall. Whole-head MEG was employed to study changes in cortical activation during this pharyngeal stimulation in nine healthy subjects. Data were analyzed by means of synthetic aperture magnetometry (SAM), and the group analysis of individual SAM data was performed using a permutation test. Results: Our results revealed bilateral activation of the caudolateral primary somatosensory cortex following sensory pharyngeal stimulation, with a slight lateralization to the side of stimulation. Conclusion: The method introduced here is simple and easy to perform and might be applicable in the clinical setting. The results are in keeping with previous findings showing bihemispheric involvement in the complex task of sensory pharyngeal processing. They might also explain changes in deglutition after hemispheric strokes. The ipsilaterally lateralized processing is surprising and needs further investigation.

  6. A deconvolution technique for processing small intestinal transit data

    Energy Technology Data Exchange (ETDEWEB)

    Brinch, K. [Department of Clinical Physiology and Nuclear Medicine, Glostrup Hospital, University Hospital of Copenhagen (Denmark); Larsson, H.B.W. [Danish Research Center of Magnetic Resonance, Hvidovre Hospital, University Hospital of Copenhagen (Denmark); Madsen, J.L. [Department of Clinical Physiology and Nuclear Medicine, Hvidovre Hospital, University Hospital of Copenhagen (Denmark)

    1999-03-01

    The deconvolution technique can be used to compute small intestinal impulse response curves from scintigraphic data. Previously suggested approaches, however, are sensitive to noise in the data. We investigated whether deconvolution based on a new simple iterative convolving technique can be recommended. Eight healthy volunteers ingested a meal that contained indium-111 diethylene triamine penta-acetic acid labelled water and technetium-99m stannous colloid labelled omelette. Imaging was performed at 30-min intervals until all radioactivity was located in the colon. A Fermi function, F(t) = (1 + e^(-αβ))/(1 + e^((t-α)β)), was chosen to characterize the small intestinal impulse response function. By changing only two parameters, α and β, it is possible to obtain configurations from nearly a square function to nearly a monoexponential function. The small intestinal input function was obtained from the gastric emptying curve and convolved with the Fermi function. The sum of least squares was used to find the α and β yielding the best fit of the convolved curve to the observed small intestinal time-activity curve. Finally, a small intestinal mean transit time was calculated from the fitted Fermi function. In all cases, we found an excellent fit of the convolved curve to the observed small intestinal time-activity curve, that is, the Fermi function reflected the small intestinal impulse response curve. The small intestinal mean transit time of the liquid marker (median 2.02 h) was significantly shorter than that of the solid marker (median 2.99 h; P<0.02). The iterative convolving technique seems to be an attractive alternative to ordinary approaches for the processing of small intestinal transit data. (orig.) With 2 figs., 13 refs.
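The convolve-and-fit idea described above can be sketched as follows. The Fermi form follows the abstract, but the discrete grid search over α and β (and the synthetic input curve) are assumptions made for illustration, not the authors' implementation.

```python
# Sketch: fit a Fermi impulse response by convolving it with the input curve
# and minimizing the sum of squared residuals over a parameter grid.
import math

def fermi(t, alpha, beta):
    """Normalized Fermi function: equals 1 at t = 0 and falls off near t = alpha."""
    return (1 + math.exp(-alpha * beta)) / (1 + math.exp((t - alpha) * beta))

def convolve(rate, impulse, dt):
    """Discrete convolution of the input rate with the impulse response."""
    return [dt * sum(rate[j] * impulse[i - j] for j in range(i + 1))
            for i in range(len(rate))]

def fit(times, rate, observed, dt):
    """Grid search for (alpha, beta) minimizing the sum of squared residuals."""
    best = None
    for alpha in (0.25 * a for a in range(1, 25)):      # 0.25 .. 6.0 h
        for beta in (0.5 * b for b in range(1, 21)):    # 0.5 .. 10.0 per h
            model = convolve(rate, [fermi(t, alpha, beta) for t in times], dt)
            sse = sum((m - o) ** 2 for m, o in zip(model, observed))
            if best is None or sse < best[0]:
                best = (sse, alpha, beta)
    return best[1], best[2]

# Synthetic check: build 'observed' data from known parameters and recover them.
dt = 0.5
times = [dt * i for i in range(16)]
rate = [math.exp(-t) for t in times]                    # toy gastric emptying rate
observed = convolve(rate, [fermi(t, 2.0, 2.0) for t in times], dt)
alpha, beta = fit(times, rate, observed, dt)
print(alpha, beta)
```

In practice a continuous optimizer would replace the grid, and the mean transit time would then be computed by integrating the fitted Fermi function.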

  7. Measurement of spatial correlation functions using image processing techniques

    International Nuclear Information System (INIS)

    Berryman, J.G.

    1985-01-01

    A procedure for using digital image processing techniques to measure the spatial correlation functions of composite heterogeneous materials is presented. Methods for eliminating undesirable biases and warping in digitized photographs are discussed. Fourier transform methods and array processor techniques for calculating the spatial correlation functions are treated. By introducing a minimal set of lattice-commensurate triangles, a method of sorting and storing the values of three-point correlation functions in a compact one-dimensional array is developed. Examples are presented at each stage of the analysis using synthetic photographs of cross sections of a model random material (the penetrable sphere model) for which the analytical form of the spatial correlation functions is known. Although results depend somewhat on magnification and on relative volume fraction, it is found that photographs digitized with 512 x 512 pixels generally have sufficiently good statistics for most practical purposes. To illustrate the use of the correlation functions, bounds on conductivity for the penetrable sphere model are calculated with a general numerical scheme developed for treating the singular three-dimensional integrals which must be evaluated
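As a minimal illustration of a spatial correlation measurement, the two-point correlation S2 of a binary image is the probability that two pixels separated by a given lag both lie in the phase of interest. The direct-summation sketch below (with periodic boundaries and a toy 4x4 image) is an assumption-laden stand-in, not the Fourier/array-processor implementation used in the paper.

```python
# Sketch: two-point correlation of a binary image for a horizontal lag.

def two_point_correlation(img, dx):
    """S2 for a horizontal lag dx, with periodic boundary conditions."""
    rows, cols = len(img), len(img[0])
    hits = sum(img[r][c] * img[r][(c + dx) % cols]
               for r in range(rows) for c in range(cols))
    return hits / (rows * cols)

# Synthetic 4x4 cross section of a 'random material': 1 = inclusion phase.
img = [
    [1, 1, 0, 0],
    [1, 1, 0, 0],
    [0, 0, 0, 0],
    [0, 0, 0, 1],
]
vf = two_point_correlation(img, 0)   # lag 0 recovers the volume fraction
print(vf, two_point_correlation(img, 1))
```

For large images the same quantity is computed far faster via FFT autocorrelation, which is the kind of Fourier-transform method the abstract refers to.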

  8. Digital Image Processing Technique for Breast Cancer Detection

    Science.gov (United States)

    Guzmán-Cabrera, R.; Guzmán-Sepúlveda, J. R.; Torres-Cisneros, M.; May-Arrioja, D. A.; Ruiz-Pinales, J.; Ibarra-Manzano, O. G.; Aviña-Cervantes, G.; Parada, A. González

    2013-09-01

    Breast cancer is the most common cause of death in women and the second leading cause of cancer deaths worldwide. Primary prevention in the early stages of the disease becomes complex as the causes remain almost unknown. However, some typical signatures of this disease, such as masses and microcalcifications appearing on mammograms, can be used to improve early diagnostic techniques, which is critical for women's quality of life. X-ray mammography is the main test used for screening and early diagnosis, and its analysis and processing are the keys to improving breast cancer prognosis. As masses and benign glandular tissue typically appear with low contrast and are often very blurred, several computer-aided diagnosis schemes have been developed to support radiologists and internists in their diagnosis. In this article, an approach is proposed to effectively analyze digital mammograms based on texture segmentation for the detection of early-stage tumors. The proposed algorithm was tested on several images taken from the digital database for screening mammography for cancer research and diagnosis, and it was found to be well suited to distinguishing masses and microcalcifications from the background tissue using morphological operators, and then extracting them through machine learning techniques and a clustering algorithm for intensity-based segmentation.
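The intensity-based clustering step mentioned above can be illustrated with a simple one-dimensional two-cluster k-means threshold. This is a generic stand-in under assumed toy data; the paper's actual segmentation pipeline (texture features plus morphological operators) is more elaborate.

```python
# Sketch: split pixel intensities into background/foreground with 1-D k-means.

def kmeans_threshold(values, iters=20):
    """Two-cluster k-means on intensities; returns the midpoint threshold."""
    c0, c1 = min(values), max(values)          # initialize at the extremes
    for _ in range(iters):
        g0 = [v for v in values if abs(v - c0) <= abs(v - c1)]
        g1 = [v for v in values if abs(v - c0) > abs(v - c1)]
        if g0:
            c0 = sum(g0) / len(g0)
        if g1:
            c1 = sum(g1) / len(g1)
    return (c0 + c1) / 2

# Toy intensities: dim background tissue vs. bright candidate lesions.
pixels = [10, 12, 11, 13, 200, 205, 198, 12, 210]
t = kmeans_threshold(pixels)
mask = [1 if p > t else 0 for p in pixels]
print(t, mask)
```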

  9. Supercritical fluid processing: a new dry technique for photoresist developing

    Science.gov (United States)

    Gallagher-Wetmore, Paula M.; Wallraff, Gregory M.; Allen, Robert D.

    1995-06-01

    Supercritical fluid (SCF) technology is investigated as a dry technique for photoresist developing. Because of their unique combination of gaseous and liquid-like properties, these fluids offer comparable or improved efficiencies relative to liquid developers and, particularly in the case of carbon dioxide, would have a tremendously beneficial impact on the environment and on worker safety. Additionally, SCF technology offers the potential for processing advanced resist systems which are currently under investigation, as well as those that may have been abandoned due to problems associated with conventional developers. An investigation of various negative and positive photoresist systems is ongoing. Initially, supercritical carbon dioxide (SC CO2) was explored as a developer for polysilane resists because the exposure products, polysiloxanes, are generally soluble in this fluid. These initial studies demonstrated the viability of the SCF technique with both single-layer and bilayer systems. Subsequently, the investigation focused on using SC CO2 to produce negative images with polymers that would typically be considered positive resists. Polymers such as styrenes and methacrylates were chemically modified by fluorination and/or copolymerization to render them soluble in SC CO2. Siloxane copolymers and siloxane-modified methacrylates were examined as well. The preliminary findings reported here indicate the feasibility of using SC CO2 for photoresist developing.

  10. Accelerated decomposition techniques for large discounted Markov decision processes

    Science.gov (United States)

    Larach, Abdelhadi; Chafik, S.; Daoui, C.

    2017-12-01

    Many hierarchical techniques to solve large Markov decision processes (MDPs) are based on the partition of the state space into strongly connected components (SCCs) that can be classified into levels. In each level, smaller problems named restricted MDPs are solved, and these partial solutions are then combined to obtain the global solution. In this paper, we first propose a novel algorithm, a variant of Tarjan's algorithm, that simultaneously finds the SCCs and the levels they belong to. Second, a new definition of the restricted MDPs is presented to improve hierarchical solutions of discounted MDPs using a value iteration (VI) algorithm based on a list of state-action successors. Finally, a robotic motion-planning example and experimental results are presented to illustrate the benefit of the proposed decomposition algorithms.
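The SCC decomposition underlying this approach can be sketched with the classical Tarjan algorithm in its textbook recursive form; the paper's variant, which additionally assigns each SCC to a level, is not reproduced here.

```python
# Sketch: classical Tarjan strongly-connected-components algorithm.

def tarjan_scc(graph):
    """graph: dict node -> list of successor nodes. Returns a list of SCCs."""
    index, low, on_stack = {}, {}, set()
    stack, sccs, counter = [], [], [0]

    def strongconnect(v):
        index[v] = low[v] = counter[0]
        counter[0] += 1
        stack.append(v)
        on_stack.add(v)
        for w in graph.get(v, []):
            if w not in index:                  # tree edge: recurse
                strongconnect(w)
                low[v] = min(low[v], low[w])
            elif w in on_stack:                 # back edge into current SCC
                low[v] = min(low[v], index[w])
        if low[v] == index[v]:                  # v is the root of an SCC
            comp = []
            while True:
                w = stack.pop()
                on_stack.discard(w)
                comp.append(w)
                if w == v:
                    break
            sccs.append(comp)

    for v in graph:
        if v not in index:
            strongconnect(v)
    return sccs

# Toy MDP state-transition graph: {0,1,2} form a cycle, 3 and 4 are singletons.
g = {0: [1], 1: [2], 2: [0], 3: [1, 4], 4: []}
print(tarjan_scc(g))
```

Because Tarjan emits components in reverse topological order, restricted MDPs over the later-emitted SCCs can be solved first and their values reused by components that feed into them.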

  11. Development of food preservation and processing techniques by radiation

    International Nuclear Information System (INIS)

    Byun, Myung Woo; Yook, Hong Sun; Lee, Ju Woon and others

    1999-03-01

    Development of food preservation and processing techniques by radiation was performed. Gamma irradiation at 2-10 kGy is considered to be an effective method to control pathogenic bacteria in species including Escherichia coli O157:H7. Gamma irradiation at 5 kGy completely eliminated pathogenic bacteria in beef. Gamma irradiation at such doses and subsequent storage at less than 4 deg C could ensure hygienic quality and prolong the microbiological shelf-life resulting from the reduction of spoilage microorganisms. Gamma irradiation of pre-rigor beef shortens the aging period, improves tenderness and enhances beef quality. In addition, new beef processing methods using gamma irradiation, such as low-salt sausage and hygienic beef patties, were developed. In safety tests of gamma-irradiated meats (beef: 0-5 kGy; pork: 0-30 kGy), genotoxicity, acute toxicity, four-week oral toxicity, rat hepatocarcinogenesis and the antioxidative defense system were not affected by gamma irradiation. To pre-establish an alternative to the toxic fumigant methyl bromide, which is the current quarantine measure for agricultural products for export and import, selected agricultural products, such as chestnuts, acorns, red beans and mung beans, were subjected to a preliminary study to compare the effects of gamma irradiation and MBr fumigation on disinfestation and quality, thereby preparing basic data for practical application. Current fumigation (MBr) was perfect in its disinfecting capability, but it caused detrimental effects on the physical quality of agricultural produce. However, irradiation doses suitable for controlling pests did not induce any significant changes in the quality of the products. (author)

  12. The sophisticated control of the tram bogie on track

    Directory of Open Access Journals (Sweden)

    Radovan DOLECEK

    2015-09-01

    The paper deals with control algorithms for a new conception of tram vehicle bogie. The main goals of these research activities are to reduce wear of wheels and rails, to reduce traction energy losses and to increase running comfort. An experimental tram vehicle with a special bogie construction powered by a traction battery is utilized for these purposes. This vehicle has a rotary bogie with independently rotating wheels driven by permanent magnet synchronous motors and a solid axle. The wheel forces in the bogie are measured by a large number of sensors placed on the experimental tram vehicle. The designed control algorithms are currently implemented in the vehicle's control system. The traction requirements and track characteristics affect these control algorithms. This sophisticated routing control brings further improvements, which are verified and corrected according to individual traction and driving characteristics, and opens new possibilities.

  13. SOFT: smooth OPC fixing technique for ECO process

    Science.gov (United States)

    Zhang, Hongbo; Shi, Zheng

    2007-03-01

    SOFT (Smooth OPC Fixing Technique) is a new OPC flow developed from the basic OPC framework. It provides a new method to reduce the computation cost and complexity of ECO-OPC (Engineering Change Order - Optical Proximity Correction). In this paper, we introduce polygon comparison to extract the necessary but possibly lost fragmentation and offset information of the previous post-OPC layout. By reusing these data, we can start the modification of each segment from a more accurate initial offset. In addition, the fragmentation used at the boundary of the patch in the previous OPC process becomes available, allowing engineers to stitch the regional ECO-OPC result back into the whole post-OPC layout seamlessly. For the ripple effect in OPC, by comparing each segment's movement in each loop, we largely free the fixing speed from the limitation of patch size. We handle layout remodification, especially in three basic kinds of ECO-OPC processes, while maintaining the rest of the design closure. Our experimental results show that, by utilizing the previous post-OPC layout, full-chip ECO-OPC can achieve an over 5X acceleration, and the regional ECO-OPC result can be stitched back into the whole layout seamlessly while accounting for the ripple effect of the lithography interaction.

  14. Computer processing of the scintigraphic image using digital filtering techniques

    International Nuclear Information System (INIS)

    Matsuo, Michimasa

    1976-01-01

    The theory of digital filtering was studied as a method for the computer processing of scintigraphic images. The characteristics and design techniques of finite impulse response (FIR) digital filters with linear phase were examined using the z-transform. The conventional data processing method, smoothing, can be recognized as one kind of linear-phase FIR low-pass digital filtering. Ten representative FIR low-pass digital filters with various cut-off frequencies were scrutinized in the frequency domain in one and two dimensions. These filters were applied to phantom studies with cold targets, using an on-line scintillation camera-minicomputer system. These studies revealed that the resultant images had a direct connection with the magnitude response of the filter, that is, they could be estimated fairly well from the frequency response of the digital filter used. The filter estimated from phantom studies as optimal for liver scintigrams using 198Au-colloid was successfully applied in clinical use for detecting true cold lesions and, at the same time, eliminating spurious images. (J.P.N.)
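The observation that smoothing is itself a linear-phase FIR low-pass filter can be illustrated with a 3x3 uniform moving average. This is a generic example with a toy image, not one of the ten filters studied in the paper.

```python
# Sketch: a 3x3 moving average is a linear-phase FIR filter with a symmetric
# kernel; applying it spreads a single hot pixel over its neighborhood.

def smooth3x3(img):
    """Apply a 3x3 uniform FIR kernel (zero padding at the borders)."""
    rows, cols = len(img), len(img[0])
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            acc = 0.0
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < rows and 0 <= cc < cols:
                        acc += img[rr][cc]
            out[r][c] = acc / 9.0
    return out

img = [[0, 0, 0], [0, 9, 0], [0, 0, 0]]   # single hot pixel of 9 counts
print(smooth3x3(img))
```

Because the kernel is symmetric, the filter's phase response is linear, so features are blurred but not spatially shifted, which is exactly the property the paper exploits when relating image appearance to the filter's magnitude response.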

  15. Integrative techniques related to positive processes in psychotherapy.

    Science.gov (United States)

    Cromer, Thomas D

    2013-09-01

    This review compiles and evaluates a number of therapist interventions that have been found to significantly contribute to positive psychotherapy processes (i.e., increased alliance, patient engagement/satisfaction, and symptomatic improvement). Four forms of intervention are presented: Affect-focused, Supportive, Exploratory, and Patient-Therapist Interaction. The intention of this review is to link specific interventions to applied practice so that integrative clinicians can potentially use these techniques to improve their clinical work. To this end, there is the inclusion of theory and empirical studies from a range of orientations including Emotionally Focused, Psychodynamic, Client-Centered, Cognitive-Behavioral, Interpersonal, Eclectic, and Motivational Interviewing. Each of the four sections will include the theoretical basis and proposed mechanism of change for the intervention, research that supports its positive impact on psychotherapy processes, and conclude with examples demonstrating its use in actual practice. Clinical implications and considerations regarding the use of these interventions will also be presented.

  16. Mapping innovation processes: Visual techniques for opening and presenting the black box of service innovation processes

    DEFF Research Database (Denmark)

    Olesen, Anne Rørbæk

    2017-01-01

    This chapter argues for the usefulness of visual mapping techniques for performing qualitative analysis of complex service innovation processes. Different mapping formats are presented, namely, matrices, networks, process maps, situational analysis maps and temporal situational analysis maps. For the purpose of researching service innovation processes, the three latter formats are argued to be particularly interesting. Process maps can give an overview of different periods and milestones in a process in one carefully organized location. Situational analysis maps and temporal situational analysis maps can open up complexities of service innovation processes, as well as close them down for presentational purposes. The mapping formats presented are illustrated by displaying maps from an exemplary research project, and the chapter is concluded with a brief discussion of the limitations and pitfalls...

  17. Analysis of the Growth Process of Neural Cells in Culture Environment Using Image Processing Techniques

    Science.gov (United States)

    Mirsafianf, Atefeh S.; Isfahani, Shirin N.; Kasaei, Shohreh; Mobasheri, Hamid

    Here we present an approach to processing neural cell images to analyze their growth process in a culture environment. We have applied several image processing techniques for: 1- environmental noise reduction, 2- neural cell segmentation, 3- neural cell classification based on their dendrites' growth conditions, and 4- extraction and measurement of neuron features (e.g., cell body area, number of dendrites, axon length, and so on). Due to the large amount of noise in the images, we have used feed-forward artificial neural networks to detect edges more precisely.

  18. Techniques for laser processing, assay, and examination of spent fuel

    International Nuclear Information System (INIS)

    Gray, J.H.; Mitchell, R.C.; Rogell, M.L.

    1981-11-01

    Fuel examination studies were performed which have application to interim spent fuel storage. These studies were in three areas, i.e., laser drilling and rewelding demonstration, nondestructive assay techniques survey, and fuel examination techniques survey

  19. Operation technique and healing process of telescopic ileocolostomy in dogs.

    Science.gov (United States)

    Szucs, G; Tóth, I; Barna, T; Bráth, E; Gyáni, K; Mikó, Irén

    2003-01-01

    The healing process of telescopic anastomoses was studied in an animal experiment with 12 mongrel dogs. After the division of vessels an ileal segment of different length was invaginated into the lumen of the colon using single-layer interrupted sutures. The following four groups were used: Group A (n = 3): end-to-side ileocolostomy, single-layer interrupted suture (invagination length: 0 mm), survival time: 21 days. Group B (n = 3): invagination length: 20 mm, survival time: 7 days. Group C (n = 3): invagination length: 10 mm, survival time: 21 days. Group D (n = 3): invagination length: 20 mm, survival time: 21 days. At the end of the above survival times the anastomosis area was removed. The bursting pressure was measured and morphological as well as histological examinations were performed. In each case the 0-day look-alikes of the anastomoses were prepared using the remnant bowels, and bursting pressure measurements were done on these models as well. Anastomosis leakage did not occur. The serosal layer of the intracolonic part of the ileum disappeared during the healing process. The free surface of the intracolonic ileal segment became covered by the sliding mucosa of the colon and the prolapsing mucosa of the ileum. The following could be concluded from the experiments: The inner pressure tolerance of a telescopic ileocolostomy immediately after preparation is better than that of another single-layer anastomosis. This fact results in increased safety against leakage in the first postoperative days. The inner pressure tolerance of the telescopic ileocolostomy increases during the healing process and does not depend on the length of the invaginated part (0 day-20 mm: 56 mmHg +/- 6, Group A: 252 +/- 39, Group B: 154 +/- 19, Group C: 249 +/- 20, Group D: 298 +/- 2). There is no difference in pressure tolerance between the telescopic and the end-to-side single-layer interrupted anastomoses after the healing process.
The invaginated section within the lumen of the

  20. Development of food preservation and processing techniques by radiation

    International Nuclear Information System (INIS)

    Byun, Myung Woo; Yook, Hong Sun; Lee, Ju Woon and others

    2000-03-01

    Development of food preservation and processing techniques by radiation was performed. Gamma irradiation at 5 kGy completely eliminated pathogenic bacteria in pork and chicken meats. Gamma irradiation at such doses and subsequent storage at less than 4 deg C could ensure hygienic quality and prolong the microbiological shelf-life resulting from the reduction of spoilage microorganisms. A pork loin ham with desirable color was also developed without the use of sodium nitrite, which is known as a carcinogen. In safety tests of gamma-irradiated meats, genotoxicity, acute toxicity, four-week oral toxicity, rat hepatocarcinogenesis and the antioxidative defense system were not affected by gamma irradiation. Gamma irradiation at about 1 kGy completely eliminated the parasites in foods and drinking water. In the study of quarantine treatment of apples and pears for export by gamma irradiation, the current fumigation (MBr) was perfect in its disinfesting capability, but it caused detrimental effects on the physical quality of the fruit. However, irradiation doses of 1-3 kGy were suitable for controlling pests and did not induce any significant changes in the quality of the products. The results of a survey to assess public understanding indicated that irradiated food had a somewhat negative image among the general public. Therefore, it is necessary to establish a public education and information program, using mass communication and a dedicated communication system, to improve public perception

  2. Qubit Manipulations Techniques for Trapped-Ion Quantum Information Processing

    Science.gov (United States)

    Gaebler, John; Tan, Ting; Lin, Yiheng; Bowler, Ryan; Jost, John; Meier, Adam; Knill, Emanuel; Leibfried, Dietrich; Wineland, David; Ion Storage Team

    2013-05-01

    We report recent results on qubit manipulation techniques for trapped-ions towards scalable quantum information processing (QIP). We demonstrate a platform-independent benchmarking protocol for evaluating the performance of Clifford gates, which form a basis for fault-tolerant QIP. We report a demonstration of an entangling gate scheme proposed by Bermudez et al. [Phys. Rev. A. 85, 040302 (2012)] and achieve a fidelity of 0.974(4). This scheme takes advantage of dynamic decoupling which protects the qubit against dephasing errors. It can be applied directly on magnetic-field-insensitive states, and provides a number of simplifications in experimental implementation compared to some other entangling gates with trapped ions. We also report preliminary results on dissipative creation of entanglement with trapped-ions. Creation of an entangled pair does not require discrete logic gates and thus could reduce the level of quantum-coherent control needed for large-scale QIP. Supported by IARPA, ARO contract No. EAO139840, ONR, and the NIST Quantum Information Program.

  3. A microelectromechanical accelerometer fabricated using printed circuit processing techniques

    Science.gov (United States)

    Rogers, J. E.; Ramadoss, R.; Ozmun, P. M.; Dean, R. N.

    2008-01-01

    A microelectromechanical systems (MEMS) capacitive-type accelerometer fabricated using printed circuit processing techniques is presented. A Kapton polyimide film is used as the structural layer for fabricating the MEMS accelerometer. The accelerometer proof mass along with four suspension beams is defined in the Kapton polyimide film. The proof mass is suspended above a Teflon substrate using a spacer. The deflection of the proof mass is detected using a pair of capacitive sensing electrodes. The top electrode of the accelerometer is defined on the top surface of the Kapton film. The bottom electrode is defined in the metallization on the Teflon substrate. The initial gap height is determined by the distance between the bottom electrode and the Kapton film. For an applied external acceleration (normal to the proof mass), the proof mass deflects toward or away from the fixed bottom electrode due to inertial force. This deflection causes either a decrease or increase in the air-gap height, thereby either increasing or decreasing the capacitance between the top and the bottom electrodes. An example PCB MEMS accelerometer with a square proof mass of membrane area 6.4 mm × 6.4 mm is reported. The measured resonant frequency is 375 Hz and the Q-factor in air is 0.52.
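The capacitive sensing principle described above can be sketched with the parallel-plate formula C = eps0 * A / d. The 6.4 mm x 6.4 mm proof-mass area comes from the abstract, while the gap heights below are assumptions chosen purely for illustration.

```python
# Back-of-the-envelope sketch of parallel-plate capacitive sensing.
EPS0 = 8.854e-12                 # vacuum permittivity, F/m

def plate_capacitance(area_m2, gap_m):
    """Parallel-plate capacitance C = eps0 * A / d for an air gap."""
    return EPS0 * area_m2 / gap_m

area = 6.4e-3 * 6.4e-3           # proof-mass electrode area from the abstract, m^2
c_rest = plate_capacitance(area, 50e-6)     # assumed 50 um initial gap
c_toward = plate_capacitance(area, 45e-6)   # proof mass deflects toward electrode
c_away = plate_capacitance(area, 55e-6)     # proof mass deflects away
print(c_rest, c_toward, c_away)
```

The asymmetry between c_toward and c_away reflects the 1/d nonlinearity of the gap-closing geometry, which readout electronics typically linearize by operating over small deflections.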

  4. A microelectromechanical accelerometer fabricated using printed circuit processing techniques

    International Nuclear Information System (INIS)

    Rogers, J E; Ramadoss, R; Ozmun, P M; Dean, R N

    2008-01-01

    A microelectromechanical systems (MEMS) capacitive-type accelerometer fabricated using printed circuit processing techniques is presented. A Kapton polyimide film is used as the structural layer for fabricating the MEMS accelerometer. The accelerometer proof mass along with four suspension beams is defined in the Kapton polyimide film. The proof mass is suspended above a Teflon substrate using a spacer. The deflection of the proof mass is detected using a pair of capacitive sensing electrodes. The top electrode of the accelerometer is defined on the top surface of the Kapton film. The bottom electrode is defined in the metallization on the Teflon substrate. The initial gap height is determined by the distance between the bottom electrode and the Kapton film. For an applied external acceleration (normal to the proof mass), the proof mass deflects toward or away from the fixed bottom electrode due to inertial force. This deflection causes either a decrease or increase in the air-gap height, thereby either increasing or decreasing the capacitance between the top and the bottom electrodes. An example PCB MEMS accelerometer with a square proof mass of membrane area 6.4 mm × 6.4 mm is reported. The measured resonant frequency is 375 Hz and the Q-factor in air is 0.52.

  5. Reactive polymer coatings: A robust platform towards sophisticated surface engineering for biotechnology

    Science.gov (United States)

    Chen, Hsien-Yeh

    Functionalized poly(p-xylylenes) or so-called reactive polymers can be synthesized via chemical vapor deposition (CVD) polymerization. The resulting ultra-thin coatings are pinhole-free and can be conformally deposited to a wide range of substrates and materials. More importantly, the equipped functional groups can serve as anchoring sites for tailoring the surface properties, making these reactive coatings a robust platform that can deal with sophisticated challenges faced in biointerfaces. In the work presented herein, surface coatings presenting various functional groups were prepared by the CVD process. Such surfaces include an aldehyde-functionalized coating to precisely immobilize saccharide molecules onto well-defined areas and an alkyne-functionalized coating to click azide-modified molecules via the Huisgen 1,3-dipolar cycloaddition reaction. Moreover, CVD copolymerization has been conducted to prepare multifunctional coatings, and their specific functions were demonstrated by the immobilization of biotin and NHS-ester molecules. By using a photodefinable coating, polyethylene oxides were immobilized onto a wide range of substrates through photo-immobilization. Spatially controlled protein-resistant properties were characterized by selective adsorption of fibrinogen and bovine serum albumin as model systems. Alternatively, surface initiator coatings were used for polymer grafting of poly(ethylene glycol) methyl ether methacrylate, and the resultant protein- and cell-resistant properties were characterized by adsorption of kinesin motor proteins, fibrinogen, and murine fibroblasts (NIH3T3). Accessibility of reactive coatings within confined microgeometries was systematically studied, and the preparation of homogeneous polymer thin films within the inner surface of microchannels was demonstrated. Moreover, these advanced coatings were applied to develop a dry adhesion process for microfluidic devices.
This process provides (i) excellent bonding strength, (ii) extended

  6. Sophisticated Communication in the Brazilian Torrent Frog Hylodes japi.

    Science.gov (United States)

    de Sá, Fábio P; Zina, Juliana; Haddad, Célio F B

    2016-01-01

    Intraspecific communication in frogs plays an important role in the recognition of conspecifics in general and of potential rivals or mates in particular, with relevant consequences for pre-zygotic reproductive isolation. We investigate intraspecific communication in Hylodes japi, an endemic Brazilian torrent frog with territorial males and an elaborate courtship behavior. We describe its repertoire of acoustic signals as well as one of the most complex repertoires of visual displays known in anurans, including five new visual displays. We also describe a bimodal inter-sexual communication system, previously unknown in frogs, in which the female stimulates the male to emit a courtship call. As another novelty for frogs, we show that, in addition to choosing which limb to signal with, males choose which of their two vocal sacs will be used for visual signaling. We explain how and why this is accomplished. Control of inflation also provides additional evidence that vocal sac movement and color must be important for visual communication, even while producing sound. Through the current knowledge on visual signaling in Neotropical torrent frogs (i.e. hylodids), we discuss and highlight the behavioral diversity in the family Hylodidae. Our findings indicate that communication in species of Hylodes is undoubtedly more sophisticated than we expected and that visual communication in anurans is more widespread than previously thought. This is especially true in tropical regions, most likely due to the higher number of species and phylogenetic groups and/or to ecological factors, such as higher microhabitat diversity.

  7. Sophisticated Communication in the Brazilian Torrent Frog Hylodes japi.

    Directory of Open Access Journals (Sweden)

    Fábio P de Sá

    Full Text Available Intraspecific communication in frogs plays an important role in the recognition of conspecifics in general and of potential rivals or mates in particular, with relevant consequences for pre-zygotic reproductive isolation. We investigate intraspecific communication in Hylodes japi, an endemic Brazilian torrent frog with territorial males and an elaborate courtship behavior. We describe its repertoire of acoustic signals as well as one of the most complex repertoires of visual displays known in anurans, including five new visual displays. We also describe a bimodal inter-sexual communication system, previously unknown in frogs, in which the female stimulates the male to emit a courtship call. As another novelty for frogs, we show that, in addition to choosing which limb to signal with, males choose which of their two vocal sacs will be used for visual signaling. We explain how and why this is accomplished. Control of inflation also provides additional evidence that vocal sac movement and color must be important for visual communication, even while producing sound. Through the current knowledge on visual signaling in Neotropical torrent frogs (i.e. hylodids), we discuss and highlight the behavioral diversity in the family Hylodidae. Our findings indicate that communication in species of Hylodes is undoubtedly more sophisticated than we expected and that visual communication in anurans is more widespread than previously thought. This is especially true in tropical regions, most likely due to the higher number of species and phylogenetic groups and/or to ecological factors, such as higher microhabitat diversity.

  8. Impact of sophisticated fog spray models on accident analyses

    International Nuclear Information System (INIS)

    Roblyer, S.P.; Owzarski, P.C.

    1978-01-01

    The N-Reactor confinement system release dose to the public in a postulated accident is reduced by washing the confinement atmosphere with fog sprays. This allows a low pressure release of confinement atmosphere containing fission products through filters and out an elevated stack. The current accident analysis required revision of the CORRAL code and other codes such as CONTEMPT to properly model the N Reactor confinement as a system of multiple fog-sprayed compartments. In revising these codes, more sophisticated models for the fog sprays and iodine plateout were incorporated to remove some of the conservatism in the steam condensing rate, fission product washout, and iodine plateout assumed in previous studies. The CORRAL code, which was used to describe the transport and deposition of airborne fission products in LWR containment systems for the Rasmussen Study, was revised to describe fog spray removal of molecular iodine (I₂) and particulates in multiple compartments for sprays having individual characteristics of on-off times, flow rates, fall heights, and drop sizes in changing containment atmospheres. During postulated accidents, the code determined the fission product removal rates internally rather than from input decontamination factors. A discussion is given of how the calculated plateout and washout rates vary with time throughout the analysis. The results of the accident analyses indicated that more credit could be given to fission product washout and plateout. An important finding was that the release of fission products to the atmosphere and adsorption of fission products on the filters were significantly lower than previous studies had indicated.
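
    The washout described above removes airborne fission products at rates the revised CORRAL code computes internally from spray parameters. As a hedged illustration of the underlying first-order removal model, with an assumed (not source-derived) removal rate:

```python
import math

def airborne_fraction(removal_rate_per_h: float, hours: float) -> float:
    """First-order washout: fraction of airborne material remaining after t hours,
    C(t)/C(0) = exp(-lambda * t)."""
    return math.exp(-removal_rate_per_h * hours)

# Hypothetical removal rate; CORRAL derives such rates internally from spray
# flow rate, fall height, and drop size rather than from a fixed input factor.
lam = 2.0  # 1/h, assumed for illustration
print(airborne_fraction(lam, 1.0))  # fraction of fission products still airborne after 1 h
```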

  9. Image Processing Techniques for Assessment of Dental Trays

    National Research Council Canada - National Science Library

    Yergin, E

    2001-01-01

    .... This technique was used for the alignment of six brands of perforated metal trays with 170 lower arch cast models collected from patients having Angle Class 1 type occlusion with minor malocclusions...

  10. USE OF ARTIFICIAL INTELLIGENCE TECHNIQUES IN QUALITY IMPROVING PROCESS

    OpenAIRE

    KALİTE İYİLEŞTİRME SÜRECİNDE YAPAY ZEKÂ KAYA; Orhan ENGİN

    2005-01-01

    Today, changing of competition conditions and customer preferences caused to happen many differences in the viewpoint of firms' quality studies. At the same time, improvements in computer technologies accelerated use of artificial intelligence. Artificial intelligence technologies are being used to solve many industry problems. In this paper, we investigated the use of artificial intelligence techniques to solve quality problems. The artificial intelligence techniques, which are used in quali...

  11. Beneficiation studies of Bajaur manganese ore by different processing techniques

    International Nuclear Information System (INIS)

    Riaz, M.; Khan, F.U.; Yamin, A.; Bilquees, R.; Muhammad, N.

    2010-01-01

    The manganese ore of Bajaur Agency of Pakistan was subjected to flotation, heavy medium separation, gravity concentration and magnetic separation techniques for beneficiation. The original composition of the manganese ore was 45.56% Mn, 4% Fe₂O₃, 40% SiO₂. The Mn content was raised to a maximum of 48.76% in the concentrate with a recovery of 67.78% through the flotation technique. Other techniques rendered a marginal increase in Mn concentration against the theoretical possibility of substantial enrichment by rejecting the 20% gangue minerals. The separation of manganese minerals from associated gangue was difficult due to the mineralogical complexity of the ore, extreme fineness of the particle size, texture and mineral intergrowth. The high Mn/Fe ratio, phosphorus, and silica contents were within tolerable limits for utilisation of the ore in ferro-manganese production. (author)
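
    The grade and recovery figures quoted above are linked by the standard two-product mass balance, R = (c·C)/(f·F). A small sketch using only the numbers from the abstract shows the concentrate yield they imply:

```python
def recovery(feed_grade: float, conc_grade: float, conc_mass_frac: float) -> float:
    """Metallurgical recovery R = (c * C) / (f * F), with C expressed as a
    fraction of the feed mass F."""
    return conc_grade * conc_mass_frac / feed_grade

def conc_mass_frac_from_recovery(feed_grade: float, conc_grade: float, rec: float) -> float:
    """Invert the balance to get the concentrate yield C/F implied by a recovery."""
    return rec * feed_grade / conc_grade

# Figures from the abstract: 45.56% Mn feed, 48.76% Mn concentrate, 67.78% recovery.
yield_frac = conc_mass_frac_from_recovery(0.4556, 0.4876, 0.6778)
print(round(yield_frac, 3))  # implied concentrate yield as a fraction of feed mass
```

    The modest upgrade from 45.56% to 48.76% Mn with a yield above 60% of the feed is consistent with the abstract's remark that enrichment fell well short of the theoretical possibility.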

  12. Nondestructive Evaluation of Thick Concrete Using Advanced Signal Processing Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Clayton, Dwight A [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Barker, Alan M [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Santos-Villalobos, Hector J [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Albright, Austin P [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hoegh, Kyle [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Khazanovich, Lev [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-09-01

    The purpose of the U.S. Department of Energy Office of Nuclear Energy’s Light Water Reactor Sustainability (LWRS) Program is to develop technologies and other solutions that can improve the reliability, sustain the safety, and extend the operating lifetimes of nuclear power plants (NPPs) beyond 60 years [1]. Since many important safety structures in an NPP are constructed of concrete, inspection techniques must be developed and tested to evaluate the internal condition. In-service containment structures generally do not allow for the destructive measures necessary to validate the accuracy of these inspection techniques. This creates a need for comparative testing of the various nondestructive evaluation (NDE) measurement techniques on concrete specimens with known material properties, voids, internal microstructure flaws, and reinforcement locations.

  13. Techniques, processes, and measures for software safety and reliability

    International Nuclear Information System (INIS)

    Sparkman, D.

    1992-01-01

    The purpose of this report is to provide a detailed survey of current recommended practices and measurement techniques for the development of reliable and safe software-based systems. This report is intended to assist the United States Nuclear Regulatory Commission's Office of Nuclear Reactor Regulation (NRR) in determining the importance and maturity of the available techniques and in assessing the relevance of individual standards for application to instrumentation and control systems in nuclear power generating stations. Lawrence Livermore National Laboratory (LLNL) provides technical support for the Instrumentation and Control Systems Branch (ICSB) of NRR in advanced instrumentation and control systems, distributed digital systems, software reliability, and the application of verification and validation for the development of software.

  14. Handbook of thin film deposition processes and techniques principles, methods, equipment and applications

    CERN Document Server

    Seshan, Krishna

    2002-01-01

    New second edition of the popular book on deposition (first edition by Klaus Schruegraf) for engineers, technicians, and plant personnel in the semiconductor and related industries. This book traces the technology behind the spectacular growth in the silicon semiconductor industry and the continued trend in miniaturization over the last 20 years. This growth has been fueled in large part by improved thin film deposition techniques and the development of highly specialized equipment to enable this deposition. The book includes much cutting-edge material. Entirely new chapters on contamination and contamination control describe the basics and the issues-as feature sizes shrink to sub-micron dimensions, cleanliness and particle elimination has to keep pace. A new chapter on metrology explains the growth of sophisticated, automatic tools capable of measuring thickness and spacing of sub-micron dimensions. The book also covers PVD, laser and e-beam assisted deposition, MBE, and ion beam methods to bring together a...

  15. A Processing Technique for OFDM-Modulated Wideband Radar Signals

    NARCIS (Netherlands)

    Tigrek, R.F.

    2010-01-01

    The orthogonal frequency division multiplexing (OFDM) is a multicarrier spread-spectrum technique which finds wide-spread use in communications. The OFDM pulse compression method that utilizes an OFDM communication signal for radar tasks has been developed and reported in this dissertation. Using

  16. Image processing techniques for quantification and assessment of brain MRI

    NARCIS (Netherlands)

    Kuijf, H.J.

    2013-01-01

    Magnetic resonance imaging (MRI) is a widely used technique to acquire digital images of the human brain. A variety of acquisition protocols is available to generate images in vivo and noninvasively, giving great opportunities to study the anatomy and physiology of the human brain. In my thesis,

  17. Capturing and Modeling Domain Knowledge Using Natural Language Processing Techniques

    National Research Council Canada - National Science Library

    Auger, Alain

    2005-01-01

    .... Initiated in 2004 at Defense Research and Development Canada (DRDC), the SACOT knowledge engineering research project is currently investigating, developing and validating innovative natural language processing (NLP...

  18. Sophisticated Search Capabilities in the ADS Abstract Service

    Science.gov (United States)

    Eichhorn, G.; Accomazzi, A.; Grant, C. S.; Henneken, E.; Kurtz, M. J.; Murray, S. S.

    2003-12-01

    The ADS provides access to over 940,000 references from astronomy and planetary sciences publications and 1.5 million records from physics publications. It is funded by NASA and provides free access to these references, as well as to 2.4 million scanned pages from the astronomical literature. These include most of the major astronomy and several planetary sciences journals, as well as many historical observatory publications. The references now include the abstracts from all volumes of the Journal of Geophysical Research (JGR) since the beginning of 2002. We get these abstracts on a regular basis. The Kluwer journal Solar Physics has been scanned back to volume 1 and is available through the ADS. We have extracted the reference lists from this and many other journals and included them in the reference and citation database of the ADS. We have recently scanned Earth, Moon and Planets, another Kluwer journal, and will scan other Kluwer journals in the future as well. We plan on extracting references from these journals as well in the near future. The ADS has many sophisticated query features. These allow the user to formulate complex queries. Using results lists to get further information about the selected articles provides the means to quickly find important and relevant articles from the database. Three advanced feedback queries are available from the bottom of the ADS results list (in addition to regular feedback queries already available from the abstract page and from the bottom of the results list): 1. Get reference list for selected articles: This query returns all known references for the selected articles (or for all articles in the first list). The resulting list will be ranked according to how often each article is referred to and will show the most referenced articles in the field of study that created the first list. It presumably shows the most important articles in that field. 2.
Get citation list for selected articles: This returns all known articles
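
    The first feedback query above ranks the union of the reference lists by how often each article is cited by the selected set. A minimal sketch of that ranking idea (toy identifiers, not real ADS bibcodes):

```python
from collections import Counter

def rank_by_reference_count(reference_lists):
    """Given the reference list of each selected article, rank the union of
    cited articles by how many of the selected articles cite them."""
    counts = Counter(ref for refs in reference_lists for ref in refs)
    return [article for article, _ in counts.most_common()]

# Hypothetical identifiers standing in for bibcodes.
lists = [["A", "B", "C"], ["B", "C"], ["C", "D"]]
print(rank_by_reference_count(lists))  # ['C', 'B', 'A', 'D']
```

    The most-cited article within the selection rises to the top, which is why the abstract describes the result as presumably the most important articles in that field.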

  19. The Cassava Processing Industry in Brazil: Traditional Techniques ...

    African Journals Online (AJOL)

    The paper considers the evolution of cassava-based industrial production, processing and marketing in Brazil, in light of the great technological diversification to be found in Brazil. It discusses the private role of the small- and medium-scale food and related processing enterprises in the food industry, as they employ ...

  20. Intelligent techniques in signal processing for multimedia security

    CERN Document Server

    Santhi, V

    2017-01-01

    This book proposes new algorithms to ensure secured communications and prevent unauthorized data exchange in secured multimedia systems. Focusing on numerous applications’ algorithms and scenarios, it offers an in-depth analysis of data hiding technologies including watermarking, cryptography, encryption, copy control, and authentication. The authors present a framework for visual data hiding technologies that resolves emerging problems of modern multimedia applications in several contexts including the medical, healthcare, education, and wireless communication networking domains. Further, it introduces several intelligent security techniques with real-time implementation. As part of its comprehensive coverage, the book discusses contemporary multimedia authentication and fingerprinting techniques, while also proposing personal authentication/recognition systems based on hand images, surveillance system security using gait recognition, face recognition under restricted constraints such as dry/wet face condi...

  1. Decomposition based parallel processing technique for efficient collaborative optimization

    International Nuclear Information System (INIS)

    Park, Hyung Wook; Kim, Sung Chan; Kim, Min Soo; Choi, Dong Hoon

    2000-01-01

    In practical design studies, most designers solve multidisciplinary problems with a complex design structure. These multidisciplinary problems involve hundreds of analyses and thousands of variables. The sequence of processes used to solve these problems affects the speed of the total design cycle. Thus it is very important for designers to reorder the original design processes to minimize total cost and time. This is accomplished by decomposing the large multidisciplinary problem into several MultiDisciplinary Analysis SubSystems (MDASSs) and processing them in parallel. This paper proposes a new strategy for parallel decomposition of multidisciplinary problems that raises design efficiency by using a genetic algorithm, and shows the relationship between decomposition and Multidisciplinary Design Optimization (MDO) methodology.
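
    The abstract does not give the genetic-algorithm encoding, so the following is only a mutation-driven evolutionary sketch of the reordering idea: permutations of analyses are evolved to minimize the number of backward couplings, i.e. dependencies that force iteration between subsystems. The coupling matrix is invented for illustration.

```python
import random

# Toy coupling matrix: DEP[i][j] = 1 if analysis i needs an output of analysis j.
DEP = [
    [0, 1, 0, 0],
    [0, 0, 1, 1],
    [1, 0, 0, 0],
    [0, 0, 1, 0],
]

def feedback_cost(order):
    """Count couplings that point backwards in the execution order; each one
    forces iteration between subsystems and slows the design cycle."""
    pos = {a: k for k, a in enumerate(order)}
    return sum(1 for i in range(len(DEP)) for j in range(len(DEP))
               if DEP[i][j] and pos[j] > pos[i])

def ga_order(n, pop=30, gens=60, seed=0):
    """Elitist, mutation-only evolutionary search over execution orders."""
    rng = random.Random(seed)
    popu = [rng.sample(range(n), n) for _ in range(pop)]
    for _ in range(gens):
        popu.sort(key=feedback_cost)
        survivors = popu[:pop // 2]          # keep the better half
        children = []
        for parent in survivors:
            child = parent[:]
            a, b = rng.sample(range(n), 2)   # mutation: swap two analyses
            child[a], child[b] = child[b], child[a]
            children.append(child)
        popu = survivors + children
    return min(popu, key=feedback_cost)

best = ga_order(4)
print(best, feedback_cost(best))  # a low-feedback execution order and its cost
```

    A real MDO decomposition would also group the reordered analyses into MDASS blocks and weigh analysis costs, which this sketch omits.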

  2. Business process mapping techniques for ISO 9001 and 14001 certifications

    Energy Technology Data Exchange (ETDEWEB)

    Klement, R.E.; Richardson, G.D.

    1997-11-01

    AlliedSignal Federal Manufacturing and Technologies/Kansas City (FM and T/KC) produces nonnuclear components for nuclear weapons. The company has operated the plant for the US Department of Energy (DOE) since 1949. Throughout the history of the plant, procedures have been written to reflect the nuclear weapons industry best practices, and the facility has built a reputation for producing high quality products. The purpose of this presentation is to demonstrate how Total Quality principles were used at FM and T/KC to document processes for ISO 9001 and 14001 certifications. The information presented to the reader will lead to a better understanding of business administration by aligning procedures to key business processes within a business model; converting functional-based procedures to process-based procedures for total integrated resource management; and assigning ownership, validation, and metrics to procedures/processes, adding value to a company's profitability.

  3. Effect of Traditional Processing Techniques on the Nutritional and ...

    African Journals Online (AJOL)

    Michael Horsfall

    Composition of African Bread-Fruit (Treculia africana) Seeds. IFEOMA I IJEH ... located mainly in the seed coat (Kumar et al., 1979; Singh ... development and control of some metabolic processes ... (1996). Regulation of selenoprotein gene.

  4. Capturing and Modeling Domain Knowledge Using Natural Language Processing Techniques

    Science.gov (United States)

    2005-06-01

    Intelligence Artificielle, France, May 2001, p. 109-118 [Barrière, 2001] -----. “Investigating the Causal Relation in Informative Texts”. Terminology, 7:2 ... out of the flood of information, the military have to create new ways of processing sensor and intelligence information, and of providing the results to commanders who must take timely operational

  5. New Materials Design Through Friction Stir Processing Techniques

    International Nuclear Information System (INIS)

    Buffa, G.; Fratini, L.; Shivpuri, R.

    2007-01-01

    Friction Stir Welding (FSW) has attracted great interest in the scientific community and, in recent years, also in the industrial environment, due to the advantages of such a solid state welding process with respect to classic fusion welding. The complex material flow occurring during the process plays a fundamental role, since it determines dramatic changes in the material microstructure of the so-called weld nugget, which affects the effectiveness of the joints. What is more, Friction Stir Processing (FSP) is mainly being considered for producing high-strain-rate-superplastic (HSRS) microstructures in commercial aluminum alloys. The aim of the present research is the development of a locally composite material through the Friction Stir Processing (FSP) of two AA7075-T6 blanks and an insert of a different material. The results of a preliminary experimental campaign, carried out by varying the additional material placed at the sheet interface under different conditions, are presented. Micro- and macro-observation of the joints thus obtained permitted investigation of the effects of the process on the overall joint performance.

  6. Automatic identification of corrosion damage using image processing techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bento, Mariana P.; Ramalho, Geraldo L.B.; Medeiros, Fatima N.S. de; Ribeiro, Elvis S. [Universidade Federal do Ceara (UFC), Fortaleza, CE (Brazil); Medeiros, Luiz C.L. [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2009-07-01

    This paper proposes a Nondestructive Evaluation (NDE) method for atmospheric corrosion detection on metallic surfaces using digital images. In this study, uniform corrosion is characterized by texture attributes extracted from the co-occurrence matrix and by the Self Organizing Map (SOM) clustering algorithm. We present a technique for automatic inspection of oil and gas storage tanks and pipelines of petrochemical industries without disturbing their properties and performance. Experimental results are promising and encourage the possibility of using this methodology in designing trustful and robust early failure detection systems. (author)
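
    The pipeline described, texture attributes from a co-occurrence matrix followed by SOM clustering, can be sketched as below. This is a simplified stand-in: the features are only contrast and energy, the "SOM" has no neighbourhood kernel (so it degenerates to online k-means), and the patches are synthetic, not inspection images.

```python
import numpy as np

def cooccurrence(img, levels=8):
    """Grey-level co-occurrence matrix for horizontal neighbours, offset (0, 1)."""
    q = (img.astype(float) / img.max() * (levels - 1)).astype(int)
    glcm = np.zeros((levels, levels))
    for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[a, b] += 1
    return glcm / glcm.sum()

def texture_features(img):
    """Haralick-style contrast and energy computed from the GLCM."""
    p = cooccurrence(img)
    i, j = np.indices(p.shape)
    return np.array([np.sum(p * (i - j) ** 2), np.sum(p ** 2)])

def som_1d(samples, n_units=2, epochs=50, lr=0.5, seed=0):
    """Minimal self-organizing map: move the best-matching unit toward each
    sample (no neighbourhood function, so effectively online k-means)."""
    rng = np.random.default_rng(seed)
    w = samples[rng.choice(len(samples), n_units, replace=False)].astype(float)
    for _ in range(epochs):
        for x in samples:
            bmu = np.argmin(np.linalg.norm(w - x, axis=1))
            w[bmu] += lr * (x - w[bmu])
        lr *= 0.9
    return w

# Synthetic patches: smooth (low contrast) vs noisy (high contrast) stand in
# for sound metal vs corroded texture; these are not real inspection data.
rng = np.random.default_rng(1)
smooth = [np.full((16, 16), 100) + rng.integers(0, 5, (16, 16)) for _ in range(5)]
rough = [rng.integers(1, 255, (16, 16)) for _ in range(5)]
feats = np.array([texture_features(p) for p in smooth + rough])
units = som_1d(feats)
print(units)  # two prototype feature vectors, one per texture class
```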

  7. Process, Voltage and Temperature Compensation Technique for Cascode Modulated PAs

    DEFF Research Database (Denmark)

    Sira, Daniel; Larsen, Torben

    2013-01-01

    , that represents a transistor level model (empirical model) of the cascode modulated PA, is utilized in a PA analog predistorter. The analog predistorter linearizes and compensates for PVT variation of the cascode modulated PA. The empirical model is placed in the negative feedback of an operational...... transconductance amplifier. The predistorted varying envelope signal is applied to the cascode gate of the PA. It is shown that the proposed PVT compensation technique significantly reduces the PVT spread of the PA linearity indicators and improves the PA linearity. Simulations were performed in a 0.13 μm CMOS...

  8. Pulsed electrical discharges for medicine and biology techniques, processes, applications

    CERN Document Server

    Kolikov, Victor

    2015-01-01

    This book presents the application of pulsed electrical discharges in water and water dispersions of metal nanoparticles in medicine (surgery, dentistry, and oncology), biology, and ecology. The intensive electrical and shock waves represent a novel technique to destroy viruses and this way to  prepare anti-virus vaccines. The method of pulsed electrical discharges in water allows to decontaminate water from almost all known bacteria and spores of fungi being present in human beings. The nanoparticles used are not genotoxic and mutagenic. This book is useful for researchers and graduate students.

  9. Satellite and terrestrial radio positioning techniques a signal processing perspective

    CERN Document Server

    Dardari, Davide; Falletti, Emanuela

    2014-01-01

    * The first book to combine satellite and terrestrial positioning techniques - vital for the understanding and development of new technologies
    * Written and edited by leading experts in the field, with contributors belonging to the European Commission's FP7 Network of Excellence NEWCOM++
    * Applications to a wide range of fields, including sensor networks, emergency services, military use, location-based billing, location-based advertising, intelligent transportation, and leisure
    Location-aware personal devices and location-based services have become ever more prominent in the past few years.

  10. Modeling and Control of Multivariable Process Using Intelligent Techniques

    Directory of Open Access Journals (Sweden)

    Subathra Balasubramanian

    2010-10-01

    Full Text Available For nonlinear dynamic systems, first-principles-based modeling and control is difficult to implement. In this study, a fuzzy controller and a recurrent fuzzy controller are developed for a MIMO process. A fuzzy logic controller is a model-free controller designed based on knowledge about the process. Two types of rule-based fuzzy models are available: the linguistic (Mamdani) model and the Takagi-Sugeno model. Of these two, the Takagi-Sugeno (TS) model has attracted the most attention. The application of fuzzy controllers is limited to static processes due to their feedforward structure. However, most real-time processes are dynamic and require the history of input/output data. In order to store the past values, a memory unit is needed, which is introduced by the recurrent structure. The proposed recurrent fuzzy structure is used to develop a controller for a two-tank heating process. Both controllers are designed and implemented in a real-time environment and their performance is compared.
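
    A zero-order Takagi-Sugeno controller of the kind favoured above computes a firing-strength-weighted average of crisp rule consequents. A minimal single-input sketch with invented memberships and power levels (not the controller from the study):

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def ts_controller(error):
    """Zero-order Takagi-Sugeno rules for a single heating loop (illustrative
    memberships and consequents, not tuned for any real plant):
    negative error -> heater off, zero -> hold, positive -> full power."""
    rules = [
        (tri(error, -10, -5, 0), 0.0),   # too hot: power off
        (tri(error, -5, 0, 5), 50.0),    # on target: medium power
        (tri(error, 0, 5, 10), 100.0),   # too cold: full power
    ]
    num = sum(w * u for w, u in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0     # weighted average of consequents

print(ts_controller(2.5))  # halfway between hold and heat -> 75.0
```

    A recurrent variant, as proposed in the study, would additionally feed stored past inputs/outputs into the rule antecedents; this feedforward sketch omits that memory unit.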

  11. Image-analysis techniques for investigating localized corrosion processes

    International Nuclear Information System (INIS)

    Quinn, M.J.; Bailey, M.G.; Ikeda, B.M.; Shoesmith, D.W.

    1993-12-01

    We have developed a procedure for determining the mode and depth of penetration of localized corrosion by combining metallography and image analysis of corroded coupons. Two techniques, involving either a face-profiling or an edge-profiling procedure, have been developed. In the face-profiling procedure, successive surface grindings and image analyses were performed until corrosion was no longer visible. In this manner, the distribution of corroded sites on the surface and the total area of the surface corroded were determined as a function of depth into the specimen. In the edge-profiling procedure, surface grinding exposed successive cross sections of the corroded region. Image analysis of the cross section quantified the distribution of depths across the corroded section, and a three-dimensional distribution of penetration depths was obtained. To develop these procedures, we used artificially creviced Grade-2 titanium specimens that were corroded in saline solutions containing various amounts of chloride maintained at various fixed temperatures (105 to 150 degrees C) using a previously developed galvanic-coupling technique. We discuss some results from these experiments to illustrate how the procedures developed can be applied to a real corroded system. (author). 6 refs., 4 tabs., 21 figs
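
    The edge-profiling step, reading a distribution of penetration depths off a corroded cross section, can be sketched on a binary mask. The toy mask below is illustrative, not data from the coupons:

```python
import numpy as np

def penetration_profile(mask):
    """For a binary cross-section image (rows = depth below the surface,
    True = corroded), return the deepest corroded row in each column,
    i.e. the penetration depth profile in pixels."""
    rows = np.arange(mask.shape[0])[:, None]
    return np.where(mask, rows + 1, 0).max(axis=0)

# Toy cross-section: four columns with corrosion reaching depths 3, 0, 2, 4.
mask = np.array([
    [1, 0, 1, 1],
    [1, 0, 1, 1],
    [1, 0, 0, 1],
    [0, 0, 0, 1],
], dtype=bool)
profile = penetration_profile(mask)
print(profile.tolist())  # [3, 0, 2, 4]
print(profile.max())     # maximum penetration depth in pixels: 4
```

    Repeating this over successive grindings, as in the paper's procedure, builds up the three-dimensional distribution of penetration depths.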

  12. Signal and array processing techniques for RFID readers

    Science.gov (United States)

    Wang, Jing; Amin, Moeness; Zhang, Yimin

    2006-05-01

    Radio Frequency Identification (RFID) has recently attracted much attention in both the technical and business communities. It has found wide applications in, for example, toll collection, supply-chain management, access control, localization tracking, real-time monitoring, and object identification. Situations may arise where the movement directions of the tagged RFID items through a portal are of interest and must be determined. Doppler estimation may prove complicated or impractical to perform by RFID readers. Several alternative approaches, including the use of an array of sensors with arbitrary geometry, can be applied. In this paper, we consider direction-of-arrival (DOA) estimation techniques for application to near-field narrowband RFID problems. Particularly, we examine the use of a pair of RFID antennas to track moving RFID tagged items through a portal. With two antennas, the near-field DOA estimation problem can be simplified to a far-field problem, yielding a simple way of identifying the direction of the tag movement, where only one parameter, the angle, needs to be considered. In this case, tracking the moving direction of the tag simply amounts to computing the spatial cross-correlation between the data samples received at the two antennas. It is pointed out that the radiation patterns of the reader and tag antennas, particularly their phase characteristics, have a significant effect on the performance of DOA estimation. Indoor experiments are conducted in the Radar Imaging and RFID Labs at Villanova University for validating the proposed technique for target movement direction estimation.
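
    Reducing the near-field DOA problem to a far-field two-antenna problem, as described above, amounts to reading the tag angle off the phase of the spatial cross-correlation between the two antenna outputs. A hedged simulation sketch follows; the carrier frequency, antenna spacing, and SNR are assumed values, not parameters from the paper.

```python
import numpy as np

# Far-field model: a tag at angle theta produces a phase difference
# dphi = 2*pi*d*sin(theta)/lambda between the two reader antennas.
C = 3e8
FREQ = 915e6        # assumed UHF RFID carrier
LAM = C / FREQ
D = 0.5 * LAM       # assumed half-wavelength antenna spacing

def simulate_snapshots(theta, n=64, snr_db=20, seed=0):
    """Generate noisy baseband snapshots at the two antennas for a tag at theta."""
    rng = np.random.default_rng(seed)
    phase = 2 * np.pi * D * np.sin(theta) / LAM
    s = np.exp(1j * 2 * np.pi * rng.random(n))  # unit-modulus tag returns
    amp = 10 ** (-snr_db / 20)
    x1 = s + amp * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
    x2 = s * np.exp(1j * phase) + amp * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
    return x1, x2

def estimate_angle(x1, x2):
    """Spatial cross-correlation of the two antenna outputs; the phase of the
    correlation maps back to the direction of arrival."""
    r = np.mean(x2 * np.conj(x1))
    return np.arcsin(np.angle(r) * LAM / (2 * np.pi * D))

x1, x2 = simulate_snapshots(np.deg2rad(25))
print(np.rad2deg(estimate_angle(x1, x2)))  # close to 25 degrees
```

    Tracking the sign of the estimated angle over time then gives the movement direction of the tag through the portal.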

  13. Creation of structured documentation templates using Natural Language Processing techniques.

    Science.gov (United States)

    Kashyap, Vipul; Turchin, Alexander; Morin, Laura; Chang, Frank; Li, Qi; Hongsermeier, Tonya

    2006-01-01

    Structured Clinical Documentation is a fundamental component of the healthcare enterprise, linking both clinical (e.g., electronic health record, clinical decision support) and administrative functions (e.g., evaluation and management coding, billing). One of the challenges in creating good quality documentation templates has been the inability to address specialized clinical disciplines and adapt to local clinical practices. A one-size-fits-all approach leads to poor adoption and inefficiencies in the documentation process. On the other hand, the cost associated with manual generation of documentation templates is significant. Consequently there is a need for at least partial automation of the template generation process. We propose an approach and methodology for the creation of structured documentation templates for diabetes using Natural Language Processing (NLP).
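    As a very reduced illustration of partially automating template generation, one could mine recurring phrases from existing notes as candidate template fields (the toy notes and the frequency threshold below are invented; the paper's NLP pipeline is far richer):

```python
from collections import Counter
import re

# Toy corpus of invented clinic-note snippets
notes = [
    "HPI: ... Blood glucose 180. A1c 8.2. Plan: increase metformin.",
    "HPI: ... Blood glucose 150. Foot exam normal. Plan: continue diet.",
    "HPI: ... A1c 7.1. Foot exam normal. Plan: recheck in 3 months.",
]

phrases = Counter()
for note in notes:
    text = re.sub(r"\d+(\.\d+)?", "<value>", note)   # abstract numbers away
    for p in re.split(r"[.:]", text):
        p = p.strip().lower()
        if p:
            phrases[p] += 1

# Phrases recurring in at least two notes become candidate template fields
fields = [p for p, c in phrases.most_common() if c >= 2]
```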

  14. Parallel processing based decomposition technique for efficient collaborative optimization

    International Nuclear Information System (INIS)

    Park, Hyung Wook; Kim, Sung Chan; Kim, Min Soo; Choi, Dong Hoon

    2001-01-01

    In practical design studies, most designers solve multidisciplinary problems involving large and complex design systems. These multidisciplinary problems can have hundreds of analyses and thousands of variables. The sequence of processes used to solve these problems affects the speed of the total design cycle. Thus it is very important for designers to reorder the original design processes to minimize total computational cost. This is accomplished by decomposing a large multidisciplinary problem into several MultiDisciplinary Analysis SubSystems (MDASS) and processing them in parallel. This paper proposes a new strategy for parallel decomposition of multidisciplinary problems that raises design efficiency by using a genetic algorithm, and shows the relationship between decomposition and Multidisciplinary Design Optimization (MDO) methodology
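    The decomposition idea can be sketched as a small genetic algorithm that assigns analyses to one of two subsystem groups so as to minimize cross-subsystem coupling. Everything below (the coupling matrix, the balance penalty, the GA parameters) is invented for illustration and is not the authors' formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented coupling matrix: C[i, j] = data exchanged between analyses i, j.
# Two tightly coupled clusters {0..3} and {4..7} with weak cross links.
n = 8
C = np.zeros((n, n))
C[:4, :4] = 5.0
C[4:, 4:] = 5.0
C += rng.uniform(0.0, 0.5, (n, n))
C = np.triu(C, 1)
C = C + C.T

def cost(assign):
    """Coupling crossing the subsystem boundary, plus a balance penalty
    (without it the GA would trivially put everything in one subsystem)."""
    cross = C[assign[:, None] != assign[None, :]].sum() / 2.0
    return cross + 10.0 * abs(assign.sum() - n / 2.0)

# Minimal genetic algorithm over bit-strings (bit i = subsystem of analysis i)
pop = rng.integers(0, 2, (30, n))
for _ in range(60):
    pop = pop[np.argsort([cost(ind) for ind in pop])]   # elitist sort
    parents, children = pop[:10], []
    for _ in range(20):
        a, b = parents[rng.integers(0, 10)], parents[rng.integers(0, 10)]
        cut = rng.integers(1, n)                        # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        if rng.random() < 0.2:
            child[rng.integers(0, n)] ^= 1              # bit-flip mutation
        children.append(child)
    pop = np.vstack([parents, children])

best = min(pop, key=cost)
```

    On this toy instance the GA should recover the two natural clusters as the two subsystems.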

  15. Video fluoroscopic techniques for the study of Oral Food Processing

    Science.gov (United States)

    Matsuo, Koichiro; Palmer, Jeffrey B.

    2016-01-01

    Food oral processing and pharyngeal food passage cannot be observed directly from outside the body without instrumental methods. Videofluoroscopy (x-ray video recording) reveals the movement of oropharyngeal anatomical structures in two dimensions. By adding a radiopaque contrast medium, the motion and shape of the food bolus can also be visualized, providing critical information about the mechanisms of eating, drinking, and swallowing. For quantitative analysis of the kinematics of oral food processing, radiopaque markers are attached to the teeth, tongue or soft palate. This approach permits kinematic analysis with a variety of textures and consistencies, both solid and liquid. Fundamental mechanisms of food oral processing are clearly observed with videofluoroscopy in lateral and anteroposterior projections. PMID:27213138

  16. Processing and analysis techniques involving in-vessel material generation

    Science.gov (United States)

    Schabron, John F [Laramie, WY; Rovani, Jr., Joseph F.

    2012-09-25

    In at least one embodiment, the inventive technology relates to in-vessel generation of a material from a solution of interest as part of a processing and/or analysis operation. Preferred embodiments of the in-vessel material generation (e.g., in-vessel solid material generation) include precipitation; in certain embodiments, analysis and/or processing of the solution of interest may include dissolution of the material, perhaps as part of a successive dissolution protocol using solvents of increasing dissolving ability. Applications include, but are by no means limited to, estimation of a coking onset and solution (e.g., oil) fractionating.

  17. Product and process effectiveness using performance-based auditing techniques

    International Nuclear Information System (INIS)

    Horseman, M.L.

    1995-01-01

    Focus is the backbone of genius. Focus is the lifeblood of adequate products and effective processes. Focus is the theme of Performance-Based Audits (PBA). The Civilian Radioactive Waste Management (CRWM) Program is using the PBA tool extensively to focus on the evaluation of product adequacy and process effectiveness. The term Performance-Based Audit has been around for several years; however, the approach presented here for the systematic end-product selection, planning, and measurement of adequacy and effectiveness is new and innovative

  18. Particle tracking in sophisticated CAD models for simulation purposes

    International Nuclear Information System (INIS)

    Sulkimo, J.; Vuoskoski, J.

    1995-01-01

    The transfer of physics detector models from computer aided design systems to physics simulation packages like GEANT suffers from certain limitations. In addition, GEANT is not able to perform particle tracking in CAD models. We describe an application which is able to perform particle tracking in boundary models constructed in CAD systems. The transfer file format used is the new international standard, STEP. The design and implementation of the application was carried out using object-oriented techniques. It will be integrated in the future object-oriented version of GEANT. (orig.)

  19. Particle tracking in sophisticated CAD models for simulation purposes

    Science.gov (United States)

    Sulkimo, J.; Vuoskoski, J.

    1996-02-01

    The transfer of physics detector models from computer aided design systems to physics simulation packages like GEANT suffers from certain limitations. In addition, GEANT is not able to perform particle tracking in CAD models. We describe an application which is able to perform particle tracking in boundary models constructed in CAD systems. The transfer file format used is the new international standard, STEP. The design and implementation of the application was carried out using object-oriented techniques. It will be integrated in the future object-oriented version of GEANT.

  20. Thin Slits Manufacturing Process Using Electro Discharge Technique

    Czech Academy of Sciences Publication Activity Database

    Hošek, Jan

    -, č. 40 (2011), s. 175-178 ISSN 1584-5982 R&D Projects: GA AV ČR IAA200760905 Institutional research plan: CEZ:AV0Z20760514 Keywords : thin slit * EDM process * manufacturing Subject RIV: JR - Other Machinery

  1. The application of an electrochemical process as a decontamination technique

    International Nuclear Information System (INIS)

    Bond, R.D.

    1985-10-01

    A method of electropolishing has been developed for reducing surface radioactive contamination. The theory, practice and equipment involved are described, together with experimental work to test the process. Results are given of preliminary tests, and it is concluded that electropolishing in phosphoric acid electrolyte is an effective method for the removal of radioactive particulate contamination from metal surfaces. (UK)

  2. Using Motivational Interviewing Techniques to Address Parallel Process in Supervision

    Science.gov (United States)

    Giordano, Amanda; Clarke, Philip; Borders, L. DiAnne

    2013-01-01

    Supervision offers a distinct opportunity to experience the interconnection of counselor-client and counselor-supervisor interactions. One product of this network of interactions is parallel process, a phenomenon by which counselors unconsciously identify with their clients and subsequently present to their supervisors in a similar fashion…

  3. Characterization of stress degradation products of benazepril by using sophisticated hyphenated techniques.

    Science.gov (United States)

    Narayanam, Mallikarjun; Sahu, Archana; Singh, Saranjit

    2013-01-04

    Benazepril, an anti-hypertensive drug, was subjected to forced degradation studies. The drug was unstable under hydrolytic conditions, yielding benazeprilat, which is a known major degradation product (DP) and an active metabolite. It also underwent photochemical degradation under acidic and neutral pH conditions, resulting in multiple minor DPs. The products were separated on a reversed-phase (C18) column in gradient mode and subjected to LC-MS and LC-NMR studies. Initially, a comprehensive mass fragmentation pathway of the drug was established with the support of high-resolution mass spectrometric (HR-MS) and multistage tandem mass spectrometric (MS(n)) data. The DPs were also subjected to LC-MS/TOF studies to obtain their accurate masses. In addition, on-line H/D exchange data were obtained to ascertain the number of exchangeable hydrogens in each molecule. LC-(1)H NMR and LC-2D NMR data were additionally acquired in a fraction loop mode. All of this information was successfully employed in the characterization of the DPs. A complete degradation pathway of the drug was also established. Copyright © 2012 Elsevier B.V. All rights reserved.

  4. Processing of ZnO nanocrystals by solochemical technique

    International Nuclear Information System (INIS)

    Gusatti, M.; Speckhahn, R.; Silva, L.A.; Rosario, J.A.; Lima, R.B.; Kuhnen, N.C.; Riella, H.G.; Campos, C.E.M.

    2009-01-01

    In the present work, we report the synthesis of high-quality ZnO nanocrystals by the solochemical technique. This synthetic strategy has advantages over other methods of producing nanostructures in terms of low cost, efficiency, simplicity and uniformity of crystal structure. Zinc chloride solution at room temperature was mixed with sodium hydroxide solution at 50°C to produce ZnO nanocrystals. Transmission electron microscopy (TEM) and X-ray powder diffraction (XRD) were used to characterize the ZnO nanocrystals obtained. The structure of ZnO was refined by the Rietveld method from the X-ray diffraction data. These methods showed that the product consisted of pure ZnO nanocrystals with a predominantly rod-like morphology. (author)

  5. Convergent engineering techniques for management of nuclear processes

    International Nuclear Information System (INIS)

    Carabulea, A.; Popa, I.

    1995-01-01

    The paper briefly presents the concept of convergent arhemo-systematical engineering and its advantages in comparison with classical methods of research, design and manufacture. The application of convergent engineering presupposes the use of advanced methods, techniques and equipment appropriate to the domain and specific to the corresponding branch of computer science. Starting from the principles and concept of convergent engineering, the paper proposes two models applicable to new products as well as to improving and optimizing existing ones. The models are based on two levels of feedback corresponding to two levels of control, and they assume the use of expert and robot-expert systems. The economic efficiency of applying the convergent engineering method is evaluated for the case of a nuclear power plant by calculating the main technical and economic indicators. (Author) 2 Figs., 5 Refs

  6. Comparative study of image restoration techniques in forensic image processing

    Science.gov (United States)

    Bijhold, Jurrien; Kuijper, Arjan; Westhuis, Jaap-Harm

    1997-02-01

    In this work we investigated the forensic applicability of some state-of-the-art image restoration techniques for digitized video-images and photographs: classical Wiener filtering, constrained maximum entropy, and some variants of constrained minimum total variation. Basic concepts and experimental results are discussed. Because all methods appeared to produce different results, a discussion is given of which method is the most suitable, depending on the image objects that are questioned, prior knowledge and type of blur and noise. Constrained minimum total variation methods produced the best results for test images with simulated noise and blur. In cases where images are the most substantial part of the evidence, constrained maximum entropy might be more suitable, because its theoretical basis predicts a restoration result that shows the most likely pixel values, given all the prior knowledge used during restoration.
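    Of the methods compared, classical Wiener filtering is the most compact to sketch. A frequency-domain version, under invented test conditions (a box blur and a hand-tuned noise-to-signal constant), might read:

```python
import numpy as np

def wiener_deconvolve(blurred, psf, nsr=1e-4):
    """Classical Wiener restoration in the frequency domain; nsr plays
    the role of the noise-to-signal power ratio (a tuning constant)."""
    H = np.fft.fft2(psf, s=blurred.shape)
    G = np.fft.fft2(blurred)
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)        # Wiener filter
    return np.real(np.fft.ifft2(W * G))

# Synthetic test: blur a simple image with a 5x5 box PSF, then restore
img = np.zeros((64, 64))
img[24:40, 24:40] = 1.0
psf = np.ones((5, 5)) / 25.0
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(psf, s=img.shape)))
restored = wiener_deconvolve(blurred, psf)
```

    In practice the choice of nsr encodes the prior knowledge the authors discuss; the constrained maximum entropy and total variation methods replace this constant with explicit regularizers.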

  7. Effects of novel processing techniques on glucosinolates and membrane associated myrosinases in broccoli

    OpenAIRE

    Frandsen, Heidi Blok; Markedal, Keld Ejdrup; Martín Belloso, Olga; Sánchez Vega, Rogelio; Soliva-Fortuny, Robert; Sørensen, Hilmer; Sørensen, Susanne; Sørensen, Jens Christian

    2014-01-01

    High pressure/high temperature (HP/HT) and pulsed electric field (PEF) treatment of food are among the novel processing techniques considered as alternatives to conventional thermal food processing. Introduction of new processing techniques with fast and gentle processing steps may reveal new possibilities for preservation of healthy bioactive compounds in processed food. However, effects on various food components due to autolysis and fast reactions prior to the applied HP/HT or PEF need to ...

  8. Using natural language processing techniques to inform research on nanotechnology

    Directory of Open Access Journals (Sweden)

    Nastassja A. Lewinski

    2015-07-01

    Full Text Available Literature in the field of nanotechnology is exponentially increasing with more and more engineered nanomaterials being created, characterized, and tested for performance and safety. With the deluge of published data, there is a need for natural language processing approaches to semi-automate the cataloguing of engineered nanomaterials and their associated physico-chemical properties, performance, exposure scenarios, and biological effects. In this paper, we review the different informatics methods that have been applied to patent mining, nanomaterial/device characterization, nanomedicine, and environmental risk assessment. Nine natural language processing (NLP)-based tools were identified: NanoPort, NanoMapper, TechPerceptor, a Text Mining Framework, a Nanodevice Analyzer, a Clinical Trial Document Classifier, Nanotoxicity Searcher, NanoSifter, and NEIMiner. We conclude with recommendations for sharing NLP-related tools through online repositories to broaden participation in nanoinformatics.

  9. Fuel processing requirements and techniques for fuel cell propulsion power

    Science.gov (United States)

    Kumar, R.; Ahmed, S.; Yu, M.

    Fuels for fuel cells in transportation systems are likely to be methanol, natural gas, hydrogen, propane, or ethanol. Fuels other than hydrogen will need to be reformed to hydrogen on-board the vehicle. The fuel reformer must meet stringent requirements for weight and volume, product quality, and transient operation. It must be compact and lightweight, must produce low levels of CO and other byproducts, and must have rapid start-up and good dynamic response. Catalytic steam reforming, catalytic or noncatalytic partial oxidation reforming, or some combination of these processes may be used. This paper discusses salient features of the different kinds of reformers and describes the catalysts and processes being examined for the oxidation reforming of methanol and the steam reforming of ethanol. Effective catalysts and reaction conditions for the former have been identified; promising catalysts and reaction conditions for the latter are being investigated.

  10. Software factory techniques applied to process control at CERN

    OpenAIRE

    Dutour, Mathias D

    2007-01-01

    The CERN Large Hadron Collider (LHC) requires constant monitoring and control of quantities of parameters to guarantee operational conditions. For this purpose, a methodology called UNICOS (UNIfied Industrial COntrols Systems) has been implemented to standardize the design of process control applications. To further accelerate the development of these applications, we migrated our existing UNICOS tooling suite toward a software factory in charge of assembling project, domain and technical inf...

  11. Experimental study of bubbly flow using image processing techniques

    Energy Technology Data Exchange (ETDEWEB)

    Fu, Yucheng, E-mail: ycfu@vt.edu; Liu, Yang, E-mail: liu130@vt.edu

    2016-12-15

    This paper presents an experimental study of bubbly flows at relatively high void fractions using an advanced image processing method. Bubble overlapping is a common problem in such flows and the past studies often treat the overlapping bubbles as a whole, which introduces considerable measurement uncertainties. In this study, a hybrid method combining intersection point detection and watershed segmentation is used to separate the overlapping bubbles. In order to reconstruct bubbles from separated segments, a systematic procedure is developed which can preserve more features captured in the raw image compared to the simple ellipse fitting method. The distributions of void fraction, interfacial area concentration, number density and velocity are obtained from the extracted bubble information. High-speed images of air-water bubbly flows are acquired and processed for eight test runs conducted in a 30 mm × 10 mm rectangular channel. The developed image processing scheme can effectively separate overlapping bubbles and the results compare well with the measurements by the gas flow meter and double-sensor conductivity probe. The development of flows in transverse and mainstream directions are analyzed and compared with the prediction made by the one-dimensional interfacial area transport equation (IATE) and the bubble number density transport equation.

  12. Towards immersive designing of production processes using virtual reality techniques

    Directory of Open Access Journals (Sweden)

    Domagoj Buzjak

    2018-03-01

    Full Text Available The article provides a novel approach to the implementation of virtual reality within the planning and design of manual processes and systems. The hardware and software required to perform different production tasks, especially assembly tasks, in a virtual environment, using CAD parts as interactive elements, are presented. For the CAD parts, the format conversion problem is comprehensively described and solved using format conversion software, overcoming the present poor data connectivity between CAD systems and VR hardware and software. Two examples of work processes were implemented in a virtual environment: peg-in-hole and wall socket assembly. In the latter case, the traditional approach to planning manual assembly tasks using the predetermined motion time system MTM-2 is compared with a modern approach in which the assembly task is performed entirely within a virtual environment. The comparison comprises a discussion of the assembly task execution times. In addition, general and specific advantages and disadvantages that arise in the immersive design of production processes using virtual reality are presented, as well as reflections on teamwork and collaborative man-machine work. Finally, novel technologies are proposed to overcome the main problems that occur when implementing VR, such as time-consuming scene definition and tedious CAD data conversion.

  13. A New Technique For Information Processing of CLIC Technical Documentation

    CERN Document Server

    Tzermpinos, Konstantinos

    2013-01-01

    The scientific work presented in this paper could be described as a novel, systemic approach to the organization of CLIC documentation: the processing of various sets of archived data found on various CERN archiving services into a friendlier, more organized form. From a physics standpoint, this is equivalent to having an initial system characterized by high entropy which, after some transformation of energy and matter, produces a final system of reduced entropy. This reduction in entropy can be considered valid for open systems only, which are sub-systems of grander isolated systems in which the total entropy always increases. Thus, using elements of information theory, systems theory and thermodynamics, the unorganized data awaiting organization into a higher form are modeled as an initial open sub-system with increased entropy which, after the processing of information, produces a final system with decreased entropy. This systemic approach to the ...

  14. Constructing a Sophistication Index as a Method of Market ...

    African Journals Online (AJOL)

    This study investigates the process of index construction as a means of measuring a hypothetical construct that can typically not be measured by a single question or item and applying it as a method of market segmentation. The availability of incidental secondary data provided a relevant quantitative basis to illustrate this ...

  15. Solving Real-Life Problems: Future Mobile Technology Sophistication

    Directory of Open Access Journals (Sweden)

    FARHAN SHAFIQ

    2016-07-01

    Full Text Available Almost all domains of everyday life take advantage of the latest technologies to enhance their processes, procedures and operations. The integration of technological innovations provides ease of access, flexibility, transparency, reliability and speed for the processes and procedures concerned. The rapid growth of ICT (Information and Communication Technology) and MT (Mobile Technology) provides the opportunity to redesign and re-engineer the processes and procedures of routine life activities. Technology integration and adoption in routine life activities may serve as a compensatory mechanism to assist the population in different ways, such as monitoring older adults and children at home, providing security assistance, monitoring and recording patients' vital signs automatically, controlling and monitoring equipment and devices, and providing assistance in shopping, banking and education. Disasters happen suddenly and destroy everything indiscriminately. Adoption and integration of the latest technologies, including ICT and MT, can enhance current disaster management processes, procedures and operations. This research study focuses on the impact of the latest and emerging technology trends in routine life activities and on their potential to improve and enhance disaster management activities. MT provides a promising platform for facilitating people in enhancing their routine life activities. This research argues that the integration and adoption of mobile computing in the disaster management domain can enhance disaster management activities, promising reduced error, quicker information assembly, and faster, better-prioritized response.

  16. Solving real-life problems: future mobile technology sophistication

    International Nuclear Information System (INIS)

    Shafiq, F.; Ahsan, K.; Nadeem, A.

    2016-01-01

    Almost all domains of everyday life take advantage of the latest technologies to enhance their processes, procedures and operations. The integration of technological innovations provides ease of access, flexibility, transparency, reliability and speed for the processes and procedures concerned. The rapid growth of ICT (Information and Communication Technology) and MT (Mobile Technology) provides the opportunity to redesign and re-engineer the processes and procedures of routine life activities. Technology integration and adoption in routine life activities may serve as a compensatory mechanism to assist the population in different ways, such as monitoring older adults and children at home, providing security assistance, monitoring and recording patients' vital signs automatically, controlling and monitoring equipment and devices, and providing assistance in shopping, banking and education. Disasters happen suddenly and destroy everything indiscriminately. Adoption and integration of the latest technologies, including ICT and MT, can enhance current disaster management processes, procedures and operations. This research study focuses on the impact of the latest and emerging technology trends in routine life activities and on their potential to improve and enhance disaster management activities. MT provides a promising platform for facilitating people in enhancing their routine life activities. This research argues that the integration and adoption of mobile computing in the disaster management domain can enhance disaster management activities, promising reduced error, quicker information assembly, and faster, better-prioritized response. (author)

  17. Techniques for automating the process of as-built reconciliation

    International Nuclear Information System (INIS)

    Skruch, B.R.; Brandt, G.B.; Denes, L.J.

    1984-01-01

    Techniques are being developed for the acquisition, recording, and evaluation of as-built measurements of piping systems in nuclear power plants. The goal is to improve the efficiency with which as-built dimensions and configuration can be compared to as-designed dimensions and configuration. The approach utilizes an electronic digital "ruler" capable of measuring distances to 100 feet with a resolution of 1/100 of a foot. This ruler interfaces to a hand-held computer. This "electronic notebook" also accepts alphanumeric input from a keyboard and replaces the clipboard and pencil currently used. The electronic notebook, in turn, can transfer its data directly to a host mini or mainframe computer. Once the data are resident on the larger computer, they are converted to a format compatible with an existing database system used for piping analysis and design. Using accepted tolerances for as-built deviations, the as-built data are then automatically compared to as-designed data. If reanalysis is required, the as-built data are in a format compatible with existing computer analysis codes. This paper discusses the operation and interfacing of the electronic ruler, the general design of the data structures in the electronic notebook, the design of the mini-computer software, and the results of preliminary testing of the system
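    The automated comparison step reduces to a per-segment tolerance check. A sketch with invented segment ids and an illustrative 0.05 ft tolerance (neither is from the cited system):

```python
# Illustrative tolerance; real projects take this from the piping code
TOL_FT = 0.05

def flag_deviations(as_designed, as_built, tol=TOL_FT):
    """Return segment ids missing from the as-built survey or whose
    measured run length deviates from design by more than tol (feet)."""
    flagged = []
    for seg, length in as_designed.items():
        if seg not in as_built or abs(as_built[seg] - length) > tol:
            flagged.append(seg)
    return flagged

designed = {"A-01": 12.00, "A-02": 3.50, "B-07": 8.25}
built = {"A-01": 12.03, "A-02": 3.62, "B-07": 8.25}
deviations = flag_deviations(designed, built)
```

    Only flagged segments would then be routed to reanalysis.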

  18. When not to copy: female fruit flies use sophisticated public information to avoid mated males

    Science.gov (United States)

    Loyau, Adeline; Blanchet, Simon; van Laere, Pauline; Clobert, Jean; Danchin, Etienne

    2012-10-01

    Semen limitation (lack of semen to fertilize all of a female's eggs) imposes high fitness costs on female partners. Females should therefore avoid mating with semen-limited males. This can be achieved by using public information extracted from watching individual males' previous copulating activities. This adaptive preference should be flexible, given that semen limitation is temporary. We first demonstrate that the number of offspring produced by male Drosophila melanogaster gradually decreases over successive copulations. We then show that females avoid mating with males they have just watched copulating and that visual public cues are sufficient to elicit this response. Finally, after males were given time to replenish their sperm reserves, females no longer avoided the males they had previously seen copulating. These results suggest that female fruit flies may have evolved sophisticated behavioural processes of resistance to semen-limited males, and demonstrate unsuspected adaptive context-dependent mate choice in an invertebrate.

  19. Deposition and post-processing techniques for transparent conductive films

    Energy Technology Data Exchange (ETDEWEB)

    Christoforo, Mark Greyson; Mehra, Saahil; Salleo, Alberto; Peumans, Peter

    2017-07-04

    In one embodiment, a method is provided for fabrication of a semitransparent conductive mesh. A first solution having conductive nanowires suspended therein and a second solution having nanoparticles suspended therein are sprayed toward a substrate, the spraying forming a mist. The mist is processed, while on the substrate, to provide a semitransparent conductive material in the form of a mesh having the conductive nanowires and nanoparticles. The nanoparticles are configured and arranged to direct light passing through the mesh. Connections between the nanowires provide conductivity through the mesh.

  20. Nuclear pulse signal processing techniques based on blind deconvolution method

    International Nuclear Information System (INIS)

    Hong Pengfei; Yang Lei; Qi Zhong; Meng Xiangting; Fu Yanyan; Li Dongcang

    2012-01-01

    This article presents a method for the measurement and analysis of nuclear pulse signals. An FPGA controls a high-speed ADC that samples the nuclear radiation signal, and a USB interface operating in Slave FIFO mode provides high-speed data transmission. LabVIEW is used for online data processing and display, and a blind deconvolution method removes the pulse pile-up introduced during signal acquisition and restores the nuclear pulse signal. Real-time measurements demonstrate the advantages of the method. (authors)

  1. Nuclear pulse signal processing technique based on blind deconvolution method

    International Nuclear Information System (INIS)

    Hong Pengfei; Yang Lei; Fu Tingyan; Qi Zhong; Li Dongcang; Ren Zhongguo

    2012-01-01

    In this paper, we present a method for the measurement and analysis of nuclear pulse signals, with which the pile-up signal is removed, the signal baseline is restored, and the original signal is obtained. The data acquisition system comprises an FPGA, an ADC and a USB interface. The FPGA controls the high-speed ADC to sample the nuclear radiation signal, and the USB interface works in Slave FIFO mode to implement high-speed transmission. Using LabVIEW, the system accomplishes online data processing with the blind deconvolution algorithm, as well as data display. The simulation and experimental results demonstrate the advantages of the method. (authors)
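    For a single-pole (exponential) detector response, deconvolution of pile-up has a particularly simple non-blind form, a two-tap inverse filter. This is a simplification of the setting above, where the blind method does not assume the response is known; the decay constant and event statistics below are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Single-pole detector response h[n] = a**n (assumed known here; a blind
# method would estimate it from the data instead)
a = 0.95
n = 2000
h = a ** np.arange(200)

# Sparse random events; pulses overlap (pile up) when events are close
x = np.zeros(n)
idx = rng.choice(n - 200, size=40, replace=False)
x[idx] = rng.uniform(0.5, 2.0, size=40)

y = np.convolve(x, h)[:n]                    # piled-up pulse train

# Two-tap inverse filter for a single-pole response:
# H(z) = 1/(1 - a*z**-1)  =>  x[n] = y[n] - a*y[n-1]
x_rec = y - a * np.concatenate(([0.0], y[:-1]))
```

    The inverse filter recovers the original impulse train even where pulses overlap heavily, up to a negligible truncation error.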

  2. Variance reduction techniques in the simulation of Markov processes

    International Nuclear Information System (INIS)

    Lessi, O.

    1987-01-01

    We study a functional r of the stationary distribution of a homogeneous Markov chain. It is often difficult or impossible to calculate r analytically, so it is reasonable to estimate r by simulation. A consistent estimator r(n) of r is obtained for a chain with a countable state space. By suitably modifying the estimator r(n), one obtains a new consistent estimator with smaller variance than r(n). The same result is obtained in the case of a finite state space
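    The abstract does not specify the modification, but one standard variance reduction of this kind is the conditional (Rao-Blackwellized) estimator, which replaces f(X_k) by its one-step conditional expectation. A sketch on an invented two-state chain:

```python
import numpy as np

rng = np.random.default_rng(3)

# Invented two-state ergodic chain; f is the indicator of state 1.
# Stationary distribution is pi = (2/3, 1/3), so r = E_pi[f] = 1/3.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
f = np.array([0.0, 1.0])

n = 20000
path = np.empty(n, dtype=int)
x = 0
for k in range(n):
    x = rng.choice(2, p=P[x])
    path[k] = x

r_plain = f[path].mean()          # crude time-average estimator r(n)
# Conditional estimator: average E[f(X_{k+1}) | X_k] = sum_j P[X_k, j] f(j);
# still consistent for r, with smaller per-sample variance
r_cond = (P[path] @ f).mean()
```

    Both estimators converge to r = 1/3; the conditional one averages the smoother values P[X_k, 1] instead of a 0/1 indicator.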

  3. Nuclear analytical techniques and applications to materials processing

    International Nuclear Information System (INIS)

    Blondiaux, G.; Debrun, J.L.

    1993-01-01

    This paper presents the application of Rutherford backscattering spectrometry to thin-film stoichiometry determination and to the optimization of the film elaboration process, in the case of dielectric films (Ge,Pb,O) and ionic conductor films (Na,Al,O). We then present the application of particle-induced gamma emission (PIGE) to the characterization of ternary compounds (B,Si,C) used as coatings to protect composite materials. The last part of this paper describes the determination of oxygen in the bulk of fluoride glasses with charged particle activation analysis. (orig.)

  4. Power plant siting; an application of the nominal group process technique

    International Nuclear Information System (INIS)

    Voelker, A.H.

    1976-01-01

    The application of interactive group processes to the problem of facility siting is examined in this report. Much of the discussion is drawn from experience gained in applying the Nominal Group Process Technique, an interactive group technique, to the identification and rating of factors important in siting nuclear power plants. Through this experience, interactive group process techniques are shown to facilitate the incorporation of the many diverse factors that play a role in siting. In direct contrast to mathematical optimization, commonly presented as the ultimate siting technique, the Nominal Group Process Technique described here allows the incorporation of social, economic, and environmental factors and the quantification of their relative importance. The report concludes that the application of interactive group process techniques to planning and resource management will affect the consideration of social, economic, and environmental concerns and ultimately lead to more rational and credible siting decisions

  5. Modeling rainfall-runoff process using soft computing techniques

    Science.gov (United States)

    Kisi, Ozgur; Shiri, Jalal; Tombul, Mustafa

    2013-02-01

    The rainfall-runoff process was modeled for a small catchment in Turkey, using 4 years (1987-1991) of measured rainfall and runoff values. The models used in the study were Artificial Neural Networks (ANNs), the Adaptive Neuro-Fuzzy Inference System (ANFIS) and Gene Expression Programming (GEP), which are Artificial Intelligence (AI) approaches. The applied models were trained and tested using various combinations of the independent variables. The goodness of fit for the models was evaluated in terms of the coefficient of determination (R2), root mean square error (RMSE), mean absolute error (MAE), coefficient of efficiency (CE) and scatter index (SI). A comparison was also made between these models and a traditional Multi Linear Regression (MLR) model. The study provides evidence that GEP (with RMSE=17.82 l/s, MAE=6.61 l/s, CE=0.72 and R2=0.978) is capable of modeling the rainfall-runoff process and is a viable alternative to the other applied artificial intelligence and MLR time-series methods.
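
    The goodness-of-fit measures named in this record have standard definitions; a minimal sketch (the formulas below are the textbook ones, not taken from the paper itself):

```python
import numpy as np

def fit_stats(obs, sim):
    """RMSE, MAE, coefficient of efficiency (Nash-Sutcliffe),
    R^2 and scatter index for observed vs. simulated series."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    err = sim - obs
    rmse = np.sqrt(np.mean(err ** 2))
    mae = np.mean(np.abs(err))
    ce = 1.0 - np.sum(err ** 2) / np.sum((obs - obs.mean()) ** 2)
    r2 = np.corrcoef(obs, sim)[0, 1] ** 2
    si = rmse / obs.mean()
    return rmse, mae, ce, r2, si
```

    Note that CE penalizes bias while R2 does not, which is why the record can report a high R2 (0.978) alongside a more modest CE (0.72).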

  6. Employing image processing techniques for cancer detection using microarray images.

    Science.gov (United States)

    Dehghan Khalilabad, Nastaran; Hassanpour, Hamid

    2017-02-01

    Microarray technology is a powerful genomic tool for simultaneously studying and analyzing the behavior of thousands of genes. The analysis of images obtained from this technology plays a critical role in the detection and treatment of diseases. The aim of the current study is to develop an automated system for analyzing data from microarray images in order to detect cancerous cases. The proposed system consists of three main phases, namely image processing, data mining, and detection of the disease. The image processing phase performs operations such as refining image rotation, gridding (locating genes) and extracting raw data from images. The data mining phase includes normalizing the extracted data and selecting the more effective genes. Finally, cancerous cells are recognized from the extracted data. To evaluate the performance of the proposed system, a microarray database is employed which includes breast cancer, myeloid leukemia and lymphoma data from the Stanford Microarray Database. The results indicate that the proposed system is able to identify the type of cancer from the data set with an accuracy of 95.45%, 94.11%, and 100%, respectively. Copyright © 2017 Elsevier Ltd. All rights reserved.
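
    Gridding (locating the spots) is commonly done from projection profiles of the image; a toy sketch on a synthetic spot grid (the record does not specify the system's actual algorithm, so the profile-peak approach here is an assumption):

```python
import numpy as np

def make_synthetic_array(n=3, cell=20, radius=4):
    """Synthetic microarray image: an n x n grid of bright spots."""
    img = np.zeros((n * cell, n * cell))
    yy, xx = np.mgrid[0:n * cell, 0:n * cell]
    for r in range(n):
        for c in range(n):
            cy, cx = r * cell + cell // 2, c * cell + cell // 2
            img[(yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2] = 1.0
    return img

def grid_centers(img, n):
    """Locate spot rows/columns from projection profiles: sum the
    intensity along each axis and keep the n strongest local maxima."""
    def peaks(profile, n):
        cand = [i for i in range(1, len(profile) - 1)
                if profile[i] > 0
                and profile[i] >= profile[i - 1]
                and profile[i] >= profile[i + 1]]
        cand.sort(key=lambda i: profile[i], reverse=True)
        return sorted(cand[:n])
    return peaks(img.sum(axis=1), n), peaks(img.sum(axis=0), n)
```

    Real microarray images need the rotation refinement mentioned in the abstract before projection profiles become sharp; this sketch assumes an already aligned grid.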

  7. Improving Accuracy of Processing by Adaptive Control Techniques

    Directory of Open Access Journals (Sweden)

    N. N. Barbashov

    2016-01-01

    Full Text Available When machining work-pieces, the scatter range of the work-piece dimensions drifts toward the tolerance limit in response to errors. To improve the accuracy of machining and prevent products from defects it is necessary to diminish the machining error components, i.e. to improve the accuracy of the machine tool, tool life, rigidity of the system, and accuracy of adjustment. It is also necessary to provide on-machine adjustment after a certain time. However, an increasing number of readjustments reduces performance, and high machine and tool requirements lead to a significant increase in the machining cost. To improve accuracy and machining rate, various devices of active control (in-process gaging devices), as well as controlled machining through adaptive systems for technological process control, are now becoming widely used. The accuracy improvement in this case is reached by compensation of a majority of technological errors. The sensors of active control can provide an improvement of the accuracy of processing by one or two quality classes, and simultaneous operation of several machines. For efficient use of sensors of active control it is necessary to develop accuracy control methods that introduce the appropriate adjustments. Methods based on the moving average appear to be the most promising for accuracy control, since they contain information on the change in the last several measured values of the parameter under control. In the proposed method, the first three members of the sequence of deviations remain unchanged, i.e. x'_1 = x_1, x'_2 = x_2, x'_3 = x_3. Each subsequent i-th member is then calculated as x'_i = x_i - k*x̄_i, where x̄_i = (x'_(i-1) + x'_(i-2) + x'_(i-3))/3 is the average of the three previously corrected members. As a criterion for the estimate of the control
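
    A sketch of the moving-average adjustment, assuming the recurrence x'_i = x_i - k*x̄_i with x̄_i the mean of the three previously corrected values (the record's formulas are garbled in this extract, so this reading is an assumption):

```python
def smooth_adjust(xs, k=0.5):
    """Keep the first three deviations unchanged, then correct each
    later one against the mean of the three previously corrected
    values: x'_i = x_i - k * mean(x'_{i-1}, x'_{i-2}, x'_{i-3})."""
    out = list(xs[:3])
    for x in xs[3:]:
        out.append(x - k * (out[-1] + out[-2] + out[-3]) / 3.0)
    return out
```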

  8. Opportunities for Process Monitoring Techniques at Delayed Access Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Curtis, Michael M.; Gitau, Ernest TN; Johnson, Shirley J.; Schanfein, Mark; Toomey, Christopher

    2013-09-20

    Except for specific cases where the International Atomic Energy Agency (IAEA) maintains a continuous presence at a facility (such as the Japanese Rokkasho Reprocessing Plant), there is always a period of time or delay between the moment a State is notified or aware of an upcoming inspection, and the time the inspector actually enters the material balance area or facility. Termed by the authors as “delayed access,” this period of time between inspection notice and inspector entrance to a facility poses a concern. Delayed access also has the potential to reduce the effectiveness of measures applied as part of the Safeguards Approach for a facility (such as short-notice inspections). This report investigates the feasibility of using process monitoring to address safeguards challenges posed by delayed access at a subset of facility types.
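
    The report does not prescribe a specific algorithm, but one classic process-monitoring test that could flag an off-normal operating period between inspections is the one-sided CUSUM control chart; a minimal sketch (parameter names and values are illustrative, not from the report):

```python
def cusum_alarm(samples, target, k=0.5, h=5.0):
    """One-sided CUSUM: accumulate deviations above target + k and
    return the index of the first sample where the accumulated sum
    exceeds the decision limit h (None if no alarm is raised)."""
    s = 0.0
    for i, x in enumerate(samples):
        s = max(0.0, s + (x - target - k))
        if s > h:
            return i
    return None
```

    The appeal for delayed-access scenarios is that the statistic accumulates evidence over time, so a shift that occurred before the inspector arrives still leaves a recorded alarm.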

  9. Software factory techniques applied to Process Control at CERN

    CERN Multimedia

    Dutour, MD

    2007-01-01

    The CERN Large Hadron Collider (LHC) requires constant monitoring and control of quantities of parameters to guarantee operational conditions. For this purpose, a methodology called UNICOS (UNIfied Industrial COntrols Systems) has been implemented to standardize the design of process control applications. To further accelerate the development of these applications, we migrated our existing UNICOS tooling suite toward a software factory in charge of assembling project, domain and technical information seamlessly into deployable PLC (Programmable logic Controller) – SCADA (Supervisory Control And Data Acquisition) systems. This software factory delivers consistently high quality by reducing human error and repetitive tasks, and adapts to user specifications in a cost-efficient way. Hence, this production tool is designed to encapsulate and hide the PLC and SCADA target platforms, enabling the experts to focus on the business model rather than specific syntaxes and grammars. Based on industry standard software...

  10. Software factory techniques applied to process control at CERN

    CERN Document Server

    Dutour, Mathias D

    2008-01-01

    The CERN Large Hadron Collider (LHC) requires constant monitoring and control of quantities of parameters to guarantee operational conditions. For this purpose, a methodology called UNICOS (UNIfied Industrial COntrols Systems) has been implemented to standardize the design of process control applications. To further accelerate the development of these applications, we migrated our existing UNICOS tooling suite toward a software factory in charge of assembling project, domain and technical information seamlessly into deployable PLC (Programmable logic Controller) - SCADA (Supervisory Control And Data Acquisition) systems. This software factory delivers consistently high quality by reducing human error and repetitive tasks, and adapts to user specifications in a cost-efficient way. Hence, this production tool is designed to encapsulate and hide the PLC and SCADA target platforms, enabling the experts to focus on the business model rather than specific syntaxes and grammars. Based on industry standard software, ...

  11. Application of physical separation techniques in uranium resources processing

    International Nuclear Information System (INIS)

    Padmanabhan, N.P.H.; Sreenivas, T.

    2008-01-01

    The planned economic growth of our country and energy security considerations call for increasing the overall electricity generating capability, with a substantial increase in the zero-carbon, clean nuclear power component. Although India is endowed with vast resources of thorium, its utilization can commence only after the successful completion of the first two stages of the nuclear power programme, which use natural uranium in the first stage and natural uranium plus plutonium in the second stage. For the successful operation of the first stage, exploration and exploitation activities for uranium should be vigorously pursued. This paper reviews the current status of physical beneficiation in the processing of uranium ores and discusses its applicability to recovering uranium from low-grade and below-cut-off-grade ores in the Indian context. (author)

  12. The Application of Special Computing Techniques to Speed-Up Image Feature Extraction and Processing Techniques.

    Science.gov (United States)

    1981-12-01

    noise suppression and data clustering. Another significant step or stage in image processing and exploitation is feature extraction. The objectives and...PAMI-3, no. 3, May, 1981. 16. L.G. Shapiro, "A Structural Model of Shape," IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. PAMI-2, no...Theoretic Clustering," IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. PAMI-1, no. 1, Jan., 1979. 34. P.M. Narendra, "A Separable Median

  13. Tuning of PID controller using optimization techniques for a MIMO process

    Science.gov (United States)

    Thulasi dharan, S.; Kavyarasan, K.; Bagyaveereswaran, V.

    2017-11-01

    In this paper, two processes were considered: a quadruple tank process and a CSTR (Continuous Stirred Tank Reactor) process. These are widely used in many industrial applications across various domains; the CSTR, especially, in chemical plants. First, a mathematical model of each process is developed, followed by linearization, since both are MIMO processes. The controllers are the major part that drives the whole process to the desired operating point for a given application, so tuning of the controller plays a major role in the overall process. For tuning the controller parameters we use two optimization techniques: Particle Swarm Optimization and the Genetic Algorithm. These techniques are widely used in different applications, and we use them here to obtain the best-tuned values among many candidates. Finally, we compare the performance of each process with both techniques.
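
    A bare-bones illustration of PSO-based PID tuning: particles search over (Kp, Ki, Kd) and the cost is the integral squared error of a simulated step response. The first-order plant, the gain bounds, and the PSO constants below are illustrative assumptions, not the paper's models:

```python
import numpy as np

def step_cost(gains, dt=0.01, t_end=5.0):
    """Integral-squared-error of a unit-step response: PID controlling
    a hypothetical first-order plant dy/dt = -y + u (Euler integration)."""
    kp, ki, kd = gains
    y = acc = e_prev = 0.0
    cost = 0.0
    for _ in range(int(t_end / dt)):
        e = 1.0 - y
        acc += e * dt
        u = kp * e + ki * acc + kd * (e - e_prev) / dt
        e_prev = e
        y += (-y + u) * dt
        cost += e * e * dt
    return cost

def pso(cost, bounds, n=20, iters=40, seed=0):
    """Bare-bones particle swarm: inertia 0.7, cognitive/social pulls 1.5."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, float).T
    x = rng.uniform(lo, hi, (n, len(lo)))
    v = np.zeros_like(x)
    pbest, pcost = x.copy(), np.array([cost(p) for p in x])
    g = pbest[pcost.argmin()]
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        c = np.array([cost(p) for p in x])
        improved = c < pcost
        pbest[improved], pcost[improved] = x[improved], c[improved]
        g = pbest[pcost.argmin()]
    return g, pcost.min()

gains, best = pso(step_cost, [(0, 20), (0, 10), (0, 0.5)])
```

    A genetic algorithm would slot into the same structure: only the update step (crossover/mutation instead of velocity updates) changes, while the cost function stays the same.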

  14. Analysis of Low Probability of Intercept (LPI) Radar Signals Using Cyclostationary Processing

    National Research Council Canada - National Science Library

    Lime, Antonio

    2002-01-01

    ... problem in the battle space. To detect these types of radar, new digital receivers that use sophisticated signal processing techniques are required. This thesis investigates the use of cyclostationary...

  15. Techniques for evaluation of E-beam evaporative processes

    International Nuclear Information System (INIS)

    Meier, T.C.; Nelson, C.M.

    1996-01-01

    High dynamic range video imaging of the molten pool surface has provided insight regarding process responses at the melt pool liquid-vapor interface. A water-cooled video camera provides continuous high resolution imaging of the pool surface from a low angle position within 20 cm of the liquid-vapor interface. From this vantage point, the e-beam footprint is clearly defined and melt pool free surface shape can be observed. Effects of changes in beam footprint, power distribution, and sweep frequency on pool surface shape and stability of vaporization are immediately shown. Other events observed and recorded include: formation of the pool and dissipation of ''rafts'' on the pool surface during startup, behavior of feed material as it enters the pool, effects of feed configuration changes on mixing of feed entering the pool volume, and behaviors of co-evaporated materials of different vapor pressures at the feed/pool boundary. When used in conjunction with laser vapor monitoring, correlation between pool surface phenomena and vaporizer performance has been identified. This video capability was used in verifying the titanium evaporation model results presented at this conference by confirming the calculated melt pool surface deformations caused by vapor pressure of the departing evaporant at the liquid-vapor interface

  16. Optical microphone with fiber Bragg grating and signal processing techniques

    Science.gov (United States)

    Tosi, Daniele; Olivero, Massimo; Perrone, Guido

    2008-06-01

    In this paper, we discuss the realization of an optical microphone array using fiber Bragg gratings as sensing elements. The wavelength shift induced by acoustic waves perturbing the sensing Bragg grating is transduced into an intensity modulation. The interrogation unit is based on a fixed-wavelength laser source and, as receiver, a photodetector with proper amplification; the system has been implemented using devices for standard optical communications, achieving a low-cost interrogator. One of the advantages of the proposed approach is that no voltage-to-strain calibration is required for tracking dynamic shifts. The optical sensor is complemented by signal processing tools, including a data-dependent frequency estimator and adaptive filters, in order to improve the frequency-domain analysis and mitigate the effects of disturbances. Feasibility and performance of the optical system have been tested by measuring the output of a loudspeaker. With this configuration, the sensor is capable of correctly detecting sounds up to 3 kHz, with a frequency response that exhibits a top sensitivity within the range 200-500 Hz; single-frequency input sounds inducing an axial strain higher than ~10 nε are correctly detected. The repeatability range is ~0.1%. The sensor has also been applied to the detection of pulsed stimuli generated from a metronome.
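
    A minimal sketch of the frequency-domain analysis step: estimating the dominant frequency of a sampled signal from the peak of its windowed magnitude spectrum. The paper's own estimator is data-dependent and more elaborate; this FFT-peak version, and the 440 Hz test tone, are simplifying assumptions:

```python
import numpy as np

def dominant_freq(signal, fs):
    """Dominant frequency (Hz) from the peak of the Hann-windowed
    magnitude spectrum; resolution is fs / len(signal)."""
    x = np.asarray(signal, float)
    x = (x - x.mean()) * np.hanning(len(x))   # remove DC, reduce leakage
    spec = np.abs(np.fft.rfft(x))
    return np.fft.rfftfreq(len(x), 1.0 / fs)[spec.argmax()]

fs = 8000.0
t = np.arange(4096) / fs
# a 440 Hz tone buried in a little noise
tone = np.sin(2 * np.pi * 440.0 * t)
tone += 0.1 * np.random.default_rng(0).standard_normal(t.size)
```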

  17. Development of Processing Techniques for Advanced Thermal Protection Materials

    Science.gov (United States)

    Selvaduray, Guna; Cox, Michael; Srinivasan, Vijayakumar

    1997-01-01

    The Thermal Protection Materials Branch (TPMB) has been involved in various research programs to improve the properties and structural integrity of existing aerospace high-temperature materials. Specimens from various research programs were brought into the analytical laboratory to obtain and refine their material characterization. The analytical laboratory in TPMB has many different instruments, which were utilized to determine the physical and chemical characteristics of materials. Some of the instruments utilized by the SJSU students are: Scanning Electron Microscopy (SEM), Energy Dispersive X-ray analysis (EDX), X-ray Diffraction Spectrometer (XRD), Fourier Transform-Infrared Spectroscopy (FTIR), Ultraviolet/Visible Spectroscopy (UV/VIS), Particle Size Analyzer (PSA), and Inductively Coupled Plasma Atomic Emission Spectrometer (ICP-AES). These analytical instruments were utilized in the material characterization of specimens from research programs such as: aerogel ceramics (I) and (II), X-33 blankets, ARC-Jet specimens, QUICFIX specimens and gas permeability of lightweight ceramic ablators. In addition to the analytical instruments, several experiments are on-going in the laboratory at TPMB. One particular experiment allows the measurement of the permeability of ceramic ablators; from these measurements, physical characteristics of the ceramic ablators can be derived.

  18. Techniques for evaluation of E-beam evaporative processes

    Energy Technology Data Exchange (ETDEWEB)

    Meier, T.C.; Nelson, C.M.

    1996-10-01

    High dynamic range video imaging of the molten pool surface has provided insight regarding process responses at the melt pool liquid-vapor interface. A water-cooled video camera provides continuous high resolution imaging of the pool surface from a low angle position within 20 cm of the liquid-vapor interface. From the vantage point, the e-beam footprint is clearly defined and melt pool free surface shape can be observed. Effects of changes in a beam footprint, power distribution, and sweep frequency on pool surface shape and stability of vaporization are immediately shown. Other events observed and recorded include: formation of the pool and dissipation of ``rafts`` on the pool surface during startup, behavior of feed material as it enters the pool, effects of feed configuration changes on mixing of feed entering the pool volume and behaviors of co-evaporated materials of different vapor pressures at the feed/pool boundary. When used in conjunction with laser vapor monitoring, correlation between pool surface phenomena and vaporizer performance has been identified. This video capability was used in verifying the titanium evaporation model results presented at this conference by confirming the calculated melt pool surface deformations caused by vapor pressure of the departing evaporant at the liquid-vapor interface.

  19. Modern devices the simple physics of sophisticated technology

    CERN Document Server

    Joseph, Charles L

    2016-01-01

    This book discusses the principles of physics through applications of state-of-the-art technologies and advanced instruments. The authors use diagrams, sketches, and graphs coupled with equations and mathematical analysis to enhance the reader's understanding of modern devices. Readers will learn to identify common underlying physical principles that govern several types of devices, while gaining an understanding of the performance trade-off imposed by the physical limitations of various processing methods. The topics discussed in the book assume readers have taken an introductory physics course, college algebra, and have a basic understanding of calculus. * Describes the basic physics behind a large number of devices encountered in everyday life, from the air conditioner to Blu-ray discs * Covers state-of-the-art devices such as spectrographs, photoelectric image sensors, spacecraft systems, astronomical and planetary observatories, biomedical imaging instruments, particle accelerators, and jet engines * Inc...

  20. Eye gazing direction inspection based on image processing technique

    Science.gov (United States)

    Hao, Qun; Song, Yong

    2005-02-01

    According to research results in neural biology, human eyes obtain high resolution only at the center of the field of view. In our research on a Virtual Reality helmet, we aim to detect the gazing direction of human eyes in real time and feed it back to the control system to improve the resolution of the graphics at the center of the field of view. With current display instruments, this method can attend to both the field of view of the virtual scene and its resolution, and greatly improve the immersion of the virtual system. Therefore, detecting the gazing direction of human eyes rapidly and exactly is the basis of realizing the design scheme of this novel VR helmet. In this paper, the conventional method of gazing direction detection based on the Purkinje spot is introduced first. To overcome the disadvantages of the Purkinje-spot method, this paper proposes a method based on image processing to detect and determine the gazing direction. The locations of the pupils and the shapes of the eye sockets change with the gazing direction. With the aid of these changes, by analyzing the images of eyes captured by the cameras, the gazing direction of human eyes can be determined. Experiments have been done to validate the efficiency of this method by analyzing the images. The algorithm detects gazing direction directly from normal eye images, eliminating the need for special hardware. Experimental results show that the method is easy to implement and has high precision.
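
    A toy version of the image-based idea: locate the pupil as the centroid of the darkest pixels. The threshold and the synthetic image are illustrative assumptions; the paper's algorithm also exploits eye-socket shape:

```python
import numpy as np

def pupil_center(gray, thresh=0.3):
    """Centroid (row, col) of the pixels darker than the threshold."""
    ys, xs = np.nonzero(gray < thresh)
    return ys.mean(), xs.mean()

# synthetic eye image: bright background with a dark pupil disk at (25, 50)
img = np.ones((60, 80))
yy, xx = np.mgrid[0:60, 0:80]
img[(yy - 25) ** 2 + (xx - 50) ** 2 <= 64] = 0.05
```

    Tracking the centroid across frames then gives a gaze-direction signal without any Purkinje-spot hardware, which is the selling point of the image-processing approach.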

  1. Design of process displays based on risk analysis techniques

    International Nuclear Information System (INIS)

    Lundtang Paulsen, J.

    2004-05-01

    This thesis deals with the problems of designing display systems for process plants. We state the reasons why it is important to discuss information systems for operators in a control room, especially in view of the enormous amount of information available in computer-based supervision systems. The state of the art is discussed: How are supervision systems designed today and why? Which strategies are used? What kind of research is going on? Four different plants and their display systems, designed by the author, are described and discussed. Next we outline different methods for eliciting knowledge of a plant, particularly the risks, which is necessary information for the display designer. A chapter presents an overview of the various types of operation references: constitutive equations, set points, design parameters, component characteristics etc., and their validity in different situations. On the basis of her experience with the design of display systems, with risk analysis methods, and from eight years as an engineer-on-shift at a research reactor, the author developed a method to elicit the necessary information for the operator. The method, a combination of a Goal-Tree and a Fault-Tree, is described in some detail. Finally we address the problem of where to put the dots and the lines: when all information is on the table, how should it be presented most adequately? Included as an appendix is a paper concerning the analysis of maintenance reports and visualization of their information. The purpose was to develop a software tool for maintenance supervision of components in a nuclear power plant. (au)

  2. Application of signal processing techniques for islanding detection of distributed generation in distribution network: A review

    International Nuclear Information System (INIS)

    Raza, Safdar; Mokhlis, Hazlie; Arof, Hamzah; Laghari, J.A.; Wang, Li

    2015-01-01

    Highlights: • Pros & cons of conventional islanding detection techniques (IDTs) are discussed. • Signal processing techniques (SPTs) ability in detecting islanding is discussed. • SPTs ability in improving performance of passive techniques are discussed. • Fourier, s-transform, wavelet, HHT & tt-transform based IDTs are reviewed. • Intelligent classifiers (ANN, ANFIS, Fuzzy, SVM) application in SPT are discussed. - Abstract: High penetration of distributed generation resources (DGR) in the distribution network provides many benefits in terms of high power quality, efficiency, and low carbon emissions in the power system. However, efficient islanding detection and immediate disconnection of DGR is critical in order to avoid equipment damage, grid protection interference, and personnel safety hazards. Islanding detection techniques are mainly classified into remote, passive, active, and hybrid techniques. Of these, passive techniques are more advantageous due to lower power quality degradation, lower cost, and widespread usage by power utilities. However, the main limitations of these techniques are that they possess large non-detection zones and require threshold settings. Various signal processing techniques and intelligent classifiers have been used to overcome the limitations of passive islanding detection. Signal processing techniques, in particular, are adopted due to their versatility, stability, cost effectiveness, and ease of modification. This paper presents a comprehensive overview of signal processing techniques used to improve common passive islanding detection techniques. A performance comparison between the signal processing based islanding detection techniques and existing techniques is also provided. Finally, this paper outlines the relative advantages and limitations of the signal processing techniques in order to provide basic guidelines for researchers and field engineers in determining the best method for their system.
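
    As a concrete example of a passive indicator that signal processing sharpens: islanding often shows up as a frequency drift in the voltage at the point of common coupling, which can be tracked from zero crossings. A toy sketch (signal parameters and the 0.5 Hz trip threshold are illustrative, not from the review):

```python
import numpy as np

def cycle_freqs(v, fs):
    """Per-cycle frequency estimates from linearly interpolated
    rising zero crossings of the voltage waveform."""
    v = np.asarray(v, float)
    idx = np.nonzero((v[:-1] < 0) & (v[1:] >= 0))[0]
    t = (idx - v[idx] / (v[idx + 1] - v[idx])) / fs   # crossing instants
    return 1.0 / np.diff(t)

fs = 5000.0
t = np.arange(int(fs)) / fs                 # one second of signal
f_inst = np.where(t < 0.5, 50.0, 51.0)      # islanding at t = 0.5 s shifts frequency
v = np.sin(2 * np.pi * np.cumsum(f_inst) / fs)
f_est = cycle_freqs(v, fs)
trip = np.abs(f_est - 50.0) > 0.5           # passive over/under-frequency test
```

    The techniques surveyed in the paper (wavelets, s-transform, HHT) replace this crude per-cycle estimate with time-frequency features that shrink the non-detection zone.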

  3. Fabrication and processing of polymer solar cells: A review of printing and coating techniques

    DEFF Research Database (Denmark)

    Krebs, Frederik C

    2009-01-01

    Polymer solar cells are reviewed in the context of the processing techniques leading to complete devices. A distinction is made between the film-forming techniques that are used currently such as spincoating, doctor blading and casting and the, from a processing point of view, more desirable film...... are described with focus on the particular advantages and disadvantages associated with each case....

  4. Wind Erosion Processes and Control Techniques in the Sahelian Zone of Niger

    NARCIS (Netherlands)

    Sterk, G.; Stroosnijder, L.; Raats, P.A.C.

    1999-01-01

    The objective of this paper is to present the main results and conclusions from three years of field research on wind erosion processes and control techniques in the Sahelian zone of Niger.

  5. Lexical Sophistication as a Multidimensional Phenomenon: Relations to Second Language Lexical Proficiency, Development, and Writing Quality

    Science.gov (United States)

    Kim, Minkyung; Crossley, Scott A.; Kyle, Kristopher

    2018-01-01

    This study conceptualizes lexical sophistication as a multidimensional phenomenon by reducing numerous lexical features of lexical sophistication into 12 aggregated components (i.e., dimensions) via a principal component analysis approach. These components were then used to predict second language (L2) writing proficiency levels, holistic lexical…

  6. The predictors of economic sophistication: media, interpersonal communication and negative economic experiences

    NARCIS (Netherlands)

    Kalogeropoulos, A.; Albæk, E.; de Vreese, C.H.; van Dalen, A.

    2015-01-01

    In analogy to political sophistication, it is imperative that citizens have a certain level of economic sophistication, especially in times of heated debates about the economy. This study examines the impact of different influences (media, interpersonal communication and personal experiences) on

  7. Isocratean Discourse Theory and Neo-Sophistic Pedagogy: Implications for the Composition Classroom.

    Science.gov (United States)

    Blair, Kristine L.

    With the recent interest in the fifth century B.C. theories of Protagoras and Gorgias come assumptions about the philosophical affinity of the Greek educator Isocrates to this pair of older sophists. Isocratean education in discourse, with its emphasis on collaborative political discourse, falls within recent definitions of a sophist curriculum.…

  8. Aristotle and Social-Epistemic Rhetoric: The Systematizing of the Sophistic Legacy.

    Science.gov (United States)

    Allen, James E.

    While Aristotle's philosophical views are more foundational than those of many of the Older Sophists, Aristotle's rhetorical theories inherit and incorporate many of the central tenets ascribed to Sophistic rhetoric, albeit in a more systematic fashion, as represented in the "Rhetoric." However, Aristotle was more than just a rhetorical…

  9. Data Collision Prevention with Overflow Hashing Technique in Closed Hash Searching Process

    Science.gov (United States)

    Rahim, Robbi; Nurjamiyah; Rafika Dewi, Arie

    2017-12-01

    Hash search is a method that can be used for various search processes such as search engines, sorting, machine learning, neural networks and so on. In the search process, data collisions can happen, and collisions can be prevented in several ways; one of them is the overflow technique. This technique works with varying lengths of data and can prevent the occurrence of data collisions.
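
    A minimal sketch of the overflow idea: colliding keys are diverted to a separate overflow area instead of being probed into the main table. The class and its layout are illustrative assumptions, not the paper's implementation:

```python
class OverflowHash:
    """Closed hash table with a separate overflow area: a key whose
    home slot is already occupied by a different key is stored in the
    overflow list, so collisions never clobber stored records."""

    def __init__(self, size=11):
        self.size = size
        self.main = [None] * size        # home slots: (key, value) or None
        self.overflow = []               # entries that collided

    def put(self, key, value):
        h = hash(key) % self.size
        if self.main[h] is None or self.main[h][0] == key:
            self.main[h] = (key, value)
            return
        for i, (k, _) in enumerate(self.overflow):
            if k == key:                 # update an existing overflow entry
                self.overflow[i] = (key, value)
                return
        self.overflow.append((key, value))

    def get(self, key):
        h = hash(key) % self.size
        if self.main[h] is not None and self.main[h][0] == key:
            return self.main[h][1]
        for k, v in self.overflow:       # linear scan of the overflow area
            if k == key:
                return v
        return None
```

    Lookups stay O(1) while the overflow area is short; when it grows, the table is usually resized, which is the classic trade-off against open addressing.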

  10. The application of neural networks with artificial intelligence technique in the modeling of industrial processes

    International Nuclear Information System (INIS)

    Saini, K. K.; Saini, Sanju

    2008-01-01

    Neural networks are a relatively new artificial intelligence technique that emulates the behavior of biological neural systems in digital software or hardware. These networks can 'learn', automatically, complex relationships among data. This feature makes the technique very useful in modeling processes for which mathematical modeling is difficult or impossible. The work described here outlines some examples of the application of neural networks with artificial intelligence technique in the modeling of industrial processes.
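
    A small concrete instance of the "learn complex relationships" claim: a one-hidden-layer network trained by gradient descent to model a nonlinear input-output relation. This is a generic pure-NumPy sketch; the target function, layer sizes, and learning rate are illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# data for a nonlinear "process": y = sin(x) on [-pi, pi]
X = rng.uniform(-np.pi, np.pi, (200, 1))
Y = np.sin(X)

# one hidden layer of 16 tanh units, plain full-batch gradient descent
W1 = 0.5 * rng.standard_normal((1, 16)); b1 = np.zeros(16)
W2 = 0.5 * rng.standard_normal((16, 1)); b2 = np.zeros(1)
lr = 0.05

for _ in range(3000):
    H = np.tanh(X @ W1 + b1)              # forward pass
    P = H @ W2 + b2
    dP = 2.0 * (P - Y) / len(X)           # gradient of mean squared error
    dW2, db2 = H.T @ dP, dP.sum(axis=0)
    dH = dP @ W2.T * (1.0 - H ** 2)       # backprop through tanh
    dW1, db1 = X.T @ dH, dH.sum(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

mse = np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - Y) ** 2)
```

    For an industrial process the training pairs would come from plant measurements rather than a known function, which is exactly the case where a first-principles mathematical model is hard to write down.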

  11. Sophistication of computational science and fundamental physics simulations

    International Nuclear Information System (INIS)

    Ishiguro, Seiji; Ito, Atsushi; Usami, Shunsuke; Ohtani, Hiroaki; Sakagami, Hitoshi; Toida, Mieko; Hasegawa, Hiroki; Horiuchi, Ritoku; Miura, Hideaki

    2016-01-01

    Numerical experimental reactor research project is composed of the following studies: (1) nuclear fusion simulation research with a focus on specific physical phenomena of specific equipment, (2) research on advanced simulation method to increase predictability or expand its application range based on simulation, (3) visualization as the foundation of simulation research, (4) research for advanced computational science such as parallel computing technology, and (5) research aiming at elucidation of fundamental physical phenomena not limited to specific devices. Specifically, a wide range of researches with medium- to long-term perspectives are being developed: (1) virtual reality visualization, (2) upgrading of computational science such as multilayer simulation method, (3) kinetic behavior of plasma blob, (4) extended MHD theory and simulation, (5) basic plasma process such as particle acceleration due to interaction of wave and particle, and (6) research related to laser plasma fusion. This paper reviews the following items: (1) simultaneous visualization in virtual reality space, (2) multilayer simulation of collisionless magnetic reconnection, (3) simulation of microscopic dynamics of plasma coherent structure, (4) Hall MHD simulation of LHD, (5) numerical analysis for extension of MHD equilibrium and stability theory, (6) extended MHD simulation of 2D RT instability, (7) simulation of laser plasma, (8) simulation of shock wave and particle acceleration, and (9) study on simulation of homogeneous isotropic MHD turbulent flow. (A.O.)

  12. Sophistication and integration of plant engineering CAD-CAE systems

    International Nuclear Information System (INIS)

    Yoshinaga, Toshiaki; Hanyu, Masaharu; Ota, Yoshimi; Kobayashi, Yasuhiro.

    1995-01-01

    In the respective departments in charge of basic planning, design, manufacture, inspection and construction of nuclear power plants, work has been made more efficient through active utilization of CAD/CAE systems. This time, an integrated plant CAE system was developed which enhances the functions of these individual systems and makes work more efficient and advanced by unifying and integrating them. This system is composed of a newly developed application system and a database system which enables unified management of engineering data and high-speed data conversion, in addition to the CAD system for three-dimensional plant layout planning. On the basis of rich experience with applying the CAD system for three-dimensional plant layout planning to actual plants, and of designers' proposals for improvement, the automation, speed increase, and visualization of input and output by a graphical user interface (GUI) in the processing of the respective applications were made feasible. As the advancement of the plant CAE system, a scenic engineering system, an integrated layout CAE system, an electric instrumentation design CAE system and a construction planning CAE system are described. As for the integration of plant CAE systems, the integrated engineering database, the combination of plant CAE systems, and operation management in a distributed network environment are reported. At present, Hitachi Ltd. is working on the construction of an integrated nuclear power product information management system as the second stage of integration. (K.I.)

  13. More Sophisticated Fits of the Orbits of Haumea's Interacting Moons

    Science.gov (United States)

    Oldroyd, William Jared; Ragozzine, Darin; Porter, Simon

    2018-04-01

    Since the discovery of Haumea's moons, it has been a challenge to model the orbits of its moons, Hi’iaka and Namaka. With many precision HST observations, Ragozzine & Brown 2009 succeeded in calculating a three-point mass model, which was essential because Keplerian orbits were not a statistically acceptable fit. New data obtained in 2010 could be fit by adding a J2 and spin pole to Haumea, but new data from 2015 were far from the predicted locations, even after an extensive exploration using Bayesian Markov Chain Monte Carlo methods (using emcee). Here we report on continued investigations as to why our model cannot fit the full 10-year baseline of data. We note that ignoring Haumea and instead examining the relative motion of the two moons in the Hi’iaka-centered frame leads to adequate fits for the data. This suggests there are additional parameters connected to Haumea that will be required in a full model. These parameters are potentially related to photocenter-barycenter shifts, which could be significant enough to affect the fitting process; these are unlikely to be caused by the newly discovered ring (Ortiz et al. 2017) or by unknown satellites (Burkhart et al. 2016). Additionally, we have developed a new SPIN+N-bodY integrator called SPINNY that self-consistently calculates the interactions between n-quadrupoles and is designed to test the importance of other possible effects (Haumea C22, satellite torques on the spin pole, Sun, etc.) on our astrometric fits. By correctly determining the orbits of Haumea’s satellites, we develop a better understanding of the physical properties of each of the objects, with implications for the formation of Haumea, its moons, and its collisional family.

  14. Analysis of pulse-shape discrimination techniques for BC501A using GHz digital signal processing

    International Nuclear Information System (INIS)

    Rooney, B.D.; Dinwiddie, D.R.; Nelson, M.A.; Rawool-Sullivan, Mohini W.

    2001-01-01

    A comparison study of pulse-shape analysis techniques was conducted for a BC501A scintillator using digital signal processing (DSP). In this study, output signals from a preamplifier were input directly into a 1 GHz analog-to-digital converter. The digitized data obtained with this method was post-processed for both pulse-height and pulse-shape information. Several different analysis techniques were evaluated for neutron and gamma-ray pulse-shape discrimination. It was surprising that one of the simplest and fastest techniques resulted in some of the best pulse-shape discrimination results. This technique, referred to here as the Integral Ratio technique, was able to effectively process several thousand detector pulses per second. This paper presents the results and findings of this study for various pulse-shape analysis techniques with digitized detector signals.
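
The tail-to-total charge comparison behind such an integral-ratio method can be sketched in a few lines. This is a generic illustration, not the authors' exact implementation: the decay constants, window positions, and synthetic pulse shapes below are invented for the example.

```python
import numpy as np

def integral_ratio(pulse, baseline_samples=20, tail_start=25):
    """Charge-integration pulse-shape discrimination: the ratio of
    the tail integral to the total integral. Neutron events in
    BC501A carry more slow scintillation light, so they yield a
    larger tail fraction than gamma events."""
    baseline = pulse[:baseline_samples].mean()
    p = pulse - baseline
    return p[tail_start:].sum() / p.sum()

# Toy pulses sampled at 1 GHz (1 ns bins): fast + slow exponentials.
t = np.arange(200, dtype=float)
fast = np.exp(-t / 3.0)
slow = np.exp(-t / 30.0)
gamma_like = np.concatenate([np.zeros(20), fast + 0.05 * slow])[:200]
neutron_like = np.concatenate([np.zeros(20), fast + 0.30 * slow])[:200]

r_gamma = integral_ratio(gamma_like)
r_neutron = integral_ratio(neutron_like)
assert r_neutron > r_gamma   # larger tail fraction flags the neutron
```

Because the discriminator is just two running sums per pulse, it is cheap enough to keep up with thousands of digitized pulses per second, which matches the throughput the abstract reports.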

  15. Hi-tech in space - Rosetta - a space sophisticate

    Science.gov (United States)

    2004-02-01

    The European Space Agency’s Rosetta mission will be getting under way in February 2004. The Rosetta spacecraft will be pairing up with Comet 67P/Churyumov-Gerasimenko and accompanying it on its journey, investigating the comet’s composition and the dynamic processes at work as it flies sunwards. The spacecraft will even deposit a lander on the comet. “This will be our first direct contact with the surface of a comet,” said Dr Manfred Warhaut, Operations Manager for the Rosetta mission at ESA's European Space Operations Centre (ESOC) in Darmstadt, Germany. The trip is certainly not short: Rosetta will need ten years just to reach the comet. This places extreme demands on its hardware; when the probe meets up with the comet, all instruments must be fully operational, especially since it will have been in “hibernation” for 2 and a half years of its journey. During this ‘big sleep’, all systems, scientific instruments included, are turned off. Only the on-board computer remains active. Twelve cubic metres of technical wizardry: Rosetta’s hardware fits into a sort of aluminium box measuring just 12 cubic metres. The scientific payload is mounted in the upper part, while the subsystems - on-board computer, transmitter and propulsion system - are housed below. The lander is fixed to the opposite side of the probe from the steerable antenna. As the spacecraft orbits the comet, the scientific instruments will at all times be pointed towards its surface; the antenna and solar panels will point towards the Earth and Sun respectively. For trajectory and attitude control and for the major braking manœuvres, Rosetta is equipped with 24 thrusters each delivering 10 N. That corresponds to the force needed here on Earth to hold a bag containing 10 apples. Rosetta sets off with 1650 kg of propellant on board, accounting for more than half its mass at lift-off. Just 20% of total mass is available for scientific purposes. So when developing the research instruments

  16. The Design and Development of Test Platform for Wheat Precision Seeding Based on Image Processing Techniques

    OpenAIRE

    Li , Qing; Lin , Haibo; Xiu , Yu-Feng; Wang , Ruixue; Yi , Chuijie

    2009-01-01

    International audience; The test platform of wheat precision seeding based on image processing techniques is designed to develop the wheat precision seed metering device with high efficiency and precision. Using image processing techniques, this platform gathers images of seeds (wheat) on the conveyer belt which are falling from seed metering device. Then these data are processed and analyzed to calculate the qualified rate, reseeding rate and leakage sowing rate, etc. This paper introduces t...
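
Once the image pipeline has reduced each metering window to a seed count, the quality rates reduce to simple frequencies. The sketch below assumes that per-window reduction has been done; the rate definitions follow common precision-seeding conventions (one seed per window is "qualified", more than one is "reseeding", none is "leakage sowing") and are assumptions here.

```python
# Hedged sketch: estimate metering-quality rates from per-window
# seed counts produced by the image-processing stage.
def seeding_rates(counts):
    n = len(counts)
    qualified = sum(1 for c in counts if c == 1) / n   # exactly one seed
    reseeding = sum(1 for c in counts if c > 1) / n    # extra seeds dropped
    leakage = sum(1 for c in counts if c == 0) / n     # empty window
    return qualified, reseeding, leakage

# 7 windows with one seed, 2 with extra seeds, 1 empty:
q, r, l = seeding_rates([1, 1, 2, 0, 1, 1, 1, 3, 1, 1])
assert (q, r, l) == (0.7, 0.2, 0.1)
```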

  17. Application of nonlinear reduction techniques in chemical process modeling: a review

    International Nuclear Information System (INIS)

    Muhaimin, Z; Aziz, N.; Abd Shukor, S.R.

    2006-01-01

    Model reduction techniques have been used widely in engineering fields, in electrical and mechanical as well as chemical engineering. The basic idea of a reduction technique is to replace the original system by an approximating system with a much smaller state-space dimension. A reduced-order model is more beneficial to the process industry, particularly for control purposes. This paper provides a review of applications of nonlinear reduction techniques in chemical processes. The advantages and disadvantages of each technique reviewed are also highlighted.
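
The core idea, projecting the state onto a low-dimensional subspace, can be illustrated with a minimal proper orthogonal decomposition (POD) sketch. POD is only one of several techniques such a review covers, and the synthetic snapshot data below are invented for the example.

```python
import numpy as np

# Minimal sketch of projection-based model reduction (POD):
# snapshots of a high-dimensional state are compressed with an
# SVD, and states are projected onto the leading modes.
rng = np.random.default_rng(0)
n, m, r = 100, 40, 3               # full order, snapshots, reduced order

# Synthetic snapshot matrix with an intrinsic rank-3 structure.
modes = rng.standard_normal((n, r))
coeffs = rng.standard_normal((r, m))
X = modes @ coeffs

U, s, _ = np.linalg.svd(X, full_matrices=False)
Phi = U[:, :r]                     # reduced basis (n x r)

x = X[:, 0]                        # a full-order state
x_r = Phi.T @ x                    # reduce: r numbers instead of n
x_rec = Phi @ x_r                  # lift back to the full space

assert np.allclose(x, x_rec, atol=1e-8)   # rank-3 data captured exactly
```

Real states are only approximately low-rank, so in practice `r` is chosen from the decay of the singular values `s`, trading accuracy against state-space dimension.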

  18. Low level image processing techniques using the pipeline image processing engine in the flight telerobotic servicer

    Science.gov (United States)

    Nashman, Marilyn; Chaconas, Karen J.

    1988-01-01

    The sensory processing system for the NASA/NBS Standard Reference Model (NASREM) for telerobotic control is described. This control system architecture was adopted by NASA for the Flight Telerobotic Servicer. The control system is hierarchically designed and consists of three parallel systems: task decomposition, world modeling, and sensory processing. The Sensory Processing System is examined; in particular, the image processing hardware and software used to extract features at low levels of sensory processing are described for tasks representative of those envisioned for the Space Station, such as assembly and maintenance.

  19. Uncertainty in safety : new techniques for the assessment and optimisation of safety in process industry

    NARCIS (Netherlands)

    Rouvroye, J.L.; Nieuwenhuizen, J.K.; Brombacher, A.C.; Stavrianidis, P.; Spiker, R.Th.E.; Pyatt, D.W.

    1995-01-01

    At this moment there is no standardised method for the assessment of safety in the process industry. Many companies and institutes use qualitative techniques for safety analysis, while other companies and institutes use quantitative techniques. The authors of this paper will compare different

  20. A history of engraving and etching techniques: developments of manual intaglio printmaking processes, 1400-2000

    NARCIS (Netherlands)

    Stijnman, A.C.J.

    2012-01-01

    This book surveys the history of the techniques of engraving, etching and plate printing - i.e. that of manual intaglio printmaking processes - from its beginning in the 1430s until today. These developments are observed in the light of the coherence between the technique of the intaglio print (such

  1. Multi-beam backscatter image data processing techniques employed to EM 1002 system

    Digital Repository Service at National Institute of Oceanography (India)

    Fernandes, W.A.; Chakraborty, B.

    to compensate outer-beam backscatter strength data in such a way that the effect of angular backscatter strength is removed. In this work we have developed backscatter data processing techniques for EM1002 multi-beam system...

  2. Uncovering cognitive processes: Different techniques that can contribute to cognitive load research and instruction

    NARCIS (Netherlands)

    Van Gog, Tamara; Kester, Liesbeth; Nievelstein, Fleurie; Giesbers, Bas; Fred, Paas

    2009-01-01

    Van Gog, T., Kester, L., Nievelstein, F., Giesbers, B., & Paas, F. (2009). Uncovering cognitive processes: Different techniques that can contribute to cognitive load research and instruction. Computers in Human Behavior, 25, 325-331.

  3. Grapefruit (Citrus paradisi Macfad) phytochemicals composition is modulated by household processing techniques.

    Science.gov (United States)

    Uckoo, Ram M; Jayaprakasha, Guddadarangavvanahally K; Balasubramaniam, V M; Patil, Bhimanagouda S

    2012-09-01

    Grapefruits (Citrus paradisi Macfad) contain several phytochemicals known to have health maintaining properties. Due to consumers' interest in obtaining high levels of these phytochemicals, it is important to understand the changes in their levels caused by common household processing techniques. Therefore, mature Texas "Rio Red" grapefruits were processed by some of the common household processing practices, such as blending, juicing, and hand squeezing techniques, and analyzed for their phytochemical content by high performance liquid chromatography (HPLC). Results suggest that grapefruit juice processed by blending had significantly higher levels of flavonoids (narirutin, naringin, hesperidin, neohesperidin, didymin, and poncirin) and limonin compared to juicing and hand squeezing. No significant variation in their content was noticed between juice processed by juicing and hand squeezing. Ascorbic acid and citric acid were significantly higher in juice processed by juicing and blending, respectively. Furthermore, hand squeezed fruit juice had significantly higher contents of dihydroxybergamottin (DHB) than juice processed by juicing and blending. Bergamottin and 5-methoxy-7 gernoxycoumarin (5-M-7-GC) were significantly higher in blended juice compared to juicing and hand squeezing. Therefore, consuming grapefruit juice processed by blending may provide higher levels of health beneficial phytochemicals such as naringin, narirutin, and poncirin. In contrast, juice processed by hand squeezing and juicing provides lower levels of limonin, bergamottin, and 5-M-7-GC. These results suggest that processing techniques significantly influence the levels of phytochemicals and blending is a better technique for obtaining higher levels of health beneficial phytochemicals from grapefruits. Practical Application: Blending, squeezing, and juicing are common household processing techniques used for obtaining fresh grapefruit juice. Understanding the levels of health beneficial phytochemicals

  4. Recent advances in electronic nose techniques for monitoring of fermentation process.

    Science.gov (United States)

    Jiang, Hui; Zhang, Hang; Chen, Quansheng; Mei, Congli; Liu, Guohai

    2015-12-01

    Microbial fermentation processes are often sensitive to even slight changes of conditions that may result in unacceptable end-product quality. Thus, monitoring of the process is critical for discovering unfavorable deviations as early as possible and taking the appropriate measures. However, the use of traditional analytical techniques is often time-consuming and labor-intensive. In this sense, the most effective way of developing a rapid, accurate and relatively economical method for quality assurance in microbial fermentation is the use of novel chemical sensor systems. Electronic nose techniques have particular advantages in non-invasive monitoring of microbial fermentation processes. Therefore, in this review, we present an overview of the most important contributions dealing with quality control in microbial fermentation using electronic nose techniques. After a brief description of the fundamentals of the sensor techniques, some examples of potential applications of electronic nose monitoring are provided, including the implementation of control strategies and the combination with other monitoring tools (i.e. sensor fusion). Finally, on the basis of the review, the electronic nose techniques are critically discussed, with their strengths and weaknesses highlighted. In addition, on the basis of the observed trends, we also outline the technical challenges and future outlook for electronic nose techniques.

  5. Criteria for assessing the quality of signal processing techniques for acoustic leak detection

    International Nuclear Information System (INIS)

    Prabhakar, R.; Singh, O.P.

    1990-01-01

    In this paper the criteria used in the first IAEA coordinated research programme to assess the quality of signal processing techniques for sodium boiling noise detection are highlighted. Signal processing techniques, using new features sensitive to boiling and a new approach for achieving higher reliability of detection, which were developed at Indira Gandhi Centre for Atomic Research are also presented. 10 refs, 3 figs, 2 tabs

  6. Sampling phased array a new technique for signal processing and ultrasonic imaging

    OpenAIRE

    Bulavinov, A.; Joneit, D.; Kröning, M.; Bernus, L.; Dalichow, M.H.; Reddy, K.M.

    2006-01-01

    Different signal processing and image reconstruction techniques are applied in ultrasonic non-destructive material evaluation. In recent years, rapid development in the fields of microelectronics and computer engineering has led to wide application of phased array systems. A new phased array technique, called "Sampling Phased Array", has been developed at the Fraunhofer Institute for Non-Destructive Testing. It realizes a unique approach to the measurement and processing of ultrasonic signals. The sampling...

  7. Sampling phased array - a new technique for ultrasonic signal processing and imaging

    OpenAIRE

    Verkooijen, J.; Boulavinov, A.

    2008-01-01

    Over the past 10 years, the improvement in the field of microelectronics and computer engineering has led to significant advances in ultrasonic signal processing and image construction techniques that are currently being applied to non-destructive material evaluation. A new phased array technique, called 'Sampling Phased Array', has been developed in the Fraunhofer Institute for Non-Destructive Testing([1]). It realises a unique approach of measurement and processing of ultrasonic signals. Th...

  8. Sampling phased array, a new technique for ultrasonic signal processing and imaging now available to industry

    OpenAIRE

    Verkooijen, J.; Bulavinov, A.

    2008-01-01

    Over the past 10 years the improvement in the field of microelectronics and computer engineering has led to significant advances in ultrasonic signal processing and image construction techniques that are currently being applied to non-destructive material evaluation. A new phased array technique, called "Sampling Phased Array" has been developed in the Fraunhofer Institute for non-destructive testing [1]. It realizes a unique approach of measurement and processing of ultrasonic signals. The s...

  9. Novel process intensification techniques in solvent extraction. Contributed Paper IT-09

    International Nuclear Information System (INIS)

    Ghosh, S.K.

    2014-01-01

    Process intensification can be briefly described as any chemical engineering development that leads to substantially smaller, cleaner and more energy efficient technology. Process intensification in active nuclear material processing will offer additional benefit in terms of reduced containment volume. The intensification can be realized either by use of novel equipment or by novel operating techniques. Feasibility of hollow fiber (HF) modules and microchannels or microfluidic devices will be explained for their utilization in process intensification of solvent extraction operation in nuclear fuel cycle

  10. Redesigning business processes : a methodology based on simulation and process mining techniques

    NARCIS (Netherlands)

    Maruster, L.; van Beest, N.R.T.P.

    2009-01-01

    Nowadays, organizations have to adjust their business processes along with the changing environment in order to maintain a competitive advantage. Changing a part of the system to support the business process implies changing the entire system, which leads to complex redesign activities. In this

  11. Correlation techniques for the improvement of signal-to-noise ratio in measurements with stochastic processes

    CERN Document Server

    Reddy, V R; Reddy, T G; Reddy, P Y; Reddy, K R

    2003-01-01

    An AC modulation technique is described to convert stochastic signal variations into an amplitude variation and its retrieval through Fourier analysis. It is shown that this AC detection of signals of stochastic processes, when processed through auto- and cross-correlation techniques, improves the signal-to-noise ratio; the correlation techniques serve a similar purpose of frequency and phase filtering as phase-sensitive detection. A few model calculations applied to nuclear spectroscopy measurements such as Angular Correlations, Mossbauer spectroscopy and Pulse Height Analysis reveal considerable improvement in the sensitivity of signal detection. Experimental implementation of the technique is presented in terms of amplitude variations of harmonics representing the derivatives of normal spectra. Improved detection sensitivity to spectral variations is shown to be significant. These correlation techniques are general and can be made applicable to all the fields of particle counting where measurements ar...
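
A minimal numerical illustration of the correlation idea: a weak sinusoidal modulation buried in white noise is recovered by correlating against a clean reference, which filters in frequency and phase much like a phase-sensitive (lock-in) detector. The frequency, amplitude, and noise level below are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
t = np.arange(n)
f = 0.01                            # modulation frequency (cycles/sample)
a = 0.05                            # true modulation amplitude
signal = a * np.sin(2 * np.pi * f * t)
noisy = signal + rng.standard_normal(n)   # per-sample SNR well below 1

ref = np.sin(2 * np.pi * f * t)
# Zero-lag cross-correlation with the reference estimates the amplitude;
# the factor 2/n normalizes sum(sin^2) = n/2.
a_est = 2.0 * np.dot(noisy, ref) / n

assert abs(a_est - a) < 0.02        # recovered despite the heavy noise
```

The noise contribution to `a_est` shrinks as 1/sqrt(n), which is exactly the averaging gain the abstract attributes to correlation processing.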

  12. The tool for the automatic analysis of lexical sophistication (TAALES): version 2.0.

    Science.gov (United States)

    Kyle, Kristopher; Crossley, Scott; Berger, Cynthia

    2017-07-11

    This study introduces the second release of the Tool for the Automatic Analysis of Lexical Sophistication (TAALES 2.0), a freely available and easy-to-use text analysis tool. TAALES 2.0 is housed on a user's hard drive (allowing for secure data processing) and is available on most operating systems (Windows, Mac, and Linux). TAALES 2.0 adds 316 indices to the original tool. These indices are related to word frequency, word range, n-gram frequency, n-gram range, n-gram strength of association, contextual distinctiveness, word recognition norms, semantic network, and word neighbors. In this study, we validated TAALES 2.0 by investigating whether its indices could be used to model both holistic scores of lexical proficiency in free writes and word choice scores in narrative essays. The results indicated that the TAALES 2.0 indices could be used to explain 58% of the variance in lexical proficiency scores and 32% of the variance in word-choice scores. Newly added TAALES 2.0 indices, including those related to n-gram association strength, word neighborhood, and word recognition norms, featured heavily in these predictor models, suggesting that TAALES 2.0 represents a substantial upgrade.
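
One family of such indices can be sketched as mean log word frequency, where lower values indicate rarer, more sophisticated vocabulary. This is a toy illustration in the spirit of TAALES, not its implementation: the tiny frequency table below is invented, whereas TAALES draws on large reference corpora.

```python
import math

# Invented frequency counts for the example only.
FREQ = {"the": 1_000_000, "cat": 40_000, "sat": 25_000,
        "feline": 900, "reposed": 40}

def mean_log_freq(words):
    """Mean log10 corpus frequency over the words found in the table."""
    vals = [math.log10(FREQ[w]) for w in words if w in FREQ]
    return sum(vals) / len(vals)

simple = mean_log_freq("the cat sat".split())
fancy = mean_log_freq("the feline reposed".split())
assert fancy < simple   # rarer words -> lower mean log frequency
```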

  13. Nurturing Opportunity Identification for Business Sophistication in a Cross-disciplinary Study Environment

    Directory of Open Access Journals (Sweden)

    Karine Oganisjana

    2012-12-01

    Full Text Available Opportunity identification is the key element of the entrepreneurial process; therefore the issue of developing this skill in students is a crucial task in contemporary European education, which has recognized entrepreneurship as one of the lifelong learning key competences. The earlier opportunity identification becomes a habitual way of thinking and behavior across a broad range of contexts, the more likely it is that an entrepreneurial disposition will steadily reside in students. In order to nurture opportunity identification in students so as to make them able to organize sophisticated businesses in the future, certain demands ought to be put forward to the teacher as well – the person who is to promote these qualities in their students. The paper reflects some findings of research conducted within the framework of a workplace learning project for the teachers of one of Riga's secondary schools (Latvia). The main goal of the project was to teach the teachers to identify hidden inner links between apparently unrelated things, phenomena and events within the 10th grade study curriculum, connect them together and create new opportunities. The creation and solution of cross-disciplinary tasks were the means for achieving this goal.

  14. Evaluating Acoustic Emission Signals as an in situ process monitoring technique for Selective Laser Melting (SLM)

    Energy Technology Data Exchange (ETDEWEB)

    Fisher, Karl A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Candy, Jim V. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Guss, Gabe [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mathews, M. J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-10-14

    In situ real-time monitoring of the Selective Laser Melting (SLM) process has significant implications for the AM community. The ability to adjust the SLM process parameters during a build (in real time) can save time and money and eliminate expensive material waste. Having a feedback loop in the process would allow the system to potentially ‘fix’ problem regions before the next powder layer is added. In this study we have investigated acoustic emission (AE) phenomena generated during the SLM process and evaluated the results, in terms of a single process parameter, as the basis of an in situ process monitoring technique.

  15. Microstructure characterisation of processed fruits and vegetables by complementary imaging techniques

    NARCIS (Netherlands)

    Voda, A.; Nijsse, J.; Dalen, van G.; As, van H.; Duynhoven, van J.P.M.

    2011-01-01

    The assessment of the microstructural impact of processing on fruits and vegetables is a prerequisite for understanding the relation between processing and textural quality. By combining complementary imaging techniques, one can obtain a multi scale and real-time structural view on the impact of

  16. Making the PACS workstation a browser of image processing software: a feasibility study using inter-process communication techniques.

    Science.gov (United States)

    Wang, Chunliang; Ritter, Felix; Smedby, Orjan

    2010-07-01

    To enhance the functional expandability of a picture archiving and communication system (PACS) workstation and to facilitate the integration of third-party image-processing modules, we propose a browser-server style method. In the proposed solution, the PACS workstation shows the front-end user interface defined in an XML file while the image processing software is running in the background as a server. Inter-process communication (IPC) techniques allow an efficient exchange of image data, parameters, and user input between the PACS workstation and stand-alone image-processing software. Using a predefined communication protocol, the PACS workstation developer or image processing software developer does not need detailed information about the other system, but will still be able to achieve seamless integration between the two systems, and the IPC procedure is totally transparent to the final user. A browser-server style solution was built between OsiriX (PACS workstation software) and MeVisLab (image-processing software). Ten example image-processing modules were easily added to OsiriX by converting existing MeVisLab image processing networks. Image data transfer using shared memory, combined with communication based on IPC techniques, is an appealing method that allows PACS workstation developers and image processing software developers to cooperate while focusing on different interests.
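
The shared-memory exchange at the heart of such a design can be sketched with Python's standard library. This is a generic illustration, not the OsiriX/MeVisLab implementation, and for brevity both "ends" of the exchange live in one process.

```python
import numpy as np
from multiprocessing import shared_memory

# One side (the image-processing "server") writes pixel data into a
# named shared-memory block; the other side (the PACS "workstation")
# attaches to the same block by name and reads the pixels without
# copying them over a socket.
img = np.arange(12, dtype=np.uint8).reshape(3, 4)   # stand-in image

shm = shared_memory.SharedMemory(create=True, size=img.nbytes)
src = np.ndarray(img.shape, dtype=img.dtype, buffer=shm.buf)
src[:] = img                                        # "server" writes

peer = shared_memory.SharedMemory(name=shm.name)    # "workstation" attaches
view = np.ndarray(img.shape, dtype=img.dtype, buffer=peer.buf)
received = view.copy()                              # read out the pixels

del src, view                                       # release buffer views
peer.close()
shm.close()
shm.unlink()
assert (received == img).all()                      # same pixels, zero copy
```

In a real deployment only the block name and the image geometry travel over the IPC control channel; the pixel data itself never leaves shared memory.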

  17. Sophisticated Online Learning Scheme for Green Resource Allocation in 5G Heterogeneous Cloud Radio Access Networks

    KAUST Repository

    Alqerm, Ismail

    2018-01-23

    5G is the upcoming evolution for the current cellular networks that aims at satisfying the future demand for data services. Heterogeneous cloud radio access networks (H-CRANs) are envisioned as a new trend of 5G that exploits the advantages of heterogeneous and cloud radio access networks to enhance spectral and energy efficiency. Remote radio heads (RRHs) are small cells utilized to provide high data rates for users with high quality of service (QoS) requirements, while a high power macro base station (BS) is deployed for coverage maintenance and low QoS user service. Inter-tier interference between macro BSs and RRHs and energy efficiency are critical challenges that accompany resource allocation in H-CRANs. Therefore, we propose an efficient resource allocation scheme using online learning, which mitigates interference and maximizes energy efficiency while maintaining QoS requirements for all users. The resource allocation includes resource blocks (RBs) and power. The proposed scheme is implemented using two approaches: centralized, where the resource allocation is processed at a controller integrated with the baseband processing unit, and decentralized, where macro BSs cooperate to achieve the optimal resource allocation strategy. To foster the performance of such a sophisticated scheme with model-free learning, we consider users' priority in RB allocation and a compact state representation learning methodology to improve the speed of convergence and account for the curse of dimensionality during the learning process. The proposed scheme, including both approaches, was implemented using a software defined radios testbed. The obtained results and simulation results confirm that the proposed resource allocation solution in H-CRANs increases the energy efficiency significantly and maintains users' QoS.
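
As a much-simplified illustration of model-free online learning for RB selection (a toy epsilon-greedy learner, not the paper's scheme), assume a made-up reward that penalizes the RB occupied by the macro tier:

```python
import random

# Toy sketch: an epsilon-greedy learner picks a resource block (RB)
# for a small cell. The reward is an invented energy-efficiency
# score that is low on the RB the macro tier occupies (inter-tier
# interference) and high elsewhere; the learner discovers this
# online, without a model of the interference.
random.seed(0)
K = 4                      # number of resource blocks
macro_rb = 2               # RB heavily used by the macro BS

def reward(rb):
    base = 1.0 if rb != macro_rb else 0.2     # interference penalty
    return base + random.uniform(-0.05, 0.05)

q = [0.0] * K
counts = [0] * K
for step in range(2000):
    # explore with probability 0.1, otherwise exploit the best estimate
    rb = random.randrange(K) if random.random() < 0.1 else q.index(max(q))
    r = reward(rb)
    counts[rb] += 1
    q[rb] += (r - q[rb]) / counts[rb]         # incremental mean update

# The learner ends up valuing the interfered RB least.
assert q[macro_rb] < min(q[i] for i in range(K) if i != macro_rb)
```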

  18. Comparison Of Several Metrology Techniques For In-line Process Monitoring Of Porous SiOCH

    International Nuclear Information System (INIS)

    Fossati, D.; Imbert, G.; Beitia, C.; Yu, L.; Plantier, L.; Volpi, F.; Royer, J.-C.

    2007-01-01

    As porous SiOCH is a widely used inter-metal dielectric for 65 nm nodes and below, the control of its elaboration process by in-line monitoring is necessary to guarantee successful integration of the material. In this paper, the sensitivities of several non-destructive metrology techniques towards the film elaboration process drifts are investigated. It appears that the two steps of the process should be monitored separately and that corona charge method is the most sensitive technique of the review for this application

  19. Using Unified Modelling Language (UML) as a process-modelling technique for clinical-research process improvement.

    Science.gov (United States)

    Kumarapeli, P; De Lusignan, S; Ellis, T; Jones, B

    2007-03-01

    The Primary Care Data Quality programme (PCDQ) is a quality-improvement programme which processes routinely collected general practice computer data. Patient data collected from a wide range of different brands of clinical computer systems are aggregated, processed, and fed back to practices in an educational context to improve the quality of care. Process modelling is a well-established approach used to gain understanding of, systematically appraise, and identify areas of improvement in a business process. Unified modelling language (UML) is a general-purpose modelling technique suited to this task. We used UML to appraise the PCDQ process to see if the efficiency and predictability of the process could be improved. Activity analysis and thinking-aloud sessions were used to collect data to generate UML diagrams. The UML model highlighted the sequential nature of the current process as a barrier to efficiency gains. It also identified the uneven distribution of process controls, lack of symmetric communication channels, critical dependencies among processing stages, and failure to implement all the lessons learned in the piloting phase. It also suggested that improved structured reporting at each stage - especially from the pilot phase - parallel processing of data, and correctly positioned process controls should improve the efficiency and predictability of research projects. Process modelling provided a rational basis for the critical appraisal of a clinical data processing system; its potential may be underutilized within health care.

  20. Towards a Business Process Modeling Technique for Agile Development of Case Management Systems

    Directory of Open Access Journals (Sweden)

    Ilia Bider

    2017-12-01

    Full Text Available A modern organization needs to adapt its behavior to changes in the business environment by changing its Business Processes (BPs) and the corresponding Business Process Support (BPS) systems. One way of achieving such adaptability is via separation of the system code from the process description/model by applying the concept of executable process models. Furthermore, to ease the introduction of changes, such a process model should separate different perspectives, for example, the control-flow, human resources, and data perspectives, from each other. In addition, for developing a completely new process, it should be possible to start with a reduced process model to get a BPS system quickly running, and then continue to develop it in an agile manner. This article consists of two parts: the first sets requirements on modeling techniques that could be used in tools that support agile development of BPs and BPS systems. The second part suggests a business process modeling technique that makes it possible to start modeling with the data/information perspective, which would be appropriate for processes supported by Case or Adaptive Case Management (CM/ACM) systems. In a model produced by this technique, called a data-centric business process model, a process instance/case is defined as a sequence of states in a specially designed instance database, while the process model is defined as a set of rules that set restrictions on allowed states and transitions between them. The article details the background for the project of developing the data-centric process modeling technique, presents the outline of the structure of the model, and gives formal definitions for a substantial part of the model.
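
The state-and-rules idea can be sketched directly: a case is a sequence of states, and the model is a rule set restricting which transitions are allowed. The concrete states and transition rules below are invented for illustration.

```python
# Hedged sketch of a data-centric process model: allowed states and
# transitions as a rule table, a case as a sequence of states.
# These states and rules are invented examples, not the article's.
ALLOWED = {
    "registered": {"assessed"},
    "assessed": {"approved", "rejected"},
    "approved": set(),
    "rejected": set(),
}

def valid_case(states):
    """A case is valid if it starts in the initial state and every
    step follows an allowed transition."""
    if not states or states[0] != "registered":
        return False
    return all(b in ALLOWED[a] for a, b in zip(states, states[1:]))

assert valid_case(["registered", "assessed", "approved"])
assert not valid_case(["registered", "approved"])   # skips assessment
```

Because the rules live in data rather than code, a new restriction is a table edit, which is what makes the agile, incremental development the article argues for feasible.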

  1. Process techniques of charge transfer time reduction for high speed CMOS image sensors

    International Nuclear Information System (INIS)

    Cao Zhongxiang; Li Quanliang; Han Ye; Qin Qi; Feng Peng; Liu Liyuan; Wu Nanjian

    2014-01-01

    This paper proposes pixel process techniques to reduce the charge transfer time in high speed CMOS image sensors. These techniques increase the lateral conductivity of the photo-generated carriers in a pinned photodiode (PPD) and the voltage difference between the PPD and the floating diffusion (FD) node by controlling and optimizing the N doping concentration in the PPD and the threshold voltage of the reset transistor, respectively. The techniques effectively shorten the charge transfer time from the PPD to the FD node. The proposed process techniques do not need extra masks and do not harm the fill factor. A sub-array of 32 × 64 pixels was designed and implemented in the 0.18 μm CIS process, with five implantation conditions splitting the N region in the PPD. The simulation and measured results demonstrate that the charge transfer time can be decreased by using the proposed techniques. Comparing the charge transfer times of pixels with the different implantation conditions of the N region, a charge transfer time of 0.32 μs was achieved and image lag was reduced by 31% using the proposed process techniques. (semiconductor devices)

  2. Applications of process improvement techniques to improve workflow in abdominal imaging.

    Science.gov (United States)

    Tamm, Eric Peter

    2016-03-01

    Major changes in the management and funding of healthcare are underway that will markedly change the way radiology studies will be reimbursed. The result will be the need to deliver radiology services in a highly efficient manner while maintaining quality. The science of process improvement provides a practical approach to improve the processes utilized in radiology. This article will address in a step-by-step manner how to implement process improvement techniques to improve workflow in abdominal imaging.

  3. Control System Design for Cylindrical Tank Process Using Neural Model Predictive Control Technique

    Directory of Open Access Journals (Sweden)

    M. Sridevi

    2010-10-01

    Chemical manufacturing and process industries require innovative technologies for process identification. This paper deals with model identification and control of a cylindrical tank process. Model identification of the process was done using the ARMAX technique, and a neural model predictive controller (NMPC) was designed for the identified model. The performance of the controllers was evaluated using MATLAB software. The NMPC controller was compared with a Smith predictor controller and an IMC controller on rise time, settling time, overshoot, and ISE, and the NMPC controller was found to be better suited for this process.
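    The identification step can be illustrated with a stripped-down sketch. The paper uses ARMAX; the following instead fits the simpler ARX form y[k] = a·y[k-1] + b·u[k-1] by least squares on simulated data, so every signal and parameter here is made up:

```python
# Illustrative ARX identification (ARMAX minus the moving-average noise term).

def identify_arx(u, y):
    """Fit y[k] = a*y[k-1] + b*u[k-1] by solving the 2x2 normal equations."""
    s_yy = s_uu = s_yu = c_y = c_u = 0.0
    for k in range(1, len(y)):
        y1, u1 = y[k - 1], u[k - 1]
        s_yy += y1 * y1
        s_uu += u1 * u1
        s_yu += y1 * u1
        c_y += y1 * y[k]    # cross-terms with the regressand
        c_u += u1 * y[k]
    det = s_yy * s_uu - s_yu * s_yu
    a = (c_y * s_uu - c_u * s_yu) / det
    b = (c_u * s_yy - c_y * s_yu) / det
    return a, b

# Simulate a known first-order plant and recover its parameters.
a_true, b_true = 0.8, 0.5
u = [1.0 if k % 2 == 0 else -1.0 for k in range(50)]   # alternating test input
y = [0.0]
for k in range(1, 50):
    y.append(a_true * y[k - 1] + b_true * u[k - 1])
a_hat, b_hat = identify_arx(u, y)
assert abs(a_hat - a_true) < 1e-6 and abs(b_hat - b_true) < 1e-6
```

    With noiseless data the fit is exact; real identification would add the noise model and validate against held-out data.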

  4. The musicality of non-musicians: an index for assessing musical sophistication in the general population.

    Directory of Open Access Journals (Sweden)

    Daniel Müllensiefen

    Musical skills and expertise vary greatly in Western societies. Individuals can differ in their repertoire of musical behaviours as well as in the level of skill they display for any single musical behaviour. The types of musical behaviours we refer to here are broad, ranging from performance on an instrument and listening expertise, to the ability to employ music in functional settings or to communicate about music. In this paper, we first describe the concept of 'musical sophistication' which can be used to describe the multi-faceted nature of musical expertise. Next, we develop a novel measurement instrument, the Goldsmiths Musical Sophistication Index (Gold-MSI), to assess self-reported musical skills and behaviours on multiple dimensions in the general population using a large Internet sample (n = 147,636). Thirdly, we report results from several lab studies, demonstrating that the Gold-MSI possesses good psychometric properties, and that self-reported musical sophistication is associated with performance on two listening tasks. Finally, we identify occupation, occupational status, age, gender, and wealth as the main socio-demographic factors associated with musical sophistication. Results are discussed in terms of theoretical accounts of implicit and statistical music learning and with regard to social conditions of sophisticated musical engagement.

  5. Moral foundations and political attitudes: The moderating role of political sophistication.

    Science.gov (United States)

    Milesi, Patrizia

    2016-08-01

    Political attitudes can be associated with moral concerns. This research investigated whether people's level of political sophistication moderates this association. Based on the Moral Foundations Theory, this article examined whether political sophistication moderates the extent to which reliance on moral foundations, as categories of moral concerns, predicts judgements about policy positions. With this aim, two studies examined four policy positions shown by previous research to be best predicted by the endorsement of Sanctity, that is, the category of moral concerns focused on the preservation of physical and spiritual purity. The results showed that reliance on Sanctity predicted political sophisticates' judgements, as opposed to those of unsophisticates, on policy positions dealing with equal rights for same-sex and unmarried couples and with euthanasia. Political sophistication also interacted with Fairness endorsement, which includes moral concerns for equal treatment of everybody and reciprocity, in predicting judgements about equal rights for unmarried couples, and interacted with reliance on Authority, which includes moral concerns for obedience and respect for traditional authorities, in predicting opposition to stem cell research. Those findings suggest that, at least for these particular issues, endorsement of moral foundations can be associated with political attitudes more strongly among sophisticates than unsophisticates. © 2015 International Union of Psychological Science.

  6. Reading wild minds: A computational assay of Theory of Mind sophistication across seven primate species.

    Directory of Open Access Journals (Sweden)

    Marie Devaine

    2017-11-01

    Theory of Mind (ToM), i.e. the ability to understand others' mental states, endows humans with highly adaptive social skills such as teaching or deceiving. Candidate evolutionary explanations have been proposed for the unique sophistication of human ToM among primates. For example, the Machiavellian intelligence hypothesis states that the increasing complexity of social networks may have induced a demand for sophisticated ToM. This type of scenario ignores neurocognitive constraints that may eventually be crucial limiting factors for ToM evolution. In contradistinction, the cognitive scaffolding hypothesis asserts that a species' opportunity to develop sophisticated ToM is mostly determined by its general cognitive capacity (on which ToM is scaffolded). However, the actual relationships between ToM sophistication and either brain volume (a proxy for general cognitive capacity) or social group size (a proxy for social network complexity) are unclear. Here, we let 39 individuals sampled from seven non-human primate species (lemurs, macaques, mangabeys, orangutans, gorillas and chimpanzees) engage in simple dyadic games against artificial ToM players (via a familiar human caregiver). Using computational analyses of primates' choice sequences, we found that the probability of exhibiting a ToM-compatible learning style is mainly driven by species' brain volume (rather than by social group size). Moreover, primates' social cognitive sophistication culminates in a precursor form of ToM, which still falls short of fully developed human ToM abilities.

  7. The musicality of non-musicians: an index for assessing musical sophistication in the general population.

    Science.gov (United States)

    Müllensiefen, Daniel; Gingras, Bruno; Musil, Jason; Stewart, Lauren

    2014-01-01

    Musical skills and expertise vary greatly in Western societies. Individuals can differ in their repertoire of musical behaviours as well as in the level of skill they display for any single musical behaviour. The types of musical behaviours we refer to here are broad, ranging from performance on an instrument and listening expertise, to the ability to employ music in functional settings or to communicate about music. In this paper, we first describe the concept of 'musical sophistication' which can be used to describe the multi-faceted nature of musical expertise. Next, we develop a novel measurement instrument, the Goldsmiths Musical Sophistication Index (Gold-MSI) to assess self-reported musical skills and behaviours on multiple dimensions in the general population using a large Internet sample (n = 147,636). Thirdly, we report results from several lab studies, demonstrating that the Gold-MSI possesses good psychometric properties, and that self-reported musical sophistication is associated with performance on two listening tasks. Finally, we identify occupation, occupational status, age, gender, and wealth as the main socio-demographic factors associated with musical sophistication. Results are discussed in terms of theoretical accounts of implicit and statistical music learning and with regard to social conditions of sophisticated musical engagement.

  8. Study of the aging processes in polyurethane adhesives using thermal treatment and differential calorimetric, dielectric, and mechanical techniques ; 1, identifying the aging processes ; 2, quantifying the aging effect

    CERN Document Server

    Althouse, L P

    1979-01-01

    Study of the aging processes in polyurethane adhesives using thermal treatment and differential calorimetric, dielectric, and mechanical techniques ; 1, identifying the aging processes ; 2, quantifying the aging effect

  9. An image processing technique for the radiographic assessment of vertebral derangements

    Energy Technology Data Exchange (ETDEWEB)

    Breen, A.C. (Anglo-European Coll. of Chiropractic, Bournemouth (UK)); Allen, R. (Southampton Univ. (UK). Dept. of Mechanical Engineering); Morris, A. (Odstock Hospital, Salisbury (UK). Dept. of Radiology)

    1989-01-01

    A technique for measuring inter-vertebral motion by the digitization and processing of intensifier images is described. The technique reduces the time and X-ray dosage currently required to make such assessments. The errors associated with computing kinematic indices at increments of coronal plane rotations in the lumbar spine have been calculated using a calibration model designed to produce a facsimile of in vivo conditions in terms of image quality and geometric distortion. (author).

  10. Application of hydrometallurgy techniques in quartz processing and purification: a review

    Science.gov (United States)

    Lin, Min; Lei, Shaomin; Pei, Zhenyu; Liu, Yuanyuan; Xia, Zhangjie; Xie, Feixiang

    2018-04-01

    Although there have been numerous studies on the separation and purification of metallic minerals by hydrometallurgy techniques, applications of these chemical techniques to the separation and purification of non-metallic minerals are rarely reported. This paper reviews disparate areas of study in the processing and purification of quartz (a typical non-metallic ore) in an attempt to summarize current work, as well as to suggest potential for future consolidation in the field. The review encompasses chemical techniques of quartz processing, including the current situation, progress, leaching mechanisms, scopes of application, and advantages and drawbacks of micro-bioleaching, high temperature leaching, high temperature pressure leaching, and catalyzed high temperature pressure leaching. Traditional leaching techniques, including micro-bioleaching and high temperature leaching, cannot meet the modern glass industry's demands for quartz concentrate quality, because the quartz products have to be further processed. High temperature pressure leaching and catalyzed high temperature pressure leaching provide new ways to produce high-grade quartz sand in a single process with lower acid consumption. Furthermore, catalyzed high temperature pressure leaching achieves effective purification of quartz with extremely low acid consumption (without using HF or any fluoride). It is proposed that, by integrating the different chemical processes of quartz processing and clarifying the leaching mechanisms and their scopes of application, the field would benefit.

  11. A study on hybrid split-spectrum processing technique for enhanced reliability in ultrasonic signal analysis

    International Nuclear Information System (INIS)

    Huh, Hyung; Koo, Kil Mo; Cheong, Yong Moo; Kim, G. J.

    1995-01-01

    Many signal-processing techniques have been found useful in ultrasonic nondestructive evaluation (NDE). Among the most popular are signal averaging, spatial compounding, matched filters, and homomorphic processing. One significant newer process is split-spectrum processing (SSP), which can be equally useful for signal-to-noise ratio (SNR) improvement and grain characterization in several engineering materials. The purpose of this paper is to explore the utility of SSP in ultrasonic NDE. A wide variety of engineering problems are reviewed and suggestions for implementation of the technique are provided. SSP exploits the frequency-dependent response of the interfering coherent noise produced by unresolvable scatterers in the resolution range cell of a transducer. It is implemented by splitting the frequency spectrum of the received signal using Gaussian bandpass filters. The theoretical basis for the potential of SSP for grain characterization in SUS 304 material is discussed, and some experimental evidence for the feasibility of the approach is presented, including results of SNR enhancement in signals obtained from four real samples of SUS 304. The influence of various processing parameters on the performance of the processing technique is also discussed. The minimization algorithm, which provides excellent SNR enhancement when used either in conjunction with other SSP algorithms like polarity-check or by itself, is also presented.
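    As a toy illustration of the minimization algorithm (not the authors' implementation; the signal, band centers, and widths below are all invented), the following sketch splits a short synthetic A-scan into Gaussian bands and keeps the per-sample minimum envelope. A broadband flaw echo survives in every band, while narrowband grain-like noise vanishes in at least one band and is therefore suppressed:

```python
import cmath
import math

def dft(x):
    """Naive DFT; adequate for short illustrative signals."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def ssp_minimization(signal, centers, width):
    """Split-spectrum processing with minimization: apply Gaussian bandpass
    weights in the frequency domain, reconstruct each band's envelope, and
    keep the smallest envelope value across bands at every sample."""
    N = len(signal)
    X = dft(signal)
    envelopes = []
    for c in centers:
        H = [math.exp(-((k - c) ** 2) / (2.0 * width ** 2)) for k in range(N)]
        env = [abs(sum(X[k] * H[k] * cmath.exp(2j * cmath.pi * k * n / N)
                       for k in range(N))) / N for n in range(N)]
        envelopes.append(env)
    return [min(e[n] for e in envelopes) for n in range(N)]

# Synthetic A-scan: broadband flaw echo (impulse) at sample 40, buried in
# narrowband "grain" noise concentrated around frequency bin 10.
N = 128
grain = [0.5 * math.sin(2 * math.pi * 10 * n / N) for n in range(N)]
signal = [g + (1.0 if n == 40 else 0.0) for n, g in enumerate(grain)]
out = ssp_minimization(signal, centers=[20, 26, 32, 38], width=3.0)
assert 35 <= max(range(N), key=out.__getitem__) <= 45   # flaw location kept
assert out[40] > 10 * max(out[80:121])                  # grain suppressed
```

    A practical implementation would use an FFT and tune the band placement to the transducer passband; only the recombination logic is the point here.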

  12. A Study on Hybrid Split-Spectrum Processing Technique for Enhanced Reliability in Ultrasonic Signal Analysis

    International Nuclear Information System (INIS)

    Huh, H.; Koo, K. M.; Kim, G. J.

    1996-01-01

    Many signal-processing techniques have been found useful in ultrasonic nondestructive evaluation. Among the most popular are signal averaging, spatial compounding, matched filters and homomorphic processing. One significant newer process is split-spectrum processing (SSP), which can be equally useful for signal-to-noise ratio (SNR) improvement and grain characterization in several specimens. The purpose of this paper is to explore the utility of SSP in ultrasonic NDE. A wide variety of engineering problems are reviewed, and suggestions for implementation of the technique are provided. SSP exploits the frequency-dependent response of the interfering coherent noise produced by unresolvable scatterers in the resolution range cell of a transducer. It is implemented by splitting the frequency spectrum of the received signal using Gaussian bandpass filters. The theoretical basis for the potential of SSP for grain characterization in SUS 304 material is discussed, and some experimental evidence for the feasibility of the approach is presented, including results of SNR enhancement in signals obtained from four real samples of SUS 304. The influence of various processing parameters on the performance of the processing technique is also discussed. The minimization algorithm, which provides an excellent SNR enhancement when used either in conjunction with other SSP algorithms like polarity-check or by itself, is also presented

  13. Ignition and monitoring technique for plasma processing of multicell superconducting radio-frequency cavities

    Science.gov (United States)

    Doleans, Marc

    2016-12-01

    An in-situ plasma processing technique has been developed at the Spallation Neutron Source (SNS) to improve the performance of the superconducting radio-frequency (SRF) cavities in operation. The technique uses a low-density reactive neon-oxygen plasma at room-temperature to improve the surface work function, to help remove adsorbed gases on the RF surface, and to reduce its secondary emission yield. SNS SRF cavities have six accelerating cells and the plasma typically ignites in the cell where the electric field is the highest. This article details the technique to ignite and monitor the plasma in each cell of the SNS cavities.

  14. On the Interface Between Automated Predictive Demand Planning Techniques and Humans in Collaborative Planning Processes

    DEFF Research Database (Denmark)

    Schorsch, Timm; Wallenburg, Carl Marcus; Wieland, Andreas

    The introduction of big data and predictive analytics techniques in the supply chain context constitutes a "hot topic" in both research and practice. Without arguing against this euphoria, this paper critically assesses the consequences of confronting human actors with an increasing usage of these techniques. The underlying case of this paper refers to collaborative supply chain processes that are predestined for integrating new big data and predictive analytics techniques, by building a theoretical framework for deriving sound hypotheses and introducing and testing the experimental design…

  15. Enhancement of crack detection in stud bolts of nuclear reactor by ultrasonic signal processing technique

    International Nuclear Information System (INIS)

    Lee, J.H.; Oh, W.D.; Choi, S.W.; Park, M.H.

    2004-01-01

    The stud bolt is one of the most critical parts for the safety of reactor vessels in nuclear power plants. However, in applying ultrasonic techniques for crack detection in stud bolts, one difficulty encountered is distinguishing crack signals from the signals reflected from the threaded part of the bolt. In this study, a shadow effect technique combined with a new signal processing method is investigated to enhance the detectability of small cracks initiated at the roots of threads in stud bolts. The key idea of the signal processing is based on the fact that the waveforms reflected from the threads are uniform, since all the threads in a bolt have the same shape; if a crack exists in a thread, the flaw signal differs from the reference signal. It is demonstrated that small flaws are efficiently detected by the novel ultrasonic technique combined with this new signal processing concept. (author)
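    The uniform-thread idea can be sketched as a simple reference-subtraction check. The waveforms and threshold below are invented for illustration; a real system would first align the measured waveform with the stored reference:

```python
# Hypothetical sketch: subtract the defect-free thread echo pattern; a large
# residual between the echoes suggests an extra reflector, i.e. a crack.

def thread_residual(measured, reference):
    """Sample-wise difference between measured and reference waveforms."""
    return [m - r for m, r in zip(measured, reference)]

def has_flaw(measured, reference, threshold):
    """Flag a flaw if any residual sample exceeds the threshold."""
    return max(abs(d) for d in thread_residual(measured, reference)) > threshold

reference = [0.0, 0.8, 0.2, 0.0, 0.8, 0.2, 0.0]   # identical thread echoes
clean     = [0.0, 0.8, 0.2, 0.0, 0.8, 0.2, 0.0]
cracked   = [0.0, 0.8, 0.2, 0.5, 0.8, 0.2, 0.0]   # extra echo between threads
assert not has_flaw(clean, reference, threshold=0.3)
assert has_flaw(cracked, reference, threshold=0.3)
```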

  16. Process acceptance and adjustment techniques for Swiss automatic screw machine parts. Final report

    International Nuclear Information System (INIS)

    Robb, J.M.

    1976-01-01

    Product tolerance requirements for small, cylindrical piece parts produced on Swiss automatic screw machines have progressed to the reliability limits of inspection equipment. The miniature size, configuration, and tolerance requirements (plus or minus 0.0001 in. (0.00254 mm)) of these parts preclude the use of screening techniques to accept product or adjust processes during setup and production runs; therefore, existing means of product acceptance and process adjustment must be refined or new techniques must be developed. The purpose of this endeavor has been to determine the benefits gained through the implementation of a process acceptance technique (PAT) on Swiss automatic screw machine processes. PAT is a statistical approach developed for accepting product and centering processes for parts produced by selected, controlled processes. Through this endeavor a determination has been made of the conditions under which PAT can benefit a controlled process, and of some specific types of screw machine processes to which PAT could be applied. However, it was also determined that PAT, if used indiscriminately, may become a record-keeping burden when applied to more than one dimension at a given machining operation
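    As an illustration only (the report's actual PAT procedure is not reproduced here), a statistical acceptance-and-centering step of this flavor might compare a sample mean against the nominal dimension and report the offset needed to re-center the process:

```python
# Hypothetical acceptance/centering check; nominal, tolerance, and sample
# values are made up and in inches.

def process_acceptance(sample, nominal, tol):
    """Accept if the sample mean lies within +/- tol of nominal; otherwise
    the returned offset tells the operator how far to shift the process."""
    mean = sum(sample) / len(sample)
    offset = nominal - mean
    return abs(offset) <= tol, offset

centered = [0.10004, 0.10006, 0.10005, 0.10005]
drifted  = [0.10018, 0.10022, 0.10020, 0.10020]
ok1, _ = process_acceptance(centered, nominal=0.1000, tol=0.0001)
ok2, offset = process_acceptance(drifted, nominal=0.1000, tol=0.0001)
assert ok1
assert not ok2 and offset < 0      # process must be shifted downward
```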

  17. Demonstration of laser processing technique combined with water jet technique for retrieval of fuel debris at Fukushima Daiichi Nuclear Power Station

    International Nuclear Information System (INIS)

    Hanari, Toshihide; Takebe, Toshihiko; Yamada, Tomonori; Daido, Hiroyuki; Ishizuka, Ippei; Ohmori, Shinya; Kurosawa, Koichi; Sasaki, Go; Nakada, Masahiro; Sakai, Hideaki

    2017-01-01

    In the decommissioning of Fukushima Daiichi Nuclear Power Station, a process for retrieving the fuel debris in the Primary Containment Vessel by remote operation is one of the key issues. In this process, preventing the spread of radioactive materials is an important consideration, and any applicable technique must also maintain reasonable processing efficiency. We propose the combined technique, using a laser light and a water jet, as a retrieval technique for the fuel debris. The laser processing technique combined with a repetitive pulsed water jet can perform efficient retrieval processing. Our experimental results encourage us to promote further development of the technique towards real application at Fukushima Daiichi Nuclear Power Station. (author)

  18. Differential ethnic associations between maternal flexibility and play sophistication in toddlers born very low birth weight

    Science.gov (United States)

    Erickson, Sarah J.; Montague, Erica Q.; Maclean, Peggy C.; Bancroft, Mary E.; Lowe, Jean R.

    2013-01-01

    Children born very low birth weight (VLBW) are at risk for delays in the development of self-regulation and effective functional skills, and play serves as an important avenue of early intervention. The current study investigated associations between maternal flexibility and toddler play sophistication in Caucasian, Spanish-speaking Hispanic, English-speaking Hispanic, and Native American toddlers (18-22 months adjusted age) in a cross-sectional cohort of 73 toddlers born VLBW and their mothers. We found that the association between maternal flexibility and toddler play sophistication differed by ethnicity (F(3,65) = 3.34, p = .02). In particular, Spanish-speaking Hispanic dyads evidenced a significant positive association between maternal flexibility and play sophistication of medium effect size. Results for Native Americans were parallel to those of Spanish-speaking Hispanic dyads: the relationship between flexibility and play sophistication was positive and of small-to-medium effect size. Findings indicate that for Caucasians and English-speaking Hispanics, flexibility evidenced a non-significant (negative and small effect size) association with toddler play sophistication. Significant follow-up contrasts revealed that the associations for Caucasian and English-speaking Hispanic dyads were significantly different from those of the other two ethnic groups. Results remained unchanged after adjusting for the amount of maternal language, an index of maternal engagement and stimulation; and after adjusting for birth weight, gestational age, gender, test age, and cognitive ability, as well as maternal age, education, and income. Our results provide preliminary evidence that ethnicity and acculturation may mediate the association between maternal interactive behavior, such as flexibility, and toddler developmental outcomes, as indexed by play sophistication.
Addressing these association differences is particularly important in children born VLBW because interventions targeting parent interaction strategies such as

  19. Processing ultrafine-grained Aluminum alloy using Multi-ECAP-Conform technique

    International Nuclear Information System (INIS)

    Fakhretdinova, Elvira; Raab, Georgy; Valiev, Ruslan; Ryzhikov, Oleg

    2014-01-01

    The stress-strain state (SSS), contact and force parameters of a new SPD technique, Multi-ECAP-Conform, have been studied. The new technique ensures a high level of accumulated strain (ε = 4 to 5) per processing cycle. Physical and computer modeling by the finite element method in Deform-3D software was applied to evaluate the parameters. It is shown that the results of physical and computer modeling correlate with each other. The equipment has been upgraded, and experimental samples of an Al-Mg-Si system alloy have been processed

  20. The Value of Multivariate Model Sophistication: An Application to pricing Dow Jones Industrial Average options

    DEFF Research Database (Denmark)

    Rombouts, Jeroen V.K.; Stentoft, Lars; Violante, Francesco

    We assess the predictive accuracy of a large number of multivariate volatility models in terms of pricing options on the Dow Jones Industrial Average. We measure the value of model sophistication in terms of dollar losses by considering a set of 248 multivariate models that differ… Exchanging the Gaussian innovation for a Laplace innovation assumption improves the pricing in a smaller way. Apart from investigating directly the value of model sophistication in terms of dollar losses, we also use the model confidence set approach to statistically infer the set of models that delivers the best pricing performance.

  1. Cognitive ability rivals the effect of political sophistication on ideological voting

    DEFF Research Database (Denmark)

    Hebbelstrup Rye Rasmussen, Stig

    2016-01-01

    This article examines the impact of cognitive ability on ideological voting. We find, using a US sample and a Danish sample, that the effect of cognitive ability rivals the effect of political sophistication, traditionally the strongest predictor of ideological voting. Furthermore, the results are consistent with the effect of cognitive ability being partly mediated by political sophistication. Much of the effect of cognitive ability remains, however, and is not explained by differences in education or Openness to experience either. The implications of these results for democratic theory are discussed.

  2. Applying traditional signal processing techniques to social media exploitation for situational understanding

    Science.gov (United States)

    Abdelzaher, Tarek; Roy, Heather; Wang, Shiguang; Giridhar, Prasanna; Al Amin, Md. Tanvir; Bowman, Elizabeth K.; Kolodny, Michael A.

    2016-05-01

    Signal processing techniques such as filtering, detection, estimation and frequency domain analysis have long been applied to extract information from noisy sensor data. This paper describes the exploitation of these signal processing techniques to extract information from social networks, such as Twitter and Instagram. Specifically, we view social networks as noisy sensors that report events in the physical world. We then present a data processing stack for detection, localization, tracking, and veracity analysis of reported events using social network data. We show using a controlled experiment that the behavior of social sources as information relays varies dramatically depending on context. In benign contexts, there is general agreement on events, whereas in conflict scenarios, a significant amount of collective filtering is introduced by conflicted groups, creating a large data distortion. We describe signal processing techniques that mitigate such distortion, resulting in meaningful approximations of actual ground truth, given noisy reported observations. Finally, we briefly present an implementation of the aforementioned social network data processing stack in a sensor network analysis toolkit, called Apollo. Experiences with Apollo show that our techniques are successful at identifying and tracking credible events in the physical world.

  3. HARM processing techniques for MEMS and MOEMS devices using bonded SOI substrates and DRIE

    Science.gov (United States)

    Gormley, Colin; Boyle, Anne; Srigengan, Viji; Blackstone, Scott C.

    2000-08-01

    Silicon-on-Insulator (SOI) MEMS devices (1) are rapidly gaining popularity in realizing numerous solutions for MEMS, especially in the optical and inertial application fields. BCO recently developed a DRIE trench etch, utilizing the Bosch process, and a refill process for high-voltage dielectric isolation integrated circuits on thick SOI substrates. In this paper we present our most recently developed DRIE processes for MEMS and MOEMS devices. These advanced etch techniques are first described and their integration with silicon bonding demonstrated. This has enabled process flows that are currently being utilized to develop optical router and filter products for fiber-optic telecommunications and high-precision accelerometers.

  4. Multi-disciplinary communication networks for skin risk assessment in nursing homes with high IT sophistication.

    Science.gov (United States)

    Alexander, Gregory L; Pasupathy, Kalyan S; Steege, Linsey M; Strecker, E Bradley; Carley, Kathleen M

    2014-08-01

    The role of nursing home (NH) information technology (IT) in quality improvement has not been clearly established, and its impacts on communication between caregivers and on patient outcomes in these settings deserve further attention. In this research, we describe a mixed-method approach to explore communication strategies used by healthcare providers for resident skin risk in NHs with high IT sophistication (ITS). The sample included NHs participating in the statewide survey of ITS. We incorporated rigorous observation of 8- and 12-h shifts, and focus groups, to identify how NH IT and a range of synchronous and asynchronous tools are used. Social network analysis tools and qualitative analysis were used to analyze the data and identify relationships between ITS dimensions and communication interactions between care providers. Two of the nine ITS dimensions (resident care-technological and administrative activities-technological) and total ITS were significantly negatively correlated with the number of unique interactions: as more processes in resident care and administrative activities are supported by technology, the number of observed unique interactions falls. Additionally, four thematic areas emerged from staff focus groups that demonstrate how important IT is to resident care in these facilities: providing resident-centered care, teamwork and collaboration, maintaining safety and quality, and using standardized information resources. Our findings in this study confirm prior research that as technology support (of resident care and administrative activities) and overall ITS increase, observed interactions between staff members decrease. Conversations during staff interviews focused on how technology facilitated resident-centered care through enhanced information sharing, greater virtual collaboration between team members, and improved care delivery.
These results provide evidence for improving the design and implementation of IT in long term care systems to support

  5. Image Processing Based Signature Verification Technique to Reduce Fraud in Financial Institutions

    Directory of Open Access Journals (Sweden)

    Hussein Walid

    2016-01-01

    Handwritten signatures are broadly utilized for personal verification in financial institutions, which creates the need for a robust automatic signature verification tool to reduce fraud in all related financial transaction sectors. This paper proposes an online, robust, and automatic signature verification technique using recent advances in image processing and machine learning. Once the image of a handwritten signature for a customer is captured, several pre-processing steps are performed on it, including filtration and detection of the signature edges. Afterwards, a feature extraction process is applied to the image to extract Speeded-Up Robust Features (SURF) and Scale-Invariant Feature Transform (SIFT) features. Finally, a verification process is developed and applied to compare the extracted image features with those stored in the database for the specified customer. Results indicate the high accuracy, simplicity, and rapidity of the developed technique, which are the main criteria by which a signature verification tool is judged in banking and other financial institutions.
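    The final matching step can be sketched as a nearest-neighbour comparison of descriptor vectors. A real system would extract SURF/SIFT descriptors from the image with a vision library; the 2-D descriptors and threshold below are invented stand-ins:

```python
import math

def euclid(a, b):
    """Euclidean distance between two descriptor vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def verify(candidate_desc, enrolled_desc, threshold):
    """Match each candidate descriptor to its nearest enrolled descriptor and
    accept the signature if the mean nearest-neighbour distance is small."""
    dists = [min(euclid(c, e) for e in enrolled_desc) for c in candidate_desc]
    return sum(dists) / len(dists) <= threshold

enrolled = [[0.1, 0.9], [0.8, 0.2], [0.5, 0.5]]   # made-up SIFT-like vectors
genuine  = [[0.12, 0.88], [0.79, 0.22]]
forgery  = [[0.9, 0.9], [0.1, 0.1]]
assert verify(genuine, enrolled, threshold=0.05)
assert not verify(forgery, enrolled, threshold=0.05)
```

    In practice the threshold would be learned per customer from enrollment samples rather than fixed by hand.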

  6. Generalized hardware post-processing technique for chaos-based pseudorandom number generators

    KAUST Repository

    Barakat, Mohamed L.

    2013-06-01

    This paper presents a generalized post-processing technique for enhancing the pseudorandomness of digital chaotic oscillators through a nonlinear XOR-based operation with rotation and feedback. The technique allows full utilization of the chaotic output as a pseudorandom number generator and improves throughput without a significant area penalty. A digital design of a third-order chaotic system with maximum function nonlinearity is presented with verified chaotic dynamics. The proposed post-processing technique eliminates statistical degradation in all output bits, thus maximizing throughput compared to other processing techniques. Furthermore, the technique is applied to several fully digital chaotic oscillators with performance surpassing previously reported systems in the literature. The enhancement in randomness is further examined in a simple image encryption application, resulting in better security performance. The system is verified through experiments on a Xilinx Virtex 4 FPGA with throughput up to 15.44 Gbit/s and logic utilization less than 0.84% for 32-bit implementations. © 2013 ETRI.
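    A generic XOR-rotate-feedback whitening step of the kind described can be sketched in software. The word width, rotation amount, and feedback arrangement below are illustrative choices, not the paper's exact circuit:

```python
def rotl32(x, r):
    """Rotate a 32-bit word left by r bits."""
    return ((x << r) | (x >> (32 - r))) & 0xFFFFFFFF

def postprocess(raw_words, rotation=7):
    """XOR each raw chaotic output word with a rotated copy of itself and
    with a rotated copy of the previous post-processed word (feedback)."""
    out, feedback = [], 0
    for w in raw_words:
        y = (w ^ rotl32(w, rotation) ^ feedback) & 0xFFFFFFFF
        feedback = rotl32(y, rotation)
        out.append(y)
    return out

# A heavily biased source (constant high bits) still yields distinct outputs:
raw = [0xFFFF0000 ^ i for i in range(4)]
ws = postprocess(raw)
assert len(set(ws)) == len(ws)                      # outputs all distinct
assert all(0 <= w <= 0xFFFFFFFF for w in ws)        # stays within 32 bits
```

    In hardware the same structure is a few XOR gates and a wire rotation, which is why the area penalty is small.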

  7. Development of an Advanced, Automatic, Ultrasonic NDE Imaging System via Adaptive Learning Network Signal Processing Techniques

    Science.gov (United States)

    1981-03-13

    ... in concert with a sophisticated detector has ... and New York, 1969. Whalen, M.F., L.J. O'Brien, and A.N. Mucciardi, "Application of Adaptive Learning Networks for the Characterization of Two

  8. Assessment of environmental impacts in comfortable furniture production process using life cycle assessment (LCA) technique

    Directory of Open Access Journals (Sweden)

    hejhar abbasi

    2016-12-01

    Full Text Available The furniture industry annually releases a large amount of volatile organic compounds into the environment due to the use of adhesives, textiles, paints, and coating materials. There are several methods to measure the pollution load and the environmental impacts; life cycle assessment (LCA) is one of the best. LCA is a technique in which all environmental impacts related to a product are assessed over its entire life cycle, from cradle to grave, and it can ultimately be used to improve the production process and to prevent adverse environmental impacts. In summary, the use of this technique is a basis for sustainable development and for improving social, economic, and environmental indices. This study focused on collecting comprehensive life cycle inventory data for comfortable furniture in two different production processes (B1 and B2) located in Tehran province, and analyzed the environmental impacts during the production process as a gate-to-gate investigation. The results revealed that emissions in production process B1 were higher than those of production process B2, because in case B1 basic operations such as sawing and frame assembling were carried out in the same unit as the final operations. Textile production and usage, and polyurethane foam, were identified as the main hotspots, respectively. Moreover, the results showed that the comfortable furniture production process has the highest effects on ecosystem quality, human health, and resources (fossil fuels and minerals), respectively.

  9. The Effective Ransomware Prevention Technique Using Process Monitoring on Android Platform

    Directory of Open Access Journals (Sweden)

    Sanggeun Song

    2016-01-01

    Full Text Available Due to recent indiscriminate ransomware attacks, damage cases, including the encryption of users' important files, are constantly increasing. Existing vaccine systems can only detect ransomware with known patterns and are therefore vulnerable to attacks by new-pattern ransomware. A more effective technique is required to prevent modified ransomware. In this paper, an effective method is proposed to prevent attacks by modified ransomware on the Android platform. The proposed technique specifies and intensively monitors processes and specific file directories using statistical methods based on processor usage, memory usage, and I/O rates, so that processes with abnormal behavior can be detected. If a process running suspected ransomware is detected, the proposed system stops the process and asks the user to confirm the deletion of the programs associated with it. The information on suspected processes, and on exceptional processes confirmed by users, is stored in a database. The proposed technique can detect ransomware even when its patterns have not been stored. Its detection speed is very fast because it can be implemented in the Android source code instead of as a mobile application. In addition, it can effectively determine modified patterns of ransomware and provide protection with minimum damage.
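
    The statistical monitoring step can be sketched as a per-metric baseline with an outlier test. The 3-sigma rule, the metric set, and the thresholds below are illustrative assumptions, not the paper's tuned parameters.

```python
import statistics

class ProcessMonitor:
    """Flags a process whose CPU / memory / I-O usage deviates sharply
    from its learned baseline, a simplified sketch of the statistical
    monitoring described in the abstract."""

    def __init__(self, z_threshold=3.0):
        self.z = z_threshold
        self.history = {"cpu": [], "mem": [], "io": []}

    def observe(self, cpu, mem, io):
        """Record one sample of normal behavior for the baseline."""
        for key, val in (("cpu", cpu), ("mem", mem), ("io", io)):
            self.history[key].append(val)

    def is_suspicious(self, cpu, mem, io):
        """True if any metric lies more than z_threshold standard
        deviations from its baseline mean, e.g. a sudden burst of
        file I/O while files are being encrypted."""
        for key, val in (("cpu", cpu), ("mem", mem), ("io", io)):
            h = self.history[key]
            if len(h) < 2:
                continue
            mu, sd = statistics.mean(h), statistics.pstdev(h)
            if sd > 0 and abs(val - mu) / sd > self.z:
                return True
        return False
```

    A flagged process would then be suspended pending user confirmation, per the scheme above.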

  10. Processing techniques for data from the Kuosheng Unit 1 shakedown safety-relief-valve tests

    International Nuclear Information System (INIS)

    McCauley, E.W.; Rompel, S.L.; Weaver, H.J.; Altenbach, T.J.

    1982-08-01

    This report describes techniques developed at the Lawrence Livermore National Laboratory, Livermore, CA for processing original data from the Taiwan Power Company's Kuosheng MKIII Unit 1 Safety Relief Valve Shakedown Tests conducted in April/May 1981. The computer codes used, TPSORT, TPPLOT, and TPPSD, form a special evaluation system for taking the data from its original packed binary form to ordered, calibrated ASCII transducer files, and then to the production of time-history plots, numerical output files, and spectral analyses. Using the data processing techniques described, a convenient means of independently examining and analyzing a unique data base for steam condensation phenomena in the Mark III wetwell is described. The techniques developed for handling these data are applicable to the treatment of similar, but perhaps differently structured, experimental data sets.
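
    The first stage of such a pipeline (packed binary to ordered, calibrated ASCII) can be sketched as below. The record layout, gain, offset, and sample interval are illustrative assumptions, not the Kuosheng tape format.

```python
import struct

def unpack_records(blob, gain=0.01, offset=-5.0):
    """Unpack packed binary transducer samples into calibrated
    engineering values, mimicking a TPSORT-like step. Here we assume
    big-endian 16-bit unsigned counts and a linear calibration."""
    counts = struct.unpack(">%dH" % (len(blob) // 2), blob)
    return [offset + gain * c for c in counts]

def to_ascii_rows(values, dt=0.001):
    """Emit time-history rows (time, value) like an ordered ASCII
    transducer file ready for plotting or spectral analysis."""
    return ["%.4f %.4f" % (i * dt, v) for i, v in enumerate(values)]
```

    The subsequent plotting and power-spectral-density steps would consume these calibrated series directly.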

  11. A novel data processing technique for image reconstruction of penumbral imaging

    Science.gov (United States)

    Xie, Hongwei; Li, Hongyun; Xu, Zeping; Song, Guzhou; Zhang, Faqiang; Zhou, Lin

    2011-06-01

    A CT image reconstruction technique was applied to the data processing of penumbral imaging. Compared with traditional processing techniques for penumbral coded-pinhole images, such as Wiener, Lucy-Richardson, and blind deconvolution, this approach is brand new. In this method, the coded aperture processing method was, for the first time, used independently of the point spread function of the imaging diagnostic system. In this way, the technical obstacles in traditional coded-pinhole image processing caused by the uncertainty of the point spread function of the imaging diagnostic system were overcome. Based on this theoretical study, a simulation of penumbral imaging and image reconstruction was carried out and provided fairly good results. In a visible-light experiment, a point source of light was used to irradiate a 5 mm × 5 mm object after diffuse scattering and volume scattering, and the penumbral image was acquired with an aperture size of ~20 mm. Finally, the CT image reconstruction technique was used for image reconstruction and provided a fairly good reconstruction result.

  12. Modified technique for processing multiangle lidar data measured in clear and moderately polluted atmospheres

    Science.gov (United States)

    Vladimir Kovalev; Cyle Wold; Alexander Petkov; Wei Min Hao

    2011-01-01

    We present a modified technique for processing multiangle lidar data that is applicable to relatively clear atmospheres, where the utilization of the conventional Kano-Hamilton method encounters significant issues. Our retrieval algorithm allows computing the two-way transmission and the corresponding extinction-coefficient profile in any slope direction searched during...

  13. Advanced signal processing techniques for acoustic detection of sodium/water reaction

    International Nuclear Information System (INIS)

    Yughay, V.S.; Gribok, A.V.; Volov, A.N.

    1997-01-01

    In this paper, the results of the development of a neural network technique for processing acoustic background noise and the injection noise of various media (argon, water steam, hydrogen) at test rigs and an industrial steam generator are presented. (author). 3 refs, 9 figs, 3 tabs

  14. Multiple-output all-optical header processing technique based on two-pulse correlation principle

    NARCIS (Netherlands)

    Calabretta, N.; Liu, Y.; Waardt, de H.; Hill, M.T.; Khoe, G.D.; Dorren, H.J.S.

    2001-01-01

    A serial all-optical header processing technique based on a two-pulse correlation principle in a semiconductor laser amplifier in a loop mirror (SLALOM) configuration that can have a large number of output ports is presented. The operation is demonstrated experimentally at a 10Gbit/s Manchester

  15. Analysis of two dimensional charged particle scintillation using video image processing techniques

    International Nuclear Information System (INIS)

    Sinha, A.; Bhave, B.D.; Singh, B.; Panchal, C.G.; Joshi, V.M.; Shyam, A.; Srinivasan, M.

    1993-01-01

    A novel method for video recording of individual charged-particle scintillation images and their offline analysis using digital image processing techniques to obtain position, time, and energy information is presented. Results of an exploratory experiment conducted using ²⁴¹Am and ²³⁹Pu alpha sources are presented. (author). 3 figs., 4 tabs

  16. Assessing Epistemic Sophistication by Considering Domain-Specific Absolute and Multiplicistic Beliefs Separately

    Science.gov (United States)

    Peter, Johannes; Rosman, Tom; Mayer, Anne-Kathrin; Leichner, Nikolas; Krampen, Günter

    2016-01-01

    Background: Particularly in higher education, not only a view of science as a means of finding absolute truths (absolutism), but also a view of science as generally tentative (multiplicism) can be unsophisticated and obstructive for learning. Most quantitative epistemic belief inventories neglect this and understand epistemic sophistication as…

  17. The Relationship between Logistics Sophistication and Drivers of the Outsourcing of Logistics Activities

    Directory of Open Access Journals (Sweden)

    Peter Wanke

    2008-10-01

    Full Text Available A strong link has been established between operational excellence and the degree of sophistication of the logistics organization, a function of factors such as performance monitoring, investment in Information Technology [IT], and the formalization of the logistics organization, as proposed in the Bowersox, Daugherty, Dröge, Germain and Rogers (1992) Leading Edge model. At the same time, shippers have increasingly outsourced their logistics activities to third-party providers. This paper, based on a survey of large Brazilian shippers, addresses a gap in the literature by investigating the relationship between the dimensions of logistics organization sophistication and the drivers of logistics outsourcing. To this end, the dimensions behind the logistics sophistication construct were first investigated. Results from factor analysis led to the identification of six dimensions of logistics sophistication. By means of multivariate logistic regression analyses it was possible to relate some of these dimensions, such as the formalization of the logistics organization, to certain drivers of the outsourcing of logistics activities of Brazilian shippers, such as cost savings. These results indicate the possibility of segmenting shippers according to the characteristics of their logistics organization, which may be particularly useful to logistics service providers.

  18. Reacting to Neighborhood Cues?: Political Sophistication Moderates the Effect of Exposure to Immigrants

    DEFF Research Database (Denmark)

    Danckert, Bolette; Dinesen, Peter Thisted; Sønderskov, Kim Mannemar

    2017-01-01

    is founded on politically sophisticated individuals having a greater comprehension of news and other mass-mediated sources, which makes them less likely to rely on neighborhood cues as sources of information relevant for political attitudes. Based on a unique panel data set with fine-grained information...

  19. Sophistic Ethics in the Technical Writing Classroom: Teaching "Nomos," Deliberation, and Action.

    Science.gov (United States)

    Scott, J. Blake

    1995-01-01

    Claims that teaching ethics is particularly important to technical writing. Outlines a classical, sophistic approach to ethics based on the theories and pedagogies of Protagoras, Gorgias, and Isocrates, which emphasizes the Greek concept of "nomos," internal and external deliberation, and responsible action. Discusses problems and…

  20. Close to the Clothes : Materiality and Sophisticated Archaism in Alexander van Slobbe’s Design Practices

    NARCIS (Netherlands)

    Baronian, M.-A.

    This article looks at the work of contemporary Dutch fashion designer Alexander van Slobbe (1959) and examines how, since the 1990s, his fashion practices have consistently and consciously put forward a unique reflection on questions related to materiality, sophisticated archaism, luxury,

  2. Lexical Complexity Development from Dynamic Systems Theory Perspective: Lexical Density, Diversity, and Sophistication

    Directory of Open Access Journals (Sweden)

    Reza Kalantari

    2017-10-01

    Full Text Available This longitudinal case study explored Iranian EFL learners' lexical complexity (LC) through the lens of Dynamic Systems Theory (DST). Fifty independent essays written by five intermediate to advanced female EFL learners in a TOEFL iBT preparation course over six months constituted the corpus of this study. Three Coh-Metrix indices (Graesser, McNamara, Louwerse, & Cai, 2004; McNamara & Graesser, 2012), three Lexical Complexity Analyzer indices (Lu, 2010, 2012; Lu & Ai, 2011), and four Vocabprofile indices (Cobb, 2000) were selected to measure different dimensions of LC. Results of repeated measures analysis of variance (RM ANOVA) indicated an improvement with regard to lexical sophistication only. Positive and significant relationships were found between time and mean values in the Academic Word List and Beyond-2000 as indicators of lexical sophistication. The remaining seven indices of LC, falling short of significance, tended to flatten over the course of this writing program. Correlation analyses among the LC indices indicated that lexical density had positive correlations with lexical sophistication. However, lexical diversity revealed no significant correlations with either lexical density or lexical sophistication. This study suggests that the DST perspective provides a viable foundation for analyzing lexical complexity.
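
    The three dimensions of lexical complexity can be illustrated with their simplest textbook definitions. The study itself used Coh-Metrix, the Lexical Complexity Analyzer, and Vocabprofile, whose indices are considerably more elaborate than these one-liners; the word lists below are hypothetical.

```python
def lexical_indices(tokens, content_words, basic_words):
    """Toy versions of the three LC dimensions:
    density        = content words / all tokens
    diversity      = type-token ratio
    sophistication = tokens outside a basic-vocabulary list / all tokens
    (e.g. beyond the most frequent 2000 words, as in Beyond-2000)."""
    tokens = [t.lower() for t in tokens]
    n = len(tokens)
    density = sum(t in content_words for t in tokens) / n
    diversity = len(set(tokens)) / n
    sophistication = sum(t not in basic_words for t in tokens) / n
    return density, diversity, sophistication
```

    Note that, as in the study's correlation results, density and sophistication can move together (both rise with rarer content words) while the type-token ratio varies independently.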

  3. Does a more sophisticated storm erosion model improve probabilistic erosion estimates?

    NARCIS (Netherlands)

    Ranasinghe, R.W.M.R.J.B.; Callaghan, D.; Roelvink, D.

    2013-01-01

    The dependency between the accuracy/uncertainty of storm erosion exceedance estimates obtained via a probabilistic model and the level of sophistication of the structural function (storm erosion model) embedded in the probabilistic model is assessed via the application of Callaghan et al.'s (2008)

  4. Physical evaluations of Co-Cr-Mo parts processed using different additive manufacturing techniques

    Science.gov (United States)

    Ghani, Saiful Anwar Che; Mohamed, Siti Rohaida; Harun, Wan Sharuzi Wan; Noar, Nor Aida Zuraimi Md

    2017-12-01

    In recent years, additive manufacturing, with its high degree of design customization, has become an important fabrication technique in the aerospace and medical fields. Despite the ability of the process to produce complex components with highly controlled architectural geometrical features, maintaining part accuracy, fabricating fully functional high-density components, and overcoming inferior surface quality are the major obstacles in producing final parts by additive manufacturing for any selected application. This study aims to evaluate the physical properties of cobalt-chrome-molybdenum (Co-Cr-Mo) alloy parts fabricated by different additive manufacturing techniques. Fully dense Co-Cr-Mo parts were produced by Selective Laser Melting (SLM) and Direct Metal Laser Sintering (DMLS) with default process parameters. The density and relative density of the samples were calculated using Archimedes' principle, while the surface roughness of the top and side surfaces was measured using a surface profiler. The roughness average (Ra) of the top surface is 3.4 µm for SLM-produced parts and 2.83 µm for DMLS-produced parts; the Ra of the side surfaces is 4.57 µm for SLM-produced parts and 9.0 µm for DMLS-produced parts. The higher Ra values on the side surfaces compared with the top faces for both manufacturing techniques were due to the balling effect phenomenon. The relative density of the Co-Cr-Mo parts produced by both SLM and DMLS is 99.3%. Higher energy density contributed to the higher density of the samples produced by the SLM and DMLS processes. The findings of this work demonstrate that the SLM and DMLS processes with default process parameters effectively produced fully dense Co-Cr-Mo parts with high density, good geometrical accuracy, and good surface finish. Although both manufacturing processes yielded components with high density, the current findings show that the SLM technique can produce components with a smoother surface quality than DMLS.
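
    The Archimedes' principle step reduces to two weighings, in air and in water. A minimal sketch follows; the 8.3 g/cm³ full density of Co-Cr-Mo is a typical handbook value used here only as an assumption.

```python
def archimedes_density(mass_in_air, mass_in_water, rho_water=0.9982):
    """Density (g/cm^3) from two weighings: the buoyant weight loss
    in water equals the weight of the displaced water, so the sample
    volume is (m_air - m_water) / rho_water."""
    return mass_in_air / (mass_in_air - mass_in_water) * rho_water

def relative_density(measured, full_density=8.3):
    """Relative density in percent against the alloy's full
    (pore-free) density; 8.3 g/cm^3 for Co-Cr-Mo is an assumption."""
    return 100.0 * measured / full_density
```

    A reading of, say, 8.24 g/cm³ against a full density of 8.3 g/cm³ gives a relative density close to the 99.3% reported above.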

  5. The use of tomographic techniques in the mineral processing Industry. A review

    International Nuclear Information System (INIS)

    Witika, L.K.; Jere, E.H.

    2002-01-01

    Process tomographic techniques may be used to analyse the internal state of most multiphase process engineering systems, such as material segregation in a reactor, multiphase flow in pipes, and the spatial resolution of mineral grains in multiphase particles. These techniques include radiation computed tomography (X-ray or γ-ray), electrical methods (capacitance, impedance, and inductive tomography), positron emission tomography, optical tomography, microwave tomography, acoustic tomographic methods, and many more. Many potential applications exist for process tomographic instrumentation for quantitative analysis and fault-detection purposes. Among these, electrical methods are widely used for those mineral processes deserving particular attention, such as dense-medium separation, hydrocyclones, flotation cells and columns, gas-liquid absorbers, solvent extraction and other liquid-liquid processes, filtration and other solid-liquid processes, grinding mills (both dry and wet), conveyors, and hoppers. Developments in on-line measurement instrumentation now allow direct observation of the behaviour of fluids inside mineral separation equipment. This offers the possibility of acquiring process data to enable models to be devised, to verify theoretical computational fluid dynamics predictions, and to control various unit processes. In this review, the most important tomographic sensing methods are reviewed. Examples of the implementation of some electrical methods are illustrated. (authors)

  6. A novel technique for die-level post-processing of released optical MEMS

    International Nuclear Information System (INIS)

    Elsayed, Mohannad Y; Beaulieu, Philippe-Olivier; Briere, Jonathan; Ménard, Michaël; Nabki, Frederic

    2016-01-01

    This work presents a novel die-level post-processing technique for dies that include released movable structures. The procedure was applied to microelectromechanical systems (MEMS) chips fabricated in a commercial process, SOIMUMPs from MEMSCAP. It enables a clean DRIE etch of the sidewalls of diced chips, allowing optical testing of the pre-released MEMS mirrors through the chip edges. The etched patterns are defined by photolithography using photoresist spray coating. The photoresist thickness is tuned to create photoresist bridges over the pre-released gaps, protecting the released structures during the subsequent wet processing steps. The chips are then subjected to a sequence of wet and dry etching steps prior to dry photoresist removal in O₂ plasma. Processed micromirrors were tested and found to rotate similarly to devices without post-processing, demonstrating that the procedure does not significantly affect the mechanical performance of the devices. (technical note)

  7. Combinatorial techniques to efficiently investigate and optimize organic thin film processing and properties.

    Science.gov (United States)

    Wieberger, Florian; Kolb, Tristan; Neuber, Christian; Ober, Christopher K; Schmidt, Hans-Werner

    2013-04-08

    In this article we present several developed and improved combinatorial techniques to optimize the processing conditions and material properties of organic thin films. The combinatorial approach allows the investigation of multi-variable dependencies and is the perfect tool for investigating organic thin films intended for high-performance applications. In this context we develop and establish the reliable preparation of gradients of material composition, temperature, exposure, and immersion time. Furthermore, we demonstrate the smart application of combinations of composition and processing gradients to create combinatorial libraries. First, a binary combinatorial library is created by applying two gradients perpendicular to each other. A third gradient is carried out in very small areas and arranged matrix-like over the entire binary combinatorial library, resulting in a ternary combinatorial library. Ternary combinatorial libraries allow the identification of precise trends for the optimization of multi-variable-dependent processes, which is demonstrated on the lithographic patterning process. Here we verify conclusively the strong interaction, and thus the interdependency, of variables in the preparation and properties of complex organic thin film systems. The established gradient preparation techniques are not limited to lithographic patterning. It is possible to utilize and transfer the reported combinatorial techniques to other multi-variable-dependent processes and to investigate and optimize thin film layers and devices for optical, electro-optical, and electronic applications.

  8. Combinatorial Techniques to Efficiently Investigate and Optimize Organic Thin Film Processing and Properties

    Directory of Open Access Journals (Sweden)

    Hans-Werner Schmidt

    2013-04-01

    Full Text Available In this article we present several developed and improved combinatorial techniques to optimize the processing conditions and material properties of organic thin films. The combinatorial approach allows the investigation of multi-variable dependencies and is the perfect tool for investigating organic thin films intended for high-performance applications. In this context we develop and establish the reliable preparation of gradients of material composition, temperature, exposure, and immersion time. Furthermore, we demonstrate the smart application of combinations of composition and processing gradients to create combinatorial libraries. First, a binary combinatorial library is created by applying two gradients perpendicular to each other. A third gradient is carried out in very small areas and arranged matrix-like over the entire binary combinatorial library, resulting in a ternary combinatorial library. Ternary combinatorial libraries allow the identification of precise trends for the optimization of multi-variable-dependent processes, which is demonstrated on the lithographic patterning process. Here we verify conclusively the strong interaction, and thus the interdependency, of variables in the preparation and properties of complex organic thin film systems. The established gradient preparation techniques are not limited to lithographic patterning. It is possible to utilize and transfer the reported combinatorial techniques to other multi-variable-dependent processes and to investigate and optimize thin film layers and devices for optical, electro-optical, and electronic applications.
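
    The ternary library layout (two perpendicular gradients, plus a small third-variable matrix repeated over them) can be enumerated as below. All ranges and units are illustrative assumptions, not the paper's actual process windows.

```python
def gradient(lo, hi, n):
    """n evenly spaced set-points from lo to hi inclusive."""
    step = (hi - lo) / (n - 1)
    return [lo + i * step for i in range(n)]

def ternary_library(n_comp=5, n_temp=5, n_expo=3):
    """Enumerate a ternary combinatorial library: a composition
    gradient along x and a temperature gradient along y form the
    binary library; a small exposure-dose matrix is repeated over it.
    Returns (composition, temperature, dose) per library cell."""
    comps = gradient(0.0, 1.0, n_comp)       # material composition fraction
    temps = gradient(80.0, 180.0, n_temp)    # annealing temperature, deg C
    doses = gradient(10.0, 30.0, n_expo)     # exposure dose, mJ/cm^2
    return [(c, t, e) for t in temps for c in comps for e in doses]
```

    Each tuple is one cell of the library; a single substrate then samples the full three-variable space in one experiment.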

  9. Radioactive tracer technique in process optimization: applications in the chemical industry

    International Nuclear Information System (INIS)

    Charlton, J.S.

    1989-01-01

    Process optimization is concerned with the selection of the most appropriate technological design of the process and with controlling its operation to obtain maximum benefit. The role of radioactive tracers in process optimization is discussed and the various circumstances under which such techniques may be beneficially applied are identified. Case studies are presented which illustrate how radioisotopes may be used to monitor plant performance under dynamic conditions to improve production efficiency and to investigate the cause of production limitations. In addition, the use of sealed sources to provide information complementary to the tracer study is described. (author)

  10. PROCESS PERFORMANCE EVALUATION USING HISTOGRAM AND TAGUCHI TECHNIQUE IN LOCK MANUFACTURING COMPANY

    Directory of Open Access Journals (Sweden)

    Hagos Berhane

    2013-12-01

    Full Text Available Process capability analysis is a vital part of an overall quality improvement program. It is a technique with applications in many segments of the product cycle, including product and process design, vendor sourcing, production or manufacturing planning, and manufacturing. Frequently, a process capability study involves observing a quality characteristic of the product. Since this information usually pertains to the product rather than the process, such an analysis should, strictly speaking, be called a product analysis study. A true process capability study in this context would involve collecting data that relate to process parameters so that remedial actions can be identified on a timely basis. The present study attempts to analyze the performance of the drilling, pressing, and reaming operations carried out in the manufacturing of two major lock components, viz. the handle and the lever plate, at Gaurav International, Aligarh (India). The data collected for the depth of hole on the handle, the central hole diameter, and the key hole diameter are used to construct histograms. Next, the information available in the frequency distribution table, the process mean, the process capability from calculations, and the specification limits provided by the manufacturing concern are used with the Taguchi technique. The data obtained from the histograms and the Taguchi technique combined are used to evaluate the performance of the manufacturing process. The results of this study indicated that the processes used to produce the depth of hole on the handle, the key hole diameter, and the central hole diameter are potentially incapable, as the process capability indices are found to be 0.54, 0.54, and 0.76, respectively. The numbers of nonconforming parts, expressed in parts per million (ppm), that have fallen outside the specification limits are found to be 140000, 26666.66, and 146666.66 for the depth of hole on the handle, the central hole diameter, and the key hole diameter, respectively. As a result, the total loss incurred
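
    The capability indices and ppm figures of this kind follow from standard formulas; a minimal sketch is below. The spec limits and sigma in the test are made-up numbers, and the study reports observed rather than normal-theory ppm.

```python
import math

def cp(usl, lsl, sigma):
    """Potential process capability index (spread vs. tolerance)."""
    return (usl - lsl) / (6.0 * sigma)

def cpk(usl, lsl, mu, sigma):
    """Capability index accounting for process centering."""
    return min(usl - mu, mu - lsl) / (3.0 * sigma)

def ppm_nonconforming(usl, lsl, mu, sigma):
    """Expected parts per million outside the spec limits under a
    normal-process assumption."""
    def phi(z):                       # standard normal CDF
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    frac = phi((lsl - mu) / sigma) + (1.0 - phi((usl - mu) / sigma))
    return 1e6 * frac
```

    A Cp below 1.0, as for the three operations above, means the natural 6-sigma spread exceeds the tolerance band regardless of centering.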

  11. Comparison of optimization techniques for MRR and surface roughness in wire EDM process for gear cutting

    Directory of Open Access Journals (Sweden)

    K.D. Mohapatra

    2016-11-01

    Full Text Available The objective of the present work is to use a suitable method to optimize process parameters such as pulse-on time (TON), pulse-off time (TOFF), wire feed rate (WF), wire tension (WT), and servo voltage (SV) to attain the maximum value of MRR and the minimum value of surface roughness during the production of a fine-pitch spur gear made of copper. The spur gear has a pressure angle of 20° and a pitch circle diameter of 70 mm. The wire has a diameter of 0.25 mm and is made of brass. Experiments were conducted according to Taguchi's orthogonal array concept with five factors and two levels. Thus, the Taguchi quality loss design technique is used to optimize the output responses obtained from the experiments. Another optimization technique, desirability with the grey Taguchi technique, has also been used to optimize the process parameters. Both optimized results are compared to find the best combination of MRR and surface roughness. A confirmation test was carried out to identify the significant improvement in machining performance in the case of Taguchi quality loss. Finally, it was concluded that desirability with the grey Taguchi technique produced a better result for MRR, while Taguchi quality loss gave a better result for surface roughness. The quality of the wire after the cutting operation is presented in the scanning electron microscopy (SEM) image.
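
    Taguchi optimization of these two responses rests on signal-to-noise ratios: larger-the-better for MRR and smaller-the-better for surface roughness. A minimal sketch of the two standard formulas (the replicate values in the test are invented):

```python
import math

def sn_larger_is_better(ys):
    """Taguchi S/N ratio (dB) for a response to maximize, e.g. MRR:
    -10 log10( mean of 1/y^2 )."""
    return -10.0 * math.log10(sum(1.0 / y ** 2 for y in ys) / len(ys))

def sn_smaller_is_better(ys):
    """Taguchi S/N ratio (dB) for a response to minimize, e.g. Ra:
    -10 log10( mean of y^2 )."""
    return -10.0 * math.log10(sum(y ** 2 for y in ys) / len(ys))
```

    For each factor level, the mean S/N across the orthogonal-array runs is compared, and the level with the highest mean S/N is selected; the grey-relational variant instead ranks runs by a combined grey grade over both responses.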

  12. Sophisticated Calculation of the 1oo4-architecture for Safety-related Systems Conforming to IEC61508

    International Nuclear Information System (INIS)

    Hayek, A; Al Bokhaiti, M; Schwarz, M H; Boercsoek, J

    2012-01-01

    With the publication and enforcement of the standard IEC 61508 for safety-related systems, recent system architectures have been presented and evaluated. Among a number of techniques and measures for evaluating the safety integrity level (SIL) of safety-related systems, measures such as reliability block diagrams and Markov models are used to analyze the probability of failure on demand (PFD) and the mean time to failure (MTTF) in conformance with IEC 61508. The current paper deals with the quantitative analysis of the novel 1oo4 (one-out-of-four) architecture presented in recent work, and sophisticated calculations for the required parameters are introduced. The 1oo4 architecture represents an advanced safety architecture based on on-chip redundancy which is 3-failure safe: at least one of the four channels has to work correctly in order to trigger the safety function.
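
    The redundancy benefit of 1oo4 can be illustrated with a deliberately simplified PFD estimate: the safety function is lost only if all four channels fail dangerously undetected. The independence assumption below omits the common-cause (beta-factor) and diagnostic-coverage terms that the IEC 61508 formulas add, so this is a sketch, not the paper's calculation.

```python
def pfd_1oo4(lambda_du, proof_test_interval_h):
    """Simplified average PFD of a 1oo4 architecture with independent
    channels: each channel's average PFD over a proof-test interval is
    lambda_DU * T_proof / 2, and all four must fail simultaneously."""
    p_channel = lambda_du * proof_test_interval_h / 2.0
    return p_channel ** 4
```

    With a dangerous-undetected failure rate of 1e-6 per hour and a one-year proof-test interval, the per-channel average PFD is about 4.4e-3, while the 1oo4 figure drops to the order of 1e-10, which is why the architecture is described as 3-failure safe.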

  13. Air Conditioning Compressor Air Leak Detection by Image Processing Techniques for Industrial Applications

    Directory of Open Access Journals (Sweden)

    Pookongchai Kritsada

    2015-01-01

    Full Text Available This paper presents a method to detect air leakage in an air conditioning compressor using image processing techniques. A good-quality air conditioning compressor must not leak air. To test an air conditioning compressor for leaks, air is pumped into the compressor, which is then submerged in a water tank. If air bubbles appear at the surface of the air conditioning compressor, the leaking compressor must be returned for maintenance. In this work, a new method to detect leakage and locate the leakage point with high accuracy, speed, and precision is proposed. In the pre-processing procedure to detect the air bubbles, threshold and median filter techniques are used. A connected-component labeling technique is used to detect the air bubbles, while blob analysis is the search technique used to analyze groups of air bubbles in sequential images. The proposed algorithm is tested experimentally to determine the leakage point of an air conditioning compressor, and the location of the leakage point is reported as a coordinate. The results demonstrate that the leakage point can be accurately detected during the process, with the estimated point having an error of less than 5% compared with the real leakage point.
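
    The median filter and connected-component steps named above can be sketched on a binary bubble mask as follows; the 4-neighbour connectivity and the minimum blob size are assumptions, not the paper's parameters.

```python
from collections import deque

def median3(img):
    """3x3 median filter on a 2-D list of grey values (borders kept),
    the classic way to suppress salt noise before thresholding."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            win = sorted(img[y + dy][x + dx]
                         for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y][x] = win[4]
    return out

def label_blobs(binary, min_size=2):
    """Connected-component labeling by 4-neighbour BFS; returns the
    centroid of each blob of at least min_size pixels, i.e. the
    candidate leakage coordinates."""
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    blobs = []
    for y in range(h):
        for x in range(w):
            if binary[y][x] and not seen[y][x]:
                queue, comp = deque([(y, x)]), []
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    comp.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and binary[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                if len(comp) >= min_size:
                    cy = sum(p[0] for p in comp) / len(comp)
                    cx = sum(p[1] for p in comp) / len(comp)
                    blobs.append((cy, cx))
    return blobs
```

    Tracking these centroids across sequential frames (the blob-analysis step) then pins down where the bubble stream originates.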

  14. SENSE-MAKING TECHNIQUES IN EDUCATIONAL PROCESS AND THEIR IMPACT ON THE PERSONAL CHARACTERISTICS OF STUDENTS

    Directory of Open Access Journals (Sweden)

    Irina V. Abakumova

    2017-12-01

    Full Text Available This study examines psychotechnics used in education that help initiate meaning-making among students and foster their personal growth, and it characterizes the psychological features of "sense-deducting". A review of sense-making techniques, considered one category of psychotechnics, is provided. The described techniques are based on human psychology; they improve the quality of instruction, create a favorable and unique system of values, take into account the individual characteristics of all types of education, and influence the development of the sense-making process in children. The sense-making techniques are presented in the author's classification and extended with practical methods. The study of the psychological features of the influence of sense-making techniques on the personality of a student reveals new patterns in the personal, subjective, and meta-subjective results of mastering the school program via the transformation and development of the value/logic consciousness of a child. The work emphasizes that the use of sense-making techniques is effective both in classroom instruction and in the after-school activities of an educational organization. The results achieved make it possible to understand and substantiate the naturalness and relevance of the sense-technical approach in terms of the personal and academic indicators of students. With competent and correct use of these semantic techniques, the best, most productive, high-quality pedagogical experience can be conveyed, along with the prospect of innovative developments in the psychological and pedagogical sciences. Thanks to sense-techniques, information starts to have a personal character for children and adolescents, knowledge is objectified, and learning activity becomes an individual need.

  15. Four-hour processing of clinical/diagnostic specimens for electron microscopy using microwave technique.

    Science.gov (United States)

    Giberson, R T; Demaree, R S; Nordhausen, R W

    1997-01-01

    A protocol for routine 4-hour microwave tissue processing of clinical or other samples for electron microscopy was developed. Specimens are processed by using a temperature-restrictive probe that can be set to automatically cycle the magnetron to maintain any designated temperature restriction (temperature maximum). In addition, specimen processing during fixation is performed in 1.7-ml microcentrifuge tubes followed by subsequent processing in flow-through baskets. Quality control is made possible during each step through the addition of an RS232 port to the microwave, allowing direct connection of the microwave oven to any personal computer. The software provided with the temperature probe enables the user to monitor time and temperature on a real-time basis. Tissue specimens (goat placenta, mouse liver, mouse kidney, and deer esophagus) were processed by conventional and microwave techniques in this study. In all instances, the results for the microwave-processed samples were equal to or better than those achieved by routine processing techniques.

  16. Development of safety analysis and constraint detection techniques for process interaction errors

    International Nuclear Information System (INIS)

    Fan, Chin-Feng; Tsai, Shang-Lin; Tseng, Wan-Hui

    2011-01-01

    Among the new failure modes introduced by computers into safety systems, the process interaction error is the most unpredictable and complicated failure mode and may cause disastrous consequences. This paper presents safety analysis and constraint detection techniques for process interaction errors among hardware, software, and human processes. Among interaction errors, the most dreadful are those that involve run-time misinterpretation by a logic process; we call them 'semantic interaction errors'. Such abnormal interaction is not adequately emphasized in current research. In our static analysis, we provide a fault tree template focusing on semantic interaction errors that checks for conflicting pre-conditions and post-conditions among interacting processes. Thus, far-fetched but highly risky interaction scenarios involving interpretation errors can be identified. For run-time monitoring, a range of constraint types is proposed for checking abnormal signs at run time. We extend current constraints to a broader relational level and a global level, considering process/device dependencies and physical conservation rules in order to detect process interaction errors. The proposed techniques can reduce abnormal interactions; they can also be used to assist in safety-case construction.
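
    The static pre-/post-condition conflict check described above can be sketched in a few lines. The following Python is a hypothetical illustration only; the process names, the condition encoding, and the `find_condition_conflicts` helper are our own assumptions, not the paper's implementation:

```python
# Hypothetical sketch: flag candidate "semantic interaction errors" by checking
# that the post-conditions one process establishes do not contradict the
# pre-conditions another interacting process assumes.

def find_condition_conflicts(processes):
    """processes: dict name -> {"pre": {var: value}, "post": {var: value}}.
    Returns (producer, consumer, variable) triples where a producer's
    post-condition contradicts a consumer's pre-condition."""
    conflicts = []
    for producer, p in processes.items():
        for consumer, c in processes.items():
            if producer == consumer:
                continue
            for var, expected in c["pre"].items():
                if var in p["post"] and p["post"][var] != expected:
                    conflicts.append((producer, consumer, var))
    return conflicts

processes = {
    "controller": {"pre": {}, "post": {"valve": "open"}},
    "logic_proc": {"pre": {"valve": "closed"}, "post": {}},
}
print(find_condition_conflicts(processes))  # [('controller', 'logic_proc', 'valve')]
```

    In a real analysis these conflicts would seed the fault tree template rather than be reported directly.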

  17. Development of safety analysis and constraint detection techniques for process interaction errors

    Energy Technology Data Exchange (ETDEWEB)

    Fan, Chin-Feng, E-mail: csfanc@saturn.yzu.edu.tw [Computer Science and Engineering Dept., Yuan-Ze University, Taiwan (China); Tsai, Shang-Lin; Tseng, Wan-Hui [Computer Science and Engineering Dept., Yuan-Ze University, Taiwan (China)

    2011-02-15

    Among the new failure modes introduced by computers into safety systems, the process interaction error is the most unpredictable and complicated failure mode and may cause disastrous consequences. This paper presents safety analysis and constraint detection techniques for process interaction errors among hardware, software, and human processes. Among interaction errors, the most dreadful are those that involve run-time misinterpretation by a logic process; we call them 'semantic interaction errors'. Such abnormal interaction is not adequately emphasized in current research. In our static analysis, we provide a fault tree template focusing on semantic interaction errors that checks for conflicting pre-conditions and post-conditions among interacting processes. Thus, far-fetched but highly risky interaction scenarios involving interpretation errors can be identified. For run-time monitoring, a range of constraint types is proposed for checking abnormal signs at run time. We extend current constraints to a broader relational level and a global level, considering process/device dependencies and physical conservation rules in order to detect process interaction errors. The proposed techniques can reduce abnormal interactions; they can also be used to assist in safety-case construction.

  18. Review of Palm Kernel Oil Processing And Storage Techniques In South East Nigeria

    Directory of Open Access Journals (Sweden)

    Okeke CG

    2017-06-01

    Full Text Available An assessment of palm kernel processing and storage in South-Eastern Nigeria was carried out by an investigative survey approach. The survey ascertained the extent of mechanization in the area to enable palm kernel processors and agricultural policy makers to devise modalities for improving palm kernel processing there. According to the results, in Abia state 85% of the respondents use a mechanical method and 15% a manual method for cracking their kernels. In Imo state, 83% of the processors use a mechanical method and 17% a manual method. In Enugu and Ebonyi states, 70% and 50% of the processors respectively use a mechanical method. Only in Anambra state does a greater share of the processors (50%) use a manual method, while 45% use mechanical means. It is observable from the results that palm kernel oil extraction has not received much attention in mechanization. The ANOVA of the palm kernel oil extraction techniques in South-East Nigeria showed significant differences in both the study area and the oil extraction techniques at the 5% level of probability. Results further revealed that in Abia state 70% of the processors use a complete fractional process in refining the palm kernel oil, while 25% and 5% respectively use an incomplete fractional process and a zero-refining process. In Anambra, 60% of the processors use a complete fractional process and 40% an incomplete fractional process; zero refining is not practiced in Anambra state. In Enugu state, 53% use a complete fractional process, while 25% and 22% respectively use zero refining and an incomplete fractional process. Imo state mostly uses the complete fractional process (85%); about 10% use zero refining and 5% an incomplete fractional process.
Plastic containers and metal drums are dominantly used in most areas in south-east Nigeria for the storage of palm kernel oil.

  19. Seismic qualification using digital signal processing/modal testing and finite element techniques

    International Nuclear Information System (INIS)

    Steedman, J.B.; Edelstein, A.

    1983-01-01

    A systematic procedure in which digital signal processing, modal testing and finite element techniques can be used to seismically qualify Class IE equipment for use in nuclear generating stations is presented. A new method was also developed in which measured transmissibility functions and Fourier transformation techniques were combined to compute instrument response spectra. As an illustrative example of the qualification method, the paper follows the qualification of a safety related Class IE Heating, Ventilating, and Air Conditioning (HVAC) Control Panel subjected to both seismic and hydrodynamic loading conditions
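
    The paper's combination of measured transmissibility functions with Fourier transformation amounts to frequency-domain filtering of the base motion. The sketch below is illustrative only: it uses a hand-rolled pure-Python DFT, assumes the transmissibility H is already sampled at the DFT bins, and omits the subsequent response-spectrum computation:

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform (fine for short illustrative signals)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    """Inverse DFT, returning the real part of each time sample."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)).real / N
            for n in range(N)]

def response_from_transmissibility(base_accel, H):
    """Multiply the base-motion spectrum by the measured transmissibility H(f)
    (one complex value per DFT bin) and transform back to the time domain."""
    A = dft(base_accel)
    return idft([h * a for h, a in zip(H, A)])

# sanity check: unit transmissibility reproduces the input motion
r = response_from_transmissibility([1.0, 0.0, 0.0, 0.0], [1.0] * 4)
```

    An instrument response spectrum would then be computed from the recovered time history at each mounting location.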

  20. SALP (Sensitivity Analysis by List Processing), a computer assisted technique for binary systems reliability analysis

    International Nuclear Information System (INIS)

    Astolfi, M.; Mancini, G.; Volta, G.; Van Den Muyzenberg, C.L.; Contini, S.; Garribba, S.

    1978-01-01

    A computerized technique which allows the modelling by AND, OR, NOT binary trees, of various complex situations encountered in safety and reliability assessment, is described. By the use of list-processing, numerical and non-numerical types of information are used together. By proper marking of gates and primary events, stand-by systems, common cause failure and multiphase systems can be analyzed. The basic algorithms used in this technique are shown in detail. Application to a stand-by and multiphase system is then illustrated
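
    A minimal sketch of the kind of AND/OR/NOT binary-tree model SALP manipulates (the tuple encoding and event names are our own illustration, not the SALP list-processing representation):

```python
# Gates are nested tuples ("AND"/"OR"/"NOT", children...); leaves are
# primary-event names looked up in a truth assignment.

def evaluate(node, events):
    """events: dict mapping primary-event name -> True (failed) / False."""
    if isinstance(node, str):
        return events[node]
    op, *children = node
    if op == "AND":
        return all(evaluate(c, events) for c in children)
    if op == "OR":
        return any(evaluate(c, events) for c in children)
    if op == "NOT":
        return not evaluate(children[0], events)
    raise ValueError(f"unknown gate {op!r}")

tree = ("OR",
        ("AND", "pump_a_fails", "pump_b_fails"),
        ("NOT", "standby_available"))
print(evaluate(tree, {"pump_a_fails": True, "pump_b_fails": True,
                      "standby_available": True}))  # True
```

    SALP additionally attaches non-numerical markings to gates and events (for stand-by, common-cause and multiphase analysis), which a plain truth-value evaluation like this does not capture.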

  1. Influence of the properties of granite and sandstone in the desalination process by electrokinetic technique

    DEFF Research Database (Denmark)

    Feijoo, J.; Ottosen, Lisbeth M.; Pozo-Antonio, J.S.

    2015-01-01

    …achieved in both stones. From the results obtained, it was possible to find those factors inherent to each stone which could have an influence on the efficacy of the treatment. With this technique it was possible to reduce the salt concentration in the granite almost to 100%. However, in the sandstone… samples the decreases were not equally high, mainly at the intermediate levels where slight enrichments were observed. The results indicate that although the used technique is efficient for salt removal regardless of the porosimetric distribution of the rock, the better interconnection between the pores… in the granite samples favored a faster desalination process…

  2. Sound Is Sound: Film Sound Techniques and Infrasound Data Array Processing

    Science.gov (United States)

    Perttu, A. B.; Williams, R.; Taisne, B.; Tailpied, D.

    2017-12-01

    A multidisciplinary collaboration between earth scientists and a sound designer/composer was established to explore the possibilities of audification analysis of infrasound array data. Through the process of audifying the infrasound we began to experiment with techniques and processes borrowed from cinema to manipulate the noise content of the signal. The results posed the question: "Would the accuracy of infrasound data array processing be enhanced by employing these techniques?" A new area of research was thus born from this collaboration, highlighting the value of such interactions and the unintended paths that can arise from them. Using a reference event database, infrasound data were processed with these new techniques and the results were compared with existing techniques to assess whether there was any improvement in detection capability for the array. With just under one thousand volcanoes, and a high probability of eruption, Southeast Asia offers a unique opportunity to develop and test techniques for regional monitoring of volcanoes with different technologies. While these volcanoes are monitored locally (e.g. seismometer, infrasound, geodetic and geochemistry networks) and remotely (e.g. satellite and infrasound), there are challenges and limitations to the current monitoring capability. Not only is there a high fraction of cloud cover in the region, making plume observation via satellite more difficult, but there have also been examples of local monitoring networks and telemetry being destroyed early in an eruptive sequence. The success of local infrasound studies in identifying explosions at volcanoes, and in calculating plume heights from these signals, has led to an interest in retrieving source parameters for the purpose of ash modeling with a regional network independent of cloud cover.

  3. Comparison of various techniques for the extraction of umbelliferone and herniarin in Matricaria chamomilla processing fractions.

    Science.gov (United States)

    Molnar, Maja; Mendešević, Nikolina; Šubarić, Drago; Banjari, Ines; Jokić, Stela

    2017-08-05

    Chamomile, a well-known medicinal plant, is a rich source of bioactive compounds, among which two coumarin derivatives, umbelliferone and herniarin, are often found in its extracts. Chamomile extracts have found different uses in the cosmetic industry, as has umbelliferone itself, which, owing to its strong absorption of UV light, is usually added to sunscreens, while herniarin (7-methoxycoumarin) is also known for its biological activity. Therefore, chamomile extracts with a certain herniarin and umbelliferone content could be of interest for application in pharmaceutical and cosmetic products. The aim of this study was to compare extracts of different chamomile fractions (unprocessed chamomile flowers first class, processed chamomile flowers first class, pulvis and processing waste) and to identify the best material and extraction method for obtaining herniarin and umbelliferone. Various extraction techniques, namely Soxhlet extraction, hydrodistillation, maceration and supercritical CO2 extraction, were used in this study. Umbelliferone and herniarin content was determined by high-performance liquid chromatography (HPLC). The highest yields of umbelliferone (11.80 mg/100 g) and herniarin (82.79 mg/100 g) were obtained from chamomile processing waste using the maceration technique with 50% aqueous ethanol, and this extract also proved to possess antioxidant activity (61.5% DPPH scavenging activity). This study shows the potential of utilizing waste from chamomile processing by applying different extraction techniques.

  4. Monitoring alloy formation during mechanical alloying process by x-ray diffraction techniques

    International Nuclear Information System (INIS)

    Abdul Kadir Masrom; Noraizam Md Diah; Mazli Mustapha

    2002-01-01

    Mechanical alloying (MA) is a novel processing technique that uses a high-energy impact ball mill to produce alloys with enhanced properties and microscopically homogeneous materials starting from various powder mixtures. The mechanical alloying process was originally developed to produce oxide-dispersion-strengthened nickel superalloys. In principle, in the high-energy ball milling process an alloy is formed as the result of repeated welding, fracturing and rewelding of powder particles: a powder mixture in the mill is subjected to high-energy collisions among the balls. MA has been shown to be capable of synthesizing a variety of materials, and it can prepare equilibrium and non-equilibrium phases starting from blended elemental or prealloyed powders. The ability of the process to produce highly metastable materials such as amorphous alloys and nanostructured materials has made it attractive, and it is considered a promising processing technique for producing many advanced materials at low cost. The present study explores the conditions under which aluminum alloy formation occurs by ball milling of blended aluminum and alloying-element powders. In this work, an attempt was made to produce aluminum 2024 alloy by milling blended elemental aluminum powder of 2024 composition in a stainless steel container under an argon atmosphere for up to 210 minutes. X-ray diffraction together with thermal analysis techniques was used to monitor phase changes in the milled powder. Results indicate that, using our predetermined milling parameters, alloys were formed after 120 minutes of milling. The thermal analysis data are also presented in this report. (Author)

  5. Influence of different processing techniques on the mechanical properties of used tires in embankment construction

    International Nuclear Information System (INIS)

    Edincliler, Ayse; Baykal, Goekhan; Saygili, Altug

    2010-01-01

    Use of processed used tires in embankment construction is becoming an accepted way of beneficially recycling scrap tires, owing to shortages of natural mineral resources and increasing waste disposal costs. Using these tires in construction requires an awareness of the properties and limitations associated with their use. The main objective of this paper is to assess the effects of different processing techniques on the mechanical properties of used tire-sand mixtures intended to improve the engineering properties of the available soil. In the first part, a literature study on the mechanical properties of processed used tires such as tire shreds, tire chips and tire buffings, and their mixtures with sand, is summarized. In the second part, large-scale direct shear tests are performed to evaluate the shear strength of tire crumb-sand mixtures, for which information is not readily available in the literature. The test results with tire crumb were compared with the other processed used tire-sand mixtures. Sand-used tire mixtures have higher shear strength than the sand alone, and the shear strength parameters depend on the processing conditions of the used tires. Three factors are found to significantly affect the mechanical properties: normal stress, processing technique, and used tire content.

  6. Influence of different processing techniques on the mechanical properties of used tires in embankment construction.

    Science.gov (United States)

    Edinçliler, Ayşe; Baykal, Gökhan; Saygili, Altug

    2010-06-01

    Use of processed used tires in embankment construction is becoming an accepted way of beneficially recycling scrap tires, owing to shortages of natural mineral resources and increasing waste disposal costs. Using these tires in construction requires an awareness of the properties and limitations associated with their use. The main objective of this paper is to assess the effects of different processing techniques on the mechanical properties of used tire-sand mixtures intended to improve the engineering properties of the available soil. In the first part, a literature study on the mechanical properties of processed used tires such as tire shreds, tire chips and tire buffings, and their mixtures with sand, is summarized. In the second part, large-scale direct shear tests are performed to evaluate the shear strength of tire crumb-sand mixtures, for which information is not readily available in the literature. The test results with tire crumb were compared with the other processed used tire-sand mixtures. Sand-used tire mixtures have higher shear strength than the sand alone, and the shear strength parameters depend on the processing conditions of the used tires. Three factors are found to significantly affect the mechanical properties: normal stress, processing technique, and used tire content. Copyright 2009. Published by Elsevier Ltd.

  7. Quantification of UV-Visible and Laser Spectroscopic Techniques for Materials Accountability and Process Control

    International Nuclear Information System (INIS)

    Czerwinski, Kenneth; Weck, Phil

    2013-01-01

    Ultraviolet-visible spectroscopy (UV-Visible) and time-resolved laser fluorescence spectroscopy (TRLFS) optical techniques can permit on-line analysis of actinide elements in a solvent extraction process in real time. These techniques have been used for measuring actinide speciation and concentration under laboratory conditions and are easily adaptable to multiple sampling geometries, such as dip probes, fiber-optic sample cells, and flow-through cell geometries. To fully exploit these techniques, researchers must determine the fundamental speciation of target actinides and the resulting influence on spectroscopic properties. Detection limits, process conditions, and speciation of key actinide components can be established and utilized in a range of areas, particularly those related to materials accountability and process control. Through this project, researchers will develop tools and spectroscopic techniques to evaluate solution extraction conditions and concentrations of U, Pu, and Cm in extraction processes, addressing areas of process control and materials accountability. The team will evaluate UV-Visible and TRLFS for use in solvent extraction-based separations. Ongoing research is examining efficacy of UV-Visible spectroscopy to evaluate uranium and plutonium speciation under conditions found in the UREX process and using TRLFS to evaluate Cm speciation and concentration in the TALSPEAK process. A uranyl and plutonium nitrate UV-Visible spectroscopy study met with success, which supports the utility and continued exploration of spectroscopic methods for evaluation of actinide concentrations and solution conditions for other aspects of the UREX+ solvent extraction scheme. This project will examine U and Pu absorbance in TRUEX and TALSPEAK, perform detailed examination of Cm in TRUEX and TALSPEAK, study U laser fluorescence, and apply project data to contactors. The team will also determine peak ratios as a function of solution concentrations for the UV
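
    As a hedged illustration of how a UV-Visible absorbance reading translates into a concentration for accountability purposes, the sketch below applies the Beer-Lambert law through a linear calibration against standards. All numbers are invented, and the project's actual speciation-dependent analysis is far more involved:

```python
# Beer-Lambert: A = eps * l * c, so for fixed path length the absorbance at a
# characteristic peak is linear in concentration and can be calibrated.

def fit_calibration(concs, absorbances):
    """Ordinary least-squares slope/intercept of absorbance vs concentration."""
    n = len(concs)
    mc = sum(concs) / n
    ma = sum(absorbances) / n
    slope = (sum((c - mc) * (a - ma) for c, a in zip(concs, absorbances))
             / sum((c - mc) ** 2 for c in concs))
    return slope, ma - slope * mc

def concentration(absorbance, slope, intercept):
    """Invert the calibration line for an unknown sample."""
    return (absorbance - intercept) / slope

# invented standards following A = 0.2*c + 0.01
slope, intercept = fit_calibration([0.0, 1.0, 2.0, 4.0], [0.01, 0.21, 0.41, 0.81])
print(round(concentration(0.51, slope, intercept), 2))  # about 2.5
```

    In practice the peak positions and molar absorptivities shift with actinide speciation, which is why the project characterizes speciation before relying on such calibrations.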

  8. Development of flow velocity measurement techniques in visible images. Improvement of particle image velocimetry techniques on image process

    International Nuclear Information System (INIS)

    Kimura, Nobuyuki; Nishimura, Motohiko; Kamide, Hideki; Hishida, Koichi

    1999-10-01

    A noise reduction system was developed to improve the applicability of Particle Image Velocimetry (PIV) to flows bounded by complicated configurations. For fast reactor safety and thermal hydraulic studies, experiments are performed in scale models which usually have rather complicated geometry and structures such as fuel subassemblies, heat exchangers, etc. The structures, and dust stuck on the view window of the models, obscure the particle image; any image content other than the moving particles can thus be regarded as noise. In the present study, two noise reduction techniques are proposed. The first is the Time-averaged Light Intensity Subtraction method (TIS), which subtracts the time-averaged light intensity of each pixel in the sequential images from each corresponding pixel. The second is the Minimum Light Intensity Subtraction method (MIS), which subtracts the minimum light intensity of each pixel in the sequential images from each corresponding pixel. Both methods were examined for their noise reduction capabilities. As the original 'benchmark' image, an image generated from a Large Eddy Simulation was used, and noise was added to the benchmark image to produce the sample images. Both methods reduce the rate of vectors with errors of more than one pixel from 90% to less than 5%, and more than 50% of the vectors have errors of less than 0.2 pixel. Uncertainty analysis shows that these methods enhance the accuracy of vector measurement 3-12 times when noisy images are processed, and that the MIS method is 1.1-2.1 times more accurate than the TIS. The present noise reduction methods are thus quite efficient in enhancing the accuracy of flow velocity fields measured from particle images that include structures and deposits on the view window. (author)
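
    Both subtraction schemes are simple per-pixel temporal reductions over the image sequence. A minimal sketch, assuming frames stored as nested lists of grayscale intensities (the frame values are invented):

```python
def clean_stack(frames, reducer):
    """Subtract a per-pixel temporal background (reducer applied across frames),
    clipping the result at zero."""
    rows, cols = len(frames[0]), len(frames[0][0])
    bg = [[reducer([f[r][c] for f in frames]) for c in range(cols)]
          for r in range(rows)]
    return [[[max(0.0, f[r][c] - bg[r][c]) for c in range(cols)]
             for r in range(rows)] for f in frames]

def tis(frames):
    """Time-averaged Light Intensity Subtraction: background = temporal mean."""
    return clean_stack(frames, lambda px: sum(px) / len(px))

def mis(frames):
    """Minimum Light Intensity Subtraction: background = temporal minimum."""
    return clean_stack(frames, min)

# two 2x2 frames: static structure at intensity 5, a moving particle adding 10
frames = [[[5, 15], [5, 5]],
          [[5, 5], [15, 5]]]
```

    MIS removes the static intensity 5 everywhere and preserves the full particle signal, whereas TIS also attenuates pixels the particle visits, which is consistent with the accuracy difference the abstract reports.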

  9. Assessment of the impact strength of the denture base resin polymerized by various processing techniques

    Directory of Open Access Journals (Sweden)

    Rajashree Jadhav

    2013-01-01

    Full Text Available Aim: To measure the impact strength of denture base resins polymerized using short and long curing cycles by water bath, pressure cooker and microwave techniques. Materials and Methods: For impact strength testing, 60 samples were made. The sample dimensions were 60 mm × 12 mm × 3 mm, as standardized by the American Society for Testing and Materials (ASTM). A digital caliper was used to locate the midpoint of each sample. The impact strength was measured with an IZOD-type CEAST impact tester: the pendulum struck the sample until it broke, and the energy required to break the sample was measured in Joules. Data were analyzed using Student's t test. Results: There was a statistically significant difference in the impact strength of denture base resins polymerized by the long curing cycle and the short curing cycle in each technique, with the long curing cycle performing best. Conclusion: The polymerization technique plays an important role in the impact strength of denture base resin. This research demonstrates that denture base resin polymerized by the microwave processing technique possessed the highest impact strength.

  10. Intelligent Technique for Signal Processing to Identify the Brain Disorder for Epilepsy Captures Using Fuzzy Systems

    Directory of Open Access Journals (Sweden)

    Gurumurthy Sasikumar

    2016-01-01

    Full Text Available Understanding the signals created by the brain is one of the main tasks in brain signal processing. Among all neurological disorders, human epilepsy is one of the most prevalent, and an automated artificial-intelligence detection technique is essential because of the irregular and unpredictable occurrence of epileptic seizures. We propose an improved fuzzy firefly algorithm, which enhances the classification of the brain signal efficiently with minimal iterations. An important clustering technique based on fuzzy logic is fuzzy C-means. Features obtained from multichannel EEG signals, in both the feature domain and the spatial domain, were combined by means of fuzzy algorithms, and the firefly algorithm is applied to optimize the fuzzy C-means membership function for more precise segmentation. Convergence criteria are set for efficient clustering. On the whole, the proposed technique yields more accurate results, giving it an edge over other techniques; its results are compared with other algorithms such as the fuzzy C-means algorithm and the PSO algorithm.
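
    For reference, a minimal fuzzy C-means loop of the kind the abstract builds on, in plain Python on 1-D samples (the data, the deterministic initialization, and the fixed iteration count are our simplifications; the paper's firefly optimization of the membership function is not shown):

```python
def fuzzy_c_means(data, c=2, m=2.0, iters=50):
    """Plain fuzzy C-means on 1-D samples; returns (centers, memberships).
    Assumes c >= 2; centers are initialized evenly across the data range."""
    lo, hi = min(data), max(data)
    centers = [lo + j * (hi - lo) / (c - 1) for j in range(c)]
    u = [[0.0] * c for _ in data]
    for _ in range(iters):
        # membership update: inverse-distance weighting with fuzzifier m
        for i, x in enumerate(data):
            d = [abs(x - v) + 1e-12 for v in centers]
            for j in range(c):
                u[i][j] = 1.0 / sum((d[j] / d[k]) ** (2.0 / (m - 1.0))
                                    for k in range(c))
        # center update: membership-weighted mean
        for j in range(c):
            w = [u[i][j] ** m for i in range(len(data))]
            centers[j] = sum(wi * x for wi, x in zip(w, data)) / sum(w)
    return centers, u

data = [0.0, 0.1, 0.2, 5.0, 5.1, 5.2]
centers, u = fuzzy_c_means(data)
```

    On these two well-separated groups the centers converge near 0.1 and 5.1; in the paper, the firefly search replaces this fixed-point iteration's sensitivity to initialization.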

  11. Studies on atom deceleration process by using the Zeeman-tuned technique

    International Nuclear Information System (INIS)

    Bagnato, V.S.

    1990-01-01

    The Zeeman-tuned technique for slowing an atomic beam of sodium atoms was studied in detail. A new technique to study the deceleration is used, which consists of monitoring the fluorescence along the deceleration path. This allows direct observation of the process and opens possibilities to investigate the adiabatic following of atoms in the magnetic field, among other very important aspects of the process. With a single laser and some modification of the magnetic field profile it is possible to stop atoms outside the slower solenoid, which makes many experiments much simpler. A systematic study of the optical pumping effects and adiabatic following conditions allows the production of a very intense beam of slow atoms. (author)
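
    For orientation, the textbook form of the ideal Zeeman-slower field profile, which compensates the changing Doppler shift of a uniformly decelerating atom, can be written down directly. This is a generic sketch, not taken from the thesis itself, and all numbers are illustrative:

```python
import math

def zeeman_field(z, b0, l0, b_bias=0.0):
    """Ideal slower profile B(z) = B_bias + B0 * sqrt(1 - z/L0), for 0 <= z <= L0.
    B0 is set by the capture velocity and L0 by the achievable deceleration."""
    return b_bias + b0 * math.sqrt(1.0 - z / l0)

# illustrative numbers: 0.1 T peak field over a 1 m solenoid
profile = [zeeman_field(z / 10.0, 0.1, 1.0) for z in range(11)]
```

    Modifying this profile near the solenoid exit (e.g. via the bias term) is what lets atoms come to rest outside the magnet, as the abstract describes.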

  12. Application of learning techniques based on kernel methods for the fault diagnosis in industrial processes

    Directory of Open Access Journals (Sweden)

    Jose M. Bernal-de-Lázaro

    2016-05-01

    Full Text Available This article summarizes the main contributions of the PhD thesis titled "Application of learning techniques based on kernel methods for the fault diagnosis in industrial processes". The thesis focuses on the analysis and design of fault diagnosis systems (DDF) based on historical data. Specifically, the thesis provides: (1) new criteria for adjusting the kernel methods used to select features with high discriminative capacity for fault diagnosis tasks; (2) a proposed process monitoring approach using multivariate statistical techniques that incorporates reinforced information on the dynamics of the Hotelling's T2 and SPE statistics, whose combination with kernel methods improves the detection of small-magnitude faults; and (3) a robustness index to compare the performance of diagnosis classifiers, taking into account their insensitivity to possible noise and disturbances in the historical data.
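
    To make the T2/SPE pairing concrete, here is a hedged sketch for an assumed two-variable process with one retained principal component. The loadings and variances are invented for illustration; in practice they come from PCA of the historical data the thesis works with:

```python
import math

def t2_spe(x, loadings, latent_vars):
    """Hotelling's T2 and SPE (Q) for a mean-centered sample x, given PCA
    loadings (unit vectors) and the variance captured by each component."""
    scores = [sum(p_i * x_i for p_i, x_i in zip(p, x)) for p in loadings]
    # T2: squared scores normalized by the variance of each component
    t2 = sum(t * t / lam for t, lam in zip(scores, latent_vars))
    # SPE: squared distance between the sample and its PCA reconstruction
    recon = [sum(t * p[i] for t, p in zip(scores, loadings))
             for i in range(len(x))]
    spe = sum((xi - ri) ** 2 for xi, ri in zip(x, recon))
    return t2, spe

# one retained component along [1, 1]/sqrt(2) capturing variance 2.0 (assumed)
P = [[1 / math.sqrt(2), 1 / math.sqrt(2)]]
t2_ok, spe_ok = t2_spe([1.0, 1.0], P, [2.0])     # sample on the model plane
t2_bad, spe_bad = t2_spe([1.0, -1.0], P, [2.0])  # sample off the model plane
```

    The in-model sample yields a moderate T2 and near-zero SPE, while the off-model sample is invisible to T2 but produces a large SPE, which is why the two statistics are monitored together.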

  13. Measurement techniques in dry-powdered processing of spent nuclear fuels

    International Nuclear Information System (INIS)

    Bowers, D. L.; Hong, J.-S.; Kim, H.-D.; Persiani, P. J.; Wolf, S. F.

    1999-01-01

    High-performance liquid chromatography (HPLC) with inductively coupled plasma mass spectrometry (ICPMS) detection, α-spectrometry (α-S), and γ-spectrometry (γ-S) were used for the determination of nuclide content in five samples excised from a high-burnup fuel rod taken from a pressurized water reactor (PWR). The samples were prepared for analysis by dissolution of dry-powdered samples. The measurement techniques required no separation of the plutonium, uranium, and fission products. The sample preparation and analysis techniques showed promise for in-line analysis of highly-irradiated spent fuels in a dry-powdered process. The analytical results allowed the determination of fuel burnup based on 148Nd, Pu, and U content. A goal of this effort is to develop the HPLC-ICPMS method for direct fissile material accountancy in the dry-powdered processing of spent nuclear fuel
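
    The 148Nd-based burnup determination mentioned above is conventionally an atom-percent-fission (FIMA) calculation from the measured nuclide inventory. The sketch below is a simplified illustration: the effective 148Nd fission yield is an assumed representative value for 235U thermal fission, and the corrections applied in practice (yield weighting over fissioning nuclides, capture effects) are omitted:

```python
# Assumed effective cumulative fission yield of 148Nd (approx. value for
# thermal fission of 235U); real analyses weight yields over all fissioning
# nuclides present in the fuel.
Y_ND148 = 0.0167

def burnup_atom_percent(n_nd148, n_u, n_pu):
    """FIMA (%) from measured atoms of 148Nd and residual U and Pu atoms."""
    fissions = n_nd148 / Y_ND148  # fissions inferred from the burnup monitor
    # initial heavy-metal atoms approximated as residual heavy atoms + fissions
    return 100.0 * fissions / (n_u + n_pu + fissions)
```

    With, say, 95 residual U atoms and 1 Pu atom per 0.0167 measured 148Nd atoms, this gives a burnup of roughly 1 atom percent fission.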

  14. Discrimination of Parkinsonian Tremor From Essential Tremor by Voting Between Different EMG Signal Processing Techniques

    Directory of Open Access Journals (Sweden)

    A Hossen

    2014-06-01

    Full Text Available Parkinson's disease (PD) and essential tremor (ET) are the two most common disorders that cause involuntary muscle shaking movements, or what is called "tremor". PD is a neurodegenerative disease caused by the loss of dopamine receptors which control and adjust the movement of the body. ET, on the other hand, is a neurological movement disorder which also causes tremor and shaking but is not related to dopamine receptor loss; it is simply a tremor. The differential diagnosis between these two disorders is sometimes difficult to make clinically because of the similarity of their symptoms; additionally, the available tests are complex and expensive. Thus, the objective of this paper is to discriminate between these two disorders in simpler, cheaper and easier ways, using electromyography (EMG) signal processing techniques. EMG and accelerometer records of 39 patients with PD and 41 with ET were acquired from the Hospital of Kiel University in Germany and divided into a trial group and a test group. Three main techniques were applied: the wavelet-based soft-decision technique, statistical signal characterization (SSC) of the spectrum of the signal, and SSC of the amplitude variation of the Hilbert transform. The first technique achieved a discrimination efficiency of 80% on the trial set and 85% on the test set; the second, 90% on the trial set and 82.5% on the test set; and the third, 87.5% on the trial set and 65.5% on the test set. Lastly, a vote among the three techniques finalized the discrimination, yielding accuracies of 92.5%, 85.0% and 88.75% on the trial data, test data and total data, respectively.
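
    The final voting step can be as simple as a majority decision over the three per-technique labels. A trivial sketch (labels and helper name are illustrative, not the paper's code):

```python
from collections import Counter

def majority_vote(labels):
    """Return the most common label among the individual classifiers."""
    return Counter(labels).most_common(1)[0][0]

# two of the three techniques say PD, so the vote outputs PD
print(majority_vote(["PD", "ET", "PD"]))  # PD
```

    With three voters and two classes there is always a strict majority, which is what lets the combined decision outperform each individual technique.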

  15. Diazo processing of LANDSAT imagery: A low-cost instructional technique

    Science.gov (United States)

    Lusch, D. P.

    1981-01-01

    Diazo processing of LANDSAT imagery is a relatively simple and cost-effective method of producing enhanced renditions of the visual LANDSAT products. The technique is capable of producing a variety of image enhancements which have value in a teaching laboratory environment. Additionally, with the appropriate equipment, applications research which relies on accurate and repeatable results is possible. Exposure and development equipment options, diazo materials, and enhancement routines are discussed.

  16. Extension of ERIM multispectral data processing capabilities through improved data handling techniques

    Science.gov (United States)

    Kriegler, F. J.

    1973-01-01

    The improvement and extension of the capabilities of the Environmental Research Institute of Michigan processing facility in handling multispectral data are discussed. Improvements consisted of implementing hardware modifications which permitted more rapid access to the recorded data through improved numbering and indexing of such data. In addition, techniques are discussed for handling data from sources other than the ERIM M-5 and M-7 scanner systems.

  17. An optimized protocol for handling and processing fragile acini cultured with the hanging drop technique.

    Science.gov (United States)

    Snyman, Celia; Elliott, Edith

    2011-12-15

    The hanging drop three-dimensional culture technique allows cultivation of functional three-dimensional mammary constructs without exogenous extracellular matrix. The fragile acini are, however, difficult to preserve during processing steps for advanced microscopic investigation. We describe adaptations to the protocol for handling of hanging drop cultures to include investigation using confocal, scanning, and electron microscopy, with minimal loss of cell culture components. Copyright © 2011 Elsevier Inc. All rights reserved.

  18. An Automated Energy Detection Algorithm Based on Morphological and Statistical Processing Techniques

    Science.gov (United States)

    2018-01-09

    3. Statistical Processing, 3.1 Statistical Analysis: Statistical analysis is the mathematical science … quantitative terms. In commercial prognostics and diagnostic vibrational monitoring applications, statistical techniques that are mainly used for alarm … Balakrishnan N, editors. Handbook of Statistics. Amsterdam (Netherlands): Elsevier Science; 1998. p. 555-602 (order statistics and their applications).

  19. Pre-writing Techniques In The Writing Process For The L2 Classroom

    OpenAIRE

    Gülşah Geyimci

    2014-01-01

    This study investigated pre-writing techniques in the learning process to improve the written communication skills of learners, using qualitative research methods. The study was performed at a public school, Suphi Öner Primary School, in Mersin, Turkey. The participants were a seventh-grade class of twenty students at pre-intermediate level. Over three weeks, the students' samples, drawings and blogs were documented by the students. In order to examine the results, ...

  20. Opportunities and applications of medical imaging and image processing techniques for nondestructive testing

    International Nuclear Information System (INIS)

    Song, Samuel Moon Ho; Cho, Jung Ho; Son, Sang Rock; Sung, Je Jonng; Ahn, Hyung Keun; Lee, Jeong Soon

    2002-01-01

    Nondestructive testing (NDT) of structures strives to extract all relevant data regarding the state of the structure without altering its form or properties. The success enjoyed by imaging and image processing technologies in the field of modern medicine forecasts similar success of image-processing-related techniques in both research and practice of NDT. In this paper, we focus on two particular instances of such applications: a modern vision technique for 3-D profile and shape measurement, and ultrasonic imaging with rendering for 3-D visualization. Ultrasonic imaging of 3-D structures for nondestructive evaluation purposes must provide readily recognizable 3-D images with enough detail to clearly show various faults that may or may not be present. As a step towards improving conspicuity and thus detection of faults, we propose a pulse-echo ultrasonic imaging technique to generate a 3-D image of the object under evaluation through strategic scanning and processing of the pulse-echo data. This three-dimensional processing and display improves the conspicuity of faults and, in addition, provides manipulation capabilities such as pan and rotation of the 3-D structure. As a second application, we consider an image-based three-dimensional shape determination system. The shape, and thus the three-dimensional coordinate information of the 3-D object, is determined solely from captured images of the 3-D object from a prescribed set of viewpoints. The approach is based on the shape from silhouette (SFS) technique, and the efficacy of the SFS method is tested using a sample data set. This system may be used to visualize the 3-D object efficiently, or to quickly generate initial CAD data for reverse engineering purposes. The proposed system may potentially be used in three-dimensional design applications such as 3-D animation and 3-D games.
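
    The shape-from-silhouette idea can be sketched as voxel carving on an axis-aligned grid: a voxel survives only if its projection lies inside every view's silhouette (a toy illustration with orthographic projections along the coordinate axes; a real SFS system uses calibrated camera viewpoints):

    ```python
    def shape_from_silhouettes(grid_size, silhouettes):
        # silhouettes maps a projection axis (0, 1 or 2) to the set of 2-D
        # coordinates covered by the object's silhouette along that axis.
        # A voxel is kept only if every view's silhouette contains its projection.
        kept = set()
        for x in range(grid_size):
            for y in range(grid_size):
                for z in range(grid_size):
                    voxel = (x, y, z)
                    if all(tuple(c for i, c in enumerate(voxel) if i != axis) in sil
                           for axis, sil in silhouettes.items()):
                        kept.add(voxel)
        return kept
    ```

    The carved set is the visual hull of the object: an outer bound on its true shape, which is why SFS output is well suited as initial CAD data for reverse engineering.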

  1. The use of artificial intelligence techniques to improve the multiple payload integration process

    Science.gov (United States)

    Cutts, Dannie E.; Widgren, Brian K.

    1992-01-01

    A maximum return of science and products with a minimum expenditure of time and resources is a major goal of mission payload integration. A critical component then, in successful mission payload integration is the acquisition and analysis of experiment requirements from the principal investigator and payload element developer teams. One effort to use artificial intelligence techniques to improve the acquisition and analysis of experiment requirements within the payload integration process is described.

  2. Noise Suppression in ECG Signals through Efficient One-Step Wavelet Processing Techniques

    Directory of Open Access Journals (Sweden)

    E. Castillo

    2013-01-01

    Full Text Available This paper illustrates the application of the discrete wavelet transform (DWT) for baseline wandering and noise suppression in electrocardiographic (ECG) signals. A novel one-step implementation is presented, which improves the overall denoising process. In addition, an exhaustive study is carried out, defining threshold limits and thresholding rules for optimal wavelet denoising using the presented technique. The system has been tested using synthetic ECG signals, which allow the effect of the proposed processing to be measured accurately. Moreover, results from real abdominal ECG signals acquired from pregnant women are presented in order to validate the presented approach.
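
    The kind of wavelet denoising described can be illustrated with a single-level Haar transform and soft thresholding (a minimal pure-Python sketch; the paper's one-step implementation, wavelet choice, and optimal threshold rules are not reproduced here):

    ```python
    import math

    def haar_dwt(signal):
        # One level of the Haar discrete wavelet transform:
        # returns (approximation, detail) coefficient lists.
        approx, detail = [], []
        for i in range(0, len(signal) - 1, 2):
            a, b = signal[i], signal[i + 1]
            approx.append((a + b) / math.sqrt(2))
            detail.append((a - b) / math.sqrt(2))
        return approx, detail

    def haar_idwt(approx, detail):
        # Inverse of haar_dwt for the same pairing of samples.
        signal = []
        for a, d in zip(approx, detail):
            signal.append((a + d) / math.sqrt(2))
            signal.append((a - d) / math.sqrt(2))
        return signal

    def soft_threshold(coeffs, thr):
        # Shrink coefficients toward zero; small (noise-like) details vanish.
        return [math.copysign(max(abs(c) - thr, 0.0), c) for c in coeffs]

    def denoise(signal, thr):
        approx, detail = haar_dwt(signal)
        return haar_idwt(approx, soft_threshold(detail, thr))
    ```

    In practice the transform is applied over several decomposition levels and the threshold is chosen from a noise estimate; the sketch only shows why thresholding detail coefficients suppresses high-frequency noise while preserving the signal's local averages.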

  3. Sorption and chromatographic techniques for processing liquid waste of nuclear fuel cycle

    International Nuclear Information System (INIS)

    Gelis, V.M.; Milyutin, V.V.; Chuveleva, E.A.; Maslova, G.B.; Kudryavtseva, S.P.; Firsova, L.A.; Kozlitin, E.A.

    2000-01-01

    In spent nuclear fuel processing procedures, a significant quantity of high-level liquid waste is generated, containing long-lived, highly toxic radionuclides of cesium, strontium, promethium, americium, curium, etc. Separation of those radionuclides from the waste not merely simplifies further safe waste handling but also reduces waste processing operation costs, owing to the market value of certain individual radionuclide preparations. Recovery and separation of high-purity preparations of long-lived radionuclides is frequently performed by means of chromatographic techniques. (authors)

  4. Multi-factor models and signal processing techniques application to quantitative finance

    CERN Document Server

    Darolles, Serges; Jay, Emmanuelle

    2013-01-01

    With recent outbreaks of multiple large-scale financial crises, amplified by interconnected risk sources, a new paradigm of fund management has emerged. This new paradigm leverages "embedded" quantitative processes and methods to provide more transparent, adaptive, reliable and easily implemented "risk assessment-based" practices. This book surveys the most widely used factor models employed within the field of financial asset pricing. Through the concrete application of evaluating risks in the hedge fund industry, the authors demonstrate that signal processing techniques are an intere

  5. Sophisticated Fowl: The Complex Behaviour and Cognitive Skills of Chickens and Red Junglefowl

    Directory of Open Access Journals (Sweden)

    Laura Garnham

    2018-01-01

    Full Text Available The world’s most numerous bird, the domestic chicken, and their wild ancestor, the red junglefowl, have long been used as model species for animal behaviour research. Recently, this research has advanced our understanding of the social behaviour, personality, and cognition of fowl, and demonstrated their sophisticated behaviour and cognitive skills. Here, we overview some of this research, starting with describing research investigating the well-developed senses of fowl, before presenting how socially and cognitively complex they can be. The realisation that domestic chickens, our most abundant production animal, are behaviourally and cognitively sophisticated should encourage an increase in general appreciation of, and fascination with, them. In turn, this should inspire their increased use as both research and hobby animals, as well as improvements in their unfortunately often poor welfare.

  6. The relation between maturity and sophistication shall be properly dealt with in nuclear power development

    International Nuclear Information System (INIS)

    Li Yongjiang

    2009-01-01

    The paper analyses the advantages and disadvantages, in terms of safety and economy, of the second-generation improved technologies and the third-generation technologies mainly developed in China. The paper also discusses the maturity of the second-generation improved technologies and the sophistication of the third-generation technologies, respectively. Meanwhile, the paper proposes that the advantages and disadvantages of second-generation improved technologies and third-generation technologies should be carefully weighed, and that the relationship between maturity and sophistication should be properly dealt with at the current stage. A two-step strategy should be adopted as a solution to the problem of insufficient nuclear power capacity: trace and develop the third-generation technologies, so as to ensure the sound and fast development of nuclear power. (authors)

  7. Acoustic levitation technique for containerless processing at high temperatures in space

    Science.gov (United States)

    Rey, Charles A.; Merkley, Dennis R.; Hammarlund, Gregory R.; Danley, Thomas J.

    1988-01-01

    High temperature processing of a small specimen without a container has been demonstrated in a set of experiments using an acoustic levitation furnace in the microgravity of space. This processing technique includes the positioning, heating, melting, cooling, and solidification of a material supported without physical contact with a container or other surface. The specimen is supported in a potential energy well, created by an acoustic field, which is sufficiently strong to position the specimen in the microgravity environment of space. This containerless processing apparatus was successfully tested on the Space Shuttle during the STS-61A mission. In that experiment, three samples were successfully levitated and processed at temperatures from 600 to 1500 °C. Experiment data and results are presented.

  8. Effects of processing techniques on oxidative stability of Prunus pedunculatus seed oil

    Directory of Open Access Journals (Sweden)

    J. Yan

    2017-09-01

    Full Text Available This paper investigated the effects of Prunus pedunculatus (P. pedunculatus) seed pre-treatment, including microwaving (M), roasting (R), steaming (S) and roasting plus steaming (RS), on crude oil quality in terms of yield, color change, fatty acid composition, and oxidative stability. The results showed an increase in the monounsaturated fatty acid content and oxidative stability of the oils obtained from the different processing treatments compared to the oil obtained from raw seeds (RW) without processing. The oils obtained from pretreated seeds had higher conjugated diene (CD) and 2-thiobarbituric acid (2-TBA) values, compared to that obtained from RW, when stored in a Schaal oven at 65 °C for 168 h. However, polyphenol and tocopherol contents decreased in all oil samples, processed or unprocessed. The effect of pre-treating the seeds was most prominent in the oil sample obtained through the RS technique, which showed higher oxidative stability than the other processed oils and the oil from RW.

  9. A novel eco-friendly technique for efficient control of lime water softening process.

    Science.gov (United States)

    Ostovar, Mohamad; Amiri, Mohamad

    2013-12-01

    Lime softening is an established type of water treatment used for water softening. The performance of this process is highly dependent on the lime dosage. Currently, the lime dosage is adjusted manually based on chemical tests, aimed at maintaining the phenolphthalein (P) and total (M) alkalinities within a certain range (2P - M ≥ 5). In this paper, a critical study of the softening process is presented. It is shown that the current method is frequently incorrect. Furthermore, electrical conductivity (EC) is introduced as a novel indicator for effectively characterizing the lime softening process. This novel technique has several advantages over the current alkalinities method. Because the EC measurement is a simple test and requires no chemical reagents for titration, there is a considerable reduction in test costs. Additionally, there is a reduction in the treated water hardness and in the sludge generated during the lime softening process. It is therefore highly eco-friendly, and a very cost-effective alternative technique for efficient control of the lime softening process.
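
    The conventional alkalinity criterion can be expressed as a one-line control check (a hypothetical sketch of the 2P - M ≥ 5 rule the paper criticizes; the threshold and units follow the plant's working convention, which is an assumption here):

    ```python
    def lime_dose_ok(p_alk, m_alk, threshold=5.0):
        # Classical lime-softening control rule: keep 2P - M at or above
        # the threshold, where P is the phenolphthalein alkalinity and
        # M the total alkalinity, both from manual titration tests.
        return 2.0 * p_alk - m_alk >= threshold
    ```

    The paper's point is that a better controller would track electrical conductivity instead, for example dosing lime toward the EC behaviour that characterizes the softened water; that replacement logic is not shown here.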

  10. Measurable Disturbances Compensation: Analysis and Tuning of Feedforward Techniques for Dead-Time Processes

    Directory of Open Access Journals (Sweden)

    Andrzej Pawlowski

    2016-04-01

    Full Text Available In this paper, measurable disturbance compensation techniques are analyzed, focusing the problem on the input-output and disturbance-output time delays. The feedforward compensation method is evaluated for the common structures that appear between the disturbance and process dynamics. Due to the presence of time delays, the study includes the causality and instability phenomena that can arise when a classical approach for disturbance compensation is used. Different feedforward configurations are analyzed for two feedback control techniques, PID (Proportional-Integral-Derivative) and MPC (Model Predictive Control), that are widely used for industrial process-control applications. A tuning methodology specific to the analyzed process structure is used to obtain disturbance rejection performance that improves on classical approaches. The evaluation of the introduced disturbance rejection schemes is performed through simulation, considering process constraints in order to highlight the advantages and drawbacks in common scenarios. The performance of the analyzed structures is expressed with different indexes that allow direct comparisons. The obtained results show that proper design and tuning of the feedforward action helps to significantly improve the overall control performance in process control tasks.
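
    The causality issue the paper analyzes can be seen in a minimal discrete-time simulation of static feedforward for a first-order process with input and disturbance dead times (the model, gains and delays are illustrative assumptions, not the paper's benchmark or tuning rules):

    ```python
    def simulate(n, a, kp, kd, delay_u, delay_d, dist, use_ff):
        # Discrete first-order process y[k+1] = a*y[k] + (1-a)*(kp*u + kd*d),
        # where the control u is delayed by delay_u samples and the measured
        # disturbance d by delay_d samples.
        y, out = 0.0, []
        u_hist = [0.0] * n
        for k in range(n):
            if use_ff and delay_d >= delay_u:
                # Causal static feedforward: invert the process gain and
                # align the delays so both paths cancel at the output.
                j = k - (delay_d - delay_u)
                u_hist[k] = -(kd / kp) * (dist[j] if j >= 0 else 0.0)
            u = u_hist[k - delay_u] if k >= delay_u else 0.0
            d = dist[k - delay_d] if k >= delay_d else 0.0
            y = a * y + (1.0 - a) * (kp * u + kd * d)
            out.append(y)
        return out
    ```

    With a perfect model and a disturbance delay no shorter than the input delay, the compensation is causal and cancels the disturbance exactly; when the disturbance path is faster, the required compensator would need future disturbance values, which is the causality problem discussed above.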

  11. Electron sterilization validation techniques using the controlled depth of sterilization process

    International Nuclear Information System (INIS)

    Cleghorn, D.A.; Nablo, S.V.

    1990-01-01

    Many pharmaceutical products, especially parenteral drugs, cannot be sterilized with gamma rays or high-energy electrons due to the concomitant product degradation. In view of the well-controlled electron energy spectrum available in modern electron processors, it is practical to deliver sterilizing doses over depths considerably less than those defining the thickness of blister-pack constructions or pharmaceutical containers. Because bremsstrahlung and X-ray production are minimized at these low electron energies and in these low-Z materials, very high ratios of electron dose to penetrating X-ray dose are possible for the application of the technique. Thin-film dosimetric techniques have been developed utilizing radiochromic film in the 10-60 g/m² range for determining the surface dose distribution in occluded surface areas where direct electron illumination is not possible. Procedures for validation of the process using dried spore inoculum on the product, as well as in good geometry, are employed to determine the process lethality and its dependence on product surface geometry. Applications of the process to labile pharmaceuticals in glass and polystyrene syringes are reviewed. It has been applied to the sterilization of commercial sterile products since 1987, and the advantages and natural limitations of the technique are discussed. (author)

  12. A safeguards verification technique for solution homogeneity and volume measurements in process tanks

    International Nuclear Information System (INIS)

    Suda, S.; Franssen, F.

    1987-01-01

    A safeguards verification technique is being developed for determining whether process-liquid homogeneity has been achieved in process tanks and for authenticating volume-measurement algorithms involving temperature corrections. It is proposed that, in new designs for bulk-handling plants employing automated process lines, bubbler probes and thermocouples be installed at several heights in key accountability tanks. High-accuracy measurements of density using an electromanometer can now be made which match or even exceed analytical-laboratory accuracies. Together with regional determination of tank temperatures, these measurements provide density, liquid-column weight and temperature gradients over the fill range of the tank that can be used to ascertain when the tank solution has reached equilibrium. Temperature-correction algorithms can be authenticated by comparing the volumes obtained from the several bubbler-probe liquid-height measurements, each based on different amounts of liquid above and below the probe. The verification technique is based on the automated electromanometer system developed by Brookhaven National Laboratory (BNL). The IAEA has recently approved the purchase of a stainless-steel tank equipped with multiple bubbler and thermocouple probes for installation in its Bulk Calibration Laboratory at IAEA Headquarters, Vienna. The verification technique is scheduled for preliminary trials in late 1987
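
    The density and liquid-height determinations from the bubbler-probe pressures follow directly from hydrostatics (a minimal sketch, assuming gauge pressures in pascals, lengths in metres, and uniform density between the probes; this is not the BNL electromanometer algorithm itself):

    ```python
    G = 9.80665  # standard gravity, m/s^2

    def density_from_probes(p_lower, p_upper, dz):
        # Mean solution density between two bubbler probes a vertical
        # distance dz apart, from their gauge-pressure difference.
        return (p_lower - p_upper) / (G * dz)

    def liquid_height(p_probe, rho):
        # Height of the liquid column above a probe from its gauge pressure.
        return p_probe / (rho * G)
    ```

    Comparing densities computed between different probe pairs over time indicates when the tank contents have mixed to equilibrium, and volumes computed from the several probe heights can cross-check the temperature-correction algorithms, as the record describes.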

  13. Financial Sophistication and the Distribution of the Welfare Cost of Inflation

    OpenAIRE

    Paola Boel; Gabriele Camera

    2009-01-01

    The welfare cost of anticipated inflation is quantified in a calibrated model of the U.S. economy that exhibits tractable equilibrium dispersion in wealth and earnings. Inflation does not generate large losses in societal welfare, yet its impact varies noticeably across segments of society depending also on the financial sophistication of the economy. If money is the only asset, then inflation hurts mostly the wealthier and more productive agents, while those poorer and less productive may ev...

  14. Putin’s Russia: Russian Mentality and Sophisticated Imperialism in Military Policies

    OpenAIRE

    Szénási, Lieutenant-Colonel Endre

    2016-01-01

    According to my experiences, the Western world hopelessly fails to understand Russian mentality, or misinterprets it. During my analysis of the Russian way of thinking I devoted special attention to the examination of military mentality. I have connected the issue of the Russian way of thinking to the contemporary imperial policies of Putin’s Russia.  I have also attempted to prove the level of sophistication of both. I hope that a better understanding of both the Russian mentality and imperi...

  15. Plant process computer replacements - techniques to limit installation schedules and costs

    International Nuclear Information System (INIS)

    Baker, M.D.; Olson, J.L.

    1992-01-01

    Plant process computer systems, a standard fixture in all nuclear power plants, are used to monitor and display important plant process parameters. Scanning thousands of field sensors and alarming out-of-limit values, these computer systems are heavily relied on by control room operators. The original nuclear steam supply system (NSSS) vendor for the power plant often supplied the plant process computer. Designed using sixties and seventies technology, a plant's original process computer has been obsolete for some time. Driven by increased maintenance costs and new US Nuclear Regulatory Commission regulations such as NUREG-0737, Suppl. 1, many utilities have replaced their process computers with more modern computer systems. Given that computer systems are by their nature prone to rapid obsolescence, this replacement cycle will likely repeat. A process computer replacement project can be a significant capital expenditure and must be performed during a scheduled refueling outage. The object of the installation process is to install a working system on schedule. Experience gained by supervising several computer replacement installations has taught lessons that, if applied, will shorten the schedule and limit the risk of costly delays. Examples illustrating this technique are given. This paper and these examples deal only with the installation process and assume that the replacement computer system has been adequately designed, developed, and factory tested.

  16. Mathematical Foundation Based Inter-Connectivity modelling of Thermal Image processing technique for Fire Protection

    Directory of Open Access Journals (Sweden)

    Sayantan Nath

    2015-09-01

    Full Text Available In this paper, the integration between multiple functions of image processing and their statistical parameters for an intelligent, alarm-series-based fire detection system is presented. The proper inter-connectivity mapping between processing elements of imagery, based on a classification factor for temperature monitoring and a multilevel intelligent alarm sequence, is introduced by an abstractive canonical approach. The flow of image processing components between the core implementation of the intelligent alarming system, with temperature-wise area segmentation as well as boundary detection, is not yet fully explored in the present era of thermal imaging. In the light of the analytical perspective of convolutive functionalism in thermal imaging, an abstract-algebra-based inter-mapping model is discussed between event-calculus-supported DAGSVM classification for step-by-step generation of the alarm series with a gradual monitoring technique, and segmentation of regions with their affected boundaries in a thermographic image of coal with respect to temperature distinctions. The connectedness of the multifunctional operations of an image-processing-based, compatible fire protection system with a proper monitoring sequence is investigated here. The mathematical models representing the relation between the temperature-affected areas and their boundaries in the obtained thermal image, defined in partial-derivative fashion, are the core contribution of this study. The thermal image of a coal sample was obtained in a real-life scenario with a self-assembled thermographic camera. The amalgamation of area segmentation, boundary detection and the alarm series is described in abstract algebra. The principal objective of this paper is to understand the dependency pattern and working principles of the image processing components, and to structure an inter-connected modelling technique for those components with the help of a mathematical foundation.

  17. The training and learning process of transseptal puncture using a modified technique.

    Science.gov (United States)

    Yao, Yan; Ding, Ligang; Chen, Wensheng; Guo, Jun; Bao, Jingru; Shi, Rui; Huang, Wen; Zhang, Shu; Wong, Tom

    2013-12-01

    As the transseptal (TS) puncture has become an integral part of many types of cardiac interventional procedures, its technique, first reported for measurement of left atrial pressure in the 1950s, continues to evolve. Our laboratory adopted a modified technique which uses only a coronary sinus catheter as the landmark to accomplish TS punctures under fluoroscopy. The aim of this study was to prospectively evaluate the training and learning process for TS puncture guided by this modified technique. Guided by the training protocol, TS puncture was performed in 120 consecutive patients by three trainees without previous personal experience in TS catheterization, with one experienced trainer as a controller. We analysed the following parameters: one-puncture success rate, total procedure time, fluoroscopic time, and radiation dose. The learning curve was analysed using curve-fitting methodology. The first attempt at TS crossing was successful in 74 patients (82%), a second attempt was successful in 11 (12%), and in 5 patients the interatrial septum ultimately could not be punctured. The average starting process time was 4.1 ± 0.8 min, and the estimated mean learning plateau was 1.2 ± 0.2 min. The estimated mean learning rate for process time was 25 ± 3 cases. Important aspects of the learning curve can be estimated by fitting inverse curves for TS puncture. The study demonstrated that this technique is a simple, safe, economical, and effective approach for learning TS puncture. Based on the statistical analysis, approximately 29 TS punctures will be needed for a trainee to pass the steepest area of the learning curve.
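
    The inverse-curve fitting mentioned above can be sketched as a least-squares fit of y = a + b/x, where a estimates the learning plateau and b how quickly performance approaches it (an illustrative sketch; the study's exact model form and fitting procedure are assumptions):

    ```python
    def fit_inverse(xs, ys):
        # Least-squares fit of the inverse model y = a + b / x by linear
        # regression of y against t = 1 / x.
        ts = [1.0 / x for x in xs]
        n = float(len(ts))
        mt, my = sum(ts) / n, sum(ys) / n
        b = (sum((t - mt) * (y - my) for t, y in zip(ts, ys))
             / sum((t - mt) ** 2 for t in ts))
        a = my - b * mt
        return a, b  # (plateau, curvature of the learning curve)
    ```

    Here xs would be case numbers and ys the corresponding process times; the fitted plateau a corresponds to the asymptotic process time a trainee reaches once past the steep part of the curve.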

  18. Rapid Automated Dissolution and Analysis Techniques for Radionuclides in Recycle Process Streams

    International Nuclear Information System (INIS)

    Sudowe, Ralf; Roman, Audrey; Dailey, Ashlee; Go, Elaine

    2013-01-01

    The analysis of process samples for radionuclide content is an important part of current procedures for material balance and accountancy in the different process streams of a recycling plant. The destructive sample analysis techniques currently available necessitate a significant amount of time. It is therefore desirable to develop new sample analysis procedures that allow for a quick turnaround time and increased sample throughput with a minimum of deviation between samples. In particular, new capabilities for rapid sample dissolution and radiochemical separation are required. Most of the radioanalytical techniques currently employed for sample analysis are based on manual laboratory procedures. Such procedures are time- and labor-intensive, and not well suited for situations in which a rapid sample analysis is required and/or a large number of samples need to be analyzed. To address this issue we are currently investigating radiochemical separation methods based on extraction chromatography that have been specifically optimized for the analysis of process stream samples. The influence of potential interferences present in the process samples, as well as mass loading, flow rate and resin performance, is being studied. In addition, the potential to automate these procedures utilizing a robotic platform is being evaluated. Initial studies have been carried out using the commercially available DGA resin. This resin shows an affinity for Am, Pu, U, and Th and also exhibits signs of a possible synergistic effect in the presence of iron.

  19. Implementation of quality improvement techniques for management and technical processes in the ACRV project

    Science.gov (United States)

    Raiman, Laura B.

    1992-12-01

    Total Quality Management (TQM) is a cooperative form of doing business that relies on the talents of everyone in an organization to continually improve quality and productivity, using teams and an assortment of statistical and measurement tools. The objective of the activities described in this paper was to implement effective improvement tools and techniques in order to build work processes which support good management and technical decisions and actions which are crucial to the success of the ACRV project. The objectives were met by applications in both the technical and management areas. The management applications involved initiating focused continuous improvement projects with widespread team membership. The technical applications involved applying proven statistical tools and techniques to the technical issues associated with the ACRV Project. Specific activities related to the objective included working with a support contractor team to improve support processes, examining processes involved in international activities, a series of tutorials presented to the New Initiatives Office and support contractors, a briefing to NIO managers, and work with the NIO Q+ Team. On the technical side, work included analyzing data from the large-scale W.A.T.E.R. test, landing mode trade analyses, and targeting probability calculations. The results of these efforts will help to develop a disciplined, ongoing process for producing fundamental decisions and actions that shape and guide the ACRV organization.

  20. Plasma processing techniques for deposition of carbonic thin protective coatings on structural nuclear materials

    International Nuclear Information System (INIS)

    Andrei, V.; Oncioiu, G.; Coaca, E.; Rusu, O.; Lungu, C.

    2009-01-01

    Full text of publication follows: The production of nano-structured surface films with controlled properties is crucial for the development of materials necessary for Advanced Systems for Nuclear Energy. Since the surface of a material is the zone through which it interacts with the environment, surface science and surface engineering techniques play an essential role in understanding and controlling the processes involved. Complex surface structures were developed on stainless steels used as structural nuclear materials (austenitic stainless steels based on Fe, austenitic steels with high Cr content, and corrosion-resistant ferrites) by various plasma processing methods, which include: - plasma electrolytic (PE) treatments, in which the steel substrates were modified by nitriding and nitro-carburizing plasma diffusion treatments; - carbonic film deposition in a thermionic vacuum arc plasma. The results of the characterization of surface structures obtained under various experimental conditions for the improvement of properties (corrosion resistance, hardness, wear properties) are reported; the processes and structures were characterized by correlating the results of the complementary techniques XPS, depth profiling, SEM, XRD, and EIS. An overall description of the processes involved in the improvement of surface properties, and some considerations about the development of new materials for energy technologies, are presented

  1. Implementation of quality improvement techniques for management and technical processes in the ACRV project

    Science.gov (United States)

    Raiman, Laura B.

    1992-01-01

    Total Quality Management (TQM) is a cooperative form of doing business that relies on the talents of everyone in an organization to continually improve quality and productivity, using teams and an assortment of statistical and measurement tools. The objective of the activities described in this paper was to implement effective improvement tools and techniques in order to build work processes which support good management and technical decisions and actions which are crucial to the success of the ACRV project. The objectives were met by applications in both the technical and management areas. The management applications involved initiating focused continuous improvement projects with widespread team membership. The technical applications involved applying proven statistical tools and techniques to the technical issues associated with the ACRV Project. Specific activities related to the objective included working with a support contractor team to improve support processes, examining processes involved in international activities, a series of tutorials presented to the New Initiatives Office and support contractors, a briefing to NIO managers, and work with the NIO Q+ Team. On the technical side, work included analyzing data from the large-scale W.A.T.E.R. test, landing mode trade analyses, and targeting probability calculations. The results of these efforts will help to develop a disciplined, ongoing process for producing fundamental decisions and actions that shape and guide the ACRV organization.

  2. An experimental evaluation of the generalizing capabilities of process discovery techniques and black-box sequence models

    NARCIS (Netherlands)

    Tax, N.; van Zelst, S.J.; Teinemaa, I.; Gulden, Jens; Reinhartz-Berger, Iris; Schmidt, Rainer; Guerreiro, Sérgio; Guédria, Wided; Bera, Palash

    2018-01-01

    A plethora of automated process discovery techniques have been developed which aim to discover a process model based on event data originating from the execution of business processes. The aim of the discovered process models is to describe the control-flow of the underlying business process. At the

  3. Dysphagia Screening: Contributions of Cervical Auscultation Signals and Modern Signal-Processing Techniques

    Science.gov (United States)

    Dudik, Joshua M.; Coyle, James L.

    2015-01-01

    Cervical auscultation is the recording of sounds and vibrations produced by the human body at the throat during swallowing. While traditionally done by a trained clinician with a stethoscope, much work has been put towards developing more sensitive and clinically useful methods to characterize the data obtained with this technique. The eventual goal of the field is to improve the effectiveness of screening algorithms designed to predict the risk that swallowing disorders pose to individual patients’ health and safety. This paper provides an overview of these signal processing techniques and summarizes recent advances made with digital transducers in hopes of organizing the highly varied research on cervical auscultation. It investigates where on the body these transducers are placed in order to record a signal, as well as the collection of analog and digital filtering techniques used to further improve the signal quality. It also presents the wide array of methods and features used to characterize these signals, ranging from simply counting the number of swallows that occur over a period of time to calculating various descriptive features in the time, frequency, and phase-space domains. Finally, this paper presents the algorithms that have been used to classify these data into ‘normal’ and ‘abnormal’ categories. Both linear and non-linear techniques are presented in this regard. PMID:26213659
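
    As a toy illustration of the swallow-counting end of this pipeline, the sketch below rectifies a signal, smooths it into a moving-average envelope, and counts rising threshold crossings; the window length, threshold, and synthetic trace are arbitrary assumptions, not clinically validated parameters.

```python
def count_events(signal, window=5, threshold=0.5):
    """Count bursts in a signal using a rectified moving-average
    envelope and a rising-threshold-crossing rule (one count per
    crossing). Parameters are illustrative, not clinical."""
    rect = [abs(x) for x in signal]
    env = []
    for i in range(len(rect)):
        lo = max(0, i - window // 2)
        hi = min(len(rect), i + window // 2 + 1)
        env.append(sum(rect[lo:hi]) / (hi - lo))
    count, above = 0, False
    for e in env:
        if e > threshold and not above:
            count += 1
            above = True
        elif e <= threshold:
            above = False
    return count

# Synthetic trace: two "swallow" bursts separated by quiet baseline.
quiet = [0.02] * 20
burst = [0.0, 0.9, -1.0, 0.8, -0.7, 0.9, 0.0]
trace = quiet + burst + quiet + burst + quiet
print(count_events(trace))  # 2 bursts detected
```

    Real systems replace the threshold rule with the classifiers surveyed above, but the envelope-then-decide structure is the same.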

  4. Application of thin layer activation technique for monitoring corrosion of carbon steel in hydrocarbon processing environment.

    Science.gov (United States)

    Saxena, R C; Biswal, Jayashree; Pant, H J; Samantray, J S; Sharma, S C; Gupta, A K; Ray, S S

    2018-05-01

    Acidic crude oil transportation and processing in petroleum refining and petrochemical operations cause corrosion in pipelines and associated components. Corrosion monitoring is invariably required to test and prove operational reliability. Thin Layer Activation (TLA) is a nuclear technique used for the measurement of corrosion and erosion of materials. The technique involves irradiation of the material with a high-energy ion beam from an accelerator and measurement of the loss of radioactivity after the material is subjected to a corrosive environment. In the present study, the TLA technique has been used to monitor corrosion of carbon steel (CS) in a crude oil environment at high temperature. Different CS coupons were irradiated with a 13 MeV proton beam to produce the Cobalt-56 radioisotope on the surface of the coupons. The corrosion studies were carried out by subjecting the irradiated coupons to a corrosive environment, i.e., uninhibited straight run gas oil (SRGO) containing a known amount of naphthenic acid (NA), at high temperature. The effects of different parameters, such as NA concentration, temperature, and fluid velocity (rpm), on the corrosion behaviour of CS were studied. Copyright © 2018 Elsevier Ltd. All rights reserved.
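
    The core TLA bookkeeping — correcting the measured activity for Co-56 decay before converting the activity loss to a depth — can be sketched as follows. The half-life is a nominal literature value, and the uniform 100 μm activity profile is a simplifying assumption standing in for a real depth-calibration curve.

```python
import math

T_HALF_CO56_D = 77.2      # Co-56 half-life in days (nominal value)
LAYER_DEPTH_UM = 100.0    # assumed activated-layer depth (illustrative)

def corroded_depth(a0, a_t, days):
    """Estimate material loss from the decay-corrected drop in surface
    activity, assuming a uniform activity-depth profile over the
    activated layer (a simplification of real TLA calibration)."""
    lam = math.log(2) / T_HALF_CO56_D
    residual = min((a_t / a0) * math.exp(lam * days), 1.0)
    return LAYER_DEPTH_UM * (1.0 - residual)

# After 30 days the coupon reads 700 counts/s against an initial 1000.
loss = corroded_depth(1000.0, 700.0, 30.0)
print(round(loss, 1))  # micrometres removed
```

    If the activity fell only through radioactive decay, the decay-corrected residual is 1 and the inferred loss is zero, which is the sanity check used in practice.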

  5. An intelligent signal processing and pattern recognition technique for defect identification using an active sensor network

    Science.gov (United States)

    Su, Zhongqing; Ye, Lin

    2004-08-01

    The practical utilization of elastic waves, e.g. Rayleigh-Lamb waves, in high-performance structural health monitoring is somewhat impeded by complicated wave dispersion phenomena, the existence of multiple wave modes, high susceptibility to diverse interferences, bulky sampled data and difficulty in signal interpretation. An intelligent signal processing and pattern recognition (ISPPR) approach using wavelet transform and artificial neural network algorithms was developed and implemented in a signal processing package (SPP). The ISPPR technique performs signal filtration, data compression, characteristic extraction, information mapping and pattern recognition, and is capable of extracting essential yet concise features from acquired raw wave signals and further assisting in structural health evaluation. For validation, the SPP was applied to the prediction of crack growth in an alloy structural beam and to the construction of a damage parameter database for defect identification in CF/EP composite structures. The results clearly showed that elastic wave propagation-based damage assessment can be dramatically streamlined by introducing the ISPPR technique.
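
    A minimal stand-in for the wavelet-based compression and feature-extraction stage is a single-level Haar transform that keeps only the largest-magnitude detail coefficients; the actual SPP is far more elaborate, and the signal below is invented.

```python
def haar_step(x):
    """One level of the Haar wavelet transform: pairwise averages
    (approximation) and differences (detail). Length must be even."""
    approx = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) / 2 for i in range(0, len(x), 2)]
    return approx, detail

def compress(x, keep=0.2):
    """Crude compression: zero all but the largest-magnitude detail
    coefficients -- a sketch of the filtration/compression idea."""
    approx, detail = haar_step(x)
    n_keep = max(1, int(len(detail) * keep))
    ranked = sorted(range(len(detail)), key=lambda i: -abs(detail[i]))
    kept = set(ranked[:n_keep])
    detail = [d if i in kept else 0.0 for i, d in enumerate(detail)]
    return approx, detail

signal = [1.0, 1.0, 1.0, 5.0, 1.0, 1.0, 1.0, 1.0]
approx, detail = compress(signal, keep=0.25)
print(approx, detail)  # [1.0, 3.0, 1.0, 1.0] [0.0, -2.0, 0.0, 0.0]
```

    The surviving detail coefficient localizes the transient — the kind of concise feature a downstream neural network would then classify.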

  6. Domain Immersion Technique And Free Surface Computations Applied To Extrusion And Mixing Processes

    Science.gov (United States)

    Valette, Rudy; Vergnes, Bruno; Basset, Olivier; Coupez, Thierry

    2007-04-01

    This work focuses on the development of numerical techniques devoted to the simulation of mixing processes of complex fluids, such as twin-screw extrusion or batch mixing. In mixing process simulation, the absence of symmetry of the moving boundaries (the screws or the rotors) implies that their rigid body motion has to be taken into account by a special treatment. We therefore use a mesh immersion technique (MIT), which consists in using a P1+/P1-based (MINI-element) mixed finite element method for solving the velocity-pressure problem and then solving the problem in the whole barrel cavity by imposing a rigid motion (rotation) on nodes located inside the so-called immersed domain, each subdomain (screw, rotor) being represented by a surface CAD mesh (or its mathematical equation in simple cases). The independent meshes are immersed into a unique background computational mesh by computing the distance function to their boundaries. Intersections of meshes are accounted for, allowing a fill factor to be computed as in the VOF methodology. This technique, combined with parallel computing, allows the time-dependent flow of generalized Newtonian fluids, including yield stress fluids, to be computed in a complex system such as a twin-screw extruder, including moving free surfaces, which are treated by a "level set" and Hamilton-Jacobi method.
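
    The central MIT step — testing background-mesh nodes against a distance function and imposing the rotor's rigid rotation on the nodes found inside — can be caricatured in a few lines. The circular rotor and node list are invented for illustration; the real method works on CAD surface meshes within a finite element solve.

```python
import math

def inside_circle(p, center, radius):
    """Distance-function test: negative signed distance means the
    node lies inside the immersed (rotor) domain."""
    return math.hypot(p[0] - center[0], p[1] - center[1]) - radius < 0.0

def impose_rotation(nodes, center, radius, omega):
    """Impose the rigid-rotation velocity v = omega x r on background
    nodes inside the immersed rotor; other nodes keep a zero
    (to-be-solved) velocity. A toy version of the MIT idea."""
    vel = []
    for p in nodes:
        if inside_circle(p, center, radius):
            rx, ry = p[0] - center[0], p[1] - center[1]
            vel.append((-omega * ry, omega * rx))
        else:
            vel.append((0.0, 0.0))
    return vel

nodes = [(0.0, 0.0), (0.1, 0.0), (1.0, 1.0)]
v = impose_rotation(nodes, center=(0.0, 0.0), radius=0.5, omega=2.0)
print(v)
```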

  7. Determination of some process parameters in a tyre-cord plant using radiotracer technique

    International Nuclear Information System (INIS)

    Kirti; Madhavankutti, C.K.; Eapen, A.C.

    1979-01-01

    In the process industry, it is often necessary to study process parameters such as residence time and flow rate under different operating conditions and equipment. In this respect, the tracer technique is an outstanding, and sometimes the only, means of determining some of these parameters. A method is described that consists of introducing a radioactive tracer at the input of the flow system under study and subsequently determining the distribution of activity with time at the output end. The form of the activity-time curve depends on the parameters of the installation and the mode of operation. A study conducted at a multi-stage viscose rayon processing plant is described in detail. (auth.)
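
    For example, the mean residence time follows from the first moment of the measured activity-time curve; with equally spaced samples the time steps cancel. The detector readings below are hypothetical.

```python
def mean_residence_time(times, counts):
    """First moment of the tracer response curve:
    MRT = sum(t*C) / sum(C), assuming equal time steps so the
    step width cancels out of the ratio."""
    return sum(t * c for t, c in zip(times, counts)) / sum(counts)

# Hypothetical detector readings at the plant outlet (equal steps).
t = [0, 1, 2, 3, 4, 5, 6]
c = [0, 2, 8, 10, 6, 3, 1]
print(mean_residence_time(t, c))  # 3.1 time units
```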

  8. Impurities in sugar cane and their influence on industrial processing evaluated by nuclear techniques

    International Nuclear Information System (INIS)

    Bacchi, M.A.; Fernandes, E.A.N.; Ferraz, E.S.B.

    1990-01-01

    During the cutting and loading operations, impurities, mainly soil, are added to sugar cane in amounts that can impair industrial processing due to excessive wear of metallic members and contamination of juice and bagasse. Mechanization of the loading operation has led to a considerable increase in the impurity content, prompting improvements in cane washing technology. Nevertheless, a correct understanding of the problem and optimization of the process require an exact and fast quantification of these impurities as well as of their consequences. Nuclear techniques, in particular neutron activation analysis, have proved appropriate for estimating the soil level in sugar cane, the washing process efficiency, and the wear of cases and moving parts. (author)

  9. Volumetric image processing: A new technique for three-dimensional imaging

    International Nuclear Information System (INIS)

    Fishman, E.K.; Drebin, B.; Magid, D.; St Ville, J.A.; Zerhouni, E.A.; Siegelman, S.S.; Ney, D.R.

    1986-01-01

    Volumetric three-dimensional (3D) image processing was performed on CT scans of 25 normal hips, and image quality and potential diagnostic applications were assessed. In contrast to surface-detection 3D techniques, volumetric processing preserves every pixel of transaxial CT data, replacing the gray scale with transparent ''gels'' and shading. Anatomically accurate 3D images can be rotated and manipulated in real time, including simulated tissue-layer ''peeling'' and mock surgery or disarticulation. This pilot study suggests that volumetric rendering is a major advance in the signal processing of medical image data, producing a high-quality, uniquely maneuverable image that is useful for fracture interpretation, soft-tissue analysis, surgical planning, and surgical rehearsal
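
    The contrast with binary surface extraction can be illustrated by a front-to-back compositing loop, in which every voxel along a ray contributes according to an assigned opacity instead of being kept or discarded by a surface test. The grayscale values and opacities below are invented.

```python
def composite_ray(samples):
    """Front-to-back compositing of (color, opacity) samples along a
    ray: accumulated color grows by the remaining transmittance times
    each sample's opacity-weighted color. Grayscale for simplicity."""
    color, transmittance = 0.0, 1.0
    for c, alpha in samples:
        color += transmittance * alpha * c
        transmittance *= (1.0 - alpha)
    return color, transmittance

# Semi-transparent soft tissue in front of denser "bone".
ray = [(0.2, 0.1), (0.3, 0.2), (0.9, 0.8)]
c, t = composite_ray(ray)
print(round(c, 4), round(t, 4))  # 0.5924 0.144
```

    Because the first two low-opacity samples still contribute, soft tissue remains visible in front of bone — the ''transparent gel'' effect described above.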

  10. Image Processing Techniques for Assessing Contractility in Isolated Adult Cardiac Myocytes

    Directory of Open Access Journals (Sweden)

    Carlos Bazan

    2009-01-01

    The physiologic application of the methodology is evaluated by assessing overall contraction in enzymatically dissociated adult rat cardiocytes. Our results demonstrate the effectiveness of the proposed approach in characterizing the true, two-dimensional “shortening” in the contraction process of adult cardiocytes. We compare the performance of the proposed method with that of a popular edge detection system in the literature. The proposed method not only provides a more comprehensive assessment of the myocyte contraction process but can also eliminate long-standing concerns and sources of error caused by myocyte rotation or translation during contraction. Furthermore, the versatility of the image processing techniques makes the method suitable for determining myocyte shortening in cells that bend or move during contraction. The proposed method can be used to evaluate changes in contractile behavior resulting from drug intervention, disease modeling, transgenic modification, or other common applications to mammalian cardiocytes.

  11. Full characterization of the photorefractive bright soliton formation process using a digital holographic technique

    International Nuclear Information System (INIS)

    Merola, F; Miccio, L; Paturzo, M; Ferraro, P; De Nicola, S

    2009-01-01

    An extensive characterization of the photorefractive bright soliton writing process in a lithium niobate crystal is presented. An interferometric approach based on a digital holographic technique has been used to reconstruct the complex wavefield at the exit face of the crystal. Temporal evolution of both intensity and phase profile of the writing beam has been analysed. The effective changes of the refractive index of the medium during the writing process and after the soliton formation are determined from the optical phase distribution. This method provides a reliable way to observe the process of soliton formation, whereas the determination of the intensity distribution of the output beam does not show clearly whether the soliton regime has been achieved or not. Furthermore, a detailed analysis of the soliton in a steady-state situation and under different writing conditions is presented and discussed
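
    The refractive-index change referred to above follows directly from the reconstructed optical phase via Δn = Δφ·λ/(2πL). The wavelength, crystal length, and phase value below are illustrative, not taken from the paper.

```python
import math

def delta_n(phase_shift_rad, wavelength_m, crystal_length_m):
    """Refractive-index change recovered from the reconstructed
    optical phase: delta_n = delta_phi * lambda / (2 * pi * L)."""
    return phase_shift_rad * wavelength_m / (2 * math.pi * crystal_length_m)

# 3*pi rad of accumulated phase at 532 nm over a 5 mm crystal (assumed).
dn = delta_n(3 * math.pi, 532e-9, 5e-3)
print(f"{dn:.3e}")  # 1.596e-04
```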

  12. Reduction of adherence problems in galvanising processes through data mining techniques

    International Nuclear Information System (INIS)

    Martinez de Pison, F. J.; Ordieres, J.; Pernia, A.; Alba, F.; Torre, V.

    2007-01-01

    This paper presents an example of the application of data mining techniques to obtain hidden knowledge from the historical data of a hot dip galvanizing process and to establish rules to improve quality in the final product and to reduce errors in the process. For this purpose, the tuning records of a hot dip galvanizing line where coils with adherence problems in the zinc coating had been identified were used as the starting point. From the database of the process, the classical data mining approach was applied to obtain and analyze a number of decision trees that classified two types of coils, i.e. those with the right adherence and those with irregular adherence. The variables and values that might have influenced the quality of the coating were extracted from these trees. Several rules that may be applied to reduce the number of faulty coils with adherence problems were also established. (Author) 24 refs
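
    The core step such a decision-tree learner repeats is choosing, for one process variable at a time, the threshold that minimizes weighted Gini impurity. The toy coil records below (bath temperature vs. adherence outcome) are invented.

```python
def gini(labels):
    """Gini impurity of a binary label list."""
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n
    return 2 * p * (1 - p)

def best_split(values, labels):
    """Exhaustively pick the threshold on one variable that minimizes
    weighted Gini impurity -- the basic splitting step behind the
    decision trees mined from the coil database."""
    best_t, best_score = None, float("inf")
    for t in sorted(set(values)):
        left = [l for v, l in zip(values, labels) if v <= t]
        right = [l for v, l in zip(values, labels) if v > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
        if score < best_score:
            best_t, best_score = t, score
    return best_t, best_score

# Toy records: zinc-bath temperature vs. adherence (1 = faulty coil).
temp = [450, 455, 460, 465, 470, 475]
faulty = [0, 0, 0, 1, 1, 1]
thr, score = best_split(temp, faulty)
print(thr, score)  # 460 0.0 -- a perfectly separating rule
```

    Each such split becomes a readable rule ("temperature above 460 implies adherence risk"), which is exactly the kind of knowledge the paper extracts from its trees.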

  13. FY 1998 annual summary report on photon measuring/processing techniques. Development of the techniques for high-efficiency production processes; 1998 nendo foton keisoku kako gijutsu seika hokokusho. Kokoritsu seisan process gijutsu kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    The objectives are set to develop the techniques for energy-efficient laser-aided processing; techniques for high-precision, real-time measurement to improve quality control for production processes and increase their efficiency; and the techniques for generating/controlling photon of high efficiency and quality as the laser beam sources therefor, in order to promote energy saving at and improve efficiency of production processes consuming large quantities of energy, e.g., welding, joining, surface treatment and production of fine particles. The R and D themes are microscopic processing technology: simulation technology for laser welding phenomena; microscopic processing technology: synthesis of technology for quantum dot functional structures; in-situ status measuring technology: fine particle elements and size measurement technology; high-power all-solid-state laser technology: efficient rod type LD-pumping laser modules and pumping chamber of a slab-type laser; tightly-focusing all-solid-state laser technology: improvement of E/O efficiency of laser diode, high-quality nonlinear crystal growth technology and fabrication technology for nonlinear crystal; and comprehensive investigation of photonics engineering: high-efficiency harmonic generation technology. (NEDO)

  14. Advanced gamma spectrum processing technique applied to the analysis of scattering spectra for determining material thickness

    International Nuclear Information System (INIS)

    Hoang Duc Tam; VNUHCM-University of Science, Ho Chi Minh City; Huynh Dinh Chuong; Tran Thien Thanh; Vo Hoang Nguyen; Hoang Thi Kieu Trang; Chau Van Tao

    2015-01-01

    In this work, an advanced gamma spectrum processing technique is applied to analyze experimental scattering spectra for determining the thickness of C45 heat-resistant steel plates. The single-scattering peak of the scattering spectra is exploited to measure the intensity of singly scattered photons. Based on these results, the thickness of the steel plates is determined with a maximum deviation between real and measured thickness of about 4%. Monte Carlo simulation using the MCNP5 code is also performed to cross-check the results, yielding a maximum deviation of 2%. These results strongly confirm the capability of this technique for analyzing gamma scattering spectra, which is a simple, effective and convenient method for determining material thickness. (author)
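
    One simple way to turn a single-scattering peak intensity into a thickness is to invert a saturation model I(d) = I_sat·(1 − e^(−μd)). Both constants below are assumptions standing in for calibration values, not parameters from the paper.

```python
import math

MU_EFF_CM = 0.9   # effective attenuation coefficient (assumed, 1/cm)
I_SAT = 1000.0    # saturation single-scatter intensity (assumed)

def thickness_cm(intensity):
    """Invert the saturation model I(d) = I_sat * (1 - exp(-mu*d))
    to recover plate thickness from the single-scattering peak
    intensity. In practice both constants come from calibration."""
    return -math.log(1.0 - intensity / I_SAT) / MU_EFF_CM

d = thickness_cm(593.43)
print(round(d, 2))  # about 1.0 cm for this assumed calibration
```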

  15. Nuclear techniques used in study of biological processes in Sinapis alba culture

    International Nuclear Information System (INIS)

    Giosanu, D.; Fleancu, M.

    2001-01-01

    The aim of the present paper is to study, using nuclear techniques, the influence of gamma radiation upon the germination, growth and respiration processes in Sinapis alba cultures, and the dependence of these phenomena on the gamma irradiation dose. Research was done on dry seeds of mustard (Sinapis alba). The doses of gamma irradiation were 20, 40, 60, 80 and 100 krad. The subsequent evolution of the irradiated samples was compared with that of unirradiated (control) samples. The irradiation was applied uniformly, in a single phase. Treatment of the dry mustard seeds with gamma radiation caused a reduction in the energy of germination: it was 57-73% in the gamma-treated batches versus 81% in the control batch. The faculty of germination likewise decreased from 92% in the control batch to 83% in the irradiated batches. The growth process (length of roots and hypocotyl) was also studied. For 100 krad of gamma irradiation, the rate of this process was lower than that of the control batch, both on the first and on the fourth day after irradiation. The inhibitory effect on the germination and growth processes of gamma-treated dry mustard seeds is attributed to modification of membrane permeability. The intensity of the respiration process in the irradiated lots was lower than in the control lot; this inhibition following gamma irradiation could be explained by the enzymatic activity of the mustard seeds. (authors)

  16. Process sensors characterization based on noise analysis technique and artificial intelligence

    International Nuclear Information System (INIS)

    Mesquita, Roberto N. de; Perillo, Sergio R.P.; Santos, Roberto C. dos

    2005-01-01

    The time response of the pressure and temperature sensors in the Reactor Protection System (RPS) is a requirement that must be satisfied in nuclear power plants; it is furthermore indicative of sensor degradation and remaining life. The nuclear power industry and others have been eager to implement smart sensor technologies and digital instrumentation concepts to reduce the manpower and effort currently spent on testing and calibration. Process parameter fluctuations during normal operation of a reactor are caused by random variations in neutron flux, heat transfer and other sources. The output sensor noise can be considered the response of the system to an input representing the statistical nature of the underlying process, which can be modeled using a time series model. Since the noise signal measurements are influenced by many factors, such as sensor location, extraneous noise interference, and randomness in temperature and pressure fluctuations, the quantitative estimate of the time response using autoregressive noise modeling is subject to error. This technique has been used as a means of sensor monitoring. In this work, a set of pressure sensors installed in an experimental loop adapted from a flow calibration setup is used to test and analyze signals in a new approach using artificial intelligence techniques. A set of measurements of dynamic signals under different experimental conditions is used to distinguish and identify underlying process sources. A methodology that uses Blind Separation of Sources with a neural network scheme is being developed to improve the reliability of time response estimates in noise analysis. (author)
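
    The autoregressive part of this noise analysis can be sketched with an AR(1) fit, whose coefficient maps to a first-order sensor time constant via τ = −Δt/ln(a). The synthetic data and parameter values below are assumptions for illustration.

```python
import math
import random

def ar1_time_constant(x, dt):
    """Least-squares AR(1) fit x[t] = a*x[t-1] + e; the sensor time
    constant follows as tau = -dt / ln(a). A bare-bones version of
    the autoregressive noise-analysis estimate described above."""
    num = sum(x[t] * x[t - 1] for t in range(1, len(x)))
    den = sum(x[t - 1] ** 2 for t in range(1, len(x)))
    return -dt / math.log(num / den)

# Synthesize noise from a first-order sensor: tau = 0.5 s, dt = 0.01 s.
random.seed(1)
dt, tau = 0.01, 0.5
a_true = math.exp(-dt / tau)
x = [0.0]
for _ in range(20000):
    x.append(a_true * x[-1] + random.gauss(0.0, 1.0))
print(round(ar1_time_constant(x, dt), 2))  # close to 0.5 s
```

    The scatter of this estimate under varying noise conditions is precisely the error the blind-source-separation preprocessing aims to reduce.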

  17. Process sensors characterization based on noise analysis technique and artificial intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Mesquita, Roberto N. de; Perillo, Sergio R.P.; Santos, Roberto C. dos [Instituto de Pesquisas Energeticas e Nucleares (IPEN), Sao Paulo, SP (Brazil)]. E-mail: rnavarro@ipen.br; sperillo@ipen.br; rcsantos@ipen.br

    2005-07-01

    The time response of the pressure and temperature sensors in the Reactor Protection System (RPS) is a requirement that must be satisfied in nuclear power plants; it is furthermore indicative of sensor degradation and remaining life. The nuclear power industry and others have been eager to implement smart sensor technologies and digital instrumentation concepts to reduce the manpower and effort currently spent on testing and calibration. Process parameter fluctuations during normal operation of a reactor are caused by random variations in neutron flux, heat transfer and other sources. The output sensor noise can be considered the response of the system to an input representing the statistical nature of the underlying process, which can be modeled using a time series model. Since the noise signal measurements are influenced by many factors, such as sensor location, extraneous noise interference, and randomness in temperature and pressure fluctuations, the quantitative estimate of the time response using autoregressive noise modeling is subject to error. This technique has been used as a means of sensor monitoring. In this work, a set of pressure sensors installed in an experimental loop adapted from a flow calibration setup is used to test and analyze signals in a new approach using artificial intelligence techniques. A set of measurements of dynamic signals under different experimental conditions is used to distinguish and identify underlying process sources. A methodology that uses Blind Separation of Sources with a neural network scheme is being developed to improve the reliability of time response estimates in noise analysis. (author)

  18. Material accountancy measurement techniques in dry-powdered processing of nuclear spent fuels

    International Nuclear Information System (INIS)

    Wolf, S. F.

    1999-01-01

    The paper addresses the development of inductively coupled plasma-mass spectrometry (ICPMS), thermal ionization-mass spectrometry (TIMS), alpha-spectrometry, and gamma spectrometry techniques for in-line analysis of highly irradiated (18 to 64 GWD/T) PWR spent fuels in a dry-powdered processing cycle. The dry-powdered technique for direct elemental and isotopic accountancy assay measurements was implemented without the need for separation of the plutonium, uranium and fission product elements in the bulk powdered process. The analyses allow the determination of fuel burn-up based on the isotopic composition of neodymium and/or cesium. An objective of the program is to develop the ICPMS method for direct fissile nuclear materials accountancy in the dry-powdered processing of spent fuel. The ICPMS measurement system may be applied to the KAERI DUPIC (direct use of spent PWR fuel in CANDU reactors) experiment, and in a near-real-time mode for international safeguards verification and non-proliferation policy concerns
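
    The burnup determination from a monitor isotope such as Nd-148 reduces to simple atom bookkeeping: fissions are the monitor atoms divided by the fission yield, and burnup is fissions over the initial heavy-metal inventory. The yield and atom counts below are rounded, illustrative values, not evaluated data.

```python
Y_ND148 = 0.0169  # approx. cumulative Nd-148 fission yield (illustrative)

def burnup_fima(n_nd148, n_heavy_metal_now):
    """Burnup in atom-% FIMA via the Nd-148 monitor method:
    fissions = N(Nd-148)/yield, divided by the initial heavy-metal
    inventory (present heavy metal + atoms fissioned)."""
    fissions = n_nd148 / Y_ND148
    initial_hm = n_heavy_metal_now + fissions
    return 100.0 * fissions / initial_hm

# Hypothetical atom counts from an ICPMS assay (arbitrary units).
print(round(burnup_fima(1.0, 1400.0), 2))  # about 4 atom-% FIMA
```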

  19. Comparison of signal processing techniques for micro-Doppler signature extraction with automotive radar systems

    Science.gov (United States)

    Rodriguez-Hervas, Berta; Maile, Michael; Flores, Benjamin C.

    2014-05-01

    In recent years, the automotive industry has experienced an evolution toward more powerful driver assistance systems that provide enhanced vehicle safety. These systems typically operate in the optical and microwave regions of the electromagnetic spectrum and have demonstrated high efficiency in collision and risk avoidance. Microwave radar systems are particularly relevant due to their operational robustness under adverse weather or illumination conditions. Our objective is to study different signal processing techniques suitable for the extraction of accurate micro-Doppler signatures of slow-moving objects in dense urban environments. Selection of the appropriate signal processing technique is crucial for the extraction of accurate micro-Doppler signatures that will lead to better results in a radar classifier system. For this purpose, we perform simulations of typical radar detection responses in common driving situations and conduct the analysis with several signal processing algorithms, including the short-time Fourier transform, the continuous wavelet transform, and kernel-based analysis methods. We take into account factors such as the relative movement between the host vehicle and the target, and the non-stationary nature of the target's movement. A comparison of results reveals that the short-time Fourier transform would be the best approach for detection and tracking purposes, while the continuous wavelet transform would be best suited for classification purposes.
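
    A bare-bones short-time Fourier transform (a direct DFT over Hann-windowed segments) shows the kind of time-frequency map such a comparison starts from; in practice numpy/scipy would do this job. The two-tone test signal crudely mimics a Doppler shift halfway through the record.

```python
import cmath
import math

def stft_mag(x, win_len, hop):
    """Magnitude STFT via a direct DFT on Hann-windowed segments.
    Returns one magnitude spectrum (bins 0..win_len//2) per frame."""
    win = [0.5 - 0.5 * math.cos(2 * math.pi * n / (win_len - 1))
           for n in range(win_len)]
    frames = []
    for start in range(0, len(x) - win_len + 1, hop):
        seg = [x[start + n] * win[n] for n in range(win_len)]
        spec = [abs(sum(seg[n] * cmath.exp(-2j * math.pi * k * n / win_len)
                        for n in range(win_len)))
                for k in range(win_len // 2 + 1)]
        frames.append(spec)
    return frames

# Tone whose frequency steps halfway through (crude Doppler shift).
fs = 64
x = [math.sin(2 * math.pi * 4 * n / fs) for n in range(64)]
x += [math.sin(2 * math.pi * 12 * n / fs) for n in range(64)]
frames = stft_mag(x, win_len=32, hop=32)
peaks = [spec.index(max(spec)) for spec in frames]
print(peaks)  # [2, 2, 6, 6]: the frequency step is visible per frame
```

    The fixed window trades time against frequency resolution, which is exactly the limitation that motivates comparing the STFT with wavelet and kernel-based alternatives.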

  20. Energy meshing techniques for processing ENDF/B-VI cross sections using the AMPX code system

    International Nuclear Information System (INIS)

    Dunn, M.E.; Greene, N.M.; Leal, L.C.

    1999-01-01

    Modern techniques for the establishment of criticality safety for fissile systems invariably require the use of neutronic transport codes with applicable cross-section data. Accurate cross-section data are essential for solving the Boltzmann Transport Equation for fissile systems. In the absence of applicable critical experimental data, the use of independent calculational methods is crucial for the establishment of subcritical limits. Moreover, there are various independent modern transport codes available to the criticality safety analyst (e.g., KENO V.a., MCNP, and MONK). In contrast, there is currently only one complete software package that processes data from the Version 6 format of the Evaluated Nuclear Data File (ENDF) to a format usable by criticality safety codes. To facilitate independent cross-section processing, Oak Ridge National Laboratory (ORNL) is upgrading the AMPX code system to enable independent processing of Version 6 formats using state-of-the-art procedures. The AMPX code system has been in continuous use at ORNL since the early 1970s and is the premier processor for providing multigroup cross sections for criticality safety analysis codes. Within the AMPX system, the module POLIDENT is used to access the resonance parameters in File 2 of an ENDF/B library, generate point cross-section data, and combine the cross sections with File 3 point data. At the heart of any point cross-section processing code is the generation of a suitable energy mesh for representing the data. The purpose of this work is to facilitate the AMPX upgrade through the development of a new and innovative energy meshing technique for processing point cross-section data
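
    One simple meshing criterion is to bisect each energy interval until linear interpolation reproduces the midpoint cross section within a relative tolerance; denser structure then automatically attracts more points. The 1/√E mock cross section below is illustrative, not an evaluated datum, and real processing codes use more refined criteria.

```python
import math

def refine_mesh(f, lo, hi, tol):
    """Recursively bisect an energy interval until the midpoint value
    is reproduced by linear interpolation of the endpoints to within
    a relative tolerance. Returns the sorted list of mesh energies."""
    mid = 0.5 * (lo + hi)
    interp = 0.5 * (f(lo) + f(hi))
    if abs(interp - f(mid)) <= tol * abs(f(mid)) or hi - lo < 1e-9:
        return [lo, hi]
    left = refine_mesh(f, lo, mid, tol)
    right = refine_mesh(f, mid, hi, tol)
    return left[:-1] + right  # drop the duplicated midpoint

# A mock 1/v-like cross section needs a dense mesh at low energy.
sigma = lambda e: 1.0 / math.sqrt(e)
mesh = refine_mesh(sigma, 0.01, 10.0, tol=0.001)
print(len(mesh), mesh[0], mesh[-1])
```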

  1. Security Transition Program Office (STPO), technology transfer of the STPO process, tools, and techniques

    Energy Technology Data Exchange (ETDEWEB)

    Hauth, J.T.; Forslund, C.R.J.; Underwood, J.A.

    1994-09-01

    In 1990, with the transition from a defense mission to environmental restoration, the U.S. Department of Energy's (DOE's) Hanford Site began a significant effort to diagnose, redesign, and implement new safeguards and security (SAS) processes. In 1992 the Security Transition Program Office (STPO) was formed to address the sweeping changes that were being identified. Comprised of SAS and other contractor staff with extensive experience and supported by staff experienced in organizational analysis and work process redesign, STPO undertook a series of tasks designed to make fundamental changes to SAS processes throughout the Hanford Site. The goal of STPO is to align the SAS work and organization with the new Site mission. This report describes the key strategy, tools, methods, and techniques used by STPO to change SAS processes at Hanford. A particular focus of this review is transferring STPO's experience to other DOE sites and federal agency efforts: that is, to extract, analyze, and provide a critical review of the approach, tools, and techniques used by STPO that will be useful to other DOE sites and national laboratories in transitioning from a defense production mode to environmental restoration and other missions. In particular, what lessons does STPO provide as a pilot study or model for implementing change in other transition activities throughout the DOE complex? More broadly, what theoretical and practical contributions do DOE transition efforts, such as STPO, provide to federal agency streamlining efforts and attempts to "reinvent" government enterprises in the public sector? The approach used by STPO should provide valuable information to those examining their own processes in light of new mission requirements.

  2. Investigating deformation processes in AM60 magnesium alloy using the acoustic emission technique

    International Nuclear Information System (INIS)

    Mathis, K.; Chmelik, F.; Janecek, M.; Hadzima, B.; Trojanova, Z.; Lukac, P.

    2006-01-01

    Microstructure changes in an AM60 magnesium alloy were monitored using the acoustic emission (AE) technique during tensile tests in the temperature range from 20 to 300 deg. C. The correlation of the AE signal with the deformation processes is discussed. It is shown, using transmission electron and light microscopy, that the character of the AE response is associated with various modes of mechanical twinning at lower temperatures, whereas at higher temperatures the influence of non-basal dislocations on the AE response must also be taken into account
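
    A classic scalar AE parameter is the ring-down count: the number of upward threshold crossings of the transducer signal. The decaying synthetic burst below loosely mimics a twinning event; the threshold and waveform are assumptions for illustration.

```python
import math

def ringdown_counts(signal, threshold):
    """Ring-down counts: how many times the signal crosses the
    detection threshold from below -- a cheap AE activity measure
    that rises sharply when twinning bursts occur."""
    count = 0
    for prev, cur in zip(signal, signal[1:]):
        if prev <= threshold < cur:
            count += 1
    return count

# A decaying oscillatory burst (synthetic samples, arbitrary units).
burst = [math.exp(-0.05 * n) * math.sin(0.5 * n) for n in range(60)]
print(ringdown_counts(burst, threshold=0.1))  # 4 lobes exceed threshold
```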

  3. Triaxial testing system for pressure core analysis using image processing technique

    Science.gov (United States)

    Yoneda, J.; Masui, A.; Tenma, N.; Nagao, J.

    2013-11-01

    In this study, a newly developed triaxial testing system for investigating the strength, deformation behavior, and/or permeability of gas hydrate-bearing sediments in the deep sea is described. Transport of the pressure core from the storage chamber to the interior of the sealing sleeve of a triaxial cell without depressurization was achieved. An image processing technique was used to capture the motion and local deformation of a specimen in a transparent acrylic triaxial pressure cell, and digital photographs were obtained at each strain level during the compression test. The material strength was successfully measured, and the failure mode was evaluated under high confining and pore water pressures.

  4. Recleaning of HEPA filters by reverse flow - evaluation of the underlying processes and the cleaning technique

    International Nuclear Information System (INIS)

    Leibold, H.; Leiber, T.; Doeffert, I.; Wilhelm, J.G.

    1993-08-01

    HEPA filter operation at high concentrations of fine dusts requires periodic recleaning of the filter units in their service locations. Owing to the low mechanical stress induced during the recleaning process, regeneration via low-pressure reverse flow is a very suitable technique. Recleanability of HEPA filters was attained for particle diameters >0.4 μm at air velocities up to 1 m/s, but filter clogging occurred for smaller particles, for which the recleaning forces are too weak.
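
    A common figure of merit for such regeneration is the fraction of the dust-induced pressure-drop rise that the reverse-flow pulse removes. The pressure values below are made up for illustration.

```python
def recleaning_efficiency(dp_clean, dp_loaded, dp_after):
    """Fraction of the dust-induced pressure-drop increase removed by
    the reverse-flow pulse:
    eta = (dp_loaded - dp_after) / (dp_loaded - dp_clean)."""
    return (dp_loaded - dp_after) / (dp_loaded - dp_clean)

# Pressure drops in Pa: new filter, dust-loaded, and after recleaning.
eta = recleaning_efficiency(250.0, 1000.0, 400.0)
print(eta)  # 0.8 of the added pressure drop recovered
```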

  5. Searching for the Unknowable: A Process of Detection — Abductive Research Generated by Projective Techniques

    Directory of Open Access Journals (Sweden)

    Miri Levin-Rozalis

    2004-06-01

    This article looks at the process of doing research 'from scratch.' The author began a project investigating children of Ethiopian origin living in Israel to see how those who had attended a kindergarten program years earlier differed from those who had not. However, the problem from the outset was that there might not be a difference to find. In this article, the author compares inductive, deductive, and abductive reasoning, and argues that abductive reasoning is the proper technique when nothing is known about the research subject at the outset.

  6. Identification and Evaluation of Human Errors in the Medication Process Using the Extended CREAM Technique

    Directory of Open Access Journals (Sweden)

    Iraj Mohammadfam

    2017-10-01

    Background The medication process is a powerful instrument for curing patients, and compliance with it plays an important role in treatment and the provision of care. Medication error can occur at any stage of this complicated process; avoiding it requires appropriate decision-making, cognition, and performance by hospital staff. Objectives The present study aimed at identifying and evaluating the nature and causes of human errors in the medication process in a hospital using the extended CREAM method. Methods This was a qualitative, cross-sectional study conducted in a hospital in Hamadan. First, the medication process was selected as a critical issue based on the opinions of experts, specialists, and experienced individuals in the nursing and medical departments. The process was then broken down into its steps and substeps using hierarchical task analysis (HTA) and evaluated with the extended CREAM technique to estimate the probability of human errors. Results Based on the findings of the basic CREAM method, the highest CFPt was in the step of administering medicine to patients (0.056). Moreover, the results revealed that the highest CFPt values were in the substeps of calculating the dose of medicine and determining the method of prescription, and identifying the patient (0.0796 and 0.0785, respectively). The lowest CFPt was related to transcribing the prescribed medicine from the file to the medicine worksheet (0.0106). Conclusions Considering the critical consequences of human errors in the medication process, holding pharmacological retraining classes, applying the principles of executing pharmaceutical orders, increasing medical personnel, reducing overtime work, organizing work shifts, and using error-reporting systems are of paramount importance.
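
    In CREAM-style quantification, a nominal cognitive failure probability for the failure type is scaled by multipliers for the common performance conditions (CPCs) of the task context. The probabilities and weights below are illustrative placeholders, not the published CREAM tables.

```python
# Nominal cognitive failure probabilities (CFP0) per generic failure
# type -- illustrative placeholders, not evaluated CREAM data.
CFP0 = {"observation": 0.003, "interpretation": 0.01,
        "planning": 0.01, "execution": 0.003}

def adjusted_cfp(failure_type, cpc_weights):
    """Extended-CREAM style adjustment: the nominal CFP is scaled by
    the product of the CPC weighting factors, capped at 1.0."""
    cfp = CFP0[failure_type]
    for w in cpc_weights:
        cfp *= w
    return min(cfp, 1.0)

# Dose-calculation substep under time pressure (w=2.0), adequate
# training (w=0.8), and poor procedures (w=2.0) -- assumed weights.
print(adjusted_cfp("execution", [2.0, 0.8, 2.0]))
```

    Ranking substeps by such adjusted probabilities is what singles out dose calculation and patient identification as the riskiest steps in the study above.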

  7. Do organizations adopt sophisticated capital budgeting practices to deal with uncertainty in the investment decision? : A research note

    NARCIS (Netherlands)

    Verbeeten, Frank H M

    This study examines the impact of uncertainty on the sophistication of capital budgeting practices. While the theoretical applications of sophisticated capital budgeting practices (defined as the use of real option reasoning and/or game theory decision rules) have been well documented, empirical

  8. Building the competitive intelligence knowledge: processes and activities in a corporate organisation

    OpenAIRE

    Sreenivasulu, V.

    1999-01-01

    This paper discusses the process of building and developing comprehensive tools, techniques, support systems, and better methods of harnessing the competitive intelligence knowledge processes. The author stresses the need for building sophisticated methodological competitive intelligence knowledge acquisition, systematic collection of competitive intelligence knowledge from various sources for critical analysis, process, organization, synthesis, assessment, screening, filtering and interpreta...

  9. "SOCRATICS" AS ADDRESSEES OF ISOCRATES’ EPIDEICTIC SPEECHES (Against the Sophists, Encomium of Helen, Busiris)

    Directory of Open Access Journals (Sweden)

    Anna Usacheva

    2012-06-01

    Full Text Available This article analyses three epideictic orations of Isocrates which are in themselves precious testimony to the quality of intellectual life at the close of the fourth century before Christ. To this period also belong the Socratics, who are generally seen as an important link between Socrates and Plato. The author proposes a more productive approach to the study of Antisthenes, Euclid of Megara and the other so-called Socratics, revealing them not as independent thinkers but rather as adherents of the sophistic school and as teachers, thereby including them among those who took part in the educative activity of their time.

  10. Low Level RF Including a Sophisticated Phase Control System for CTF3

    CERN Document Server

    Mourier, J; Nonglaton, J M; Syratchev, I V; Tanner, L

    2004-01-01

    CTF3 (CLIC Test Facility 3), currently under construction at CERN, is a test facility designed to demonstrate the key feasibility issues of the CLIC (Compact LInear Collider) two-beam scheme. When completed, this facility will consist of a 150 MeV linac followed by two rings for bunch-interleaving, and a test stand where 30 GHz power will be generated. In this paper, the work that has been carried out on the linac's low power RF system is described. This includes, in particular, a sophisticated phase control system for the RF pulse compressor to produce a flat-top rectangular pulse over 1.4 µs.

  11. Feedback correction of injection errors using digital signal-processing techniques

    Directory of Open Access Journals (Sweden)

    N. S. Sereno

    2007-01-01

    Full Text Available Efficient transfer of electron beams from one accelerator to another is important for 3rd-generation light sources that operate using top-up. In top-up mode, a constant amount of charge is injected at regular intervals into the storage ring to replenish beam lost primarily due to Touschek scattering. Top-up therefore requires that the complex of injector accelerators that fill the storage ring transport beam with a minimum amount of loss. Injection can be a source of significant beam loss if not carefully controlled. In this note we describe a method of processing injection transient signals produced by beam-position monitors and using the processed data in feedback. Feedback control using the technique described here has been incorporated in the Advanced Photon Source (APS booster synchrotron to correct injection transients.
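    The feedback scheme described above can be sketched as a simple proportional loop: the BPM injection transient is reduced to an oscillation amplitude, and a steering corrector is trimmed until that amplitude vanishes. The plant response, gain, and names below are illustrative assumptions, not the APS implementation.

```python
# Hedged sketch (not the APS system): a proportional feedback loop that
# drives the measured injection oscillation amplitude to zero by trimming
# a hypothetical injection steering corrector.

def injection_offset(kick_error):
    """Toy 'accelerator': the BPM-measured transient amplitude is assumed
    proportional to the residual injection kick error."""
    return 0.8 * kick_error  # mm per mrad, assumed linear response

def feedback_correct(initial_kick_error, gain=0.5, iterations=20):
    kick = initial_kick_error
    history = []
    for _ in range(iterations):
        measured = injection_offset(kick)  # process BPM transient -> amplitude
        kick -= gain * measured            # proportional correction
        history.append(abs(measured))
    return kick, history

residual_kick, history = feedback_correct(1.0)
```

    With a stable loop gain the residual kick error shrinks geometrically; in practice the gain must be chosen against the measured machine response.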

  12. Fuzzy logic and image processing techniques for the interpretation of seismic data

    International Nuclear Information System (INIS)

    Orozco-del-Castillo, M G; Ortiz-Alemán, C; Rodríguez-Castellanos, A; Urrutia-Fucugauchi, J

    2011-01-01

    Since interpretation of seismic data is usually a tedious and repetitive task, the ability to do so automatically or semi-automatically has become an important objective of recent research. We believe that the vagueness and uncertainty in the interpretation process makes fuzzy logic an appropriate tool to deal with seismic data. In this work we developed a semi-automated fuzzy inference system to detect the internal architecture of a mass transport complex (MTC) in seismic images. We propose that the observed characteristics of a MTC can be expressed as fuzzy if-then rules consisting of linguistic values associated with fuzzy membership functions. The constructions of the fuzzy inference system and various image processing techniques are presented. We conclude that this is a well-suited problem for fuzzy logic since the application of the proposed methodology yields a semi-automatically interpreted MTC which closely resembles the MTC from expert manual interpretation

  13. Recent developments at JPL in the application of digital image processing techniques to astronomical images

    Science.gov (United States)

    Lorre, J. J.; Lynn, D. J.; Benton, W. D.

    1976-01-01

    Several techniques of a digital image-processing nature are illustrated which have proved useful in visual analysis of astronomical pictorial data. Processed digital scans of photographic plates of Stephan's Quintet and NGC 4151 are used as examples to show how faint nebulosity is enhanced by high-pass filtering, how foreground stars are suppressed by linear interpolation, and how relative color differences between two images recorded on plates with different spectral sensitivities can be revealed by generating ratio images. Analyses are outlined which are intended to compensate partially for the blurring effects of the atmosphere on images of Stephan's Quintet and to obtain more detailed information about Saturn's ring structure from low- and high-resolution scans of the planet and its ring system. The employment of a correlation picture to determine the tilt angle of an average spectral line in a low-quality spectrum is demonstrated for a section of the spectrum of Uranus.
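    Two of the operations mentioned, high-pass filtering to enhance faint nebulosity and ratio images to reveal relative color differences, can be sketched in a few lines (a toy pure-Python version; real pipelines would use optimized array libraries and more careful border and calibration handling):

```python
def box_blur(img, r=1):
    """Local mean over a (2r+1)x(2r+1) window; borders are clamped."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[min(max(y + dy, 0), h - 1)][min(max(x + dx, 0), w - 1)]
                    for dy in range(-r, r + 1) for dx in range(-r, r + 1)]
            out[y][x] = sum(vals) / len(vals)
    return out

def high_pass(img):
    """Enhance faint small-scale structure: original minus local mean."""
    low = box_blur(img)
    return [[p - l for p, l in zip(prow, lrow)] for prow, lrow in zip(img, low)]

def ratio_image(img_a, img_b, eps=1e-6):
    """Pixel-by-pixel ratio of two spectral bands, revealing color differences."""
    return [[a / (b + eps) for a, b in zip(ra, rb)] for ra, rb in zip(img_a, img_b)]
```

    A flat field maps to zero under the high-pass filter, while compact features stand out; identical bands give a ratio image near one everywhere.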

  14. Feasibility of Johnson Noise Thermometry based on Digital Signal Processing Techniques

    International Nuclear Information System (INIS)

    Hwang, In Koo; Kim, Yang Mo

    2014-01-01

    This paper presents an implementation strategy for noise thermometry based on a digital signal processing technique and demonstrates its feasibility. A key factor in its development is how to extract the small thermal noise signal from other noises, for example, random noise from amplifiers and continuous electromagnetic interference from the environment. The proposed system consists of two identical amplifiers and uses a cross-correlation function to cancel the random noise of the amplifiers. The external interference noises are then eliminated by discriminating the difference in peaks between the thermal signal and the external noise. The gain of the amplifiers is estimated by injecting a known pilot signal. The experimental simulation results of the signal processing methods have demonstrated that the proposed approach is effective in eliminating external noise signals and performing gain correction for the development of the thermometry.
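    The core idea, that multiplying the outputs of two identical channels and averaging cancels their independent amplifier noise while preserving the common thermal noise power, can be illustrated with a short simulation (all noise levels and record counts below are illustrative, not the paper's values):

```python
import random

random.seed(42)

def measure_thermal_power(n_records=400, n_samples=256,
                          vrms_thermal=1.0, vrms_amp=3.0):
    """Two identical channels see the SAME thermal noise plus INDEPENDENT
    amplifier noise; averaging their sample-by-sample product leaves only
    the common (thermal) power, here vrms_thermal**2."""
    acc = 0.0
    for _ in range(n_records):
        for _ in range(n_samples):
            v_th = random.gauss(0.0, vrms_thermal)    # common thermal noise
            ch1 = v_th + random.gauss(0.0, vrms_amp)  # amplifier 1 noise
            ch2 = v_th + random.gauss(0.0, vrms_amp)  # amplifier 2 noise
            acc += ch1 * ch2
    return acc / (n_records * n_samples)

est = measure_thermal_power()
```

    Even with amplifier noise nine times stronger in power than the thermal signal, the cross-correlation average converges to the thermal power as more records are accumulated.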

  15. Acoustic Emission Signal Processing Technique to Characterize Reactor In-Pile Phenomena

    Energy Technology Data Exchange (ETDEWEB)

    Vivek Agarwal; Magdy Samy Tawfik; James A Smith

    2014-07-01

    Existing and developing advanced sensor technologies and instrumentation will allow non-intrusive in-pile measurement of temperature, extension, and fission gases when coupled with advanced signal processing algorithms. The measured sensor signals transmitted from inside to outside the containment structure are corrupted by noise and attenuated, reducing the signal strength and signal-to-noise ratio. Identification and extraction of the actual signal (representative of an in-pile phenomenon) is a challenging and complicated process. In this paper, an empirical mode decomposition technique is proposed to reconstruct the actual sensor signal by partially combining intrinsic mode functions. The reconstructed signal corresponds to phenomena and/or failure modes occurring inside the reactor. In addition, it allows accurate non-intrusive monitoring and trending of in-pile phenomena.
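    The partial recombination of intrinsic mode functions can be illustrated as follows; here the IMFs are synthetic stand-ins for the output of an actual EMD sifting routine, which the paper presupposes:

```python
import math

# Suppose an EMD routine has already split the measured signal into IMFs,
# ordered fast-to-slow (these "IMFs" are synthetic stand-ins).
t = [i / 200.0 for i in range(200)]
imf_noise  = [0.3 * math.sin(2 * math.pi * 40 * x) for x in t]  # fast, noise-like
imf_signal = [math.sin(2 * math.pi * 3 * x) for x in t]         # in-pile phenomenon
residue    = [0.2 * x for x in t]                               # slow drift

imfs = [imf_noise, imf_signal, residue]

def partial_reconstruction(imfs, keep):
    """Sum only the selected IMFs (indices in `keep`)."""
    return [sum(imfs[k][i] for k in keep) for i in range(len(imfs[0]))]

# Drop IMF 0 (high-frequency corruption), keep the physical modes.
clean = partial_reconstruction(imfs, keep=[1, 2])
```

    Choosing which IMFs to keep is the substantive step in practice; the paper's contribution is identifying the modes that correspond to real in-pile phenomena.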

  16. Dorsolateral prefrontal cortex, working memory and episodic memory processes: insight through transcranial magnetic stimulation techniques

    Institute of Scientific and Technical Information of China (English)

    Michela Balconi

    2013-01-01

    The ability to recall and recognize facts we experienced in the past is based on a complex mechanism in which several cerebral regions are implicated. Neuroimaging and lesion studies agree in identifying the frontal lobe as a crucial structure for memory processes, in particular for working memory and episodic memory and their relationships. Furthermore, with the introduction of transcranial magnetic stimulation (TMS), a new way was proposed to investigate the relationships between brain correlates, memory functions and behavior. The aim of this review is to present the main findings that have emerged from experiments which used the TMS technique for memory analysis. They mainly focused on the role of the dorsolateral prefrontal cortex in memory processes. Furthermore, we present state-of-the-art evidence supporting a possible use of TMS in the clinic, focusing specifically on the treatment of memory deficits in depression and anxiety disorders.

  17. Image processing techniques for thermal, x-rays and nuclear radiations

    International Nuclear Information System (INIS)

    Chadda, V.K.

    1998-01-01

    The paper describes image acquisition techniques for the non-visible range of the electromagnetic spectrum, especially thermal, x-ray and nuclear radiation. Thermal imaging systems are valuable tools used for applications ranging from PCB inspection, hot-spot studies, fire identification and satellite imaging to defense applications. Penetrating radiations like x-rays and gamma rays are used in NDT, baggage inspection, CAT scans, cardiology, radiography, nuclear medicine, etc. Neutron radiography complements conventional x-ray and gamma radiography. For these applications, image processing and computed tomography are employed for 2-D and 3-D image interpretation, respectively. The paper also covers the main features of image processing systems for quantitative evaluation of gray-level and binary images. (author)

  18. Automatic detection of health changes using statistical process control techniques on measured transfer times of elderly.

    Science.gov (United States)

    Baldewijns, Greet; Luca, Stijn; Nagels, William; Vanrumste, Bart; Croonenborghs, Tom

    2015-01-01

    It has been shown that gait speed and transfer times are good measures of functional ability in the elderly. However, data currently acquired by systems that measure either gait speed or transfer times in the homes of elderly people require manual review by healthcare workers. This reviewing process is time-consuming. To alleviate this burden, this paper proposes the use of statistical process control (SPC) methods to automatically detect both positive and negative changes in transfer times. Three SPC techniques, tabular CUSUM, standardized CUSUM and EWMA, known for their ability to detect small shifts in the data, are evaluated on simulated transfer times. This analysis shows that EWMA is the best-suited method, with a detection accuracy of 82% and an average detection time of 9.64 days.
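    An EWMA chart of the kind evaluated in the paper can be sketched as follows (the smoothing constant lam = 0.2 and limit width L = 3 are common textbook defaults, not necessarily the paper's settings, and the transfer-time series is simulated):

```python
def ewma_chart(series, target, sigma, lam=0.2, L=3.0):
    """EWMA control chart: returns (smoothed values, alarm indices)."""
    z = target
    zs, alarms = [], []
    var_factor = lam / (2.0 - lam)
    for i, x in enumerate(series):
        z = lam * x + (1.0 - lam) * z
        zs.append(z)
        # Control limits widen toward their asymptote via (1-(1-lam)^(2(i+1)))
        width = L * sigma * (var_factor * (1 - (1 - lam) ** (2 * (i + 1)))) ** 0.5
        if abs(z - target) > width:
            alarms.append(i)
    return zs, alarms

# Simulated transfer times: 20 days around 10 s, then a sustained +1.5 s shift
times = [10.0, 9.8, 10.2, 10.1, 9.9, 10.0, 10.2, 9.7, 10.1, 10.0,
         9.9, 10.3, 9.8, 10.0, 10.1, 9.9, 10.2, 10.0, 9.8, 10.1] + [11.5] * 10
zs, alarms = ewma_chart(times, target=10.0, sigma=0.2, lam=0.2, L=3.0)
```

    The smoothing makes small sustained shifts visible quickly while day-to-day variation stays inside the control limits, which is exactly the property that favored EWMA in the paper's comparison.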

  19. Brazing and diffusion bonding processes as available repair techniques for gas turbine blades and nozzles

    International Nuclear Information System (INIS)

    Mazur, Z.

    1997-01-01

    Conventional welding methods are not suitable for repairing heavily damaged gas turbine blades and nozzles, whose deterioration includes thermal fatigue and craze cracks, corrosion, erosion and foreign object damage extending over large areas. Because extensive heat input and couponing are required, welding can cause severe distortion of the parts and cracks in the heat-affected zone, and can make repair costs high. For these cases, the available repair methods for gas turbine blades and nozzles, including brazing and diffusion bonding techniques, are presented. A detailed analysis of the brazing and diffusion bonding processes applied to gas turbine blade repair, covering all the elements that influence obtaining a sound joint, is carried out. Depending on the kind of blade and nozzle damage or deterioration registered, the applicability of the different brazing and diffusion bonding methods is presented. (Author) 65 refs

  20. Processing of semen by density gradient centrifugation selects spermatozoa with longer telomeres for assisted reproduction techniques.

    Science.gov (United States)

    Yang, Qingling; Zhang, Nan; Zhao, Feifei; Zhao, Wanli; Dai, Shanjun; Liu, Jinhao; Bukhari, Ihtisham; Xin, Hang; Niu, Wenbing; Sun, Yingpu

    2015-07-01

    The ends of eukaryotic chromosomes contain specialized chromatin structures called telomeres, the length of which plays a key role in early human embryonic development. Although the effect of sperm preparation techniques on major sperm characteristics, such as concentration, motility and morphology have been previously documented, the possible status of telomere length and its relation with sperm preparation techniques is not well-known for humans. The aim of this study was to investigate the role of density gradient centrifugation in the selection of spermatozoa with longer telomeres for use in assisted reproduction techniques in 105 samples before and after sperm processing. After density gradient centrifugation, the average telomere length of the sperm was significantly longer (6.51 ± 2.54 versus 5.16 ± 2.29, P average motile sperm rate was significantly higher (77.9 ± 11.8 versus 44.6 ± 11.2, P average DNA fragmentation rate was significantly lower (11.1 ± 5.9 versus 25.9 ± 12.9, P sperm count (rs = 0.58; P sperm with longer telomeres. Copyright © 2015 Reproductive Healthcare Ltd. Published by Elsevier Ltd. All rights reserved.

  1. Application of integrated logistic techniques to operation, maintenance and re engineering processes in Nuclear Power plants

    International Nuclear Information System (INIS)

    Santiago Diez, P.

    1997-01-01

    This paper addresses the advisability of adapting and applying Integrated Logistic Support management and engineering techniques to nuclear power plants instead of using more traditional maintenance management methods. It establishes a historical framework showing the origins of integrated approaches based on traditional logistic support concepts, their phases, and the real results obtained in the aeronautic world where they originated. It reviews the application of integrated management philosophy and of logistic support and engineering analysis techniques regarding Availability, Reliability and Maintainability (ARM), and shows their interdependencies in different phases of the system's life (design, development and operation). It describes how these techniques are applied to nuclear power plant operation, their impact on plant availability, and the optimisation of maintenance and replacement plans. The paper analyses the need for data (type and volume) which will have to be collected, and the different tools to manage such data. It examines the different CALS tools developed by EA for engineering and for logistic management, and explains the possibility of using these tools for process and data operations over the Internet. It also focuses on some simple examples of possible applications and how they would be used in the framework of Integrated Logistic Support (ILS). (Author)

  2. Functional imaging of the pancreas. Image processing techniques and clinical evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Nakanishi, Fumiko

    1984-02-01

    An image processing technique for functional imaging of the pancreas was developed and is here reported. In this paper, the clinical efficacy of the technique for detecting pancreatic abnormality is evaluated in comparison with conventional pancreatic scintigraphy and CT. For quantitative evaluation, the functional rate, i.e. the rate of normal functioning pancreatic area, was calculated from the functional image and subtraction image. Two hundred and ninety-five cases were studied using this technique. Conventional imaging had a sensitivity of 65% and a specificity of 78%, while the use of functional imaging improved sensitivity to 88% and specificity to 88%. The mean functional rate in patients with pancreatic disease was significantly lower (33.3 ± 24.5 in patients with chronic pancreatitis, 28.1 ± 26.9 in patients with acute pancreatitis, 43.4 ± 22.3 in patients with diabetes mellitus, 20.4 ± 23.4 in patients with pancreatic cancer) than the mean functional rate in cases without pancreatic disease (86.4 ± 14.2). It is suggested that the functional image of the pancreas, reflecting pancreatic exocrine function, and the functional rate are useful indicators of pancreatic exocrine function.

  3. A New Screening Methodology for Improved Oil Recovery Processes Using Soft-Computing Techniques

    Science.gov (United States)

    Parada, Claudia; Ertekin, Turgay

    2010-05-01

    The first stage of production of any oil reservoir involves oil displacement by natural drive mechanisms such as solution gas drive, gas cap drive and gravity drainage. Typically, improved oil recovery (IOR) methods are applied to oil reservoirs that have been depleted naturally. In more recent years, IOR techniques are applied to reservoirs even before their natural energy drive is exhausted by primary depletion. Descriptive screening criteria for IOR methods are used to select the appropriate recovery technique according to the fluid and rock properties. This methodology helps in assessing the most suitable recovery process for field deployment of a candidate reservoir. However, the already published screening guidelines neither provide information about the expected reservoir performance nor suggest a set of project design parameters, which can be used towards the optimization of the process. In this study, artificial neural networks (ANN) are used to build a high-performance neuro-simulation tool for screening different improved oil recovery techniques: miscible injection (CO2 and N2), waterflooding and steam injection processes. The simulation tool consists of proxy models that implement a multilayer cascade feedforward back propagation network algorithm. The tool is intended to narrow the ranges of possible scenarios to be modeled using conventional simulation, reducing the extensive time and energy spent in dynamic reservoir modeling. A commercial reservoir simulator is used to generate the data to train and validate the artificial neural networks. The proxy models are built considering four different well patterns with different well operating conditions as the field design parameters. Different expert systems are developed for each well pattern. The screening networks predict oil production rate and cumulative oil production profiles for a given set of rock and fluid properties, and design parameters. The results of this study show that the networks are
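    A drastically simplified stand-in for such a proxy model, a one-hidden-layer feedforward network trained by gradient descent on a toy rock-property-to-recovery mapping, is sketched below (the study used multilayer cascade feedforward networks trained on commercial-simulator output; every name and number here is illustrative):

```python
import math, random

random.seed(0)

def init_net(n_in, n_hidden):
    w1 = [[random.uniform(-0.5, 0.5) for _ in range(n_in)] for _ in range(n_hidden)]
    b1 = [0.0] * n_hidden
    w2 = [random.uniform(-0.5, 0.5) for _ in range(n_hidden)]
    return w1, b1, w2, 0.0

def forward(net, x):
    w1, b1, w2, b2 = net
    h = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(w1, b1)]
    return h, sum(w * hi for w, hi in zip(w2, h)) + b2

def mse(net, data):
    return sum((forward(net, x)[1] - t) ** 2 for x, t in data) / len(data)

def train(net, data, lr=0.05, epochs=500):
    w1, b1, w2, b2 = net
    for _ in range(epochs):
        for x, t in data:
            h, y = forward((w1, b1, w2, b2), x)
            err = y - t
            for j in range(len(w2)):
                gh = err * w2[j] * (1.0 - h[j] ** 2)  # backprop through tanh
                for i in range(len(x)):
                    w1[j][i] -= lr * gh * x[i]
                b1[j] -= lr * gh
                w2[j] -= lr * err * h[j]
            b2 -= lr * err
    return w1, b1, w2, b2

# Toy "screening" data: (porosity, permeability proxy) -> recovery factor.
data = [((p, k), 0.5 * p + 0.3 * k)
        for p in (0.1, 0.3, 0.5, 0.7) for k in (0.2, 0.4, 0.6)]
net = init_net(2, 4)
loss_before = mse(net, data)
net = train(net, data)
loss_after = mse(net, data)
```

    Once trained on simulator runs, such a proxy answers "what recovery would this reservoir and design give?" in microseconds, which is what makes screening thousands of scenarios feasible.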

  4. Automatic DNA Diagnosis for 1D Gel Electrophoresis Images using Bio-image Processing Technique.

    Science.gov (United States)

    Intarapanich, Apichart; Kaewkamnerd, Saowaluck; Shaw, Philip J; Ukosakit, Kittipat; Tragoonrung, Somvong; Tongsima, Sissades

    2015-01-01

    DNA gel electrophoresis is a molecular biology technique for separating different sizes of DNA fragments. Applications of DNA gel electrophoresis include DNA fingerprinting (genetic diagnosis), size estimation of DNA, and DNA separation for Southern blotting. Accurate interpretation of DNA banding patterns from electrophoretic images can be laborious and error prone when a large number of bands are interrogated manually. Although many bio-imaging techniques have been proposed, none of them can fully automate the typing of DNA owing to the complexities of migration patterns typically obtained. We developed an image-processing tool that automatically calls genotypes from DNA gel electrophoresis images. The image processing workflow comprises three main steps: 1) lane segmentation, 2) extraction of DNA bands and 3) band genotyping classification. The tool was originally intended to facilitate large-scale genotyping analysis of sugarcane cultivars. We tested the proposed tool on 10 gel images (433 cultivars) obtained from polyacrylamide gel electrophoresis (PAGE) of PCR amplicons for detecting intron length polymorphisms (ILP) on one locus of the sugarcanes. These gel images demonstrated many challenges in automated lane/band segmentation in image processing including lane distortion, band deformity, high degree of noise in the background, and bands that are very close together (doublets). Using the proposed bio-imaging workflow, lanes and DNA bands contained within are properly segmented, even for adjacent bands with aberrant migration that cannot be separated by conventional techniques. The software, called GELect, automatically performs genotype calling on each lane by comparing with an all-banding reference, which was created by clustering the existing bands into the non-redundant set of reference bands. The automated genotype calling results were verified by independent manual typing by molecular biologists. 
This work presents an automated genotyping tool from DNA
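    The first step of the workflow, lane segmentation, can be sketched as a column-intensity-profile split (a toy pure-Python version; GELect's actual segmentation handles lane distortion and background noise far more robustly):

```python
def lane_boundaries(image, threshold_frac=0.5):
    """Segment lanes in a gel image (2D grayscale list, dark bands on a
    light background) by summing darkness down each column and splitting
    at the low-darkness valleys between lanes."""
    h, w = len(image), len(image[0])
    profile = [sum(255 - image[y][x] for y in range(h)) for x in range(w)]
    cut = max(profile) * threshold_frac
    lanes, start = [], None
    for x, v in enumerate(profile):
        if v >= cut and start is None:
            start = x                      # entering a lane
        elif v < cut and start is not None:
            lanes.append((start, x - 1))   # leaving a lane
            start = None
    if start is not None:
        lanes.append((start, w - 1))
    return lanes
```

    Band extraction then repeats the same profile idea within each lane, along rows instead of columns.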

  5. Automatic DNA Diagnosis for 1D Gel Electrophoresis Images using Bio-image Processing Technique

    Science.gov (United States)

    2015-01-01

    Background DNA gel electrophoresis is a molecular biology technique for separating different sizes of DNA fragments. Applications of DNA gel electrophoresis include DNA fingerprinting (genetic diagnosis), size estimation of DNA, and DNA separation for Southern blotting. Accurate interpretation of DNA banding patterns from electrophoretic images can be laborious and error prone when a large number of bands are interrogated manually. Although many bio-imaging techniques have been proposed, none of them can fully automate the typing of DNA owing to the complexities of migration patterns typically obtained. Results We developed an image-processing tool that automatically calls genotypes from DNA gel electrophoresis images. The image processing workflow comprises three main steps: 1) lane segmentation, 2) extraction of DNA bands and 3) band genotyping classification. The tool was originally intended to facilitate large-scale genotyping analysis of sugarcane cultivars. We tested the proposed tool on 10 gel images (433 cultivars) obtained from polyacrylamide gel electrophoresis (PAGE) of PCR amplicons for detecting intron length polymorphisms (ILP) on one locus of the sugarcanes. These gel images demonstrated many challenges in automated lane/band segmentation in image processing including lane distortion, band deformity, high degree of noise in the background, and bands that are very close together (doublets). Using the proposed bio-imaging workflow, lanes and DNA bands contained within are properly segmented, even for adjacent bands with aberrant migration that cannot be separated by conventional techniques. The software, called GELect, automatically performs genotype calling on each lane by comparing with an all-banding reference, which was created by clustering the existing bands into the non-redundant set of reference bands. The automated genotype calling results were verified by independent manual typing by molecular biologists. Conclusions This work presents an

  6. Signal processing techniques for damage detection with piezoelectric wafer active sensors and embedded ultrasonic structural radar

    Science.gov (United States)

    Yu, Lingyu; Bao, Jingjing; Giurgiutiu, Victor

    2004-07-01

    The embedded ultrasonic structural radar (EUSR) algorithm is developed for using a piezoelectric wafer active sensor (PWAS) array to detect defects within a large area of a thin-plate specimen. Signal processing techniques are used to extract the time of flight of the wave packets, and thereby to determine the location of the defects with the EUSR algorithm. In our research, the transient tone-burst wave propagation signals are generated and collected by the embedded PWAS. Then, with signal processing, the frequency contents of the signals and the time of flight of individual frequencies are determined. This paper starts with an introduction to the embedded ultrasonic structural radar algorithm. We then describe the signal processing methods used to extract the time of flight of the wave packets: wavelet denoising, cross correlation, and the Hilbert transform. Though the hardware can provide an averaging function to eliminate noise arising during signal collection, wavelet denoising is included to ensure better signal quality for applications in severe real-world environments. For better recognition of the time of flight, the cross correlation method is used. The Hilbert transform is applied to the signals after cross correlation in order to extract their envelope. Signal processing and EUSR are both implemented in a user-friendly graphical interface program developed in LabVIEW. We conclude with a description of our vision for applying EUSR signal analysis to structural health monitoring and embedded nondestructive evaluation. To this end, we envisage an automatic damage detection application utilizing embedded PWAS, EUSR, and advanced signal processing.
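    The cross-correlation step, locating the wave packet by the lag that maximizes correlation with the transmitted tone burst, can be sketched as follows (pure Python on a noise-free toy signal; the actual EUSR processing adds wavelet denoising and Hilbert-envelope extraction):

```python
import math

def cross_correlation_lag(reference, received):
    """Return the lag (in samples) that maximizes the cross-correlation --
    an estimate of the wave packet's time of flight."""
    n = len(received) - len(reference)
    best_lag, best_score = 0, float("-inf")
    for lag in range(n + 1):
        score = sum(r * s for r, s in
                    zip(reference, received[lag:lag + len(reference)]))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# Tone burst with a Gaussian envelope, echoed back after a 37-sample delay.
burst = [math.sin(2 * math.pi * 0.1 * t) * math.exp(-((t - 25) / 10.0) ** 2)
         for t in range(50)]
signal = [0.0] * 37 + burst + [0.0] * 60
lag = cross_correlation_lag(burst, signal)
```

    With the wave speed known, the recovered lag converts directly into a defect range for each beam-steering angle of the array.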

  7. Managing complex processing of medical image sequences by program supervision techniques

    Science.gov (United States)

    Crubezy, Monica; Aubry, Florent; Moisan, Sabine; Chameroy, Virginie; Thonnat, Monique; Di Paola, Robert

    1997-05-01

    Our objective is to offer clinicians wider access to evolving medical image processing (MIP) techniques, crucial to improve assessment and quantification of physiological processes, but difficult to handle for non-specialists in MIP. Based on artificial intelligence techniques, our approach consists in the development of a knowledge-based program supervision system, automating the management of MIP libraries. It comprises a library of programs, a knowledge base capturing the expertise about programs and data, and a supervision engine. It selects, organizes and executes the appropriate MIP programs given a goal to achieve and a data set, with dynamic feedback based on the results obtained. It also advises users in the development of new procedures chaining MIP programs. We have experimented with the approach in an application of factor analysis of medical image sequences as a means of predicting the response of osteosarcoma to chemotherapy, with both MRI and NM dynamic image sequences. As a result, our program supervision system frees clinical end-users from performing tasks outside their competence, permitting them to concentrate on clinical issues. Therefore our approach enables a better exploitation of the possibilities offered by MIP and higher quality results, both in terms of robustness and reliability.

  8. Developing the technique of image processing for the study of bubble dynamics in subcooled flow boiling

    International Nuclear Information System (INIS)

    Donevski, Bozin; Saga, Tetsuo; Kobayashi, Toshio; Segawa, Shigeki

    1998-01-01

    This study presents the development of an image processing technique for studying the dynamic behavior of vapor bubbles in a two-phase bubbly flow. It focuses on the quantitative assessment of basic parameters such as local bubble size and size distribution in the void fraction range 0.03 < α < 0.07. The image processing methodology is based upon computer evaluation of high-speed motion pictures obtained from the flow field in the region of underdeveloped subcooled flow boiling for a variety of experimental conditions. This technique has the advantage of providing computer measurements and extracting the bubbles of the two-phase bubbly flow. The method appears promising for determining the governing mechanisms in subcooled flow boiling, particularly near the point of net vapor generation. The data collected by the image analysis software can be incorporated into the new models and computer codes currently under development which aim to account for the effects of vapor generation and condensation separately. (author)
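    The bubble-extraction step can be sketched as thresholding plus connected-component labeling (a toy flood-fill version; the study's software must additionally cope with overlapping bubbles and grayscale gradients):

```python
def bubble_sizes(image, threshold=128):
    """Label connected dark regions (bubbles) in a grayscale frame and
    return their pixel areas -- a toy stand-in for local bubble size and
    size-distribution measurement."""
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    sizes = []
    for y0 in range(h):
        for x0 in range(w):
            if image[y0][x0] < threshold and not seen[y0][x0]:
                stack, area = [(y0, x0)], 0
                seen[y0][x0] = True
                while stack:                      # iterative flood fill
                    y, x = stack.pop()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w and not seen[ny][nx]
                                and image[ny][nx] < threshold):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                sizes.append(area)
    return sorted(sizes)
```

    Applied frame by frame to the high-speed footage, the resulting area list yields the bubble size distribution directly.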

  9. Optical modulation techniques for analog signal processing and CMOS compatible electro-optic modulation

    Science.gov (United States)

    Gill, Douglas M.; Rasras, Mahmoud; Tu, Kun-Yii; Chen, Young-Kai; White, Alice E.; Patel, Sanjay S.; Carothers, Daniel; Pomerene, Andrew; Kamocsai, Robert; Beattie, James; Kopa, Anthony; Apsel, Alyssa; Beals, Mark; Mitchel, Jurgen; Liu, Jifeng; Kimerling, Lionel C.

    2008-02-01

    Integrating electronic and photonic functions onto a single silicon-based chip using techniques compatible with mass-production CMOS electronics will enable new design paradigms for existing system architectures and open new opportunities for electro-optic applications, with the potential to dramatically change the management, cost, footprint, weight, and power consumption of today's communication systems. While broadband analog system applications represent a smaller volume market than that for digital data transmission, there are significant deployments of analog electro-optic systems for commercial and military applications. Broadband linear modulation is a critical building block in optical analog signal processing and could also have significant applications in digital communication systems. Recently, broadband electro-optic modulators on a silicon platform have been demonstrated based on the plasma dispersion effect. The use of the plasma dispersion effect within a CMOS-compatible waveguide creates new challenges and opportunities for analog signal processing, since the index and propagation loss change within the waveguide during modulation. We review the current status of silicon-based electro-optic modulators and also linearization techniques for optical modulation.

  10. West Java Snack Mapping based on Snack Types, Main Ingredients, and Processing Techniques

    Science.gov (United States)

    Nurani, A. S.; Subekti, S.; Ana

    2016-04-01

    The research was motivated by a lack of literature on archipelago snacks, especially from West Java. It aims to explore the snack types, processing techniques, and main ingredients in order to plan learning material on archipelago cakes, especially from West Java. The research methods used are descriptive observation and interviews. The samples were randomly chosen from all regions in West Java. The findings identify traditional snacks from West Java including: 1. snack types which are similar in all sampled regions, namely opak, rangginang, nagasari, aliagrem, cuhcur, keripik, semprong, wajit, dodol, kecimpring, combro, tape ketan, and surabi; the typical regional snack types involve burayot (Garut), simping kaum (Purwakarta), surabi hejo (Karawang), papais cisaat (Subang), papais moyong and opak bakar (Kuningan), opak oded and ranggesing (Sumedang), gapit and tapel (Cirebon), gulampo and kue aci (Tasikmalaya), wajit cililin and gurilem (West Bandung), and borondong (Bandung District); 2. various processing techniques, namely steaming, boiling, frying, caramelizing, baking, grilling, roasting, and sugaring; 3. various main ingredients, namely rice, local glutinous rice, rice flour, glutinous rice flour, starch, wheat flour, hunkue flour, cassava, sweet potato, banana, nuts, and corn; 4. a snack classification for West Java, namely (1) traditional snacks, (2) creation snacks, (3) modification snacks, and (4) outside-influence snacks.

  11. Separation and purification of uranium product from thorium in thorex process by precipitation technique

    International Nuclear Information System (INIS)

    Ramanujam, A.; Dhami, P.S.; Gopalakrishnan, V.; Mukherjee, A.; Dhumwad, R.K.

    1989-01-01

    A sequential precipitation technique is reported for the separation of uranium and thorium present in the uranium product stream of a single-cycle 5 per cent TBP Thorex process. It involves the precipitation of thorium as oxalate in 1 M HNO3 medium at 60-70 °C and, after filtration, precipitation of uranium as ammonium diuranate at 80-90 °C from the oxalate supernatant. This technique has several advantages over the ion-exchange process normally used for treating these products. In order to meet varying feed conditions, the method has been tested for feeds containing 10 g/L uranium and 1-50 g/L thorium in 1-6 M HNO3. Various parameters such as feed acidity, uranium and thorium concentrations, excess oxalic acid concentration in the oxalate supernatant, precipitation temperature, and precipitate wash volume have been optimised to obtain more than 99 per cent recovery of thorium and uranium as their oxides, with less than 50 ppm uranium losses to the ammonium diuranate filtrate. The distribution patterns of different fission products and stainless steel corrosion products during the various steps of this procedure have also been studied. To simulate actual Thorex plant-scale operation, experiments have been conducted with 25 g and 100 g lots of uranium per batch. (author). 6 tabs., 8 figs., 22 refs

  12. Quantitative identification and analysis of sub-seismic extensional structure system: technique schemes and processes

    International Nuclear Information System (INIS)

    Chenghua, Ou; Chen, Wei; Ma, Zhonggao

    2015-01-01

    Quantitative characterization of the complex sub-seismic extensional structure systems that essentially control petroleum exploitation is difficult to implement in seismic profile interpretation. This research, based on a case study in block M of Myanmar, established a set of quantitative treatment schemes and technique processes for the identification of sub-seismic low-displacement (SSLD) extensional faults or fractures based on structural deformation restoration and geometric inversion. First, the master-subsidiary inheritance relations and configuration of the seismic-scale extensional fault systems are determined by analyzing the structural pattern. In addition, the three-dimensional (3D) pattern and characteristics of the seismic-scale extensional structure are illustrated by a 3D structural model built upon seismic sections. Moreover, from the dilatancy obtained through structural restoration based on the inclined shear method, together with the fracture-flow index, potential SSLD extensional faults or fractures have been quantitatively identified. Application of the technique to the sub-seismic low-displacement extensional structures in block M in Myanmar shows how such SSLD extensional structure systems can be quantitatively interpreted in practice. (paper)

  13. Development of spent solvent treatment process by a submerged combustion technique

    International Nuclear Information System (INIS)

    Uchiyama, Gunzo; Maeda, Mitsuru; Fujine, Sachio; Amakawa, Masayuki; Uchida, Katsuhide; Chida, Mitsuhisa

    1994-01-01

    An experimental study using bench-scale equipment processing 1 kg of simulated spent solvent per hour has been conducted in order to evaluate the applicability of a submerged combustion technique to the treatment of spent solvents contaminated with TRU elements. This report describes the experimental results on the combustion characteristics of simulated spent solvents of tri-n-butyl phosphate and/or n-dodecane, and on the distribution behavior of combustion products such as phosphoric acid, Ru, I, Zr and lanthanides (as TRU simulants) in the submerged combustion process. The experimental results of TRU separation from phosphoric acid solution by co-precipitation using bismuth phosphate are also reported. It was shown that the submerged combustion technique is applicable to the treatment of spent solvents, including the distillation residues of the solvent. Based on the experimental data, a new spent solvent treatment process was proposed, consisting of submerged combustion, co-precipitation using bismuth phosphate, ceramic membrane filtration, cementation of the TRU-lean phosphate, and vitrification of the TRU-rich waste. (author)

  14. Effect of Formulation and Process Parameters on Chitosan Microparticles Prepared by an Emulsion Crosslinking Technique.

    Science.gov (United States)

    Rodriguez, Lidia B; Avalos, Abraham; Chiaia, Nicholas; Nadarajah, Arunan

    2017-05-01

    There are many studies on the synthesis of chitosan microparticles; however, most of them have very low production rates, yield wide size distributions, are difficult to reproduce, and use harsh crosslinking agents. Uniform microparticles are necessary to obtain repeatable drug release behavior. The main focus of this investigation was to study the effect of process and formulation parameters during the preparation of chitosan microparticles in order to produce particles with a narrow size distribution. The technique evaluated in this study was the emulsion crosslinking technique. Chitosan is a biocompatible and biodegradable material but lacks good mechanical properties; for that reason, chitosan was ionically crosslinked with sodium tripolyphosphate (TPP) at three different ratios (32, 64, and 100%). The model drug used was acetylsalicylic acid (ASA). During the preparation of the microparticles, chitosan was first mixed with ASA and then dispersed in oil containing an emulsifier. The evaporation of the solvents hardened the hydrophilic droplets, forming microparticles with spherical shape. The process and formulation parameters were varied, and the microparticles were characterized by their morphology, particle size, drug loading efficiency, and drug release behavior. The highest drug loading efficiency was achieved using a 32% mass ratio of TPP to chitosan. The average microparticle size was 18.7 μm. The optimum formulation conditions to prepare uniform spherical microparticles were determined and represented by a region in a triangular phase diagram. Drug release was evaluated in phosphate buffer solution at pH 7.4 and was mainly complete at 24 h.

  15. Phase distribution measurements in narrow rectangular channels using image processing techniques

    International Nuclear Information System (INIS)

    Bentley, C.; Ruggles, A.

    1991-01-01

    Many high-flux research reactor fuel assemblies are cooled by systems of parallel narrow rectangular channels. The HFIR is cooled by single-phase forced convection under normal operating conditions. However, two-phase forced convection or two-phase mixed convection can occur in the fueled region as a result of some hypothetical accidents. Such flow conditions would occur only at decay power levels, with a system pressure of around 0.15 MPa. The phase distribution of air-water flow in a narrow rectangular channel is examined using image processing techniques. Ink is added to the water, and clear channel walls are used to allow high-speed still photographs and video tape to be taken of the air-water flow field. Flow field images are digitized and stored in a Macintosh IIci computer using a frame grabber board. Local grey levels are related to liquid thickness in the flow channel using a calibration fixture. Image processing shareware is used to calculate the spatially averaged liquid thickness from the image of the flow field. Time-averaged spatial liquid distributions are calculated using image calculation algorithms. The spatially averaged liquid distribution is calculated from the time-averaged spatial liquid distribution to formulate the combined temporally and spatially averaged liquid fraction values. The temporally and spatially averaged liquid fractions measured using this technique compare well with those predicted from pressure gradient measurements at zero superficial liquid velocity.
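    The grey-level calibration and two-stage averaging described above can be sketched in a few lines. This is a hypothetical illustration: the calibration pairs, units, and frame data below are invented, since the abstract does not give the actual calibration curve.

```python
# Invented (grey level, liquid thickness in mm) calibration pairs standing
# in for the calibration-fixture data described in the abstract.
CALIBRATION = [(0, 0.0), (64, 0.4), (128, 1.0), (192, 1.8), (255, 2.5)]

def thickness_from_grey(g):
    """Map a grey level to liquid thickness by piecewise-linear
    interpolation of the calibration curve."""
    for (g0, t0), (g1, t1) in zip(CALIBRATION, CALIBRATION[1:]):
        if g0 <= g <= g1:
            return t0 + (t1 - t0) * (g - g0) / (g1 - g0)
    raise ValueError("grey level outside calibration range")

def spatial_mean_thickness(frame):
    """Spatially averaged liquid thickness for one digitized frame
    (a frame is a list of rows of grey levels)."""
    vals = [thickness_from_grey(g) for row in frame for g in row]
    return sum(vals) / len(vals)

def time_and_space_mean(frames):
    """Combined temporally and spatially averaged thickness over a
    sequence of frames."""
    return sum(spatial_mean_thickness(f) for f in frames) / len(frames)
```

    A real analysis would apply `spatial_mean_thickness` to every digitized frame before forming the combined average, mirroring the two-stage averaging the abstract describes.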

  16. Performance evaluation of WDXRF as a process control technique for MOX fuel fabrication

    International Nuclear Information System (INIS)

    Pandey, A.; Khan, F.A.; Das, D.K.; Behere, P.G.; Afzal, Mohd

    2015-01-01

    This paper presents studies on Wavelength Dispersive X-Ray Fluorescence (WDXRF) as a powerful non-destructive technique (NDT) for the compositional analysis of various types of MOX fuels. Samples obtained after the mixing and milling of UO2 and PuO2 powders were analyzed to estimate plutonium content, as a process control step in the fabrication of (U, Pu)O2 mixed oxide (MOX) fuel. A WDXRF method was established as a process control technique for the characterization of heavy metals in various MOX fuels. The attractiveness of the system is that it can analyze samples in solid as well as liquid form. The system is adapted in a glove box for the handling of plutonium-based fuels. The glove-box-adapted system was optimized with uranium- and thorium-based MOX samples before the introduction of Pu. Uranium oxide and thorium oxide were estimated in uranium-thorium MOX samples. The standard deviations for the analysis of U3O8 and ThO2 were found to be 0.14 and 0.15, respectively. The results are validated against conventional wet chemical methods of analysis. (author)

  17. Ultra-processed family foods in Australia: nutrition claims, health claims and marketing techniques.

    Science.gov (United States)

    Pulker, Claire Elizabeth; Scott, Jane Anne; Pollard, Christina Mary

    2018-01-01

    To objectively evaluate voluntary nutrition and health claims and marketing techniques present on the packaging of high-market-share ultra-processed foods (UPF) in Australia for their potential impact on public health. Cross-sectional. Packaging information for products from five high-market-share food manufacturers and one retailer was obtained from supermarket and manufacturers' websites. Ingredients lists for 215 UPF were examined for the presence of added sugar. Packaging information was categorised using a taxonomy of nutrition and health information which included nutrition and health claims and five common food marketing techniques. Compliance of statements and claims with the Australia New Zealand Food Standards Code and with Health Star Ratings (HSR) was assessed for all products. Almost all UPF (95 %) contained added sugars, described in thirty-four different ways; 55 % of UPF displayed a HSR; 56 % had nutrition claims (18 % were compliant with regulations); 25 % had health claims (79 % were compliant); and 97 % employed common food marketing techniques. Packaging of 47 % of UPF was designed to appeal to children. UPF carried a mean of 1·5 health and nutrition claims (range 0-10) and 2·6 marketing techniques (range 0-5), and 45 % had HSR≤3·0/5·0. Most UPF packaging featured nutrition and health statements or claims despite the high prevalence of added sugars and moderate HSR. The degree of inappropriate or inaccurate statements and claims present is concerning, particularly on packaging designed to appeal to children. Public policies to assist parents to select healthy family foods should address the quality and accuracy of information provided on UPF packaging.

  18. Fetal Cardiac Doppler Signal Processing Techniques: Challenges and Future Research Directions

    Directory of Open Access Journals (Sweden)

    Saeed Abdulrahman Alnuaimi

    2017-12-01

    The fetal Doppler ultrasound (DUS) is commonly used for monitoring fetal heart rate and can also be used for identifying the event timings of fetal cardiac valve motions. In early-stage fetuses, the detected Doppler signal suffers from noise and signal loss due to fetal movements and the changing fetal location during the measurement procedure. The fetal cardiac intervals, which can be estimated by measuring the fetal cardiac event timings, are among the most important markers of fetal development and well-being. To advance DUS-based fetal monitoring methods, several powerful and well-advanced signal processing and machine learning methods have recently been developed. This review provides an overview of the existing techniques used in fetal cardiac activity monitoring and a comprehensive survey of fetal cardiac Doppler signal processing frameworks. The review is structured with a focus on their shortcomings and advantages, which helps in understanding fetal Doppler cardiogram signal processing methods and the related Doppler signal analysis procedures, providing valuable clinical information. Finally, a set of recommendations is suggested for future research directions in the use of fetal cardiac Doppler signal analysis, processing, and modeling to address the underlying challenges.
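    As a concrete illustration of the kind of baseline periodicity estimate such frameworks build on, autocorrelation of the Doppler envelope can recover the heart rate. This is a generic sketch with a synthetic signal, not any specific method from the reviewed literature; the sampling rate and signal are invented.

```python
import math

def autocorr_period(signal, min_lag, max_lag):
    """Return the lag (in samples) that maximizes the unnormalized
    autocorrelation within [min_lag, max_lag] -- a simple periodicity
    estimate usable as a heart-rate baseline."""
    n = len(signal)
    mean = sum(signal) / n
    x = [s - mean for s in signal]
    best_lag, best_val = min_lag, float("-inf")
    for lag in range(min_lag, max_lag + 1):
        val = sum(x[i] * x[i + lag] for i in range(n - lag))
        if val > best_val:
            best_lag, best_val = lag, val
    return best_lag

# Synthetic "Doppler envelope": a 2.5 Hz (150 bpm) rectified sinusoid
# sampled at 100 Hz, i.e. a period of exactly 40 samples.
fs = 100
envelope = [abs(math.sin(math.pi * 2.5 * t / fs)) for t in range(400)]
period = autocorr_period(envelope, min_lag=20, max_lag=80)  # -> 40 samples
bpm = 60 * fs / period                                      # -> 150.0
```

    Real recordings would of course require the denoising and dropout handling that the reviewed frameworks address before any such periodicity estimate is reliable.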

  19. Systematization and sophistication of a comprehensive sensitivity analysis program. Phase 2

    International Nuclear Information System (INIS)

    Oyamada, Kiyoshi; Ikeda, Takao

    2004-02-01

    This study refined the estimates produced by a comprehensive sensitivity analysis program for the reliability of TRU waste repository concepts in a crystalline rock setting. We examined each component and the groundwater scenario of the geological repository and prepared systematic bases for examining reliability from the standpoint of comprehensiveness. Models and data were refined for the reliability examination. Based on existing TRU waste repository concepts, the effects of parameters on nuclide migration were quantitatively classified. These parameters, which are to be determined quantitatively, include the site characteristics of the natural barrier and the design specifications of the engineered barriers. Considering the feasibility of those specifications, reliability was re-examined for combinations of the parameters within a practical range. Future issues are: comprehensive representation of a hybrid geosphere model including fractured media and permeable matrix media; and refinement of the tools used to develop reliable combinations of parameters. It is important to continue this study because the disposal concepts and specifications for TRU-nuclide-containing waste at various sites shall be determined rationally and safely through such studies. (author)

  20. Applying industrial process improvement techniques to increase efficiency in a surgical practice.

    Science.gov (United States)

    Reznick, David; Niazov, Lora; Holizna, Eric; Siperstein, Allan

    2014-10-01

    The goal of this study was to examine how industrial process improvement techniques could help streamline the preoperative workup. Lean process improvement was used to streamline patient workup at an endocrine surgery service at a tertiary medical center, utilizing multidisciplinary collaboration. The program consisted of several major changes in how patients are processed in the department, with the goal of shortening the wait time between initial call and consult visit and between consult and surgery. We enrolled 1,438 patients in the program. The wait time from the initial call until consult was reduced from 18.3 ± 0.7 to 15.4 ± 0.9 days. Wait time from consult until operation was reduced from 39.9 ± 1.5 to 33.9 ± 1.3 days for the overall practice and to 15.0 ± 4.8 days for low-risk patients. Patient cancellations were reduced from 27.9 ± 2.4% to 17.3 ± 2.5%. Overall patient flow increased from 30.9 ± 5.1 to 52.4 ± 5.8 consults per month (all P values statistically significant). With Lean process improvement methodology, surgery patients can benefit from an improved, streamlined process with significant reduction in wait time from call to initial consult and initial consult to surgery, with reduced cancellations. This generalized process has resulted in increased practice throughput and efficiency and is applicable to any surgery practice. Copyright © 2014 Elsevier Inc. All rights reserved.

  1. Error characterization and quantum control benchmarking in liquid state NMR using quantum information processing techniques

    Science.gov (United States)

    Laforest, Martin

    Quantum information processing has been the subject of countless discoveries since the early 1990s. It is believed to be the way of the future for computation: using quantum systems permits one to perform computation exponentially faster than on a regular classical computer. Unfortunately, quantum systems that are not isolated do not behave well: they tend to lose their quantum nature due to the presence of the environment. If key information is known about the noise present in the system, methods such as quantum error correction can reduce the errors introduced by the environment during a given quantum computation. In order to harness the quantum world and implement the theoretical ideas of quantum information processing and quantum error correction, it is imperative to understand and quantify the noise present in the quantum processor and to benchmark the quality of the control over the qubits. The usual techniques for estimating the noise or the control are based on quantum process tomography (QPT), which, unfortunately, demands an exponential amount of resources. This thesis presents work towards the characterization of noisy processes in an efficient manner. The protocols are developed in a purely abstract setting with no system-dependent variables. To circumvent the exponential cost of quantum process tomography, three different efficient protocols are proposed and experimentally verified. The first protocol uses the idea of quantum error correction to extract relevant parameters of a given noise model, namely the correlation between the dephasing of two qubits. Following that is a protocol using randomization and symmetrization to extract the probability that a given number of qubits are simultaneously corrupted in a quantum memory, regardless of the specifics of the error and of which qubits are affected. Finally, a last protocol, still using randomization ideas, is developed to estimate the average fidelity per computational gate.

  2. Microstructure and associated properties of YBa2Cu3Ox superconductors prepared by melt-processing techniques

    International Nuclear Information System (INIS)

    Balachandran, U.; Zhong, W.; Youngdahl, C.A.; Poeppel, R.B.

    1993-03-01

    From the standpoint of applications, melt-processed bulk YBa2Cu3Ox (YBCO) superconductors are of considerable interest. We have studied the microstructure and levitation force of melt-processed YBCO, YBCO plus Y2BaCuO5, and YBCO plus Pt samples. Large single-crystalline samples, grown using a seeding technique, were also studied. The levitation force is highest in melt-processed samples made by the seeding technique. 6 figs, 24 refs

  3. Study on structural design technique of silicon carbide applied for thermochemical hydrogen production IS process

    International Nuclear Information System (INIS)

    Takegami, Hiroaki; Terada, Atsuhiko; Inagaki, Yoshiyuki; Ishikura, Syuichi

    2011-03-01

    The IS process is a hydrogen production method that uses a thermochemical reaction cycle of sulfuric acid and iodine. Its equipment must therefore be designed to withstand high temperatures and a highly corrosive environment. In particular, the sulfuric acid decomposer, one of the main components of the IS process, heats 90 wt% sulfuric acid with hot helium to evaporate it, and supplies the resulting vapor to the downstream SO3 decomposer, where part of the sulfuric acid vapor decomposes into SO3. Because the heat exchanger in which the sulfuric acid evaporates contains helium at 4 MPa, it must be of pressure-resistant construction, and a material must be used that can endure temperatures of 700 °C or more in this corrosive environment. Corrosion experiments and related tests identified silicon carbide (SiC) ceramics as the most suitable material. The ceramic heat-exchanger block is housed in a pressure-resistant metallic container, so that even if the block is damaged, fluids such as sulfuric acid cannot leak outside. However, no standardized structural design technique exists for using ceramics as structural components. This report presents a study of a structural design technique that takes the strength characteristics of ceramics into consideration, with reference to existing structural design standards. (author)

  4. Measurement and analysis of geometric parameters of human carotid bifurcation using image post-processing technique

    International Nuclear Information System (INIS)

    Xue Yunjing; Gao Peiyi; Lin Yan

    2008-01-01

    Objective: To investigate variation in the carotid bifurcation geometry of adults of different ages using MR angiography images combined with image post-processing techniques. Methods: Images of the carotid bifurcations of 27 young adults (≤40 years old) and 30 older subjects (>40 years old) were acquired via contrast-enhanced MR angiography. Three-dimensional (3D) geometries of the bifurcations were reconstructed and geometric parameters were measured by the post-processing technique. Results: The geometric parameters of the young versus older groups were as follows: bifurcation angle (70.268°±16.050° versus 58.857°±13.294°), ICA angle (36.893°±11.837° versus 30.275°±9.533°), ICA planarity (6.453°±5.009° versus 6.263°±4.250°), CCA tortuosity (0.023±0.011 versus 0.014±0.005), ICA tortuosity (0.070±0.042 versus 0.046±0.022), ICA/CCA diameter ratio (0.693±0.132 versus 0.728±0.106), ECA/CCA diameter ratio (0.750±0.123 versus 0.809±0.122), ECA/ICA diameter ratio (1.103±0.201 versus 1.127±0.195), bifurcation area ratio (1.057±0.281 versus 1.291±0.252). There were statistically significant differences between the young and older groups in bifurcation angle, ICA angle, CCA tortuosity, ICA tortuosity, ECA/CCA diameter ratio and bifurcation area ratio (F=17.16, 11.74, 23.02, 13.38, 6.54, 22.80, respectively, P<0.05). Conclusions: MR angiography combined with image post-processing techniques can reconstruct 3D carotid bifurcation geometry and measure the geometric parameters of the carotid bifurcation in vivo for each individual. It provides a new and convenient method to investigate the relationship of vascular geometry and flow conditions with atherosclerotic pathological changes. (authors)
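    Tortuosity values of the magnitude reported above are consistent with the common definition of arc length over chord length minus one. The sketch below assumes that definition (the abstract does not state the exact formula used) and uses invented centerline points.

```python
import math

def tortuosity(centerline):
    """(arc length / chord length) - 1 for a vessel centerline given as a
    list of points; 0 for a perfectly straight vessel."""
    arc = sum(math.dist(p, q) for p, q in zip(centerline, centerline[1:]))
    chord = math.dist(centerline[0], centerline[-1])
    return arc / chord - 1.0

def diameter_ratio(d_branch_mm, d_parent_mm):
    """e.g. the ICA/CCA diameter ratio reported in the study."""
    return d_branch_mm / d_parent_mm

# Invented example: a gently curved 3D centerline.
cca = [(0.0, 0.0, 0.0), (10.0, 1.0, 0.0), (20.0, 0.0, 0.0)]
cca_tortuosity = tortuosity(cca)  # small positive value
```

    In practice the centerline and local diameters would come from the reconstructed 3D vessel surface, not from hand-entered points.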

  5. Review of potential processing techniques for the encapsulation of wastes in thermoplastic polymers

    International Nuclear Information System (INIS)

    Patel, B.R.; Lageraaen, P.R.; Kalb, P.D.

    1995-08-01

    Thermoplastic encapsulation has been extensively studied at Brookhaven National Laboratory's (BNL) Environmental and Waste Technology Center (EWTC) as a waste encapsulation technology applicable to a wide range of waste types including radioactive, hazardous and mixed wastes. Encapsulation involves processing thermoplastic and waste materials into a waste form product by heating and mixing both materials into a homogeneous molten mixture. Cooling of the melt results in a solid monolithic waste form in which contaminants have been completely surrounded by a polymer matrix. Heating and mixing requirements for successful waste encapsulation can be met using proven technologies available in various types of commercial equipment. Processing techniques for thermoplastic materials, such as low density polyethylene (LDPE), are well established within the plastics industry. The majority of commercial polymer processing is accomplished using extruders, mixers or a combination of these technologies. Extruders and mixers are available in a broad range of designs and are used during the manufacture of consumer and commercial products as well as for compounding applications. Compounding which refers to mixing additives such as stabilizers and/or colorants with polymers, is analogous to thermoplastic encapsulation. Several processing technologies were investigated for their potential application in encapsulating residual sorbent waste in selected thermoplastic polymers, including single-screw extruders, twin-screw extruders, continuous mixers, batch mixers as well as other less conventional devices. Each was evaluated based on operational ease, quality control, waste handling capabilities as well as degree of waste pretreatment required. Based on literature review, this report provides a description of polymer processing technologies, a discussion of the merits and limitations of each and an evaluation of their applicability to the encapsulation of sorbent wastes

  6. Review of potential processing techniques for the encapsulation of wastes in thermoplastic polymers

    Energy Technology Data Exchange (ETDEWEB)

    Patel, B.R.; Lageraaen, P.R.; Kalb, P.D.

    1995-08-01

    Thermoplastic encapsulation has been extensively studied at Brookhaven National Laboratory's (BNL) Environmental and Waste Technology Center (EWTC) as a waste encapsulation technology applicable to a wide range of waste types including radioactive, hazardous and mixed wastes. Encapsulation involves processing thermoplastic and waste materials into a waste form product by heating and mixing both materials into a homogeneous molten mixture. Cooling of the melt results in a solid monolithic waste form in which contaminants have been completely surrounded by a polymer matrix. Heating and mixing requirements for successful waste encapsulation can be met using proven technologies available in various types of commercial equipment. Processing techniques for thermoplastic materials, such as low density polyethylene (LDPE), are well established within the plastics industry. The majority of commercial polymer processing is accomplished using extruders, mixers or a combination of these technologies. Extruders and mixers are available in a broad range of designs and are used during the manufacture of consumer and commercial products as well as for compounding applications. Compounding which refers to mixing additives such as stabilizers and/or colorants with polymers, is analogous to thermoplastic encapsulation. Several processing technologies were investigated for their potential application in encapsulating residual sorbent waste in selected thermoplastic polymers, including single-screw extruders, twin-screw extruders, continuous mixers, batch mixers as well as other less conventional devices. Each was evaluated based on operational ease, quality control, waste handling capabilities as well as degree of waste pretreatment required. Based on literature review, this report provides a description of polymer processing technologies, a discussion of the merits and limitations of each and an evaluation of their applicability to the encapsulation of sorbent wastes.

  7. Performance enhancement of various real-time image processing techniques via speculative execution

    Science.gov (United States)

    Younis, Mohamed F.; Sinha, Purnendu; Marlowe, Thomas J.; Stoyenko, Alexander D.

    1996-03-01

    In real-time image processing, an application must satisfy a set of timing constraints while ensuring the semantic correctness of the system. Because of the natural structure of digital data, pure data and task parallelism have been used extensively in real-time image processing to accelerate the handling time of image data. These types of parallelism are based on splitting the execution load performed by a single processor across multiple nodes. However, execution of all parallel threads is mandatory for correctness of the algorithm. On the other hand, speculative execution is an optimistic execution of part(s) of the program based on assumptions on program control flow or variable values. Rollback may be required if the assumptions turn out to be invalid. Speculative execution can enhance average, and sometimes worst-case, execution time. In this paper, we target various image processing techniques to investigate applicability of speculative execution. We identify opportunities for safe and profitable speculative execution in image compression, edge detection, morphological filters, and blob recognition.
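    The control-flow speculation pattern described above can be sketched as follows. This is a toy illustration of the commit/rollback logic only: the filters, the predictor, and all values are invented, and a real system would run the speculative work concurrently with the control computation rather than sequentially.

```python
def smooth(image):
    """3x3 box blur on interior pixels; border pixels are copied."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = sum(image[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1)) // 9
    return out

def sharpen(image):
    """Simple Laplacian sharpening on interior pixels, clipped to [0, 255]."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            v = 5 * image[y][x] - (image[y - 1][x] + image[y + 1][x]
                                   + image[y][x - 1] + image[y][x + 1])
            out[y][x] = max(0, min(255, v))
    return out

FILTERS = {"smooth": smooth, "sharpen": sharpen}

def control_decision(image):
    """Stand-in for an expensive control-flow computation that decides
    which filter is actually required."""
    mean = sum(map(sum, image)) / (len(image) * len(image[0]))
    return "sharpen" if mean > 128 else "smooth"

def speculative_apply(image, predicted="smooth"):
    """Optimistically run the predicted filter; commit the result if the
    prediction was right, otherwise roll back (discard it) and recompute."""
    speculative_result = FILTERS[predicted](image)   # speculative work
    actual = control_decision(image)                 # control flow resolves
    if actual == predicted:
        return speculative_result, True              # commit
    return FILTERS[actual](image), False             # rollback + redo
```

    On a correct prediction the speculative work overlaps the control computation and reduces latency; on a misprediction the cost is the wasted speculative filter pass, which is why the paper stresses identifying safe and profitable speculation opportunities.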

  8. Evaluating Statistical Process Control (SPC) techniques and computing the uncertainty of force calibrations

    Science.gov (United States)

    Navard, Sharon E.

    1989-01-01

    In recent years there has been a push within NASA to use statistical techniques to improve the quality of production. Two areas where statistics are used are in establishing product and process quality control of flight hardware and in evaluating the uncertainty of calibration of instruments. The Flight Systems Quality Engineering branch is responsible for developing and assuring the quality of all flight hardware; the statistical process control methods employed are reviewed and evaluated. The Measurement Standards and Calibration Laboratory performs the calibration of all instruments used on-site at JSC as well as those used by all off-site contractors. These calibrations must be performed in such a way as to be traceable to national standards maintained by the National Institute of Standards and Technology, and they must meet a four-to-one ratio of the instrument specifications to calibrating standard uncertainty. In some instances this ratio is not met, and in these cases it is desirable to compute the exact uncertainty of the calibration and determine ways of reducing it. A particular example where this problem is encountered is with a machine which does automatic calibrations of force. The process of force calibration using the United Force Machine is described in detail. The sources of error are identified and quantified when possible. Suggestions for improvement are made.

  9. New understanding of rhizosphere processes enabled by advances in molecular and spatially resolved techniques

    Energy Technology Data Exchange (ETDEWEB)

    Hess, Nancy J.; Paša-Tolić, Ljiljana; Bailey, Vanessa L.; Dohnalkova, Alice C.

    2017-06-01

    Understanding the role played by microorganisms within soil systems is challenged by the unique intersection of physics, chemistry, mineralogy and biology in fostering habitat for soil microbial communities. Addressing these challenges will require observations across multiple spatial and temporal scales to capture the dynamics and emergent behavior of complex and interdependent processes. The heterogeneity and complexity of the rhizosphere require advanced techniques that simultaneously push the frontiers of spatial resolution, analyte sensitivity and specificity, reproducibility, dynamic range, and throughput. Fortunately, many exciting technical advancements are now available to inform and guide the development of new hypotheses. The aim of this Special Issue is to provide a holistic view of the rhizosphere from the perspective of modern molecular biology methodologies that enable a highly focused, detailed view of rhizosphere processes, including the numerous, strong and complex interactions between plant roots, soil constituents and microorganisms. We discuss current rhizosphere research challenges and knowledge gaps, as well as perspectives and approaches using newly available state-of-the-art toolboxes. These new approaches and methodologies allow the study of rhizosphere processes and properties, and of the rhizosphere as a central component of ecosystems and biogeochemical cycles.

  10. Patented Techniques for Acrylamide Mitigation in High-Temperature Processed Foods

    DEFF Research Database (Denmark)

    Mariotti, Salome; Pedreschi, Franco; Antonio Carrasco, José

    2011-01-01

    Heating foods has many advantages: it adds taste, color and texture and minimizes harmful germs, among others. Flavor and aroma compounds are produced via the Maillard reaction, where various hazardous compounds, such as acrylamide, may form as well. The Maillard reaction is believed to be the main route for acrylamide formation between reducing sugars (glucose and fructose), sucrose, and the amino acid asparagine; consequently, a variety of technologies have been developed to reduce acrylamide concentration in thermally processed foods, based either on (i) changing process parameters (…). Patented techniques for acrylamide reduction in foods processed at high temperatures are mentioned and briefly analyzed in order to develop new mitigation techniques for acrylamide in different food matrixes.

  11. A novel method for detecting and counting overlapping tracks in SSNTD by image processing techniques

    International Nuclear Information System (INIS)

    Ab Azar, N.; Babakhani, A.; Broumandnia, A.; Sepanloo, K.

    2016-01-01

    Overlapping object detection and counting is a challenge in image processing. A new method for detecting and counting overlapping circles is presented in this paper. The method is based on pattern recognition and feature extraction using "neighborhood values" in an object image, implemented with image processing techniques. Junction points are detected by assigning a value to each pixel in an image. As is shown, the neighborhood values for junction points are larger than the values for other points. This distinction of neighborhood values is the main feature that can be utilized to identify the junction points and to count the overlapping tracks. The method can be used for recognizing and counting charged particle tracks, blood cells and also cancer cells. The method is called "Track Counting based on Neighborhood Values" and is abbreviated "TCNV". - Highlights: • A new method is introduced to recognize nuclear tracks by image processing. • The method identifies neighborhood pixels at junction points in overlapping tracks. • Enhanced method of counting overlapping tracks. • The new counting system behaves linearly for track densities below 300,000 tracks per cm². • With the new method, overlapping tracks can be recognized even at 10× overlap and more.
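The neighborhood-value idea can be illustrated with a toy sketch (not the authors' TCNV implementation): on a grid where two line-like tracks cross, the count of foreground pixels inside a window around each foreground pixel peaks at the junction. The grid, window radius, and track shapes below are illustrative assumptions.

```python
def neighborhood_values(img, radius=2):
    """For each foreground pixel, count the foreground pixels inside the
    surrounding (2*radius+1)^2 window -- its 'neighborhood value'."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if not img[y][x]:
                continue
            out[y][x] = sum(
                img[ny][nx]
                for ny in range(max(0, y - radius), min(h, y + radius + 1))
                for nx in range(max(0, x - radius), min(w, x + radius + 1)))
    return out

# Two crossing line-like tracks on a 5x5 grid; they overlap at (2, 2).
img = [[0] * 5 for _ in range(5)]
for i in range(5):
    img[2][i] = 1   # horizontal track
    img[i][2] = 1   # vertical track

nv = neighborhood_values(img)
# The largest neighborhood value lands on the junction pixel.
peak = max((nv[y][x], y, x) for y in range(5) for x in range(5))
```

Counting such peaks gives the number of pairwise overlaps, which is the feature the method exploits to correct the track count.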

  12. Evaluating Non-In-Place Update Techniques for Flash-Based Transaction Processing Systems

    Science.gov (United States)

    Wang, Yongkun; Goda, Kazuo; Kitsuregawa, Masaru

    Recently, flash memory has been emerging as a storage device. With prices falling fast, its cost per capacity is approaching that of SATA disk drives. So far, flash memory has been widely deployed in consumer electronics and partly in mobile computing environments. For enterprise systems, its deployment has been studied by many researchers and developers. In terms of access performance characteristics, flash memory is quite different from disk drives. Having no mechanical components, flash memory has very high random read performance, whereas its random write performance is limited by the erase-before-write design; random write performance is comparable with, or even worse than, that of disk drives. Due to this performance asymmetry, naive deployment in enterprise systems may not fully exploit the potential performance of flash memory. This paper studies the effectiveness of using non-in-place-update (NIPU) techniques through the IO path of flash-based transaction processing systems. Our experiments using both an open-source DBMS and a commercial DBMS validated the potential benefits: a 3.0x to 6.6x performance improvement was confirmed by incorporating NIPU techniques into the file system without any modification of applications or storage devices.
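A minimal sketch of the non-in-place-update idea, using a toy key-value store rather than a real DBMS or flash translation layer: every write appends a new version to a sequential log, so a logical overwrite never rewrites an old location in place — exactly the access pattern that sidesteps flash's erase-before-write penalty.

```python
class AppendOnlyStore:
    """Toy non-in-place-update (NIPU) store: every write appends a new
    version to a sequential log instead of overwriting the old record."""

    def __init__(self):
        self.log = []     # sequential log of (key, value) versions
        self.index = {}   # key -> offset of the latest version in the log

    def put(self, key, value):
        # A logical overwrite becomes a sequential append plus an
        # index update; the stale version is left for garbage collection.
        self.index[key] = len(self.log)
        self.log.append((key, value))

    def get(self, key):
        return self.log[self.index[key]][1]

store = AppendOnlyStore()
store.put("page7", "v1")
store.put("page7", "v2")   # rewrite appends; nothing is erased in place
```

On flash, turning random overwrites into sequential appends like this is what yields the large write-throughput gains the paper reports.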

  13. Effect of Novel Quick Freezing Techniques Combined with Different Thawing Processes on Beef Quality

    Science.gov (United States)

    Yoo, Seon-Mi; Han, Gui-Jung

    2014-01-01

    This study investigated the effect of various freezing and thawing techniques on the quality of beef. Meat samples were frozen using natural convection freezing (NF), individual quick freezing (IQF), or cryogenic freezing (CF) techniques, followed by natural convection thawing (NCT) or running water thawing (RT). The meat was frozen until the core temperature reached -12℃ and then stored at -24℃, followed by thawing until the temperature reached 5℃. Quality parameters, such as the pH, water binding properties, CIE color, shear force, and microstructure of the beef, were evaluated. Although the freezing and thawing combinations did not cause remarkable changes in the quality parameters, rapid freezing, in the order of CF, IQF, and NF, was found to minimize quality deterioration. Among the thawing methods, NCT was better than RT, and the meat quality was influenced by the thawing temperature rather than the thawing rate. Although the microstructure of the frozen beef exhibited an excessive loss of integrity after freezing and thawing, this did not cause any remarkable change in beef quality. Taken together, these results demonstrate that CF and NCT form the best combination for beef processing; however, IQF and NCT may have practical applications in the frozen food industry. PMID:26761674

  14. Digital image analysis in breast pathology-from image processing techniques to artificial intelligence.

    Science.gov (United States)

    Robertson, Stephanie; Azizpour, Hossein; Smith, Kevin; Hartman, Johan

    2018-04-01

    Breast cancer is the most common malignant disease in women worldwide. In recent decades, earlier diagnosis and better adjuvant therapy have substantially improved patient outcome. Diagnosis by histopathology has proven to be instrumental to guide breast cancer treatment, but new challenges have emerged as our increasing understanding of cancer over the years has revealed its complex nature. As patient demand for personalized breast cancer therapy grows, we face an urgent need for more precise biomarker assessment and more accurate histopathologic breast cancer diagnosis to make better therapy decisions. The digitization of pathology data has opened the door to faster, more reproducible, and more precise diagnoses through computerized image analysis. Software to assist diagnostic breast pathology through image processing techniques has been available for years. But recent breakthroughs in artificial intelligence (AI) promise to fundamentally change the way we detect and treat breast cancer in the near future. Machine learning, a subfield of AI that applies statistical methods to learn from data, has seen an explosion of interest in recent years because of its ability to recognize patterns in data with less need for human instruction. One technique in particular, known as deep learning, has produced groundbreaking results in many important problems including image classification and speech recognition. In this review, we will cover the use of AI and deep learning in diagnostic breast pathology, and other recent developments in digital image analysis. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Mass Detection in Mammographic Images Using Wavelet Processing and Adaptive Threshold Technique.

    Science.gov (United States)

    Vikhe, P S; Thool, V R

    2016-04-01

    Detection of masses in mammograms for early diagnosis of breast cancer is a significant task in reducing the mortality rate. However, in some cases, screening for masses is difficult for radiologists, due to variation in contrast, fuzzy edges and noisy mammograms. Masses and micro-calcifications are the distinctive signs for diagnosis of breast cancer. This paper presents a method for mass enhancement using a piecewise linear operator in combination with wavelet processing of mammographic images. The method includes artifact suppression and pectoral muscle removal based on morphological operations. Finally, mass segmentation using an adaptive threshold technique is carried out to separate the mass from the background. The proposed method has been tested on 130 (45 + 85) images, achieving 90.9 and 91 % True Positive Fraction (TPF) at 2.35 and 2.1 average False Positives Per Image (FP/I) on two different databases, namely the Mammographic Image Analysis Society (MIAS) and the Digital Database for Screening Mammography (DDSM). The obtained results show that the proposed technique supports improved diagnosis in early breast cancer detection.
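The adaptive-threshold step can be sketched with a simple local mean-and-deviation rule; this is a generic stand-in for the paper's technique, and the window radius, the factor k, and the toy image are illustrative assumptions.

```python
import statistics

def adaptive_threshold(img, radius=1, k=0.5):
    """Flag a pixel as foreground when it exceeds the local window mean
    by k times the local (population) standard deviation."""
    h, w = len(img), len(img[0])
    mask = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            win = [img[ny][nx]
                   for ny in range(max(0, y - radius), min(h, y + radius + 1))
                   for nx in range(max(0, x - radius), min(w, x + radius + 1))]
            mu = statistics.mean(win)
            sigma = statistics.pstdev(win)
            mask[y][x] = 1 if img[y][x] > mu + k * sigma else 0
    return mask

# A single bright 'mass' pixel on a dim background is separated even
# though a single global threshold would depend on the background level.
img = [[10] * 5 for _ in range(5)]
img[2][2] = 200
mask = adaptive_threshold(img)
```

Because the threshold follows the local statistics, the same rule tolerates the contrast variation across a mammogram that defeats a single global cutoff.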

  16. Collimation method using an image processing technique for an assembling-type antenna

    Science.gov (United States)

    Okuyama, Toshiyuki; Kimura, Shinichi; Fukase, Yutaro; Ueno, Hiroshi; Harima, Kouichi; Sato, Hitoshi; Yoshida, Tetsuji

    1998-10-01

    To construct highly precise space structures, such as antennas, it is essential to be able to collimate them with high precision by remote operation. Surveying techniques commonly used for collimating ground-based antennas cannot be applied to space systems, since they require relatively sensitive and complex instruments. In this paper, we propose a collimation method applied to mark-patterns mounted on an antenna dish for detecting very slight displacements. By calculating a cross-correlation function between the target and reference mark-patterns, and by interpolating this calculated function, we can measure the displacement of the target mark-pattern with sub-pixel precision. We developed a test-bed for the measuring system and evaluated several mark-patterns suitable for our image processing technique. A mark-pattern was found that enabled displacement detection with an RMS error of 1/100 pixel. Several tests conducted using this chosen pattern verified the robustness of the method to different lighting conditions and alignment errors. This collimating method is designed for application to an assembling-type antenna being developed by the Communications Research Laboratory.
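The correlate-then-interpolate step can be sketched in one dimension. The three-point parabolic fit shown here is one common interpolation choice, not necessarily the authors'; the Gaussian test profile and 0.3-pixel displacement are illustrative assumptions.

```python
import math

def cross_correlation(a, b, max_lag):
    """Correlation of profile b against a at integer lags in [-max_lag, max_lag]."""
    corrs = {}
    for lag in range(-max_lag, max_lag + 1):
        corrs[lag] = sum(a[i] * b[i + lag]
                         for i in range(len(a))
                         if 0 <= i + lag < len(b))
    return corrs

def subpixel_peak(corrs):
    """Refine the integer correlation peak with a three-point parabolic fit."""
    lag = max(corrs, key=corrs.get)
    if lag - 1 not in corrs or lag + 1 not in corrs:
        return float(lag)
    cm, c0, cp = corrs[lag - 1], corrs[lag], corrs[lag + 1]
    return lag + 0.5 * (cm - cp) / (cm - 2.0 * c0 + cp)

# Reference mark profile and the same profile displaced by 0.3 pixels.
ref = [math.exp(-(i - 10.0) ** 2 / 4.0) for i in range(21)]
tgt = [math.exp(-(i - 10.3) ** 2 / 4.0) for i in range(21)]
shift = subpixel_peak(cross_correlation(ref, tgt, 3))
```

Interpolating the correlation surface this way is what lets a mark-pattern report displacements far below one pixel, as in the 1/100-pixel RMS figure above.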

  17. Recent progress in the melt-process technique of high-temperature superconductors

    CERN Document Server

    Ikuta, H; Mizutani, U

    1999-01-01

    Recently, the performance of high-temperature superconductors prepared by the melt-process technique has been greatly improved. This progress was accomplished by the addition of Ag into the starting materials of the Sm-Ba-Cu-O system, which prevents the formation of severe macro-sized cracks in the finished samples. The magnetic flux density trapped by this material has now reached 9 T at 25 K, which is comparable to the magnetic flux density produced by ordinary superconducting magnets. The amount of magnetic flux density that can be trapped by the sample is limited by the mechanical strength rather than the superconducting properties of the material. The increase in the mechanical strength of the material is important both for further improvement of the material properties and for ensuring reliability of the material in practical applications. (20 refs).

  18. Process parameters optimization for synthesis of methyl ester from sunflower oil using Taguchi technique

    Directory of Open Access Journals (Sweden)

    G. Senthilkumar

    2014-09-01

    Full Text Available In this work, transesterification of sunflower oil for obtaining biodiesel was studied. Taguchi's methodology (L9 orthogonal array) was selected to optimize the most significant variables (methanol, catalyst concentration and stirrer speed) in the transesterification process. Experiments were conducted based on an L9 orthogonal array developed using the Taguchi technique. Analysis of Variance (ANOVA) and regression equations were used to find the optimum yield of sunflower methyl ester under the influence of methanol, catalyst and stirrer speed. The study resulted in a maximum yield of sunflower methyl ester of 96% under the optimal conditions of 110 ml methanol with 0.5 wt.% sodium hydroxide (NaOH), stirred at 1200 rpm. The yield was analyzed on the basis of "larger is better". Finally, confirmation tests were carried out to verify the experimental results.
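The "larger is better" criterion corresponds to the standard Taguchi signal-to-noise ratio, which can be computed as below. The yield values fed in are illustrative, not the paper's measured runs.

```python
import math

def sn_larger_is_better(responses):
    """Taguchi signal-to-noise ratio (dB) for a 'larger is better'
    quality characteristic such as ester yield:
    S/N = -10 * log10( mean(1 / y_i^2) )."""
    n = len(responses)
    return -10.0 * math.log10(sum(1.0 / y ** 2 for y in responses) / n)

# With a constant response y the ratio reduces to 20*log10(y), so a
# 96% yield scores higher (better) than an 80% yield.
sn_96 = sn_larger_is_better([96.0, 96.0, 96.0])
sn_80 = sn_larger_is_better([80.0, 80.0, 80.0])
```

In an L9 study, this S/N value is computed per run and then averaged per factor level to pick the optimum setting of each variable.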

  19. Investigation of an Autofocusing Method for Visible Aerial Cameras Based on Image Processing Techniques

    Directory of Open Access Journals (Sweden)

    Zhichao Chen

    2016-01-01

    Full Text Available In order to realize the autofocusing in aerial camera, an autofocusing system is established and its characteristics such as working principle and optical-mechanical structure and focus evaluation function are investigated. The reason for defocusing in aviation camera is analyzed and several autofocusing methods along with appropriate focus evaluation functions are introduced based on the image processing techniques. The proposed autofocusing system is designed and implemented using two CMOS detectors. The experiment results showed that the proposed method met the aviation camera focusing accuracy requirement, and a maximum focusing error of less than half of the focus depth is achieved. The system designed in this paper can find the optical imaging focal plane in real-time; as such, this novel design has great potential in practical engineering, especially aerospace applications.
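A typical image-based focus evaluation function scores gradient energy, which falls as the image defocuses; this sketch is a generic illustration, not the paper's exact function, and the step-edge test images are assumptions.

```python
def focus_measure(img):
    """Gradient-energy focus evaluation function: the sum of squared
    horizontal and vertical pixel differences. Sharper images score higher."""
    h, w = len(img), len(img[0])
    score = 0
    for y in range(h):
        for x in range(w):
            if x + 1 < w:
                score += (img[y][x + 1] - img[y][x]) ** 2
            if y + 1 < h:
                score += (img[y + 1][x] - img[y][x]) ** 2
    return score

sharp = [[0, 0, 100, 100]] * 4    # crisp step edge (in focus)
blurry = [[0, 33, 67, 100]] * 4   # the same edge smeared (defocused)
```

An autofocus loop moves the focal mechanism to maximize such a measure; the focal plane is taken where the score peaks.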

  20. ONTOLOGY BASED MEANINGFUL SEARCH USING SEMANTIC WEB AND NATURAL LANGUAGE PROCESSING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    K. Palaniammal

    2013-10-01

    Full Text Available The semantic web extends the current World Wide Web by adding facilities for the machine-understood description of meaning. The ontology-based search model is used to enhance the efficiency and accuracy of information retrieval. Ontology is the core technology of the semantic web and provides a mechanism for representing formal and shared domain descriptions. In this paper, we propose ontology-based meaningful search using semantic web and Natural Language Processing (NLP) techniques in the educational domain. First we build the educational ontology; then we present the semantic search system. The search model consists of three parts: spell checking, finding synonyms using the WordNet API, and querying the ontology using the SPARQL language. The results are sensitive to both spelling corrections and synonym context. This approach provides more accurate results and the complete details for the selected field in a single page.
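The synonym-expansion and ontology-query parts of the pipeline can be sketched with a toy in-memory triple store. The triples, the synonym table, and the SPARQL string in the comment are illustrative assumptions, not the paper's educational ontology or its WordNet integration.

```python
# Hypothetical educational-ontology triples (subject, predicate, object).
TRIPLES = [
    ("Algebra", "isPartOf", "Mathematics"),
    ("Calculus", "isPartOf", "Mathematics"),
    ("Optics", "isPartOf", "Physics"),
]

# Hand-rolled synonym table standing in for WordNet lookups.
SYNONYMS = {"maths": "Mathematics", "math": "Mathematics"}

def search(term):
    """Expand the query term via synonyms, then match the ontology.
    Against a real RDF store this step would be a SPARQL query such as:
    SELECT ?course WHERE { ?course :isPartOf :Mathematics }"""
    term = SYNONYMS.get(term.lower(), term)   # synonym expansion step
    return sorted(s for s, p, o in TRIPLES
                  if p == "isPartOf" and o == term)

results = search("maths")
```

The point of the expansion step is that a query phrased as "maths" still reaches the courses filed under "Mathematics" in the ontology.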

  1. Study of ferrallitisation process in soil by application of isotopic dilution kinetic technique to iron

    International Nuclear Information System (INIS)

    Thomann, Christiane

    1978-01-01

    Isotopic dilution kinetic technique applied to iron may help clarify the conditions of "potential" mobility of iron in soils under the action of three factors: moisture, incubation period and organic matter inputs. Comparison between the surface horizons of three tropical soils (leached ferruginous tropical soil, weakly ferrallitic red soil and ferrallitic soil) shows that, in the ferrallitisation process, the weakly ferrallitic soil falls between the two other types of soil, with a maximum mobility of iron. This mobility decreases when the organic matter rate decreases, leading to "beige" soil (leached ferruginous tropical soil), and when the hydroxide rate increases, which leads to ferrallitic soil. In podzol (A1 horizon), for the same rate of organic matter, the potential mobility of iron is higher than in ferrallitic soil, because the ferrallitic soil contains ten times more free iron than the podzol. [fr]

  2. Test/score/report: Simulation techniques for automating the test process

    Science.gov (United States)

    Hageman, Barbara H.; Sigman, Clayton B.; Koslosky, John T.

    1994-01-01

    A Test/Score/Report capability is currently being developed for the Transportable Payload Operations Control Center (TPOCC) Advanced Spacecraft Simulator (TASS) system which will automate testing of the Goddard Space Flight Center (GSFC) Payload Operations Control Center (POCC) and Mission Operations Center (MOC) software in three areas: telemetry decommutation, spacecraft command processing, and spacecraft memory load and dump processing. Automated computer control of the acceptance test process is one of the primary goals of a test team. With the proper simulation tools and user interface, the tasks of acceptance testing, regression testing, and repeating specific test procedures of a ground data system become simpler. Ideally, the goal for complete automation would be to plug the operational deliverable into the simulator, press the start button, execute the test procedure, accumulate and analyze the data, score the results, and report the results to the test team along with a go/no-go recommendation. In practice, this may not be possible because of inadequate test tools, pressures of schedules, limited resources, etc. Most tests are accomplished using a certain degree of automation and test procedures that are labor intensive. This paper discusses some simulation techniques that can improve the automation of the test process. The TASS system tests the POCC/MOC software and provides a score based on the test results. The TASS system displays statistics on the success of the POCC/MOC system processing in each of the three areas as well as event messages pertaining to the Test/Score/Report processing. The TASS system also provides formatted reports documenting each step performed during the tests and the results of each step. A prototype of the Test/Score/Report capability is available and currently being used to test some POCC/MOC software deliveries.
When this capability is fully operational it should greatly reduce the time necessary

  3. xSyn: A Software Tool for Identifying Sophisticated 3-Way Interactions From Cancer Expression Data

    Directory of Open Access Journals (Sweden)

    Baishali Bandyopadhyay

    2017-08-01

    Full Text Available Background: Constructing gene co-expression networks from cancer expression data is important for investigating the genetic mechanisms underlying cancer. However, correlation coefficients or linear regression models are not able to model sophisticated relationships among gene expression profiles. Here, we address the 3-way interaction in which 2 genes' expression levels cluster in different spatial locations under the control of a third gene's expression levels. Results: We present xSyn, a software tool for identifying such 3-way interactions from cancer gene expression data based on an optimization procedure involving the usage of UPGMA (Unweighted Pair Group Method with Arithmetic Mean) and synergy. The effectiveness is demonstrated by application to 2 real gene expression data sets. Conclusions: xSyn is a useful tool for decoding the complex relationships among gene expression profiles. xSyn is available at http://www.bdxconsult.com/xSyn.html.

  4. RSYST: From nuclear reactor calculations towards a highly sophisticated scientific software integration environment

    International Nuclear Information System (INIS)

    Noack, M.; Seybold, J.; Ruehle, R.

    1996-01-01

    The software environment RSYST was originally used to solve problems of reactor physics. The consideration of advanced scientific simulation requirements and the strict application of modern software design principles led to a system which is well suited to solving problems in various complex scientific problem domains. Starting with a review of the early days of RSYST, we describe the straight evolution driven by the need for a software environment that combines the advantages of a high-performance database system with the capability to integrate sophisticated scientific technical applications. The RSYST architecture is presented and the data modelling capabilities are described. To demonstrate the powerful possibilities and flexibility of the RSYST environment, we describe a wide range of RSYST applications, e.g., mechanical simulations of multibody systems, which are used in biomechanical research, civil engineering and robotics. In addition, a hypermedia system which is used for scientific technical training and documentation is presented. (orig.) [de]

  5. Exploring the possibility of using digital image processing technique to detect diseases of rice leaf

    Directory of Open Access Journals (Sweden)

    S. H Peyman

    2016-04-01

    Full Text Available Introduction: Rice is a very important staple food crop that provides more than half of the world's caloric supply. Rice diseases lead to significant annual crop losses, have negative impacts on the quality of the final product and destroy plant variety. Rice blast is one of the most widespread and most destructive fungal diseases in tropical and subtropical humid areas, causing significant decreases in paddy yield and in the quality of milled rice. Brown spot is another important fungal disease of rice, which infects the plant throughout the growing season, from the nursery period up to the farm growth and productivity stages. The later the disease is diagnosed, the more chemicals are needed for treatment. Because of the high costs and harmful environmental impacts of chemical toxins, accurate early detection and treatment of plant disease is necessary. In general, observation with the naked eye is used for disease detection. However, the results depend on the skill of the person performing the inspection, so the severity and progression of the disease usually cannot be determined accurately. On the other hand, using experts for continuous monitoring of large farms can be prohibitively expensive and time consuming. Thus, investigating new approaches for rapid, automated, inexpensive and accurate plant disease diagnosis is very important. Machine vision and image processing can capture images from a scene of interest, analyze the images and accurately extract the desired information. Studies show that image processing techniques have been successfully used for plant disease detection. The aim of this study was to investigate the ability of image processing techniques for diagnosing rice blast and rice brown spot.
Materials and Methods: The samples of rice leaf infected by brown spot and rice blast diseases were collected from rice fields and

  6. Amorphous Calcium Phosphate Formation and Aggregation Process Revealed by Light Scattering Techniques

    Directory of Open Access Journals (Sweden)

    Vida Čadež

    2018-06-01

    Full Text Available Amorphous calcium phosphate (ACP) attracts attention as a precursor in the formation of crystalline calcium phosphates (CaPs) in vitro and in vivo, as well as for its excellent biological properties. Its formation can be considered an aggregation process. Although the aggregation of ACP is of interest both for gaining a fundamental understanding of biomineral formation and for the synthesis of novel materials, it has still not been investigated in detail. In this work, ACP aggregation was followed by two techniques widely applied to nanoparticle aggregation in general: dynamic light scattering (DLS) and laser diffraction (LD). In addition, ACP formation was followed by potentiometric measurements, and the formed precipitates were characterized by Fourier transform infrared spectroscopy (FTIR), powder X-ray diffraction (PXRD), transmission electron microscopy (TEM), and atomic force microscopy (AFM). The results showed that the aggregation of ACP particles is a process which, from the earliest stages, simultaneously takes place over a wide range of length scales, from nanometers to micrometers, leading to a highly polydisperse precipitation system, with polydispersity and the vol. % of larger aggregates increasing with concentration. The obtained results provide insight into ways of regulating ACP, and consequently CaP, formation by controlling aggregation on the scale of interest.

  7. Development of educational system for nuclear power plant operators using knowledge processing techniques

    International Nuclear Information System (INIS)

    Yoshimura, Seiichi

    1990-01-01

    It is important to carry out effective education, optimally adapted to the operator's knowledge level, to enhance the operator's ability to deal with abnormal situations. This paper outlines an educational system that realizes effective education using knowledge-processing techniques. The system is composed of three devices. One is a knowledge-processing computer that evaluates the operator's knowledge level and presents educational materials optimally adapted to that level. Another is a computer for displaying transients and plant equipment. The third is a computer for voice input and output. The educational materials utilize cause-and-effect relationships, making it possible to perform effective education by pointing out the parts the operator failed to understand using these relationships. An evaluation test was performed with several tens of operators actually operating the system, and impressions were then gathered by questionnaire. As a result, the cause-and-effect relationships proved useful for understanding the transients, and the contents of the educational materials and the display pictures were also deemed to have practical value. (author)

  8. Improving Vintage Seismic Data Quality through Implementation of Advance Processing Techniques

    Science.gov (United States)

    Latiff, A. H. Abdul; Boon Hong, P. G.; Jamaludin, S. N. F.

    2017-10-01

    It is essential in petroleum exploration to have high-resolution subsurface images, both vertically and horizontally, to uncover new geological and geophysical aspects of the subsurface. A lack of success may stem from poor imaging quality, which leads to inaccurate analysis and interpretation. In this work, we re-processed the existing seismic dataset with an emphasis on two objectives. Firstly, to produce better 3D seismic data quality with full retention of relative amplitudes and significantly reduced seismic and structural uncertainty. Secondly, to facilitate further prospect delineation through enhanced data resolution, fault definition and event continuity, particularly in the syn-rift section and basement cover contacts, and in turn to better understand the geology of the subsurface, especially the distribution of the fluvial and channel sands. By adding recent, state-of-the-art broadband processing techniques such as source and receiver de-ghosting, high-density velocity analysis and shallow-water de-multiple, the final results produced better overall reflection detail and frequency in specific target zones, particularly in the deeper section.

  9. A comparative analysis of pre-processing techniques in colour retinal images

    International Nuclear Information System (INIS)

    Salvatelli, A; Bizai, G; Barbosa, G; Drozdowicz, B; Delrieux, C

    2007-01-01

    Diabetic retinopathy (DR) is a chronic disease of the ocular retina, which most of the time is only discovered when the disease is at an advanced stage and most of the damage is irreversible. For that reason, early diagnosis is paramount for avoiding the most severe consequences of DR, of which complete blindness is not uncommon. Unsupervised or supervised processing of retinal images emerges as a feasible tool for this diagnosis. The preprocessing stages are the key to any further assessment, since these images exhibit several defects, including non-uniform illumination, sampling noise, uneven contrast due to pigmentation loss during sampling, and many others. Any feasible diagnosis system should work with images in which these defects have been compensated. In this work we analyze and test several correction techniques. Non-uniform illumination is compensated using morphology and homomorphic filtering; uneven contrast is compensated using morphology and local enhancement. We tested our processing stages using Fuzzy C-Means and the local Hurst (self-correlation) coefficient for unsupervised segmentation of the abnormal blood vessels. The results over a standard set of DR images are more than promising.
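Illumination compensation of the kind described can be approximated by dividing each pixel by a local background estimate; this window-mean sketch is a crude stand-in for the morphological and homomorphic filters actually tested, and the ramp-lit test image is an illustrative assumption.

```python
def illumination_correct(img, radius=2):
    """Divide each pixel by a local background estimate (window mean)
    to flatten slowly varying, non-uniform illumination."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            win = [img[ny][nx]
                   for ny in range(max(0, y - radius), min(h, y + radius + 1))
                   for nx in range(max(0, x - radius), min(w, x + radius + 1))]
            out[y][x] = img[y][x] / (sum(win) / len(win))
    return out

# A flat scene under a left-to-right illumination ramp: after correction
# the values cluster near 1.0 instead of spanning 100..180.
ramp = [[100 + 20 * x for x in range(5)] for _ in range(5)]
flat = illumination_correct(ramp)
spread = (max(v for row in flat for v in row)
          - min(v for row in flat for v in row))
```

Flattening the illumination first is what allows a single segmentation rule, such as Fuzzy C-Means, to behave consistently across the whole retina.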

  10. The application of irradiation techniques for food preservation and processing improvement

    Energy Technology Data Exchange (ETDEWEB)

    Byun, Myung Woo; Cho, Han Ok; Jo, Sung Ki; Yook, Hong Sun; Kwon, Oh Jin; Yang, Jae Seung; Kim, Sung; Im, Sung Il

    1997-09-01

    This project intended to develop alternative techniques, based on safe irradiation methods, for food processing and utilization in the food industry. For improvement of the rheology and processing of corn starch by irradiation, the production of modified starch with low viscosity as well as excellent viscosity stability became feasible by controlling the gamma irradiation dose and the amount of inorganic peroxides added to the starch. The project also developed methods for improving the hygienic quality and long-term storage of dried red pepper by gamma irradiation. In Korean medicinal plants, 10 kGy gamma irradiation was effective for improving sanitary quality and increasing the extraction yield of major components. For the sanitization of health and convenience foods, gamma irradiation was more effective than ozone treatment in decontaminating microorganisms, with minimal effect on the physicochemical properties analysed. In the wholesomeness evaluation, the gamma-irradiated Korean medicinal plants were judged safe from the genotoxic point of view. Thirteen groups of irradiated foods were approved for human consumption by the Korean Ministry of Health and Welfare. (author). 81 refs., 74 tabs.

  11. Modeling and Simulation of Voids in Composite Tape Winding Process Based on Domain Superposition Technique

    Science.gov (United States)

    Deng, Bo; Shi, Yaoyao

    2017-11-01

    The tape winding technology is an effective way to fabricate rotationally symmetric composite structures. Nevertheless, some inevitable defects seriously influence the performance of winding products. One of the crucial ways to assess the quality of fiber-reinforced composite products is to examine their void content; significant improvement in a product's mechanical properties can be achieved by minimizing void defects. Two methods were applied in this study, finite element analysis and experimental testing, to investigate the mechanism of void formation in composite tape winding processing. Based on the theories of interlayer intimate contact and the Domain Superposition Technique (DST), a three-dimensional model of prepreg tape voids was built in SolidWorks. ABAQUS simulation software was then used to simulate how the void content changes with pressure and temperature. Finally, a series of experiments was performed to determine the accuracy of the model-based predictions. The results showed that the model is effective for predicting the void content in the composite tape winding process.

  12. Remedial Investigation/Feasibility Study (RI/FS) process, elements and techniques guidance

    Energy Technology Data Exchange (ETDEWEB)

    1993-12-01

    This manual provides detailed guidance on Remedial Investigation/Feasibility Studies (RI/FSs) conducted pursuant to the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) at Department of Energy (DOE) facilities. The purpose of the RI/FS, to assess the risk posed by a hazardous waste site and to determine the best way to reduce that risk, and its structure (site characterization, risk assessment, screening and detailed analysis of alternatives, etc.) is defined in the National Oil and Hazardous Substances Pollution Contingency Plan (NCP) and further explained in the Environmental Protection Agency's (EPA's) Guidance for Conducting Remedial Investigations and Feasibility Studies Under CERCLA (Interim Final) 540/G-89/004, OSWER Directive 9355.3-01, October 1988. Though issued in 1988, the EPA guidance remains an excellent source of information on the conduct and structure of an RI/FS. This document makes use of supplemental RI/FS-related guidance that EPA has developed since its initial document was issued in 1988, incorporates practical lessons learned in more than 12 years of experience in CERCLA hazardous site remediation, and drawing on those lessons, introduces the Streamlined Approach For Environmental Restoration (SAFER), developed by DOE as a way to proceed quickly and efficiently through the RI/FS process at DOE facilities. Thus as its title implies, this guidance is intended to describe in detail the process and component elements of an RI/FS, as well as techniques to manage the RI/FS effectively.

  13. Evaluation of area strain response of dielectric elastomer actuator using image processing technique

    Science.gov (United States)

    Sahu, Raj K.; Sudarshan, Koyya; Patra, Karali; Bhaumik, Shovan

    2014-03-01

    Dielectric elastomer actuators (DEAs) are a kind of soft actuator that can produce significantly large electric-field-induced actuation strain and may serve as a basic unit of artificial muscles and robotic elements. Understanding strain development in a pre-stretched sample at different regimes of electric field is essential for potential applications. In this paper, we report ongoing work on the determination of area strain using a digital camera and an image processing technique. The setup, developed in-house, consists of a low-cost digital camera, data acquisition, and an image processing algorithm. Samples were prepared from biaxially stretched acrylic tape supported between two cardboard frames. Carbon grease was pasted on both sides of the sample as compliant electrodes that can follow large electric-field-induced deformation. Images were grabbed before and after the application of high voltage. From the incremental image area, strain was calculated as a function of applied voltage on a pre-stretched dielectric elastomer (DE) sample. Area strain was plotted against the applied voltage for different pre-stretched samples. Our study shows that the area strain exhibits a nonlinear relationship with applied voltage. For the same voltage, a higher area strain was generated in samples with a higher pre-stretch value. Our characterization also matches well with previously published results obtained with a costly video extensometer. The study may help designers fabricate biaxially pre-stretched planar actuators from similar kinds of materials.
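    The measurement pipeline above reduces to counting electrode pixels before and after actuation. The following Python/NumPy sketch illustrates that idea on synthetic frames; the threshold, frame size, and disc geometry are hypothetical stand-ins for the authors' camera data, not their actual algorithm.

```python
import numpy as np

def electrode_area(img, threshold=128):
    """Count pixels belonging to the (dark) carbon-grease electrode."""
    return int(np.sum(img < threshold))

def area_strain(img_before, img_after, threshold=128):
    """Area strain = incremental electrode area / initial electrode area."""
    a0 = electrode_area(img_before, threshold)
    a1 = electrode_area(img_after, threshold)
    return (a1 - a0) / a0

def disc_image(radius, size=200):
    """Synthetic frame: a dark disc (the electrode) on a bright background."""
    y, x = np.ogrid[:size, :size]
    img = np.full((size, size), 255, dtype=np.uint8)
    img[(x - size // 2) ** 2 + (y - size // 2) ** 2 <= radius ** 2] = 0
    return img

# An electrode that expands ~10% in radius, i.e. ~21% in area.
before, after = disc_image(40), disc_image(44)
strain = area_strain(before, after)
```

With a ~10% radius increase, the counted area grows by roughly 21%, matching the geometric expectation.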

  14. Deformation processes in functional materials studied by in situ neutron diffraction and ultrasonic techniques

    International Nuclear Information System (INIS)

    Sittner, P.; Novak, V.; Landa, M.; Lukas, P.

    2007-01-01

    The unique thermomechanical functions of shape memory alloys (hysteretic stress-strain-temperature responses), rather than their structural properties (strength, fatigue, corrosion resistance, etc.), are primarily utilized in engineering applications. In order to better understand and predict the functional behavior, we have recently employed two dedicated non-invasive in situ experimental methods capable of following the deformation/transformation processes in thermomechanically loaded polycrystalline samples. The in situ neutron diffraction method takes advantage of the ability of thermal neutrons to penetrate bulk samples. As a diffraction technique sensitive to interplanar spacings in crystalline solids, it provides in situ information on the changes in crystal structure, phase composition, phase stress and texture in the transforming samples. The combined in situ ultrasonic and electric resistance method follows variations of the electric resistance as well as the speed and attenuation of acoustic waves propagating through the transforming sample. The acoustic waves are mainly sensitive to changes of the elastic properties accompanying the deformation/transformation processes. The latter method thus follows the changes in interatomic bonds rather than the changes in interplanar lattice spacings probed by the neutron diffraction method. The methods are thus complementary. They are briefly described, and selected experimental results obtained recently on NiTi alloys are presented and discussed.

  15. Digital pulse processing techniques for high resolution amplitude measurement of radiation detector

    International Nuclear Information System (INIS)

    Singhai, P.; Roy, A.; Dhara, P.; Chatterjee, S.

    2012-01-01

    Digital pulse processing techniques for high resolution amplitude measurement of radiation detector pulses are an effective replacement for expensive and bulky analog processing, as the digital domain offers higher channel density and is at the same time cheaper. We have demonstrated a prototype digital setup with a high-speed sampling ADC (sampling frequency of 80-125 MHz) followed by a series of IIR filters for pulse shaping in a trigger-less acquisition mode. The IIR filters, peak detection algorithm and data write-out logic were written in VHDL and implemented on an FPGA. We used CAMAC as the readout platform. In conjunction with the full hardware implementation, we also used a mixed platform with a VME digitizer card with raw-sample readout using C code. The rationale behind this mixed platform is to test out various filter algorithms quickly in C and also to benchmark the performance of the chip-level ADCs against a standard commercial digitizer in terms of noise and resolution. The paper describes the implementation of both methods and the performance obtained with each. (author)
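    As a rough software analogue of the FPGA pipeline described above (the authors' filters run in VHDL), the following Python sketch shapes a sampled detector pulse with a single-pole high-pass followed by a single-pole low-pass IIR stage and then picks the peak; the filter coefficients, decay constant, and noise level are illustrative assumptions, not the authors' design values.

```python
import numpy as np

def single_pole_hp(x, alpha):
    """CR (DC-blocking) high-pass: y[n] = alpha * (y[n-1] + x[n] - x[n-1])."""
    y = np.zeros_like(x)
    for n in range(1, len(x)):
        y[n] = alpha * (y[n - 1] + x[n] - x[n - 1])
    return y

def single_pole_lp(x, alpha):
    """RC low-pass: y[n] = y[n-1] + alpha * (x[n] - y[n-1])."""
    y = np.zeros_like(x)
    for n in range(1, len(x)):
        y[n] = y[n - 1] + alpha * (x[n] - y[n - 1])
    return y

def shape_and_measure(pulse, alpha_hp=0.95, alpha_lp=0.1):
    """CR-RC-like shaping followed by a naive peak search."""
    shaped = single_pole_lp(single_pole_hp(pulse, alpha_hp), alpha_lp)
    peak_idx = int(np.argmax(shaped))
    return peak_idx, float(shaped[peak_idx])

# Synthetic sampled detector pulse: step at n = 100 with a slow exponential
# decay, plus a little baseline noise.
rng = np.random.default_rng(0)
n = np.arange(1000, dtype=float)
pulse = np.where(n >= 100, np.exp(-(n - 100) / 300.0), 0.0)
pulse += rng.normal(0.0, 0.005, n.size)
peak_idx, peak_amp = shape_and_measure(pulse)
```

The shaped pulse peaks shortly after the arrival sample, which is what the hardware peak-detection logic would latch.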

  16. A comparative analysis of pre-processing techniques in colour retinal images

    Energy Technology Data Exchange (ETDEWEB)

    Salvatelli, A [Artificial Intelligence Group, Facultad de Ingenieria, Universidad Nacional de Entre Rios (Argentina); Bizai, G [Artificial Intelligence Group, Facultad de Ingenieria, Universidad Nacional de Entre Rios (Argentina); Barbosa, G [Artificial Intelligence Group, Facultad de Ingenieria, Universidad Nacional de Entre Rios (Argentina); Drozdowicz, B [Artificial Intelligence Group, Facultad de Ingenieria, Universidad Nacional de Entre Rios (Argentina); Delrieux, C [Electric and Computing Engineering Department, Universidad Nacional del Sur, Alem 1253, BahIa Blanca, (Partially funded by SECyT-UNS) (Argentina)], E-mail: claudio@acm.org

    2007-11-15

    Diabetic retinopathy (DR) is a chronic disease of the ocular retina, which most of the time is discovered only when the disease is at an advanced stage and most of the damage is irreversible. For that reason, early diagnosis is paramount for avoiding the most severe consequences of DR, of which complete blindness is not uncommon. Unsupervised or supervised image processing of retinal images emerges as a feasible tool for this diagnosis. The preprocessing stages are the key to any further assessment, since these images exhibit several defects, including non-uniform illumination, sampling noise, uneven contrast due to pigmentation loss during sampling, and many others. Any feasible diagnosis system should work with images in which these defects have been compensated. In this work we analyze and test several correction techniques. Non-uniform illumination is compensated using morphology and homomorphic filtering; uneven contrast is compensated using morphology and local enhancement. We tested our processing stages using Fuzzy C-Means and the local Hurst (self-correlation) coefficient for unsupervised segmentation of the abnormal blood vessels. The results over a standard set of DR images are more than promising.
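    Of the correction techniques listed above, homomorphic filtering for non-uniform illumination is the easiest to sketch: work in the log domain, where illumination becomes additive, subtract a smooth estimate of it, and map back. The kernel width, image sizes, and the synthetic gradient below are hypothetical, not the authors' settings.

```python
import numpy as np

def gaussian_kernel(sigma):
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def blur(img, sigma):
    """Separable Gaussian blur ('same' convolution, zero-padded edges)."""
    k = gaussian_kernel(sigma)
    rows = np.apply_along_axis(np.convolve, 1, img, k, mode='same')
    return np.apply_along_axis(np.convolve, 0, rows, k, mode='same')

def homomorphic_correction(img, sigma=10.0):
    """Work in the log domain, subtract a smooth estimate of the
    illumination field, and map back."""
    logimg = np.log1p(img)
    illumination = blur(logimg, sigma)
    corrected = logimg - illumination + illumination.mean()
    return np.expm1(corrected)

# Demo: a flat scene observed under a left-to-right illumination gradient.
h, w = 96, 256
scene = np.full((h, w), 100.0)
observed = scene * np.linspace(0.5, 1.5, w)[None, :]
restored = homomorphic_correction(observed)

# Away from the zero-padded borders the gradient is largely removed.
interior = np.s_[35:61, 45:211]
```

In the interior of the frame, the restored image is nearly flat even though the observed one varies by a factor of three across its width.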

  17. A scintillation detector signal processing technique with active pileup prevention for extending scintillation count rates

    International Nuclear Information System (INIS)

    Wong, W.H.; Li, H.

    1998-01-01

    A new method for processing signals from scintillation detectors is proposed for very high count-rate situations where multiple-event pileups are the norm. This method is designed to sort out and recover every impinging event from multiple-event pileups while maximizing the collection of the scintillation signal for every event, to achieve optimal accuracy in determining the energy of the event. For every detected event, this method cancels the remnant signals from previous events and excludes the pileup of signals from following events. With this technique, pileup events can be recovered and the energy of every recovered event can be optimally measured despite multiple pileups. A prototype circuit demonstrated that the maximum count rates are increased by more than 10 times compared with the standard pulse-shaping method, while the energy resolution is as good as that of the pulse-shaping (or fixed-integration) method at normal count rates. At 2 x 10^6 events/s for NaI(Tl), the true counts acquired are 3 times more than with the delay-line clipping method (commonly used in fast processing designs), due to events recovered from pileups. Pulse-height spectra up to 3.5 x 10^6 events/s have been studied.
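    The core idea of cancelling remnant signals from previous events can be sketched in a few lines: read each event's amplitude at its arrival, subtract its extrapolated exponential tail, and measure the next event on the cleaned residual. This sketch assumes an idealized instant-rise pulse with a known decay constant; the prototype's actual circuit-level implementation is not described at this level of detail in the abstract.

```python
import numpy as np

TAU = 250.0  # assumed known scintillator decay constant, in samples

def tail(amplitude, t0, t, tau=TAU):
    """Extrapolated exponential remnant of an event arriving at t0."""
    out = np.zeros_like(t)
    mask = t >= t0
    out[mask] = amplitude * np.exp(-(t[mask] - t0) / tau)
    return out

def recover_amplitudes(signal, arrival_times, tau=TAU):
    """Measure each event, then cancel its remnant before the next one."""
    t = np.arange(signal.size, dtype=float)
    residual = signal.astype(float).copy()
    amplitudes = []
    for t0 in arrival_times:
        a = float(residual[int(t0)])   # amplitude read at the arrival sample
        amplitudes.append(a)
        residual = residual - tail(a, t0, t, tau)
    return amplitudes

# Two piled-up pulses only 80 samples apart (well inside one decay time).
t = np.arange(1000, dtype=float)
signal = tail(1.0, 100, t) + tail(0.6, 180, t)
amplitudes = recover_amplitudes(signal, [100, 180])
```

Even though the second pulse rides on the tail of the first, both amplitudes are recovered once the first pulse's remnant is subtracted.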

  18. Radioisotope techniques for process optimisation and control in the offshore oil and gas industries

    International Nuclear Information System (INIS)

    Charlton, J.S.

    2002-01-01

    For over fifty years, radioisotope technology has been used by the oil industry to solve problems and to help optimise process operations. The widespread development of offshore oil and gas fields has brought, and continues to bring, new challenges and, in response, new or modified applications of radioisotope technology have been introduced. This paper presents case studies which illustrate the use of radioisotopes, both in the sub-sea environment and on offshore production platforms. On the platform, radioisotope techniques, applied singly or in combination, have been used for the performance assessment of oil/gas separation and gas dehydration units. Novel nucleonic instrumentation has been developed for the control of three-phase separators. Sub-sea, radioactive tracers and/or sealed sources have been used to investigate the integrity of submerged structures and to troubleshoot pipeline problems. The continuing expansion in the use of this technology stems from the industry's increasing awareness of its versatility and from the fact that the benefits it confers can be obtained at a relatively modest cost. Examples of economic benefit described in the paper are associated with production enhancements derived from the ability of radioisotope technology to measure performance and diagnose problems on line, without disrupting process operations in any way. (Author)

  19. Evaluation of improved techniques for removing strontium and cesium from process wastewater and groundwater

    International Nuclear Information System (INIS)

    Bostick, D.

    1996-01-01

    The goal of this task is to evaluate new sorbent materials, ion-exchange materials, or other processes for groundwater and process wastewater decontamination that will be more selective for the removal of 90Sr and 137Cs than standard treatment methods. Laboratory studies will strive to obtain a quantitative understanding of the behavior of these new materials and to evaluate their sorption efficiency in reference to a standard benchmark treatment technique. Testing of the new materials will begin by conducting scoping tests where new treatment materials are compared with standard, commercially available materials in batch shaker tests. Sorption tests will be performed under various treatment conditions (e.g., pH, temperature, simulant waste composition) for the most promising materials. Additional testing with actual wastewater will be conducted with two or three of the most effective treatment methods. Once batch testing of a treatment method is completed, dynamic column tests will be performed using the most successful sorbents, to obtain the defining column operating parameters.

  20. The application of irradiation techniques for food preservation and processing improvement

    International Nuclear Information System (INIS)

    Byun, Myung Woo; Cho, Han Ok; Jo, Sung Ki; Yook, Hong Sun; Kwon, Oh Jin; Yang, Jae Seung; Kim, Sung; Im, Sung Il.

    1997-09-01

    This project intended to develop alternative techniques, based on safe irradiation methods, to be used in the food industry for food processing and utilization. For improvement of the rheology and processing of corn starch by irradiation, the production of modified starch with low viscosity as well as excellent viscosity stability became feasible by controlling the gamma irradiation dose level and the amount of inorganic peroxides added to the starch. This project also developed improved methods for the hygienic quality and long-term storage of dried red pepper by gamma irradiation. In Korean medicinal plants, 10 kGy gamma irradiation was effective for improving sanitary quality and increasing the extraction yield of major components. For the sanitization of health and convenience foods, gamma irradiation was more effective than ozone treatment in the decontamination of microorganisms, with minimal effect on the physicochemical properties analysed. In the evaluation of wholesomeness, the gamma-irradiated Korean medicinal plants were found to be safe from the genotoxic point of view. Thirteen groups of irradiated foods were approved for human consumption by the Korea Ministry of Health and Welfare. (author). 81 refs., 74 tabs

  1. Spray Drying as a Processing Technique for Syndiotactic Polystyrene to Powder Form for Part Manufacturing Through Selective Laser Sintering

    Science.gov (United States)

    Mys, N.; Verberckmoes, A.; Cardon, L.

    2017-03-01

    Selective laser sintering (SLS) is a rapidly expanding field of the three-dimensional printing concept. One stumbling block in the evolution of the technique is the limited range of materials available for processing with SLS, which keeps the application window small. This article aims at identifying syndiotactic polystyrene (sPS) as a promising material. sPS pellets were processed into powder form with a lab-scale spray dryer with a vibrating nozzle. This technique was chosen because it almost eliminates the agglomeration phenomenon often encountered with solution-based processing techniques. The microspheres obtained were characterized in shape and size by scanning electron microscopy and evaluation of the particle size distribution. The effect the processing technique imparts on the intrinsic properties of the material was examined by differential scanning calorimetry analysis.

  2. A Survey on Optimal Signal Processing Techniques Applied to Improve the Performance of Mechanical Sensors in Automotive Applications

    Science.gov (United States)

    Hernandez, Wilmar

    2007-01-01

    This paper surveys recent applications of optimal signal processing techniques to improve the performance of mechanical sensors. A comparison between classical filters and optimal filters for automotive sensors is made, and the current state of the art in applying robust and optimal control and signal processing techniques to the design of the intelligent (or smart) sensors that today's cars need is presented through several experimental results, which show that the fusion of intelligent sensors and optimal signal processing techniques is the clear way to go. However, the switch between the traditional methods of designing automotive sensors and the new ones cannot be done overnight, because there are some open research issues that have to be solved. This paper draws attention to one of these open research issues and tries to arouse researchers' interest in the fusion of intelligent sensors and optimal signal processing techniques.
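    As one concrete example of the class of optimal filters surveyed here, a scalar Kalman filter applied to a noisy, nearly constant sensor reading can be sketched as follows; the noise variances and signal values are hypothetical, not taken from the paper's experiments.

```python
import numpy as np

def kalman_1d(measurements, q=1e-4, r=0.04, x0=0.0, p0=1.0):
    """Scalar Kalman filter for a (nearly) constant physical quantity.
    q: process-noise variance, r: measurement-noise variance."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                # predict
        k = p / (p + r)          # Kalman gain
        x = x + k * (z - x)      # correct with the new measurement
        p = (1.0 - k) * p
        estimates.append(x)
    return np.array(estimates)

# Hypothetical sensor: constant true value read through Gaussian noise.
rng = np.random.default_rng(1)
true_value = 5.0
z = true_value + rng.normal(0.0, 0.2, 500)
est = kalman_1d(z, r=0.2 ** 2)

rmse_raw = float(np.sqrt(np.mean((z - true_value) ** 2)))
rmse_kf = float(np.sqrt(np.mean((est[100:] - true_value) ** 2)))
```

After a short burn-in, the filtered estimate is markedly less noisy than the raw readings, which is the kind of gain over classical fixed filters the survey discusses.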

  3. The effect of processing techniques on microstructural and tribological properties of copper-based alloys

    International Nuclear Information System (INIS)

    Vencl, Aleksandar; Rajkovic, Viseslava; Zivic, Fatima; Mitrović, Slobodan; Cvijović-Alagić, Ivana; Jovanovic, Milan T.

    2013-01-01

    Three copper-based alloys, i.e. two composites reinforced with Al2O3 particles and fabricated through the powder metallurgy (PM) route, and a Cu–Cr–Zr alloy processed by the vacuum melting and casting technique, were the object of this investigation. A light microscope, a scanning electron microscope (SEM) equipped with an electron X-ray spectrometer (EDS) and a transmission electron microscope (TEM) were used for microstructural characterization. Wear and friction tests were performed on a ball-on-disc nanotribometer at low sliding speeds (6, 8 and 10 mm/s) and constant load (1 N). The objective of the paper was to investigate the effect of the different processing techniques on the microstructure, thermal stability and tribological characteristics of the composites and the copper ingot alloy. Nano-sized Al2O3 particles (less than 100 nm in size) are not only present in the copper matrix of the Cu–2.5 wt.% Al composite, obtained by internal oxidation, but are also formed at the grain boundaries, preventing grain growth and providing a very small grain size. During high temperature annealing (in the range 300–950 °C) the composites behaved much better than the ingot alloy. The Cu–2.5 wt.% Al composite showed the highest thermal stability. The pinning effect of the nano-sized Al2O3 particles prevents grain growth, slowing down recrystallization of this composite up to 900 °C. Micro-sized Al2O3 particles in the Cu–5 wt.% Al2O3 composite, processed by mechanical annealing, are not effective in preventing dislocation motion and grain growth, whereas the microstructure of the Cu–0.4 wt.% Cr–0.08 wt.% Zr ingot alloy was completely recrystallized around 550 °C. The Cu–2.5 wt.% Al composite showed the best wear resistance, approximately 2.5 times higher than that of the Cu–5 wt.% Al2O3 composite. High hardness and the nano-sized Al2O3 particles combined with the fine-grain structure are the main parameters leading to the improved wear resistance of the Cu–2.5Al composite.

  4. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    Science.gov (United States)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to assess the process improvement, quality management, and analytical techniques taught to students in U.S. undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs, techniques that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods, and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI®) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques and process performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The research study identifies and discusses in detail the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis that identifies the gaps between the SEI's "healthy ingredients" of a process performance model and courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models.
The research also includes a Monte Carlo simulation optimization
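    The abstract mentions a Monte Carlo simulation optimization without further detail. As a generic illustration of the Monte Carlo process-performance modeling implied by CMMI high-maturity practice, the sketch below estimates the probability of meeting a schedule target from triangular phase-duration models; every distribution and number is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_totals(n_trials=100_000):
    """Draw phase durations (days) from triangular(min, mode, max) models
    and sum them; all three distributions are made-up illustrations."""
    design = rng.triangular(8, 10, 15, n_trials)
    coding = rng.triangular(15, 20, 30, n_trials)
    testing = rng.triangular(10, 12, 20, n_trials)
    return design + coding + testing

totals = simulate_totals()
p_within_50_days = float(np.mean(totals <= 50.0))
```

The simulated distribution of total duration, rather than a single-point estimate, is what lets a quantitatively managed process state a probability of meeting its target.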

  5. ActionScript 3.0 Design Patterns: Object Oriented Programming Techniques

    CERN Document Server

    Sanders, William

    2008-01-01

    If you're an experienced Flash or Flex developer ready to tackle sophisticated programming techniques with ActionScript 3.0, this hands-on introduction to design patterns takes you step by step through the process. You learn about various types of design patterns and construct small abstract examples before trying your hand at building full-fledged working applications outlined in the book.

  6. Effect of lean process improvement techniques on a university hospital inpatient pharmacy.

    Science.gov (United States)

    Hintzen, Barbara L; Knoer, Scott J; Van Dyke, Christie J; Milavitz, Brian S

    2009-11-15

    The effect of lean process improvement on an inpatient university hospital pharmacy was evaluated. The University of Minnesota Medical Center (UMMC), Fairview, implemented lean techniques in its inpatient pharmacy to improve workflow, reduce waste, and achieve substantial cost savings. The sterile products area (SPA) and the inventory area were prospectively identified as locations for improvement due to their potential to realize cost savings. Process-improvement goals for the SPA included the reduction of missing doses, errors, and patient-specific waste by 30%, 50%, and 30%, respectively, and the reallocation of two technician full-time equivalents (FTEs). Reductions in pharmaceutical inventory and returns due to outdating were also anticipated. Workflow in the SPA was improved through the creation of accountability, standard work, and movement toward one-piece flow. Increasing the number of i.v. batches decreased pharmaceutical waste by 40%. Through SPA environment improvements and enhanced workload sharing, two FTE technicians from the SPA were redistributed within the department. SPA waste reduction yielded an annual saving of $275,500. Quality and safety were also improved, as measured by reductions in missing doses, expired products, and production errors. In the inventory area, visual control was improved through the use of a double-bin system, the number of outdated drugs decreased by 20%, and medication inventory was reduced by $50,000. Lean methodology was successfully implemented in the SPA and inventory area at the UMMC, Fairview, inpatient pharmacy. Benefits of this process included an estimated annual cost saving of $289,256 due to waste reduction, improvements in workflow, and decreased staffing requirements.

  7. Optical signal processing techniques and applications of optical phase modulation in high-speed communication systems

    Science.gov (United States)

    Deng, Ning

    the speed limitation of electronics. Thus, all-optical signal processing techniques are highly desirable to support the necessary optical switching functionalities in future ultrahigh-speed optical packet-switching networks. To cope with the wide use of optical phase-modulated signals, an all-optical logic gate for DPSK or PSK input signals is developed in this thesis for the first time. Based on four-wave mixing in a semiconductor optical amplifier, the structure of the logic gate is simple, compact, and capable of supporting ultrafast operation. In addition to general logic processing, a simple label recognition scheme, as a specific signal processing function, is proposed for phase-modulated label signals. The proposed scheme can recognize any incoming label pattern according to the local pattern, and is potentially capable of handling variable-length label patterns. Optical access network with multicast overlay and centralized light sources. In the arena of optical access networks, wavelength division multiplexing passive optical network (WDM-PON) is a promising technology for delivering high-speed data traffic. However, most proposed WDM-PONs only support conventional point-to-point service and cannot meet the increasing demand for broadcast and multicast services. In this thesis, a simple network upgrade based on the traditional PON architecture is proposed to support both point-to-point and multicast services. In addition, the two service signals are modulated on the same lightwave carrier. The upstream signal is also remodulated on the same carrier at the optical network unit, which can significantly relax the requirement on wavelength management at the network unit.

  8. Assessment of myocardial metabolism by PET - a sophisticated dream or clinical reality

    International Nuclear Information System (INIS)

    Schelbert, H.R.

    1986-01-01

    This symposium reviewed radionuclide techniques for the noninvasive study of regional myocardial metabolism and spanned a wide range of topics. New radiotracers for probing different metabolic pathways or selected biochemical reaction steps were presented. New information on tracers already in use was forthcoming. Because the imaging device can measure only concentrations of radiolabel in tissue, other studies examined relationships between uptake and turnover of radioactivity in tissue as an externally observed signal, the chemical fate of the label, and the biologic process under study. Other studies formulated these relationships through tracer compartment models, which are fundamental to quantifying regional physiologic processes externally. Other investigations applied radiotracer methods to experimental models of cardiac disease and to patients. They described findings of regional or global alterations in substrate metabolism. These observations highlighted the potential clinical value of this new methodology. At the same time, several of these observations remain at present without mechanistic explanation; yet they form the foundation on which working hypotheses can be built, which in turn can be tested in vivo. (orig.)

  9. Assessment of myocardial metabolism by PET - a sophisticated dream or clinical reality

    Energy Technology Data Exchange (ETDEWEB)

    Schelbert, H R

    1986-08-01

    This symposium reviewed radionuclide techniques for the noninvasive study of regional myocardial metabolism and spanned a wide range of topics. New radiotracers for probing different metabolic pathways or selected biochemical reaction steps were presented. New information on tracers already in use was forthcoming. Because the imaging device can measure only concentrations of radiolabel in tissue, other studies examined relationships between uptake and turnover of radioactivity in tissue as an externally observed signal, the chemical fate of the label, and the biologic process under study. Other studies formulated these relationships through tracer compartment models, which are fundamental to quantifying regional physiologic processes externally. Other investigations applied radiotracer methods to experimental models of cardiac disease and to patients. They described findings of regional or global alterations in substrate metabolism. These observations highlighted the potential clinical value of this new methodology. At the same time, several of these observations remain at present without mechanistic explanation; yet they form the foundation on which working hypotheses can be built, which in turn can be tested in vivo.

  10. Multi-criteria approach with linear combination technique and analytical hierarchy process in land evaluation studies

    Directory of Open Access Journals (Sweden)

    Orhan Dengiz

    2018-01-01

    Full Text Available Land evaluation analysis is a prerequisite to achieving optimum utilization of the available land resources. Lack of knowledge of the best combination of factors that suit production of yields has contributed to low production. The aim of this study was to determine the most suitable areas for agricultural uses. For that reason, in order to determine the land suitability classes of the study area, a multi-criteria approach was used with the linear combination technique and the analytical hierarchy process, taking into consideration land and soil physico-chemical characteristics such as slope, texture, depth, drainage, stoniness, erosion, pH, EC, CaCO3 and organic matter. These data and land mapping units were taken from a digital detailed soil map at a scale of 1:5,000. In addition, a GIS program was used to produce the land suitability map of the study area. This study was carried out at the Mahmudiye, Karaamca, Yazılı, Çiçeközü, Orhaniye and Akbıyık villages in the Yenişehir district of Bursa province. The total study area is 7059 ha; 6890 ha of it has been used for irrigated agriculture, dry farming and pasture, while 169 ha has been used for non-agricultural activities such as settlement, roads and water bodies. The average annual temperature and precipitation of the study area are 16.1 °C and 1039.5 mm, respectively. Finally, after determination of the land suitability distribution classes for the study area, it was found that 15.0% of the study area is highly (S1) or moderately (S2) suitable, while 85% is marginally suitable or unsuitable, coded as S3 and N. The results of the linear combination technique were also compared with other hierarchical approaches such as the Land Use Capability Classification and Suitability Class for Agricultural Use methods.
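    The weighting scheme described above (AHP for criterion weights, then a weighted linear combination of criterion scores) can be sketched as follows; the pairwise judgments and the two mapping-unit scores are hypothetical illustrations, not the study's data.

```python
import numpy as np

def ahp_weights(pairwise):
    """Criterion weights from the principal eigenvector of a pairwise
    comparison matrix (Saaty's analytical hierarchy process)."""
    vals, vecs = np.linalg.eig(np.asarray(pairwise, dtype=float))
    principal = np.real(vecs[:, np.argmax(np.real(vals))])
    w = np.abs(principal)
    return w / w.sum()

def suitability(unit_scores, weights):
    """Weighted linear combination of normalized criterion scores."""
    return np.asarray(unit_scores, dtype=float) @ weights

# Hypothetical 3-criterion example: slope vs. soil depth vs. organic matter,
# with slope judged 3x as important as depth and 5x as important as OM.
pairwise = [[1.0, 3.0, 5.0],
            [1/3, 1.0, 3.0],
            [1/5, 1/3, 1.0]]
w = ahp_weights(pairwise)

# Normalized (0..1) criterion scores for two hypothetical mapping units.
units = [[0.9, 0.8, 0.6],
         [0.3, 0.5, 0.9]]
scores = suitability(units, w)
```

Thresholding the resulting scores is what assigns each mapping unit to a suitability class (S1, S2, S3, N).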

  11. The taper of cast post preparation measured using innovative image processing technique

    Directory of Open Access Journals (Sweden)

    Al Hyiasat Ahmad S

    2010-08-01

    Full Text Available Abstract Background There is no documentation in the literature about the taper of cast posts. This study was conducted to measure the degree of cast post taper and to evaluate its suitability based on the anatomy of the teeth that are common candidates for post reconstruction. Methods Working casts for cast posts, prepared using Gates Glidden drills, were collected. Impressions of post spaces were made using the polyvinyl siloxane putty/wash technique. A digital camera with a 10' high quality lens was used to capture two digital images of each impression: one in the Facio-Lingual (FL) and the other in the Mesio-Distal (MD) direction. An automated image processing program was developed to measure the degree of canal taper. Data were analyzed using the Statistical Package for Social Sciences software and one-way Analysis of Variance. Results Eighty-four dies for cast posts were collected: 16 for each maxillary anterior tooth subgroup, and 18 for each maxillary and mandibular premolar subgroup. The mean total taper for all preparations was 10.7 degrees. There were no statistical differences among the total taper of all groups (P = .256) or between the MD and FL taper for each subgroup. The mean FL taper of the maxillary first premolars was significantly lower (P = .003) than the FL taper of the maxillary second premolars. The FL taper was higher than the MD taper in all teeth except the maxillary first premolars. Conclusions The taper produced did not reflect the differences among the anatomy of the teeth. While this technique was deemed satisfactory for the maxillary anterior teeth, the same could not be said for the maxillary first premolars. Careful attention to the root anatomy is mandatory.
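    Once the impression's edges are located in a calibrated image, the total taper follows from two width measurements and the post length. The helper below shows that geometry; the measurements plugged in are hypothetical, not values from this study.

```python
import math

def total_taper_degrees(coronal_width, apical_width, post_length):
    """Total (included) taper angle: each wall converges by half of it."""
    half = math.atan((coronal_width - apical_width) / (2.0 * post_length))
    return 2.0 * math.degrees(half)

# Hypothetical calibrated measurements in millimetres (not the study's data).
taper = total_taper_degrees(coronal_width=2.4, apical_width=1.1, post_length=7.0)
```

Applying the same formula to both the FL and MD images gives the two taper values the study compares per tooth subgroup.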

  12. [Motor capacities involved in the psychomotor skills of the cardiopulmonary resuscitation technique: recommendations for the teaching-learning process].

    Science.gov (United States)

    Miyadahira, A M

    2001-12-01

    This is a bibliographic study identifying the motor capacities involved in the psychomotor skills of cardiopulmonary resuscitation (CPR), aiming to obtain subsidies for planning the teaching-learning process of this skill. It was found that the motor capacities involved in the psychomotor skill of the CPR technique are predominantly cognitive and motor, involving 9 perceptive-motor capacities and 8 physical proficiency capacities. The CPR technique is a psychomotor skill classified as open and serial, and categorized as both a fine and a global skill, and the teaching-learning process of the CPR technique has a high degree of complexity.

  13. Sophisticated approval voting, ignorance priors, and plurality heuristics: a behavioral social choice analysis in a Thurstonian framework.

    Science.gov (United States)

    Regenwetter, Michel; Ho, Moon-Ho R; Tsetlin, Ilia

    2007-10-01

    This project reconciles historically distinct paradigms at the interface between individual and social choice theory, as well as between rational and behavioral decision theory. The authors combine a utility-maximizing prescriptive rule for sophisticated approval voting with the ignorance prior heuristic from behavioral decision research and two types of plurality heuristics to model approval voting behavior. When using a sincere plurality heuristic, voters simplify their decision process by voting for their single favorite candidate. When using a strategic plurality heuristic, voters strategically focus their attention on the 2 front-runners and vote for their preferred candidate among these 2. Using a hierarchy of Thurstonian random utility models, the authors implemented these different decision rules and tested them statistically on 7 real world approval voting elections. They cross-validated their key findings via a psychological Internet experiment. Although a substantial number of voters used the plurality heuristic in the real elections, they did so sincerely, not strategically. Moreover, even though Thurstonian models do not force such agreement, the results show, in contrast to common wisdom about social choice rules, that the sincere social orders by Condorcet, Borda, plurality, and approval voting are identical in all 7 elections and in the Internet experiment. PsycINFO Database Record (c) 2007 APA, all rights reserved.

  14. Reducing the absorbed dose in analogue radiography of infant chest images by improving the image quality, using image processing techniques

    International Nuclear Information System (INIS)

    Karimian, A.; Yazdani, S.; Askari, M. A.

    2011-01-01

    Radiographic inspection is one of the most widely employed medical imaging techniques. Because film radiographs suffer from poor contrast and high unsharpness, converting them to a digital format and applying digital image processing is the best way to enhance image quality and assist the interpreter in the evaluation. In this research work, radiographic films of 70 infant chest images with defects of different sizes were selected. To digitise the chest images and enhance them, two classes of algorithms were used: (i) spatial-domain and (ii) frequency-domain techniques. The MATLAB environment was selected for the digital processing. Our results showed that with these two techniques, defects of small dimensions become detectable. These techniques may therefore help medical specialists to diagnose defects at an early stage and help to prevent repeat X-ray examinations of paediatric patients. (authors)
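
    The two enhancement families named in this abstract can be sketched in a few lines. The following is a minimal illustration (in Python with NumPy rather than the MATLAB environment the authors used): histogram equalization as a spatial-domain technique and high-boost filtering as a frequency-domain technique, applied to a synthetic low-contrast image. The parameter values and the test image are invented for the example, not the authors'.

```python
import numpy as np

def equalize_histogram(img):
    """Spatial-domain enhancement: map grey levels through the normalised
    cumulative histogram so the output spreads over the full 0-255 range."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = np.cumsum(hist).astype(np.float64)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())
    return (cdf[img] * 255).astype(np.uint8)

def highboost_filter(img, k=1.5):
    """Frequency-domain enhancement: boost high spatial frequencies by
    amplifying everything a Gaussian low-pass filter would remove."""
    f = np.fft.fftshift(np.fft.fft2(img.astype(np.float64)))
    rows, cols = img.shape
    y, x = np.ogrid[-(rows // 2):rows - rows // 2, -(cols // 2):cols - cols // 2]
    sigma = min(rows, cols) / 8.0
    lowpass = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
    out = np.fft.ifft2(np.fft.ifftshift(f * (1.0 + k * (1.0 - lowpass))))
    return np.clip(out.real, 0, 255).astype(np.uint8)

# Synthetic low-contrast "radiograph": grey values squeezed into [100, 140]
rng = np.random.default_rng(0)
img = rng.integers(100, 141, size=(64, 64), dtype=np.uint8)
eq = equalize_histogram(img)
print("contrast range:", int(img.max()) - int(img.min()), "->", int(eq.max()) - int(eq.min()))
```

    Histogram equalization widens the grey-level range; the high-boost filter sharpens edges, which is what makes small defects stand out.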

  15. Regional seismic lines reprocessed using post-stack processing techniques; National Petroleum Reserve, Alaska

    Science.gov (United States)

    Miller, John J.; Agena, W.F.; Lee, M.W.; Zihlman, F.N.; Grow, J.A.; Taylor, D.J.; Killgore, Michele; Oliver, H.L.

    2000-01-01

    This CD-ROM contains stacked, migrated, 2-dimensional seismic reflection data and associated support information for 22 regional seismic lines (3,470 line-miles) recorded in the National Petroleum Reserve - Alaska (NPRA) from 1974 through 1981. Together, these lines constitute about one-quarter of the seismic data collected as part of the Federal Government's program to evaluate the petroleum potential of the Reserve. The regional lines, which form a grid covering the entire NPRA, were created by combining various individual lines recorded in different years using different recording parameters. These data were reprocessed by the USGS using modern, post-stack processing techniques, to create a data set suitable for interpretation on interactive seismic interpretation computer workstations. Reprocessing was done in support of ongoing petroleum resource studies by the USGS Energy Program. The CD-ROM contains the following files: 1) 22 files containing the digital seismic data in standard, SEG-Y format; 2) 1 file containing navigation data for the 22 lines in standard SEG-P1 format; 3) 22 small scale graphic images of each seismic line in Adobe Acrobat PDF format; 4) a graphic image of the location map, generated from the navigation file, with hyperlinks to the graphic images of the seismic lines; 5) an ASCII text file with cross-reference information for relating the sequential trace numbers on each regional line to the line number and shotpoint number of the original component lines; and 6) an explanation of the processing used to create the final seismic sections (this document). The SEG-Y format seismic files and SEG-P1 format navigation file contain all the information necessary for loading the data onto a seismic interpretation workstation.
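
    The SEG-Y files mentioned above follow a well-defined binary layout: a 3200-byte textual header, a 400-byte binary header, then fixed-size trace headers followed by samples. As a hedged illustration of that layout (SEG-Y rev 1 byte positions), not of the USGS files themselves, the sketch below writes a toy SEG-Y file and counts its traces; the file name, trace count, and sample values are invented for the example.

```python
import os
import struct
import tempfile

def write_minimal_segy(path, traces):
    """Write a toy SEG-Y file: 3200-byte textual header, 400-byte binary
    header (samples/trace at bytes 3221-3222, data format code 5 = 4-byte
    IEEE float at bytes 3225-3226, big-endian), then one 240-byte trace
    header plus the samples for each trace."""
    ns = len(traces[0])
    with open(path, "wb") as f:
        f.write(b" " * 3200)
        binhdr = bytearray(400)
        struct.pack_into(">h", binhdr, 3220 - 3200, ns)   # samples per trace
        struct.pack_into(">h", binhdr, 3224 - 3200, 5)    # IEEE float format
        f.write(binhdr)
        for tr in traces:
            f.write(b"\x00" * 240)                        # trace header
            f.write(struct.pack(f">{ns}f", *tr))

def count_traces(path):
    """Count traces, assuming the fixed trace length declared in the header."""
    with open(path, "rb") as f:
        data = f.read()
    ns = struct.unpack_from(">h", data, 3220)[0]
    return (len(data) - 3600) // (240 + 4 * ns)

path = os.path.join(tempfile.gettempdir(), "toy_line.sgy")
write_minimal_segy(path, [[0.0] * 50] * 22)
n_traces = count_traces(path)
print(n_traces)   # 22
```

    A real reader would also honour the cross-reference file when mapping sequential trace numbers back to the original component lines.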

  16. Techniques of radioactive soil processing at rehabilitation of contamination territories - 59199

    International Nuclear Information System (INIS)

    Volkov, Victor; Chesnokov, Alexander; Danilovich, Alexey; Zverkov, Yury; Koltyshev, Sergey; Semenov, Sergey; Shisha, Anatoly

    2012-01-01

    Rehabilitation of nuclear- and radiation objects assumes dealing with and removal of considerable volumes of a radioactive soil. A similar situation was faced at the remediation of such sufficiently large objects, as old radioactive waste storages at the territory of 'Kurchatov Institute' and elimination of consequences of radiation accident at Podolsk plant of nonferrous metals. At rough estimates the volumes of a radioactive soil at territory of 'Kurchatov institute' were 15-20 thousand m 3 , volumes of a removed soil at carrying out of urgent measures in territory of Kirovo-Chepetsk chemical plant exceeded 20-25 thousand m 3 , volumes of a low active waste at the territory of Podolsk plant may reach 20 thousand m 3 . Such considerable volumes demand creation of technologies of their processing, an effective measuring technique of levels of their contamination and ways of considerable (in times) decrease of their volumes at the expense of decontamination or separation. Works have been aimed at the decision of these problems at carrying out of rehabilitation of territory 'Kurchatov institute'. During works technologies of radiation and water-gravitational separation of a radioactive soil have been offered and are realized in practice. A facility of water -gravitational separation of the soil was created and used within 5 years. It allowed decreasing of volumes of the low active waste in 5-6 times. In further the facility was supplied by a facility of radiation separation of the soil that has raised its efficiency. On turn there is a start-up question in experimental operation of facility of radiation separation of low active slag for Podolsk plant of nonferrous metals. The decision of these problems will allow to gain experience of creation of through technology of the processing of a radioactive soil and decrease in its volumes for using it as a design decisions for rehabilitation of other large scale radioactive-contaminated territories and industrial objects

  17. Estimation of Apple Volume and Its Shape Indentation Using Image Processing Technique and Neural Network

    Directory of Open Access Journals (Sweden)

    M Jafarlou

    2014-04-01

    Full Text Available Physical properties of agricultural products such as volume are among the most important parameters influencing grading and packaging systems, and they should be measured accurately when designing such systems. Image processing and neural network techniques are non-destructive methods recently used for this purpose. In this study, images of apples were captured from a constant distance and processed in MATLAB, and the edges of the apple images were extracted. The interior area of the apple image was divided into thin trapezoidal elements perpendicular to the longitudinal axis. The total volume of the apple was estimated by summing the incremental volumes of these elements revolved around the apple's longitudinal axis. A picture of a half-cut apple was also captured in order to obtain the volume of the apple shape's indentation, which was subtracted from the previously estimated total volume. The real volume of the apples was measured using the water displacement method, and the relation between the real and estimated volumes was obtained. The t-test and Bland-Altman analysis indicated that the difference between the real and estimated volumes was not significant (p>0.05): the mean difference was 1.52 cm3 and the measurement accuracy was 92%. Utilizing a neural network with dimensions and mass as input variables increased the accuracy to 97% and decreased the mean volume difference to 0.7 cm3.
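
    The volume-estimation step described above (revolving thin trapezoidal elements about the longitudinal axis) is the classical frustum method for solids of revolution. A minimal sketch, validated against a sphere whose volume is known analytically; Python is used here in place of the authors' MATLAB, and the sampling density is arbitrary.

```python
import numpy as np

def volume_of_revolution(radii, dx):
    """Sum the volumes of thin conical frustums (trapezoidal elements
    revolved about the longitudinal axis), sampled at spacing dx:
    V_i = (pi/3) * dx * (r_i**2 + r_i*r_{i+1} + r_{i+1}**2)."""
    r0, r1 = radii[:-1], radii[1:]
    return float(np.sum(np.pi / 3.0 * dx * (r0**2 + r0 * r1 + r1**2)))

# Validation on a sphere of radius 3 cm (true volume = 4/3 * pi * 27 cm^3)
R = 3.0
x = np.linspace(-R, R, 2001)
radii = np.sqrt(np.maximum(R**2 - x**2, 0.0))   # circular edge profile
v_est = volume_of_revolution(radii, x[1] - x[0])
v_true = 4.0 / 3.0 * np.pi * R**3
print(f"estimated {v_est:.3f} vs true {v_true:.3f} cm^3")
```

    For a real apple, `radii` would come from the extracted edge profile, and the separately imaged indentation volume would be subtracted from the result.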

  18. The effect of starting point placement technique on thoracic transverse process strength: an ex vivo biomechanical study

    Directory of Open Access Journals (Sweden)

    Burton Douglas C

    2010-07-01

    Full Text Available Abstract Background The use of thoracic pedicle screws in spinal deformity, trauma, and tumor reconstruction is becoming more common. Unsuccessful screw placement may require salvage techniques utilizing transverse process hooks. The effect of different starting point placement techniques on the strength of the transverse process has not previously been reported. The purpose of this paper is to determine the biomechanical properties of the thoracic transverse process following various pedicle screw starting point placement techniques. Methods Forty-seven fresh-frozen human cadaveric thoracic vertebrae from T2 to T9 were disarticulated and matched by bone mineral density (BMD) and transverse process (TP) cross-sectional area. Specimens were randomized to one of four groups: A, control; and three others based on thoracic pedicle screw placement technique: B, straightforward; C, funnel; and D, in-out-in. Initial cortical bone removal for pedicle screw placement was made using a burr at the location on the transverse process or transverse process-laminar junction published in the original description of each technique. The transverse process was tested by measuring load-to-failure, simulating a hook in compression mode. Analysis of covariance and Pearson correlation coefficients were used to examine the data. Results Technique was a significant predictor of load-to-failure (P = 0.0007). The least squares (LS) mean load-to-failure was 377 N for group A (control), 355 N for group B (straightforward), 229 N for group C (funnel), and 301 N for group D (in-out-in). Significant differences were noted between groups A and C, A and D, B and C, and C and D. BMD (mean 0.925 g/cm2 [range, 0.624-1.301 g/cm2]) was also a significant predictor of load-to-failure for all specimens grouped together (P < 0.05). Level and side tested did not significantly correlate with load-to-failure. Conclusions The residual coronal plane compressive strength of the thoracic transverse process was significantly reduced by the funnel and in-out-in starting point techniques, with the funnel technique producing the greatest reduction.

  19. Ion source techniques for high-speed processing of material surface by ion beams

    International Nuclear Information System (INIS)

    Ishikawa, Junzo

    1990-01-01

    The present paper discusses some key or candidate techniques for future ion source development and such ion sources developed by the author. Several types of microwave ion sources for producing low charge state ions have been developed in Japan. When a microwave plasma cathode developed by the author is adapted to a Kaufman type ion source, the electron emission currents are found to be 2.5 A for argon gas and 0.5-0.9 A for oxygen gas. An alternative ionization method for metal atoms is strongly required for high-speed processing of material surface by metal-ion beams. Detailed discussion is made of collisional ionization of vaporized atoms, and negative-ion production (secondary negative-ion emission by sputtering). An impregnated electrode type liquid-metal ion source developed by the author, which has a porous tip structure, is described. The negative-ion production efficiency is quite high. The report also presents a neutral and ionized alkaline-metal bombardment type heavy negative-ion source, which consists of a cesium plasma ion source, suppressor, target electrode, negative-ion extraction electrode, and einzel lens. (N.K.)

  20. Hydroxyapatite scaffolds processed using a TBA-based freeze-gel casting/polymer sponge technique.

    Science.gov (United States)

    Yang, Tae Young; Lee, Jung Min; Yoon, Seog Young; Park, Hong Chae

    2010-05-01

    A novel freeze-gel casting/polymer sponge technique has been introduced to fabricate porous hydroxyapatite scaffolds with controlled "designer" pore structures and improved compressive strength for bone tissue engineering applications. Tertiary-butyl alcohol (TBA) was used as the solvent in this work. The merits of each production process (freeze casting, gel casting, and the polymer sponge route) were characterized by the sintered microstructure and mechanical strength. A reticulated structure with a large pore size of 180-360 μm, formed on burn-out of the polyurethane foam, consisted of struts containing highly interconnected, unidirectional, long pore channels (approximately 4.5 μm in diameter), produced by evaporation of frozen TBA during freeze casting, together with dense inner walls containing a few isolated fine pores (<2 μm) from gel casting. The sintered porosity and pore size generally behaved in an opposite manner to the solid loading, i.e., a high solid loading gave low porosity, small pore size, and a thickening of the strut cross section, thus leading to higher compressive strengths.

  1. Selective and validated data processing techniques for performance improvement of automatic lines

    Directory of Open Access Journals (Sweden)

    D’Aponte Francesco

    2016-01-01

    Full Text Available Optimization of the data processing techniques for accelerometers and force transducers provided information about the acting loads, used to improve the behaviour of the cutting stage of a converting machine for diaper production. In particular, different mechanical configurations were studied and compared in order to reduce the stresses due to the impacts between knives and anvil, to obtain clean and accurate cuts, and to reduce wear of the knives themselves. Reducing the measurement uncertainty made it possible to correctly identify the best configuration for the pneumatic system that realizes the coupling between anvil and knife. The size of the pipes, the working pressure, and the type of fluid used in the coupling system were examined. Experimental results obtained from acceleration and force measurements identified, in a reproducible and consistent way, the geometry of the pushing device and the working pressure range of the hydraulic fluid. The remarkable reduction of knife and anvil vibrations is expected to strongly reduce the wear of the cutting stage components.

  2. Identification of bacteria used for microbial enhanced oil recovery process by fluorescence in situ hybridization technique

    Energy Technology Data Exchange (ETDEWEB)

    Fujiwara, K.; Tanaka, S.; Otsuka, M. [Kansai Research Institute, Kyoto (Japan). Lifescience Lab.; Yonebayashi, H. [Japan National Oil Corp., Chiba (Japan). Tech. Research Center; Enomoto, H. [Tohoku University, Sendai (Japan). Dept. of Geoscience and Tech.

    2000-01-01

    A fluorescence in situ hybridization (FISH) technique using 16S rRNA-targeted oligonucleotide probes was developed for rapid detection of microorganisms for use in the microbial enhancement of oil recovery (MEOR) process. Two microorganisms, Enterobacter cloacae TRC-322 and Bacillus licheniformis TRC-18-2-a, selected from a collection of Enterobacter sp. and Bacillus sp. screened in previous studies as candidate microorganisms for injection, were used for this experiment. Oligonucleotide probes, designed based on specific sequences in the 16S rRNA gene, were labeled with either fluorescein isothiocyanate (FITC) or 6-carboxy-X-rhodamine (ROX), and were allowed to hybridize with fixed cells of the two microorganisms noted above. The fluorescence signal emitted from the cells of each microorganism could clearly be detected with an epifluorescence microscope. Moreover, E. cloacae TRC-322 and B. licheniformis TRC-18-2-a, suspended in actual reservoir brine containing inorganic salts, oil and the aboriginal cells of the reservoir brine, could be detected directly by this hybridization method, without the need for cultivation and isolation. (author)

  3. Closing the gap: accelerating the translational process in nanomedicine by proposing standardized characterization techniques.

    Science.gov (United States)

    Khorasani, Ali A; Weaver, James L; Salvador-Morales, Carolina

    2014-01-01

    On the cusp of widespread permeation of nanomedicine, academia, industry, and government have invested substantial financial resources in developing new ways to better treat diseases. Materials have unique physical and chemical properties at the nanoscale compared with their bulk or small-molecule analogs. These unique properties have been greatly advantageous in providing innovative solutions for medical treatments at the bench level. However, nanomedicine research has not yet fully permeated the clinical setting because of several limitations. Among these limitations are the lack of universal standards for characterizing nanomaterials and the limited knowledge that we possess regarding the interactions between nanomaterials and biological entities such as proteins. In this review, we report on recent developments in the characterization of nanomaterials as well as the newest information about the interactions between nanomaterials and proteins in the human body. We propose a standard set of techniques for universal characterization of nanomaterials. We also address relevant regulatory issues involved in the translational process for the development of drug molecules and drug delivery systems. Adherence and refinement of a universal standard in nanomaterial characterization as well as the acquisition of a deeper understanding of nanomaterials and proteins will likely accelerate the use of nanomedicine in common practice to a great extent.

  4. Development of SAP-DoA techniques for GPR data processing within COST Action TU1208

    Science.gov (United States)

    Meschino, Simone; Pajewski, Lara; Marciniak, Marian

    2016-04-01

    This work focuses on the use of Sub-Array Processing (SAP) and Direction of Arrival (DoA) approaches for processing Ground-Penetrating Radar data, with the purpose of locating metal scatterers embedded in concrete or buried in the ground. Research activities were carried out during two Short-Term Scientific Missions (STSMs) funded by the COST (European COoperation in Science and Technology) Action TU1208 "Civil Engineering Applications of Ground Penetrating Radar" in May 2015 and January 2016. In applications involving smart antennas and several transmitters operating simultaneously, a receiving array must be able to estimate the Direction of Arrival (DoA) of the incoming signals, in order to determine how many emitters are present and predict their positions. A number of methods have been devised for DoA estimation: the MUltiple SIgnal Classification (MUSIC) and the Estimation of Signal Parameters via Rotational Invariance Technique (ESPRIT) are amongst the most popular [1]. In the scenario considered here, the electromagnetic sources are the currents induced on metal elements embedded in concrete or buried in the ground. GPR radargrams are processed to estimate the DoAs of the electric field back-scattered by the sought targets. In order to work in near-field conditions, a sub-array processing (SAP) approach is adopted: the radargram is partitioned into sub-radargrams composed of a few A-scans each, and the dominant DoA is estimated for each sub-radargram. The estimated angles are triangulated, yielding a set of crossings with intersections condensed around object locations. This pattern is filtered to remove a noisy background of unwanted crossings, and is processed by applying the statistical procedure described in [2]. We tested our approach on synthetic GPR radargrams obtained with the freeware simulator gprMax, which implements the Finite-Difference Time-Domain method [3]. In particular, we worked with
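
    As an illustration of the DoA estimation step named above, the following is a textbook MUSIC implementation for a uniform linear array in the far field; it is not the near-field SAP processing chain of the Action, and the array geometry, noise level, and source angles are invented for the example.

```python
import numpy as np

def music_doa(snapshots, n_sources, d=0.5, angles=np.linspace(-90.0, 90.0, 361)):
    """MUSIC pseudo-spectrum for a uniform linear array with element
    spacing d in wavelengths.  snapshots: (n_elements, n_snapshots)."""
    m = snapshots.shape[0]
    R = snapshots @ snapshots.conj().T / snapshots.shape[1]   # sample covariance
    _, vecs = np.linalg.eigh(R)               # eigenvalues in ascending order
    En = vecs[:, :m - n_sources]              # noise subspace
    P = np.empty(angles.size)
    for i, theta in enumerate(np.deg2rad(angles)):
        a = np.exp(-2j * np.pi * d * np.arange(m) * np.sin(theta))
        P[i] = 1.0 / np.abs(a.conj() @ En @ En.conj().T @ a)
    return angles, P

# Two uncorrelated sources at -20 and +35 degrees, 8-element array
rng = np.random.default_rng(1)
m, n_snap = 8, 400
theta_true = np.deg2rad([-20.0, 35.0])
A = np.exp(-2j * np.pi * 0.5 * np.outer(np.arange(m), np.sin(theta_true)))
S = rng.standard_normal((2, n_snap)) + 1j * rng.standard_normal((2, n_snap))
N = 0.1 * (rng.standard_normal((m, n_snap)) + 1j * rng.standard_normal((m, n_snap)))
angles, P = music_doa(A @ S + N, n_sources=2)
i1 = int(np.argmax(P))                        # strongest spectral peak
i2 = int(np.argmax(np.where(np.abs(angles - angles[i1]) > 5.0, P, 0.0)))
print(sorted((angles[i1], angles[i2])))
```

    The pseudo-spectrum peaks where the steering vector is orthogonal to the noise subspace; the second peak is picked after masking out the neighbourhood of the first.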

  5. The Impact of Services on Economic Complexity: Service Sophistication as Route for Economic Growth.

    Science.gov (United States)

    Stojkoski, Viktor; Utkovski, Zoran; Kocarev, Ljupco

    2016-01-01

    Economic complexity reflects the amount of knowledge that is embedded in the productive structure of an economy. By combining tools from network science and econometrics, a robust and stable relationship between a country's productive structure and its economic growth has been established. Here we report that not only goods but also services are important for predicting the rate at which countries will grow. By adopting a terminology which classifies manufactured goods and delivered services as products, we investigate the influence of services on the country's productive structure. In particular, we provide evidence that complexity indices for services are in general higher than those for goods, which is reflected in a general tendency to rank countries with developed service sector higher than countries with economy centred on manufacturing of goods. By focusing on country dynamics based on experimental data, we investigate the impact of services on the economic complexity of countries measured in the product space (consisting of both goods and services). Importantly, we show that diversification of service exports and its sophistication can provide an additional route for economic growth in both developing and developed countries.
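
    The complexity indices referred to here are commonly computed with the method of reflections over a binary country-product matrix, where "products" cover both goods and services under the paper's terminology. A minimal sketch on a toy nested matrix; the matrix and iteration count are illustrative, not the paper's data.

```python
import numpy as np

def eci_method_of_reflections(M, n_iter=18):
    """Method of reflections on a binary country x product matrix M
    (1 = country exports the product competitively).  Alternately average
    product ubiquity over each country's products and country
    diversification over each product's exporters; an even iteration
    count preserves the orientation of the ranking.  Returns standardised
    complexity scores, one per country."""
    kc = M.sum(axis=1).astype(float)    # diversification k_c,0
    kp = M.sum(axis=0).astype(float)    # ubiquity k_p,0
    for _ in range(n_iter):
        kc, kp = (M @ kp) / M.sum(axis=1), (M.T @ kc) / M.sum(axis=0)
    return (kc - kc.mean()) / kc.std()

# Toy nested matrix: country 0 exports everything, country 3 exports only
# the most ubiquitous product, so complexity should fall from 0 to 3.
M = np.array([[1, 1, 1, 1],
              [1, 1, 1, 0],
              [1, 1, 0, 0],
              [1, 0, 0, 0]], dtype=float)
eci = eci_method_of_reflections(M)
print(np.argsort(-eci))   # ranking: 0, 1, 2, 3
```

    Diversified countries exporting rare products score highest, which is the mechanism behind ranking service-rich economies above manufacturing-centred ones.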

  6. Exploring the predictive power of interaction terms in a sophisticated risk equalization model using regression trees.

    Science.gov (United States)

    van Veen, S H C M; van Kleef, R C; van de Ven, W P M M; van Vliet, R C J A

    2018-02-01

    This study explores the predictive power of interaction terms between the risk adjusters in the Dutch risk equalization (RE) model of 2014. Due to the sophistication of this RE-model and the complexity of the associations in the dataset (N = ~16.7 million), there are theoretically more than a million interaction terms. We used regression tree modelling, which has been applied rarely within the field of RE, to identify interaction terms that statistically significantly explain variation in observed expenses that is not already explained by the risk adjusters in this RE-model. The interaction terms identified were used as additional risk adjusters in the RE-model. We found evidence that interaction terms can improve the prediction of expenses overall and for specific groups in the population. However, the prediction of expenses for some other selective groups may deteriorate. Thus, interactions can reduce financial incentives for risk selection for some groups but may increase them for others. Furthermore, because regression trees are not robust, additional criteria are needed to decide which interaction terms should be used in practice. These criteria could be the right incentive structure for risk selection and efficiency or the opinion of medical experts. Copyright © 2017 John Wiley & Sons, Ltd.
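
    How a regression tree surfaces an interaction term can be shown with a toy example: a depth-two path through the tree corresponds to a product of two risk adjusters. The variables below (an age flag and a morbidity flag) and the effect size are invented for illustration and are not taken from the Dutch RE-model.

```python
import numpy as np

def best_split(X, y, cols):
    """Return the binary column whose split most reduces squared error."""
    best, best_sse = None, float(np.sum((y - y.mean()) ** 2))
    for j in cols:
        left, right = y[X[:, j] == 0], y[X[:, j] == 1]
        if len(left) == 0 or len(right) == 0:
            continue
        sse = np.sum((left - left.mean()) ** 2) + np.sum((right - right.mean()) ** 2)
        if sse < best_sse:
            best, best_sse = j, sse
    return best

# Simulated residual spending with a pure interaction between two binary
# risk adjusters: the extra expected expense of 5000 occurs only when
# both flags are set; a third, unrelated flag carries no signal.
rng = np.random.default_rng(0)
n = 20000
age65, diabetes, unrelated = (rng.integers(0, 2, n) for _ in range(3))
residual = 5000.0 * (age65 & diabetes) + rng.normal(0.0, 100.0, n)
X = np.column_stack([age65, diabetes, unrelated])

root = best_split(X, residual, range(3))
branch = X[:, root] == 1
child = best_split(X[branch], residual[branch], [j for j in range(3) if j != root])
print("depth-2 path = interaction of features", root, "and", child)
```

    The root-then-child path recovers the interacting pair, which is exactly the candidate one would add as an interaction term to the RE-model; the study's caveat about tree instability is why such candidates still need substantive screening.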

  7. The State of Nursing Home Information Technology Sophistication in Rural and Nonrural US Markets.

    Science.gov (United States)

    Alexander, Gregory L; Madsen, Richard W; Miller, Erin L; Wakefield, Douglas S; Wise, Keely K; Alexander, Rachel L

    2017-06-01

    To test for significant differences in information technology sophistication (ITS) in US nursing homes (NH) based on location. We administered a primary survey from January 2014 to July 2015 to NHs in each US state. The survey was cross-sectional and examined 3 dimensions (IT capabilities, extent of IT use, degree of IT integration) across 3 domains (resident care, clinical support, administrative activities) of ITS. ITS was broken down by NH location, and mean responses were compared across 4 NH categories (Metropolitan, Micropolitan, Small Town, and Rural) for all 9 ITS dimension-domain combinations. Least squares means and Tukey's method were used for multiple comparisons. The survey yielded 815/1,799 responses (45% response rate). In every health care domain (resident care, clinical support, and administrative activities), statistically significant differences in facility ITS were found between more populated (metropolitan or micropolitan) and less populated (small town or rural) areas. This study represents the most current national assessment of NH IT since 2004. Historically, NH IT has been used solely for administrative activities and much less for resident care and clinical support. However, the results are encouraging, as ITS in the other domains appears to be greater than previously imagined. © 2016 National Rural Health Association.

  8. Simple Plans or Sophisticated Habits? State, Transition and Learning Interactions in the Two-Step Task.

    Science.gov (United States)

    Akam, Thomas; Costa, Rui; Dayan, Peter

    2015-12-01

    The recently developed 'two-step' behavioural task promises to differentiate model-based from model-free reinforcement learning, while generating neurophysiologically-friendly decision datasets with parametric variation of decision variables. These desirable features have prompted its widespread adoption. Here, we analyse the interactions between a range of different strategies and the structure of transitions and outcomes in order to examine constraints on what can be learned from behavioural performance. The task involves a trade-off between the need for stochasticity, to allow strategies to be discriminated, and a need for determinism, so that it is worth subjects' investment of effort to exploit the contingencies optimally. We show through simulation that under certain conditions model-free strategies can masquerade as being model-based. We first show that seemingly innocuous modifications to the task structure can induce correlations between action values at the start of the trial and the subsequent trial events in such a way that analysis based on comparing successive trials can lead to erroneous conclusions. We confirm the power of a suggested correction to the analysis that can alleviate this problem. We then consider model-free reinforcement learning strategies that exploit correlations between where rewards are obtained and which actions have high expected value. These generate behaviour that appears model-based under these, and also more sophisticated, analyses. Exploiting the full potential of the two-step task as a tool for behavioural neuroscience requires an understanding of these issues.
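
    The model-free signature discussed in this abstract can be reproduced with a small simulation. The sketch below is a deliberately crude agent (direct TD update of first-step values, no eligibility traces, invented parameter values), not the authors' models: reward raises the probability of repeating the first-step choice regardless of whether the transition that produced it was common or rare.

```python
import numpy as np

rng = np.random.default_rng(0)

def run_model_free(n_trials=20000, alpha=0.5, beta=5.0):
    """Crude model-free agent on a two-step task: first-step choice 0/1,
    common (p=0.7) or rare transition to second-step state 0/1, binary
    reward with slowly drifting probabilities.  The agent updates the
    chosen first-step value directly from the final reward and has no
    model of the transition structure."""
    q = np.zeros(2)                       # first-step action values
    p_reward = np.array([0.6, 0.4])       # reward probability per state
    records, prev = [], None
    for _ in range(n_trials):
        p1 = 1.0 / (1.0 + np.exp(-beta * (q[1] - q[0])))
        a = int(rng.random() < p1)
        common = rng.random() < 0.7
        s = a if common else 1 - a        # common transitions keep the mapping
        r = float(rng.random() < p_reward[s])
        if prev is not None:              # did the agent repeat its last choice?
            records.append((prev[1], prev[2], a == prev[0]))
        q[a] += alpha * (r - q[a])        # direct, model-free update
        prev = (a, common, r)
        p_reward = np.clip(p_reward + rng.normal(0.0, 0.02, 2), 0.25, 0.75)
    return records

records = run_model_free()

def p_stay(common, rewarded):
    sel = [stay for c, r, stay in records if c == common and r == rewarded]
    return sum(sel) / len(sel)

for c in (True, False):
    print("common" if c else "rare  ",
          "stay|reward", round(p_stay(c, 1.0), 2),
          "stay|no reward", round(p_stay(c, 0.0), 2))
```

    A model-based agent would instead show the interaction the task was designed to detect: reward after a rare transition should reduce staying, because the other first-step action more reliably reaches the rewarded state.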

  9. Simple Plans or Sophisticated Habits? State, Transition and Learning Interactions in the Two-Step Task

    Science.gov (United States)

    Akam, Thomas; Costa, Rui; Dayan, Peter

    2015-01-01

    The recently developed ‘two-step’ behavioural task promises to differentiate model-based from model-free reinforcement learning, while generating neurophysiologically-friendly decision datasets with parametric variation of decision variables. These desirable features have prompted its widespread adoption. Here, we analyse the interactions between a range of different strategies and the structure of transitions and outcomes in order to examine constraints on what can be learned from behavioural performance. The task involves a trade-off between the need for stochasticity, to allow strategies to be discriminated, and a need for determinism, so that it is worth subjects’ investment of effort to exploit the contingencies optimally. We show through simulation that under certain conditions model-free strategies can masquerade as being model-based. We first show that seemingly innocuous modifications to the task structure can induce correlations between action values at the start of the trial and the subsequent trial events in such a way that analysis based on comparing successive trials can lead to erroneous conclusions. We confirm the power of a suggested correction to the analysis that can alleviate this problem. We then consider model-free reinforcement learning strategies that exploit correlations between where rewards are obtained and which actions have high expected value. These generate behaviour that appears model-based under these, and also more sophisticated, analyses. Exploiting the full potential of the two-step task as a tool for behavioural neuroscience requires an understanding of these issues. PMID:26657806

  10. Simple Plans or Sophisticated Habits? State, Transition and Learning Interactions in the Two-Step Task.

    Directory of Open Access Journals (Sweden)

    Thomas Akam

    2015-12-01

    Full Text Available The recently developed 'two-step' behavioural task promises to differentiate model-based from model-free reinforcement learning, while generating neurophysiologically-friendly decision datasets with parametric variation of decision variables. These desirable features have prompted its widespread adoption. Here, we analyse the interactions between a range of different strategies and the structure of transitions and outcomes in order to examine constraints on what can be learned from behavioural performance. The task involves a trade-off between the need for stochasticity, to allow strategies to be discriminated, and a need for determinism, so that it is worth subjects' investment of effort to exploit the contingencies optimally. We show through simulation that under certain conditions model-free strategies can masquerade as being model-based. We first show that seemingly innocuous modifications to the task structure can induce correlations between action values at the start of the trial and the subsequent trial events in such a way that analysis based on comparing successive trials can lead to erroneous conclusions. We confirm the power of a suggested correction to the analysis that can alleviate this problem. We then consider model-free reinforcement learning strategies that exploit correlations between where rewards are obtained and which actions have high expected value. These generate behaviour that appears model-based under these, and also more sophisticated, analyses. Exploiting the full potential of the two-step task as a tool for behavioural neuroscience requires an understanding of these issues.

  11. A sophisticated simulation for the fracture behavior of concrete material using XFEM

    Science.gov (United States)

    Zhai, Changhai; Wang, Xiaomin; Kong, Jingchang; Li, Shuang; Xie, Lili

    2017-10-01

    The development of a powerful numerical model to simulate the fracture behavior of concrete material has long been one of the dominant research areas in earthquake engineering. A reliable model should be able to adequately represent the discontinuous characteristics of cracks and simulate various failure behaviors under complicated loading conditions. In this paper, a numerical formulation, which incorporates a sophisticated rigid-plastic interface constitutive model coupling cohesion softening, contact, friction and shear dilatation into the XFEM, is proposed to describe various crack behaviors of concrete material. An effective numerical integration scheme for accurately assembling the contribution to the weak form on both sides of the discontinuity is introduced. The effectiveness of the proposed method has been assessed by simulating several well-known experimental tests. It is concluded that the numerical method can successfully capture the crack paths and accurately predict the fracture behavior of concrete structures. The influence of mode-II parameters on the mixed-mode fracture behavior is further investigated to better determine these parameters.
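
    For readers unfamiliar with cohesion softening, the simplest member of that family is a linear traction-separation law. The sketch below is this generic law with invented material constants; it is not the paper's coupled rigid-plastic interface model, which additionally handles contact, friction, and shear dilatation.

```python
def cohesive_traction(delta, t_max=3.0e6, delta_c=1.0e-4):
    """Linear-softening traction-separation law: normal traction decays
    from the tensile strength t_max (Pa) at crack initiation to zero at
    the critical opening delta_c (m); the area under the curve is the
    fracture energy G_f = 0.5 * t_max * delta_c."""
    if delta <= 0.0:
        return t_max    # closed crack: contact/friction treated separately
    return max(t_max * (1.0 - delta / delta_c), 0.0)

G_f = 0.5 * 3.0e6 * 1.0e-4   # J/m^2 for the default constants
print(cohesive_traction(5.0e-5), G_f)
```

    In an XFEM setting this law supplies the traction carried across the enriched (discontinuous) element faces as the crack opens.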

  12. Ranking network of a captive rhesus macaque society: a sophisticated corporative kingdom.

    Science.gov (United States)

    Fushing, Hsieh; McAssey, Michael P; Beisner, Brianne; McCowan, Brenda

    2011-03-15

    We develop a three-step computing approach to explore a hierarchical ranking network for a society of captive rhesus macaques. The computed network is sufficiently informative to address the question: Is the ranking network for a rhesus macaque society more like a kingdom or a corporation? Our computations are based on a three-step approach. These steps are devised to deal with the tremendous challenges stemming from the transitivity of dominance as a necessary constraint on the ranking relations among all individual macaques, and the very high sampling heterogeneity in the behavioral conflict data. The first step simultaneously infers the ranking potentials among all network members, which requires accommodation of heterogeneous measurement error inherent in behavioral data. Our second step estimates the social rank for all individuals by minimizing the network-wide errors in the ranking potentials. The third step provides a way to compute confidence bounds for selected empirical features in the social ranking. We apply this approach to two sets of conflict data pertaining to two captive societies of adult rhesus macaques. The resultant ranking network for each society is found to be a sophisticated mixture of both a kingdom and a corporation. Also, for validation purposes, we reanalyze conflict data from twenty longhorn sheep and demonstrate that our three-step approach is capable of correctly computing a ranking network by eliminating all ranking error.
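
    A drastically simplified analogue of the rank-estimation step (minimizing network-wide errors in ranking potentials) is a least-squares fit of potentials to pairwise win margins. The sketch below uses an invented 4-individual conflict matrix and omits the transitivity machinery and measurement-error modelling of the actual three-step method.

```python
import numpy as np

def rank_potentials(wins):
    """Least-squares ranking potentials from a pairwise win-count matrix:
    minimise the sum over observed pairs of (s_i - s_j - m_ij)^2, where
    m_ij is the empirical win margin, with the gauge sum(s) = 0."""
    n = wins.shape[0]
    rows, rhs = [], []
    for i in range(n):
        for j in range(i + 1, n):
            total = wins[i, j] + wins[j, i]
            if total == 0:
                continue                      # unobserved dyad
            row = np.zeros(n)
            row[i], row[j] = 1.0, -1.0
            rows.append(row)
            rhs.append((wins[i, j] - wins[j, i]) / total)   # margin in [-1, 1]
    rows.append(np.ones(n))                   # soft gauge: potentials sum to 0
    rhs.append(0.0)
    s, *_ = np.linalg.lstsq(np.asarray(rows), np.asarray(rhs), rcond=None)
    return s

# Invented conflict matrix for four individuals with hierarchy 0 > 1 > 2 > 3
wins = np.array([[0, 8, 9, 10],
                 [2, 0, 7,  9],
                 [1, 3, 0,  8],
                 [0, 1, 2,  0]])
s = rank_potentials(wins)
print(np.argsort(-s))   # rank order 0, 1, 2, 3
```

    Sorting the fitted potentials yields the linear hierarchy; the paper's approach additionally quantifies confidence bounds and accommodates heterogeneous sampling across dyads.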

  13. Ranking network of a captive rhesus macaque society: a sophisticated corporative kingdom.

    Directory of Open Access Journals (Sweden)

    Hsieh Fushing

    2011-03-01

    Full Text Available We develop a three-step computing approach to explore a hierarchical ranking network for a society of captive rhesus macaques. The computed network is sufficiently informative to address the question: Is the ranking network for a rhesus macaque society more like a kingdom or a corporation? Our computations are based on a three-step approach. These steps are devised to deal with the tremendous challenges stemming from the transitivity of dominance as a necessary constraint on the ranking relations among all individual macaques, and the very high sampling heterogeneity in the behavioral conflict data. The first step simultaneously infers the ranking potentials among all network members, which requires accommodation of heterogeneous measurement error inherent in behavioral data. Our second step estimates the social rank for all individuals by minimizing the network-wide errors in the ranking potentials. The third step provides a way to compute confidence bounds for selected empirical features in the social ranking. We apply this approach to two sets of conflict data pertaining to two captive societies of adult rhesus macaques. The resultant ranking network for each society is found to be a sophisticated mixture of both a kingdom and a corporation. Also, for validation purposes, we reanalyze conflict data from twenty longhorn sheep and demonstrate that our three-step approach is capable of correctly computing a ranking network by eliminating all ranking error.

  14. Measurement and image processing evaluation of surface modifications of dental implants G4 pure titanium created by different techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bulutsuz, A. G., E-mail: asligunaya@gmail.com [Department of Mechanical Engineering, Yildiz Technical University, 34349 Besiktas, İstanbul (Turkey); Demircioglu, P., E-mail: pinar.demircioglu@adu.edu.tr; Bogrekci, I., E-mail: ismail.bogrekci@adu.edu.tr [Adnan Menderes University, Faculty of Engineering, Department of Mechanical Engineering, Aytepe, 09010, Aydin (Turkey); Durakbasa, M. N., E-mail: durakbasa@gmx.at [Department of Interchangeable Manufacturing and Industrial Metrology, Institute for Production Engineering and Laser Technology, Vienna University of Technology, Karlsplatz 13/3113 A-1040 Wien (Austria); Katiboglu, A. B., E-mail: abkatiboglu@hotmail.com [Istanbul University, Faculty of Dentistry, Department of Oral and Maxillofacial Surgery, Istanbul (Turkey)

    2015-03-30

The interaction between foreign substances placed into the jaw to replace lost teeth and the surrounding organic tissue is a highly complex process, involving many biological reactions as well as the biomechanical forces that influence this formation. Osseointegration refers to the direct structural and functional connection between living bone and the load-bearing surface of an artificial implant. Considering the requirements of the implant manufacturing processes, this study investigates surface characterization with highly precise measurement techniques and emphasizes the importance of these processes for the long-term success of dental implants. In this research, a detailed surface characterization was performed to identify the dependence of the surface properties on the manufacturing technique, using image processing methods, a scanning electron microscope (SEM) for morphological properties in 3D, and a Taylor Hobson stylus profilometer for roughness properties in 2D. Three implant surfaces fabricated by different manufacturing techniques were inspected, and a machined surface was included in the study as a reference specimen. The results indicated that the different surface treatments strongly influenced surface morphology, highlighting the importance of precise 2D and 3D inspection techniques for surface characterization. Different image analysis techniques, such as the dark-light technique, were used to verify the surface measurement results. The computational phase was performed using the Image Processing Toolbox in Matlab, with precise evaluation of the roughness of the implant surfaces. The relationship between the number of black and white pixels and surface roughness is presented. FFT image processing and analysis results explicitly imply that the technique is useful for the determination of surface roughness. The results showed that the number of black pixels in the image increases with increasing surface roughness.
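The dark-light pixel analysis described in this record can be illustrated with a minimal sketch: threshold a grayscale surface image and take the dark-pixel fraction as a rough-area proxy. The synthetic arrays and the threshold value below are illustrative assumptions, not the paper's actual SEM micrographs:

```python
import numpy as np

def dark_pixel_fraction(image, threshold=128):
    """Fraction of pixels darker than the threshold (0-255 grayscale)."""
    return (image < threshold).mean()

# Synthetic stand-ins for SEM images of two surfaces.
rng = np.random.default_rng(0)
smooth = rng.normal(180, 10, size=(64, 64)).clip(0, 255)  # bright, uniform
rough = rng.normal(120, 40, size=(64, 64)).clip(0, 255)   # darker, more varied

print(dark_pixel_fraction(smooth))  # low fraction of dark pixels
print(dark_pixel_fraction(rough))   # higher fraction of dark pixels
```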

  15. Improvement on Exoplanet Detection Methods and Analysis via Gaussian Process Fitting Techniques

    Science.gov (United States)

    Van Ross, Bryce; Teske, Johanna

    2018-01-01

Planetary signals in radial velocity (RV) data are often accompanied by signals coming solely from stellar photo- or chromospheric variation. Such variation can reduce the precision of planet detection and mass measurements, and cause misidentification of planetary signals. Recently, several authors have demonstrated the utility of Gaussian Process (GP) regression for disentangling planetary signals in RV observations (Aigrain et al. 2012; Angus et al. 2017; Czekala et al. 2017; Faria et al. 2016; Gregory 2015; Haywood et al. 2014; Rajpaul et al. 2015; Foreman-Mackey et al. 2017). GP regression models the covariance of multivariate data to make predictions about likely underlying trends, which can be extended to regions where there are no existing observations. GP methods have been used to infer stellar rotation periods; to model and disentangle time series spectra; and to determine physical properties, populations, and detections of exoplanets, among other astrophysical applications. Here, we apply similar analysis techniques to time series of the Ca II H and K activity indicator measured simultaneously with RVs in a small sample of stars from the large Keck/HIRES RV planet search program. Our goal is to characterize the pattern(s) of non-planetary variation in order to distinguish what is, and is not, a planetary signal. We investigated ten different GP kernels and their respective hyperparameters to determine the optimal combination (e.g., the lowest Bayesian Information Criterion value) for each stellar data set. To assess the hyperparameters' errors, we sampled their posterior distributions using Markov chain Monte Carlo (MCMC) analysis on the optimized kernels. Our results demonstrate how GP analysis of stellar activity indicators alone can contribute to exoplanet detection in RV data, and highlight the challenges in applying GP analysis to relatively small, irregularly sampled time series.
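The kernel-comparison step described here can be sketched in miniature: compute the GP log marginal likelihood for two candidate covariance functions on a synthetic, irregularly sampled series and rank them with a BIC-style score. The kernels, hyperparameter values, noise level, and data are all illustrative, not the study's actual ten-kernel set:

```python
import numpy as np

# Synthetic, irregularly sampled "activity indicator" time series.
rng = np.random.default_rng(42)
t = np.sort(rng.uniform(0, 10, 40))
y = np.sin(1.5 * t) + 0.1 * rng.normal(size=40)

def sq_exp(t1, t2, amp=1.0, ell=1.0):
    """Squared-exponential covariance function."""
    return amp**2 * np.exp(-0.5 * ((t1[:, None] - t2[None, :]) / ell) ** 2)

def periodic(t1, t2, amp=1.0, ell=1.0, period=4.2):
    """Standard periodic covariance function."""
    d = np.pi * np.abs(t1[:, None] - t2[None, :]) / period
    return amp**2 * np.exp(-2 * (np.sin(d) / ell) ** 2)

def log_marginal_likelihood(kernel, t, y, noise=0.1):
    # log p(y | t) for a zero-mean GP with white measurement noise.
    K = kernel(t, t) + noise**2 * np.eye(len(t))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha - np.log(np.diag(L)).sum()
            - 0.5 * len(t) * np.log(2 * np.pi))

# BIC = -2 log L + k log n, with k the number of hyperparameters.
n_hyper = {"sq_exp": 2, "periodic": 3}
bic = {name: -2 * log_marginal_likelihood(k, t, y) + n_hyper[name] * np.log(len(t))
       for name, k in [("sq_exp", sq_exp), ("periodic", periodic)]}
print(min(bic, key=bic.get), bic)  # kernel with the lowest BIC wins
```

In the study itself the hyperparameters would also be optimized (and their posteriors sampled with MCMC) before the BIC values are compared; here they are simply fixed for brevity.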

  16. New technique for the determination of trace noble metal content in geological and process materials

    Energy Technology Data Exchange (ETDEWEB)

    Mitkin, V.N. E-mail: mit@che.nsk.su; Zayakina, S.B.; Anoshin, G.N

    2003-02-03

A new two-step sample preparation technique is proposed for the instrumental determination of trace quantities of noble metals (NM) in refractory geological and process materials. The decomposition procedure is based on the oxidizing fluorination of samples with subsequent sulfatization (OFS) of the sample melt or cake. Fluorination of samples is accomplished using a mixture of KHF{sub 2}+KBrF{sub 4} or KHF{sub 2}+BrF{sub 3} depending on the ratio of sample mass to oxidizing mixture. Both cakes and melts can result from this procedure. Sulfatization of the resulting fluorides is completed using concentrated sulfuric acid heated to 550 deg. C. Validation studies using certified geostandard reference materials (GSO VP-2, ZH-3, Matte RTP, HO-1, SARM-7) have shown that the proposed method is fast, convenient and most often produces non-hygroscopic homogeneous residues suitable for analysis by atomic absorption spectrometry (AAS) and atomic emission spectrometry (AES). Results obtained for NM concentrations in reference materials agreed with certified concentration ranges and with results obtained using other methods of analysis. The OFS procedure combined with direct current plasma AES (d.c. plasma AES) achieved the following limits of detection (LOD) for the noble metals: Ag, Au, Pd, 1-2x10{sup -6}; Pt, 5x10{sup -6}; and Ru, Rh, Ir, Os, 1-3x10{sup -7} wt.%. Using graphite furnace AAS (GFAAS) combined with extraction pre-concentration, the following LODs for NMs were achieved: Pt, Ru, 1x10{sup -6}; Pd, Rh, 1x10{sup -7}; and Au, Ag, 1-2x10{sup -8} wt.%. The relative standard deviation for NM determinations (S{sub r}) was dependent on NM concentration and sample type, but was commonly in the range of 3-15% for d.c. plasma AES and 5-30% for GFAAS.

  17. High Resolution Radar Imaging using Coherent MultiBand Processing Techniques

    NARCIS (Netherlands)

    Dorp, Ph. van; Ebeling, R.P.; Huizing, A.G.

    2010-01-01

    High resolution radar imaging techniques can be used in ballistic missile defence systems to determine the type of ballistic missile during the boost phase (threat typing) and to discriminate different parts of a ballistic missile after the boost phase. The applied radar imaging technique is 2D

  18. The Use of Plasma Technique in Nitridation Process of Metal Alloy DIN 42CrMo4

    International Nuclear Information System (INIS)

    Purwanto; Malau, Viktor; Tjipto Sujitno

    2003-01-01

Nitridation with the plasma technique is one method of surface treatment for a material. Research on the plasma technique for the nitridation process has been carried out to determine the effect of nitridation on the properties of the metal alloy DIN 42CrMo4. The nitridation process was conducted in a vacuum tube at a pressure of 0.36 torr, a temperature of 300 °C, and nitridation times of 1, 2, and 3 hours. The nitridation process was followed by hardness measurement using a High Quality Micro Hardness Tester machine, serial number MM-0054, as well as microstructure examination using a Scanning Electron Microscope (SEM) coupled with Energy Dispersive Spectroscopy (EDS) EDAX-DX4. The results showed that surface hardness increased after the nitridation process: for nitridation times of 1, 2, and 3 hours, the hardness increased from 291 kg/mm² to 303 kg/mm², 324 kg/mm², and 403 kg/mm², respectively. Microstructure observation showed that a new iron nitride phase (Fe₄N) was formed, containing 4.17 wt.% nitrogen (equivalent to 14.73 at.%) with a layer thickness of 5.71 μm, 5.08 wt.% nitrogen (17.51 at.%) with a thickness of 6.78 μm, and 5.69 wt.% nitrogen (19.24 at.%) with a thickness of 8.57 μm, respectively. (author)

  19. Development and Quantification of UV-Visible and Laser Spectroscopic Techniques for Materials Accountability and Process Control

    International Nuclear Information System (INIS)

    Czerwinski, Ken; Weck, Phil; Poineau, Frederic

    2010-01-01

Ultraviolet-Visible Spectroscopy (UV-Visible) and Time Resolved Laser Fluorescence Spectroscopy (TRLFS) optical techniques permit on-line, real-time analysis of the actinide elements in a solvent extraction process. UV-Visible and TRLFS techniques have been used for measuring the speciation and concentration of the actinides under laboratory conditions. These methods are easily adaptable to multiple sampling geometries, such as dip probes, fiber-optic sample cells, and flow-through cell geometries. To fully exploit these techniques for GNEP applications, the fundamental speciation of the target actinides and the resulting influence on their spectroscopic properties must be determined. Through this effort, detection limits, process conditions, and speciation of key actinide components can be established and utilized in a range of areas of interest to GNEP, especially in areas related to materials accountability and process control.
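As a minimal illustration of how UV-Visible absorbance supports on-line concentration measurement, the Beer-Lambert law A = ε·l·c can be inverted for a two-species mixture by least squares. The calibration matrix, path length, and absorbance readings below are invented for illustration, not actual actinide values:

```python
import numpy as np

# Hypothetical calibration matrix: E[w, s] is the molar absorptivity
# (L/mol/cm) of species s at wavelength w.
E = np.array([[410.0, 20.0],
              [35.0, 95.0]])
path_cm = 1.0                 # optical path length of the cell

# Absorbances measured at the two wavelengths.
A = np.array([0.430, 0.130])

# Beer-Lambert in matrix form: (E * l) @ c = A; solve for concentrations c.
c, *_ = np.linalg.lstsq(E * path_cm, A, rcond=None)
print(c)  # estimated concentration of each species (mol/L)
```

With more wavelengths than species the same least-squares solve becomes an overdetermined fit, which is how a full spectrum improves robustness against noise.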

  20. Spatial Distribution Analysis of Soil Properties in Varzaneh Region of Isfahan Using Image Processing Techniques

    Directory of Open Access Journals (Sweden)

    F. Mahmoodi

    2016-02-01

annual evaporation rate is 3265 mm. In this study, image processing techniques including band combinations, Principal Component Analysis (PC1, PC2 and PC3), and classification were applied to a TM image to map different soil properties. In order to prepare the satellite image, geometric correction was performed. A 1:25,000 map (UTM zone 39) was used as a base to georegister the Landsat image. 40 Ground Control Points (GCPs) were selected throughout the map and image. Road intersections and other man-made features were appropriate targets for this purpose. The raw image was transformed to the georectified image using a first-order polynomial, and then resampled using the nearest neighbour method to preserve radiometry. The final Root Mean Square (RMS) error for the selected points was 0.3 pixels. To establish relationships between image and field data, stratified random sampling techniques were used to collect 53 soil samples at the GPS (Global Positioning System) points. The continuous map of soil properties was achieved using simple and multiple linear regression models by averaging 9 image pixels around the sampling sites. Different image spectral indices were used as independent variables and the dependent variables were field-based data. Results and Discussion: The results of multiple regression analysis showed that the strongest relationship was between sandy soil and TM bands 1, 2, 3, 4, and 5, explaining up to 83% of the variation in this component. The weakest relationship was found between CaCO3 and TM bands 3, 5, and 7. In some cases, multiple regression was not an appropriate predictive model of soil properties; therefore, the TM and PC bands that had the strongest relationship with field data (confidence level 99%) based on simple regression were classified by the maximum likelihood algorithm. According to the error matrix, the overall accuracy of the classified maps was between 85 and 93% for the chlorine (Cl) and silt components, respectively.
Conclusions: The results indicated that
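The band-to-property regression step described in this record can be sketched as an ordinary least-squares fit of a soil property on TM band values. All numbers below are synthetic stand-ins for the 53 field samples:

```python
import numpy as np

# Synthetic stand-in for the field data: 53 sampling sites, TM bands 1-5
# averaged over 9 pixels around each site, and an invented "sand" response.
rng = np.random.default_rng(7)
n_sites, n_bands = 53, 5
bands = rng.uniform(0, 255, size=(n_sites, n_bands))
true_coefs = np.array([0.3, -0.1, 0.2, 0.15, -0.05])      # illustrative only
sand = bands @ true_coefs + 10 + rng.normal(0, 2, n_sites)

# Multiple linear regression with an intercept column, via least squares.
X = np.column_stack([np.ones(n_sites), bands])
coefs, *_ = np.linalg.lstsq(X, sand, rcond=None)

# Coefficient of determination (R^2) of the fit.
pred = X @ coefs
r2 = 1 - np.sum((sand - pred) ** 2) / np.sum((sand - sand.mean()) ** 2)
print(round(r2, 3))
```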