WorldWideScience

Sample records for digital frequency analysis

  1. Fast focus estimation using frequency analysis in digital holography.

    Science.gov (United States)

    Oh, Seungtaik; Hwang, Chi-Young; Jeong, Il Kwon; Lee, Sung-Keun; Park, Jae-Hyeung

    2014-11-17

    A novel fast frequency-based method to estimate the focus distance of a digital hologram of a single object is proposed. The focus distance is computed by analyzing the distribution of intersections of smoothed rays. The smoothed rays are determined by the directions of energy flow, which are computed from the local spatial frequency spectrum based on the windowed Fourier transform. Our method therefore uses only the intrinsic frequency information of the optical field on the hologram and does not require sequential numerical reconstructions or the focus detection techniques of conventional photography, both of which are essential parts of previous methods. To show the effectiveness of our method, numerical results and analysis are presented as well.
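
    The core ingredient of the method, a local spatial frequency map obtained from a windowed Fourier transform of the hologram field, can be sketched as below. This is an illustrative reconstruction only (window size, step and pixel pitch are assumed values); the paper's later steps, turning these frequencies into smoothed-ray directions and intersecting the rays to obtain the focus distance, are not reproduced here.

        import numpy as np

        def local_spatial_frequency(field, win=64, step=32, pitch=8e-6):
            """Dominant local spatial frequency (fx, fy) of a complex optical
            field in overlapping windows -- a crude windowed Fourier transform.
            Returns window centres (pixels) and frequencies (cycles/m)."""
            h, w = field.shape
            taper = np.outer(np.hanning(win), np.hanning(win))
            f_axis = np.fft.fftfreq(win, d=pitch)
            centres, freqs = [], []
            for y in range(0, h - win + 1, step):
                for x in range(0, w - win + 1, step):
                    spec = np.abs(np.fft.fft2(field[y:y + win, x:x + win] * taper))
                    spec[0, 0] = 0.0                      # ignore the DC term
                    iy, ix = np.unravel_index(np.argmax(spec), spec.shape)
                    centres.append((x + win / 2, y + win / 2))
                    freqs.append((f_axis[ix], f_axis[iy]))
            return np.array(centres), np.array(freqs)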

  2. Joint time frequency analysis in digital signal processing

    DEFF Research Database (Denmark)

    Pedersen, Flemming

    with this technique is that the resolution is limited because of distortion. To overcome the resolution limitations of the Fourier Spectrogram, many new distributions have been developed. In spite of this the Fourier Spectrogram is by far the prime method for the analysis of signals whose spectral content is varying...

  3. COMPARATIVE ANALYSIS OF APPLICATION EFFICIENCY OF ORTHOGONAL TRANSFORMATIONS IN FREQUENCY ALGORITHMS FOR DIGITAL IMAGE WATERMARKING

    Directory of Open Access Journals (Sweden)

    Vladimir A. Batura

    2014-11-01

    The efficiency of applying orthogonal transformations in frequency-domain algorithms for digital watermarking of still images is examined. The discrete Hadamard transform, discrete cosine transform and discrete Haar transform are selected. Their effectiveness is determined by the invisibility of the watermark embedded in the digital image and by its resistance to the most common image processing operations: JPEG compression, noising, changes of brightness and image size, and histogram equalization. The watermarking algorithm and its embedding parameters remain unchanged across these orthogonal transformations. Imperceptibility of embedding is quantified by the peak signal-to-noise ratio, and watermark robustness by Pearson's correlation coefficient. Embedding is considered invisible if the peak signal-to-noise ratio is not less than 43 dB. The embedded watermark is considered resistant to a specific attack if the Pearson correlation coefficient is not less than 0.5. The Elham algorithm, based on image entropy, is chosen for the computing experiment. The experiment is carried out as follows: embedding of a digital watermark in the low-frequency area of the image (container) by the Elham algorithm, exposure of the protected image (cover image) to a harmful influence, and extraction of the digital watermark. These actions are followed by quality assessment of the cover image and the watermark, on the basis of which the efficiency of the orthogonal transformation is determined. The experiment showed that the choice among the specified orthogonal transformations, with identical algorithm and embedding parameters, does not influence the degree of imperceptibility of the watermark. The efficiency of the discrete Hadamard transform and the discrete cosine transform against the attacks chosen for the experiment was established from the correlation indicators. Application of the discrete Hadamard transform increases
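
    The two quality measures quoted in the abstract (PSNR for imperceptibility, Pearson correlation for robustness, with the 43 dB and 0.5 thresholds) are straightforward to compute. A minimal sketch, assuming 8-bit grayscale images; it is not the Elham algorithm itself:

        import numpy as np

        def psnr(cover, watermarked, peak=255.0):
            """Peak signal-to-noise ratio in dB between cover and watermarked image."""
            mse = np.mean((cover.astype(float) - watermarked.astype(float)) ** 2)
            return float('inf') if mse == 0 else 10 * np.log10(peak ** 2 / mse)

        def pearson(w_original, w_extracted):
            """Pearson correlation between the embedded and the extracted watermark."""
            a = w_original.astype(float).ravel()
            b = w_extracted.astype(float).ravel()
            return np.corrcoef(a, b)[0, 1]

        # Thresholds used in the study: embedding is treated as invisible if
        # PSNR >= 43 dB, and the watermark as robust to an attack if r >= 0.5.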

  4. The DECMU: a digital device for delayed analysis of multi-frequency eddy current signals

    International Nuclear Information System (INIS)

    Pigeon, Michel.

    1981-08-01

    A delayed data analysis system has been realized by the CEA and Intercontrole for in-service inspection of steam generators of nuclear plants by multifrequency eddy current testing. Outside the plant, this device allows adjustment while switching the probes, graph recording, and analysis for defect signal qualification. The equipment contains an analog mixing device, as in the IC3FA multi-frequency apparatus, but in addition has a memory allowing data cycling and signal isolation for adjustment or analysis [fr

  5. Low-power digital ASIC for on-chip spectral analysis of low-frequency physiological signals

    International Nuclear Information System (INIS)

    Nie Zedong; Zhang Fengjuan; Li Jie; Wang Lei

    2012-01-01

    A digital ASIC chip customized for battery-operated body sensing devices is presented. The ASIC incorporates a novel hybrid-architecture fast Fourier transform (FFT) unit that is capable of scalable spectral analysis, a licensed ARM7TDMI IP hardcore and several peripheral IP blocks. Extensive experimental results suggest that the complete chip works as intended. The power consumption of the FFT unit is 0.69 mW at 1 MHz with 1.8 V power supply. The low-power and programmable features of the ASIC make it suitable for ‘on-the-fly’ low-frequency physiological signal processing. (semiconductor integrated circuits)
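
    The spectral analysis performed by such an FFT unit can be illustrated in software; a minimal sketch with an assumed 250 Hz sampling rate and a synthetic low-frequency physiological-like signal (the ASIC itself uses a hybrid-architecture hardware FFT, which is not reproduced here):

        import numpy as np

        fs = 250.0                        # assumed sampling rate (Hz), illustrative
        t = np.arange(0, 10, 1 / fs)
        # Synthetic low-frequency signal: ~1.2 Hz pulse plus 0.25 Hz respiration.
        x = np.sin(2 * np.pi * 1.2 * t) + 0.5 * np.sin(2 * np.pi * 0.25 * t)

        spectrum = np.abs(np.fft.rfft(x * np.hanning(x.size)))
        freqs = np.fft.rfftfreq(x.size, d=1 / fs)
        peak = np.argmax(spectrum[1:]) + 1          # skip the DC bin
        print("dominant component near %.2f Hz" % freqs[peak])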

  6. Digital dynamic amplitude-frequency spectra analyzer

    International Nuclear Information System (INIS)

    Kalinnikov, V.A.

    2006-01-01

    The spectra analyzer is intended for the dynamic spectral analysis of signals from physical installations and for noise filtering. The recurrence Fourier transformation algorithm is used in the digital dynamic analyzer. It is realized on the basis of a fast-logic FPGA matrix and a special ADSP signal microprocessor. The discretization frequency is 2 kHz-10 MHz. The number of calculated spectral coefficients is not less than 512. The functional speed is 20 ns [ru

  7. Frequency characteristics of the laser film digitizer

    International Nuclear Information System (INIS)

    Ishimitsu, Y.; Taira, R.K.; Huang, H.K.

    1988-01-01

    The frequency characteristics of the laser film digitizer in the parallel and in the perpendicular scan direction are different. Because of this difference, moire pattern artifacts may appear in the digitized image. The authors found that this phenomenon is due to the frequency transfer characteristics of the various components in the laser film digitizer. From this observation, they derive a relationship between the spatial frequency content of the original image and the laser beam spot size based on the concept of image contrast. This relationship can be utilized to avoid the appearance of the moire pattern in the digitized image

  8. Investigation of genesis of gallop sounds in dogs by quantitative phonocardiography and digital frequency analysis.

    Science.gov (United States)

    Aubert, A E; Denys, B G; Meno, F; Reddy, P S

    1985-05-01

    Several investigators have noted external gallop sounds to be of higher amplitude than their corresponding internal sounds (S3 and S4). In this study we hoped to determine if S3 and S4 are transmitted in the same manner as S1. In 11 closed-chest dogs, external (apical) and left ventricular pressures and sounds were recorded simultaneously with transducers with identical sensitivity and frequency responses. Volume and pressure overload and positive and negative inotropic drugs were used to generate gallop sounds. Recordings were made in the control state and after the various interventions. S3 and S4 were recorded in 17 experiments each. The amplitude of the external S1 was uniformly higher than that of internal S1 and internal gallop sounds were inconspicuous. With use of Fourier transforms, the gain function was determined by comparing internal to external S1. By inverse transform, the amplitude of the internal gallop sounds was predicted from external sounds. The internal sounds of significant amplitude were predicted in many instances, but the actual recordings showed no conspicuous sounds. The absence of internal gallop sounds of expected amplitude as calculated from the external gallop sounds and the gain function derived from the comparison of internal and external S1 make it very unlikely that external gallop sounds are derived from internal sounds.
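
    The transfer-function idea in this abstract (estimate a gain function from paired internal and external S1 recordings, then predict the internal gallop sound from the external one by inverse transform) can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code; the eps regularization is an added assumption to avoid division by near-zero frequency bins.

        import numpy as np

        def predict_internal(ext_s1, int_s1, ext_gallop, eps=1e-6):
            """Estimate the transmission gain from paired S1 recordings and use
            it to predict the internal gallop sound from the external one."""
            n = max(len(ext_s1), len(int_s1), len(ext_gallop))
            ES1 = np.fft.rfft(ext_s1, n)
            IS1 = np.fft.rfft(int_s1, n)
            gain = IS1 / (ES1 + eps)              # internal / external, per frequency bin
            EG = np.fft.rfft(ext_gallop, n)
            return np.fft.irfft(EG * gain, n)     # predicted internal gallop waveform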

  9. Model-based analysis of digital radio frequency control systems for a heavy-ion synchrotron

    International Nuclear Information System (INIS)

    Spies, Christopher

    2013-12-01

    In this thesis, we investigate the behavior of different radio frequency control systems in a heavy-ion synchrotron, which act on the electrical fields used to accelerate charged particles, along with the longitudinal dynamics of the particles in the beam. Due to the large physical dimensions of the system, the required precision can only be achieved by a distributed control system. Since the plant is highly nonlinear and the overall system is very complex, a purely analytical treatment is not possible without introducing unacceptable simplifications. Instead, we use numerical simulation to investigate the system behavior. This thesis arises from a cooperation between the Institute of Microelectronic Systems at Technische Universitaet Darmstadt and the GSI Helmholtz Center for Heavy-Ion Research. A new heavy-ion synchrotron, the SIS100, is currently being built at GSI; its completion is scheduled for 2016. The starting point for the present thesis was the question whether a control concept previously devised at GSI is feasible - not only in the ideal case, but in the presence of parameter deviations, noise, and other disturbances - and how it can be optimized. In this thesis, we present a system model of a heavy-ion synchrotron. This model comprises the beam dynamics, the relevant components of the accelerator, and the relevant controllers as well as the communication between those controllers. We discuss the simulation techniques as well as several simplifications we applied in order to be able to simulate the model in an acceptable amount of time and show that these simplifications are justified. Using the model, we conducted several case studies in order to demonstrate the practical feasibility of the control concept, analyze the system's sensitivity towards disturbances and explore opportunities for future extensions. We derive specific suggestions for improvements from our results. Finally, we demonstrate that the model represents the physical reality

  10. Digital Fourier analysis fundamentals

    CERN Document Server

    Kido, Ken'iti

    2015-01-01

    This textbook is a thorough, accessible introduction to digital Fourier analysis for undergraduate students in the sciences. Beginning with the principles of sine/cosine decomposition, the reader walks through the principles of discrete Fourier analysis before reaching the cornerstone of signal processing: the Fast Fourier Transform. Saturated with clear, coherent illustrations, "Digital Fourier Analysis - Fundamentals" includes practice problems and thorough Appendices for the advanced reader. As a special feature, the book includes interactive applets (available online) that mirror the illustrations.  These user-friendly applets animate concepts interactively, allowing the user to experiment with the underlying mathematics. For example, a real sine signal can be treated as a sum of clockwise and counter-clockwise rotating vectors. The applet illustration included with the book animates the rotating vectors and the resulting sine signal. By changing parameters such as amplitude and frequency, the reader ca...

  11. Digital Filters for Low Frequency Equalization

    DEFF Research Database (Denmark)

    Tyril, Marni; Abildgaard, J.; Rubak, Per

    2001-01-01

    Digital filters with high resolution in the low-frequency range are studied. Specifically, for a given computational power, traditional IIR filters are compared with warped FIR filters, warped IIR filters, and modified warped FIR filters termed warped individual z FIR filters (WizFIR). The results...

  12. Fast Hopping Frequency Generation in Digital CMOS

    CERN Document Server

    Farazian, Mohammad; Gudem, Prasad S

    2013-01-01

    Overcoming the agility limitations of conventional frequency synthesizers in multi-band OFDM ultra wideband is a key research goal in digital technology. This volume outlines a frequency plan that can generate all the required frequencies from a single fixed frequency, able to implement center frequencies with no more than two levels of SSB mixing. It recognizes the need for future synthesizers to bypass on-chip inductors and operate at low voltages to enable the increased integration and efficiency of networked appliances. The author examines in depth the architecture of the dividers that generate the necessary frequencies from a single base frequency and are capable of establishing a fractional division ratio. Presenting the first CMOS inductorless single PLL 14-band frequency synthesizer for MB-OFDM UWB makes this volume a key addition to the literature, and with the synthesizer capable of arbitrary band-hopping in less than two nanoseconds, it operates well within the desired range on a 1.2-volt power s...

  13. Digital image analysis

    DEFF Research Database (Denmark)

    Riber-Hansen, Rikke; Vainer, Ben; Steiniche, Torben

    2012-01-01

    Digital image analysis (DIA) is increasingly implemented in histopathological research to facilitate truly quantitative measurements, decrease inter-observer variation and reduce hands-on time. Originally, efforts were made to enable DIA to reproduce manually obtained results on histological slides...... reproducibility, application of stereology-based quantitative measurements, time consumption, optimization of histological slides, regions of interest selection and recent developments in staining and imaging techniques....

  14. Frequency Response Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)]; Kosterev, Dmitry [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)]; Dai, T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)]

    2014-12-01

    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of North American Electricity Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report is prepared to describe the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating NERC Frequency Response Survey (FRS) forms required by BAL-003-1 Standard.

  15. Cryocooled wideband digital channelizing radio-frequency receiver based on low-pass ADC

    International Nuclear Information System (INIS)

    Vernik, Igor V; Kirichenko, Dmitri E; Dotsenko, Vladimir V; Miller, Robert; Webber, Robert J; Shevchenko, Pavel; Talalaevskii, Andrei; Gupta, Deepnarayan; Mukhanov, Oleg A

    2007-01-01

    We have demonstrated a digital receiver performing direct digitization of radio-frequency signals over a wide frequency range from kilohertz to gigahertz. The complete system, consisting of a cryopackaged superconductor all-digital receiver (ADR) chip followed by room-temperature interface electronics and a field programmable gate array (FPGA) based post-processing module, has been developed. The ADR chip comprises a low-pass analog-to-digital converter (ADC) delta modulator with phase modulation-demodulation architecture together with digital in-phase and quadrature mixer and a pair of digital decimation filters. The chip is fabricated using a 4.5 kA/cm² process and is cryopackaged using a commercial-off-the-shelf cryocooler. Experimental results in HF, VHF, UHF and L bands and their analysis, proving consistent operation of the cryopackaged ADR chip up to 24.32 GHz clock frequency, are presented and discussed

  16. Single Frequency Networks (SFN) in Digital Terrestrial Broadcasting

    Directory of Open Access Journals (Sweden)

    V. Ricny

    2007-12-01

    The paper deals with principles and properties of single frequency networks of digital television and radio transmitters. Basic definitions and contextual relationships (guard interval, area of an SFN, influence of the modulation parameters used, etc.) are explained.

  17. Intermediate Frequency Digital Receiver Based on Multi-FPGA System

    Directory of Open Access Journals (Sweden)

    Chengchang Zhang

    2016-01-01

    To address the high cost, large size and inflexibility of the traditional analog intermediate frequency receiver in the aerospace telemetry, tracking, and command (TTC) system, we propose a new intermediate frequency (IF) digital receiver based on a Multi-FPGA system in this paper. Digital beam forming (DBF) is realized by the coordinate rotation digital computer (CORDIC) algorithm. An experimental prototype has been developed on a compact Multi-FPGA system with three FPGAs to receive 16 channels of IF digital signals. Our experimental results show that the proposed scheme provides great convenience for the design of an IF digital receiver and offers a valuable reference for real-time, low-power, high-density, small-size receiver design.

  18. Performance Analysis of Digital loudspeaker Arrays

    DEFF Research Database (Denmark)

    Pedersen, Bo Rohde; Kontomichos, Fotios; Mourjopoulos, John

    2008-01-01

    An analysis of digital loudspeaker arrays shows that the ways in which bits are mapped to the drivers influence the quality of the audio result. Specifically, a "bit-summed" rather than the traditional "bit-mapped" strategy greatly reduces the number of times drivers make binary transitions per...... period of the input frequency. Detailed simulations compare the results for a 32-loudspeaker array with a similar configuration with analog excitation of the drivers. Ideally, drivers in digital arrays should be very small and span a small area, but that sets limits on the low-frequency response...

  19. Frequency Usage and Digital Dividend in India

    DEFF Research Database (Denmark)

    Patil, Kishor P.; Prasad, Ramjee; Skouby, Knud Erik

    2013-01-01

    , which could be the solution for the spectrum requirement for mobile telecom services in India. The views of the different stakeholders about 700 MHz band plan is presented. Finally, the two harmonized frequency arrangement for IMT systems agreed by the Asia Pacific Telecommunity (APT) for 700 MHz band...

  20. Resonance frequency analysis

    Directory of Open Access Journals (Sweden)

    Rajiv K Gupta

    2011-01-01

    Initial stability at placement and the development of osseointegration are two major issues for implant survival. Implant stability is a mechanical phenomenon which is related to the local bone quality and quantity, type of implant, and placement technique used. The application of a simple, clinically applicable, non-invasive test to assess implant stability and osseointegration is considered highly desirable. Resonance frequency analysis (RFA) is one such technique, and it is the most frequently used nowadays. The aim of this paper was to review and critically analyze the currently available literature in the field of RFA, and also to discuss, based on scientific evidence, the prognostic value of RFA to detect implants at risk of failure. A search was made using the PubMed database to find all the literature published on "Resonance frequency analysis for implant stability" to date. Articles discussing in vivo or in vitro studies comparing RFA with other methods of implant stability measurement and articles discussing its reliability were thoroughly reviewed and discussed. A limited number of clinical reports were found. Various studies have demonstrated the feasibility and predictability of the technique. However, most of these articles are based on retrospective data or uncontrolled cases. Randomized, prospective, parallel-armed longitudinal human trials are based on short-term results, and long-term follow-up data are still scarce in this field. Nonetheless, from the available literature, it may be concluded that the RFA technique evaluates implant stability as a function of the stiffness of the implant-bone interface and is influenced by factors such as bone type and exposed implant height above the alveolar crest. Resonance frequency analysis could serve as a non-invasive diagnostic tool for detecting the stability of dental implants during the healing stages and in subsequent routine follow-up care after treatment. Future studies, preferably randomized

  1. Frequency to digital converter for IUAC Linac control system

    International Nuclear Information System (INIS)

    Jain, Mamta; Subramaiam, E.T.; Sahu, B.K.

    2015-01-01

    A frequency to digital converter CAMAC module has been designed and developed for LINAC control systems. This module is used to observe digitally the frequency difference between the master clock and the resonator frequency without using an oscilloscope. Later this can be used for automatic tuning and locking of the cavities using piezoelectric actuator based tuner control. The module has eight independent channels to fulfil the need of all eight cavities of the cryostat. A Schmitt trigger along with a level converter accepts almost any form of pulse train, with 30 Vp-p. The time period is measured by counters clocked from a high resolution clock (10 MHz +/- 250 ps). The counter values are cross-checked at both the input levels. Frequency is obtained from the computed time period by a special divisor core implemented inside the FPGA. The major task was the implementation of eight individual divisor cores and routing inside one Spartan 3s500E FPGA chip
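
    The measurement principle described here (time the input period with a 10 MHz reference counter, then divide to obtain frequency) can be written down behaviourally. This is only a sketch of the arithmetic, not the FPGA divisor core itself, and the example count is invented:

        F_REF = 10e6          # reference clock of the counter, as in the abstract

        def frequency_from_counts(ref_counts_per_period):
            """Convert a reference-clock count accumulated over one input period
            into the input frequency: f_in = F_REF / N."""
            if ref_counts_per_period == 0:
                raise ValueError("no reference edges counted")
            return F_REF / ref_counts_per_period

        # Example: 102 reference ticks per input period -> about 98.04 kHz input.
        print(frequency_from_counts(102))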

  2. Digital intermediate frequency QAM modulator using parallel processing

    Science.gov (United States)

    Pao, Hsueh-Yuan [Livermore, CA]; Tran, Binh-Nien [San Ramon, CA]

    2008-05-27

    The digital Intermediate Frequency (IF) modulator applies to various modulation types and offers a simple and low cost method to implement a high-speed digital IF modulator using field programmable gate arrays (FPGAs). The architecture eliminates multipliers and sequential processing by storing the pre-computed modulated cosine and sine carriers in ROM look-up-tables (LUTs). The high-speed input data stream is parallel processed using the corresponding LUTs, which reduces the main processing speed, allowing the use of low cost FPGAs.
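
    A software sketch of the LUT idea: the modulated cosine and sine carriers are pre-computed per amplitude level, so the run-time path needs only look-ups and additions. The parameters below (16-QAM levels, four samples per symbol, an fs/4 intermediate frequency) are illustrative assumptions, not the patented design:

        import numpy as np

        LEVELS = np.array([-3, -1, 1, 3], dtype=float)          # 16-QAM amplitude levels
        SAMPLES_PER_SYMBOL = 4
        phase = 2 * np.pi * np.arange(SAMPLES_PER_SYMBOL) / SAMPLES_PER_SYMBOL  # IF = fs/4

        # Pre-computed "modulated carrier" LUTs: one row per amplitude level, so the
        # run-time datapath needs only table look-ups and one addition per sample.
        COS_LUT = LEVELS[:, None] * np.cos(phase)[None, :]
        SIN_LUT = LEVELS[:, None] * np.sin(phase)[None, :]

        def qam_if_modulate(i_idx, q_idx):
            """i_idx, q_idx: integer level indices (0..3), one pair per symbol."""
            out = [COS_LUT[i] - SIN_LUT[q] for i, q in zip(i_idx, q_idx)]
            return np.concatenate(out)

        print(qam_if_modulate([0, 3], [1, 2]))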

  3. Digital control of high-frequency switched-mode power converters

    CERN Document Server

    Corradini, Luca; Mattavelli, Paolo; Zane, Regan

    This book is focused on the fundamental aspects of analysis, modeling and design of digital control loops around high-frequency switched-mode power converters in a systematic and rigorous manner. It offers a comprehensive treatment of digital control theory for power converters; Verilog and VHDL sample codes are provided. It enables readers to successfully analyze, model, design, and implement voltage, current, or multi-loop digital feedback loops around switched-mode power converters. Practical examples are used throughout the book to illustrate applications of the techniques developed. Matlab examples are also ...

  4. Frequencies of digits, divergence points, and Schmidt games

    International Nuclear Information System (INIS)

    Olsen, L.

    2009-01-01

    Sets of divergence points, i.e. numbers x (or tuples of numbers) for which the limiting frequency of a given string of N-adic digits of x fails to exist, have recently attracted huge interest in the literature. In this paper we consider sets of simultaneous divergence points, i.e. numbers x (or tuples of numbers) for which the limiting frequencies of all strings of N-adic digits of x fail to exist. We show that many natural sets of simultaneous divergence points are (α, β)-winning sets in the sense of the Schmidt game. As an application we obtain lower bounds for the Hausdorff dimension of these sets.

  5. Digital system to monitor the natural frequency of mechanical resonators

    International Nuclear Information System (INIS)

    Brengartner, Tobias; Siegel, Michael; Urban, Martin; Monse, Benjamin; Frühauf, Dietmar

    2013-01-01

    Mechanical resonators are often used in process or condition monitoring. They are used for liquid-level limit detection or for viscosity and density sensing. For such purposes, the resonator is preferably actuated at its natural frequency. In industrial applications, this is achieved by analogue closed resonant circuits. These circuits have been established because of their low energy consumption and low component costs. With the continuing progress of microprocessors, digital systems are now an interesting alternative and can achieve better results than analogue realizations. In this context, this paper presents a novel digital system for monitoring the natural frequency of mechanical resonators. The system is realized with newly developed algorithms and is based on a simple signal processing procedure with minimum computational cost. This allows the use of a low-power microcontroller, thus making the system interesting for industrial use. It is shown that the natural frequency can be measured so as to meet the high industrial requirements on reliability, speed and accuracy, combined with the possibility of reduced energy consumption. (paper)
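
    A natural-frequency estimate with minimal computational cost, in the spirit of the low-power microcontroller mentioned, can be built on zero-crossing counting. The paper's own algorithms are not given in this record, so the sketch below is only a generic stand-in with invented parameters:

        import numpy as np

        def natural_frequency(signal, fs):
            """Estimate the dominant oscillation frequency from positive-going
            zero crossings of a resonator response sampled at fs (Hz)."""
            x = signal - np.mean(signal)
            crossings = np.where((x[:-1] < 0) & (x[1:] >= 0))[0]
            if len(crossings) < 2:
                return None
            periods = np.diff(crossings) / fs
            return 1.0 / np.mean(periods)

        fs = 10_000.0
        t = np.arange(0, 1, 1 / fs)
        print(natural_frequency(np.sin(2 * np.pi * 440 * t), fs))   # ~440 Hz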

  6. A digital instantaneous frequency measurement technique utilising high speed analogue to digital converters and field programmable gate arrays

    CSIR Research Space (South Africa)

    Herselman, PLR

    2007-09-01

    In modern information and sensor systems, the timely estimation of the carrier frequency of received signals is of critical importance. This paper presents a digital instantaneous frequency measurement (DIFM) technique, which can measure the carrier...
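
    A common digital route to instantaneous frequency is phase differencing of consecutive complex samples. The sketch below illustrates that general idea only and is not the CSIR implementation; the sample rate and test tone are invented:

        import numpy as np

        def difm_estimate(iq, fs):
            """Instantaneous-frequency estimate from consecutive complex samples:
            f = fs * angle(x[n] * conj(x[n-1])) / (2*pi), averaged over the pulse."""
            dphi = np.angle(iq[1:] * np.conj(iq[:-1]))
            return fs * np.mean(dphi) / (2 * np.pi)

        fs = 100e6
        n = np.arange(256)
        pulse = np.exp(2j * np.pi * 17.3e6 / fs * n)     # 17.3 MHz test tone
        print(difm_estimate(pulse, fs) / 1e6, "MHz")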

  7. High-Speed Universal Frequency-to-Digital Converter for Quasi-Digital Sensors and Transducers

    Directory of Open Access Journals (Sweden)

    Sergey Y. Yurish

    2007-06-01

    A new fast, accurate universal integrated frequency-to-digital converter (UFDC-1M-16) is described in the article. It is based on the novel patented modified method of the dependent count and has a non-redundant conversion time from 6.25 μs to 6.25 ms for 1 to 0.001 % relative errors respectively, comparable with the conversion time of successive-approximation and Σ-Δ ADCs. The IC can work with different sensors, transducers and encoders, which have frequency, period, duty-cycle, PWM, phase-shift, pulse-number, etc. outputs.

  8. Digital image sequence processing, compression, and analysis

    CERN Document Server

    Reed, Todd R

    2004-01-01

    Contents: Introduction (Todd R. Reed); Content-Based Image Sequence Representation (Pedro M. Q. Aguiar, Radu S. Jasinschi, José M. F. Moura, and Charnchai Pluempitiwiriyawej); The Computation of Motion (Christoph Stiller, Sören Kammel, Jan Horn, and Thao Dang); Motion Analysis and Displacement Estimation in the Frequency Domain (Luca Lucchese and Guido Maria Cortelazzo); Quality of Service Assessment in New Generation Wireless Video Communications (Gaetano Giunta); Error Concealment in Digital Video (Francesco G.B. De Natale); Image Sequence Restoration: A Wider Perspective (Anil Kokaram); Video Summarization (Cuneyt M. Taskiran and Edward ...)

  9. Digital instantaneous frequency measurement technique utilising high-speed ADC’s and FPGA’s

    CSIR Research Space (South Africa)

    Herselman, PL

    2006-02-27

    This paper presents the Digital Instantaneous Frequency Measurement (DIFM) technique, which can measure the carrier frequency of a received waveform within a fraction of a microsecond. The resulting frequency range, resolution and accuracy...

  10. A Digital Analysis Of The Reported Earnings Of Asian Firms

    OpenAIRE

    Kathy H.Y. Hsu; Thomas E. Wilson, Jr.

    2011-01-01

    Prior research (Carslaw, 1988; Thomas, 1989) has noted unusual patterns in the frequency of occurrence of certain digits contained in reported earnings. Employing digital analysis, studies have found that managers in the U.S. and Australia may round reported earnings numbers to achieve income-smoothing objectives. This study extends prior literature by examining whether reported earnings of firms from six Asian countries: South Korea, Malaysia, Philippines, Singapore, Thailand and China follo...
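
    Digital analysis of reported earnings typically compares observed digit frequencies with Benford-type expectations (for example, a deficit of 9s and an excess of 0s in the second position is taken as a sign of rounding-up). A hedged sketch of such a second-digit test, not the authors' exact procedure:

        import math
        from collections import Counter

        def second_digit_frequencies(earnings):
            """Observed frequency of the second significant digit of each figure."""
            seconds = []
            for value in earnings:
                digits = str(abs(int(value)))
                if len(digits) >= 2:
                    seconds.append(int(digits[1]))
            counts = Counter(seconds)
            total = sum(counts.values())
            return {d: counts.get(d, 0) / total for d in range(10)}

        def benford_second_digit(d):
            """Benford's expected probability of d as the second significant digit."""
            return sum(math.log10(1 + 1 / (10 * k + d)) for k in range(1, 10))

        observed = second_digit_frequencies([9_990_000, 10_200_000, 4_970_000, 7_010_000])
        print(observed)
        print({d: round(benford_second_digit(d), 3) for d in range(10)})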

  11. Digital image analysis of NDT radiographs

    International Nuclear Information System (INIS)

    Graeme, W.A. Jr.; Eizember, A.C.; Douglass, J.

    1989-01-01

    Prior to the introduction of Charge Coupled Device (CCD) detectors, the majority of image analysis performed on NDT radiographic images was done visually in the analog domain. While some film digitization was being performed, the process was often unable to capture all the usable information on the radiograph or was too time-consuming. CCD technology now provides a method to digitize radiographic film images in a timely manner without losing the useful information captured in the original radiograph. Incorporating that technology into a complete digital radiographic workstation allows analog radiographic information to be processed, providing additional information to the radiographer. Once in the digital domain, the data can be stored and fused with radioscopic and other forms of digital data. The result is more productive analysis and management of radiographic inspection data. The principal function of the NDT Scan IV digital radiography system is the digitization, enhancement and storage of radiographic images

  12. Digitizing and analysis of neutron generator waveforms

    International Nuclear Information System (INIS)

    Bryant, T.C.

    1977-11-01

    All neutron generator waveforms from units tested at the SLA neutron generator test site are digitized and the digitized data stored in the CDC 6600 tape library for display and analysis using the CDC 6600 computer. The digitizing equipment consists mainly of seven Biomation Model 8100 transient recorders, Digital Equipment Corporation PDP 11/20 computer, RK05 disk, seven-track magnetic tape transport, and appropriate DEC and SLA controllers and interfaces. The PDP 11/20 computer is programmed in BASIC with assembly language drivers. In addition to digitizing waveforms, this equipment is used for other functions such as the automated testing of multiple-operation electronic neutron generators. Although other types of analysis have been done, the largest use of the digitized data has been for various types of graphical displays using the CDC 6600 and either the SD4020 or DX4460 plotters

  13. Column: The Science of Digital Forensics: Analysis of Digital Traces

    Directory of Open Access Journals (Sweden)

    Fred Cohen

    2012-09-01

    In part 1 of this series (Cohen, 2011a), ... Analysis of digital traces is a foundational process by which the examiner, typically using computer software tools, comes to understand and answer basic questions regarding digital traces. "Input sequences to digital systems produce outputs and state changes as a function of the previous state. To the extent that the state or outputs produce stored and/or captured bit sequences, these form traces of the event sequences that caused them. Thus the definition of a trace may be stated as: 'A set of bit sequences produced from the execution of a finite state machine.'" (see PDF for full column)

  14. Digital Trade Infrastructures: A Framework for Analysis

    Directory of Open Access Journals (Sweden)

    Boriana Boriana

    2018-04-01

    In global supply chains, information about transactions resides in fragmented pockets within business and government systems. The lack of reliable, accurate and complete information makes it hard to detect risks (such as safety, security, compliance and commercial risks and at the same time makes international trade inefficient. The introduction of digital infrastructures that transcend organizational and system domains is driven by the prospect of reducing the fragmentation of information, thereby enabling improved security and efficiency in the trading process. This article develops a digital trade infrastructure framework through an empirically grounded analysis of four digital infrastructures in the trade domain, using the conceptual lens of digital infrastructure.

  15. Interdisciplinary analysis of digital government work

    DEFF Research Database (Denmark)

    Scholl, Hans J.; Mai, Jens Erik; Fidel, Raya

    2006-01-01

    This bird-of-a-feather session attempts to break interdisciplinary ground in the context of work content, workflow, and work context analysis in Digital Government. The authors argue that using and connecting multiple theories and disciplines might yield more robust results and deeper understanding...... of the Digital Government evolution than strictly disciplinary research....

  16. Digital timing: sampling frequency, anti-aliasing filter and signal interpolation filter dependence on timing resolution

    International Nuclear Information System (INIS)

    Cho, Sanghee; Grazioso, Ron; Zhang Nan; Aykac, Mehmet; Schmand, Matthias

    2011-01-01

    The main focus of our study is to investigate how the performance of digital timing methods is affected by sampling rate, anti-aliasing and signal interpolation filters. We used the Nyquist sampling theorem to address some basic questions, such as: what are the minimum sampling frequencies? How accurate will the signal interpolation be? How do we validate the timing measurements? The preferred sampling rate would be as low as possible, considering the high cost and power consumption of high-speed analog-to-digital converters. However, when the sampling rate is too low, due to the aliasing effect, some artifacts are produced in the timing resolution estimations; the shape of the timing profile is distorted and the FWHM values of the profile fluctuate as the source location changes. Anti-aliasing filters are required in this case to avoid the artifacts, but the timing is degraded as a result. When the sampling rate is marginally over the Nyquist rate, a proper signal interpolation is important. A sharp roll-off (higher order) filter is required to separate the baseband signal from its replicates to avoid the aliasing, but in return the computational load will be higher. We demonstrated the analysis through a digital timing study using fast LSO scintillation crystals as used in time-of-flight PET scanners. From the study, we observed that there is no significant timing resolution degradation down to 1.3 GHz sampling frequency, and the computation requirement for the signal interpolation is reasonably low. A so-called sliding test is proposed as a validation tool, checking for constant timing resolution behavior of a given timing pick-off method regardless of the source location change. Lastly, the performance comparison for several digital timing methods is also shown.
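
    The processing chain discussed (sample the pulse, band-limit interpolate it, pick off a timing point) can be sketched as follows. The pulse shape, threshold and 1.3 GHz rate are illustrative, and a simple leading-edge pick-off stands in for the timing methods actually compared in the paper:

        import numpy as np

        def fft_interpolate(x, factor):
            """Band-limited interpolation by zero-padding the FFT spectrum."""
            n = len(x)
            spec = np.fft.rfft(x)
            padded = np.zeros(n * factor // 2 + 1, dtype=complex)
            padded[:spec.size] = spec
            return np.fft.irfft(padded, n * factor) * factor

        def timing_pickoff(samples, fs, threshold, upsample=8):
            """Leading-edge timing: time (s) of the first threshold crossing
            of the interpolated pulse."""
            fine = fft_interpolate(samples, upsample)
            dt = 1.0 / (fs * upsample)
            above = np.nonzero(fine >= threshold)[0]
            if above.size == 0 or above[0] == 0:
                return None
            i = above[0]
            frac = (threshold - fine[i - 1]) / (fine[i] - fine[i - 1])
            return (i - 1 + frac) * dt

        fs = 1.3e9                                       # near the rate found adequate above
        t = np.arange(0, 100e-9, 1 / fs)
        pulse = np.exp(-t / 40e-9) - np.exp(-t / 7e-9)   # crude LSO-like pulse shape
        print(timing_pickoff(pulse, fs, threshold=0.2))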

  17. Digital repeat analysis; setup and operation.

    Science.gov (United States)

    Nol, J; Isouard, G; Mirecki, J

    2006-06-01

    Since the emergence of digital imaging, there have been questions about the necessity of continuing reject analysis programs in imaging departments to evaluate performance and quality. As a marketing strategy, most suppliers of digital technology focus on the supremacy of the technology and its ability to reduce the number of repeats, resulting in lower radiation doses to patients and increased productivity in the department. On the other hand, quality assurance radiographers and radiologists believe that repeats are mainly related to positioning skills, and repeat analysis is the main tool to plan training needs to up-skill radiographers. A comparative study between conventional and digital imaging was undertaken to compare outcomes and evaluate the need for reject analysis. However, with digital technology still at an early stage of development, setting up a credible reject analysis program became the major task of the study. It took the department, with the help of the suppliers of the computed radiography reader and the picture archiving and communication system, over 2 years of software enhancement to build a reliable digital repeat analysis system. The results were supportive of both philosophies; the number of repeats as a result of exposure factors was reduced dramatically; however, the percentage of repeats as a result of positioning skills was slightly on the increase, for the simple reason that some rejects in the conventional system qualifying for both exposure and positioning errors were classified as exposure errors. The ability to digitally adjust dark or light images reclassified some of those images as positioning errors.

  18. Digital methods for mediated discourse analysis

    DEFF Research Database (Denmark)

    Kjær, Malene; Larsen, Malene Charlotte

    2015-01-01

    , restrictions or privately mediated settings. Having used mediated discourse analysis (Scollon 2002, Scollon & Scollon, 2004) as a framework in two different research projects, we show how the framework, in correlation with digital resources for data gathering, provides new understandings of 1) the daily......In this paper we discuss methodological strategies for collecting multimodal data using digital resources. The aim is to show how digital resources can provide ethnographic insights into mediated actions (Scollon, 2002) that can otherwise be difficult to observe or engage in, due to, for instance......) and online questionnaire data in order to capture mediated actions and discourses in practice....

  19. Digital divide and digital opportunity: Comparison, analysis and strategies for sustainable development in developing nations

    International Nuclear Information System (INIS)

    Bhunia, C.T.; Onime, C.

    2007-07-01

    The world is witnessing a new digital economic order which may be quantified by the diffusion of information technology and the globalization process. The current information technology gap (digital divide) between developed countries and developing countries is huge. Improvements in information technology (measured by the digital opportunity index) usually open up an opportunity for national/regional growth and development. There is a need for scientific investigation of the digital divide, the digital opportunity index and their consequences. This paper presents a critical analysis of the existing digital divide and its trends, and investigates the relationship between the digital divide and the digital opportunity index. A mathematical model based on analysis of the growing digital divide is presented as a possible tool for combating and eradicating the digital divide gap, which is only possible if developing and poor nations take advantage of the digital opportunities that can transform them into globally competitive partners in the digital knowledge economy. (author)

  20. Frequency-Dependent Blanking with Digital Linear Chirp Waveform Synthesis

    Energy Technology Data Exchange (ETDEWEB)

    Doerry, Armin Walter [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Andrews, John M. [General Atomics Aeronautical Systems, Inc., San Diego, CA (United States)]

    2014-07-01

    Wideband radar systems, especially those that operate at lower frequencies such as VHF and UHF, are often restricted from transmitting within or across specific frequency bands in order to prevent interference to other spectrum users. Herein we describe techniques for notching the transmitted spectrum of a generated and transmitted radar waveform. The notches are fully programmable as to their location, and techniques are given that control the characteristics of the notches.
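
    One way to realize programmable spectral notches in a digitally synthesized linear FM waveform is to zero (or taper) the restricted bins of the chirp spectrum. The sketch below illustrates that idea only and is not the waveform generator described in the report; all parameters are invented:

        import numpy as np

        fs = 200e6                       # DAC sample rate (illustrative)
        T, B = 20e-6, 50e6               # pulse length and swept bandwidth
        t = np.arange(0, T, 1 / fs)
        chirp = np.exp(1j * np.pi * (B / T) * t ** 2)   # linear FM sweeping 0..B Hz

        # Suppress a protected band, e.g. 18-22 MHz, by zeroing those FFT bins.
        spec = np.fft.fft(chirp)
        freqs = np.fft.fftfreq(chirp.size, d=1 / fs)
        spec[np.abs(freqs - 20e6) <= 2e6] = 0.0
        notched_chirp = np.fft.ifft(spec)
        # A practical generator would taper the notch edges to control the
        # time-domain ripple this hard cut introduces.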

  1. A Paradigmatic Analysis of Digital Application Marketplaces

    DEFF Research Database (Denmark)

    Ghazawneh, Ahmad; Henfridsson, Ola

    2015-01-01

    This paper offers a paradigmatic analysis of digital application marketplaces for advancing information systems research on digital platforms and ecosystems. We refer to the notion of digital application marketplace, colloquially called ‘appstores,’ as a platform component that offers a venue...... for exchanging applications between developers and end users belonging to a single or multiple ecosystems. Such marketplaces exhibit diversity in features and assumptions, and we propose that examining this diversity, and its ideal types, will help us to further understand the relationship between application...... marketplaces, platforms, and platform ecosystems. To this end, we generate a typology that distinguishes four kinds of digital application marketplaces: closed, censored, focused, and open marketplaces. The paper also offers implications for actors wishing to make informed decisions about their relationship...

  2. Digitizing high frequency signals using serial analog memories

    International Nuclear Information System (INIS)

    Coonrod, J.W.

    1975-10-01

    An online computer system has been developed as a replacement for oscilloscopes and cameras on the Tormac project. Up to 32 simultaneous waveforms are recorded at up to 2 MHz in analog shift registers, then digitized sequentially after the event into a small PDP-11 computer. Data and functions of data may be displayed or plotted locally, and then forwarded for storage at a larger, remote computer via a network arrangement. Advantages over scopes have been lower incremental cost (approximately $200/channel), less noise pickup, better resolution (less than 1%), and immediate presentation of data

  3. Digital PIV (DPIV) Software Analysis System

    Science.gov (United States)

    Blackshire, James L.

    1997-01-01

    A software package was developed to provide a Digital PIV (DPIV) capability for NASA LaRC. The system provides an automated image capture, test correlation, and autocorrelation analysis capability for the Kodak Megaplus 1.4 digital camera system for PIV measurements. The package includes three separate programs that, when used together with the PIV data validation algorithm, constitutes a complete DPIV analysis capability. The programs are run on an IBM PC/AT host computer running either Microsoft Windows 3.1 or Windows 95 using a 'quickwin' format that allows simple user interface and output capabilities to the windows environment.
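
    The core of most digital PIV analyses is an FFT-based cross-correlation of interrogation windows. Below is a minimal sketch of that step (integer-pixel displacement only, with a synthetic test); it is not the LaRC package itself:

        import numpy as np

        def piv_displacement(window_a, window_b):
            """Integer-pixel displacement between two interrogation windows from
            the peak of their FFT-based cross-correlation (the peak gives the
            shift that carries window_a onto window_b)."""
            a = window_a - np.mean(window_a)
            b = window_b - np.mean(window_b)
            corr = np.fft.ifft2(np.conj(np.fft.fft2(a)) * np.fft.fft2(b)).real
            peak = np.unravel_index(np.argmax(corr), corr.shape)
            dy, dx = [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]
            return dx, dy

        rng = np.random.default_rng(0)
        a = rng.random((32, 32))
        b = np.roll(a, (3, 5), axis=(0, 1))     # "particles" moved 5 px right, 3 px down
        print(piv_displacement(a, b))           # -> (5, 3)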

  4. Digital low level rf control system with four different intermediate frequencies for the International Linear Collider

    Science.gov (United States)

    Wibowo, Sigit Basuki; Matsumoto, Toshihiro; Michizono, Shinichiro; Miura, Takako; Qiu, Feng; Liu, Na

    2017-09-01

    A field programmable gate array-based digital low level rf (LLRF) control system will be used in the International Linear Collider (ILC) in order to satisfy the rf stability requirements. The digital LLRF control system with four different intermediate frequencies has been developed to decrease the required number of analog-to-digital converters in this system. The proof of concept of this technique was demonstrated at the Superconducting RF Test Facility in the High Energy Accelerator Research Organization, Japan. The amplitude and phase stability has fulfilled the ILC requirements.

  5. Recent developments in time-frequency analysis

    CERN Document Server

    Loughlin, Patrick

    1998-01-01

    Recent Developments in Time-Frequency Analysis brings together in one place important contributions and up-to-date research results in this fast moving area. Recent Developments in Time-Frequency Analysis serves as an excellent reference, providing insight into some of the most challenging research issues in the field.

  6. Dental Videographic Analysis using Digital Age Media.

    Science.gov (United States)

    Agarwal, Anirudh; Seth, Karan; Parmar, Siddharaj; Jhawar, Rahul

    2016-01-01

    The aim of this study was to evaluate a new method of smile analysis using videographic and photographic software (in this study, Photoshop Elements X and Windows Movie Maker 2012) as primary assessment tools and to develop an index for malocclusion and treatment planning that could be used in assessing the severity of malocclusion. Agarwal A, Seth K, Parmar S, Jhawar R. Dental Videographic Analysis using Digital Age Media. Int J Clin Pediatr Dent 2016;9(4):355-363.

  7. Research on digital multi-channel pulse height analysis techniques

    International Nuclear Information System (INIS)

    Xiao Wuyun; Wei Yixiang; Ai Xianyun; Ao Qi

    2005-01-01

    Multi-channel pulse height analysis techniques are developing in the direction of digitalization. Based on digital signal processing techniques, digital multi-channel analyzers are characterized by powerful pulse processing ability, high throughput, improved stability and flexibility. This paper analyzes key techniques of digital nuclear pulse processing. With MATLAB software, main algorithms are simulated, such as trapezoidal shaping, digital baseline estimation, digital pole-zero/zero-pole compensation, poles and zeros identification. The preliminary general scheme of digital MCA is discussed, as well as some other important techniques about its engineering design. All these lay the foundation of developing homemade digital nuclear spectrometers. (authors)
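
    As an illustration of one of the simulated building blocks, the standard recursive trapezoidal shaping of an exponentially decaying detector pulse can be written in a few lines. This is a generic sketch (in Python rather than MATLAB, with illustrative parameters), not the authors' simulation code:

        import numpy as np

        def trapezoidal_shaper(x, k, m, tau):
            """Recursive trapezoidal shaping (rise time k samples, flat top m
            samples) of an exponentially decaying pulse with decay constant tau
            (in samples), with pole-zero (decay) correction M."""
            l = k + m
            xp = np.concatenate([np.zeros(k + l), np.asarray(x, dtype=float)])
            d = xp[k + l:] - xp[l:-k] - xp[k:-l] + xp[:-(k + l)]
            M = 1.0 / (np.exp(1.0 / tau) - 1.0)        # ~ tau for slow decays
            s = np.cumsum(np.cumsum(d) + M * d)
            return s / (M * k)                         # roughly unit gain

        t = np.arange(2000)
        pulse = np.zeros(t.size)
        pulse[100:] = np.exp(-(t[100:] - 100) / 250.0)          # unit-height pulse, tau = 250
        print(trapezoidal_shaper(pulse, k=40, m=20, tau=250.0).max())   # ~1.0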

  8. Digital radiography: optimization of image quality and dose using multi-frequency software.

    Science.gov (United States)

    Precht, H; Gerke, O; Rosendahl, K; Tingberg, A; Waaler, D

    2012-09-01

    New developments in processing of digital radiographs (DR), including multi-frequency processing (MFP), allow optimization of image quality and radiation dose. This is particularly promising in children as they are believed to be more sensitive to ionizing radiation than adults. To examine whether the use of MFP software reduces the radiation dose without compromising quality at DR of the femur in 5-year-old-equivalent anthropomorphic and technical phantoms. A total of 110 images of an anthropomorphic phantom were imaged on a DR system (Canon DR with CXDI-50 C detector and MLT[S] software) and analyzed by three pediatric radiologists using Visual Grading Analysis. In addition, 3,500 images taken of a technical contrast-detail phantom (CDRAD 2.0) provide an objective image-quality assessment. Optimal image-quality was maintained at a dose reduction of 61% with MLT(S) optimized images. Even for images of diagnostic quality, MLT(S) provided a dose reduction of 88% as compared to the reference image. Software impact on image quality was found significant for dose (mAs), dynamic range dark region and frequency band. By optimizing image processing parameters, a significant dose reduction is possible without significant loss of image quality.

  9. Noise analysis of a digital radiography system

    International Nuclear Information System (INIS)

    Arnold, B.A.; Scheibe, P.O.

    1984-01-01

    The sources of noise in a digital video subtraction angiography system were identified and analyzed. Signal-to-noise ratios of digital radiography systems were measured using the digital image data recorded in the computer. The major sources of noise include quantum noise, TV camera electronic noise, quantization noise from the analog-to-digital converter, time jitter, structure noise in the image intensifier, and video recorder electronic noise. A new noise source was identified, which results from the interplay of fixed pattern noise and the lack of image registration. This type of noise may result from image-intensifier structure noise in combination with TV camera time jitter or recorder time jitter. A similar noise source is generated from the interplay of patient absorption inhomogeneities and patient motion or image re-registration. Signal-to-noise ratios were measured for a variety of experimental conditions using subtracted digital images. Image-intensifier structure noise was shown to be a dominant noise source in unsubtracted images at medium to high radiation exposure levels. A total-system signal-to-noise ratio (SNR) of 750:1 was measured for an input exposure of 1 mR/frame at the image intensifier input. The effect of scattered radiation on subtracted image SNR was found to be greater than previously reported. The detail SNR was found to vary approximately as one plus the scatter degradation factor. Quantization error noise with 8-bit image processors (signal-to-noise ratio of 890:1) was shown to be of increased importance after recent improvements in TV cameras. The results of the analysis are useful both in the design of future digital radiography systems and the selection of optimum clinical techniques

  10. Theoretical Feasibility of Digital Communication Over Ocean Areas by High Frequency Radio

    Science.gov (United States)

    1979-11-01

    The theoretical reliability of digital data transmission via high frequency radio is examined for typical air traffic routes in the Atlantic and Pacific areas to assist the U.S. Department of Transportation in the evaluation of a system for improving...

  11. Digital dream analysis: a revised method.

    Science.gov (United States)

    Bulkeley, Kelly

    2014-10-01

    This article demonstrates the use of a digital word search method designed to provide greater accuracy, objectivity, and speed in the study of dreams. A revised template of 40 word search categories, built into the website of the Sleep and Dream Database (SDDb), is applied to four "classic" sets of dreams: The male and female "Norm" dreams of Hall and Van de Castle (1966), the "Engine Man" dreams discussed by Hobson (1988), and the "Barb Sanders Baseline 250" dreams examined by Domhoff (2003). A word search analysis of these original dream reports shows that a digital approach can accurately identify many of the same distinctive patterns of content found by previous investigators using much more laborious and time-consuming methods. The results of this study emphasize the compatibility of word search technologies with traditional approaches to dream content analysis. Copyright © 2014 Elsevier Inc. All rights reserved.
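
    The word-search idea is easy to reproduce in miniature: count, per category, the share of reports containing at least one category word. The categories below are invented stand-ins, not the SDDb's 40-category template:

        import re
        from collections import Counter

        CATEGORIES = {
            "fear":   {"afraid", "scared", "terrified", "fear"},
            "family": {"mother", "father", "sister", "brother", "family"},
            "flying": {"fly", "flying", "flew", "float", "floating"},
        }

        def category_frequencies(dream_reports):
            """Percentage of reports in which at least one word from each
            category appears (a simple presence/absence word search)."""
            hits = Counter()
            for report in dream_reports:
                words = set(re.findall(r"[a-z]+", report.lower()))
                for name, vocab in CATEGORIES.items():
                    if words & vocab:
                        hits[name] += 1
            n = len(dream_reports)
            return {name: 100.0 * hits[name] / n for name in CATEGORIES}

        print(category_frequencies([
            "I was flying over my mother's house.",
            "A dog chased me and I was scared.",
        ]))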

  12. Digital image analysis of X-ray television with an image digitizer

    International Nuclear Information System (INIS)

    Mochizuki, Yasuo; Akaike, Hisahiko; Ogawa, Hitoshi; Kyuma, Yukishige

    1995-01-01

    When video signals of X-ray fluoroscopy were transformed from analog to digital with an image digitizer, their digital characteristic curves, pre-sampling MTFs and digital Wiener spectra could be measured. This method was advantageous in that data sampling could be carried out while the input pixel values were verified on a CRT. The system of image analysis by this method is inexpensive and effective in evaluating the image quality of a digital system. It is also expected that this method can be used as a tool for learning the measurement techniques and physical characteristics of digital image quality effectively. (author)

  13. Cluster analysis of word frequency dynamics

    Science.gov (United States)

    Maslennikova, Yu S.; Bochkarev, V. V.; Belashova, I. A.

    2015-01-01

    This paper describes the analysis and modelling of word usage frequency time series. During a previous study, an assumption was put forward that all word usage frequencies have uniform dynamics approaching the shape of a Gaussian function. This assumption can be checked using the frequency dictionaries of the Google Books Ngram database. This database includes 5.2 million books published between 1500 and 2008. The corpus contains over 500 billion words in American English, British English, French, German, Spanish, Russian, Hebrew, and Chinese. We clustered time series of word usage frequencies using a Kohonen neural network. The similarity between input vectors was estimated using several algorithms. As a result of the neural network training procedure, more than ten different forms of time series were found. They describe the dynamics of word usage frequencies from birth to death of individual words. Different groups of word forms were found to have different dynamics of word usage frequency variations.
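
    A minimal stand-in for this clustering step, with plain k-means substituted for the Kohonen network and synthetic rising, declining and rise-and-fall usage curves in place of the Google Books Ngram data:

        import numpy as np

        def kmeans(series, k=4, iters=50, seed=0):
            """Plain k-means over rows of `series` (each row: one word's
            normalized yearly usage-frequency curve)."""
            rng = np.random.default_rng(seed)
            centers = series[rng.choice(len(series), k, replace=False)]
            for _ in range(iters):
                dists = ((series[:, None, :] - centers[None]) ** 2).sum(-1)
                labels = np.argmin(dists, axis=1)
                for j in range(k):
                    if np.any(labels == j):
                        centers[j] = series[labels == j].mean(axis=0)
            return labels, centers

        years = np.linspace(0, 1, 50)
        curves = np.vstack([
            np.tile(years, (5, 1)),                                  # rising words
            np.tile(years[::-1], (5, 1)),                            # declining words
            np.tile(np.exp(-(years - 0.5) ** 2 / 0.02), (5, 1)),     # rise-and-fall
        ])
        labels, _ = kmeans(curves, k=3)
        print(labels)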

  14. Cluster analysis of word frequency dynamics

    International Nuclear Information System (INIS)

    Maslennikova, Yu S; Bochkarev, V V; Belashova, I A

    2015-01-01

    This paper describes the analysis and modelling of word usage frequency time series. During a previous study, an assumption was put forward that all word usage frequencies have uniform dynamics approaching the shape of a Gaussian function. This assumption can be checked using the frequency dictionaries of the Google Books Ngram database. This database includes 5.2 million books published between 1500 and 2008. The corpus contains over 500 billion words in American English, British English, French, German, Spanish, Russian, Hebrew, and Chinese. We clustered time series of word usage frequencies using a Kohonen neural network. The similarity between input vectors was estimated using several algorithms. As a result of the neural network training procedure, more than ten different forms of time series were found. They describe the dynamics of word usage frequencies from birth to death of individual words. Different groups of word forms were found to have different dynamics of word usage frequency variations

  15. Use of a novel cell adhesion method and digital measurement to show stimulus-dependent variation in somatic and oral ciliary beat frequency in Paramecium.

    Science.gov (United States)

    Bell, Wade E; Hallworth, Richard; Wyatt, Todd A; Sisson, Joseph H

    2015-01-01

    When Paramecium encounters positive stimuli, the membrane hyperpolarizes and ciliary beat frequency increases. We adapted an established immobilization protocol using a biological adhesive and a novel digital analysis system to quantify beat frequency in immobilized Paramecium. Cells showed low mortality and demonstrated beat frequencies consistent with previous studies. Chemoattractant molecules, reduction in external potassium, and posterior stimulation all increased somatic beat frequency. In all cases, the oral groove cilia maintained a higher beat frequency than mid-body cilia, but only oral cilia from cells stimulated with chemoattractants showed an increase from basal levels. © 2014 The Author(s) Journal of Eukaryotic Microbiology © 2014 International Society of Protistologists.
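
    Beat frequency in such video-based systems is usually taken as the dominant peak of the power spectrum of a pixel-intensity trace. Below is a generic sketch of that computation (frame rate and test frequency are assumed), not the analysis software used in the study:

        import numpy as np

        def beat_frequency(intensity_trace, fps):
            """Dominant frequency (Hz) of a pixel-intensity time series sampled
            from video frames at fps frames per second."""
            x = intensity_trace - np.mean(intensity_trace)
            spectrum = np.abs(np.fft.rfft(x * np.hanning(len(x))))
            freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
            return freqs[np.argmax(spectrum[1:]) + 1]     # skip the DC bin

        fps = 120.0
        t = np.arange(0, 5, 1 / fps)
        trace = np.sin(2 * np.pi * 14.0 * t) + 0.1 * np.random.randn(t.size)
        print(beat_frequency(trace, fps))                 # ~14 Hz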

  16. A note on the physical interpretation of frequency dependent boundary conditions in a digital waveguide mesh

    DEFF Research Database (Denmark)

    Escolano-Carrasco, José; Jacobsen, Finn

    2007-01-01

    Digital waveguide mesh (DWM) is a popular method for time domain modelling of sound fields. DWM consists of a recursive digital filter structure where a D'Alembert solution of the wave equation is propagated. One of the attractive characteristics of this method is related to the simplicity...... model of the boundary does not agree with the behaviour of a locally reacting surface, and this can give rise to contradictions in the physical interpretation of the reflected sound field. This paper analyses the behaviour of frequency dependent boundary conditions in DWM in order to obtain a physical...

  17. Efficiency Optimization Methods in Low-Power High-Frequency Digitally Controlled SMPS

    Directory of Open Access Journals (Sweden)

    Aleksandar Prodić

    2010-06-01

    This paper gives a review of several power efficiency optimization techniques that exploit the advantages of emerging digital control in high-frequency switch-mode power supplies (SMPS), processing power from a fraction of a watt to several hundreds of watts. Loss mechanisms in semiconductor components are briefly reviewed and the related principles of online efficiency optimization through power stage segmentation and gate voltage variation are presented. Practical implementations of such methods utilizing load prediction or data extraction from a digital control loop are shown. The benefits of the presented efficiency methods are verified through experimental results, showing efficiency improvements ranging from 2% to 30%, depending on the load conditions.

  18. Remote Sensing Digital Image Analysis An Introduction

    CERN Document Server

    Richards, John A

    2013-01-01

    Remote Sensing Digital Image Analysis provides the non-specialist with a treatment of the quantitative analysis of satellite and aircraft derived remotely sensed data. Since the first edition of the book there have been significant developments in the algorithms used for the processing and analysis of remote sensing imagery; nevertheless many of the fundamentals have substantially remained the same.  This new edition presents material that has retained value since those early days, along with new techniques that can be incorporated into an operational framework for the analysis of remote sensing data. The book is designed as a teaching text for the senior undergraduate and postgraduate student, and as a fundamental treatment for those engaged in research using digital image processing in remote sensing.  The presentation level is for the mathematical non-specialist.  Since the very great number of operational users of remote sensing come from the earth sciences communities, the text is pitched at a leve...

  19. A digital frequency stabilization system of external cavity diode laser based on LabVIEW FPGA

    Science.gov (United States)

    Liu, Zhuohuan; Hu, Zhaohui; Qi, Lu; Wang, Tao

    2015-10-01

    Frequency stabilization of external cavity diode lasers has played an important role in physics research. Many laser frequency locking solutions have been proposed by researchers. Traditionally, the locking process has been accomplished by an analog system, which has a fast feedback control response. However, an analog system is susceptible to the effects of the environment. In order to improve the automation level and reliability of the frequency stabilization system, we take a grating-feedback external cavity diode laser as the laser source and set up a digital frequency stabilization system based on a National Instruments FPGA (NI FPGA). The system consists of a saturated absorption frequency stabilization beam path, a differential photoelectric detector, an NI FPGA board, and a host computer. Many functions, such as piezoelectric transducer (PZT) sweeping, atomic saturation absorption signal acquisition, signal peak identification, error signal generation, and laser PZT voltage feedback control, are completed entirely by the LabVIEW FPGA program. Compared with an analog system built from logic gate circuits, the system performs stably and reliably. The user interface, programmed in LabVIEW, is friendly. Besides, benefiting from its reconfigurability, the LabVIEW program is easily ported to other NI FPGA boards. Most importantly, the system periodically checks the error signal; once an abnormal error signal is detected, the FPGA restarts the frequency stabilization process without manual intervention. By monitoring the fluctuation of the error signal of the atomic saturation absorption spectral line in the frequency-locked state, we infer that the laser frequency stability can reach 1 MHz.

  20. Cyber Threat and vulnerability Analysis for Digital Assets of NPPs

    International Nuclear Information System (INIS)

    Oh, Eun Se; Seo, In Yong; Kim, See Hong

    2009-01-01

    Today's breakthroughs in computer and communication technology are driving the plant-floor replacement of the analog instrumentation and control systems of nuclear power plants with full-fledged digital systems. Rich functionality and crisp accuracy are among the big advantages of adopting digital technology, but the use of open networks and inherited shared system resources (memory, network, etc.) are well-known weak points of digital systems. An intended or unintended cyber attack through a weak point of a power plant's digital control system may result in wide-area system failures that easily defeat system operation and multiple protection safeguards. A well-organized cyber security analysis for nuclear plant digital control systems (digital assets) is therefore required.

  1. The Linear Time Frequency Analysis Toolbox

    DEFF Research Database (Denmark)

    Søndergaard, Peter Lempel; Torrésani, Bruno; Balazs, Peter

    2011-01-01

    The Linear Time Frequency Analysis Toolbox is a Matlab/Octave toolbox for computational time-frequency analysis. It is intended both as an educational and a computational tool. The toolbox provides the basic Gabor, Wilson and MDCT transforms along with routines for constructing windows (filter prototypes) and routines for manipulating coefficients. It also provides a collection of demo scripts devoted either to demonstrating the main functions of the toolbox, or to exemplifying their use in specific signal processing applications. In this paper we describe the used algorithms, their mathematical background...

  2. Frequency modulation television analysis: Distortion analysis

    Science.gov (United States)

    Hodge, W. H.; Wong, W. H.

    1973-01-01

    Computer simulation is used to calculate the time-domain waveform of a standard T-pulse-and-bar test signal distorted in passing through an FM television system. The simulator includes flat or preemphasized systems and requires specification of the RF predetection filter characteristics. The predetection filters are modeled with frequency-symmetric Chebyshev (0.1-dB ripple) and Butterworth filters. The computer was used to calculate distorted output signals for sixty-four different specified systems, and the output waveforms are plotted for all sixty-four. Comparison of the plotted graphs indicates that a four-pole Chebyshev predetection filter causes slightly more signal distortion than a corresponding Butterworth filter, and the signal distortion increases as the number of poles increases. An increase in the peak deviation also increases signal distortion. Distortion also increases with the addition of preemphasis.
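
    The kind of comparison described above can be sketched with SciPy, although this is not the authors' FM-system simulator: a crude rectangular stand-in for the T-pulse-and-bar signal is passed through Butterworth and 0.1-dB-ripple Chebyshev low-pass filters of increasing order, and the peak deviation from the ideal waveform is compared. The sample rate, cutoff frequency and test signal are illustrative assumptions only.

    import numpy as np
    from scipy.signal import butter, cheby1, lfilter

    fs = 40e6                              # illustrative sample rate (Hz)
    fc = 5e6                               # illustrative predetection cutoff (Hz)
    t = np.arange(0, 20e-6, 1 / fs)
    pulse = ((t > 5e-6) & (t < 10e-6)).astype(float)   # crude test pulse

    for poles in (2, 4, 6):
        bb, ab = butter(poles, fc, fs=fs)              # Butterworth low-pass
        bc, ac = cheby1(poles, 0.1, fc, fs=fs)         # Chebyshev, 0.1 dB ripple
        err_b = np.max(np.abs(lfilter(bb, ab, pulse) - pulse))
        err_c = np.max(np.abs(lfilter(bc, ac, pulse) - pulse))
        print(poles, err_b, err_c)         # distortion grows with filter order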

  3. Spatial analysis of digital technologies and impact on socio-cultural ...

    African Journals Online (AJOL)

    The objective of this study was to determine the spatial distribution of digital technologies and to ascertain whether digital technologies have a significant impact on socio-cultural values. Moran's index and the Getis–Ord statistic were used for cluster and hotspot analysis. The unique locations of digital technologies ...
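
    A minimal NumPy version of the global Moran's I statistic used for the cluster analysis is sketched below (the Getis–Ord hotspot statistic is not reproduced); the spatial weight matrix w is assumed to be supplied by the analyst, e.g. from contiguity or distance criteria.

    import numpy as np

    def morans_i(x, w):
        """Global Moran's I for attribute values x (n,) and weights w (n, n)."""
        x = np.asarray(x, dtype=float)
        z = x - x.mean()                      # deviations from the mean
        n = x.size
        num = n * np.sum(w * np.outer(z, z))  # weighted cross-products
        den = w.sum() * np.sum(z ** 2)
        return num / den                      # > 0: clustering, < 0: dispersion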

  4. Semiautomatic digital imaging system for cytogenetic analysis

    International Nuclear Information System (INIS)

    Chaubey, R.C.; Chauhan, P.C.; Bannur, S.V.; Kulgod, S.V.; Chadda, V.K.; Nigam, R.K.

    1999-08-01

    The paper describes a digital image processing system, developed indigenously at BARC, for size measurement of microscopic biological objects such as cells, nuclei and micronuclei in mouse bone marrow and in cytochalasin-B-blocked human lymphocytes in vitro, and for numerical counting and karyotyping of metaphase chromosomes of human lymphocytes. Errors in karyotyping of chromosomes by the imaging system may creep in owing to a poorly defined centromere position or extensive bending of chromosomes, which may result from poor-quality preparations. Good metaphase preparations are mandatory for precise and accurate analysis by the system. Additional new morphological parameters for each chromosome have to be incorporated to improve the accuracy of karyotyping. Although the experienced cytogeneticist is the final judge, the system assists him/her in carrying out the analysis much faster than manual scoring. Further experimental studies are in progress to validate the different software packages developed for various cytogenetic applications. (author)

  5. Digital implementation of a laser frequency stabilisation technique in the telecommunications band

    Science.gov (United States)

    Jivan, Pritesh; van Brakel, Adriaan; Manuel, Rodolfo Martínez; Grobler, Michael

    2016-02-01

    Laser frequency stabilisation in the telecommunications band was realised using the Pound-Drever-Hall (PDH) error signal. The transmission spectrum of the Fabry-Perot cavity was used, as opposed to the traditionally used reflected spectrum. A comparison was made between an analogue and a digitally implemented system. This study forms part of an initial step towards developing a portable optical time and frequency standard. The frequency discriminator used in the experimental setup was a fibre-based Fabry-Perot etalon. The phase-sensitive system made use of the optical heterodyne technique to detect changes in the phase of the system. A lock-in amplifier was used to filter and mix the input signals to generate the error signal. This error signal may then be used to generate a control signal via a PID controller. An error signal was realised at a wavelength of 1556 nm, which corresponds to an optical frequency of approximately 192.7 THz. An implementation of the analogue PDH technique yielded an error signal with a bandwidth of 6.134 GHz, while a digital implementation yielded a bandwidth of 5.774 GHz.

  6. Spectral analysis of full field digital mammography data

    International Nuclear Information System (INIS)

    Heine, John J.; Velthuizen, Robert P.

    2002-01-01

    The spectral content of mammograms acquired with a full field digital mammography (FFDM) system is analyzed. Fourier methods are used to show that the FFDM image power spectra obey an inverse power law; in an average sense, the images may be considered 1/f fields. Two data representations are analyzed and compared: (1) the raw data, and (2) the logarithm of the raw data. Two methods are employed to analyze the power spectra: (1) a technique based on integrating the Fourier plane with octave ring sectioning, developed previously, and (2) an approach based on integrating the Fourier plane using rings of constant width, developed for this work. Both methods allow theoretical modeling. Numerical analysis indicates that the effects of the transformation influence the power spectra measurements in a statistically significant manner in the high-frequency range. However, this effect has little influence on the inverse power law estimation for a given image, regardless of the data representation or the theoretical analysis approach. The analysis is presented from two points of view: (1) each image is treated independently, with the results presented as distributions, and (2) for a given representation, the entire image collection is treated as an ensemble, with the results presented as expected values. In general, the constant ring width analysis forms the foundation for a spectral comparison method for finding spectral differences, in an image distribution sense, after applying a nonlinear transformation to the data. The work also shows that power law estimation may be influenced by the presence of noise in the higher frequency range, which is consistent with the known attributes of the detector efficiency. The spectral modeling and inverse power law determinations obtained here are in agreement with those obtained from the analysis of digitized film-screen images presented previously. The form of the power spectrum for a given image is approximately 1/f².
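
    The constant-ring-width integration of the Fourier plane can be sketched as follows; the binning, the DC exclusion and the log-log fit range are illustrative choices rather than the authors' exact procedure.

    import numpy as np

    def radial_power_spectrum(img, n_rings=64):
        """Ring-averaged 2-D power spectrum of a grey-level image."""
        P = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
        h, w = img.shape
        y, x = np.indices((h, w))
        r = np.hypot(y - h / 2, x - w / 2)            # radial spatial frequency
        edges = np.linspace(0, r.max(), n_rings + 1)
        idx = np.clip(np.digitize(r.ravel(), edges) - 1, 0, n_rings - 1)
        counts = np.maximum(np.bincount(idx, minlength=n_rings), 1)
        spec = np.bincount(idx, weights=P.ravel(), minlength=n_rings) / counts
        freqs = 0.5 * (edges[:-1] + edges[1:])
        return freqs, spec

    # inverse power law P(f) ~ 1/f**beta, estimated on log-log axes (DC bin skipped)
    # freqs, spec = radial_power_spectrum(image)
    # beta = -np.polyfit(np.log(freqs[1:]), np.log(spec[1:]), 1)[0]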

  7. Quantitative blood flow analysis with digital techniques

    International Nuclear Information System (INIS)

    Forbes, G.

    1984-01-01

    The general principles of digital techniques in quantitating absolute blood flow during arteriography are described. Results are presented for a phantom constructed to correlate digitally calculated absolute flow with direct flow measurements. The clinical use of digital techniques in cerebrovascular angiography is briefly described. (U.K.)

  8. Digital photoelastic analysis applied to implant dentistry

    Science.gov (United States)

    Ramesh, K.; Hariprasad, M. P.; Bhuvanewari, S.

    2016-12-01

    Development of improved designs of implant systems in dentistry has necessitated the study of stress fields in the implant regions of the mandible/maxilla for a better understanding of the biomechanics involved. Photoelasticity has been used for various studies related to dental implants in view of its whole-field visualization of maximum shear stress in the form of isochromatic contours. The potential of digital photoelasticity has not been fully exploited in the field of implant dentistry. In this paper, the fringe field in the vicinity of connected implants (All-On-Four® concept) is analyzed using recent advances in digital photoelasticity. Initially, a novel 3-D photoelastic model making procedure, closely mimicking all the anatomical features of the human mandible, is proposed. By choosing an appropriate orientation of the model with respect to the light path, the essential regions of interest could be analysed while keeping the model under live loading conditions. The need for a sophisticated software module to carefully identify the model domain is brought out. For data extraction, a five-step method is used and isochromatics are evaluated by twelve-fringe photoelasticity. In addition to the isochromatic fringe field, whole-field isoclinic data are also obtained for the first time in implant dentistry, which could provide important information for improving the structural stability of implant systems. The analysis is carried out for implants in the molar as well as the incisor region. In addition, the interaction effects of a loaded molar implant on the incisor area are also studied.

  9. Reproducibility of lateral cephalometric landmarks on conventional radiographs and spatial frequency-processed digital images

    International Nuclear Information System (INIS)

    Shin, Jeong Won; Heo, Min Suk; Lee, Sam Sun; Choi, Hyun Bae; Choi, Soon Chul; Choi, Hang Moon

    2002-01-01

    Computed radiography (CR) has been used in cephalometric radiography, and many studies have been carried out to improve image quality using various digital enhancement and filtering techniques. During CR image acquisition, the frequency rank and type affect the image quality. The aim of this study was to compare the diagnostic quality of conventional cephalometric radiographs with that of computed radiography. The diagnostic quality of conventional cephalometric radiographs (M0) and their digital image counterparts was compared and, at the same time, six modalities (M1-M6) of spatial frequency-processed digital images were compared by evaluating the reproducibility of 23 cephalometric landmark locations. Reproducibility was defined as an observer's deviation (in mm) from the mean of all observers. In comparison with the conventional cephalometric radiograph (M0), M1 showed statistically significant differences in 8 locations, M2 in 9, M3 in 12, M4 in 7, M5 in 12, and M6 in 14 of the 23 landmark locations (p<0.05). The number of reproducible landmarks for each modality was 7 for M6, 6 for M5, 5 for M3, 4 for M4, 3 for M2, 2 for M1, and 1 for M0. The image modality that observers selected as having the best image quality was M5.

  10. Digital signals processing using non-linear orthogonal transformation in frequency domain

    Directory of Open Access Journals (Sweden)

    Ivanichenko E.V.

    2017-12-01

    Full Text Available The rapid progress of computer technology in recent decades has led to the wide introduction of digital information processing methods in practically all fields of scientific research. Among the various applications of computing, one of the most important places is occupied by digital signal processing (DSP) systems, which are used in data processing, in remote navigation of aerospace and marine objects, in communications, radiophysics, digital optics, and in a number of other applications. Digital signal processing is a dynamically developing area that covers both hardware and software tools. Related areas are information theory, in particular the theory of optimal signal reception, and pattern recognition theory. In the first case the main problem is signal extraction against a background of noise and interference of different physical nature; in the second it is automatic recognition, i.e. classification and identification of signals. In digital signal processing, a signal is understood through its mathematical description, i.e. a certain real function containing information on the state or behaviour of a physical system under an event, which can be defined on a continuous or discrete space of time or of spatial coordinates. In the broad sense, DSP systems comprise a complex of algorithmic, hardware and software means. As a rule, such systems contain specialized technical means for preliminary (primary) signal processing and special technical means for secondary signal processing. The preprocessing means are designed to process the original signals, observed in the general case against a background of random noise and interference of different physical nature and represented as discrete digital samples, in order to detect and select the useful signal and to estimate the characteristics of the detected signal. A new method of digital signal processing in the frequency

  11. Accuracy assessment of high frequency 3D ultrasound for digital impression-taking of prepared teeth

    Science.gov (United States)

    Heger, Stefan; Vollborn, Thorsten; Tinschert, Joachim; Wolfart, Stefan; Radermacher, Klaus

    2013-03-01

    Silicone-based impression-taking of prepared teeth followed by plaster casting is well established but potentially less reliable, error-prone and inefficient, particularly in combination with emerging techniques like computer-aided design and manufacturing (CAD/CAM) of dental prostheses. Intra-oral optical scanners for digital impression-taking have been introduced, but until now some drawbacks still exist. Because optical waves can hardly penetrate liquids or soft tissues, sub-gingival preparations still need to be uncovered invasively prior to scanning. High-frequency ultrasound (HFUS) based micro-scanning has recently been investigated as an alternative to optical intra-oral scanning. Ultrasound is less sensitive to oral fluids and in principle able to penetrate the gingiva without invasively exposing sub-gingival preparations. Nevertheless, the spatial resolution as well as the digitization accuracy of an ultrasound-based micro-scanning system remains a critical parameter, because the ultrasound wavelength in water-like media such as gingiva is typically larger than that of optical waves. In this contribution, the in-vitro accuracy of ultrasound-based micro-scanning for tooth geometry reconstruction is investigated and compared to its extra-oral optical counterpart. In order to increase the spatial resolution of the system, 2nd harmonic frequencies from a mechanically driven focused single-element transducer were separated, and corresponding 3D surface models were calculated for both the fundamentals and the 2nd harmonics. Measurements on phantoms, model teeth and human teeth were carried out to evaluate spatial resolution and surface detection accuracy. Comparison of optical and ultrasound digital impression-taking indicates that, in terms of accuracy, ultrasound-based tooth digitization can be an alternative to optical impression-taking.

  12. Why Map Issues? On Controversy Analysis as a Digital Method.

    Science.gov (United States)

    Marres, Noortje

    2015-09-01

    This article takes stock of recent efforts to implement controversy analysis as a digital method in the study of science, technology, and society (STS) and beyond and outlines a distinctive approach to address the problem of digital bias. Digital media technologies exert significant influence on the enactment of controversy in online settings, and this risks undermining the substantive focus of controversy analysis conducted by digital means. To address this problem, I propose a shift in thematic focus from controversy analysis to issue mapping. The article begins by distinguishing between three broad frameworks that currently guide the development of controversy analysis as a digital method, namely, demarcationist, discursive, and empiricist. Each has been adopted in STS, but only the last one offers a digital "move beyond impartiality." I demonstrate this approach by analyzing issues of Internet governance with the aid of the social media platform Twitter.

  13. Digital Natives, Digital Immigrants: An Analysis of Age and ICT Competency in Teacher Education

    Science.gov (United States)

    Guo, Ruth Xiaoqing; Dobson, Teresa; Petrina, Stephen

    2008-01-01

    This article examines the intersection of age and ICT (information and communication technology) competency and critiques the "digital natives versus digital immigrants" argument proposed by Prensky (2001a, 2001b). Quantitative analysis was applied to a statistical data set collected in the context of a study with over 2,000 pre-service…

  14. The Modified Frequency Algorithm of Digital Watermarking of Still Images Resistant to JPEG Compression

    Directory of Open Access Journals (Sweden)

    V. A. Batura

    2015-01-01

    Full Text Available Digital watermarking is an effective means of copyright protection for multimedia products (in particular, still images). Digital watermarking is the process of embedding into the protected object a digital watermark that is invisible to the human eye. However, there is a rather large number of harmful influences capable of destroying a watermark embedded in a still image. The most widespread attack is JPEG compression, owing to the efficiency of this compression format and its prevalence on the Internet. The new algorithm presented in this article is a modification of the Elham algorithm. The algorithm embeds a watermark into the frequency coefficients of the discrete Hadamard transform of selected image blocks. Image blocks for watermark embedding are chosen on the basis of a preset pixel-entropy threshold. Low-frequency coefficients for embedding are chosen by comparing the values of discrete cosine transform coefficients with a predetermined threshold, which depends on the product of the embedded watermark coefficient and a change coefficient. The resistance of the new algorithm to JPEG compression, noising, filtering, colour change, resizing and histogram equalization is analysed in detail. The study compares the appearance of the watermark extracted from the damaged image with the embedded logo. The ability of the algorithm to embed a watermark with a minimum level of image distortion is also analysed. It is established that, compared with the original Elham algorithm, the new algorithm shows full resistance to JPEG compression as well as improved resistance to noising, brightness change and histogram equalization. The developed algorithm can be used for copyright protection of still images. Further studies will be used to study the
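
    The core embedding operation can be sketched in the Hadamard domain as below. This is only an illustration under stated assumptions: the block size, the modified coefficient and the embedding strength alpha are hypothetical, and the entropy-based block selection and DCT-based coefficient selection described in the abstract are omitted.

    import numpy as np
    from scipy.linalg import hadamard

    def embed_bit(block8, bit, alpha=4.0):
        """Embed one watermark bit into an 8x8 image block (float array)."""
        H = hadamard(8).astype(float)
        T = H @ block8 @ H / 8.0               # forward 2-D Hadamard transform
        T[1, 0] += alpha if bit else -alpha    # nudge one low-frequency coefficient
        return H @ T @ H / 8.0                 # inverse transform (H @ H = 8 * I)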

  15. Digital Printing Quality Detection and Analysis Technology Based on CCD

    Science.gov (United States)

    He, Ming; Zheng, Liping

    2017-12-01

    With the help of CCD-based digital printing quality detection and analysis technology, printing quality can be evaluated rapidly and objectively, and a certain degree of control can be exerted over it. The rational application of CCD-based quality detection and analysis plays a very positive role in improving the quality of digital printing and printing materials across a variety of printing equipment. In this paper, we carry out an in-depth study and discussion of CCD-based digital print quality detection and analysis technology.

  16. Cost analysis of operating an all-digital radiology department

    International Nuclear Information System (INIS)

    Arenson, R.L.; Soshadri, S.B.; DeSimone, D.; Hiss, S.S.

    1988-01-01

    Using the current film system as a baseline, this study analyzes the cost of digital acquisition, transmission, archiving, and display of all images generated in our department. Two approaches are considered: (1) conventional x-ray films are digitized with a laser scanning film digitizer; (2) images are captured with a direct digital receptor and no film is created. In both approaches, images from digital modalities are acquired directly from the scanners. The costs of equipment and its maintenance, film, supplies, storage space, operations, personnel, and so forth are analyzed for all approaches. The annual cost of operating the film system is $2.5 million. The estimated annual cost is $2.3 million for the first digital approach and $1.8 million for the second. This analysis demonstrates that these digital approaches can be cost-effective, viable alternatives to film-based systems.

  17. How Digital Are the Digital Humanities? An Analysis of Two Scholarly Blogging Platforms

    Science.gov (United States)

    Puschmann, Cornelius; Bastos, Marco

    2015-01-01

    In this paper we compare two academic networking platforms, HASTAC and Hypotheses, to show the distinct ways in which they serve specific communities in the Digital Humanities (DH) in different national and disciplinary contexts. After providing background information on both platforms, we apply co-word analysis and topic modeling to show thematic similarities and differences between the two sites, focusing particularly on how they frame DH as a new paradigm in humanities research. We encounter a much higher ratio of posts using humanities-related terms compared to their digital counterparts, suggesting a one-way dependency of digital humanities-related terms on the corresponding unprefixed labels. The results also show that the terms digital archive, digital literacy, and digital pedagogy are relatively independent from the respective unprefixed terms, and that digital publishing, digital libraries, and digital media show considerable cross-pollination between the specialization and the general noun. The topic modeling reproduces these findings and reveals further differences between the two platforms. Our findings also indicate local differences in how the emerging field of DH is conceptualized and show dynamic topical shifts inside these respective contexts. PMID:25675441

  18. Expanding the broadcast coverage of the TVRI Jawa Barat (West Java) digital transmitter station with a Single Frequency Network (SFN) system

    Directory of Open Access Journals (Sweden)

    Trya Agung Pahlevi

    2017-02-01

    Full Text Available The implementation of digital television broadcasting over terrestrial media has become a reality in Indonesia. The digital television broadcasting system (Digital Video Broadcasting – Terrestrial, DVB-T) allows the use of a Single Frequency Network (SFN), which can extend the broadcast coverage area using a single frequency channel. The research carried out here plans the coordinates and technical parameters of an SFN DVB-T system in the TVRI Jawa Barat (West Java) region, based on the recommendations of the Final Acts of the International Telecommunication Union (ITU) Regional Radiocommunication Conference (RRC-06), in order to obtain the largest and most efficient coverage area. The design simulation uses the "Mobile RF" and "CHIRplus_BC" software tools to determine the transmitter coordinates, the power and antenna-height parameters, and the economics of the design. The final result of the study is that the SFN system can increase the broadcast coverage from an initial 11,609,819 people (a coverage population of 55.51%) to between 12,060,282 people (57.66%) and 17,563,586 people (83.98%), out of a total audience of 20,914,885 people.

  19. Digitally synthesized high purity, high-voltage radio frequency drive electronics for mass spectrometry

    Science.gov (United States)

    Schaefer, R. T.; MacAskill, J. A.; Mojarradi, M.; Chutjian, A.; Darrach, M. R.; Madzunkov, S. M.; Shortt, B. J.

    2008-09-01

    Reported herein is development of a quadrupole mass spectrometer controller (MSC) with integrated radio frequency (rf) power supply and mass spectrometer drive electronics. Advances have been made in terms of the physical size and power consumption of the MSC, while simultaneously making improvements in frequency stability, total harmonic distortion, and spectral purity. The rf power supply portion of the MSC is based on a series-resonant LC tank, where the capacitive load is the mass spectrometer itself, and the inductor is a solenoid or toroid, with various core materials. The MSC drive electronics is based on a field programmable gate array (FPGA), with serial peripheral interface for analog-to-digital and digital-to-analog converter support, and RS232/RS422 communications interfaces. The MSC offers spectral quality comparable to, or exceeding, that of conventional rf power supplies used in commercially available mass spectrometers; and as well an inherent flexibility, via the FPGA implementation, for a variety of tasks that includes proportional-integral derivative closed-loop feedback and control of rf, rf amplitude, and mass spectrometer sensitivity. Also provided are dc offsets and resonant dipole excitation for mass selective accumulation in applications involving quadrupole ion traps; rf phase locking and phase shifting for external loading of a quadrupole ion trap; and multichannel scaling of acquired mass spectra. The functionality of the MSC is task specific, and is easily modified by simply loading FPGA registers or reprogramming FPGA firmware.

  20. 162.5 MHz digital low-level radio frequency control monitoring system design and implementation

    International Nuclear Information System (INIS)

    Zhang Ruifeng; Wang Xianwu; Xu Zhe; Yi Xiaoping

    2014-01-01

    The 162.5 MHz low-level radio frequency control system self-developed by the Institute of Modern Physics for the ADS project adopts digital technology. Reading and writing of all parameters, including loop parameter setting, open- and closed-loop operation, and condition monitoring, are achieved through the monitoring system. The system uses a lightweight client–server working mode: the client running on a PC sends command data, and the server running on the digital low-level RF system responds to the instructions to complete parameter monitoring and control. The system consists of three parts. First, the server hardware was built on an Altera Stratix III family field-programmable gate array (FPGA) development board. Second, the server software was designed on the basis of the MicroC/OS-II real-time operating system and a lightweight TCP/IP protocol stack; finally, a client PC program was written with MFC. Long-term testing indicates that the monitoring system works properly and stably; the TCP send and receive throughputs reached 11.931038 Mbps and 8.117624 Mbps. (authors)

  2. Extending electronic length frequency analysis in R

    DEFF Research Database (Denmark)

    Taylor, M. H.; Mildenberger, Tobias K.

    2017-01-01

    Electronic length frequency analysis (ELEFAN) is a system of stock assessment methods using length-frequency (LFQ) data. One step is the estimation of growth from the progression of LFQ modes through time using the von Bertalanffy growth function (VBGF). The option to fit a seasonally oscillating VBGF (soVBGF) requires a more intensive search due to two additional parameters. This work describes the implementation of two optimisation approaches ("simulated annealing" and "genetic algorithm") for growth function fitting using the open-source software "R." Using a generated LFQ data set, ... of the asymptotic length parameter (L-infinity) are found to have significant effects on parameter estimation error. An outlook provides context as to the significance of the R-based implementation for further testing and development, as well as the general relevance of the method for data-limited stock assessment.
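
    The seasonally oscillating VBGF referred to above can be written, in one common Somers-type parameterisation, as a short function. The record's own implementation is in R, so this NumPy version and its parameter names are purely illustrative.

    import numpy as np

    def sovbgf(t, linf, k, t0, c=0.0, ts=0.0):
        """Seasonally oscillating von Bertalanffy growth: length at age t (years)."""
        s = lambda u: (c * k / (2 * np.pi)) * np.sin(2 * np.pi * (u - ts))
        return linf * (1.0 - np.exp(-k * (t - t0) - s(t) + s(t0)))

    # c = 0 recovers the ordinary VBGF; c and ts are the two extra soVBGF parameters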

  3. Digital frequency offset-locked He–Ne laser system with high beat frequency stability, narrow optical linewidth and optical fibre output

    Science.gov (United States)

    Sternkopf, Christian; Manske, Eberhard

    2018-06-01

    We report on the enhancement of a previously-presented heterodyne laser source on the basis of two phase-locked loop (PLL) frequency coupled internal-mirror He–Ne lasers. Our new system consists of two digitally controlled He–Ne lasers with slightly different wavelengths, and offers high-frequency stability and very narrow optical linewidth. The digitally controlled system has been realized by using a FPGA controller and transconductance amplifiers. The light of both lasers was coupled into separate fibres for heterodyne interferometer applications. To enhance the laser performance we observed the sensitivity of both laser tubes to electromagnetic noise from various laser power supplies and frequency control systems. Furthermore, we describe how the linewidth of a frequency-controlled He–Ne laser can be reduced during precise frequency stabilisation. The digitally controlled laser source reaches a standard beat frequency deviation of less than 20 Hz (with 1 s gate time) and a spectral full width at half maximum (FWHM) of the beat signal less than 3 kHz. The laser source has enough optical output power to serve a fibre-coupled multi axis heterodyne interferometer. The system can be adjusted to output beat frequencies in the range of 0.1 MHz–20 MHz.

  4. Real time analysis of electromagnetic radiation in a very wide frequency range

    International Nuclear Information System (INIS)

    Peralta, J.A.; Reyes L, P.; Yepez, E.

    2001-01-01

    In this work, we present an electronic apparatus that facilitates the monitoring and analysis of electromagnetic radiation over a very wide frequency range. The device is a combination of real and virtual instruments that takes advantage of new hardware and software; the measurable range of frequencies depends on the speed of an analog/digital converter, reaching tens of megahertz. The device has been successfully used to monitor environmental electromagnetic radiation at very low frequency, a very useful quantity in research on electromagnetic precursors of earthquakes. The apparatus is a new configuration with advantages over those used previously: when the attached computer is fast, Fourier analysis can be done in real time; several bands can be displayed simultaneously; the digitized data allow a variety of analysis methods; and the apparatus is very cheap. (Author)

  5. Real time analysis of electromagnetic radiation in a very wide frequency range

    Energy Technology Data Exchange (ETDEWEB)

    Peralta, J.A.; Reyes L, P.; Yepez, E. [Escuela Superior de Fisica y Matematicas, Instituto Politecnico Nacional, Edificio 9, U.P. Adolfo Lopez Mateos, Zacatenco, 07738 Mexico D.F. (Mexico)

    2001-07-01

    In this work, we present an electronic apparatus that facilitates the monitoring and analysis of electromagnetic radiation over a very wide frequency range. The device is a combination of real and virtual instruments that takes advantage of new hardware and software; the measurable range of frequencies depends on the speed of an analog/digital converter, reaching tens of megahertz. The device has been successfully used to monitor environmental electromagnetic radiation at very low frequency, a very useful quantity in research on electromagnetic precursors of earthquakes. The apparatus is a new configuration with advantages over those used previously: when the attached computer is fast, Fourier analysis can be done in real time; several bands can be displayed simultaneously; the digitized data allow a variety of analysis methods; and the apparatus is very cheap. (Author)

  6. Elementary, Middle, and High School Students Vary in Frequency and Purpose When Using Online Digital References. A review of: Silverstein, Joanne. “Just Curious: Children’s Use of Digital Reference for Unimposed Queries and Its Importance in Informal Education.” Library Trends 54.2 (Fall 2005): 228‐44.

    OpenAIRE

    Julie Stephens

    2006-01-01

    Objective – To determine 1) how and with what frequency children use digital reference services to answer their own unimposed questions; 2) whether digital reference services support their self-initiated learning; 3) whether digital reference services support the transfer of student motivation and curiosity from the formal to the informal; and 4) what instructional and software designers should consider in creating tools that support learning. Design – Inductive analysis. Setting – Virtual Reference De...

  7. Cyber Threat and vulnerability Analysis for Digital Assets of NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Eun Se; Seo, In Yong [Korea Electric Power research Institute, Daejeon (Korea, Republic of); Kim, See Hong [Korea Hydro and Nuclear Power Co., Seoul (Korea, Republic of)

    2009-10-15

    Today's breakthroughs in computer and communication technology are driving the plant-floor replacement of the analog instrumentation and control systems of nuclear power plants with full-fledged digital systems. Rich functionality and crisp accuracy are among the big advantages of adopting digital technology, but the use of open networks and inherited shared system resources (memory, network, etc.) are well-known weak points of digital systems. An intended or unintended cyber attack through a weak point of a power plant's digital control system may result in wide-area system failures that easily defeat system operation and multiple protection safeguards. A well-organized cyber security analysis for nuclear plant digital control systems (digital assets) is therefore required.

  8. Frequency domain analysis of knock images

    Science.gov (United States)

    Qi, Yunliang; He, Xin; Wang, Zhi; Wang, Jianxin

    2014-12-01

    High-speed imaging-based knock analysis has mainly focused on time domain information, e.g. the spark-triggered flame speed, the time when end-gas auto-ignition occurs, and the end-gas flame speed after auto-ignition. This study presents a frequency domain analysis of the knock images recorded with a high-speed camera using direct photography in a rapid compression machine (RCM). To clearly visualize the pressure wave oscillation in the combustion chamber, the images were high-pass-filtered to extract the luminosity oscillation. The luminosity spectrum was then obtained by applying the fast Fourier transform (FFT) to the three basic colour components (red, green and blue) of the high-pass-filtered images. Compared to the pressure spectrum, the luminosity spectra better identify the resonant modes of pressure wave oscillation. More importantly, the resonant mode shapes can be clearly visualized by reconstructing the images based on the amplitudes of the luminosity spectra at the corresponding resonant frequencies, which agree well with the analytical solutions for mode shapes of gas vibration in a cylindrical cavity.
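
    The per-pixel frequency-domain step can be sketched as follows; the frame stack is assumed to be already reduced to a single luminosity (or colour-component) channel, and the moving-average high-pass is a crude stand-in for the filtering used in the study.

    import numpy as np

    def resonance_amplitude_map(frames, fs, f_res, win=15):
        """frames: (T, H, W) stack at fs frames/s; returns the per-pixel FFT
        amplitude at the bin nearest the resonant frequency f_res (Hz)."""
        kernel = np.ones(win) / win
        trend = np.apply_along_axis(
            lambda s: np.convolve(s, kernel, mode="same"), 0, frames)
        osc = frames - trend                         # temporal high-pass (crude)
        spec = np.abs(np.fft.rfft(osc, axis=0))
        freqs = np.fft.rfftfreq(frames.shape[0], d=1.0 / fs)
        return spec[np.argmin(np.abs(freqs - f_res))]   # (H, W) mode-shape map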

  9. Utilization of the voltage frequency converter or digital representation and documentation of transient reactor operation

    International Nuclear Information System (INIS)

    Doane, Harry J.

    1986-01-01

    The ease and speed of handling transient data are enhanced by the use of a voltage-to-frequency converter (VFC). This analogue-to-digital semiconductor device provides an inexpensive and portable alternative to electro-mechanical recorders and hand entry of data into computer codes. The VFC used at the University of Arizona is a Teledyne Philbrick 4705/01. A zero to positive ten-volt input signal produces a zero to one-megahertz output signal that is TTL/DTL compatible. The VFC is used at the University of Arizona to collect data for super-prompt-critical TRIGA excursions. It provides a low-cost, convenient method of transient data storage and retrieval for experimentation and laboratory demonstration.

  10. Demonstrations of analog-to-digital conversion using a frequency domain stretched processor.

    Science.gov (United States)

    Reibel, Randy Ray; Harrington, Calvin; Dahl, Jason; Ostrander, Charles; Roos, Peter Aaron; Berg, Trenton; Mohan, R Krishna; Neifeld, Mark A; Babbitt, Wm R

    2009-07-06

    The first proof-of-concept demonstrations are presented for a broadband photonic-assisted analog-to-digital converter (ADC) based on spatial spectral holography (SSH). The SSH-ADC acts as a frequency-domain stretch processor converting high bandwidth input signals to low bandwidth output signals, allowing the system to take advantage of high performance, low bandwidth electronic ADCs. Demonstrations with 50 MHz effective bandwidth are shown to highlight basic performance with approximately 5 effective bits of vertical resolution. Signal capture with 1600 MHz effective bandwidth is also shown. Because some SSH materials span over 100 GHz and have large time apertures (approximately 10 μs), this technique holds promise as a candidate for the next generation of ADCs.

  11. An ultra-high-speed direct digital frequency synthesizer implemented in GaAs HBT technology

    International Nuclear Information System (INIS)

    Chen Gaopeng; Wu Danyu; Jin Zhi; Liu Xinyu

    2010-01-01

    This paper presents a 10-GHz 8-bit direct digital synthesizer (DDS) microwave monolithic integrated circuit implemented in 1 μm GaAs HBT technology. The DDS adopts a double-edge-triggered (DET) 8-stage pipeline accumulator with a sine-weighted, DAC-based ROM-less architecture, which maximizes use of the high-speed potential of GaAs HBTs. With an output frequency of up to 5 GHz, the DDS gives an average spurious-free dynamic range of 23.24 dBc across the first Nyquist band and consumes 2.4 W of DC power from a single -4.6 V supply. Using 1651 GaAs HBT transistors, the total area of the DDS chip is 2.4 × 2.0 mm². (semiconductor integrated circuits)
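
    The behaviour of a basic phase-accumulator DDS can be sketched as below; the bit widths are illustrative, and the double-edge-triggered pipelining and sine-weighted DAC of the reported chip are not modelled.

    import numpy as np

    def dds(fcw, n_samples, acc_bits=32, lut_bits=8, amp_bits=8):
        """Textbook DDS: output frequency = fcw / 2**acc_bits * f_clock."""
        lut = np.round((2 ** (amp_bits - 1) - 1) *
                       np.sin(2 * np.pi * np.arange(2 ** lut_bits) / 2 ** lut_bits)).astype(int)
        acc, out = 0, np.empty(n_samples, dtype=int)
        for n in range(n_samples):
            out[n] = lut[acc >> (acc_bits - lut_bits)]   # top bits address the sine LUT
            acc = (acc + fcw) & ((1 << acc_bits) - 1)    # phase accumulator wraps
        return out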

  12. Modulation format identification enabled by the digital frequency-offset loading technique for hitless coherent transceiver.

    Science.gov (United States)

    Fu, Songnian; Xu, Zuying; Lu, Jianing; Jiang, Hexun; Wu, Qiong; Hu, Zhouyi; Tang, Ming; Liu, Deming; Chan, Calvin Chun-Kit

    2018-03-19

    We propose a blind and fast modulation format identification (MFI) scheme enabled by the digital frequency-offset (FO) loading technique for hitless coherent transceivers. Since modulation format information is encoded into the FO distribution during digital signal processing (DSP) at the transmitter side (Tx), we can use the fast Fourier transform based FO estimation (FFT-FOE) method to obtain the FO distribution of each data block after constant modulus algorithm (CMA) pre-equalization at the receiver side, in order to realize non-data-aided (NDA) and fast MFI. The obtained FO can also be used for subsequent FO compensation (FOC) without additional complexity. We numerically investigate and experimentally verify the proposed MFI with high accuracy and fast format switching among 28 Gbaud dual-polarization (DP)-4/8/16/64QAM, time-domain hybrid-4/16QAM, and set-partitioning (SP)-128QAM. In particular, the proposed MFI brings no performance degradation in terms of tolerance to amplified spontaneous emission (ASE) noise, laser linewidth, and fiber nonlinearity. Finally, a hitless coherent transceiver enabled by the proposed MFI with a switching block of only 2048 symbols is demonstrated over 1500 km standard single-mode fiber (SSMF) transmission.
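
    A minimal fourth-power FFT-based frequency-offset estimator of the kind usually called FFT-FOE is sketched below. It is exact only for QPSK-like constellations, and the CMA pre-equalisation and the transmitter-side FO-loading pattern described in the abstract are not reproduced.

    import numpy as np

    def fft_foe_4th_power(symbols, baud_rate, nfft=4096):
        """Estimate the carrier frequency offset from complex symbols (1 sample/symbol)."""
        spec = np.abs(np.fft.fft(symbols[:nfft] ** 4, nfft))   # 4th power removes QPSK modulation
        freqs = np.fft.fftfreq(nfft, d=1.0 / baud_rate)
        return freqs[np.argmax(spec)] / 4.0                    # tone appears at 4 * FO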

  13. Digital frequency domain multiplexing readout electronics for the next generation of millimeter telescopes

    Science.gov (United States)

    Bender, Amy N.; Cliche, Jean-François; de Haan, Tijmen; Dobbs, Matt A.; Gilbert, Adam J.; Montgomery, Joshua; Rowlands, Neil; Smecher, Graeme M.; Smith, Ken; Wilson, Andrew

    2014-07-01

    Frequency domain multiplexing (fMux) is an established technique for the readout of transition-edge sensor (TES) bolometers in millimeter-wavelength astrophysical instrumentation. In fMux, the signals from multiple detectors are read out on a single pair of wires, reducing the total cryogenic thermal loading as well as the cold component complexity and cost of a system. The current digital fMux system, in use by POLARBEAR, EBEX, and the South Pole Telescope, is limited to a multiplexing factor of 16 by the dynamic range of the Superconducting Quantum Interference Device pre-amplifier and the total system bandwidth. Increased multiplexing is key for the next generation of large-format TES cameras, such as SPT-3G and POLARBEAR2, which plan to have on the order of 15,000 detectors. Here, we present the next generation fMux readout, focusing on the warm electronics. In this system, the multiplexing factor increases to 64 channels per module (2 wires) while maintaining low noise levels and detector stability. This is achieved by increasing the system bandwidth, reducing the dynamic range requirements through active feedback, and digitally synthesizing voltage biases with a novel polyphase filter algorithm. In addition, a version of the new fMux readout includes features such as low power consumption and radiation-hard components, making it viable for future space-based millimeter telescopes such as the LiteBIRD satellite.

  14. Frequency band analysis of muscle activation during cycling to exhaustion

    Directory of Open Access Journals (Sweden)

    Fernando Diefenthaeler

    2012-04-01

    Full Text Available DOI: http://dx.doi.org/10.5007/1980-0037.2012v14n3p243 Lower limb muscle activation was assessed during cycling to exhaustion using frequency band analysis. Nine cyclists were evaluated on two days. On the first day, cyclists performed a maximal incremental cycling exercise to measure peak power output, which was used on the second day to define the workload for a constant-load time-to-exhaustion cycling exercise (maximal aerobic power output from day 1). Muscle activation of the vastus lateralis (VL), long head of biceps femoris (BF), lateral head of gastrocnemius (GL), and tibialis anterior (TA) of the right lower limb was recorded during the time-to-exhaustion cycling exercise. A series of nine band-pass Butterworth digital filters was used to analyze the muscle activity amplitude in each band. The overall amplitude of activation and the high- and low-frequency components were defined to assess the magnitude of fatigue effects on muscle activity via effect sizes. The profile of the overall muscle activation during the test was analyzed using a second-order polynomial, and the variability of the overall bands was analyzed with the coefficient of variation for each muscle at each instant of the test. A substantial reduction in the high-frequency components of VL and BF activation was observed. The overall and low-frequency bands presented trivial to small changes for all muscles. A high relationship between the second-order polynomial fit and muscle activity was found (R² > 0.89 for all muscles). High variability (~25%) was found for muscle activation at the four instants of the fatigue test. Changes in the spectral properties of the EMG signal were only substantial when extreme changes in fatigue state were induced.
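
    The band decomposition can be sketched with a bank of Butterworth band-pass filters as below; the band edges, filter order and sampling rate are illustrative assumptions, not the values used in the study.

    import numpy as np
    from scipy.signal import butter, filtfilt

    def band_amplitudes(emg, fs, bands):
        """Mean rectified amplitude of an EMG signal in each (low, high) Hz band."""
        out = {}
        for lo, hi in bands:
            b, a = butter(4, [lo, hi], btype="bandpass", fs=fs)
            out[(lo, hi)] = np.mean(np.abs(filtfilt(b, a, emg)))
        return out

    # e.g. nine 50-Hz-wide bands between 20 and 470 Hz, sampled at 2 kHz (illustrative)
    # amps = band_amplitudes(emg, fs=2000, bands=[(20 + 50*i, 70 + 50*i) for i in range(9)])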

  15. Optimization and Implementation of Scaling-Free CORDIC-Based Direct Digital Frequency Synthesizer for Body Care Area Network Systems

    Directory of Open Access Journals (Sweden)

    Ying-Shen Juang

    2012-01-01

    Full Text Available Coordinate rotation digital computer (CORDIC) is an efficient algorithm for computing trigonometric functions. Scaling-free CORDIC is one of the well-known CORDIC implementations, with advantages in speed and area. In this paper, a novel direct digital frequency synthesizer (DDFS) based on scaling-free CORDIC is presented. The proposed multiplier-less architecture, with a small ROM and a pipelined data path, has the advantages of high data rate, high precision, high performance, and low hardware cost. The design procedure, with performance and hardware analysis for optimization, is also given. The design is verified by Matlab simulations and then implemented in Verilog on a field-programmable gate array (FPGA). The spurious-free dynamic range (SFDR) is over 86.85 dBc, and the signal-to-noise ratio (SNR) is more than 81.12 dB. The scaling-free CORDIC-based architecture is suitable for VLSI implementation of DDFS applications in terms of hardware cost, power consumption, SNR, and SFDR. The proposed DDFS is well suited to medical instruments and body care area network systems.
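
    For reference, a conventional rotation-mode CORDIC for sine/cosine is sketched below; the scaling-free variant used in the paper avoids the explicit gain compensation shown here, so this is only the textbook form of the algorithm.

    import math

    def cordic_sin_cos(angle, iterations=16):
        """Rotation-mode CORDIC; valid for |angle| < ~1.74 rad."""
        k = 1.0
        for i in range(iterations):
            k *= 1.0 / math.sqrt(1.0 + 2.0 ** (-2 * i))   # accumulated gain compensation
        x, y, z = k, 0.0, angle                           # pre-scale so the final gain is 1
        for i in range(iterations):
            d = 1.0 if z >= 0.0 else -1.0
            x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
            z -= d * math.atan(2.0 ** -i)
        return y, x                                       # (sin, cos)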

  16. Reliability analysis of digital safety systems at nuclear power plants

    International Nuclear Information System (INIS)

    Sopira Vladimir; Kovacs, Zoltan

    2015-01-01

    Reliability analysis of digital reactor protection systems built on the basis of TELEPERM XS is described, and experience gained by the Slovak RELKO company during the past 20 years in this domain is highlighted. (orig.)

  17. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    Directory of Open Access Journals (Sweden)

    Thomas Jensen

    2016-01-01

    Full Text Available Background: The opportunity for automated histological analysis offered by whole-slide scanners implies an ever-increasing importance of digital pathology. To go beyond conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. These data may provide a basic histological starting point from which further digital analysis, including staining, may benefit. Methods: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means of quantitating cardiac tissue components in histological microsections. Data acquisition using a commercially available whole-slide scanner and an image-based quantitation algorithm are presented. Results: It is shown that the autofluorescence intensity of unstained microsections at two different wavelengths is a suitable starting point for automated digital analysis of myocytes, fibrous tissue, lipofuscin, and the extracellular compartment. The output of the method is absolute quantitation along with accurate outlines of the above-mentioned components. The digital quantitations are verified by comparison with point-grid quantitations performed on the microsections after Van Gieson staining. Conclusion: The presented method is aptly described as a pre-stain multicomponent quantitation and outlining tool for histological sections of cardiac tissue. The main perspective is the opportunity for combination with digital analysis of stained microsections, for which the method may provide an accurate digital framework.

  18. Analysis of the frequency components of X-ray images

    International Nuclear Information System (INIS)

    Matsuo, Satoru; Komizu, Mitsuru; Kida, Tetsuo; Noma, Kazuo; Hashimoto, Keiji; Onishi, Hideo; Masuda, Kazutaka

    1997-01-01

    We examined the relation between the frequency components of x-ray images of the chest and phalanges and their read sizes for digitizing. Images of the chest and phalanges were radiographed using three types of screens and films, and the noise images in background density were digitized with a drum scanner while changing the read sizes. The frequency components of these images were evaluated by applying a two-dimensional Fourier transform to obtain the power spectrum and signal-to-noise ratio (SNR). After low-pass filtering with various cut-off frequencies in the power spectrum, we also examined the frequency components of the images in terms of the normalized mean square error (NMSE) between the inverse-Fourier-transformed image and the original image. Results showed that the frequency components were 2.0 cycles/mm for the chest image and 6.0 cycles/mm for the phalanges. Therefore, it is necessary to collect data with read sizes of 200 μm and 50 μm for the chest and phalangeal images, respectively, in order to digitize these images without loss of their frequency components. (author)
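
    The low-pass/NMSE comparison can be sketched as follows; the circular Fourier-domain cut-off is an illustrative implementation, not necessarily the filter used in the study.

    import numpy as np

    def lowpass_nmse(img, cutoff_frac):
        """NMSE between an image and its Fourier-domain low-pass version."""
        F = np.fft.fftshift(np.fft.fft2(img))
        h, w = img.shape
        y, x = np.indices((h, w))
        r = np.hypot(y - h / 2, x - w / 2)
        F_lp = np.where(r <= cutoff_frac * r.max(), F, 0)       # keep low frequencies only
        img_lp = np.real(np.fft.ifft2(np.fft.ifftshift(F_lp)))
        return np.sum((img - img_lp) ** 2) / np.sum(img ** 2)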

  19. Analysis of computational vulnerabilities in digital repositories

    Directory of Open Access Journals (Sweden)

    Valdete Fernandes Belarmino

    2015-04-01

    Full Text Available Objective. This paper presents the results of research that aimed to analyse the computational vulnerabilities of digital repositories in public universities. It argues the relevance of information in contemporary societies as an invaluable resource, emphasizing scientific information as an essential element of scientific progress. It characterizes the emergence of digital repositories, highlights their use in the academic environment to preserve, promote, disseminate, and encourage scientific production, and describes the main software for building digital repositories. Method. The investigation identified and analysed the vulnerabilities to which digital repositories are exposed by running penetration tests, discriminating the levels of risk and the types of vulnerabilities. Results. From a sample of 30 repositories, 20 could be examined; it was found that 5% of the repositories have critical vulnerabilities, 85% high, 25% medium, and 100% low. Conclusions. This demonstrates the need to adapt actions for these environments that promote information security, minimizing the incidence of external and/or internal attacks on the systems.

  20. Complex Signal Kurtosis and Independent Component Analysis for Wideband Radio Frequency Interference Detection

    Science.gov (United States)

    Schoenwald, Adam; Mohammed, Priscilla; Bradley, Damon; Piepmeier, Jeffrey; Wong, Englin; Gholian, Armen

    2016-01-01

    Radio-frequency interference (RFI) has negatively affected scientific measurements from a wide variety of passive remote sensing satellites. This has been observed in the L-band radiometers SMOS, Aquarius and, more recently, SMAP [1, 2]. RFI has also been observed at higher frequencies such as K band [3]. Improvements in technology have allowed wider-bandwidth digital back ends for passive microwave radiometry. A complex signal kurtosis radio frequency interference detector was developed to help identify corrupted measurements [4]. This work explores the use of ICA (Independent Component Analysis) as a blind source separation technique to pre-process radiometric signals for use with the previously developed real and complex signal kurtosis detectors.
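
    One common form of the complex-signal kurtosis statistic is sketched below; the detector in the cited work may differ in normalisation and thresholding details.

    import numpy as np

    def complex_kurtosis(x):
        """Normalised fourth moment of complex baseband samples.
        Approximately 2 for RFI-free (circular complex Gaussian) noise;
        significant deviations flag likely interference."""
        x = np.asarray(x) - np.mean(x)
        return np.mean(np.abs(x) ** 4) / np.mean(np.abs(x) ** 2) ** 2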

  1. The Digital Image Processing And Quantitative Analysis In Microscopic Image Characterization

    International Nuclear Information System (INIS)

    Ardisasmita, M. Syamsa

    2000-01-01

Although many electron microscopes produce digital images, not all of them are equipped with a supporting unit to process and analyse image data quantitatively. Generally, image analysis has to be done visually and measurements are made manually. The development of mathematical methods for geometric analysis and pattern recognition allows automatic microscopic image analysis with a computer. Image processing programs can be used for analysis of image texture and periodic structure by applying the Fourier transform. With the development of composite materials, Fourier analysis in the frequency domain has become important for measuring crystallographic orientation. The analysis of periodic structure and crystal orientation is the key to understanding many material properties such as mechanical strength, stress, heat conductivity, resistance, capacitance and other electric and magnetic properties. This paper shows the application of digital image processing to the characterization and analysis of microscopic images.

  2. High-Speed Microscale Optical Tracking Using Digital Frequency-Domain Multiplexing.

    Science.gov (United States)

    Maclachlan, Robert A; Riviere, Cameron N

    2009-06-01

    Position-sensitive detectors (PSDs), or lateral-effect photodiodes, are commonly used for high-speed, high-resolution optical position measurement. This paper describes the instrument design for multidimensional position and orientation measurement based on the simultaneous position measurement of multiple modulated sources using frequency-domain-multiplexed (FDM) PSDs. The important advantages of this optical configuration in comparison with laser/mirror combinations are that it has a large angular measurement range and allows the use of a probe that is small in comparison with the measurement volume. We review PSD characteristics and quantitative resolution limits, consider the lock-in amplifier measurement system as a communication link, discuss the application of FDM to PSDs, and make comparisons with time-domain techniques. We consider the phase-sensitive detector as a multirate DSP problem, explore parallels with Fourier spectral estimation and filter banks, discuss how to choose the modulation frequencies and sample rates that maximize channel isolation under design constraints, and describe efficient digital implementation. We also discuss hardware design considerations, sensor calibration, probe construction and calibration, and 3-D measurement by triangulation using two sensors. As an example, we characterize the resolution, speed, and accuracy of an instrument that measures the position and orientation of a 10 mm × 5 mm probe in 5 degrees of freedom (DOF) over a 30-mm cube with 4-μm peak-to-peak resolution at 1-kHz sampling.
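
    The core idea of frequency-domain multiplexing on a single detector is that each modulated source can be recovered by digital quadrature (lock-in) demodulation at its own carrier. A minimal sketch under idealized assumptions follows; the carrier frequencies, sample rate and averaging length are hypothetical, and the multirate filter-bank and calibration details discussed in the paper are not modelled:

        import numpy as np

        def lockin_demodulate(signal, f_mod, fs, n_avg):
            """Recover one frequency-multiplexed channel: mix the detector signal
            with quadrature references at f_mod and average over n_avg samples
            (a simple boxcar low-pass), returning an amplitude estimate per block."""
            t = np.arange(len(signal)) / fs
            i_mix = signal * np.cos(2 * np.pi * f_mod * t)
            q_mix = signal * np.sin(2 * np.pi * f_mod * t)
            n = (len(signal) // n_avg) * n_avg
            i_avg = i_mix[:n].reshape(-1, n_avg).mean(axis=1)
            q_avg = q_mix[:n].reshape(-1, n_avg).mean(axis=1)
            return 2.0 * np.hypot(i_avg, q_avg)

        # Hypothetical: two sources modulated at 10 kHz and 13 kHz on one detector
        fs = 200_000.0
        t = np.arange(20_000) / fs
        photocurrent = 1.0 * np.sin(2 * np.pi * 10_000 * t) + 0.4 * np.sin(2 * np.pi * 13_000 * t)
        print(lockin_demodulate(photocurrent, 10_000, fs, n_avg=2000)[:3])   # ~1.0 per block
        print(lockin_demodulate(photocurrent, 13_000, fs, n_avg=2000)[:3])   # ~0.4 per block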

  3. A Digital Hysteresis Current Control for Half-Bridge Inverters with Constrained Switching Frequency

    Directory of Open Access Journals (Sweden)

    Triet Nguyen-Van

    2017-10-01

Full Text Available This paper proposes a new robustly adaptive hysteresis current digital control algorithm for half-bridge inverters, which play an important role in electric power conversion and in various electronic system applications. The proposed control algorithm is assumed to be implemented on a high-speed Field Programmable Gate Array (FPGA) circuit, using measured data sampled at a high frequency. The hysteresis current band is computed in each switching modulation period based on both the current error and the negative half switching period of the previous modulation period, in addition to the conventionally used voltages measured at the computation instants. The proposed control algorithm is derived by solving an optimization problem in which the switching frequency is always constrained below the desired constant frequency, a property not guaranteed by the conventional method. The optimization also keeps the output current stable around the reference and minimizes power loss. Simulation results show good performance of the proposed algorithm compared with the conventional one.
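
    For illustration, a generic fixed-band hysteresis current controller for a half-bridge driving an ideal inductive load can be simulated as below. This is not the adaptive band computation proposed in the paper; it only shows the basic bang-bang mechanism, and all parameter values are hypothetical:

        import numpy as np

        def hysteresis_current_control(i_ref, band, v_dc, L, dt):
            """Simulate a generic fixed-band hysteresis current controller for a
            half-bridge driving an ideal inductive load (resistance neglected)."""
            i = 0.0
            state = 1                  # 1: upper switch on (+Vdc/2), 0: lower switch on (-Vdc/2)
            currents = []
            for ref in i_ref:
                err = i - ref
                if err > band:
                    state = 0          # current too high: apply negative voltage
                elif err < -band:
                    state = 1          # current too low: apply positive voltage
                v = 0.5 * v_dc if state else -0.5 * v_dc
                i += v / L * dt        # di/dt = v / L for the idealized load
                currents.append(i)
            return np.array(currents)

        # Hypothetical 50 Hz sinusoidal reference tracked with a 0.2 A hysteresis band
        t = np.arange(0.0, 0.02, 1e-6)
        i_ref = 10.0 * np.sin(2 * np.pi * 50 * t)
        i_out = hysteresis_current_control(i_ref, band=0.2, v_dc=400.0, L=5e-3, dt=1e-6)
        print(np.max(np.abs(i_out - i_ref)))   # tracking error stays on the order of the band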

  4. REVIEW OF NRC APPROVED DIGITAL CONTROL SYSTEMS ANALYSIS

    International Nuclear Information System (INIS)

    Markman, D.W.

    1999-01-01

Preliminary design concepts for the proposed Subsurface Repository at Yucca Mountain indicate extensive reliance on modern, computer-based, digital control technologies. The purpose of this analysis is to investigate the degree to which the U. S. Nuclear Regulatory Commission (NRC) has accepted and approved the use of digital control technology for safety-related applications within the nuclear power industry. This analysis reviews cases of existing digitally-based control systems that have been approved by the NRC. These cases can serve as precedents for using similar types of digitally-based control technologies within the Subsurface Repository. While it is anticipated that the Yucca Mountain Project (YMP) will not contain control systems as complex as those required for a nuclear power plant, the review of these existing NRC-approved applications will provide the YMP with valuable insight into the NRC's review process and design expectations for safety-related digital control systems. According to the YMP Compliance Program Guidance and portions of various NUREGs, Regulatory Guides, and nuclear IEEE standards, the nuclear power plant safety-related concept would be applied to some of the designs on a case-by-case basis. This analysis will consider key design methods, capabilities, successes, and important limitations or problems of selected control systems that have been approved for use in the nuclear power industry. An additional purpose of this analysis is to provide background information in support of further development of design criteria for the YMP. The scope and primary objectives of this analysis are to: (1) Identify and research the extent and precedence of digital control and remotely operated systems approved by the NRC for the nuclear power industry, and help provide a basis for using and relying on digital technologies for nuclear-related safety-critical applications. (2) Identify the basic control architecture and methods of key digital control

  5. ACE2 Global Digital Elevation Model : User Analysis

    Science.gov (United States)

    Smith, R. G.; Berry, P. A. M.; Benveniste, J.

    2013-12-01

    Altimeter Corrected Elevations 2 (ACE2), first released in October 2009, is the Global Digital Elevation Model (GDEM) created by fusing the high accuracy of over 100 million altimeter retracked height estimates, derived primarily from the ERS-1 Geodetic Mission, with the high frequency content available within the near-global Shuttle Radar Topography Mission. This novel ACE2 GDEM is freely available at 3”, 9”, 30” and 5' and has been distributed via the web to over 680 subscribers. This paper presents the results of a detailed analysis of geographical distribution of subscribed users, along with fields of study and potential uses. Investigations have also been performed to determine the most popular spatial resolutions and the impact these have on the scope of data downloaded. The analysis has shown that, even though the majority of users have come from Europe and America, a significant number of website hits have been received from South America, Africa and Asia. Registered users also vary widely, from research institutions and major companies down to individual hobbyists looking at data for single projects.

  6. Digital Architecture – Results From a Gap Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna Helene [Idaho National Lab. (INL), Idaho Falls, ID (United States); Thomas, Kenneth David [Idaho National Lab. (INL), Idaho Falls, ID (United States); Fitzgerald, Kirk [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

The digital architecture is defined as a collection of IT capabilities needed to support and integrate a wide-spectrum of real-time digital capabilities for nuclear power plant performance improvements. The digital architecture can be thought of as an integration of the separate I&C and information systems already in place in NPPs, brought together for the purpose of creating new levels of automation in NPP work activities. In some cases, it might be an extension of the current communication systems, to provide digital communications where they are currently analog only. This collection of IT capabilities must in turn be based on a set of user requirements that must be supported for the interconnected technologies to operate in an integrated manner. These requirements, simply put, are a statement of what sorts of digital work functions will be exercised in a fully-implemented seamless digital environment and how much they will be used. The goal of the digital architecture research is to develop a methodology for mapping nuclear power plant operational and support activities into the digital architecture, which includes the development of a consensus model for advanced information and control architecture. The consensus model should be developed at a level of detail that is useful to the industry. In other words, not so detailed that it specifies specific protocols and not so vague that it only provides a high level description of technology. The next step towards the model development is to determine the current state of digital architecture at typical NPPs. To investigate the current state, the researchers conducted a gap analysis to determine to what extent the NPPs can support the future digital technology environment with their existing I&C and IT structure, and where gaps exist with respect to the full deployment of technology over time. The methodology, results, and conclusions from the gap analysis are described in this report.

  7. Optimal depth-based regional frequency analysis

    Directory of Open Access Journals (Sweden)

    H. Wazneh

    2013-06-01

Full Text Available Classical methods of regional frequency analysis (RFA) of hydrological variables face two drawbacks: (1) the restriction to a particular region which can lead to a loss of some information and (2) the definition of a region that generates a border effect. To reduce the impact of these drawbacks on regional modeling performance, an iterative method was proposed recently, based on the statistical notion of the depth function and a weight function φ. This depth-based RFA (DBRFA) approach was shown to be superior to traditional approaches in terms of flexibility, generality and performance. The main difficulty of the DBRFA approach is the optimal choice of the weight function φ (e.g., φ minimizing estimation errors). In order to avoid a subjective choice and naïve selection procedures of φ, the aim of the present paper is to propose an algorithm-based procedure to optimize the DBRFA and automate the choice of φ according to objective performance criteria. This procedure is applied to estimate flood quantiles in three different regions in North America. One of the findings from the application is that the optimal weight function depends on the considered region and can also quantify the region's homogeneity. By comparing the DBRFA to the canonical correlation analysis (CCA) method, results show that the DBRFA approach leads to better performances both in terms of relative bias and mean square error.

  8. Optimal depth-based regional frequency analysis

    Science.gov (United States)

    Wazneh, H.; Chebana, F.; Ouarda, T. B. M. J.

    2013-06-01

Classical methods of regional frequency analysis (RFA) of hydrological variables face two drawbacks: (1) the restriction to a particular region which can lead to a loss of some information and (2) the definition of a region that generates a border effect. To reduce the impact of these drawbacks on regional modeling performance, an iterative method was proposed recently, based on the statistical notion of the depth function and a weight function φ. This depth-based RFA (DBRFA) approach was shown to be superior to traditional approaches in terms of flexibility, generality and performance. The main difficulty of the DBRFA approach is the optimal choice of the weight function φ (e.g., φ minimizing estimation errors). In order to avoid a subjective choice and naïve selection procedures of φ, the aim of the present paper is to propose an algorithm-based procedure to optimize the DBRFA and automate the choice of φ according to objective performance criteria. This procedure is applied to estimate flood quantiles in three different regions in North America. One of the findings from the application is that the optimal weight function depends on the considered region and can also quantify the region's homogeneity. By comparing the DBRFA to the canonical correlation analysis (CCA) method, results show that the DBRFA approach leads to better performances both in terms of relative bias and mean square error.

  9. Digitization

    DEFF Research Database (Denmark)

    Finnemann, Niels Ole

    2014-01-01

    what a concept of digital media might add to the understanding of processes of mediatization and what the concept of mediatization might add to the understanding of digital media. It is argued that digital media open an array of new trajectories in human communication, trajectories which were...

  10. A Novel Sub-pixel Measurement Algorithm Based on Mixed the Fractal and Digital Speckle Correlation in Frequency Domain

    Directory of Open Access Journals (Sweden)

    Zhangfang Hu

    2014-10-01

Full Text Available The digital speckle correlation is a non-contact in-plane displacement measurement method based on machine vision. Motivated by the low accuracy and the large amount of calculation of the traditional digital speckle correlation method in the spatial domain, we introduce a sub-pixel displacement measurement algorithm which employs a fast interpolation method based on fractal theory together with digital speckle correlation in the frequency domain. This algorithm can overcome both the blocking effect and the blurring caused by traditional interpolation methods, and the frequency-domain processing also avoids the repeated searching of correlation recognition in the spatial domain, so the amount of computation is largely reduced and the speed of information extraction is improved. A comparative experiment is given to verify that the proposed algorithm is effective.
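
    The frequency-domain correlation step can be sketched as follows: the cross-correlation is computed via FFTs and the integer peak is refined to sub-pixel precision with a parabolic fit. The fractal-theory interpolation proposed in the paper is not reproduced here; the speckle pattern and shift are synthetic:

        import numpy as np

        def subpixel_shift(ref, deformed):
            """Estimate the in-plane displacement between two speckle images by
            cross-correlation in the frequency domain, refined to sub-pixel
            accuracy with a parabolic fit around the correlation peak."""
            R = np.fft.fft2(ref)
            D = np.fft.fft2(deformed)
            corr = np.real(np.fft.ifft2(R * np.conj(D)))
            peak = np.unravel_index(np.argmax(corr), corr.shape)

            def parabolic(c_minus, c_zero, c_plus):
                denom = c_minus - 2.0 * c_zero + c_plus
                return 0.0 if denom == 0 else 0.5 * (c_minus - c_plus) / denom

            py, px = peak
            dy = parabolic(corr[py - 1, px], corr[py, px], corr[(py + 1) % corr.shape[0], px])
            dx = parabolic(corr[py, px - 1], corr[py, px], corr[py, (px + 1) % corr.shape[1]])
            shift = np.array(peak, dtype=float) + np.array([dy, dx])
            # Map large positive indices back to negative (circular) shifts
            dims = np.array(corr.shape, dtype=float)
            return np.where(shift > dims / 2, shift - dims, shift)

        # Hypothetical speckle pattern shifted by a known amount
        rng = np.random.default_rng(1)
        speckle = rng.random((128, 128))
        moved = np.roll(speckle, (3, 5), axis=(0, 1))
        print(subpixel_shift(moved, speckle))   # approximately [3. 5.]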

  11. The digital storytelling process: A comparative analysis from various experts

    Science.gov (United States)

    Hussain, Hashiroh; Shiratuddin, Norshuhada

    2016-08-01

Digital Storytelling (DST) is a method of delivering information to an audience. It combines narrative and digital media content infused with multimedia elements. In order for educators (i.e. the designers) to create a compelling digital story, experts have introduced sets of processes to guide them; nevertheless, the experts suggest a variety of processes, some of which are redundant. The main aim of this study is to propose a single guiding process for the creation of DST. A comparative analysis is employed in which ten DST models from various experts are analysed. The process can also be applied to other multimedia materials that use the concept of DST.

  12. Clustering of users of digital libraries through log file analysis

    Directory of Open Access Journals (Sweden)

    Juan Antonio Martínez-Comeche

    2017-09-01

Full Text Available This study analyzes how users perform information retrieval tasks when submitting queries to the Hispanic Digital Library. Clusters of users are differentiated based on their distinct information behavior. The study used the log files collected by the server over one year, and different possible clustering algorithms were compared. The k-means algorithm is found to be a suitable clustering method for the analysis of large log files from digital libraries. In the case of the Hispanic Digital Library, the results show three clusters of users, and the characteristic information behavior of each group is described.
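
    A minimal sketch of the k-means step is shown below, using scikit-learn and assuming per-user features have already been aggregated from the log files; the feature set and values are hypothetical, not those of the Hispanic Digital Library study:

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        # Hypothetical per-user features aggregated from the server log files:
        # [sessions, mean queries per session, mean dwell time (s), downloads]
        features = np.array([
            [3, 1.2, 40.0, 0],
            [25, 4.8, 310.0, 12],
            [7, 2.0, 95.0, 1],
            [40, 6.1, 500.0, 30],
            [2, 1.0, 20.0, 0],
            [22, 5.5, 280.0, 9],
        ])

        X = StandardScaler().fit_transform(features)   # put features on a common scale
        kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

        for label in range(3):
            members = features[kmeans.labels_ == label]
            print(f"cluster {label}: {len(members)} users, mean features {members.mean(axis=0)}")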

  13. Digital radiographic techniques in the analysis of paintings

    International Nuclear Information System (INIS)

    James, A.E. Jr.; Gibbs, S.J.; James, A.E. III; Pickens, D.R.; Sloan, M.; Price, R.R.; Erickson, J.J.

    1985-01-01

    In this chapter the authors use the term digital radiography to mean any method of radiographic image production in which the silver halide-based film is replaced by an electronic sensor for production of an image. There are essentially three types of digital radiographic systems available at present, but others will be developed. These differ primarily in the method of image production and the rapidity with which images can be produced. The three methods discussed are digital fluoroscopy, scanned projection radiography, and the scanned point source radiography. Each has certain characteristics which, if properly utilized, will allow improved x-ray analysis of paintings

  14. Percorsi linguistici e semiotici: Critical Multimodal Analysis of Digital Discourse

    Directory of Open Access Journals (Sweden)

    edited by Ilaria Moschini

    2014-12-01

    Full Text Available The language section of LEA - edited by Ilaria Moschini - is dedicated to the Critical Multimodal Analysis of Digital Discourse, an approach that encompasses the linguistic and semiotic detailed investigation of texts within a socio-cultural perspective. It features an interview with Professor Theo van Leeuwen by Ilaria Moschini and four essays: “Retwitting, reposting, repinning; reshaping identities online: Towards a social semiotic multimodal analysis of digital remediation” by Elisabetta Adami; “Multimodal aspects of corporate social responsibility communication” by Carmen Daniela Maier; “Pervasive Technologies and the Paradoxes of Multimodal Digital Communication” by Sandra Petroni and “Can the powerless speak? Linguistic and multimodal corporate media manipulation in digital environments: the case of Malala Yousafzai” by Maria Grazia Sindoni. 

  15. #DigitalHealth: Exploring Users' Perspectives through Social Media Analysis.

    Science.gov (United States)

    Afyouni, Soroosh; Fetit, Ahmed E; Arvanitis, Theodoros N

    2015-01-01

In order to explore the role of social media in forming an understanding of digital healthcare, we conducted a study involving sentiment and network analysis of Twitter content. We gathered 20,400 tweets that mentioned the key term #DigitalHealth over 55 hours during a three-day period. In addition to examining users' opinions through sentiment analysis, we calculated the in-degree centralities of nodes to identify the hubs in the network of interactions. The results suggest that the overall opinion about digital healthcare is generally positive. Additionally, our findings indicate that the most prevalent keywords associated with digital health range widely from mobile health to wearable technologies and big data. Surprisingly, the results show that the newly announced wearable technologies could occupy the majority of discussions.
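
    The hub-identification step can be illustrated with in-degree centrality on a directed interaction graph, for example with NetworkX; the edge list below is invented for illustration, not data from the study:

        import networkx as nx

        # Hypothetical directed interaction graph: an edge (a, b) means that user a
        # mentioned or retweeted user b in a #DigitalHealth tweet.
        edges = [
            ("alice", "who"), ("bob", "who"), ("carol", "who"),
            ("dave", "healthtech"), ("erin", "healthtech"),
            ("frank", "alice"),
        ]

        G = nx.DiGraph(edges)
        centrality = nx.in_degree_centrality(G)   # in-degree normalized by (n - 1)

        # Nodes with the highest in-degree act as hubs of the conversation
        hubs = sorted(centrality.items(), key=lambda kv: kv[1], reverse=True)[:3]
        print(hubs)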

  16. Digital spectral analysis parametric, non-parametric and advanced methods

    CERN Document Server

    Castanié, Francis

    2013-01-01

    Digital Spectral Analysis provides a single source that offers complete coverage of the spectral analysis domain. This self-contained work includes details on advanced topics that are usually presented in scattered sources throughout the literature.The theoretical principles necessary for the understanding of spectral analysis are discussed in the first four chapters: fundamentals, digital signal processing, estimation in spectral analysis, and time-series models.An entire chapter is devoted to the non-parametric methods most widely used in industry.High resolution methods a

  17. Mars Global Surveyor Ka-Band Frequency Data Analysis

    Science.gov (United States)

    Morabito, D.; Butman, S.; Shambayati, S.

    2000-01-01

The Mars Global Surveyor (MGS) spacecraft, launched on November 7, 1996, carries an experimental space-to-ground telecommunications link at Ka-band (32 GHz) along with the primary X-band (8.4 GHz) downlink. The signals are simultaneously transmitted from a 1.5-m diameter parabolic high gain antenna (HGA) on MGS and received by a beam-waveguide (BWG) R&D 34-meter antenna located in NASA's Goldstone Deep Space Network (DSN) complex near Barstow, California. The projected 5-dB link advantage of Ka-band relative to X-band was confirmed in previous reports using measurements of MGS signal strength data acquired during the first two years of the link experiment from December 1996 to December 1998. Analysis of X-band and Ka-band frequency data and difference frequency (fx-fka)/3.8 data will be presented here. On board the spacecraft, a low-power sample of the X-band downlink from the transponder is upconverted to 32 GHz, the Ka-band frequency, amplified to 1 W using a Solid State Power Amplifier, and radiated from the dual X/Ka HGA. The X-band signal is amplified by one of two 25 W TWTAs. An upconverter first downconverts the 8.42 GHz X-band signal to 8 GHz and then multiplies it using a ×4 multiplier, producing the 32 GHz Ka-band frequency. The frequency source selection is performed by an RF switch which can be commanded to select a VCO (Voltage Controlled Oscillator) or USO (Ultra-Stable Oscillator) reference. The Ka-band frequency can be either coherent with the X-band downlink reference or a hybrid combination of the USO and VCO derived frequencies. The data in this study were chosen such that the Ka-band signal is purely coherent with the X-band signal (that is, the downconverter is driven by the same frequency source as the X-band downlink). The ground station used to acquire the data is DSS-13, a 34-meter BWG antenna which incorporates a series of mirrors inside beam waveguide tubes which guide the energy to a subterranean pedestal room, providing a stable environment

  18. A new algorithm for a high-modulation frequency and high-speed digital lock-in amplifier

    International Nuclear Information System (INIS)

    Jiang, G L; Yang, H; Li, R; Kong, P

    2016-01-01

To increase the maximum modulation frequency of the digital lock-in amplifier in an online system, we propose a new algorithm using a square wave reference whose frequency is an odd sub-multiple of the modulation frequency, which is based on the odd harmonic components of the square wave reference. The sampling frequency is four times the modulation frequency to ensure the orthogonality of the reference sequences. Only additions and subtractions are used to implement phase-sensitive detection, which speeds up the computation in the lock-in. Furthermore, the maximum modulation frequency of the lock-in is enhanced considerably. The feasibility of this new algorithm is tested by simulation and experiments. (paper)
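
    The add/subtract phase-sensitive detection follows from sampling at four times the reference frequency, where the quadrature reference sequences reduce to {+1, 0, −1, 0} and {0, +1, 0, −1}. The sketch below demodulates at the modulation frequency itself (the odd-sub-multiple reference trick of the paper is not reproduced); the test signal is hypothetical:

        import numpy as np

        def four_point_lockin(samples):
            """Phase-sensitive detection with fs = 4 * f_mod using only additions
            and subtractions: per period, I accumulates x0 - x2 and Q accumulates
            x1 - x3 (for a cosine input, I = 2A*cos(phi), Q = -2A*sin(phi))."""
            n = (len(samples) // 4) * 4
            x = np.asarray(samples[:n], dtype=float).reshape(-1, 4)
            i_sum = np.sum(x[:, 0] - x[:, 2])
            q_sum = np.sum(x[:, 1] - x[:, 3])
            periods = x.shape[0]
            amplitude = np.hypot(i_sum, q_sum) / (2.0 * periods)
            phase = np.arctan2(-q_sum, i_sum)    # phase of the cosine input
            return amplitude, phase

        # Hypothetical test: 5 kHz tone sampled at 20 kHz with a little noise
        fs, f_mod = 20_000.0, 5_000.0
        t = np.arange(2000) / fs
        x = 0.7 * np.cos(2 * np.pi * f_mod * t + 0.3)
        x += 0.05 * np.random.default_rng(2).standard_normal(t.size)
        print(four_point_lockin(x))   # approximately (0.7, 0.3)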

  19. Analysis of medical exposures in digital mammography

    International Nuclear Information System (INIS)

    Oliveira, Sergio R.; Mantuano, Natalia O.; Albrecht, Afonso S.

    2014-01-01

Currently, the use of digital mammography for the early diagnosis of breast cancer is increasingly common due to the production of high-definition images that allow subtle changes in breast image profiles to be detected. However, improvement of the technique used is necessary, since some devices offer parameters for minimizing the entrance skin dose. Thus, this study examines how the qualification of radiology technologists affects the use of the techniques applied in mammography. For this, a survey was carried out in a hospital in the city of Rio de Janeiro, which evaluated the examinations of 1190 patients undergoing routine mammography (routine is considered to be the 4 basic exposures: 2 craniocaudal and 2 mediolateral oblique views, excluding repeats and supplementary views) in 2013. The medical exposures analyzed were obtained from a single full-field digital unit, model Senographe DS, and were compared across three different professionals performing the mammographic techniques. The images were classified according to the exposure techniques available on the equipment, standard (STD), contrast (CNT) and dose (DOSE), and according to the breast density of the patient. Comparing the variation of the radiographic technique with respect to the professional who made the exposure, professional B showed the best conduct with regard to radiological protection, because she considered breast density in the choice of the technical parameters of the equipment. Professional A, who is newly trained, and professional C, who has more time in service, made almost no variation in the exposure pattern, even for different breast densities. Thus, we can conclude that there is a need to update the professionals so that the available tools for dose limitation and the variability of breasts in digital mammography are efficiently employed in the service routine and thus meet the requirements of current legislation

  20. Image quality analysis of digital mammographic equipments

    Energy Technology Data Exchange (ETDEWEB)

    Mayo, P.; Pascual, A.; Verdu, G. [Valencia Univ. Politecnica, Chemical and Nuclear Engineering Dept. (Spain); Rodenas, F. [Valencia Univ. Politecnica, Applied Mathematical Dept. (Spain); Campayo, J.M. [Valencia Univ. Hospital Clinico, Servicio de Radiofisica y Proteccion Radiologica (Spain); Villaescusa, J.I. [Hospital Clinico La Fe, Servicio de Proteccion Radiologica, Valencia (Spain)

    2006-07-01

The image quality assessment of a radiographic phantom image is one of the fundamental points in a complete quality control programme. The end result of a properly functioning process must be an image of appropriate quality to allow a suitable diagnosis. Nowadays, digital radiographic equipment is replacing traditional film-screen equipment, and it is necessary to update the parameters used to guarantee the quality of the process. Contrast-detail phantoms are applied to digital radiography to study the threshold contrast-detail sensitivity at the operating conditions of the equipment. The phantom studied in this work is C.D.M.A.M. 3.4, which facilitates the evaluation of image contrast and detail resolution. One of the most widely used indexes for measuring image quality in an objective way is the Image Quality Figure (I.Q.F.). This parameter is useful for calculating the image quality taking into account the contrast and detail resolution of the analysed image. The contrast-detail curve is also useful as a measure of image quality, because it is a graphical representation in which the hole thickness and diameter are plotted for each contrast-detail combination detected in the radiographic image of the phantom. It is useful for comparing the functioning of different radiographic image systems, for phantom images under the same exposure conditions. The aim of this work is to study the image quality of different contrast-detail phantom C.D.M.A.M. 3.4 images, carrying out automatic detection of the contrast-detail combinations and establishing a parameter which characterizes the mammographic image quality in an objective way. This is useful for comparing images obtained with different digital mammographic equipment in order to study the functioning of the equipment. (authors)
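
    For reference, the Image Quality Figure can be illustrated with one common convention, IQF = Σᵢ Cᵢ·Dᵢ,min over the phantom's contrast columns (lower is better), sometimes reported in its inverse form; the exact definition used in this work may differ. The detection values below are hypothetical:

        # One common convention: IQF = sum_i (C_i * D_i,min), where C_i is the disc
        # thickness (contrast) of column i and D_i,min is the smallest disc diameter
        # detected in that column; lower IQF means better contrast-detail performance.
        # The detections below are hypothetical, not measurements from this work.
        detections = {
            # contrast (um of gold) : smallest detected diameter (mm)
            0.03: 2.00,
            0.06: 1.25,
            0.13: 0.63,
            0.25: 0.31,
            0.50: 0.16,
            1.00: 0.10,
        }

        iqf = sum(contrast * diameter for contrast, diameter in detections.items())
        iqf_inv = 100.0 / iqf    # inverse form, higher is better
        print(f"IQF = {iqf:.3f}, IQF_inv = {iqf_inv:.1f}")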

  1. Image quality analysis of digital mammographic equipments

    International Nuclear Information System (INIS)

    Mayo, P.; Pascual, A.; Verdu, G.; Rodenas, F.; Campayo, J.M.; Villaescusa, J.I.

    2006-01-01

The image quality assessment of a radiographic phantom image is one of the fundamental points in a complete quality control programme. The end result of a properly functioning process must be an image of appropriate quality to allow a suitable diagnosis. Nowadays, digital radiographic equipment is replacing traditional film-screen equipment, and it is necessary to update the parameters used to guarantee the quality of the process. Contrast-detail phantoms are applied to digital radiography to study the threshold contrast-detail sensitivity at the operating conditions of the equipment. The phantom studied in this work is C.D.M.A.M. 3.4, which facilitates the evaluation of image contrast and detail resolution. One of the most widely used indexes for measuring image quality in an objective way is the Image Quality Figure (I.Q.F.). This parameter is useful for calculating the image quality taking into account the contrast and detail resolution of the analysed image. The contrast-detail curve is also useful as a measure of image quality, because it is a graphical representation in which the hole thickness and diameter are plotted for each contrast-detail combination detected in the radiographic image of the phantom. It is useful for comparing the functioning of different radiographic image systems, for phantom images under the same exposure conditions. The aim of this work is to study the image quality of different contrast-detail phantom C.D.M.A.M. 3.4 images, carrying out automatic detection of the contrast-detail combinations and establishing a parameter which characterizes the mammographic image quality in an objective way. This is useful for comparing images obtained with different digital mammographic equipment in order to study the functioning of the equipment. (authors)

  2. A digital photopeak integration in activation analysis

    International Nuclear Information System (INIS)

    Czauderna, M.; Peplowski, A.

    1992-01-01

A study of the precision attainable by two methods of γ-ray photopeak computation has been carried out. The 'total peak area' method (TPA) and the proposed new method have been compared. The proposed method is digital and repeatedly simulates accumulations of γ-ray spectra. The method described here computes the apparent net peak area without a clear distinction between peak and non-peak related channels. The proposed method is considered to be the most advantageous because of its relatively high precision. (author) 10 refs.; 1 fig.; 3 tabs
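
    For comparison, the conventional total peak area (TPA) method referred to above can be sketched as follows: the gross counts in the peak region are summed and a background estimated from a few channels on either side is subtracted. The channel limits and the spectrum below are hypothetical:

        import numpy as np

        def total_peak_area(spectrum, lo, hi, n_bg=3):
            """Net photopeak area by the total-peak-area (TPA) method: gross counts
            in channels [lo, hi] minus a background estimated from n_bg channels on
            each side of the peak region; returns the net area and its uncertainty."""
            spectrum = np.asarray(spectrum, dtype=float)
            gross = spectrum[lo:hi + 1].sum()
            width = hi - lo + 1
            bg_left = spectrum[lo - n_bg:lo].mean()
            bg_right = spectrum[hi + 1:hi + 1 + n_bg].mean()
            background = width * (bg_left + bg_right) / 2.0
            net = gross - background
            # Approximate counting-statistics uncertainty of the net area
            sigma = np.sqrt(gross + (width ** 2 / (4.0 * n_bg)) * (bg_left + bg_right))
            return net, sigma

        # Hypothetical spectrum: a Gaussian photopeak on a flat background
        ch = np.arange(200)
        spectrum = 50 + 1000 * np.exp(-0.5 * ((ch - 100) / 3.0) ** 2)
        print(total_peak_area(spectrum, lo=91, hi=109))   # net area close to ~7500 counts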

  3. Analysis of extracellular RNA by digital PCR

    Directory of Open Access Journals (Sweden)

    Kenji eTakahashi

    2014-06-01

    Full Text Available The transfer of extracellular RNA is emerging as an important mechanism for intracellular communication. The ability for the transfer of functionally active RNA molecules from one cell to another within vesicles such as exosomes enables a cell to modulate cellular signaling and biological processes within recipient cells. The study of extracellular RNA requires sensitive methods for the detection of these molecules. In this methods article, we will describe protocols for the detection of such extracellular RNA using sensitive detection technologies such as digital PCR. These protocols should be valuable to researchers interested in the role and contribution of extracellular RNA to tumor cell biology.

  4. Reliability analysis of digital I and C systems at KAERI

    International Nuclear Information System (INIS)

    Kim, Man Cheol

    2013-01-01

    This paper provides an overview of the ongoing research activities on a reliability analysis of digital instrumentation and control (I and C) systems of nuclear power plants (NPPs) performed by the Korea Atomic Energy Research Institute (KAERI). The research activities include the development of a new safety-critical software reliability analysis method by integrating the advantages of existing software reliability analysis methods, a fault coverage estimation method based on fault injection experiments, and a new human reliability analysis method for computer-based main control rooms (MCRs) based on human performance data from the APR-1400 full-scope simulator. The research results are expected to be used to address various issues such as the licensing issues related to digital I and C probabilistic safety assessment (PSA) for advanced digital-based NPPs. (author)

  5. Automated hazard analysis of digital control systems

    International Nuclear Information System (INIS)

    Garrett, Chris J.; Apostolakis, George E.

    2002-01-01

    Digital instrumentation and control (I and C) systems can provide important benefits in many safety-critical applications, but they can also introduce potential new failure modes that can affect safety. Unlike electro-mechanical systems, whose failure modes are fairly well understood and which can often be built to fail in a particular way, software errors are very unpredictable. There is virtually no nontrivial software that will function as expected under all conditions. Consequently, there is a great deal of concern about whether there is a sufficient basis on which to resolve questions about safety. In this paper, an approach for validating the safety requirements of digital I and C systems is developed which uses the Dynamic Flowgraph Methodology to conduct automated hazard analyses. The prime implicants of these analyses can be used to identify unknown system hazards, prioritize the disposition of known system hazards, and guide lower-level design decisions to either eliminate or mitigate known hazards. In a case study involving a space-based reactor control system, the method succeeded in identifying an unknown failure mechanism

  6. Effects of high frequency electromagnetic field emitted from digital cellular telephones on electronic pocket dosimeters

    International Nuclear Information System (INIS)

    Shizuhiko, Deji; Kunihide, Nishizawa

    2002-01-01

High frequency electromagnetic fields emitted from digital cellular telephones (cell phones) occasionally cause abnormally high readings (wrong dosages) on electronic pocket dosimeters (EPD). The electric field strength distribution around a cell phone transmitting in the 1.5 GHz band with a maximum power of 0.8 W was analyzed by using an isotropic probe with tri-axial dipole antennas. Five kinds of EPDs were exposed to the fields for 50 s under four kinds of configurations relative to the cell phone. The electric field distribution expanded around the antenna and had a maximum strength of 36.5 ± 0.30 V/m. The cell phone gave rise to a wrong dosage in four EPDs out of five. The electromagnetic susceptibility of the EPD was higher in the section where the semiconductor detector or electric circuit boards were installed. The maximum value of the wrong dosage was 1283 μSv. The distance preventing electromagnetic interference differed for each EPD and ranged from 2.0 cm to 21.0 cm. The electromagnetic immunity levels of the EPDs were distributed from 9.2 V/m to values greater than 35 V/m. The EPDs displayed wrong dosages during exposure, while they recovered their normal performance after the cell phone ceased transmitting. The electromagnetic immunity levels of the EPDs were either equal to or greater than the IEC standard. The immunity levels should be enhanced to greater than the IEC standard from the standpoint of radiation protection.

  7. Second-to-fourth digit ratio predicts success among high-frequency financial traders.

    Science.gov (United States)

    Coates, John M; Gurnell, Mark; Rustichini, Aldo

    2009-01-13

Prenatal androgens have important organizing effects on brain development and future behavior. The second-to-fourth digit length ratio (2D:4D) has been proposed as a marker of these prenatal androgen effects, with a relatively longer fourth finger indicating higher prenatal androgen exposure. 2D:4D has been shown to predict success in highly competitive sports. Yet, little is known about the effects of prenatal androgens on an economically influential class of competitive risk taking: trading in the financial world. Here, we report the findings of a study conducted in the City of London in which we sampled 2D:4D from a group of male traders engaged in what is variously called "noise" or "high-frequency" trading. We found that 2D:4D predicted the traders' long-term profitability as well as the number of years they remained in the business. 2D:4D also predicted the sensitivity of their profitability to increases both in circulating testosterone and in market volatility. Our results suggest that prenatal androgens increase risk preferences and promote more rapid visuomotor scanning and physical reflexes. The success and longevity of traders exposed to high levels of prenatal androgens further suggests that financial markets may select for biological traits rather than rational expectations.

  8. Effects of high frequency electromagnetic field emitted from digital cellular telephones on electronic pocket dosimeters

    Energy Technology Data Exchange (ETDEWEB)

    Shizuhiko, Deji; Kunihide, Nishizawa [Nagoya Univ., Nagoya (Japan)

    2002-07-01

High frequency electromagnetic fields emitted from digital cellular telephones (cell phones) occasionally cause abnormally high readings (wrong dosages) on electronic pocket dosimeters (EPD). The electric field strength distribution around a cell phone transmitting in the 1.5 GHz band with a maximum power of 0.8 W was analyzed by using an isotropic probe with tri-axial dipole antennas. Five kinds of EPDs were exposed to the fields for 50 s under four kinds of configurations relative to the cell phone. The electric field distribution expanded around the antenna and had a maximum strength of 36.5 ± 0.30 V/m. The cell phone gave rise to a wrong dosage in four EPDs out of five. The electromagnetic susceptibility of the EPD was higher in the section where the semiconductor detector or electric circuit boards were installed. The maximum value of the wrong dosage was 1283 μSv. The distance preventing electromagnetic interference differed for each EPD and ranged from 2.0 cm to 21.0 cm. The electromagnetic immunity levels of the EPDs were distributed from 9.2 V/m to values greater than 35 V/m. The EPDs displayed wrong dosages during exposure, while they recovered their normal performance after the cell phone ceased transmitting. The electromagnetic immunity levels of the EPDs were either equal to or greater than the IEC standard. The immunity levels should be enhanced to greater than the IEC standard from the standpoint of radiation protection.

  9. Comparison of conventional and digital cephalometric analysis: A pilot study

    Directory of Open Access Journals (Sweden)

    Hemlata Bhagwan Tanwani

    2014-01-01

Full Text Available Aim: The aim of the study was to analyze and compare manual cephalometric tracings with computerized cephalometric tracings using the Burstone hard tissue analysis and the McNamara analysis. Materials and Methods: Conventional lateral cephalograms of 20 subjects were obtained and manually traced. The radiographs were subsequently scanned and digitized using Dolphin Imaging software version 11.7. The McNamara analysis and the Burstone hard tissue analysis were performed by both the conventional and the digital method. No differentiation was made for age or gender. Data were subjected to statistical analysis, which was undertaken using the SPSS version 17.0 statistical software program (Chicago, Illinois, USA). A paired t-test was used to detect differences between the manual and digital methods. Statistical significance was set at the P < 0.05 level of confidence. Results: (A) In the Burstone analysis, the variable N-Pg II Hp showed a statistically very significant difference, and ANS-N, U1-NF, N-B II Hp, L1-Mp, and Go-Pg showed statistically significant differences. (B) In the McNamara analysis, the variables nasolabial angle and L1-APog showed statistically significant differences, and mandibular length showed a statistically very significant difference. Conclusion: According to this study, it is reasonable to conclude that the manual and digital tracings show statistically significant differences.

  10. A digital processing method for the analysis of complex nuclear spectra

    International Nuclear Information System (INIS)

    Madan, V.K.; Abani, M.C.; Bairi, B.R.

    1994-01-01

This paper describes a digital processing method using frequency power spectra for the analysis of complex nuclear spectra. The power spectra were estimated by employing a modified discrete Fourier transform. The method was applied to observed spectral envelopes. The results for separating closely spaced doublets in nuclear spectra of low statistical precision compared favorably with those obtained using the popular peak-fitting program SAMPO. The paper also describes the limitations of peak-fitting methods, and the advantages of digital processing techniques for type II digital signals, including nuclear spectra. A compact computer program occupying less than 2.5 kByte of memory was written in BASIC for the processing of observed spectral envelopes. (orig.)

  11. The vibrating reed frequency meter: digital investigation of an early cochlear model

    Directory of Open Access Journals (Sweden)

    Andrew Bell

    2015-10-01

    Full Text Available The vibrating reed frequency meter, originally employed by Békésy and later by Wilson as a cochlear model, uses a set of tuned reeds to represent the cochlea’s graded bank of resonant elements and an elastic band threaded between them to provide nearest-neighbour coupling. Here the system, constructed of 21 reeds progressively tuned from 45 to 55 Hz, is simulated numerically as an elastically coupled bank of passive harmonic oscillators driven simultaneously by an external sinusoidal force. To uncover more detail, simulations were extended to 201 oscillators covering the range 1–2 kHz. Calculations mirror the results reported by Wilson and show expected characteristics such as traveling waves, phase plateaus, and a response with a broad peak at a forcing frequency just above the natural frequency. The system also displays additional fine-grain features that resemble those which have only recently been recognised in the cochlea. Thus, detailed analysis brings to light a secondary peak beyond the main peak, a set of closely spaced low-amplitude ripples, rapid rotation of phase as the driving frequency is swept, frequency plateaus, clustering, and waxing and waning of impulse responses. Further investigation shows that each reed’s vibrations are strongly localised, with small energy flow along the chain. The distinctive set of equally spaced ripples is an inherent feature which is found to be largely independent of boundary conditions. Although the vibrating reed model is functionally different to the standard transmission line, its cochlea-like properties make it an intriguing local oscillator model whose relevance to cochlear mechanics needs further investigation.
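
    The numerical model described above amounts to solving, at each driving frequency, a linear system for a chain of damped, nearest-neighbour-coupled oscillators. A minimal frequency-domain sketch is given below; the coupling stiffness, damping and boundary conditions (fixed ends) are hypothetical stand-ins rather than the values used in the paper:

        import numpy as np

        def chain_response(f_drive, f_natural, coupling, damping_ratio=0.01):
            """Steady-state complex amplitudes of a chain of unit-mass oscillators,
            each tuned to f_natural[i], coupled to its nearest neighbours with
            stiffness `coupling` (fixed ends assumed), and driven in phase by a
            unit sinusoidal force at f_drive."""
            w0 = 2 * np.pi * np.asarray(f_natural, dtype=float)
            n = len(w0)
            K = np.diag(w0 ** 2) + coupling * (2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1))
            C = np.diag(2 * damping_ratio * w0)          # light viscous damping
            w = 2 * np.pi * f_drive
            A = K - (w ** 2) * np.eye(n) + 1j * w * C    # dynamic stiffness matrix
            return np.linalg.solve(A, np.ones(n))        # unit force on every reed

        # 21 reeds progressively tuned from 45 to 55 Hz, driven at 50 Hz
        f_nat = np.linspace(45.0, 55.0, 21)
        x = chain_response(50.0, f_nat, coupling=2000.0)   # coupling value is hypothetical
        print(np.abs(x).round(5))     # amplitude envelope peaks near the resonant reeds
        print(np.angle(x).round(2))   # phase varies along the chain (travelling-wave-like)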

  12. Reliability analysis of digital based I and C system

    Energy Technology Data Exchange (ETDEWEB)

    Kang, I. S.; Cho, B. S.; Choi, M. J. [KOPEC, Yongin (Korea, Republic of)

    1999-10-01

Digital technology is rapidly being applied to replace analog components installed in existing plants and to design control and monitoring systems for new nuclear power plants, in Korea as well as in other countries. Despite the many merits of digital technology, it faces a new problem of reliability assurance. Studies to solve this problem are being performed vigorously abroad. The reliability of the KNGR Engineered Safety Features Component Control System (ESF-CCS), a digital-based I and C system, was analyzed to verify fulfillment of the ALWR EPRI-URD requirement for reliability analysis and to eliminate hazards in a design applying new technology. A qualitative analysis using FMEA and a quantitative analysis using reliability block diagrams were performed. The results of the analyses are shown in this paper.

  13. Analog and digital signal analysis from basics to applications

    CERN Document Server

    Cohen Tenoudji, Frédéric

    2016-01-01

    This book provides comprehensive, graduate-level treatment of analog and digital signal analysis suitable for course use and self-guided learning. This expert text guides the reader from the basics of signal theory through a range of application tools for use in acoustic analysis, geophysics, and data compression. Each concept is introduced and explained step by step, and the necessary mathematical formulae are integrated in an accessible and intuitive way. The first part of the book explores how analog systems and signals form the basics of signal analysis. This section covers Fourier series and integral transforms of analog signals, Laplace and Hilbert transforms, the main analog filter classes, and signal modulations. Part II covers digital signals, demonstrating their key advantages. It presents z and Fourier transforms, digital filtering, inverse filters, deconvolution, and parametric modeling for deterministic signals. Wavelet decomposition and reconstruction of non-stationary signals are also discussed...

  14. A population frequency analysis of the FABP2 gene polymorphism

    African Journals Online (AJOL)

    salah

DNA was extracted from blood samples for genotype analysis. A PCR-RFLP ... Thr54 genotype. The frequencies of the allele Ala54 and the allele Thr54 of the ... [Table 2: genotype percentages and allele frequencies of the FABP2 polymorphism in various ethnic groups]

  15. A digital, constant-frequency pulsed phase-locked-loop instrument for real-time, absolute ultrasonic phase measurements

    Science.gov (United States)

    Haldren, H. A.; Perey, D. F.; Yost, W. T.; Cramer, K. E.; Gupta, M. C.

    2018-05-01

A digitally controlled instrument for conducting single-frequency and swept-frequency ultrasonic phase measurements has been developed based on a constant-frequency pulsed phase-locked-loop (CFPPLL) design. This instrument uses a pair of direct digital synthesizers to generate an ultrasonically transceived tone-burst and an internal reference wave for phase comparison. Real-time, constant-frequency phase tracking in an interrogated specimen is possible with a resolution of 0.000 38 rad (0.022°), and swept-frequency phase measurements can be obtained. Using phase measurements, an absolute thickness in borosilicate glass is presented to show the instrument's efficacy, and these results are compared to conventional ultrasonic pulse-echo time-of-flight (ToF) measurements. The newly developed instrument predicted the thickness with a mean error of -0.04 μm and a standard deviation of error of 1.35 μm. Additionally, the CFPPLL instrument shows a lower measured phase error in the absence of changing temperature and couplant thickness than high-resolution cross-correlation ToF measurements at a similar signal-to-noise ratio. By showing higher accuracy and precision than conventional pulse-echo ToF measurements and lower phase errors than cross-correlation ToF measurements, the new digitally controlled CFPPLL instrument provides high-resolution absolute ultrasonic velocity or path-length measurements in solids or liquids, as well as tracking of material property changes with high sensitivity. The ability to obtain absolute phase measurements allows for many new applications that were not possible with previous ultrasonic pulsed phase-locked-loop instruments. In addition to improved resolution, swept-frequency phase measurements add useful capability in measuring properties of layered structures, such as bonded joints, or materials which exhibit non-linear frequency-dependent behavior, such as dispersive media.
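
    The underlying relation between phase and thickness can be illustrated simply: for a pulse-echo round trip, φ(f) = 4πfd/v, so the thickness follows from the slope of the unwrapped phase versus frequency, d = (v/4π)·dφ/df. The sketch below uses made-up numbers (sound speed and noise level are assumptions) and does not model the CFPPLL measurement chain itself:

        import numpy as np

        # Pulse-echo round-trip phase: phi(f) = 4*pi*f*d / v, so the thickness follows
        # from the slope of unwrapped phase versus frequency, d = (v / (4*pi)) * dphi/df.
        v = 5640.0                      # assumed longitudinal sound speed for borosilicate glass, m/s
        d_true = 5.000e-3               # true thickness, m (hypothetical)

        freqs = np.linspace(4.9e6, 5.1e6, 21)                    # swept frequencies, Hz
        phase = 4 * np.pi * freqs * d_true / v                   # ideal unwrapped phase, rad
        phase += np.random.default_rng(3).normal(0.0, 4e-4, phase.size)   # ~0.4 mrad phase noise

        slope = np.polyfit(freqs, phase, 1)[0]                   # dphi/df, rad/Hz
        d_est = v * slope / (4 * np.pi)
        print(f"estimated thickness: {d_est * 1e3:.4f} mm")      # ~5.0000 mm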

  16. Common Cause Failure Analysis for the Digital Plant Protection System

    International Nuclear Information System (INIS)

Kang, Hyun Gook; Jang, Seung Cheol

    2005-01-01

Safety-critical systems such as nuclear power plants adopt a multiple-redundancy design in order to reduce the risk from single component failures. The digitalized safety-signal generation system is also designed based on the multiple-redundancy strategy and consists of more redundant components. The level of redundancy in digital systems is usually higher than that of conventional mechanical systems. This higher redundancy clearly reduces the risk from single component failures, but raises the importance of common cause failure (CCF) analysis. This research aims to develop a practical and realistic method for modeling CCF in digital safety-critical systems. We propose a simple and practical framework for assessing the CCF probability of digital equipment. A higher level of redundancy makes CCF analysis more difficult because it results in an impractically large number of CCF events in the fault tree model when conventional CCF modeling methods are used. We apply the simplified alpha-factor (SAF) method to digital system CCF analysis. A precedent study has shown that the SAF method is quite realistic yet simple when the system success criteria are considered carefully. The first step in using the SAF method is the analysis of the target system to determine the function failure cases. That is, the success criteria of the system can be derived from the target system's function and configuration. Based on this analysis, we can calculate the probability of a single CCF event which represents the CCF events resulting in system failure. In addition to the application of the SAF method, in order to accommodate the other characteristics of digital technology, we develop a simple concept and several equations for practical use

  17. Stepped-frequency radar sensors theory, analysis and design

    CERN Document Server

    Nguyen, Cam

    2016-01-01

    This book presents the theory, analysis and design of microwave stepped-frequency radar sensors. Stepped-frequency radar sensors are attractive for various sensing applications that require fine resolution. The book consists of five chapters. The first chapter describes the fundamentals of radar sensors including applications followed by a review of ultra-wideband pulsed, frequency-modulated continuous-wave (FMCW), and stepped-frequency radar sensors. The second chapter discusses a general analysis of radar sensors including wave propagation in media and scattering on targets, as well as the radar equation. The third chapter addresses the analysis of stepped-frequency radar sensors including their principles and design parameters. Chapter 4 presents the development of two stepped-frequency radar sensors at microwave and millimeter-wave frequencies based on microwave integrated circuits (MICs), microwave monolithic integrated circuits (MMICs) and printed-circuit antennas, and discusses their signal processing....
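
    The basic stepped-frequency principle can be sketched numerically: the radar steps the carrier over N frequencies, records a complex return at each step, and an inverse DFT across the steps yields a range profile with resolution c/(2NΔf). The frequencies and the point target below are hypothetical:

        import numpy as np

        c = 3e8                       # speed of light, m/s
        n_steps, df = 128, 10e6       # 128 steps of 10 MHz -> 1.28 GHz synthetic bandwidth
        f = 1e9 + np.arange(n_steps) * df          # stepped carrier frequencies (hypothetical)

        # Complex return from a single point target at 7.3 m (two-way phase delay)
        r_target = 7.3
        returns = np.exp(-1j * 4 * np.pi * f * r_target / c)

        # Range profile: inverse DFT across the frequency steps
        profile = np.abs(np.fft.ifft(returns))
        dr = c / (2 * n_steps * df)                # range resolution / bin spacing
        r_axis = np.arange(n_steps) * dr           # unambiguous window is c / (2 * df)
        print(f"bin spacing {dr:.3f} m, unambiguous range {n_steps * dr:.1f} m")
        print(f"estimated target range: {r_axis[np.argmax(profile)]:.2f} m")   # ~7.3 m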

  18. Audio Frequency Analysis in Mobile Phones

    Science.gov (United States)

    Aguilar, Horacio Munguía

    2016-01-01

    A new experiment using mobile phones is proposed in which its audio frequency response is analyzed using the audio port for inputting external signal and getting a measurable output. This experiment shows how the limited audio bandwidth used in mobile telephony is the main cause of the poor speech quality in this service. A brief discussion is…

  19. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    DEFF Research Database (Denmark)

    Jensen, Thomas; Holten-Rossing, Henrik; Svendsen, Ida M H

    2016-01-01

BACKGROUND: The opportunity offered by whole slide scanners of automated histological analysis implies an ever increasing importance of digital pathology. To go beyond the importance of conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. This data may provide a basic histological starting point from which further digital analysis including staining may benefit. METHODS: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means to quantitate cardiac tissue components in histological microsections. Data acquisition using a commercially available whole slide scanner and an image-based quantitation algorithm...

  20. Interactive Construction Digital Tools With Real Time Analysis

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2007-01-01

The recent developments in computational design tools have evolved into a sometimes purely digital process which opens up for new perspectives and problems in the sketching process. One of the interesting possibilities lies within the hybrid practitioner- or architect-engineer approach, where an architect-engineer or hybrid practitioner works simultaneously with both aesthetic and technical design requirements. In this paper the problem of a vague or not existing link between digital design tools, used by architects and designers, and the analysis tools developed by and for engineers is considered. The aim of this research is to look into integrated digital design and analysis tools in order to find out if they are suited for use by architects and designers or only by specialists and technicians - and if not, then to look at what can be done to make them more available to architects and designers.

  1. Frequency modulation television analysis: Threshold impulse analysis. [with computer program

    Science.gov (United States)

    Hodge, W. H.

    1973-01-01

    A computer program is developed to calculate the FM threshold impulse rates as a function of the carrier-to-noise ratio for a specified FM system. The system parameters and a vector of 1024 integers, representing the probability density of the modulating voltage, are required as input parameters. The computer program is utilized to calculate threshold impulse rates for twenty-four sets of measured probability data supplied by NASA and for sinusoidal and Gaussian modulating waveforms. As a result of the analysis several conclusions are drawn: (1) The use of preemphasis in an FM television system improves the threshold by reducing the impulse rate. (2) Sinusoidal modulation produces a total impulse rate which is a practical upper bound for the impulse rates of TV signals providing the same peak deviations. (3) As the moment of the FM spectrum about the center frequency of the predetection filter increases, the impulse rate tends to increase. (4) A spectrum having an expected frequency above (below) the center frequency of the predetection filter produces a higher negative (positive) than positive (negative) impulse rate.

  2. Frequency analysis of DC tolerant current transformers

    International Nuclear Information System (INIS)

    Mlejnek, P; Kaspar, P

    2013-01-01

This article deals with the wide frequency range behaviour of DC tolerant current transformers that are usually used in modern static energy meters. In this application current transformers must comply with European and International Standards in their accuracy and DC tolerance. Therefore, linear DC tolerant current transformers and double-core current transformers are used in this field. More details about the problems of these particular types of transformers can be found in our previous works. Although these transformers are designed mainly for the power distribution network frequency (50/60 Hz), it is interesting to understand their behaviour over a wider frequency range. Based on this knowledge, new generations of energy meters that also measure the quality of electric energy will be produced. This solution brings better measurement of the consumption of nonlinear loads and measurement of non-sinusoidal voltage and current sources such as solar cells or fuel cells. The determination of the actual power consumption in such energy meters is done using particular harmonic components of the current and voltage. To characterize several samples of current transformers of both types, we measured the phase and ratio errors, which are the most important parameters of current transformers

  3. DIGITAL

    Data.gov (United States)

    Federal Emergency Management Agency, Department of Homeland Security — The Digital Flood Insurance Rate Map (DFIRM) Database depicts flood risk information and supporting data used to develop the risk data. The primary risk...

  4. Digital Simulation-Based Training: A Meta-Analysis

    Science.gov (United States)

    Gegenfurtner, Andreas; Quesada-Pallarès, Carla; Knogler, Maximilian

    2014-01-01

    This study examines how design characteristics in digital simulation-based learning environments moderate self-efficacy and transfer of learning. Drawing on social cognitive theory and the cognitive theory of multimedia learning, the meta-analysis psychometrically cumulated k = 15 studies of 25 years of research with a total sample size of…

  5. On the Integration of Digital Design and Analysis Tools

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2006-01-01

    The aim of this research is to look into integrated digital design and analysis tools in order to find out if it is suited for use by architects and designers or only by specialists and technicians - and if not, then to look at what can be done to make them more available to architects and design...

  6. Intelligent assembly time analysis, using a digital knowledge based approach

    NARCIS (Netherlands)

    Jin, Y.; Curran, R.; Butterfield, J.; Burke, R.; Welch, B.

    2009-01-01

    The implementation of effective time analysis methods fast and accurately in the era of digital manufacturing has become a significant challenge for aerospace manufacturers hoping to build and maintain a competitive advantage. This paper proposes a structure oriented, knowledge-based approach for

  7. Digital vs conventional radiography: cost and revenue analysis

    International Nuclear Information System (INIS)

    Dalla Palma, L.; Cuttin, R.; Rimondini, A.; Grisi, G.

    1999-01-01

    The objective of this study was to analyse and compare the operating and investment costs of two radiographic systems, a conventional and a digital one, and to evaluate the cost/revenue ratio of the two systems. The radiological activity over 1 year for chest and skeletal exams was evaluated: 13,401 chest and 7,124 skeletal exams were considered. The following parameters of variable costs were evaluated: the difference between the variable proportional costs of the two technologies; the effective variable cost of each film size, including chemicals, and of the different sizes of digital film; and the variable costs of chest plus skeletal exams performed with the two techniques. Afterwards, the economic effect was considered taking into account depreciation over a time of utilization ranging between 4 and 8 years. In the second part of the analysis the total cost and the revenues of the two technologies were determined. The comparison between the digital and conventional systems has shown the following aspects: 1. The digital radiography system has a much higher investment cost than the conventional one. 2. Operating costs of digital equipment are higher or lower depending on the film size used. Evaluating chest X-rays, a breakeven point is reached after 1 year and 10,000 exams only if images are displayed on 8 x 10-in. film, and after 30,000 exams if displayed on 11 x 14-in. film. 3. The total cost (variable cost, technology cost, labour cost) of the digital technology is lower than that of the conventional system by 20 % on average using the 8 x 10-in. film size. 4. Digital technology also results in less film waste and fewer films per exam (orig.)
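
    The break-even reasoning in point 2 is simple arithmetic; the sketch below only illustrates the calculation with hypothetical figures, which are not the study's numbers.

        # Digital saves on consumables per exam but adds annual investment cost (depreciation).
        extra_investment_per_year = 50_000.0   # assumed additional yearly depreciation of the digital system
        saving_per_exam = 5.0                  # assumed variable-cost saving per exam (e.g. 8 x 10-in. film)

        breakeven_exams = extra_investment_per_year / saving_per_exam
        print(f"break-even after about {breakeven_exams:.0f} exams per year")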

  8. Applications of digital image analysis capability in Idaho

    Science.gov (United States)

    Johnson, K. A.

    1981-01-01

    The use of digital image analysis of LANDSAT imagery in water resource assessment is discussed. The data processing systems employed are described. The determination of urban land use conversion of agricultural land in two southwestern Idaho counties involving estimation and mapping of crop types and of irrigated land is described. The system was also applied to an inventory of irrigated cropland in the Snake River basin and establishment of a digital irrigation water source/service area data base for the basin. Application of the system to a determination of irrigation development in the Big Lost River basin as part of a hydrologic survey of the basin is also described.

  9. Spot analysis system by digitalization and imaging

    International Nuclear Information System (INIS)

    Gedin, F.

    1988-05-01

    Laser isotope separation experiments use a series of lasers producing several beams with characteristics adapted to the physical conditions of photoionization. This paper briefly describes the laser chain and the measurement and test systems, with more details on the analysis of the spatial distribution of fluence, the superposition of the three beams and the alignment on the experiment axis [fr

  10. Digital Circuit Analysis Using an 8080 Processor.

    Science.gov (United States)

    Greco, John; Stern, Kenneth

    1983-01-01

    Presents the essentials of a program written in Intel 8080 assembly language for the steady state analysis of a combinatorial logic gate circuit. Program features and potential modifications are considered. For example, the program could also be extended to include clocked/unclocked sequential circuits. (JN)
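
    The original program is written in Intel 8080 assembly; the sketch below recasts the same steady-state idea in Python with an invented netlist format: every gate is re-evaluated until no node changes.

        GATES = {
            "n1": ("AND", ["a", "b"]),
            "n2": ("OR", ["n1", "c"]),
            "out": ("NOT", ["n2"]),
        }

        def evaluate(inputs, gates, max_iters=10):
            """Repeatedly evaluate every gate until no node changes (steady state)."""
            nodes = dict(inputs)
            nodes.update({name: 0 for name in gates})          # initial guess for gate outputs
            for _ in range(max_iters):
                changed = False
                for name, (kind, ins) in gates.items():
                    vals = [nodes[i] for i in ins]
                    new = int({"AND": all, "OR": any}.get(kind, lambda v: not v[0])(vals))
                    if new != nodes[name]:
                        nodes[name], changed = new, True
                if not changed:
                    return nodes                               # steady state reached
            raise RuntimeError("did not settle (possible feedback loop)")

        print(evaluate({"a": 1, "b": 1, "c": 0}, GATES))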

  11. Virtual unit delay for digital frequency adaptive T/4 delay phase-locked loop system

    DEFF Research Database (Denmark)

    Yang, Yongheng; Zhou, Keliang; Blaabjerg, Frede

    2016-01-01

    Digital micro-controllers/processors enable the cost-effective control of grid-connected power converter systems in terms of system monitoring, signal processing (e.g., grid synchronization), control (e.g., grid current and voltage control), etc. Normally, the control is implemented in a micro-controller/processor with a fixed sampling rate considering the cost and complexity, where the number of unit delays that have been adopted should be an integer. For instance, in conventional digital control systems, a single-phase T/4 Delay Phase-Locked Loop (PLL) system takes 50 unit delays (i.e., in a 50-Hz system ... Delay PLL system should be done in its implementation. This process will result in performance degradation in the digital control system, as the exactly required number of delays is not realized. Hence, in this paper, a Virtual Unit Delay (VUD) has been proposed to address such challenges to the digital ...
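
    A minimal sketch of the underlying arithmetic (the sampling rate and off-nominal grid frequency are assumed values): the T/4 delay is an integer number of samples only at the nominal frequency, and rounding it to an integer, as a conventional implementation must, no longer gives a true 90-degree shift. The proposed VUD itself is not reproduced here.

        import numpy as np

        fs = 10_000.0                               # assumed sampling rate: 50 samples per quarter cycle at 50 Hz

        def quarter_cycle_delay_samples(f_grid, fs=fs):
            # Number of unit delays needed for an exact T/4 delay at a given grid frequency.
            return fs / (4.0 * f_grid)

        print(quarter_cycle_delay_samples(50.0))    # 50.0  -> integer, exactly realizable
        print(quarter_cycle_delay_samples(49.3))    # ~50.7 -> must be rounded to an integer delay

        f_grid = 49.3
        n = int(round(quarter_cycle_delay_samples(f_grid)))
        print(360.0 * f_grid * n / fs)              # ~90.5 degrees instead of 90, degrading the PLL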

  12. Analysis of electronic circuits using digital computers

    International Nuclear Information System (INIS)

    Tapu, C.

    1968-01-01

    Various programmes have been proposed for studying electronic circuits with the help of computers. It is shown here how it possible to use the programme ECAP, developed by I.B.M., for studying the behaviour of an operational amplifier from different point of view: direct current, alternating current and transient state analysis, optimisation of the gain in open loop, study of the reliability. (author) [fr

  13. Forensic Analysis of Digital Image Tampering

    Science.gov (United States)

    2004-12-01

    ... analysis of when each method fails, which Chapter 4 discusses. Finally, a test image containing an invisible watermark using LSB steganography is ... used to embed the hidden watermark is Steganography Software F5 version 11+, discussed in Section 2.2.

  14. Time-frequency analysis of pediatric murmurs

    Science.gov (United States)

    Lombardo, Joseph S.; Blodgett, Lisa A.; Rosen, Ron S.; Najmi, Amir-Homayoon; Thompson, W. Reid

    1998-05-01

    Technology has provided many new tools to assist in the diagnosis of pathologic conditions of the heart. Echocardiography, Ultrafast CT, and MRI are just a few. While these tools are a valuable resource, they are typically too expensive, large and complex in operation for use in rural, homecare, and physician's office settings. Recent advances in computer performance, miniaturization, and acoustic signal processing have yielded new technologies that when applied to heart sounds can provide low cost screening for pathologic conditions. The short duration and transient nature of these signals requires processing techniques that provide high resolution in both time and frequency. Short-time Fourier transforms, Wigner distributions, and wavelet transforms have been applied to signals from hearts with various pathologic conditions. While no single technique provides the ideal solution, the combination of tools provides a good representation of the acoustic features of the pathologies selected.
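
    As an illustration of the first of those tools, the sketch below computes a short-time Fourier spectrogram of a synthetic transient signal with SciPy; the burst timing and frequencies are invented stand-ins for real phonocardiogram data.

        import numpy as np
        from scipy import signal

        # Synthetic stand-in for a phonocardiogram: short tone bursts at roughly S1/S2 instants.
        fs = 4000
        t = np.arange(0, 2.0, 1 / fs)
        s = np.zeros_like(t)
        for onset in (0.10, 0.45, 1.10, 1.45):
            burst = (t >= onset) & (t < onset + 0.06)
            s[burst] += np.sin(2 * np.pi * 120 * t[burst]) * np.hanning(burst.sum())

        # A short window (64 ms) trades frequency resolution for the time resolution
        # that short transient sounds require.
        f, tt, Sxx = signal.spectrogram(s, fs=fs, nperseg=256, noverlap=192)
        print(Sxx.shape)                            # (frequency bins, time frames)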

  15. Digital storage and analysis of color Doppler echocardiograms

    Science.gov (United States)

    Chandra, S.; Thomas, J. D.

    1997-01-01

    Color Doppler flow mapping has played an important role in clinical echocardiography. Most of the clinical work, however, has been primarily qualitative. Although qualitative information is very valuable, there is considerable quantitative information stored within the velocity map that has not been extensively exploited so far. Recently, many researchers have shown interest in using the encoded velocities to address clinical problems such as quantification of valvular regurgitation, calculation of cardiac output, and characterization of ventricular filling. In this article, we review some basic physics and engineering aspects of color Doppler echocardiography, as well as drawbacks of trying to retrieve velocities from video tape data. Digital storage, which plays a critical role in performing quantitative analysis, is discussed in some detail with special attention to velocity encoding in DICOM 3.0 (medical image storage standard) and the use of digital compression. Lossy compression can considerably reduce file size with minimal loss of information (mostly redundant); this is critical for digital storage because of the enormous amount of data generated (a 10 minute study could require 18 Gigabytes of storage capacity). Lossy JPEG compression and its impact on quantitative analysis have been studied, showing that images compressed at 27:1 using the JPEG algorithm compare favorably with directly digitized video images, the current gold standard. Some potential applications of these velocities in analyzing the proximal convergence zones, mitral inflow, and some areas of future development are also discussed in the article.

  16. A Bayesian Analysis of the Flood Frequency Hydrology Concept

    Science.gov (United States)

    2016-02-01

    ERDC/CHL CHETN-X-1, February 2016. Approved for public release; distribution is unlimited. ... flood frequency hydrology concept as a formal probabilistic-based means by which to coherently combine and also evaluate the worth of different types ... and development. Merz and Blöschl (2008a,b) proposed the concept of flood frequency hydrology, which emphasizes the importance of ...

  17. Incremental Tensor Principal Component Analysis for Handwritten Digit Recognition

    Directory of Open Access Journals (Sweden)

    Chang Liu

    2014-01-01

    Full Text Available To overcome the shortcomings of traditional dimensionality reduction algorithms, incremental tensor principal component analysis (ITPCA) based on an updated-SVD technique is proposed in this paper. This paper proves the relationship between PCA, 2DPCA, MPCA, and the graph embedding framework theoretically and derives the incremental learning procedure to add single samples and multiple samples in detail. The experiments on handwritten digit recognition have demonstrated that ITPCA has achieved better recognition performance than the vector-based principal component analysis (PCA), incremental principal component analysis (IPCA), and multilinear principal component analysis (MPCA) algorithms. At the same time, ITPCA also has lower time and space complexity.
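
    For comparison, the vector-based incremental PCA baseline mentioned in the abstract can be run on a handwritten-digits dataset with scikit-learn in a few lines; this is plain IPCA, not the tensor-based ITPCA of the paper.

        from sklearn.datasets import load_digits
        from sklearn.decomposition import IncrementalPCA

        X, y = load_digits(return_X_y=True)          # 1797 8x8 digit images, flattened to 64-d vectors

        ipca = IncrementalPCA(n_components=16)
        batch = 200
        for start in range(0, len(X), batch):        # feed samples in batches, as they "arrive"
            ipca.partial_fit(X[start:start + batch])

        X_reduced = ipca.transform(X)                # 16-d features for a downstream classifier
        print(X_reduced.shape, ipca.explained_variance_ratio_.sum())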

  18. Selective Dirac voltage engineering of individual graphene field-effect transistors for digital inverter and frequency multiplier integrations.

    Science.gov (United States)

    Sul, Onejae; Kim, Kyumin; Jung, Yungwoo; Choi, Eunsuk; Lee, Seung-Beck

    2017-09-15

    The ambipolar band structure of graphene presents unique opportunities for novel electronic device applications. A cycle of gate voltage sweep in a conventional graphene transistor produces a frequency-doubled output current. To increase the frequency further, we used various graphene doping control techniques to produce Dirac voltage engineered graphene channels. The various surface treatments and substrate conditions produced differently doped graphene channels that were integrated on a single substrate and multiple Dirac voltages were observed by applying a single gate voltage sweep. We applied the Dirac voltage engineering techniques to graphene field-effect transistors on a single chip for the fabrication of a frequency multiplier and a logic inverter demonstrating analog and digital circuit application possibilities.

  19. Selective Dirac voltage engineering of individual graphene field-effect transistors for digital inverter and frequency multiplier integrations

    Science.gov (United States)

    Sul, Onejae; Kim, Kyumin; Jung, Yungwoo; Choi, Eunsuk; Lee, Seung-Beck

    2017-09-01

    The ambipolar band structure of graphene presents unique opportunities for novel electronic device applications. A cycle of gate voltage sweep in a conventional graphene transistor produces a frequency-doubled output current. To increase the frequency further, we used various graphene doping control techniques to produce Dirac voltage engineered graphene channels. The various surface treatments and substrate conditions produced differently doped graphene channels that were integrated on a single substrate and multiple Dirac voltages were observed by applying a single gate voltage sweep. We applied the Dirac voltage engineering techniques to graphene field-effect transistors on a single chip for the fabrication of a frequency multiplier and a logic inverter demonstrating analog and digital circuit application possibilities.

  20. Digital analysis of air photography for sustainable forest management; Digital flygbildsteknik foer uthaallig skogsskoetsel

    Energy Technology Data Exchange (ETDEWEB)

    Ekstrand, Sam; Loefmark, Magnus; Johansson, Desiree

    2001-02-01

    The objective of this project was to develop methods for estimation of forest stand variables using digital analysis of near infrared air photography. Near Infrared air photography covering an area 200 km northwest of Stockholm was scanned and ortho rectified. Methods for digital classification, normalisation of view angle effects and estimation of parameters such as timber volume, stand density, crown coverage, species composition, defoliation and number of dead or dying trees have been developed. Major results were that the functions for normalisation on view angle effects on tree size as viewed from the focal point strongly improved the stand estimates. Timber volume, stand density, species composition as well as the ecological variables were estimated with accuracies comparable to those of subjective field inventory methods. In spite of the photography being of high quality, differences in colour between flight lines gave problems with separation of pine and spruce. This may be solved using post-classification manual editing, but will cause an increase in costs. In the future, digital cameras or calibration lamps within the photograph could further reduce this problem.

  1. Time-frequency analysis of human motion during rhythmic exercises.

    Science.gov (United States)

    Omkar, S N; Vyas, Khushi; Vikranth, H N

    2011-01-01

    Biomechanical signals due to human movements during exercise are represented in time-frequency domain using Wigner Distribution Function (WDF). Analysis based on WDF reveals instantaneous spectral and power changes during a rhythmic exercise. Investigations were carried out on 11 healthy subjects who performed 5 cycles of sun salutation, with a body-mounted Inertial Measurement Unit (IMU) as a motion sensor. Variance of Instantaneous Frequency (I.F) and Instantaneous Power (I.P) for performance analysis of the subject is estimated using one-way ANOVA model. Results reveal that joint Time-Frequency analysis of biomechanical signals during motion facilitates a better understanding of grace and consistency during rhythmic exercise.

  2. Frequency analysis of a tower-cable coupled system

    Energy Technology Data Exchange (ETDEWEB)

    Park, Moo Yeol [Young Sin Precision Engineering Ltd., Gyungju (Korea, Republic of); Kim, Seock Hyun; Park, In Su [Kangwon National University, Chuncheon (Korea, Republic of); Cui, Chengxun [Yanbian University, Yangji (China)

    2013-06-15

    This study considers the prediction of natural frequency to avoid resonance in a wind turbine tower-cable coupled system. An analytical model based on the Rayleigh-Ritz method is proposed to predict the resonance frequency of a wind turbine tower structure supported by four guy cables. To verify the validity of the analytical model, a small tower-cable model is manufactured and tested. The frequency and mode data of the tower model are obtained by modal testing and finite element analysis. The validity of the proposed method is verified through the comparison of the frequency analysis results. Finally, using a parametric study with the analytical model, we identified how the cable tension and cable angle affect the resonance frequency of the wind turbine tower structure. From the analysis results, the tension limit and optimal angle of the cable are identified.
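
    A much cruder lumped-parameter estimate than the paper's Rayleigh-Ritz model still shows how the cable angle enters the natural frequency through the horizontal projection of the cable stiffness; every stiffness, mass and geometry value below is an assumption made purely for illustration.

        import numpy as np

        def guyed_tower_frequency(m_eff, k_tower, EA, L_cable, angle_deg, n_eff=2):
            # Top mass restrained by the tower bending stiffness plus the horizontal
            # elastic stiffness of the guy cables (taut cables, elastic term only);
            # n_eff = number of cables assumed to resist in the considered direction.
            theta = np.radians(angle_deg)            # cable angle measured from the ground
            k_cables = n_eff * (EA / L_cable) * np.cos(theta) ** 2
            return np.sqrt((k_tower + k_cables) / m_eff) / (2.0 * np.pi)

        for angle in (30.0, 45.0, 60.0):
            f = guyed_tower_frequency(m_eff=500.0, k_tower=2.0e5, EA=6.0e6, L_cable=12.0, angle_deg=angle)
            print(angle, round(f, 2))                # steeper cables -> lower lateral stiffness and frequency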

  3. Assessing the copula selection for bivariate frequency analysis ...

    Indian Academy of Sciences (India)

    58

    Copulas are applied to overcome the restriction of traditional bivariate frequency analysis ... frequency analysis methods cannot describe the random variable properties that ... In order to overcome the limitation of multivariate distributions, a copula is a ... The Mann-Kendall (M-K) test is a non-parametric statistical test which is used ...

  4. Digital Double-Pulse Holographic Interferometry for Vibration Analysis

    Directory of Open Access Journals (Sweden)

    H.J. Tiziani

    1996-01-01

    Full Text Available Different arrangements for double-pulsed holographic and speckle interferometry for vibration analysis will be described. Experimental results obtained with films (classical holographic interferometry) and CCD cameras (digital holographic interferometry) as storage materials are presented. In digital holography, two separate holograms of an object under test are recorded within a few microseconds using a CCD camera and are stored in a frame grabber. The phases of the two reconstructed wave fields are calculated from the complex amplitudes. The deformation is obtained from the phase difference. In the case of electronic speckle pattern interferometry (or image plane hologram), the phase can be calculated by using the sinusoid-fitting method. In the case of digital holographic interferometry, the phase is obtained by digital reconstruction of the complex amplitudes of the wave fronts. Using three directions of illumination and one direction of observation, all the information necessary for the reconstruction of the 3-dimensional deformation vector can be recorded at the same time. Applications of the method for measuring rotating objects are discussed where a derotator needs to be used.
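
    The central step described above, taking the phase difference of the two reconstructed complex wave fields, can be sketched as follows; the fields here are synthetic stand-ins, whereas real ones would come from numerically reconstructing the two recorded holograms.

        import numpy as np

        rng = np.random.default_rng(0)
        shape = (512, 512)
        U1 = np.exp(1j * rng.uniform(0, 2 * np.pi, shape))            # first exposure (random speckle phase)
        tilt = 2 * np.pi * np.linspace(0, 3, shape[1])[None, :]       # simulated deformation between pulses
        U2 = U1 * np.exp(1j * tilt)                                   # second exposure

        dphi = np.angle(U2 * np.conj(U1))                             # wrapped phase difference in (-pi, pi]

        # For out-of-plane motion with normal illumination/observation, one 2*pi fringe
        # corresponds to lambda/2 of displacement.
        lam = 532e-9
        displacement = np.unwrap(dphi, axis=1) * lam / (4 * np.pi)
        print(displacement[0, -1])                                    # ~0.8 micrometres for this simulated tilt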

  5. Application of radio frequency based digital thermometer for real-time monitoring of dairy cattle rectal temperature

    Directory of Open Access Journals (Sweden)

    Tridib Debnath

    2017-09-01

    Full Text Available Aim: Dairy cattle health monitoring program becomes vital for detecting the febrile conditions to prevent the outbreak of the animal diseases as well as ensuring the fitness of the animals that are directly affecting the health of the consumers. The aim of this study was to validate real-time rectal temperature (RT) data of a radio frequency based digital (RFD) thermometer with RT data of a mercury bulb (MB) thermometer in dairy cattle. Materials and Methods: Two experiments were conducted. In experiment I, six female Jersey crossbred cattle with a mean (±standard error of the mean) body weight of 534.83±13.90 kg at the age of 12±0.52 years were used to record RT for 2 h on empty stomach and 2 h after feeding at 0, 30, 60, 90, and 120 min using a RFD thermometer as well as a MB thermometer. In experiment II, six female Jersey crossbred cattle were further used to record RT for 2 h before exercise and 2 h after exercise at 0, 30, 60, 90, and 120 min. Two-way repeated measures analysis of variance with post hoc comparisons by Bonferroni test was done. Results: Real-time RT data recorded by the RFD thermometer as well as the MB thermometer did not differ (p>0.05) before and after feeding/exercise. An increase (p<0.05) in RT after feeding/exercise in experimental crossbred cattle was recorded by both the RFD thermometer and the MB thermometer. Conclusion: The results obtained in the present study suggest that the body temperature recordings from the RFD thermometer would be acceptable and thus the RFD thermometer could work well for monitoring real-time RT in cattle.

  6. Frequency domain analysis of piping systems under short duration loading

    International Nuclear Information System (INIS)

    Sachs, K.; Sand, H.; Lockau, J.

    1981-01-01

    In piping analysis two procedures are used almost exclusively: the modal superposition method for relatively long input time histories (e.g., earthquake) and direct integration of the equations of motion for short input time histories. A third possibility, frequency domain analysis, has only rarely been applied to piping systems to date. This paper suggests the use of frequency domain analysis for specific piping problems for which only direct integration could be used in the past. Direct integration and frequency domain analysis are compared, and it is shown that the frequency domain method is less costly if more than four or five load cases are considered. In addition, this method offers technical advantages, such as more accurate representation of modal damping and greater insight into the structural behavior of the system. (orig.)

  7. Frequency domain performance analysis of nonlinearly controlled motion systems

    NARCIS (Netherlands)

    Pavlov, A.V.; Wouw, van de N.; Pogromski, A.Y.; Heertjes, M.F.; Nijmeijer, H.

    2007-01-01

    At the heart of the performance analysis of linear motion control systems lie essential frequency domain characteristics such as sensitivity and complementary sensitivity functions. For a class of nonlinear motion control systems called convergent systems, generalized versions of these sensitivity

  8. Quantitative analysis of culture using millions of digitized books

    OpenAIRE

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2010-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pu...

  9. Quantitative Analysis of Culture Using Millions of Digitized Books

    OpenAIRE

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K.; Google Books Team; Pickett, Joseph; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics,’ focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pu...

  10. Time-frequency analysis and harmonic Gaussian functions

    International Nuclear Information System (INIS)

    Ranaivoson, R.T.R; Raoelina Andriambololona; Hanitriarivo, R.

    2013-01-01

    A method for time-frequency analysis is given. The approach utilizes properties of the Gaussian distribution, properties of Hermite polynomials and Fourier analysis. We begin with the definitions of a set of functions called Harmonic Gaussian Functions. These functions are then used to define a set of transformations, noted Τ_n, which associate to a function ψ of the time variable t a set of functions Ψ_n which depend on time, frequency and the frequency (or time) standard deviation. Some properties of the transformations Τ_n and the functions Ψ_n are given. It is proved in particular that the square of the modulus of each function Ψ_n can be interpreted as a representation of the energy distribution of the signal, represented by the function ψ, in the time-frequency plane for a given value of the frequency (or time) standard deviation. It is also shown that the function ψ can be recovered from the functions Ψ_n.

  11. Application of digital radiography in the analysis of cultural heritage

    Energy Technology Data Exchange (ETDEWEB)

    Oiveira, Davi F.; Calza, Cristiane; Rocha, Henrique S.; Nascimento, Joseilson R.; Lopes, Ricardo T., E-mail: davi@lin.ufrj.br, E-mail: ccalza@lin.ufrj.br, E-mail: henrique@lin.ufrj.br, E-mail: joseilson@lin.ufrj.br, E-mail: ricardo@lin.ufrj.br [Universidade Federal do Rio de Janeiro (UFRJ), Rio de Janeiro, RJ (Brazil). Lab. de Instrumentacao Nuclear

    2013-07-01

    The scientific examination of artworks has gained increasing interest in the last years, allowing the characterization of materials and techniques employed by the artists. This analysis can be extremely valuable to conservation and restoration treatments. However, the fact that every artwork is a unique piece emphasizes the necessity of working with non-destructive techniques. Although radiography has been used in the technical examination of museum objects for many decades, digital radiography is rapidly becoming a preferred modality for this essential tool in the advanced examination of works of art. The ability to electronically combine images from a large painting into a single composite image file is extremely valuable and results in higher quality images than those achieved with conventional radiography. These images can also be processed and improved using adequate software. Additional advantages of digital radiography include the possibility of an almost immediate analysis of the results, the use of only one recording medium, and the absence of chemical processing. Radiographic imaging can be applied to the analysis of virtually all media including paintings, sculptures, woodworks, engravings, etc. This paper reports some case studies of the use of digital radiography in the study of paintings and sculptures, showing the feasibility and advantages of this technique for this kind of purpose. The radiographic images revealed the conservation state of the analyzed objects and various details of their execution, assisting recent restoration processes. (author)

  12. Application of digital radiography in the analysis of cultural heritage

    International Nuclear Information System (INIS)

    Oiveira, Davi F.; Calza, Cristiane; Rocha, Henrique S.; Nascimento, Joseilson R.; Lopes, Ricardo T.

    2013-01-01

    The scientific examination of artworks has gained increasing interest in the last years, allowing the characterization of materials and techniques employed by the artists. This analysis can be extremely valuable to conservation and restoration treatments. However, the fact that every artwork is a unique piece emphasizes the necessity of working with non-destructive techniques. Although radiography has been used in the technical examination of museum objects for many decades, digital radiography is rapidly becoming a preferred modality for this essential tool in the advanced examination of works of art. The ability to electronically combine images from a large painting into a single composite image file is extremely valuable and results in higher quality images than those achieved with conventional radiography. These images can also be processed and improved using adequate software. Additional advantages of digital radiography include the possibility of an almost immediate analysis of the results, the use of only one recording medium, and the absence of chemical processing. Radiographic imaging can be applied to the analysis of virtually all media including paintings, sculptures, woodworks, engravings, etc. This paper reports some case studies of the use of digital radiography in the study of paintings and sculptures, showing the feasibility and advantages of this technique for this kind of purpose. The radiographic images revealed the conservation state of the analyzed objects and various details of their execution, assisting recent restoration processes. (author)

  13. Analog-to-digital conversion of spectrometric data in information-control systems of activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Mamonov, E I

    1972-01-01

    Analog-to-digital conversion (ADC) in nuclear radiation spectrometer channels is one of the most important links of information-control systems in activation analysis. For the development of spectrometer-channel ADCs, logico-structural methods of increasing the capacity, together with procedures for boosting the frequency modes and improving the accuracy, are promising. Procedures are suggested for increasing the ADC capacity. Insufficient stability and noticeable non-linearity of the spectrometer channel can be corrected at the information-processing stage if their regularities are known. Capacity limitations make the development of ADCs featuring high stability, capacity and linearity quite urgent.

  14. Digital substraction angiography (DSA) in a universal radiodiagnostic room with a novel multi-pulse high-frequency generator

    International Nuclear Information System (INIS)

    Ellegast, H.H.; Kloss, R.; Mayr, H.; Ammann, E.; Kuehnel, W.; Siemens A.G., Erlangen

    1985-01-01

    Digital subtraction angiography in a universal radiodiagnostic room can be implemented rapidly and reliably. The number of examinations could be increased without negative effects on conventional operations in this room. With optimum radiation hygiene and a high degree of operational safety, the multipulse high-frequency generator with its automatic DSA parameter system guarantees reproducibly good image quality, equalling that of a dedicated DSA facility. In this way, the examination room also constitutes an economical solution for small hospitals without a dedicated angiography room. (orig.) [de

  15. Correlation between radiographic analysis of alveolar bone density around dental implant and resonance frequency of dental implant

    Science.gov (United States)

    Prawoko, S. S.; Nelwan, L. C.; Odang, R. W.; Kusdhany, L. S.

    2017-08-01

    The histomorphometric test is the gold standard for dental implant stability quantification; however, it is invasive, and therefore, it is inapplicable to clinical patients. Consequently, accurate and objective alternative methods are required. Resonance frequency analysis (RFA) and digital radiographic analysis are noninvasive methods with excellent objectivity and reproducibility. To analyze the correlation between the radiographic analysis of alveolar bone density around a dental implant and the resonance frequency of the dental implant. Digital radiographic images for 35 samples were obtained, and the resonance frequency of the dental implant was acquired using Osstell ISQ immediately after dental implant placement and on third-month follow-up. The alveolar bone density around the dental implant was subsequently analyzed using SIDEXIS-XG software. No significant correlation was reported between the alveolar bone density around the dental implant and the resonance frequency of the dental implant (r = -0.102 at baseline, r = 0.146 at follow-up, p > 0.05). However, the alveolar bone density and resonance frequency showed a significant difference throughout the healing period (p = 0.005 and p = 0.000, respectively). Conclusion: Digital dental radiographs and Osstell ISQ showed excellent objectivity and reproducibility in quantifying dental implant stability. Nonetheless, no significant correlation was observed between the results obtained using these two methods.

  16. Suppression of mechanical resonance in digital servo system considering oscillation frequency deviation

    DEFF Research Database (Denmark)

    Chen, Yangyang; Yang, Ming; Hu, Kun

    2017-01-01

    A high-stiffness servo system easily causes mechanical resonance in an elastically coupled servo system. Although an on-line adaptive notch filter is effective in most cases, it will lead to a more severe resonance when the resonance frequency deviates from the natural torsional frequency. To explain...
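
    For context, the building block such adaptive schemes tune is an ordinary digital notch filter; the sketch below designs one at an assumed resonance frequency with SciPy and applies it to a torque reference. The adaptation logic, and the frequency-deviation problem this paper addresses, are not shown.

        import numpy as np
        from scipy import signal

        fs = 8000.0                  # assumed servo-loop sampling rate
        f_res = 450.0                # estimated mechanical resonance frequency (adapted on line in practice)
        Q = 5.0                      # notch quality factor: width versus depth trade-off

        b, a = signal.iirnotch(f_res, Q, fs=fs)        # 2nd-order digital notch filter

        t = np.arange(0, 1.0, 1 / fs)
        torque_ref = np.sin(2 * np.pi * 5 * t) + 0.3 * np.sin(2 * np.pi * f_res * t)
        torque_filt = signal.lfilter(b, a, torque_ref)  # resonant component strongly attenuated
        print(np.std(torque_ref), np.std(torque_filt))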

  17. ANALYSIS OF FREQUENCY OF PHENYLKETONURIA AMONG INSTITUTIONALIZED

    Directory of Open Access Journals (Sweden)

    S VALIAN

    2003-09-01

    Full Text Available Introduction: Phenylketonuria (PKU) is a genetic disease which is caused by a deficiency in the phenylalanine hydroxylase (PAH) enzyme. Untreated patients will develop a severe mental retardation, which is irreversible. In this study, the incidence of PKU among isolated mentally retarded residents in institutions in Isfahan was investigated. Methods: A total number of 1541 patients were involved in the study. Of the patients studied, 611 with no known reason for their mental retardation were chosen for blood sampling. Blood samples were collected on filter papers and examined by the Guthrie bacterial inhibition assay (GBIA), which is specific for PKU. In patients with a positive test, the serum phenylalanine was quantitatively analyzed using high-pressure liquid chromatography (HPLC). Results: Among the patients examined, 33 were found positive. Quantitative analysis of phenylalanine allowed classification of the patients, indicating 60% with the classical, 36% with the moderate, and 3% with the mild type of PKU. Furthermore, it was found that in 68% of the cases the parents are third-degree relatives. Discussion: The results obtained in this screening study indicated that 2.1% of the patients in the institutions for the mentally retarded in Isfahan suffered from PKU. The incidence of the disease is relatively high compared to the reports from other countries. Since a large number of patients (68%) are the result of consanguineous marriages, this kind of marriage could be considered one of the important factors involved in the prevalence of PKU in Isfahan.

  18. Diagnosis of industrial gearboxes condition by vibration and time-frequency, scale-frequency, frequency-frequency analysis

    Directory of Open Access Journals (Sweden)

    P. Czech

    2012-10-01

    Full Text Available In the article, methods of vibroacoustic diagnostics of high-power toothed gears are described. It is shown that a properly registered and processed acoustic or vibration signal may serve as an explicitly interpretable source of diagnostic symptoms. The presented analyses were based on vibration signals registered during the operation of the gear of a rolling stand working in the Katowice Steel Plant (presently one of the branches of Mittal Steel Poland JSC).

  19. Joint Time-Frequency And Wavelet Analysis - An Introduction

    Directory of Open Access Journals (Sweden)

    Majkowski Andrzej

    2014-12-01

    Full Text Available A traditional frequency analysis is not appropriate for observation of properties of non-stationary signals. This stems from the fact that the time resolution is not defined in the Fourier spectrum. Thus, there is a need for methods implementing joint time-frequency analysis (t/f) algorithms. Practical aspects of some representative methods of time-frequency analysis, including Short Time Fourier Transform, Gabor Transform, Wigner-Ville Transform and Cone-Shaped Transform are described in this paper. Unfortunately, there is no correlation between the width of the time-frequency window and its frequency content in the t/f analysis. This property is not valid in the case of a wavelet transform. A wavelet is a wave-like oscillation, which forms its own “wavelet window”. Compression of the wavelet narrows the window, and vice versa. Individual wavelet functions are well localized in time and simultaneously in scale (the equivalent of frequency). The wavelet analysis owes its effectiveness to the pyramid algorithm described by Mallat, which enables fast decomposition of a signal into wavelet components.
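
    A minimal, dependency-free sketch of Mallat's pyramid algorithm with the Haar wavelet illustrates the decomposition described above; the test signal is synthetic.

        import numpy as np

        def haar_dwt(x, levels):
            """Mallat pyramid with the Haar wavelet: each level splits the signal into a
            half-length approximation (low-pass) and detail (high-pass) sequence."""
            x = np.asarray(x, dtype=float)
            details = []
            for _ in range(levels):
                if len(x) % 2:                              # pad to even length if necessary
                    x = np.append(x, x[-1])
                approx = (x[0::2] + x[1::2]) / np.sqrt(2)   # low-pass filter + downsample
                detail = (x[0::2] - x[1::2]) / np.sqrt(2)   # high-pass filter + downsample
                details.append(detail)
                x = approx
            return x, details                               # final approximation and per-level details

        sig = np.sin(np.linspace(0, 8 * np.pi, 256)) + 0.1 * np.random.randn(256)
        approx, details = haar_dwt(sig, levels=4)
        print(len(approx), [len(d) for d in details])       # 16 [128, 64, 32, 16]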

  20. Computerized Analysis of Digital Photographs for Evaluation of Tooth Movement.

    Science.gov (United States)

    Toodehzaeim, Mohammad Hossein; Karandish, Maryam; Karandish, Mohammad Nabi

    2015-03-01

    Various methods have been introduced for the evaluation of tooth movement in orthodontics. The challenge is to adopt the most accurate and most beneficial method for patients. This study was designed to introduce the analysis of digital photographs with AutoCAD software as a method to evaluate tooth movement and to assess the reliability of this method. Eighteen patients were evaluated in this study. Three intraoral digital images from the buccal view were captured from each patient at half-hour intervals. All the photos were imported into AutoCAD 2011 and calibrated, and the distance between the canine and molar hooks was measured. The data were analyzed using the intraclass correlation coefficient. Photographs were found to have a high reliability coefficient (P > 0.05). The introduced method is an accurate, efficient and reliable method for the evaluation of tooth movement.

  1. Digital predistortion of 75–110 GHz W-band frequency multiplier for fiber wireless short range access systems

    DEFF Research Database (Denmark)

    Zhao, Ying; Deng, Lei; Pang, Xiaodan

    2011-01-01

    We present a W-band fiber-wireless transmission system based on a nonlinear frequency multiplier for high-speed wireless short range access applications. By implementing a baseband digital signal predistortion scheme, intensive nonlinear distortions induced in a sextuple frequency multiplier can be effectively pre-compensated. Without using costly W-band components, a transmission system with 26 km fiber and 4 m wireless transmission operating at 99.6 GHz is experimentally validated. Adjacent-channel power ratio (ACPR) improvements for IQ-modulated vector signals are guaranteed and transmission performances for fiber and wireless channels are studied. This W-band predistortion technique is a promising candidate for applications in high capacity wireless-fiber access systems.
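
    The general idea of baseband digital predistortion can be sketched with a memoryless polynomial model and indirect learning; this is a deliberately simplified stand-in with an invented nonlinearity, not the actual scheme used for the sextuple multiplier.

        import numpy as np

        def chain(x):
            # Invented memoryless compression standing in for the transmitter chain.
            return x - 0.15 * np.abs(x) ** 2 * x

        rng = np.random.default_rng(1)
        x_train = (rng.normal(size=4000) + 1j * rng.normal(size=4000)) * 0.4   # stand-in baseband IQ signal
        y = chain(x_train)

        # Indirect learning: fit a post-inverse pd such that pd(chain(x)) ~ x,
        # using odd-order terms y, |y|^2 y, |y|^4 y, then use pd as the predistorter.
        B = np.column_stack([y, np.abs(y) ** 2 * y, np.abs(y) ** 4 * y])
        coeffs, *_ = np.linalg.lstsq(B, x_train, rcond=None)

        def predistort(x):
            return coeffs[0] * x + coeffs[1] * np.abs(x) ** 2 * x + coeffs[2] * np.abs(x) ** 4 * x

        x_test = (rng.normal(size=1000) + 1j * rng.normal(size=1000)) * 0.4
        err_raw = np.mean(np.abs(chain(x_test) - x_test) ** 2)
        err_dpd = np.mean(np.abs(chain(predistort(x_test)) - x_test) ** 2)
        print(err_raw, err_dpd)                    # the predistorted chain tracks the input far more closely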

  2. Note: Digital laser frequency auto-locking for inter-satellite laser ranging.

    Science.gov (United States)

    Luo, Yingxin; Li, Hongyin; Yeh, Hsien-Chi

    2016-05-01

    We present a prototype of a laser frequency auto-locking and re-locking control system designed for laser frequency stabilization in inter-satellite laser ranging system. The controller has been implemented on field programmable gate arrays and programmed with LabVIEW software. The controller allows initial frequency calibrating and lock-in of a free-running laser to a Fabry-Pérot cavity. Since it allows automatic recovery from unlocked conditions, benefit derives to automated in-orbit operations. Program design and experimental results are demonstrated.

  3. Note: Digital laser frequency auto-locking for inter-satellite laser ranging

    International Nuclear Information System (INIS)

    Luo, Yingxin; Yeh, Hsien-Chi; Li, Hongyin

    2016-01-01

    We present a prototype of a laser frequency auto-locking and re-locking control system designed for laser frequency stabilization in inter-satellite laser ranging system. The controller has been implemented on field programmable gate arrays and programmed with LabVIEW software. The controller allows initial frequency calibrating and lock-in of a free-running laser to a Fabry-Pérot cavity. Since it allows automatic recovery from unlocked conditions, benefit derives to automated in-orbit operations. Program design and experimental results are demonstrated.

  4. A low-power digital frequency divider for system-on-a-chip applications

    KAUST Repository

    Omran, Hesham

    2011-08-01

    In this paper, an idea for a new frequency divider architecture is proposed. The divider is based on a coarse-fine architecture. The coarse block operates at a low frequency to save power consumption and it selectively enables the fine block which operates at the high input frequency. The proposed divider has the advantages of synchronous divider, but with lower power consumption and higher operation speed. The design can achieve a wide division range with a minor effect on power consumption and speed. The architecture was implemented on a complex programmable logic device (CPLD) to verify its operation. Experimental measurements validate system operation with power reduction greater than 40%. © 2011 IEEE.

  5. Note: Digital laser frequency auto-locking for inter-satellite laser ranging

    Energy Technology Data Exchange (ETDEWEB)

    Luo, Yingxin; Yeh, Hsien-Chi, E-mail: yexianji@mail.hust.edu.cn [MOE Key Laboratory of Fundamental Quantities Measurement, School of Physics, Huazhong University of Science and Technology, Wuhan 430074 (China); Li, Hongyin [MOE Key Laboratory of Fundamental Quantities Measurement, School of Physics, Huazhong University of Science and Technology, Wuhan 430074 (China); School of Automation, Huazhong University of Science and Technology, Wuhan 430074 (China)

    2016-05-15

    We present a prototype of a laser frequency auto-locking and re-locking control system designed for laser frequency stabilization in inter-satellite laser ranging system. The controller has been implemented on field programmable gate arrays and programmed with LabVIEW software. The controller allows initial frequency calibrating and lock-in of a free-running laser to a Fabry-Pérot cavity. Since it allows automatic recovery from unlocked conditions, benefit derives to automated in-orbit operations. Program design and experimental results are demonstrated.

  6. Frequency and time domain characteristics of digital control of electric vehicle in-wheel drives

    Directory of Open Access Journals (Sweden)

    Jarzebowicz Leszek

    2017-12-01

    Full Text Available In-wheel electric drives are promising as actuators in active safety systems of electric and hybrid vehicles. This new function requires dedicated control algorithms, making it essential to deliver models that better reflect the wheel-torque control dynamics of electric drives. The timing of digital control events, whose importance is stressed in current research, still lacks an analytical description allowing its influence on control system dynamics to be modeled. In this paper, the authors investigate and compare approaches to the analog and discrete analytical modeling of the torque control loop in a digitally controlled electric drive. Five different analytical models of stator-current torque-component control are compared to judge their accuracy in representing drive control dynamics related to the timing of digital control events. The Bode characteristics and step-response characteristics of the analytical models are then compared with those of a reference model for three commonly used cases of discrete motor control schemes. Finally, the applicability of the presented models is discussed.

  7. Digital Speckle Photography of Subpixel Displacements of Speckle Structures Based on Analysis of Their Spatial Spectra

    Science.gov (United States)

    Maksimova, L. A.; Ryabukho, P. V.; Mysina, N. Yu.; Lyakin, D. V.; Ryabukho, V. P.

    2018-04-01

    We have investigated the capabilities of the method of digital speckle interferometry for determining subpixel displacements of a speckle structure formed by a displaceable or deformable object with a scattering surface. An analysis of spatial spectra of speckle structures makes it possible to perform measurements with a subpixel accuracy and to extend the lower boundary of the range of measurements of displacements of speckle structures to the range of subpixel values. The method is realized on the basis of digital recording of the images of undisplaced and displaced speckle structures, their spatial frequency analysis using numerically specified constant phase shifts, and correlation analysis of spatial spectra of speckle structures. Transformation into the frequency range makes it possible to obtain quantities to be measured with a subpixel accuracy from the shift of the interference-pattern minimum in the diffraction halo by introducing an additional phase shift into the complex spatial spectrum of the speckle structure or from the slope of the linear plot of the function of accumulated phase difference in the field of the complex spatial spectrum of the displaced speckle structure. The capabilities of the method have been investigated in natural experiment.
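
    A standard spatial-frequency-domain route to subpixel shifts, kindred to (though not necessarily identical with) the procedure described above, is to fit the phase slope of the cross-power spectrum of the two speckle images; a one-dimensional sketch:

        import numpy as np

        def subpixel_shift_x(img_a, img_b, k=8):
            """Estimate a (possibly subpixel) horizontal shift between two images from the
            phase of their cross-power spectrum; k low-frequency bins are used for the fit."""
            A, B = np.fft.fft2(img_a), np.fft.fft2(img_b)
            cross = A * np.conj(B)
            phase = np.angle(cross / np.abs(cross))          # pure phase term 2*pi*fx*dx (wrapped)
            fx = np.fft.fftfreq(img_a.shape[1])
            slope = np.polyfit(fx[1:k], phase[0, 1:k], 1)[0] # fit along the fy = 0 row, low fx only
            return slope / (2 * np.pi)                       # shift in pixels

        # Synthetic test: shift a random speckle-like pattern by 0.3 pixel via the Fourier shift theorem.
        rng = np.random.default_rng(0)
        img = rng.random((128, 128))
        fx = np.fft.fftfreq(128)
        shifted = np.real(np.fft.ifft2(np.fft.fft2(img) * np.exp(-2j * np.pi * fx[None, :] * 0.3)))
        print(subpixel_shift_x(img, shifted))                # ~0.3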

  8. Reliability Analysis Study of Digital Reactor Protection System in Nuclear Power Plant

    International Nuclear Information System (INIS)

    Guo, Xiao Ming; Liu, Tao; Tong, Jie Juan; Zhao, Jun

    2011-01-01

    Digital I and C systems are generally believed to improve a plant's safety and reliability. The reliability analysis of digital I and C systems has become a research hotspot. The traditional fault tree method is one means to quantify digital I and C system reliability. A review of the reliability evaluation of the digital protection system of the advanced nuclear power plant AP1000 clarifies both the fault tree application and the analysis process for digital system reliability. A typical digital protection system dedicated to an advanced reactor has been developed, whose reliability evaluation is necessary for design demonstration. The construction of this typical digital protection system is introduced in the paper, and the process of applying FMEA and fault trees to its reliability evaluation is described. Reliability data and bypass logic modeling are two points given special attention in the paper. Because time-sequence and feedback factors are not obviously present in a reactor protection system, the dynamic features of the digital system are not discussed

  9. A time-frequency analysis of wave packet fractional revivals

    International Nuclear Information System (INIS)

    Ghosh, Suranjana; Banerji, J

    2007-01-01

    We show that the time-frequency analysis of the autocorrelation function is, in many ways, a more appropriate tool to resolve fractional revivals of a wave packet than the usual time-domain analysis. This advantage is crucial in reconstructing the initial state of the wave packet when its coherent structure is short-lived and decays before it is fully revived. Our calculations are based on the model example of fractional revivals in a Rydberg wave packet of circular states. We end by providing an analytical investigation which fully agrees with our numerical observations on the utility of time-frequency analysis in the study of wave packet fractional revivals

  10. High Order Differential Frequency Hopping: Design and Analysis

    Directory of Open Access Journals (Sweden)

    Yong Li

    2015-01-01

    Full Text Available This paper considers spectrally efficient differential frequency hopping (DFH) system design. Relying on time-frequency diversity over a large spectrum and high-speed frequency hopping, DFH systems are robust against hostile jamming interference. However, the spectral efficiency of conventional DFH systems is very low due to only using the frequency of each channel. To improve the system capacity, in this paper, we propose an innovative high order differential frequency hopping (HODFH) scheme. Unlike in traditional DFH where the message is carried by the frequency relationship between the adjacent hops using one-order differential coding, in HODFH, the message is carried by the frequency and phase relationship using two-order or higher order differential coding. As a result, system efficiency is increased significantly since the additional information transmission is achieved by the higher order differential coding at no extra cost on either bandwidth or power. Quantitative performance analysis on the proposed scheme demonstrates that transmission through the frequency and phase relationship using two-order or higher order differential coding essentially introduces another dimension to the signal space, and the corresponding coding gain can increase the system efficiency.

  11. Analysis algorithm for digital data used in nuclear spectroscopy

    CERN Document Server

    AUTHOR|(CDS)2085950; Sin, Mihaela

    Data obtained from digital acquisition systems used in nuclear spectroscopy experiments must be converted by a dedicated algorithm in order to extract the physical quantities of interest. I will report here the development of an algorithm capable of reading digital data, discriminating between random and true signals, and converting the results into a format readable by a special data analysis program package used to interpret nuclear spectra and to create coincidence matrices. The algorithm can be used in any nuclear spectroscopy experimental setup provided that digital acquisition modules are involved. In particular it was used to treat data obtained from the IS441 experiment at ISOLDE where the beta decay of 80Zn was investigated as part of ultra-fast timing studies of neutron-rich Zn nuclei. The results obtained for the half-lives of 80Zn and 80Ga were in very good agreement with previous measurements. This fact proved unquestionably that the conversion algorithm works. Another remarkable result was the improve...

  12. From Digital Imaging to Computer Image Analysis of Fine Art

    Science.gov (United States)

    Stork, David G.

    An expanding range of techniques from computer vision, pattern recognition, image analysis, and computer graphics are being applied to problems in the history of art. The success of these efforts is enabled by the growing corpus of high-resolution multi-spectral digital images of art (primarily paintings and drawings), sophisticated computer vision methods, and most importantly the engagement of some art scholars who bring questions that may be addressed through computer methods. This paper outlines some general problem areas and opportunities in this new inter-disciplinary research program.

  13. Quantitative analysis of culture using millions of digitized books.

    Science.gov (United States)

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K; Pickett, Joseph P; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A; Aiden, Erez Lieberman

    2011-01-14

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of 'culturomics,' focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. Culturomics extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities.

  14. Quantitative analysis of culture using millions of digitized books

    Science.gov (United States)

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. ‘Culturomics’ extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities. PMID:21163965

  15. Analysis of the Paradigm Evolution of Digital Libraries in China

    Directory of Open Access Journals (Sweden)

    Tan Sun

    2007-10-01

    Full Text Available The authors analyze the developmental framework of digital libraries in China and point out their current demand characteristics, development requirements, and developmental period. They then conclude that it is necessary to start a new paradigm evolution of the digital library, from the traditional digital library to the virtual digital library. On that basis, they describe in detail several problems that the development of a virtual digital library must deal with and the corresponding developmental approaches, drawing lessons from the prototype DILIGENT.

  16. Key Concept Identification: A Comprehensive Analysis of Frequency and Topical Graph-Based Approaches

    Directory of Open Access Journals (Sweden)

    Muhammad Aman

    2018-05-01

    Full Text Available Automatic key concept extraction from text is the main challenging task in information extraction, information retrieval and digital libraries, ontology learning, and text analysis. The statistical frequency and topical graph-based ranking are the two kinds of potentially powerful and leading unsupervised approaches in this area, devised to address the problem. To utilize the potential of these approaches and improve key concept identification, a comprehensive performance analysis of these approaches on datasets from different domains is needed. The objective of the study presented in this paper is to perform a comprehensive empirical analysis of selected frequency and topical graph-based algorithms for key concept extraction on three different datasets, to identify the major sources of error in these approaches. For experimental analysis, we have selected TF-IDF, KP-Miner and TopicRank. Three major sources of error, i.e., frequency errors, syntactical errors and semantical errors, and the factors that contribute to these errors are identified. Analysis of the results reveals that performance of the selected approaches is significantly degraded by these errors. These findings can help us develop an intelligent solution for key concept extraction in the future.
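
    A minimal example of the frequency-based family (TF-IDF, here via scikit-learn) on toy documents; KP-Miner and TopicRank add candidate filtering and graph-based ranking on top of statistics of this kind.

        from sklearn.feature_extraction.text import TfidfVectorizer

        docs = [
            "digital libraries support information retrieval and ontology learning",
            "key concept extraction helps text analysis in digital libraries",
            "graph based ranking orders candidate concepts by topical importance",
        ]

        # Unigrams and bigrams as candidate key concepts; English stop words removed.
        vec = TfidfVectorizer(ngram_range=(1, 2), stop_words="english")
        tfidf = vec.fit_transform(docs)

        # Top-3 candidates for the second document, ranked by TF-IDF weight.
        terms = vec.get_feature_names_out()
        row = tfidf[1].toarray().ravel()
        top = row.argsort()[::-1][:3]
        print([(terms[i], round(row[i], 3)) for i in top])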

  17. Quantitative subsurface analysis using frequency modulated thermal wave imaging

    Science.gov (United States)

    Subhani, S. K.; Suresh, B.; Ghali, V. S.

    2018-01-01

    Quantitative depth analysis of a subsurface anomaly with enhanced depth resolution is a challenging task in thermography. Frequency modulated thermal wave imaging, introduced earlier, provides complete depth scanning of the object by stimulating it with a suitable band of frequencies and analyzing the subsequent thermal response with a suitable post-processing approach to resolve subsurface details. However, conventional Fourier-transform-based post-processing methods unscramble the frequencies with limited frequency resolution and thus yield only a finite depth resolution. Spectral zooming provided by the chirp z-transform offers enhanced frequency resolution, which can further improve the depth resolution to axially explore the finest subsurface features. Quantitative depth analysis with this augmented depth resolution is proposed to provide a close estimate of the actual depth of the subsurface anomaly. This manuscript experimentally validates the enhanced depth resolution using non-stationary thermal wave imaging and offers a first-of-its-kind solution for quantitative depth estimation in frequency modulated thermal wave imaging.
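
    The spectral zooming idea can be illustrated by evaluating the DTFT directly on a dense grid inside a narrow band, which is what the chirp z-transform computes efficiently; the direct evaluation below is slow but dependency-free, and the signal and band are invented.

        import numpy as np

        def zoom_spectrum(x, fs, f_lo, f_hi, n_points=512):
            # One DTFT sample per requested frequency inside [f_lo, f_hi]
            # (O(N * n_points) direct evaluation; the CZT does this efficiently).
            n = np.arange(len(x))
            freqs = np.linspace(f_lo, f_hi, n_points)
            E = np.exp(-2j * np.pi * np.outer(freqs, n) / fs)
            return freqs, E @ x

        # A 10 s record sampled at 50 Hz gives 0.1 Hz FFT bin spacing; the zoomed grid
        # locates a peak at 1.013 Hz far more finely than that spacing.
        fs = 50.0
        t = np.arange(0, 10.0, 1 / fs)
        x = np.cos(2 * np.pi * 1.013 * t)
        freqs, X = zoom_spectrum(x, fs, 0.9, 1.1)
        print(freqs[np.argmax(np.abs(X))])               # close to 1.013 Hz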

  18. Noise Measurement and Frequency Analysis of Commercially Available Noisy Toys

    Directory of Open Access Journals (Sweden)

    Shohreh Jalaie

    2005-06-01

    Full Text Available Objective: Noise measurement and frequency analysis of commercially available noisy toys were the main purposes of this study. Materials and Methods: 181 noisy toys commonly found in toy stores in different zones of Tehran were selected and categorized into 10 groups. Noise measurements were made at 2, 25, and 50 cm from the toys in dBA. The noisiest toy of each group was frequency analyzed in octave bands. Results: The highest and the lowest intensity levels belonged to the gun (mean=112 dBA, range 100-127 dBA) and to the rattle-box (mean=84 dBA, range 74-95 dBA), respectively. Noise intensity levels decreased significantly with increasing distance, except for two toys. Frequency analysis indicated energy within the effective hearing frequencies; most of the toys' energy lay in the middle and high frequency regions. Conclusion: As the intensity levels of the toys are considerable, mostly above 90 dBA, and their energy lies in the middle and high frequency regions, toys should be considered a potential cause of hearing impairment.

  19. The vibrating reed frequency meter : digital investigation of an early cochlear model

    NARCIS (Netherlands)

    Bell, Andrew; Wit, Hero P.

    2015-01-01

    The vibrating reed frequency meter, originally employed by Bekesy and later by Wilson as a cochlear model, uses a set of tuned reeds to represent the cochlea's graded bank of resonant elements and an elastic band threaded between them to provide nearest-neighbour coupling. Here the system,

  20. An operational modal analysis method in frequency and spatial domain

    Science.gov (United States)

    Wang, Tong; Zhang, Lingmi; Tamura, Yukio

    2005-12-01

    A frequency and spatial domain decomposition method (FSDD) for operational modal analysis (OMA) is presented in this paper, which is an extension of the complex mode indicator function (CMIF) method for experimental modal analysis (EMA). The theoretical background of the FSDD method is clarified. Singular value decomposition is adopted to separate the signal space from the noise space. Finally, an enhanced power spectrum density (PSD) is proposed to obtain more accurate modal parameters by curve fitting in the frequency domain. Moreover, a simulation case and an application case are used to validate this method.
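
    A bare-bones sketch of the frequency-domain decomposition step that FSDD and CMIF build on: form the output cross-spectral matrix of the ambient responses and take its singular values at every frequency line; peaks of the first singular value indicate candidate modes. The two-channel synthetic data and all settings below are assumptions for illustration, not the paper's cases.

        import numpy as np
        from scipy.signal import csd

        fs = 200.0
        t = np.arange(0, 60.0, 1 / fs)
        rng = np.random.default_rng(0)
        # two synthetic ambient responses sharing modes near 5 Hz and 12 Hz
        s1 = np.sin(2 * np.pi * 5 * t) + 0.4 * np.sin(2 * np.pi * 12 * t)
        s2 = 0.7 * np.sin(2 * np.pi * 5 * t + 0.3) - 0.5 * np.sin(2 * np.pi * 12 * t + 1.1)
        y = np.vstack([s1, s2]) + 0.2 * rng.standard_normal((2, t.size))

        nper = 1024
        f, _ = csd(y[0], y[0], fs=fs, nperseg=nper)
        G = np.empty((f.size, 2, 2), dtype=complex)      # cross-spectral matrix per frequency line
        for i in range(2):
            for j in range(2):
                _, G[:, i, j] = csd(y[i], y[j], fs=fs, nperseg=nper)

        sv = np.linalg.svd(G, compute_uv=False)          # singular values at each frequency line
        peak = f[sv[:, 0].argmax()]
        print(f"dominant singular-value peak near {peak:.2f} Hz")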

  1. Swept-frequency feedback interferometry using terahertz frequency QCLs: a method for imaging and materials analysis.

    Science.gov (United States)

    Rakić, Aleksandar D; Taimre, Thomas; Bertling, Karl; Lim, Yah Leng; Dean, Paul; Indjin, Dragan; Ikonić, Zoran; Harrison, Paul; Valavanis, Alexander; Khanna, Suraj P; Lachab, Mohammad; Wilson, Stephen J; Linfield, Edmund H; Davies, A Giles

    2013-09-23

    The terahertz (THz) frequency quantum cascade laser (QCL) is a compact source of high-power radiation with a narrow intrinsic linewidth. As such, THz QCLs are extremely promising sources for applications including high-resolution spectroscopy, heterodyne detection, and coherent imaging. We exploit the remarkable phase-stability of THz QCLs to create a coherent swept-frequency delayed self-homodyning method for both imaging and materials analysis, using laser feedback interferometry. Using our scheme we obtain amplitude-like and phase-like images with minimal signal processing. We determine the physical relationship between the operating parameters of the laser under feedback and the complex refractive index of the target and demonstrate that this coherent detection method enables extraction of complex refractive indices with high accuracy. This establishes an ultimately compact and easy-to-implement THz imaging and materials analysis system, in which the local oscillator, mixer, and detector are all combined into a single laser.

  2. Analysis quality of content digital plan of cadastre lines (underground installation

    Directory of Open Access Journals (Sweden)

    Marinković Goran

    2016-01-01

    Full Text Available This work presents the results of research on the quality of digital geodetic plans of lines and underground installations. The study included the creation of a digital geodetic plan of a sewerage network by vectorization of analogue geodetic plans, and the creation of a digital geodetic plan of the same details from the original survey and survey-maintenance data. Based on the results, the accuracy and reliability of the digital geodetic plan created by vectorizing the analogue plan ('digitizing from the screen') were analysed. The results obtained in this work can be used not only in the Republic of Serbia, but also in other countries where the digitalization of geodetic plans is increasingly gaining in importance.

  3. Gravitational torque frequency analysis for the Einstein elevator experiment

    Energy Technology Data Exchange (ETDEWEB)

    Ashenberg, Joshua [Harvard-Smithsonian Center for Astrophysics (CfA), Cambridge, MA (United States); Lorenzini, Enrico C [University of Padova, Padua (Italy)

    2007-09-07

    Testing the principle of equivalence with a differential acceleration detector that spins while free falling requires a reliable extraction of a very small violation signal from the noise in the output signal frequency spectrum. The experiment is designed such that the violation signal is modulated by the spin of the test bodies. The possible violation signal is mixed with the intrinsic white noise of the detector and the colored noise associated with the modulation of gravitational perturbations, through the spin, and inertial-motion-related noise. In order to avoid false alarms the frequencies of the gravitational disturbances and the violation signal must be separate. This paper presents a model for the perturbative gravitational torque that affects the measurement. The torque is expanded in an asymptotic series to the fourth order and then expressed as a frequency spectrum. A spectral analysis shows the design conditions for frequency separation between the perturbing torque and the violation signal.

  4. Digital image processing and analysis human and computer vision applications with CVIPtools

    CERN Document Server

    Umbaugh, Scott E

    2010-01-01

    Section I, Introduction to Digital Image Processing and Analysis: Digital Image Processing and Analysis (Overview; Image Analysis and Computer Vision; Image Processing and Human Vision; Key Points; Exercises; References; Further Reading); Computer Imaging Systems (Imaging Systems Overview; Image Formation and Sensing; CVIPtools Software; Image Representation; Key Points; Exercises; Supplementary Exercises; References; Further Reading). Section II, Digital Image Analysis and Computer Vision: Introduction to Digital Image Analysis (Introduction; Preprocessing; Binary Image Analysis; Key Points; Exercises; Supplementary Exercises; References; Further Read

  5. LUXURY BRANDS IN THE DIGITAL AGE: AN EMPIRICAL ANALYSIS OF THE EFFECTIVENESS OF DIGITAL MARKETING STRATEGIES

    OpenAIRE

    Yu, Shubin

    2017-01-01

    The rise of the internet technology has drawn the attention of many luxury marketers and researchers. Digital marketing is regarded as “activities, institutions, and processes facilitated by digital technologies for creating, communicating and delivering value for customers and other stake-holders” (Kannan & Li, 2017, p. 23). Digital marketing enhances value for customers and creates customer equity and firm value (Kannan & Li, 2017). For luxury brands, it is not a big problem to integrate ce...

  6. High-Speed Microscale Optical Tracking Using Digital Frequency-Domain Multiplexing

    OpenAIRE

    MacLachlan, Robert A.; Riviere, Cameron N.

    2009-01-01

    Position-sensitive detectors (PSDs), or lateral-effect photodiodes, are commonly used for high-speed, high-resolution optical position measurement. This paper describes the instrument design for multidimensional position and orientation measurement based on the simultaneous position measurement of multiple modulated sources using frequency-domain-multiplexed (FDM) PSDs. The important advantages of this optical configuration in comparison with laser/mirror combinations are that it has a large ...
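
    A toy NumPy illustration of the frequency-domain-multiplexing idea described above: two sources modulated at distinct carrier frequencies share a single detector signal and are separated by synchronous (lock-in style) demodulation. The sampling rate, carrier frequencies and amplitudes are assumptions chosen only for the demonstration.

        import numpy as np

        fs = 100_000.0                       # detector sampling rate (assumed), Hz
        t = np.arange(0, 0.05, 1 / fs)
        f_a, f_b = 3_000.0, 4_500.0          # carrier frequencies of the two modulated sources
        amp_a, amp_b = 0.8, 0.3              # source contributions encoded as carrier amplitudes

        detector = (amp_a * np.sin(2 * np.pi * f_a * t)
                    + amp_b * np.sin(2 * np.pi * f_b * t)
                    + 0.05 * np.random.default_rng(0).standard_normal(t.size))

        def lock_in(x, f):
            # multiply by quadrature references and average: recovers the carrier amplitude
            i = np.mean(x * np.sin(2 * np.pi * f * t))
            q = np.mean(x * np.cos(2 * np.pi * f * t))
            return 2.0 * np.hypot(i, q)

        print(f"recovered source A: {lock_in(detector, f_a):.3f} (true {amp_a})")
        print(f"recovered source B: {lock_in(detector, f_b):.3f} (true {amp_b})")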

  7. Separate recording of rationally related vibration frequencies using digital stroboscopic holographic interferometry

    International Nuclear Information System (INIS)

    Alexeenko, Igor; Gusev, Michael; Gurevich, Vadim

    2009-01-01

    A method for the separate recording of rationally related vibration frequencies is presented. To record and measure the mode shapes of vibration, a synchronized stroboscopic CCD camera is used. Synchronization and control of the camera acquisition for recording the stroboscopic holographic sequence have been realized. The phase for different states of the object vibration is calculated using the Fourier-transform method. Experimental results are presented, and the advantages and disadvantages of the proposed method are discussed.

  8. Frequency analysis of the ECG before and during ventricular fibrillation

    NARCIS (Netherlands)

    Herbschleb, J.N.; Heethaar, R.M.; Tweel, I. van der; Meijler, F.L.

    1980-01-01

    Frequency analysis of cardiac electrograms of dogs with ventricular fibrillation (VF) during complete cardiopulmonary bypass and coronary perfusion showed a power spectrum with a peak around 12 Hz and its higher harmonics, suggesting more organization than generally assumed. As a next step to see

  9. Robust detection of discordant sites in regional frequency analysis

    NARCIS (Netherlands)

    Neykov, N.M.; Neytchev, P.N.; Van Gelder, P.H.A.J.M.; Todorov, V.K.

    2007-01-01

    The discordancy measure in terms of the sample L-moment ratios (L-CV, L-skewness, L-kurtosis) of the at-site data is widely recommended in the screening process of atypical sites in the regional frequency analysis (RFA). The sample mean and the covariance matrix of the L-moment ratios, on which the

  10. Application of frequency spectrum analysis in the rotator moving equilibrium

    International Nuclear Information System (INIS)

    Liu Ruilan; Su Guanghui; Shang Zhi; Jia Dounan

    2001-01-01

    Experimental equipment was developed to simulate rotator vibration. The running state of the machine is simulated by using different running conditions. The vibration caused by an unbalanced mass under a concentrated load is analyzed and discussed for the first-order (rotation-frequency) component. Frequency spectrum analysis is found to be an effective method for this diagnosis
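
    A small sketch of the frequency-spectrum check described above, on synthetic data: with an assumed rotation speed and sampling rate, the unbalanced-mass contribution shows up as a dominant peak at the rotation frequency (the 1x component).

        import numpy as np

        fs = 2000.0                    # Hz, assumed sampling rate of the vibration sensor
        f_rot = 25.0                   # Hz, assumed shaft rotation frequency (1500 rpm)
        t = np.arange(0, 4.0, 1 / fs)
        rng = np.random.default_rng(2)
        # imbalance produces a strong 1x component; add a small 2x component and noise
        v = np.sin(2 * np.pi * f_rot * t) + 0.15 * np.sin(2 * np.pi * 2 * f_rot * t)
        v += 0.1 * rng.standard_normal(t.size)

        spec = np.abs(np.fft.rfft(v * np.hanning(t.size))) / t.size
        freqs = np.fft.rfftfreq(t.size, 1 / fs)
        peak = freqs[spec[1:].argmax() + 1]            # skip the DC bin
        print(f"dominant vibration component at {peak:.1f} Hz "
              f"(rotation frequency {f_rot:.1f} Hz, pointing to imbalance)")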

  11. A report on digital image processing and analysis

    International Nuclear Information System (INIS)

    Singh, B.; Alex, J.; Haridasan, G.

    1989-01-01

    This report presents developments in software connected with digital image processing and analysis in the Centre. In image processing, one either alters grey level values to enhance features in the image, or applies transform domain operations for restoration or filtering. Typical transform domain operations such as Karhunen-Loeve transforms are statistical in nature and are used for good registration of images or template matching. Image analysis procedures segment grey level images into images contained within selectable windows, for the purpose of estimating geometrical features such as area, perimeter and projections. In short, in image processing both the input and output are images, whereas in image analysis the input is an image and the output is a set of numbers and graphs. (author). 19 refs
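
    The "image in, numbers out" side of image analysis can be sketched as follows: threshold a grey-level image, label the connected regions inside a selectable window, and report area and perimeter. The sketch uses scikit-image on a synthetic image; the window, threshold and objects are arbitrary assumptions.

        import numpy as np
        from skimage.measure import label, regionprops

        # synthetic grey-level image with two bright objects
        img = np.zeros((128, 128))
        img[20:50, 30:70] = 200                               # rectangular object
        yy, xx = np.mgrid[:128, :128]
        img[(yy - 90) ** 2 + (xx - 90) ** 2 < 15 ** 2] = 150  # circular object

        window = img[0:128, 0:128]      # selectable analysis window (whole image here)
        binary = window > 100           # grey-level threshold
        labels = label(binary)

        for region in regionprops(labels):
            print(f"object {region.label}: area = {region.area} px, "
                  f"perimeter = {region.perimeter:.1f} px")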

  12. Digital Processor Module Reliability Analysis of Nuclear Power Plant

    International Nuclear Information System (INIS)

    Lee, Sang Yong; Jung, Jae Hyun; Kim, Jae Ho; Kim, Sung Hun

    2005-01-01

    Systems used in plants, military equipment, satellites, etc. consist of many electronic parts in their control modules, which require relatively higher reliability than commercial electronic products. In particular, nuclear power plants, with their radiation safety requirements, demand high safety and reliability, so most parts are specified at Military-Standard level. Reliability prediction provides a rational basis for system design and also for assessing the safety significance of system operations. Various reliability prediction tools have therefore been developed in recent decades; among them, the MIL-HDBK-217 method has been widely used as a powerful prediction tool. In this work, the reliability analysis for the Digital Processor Module (DPM, the control module of SMART) is performed by the Parts Stress Method based on MIL-HDBK-217F Notice 2. We use Relex 7.6 from Relex Software Corporation, because the reliability analysis process requires extensive part libraries and data for failure rate calculation
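
    The parts-stress bookkeeping behind such a prediction can be illustrated in a few lines: each part's failure rate is a base rate scaled by stress factors, the module failure rate is their sum, and the MTBF is its reciprocal. The base rates and pi factors below are made-up placeholders, not MIL-HDBK-217F values and not the Relex model of the DPM.

        # Illustrative parts-stress roll-up; all numbers are placeholders.
        parts = [
            # (name, base failure rate lambda_b [failures / 1e6 h], pi_T, pi_Q, pi_E)
            ("microprocessor", 0.0750, 1.8, 1.0, 2.0),
            ("SRAM",           0.0210, 1.5, 1.0, 2.0),
            ("ceramic cap",    0.0005, 1.2, 3.0, 2.0),
            ("film resistor",  0.0002, 1.1, 3.0, 2.0),
        ]

        lambda_module = sum(lb * pt * pq * pe for _, lb, pt, pq, pe in parts)
        mtbf_hours = 1e6 / lambda_module
        print(f"module failure rate: {lambda_module:.4f} failures per 1e6 h")
        print(f"predicted MTBF: {mtbf_hours:,.0f} h")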

  13. Analysis Components of the Digital Consumer Behavior in Romania

    Directory of Open Access Journals (Sweden)

    Cristian Bogdan Onete

    2016-08-01

    Full Text Available This article investigates Romanian consumer behavior in the context of the evolution of online shopping. Given that online stores are a profitable business model in the area of electronic commerce, and because the relationship between the Romanian digital consumer and the decision to purchase products or services on the Internet has not been sufficiently explored, this study aims to identify the specific features of this new type of consumer and to examine the level of online shopping in Romania. A documentary study was therefore carried out on statistical data regarding the volume and number of online shopping transactions in Romania during 2010-2014, the types of products and services that Romanians search the Internet for, and the demographics of these buyers. In addition, to study online consumer behavior more closely and to interpret the detailed secondary data, an exploratory study was performed using a structured questionnaire with five closed questions covering: the distribution of individuals by gender (male or female); the decision to purchase products or services in the virtual environment in the past year; the source of the goods or services purchased (Romanian or foreign sites); the factors that determined consumers to buy products from foreign sites; and the categories of products purchased through online transactions from foreign merchants. The questionnaire was distributed electronically to users of the Facebook social network, and the data collected were processed directly in Facebook's official survey application. The results of this research, correlated with the official data, reveal the following characteristics of the digital consumer in Romania: an atypical European consumer, more interested in online purchases from abroad, influenced by the quality and price of the purchase. This paper involved a careful analysis of the online acquisitions phenomenon and also

  14. Climate Informed Low Flow Frequency Analysis Using Nonstationary Modeling

    Science.gov (United States)

    Liu, D.; Guo, S.; Lian, Y.

    2014-12-01

    Stationarity is often assumed for frequency analysis of low flows in water resources management and planning. However, many studies have shown that flow characteristics, particularly the frequency spectrum of extreme hydrologic events, have been modified by climate change and human activities, and that conventional frequency analysis which ignores these non-stationary characteristics may lead to costly designs. The analysis presented in this paper was based on more than 100 years of daily flow data from the Yichang gaging station, 44 kilometers downstream of the Three Gorges Dam. The Mann-Kendall trend test under the scaling hypothesis showed that the annual low flows had a significant monotonic trend, whereas an abrupt change point was identified in 1936 by the Pettitt test. The climate informed low flow frequency analysis and the divided and combined method are employed to account for the impacts of related climate variables and the nonstationarities in annual low flows. Without prior knowledge of the probability density function for the gaging station, six distribution functions, including the Generalized Extreme Value (GEV), Pearson Type III, Gumbel, Gamma, Lognormal, and Weibull distributions, were tested to find the best fit, with the local likelihood method used to estimate the parameters. Analyses show that the GEV had the best fit for the observed low flows. This study has also shown that the climate informed low flow frequency analysis is able to exploit the link between climate indices and low flows, which accounts for the dynamic behaviour needed in reservoir management and provides more accurate and reliable designs for infrastructure and water supply.
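
    As a small stationary baseline for the distribution-fitting step (the study itself goes further, with covariates and local likelihood), the sketch below fits a GEV to synthetic annual low flows with SciPy and reads off a low-quantile design value. The data and the chosen quantile are illustrative assumptions, not the Yichang series.

        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(42)
        # synthetic annual low flows (m^3/s), standing in for a century-long record
        low_flows = genextreme.rvs(c=0.1, loc=3500, scale=600, size=100, random_state=rng)

        c, loc, scale = genextreme.fit(low_flows)
        q_1_in_100 = genextreme.ppf(0.01, c, loc=loc, scale=scale)   # 1% non-exceedance quantile
        print(f"fitted GEV: shape={c:.3f}, loc={loc:.0f}, scale={scale:.0f}")
        print(f"1-in-100-year low-flow design value ~ {q_1_in_100:.0f} m^3/s")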

  15. Effects of high frequency electromagnetic fields emitted from digital cellular telephones on electronic pocket dosimeters

    International Nuclear Information System (INIS)

    Deji, Shizuhiko; Nishizawa, Kunihide

    2003-01-01

    The electric field strength distribution around a digital cellular telephone (cell phone) transmitting in the 1.5 GHz band was analyzed by using an isotropic probe. Five types of electronic pocket dosimeters (EPDs) were exposed to the fields for 50 s in four configurations relative to the cell phone. The field distribution expanded around the antenna and had a maximum strength of 36.5 ± 0.3 V/m. The cell phone caused abnormally high readings (wrong dosages) in four of the five EPDs due to electromagnetic interference. Three of the four EPDs exceeded the upper limits of their dose range depending on the configuration, and the maximum wrong dosage among the EPDs was 1,283 μSv. The minimum distance preventing electromagnetic interference (protection distance) differed for each EPD and ranged from 2.0 cm to 21.0 cm. The electromagnetic immunity levels of EPD-1, 2, 3, 4 and 5 were 13.3, ≥35, ≥32, 9.2 and ≥35 V/m, respectively. Although the immunity levels were equal to or greater than the IEC standard level, those of the EPDs should be raised above the IEC standard from the standpoint of radiation protection. (author)

  16. Parallel optical control of spatiotemporal neuronal spike activity using high-frequency digital light processing technology

    Directory of Open Access Journals (Sweden)

    Jason eJerome

    2011-08-01

    Full Text Available Neurons in the mammalian neocortex receive inputs from and communicate back to thousands of other neurons, creating complex spatiotemporal activity patterns. The experimental investigation of these parallel dynamic interactions has been limited due to the technical challenges of monitoring or manipulating neuronal activity at that level of complexity. Here we describe a new massively parallel photostimulation system that can be used to control action potential firing in in vitro brain slices with high spatial and temporal resolution while performing extracellular or intracellular electrophysiological measurements. The system uses Digital Light Processing (DLP) technology to generate two-dimensional (2D) stimulus patterns with >780,000 independently controlled photostimulation sites that operate at high spatial (5.4 µm) and temporal (>13 kHz) resolution. Light is projected through the quartz-glass bottom of the perfusion chamber, providing access to a large area (2.76 x 2.07 mm²) of the slice preparation. This system has the unique capability to induce temporally precise action potential firing in large groups of neurons distributed over a wide area covering several cortical columns. Parallel photostimulation opens up new opportunities for the in vitro experimental investigation of spatiotemporal neuronal interactions at a broad range of anatomical scales.

  17. DATA ACQUISITION AND ANALYSIS OF LOW FREQUENCY ELECTROMAGNETIC FIELD

    Directory of Open Access Journals (Sweden)

    PETRICA POPOV

    2016-06-01

    Full Text Available In recent years more and more studies have shown that low frequency fields (particularly magnetic fields at 50/60 Hz) are a major risk factor; according to some specialists, even more important than the radiated field. As a result, personnel serving equipment and facilities such as synchronous electric generators, motors, inverters or power transformers are continually subjected to intense fields in their vicinity, with possible harmful long-term effects through the disturbance of cell metabolism and, consequently, of biological mechanisms. Therefore, finding new methods and tools for the measurement and analysis of low frequency electromagnetic fields may lead to improved standards for exposure limits of the human body.

  18. Time-Frequency Analysis of Signals Generated by Rotating Machines

    Directory of Open Access Journals (Sweden)

    R. Zetik

    1999-06-01

    Full Text Available This contribution is devoted to higher order time-frequency analyses of signals. Firstly, time-frequency representations of higher order (TFRHO) are defined. Then the L-Wigner distribution (LWD) is given as a special case of TFRHO. Basic properties of the LWD are illustrated through the analysis of mono-component and multi-component synthetic signals and of acoustical signals generated by a rotating machine. The obtained results confirm the usefulness of the LWD for rotating machine condition monitoring.
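
    As background for the LWD, the ordinary Wigner-Ville distribution it generalizes can be sketched in NumPy: the instantaneous autocorrelation of the analytic signal is Fourier-transformed over the lag variable. This is a truncated-lag ("pseudo") implementation applied to a synthetic chirp standing in for a machine run-up signal; it is an illustration only, not the higher-order LWD of the paper.

        import numpy as np
        from scipy.signal import hilbert, chirp

        def pseudo_wvd(x, fs, n_lag=128):
            """Truncated-lag (pseudo) Wigner-Ville distribution: (freqs, W), rows span 0..fs/2."""
            z = hilbert(x)                    # analytic signal reduces interference terms
            n = len(z)
            W = np.zeros((n_lag, n))
            for m in range(n):
                max_tau = min(m, n - 1 - m, n_lag - 1)
                tau = np.arange(max_tau + 1)
                acf = np.zeros(n_lag, dtype=complex)
                acf[:max_tau + 1] = z[m + tau] * np.conj(z[m - tau])   # instantaneous autocorrelation
                # symmetric-lag sum: K[-tau] = conj(K[tau]) and K[0] is real
                W[:, m] = 2.0 * np.real(np.fft.fft(acf)) - np.real(acf[0])
            freqs = np.arange(n_lag) * fs / (2.0 * n_lag)   # the WVD lag kernel spans 2 samples per lag
            return freqs, W

        fs = 1000.0
        t = np.arange(0, 1.0, 1 / fs)
        x = chirp(t, f0=50, f1=200, t1=1.0)                 # synthetic run-up-like sweep
        freqs, W = pseudo_wvd(x, fs)
        mid = W[:, W.shape[1] // 2]
        print(f"instantaneous frequency at t = 0.5 s ~ {freqs[mid.argmax()]:.0f} Hz (expected ~125 Hz)")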

  19. The analysis of track chamber photographs using flying spot digitizers

    CERN Multimedia

    Powell, Brian W

    1966-01-01

    A vast quantity of data pours from the experiments on particle accelerators throughout the world. For example, over 300 000 photographs per week came from the three bubble chambers operating on the CERN PS at the end of 1965. The conventional method of processing these bubble chamber photographs is for each one of them to be examined ('scanned') to see whether it records an interesting particle interaction. The interesting photographs are then passed to hand operated measuring machines to obtain precise measurements of the particle trajectories recorded on the film. Similar measurements are carried out on photographs taken in film spark chamber experiments. This article on the Flying Spot Digitizers at CERN describes one of the most fruitful attempts to speed and make more accurate the process of analysis of bubble and spark chamber photographs. There are two types of Flying Spot Digitizer at CERN — the HPD or Hough Powell Device (named after Professor Hough and the author who, together, initiated the devel...

  20. A new frequency-domain criterion for elimination of limit cycles in fixed-point state-space digital filters using saturation arithmetic

    International Nuclear Information System (INIS)

    Singh, Vimal

    2007-01-01

    In [Singh V. Elimination of overflow oscillations in fixed-point state-space digital filters using saturation arithmetic. IEEE Trans Circ Syst 1990;37(6):814-8], a frequency-domain criterion for the suppression of limit cycles in fixed-point state-space digital filters using saturation overflow arithmetic was presented. The passivity property owing to the presence of multiple saturation nonlinearities was exploited therein. In the present paper, a new notion of passivity, namely, that involving the state variables is considered, thereby arriving at an entirely new frequency-domain criterion for the suppression of limit cycles in such filters
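
    A tiny zero-input simulation sketches the setting such criteria address: a fixed-point state-space filter whose states pass through a saturation nonlinearity at every update. The state matrix below is hypothetical, not taken from the papers; for it the response decays to zero, whereas a realization violating the criterion may instead sustain a persistent (limit-cycle) oscillation.

        import numpy as np

        def saturate(v, limit=1.0):
            # saturation overflow arithmetic: each state is clamped to [-limit, +limit]
            return np.clip(v, -limit, limit)

        # hypothetical second-order state matrix (eigenvalues 0.7 +/- 0.6j, inside the unit circle)
        A = np.array([[0.7, 0.6],
                      [-0.6, 0.7]])

        x = np.array([0.95, -0.95])        # nonzero initial state, zero input
        for _ in range(100):
            x = saturate(A @ x)

        # ~0 here: no zero-input limit cycle for this particular realization
        print("state after 100 zero-input steps:", x)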

  1. High frequency vibration analysis by the complex envelope vectorization.

    Science.gov (United States)

    Giannini, O; Carcaterra, A; Sestieri, A

    2007-06-01

    The complex envelope displacement analysis (CEDA) is a procedure for solving high frequency vibration and vibro-acoustic problems, providing the envelope of the physical solution. CEDA is based on a variable transformation mapping the high frequency oscillations into signals of low frequency content and has been successfully applied to one-dimensional systems. However, the extension to plates and vibro-acoustic fields met serious difficulties, so a general revision of the theory was carried out, leading finally to a new method, the complex envelope vectorization (CEV). In this paper the CEV method is described, outlining the merits and limits of the procedure, and a set of applications to vibration and vibro-acoustic problems of increasing complexity is presented.

  2. New Canonic Active RC Sinusoidal Oscillator Circuits Using Second-Generation Current Conveyors with Application as a Wide-Frequency Digitally Controlled Sinusoid Generator

    Directory of Open Access Journals (Sweden)

    Abhirup Lahiri

    2011-01-01

    Full Text Available This paper reports two new circuit topologies using second-generation current conveyors (CCIIs) for realizing variable frequency sinusoidal oscillators with a minimum of passive components. The proposed topologies provide new realizations of resistance-controlled and capacitor-controlled variable frequency oscillators (VFOs) using only four passive components. The first topology employs three CCIIs, while the second employs two CCIIs. The second topology provides the advantageous feature of frequency tuning through two grounded elements. An application of the proposed circuits as a wide-frequency-range digitally controlled sinusoid generator is exhibited, wherein digital frequency control is enabled by replacing both capacitors with two identical variable binary capacitor banks tunable by means of the same binary code. SPICE simulations of the CMOS implementation of the oscillators, using 0.35 μm TSMC CMOS technology parameters, and of the bipolar implementation, using process parameters for NR200N-2X (NPN) and PR200N-2X (PNP) devices of the bipolar arrays ALA400-CBIC-R, have validated their workability. One of the oscillators (with CMOS implementation) is exemplified as a digitally controlled sinusoid generator with frequency generation from 25 kHz to 6.36 MHz, achieved by switching capacitors, with a power consumption of 7 mW over the entire operating frequency range.

  3. Radiation dose with digital breast tomosynthesis compared to digital mammography. Per-view analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gennaro, Gisella [Veneto Institute of Oncology IOV- IRCCS, Radiology Unit, Padua (Italy); Bernardi, D. [Azienda Provinciale Servizi Sanitari (APSS), U.O. Senologia Clinica e Screening Mammografico, Department of Diagnostics, Trento (Italy); Houssami, N. [University of Sydney, Screening and Test Evaluation Program (STEP), School of Public Health, Sydney Medical School, Sydney (Australia)

    2018-02-15

    To compare the radiation dose delivered by digital mammography (FFDM) and breast tomosynthesis (DBT) for a single view. 4,780 FFDM and 4,798 DBT images from 1,208 women enrolled in a screening trial were used as the basis for the dose comparison. Raw images were processed by automatic software to determine volumetric breast density (VBD) and were used together with exposure data to compute the mean glandular dose (MGD) according to Dance's model. DBT and FFDM were compared in terms of the operation of the automatic exposure control (AEC) and the MGD level. Statistically significant differences were found between FFDM and DBT MGDs for all views (CC: MGD_FFDM = 1.366 mGy, MGD_DBT = 1.858 mGy, p<0.0001; MLO: MGD_FFDM = 1.374 mGy, MGD_DBT = 1.877 mGy, p<0.0001). Considering the 4,768 paired views, Bland-Altman analysis showed that the average increase of DBT dose compared to FFDM is 38%, with a range between 0% and 75%. Our findings show a modest increase in radiation dose to the breast from tomosynthesis compared to FFDM. Given the emerging role of DBT, its use in conjunction with synthetic 2D images should not be deterred by concerns regarding radiation burden, and should draw on evidence of potential clinical benefit. (orig.)
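
    The per-view comparison rests on a Bland-Altman analysis of paired doses, which takes only a few lines; the values below are hypothetical and merely chosen to echo the reported ~38% average increase, not data from the trial.

        import numpy as np

        mgd_ffdm = np.array([1.21, 1.35, 1.48, 1.30, 1.52])   # mGy, illustrative only
        mgd_dbt  = np.array([1.70, 1.86, 2.01, 1.79, 2.05])   # mGy, illustrative only

        diff = mgd_dbt - mgd_ffdm
        mean_diff = diff.mean()
        loa = 1.96 * diff.std(ddof=1)                          # 95% limits of agreement
        print(f"mean increase: {mean_diff:.3f} mGy "
              f"({100 * mean_diff / mgd_ffdm.mean():.0f}% of the mean FFDM dose)")
        print(f"limits of agreement: [{mean_diff - loa:.3f}, {mean_diff + loa:.3f}] mGy")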

  4. Analysis of modal frequency optimization of railway vehicle car body

    Directory of Open Access Journals (Sweden)

    Wenjing Sun

    2016-04-01

    Full Text Available High structural modal frequencies of car body are beneficial as they ensure better vibration control and enhance ride quality of railway vehicles. Modal sensitivity optimization and elastic suspension parameters used in the design of equipment beneath the chassis of the car body are proposed in order to improve the modal frequencies of car bodies under service conditions. Modal sensitivity optimization is based on sensitivity analysis theory which considers the thickness of the body frame at various positions as variables in order to achieve optimization. Equipment suspension design analyzes the influence of suspension parameters on the modal frequencies of the car body through the use of an equipment-car body coupled model. Results indicate that both methods can effectively improve the modal parameters of the car body. Modal sensitivity optimization increases vertical bending frequency from 9.70 to 10.60 Hz, while optimization of elastic suspension parameters increases the vertical bending frequency to 10.51 Hz. The suspension design can be used without alteration to the structure of the car body while ensuring better ride quality.

  5. FREQUENCY ANALYSIS OF VIBRATIONS OF THE ROUND PARACHUTE EDGE

    Directory of Open Access Journals (Sweden)

    2016-01-01

    Full Text Available The article analyses the video recordings obtained during a flight experiment at the launch of the MMP-06 meteorological rocket, in order to determine the main characteristics of the oscillatory process of the canopy edge of a round parachute at subsonic speeds at altitudes from 42.2 km to 34.2 km. Data analysis demonstrated that the oscillations of the canopy edge have a random character. A structural frequency of 2.4 Hz was identified from the analysis and was determined to be governed by the stiffness of the nylon slings.

  6. Nonstationary Hydrological Frequency Analysis: Theoretical Methods and Application Challenges

    Science.gov (United States)

    Xiong, L.

    2014-12-01

    Because of its great implications for the design and operation of hydraulic structures under changing environments (either climate change or anthropogenic changes), nonstationary hydrological frequency analysis has become important and essential. Two important achievements have been made in methods. Without adhering to the consistency assumption of traditional hydrological frequency analysis, the time-varying probability distribution of any hydrological variable can be established by linking the distribution parameters to covariates such as time or physical variables, with the help of powerful tools like the Generalized Additive Model for Location, Scale and Shape (GAMLSS). With the help of copulas, multivariate nonstationary hydrological frequency analysis has also become feasible. However, application of nonstationary hydrological frequency formulae to the design and operation of hydraulic structures for coping with the impacts of changing environments is still faced with many challenges in practice. First, nonstationary hydrological frequency formulae with time as the covariate can only be extrapolated for a very short period beyond the latest observation time, because such formulae are not physically constrained and the extrapolated outcomes could be unrealistic. There are two physically reasonable methods that can be used for changing environments: one is to directly link the quantiles or the distribution parameters to measurable physical factors, and the other is to use derived probability distributions based on hydrological processes. However, both methods carry a certain degree of uncertainty. For the design and operation of hydraulic structures under changing environments, it is recommended that the design results of both stationary and nonstationary methods be presented together and compared with each other, to help us understand the potential risks of each method.
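
    A minimal sketch of the GAMLSS-style idea mentioned above: make the GEV location parameter a linear function of a covariate (here simply time) and estimate all parameters by maximum likelihood. The data are synthetic and the linear-in-time location is an assumed model form, not a recommendation from the abstract.

        import numpy as np
        from scipy.stats import genextreme
        from scipy.optimize import minimize

        rng = np.random.default_rng(0)
        years = np.arange(1961, 2011)
        t = (years - years.mean()) / years.std()
        x = genextreme.rvs(c=-0.1, loc=100 + 8 * t, scale=15, random_state=rng)  # synthetic annual maxima

        def nll(params):
            mu0, mu1, log_sigma, xi = params
            mu = mu0 + mu1 * t
            # SciPy's shape convention: c = -xi
            return -genextreme.logpdf(x, c=-xi, loc=mu, scale=np.exp(log_sigma)).sum()

        res = minimize(nll, x0=[x.mean(), 0.0, np.log(x.std()), 0.0], method="Nelder-Mead")
        mu0, mu1, log_sigma, xi = res.x
        print(f"trend in location: {mu1:.2f} per standardized year, shape xi = {xi:.2f}")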

  7. Time-variant random interval natural frequency analysis of structures

    Science.gov (United States)

    Wu, Binhua; Wu, Di; Gao, Wei; Song, Chongmin

    2018-02-01

    This paper presents a new robust method, namely the unified interval Chebyshev-based random perturbation method, to tackle the hybrid random interval structural natural frequency problem. In the proposed approach, the random perturbation method is implemented to furnish the statistical features (i.e., mean and standard deviation), and a Chebyshev surrogate model strategy is incorporated to formulate the statistical information of the natural frequency with regard to the interval inputs. The comprehensive analysis framework combines the strengths of both methods in a way that dramatically reduces the computational cost. The presented method is thus capable of investigating the day-to-day time-variant natural frequency of structures accurately and efficiently under concrete intrinsic creep effects with probabilistic and interval uncertain variables. The extreme bounds of the mean and standard deviation of the natural frequency are captured through the optimization strategy embedded within the analysis procedure. Three numerical examples, with a progressive relationship in terms of both structure type and uncertainty variables, are presented to demonstrate the applicability, accuracy and efficiency of the proposed method.

  8. Model validity and frequency band selection in operational modal analysis

    Science.gov (United States)

    Au, Siu-Kui

    2016-12-01

    Experimental modal analysis aims at identifying the modal properties (e.g., natural frequencies, damping ratios, mode shapes) of a structure using vibration measurements. Two basic questions are encountered when operating in the frequency domain: Is there a mode near a particular frequency? If so, how much spectral data near the frequency can be included for modal identification without incurring significant modeling error? For data with high signal-to-noise (s/n) ratios these questions can be addressed using empirical tools such as singular value spectrum. Otherwise they are generally open and can be challenging, e.g., for modes with low s/n ratios or close modes. In this work these questions are addressed using a Bayesian approach. The focus is on operational modal analysis, i.e., with 'output-only' ambient data, where identification uncertainty and modeling error can be significant and their control is most demanding. The approach leads to 'evidence ratios' quantifying the relative plausibility of competing sets of modeling assumptions. The latter involves modeling the 'what-if-not' situation, which is non-trivial but is resolved by systematic consideration of alternative models and using maximum entropy principle. Synthetic and field data are considered to investigate the behavior of evidence ratios and how they should be interpreted in practical applications.

  9. Advanced Time-Frequency Representation in Voice Signal Analysis

    Directory of Open Access Journals (Sweden)

    Dariusz Mika

    2018-03-01

    Full Text Available The most commonly used time-frequency representation in voice signal analysis is the spectrogram. This representation belongs to Cohen's class, the class of time-frequency energy distributions. From the standpoint of resolution properties, the spectrogram is not optimal, and representations with better resolution properties are known within Cohen's class. All of them are created by smoothing the Wigner-Ville distribution (WVD), which is characterized by the best resolution but also the strongest harmful interference. The smoothing functions used determine a compromise between resolution properties and the elimination of harmful interference terms. Another class of time-frequency energy distributions is the affine class. From the point of view of readability of the analysis, the best properties are obtained by the reassignment of energy, a general methodology that can be applied to any time-frequency representation. Reassigned distributions efficiently combine a reduction of the interference terms, provided by a well adapted smoothing kernel, with an increased concentration of the signal components.

  10. Analysis of core damage frequency: Surry, Unit 1 internal events

    International Nuclear Information System (INIS)

    Bertucio, R.C.; Julius, J.A.; Cramond, W.R.

    1990-04-01

    This document contains the accident sequence analysis of internally initiated events for the Surry Nuclear Station, Unit 1. This is one of the five plant analyses conducted as part of the NUREG-1150 effort by the Nuclear Regulatory Commission, which documents the risk of a selected group of nuclear power plants. The work performed and described here is an extension of that published in November 1986 as NUREG/CR-4450, Volume 3. It addresses comments from numerous reviewers and significant changes to the plant systems and procedures made since the first report. The uncertainty analysis and presentation of results are also much improved. The content and detail of this report are directed toward PRA practitioners who need to know how the work was performed and the details for use in further studies. The mean core damage frequency at Surry was calculated to be 4.05E-5 per year, with a 95% upper bound of 1.34E-4 and a 5% lower bound of 6.8E-6 per year. Station blackout type accidents (loss of all AC power) were the largest contributors to the core damage frequency, accounting for approximately 68% of the total. The next dominant type of contributor was loss of coolant accidents (LOCAs), which account for 15% of the core damage frequency. No other type of sequence accounts for more than 10% of the core damage frequency. 49 refs., 52 figs., 70 tabs

  11. [Evaluation of dental plaque by quantitative digital image analysis system].

    Science.gov (United States)

    Huang, Z; Luan, Q X

    2016-04-18

    To analyze plaque staining images by using image analysis software, to verify the maneuverability, practicability and repeatability of this technique, and to evaluate the influence of different plaque stains. In the study, 30 volunteers were enrolled from the new dental students of Peking University Health Science Center in accordance with the inclusion criteria. Digital images of the anterior teeth were acquired after plaque staining, following a standardized imaging protocol. Image analysis was performed using Image Pro Plus 7.0, and the Quigley-Hein plaque indexes of the anterior teeth were evaluated. The plaque stain area percentage and the corresponding dental plaque index were highly correlated (Spearman correlation coefficient 0.776, statistically significant); the agreement chart showed only a few points outside the 95% consistency boundaries. The image analysis results for the different plaque stains showed that the difference in the tooth area measurements was not significant, while the difference in the plaque area measurements was significant (P<0.01). This method is easy to operate and control, highly correlated with the calculated percentage of plaque area and the traditional plaque index, and has good reproducibility. The different plaque staining methods have little effect on the image segmentation results. The more sensitive plaque stain is suggested for image analysis.

  12. The Peltier driven frequency domain approach in thermal analysis.

    Science.gov (United States)

    De Marchi, Andrea; Giaretto, Valter

    2014-10-01

    The merits of frequency domain analysis as a tool for thermal system characterization are discussed, and the complex thermal impedance approach is illustrated. Pure AC thermal flux generation with a negligible DC component is possible with a Peltier device, unlike other existing methods in which a significant DC component is intrinsically attached to the generated AC flux. This technique is named here the Peltier Driven Frequency Domain (PDFD) approach. As a necessary prerequisite, a novel one-dimensional analytical model for an asymmetrically loaded Peltier device is developed, which is general enough to be useful in most practical situations as a design tool for measurement systems and as a key to the interpretation of experimental results. Impedance analysis is possible with Peltier devices through the built-in Seebeck-effect differential thermometer, and is used in the paper for an experimental validation of the analytical model. Suggestions are then given for possible applications of PDFD, including the determination of thermal properties of materials.

  13. The Accuracy of the Digital imaging system and the frequency dependent type apex locator in root canal length measurement

    International Nuclear Information System (INIS)

    Lee, Byoung Rib; Park, Chang Seo

    1998-01-01

    In order to achieve successful endodontic treatment, root canals must be obturated three-dimensionally without causing any damage to the apical tissues, and accurate determination of the root canal length is therefore critical. For this reason, conventional periapical radiography, Digora (a digital imaging system) and Root ZX (a frequency dependent apex locator) were used to measure canal length, and the results were compared with the true length obtained by sectioning the tooth and measuring the distance between the occlusal surface and the apical foramen. From these measurements, the accuracy and clinical usefulness of each system were evaluated; whether the thickness of the files used in endodontic therapy has any effect on the measuring systems was also evaluated, in an effort to simplify the treatment planning phase of endodontic treatment. 29 canals of 29 sound premolars were measured with No. 15, No. 20 and No. 25 files by 3 different dentists, each using periapical radiography, Digora and Root ZX. The measurements were then compared with the true length. The results were as follows: 1. In comparing the mean discrepancies from the true length of the measurements obtained with periapical radiography (mean error: -0.449 ± 0.444 mm), Digora (mean error: -0.417 ± 0.415 mm) and Root ZX (mean error: 0.123 ± 0.458 mm), periapical radiography and the Digora system showed statistically significant differences (p<0.05). 2. By subtracting the values obtained with periapical radiography, Digora and Root ZX from the true length and building a distribution table of the absolute differences, the following analysis was possible: for periapical film, 140 out of 261 measurements (53.6%) were clinically acceptable within the 0.5 mm margin of error, 151 out of 261 (53.6%) were acceptable for the Digora system, while Root ZX had 197 out of 261 (75.5%) within the 0.5 mm margin of error. 3. In determining whether the thickness of

  14. Time-Frequency Analysis of the Dispersion of Lamb Modes

    Science.gov (United States)

    Prosser, W. H.; Seale, Michael D.; Smith, Barry T.

    1999-01-01

    Accurate knowledge of the velocity dispersion of Lamb modes is important for ultrasonic nondestructive evaluation methods used in detecting and locating flaws in thin plates and in determining their elastic stiffness coefficients. Lamb mode dispersion is also important in the acoustic emission technique for accurately triangulating the location of emissions in thin plates. In this research, the ability to characterize Lamb mode dispersion through a time-frequency analysis (the pseudo Wigner-Ville distribution) was demonstrated. A major advantage of time-frequency methods is the ability to analyze acoustic signals containing multiple propagation modes, which overlap and superimpose in the time domain signal. By combining time-frequency analysis with a broadband acoustic excitation source, the dispersion of multiple Lamb modes over a wide frequency range can be determined from as little as a single measurement. In addition, the technique provides a direct measurement of the group velocity dispersion. The technique was first demonstrated in the analysis of a simulated waveform in an aluminum plate in which the Lamb mode dispersion was well known. Portions of the dispersion curves of the A(sub 0), A(sub 1), S(sub 0), and S(sub 2)Lamb modes were obtained from this one waveform. The technique was also applied for the analysis of experimental waveforms from a unidirectional graphite/epoxy composite plate. Measurements were made both along, and perpendicular to the fiber direction. In this case, the signals contained only the lowest order symmetric and antisymmetric modes. A least squares fit of the results from several source to detector distances was used. Theoretical dispersion curves were calculated and are shown to be in good agreement with experimental results.

  15. Frequency spectrum analysis of finger photoplethysmographic waveform variability during haemodialysis.

    Science.gov (United States)

    Javed, Faizan; Middleton, Paul M; Malouf, Philip; Chan, Gregory S H; Savkin, Andrey V; Lovell, Nigel H; Steel, Elizabeth; Mackie, James

    2010-09-01

    This study investigates the peripheral circulatory and autonomic response to volume withdrawal in haemodialysis, based on spectral analysis of photoplethysmographic waveform variability (PPGV). Frequency spectrum analysis was performed on the baseline and pulse amplitude variabilities of the finger infrared photoplethysmographic (PPG) waveform and on heart rate variability extracted from the ECG signal, collected from 18 kidney failure patients undergoing haemodialysis. Spectral powers were calculated in the low frequency (LF, 0.04-0.145 Hz) and high frequency (HF, 0.145-0.45 Hz) bands. In eight stable fluid-overloaded patients (fluid removal of >2 L) not on alpha blockers, progressive reduction in relative blood volume during haemodialysis resulted in a significant increase in the LF and HF powers of PPG baseline and amplitude variability. Spectral analysis of finger PPGV may thus provide valuable information on the autonomic vascular response to blood volume reduction in haemodialysis, and can potentially be utilized as a non-invasive tool for assessing peripheral circulatory control during routine dialysis procedures.

  16. Time-frequency analysis : mathematical analysis of the empirical mode decomposition.

    Science.gov (United States)

    2009-01-01

    Invented over 10 years ago, empirical mode decomposition (EMD) provides a nonlinear time-frequency analysis with the ability to successfully analyze nonstationary signals. Mathematical Analysis of the Empirical Mode Decomposition is a...

  17. [Quantitative image analysis in pulmonary pathology - digitalization of preneoplastic lesions in human bronchial epithelium (author's transl)].

    Science.gov (United States)

    Steinbach, T; Müller, K M; Kämper, H

    1979-01-01

    The report concerns the first phase of a quantitative study of normal and abnormal bronchial epithelium, with the objective of establishing the digitalization of histologic patterns. Preparative methods, data collection and handling, and further mathematical analysis are described. In cluster and discriminant analysis, the digitalized histologic features can be used to separate and classify the individual cases into the respective diagnostic groups.

  18. An instructional guide for leaf color analysis using digital imaging software

    Science.gov (United States)

    Paula F. Murakami; Michelle R. Turner; Abby K. van den Berg; Paul G. Schaberg

    2005-01-01

    Digital color analysis has become an increasingly popular and cost-effective method utilized by resource managers and scientists for evaluating foliar nutrition and health in response to environmental stresses. We developed and tested a new method of digital image analysis that uses Scion Image or NIH image public domain software to quantify leaf color. This...

  19. Effects of non-surgical factors on digital replantation survival rate: a meta-analysis.

    Science.gov (United States)

    Ma, Z; Guo, F; Qi, J; Xiang, W; Zhang, J

    2016-02-01

    This study aimed to evaluate the risk factors affecting survival rate of digital replantation by a meta-analysis. A computer retrieval of MEDLINE, OVID, EMBASE, and CNKI databases was conducted to identify citations for digital replantation with digit or finger or thumb or digital or fingertip and replantation as keywords. RevMan 5.2 software was used to calculate the pooled odds ratios. In total, there were 4678 amputated digits in 2641 patients. Gender and ischemia time had no significant influence on the survival rate of amputation replantation (P > 0.05). Age, injured hand, injury type, zone, and the method of preservation the amputated digit significantly influence the survival rate of digital replantation (P < 0.05). Children, right hand, crush, or avulsion and little finger are the risk factors that adversely affect the outcome. Level 5*. © The Author(s) 2015.

  20. Frequency prediction by linear stability analysis around mean flow

    Science.gov (United States)

    Bengana, Yacine; Tuckerman, Laurette

    2017-11-01

    The frequency of certain limit cycles resulting from a Hopf bifurcation, such as the von Karman vortex street, can be predicted by linear stability analysis around their mean flows. Barkley (2006) has shown this to yield an eigenvalue whose real part is zero and whose imaginary part matches the nonlinear frequency. This property was named RZIF by Turton et al. (2015); moreover they found that the traveling waves (TW) of thermosolutal convection have the RZIF property. They explained this as a consequence of the fact that the temporal Fourier spectrum is dominated by the mean flow and first harmonic. We could therefore consider that only the first mode is important in the saturation of the mean flow as presented in the Self-Consistent Model (SCM) of Mantic-Lugo et al. (2014). We have implemented a full Newton's method to solve the SCM for thermosolutal convection. We show that while the RZIF property is satisfied far from the threshold, the SCM model reproduces the exact frequency only very close to the threshold. Thus, the nonlinear interaction of only the first mode with itself is insufficiently accurate to estimate the mean flow. Our next step will be to take into account higher harmonics and to apply this analysis to the standing waves, for which RZIF does not hold.

  1. Signal Adaptive System for Space/Spatial-Frequency Analysis

    Directory of Open Access Journals (Sweden)

    Veselin N. Ivanović

    2009-01-01

    Full Text Available This paper outlines the development of a multiple-clock-cycle implementation (MCI) of a signal adaptive two-dimensional (2D) system for space/spatial-frequency (S/SF) signal analysis. The design is based on a method, also proposed here, for improved S/SF representation of the analyzed 2D signals. The proposed MCI design optimizes critical design performance related to hardware complexity, making it a suitable system for real-time implementation on an integrated chip. Additionally, the design allows the implemented system to take a variable number of clock cycles (CLKs) during execution, using only the cycles necessary for the desired 2D Wigner distribution-like presentation of autoterms at different frequency-frequency points. This ability represents a major advantage of the proposed design, which helps to optimize the execution time and produce an improved, cross-terms-free S/SF signal representation. The design has been verified by a field-programmable gate array (FPGA) circuit design, capable of performing S/SF analysis of 2D signals in real time.

  2. High frequency, high time resolution time-to-digital converter employing passive resonating circuits.

    Science.gov (United States)

    Ripamonti, Giancarlo; Abba, Andrea; Geraci, Angelo

    2010-05-01

    A method for measuring time intervals accurate to the picosecond range is based on phase measurements of oscillating waveforms synchronous with their beginning and/or end. The oscillation is generated by triggering an LC resonant circuit, whose capacitance is precharged. By using high Q resonators and a final active quenching of the oscillation, it is possible to conjugate high time resolution and a small measurement time, which allows a high measurement rate. Methods for fast analysis of the data are considered and discussed with reference to computing resource requirements, speed, and accuracy. Experimental tests show the feasibility of the method and a time accuracy better than 4 ps rms. Methods aimed at further reducing hardware resources are finally discussed.

  3. High frequency, high time resolution time-to-digital converter employing passive resonating circuits

    International Nuclear Information System (INIS)

    Ripamonti, Giancarlo; Abba, Andrea; Geraci, Angelo

    2010-01-01

    A method for measuring time intervals accurate to the picosecond range is based on phase measurements of oscillating waveforms synchronous with their beginning and/or end. The oscillation is generated by triggering an LC resonant circuit, whose capacitance is precharged. By using high Q resonators and a final active quenching of the oscillation, it is possible to conjugate high time resolution and a small measurement time, which allows a high measurement rate. Methods for fast analysis of the data are considered and discussed with reference to computing resource requirements, speed, and accuracy. Experimental tests show the feasibility of the method and a time accuracy better than 4 ps rms. Methods aimed at further reducing hardware resources are finally discussed.
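
    The phase-measurement idea in the two records above can be sketched numerically: the start time of a triggered oscillation is recovered with sub-sample resolution from the phase of its fundamental, here estimated by projecting the sampled ring-down onto quadrature references at the known resonant frequency. This is not the authors' hardware algorithm, and the sampling rate, resonant frequency and decay constant are assumptions.

        import numpy as np

        fs = 5e9             # digitizer sample rate (assumed), Hz
        f0 = 200e6           # LC resonant frequency (assumed), Hz
        true_t0 = 1.2345e-9  # unknown start time to be recovered, s

        t = np.arange(2048) / fs
        # cosine burst starting at t0 with a slow exponential decay (high-Q resonator)
        x = np.where(t >= true_t0,
                     np.cos(2 * np.pi * f0 * (t - true_t0)) * np.exp(-(t - true_t0) / 2e-6),
                     0.0)

        # quadrature demodulation at f0, averaged over the record
        i_sum = np.sum(x * np.cos(2 * np.pi * f0 * t))
        q_sum = np.sum(x * np.sin(2 * np.pi * f0 * t))
        phase = np.arctan2(q_sum, i_sum)          # phase of x relative to cos(2*pi*f0*t)
        t0_est = phase / (2 * np.pi * f0)         # time offset within one oscillation period
        print(f"estimated start time: {t0_est * 1e12:.1f} ps (true {true_t0 * 1e12:.1f} ps)")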

  4. Regional Frequency and Uncertainty Analysis of Extreme Precipitation in Bangladesh

    Science.gov (United States)

    Mortuza, M. R.; Demissie, Y.; Li, H. Y.

    2014-12-01

    The increased frequency of extreme precipitation events, especially those with multiday durations, is responsible for recent urban floods and the associated significant losses of lives and infrastructure in Bangladesh. Reliable and routinely updated estimation of the frequency of occurrence of such extreme precipitation events is thus important for developing up-to-date hydraulic structures and stormwater drainage systems that can effectively minimize the risk from similar events in the future. In this study, we have updated the intensity-duration-frequency (IDF) curves for Bangladesh using daily precipitation data from 1961 to 2010 and quantified the associated uncertainties. Regional frequency analysis based on L-moments is applied to the 1-day, 2-day and 5-day annual maximum precipitation series because of its advantages over at-site estimation. The regional frequency approach pools information from climatologically similar sites to make reliable estimates of quantiles, given that the pooling group is homogeneous and of reasonable size. We have used the region-of-influence (ROI) approach, along with a homogeneity measure based on L-moments, to identify the homogeneous pooling groups for each site. Five 3-parameter distributions (Generalized Logistic, Generalized Extreme Value, Generalized Normal, Pearson Type III, and Generalized Pareto) are used for a thorough selection of appropriate models that fit the sample data. Uncertainties related to the selection of the distributions and to the historical data are quantified using Bayesian Model Averaging and balanced bootstrap approaches, respectively. The results from this study can be used to update the current design and management of hydraulic structures, as well as to explore spatio-temporal variations of extreme precipitation and the associated risk.
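
    The L-moment machinery used both for homogeneity screening and for fitting can be computed directly from the ordered sample via probability-weighted moments. The sketch below returns the first four sample L-moments and the ratios (L-CV, L-skewness, L-kurtosis) for one site; the annual-maximum series is synthetic, not Bangladeshi data.

        import numpy as np

        def sample_lmoments(data):
            """First four sample L-moments plus L-CV, L-skewness and L-kurtosis (a sketch)."""
            x = np.sort(np.asarray(data, dtype=float))
            n = len(x)
            j = np.arange(n)                       # 0-based ranks of the ordered sample
            b0 = x.mean()                          # unbiased probability-weighted moments
            b1 = np.sum(j * x) / (n * (n - 1))
            b2 = np.sum(j * (j - 1) * x) / (n * (n - 1) * (n - 2))
            b3 = np.sum(j * (j - 1) * (j - 2) * x) / (n * (n - 1) * (n - 2) * (n - 3))
            l1, l2 = b0, 2 * b1 - b0
            l3 = 6 * b2 - 6 * b1 + b0
            l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0
            return l1, l2, l2 / l1, l3 / l2, l4 / l2

        rng = np.random.default_rng(7)
        amax = rng.gumbel(loc=90, scale=25, size=50)   # synthetic 1-day annual maxima (mm)
        l1, l2, lcv, lskew, lkurt = sample_lmoments(amax)
        print(f"mean={l1:.1f}  L-CV={lcv:.3f}  L-skew={lskew:.3f}  L-kurt={lkurt:.3f}")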

  5. Cost Analysis of a Digital Health Care Model in Sweden.

    Science.gov (United States)

    Ekman, Björn

    2017-09-22

    Digital technologies in health care are expected to increase in scope and to affect ever more parts of the health care system. It is important to enhance the knowledge of whether new digital methods and innovations provide value for money compared with traditional models of care. The objective of the study was to evaluate whether a digital health care model for primary care is a less costly alternative compared with traditional in-office primary care in Sweden. Cost data for the two care models were collected and analyzed to obtain a measure in local currency per care contact. The comparison showed that the total economic cost of a digital consultation is 1960 Swedish krona (SEK) (SEK100 = US$11.29; February 2017) compared with SEK3348 for a traditional consultation at a health care clinic. Cost differences arose on both the provider side and on the user side. The digital health care model may be a less costly alternative to the traditional health care model. Depending on the rate of digital substitution, gross economic cost savings of between SEK1 billion and SEK10 billion per year could be realized if more digital consultations were made. Further studies are needed to validate the findings, assess the types of care most suitable for digital care, and also to obtain various quality-adjusted outcome measures.
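
    The gross-savings figure quoted above is straightforward arithmetic on the two per-contact costs; under those unit costs, the SEK 1-10 billion range corresponds to roughly 0.7-7.2 million substituted consultations per year. The substitution volumes below are illustrative assumptions.

        cost_digital = 1960       # SEK per digital consultation (from the abstract)
        cost_traditional = 3348   # SEK per in-office consultation (from the abstract)
        saving_per_contact = cost_traditional - cost_digital   # 1388 SEK

        for substituted_visits in (750_000, 3_000_000, 7_500_000):
            total = substituted_visits * saving_per_contact
            print(f"{substituted_visits:>9,} substituted visits -> {total / 1e9:.1f} billion SEK saved")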

  6. Time-frequency analysis of submerged synthetic jet

    Science.gov (United States)

    Kumar, Abhay; Saha, Arun K.; Panigrahi, P. K.

    2017-12-01

    Coherent structures transport a finite body of fluid mass through rolling, which plays an important role in heat transfer, boundary layer control, mixing, cooling, propulsion and other engineering applications. A synthetic jet in the form of a train of vortex rings, having coherent structures of different length scales, is expected to be useful in these applications. The propagation and sustainability of these coherent structures (vortex rings) in the downstream direction characterize the performance of the synthetic jet. In the present study, the velocity signal acquired using an S-type hot-film probe along the synthetic jet centerline has been taken for spectral analysis. One circular and three rectangular orifices of aspect ratio 1, 2 and 4, actuated at frequencies of 1, 6 and 18 Hz, have been used for creating different synthetic jets. Laser-induced fluorescence images are used to study the flow structures qualitatively and help in explaining the velocity signal for the detection of coherent structures. The study identifies four regions: the vortex rollup and suction region (X/Dh ≤ 3), the steadily translating region (X/Dh ≈ 3-8), the vortex breakup region (X/Dh ≈ 4-8) and the dissipation of small-scale vortices (X/Dh ≈ 8-15). The presence of coherent structures localized in the physical and temporal domains is analyzed for the characterization of the synthetic jet. Due to the pulsatile nature of the synthetic jet, analysis of the velocity time trace in the time, frequency and combined time-frequency domains assists in characterizing the signatures of the coherent structures. It has been observed that the maximum energy is in the first harmonic of the actuation frequency, which decreases slowly in the downstream direction at 6 Hz compared to 1 and 18 Hz actuation.

  7. Quantifying biodiversity using digital cameras and automated image analysis.

    Science.gov (United States)

    Roadknight, C. M.; Rose, R. J.; Barber, M. L.; Price, M. C.; Marshall, I. W.

    2009-04-01

    Monitoring the effects on biodiversity of extensive grazing in complex semi-natural habitats is labour intensive. There are also concerns about the standardization of semi-quantitative data collection. We have chosen to focus initially on automating the most time-consuming aspect - the image analysis. The advent of cheaper and more sophisticated digital camera technology has led to a sudden increase in the number of habitat monitoring images and information that is being collected. We report on the use of automated trail cameras (designed for the game hunting market) to continuously capture images of grazer activity in a variety of habitats at Moor House National Nature Reserve, which is situated in the North of England at an average altitude of over 600m. Rainfall is high, and in most areas the soil consists of deep peat (1m to 3m), populated by a mix of heather, mosses and sedges. The cameras have been continuously in operation over a 6 month period; daylight images are in full colour and night images (IR flash) are black and white. We have developed artificial intelligence based methods to assist in the analysis of the large number of images collected, generating alert states for new or unusual image conditions. This paper describes the data collection techniques, outlines the quantitative and qualitative data collected and proposes online and offline systems that can reduce the manpower overheads and increase focus on important subsets in the collected data. By converting digital image data into statistical composite data, it can be handled in a similar way to other biodiversity statistics, thus improving the scalability of monitoring experiments. Unsupervised feature detection methods and supervised neural methods were tested and offered solutions to simplifying the process. Accurate (85 to 95%) categorization of faunal content can be obtained, requiring human intervention for only those images containing rare animals or unusual (undecidable) conditions, and

  8. Summary of the preparation of methodology for digital system reliability analysis for PSA purposes

    International Nuclear Information System (INIS)

    Hustak, S.; Babic, P.

    2001-12-01

    The report is structured as follows: Specific features of and requirements for the digital part of NPP Instrumentation and Control (I and C) systems (Computer-controlled digital technologies and systems of the NPP I and C system; Specific types of digital technology failures and preventive provisions; Reliability requirements for the digital parts of I and C systems; Safety requirements for the digital parts of I and C systems; Defence-in-depth). Qualitative analyses of NPP I and C system reliability and safety (Introductory system analysis; Qualitative requirements for and proof of NPP I and C system reliability and safety). Quantitative reliability analyses of the digital parts of I and C systems (Selection of a suitable quantitative measure of digital system reliability; Selected qualitative and quantitative findings regarding digital system reliability; Use of relations among the occurrences of the various types of failure). Mathematical section in support of the calculation of the various types of indices (Boolean reliability models, Markovian reliability models). Example of digital system analysis (Description of a selected protective function and the relevant digital part of the I and C system; Functional chain examined, its components and fault tree). (P.A.)

  9. LOFT PSMG Speed Control System frequency response analysis

    International Nuclear Information System (INIS)

    Hansen, H.R.

    1977-01-01

    An analysis was done to gain insight into the shape of the open-loop frequency response of the PSMG Speed Control System. The results of the analysis were used as a guide to groom the proportional band and reset time settings of the two-mode (proportional-integral) controller in the speed control system. The analysis shows that when an actuator with a timing of 90 degrees per 60 seconds is installed in the system, the proportional band and reset time should be 316% and 1 minute, whereas when grooming the system a proportional band and reset time of 150% and 1.5 minutes were found to be appropriate. The closeness of the settings shows that even though a linear model was used to describe the non-linear PSMG Speed Control System, it was accurate enough to be used as a guide to groom the proportional band and reset time settings
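
    The grooming exercise can be mimicked numerically. The Python sketch below evaluates the open-loop frequency response of a PI (two-mode) controller in series with a plant model; the plant transfer function is a hypothetical stand-in rather than the actual PSMG drive model from the report, and the conversion from proportional band to gain (Kp = 100/PB) is the usual convention.

    ```python
    import numpy as np

    def pi_open_loop(freq_hz, prop_band_pct, reset_min, plant):
        """Open-loop response G(jw) of a PI controller in series with a plant."""
        s = 2j * np.pi * np.asarray(freq_hz)
        Kp = 100.0 / prop_band_pct        # controller gain from proportional band
        Ti = reset_min * 60.0             # reset (integral) time in seconds
        return Kp * (1.0 + 1.0 / (Ti * s)) * plant(s)

    # hypothetical plant: integrating actuator (90 deg per 60 s) plus a first-order lag
    plant = lambda s: (90.0 / 60.0) / (s * (1.0 + 5.0 * s))

    f = np.logspace(-4, 0, 400)                       # 0.0001 Hz to 1 Hz
    G = pi_open_loop(f, prop_band_pct=316.0, reset_min=1.0, plant=plant)
    gain_db, phase_deg = 20 * np.log10(np.abs(G)), np.angle(G, deg=True)
    i = np.argmin(np.abs(gain_db))                    # gain-crossover index
    print("crossover ~%.4f Hz, phase margin ~%.0f deg" % (f[i], 180 + phase_deg[i]))
    ```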

  10. An electromagnetic signals monitoring and analysis wireless platform employing personal digital assistants and pattern analysis techniques

    Science.gov (United States)

    Ninos, K.; Georgiadis, P.; Cavouras, D.; Nomicos, C.

    2010-05-01

    This study presents the design and development of a mobile wireless platform to be used for monitoring and analysis of seismic events and related electromagnetic (EM) signals, employing Personal Digital Assistants (PDAs). A prototype custom-developed application was deployed on a 3G-enabled PDA that could connect to the FTP server of the Institute of Geodynamics of the National Observatory of Athens and receive and display EM signals at 4 receiver frequencies (3 kHz (E-W, N-S), 10 kHz (E-W, N-S), 41 MHz and 46 MHz). Signals may originate from any one of the 16 field stations located around the Greek territory. Employing continuous recordings of EM signals gathered from January 2003 till December 2007, a Support Vector Machine (SVM)-based classification system was designed to distinguish EM precursor signals within a noisy background. EM signals corresponding to recordings preceding major seismic events (Ms≥5R) were segmented by an experienced scientist, and five features (mean, variance, skewness, kurtosis, and a wavelet-based feature) derived from the EM signals were calculated. These features were used to train the SVM-based classification scheme. The performance of the system was evaluated by the exhaustive search and leave-one-out methods, giving 87.2% overall classification accuracy in correctly identifying EM precursor signals within a noisy background when employing all calculated features. Due to the insufficient processing power of the PDAs, this training task was performed on a typical desktop computer. The optimally trained SVM classifier was then integrated into the PDA-based application, rendering the platform capable of discriminating between EM precursor signals and noise. The system's efficiency was evaluated by an expert who reviewed (1) multiple EM signals up to 18 days prior to corresponding past seismic events, and (2) the possible EM activity of a specific region, employing the trained SVM classifier. Additionally, the proposed architecture can form a
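
    A compact illustration of this kind of feature-based SVM classification (using scikit-learn rather than the authors' original implementation) is sketched below; the synthetic segments, the stand-in "wavelet" feature (first-level Haar detail energy) and all parameter values are assumptions for demonstration only.

    ```python
    import numpy as np
    from scipy.stats import skew, kurtosis
    from sklearn.svm import SVC
    from sklearn.model_selection import LeaveOneOut, cross_val_score

    def segment_features(seg):
        """Mean, variance, skewness, kurtosis and a crude wavelet-like feature
        (energy of the first-level Haar detail) for one EM-signal segment."""
        even, odd = seg[0::2], seg[1::2]
        detail = (even - odd) / np.sqrt(2.0)
        return [seg.mean(), seg.var(), skew(seg), kurtosis(seg), np.sum(detail ** 2)]

    # hypothetical training set: label 1 = precursor-like segment, 0 = background noise
    rng = np.random.default_rng(0)
    labels = rng.integers(0, 2, 60)
    segments = [rng.normal(0.0, 1.0, 512) + lab * 0.8 * np.sin(np.arange(512) / 5.0)
                for lab in labels]

    X = np.array([segment_features(s) for s in segments])
    y = labels

    clf = SVC(kernel="rbf", gamma="scale")
    acc = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()   # leave-one-out, as in the study
    print("leave-one-out accuracy: %.1f %%" % (100.0 * acc))
    ```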

  11. A Development of Nonstationary Regional Frequency Analysis Model with Large-scale Climate Information: Its Application to Korean Watershed

    Science.gov (United States)

    Kim, Jin-Young; Kwon, Hyun-Han; Kim, Hung-Soo

    2015-04-01

    The existing regional frequency analysis has the disadvantage that it is difficult to consider geographical characteristics in estimating areal rainfall. In this regard, this study aims to develop a hierarchical Bayesian model-based nonstationary regional frequency analysis in which spatial patterns of the design rainfall are explicitly incorporated through geographical information (e.g., latitude, longitude and altitude). This study assumes that the parameters of the Gumbel (or GEV) distribution are a function of geographical characteristics within a general linear regression framework. Posterior distributions of the regression parameters are estimated by the Bayesian Markov Chain Monte Carlo (MCMC) method, and the identified functional relationship is used to spatially interpolate the parameters of the distributions using digital elevation models (DEM) as inputs. The proposed model is applied to derive design rainfalls over the entire Han River watershed. It was found that the proposed Bayesian regional frequency analysis model showed results similar to those of L-moment based regional frequency analysis. In addition, the model showed an advantage in terms of quantifying the uncertainty of the design rainfall and estimating the areal rainfall considering geographical information. Finally, a comprehensive discussion on design rainfall in the context of nonstationarity is presented. KEYWORDS: Regional frequency analysis, Nonstationary, Spatial information, Bayesian. Acknowledgement: This research was supported by a grant (14AWMP-B082564-01) from the Advanced Water Management Research Program funded by the Ministry of Land, Infrastructure and Transport of the Korean government.

  12. Automatic analysis of ciliary beat frequency using optical flow

    Science.gov (United States)

    Figl, Michael; Lechner, Manuel; Werther, Tobias; Horak, Fritz; Hummel, Johann; Birkfellner, Wolfgang

    2012-02-01

    Ciliary beat frequency (CBF) can be a useful parameter for the diagnosis of several diseases, e.g., primary ciliary dyskinesia (PCD). CBF computation is usually done by manual evaluation of high-speed video sequences, a tedious, observer-dependent, and not very accurate procedure. We used OpenCV's pyramidal implementation of the Lucas-Kanade algorithm for optical flow computation and applied it to selected objects to follow their movements. The objects were chosen by their contrast, applying the corner detection of Shi and Tomasi. Discrimination between background/noise and cilia by a frequency histogram allowed the CBF to be computed. Frequency analysis was done using the Fourier transform in MATLAB. The correct number of Fourier summands was found from the slope of an approximation curve. The method proved usable to distinguish between healthy and diseased samples. However, there remain difficulties in automatically identifying the cilia, and in finding enough high-contrast cilia in the image. Furthermore, some of the higher-contrast cilia are lost (and sometimes recovered) by the method; an easy way to identify the correct sub-path of a point's trajectory has yet to be found for cases where the slope method does not work.
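
    A simplified Python/OpenCV sketch of this pipeline (Shi-Tomasi corners, pyramidal Lucas-Kanade tracking, then a Fourier-based frequency estimate) is shown below; it uses NumPy's FFT instead of MATLAB, keeps all initially detected points even if tracking fails for some, and all parameter values are illustrative assumptions.

    ```python
    import cv2
    import numpy as np

    def ciliary_beat_frequency(frames, fps):
        """Estimate the dominant beat frequency (Hz) from a high-speed grey-level
        sequence. frames: list of 2-D uint8 images; fps: acquisition rate in Hz."""
        prev = frames[0]
        # high-contrast points (Shi-Tomasi corners), ideally lying on cilia
        pts = cv2.goodFeaturesToTrack(prev, maxCorners=100, qualityLevel=0.01,
                                      minDistance=5)
        tracks = [pts.reshape(-1, 2)]
        for frame in frames[1:]:
            # pyramidal Lucas-Kanade step; lost points are kept uncorrected for brevity
            nxt, status, err = cv2.calcOpticalFlowPyrLK(prev, frame, pts, None,
                                                        winSize=(15, 15), maxLevel=3)
            pts, prev = nxt, frame
            tracks.append(nxt.reshape(-1, 2))
        tracks = np.stack(tracks)                               # (n_frames, n_points, 2)
        disp = tracks[:, :, 1] - tracks[:, :, 1].mean(axis=0)   # vertical displacement
        spectrum = np.abs(np.fft.rfft(disp, axis=0)).mean(axis=1)
        spectrum[0] = 0.0                                       # discard the DC bin
        freqs = np.fft.rfftfreq(len(frames), d=1.0 / fps)
        return freqs[np.argmax(spectrum)]
    ```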

  13. Nonlinear analysis for dual-frequency concurrent energy harvesting

    Science.gov (United States)

    Yan, Zhimiao; Lei, Hong; Tan, Ting; Sun, Weipeng; Huang, Wenhu

    2018-05-01

    The dual-frequency responses of the hybrid energy harvester undergoing base excitation and galloping were analyzed numerically. In this work, an approximate dual-frequency analytical method is proposed for the nonlinear analysis of such a system. To obtain the approximate analytical solutions of the full coupled distributed-parameter model, the forcing interactions are first neglected. Then, the electromechanically decoupled governing equation is developed using the equivalent structure method. The hybrid mechanical response is finally separated into the self-excited and forced responses for deriving the analytical solutions, which are confirmed by numerical simulations of the full coupled model. The forced response has a strong impact on the self-excited response. The boundary of the Hopf bifurcation is analytically determined by the onset wind speed for galloping, which is linearly increased by the electrical damping. A quenching phenomenon appears when the increasing base excitation suppresses the galloping. The theoretical quenching boundary depends on the forced mode velocity. The quenching region increases with the base acceleration and electrical damping, but decreases with the wind speed. In contrast to the base-excitation-alone case, the existence of the aerodynamic force protects the hybrid energy harvester at resonance from damage caused by excessively large displacements. From the viewpoint of the harvested power, the hybrid system surpasses the base-excitation-alone system or the galloping-alone system. This study advances our knowledge of the intrinsic nonlinear dynamics of the dual-frequency energy harvesting system by taking advantage of the analytical solutions.

  14. Field nonuniformity correction for quantitative analysis of digitized mammograms

    International Nuclear Information System (INIS)

    Pawluczyk, Olga; Yaffe, Martin J.

    2001-01-01

    Several factors, including the heel effect, variation in distance from the x-ray source to points in the image and path obliquity contribute to the signal nonuniformity of mammograms. To best use digitized mammograms for quantitative image analysis, these field non-uniformities must be corrected. An empirically based correction method, which uses a bowl-shaped calibration phantom, has been developed. Due to the annular spherical shape of the phantom, its attenuation is constant over the entire image. Remaining nonuniformities are due only to the heel and inverse square effects as well as the variable path through the beam filter, compression plate and image receptor. In logarithmic space, a normalized image of the phantom can be added to mammograms to correct for these effects. Then, an analytical correction for path obliquity in the breast can be applied to the images. It was found that the correction causes the errors associated with field nonuniformity to be reduced from 14% to 2% for a 4 cm block of material corresponding to a combination of 50% fibroglandular and 50% fatty breast tissue. A repeatability study has been conducted to show that in regions as far as 20 cm away from the chest wall, variations due to imaging conditions and phantom alignment contribute to <2% of overall corrected signal
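
    The additive log-space correction described above can be sketched in a few lines of Python; the choice of reference point for normalizing the phantom image is an assumption made here for illustration, not the authors' calibration procedure.

    ```python
    import numpy as np

    def correct_field_nonuniformity(mammogram, phantom, eps=1e-6):
        """Add a normalized log-space image of the constant-attenuation phantom to a
        digitized mammogram to flatten heel-effect and inverse-square variations.
        Both inputs are 2-D arrays acquired with the same technique factors."""
        log_mammo = np.log(mammogram.astype(float) + eps)
        log_phantom = np.log(phantom.astype(float) + eps)
        # normalize so the correction vanishes at a chosen reference pixel
        # (here: mid-point of the chest-wall edge, an illustrative choice)
        ref = log_phantom[0, log_phantom.shape[1] // 2]
        return log_mammo + (ref - log_phantom)
    ```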

  15. Differential Power Analysis as a digital forensic tool.

    Science.gov (United States)

    Souvignet, T; Frinken, J

    2013-07-10

    Electronic payment fraud is considered a serious international crime by Europol. An important part of this fraud comes from payment card data skimming. This type of fraud consists of an illegal acquisition of payment card details when a user is withdrawing cash at an automated teller machine (ATM) or paying at a point of sale (POS). Modern skimming devices, also known as skimmers, use secure crypto-algorithms (e.g. Advanced Encryption Standard (AES)) to protect skimmed data stored within their memory. In order to provide digital evidence in criminal cases involving skimmers, law enforcement agencies (LEAs) must retrieve the plaintext skimmed data, generally without having knowledge of the secret key. This article proposes an alternative to the current solution at the Bundeskriminalamt (BKA) to reveal the secret key. The proposed solution is non-invasive, based on Power Analysis Attack (PAA). This article first describes the structure and the behaviour of an AES skimmer, followed by the proposal of the full operational PAA process, from power measurements to attack computation. Finally, it presents results obtained in several cases, explaining the latest improvements and providing some ideas for further developments. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
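
    For orientation, a generic correlation-based power analysis of one AES key byte is sketched below; it is not the BKA procedure described in the article, the S-box table is passed in rather than reproduced, and the trace and plaintext arrays are assumed to come from the examiner's measurement setup.

    ```python
    import numpy as np

    HW = np.array([bin(b).count("1") for b in range(256)])   # Hamming-weight table

    def cpa_recover_key_byte(traces, plaintexts, sbox, byte_index=0):
        """Correlation power analysis for one AES key byte.
        traces: (n_traces, n_samples) power measurements
        plaintexts: (n_traces, 16) known plaintext bytes (integer array)
        sbox: 256-entry AES S-box as a NumPy integer array (not reproduced here)."""
        pt = plaintexts[:, byte_index]
        traces_c = traces - traces.mean(axis=0)
        best_corr = np.zeros(256)
        for guess in range(256):
            # hypothetical leakage: Hamming weight of the first-round S-box output
            hypo = HW[sbox[pt ^ guess]].astype(float)
            hypo_c = hypo - hypo.mean()
            num = hypo_c @ traces_c
            den = np.sqrt((hypo_c @ hypo_c) * np.sum(traces_c ** 2, axis=0))
            best_corr[guess] = np.max(np.abs(num / den))     # best sample for this guess
        return int(np.argmax(best_corr))                      # most likely key byte
    ```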

  16. Coaxial waveguide mode reconstruction and analysis with THz digital holography.

    Science.gov (United States)

    Wang, Xinke; Xiong, Wei; Sun, Wenfeng; Zhang, Yan

    2012-03-26

    Terahertz (THz) digital holography is employed to investigate the properties of waveguides. By using a THz digital holographic imaging system, the propagation modes of a metallic coaxial waveguide are measured and the mode patterns are restored with the inverse Fresnel diffraction algorithm. The experimental results show that the THz propagation mode inside the waveguide is a combination of the four modes TE₁₁, TE₁₂, TM₁₁, and TM₁₂, which are in good agreement with the simulation results. In this work, THz digital holography demonstrates its strong potential as a platform for waveguide mode characterization. The experimental findings provide a valuable reference for the design of THz waveguides.
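
    A minimal sketch of the kind of inverse Fresnel (angular-spectrum) step used to refocus a recorded field is given below; the paraxial transfer function is standard, but the sampling values and the paper's exact reconstruction chain (reference-wave handling, filtering) are not reproduced.

    ```python
    import numpy as np

    def fresnel_backpropagate(field, wavelength, dx, z):
        """Numerically propagate a recorded complex field backwards by a distance z
        (all lengths in the same units) using the paraxial Fresnel transfer function."""
        ny, nx = field.shape
        fx = np.fft.fftfreq(nx, d=dx)
        fy = np.fft.fftfreq(ny, d=dx)
        FX, FY = np.meshgrid(fx, fy)
        k = 2.0 * np.pi / wavelength
        zb = -z                                # negative distance reverses the propagation
        H = np.exp(1j * k * zb) * np.exp(-1j * np.pi * wavelength * zb * (FX**2 + FY**2))
        return np.fft.ifft2(np.fft.fft2(field) * H)

    # e.g. a 0.3 THz field (wavelength ~1 mm) sampled at 0.2 mm, refocused by 50 mm:
    # mode_pattern = fresnel_backpropagate(recorded_field, 1.0, 0.2, 50.0)
    ```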

  17. Suitability review of FMEA and reliability analysis for digital plant protection system and digital engineered safety features actuation system

    Energy Technology Data Exchange (ETDEWEB)

    Kim, I. S.; Kim, T. K.; Kim, M. C.; Kim, B. S.; Hwang, S. W.; Ryu, K. C. [Hanyang Univ., Seoul (Korea, Republic of)

    2000-11-15

    Of the many items that should be checked out during a review stage of the licensing application for the I and C system of Ulchin 5 and 6 units, this report relates to a suitability review of the reliability analysis of the Digital Plant Protection System (DPPS) and Digital Engineered Safety Features Actuation System (DESFAS). In the reliability analysis performed by the system designer, ABB-CE, fault tree analysis was used as the main method along with Failure Modes and Effects Analysis (FMEA). However, the present regulatory technique does not allow the system reliability analysis and its results to be appropriately evaluated. Hence, this study was carried out focusing on the following four items: development of general review items by which to check the validity of a reliability analysis, and the subsequent review of suitability of the reliability analysis for Ulchin 5 and 6 DPPS and DESFAS; development of detailed review items by which to check the validity of an FMEA, and the subsequent review of suitability of the FMEA for Ulchin 5 and 6 DPPS and DESFAS; development of detailed review items by which to check the validity of a fault tree analysis, and the subsequent review of suitability of the fault tree for Ulchin 5 and 6 DPPS and DESFAS; an integrated review of the safety and reliability of the Ulchin 5 and 6 DPPS and DESFAS based on the results of the various reviews above and also of a reliability comparison between the digital systems and the comparable analog systems, i.e., an analog Plant Protection System (PPS) and an analog Engineered Safety Features Actuation System (ESFAS). According to the review mentioned above, the reliability analysis of the Ulchin 5 and 6 DPPS and DESFAS generally satisfies the review requirements. However, some shortcomings of the analysis were identified in our review, such that the assumed test periods for several pieces of equipment were not properly incorporated in the analysis, and failures of some equipment were not included in the

  18. Frequency-dependent springs in the seismic analysis of structures

    International Nuclear Information System (INIS)

    Tyapin, A.G.

    2005-01-01

    This paper presents a two-step algorithm for the seismic analysis of a structure resting on a rigid embedded basement. A frequency-domain analysis of SSI is carried out in the second step for a platform model with a special 'soil spring', which is complex, frequency-dependent, wave-dependent and non-balanced. Theory is presented to obtain the parameters of the soil spring in the first step of the analysis, performed without the structure (only the geometry of the basement is used) using the well-known SASSI code (Lysmer et al., 1981) or in some other way. In the second step, the soil spring is included in the SASSI model as a special finite element. Thus, the first step saves computer resources on the structure, and the second step saves resources on the soil. The soil spring is the most general form for a linear SSI analysis: conventional springs and dashpots can easily be represented in such a format. Thus, the presented approach enables studying the impact of various factors (such as the embedment depth and soil-structure separation, the off-diagonal stiffness, various formulas for stiffness and damping, etc.) on the soil spring parameters. These parameters can be studied separately from the structure itself. As an example, a study of the horizontal soil mesh size is presented. A lumped soil spring may be used in the second step to obtain structural response spectra. To obtain stresses, the complex stiffness may be distributed over the basement slab and embedded walls. The proposed approach may be considered an alternative to the impedance method (see ASCE 4-98). (authors)

  19. Frequency Synthesiser

    NARCIS (Netherlands)

    Drago, Salvatore; Sebastiano, Fabio; Leenaerts, Dominicus M.W.; Breems, Lucien J.; Nauta, Bram

    2016-01-01

    A low power frequency synthesiser circuit (30) for a radio transceiver, the synthesiser circuit comprising: a digital controlled oscillator configured to generate an output signal having a frequency controlled by an input digital control word (DCW); a feedback loop connected between an output and an

  20. Frequency synthesiser

    NARCIS (Netherlands)

    Drago, S.; Sebastiano, Fabio; Leenaerts, Dominicus Martinus Wilhelmus; Breems, Lucien Johannes; Nauta, Bram

    2010-01-01

    A low power frequency synthesiser circuit (30) for a radio transceiver, the synthesiser circuit comprising: a digital controlled oscillator configured to generate an output signal having a frequency controlled by an input digital control word (DCW); a feedback loop connected between an output and an

  1. A Method and Support Tool for the Analysis of Human Error Hazards in Digital Devices

    International Nuclear Information System (INIS)

    Lee, Yong Hee; Kim, Seon Soo; Lee, Yong Hee

    2012-01-01

    In recent years, many nuclear power plants have adopted modern digital I and C technologies since they are expected to significantly improve their performance and safety. Modern digital technologies were expected to significantly improve both the economic efficiency and the safety of nuclear power plants. However, the introduction of an advanced main control room (MCR) is accompanied by many changes in form and features introduced by the new digital devices. User-friendly displays and new features in digital devices are not, by themselves, enough to prevent human errors in nuclear power plants (NPPs). It may be an urgent matter to identify the human error potentials due to digital devices, and their detailed mechanisms, so that they can be considered during the design of digital devices and their interfaces. The characteristics of digital technologies and devices may give many opportunities for interface management, and digital devices can be integrated into a compact single workstation in an advanced MCR, such that workers can operate the plant with minimum burden under any operating condition. However, these devices may also introduce new types of human errors, and thus we need a means to evaluate and prevent such errors, especially within digital devices for NPPs. This research suggests a new method named HEA-BIS (Human Error Analysis based on Interaction Segment) to confirm and detect human errors associated with digital devices. This method can be facilitated by support tools and used to ensure safety when applying digital devices in NPPs

  2. Provider attributes correlation analysis to their referral frequency and awards.

    Science.gov (United States)

    Wiley, Matthew T; Rivas, Ryan L; Hristidis, Vagelis

    2016-03-14

    There has been a recent growth in health provider search portals, where patients specify filters, such as specialty or insurance, and providers are ranked by patient ratings or other attributes. Previous work has identified attributes associated with a provider's quality through user surveys. Other work supports that intuitive quality-indicating attributes are associated with a provider's quality. We adopt a data-driven approach to study how quality indicators of providers are associated with a rich set of attributes including medical school, graduation year, procedures, fellowships, patient reviews, location, and technology usage. In this work, we only consider providers as individuals (e.g., general practitioners) and not organizations (e.g., hospitals). As quality indicators, we consider the referral frequency of a provider and a peer-nominated quality designation. We combined data from the Centers for Medicare and Medicaid Services (CMS) and several provider rating web sites to perform our analysis. Our data-driven analysis identified several attributes that correlate with, and are discriminative of, referral volume and peer-nominated awards. In particular, our results consistently demonstrate that these attributes vary by locality and that the frequency of an attribute is more important than its value (e.g., the number of patient reviews or hospital affiliations are more important than the average review rating or the ranking of the hospital affiliations, respectively). We demonstrate that it is possible to build accurate classifiers for referral frequency and quality designation, with accuracies over 85%. Our findings show that a one-size-fits-all approach to ranking providers is inadequate and that provider search portals should calibrate their ranking function based on location and specialty. Further, traditional filters of provider search portals should be reconsidered, and patients should be aware of existing pitfalls with these filters and educated on local

  3. The collaborative writing in @Elhombredetweed. A pragmatic analysis for the digital literature studies

    Directory of Open Access Journals (Sweden)

    Luis Felipe González Gutiérrez

    2018-05-01

    Full Text Available This article discusses the preliminary results of the analysis of the Twitter account @Elhombredetweed, drawing on contributions from digital literature, specifically digital poetry studies, social constructionism and cultural psychology. These results indicate the importance of collaborative writing in current studies in the digital humanities, especially in the field of digital literature. The methodology of the study is centered on poetic inquiry, as a qualitative approach. For the tweet analysis we worked with the social network analysis software Netlytic. The results indicate two stories in the account: the literary story-object of the Mexican writer Mauricio Montiel (the account author) and the series of projects and narrative sequences of the followers of @Elhombredetweed, which constitutes an example of transmedia narrative and shows the impact of social networks on the collective construction of stories and the formation of digital subjectivities, through the use of ICT as a potential for an online reality.

  4. Low energy booster radio frequency cavity structural analysis

    International Nuclear Information System (INIS)

    Jones, K.

    1994-01-01

    The structural design of the Superconducting Super Collider Low Energy Booster (LEB) Radio Frequency (RF) Cavity is unique. The cavity is made of three different materials which all contribute to its structural strength while at the same time providing a good medium for magnetic properties. Its outer conductor is made of thin walled stainless steel which is later copper plated to reduce the electrical losses. Its tuner housing is made of a fiber reinforced composite laminate, similar to G10, glued to stainless steel plating. The stainless steel of the tuner is slotted to significantly diminish the magnetically-induced eddy currents. The composite laminate is bonded to the stainless steel to restore the structural strength that was lost in slotting. The composite laminate is also a barrier against leakage of the pressurized internal ferrite coolant fluid. The cavity's inner conductor, made of copper and stainless steel, is subjected to high heat loads and must be liquid cooled. The requirements of the Cavity are very stringent and driven primarily by deflection, natural frequency and temperature. Therefore, very intricate finite element analysis was used to complement conventional hand analysis in the design of the cavity. Structural testing of the assembled prototype cavity is planned to demonstrate the compliance of the cavity design with all of its requirements

  5. Low energy booster radio frequency cavity structural analysis

    International Nuclear Information System (INIS)

    Jones, K.

    1993-04-01

    The structural design of the Superconducting Super Collider Low Energy Booster (LEB) Radio Frequency (RF) Cavity is unique. The cavity is made of three different materials which all contribute to its structural strength while at the same time providing a good medium for magnetic properties. Its outer conductor is made of thin walled stainless steel which is later copper plated to reduce the electrical losses. Its tuner housing is made of a fiber reinforced composite laminate, similar to G10, glued to stainless steel plating. The stainless steel of the tuner is slotted to significantly diminish the magnetically-induced eddy currents. The composite laminate is bonded to the stainless steel to restore the structural strength that was lost in slotting. The composite laminate is also a barrier against leakage of the pressurized internal ferrite coolant fluid. The cavity's inner conductor, made of copper and stainless steel, is subjected to high heat loads and must be liquid cooled. The requirements of the Cavity are very stringent and driven primarily by deflection, natural frequency and temperature. Therefore, very intricate finite element analysis was used to complement conventional hand analysis in the design of the cavity. Structural testing of the assembled prototype cavity is planned to demonstrate the compliance of the cavity design with all of its requirements

  6. Time Frequency Analysis of Spacecraft Propellant Tank Spinning Slosh

    Science.gov (United States)

    Green, Steven T.; Burkey, Russell C.; Sudermann, James

    2010-01-01

    Many spacecraft are designed to spin about an axis along the flight path as a means of stabilizing the attitude of the spacecraft via gyroscopic stiffness. Because of the assembly requirements of the spacecraft and the launch vehicle, these spacecraft often spin about an axis corresponding to a minor moment of inertia. In such a case, any perturbation of the spin axis will cause sloshing motions in the liquid propellant tanks that will eventually dissipate enough kinetic energy to cause the spin axis nutation (wobble) to grow further. This spinning slosh and resultant nutation growth is a primary design problem of spinning spacecraft and one that is not easily solved by analysis or simulation only. Testing remains the surest way to address spacecraft nutation growth. This paper describes a test method and data analysis technique that reveal the resonant frequency and damping behavior of liquid motions in a spinning tank. Slosh resonant frequency and damping characteristics are necessary inputs to any accurate numerical dynamic simulation of the spacecraft.

  7. Software for analysis of waveforms acquired by digital Doppler broadening spectrometer

    International Nuclear Information System (INIS)

    Vlcek, M; Čížek, J; Procházka, I

    2013-01-01

    A high-resolution digital spectrometer for coincidence measurement of the Doppler broadening of positron annihilation radiation was recently developed and tested. In this spectrometer, pulses from high-purity Ge (HPGe) detectors are sampled in real time by fast digitizers and subsequently analyzed off-line by software. We present a description of the software routines used for pulse shape analysis in two spectrometer configurations: (i) a semi-digital setup in which detector pulses shaped in spectroscopic amplifiers (SAs) are digitized; (ii) a pure digital setup in which pulses from the detector pre-amplifiers are digitized directly. The software developed in this work will be freely available in the form of source code and pre-compiled binaries.

  8. A systematic review and meta-analysis of teachers’ development of digital literacy

    DEFF Research Database (Denmark)

    Khalid, Md. Saifuddin; Slættalíð, Tóri; Parveen, Mahmuda

    2015-01-01

    Teachers’ development of digital literacy (DL) is gaining importance with the increase in the integration and adoption of information and communication technologies in educational contexts. The focus has been predominantly on students and not much on teachers, who require greater attention due...... to rapid transformation of both school systems and digital systems’ applications. The goal of this systematic literature review is to draw attention of researchers, policy-makers, and practitioners associated with education systems for considering ‘digital literacy for the professional development...... for the qualitative analysis. This paper reports on three main categories: (a) definition of digital literacy, (b) development of digital literacy of pre-service and in-service teachers and (c) models for the development and evaluation of digital literacy. The general definitions of DL include the elements...

  9. Digital Game-Based Learning for K-12 Mathematics Education: A Meta-Analysis

    Science.gov (United States)

    Byun, JaeHwan; Joung, Eunmi

    2018-01-01

    Digital games (e.g., video games or computer games) have been reported as an effective educational method that can improve students' motivation and performance in mathematics education. This meta-analysis study (a) investigates the current trend of digital game-based learning (DGBL) by reviewing the research studies on the use of DGBL for…

  10. Vibro-Shock Dynamics Analysis of a Tandem Low Frequency Resonator—High Frequency Piezoelectric Energy Harvester

    Directory of Open Access Journals (Sweden)

    Darius Žižys

    2017-04-01

    Full Text Available Frequency up-conversion is a promising technique for energy harvesting in low frequency environments. In this approach, abundantly available environmental motion energy is absorbed by a Low Frequency Resonator (LFR which transfers it to a high frequency Piezoelectric Vibration Energy Harvester (PVEH via impact or magnetic coupling. As a result, a decaying alternating output signal is produced, that can later be collected using a battery or be transferred directly to the electric load. The paper reports an impact-coupled frequency up-converting tandem setup with different LFR to PVEH natural frequency ratios and varying contact point location along the length of the harvester. RMS power output of different frequency up-converting tandems with optimal resistive values was found from the transient analysis revealing a strong relation between power output and LFR-PVEH natural frequency ratio as well as impact point location. Simulations revealed that higher power output is obtained from a higher natural frequency ratio between LFR and PVEH, an increase of power output by one order of magnitude for a doubled natural frequency ratio and up to 150% difference in power output from different impact point locations. The theoretical results were experimentally verified.

  11. Feed particle size evaluation: conventional approach versus digital holography based image analysis

    Directory of Open Access Journals (Sweden)

    Vittorio Dell’Orto

    2010-01-01

    Full Text Available The aim of this study was to evaluate the application of an image analysis approach based on digital holography for determining particle size, in comparison with the sieve shaker (sieving) method as the reference method. For this purpose, ground corn meal was analyzed by a Retsch VS 1000 sieve shaker and by the image analysis approach based on digital holography. Particle sizes from digital holography were compared with the results obtained by sieve analysis for each size class using a cumulative distribution plot. The comparison between particle size values obtained by the sieving method and by image analysis indicated that the values were comparable in terms of particle size information, introducing a potential application for digital holography and image analysis in the feed industry.

  12. Ambiguity Function Analysis and Processing for Passive Radar Based on CDR Digital Audio Broadcasting

    Directory of Open Access Journals (Sweden)

    Zhang Qiang

    2015-01-01

    Full Text Available China Digital Radio (CDR) broadcasting is a new digital audio broadcasting standard in the FM band (87–108 MHz) based on our research and development efforts. It is compatible with the frequency spectrum of analog FM radio and satisfies the requirements for a smooth transition from analog to digital FM broadcasting in China. This paper focuses on the signal characteristics and processing methods of radio-based passive radar. The signal characteristics and ambiguity function of a passive radar illumination source are analyzed. The adverse effects on target detection of the side peaks caused by the cyclic prefix, the Doppler ambiguity strips caused by signal synchronization, and the range side peaks resulting from the discontinuous signal spectrum are then studied. Finally, methods for suppressing these side peaks are proposed and their effectiveness is verified by simulations.
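
    The delay-Doppler ambiguity surface at the heart of such a passive radar can be computed directly from the two channels. The brute-force Python sketch below illustrates the definition (practical systems use FFT-based batch algorithms), and the sampling rate, delay and Doppler grids shown are placeholder values.

    ```python
    import numpy as np

    def cross_ambiguity(ref, surv, fs, max_delay, doppler_bins):
        """|CAF(delay, Doppler)| between the reference-channel copy of the CDR signal
        and the surveillance channel (both complex baseband, same length)."""
        n = len(ref)
        t = np.arange(n) / fs
        caf = np.zeros((len(doppler_bins), max_delay), dtype=complex)
        for i, fd in enumerate(doppler_bins):
            derotated = surv * np.exp(-2j * np.pi * fd * t)      # remove a trial Doppler
            for d in range(max_delay):
                caf[i, d] = np.vdot(ref[:n - d], derotated[d:])  # correlate at lag d
        return np.abs(caf)

    # placeholder usage: 200 kHz complex baseband, 0..99 sample delays, +-100 Hz Doppler
    # surface = cross_ambiguity(ref, surv, 2.0e5, 100, np.linspace(-100, 100, 41))
    ```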

  13. Interference to cable television due to mobile usage in the Digital Dividend - Analysis

    NARCIS (Netherlands)

    Robijns, Jan; Schiphorst, Roelof

    2011-01-01

    The use of mobile applications in the 800 MHz band, which forms part of the ‘Digital Dividend’, may cause interference to cable TV signals under certain conditions. The new mobile applications (called LTE, Long Term Evolution) use frequency bands also used in cable TV networks. This paper discusses

  14. Analysis of medical exposures in digital mammography; Analise das exposicoes medicas em mamografia digital

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Sergio R., E-mail: oliveirasr@fiocruz.br [Escola Politecnica de Saude Joaquim Venancio (EPSJV/FIOCRUZ), Rio de Janeiro, RJ (Brazil); Mantuano, Natalia O.; Albrecht, Afonso S., E-mail: nataliamantuano@gmail.com, E-mail: afonsofismed@gmail.com [Universidade Federal do Rio de Janeiro (UFRJ), Rio de Janeiro, RJ (Brazil). Instituto de Fisica; Flor, Leonardo S., E-mail: leonardo.flor@hsvp.org.br [Hospital Sao Vicente de Paulo (HSVP), Rio de Janeiro, RJ (Brazil)

    2014-07-01

    Currently, the use of digital mammography in the early diagnosis of breast cancer is increasingly common because it produces high-definition images that allow subtle changes in the breast to be detected. However, the exposure technique in use needs to be improved, since some devices offer parameters that minimize the entrance skin dose. Thus, this study examines how the qualification of the radiology technologists affects the exposure techniques applied in mammography. For this, a survey was carried out in a hospital in the city of Rio de Janeiro, evaluating the examinations of 1190 patients undergoing routine mammography in 2013 (a routine examination is considered to be the 4 basic exposures: 2 craniocaudal and 2 mediolateral oblique views, excluding repeats and supplementary views). The medical exposures analyzed were obtained from a single full-field digital unit, a Senographe DS, and were compared across the procedures performed by three different mammography technologists. The images were classified according to the exposure techniques available in the equipment: standard (STD), contrast (CNT) and dose (DOSE), and according to the breast density of the patient. Comparing the variation of the radiographic technique with respect to the professional who made the exposure, professional B showed the best practice with regard to radiological protection, because she considered breast density when choosing the technical parameters of the equipment. Professional A, who is newly qualified, and professional C, who has more years of service, made almost no variation in the exposure pattern, even for different breast densities. Thus, we can conclude that there is a need to update the professionals' training so that the dose-limitation tools available in digital mammography, and their adjustment to breast variability, are efficiently employed in the service routine and thus meet the requirements of current legislation.

  15. Noise and Spurious Tones Management Techniques for Multi-GHz RF-CMOS Frequency Synthesizers Operating in Large Mixed Analog-Digital SOCs

    Directory of Open Access Journals (Sweden)

    Maxim Adrian

    2006-01-01

    Full Text Available This paper presents circuit techniques and power supply partitioning, filtering, and regulation methods aimed at reducing the phase noise and spurious tones in frequency synthesizers operating in a large mixed analog-digital system-on-chip (SOC). The different noise and spur coupling mechanisms are presented together with solutions to minimize their impact on the overall PLL phase noise performance. Challenges specific to deep-submicron CMOS integration of multi-GHz PLLs are revealed, and new architectures that address these issues are presented. Layout techniques that help reduce the parasitic noise and spur coupling between digital and analog blocks are described. By combining system-level and circuit-level low-noise design methods, low phase noise frequency synthesizers were achieved which are compatible with today's demanding wireless communication standards.

  16. Oil price and financial markets: Multivariate dynamic frequency analysis

    International Nuclear Information System (INIS)

    Creti, Anna; Ftiti, Zied; Guesmi, Khaled

    2014-01-01

    The aim of this paper is to study the degree of interdependence between oil price and stock market index into two groups of countries: oil-importers and oil-exporters. To this end, we propose a new empirical methodology allowing a time-varying dynamic correlation measure between the stock market index and the oil price series. We use the frequency approach proposed by Priestley and Tong (1973), that is the evolutionary co-spectral analysis. This method allows us to distinguish between short-run and medium-run dependence. In order to complete our study by analysing long-run dependence, we use the cointegration procedure developed by Engle and Granger (1987). We find that interdependence between the oil price and the stock market is stronger in exporters' markets than in the importers' ones. - Highlights: • A new time-varying measure for the stock markets and oil price relationship in different horizons. • We propose a new empirical methodology: multivariate frequency approach. • We propose a comparison between oil importing and exporting countries. • We show that oil is not always countercyclical with respect to stock markets. • When high oil prices originate from supply shocks, oil is countercyclical with stock markets

  17. Frequency Analysis of Aircraft hazards for License Application

    International Nuclear Information System (INIS)

    K. Ashley

    2006-01-01

    The preclosure safety analysis for the monitored geologic repository at Yucca Mountain must consider the hazard that aircraft may pose to surface structures. Relevant surface structures are located beneath the restricted airspace of the Nevada Test Site (NTS) on the eastern slope of Yucca Mountain, near the North Portal of the Exploratory Studies Facility Tunnel (Figure 1). The North Portal is located several miles from the Nevada Test and Training Range (NTTR), which is used extensively by the U.S. Air Force (USAF) for training and test flights (Figure 1). The NTS airspace, which is controlled by the U.S. Department of Energy (DOE) for NTS activities, is not part of the NTTR. Agreements with the DOE allow USAF aircraft specific use of the airspace above the NTS (Reference 2.1.1 [DIRS 103472], Section 3.1.1 and Appendix A, Section 2.1; and Reference 2.1.2 [DIRS 157987], Sections 1.26 through 1.29). Commercial, military, and general aviation aircraft fly within several miles to the southwest of the repository site in the Beatty Corridor, which is a broad air corridor that runs approximately parallel to U.S. Highway 95 and the Nevada-California border (Figure 2). These aircraft and other aircraft operations are identified and described in ''Identification of Aircraft Hazards'' (Reference 2.1.3, Sections 6 and 8). The purpose of this analysis is to estimate crash frequencies for aircraft hazards identified for detailed analysis in ''Identification of Aircraft Hazards'' (Reference 2.1.3, Section 8). Reference 2.1.3, Section 8, also identifies a potential hazard associated with electronic jamming, which will be addressed in this analysis. This analysis will address only the repository and not the transportation routes to the site. The analysis is intended to provide the basis for: (1) Categorizing event sequences related to aircraft hazards; (2) Identifying design or operational requirements related to aircraft hazards

  18. Correlation analysis of the physiological factors controlling fundamental voice frequency.

    Science.gov (United States)

    Atkinson, J E

    1978-01-01

    A technique has been developed to obtain a quantitative measure of correlation between electromyographic (EMG) activity of various laryngeal muscles, subglottal air pressure, and the fundamental frequency of vibration of the vocal folds (Fo). Data were collected and analyzed on one subject, a native speaker of American English. The results show that an analysis of this type can provide a useful measure of correlation between the physiological and acoustical events in speech and, furthermore, can yield detailed insights into the organization and nature of the speech production process. In particular, based on these results, a model is suggested of Fo control involving laryngeal state functions that seems to agree with present knowledge of laryngeal control and experimental evidence.

  19. Analysis and modelling of engineering structures in frequency domain

    International Nuclear Information System (INIS)

    Ishtev, K.; Bonev, Z.; Petrov, P.; Philipov, P.

    1987-01-01

    This paper deals with some possible applications for the modelling and analysis of engineering structures, based on the technique mentioned above. The governing system of equations is written using a frequency domain approach, since the elimination technique has computational significance in this field. Modelling is based on the well-known relationship Y(jw) = W(jw) * X(jw). Here X(jw) is the complex Fourier spectrum associated with the input signals, defined as earthquake, wind, hydrodynamic, control or other types of action. W(jw) is a complex matrix transfer function which expresses the correlation between the input spectrum X and the output spectrum Y. Y(jw) represents the complex Fourier spectrum of the output signals. Input and output signals are both associated with master degrees of freedom; thus the matrix transfer function is composed of elements in such a manner that the slave unknown parameters are treated implicitly. An integration algorithm for the 'condensed' system of equations is available. (orig./GL)
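
    The relationship Y(jw) = W(jw) * X(jw) translates directly into a few lines of Python for the single-input, single-output case; the single-degree-of-freedom transfer function and the excitation used here are illustrative and are not taken from the paper.

    ```python
    import numpy as np

    def frequency_domain_response(x, dt, transfer):
        """Evaluate Y(jw) = W(jw) * X(jw) for a sampled input signal x(t)
        and a transfer function W evaluated at s = jw, then return y(t)."""
        X = np.fft.rfft(x)
        w = 2 * np.pi * np.fft.rfftfreq(len(x), d=dt)
        Y = transfer(1j * w) * X
        return np.fft.irfft(Y, n=len(x))

    # hypothetical single-degree-of-freedom structure under ground acceleration
    wn, zeta = 2 * np.pi * 2.0, 0.05                 # 2 Hz natural frequency, 5 % damping
    W = lambda s: -1.0 / (s**2 + 2 * zeta * wn * s + wn**2)

    dt = 0.01
    ground_acc = np.sin(2 * np.pi * 2.0 * np.arange(0, 20, dt))   # resonant excitation
    displacement = frequency_domain_response(ground_acc, dt, W)
    print("peak relative displacement: %.3f" % np.abs(displacement).max())
    ```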

  20. Digital image analysis techniques for fiber and soil mixtures : technical summary.

    Science.gov (United States)

    1999-05-01

    This project used two innovative technologies of digital image analysis for the characterization of a material currently being considered for broad use at DOTD. The material under consideration is a mixture of fiber and soil for use in the stabilizati...

  1. Effect Analysis of Digital I and C Systems on Plant Safety based on Fault-Tree Analysis

    International Nuclear Information System (INIS)

    Lee, Seung Jun; Jung, Wondea

    2014-01-01

    Deterioration and an inadequate supply of components of analog I and C systems have led to inefficient and costly maintenance. Moreover, since the fast evolution of digital technology has enabled more reliable functions to be designed for NPP safety, the transition from analog to digital has been accelerated. Owing to the distinctive characteristics of digital I and C systems, a reliability analysis of digital systems has become an important element of a probabilistic safety assessment (PSA). Digital I and C systems have unique characteristics such as fault-tolerant techniques and software. However, these features have not yet been properly considered in most NPP PSA models. The effect of digital I and C systems should be evaluated by comparing it to that of analog I and C systems. Before installing a digital I and C system, even though it is expected that plant safety can be improved through the advantageous features of digital I and C systems, it should be validated that the total NPP safety is better than, or at least the same as, with analog systems. In this work, the fault-tree (FT) technique, which is most widely used in a PSA, was used to compare the effects of analog and digital I and C systems. From a case study, the results for plant safety were compared. In this work, the effect of a digital RPS was evaluated by comparing it to that of an analog RPS based on the FT models. In the evaluation results, it was observed that the digital RPS has a positive effect in reducing the system unavailability. The analysis results can be used for the development of a guide for evaluating digital I and C systems and reliability requirements
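
    The comparison idea can be illustrated with a toy fault-tree calculation in Python; the gate structure, the 2-out-of-4 trip logic and all unavailability figures below are hypothetical placeholders, not values from the paper.

    ```python
    from math import comb

    def or_gate(*p):
        """OR gate: union of independent basic events."""
        q = 1.0
        for pi in p:
            q *= 1.0 - pi
        return 1.0 - q

    def k_out_of_n_fails(p, k, n):
        """Probability that a k-out-of-n voting logic fails, i.e. that more than
        n - k of the n identical, independent channels are unavailable."""
        return sum(comb(n, m) * p**m * (1 - p)**(n - m) for m in range(n - k + 1, n + 1))

    # hypothetical channel unavailabilities (per demand), not taken from the paper
    analog_channel  = or_gate(1.0e-3, 5.0e-4)             # sensor + relay logic
    digital_channel = or_gate(1.0e-3, 2.0e-4, 1.0e-5)     # sensor + processor + software

    for name, p in [("analog RPS", analog_channel), ("digital RPS", digital_channel)]:
        print("%-12s channel %.2e   2-out-of-4 trip %.2e" % (name, p, k_out_of_n_fails(p, 2, 4)))
    ```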

  2. Combination of one-view digital breast tomosynthesis with one-view digital mammography versus standard two-view digital mammography: per lesion analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gennaro, Gisella; Bezzon, Elisabetta; Pescarini, Luigi; Polico, Ilaria; Proietti, Alessandro; Baldan, Enrica; Pomerri, Fabio; Muzzio, Pier Carlo [Veneto Institute of Oncology (IRCCS), Padua (Italy); Hendrick, R.E. [University of Colorado-Denver, Department of Radiology, School of Medicine, Aurora, CO (United States); Toledano, Alicia [Biostatistics Consulting, LLC, Kensington, MD (United States); Paquelet, Jean R. [Advanced Medical Imaging Consultants, Fort Collins, CO (United States); Breast Imaging, McKee Medical Center, Loveland, CO (United States); Chersevani, Roberta [Private Medical Practice, Gorizia (Italy); Di Maggio, Cosimo [Private Medical Practice, Padua (Italy); La Grassa, Manuela [Department of Radiology, Oncological Reference Center (IRCCS), Aviano (Italy)

    2013-08-15

    To evaluate the clinical value of combining one-view mammography (cranio-caudal, CC) with the complementary view tomosynthesis (mediolateral-oblique, MLO) in comparison to standard two-view mammography (MX) in terms of both lesion detection and characterization. A free-response receiver operating characteristic (FROC) experiment was conducted independently by six breast radiologists, obtaining data from 463 breasts of 250 patients. Differences in mean lesion detection fraction (LDF) and mean lesion characterization fraction (LCF) were analysed by analysis of variance (ANOVA) to compare clinical performance of the combination of techniques to standard two-view digital mammography. The 463 cases (breasts) reviewed included 258 with one to three lesions each, and 205 with no lesions. The 258 cases with lesions included 77 cancers in 68 breasts and 271 benign lesions to give a total of 348 proven lesions. The combination, DBT(MLO)+MX(CC), was superior to MX (CC+MLO) in both lesion detection (LDF) and lesion characterization (LCF) overall and for benign lesions. DBT(MLO)+MX(CC) was non-inferior to two-view MX for malignant lesions. This study shows that readers' capabilities in detecting and characterizing breast lesions are improved by combining single-view digital breast tomosynthesis and single-view mammography compared to two-view digital mammography. (orig.)

  3. Percorsi linguistici e semiotici: Critical Multimodal Analysis of Digital Discourse

    OpenAIRE

    edited by Ilaria Moschini

    2014-01-01

    The language section of LEA - edited by Ilaria Moschini - is dedicated to the Critical Multimodal Analysis of Digital Discourse, an approach that encompasses the linguistic and semiotic detailed investigation of texts within a socio-cultural perspective. It features an interview with Professor Theo van Leeuwen by Ilaria Moschini and four essays: “Retwitting, reposting, repinning; reshaping identities online: Towards a social semiotic multimodal analysis of digital remediation” by Elisabetta A...

  4. On time-frequency analysis of heart rate variability

    NARCIS (Netherlands)

    H.G. van Steenis (Hugo)

    2002-01-01

    The aim of this research is to develop a time-frequency method suitable for studying HRV in greater detail. The following approach was used: • two known time-frequency representations were applied to HRV to understand their advantages and disadvantages in describing HRV in frequency and in

  5. Evaluating fracture healing using digital x-ray image analysis

    African Journals Online (AJOL)

    2011-03-02

    Mar 2, 2011 ... with intensive imaging and modelling.6 dual energy X-ray ... techniques due to their high-quality digital output in ... the bone in the loaded X-ray is at an angular offset due to .... The research described in this article was carried ...

  6. A computer program for planimetric analysis of digitized images

    DEFF Research Database (Denmark)

    Lynnerup, N; Lynnerup, O; Homøe, P

    1992-01-01

    bones as seen on X-rays. By placing the X-rays on a digitizer tablet and tracing the outline of the cell system, the area was calculated by the program. The calculated data and traced images could be stored and printed. The program is written in BASIC; necessary hardware is an IBM-compatible personal...

  7. Analysis of synchronous digital-modulation schemes for satellite communication

    Science.gov (United States)

    Takhar, G. S.; Gupta, S. C.

    1975-01-01

    The multipath communication channel for space communications is modeled as a multiplicative channel. This paper discusses the effects of multiplicative channel processes on the symbol error rate for quadrature modulation (QM) digital modulation schemes. An expression for the upper bound on the probability of error is derived and numerically evaluated. The results are compared with those obtained for additive channels.

  8. Trade-off Analysis of Virtual Inertia and Fast Primary Frequency Control During Frequency Transients in a Converter Dominated Network

    DEFF Research Database (Denmark)

    Rezkalla, Michel M.N.; Marinelli, Mattia; Pertl, Michael

    2016-01-01

    Traditionally, electricity generation is based on rotating synchronous machines which provide inertia to the power system. The increasing share of converter-connected energy sources reduces the available rotational inertia in the power system, leading to faster frequency dynamics, which may cause...... more critical frequency excursions. Both virtual inertia and fast primary control could serve as a solution to improve frequency stability; however, their respective impacts on the system have different consequences, so that the trade-off is not straightforward. This study presents a comparative...... analysis of virtual inertia and fast primary control algorithms with respect to rate of change of frequency (ROCOF), frequency nadir and steady-state value, considering the effect of the dead time, which is carried out by a sensitivity analysis. The investigation shows that the virtual inertia controller

  9. Space-frequency analysis and reduction of potential field ambiguity

    Directory of Open Access Journals (Sweden)

    A. Rapolla

    1997-06-01

    Full Text Available Ambiguity in the depth estimation of magnetic sources via spectral analysis can be reduced by representing the field as a set of space-frequency atoms. This is obtained through a continuous wavelet transform using a Morlet analyzing wavelet. In the phase-plane representation, even a weak contribution related to deep-seated sources is clearly distinguished from the more intense effect of a shallow source, also in the presence of strong noise. Furthermore, a new concept of local power spectrum allows the depths of both sources to be correctly interpreted. Neither result can be provided by standard Fourier analysis. Another method is proposed to reduce ambiguity by inversion of potential field data lying along the vertical axis. This method improves depth resolution for the gravity and magnetic methods and, under some conditions, helps to reduce their inherent ambiguity. Unlike the case of monopoles, inversion of a vertical profile of gravity data above a cubic source gives correct results for the cube side and density.
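
    As a rough illustration of the space-frequency atom idea, the Python sketch below computes a continuous wavelet transform of a potential-field profile with a Morlet analysing wavelet and forms a local power spectrum; the profile, scale range and Morlet parameter are illustrative assumptions, not values from the paper.

        import numpy as np

        def morlet_cwt(profile, dx, scales, w0=6.0):
            # Continuous wavelet transform with a Morlet analysing wavelet.
            # Each row of the result is the profile correlated with a scaled wavelet.
            n = len(profile)
            coeffs = np.empty((len(scales), n), dtype=complex)
            for i, s in enumerate(scales):
                half = min(4.0 * s, (n // 2 - 1) * dx)   # keep wavelet support inside profile
                t = np.arange(-half, half, dx)
                psi = np.exp(1j * w0 * t / s - 0.5 * (t / s) ** 2) / np.sqrt(s)
                coeffs[i] = np.convolve(profile, np.conj(psi[::-1]), mode='same') * dx
            return coeffs

        # Synthetic anomaly profile sampled every 10 m: a narrow shallow source
        # plus a broad, weaker deep-seated source (purely illustrative).
        x = np.arange(0.0, 10_000.0, 10.0)
        field = np.exp(-((x - 3000.0) / 300.0) ** 2) + 0.3 * np.exp(-((x - 7000.0) / 1500.0) ** 2)
        scales = np.linspace(50.0, 1000.0, 40)
        local_power = np.abs(morlet_cwt(field, 10.0, scales)) ** 2   # "local power spectrum"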

  10. Analysis of pulse-shape discrimination techniques for BC501A using GHz digital signal processing

    International Nuclear Information System (INIS)

    Rooney, B.D.; Dinwiddie, D.R.; Nelson, M.A.; Rawool-Sullivan, Mohini W.

    2001-01-01

    A comparison study of pulse-shape analysis techniques was conducted for a BC501A scintillator using digital signal processing (DSP). In this study, output signals from a preamplifier were input directly into a 1 GHz analog-to-digital converter. The digitized data obtained with this method was post-processed for both pulse-height and pulse-shape information. Several different analysis techniques were evaluated for neutron and gamma-ray pulse-shape discrimination. It was surprising that one of the simplest and fastest techniques resulted in some of the best pulse-shape discrimination results. This technique, referred to here as the Integral Ratio technique, was able to effectively process several thousand detector pulses per second. This paper presents the results and findings of this study for various pulse-shape analysis techniques with digitized detector signals.
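
    The Integral Ratio idea described above amounts to comparing the delayed ("tail") charge of each digitized pulse with its total charge. A minimal Python sketch follows; the gate lengths are assumed values, since the record does not give the windows used with the 1 GHz digitizer.

        import numpy as np

        def integral_ratio(pulse, peak_idx, total_gate=200, tail_start=30):
            # Ratio of tail charge to total charge for one baseline-subtracted pulse.
            # Gate lengths are in samples and purely illustrative.
            total = np.sum(pulse[peak_idx:peak_idx + total_gate])
            tail = np.sum(pulse[peak_idx + tail_start:peak_idx + total_gate])
            return tail / total if total > 0 else 0.0

        # Neutron pulses in BC501A carry relatively more slow scintillation light,
        # so their integral ratios cluster at higher values than gamma-ray pulses;
        # a simple threshold on the ratio then separates the two populations.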

  11. User Requirements Analysis For Digital Library Application Using Quality Function Deployment.

    Science.gov (United States)

    Wulandari, Lily; Sularto, Lana; Yusnitasari, Tristyanti; Ikasari, Diana

    2017-03-01

    This study attempts to build a Smart Digital Library that can be used by the wider community wherever they are. The system is built in the form of a Smart Digital Library portal which uses a semantic similarity method to search journals, articles or books by title or author name. This method is also used to determine the books recommended to visitors of the Smart Digital Library, based automatically on testimony from previous readers. The steps taken in the development of the Smart Digital Library system are the analysis phase, design phase, testing and implementation phase. In the analysis phase, WebQual was used to prepare the instruments distributed to the respondents, and the data obtained from the respondents were processed using Quality Function Deployment. The analysis phase has the purpose of identifying consumer needs and technical requirements. The analysis was performed on the digital library websites of Gunadarma University, Bogor Institute of Agriculture, the University of Indonesia, etc. The questionnaire was distributed to 200 respondents. The research methodology begins with the collection of user requirements and analyses them using QFD. Application design is funded by the government through the Featured Universities Research program of the Directorate General of Higher Education (DIKTI). Conclusions from this research identify the consumer requirements of the digital library application: the consumer requirements consist of 13 elements, and the engineering characteristics of the digital library requirements consist of 25 elements. The digital library application is therefore designed according to these findings, eliminating features that are not needed, based on the QFD House of Quality.

  12. A general theory on frequency and time-frequency analysis of irregularly sampled time series based on projection methods - Part 1: Frequency analysis

    Science.gov (United States)

    Lenoir, Guillaume; Crucifix, Michel

    2018-03-01

    We develop a general framework for the frequency analysis of irregularly sampled time series. It is based on the Lomb-Scargle periodogram, but extended to algebraic operators accounting for the presence of a polynomial trend in the model for the data, in addition to a periodic component and a background noise. Special care is devoted to the correlation between the trend and the periodic component. This new periodogram is then cast into the Welch overlapping segment averaging (WOSA) method in order to reduce its variance. We also design a test of significance for the WOSA periodogram, against the background noise. The model for the background noise is a stationary Gaussian continuous autoregressive-moving-average (CARMA) process, more general than the classical Gaussian white or red noise processes. CARMA parameters are estimated following a Bayesian framework. We provide algorithms that compute the confidence levels for the WOSA periodogram and fully take into account the uncertainty in the CARMA noise parameters. Alternatively, a theory using point estimates of CARMA parameters provides analytical confidence levels for the WOSA periodogram, which are more accurate than Markov chain Monte Carlo (MCMC) confidence levels and, below some threshold for the number of data points, less costly in computing time. We then estimate the amplitude of the periodic component with least-squares methods, and derive an approximate proportionality between the squared amplitude and the periodogram. This proportionality leads to a new extension for the periodogram: the weighted WOSA periodogram, which we recommend for most frequency analyses with irregularly sampled data. The estimated signal amplitude also permits filtering in a frequency band. Our results generalise and unify methods developed in the fields of geosciences, engineering, astronomy and astrophysics. They also constitute the starting point for an extension to the continuous wavelet transform developed in a companion
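
    A stripped-down Python sketch of the core idea - Lomb-Scargle periodograms averaged over overlapping segments (WOSA) - is given below; the polynomial trend handling, CARMA noise model and significance testing of the paper are not reproduced, and all parameter values are illustrative.

        import numpy as np
        from scipy.signal import lombscargle

        def wosa_lomb_scargle(t, y, ang_freqs, n_segments=4, overlap=0.5):
            # Welch-overlapping-segment averaging of Lomb-Scargle periodograms,
            # a simplified stand-in for the extended periodogram of the paper.
            n = len(t)
            seg_len = int(n / (n_segments * (1 - overlap) + overlap))
            step = max(1, int(seg_len * (1 - overlap)))
            segments = []
            for start in range(0, n - seg_len + 1, step):
                ts = t[start:start + seg_len]
                ys = y[start:start + seg_len] - y[start:start + seg_len].mean()
                segments.append(lombscargle(ts, ys, ang_freqs))
            return np.mean(segments, axis=0)

        # Illustrative use on an irregularly sampled series with a 50-unit period.
        rng = np.random.default_rng(1)
        t = np.sort(rng.uniform(0.0, 2000.0, 500))               # irregular sampling times
        y = np.sin(2 * np.pi * t / 50.0) + rng.normal(0.0, 0.5, t.size)
        ang_freqs = 2 * np.pi * np.linspace(0.002, 0.1, 300)     # angular frequencies
        pgram = wosa_lomb_scargle(t, y, ang_freqs)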

  13. Multifrequency spectrum analysis using fully digital G Mode-Kelvin probe force microscopy

    International Nuclear Information System (INIS)

    Collins, Liam; Belianinov, Alex; Somnath, Suhas; Balke, Nina; Kalinin, Sergei V; Jesse, Stephen; Rodriguez, Brian J

    2016-01-01

    Since its inception over two decades ago, Kelvin probe force microscopy (KPFM) has become the standard technique for characterizing electrostatic, electrochemical and electronic properties at the nanoscale. In this work, we present a purely digital, software-based approach to KPFM utilizing big data acquisition and analysis methods. General mode (G-Mode) KPFM works by capturing the entire photodetector data stream, typically at the sampling rate limit, followed by subsequent de-noising, analysis and compression of the cantilever response. We demonstrate that the G-Mode approach allows simultaneous multi-harmonic detection, combined with on-the-fly transfer function correction—required for quantitative CPD mapping. The KPFM approach outlined in this work significantly simplifies the technique by avoiding cumbersome instrumentation optimization steps (i.e. lock in parameters, feedback gains etc), while also retaining the flexibility to be implemented on any atomic force microscopy platform. We demonstrate the added advantages of G-Mode KPFM by allowing simultaneous mapping of CPD and capacitance gradient (C′) channels as well as increased flexibility in data exploration across frequency, time, space, and noise domains. G-Mode KPFM is particularly suitable for characterizing voltage sensitive materials or for operation in conductive electrolytes, and will be useful for probing electrodynamics in photovoltaics, liquids and ionic conductors. (paper)
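
    As a very rough sketch of the multi-harmonic detection step, the Python snippet below estimates the response amplitude at the drive frequency and its harmonics from a captured photodetector stream by FFT; the de-noising, transfer-function correction and CPD fitting of G-Mode KPFM are not reproduced, and the function and parameter names are assumptions.

        import numpy as np

        def harmonic_amplitudes(stream, fs, f_drive, n_harmonics=3):
            # Amplitudes at f_drive, 2*f_drive, ... estimated from the full data stream.
            n = len(stream)
            window = np.hanning(n)
            spectrum = np.fft.rfft(stream * window)
            freqs = np.fft.rfftfreq(n, d=1.0 / fs)
            amps = []
            for k in range(1, n_harmonics + 1):
                idx = np.argmin(np.abs(freqs - k * f_drive))     # nearest FFT bin
                amps.append(2.0 * np.abs(spectrum[idx]) / window.sum())
            return np.array(amps)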

  14. Dynamic strain analysis of structures employing digital signal processing, storage and display

    Energy Technology Data Exchange (ETDEWEB)

    Patwardhan, P K; Misra, V M; Kumar, Surendra

    1975-01-01

    A multi-channel digital technique has been adopted for analysing wave patterns of stresses and strains in structures, particularly under dynamic conditions. This technique provides adequate signal-to-noise discrimination and high sensitivity for very small (a few millivolts) and slowly varying signals (a few Hz to 100 Hz), and A-D conversion accompanied by live display during the course of data gathering and computer-compatible output. This system also provides fast response because of its inherent 50 MHz digitising speed and a large dynamic range of 1024 discrete signal steps. The signals can be fed directly to the A-D converter (50 MHz) or can be analysed employing frequency modulation techniques and time-mode operation of the analyser. The data can be gathered in the field on cassette tapes and replayed in the laboratory for detailed analysis. This technique provides a versatile system for dynamic analysis of structures under varying conditions, e.g. structures in nuclear power systems, such as testing of end fittings, calandria, vibration testing and measurements employing pressure transducers.

  15. Dynamic strain analysis of structures employing digital signal processing, storage and display

    International Nuclear Information System (INIS)

    Patwardhan, P.K.; Misra, V.M.; Kumar, Surendra

    1975-01-01

    A multi-channel digital technique has been adopted for analysing wave patterns of stresses and strains in structures, particularly under dynamic conditions. This technique provides adequate signal-to-noise discrimination and high sensitivity for very small (a few millivolts) and slowly varying signals (a few Hz to 100 Hz), A-D conversion accompanied by live display during the course of data gathering and computer-compatible output. This system also provides fast response because of its inherent 50 MHz digitising speed and a large dynamic range of 1024 discrete signal steps. The signals can be fed directly to the A-D converter (50 MHz) or can be analysed employing frequency modulation techniques and time-mode operation of the analyser. The data can be gathered in the field on cassette tapes and replayed in the laboratory for detailed analysis. This technique provides a versatile system for dynamic analysis of structures under varying conditions, e.g. structures in nuclear power systems, such as testing of end fittings, calandria, vibration testing and measurements employing pressure transducers. (author)

  16. Bridging the Digital Divide Creating Digital Dividend - The Investigation in Guizhou Province and the Analysis of GZNW

    Directory of Open Access Journals (Sweden)

    Linbo Jing

    2007-12-01

    Full Text Available This article begins with attention to the digital divide. It gives a brief overview of the digital divide on a global basis and analyzes specific aspects of the digital divide in China. It also introduces the informationization construction of Guizhou Province and points out problems with the digital divide in that province. It then focuses on the practice of Guizhou Province in bridging the digital divide: the practice and experience of GZNW. The final section gives a series of policy recommendations on how to bridge the digital divide, realize digital dividends, and build a new socialist countryside.

  17. Frequency Analysis of Aircraft hazards for License Application

    Energy Technology Data Exchange (ETDEWEB)

    K. Ashley

    2006-10-24

    The preclosure safety analysis for the monitored geologic repository at Yucca Mountain must consider the hazard that aircraft may pose to surface structures. Relevant surface structures are located beneath the restricted airspace of the Nevada Test Site (NTS) on the eastern slope of Yucca Mountain, near the North Portal of the Exploratory Studies Facility Tunnel (Figure 1). The North Portal is located several miles from the Nevada Test and Training Range (NTTR), which is used extensively by the U.S. Air Force (USAF) for training and test flights (Figure 1). The NTS airspace, which is controlled by the U.S. Department of Energy (DOE) for NTS activities, is not part of the NTTR. Agreements with the DOE allow USAF aircraft specific use of the airspace above the NTS (Reference 2.1.1 [DIRS 103472], Section 3.1.1 and Appendix A, Section 2.1; and Reference 2.1.2 [DIRS 157987], Sections 1.26 through 1.29). Commercial, military, and general aviation aircraft fly within several miles to the southwest of the repository site in the Beatty Corridor, which is a broad air corridor that runs approximately parallel to U.S. Highway 95 and the Nevada-California border (Figure 2). These aircraft and other aircraft operations are identified and described in "Identification of Aircraft Hazards" (Reference 2.1.3, Sections 6 and 8). The purpose of this analysis is to estimate crash frequencies for aircraft hazards identified for detailed analysis in "Identification of Aircraft Hazards" (Reference 2.1.3, Section 8). Reference 2.1.3, Section 8, also identifies a potential hazard associated with electronic jamming, which will be addressed in this analysis. This analysis will address only the repository and not the transportation routes to the site. The analysis is intended to provide the basis for: (1) Categorizing event sequences related to aircraft hazards; (2) Identifying design or operational requirements related to aircraft hazards.

  18. Grid Frequency Extreme Event Analysis and Modeling: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Florita, Anthony R [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Clark, Kara [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Gevorgian, Vahan [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Folgueras, Maria [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Wenger, Erin [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-11-01

    Sudden losses of generation or load can lead to instantaneous changes in electric grid frequency and voltage. Extreme frequency events pose a major threat to grid stability. As renewable energy sources supply power to grids in increasing proportions, it becomes increasingly important to examine when and why extreme events occur to prevent destabilization of the grid. To better understand frequency events, including extrema, historic data were analyzed to fit probability distribution functions to various frequency metrics. Results showed that a standard Cauchy distribution fit the difference between the frequency nadir and prefault frequency (f_(C-A)) metric well, a standard Cauchy distribution fit the settling frequency (f_B) metric well, and a standard normal distribution fit the difference between the settling frequency and frequency nadir (f_(B-C)) metric very well. Results were inconclusive for the frequency nadir (f_C) metric, meaning it likely has a more complex distribution than those tested. This probabilistic modeling should facilitate more realistic modeling of grid faults.
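
    A hedged Python sketch of the distribution-fitting step is shown below using synthetic stand-in data; in practice the per-event metrics would be extracted from historic frequency traces, and the fitted families follow the preprint's findings (Cauchy for f_(C-A), normal for f_(B-C)).

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        # Synthetic stand-ins for per-event metrics (Hz); real values would come
        # from measured frequency traces, not from these draws.
        f_c_minus_a = stats.cauchy.rvs(loc=-0.05, scale=0.01, size=500, random_state=rng)
        f_b_minus_c = stats.norm.rvs(loc=0.03, scale=0.01, size=500, random_state=rng)

        loc_c, scale_c = stats.cauchy.fit(f_c_minus_a)      # Cauchy fit for f_(C-A)
        mu_n, sigma_n = stats.norm.fit(f_b_minus_c)         # normal fit for f_(B-C)

        # Kolmogorov-Smirnov checks of the fitted models.
        print(stats.kstest(f_c_minus_a, 'cauchy', args=(loc_c, scale_c)))
        print(stats.kstest(f_b_minus_c, 'norm', args=(mu_n, sigma_n)))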

  19. Analysis of identification of digital images from a map of cosmic microwaves

    Science.gov (United States)

    Skeivalas, J.; Turla, V.; Jurevicius, M.; Viselga, G.

    2018-04-01

    This paper discusses the identification of digital images from the cosmic microwave background radiation map formed from data of the European Space Agency "Planck" telescope, by applying covariance functions and wavelet theory. The estimates of the covariance functions of two digital images, or of a single image, are calculated from random functions formed from the digital images in the form of pixel vectors. The pixel vectors are formed by expanding the pixel arrays of the digital images into a single vector. When the scale of a digital image is varied, the frequencies of single-pixel color waves remain constant and the procedure for calculating the covariance functions is not affected. For identification of the images, the RGB format spectrum has been applied. The impact of the RGB spectrum components and the color tensor on the estimates of the covariance functions was analyzed. The identity of digital images is assessed according to the changes in the values of the correlation coefficients within a certain range of values, using the developed computer program.
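
    The Python snippet below is a simplified stand-in for the covariance-function comparison: each image array is expanded into a single pixel vector and the normalised zero-lag cross-covariance (Pearson correlation) of the two vectors is evaluated; the full covariance-function and wavelet machinery of the paper is not reproduced.

        import numpy as np

        def image_correlation(img_a, img_b):
            # Pearson correlation between the pixel vectors of two same-size RGB images.
            va = img_a.astype(float).ravel()
            vb = img_b.astype(float).ravel()
            va -= va.mean()
            vb -= vb.mean()
            return float(np.dot(va, vb) / (np.linalg.norm(va) * np.linalg.norm(vb)))

        # Two images would then be judged (dis)similar according to where this
        # coefficient falls within a chosen range of values.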

  20. Decomposed Photo Response Non-Uniformity for Digital Forensic Analysis

    Science.gov (United States)

    Li, Yue; Li, Chang-Tsun

    The last few years have seen the application of Photo Response Non-Uniformity noise (PRNU) - a unique stochastic fingerprint of image sensors - to various types of digital forensic investigations such as source device identification and integrity verification. In this work we propose a new way of extracting the PRNU noise pattern, called Decomposed PRNU (DPRNU), by exploiting the difference between the physical and artificial color components of photos taken by digital cameras that use a Color Filter Array for interpolating artificial components from physical ones. Experimental results presented in this work show the superiority of the proposed DPRNU over the commonly used version. We also propose a new performance metric, the Corrected Positive Rate (CPR), to evaluate the performance of the common PRNU and the proposed DPRNU.
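
    For orientation, a rudimentary PRNU-style fingerprint can be estimated by averaging denoising residuals over several images from the same camera, as sketched below in Python; the decomposition into physical and artificial CFA components that defines DPRNU, and the CPR metric, are not reproduced here, and the choice of denoiser is an assumption.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def prnu_estimate(images, sigma=1.5):
            # Average of denoising residuals over same-size grayscale images
            # from one camera; a crude stand-in for a PRNU fingerprint.
            residuals = []
            for img in images:
                img = img.astype(float)
                residuals.append(img - gaussian_filter(img, sigma))
            return np.mean(residuals, axis=0)

        # Source attribution then correlates a query image's residual with each
        # candidate camera's fingerprint and picks the highest correlation.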

  1. Digital holography microscopy in 3D biologic samples analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ricardo, J O; Palacios, F; Palacios, G F; Sanchez, A [Department of Physics, University of Oriente (Cuba); Muramatsu, M [Department of General Physics, University of Sao Paulo - Sao Paulo (Brazil); Gesualdi, M [Engineering center, Models and Applied Social Science, UFABC - Sao Paulo (Brazil); Font, O [Department of Bio-ingeniering, University of Oriente - Santiago de Cuba (Cuba); Valin, J L [Mechanics Department, ISPJAE, Habana (Cuba); Escobedo, M; Herold, S [Department of Computation, University of Oriente (Cuba); Palacios, D F, E-mail: frpalaciosf@gmail.com [Department of Nuclear physics, University of Simon BolIva (Venezuela, Bolivarian Republic of)

    2011-01-01

    In this work a setup for Digital Holography Microscopy (DHM) is used for the 3D reconstruction of biologic samples. The phase-contrast image reconstruction is done using the double propagation method. The system was calibrated and tested using a micrometric scale and a pure phase object, respectively. The human red blood cell (erythrocyte) was simulated and, starting from the simulated hologram, the digital 3D phase image of the erythrocyte was calculated. Experimental holograms of human erythrocytes and their corresponding 3D phase images were also obtained, showing a qualitative and quantitative correspondence between these characteristics in the simulated erythrocyte and in the one measured experimentally by DHM.

  2. Helper T lymphocyte precursor frequency analysis in alloreactivity detection

    International Nuclear Information System (INIS)

    Cukrova, V.; Dolezalova, L.; Loudova, M.; Vitek, A.

    1998-01-01

    The utility of IL-2 secreting helper T lymphocyte precursor (HTLp) frequency testing has been evaluated for detecting alloreactivity. The frequency of HTLp was estimated by limiting dilution assay. High HTLp frequency was detected in 20 out of 30 HLA matched unrelated pairs (67%). The comparison of HTLp and CTLp (cytotoxic T lymphocyte precursors) frequencies in HLA matched unrelated pairs showed that the two examinations are not fully interchangeable in detecting alloreactivity. This could suggest the utility of combined testing of both HTLp and CTLp frequencies for alloreactivity assessment. In contrast, only five positive HTLp values were found among 28 HLA genotypically identical siblings (18%). Previous CTLp limiting dilution studies showed very low or undetectable CTLp frequency results in that group. For that reason, the HTLp assay remains the only cellular in vitro technique detecting alloreactivity in these combinations. (authors)

  3. Diagnostic analysis of vibration signals using adaptive digital filtering techniques

    Science.gov (United States)

    Jewell, R. E.; Jones, J. H.; Paul, J. E.

    1983-01-01

    Signal enhancement techniques are described using recently developed digital adaptive filtering equipment. Adaptive filtering concepts are not new; however, as a result of recent advances in microprocessor-based electronics, hardware has been developed that has stable characteristics and filter orders exceeding 1000. Selected data processing examples are presented illustrating spectral line enhancement, adaptive noise cancellation, and transfer function estimation in the presence of corrupting noise.

  4. Radiation dose with digital breast tomosynthesis compared to digital mammography: per-view analysis.

    Science.gov (United States)

    Gennaro, Gisella; Bernardi, D; Houssami, N

    2018-02-01

    To compare the radiation dose delivered by digital mammography (FFDM) and breast tomosynthesis (DBT) for a single view. 4,780 FFDM and 4,798 DBT images from 1,208 women enrolled in a screening trial were used as the basis for the dose comparison. Raw images were processed by automatic software to determine volumetric breast density (VBD) and were used together with exposure data to compute the mean glandular dose (MGD) according to Dance's model. DBT and FFDM were compared in terms of operation of the automatic exposure control (AEC) and MGD level. Statistically significant differences were found between FFDM and DBT MGDs for all views (CC: MGD FFDM = 1.366 mGy, MGD DBT = 1.858 mGy), i.e. a higher dose with tomosynthesis compared to FFDM. Given the emerging role of DBT, its use in conjunction with synthetic 2D images should not be deterred by concerns regarding radiation burden, and should draw on evidence of potential clinical benefit. • Most studies compared tomosynthesis in combination with mammography vs. mammography alone. • There is some concern about the dose increase with tomosynthesis. • Clinical data show a small increase in radiation dose with tomosynthesis. • Synthetic 2D images from tomosynthesis at zero dose reduce potential harm. • The small dose increase should not be a barrier to use of tomosynthesis.

  5. [Advantages of digitalization and analysis of roentgenograms of bones by microcomputer.].

    Science.gov (United States)

    Tkácik, J; Makai, F; Keppert, M; Valko, B

    1994-01-01

    The creation of an experimental group for the digitalization of two-dimensional X-ray pictures of bones was the starting point for applying selected procedures for their computer-assisted analysis. During processing, the primary digitalized picture was first corrected (e.g. subtraction of the background, correction of the non-homogeneity of the scanning element and light source) and subsequently prepared by point or local transformations (e.g. semi-thresholding, equalization, Sobel's operator and subtraction of two differently prepared pictures) while preserving the geometry of the original objects in the picture. On the treated pictures of bony structures, graphic procedures for identifying important points, lines, angles, derived indices and mutual relations of the bones were then verified. To create the prerequisites for following the long-term development of osseous changes, in particular in patients with non-cemented hip joint implants, the possibilities of and storage requirements for filing digitalized two-dimensional X-ray pictures of the bones on information media were tested, as well as the possibility of transmitting them over the local computer network. Finally the authors draw attention to the advantages of digitalization and analysis of two-dimensional X-ray pictures of bones by a microcomputer with regard to information content, economic aspects and, in particular, preservation of information in digital form. Key words: digitalization of X-ray pictures, digital processing of X-ray pictures, filing of X-ray pictures.

  6. A general theory on frequency and time-frequency analysis of irregularly sampled time series based on projection methods - Part 2: Extension to time-frequency analysis

    Science.gov (United States)

    Lenoir, Guillaume; Crucifix, Michel

    2018-03-01

    Geophysical time series are sometimes sampled irregularly along the time axis. The situation is particularly frequent in palaeoclimatology. Yet, there is so far no general framework for handling the continuous wavelet transform when the time sampling is irregular. Here we provide such a framework. To this end, we define the scalogram as the continuous-wavelet-transform equivalent of the extended Lomb-Scargle periodogram defined in Part 1 of this study (Lenoir and Crucifix, 2018). The signal being analysed is modelled as the sum of a locally periodic component in the time-frequency plane, a polynomial trend, and a background noise. The mother wavelet adopted here is the Morlet wavelet classically used in geophysical applications. The background noise model is a stationary Gaussian continuous autoregressive-moving-average (CARMA) process, which is more general than the traditional Gaussian white and red noise processes. The scalogram is smoothed by averaging over neighbouring times in order to reduce its variance. The Shannon-Nyquist exclusion zone is however defined as the area corrupted by local aliasing issues. The local amplitude in the time-frequency plane is then estimated with least-squares methods. We also derive an approximate formula linking the squared amplitude and the scalogram. Based on this property, we define a new analysis tool: the weighted smoothed scalogram, which we recommend for most analyses. The estimated signal amplitude also gives access to band and ridge filtering. Finally, we design a test of significance for the weighted smoothed scalogram against the stationary Gaussian CARMA background noise, and provide algorithms for computing confidence levels, either analytically or with Monte Carlo Markov chain methods. All the analysis tools presented in this article are available to the reader in the Python package WAVEPAL.

  7. Time-frequency analysis of stimulus frequency otoacoustic emissions and their changes with efferent stimulation in guinea pigs

    Science.gov (United States)

    Berezina-Greene, Maria A.; Guinan, John J.

    2015-12-01

    To aid in understanding their origin, stimulus frequency otoacoustic emissions (SFOAEs) were measured at a series of tone frequencies using the suppression method, both with and without stimulation of medial olivocochlear (MOC) efferents, in anesthetized guinea pigs. Time-frequency analysis showed SFOAE energy peaks in 1-3 delay components throughout the measured frequency range (0.5-12 kHz). One component's delay usually coincided with the phase-gradient delay. When multiple delay components were present, they were usually near SFOAE dips. Below 2 kHz, SFOAE delays were shorter than predicted from mechanical measurements. With MOC stimulation, SFOAE amplitude was decreased at most frequencies, but was sometimes enhanced, and all SFOAE delay components were affected. The MOC effects and an analysis of model data suggest that the multiple SFOAE delay components arise at the edges of the traveling-wave peak, not far basal of the peak. Comparisons with published guinea-pig neural data suggest that the short latencies of low-frequency SFOAEs may arise from coherent reflection from an organ-of-Corti motion that has a shorter group delay than the traveling wave.

  8. Design and Analysis of Robust Active Damping for LCL Filters using Digital Notch Filters

    DEFF Research Database (Denmark)

    Yao, Wenli; Yang, Yongheng; Zhang, Xiaobin

    2017-01-01

    Resonant poles of LCL filters may challenge the entire system stability especially in digital-controlled Pulse Width Modulation (PWM) inverters. In order to tackle the resonance issues, many active damping solutions have been reported. For instance, a notch filter can be employed to damp the resonance, where the notch frequency should be aligned exactly to the resonant frequency of the LCL filter. However, parameter variations of the LCL filter as well as the time delay appearing in digital control systems will induce resonance drifting, and thus break this alignment, possibly deteriorating the original damping. In this paper, the effectiveness of the notch filter based active damping is firstly explored, considering the drifts of the resonant frequency. It is revealed that, when the resonant frequency drifts away from its nominal value, the phase lead or lag introduced by the notch filter may...
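
    A minimal Python sketch of the damping scheme under discussion is given below: a digital notch filter is designed at the nominal LCL resonant frequency and its remaining attenuation is checked at drifted resonant frequencies; all numeric values are illustrative assumptions, not taken from the paper.

        import numpy as np
        from scipy import signal

        fs = 10_000.0        # control/sampling frequency in Hz (assumed)
        f_res = 1_800.0      # nominal LCL resonant frequency in Hz (illustrative)
        Q = 2.0              # notch quality factor (design choice)

        b, a = signal.iirnotch(f_res, Q, fs=fs)

        # Check how much attenuation the notch still provides if the actual
        # resonance drifts away from the nominal design value.
        drifted = np.array([1500.0, 1800.0, 2100.0])
        w, h = signal.freqz(b, a, worN=drifted, fs=fs)
        print(20 * np.log10(np.abs(h)))   # gain in dB at the drifted frequencies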

  9. A low-cost, high-performance, digital signal processor-based lock-in amplifier capable of measuring multiple frequency sweeps simultaneously

    International Nuclear Information System (INIS)

    Sonnaillon, Maximiliano Osvaldo; Bonetto, Fabian Jose

    2005-01-01

    A high-performance digital lock-in amplifier implemented in a low-cost digital signal processor (DSP) board is described. This lock in is capable of measuring simultaneously multiple frequencies that change in time as frequency sweeps (chirps). The 32-bit DSP used has enough computing power to generate N=3 simultaneous reference signals and accurately measure the N=3 responses, operating as three lock ins connected in parallel to a linear system. The lock in stores the measured values in memory until they are downloaded to a personal computer (PC). The lock in works in stand-alone mode and can be programmed and configured through the PC serial port. Downsampling and multiple filter stages were used in order to obtain a sharp roll-off and a long time constant in the filters. This makes measurements possible in the presence of high noise levels. Before each measurement, the lock in performs an autocalibration that measures the frequency response of the analog output and input circuitry in order to compensate for the departure from ideal operation. Improvements from previous lock-in implementations allow measuring the frequency response of a system in a short time. Furthermore, the proposed implementation can measure how the frequency response changes with time, a characteristic that is very important in our biotechnological application. The number of simultaneous components that the lock in can generate and measure can be extended, without reprogramming, by only using other DSPs of the same family that are code compatible and work at higher clock frequencies

  10. A low-cost, high-performance, digital signal processor-based lock-in amplifier capable of measuring multiple frequency sweeps simultaneously

    Energy Technology Data Exchange (ETDEWEB)

    Sonnaillon, Maximiliano Osvaldo; Bonetto, Fabian Jose [Laboratorio de Cavitacion y Biotecnologia, San Carlos de Bariloche (8400) (Argentina)

    2005-02-01

    A high-performance digital lock-in amplifier implemented in a low-cost digital signal processor (DSP) board is described. This lock in is capable of measuring simultaneously multiple frequencies that change in time as frequency sweeps (chirps). The 32-bit DSP used has enough computing power to generate N=3 simultaneous reference signals and accurately measure the N=3 responses, operating as three lock ins connected in parallel to a linear system. The lock in stores the measured values in memory until they are downloaded to a personal computer (PC). The lock in works in stand-alone mode and can be programmed and configured through the PC serial port. Downsampling and multiple filter stages were used in order to obtain a sharp roll-off and a long time constant in the filters. This makes measurements possible in the presence of high noise levels. Before each measurement, the lock in performs an autocalibration that measures the frequency response of the analog output and input circuitry in order to compensate for the departure from ideal operation. Improvements from previous lock-in implementations allow measuring the frequency response of a system in a short time. Furthermore, the proposed implementation can measure how the frequency response changes with time, a characteristic that is very important in our biotechnological application. The number of simultaneous components that the lock in can generate and measure can be extended, without reprogramming, by only using other DSPs of the same family that are code compatible and work at higher clock frequencies.
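
    A bare-bones software lock-in, demodulating one reference frequency with quadrature mixing and low-pass filtering, is sketched below in Python; running several such demodulators in parallel with chirped references mimics the multi-sweep idea, while the DSP-specific downsampling stages and autocalibration described in the record are omitted, and the filter settings are assumptions.

        import numpy as np
        from scipy import signal

        def lock_in(x, f_ref, fs, cutoff=10.0):
            # Single-frequency digital lock-in: mix the input with quadrature
            # references at f_ref, then low-pass the products.
            t = np.arange(len(x)) / fs
            i = x * np.cos(2 * np.pi * f_ref * t)
            q = x * np.sin(2 * np.pi * f_ref * t)
            b, a = signal.butter(4, cutoff, fs=fs)          # low-pass stage (assumed order/cutoff)
            i_f = signal.filtfilt(b, a, i)
            q_f = signal.filtfilt(b, a, q)
            amplitude = 2 * np.hypot(i_f, q_f)
            phase = np.arctan2(q_f, i_f)
            return amplitude, phase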

  11. Fault tree and failure mode and effects analysis of a digital safety function

    International Nuclear Information System (INIS)

    Maskuniitty, M.; Pulkkinen, U.

    1995-01-01

    The principles of fault tree and failure mode and effects analysis (FMEA) for the analysis of digital safety functions of nuclear power plants are discussed. Based on experiences from a case study, a proposal for a full scale analysis is presented. The feasibility and applicability of the above-mentioned reliability engineering methods are discussed. (author). 13 refs, 1 fig., 2 tabs

  12. Measurement of cytokine and adhesion molecule expression in synovial tissue by digital image analysis

    NARCIS (Netherlands)

    Kraan, M. C.; Smith, M. D.; Weedon, H.; Ahern, M. J.; Breedveld, F. C.; Tak, P. P.

    2001-01-01

    Digital image analysis (DIA) offers the opportunity to quantify the stained area and staining intensity when synovial tissue (ST) is investigated by immunohistochemical analysis. This study aimed at determining the sensitivity of DIA compared with semiquantitative analysis (SQA). Paired ST samples

  13. Uncertainty Assessment of Hydrological Frequency Analysis Using Bootstrap Method

    Directory of Open Access Journals (Sweden)

    Yi-Ming Hu

    2013-01-01

    Full Text Available The hydrological frequency analysis (HFA is the foundation for the hydraulic engineering design and water resources management. Hydrological extreme observations or samples are the basis for HFA; the representativeness of a sample series to the population distribution is extremely important for the estimation reliability of the hydrological design value or quantile. However, for most of hydrological extreme data obtained in practical application, the size of the samples is usually small, for example, in China about 40~50 years. Generally, samples with small size cannot completely display the statistical properties of the population distribution, thus leading to uncertainties in the estimation of hydrological design values. In this paper, a new method based on bootstrap is put forward to analyze the impact of sampling uncertainty on the design value. By bootstrap resampling technique, a large number of bootstrap samples are constructed from the original flood extreme observations; the corresponding design value or quantile is estimated for each bootstrap sample, so that the sampling distribution of design value is constructed; based on the sampling distribution, the uncertainty of quantile estimation can be quantified. Compared with the conventional approach, this method provides not only the point estimation of a design value but also quantitative evaluation on uncertainties of the estimation.
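
    The bootstrap procedure described above can be sketched in a few lines of Python: resample the annual maximum series with replacement, refit the distribution, and collect the design quantile each time; a Gumbel (EV1) distribution is used here purely as an example of a fitted distribution, not as the paper's prescribed choice.

        import numpy as np
        from scipy import stats

        def bootstrap_quantile(ams, return_period=100, n_boot=2000, seed=0):
            # Sampling distribution of a design flood quantile by bootstrap.
            rng = np.random.default_rng(seed)
            p = 1.0 - 1.0 / return_period            # non-exceedance probability
            estimates = []
            for _ in range(n_boot):
                sample = rng.choice(ams, size=len(ams), replace=True)
                loc, scale = stats.gumbel_r.fit(sample)
                estimates.append(stats.gumbel_r.ppf(p, loc=loc, scale=scale))
            estimates = np.array(estimates)
            # 95% prediction interval and median of the design value.
            return np.percentile(estimates, [2.5, 50, 97.5])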

  14. Quantification of Uncertainty in the Flood Frequency Analysis

    Science.gov (United States)

    Kasiapillai Sudalaimuthu, K.; He, J.; Swami, D.

    2017-12-01

    Flood frequency analysis (FFA) is usually carried out for planning and designing of water resources and hydraulic structures. Owing to the existence of variability in sample representation, selection of distribution and estimation of distribution parameters, the estimation of flood quantile has been always uncertain. Hence, suitable approaches must be developed to quantify the uncertainty in the form of prediction interval as an alternate to deterministic approach. The developed framework in the present study to include uncertainty in the FFA discusses a multi-objective optimization approach to construct the prediction interval using ensemble of flood quantile. Through this approach, an optimal variability of distribution parameters is identified to carry out FFA. To demonstrate the proposed approach, annual maximum flow data from two gauge stations (Bow river at Calgary and Banff, Canada) are used. The major focus of the present study was to evaluate the changes in magnitude of flood quantiles due to the recent extreme flood event occurred during the year 2013. In addition, the efficacy of the proposed method was further verified using standard bootstrap based sampling approaches and found that the proposed method is reliable in modeling extreme floods as compared to the bootstrap methods.

  15. Assessment of homogeneity of regions for regional flood frequency analysis

    Science.gov (United States)

    Lee, Jeong Eun; Kim, Nam Won

    2016-04-01

    This paper analyzed the effect of rainfall on hydrological similarity, which is an important step for regional flood frequency analysis (RFFA). For the RFFA, the storage function method (SFM) with a spatial extension technique was applied to the 22 sub-catchments partitioned from the Chungju dam watershed in the Republic of Korea. We used the SFM to generate the annual maximum floods for the 22 sub-catchments using annual maximum storm events (1986~2010) as input data. Then the quantiles of rainfall and flood were estimated using the annual maximum series for the 22 sub-catchments. Finally, the spatial variations of the two quantiles were analyzed. As a result, there was a significant correlation between the spatial variations of the two quantiles. This result demonstrates that the spatial variation of rainfall is an important factor in explaining the homogeneity of regions when applying RFFA. Acknowledgements: This research was supported by a grant (11-TI-C06) from the Advanced Water Management Research Program funded by the Ministry of Land, Infrastructure and Transport of the Korean government.

  16. Numerical electromagnetic frequency domain analysis with discrete exterior calculus

    Science.gov (United States)

    Chen, Shu C.; Chew, Weng Cho

    2017-12-01

    In this paper, we perform a numerical analysis in frequency domain for various electromagnetic problems based on discrete exterior calculus (DEC) with an arbitrary 2-D triangular or 3-D tetrahedral mesh. We formulate the governing equations in terms of DEC for 3-D and 2-D inhomogeneous structures, and also show that the charge continuity relation is naturally satisfied. Then we introduce a general construction for signed dual volume to incorporate material information and take into account the case when circumcenters fall outside triangles or tetrahedrons, which may lead to negative dual volume without Delaunay triangulation. Then we examine the boundary terms induced by the dual mesh and provide a systematical treatment of various boundary conditions, including perfect magnetic conductor (PMC), perfect electric conductor (PEC), Dirichlet, periodic, and absorbing boundary conditions (ABC) within this method. An excellent agreement is achieved through the numerical calculation of several problems, including homogeneous waveguides, microstructured fibers, photonic crystals, scattering by a 2-D PEC, and resonant cavities.

  17. Rainfall frequency analysis for ungauged sites using satellite precipitation products

    Science.gov (United States)

    Gado, Tamer A.; Hsu, Kuolin; Sorooshian, Soroosh

    2017-11-01

    The occurrence of extreme rainfall events and their impacts on hydrologic systems and society are critical considerations in the design and management of a large number of water resources projects. As precipitation records are often limited or unavailable at many sites, it is essential to develop better methods for regional estimation of extreme rainfall at these partially-gauged or ungauged sites. In this study, an innovative method for regional rainfall frequency analysis for ungauged sites is presented. The new method (hereafter called RRFA-S) is based on corrected annual maximum series obtained from a satellite precipitation product (e.g., PERSIANN-CDR). The probability matching method (PMM) is used here for bias correction to match the CDF of satellite-based precipitation data with the gauged data. The RRFA-S method was assessed through a comparative study with the traditional index flood method using the available annual maximum series of daily rainfall in two different regions in the USA (11 sites in Colorado and 18 sites in California). The leave-one-out cross-validation technique was used to represent the ungauged site condition. Results of this numerical application show that the quantile estimates obtained from the new approach are more accurate and more robust than those given by the traditional index flood method.
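
    A generic Python sketch of the probability matching (quantile mapping) step is given below: each satellite value is mapped to the gauge value with the same empirical non-exceedance probability; the exact PMM implementation and the RRFA-S regionalisation of the paper may differ from this simplified version.

        import numpy as np

        def probability_matching(satellite, gauge):
            # Map each satellite value to the gauge value at the same empirical
            # non-exceedance probability (a quantile-mapping style correction).
            sat_sorted = np.sort(satellite)
            gauge_sorted = np.sort(gauge)
            sat_probs = (np.arange(1, len(sat_sorted) + 1) - 0.5) / len(sat_sorted)
            gauge_probs = (np.arange(1, len(gauge_sorted) + 1) - 0.5) / len(gauge_sorted)
            # Empirical CDF of each satellite value, then inverse gauge CDF.
            p = np.interp(satellite, sat_sorted, sat_probs)
            return np.interp(p, gauge_probs, gauge_sorted)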

  18. Use of historical information in extreme storm surges frequency analysis

    Science.gov (United States)

    Hamdi, Yasser; Duluc, Claire-Marie; Deville, Yves; Bardet, Lise; Rebour, Vincent

    2013-04-01

    The prevention of storm surge flood risks is critical for protection and design of coastal facilities to very low probabilities of failure. The effective protection requires the use of a statistical analysis approach having a solid theoretical motivation. Relating extreme storm surges to their frequency of occurrence using probability distributions has been a common issue since 1950s. The engineer needs to determine the storm surge of a given return period, i.e., the storm surge quantile or design storm surge. Traditional methods for determining such a quantile have been generally based on data from the systematic record alone. However, the statistical extrapolation, to estimate storm surges corresponding to high return periods, is seriously contaminated by sampling and model uncertainty if data are available for a relatively limited period. This has motivated the development of approaches to enlarge the sample extreme values beyond the systematic period. The nonsystematic data occurred before the systematic period is called historical information. During the last three decades, the value of using historical information as a nonsystematic data in frequency analysis has been recognized by several authors. The basic hypothesis in statistical modeling of historical information is that a perception threshold exists and that during a giving historical period preceding the period of tide gauging, all exceedances of this threshold have been recorded. Historical information prior to the systematic records may arise from high-sea water marks left by extreme surges on the coastal areas. It can also be retrieved from archives, old books, earliest newspapers, damage reports, unpublished written records and interviews with local residents. A plotting position formula, to compute empirical probabilities based on systematic and historical data, is used in this communication paper. The objective of the present work is to examine the potential gain in estimation accuracy with the

  19. The impact research of control modes in the steam turbine control system (digital electric hydraulic) on the low-frequency oscillation of the grid

    Directory of Open Access Journals (Sweden)

    Yanghai Li

    2016-01-01

    Full Text Available Through analysis of the control theory for steam turbines, the transfer functions of the steam turbine control modes in parallel operation were obtained. The frequency domain analysis indicated that different control modes of the turbine control system have different influences on the damping characteristics of the power system. The comparative analysis shows the direction and degree of this influence over different oscillation frequency ranges. This provides a theoretical basis for suppressing low-frequency oscillation from the turbine side and offers guidance for power system stability. The results of simulation tests are consistent with the theoretical analysis.

  20. Identification of the vital digital assets based on PSA results analysis

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Moon Kyoung; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of); Son, Han Seong [Joongbu Univiersity, Geumsan (Korea, Republic of); Kim, Hyundoo [Korea Institute of Nuclear Nonproliferation and Control, Daejeon (Korea, Republic of)

    2016-10-15

    As the main systems for the overall management of operation, control, monitoring, measurement, and safety functions in an emergency, instrumentation and control (I and C) systems in nuclear power plants have been digitalized gradually for precise operation and convenience. The digitalization of infrastructure makes systems vulnerable to cyber threats and hybrid attacks. According to ICS-CERT reports, the number of vulnerabilities in ICS industries is increasing rapidly over time. Recently, due to the digitalization of I and C, the need for cyber security in the digitalized I and C of NPPs has begun to rise. However, there are too many critical digital assets (CDAs) in NPPs. More than 60% of the total critical systems are digital systems. Addressing more than 100 security controls for each CDA requires too much effort from both licensee and inspector. It is necessary to focus on the more significant CDAs for effective regulation. Probabilistic Safety Analysis (PSA) results are analyzed in order to identify the more significant CDAs that could lead to an accident in NPPs through digital malfunction or cyber-attacks. By eliciting minimal cut sets using fault tree analyses, accident-related CDAs are identified. Also, the CDAs that must be secured from outsiders are elicited for certain accident scenarios. It is expected that effective cyber security regulation based on the graded approach can be implemented. Furthermore, defense-in-depth of digital assets for NPP safety can be built up. Digital technologies such as computers, control systems, and data networks currently play essential roles in modern NPPs. Further, the introduction of new digitalized technologies is also being considered. These digital technologies make the operation of NPPs more convenient and economical; however, they are inherently susceptible to problems such as digital malfunction of components or cyber-attacks. Recently, needs for cyber security on digitalized nuclear Instrumentation and Control (I and C

  1. Identification of the vital digital assets based on PSA results analysis

    International Nuclear Information System (INIS)

    Choi, Moon Kyoung; Seong, Poong Hyun; Son, Han Seong; Kim, Hyundoo

    2016-01-01

    As the main systems for the overall management of operation, control, monitoring, measurement, and safety functions in an emergency, instrumentation and control (I and C) systems in nuclear power plants have been digitalized gradually for precise operation and convenience. The digitalization of infrastructure makes systems vulnerable to cyber threats and hybrid attacks. According to ICS-CERT reports, the number of vulnerabilities in ICS industries is increasing rapidly over time. Recently, due to the digitalization of I and C, the need for cyber security in the digitalized I and C of NPPs has begun to rise. However, there are too many critical digital assets (CDAs) in NPPs. More than 60% of the total critical systems are digital systems. Addressing more than 100 security controls for each CDA requires too much effort from both licensee and inspector. It is necessary to focus on the more significant CDAs for effective regulation. Probabilistic Safety Analysis (PSA) results are analyzed in order to identify the more significant CDAs that could lead to an accident in NPPs through digital malfunction or cyber-attacks. By eliciting minimal cut sets using fault tree analyses, accident-related CDAs are identified. Also, the CDAs that must be secured from outsiders are elicited for certain accident scenarios. It is expected that effective cyber security regulation based on the graded approach can be implemented. Furthermore, defense-in-depth of digital assets for NPP safety can be built up. Digital technologies such as computers, control systems, and data networks currently play essential roles in modern NPPs. Further, the introduction of new digitalized technologies is also being considered. These digital technologies make the operation of NPPs more convenient and economical; however, they are inherently susceptible to problems such as digital malfunction of components or cyber-attacks. Recently, needs for cyber security on digitalized nuclear Instrumentation and Control (I and C

  2. Image analysis of microsialograms of the mouse parotid gland using digital image processing

    International Nuclear Information System (INIS)

    Yoshiura, K.; Ohki, M.; Yamada, N.

    1991-01-01

    The authors compared two digital-image feature-extraction methods for the analysis of microsialograms of the mouse parotid gland following either overfilling, experimentally induced acute sialoadenitis or irradiation. Microsialograms were digitized using a drum-scanning microdensitometer. The grey levels were then partitioned into four bands representing soft tissue, peripheral minor, middle-sized and major ducts, and run-length and histogram analyses of the digital images were performed. Serial analysis of microsialograms during progressive filling showed that both methods depicted the structural characteristics of the ducts at each grey level. However, in the experimental groups, run-length analysis showed slight changes in the peripheral duct system more clearly. This method was therefore considered more effective than histogram analysis.

  3. Elasticity analysis by MR elastography using the instantaneous frequency method

    International Nuclear Information System (INIS)

    Oshiro, Osamu; Suga, Mikio; Minato, Kotaro; Okamoto, Jun; Takizawa, Osamu; Matsuda, Tetsuya; Komori, Masaru; Takahashi, Takashi

    2000-01-01

    This paper describes a calculation method for estimating the elasticity of human organs using magnetic resonance elastography (MRE) images. The method is based on the instantaneous frequency method, which is very sensitive to noise. Therefore, the proposed method also incorporates a noise-reduction function. In the instantaneous frequency method, Fourier transform is applied to the measurement signal. Then, inverse Fourier transform is performed after the negative frequency component is set to zero. In the proposed method, noise is reduced by processing in which the positive higher frequency component is also set to zero before inverse Fourier transform is performed. First, we conducted a simulation study and confirmed the applicability of this method and the noise reduction function. Next, we carried out a phantom experiment and demonstrated that elasticity images could be generated, with the gray level corresponding to the local frequency in MRE images. (author)
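
    The calculation described above is essentially an analytic-signal estimate of the local frequency with an extra noise-reduction step. The Python sketch below zeroes the negative frequencies and, in addition, the upper part of the positive spectrum before the inverse FFT; the cut-off fraction is an assumed parameter, not a value from the paper.

        import numpy as np

        def instantaneous_frequency(x, fs, keep_fraction=0.25):
            # Local (instantaneous) frequency in Hz from the analytic signal,
            # with the upper positive frequencies suppressed for noise reduction.
            n = len(x)
            spectrum = np.fft.fft(x)
            mask = np.zeros(n)
            mask[0] = 1.0                              # keep the DC component
            mask[1:int(n * keep_fraction)] = 2.0       # keep low positive frequencies only
            analytic = np.fft.ifft(spectrum * mask)
            phase = np.unwrap(np.angle(analytic))
            return np.gradient(phase) * fs / (2 * np.pi)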

  4. Digital Predistortion of 75-110GHzW-Band Frequency Multiplier for Fiber Wireless Short Range Access Systems

    DEFF Research Database (Denmark)

    Zhao, Ying; Pang, Xiaodan; Deng, Lei

    2011-01-01

    We present a digital predistortion technique to effectively compensate the high nonlinearity of a sextuple multiplier operating at 99.6 GHz. An 18.9 dB adjacent-channel power ratio (ACPR) improvement is guaranteed and a W-band fiber-wireless system is experimentally investigated.

  5. Application on technique of joint time-frequency analysis of seismic signal's first arrival estimation

    International Nuclear Information System (INIS)

    Xu Chaoyang; Liu Junmin; Fan Yanfang; Ji Guohua

    2008-01-01

    Joint time-frequency analysis constructs a joint density function of time and frequency. It can reveal a signal's frequency components and their evolution over time, and is a further development of Fourier analysis. In this paper, according to the noise characteristics of seismic signals, an estimation method for the first arrival of a seismic signal based on the triple correlation of the joint time-frequency spectrum is introduced, and the experimental results and conclusions are presented. (authors)

  6. Corneal ablation depth readout of the MEL 80 excimer laser compared to Artemis three-dimensional very high-frequency digital ultrasound stromal measurements.

    Science.gov (United States)

    Reinstein, Dan Z; Archer, Timothy J; Gobbe, Marine

    2010-12-01

    To evaluate the accuracy of the ablation depth readout for the MEL 80 excimer laser (Carl Zeiss Meditec). Artemis 1 very high-frequency digital ultrasound measurements were obtained before and at least 3 months after LASIK in 121 eyes (65 patients). The Artemis-measured ablation depth was calculated as the maximum difference in stromal thickness before and after treatment. Laser in situ keratomileusis was performed using the MEL 80 excimer laser and the Hansatome microkeratome (Bausch & Lomb). The Aberration Smart Ablation profile was used in 56 eyes and the Tissue Saving Ablation profile was used in 65 eyes. All ablations were centered on the corneal vertex. Comparative statistics and linear regression analysis were performed between the laser readout ablation depth and Artemis-measured ablation depth. The mean maximum myopic meridian was -6.66±2.40 diopters (D) (range: -1.50 to -10.00 D) for Aberration Smart Ablation-treated eyes and -6.50±2.56 D (range: -1.34 to -11.50 D) for Tissue Saving Ablation-treated eyes. The MEL 80 readout was found to overestimate the Artemis-measured ablation depth by 20±12 μm for Aberration Smart Ablation and by 21±12 μm for Tissue Saving Ablation profiles. The accuracy of ablation depth measurement was improved by using the Artemis stromal thickness profile measurements before and after surgery to exclude epithelial changes. The MEL 80 readout was found to overestimate the achieved ablation depth. The linear regression equations could be used by MEL 80 users to adjust the ablation depth for predicted residual stromal thickness calculations without increasing the risk of ectasia due to excessive keratectomy depth as long as a suitable flap thickness bias is included. Copyright 2010, SLACK Incorporated.

  7. 3 GHz digital rf control at the superconducting Darmstadt electron linear accelerator: First results from the baseband approach and extensions for other frequencies

    Directory of Open Access Journals (Sweden)

    A. Araz

    2010-08-01

    Full Text Available The low level rf system for the superconducting Darmstadt electron linear accelerator (S-DALINAC), developed 20 years ago and operating since, converts the 3 GHz signals from the cavities down to the baseband and not to an intermediate frequency. While designing the new, digital rf control system this concept was kept: the rf module does the I/Q and amplitude modulation/demodulation while the low frequency board, housing a field-programmable gate array, analyzes and processes the signals. Recently, the flexibility of this concept was realized: by replacing the modulator/demodulators on the rf module, cavities operating at frequencies other than that of the S-DALINAC can be controlled with only minor modifications. A 6 GHz version, needed for a harmonic bunching system at the S-DALINAC, and a 324 MHz solution to be used on a room temperature cavity at GSI are currently under design. This paper reviews the concept of the digital low level rf control loops in detail and reports on the results gained during first operation with a superconducting cavity.

  8. Digital squares

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Kim, Chul E

    1988-01-01

    Digital squares are defined and their geometric properties characterized. A linear time algorithm is presented that considers a convex digital region and determines whether or not it is a digital square. The algorithm also determines the range of the values of the parameter set of its preimages....... The analysis involves transforming the boundary of a digital region into parameter space of slope and y-intercept...

  9. An analysis of low frequency noise from large wind turbines

    DEFF Research Database (Denmark)

    Pedersen, Christian Sejer; Møller, Henrik

    2010-01-01

    As wind turbines get larger, worries have emerged that the noise emitted by the turbines would move down in frequency, and that the content of low-frequency noise would be enough to cause significant annoyance for the neighbors. The sound emission from 48 wind turbines with nominal electric power ... One-third-octave-band spectra show that the relative noise emission in the 63-250 Hz frequency range is higher for turbines above 2 MW than for smaller turbines. The observations confirm a downward shift of the spectrum.

  10. Interference Rejection in Receivers by Frequency Translated Low-Pass Filtering and Digitally Enhanced Harmonic-Rejection Mixing

    NARCIS (Netherlands)

    Klumperink, Eric A.M.; Ru, Z.; Moseley, N.A.; Nauta, Bram

    2011-01-01

    Software-Defined Radio (SDR) and Cognitive Radio (CR) concepts have recently drawn considerable interest. These radio concepts built on digital signal processing to realize flexibly programmable radio transceivers, which can adapt in a smart way to their environment. As CMOS is the mainstream IC

  11. Time frequency analysis of olfactory induced EEG-power change.

    Directory of Open Access Journals (Sweden)

    Valentin Alexander Schriever

    Full Text Available The objective of the present study was to investigate the usefulness of time-frequency analysis (TFA) of olfactory-induced EEG change with a low-cost, portable olfactometer in the clinical investigation of smell function. A total of 78 volunteers participated. The study was composed of three parts in which olfactory stimuli were presented using a custom-built olfactometer. Part I was designed to optimize the stimulus as well as the recording conditions. In Part II, EEG-power changes after olfactory/trigeminal stimulation were compared between healthy participants and patients with olfactory impairment. In Part III, the test-retest reliability of the method was evaluated in healthy subjects. Part I indicated that the most effective paradigm for stimulus presentation was a cued stimulus, with an interstimulus interval of 18-20 s and a stimulus duration of 1000 ms, with each stimulus quality presented 60 times in blocks of 20 stimuli each. In Part II we found that central processing of olfactory stimuli analyzed by TFA differed significantly between healthy controls and patients, even when controlling for age. It was possible to reliably distinguish patients with olfactory impairment from healthy individuals at a high degree of accuracy (healthy controls vs anosmic patients: sensitivity 75%; specificity 89%). In addition, Part III showed a good test-retest reliability of TFA of chemosensory-induced EEG-power changes. Central processing of olfactory stimuli analyzed by TFA reliably distinguishes patients with olfactory impairment from healthy individuals at a high degree of accuracy. Importantly, this can be achieved with a simple olfactometer.

  12. Insertion torque, resonance frequency, and removal torque analysis of microimplants.

    Science.gov (United States)

    Tseng, Yu-Chuan; Ting, Chun-Chan; Du, Je-Kang; Chen, Chun-Ming; Wu, Ju-Hui; Chen, Hong-Sen

    2016-09-01

    This study aimed to compare the insertion torque (IT), resonance frequency (RF), and removal torque (RT) among three microimplant brands. Thirty microimplants of the three brands were used as follows: Type A (titanium alloy, 1.5-mm × 8-mm), Type B (stainless steel, 1.5-mm × 8-mm), and Type C (titanium alloy, 1.5-mm × 9-mm). A synthetic bone with a 2-mm cortical bone and bone marrow was used. Each microimplant was inserted into the synthetic bone, without predrilling, to a 7 mm depth. The IT, RF, and RT were measured in both vertical and horizontal directions. One-way analysis of variance and Spearman's rank correlation coefficient tests were used for intergroup and intragroup comparisons, respectively. In the vertical test, the ITs of Type C (7.8 Ncm) and Type B (7.5 Ncm) were significantly higher than that of Type A (4.4 Ncm). The RFs of Type C (11.5 kHz) and Type A (10.2 kHz) were significantly higher than that of Type B (7.5 kHz). Type C (7.4 Ncm) and Type B (7.3 Ncm) had significantly higher RTs than did Type A (4.1 Ncm). In the horizontal test, both the ITs and RTs were significantly higher for Type C, compared with Type A. No significant differences were found among the groups, and the study hypothesis was accepted. Type A had the lowest inner/outer diameter ratio and widest apical facing angle, engendering the lowest IT and highest RF values. However, no significant correlations in the IT, RF, and RT were observed among the three groups. Copyright © 2016. Published by Elsevier Taiwan.

  13. Spatial recurrence analysis: A sensitive and fast detection tool in digital mammography

    International Nuclear Information System (INIS)

    Prado, T. L.; Galuzio, P. P.; Lopes, S. R.; Viana, R. L.

    2014-01-01

    Efficient diagnosis of breast cancer requires fast digital mammographic image processing. Many breast lesions, both benign and malignant, are barely visible to the untrained eye and require accurate and reliable methods of image processing. We propose a new method of digital mammographic image analysis that meets both needs. It uses the concept of spatial recurrence as the basis of a spatial recurrence quantification analysis, the spatial extension of the well-known time recurrence analysis. The recurrence-based quantifiers highlight breast lesions as effectively as the best standard image processing methods available, but with better control over spurious fragments in the image.
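
    The record does not give the authors' exact recurrence definition, so the sketch below uses one simple, assumed variant: the fraction of pixel pairs within a sliding window whose intensities differ by less than a threshold, a spatial analogue of the recurrence rate. The image, window size and threshold are hypothetical.

    ```python
    import numpy as np

    def spatial_recurrence_rate(patch, eps=0.1):
        """Fraction of pixel pairs in a patch whose intensities are closer than eps
        (a simplified spatial analogue of the recurrence rate in time-series analysis)."""
        v = patch.ravel().astype(float)
        dist = np.abs(v[:, None] - v[None, :])        # pairwise intensity distances
        return (dist < eps).mean()

    # Hypothetical image: smooth background with a small bright "lesion-like" blob.
    rng = np.random.default_rng(0)
    img = rng.normal(0.5, 0.02, size=(64, 64))
    img[28:36, 28:36] += 0.4

    # Slide a window over the image; windows straddling the blob boundary mix two
    # intensity populations and show a markedly lower recurrence rate.
    win = 8
    rr_map = np.zeros((64 - win, 64 - win))
    for i in range(rr_map.shape[0]):
        for j in range(rr_map.shape[1]):
            rr_map[i, j] = spatial_recurrence_rate(img[i:i + win, j:j + win])
    print(rr_map.min(), rr_map.max())
    ```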

  14. Frequency of Testing for Dyslipidemia: An Evidence-Based Analysis

    Science.gov (United States)

    2014-01-01

    Background Dyslipidemias include high levels of total cholesterol, low-density lipoprotein (LDL) cholesterol, and triglycerides and low levels of high-density lipoprotein (HDL) cholesterol. Dyslipidemia is a risk factor for cardiovascular disease, which is a major contributor to mortality in Canada. Approximately 23% of the 2009/11 Canadian Health Measures Survey (CHMS) participants had a high level of LDL cholesterol, with prevalence increasing with age, and approximately 15% had a total cholesterol to HDL ratio above the threshold. Objectives To evaluate the frequency of lipid testing in adults not diagnosed with dyslipidemia and in adults on treatment for dyslipidemia. Research Methods A systematic review of the literature set out to identify randomized controlled trials (RCTs), systematic reviews, health technology assessments (HTAs), and observational studies published between January 1, 2000, and November 29, 2012, that evaluated the frequency of testing for dyslipidemia in the 2 populations. Results Two observational studies assessed the frequency of lipid testing, 1 in individuals not on lipid-lowering medications and 1 in treated individuals. Both studies were based on previously collected data intended for a different objective and, therefore, no conclusions could be reached about the frequency of testing at intervals other than the ones used in the original studies. Given this limitation and generalizability issues, the quality of evidence was considered very low. No evidence for the frequency of lipid testing was identified in the 2 HTAs included. Canadian and international guidelines recommend testing for dyslipidemia in individuals at an increased risk for cardiovascular disease. The frequency of testing recommended is based on expert consensus. Conclusions Conclusions on the frequency of lipid testing could not be made based on the 2 observational studies. Current guidelines recommend lipid testing in adults with increased cardiovascular risk, with

  15. Digitization of simulated clinical dental impressions: virtual three-dimensional analysis of exactness.

    Science.gov (United States)

    Persson, Anna S K; Odén, Agneta; Andersson, Matts; Sandborgh-Englund, Gunilla

    2009-07-01

    To compare the exactness of simulated clinical impressions and stone replicas of crown preparations, using digitization and virtual three-dimensional analysis. Three master dies (mandibular incisor, canine and molar) were prepared for full crowns, mounted in full dental arches in a plane line articulator. Eight impressions were taken using an experimental monophase vinyl polysiloxane-based material. Stone replicas were poured in type IV stone (Vel-Mix Stone; Kerr). The master dies and the stone replicas were digitized in a touch-probe scanner (Procera Forte; Nobel Biocare AB) and the impressions in a laser scanner (D250, 3Shape A/S), to create virtual models. The resulting point-clouds from the digitization of the master dies were used as CAD-Reference-Models (CRM). Discrepancies between the points in the point-clouds and the corresponding CRM were measured by a matching software (CopyCAD 6.504 SP2; Delcam Plc). The distribution of the discrepancies was analyzed and depicted on color-difference maps. The discrepancies of the digitized impressions and the stone replicas compared to the CRM were of similar size, with a mean ± SD within 40 μm, with the exception of two of the digitized molar impressions. The precision of the digitized impressions and stone replicas did not differ significantly (F=4.2; p=0.053). However, the shape affected the digitization (F=5.4; p=0.013), and the interaction effect of shape and digitization source (impression or stone replica) was pronounced (F=28); the digitized impressions varied with shape. Both impressions and stone replicas can be digitized repeatedly with a high reliability.

  16. Application of computer intensive data analysis methods to the analysis of digital images and spatial data

    DEFF Research Database (Denmark)

    Windfeld, Kristian

    1992-01-01

    Computer-intensive methods for data analysis in a traditional setting have developed rapidly in the last decade. The application and adaptation of some of these methods to the analysis of multivariate digital images and spatial data are explored, evaluated and compared to well established classical methods. ... into the projection pursuit is presented. Examples from remote sensing are given. The ACE algorithm for computing non-linear transformations for maximizing correlation is extended and applied to obtain a non-linear transformation that maximizes autocorrelation or 'signal' in a multivariate image. ... This is a generalization of the minimum/maximum autocorrelation factors (MAFs), which is a linear method. The non-linear method is compared to the linear method when analyzing a multivariate TM image from Greenland. The ACE method is shown to give a more detailed decomposition of the image than the MAF transformation...

  17. Defective pixel map creation based on wavelet analysis in digital radiography detectors

    International Nuclear Information System (INIS)

    Park, Chun Joo; Lee, Hyoung Koo; Song, William Y.; Achterkirchen, Thorsten Graeve; Kim, Ho Kyung

    2011-01-01

    The application of digital radiography detectors has attracted increasing attention in both medicine and industry. Since the imaging detectors are fabricated by semiconductor manufacturing processes over large areas, defective pixels in the detectors are unavoidable. Moreover, radiation damage due to routine use of the detectors progressively increases the density of defective pixels. In this study, we present a method of identifying defective pixels in digital radiography detectors based on wavelet analysis. Artifacts generated by the wavelet transformations are prevented by an additional local threshold method. The proposed method was applied to a sample digital radiograph and the result was promising. The method uses a single pair of dark and white images and does not require them to be gain-and-offset corrected. This method will be helpful for the reliable use of digital radiography detectors throughout their working lifetime.
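
    As a rough illustration of the idea (not the authors' algorithm), the sketch below computes single-level Haar wavelet detail coefficients of a simulated flat-field image and flags outliers; the paper's additional local threshold step is replaced here by a single global threshold, and all image parameters are assumed.

    ```python
    import numpy as np

    def haar_detail(image):
        """Single-level 2-D Haar wavelet diagonal detail (HH) coefficients of an even-sized image."""
        a = image[0::2, 0::2].astype(float)
        b = image[0::2, 1::2].astype(float)
        c = image[1::2, 0::2].astype(float)
        d = image[1::2, 1::2].astype(float)
        return (a - b - c + d) / 2.0

    # Hypothetical flat-field ("white") image with two dead pixels.
    rng = np.random.default_rng(3)
    white = rng.normal(1000.0, 10.0, size=(64, 64))
    white[10, 20] = 0.0
    white[40, 41] = 0.0

    hh = haar_detail(white)
    # A defective pixel produces an outlier in the detail band; flag coefficients far beyond
    # the noise level (global MAD-based threshold used here for brevity, whereas the paper
    # applies an additional local threshold to suppress artifacts).
    mask = np.abs(hh) > 6 * np.median(np.abs(hh)) / 0.6745
    print(np.argwhere(mask))                          # coarse (2x2-block) locations of the defects
    ```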

  18. Digital TV-echelle spectrograph for simultaneous multielemental analysis using microcomputer control

    International Nuclear Information System (INIS)

    Davidson, J.B.; Case, A.L.

    1980-12-01

    A digital TV-echelle spectrograph with microcomputer control was developed for simultaneous multielemental analysis. The optical system is a commercially available unit originally equipped for film and photomultiplier (single element) readout. The film port was adapted for the intensifier camera. The camera output is digitized and stored in a microcomputer-controlled, 512 x 512 x 12 bit memory and image processor. Multiple spectra over the range of 200 to 800 nm are recorded in a single exposure. Spectra lasting from nanoseconds to seconds are digitized and stored in 0.033 s and displayed on a TV monitor. An inexpensive microcomputer controls the exposure, reads and displays the intensity of predetermined spectral lines, and calculates wavelengths of unknown lines. The digital addresses of unknown lines are determined by superimposing a cursor on the TV display. The microcomputer also writes into memory wavelength fiducial marks for alignment of the TV camera

  19. A meta-analysis of serious digital games for healthy lifestyle promotion

    Science.gov (United States)

    Several systematic reviews have described health-promoting effects of serious games, but so far no meta-analysis has been reported. This paper presents a meta-analysis of 54 serious digital game studies for healthy lifestyle promotion, in which we investigated the overall effectiveness of serious digital games...

  20. Security Analysis of Randomize-Hash-then-Sign Digital Signatures

    DEFF Research Database (Denmark)

    Gauravaram, Praveen; Knudsen, Lars Ramkilde

    2012-01-01

    At CRYPTO 2006, Halevi and Krawczyk proposed two randomized hash function modes and analyzed the security of digital signature algorithms based on these constructions. They showed that the security of signature schemes based on the two randomized hash function modes relies on properties similar...... functions, such as for the Davies-Meyer construction used in the popular hash functions such as MD5 designed by Rivest and the SHA family of hash functions designed by the National Security Agency (NSA), USA and published by NIST in the Federal Information Processing Standards (FIPS). We show an online...... 800-106. We discuss some important applications of our attacks and discuss their applicability on signature schemes based on hash functions with ‘built-in’ randomization. Finally, we compare our attacks on randomize-hash-then-sign schemes with the generic forgery attacks on the standard hash...

  1. Improving the flash flood frequency analysis applying dendrogeomorphological evidences

    Science.gov (United States)

    Ruiz-Villanueva, V.; Ballesteros, J. A.; Bodoque, J. M.; Stoffel, M.; Bollschweiler, M.; Díez-Herrero, A.

    2009-09-01

    Flash floods are one of the natural hazards that cause major damages worldwide. Especially in Mediterranean areas they provoke high economic losses every year. In mountain areas with high stream gradients, floods events are characterized by extremely high flow and debris transport rates. Flash flood analysis in mountain areas presents specific scientific challenges. On one hand, there is a lack of information on precipitation and discharge due to a lack of spatially well distributed gauge stations with long records. On the other hand, gauge stations may not record correctly during extreme events when they are damaged or the discharge exceeds the recordable level. In this case, no systematic data allows improvement of the understanding of the spatial and temporal occurrence of the process. Since historic documentation is normally scarce or even completely missing in mountain areas, tree-ring analysis can provide an alternative approach. Flash floods may influence trees in different ways: (1) tilting of the stem through the unilateral pressure of the flowing mass or individual boulders; (2) root exposure through erosion of the banks; (3) injuries and scars caused by boulders and wood transported in the flow; (4) decapitation of the stem and resulting candelabra growth through the severe impact of boulders; (5) stem burial through deposition of material. The trees react to these disturbances with specific growth changes such as abrupt change of the yearly increment and anatomical changes like reaction wood or callus tissue. In this study, we sampled 90 cross sections and 265 increment cores of trees heavily affected by past flash floods in order to date past events and to reconstruct recurrence intervals in two torrent channels located in the Spanish Central System. The first study site is located along the Pelayo River, a torrent in natural conditions. Based on the external disturbances of trees and their geomorphological position, 114 Pinus pinaster (Ait

  2. Digital Stratigraphy: Contextual Analysis of File System Traces in Forensic Science.

    Science.gov (United States)

    Casey, Eoghan

    2017-12-28

    This work introduces novel methods for conducting forensic analysis of file allocation traces, collectively called digital stratigraphy. These in-depth forensic analysis methods can provide insight into the origin, composition, distribution, and time frame of strata within storage media. Using case examples and empirical studies, this paper illuminates the successes, challenges, and limitations of digital stratigraphy. This study also shows how understanding file allocation methods can provide insight into concealment activities and how real-world computer usage can complicate digital stratigraphy. Furthermore, this work explains how forensic analysts have misinterpreted traces of normal file system behavior as indications of concealment activities. This work raises awareness of the value of taking the overall context into account when analyzing file system traces. This work calls for further research in this area and for forensic tools to provide necessary information for such contextual analysis, such as highlighting mass deletion, mass copying, and potential backdating. © 2017 American Academy of Forensic Sciences.

  3. Developments of FPGA-based digital back-ends for low frequency antenna arrays at Medicina radio telescopes

    Science.gov (United States)

    Naldi, G.; Bartolini, M.; Mattana, A.; Pupillo, G.; Hickish, J.; Foster, G.; Bianchi, G.; Lingua, A.; Monari, J.; Montebugnoli, S.; Perini, F.; Rusticelli, S.; Schiaffino, M.; Virone, G.; Zarb Adami, K.

    In radio astronomy, Field Programmable Gate Array (FPGA) technology is widely used for the implementation of digital signal processing techniques applied to antenna arrays. This is mainly due to the good trade-off among computing resources, power consumption and cost offered by FPGA chips compared to other technologies like ASICs, GPUs and CPUs. In recent years several digital backend systems based on such devices have been developed at the Medicina radio astronomical station (INAF-IRA, Bologna, Italy). Instruments such as an FX correlator, a direct imager, a beamformer and a multi-beam system have been successfully designed and realized on CASPER (Collaboration for Astronomy Signal Processing and Electronics Research, https://casper.berkeley.edu) processing boards. In this paper we present the experience gained in this kind of application.

  4. Blind Time-Frequency Analysis for Source Discrimination in Multisensor Array Processing

    National Research Council Canada - National Science Library

    Amin, Moeness

    1999-01-01

    .... We have clearly demonstrated, through analysis and simulations, the offerings of time-frequency distributions in solving key problems in sensor array processing, including direction finding, source...

  5. Detailed Analysis of Torque Ripple in High Frequency Signal Injection based Sensor less PMSM Drives

    Directory of Open Access Journals (Sweden)

    Ravikumar Setty A.

    2017-01-01

    Full Text Available High frequency signal injection based techniques are robust and well proven for estimating the rotor position from standstill to low speed. However, the injected high frequency signal introduces high frequency harmonics in the motor phase currents and results in significant output torque ripple. No detailed analysis exists in the literature of the effect of the injected signal frequency on torque ripple. The objective of this work is to study the torque ripple resulting from high frequency signal injection in PMSM motor drives. Detailed MATLAB/Simulink simulations are carried out to quantify the torque ripple at different signal frequencies.

  6. Elementary, Middle, and High School Students Vary in Frequency and Purpose When Using Online Digital References. A review of: Silverstein, Joanne. “Just Curious: Children’s Use of Digital Reference for Unimposed Queries and Its Importance in Informal Education.” Library Trends 54.2 (Fall 2005): 228-44.

    Directory of Open Access Journals (Sweden)

    Julie Stephens

    2006-12-01

    Full Text Available Objective – To determine (1) how and with what frequency children use digital references to answer their own unimposed questions; (2) whether digital reference services support their self-initiated learning; (3) whether digital reference services support the transfer of student motivation and curiosity from the formal to the informal; and (4) what instructional and software designers should consider in creating tools that support learning. Design – Inductive analysis. Setting – Virtual Reference Desk’s (VRD) Learning Center (http://vrd.askvrd.org/) and the National Science Foundation’s (NSF) digital reference service (http://www.esteme.org) during Excellence in Science, Technology, Engineering, and Mathematics Education Week (ESTEME), April 11-16, 2005. Subjects – Elementary (K-5), middle (6-8), and high school (9-12) students from the general public. One hundred fourteen questions were analyzed; however, there is no indication of the number of different students who submitted the questions. Methods – This study was conducted using a pool of 600 questions from students, teachers, parents, and the general public that were submitted to two digital reference services intended for students. Three hundred experts in the fields of Math and Science volunteered to answer the submitted questions during Excellence in Science, Technology, Engineering, and Mathematics Education Week. Because the digital services employed a pull-down menu to describe the user as a student, teacher, parent, etc., the questions could be narrowed to those submitted by students. The questions were also narrowed to those marked as “just curious” from a question purpose menu that contained categories including “written report,” “science fair project,” and “just curious.” A total of 114 unique questions from elementary, middle, and high school students were analyzed to determine the study objectives. The 114 questions were loaded into a qualitative software

  7. Analysis of East Tank Farms Contamination Survey Frequency

    International Nuclear Information System (INIS)

    ELDER, R.E.

    2000-01-01

    This document provides the justification for the change in survey frequency in East Tank Farms occupied contamination areas from weekly to monthly. The Tank Farms Radiological Control Organization has performed radiological surveys of its Contamination Area (CA) Double Shell Tank (DST) farms in the 200 East Area on a weekly basis for several years. The task package (DST-W012) controlling these routines designates specific components, at a minimum, that must be surveyed whenever the task is performed. This document records the evaluation of these survey requirements and provides the recommendation and basis for moving DST tank farms in the 200 East Area from a weekly to a monthly contamination survey. The contamination surveys for occupied contamination areas in West Tank Farms (WTF) were changed from a weekly frequency to a monthly frequency in 1997. Review of contamination survey data in WTF indicates a monthly interval remains satisfactory.

  8. Analysis of factors correlating with medical radiological examination frequencies

    International Nuclear Information System (INIS)

    Jahnen, A.; Jaervinen, H.; Bly, R.; Olerud, H.; Vassilieva, J.; Vogiatzi, S.; Shannoun, F.

    2015-01-01

    The European Commission (EC) funded project Dose Datamed 2 (DDM2) had two objectives: to collect available data on patient doses from radiodiagnostic procedures (X-ray and nuclear medicine) in Europe, and to facilitate the implementation of the Radiation Protection 154 Guidelines (RP154). Besides the collection of frequency and dose data, two questionnaires were issued to gather information about medical radiological imaging. This article analyses possible correlations between the collected frequency data, selected variables from the results of the detailed questionnaire and national economic data. Based on a dataset of 35 countries, there is no correlation between gross domestic product (GDP) and the total number of X-ray examinations in a country. However, there is a significant correlation (p < 0.01) between GDP and the overall CT examination frequency. High-income countries perform more CT examinations per inhabitant, which suggests that planar X-ray examinations are being replaced by CT examinations. (authors)

  9. Simulation Analysis of SPWM Variable Frequency Speed Based on Simulink

    Directory of Open Access Journals (Sweden)

    Min-Yan DI

    2014-01-01

    Full Text Available This article studies sinusoidal pulse width modulation (SPWM) variable frequency speed control, currently a very active field of research, and focuses on the simulation model of the speed control system built with MATLAB/Simulink/Power System simulation tools in order to find the best approach to simulation. The model is applied to an actual conveyor belt driven by a variable-frequency motor; when the simulation results are compared with the measured data, the method proves practical and effective. The results of this research provide guidance for engineering and technical personnel in asynchronous motor SPWM VVVF CAD design.

  10. Joint time-frequency analysis of ultrasonic signal

    International Nuclear Information System (INIS)

    Oh, Sae Kyu; Nam, Ki Woo; Oh, Jung Hwan; Lee, Keun Chan; Jang, Hong Keun

    1998-01-01

    This paper examines the propagation of Lamb (or plate) waves in anisotropic laminated composite plates. The dispersion relations are explicitly derived using the classical plate theory (CLT), the first-order shear deformation theory (FSDT) and the exact solution (ES). Attention is paid to the lowest antisymmetric (flexural) and lowest symmetric (extensional) modes in the low-frequency, long-wavelength limit. Different values of the shear correction factor were tested in FSDT, and flexural wave dispersion curves were compared with exact results to assess the range of validity of the approximate plate theories in the frequency domain.
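
    For orientation, the low-frequency flexural branch predicted by classical plate theory for an isotropic plate is ω = k²·sqrt(D/(ρh)); the sketch below evaluates it for generic aluminium-like values (assumed here for illustration, since the paper treats anisotropic laminates, which require the full stiffness matrices).

    ```python
    import numpy as np

    # Assumed, generic isotropic material values (aluminium-like), not the paper's laminates.
    E, nu, rho, h = 70e9, 0.33, 2700.0, 2e-3          # Pa, -, kg/m^3, m
    D = E * h**3 / (12 * (1 - nu**2))                 # plate bending stiffness

    k = np.linspace(10, 2000, 200)                    # wavenumber in rad/m
    omega = k**2 * np.sqrt(D / (rho * h))             # CLT flexural branch: omega = k^2 * sqrt(D / (rho * h))
    phase_velocity = omega / k                        # grows with frequency: the flexural wave is dispersive
    group_velocity = 2 * omega / k                    # d(omega)/dk = 2 * k * sqrt(D / (rho * h)) = 2 * c_p
    print(phase_velocity[:3], group_velocity[:3])
    ```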

  11. Analysis of Various Frequency Selective Shielding Glass by FDTD method

    OpenAIRE

    笠嶋, 善憲; Kasashima, Yoshinori

    2012-01-01

    A frequency selective shielding (FSS) glass is a sheet of glass printed with many identical antennas, and it has high shielding properties at one specific frequency. In this work, the author analyzed the characteristics of various FSSs with different antenna types using the FDTD method. The antenna types are cross dipole, circular loop, square loop, circular patch, and square patch. As a result, FSSs can be composed of the various types of antennas, and the FSSs have broad-band shielding c...

  12. Archeological Echocardiography: Digitization and Speckle Tracking Analysis of Archival Echocardiograms in the HyperGEN Study.

    Science.gov (United States)

    Aguilar, Frank G; Selvaraj, Senthil; Martinez, Eva E; Katz, Daniel H; Beussink, Lauren; Kim, Kwang-Youn A; Ping, Jie; Rasmussen-Torvik, Laura; Goyal, Amita; Sha, Jin; Irvin, Marguerite R; Arnett, Donna K; Shah, Sanjiv J

    2016-03-01

    Several large epidemiologic studies and clinical trials have included echocardiography, but images were stored in analog format and these studies predated tissue Doppler imaging (TDI) and speckle tracking echocardiography (STE). We hypothesized that digitization of analog echocardiograms, with subsequent quantification of cardiac mechanics using STE, is feasible, reproducible, accurate, and produces clinically valid results. In the NHLBI HyperGEN study (N = 2234), archived analog echocardiograms were digitized and subsequently analyzed using STE to obtain tissue velocities/strain. Echocardiograms were assigned quality scores and inter-/intra-observer agreement was calculated. Accuracy was evaluated in: (1) a separate second study (N = 50) comparing prospective digital strain versus post hoc analog-to-digital strain, and (2) a third study (N = 95) comparing prospectively obtained TDI e' velocities with post hoc STE e' velocities. Finally, we replicated previously known associations between tissue velocities/strain, conventional echocardiographic measurements, and clinical data. Of the 2234 HyperGEN echocardiograms, 2150 (96.2%) underwent successful digitization and STE analysis. Inter/intra-observer agreement was high for all STE parameters, especially longitudinal strain (LS). In accuracy studies, LS performed best when comparing post hoc STE to prospective digital STE for strain analysis. STE-derived e' velocities correlated with, but systematically underestimated, TDI e' velocity. Several known associations between clinical variables and cardiac mechanics were replicated in HyperGEN. We also found a novel independent inverse association between fasting glucose and LS (adjusted β = -2.4 [95% CI -3.6, -1.2]% per 1-SD increase in fasting glucose). Archeological echocardiography, the digitization and speckle tracking analysis of archival echocardiograms, is feasible and generates indices of cardiac mechanics similar to contemporary studies. © 2015, Wiley Periodicals, Inc.

  13. On the use of the term 'frequency' in applied behavior analysis.

    Science.gov (United States)

    Carr, James E; Nosik, Melissa R; Luke, Molli M

    2018-04-01

    There exists a terminological problem in applied behavior analysis: the term frequency has been used as a synonym for both rate (the number of responses per time) and count (the number of responses). To guide decisions about the use and meaning of frequency, we surveyed the usage of frequency in contemporary behavior-analytic journals and textbooks and found that the predominant usage of frequency was as count, not rate. Thus, we encourage behavior analysts to use frequency as a synonym for count. © 2018 Society for the Experimental Analysis of Behavior.

  14. Digital tomosynthesis parallel imaging computational analysis with shift and add and back projection reconstruction algorithms.

    Science.gov (United States)

    Chen, Ying; Balla, Apuroop; Rayford II, Cleveland E; Zhou, Weihua; Fang, Jian; Cong, Linlin

    2010-01-01

    Digital tomosynthesis is a novel technology that has been developed for various clinical applications. The parallel imaging configuration is utilised in a few tomosynthesis imaging areas such as digital chest tomosynthesis. Recently, parallel imaging configurations for breast tomosynthesis have begun to appear as well. In this paper, we present a computational analysis of impulse response characterisation as the starting point of our research efforts to optimise parallel imaging configurations. Results suggest that computational analysis of the impulse response is an effective method to compare and optimise imaging configurations.
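
    Shift-and-add is the simpler of the two reconstruction algorithms named in the title: each projection is shifted by the parallax of the plane of interest and the shifted views are averaged. The sketch below demonstrates this on a hypothetical one-dimensional geometry with an impulse object; it is not the authors' implementation.

    ```python
    import numpy as np

    def shift_and_add(projections, shifts_px):
        """Reconstruct one plane by shifting each projection and averaging (shift-and-add).
        projections: list of 1-D detector profiles; shifts_px: integer shift per projection
        that brings the plane of interest into registration (geometry-dependent)."""
        stacked = [np.roll(p, s) for p, s in zip(projections, shifts_px)]
        return np.mean(stacked, axis=0)

    # Hypothetical geometry: a point object at detector position 40 in a plane whose parallax
    # moves it by 3 pixels per source step; 7 source positions, 128-pixel detector.
    n_views, n_px, obj_pos, parallax = 7, 128, 40, 3
    projections = []
    for v in range(n_views):
        p = np.zeros(n_px)
        p[obj_pos + parallax * (v - n_views // 2)] = 1.0   # impulse shifted by view-dependent parallax
        projections.append(p)

    # Shifting each view back by its parallax brings the plane containing the object into focus.
    shifts = [-parallax * (v - n_views // 2) for v in range(n_views)]
    plane = shift_and_add(projections, shifts)
    print(plane.argmax(), plane.max())                      # peak restored at pixel 40 with value 1.0
    ```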

  15. Lake-level frequency analysis for Devils Lake, North Dakota

    Science.gov (United States)

    Wiche, Gregg J.; Vecchia, Aldo V.

    1996-01-01

    Two approaches were used to estimate future lake-level probabilities for Devils Lake. The first approach is based on an annual lake-volume model, and the second approach is based on a statistical water mass-balance model that generates seasonal lake volumes on the basis of seasonal precipitation, evaporation, and inflow. Autoregressive moving average models were used to model the annual mean lake volume and the difference between the annual maximum lake volume and the annual mean lake volume. Residuals from both models were determined to be uncorrelated with zero mean and constant variance. However, a nonlinear relation between the residuals of the two models was included in the final annual lake-volume model. Because of high autocorrelation in the annual lake levels of Devils Lake, the annual lake-volume model was verified using annual lake-level changes. The annual lake-volume model closely reproduced the statistics of the recorded lake-level changes for 1901-93 except for the skewness coefficient. However, the model output is less skewed than the data indicate because of some unrealistically large lake-level declines. The statistical water mass-balance model requires as inputs seasonal precipitation, evaporation, and inflow data for Devils Lake. Analysis of annual precipitation, evaporation, and inflow data for 1950-93 revealed no significant trends or long-range dependence, so the input time series were assumed to be stationary and short-range dependent. Normality transformations were used to approximately maintain the marginal probability distributions, and a multivariate, periodic autoregressive model was used to reproduce the correlation structure. Each of the coefficients in the model is significantly different from zero at the 5-percent significance level. Coefficients relating spring inflow from one year to spring and fall inflows from the previous year had the largest effect on the lake-level frequency analysis. Inclusion of parameter uncertainty in the model

  16. 3D DIGITIZATION OF AN HERITAGE MASTERPIECE - A CRITICAL ANALYSIS ON QUALITY ASSESSMENT

    Directory of Open Access Journals (Sweden)

    F. Menna

    2016-06-01

    Full Text Available Despite being perceived as interchangeable when properly applied, close-range photogrammetry and range imaging both have their pros and limitations, which can be overcome using suitable procedures. Even though the two techniques have frequently been cross-compared, critical analyses discussing all sub-phases of a complex digitization project are quite rare. Comparisons taking into account the digitization of a cultural masterpiece, such as the Etruscan Sarcophagus of the Spouses (Figure 1) discussed in this paper, are even less common. The final 3D model of the Sarcophagus shows impressive spatial and texture resolution, in the order of tenths of a millimetre for both digitization techniques, making it a large 3D digital model even though the physical size of the artwork is quite limited. The paper presents the survey of the Sarcophagus, a late 6th century BC Etruscan anthropoid sarcophagus. Photogrammetry and laser scanning were used for its 3D digitization at two different times, only a few days apart. The very short time available for the digitization was a crucial constraint for the surveying operations (due to restrictions imposed on us by the museum curators). Although very high-resolution and detailed 3D models have been produced, a metric comparison between the two models shows intrinsic limitations of each technique that should be overcome through suitable on-site metric verification procedures as well as a proper processing workflow.

  17. Predicting Individual Characteristics from Digital Traces on Social Media: A Meta-Analysis.

    Science.gov (United States)

    Settanni, Michele; Azucar, Danny; Marengo, Davide

    2018-04-01

    The increasing utilization of social media provides a vast and new source of user-generated ecological data (digital traces), which can be automatically collected for research purposes. The availability of these data sets, combined with the convergence between social and computer sciences, has led researchers to develop automated methods to extract digital traces from social media and use them to predict individual psychological characteristics and behaviors. In this article, we reviewed the literature on this topic and conducted a series of meta-analyses to determine the strength of associations between digital traces and specific individual characteristics; personality, psychological well-being, and intelligence. Potential moderator effects were analyzed with respect to type of social media platform, type of digital traces examined, and study quality. Our findings indicate that digital traces from social media can be studied to assess and predict theoretically distant psychosocial characteristics with remarkable accuracy. Analysis of moderators indicated that the collection of specific types of information (i.e., user demographics), and the inclusion of different types of digital traces, could help improve the accuracy of predictions.

  18. Analysis and synthesis of digital circuits for a computer of specific purposes

    International Nuclear Information System (INIS)

    Marchand Rosales, E.E.

    1975-01-01

    The circuits described in this paper are part of a computer system designed for the automation of plasma diagnostics using electrostatic probes. The automated system is designed to give: (a) The density of the plasma (state variable) every ten microseconds in binary digits; (b) Probe data, stored for subsequent diagnostics; (c) A graphic and digital display of results; (d) Presentation of numerical diagnostics results in floating point format and in the decimal system for convenience of interpretation. The project is aimed, furthermore, at the development of techniques for the design, construction and adjustment of digital circuits, and at the training of personnel who will apply these techniques in digital instrumentation. A block diagram of the system is discussed in general terms. Methods for analysis and synthesis of the sequential circuits applied to the circuit for aligning and normalizing the floating point format, the format circuit and the operational sequence circuit are also described. Recommendations are made and precautions suggested which it is thought advisable to follow at the stages of design, construction and adjustment of the digital circuits, and these apply also to the equipment and techniques (wire wrapping) used for building the circuits. The adjustment of the digital circuits proved to be satisfactory and a definition panel was thus obtained for the decimal point alignment circuit. It is concluded that the method of synthesis need not always be applied; the cases in which the method is recommended are mentioned, as are those in which the non-formal method of synthesis can be used. (author)

  19. Analysis of nonlinear behavior of loudspeakers using the instantaneous frequency

    DEFF Research Database (Denmark)

    Huang, Hai; Jacobsen, Finn

    2003-01-01

    on the Fourier transform. In this work, a new method using the instantaneous frequency is introduced for describing and characterizing loudspeaker nonlinearities. First, numerical integration is applied to simulate the nonlinearities of loudspeakers caused by two nonlinear parameters, force factor and stiffness...
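
    A common way to compute the instantaneous frequency (the general quantity the record refers to, not necessarily the authors' exact procedure) is to differentiate the unwrapped phase of the analytic signal obtained with a Hilbert transform, as in the sketch below; the test signal and its modulation are hypothetical.

    ```python
    import numpy as np
    from scipy.signal import hilbert

    # Hypothetical test signal: a 100 Hz tone whose frequency is slightly modulated,
    # a crude stand-in for the distortion a nonlinear loudspeaker would produce.
    fs = 8000.0
    t = np.arange(0, 1.0, 1.0 / fs)
    x = np.sin(2 * np.pi * 100 * t + 0.5 * np.sin(2 * np.pi * 5 * t))

    # Analytic signal -> instantaneous phase -> instantaneous frequency.
    analytic = hilbert(x)
    phase = np.unwrap(np.angle(analytic))
    inst_freq = np.diff(phase) / (2 * np.pi) * fs     # Hz, one sample shorter than x

    # For a perfectly linear system driven by a pure tone this trace would be flat at 100 Hz;
    # deviations from a constant value are the signature such a method exploits.
    print(inst_freq.mean(), inst_freq.std())
    ```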

  20. Rectifier analysis for radio frequency energy harvesting and power transport

    NARCIS (Netherlands)

    Keyrouz, S.; Visser, H.J.; Tijhuis, A.G.

    2012-01-01

    Wireless Power Transmission (WPT) is an attractive powering method for wireless sensor nodes, battery-less sensors, and Radio-Frequency Identification (RFID) tags. The key element on the receiving side of a WPT system is the rectifying antenna (rectenna) which captures the electromagnetic power and

  1. Evaluation and Analysis of Frequency of Transformer Failures ...

    African Journals Online (AJOL)

    Abstract. The frequency of failed distribution transformers in the Power Holding Company of Nigeria Plc, Akpakpava Business Unit network, Benin City, over a period of two years has been investigated in this work. The frequent power outages recorded in our communities result in customer dissatisfaction, economic losses, ...

  2. Alcohol marketing in televised international football: frequency analysis.

    Science.gov (United States)

    Adams, Jean; Coleman, James; White, Martin

    2014-05-20

    Alcohol marketing includes sponsorship of individuals, organisations and sporting events. Football (soccer) is one of the most popular spectator sports worldwide. No previous studies have quantified the frequency of alcohol marketing in a high profile international football tournament. The aims were to determine: the frequency and nature of visual references to alcohol in a representative sample of EURO2012 matches broadcast in the UK; and if frequency or nature varied between matches broadcast on public service and commercial channels, or between matches that did and did not feature England. Eight matches selected by stratified random sampling were recorded. All visual references to alcohol were identified using a tool with high inter-rater reliability. 1846 visual references to alcohol were identified over 1487 minutes of broadcast--an average of 1.24 references per minute. The mean number of references per minute was higher in matches that did vs did not feature England (p = 0.004), but did not differ between matches broadcast on public service vs commercial channels (p = 0.92). The frequency of visual references to alcohol was universally high and higher in matches featuring the only UK home team--England--suggesting that there may be targeting of particularly highly viewed matches. References were embedded in broadcasts, and not particular to commercial channels including paid-for advertising. New UK codes-of-conduct on alcohol marketing at sporting events will not reduce the level of marketing reported here.

  3. Allele frequency analysis of Chinese chestnut ( Castanea mollissima ...

    African Journals Online (AJOL)

    The aim of this study was to establish a method for allele frequency detection in bulk samples. The abundance of polymerase chain reaction (PCR) products in bulk leaf samples was detected using fluorescent labeled Simple sequence repeat (SSR) primers and an Applied biosystems (AB) automatic DNA analyzer.

  4. Word Inventory and Frequency Analysis of French Conversations.

    Science.gov (United States)

    Malecot, Andre

    This word frequency list was extracted from a corpus of fifty half-hour conversations recorded in Paris during the academic year 1967-68. The speakers, who did not know that they were being recorded, were all well-educated professionals and all speakers of the most standard dialect of French. The list is made up of all phonetically discrete words…

  5. The analysis of cable forces based on natural frequency

    Science.gov (United States)

    Suangga, Made; Hidayat, Irpan; Juliastuti; Bontan, Darwin Julius

    2017-12-01

    A cable is a flexible structural member that is effective at resisting tensile forces. Cables are used in a variety of structures that exploit their unique characteristics to create efficient tension members. The state of the cable forces in a cable-supported structure is an important indicator of whether the structure is in good condition. Several methods have been developed to measure cable forces on site. The vibration technique, which uses the correlation between natural frequency and cable force, is a simple method for determining in situ cable forces; however, it needs accurate information on the boundary conditions, the cable mass, and the cable length. The natural frequency of the cable is determined by applying the FFT (Fast Fourier Transform) to the acceleration record of the cable. Based on the natural frequency obtained, the cable forces can then be determined analytically or with a finite element program. This research focuses on the vibration technique for determining cable forces, on understanding the effect of the physical parameters of the cable, and on modelling techniques relating the natural frequency to the cable forces.
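
    Under the simplest taut-string assumption (negligible sag and bending stiffness), the fundamental frequency f1 of a cable of length L, mass per unit length m and tension T satisfies f1 = (1/2L)·sqrt(T/m), so T = 4·m·L²·f1². The sketch below estimates f1 from a simulated acceleration record with an FFT and applies that formula; all numbers are hypothetical.

    ```python
    import numpy as np

    def cable_tension_from_frequency(f1_hz, length_m, mass_per_m):
        """Taut-string estimate T = 4 * m * L^2 * f1^2, valid for a cable with negligible
        bending stiffness and sag; real cables need corrections, as the record notes."""
        return 4.0 * mass_per_m * length_m**2 * f1_hz**2

    # Hypothetical acceleration record of a vibrating cable (fundamental near 2.5 Hz).
    fs = 200.0
    t = np.arange(0, 60.0, 1.0 / fs)
    acc = (np.sin(2 * np.pi * 2.5 * t) + 0.3 * np.sin(2 * np.pi * 5.0 * t)
           + 0.1 * np.random.randn(t.size))

    # FFT of the acceleration record; the strongest peak is taken as the fundamental here.
    spectrum = np.abs(np.fft.rfft(acc * np.hanning(acc.size)))
    freqs = np.fft.rfftfreq(acc.size, d=1.0 / fs)
    f1 = freqs[spectrum[1:].argmax() + 1]             # skip the DC bin

    tension_N = cable_tension_from_frequency(f1, length_m=80.0, mass_per_m=60.0)
    print(f"fundamental = {f1:.2f} Hz, estimated tension = {tension_N / 1e3:.0f} kN")
    ```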

  6. Digital signal processing

    CERN Document Server

    O'Shea, Peter; Hussain, Zahir M

    2011-01-01

    In three parts, this book contributes to the advancement of engineering education and serves as a general reference on digital signal processing. Part I presents the basics of analog and digital signals and systems in the time and frequency domain. It covers the core topics: convolution, transforms, filters, and random signal analysis. It also treats important applications including signal detection in noise, radar range estimation for airborne targets, binary communication systems, channel estimation, banking and financial applications, and audio effects production. Part II considers sel

  7. A new on-chip all-digital three-phase full-bridge dc/ac power inverter with feedforward and frequency control techniques.

    Science.gov (United States)

    Chen, Jiann-Jong; Kung, Che-Min

    2010-09-01

    The communication speed between components is far from satisfactory. To achieve high speed, simple control system configuration, and low cost, a new on-chip all-digital three-phase dc/ac power inverter using feedforward and frequency control techniques is proposed. The controller of the proposed power inverter, called the shift register, consists of a six-stage chain of D-latch flip-flops, with the goal of achieving low power consumption and area efficiency. Variable frequency is achieved by controlling the clocks of the shift register. One advantage regarding the data signal (D) and the common clock (CK) is that, regardless of the phase difference between the two, all of the D-latch flip-flops are capable of delaying data by one CK period. To ensure stability, the frequency of CK must be six times higher than that of D. The operation frequency of the proposed power inverter ranges from 10 Hz to 2 MHz, and the maximum output loading current is 0.8 A. The prototype of the proposed circuit has been fabricated with TSMC 0.35 μm 2P4M CMOS processes. The total chip area is 2.333 × 1.698 mm². The three-phase dc/ac power inverter is applicable in uninterruptible power supplies, cold cathode fluorescent lamps, and motors, because of its ability to convert the dc supply voltage into three-phase ac power.
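
    A behavioral sketch of the gating idea described above: a data square wave D clocked through a six-stage shift register by CK = 6·f(D) yields taps delayed in 60° steps, and every second tap gives one phase of a three-phase pattern. This is an assumed, simplified software model, not the gate-level design of the paper.

    ```python
    # Assumed, simplified behavioral model: a data square wave D at frequency f is shifted
    # through a 6-stage register on every CK edge with CK = 6f, so successive taps are
    # delayed by 1/6 of the D period (60 degrees); taps 0, 2 and 4 form a three-phase pattern.
    def simulate_shift_register(n_cycles=4):
        f_ratio = 6                                   # CK runs six times faster than D
        stages = [0] * 6                              # contents of the six D-latch stages
        history = []
        for tick in range(n_cycles * f_ratio):        # one iteration per CK edge
            d = 1 if (tick % f_ratio) < f_ratio // 2 else 0   # 50% duty-cycle data signal
            stages = [d] + stages[:-1]                # shift D into the register
            history.append(list(stages))
        return history

    hist = simulate_shift_register()
    phase_u = [row[0] for row in hist]                # tap 0: in phase with D
    phase_v = [row[2] for row in hist]                # tap 2: delayed by 120 degrees
    phase_w = [row[4] for row in hist]                # tap 4: delayed by 240 degrees
    print(phase_u)
    print(phase_v)
    print(phase_w)
    ```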

  8. Automatic analysis of quality of images from X-ray digital flat detectors

    International Nuclear Information System (INIS)

    Le Meur, Y.

    2009-04-01

    Over the last decade, medical imaging has grown with the development of new digital imaging techniques. In the field of X-ray radiography, new detectors are progressively replacing older techniques based on film or X-ray image intensifiers. These digital detectors offer higher sensitivity and reduced overall dimensions. This work has been prepared with Trixell, the world-leading company in flat detectors for medical radiography. It deals with quality control of the digital images produced by these detectors. The high quality standards of medical imaging require close analysis of the defects that can appear in the images. This work describes a complete process for the quality analysis of such images. Particular focus is given to the detection of defects, using methods well adapted to our context of spatially correlated defects in a noisy background. (author)

  9. Guidelines for reliability analysis of digital systems in PSA context. Phase 1 status report

    International Nuclear Information System (INIS)

    Authen, S.; Larsson, J.; Bjoerkman, K.; Holmberg, J.-E.

    2010-12-01

    Digital protection and control systems are appearing as upgrades in older nuclear power plants (NPPs) and are commonplace in new NPPs. To assess the risk of NPP operation and to determine the risk impact of digital system upgrades on NPPs, quantitative reliability models are needed for digital systems. Due to the many unique attributes of these systems, challenges exist in systems analysis, modeling and data collection. Currently there is no consensus on reliability analysis approaches. Traditional methods have clear limitations, but more dynamic approaches are still at the trial stage and can be difficult to apply in full-scale probabilistic safety assessments (PSA). The number of PSAs worldwide that include reliability models of digital I and C systems is small. A comparison of Nordic experiences and a literature review of the main international references have been performed in this pre-study project. The study shows a wide range of approaches, and also indicates that no state of the art currently exists. The study shows areas where the different PSAs agree and gives the basis for the development of a common taxonomy for reliability analysis of digital systems. It is still an open matter whether software reliability needs to be explicitly modelled in the PSA. The most important issue concerning software reliability is a proper description of the impact that software-based systems have on the dependence between the safety functions and on the structure of accident sequences. In general the conventional fault tree approach seems to be sufficient for modelling reactor protection system type functions. The following focus areas have been identified for further activities: (1) a common taxonomy of hardware and software failure modes of digital components for common use; (2) guidelines regarding the level of detail in system analysis and screening of components, failure modes and dependencies; (3) an approach for modelling of CCF between components (including software). (Author)

  10. Guidelines for reliability analysis of digital systems in PSA context. Phase 1 status report

    Energy Technology Data Exchange (ETDEWEB)

    Authen, S.; Larsson, J. (Risk Pilot AB, Stockholm (Sweden)); Bjoerkman, K.; Holmberg, J.-E. (VTT, Helsingfors (Finland))

    2010-12-15

    Digital protection and control systems are appearing as upgrades in older nuclear power plants (NPPs) and are commonplace in new NPPs. To assess the risk of NPP operation and to determine the risk impact of digital system upgrades on NPPs, quantitative reliability models are needed for digital systems. Due to the many unique attributes of these systems, challenges exist in systems analysis, modeling and data collection. Currently there is no consensus on reliability analysis approaches. Traditional methods have clear limitations, but more dynamic approaches are still at the trial stage and can be difficult to apply in full-scale probabilistic safety assessments (PSA). The number of PSAs worldwide that include reliability models of digital I and C systems is small. A comparison of Nordic experiences and a literature review of the main international references have been performed in this pre-study project. The study shows a wide range of approaches, and also indicates that no state of the art currently exists. The study shows areas where the different PSAs agree and gives the basis for the development of a common taxonomy for reliability analysis of digital systems. It is still an open matter whether software reliability needs to be explicitly modelled in the PSA. The most important issue concerning software reliability is a proper description of the impact that software-based systems have on the dependence between the safety functions and on the structure of accident sequences. In general the conventional fault tree approach seems to be sufficient for modelling reactor protection system type functions. The following focus areas have been identified for further activities: (1) a common taxonomy of hardware and software failure modes of digital components for common use; (2) guidelines regarding the level of detail in system analysis and screening of components, failure modes and dependencies; (3) an approach for modelling of CCF between components (including software). (Author)

  11. Digital Holographic Microscopy: Quantitative Phase Imaging and Applications in Live Cell Analysis

    Science.gov (United States)

    Kemper, Björn; Langehanenberg, Patrik; Kosmeier, Sebastian; Schlichthaber, Frank; Remmersmann, Christian; von Bally, Gert; Rommel, Christina; Dierker, Christian; Schnekenburger, Jürgen

    The analysis of complex processes in living cells creates a high demand for fast and label-free methods for online monitoring. Widely used fluorescence methods require specific labeling and are often restricted to chemically fixated samples. Thus, methods that offer label-free and minimally invasive detection of live cell processes and cell state alterations are of particular interest. In combination with light microscopy, digital holography provides label-free, multi-focus quantitative phase imaging of living cells. In overview, several methods for digital holographic microscopy (DHM) are presented. First, different experimental setups for the recording of digital holograms and the modular integration of DHM into common microscopes are described. Then the numerical processing of digitally captured holograms is explained. This includes the description of spatial and temporal phase shifting techniques, spatial filtering based reconstruction, holographic autofocusing, and the evaluation of self-interference holograms. Furthermore, the usage of partial coherent light and multi-wavelength approaches is discussed. Finally, potentials of digital holographic microscopy for quantitative cell imaging are illustrated by results from selected applications. It is shown that DHM can be used for automated tracking of migrating cells and cell thickness monitoring as well as for refractive index determination of cells and particles. Moreover, the use of DHM for label-free analysis in fluidics and micro-injection monitoring is demonstrated. The results show that DHM is a highly relevant method that allows novel insights in dynamic cell biology, with applications in cancer research and for drugs and toxicity testing.

  12. Econometric analysis of realised covariation: high frequency covariance, regression and correlation in financial economics

    OpenAIRE

    Ole E. Barndorff-Nielsen; Neil Shephard

    2002-01-01

    This paper analyses multivariate high frequency financial data using realised covariation. We provide a new asymptotic distribution theory for standard methods such as regression, correlation analysis and covariance. It will be based on a fixed interval of time (e.g. a day or week), allowing the number of high frequency returns during this period to go to infinity. Our analysis allows us to study how high frequency correlations, regressions and covariances change through time. In particular w...
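
    Realised covariation is computed by summing products of intraday returns over a fixed interval such as one trading day; realised correlations and regression slopes follow directly, as in the sketch below (synthetic return data, with a 5-minute sampling grid assumed for illustration).

    ```python
    import numpy as np

    # Hypothetical 5-minute log returns for two assets over one trading day (78 intervals).
    rng = np.random.default_rng(1)
    n = 78
    common = rng.normal(0, 0.0008, n)
    r1 = common + rng.normal(0, 0.0005, n)
    r2 = 0.8 * common + rng.normal(0, 0.0005, n)

    # Realised (co)variation over the fixed interval: sums of products of intraday returns.
    realised_cov = np.sum(r1 * r2)
    realised_var1 = np.sum(r1 ** 2)
    realised_var2 = np.sum(r2 ** 2)

    realised_corr = realised_cov / np.sqrt(realised_var1 * realised_var2)
    realised_beta = realised_cov / realised_var2      # high-frequency regression slope of r1 on r2
    print(realised_corr, realised_beta)
    ```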

  13. Low-frequency computational electromagnetics for antenna analysis

    Energy Technology Data Exchange (ETDEWEB)

    Miller, E.K. (Los Alamos National Lab., NM (USA)); Burke, G.J. (Lawrence Livermore National Lab., CA (USA))

    1991-01-01

    An overview of low-frequency, computational methods for modeling the electromagnetic characteristics of antennas is presented here. The article presents a brief analytical background, and summarizes the essential ingredients of the method of moments, for numerically solving low-frequency antenna problems. Some extensions to the basic models of perfectly conducting objects in free space are also summarized, followed by a consideration of some of the same computational issues that affect model accuracy, efficiency and utility. A variety of representative computations are then presented to illustrate various modeling aspects and capabilities that are currently available. A fairly extensive bibliography is included to suggest further reference material to the reader. 90 refs., 27 figs.

  14. Patellofemoral pain syndrome: electromyography in a frequency domain analysis

    Science.gov (United States)

    Catelli, D. S.; Kuriki, H. U.; Polito, L. F.; Azevedo, F. M.; Negrão Filho, R. F.; Alves, N.

    2011-09-01

    The patellofemoral pain syndrome (PFPS) has a multifactorial etiology and affects approximately 7 to 15% of the population, mostly women, youth, adults and active persons. PFPS causes anterior or retropatellar pain that is exacerbated during functional motor gestures, such as going up and down stairs or spending long periods of time sitting, squatting or kneeling. As the diagnostic evaluation of this syndrome is still indirect, different mechanisms and methodologies attempt to establish a classification that distinguishes patients with PFPS from asymptomatic individuals. Therefore, the purpose of this investigation was to determine the characteristics of the electromyographic (EMG) signal in the frequency domain of the vastus medialis oblique (VMO) and vastus lateralis (VL) in patients with PFPS during the ascent of stairs. 33 young women (22 control group and 11 PFPS group) were evaluated by EMG during the ascent of stairs. The VMO mean power frequency (MPF) and the VL frequency 95% (F95) were lower in symptomatic individuals. This may be related to the difference in the muscle recruitment strategy exerted by each muscle in the PFPS group compared to the control group.
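
    The two spectral descriptors used in the record, the mean power frequency (MPF) and the 95% power frequency (F95), are standard moments of the EMG power spectrum; the sketch below computes both for a synthetic EMG-like epoch (signal model and sampling rate are assumed).

    ```python
    import numpy as np

    def mpf_and_f95(signal, fs):
        """Mean power frequency and the 95% power frequency (F95) of an EMG epoch."""
        spectrum = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
        freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
        total = spectrum.sum()
        mpf = (freqs * spectrum).sum() / total            # power-weighted mean frequency
        cumulative = np.cumsum(spectrum) / total
        f95 = freqs[np.searchsorted(cumulative, 0.95)]    # frequency below which 95% of the power lies
        return mpf, f95

    # Hypothetical surface-EMG-like epoch: band-limited content centred around ~80 Hz.
    fs = 1000.0
    t = np.arange(0, 1.0, 1.0 / fs)
    rng = np.random.default_rng(2)
    emg = sum(np.sin(2 * np.pi * f * t + rng.uniform(0, 2 * np.pi)) * np.exp(-((f - 80) / 40) ** 2)
              for f in range(20, 200, 5))
    print(mpf_and_f95(emg, fs))                           # MPF near 80 Hz, F95 somewhat higher
    ```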

  15. Uses of software in digital image analysis: a forensic report

    Science.gov (United States)

    Sharma, Mukesh; Jha, Shailendra

    2010-02-01

    Forensic image analysis requires expertise to interpret the content of an image, or the image itself, in legal matters. Major sub-disciplines of forensic image analysis with law enforcement applications include photogrammetry, photographic comparison, content analysis and image authentication. It has wide applications in forensic science, ranging from documenting crime scenes to enhancing faint or indistinct patterns such as partial fingerprints. The process of forensic image analysis can involve several different tasks, regardless of the type of image analysis performed. In this paper the authors explain these tasks, which are described in three categories: Image Compression, Image Enhancement & Restoration, and Measurement Extraction. These tasks are illustrated with examples such as signature comparison, counterfeit currency comparison and footwear sole impressions, using the software Canvas and Corel Draw.

  16. The Application of the Frequency Divider Circuit in Nuclear Electronics

    International Nuclear Information System (INIS)

    LIU Hefan; Zeng Bing; Zhang Ziliang; Ge Liangquan

    2009-01-01

    Different components in a digital system often need different working frequencies, and a common way to obtain them is to divide the system clock. Based on an analysis of the frequency divider principle, an applied integer frequency-dividing circuit built around the SE120A is proposed. It can divide the input frequency by factors from 2 to 64 and is often used in nuclear electronics. Testing and analysis show that it is noise-free and provides a good frequency-division effect and stability. (authors)
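
    A behavioural sketch of the integer division idea (counting input edges and emitting one output pulse every n edges) is given below; it does not model the SE120A itself, and all names are illustrative.

```python
def divide_clock(clock_edges, n):
    """Divide an input clock by integer n: emit one output pulse
    every n rising edges of the input clock."""
    count = 0
    out = []
    for edge in clock_edges:
        count += edge
        if count == n:
            out.append(1)      # one output pulse
            count = 0
        else:
            out.append(0)
    return out

# Dividing a train of 64 input edges by 8 yields 8 output pulses
print(sum(divide_clock([1] * 64, 8)))   # 8
```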

  17. Tolerance of image enhancement brightness and contrast in lateral cephalometric digital radiography for Steiner analysis

    Science.gov (United States)

    Rianti, R. A.; Priaminiarti, M.; Syahraini, S. I.

    2017-08-01

    Image enhancement brightness and contrast can be adjusted on lateral cephalometric digital radiographs to improve image quality and the visibility of anatomic landmarks for measurement by Steiner analysis. The aim was to determine the limit value for adjustments of image enhancement brightness and contrast in lateral cephalometric digital radiography for Steiner analysis. Image enhancement brightness and contrast were adjusted on 100 lateral cephalometric radiographs in 10-point increments (-30, -20, -10, 0, +10, +20, +30). Steiner analysis measurements were then performed by two observers. Reliability was tested by the Intraclass Correlation Coefficient (ICC) and significance by ANOVA or the Kruskal-Wallis test. No significant differences were detected in lateral cephalometric analysis measurements following adjustment of the image enhancement brightness and contrast. Adjustment of image enhancement brightness and contrast within the tested 10-point increments (-30, -20, -10, 0, +10, +20, +30) does not affect the results of Steiner analysis.

  18. Digital radiography

    DEFF Research Database (Denmark)

    Precht, H; Gerke, O; Rosendahl, K

    2012-01-01

    BACKGROUND: New developments in processing of digital radiographs (DR), including multi-frequency processing (MFP), allow optimization of image quality and radiation dose. This is particularly promising in children as they are believed to be more sensitive to ionizing radiation than adults....... OBJECTIVE: To examine whether the use of MFP software reduces the radiation dose without compromising quality at DR of the femur in 5-year-old-equivalent anthropomorphic and technical phantoms. MATERIALS AND METHODS: A total of 110 images of an anthropomorphic phantom were imaged on a DR system (Canon DR...... with CXDI-50 C detector and MLT[S] software) and analyzed by three pediatric radiologists using Visual Grading Analysis. In addition, 3,500 images taken of a technical contrast-detail phantom (CDRAD 2.0) provide an objective image-quality assessment. RESULTS: Optimal image-quality was maintained at a dose...

  19. Remaking Poems: Combining Translation and Digital Media to Interest High School Students in Poetry Analysis

    Science.gov (United States)

    Simpson, Amy Beth

    2017-01-01

    In American high schools, the practice of poetry analysis as a study of language art has declined. Outworn methods have contributed to the trend away from close interactions with the text, to the unfortunate end that millennial high school students neither understand nor enjoy poetry. Digital technology coupled with principles of translation…

  20. Extracting topographic structure from digital elevation data for geographic information-system analysis

    Science.gov (United States)

    Jenson, Susan K.; Domingue, Julia O.

    1988-01-01

    Software tools have been developed at the U.S. Geological Survey's EROS Data Center to extract topographic structure and to delineate watersheds and overland flow paths from digital elevation models. The tools are special-purpose FORTRAN programs interfaced with general-purpose raster and vector spatial analysis and relational database management packages.
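
    The flow-routing step behind such tools is commonly the D8 rule, which assigns each cell the direction of steepest descent among its eight neighbours; the sketch below illustrates only that step (depression filling and watershed labelling are omitted) and is not the USGS FORTRAN code.

```python
import numpy as np

# D8 neighbour offsets and their conventional direction codes
OFFSETS = [(-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1), (-1, -1)]
CODES   = [64, 128, 1, 2, 4, 8, 16, 32]

def d8_flow_direction(dem):
    """Steepest-descent (D8) flow direction code for each interior cell of a DEM."""
    dem = np.asarray(dem, dtype=float)
    rows, cols = dem.shape
    direction = np.zeros((rows, cols), dtype=int)
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            drops = []
            for (di, dj), code in zip(OFFSETS, CODES):
                dist = np.hypot(di, dj)                     # diagonal neighbours are farther
                drops.append(((dem[i, j] - dem[i + di, j + dj]) / dist, code))
            best_drop, best_code = max(drops)
            direction[i, j] = best_code if best_drop > 0 else 0   # 0 marks pits and flats
    return direction
```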

  1. Digital image correlation in analysis of stiffness in local zones of welded joints

    Czech Academy of Sciences Publication Activity Database

    Milosevic, M.; Milosevic, N.J.; Sedmak, S.; Tatic, U.; Mitrovic, N.; Hloch, Sergej; Jovicic, R.

    2016-01-01

    Vol. 23, No. 1 (2016), pp. 19-24, ISSN 1330-3651. Institutional support: RVO:68145535. Keywords: Aramis software * digital image correlation * strain analysis * stiffness * welded joints. Subject RIV: JQ - Machines; Tools. Impact factor: 0.723, year: 2016. http://hrcak.srce.hr/file/225545

  2. Peer Assessment in the Digital Age: A Meta-Analysis Comparing Peer and Teacher Ratings

    Science.gov (United States)

    Li, Hongli; Xiong, Yao; Zang, Xiaojiao; Kornhaber, Mindy L.; Lyu, Youngsun; Chung, Kyung Sun; Suen, Hoi K.

    2016-01-01

    Given the wide use of peer assessment, especially in higher education, the relative accuracy of peer ratings compared to teacher ratings is a major concern for both educators and researchers. This concern has grown with the increase of peer assessment in digital platforms. In this meta-analysis, using a variance-known hierarchical linear modelling…

  3. Automatic morphometry of synaptic boutons of cultured cells using granulometric analysis of digital images

    NARCIS (Netherlands)

    Prodanov, D.P.; Heeroma, Joost; Marani, Enrico

    2006-01-01

    Numbers, linear density, and surface area of synaptic boutons can be important parameters in studies on synaptic plasticity in cultured neurons. We present a method for automatic identification and morphometry of boutons based on filtering of digital images using granulometric analysis. Cultures of

  4. Concurrent Development and Cost-Benefit Analysis of Paper-Based and Digitized Instructional Material.

    Science.gov (United States)

    Annand, David

    2002-01-01

    Describes the simultaneous development of paper-based and digitized versions of a textbook and related instructional material used in an undergraduate, independent study, distance education course at Athabasca University (Canada). Used break-even analysis as an initial evaluation measure to determine cost-effectiveness, and discusses the next…

  5. The Role of Business Agreements in Defining Textbook Affordability and Digital Materials: A Document Analysis

    Science.gov (United States)

    Raible, John; deNoyelles, Aimee

    2015-01-01

    Adopting digital materials such as eTextbooks and e-coursepacks is a potential strategy to address textbook affordability in the United States. However, university business relationships with bookstore vendors implicitly structure which instructional resources are available and in what manner. In this study, a document analysis was conducted on…

  6. Digital Games, Design, and Learning: A Systematic Review and Meta-Analysis

    Science.gov (United States)

    Clark, Douglas B.; Tanner-Smith, Emily E.; Killingsworth, Stephen S.

    2016-01-01

    In this meta-analysis, we systematically reviewed research on digital games and learning for K-16 students. We synthesized comparisons of game versus nongame conditions (i.e., media comparisons) and comparisons of augmented games versus standard game designs (i.e., value-added comparisons). We used random-effects meta-regression models with robust…

  7. Full-field wrist pulse signal acquisition and analysis by 3D Digital Image Correlation

    Science.gov (United States)

    Xue, Yuan; Su, Yong; Zhang, Chi; Xu, Xiaohai; Gao, Zeren; Wu, Shangquan; Zhang, Qingchuan; Wu, Xiaoping

    2017-11-01

    Pulse diagnosis is one of the four basic diagnostic methods (inspection, listening, inquiring and palpation) in traditional Chinese medicine. Because it depends on long training and rich experience, computerized pulse acquisition has been proposed and studied to ensure objectivity. To imitate the process in which doctors use three fingertips with different pressures to feel fluctuations in areas containing three acupoints, we established a five-dimensional pulse signal acquisition system that adopts a non-contacting optical metrology method, 3D digital image correlation, to record the full-field displacements of skin fluctuations under different pressures. The system realizes real-time full-field vibration mode observation at 10 FPS, and the maximum sample frequency is 472 Hz for detailed post-processing. After acquisition, the signals are analyzed according to amplitude, pressure, and pulse wave velocity. The proposed system provides a novel optical approach for digitizing pulse diagnosis and for massive pulse signal data acquisition from various types of patients.

  8. Digital data acquisition for laser radar for vibration analysis

    OpenAIRE

    Montes, Felix G.

    1998-01-01

    Approved for public release; distribution is unlimited. Laser radar for vibration analysis represents a military application aimed at developing a future target identification system. The problem addressed is how to analyze the vibrations of a target illuminated by the laser radar to achieve a positive identification. This thesis develops a computer-based data acquisition and analysis system for improving the laser radar capability. Specifically, a review is made of the CO2 laser radar, coher...

  9. Bilinear Time-frequency Analysis for Lamb Wave Signal Detected by Electromagnetic Acoustic Transducer

    Science.gov (United States)

    Sun, Wenxiu; Liu, Guoqiang; Xia, Hui; Xia, Zhengwu

    2018-03-01

    Accurate acquisition of the detection signal travel time plays a very important role in cross-hole tomography. The experimental platform of aluminum plate under the perpendicular magnetic field is established and the bilinear time-frequency analysis methods, Wigner-Ville Distribution (WVD) and the pseudo-Wigner-Ville distribution (PWVD), are applied to analyse the Lamb wave signals detected by electromagnetic acoustic transducer (EMAT). By extracting the same frequency component of the time-frequency spectrum as the excitation frequency, the travel time information can be obtained. In comparison with traditional linear time-frequency analysis method such as short-time Fourier transform (STFT), the bilinear time-frequency analysis method PWVD is more appropriate in extracting travel time and recognizing patterns of Lamb wave.
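
    For reference, a bare-bones discrete Wigner-Ville distribution of a real signal can be computed from its analytic signal as sketched below; this is a generic illustration rather than the authors' EMAT processing code, and the frequency scaling assumes unit lag steps.

```python
import numpy as np
from scipy.signal import hilbert

def wigner_ville(x, fs):
    """Discrete Wigner-Ville distribution (rows: frequency bins, columns: time)."""
    z = hilbert(np.asarray(x, dtype=float))    # analytic signal suppresses some cross terms
    n = len(z)
    wvd = np.zeros((n, n))
    for t in range(n):
        tau_max = min(t, n - 1 - t)
        tau = np.arange(-tau_max, tau_max + 1)
        kernel = np.zeros(n, dtype=complex)
        kernel[tau % n] = z[t + tau] * np.conj(z[t - tau])
        wvd[:, t] = np.fft.fft(kernel).real    # real because the lag kernel is conjugate-symmetric
    freqs = np.arange(n) * fs / (2 * n)        # WVD frequency axis is half the usual FFT axis
    return wvd, freqs
```

    The ridge of this distribution at the excitation frequency is what a travel-time extraction step would then follow along the time axis.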

  10. Context-based coding of bilevel images enhanced by digital straight line analysis

    DEFF Research Database (Denmark)

    Aghito, Shankar Manuel; Forchhammer, Søren

    2006-01-01

    , or segmentation maps are also encoded efficiently. The algorithm is not targeted at document images with text, which can be coded efficiently with dictionary-based techniques as in JBIG2. The scheme is based on a local analysis of the digital straightness of the causal part of the object boundary, which is used...... in the context definition for arithmetic encoding. Tested on individual images of standard TV resolution binary shapes and the binary layers of a digital map, the proposed algorithm outperforms PWC, JBIG, JBIG2, and MPEG-4 CAE. On the binary shapes, the code lengths are reduced by 21%, 27 %, 28 %, and 41...

  11. Digital Holography, a metrological tool for quantitative analysis: Trends and future applications

    Science.gov (United States)

    Paturzo, Melania; Pagliarulo, Vito; Bianco, Vittorio; Memmolo, Pasquale; Miccio, Lisa; Merola, Francesco; Ferraro, Pietro

    2018-05-01

    A review on the last achievements of Digital Holography is reported in this paper, showing that this powerful method can be a key metrological tool for the quantitative analysis and non-invasive inspection of a variety of materials, devices and processes. Nowadays, its range of applications has been greatly extended, including the study of live biological matter and biomedical applications. This paper overviews the main progresses and future perspectives of digital holography, showing new optical configurations and investigating the numerical issues to be tackled for the processing and display of quantitative data.

  12. Automatic analysis of digitized TV-images by a computer-driven optical microscope

    International Nuclear Information System (INIS)

    Rosa, G.; Di Bartolomeo, A.; Grella, G.; Romano, G.

    1997-01-01

    New methods of image analysis and three-dimensional pattern recognition were developed in order to perform the automatic scan of nuclear emulsion pellicles. An optical microscope, with a motorized stage, was equipped with a CCD camera and an image digitizer, and interfaced to a personal computer. Selected software routines inspired the design of a dedicated hardware processor. Fast operation, high efficiency and accuracy were achieved. First applications to high-energy physics experiments are reported. Further improvements are in progress, based on a high-resolution fast CCD camera and on programmable digital signal processors. Applications to other research fields are envisaged. (orig.)

  13. ENEA-Frascati inertial confinement fusion: Multi-channel digitizer control, acquisition and analysis system

    Energy Technology Data Exchange (ETDEWEB)

    Caruso, A.; Strangio, C.; Elman, G.; Hugh, J. Mc; Rogers, G.; Pastina, E.; Raimondi, F.; Umbro, A.

    1991-09-01

    The ICF (Inertial Confinement Fusion) data acquisition and analysis system described in this paper incorporates digitizer channels, signal switching, an instrument controller, a graphic hardcopy unit, laser printer and software. The digitizers, signal switching and instrument controller are standard components appropriate to acquire the single fast shot signals (rise-time: from about 100 picoseconds to nanoseconds). The input signals are switched to the digitizers through the TSI-8150 test system interface by TSS40 switch controller cards and TSS46 18GHz microwave switches. Graphic hardcopy is accomplished using either a 4693PX colour hardcopy unit or a laser printer connected through a printer spooler/multiplexer. The software is based on an existing package from Ressler called RAI/DAC reviewed by Fus-Inerz and Tektronix-Italy to define the modifications needed for data acquisition and handling. The adopted software solution is based on an IBM PC compatible instrument controller running software developed by Ressler Associates.

  14. Digital Imaging Analysis for the Study of Endotoxin-Induced Mitochondrial Ultrastructure Injury

    Directory of Open Access Journals (Sweden)

    Mandar S. Joshi

    2000-01-01

    Full Text Available Primary defects in mitochondrial function have been implicated in over 100 diverse diseases. In situ, mitochondria possess unique and well-defined morphology in normal healthy cells, but diseases linked to defective mitochondrial function are characterized by the presence of morphologically abnormal and swollen mitochondria with distorted cristae. In situ study of mitochondrial morphology is established as an indicator of mitochondrial health, but thus far assessments have been via subjective evaluations by trained observers using discontinuous scoring systems. Here we investigated the value of digital imaging analysis to provide unbiased, reproducible, and convenient evaluations of mitochondrial ultrastructure. Electron photomicrographs of ileal mucosal mitochondria were investigated using a scoring system previously described by us, and also analyzed digitally by using six digital parameters which define size, shape, and electron density characteristics of over 700 individual mitochondria. Statistically significant changes in mitochondrial morphology were detected in LPS-treated animals relative to vehicle control using both the subjective scoring system and the digital imaging parameters (p < 0.05). However, the imaging approach provided convenient and high-throughput capabilities and was easily automated to remove investigator influences. These results illustrate significant changes in ileal mucosal mitochondrial ultrastructure during sepsis and demonstrate the value of digital imaging technology for routine assessments in this setting.
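
    The record does not list the six parameters themselves; as a generic illustration only, size, shape and intensity descriptors of segmented organelles can be extracted with scikit-image region properties, as sketched below (the Otsu thresholding and all names are assumptions).

```python
from skimage import filters, measure

def mitochondrial_morphometry(em_image):
    """Per-object size, shape and electron-density descriptors from an EM image."""
    threshold = filters.threshold_otsu(em_image)
    labels = measure.label(em_image < threshold)          # assume organelles darker than background
    props = measure.regionprops(labels, intensity_image=em_image)
    return [
        dict(area=p.area,
             perimeter=p.perimeter,
             eccentricity=p.eccentricity,
             solidity=p.solidity,
             mean_density=p.mean_intensity)
        for p in props
    ]
```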

  15. Noise-shaping all-digital phase-locked loops modeling, simulation, analysis and design

    CERN Document Server

    Brandonisio, Francesco

    2014-01-01

    This book presents a novel approach to the analysis and design of all-digital phase-locked loops (ADPLLs), technology widely used in wireless communication devices. The authors provide an overview of ADPLL architectures, time-to-digital converters (TDCs) and noise shaping. Realistic examples illustrate how to analyze and simulate phase noise in the presence of sigma-delta modulation and time-to-digital conversion. Readers will gain a deep understanding of ADPLLs and the central role played by noise-shaping. A range of ADPLL and TDC architectures are presented in unified manner. Analytical and simulation tools are discussed in detail. Matlab code is included that can be reused to design, simulate and analyze the ADPLL architectures that are presented in the book.   • Discusses in detail a wide range of all-digital phase-locked loops architectures; • Presents a unified framework in which to model time-to-digital converters for ADPLLs; • Explains a procedure to predict and simulate phase noise in oscil...

  16. Carbon financial markets: A time-frequency analysis of CO2 prices

    Science.gov (United States)

    Sousa, Rita; Aguiar-Conraria, Luís; Soares, Maria Joana

    2014-11-01

    We characterize the interrelation of CO2 prices with energy prices (electricity, gas and coal), and with economic activity. Previous studies have relied on time-domain techniques, such as Vector Auto-Regressions. In this study, we use multivariate wavelet analysis, which operates in the time-frequency domain. Wavelet analysis provides convenient tools to distinguish relations at particular frequencies and at particular time horizons. Our empirical approach has the potential to identify relations getting stronger and then disappearing over specific time intervals and frequencies. We are able to examine the coherency of these variables and lead-lag relations at different frequencies for the time periods in focus.
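
    A minimal sketch of the underlying machinery, a continuous wavelet transform and a cross-wavelet power map computed with PyWavelets, is shown below; the paper's coherency and phase-difference analysis involves additional smoothing not reproduced here, and the wavelet choice, scales and series names are illustrative.

```python
import numpy as np
import pywt

def cross_wavelet_power(x, y, dt, scales=np.arange(1, 128)):
    """|W_x * conj(W_y)|: joint time-frequency power of two series."""
    wx, freqs = pywt.cwt(x, scales, 'cmor1.5-1.0', sampling_period=dt)
    wy, _     = pywt.cwt(y, scales, 'cmor1.5-1.0', sampling_period=dt)
    return np.abs(wx * np.conj(wy)), freqs

# Example with synthetic placeholders for daily CO2 and electricity price changes
rng = np.random.default_rng(2)
co2, power_price = rng.normal(size=(2, 500))
xwt, freqs = cross_wavelet_power(co2, power_price, dt=1.0)
```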

  17. Ultra-high performance, solid-state, autoradiographic image digitization and analysis system

    International Nuclear Information System (INIS)

    Lear, J.L.; Pratt, J.P.; Ackermann, R.F.; Plotnick, J.; Rumley, S.

    1990-01-01

    We developed a Macintosh II-based, charge-coupled device (CCD), image digitization and analysis system for high-speed, high-resolution quantification of autoradiographic image data. A linear CCD array with 3,500 elements was attached to a precision drive assembly and mounted behind a high-uniformity lens. The drive assembly was used to sweep the array perpendicularly to its axis so that an entire 20 x 25-cm autoradiographic image-containing film could be digitized into 256 gray levels at 50-microns resolution in less than 30 sec. The scanner was interfaced to a Macintosh II computer through a specially constructed NuBus circuit board and software was developed for autoradiographic data analysis. The system was evaluated by scanning individual films multiple times, then measuring the variability of the digital data between the different scans. Image data were found to be virtually noise free. The coefficient of variation averaged less than 1%, a value significantly exceeding the accuracy of both high-speed, low-resolution, video camera (VC) systems and low-speed, high-resolution, rotating drum densitometers (RDD). Thus, the CCD scanner-Macintosh computer analysis system offers the advantage over VC systems of the ability to digitize entire films containing many autoradiograms, but with much greater speed and accuracy than achievable with RDD scanners
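
    Repeatability of this kind can be summarised by a pixel-wise coefficient of variation across repeated digitisations of the same film, e.g. as sketched below (a generic illustration, not the original Macintosh software).

```python
import numpy as np

def scan_repeatability(scans):
    """Average pixel-wise coefficient of variation (%) across repeated scans.

    `scans` is an array of shape (n_scans, height, width) of gray levels.
    """
    scans = np.asarray(scans, dtype=float)
    mean = scans.mean(axis=0)
    std = scans.std(axis=0, ddof=1)
    cv = np.where(mean > 0, 100.0 * std / mean, 0.0)
    return cv.mean()          # average CV over the film, in per cent
```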

  18. Real-time visualization and analysis of airflow field by use of digital holography

    Science.gov (United States)

    Di, Jianglei; Wu, Bingjing; Chen, Xin; Liu, Junjiang; Wang, Jun; Zhao, Jianlin

    2013-04-01

    The measurement and analysis of airflow fields are very important in fluid dynamics. For airflow, smoke particles can be added to visually observe turbulence phenomena with particle tracking technology, but the limited ability of smoke particles to follow high-speed airflow reduces the measurement accuracy. In recent years, with the advantages of non-contact, nondestructive, fast and full-field measurement, digital holography has been widely applied in many fields, such as deformation and vibration analysis, particle characterization, refractive index measurement, and so on. In this paper, we present a method to measure the airflow field by use of digital holography. A small wind tunnel model made of acrylic glass is built to control the velocity and direction of airflow. Samples of different shapes, such as an aircraft wing and a cylinder, are placed in the wind tunnel model to produce different forms of flow field. With a Mach-Zehnder interferometer setup, a series of digital holograms carrying the information of airflow field distributions in different states are recorded by a CCD camera, and the corresponding holographic images are numerically reconstructed from the holograms by computer. Then we can conveniently obtain the velocity or pressure information of the airflow deduced from the quantitative phase information of the holographic images and visually display the airflow field and its evolution in the form of a movie. The theoretical and experimental results show that digital holography is a robust and feasible approach for real-time visualization and analysis of airflow fields.
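
    Numerical reconstruction of such holograms is often done by propagating the recorded field; a compact angular-spectrum propagation sketch is given below as a generic illustration (wavelength, pixel pitch and names are placeholders, and the filtering of the DC and twin-image terms is omitted).

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a complex hologram field by distance z (angular spectrum method)."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=dx)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    transfer = np.exp(1j * kz * z) * (arg > 0)        # evanescent components dropped
    return np.fft.ifft2(np.fft.fft2(field) * transfer)

# The phase of the propagated field, e.g.
#   np.angle(angular_spectrum_propagate(hologram, 632.8e-9, 4.65e-6, 0.1)),
# carries the airflow-induced optical path changes.
```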

  19. Analysis and improvement of digital control stability for master-slave manipulator system

    International Nuclear Information System (INIS)

    Yoshida, Koichi; Yabuta, Tetsuro

    1992-01-01

    Several bilateral controls for master-slave systems have been designed that can realize high-fidelity telemanipulation, as if the operator were manipulating the object directly. While most robot systems are controlled by a software servo system using a digital computer, little work has been published on the design and analysis of digital control for these systems, which must consider the time delay of sensor signals and the zero-order-hold effect of command signals on actuators. This paper presents a digital control analysis for a single-degree-of-freedom master-slave system including impedance models of both the human operator and the task object, which clarifies an index for stability. The stability result leads to a virtual master-slave system concept, which improves the digital control stability. We first analyze a dynamic control method of the master-slave system in discrete time with respect to the stability problem, a method that can realize high-fidelity telemanipulation in continuous time. Secondly, using the results of the stability analysis, a robust control scheme for the master-slave system is proposed, and the validity of this scheme is finally confirmed by simulation. Consequently, any combination of master and slave modules with dynamic models of these manipulators can be used to construct a stable master-slave system. (author)

  20. The Digital Single Market and Legal Certainty: A Critical Analysis

    NARCIS (Netherlands)

    Castermans, A.G.; Graaff, de R.; Haentjens, M.; Colombi, Ciacchi A.

    2016-01-01

    This chapter critically examines the CESL from the viewpoint of its capability to provide legal certainty for commercial actors. This chapter’s analysis focuses on three important stages in the life cycle of a contract, seen from a business perspective: the scope rules that determine whether the

  1. The impact of the microphone position on the frequency analysis of snoring sounds.

    Science.gov (United States)

    Herzog, Michael; Kühnel, Thomas; Bremert, Thomas; Herzog, Beatrice; Hosemann, Werner; Kaftan, Holger

    2009-08-01

    Frequency analysis of snoring sounds has been reported as a diagnostic tool to differentiate between different sources of snoring. Several studies have been published presenting diverging results of the frequency analyses of snoring sounds. Depending on the position of the microphones used, the results of the frequency analysis of snoring sounds vary. The present study investigated the influence of different microphone positions on the outcome of the frequency analysis of snoring sounds. Nocturnal snoring was recorded simultaneously at six positions (air-coupled: 30 cm middle, 100 cm middle, 30 cm lateral to both sides of the patient's head; body contact: neck and parasternal) in five patients. The microphones used had a flat frequency response and a similar frequency range (10/40 Hz-18 kHz). Frequency analysis was performed by fast Fourier transformation, and frequency bands as well as peak intensities (Peaks 1-5) were detected. Air-coupled microphones presented a wider frequency range (60 Hz-10 kHz) compared to contact microphones. The contact microphone at the cervical position presented a cut-off at frequencies above 300 Hz, whereas the contact microphone at the parasternal position revealed a cut-off above 100 Hz. On an exemplary basis, the study demonstrates that frequencies above 1,000 Hz do appear in complex snoring patterns, and it is emphasised that high frequencies are important for the interpretation of snoring sounds with respect to the identification of the source of snoring. Contact microphones might be used in screening devices, but for a natural analysis of snoring sounds the use of air-coupled microphones is indispensable.
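
    As a generic illustration of this kind of frequency analysis (not the authors' software), the dominant spectral peaks of one snoring epoch can be read off a windowed FFT magnitude spectrum:

```python
import numpy as np
from scipy.signal import find_peaks

def snore_peak_frequencies(signal, fs, n_peaks=5):
    """Return the n_peaks most prominent spectral peaks (Hz) of one snoring epoch."""
    signal = np.asarray(signal, dtype=float)
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    peaks, props = find_peaks(spectrum, prominence=0)       # prominence computed for all peaks
    strongest = peaks[np.argsort(props['prominences'])[::-1][:n_peaks]]
    return np.sort(freqs[strongest])
```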

  2. Analysis of data flow and activities at radiology reporting stations for design and evaluation of digital work stations

    International Nuclear Information System (INIS)

    Mun, S.K.; Benson, H.; Welsh, C.; Elliott, L.P.; Zeman, R.

    1987-01-01

    Definition of the necessary and desirable functional capabilities of PACS workstations is critical in the design of digital systems for the successful clinical acceptance of digital imaging networks. The authors conducted a detailed time-motion study of data flow patterns, diagnostic decision making, and reporting activities at current film alternators for neuroradiology, body CT, and pulmonary service. The measured parameters include data volume, data presentation speed, frequency of use of previous studies, effort needed to retrieve previous studies, time required for diagnosis, frequency and duration of consultation with referring physicians, frequency of interruptions, and dictation time and efficiency. The results of this study provide critical information for designing digital workstations for various services.

  3. A comparison of hair colour measurement by digital image analysis with reflective spectrophotometry.

    Science.gov (United States)

    Vaughn, Michelle R; van Oorschot, Roland A H; Baindur-Hudson, Swati

    2009-01-10

    While reflective spectrophotometry is an established method for measuring macroscopic hair colour, it can be cumbersome to use on a large number of individuals and not all reflective spectrophotometry instruments are easily portable. This study investigates the use of digital photographs to measure hair colour and compares its use to reflective spectrophotometry. An understanding of the accuracy of colour determination by these methods is of relevance when undertaking specific investigations, such as those on the genetics of hair colour. Measurements of hair colour may also be of assistance in cases where a photograph is the only evidence of hair colour available (e.g. surveillance). Using the CIE L(*)a(*)b(*) colour space, the hair colour of 134 individuals of European ancestry was measured by both reflective spectrophotometry and by digital image analysis (in V++). A moderate correlation was found along all three colour axes, with Pearson correlation coefficients of 0.625, 0.593 and 0.513 for L(*), a(*) and b(*) respectively (p-values=0.000), with means being significantly overestimated by digital image analysis for all three colour components (by an average of 33.42, 3.38 and 8.00 for L(*), a(*) and b(*) respectively). When using digital image data to group individuals into clusters previously determined by reflective spectrophotometric analysis using a discriminant analysis, individuals were classified into the correct clusters 85.8% of the time when there were two clusters. The percentage of cases correctly classified decreases as the number of clusters increases. It is concluded that, although more convenient, hair colour measurement from digital images has limited use in situations requiring accurate and consistent measurements.
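
    A minimal sketch of the comparison step, converting an image patch to CIE L*a*b* and correlating the image-derived values with spectrophotometer readings, is given below; the functions and variable names are assumptions, not the V++ scripts used in the study.

```python
import numpy as np
from skimage import color
from scipy.stats import pearsonr

def mean_lab(rgb_patch):
    """Mean CIE L*a*b* of an RGB hair patch (RGB values in [0, 1])."""
    return color.rgb2lab(rgb_patch).reshape(-1, 3).mean(axis=0)

def channel_correlations(lab_spectro, lab_image):
    """Pearson r for L*, a*, b* between spectrophotometer and image measurements."""
    lab_spectro = np.asarray(lab_spectro, dtype=float)
    lab_image = np.asarray(lab_image, dtype=float)
    return [pearsonr(lab_spectro[:, k], lab_image[:, k])[0] for k in range(3)]
```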

  4. Digital filtering in nuclear medicine

    International Nuclear Information System (INIS)

    Miller, T.R.; Sampathkumaran, S.

    1982-01-01

    Digital filtering is a powerful mathematical technique in the computer analysis of nuclear medicine studies. The basic concepts of object-domain and frequency-domain filtering are presented in simple, largely nonmathematical terms. Computational methods are described using both the Fourier transform and convolution techniques. The frequency response is described and used to represent the behavior of several classes of filters. These concepts are illustrated with examples drawn from a variety of important applications in nuclear medicine.
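
    The equivalence of object-domain convolution and frequency-domain multiplication that underlies such filtering can be demonstrated in a few lines; this is a generic sketch, not code from the article.

```python
import numpy as np

rng = np.random.default_rng(3)
signal = rng.normal(size=128)            # e.g. one row of a noisy scintigraphic image
kernel = np.ones(5) / 5.0                # simple smoothing (low-pass) filter

# Object-domain filtering: direct (linear) convolution
obj = np.convolve(signal, kernel, mode='full')

# Frequency-domain filtering: multiply FFTs, zero-padded to avoid wrap-around
n = len(signal) + len(kernel) - 1
freq = np.fft.irfft(np.fft.rfft(signal, n) * np.fft.rfft(kernel, n), n)

assert np.allclose(obj, freq)            # convolution theorem: both routes agree
```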

  5. Time-Frequency Analysis and Hermite Projection Method Applied to Swallowing Accelerometry Signals

    Directory of Open Access Journals (Sweden)

    Ervin Sejdić

    2010-01-01

    Full Text Available Fast Hermite projections have often been used in image-processing procedures such as image database retrieval, projection filtering, and texture analysis. In this paper, we propose an innovative approach for the analysis of one-dimensional biomedical signals that combines the Hermite projection method with time-frequency analysis. In particular, we propose a two-step approach to characterize vibrations of various origins in swallowing accelerometry signals. First, by using time-frequency analysis we obtain the energy distribution of signal frequency content in time. Second, by using fast Hermite projections we characterize whether the analyzed time-frequency regions are associated with swallowing or other phenomena (vocalization, noise, bursts, etc.). The numerical analysis of the proposed scheme clearly shows that by using a few Hermite functions, vibrations of various origins are distinguishable. These results will be the basis for further analysis of swallowing accelerometry to detect swallowing difficulties.

  6. Mathematical methods in time series analysis and digital image processing

    CERN Document Server

    Kurths, J; Maass, P; Timmer, J

    2008-01-01

    The aim of this volume is to bring together research directions in theoretical signal and imaging processing developed rather independently in electrical engineering, theoretical physics, mathematics and the computer sciences. In particular, mathematically justified algorithms and methods, the mathematical analysis of these algorithms and methods, as well as the investigation of connections between methods from time series analysis and image processing are reviewed. An interdisciplinary comparison of these methods, drawing upon common sets of test problems from medicine and geophysical/environmental sciences, is also addressed. This volume coherently summarizes work carried out in the field of theoretical signal and image processing. It focuses on non-linear and non-parametric models for time series as well as on adaptive methods in image processing.

  7. Design and simulation of a fast Josephson junction on-chip gated clock for frequency and time analysis

    International Nuclear Information System (INIS)

    Ruby, R.C.

    1991-01-01

    This paper reports that as the sophistication and speed of digital communication systems increase, there is a corresponding demand for more sophisticated and faster measurement instruments. One such instrument new on the market is the HP 5371A Frequency and Time Interval Analyzer (FTIA). Such an instrument is analogous to a conventional oscilloscope: whereas the oscilloscope measures waveform amplitudes as a function of time, the FTIA measures phase, frequency, or timing events as functions of time. These applications are useful in such diverse areas as spread-spectrum radar, chirp filter designs, disk-head evaluation, and timing jitter analysis. The on-chip clock designed for this application uses a single Josephson junction as the clock and a resonator circuit to fix the frequency. A zero-crossing detector is used to start and stop the clock, an SFQ counter is used to count the pulses generated by the clock, and a reset circuit is used to reset the clock. Extensive simulations and modeling have been done based on measured values obtained from our Nb/Al2O3/Al/Nb process.

  8. Analysis of Fractured Teeth Utilizing Digital Microscopy: A Pilot Study

    Science.gov (United States)

    2016-06-01

    Artifacts such as beam hardening, streaking and cupping may limit the clinician's ability to detect fractures, and metallic structures may limit visualization of a pathologic condition in the area of diagnostic interest. Multiple published studies have measured the size of heat-induced cracks (e.g., Heimel P, Metscher B. Volume analysis of heat-induced cracks in human molars: A preliminary study. J Forensic Dent Sci 2014;6:139-44).

  9. Managing health care in the digital world: a comparative analysis

    OpenAIRE

    Cucciniello, Maria; Lapsley, Irvine; Nasi, Greta

    2016-01-01

    Recently, most reforms affecting healthcare systems have focused on improving the quality of care and containing costs. This has led many scholars to advocate the adoption of Health Information systems, especially electronic medical records, by highlighting their potential benefits. This study is based on a comparative analysis using a multiple method approach to examine the implementation of the same electronic medical record system at two different hospitals. Its findings offer insights int...

  10. Image analysis and machine learning in digital pathology: Challenges and opportunities.

    Science.gov (United States)

    Madabhushi, Anant; Lee, George

    2016-10-01

    With the rise in whole slide scanner technology, large numbers of tissue slides are being scanned and represented and archived digitally. While digital pathology has substantial implications for telepathology, second opinions, and education there are also huge research opportunities in image computing with this new source of "big data". It is well known that there is fundamental prognostic data embedded in pathology images. The ability to mine "sub-visual" image features from digital pathology slide images, features that may not be visually discernible by a pathologist, offers the opportunity for better quantitative modeling of disease appearance and hence possibly improved prediction of disease aggressiveness and patient outcome. However the compelling opportunities in precision medicine offered by big digital pathology data come with their own set of computational challenges. Image analysis and computer assisted detection and diagnosis tools previously developed in the context of radiographic images are woefully inadequate to deal with the data density in high resolution digitized whole slide images. Additionally there has been recent substantial interest in combining and fusing radiologic imaging and proteomics and genomics based measurements with features extracted from digital pathology images for better prognostic prediction of disease aggressiveness and patient outcome. Again there is a paucity of powerful tools for combining disease specific features that manifest across multiple different length scales. The purpose of this review is to discuss developments in computational image analysis tools for predictive modeling of digital pathology images from a detection, segmentation, feature extraction, and tissue classification perspective. We discuss the emergence of new handcrafted feature approaches for improved predictive modeling of tissue appearance and also review the emergence of deep learning schemes for both object detection and tissue classification

  11. Color image digitization and analysis for drum inspection

    International Nuclear Information System (INIS)

    Muller, R.C.; Armstrong, G.A.; Burks, B.L.; Kress, R.L.; Heckendorn, F.M.; Ward, C.R.

    1993-01-01

    A rust inspection system that uses color analysis to find rust spots on drums has been developed. The system is composed of high-resolution color video equipment that permits the inspection of rust spots on the order of 0.25 cm (0.1 in.) in diameter. Because of the modular nature of the system design and the use of open systems software (X11, etc.), the inspection system can be easily integrated into other environmental restoration and waste management programs. The inspection system represents an excellent platform for the integration of other color inspection and color image processing algorithms.

  12. Pyroprocess Deployment Analysis and Remote Accessibility Experiment using Digital Mockup and Simulation

    International Nuclear Information System (INIS)

    Kim, K. H.; Park, H. S.; Kim, S. H.; Choi, C. H.; Lee, H. J.; Park, B. S.; Yoon, G. S.; Kim, K. H.; Kim, H. D.

    2009-11-01

    A nuclear fuel cycle facility that handles spent fuel must be designed and manufactured with the special consideration that every pyroprocess step has to be operated remotely. To prevent unexpected accidents once the pyroprocess facility is operated with remote manipulators, procedures for pyroprocess operation and maintenance need to be established in the early design stage. A system architecture was designed to develop a simulator that combines 3D modelling and simulation. A full-scale digital mockup of a real pyroprocess facility was designed and manufactured. An inverse kinematics algorithm for the remote manipulator was created in order to simulate accidents and repairs that could occur during pyroprocess operation and maintenance in the virtual digital mockup environment. Deployment analysis of the process devices was carried out through a workspace analysis, and accessibility was examined using a haptic device.

  13. Changes in frequency of recall recommendations of examinations depicting cancer with the availability of either priors or digital breast tomosynthesis

    Science.gov (United States)

    Hakim, Christiane M.; Bandos, Andriy I.; Ganott, Marie A.; Catullo, Victor J.; Chough, Denise M.; Kelly, Amy E.; Shinde, Dilip D.; Sumkin, Jules H.; Wallace, Luisa P.; Nishikawa, Robert M.; Gur, David

    2016-03-01

    Performance in a binary environment is affected by additional information only when that information changes the recommendations made. In a recent study, we have shown that, contrary to general expectation, introducing prior examinations improved recall rates, but not sensitivity. In this study, we assessed cancer detection differences when prior examinations and/or digital breast tomosynthesis (DBT) were made available to the radiologist. We identified a subset of 21 cancer cases with differences in the number of radiologists who recalled these cases after reviewing either a prior examination or DBT. For the cases with differences in recommendations after viewing either priors or DBT, separately, we evaluated the total number of readers that changed their recommendations, regardless of the specific radiologist in question. Confidence intervals for the number of readers and a test of the hypothesis of no difference were computed using the non-parametric bootstrap approach, addressing both case- and reader-related sources of variability by resampling cases and readers. With the addition of priors, there were 14 cancer cases (out of 15) where the number of "recalling radiologists" decreased. With the addition of DBT, the number of "recalling radiologists" decreased in only five cases (out of 15) while increasing in the remaining nine cases. Unlike most new approaches to breast imaging, DBT seems to improve both recall rates and cancer detection rates. Changes in recommendations were noted by all radiologists and for all cancers by type, size, and breast density.

  14. Intra-observer reliability and agreement of manual and digital orthodontic model analysis.

    Science.gov (United States)

    Koretsi, Vasiliki; Tingelhoff, Linda; Proff, Peter; Kirschneck, Christian

    2018-01-23

    Digital orthodontic model analysis is gaining acceptance in orthodontics, but its reliability is dependent on the digitalisation hardware and software used. We thus investigated intra-observer reliability and agreement/conformity of a particular digital model analysis workflow in relation to traditional manual plaster model analysis. Forty-eight plaster casts of the upper/lower dentition were collected. Virtual models were obtained with orthoX®scan (Dentaurum) and analysed with ivoris®analyze3D (Computer konkret). Manual model analyses were done with a dial caliper (0.1 mm). Common parameters were measured on each plaster cast and its virtual counterpart five times each by an experienced observer. We assessed intra-observer reliability within method (ICC), agreement/conformity between methods (Bland-Altman analyses and Lin's concordance correlation), and changing bias (regression analyses). Intra-observer reliability was substantial within each method (ICC ≥ 0.7), except for five manual outcomes (12.8 per cent). Bias between methods was statistically significant, but less than 0.5 mm for 87.2 per cent of the outcomes. In general, larger tooth sizes were measured digitally. The total difference for the maxilla and mandible had wide limits of agreement (-3.25/6.15 and -2.31/4.57 mm), but the bias between methods was mostly smaller than the intra-observer variation within each method, with substantial conformity of manual and digital measurements in general. No changing bias was detected. Although both workflows were reliable, the investigated digital workflow proved to be more reliable and yielded on average larger tooth sizes. Averaged differences between methods were within 0.5 mm for directly measured outcomes, but wide ranges are expected for some computed space parameters due to cumulative error.
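
    For reference, the bias and 95% limits of agreement reported in a Bland-Altman sense reduce to a few lines of arithmetic; the sketch below is generic and the example values are hypothetical, not the study's data.

```python
import numpy as np

def bland_altman(manual, digital):
    """Mean bias and 95% limits of agreement between two measurement methods."""
    manual = np.asarray(manual, dtype=float)
    digital = np.asarray(digital, dtype=float)
    diff = digital - manual
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical tooth-width measurements in mm
bias, (low, high) = bland_altman([8.1, 7.9, 6.5, 9.0], [8.3, 8.0, 6.6, 9.2])
```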

  15. Free digital image analysis software helps to resolve equivocal scores in HER2 immunohistochemistry.

    Science.gov (United States)

    Helin, Henrik O; Tuominen, Vilppu J; Ylinen, Onni; Helin, Heikki J; Isola, Jorma

    2016-02-01

    Evaluation of human epidermal growth factor receptor 2 (HER2) immunohistochemistry (IHC) is subject to interobserver variation and lack of reproducibility. Digital image analysis (DIA) has been shown to improve the consistency and accuracy of the evaluation and its use is encouraged in current testing guidelines. We studied whether digital image analysis using a free software application (ImmunoMembrane) can assist in interpreting HER2 IHC in equivocal 2+ cases. We also compared digital photomicrographs with whole-slide images (WSI) as material for ImmunoMembrane DIA. We stained 750 surgical resection specimens of invasive breast cancers immunohistochemically for HER2 and analysed staining with ImmunoMembrane. The ImmunoMembrane DIA scores were compared with the originally responsible pathologists' visual scores, a researcher's visual scores and in situ hybridisation (ISH) results. The originally responsible pathologists reported 9.1 % positive 3+ IHC scores, for the researcher this was 8.4 % and for ImmunoMembrane 9.5 %. Equivocal 2+ scores were 34 % for the pathologists, 43.7 % for the researcher and 10.1 % for ImmunoMembrane. Negative 0/1+ scores were 57.6 % for the pathologists, 46.8 % for the researcher and 80.8 % for ImmunoMembrane. There were six false positive cases, which were classified as 3+ by ImmunoMembrane and negative by ISH. Six cases were false negative defined as 0/1+ by IHC and positive by ISH. ImmunoMembrane DIA using digital photomicrographs and WSI showed almost perfect agreement. In conclusion, digital image analysis by ImmunoMembrane can help to resolve a majority of equivocal 2+ cases in HER2 IHC, which reduces the need for ISH testing.

  16. Software Safety Analysis of Digital Protection System Requirements Using a Qualitative Formal Method

    International Nuclear Information System (INIS)

    Lee, Jang-Soo; Kwon, Kee-Choon; Cha, Sung-Deok

    2004-01-01

    The safety analysis of requirements is a key problem area in the development of software for the digital protection systems of a nuclear power plant. When specifying requirements for software of the digital protection systems and conducting safety analysis, engineers find that requirements are often known only in qualitative terms and that existing fault-tree analysis techniques provide little guidance on formulating and evaluating potential failure modes. A framework for the requirements engineering process is proposed that consists of a qualitative method for requirements specification, called the qualitative formal method (QFM), and a safety analysis method for the requirements based on causality information, called the causal requirements safety analysis (CRSA). CRSA is a technique that qualitatively evaluates causal relationships between software faults and physical hazards. This technique, extending the qualitative formal method process and utilizing information captured in the state trajectory, provides specific guidelines on how to identify failure modes and the relationship among them. The QFM and CRSA processes are described using shutdown system 2 of the Wolsong nuclear power plants as the digital protection system example

  17. A study of trabecular bone strength and morphometric analysis of bone microstructure from digital radiographic image

    International Nuclear Information System (INIS)

    Han, Seung Yun; Lee, Sun Bok; Oh, Sung Ook; Heo, Min Suk; Lee, Sam Sun; Choi, Soon Chul; Park, Tae Won; Kim, Jong Dae

    2003-01-01

    The aim was to evaluate the relationship between morphometric analysis of microstructure from digital radiographic images and trabecular bone strength. One hundred eleven bone specimens of 5 mm thickness were obtained from the mandibles of 5 pigs. Digital images of the specimens were taken using a direct digital intraoral radiographic system. After selection of an ROI (100 x 100 pixels) within the trabecular bone, the mean gray level and standard deviation were obtained. The fractal dimension and the variables of morphometric analysis (trabecular area, periphery, length of skeletonized trabeculae, number of terminal points, number of branch points) were obtained from the ROI. Punch shear strength analysis was performed using an Instron machine (model 4465, Instron Corp., USA). The loading force (loading speed 1 mm/min) was applied to the ROI of the bone specimen by a 2 mm diameter punch. A stress-deformation curve was obtained from the punch shear strength analysis, and the maximum stress, yield stress, and Young's modulus were measured. Maximum stress had a significant negative linear correlation with mean gray level and fractal dimension (p<0.05). Yield stress had a significant negative linear correlation with mean gray level, periphery, fractal dimension and the length of skeletonized trabeculae (p<0.05). Young's modulus had a significant negative linear correlation with mean gray level and fractal dimension (p<0.05). The strength of cancellous bone thus exhibited a significant linear relationship with mean gray level, fractal dimension and the morphometric variables. The methods described above can be easily used to evaluate bone quality clinically.
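
    A common way to obtain a fractal dimension from a binarised trabecular ROI is box counting; the sketch below is a generic implementation (not the study's software), and the binarisation step is assumed to have been done beforehand.

```python
import numpy as np

def box_counting_dimension(binary_roi):
    """Box-counting fractal dimension of a binary trabecular pattern (assumed roughly square)."""
    img = np.asarray(binary_roi, dtype=bool)
    size = min(img.shape)
    box_sizes = [s for s in (2, 4, 8, 16, 32) if s < size]
    counts = []
    for s in box_sizes:
        trimmed = img[:size - size % s, :size - size % s]
        blocks = trimmed.reshape(trimmed.shape[0] // s, s, trimmed.shape[1] // s, s)
        counts.append(np.count_nonzero(blocks.any(axis=(1, 3))))   # boxes containing structure
    slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
    return slope
```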

  18. Spatial resolution requirements in digital radiography of scaphoid fractures. An ROC analysis

    International Nuclear Information System (INIS)

    Jonsson, A.; Laurin, S.; Karner, G.; Herrlin, K.; Hochbergs, P.; Jonsson, K.; Rudling, O.; Sandstroem, S.; Sloth, M.; Svahn, G.; Pettersson, H.

    1996-01-01

    Purpose: To investigate the spatial resolution requirements in digital radiography of scaphoid fractures. Material and Methods: Included in the study were 60 scaphoid radiographs with and 60 without fractures of the scaphoid bone. The film-screen images were digitized using pixel sizes of 115, 170, and 340 μm along with 170 μm with a 10:1 wavelet compression. The digital images were displayed on a 1280 x 1024 x 8 bits monitor, and 5 observers evaluated the images in 5 randomized sessions. The results for each pixel size were then compared to the film-screen images by ROC analysis. Results: The mean area under the ROC curves was larger for the film-screen images than for the digital images at all resolutions. However, this difference was not significant when the areas under the ROC curves for the film-screen images were compared to the digital images of 115, 170, and 170 μm with 10:1 compression. There was a significant difference for the 340-μm pixel size in favour of the film-screen images. The mean ROC curves for the digital images were very similar for the 115 and 170 μm pixel sizes, although slightly better for 115 μm. At 170 μm, the compression seemed to have a relatively small negative effect on the diagnostic performance; the deterioration was greater when the pixel size was increased to 340 μm. There was no obvious correlation between diagnostic performance and the experience of the observers in using workstations. Conclusions: The pixel size of 170 μm is adequate for the detection of subtle fractures, even after wavelet compression by a ratio of 10:1. (orig.)

  19. A Human Error Analysis with Physiological Signals during Utilizing Digital Devices

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Hee; Oh, Yeon Ju; Shin, Kwang Hyeon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2011-10-15

    The introduction of advanced MCRs is accompanied by many changes in form and features, made possible by new digital technologies. There are various kinds of digital devices such as flat panel displays, touch screens, and so on. The characteristics of these digital devices offer many opportunities for interface management, and they can be integrated into a compact single workstation in an advanced MCR so that workers can operate the plant with minimum burden during any operating condition. However, these devices may introduce new types of human errors, and thus we need a means to evaluate and prevent such errors, especially those related to the digital devices. Human errors have been retrospectively assessed for accident reviews and quantitatively evaluated through HRA for PSA. However, ergonomic verification and validation is an important process for defending against all human error potentials in the NPP design. HRA is a crucial part of a PSA and helps in preparing countermeasures for the design by identifying potential human error items that affect the overall safety of NPPs. Various HRA techniques are available; however, they reveal shortcomings with respect to HMI design in the digital era. - HRA techniques depend on PSFs: this means that the scope dealing with human factors is limited in advance, and thus all attributes of new digital devices may not be considered in HRA. - The data used for HRA are not close to the evaluation items, so human error analysis is not easy to apply to design through several individual experiments and cases. - The results of HRA are not statistically meaningful because accidents involving human errors in NPPs are rare and have been estimated as having an extremely low probability.

  20. EVALUATION OF BARTIN CITY ECONOMIC CONSTRUCT WITH DIGITALIZED SWOT ANALYSIS

    Directory of Open Access Journals (Sweden)

    NERMİN ÇELİK

    2013-06-01

    Full Text Available In this study, the weaknesses and strengths of the Bartın economy, as well as threats and opportunities, were first presented by means of a SWOT analysis. Secondly, the obtained findings were evaluated comparatively and the priority weights of each item were calculated by means of the Analytic Hierarchy Process (AHP), a multi-criteria evaluation approach. Finally, attention was drawn to the weak aspects on the basis of the quantitative findings, and alternative strategies for the economic development of the city were presented. The weakest sides of the city are its high unemployment rate and its migration problem; its strongest side is the commercial use of the Bartın port. Besides being the first such study for the city of Bartın, which falls within the Encouragement Law, offering solutions by evaluating the current and potential situations can be described as an original aspect of this study.
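
    For orientation, AHP priority weights are usually taken as the normalised principal eigenvector of a pairwise comparison matrix, with a consistency ratio as a sanity check; the sketch below uses an invented 3x3 example, not the study's matrices.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights and consistency ratio from an AHP pairwise comparison matrix."""
    a = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(a)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    n = a.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)            # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)    # Saaty's random index (partial table)
    return w, ci / ri

# Hypothetical comparison of three criteria (e.g. unemployment, migration, port trade)
weights, cr = ahp_weights([[1, 3, 5],
                           [1 / 3, 1, 2],
                           [1 / 5, 1 / 2, 1]])
```

    A consistency ratio above roughly 0.1 would normally prompt revision of the pairwise judgements.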

  1. Digital Forensic Analysis Of Malware Infected Machine- Case Study

    Directory of Open Access Journals (Sweden)

    Amulya Podile

    2015-08-01

    Full Text Available Internet banking has created a convenient way for us to handle our business without leaving our homes. Man-in-the-Browser is a special case of the Man-in-the-Middle attack targeted against customers of Internet banking. One of the capabilities of a Man-in-the-Browser Trojan is modification of HTML, referred to as HTML injection, which allows the attacker to alter the HTML of a page before it is sent to the browser for interpretation. In this paper the authors discuss the forensic analysis of RAM (volatile data), system logs and registry entries collected from a bank customer's computer infected with the Trojan, and confirm the source of the attack, its time-stamps and the behavior of the malware by using open-source and commercial tools.

  2. Research and Analysis of MEMS Switches in Different Frequency Bands

    Directory of Open Access Journals (Sweden)

    Wenchao Tian

    2018-04-01

    Full Text Available Due to their high isolation, low insertion loss, high linearity, and low power consumption, microelectromechanical systems (MEMS) switches have drawn much attention from researchers in recent years. In this paper, we introduce the research status of MEMS switches in different bands and several reliability issues, such as dielectric charging, contact failure, and temperature instability. The following methods to improve the performance of MEMS switches at high frequencies are then summarized: (1) utilizing combinations of several switches in series; (2) covering the dielectric layer with a floating metal layer; (3) using dielectric layer materials with high dielectric constants and conductor materials with low resistance; (4) developing MEMS switches using T-match and π-match; (5) designing MEMS switches based on bipolar complementary metal–oxide–semiconductor (BiCMOS) technology and reconfigurable MEMS surfaces; (6) employing thermal compensation structures, circularly symmetric structures, thermal buckle-beam actuators, molybdenum membranes, and thin-film packaging; (7) selecting ultrananocrystalline diamond or aluminum nitride dielectric materials and applying a bipolar driving voltage, stoppers, and a double-dielectric-layer structure; and (8) adopting gold alloying with carbon nanotubes (CNTs), hermetic and reliable packaging, and mN-level contact force.

  3. Alcohol marketing in televised English professional football: a frequency analysis.

    Science.gov (United States)

    Graham, Andrew; Adams, Jean

    2014-01-01

    The aim of the study was to explore the frequency of alcohol marketing (both formal commercials and otherwise) in televised top-class English professional football matches. A purposive sample of six broadcasts (total = 1101 min) of televised top-class English club football matches was identified and recorded in full. A customized coding framework was used to identify and categorize all verbal and visual alcohol references in non-commercial broadcasting. The number and the duration of all formal alcohol commercials were also noted. A mean of 111 visual references and 2 verbal references to alcohol per hour of broadcast were identified. Nearly all visual references were to beer products and were primarily simple logos or branding. The majority of verbal alcohol references were related to title-sponsorship of competitions. A total of 17 formal alcohol commercials were identified, accounting for <1% of total broadcast time. Visual alcohol references in televised top-class English football matches are common with an average of nearly two per minute. Verbal references are rare and formal alcohol commercials account for <1% of broadcast time. Restriction of all alcohol sports sponsorship, as seen for tobacco, may be justified.

  4. Accessory bones of the feet: Radiological analysis of frequency

    Directory of Open Access Journals (Sweden)

    Vasiljević Vladica

    2010-01-01

    Full Text Available Background/Aim. Accessory bones are most commonly found on the feet and they represent an anatomic variant. They occur when there is a failure in the formation of a single bone from separate centres of ossification. The aim of this study was to establish their frequency and medical significance. Methods. Anteroposterior and lateral foot radiography was performed in 270 patients aged 20-80 years with a history of trauma (180) and rheumatological disease (90). The presence and distribution of accessory bones was analysed in relation to the total number of patients and their gender. The results are expressed in numeric values and in terms of percentage. Results. Accessory bones were identified in 62 (22.96%) patients: 29 (10.74%) of them were found in female patients and 33 (12.22%) in males. The most common accessory bones were as follows: os tibiale externum 50%, os peroneum 29.03%, os trigonum 11.29%, os vesalianum 9.68%. Conclusion. Accessory bones were found in 23% of patients with trauma and some rheumatological diseases. Their significance is demonstrated in the differential diagnosis among degenerative diseases, avulsion fractures, muscle and tendon trauma and other types of injuries which can cause painful affection of the foot, as well as in forensic practice.

  5. Development of Digital Hysteresis Current Control with PLL Loop Gain Compensation Strategy for PWM Inverters with Constant Switching Frequency

    Directory of Open Access Journals (Sweden)

    N. Belhaouchet

    2008-03-01

    Full Text Available Hysteresis current control is one of the simplest techniques used to control the magnitude and phase angle of the motor current in motor drive systems. However, this technique presents several disadvantages, such as operation at a variable switching frequency, which can create filtering problems, interference between the phases in the case of three-phase systems with an insulated neutral connection or a delta connection, and irregularity of the modulation pulses, which causes acoustic noise in the machine, especially for high-power drives. In this paper, a new technique is proposed for a variable-hysteresis-band controller based on dead-beat control, applied to three-phase voltage-source PWM inverters feeding AC motors. Its main aims are, first, to ensure a constant switching frequency and, second, to synchronize the modulation pulses using a phase-locked loop with loop-gain compensation in order to ensure better stability. The behavior of the proposed technique is verified by simulation.

  6. Frequency analysis for modulation-enhanced powder diffraction.

    Science.gov (United States)

    Chernyshov, Dmitry; Dyadkin, Vadim; van Beek, Wouter; Urakawa, Atsushi

    2016-07-01

    Periodic modulation of external conditions on a crystalline sample with a consequent analysis of periodic diffraction response has been recently proposed as a tool to enhance experimental sensitivity for minor structural changes. Here the intensity distributions for both a linear and nonlinear structural response induced by a symmetric and periodic stimulus are analysed. The analysis is further extended for powder diffraction when an external perturbation changes not only the intensity of Bragg lines but also their positions. The derived results should serve as a basis for a quantitative modelling of modulation-enhanced diffraction data measured in real conditions.

  7. Spurious results from Fourier analysis of data with closely spaced frequencies

    International Nuclear Information System (INIS)

    Loumos, G.L.; Deeming, T.J.

    1978-01-01

    It is shown how erroneous results can occur using some period-finding methods, such as Fourier analysis, on data containing closely spaced frequencies. The frequency spacing accurately resolvable with data of length T is increased from the standard value of about 1/T quoted in the literature to approximately 1.5/T. (Auth.)
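
    To illustrate the effect described above, the short Python sketch below (an illustration assuming NumPy; the record length, sampling rate and worst-case phase offset are made-up values, not data from the paper) compares the zero-padded amplitude spectra of two tone pairs separated by 1/T and by 1.5/T. With an unfavourable relative phase, the 1/T pair blends into a single spurious peak at an intermediate frequency, while the 1.5/T pair remains resolvable.

        import numpy as np

        T, fs = 100.0, 10.0                      # record length (s), sampling rate (Hz)
        t = np.arange(0, T, 1.0 / fs)
        pad = 8 * len(t)                         # zero-padding for a fine frequency grid
        freqs = np.fft.rfftfreq(pad, 1.0 / fs)

        def count_peaks(delta_f):
            """Count resolvable spectral peaks for two tones separated by delta_f.

            The second tone carries a worst-case phase shift of pi, the situation
            in which closely spaced frequencies blend together.
            """
            x = np.cos(2 * np.pi * 1.0 * t) + np.cos(2 * np.pi * (1.0 + delta_f) * t + np.pi)
            amp = np.abs(np.fft.rfft(x, n=pad))
            band = (freqs > 0.95) & (freqs < 1.07)
            a = amp[band]
            is_max = (a[1:-1] > a[:-2]) & (a[1:-1] > a[2:]) & (a[1:-1] > 0.5 * a.max())
            return int(is_max.sum())

        for label, delta_f in [("1/T", 1.0 / T), ("1.5/T", 1.5 / T)]:
            print(f"separation {label}: {count_peaks(delta_f)} peak(s) detected")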

  8. Analysis of frequency effect on variegated RAM styles and other parameters using 40 nm FPGA

    DEFF Research Database (Denmark)

    Sharma, Rashmi; Pandey, Bishwajeet; Sharma, Vaashu

    2018-01-01

    This analysis has been performed using the XILINX 12.1 and IBM SPSS Statistics 21 software and the VHDL language. The Pipe_distributed style at comparatively lower frequencies consumes the least power. Therefore, lower frequency values should be maintained while observing the power. This would bloom up...

  9. Wavelet analysis of frequency chaos game signal: a time-frequency signature of the C. elegans DNA.

    Science.gov (United States)

    Messaoudi, Imen; Oueslati, Afef Elloumi; Lachiri, Zied

    2014-12-01

    Challenging tasks are encountered in the field of bioinformatics. The choice of the genomic sequence's mapping technique is one of the most fastidious tasks, and a judicious choice helps in examining the distribution of periodic patterns that concords with the underlying structure of genomes. Despite that, the search for a coding technique that can highlight all the information contained in the DNA has not yet attracted the attention it deserves. In this paper, we propose a new mapping technique based on the chaos game theory that we call the frequency chaos game signal (FCGS). The particularity of the FCGS coding resides in exploiting the statistical properties of the genomic sequence itself, which may reflect important structural and organizational features of DNA. To prove the usefulness of the FCGS approach in the detection of different local periodic patterns, we use wavelet analysis because it provides access to information that can be obscured by other time-frequency methods such as Fourier analysis. Thus, we apply the continuous wavelet transform (CWT) with the complex Morlet wavelet as the mother wavelet function. Scalograms relating to the organism Caenorhabditis elegans (C. elegans) exhibit a multitude of periodic organizations of specific DNA sequences.
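
    As a rough illustration of the downstream wavelet step (PyWavelets assumed; the simple purine/pyrimidine indicator below is only a stand-in for the FCGS coding defined in the paper), the sketch applies a continuous wavelet transform with a complex Morlet mother wavelet to a numerically mapped sequence and extracts a scalogram.

        import numpy as np
        import pywt

        # Toy sequence; in practice this would be a C. elegans chromosome segment.
        sequence = "ATGCGCGAATTCCGGATATATATATCGCGCG" * 8

        # Stand-in numeric mapping (not the FCGS of the paper): +1 purines, -1 pyrimidines.
        x = np.array([1.0 if base in "AG" else -1.0 for base in sequence])

        # Continuous wavelet transform with a complex Morlet mother wavelet.
        scales = np.arange(1, 65)
        coefficients, frequencies = pywt.cwt(x, scales, "cmor1.5-1.0",
                                             sampling_period=1.0)   # 1 sample = 1 base pair

        scalogram = np.abs(coefficients) ** 2        # energy per (scale, position)
        dominant = frequencies[scalogram.sum(axis=1).argmax()]
        print(f"dominant periodicity ~ {1.0 / dominant:.1f} bp")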

  10. Accounting for trip frequency in importance-performance analysis

    Science.gov (United States)

    Joshua K. Gill; J.M. Bowker; John C. Bergstrom; Stanley J. Zarnoch

    2010-01-01

    Understanding customer satisfaction is critical to the successful operation of both privately and publicly managed recreation venues. A popular tool for assessing recreation visitor satisfaction is Importance- Performance Analysis (IPA). IPA provides resource managers, government officials, and private businesses with easy-to-understand and -use information about...

  11. Frequency Analysis of Gradient Estimators in Volume Rendering

    NARCIS (Netherlands)

    Bentum, Marinus Jan; Lichtenbelt, Barthold B.A.; Malzbender, Tom

    1996-01-01

    Gradient information is used in volume rendering to classify and color samples along a ray. In this paper, we present an analysis of the theoretically ideal gradient estimator and compare it to some commonly used gradient estimators. A new method is presented to calculate the gradient at arbitrary

  12. Adjoint sensitivity analysis of high frequency structures with Matlab

    CERN Document Server

    Bakr, Mohamed; Demir, Veysel

    2017-01-01

    This book covers the theory of adjoint sensitivity analysis and uses the popular FDTD (finite-difference time-domain) method to show how wideband sensitivities can be efficiently estimated for different types of materials and structures. It includes a variety of MATLAB® examples to help readers absorb the content more easily.

  13. Letter Frequency Analysis of Lithuanian and Other Languages Using the Latin Alphabet

    Directory of Open Access Journals (Sweden)

    Gintautas Grigas

    2015-12-01

    Full Text Available It is important to evaluate the specificities of alphabets, particularly the letter frequencies, while designing keyboards, analyzing texts, designing alphabet-based games, and doing text mining. In order to adequately compare the letter frequencies of the Lithuanian language with those of other languages in the Internet space, Wikipedia was selected as a source, since its content is common to different languages. The method of letter frequency jumps is used. The main attention is paid to the analysis of letter frequencies at the boundary between native letters and foreign letters used in Lithuanian and other languages.
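
    As a minimal sketch of the basic counting step (not the frequency-jump method of the paper), the fragment below computes relative letter frequencies of a text; the sample sentence is purely illustrative.

        from collections import Counter

        def letter_frequencies(text):
            """Return relative frequencies of alphabetic characters, case-folded."""
            letters = [ch for ch in text.lower() if ch.isalpha()]
            counts = Counter(letters)
            total = sum(counts.values())
            return {ch: n / total for ch, n in counts.most_common()}

        sample = "Lietuvių kalba yra viena archajiškiausių gyvųjų indoeuropiečių kalbų"
        for letter, freq in list(letter_frequencies(sample).items())[:10]:
            print(f"{letter}: {freq:.3f}")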

  14. Estimation of Internal Flooding Frequency for Screening Analysis of Flooding PSA

    International Nuclear Information System (INIS)

    Choi, Sun Yeong; Yang, Jun Eon

    2005-01-01

    The purpose of this paper is to estimate the internal flooding frequency for the quantitative screening analysis of the flooding PSA (Probabilistic Safety Assessment) with appropriate data and estimation methods. In the case of the existing flood PSA for domestic NPPs (Nuclear Power Plant), the screening analysis was performed first and then a detailed analysis was performed for the areas not screened out. For the quantitative screening analysis, the plant-area-based flood frequency obtained by the MLE (Maximum Likelihood Estimation) method was used, while the component-based flood frequency is used for the detailed analysis. The existing quantitative screening analyses for domestic NPPs have used data from all LWRs (Light Water Reactor), namely PWR (Pressurized Water Reactor) and BWR (Boiling Water Reactor), for the internal flood frequency of the auxiliary building and turbine building. However, in the case of the primary auxiliary building, the applicability of the data from all LWRs needs to be examined carefully because of the significant difference in equipment between the PWR and BWR structures. NUREG/CR-5750 suggested the Bayesian update method with the Jeffreys noninformative prior to estimate the initiating event frequency for the flood. It, however, did not describe any procedure for the flood PSA. Recently, Fleming and Lydell suggested the internal flooding frequency in units of plant operation year-pipe length (in meters) by pipe size for each specific system which is susceptible to flooding, such as the service water system and the circulating water system. They used the failure rate and the conditional rupture probability given failure to estimate the internal flooding frequency, and Bayesian updating to reduce the uncertainties. To perform the quantitative screening analysis with this method, the pipe length by pipe size of the specific system per each divided area is required to change the concept of the component based frequency to the concept of the plant area
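
    For the Bayesian update with a Jeffreys noninformative prior mentioned above, the standard Poisson-Gamma conjugate form gives a Gamma(n + 0.5, T) posterior for the occurrence rate after observing n events in T plant operating years. A minimal sketch (SciPy assumed; the event count and exposure are illustrative numbers, not data from the paper):

        from scipy import stats

        n_events = 2          # observed flooding events (illustrative)
        exposure = 350.0      # accumulated plant operating years (illustrative)

        # Jeffreys prior for a Poisson rate leads to a Gamma(n + 0.5, T) posterior,
        # i.e. shape = n + 0.5 and rate = T (scale = 1 / T).
        posterior = stats.gamma(a=n_events + 0.5, scale=1.0 / exposure)

        mean = posterior.mean()
        lower, upper = posterior.ppf([0.05, 0.95])
        print(f"posterior mean frequency : {mean:.2e} /yr")
        print(f"90% credible interval    : [{lower:.2e}, {upper:.2e}] /yr")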

  15. Technical difficulties and challenges for performing safety analysis on digital I and C systems

    International Nuclear Information System (INIS)

    Yih, Swu

    1996-01-01

    Performing safety analysis on digital I and C systems is an important task for nuclear safety analysts. The analysis results can not only confirm that the system is well developed but also provide crucial evidence for the licensing process. However, currently both I and C developers and regulators have difficulties in evaluating the safety of digital I and C systems. To investigate this problem, this paper proposes a frame-based model to analyze the working and failure mechanisms of software and its interaction with the environment. A valid isomorphic relationship between the logical (software) frame and the physical (hardware environment) frame is identified as a major factor that determines the safe behavior of the software. The failures that may potentially cause violation of the isomorphic relations are also discussed. To perform safety analysis on digital I and C systems, analysts need to predict the effects incurred by such failures. However, due to the lack of continuity, regularity, and integrity, and the high complexity of software structure, software does not have a stable and predictable pattern of behavior, which in turn makes the trustworthiness of the results of software safety analysis questionable. Our model can explain many troublesome events experienced by computer controlled systems. Implications and possible directions for improvement are also discussed. (author)

  16. Software hazard analysis for nuclear digital protection system by Colored Petri Net

    International Nuclear Information System (INIS)

    Bai, Tao; Chen, Wei-Hua; Liu, Zhen; Gao, Feng

    2017-01-01

    Highlights: • A dynamic hazard analysis method is proposed for safety-critical software. • The mechanism relies on Colored Petri Nets. • Complex interactions between software and hardware are captured properly. • Common failure modes in software are identified effectively. Abstract: The software safety of a nuclear digital protection system is critical for the safety of nuclear power plants, as any software defect may result in severe damage. In order to ensure the safety and reliability of safety-critical digital system products and their applications, software hazard analysis is required to be performed during the software development lifecycle. A dynamic software hazard modeling and analysis method based on Colored Petri Nets is proposed and applied to the safety-critical control software of a nuclear digital protection system in this paper. The analysis results show that the proposed method can explain the complex interactions between software and hardware and identify the potential common cause failures in software properly and effectively. Moreover, the method can find the dominant software-induced hazards to safety control actions, which aids in increasing software quality.
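
    As a very small illustration of the underlying formalism (an untimed, uncolored token game, far simpler than the Colored Petri Net models presumably used in the paper; all place and transition names are invented for the example), the sketch below fires the enabled transitions of a toy protection-logic net:

        # Minimal place/transition net sketch (untimed and uncolored); a stand-in for
        # the much richer Colored Petri Net models used for software hazard analysis.
        marking = {"sensor_ok": 2, "sensor_failed": 0, "demand": 0, "trip_signal": 0}

        transitions = {
            # name: (tokens consumed per place, tokens produced per place)
            "demand_arrives":  ({}, {"demand": 1}),
            "generate_trip":   ({"demand": 1, "sensor_ok": 1}, {"sensor_ok": 1, "trip_signal": 1}),
            "sensor_degrades": ({"sensor_ok": 1}, {"sensor_failed": 1}),
        }

        def enabled(name):
            consume, _ = transitions[name]
            return all(marking[place] >= n for place, n in consume.items())

        def fire(name):
            consume, produce = transitions[name]
            for place, n in consume.items():
                marking[place] -= n
            for place, n in produce.items():
                marking[place] = marking.get(place, 0) + n

        for name in ("demand_arrives", "sensor_degrades", "generate_trip"):
            if enabled(name):
                fire(name)

        print(marking)   # a token in trip_signal means the trip logic has fired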

  17. Semi-Automated Digital Image Analysis of Pick's Disease and TDP-43 Proteinopathy.

    Science.gov (United States)

    Irwin, David J; Byrne, Matthew D; McMillan, Corey T; Cooper, Felicia; Arnold, Steven E; Lee, Edward B; Van Deerlin, Vivianna M; Xie, Sharon X; Lee, Virginia M-Y; Grossman, Murray; Trojanowski, John Q

    2016-01-01

    Digital image analysis of histology sections provides reliable, high-throughput methods for neuropathological studies but data are scant in frontotemporal lobar degeneration (FTLD), which has an added challenge of study due to morphologically diverse pathologies. Here, we describe a novel method of semi-automated digital image analysis in FTLD subtypes including: Pick's disease (PiD, n=11) with tau-positive intracellular inclusions and neuropil threads, and TDP-43 pathology type C (FTLD-TDPC, n=10), defined by TDP-43-positive aggregates predominantly in large dystrophic neurites. To do this, we examined three FTLD-associated cortical regions: mid-frontal gyrus (MFG), superior temporal gyrus (STG) and anterior cingulate gyrus (ACG) by immunohistochemistry. We used a color deconvolution process to isolate signal from the chromogen and applied both object detection and intensity thresholding algorithms to quantify pathological burden. We found object-detection algorithms had good agreement with gold-standard manual quantification of tau- and TDP-43-positive inclusions. Our sampling method was reliable across three separate investigators and we obtained similar results in a pilot analysis using open-source software. Regional comparisons using these algorithms find differences in regional anatomic disease burden between PiD and FTLD-TDP not detected using traditional ordinal scale data, suggesting digital image analysis is a powerful tool for clinicopathological studies in morphologically diverse FTLD syndromes. © The Author(s) 2015.
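
    The color-deconvolution and thresholding steps described above can be approximated with open-source tooling. A minimal sketch (scikit-image assumed, using its bundled immunohistochemistry sample image rather than the FTLD sections of the study; the minimum object size is an arbitrary choice):

        from skimage import data, color, filters, measure

        # DAB-stained immunohistochemistry sample bundled with scikit-image.
        rgb = data.immunohistochemistry()

        # Color deconvolution: separate haematoxylin, eosin and DAB contributions.
        hed = color.rgb2hed(rgb)
        dab = hed[:, :, 2]

        # Intensity thresholding of the DAB channel to estimate pathological burden.
        mask = dab > filters.threshold_otsu(dab)
        burden = mask.mean()                     # fraction of area flagged as positive

        # Object detection: count connected DAB-positive objects above a minimal size.
        labels = measure.label(mask)
        objects = [r for r in measure.regionprops(labels) if r.area >= 50]

        print(f"%area positive: {100 * burden:.1f}%, objects detected: {len(objects)}")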

  18. Semi-Automated Digital Image Analysis of Pick’s Disease and TDP-43 Proteinopathy

    Science.gov (United States)

    Irwin, David J.; Byrne, Matthew D.; McMillan, Corey T.; Cooper, Felicia; Arnold, Steven E.; Lee, Edward B.; Van Deerlin, Vivianna M.; Xie, Sharon X.; Lee, Virginia M.-Y.; Grossman, Murray; Trojanowski, John Q.

    2015-01-01

    Digital image analysis of histology sections provides reliable, high-throughput methods for neuropathological studies but data is scant in frontotemporal lobar degeneration (FTLD), which has an added challenge of study due to morphologically diverse pathologies. Here, we describe a novel method of semi-automated digital image analysis in FTLD subtypes including: Pick’s disease (PiD, n=11) with tau-positive intracellular inclusions and neuropil threads, and TDP-43 pathology type C (FTLD-TDPC, n=10), defined by TDP-43-positive aggregates predominantly in large dystrophic neurites. To do this, we examined three FTLD-associated cortical regions: mid-frontal gyrus (MFG), superior temporal gyrus (STG) and anterior cingulate gyrus (ACG) by immunohistochemistry. We used a color deconvolution process to isolate signal from the chromogen and applied both object detection and intensity thresholding algorithms to quantify pathological burden. We found object-detection algorithms had good agreement with gold-standard manual quantification of tau- and TDP-43-positive inclusions. Our sampling method was reliable across three separate investigators and we obtained similar results in a pilot analysis using open-source software. Regional comparisons using these algorithms finds differences in regional anatomic disease burden between PiD and FTLD-TDP not detected using traditional ordinal scale data, suggesting digital image analysis is a powerful tool for clinicopathological studies in morphologically diverse FTLD syndromes. PMID:26538548

  19. Resolution analysis of archive films for the purpose of their optimal digitization and distribution

    Science.gov (United States)

    Fliegel, Karel; Vítek, Stanislav; Páta, Petr; Myslík, Jiří; Pecák, Josef; Jícha, Marek

    2017-09-01

    With the recent high demand for ultra-high-definition (UHD) content to be screened not only in high-end digital movie theaters but also in the home environment, film archives full of movies in high definition and above are within the scope of UHD content providers. Movies captured with traditional film technology represent a virtually unlimited source of UHD content. The goal of maintaining complete image information is also related to the choice of scanning resolution and of the spatial resolution for further distribution. It might seem that scanning the film material at the highest possible resolution using state-of-the-art film scanners, and also distributing it at this resolution, is the right choice. The information content of the digitized images is, however, limited, and various degradations moreover lead to its further reduction. Digital distribution of the content at the highest image resolution might therefore be unnecessary or uneconomical. In other cases, the highest possible resolution is inevitable if we want to preserve fine scene details or film grain structure for archiving purposes. This paper deals with the image detail content analysis of archive film records. The resolution limit in the captured scene image and the factors which lower the final resolution are discussed. Methods are proposed to determine the spatial details of the film picture based on the analysis of its digitized image data. These procedures allow determining recommendations for optimal distribution of digitized video content intended for various display devices with lower resolutions. The obtained results are illustrated on a spatial downsampling use case, and a performance evaluation of the proposed techniques is presented.

  20. Optimization and test of a digital radio-frequency control system and developments for the EPICS-based accelerator control system at the S-DALINAC

    International Nuclear Information System (INIS)

    Burandt, Christoph Warwick

    2017-01-01

    The first part of this thesis covers multiple extensions of the digital low-level radio-frequency control system of the electron accelerator S-DALINAC. This comprises bringing into regular operation the piezoelectrically driven fine tuners of the superconducting cavities. For this a power supply series has been developed, which is compatible with the existing electronics and can power the piezo elements. Furthermore the flexibility of the radio-frequency control system has been demonstrated on superconducting quarter-wave-resonators of the ion accelerator ALPI. These 160 MHz cavities are quite different to the S-DALINAC's acceleration structures but could be operated successfully with low residual errors in phase and amplitude nevertheless. The second part describes the migration of the accelerator control system to an EPICS-based system. A multitude of devices with a diversity of interfaces and protocols have been integrated and higher-level functionality has been unified. Within the framework of the construction of the third recirculation beam-line, the beam-path-length adjustment mechanism was motorized.

  1. Understanding Different Levels of Group Functionality: Activity Systems Analysis of an Intercultural Telecollaborative Multilingual Digital Storytelling Project

    Science.gov (United States)

    Priego, Sabrina; Liaw, Meei-Ling

    2017-01-01

    An Activity Theory framework has been increasingly applied for understanding the tension or contradictions in telecollaboration. However, to date, few researchers have applied it to the analysis of digital stories, and none of them, to our knowledge, have used it to analyze the co-creation of multilingual digital stories. In this study, we explore…

  2. Analysis and experimental results of frequency splitting of underwater wireless power transfer

    Directory of Open Access Journals (Sweden)

    Wangqiang Niu

    2017-06-01

    Full Text Available Underwater wireless power transfer (UWPT) is an important technique for powering underwater devices, but its frequency splitting phenomena are not fully elucidated. In this study, the frequency splitting phenomena of a symmetrical planar two-coil wireless power transfer (WPT) system resonant at 90 kHz are investigated in seawater and freshwater. A concise frequency splitting analysis of this WPT system in air, based on a circuit model, is given first, and then experimental data are reported to show that there is little difference between power transfer in air, freshwater and seawater in the 40–140 kHz range of this WPT system. Consequently, the frequency splitting analysis and observations in air are also applicable in freshwater and seawater. It is found that a V-type frequency splitting pattern exists in this WPT system in seawater and freshwater. A frequency shift is observed in this UWPT system in the overcoupled region, and no frequency shift is observed in the undercoupled region. In the undercoupled region, in the low-frequency zone of 40–90 kHz the load voltage characteristics in the three media are identical; in the high-frequency zone of 90–140 kHz, the load voltage in air is slightly larger than that in freshwater and seawater.
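
    A minimal circuit-model sketch of the splitting behaviour (generic, assumed component values tuned to a 90 kHz resonance, not the parameters of the experimental rig) sweeps the driving frequency of two magnetically coupled series-resonant loops; above a critical coupling the single peak splits in two:

        import numpy as np

        # Generic symmetric two-coil WPT model (values are illustrative assumptions).
        L, C = 24e-6, 130e-9            # coil inductance (H) and tuning capacitance (F)
        R, RL = 0.5, 10.0               # coil resistance and load resistance (ohm)
        V = 1.0                         # source voltage amplitude
        f = np.linspace(40e3, 140e3, 2001)
        w = 2 * np.pi * f

        def load_voltage(k):
            """Solve the two mesh equations for coupling coefficient k."""
            M = k * L
            Z1 = R + 1j * (w * L - 1.0 / (w * C))
            Z2 = R + RL + 1j * (w * L - 1.0 / (w * C))
            # [Z1   jwM] [I1]   [V]
            # [jwM  Z2 ] [I2] = [0]   ->   I2 = -j*w*M*V / (Z1*Z2 + (w*M)**2)
            I2 = -1j * w * M * V / (Z1 * Z2 + (w * M) ** 2)
            return np.abs(I2) * RL

        for k in (0.02, 0.10, 0.30):            # under-, near-, and over-coupled
            U = load_voltage(k)
            local_max = np.r_[False, (U[1:-1] > U[:-2]) & (U[1:-1] > U[2:]), False]
            peaks = f[local_max & (U > 0.1 * U.max())]
            print(f"k = {k:.2f}: peak(s) at {np.round(peaks / 1e3, 1)} kHz")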

  3. Frequency Domain Computer Programs for Prediction and Analysis of Rail Vehicle Dynamics : Volume 1. Technical Report

    Science.gov (United States)

    1975-12-01

    Frequency domain computer programs developed or acquired by TSC for the analysis of rail vehicle dynamics are described in two volumes. Volume I defines the general analytical capabilities required for computer programs applicable to single rail vehi...

  4. Frequency analysis for the thermal hydraulic characterization of a natural circulation circuit

    International Nuclear Information System (INIS)

    Torres, Walmir M.; Macedo, Luiz A.; Sabundjian, Gaiane; Andrade, Delvonei A.; Umbehaun, Pedro E.; Conti, Thadeu N.; Mesquita, Roberto N.; Masotti, Paulo H.; Angelo, Gabriel

    2011-01-01

    This paper presents frequency analysis studies of the pressure signals from an experimental natural circulation circuit during a heating process. The main objective is to identify the characteristic frequencies of this process using the fast Fourier transform. Video images are used to associate these frequencies with the phenomenology observed in the circuit during the process. Sub-cooled and saturated flow boiling, heater vibrations, overall circuit vibrations, chugging and geysering were observed. Each phenomenon has its specific associated frequency. Some phenomena and their frequencies must be avoided or attenuated since they can cause damage to the natural circulation circuit and its components. Special operation procedures and devices can be developed to avoid these undesirable frequencies. (author)
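
    A minimal sketch of the spectral step (SciPy assumed; a synthetic pressure trace stands in for the experimental record) estimates the power spectral density with Welch's method and reports the characteristic frequencies:

        import numpy as np
        from scipy import signal

        fs = 200.0                                   # sampling rate (Hz), assumed
        t = np.arange(0, 120, 1 / fs)                # two minutes of record
        rng = np.random.default_rng(0)

        # Synthetic pressure trace: a slow geysering-like oscillation plus a structural
        # vibration component and broadband noise (illustrative, not measured data).
        p = (0.8 * np.sin(2 * np.pi * 0.15 * t)
             + 0.3 * np.sin(2 * np.pi * 12.0 * t)
             + 0.05 * rng.standard_normal(t.size))

        f, Pxx = signal.welch(p, fs=fs, nperseg=4096)
        peaks, _ = signal.find_peaks(Pxx, height=0.05 * Pxx.max())
        print("characteristic frequencies (Hz):", np.round(f[peaks], 2))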

  5. Frequency analysis for the thermal hydraulic characterization of a natural circulation circuit

    Energy Technology Data Exchange (ETDEWEB)

    Torres, Walmir M.; Macedo, Luiz A.; Sabundjian, Gaiane; Andrade, Delvonei A.; Umbehaun, Pedro E.; Conti, Thadeu N.; Mesquita, Roberto N.; Masotti, Paulo H.; Angelo, Gabriel [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)]

    2011-07-01

    This paper presents frequency analysis studies of the pressure signals from an experimental natural circulation circuit during a heating process. The main objective is to identify the characteristic frequencies of this process using the fast Fourier transform. Video images are used to associate these frequencies with the phenomenology observed in the circuit during the process. Sub-cooled and saturated flow boiling, heater vibrations, overall circuit vibrations, chugging and geysering were observed. Each phenomenon has its specific associated frequency. Some phenomena and their frequencies must be avoided or attenuated since they can cause damage to the natural circulation circuit and its components. Special operation procedures and devices can be developed to avoid these undesirable frequencies. (author)

  6. Accuracy of Digital vs Conventional Implant Impression Approach: A Three-Dimensional Comparative In Vitro Analysis.

    Science.gov (United States)

    Basaki, Kinga; Alkumru, Hasan; De Souza, Grace; Finer, Yoav

    To assess the three-dimensional (3D) accuracy and clinical acceptability of implant definitive casts fabricated using a digital impression approach and to compare the results with those of a conventional impression method in a partially edentulous condition. A mandibular reference model was fabricated with implants in the first premolar and molar positions to simulate a patient with bilateral posterior edentulism. Ten implant-level impressions per method were made using either an intraoral scanner with scanning abutments for the digital approach or an open-tray technique and polyvinylsiloxane material for the conventional approach. 3D analysis and comparison of implant location on resultant definitive casts were performed using laser scanner and quality control software. The inter-implant distances and interimplant angulations for each implant pair were measured for the reference model and for each definitive cast (n = 20 per group); these measurements were compared to calculate the magnitude of error in 3D for each definitive cast. The influence of implant angulation on definitive cast accuracy was evaluated for both digital and conventional approaches. Statistical analysis was performed using t test (α = .05) for implant position and angulation. Clinical qualitative assessment of accuracy was done via the assessment of the passivity of a master verification stent for each implant pair, and significance was analyzed using chi-square test (α = .05). A 3D error of implant positioning was observed for the two impression techniques vs the reference model, with mean ± standard deviation (SD) error of 116 ± 94 μm and 56 ± 29 μm for the digital and conventional approaches, respectively (P = .01). In contrast, the inter-implant angulation errors were not significantly different between the two techniques (P = .83). Implant angulation did not have a significant influence on definitive cast accuracy within either technique (P = .64). The verification stent

  7. DNBR calculation in digital core protection system by a subchannel analysis code

    International Nuclear Information System (INIS)

    In, W. K.; Yoo, Y. J.; Hwang, T. H.; Ji, S. K.

    2001-01-01

    The DNBR calculation uncertainty and DNBR margin were evaluated in a digital core protection system by the thermal-hydraulic subchannel analysis code MATRA. A simplified thermal-hydraulic code, CETOP, is used to calculate the on-line DNBR in the core protection system of a digital PWR. A DNBR tuning process against a best-estimate subchannel analysis code is required for CETOP to ensure an accurate and conservative DNBR calculation, but is not necessary for MATRA. The DNBR calculations by MATRA and CETOP were performed for a large number of operating conditions in Yonggwang nuclear units 3-4, where the digital core protection system was first implemented in Korea. MATRA resulted in a less negative mean value (i.e., reduced overconservatism) and a somewhat larger standard deviation of the DNBR error. The uncertainty-corrected minimum DNBR by MATRA was shown to be higher by 1.8%-9.9% than the CETOP DNBR

  8. Analysis of Shape Nonconformity between Embroidered Element and Its Digital Image

    Directory of Open Access Journals (Sweden)

    Svetlana RADAVIČIENĖ

    2014-04-01

    Full Text Available Embroidery technologies are widely applied for developing decorative elements of original design in garments and for integrating threads intended for protection into garments and other articles. Nonconformity of the shape and dimensions of the embroidered element with the designed digital image is influenced by the properties of the embroidery threads and fibres, by the filling type, the density of stitches and other technological parameters. The objective of the paper is to explore the influence of the properties of fabrics and of the stitch direction of the actual embroidered element on the conformity of its shape with that of the designed digital image. For the research, embroidery threads of different purposes as well as three woven fabrics were selected. For the preparation of test samples, round digital images were designed, filling the embroidery area in different stitch directions. Analysis of the results of the investigations has demonstrated that the shape and dimensions of the embroidered element failed to conform to the shape and dimensions of the designed digital image in most cases. In certain cases, e.g. when the stitch direction goes towards the middle of the embroidered element, a defect, i.e. a hole, is observed due to the considerable concentration of stitches in the centre of the element. DOI: http://dx.doi.org/10.5755/j01.ms.20.1.2911

  9. Iso-precision scaling of digitized mammograms to facilitate image analysis

    International Nuclear Information System (INIS)

    Karssmeijer, N.; van Erning, L.

    1991-01-01

    This paper reports on a 12-bit CCD camera equipped with a linear sensor of 4096 photodiodes which is used to digitize conventional mammographic films. An iso-precision conversion of the pixel values is performed to transform the image data to a scale on which the image noise is equal at each level. For this purpose, film noise and digitization noise have been determined as a function of optical density and pixel size. It appears that only at high optical densities is digitization noise comparable to or larger than film noise. The quantization error caused by compression of images recorded with 12 bits per pixel to 8-bit images by an iso-precision conversion has been calculated as a function of the number of quantization levels. For mammograms digitized in a 4096² matrix the additional error caused by such a scale transform is only about 1.5 percent. An iso-precision scale transform can be advantageous when automated procedures for quantitative image analysis are developed. Especially when detection of signals in noise is the aim, a constant noise level over the whole pixel value range is very convenient. This is demonstrated by applying local thresholding to detect small microcalcifications. Results are compared to those obtained by using logarithmic or linearized scales

  10. An analysis of error patterns in children's backward digit recall in noise

    Directory of Open Access Journals (Sweden)

    Homira Osman

    2015-01-01

    Full Text Available The purpose of the study was to determine whether perceptual masking or cognitive processing accounts for a decline in working memory performance in the presence of competing speech. The types and patterns of errors made on the backward digit span in quiet and multitalker babble at -5 dB signal-to-noise ratio (SNR) were analyzed. The errors were classified into two categories: item (if digits that were not presented in a list were repeated) and order (if correct digits were repeated but in an incorrect order). Fifty five children with normal hearing were included. All the children were aged between 7 years and 10 years. Repeated measures of analysis of variance (RM-ANOVA) revealed the main effects for error type and digit span length. In terms of listening condition interaction, it was found that the order errors occurred more frequently than item errors in the degraded listening condition compared to quiet. In addition, children had more difficulty recalling the correct order of intermediate items, supporting strong primacy and recency effects. Decline in children's working memory performance was not primarily related to perceptual difficulties alone. The majority of errors was related to the maintenance of sequential order information, which suggests that reduced performance in competing speech may result from increased cognitive processing demands in noise.

  11. An analysis of error patterns in children's backward digit recall in noise

    Science.gov (United States)

    Osman, Homira; Sullivan, Jessica R.

    2015-01-01

    The purpose of the study was to determine whether perceptual masking or cognitive processing accounts for a decline in working memory performance in the presence of competing speech. The types and patterns of errors made on the backward digit span in quiet and multitalker babble at -5 dB signal-to-noise ratio (SNR) were analyzed. The errors were classified into two categories: item (if digits that were not presented in a list were repeated) and order (if correct digits were repeated but in an incorrect order). Fifty five children with normal hearing were included. All the children were aged between 7 years and 10 years. Repeated measures of analysis of variance (RM-ANOVA) revealed the main effects for error type and digit span length. In terms of listening condition interaction it was found that the order errors occurred more frequently than item errors in the degraded listening condition compared to quiet. In addition, children had more difficulty recalling the correct order of intermediate items, supporting strong primacy and recency effects. Decline in children's working memory performance was not primarily related to perceptual difficulties alone. The majority of errors was related to the maintenance of sequential order information, which suggests that reduced performance in competing speech may result from increased cognitive processing demands in noise. PMID:26168949

  12. Meta-analysis of digital game and study characteristics eliciting physiological stress responses.

    Science.gov (United States)

    van der Vijgh, Benny; Beun, Robbert-Jan; Van Rood, Maarten; Werkhoven, Peter

    2015-08-01

    Digital games have been used as stressors in a range of disciplines for decades. Nonetheless, the underlying characteristics of these stressors and the study in which the stressor was applied are generally not recognized for their moderating effect on the measured physiological stress responses. We have therefore conducted a meta-analysis that analyzes the effects of characteristics of digital game stressors and study design on heart rate, systolic and diastolic blood pressure, in studies carried out from 1976 to 2012. In order to assess the differing quality between study designs, a new scale is developed and presented, coined reliability of effect size. The results show specific and consistent moderating functions of both game and study characteristics, on average accounting for around 43%, and in certain cases up to 57% of the variance found in physiological stress responses. Possible cognitive and physiological processes underlying these moderating functions are discussed, and a new model integrating these processes with the moderating functions is presented. These findings indicate that a digital game stressor does not act as a stressor by virtue of being a game, but rather derives its stressor function from its characteristics and the methodology in which it is used. This finding, together with the size of the associated moderations, indicates the need for a standardization of digital game stressors. © 2015 Society for Psychophysiological Research.

  13. Influence of Type of Frequency Weighting Function On VDV Analysis

    Science.gov (United States)

    Kowalska-Koczwara, Alicja; Stypuła, Krzysztof

    2017-10-01

    Transport vibrations are the subject of much research, which mostly investigates their influence on the structural elements of buildings. Nowadays, however, especially in the centres of large cities where apartments and residential buildings are closer to transport vibration sources, increasing attention is given to providing vibrational comfort to humans in buildings. Currently, in most countries, two main evaluation methods are used: the root mean squared (RMS) method and the vibration dose value (VDV). In this article, the VDV method is presented and the influence of the weighting function selection on the value of VDV is analysed. The measurements required for the analysis were made in Krakow, on a masonry, residential, two-storey building located in the city centre. The building is subjected to two transport vibration sources: tram passages and vehicle passages on a very closely located road. Measurement points were located on the basement wall at ground level, to control the excitation, and in the middle of the floor on the highest storey (in the place where people perceive vibration). The room chosen for measurements is located closest to the transport excitation sources. During the measurements, 25 vibration events were recorded and analysed. VDV values were calculated for three different weighting functions according to the standards ISO 2631-1, ISO 2631-2 and BS-6841. Differences in the VDV values are shown, and the influence of the weighting function selection on the result of the evaluation is also presented. The VDV analysis was performed not only for individual vibration events; the all-day and night vibration exposures were also calculated using the formulas contained in the annex to the standard BS-6841. It is demonstrated that, although there are differences in the values of VDV, the influence on the all-day and night exposure is no longer so significant.
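
    For reference, the vibration dose value is the fourth root of the time integral of the fourth power of the frequency-weighted acceleration, VDV = (integral of a_w(t)^4 dt)^(1/4), and doses from repeated events combine through their fourth powers. A minimal sketch (synthetic, already-weighted acceleration; the ISO/BS weighting filters themselves are not implemented here, and the event count is an assumption):

        import numpy as np

        fs = 1000.0                                   # sampling rate (Hz), assumed
        t = np.arange(0, 8.0, 1 / fs)                 # a single 8 s tram passage

        # Placeholder for a frequency-weighted floor acceleration record (m/s^2).
        a_w = 0.05 * np.sin(2 * np.pi * 10 * t) * np.exp(-((t - 4.0) / 2.0) ** 2)

        def vdv(acceleration, fs):
            """Vibration dose value: fourth root of the integral of a_w**4."""
            return (np.trapz(acceleration ** 4, dx=1.0 / fs)) ** 0.25

        single_event = vdv(a_w, fs)
        n_passages = 120                              # assumed number of events per day
        daily = single_event * n_passages ** 0.25     # doses add in the fourth power
        print(f"VDV per event: {single_event:.4f} m/s^1.75, daily: {daily:.4f} m/s^1.75")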

  14. Modal spectral analysis of piping: Determination of the significant frequency range

    International Nuclear Information System (INIS)

    Geraets, L.H.

    1981-01-01

    This paper investigates the influence of the number of modes on the response of a piping system in a dynamic modal spectral analysis. It shows how the analysis can be limited to a specific frequency range of the pipe (independent of the frequency range of the response spectrum), allowing cost reduction without loss in accuracy. The 'missing mass' is taken into account through an original technique. (orig./HP)

  15. Analysis of Standards Efficiency in Digital Television Via Satellite at Ku and Ka Bands

    Directory of Open Access Journals (Sweden)

    Landeros-Ayala Salvador

    2013-06-01

    Full Text Available In this paper, an analysis of the main technical features of digital television standards for satellite transmission is carried out. Based on simulations and link budget analysis, the standard with the best operational performance is identified, and a comparative efficiency analysis is conducted for the Ku and Ka bands for both transparent and regenerative transponders in terms of power, bandwidth, information rate and link margin, including clear sky, uplink rain, downlink rain and rain on both links.
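
    A minimal sketch of the clear-sky link-budget arithmetic involved (all numbers are illustrative assumptions, not values from the paper): the downlink carrier-to-noise density follows C/N0 = EIRP + G/T - FSL - 10*log10(k), with the free-space loss FSL = 20*log10(4*pi*d*f/c).

        import math

        BOLTZMANN_DB = -228.6          # 10*log10(k), dBW/K/Hz

        def free_space_loss_db(distance_m, freq_hz):
            return 20 * math.log10(4 * math.pi * distance_m * freq_hz / 3e8)

        def cn0_dbhz(eirp_dbw, gt_dbk, distance_m, freq_hz, extra_loss_db=0.0):
            """Clear-sky downlink C/N0 in dB-Hz for a transparent transponder."""
            return (eirp_dbw + gt_dbk - free_space_loss_db(distance_m, freq_hz)
                    - extra_loss_db - BOLTZMANN_DB)

        d_geo = 38_000e3                              # slant range to GEO, assumed
        for band, f in (("Ku", 12.5e9), ("Ka", 20.0e9)):
            cn0 = cn0_dbhz(eirp_dbw=52.0, gt_dbk=15.0, distance_m=d_geo, freq_hz=f)
            print(f"{band}-band clear-sky C/N0: {cn0:.1f} dB-Hz")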

  16. Digital camera image analysis of faeces in detection of cholestatic jaundice in infants

    OpenAIRE

    Parinya Parinyanut; Tai Bandisak; Piyawan Chiengkriwate; Sawit Tanthanuch; Surasak Sangkhathat

    2016-01-01

    Background: Stool colour assessment is a screening method for biliary tract obstruction in infants. This study is aimed to be a proof of concept work of digital photograph image analysis of stool colour compared to colour grading by a colour card, and the stool bilirubin level test. Materials and Methods: The total bilirubin (TB) level contents in stool samples from 17 infants aged less than 1 year, seven with confirmed cholestatic jaundice and ten healthy subjects was measured, and outcome c...

  17. SENTIMENT ANALYSIS OF SOCIAL NETWORKS AS A CHALLENGE TO THE DIGITAL MARKETING

    OpenAIRE

    Brano Markić; Sanja Bijakšić; Arnela Bevanda

    2016-01-01

    Huge amounts of data, in the form of messages on social networks, represent a challenge for digital marketing and marketing analytics when meeting customer requirements, needs and satisfaction with services or products. Marketing strives to be a part of an overall culture based on data and to define marketing strategies that respond to consumers and thus provide economic benefits for the company. Therefore, the focus of marketing analysis is on the data recorded at the social netw...

  18. Multi-scale location analysis of vulnerabilities and their link to disturbances within digital ecosystems

    OpenAIRE

    Jackson, Jennifer

    2017-01-01

    As computer networks evolve, so too do the techniques used by attackers to exploit new vulnerabilities. Natural ecosystems already have resistant and resilient properties that help protect them from unwanted disturbances despite the existence of different vulnerabilities. Computer networks and their environments can be considered as digital ecosystems with different vulnerabilities, and security attacks can be considered as unwanted disturbances. Analysis of vulnerabilities and attacks from...

  19. An analysis of technology usage for streaming digital video in support of a preclinical curriculum.

    Science.gov (United States)

    Dev, P; Rindfleisch, T C; Kush, S J; Stringer, J R

    2000-01-01

    Usage of streaming digital video of lectures in preclinical courses was measured by analysis of the data in the log file maintained on the web server. We observed that students use the video when it is available. They do not use it to replace classroom attendance but rather for review before examinations or when a class has been missed. Usage of video has not increased significantly for any course within the 18 month duration of this project.

  20. Extreme Precipitation Estimation with Typhoon Morakot Using Frequency and Spatial Analysis

    Directory of Open Access Journals (Sweden)

    Hone-Jay Chu

    2011-01-01

    Full Text Available Typhoon Morakot lashed Taiwan and produced copious amounts of precipitation in 2009. From the point of view of hydrological statistics, the impact of the precipitation from Typhoon Morakot can be analyzed and discussed using frequency analysis. The frequency curve, which is fitted mathematically to historical observed data, can be used to estimate the probability of exceedance for runoff events of a certain magnitude. The study integrates frequency analysis and spatial analysis to assess the effect of the Typhoon Morakot event on rainfall frequency in the Gaoping River basin of southern Taiwan. First, extreme rainfall data were collected at sixteen stations for durations of 1, 3, 6, 12, and 24 hours, and then an appropriate probability distribution was selected to analyze the impact of the extreme hydrological event. Spatial rainfall patterns for a return period of 200 years with a 24-hour duration, with and without Typhoon Morakot, are estimated. Results show that, for long durations, the rainfall amount in the frequency analysis differs significantly with and without the event. Furthermore, the spatial analysis shows that the extreme rainfall for a return period of 200 years is highly dependent on topography and is smaller in the southwest than in the east. The results not only demonstrate the distinct effect of Typhoon Morakot on frequency analysis, but could also provide a reference for future planning in hydrological engineering.
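
    The return-level computation behind such a frequency analysis can be sketched as follows (SciPy assumed; the annual-maximum series is synthetic, not the Gaoping River gauge data), fitting a Gumbel distribution to annual maxima and evaluating the 200-year, 24-hour quantile:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        # Synthetic 24-h annual maximum rainfall series (mm), stand-in for gauge data.
        annual_max = stats.gumbel_r.rvs(loc=300.0, scale=120.0, size=40, random_state=rng)

        loc, scale = stats.gumbel_r.fit(annual_max)

        return_period = 200.0
        # The T-year event is the quantile with non-exceedance probability 1 - 1/T.
        design_rainfall = stats.gumbel_r.ppf(1.0 - 1.0 / return_period, loc, scale)
        print(f"estimated 200-yr, 24-h rainfall: {design_rainfall:.0f} mm")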

  1. Soil-structure interaction analysis of NPP containments: substructure and frequency domain methods

    International Nuclear Information System (INIS)

    Venancio-Filho, F.; Almeida, M.C.F.; Ferreira, W.G.; De Barros, F.C.P.

    1997-01-01

    Substructure and frequency domain methods for soil-structure interaction are addressed in this paper. After a brief description of mathematical models for the soil and of excitation, the equations for dynamic soil-structure interaction are developed for a rigid surface foundation and for an embedded foundation. The equations for the frequency domain analysis of MDOF systems are provided. An example of soil-structure interaction analysis with frequency-dependent soil properties is given and examples of identification of foundation impedance functions and soil properties are presented. (orig.)

  2. Econometric analysis of realized covariation: high frequency based covariance, regression, and correlation in financial economics

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Shephard, N.

    2004-01-01

    This paper analyses multivariate high frequency financial data using realized covariation. We provide a new asymptotic distribution theory for standard methods such as regression, correlation analysis, and covariance. It will be based on a fixed interval of time (e.g., a day or week), allowing the number of high frequency returns during this period to go to infinity. Our analysis allows us to study how high frequency correlations, regressions, and covariances change through time. In particular we provide confidence intervals for each of these quantities.
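
    A minimal sketch of the basic realized covariation estimator (synthetic intraday returns, not the asymptotic theory of the paper): the realized covariance over a period is the sum of outer products of the high-frequency return vectors, from which realized correlations and regression betas follow.

        import numpy as np

        rng = np.random.default_rng(1)
        n_obs, true_corr = 288, 0.6                  # e.g. 5-minute returns over one day

        # Simulate two correlated high-frequency return series (illustrative only).
        cov = np.array([[1.0, true_corr], [true_corr, 1.0]]) * 1e-6
        returns = rng.multivariate_normal(np.zeros(2), cov, size=n_obs)

        # Realized covariance: sum of outer products of the intraday return vectors.
        rc = returns.T @ returns

        realized_corr = rc[0, 1] / np.sqrt(rc[0, 0] * rc[1, 1])
        realized_beta = rc[0, 1] / rc[1, 1]          # regression of asset 0 on asset 1
        print(f"realized correlation: {realized_corr:.3f}, realized beta: {realized_beta:.3f}")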

  3. The Real-time Frequency Spectrum Analysis of Neutron Pulse Signal Series

    International Nuclear Information System (INIS)

    Tang Yuelin; Ren Yong; Wei Biao; Feng Peng; Mi Deling; Pan Yingjun; Li Jiansheng; Ye Cenming

    2009-01-01

    Frequency spectrum analysis of neutron pulse signals is a very important method in nuclear stochastic signal processing. Focusing on the special '0' and '1' structure of neutron pulse signal series, this paper proposes a new rotation table and realizes a real-time frequency spectrum algorithm at a 1 GHz sample rate on a PC using addition, addressing and SSE instructions. The numerical experimental results show that, at a count rate of 3×10⁶ s⁻¹, this algorithm is superior to FFTW in time consumption and can meet the real-time requirement of frequency spectrum analysis. (authors)

  4. Study of interhemispheric asymmetries in electroencephalographic signals by frequency analysis

    International Nuclear Information System (INIS)

    Zapata, J F; Garzon, J

    2011-01-01

    This study provides a new method, based on wavelet energy, for the detection of interhemispheric asymmetries in patients under continuous video-electroencephalography (EEG) monitoring at the Intensive Care Unit (ICU). We recorded EEG signals in 42 patients with different pathologies and then performed signal processing using the Matlab program; we compared the abnormalities recorded in the neurophysiologist's report, the images of each patient, and the result of the signal analysis with the Discrete Wavelet Transform (DWT). Conclusions: there is a correspondence between the abnormalities found in the signal processing and the clinical reports of findings in the patients; accordingly, the methodology used can be a useful tool for the diagnosis and early quantitative detection of interhemispheric asymmetries.
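
    A rough sketch of the wavelet-energy comparison (PyWavelets assumed; two synthetic homologous channels stand in for the clinical recordings): each channel is decomposed with a discrete wavelet transform and the relative energy per decomposition level is compared across hemispheres.

        import numpy as np
        import pywt

        fs = 256.0
        t = np.arange(0, 10, 1 / fs)
        rng = np.random.default_rng(0)
        # Synthetic homologous channels: the "right" channel has attenuated alpha activity.
        left = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)
        right = 0.4 * np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)

        def band_energies(x):
            """Relative energy of each DWT level (Daubechies-4, 5 levels)."""
            coeffs = pywt.wavedec(x, "db4", level=5)
            energies = np.array([np.sum(c ** 2) for c in coeffs])
            return energies / energies.sum()

        asymmetry = band_energies(left) - band_energies(right)
        for level, value in enumerate(asymmetry):
            print(f"level {level}: left-right relative energy difference {value:+.3f}")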

  5. Space Shuttle and Space Station Radio Frequency (RF) Exposure Analysis

    Science.gov (United States)

    Hwu, Shian U.; Loh, Yin-Chung; Sham, Catherine C.; Kroll, Quin D.

    2005-01-01

    This paper outlines the modeling techniques and important parameters to define a rigorous but practical procedure that can verify the compliance of RF exposure to the NASA standards for astronauts and electronic equipment. The electromagnetic modeling techniques are applied to analyze RF exposure in Space Shuttle and Space Station environments with reasonable computing time and resources. The modeling techniques are capable of taking into account the field interactions with Space Shuttle and Space Station structures. The obtained results illustrate the multipath effects due to the presence of the space vehicle structures. It's necessary to include the field interactions with the space vehicle in the analysis for an accurate assessment of the RF exposure. Based on the obtained results, the RF keep out zones are identified for appropriate operational scenarios, flight rules and necessary RF transmitter constraints to ensure a safe operating environment and mission success.

  6. T-scan III system diagnostic tool for digital occlusal analysis in orthodontics - a modern approach.

    Science.gov (United States)

    Trpevska, Vesna; Kovacevska, Gordana; Benedeti, Alberto; Jordanov, Bozidar

    2014-01-01

    This systematic literature review was performed to establish the mechanism, methodology, characteristics, clinical application and opportunities of the T-Scan III System as a diagnostic tool for digital occlusal analysis in different fields of dentistry, particularly in orthodontics. Searching of electronic databases, using MEDLINE and PubMed, hand searching of relevant key journals, and screening of the reference lists of included studies, with no language restriction, was performed. Publications providing statistically examined data were included for systematic review. Twenty potentially relevant Randomized Controlled Trials (RCTs) were identified. Only ten met the inclusion criteria. The literature demonstrates that using digital occlusal analysis with the T-Scan III System in orthodontics has a significant advantage with regard to the capability of measuring occlusal parameters in static positions and during mandibular movement. Within the scope of this systematic review, there is evidence to support that the T-Scan system is rapid and accurate in identifying the distribution of tooth contacts, and it shows great promise as a clinical diagnostic screening device for occlusion and for improving occlusion after various dental treatments. Additional clinical studies are required to advance the indication field of this system. The importance of using digital occlusal T-Scan analysis in orthodontics deserves further investigation.

  7. Stability and performance analysis of a jump linear control system subject to digital upsets

    Science.gov (United States)

    Wang, Rui; Sun, Hui; Ma, Zhen-Yang

    2015-04-01

    This paper focuses on the methodology for analyzing the stability and the corresponding tracking performance of a closed-loop digital jump linear control system with a stochastic switching signal. The method is applied to a flight control system. A distributed recoverable platform is implemented on the flight control system and subjected to independent digital upsets. The upset processes are used to simulate electromagnetic environments. Specifically, the paper presents scenarios in which the upset process is directly injected into the distributed flight control system, modeled by independent Markov upset processes and by independent and identically distributed (IID) processes. A theoretical performance analysis and simulation modelling are both presented in detail for a more complete independent digital upset injection. Specific examples are proposed to verify the methodology of the tracking performance analysis. General analyses for different configurations are also proposed. Comparisons among different configurations are conducted to demonstrate the availability and the characteristics of the design. Project supported by the Young Scientists Fund of the National Natural Science Foundation of China (Grant No. 61403395), the Natural Science Foundation of Tianjin, China (Grant No. 13JCYBJC39000), the Scientific Research Foundation for the Returned Overseas Chinese Scholars, State Education Ministry, China, the Tianjin Key Laboratory of Civil Aircraft Airworthiness and Maintenance in Civil Aviation of China (Grant No. 104003020106), and the Fund for Scholars of Civil Aviation University of China (Grant No. 2012QD21x).

  8. Automated analysis of phantom images for the evaluation of long-term reproducibility in digital mammography

    International Nuclear Information System (INIS)

    Gennaro, G; Ferro, F; Contento, G; Fornasin, F; Di Maggio, C

    2007-01-01

    The performance of an automatic software package was evaluated with phantom images acquired by a full-field digital mammography unit. After the validation, the software was used, together with a Leeds TORMAS test object, to model the image acquisition process. Process modelling results were used to evaluate the sensitivity of the method in detecting changes of exposure parameters from routine image quality measurements in digital mammography, which is the ultimate purpose of long-term reproducibility tests. Image quality indices measured by the software included the mean pixel value and standard deviation of circular details and surrounding background, contrast-to-noise ratio and relative contrast; detail counts were also collected. The validation procedure demonstrated that the software localizes the phantom details correctly and the difference between automatic and manual measurements was within a few grey levels. Quantitative analysis showed sufficient sensitivity to relate fluctuations in exposure parameters (kVp or mAs) to variations in image quality indices. In comparison, detail counts were found less sensitive in detecting image quality changes, even when limitations due to observer subjectivity were overcome by automatic analysis. In conclusion, long-term reproducibility tests provided by the Leeds TORMAS phantom with quantitative analysis of multiple IQ indices have been demonstrated to be effective in predicting causes of deviation from standard operating conditions and can be used to monitor stability in full-field digital mammography

  9. Digitized mammograms

    International Nuclear Information System (INIS)

    Bruneton, J.N.; Balu-Maestro, C.; Rogopoulos, A.; Chauvel, C.; Geoffray, A.

    1988-01-01

    Two observers conducted a blind evaluation of 100 mammography files, including 47 malignant cases. Films were read both before and after image digitization at 50 μm and 100 μm with the FilmDRSII. Digitization permitted better analysis of the normal anatomic structures and moderately improved diagnostic sensitivity. Searches for microcalcifications before and after digitization at 100 μm and 50 μm showed better analysis of anatomic structures after digitization (especially for solitary microcalcifications). The diagnostic benefit, with discovery of clustered microcalcifications, was more limited (one case at 100 μm, nine cases at 50 μm). Recognition of microcalcifications was clearly improved in dense breasts, which can benefit from reinterpretation after digitization at 50 μm rather than 100 μm

  10. Automatic flow analysis of digital subtraction angiography using independent component analysis in patients with carotid stenosis.

    Directory of Open Access Journals (Sweden)

    Han-Jui Lee

    Full Text Available Current time-density curve analysis of digital subtraction angiography (DSA) provides intravascular flow information but requires manual vasculature selection. We developed an angiographic marker that represents cerebral perfusion by using automatic independent component analysis. We retrospectively analyzed the data of 44 patients with unilateral carotid stenosis higher than 70% according to North American Symptomatic Carotid Endarterectomy Trial criteria. For all patients, magnetic resonance perfusion (MRP) was performed one day before DSA. Fixed contrast injection protocols and DSA acquisition parameters were used before stenting. The cerebral circulation time (CCT) was defined as the difference in the time to peak between the parietal vein and the cavernous internal carotid artery in a lateral angiogram. Both anterior-posterior and lateral DSA views were processed using independent component analysis, and the capillary angiogram was extracted automatically. The full width at half maximum of the time-density curve in the capillary phase in the anterior-posterior and lateral DSA views was defined as the angiographic mean transit time (aMTT), i.e., aMTTAP and aMTTLat. The correlations between the degree of stenosis, CCT, aMTTAP, aMTTLat, and MRP parameters were evaluated. The degree of stenosis showed no correlation with CCT, aMTTAP, aMTTLat, or any MRP parameter. CCT showed a strong correlation with aMTTAP (r = 0.67) and aMTTLat (r = 0.72). Among the MRP parameters, CCT showed only a moderate correlation with MTT (r = 0.67) and Tmax (r = 0.40). aMTTAP showed a moderate correlation with Tmax (r = 0.42) and a strong correlation with MTT (r = 0.77). aMTTLat showed similar correlations with Tmax (r = 0.59) and MTT (r = 0.73). Apart from vascular anatomy, aMTT estimates brain parenchyma hemodynamics from DSA and is concordant with MRP. This process is completely automatic and provides immediate measurement of quantitative peritherapeutic brain parenchyma
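
    The aMTT defined above is the full width at half maximum of the capillary-phase time-density curve. A minimal sketch of that measurement, assuming a sampled curve `density` at frame times `t` (array names are illustrative):

    ```python
    import numpy as np

    def full_width_half_maximum(t, density):
        """FWHM of a time-density curve: width of the interval where the curve
        stays at or above half of its peak value (no sub-frame interpolation)."""
        density = np.asarray(density, dtype=float)
        half_max = density.max() / 2.0
        above = np.where(density >= half_max)[0]
        return t[above[-1]] - t[above[0]] if above.size else 0.0
    ```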

  11. Eulerian frequency analysis of structural vibrations from high-speed video

    International Nuclear Information System (INIS)

    Venanzoni, Andrea; De Ryck, Laurent; Cuenca, Jacques

    2016-01-01

    An approach for the analysis of the frequency content of structural vibrations from high-speed video recordings is proposed. The techniques and tools proposed rely on an Eulerian approach, that is, using the time history of pixels independently to analyse structural motion, as opposed to Lagrangian approaches, where the motion of the structure is tracked in time. The starting point is an existing Eulerian motion magnification method, which consists in decomposing the video frames into a set of spatial scales through a so-called Laplacian pyramid [1]. Each scale — or level — can be amplified independently to reconstruct a magnified motion of the observed structure. The approach proposed here provides two analysis tools or pre-amplification steps. The first tool provides a representation of the global frequency content of a video per pyramid level. This may be further enhanced by applying an angular filter in the spatial frequency domain to each frame of the video before the Laplacian pyramid decomposition, which allows for the identification of the frequency content of the structural vibrations in a particular direction of space. This proposed tool complements the existing Eulerian magnification method by amplifying selectively the levels containing relevant motion information with respect to their frequency content. This magnifies the displacement while limiting the noise contribution. The second tool is a holographic representation of the frequency content of a vibrating structure, yielding a map of the predominant frequency components across the structure. In contrast to the global frequency content representation of the video, this tool provides a local analysis of the periodic gray scale intensity changes of the frame in order to identify the vibrating parts of the structure and their main frequencies. Validation cases are provided and the advantages and limits of the approaches are discussed. The first validation case consists of the frequency content
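
    A minimal sketch of the underlying Eulerian idea, treating each pixel's time history independently and mapping its dominant temporal frequency; the Laplacian pyramid decomposition and angular filtering described above are omitted, and the array shapes are assumptions:

    ```python
    import numpy as np

    def dominant_pixel_frequencies(frames, fps):
        """Per-pixel temporal FFT of a video block (frames: T x H x W grayscale array);
        returns an H x W map of the dominant temporal frequency in Hz."""
        frames = np.asarray(frames, dtype=float)
        frames = frames - frames.mean(axis=0)            # remove the static component
        spectra = np.abs(np.fft.rfft(frames, axis=0))    # magnitude spectrum per pixel
        freqs = np.fft.rfftfreq(frames.shape[0], d=1.0 / fps)
        return freqs[np.argmax(spectra, axis=0)]
    ```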

  12. Analysis Method of Common Cause Failure on Non-safety Digital Control System

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yun Goo; Oh, Eun Gse [KHNP, Daejeon (Korea, Republic of)

    2014-08-15

    The effects of common cause failure on safety digital instrumentation and control systems have been considered in defense-in-depth analysis with the safety analysis method. However, the effects of common cause failure on non-safety digital instrumentation and control systems should also be evaluated, since common cause failure can be included among the credible failures of a non-safety system. In the I and C architecture of a nuclear power plant, many design features have been applied to preserve the functional integrity of the control system. One of these is segmentation, which limits the propagation of faults through the I and C architecture; some effects of common cause failure can therefore also be limited by segmentation. Accordingly, this paper considers two types of failure mode: failures within one segmented control group, and failures across multiple control groups, because segmentation cannot defend against all effects of common cause failure. For each type, the worst failure scenario needs to be determined, and an analysis method for doing so is proposed in this paper. The evaluation can be qualitative when there is sufficient justification that the effects are bounded by the previous safety analysis. When they are not bounded by the previous safety analysis, additional analysis should be performed, either with the conservative assumptions and method of the previous safety analysis or with a best-estimate method using realistic assumptions.

  13. Determining the optimal system-specific cut-off frequencies for filtering in-vitro upper extremity impact force and acceleration data by residual analysis.

    Science.gov (United States)

    Burkhart, Timothy A; Dunning, Cynthia E; Andrews, David M

    2011-10-13

    The fundamental nature of impact testing requires a cautious approach to signal processing, to minimize noise while preserving important signal information. However, few recommendations exist regarding the most suitable filter cut-off frequencies to achieve these goals. Therefore, the purpose of this investigation is twofold: to illustrate how residual analysis can be utilized to quantify optimal system-specific filter cut-off frequencies for force, moment, and acceleration data resulting from in-vitro upper extremity impacts, and to show how optimal cut-off frequencies can vary with impact condition intensity. Eight human cadaver radii specimens were impacted with a pneumatic impact testing device at impact energies that increased from 20 J, in 10 J increments, until fracture occurred. The optimal filter cut-off frequency for pre-fracture and fracture trials was determined with a residual analysis performed on all force and acceleration waveforms. Force and acceleration data were filtered with a dual-pass, 4th order Butterworth filter at each of 14 different cut-off values ranging from 60 Hz to 1500 Hz. Mean (SD) pre-fracture and fracture optimal cut-off frequencies for the force variables were 605.8 (82.7) Hz and 513.9 (79.5) Hz, respectively. Differences in the optimal cut-off frequency were also found between signals (e.g. Fx (medial-lateral), Fy (superior-inferior), Fz (anterior-posterior)) within the same test. These optimal cut-off frequencies do not universally agree with the recommendation of filtering all upper extremity impact data at a cut-off frequency of 600 Hz. This highlights the importance of quantifying filter cut-off frequencies specific to the instrumentation and experimental set-up. Improper digital filtering may lead to erroneous results, and the lack of standardized approaches makes it difficult to compare findings of in-vitro dynamic testing between laboratories. Copyright © 2011 Elsevier Ltd. All rights reserved.
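
    A sketch of the residual-analysis idea in Python, assuming a zero-lag Butterworth filter (2nd-order design; the dual pass gives an effective 4th-order response). The optimal cut-off is then read from where the residual curve departs from a straight line fitted to its noise-dominated tail, a step not shown here:

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    def residual_curve(raw_signal, fs, cutoffs_hz):
        """RMS residual between the raw and low-pass-filtered signal for each
        candidate cut-off frequency (Winter-style residual analysis)."""
        residuals = []
        for fc in cutoffs_hz:
            b, a = butter(2, fc / (fs / 2.0), btype="low")  # 2nd-order design
            filtered = filtfilt(b, a, raw_signal)           # dual (forward-backward) pass
            residuals.append(np.sqrt(np.mean((raw_signal - filtered) ** 2)))
        return np.asarray(residuals)
    ```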

  14. Analysis of the influence of aliasing effect on the digital X ray images

    International Nuclear Information System (INIS)

    Niu Yantao; Liu Zhensheng; Wang Gexin; Zhao Bo; Hao Hui; Yan Shulin

    2007-01-01

    Objective: To investigate the causes of the aliasing effect in digital radiography and methods of eliminating it. Methods: A stationary grid and a rectangular-wave resolution test phantom were imaged on a Kodak CR900 system. The lead strips of the phantom were either parallel to the laser scanning direction or at an angle of 45 degrees when exposed on the imaging plate, and the ability of the two types of images to represent the resolution test phantom was observed. The grid was imaged with its lead strips parallel or perpendicular to the laser scanning direction, and the two images were observed and compared on a monitor at various magnification rates. Results: In the phantom images, lead bars below a frequency of 3.93 line pairs per mm, the Nyquist frequency of the system, could be discriminated. However, lead bars with a frequency of 4.86 line pairs per mm could still be distinguished in the image of the test phantom angled at 45 degrees. When the grid lead strips were parallel to the imaging plate scanning direction, the resulting images displayed visible streak artifacts, and the degree of visibility differed markedly between grid strips parallel and perpendicular to the laser scanning direction. The streaks were not clear when the image was displayed at true size on the monitor, but their widths varied over a large range when the image was zoomed in or out, and the directions of the streaks changed as well. Conclusions: Optimum stationary grids should be selected in clinical practice according to the limiting resolution of the CR system, because the aliasing effect has a detrimental influence; the grid frequency should be greater than the Nyquist frequency. The grid strip direction should be perpendicular to the laser scanning direction in clinical use to avoid streak artifacts. Different magnification rates notably affect the image appearance on the monitor, and display at integer multiples of the true image size is suggested. (authors)
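
    The Nyquist frequency quoted above follows from the sampling pitch of the reader. A one-line check, where the 0.127 mm pitch is an assumption consistent with the 3.93 line pairs per mm figure:

    ```python
    def nyquist_lp_per_mm(sampling_pitch_mm):
        """Nyquist frequency of a sampled imaging system, in line pairs per mm."""
        return 1.0 / (2.0 * sampling_pitch_mm)

    print(nyquist_lp_per_mm(0.127))  # about 3.94 lp/mm; a grid above this frequency can alias
    ```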

  15. Safety analysis for the use of new digital safety I and C systems

    International Nuclear Information System (INIS)

    Buehler, Cornelia

    2012-01-01

    Age-induced replacement or modernization of safety I and C systems by digital equipment technology has been one of the topical subjects in nuclear technology for more than a decade. Digital equipment technology in this case means microcontroller- or microprocessor-based systems which implement I and C functions in software (SW) and, on the other hand, systems with programmed hardware (HW) components, such as Application-Specific Integrated Circuits (ASIC), Field Programmable Gate Arrays (FPGA) or Programmable Logic Devices (PLD), which can be developed only by means of sophisticated SW development environments. The switch to digital equipment technology is more than a mere change in equipment technology, even though the I and C functions remain almost identical in most cases. The switch not only leads to a different approach in equipment qualification, but also requires new focal points when it comes to assessing plant design, and it needs new or adapted methods of analysis and evaluation. The main reason lies in the greater possibility of systematic errors, caused mainly by software-based development, manufacture and maintenance. New and adapted methods of analysis and evaluation for I and C systems are presented and explained. It is safe to say that safety I and C technology in the highest category of requirements necessitates a very far-reaching realignment in design and evaluation as well as the use of new analytical techniques. This meets the claim of an I and C technology that is fit for use, reliable and comparable to the technology it replaces. (orig.)

  16. Vulnerability Identification and Design-Improvement-Feedback using Failure Analysis of Digital Control System Designs

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Eunchan; Bae, Yeonkyoung [Korea Hydro and Nuclear Power Co., Ltd., Daejeon (Korea, Republic of)

    2013-05-15

    Fault tree analysis lets analysts establish the failure sequences of components as a logical model and confirm the result at the plant level. Together, these analyses provide insight into what improvements are needed to increase availability, because the quantified design attributes of the system are expressed as minimal cut sets and availability values interfaced with component reliability data in the fault trees. This combined failure analysis method helps system users understand system characteristics, including weaknesses and strengths in relation to faults, at the design stage before system operation. This study explained why a digital system could have weaknesses in its methods of transferring control signals or data and how those vulnerabilities could cause unexpected outputs. In particular, the result of the analysis confirmed that complex optical communication is not recommended for digital data transmission in the critical systems of nuclear power plants. Regarding the loop controllers in Design A, the logic configuration should be changed to prevent spurious actuation due to a single failure, using hardware or software improvements such as cross-checking between redundant modules or diagnosis of output signal integrity. Unavailability calculations support these insights from the failure analyses of the systems. In the near future, KHNP will perform failure mode and effect analyses in the design stage, before purchasing non-safety-related digital system packages. In addition, the design requirements of the system will be confirmed based on evaluation of overall system availability or unavailability.

  17. Game-based digital interventions for depression therapy: a systematic review and meta-analysis.

    Science.gov (United States)

    Li, Jinhui; Theng, Yin-Leng; Foo, Schubert

    2014-08-01

    The aim of this study was to review the existing literature on game-based digital interventions for depression systematically and examine their effectiveness through a meta-analysis of randomized controlled trials (RCTs). Database searching was conducted using specific search terms and inclusion criteria. A standard meta-analysis was also conducted of available RCT studies with a random effects model. The standardized mean difference (Cohen's d) was used to calculate the effect size of each study. Nineteen studies were included in the review, and 10 RCTs (eight studies) were included in the meta-analysis. Four types of game interventions (psycho-education and training, virtual reality exposure therapy, exercising, and entertainment) were identified, with various types of support delivered and populations targeted. The meta-analysis revealed a moderate effect size of the game interventions for depression therapy at posttreatment (d=-0.47 [95% CI -0.69 to -0.24]). A subgroup analysis showed that interventions based on psycho-education and training had a smaller effect than those based on the other forms, and that self-help interventions yielded better outcomes than supported interventions. A higher effect was achieved when a waiting list was used as the control. The review and meta-analysis support the effectiveness of game-based digital interventions for depression. More large-scale, high-quality RCT studies with sufficient long-term data for treatment evaluation are needed.
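
    The standardised mean difference used above is a routine calculation; a minimal sketch with a pooled standard deviation (the random-effects pooling across studies is not reproduced here):

    ```python
    import numpy as np

    def cohens_d(mean_treat, mean_ctrl, sd_treat, sd_ctrl, n_treat, n_ctrl):
        """Cohen's d: difference in group means divided by the pooled standard deviation."""
        pooled_sd = np.sqrt(((n_treat - 1) * sd_treat ** 2 + (n_ctrl - 1) * sd_ctrl ** 2)
                            / (n_treat + n_ctrl - 2))
        return (mean_treat - mean_ctrl) / pooled_sd
    ```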

  18. A Hybrid Soft-computing Method for Image Analysis of Digital Plantar Scanners

    Science.gov (United States)

    Razjouyan, Javad; Khayat, Omid; Siahi, Mehdi; Mansouri, Ali Alizadeh

    2013-01-01

    Digital foot scanners have been developed in recent years to provide anthropometrists with a digital image of the insole, together with pressure distribution and anthropometric information. In this paper, a hybrid algorithm combining a gray level spatial correlation (GLSC) histogram and Shanbag entropy is presented for the analysis of scanned foot images. An evolutionary algorithm is also employed to find the optimum parameters of the GLSC and the transform function of the membership values. The resulting binary (thresholded) images then undergo anthropometric measurements, taking into account the scale factor from pixel size to metric scale. The proposed method is finally applied to plantar images obtained by scanning the feet of randomly selected subjects with a foot scanner system, the experimental setup described in the paper. Running computation time and the effects of the GLSC parameters are investigated in the simulation results. PMID:24083133

  19. A Hybrid Soft-computing Method for Image Analysis of Digital Plantar Scanners.

    Science.gov (United States)

    Razjouyan, Javad; Khayat, Omid; Siahi, Mehdi; Mansouri, Ali Alizadeh

    2013-01-01

    Digital foot scanners have been developed in recent years to provide anthropometrists with a digital image of the insole, together with pressure distribution and anthropometric information. In this paper, a hybrid algorithm combining a gray level spatial correlation (GLSC) histogram and Shanbag entropy is presented for the analysis of scanned foot images. An evolutionary algorithm is also employed to find the optimum parameters of the GLSC and the transform function of the membership values. The resulting binary (thresholded) images then undergo anthropometric measurements, taking into account the scale factor from pixel size to metric scale. The proposed method is finally applied to plantar images obtained by scanning the feet of randomly selected subjects with a foot scanner system, the experimental setup described in the paper. Running computation time and the effects of the GLSC parameters are investigated in the simulation results.

  20. The role of business agreements in defining textbook affordability and digital materials: A document analysis

    Directory of Open Access Journals (Sweden)

    John Raible

    2015-12-01

    Full Text Available Adopting digital materials such as eTextbooks and e-coursepacks is a potential strategy to address textbook affordability in the United States. However, university business relationships with bookstore vendors implicitly structure which instructional resources are available and in what manner. In this study, a document analysis was conducted on the bookstore contracts for the universities included in the State University System of Florida. Namely, issues of textbook affordability, digital material terminology and seller exclusivity were investigated. It was found that textbook affordability was generally conceived in terms of print rental textbooks and buyback programs, and that eTextbooks were priced higher than print textbooks (25% to 30% markup). Implications and recommendations for change are shared. DOI: 10.18870/hlrc.v5i4.284

  1. Semi-automated digital measurement as the method of choice for beta cell mass analysis.

    Directory of Open Access Journals (Sweden)

    Violette Coppens

    Full Text Available Pancreas injury by partial duct ligation (PDL) activates beta cell differentiation and proliferation in adult mouse pancreas but remains controversial regarding the anticipated increase in beta cell volume. Several reports unable to show beta cell volume augmentation in PDL pancreas used automated digital image analysis software. We hypothesized that fully automatic beta cell morphometry without manual micrograph artifact remediation introduces bias and therefore might be responsible for reported discrepancies and controversy. However, our present results prove that standard digital image processing with automatic thresholding is sufficiently robust albeit less sensitive and less adequate to demonstrate a significant increase in beta cell volume in PDL versus Sham-operated pancreas. We therefore conclude that other confounding factors such as quality of surgery, selection of samples based on relative abundance of the transcription factor Neurogenin 3 (Ngn3) and tissue processing give rise to inter-laboratory inconsistencies in beta cell volume quantification in PDL pancreas.

  2. Study on The Extended Range Weather Forecast of Low Frequency Signal Based on Period Analysis Method

    Science.gov (United States)

    Li, X.

    2016-12-01

    Although many studies have explored the MJO and its application to weather forecasting, low-frequency oscillation has been insufficiently studied for extended range weather forecasting over the middle and high latitudes. In China, the low-frequency synoptic map is a useful tool for operational meteorological departments to produce extended range forecasts. It is therefore necessary to develop objective methods for finding the low-frequency signal and for interpreting and applying this signal in extended range weather forecasting. In this paper, a Butterworth band-pass filter was applied to obtain the low-frequency 500 hPa height field from 1980 to 2014 using NCEP/NCAR daily grid data. Period analysis and optimal subset regression methods were then used to process the low-frequency data from the 150 days before the first forecast day and to extend the low-frequency signal of the 500 hPa height field 30 days into the future, globally, for June to August during 2011-2014. Finally, the results were tested. The main results are as follows: (1) In general, the fitting of the low-frequency signal of the 500 hPa height field by period analysis was better in the northern hemisphere than in the southern hemisphere, and better at low latitudes than at high latitudes. The fitting accuracy gradually decreased with increasing forecast length and tended to be stable in the later part of the forecast period. (2) The fitting over the 6 key regions in China showed that, except for the filtering result over the Xinjiang area in the first 10 days and 30 days, the filtering results over the other 5 key regions throughout the whole period passed the reliability test at a level greater than 95%. (3) The centre and extent of low- and high-value low-frequency systems can be fitted well using the methods above, which is consistent with the corresponding use of the low-frequency synoptic map for extended range prediction. Application of the
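
    A minimal sketch of the Butterworth band-pass step applied to a daily 500 hPa height series; the 10-30-day intraseasonal pass band is an assumption, since the abstract does not state the exact band used:

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    def lowfreq_height(height_500hpa, low_days=10, high_days=30, order=4):
        """Band-pass filter a daily 500 hPa height series to keep the low-frequency
        (assumed 10-30-day) component. Sampling interval is one day."""
        nyquist = 0.5                              # cycles per day for daily data
        band = [1.0 / high_days / nyquist, 1.0 / low_days / nyquist]
        b, a = butter(order, band, btype="band")
        return filtfilt(b, a, np.asarray(height_500hpa, dtype=float))
    ```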

  3. Performance analysis of the closed digital control circuit of reactor A-1

    International Nuclear Information System (INIS)

    Karpeta, C.; Volf, K.; Stirsky, P.; Roubal, S.; Muellerova, H.

    A computer-aided analysis is presented of the optimum digital control of the A-1 nuclear power plant reactor. The effect of index weighting matrices on the quality of control processes was studied for a deterministic case using the Separation Theorem for a linear time-discrete regulator problem with a quadratic performance index. Some properties of the Kalman filter used for process state estimation were also investigated. An analysis is reported for a stochastic case, for both time-invariant and time-variant Kalman filter gain matrices. (author)

  4. Analysis of first and second order binary quantized digital phase-locked loops for ideal and white Gaussian noise inputs

    Science.gov (United States)

    Blasche, P. R.

    1980-01-01

    Specific configurations of first and second order all-digital phase locked loops are analyzed for both ideal and additive white Gaussian noise inputs. In addition, a design for a hardware digital phase locked loop capable of either first or second order operation is presented along with appropriate experimental data obtained from testing of the hardware loop. All parameters chosen for the analysis and the design of the digital phase locked loop are consistent with an application to an Omega navigation receiver, although neither the analysis nor the design is limited to this application.
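
    A simplified sketch of a first-order binary-quantized digital phase-locked loop of the general kind analysed above; the phase detector is hard-limited to ±1 and the loop applies a fixed phase correction, with the step size chosen arbitrarily here:

    ```python
    import numpy as np

    def first_order_bq_dpll(input_phase, step=0.02):
        """Track an input phase sequence with a first-order binary-quantized DPLL:
        the loop phase moves by a fixed step in the direction of the sign of the error."""
        estimate = 0.0
        track = np.empty(len(input_phase), dtype=float)
        for k, phi in enumerate(input_phase):
            error = np.sign(np.sin(phi - estimate))  # binary-quantized phase detector
            estimate += step * error                 # first-order (single accumulator) update
            track[k] = estimate
        return track
    ```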

  5. Frequency Analysis Using Bootstrap Method and SIR Algorithm for Prevention of Natural Disasters

    Science.gov (United States)

    Kim, T.; Kim, Y. S.

    2017-12-01

    The frequency analysis of hydrometeorological data is one of the most important factors in responding to natural disaster damage and in setting design standards for disaster prevention facilities. Frequency analysis of hydrometeorological data usually assumes that the observations are statistically stationary, and a parametric method based on the parameters of a probability distribution is applied. A parametric method requires a sufficiently large sample of reliable data; however, snowfall observations in Korea need to be supplemented because the number of days with snowfall observations and the mean maximum daily snowfall depth are decreasing due to climate change. In this study, we conducted a frequency analysis for snowfall using the Bootstrap method and the SIR algorithm, resampling methods that can overcome the problem of insufficient data. For the 58 meteorological stations distributed evenly over Korea, probabilistic snowfall depths were estimated by non-parametric frequency analysis using the maximum daily snowfall depth data. The results show that the probabilistic daily snowfall depth from the frequency analysis decreases at most stations, and the rates of change at most stations were found to be consistent between the parametric and non-parametric frequency analyses. This study shows that resampling methods can perform frequency analysis of snowfall depth with an insufficient number of observed samples, and that they can be applied to the interpretation of other natural disasters with seasonal characteristics, such as summer typhoons. Acknowledgment. This research was supported by a grant (MPSS-NH-2015-79) from the Disaster Prediction and Mitigation Technology Development Program funded by the Korean Ministry of Public Safety and Security (MPSS).
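
    A minimal sketch of the non-parametric bootstrap step for one station; the SIR (sampling-importance-resampling) part of the study is not reproduced, and the return period and sample sizes are placeholders:

    ```python
    import numpy as np

    def bootstrap_snowfall_quantile(annual_max_depth, return_period=50, n_boot=2000, seed=0):
        """Bootstrap estimate of the T-year maximum daily snowfall depth:
        resample the annual maxima with replacement and take the (1 - 1/T) quantile."""
        rng = np.random.default_rng(seed)
        data = np.asarray(annual_max_depth, dtype=float)
        p = 1.0 - 1.0 / return_period
        levels = np.array([np.quantile(rng.choice(data, size=data.size, replace=True), p)
                           for _ in range(n_boot)])
        return levels.mean(), np.percentile(levels, [2.5, 97.5])
    ```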

  6. Analysis of Fatigue Life of PMMA at Different Frequencies Based on a New Damage Mechanics Model

    Directory of Open Access Journals (Sweden)

    Aifeng Huang

    2014-01-01

    Full Text Available Low-cycle fatigue tests at different frequencies and creep tests under different stress levels were conducted on Plexiglas Resist 45. Correspondingly, the creep fracture time, S-N curves, cyclic creep, and hysteresis loops were obtained. These results showed that fatigue life increases with frequency in the low-frequency domain. Further analysis found that fatigue life depends on the load rate and is affected by creep damage. In addition, a new continuum damage mechanics (CDM) model was established to analyze creep-fatigue life, in which a nonlinear summation rule for damage increments was proposed and a frequency modification was made to the fatigue damage evolution equation. A differential evolution (DE) algorithm was employed to determine the parameters of the model. The proposed model describes fatigue life under different frequencies, and the calculated results agreed well with the experimental results.
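
    The differential evolution step is a generic global parameter search. A hedged sketch using SciPy, where `predicted_life` stands in for the paper's CDM life equation (not given in the abstract) and the three parameter bounds are placeholders:

    ```python
    import numpy as np
    from scipy.optimize import differential_evolution

    def fit_damage_parameters(frequencies, measured_lives, predicted_life):
        """Fit model parameters by minimising the squared log-error between measured
        fatigue lives and the lives predicted by `predicted_life(params, frequency)`."""
        def cost(params):
            predicted = np.array([predicted_life(params, f) for f in frequencies])
            return np.sum((np.log(predicted) - np.log(measured_lives)) ** 2)

        result = differential_evolution(cost, bounds=[(1e-6, 10.0)] * 3, seed=0)
        return result.x
    ```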

  7. Digital camera image analysis of faeces in detection of cholestatic jaundice in infants.

    Science.gov (United States)

    Parinyanut, Parinya; Bandisak, Tai; Chiengkriwate, Piyawan; Tanthanuch, Sawit; Sangkhathat, Surasak

    2016-01-01

    Stool colour assessment is a screening method for biliary tract obstruction in infants. This study is a proof of concept of digital photograph image analysis of stool colour, compared with colour grading by a colour card and with the stool bilirubin level test. The total bilirubin (TB) content of stool samples from 17 infants aged less than 1 year, seven with confirmed cholestatic jaundice and ten healthy subjects, was measured, and the outcome was correlated with the physical colour of the stool. The seven infants with cholestasis included 6 cases of biliary atresia and 1 case of pancreatic mass. All pre-operative stool samples in these cases were graded 1 on the stool card (stool colour in healthy infants ranges from 4 to 6). The average stool TB in the pale stool group was 43.07 μg/g compared to 101.78 μg/g in the non-pale stool group. Of the 3 colour channels assessed in the digital photographs, the blue and green channels were best able to discriminate accurately between the pre-operative stool samples from infants with cholestasis and the samples from the healthy controls. With red, green, and blue (RGB) image analysis using wave level as the ANN input, the system predicts the stool TB with a relationship coefficient of 0.96, compared to 0.61 when stool colour card grading was used. Input from digital camera images of stool had a higher predictive capability than the standard stool colour card, indicating that digital photographs may be a useful tool for the detection of cholestasis in infants.

  8. Effect Analysis of Faults in Digital I and C Systems of Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung Jun; Jung, Won Dea [KAERI, Dajeon (Korea, Republic of); Kim, Man Cheol [Chung-Ang University, Seoul (Korea, Republic of)

    2014-08-15

    A reliability analysis of digital instrumentation and control (I and C) systems in nuclear power plants has been introduced as one of the important elements of a probabilistic safety assessment because of the unique characteristics of digital I and C systems. Digital I and C systems have various features distinguishable from those of analog I and C systems, such as software and fault-tolerant techniques. In this work, the faults in a digital I and C system were analyzed and a model for representing the effects of the faults was developed. First, the effects of faults in a system were analyzed using fault injection experiments. A software-implemented fault injection technique, in which faults can be injected into the memory, was used based on the assumption that all faults in a system are reflected in faults in the memory. In the experiments, the effect of a fault on the system output was observed. In addition, the success or failure in detecting the fault by the fault-tolerant functions included in the system was identified. Second, a fault tree model representing how a fault propagates to the system output was developed. With the model, it can be identified how a fault propagates to the output and why a fault is not detected by fault-tolerant techniques. Based on the analysis results of the proposed method, it is possible not only to evaluate the system reliability but also to identify weak points of the fault-tolerant techniques by identifying undetected faults. The results can be reflected in the designs to improve the capability of fault-tolerant techniques.
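
    The software-implemented fault injection described above amounts to perturbing memory contents. A minimal illustration of the single-bit-flip case (the experiments' actual injection targets and mechanism are not specified in this abstract):

    ```python
    def inject_bit_flip(memory_word: int, bit_index: int) -> int:
        """Emulate a memory fault by flipping one bit of a stored word."""
        return memory_word ^ (1 << bit_index)

    # Example: flipping bit 3 of 0b1010 gives 0b0010
    assert inject_bit_flip(0b1010, 3) == 0b0010
    ```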

  9. Digital camera image analysis of faeces in detection of cholestatic jaundice in infants

    Directory of Open Access Journals (Sweden)

    Parinya Parinyanut

    2016-01-01

    Full Text Available Background: Stool colour assessment is a screening method for biliary tract obstruction in infants. This study is a proof of concept of digital photograph image analysis of stool colour, compared with colour grading by a colour card and with the stool bilirubin level test. Materials and Methods: The total bilirubin (TB) content of stool samples from 17 infants aged less than 1 year, seven with confirmed cholestatic jaundice and ten healthy subjects, was measured, and the outcome was correlated with the physical colour of the stool. Results: The seven infants with cholestasis included 6 cases of biliary atresia and 1 case of pancreatic mass. All pre-operative stool samples in these cases were graded 1 on the stool card (stool colour in healthy infants ranges from 4 to 6). The average stool TB in the pale stool group was 43.07 μg/g compared to 101.78 μg/g in the non-pale stool group. Of the 3 colour channels assessed in the digital photographs, the blue and green channels were best able to discriminate accurately between the pre-operative stool samples from infants with cholestasis and the samples from the healthy controls. With red, green, and blue (RGB) image analysis using wave level as the ANN input, the system predicts the stool TB with a relationship coefficient of 0.96, compared to 0.61 when stool colour card grading was used. Conclusion: Input from digital camera images of stool had a higher predictive capability than the standard stool colour card, indicating that digital photographs may be a useful tool for the detection of cholestasis in infants.

  10. Extraction Of Electronic Evidence From VoIP: Identification & Analysis Of Digital Speech

    Directory of Open Access Journals (Sweden)

    David Irwin

    2012-09-01

    Full Text Available The Voice over Internet Protocol (VoIP) is increasing in popularity as a cost-effective and efficient means of making telephone calls via the Internet. However, VoIP may also be an attractive method of communication to criminals, as their true identity may be hidden and voice and video communications are encrypted as they are carried across the Internet. This introduces a new set of challenges for forensic analysts compared with traditional wire-tapping of the Public Switched Telephone Network (PSTN) infrastructure, which is not applicable to VoIP. Therefore, other methods of recovering electronic evidence from VoIP are required. This research investigates the analysis and recovery of digitised human speech, which persists in computer memory after a VoIP call. This paper proposes a proof of concept of how remnants of digitised human speech from a VoIP call may be identified within a forensic memory capture, based on how the human voice is detected via a microphone and encoded to a digital format using the sound card of a personal computer. This digital format is unencrypted while processed in Random Access Memory (RAM) before it is passed to the VoIP application for encryption and transmission over the Internet. Similarly, an incoming encrypted VoIP call is decrypted by the VoIP application and passes through RAM unencrypted in order to be played via the speaker output. A series of controlled tests were undertaken in which RAM captures were analysed for remnants of digital speech after a VoIP audio call with a known conversation. The identification and analysis of digital speech from RAM attempt to construct an automatic process for the identification and subsequent reconstruction of the audio content of a VoIP call.

  11. Effect analysis of faults in digital I and C systems of nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Jun

    2014-01-01

    A reliability analysis of digital instrumentation and control (I and C) systems in nuclear power plants has been introduced as one of the important elements of a probabilistic safety assessment because of the unique characteristics of digital I and C systems. Digital I and C systems have various features distinguishable from those of analog I and C systems, such as software and fault-tolerant techniques. In this work, the faults in a digital I and C system were analyzed and a model for representing the effects of the faults was developed. First, the effects of faults in a system were analyzed using fault injection experiments. A software-implemented fault injection technique, in which faults can be injected into the memory, was used based on the assumption that all faults in a system are reflected in faults in the memory. In the experiments, the effect of a fault on the system output was observed. In addition, the success or failure in detecting the fault by the fault-tolerant functions included in the system was identified. Second, a fault tree model representing how a fault propagates to the system output was developed. With the model, it can be identified how a fault propagates to the output and why a fault is not detected by fault-tolerant techniques. Based on the analysis results of the proposed method, it is possible not only to evaluate the system reliability but also to identify weak points of the fault-tolerant techniques by identifying undetected faults. The results can be reflected in the designs to improve the capability of fault-tolerant techniques. (author)

  12. Developing a digital photography-based method for dietary analysis in self-serve dining settings.

    Science.gov (United States)

    Christoph, Mary J; Loman, Brett R; Ellison, Brenna

    2017-07-01

    Current population-based methods for assessing dietary intake, including food frequency questionnaires, food diaries, and 24-h dietary recall, are limited in their ability to objectively measure food intake. Digital photography has been identified as a promising addition to these techniques but has rarely been assessed in self-serve settings. We utilized digital photography to examine university students' food choices and consumption in a self-serve dining hall setting. Research assistants took pre- and post-photos of students' plates during lunch and dinner to assess selection (presence), servings, and consumption of MyPlate food groups. Four coders rated the same set of approximately 180 meals for inter-rater reliability analyses; approximately 50 additional meals were coded twice by each coder to assess intra-rater agreement. Inter-rater agreement on the selection, servings, and consumption of food groups was high at 93.5%; intra-rater agreement was similarly high with an average of 95.6% agreement. Coders achieved the highest rates of agreement in assessing if a food group was present on the plate (95-99% inter-rater agreement, depending on food group) and estimating the servings of food selected (81-98% inter-rater agreement). Estimating consumption, particularly for items such as beans and cheese that were often in mixed dishes, was more challenging (77-94% inter-rater agreement). Results suggest that the digital photography method presented is feasible for large studies in real-world environments and can provide an objective measure of food selection, servings, and consumption with a high degree of agreement between coders; however, to make accurate claims about the state of dietary intake in all-you-can-eat, self-serve settings, researchers will need to account for the possibility of diners taking multiple trips through the serving line. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Design and analysis of planar spiral resonator bandstop filter for microwave frequency

    Science.gov (United States)

    Motakabber, S. M. A.; Shaifudin Suharsono, Muhammad

    2017-11-01

    In the microwave frequency range, a spiral resonator can act as either a frequency-rejecting or a frequency-accepting circuit. A planar logarithmic spiral resonator bandstop filter has been developed based on this property; this project focuses on the rejection property of the spiral resonator. The performance of the filter circuit has been analysed using the scattering-parameter (S-parameter) technique in the ultra-wideband microwave range. The proposed filter was built and simulated, and the S-parameter analysis was carried out using the electromagnetic simulation software CST Microwave Studio. The commercial microwave substrate Taconic TLX-8 was used to build the filter. Experimental results showed that the -10 dB rejection bandwidth of the filter is 2.32 GHz and the centre frequency is 5.72 GHz, which is suitable for ultra-wideband applications. The simulated and experimental results are in good agreement.

  14. Frequency-domain analysis of resonant-type ring magnet power supplies

    International Nuclear Information System (INIS)

    Kim, J.M.S.; Reiniger, K.W.

    1993-01-01

    For fast-cycling synchrotrons, resonant-type ring magnet power supplies are commonly used to provide a dc-biased ac excitation for the ring magnets. Up to the present, this power supply system has been analyzed using a simplified analytical approximation, namely assuming that the resonant frequency of the ring magnet network is fixed and equal to the accelerator frequency. This paper presents a frequency-domain analysis technique for a more accurate analysis of resonant-type ring magnet power supplies. This approach identifies that, as the resonant frequency varies, the operating conditions of the power supply change quite dramatically because of the high Q value of the resonant network. The analytical results are verified using both experimental results and simulation results.

  15. Analysis of Power System Low Frequency Oscillation Based on Energy Shift Theory

    Science.gov (United States)

    Zhang, Junfeng; Zhang, Chunwang; Ma, Daqing

    2018-01-01

    In this paper, a new method for analyzing low-frequency oscillation between areas based on an energy coefficient is proposed. The concept of the energy coefficient is introduced by constructing an energy function, and the low-frequency oscillation is analyzed according to the energy coefficient under the current operating conditions; meanwhile, the concept of model energy is proposed to analyze the energy exchange behavior between two generators. Not only does this method provide an explanation of low-frequency oscillation from the energy point of view, but it also helps further reveal the dynamic behavior of complex power systems. Case analyses of a four-machine two-area system and of the Jilin Power Grid prove the correctness and effectiveness of the proposed method in low-frequency oscillation analysis of power systems.

  16. Fast simulation of wind generation for frequency stability analysis in island power systems

    Energy Technology Data Exchange (ETDEWEB)

    Conroy, James [EirGrid, Dublin (Ireland)

    2010-07-01

    Frequency stability is a major issue for power system planning and operation in an island power system such as Ireland's. As increasing amounts of variable speed wind generation are added to the system, this issue becomes more prominent, as variable speed wind generation does not provide an inherent inertial response. This lack of an inertial response means that simplified models for variable speed wind farms can be used for investigating frequency stability. EirGrid uses DIgSILENT PowerFactory (as well as other software tools) to investigate frequency stability. In PowerFactory, an automation program has been created to convert the detailed wind farm representation (as necessary for other types of analysis) to negative load models for frequency stability analysis. The advantage of this approach is much-improved simulation speed without loss of accuracy. This approach can also be used to study future wind energy targets and for long-term simulation of voltage stability. (orig.)

  17. Validation of virtual instrument for data analysis in metrology of time and frequency

    International Nuclear Information System (INIS)

    Jordao, Bruno; Quaresma, Daniel; Rocha, Pedro; Carvalho, Ricardo; Peixoto, Jose Guilherme

    2016-01-01

    Commercial software (CS) for collecting, analysing, and plotting time and frequency data is increasingly being used in reference laboratories worldwide, and it has greatly improved the uncertainty calculations for these values. We propose a software suite for data collection and analysis using Virtual Instruments (VI), developed at the Primary Time and Frequency Laboratory of the National Observatory - ON, together with the validation of this instrument. To validate the developed instrument, a comparative analysis was made between the results obtained with the VI and the results obtained with CS widely used in many metrology laboratories. From these results we can conclude that the analyzed data were equivalent. (author)

  18. Computational analysis of Pelton bucket tip erosion using digital image processing

    Science.gov (United States)

    Shrestha, Bim Prasad; Gautam, Bijaya; Bajracharya, Tri Ratna

    2008-03-01

    Erosion of hydro turbine components by sand-laden river water is one of the biggest problems in the Himalayas. Even with sediment trapping systems, complete removal of fine sediment from water is impossible and uneconomical; hence most turbine components in Himalayan rivers are exposed to sand-laden water and subject to erosion. Pelton buckets, which are widely used in different hydropower plants, erode in the continuous presence of sand particles in the water. The resulting erosion causes an increase in splitter thickness, which is theoretically supposed to be zero. This increase in splitter thickness gives rise to back-hitting of water, followed by a decrease in turbine efficiency. This paper describes the process of measuring sharp edges such as the bucket tip using digital image processing. An image of each bucket is captured, the bucket is then run for 72 hours with the sand concentration in the water hitting the bucket closely controlled and monitored, and the image of the test bucket is then taken again under the same conditions. The process is repeated 10 times. The digital image processing used here encompasses image enhancement in both the spatial and frequency domains, as well as processes that extract attributes from images, up to and including measurement of the splitter's tip. The image processing was carried out on the MATLAB 6.5 platform. The results show that edge erosion of sharp edges could be accurately detected and quantitatively measured, and that the erosion profile could be generated, using image processing techniques.

  19. Combination of digital signal processing methods towards an improved analysis algorithm for structural health monitoring.

    Science.gov (United States)

    Pentaris, Fragkiskos P.; Makris, John P.

    2013-04-01

    In Structural Health Monitoring (SHM), it is of great importance to extract from the recorded SHM data valuable information that could be used to predict or indicate structural fault or damage in a building. In this work a combination of digital signal processing methods, namely the FFT together with the Wavelet Transform, is applied, along with a proposed algorithm for studying frequency dispersion, in order to depict non-linear characteristics of SHM data collected in two university buildings under natural or anthropogenic excitation. The selected buildings are of great importance from a civil protection point of view, as they are the premises of a public higher education institute and undergo heavy use, stress, and visits from academic staff and students. The SHM data are collected from two neighbouring buildings of different ages (4 and 18 years old, respectively). The proposed digital signal processing methods are applied to the data, and a comparison of the structural behaviour of both buildings in response to seismic activity, weather conditions and man-made activity is presented. Acknowledgments: This work was supported in part by the Archimedes III Program of the Ministry of Education of Greece, through the Operational Program "Educational and Lifelong Learning", in the framework of the project entitled «Interdisciplinary Multi-Scale Research of Earthquake Physics and Seismotectonics at the front of the Hellenic Arc (IMPACT-ARC)» and is co-financed by the European Union (European Social Fund) and Greek National Fund.
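
    A minimal sketch of the two transforms named above applied to one acceleration channel, using PyWavelets for the continuous wavelet transform; the Morlet wavelet, the scale range, and the frequency-dispersion algorithm itself are not taken from the paper:

    ```python
    import numpy as np
    import pywt

    def shm_spectra(acceleration, fs):
        """FFT amplitude spectrum and a continuous wavelet transform (time-frequency map)
        of a single structural acceleration record sampled at fs Hz."""
        accel = np.asarray(acceleration, dtype=float)
        fft_freqs = np.fft.rfftfreq(accel.size, d=1.0 / fs)
        amplitude = np.abs(np.fft.rfft(accel)) / accel.size
        scales = np.arange(1, 128)
        coeffs, cwt_freqs = pywt.cwt(accel, scales, "morl", sampling_period=1.0 / fs)
        return fft_freqs, amplitude, cwt_freqs, np.abs(coeffs)
    ```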

  20. Analysis of fractal dimensions of rat bones from film and digital images

    Science.gov (United States)

    Pornprasertsuk, S.; Ludlow, J. B.; Webber, R. L.; Tyndall, D. A.; Yamauchi, M.

    2001-01-01

    OBJECTIVES: (1) To compare the effect of two different intra-oral image receptors on estimates of fractal dimension; and (2) to determine the variations in fractal dimensions between the femur, tibia and humerus of the rat and between their proximal, middle and distal regions. METHODS: The left femur, tibia and humerus from 24 4-6-month-old Sprague-Dawley rats were radiographed using intra-oral film and a charge-coupled device (CCD). Films were digitized at a pixel density comparable to the CCD using a flat-bed scanner. Square regions of interest were selected from the proximal, middle, and distal regions of each bone. Fractal dimensions were estimated from the slope of regression lines fitted to plots of log power against log spatial frequency. RESULTS: The fractal dimension estimates from digitized films were significantly greater than those produced from the CCD (P=0.0008). The estimated fractal dimensions of the three types of bone were not significantly different (P=0.0544); however, the three regions of the bones were significantly different (P=0.0239). The fractal dimensions estimated from radiographs of the proximal and distal regions of the bones were lower than comparable estimates obtained from the middle region. CONCLUSIONS: Different types of image receptors significantly affect estimates of fractal dimension. There was no difference in the fractal dimensions of the different bones, but the three regions differed significantly.
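
    A sketch of the power-spectrum estimate described in the METHODS: radially average the 2D power spectrum of a square ROI and fit a line to log power versus log spatial frequency. The conversion from the slope to a fractal dimension (for example D = (8 - beta)/2 under one common fractal-surface convention) depends on the convention adopted and is left as a comment:

    ```python
    import numpy as np

    def spectral_slope(roi):
        """Slope of log radially-averaged power vs. log spatial frequency for a square ROI.
        With P(f) ~ f**(-beta), beta = -slope; one common surface convention is D = (8 - beta) / 2."""
        roi = np.asarray(roi, dtype=float)
        roi = roi - roi.mean()
        power = np.abs(np.fft.fftshift(np.fft.fft2(roi))) ** 2
        n = roi.shape[0]
        y, x = np.indices(power.shape)
        radius = np.hypot(x - n // 2, y - n // 2).astype(int)
        radial_power = (np.bincount(radius.ravel(), power.ravel())
                        / np.maximum(np.bincount(radius.ravel()), 1))
        freqs = np.arange(1, n // 2)                      # drop DC; stay below Nyquist
        slope, _intercept = np.polyfit(np.log(freqs), np.log(radial_power[1:n // 2]), 1)
        return slope
    ```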