Sensitivity analysis in multi-parameter probabilistic systems
International Nuclear Information System (INIS)
Walker, J.R.
1987-01-01
Probabilistic methods involving the use of multi-parameter Monte Carlo analysis can be applied to a wide range of engineering systems. The output from the Monte Carlo analysis is a probabilistic estimate of the system consequence, which can vary spatially and temporally. Sensitivity analysis aims to examine how the output consequence is influenced by the input parameter values. Sensitivity analysis provides the necessary information so that the engineering properties of the system can be optimized. This report details a package of sensitivity analysis techniques that together form an integrated methodology for the sensitivity analysis of probabilistic systems. The techniques have known confidence limits and can be applied to a wide range of engineering problems. The sensitivity analysis methodology is illustrated by performing the sensitivity analysis of the MCROC rock microcracking model.
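The core of such a Monte Carlo sensitivity package can be sketched as a sampling-plus-correlation loop: sample the inputs, run the model, and rank each input by its correlation with the output. The toy consequence model, its coefficients, and the sample size below are illustrative assumptions, not the MCROC model itself:

```python
import random

def model(x1, x2, x3):
    # Hypothetical consequence model: the output is dominated by x1.
    return 3.0 * x1 + 0.5 * x2 + 0.1 * x3

def mc_sensitivity(n=2000, seed=1):
    """Sample inputs uniformly, run the model, and score each input by the
    Pearson correlation between its samples and the output consequence."""
    rng = random.Random(seed)
    xs = [[rng.uniform(0.0, 1.0) for _ in range(3)] for _ in range(n)]
    ys = [model(*x) for x in xs]

    def pearson(a, b):
        ma = sum(a) / len(a)
        mb = sum(b) / len(b)
        cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
        sa = sum((u - ma) ** 2 for u in a) ** 0.5
        sb = sum((v - mb) ** 2 for v in b) ** 0.5
        return cov / (sa * sb)

    return [pearson([x[i] for x in xs], ys) for i in range(3)]

sens = mc_sensitivity()  # one score per input parameter
```

With enough samples the scores reproduce the input ranking (x1 strongest), which is the information the report's optimization step consumes; confidence limits on such correlation estimates shrink roughly as 1/sqrt(n).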
Stability Analysis for Multi-Parameter Linear Periodic Systems
DEFF Research Database (Denmark)
Seyranian, A.P.; Solem, Frederik; Pedersen, Pauli
1999-01-01
This paper is devoted to stability analysis of general linear periodic systems depending on real parameters. The Floquet method and perturbation technique are the basis of the development. We start out with the first and higher-order derivatives of the Floquet matrix with respect to problem...
Multi-Parameter Analysis of Surface Finish in Electro-Discharge Machining of Tool Steels
Directory of Open Access Journals (Sweden)
Cornelia Victoria Anghel
2006-10-01
Full Text Available The paper presents a multi-parameter analysis of the surface finish imparted to tool-steel plates by electro-discharge machining (EDM). The interrelationship between surface texture parameters and process parameters is emphasized. An increased number of parameters is studied, including amplitude, spacing, hybrid and fractal parameters. The correlation of these parameters with the machining conditions is investigated. The observed characteristics become more pronounced when the machining conditions are intensified. Close correlation exists between certain surface finish parameters and EDM input variables, and single and multiple statistical regression models are developed.
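A multiple statistical regression model of the kind mentioned can be sketched with ordinary least squares. The EDM predictors (discharge current, pulse on-time), their ranges, and the synthetic roughness data below are hypothetical stand-ins for the paper's measurements:

```python
import random

def fit_linear(X, y):
    """Ordinary least squares via the normal equations (X^T X) b = X^T y,
    solved with Gaussian elimination. Suitable for a few predictors."""
    m, k = len(X), len(X[0])
    A = [[sum(X[r][i] * X[r][j] for r in range(m)) for j in range(k)]
         for i in range(k)]
    b = [sum(X[r][i] * y[r] for r in range(m)) for i in range(k)]
    for col in range(k):                      # elimination with partial pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * k
    for r in range(k - 1, -1, -1):            # back substitution
        coef[r] = (b[r] - sum(A[r][c] * coef[c] for c in range(r + 1, k))) / A[r][r]
    return coef

# Hypothetical EDM data: roughness Ra grows with discharge current and pulse on-time.
rng = random.Random(0)
rows, ra = [], []
for _ in range(50):
    current = rng.uniform(1.0, 10.0)          # A
    pulse = rng.uniform(10.0, 100.0)          # microseconds
    rows.append([1.0, current, pulse])        # intercept + two predictors
    ra.append(0.5 + 0.8 * current + 0.02 * pulse + rng.gauss(0.0, 0.05))

coef = fit_linear(rows, ra)                   # [intercept, b_current, b_pulse]
```

The fitted coefficients recover the assumed trend, which is the sense in which intensified machining conditions make the surface characteristics "more pronounced".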
Flow-cytometric identification of vinegars using a multi-parameter analysis optical detection module
Verschooten, T.; Ottevaere, H.; Vervaeke, M.; Van Erps, J.; Callewaert, M.; De Malsche, W.; Thienpont, H.
2015-09-01
We show a proof-of-concept demonstration of a multi-parameter analysis low-cost optical detection system for the flow-cytometric identification of vinegars. This multi-parameter analysis system can simultaneously measure laser-induced fluorescence, absorption and scattering excited by two time-multiplexed lasers of different wavelengths. To our knowledge no other polymer optofluidic chip based system offers more simultaneous measurements. The design of the optofluidic channels is aimed at countering the effects that viscous fingering, air bubbles, and emulsion samples can have on the correct operation of such a detection system. Unpredictable variations in viscosity and refractive index of the channel content can be turned into a source of information. The sample is excited by two laser diodes that are driven by custom-made low-cost laser drivers. The optofluidic chip is built to be robust and easy to handle and is reproducible using hot embossing. We show a custom optomechanical holder for the optofluidic chip that ensures correct alignment and automatic connection to the external fluidic system. We show an experiment in which 92 samples of vinegar are measured. We are able to identify 9 different kinds of vinegar with an accuracy of 94%. Thus we show an alternative approach to the classic optical spectroscopy solution at a lowered cost. Furthermore, we have shown the possibility of predicting the viscosity and turbidity of vinegars with a goodness-of-fit R² over 0.947.
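The identification step can be illustrated with a minimal nearest-centroid classifier on the three measured channels (fluorescence, absorption, scattering). The vinegar classes, prototype signatures, and noise level are invented for this sketch; the paper's actual classifier is not specified here:

```python
import random

def nearest_centroid(train, labels, sample):
    """Classify a measurement vector by the closest class centroid
    (Euclidean distance) computed from labelled training vectors."""
    groups = {}
    for x, lab in zip(train, labels):
        groups.setdefault(lab, []).append(x)
    best, best_d = None, float("inf")
    for lab, pts in groups.items():
        centroid = [sum(col) / len(pts) for col in zip(*pts)]
        d = sum((a - b) ** 2 for a, b in zip(sample, centroid))
        if d < best_d:
            best, best_d = lab, d
    return best

# Hypothetical vinegar classes with distinct (fluorescence, absorption, scattering)
rng = random.Random(2)
proto = {"balsamic": (0.9, 0.8, 0.3), "cider": (0.4, 0.3, 0.6), "white": (0.1, 0.1, 0.9)}
train, labels = [], []
for lab, mu in proto.items():
    for _ in range(20):                       # 20 noisy replicates per class
        train.append([m + rng.gauss(0.0, 0.05) for m in mu])
        labels.append(lab)

pred = nearest_centroid(train, labels, [0.88, 0.79, 0.32])
```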
International Nuclear Information System (INIS)
Nam Gung, Chan; Lee, Yoon Sang; Hwang, Seong Sik; Kim, Hong Pyo
2004-01-01
Eddy current testing (ECT) is a nondestructive technique used to evaluate material integrity, especially of steam generator (SG) tubing in nuclear plants, owing to its rapid inspection and safe, easy operation. For depth measurement of defects, we prepared electro-discharge machined (EDM) notches containing several defects and applied the multi-parameter (MP) algorithm, a crack-shape estimation program developed at Argonne National Laboratory (ANL). To evaluate the MP algorithm, we compared the estimated defect profiles with fractography of the defects. In the following sections, we describe the basic structure of a computer-aided data analysis algorithm used as a means of more accurate and efficient processing of ECT data, and explain the specification of a standard calibration. Finally, we discuss the accuracy of the estimated depth profiles compared with the conventional ECT method.
Multi-parameter Analysis and Inversion for Anisotropic Media Using the Scattering Integral Method
Djebbi, Ramzi
2017-01-01
the model. I study the prospect of applying a scattering integral approach for multi-parameter inversion for a transversely isotropic model with a vertical axis of symmetry. I mainly analyze the sensitivity kernels to understand the sensitivity of seismic
Real-time multi-parameter cell-based analysis platform
DEFF Research Database (Denmark)
Caviglia, Claudia
biomedical diagnostic techniques, drug discovery and screening. My project focused on the further development, improvement and exploration of the EXCELL microfluidic platform with particular interest in drug kinetic monitoring and neurotransmitter detection. The aim was to perform multi-parameter real...... of protocols and procedures for performing different cellular assays. (2) Electrochemical impedance spectroscopy (EIS) applied for drug screening and drug delivery in cancer research and wound healing studies. (3) Amperometry for monitoring of neurotransmitter exocytosis, relevant in research on Parkinson...
Multi-parameter Analysis and Inversion for Anisotropic Media Using the Scattering Integral Method
Djebbi, Ramzi
2017-10-24
The main goal in seismic exploration is to identify locations of hydrocarbon reservoirs and give insights on where to drill new wells. Therefore, estimating an Earth model that represents the right physics of the Earth's subsurface is crucial in identifying these targets. Recent seismic data, with long offsets and wide azimuth features, are more sensitive to anisotropy. Accordingly, multiple anisotropic parameters need to be extracted from the data recorded on the surface to properly describe the model. I study the prospect of applying a scattering integral approach for multi-parameter inversion for a transversely isotropic model with a vertical axis of symmetry. I mainly analyze the sensitivity kernels to understand the sensitivity of seismic data to anisotropy parameters. Then, I use a frequency-domain scattering integral approach to invert for the optimal parameterization. The scattering integral approach is based on the explicit computation of the sensitivity kernels. I present a new method to compute the traveltime sensitivity kernels for wave equation tomography using the unwrapped phase. I show that the new kernels are a better alternative to conventional cross-correlation/Rytov kernels. I also derive and analyze the sensitivity kernels for a transversely isotropic model with a vertical axis of symmetry. The kernels' structure, for various opening/scattering angles, highlights the trade-off regions between the parameters. For surface-recorded data, I show that the normal move-out velocity vn, η and δ parameterization is suitable for a simultaneous inversion of diving waves and reflections. Moreover, when seismic data are inverted hierarchically, the horizontal velocity vh, η and ϵ parameterization has the least trade-off. In the frequency domain, the hierarchical inversion approach is naturally implemented using frequency continuation, which makes the vh, η and ϵ parameterization attractive. I formulate the multi-parameter inversion using the
Multi-parameters sensitivity analysis of natural vibration modal for steel arch bridge
Directory of Open Access Journals (Sweden)
WANG Ying
2014-02-01
Full Text Available Because of vehicle loads and environmental factors, the behavior of bridge structures in service deteriorates. Modal parameters are important indexes of a structure, so sensitivity analysis of natural vibration is an important way to evaluate the behavior of a bridge structure. In this paper, a calculation model of a steel arch bridge was built using the finite element software Ansys, and the natural vibration modes were obtained. In order to compare the sensitivities of the material parameters that may affect the natural vibration modes, 5 factors were chosen for the calculation. The results indicated that the 5 factors had different sensitivities. The leading factor was the elastic modulus of the arch rib, while the elastic modulus of the suspenders had little effect on the sensitivity. Another finding was that the elastic modulus and the density of a material had opposite sensitivity effects.
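The sensitivity calculation described (perturb one material parameter, observe the shift in a natural frequency) can be sketched on a 2-DOF spring-mass stand-in for the FE model. The stiffness and mass values are illustrative, and the opposite signs of the stiffness and mass sensitivities mirror the abstract's observation about elastic modulus versus density:

```python
import math

def natural_freqs(k1, k2, m1, m2):
    """Natural frequencies (rad/s) of a 2-DOF chain: ground--k1--m1--k2--m2.
    Solves the 2x2 generalized eigenproblem analytically via trace/determinant."""
    # M^-1 K = [[(k1+k2)/m1, -k2/m1], [-k2/m2, k2/m2]]
    a, b = (k1 + k2) / m1, -k2 / m1
    c, d = -k2 / m2, k2 / m2
    tr, det = a + d, a * d - b * c
    disc = math.sqrt(tr * tr - 4 * det)
    lam = sorted([(tr - disc) / 2, (tr + disc) / 2])
    return [math.sqrt(v) for v in lam]

def sensitivity(param, base=dict(k1=1e6, k2=5e5, m1=100.0, m2=50.0), h=1e-4):
    """Relative finite-difference sensitivity d(ln f1)/d(ln p) of the first
    natural frequency to one model parameter."""
    p = dict(base)
    f0 = natural_freqs(**p)[0]
    p[param] *= (1 + h)
    f1 = natural_freqs(**p)[0]
    return (f1 - f0) / (f0 * h)

sens = {p: sensitivity(p) for p in ("k1", "k2", "m1", "m2")}
```

Because frequency scales as sqrt(stiffness/mass), the stiffness log-sensitivities sum to +0.5 and the mass log-sensitivities to -0.5, a useful sanity check on any such FE sensitivity study.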
International Nuclear Information System (INIS)
Bortolini, Marco; Gamberi, Mauro; Graziani, Alessandro; Mora, Cristina; Regattieri, Alberto
2013-01-01
Highlights: • Performance cost model assesses the feasibility and profitability of PV systems. • Multi-country and multi-parameter analysis of PV systems in the European Union area. • The impact of key technical, environmental, economic and financial data is assessed. • Germany, Italy and Spain present the most effective PV sector support strategies. • The rated power and energy self-consumption ratio affect PV plant profitability. - Abstract: In the last decades, attention to solar energy as a renewable and nonpolluting energy source has increased considerably among scientists and private and public institutions. Several efforts have been made to increase the diffusion of this source and to create the conditions making it competitive in the energy market. Particularly, for the photovoltaic (PV) sector, the increase in module efficiency, the reduction in manufacturing cost and strong public support, through favorable incentive schemes, generated a significant rise in the installed power, exceeding 40 GWp in 2010. Although the global trend of the PV sector is positive, differences among countries arise out of local peculiarities and evolutions in the national support policies. This paper investigates such issues focusing on the feasibility analysis of PV solar systems for eight relevant countries in the European Union area, i.e. France, Germany, Greece, Italy, Spain, The Netherlands, Turkey and United Kingdom. A multi-country and multi-parameter comparative analysis, based on the net present value and payback capital budget indices, highlights the conditions most affecting the economic feasibility of PV systems. The national support strategies, along with the most relevant technical, environmental, economic and financial parameters, are the key features included and compared in the analysis. The major results deal with the conditions which make PV systems potentially profitable for each country and installation feature. The national support strategies to the PV
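The two capital-budget indices used in the comparison, net present value and payback, reduce to a few lines. The plant cost, yield, tariff and discount rate below are hypothetical placeholders, not the paper's country data:

```python
def npv(rate, cashflows):
    """Net present value of yearly cash flows (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def payback_year(cashflows):
    """First year in which the cumulative (undiscounted) cash flow turns non-negative."""
    total = 0.0
    for t, cf in enumerate(cashflows):
        total += cf
        if total >= 0:
            return t
    return None  # never recovered within the horizon

# Hypothetical 1 kWp plant: 1500 EUR investment, 1100 kWh/y yield,
# 0.20 EUR/kWh combined incentive + self-consumption value, 20-year life.
flows = [-1500.0] + [1100 * 0.20] * 20
project_npv = npv(0.05, flows)        # discounted at 5%
project_payback = payback_year(flows)
```

A positive NPV and a payback well inside the plant lifetime are exactly the "potentially profitable" conditions the multi-country comparison screens for.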
Continuous multi-parameter heart rate variability analysis heralds onset of sepsis in adults.
Directory of Open Access Journals (Sweden)
Saif Ahmad
Full Text Available BACKGROUND: Early diagnosis of sepsis enables timely resuscitation and antibiotics and prevents subsequent morbidity and mortality. Clinical approaches relying on point-in-time analysis of vital signs or lab values are often insensitive, non-specific and late diagnostic markers of sepsis. Exploring otherwise hidden information within intervals-in-time, heart rate variability (HRV) has been documented to be both altered in the presence of sepsis and correlated with its severity. We hypothesized that by continuously tracking individual patient HRV over time in patients as they develop sepsis, we would demonstrate reduced HRV in association with the onset of sepsis. METHODOLOGY/PRINCIPAL FINDINGS: We monitored heart rate continuously in adult bone marrow transplant (BMT) patients (n = 21), beginning a day before their BMT and continuing until recovery or withdrawal (12+/-4 days). We characterized HRV continuously over time with a panel of time, frequency, complexity, and scale-invariant domain techniques. We defined baseline HRV as the mean variability for the first 24 h of monitoring and studied individual and population average percentage change (from baseline) over time in diverse HRV metrics, in comparison with the time of clinical diagnosis and treatment of sepsis (defined as systemic inflammatory response syndrome along with clinically suspected infection requiring treatment). Of the 21 patients enrolled, 4 patients withdrew, leaving 17 patients who completed the study. Fourteen patients developed sepsis requiring antibiotic therapy, whereas 3 did not. On average, for 12 out of 14 infected patients, a significant (25%) reduction prior to the clinical diagnosis and treatment of sepsis was observed in standard deviation, root mean square successive difference, sample and multiscale entropy, fast Fourier transform, detrended fluctuation analysis, and wavelet variability metrics. For infected patients (n = 14), wavelet HRV demonstrated a 25% drop from
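Tracking a time-domain HRV metric as a percentage change from a 24 h baseline can be sketched as follows. The RR-interval series are synthetic, and the 80% drop is a constructed illustration, deliberately larger than the ~25% reduction reported:

```python
import math

def sdnn(rr):
    """Standard deviation of RR intervals (a basic time-domain HRV metric)."""
    m = sum(rr) / len(rr)
    return math.sqrt(sum((x - m) ** 2 for x in rr) / (len(rr) - 1))

def rmssd(rr):
    """Root mean square of successive RR-interval differences."""
    diffs = [b - a for a, b in zip(rr, rr[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def pct_change_from_baseline(metric, baseline_rr, current_rr):
    """Percentage change of a metric relative to its baseline value."""
    base = metric(baseline_rr)
    return 100.0 * (metric(current_rr) - base) / base

# Hypothetical RR-interval series (ms): healthy baseline vs reduced variability.
baseline = [800 + 40 * math.sin(i / 3.0) for i in range(300)]
reduced = [800 + 8 * math.sin(i / 3.0) for i in range(300)]   # 5x smaller swings

drop_sdnn = pct_change_from_baseline(sdnn, baseline, reduced)
drop_rmssd = pct_change_from_baseline(rmssd, baseline, reduced)
```

Both metrics fall by the same 80% here because the variability was scaled uniformly; in real data the panel of metrics (entropy, DFA, wavelet) can diverge, which is why the study tracks several at once.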
Flower, Verity J. B.; Carn, Simon A.
2015-10-01
The identification of cyclic volcanic activity can elucidate underlying eruption dynamics and aid volcanic hazard mitigation. Whilst satellite datasets are often analysed individually, here we exploit the multi-platform NASA A-Train satellite constellation to cross-correlate cyclical signals identified using complementary measurement techniques at Soufriere Hills Volcano (SHV), Montserrat. In this paper we present a Multi-taper (MTM) Fast Fourier Transform (FFT) analysis of coincident SO2 and thermal infrared (TIR) satellite measurements at SHV, facilitating the identification of cyclical volcanic behaviour. These measurements were collected by the Ozone Monitoring Instrument (OMI) and the Moderate Resolution Imaging Spectroradiometer (MODIS), respectively, in the A-Train. We identify a correlating cycle in both the OMI and MODIS data (54-58 days), with this multi-week feature attributable to episodes of dome growth. Cycles of ~50 days were also identified in ground-based SO2 data at SHV, confirming the validity of our analysis and further corroborating the presence of this cycle at the volcano. In addition, a 12-day cycle was identified in the OMI data, previously attributed to variable lava effusion rates on shorter timescales. OMI data also display a one-week (7-8 day) cycle attributable to cyclical variations in viewing angle resulting from the orbital characteristics of the Aura satellite. Longer-period cycles possibly relating to magma intrusion were identified in the OMI record (102, 121 and 159 days), in addition to a 238-day cycle identified in the MODIS data corresponding to periodic destabilisation of the lava dome. Through the analysis of reconstructions generated from cycles identified in the OMI and MODIS data, periods of unrest were identified, including the major dome collapse of 20th May 2006 and the significant explosive event of 3rd January 2009. Our analysis confirms the potential for identification of cyclical volcanic activity through combined
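Cycle identification of this kind amounts to finding the strongest spectral peak in a time series. The paper uses a multi-taper FFT; a plain periodogram via a direct DFT is shown here as a simpler stand-in, and the 56-day synthetic SO2 series is purely illustrative:

```python
import math

def dominant_period(series, dt=1.0):
    """Return the period (in units of dt) of the strongest non-zero-frequency
    component of a real time series, using a direct DFT periodogram."""
    n = len(series)
    mean = sum(series) / n
    x = [v - mean for v in series]            # remove the DC component
    best_k, best_p = 1, -1.0
    for k in range(1, n // 2 + 1):
        re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        power = re * re + im * im
        if power > best_p:
            best_k, best_p = k, power
    return n * dt / best_k

# Hypothetical daily SO2 mass series with an injected 56-day cycle.
days = 448
so2 = [5.0 + 2.0 * math.sin(2 * math.pi * d / 56.0) for d in range(days)]
period = dominant_period(so2)
```

Multi-taper methods trade this simplicity for reduced spectral leakage and variance, which matters when several cycles (7-8, 12, 54-58, 238 days) coexist in noisy satellite retrievals.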
Multi-Parameter Estimation for Orthorhombic Media
Masmoudi, Nabil; Alkhalifah, Tariq Ali
2015-01-01
Building reliable anisotropy models is crucial in seismic modeling, imaging and full waveform inversion. However, estimating anisotropy parameters is often hampered by the trade-off between inhomogeneity and anisotropy. For instance, one way to estimate the anisotropy parameters is to relate them analytically to traveltimes, which is challenging in inhomogeneous media. Using perturbation theory, we develop traveltime approximations for orthorhombic media as explicit functions of the anellipticity parameters η1, η2 and a parameter Δγ in inhomogeneous background media. Specifically, our expansion assumes an inhomogeneous ellipsoidal anisotropic background model, which can be obtained from well information and stacking velocity analysis. This approach has two main advantages: on one hand, it provides a computationally efficient tool to solve the orthorhombic eikonal equation; on the other hand, it provides a mechanism to scan for the best-fitting anisotropy parameters without the need for repetitive modeling of traveltimes, because the coefficients of the traveltime expansion are independent of the perturbed parameters. Furthermore, the coefficients of the traveltime expansion provide insights on the sensitivity of the traveltime with respect to the perturbed parameters. We show the accuracy of the traveltime approximations as well as an approach for multi-parameter scanning in orthorhombic media.
Multi-Parameter Estimation for Orthorhombic Media
Masmoudi, Nabil
2015-08-19
Building reliable anisotropy models is crucial in seismic modeling, imaging and full waveform inversion. However, estimating anisotropy parameters is often hampered by the trade-off between inhomogeneity and anisotropy. For instance, one way to estimate the anisotropy parameters is to relate them analytically to traveltimes, which is challenging in inhomogeneous media. Using perturbation theory, we develop traveltime approximations for orthorhombic media as explicit functions of the anellipticity parameters η1, η2 and a parameter Δγ in inhomogeneous background media. Specifically, our expansion assumes an inhomogeneous ellipsoidal anisotropic background model, which can be obtained from well information and stacking velocity analysis. This approach has two main advantages: on one hand, it provides a computationally efficient tool to solve the orthorhombic eikonal equation; on the other hand, it provides a mechanism to scan for the best-fitting anisotropy parameters without the need for repetitive modeling of traveltimes, because the coefficients of the traveltime expansion are independent of the perturbed parameters. Furthermore, the coefficients of the traveltime expansion provide insights on the sensitivity of the traveltime with respect to the perturbed parameters. We show the accuracy of the traveltime approximations as well as an approach for multi-parameter scanning in orthorhombic media.
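The scanning idea, precompute the expansion coefficients once in the background model and then evaluate many trial parameter values at negligible cost, can be sketched generically. The coefficient values and the single-traveltime misfit below are invented illustrations, not the paper's actual orthorhombic expansion:

```python
def traveltime(t0, c1, c2, eta):
    """Second-order perturbation expansion of traveltime in an anellipticity
    parameter eta. c1 and c2 are computed once in the background model and
    reused for every trial eta (they do not depend on eta)."""
    return t0 + c1 * eta + c2 * eta ** 2

def scan_eta(t_obs, t0, c1, c2, etas):
    """Pick the trial eta whose predicted traveltime best fits the observed
    one; no eikonal re-solve is needed per trial."""
    return min(etas, key=lambda e: abs(traveltime(t0, c1, c2, e) - t_obs))

# Hypothetical coefficients: background time 2.0 s, first/second-order terms.
t0, c1, c2 = 2.0, -0.30, 0.05
t_obs = traveltime(t0, c1, c2, 0.12)          # "data" generated with eta = 0.12
trials = [i / 100.0 for i in range(0, 31)]    # scan eta over [0, 0.30]
best = scan_eta(t_obs, t0, c1, c2, trials)
```

In practice the misfit would sum over many source-receiver pairs, but the computational shape is the same: the expensive modeling happens once, the scan loop is trivial.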
Multi-parameters scanning in HTI media
Masmoudi, Nabil
2014-08-05
Building credible anisotropy models is crucial in imaging. One way to estimate anisotropy parameters is to relate them analytically to traveltime, which is challenging in inhomogeneous media. Using perturbation theory, we develop traveltime approximations for transversely isotropic media with horizontal symmetry axis (HTI) as explicit functions of the anellipticity parameter η and the symmetry axis azimuth ϕ in inhomogeneous background media. Specifically, our expansion assumes an inhomogeneous elliptically anisotropic background medium, which may be obtained from well information and stacking velocity analysis in HTI media. This formulation has advantages on two fronts: on one hand, it alleviates the computational complexity associated with solving the HTI eikonal equation; on the other hand, it provides a mechanism to scan for the best-fitting parameters η and ϕ without the need for repetitive modeling of traveltimes, because the traveltime coefficients of the expansion are independent of the perturbed parameters η and ϕ. The accuracy of our expansion is further enhanced by the use of the Shanks transform. We show the effectiveness of our scheme with tests on a 3D model and we propose an approach for multi-parameter scanning in TI media.
Multi-parameters scanning in HTI media
Masmoudi, Nabil; Alkhalifah, Tariq Ali
2014-01-01
Building credible anisotropy models is crucial in imaging. One way to estimate anisotropy parameters is to relate them analytically to traveltime, which is challenging in inhomogeneous media. Using perturbation theory, we develop traveltime approximations for transversely isotropic media with horizontal symmetry axis (HTI) as explicit functions of the anellipticity parameter η and the symmetry axis azimuth ϕ in inhomogeneous background media. Specifically, our expansion assumes an inhomogeneous elliptically anisotropic background medium, which may be obtained from well information and stacking velocity analysis in HTI media. This formulation has advantages on two fronts: on one hand, it alleviates the computational complexity associated with solving the HTI eikonal equation; on the other hand, it provides a mechanism to scan for the best-fitting parameters η and ϕ without the need for repetitive modeling of traveltimes, because the traveltime coefficients of the expansion are independent of the perturbed parameters η and ϕ. The accuracy of our expansion is further enhanced by the use of the Shanks transform. We show the effectiveness of our scheme with tests on a 3D model and we propose an approach for multi-parameter scanning in TI media.
Multi-parameter CAMAC compatible ADC scanner
Energy Technology Data Exchange (ETDEWEB)
Midttun, G J; Ingebretsen, F [Oslo Univ. (Norway). Fysisk Inst.; Johnsen, P J [Norsk Data A.S., Box 163, Oekern, Oslo 5, Norway
1979-02-15
A fast ADC scanner for multi-parameter nuclear physics experiments is described. The scanner is based on a standard CAMAC crate, and data from several different experiments can be handled simultaneously through a direct memory access (DMA) channel. The implementation on a PDP-7 computer is outlined.
Directory of Open Access Journals (Sweden)
J. Sobek
2017-01-01
Full Text Available A study on the accuracy of an approximation of the stress field in a cracked body is presented. The crack-tip stress tensor is expressed in this work using linear elastic fracture mechanics (LEFM) theory, more precisely via its multi-parameter formulation, i.e. the Williams power series (WPS). Determination of the coefficients of the terms of this series is performed using a least-squares-based regression technique known as the over-deterministic method (ODM), for which results from finite element (FE) computations are usually taken as inputs. The main attention is paid to a detailed analysis of a suitable selection of the FE nodes whose results serve as inputs to the employed method. Two different ways of FE nodal selection are compared – nodes selected from the crack-tip vicinity lying on a ring of a certain radius versus nodes selected more or less uniformly from a specified part of the test specimen body. Comparison of these approaches is made with the help of procedures developed by the authors, which enable both the determination of the coefficients of the terms of the analytical WPS approximation of the stress field based on the FE results and the backward reconstruction of the field (again using the WPS) from the determined coefficients/functions. The wedge-splitting test (WST) specimen with a crack is taken as the example for the study.
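The ODM step is an ordinary over-determined least-squares fit of series coefficients to nodal data. The sketch below uses a 1-D radial stand-in for the Williams series, folding the angular functions into the coefficients, and the "nodal" samples are synthetic; the real method fits both stress components over (r, θ) positions:

```python
def solve(A, b):
    """Small dense linear solver (Gaussian elimination, partial pivoting)."""
    n = len(A)
    A = [row[:] for row in A]
    b = b[:]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(A[r][c]))
        A[c], A[p] = A[p], A[c]
        b[c], b[p] = b[p], b[c]
        for r in range(c + 1, n):
            f = A[r][c] / A[c][c]
            for cc in range(c, n):
                A[r][cc] -= f * A[c][cc]
            b[r] -= f * b[c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (b[r] - sum(A[r][c] * x[c] for c in range(r + 1, n))) / A[r][r]
    return x

def odm_fit(radii, stresses, n_terms=3):
    """Over-deterministic least-squares estimate of series coefficients from
    many sampled 'nodes': sigma(r) ~ sum_n A_n * r**(n/2 - 1)."""
    basis = [[r ** (n / 2.0 - 1.0) for n in range(1, n_terms + 1)] for r in radii]
    AtA = [[sum(row[i] * row[j] for row in basis) for j in range(n_terms)]
           for i in range(n_terms)]
    Atb = [sum(row[i] * s for row, s in zip(basis, stresses)) for i in range(n_terms)]
    return solve(AtA, Atb)

# Hypothetical nodal data on radii around the crack tip (noise-free).
true_A = [2.0, 0.7, 0.3]      # A_1 is K_I-like, A_2 is T-stress-like
radii = [0.01 * i for i in range(1, 101)]
stresses = [sum(a * r ** (n / 2.0 - 1.0) for n, a in enumerate(true_A, 1))
            for r in radii]
coeffs = odm_fit(radii, stresses)
```

With noise-free data the coefficients are recovered essentially exactly; the paper's question, which nodes to sample (a ring near the tip versus a spread over the body), governs the conditioning of the AtA matrix when FE discretisation error is present.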
A nuclear radiation multi-parameter measurement system based on pulse-shape sampling
International Nuclear Information System (INIS)
Qiu Xiaolin; Fang Guoming; Xu Peng; Di Yuming
2007-01-01
In this paper, a nuclear radiation multi-parameter measurement system based on pulse-shape sampling is introduced, including the system's characteristics, composition, operating principle, experimental data and analysis. Compared with conventional nuclear measuring apparatus, it has remarkable advantages such as synchronous detection using multi-parameter measurement on the same measurement platform and general analysis of signal data by user-defined programs. (authors)
A multi-parameter acquisition system positron annihilation lifetime spectrometer
International Nuclear Information System (INIS)
Sharshar, T.
2004-01-01
A positron annihilation lifetime spectrometer employing a multi-parameter acquisition system has been prepared for various purposes such as the investigation and characterization of solid-state materials. The fast-fast coincidence technique was used in the present spectrometer with a pair of plastic scintillation detectors. The acquisition system is based on the Kmax software and on CAMAC modules. The data are acquired in event-by-event list mode. The time spectrum for the desired energy windows can be obtained by off-line data sorting and analysis. The spectrometer for event-by-event data acquisition is an important step toward constructing a positron age-momentum correlation (AMOC) spectrometer. The AMOC technique is especially suited for the observation of positron transitions between different states during their lifetime. The system performance was tested and the results are presented and discussed.
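Off-line sorting of event-by-event list-mode data into a lifetime spectrum for chosen energy windows can be sketched as a filtered histogram. The event tuples, window limits, and exponential lifetime below are invented for illustration, not the spectrometer's actual calibration:

```python
import random

def sort_time_spectrum(events, win1, win2, nbins=50, t_range=(0.0, 5.0)):
    """Off-line sort of list-mode coincidence events: keep events whose two
    detector energies fall inside the chosen windows, then histogram the
    start-stop time differences into a lifetime spectrum."""
    lo = t_range[0]
    width = (t_range[1] - t_range[0]) / nbins
    hist = [0] * nbins
    for e1, e2, dt in events:
        if win1[0] <= e1 <= win1[1] and win2[0] <= e2 <= win2[1]:
            b = int((dt - lo) / width)
            if 0 <= b < nbins:
                hist[b] += 1
    return hist

# Hypothetical events: (start-detector energy keV, stop-detector energy keV, delay ns)
rng = random.Random(7)
events = [(rng.uniform(1100, 1400), rng.uniform(400, 600),
           rng.expovariate(1 / 0.4)) for _ in range(10000)]
spectrum = sort_time_spectrum(events, win1=(1150, 1350), win2=(450, 550))
```

Because the raw events are kept, the same list can later be re-sorted with different energy windows, which is exactly what makes list-mode acquisition the natural stepping stone to AMOC (energy-resolved lifetime) analysis.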
Multi-Parameter Measurement in Unseeded Flows using Femtosecond Lasers
National Aeronautics and Space Administration — Our approach is to use new turn-key femtosecond laser technology along with new high-speed CMOS camera technology to build a multi-parameter measurement system based...
Evaluation for Bearing Wear States Based on Online Oil Multi-Parameters Monitoring
Directory of Open Access Journals (Sweden)
Si-Yuan Wang
2018-04-01
Full Text Available As bearings are critical components of a mechanical system, it is important to characterize their wear states and evaluate health conditions. In this paper, a novel approach for analyzing the relationship between online oil multi-parameter monitoring samples and bearing wear states is proposed, based on an improved gray k-means clustering model (G-KCM). First, an online monitoring system with multiple sensors for bearings is established, obtaining oil multi-parameter data and vibration signals for bearings through the whole lifetime. Secondly, a gray correlation degree distance matrix is generated using a gray correlation model (GCM) to express the relationship of oil monitoring samples at different times, and a KCM is then applied to cluster the matrix. Analysis and experimental results show a clear correspondence: state changes in the lubricant's multiple parameters coincide closely in time with changes in the bearings' wear states. They also show that online oil samples with multiple parameters have an early wear-failure prediction ability for bearings superior to that of vibration signals. The approach is expected to realize online oil monitoring and evaluation of bearing health condition and to provide a novel means of early identification of bearing-related failure modes.
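The G-KCM pipeline, gray relational grades against a reference oil signature followed by k-means clustering of those grades, can be sketched as below. The oil-parameter sequences, the "healthy" reference, and the two-cluster split are hypothetical stand-ins for the paper's monitored data:

```python
def gray_relational_grades(reference, samples, rho=0.5):
    """Deng's gray relational grade of each sample sequence against a
    reference sequence (sequences assumed pre-normalised to comparable scale)."""
    deltas = [[abs(r - s) for r, s in zip(reference, smp)] for smp in samples]
    dmin = min(min(row) for row in deltas)
    dmax = max(max(row) for row in deltas)
    grades = []
    for row in deltas:
        coeffs = [(dmin + rho * dmax) / (d + rho * dmax) for d in row]
        grades.append(sum(coeffs) / len(coeffs))
    return grades

def kmeans_1d(values, k=2, iters=20):
    """Plain 1-D k-means used to split samples into wear-state clusters."""
    cents = [min(values), max(values)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in values:
            groups[min(range(k), key=lambda i: abs(v - cents[i]))].append(v)
        cents = [sum(g) / len(g) if g else cents[i] for i, g in enumerate(groups)]
    return [min(range(k), key=lambda i: abs(v - cents[i])) for v in values]

# Hypothetical normalised oil-parameter samples (viscosity, Fe content, acidity):
reference = [1.0, 1.0, 1.0]                    # "healthy" oil signature
samples = [[0.98, 0.96, 1.01],                 # normal wear
           [0.97, 1.02, 0.99],                 # normal wear
           [0.60, 1.80, 1.55],                 # severe wear
           [0.55, 1.90, 1.60]]                 # severe wear
labels = kmeans_1d(gray_relational_grades(reference, samples))
```

Samples close to the reference get high relational grades and cluster together; a drop in grade over time is the wear-state transition the monitoring system watches for.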
Evaluation for Bearing Wear States Based on Online Oil Multi-Parameters Monitoring
Hu, Hai-Feng
2018-01-01
As bearings are critical components of a mechanical system, it is important to characterize their wear states and evaluate health conditions. In this paper, a novel approach for analyzing the relationship between online oil multi-parameter monitoring samples and bearing wear states is proposed, based on an improved gray k-means clustering model (G-KCM). First, an online monitoring system with multiple sensors for bearings is established, obtaining oil multi-parameter data and vibration signals for bearings through the whole lifetime. Secondly, a gray correlation degree distance matrix is generated using a gray correlation model (GCM) to express the relationship of oil monitoring samples at different times, and a KCM is then applied to cluster the matrix. Analysis and experimental results show a clear correspondence: state changes in the lubricant's multiple parameters coincide closely in time with changes in the bearings' wear states. They also show that online oil samples with multiple parameters have an early wear-failure prediction ability for bearings superior to that of vibration signals. The approach is expected to realize online oil monitoring and evaluation of bearing health condition and to provide a novel means of early identification of bearing-related failure modes. PMID:29621175
Crack propagation direction in a mixed mode geometry estimated via multi-parameter fracture criteria
Czech Academy of Sciences Publication Activity Database
Malíková, L.; Veselý, V.; Seitl, Stanislav
2016-01-01
Roč. 89, AUG (2016), s. 99-107 ISSN 0142-1123. [International Conference on Characterisation of Crack Tip Fields /3./. Urbino, 20.04.2015-22.04.2015] Institutional support: RVO:68081723 Keywords : Near-crack-tip fields * Mixed mode * Crack propagation direction * Multi-parameter fracture criteria * Finite element analysis Subject RIV: JL - Materials Fatigue, Friction Mechanics Impact factor: 2.899, year: 2016
Multi-parameter sensor based on random fiber lasers
Directory of Open Access Journals (Sweden)
Yanping Xu
2016-09-01
Full Text Available We demonstrate a concept of utilizing random fiber lasers to achieve multi-parameter sensing. The proposed random fiber ring laser consists of an erbium-doped fiber as the gain medium and a random fiber grating as the feedback. The random feedback is effectively realized by a large number of reflections from around 50,000 femtosecond-laser-induced refractive index modulation regions over a 10 cm standard single-mode fiber. Numerous polarization-dependent spectral filters are formed and superimposed to provide multiple lasing lines with a high signal-to-noise ratio of up to 40 dB, which provides access to a high-fidelity multi-parameter sensing scheme. The number of sensing parameters can be controlled by the number of lasing lines via the input polarization, and the wavelength shifts of each peak can be exploited for simultaneous multi-parameter sensing with one sensing probe. In addition, the random-grating-induced coupling between core and cladding modes can potentially be used for liquid medical sample sensing in medical diagnostics, biology and remote sensing in hostile environments.
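Simultaneous multi-parameter sensing from the wavelength shifts of several lasing lines reduces to inverting a sensitivity matrix. The sensitivities and the temperature/strain example below are invented for the sketch; real coefficients would come from calibrating each line:

```python
def solve2(K, d):
    """Invert a 2x2 sensitivity matrix K in d = K @ x, returning x."""
    (a, b), (c, e) = K
    det = a * e - b * c
    return [(e * d[0] - b * d[1]) / det, (a * d[1] - c * d[0]) / det]

# Hypothetical sensitivities (pm per unit) of two lasing lines:
#   shift_i = K[i][0] * dT (deg C) + K[i][1] * d_strain (microstrain)
K = [[10.0, 1.2],
     [6.5, 2.0]]
shifts = [10.0 * 3.0 + 1.2 * 50.0,     # line 1 shift for dT = 3 C, 50 microstrain
          6.5 * 3.0 + 2.0 * 50.0]      # line 2 shift for the same state
dT, dstrain = solve2(K, shifts)
```

Discriminating N parameters needs N lines with linearly independent sensitivity rows (a well-conditioned K), which is why controlling the number of lasing lines via the input polarization matters for this scheme.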
Multi-parameter full waveform inversion using Poisson
Oh, Juwon
2016-07-21
In multi-parameter full waveform inversion (FWI), the success of recovering each parameter depends on the characteristics of the partial derivative wavefields (or virtual sources), which differ according to the parameterisation. Elastic FWIs based on the two conventional parameterisations (one uses the Lamé constants and density; the other employs P- and S-wave velocities and density) have low resolution of gradients for P-wave velocities (or λ). Limitations occur because the virtual sources for P-wave velocity or λ (one of the Lamé constants) are related only to P-P diffracted waves, and generate isotropic explosions, which reduce the spatial resolution of the FWI for these parameters. To increase the spatial resolution, we propose a new parameterisation using P-wave velocity, Poisson's ratio, and density for frequency-domain multi-parameter FWI for isotropic elastic media. By introducing Poisson's ratio instead of S-wave velocity, the virtual source for the P-wave velocity generates P-S and S-S diffracted waves as well as P-P diffracted waves in the partial derivative wavefields for the P-wave velocity. Numerical examples of the cross-triangle-square (CTS) model indicate that the new parameterisation provides highly resolved descent directions for the P-wave velocity. Numerical examples of noise-free and noisy data synthesised for the elastic Marmousi-II model support the fact that the new parameterisation is more robust to noise than the two conventional parameterisations.
Multi-parameter optimization design of parabolic trough solar receiver
International Nuclear Information System (INIS)
Guo, Jiangfeng; Huai, Xiulan
2016-01-01
Highlights: • The optimal condition can be obtained by multi-parameter optimization. • Exergy and thermal efficiencies are employed as objective function. • Exergy efficiency increases at the expense of heat losses. • The heat obtained by working fluid increases as thermal efficiency grows. - Abstract: The design parameters of a parabolic trough solar receiver are interrelated and interact with one another, so the optimal performance of the solar receiver cannot be obtained by conventional single-parameter optimization. To overcome the shortcoming of single-parameter optimization, a multi-parameter optimization of the parabolic trough solar receiver is employed based on a genetic algorithm in the present work. When the thermal efficiency is taken as the objective function, the heat obtained by the working fluid increases while the average temperature of the working fluid and the wall temperatures of the solar receiver decrease. The average temperature of the working fluid and the wall temperatures of the solar receiver increase while the heat obtained by the working fluid generally decreases when the exergy efficiency is taken as the objective function. Assuming that the solar radiation intensity remains constant, the exergy obtained by the working fluid increases by taking exergy efficiency as the objective function, which comes at the expense of heat losses of the solar receiver.
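A genetic-algorithm optimization of this kind can be sketched with a minimal real-coded GA. The two design parameters, the smooth efficiency surface, and its optimum below are hypothetical stand-ins for the receiver model, not the paper's thermal/exergy objective:

```python
import random

def thermal_eff(d, flow):
    """Hypothetical smooth efficiency surface with a single optimum
    at tube diameter d = 0.07 m and mass flow = 2.0 kg/s."""
    return 0.75 - 40.0 * (d - 0.07) ** 2 - 0.05 * (flow - 2.0) ** 2

def genetic_optimize(fitness, bounds, pop=40, gens=60, seed=3):
    """Minimal real-coded GA: tournament selection, blend crossover,
    Gaussian mutation, and elitism (the two best survive each generation)."""
    rng = random.Random(seed)
    P = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop)]
    for _ in range(gens):
        scored = sorted(P, key=lambda ind: -fitness(*ind))
        nxt = scored[:2]                                  # elitism
        while len(nxt) < pop:
            a, b = (max(rng.sample(scored, 3), key=lambda i: fitness(*i))
                    for _ in range(2))                    # tournament parents
            child = [(x + y) / 2 + rng.gauss(0, 0.01 * (hi - lo))
                     for x, y, (lo, hi) in zip(a, b, bounds)]
            child = [min(max(v, lo), hi) for v, (lo, hi) in zip(child, bounds)]
            nxt.append(child)
        P = nxt
    return max(P, key=lambda ind: fitness(*ind))

best = genetic_optimize(thermal_eff, bounds=[(0.03, 0.12), (0.5, 5.0)])
```

Swapping the fitness function for an exergy-efficiency model is all it takes to reproduce the paper's second optimization run, which is the practical appeal of GA-based multi-parameter design.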
Czech Academy of Sciences Publication Activity Database
Malíková, L.; Veselý, V.; Seitl, Stanislav
2015-01-01
Vol. 9, No. 33 (2015), p. 25-32 ISSN 1971-8993 Institutional support: RVO:68081723 Keywords: Near-crack-tip fields * Williams expansion * Crack propagation direction * Multi-parameter fracture criteria * Finite element analysis Subject RIV: JL - Materials Fatigue, Friction Mechanics
Monitoring the Dead Sea Region by Multi-Parameter Stations
Mohsen, A.; Weber, M. H.; Kottmeier, C.; Asch, G.
2015-12-01
The Dead Sea Region is an exceptional ecosystem whose seismic activity has influenced all facets of its development, from ground water availability to human evolution. Israelis, Palestinians and Jordanians living in the Dead Sea region are exposed to severe earthquake hazard. Repeated large earthquakes (e.g. 1927, magnitude 6.0; Ambraseys, 2009) shook the whole Dead Sea region, proving that earthquake hazard knows no borders and damaging seismic events can strike anytime. Combined with the high vulnerability of cities in the region and the enormous concentration of historical assets, this natural hazard results in an extreme earthquake risk. Thus, an integration of earthquake parameters at all scales (size and time) and their combination with data on infrastructure are needed, with the specific aim of providing a state-of-the-art seismic hazard assessment for the Dead Sea region as well as a first quantitative estimate of vulnerability and risk. A strong motivation for our research is the lack of reliable multi-parameter ground-based geophysical information on earthquakes in the Dead Sea region. The proposed set-up of a number of observatories with on-line data access will make it possible to derive the present-day seismicity and deformation pattern in the Dead Sea region. The first multi-parameter stations were installed in Jordan, Israel and Palestine for long-term monitoring. All partners will jointly use these locations. All stations will have an open data policy, with the Deutsches GeoForschungsZentrum (GFZ, Potsdam, Germany) providing the hardware and software for real-time data transmission via satellite to Germany, where all partners can access the data via standard data protocols.
Directory of Open Access Journals (Sweden)
Lifeng Wang
2018-01-01
Full Text Available A repeater coil is used to extend the detection distance of a passive wireless multi-parameter sensing system. The passive wireless sensing system can simultaneously monitor three parameters by using backscatter modulation together with channel multiplexing. Two different repeater coils were designed and fabricated to enhance the readout distance of the sensing system: one is a PCB (printed circuit board) repeater coil, and the other is a copper wire repeater coil. Under the conditions of fixed voltage and adjustable voltage, the maximum readout distance of the sensing system with and without a repeater coil was measured. Experimental results show that a larger power supply voltage can help further increase the readout distance. The maximum readout distance of the sensing system with a PCB repeater coil was extended by a factor of 2.3, and with a copper wire repeater coil by a factor of 3. Theoretical analysis and experimental results both indicate that a repeater coil with a higher Q factor extends the readout distance further. With the copper wire repeater coil and a higher power supply voltage, the passive wireless multi-parameter sensing system finally achieves a maximum readout distance of 13.5 cm.
Stepanova, L. V.
2017-12-01
The paper is devoted to the multi-parameter asymptotic description of the stress field near the tip of a finite crack in an infinite isotropic elastic plane medium subject to 1) tensile stress, 2) in-plane shear, and 3) mixed-mode loading for a wide range of mode-mixity situations (Mode I and Mode II). The multi-parameter series expansion of the stress tensor components containing higher-order terms is obtained. All the coefficients of the multi-parameter series expansion of the stress field are given. The main focus is on the discussion of the influence of considering the higher-order terms of the Williams expansion. An analysis of the higher-order terms in the stress field is performed. It is shown that the larger the distance from the crack tip, the more terms it is necessary to keep in the asymptotic series expansion. Therefore, it can be concluded that several more higher-order terms of the Williams expansion should be used for the stress field description when the distance from the crack tip is not small enough. The crack propagation direction angle is calculated. Two fracture criteria, the maximum tangential stress criterion and the strain energy density criterion, are used. The multi-parameter form of the two commonly used fracture criteria is introduced and tested. Thirty or more terms of the Williams series expansion for the near-crack-tip stress field enable the angle to be calculated more precisely.
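The claim that more terms must be kept at larger distances from the crack tip is a generic property of truncated series. The sketch below illustrates it with the Taylor series of ln(1+x) standing in for the Williams expansion; the function, tolerance and term counts are illustrative assumptions, not the paper's crack-tip coefficients.

```python
import math

# Generic illustration (not the Williams coefficients themselves): for a
# truncated power series, the number of terms needed for a fixed accuracy
# grows with distance from the expansion point. Here ln(1+x) about x = 0
# plays the role of the near-tip asymptotic expansion.
def partial_sum(x, n_terms):
    # ln(1+x) = sum_{k>=1} (-1)^(k+1) x^k / k, valid for |x| < 1
    return sum((-1) ** (k + 1) * x ** k / k for k in range(1, n_terms + 1))

def terms_needed(x, tol=1e-6, max_terms=10_000):
    exact = math.log1p(x)
    for n in range(1, max_terms + 1):
        if abs(partial_sum(x, n) - exact) < tol:
            return n
    return max_terms

for x in (0.1, 0.5, 0.9):   # increasing "distance from the crack tip"
    print(x, terms_needed(x))
```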
High speed acquisition of multi-parameter data using a Macintosh II CX
International Nuclear Information System (INIS)
Berno, A.; Vogel, J.S.; Caffee, M.
1990-08-01
Accelerator mass spectrometry systems based on >3 MV tandem accelerators often use multi-anode ionization detectors and/or time-of-flight detectors to identify individual isotopes through multi-parameter analysis. A Macintosh IIcx has been programmed to collect AMS data from a CAMAC-implemented analyzer and to display the histogrammed individual parameters and a double-parameter array. The computer-CAMAC connection is through a Nu-Bus to CAMAC dataway interface which allows direct addressing of all functions and locations in the crate. The asynchronous data from counting the rare isotope are sorted into a CAMAC memory module by a list sequence controller. Isotope switching is controlled by a one-cycle timing generator. A rate-dependent amount of time is used to transfer the data from the memory module at the end of each timing cycle. The present configuration uses 10 to 75 ms for rates of 500--10000 cps. Parameter analysis occurs during the rest of the 520 ms data collection cycle. Completed measurements of the isotope concentrations of each sample are written to files which are compatible with standard Macintosh databases or other processing programs. The system is inexpensive and operates at speeds comparable to those obtainable using larger computers.
Tadayyon, Hadi; Sannachi, Lakshmanan; Gangeh, Mehrdad; Sadeghi-Naini, Ali; Tran, William; Trudeau, Maureen E; Pritchard, Kathleen; Ghandi, Sonal; Verma, Sunil; Czarnota, Gregory J
2016-07-19
This study demonstrated the ability of quantitative ultrasound (QUS) parameters to provide an early prediction of tumor response to neoadjuvant chemotherapy (NAC) in patients with locally advanced breast cancer (LABC). Using a 6-MHz array transducer, ultrasound radiofrequency (RF) data were collected from 58 LABC patients prior to NAC treatment, at weeks 1, 4, and 8 of their treatment, and prior to surgery. QUS parameters including midband fit (MBF), spectral slope (SS), spectral intercept (SI), spacing among scatterers (SAS), attenuation coefficient estimate (ACE), average scatterer diameter (ASD), and average acoustic concentration (AAC) were determined from the tumor region of interest. Ultrasound data were compared with the ultimate clinical and pathological response of the patient's tumor to treatment and with patient recurrence-free survival. Multi-parameter discriminant analysis using the k-nearest-neighbor classifier demonstrated that the best response classification could be achieved using the combination of MBF, SS, and SAS, with an accuracy of 60 ± 10% at week 1, 77 ± 8% at week 4 and 75 ± 6% at week 8. Furthermore, when the QUS measurements at each time (week) were combined with pre-treatment (week 0) QUS values, the classification accuracies improved (70 ± 9% at week 1, 80 ± 5% at week 4, and 81 ± 6% at week 8). Finally, the multi-parameter QUS model demonstrated a significant difference in survival rates of responding and non-responding patients at weeks 1 and 4 (p = 0.035 and 0.027, respectively). This study demonstrated for the first time, using new parameters tested on a relatively large patient cohort and leave-one-out classifier evaluation, that a hybrid QUS biomarker including MBF, SS, and SAS could, with relatively high sensitivity and specificity, detect the response of LABC tumors to NAC as early as after 4 weeks of therapy. The findings of this study also suggested that incorporating pre-treatment QUS parameters of a tumor improved the
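A minimal sketch of multi-parameter k-nearest-neighbor classification with leave-one-out evaluation, as used in the study, on synthetic data: the three features stand in for MBF, SS and SAS, but the cluster structure, noise level and choice of k below are assumptions for illustration, not the clinical dataset.

```python
import random

# Illustrative sketch only: a k-nearest-neighbour classifier with
# leave-one-out evaluation over three hypothetical QUS features.
def knn_predict(train, query, k=3):
    # train: list of (features, label); rank by squared Euclidean distance
    ranked = sorted(train, key=lambda fx: sum((a - b) ** 2
                                              for a, b in zip(fx[0], query)))
    votes = [label for _, label in ranked[:k]]
    return max(set(votes), key=votes.count)       # majority vote

def leave_one_out_accuracy(data, k=3):
    hits = sum(knn_predict(data[:i] + data[i + 1:], x, k) == y
               for i, (x, y) in enumerate(data))
    return hits / len(data)

rng = random.Random(0)
# Synthetic cohort: "responders" cluster near (1, 1, 1) in normalised
# feature space, "non-responders" near (0, 0, 0).
data = ([([1 + rng.gauss(0, 0.3) for _ in range(3)], "responder")
         for _ in range(20)] +
        [([rng.gauss(0, 0.3) for _ in range(3)], "non-responder")
         for _ in range(20)])
print(leave_one_out_accuracy(data))
```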
Multi-parameter Full-waveform Inversion for Acoustic VTI Medium with Surface Seismic Data
Cheng, X.; Jiao, K.; Sun, D.; Huang, W.; Vigh, D.
2013-12-01
Full-waveform Inversion (FWI) has recently attracted wide attention in the oil and gas industry as a promising tool for high-resolution subsurface velocity model building. While traditional common-image-point-gather-based tomography methods aim to focus post-migrated data in the depth domain, FWI aims to directly fit the observed seismic waveform in either the time or the frequency domain. The inversion is performed iteratively by updating the velocity fields to reduce the difference between the observed and the simulated data. It has been shown that the inversion is very sensitive to the starting velocity fields, and data with long offsets and low frequencies are crucial for the success of FWI in overcoming this sensitivity. Considering the importance of data with long offsets and low frequencies, in most geologic environments anisotropy is an unavoidable topic for FWI, especially at long offsets, since anisotropy tends to have more pronounced effects on waves that have traveled a great distance. In a VTI medium, this means more horizontal velocity will be registered in middle-to-long offset data, while more vertical velocity will be registered in near-to-middle offset data. To date, most real-world applications of FWI remain in isotropic media, and only a few studies have accounted for anisotropy; most of those account for anisotropy in the waveform simulation but do not invert for the anisotropy fields. Multi-parameter inversion for anisotropy fields, even in a VTI medium, remains a hot topic in the field. In this study, we develop a strategy for multi-parameter FWI for an acoustic VTI medium with surface seismic data. Because surface seismic data are insensitive to the delta fields, we hold the delta fields unchanged during our inversion and invert only for the vertical velocity and epsilon fields. Through parameterization analysis and synthetic tests, we find that it is more feasible to invert for the parameterization as vertical and horizontal
Accelerated whole-brain multi-parameter mapping using blind compressed sensing.
Bhave, Sampada; Lingala, Sajan Goud; Johnson, Casey P; Magnotta, Vincent A; Jacob, Mathews
2016-03-01
To introduce a blind compressed sensing (BCS) framework to accelerate multi-parameter MR mapping, and demonstrate its feasibility in high-resolution, whole-brain T1ρ and T2 mapping. BCS models the evolution of magnetization at every pixel as a sparse linear combination of bases in a dictionary. Unlike compressed sensing, the dictionary and the sparse coefficients are jointly estimated from undersampled data. The large number of non-orthogonal bases in BCS accounts for more complex signals than low-rank representations. The low number of degrees of freedom of BCS, attributed to the sparse coefficients, translates to fewer artifacts at high acceleration factors (R). From 2D retrospective undersampling experiments, the mean square errors in T1ρ and T2 maps were observed to be within 0.1% up to R = 10. BCS was observed to be more robust to patient-specific motion than other compressed sensing schemes and resulted in minimal degradation of parameter maps in the presence of motion. Our results suggested that BCS can provide an acceleration factor of 8 in prospective 3D imaging with reasonable reconstructions. BCS considerably reduces scan time for multi-parameter mapping of the whole brain with minimal artifacts, and is more robust to motion-induced signal changes than current compressed sensing and principal component analysis-based techniques. © 2015 Wiley Periodicals, Inc.
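BCS jointly estimates the dictionary and the sparse coefficients; as a hedged illustration of just the sparse-coding half, the sketch below runs ISTA (iterative soft thresholding) with a fixed, made-up 4-atom dictionary on fully sampled toy data. The dictionary, regularisation weight and step size are assumptions, not the paper's reconstruction pipeline.

```python
# Sparse-coding step only: solve min_c 0.5*||y - Dc||^2 + lam*||c||_1 by
# ISTA. In full BCS the dictionary D would also be updated from the
# (undersampled) data; here D is fixed and fully sampled for clarity.
def matvec(M, v):
    return [sum(m_ij * v_j for m_ij, v_j in zip(row, v)) for row in M]

def transpose(M):
    return [list(col) for col in zip(*M)]

def soft(x, t):
    # soft-thresholding operator, the proximal map of the l1 penalty
    return [max(abs(v) - t, 0.0) * (1 if v > 0 else -1) for v in x]

def ista(D, y, lam=0.05, step=0.1, iters=500):
    Dt = transpose(D)
    c = [0.0] * len(D[0])
    for _ in range(iters):
        r = [yi - di for yi, di in zip(y, matvec(D, c))]   # residual y - Dc
        grad = matvec(Dt, r)
        c = soft([ci + step * gi for ci, gi in zip(c, grad)], step * lam)
    return c

# Toy 4-atom dictionary (columns are atoms); the true signal uses only
# atoms 0 and 2, i.e. the coefficient vector is sparse.
D = [[1.0, 0.0, 0.0, 0.5],
     [0.0, 1.0, 0.0, 0.5],
     [0.0, 0.0, 1.0, 0.5],
     [0.0, 0.0, 0.0, 0.5]]
c_true = [2.0, 0.0, 1.0, 0.0]
y = matvec(D, c_true)
print([round(ci, 2) for ci in ista(D, y)])
```

The recovered coefficients land on the two active atoms (with the usual small l1 shrinkage), which is the behaviour BCS relies on at every pixel.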
Optimization Design of Multi-Parameters in Rail Launcher System
Directory of Open Access Journals (Sweden)
Yujiao Zhang
2014-05-01
Full Text Available Today the energy storage systems are still cumbersome, therefore it is useful to think about the optimization of a railgun system in order to achieve the best performance with the lowest energy input. In this paper, an optimal design method considering 5 parameters is proposed to improve the energy conversion efficiency of a simple railgun. In order to avoid costly trials, the field-circuit method is employed to analyze the operation of structurally different railguns with different parameter values. The orthogonal test approach is used to guide the simulation in choosing better parameter combinations, as well as to reduce the calculation cost. The research shows that the proposed method gives a better result for the energy efficiency of the system. To improve the energy conversion efficiency of electromagnetic rail launchers, the selection of more parameters must be considered in the design stage, such as the width, height and length of the rail, the distance between the rail pair, and the pulse-forming inductance. However, the relationship between these parameters and energy conversion efficiency cannot be directly described by one mathematical expression, so optimization methods must be applied in the design. In this paper, a rail launcher with five parameters was optimized by using the orthogonal test method. According to the arrangement of the orthogonal table, a better parameter combination can be obtained with less calculation. Field and circuit simulation analyses were made for the different parameter values. The results show that the energy conversion efficiency of the system is increased by 71.9% after parameter optimization.
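The orthogonal-test idea can be sketched with a standard L8(2^7) array: five two-level design parameters screened in 8 runs instead of 2^5 = 32, with the best level of each parameter picked from main-effect means. The level values and the additive "efficiency" surrogate below are illustrative assumptions, not the paper's railgun field-circuit simulation.

```python
# Standard L8 orthogonal array, levels coded 0/1; we use 5 of its 7 columns.
L8 = [
    [0,0,0,0,0,0,0], [0,0,0,1,1,1,1], [0,1,1,0,0,1,1], [0,1,1,1,1,0,0],
    [1,0,1,0,1,0,1], [1,0,1,1,0,1,0], [1,1,0,0,1,1,0], [1,1,0,1,0,0,1],
]

# Hypothetical low/high values: rail width, height, length [m], rail
# separation [m], pulse-forming inductance [H].
LEVELS = [(10e-3, 15e-3), (8e-3, 12e-3), (1.0, 1.5), (20e-3, 30e-3), (5e-6, 10e-6)]

def efficiency(w, h, l, s, i):
    # Additive toy surrogate for conversion efficiency; a real study would
    # run a field-circuit simulation for each row instead.
    return 1e3 * w + 2e3 * h + 20 * l - 500 * s + 1e5 * i

responses = [efficiency(*(LEVELS[k][row[k]] for k in range(5))) for row in L8]

def main_effect_best_levels(array, response, n_factors=5):
    # For each factor, compare the mean response at level 0 vs level 1 and
    # keep the better level; orthogonality makes this exact for additive models.
    best = []
    for col in range(n_factors):
        means = [sum(response[r] for r, row in enumerate(array) if row[col] == lvl)
                 / sum(1 for row in array if row[col] == lvl) for lvl in (0, 1)]
        best.append(0 if means[0] >= means[1] else 1)
    return best

print(main_effect_best_levels(L8, responses))
```

With this surrogate the analysis selects the wider, taller, longer rail, the smaller separation and the larger inductance, a combination never run directly in the 8 trials.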
A Smart Multi-parameter Sensor with Online Monitoring for the Aquaculture in China
Peng , Fa; Wang , Jinxing; Liu , Shuangxi; Li , Daoliang; Xu , Dan; Wang , Yang
2013-01-01
International audience; pH, DO, ORP, EC and water level are important parameters in aquaculture monitoring, but the high cost of foreign sensors and the high energy consumption of Chinese sensors have prevented their wide use in China. This paper uses an STM8L152 MCU to realize an ultra-low-power design. With a simple hardware structure design, the cost of the multi-parameter sensor can be reduced. The experimental data of the multi-parameter sensor contrasted with the results obtained by Hach mul...
DEFF Research Database (Denmark)
Shitu, J. O.; Woodley, John; Wnek, R.
2009-01-01
The expression of interleukin-13 (IL13) following induction with IPTG in Escherichia coli results in metabolic changes as indicated by multi-parameter flow cytometry and traditional methods of fermentation profiling (O-2 uptake rate, CO2 evolution rate and optical density measurements). Induction...
Frequency Domain Multi-parameter Full Waveform Inversion for Acoustic VTI Media
Djebbi, Ramzi; Alkhalifah, Tariq Ali
2017-01-01
Multi-parameter full waveform inversion (FWI) for transversely isotropic (TI) media with vertical axis of symmetry (VTI) suffers from the trade-off between the parameters. The trade-off results in the leakage of one parameter's update into the other
Characterization and optimized control by means of multi-parameter controllers
Energy Technology Data Exchange (ETDEWEB)
Nielsen, Carsten; Hoeg, S.; Thoegersen, A. (Dan-Ejendomme, Hellerup (Denmark)) (and others)
2009-07-01
Poorly functioning HVAC systems (Heating, Ventilation and Air Conditioning), but also separate heating, ventilation and air conditioning systems, are costing the Danish society billions of kroner every year: partly because of increased energy consumption and high operational and maintenance costs, but mainly due to reduced productivity and absence due to illness because of a poor indoor climate. Typically, the operation of buildings and installations today relies on traditional building automation, which is characterised by 1) being based on static considerations, 2) the individual sensor being coupled with one actuator/valve, i.e. the sensor's signal is only used in one place in the system, 3) subsystems often being controlled independently of each other, and 4) the dynamics of building constructions and systems, which is very important to the system and comfort regulation, not being considered. This, coupled with the widespread tendency to use large glass areas in facades without sufficient sun shading, means that it is difficult to optimise comfort and energy consumption. Therefore, the last 10-20 years have seen a steady increase in complaints about the indoor climate in Danish buildings and, at the same time, new buildings often turn out to be considerably more energy-consuming than expected. The purpose of the present project is to investigate what types of multi-parameter sensors may be generated for buildings and, further, to carry out a preliminary evaluation of how such multi-parameter controllers may be utilized for optimal control of buildings. The aim of the project isn't to develop multi-parameter controllers; this requires much more effort than is possible in the present project. The aim is to show the potential of using multi-parameter sensors when controlling buildings. For this purpose a larger office building has been chosen: an office building with a high energy demand and complaints regarding the indoor climate. In order to
Fast ADC interface with data reduction facilities for multi-parameter experiments in nuclear physics
Energy Technology Data Exchange (ETDEWEB)
Liebl, W; Franz, N; Ziegler, G [Technische Univ. Muenchen, Garching (Germany, F.R.). Fakultaet Physik; Hegewisch, S; Kunz, D; Maier, D; Lutter, R; Schoeffel, K; Stanzel, B [Muenchen Univ. (Germany, F.R.). Sektion Physik; Drescher, B [Hahn-Meitner-Institut fuer Kernforschung Berlin G.m.b.H. (Germany, F.R.)
1982-03-01
A modular ADC interface system for multi-parameter experiments with single NIM ADCs is described. 16 fast ADCs are handled by CAMAC modules and data buses in order to build up a sophisticated hardware system which is able to take coincidence data and singles spectra in parallel. The coincidence logic is handled by one of the interface modules; the interface allows online data reduction. The further expansion of the system will be discussed.
International Nuclear Information System (INIS)
Liebl, W.; Franz, N.; Ziegler, G.
1982-01-01
A modular ADC interface system for multi-parameter experiments with single NIM ADCs is described. 16 fast ADCs are handled by CAMAC modules and data buses in order to build up a sophisticated hardware system which is able to take coincidence data and singles spectra in parallel. The coincidence logic is handled by one of the interface modules; the interface allows online data reduction. The further expansion of the system will be discussed. (orig.)
In-Pile Instrumentation Multi-Parameter System Utilizing Photonic Fibers and Nanovision
Energy Technology Data Exchange (ETDEWEB)
Burgett, Eric [Idaho State Univ., Pocatello, ID (United States)
2015-10-13
An advanced in-pile multi-parameter reactor monitoring system is being proposed in this funding opportunity. The proposed effort brings cutting-edge, high-fidelity optical measurement systems into the reactor environment in an unprecedented fashion, including in-core, in-cladding and in the fuel pellet itself. Unlike instrumented leads, the proposed system provides a unique solution to a multi-parameter monitoring need in core while being minimally intrusive in the reactor core. The detector designs proposed herein can monitor fuel compression and expansion in both the radial and axial dimensions as well as monitor linear power profiles and fission rates during the operation of the reactor. In addition, pressure, stress, strain, compression, neutron flux, neutron spectra, and temperature can be observed inside the fuel bundle and fuel rod using the proposed system. The proposed research aims at developing radiation-hard, harsh-environment multi-parameter systems for insertion into the reactor environment. The proposed research holds the potential to drastically increase the fidelity and precision of in-core instrumentation with little or no impact on the neutron economy in the reactor environment, while providing a measurement system capable of operation for entire operating cycles.
In-Pile Instrumentation Multi-Parameter System Utilizing Photonic Fibers and Nanovision
International Nuclear Information System (INIS)
Burgett, Eric
2015-01-01
An advanced in-pile multi-parameter reactor monitoring system is being proposed in this funding opportunity. The proposed effort brings cutting-edge, high-fidelity optical measurement systems into the reactor environment in an unprecedented fashion, including in-core, in-cladding and in the fuel pellet itself. Unlike instrumented leads, the proposed system provides a unique solution to a multi-parameter monitoring need in core while being minimally intrusive in the reactor core. The detector designs proposed herein can monitor fuel compression and expansion in both the radial and axial dimensions as well as monitor linear power profiles and fission rates during the operation of the reactor. In addition, pressure, stress, strain, compression, neutron flux, neutron spectra, and temperature can be observed inside the fuel bundle and fuel rod using the proposed system. The proposed research aims at developing radiation-hard, harsh-environment multi-parameter systems for insertion into the reactor environment. The proposed research holds the potential to drastically increase the fidelity and precision of in-core instrumentation with little or no impact on the neutron economy in the reactor environment, while providing a measurement system capable of operation for entire operating cycles.
Integrated, multi-parameter, investigation of eruptive dynamics at Santiaguito lava dome, Guatemala
Lavallée, Yan; De Angelis, Silvio; Rietbrock, Andreas; Lamb, Oliver; Hornby, Adrian; Lamur, Anthony; Kendrick, Jackie E.; von Aulock, Felix W.; Chigna, Gustavo
2016-04-01
Understanding the nature of the signals generated at volcanoes is central to hazard mitigation efforts. Systematic identification and understanding of the processes responsible for the signals associated with volcanic activity are only possible when high-resolution data are available over relatively long periods of time. For this reason, in November 2014, the Liverpool Earth Observatory (LEO), UK, in collaboration with colleagues of the Instituto Nacional de Sismologia, Meteorologia e Hidrologia (INSIVUMEH), Guatemala, installed a large multi-parameter geophysical monitoring network at Santiaguito, the most active volcano in Guatemala. The network, which is to date the largest temporary deployment on Santiaguito, includes nine three-component broadband seismometers, three tiltmeters, and five infrasound microphones. Further, during the initial installation campaign we conducted visual and thermal infrared measurements of surface explosive activity and collected numerous rock samples for geochemical, geophysical and rheological characterisation. Activity at Santiaguito began in 1922, with the extrusion of a series of lava domes. In recent years, persistent dome extrusion has yielded spectacularly episodic piston-like motion displayed by characteristic tilt/seismic patterns (Johnson et al., 2014). This cyclicity episodically concludes with gas emissions or gas-and-ash explosions, observed to progress along a complex fault system in the dome. The explosive activity is associated with distinct geophysical signals characterised by the presence of very-long-period earthquakes as well as more rapid inflation/deflation cycles; the erupted ash further evidences partial melting and thermal vesiculation resulting from fault processes (Lavallée et al., 2015). One year of data demonstrates the regularity of the periodicity and intensity of the explosions; analysis of infrasound data suggests that each explosion expels on the order of 10,000-100,000 kg of gas and ash. We
De Angelis, S.; Rietbrock, A.; Lavallée, Y.; Lamb, O. D.; Lamur, A.; Kendrick, J. E.; Hornby, A. J.; von Aulock, F. W.; Chigna, G.
2016-12-01
Understanding the complex processes that drive volcanic unrest is crucial to effective risk mitigation. Characterization of these processes, and of the mechanisms of volcanic eruptions, is only possible when high-resolution geophysical and geological observations are available over comparatively long periods of time. In November 2014, the Liverpool Earth Observatory, UK, in collaboration with the Instituto Nacional de Sismologia, Meteorologia e Hidrologia (INSIVUMEH), Guatemala, established a multi-parameter geophysical network at Santiaguito, one of the most active volcanoes in Guatemala. Activity at Santiaguito throughout the past decade, until the summer of 2015, was characterized by nearly continuous lava dome extrusion accompanied by frequent and regular small-to-moderate gas or gas-and-ash explosions. Over the past two years our network collected a wealth of seismic, acoustic and deformation data, complemented by campaign visual and thermal infrared measurements, and rock and ash samples. Here we present preliminary results from the analysis of this unique dataset. Using acoustic and thermal data collected during 2014-2015 we were able to assess the volume fractions of ash and gas in the eruptive plumes. The small proportion of ash inferred in the plumes confirms estimates from previous, independent studies, and suggests that these events did not involve significant magma fragmentation in the conduit. The results also agree with the suggestion that sacrificial fragmentation along fault zones in the conduit region, due to shear-induced thermal vesiculation, may be at the origin of such events. Finally, starting in the summer of 2015, our experiment captured the transition to a new phase of activity characterized by vigorous vulcanian-style explosions producing large, ash-rich plumes and frequent hazardous pyroclastic flows, as well as the formation of a large summit crater. We present evidence of this transition in the geophysical and geological data, and discuss its
Szymczak, Sonja; Hetzer, Timo; Bräuning, Achim; Joachimski, Michael M.; Leuschner, Hanns-Hubert; Kuhlemann, Joachim
2014-10-01
We present a new multi-parameter dataset from Corsican black pine growing on the island of Corsica in the Western Mediterranean basin covering the period AD 1410-2008. Wood parameters measured include tree-ring width, latewood width, earlywood width, cell lumen area, cell width, cell wall thickness, modelled wood density, as well as stable carbon and oxygen isotopes. We evaluated the relationships between different parameters and determined the value of the dataset for climate reconstructions. Correlation analyses revealed that carbon isotope ratios are influenced by cell parameters determining cell size, whereas oxygen isotope ratios are influenced by cell parameters determining the amount of transportable water in the xylem. A summer (June to August) precipitation reconstruction dating back to AD 1185 was established based on tree-ring width. No long-term trends or pronounced periods with extreme high/low precipitation are recorded in our reconstruction, indicating relatively stable moisture conditions over the entire time period. By comparing the precipitation reconstruction with a summer temperature reconstruction derived from the carbon isotope chronologies, we identified summers with extreme climate conditions, i.e. warm-dry, warm-wet, cold-dry and cold-wet. Extreme climate conditions during summer months were found to influence cell parameter characteristics. Cold-wet summers promote the production of broad latewood composed of wide and thin-walled tracheids, while warm-wet summers promote the production of latewood with small thick-walled cells. The presented dataset emphasizes the potential of multi-parameter wood analysis from one tree species over long time scales.
International Nuclear Information System (INIS)
Dorr, Peter; Gruss, Christian
2001-01-01
Photothermal infrared radiometry has been used for the measurement of thermophysical, optical, and geometrical properties of multilayered samples of paint on a metallic substrate. A special data normalization is applied to reduce the number of sensitive parameters, which makes the identification task for the remaining parameters easier. The normalization stabilizes the evaluation of the photothermal signal and makes infrared radiometry more attractive for applications in the industrial environment. It is shown that modeling and multi-parameter fitting can be applied successfully to the normalized data for the determination of layer thicknesses. As a side product we can calculate some other physical properties of the sample. © 2001 American Institute of Physics
Probabilistic teleportation via multi-parameter measurements and partially entangled states
Wei, Jiahua; Shi, Lei; Han, Chen; Xu, Zhiyan; Zhu, Yu; Wang, Gang; Wu, Hao
2018-04-01
In this paper, a novel scheme for probabilistic teleportation is presented using multi-parameter measurements via a non-maximally entangled state. This contrasts with most previous schemes, in which the kinds of measurements used for quantum teleportation are fixed. The detailed implementation procedures for our proposal are given using appropriate local unitary operations. Moreover, the total success probability and the classical information cost of this proposal are calculated. It is demonstrated that the success probability and classical cost change with the measurement parameters and the entanglement factor of the quantum channel. Our scheme could enlarge the research scope of probabilistic teleportation.
Directory of Open Access Journals (Sweden)
Jiang Ge
2017-01-01
Full Text Available System degradation is usually caused by the degradation of multiple parameters. The assessment of system reliability by the universal generating function is of low accuracy when compared with Monte Carlo simulation, and it cannot provide the probability density function of the system output performance. Therefore, a reliability assessment method based on probability density evolution with multiple parameters is presented for complexly degraded systems. First, the system output function is founded according to the transitive relation between component parameters and the system output performance. Then, the probability density evolution equation is established based on the probability conservation principle and the system output function. Furthermore, the probability distribution characteristics of the system output performance are obtained by solving the differential equation. Finally, the reliability of the degraded system is estimated. This method does not need to discretize the performance parameters and can establish a continuous probability density function of the system output performance with high calculation efficiency and low cost. A numerical example shows that this method is applicable to evaluating the reliability of multi-parameter degraded systems.
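The abstract benchmarks the density-evolution method against Monte Carlo simulation; a sketch of that Monte Carlo baseline for a hypothetical two-parameter degraded system follows. The output function, parameter distributions and failure threshold are assumptions for illustration only.

```python
import random

# Hypothetical transfer from component parameters to system performance:
# two unit "conductances" combined in series.
def system_output(p1, p2):
    return p1 * p2 / (p1 + p2)

def reliability(n=100_000, threshold=0.4, seed=7):
    # Monte Carlo estimate of P(system_output > threshold) when each
    # degraded parameter is drawn from an assumed normal distribution.
    rng = random.Random(seed)
    ok = 0
    for _ in range(n):
        p1 = rng.gauss(1.0, 0.1)
        p2 = rng.gauss(1.0, 0.1)
        if system_output(p1, p2) > threshold:
            ok += 1
    return ok / n

print(reliability())
```

The density-evolution method of the paper would instead propagate the joint parameter density through `system_output` analytically, avoiding the sampling cost that grows with the required accuracy here.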
CIMI simulations with recently developed multi-parameter chorus and plasmaspheric hiss models
Aryan, Homayon; Sibeck, David; Kang, Suk-bin; Balikhin, Michael; Fok, Mei-ching
2017-04-01
Simulation studies of the Earth's radiation belts are very useful in understanding the acceleration and loss of energetic particles. The Comprehensive Inner Magnetosphere-Ionosphere (CIMI) model considers the effects of the ring current and plasmasphere on the radiation belts. CIMI was formed by merging the Comprehensive Ring Current Model (CRCM) and the Radiation Belt Environment (RBE) model, and solves for many essential quantities in the inner magnetosphere, including radiation belt enhancements and dropouts. It incorporates chorus and plasmaspheric hiss wave diffusion of energetic electrons in energy, pitch angle, and cross terms. Usually the chorus and plasmaspheric hiss models used in CIMI are based on a single-parameter geomagnetic index (AE). Here we integrate recently developed multi-parameter chorus and plasmaspheric hiss wave models based on geomagnetic index and solar wind parameters. We then perform CIMI simulations for different storms and compare the results with data from the Van Allen Probes and the Two Wide-angle Imaging Neutral-atom Spectrometers and Akebono satellites. We find that the CIMI simulations with multi-parameter chorus and plasmaspheric hiss wave models are more comparable to the data than those with the single-parameter wave models.
Application of multi-parameter chorus and plasmaspheric hiss wave models in radiation belt modeling
Aryan, H.; Kang, S. B.; Balikhin, M. A.; Fok, M. C. H.; Agapitov, O. V.; Komar, C. M.; Kanekal, S. G.; Nagai, T.; Sibeck, D. G.
2017-12-01
Numerical simulation studies of the Earth's radiation belts are important to understand the acceleration and loss of energetic electrons. The Comprehensive Inner Magnetosphere-Ionosphere (CIMI) model, along with many other radiation belt models, requires inputs for pitch angle, energy, and cross diffusion of electrons due to chorus and plasmaspheric hiss waves. These parameters are calculated using statistical wave distribution models of chorus and plasmaspheric hiss amplitudes. In this study we incorporate recently developed multi-parameter chorus and plasmaspheric hiss wave models based on the geomagnetic index and solar wind parameters. We perform CIMI simulations for two geomagnetic storms and compare the flux enhancement of MeV electrons with data from the Van Allen Probes and Akebono satellites. We show that the relativistic electron fluxes calculated with the multi-parameter wave models resemble the observations more closely than those calculated with single-parameter wave models. This indicates that wave models based on a combination of geomagnetic index and solar wind parameters are more effective as inputs to radiation belt models.
Concept for a solid-state multi-parameter sensor system for cell-culture monitoring
International Nuclear Information System (INIS)
Baecker, M.; Beging, S.; Biselli, M.; Poghossian, A.; Wang, J.; Zang, W.; Wagner, P.; Schoening, M.J.
2009-01-01
In this study, a concept for a silicon-based modular solid-state sensor system for inline multi-parameter monitoring of cell-culture fermentation processes is presented. The envisaged multi-parameter sensor system consists of two identical sensor modules and is intended for continuous quantification of up to five (bio-)chemical and physical parameters, namely glucose and glutamine concentration, pH value, electrolyte conductivity and temperature, by applying different transducer principles and/or different operation modes. Experimental results are presented for the sterilisable field-effect electrolyte-insulator-semiconductor (EIS) pH sensor and for the electrolyte conductivity sensor based on interdigitated electrodes. Repeated autoclaving does not have any significant impact on the pH-sensitive properties of a Ta2O5-gate EIS sensor. Even after 30 autoclaving cycles, the pH sensors show a clear pH response and a nearly linear calibration curve with a slope of 57 ± 1 mV/pH. Additional scanning electron microscopy and ellipsometric investigations do not show any visible surface degradation or changes in the thickness of the pH-sensitive Ta2O5 layer. These preliminary results demonstrate the suitability of the developed EIS sensor for inline pH measurement during a fermentation process. In addition, interdigitated electrodes of different geometries serving as electrolyte conductivity sensors have been tested for measurements in relatively high ionic-strength solutions.
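A near-Nernstian slope such as the quoted 57 ± 1 mV/pH comes from a linear calibration of sensor output against buffer pH. A minimal sketch with hypothetical calibration readings (the buffer values and voltages below are illustrative, not the paper's data):

```python
def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept for y = slope*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical calibration readings: EIS output voltage (mV) in pH buffers,
# chosen here to represent a near-Nernstian response of -57 mV/pH.
ph = [4.0, 7.0, 10.0]
mv = [171.0, 0.0, -171.0]

slope, intercept = linear_fit(ph, mv)
sensitivity = abs(slope)   # sensor sensitivity in mV per pH unit
```

Tracking this sensitivity across autoclaving cycles is then a matter of repeating the fit after each cycle and comparing slopes.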
Frequency Domain Multi-parameter Full Waveform Inversion for Acoustic VTI Media
Djebbi, Ramzi
2017-05-26
Multi-parameter full waveform inversion (FWI) for transversely isotropic (TI) media with a vertical axis of symmetry (VTI) suffers from trade-offs between the parameters. The trade-off results in the leakage of one parameter's update into the other during the inversion, affecting the accuracy and convergence of the inversion. Sensitivity analyses have suggested a parameterisation using the horizontal velocity vh, epsilon and eta to reduce the trade-off for surface-recorded seismic data. We test the (vh, epsilon, eta) parameterisation for acoustic VTI media using a scattering-integral (SI) based inversion. The data are modeled in the frequency domain and the model is updated using a preconditioned conjugate gradient method. We applied the method to the VTI Marmousi II model; in the inversion, we keep the eta parameter fixed at its initial background value and invert simultaneously for both vh and epsilon. The results show the suitability of the parameterisation for multi-parameter VTI acoustic inversion as well as the accuracy of the inversion approach.
Odbert, Henry; Aspinall, Willy
2014-05-01
the uncertainty of inferences, and how our method provides a route to formal propagation of uncertainties in hazard models. Such approaches provide an attractive route to developing an interface between volcano monitoring analyses and probabilistic hazard scenario analysis. We discuss the use of BBNs in hazard analysis as a tractable and traceable tool for fast, rational assimilation of complex, multi-parameter data sets in the context of timely volcanic crisis decision support.
A molecular informatics view on best practice in multi-parameter compound optimization.
Lusher, Scott J; McGuire, Ross; Azevedo, Rita; Boiten, Jan-Willem; van Schaik, Rene C; de Vlieg, Jacob
2011-07-01
The difference between biologically active molecules and drugs is that the latter balance an array of related and unrelated properties required for administration to patients. Inevitably, during optimization, some of these multiple factors will conflict. Although informatics has a crucial role in addressing the challenges of modern compound optimization, it is arguably still undervalued and underutilized. We present here some of the basic requirements of multi-parameter drug design, the crucial role of informatics, and examples of good practice. The most crucial of these best practices are the need for informaticians to align their technologies and insights directly with discovery projects, and for all scientists in drug discovery to become more proficient in the use of in silico methods. Copyright © 2011 Elsevier Ltd. All rights reserved.
Directory of Open Access Journals (Sweden)
Li Ke
2014-12-01
Full Text Available A large-scale high-altitude environment simulation test cabin was developed to accurately control the temperatures and pressures encountered at high altitudes. The system provides slope-tracking dynamic control of the two parameters temperature and pressure, and must overcome the control difficulties inherent in a large-inertia lag link within a complex plant composed of a turbine refrigeration device, a vacuum device and a liquid-nitrogen cooling device. The system includes multi-parameter decoupling of the cabin itself to avoid damage to the air-refrigeration turbine caused by improper operation. Based on analysis of the dynamic characteristics and modeling of the variations in temperature, pressure and rotation speed, an intelligent controller was implemented that combines decoupling and fuzzy arithmetic with an expert PID controller to control the test parameters through a decoupling and slope-tracking control strategy. The control system employs centralized management in an open industrial Ethernet architecture with an industrial computer at its core. Simulation and field-debugging results show that this method overcomes the poor anti-interference performance typical of a conventional PID and the overshooting that can readily damage equipment. The steady-state characteristics meet the system requirements.
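The slope-tracking idea, ramping the setpoint at a fixed rate rather than stepping it, can be illustrated with a plain PID driving a first-order lag plant. This is a toy sketch of the control strategy only; the gains, plant time constant, and slope below are hypothetical and have nothing to do with the actual cabin hardware:

```python
def simulate_slope_tracking(slope=-1.0, t_end=50.0, dt=0.1,
                            kp=2.0, ki=0.5, kd=0.0):
    """Track a ramped temperature setpoint (deg C) with a PID driving a
    first-order lag plant (tau = 5 s); returns the final tracking error."""
    y = 20.0            # plant output (cabin temperature), starts at ambient
    integral = 0.0
    prev_err = None
    t = 0.0
    while t < t_end:
        setpoint = 20.0 + slope * t          # slope-tracking reference ramp
        err = setpoint - y
        integral += err * dt
        deriv = 0.0 if prev_err is None else (err - prev_err) / dt
        u = kp * err + ki * integral + kd * deriv
        y += dt * (u - (y - 20.0)) / 5.0     # Euler step of the lag plant
        prev_err = err
        t += dt
    return abs(setpoint - y)

final_err = simulate_slope_tracking()
```

With a PI controller on a ramp reference, the tracking error settles to a finite constant (slope magnitude divided by the velocity constant, here 1/0.5 = 2 degC) rather than growing, which is the property a slope-tracking strategy exploits.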
Directory of Open Access Journals (Sweden)
Wei Bai
2017-06-01
Full Text Available A multi-parameter measurement system based on an ultra-weak fiber Bragg grating (UFBG) array with sensitive materials was proposed and experimentally demonstrated. The UFBG array interrogation principle is time-division multiplexing, with two semiconductor optical amplifiers as timing units. Experimental results showed that the performance of the proposed UFBG system is almost equal to that of traditional FBG, while the UFBG array system has obvious superiority in its potential multiplexing ability for multi-point and multi-parameter measurement. The system was tested on an array of 144 UFBGs, each with a reflectivity of ~0.04%, for four target parameters: hydrogen, humidity, temperature and salinity. Moreover, a uniform solution was customized to decouple the cross-sensitivity between temperature and the other target parameters. It is expected that this scheme will be capable of handling thousands of multi-parameter sensors in a single fiber.
Bai, Wei; Yang, Minghong; Hu, Chenyuan; Dai, Jixiang; Zhong, Xuexiang; Huang, Shuai; Wang, Gaopeng
2017-06-26
A multi-parameter measurement system based on an ultra-weak fiber Bragg grating (UFBG) array with sensitive materials was proposed and experimentally demonstrated. The UFBG array interrogation principle is time-division multiplexing, with two semiconductor optical amplifiers as timing units. Experimental results showed that the performance of the proposed UFBG system is almost equal to that of traditional FBG, while the UFBG array system has obvious superiority in its potential multiplexing ability for multi-point and multi-parameter measurement. The system was tested on an array of 144 UFBGs, each with a reflectivity of ~0.04%, for four target parameters: hydrogen, humidity, temperature and salinity. Moreover, a uniform solution was customized to decouple the cross-sensitivity between temperature and the other target parameters. It is expected that this scheme will be capable of handling thousands of multi-parameter sensors in a single fiber.
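Dividing the cross-sensitivity between temperature and another parameter typically means inverting a small sensitivity matrix: two gratings with different coatings give two wavelength shifts, each a linear mix of the two underlying changes. A sketch with hypothetical sensitivity coefficients (not the paper's calibration values):

```python
def separate(dl1, dl2, k=((10.0, 0.0), (2.0, 1.5))):
    """Recover temperature and humidity changes from two FBG wavelength
    shifts (pm) by inverting a 2x2 sensitivity matrix:
        [dl1]   [kT1  kH1] [dT]
        [dl2] = [kT2  kH2] [dH]
    The coefficients k are hypothetical (pm/degC, pm/%RH): grating 1 is
    uncoated (temperature only), grating 2 carries a humidity-sensitive coating.
    """
    (a, b), (c, d) = k
    det = a * d - b * c
    dT = (d * dl1 - b * dl2) / det
    dH = (a * dl2 - c * dl1) / det
    return dT, dH

# Shifts that the matrix above would produce for dT = 3 degC, dH = 4 %RH:
dT, dH = separate(30.0, 12.0)
```

The same scheme extends to more parameters by adding one differently-coated grating per parameter and inverting a larger matrix.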
Realization of multi-parameter and multi-state in fault tree computer-aided building software
International Nuclear Information System (INIS)
Guo Xiaoli; Tong Jiejuan; Xue Dazhi
2004-01-01
More than one parameter, and more than one failed state per parameter, are often involved in building a fault tree, so fault tree computer-aided building software must be able to deal with multiple parameters and multiple states. The Fault Tree Expert System (FTES) aims to aid the fault-tree-building work for hydraulic systems. This paper expatiates on how multi-parameter and multi-state handling is realized in FTES, with focus on the Knowledge Base and the Illation Engine. (author)
Truncated Gauss-Newton Implementation for Multi-Parameter Full Waveform Inversion
Liu, Y.; Yang, J.; Dong, L.; Wang, Y.
2014-12-01
Full waveform inversion (FWI) is a numerical optimization method which aims at minimizing the difference between synthetic and recorded seismic data to obtain high-resolution subsurface images. A practical implementation of FWI is the adjoint-state method (AD), in which the data residuals at receiver locations are back-propagated simultaneously to form the gradient. The scattering-integral (SI) method is an alternative based on explicitly building the sensitivity kernel (Fréchet derivative matrix). Although it is more memory-consuming, SI is more efficient than AD when the number of sources is larger than the number of receivers. To improve the convergence of FWI, the information carried by the inverse Hessian operator is crucial. Accurately accounting for the effect of this operator in FWI can correct illumination deficits, preserve the amplitudes of the subsurface parameters, and remove artifacts generated by multiple reflections. In multi-parameter FWI, the off-diagonal blocks of the Hessian operator reflect the coupling between different parameter classes; incorporating its inverse could therefore help to mitigate the trade-off effects. In this study, we focus on a truncated Gauss-Newton implementation for multi-parameter FWI. The model update is computed through a matrix-free conjugate-gradient solution of the Newton linear system. Both the gradient and the Hessian-vector product are calculated using the SI approach instead of the first- and second-order AD. The gradient, expressed as a kernel-vector product, is calculated through the accumulation of the decomposed vector-scalar products, so it is not necessary to store the huge sensitivity matrix beforehand. We call this the matrix decomposition (MD) approach. The Hessian-vector product is likewise replaced by two kernel-vector products, which are then calculated by MD. In this way, we do not need to solve two additional wave propagation problems as in the
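The "matrix-free conjugate gradient solution of the Newton linear system" means CG never forms the Hessian H; it only needs a routine that returns H applied to a vector. A generic sketch of that inner solve for H dm = -g, with a toy 2x2 symmetric positive-definite operator standing in for the Gauss-Newton Hessian:

```python
def cg_solve(hess_vec, g, n_iter=50, tol=1e-10):
    """Matrix-free conjugate gradient for H dm = -g, where the Hessian is
    available only through its action hess_vec(v)."""
    x = [0.0] * len(g)
    r = [-gi for gi in g]            # residual r = -g - H*x with x = 0
    p = r[:]
    rs = sum(ri * ri for ri in r)
    for _ in range(n_iter):
        hp = hess_vec(p)
        alpha = rs / sum(pi * hpi for pi, hpi in zip(p, hp))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * hpi for ri, hpi in zip(r, hp)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:             # truncation: stop once residual is small
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

# Toy 2x2 SPD operator standing in for the Gauss-Newton Hessian:
H = [[4.0, 1.0], [1.0, 3.0]]
hv = lambda v: [sum(H[i][j] * v[j] for j in range(2)) for i in range(2)]
g = [1.0, 2.0]
dm = cg_solve(hv, g)   # model update solving H dm = -g
```

In the paper's setting `hess_vec` would be two kernel-vector products computed by the MD approach; the CG driver itself is unchanged, and "truncated" refers to stopping the inner iterations early.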
International Nuclear Information System (INIS)
Doherty, Kimberly R.; Wappel, Robert L.; Talbert, Dominique R.; Trusk, Patricia B.; Moran, Diarmuid M.; Kramer, James W.; Brown, Arthur M.; Shell, Scott A.; Bacus, Sarah
2013-01-01
Tyrosine kinase inhibitors (TKi) have greatly improved the treatment and prognosis of multiple cancer types. However, unexpected cardiotoxicity has arisen in a subset of patients treated with these agents that was not wholly predicted by pre-clinical testing, which centers around animal toxicity studies and inhibition of the human Ether-à-go-go-Related Gene (hERG) channel. Therefore, we sought to determine whether a multi-parameter test panel assessing the effect of drug treatment on cellular, molecular, and electrophysiological endpoints could accurately predict cardiotoxicity. We examined how 4 FDA-approved TKi agents impacted cell viability, apoptosis, reactive oxygen species (ROS) generation, metabolic status, impedance, and ion channel function in human cardiomyocytes. The 3 drugs clinically associated with severe cardiac adverse events (crizotinib, sunitinib, nilotinib) all proved to be cardiotoxic in our in vitro tests while the relatively cardiac-safe drug erlotinib showed only minor changes in cardiac cell health. Crizotinib, an ALK/MET inhibitor, led to increased ROS production, caspase activation, cholesterol accumulation, disruption in cardiac cell beat rate, and blockage of ion channels. The multi-targeted TKi sunitinib showed decreased cardiomyocyte viability, AMPK inhibition, increased lipid accumulation, disrupted beat pattern, and hERG block. Nilotinib, a second generation Bcr-Abl inhibitor, led to increased ROS generation, caspase activation, hERG block, and an arrhythmic beat pattern. Thus, each drug showed a unique toxicity profile that may reflect the multiple mechanisms leading to cardiotoxicity. This study demonstrates that a multi-parameter approach can provide a robust characterization of drug-induced cardiomyocyte damage that can be leveraged to improve drug safety during early phase development. - Highlights: • TKi with known adverse effects show unique cardiotoxicity profiles in this panel. • Crizotinib increases ROS, apoptosis, and
Wang, Yujie; Pan, Rui; Liu, Chang; Chen, Zonghai; Ling, Qiang
2018-01-01
The battery power capability is intimately correlated with the climbing, braking and accelerating performance of electric vehicles. Accurate power capability prediction can not only guarantee safety but also regulate driving behavior and optimize battery energy usage. However, the nonlinearity of the battery model is very complex, especially for lithium iron phosphate batteries, and the hysteresis loop in the open-circuit voltage curve can easily cause large errors in model prediction. In this work, a dynamic estimation method with multiple parameter constraints is proposed to predict the battery's continuous-period power capability. A high-fidelity battery model which considers the battery polarization and hysteresis phenomena is presented to approximate the high nonlinearity of the lithium iron phosphate battery. Explicit analyses of power capability with multiple constraints are elaborated; specifically, the state-of-energy is considered in the power capability assessment. Furthermore, to solve the problem of nonlinear system state estimation and to suppress noise interference, a UKF-based state observer is employed for power capability prediction. The performance of the proposed methodology is demonstrated by experiments under different dynamic characterization schedules. The charge and discharge power capabilities of the lithium iron phosphate batteries are quantitatively assessed under different time scales and temperatures.
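The multi-constraint idea reduces to taking the minimum over the power limits implied by each constraint (maximum current, minimum terminal voltage, remaining state-of-energy). A deliberately simplified sketch using a basic Rint model rather than the paper's high-fidelity polarization/hysteresis model; all numbers are hypothetical:

```python
def peak_discharge_power(ocv, r_int, i_max, v_min, soe, soe_min,
                         dt=10.0, e_total=5000.0):
    """Continuous-period peak discharge power (W) as the minimum over three
    constraint-derived limits, using a simple Rint battery model."""
    # Current-limited: terminal power when drawing the maximum current.
    p_current = i_max * (ocv - r_int * i_max)
    # Voltage-limited: largest current that keeps the terminal at v_min.
    i_vmin = (ocv - v_min) / r_int
    p_voltage = v_min * i_vmin
    # Energy-limited: usable energy (J) spread over the prediction window dt (s).
    p_energy = max(0.0, (soe - soe_min) * e_total / dt)
    return min(p_current, p_voltage, p_energy)

p = peak_discharge_power(ocv=3.3, r_int=0.01, i_max=100.0,
                         v_min=2.8, soe=0.5, soe_min=0.2, dt=10.0)
```

In the paper the open-circuit voltage and internal states come from the UKF observer rather than being fixed inputs, but the final "minimum over constraints" step has this shape.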
Zhang, Jingdong; Zhu, Tao; Zhou, Huan; Huang, Shihong; Liu, Min; Huang, Wei
2016-11-28
We demonstrate a cost-effective distributed fiber sensing system for the multi-parameter detection of vibration, temperature, and strain by integrating phase-sensitive optical time domain reflectometry (φ-OTDR) and Brillouin optical time domain reflectometry (B-OTDR). Taking advantage of the fast-changing property of the vibration and the static properties of the temperature and the strain, both the width and intensity of the laser pulses are modulated and injected into the single-mode sensing fiber proportionally, so that the three parameters of interest can be extracted simultaneously by only one photo-detector and one data acquisition channel. A data processing method based on Gaussian-window short-time Fourier transform (G-STFT) is capable of achieving high spatial resolution in B-OTDR. The experimental results show that up to 4.8 kHz vibration sensing with 3 m spatial resolution over 10 km of standard single-mode fiber can be realized, as well as distributed temperature and strain profiles along the same fiber with 80 cm spatial resolution.
Liu, Ronghua; Sun, Qiaofeng; Hu, Tian; Li, Lian; Nie, Lei; Wang, Jiayue; Zhou, Wanhui; Zang, Hengchang
2018-03-01
As a powerful process analytical technology (PAT) tool, near infrared (NIR) spectroscopy has been widely used in real-time monitoring. In this study, NIR spectroscopy was applied to monitor multiple parameters of the traditional Chinese medicine (TCM) Shenzhiling oral liquid during the concentration process, to guarantee the quality of the product. Five lab-scale batches were employed to construct quantitative models for five chemical ingredients and one physical property (sample density) during the concentration process. Paeoniflorin, albiflorin, liquiritin and sample density were modeled by partial least squares regression (PLSR), while the contents of glycyrrhizic acid and cinnamic acid were modeled by support vector machine regression (SVMR). Standard normal variate (SNV) and/or Savitzky-Golay (SG) smoothing with derivative methods were adopted for spectral pretreatment. Variable selection methods including correlation coefficient (CC), competitive adaptive reweighted sampling (CARS) and interval partial least squares regression (iPLS) were performed to optimize the models. The results indicated that NIR spectroscopy is an effective tool for monitoring the concentration process of Shenzhiling oral liquid.
Multi-parameter vital sign database to assist in alarm optimization for general care units.
Welch, James; Kanter, Benjamin; Skora, Brooke; McCombie, Scott; Henry, Isaac; McCombie, Devin; Kennedy, Rosemary; Soller, Babs
2016-12-01
Continual vital sign assessment on the general care, medical-surgical floor is expected to provide early indication of patient deterioration and increase the effectiveness of rapid response teams. However, there is concern that continual, multi-parameter vital sign monitoring will produce alarm fatigue. The objective of this study was the development of a methodology to help care teams optimize alarm settings. An on-body wireless monitoring system was used to continually assess heart rate, respiratory rate, SpO2 and noninvasive blood pressure in the general ward of ten hospitals between April 1, 2014 and January 19, 2015. These data, 94,575 h for 3430 patients, are contained in a large database accessible with cloud computing tools. Simulation scenarios assessed the total alarm rate as a function of threshold and annunciation delay (s). The total alarm rate of ten alarms/patient/day predicted from the cloud-hosted database was the same as the total alarm rate in a 10 day evaluation (1550 h for 36 patients) at an independent hospital. Plots of vital sign distributions in the cloud-hosted database were similar to other large databases published by different authors. The cloud-hosted database can be used to run simulations for various alarm thresholds and annunciation delays to predict the total alarm burden experienced by nursing staff. This methodology might, in the future, be used to help reduce alarm fatigue without sacrificing the ability to continually monitor all vital signs.
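The threshold-plus-annunciation-delay simulation can be sketched directly: replay a stored vital-sign trace and count an alarm only when the signal stays outside its limits for at least the delay. The trace and limits below are synthetic illustrations, not values from the study database:

```python
def count_alarms(samples, low, high, delay_samples):
    """Count alarm annunciations: an alarm fires when a vital sign stays
    outside [low, high] for at least delay_samples consecutive samples,
    and re-arms once the signal returns inside the limits."""
    alarms = 0
    run = 0
    fired = False
    for v in samples:
        if v < low or v > high:
            run += 1
            if run >= delay_samples and not fired:
                alarms += 1
                fired = True
        else:
            run = 0
            fired = False
    return alarms

# Synthetic SpO2 trace (1 sample/s): a 3 s dip and a 20 s desaturation.
trace = [97] * 10 + [88] * 3 + [97] * 10 + [87] * 20 + [96] * 10
short_delay = count_alarms(trace, low=90, high=100, delay_samples=1)
long_delay = count_alarms(trace, low=90, high=100, delay_samples=10)
```

Sweeping `low`, `high`, and `delay_samples` over a large replayed database yields the predicted alarms/patient/day surface the abstract describes.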
Seismo-Geochemical Variations in SW Taiwan: Multi-Parameter Automatic Gas Monitoring Results
Yang, T. F.; Fu, C.-C.; Walia, V.; Chen, C.-H.; Chyi, L. L.; Liu, T.-K.; Song, S.-R.; Lee, M.; Lin, C.-W.; Lin, C.-C.
2006-04-01
Gas variations of many mud volcanoes and hot springs distributed along the tectonic sutures in southwestern Taiwan are considered to be sensitive to earthquake activity. Therefore, a multi-parameter automatic gas station was built on the bank of one of the largest mud pools in an active fault zone of southwestern Taiwan, for continuous monitoring of CO2, CH4, N2 and H2O, the major constituents of its bubbling gases. During year-round monitoring from October 2001 to October 2002, the gas composition of the mud pool, especially CH4 and CO2, showed significant variations. Taking the CO2/CH4 ratio as the main indicator, anomalous variations can be recognized from a few days to a few weeks before earthquakes, and they correlate well with events of local magnitude >4.0 and local intensity >2. It is concluded that the gas composition in the area is sensitive to the local crustal stress/strain and warrants real-time monitoring for seismo-geochemical precursors.
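One simple way to operationalize "anomalous variations" in a ratio time series is to flag samples that deviate from a trailing baseline by more than a few standard deviations. This is a generic sketch with synthetic data, not the authors' detection criterion:

```python
def flag_anomalies(ratios, window=5, k=3.0):
    """Flag indices where the CO2/CH4 ratio deviates from the trailing
    window mean by more than k trailing standard deviations."""
    flags = []
    for i in range(window, len(ratios)):
        base = ratios[i - window:i]
        mean = sum(base) / window
        var = sum((x - mean) ** 2 for x in base) / window
        std = var ** 0.5
        if std > 0 and abs(ratios[i] - mean) > k * std:
            flags.append(i)
    return flags

# Synthetic daily CO2/CH4 ratios with one excursion at index 8.
series = [1.00, 1.02, 0.99, 1.01, 1.00, 1.01, 0.99, 1.00, 1.40, 1.01]
anomalies = flag_anomalies(series)
```

Real precursor screening would add seasonal and rainfall corrections, but the trailing-baseline test captures the basic logic of comparing each reading to recent background.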
Multi-parameter actuation of a neutrally stable shell: a flexible gear-less motor.
Hamouche, W; Maurini, C; Vidoli, S; Vincenti, A
2017-08-01
We have designed and tested experimentally a morphing structure consisting of a neutrally stable thin cylindrical shell driven by a multi-parameter piezoelectric actuation. The shell is obtained by plastically deforming an initially flat copper disc, so as to induce large isotropic and almost uniform inelastic curvatures. Following the plastic deformation, in a perfectly isotropic system, the shell is theoretically neutrally stable, having a continuous set of stable cylindrical shapes corresponding to the rotation of the axis of maximal curvature. Small imperfections render the actual structure bistable, giving preferred orientations. A three-parameter piezoelectric actuation, exerted through micro-fibre-composite actuators, allows us to add a small perturbation to the plastic inelastic curvature and to control the direction of maximal curvature. This actuation law is designed through a geometrical analogy based on a fully nonlinear inextensible uniform-curvature shell model. We report on the fabrication, identification and experimental testing of a prototype and demonstrate the effectiveness of the piezoelectric actuators in controlling its shape. The resulting motion is an apparent rotation of the shell, controlled by the voltages as in a 'gear-less motor', which is, in reality, a precession of the axis of principal curvature.
Ahl, Andreas; Supper, R.; Motschka, K.; Schattauer, I.
2010-05-01
For the interpretation of airborne gamma-ray spectrometry as well as airborne electromagnetics it is of great importance to determine the distance between the geophysical sensor and the ground surface. Since radar altimeters do not penetrate vegetation, laser altimeters have become popular in airborne geophysics over the past years. Currently the airborne geophysical platform of the Geological Survey of Austria (GBA) is equipped with a Riegl LD90-3800VHS-FLP high-resolution laser altimeter, measuring distances according to the first and the last reflected pulse. The goal of the presented study was to explore the possibilities of deriving additional information about the survey area from the laser data and to determine the accuracy of such results. On the one hand, the difference between the arrival times of the first and the last reflected pulse can be used to determine the height of the vegetation. This parameter is, for example, important for the correction of damping effects on airborne gamma-ray measurements caused by vegetation. Moreover, especially for groundwater studies at catchment scale, this parameter can also be applied to support the spatial assessment of evapotranspiration. In combination with the altitude above the geoid, determined by a GPS receiver, a rough digital elevation model of the survey area can be derived from the laser altimetry. Based on a data set from a survey area in the northern part of Austria, close to the border with the Czech Republic, the reliability of such a digital elevation model and of the calculated vegetation height was tested. In this study a mean deviation of -1.4 m, with a standard deviation of ±3.4 m, was found between the digital elevation model of Upper Austria (25 m spatial resolution) and the derived elevation model. We also found an obvious correlation between calculated vegetation heights greater than 15 m and the mapped forest published by the 'Department of Forest Inventory' of the 'Federal Forest Office' of Austria.
These results encouraged us to apply these methods to airborne geophysical data sets from the United Mexican States. One survey was targeted to provide additional data for advanced groundwater modeling in remote areas of the karstic plateau of Yucatan. Within the other project, a sustainable source of water supply had to be located for a small settlement on the isolated island of Socorro, 700 km off the Mexican main coast. At both survey areas no accurate elevation models or area-wide information about vegetation heights were available before the airborne geophysical survey. The results of these investigations will be presented. From an evaluation of the results it can be concluded that the use of laser altimetry not only provides essential information about the ground clearance of the geophysical instruments but also increases the benefit of the airborne survey for the client by delivering additional information about the survey area. It is clear that the accuracy of the resulting data cannot compete with a high-resolution laser scanning survey. However, in areas where such information is not available, an obvious additional benefit can be achieved without the need to spend money on additional survey campaigns. Currently further studies are being launched to investigate the possibility of increasing the accuracy of the altitude data by determining the roll and pitch of the helicopter using differentially corrected multiple L1/L2-band GPS receivers mounted at fixed positions on the helicopter platform. The above study was partly financed by the Austrian Science Fund, Xplore (L524-N10) project.
Model-based dynamic multi-parameter method for peak power estimation of lithium-ion batteries
Sun, F.; Xiong, R.; He, H.; Li, W.; Aussems, J.E.E.
2012-01-01
A model-based dynamic multi-parameter method for peak power estimation is proposed for batteries and battery management systems (BMSs) used in hybrid electric vehicles (HEVs). The available power must be accurately calculated in order not to damage the battery by overcharging or overdischarging or
Multi-parameter fibre Bragg grating sensor-array for thermal vacuum cycling test
Cheng, L.; Ahlers, B.; Toet, P.; Casarosa, G.; Appolloni, M.
2017-11-01
Fibre Bragg Grating (FBG) sensor systems based on optical fibres are gaining interest in space applications. Studies on Structural Health Monitoring (SHM) of reusable launchers using FBG sensors have been carried out in the Future European Space Transportation Investigations Programme (FESTIP). Increasing investment in the development of FBG sensor applications is foreseen in the Future Launchers Preparatory Programme (FLPP). TNO has performed different SHM measurements with FBGs, including on the VEGA interstage [1, 2] in 2006. Within the current project, a multi-parameter FBG sensor array demonstrator system for temperature and strain measurements is designed, fabricated and tested under ambient as well as Thermal Vacuum (TV) conditions in a TV chamber of the European Space Agency (ESA), ESTEC site. The aim is the development of a multi-parameter measuring system based on FBG technology for space applications. During the TV tests of a Space Craft (S/C) or its subsystems, thermal as well as strain measurements are needed by the engineers in order to verify their predictions and to validate their models. Because of the dimensions of the test specimen and the accuracy required of the measurement, a large number of observation/measuring points are needed. Conventional sensor systems require a complex routing of the cables connecting the sensors to their acquisition unit, which adds extra weight to the construction under test. FBG sensors are potentially lightweight and can easily be multiplexed in an array configuration. The tasks comprised the demonstrator system design; component selection, procurement and manufacturing; and finally its assembly. The temperature FBG sensor is calibrated in a dedicated laboratory setup down to liquid nitrogen (LN2) temperature at TNO, and a temperature-wavelength calibration curve is generated. After definition of a test programme, a setup in thermal vacuum is realised at ESA premises including a mechanical
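Once a temperature-wavelength calibration curve exists, applying it is a matter of evaluating the fitted polynomial on the measured Bragg wavelength shift. A minimal sketch; the quadratic coefficients below are hypothetical placeholders, not TNO's calibration:

```python
def temp_from_wavelength(shift_pm, coeffs=(20.0, 0.095, -1.0e-6)):
    """Convert an FBG Bragg-wavelength shift (pm, relative to the reading
    at the 20 degC reference) into temperature via a quadratic calibration
    curve T = a0 + a1*d + a2*d**2. Coefficients are hypothetical."""
    a0, a1, a2 = coeffs
    d = shift_pm
    return a0 + a1 * d + a2 * d * d

t_ref = temp_from_wavelength(0.0)    # zero shift -> reference temperature
t_hot = temp_from_wavelength(500.0)  # a +500 pm shift on this curve
```

A polynomial (rather than purely linear) curve matters here because the FBG thermo-optic response becomes noticeably nonlinear toward LN2 temperatures.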
International Nuclear Information System (INIS)
Solov'ev, A.G.; Stadnik, A.V.; Islamov, A.N.; Kuklin, A.I.
2008-01-01
Fitter is a C++ program designed to fit a chosen theoretical multi-parameter function through a set of data points. The method of fitting is chi-square minimization. In addition, a robust fitting method can be applied. Fitter was designed to be used for small-angle neutron scattering data analysis, and the respective theoretical models are implemented in it. Some commonly used models (Gaussians and polynomials) are also implemented for wider applicability.
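Chi-square minimization weights each residual by the inverse variance of the data point. For models linear in their parameters (such as Fitter's polynomial models) the minimization has a closed form; the sketch below, in Python rather than Fitter's C++, fits a straight line this way:

```python
def chi2_line_fit(xs, ys, sigmas):
    """Weighted least-squares fit of y = a + b*x minimizing
    chi^2 = sum((y_i - a - b*x_i)**2 / sigma_i**2)."""
    w = [1.0 / (s * s) for s in sigmas]
    S = sum(w)
    Sx = sum(wi * x for wi, x in zip(w, xs))
    Sy = sum(wi * y for wi, y in zip(w, ys))
    Sxx = sum(wi * x * x for wi, x in zip(w, xs))
    Sxy = sum(wi * x * y for wi, x, y in zip(w, xs, ys))
    delta = S * Sxx - Sx * Sx
    a = (Sxx * Sy - Sx * Sxy) / delta
    b = (S * Sxy - Sx * Sy) / delta
    chi2 = sum(wi * (y - a - b * x) ** 2 for wi, x, y in zip(w, xs, ys))
    return a, b, chi2

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]          # exactly y = 1 + 2x
a, b, chi2 = chi2_line_fit(xs, ys, sigmas=[0.1] * 4)
```

Nonlinear models (e.g. Gaussians) require iterative minimization of the same chi-square objective, and robust variants replace the squared residual with a loss less sensitive to outliers.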
Design and implementation of atmospheric multi-parameter sensor for UAVs
Yu, F.; Zhao, Y.; Chen, G.; Liu, Y.; Han, Y.
2017-12-01
With the rapid development of industry and the increase of cars in developing countries, air pollutants have caused a series of environmental issues such as haze and smog. Air pollution is a process of surface-to-air mass exchange, and various atmospheric factors, such as temperature and humidity, are closely associated with aerosol concentration. Vertical distributions of aerosol in a region provide an important clue to reveal the exchange mechanism between the atmospheric boundary layer and the troposphere. Among the various kinds of flying platforms, unmanned aerial vehicles (UAVs) show advantages in vertical measurement of aerosol owing to their flexibility and low cost. However, only a few sensors can be mounted on UAVs because of the limited size and power available. Here, a light-weight, low-power atmospheric multi-parameter sensor (AMPS) is proposed that can be mounted on several kinds of UAV platforms. The AMPS integrates multiple sensors, namely a laser aerosol particle sensor and temperature, humidity and pressure probes, in order to simultaneously sample the vertical distribution characteristics of aerosol particle concentration, temperature, relative humidity and atmospheric pressure. The data from the sensors are synchronized by a proposed communication mechanism based on GPS. Several kinds of housing are designed to accommodate the different payload requirements of UAVs in size and weight. Experiments were carried out with the AMPS mounted on three kinds of flying platforms. The results show that the power consumption is less than 1.3 W, with relatively high accuracy in temperature (±0.1°C), relative humidity (±0.8%RH), PM2.5 (<20%) and PM10 (<20%). Vertical profiles of PM2.5 and PM10 concentrations were observed with the AMPS three times per day over five days. The results revealed the significant correlation between the aerosol particle concentration and atmospheric
Zhang, Yu; Yang, Wei; Han, Dongsheng; Kim, Young-Il
2014-01-01
Environment monitoring is important for the safety of underground coal mine production, and it is also an important application of Wireless Sensor Networks (WSNs). We put forward an integrated environment monitoring system for underground coal mine, which uses the existing Cable Monitoring System (CMS) as the main body and the WSN with multi-parameter monitoring as the supplementary technique. As CMS techniques are mature, this paper mainly focuses on the WSN and the interconnection between t...
Bai, Wei; Yang, Minghong; Hu, Chenyuan; Dai, Jixiang; Zhong, Xuexiang; Huang, Shuai; Wang, Gaopeng
2017-01-01
A multi-parameter measurement system based on an ultra-weak fiber Bragg grating (UFBG) array with sensitive material was proposed and experimentally demonstrated. The UFBG array interrogation principle is time-division multiplexing, with two semiconductor optical amplifiers as timing units. Experimental results showed that the performance of the proposed UFBG system is almost equal to that of traditional FBG, while the UFBG array system has obvious superiority with potential multiplexing ...
Pappas, D.; Jeevarajan, A.; Anderson, M. M.
2004-01-01
Compact and automated sensors are desired for assessing the health of cell cultures in biotechnology experiments in microgravity. Measurement of cell culture medium allows for the optimization of culture conditions on orbit to maximize cell growth and minimize unnecessary exchange of medium. While several discrete sensors exist to measure culture health, a multi-parameter sensor would simplify the experimental apparatus. One such sensor, the Paratrend 7, consists of three optical fibers for measuring pH, dissolved oxygen (pO2) and dissolved carbon dioxide (pCO2), and a thermocouple to measure temperature. The sensor bundle was designed for intra-arterial placement in clinical patients, and potentially can be used in the bioreactors of NASA's Space Shuttle and International Space Station biotechnology programs. Methods: A Paratrend 7 sensor was placed at the outlet of a rotating-wall perfused vessel bioreactor system inoculated with BHK-21 (baby hamster kidney) cells. Cell culture medium (GTSF-2, composed of 40% minimum essential medium, 60% L-15 Leibovitz medium) was manually measured using a bench-top blood gas analyzer (BGA, Ciba-Corning). Results: A Paratrend 7 sensor was used over a long-term (>120 day) cell culture experiment. The sensor was able to track changes in cell medium pH, pO2, and pCO2 due to the consumption of nutrients by the BHK-21 cells. When compared to manually obtained BGA measurements, the sensor had good agreement for pH, pO2, and pCO2, with bias [and precision] of 0.02 [0.15], 1 mm Hg [18 mm Hg], and -4.0 mm Hg [8.0 mm Hg], respectively. The Paratrend oxygen sensor was recalibrated (offset) periodically due to drift. The bias for the raw (no offset or recalibration) oxygen measurements was 42 mm Hg [38 mm Hg]. The measured response (rise) time of the sensor was 20 +/- 4 s for pH, 81 +/- 53 s for pCO2, and 51 +/- 20 s for pO2. For long-term cell culture measurements, these response times are more than adequate. Based on these findings, the Paratrend sensor could
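The bias [precision] figures quoted above are, in the usual method-comparison sense, the mean and standard deviation of the paired sensor-minus-reference differences. A minimal sketch of that calculation, using invented pH readings rather than the study's data:

```python
# Bias/precision of a sensor vs. a reference analyzer:
# bias = mean(sensor - reference); precision = SD of the differences.
import statistics

def bias_precision(sensor, reference):
    """Return (bias, precision) for paired measurements."""
    diffs = [s - r for s, r in zip(sensor, reference)]
    return statistics.mean(diffs), statistics.stdev(diffs)

# Synthetic paired pH readings (illustrative, not the study's data).
paratrend = [7.10, 7.22, 7.05, 7.18, 7.30]
bga       = [7.08, 7.20, 7.06, 7.15, 7.28]
b, p = bias_precision(paratrend, bga)
print(f"bias = {b:+.3f}, precision = {p:.3f}")
```

With real data, plotting the differences against the means (a Bland-Altman plot) would also reveal any concentration-dependent drift of the kind the authors corrected by periodic recalibration.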
Multi-parameter approach to evaluate the timing of memory status after 17DD-YF primary vaccination.
Costa-Pereira, Christiane; Campi-Azevedo, Ana Carolina; Coelho-Dos-Reis, Jordana Grazziela; Peruhype-Magalhães, Vanessa; Araújo, Márcio Sobreira Silva; do Vale Antonelli, Lis Ribeiro; Fonseca, Cristina Toscano; Lemos, Jandira Aparecida; Malaquias, Luiz Cosme Cote; de Souza Gomes, Matheus; Rodrigues Amaral, Laurence; Rios, Maria; Chancey, Caren; Persi, Harold Richard; Pereira, Jorge Marcelo; de Sousa Maia, Maria de Lourdes; Freire, Marcos da Silva; Martins, Reinaldo de Menezes; Homma, Akira; Simões, Marisol; Yamamura, Anna Yoshida; Farias, Roberto Henrique Guedes; Romano, Alessandro Pecego Martins; Domingues, Carla Magda; Tauil, Pedro Luiz; Vasconcelos, Pedro Fernando Costa; Caldas, Iramaya Rodrigues; Camacho, Luiz Antônio; Teixeira-Carvalho, Andrea; Martins-Filho, Olindo Assis
2018-06-01
In this investigation, machine-enhanced techniques were applied to bring about scientific insights to identify a minimum set of phenotypic/functional memory-related biomarkers for post-vaccination follow-up upon yellow fever (YF) vaccination. For this purpose, memory status of circulating T-cells (Naïve/early-effector/Central-Memory/Effector-Memory) and B-cells (Naïve/non-Classical-Memory/Classical-Memory) along with the cytokine profile (IFN/TNF/IL-5/IL-10) were monitored before-NV(day0) and at distinct time-points after 17DD-YF primary vaccination-PV(day30-45); PV(year1-9) and PV(year10-11). A set of biomarkers (eEfCD4; EMCD4; CMCD19; EMCD8; IFNCD4; IL-5CD8; TNFCD4; IFNCD8; TNFCD8; IL-5CD19; IL-5CD4) were observed in PV(day30-45), but not in NV(day0), with most of them still observed in PV(year1-9). Deficiencies of phenotypic/functional biomarkers were observed in NV(day0), while total lack of memory-related attributes was observed in PV(year10-11), regardless of the age at primary vaccination. Venn-diagram analysis pre-selected 10 attributes (eEfCD4, EMCD4, CMCD19, EMCD8, IFNCD4, IL-5CD8, TNFCD4, IFNCD8, TNFCD8 and IL-5CD4), of which the overall mean presented moderate accuracy to discriminate PV(day30-45)&PV(year1-9) from NV(day0)&PV(year10-11). Multi-parameter approaches and decision-tree algorithms defined the EMCD8 and IL-5CD4 attributes as the top-two predictors with moderate performance. Together with the PRNT titers, the top-two biomarkers led to a resultant memory status observed in 80% and 51% of volunteers in PV(day30-45) and PV(year1-9), contrasting with 0% and 29% found in NV(day0) and PV(year10-11), respectively. The deficiency of memory-related attributes observed at PV(year10-11) underscores the conspicuous time-dependent decrease of resultant memory following 17DD-YF primary vaccination that could be useful to monitor potential correlates of protection in areas under risk of YF transmission.
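A fitted decision tree on the top-two attributes ultimately reduces to simple threshold rules. A toy sketch of such a depth-two rule is shown below; the cutoff values are purely hypothetical placeholders, not the published model's thresholds:

```python
# Toy depth-2 decision rule on the top-two biomarkers (EMCD8, IL-5CD4).
# Cutoffs are illustrative assumptions, not values from the study.
def memory_status(em_cd8: float, il5_cd4: float,
                  cut_em: float = 0.5, cut_il5: float = 0.3) -> str:
    """Classify as 'memory' if either biomarker exceeds its (invented) cutoff."""
    if em_cd8 >= cut_em:
        return "memory"
    return "memory" if il5_cd4 >= cut_il5 else "no-memory"

print(memory_status(0.7, 0.1))  # exceeds the EMCD8 cutoff
print(memory_status(0.2, 0.5))  # exceeds the IL-5CD4 cutoff
print(memory_status(0.2, 0.1))
```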
International Nuclear Information System (INIS)
Heys, D.W.; Stump, D.R.
1987-01-01
Variational calculations are described that use multi-parameter trial wave functions for the U(1) lattice gauge theory in two space dimensions, and for the XY model. The trial functions are constructed as the exponential of a linear combination of states from the strong-coupling basis of the model, with the coefficients treated as variational parameters. The expectation of the Hamiltonian is computed by the Monte Carlo method, using a reweighting technique to evaluate expectation values in finite patches of the parameter space. The trial function for the U(1) gauge theory involves six variational parameters, and its weak-coupling behaviour is in reasonable agreement with theoretical expectations. (orig.)
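The reweighting technique mentioned above reuses configurations sampled at one point in parameter space to estimate expectation values in a finite patch around it, via importance weights w = |Ψ_θ|²/|Ψ_θ₀|². A toy one-dimensional sketch of the idea, with a Gaussian trial function rather than the paper's U(1) lattice wave function:

```python
# Reweighting sketch: samples drawn from |psi_0|^2 are reused to estimate
# <x^2> at nearby variational parameters a, without resampling.
import math
import random

random.seed(0)

def psi2(x, a):
    """|psi|^2 for the trial function exp(-a x^2 / 2), unnormalised."""
    return math.exp(-a * x * x)

a0 = 1.0  # parameters used for sampling; |psi_0|^2 is N(0, 1/(2 a0))
xs = [random.gauss(0.0, 1.0 / math.sqrt(2 * a0)) for _ in range(20000)]

def reweighted_energy(a):
    """Estimate <x^2> at parameter a from samples drawn at a0."""
    ws = [psi2(x, a) / psi2(x, a0) for x in xs]
    return sum(w * x * x for w, x in zip(ws, xs)) / sum(ws)

# Exact value for comparison: <x^2> = 1/(2a) for this trial function.
print(reweighted_energy(1.2), 1.0 / (2 * 1.2))
```

The weights degrade as a moves away from a0, which is why the paper restricts reweighting to finite patches of parameter space.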
Nilsson, Ingemar; Polla, Magnus O
2012-10-01
Drug design is a multi-parameter task present in the analysis of experimental data for synthesized compounds and in the prediction of new compounds with desired properties. This article describes the implementation of a binned scoring and composite ranking scheme for 11 experimental parameters that were identified as key drivers in the MC4R project. The composite ranking scheme was implemented in an AstraZeneca tool for analysis of project data, thereby providing an immediate re-ranking as new experimental data was added. The automated ranking also highlighted compounds overlooked by the project team. The successful implementation of a composite ranking on experimental data led to the development of an equivalent virtual score, which was based on Free-Wilson models of the parameters from the experimental ranking. The individual Free-Wilson models showed good to high predictive power with a correlation coefficient between 0.45 and 0.97 based on the external test set. The virtual ranking adds value to the selection of compounds for synthesis but error propagation must be controlled. The experimental ranking approach adds significant value, is parameter independent and can be tuned and applied to any drug discovery project.
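A binned scoring and composite ranking scheme of the kind described can be sketched as follows; the parameter names, bin edges, weights and example compounds are invented for illustration, not the MC4R project's actual drivers:

```python
# Binned scoring: each measured value maps to an integer score by bin;
# the composite rank sorts compounds by the sum of their scores.
def bin_score(value, edges):
    """0 for the worst bin, len(edges) for the best."""
    return sum(value >= e for e in edges)

# Hypothetical parameters (higher is better for both in this toy).
EDGES = {"pIC50": [6.0, 7.0, 8.0], "logS": [-6.0, -5.0, -4.0]}

def composite_rank(compounds):
    """Rank compound names by descending total binned score."""
    scored = [(sum(bin_score(v, EDGES[p]) for p, v in props.items()), name)
              for name, props in compounds.items()]
    return [name for score, name in sorted(scored, reverse=True)]

cmpds = {"A": {"pIC50": 8.2, "logS": -4.5},
         "B": {"pIC50": 6.5, "logS": -6.5},
         "C": {"pIC50": 7.1, "logS": -5.2}}
print(composite_rank(cmpds))  # -> ['A', 'C', 'B']
```

In the article's setting, the same machinery runs over 11 experimental parameters, and re-ranking happens automatically as new assay data arrive.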
Ozel, Oguz; Guralp, Cansun; Tunc, Suleyman; Yalcinkaya, Esref
2016-04-01
The main objective of this study is to install a multi-parameter borehole system and surface array as close to the main Marmara Fault (MMF) in the western Marmara Sea as possible, to continuously measure the evolution of the state of the fault zone surrounding the MMF, and to detect any anomaly or change that may occur before earthquakes by making use of the data from the arrays already running in the eastern part of the Marmara Sea. The multi-parameter borehole system is composed of a very wide dynamic range and stable borehole (VBB) broadband seismic sensor, and incorporates a strain meter, tilt meter, and temperature and local hydrostatic pressure measuring devices. The borehole seismic station uses the latest technologies and design ideas to record signals from "Earth tides" down to the smallest magnitude -3 events. Additionally, a surface microearthquake observation array, consisting of 8-10 seismometers around the borehole, is established to obtain continuous high-resolution locations of micro-seismicity and to better understand the existing seismically active structures and their roles in local tectonic settings. Comparing the seismograms of microearthquakes recorded by borehole and surface instruments reveals quite different content. The shorter recording duration and nearly flat frequency spectrum up to the Nyquist frequency of borehole records contrast with the longer recording duration and rapid decay of spectral amplitudes at higher frequencies of a surface seismogram. The main cause of the observed differences is near-surface geology effects, which mask most of the source-related information the seismograms contain and give rise to scattering, generating longer-duration seismograms. In view of these circumstances, studies of microearthquakes employing surface seismograms may yield misleading results. In particular, work on earthquake physics and the nucleation process of earthquakes requires elaborate analysis of tiny events. It is
Lin, Hong; Wang, Xinming; Liang, Kun
2010-10-01
For real-time monitoring and forecasting of ocean red tides, a marine environment monitoring technology based on a double-wavelength airborne lidar system is proposed. An airborne lidar is far more efficient than traditional ship-based measurement. At the same time, this technology can detect multiple parameters of an ocean red tide by using the double-wavelength lidar. It can not only use the infrared laser to detect the scattering signal under the water and obtain information about the red tide's density and size, but also use the blue-green laser to detect the Brillouin scattering signal and deduce the temperature and salinity of the seawater. The red tide density detection model is first established by introducing the concept of the red tide scattering coefficient based on Mie scattering theory. From Brillouin scattering theory, the relationship of the blue-green laser's Brillouin scattering frequency shift and power with the seawater temperature and salinity is found. Then, the detection model of the seawater temperature and salinity can be established. The value of the red tide infrared scattering signal is evaluated by simulation, and therefore the density of the red tide particles can be known. At the same time, the blue-green laser's Brillouin scattering frequency shift and power are evaluated by simulation, and the temperature and salinity of the seawater can be known. Based on these multiple parameters, the growth of an ocean red tide can be monitored and forecasted.
Optical fibre multi-parameter sensing with secure cloud based signal capture and processing
Newe, Thomas; O'Connell, Eoin; Meere, Damien; Yuan, Hongwei; Leen, Gabriel; O'Keeffe, Sinead; Lewis, Elfed
2016-05-01
Recent advancements in cloud computing technologies in the context of optical and optical fibre based systems are reported. The proliferation of real-time and multi-channel sensor systems represents significant growth in data volume. This, coupled with a growing need for security, presents many challenges but also a huge opportunity for an evolutionary step in the widespread application of these sensing technologies. A tiered infrastructural system approach is adopted that is designed to facilitate the delivery of optical fibre-based "SENsing as a Service" (SENaaS). Within this infrastructure, novel optical sensing platforms, deployed within different environments, are interfaced with a cloud-based backbone infrastructure which facilitates the secure collection, storage and analysis of real-time data. Feedback systems, which harness this data to effect a change within the monitored location/environment/condition, are also discussed. The cloud-based system presented here can also be used with chemical and physical sensors that require real-time data analysis, processing and feedback.
Directory of Open Access Journals (Sweden)
Guozhen Hu
2017-12-01
Full Text Available A loosely coupled inductive power transfer (IPT) system for industrial track applications is researched in this paper. The IPT converter, using a primary inductor-capacitor-inductor (LCL) network and secondary parallel compensation, is analyzed in combination with coil design for optimal operating efficiency. An accurate mathematical analytical model and expressions for self-inductance and mutual inductance are proposed to obtain the coil parameters. Furthermore, the optimization process is performed by combining the proposed resonant compensations and coil parameters. The results are evaluated and discussed using finite element analysis (FEA). Finally, an experimental prototype is constructed to verify the proposed approach, and the experimental results show that the optimization can be applied effectively to an industrial track distributed IPT system.
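As a small numerical aside on the resonant compensation mentioned above: each branch of an LCL-compensated network is commonly tuned so that its resonant frequency satisfies f0 = 1/(2π√(LC)). A quick check with assumed component values (not the paper's prototype parameters):

```python
# Resonant frequency of an L-C branch: f0 = 1 / (2*pi*sqrt(L*C)).
import math

def resonant_frequency_hz(L_henry: float, C_farad: float) -> float:
    """Return the resonant frequency in Hz for the given L and C."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L_henry * C_farad))

# Illustrative values: 120 uH with 220 nF gives roughly 31 kHz.
f0 = resonant_frequency_hz(120e-6, 220e-9)
print(f"f0 = {f0 / 1e3:.1f} kHz")
```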
Multi-Parameter Observation and Detection of Pre-Earthquake Signals in Seismically Active Areas
Ouzounov, D.; Pulinets, S.; Parrot, M.; Liu, J. Y.; Hattori, K.; Kafatos, M.; Taylor, P.
2012-01-01
The recent large earthquakes (M9.0 Tohoku, 03/2011; M7.0 Haiti, 01/2010; M6.3 L'Aquila, 04/2009; and M7.9 Wenchuan, 05/2008) have renewed interest in pre-anomalous seismic signals associated with them. Recent workshops (DEMETER 2006, 2011 and VESTO 2009) have shown that there were precursory atmospheric/ionospheric signals observed in space prior to these events. Our initial results indicate that no single pre-earthquake observation (seismic, magnetic field, electric field, thermal infrared [TIR], or GPS/TEC) can provide a consistent and successful global-scale early warning. This is most likely due to the complexity and chaotic nature of earthquakes and the limitations in existing ground (temporal/spatial) and global satellite observations. In this study we analyze preseismic temporal and spatial variations (gas/radon counting rate, atmospheric temperature and humidity change, long-wave radiation transitions and ionospheric electron density/plasma variations) which we propose occur before the onset of major earthquakes. We propose an Integrated Space-Terrestrial Framework (ISTF) as a different approach for revealing pre-earthquake phenomena in seismically active areas. ISTF is a sensor web of a coordinated observation infrastructure employing multiple sensors that are distributed on one or more platforms; data from satellite sensors (Terra, Aqua, POES, DEMETER and others) and ground observations, e.g., Global Positioning System, Total Electron Content (GPS/TEC). As a theoretical guide we use the Lithosphere-Atmosphere-Ionosphere Coupling (LAIC) model to explain the generation of multiple earthquake precursors. Using our methodology, we evaluated retrospectively the signals preceding the most devastating earthquakes during 2005-2011. We observed a correlation between both atmospheric and ionospheric anomalies preceding most of these earthquakes. The second phase of our validation includes systematic retrospective analysis for more than 100 major earthquakes (M>5
Development of a multi platform and multi parameter data acquisition interface
International Nuclear Information System (INIS)
Lapolli, Andre L.; Zahn, Guilherme S.
2013-01-01
The process of nuclear data acquisition is evolving continuously. Today, data can be digitized from the pre-amplifier onward or processed electronically up to digitization. Besides, some labs have more than one spectrometer and different data acquisition systems. Depending on the form of the data acquisition, the researcher will have access to the results only after this process. In some cases, to follow up the process of data acquisition, the operator needs specific knowledge of the different data types used by each system. Consequently, the researcher has to gather many skills in areas other than his own, interfering in the analysis process and possibly reducing the efficacy of the research. This work consists in the development of an interface for simultaneous data acquisition of high flexibility and ease of use, which can be programmed by an untrained operator. It is a multi-platform interface and it can perform data acquisition in real time. This system therefore has two major tasks. The first is the human-machine interface, using the keyboard, mouse, touch screen or remote access (via the internet), with the configuration defined by the user according to the equipment. The second task is the connection with the acquisition system or other peripherals, so that it is possible to link one or more systems with different data or acquisition processes. The system has been developed for different operating systems using the Object-Oriented Programming (OOP) concept, enabling integration of new interfaces and acquisition systems without the need for user training. The system is under development and being deployed in the Laboratorio de Interacao Hiperfinas (LIH) at IPEN, with two different acquisition systems, one using an Ortec® MCA (Model 920-16) and the other a Canberra® ADC (Model 8715) with a National Instruments® (Model 6251) interface, besides several modules of temperature control. (author)
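The OOP design described above can be sketched as a common abstract interface that each acquisition back end implements, so new hardware can be integrated without retraining users. Class and method names here are illustrative assumptions, not the actual LIH system's API:

```python
# Sketch of an abstract acquisition interface: the UI code depends only on
# the abstract type, so MCA/ADC back ends are interchangeable.
from abc import ABC, abstractmethod

class AcquisitionSystem(ABC):
    """Contract every back end (MCA, ADC, ...) must fulfil."""
    @abstractmethod
    def start(self) -> None: ...
    @abstractmethod
    def read_spectrum(self) -> list[int]: ...

class SimulatedMCA(AcquisitionSystem):
    """Stand-in back end returning a fixed spectrum, useful for UI testing."""
    def start(self) -> None:
        self.running = True
    def read_spectrum(self) -> list[int]:
        return [0, 3, 12, 7, 1]

def acquire(system: AcquisitionSystem) -> list[int]:
    # Generic driver code: works with any registered back end.
    system.start()
    return system.read_spectrum()

print(acquire(SimulatedMCA()))
```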
Zhang, Yu; Yang, Wei; Han, Dongsheng; Kim, Young-Il
2014-07-21
Environment monitoring is important for the safety of underground coal mine production, and it is also an important application of Wireless Sensor Networks (WSNs). We put forward an integrated environment monitoring system for underground coal mine, which uses the existing Cable Monitoring System (CMS) as the main body and the WSN with multi-parameter monitoring as the supplementary technique. As CMS techniques are mature, this paper mainly focuses on the WSN and the interconnection between the WSN and the CMS. In order to implement the WSN for underground coal mines, two work modes are designed: periodic inspection and interrupt service; the relevant supporting technologies, such as routing mechanism, collision avoidance, data aggregation, interconnection with the CMS, etc., are proposed and analyzed. As WSN nodes are limited in energy supply, calculation and processing power, an integrated network management scheme is designed in four aspects, i.e., topology management, location management, energy management and fault management. Experiments were carried out both in a laboratory and in a real underground coal mine. The test results indicate that the proposed integrated environment monitoring system for underground coal mines is feasible and all designs performed well as expected.
International Nuclear Information System (INIS)
Hanewinkel, H.
1984-01-01
The construction of a new structured data acquisition system at the Cologne tandem accelerator is intended to contribute to the further development of experimental methods in nuclear physics. To this end, fast procedures for real-time processing and on-line data reduction of multi-parameter events were developed and applied. In connection with the whole analyzer system, these procedures allowed effective processing of the occurring multi-parameter data. This work created at the Cologne accelerator an important precondition for the experimental application of the high-resolution γγ anti-Compton coincidence spectrometer OSIRIS, which is constructed in collaboration with groups in Berlin, Bonn and Jülich. The resulting analyzer system, however, also offers all other users an extension of their experimental possibilities. The requirements and structure of the analyzer system, as well as the procedures developed and applied for it, are described and compared with other procedures proposed in the literature. (orig./HSI) [de
Groschen, George E.; King, Robin B.
2005-01-01
Eight streams, representing a wide range of environmental and water-quality conditions across Illinois, were monitored from July 2001 to October 2003 for five water-quality parameters as part of a pilot study by the U.S. Geological Survey (USGS) in cooperation with the Illinois Environmental Protection Agency (IEPA). Continuous recording multi-parameter water-quality monitors were installed to collect data on water temperature, dissolved-oxygen concentrations, specific conductivity, pH, and turbidity. The monitors were near USGS streamflow-gaging stations where stage and streamflow are continuously recorded. During the study period, the data collected for these five parameters generally met the data-quality objectives established by the USGS and IEPA at all eight stations. A similar pilot study during this period for measurement of chlorophyll concentrations failed to achieve the data-quality objectives. Of all the sensors used, the temperature sensors provided the most accurate and reliable measurements (generally within ±5 percent of a calibrated thermometer reading). Signal adjustments and calibration of all other sensors are dependent upon an accurate and precise temperature measurement. The dissolved-oxygen sensors were the next most reliable during the study and were responsive to changing conditions and accurate at all eight stations. Specific conductivity was the third most accurate and reliable measurement collected from the multi-parameter monitors. Specific conductivity at the eight stations varied widely, from less than 40 microsiemens (µS) at Rayse Creek near Waltonville to greater than 3,500 µS at Salt Creek at Western Springs. In individual streams, specific conductivity often changed quickly (greater than 25 percent in less than 3 hours) and the sensors generally provided good to excellent record of these variations at all stations. The widest range of specific-conductivity measurements was in Salt Creek at Western Springs in the Greater Chicago
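The "greater than 25 percent in less than 3 hours" criterion for rapid conductivity change can be sketched as a rolling comparison over an hourly series; the values below are invented, not the Illinois record:

```python
# Flag sample indices where specific conductivity changed by more than
# `threshold` (as a fraction) within the previous `window` samples.
def rapid_changes(values, window=3, threshold=0.25):
    hits = []
    for i in range(window, len(values)):
        base = values[i - window]
        if base and abs(values[i] - base) / base > threshold:
            hits.append(i)
    return hits

# Synthetic hourly conductivity (uS): storm dilution, then recovery.
cond_uS = [900, 910, 905, 650, 640, 920, 930]
print(rapid_changes(cond_uS))  # -> [3, 4, 6]
```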
McDonald, Scott A; Mohamed, Rosmawati; Dahlui, Maznah; Naning, Herlianna; Kamarulzaman, Adeeba
2014-11-07
Collecting adequate information on key epidemiological indicators is a prerequisite to informing a public health response to reduce the impact of hepatitis C virus (HCV) infection in Malaysia. Our goal was to overcome the acute data shortage typical of low/middle income countries using statistical modelling to estimate the national HCV prevalence and the distribution over transmission pathways as of the end of 2009. Multi-parameter evidence synthesis methods were applied to combine all available relevant data sources - both direct and indirect - that inform the epidemiological parameters of interest. An estimated 454,000 (95% credible interval [CrI]: 392,000 to 535,000) HCV antibody-positive individuals were living in Malaysia in 2009; this represents 2.5% (95% CrI: 2.2-3.0%) of the population aged 15-64 years. Among males of Malay ethnicity, for 77% (95% CrI: 69-85%) the route of probable transmission was active or a previous history of injecting drugs. The corresponding proportions were smaller for male Chinese and Indian/other ethnic groups (40% and 71%, respectively). The estimated prevalence in females of all ethnicities was 1% (95% CrI: 0.6 to 1.4%); 92% (95% CrI: 88 to 95%) of infections were attributable to non-drug injecting routes of transmission. The prevalent number of persons living with HCV infection in Malaysia is estimated to be very high. Low/middle income countries often lack a comprehensive evidence base; however, evidence synthesis methods can assist in filling the data gaps required for the development of effective policy to address the future public health and economic burden due to HCV.
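The credible intervals quoted above come from a full multi-parameter evidence synthesis combining many data sources. As a much-reduced illustration of the underlying Bayesian machinery, a single binomial survey (invented counts) with a uniform prior yields a conjugate Beta posterior whose quantiles form the credible interval:

```python
# Posterior for a binomial proportion under a Beta(1,1) prior:
# posterior is Beta(x+1, n-x+1); the 95% CrI is taken from Monte Carlo
# draws of that posterior (stdlib only, no scipy needed).
import random

random.seed(42)

def prevalence_cri(x, n, draws=100_000, level=0.95):
    """Return (posterior mean, CrI lower, CrI upper) for x positives of n."""
    samples = sorted(random.betavariate(x + 1, n - x + 1) for _ in range(draws))
    lo = samples[int(draws * (1 - level) / 2)]
    hi = samples[int(draws * (1 + level) / 2)]
    return sum(samples) / draws, lo, hi

mean, lo, hi = prevalence_cri(x=50, n=2000)   # invented survey counts
print(f"prevalence ~ {mean:.3f} (95% CrI {lo:.3f}-{hi:.3f})")
```

A real evidence synthesis, as in the study, links several such likelihoods (direct and indirect) to shared parameters and samples the joint posterior, typically by MCMC.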
Faye, Grégory; Rankin, James; Chossat, Pascal
2013-05-01
The existence of spatially localized solutions in neural networks is an important topic in neuroscience as these solutions are considered to characterize working (short-term) memory. We work with an unbounded neural network represented by the neural field equation with smooth firing rate function and a wizard hat spatial connectivity. Noting that stationary solutions of our neural field equation are equivalent to homoclinic orbits in a related fourth order ordinary differential equation, we apply normal form theory for a reversible Hopf bifurcation to prove the existence of localized solutions; further, we present results concerning their stability. Numerical continuation is used to compute branches of localized solution that exhibit snaking-type behaviour. We describe in terms of three parameters the exact regions for which localized solutions persist.
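For reference, a standard (Amari-type) neural field equation of the kind the abstract refers to is, with connectivity kernel w and firing-rate function f (the paper's exact scaling and parameters may differ):

```latex
\frac{\partial u(x,t)}{\partial t} \;=\; -\,u(x,t)
  \;+\; \int_{-\infty}^{\infty} w(x-y)\, f\bigl(u(y,t)\bigr)\, \mathrm{d}y .
```

Setting the time derivative to zero gives the stationary problem u(x) = ∫ w(x−y) f(u(y)) dy, whose solutions the authors identify with homoclinic orbits of a related fourth-order ordinary differential equation.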
Milosevic, Igor; Naunovic, Zorana
2013-10-01
This article presents a process of evaluation and selection of the most favourable location for a sanitary landfill facility from three alternative locations, by applying a multi-criteria decision-making (MCDM) method. An incorrect choice of location for a landfill facility can have a significant negative economic and environmental impact, such as the pollution of air, ground and surface waters. The aim of this article is to present several improvements in the practical process of landfill site selection using the VIKOR MCDM compromise ranking method integrated with a fuzzy analytic hierarchy process approach for determining the evaluation criteria weighting coefficients. The VIKOR method focuses on ranking and selecting from a set of alternatives in the presence of conflicting and non-commensurable (different units) criteria, and on proposing a compromise solution that is closest to the ideal solution. The work shows that valuable site ranking lists can be obtained using the VIKOR method, which is a suitable choice when there is a large number of relevant input parameters.
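The core VIKOR ranking can be sketched compactly: per-criterion regret terms are weighted and combined into the group utility S, the individual regret R, and the compromise index Q, and alternatives are ranked by ascending Q. The weights, scores and benefit-only criteria below are illustrative, not the landfill study's inputs:

```python
# Minimal VIKOR sketch over benefit-type criteria (higher raw score = better).
def vikor(matrix, weights, v=0.5):
    """Rank alternatives (rows) by the VIKOR index Q; lower Q is better."""
    m, n = len(matrix), len(matrix[0])
    best  = [max(row[j] for row in matrix) for j in range(n)]
    worst = [min(row[j] for row in matrix) for j in range(n)]
    S, R = [], []
    for row in matrix:
        terms = [weights[j] * (best[j] - row[j]) / (best[j] - worst[j])
                 for j in range(n)]
        S.append(sum(terms))   # group utility
        R.append(max(terms))   # individual regret
    Smin, Smax, Rmin, Rmax = min(S), max(S), min(R), max(R)
    Q = [v * (S[i] - Smin) / (Smax - Smin)
         + (1 - v) * (R[i] - Rmin) / (Rmax - Rmin) for i in range(m)]
    return sorted(range(m), key=lambda i: Q[i])

# Three candidate sites scored on three (invented) benefit criteria.
sites = [[7, 5, 9], [6, 8, 6], [9, 6, 5]]
print(vikor(sites, weights=[0.5, 0.3, 0.2]))  # -> [2, 0, 1]
```

In the article's approach, the weights themselves come from a fuzzy analytic hierarchy process rather than being set by hand, and cost-type criteria would be handled by swapping best and worst.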
Besson, P; Lalanne, F X; Wang, Y; Guyot, F
1999-11-01
An original multi-parameter system has been used to study the nature of dust in the ambient air, particularly the total fibers and asbestos fibers, in eight areas of the Institut de Physique du Globe de Paris (France). These analyses provide a detailed case study of environmental pollution by asbestos fibers at low levels. The levels of total fibers with a length greater than 3 microns, measured with a real-time fiber analyser monitor (FAM), give a baseline of 2.5 fibers per l. throughout the duration of sampling. The same levels, calculated during periods of effective presence of staff, are smaller than 10 fb per l. During these periods, the instantaneous value can show high peaks, reaching a maximum of 60 fb per l., but more often of about 5 to 10 fb per l. A direct cause and effect relationship exists between fiber concentrations and the presence of people, and indirectly with the variation of the other environmental parameters (temperature, humidity, air velocity). The baseline concentration of asbestos fibers, determined by analytical transmission electron microscopy (ATEM), is about 10(-1) fb per l., with a mean value during the presence of people always less than 1.5 fb per l. The low levels of asbestos fibers do not allow us to establish a precise correlation between the concentration of total fibers and the asbestos concentration, but a rough estimate suggests that asbestos could represent 10-20% of the airborne fibers monitored with the FAM. The statistical study of fiber sizes shows that 70 and 55% of analyzed chrysotile and amosite fibers respectively are smaller than 5 microns. These numbers are 40 and 35% for fibers smaller than 3 microns, which are undetected by the FAM. Amosite, which characterizes most of the asbestos-containing materials (ACM) in the analyzed areas, is detected in the ambient air in quantities ten times smaller than chrysotile. The low asbestos levels and the difference between the nature of building asbestos and airborne
Czech Academy of Sciences Publication Activity Database
Veselý, V.; Sobek, J.; Frantík, P.; Seitl, Stanislav
2016-01-01
Roč. 89, AUG (2016), s. 20-35 ISSN 0142-1123. [International Conference on Characterisation of Crack Tip Fields /3./. Urbino, 20.04.2015-22.04.2015] Institutional support: RVO:68081723 Keywords : Crack-tip fields * Williams power series * Higher order terms * Stress field reconstruction * Multi-parameter approximation accuracy Subject RIV: JL - Materials Fatigue, Friction Mechanics Impact factor: 2.899, year: 2016
Czech Academy of Sciences Publication Activity Database
Veselý, V.; Sopek, J.; Tesař, D.; Frantík, P.; Pail, T.; Seitl, Stanislav
2015-01-01
Roč. 9, č. 33 (2015), s. 120-133 ISSN 1971-8993 Institutional support: RVO:68081723 Keywords : Cracked specimen * Near-crack-tip fields * Williams expansion * Higher order terms * Stress field reconstruction * Finite element analysis * Java application Subject RIV: JL - Materials Fatigue, Friction Mechanics
Energy Technology Data Exchange (ETDEWEB)
Gilchrist, Kristin H., E-mail: kgilchrist@rti.org; Lewis, Gregory F.; Gay, Elaine A.; Sellgren, Katelyn L.; Grego, Sonia
2015-10-15
Microelectrode arrays (MEAs) recording extracellular field potentials of human induced pluripotent stem cell-derived cardiomyocytes (hiPS-CMs) provide a rich data set for functional assessment of drug response. The aim of this work is the development of a method for systematic analysis of arrhythmia using MEAs, with emphasis on the development of six parameters accounting for different types of cardiomyocyte signal irregularity. We describe a software approach to carry out such analysis automatically, including generation of a heat map that enables quick visualization of the arrhythmic liability of compounds. We also implemented signal processing techniques for reliable extraction of the repolarization peak for field potential duration (FPD) measurement, even from recordings with low signal-to-noise ratios. We measured hiPS-CMs on a 48-well MEA system with 5-minute recordings at multiple time points (0.5, 1, 2 and 4 h) after drug exposure. We evaluated concentration responses for seven compounds with a combination of hERG, QT and clinical proarrhythmia properties: Verapamil, Ranolazine, Flecainide, Amiodarone, Ouabain, Cisapride, and Terfenadine. The predictive utility of MEA parameters as surrogates of these clinical effects was examined. The beat rate and FPD results exhibited good correlations with previous MEA studies in stem cell-derived cardiomyocytes and with clinical data. The six-parameter arrhythmia assessment exhibited excellent predictive agreement with the known arrhythmogenic potential of the tested compounds, and holds promise as a new method to predict arrhythmic liability. - Highlights: • Six parameters describing arrhythmia were defined and measured for known compounds. • Software for efficient parameter extraction from large MEA data sets was developed. • The proposed cellular parameter set is predictive of clinical drug proarrhythmia.
Langemann, Timo; Mayr, Ulrike Beate; Meitz, Andrea; Lubitz, Werner; Herwig, Christoph
2016-01-01
Flow cytometry (FCM) is a tool for the analysis of single-cell properties in a cell suspension. In this contribution, we present an improved FCM method for the assessment of E-lysis in Enterobacteriaceae. The result of the E-lysis process is empty bacterial envelopes, called bacterial ghosts (BGs), that constitute potential products in the pharmaceutical field. BGs have reduced light-scattering properties when compared with intact cells. In combination with viability information obtained from staining samples with the membrane potential-sensitive fluorescent dye bis-(1,3-dibutylbarbituric acid) trimethine oxonol (DiBAC4(3)), the presented method allows differentiation between populations of viable cells, dead cells, and BGs. Using a second fluorescent dye, RH414, as a membrane marker, non-cellular background was excluded from the data, which greatly improved the quality of the results. Using true volumetric absolute counting, the FCM data correlated well with cell count data obtained from colony-forming units (CFU) for viable populations. Applicability of the method to several Enterobacteriaceae (different Escherichia coli strains, Salmonella typhimurium, Shigella flexneri 2a) could be shown. The method was validated as a resilient process analytical technology (PAT) tool for the assessment of E-lysis and for particle counting during 20-l batch processes for the production of Escherichia coli Nissle 1917 BGs.
International Nuclear Information System (INIS)
Zhang, Y.; Melnikov, A.; Mandelis, A.; Halliop, B.; Kherani, N. P.; Zhu, R.
2015-01-01
A theoretical one-dimensional two-layer linear photocarrier radiometry (PCR) model including the presence of effective interface carrier traps was used to evaluate the transport parameters of p-type hydrogenated amorphous silicon (a-Si:H) and n-type crystalline silicon (c-Si) passivated by an intrinsic hydrogenated amorphous silicon (i-layer) nanolayer. Several crystalline Si heterojunction structures were examined to investigate the influence of the i-layer thickness and the doping concentration of the a-Si:H layer. The experimental data of a series of heterojunction structures with intrinsic thin layers were fitted to PCR theory to gain insight into the transport properties of these devices. The quantitative multi-parameter results were studied with regard to measurement reliability (uniqueness) and precision using two independent computational best-fit programs. The considerable influence on the transport properties of the entire structure of two key parameters that can limit the performance of amorphous thin film solar cells, namely, the doping concentration of the a-Si:H layer and the i-layer thickness, was demonstrated. It was shown that PCR can be applied to the non-destructive characterization of a-Si:H/c-Si heterojunction solar cells, yielding reliable measurements of the key parameters.
Quantitative multi-parameter mapping of R1, PD*, MT and R2* at 3T: a multi-center validation
Directory of Open Access Journals (Sweden)
Nikolaus Weiskopf
2013-06-01
Multi-center studies using magnetic resonance imaging facilitate studying small effect sizes, global population variance and rare diseases. The reliability and sensitivity of these multi-center studies crucially depend on the comparability of the data generated at different sites and time points. The level of inter-site comparability is still controversial for conventional anatomical T1-weighted MRI data. Quantitative multi-parameter mapping (MPM) was designed to provide MR parameter measures that are comparable across sites and time points, i.e., 1 mm high-resolution maps of the longitudinal relaxation rate (R1=1/T1), effective proton density (PD*), magnetization transfer saturation (MT) and effective transverse relaxation rate (R2*=1/T2*). MPM was validated at 3T for use in multi-center studies by scanning five volunteers at three different sites. We determined the inter-site bias and the inter-site and intra-site coefficients of variation (CoV) for typical morphometric measures (i.e., gray matter probability maps used in voxel-based morphometry) and the four quantitative parameters. The inter-site bias and CoV were smaller than 3.1% and 8%, respectively, except for the inter-site CoV of R2* (< 20%). The gray matter probability maps based on the MT parameter maps had a 14% higher inter-site reproducibility than maps based on conventional T1-weighted images. The low inter-site bias and variance in the parameters and derived gray matter probability maps confirm the high comparability of the quantitative maps across sites and time points. The reliability, short acquisition time, high resolution and the detailed insights into the brain microstructure provided by MPM make it an efficient tool for multi-center imaging studies.
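The inter-site bias and coefficient of variation reported above follow directly from per-site summary statistics. A minimal sketch of the computation, using illustrative numbers rather than the study's actual data:

```python
import numpy as np

# Hypothetical R1 means (s^-1) for one volunteer: three sites, two
# sessions each. Values are illustrative only, not from the study.
r1 = np.array([[0.61, 0.62],   # site A
               [0.60, 0.61],   # site B
               [0.63, 0.62]])  # site C

site_means = r1.mean(axis=1)

# Inter-site CoV: spread of site means relative to the grand mean (%)
inter_site_cov = site_means.std(ddof=1) / site_means.mean() * 100

# Intra-site CoV: session-to-session spread within a site, averaged (%)
intra_site_cov = (r1.std(axis=1, ddof=1) / r1.mean(axis=1) * 100).mean()

# Inter-site bias: per-site deviation from the grand mean (%)
bias = (site_means - site_means.mean()) / site_means.mean() * 100
```

With these toy values the inter-site CoV is about 1.6% and the intra-site CoV about 1.2%, well inside the thresholds quoted in the abstract.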
International Nuclear Information System (INIS)
Eichinger, L.; Lorenz, G.D.; Eichinger, F.; Wechner, S.; Voropaev, A.
2012-01-01
Document available in extended abstract form only. Within the research framework of natural clay rocks used as barriers for radioactive waste confinement, comprehensive analyses are mandatory to determine the chemical and isotopic composition of natural pore water and the gases dissolved therein, as well as of samples from distinct in-situ and lab experiments. Owing to the natural conditions, pore waters from low-permeability argillaceous rocks can be sampled only in small amounts over long time periods. Often those samples are primarily influenced by exploration and exploitation processes, such as contamination by drilling fluid and disinfection fluid, or cement-water interactions. Sophisticated equipment for circulation experiments allows the sampling of gas and water in the original state in steel and PEEK cells. The challenge, though, is to optimise the lab equipment and measurement techniques so that the physical-chemical conditions of the water can be analysed in the original state. The development of special micro measuring cells enables the analysis of physical parameters like redox potential under very slow through-flow conditions. Additional analyses can follow subsequently without wasting any drop of the precious pore water. The gas composition is measured in equilibrated gas phases above the water phase after emptying a defined volume by inert gas or through manual pressure. The analytical challenge is to obtain an extensive set of parameters that is considered representative of the in-situ conditions using only a few millilitres of water. The parameter analysis includes the determination of the composition of the water, the isotopic compositions of the water and the dissolved constituents, as well as their gas concentrations and isotopic signatures. So far the smallest sample volume needed for an analysis of a full set of parameters including the gas composition was 9 ml of water. Obviously, the analysis requires a highly sophisticated infrastructure and
Multi-parameter study of gammas capture
International Nuclear Information System (INIS)
Samama, R.; Nifenecker, H.; Carlos, P.; Delaitre, B.
1966-06-01
This equipment is intended for analyzing, recording, and reading simultaneous information from several 'gamma' detectors. It allows multi-parameter study of γ-γ cascades emitted after thermal neutron capture. (authors) [fr]
Multi-Parameter Aerosol Scattering Sensor
Greenberg, Paul S.; Fischer, David G.
2011-01-01
This work relates to the development of sensors that measure specific aerosol properties. These properties are in the form of integrated moment distributions, i.e., total surface area, total mass, etc., or mathematical combinations of these moment distributions. Specifically, the innovation involves two fundamental features: a computational tool to design and optimize such sensors and the embodiment of these sensors in actual practice. The measurement of aerosol properties is a problem of general interest. Applications include, but are not limited to, environmental monitoring, assessment of human respiratory health, fire detection, emission characterization and control, and pollutant monitoring. The objectives for sensor development include increased accuracy and/or dynamic range, the inclusion in a single sensor of the ability to measure multiple aerosol properties, and developing an overall physical package that is rugged, compact, and low in power consumption, so as to enable deployment in harsh or confined field applications, and as distributed sensor networks. Existing instruments for this purpose include scattering photometers, direct-reading mass instruments, Beta absorption devices, differential mobility analyzers, and gravitational samplers. The family of sensors reported here is predicated on the interaction of light and matter; specifically, the scattering of light from distributions of aerosol particles. The particular arrangement of the sensor, e.g. the wavelength(s) of incident radiation, the number and location of optical detectors, etc., can be derived so as to optimize the sensor response to aerosol properties of practical interest. A key feature of the design is the potential embodiment as an extremely compact, integrated microsensor package. This is of fundamental importance, as it enables numerous previously inaccessible applications. The embodiment of these sensors is inherently low maintenance and high reliability by design. 
The novel and unique features include the underlying computational framework that allows optimization for specific applications, and the physical embodiment that affords the construction of a compact, durable, and reliable integrated package. The advantage appears in the form of increased accuracy relative to existing instruments, and in the applications enabled by the physical attributes of the resulting configuration.
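The integrated moment distributions mentioned above (total surface area, total mass) can be computed in closed form once a size distribution is assumed. A sketch for a lognormal number distribution using the Hatch-Choate moment relation; all parameter values are illustrative, not taken from the sensor described here:

```python
import numpy as np

# Illustrative lognormal aerosol parameters (assumed, not measured)
N = 1e10          # total number concentration, particles per m^3
dg = 0.1e-6       # geometric mean diameter, m
sigma_g = 1.6     # geometric standard deviation
rho = 1000.0      # particle material density, kg/m^3

def moment(k):
    """k-th diameter moment of the lognormal number distribution
    (Hatch-Choate): M_k = N * dg^k * exp(k^2 ln^2(sigma_g) / 2)."""
    return N * dg**k * np.exp(0.5 * k**2 * np.log(sigma_g)**2)

total_surface = np.pi * moment(2)          # m^2 of surface per m^3 of air
total_mass = rho * np.pi / 6 * moment(3)   # kg of aerosol per m^3 of air
```

A scattering sensor of the kind described infers such moments (or combinations of them) from the optical response rather than from an assumed distribution; the closed form above is only the forward model.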
Operto, S.; Miniussi, A.
2018-03-01
Three-dimensional frequency-domain full waveform inversion (FWI) is applied to North Sea wide-azimuth ocean-bottom cable data at low frequencies (≤ 10 Hz) to jointly update vertical wavespeed, density and quality factor Q in the visco-acoustic VTI approximation. We assess whether density and Q should be viewed as proxies that absorb artefacts resulting from approximate wave physics, or are valuable for interpretation in the presence of saturated sediments and gas. FWI is performed in the frequency domain to account for attenuation easily. Multi-parameter frequency-domain FWI is efficiently performed with a few discrete frequencies following multi-scale frequency continuation. However, grouping a few frequencies during each multi-scale step is necessary to mitigate the acquisition footprint and match dispersive shallow guided waves. Q and density absorb a significant part of the acquisition footprint, thereby cleaning the velocity model of this pollution. Low-Q perturbations correlate with low-velocity zones associated with soft sediments and a gas cloud. However, the amplitudes of the Q perturbations show significant variations when the inversion tuning is modified. This dispersion in the Q reconstructions is, however, not passed on to the velocity parameter, suggesting that cross-talk between first-order kinematic and second-order dynamic parameters is limited. The density model shows a good match with a well log at shallow depths. Moreover, the impedance built a posteriori from the FWI velocity and density models shows a well-focused image, albeit with local differences from the velocity model near the sea bed, where density might have absorbed elastic effects. The FWI models are finally assessed against time-domain synthetic seismogram modelling performed with the same frequency-domain modelling engine used for FWI.
[Cluster analysis in biomedical research].
Akopov, A S; Moskovtsev, A A; Dolenko, S A; Savina, G D
2013-01-01
Cluster analysis is one of the most popular methods for the analysis of multi-parameter data. Cluster analysis reveals the internal structure of the data, grouping separate observations by their degree of similarity. The review provides definitions of the basic concepts of cluster analysis and discusses the most popular clustering algorithms: k-means, hierarchical algorithms, and Kohonen network algorithms. Examples of the use of these algorithms in biomedical research are given.
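Of the algorithms listed, k-means is the simplest to state: alternate between assigning each observation to its nearest centroid and recomputing each centroid as the mean of its cluster. A minimal NumPy sketch (Lloyd's algorithm, not tied to any particular biomedical data set):

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    """Lloyd's algorithm: returns (centroids, labels) for data X of shape (n, d)."""
    rng = np.random.default_rng(seed)
    # initialize centroids as k distinct random observations
    centroids = X[rng.choice(len(X), k, replace=False)]
    for _ in range(n_iter):
        # assign each point to the nearest centroid
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # recompute centroids as cluster means (keep old one if a cluster empties)
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return centroids, labels
```

Hierarchical and Kohonen (self-organizing map) methods differ mainly in how they form groups; the assign-then-update loop above is the core of the k-means family.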
Energy Technology Data Exchange (ETDEWEB)
Paul, Sumit; Legner, Wolfgang; Hackner, Angelika; Mueller, Gerhard [EADS Innovation Works, Muenchen (Germany). Bereich Sensors, Electronics and Systems Integration; Baumbach, Volker [Airbus Operations GmbH, Bremen (Germany). Bereich Hydraulic Performance and Integrity
2011-07-01
A miniaturised sensor system for aviation hydraulic fluids is presented. The system consists of an optochemical sensor and a particle sensor. The optochemical sensor detects the form of the O-H absorption feature around 3500 cm^-1 to reveal the water and acid contamination in the fluid. The particle sensor uses a light-barrier principle to derive its particle contamination number. (orig.)
Institute of Scientific and Technical Information of China (English)
张远芳; 周正; 李会勇
2016-01-01
For multi-parameter joint estimation with a polarization-sensitive L-shaped array, the traditional multiple signal classification (MUSIC) algorithm requires a large amount of computation, while the estimation of signal parameters via rotational invariance techniques (ESPRIT) algorithm must deal with the parameter-pairing problem. A root-MUSIC algorithm with a modulus constraint is proposed. The algorithm estimates the direction-of-arrival (DOA) and polarization parameters in two separate steps. In step one, the autocorrelation function of the data received by the two mutually perpendicular linear subarrays of the L-shaped array is constructed, and the DOA is estimated by the root-MUSIC algorithm; in step two, a cost function is constructed from the modulus constraint, and the polarization parameters are obtained in closed form. Compared with the traditional MUSIC algorithm, the proposed algorithm greatly reduces the amount of computation. It also achieves automatic parameter pairing, avoiding the deficiency of the ESPRIT algorithm. Computer simulation results show that the angle-estimation performance of the proposed algorithm is close to that of the traditional MUSIC algorithm and better than that of the ESPRIT algorithm, and that the proposed algorithm converges quickly.
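The root-MUSIC step in the abstract replaces MUSIC's costly spectral search with polynomial rooting. The sketch below shows generic root-MUSIC DOA estimation for a single uniform linear array, not the paper's modulus-constrained, polarization-sensitive variant; the signal model and parameter names are illustrative assumptions:

```python
import numpy as np

def root_music_doa(X, n_sources, d=0.5):
    """Generic root-MUSIC for a uniform linear array.
    X: complex snapshot matrix (n_antennas x n_snapshots),
    d: element spacing in wavelengths. Returns DOAs in degrees."""
    R = X @ X.conj().T / X.shape[1]            # sample autocorrelation matrix
    w, V = np.linalg.eigh(R)                   # eigenvalues in ascending order
    En = V[:, :-n_sources]                     # noise-subspace eigenvectors
    C = En @ En.conj().T
    m = C.shape[0]
    # polynomial coefficients: sums of the diagonals of C (highest power first)
    coeffs = np.array([np.trace(C, k) for k in range(m - 1, -m, -1)])
    roots = np.roots(coeffs)
    # of each conjugate-reciprocal pair keep the root inside the unit circle,
    # then pick the n_sources roots closest to the circle
    roots = roots[np.abs(roots) < 1]
    roots = roots[np.argsort(np.abs(np.abs(roots) - 1))][:n_sources]
    return np.sort(np.degrees(np.arcsin(np.angle(roots) / (2 * np.pi * d))))
```

The paper's two-step method would follow this with a closed-form polarization estimate from the modulus-constraint cost function; that step is omitted here.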
EMS mutant spectra generated by multi-parameter flow cytometry
Energy Technology Data Exchange (ETDEWEB)
Keysar, Stephen B. [Cell and Molecular Biology Graduate Program, Colorado State University, Fort Collins, CO (United States); Fox, Michael H., E-mail: michael.fox@colostate.edu [Cell and Molecular Biology Graduate Program, Colorado State University, Fort Collins, CO (United States); Department of Environmental and Radiological Health Sciences, Colorado State University, Fort Collins, CO (United States)
2009-12-01
The CHO A_L cell line contains a single copy of human chromosome 11 that encodes several cell surface proteins, including the glycosyl phosphatidylinositol (GPI) linked CD59 and CD90, as well as CD98, CD44 and CD151, which are not GPI-linked. The flow cytometry mutation assay (FCMA) measures mutations of the CD59 gene by the absence of fluorescence when stained with antibodies against the CD59 cell surface protein. We have measured simultaneous mutations in CD59, CD44, CD90, CD98 and CD151 to generate a mutant spectrum for ionizing radiation. After treatment with ethyl methanesulfonate (EMS) many cells have an intermediate level of CD59 staining. Single cells were sorted from CD59^- regions with varying levels of fluorescence, and the resulting clonal populations had a stable phenotype for CD59 expression. Mutant spectra were generated by flow cytometry using the isolated clones, and nearly all clones were mutated in CD59 only. Interestingly, about 60% of the CD59-negative clones were actually GPI mutants, as determined by staining with the GPI-specific fluorescently labeled bacterial toxin aerolysin (FLAER). The GPI-negative cells are most likely caused by mutations in the X-linked pigA gene important in GPI biosynthesis. Small mutations of pigA and CD59 were expected for the alkylating agent EMS, and the resulting spectra are significantly different from the large deletions found when analyzing radiation mutants. After analyzing the CD59^- clonal populations we have adjusted the FCMA mutant regions from 1% to 10% of the mean of the CD59-positive peak to include the majority of CD59 mutants.
Optimization Design of Multi-Parameters in Rail Launcher System
Yujiao Zhang; Weinan Qin; Junpeng Liao; Jiangjun Ruan
2014-01-01
Today's energy storage systems are still cumbersome; therefore it is useful to think about the optimization of a railgun system in order to achieve the best performance with the lowest energy input. In this paper, an optimal design method considering 5 parameters is proposed to improve the energy conversion efficiency of a simple railgun. In order to avoid costly trials, the field-circuit method is employed to analyze the operation of different structural railguns with different paramete...
Multi-parameter full waveform inversion using Poisson
Oh, Juwon; Min, Dong-Joo
2016-01-01
on the two conventional parameterisations (one uses Lame constants and density; the other employs P- and S-wave velocities and density) have low resolution of gradients for P-wave velocities (or ). Limitations occur because the virtual sources for P
Hyperspectral multi-parameter sensing for marine environmental security
DEFF Research Database (Denmark)
Zielinski, Oliver; Brehm, Robert
2007-01-01
Ocean sensors are used to detect algae blooms and feature oil spill alarm systems, among others. Optical sensors for such stations do not require chemical reagents and have the potential to be fast, self-sufficient and suitable for long-term applications. An effort has been made by the Institute...
Performance of MarSite Multi parameter Borehole Instrumentation
Guralp, Cansun; Tunc, Suleyman; Ozel, Oguz; Meral Ozel, Nurcan; Necmioglu, Ocal
2017-04-01
In this paper we present two-year results obtained from the integrated multi-parameter borehole system at MarSite. The very broad band (VBB) system has been operating since its installation in November 2014: one year in a water-filled borehole and, from January 2016, one year in a dry borehole. The real-time data have been available to the community. The two borehole environments are compared, showing the superior performance of the dry borehole environment over the water-filled one for a VBB seismometer. The practical considerations applied in both borehole installations are compared, and the best practical borehole installation techniques are presented and discussed. The data are also compared with a surface 120-second broadband sensor and the seismic arrays within the MarSite region. The very long-term performance (one year of data in a dry hole) of the VBB borehole seismometer and the dilatometer is presented. The high-frequency performance of the VBB seismometer, which extends to 150 Hz, is compared with that of the dilatometer, characterizing the dilatometer results.
Methods and apparatus for multi-parameter acoustic signature inspection
Energy Technology Data Exchange (ETDEWEB)
Diaz, Aaron A [Richland, WA; Samuel, Todd J [Pasco, WA; Valencia, Juan D [Kennewick, WA; Gervais, Kevin L [Richland, WA; Tucker, Brian J [Pasco, WA; Kirihara, Leslie J [Richland, WA; Skorpik, James R [Kennewick, WA; Reid, Larry D [Benton City, WA; Munley, John T [Benton City, WA; Pappas, Richard A [Richland, WA; Wright, Bob W [West Richland, WA; Panetta, Paul D [Richland, WA; Thompson, Jason S [Richland, WA
2007-07-24
A multiparameter acoustic signature inspection device and method are described for non-invasive inspection of containers. Dual acoustic signatures discriminate between various fluids and materials for identification of the same.
A universal multi-parameter data acquisition system MOLDAS1
International Nuclear Information System (INIS)
Jiao Dunpang; Zhou Yanyen; Ge Wenxiu; Wang Yanyu; Yu Jusheng; Jing Lan
1988-01-01
MOLDAS1 is a data acquisition system used for data-taking from heavy-ion-induced reactions at IMP. Its hardware and software configuration, system control logic, data flow and functions are introduced. The system specification is discussed as well.
Multidimentional and Multi-Parameter Fortran-Based Curve Fitting ...
African Journals Online (AJOL)
This work briefly describes the mathematics behind the algorithm, and also elaborates how to implement it using the FORTRAN 95 programming language. The advantage of this algorithm, when it is extended to surfaces and complex functions, is that it gives researchers better confidence during fitting. It also improves the ...
Stability, performance and sensitivity analysis of I.I.D. jump linear systems
Chávez Fuentes, Jorge R.; González, Oscar R.; Gray, W. Steven
2018-06-01
This paper presents a symmetric Kronecker product analysis of independent and identically distributed jump linear systems to develop new, lower-dimensional equations for the stability and performance analysis of this type of system than are currently available. In addition, new closed-form expressions characterising multi-parameter relative sensitivity functions for performance metrics are introduced. The analysis technique is illustrated with a distributed fault-tolerant flight control example where the communication links are allowed to fail randomly.
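A relative sensitivity function measures the fractional change in a performance metric J per fractional change in a parameter p, i.e. S = (p/J) dJ/dp. The paper derives closed-form expressions; as a generic illustration (not the paper's method), the same quantity can be estimated numerically for a toy scalar system whose steady-state variance is known analytically:

```python
def rel_sensitivity(J, p, h=1e-6):
    """Relative sensitivity S = (p / J(p)) * dJ/dp via central differences."""
    dJ = (J(p + h) - J(p - h)) / (2 * h)
    return p * dJ / J(p)

# Toy performance metric: steady-state variance of the stable scalar system
# x[t+1] = a*x[t] + w[t] with Var(w) = 1, which gives J(a) = 1 / (1 - a^2).
J = lambda a: 1.0 / (1.0 - a**2)

S = rel_sensitivity(J, 0.8)
# Analytic value for comparison: S = 2*a^2 / (1 - a^2) = 3.555... at a = 0.8
```

The closed-form expressions in the paper serve the same purpose without finite-difference error, and extend to the multi-parameter jump-linear setting.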
Development of in-air micro-PIXE analysis and data sharing systems in JAERI Takasaki
International Nuclear Information System (INIS)
Sakai, Takuro; Kamiya, Tomihiro; Oikawa, Masakazu
2000-01-01
An external scanning ion microbeam system has been developed for in-air micro-PIXE analysis at JAERI Takasaki. The analysis system has been widely used for a variety of research in recent years. The system consists of the external scanning ion microbeam system, a multi-parameter data acquisition system, a file transfer protocol (FTP) server and analysis software. The software provides a graphical user interface for interaction between users and the experimental setup. The server is connected to the Internet and allows remote users to access the experimental data. (author)
Initial multi-parameter detection of atmospheric metal layers by Beijing Na–K lidar
International Nuclear Information System (INIS)
Jiao, Jing; Yang, Guotao; Wang, Jihong; Cheng, Xuewu; Du, Lifang; Wang, Zelong; Gong, Wei
2017-01-01
The Beijing Na–K lidar has been running since 2010. This lidar has two laser beams: one dye laser emits a 589-nm beam for Na layer detection; the other dye laser emits a 770-nm beam for K layer detection. Under similar conditions, the echo signal of the K layer is only about two orders of magnitude smaller than that of the Na layer. The lidar has a sufficient signal-to-noise ratio (SNR): the structure and details of the potassium layer can be effectively distinguished from a single original echo. Several examples of co-observation of the densities of the Na and K layers showed some results that differ from previous studies. This lidar not only supplements the lack of Na and K layer observations in this latitude region, but also provides evidence for the atmospheric sciences and space environment monitoring. - Highlights: • Full-band dual-beam lidar at 40°N. • Detects the sodium and potassium layers simultaneously. • Provides a supplement to the study of atmospheric metal layers and evidence for atmospheric sciences and space environment monitoring.
Multi Parameter Flow Meter for On-Line Measurement of Gas Mixture Composition
Directory of Open Access Journals (Sweden)
Egbert van der Wouden
2015-04-01
In this paper we describe the development of a system and model to analyze the composition of gas mixtures of up to four components. The system consists of a Coriolis mass flow sensor and density, pressure and thermal flow sensors. With this system it is possible to measure the viscosity, density, heat capacity and flow rate of the medium. In a next step the composition can be analyzed if the constituents of the mixture are known. This makes the approach universally applicable to all gases, as long as the number of components does not exceed the number of measured properties and the properties are measured with sufficient accuracy. We present measurements on binary and ternary gas mixtures, with compositions that range over an order of magnitude in value for the physical properties. Two platforms for the analyses are presented. The first platform consists of sensors realized with MEMS fabrication technology. This approach allows for a system with a high level of integration. With this system we demonstrate a proof of principle for the analysis of binary mixtures with an accuracy of 10%. In the second platform we utilize more mature steel sensor technology to demonstrate the potential of this approach. We show that with this technique binary mixtures can be measured to within 1% and ternary gas mixtures to within 3%.
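The inversion step described above, recovering composition from measured bulk properties when the constituents are known, reduces to a small linear solve if one assumes the mixture properties combine linearly in the mole fractions (a reasonable first approximation for molar mass and molar heat capacity of near-ideal gases, though not for viscosity, which mixes nonlinearly). A sketch with illustrative pure-component values:

```python
import numpy as np

# Pure-component molar properties (illustrative, near room temperature):
# rows = [molar mass g/mol, molar heat capacity J/(mol K)]
# columns = N2, CO2, H2
props = np.array([[28.0, 44.0,  2.0],
                  [29.1, 37.1, 28.8]])

def mixture_fractions(measured):
    """Solve for mole fractions from measured mixture properties under the
    linear-mixing assumption, adding the closure constraint sum(x) = 1."""
    A = np.vstack([props, np.ones(props.shape[1])])
    b = np.append(measured, 1.0)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

# Forward-simulate a known mixture, then recover its composition
x_true = np.array([0.5, 0.3, 0.2])
x_est = mixture_fractions(props @ x_true)
```

With two measured properties plus the closure constraint, three components are identifiable, which mirrors the paper's rule that the component count must not exceed the number of measured properties.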
Multi-parameter assessment of thrombus formation on microspotted arrays of thrombogenic surfaces
sprotocols
2014-01-01
Authors: Susanne de Witt, Frauke Swieringa, Judith Cosemans & Johan Heemskerk ### Abstract Thrombus formation by adhering and aggregating blood platelets is fundamental to hemostasis and is a prerequisite for vascular occlusion in pathological thrombosis. The parallel-plate flow chamber technique has been extensively used to measure platelet adhesion and activation in vitro at arterial or venous flow conditions. Here, we describe the use of brightfield and confocal fluorescence micros...
Jin, Dayang; Yang, Fen; Chen, Zhongjiang; Yang, Sihua; Xing, Da
2017-09-01
The combination of phase-sensitive photoacoustic (PA) imaging of tissue viscoelasticity with the esophagus-adaptive PA endoscope (PAE) technique allows the characterization of biomechanical and morphological changes in the early stage of esophageal disease with high accuracy. In this system, the tissue biomechanics and morphology are obtained by detecting the PA phase and PA amplitude information, respectively. The PAE has a transverse resolution of approximately 37 μm and an outer diameter of 1.2 mm, which is suitable for examining the rabbit esophagus. Here, an in-situ biomechanical and morphological study of normal and diseased rabbit esophagus (esophageal tumors and reflux esophagitis) was performed. The in-situ findings were highly consistent with those observed by histology. In summary, we demonstrated the potential application of PAE for early clinical detection of esophageal diseases.
A Highly Integrated Multi-Parameter Distributed Fiber-Optic Instrumentation System, Phase I
National Aeronautics and Space Administration — In the future, exploration missions will benefit greatly from advanced metrology capabilities, particularly structural health monitoring systems that provide real...
Multi-Parameter Wireless Monitoring and Telecommand of a Rocket Payload: Design and Implementation
Pamungkas, Arga C.; Putra, Alma A.; Puspitaningayu, Pradini; Fransisca, Yulia; Widodo, Arif
2018-04-01
A rocket system generally consists of two parts: the rocket motor and the payload. The payload system is built of several sensors, such as an accelerometer, gyroscope and magnetometer, and also a surveillance camera. These sensors are used to monitor the rocket in three-dimensional axes to determine its attitude. Additionally, the payload must be able to perform image capture at a certain distance via telecommand. This article describes the design and implementation of a rocket payload with attitude monitoring and telecommand capability from the ground control station, using the long-range wireless module Digi XBee Pro 900 HP.
Multi-parameter optimization of a nanomagnetic system for spintronic applications
Energy Technology Data Exchange (ETDEWEB)
Morales Meza, Mishel [Centro de Investigación en Materiales Avanzados, S.C. (CIMAV), Chihuahua/Monterrey, 120 Avenida Miguel de Cervantes, 31109 Chihuahua (Mexico); Zubieta Rico, Pablo F. [Centro de Investigación en Materiales Avanzados, S.C. (CIMAV), Chihuahua/Monterrey, 120 Avenida Miguel de Cervantes, 31109 Chihuahua (Mexico); Centro de Investigación y de Estudios Avanzados del IPN (CINVESTAV) Querétaro, Libramiento Norponiente 2000, Fracc. Real de Juriquilla, 76230 Querétaro (Mexico); Horley, Paul P., E-mail: paul.horley@cimav.edu.mx [Centro de Investigación en Materiales Avanzados, S.C. (CIMAV), Chihuahua/Monterrey, 120 Avenida Miguel de Cervantes, 31109 Chihuahua (Mexico); Sukhov, Alexander [Institut für Physik, Martin-Luther Universität Halle-Wittenberg, 06120 Halle (Saale) (Germany); Vieira, Vítor R. [Centro de Física das Interacções Fundamentais (CFIF), Instituto Superior Técnico, Universidade Técnica de Lisboa, Avenida Rovisco Pais, 1049-001 Lisbon (Portugal)
2014-11-15
Magnetic properties of nano-particles feature many interesting physical phenomena that are essential for the creation of a new generation of spin-electronic devices. The magnetic stability of the nano-particles can be improved by the formation of ordered particle arrays, which should be optimized over several parameters. Here we report a successful optimization over inter-particle distance and applied field frequency, yielding an approximately three-fold reduction in the coercivity of a particle array compared with that of a single particle, which opens new perspectives for the development of spintronic devices.
Multi-parameter optimization of a nanomagnetic system for spintronic applications
International Nuclear Information System (INIS)
Morales Meza, Mishel; Zubieta Rico, Pablo F.; Horley, Paul P.; Sukhov, Alexander; Vieira, Vítor R.
2014-01-01
Magnetic properties of nano-particles feature many interesting physical phenomena that are essential for the creation of a new generation of spin-electronic devices. The magnetic stability of the nano-particles can be improved by the formation of ordered particle arrays, which should be optimized over several parameters. Here we report a successful optimization over inter-particle distance and applied field frequency, yielding an approximately three-fold reduction in the coercivity of a particle array compared with that of a single particle, which opens new perspectives for the development of spintronic devices.
Miniature Non-Intrusive Multi-Parameter Oronasal Respiratory Health Monitor, Phase I
National Aeronautics and Space Administration — Redondo Optics Inc. (ROI), proposes to develop, demonstrate, and deliver to NASA an intrinsically safe, miniature, low power, autonomous, and self-calibrated,...
Farhat, I. A. H.; Alpha, C.; Gale, E.; Atia, D. Y.; Stein, A.; Isakovic, A. F.
The scaledown of magnetic tunnel junctions (MTJ) and related nanoscale spintronics devices poses unique challenges for energy optimization of their performance. We demonstrate the dependence of the switching current on the scaledown variable, while considering the influence of geometric parameters of MTJ, such as the free layer thickness, tfree, lateral size of the MTJ, w, and the anisotropy parameter of the MTJ. At the same time, we point out which values of the saturation magnetization, Ms, and anisotropy field, Hk, can lead to lowering the switching current and overall decrease of the energy needed to operate an MTJ. It is demonstrated that scaledown via decreasing the lateral size of the MTJ, while allowing some other parameters to be unconstrained, can improve energy performance by a measurable factor, shown to be the function of both geometric and physical parameters above. Given the complex interdependencies among both families of parameters, we developed a particle swarm optimization (PSO) algorithm that can simultaneously lower energy of operation and the switching current density. Results we obtained in scaledown study and via PSO optimization are compared to experimental results. Support by Mubadala-SRC 2012-VJ-2335 is acknowledged, as are staff at Cornell-CNF and BNL-CFN.
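The abstract names particle swarm optimization but gives no implementation details. As a rough illustration of the technique, here is a minimal, generic PSO sketch minimizing a toy two-parameter "energy" surface; the quadratic objective, swarm size, and coefficients are illustrative assumptions, not the authors' MTJ energy model.

```python
import random

def pso_minimize(f, bounds, n_particles=20, n_iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer over box-bounded parameters."""
    rng = random.Random(seed)
    dim = len(bounds)
    # Initialise particle positions inside the bounds, velocities at zero.
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # each particle's best position
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm-wide best
    for _ in range(n_iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Inertia + pull toward personal best + pull toward global best.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Hypothetical stand-in for an energy-vs-parameters surface (NOT the MTJ model):
energy = lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2
best, best_val = pso_minimize(energy, [(-5, 5), (-5, 5)])
```

In the paper's setting, the objective would instead evaluate the operating energy and switching current density as a function of the geometric and material parameters.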
Engels, Alexander; Canli, Ekrem; Thiebes, Benni; Glade, Thomas
2015-04-01
Landslides pose a serious threat to many communities in Austria. The region of Lower Austria is underlain, amongst others, by the lithological units of the Flysch Zone and the Gresten Klippenbelt. Both are particularly affected by landslides, and the majority of episodic occurrences are bound to these two units. The active Salcher landslide is situated at the western border of the municipality of Gresten and is embedded in the geologic transition zone of the respective lithological units. The landslide is a reactivated and deep-seated complex landslide that endangers buildings, parts of a road and lifelines such as power and optical fiber lines and fresh and sewage water supplies. Its varying movement rates are on the order of a few centimeters per year and consequently are classified as slow to extremely slow. Despite biannual geodetic surveys, little is known about its dynamic behavior, including the triggering and controlling factors and its internal structure. Surface and subsurface investigations were therefore carried out on the landslide. With the intention to detect morphological surface changes, comparative geomorphologic mapping and terrestrial laser scanning were performed. Additionally, surface kinematic information was acquired from historical documents and GNSS measurements. The present soil-physical conditions and their relation to the current dynamics were investigated with six drill cores and three inclinometer installations. Soil specimens were obtained by percussion drilling. Particle size distribution and water and carbonate content were subsequently analyzed in the laboratory. In addition, dynamic probing was performed at 13 sites across the landslide body and resistance values were compared to textural findings. The soil specimens show a heterogeneous texture and large variations in carbonate and water content. Soil wedges, originating from local displacements, were identified in two drill cores.
Very high water content and the resulting plastic behavior indicate the presence of weakness zones with the geometry of a translational landslide. The depths of the drill cores ranged from 5 m to 9 m. The sampling density of each core was less than one meter. The final depths of the three inclinometers ranged from 6.5 m to 13 m. The inclinometers were placed at prominent morphological landslide features such as the head, the bulged levee and the transition zone, and were maintained over the past eight months. Subsurface displacement measurements were then compared with the soils' texture. GNSS-based geomorphological mapping revealed areas that underwent morphological changes. Surface displacements were analyzed by terrestrial laser scanning. These site investigations are the basis for a detailed understanding of the landslide dynamics. In the future, the measurements will be applied in modelling concepts which will be embedded in a comprehensive landslide early warning system.
A computer interface for processing multi-parameter data of multiple event types
International Nuclear Information System (INIS)
Katayama, I.; Ogata, H.
1980-01-01
A logic circuit called a 'Raw Data Processor (RDP)', which functions as an interface between ADCs and the PDP-11 computer, has been developed at RCNP, Osaka University for general use. It enables simultaneous data processing for up to 16 event types, and an arbitrary combination of up to 14 ADCs can be assigned to each event type by means of a pinboard matrix. The details of the RDP and its application are described. (orig.)
Hybrid organic-inorganic porous semiconductor transducer for multi-parameters sensing.
Caliò, Alessandro; Cassinese, Antonio; Casalino, Maurizio; Rea, Ilaria; Barra, Mario; Chiarella, Fabio; De Stefano, Luca
2015-07-06
Porous silicon (PSi) non-symmetric multi-layers are modified by organic molecular beam deposition of an organic semiconductor, namely N,N'-1H,1H-perfluorobutyldicyanoperylene-carboxydi-imide (PDIF-CN2). Joule evaporation of PDIF-CN2 into the PSi sponge-like matrix not only improves but also adds transducing capabilities, making this solid-state device a dual-signal sensor for chemical monitoring. PDIF-CN2-modified PSi optical microcavities show an increase of about five orders of magnitude in electric current with respect to the same bare device. This feature can be used to sense volatile substances. PDIF-CN2 also improves the chemical resistance of PSi against alkaline and acid corrosion. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
Utama, D. N.; Ani, N.; Iqbal, M. M.
2018-03-01
Optimization is a process of finding the parameter (or parameters) that delivers an optimal value of an objective function. Seeking an optimal generic model for optimization is a computer-science problem that has been studied by numerous researchers. A generic model is a model that can be operated to solve any variety of optimization problem. The generic model for optimization was constructed using an object-oriented method. Moreover, two types of optimization method, simulated annealing and hill climbing, were used in constructing the model and then compared to find the more optimal one. The results showed that both methods gave the same value of the objective function, and the hill-climbing-based model consumed the shortest running time.
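The two methods the abstract compares can be sketched side by side. This is a minimal illustration on a toy convex objective; the objective, step sizes, and cooling schedule are illustrative assumptions, not the paper's generic model.

```python
import math
import random

def hill_climb(f, x0, step=0.1, iters=2000, seed=0):
    """Greedy local search: accept a random neighbour only if it improves f."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)
        fc = f(cand)
        if fc < fx:
            x, fx = cand, fc
    return x, fx

def simulated_annealing(f, x0, step=0.5, iters=2000, t0=1.0, seed=0):
    """Also accept worse moves with probability exp(-delta/T); T cools to zero."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    for k in range(iters):
        t = t0 * (1 - k / iters) + 1e-9          # linear cooling schedule
        cand = x + rng.uniform(-step, step)
        fc = f(cand)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
    return best, fbest

# Convex toy objective: both methods should find the same minimum near x = 2,
# which mirrors the paper's finding that both reached the same objective value.
obj = lambda x: (x - 2.0) ** 2
hx, hfx = hill_climb(obj, 0.0)
sx, sfx = simulated_annealing(obj, 0.0)
```

On a convex objective like this, greedy hill climbing is also typically faster, consistent with the running-time result reported; simulated annealing's acceptance of uphill moves matters only when local minima exist.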
Energy Technology Data Exchange (ETDEWEB)
Bostock, J.; Weller, P. [School of Informatics, City University London, London EC1V 0HB (United Kingdom); Cooklin, M., E-mail: jbostock1@msn.co [Cardiovascular Directorate, Guy's and St. Thomas' NHS Foundation Trust, London, SE1 7EH (United Kingdom)
2010-07-01
Automated diagnostic algorithms are used in implantable cardioverter-defibrillators (ICDs) to detect abnormal heart rhythms. Algorithms misdiagnose, and improved specificity is needed to prevent inappropriate therapy. Knowledge engineering (KE) and artificial intelligence (AI) could improve this. A pilot study of KE was performed with an artificial neural network (ANN) as the AI system. A case-note review analysed arrhythmic events stored in patients' ICD memory; 13.2% of patients received inappropriate therapy. The best ICD algorithm had sensitivity 1.00 and specificity 0.69 (p<0.001 versus gold standard). A subset of data was used to train and test an ANN. A feed-forward, back-propagation network with 7 inputs, a 4-node hidden layer and 1 output had sensitivity 1.00 and specificity 0.71 (p<0.001). A prospective study was performed using KE to list arrhythmias, factors and indicators, for which measurable parameters were evaluated and the results reviewed by a domain expert. Waveforms from electrodes in the heart and thoracic bio-impedance, temperature and motion data were collected from 65 patients during cardiac electrophysiological studies. Five datasets were incomplete due to technical failures. We concluded that KE successfully guided the selection of parameters, that the ANN produced a usable system, and that complex data collection carries a greater risk of technical failure, leading to data loss.
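The abstract specifies the network topology (7 inputs, one 4-node hidden layer, 1 output) but not the implementation. A minimal NumPy sketch of such a feed-forward, back-propagation network follows; the synthetic features, labels, learning rate, and loss function are illustrative assumptions, not the study's clinical data.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# Topology from the abstract: 7 inputs -> 4-node hidden layer -> 1 output.
W1 = rng.normal(scale=0.5, size=(7, 4)); b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)

def forward(X):
    h = sigmoid(X @ W1 + b1)            # hidden-layer activations
    return h, sigmoid(h @ W2 + b2)      # output in (0, 1), read as a probability

def train_step(X, y, lr=1.0):
    """One full-batch back-propagation step on mean squared error."""
    global W1, b1, W2, b2
    h, out = forward(X)
    err = out - y
    d_out = err * out * (1 - out)                  # through the output sigmoid
    d_h = (d_out @ W2.T) * h * (1 - h)             # through the hidden sigmoid
    W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(0)
    W1 -= lr * X.T @ d_h / len(X);   b1 -= lr * d_h.mean(0)
    return float((err ** 2).mean())

# Purely synthetic stand-in data: 7 features, label from a noisy linear rule.
X = rng.normal(size=(200, 7))
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)
losses = [train_step(X, y) for _ in range(3000)]
```

Thresholding the output at 0.5 yields the binary classification from which sensitivity and specificity, as reported in the abstract, would be computed against a gold standard.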
Fitting the Generic Multi-Parameter Crossover Model: Towards Realistic Scaling Estimates
Z.R. Struzik; E.H. Dooijes; F.C.A. Groen; M.M. Novak; T. G. Dewey
1997-01-01
The primary concern of fractal metrology is providing a means of reliable estimation of scaling exponents such as fractal dimension, in order to prove the null hypothesis that a particular object can be regarded as fractal. In the particular context to be discussed in this contribution,
On structural reliability under time-varying multi-parameter loading
International Nuclear Information System (INIS)
Augusti, G.
1975-01-01
This paper intends to be a contribution towards the formulation of a procedure for the solution of the title problem that is at the same time correct and not too cumbersome for practical application. The problem is examined in detail and a number of possible alternative approaches to the solution discussed. Special attention is paid to the superimposition of loads of different origin and characteristics (e.g. long-term loads like the furniture and usual occupancy load in a building, and short-term loads like explosions, earthquakes, storms, etc.): it is recognized that a single procedure for all cases does not appear practical, and that, within a general framework, special methods must be devised according to the type of loads and structural responses. For instance, the superimposition of impulsive loads must be studied with reference to the response time of the structure. It is shown that usually, the statistics of extreme values are not sufficient for a correct study of superimposition: the instantaneous probability distributions of the load intensities are also required. (Auth.)
On structural reliability under time-varying multi-parameter loading
International Nuclear Information System (INIS)
Augusti, G.
1975-01-01
Special attention will be paid to the superimposition of loads of different origin and characteristics (e.g. long-term loads like the furniture and usual occupancy load in a building, and short-term loads like explosions, earthquakes, storms, etc.): it will be recognized that a single procedure for all cases does not appear practical, and that, within a general framework, special methods must be devised according to the type of loads and structural responses. For instance, the superimposition of impulsive loads must be studied with reference to the response time of the structure. It will be shown that usually the statistics of extreme values are not sufficient for a correct study of superimposition: the instantaneous probability distributions of the load intensities are also required. The results obtained with respect to the loads can be joined with previous results by Augusti and Baratta (see e.g. SMiRT-2 paper M7/8) on structural strength, for the evaluation of the probability of success (i.e. the reliability) of a structural design.
Multi-parameter spectroscopy of fission fragments and related emission products
International Nuclear Information System (INIS)
Ruben, A.; Jahnke, U.
1993-01-01
An exclusive measurement of the ²⁵²Cf(sf) fragment distribution in mass and energy, in coincidence with the related emission products, obtained by combining a twin ionization chamber with a 4π neutron tank, an n-γ detector, and a solid-state detector telescope, is presented. The experimental set-up, data handling and acquisition are described, followed by a discussion of the raw-data evaluation. (orig.)
'Sheiva': a general purpose multi-parameter data acquisition and processing system at VECC
International Nuclear Information System (INIS)
Viyogi, Y.P.; Ganguly, N.K.
1982-01-01
A general purpose interactive software to be used with the PDP-15/76 on-line computer at VEC Centre for the acquisition and processing of data in nuclear physics experiments is described. The program can accommodate a maximum of thirty two inputs although the present hardware limits the number of inputs to eight. Particular emphasis is given to the problems of flexibility and ease of operation, memory optimisation and techniques dealing with experimenter-computer interaction. Various graphical methods for one- and two-dimensional data presentation are discussed. Specific problems of particle identification using detector telescopes have been dealt with carefully to handle experiments using several detector telescopes and those involving light particle-heavy particle coincidence studies. Steps needed to tailor this program towards utilisation for special experiments are also described. (author)
A molecular informatics view on best practice in multi-parameter compound optimization
Lusher, S.J.; McGuire, R.; Azevedo, R.; Boiten, J.W.; van Schaik, R.C.; de Vlieg, J.
2011-01-01
The difference between biologically active molecules and drugs is that the latter balance an array of related and unrelated properties required for administration to patients. Inevitability, during optimization, some of these multiple factors will conflict. Although informatics has a crucial role in
Probing water motion in heterogenous systems : a multi-parameter NMR approach
van Dusschoten, D.
1996-01-01
In this Thesis a practical approach is presented to study water mobility in heterogeneous systems by a number of novel NMR sequences. The major part of this Thesis describes how the reliability of diffusion measurements can be improved using some of the novel NMR sequences. The
A Highly Integrated Multi-Parameter Distributed Fiber-Optic Instrumentation System, Phase II
National Aeronautics and Space Administration — In the future, exploration missions will benefit greatly from advanced metrology capabilities, particularly structural health monitoring systems that provide real...
Multi-parameter machine learning approach to the neuroanatomical basis of developmental dyslexia.
Płoński, Piotr; Gradkowski, Wojciech; Altarelli, Irene; Monzalvo, Karla; van Ermingen-Marbach, Muna; Grande, Marion; Heim, Stefan; Marchewka, Artur; Bogorodzki, Piotr; Ramus, Franck; Jednoróg, Katarzyna
2017-02-01
Despite decades of research, the anatomical abnormalities associated with developmental dyslexia are still not fully described. Studies have focused on between-group comparisons in which different neuroanatomical measures were generally explored in isolation, disregarding potential interactions between regions and measures. Here, for the first time, a multivariate classification approach was used to investigate grey matter disruptions in children with dyslexia in a large (N = 236) multisite sample. A variety of cortical morphological features, including volumetric (volume, thickness and area) and geometric (folding index and mean curvature) measures, were taken into account, and the generalizability of classification was assessed with both 10-fold and leave-one-out cross-validation (LOOCV) techniques. Classification into control vs. dyslexic subjects achieved above-chance accuracy (AUC = 0.66 and ACC = 0.65 in the case of 10-fold CV, and AUC = 0.65 and ACC = 0.64 using LOOCV) after principled feature selection. Features that discriminated between dyslexic and control children were exclusively situated in the left hemisphere, including the superior and middle temporal gyri, subparietal sulcus and prefrontal areas. They were related to geometric properties of the cortex, with generally higher mean curvature and a greater folding index characterizing the dyslexic group. Our results support the hypothesis that an atypical curvature pattern with extra folds in left-hemispheric perisylvian regions characterizes dyslexia. Hum Brain Mapp 38:900-908, 2017. © 2016 Wiley Periodicals, Inc.
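The study's actual feature selection and classifier are not given in the abstract. As a generic illustration of the k-fold cross-validation scheme it describes, here is a minimal sketch using a deliberately simple nearest-class-centroid classifier on synthetic data; the classifier, data, and dimensions are illustrative assumptions.

```python
import numpy as np

def kfold_indices(n, k, seed=0):
    """Shuffle sample indices and split them into k roughly equal folds."""
    idx = np.random.default_rng(seed).permutation(n)
    return np.array_split(idx, k)

def nearest_centroid_cv(X, y, k=10):
    """k-fold cross-validated accuracy: each fold is held out in turn,
    the classifier is fitted on the rest, and accuracies are averaged."""
    folds = kfold_indices(len(X), k)
    accs = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        # "Fit": one centroid per class, from training samples only.
        centroids = {c: X[train][y[train] == c].mean(axis=0)
                     for c in np.unique(y[train])}
        preds = [min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))
                 for x in X[test]]
        accs.append(np.mean(preds == y[test]))
    return float(np.mean(accs))

# Synthetic two-class data with a mean shift on a few stand-in "features".
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (60, 5)), rng.normal(1.5, 1, (60, 5))])
y = np.array([0] * 60 + [1] * 60)
acc = nearest_centroid_cv(X, y, k=10)
```

Setting k equal to the number of samples gives the LOOCV variant also reported in the abstract; LOOCV has lower bias but higher variance and cost than 10-fold CV.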
High-Speed, Noninvasive, Multi-Parameter Laser Diagnostics for Transonic Flows, Phase II
National Aeronautics and Space Administration — Numerous ground-test and wind-tunnel facilities are used extensively to make surface measurements of and to characterize the forces and moments encountered by...
International Nuclear Information System (INIS)
Bostock, J.; Weller, P.; Cooklin, M.
2010-01-01
Automated diagnostic algorithms are used in implantable cardioverter-defibrillators (ICDs) to detect abnormal heart rhythms. Algorithms misdiagnose, and improved specificity is needed to prevent inappropriate therapy. Knowledge engineering (KE) and artificial intelligence (AI) could improve this. A pilot study of KE was performed with an artificial neural network (ANN) as the AI system. A case-note review analysed arrhythmic events stored in patients' ICD memory; 13.2% of patients received inappropriate therapy. The best ICD algorithm had sensitivity 1.00 and specificity 0.69 (p<0.001 versus gold standard). A subset of data was used to train and test an ANN. A feed-forward, back-propagation network with 7 inputs, a 4-node hidden layer and 1 output had sensitivity 1.00 and specificity 0.71 (p<0.001). A prospective study was performed using KE to list arrhythmias, factors and indicators, for which measurable parameters were evaluated and the results reviewed by a domain expert. Waveforms from electrodes in the heart and thoracic bio-impedance, temperature and motion data were collected from 65 patients during cardiac electrophysiological studies. Five datasets were incomplete due to technical failures. We concluded that KE successfully guided the selection of parameters, that the ANN produced a usable system, and that complex data collection carries a greater risk of technical failure, leading to data loss.
Inhomogeneous Quantum Invariance Group of Multi-Dimensional Multi-parameter Deformed Boson Algebra
International Nuclear Information System (INIS)
Altintas, Azmi Ali; Arik, Metin; Arikan, Ali Serdar; Dil, Emre
2012-01-01
We investigate the inhomogeneous invariance quantum group of the d-dimensional d-parameter deformed boson algebra. It is found that the homogeneous part of this quantum group is given by the d-parameter deformed general linear group. We construct the R-matrix which collects all information about the non-commuting structure of the quantum group for the two-dimensional case. (general)
Multi-parameter studies of environmental aerosols with cascade track filters
International Nuclear Information System (INIS)
Ensinger, W.; Guo, S.-L.; Vater, P.; Brandt, R.
2005-01-01
Aerosols in the air of a factory processing nuclear reactor fuel material were collected using cascade Kapton track filters with pore sizes of 12.8, 4.0 and 1.0 μm in series, followed by a conventional glass-fiber filter to collect all remaining aerosol particles. The volume of air passed through the filters was measured by a flow meter. The weight of aerosol particles on each filter was obtained from the weight difference of the filter before and after collection. The α-activity on each filter was measured with a methane gas-flow proportional counter. The sizes and elemental compositions of aerosol particles on the filters were analyzed using a scanning electron microscope and an electron microprobe. Special attention was given to uranium aerosol particles. The median sizes of uranium aerosol particles were 1.97, 1.33 and 0.72 μm on the first, second and third filters, respectively. The median size of all uranium aerosol particles on the three track filters was 1.25 μm.
DEFF Research Database (Denmark)
Mathiesen, Brian Vad; Liu, Wen; Zhang, Xiliang
2014-01-01
three major technological changes: energy savings on the demand side, efficiency improvements in energy production, and the replacement of fossil fuels by various sources of renewable energy. Consequently, the analysis of these systems must include strategies for integrating renewable sources...
Nonlinear multi-analysis of agent-based financial market dynamics by epidemic system
Lu, Yunfan; Wang, Jun; Niu, Hongli
2015-10-01
Based on the epidemic dynamical system, we construct a new agent-based financial time series model. To check and verify its rationality, we compare the statistical properties of the time series model with those of real stock market indices, the Shanghai Stock Exchange Composite Index and the Shenzhen Stock Exchange Component Index. For analyzing the statistical properties, we combine multi-parameter analysis with tail distribution analysis, modified rescaled range analysis, and multifractal detrended fluctuation analysis. For a better perspective, three-dimensional diagrams are used to present the analysis results. The empirical research in this paper indicates that the long-range dependence property and the multifractal phenomenon exist in both the real returns and the proposed model. Therefore, the new agent-based financial model can reproduce some important features of real stock markets.
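One of the long-range-dependence tools the abstract mentions is rescaled range (R/S) analysis. A minimal sketch of the classical (textbook) R/S estimate of the Hurst exponent follows; the authors use a modified variant, so this is an illustration of the idea rather than their method.

```python
import numpy as np

def rs_hurst(series, min_chunk=8):
    """Estimate the Hurst exponent by classical rescaled-range analysis:
    H is the slope of log(R/S) versus log(window size)."""
    series = np.asarray(series, dtype=float)
    n = len(series)
    sizes = [s for s in 2 ** np.arange(3, int(np.log2(n))) if s >= min_chunk]
    log_s, log_rs = [], []
    for s in sizes:
        rs_vals = []
        for start in range(0, n - s + 1, s):
            chunk = series[start:start + s]
            dev = np.cumsum(chunk - chunk.mean())   # cumulative deviations
            r = dev.max() - dev.min()               # range of the deviations
            sd = chunk.std()
            if sd > 0:
                rs_vals.append(r / sd)              # rescaled range
        log_s.append(np.log(s))
        log_rs.append(np.log(np.mean(rs_vals)))
    return float(np.polyfit(log_s, log_rs, 1)[0])

# White-noise returns should give H near 0.5 (no long-range dependence);
# persistent series give H > 0.5, which is the signature sought in returns.
h = rs_hurst(np.random.default_rng(0).normal(size=4096))
```

Applied to the model's returns and to the real index returns, a common H materially above 0.5 would support the long-range-dependence claim made in the abstract.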
LikelihoodLib - Fitting, Function Maximization, and Numerical Analysis
Smirnov, I B
2001-01-01
A new class library is designed for function maximization, minimization, solution of equations and other problems related to the mathematical analysis of multi-parameter functions by numerical iterative methods. When we search for the maximum or another special point of a function, we may change and fit all parameters simultaneously, sequentially, recursively, or by any combination of these methods. The discussion is focused on the first and most complicated method, although the others are also supported by the library. For this method we apply: control of precision by interval computations; the calculation of derivatives either by differential arithmetic, or by the method of finite differences with step lengths that suppress the influence of numerical noise; possible synchronization of the objective function calls with minimization of the number of iterations; and competitive application of various methods for step calculation, converging to the solution by many trajectories.
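The noise-aware step-length idea mentioned above can be illustrated with a standard numerical-analysis heuristic (not taken from the library itself): for a central difference, a step of order eps^(1/3) scaled by |x| roughly balances truncation error against floating-point round-off.

```python
import math

def central_diff(f, x, h=None):
    """Central finite difference with a step chosen to balance truncation
    error (which grows with h) against round-off noise (which grows as 1/h).
    The eps**(1/3) rule is a common textbook heuristic for central differences."""
    if h is None:
        eps = 2.0 ** -52                      # double-precision machine epsilon
        h = eps ** (1.0 / 3.0) * max(1.0, abs(x))
    return (f(x + h) - f(x - h)) / (2.0 * h)

# Example: d/dx sin(x) at x = 1 should be cos(1).
approx = central_diff(math.sin, 1.0)
exact = math.cos(1.0)
```

With a much smaller step the subtraction f(x+h) - f(x-h) loses significant digits to round-off; with a much larger one the O(h²) truncation term dominates, which is the noise-suppression trade-off the abstract alludes to.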
Carr, Bob; Knowles, John; Warren, Jeremy
2008-10-01
We describe the continuing development of a laser-based, light-scattering detector system capable of detecting and analysing liquid-borne nanoparticles. Using a finely focussed and specially configured laser beam to illuminate a suspension of nanoparticles in a small (250 µl) sample, and videoing the Brownian motion of each and every particle in the detection zone, should allow individual but simultaneous detection and measurement of particle size, scattered light intensity, electrophoretic mobility and, where applicable, shape asymmetry. This real-time, multi-parameter analysis capability offers the prospect of reagentlessly differentiating between different particle types within a complex sample of potentially high and variable background. Employing relatively low-powered (50-100 mW) laser diode modules and low-resolution CCD arrays, each component could be run off battery power, allowing distributed/remote or personal deployment. The voltages needed for electrophoresis measurements would be similarly low (e.g. 20 V, low current), and 30-second videos (exported at mobile/cell phone download speeds) could be analysed remotely. The potential of such low-cost technology as a field-deployable grid of remote, battery-powered and reagentless, multi-parameter sensors for use as trigger devices is discussed.
Optimal fatigue analysis of structures during complex loadings
Directory of Open Access Journals (Sweden)
Karaouni Habib
2016-01-01
Full Text Available A new framework for high-cycle fatigue analysis of metallic structures under complex multi-parameter loadings is developed here. It makes it possible to reduce the analysis to a 2-D window with a characterized one-parameter cyclic loading, thanks to an equivalence rule, relative to damage, between any two loadings. The simplified inelastic analysis introduced by J. Zarka [J. Zarka et al. 1990. A new approach in inelastic analysis of structures. CADLM] was used to find the limit state of the structure. New design rules for fatigue analysis utilizing automatic learning systems were successfully developed. A database was built by coupling numerical simulations and experimental results on several welded specimens, which are treated as general structures in the proposed approach. This was made possible by the introduction of an intelligent description of a general fatigue case based on current theories and models. A software package, FATPRO [M.I. Systems, FatPro, available at http://www.mzintsys.com/our_products_fatpro.html], based on this work has been developed at MZ Intelligent Systems.
Mandrup, Ole A; Lykkemark, Simon; Kristensen, Peter
2017-02-10
One of the hallmarks of cancer is sustained angiogenesis. Here, normal endothelial cells are activated, and their formation of new blood vessels leads to continued tumour growth. An improved patient condition is often observed when angiogenesis is prevented or normalized through targeting of these genomically stable endothelial cells. However, intracellular targets constitute a challenge in therapy, as the agents modulating these targets have to be delivered and internalized specifically by the endothelial cells. Selection of antibodies binding specifically to certain cell types is well established. It is nonetheless a challenge to ensure that the binding of antibodies to the target cell will mediate internalization. Previously, selection of such antibodies has been performed targeting cancer cell lines, most often using either monovalent display or polyvalent display. In this article, we describe selections that isolate internalizing antibodies by sequentially combining monovalent and polyvalent display using two types of helper phage, one which increases display valence and one which reduces background. One of the selected antibodies was found to mediate internalization into human endothelial cells, although our results confirm that the single-stranded nature of the DNA packaged into phage particles may limit applications aimed at targeting nucleic acids in mammalian cells.
Czech Academy of Sciences Publication Activity Database
Veselý, V.; Sobek, J.; Šestáková, L.; Frantík, P.; Seitl, Stanislav
2013-01-01
Roč. 7, č. 25 (2013), s. 69-78 ISSN 1971-8993 R&D Projects: GA ČR(CZ) GAP104/11/0833; GA ČR(CZ) GAP105/11/1551 Institutional support: RVO:68081723 Keywords : Near-crack tip fields * Williams series * higher-order terms * stress field approximation * wedge splitting test * fracture process zone Subject RIV: JL - Materials Fatigue, Friction Mechanics
International Nuclear Information System (INIS)
Akram, M.; Afzal, M.; Ashraf, M.
2011-01-01
The salt tolerance potential of a newly developed wheat genotype (N-9760: V3) was assessed by comparing it with a known salt-tolerant line (N-1073: V2) and a commercial cultivar (Inqlab: V1) using various growth parameters measured at the vegetative and maturity stages. The objectives were to determine its qualitative and quantitative tolerance status and the possible utilization of the new genotype, as well as to examine whether the parameters used to assess tolerance at the vegetative and maturity stages are affected differentially by various salinity levels. The experiment was conducted in pots using four salinity levels (EC 1.5, 5, 10 and 15 dS m⁻¹). Root and shoot length, root and shoot fresh and dry weight, number of leaves and leaf area were recorded at the vegetative stage, while plant height, number of tillers, spike length and grain yield per plant were recorded at the maturity stage. Fresh weight of shoots, fresh and dry weights of roots, plant height, number of productive tillers and grain yield were least affected in V3, while shoot length, shoot fresh weight, number of leaves, leaf area and spike length were least affected in V2 at EC 15 dS m⁻¹. Both genotypes appeared tolerant, but all the parameters studied at both stages were affected differentially by salinity levels and genotypes; hence, testing of every new genotype appears essential. (author)
Directory of Open Access Journals (Sweden)
Adimi Maryam
2012-01-01
Full Text Available A quantitative structure-activity relationship (QSAR) model has been produced for predicting the antagonist potency of biphenyl derivatives at human histamine H3 receptors. The molecular structures of the compounds are numerically represented by various kinds of molecular descriptors. The whole data set was divided into training and test sets. Genetic-algorithm-based multiple linear regression is used to select the most statistically effective descriptors. The final QSAR model (N = 24, R² = 0.916, F = 51.771, Q²LOO = 0.872, Q²LGO = 0.847, Q²BOOT = 0.857) was fully validated employing the leave-one-out (LOO) cross-validation approach, Fisher statistics (F), the Y-randomisation test, and predictions based on the test data set. The test set presented an external prediction power of R²test = 0.855. In conclusion, the QSAR model generated can be used as a valuable tool for designing similar groups of new antagonists of histamine H3 receptors.
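The Q²LOO statistic reported above can be sketched for an ordinary least-squares multiple linear regression: each compound is predicted by a model fitted on all the others, and the resulting PRESS is compared to the total variance. The synthetic descriptor matrix and coefficients below are illustrative assumptions, not the paper's data.

```python
import numpy as np

def q2_loo(X, y):
    """Leave-one-out cross-validated Q^2 for OLS multiple linear regression:
    Q^2 = 1 - PRESS / SS_total, where PRESS sums squared errors of each
    sample predicted from a model fitted without it."""
    n = len(y)
    press = 0.0
    for i in range(n):
        mask = np.arange(n) != i
        # Fit intercept + coefficients on the n-1 remaining samples.
        Xt = np.column_stack([np.ones(mask.sum()), X[mask]])
        beta, *_ = np.linalg.lstsq(Xt, y[mask], rcond=None)
        pred = np.concatenate([[1.0], X[i]]) @ beta
        press += (y[i] - pred) ** 2
    return 1.0 - press / ((y - y.mean()) ** 2).sum()

# Synthetic "descriptor" matrix: 24 compounds (as in the paper's N), 3
# hypothetical descriptors, and a linear response with mild noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(24, 3))
y = X @ np.array([1.0, -0.5, 0.3]) + 0.1 * rng.normal(size=24)
q2 = q2_loo(X, y)
```

Q² is always at most the fitted R² and is the better guard against overfitting, which is why the abstract reports it alongside R² and the Y-randomisation test.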
Fovet, O.; Humbert, G.; Dupas, R.; Gascuel-Odoux, C.; Gruau, G.; Jaffrezic, A.; Thelusma, G.; Faucheux, M.; Gilliet, N.; Hamon, Y.; Grimaldi, C.
2018-04-01
The response of stream chemistry to storms is of major interest for understanding the export of dissolved and particulate species from catchments. The related challenge is the identification of the active hydrological flow paths during these events and of the sources of the chemical elements for which these events are hot moments of export. An original four-year data set that combines high-frequency records of stream flow, turbidity, nitrate and dissolved organic carbon concentrations, and piezometric levels was used to characterize storm responses in a headwater agricultural catchment. The data set was used to test to what extent the shallow groundwater affects the variability of storm responses. A total of 177 events were described using a set of quantitative and functional descriptors related to precipitation, to stream and groundwater pre-event status and event dynamics, and to the relative dynamics between water quality parameters and flow via hysteresis indices. This approach led to the identification of different types of response for each water quality parameter, whose occurrence can be quantified and related to the seasonal functioning of the catchment. This study demonstrates that high-frequency records of water quality are uniquely able to reveal the variability of catchment storm responses.
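Hysteresis indices of the kind mentioned above summarize how concentration on the rising limb of a storm hydrograph differs from the falling limb at matched discharge levels. Below is a minimal sketch of one common normalized variant on a synthetic clockwise event; it is an illustration of the idea, not necessarily the authors' exact formulation.

```python
import numpy as np

def hysteresis_index(q, c):
    """q, c: discharge and concentration time series over one event."""
    q = np.asarray(q, float); c = np.asarray(c, float)
    peak = int(np.argmax(q))
    # Normalise both variables to [0, 1] over the event.
    qn = (q - q.min()) / (q.max() - q.min())
    cn = (c - c.min()) / (c.max() - c.min())
    levels = np.linspace(0.1, 0.9, 9)
    # Interpolate concentration at matched discharge levels on each limb.
    c_rise = np.interp(levels, qn[:peak + 1], cn[:peak + 1])
    c_fall = np.interp(levels, qn[peak:][::-1], cn[peak:][::-1])
    return float(np.mean(c_rise - c_fall))   # > 0: clockwise loop

# Synthetic clockwise event: concentration peaks before discharge does.
t = np.linspace(0.0, 1.0, 101)
q = np.exp(-((t - 0.50) / 0.15) ** 2)
c = np.exp(-((t - 0.35) / 0.15) ** 2)
hi = hysteresis_index(q, c)
print(f"HI = {hi:.2f}")
```

A positive index indicates a clockwise concentration-discharge loop (e.g. rapid flushing of a near-stream source); a negative index an anticlockwise loop (e.g. delayed groundwater contribution).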
Applications of noise analysis to nuclear safety
International Nuclear Information System (INIS)
Aguilar Martinez, Omar
2000-01-01
Noise analysis techniques (analysis of the fluctuations of physical parameters) have been successfully applied to the operational surveillance of technical equipment that plays a decisive role in the production cycle of very complex industries. Although fluctuation measurements in nuclear installations started almost at the beginning of the nuclear era (see the works by Feynman and Rossi on the development of neutron methodology), only recently have neutron noise diagnostic applications begun to be part of the standard operating procedures of some modern nuclear installations. Following the relevant technical advances in information sciences and analog electronics, measuring the fluctuations of physical parameters has become a very effective tool for detecting, tracking and following up possible defects in a nuclear system. As the processing techniques for the fluctuations of a nuclear reactor's neutron-physical parameters have evolved (time- and frequency-domain analysis, multi-parameter autoregressive analysis, etc.), applications of the theory of non-linear dynamics and chaos theory have progressed by approaching the problem from another perspective. This work reports on those nuclear applications of noise analysis that increase nuclear safety in all types of nuclear facilities and that have been carried out by the author over the last decade, such as: -Zero-power critical assembly applications (Zero Power Reactor, Central Institute of Physical Research, Budapest, Hungary); -Research reactor applications (Triga Mark III Reactor, National Institute of Nuclear Research, ININ, Mexico); -Power reactor applications in a nuclear power plant (first circuit of Block II, Paks Nuclear Center, Hungary); -Secondary loop applications in a nuclear power plant (Block I, Paks Nuclear Center, Hungary; Block II, Kalinin Nuclear Center, Russia); -Shielding system applications for the transport of radioisotopes (Nuclear Technology Center, Havana, Cuba). New trends in
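The basic frequency-domain step underlying many of these applications is estimating the power spectral density of a fluctuating detector signal, so that a resonance or anomaly appears as a spectral peak. Below is a minimal averaged-periodogram (Welch-style) sketch on synthetic data; the 7 Hz component and all numbers are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
fs = 100.0                                  # sampling frequency, Hz
t = np.arange(0.0, 200.0, 1.0 / fs)
# White detector noise plus a weak 7 Hz vibration-like component.
x = rng.normal(size=t.size) + 0.5 * np.sin(2 * np.pi * 7.0 * t)

seg = 1024                                  # samples per segment
nseg = x.size // seg
window = np.hanning(seg)
psd = np.zeros(seg // 2 + 1)
for k in range(nseg):                       # average periodograms
    s = x[k * seg:(k + 1) * seg] * window
    psd += np.abs(np.fft.rfft(s)) ** 2 / (fs * np.sum(window ** 2))
psd /= nseg

freqs = np.fft.rfftfreq(seg, d=1.0 / fs)
peak = freqs[np.argmax(psd[1:]) + 1]        # skip the DC bin
print(f"dominant fluctuation at ~{peak:.1f} Hz")
```

Averaging over segments trades frequency resolution for variance reduction, which is what makes weak deterministic components stand out against the noise floor.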
International Nuclear Information System (INIS)
Elfman, Mikael; Ros, Linus; Kristiansson, Per; Nilsson, E.J. Charlotta; Pallon, Jan
2016-01-01
With the recent advances towards modern Ion Beam Analysis (IBA), going from one- or few-parameter detector systems to multi-parameter systems, it has been necessary to expand and replace the more than twenty-year-old CAMAC-based system. A new VME multi-parameter (presently up to 200 channels) data acquisition and control system has been developed and implemented at the Lund Ion Beam Analysis Facility (LIBAF). The system is based on the VX-511 Single Board Computer (SBC), acting as master with arbiter functionality, and consists of standard VME modules such as Analog-to-Digital Converters (ADCs), Charge-to-Digital Converters (QDCs), Time-to-Digital Converters (TDCs), scalers, I/O cards, and high-voltage and waveform units. The modules have been specially selected to support all of the present detector systems in the laboratory, with the option of future expansion. Typically, the detector systems consist of silicon strip detectors, silicon drift detectors and scintillator detectors, for detection of charged particles, X-rays and γ-rays. The flow of the raw data buffers out of the VME bus to their final storage place on a 16-terabyte network-attached storage disc (NAS disc) is described. The acquisition process, remotely controlled over one of the SBC's Ethernet channels, is also discussed. The user interface is written in the Kmax software package and is used to control the acquisition process as well as for advanced online and offline data analysis through a user-friendly graphical user interface (GUI). In this work the system implementation, layout and performance are presented. The user interface and possibilities for advanced offline analysis are also discussed and illustrated.
Energy Technology Data Exchange (ETDEWEB)
Elfman, Mikael, E-mail: Mikael.Elfman@nuclear.lu.se; Ros, Linus; Kristiansson, Per; Nilsson, E.J. Charlotta; Pallon, Jan
2016-03-15
Progress in Analysis to Remote Sensed Thermal Abnormity with Fault Activity and Seismogenic Process
Directory of Open Access Journals (Sweden)
WU Lixin
2017-10-01
Full Text Available Research on remotely sensed thermal anomalies associated with fault activity and seismogenic processes is a vital topic of Earth observation and remote sensing application. A systematic review is presented of the international research on this topic during the past 30 years, in the respects of remote sensing data applications, anomaly analysis methods, and mechanism understanding. Firstly, the outlines of remote sensing data applications are given, including infrared brightness temperature, microwave brightness temperature, outgoing longwave radiation, and assimilated data from multiple Earth observations. Secondly, three development phases are summarized: qualitative analysis based on visual interpretation, quantitative analysis based on image processing, and multi-parameter spatio-temporal correlation analysis. Thirdly, the theoretical hypotheses presented for the mechanism understanding are introduced, including Earth degassing, stress-induced heat, crustal rock battery conversion, and latent heat release due to radon decay, as well as multi-sphere coupling effects. Finally, three key directions of future research on this topic are proposed: anomaly recognition by remote sensing monitoring and data analysis for typical tectonic activity areas; anomaly mechanism understanding based on earthquake-related Earth system responses; and spatio-temporal correlation analysis of air-based, space-based and ground-based stereoscopic observations.
Cytobank: providing an analytics platform for community cytometry data analysis and collaboration.
Chen, Tiffany J; Kotecha, Nikesh
2014-01-01
Cytometry is used extensively in clinical and laboratory settings to diagnose and track cell subsets in blood and tissue. High-throughput, single-cell approaches leveraging cytometry are developed and applied in the computational and systems biology communities by researchers, who seek to improve the diagnosis of human diseases, map the structures of cell signaling networks, and identify new cell types. Data analysis and management present a bottleneck in the flow of knowledge from bench to clinic. Multi-parameter flow and mass cytometry enable identification of signaling profiles of patient cell samples. Currently, this process is manual, requiring hours of work to summarize multi-dimensional data and translate these data for input into other analysis programs. In addition, the increase in the number and size of collaborative cytometry studies as well as the computational complexity of analytical tools require the ability to assemble sufficient and appropriately configured computing capacity on demand. There is a critical need for platforms that can be used by both clinical and basic researchers who routinely rely on cytometry. Recent advances provide a unique opportunity to facilitate collaboration and analysis and management of cytometry data. Specifically, advances in cloud computing and virtualization are enabling efficient use of large computing resources for analysis and backup. An example is Cytobank, a platform that allows researchers to annotate, analyze, and share results along with the underlying single-cell data.
Leka, K. D.; Barnes, Graham; Wagner, Eric
2018-04-01
A classification infrastructure built upon Discriminant Analysis (DA) has been developed at NorthWest Research Associates for examining the statistical differences between samples of two known populations. Originating to examine the physical differences between flare-quiet and flare-imminent solar active regions, we describe herein some details of the infrastructure including: parametrization of large datasets, schemes for handling "null" and "bad" data in multi-parameter analysis, application of non-parametric multi-dimensional DA, an extension through Bayes' theorem to probabilistic classification, and methods invoked for evaluating classifier success. The classifier infrastructure is applicable to a wide range of scientific questions in solar physics. We demonstrate its application to the question of distinguishing flare-imminent from flare-quiet solar active regions, updating results from the original publications that were based on different data and much smaller sample sizes. Finally, as a demonstration of "Research to Operations" efforts in the space-weather forecasting context, we present the Discriminant Analysis Flare Forecasting System (DAFFS), a near-real-time operationally-running solar flare forecasting tool that was developed from the research-directed infrastructure.
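The extension of discriminant analysis to probabilistic classification via Bayes' theorem can be sketched in a few lines. The Gaussian class densities and all numbers below are illustrative assumptions: the operational infrastructure described above uses non-parametric multi-dimensional density estimates, but the Bayes step is the same.

```python
import numpy as np

# Synthetic two-population sample: flare-quiet vs. flare-imminent regions,
# each described by two made-up parameters.
rng = np.random.default_rng(1)
quiet = rng.normal(loc=0.0, scale=1.0, size=(500, 2))
imminent = rng.normal(loc=2.0, scale=1.0, size=(50, 2))

def gaussian_pdf(x, mean, cov):
    d = x - mean
    norm = 1.0 / np.sqrt((2 * np.pi) ** len(mean) * np.linalg.det(cov))
    return norm * np.exp(-0.5 * d @ np.linalg.inv(cov) @ d)

def flare_probability(x, pop0, pop1):
    # Bayes' theorem with class priors taken from the sample rates.
    n0, n1 = len(pop0), len(pop1)
    prior0, prior1 = n0 / (n0 + n1), n1 / (n0 + n1)
    l0 = gaussian_pdf(x, pop0.mean(0), np.cov(pop0.T)) * prior0
    l1 = gaussian_pdf(x, pop1.mean(0), np.cov(pop1.T)) * prior1
    return l1 / (l0 + l1)

p_hi = flare_probability(np.array([2.5, 2.5]), quiet, imminent)
p_lo = flare_probability(np.array([-1.0, -1.0]), quiet, imminent)
print(round(p_hi, 3), round(p_lo, 3))
```

Because the priors come from the (imbalanced) sample rates, a point must lie well inside the minority population before its posterior exceeds 0.5, which mirrors the low base rate of flare-imminent regions.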
Energy Technology Data Exchange (ETDEWEB)
Various, Authors
1981-05-01
In order to control pollutants resulting from energy production and utilization, adequate methods are required for monitoring the levels of various substances often present at low concentrations. The Energy and Environment Division Applied Research in Laser Spectroscopy & Analytical Techniques Program is directed toward meeting these needs. Emphasis is on the development of physical methods, as opposed to conventional chemical analysis techniques. The advantages, now widely recognized, include ultra-high sensitivity coupled with minimal sample preparation. In some instances physical methods provide multi-parameter measurements which often provide the only means of achieving the sensitivity necessary for the detection of trace contaminants. Work is reported in these areas: APPLIED PHYSICS AND LASER SPECTROSCOPY RESEARCH; MICROPROCESSOR-CONTROLLED ANODIC STRIPPING VOLTAMMETER FOR TRACE METALS ANALYSIS IN WATER; THE SURVEY OF INSTRUMENTATION FOR ENVIRONMENTAL MONITORING; THE POSSIBLE CHONDRITIC NATURE OF THE DANISH CRETACEOUS-TERTIARY BOUNDARY; IMPROVEMENT OF THE SENSITIVITY AND PRECISION OF NEUTRON ACTIVATION ANALYSIS OF SOME ELEMENTS IN PLANKTON AND PLANKTONIC FISH; and SOURCES OF SOME SECONDARILY WORKED OBSIDIAN ARTIFACTS FROM TIKAL, GUATEMALA.
Djebbi, Ramzi; Plessix, René -É douard; Alkhalifah, Tariq Ali
2016-01-01
In anisotropic media, several parameters govern the propagation of the compressional waves. To correctly invert surface recorded seismic data in anisotropic media, a multi-parameter inversion is required. However, a tradeoff between parameters
Zhu, Lingyun; Li, Lianjie; Meng, Chunyan
2014-12-01
There have been problems in existing real-time monitoring systems for multiple physiological parameters, such as insufficient server capacity for physiological data storage and analysis (so that data consistency cannot be guaranteed), poor real-time performance, and other issues caused by the growing scale of the data. We therefore proposed a new solution for multiple physiological parameters, with clustered back-end data storage and processing based on cloud computing. Through our studies, batch processing for longitudinal analysis of patients' historical data was introduced. The process included resource virtualization at the IaaS layer of the cloud platform, construction of a real-time computing platform at the PaaS layer, reception and analysis of data streams at the SaaS layer, and the bottleneck problem of multi-parameter data transmission. The result was to achieve real-time transmission, storage and analysis of a large amount of physiological data. The simulation test results showed that the remote multiple-physiological-parameter monitoring system based on the cloud platform had obvious advantages in processing time and load balancing over the traditional server model. This architecture solved problems that exist in traditional remote medical services, including long turnaround time, poor real-time analysis performance, and lack of extensibility. Technical support was thus provided to facilitate a "wearable wireless sensor plus mobile wireless transmission plus cloud computing service" mode moving towards home health monitoring with wireless monitoring of multiple physiological parameters.
Rapid determination of long-lived artificial alpha radionuclides using time interval analysis
International Nuclear Information System (INIS)
Uezu, Yasuhiro; Koarashi, Jun; Sanada, Yukihisa; Hashimoto, Tetsuo
2003-01-01
It is important to monitor long-lived alpha radionuclides such as plutonium (²³⁸Pu, ²³⁹⁺²⁴⁰Pu) in the working areas and environment of nuclear fuel cycle facilities, because it is well known that the potential cancer-causing risk from alpha radiation is higher than that from gamma radiation. Such monitoring therefore requires high sensitivity, high resolution and rapid determination in order to measure very low-level concentrations of plutonium isotopes. In such high-sensitivity monitoring, natural radionuclides, including radon (²²²Rn or ²²⁰Rn) and their progenies, should be eliminated as far as possible. In this situation, a sophisticated method for discriminating between Pu and the progenies of ²²²Rn or ²²⁰Rn using time interval analysis (TIA), which subtracts short-lived radionuclides by calculating the time interval distributions of successive alpha and beta decay events on millisecond or microsecond scales, was designed and developed. In this system, alpha rays from ²¹⁴Po, ²¹⁶Po and ²¹²Po are extractable. The TIA measuring system comprises a Silicon Surface Barrier Detector (SSD), an amplifier, an Analog-to-Digital Converter (ADC), a Multi-Channel Analyzer (MCA), a high-resolution timer (TIMER), a multi-parameter collector and a personal computer. From the ADC, incident alpha and beta pulses are sent to the MCA and the TIMER simultaneously, and the pulses are synthesized by the multi-parameter collector. After measurement, natural radionuclides are subtracted. Airborne particles were collected on a membrane filter for 60 minutes at 100 L/min, and small Pu particles were added on its surface. Alpha and beta rays were measured and natural radionuclides were subtracted by TIA within 5 × 145 ms. As a result, the Pu hidden in the natural background could be recognized clearly. The lower limit of determination of ²³⁹Pu is calculated as 6×10⁻⁹ Bq/cm³. This level satisfies the derived air concentration (DAC) for ²³⁹Pu (8×10⁻⁹ Bq/cm³).
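The core of TIA is the rejection of successive decay pairs whose inter-event times are consistent with a short-lived daughter (e.g. the beta-alpha sequence ending in ²¹⁴Po). Below is a toy sketch of that windowing step; the event rates, the ~164 µs delay scale and the five-half-life window are illustrative assumptions, not the authors' exact settings.

```python
import numpy as np

rng = np.random.default_rng(2)

# Uncorrelated long-lived component (Pu + random background): Poisson arrivals.
t_random = np.cumsum(rng.exponential(scale=5.0, size=400))        # seconds
# Correlated beta-alpha pairs: each parent followed after ~164 us (mean delay).
parents = np.cumsum(rng.exponential(scale=20.0, size=100))
pairs = np.sort(np.concatenate(
    [parents, parents + rng.exponential(164e-6, 100)]))
events = np.sort(np.concatenate([t_random, pairs]))

window = 5 * 164e-6            # ~5 mean lives: catches >99% of true pairs
dt = np.diff(events)
# Flag both members of any successive pair closer than the window.
correlated = np.zeros(len(events), dtype=bool)
close = dt < window
correlated[:-1] |= close
correlated[1:] |= close

n_total = len(events)
n_longlived = int(np.sum(~correlated))
print(n_total, n_longlived)    # surviving events ~ the long-lived component
```

At realistic counting rates the probability of two uncorrelated events falling inside such a short window is tiny, which is why the subtraction isolates the long-lived alpha component so cleanly.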
Djebbi, Ramzi; Alkhalifah, Tariq Ali
2014-01-01
Multi-parameter inversion in anisotropic media suffers from the inherent trade-off between the anisotropic parameters, even under the acoustic assumption. Multi-component data, often acquired nowadays in ocean bottom acquisition and land data
Transient dynamic and modeling parameter sensitivity analysis of 1D solid oxide fuel cell model
International Nuclear Information System (INIS)
Huangfu, Yigeng; Gao, Fei; Abbas-Turki, Abdeljalil; Bouquain, David; Miraoui, Abdellatif
2013-01-01
Highlights: • A multiphysics, 1D, dynamic SOFC model is developed. • The presented model is validated experimentally in eight different operating conditions. • Electrochemical and thermal dynamic transient time expressions are given in explicit forms. • Parameter sensitivity is discussed for different semi-empirical parameters in the model. - Abstract: In this paper, a multiphysics solid oxide fuel cell (SOFC) dynamic model is developed using a one-dimensional (1D) modeling approach. The dynamic effects of double-layer capacitance on the electrochemical domain and of thermal capacity on the thermal domain are thoroughly considered. The 1D approach allows the model to predict the non-uniform distributions of current density, gas pressure and temperature in the SOFC during its operation. The developed model has been experimentally validated under different conditions of temperature and gas pressure. Based on the proposed model, explicit time constant expressions for the different dynamic phenomena in the SOFC have been given and discussed in detail. A parameter sensitivity study has also been performed and discussed using the statistical Multi-Parameter Sensitivity Analysis (MPSA) method, in order to investigate the impact of the parameters on the modeling accuracy.
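MPSA, as commonly formulated, samples the parameters at random, splits the runs into "acceptable" and "unacceptable" by objective value, and ranks each parameter by how far apart the two sub-distributions are. Below is a minimal sketch on a toy model (not the SOFC model); the median split and the KS-style distance are one common MPSA variant.

```python
import numpy as np

rng = np.random.default_rng(3)

def model(p1, p2, p3):
    # Toy response: strongly sensitive to p1, weakly to p2, not at all to p3.
    return 10.0 * p1 + 1.0 * p2 + 0.0 * p3

target = model(0.5, 0.5, 0.5)          # pretend this is the measured output
n = 2000
params = rng.uniform(0.0, 1.0, size=(n, 3))
error = np.abs(np.array([model(*p) for p in params]) - target)
acceptable = error < np.median(error)  # binary split of the Monte Carlo runs

def separation(values, mask):
    # Max distance between the empirical CDFs of the two groups (KS-style).
    grid = np.sort(values)
    cdf_a = np.searchsorted(np.sort(values[mask]), grid) / mask.sum()
    cdf_u = np.searchsorted(np.sort(values[~mask]), grid) / (~mask).sum()
    return float(np.max(np.abs(cdf_a - cdf_u)))

sens = [separation(params[:, j], acceptable) for j in range(3)]
print([round(s, 2) for s in sens])
```

A parameter whose acceptable and unacceptable sub-distributions coincide (small distance) has little influence on the output; a large distance marks a sensitive parameter.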
Hu, Shunren; Chen, Weimin; Liu, Lin; Gao, Xiaoxia
2010-03-01
A bridge structural health monitoring system is a typical multi-sensor measurement system, owing to the multiple bridge structure parameters collected from the monitoring sites on river-spanning bridges. The bridge structure monitored by multiple sensors is a single entity; when subjected to external action, different bridge structure parameters will respond differently. Therefore, the data acquired by the individual sensors contain innumerable correlation relations, whose complexity is determined by the complexity of the bridge structure. Traditionally, correlation analysis among monitoring sites has mainly been considered in terms of physical locations; unfortunately, this method is too simple to describe the correlation in detail. This paper analyzes the correlation among bridge monitoring sites according to the bridge structural data, defines the correlation of bridge monitoring sites and describes its several forms, and then integrates correlation theory from data mining and signal systems to establish a correlation model that describes the correlation among the bridge monitoring sites quantitatively. Finally, the Chongqing Mashangxi Yangtze River bridge health measurement system is taken as the research object for diagnosing sensor faults, and simulation results verify the effectiveness of the designed method and the theoretical discussion.
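A data-driven starting point for such a correlation model is the correlation matrix of the sensor channels themselves, rather than their physical locations. Below is a toy sketch; the channel structure is invented for illustration, and the fault-diagnosis idea is simply that a sensor whose correlations with its normally correlated neighbours collapse is a candidate fault.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1000
load = rng.normal(size=n)                         # common external action

# Two channels responding to the same structural action, one unrelated one.
site_a = 1.0 * load + 0.1 * rng.normal(size=n)    # e.g. mid-span deflection
site_b = 0.8 * load + 0.1 * rng.normal(size=n)    # e.g. strain, same girder
site_c = rng.normal(size=n)                       # unrelated channel

R = np.corrcoef(np.vstack([site_a, site_b, site_c]))
print(np.round(R, 2))
```

In a monitoring system, R would be estimated over a healthy baseline period and then tracked; a significant drop in an expected off-diagonal entry flags the corresponding sensor pair for inspection.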
Microfabricated Electrochemical Cell-Based Biosensors for Analysis of Living Cells In Vitro
Directory of Open Access Journals (Sweden)
Jun Wang
2012-04-01
Full Text Available Cellular biochemical parameters can be used to reveal the physiological and functional information of various cells. Due to their demonstrated high accuracy and non-invasiveness, electrochemical detection methods have been used for cell-based investigation. When combined with improved biosensor design and advanced measurement systems, on-line biochemical analysis of living cells in vitro has been applied to biological mechanism studies, drug screening and even environmental monitoring. In recent decades, new types of miniaturized electrochemical biosensors have emerged with the development of microfabrication technology. This review aims to give an overview of microfabricated electrochemical cell-based biosensors, such as microelectrode arrays (MEA), the electric cell-substrate impedance sensing (ECIS) technique, and the light-addressable potentiometric sensor (LAPS). The details of their working principles, measurement systems, and applications in cell monitoring are covered. Driven by the need for high-throughput and multi-parameter detection in biomedicine, development trends of electrochemical cell-based biosensors are also introduced, including newly developed integrated biosensors and the application of nanotechnology and microfluidic technology.
Metzger, Philip T.; Lane, John E.; Carilli, Robert A.; Long, Jason M.; Shawn, Kathy L.
2010-07-01
A method combining photogrammetry with ballistic analysis is demonstrated to identify flying debris in a rocket launch environment. Debris traveling near the STS-124 Space Shuttle was captured on cameras viewing the launch pad within the first few seconds after launch. One particular piece of debris caught the attention of investigators studying the release of flame trench fire bricks because its high trajectory could indicate a flight risk to the Space Shuttle. Digitized images from two pad perimeter high-speed 16-mm film cameras were processed using photogrammetry software based on a multi-parameter optimization technique. Reference points in the image were found from 3D CAD models of the launch pad and from surveyed points on the pad. The three-dimensional reference points were matched to the equivalent two-dimensional camera projections by optimizing the camera model parameters using a gradient search optimization technique. Using this method of solving the triangulation problem, the xyz position of the object's path relative to the reference point coordinate system was found for every set of synchronized images. This trajectory was then compared to a predicted trajectory while performing regression analysis on the ballistic coefficient and other parameters. This identified, with a high degree of confidence, the object's material density and thus its probable origin within the launch pad environment. Future extensions of this methodology may make it possible to diagnose the underlying causes of debris-releasing events in near-real time, thus improving flight safety.
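The triangulation step described above, recovering the xyz position from synchronized camera views, reduces to a small least-squares problem once each camera's ray toward the object is known from the optimized camera model. Below is a minimal sketch of that step alone; the camera positions and debris point are invented, and the gradient-search camera calibration itself is omitted.

```python
import numpy as np

def triangulate(origins, directions):
    """Least-squares 3D point closest to a set of rays.

    origins, directions: (n_cameras, 3) arrays; directions unit-length.
    Minimises the summed squared perpendicular distance to each ray.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        # Projector onto the plane perpendicular to the ray direction d.
        P = np.eye(3) - np.outer(d, d)
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

true_point = np.array([10.0, 5.0, 80.0])          # debris position (metres)
cams = np.array([[0.0, 0.0, 0.0], [300.0, 0.0, 0.0]])
rays = np.array([(true_point - c) / np.linalg.norm(true_point - c)
                 for c in cams])

est = triangulate(cams, rays)
print(np.round(est, 3))   # recovers the true point for noise-free rays
```

With noisy image measurements the same normal equations give the point minimizing the summed squared distance to the (now slightly skew) rays, one such solve per set of synchronized frames along the trajectory.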
Ponomarev, Yury K.
2018-01-01
The mathematical model of deformation of a cable (rope) vibration insulator consisting of two identical clips connected by elastic elements with a complex axial line is developed in detail. The axial line of the element is symmetric relative to the horizontal axis of the shape and is made up of five rectilinear sections of arbitrary lengths a, b, c, joined to four radius sections with parameters R1 and R2 of 90° angular extent. On the basis of linear representations of the theory of bending and torsion from mechanics of materials, applied mechanics and linear algebra, a mathematical model of the loading of an element, and of the vibration insulator as a whole, in the direction of the vertical Y axis has been developed. Generalized characteristics of the friction and elastic forces are obtained for an elastic element with the complete set of the listed sections. Further, by zeroing the characteristics of certain parameters in the generalized model, special cases of the friction and elastic forces are obtained without taking the zeroed parameters into account. Simultaneously, volumetric models of the simplified structures given in the work were created on the basis of a 3D computer-aided design system. It is shown that, by varying the five parameters of the axial scheme of the element, in combination with varying the moment of inertia of the rope section and the number of elements in the ensemble, the load characteristics and stiffness of the vibration insulators can be changed by factors of tens or hundreds. This opens up broad possibilities for the optimal design of vibration protection systems in terms of weight, cost, vibration intensity and overall dimensions in different directions, which is very important for aerospace and transport engineering.
Childs, Charmaine; Wang, Li; Neoh, Boon Kwee; Goh, Hok Liok; Zu, Mya Myint; Aung, Phyo Wai; Yeo, Tseng Tsai
2014-10-01
The objective was to investigate sensor measurement uncertainty for intracerebral probes inserted during neurosurgery and remaining in situ during neurocritical care. This describes a prospective observational study of two sensor types, including the performance of the complete sensor-bedside monitoring and readout system. Sensors from 16 patients with severe traumatic brain injury (TBI) were obtained at the time of removal from the brain. When tested, 40% of sensors achieved the manufacturer's temperature specification of 0.1 °C. Pressure sensor calibration differed from the manufacturer's at all test pressures in 8/20 sensors. The largest pressure measurement error was in the intraparenchymal triple sensor. Measurement uncertainty is not influenced by duration in situ. User experiences reveal problems with sensor 'handling', alarms and firmware. Rigorous investigation of the performance of intracerebral sensors in the laboratory and at the bedside has established measurement uncertainty in the 'real world' setting of neurocritical care.
Cucchetti, E.; Eckart, M. E.; Peille, P.; Porter, F. S.; Pajot, F.; Pointecouteau, E.
2018-04-01
With its array of 3840 Transition Edge Sensors (TESs), the Athena X-ray Integral Field Unit (X-IFU) will provide spatially resolved high-resolution spectroscopy (2.5 eV up to 7 keV) from 0.2 to 12 keV, with an absolute energy scale accuracy of 0.4 eV. Slight changes in the TES operating environment can cause significant variations in its energy response function, which may result in systematic errors in the absolute energy scale. We plan to monitor such changes at pixel level via onboard X-ray calibration sources and correct the energy scale accordingly using a linear or quadratic interpolation of gain curves obtained during ground calibration. However, this may not be sufficient to meet the 0.4 eV accuracy required for the X-IFU. In this contribution, we introduce a new two-parameter gain correction technique, based on both the pulse-height estimate of a fiducial line and the baseline value of the pixels. Using gain functions that simulate ground calibration data, we show that this technique can accurately correct deviations in detector gain due to changes in TES operating conditions such as heat sink temperature, bias voltage, thermal radiation loading and linear amplifier gain. We also address potential optimisations of the onboard calibration source and compare the performance of this new technique with those previously used.
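The interpolation idea behind such a gain correction can be sketched compactly. The gain curves and the drift parameter below are invented illustrations; in the actual technique the detector's position between calibrated operating points is inferred from the pulse-height of the fiducial line together with the pixel baseline, which is the second parameter the text refers to.

```python
import numpy as np

# Ground calibration (made-up numbers): energy-vs-pulse-height gain curves
# recorded at two bracketing operating points of the TES.
ph_grid = np.linspace(0.0, 1.0, 50)            # normalised pulse height
gain_cal = {
    0.0: 7000.0 * ph_grid ** 1.00,             # energies in eV
    1.0: 7000.0 * ph_grid ** 1.05,
}

def corrected_energy(ph, drift):
    """Interpolate the energy scale between the two calibration curves.

    drift in [0, 1] locates the current operating point between the two
    ground-calibration states (here assumed already estimated from the
    fiducial-line pulse height and the baseline).
    """
    e0 = np.interp(ph, ph_grid, gain_cal[0.0])
    e1 = np.interp(ph, ph_grid, gain_cal[1.0])
    return (1.0 - drift) * e0 + drift * e1

e = corrected_energy(0.8, 0.5)   # halfway between the calibrated states
print(round(float(e), 1))
```

Using two measured quantities to locate the operating point distinguishes drifts (e.g. heat-sink temperature vs. bias voltage) that move the fiducial line identically but shift the baseline differently, which a single-parameter correction cannot separate.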
Hübner, Lena; Pennings, Steven C; Zimmer, Martin
2015-08-01
Distinct habitats are often linked through fluxes of matter and migration of organisms. In particular, intertidal ecotones are prone to being influenced from both the marine and the terrestrial realms, but whether or not small-scale migration for feeding, sheltering or reproducing is detectable may depend on the parameter studied. Within the ecotone of an upper saltmarsh in the United States, we investigated the sex-specific movement of the semi-terrestrial crab Armases cinereum using an approach of determining multiple measures of across-ecotone migration. To this end, we determined food preference, digestive abilities (enzyme activities), bacterial hindgut communities (genetic fingerprint), and the trophic position of Armases and potential food sources (stable isotopes) of males versus females of different sub-habitats, namely high saltmarsh and coastal forest. Daily observations showed that Armases moved frequently between high-intertidal (saltmarsh) and terrestrial (forest) habitats. Males were encountered more often in the forest habitat, whilst gravid females tended to be more abundant in the marsh habitat but moved more frequently. Food preference was driven by both sex and habitat. The needlerush Juncus was preferred over three other high-marsh detrital food sources, and the periwinkle Littoraria was the preferred prey of male (but not female) crabs from the forest habitats; both male and female crabs from marsh habitat preferred the fiddler crab Uca over three other prey items. In the field, the major food sources were clearly vegetal, but males have a higher trophic position than females. In contrast to food preference, isotope data excluded Uca and Littoraria as major food sources, except for males from the forest, and suggested that Armases consumes a mix of C4 and C3 plants along with animal prey. Digestive enzyme activities differed significantly between sexes and habitats and were higher in females and in marsh crabs. 
The bacterial hindgut community differed significantly between sexes, but habitat effects were greater than sex effects. By combining multiple measures of feeding ecology, we demonstrate that Armases exhibits sex-specific habitat choice and food preference. By using both coastal forest and saltmarsh habitats, but feeding predominantly in the latter, they possibly act as a key biotic vector of spatial subsidies across habitat borders. The degree of contributing to fluxes of matter, nutrients and energy, however, depends on their sex, indicating that changes in population structure would likely have profound effects on ecosystem connectivity and functioning.
Ghionis, George; Alexandrakis, George; Karditsa, Aikaterini; Sifnioti, Dafni; Vousdoukas, Michalis; Andreadis, Olympos; Petrakis, Stelios; Poulos, Serafim; Velegrakis, Adonis; Kampanis, Nikolaos; Lipakis, Michalis
2014-05-01
The AKTAIA project aims at the production of new knowledge regarding the forms of manifestation of the climate change and its influence on the stability and evolution of the coastal landforms along the shoreline of eastern Crete (approximate length: 757 km), taking into account the various aspects of human intervention. Aerial photographs, satellite images and orthophotomaps have been used to produce a detailed coastline map and to study the morphological characteristics of the coastal zone of Eastern Crete. More than 100 beach zones have been visited during three field campaigns, which included geomorphological and human intervention mapping, topographic, meteorological and oceanographic measurements and sedimentological sampling and observations. In addition, two pilot sites (one in the north and one in the south part of Crete) are being monitored, via the installation of coastal video monitoring systems, shore-based meteorological stations and wave-tide recorders installed in the nearshore zone. Detailed seafloor mapping with the use of side scan sonar and scuba diving and bathymetric surveys were conducted in the two pilot sites. Meteorological and oceanographic data from all existing land-based meteorological stations, oceanographic buoys and the ERA-interim dataset are used to determine the wind and wave climate of each beach. The collected climatic, sedimentological and coastal environmental data are being integrated in a GIS database that will be used to forecast the climatic trends in the area of Crete for the next decades and to model the impact of the climatic change on the future evolution of the coastal zone. New methodologies for the continuous monitoring of land-sea interaction and for the quantification of the loss of sensitive coastal zones due to sea-level rise and a modified Coastal Vulnerability Index for a comparative evaluation of the vulnerability of the coasts are being developed. 
Numerical modelling of the nearshore hydrodynamics and the associated sediment transport and beach morphodynamics, calibrated with in situ data, is used to predict beach response and vulnerability under different climate change scenarios. Finally, the socio-economic impact of climate change on the coastal zone will be assessed, and a management protocol for the coastal zone and for the mitigation of climate change impacts will be developed. The ultimate aim of the project is to benefit society by providing current, high-quality information on the consequences of climate change, especially those related to sea-level rise, and on the available protection and mitigation measures. In addition, the technological product will help in the proper planning of the required actions and technical interventions, reducing the need for costly, incomplete and frequently redundant localized studies and the risk of unsuccessful interventions. Acknowledgements: The project is supported by the Action "Cooperation 2007-2013" (09SYN-31-711 "AKTAIA") of the Operational Program "Competitiveness and Entrepreneurship", co-funded by the European Regional Development Fund (ERDF) and the General Secretariat for Research and Technology (Hellenic Ministry of Education).
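The "modified Coastal Vulnerability Index" is not detailed in this abstract; as a rough sketch of the family of indices it builds on, the classic CVI combines ranked physical variables as the square root of their product divided by their count. The variable choice and ranks below are purely illustrative:

```python
from math import prod, sqrt

def coastal_vulnerability_index(ranks):
    """Classic CVI: sqrt(product of ranked variables / number of variables).
    Each rank is an integer from 1 (very low vulnerability) to 5 (very high)."""
    if not ranks:
        raise ValueError("at least one ranked variable is required")
    return sqrt(prod(ranks) / len(ranks))

# Illustrative ranks for six commonly used variables: geomorphology,
# shoreline change rate, coastal slope, relative sea-level rise,
# mean significant wave height, mean tide range.
cvi = coastal_vulnerability_index([4, 3, 5, 4, 2, 3])  # higher -> more vulnerable
```

Because the variables are multiplied, a single very high rank strongly raises the index, which is why such indices are typically reported in relative classes rather than as absolute values.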
Veres, D.; Bazin, L.; Landais, A.; Toyé Mahamadou Kele, H.; Lemieux-Dudon, B.; Parrenin, F.; Martinerie, P.; Blayo, E.; Blunier, T.; Capron, E.; Chappellaz, J.; Rasmussen, S. O.; Severi, M.; Svensson, A.; Vinther, B.; Wolff, E. W.
2013-08-01
The deep polar ice cores provide reference records commonly employed in global correlation of past climate events. However, temporal divergences reaching up to several thousand years (ka) exist between ice cores over the last climatic cycle. In this context, we are hereby introducing the Antarctic Ice Core Chronology 2012 (AICC2012), a new and coherent timescale developed for four Antarctic ice cores, namely Vostok, EPICA Dome C (EDC), EPICA Dronning Maud Land (EDML) and Talos Dome (TALDICE), alongside the Greenlandic NGRIP record. The AICC2012 timescale has been constructed using the Bayesian tool Datice (Lemieux-Dudon et al., 2010) that combines glaciological inputs and data constraints, including a wide range of relative and absolute gas and ice stratigraphic markers. We focus here on the last 120 ka, whereas the companion paper by Bazin et al. (2013) focuses on the interval 120-800 ka. Compared to previous timescales, AICC2012 presents an improved timing for the last glacial inception, respecting the glaciological constraints of all analyzed records. Moreover, with the addition of numerous new stratigraphic markers and improved calculation of the lock-in depth (LID) based on δ15N data employed as the Datice background scenario, the AICC2012 presents a slightly improved timing for the bipolar sequence of events over Marine Isotope Stage 3 associated with the seesaw mechanism, with maximum differences of about 600 yr with respect to the previous Datice-derived chronology of Lemieux-Dudon et al. (2010), hereafter denoted LD2010. Our improved scenario confirms the regional differences for the millennial scale variability over the last glacial period: while the EDC isotopic record (events of triangular shape) displays peaks roughly at the same time as the NGRIP abrupt isotopic increases, the EDML isotopic record (events characterized by broader peaks or even extended periods of high isotope values) reached the isotopic maximum several centuries before. 
It is expected that the future contribution of both other long ice core records and other types of chronological constraints to the Datice tool will lead to further refinements in the ice core chronologies beyond the AICC2012 chronology. For the time being however, we recommend that AICC2012 be used as the preferred chronology for the Vostok, EDC, EDML and TALDICE ice core records, both over the last glacial cycle (this study), and beyond (following Bazin et al., 2013). The ages for NGRIP in AICC2012 are virtually identical to those of GICC05 for the last 60.2 ka, whereas the ages beyond are independent of those in GICC05modelext (as in the construction of AICC2012, the GICC05modelext was included only via the background scenarios and not as age markers). As such, where issues of phasing between Antarctic records included in AICC2012 and NGRIP are involved, the NGRIP ages in AICC2012 should therefore be taken to avoid introducing false offsets. However for issues involving only Greenland ice cores, there is not yet a strong basis to recommend superseding GICC05modelext as the recommended age scale for Greenland ice cores.
Directory of Open Access Journals (Sweden)
D. Veres
2013-08-01
Full Text Available The deep polar ice cores provide reference records commonly employed in global correlation of past climate events. However, temporal divergences reaching up to several thousand years (ka) exist between ice cores over the last climatic cycle. In this context, we are hereby introducing the Antarctic Ice Core Chronology 2012 (AICC2012), a new and coherent timescale developed for four Antarctic ice cores, namely Vostok, EPICA Dome C (EDC), EPICA Dronning Maud Land (EDML) and Talos Dome (TALDICE), alongside the Greenlandic NGRIP record. The AICC2012 timescale has been constructed using the Bayesian tool Datice (Lemieux-Dudon et al., 2010) that combines glaciological inputs and data constraints, including a wide range of relative and absolute gas and ice stratigraphic markers. We focus here on the last 120 ka, whereas the companion paper by Bazin et al. (2013) focuses on the interval 120–800 ka. Compared to previous timescales, AICC2012 presents an improved timing for the last glacial inception, respecting the glaciological constraints of all analyzed records. Moreover, with the addition of numerous new stratigraphic markers and improved calculation of the lock-in depth (LID) based on δ15N data employed as the Datice background scenario, the AICC2012 presents a slightly improved timing for the bipolar sequence of events over Marine Isotope Stage 3 associated with the seesaw mechanism, with maximum differences of about 600 yr with respect to the previous Datice-derived chronology of Lemieux-Dudon et al. (2010), hereafter denoted LD2010. Our improved scenario confirms the regional differences for the millennial scale variability over the last glacial period: while the EDC isotopic record (events of triangular shape) displays peaks roughly at the same time as the NGRIP abrupt isotopic increases, the EDML isotopic record (events characterized by broader peaks or even extended periods of high isotope values) reached the isotopic maximum several centuries before.
It is expected that the future contribution of both other long ice core records and other types of chronological constraints to the Datice tool will lead to further refinements in the ice core chronologies beyond the AICC2012 chronology. For the time being however, we recommend that AICC2012 be used as the preferred chronology for the Vostok, EDC, EDML and TALDICE ice core records, both over the last glacial cycle (this study) and beyond (following Bazin et al., 2013). The ages for NGRIP in AICC2012 are virtually identical to those of GICC05 for the last 60.2 ka, whereas the ages beyond are independent of those in GICC05modelext (as in the construction of AICC2012, the GICC05modelext was included only via the background scenarios and not as age markers). As such, where issues of phasing between Antarctic records included in AICC2012 and NGRIP are involved, the NGRIP ages in AICC2012 should therefore be taken to avoid introducing false offsets. However, for issues involving only Greenland ice cores, there is not yet a strong basis to recommend superseding GICC05modelext as the recommended age scale for Greenland ice cores.
Ferrucci, F.; Tampellini, M.; Loughlin, S. C.; Tait, S.; Theys, N.; Valks, P.; Hirn, B.
2013-12-01
The EVOSS consortium of academic, industrial and institutional partners in Europe and Africa has created a satellite-based volcano observatory designed to support crisis management within the Global Monitoring for Environment and Security (GMES) framework of the European Commission. Data from 8 different payloads orbiting on 14 satellite platforms (SEVIRI on-board MSG-1, -2 and -3, MODIS on-board Terra and Aqua, GOME-2 and IASI on-board MetOp-A, OMI on-board Aura, COSMO-SkyMed /1, /2, /3 and /4, JAMI on-board MTSAT-1 and -2, and, until 8 April 2012, SCIAMACHY on-board ENVISAT), acquired at 5 different down-link stations, are disseminated to and automatically processed at 6 locations in 4 countries. The results are sent, in four separate geographic data streams (high-temperature thermal anomalies, daily volcanic sulfur dioxide fluxes, volcanic ash and ground deformation), to a central facility called the VVO, the 'Virtual Volcano Observatory'. The system has operated 24/7 since September 2011 on all volcanoes in Europe, Africa, the Lesser Antilles and the surrounding oceans, and during this interval has detected, measured and monitored all subaerial eruptions that occurred in this region (44 of 45 certified, with an overall detection and processing efficiency of ~97%). EVOSS-borne real-time information is delivered to a group of 14 qualified end users bearing direct or indirect responsibility for monitoring and managing volcano emergencies and for advising governments in the Comoros, DR Congo, Djibouti, Ethiopia, Montserrat, Uganda, Tanzania, France and Iceland. We present the full set of eruptions detected and monitored from 2004 to present by the multispectral SEVIRI payloads on-board the geostationary platforms of the MSG constellation, used for developing and fine-tuning the EVOSS system along with its real-time, pre- and post-processing automated algorithms.
The set includes 91% of subaerial eruptions that occurred at 15 volcanoes (Piton de la Fournaise, Karthala, Jebel al Tair, Erta Ale, Manda Hararo, Dalafilla, Nabro, Ol Doinyo Lengai, Nyamulagira, Nyiragongo, Etna, Stromboli, Eyjafjallajökull, Grímsvötn, Soufrière Hills) showing radiant fluxes above ~0.5 GW and/or SO2 columns in excess of ~6 DU. Porting of the automated thermal algorithms to MTSAT's JAMI (orbiting at 145°E) was developed on the 2006-2007 eruptions of Merapi, Semeru, Kliuchevskoi, Bezymianny and Shiveluch, calibrated on the frequent activity of Batu Tara, and demonstrated on the large 2012-2013 eruption of Tolbachik.
Pazos, Antonio; Martín Davila, José; Buforn, Elisa; Gárate Pasquín, Jorge; Catalán Morollón, Manuel; Hanka, Winfried; Udías, Agustín.; Benzzeghoud, Mourad; Harnafi, Mimoun
2010-05-01
The plate boundary between the Eurasian and African plates crosses the so-called "Ibero-Maghrebian" region from Cape St. Vincent (SW Portugal) to Tunisia, including the south of Iberia, the Alboran Sea, and northern Morocco and Algeria. In this area the slow convergence is accommodated over a wide and diffuse deformation zone characterized by significant and widespread moderate seismic activity [Buforn et al., 1995], with large earthquakes separated by long time intervals. For more than a hundred years the San Fernando Naval Observatory (ROA), in collaboration with other institutes, has deployed geophysical and geodetic equipment in the southern Spain - northwestern Africa area in order to study this broad deformation zone. Currently a broadband seismic network (Western Mediterranean, WM net) is deployed, in collaboration with other institutions, around the Gulf of Cádiz and the Alboran Sea, with stations in the south of Iberia and in North Africa (in Spain and Morocco); a permanent geodetic GPS network is co-installed at the same sites. Other geophysical instruments have also been installed: a Satellite Laser Ranging (SLR) station at the San Fernando Observatory headquarters, a geomagnetic observatory in the Cádiz Bay area and several meteorological stations. These networks have recently been improved with the deployment of a new submarine and on-land geophysical observatory on Alboran Island (ALBO Observatory), where a permanent GPS and a meteorological station were installed on land, and a permanent submarine observatory (with a broadband seismic sensor, a 3-component accelerometer and a differential pressure gauge, DPG) was deployed at 50 m depth last October. This work shows the present status and future plans of these networks, together with some results.
Wopereis, S.; Stroeve, J.H.M.; Stafleu, A.; Bakker, G.C.M.; Burggraaf, J.; Erk, M.J. van; Pellis, L.; Boessen, R.; Kardinaal, A.A.F.; Ommen, B. van
2017-01-01
Background: A key feature of metabolic health is the ability to adapt to dietary perturbations. Recently, it was shown that metabolic challenge tests in combination with new-generation biomarkers allow the simultaneous quantification of major metabolic health processes. Currently, applied
Energy Technology Data Exchange (ETDEWEB)
Varfolomeev, Mikhail A.; Rakipov, Ilnaz T.; Khachatrian, Artashes A. [Department of Physical Chemistry, Kazan Federal University, Kremlevskaya 18, Kazan 420008 (Russian Federation); Acree, William E., E-mail: acree@unt.edu [Department of Chemistry, 1155 Union Circle # 305070, University of North Texas, Denton, TX 76203-5017 (United States); Brumfield, Michela [Department of Chemistry, 1155 Union Circle # 305070, University of North Texas, Denton, TX 76203-5017 (United States); Abraham, Michael H. [Department of Chemistry, University College London, 20 Gordon Street, London WC1H 0AJ (United Kingdom)
2015-10-10
Graphical abstract: - Highlights: • Enthalpies of solution measured for 43 solutes dissolved in chlorobenzene. • Enthalpies of solution measured for 72 solutes dissolved in 1,2-dichlorobenzene. • Mathematical expressions derived for predicting enthalpies of solvation of solutes in chlorobenzene. • Mathematical expressions derived for predicting enthalpies of solvation of solutes in 1,2-dichlorobenzene. - Abstract: Enthalpies of solution at infinite dilution at 298 K, Δ_soln H(A/Solvent), have been measured by isothermal solution calorimetry for 43 and 72 organic solutes dissolved in chlorobenzene and 1,2-dichlorobenzene, respectively. The measured Δ_soln H(A/Solvent) data, along with values taken from the published literature for solutes dissolved in both chlorobenzene solvents, were converted to enthalpies of solvation, Δ_solv H(A/Solvent), using standard thermodynamic equations. Abraham model correlations were developed from the experimental Δ_solv H(A/Solvent) data. The best derived correlations describe the experimental gas-to-chlorobenzene and gas-to-1,2-dichlorobenzene enthalpies of solvation to within standard deviations of 1.5 kJ mol⁻¹ and 1.9 kJ mol⁻¹, respectively. Enthalpies of X-H…π (X = O, N and C) hydrogen-bond formation of proton-donor solutes (alcohols, amines, chlorinated hydrocarbons, etc.) with chlorobenzene and 1,2-dichlorobenzene were calculated based on the Abraham solvation equation. The obtained values are in good agreement with results determined using conventional methods.
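The Abraham model correlations referred to are multiple linear regressions of the form Δ_solv H = c + e·E + s·S + a·A + b·B + l·L over solute descriptors. A minimal least-squares sketch of such a fit, using synthetic descriptor values and enthalpies rather than the paper's measured data, might look like:

```python
import numpy as np

# Synthetic Abraham-model data: dHsolv = c + e*E + s*S + a*A + b*B + l*L.
# Descriptors and enthalpies are placeholders, not the measured values.
rng = np.random.default_rng(0)
n = 40
X = rng.uniform(0.0, 2.0, size=(n, 5))               # E, S, A, B, L descriptors
true_coef = np.array([-1.2, -6.8, -3.5, -1.4, -8.9])  # illustrative slopes
y = -5.0 + X @ true_coef + rng.normal(0.0, 0.5, n)    # kJ/mol, with noise

# Least-squares fit of the six Abraham coefficients (intercept + 5 slopes).
design = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(design, y, rcond=None)

# Standard deviation of the correlation, the figure of merit quoted above.
residuals = y - design @ coef
sd = residuals.std(ddof=6)
```

The standard deviations of 1.5 and 1.9 kJ mol⁻¹ quoted in the abstract correspond to `sd` computed for the real descriptor sets.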
DEFF Research Database (Denmark)
Holm, Jens Kai; Stelte, Wolfgang; Posselt, Dorthe
2011-01-01
Pelletization of biomass residues increases the energy density, reduces storage and transportation costs and results in a homogeneous product with well-defined physical properties. However, raw materials for fuel pellet production consist of ligno-cellulosic biomass from various resources...... and error” experiments and personal experience. However in recent years the utilization of single pellet press units for testing the biomass pelletizing properties has attracted more attention. The present study outlines an approach where single pellet press testing is combined with modeling to mimic...... the pelletizing behavior of new types of biomass in a large scale pellet mill. This enables a fast estimation of key process parameters such as optimal press channel length and moisture content. Secondly, the study addresses the question of the origin of the observed relationship between pelletizing pressure...
Kralik, Martin
2017-04-01
The application of nitrogen and oxygen isotopes in nitrate allows, under favourable circumstances, the identification of potential sources such as precipitation, chemical fertilisers, and manure or sewage water. Without an additional tracer, however, distinguishing nitrate from manure from nitrate from sewage water remains difficult, and even the application of boron isotopes cannot in all cases avoid ambiguous interpretation. The Environment Agency Austria therefore developed a new multi-parameter indicator test to allow the identification and quantification of pollution by domestic sewage water. The test analyses 8 substances well known to occur in sewage water: acesulfame and sucralose (two artificial, calorie-free sweeteners), benzotriazole and tolyltriazole (two industrial chemicals/corrosion inhibitors), and metoprolol, sotalol, carbamazepine and the metabolite 10,11-dihydro-10,11-dihydroxycarbamazepine (pharmaceuticals) [1]. These substances are polar, and their degradation in the aquatic system by microbiological processes is not documented. Because these 8 substances do not occur naturally, they make ideal tracers. The test can detect wastewater in the analysed water sample down to 0.1 %. Coupling these analytic tests helps to identify the nitrogen sources in the groundwater body Marchfeld, east of Vienna, to a high confidence level. In addition, the results allow a reasonable quantification of nitrogen sources from different types of fertilizers, as well as of sewage-water contributions close to villages and in wells recharged by bank filtration. Recent investigations of groundwater in selected wells in Marchfeld [2] indicated a clear nitrogen contribution by wastewater leakages (sewers or septic tanks) to the total nitrogen budget. This contribution is shrinking, however, and the main source remains agricultural activities.
[1] Humer, F.; Weiss, S.; Reinnicke, S.; Clara, M.; Grath, J.; Windhofer, G. (2013): Multi parametrical indicator test for urban wastewater influence. EGU General Assembly 2013, held 7-12 April 2013 in Vienna, Austria, id. EGU2013-5332. [2] Kralik, M.; Humer, F.; Grath, J. (2008): Pilotprojekt Grundwasseralter: Herkunftsanalyse von Nitrat mittels Stickstoff-, Sauerstoff-, Schwefel- und Kohlenstoffisotopen [Pilot project groundwater age: source analysis of nitrate using nitrogen, oxygen, sulfur and carbon isotopes]. 57 pp., Environment Agency Austria/Ministry of Agriculture, Forestry, Environment and Water Management, Vienna.
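A sketch of how such a tracer supports quantification: if a tracer like acesulfame is conservative and has known endmember concentrations, a two-endmember mixing calculation yields the wastewater share of a sample. The concentrations below are hypothetical, chosen only to illustrate the ~0.1 % detection level cited above:

```python
def wastewater_fraction(c_sample, c_wastewater, c_background=0.0):
    """Two-endmember mixing: fraction of wastewater in a sample, assuming
    the tracer is conservative (no degradation or sorption en route)."""
    if c_wastewater <= c_background:
        raise ValueError("endmember concentrations are not distinguishable")
    f = (c_sample - c_background) / (c_wastewater - c_background)
    return min(max(f, 0.0), 1.0)   # clip round-off / sampling noise

# Hypothetical acesulfame concentrations in ug/L: raw wastewater ~20,
# pristine groundwater ~0, sample at 0.02 -> ~0.1 % wastewater share.
f = wastewater_fraction(0.02, 20.0)
```

In practice, several tracers are evaluated together, and agreement between the resulting fractions strengthens the source attribution.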
Czech Academy of Sciences Publication Activity Database
Veselý, V.; Frantík, P.; Sopek, J.; Malíková, L.; Seitl, Stanislav
2015-01-01
Vol. 38, No. 2 (2015), pp. 200-214. ISSN 8756-758X. R&D Projects: GA ČR(CZ) GAP104/11/0833. Institutional support: RVO:68081723. Keywords: near-crack tip fields * Williams series * higher-order terms * stress field * failure criterion * nonlinear zone * quasi-brittle fracture * splitting-bending geometry. Subject RIV: JL - Materials Fatigue, Friction Mechanics. Impact factor: 1.838, year: 2015
Lott, Gus K; Johnson, Bruce R; Bonow, Robert H; Land, Bruce R; Hoy, Ronald R
2009-01-01
We present g-PRIME, a software-based tool for physiology data acquisition, analysis, and stimulus generation in education and research. The software was developed in an undergraduate neurophysiology course and strongly influenced by instructor and student feedback. g-PRIME is a free, stand-alone Windows application coded and "compiled" in Matlab (it does not require a Matlab license). It supports many data acquisition interfaces, from the PC sound card to expensive high-throughput calibrated equipment. The program is designed as a software oscilloscope with standard trigger modes, multi-channel visualization controls, and data logging features. Extensive analysis options allow real-time and offline filtering of signals, multi-parameter threshold-and-window based event detection, and two-dimensional display of a variety of parameters including event time, energy density, maximum FFT frequency component, max/min amplitudes, and inter-event rate and intervals. The software also correlates detected events with another simultaneously acquired source (event-triggered average) in real time or offline. g-PRIME supports parameter histogram production and a variety of elegant publication-quality graphics outputs. A major goal of this software is to merge powerful engineering acquisition and analysis tools with a biological approach to studies of nervous system function.
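Threshold-and-window event detection of the kind described can be sketched as follows; this is a generic illustration, not g-PRIME's actual implementation:

```python
import numpy as np

def detect_events(signal, fs, threshold, window_s):
    """Threshold-and-window event detection: an event starts where the
    signal crosses `threshold` upward; further crossings within
    `window_s` seconds are lumped into the same event (refractory window).
    Returns event start times in seconds."""
    above = signal >= threshold
    crossings = np.flatnonzero(above[1:] & ~above[:-1]) + 1  # upward crossings
    refractory = int(round(window_s * fs))
    events, last = [], -refractory - 1
    for idx in crossings:
        if idx - last > refractory:
            events.append(idx / fs)
            last = idx
    return events

# Synthetic spike train at 10 kHz: two spikes 5 ms apart merge into one
# event, a third spike well outside the 20 ms window is a separate event.
fs = 10_000.0
sig = np.zeros(1000)
sig[[100, 150, 600]] = 1.0
times = detect_events(sig, fs, threshold=0.5, window_s=0.02)
```

Detected event times can then feed the parameter displays (inter-event intervals, rates) or the event-triggered averaging described above.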
Li, Xuejian; Wang, Youqing
2016-12-01
Offline general-type models, developed from past datasets covering thousands of patients, are widely used for patient monitoring in intensive care units (ICUs). However, these models may fail to adapt to the changing states of ICU patients. To be more robust and effective, monitoring models should adapt to individual patients. A novel combination of just-in-time learning (JITL) and principal component analysis (PCA), referred to as learning-type PCA (L-PCA), was proposed for adaptive online monitoring of patients in ICUs. JITL was used to gather the most relevant data samples for adaptive modeling of complex physiological processes. PCA was used to build an online individual-type model, calculate monitoring statistics, and then judge whether the patient's status is normal. The adaptability of L-PCA lies in the use of individual data and the continuous updating of the training dataset. Twelve subjects were selected from PhysioBank's Multi-parameter Intelligent Monitoring for Intensive Care II (MIMIC II) database, and five vital signs of each subject were chosen. The proposed method was compared with traditional PCA and fast moving-window PCA (Fast MWPCA). The experimental results demonstrated that the fault detection rates increased by 20 % and 47 % compared with PCA and Fast MWPCA, respectively. L-PCA is introduced into ICU patient monitoring for the first time and achieves the best monitoring performance in terms of adaptability to changes in patient status and sensitivity for abnormality detection.
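A minimal sketch of the JITL-plus-PCA idea (not the authors' L-PCA code): for each new sample, fit a local PCA on the most similar historical samples and score the sample with its Hotelling T² statistic. The data and parameter choices are illustrative:

```python
import numpy as np

def jit_pca_t2(history, query, k=50, n_comp=2):
    """Just-in-time PCA monitoring sketch: select the k historical samples
    nearest to the query, fit PCA on that local set, and return the
    Hotelling T^2 statistic of the query under the local model."""
    d = np.linalg.norm(history - query, axis=1)
    local = history[np.argsort(d)[:k]]
    mu = local.mean(axis=0)
    Xc = local - mu
    # PCA via SVD of the centred local dataset.
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    var = (s ** 2) / (len(local) - 1)          # component variances
    scores = (query - mu) @ Vt[:n_comp].T      # project onto top components
    return float(np.sum(scores ** 2 / var[:n_comp]))

rng = np.random.default_rng(1)
history = rng.normal(size=(500, 5))            # 5 vital signs, 500 samples
normal_t2 = jit_pca_t2(history, history.mean(axis=0))   # typical state
faulty_t2 = jit_pca_t2(history, np.full(5, 6.0))        # far-off state
```

A sample would be flagged as abnormal when T² (and, in practice, the complementary SPE/Q statistic) exceeds a control limit derived from the local training set.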
International Nuclear Information System (INIS)
Vladimir Aizen; Donald Bren; Karl Kreutz; Cameron Wake
2001-01-01
While the majority of ice core investigations have been undertaken in the polar regions, a few ice cores recovered from carefully selected high altitude/mid-to-low latitude glaciers have also provided valuable records of climate variability in these regions. A regional array of high resolution, multi-parameter ice core records developed from temperate and tropical regions of the globe can be used to document regional climate and environmental change in the latitudes which are home to the vast majority of the Earth's human population. In addition, these records can be directly compared with ice core records available from the polar regions and can therefore expand our understanding of inter-hemispheric dynamics of past climate changes. The main objectives of our paleoclimate research in the Tien Shan mountains of middle Asia combine the development of detailed paleoenvironmental records via the physical and chemical analysis of ice cores with the analysis of modern meteorological and hydrological data. The first step in this research was the collection of ice cores from the accumulation zone of the Inylchek Glacier and the collection of meteorological data from a variety of stations throughout the Tien Shan. The research effort described in this report was part of a collaborative effort with the United States Geological Survey's (USGS) Global Environmental Research Program, which began studying radionuclide deposition in mid-latitude glaciers in 1995.
Energy Technology Data Exchange (ETDEWEB)
Vladimir Aizen; Donald Bren; Karl Kreutz; Cameron Wake
2001-05-30
While the majority of ice core investigations have been undertaken in the polar regions, a few ice cores recovered from carefully selected high altitude/mid-to-low latitude glaciers have also provided valuable records of climate variability in these regions. A regional array of high resolution, multi-parameter ice core records developed from temperate and tropical regions of the globe can be used to document regional climate and environmental change in the latitudes which are home to the vast majority of the Earth's human population. In addition, these records can be directly compared with ice core records available from the polar regions and can therefore expand our understanding of inter-hemispheric dynamics of past climate changes. The main objectives of our paleoclimate research in the Tien Shan mountains of middle Asia combine the development of detailed paleoenvironmental records via the physical and chemical analysis of ice cores with the analysis of modern meteorological and hydrological data. The first step in this research was the collection of ice cores from the accumulation zone of the Inylchek Glacier and the collection of meteorological data from a variety of stations throughout the Tien Shan. The research effort described in this report was part of a collaborative effort with the United States Geological Survey's (USGS) Global Environmental Research Program, which began studying radionuclide deposition in mid-latitude glaciers in 1995.
Tav4SB: integrating tools for analysis of kinetic models of biological systems.
Rybiński, Mikołaj; Lula, Michał; Banasik, Paweł; Lasota, Sławomir; Gambin, Anna
2012-04-05
Progress in the modeling of biological systems strongly relies on the availability of specialized computer-aided tools. To that end, the Taverna Workbench eases integration of software tools for life-science research and provides a common workflow-based framework for computational experiments in biology. The Taverna services for Systems Biology (Tav4SB) project provides a set of new Web service operations which extend the functionality of the Taverna Workbench in the domain of systems biology. Tav4SB operations allow the user to perform numerical simulations or model checking of, respectively, deterministic or stochastic semantics of biological models. On top of this functionality, Tav4SB enables the construction of high-level experiments. As an illustration of the possibilities offered by our project we apply multi-parameter sensitivity analysis. To visualize the results of model analysis, a flexible plotting operation is provided as well. Tav4SB operations are executed in a simple grid environment, integrating heterogeneous software such as Mathematica, PRISM and SBML ODE Solver. The user guide, contact information, full documentation of available Web service operations, workflows and other additional resources can be found at the Tav4SB project's Web page: http://bioputer.mimuw.edu.pl/tav4sb/. The Tav4SB Web service provides a set of integrated tools in a domain for which Web-based applications are still not as widely available as for other areas of computational biology. Moreover, we extend the dedicated hardware base for the computationally expensive task of simulating cellular models. Finally, we promote the standardization of models and experiments as well as the accessibility and usability of remote services.
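Multi-parameter sensitivity analysis of the kind the workflow applies can be illustrated with a simple one-at-a-time scheme on a toy kinetic model (logistic growth standing in for an SBML model; all names and values are illustrative, not Tav4SB's API):

```python
def simulate(r, K, x0=0.1, t_end=10.0, dt=0.01):
    """Euler integration of logistic growth dx/dt = r*x*(1 - x/K);
    returns the state at t_end."""
    x = x0
    for _ in range(int(round(t_end / dt))):
        x += dt * r * x * (1 - x / K)
    return x

def sensitivities(params, output, rel_step=0.01):
    """One-at-a-time multi-parameter sensitivity: relative change in the
    output per relative change in each parameter."""
    base = output(**params)
    result = {}
    for name, value in params.items():
        perturbed = dict(params, **{name: value * (1 + rel_step)})
        result[name] = ((output(**perturbed) - base) / base) / rel_step
    return result

s = sensitivities({"r": 1.0, "K": 2.0}, simulate)
```

At t = 10 the toy system has essentially saturated at the carrying capacity, so the output is sensitive to K (sensitivity near 1) and nearly insensitive to the growth rate r, which is the kind of ranking such an analysis exposes.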
Decision analysis multicriteria analysis
International Nuclear Information System (INIS)
Lombard, J.
1986-09-01
The ALARA procedure covers a wide range of decisions, from the simplest to the most complex. For the simplest, engineering judgement is generally enough, and the use of a decision-aiding technique is therefore not necessary. For some decisions, the comparison of the available protection options may be performed using two or a few criteria (or attributes) (protection cost, collective dose, ...), and the use of rather simple decision-aiding techniques, like cost-effectiveness analysis or cost-benefit analysis, is quite enough. For more complex decisions, involving numerous criteria, or for decisions involving large uncertainties or qualitative judgement, the use of these techniques, even extended cost-benefit analysis, is not recommended, and appropriate techniques like multi-attribute decision-aiding techniques are more relevant. There are many such techniques and it is not possible to present all of them. Therefore only two broad categories of multi-attribute decision-aiding techniques are presented here: decision analysis and outranking analysis.
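As a minimal illustration of the multi-attribute idea (a simple additive weighting scheme, not one of the specific techniques the report presents; criteria, weights and scores are hypothetical):

```python
def weighted_score(option, weights):
    """Simple additive multi-attribute value: each attribute is scored on a
    normalised 0-1 scale (higher is better) and combined with importance
    weights that sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[a] * option[a] for a in weights)

# Hypothetical comparison of two protection options on three criteria.
# Scores are already normalised so that higher is better
# (e.g. cost score = 1 - cost / cost_max).
weights = {"cost": 0.5, "collective_dose": 0.3, "individual_dose": 0.2}
options = {
    "A": {"cost": 0.8, "collective_dose": 0.4, "individual_dose": 0.6},
    "B": {"cost": 0.5, "collective_dose": 0.9, "individual_dose": 0.7},
}
best = max(options, key=lambda name: weighted_score(options[name], weights))
```

The additive form assumes the criteria are mutually preference-independent; when they are not, or when only ordinal judgements are available, outranking methods are the usual alternative.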
Energy Technology Data Exchange (ETDEWEB)
Likhachev, D.V., E-mail: dmitriy.likhachev@globalfoundries.com
2015-08-31
During semiconductor device fabrication, control of layer thicknesses is an important task for in-line metrology, since correct thickness values are essential for proper device performance. At present, ellipsometry is widely used for routine process monitoring and process improvement, as well as for characterization of various materials in modern nanoelectronic manufacturing. The wide recognition of this technique is based on its non-invasive, non-intrusive and non-destructive nature, high measurement precision, accuracy and speed, and versatility to characterize practically all types of materials used in the modern semiconductor industry (dielectrics, semiconductors, metals, polymers, etc.). However, because of its indirect nature, it requires the use of a multi-parameter non-linear optimization method. This creates a significant challenge for the analysis of multilayered structures, since the number of simultaneously determined model parameters, for instance thin-film thicknesses and model variables related to film optical properties, should be restricted due to parameter cross-correlations. In this paper, we use parametric sensitivity analysis to evaluate the importance of various model parameters and to suggest their optimal search ranges. The method is applied practically to the analysis of several structures with up to five-layer film stacks. It demonstrates an evidence-based improvement in the accuracy of multilayered thin-film thickness measurements, which suggests that the proposed approach can be useful for industrial applications. - Highlights: • An improved method for multilayered thin-film stack characterization is proposed. • The screening-type technique based on so-called “elementary effects” was employed. • The model parameters were ranked according to relative importance for model output. • The method is tested using two examples of complex thin-film stack characterization. • The approach can be useful in many practical applications.
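The "elementary effects" screening mentioned in the highlights can be sketched in simplified form as a single one-at-a-time pass; the full Morris method averages such effects over many random trajectories through parameter space. The toy model and values below are illustrative, not an ellipsometric film-stack model:

```python
import numpy as np

def elementary_effects(model, x0, delta=0.05):
    """Simplified screening pass: perturb each parameter by `delta`
    (in normalised [0, 1] units) and record the resulting change in the
    model output per unit perturbation."""
    base = model(x0)
    effects = np.empty(len(x0))
    for i in range(len(x0)):
        x = x0.copy()
        x[i] += delta
        effects[i] = (model(x) - base) / delta
    return effects

# Toy response dominated by parameters 0 and 2 (parameter 1 is negligible).
def model(x):
    return 10.0 * x[0] + 0.1 * x[1] + 5.0 * x[2] ** 2

x0 = np.array([0.3, 0.5, 0.4])
ee = elementary_effects(model, x0)
ranking = np.argsort(-np.abs(ee))   # most influential parameter first
```

The resulting ranking is what guides the choice of which parameters to fit simultaneously and which to fix or restrict, mitigating the cross-correlation problem described above.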
International Nuclear Information System (INIS)
Likhachev, D.V.
2015-01-01
During semiconductor device fabrication, control of layer thicknesses is an important task for in-line metrology, since correct thickness values are essential for proper device performance. At present, ellipsometry is widely used for routine process monitoring and process improvement, as well as for characterization of various materials in modern nanoelectronic manufacturing. The wide recognition of this technique is based on its non-invasive, non-intrusive and non-destructive nature, high measurement precision, accuracy and speed, and versatility to characterize practically all types of materials used in the modern semiconductor industry (dielectrics, semiconductors, metals, polymers, etc.). However, because of its indirect nature, it requires the use of a multi-parameter non-linear optimization method. This creates a significant challenge for the analysis of multilayered structures, since the number of simultaneously determined model parameters, for instance thin-film thicknesses and model variables related to film optical properties, should be restricted due to parameter cross-correlations. In this paper, we use parametric sensitivity analysis to evaluate the importance of various model parameters and to suggest their optimal search ranges. The method is applied practically to the analysis of several structures with up to five-layer film stacks. It demonstrates an evidence-based improvement in the accuracy of multilayered thin-film thickness measurements, which suggests that the proposed approach can be useful for industrial applications. - Highlights: • An improved method for multilayered thin-film stack characterization is proposed. • The screening-type technique based on so-called “elementary effects” was employed. • The model parameters were ranked according to relative importance for model output. • The method is tested using two examples of complex thin-film stack characterization. • The approach can be useful in many practical applications.
International Nuclear Information System (INIS)
2008-05-01
This book introduces the energy and resource technology development programme together with its performance analysis. It covers the programme's divisions and definitions, an analysis of the current state of support, the substance of the national basic plan for energy and resource technology development, the selection of analysis indices, the results of the performance analysis by index, the survey results, and an analysis and appraisal of the energy and resource technology development programme in 2007.
The development and transplants of XSYS
International Nuclear Information System (INIS)
Yang Xiaoqing; Jia Guoxiang
1997-01-01
XSYS is a multi-parameter data acquisition and analysis system that runs on the VAX780. With the introduction of new hardware, the authors have ported XSYS from the VAX780 to the ALPHA2100 for data analysis
International Nuclear Information System (INIS)
Jae, Myeong Gi; Lee, Won Seong; Kim, Ha Hyeok
1989-02-01
This book gives a description of electronic engineering (circuit elements and devices, circuit analysis, and digital logic circuits); electrochemical methods such as conductometry, potentiometry and current measurement; spectrochemical analysis with electromagnetic radiation, covering optical components, absorption spectroscopy, X-ray analysis, atomic absorption spectrometry and references; chromatography, including gas chromatography and liquid chromatography; and automated analysis, covering control-system evaluation of automated analysis, automated analysis systems and references.
Energy Technology Data Exchange (ETDEWEB)
Jae, Myeong Gi; Lee, Won Seong; Kim, Ha Hyeok
1989-02-15
This book gives a description of electronic engineering (circuit elements and devices, circuit analysis, and digital logic circuits); electrochemical methods such as conductometry, potentiometry and current measurement; spectrochemical analysis with electromagnetic radiation, covering optical components, absorption spectroscopy, X-ray analysis, atomic absorption spectrometry and references; chromatography, including gas chromatography and liquid chromatography; and automated analysis, covering control-system evaluation of automated analysis, automated analysis systems and references.
Directory of Open Access Journals (Sweden)
Jan Krajicek
Full Text Available The true bugs (Hemiptera: Heteroptera) have evolved a system of well-developed scent glands that produce diverse and frequently strongly odorous compounds, which act mainly as chemical protection against predators. A new method of non-lethal sampling with subsequent separation by gas chromatography with mass spectrometric detection was proposed for the analysis of these volatile defensive secretions. Separation was performed on an Rtx-200 column containing a fluorinated polysiloxane stationary phase. Various mechanical irritation methods (ultrasonics, shaking, pressing the bugs with the plunger of a syringe) were tested for secretion sampling, with a special focus on non-lethal irritation. The preconcentration step was performed by sorption on solid-phase microextraction (SPME) fibers of different polarity. Pyrrhocoris apterus was selected for optimization of the sampling procedure. The entire multi-parameter optimization of secretion sampling was performed using response surface methodology. Irritating the bugs by pressing them with the plunger of a syringe proved the most suitable. The developed method was applied to the analysis of secretions produced by adult males and females of Pyrrhocoris apterus, Pyrrhocoris tibialis and Scantius aegyptius (all Heteroptera: Pyrrhocoridae). The chemical composition of the secretion, particularly of the alcohols, aldehydes and esters, is species-specific in all three pyrrhocorid species studied. Sexual dimorphism in the occurrence of particular compounds is largely limited to alcohols and suggests an epigamic intraspecific function. The overall phenetic similarities in secretion composition reflect neither the relationships of the species nor similarities in their antipredatory color patterns; the similarities of secretions may instead be linked with antipredatory strategies. The proposed method requires only a few individuals, which remain alive after the procedure. Thus secretions of a number of species including even the rare
Energy Technology Data Exchange (ETDEWEB)
Kim, Seung Jae; Seo, Seong Gyu
1995-03-15
This textbook deals with instrumental analysis and consists of nine chapters. It covers: an introduction to analytical chemistry, the process of analysis, and the types and forms of analysis; electrochemistry, including basic theory, potentiometry and conductometry; electromagnetic radiation and optical components, with an introduction and applications; ultraviolet and visible spectrophotometry; atomic absorption spectrophotometry, with an introduction, flame emission spectrometry and plasma emission spectrometry; and others such as infrared spectrophotometry, X-ray spectrophotometry and mass spectrometry, chromatography, and further instrumental methods such as radiochemistry.
International Nuclear Information System (INIS)
Kim, Seung Jae; Seo, Seong Gyu
1995-03-01
This textbook deals with instrumental analysis and consists of nine chapters. It covers: an introduction to analytical chemistry, the process of analysis, and the types and forms of analysis; electrochemistry, including basic theory, potentiometry and conductometry; electromagnetic radiation and optical components, with an introduction and applications; ultraviolet and visible spectrophotometry; atomic absorption spectrophotometry, with an introduction, flame emission spectrometry and plasma emission spectrometry; and others such as infrared spectrophotometry, X-ray spectrophotometry and mass spectrometry, chromatography, and further instrumental methods such as radiochemistry.
Directory of Open Access Journals (Sweden)
I. Crawford
2015-11-01
Full Text Available In this paper we present improved methods for discriminating and quantifying primary biological aerosol particles (PBAPs) by applying hierarchical agglomerative cluster analysis to multi-parameter ultraviolet-light-induced fluorescence (UV-LIF) spectrometer data. The methods employed in this study can be applied to data sets in excess of 1 × 10^6 points on a desktop computer, allowing each fluorescent particle in a data set to be explicitly clustered. This reduces the potential for misattribution found in the subsampling and comparative attribution methods used in previous approaches, improving our capacity to discriminate and quantify PBAP meta-classes. We evaluate the performance of several hierarchical agglomerative cluster analysis linkages and data normalisation methods using laboratory samples of known particle types and an ambient data set. Fluorescent and non-fluorescent polystyrene latex spheres were sampled with a Wideband Integrated Bioaerosol Spectrometer (WIBS-4), where the optical size, asymmetry factor and fluorescence measurements were used as inputs to the analysis package. It was found that the Ward linkage with z-score or range normalisation performed best, correctly attributing 98 and 98.1 % of the data points respectively. The best-performing methods were applied to the BEACHON-RoMBAS (Bio–hydro–atmosphere interactions of Energy, Aerosols, Carbon, H2O, Organics and Nitrogen–Rocky Mountain Biogenic Aerosol Study) ambient data set, where it was found that the z-score and range normalisation methods yield similar results, with each method producing clusters representative of fungal spores and bacterial aerosol, consistent with previous results. The z-score result was compared to clusters generated with previous approaches (the WIBS AnalysiS Program, WASP), where we observe that the subsampling and comparative attribution method employed by WASP results in the overestimation of the fungal spore concentration by a factor of 1.5 and the
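A minimal sketch of the recipe that performed best above, Ward-linkage agglomerative clustering on z-score-normalised multi-parameter data. The synthetic "particle" features below are illustrative stand-ins for WIBS measurements, not the study's data:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def cluster_particles(features, n_clusters):
    """Ward-linkage hierarchical clustering after z-score normalisation."""
    z = (features - features.mean(axis=0)) / features.std(axis=0)
    tree = linkage(z, method="ward")  # agglomerative merge tree
    return fcluster(tree, t=n_clusters, criterion="maxclust")

# Two well-separated synthetic "particle types" with three features each,
# loosely analogous to (size, asymmetry factor, fluorescence) triples
rng = np.random.default_rng(1)
a = rng.normal([1.0, 0.2, 5.0], 0.1, size=(50, 3))
b = rng.normal([3.0, 0.8, 1.0], 0.1, size=(50, 3))
labels = cluster_particles(np.vstack([a, b]), 2)
# Each synthetic class should fall into a single cluster
print(len(set(labels[:50])), len(set(labels[50:])))
```

The explicit clustering of every point, rather than subsampling, is what the abstract credits with reducing misattribution; for data sets of ~10^6 points a memory-efficient linkage implementation is needed, as the paper notes.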
Sensitivity analysis determines the effectiveness of antibiotics against microorganisms (germs) ...
McShane, Edward James
2013-01-01
This text surveys practical elements of real function theory, general topology, and functional analysis. Discusses the maximality principle, the notion of convergence, the Lebesgue-Stieltjes integral, function spaces and harmonic analysis. Includes exercises. 1959 edition.
Cerebrospinal fluid analysis ... Analysis of CSF can help detect certain conditions and diseases. All of the following can be, but ... An abnormal CSF analysis result may be due to many different causes, ... Encephalitis (such as West Nile and Eastern Equine) Hepatic ...
Semen analysis measures the amount and quality of a man's semen and sperm. Semen is ...
Directory of Open Access Journals (Sweden)
Jaroszewicz Jerzy
2018-01-01
Full Text Available The work is devoted to methods of analysis of vibrations and stability of discrete-continuous, multi-parameter models of beams, shafts, rotors and vanes, reduced to homogeneous one-dimensional form. The properties of Cauchy's influence function and the characteristic series method were used to solve the boundary problem. It has been shown that these methods are an effective tool for solving boundary problems described by ordinary fourth- and second-order differential equations with variable parameters. Particular attention is paid to the solution of the boundary problem of two-parameter elastic systems with variable distribution of parameters. Universal beam-specific equations with typical support conditions, including vertical support, which do not depend on beam shape and axial load type, are recorded. The shape and type of load are taken into account through an influence function that corresponds to an arbitrary change in the cross-section of the support and a continuous axial load, so that the functions describing the stiffness, the mass and the continuous load are complete. As a result of solving the boundary vibration problem of a freely bent support with an arbitrary change in its cross-section, loaded with an arbitrary longitudinal load and arranged on a resilient substrate, strict relations between the natural frequency parameters and the load parameters were derived. Using these methods, simple calculations were made, easy to use in engineering practice, and their conditions of use were given. Experimental studies have confirmed the high accuracy of theoretical calculations using the proposed methods and formulas.
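Under standard Euler-Bernoulli assumptions, the fourth- and second-order boundary problem described above can be written as follows (generic symbols, not the paper's own notation):

```latex
% Free vibration of a beam with variable bending stiffness EI(x),
% mass per unit length m(x), distributed axial load N(x), and an
% elastic foundation of stiffness k(x); w(x) is the mode shape and
% \omega the natural frequency:
\frac{\mathrm{d}^2}{\mathrm{d}x^2}\!\left[ EI(x)\,\frac{\mathrm{d}^2 w}{\mathrm{d}x^2} \right]
  + \frac{\mathrm{d}}{\mathrm{d}x}\!\left[ N(x)\,\frac{\mathrm{d}w}{\mathrm{d}x} \right]
  + k(x)\,w \;=\; \omega^2\, m(x)\, w, \qquad 0 < x < L,
```

together with four boundary conditions fixed by the support type. In the approach sketched in the abstract, the Cauchy influence function acts as a Green-type kernel for this operator, and the characteristic series method expands the solution in powers of the frequency parameter, from which the frequency-load relations follow.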
International Nuclear Information System (INIS)
Cardemil, José M.; Silva, Alexandre K. da
2016-01-01
Highlights: • Thermodynamic modeling of CO₂-based power cycles. • A multi-parameter analysis for different cycle configurations. • Performance comparison between CO₂ and four other fluids. • Detailed discussion considering optimized operational parameters (i.e., pressure, HX size). • Overview of the technical applicability of CO₂. - Abstract: This thermodynamically based study focuses on the thermal performance of power cycles using CO₂ as the working fluid. The work considers numerous aspects that can influence the cycle's performance, such as the type of cycle (i.e., Rankine or Brayton), its configuration (i.e., with and without a recuperator), and different operational conditions (i.e., heat source temperature and the upper and lower operating pressures of the CO₂). To account for all possible scenarios, a thermodynamic routine was specially implemented and linked to a library containing all the thermodynamic properties of CO₂. The results are mostly presented in terms of the absolute and relative 1st and 2nd Law efficiencies of CO₂, as well as the cycle's scale, here represented by the global conductance (UA) of the heat exchangers used within the cycle. For the relative performance assessment, four other working fluids commonly used in energy conversion cycles were considered (i.e., ethane, toluene, D4 siloxane and water). As expected, the absolute performance results indicate a strong dependence of the cycle's efficiencies on the operational conditions. As for the relative performance, the results suggest that while CO₂'s 1st Law efficiency might be lower than that of other fluids, its exergetic efficiency can be significantly higher. Furthermore, the calculations also indicate that CO₂'s required global conductance is potentially lower than that of competing fluids (e.g., toluene) for certain operational conditions, which suggests that CO₂-based power plants can be more compact, since they might require smaller heat exchangers to produce
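As a rough illustration of the kind of sweep such a thermodynamic routine performs over operating pressures, here is a minimal 1st Law efficiency estimate for a recuperator-free Brayton configuration under an ideal-gas assumption. The paper itself uses a real-fluid CO₂ property library; the temperatures, component efficiencies and γ below are illustrative assumptions, not the study's values:

```python
def brayton_first_law_efficiency(pr, gamma=1.29, eta_c=0.85, eta_t=0.90):
    """Recuperator-free Brayton cycle with an ideal-gas working fluid.

    pr           : compressor pressure ratio
    gamma        : cp/cv ratio (roughly 1.29 for CO2 near ambient conditions)
    eta_c, eta_t : isentropic efficiencies of compressor and turbine
    """
    tau = pr ** ((gamma - 1.0) / gamma)        # isentropic temperature ratio
    t1, t3 = 300.0, 900.0                      # compressor/turbine inlet temps, K
    t2 = t1 + t1 * (tau - 1.0) / eta_c         # after non-ideal compression
    t4 = t3 - eta_t * t3 * (1.0 - 1.0 / tau)   # after non-ideal expansion
    w_net = (t3 - t4) - (t2 - t1)              # net specific work, per unit cp
    q_in = t3 - t2                             # heat input, per unit cp
    return w_net / q_in

for pr in (2.0, 3.0, 5.0):
    print(pr, round(brayton_first_law_efficiency(pr), 3))
```

Sweeping `pr` this way mimics the multi-parameter scan over upper and lower operating pressures; a real-fluid model is required near the CO₂ critical point, where the ideal-gas assumption breaks down.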
Federal Laboratory Consortium — The primary goal of the Flow Cytometry Section is to provide the services of state-of-the-art multi-parameter cellular analysis and cell sorting for researchers and...
Kantorovich, L V
1982-01-01
Functional Analysis examines trends in functional analysis as a mathematical discipline and the ever-increasing role played by its techniques in applications. The theory of topological vector spaces is emphasized, along with the applications of functional analysis to applied analysis. Some topics of functional analysis connected with applications to mathematical economics and control theory are also discussed. Comprised of 18 chapters, this book begins with an introduction to the elements of the theory of topological spaces, the theory of metric spaces, and the theory of abstract measure space
International Nuclear Information System (INIS)
Berman, M.; Bischof, L.M.; Breen, E.J.; Peden, G.M.
1994-01-01
This paper provides an overview of modern image analysis techniques pertinent to materials science. The usual approach in image analysis contains two basic steps: first, the image is segmented into its constituent components (e.g. individual grains), and second, measurement and quantitative analysis are performed. Usually, the segmentation part of the process is the harder of the two. Consequently, much of the paper concentrates on this aspect, reviewing both fundamental segmentation tools (commonly found in commercial image analysis packages) and more advanced segmentation tools. There is also a review of the most widely used quantitative analysis methods for measuring the size, shape and spatial arrangements of objects. Many of the segmentation and analysis methods are demonstrated using complex real-world examples. Finally, there is a discussion of hardware and software issues. 42 refs., 17 figs
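The two-step workflow described above, segment first and measure second, can be sketched with a simple threshold segmentation followed by connected-component measurement. The synthetic image and threshold below are illustrative assumptions, not tools from the paper:

```python
import numpy as np
from scipy import ndimage

def segment_and_measure(image, threshold):
    """Two-step image analysis: segment by thresholding, then
    measure the area of each connected component ("grain")."""
    mask = image > threshold                  # step 1: segmentation
    labels, n = ndimage.label(mask)           # connected components
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))  # step 2: measurement
    return n, sizes

# Synthetic micrograph: two bright square "grains" on a dark background
img = np.zeros((20, 20))
img[2:5, 2:5] = 1.0       # 3x3 grain, area 9
img[10:14, 10:14] = 1.0   # 4x4 grain, area 16
n, sizes = segment_and_measure(img, 0.5)
print(n, sorted(sizes.tolist()))  # → 2 [9.0, 16.0]
```

Real micrographs rarely segment cleanly with a global threshold, which is why the paper spends most of its attention on the segmentation step and on more advanced tools.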
Thiemann, Francis C.
Semiotic analysis is a method of analyzing signs (e.g., words) to reduce non-numeric data to their component parts without losing essential meanings. Semiotics dates back to Aristotle's analysis of language; it was much advanced by nineteenth-century analyses of style and logic and by Whitehead and Russell's description in this century of the role…
Indian Academy of Sciences (India)
Dimensional analysis is a useful tool which finds important applications in physics and engineering. It is most effective when there exist a maximal number of dimensionless quantities constructed out of the relevant physical variables. Though a complete theory of dimensional analysis was developed way back in 1914 in a.
Bravená, Helena
2009-01-01
This bachelor thesis deals with the importance of job analysis for personnel activities in a company. The aim of this work is to find the most suitable method of job analysis in a particular enterprise and then to create descriptions and specifications for each job.
International Nuclear Information System (INIS)
Francois, P.
1996-01-01
We undertook a study programme at the end of 1991. To start with, we performed some exploratory studies aimed at learning some preliminary lessons about this type of analysis: assessment of the interest of probabilistic incident analysis; the possibility of using PSA scenarios; and the skills and resources required. At the same time, EPN created a working group whose assignment was to define a new approach for the analysis of incidents on NPPs. This working group gave thought to both aspects of Operating Feedback that EPN wished to improve: analysis of significant incidents and analysis of potential consequences. We took part in the work of this group, and for the second aspect we proposed a method based on an adaptation of the event-tree method in order to establish a link between existing PSA models and actual incidents. Since PSAs provide an exhaustive database of accident scenarios applicable to the two most common types of units in France, they are obviously of interest for this sort of analysis. With this method we performed some incident analyses, and at the same time explored some methods employed abroad, particularly ASP (Accident Sequence Precursor, a method used by the NRC). Early in 1994 EDF began a systematic analysis programme; its first, transient phase will set up methods and an organizational structure. 7 figs
Energy Technology Data Exchange (ETDEWEB)
Francois, P
1997-12-31
We undertook a study programme at the end of 1991. To start with, we performed some exploratory studies aimed at learning some preliminary lessons about this type of analysis: assessment of the interest of probabilistic incident analysis; the possibility of using PSA scenarios; and the skills and resources required. At the same time, EPN created a working group whose assignment was to define a new approach for the analysis of incidents on NPPs. This working group gave thought to both aspects of Operating Feedback that EPN wished to improve: analysis of significant incidents and analysis of potential consequences. We took part in the work of this group, and for the second aspect we proposed a method based on an adaptation of the event-tree method in order to establish a link between existing PSA models and actual incidents. Since PSAs provide an exhaustive database of accident scenarios applicable to the two most common types of units in France, they are obviously of interest for this sort of analysis. With this method we performed some incident analyses, and at the same time explored some methods employed abroad, particularly ASP (Accident Sequence Precursor, a method used by the NRC). Early in 1994 EDF began a systematic analysis programme; its first, transient phase will set up methods and an organizational structure. 7 figs.
Khabaza, I M
1960-01-01
Numerical Analysis is an elementary introduction to numerical analysis, its applications, limitations, and pitfalls. Methods suitable for digital computers are emphasized, but some desk computations are also described. Topics covered range from the use of digital computers in numerical work to errors in computations using desk machines, finite difference methods, and numerical solution of ordinary differential equations. This book is comprised of eight chapters and begins with an overview of the importance of digital computers in numerical analysis, followed by a discussion on errors in comput
International Nuclear Information System (INIS)
Warner, M.
1987-01-01
What is the current state of quantitative trace analytical chemistry? What are today's research efforts? And what challenges does the future hold? These are some of the questions addressed at a recent four-day symposium sponsored by the National Bureau of Standards (NBS) entitled Accuracy in Trace Analysis - Accomplishments, Goals, Challenges. The two plenary sessions held on the first day of the symposium reviewed the history of quantitative trace analysis, discussed the present situation from academic and industrial perspectives, and summarized future needs. The remaining three days of the symposium consisted of parallel sessions dealing with the measurement process; quantitation in materials; environmental, clinical, and nutrient analysis; and advances in analytical techniques
Goodstein, R L
2010-01-01
Recursive analysis develops natural number computations into a framework appropriate for real numbers. This text is based upon primary recursive arithmetic and presents a unique combination of classical analysis and intuitional analysis. Written by a master in the field, it is suitable for graduate students of mathematics and computer science and can be read without a detailed knowledge of recursive arithmetic.Introductory chapters on recursive convergence and recursive and relative continuity are succeeded by explorations of recursive and relative differentiability, the relative integral, and
Tao, Terence
2016-01-01
This is part one of a two-volume book on real analysis and is intended for senior undergraduate students of mathematics who have already been exposed to calculus. The emphasis is on rigour and foundations of analysis. Beginning with the construction of the number systems and set theory, the book discusses the basics of analysis (limits, series, continuity, differentiation, Riemann integration), through to power series, several variable calculus and Fourier analysis, and then finally the Lebesgue integral. These are almost entirely set in the concrete setting of the real line and Euclidean spaces, although there is some material on abstract metric and topological spaces. The book also has appendices on mathematical logic and the decimal system. The entire text (omitting some less central topics) can be taught in two quarters of 25–30 lectures each. The course material is deeply intertwined with the exercises, as it is intended that the student actively learn the material (and practice thinking and writing ri...
Tao, Terence
2016-01-01
This is part two of a two-volume book on real analysis and is intended for senior undergraduate students of mathematics who have already been exposed to calculus. The emphasis is on rigour and foundations of analysis. Beginning with the construction of the number systems and set theory, the book discusses the basics of analysis (limits, series, continuity, differentiation, Riemann integration), through to power series, several variable calculus and Fourier analysis, and then finally the Lebesgue integral. These are almost entirely set in the concrete setting of the real line and Euclidean spaces, although there is some material on abstract metric and topological spaces. The book also has appendices on mathematical logic and the decimal system. The entire text (omitting some less central topics) can be taught in two quarters of 25–30 lectures each. The course material is deeply intertwined with the exercises, as it is intended that the student actively learn the material (and practice thinking and writing ri...
International Nuclear Information System (INIS)
1988-01-01
Basic studies in nuclear analytical techniques include the examination of underlying assumptions and the development and extension of techniques involving the use of ion beams for elemental and mass analysis. 1 ref., 1 tab
Energy Technology Data Exchange (ETDEWEB)
2016-06-01
Fact sheet summarizing NREL's techno-economic analysis and life-cycle assessment capabilities to connect research with future commercial process integration, a critical step in the scale-up of biomass conversion technologies.
Gasinski, Leszek
2005-01-01
Hausdorff Measures and Capacity. Lebesgue-Bochner and Sobolev Spaces. Nonlinear Operators and Young Measures. Smooth and Nonsmooth Analysis and Variational Principles. Critical Point Theory. Eigenvalue Problems and Maximum Principles. Fixed Point Theory.
DEFF Research Database (Denmark)
Bauer-Gottwein, Peter; Riegels, Niels; Pulido-Velazquez, Manuel
2017-01-01
Hydroeconomic analysis and modeling provides a consistent and quantitative framework to assess the links between water resources systems and economic activities related to water use, simultaneously modeling water supply and water demand. It supports water managers and decision makers in assessing...... trade-offs between different water uses, different geographic regions, and various economic sectors and between the present and the future. Hydroeconomic analysis provides consistent economic performance criteria for infrastructure development and institutional reform in water policies and management...... organizations. This chapter presents an introduction to hydroeconomic analysis and modeling, and reviews the state of the art in the field. We review available economic water-valuation techniques and summarize the main types of decision problems encountered in hydroeconomic analysis. Popular solution strategies...
Schiffrin, Deborah
1990-01-01
Summarizes the current state of research in conversation analysis, referring primarily to six different perspectives that have developed from the philosophy, sociology, anthropology, and linguistics disciplines. These include pragmatics; speech act theory; interactional sociolinguistics; ethnomethodology; ethnography of communication; and…
Gorsuch, Richard L
2013-01-01
Comprehensive and comprehensible, this classic covers the basic and advanced topics essential for using factor analysis as a scientific tool in psychology, education, sociology, and related areas. Emphasizing the usefulness of the techniques, it presents sufficient mathematical background for understanding and sufficient discussion of applications for effective use. This includes not only theory but also the empirical evaluations of the importance of mathematical distinctions for applied scientific analysis.
International Nuclear Information System (INIS)
1959-01-01
Radioactivation analysis is the technique of analysing the constituents of a very small sample of matter by making the sample artificially radioactive. The first stage is to make the sample radioactive by artificial means, e.g. by subjecting it to neutron bombardment. Once the sample has been activated, or made radioactive, the next task is to analyse the radiations given off by the sample. This analysis indicates the nature and quantities of the various elements present in the sample, because the radiation from a particular radioisotope is characteristic of that isotope. In 1959 a symposium on 'Radioactivation Analysis' was organized in Vienna by the IAEA and the Joint Commission on Applied Radioactivity (ICSU). It was pointed out that certain factors create uncertainties, and it was elaborated how to overcome them. Attention was drawn to the fact that radioactivation analysis had proven a powerful tool for tackling fundamental problems in geo- and cosmochemistry, and a review was given of the recent work in this field. Because of its extreme sensitivity, radioactivation analysis had been principally employed for trace detection, and its most extensive use had been in the control of semiconductors and very pure metals. An account was given of the experience gained in the USA, where radioactivation analysis was being used by many investigators in various scientific fields as a practical and useful tool for elemental analyses. Much of this work had been concerned with determining sub-microgramme and microgramme concentrations of many different elements in samples of biological materials, drugs, fertilizers, fine chemicals, foods, fuels, glass, ceramic materials, metals, minerals, paints, petroleum products, resinous materials, soils, toxicants, water and other materials. In addition to these studies, radioactivation analysis had been used by other investigators to determine isotopic ratios of the stable isotopes of some of the elements. Another paper dealt with radioactivation
Energy Technology Data Exchange (ETDEWEB)
NONE
1959-07-15
Radioactivation analysis is the technique of analysing the constituents of a very small sample of matter by making the sample artificially radioactive. The first stage is to make the sample radioactive by artificial means, e.g. by subjecting it to neutron bombardment. Once the sample has been activated, or made radioactive, the next task is to analyse the radiations given off by the sample. This analysis indicates the nature and quantities of the various elements present in the sample, because the radiation from a particular radioisotope is characteristic of that isotope. In 1959 a symposium on 'Radioactivation Analysis' was organized in Vienna by the IAEA and the Joint Commission on Applied Radioactivity (ICSU). It was pointed out that certain factors create uncertainties, and it was elaborated how to overcome them. Attention was drawn to the fact that radioactivation analysis had proven a powerful tool for tackling fundamental problems in geo- and cosmochemistry, and a review was given of the recent work in this field. Because of its extreme sensitivity, radioactivation analysis had been principally employed for trace detection, and its most extensive use had been in the control of semiconductors and very pure metals. An account was given of the experience gained in the USA, where radioactivation analysis was being used by many investigators in various scientific fields as a practical and useful tool for elemental analyses. Much of this work had been concerned with determining sub-microgramme and microgramme concentrations of many different elements in samples of biological materials, drugs, fertilizers, fine chemicals, foods, fuels, glass, ceramic materials, metals, minerals, paints, petroleum products, resinous materials, soils, toxicants, water and other materials. In addition to these studies, radioactivation analysis had been used by other investigators to determine isotopic ratios of the stable isotopes of some of the elements. Another paper dealt with radioactivation
DEFF Research Database (Denmark)
Brænder, Morten; Andersen, Lotte Bøgh
2014-01-01
Based on our 2013 article, ”Does Deployment to War Affect Soldiers' Public Service Motivation – A Panel Study of Soldiers Before and After their Service in Afghanistan”, we present panel analysis as a methodological discipline. Panels consist of multiple units of analysis, observed at two or more...... in research settings where it is not possible to distribute units of analysis randomly or where the independent variables cannot be manipulated. The greatest disadvantage of using panel studies is that data may be difficult to obtain. This is most clearly evident in regard to the use of panel surveys...... points in time. In comparison with traditional cross-sectional studies, the advantage of using panel studies is that the time dimension enables us to study effects. Whereas experimental designs may have a clear advantage in regard to causal inference, the strength of panel studies is difficult to match...
Loeb, Peter A
2016-01-01
This textbook is designed for a year-long course in real analysis taken by beginning graduate and advanced undergraduate students in mathematics and other areas such as statistics, engineering, and economics. Written by one of the leading scholars in the field, it elegantly explores the core concepts in real analysis and introduces new, accessible methods for both students and instructors. The first half of the book develops both Lebesgue measure and, with essentially no additional work for the student, general Borel measures for the real line. Notation indicates when a result holds only for Lebesgue measure. Differentiation and absolute continuity are presented using a local maximal function, resulting in an exposition that is both simpler and more general than the traditional approach. The second half deals with general measures and functional analysis, including Hilbert spaces, Fourier series, and the Riesz representation theorem for positive linear functionals on continuous functions with compact support....
Scott, L Ridgway
2011-01-01
Computational science is fundamentally changing how technological questions are addressed. The design of aircraft, automobiles, and even racing sailboats is now done by computational simulation. The mathematical foundation of this new approach is numerical analysis, which studies algorithms for computing expressions defined with real numbers. Emphasizing the theory behind the computation, this book provides a rigorous and self-contained introduction to numerical analysis and presents the advanced mathematics that underpin industrial software, including complete details that are missing from most textbooks. Using an inquiry-based learning approach, Numerical Analysis is written in a narrative style, provides historical background, and includes many of the proofs and technical details in exercises. Students will be able to go beyond an elementary understanding of numerical simulation and develop deep insights into the foundations of the subject. They will no longer have to accept the mathematical gaps that ex...
Rao, G Shanker
2006-01-01
About the Book: This book provides an introduction to Numerical Analysis for students of Mathematics and Engineering. The book is designed in accordance with the common core syllabus of Numerical Analysis of the Universities of Andhra Pradesh and also the syllabus prescribed in most Indian Universities. Salient features: Approximate and Numerical Solutions of Algebraic and Transcendental Equations; Interpolation of Functions; Numerical Differentiation and Integration; and Numerical Solution of Ordinary Differential Equations. The last three chapters deal with Curve Fitting, Eigenvalues and Eigenvectors of a Matrix, and Regression Analysis. Each chapter is supplemented with a number of worked-out examples as well as a number of problems to be solved by the students, which helps in a better understanding of the subject. Contents: Errors Solution of Algebraic and Transcendental Equations Finite Differences Interpolation with Equal Intervals Interpolation with Unequal Int...
Jacques, Ian
1987-01-01
This book is primarily intended for undergraduates in mathematics, the physical sciences and engineering. It introduces students to most of the techniques forming the core component of courses in numerical analysis. The text is divided into eight chapters which are largely self-contained. However, with a subject as intricately woven as mathematics, there is inevitably some interdependence between them. The level of difficulty varies and, although emphasis is firmly placed on the methods themselves rather than their analysis, we have not hesitated to include theoretical material when we consider it to be sufficiently interesting. However, it should be possible to omit those parts that do seem daunting while still being able to follow the worked examples and to tackle the exercises accompanying each section. Familiarity with the basic results of analysis and linear algebra is assumed since these are normally taught in first courses on mathematical methods. For reference purposes a list of theorems used in the t...
DiBenedetto, Emmanuele
2016-01-01
The second edition of this classic textbook presents a rigorous and self-contained introduction to real analysis with the goal of providing a solid foundation for future coursework and research in applied mathematics. Written in a clear and concise style, it covers all of the necessary subjects as well as those often absent from standard introductory texts. Each chapter features a “Problems and Complements” section that includes additional material that briefly expands on certain topics within the chapter and numerous exercises for practicing the key concepts. The first eight chapters explore all of the basic topics for training in real analysis, beginning with a review of countable sets before moving on to detailed discussions of measure theory, Lebesgue integration, Banach spaces, functional analysis, and weakly differentiable functions. More topical applications are discussed in the remaining chapters, such as maximal functions, functions of bounded mean oscillation, rearrangements, potential theory, a...
International Nuclear Information System (INIS)
Romli
1997-01-01
Cluster analysis is the name of a group of multivariate techniques whose principal purpose is to distinguish similar entities by the characteristics they possess. Several algorithms can be used for this analysis, so this paper focuses on those algorithms: similarity measures and hierarchical clustering, which includes the single linkage, complete linkage and average linkage methods. The non-hierarchical clustering method popularly known as the K-means method will also be discussed. Finally, the paper describes the advantages and disadvantages of each method
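As an illustration of the non-hierarchical approach discussed above, the K-means method can be sketched in a few lines of plain Python. This is a minimal, illustrative version (function name and the simple first-k initialization are choices made here, not taken from the paper):

```python
import math

def kmeans(points, k, iters=100):
    """Plain k-means: assign each point to its nearest centroid, then move
    each centroid to the mean of its cluster, repeating until assignments
    stabilize. Initialization is simply the first k points (k-means++ is
    the usual improvement in practice)."""
    centroids = [tuple(p) for p in points[:k]]
    clusters = []
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda j: math.dist(p, centroids[j]))
            clusters[nearest].append(tuple(p))
        new = [tuple(sum(xs) / len(pts) for xs in zip(*pts)) if pts else centroids[j]
               for j, pts in enumerate(clusters)]
        if new == centroids:   # assignments (hence the means) have converged
            break
        centroids = new
    return centroids, clusters
```

The hierarchical linkage methods mentioned above differ only in how inter-cluster distance is measured before merging the two closest clusters: minimum pairwise distance (single linkage), maximum (complete linkage), or average pairwise distance (average linkage).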
Rockafellar, Ralph Tyrell
2015-01-01
Available for the first time in paperback, R. Tyrrell Rockafellar's classic study presents readers with a coherent branch of nonlinear mathematical analysis that is especially suited to the study of optimization problems. Rockafellar's theory differs from classical analysis in that differentiability assumptions are replaced by convexity assumptions. The topics treated in this volume include: systems of inequalities, the minimum or maximum of a convex function over a convex set, Lagrange multipliers, minimax theorems and duality, as well as basic results about the structure of convex sets and
Brezinski, C
2012-01-01
Numerical analysis has witnessed many significant developments in the 20th century. This book brings together 16 papers dealing with historical developments, survey papers and papers on recent trends in selected areas of numerical analysis, such as: approximation and interpolation, solution of linear systems and eigenvalue problems, iterative methods, quadrature rules, and the solution of ordinary, partial and integral equations. The papers are reprinted from the 7-volume project of the Journal of Computational and Applied Mathematics (/homepage/sac/cam/na2000/index.html)
International Nuclear Information System (INIS)
Biehl, F.A.
1984-05-01
This paper presents the criteria, previous nuclear experience in space, analysis techniques, and possible breakup enhancement devices applicable to an acceptable SP-100 reentry from space. Reactor operation in nuclear-safe orbit will minimize the radiological risk; the remaining safeguards criteria need to be defined. A simple analytical point mass reentry technique and a more comprehensive analysis method that considers vehicle dynamics and orbit insertion malfunctions are presented. Vehicle trajectory, attitude, and possible breakup enhancement devices will be integrated in the simulation as required to ensure an adequate representation of the reentry process
Aggarwal, Charu C
2013-01-01
With the increasing advances in hardware technology for data collection, and advances in software technology (databases) for data organization, computer scientists have increasingly participated in the latest advancements of the outlier analysis field. Computer scientists, specifically, approach this field based on their practical experiences in managing large amounts of data, and with far fewer assumptions: the data can be of any type, structured or unstructured, and may be extremely large. Outlier Analysis is a comprehensive exposition, as understood by data mining experts, statisticians and
Everitt, Brian S; Leese, Morven; Stahl, Daniel
2011-01-01
Cluster analysis comprises a range of methods for classifying multivariate data into subgroups. By organizing multivariate data into such subgroups, clustering can help reveal the characteristics of any structure or patterns present. These techniques have proven useful in a wide range of areas such as medicine, psychology, market research and bioinformatics. This fifth edition of the highly successful Cluster Analysis includes coverage of the latest developments in the field and a new chapter dealing with finite mixture models for structured data. Real-life examples are used throughout to demons
Snell, K S; Langford, W J; Maxwell, E A
1966-01-01
Elementary Analysis, Volume 2 introduces several of the ideas of modern mathematics in a casual manner and provides the practical experience in algebraic and analytic operations that lays a sound foundation of basic skills. This book focuses on the nature of number, algebraic and logical structure, groups, rings, fields, vector spaces, matrices, sequences, limits, functions and inverse functions, complex numbers, and probability. The logical structure of analysis given through the treatment of differentiation and integration, with applications to the trigonometric and logarithmic functions, is
International Nuclear Information System (INIS)
Baron, J.H.; Nunez McLeod, J.; Rivera, S.S.
1997-01-01
This book contains a selection of research works performed in the CEDIAC Institute (Cuyo National University) in the area of Risk Analysis, with specific orientation to the subjects of uncertainty and sensitivity studies, software reliability, severe accident modeling, etc. This volume presents important material for all those researchers who want insight into the risk analysis field as a tool for solving several problems frequently found in engineering and the applied sciences, as well as for academic teachers who want to keep up to date with the new developments and improvements continuously arising in this field
Health status monitoring for ICU patients based on locally weighted principal component analysis.
Ding, Yangyang; Ma, Xin; Wang, Youqing
2018-03-01
Intelligent status monitoring for critically ill patients can help medical staff quickly discover and assess changes in disease and then make an appropriate treatment strategy. However, the general-purpose monitoring models now widely used have difficulty adapting to changes in intensive care unit (ICU) patients' status because of their fixed pattern, and a more robust, efficient and fast monitoring model should be developed for the individual. A data-driven learning approach combining locally weighted projection regression (LWPR) and principal component analysis (PCA) is first proposed and applied to monitor the nonlinear process of patients' health status in the ICU. LWPR is used to approximate the complex nonlinear process with local linear models, in which PCA can be further applied to status monitoring, and finally a global weighted statistic is acquired for detecting possible abnormalities. Moreover, some improved versions are developed, such as LWPR-MPCA and LWPR-JPCA, which also have superior performance. Eighteen subjects were selected from PhysioBank's Multi-parameter Intelligent Monitoring for Intensive Care II (MIMIC II) database, and two vital signs of each subject were chosen for online monitoring. The proposed method was compared with several existing methods including traditional PCA, partial least squares (PLS), just-in-time learning combined with modified PCA (L-PCA), and kernel PCA (KPCA). The experimental results demonstrated that the mean fault detection rate (FDR) of PCA can be improved by 41.7% after adding LWPR. The mean FDR of LWPR-MPCA was increased by 8.3% compared with the latest reported method, L-PCA. Meanwhile, LWPR spent less training time than the others, especially KPCA. LWPR is first introduced into ICU patient monitoring and achieves the best monitoring performance, including adaptability to changes in patient status, sensitivity for abnormality detection, as well as fast learning speed and low computational complexity. The algorithm...
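The PCA baseline against which the LWPR variants are compared can be sketched as follows: fit principal components on normal-operation data, then flag new samples whose Hotelling T² statistic is unusually large. This is a simplified stand-in (function names and the two-component choice are illustrative; the paper's LWPR weighting of many local models is omitted):

```python
import numpy as np

def fit_pca_monitor(X_train, n_comp=2):
    """Fit a PCA monitoring model on normal-operation training data.
    Returns the mean, scale, loadings and retained component variances
    needed to score new samples."""
    mu = X_train.mean(axis=0)
    sigma = X_train.std(axis=0, ddof=1)
    Z = (X_train - mu) / sigma                  # standardize each variable
    _, s, Vt = np.linalg.svd(Z, full_matrices=False)
    P = Vt[:n_comp].T                           # loadings, shape (d, n_comp)
    lam = s[:n_comp] ** 2 / (len(X_train) - 1)  # retained component variances
    return mu, sigma, P, lam

def t2_score(x, mu, sigma, P, lam):
    """Hotelling T^2 of one sample: squared PC scores weighted by the
    inverse component variances; large values flag abnormal status."""
    t = P.T @ ((x - mu) / sigma)
    return float(np.sum(t ** 2 / lam))
```

In the scheme described above, many such local models are maintained and their statistics are combined with LWPR weights into one global monitoring index, which is what lets the monitor adapt to an individual patient's changing status.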
Alan Gallegos
2002-01-01
Watershed analyses and assessments for the Kings River Sustainable Forest Ecosystems Project were done on about 33,000 acres of the 45,500-acre Big Creek watershed and 32,000 acres of the 85,100-acre Dinkey Creek watershed. Following procedures developed for analysis of cumulative watershed effects (CWE) in the Pacific Northwest Region of the USDA Forest Service, the...
Freund, Rudolf J; Sa, Ping
2006-01-01
The book provides complete coverage of the classical methods of statistical analysis. It is designed to give students an understanding of the purpose of statistical analyses, to allow them to determine, at least to some degree, the correct type of statistical analysis to be performed in a given situation, and to give them some appreciation of what constitutes good experimental design
International Nuclear Information System (INIS)
Unterberger, A.
1987-01-01
We study the Klein-Gordon symbolic calculus of operators acting on solutions of the free Klein-Gordon equation. It contracts to the Weyl calculus as c→∞. Mathematically, it may also be considered as a pseudodifferential analysis on the unit ball of R^n
International Nuclear Information System (INIS)
Woodard, K.
1985-01-01
The objectives of this paper are to: Provide a realistic assessment of consequences; Account for plant and site-specific characteristics; Adjust accident release characteristics to account for results of plant-containment analysis; Produce conditional risk curves for each of five health effects; and Estimate uncertainties
DEFF Research Database (Denmark)
Hjørland, Birger
2017-01-01
The domain-analytic approach to knowledge organization (KO) (and to the broader field of library and information science, LIS) is outlined. The article reviews the discussions and proposals on the definition of domains, and provides an example of a domain-analytic study in the field of art studies. Varieties of domain analysis as well as criticism and controversies are presented and discussed.
International Nuclear Information System (INIS)
Rhoades, W.A.; Dray, B.J.
1970-01-01
The effect of Gadolinium-155 on the prompt kinetic behavior of a zirconium hydride reactor has been deduced, using experimental data from the SNAPTRAN machine. The poison material makes the temperature coefficient more positive, and the Type IV sleeves were deduced to give a positive coefficient above 1100 °F. A thorough discussion of the data and analysis is included. (U.S.)
International Nuclear Information System (INIS)
Saadi, Radouan; Marah, Hamid
2014-01-01
This report presents results related to tritium analysis carried out at the CNESTEN DASTE in Rabat (Morocco), on behalf of Senegal, within the framework of the RAF7011 project. It describes the analytical method and instrumentation, including general uncertainty estimation: electrolytic enrichment and liquid scintillation counting. The results are expressed in Tritium Units (TU); lower limit of detection: 0.02 TU
Miller, Rupert G
2011-01-01
A concise summary of the statistical methods used in the analysis of survival data with censoring. Emphasizes recently developed nonparametric techniques. Outlines methods in detail and illustrates them with actual data. Discusses the theory behind each method. Includes numerous worked problems and numerical exercises.
Koornneef, M.; Alonso-Blanco, C.; Stam, P.
2006-01-01
The Mendelian analysis of genetic variation, available as induced mutants or as natural variation, requires a number of steps that are described in this chapter. These include the determination of the number of genes involved in the observed trait's variation, the determination of dominance
DEFF Research Database (Denmark)
Nielsen, Kirsten
2010-01-01
The first part of this article presents the characteristics of Hebrew poetry: features associated with rhythm and phonology, grammatical features, structural elements like parallelism, and imagery and intertextuality. The second part consists of an analysis of Psalm 121. It is argued that assonan...
Adrian Ioana; Tiberiu Socaciu
2013-01-01
The article presents specific aspects of management and models for economic analysis. We present the main types of economic analysis: statistical analysis, dynamic analysis, static analysis, mathematical analysis and psychological analysis. We also present the main objects of the analysis: the analysis of a company's technological activity, the analysis of production costs, the analysis of a company's economic activity, the analysis of equipment, the analysis of labor productivity, the anal...
International Nuclear Information System (INIS)
Smith, M.; Jones, D.R.
1991-01-01
The goal of exploration is to find reserves that will earn an adequate rate of return on the capital invested. Neither exploration nor economics is an exact science. We must therefore explore in those trends (plays) that have the highest probability of achieving this goal. Trend analysis is a technique for organizing the available data to make these strategic exploration decisions objectively, in conformance with our goals and risk attitudes. Trend analysis differs from resource estimation in its purpose. It seeks to determine the probability of economic success for an exploration program, not the ultimate results of the total industry effort. Thus the recent past is assumed to be the best estimate of the exploration probabilities for the near future. This information is combined with economic forecasts. The computer software tools necessary for trend analysis are: (1) Information database - requirements and sources. (2) Data conditioning program - assignment to trends, correction of errors, and conversion into usable form. (3) Statistical processing program - calculation of the probability of success and the discovery-size probability distribution. (4) Analytical processing - Monte Carlo simulation to develop the probability distribution of the economic return/investment ratio for a trend. Limited-capital (short-run) effects are analyzed using the Gambler's Ruin concept in the Monte Carlo simulation and by a short-cut method. Multiple trend analysis is concerned with comparing and ranking trends, allocating funds among acceptable trends, and characterizing program risk by using risk profiles. In summary, trend analysis is a reality check for long-range exploration planning
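The Monte Carlo step and the Gambler's Ruin treatment of limited capital described above can be sketched with a toy model. All names, parameter values, and the choice of an exponential discovery-size distribution are illustrative assumptions made here, not the authors' software:

```python
import random

def simulate_program(p_success, size_dist, well_cost, capital, n_wells,
                     trials=10000, seed=0):
    """Monte Carlo sketch of an exploration program. Each well costs
    `well_cost`; a successful well pays a random discovery value drawn
    from `size_dist`. Tracks the fraction of trials that go broke before
    drilling all wells (the Gambler's Ruin effect) and the distribution
    of the final return/investment ratio."""
    rng = random.Random(seed)
    ruined = 0
    ratios = []
    for _ in range(trials):
        cash, invested, revenue = capital, 0.0, 0.0
        for _ in range(n_wells):
            if cash < well_cost:        # limited capital: program stops early
                ruined += 1
                break
            cash -= well_cost
            invested += well_cost
            if rng.random() < p_success:
                value = size_dist(rng)  # draw a discovery size
                revenue += value
                cash += value
        ratios.append(revenue / invested if invested else 0.0)
    return ruined / trials, ratios
```

Running this with capital small relative to well cost drives the ruin probability up even when the expected return/investment ratio is attractive, which is exactly the short-run effect the Gambler's Ruin analysis captures.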
DEFF Research Database (Denmark)
The 19th Scandinavian Conference on Image Analysis was held at the IT University of Copenhagen in Denmark during June 15-17, 2015. The SCIA conference series has been an ongoing biannual event for more than 30 years and over the years it has nurtured a world-class regional research and development area within the four participating Nordic countries. It is a regional meeting of the International Association for Pattern Recognition (IAPR). We would like to thank all authors who submitted works to this year's SCIA, the invited speakers, and our Program Committee. In total 67 papers were submitted. The topics of the accepted papers range from novel applications of vision systems, pattern recognition, machine learning, feature extraction, segmentation, 3D vision, to medical and biomedical image analysis. The papers originate from all the Scandinavian countries and several other European countries...
Helson, Henry
2010-01-01
This second edition has been enlarged and considerably rewritten. Among the new topics are infinite product spaces with applications to probability, disintegration of measures on product spaces, positive definite functions on the line, and additional information about Weyl's theorems on equidistribution. Topics that have continued from the first edition include Minkowski's theorem, measures with bounded powers, idempotent measures, spectral sets of bounded functions and a theorem of Szego, and the Wiener Tauberian theorem. Readers of the book should have studied the Lebesgue integral, the elementary theory of analytic and harmonic functions, and the basic theory of Banach spaces. The treatment is classical and as simple as possible. This is an instructional book, not a treatise. Mathematics students interested in analysis will find here what they need to know about Fourier analysis. Physicists and others can use the book as a reference for more advanced topics.
Bray, Hubert L; Mazzeo, Rafe; Sesum, Natasa
2015-01-01
This volume includes expanded versions of the lectures delivered in the Graduate Minicourse portion of the 2013 Park City Mathematics Institute session on Geometric Analysis. The papers give excellent high-level introductions, suitable for graduate students wishing to enter the field and experienced researchers alike, to a range of the most important areas of geometric analysis. These include: the general issue of geometric evolution, with more detailed lectures on Ricci flow and Kähler-Ricci flow, new progress on the analytic aspects of the Willmore equation as well as an introduction to the recent proof of the Willmore conjecture and new directions in min-max theory for geometric variational problems, the current state of the art regarding minimal surfaces in R^3, the role of critical metrics in Riemannian geometry, and the modern perspective on the study of eigenfunctions and eigenvalues for Laplace-Beltrami operators.
Freitag, Eberhard
2005-01-01
The guiding principle of this presentation of ``Classical Complex Analysis'' is to proceed as quickly as possible to the central results while using a small number of notions and concepts from other fields. Thus the prerequisites for understanding this book are minimal; only elementary facts of calculus and algebra are required. The first four chapters cover the essential core of complex analysis: - differentiation in C (including elementary facts about conformal mappings) - integration in C (including complex line integrals, Cauchy's Integral Theorem, and the Integral Formulas) - sequences and series of analytic functions, (isolated) singularities, Laurent series, calculus of residues - construction of analytic functions: the gamma function, Weierstrass' Factorization Theorem, Mittag-Leffler Partial Fraction Decomposition, and -as a particular highlight- the Riemann Mapping Theorem, which characterizes the simply connected domains in C. Further topics included are: - the theory of elliptic functions based on...
International Nuclear Information System (INIS)
Quinn, C.A.
1983-01-01
The article deals with spectrographic analysis and the analytical methods based on it. The theory of spectrographic analysis is discussed, as well as the layout of a spectrometer system. The infrared absorption spectrum of a compound is probably its most unique property. The absorption of infrared radiation depends on increasing the energy of vibration and rotation associated with a covalent bond. The infrared region is intrinsically low in energy; thus the design of infrared spectrometers is always directed toward maximizing energy throughput. The article also considers atomic absorption: flame atomizers, non-flame atomizers and the source of radiation. Under the section on emission spectroscopy, non-electrical energy sources, electrical energy sources and electrical flames are discussed. Digital computers now form part of developments in spectrographic instrumentation
Cheng, Lizhi; Luo, Yong; Chen, Bo
2014-01-01
This book could be divided into two parts i.e. fundamental wavelet transform theory and method and some important applications of wavelet transform. In the first part, as preliminary knowledge, the Fourier analysis, inner product space, the characteristics of Haar functions, and concepts of multi-resolution analysis, are introduced followed by a description on how to construct wavelet functions both multi-band and multi wavelets, and finally introduces the design of integer wavelets via lifting schemes and its application to integer transform algorithm. In the second part, many applications are discussed in the field of image and signal processing by introducing other wavelet variants such as complex wavelets, ridgelets, and curvelets. Important application examples include image compression, image denoising/restoration, image enhancement, digital watermarking, numerical solution of partial differential equations, and solving ill-conditioned Toeplitz system. The book is intended for senior undergraduate stude...
International Nuclear Information System (INIS)
Hwang, Hun
2007-02-01
This book explains potentiometry, voltammetry, amperometry and the basic concepts of conductometry in eleven chapters. It gives specific descriptions of the electrochemical cell and its modes, basic concepts of electrochemical analysis of oxidation-reduction reactions, standard electrode potential, formal potential, faradaic current and faradaic processes, mass transfer and overvoltage, potentiometry and indirect potentiometry, polarography with TAST, normal pulse and differential pulse modes, voltammetry, conductometry and conductometric titration.
International Nuclear Information System (INIS)
Badwe, R.A.
1999-01-01
The primary endpoint in the majority of the studies has been either disease recurrence or death. This kind of analysis requires a special method, since not all patients in the study experience the endpoint. The standard method for estimating such a survival distribution is the Kaplan-Meier method. The survival function is defined as the proportion of individuals who survive beyond a certain time. Multivariate comparison of survival has been carried out with Cox's proportional hazards model
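The Kaplan-Meier product-limit estimate described above can be sketched in plain Python (illustrative helper; an event flag of 1 marks death or recurrence and 0 marks a censored observation):

```python
from itertools import groupby

def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimate of the survival function S(t).
    At each distinct event time t_i, S is multiplied by (1 - d_i / n_i),
    where d_i = events at t_i and n_i = subjects still at risk just
    before t_i. Censored subjects (event=0) leave the risk set without
    pulling the curve down."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    s = 1.0
    curve = []                        # (time, S(t)) recorded at event times
    for t, group in groupby(data, key=lambda te: te[0]):
        group = list(group)
        deaths = sum(e for _, e in group)
        if deaths:
            s *= 1 - deaths / at_risk
            curve.append((t, s))
        at_risk -= len(group)         # deaths and censorings both leave
    return curve
```

This handling of censoring is precisely why the method copes with incomplete follow-up: a patient lost to follow-up reduces the future risk set but is never counted as having reached the endpoint.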
DEFF Research Database (Denmark)
Andersen, Lars
This book contains the lecture notes for the 9th semester course on elastodynamics. The first chapter gives an overview of the basic theory of stress waves propagating in viscoelastic media. In particular, the effect of surfaces and interfaces in a viscoelastic material is studied, and different... Thus, in Chapter 3, an alternative semi-analytic method is derived, which may be applied for the analysis of layered half-spaces subject to moving or stationary loads.
Mucha, Hans-Joachim; Sofyan, Hizir
2000-01-01
As an explorative technique, cluster analysis provides a description or a reduction in the dimension of the data. It classifies a set of observations into two or more mutually exclusive unknown groups based on combinations of many variables. Its aim is to construct groups in such a way that the profiles of objects in the same group are relatively homogeneous whereas the profiles of objects in different groups are relatively heterogeneous. Clustering is distinct from classification techniques, ...
International Nuclear Information System (INIS)
Garbarino, J.R.; Steinheimer, T.R.; Taylor, H.E.
1985-01-01
This is the twenty-first biennial review of the inorganic and organic analytical chemistry of water. The format of this review differs somewhat from previous reviews in this series - the most recent of which appeared in Analytical Chemistry in April 1983. Changes in format have occurred in the presentation of material concerning review articles and the inorganic analysis of water sections. Organic analysis of water sections are organized as in previous reviews. Review articles have been compiled and tabulated in an Appendix with respect to subject, title, author(s), citation, and number of references cited. The inorganic water analysis sections are now grouped by constituent using the periodic chart; for example, alkali, alkaline earth, 1st series transition metals, etc. Within these groupings the references are roughly grouped by instrumental technique; for example, spectrophotometry, atomic absorption spectrometry, etc. Multiconstituent methods for determining analytes that cannot be grouped in this manner are compiled into a separate section sorted by instrumental technique. References used in preparing this review were compiled from nearly 60 major journals published during the period from October 1982 through September 1984. Conference proceedings, most foreign journals, most trade journals, and most government publications are excluded. References cited were obtained using the American Chemical Society's Chemical Abstracts for sections on inorganic analytical chemistry, organic analytical chemistry, water, and sewage waste. Cross-references of these sections were also included. 860 references
Energy Technology Data Exchange (ETDEWEB)
None
1980-06-01
The Energy Policy and Conservation Act (EPCA) mandated that minimum energy efficiency standards be established for classes of refrigerators and refrigerator-freezers, freezers, clothes dryers, water heaters, room air conditioners, home heating equipment, kitchen ranges and ovens, central air conditioners, and furnaces. EPCA requires that standards be designed to achieve the maximum improvement in energy efficiency that is technologically feasible and economically justified. Following the introductory chapter, Chapter Two describes the methodology used in the economic analysis and its relationship to legislative criteria for consumer product efficiency assessment; details how the CPES Value Model systematically compared and evaluated the economic impacts of regulation on the consumer, manufacturer and Nation. Chapter Three briefly displays the results of the analysis and lists the proposed performance standards by product class. Chapter Four describes the reasons for developing a baseline forecast, characterizes the baseline scenario from which regulatory impacts were calculated and summarizes the primary models, data sources and assumptions used in the baseline formulations. Chapter Five summarizes the methodology used to calculate regulatory impacts; describes the impacts of energy performance standards relative to the baseline discussed in Chapter Four. Also discussed are regional standards and other program alternatives to performance standards. Chapter Six describes the procedure for balancing consumer, manufacturer, and national impacts to select standard levels. Details of models and data bases used in the analysis are included in Appendices A through K.
Newell, Homer E
2006-01-01
When employed with skill and understanding, vector analysis can be a practical and powerful tool. This text develops the algebra and calculus of vectors in a manner useful to physicists and engineers. Numerous exercises (with answers) not only provide practice in manipulation but also help establish students' physical and geometric intuition in regard to vectors and vector concepts.Part I, the basic portion of the text, consists of a thorough treatment of vector algebra and the vector calculus. Part II presents the illustrative matter, demonstrating applications to kinematics, mechanics, and e
Brand, Louis
2006-01-01
The use of vectors not only simplifies treatments of differential geometry, mechanics, hydrodynamics, and electrodynamics, but also makes mathematical and physical concepts more tangible and easy to grasp. This text for undergraduates was designed as a short introductory course to give students the tools of vector algebra and calculus, as well as a brief glimpse into these subjects' manifold applications. The applications are developed to the extent that the uses of the potential function, both scalar and vector, are fully illustrated. Moreover, the basic postulates of vector analysis are brou
Abbott, Stephen
2015-01-01
This lively introductory text exposes the student to the rewards of a rigorous study of functions of a real variable. In each chapter, informal discussions of questions that give analysis its inherent fascination are followed by precise, but not overly formal, developments of the techniques needed to make sense of them. By focusing on the unifying themes of approximation and the resolution of paradoxes that arise in the transition from the finite to the infinite, the text turns what could be a daunting cascade of definitions and theorems into a coherent and engaging progression of ideas. Acutely aware of the need for rigor, the student is much better prepared to understand what constitutes a proper mathematical proof and how to write one. Fifteen years of classroom experience with the first edition of Understanding Analysis have solidified and refined the central narrative of the second edition. Roughly 150 new exercises join a selection of the best exercises from the first edition, and three more project-sty...
DEFF Research Database (Denmark)
Moore, R; Brødsgaard, I; Miller, ML
1997-01-01
A quantitative method for validating qualitative interview results and checking sample parameters is described and illustrated using common pain descriptions among a sample of Anglo-American and Mandarin Chinese patients and dentists matched by age and gender. Measures of consistency in use of descriptors within groups, validity of description, accuracy of individuals compared with others in their group, and minimum required sample size were calculated using Cronbach's alpha, factor analysis, and Bayesian probability. Covalidating questionnaires that reflect results of qualitative interviews are recommended in order to estimate sample parameters such as intersubject agreement, individual subject accuracy, and minimum required sample sizes. Ethnic and professional differences within and across...
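Of the statistics named above, Cronbach's alpha is the simplest to sketch (illustrative function written here for the example; rows are subjects, columns are questionnaire items):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for k items (columns) over n subjects (rows):
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score).
    Values near 1 indicate internally consistent use of the items."""
    k = len(items[0])
    cols = list(zip(*items))

    def var(xs):                       # unbiased sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = sum(var(c) for c in cols)
    totals = [sum(row) for row in items]
    return k / (k - 1) * (1 - item_vars / var(totals))
```

Alpha near 1 would indicate that subjects use the pain descriptors consistently across items; values near or below 0 indicate little shared signal among the items.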
International Nuclear Information System (INIS)
Iorio, A.F.; Crespi, J.C.
1987-01-01
After ten years of operation at the Atucha I Nuclear Power Station, a gear belonging to a pressurized heavy-water reactor refuelling machine failed. The gear box was used to operate the inlet-outlet heavy-water valve of the machine. Visual examination of the gear device showed an absence of lubricant and that several gear teeth were broken at the root. Motion was transmitted through a speed-reducing device with controlled adjustable times in order to produce a proper fit of the valve closure. The aim of this paper is to discuss the results of the gear failure analysis and to recommend the proper solution to prevent further failures. (Author)
International Nuclear Information System (INIS)
1988-01-01
In a search for correlations between the elemental composition of trace elements in human stones and the stone types in relation to their growth pattern, a combined PIXE and X-ray diffraction spectrometry approach was implemented. The combination of scanning PIXE and XRD has proved to be an advance in the methodology of stone analysis and may point to the growth pattern in the body. The exact role of trace elements in the formation and growth of urinary stones is not fully understood. Efforts are thus continuing, firstly to solve the analytical problems concerned and secondly to design suitable experiments that would provide information about the occurrence and distribution of trace elements in urine. 1 fig., 1 ref
International Nuclear Information System (INIS)
Straub, W.A.
1987-01-01
This review is the seventh in the series compiled by using the Dialog on-line CA Search facilities at the Information Resource Center of the USS Technical Center, covering the period from Oct. 1984 to Nov. 1, 1986. The quest for better surface properties, through the application of various electrochemical and other coating techniques, seems to have increased and reinforces the notion that only through the value added to a steel by proper finishing steps can a major supplier hope to compete profitably. The detection, determination, and control of microalloying constituents have also been generating a lot of interest, as evidenced by the number of publications devoted to this subject in the last few years. Several recent review articles amplify on the recent trends in the application of modern analytical technology to steelmaking. Another review has been devoted to the determination of trace elements and the simultaneous determination of elements in metals by mass spectrometry, atomic absorption spectrometry, and multielement emission spectrometry. Problems associated with the analysis of electroplating wastewaters have been reviewed in a recent publication that describes the use of various spectrophotometric methods for this purpose. The collection and treatment of analytical data in the modern steelmaking environment have been extensively reviewed, with emphasis on the interaction of the providers and users of the analytical data, its quality, and the cost of its collection. Raw material treatment and beneficiation was the dominant theme
Bhatia, Rajendra
1997-01-01
A good part of matrix theory is functional analytic in spirit. This statement can be turned around. There are many problems in operator theory, where most of the complexities and subtleties are present in the finite-dimensional case. My purpose in writing this book is to present a systematic treatment of methods that are useful in the study of such problems. This book is intended for use as a text for upper division and graduate courses. Courses based on parts of the material have been given by me at the Indian Statistical Institute and at the University of Toronto (in collaboration with Chandler Davis). The book should also be useful as a reference for research workers in linear algebra, operator theory, mathematical physics and numerical analysis. A possible subtitle of this book could be Matrix Inequalities. A reader who works through the book should expect to become proficient in the art of deriving such inequalities. Other authors have compared this art to that of cutting diamonds. One first has to...
International Nuclear Information System (INIS)
Thomas, R.E.
1982-03-01
An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software
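The Latin hypercube design discussed in this record can be illustrated with a short generic sketch (our own NumPy illustration, not code from the report; the function name is an assumption):

```python
import numpy as np

def latin_hypercube(n_samples, n_params, rng=None):
    """Latin hypercube sample on the unit hypercube: each parameter's
    range is split into n_samples equal strata, and every stratum is
    sampled exactly once per parameter."""
    rng = np.random.default_rng(rng)
    jitter = rng.random((n_samples, n_params))            # position within each stratum
    strata = np.column_stack([rng.permutation(n_samples)  # one permutation per parameter
                              for _ in range(n_params)])
    return (strata + jitter) / n_samples

# Propagating such a design through a model yields an estimate of the
# output probability density; the rank transformations and stepwise
# regression mentioned above are applied to these samples afterwards.
```

Each column covers every stratum of [0, 1) exactly once, which is what distinguishes the design from plain random sampling.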
Energy Technology Data Exchange (ETDEWEB)
Ibsen, Lars Bo; Liingaard, M.
2006-12-15
This technical report concerns the basic theory and principles for experimental modal analysis. The sections within the report are: Output-only modal analysis software, general digital analysis, basics of structural dynamics and modal analysis and system identification. (au)
Theoretical numerical analysis a functional analysis framework
Atkinson, Kendall
2005-01-01
This textbook prepares graduate students for research in numerical analysis/computational mathematics by giving them a mathematical framework embedded in functional analysis and focused on numerical analysis. This helps the student to move rapidly into a research program. The text covers basic results of functional analysis, approximation theory, Fourier analysis and wavelets, iteration methods for nonlinear equations, finite difference methods, Sobolev spaces and weak formulations of boundary value problems, finite element methods, elliptic variational inequalities and their numerical solu
International Nuclear Information System (INIS)
2003-08-01
This book deals with heat transfer analysis, including examples of nonlinear analysis, radiation heat transfer, heat transfer analysis in ANSYS, verification of analysis results, analysis of transient heat transfer with automatic time stepping and open control, heat transfer analysis using ANSYS arrays, thermal contact resistance, coupled-field analysis such as thermal-structural interaction, cases of coupled-field analysis, and phase change.
Information security risk analysis
Peltier, Thomas R
2001-01-01
Effective Risk Analysis; Qualitative Risk Analysis; Value Analysis; Other Qualitative Methods; Facilitated Risk Analysis Process (FRAP); Other Uses of Qualitative Risk Analysis; Case Study; Appendix A: Questionnaire; Appendix B: Facilitated Risk Analysis Process Forms; Appendix C: Business Impact Analysis Forms; Appendix D: Sample of Report; Appendix E: Threat Definitions; Appendix F: Other Risk Analysis Opinions; Index
International Nuclear Information System (INIS)
Son, Seung Hui
2004-02-01
This book deals with information technology and business processes, information system architecture, methods of system development, system development planning (including problem analysis and feasibility analysis), cases of system development, understanding users' demands, analysis of users' demands using traditional analysis, users' demands analysis using integrated information system architecture, system design using integrated information system architecture, system implementation, and system maintenance.
Optimal pattern synthesis for speech recognition based on principal component analysis
Korsun, O. N.; Poliyev, A. V.
2018-02-01
The algorithm for building an optimal pattern for the purpose of automatic speech recognition, which increases the probability of correct recognition, is developed and presented in this work. The optimal pattern forming is based on the decomposition of an initial pattern into principal components, which makes it possible to reduce the dimension of the multi-parameter optimization problem. At the next step the training samples are introduced, and the optimal estimates of the principal-component decomposition coefficients are obtained by a numerical parameter-optimization algorithm. Finally, we consider experimental results that show the improvement in speech recognition introduced by the proposed optimization algorithm.
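The decomposition step described in this record can be sketched generically (an illustrative NumPy PCA under our own naming, not the authors' code; the optimizer that would adjust the coefficients is omitted):

```python
import numpy as np

def pca_reduce(patterns, k):
    """Project patterns (rows = observations) onto the top-k principal
    components, shrinking the dimension of the later optimization."""
    mean = patterns.mean(axis=0)
    centered = patterns - mean
    # SVD of the centered data: rows of vt are the principal directions
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    components = vt[:k]
    coeffs = centered @ components.T   # decomposition coefficients per pattern
    return coeffs, components, mean

def reconstruct(coeffs, components, mean):
    """Rebuild a pattern from (possibly optimized) coefficients."""
    return coeffs @ components + mean
```

A numeric optimizer then searches over the k coefficients instead of the full pattern dimension, which is the dimension reduction the abstract refers to.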
LISA - a powerful program package for LIstmode and Spectral data Analysis
International Nuclear Information System (INIS)
Oberstedt, A.; Hambsch, F.J.
1994-01-01
LISA is a graphical program package that enables both off-line listmode and spectral data evaluation as well as on-line monitoring while multi-parameter experiments are running. It can be executed on any computer with a UNIX operating system and an X-Window environment running PV-WAVE from Visual Numerics Inc. The package is basically written in the PV-WAVE CL language, but the integration of procedures written in C and the execution of UNIX shell commands provide a further increase in performance. (orig.)
Analysis of Project Finance | Energy Analysis | NREL
NREL analysis helps potential renewable energy developers and investors gain insights into the complex world of project finance. Renewable energy project finance is complex, requiring knowledge of federal tax credits, state-level incentives, renewable
International Nuclear Information System (INIS)
Wright, A.C.D.
2002-01-01
This paper discusses the fundamentals of safety analysis in reactor design. Safety analysis is done to show that the consequences of postulated accidents are acceptable. It is also used to set the design of special safety systems, and includes design-assist analysis to support conceptual design. Safety analysis is necessary for licensing a reactor, maintaining an operating license, and supporting changes in plant operations
An example of multidimensional analysis: Discriminant analysis
International Nuclear Information System (INIS)
Lutz, P.
1990-01-01
Among the approaches to multi-dimensional data analysis, lectures on discriminant analysis covering theoretical and practical aspects are presented. The discrimination problem, the analysis steps and the discrimination categories are stressed. Examples are given of descriptive historical analysis, discrimination for decision making, and the demonstration and separation of the top quark. In the linear discriminant analysis the following subjects are discussed: Huyghens' theorem, projection, the discriminant variable, geometrical interpretation, the case g = 2, the classification method, and the separation of top events. Criteria for obtaining relevant results are included. [fr]
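The linear discriminant machinery listed in this record (a discriminant variable, the g = 2 case, a classification rule) can be sketched generically; this is an illustrative NumPy implementation of the two-class Fisher discriminant, not code from the lectures, and all names are ours:

```python
import numpy as np

def fisher_discriminant(x0, x1):
    """Two-class (g = 2) linear discriminant: the direction w maximizing
    between-class over within-class scatter, w = Sw^-1 (m1 - m0)."""
    m0, m1 = x0.mean(axis=0), x1.mean(axis=0)
    # pooled within-class scatter matrix
    sw = (np.cov(x0, rowvar=False) * (len(x0) - 1)
          + np.cov(x1, rowvar=False) * (len(x1) - 1))
    w = np.linalg.solve(sw, m1 - m0)      # discriminant direction
    threshold = w @ (m0 + m1) / 2         # midpoint rule for classification
    return w, threshold

def classify(x, w, threshold):
    """Project onto the discriminant variable and threshold: 1 if class 1."""
    return (x @ w > threshold).astype(int)
```

Projecting events onto w is the one-dimensional "discriminant variable" view; the midpoint threshold is the simplest of the classification rules such lectures typically compare.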
Also known as: Sperm Analysis, Sperm Count, Seminal Fluid Analysis. Formal name: Semen Analysis. ... semen Viscosity—consistency or thickness of the semen; Sperm count—total number of sperm; Sperm concentration (density)—number ...
Papageorgiou, Nikolaos S
2009-01-01
Offers an examination of important theoretical methods and procedures in applied analysis. This book details the important theoretical trends in nonlinear analysis and applications to different fields. It is suitable for those working on nonlinear analysis.
Shape analysis in medical image analysis
Tavares, João
2014-01-01
This book contains thirteen contributions from invited experts of international recognition addressing important issues in shape analysis in medical image analysis, including techniques for image segmentation, registration, modelling and classification, and applications in biology, as well as in cardiac, brain, spine, chest, lung and clinical practice. This volume treats topics such as anatomic and functional shape representation and matching; shape-based medical image segmentation; shape registration; statistical shape analysis; shape deformation; shape-based abnormity detection; shape tracking and longitudinal shape analysis; machine learning for shape modeling and analysis; shape-based computer-aided-diagnosis; shape-based medical navigation; benchmark and validation of shape representation, analysis and modeling algorithms. This work will be of interest to researchers, students, and manufacturers in the fields of artificial intelligence, bioengineering, biomechanics, computational mechanics, computationa...
Marine Modeling and Analysis Branch (MMAB) SST analysis: real-time global sea surface temperature (RTG_SST_HR) analysis. For a regional map, click the desired area in the global SST analysis and anomaly maps
Foundations of factor analysis
Mulaik, Stanley A
2009-01-01
Introduction; Factor Analysis and Structural Theories; Brief History of Factor Analysis as a Linear Model; Example of Factor Analysis; Mathematical Foundations for Factor Analysis; Introduction; Scalar Algebra; Vectors; Matrix Algebra; Determinants; Treatment of Variables as Vectors; Maxima and Minima of Functions; Composite Variables and Linear Transformations; Introduction; Composite Variables; Unweighted Composite Variables; Differentially Weighted Composites; Matrix Equations; Multi...
International Nuclear Information System (INIS)
PECH, S.H.
2000-01-01
This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report
Quantitative analysis chemistry
International Nuclear Information System (INIS)
Ko, Wansuk; Lee, Choongyoung; Jun, Kwangsik; Hwang, Taeksung
1995-02-01
This book is about quantitative analysis chemistry. It is divided into ten chapters, which cover the basic concepts of analytical chemistry and SI units, chemical equilibrium, basic preparation for quantitative analysis, an introduction to volumetric analysis, an outline of acid-base titration with experimental examples, chelate titration, oxidation-reduction titration (with an introduction, titration curves, and diazotization titration), precipitation titration, electrometric titration, and quantitative analysis.
Energy Technology Data Exchange (ETDEWEB)
PECH, S.H.
2000-08-23
This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.
International Nuclear Information System (INIS)
WEBB, R.H.
1999-01-01
This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Safety Analysis Report (HNF-SD-WM-SAR-062/Rev.4). This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report
Philipp Mayring
2000-01-01
The article describes an approach of systematic, rule guided qualitative text analysis, which tries to preserve some methodological strengths of quantitative content analysis and widen them to a concept of qualitative procedure. First the development of content analysis is delineated and the basic principles are explained (units of analysis, step models, working with categories, validity and reliability). Then the central procedures of qualitative content analysis, inductive development of ca...
RELIABILITY ANALYSIS OF BENDING ...
African Journals Online (AJOL)
Reliability analysis of the safety levels of the criteria slabs has been ... It was also noted [2] that if the risk level (or β) < 3.1, the ... reliability analysis. A study [6] has shown that all geometric variables ... Germany, 1988. 12. Hasofer, A. M. and ...
DTI analysis methods : Voxel-based analysis
Van Hecke, Wim; Leemans, Alexander; Emsell, Louise
2016-01-01
Voxel-based analysis (VBA) of diffusion tensor imaging (DTI) data permits the investigation of voxel-wise differences or changes in DTI metrics in every voxel of a brain dataset. It is applied primarily in the exploratory analysis of hypothesized group-level alterations in DTI parameters, as it does
Analysis of Precision of Activation Analysis Method
DEFF Research Database (Denmark)
Heydorn, Kaj; Nørgaard, K.
1973-01-01
The precision of an activation-analysis method prescribes the estimation of the precision of a single analytical result. The adequacy of these estimates to account for the observed variation between duplicate results from the analysis of different samples and materials, is tested by the statistic T...
Hazard Analysis Database Report
Grams, W H
2000-01-01
The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for HNF-SD-WM-SAR-067, Tank Farms Final Safety Analysis Report (FSAR). The FSAR is part of the approved Authorization Basis (AB) for the River Protection Project (RPP). This document describes, identifies, and defines the contents and structure of the Tank Farms FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The Hazard Analysis Database supports the preparation of Chapters 3, 4, and 5 of the Tank Farms FSAR and the Unreviewed Safety Question (USQ) process and consists of two major, interrelated data sets: (1) Hazard Analysis Database: Data from t...
Santiago, John
2013-01-01
Circuits overloaded from electric circuit analysis? Many universities require that students pursuing a degree in electrical or computer engineering take an Electric Circuit Analysis course to determine who will "make the cut" and continue in the degree program. Circuit Analysis For Dummies will help these students to better understand electric circuit analysis by presenting the information in an effective and straightforward manner. Circuit Analysis For Dummies gives you clear-cut information about the topics covered in an electric circuit analysis course to help
Cluster analysis for applications
Anderberg, Michael R
1973-01-01
Cluster Analysis for Applications deals with methods and various applications of cluster analysis. Topics covered range from variables and scales to measures of association among variables and among data units. Conceptual problems in cluster analysis are discussed, along with hierarchical and non-hierarchical clustering methods. The necessary elements of data analysis, statistics, cluster analysis, and computer implementation are integrated vertically to cover the complete path from raw data to a finished analysis.Comprised of 10 chapters, this book begins with an introduction to the subject o
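The hierarchical clustering methods the book covers can be illustrated with a deliberately naive single-linkage sketch (our own toy implementation for illustration, quadratic in the number of points, not code from the book):

```python
import numpy as np

def single_linkage(points, n_clusters):
    """Naive agglomerative clustering with single linkage: repeatedly
    merge the two clusters whose closest members are nearest, until
    only n_clusters remain."""
    clusters = [[i] for i in range(len(points))]
    # pairwise distance matrix between all data units
    d = np.linalg.norm(points[:, None] - points[None, :], axis=-1)
    while len(clusters) > n_clusters:
        best = None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                dist = min(d[i, j] for i in clusters[a] for j in clusters[b])
                if best is None or dist < best[0]:
                    best = (dist, a, b)
        _, a, b = best
        clusters[a] += clusters.pop(b)   # merge the closest pair of clusters
    return clusters
```

Non-hierarchical methods (k-means and relatives) instead fix the number of clusters up front and iteratively reassign points; the measure of association chosen for `d` is the key design decision either way.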
Activation analysis in food analysis. Pt. 9
International Nuclear Information System (INIS)
Szabo, S.A.
1992-01-01
An overview is presented of the application of activation analysis (AA) techniques for food analysis, as reflected at a recent international conference titled Activation Analysis and its Applications. The most popular analytical techniques include instrumental neutron AA (INAA or NAA), radiochemical NAA (RNAA), X-ray fluorescence analysis and mass spectrometry. Data are presented for the multielemental NAA of instant soups, the elemental composition of drinking water in Iraq, the Na, K and Mn contents of various Indian rices, the determination of As, Hg, Sb and Se in various seafoods, daily microelement intake in China, and the elemental composition of Chinese teas. Expected development trends in AA are outlined. (R.P.) 24 refs.; 8 tabs
Joint fluid analysis; Joint fluid aspiration ... El-Gabalawy HS. Synovial fluid analysis, synovial biopsy, and synovial pathology. In: Firestein GS, Budd RC, Gabriel SE, McInnes IB, O'Dell JR, eds. Kelly's Textbook of ...
International Nuclear Information System (INIS)
Burgess, R.L.
1978-01-01
Progress is reported on the following research programs: analysis and modeling of ecosystems; EDFB/IBP data center; biome analysis studies; land/water interaction studies; and computer programs for development of models
Confirmatory Composite Analysis
Schuberth, Florian; Henseler, Jörg; Dijkstra, Theo K.
2018-01-01
We introduce confirmatory composite analysis (CCA) as a structural equation modeling technique that aims at testing composite models. CCA entails the same steps as confirmatory factor analysis: model specification, model identification, model estimation, and model testing. Composite models are
Introductory numerical analysis
Pettofrezzo, Anthony J
2006-01-01
Written for undergraduates who require a familiarity with the principles behind numerical analysis, this classical treatment encompasses finite differences, least squares theory, and harmonic analysis. Over 70 examples and 280 exercises. 1967 edition.
Gap Analysis: Application to Earned Value Analysis
Langford, Gary O.; Franck, Raymond (Chip)
2008-01-01
Sponsored Report (for Acquisition Research Program) Earned Value is regarded as a useful tool to monitor commercial and defense system acquisitions. This paper applies the theoretical foundations and systematics of Gap Analysis to improve Earned Value Management. As currently implemented, Earned Value inaccurately provides a higher value for the work performed. This preliminary research indicates that Earned Value calculations can be corrected. Value Analysis, properly defined and enacted,...
Importance-performance analysis based SWOT analysis
Phadermrod, Boonyarat; Crowder, Richard M.; Wills, Gary B.
2016-01-01
SWOT analysis, a commonly used tool for strategic planning, is traditionally a form of brainstorming. Hence, it has been criticised as likely to reflect the subjective views of the individuals who participate in a brainstorming session, and because SWOT factors are not prioritized by their significance it may result in an improper strategic action. While most studies of SWOT analysis have only focused on solving these shortcomings separately, this study offers an approach to diminish both s...
Discourse analysis and Foucault's
Directory of Open Access Journals (Sweden)
Jansen I.
2008-01-01
Discourse analysis is a method that until now has been less recognized in nursing science, although more recently nursing scientists have been discovering it for their purposes. However, several authors have criticized that discourse analysis is often misinterpreted because of a lack of understanding of its theoretical background. In this article, I reconstruct Foucault's writings in his "Archaeology of Knowledge" to provide a theoretical base for future archaeological discourse analysis, which can be categorized as a socio-linguistic discourse analysis.
Yuan, Ying; MacKinnon, David P.
2009-01-01
This article proposes Bayesian analysis of mediation effects. Compared to conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian mediation analysis, inference is straightforward and exact, which makes it appealing for studies with small samples. Third, the Bayesian approach is conceptua...
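The simulation-based inference this record alludes to can be sketched with a minimal Monte Carlo approximation of the indirect effect a*b (our own illustrative simplification, assuming flat priors and a normal posterior for each path coefficient; full Bayesian models would sample the paths jointly, and the input estimates here are hypothetical):

```python
import numpy as np

def mediation_posterior(a_hat, se_a, b_hat, se_b, draws=100_000, seed=0):
    """Monte Carlo approximation of the posterior of the mediated effect
    a*b, treating each path coefficient's posterior as Normal(estimate, SE)."""
    rng = np.random.default_rng(seed)
    ab = rng.normal(a_hat, se_a, draws) * rng.normal(b_hat, se_b, draws)
    lo, hi = np.percentile(ab, [2.5, 97.5])   # 95% credible interval
    return ab.mean(), (lo, hi)

# Hypothetical path estimates: a = 0.4 (SE 0.1), b = 0.5 (SE 0.1)
mean_ab, ci = mediation_posterior(0.4, 0.1, 0.5, 0.1)
# mean_ab is close to 0.4 * 0.5 = 0.2; the credible interval is
# exact simulation-based inference, useful for small samples
```

Informative priors on a and b could replace the flat-prior normals above, which is the efficiency gain the abstract mentions.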
Fitzmaurice, Garrett M; Ware, James H
2012-01-01
Praise for the First Edition: ". . . [this book] should be on the shelf of everyone interested in . . . longitudinal data analysis." -Journal of the American Statistical Association Features newly developed topics and applications of the analysis of longitudinal data Applied Longitudinal Analysis, Second Edition presents modern methods for analyzing data from longitudinal studies and now features the latest state-of-the-art techniques. The book emphasizes practical, rather than theoretical, aspects of methods for the analysis of diverse types of lo
Regression analysis by example
Chatterjee, Samprit
2012-01-01
Praise for the Fourth Edition: "This book is . . . an excellent source of examples for regression analysis. It has been and still is readily readable and understandable." -Journal of the American Statistical Association Regression analysis is a conceptually simple method for investigating relationships among variables. Carrying out a successful application of regression analysis, however, requires a balance of theoretical results, empirical rules, and subjective judgment. Regression Analysis by Example, Fifth Edition has been expanded
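As a worked illustration of the book's subject (not an example taken from it), a minimal ordinary least-squares fit in NumPy, with invented data:

```python
import numpy as np

# Invented data: y is roughly linear in x
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.0, 9.8])

X = np.column_stack([np.ones_like(x), x])      # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # least-squares estimates [b0, b1]
residuals = y - X @ beta                        # what the "subjective judgment" inspects
# beta is approximately [0.15, 1.95] for these data
```

The theoretical result gives `beta`; examining `residuals` for patterns is where the empirical rules and judgment the blurb mentions come in.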
2014-01-01
M.Ing. (Electrical & Electronic Engineering) One of the most important steps to be taken before a site is to be selected for the extraction of wind energy is the analysis of the energy within the wind on that particular site. No wind energy analysis system exists for the measurement and analysis of wind power. This dissertation documents the design and development of a Wind Energy Analysis System (WEAS). Using a micro-controller based design in conjunction with sensors, WEAS measure, calcu...
Slice hyperholomorphic Schur analysis
Alpay, Daniel; Sabadini, Irene
2016-01-01
This book defines and examines the counterpart of Schur functions and Schur analysis in the slice hyperholomorphic setting. It is organized into three parts: the first introduces readers to classical Schur analysis, while the second offers background material on quaternions, slice hyperholomorphic functions, and quaternionic functional analysis. The third part represents the core of the book and explores quaternionic Schur analysis and its various applications. The book includes previously unpublished results and provides the basis for new directions of research.
Computational movement analysis
Laube, Patrick
2014-01-01
This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fi
Thompson, Cheryl Bagley
2009-01-01
This 13th article of the Basics of Research series is first in a short series on statistical analysis. These articles will discuss creating your statistical analysis plan, levels of measurement, descriptive statistics, probability theory, inferential statistics, and general considerations for interpretation of the results of a statistical analysis.
Trend Analysis Using Microcomputers.
Berger, Carl F.
A trend analysis statistical package and additional programs for the Apple microcomputer are presented. They illustrate strategies of data analysis suitable to the graphics and processing capabilities of the microcomputer. The programs analyze data sets using examples of: (1) analysis of variance with multiple linear regression; (2) exponential…
Yuan, Ying; MacKinnon, David P.
2009-01-01
In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…
Automation of activation analysis
International Nuclear Information System (INIS)
Ivanov, I.N.; Ivanets, V.N.; Filippov, V.V.
1985-01-01
The basic data on the methods and equipment of activation analysis are presented. Recommendations on the selection of activation analysis techniques, especially those involving short-lived isotopes, are given. The possibilities of increasing data-path throughput by using modern computers to automate the analysis and data-processing procedures are shown
Cuesta, Hector
2013-01-01
Each chapter of the book quickly introduces a key 'theme' of Data Analysis, before immersing you in the practical aspects of each theme. You'll learn quickly how to perform all aspects of Data Analysis.Practical Data Analysis is a book ideal for home and small business users who want to slice & dice the data they have on hand with minimum hassle.
Implementing an Executive-Function Syllabus: Operational Issues
Directory of Open Access Journals (Sweden)
Russell Jay Hendel
2016-08-01
A recent approach to pedagogic challenge, in contrast to the hierarchy approaches of Bloom, Anderson, Gagne, Van Hiele, Marzano, Webb and many others, identifies pedagogic challenge with executive function: pedagogy is defined as challenging if it addresses executive function. Executive function, in turn, is defined by the presence of multiple modalities of topic approach and a multi-parameter development of the topic. This paper discusses operational issues in implementing a teaching methodology based on multi-parameter problems. It advocates teaching a multi-parameter topic using a step-by-step incremental approach, introducing one parameter at a time. Examples are presented from trigonometry, actuarial mathematics, statistics and (biblical) literary analysis. The paper also discusses the use of the incremental approach for problem creation and remediation.
Mathematical analysis fundamentals
Bashirov, Agamirza
2014-01-01
The author's goal is a rigorous presentation of the fundamentals of analysis, starting from the elementary level and moving to advanced coursework. The curricula of all mathematics (pure or applied) and physics programs include a compulsory course in mathematical analysis. This book can serve as the main textbook for such (one-semester) courses. The book can also serve as additional reading for courses such as real analysis, functional analysis, harmonic analysis etc. For non-math major students requiring math beyond calculus, this is a more friendly approach than many math-centric o
Foundations of mathematical analysis
Johnsonbaugh, Richard
2010-01-01
This classroom-tested volume offers a definitive look at modern analysis, with views of applications to statistics, numerical analysis, Fourier series, differential equations, mathematical analysis, and functional analysis. Upper-level undergraduate students with a background in calculus will benefit from its teachings, along with beginning graduate students seeking a firm grounding in modern analysis. A self-contained text, it presents the necessary background on the limit concept, and the first seven chapters could constitute a one-semester introduction to limits. Subsequent chapters discuss
Analysis in usability evaluations
DEFF Research Database (Denmark)
Følstad, Asbjørn; Lai-Chong Law, Effie; Hornbæk, Kasper
2010-01-01
While the planning and implementation of usability evaluations are well described in the literature, the analysis of the evaluation data is not. We present interviews with 11 usability professionals on how they conduct analysis, describing the resources, collaboration, creation of recommendations......, and prioritization involved. The interviews indicate a lack of structure in the analysis process and suggest activities, such as generating recommendations, that are unsupported by existing methods. We discuss how to better support analysis, and propose four themes for future research on analysis in usability...
DEFF Research Database (Denmark)
Bøving, Kristian Billeskov; Simonsen, Jesper
2004-01-01
This article documents how log analysis can inform qualitative studies concerning the usage of web-based information systems (WIS). No prior research has used http log files as data to study collaboration between multiple users in organisational settings. We investigate how to perform http log...... analysis; what http log analysis says about the nature of collaborative WIS use; and how results from http log analysis may support other data collection methods such as surveys, interviews, and observation. The analysis of log files initially lends itself to research designs, which serve to test...... hypotheses using a quantitative methodology. We show that http log analysis can also be valuable in qualitative research such as case studies. The results from http log analysis can be triangulated with other data sources and for example serve as a means of supporting the interpretation of interview data...
Amir Farbin
The ATLAS Analysis Model is a continually developing vision of how to reconcile physics analysis requirements with the ATLAS offline software and computing model constraints. In the past year this vision has influenced the evolution of the ATLAS Event Data Model, the Athena software framework, and physics analysis tools. These developments, along with the October Analysis Model Workshop and the planning for CSC analyses, have led to a rapid refinement of the ATLAS Analysis Model in the past few months. This article introduces some of the relevant issues and presents the current vision of the future ATLAS Analysis Model. Event Data Model The ATLAS Event Data Model (EDM) consists of several levels of detail, each targeted at a specific set of tasks. For example, the Event Summary Data (ESD) stores calorimeter cells and tracking system hits, thereby permitting many calibration and alignment tasks, but will only be accessible at particular computing sites with potentially large latency. In contrast, the Analysis...
Multivariate analysis with LISREL
Jöreskog, Karl G; Y Wallentin, Fan
2016-01-01
This book traces the theory and methodology of multivariate statistical analysis and shows how it can be conducted in practice using the LISREL computer program. It presents not only the typical uses of LISREL, such as confirmatory factor analysis and structural equation models, but also several other multivariate analysis topics, including regression (univariate, multivariate, censored, logistic, and probit), generalized linear models, multilevel analysis, and principal component analysis. It provides numerous examples from several disciplines and discusses and interprets the results, illustrated with sections of output from the LISREL program, in the context of the example. The book is intended for masters and PhD students and researchers in the social, behavioral, economic and many other sciences who require a basic understanding of multivariate statistical theory and methods for their analysis of multivariate data. It can also be used as a textbook on various topics of multivariate statistical analysis.
International Nuclear Information System (INIS)
Actis, Oxana; Brodski, Michael; Erdmann, Martin; Fischer, Robert; Hinzmann, Andreas; Mueller, Gero; Muenzer, Thomas; Plum, Matthias; Steggemann, Jan; Winchen, Tobias; Klimkovich, Tatsiana
2010-01-01
VISPA is a development environment for high energy physics analyses which enables physicists to combine graphical and textual work. A physics analysis cycle consists of prototyping, performing, and verifying the analysis. The main feature of VISPA is a multipurpose window for visual steering of analysis steps, creation of analysis templates, and browsing physics event data at different steps of an analysis. VISPA follows an experiment-independent approach and incorporates various tools for steering and controlling required in a typical analysis. Connection to different frameworks of high energy physics experiments is achieved by using different types of interfaces. We present the look-and-feel for an example physics analysis at the LHC and explain the underlying software concepts of VISPA.
Cost benefit analysis cost effectiveness analysis
International Nuclear Information System (INIS)
Lombard, J.
1986-09-01
The comparison of various protection options in order to determine the best compromise between the cost of protection and the residual risk is the purpose of the ALARA procedure. Decision-aiding techniques are valuable in such selection procedures. The purpose of this study is to introduce two rather simple and well-known decision-aiding techniques: cost-effectiveness analysis and cost-benefit analysis. These two techniques are relevant for the large class of ALARA decisions that require a quantitative technique. The study is based on a hypothetical case of 10 protection options. Four methods are applied to the data
International Nuclear Information System (INIS)
Sommer, S; Tinh Tran, T.
2008-01-01
Washington Safety Management Solutions, LLC developed web-based software to improve the efficiency and consistency of hazard identification and analysis, control selection and classification, and to standardize analysis reporting at Savannah River Site. In the new nuclear age, information technology provides methods to improve the efficiency of the documented safety analysis development process which includes hazard analysis activities. This software provides a web interface that interacts with a relational database to support analysis, record data, and to ensure reporting consistency. A team of subject matter experts participated in a series of meetings to review the associated processes and procedures for requirements and standard practices. Through these meetings, a set of software requirements were developed and compiled into a requirements traceability matrix from which software could be developed. The software was tested to ensure compliance with the requirements. Training was provided to the hazard analysis leads. Hazard analysis teams using the software have verified its operability. The software has been classified as NQA-1, Level D, as it supports the analysis team but does not perform the analysis. The software can be transported to other sites with alternate risk schemes. The software is being used to support the development of 14 hazard analyses. User responses have been positive with a number of suggestions for improvement which are being incorporated as time permits. The software has enforced a uniform implementation of the site procedures. The software has significantly improved the efficiency and standardization of the hazard analysis process
Functional analysis and applications
Siddiqi, Abul Hasan
2018-01-01
This self-contained textbook discusses all major topics in functional analysis. Combining classical material with new methods, it supplies numerous relevant solved examples and problems and discusses the applications of functional analysis in diverse fields. The book is unique in its scope, with a variety of applications of functional analysis and operator-theoretic methods treated for each area of application. Each chapter includes a set of problems, some of which are routine and elementary, and some of which are more advanced. The book is primarily intended as a textbook for graduate and advanced undergraduate students in applied mathematics and engineering. It offers several attractive features making it ideally suited for courses on functional analysis intended to provide a basic introduction to the subject and the impact of functional analysis on applied and computational mathematics, nonlinear functional analysis and optimization. It introduces emerging topics like wavelets, Gabor systems, inverse pro...
DEFF Research Database (Denmark)
Bemman, Brian; Meredith, David
it with a “ground truth” analysis of the same music produced by a human expert (see, in particular, [5]). In this paper, we explore the problem of generating an encoding of the musical surface of a work automatically from a systematic encoding of an analysis. The ability to do this depends on one having...... an effective (i.e., computable), correct and complete description of some aspect of the structure of the music. Generating the surface structure of a piece from an analysis in this manner serves as a proof of the analysis' correctness, effectiveness and completeness. We present a reductive analysis......In recent years, a significant body of research has focused on developing algorithms for computing analyses of musical works automatically from encodings of these works' surfaces [3,4,7,10,11]. The quality of the output of such analysis algorithms is typically evaluated by comparing...
Fundamentals of functional analysis
Farenick, Douglas
2016-01-01
This book provides a unique path for graduate or advanced undergraduate students to begin studying the rich subject of functional analysis with fewer prerequisites than is normally required. The text begins with a self-contained and highly efficient introduction to topology and measure theory, which focuses on the essential notions required for the study of functional analysis, and which are often buried within full-length overviews of the subjects. This is particularly useful for those in applied mathematics, engineering, or physics who need to have a firm grasp of functional analysis, but not necessarily some of the more abstruse aspects of topology and measure theory normally encountered. The reader is assumed to only have knowledge of basic real analysis, complex analysis, and algebra. The latter part of the text provides an outstanding treatment of Banach space theory and operator theory, covering topics not usually found together in other books on functional analysis. Written in a clear, concise manner,...
DEFF Research Database (Denmark)
This book provides an in-depth introduction and overview of current research in computational music analysis. Its seventeen chapters, written by leading researchers, collectively represent the diversity as well as the technical and philosophical sophistication of the work being done today...... on well-established theories in music theory and analysis, such as Forte's pitch-class set theory, Schenkerian analysis, the methods of semiotic analysis developed by Ruwet and Nattiez, and Lerdahl and Jackendoff's Generative Theory of Tonal Music. The book is divided into six parts, covering...... music analysis, the book provides an invaluable resource for researchers, teachers and students in music theory and analysis, computer science, music information retrieval and related disciplines. It also provides a state-of-the-art reference for practitioners in the music technology industry....
Analysis apparatus and method of analysis
International Nuclear Information System (INIS)
1976-01-01
A continuous streaming method developed for the execution of immunoassays is described in this patent. In addition, a suitable apparatus for the method was developed whereby magnetic particles are automatically employed for the consecutive analysis of a series of liquid samples via the RIA technique
International Nuclear Information System (INIS)
Dougherty, E.M.; Fragola, J.R.
1988-01-01
The authors present a treatment of human reliability analysis incorporating an introduction to probabilistic risk assessment for nuclear power generating stations. They treat the subject according to the framework established for general systems theory, drawing upon reliability analysis, psychology, human factors engineering, and statistics, and integrating elements of these fields within a systems framework. The book provides a history of human reliability analysis and includes examples of the application of the systems approach
Emission spectrochemical analysis
International Nuclear Information System (INIS)
Rives, R.D.; Bruks, R.R.
1983-01-01
The emission spectrochemical method of analysis, based on the fact that atoms of elements can be excited in an electric arc or in a laser beam and will emit radiation with characteristic wavelengths, is considered. The review contains data on the spectrochemical analysis of liquids and geological materials and a scheme of a laser microprobe. The main characteristics of emission spectroscopy, atomic absorption spectroscopy and X-ray fluorescence analysis are generalized
International Nuclear Information System (INIS)
Crawford, H.J.; Lindstrom, P.J.
1983-06-01
Our analysis program LULU has proven very useful in all stages of experiment analysis, from prerun detector debugging through final data reduction. It has solved our problem of having arbitrary word length events and is easy enough to use that many separate experimenters are now analyzing with LULU. The ability to use the same software for all stages of experiment analysis greatly eases the programming burden. We may even get around to making the graphics elegant someday
Mastering Clojure data analysis
Rochester, Eric
2014-01-01
This book takes a practical, example-oriented approach that aims to help you learn how to use Clojure for data analysis quickly and efficiently. This book is great for those who have experience with Clojure and who need to use it to perform data analysis. It will also be hugely beneficial for readers with basic experience in data analysis and statistics.
Fast neutron activation analysis
International Nuclear Information System (INIS)
Pepelnik, R.
1986-01-01
Since 1981 numerous 14 MeV neutron activation analyses have been performed at Korona. On the basis of that work, the advantages of this analysis technique and the results obtained with it are compared with other analytical methods. The procedure of activation analysis, the characteristics of Korona, some analytical investigations in environmental research and materials physics, as well as sources of systematic errors in trace analysis are described. (orig.) [de
Crisan, Dan
2011-01-01
"Stochastic Analysis" aims to provide mathematical tools to describe and model high dimensional random systems. Such tools arise in the study of Stochastic Differential Equations and Stochastic Partial Differential Equations, Infinite Dimensional Stochastic Geometry, Random Media and Interacting Particle Systems, Super-processes, Stochastic Filtering, Mathematical Finance, etc. Stochastic Analysis has emerged as a core area of late 20th century Mathematics and is currently undergoing a rapid scientific development. The special volume "Stochastic Analysis 2010" provides a sa
The ATLAS Analysis Architecture
International Nuclear Information System (INIS)
Cranmer, K.S.
2008-01-01
We present an overview of the ATLAS analysis architecture including the relevant aspects of the computing model and the major architectural aspects of the Athena framework. Emphasis will be given to the interplay between the analysis use cases and the technical aspects of the architecture including the design of the event data model, transient-persistent separation, data reduction strategies, analysis tools, and ROOT interoperability
Circuit analysis with Multisim
Baez-Lopez, David
2011-01-01
This book is concerned with circuit simulation using National Instruments Multisim. It focuses on the use and comprehension of the working techniques for electrical and electronic circuit simulation. The first chapters are devoted to basic circuit analysis. It starts by describing in detail how to perform a DC analysis using only resistors and independent and controlled sources. Then, it introduces capacitors and inductors to make a transient analysis. In the case of transient analysis, it is possible to have an initial condition either in the capacitor voltage or in the inductor current, or bo
Textile Technology Analysis Lab
Federal Laboratory Consortium — The Textile Analysis Lab is built for evaluating and characterizing the physical properties of an array of textile materials, but specifically those used in aircrew...
DEFF Research Database (Denmark)
Sørensen, Olav Jull
2009-01-01
The review presents the book International Market Analysis: Theories and Methods, written by John Kuiada, professor at the Centre of International Business, Department of Business Studies, Aalborg University. The book is refreshingly new in its way of looking at a classical problem. It looks at market...... analysis from the point of view of ways of thinking about markets. Furthermore, the book includes the concept of learning in the analysis of markets and how the way we understand business reality influences our choice of methodology for market analysis....
Chemical Security Analysis Center
Federal Laboratory Consortium — In 2006, by Presidential Directive, DHS established the Chemical Security Analysis Center (CSAC) to identify and assess chemical threats and vulnerabilities in the...
Geospatial Data Analysis Facility
Federal Laboratory Consortium — Geospatial application development, location-based services, spatial modeling, and spatial analysis are examples of the many research applications that this facility...
National Research Council Canada - National Science Library
Gilbert, John
1984-01-01
... quantification methods used in the analysis of mycotoxins in foods - Confirmation and quantification of trace organic food contaminants by mass spectrometry-selected ion monitoring - Chemiluminescence...
Federal Laboratory Consortium — FUNCTION: Uses state-of-the-art instrumentation for qualitative and quantitative analysis of organic and inorganic compounds, and biomolecules from gas, liquid, and...
Thermogravimetric Analysis Laboratory
Federal Laboratory Consortium — At NETL’s Thermogravimetric Analysis Laboratory in Morgantown, WV, researchers study how chemical looping combustion (CLC) can be applied to fossil energy systems....
Sensitivity and uncertainty analysis
Cacuci, Dan G; Navon, Ionel Michael
2005-01-01
As computer-assisted modeling and analysis of physical processes have continued to grow and diversify, sensitivity and uncertainty analyses have become indispensable scientific tools. Sensitivity and Uncertainty Analysis. Volume I: Theory focused on the mathematical underpinnings of two important methods for such analyses: the Adjoint Sensitivity Analysis Procedure and the Global Adjoint Sensitivity Analysis Procedure. This volume concentrates on the practical aspects of performing these analyses for large-scale systems. The applications addressed include two-phase flow problems, a radiative c
Hox, J.J.; Maas, C.J.M.; Lensvelt-Mulders, G.J.L.M.
2004-01-01
The goal of meta-analysis is to integrate the research results of a number of studies on a specific topic. Characteristic for meta-analysis is that in general only the summary statistics of the studies are used and not the original data. When the published research results to be integrated
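The summary-statistics approach described above can be sketched concretely. The following is a minimal fixed-effect, inverse-variance pooling in Python; the effect sizes and standard errors are hypothetical, not taken from any of the studies the abstract refers to:

```python
# Fixed-effect, inverse-variance meta-analysis using summary statistics only.
# Hypothetical effect sizes (e.g. standardized mean differences) and their
# standard errors from three studies; no raw data are needed.
effects = [0.30, 0.45, 0.20]
ses = [0.10, 0.15, 0.08]

weights = [1.0 / se ** 2 for se in ses]   # precision (inverse-variance) weights
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
pooled_se = (1.0 / sum(weights)) ** 0.5

print(f"pooled effect = {pooled:.3f} +/- {pooled_se:.3f}")
```

More precise studies get larger weights, which is exactly why only summary statistics are needed; a random-effects model would additionally estimate between-study variance.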
International Nuclear Information System (INIS)
Hahn, A.A.
1994-11-01
The complexity of instrumentation sometimes requires data analysis to be done before the result is presented to the control room. This tutorial reviews some of the theoretical assumptions underlying the more popular forms of data analysis and presents simple examples to illuminate the advantages and hazards of different techniques
Activation analysis. Detection limits
International Nuclear Information System (INIS)
Revel, G.
1999-01-01
Numerical data and limits of detection related to the four irradiation modes often used in activation analysis (reactor neutrons, 14 MeV neutrons, gamma photons and charged particles) are presented here. The technical presentation of activation analysis is detailed in paper P 2565 of Techniques de l'Ingenieur. (A.L.B.)
SMART performance analysis methodology
International Nuclear Information System (INIS)
Lim, H. S.; Kim, H. C.; Lee, D. J.
2001-04-01
To ensure the required and desired operation over the plant lifetime, the performance analysis for the SMART NSSS design is done by means of the specified analysis methodologies for the performance-related design basis events (PRDBE). A PRDBE is an occurrence (event) that shall be accommodated in the design of the plant and whose consequence would be no more severe than normal service effects of the plant equipment. The performance analysis methodology, which systematizes the methods and procedures to analyze the PRDBEs, is as follows. Based on the operation mode suitable to the characteristics of the SMART NSSS, the corresponding PRDBEs and the allowable range of process parameters for these events are deduced. With the developed control logic for each operation mode, the system thermal hydraulics are analyzed for the chosen PRDBEs using the system analysis code. In particular, because the system characteristics of SMART differ from those of existing commercial nuclear power plants, the operation mode, PRDBEs, control logic, and analysis code should be consistent with the SMART design. This report presents the categories of the PRDBEs chosen based on each operation mode, the transitions among these, and the acceptance criteria for each PRDBE. It also includes the analysis methods and procedures for each PRDBE and the concept of the control logic for each operation mode. Therefore this report, in which the overall details for SMART performance analysis are specified based on the current SMART design, would be utilized as a guide for the detailed performance analysis
Contrast analysis : A tutorial
Haans, A.
2018-01-01
Contrast analysis is a relatively simple but effective statistical method for testing theoretical predictions about differences between group means against the empirical data. Despite its advantages, contrast analysis is hardly used to date, perhaps because it is not implemented in a convenient
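A minimal sketch of the kind of contrast analysis described above, using only the Python standard library; the group data and contrast weights are hypothetical:

```python
# Contrast analysis: test a linear-trend contrast across three ordered groups.
# All group data are hypothetical; the contrast weights must sum to zero.
from statistics import mean, variance

groups = [[4.1, 5.0, 4.6, 4.8], [5.2, 5.5, 5.1, 5.9], [6.3, 6.1, 6.8, 6.0]]
weights = [-1, 0, 1]

L = sum(w * mean(g) for w, g in zip(weights, groups))   # contrast estimate
mse = mean(variance(g) for g in groups)   # pooled within-group variance
n = len(groups[0])                        # equal group sizes assumed here
se = (mse * sum(w ** 2 / n for w in weights)) ** 0.5
t = L / se                                # compare with t on N - k = 9 df

print(f"contrast = {L:.3f}, t = {t:.2f}")
```

The weights encode the theoretical prediction (here, a linear increase across ordered groups), which is what makes the test more focused than an omnibus ANOVA.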
Interactive Controls Analysis (INCA)
Bauer, Frank H.
1989-01-01
Version 3.12 of INCA provides a user-friendly environment for design and analysis of linear control systems. System configuration and parameters are easily adjusted, enabling the INCA user to create compensation networks and perform sensitivity analysis in a convenient manner. A full complement of graphical routines makes output easy to understand. Written in Pascal and FORTRAN.
Marketing research cluster analysis
Directory of Open Access Journals (Sweden)
Marić Nebojša
2002-01-01
Full Text Available One area of applications of cluster analysis in marketing is identification of groups of cities and towns with similar demographic profiles. This paper considers main aspects of cluster analysis by an example of clustering 12 cities with the use of Minitab software.
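The paper performs the clustering in Minitab; as a rough stand-in, grouping cities by demographic profile can be sketched with a small hand-rolled k-means in Python. The city names, features and seed centroids below are hypothetical, and the sketch assumes no cluster ever becomes empty:

```python
# A hand-rolled k-means on hypothetical demographic profiles
# (population in thousands, median age); the paper itself uses Minitab.
cities = {
    "A": (120, 31), "B": (135, 33), "C": (800, 38),
    "D": (760, 40), "E": (110, 30), "F": (820, 39),
}

def kmeans(points, centroids, iters=10):
    """Assign each point to its nearest centroid, then recompute centroids.

    Assumes every cluster stays non-empty for these data and seeds.
    """
    for _ in range(iters):
        clusters = {i: [] for i in range(len(centroids))}
        for name, p in points.items():
            dists = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids]
            clusters[dists.index(min(dists))].append(name)
        centroids = [
            tuple(sum(points[n][i] for n in members) / len(members)
                  for i in range(2))
            for members in clusters.values()
        ]
    return clusters

result = kmeans(cities, centroids=[(100, 30), (800, 40)])
print(result)
```

In practice the features would be standardized first so that population does not dominate the distance, a step Minitab performs as an option.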
SWOT ANALYSIS - CHINESE PETROLEUM
Directory of Open Access Journals (Sweden)
Chunlan Wang
2014-01-01
Full Text Available This article was written in early December 2013 and, combining the historical development with the latest data on Chinese petroleum, carries out a SWOT analysis. The paper discusses corporate resources, cost, management and external factors such as the political environment and market supply and demand, and conducts a comprehensive and profound analysis.
de Roon, F.A.; Nijman, T.E.; Ter Horst, J.R.
2000-01-01
In this paper we evaluate applications of (return based) style analysis. The portfolio and positivity constraints imposed by style analysis are useful in constructing mimicking portfolios without short positions. Such mimicking portfolios can be used, e.g., to construct efficient portfolios of mutual
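The portfolio and positivity constraints mentioned above can be illustrated with a minimal sketch: with two benchmarks, restricting the weight to [0, 1] enforces both constraints at once, and a grid search suffices. All return series below are hypothetical:

```python
# Return-based style analysis with two benchmarks: find the weight w on the
# first benchmark (the remainder goes to the second) that best mimics the
# fund. Restricting w to [0, 1] enforces positivity and the sum-to-one rule.
fund = [0.02, -0.01, 0.03, 0.01]      # hypothetical fund returns
bench1 = [0.03, -0.02, 0.04, 0.01]    # hypothetical equity index returns
bench2 = [0.01, 0.00, 0.01, 0.01]     # hypothetical bond index returns

def tracking_error(w):
    mimic = [w * b1 + (1 - w) * b2 for b1, b2 in zip(bench1, bench2)]
    return sum((f - m) ** 2 for f, m in zip(fund, mimic))

best_w = min((i / 1000 for i in range(1001)), key=tracking_error)
print(f"style weights: {best_w:.3f} equity, {1 - best_w:.3f} bonds")
```

With more benchmarks the same problem becomes a constrained quadratic program, typically solved with a dedicated optimizer rather than a grid.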
F.A. de Roon (Frans); T.E. Nijman (Theo); B.J.M. Werker
2000-01-01
In this paper we evaluate applications of (return based) style analysis. The portfolio and positivity constraints imposed by style analysis are useful in constructing mimicking portfolios without short positions. Such mimicking portfolios can be used e.g. to construct efficient
Directory of Open Access Journals (Sweden)
Satu Elo
2014-02-01
Full Text Available Qualitative content analysis is commonly used for analyzing qualitative data. However, few articles have examined the trustworthiness of its use in nursing science studies. The trustworthiness of qualitative content analysis is often presented by using terms such as credibility, dependability, confirmability, transferability, and authenticity. This article focuses on trustworthiness based on a review of previous studies, our own experiences, and methodological textbooks. Trustworthiness was described for the main qualitative content analysis phases from data collection to reporting of the results. We concluded that it is important to scrutinize the trustworthiness of every phase of the analysis process, including the preparation, organization, and reporting of results. Together, these phases should give a reader a clear indication of the overall trustworthiness of the study. Based on our findings, we compiled a checklist for researchers attempting to improve the trustworthiness of a content analysis study. The discussion in this article helps to clarify how content analysis should be reported in a valid and understandable manner, which would be of particular benefit to reviewers of scientific articles. Furthermore, we discuss that it is often difficult to evaluate the trustworthiness of qualitative content analysis studies because of defective data collection method description and/or analysis description.
Schraagen, J.M.C.
2000-01-01
Cognitive task analysis is defined as the extension of traditional task analysis techniques to yield information about the knowledge, thought processes and goal structures that underlie observable task performance. Cognitive task analyses are conducted for a wide variety of purposes, including the
DEFF Research Database (Denmark)
Damkilde, Lars
2007-01-01
Limit State analysis has a long history and many prominent researchers have contributed. The theoretical foundation is based on the upper- and lower-bound theorems which give a very comprehensive and elegant formulation on complicated physical problems. In the pre-computer age Limit State analysis...... also enabled engineers to solve practical problems within reinforced concrete, steel structures and geotechnics....
Verhoosel, C.V.; Scott, M.A.; Borden, M.J.; Borst, de R.; Hughes, T.J.R.; Mueller-Hoeppe, D.; Loehnert, S.; Reese, S.
2011-01-01
Isogeometric analysis is a versatile tool for failure analysis. On the one hand, the excellent control over the inter-element continuity conditions enables a natural incorporation of continuum constitutive relations that incorporate higher-order strain gradients, as in gradient plasticity or damage.
DEFF Research Database (Denmark)
Durbin, Richard; Eddy, Sean; Krogh, Anders Stærmose
This book provides an up-to-date and tutorial-level overview of sequence analysis methods, with particular emphasis on probabilistic modelling. Discussed methods include pairwise alignment, hidden Markov models, multiple alignment, profile searches, RNA secondary structure analysis, and phylogene...
International Nuclear Information System (INIS)
Arien, B.
2000-01-01
The objective of SCK-CEN's programme on reactor safety is to develop expertise in probabilistic and deterministic reactor safety analysis. The research programme consists of two main activities, in particular the development of software for reliability analysis of large systems and participation in the international PHEBUS-FP programme for severe accidents. Main achievements in 1999 are reported
Factorial Analysis of Profitability
Georgeta VINTILA; Ilie GHEORGHE; Ioana Mihaela POCAN; Madalina Gabriela ANGHEL
2012-01-01
The DuPont analysis system is based on decomposing the profitability ratio into factors of influence. This paper describes the factorial analysis of profitability based on the DuPont system. Significant attention is given to the impact of various indicators on share value and profitability.
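The DuPont decomposition itself is a simple identity. A minimal sketch with hypothetical figures, not drawn from the paper:

```python
# DuPont decomposition with hypothetical figures: return on equity as the
# product of net profit margin, asset turnover and financial leverage.
net_income, sales, total_assets, equity = 120.0, 1500.0, 2000.0, 800.0

margin = net_income / sales          # profitability of each unit of sales
turnover = sales / total_assets      # efficiency of asset use
leverage = total_assets / equity     # equity multiplier

roe = margin * turnover * leverage   # algebraically equal to net_income / equity
print(f"ROE = {margin:.3f} x {turnover:.3f} x {leverage:.2f} = {roe:.3f}")
```

The value of the decomposition is diagnostic: two firms with the same ROE can reach it through very different margin, turnover and leverage profiles.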
Spool assembly support analysis
International Nuclear Information System (INIS)
Norman, B.F.
1994-01-01
This document provides the wind/seismic analysis and evaluation for the pump pit spool assemblies. Hand calculations were used for the analysis. UBC and AISC load factors were used in this evaluation. The results show that the actual loads are under the allowable loads and all requirements are met
International Nuclear Information System (INIS)
Hansen, J.D.
1976-01-01
This article discusses the partial wave analysis of two, three and four meson systems. The difference between the two approaches, referred to as amplitude and Ascoli analysis is discussed. Some of the results obtained with these methods are shown. (B.R.H.)
Enabling interdisciplinary analysis
L. M. Reid
1996-01-01
New requirements for evaluating environmental conditions in the Pacific Northwest have led to increased demands for interdisciplinary analysis of complex environmental problems. Procedures for watershed analysis have been developed for use on public and private lands in Washington State (Washington Forest Practices Board 1993) and for federal lands in the Pacific...
Shot loading platform analysis
International Nuclear Information System (INIS)
Norman, B.F.
1994-01-01
This document provides the wind/seismic analysis and evaluation for the shot loading platform. Hand calculations were used for the analysis. AISC and UBC load factors were used in this evaluation. The results show that the actual loads are under the allowable loads and all requirements are met
Marketing research cluster analysis
Marić Nebojša
2002-01-01
One area of applications of cluster analysis in marketing is identification of groups of cities and towns with similar demographic profiles. This paper considers main aspects of cluster analysis by an example of clustering 12 cities with the use of Minitab software.
Towards Cognitive Component Analysis
DEFF Research Database (Denmark)
Hansen, Lars Kai; Ahrendt, Peter; Larsen, Jan
2005-01-01
Cognitive component analysis (COCA) is here defined as the process of unsupervised grouping of data such that the ensuing group structure is well-aligned with that resulting from human cognitive activity. We have earlier demonstrated that independent components analysis is relevant for representing...
Satu Elo; Maria Kääriäinen; Outi Kanste; Tarja Pölkki; Kati Utriainen; Helvi Kyngäs
2014-01-01
Qualitative content analysis is commonly used for analyzing qualitative data. However, few articles have examined the trustworthiness of its use in nursing science studies. The trustworthiness of qualitative content analysis is often presented by using terms such as credibility, dependability, confirmability, transferability, and authenticity. This article focuses on trustworthiness based on a review of previous studie...
Interaction Analysis and Supervision.
Amidon, Edmund
This paper describes a model that uses interaction analysis as a tool to provide feedback to a teacher in a microteaching situation. The author explains how interaction analysis can be used for teacher improvement, describes the category system used in the model, the data collection methods used, and the feedback techniques found in the model. (JF)
Activation analysis. Chapter 4
International Nuclear Information System (INIS)
1976-01-01
The principle, sample and calibration standard preparation, activation by neutrons, charged particles and gamma radiation, sample transport after activation, activity measurement, and chemical sample processing are described for activation analysis. Possible applications are shown of nondestructive activation analysis. (J.P.)
Donahue, Craig J.; Rais, Elizabeth A.
2009-01-01
This lab experiment illustrates the use of thermogravimetric analysis (TGA) to perform proximate analysis on a series of coal samples of different rank. Peat and coke are also examined. A total of four exercises are described. These are dry exercises as students interpret previously recorded scans. The weight percent moisture, volatile matter,…
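The proximate-analysis arithmetic the exercise teaches (weight percent moisture, volatile matter, fixed carbon, and ash from TGA mass plateaus) can be sketched as follows; the plateau masses are invented for illustration, not read from the recorded scans the lab uses.

```python
def proximate(m_initial, m_dry, m_devol, m_ash):
    """Proximate analysis (wt%) from four TGA plateau masses (mg):
    as-loaded, after moisture loss, after devolatilization in N2,
    and the ash residue after burn-off in air."""
    moisture = 100.0 * (m_initial - m_dry) / m_initial
    volatile = 100.0 * (m_dry - m_devol) / m_initial
    fixed_c = 100.0 * (m_devol - m_ash) / m_initial
    ash = 100.0 * m_ash / m_initial
    return moisture, volatile, fixed_c, ash

# Invented plateau masses for a coal-like sample (mg).
vals = proximate(10.0, 9.5, 6.0, 1.0)  # -> 5.0, 35.0, 50.0, 10.0 wt%
```

By construction the four fractions sum to 100% of the as-loaded mass.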
Ian M. Franks; Mike Hughes
2004-01-01
This book addresses and clearly explains the notational analysis of technique, tactics, individual athlete/team exercise and work-rate in sport. The book offers guidance on developing a system, analysis of data, effective coaching using notational performance analysis, and modeling sport behaviors. It updates and improves the 1997 edition.
Directory of Open Access Journals (Sweden)
Ian M. Franks
2004-06-01
Full Text Available This book addresses and clearly explains the notational analysis of technique, tactics, individual athlete/team exercise and work-rate in sport. The book offers guidance on developing a system, analysis of data, effective coaching using notational performance analysis, and modeling sport behaviors. It updates and improves the 1997 edition.
Allain, Rhett
2016-05-01
We currently live in a world filled with videos. There are videos on YouTube, feature movies and even videos recorded with our own cameras and smartphones. These videos present an excellent opportunity to not only explore physical concepts, but also inspire others to investigate physics ideas. With video analysis, we can explore the fantasy world in science-fiction films. We can also look at online videos to determine if they are genuine or fake. Video analysis can be used in the introductory physics lab and it can even be used to explore the make-believe physics embedded in video games. This book covers the basic ideas behind video analysis along with the fundamental physics principles used in video analysis. The book also includes several examples of the unique situations in which video analysis can be used.
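The video analysis the book describes boils down to extracting positions frame by frame and differentiating numerically. A minimal sketch, assuming a 30 fps clip of a dropped ball with synthetic tracked positions (the data are generated, not extracted from a real video):

```python
FPS = 30.0                     # assumed frame rate
DT = 1.0 / FPS
G = 9.8                        # m/s^2, used only to generate fake data

# Synthetic "tracked" y-positions (m) of a dropped ball, one per frame.
ys = [0.5 * G * (n * DT) ** 2 for n in range(12)]

def central_diff(series, dt):
    """Central-difference derivative at the interior samples."""
    return [(series[i + 1] - series[i - 1]) / (2 * dt)
            for i in range(1, len(series) - 1)]

vel = central_diff(ys, DT)     # velocity estimates, one per interior frame
acc = central_diff(vel, DT)    # acceleration estimates
g_est = sum(acc) / len(acc)    # should recover ~9.8 m/s^2
```

The same two-pass differencing applied to real tracked coordinates is one way to test whether an online video is genuine: a faked fall rarely shows constant acceleration.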
Ramsay, J O
1997-01-01
Scientists today collect samples of curves and other functional observations. This monograph presents many ideas and techniques for such data. Included are expressions in the functional domain of such classics as linear regression, principal components analysis, linear modelling, and canonical correlation analysis, as well as specifically functional techniques such as curve registration and principal differential analysis. Data arising in real applications are used throughout for both motivation and illustration, showing how functional approaches allow us to see new things, especially by exploiting the smoothness of the processes generating the data. The data sets exemplify the wide scope of functional data analysis; they are drawn from growth analysis, meteorology, biomechanics, equine science, economics, and medicine. The book presents novel statistical technology while keeping the mathematical level widely accessible. It is designed to appeal to students, to applied data analysts, and to experienced researc...
Systems engineering and analysis
Blanchard, Benjamin S
2010-01-01
For senior-level undergraduate and first and second year graduate systems engineering and related courses. A total life-cycle approach to systems and their analysis. This practical introduction to systems engineering and analysis provides the concepts, methodologies, models, and tools needed to understand and implement a total life-cycle approach to systems and their analysis. The authors focus first on the process of bringing systems into being--beginning with the identification of a need and extending that need through requirements determination, functional analysis and allocation, design synthesis, evaluation, and validation, operation and support, phase-out, and disposal. Next, the authors discuss the improvement of systems currently in being, showing that by employing the iterative process of analysis, evaluation, feedback, and modification, most systems in existence can be improved in their affordability, effectiveness, and stakeholder satisfaction.
International Nuclear Information System (INIS)
Ishii, Keizo
1997-01-01
Elemental analysis based on particle induced x-ray emission (PIXE) is a novel technique for analyzing trace elements. It is a very simple method with very high sensitivity; multiple elements in a sample can be analyzed simultaneously, and a few tens of micrograms of sample are enough for analysis. Owing to these characteristics, PIXE analysis is now used in many fields (e.g. biology, medicine, dentistry, environmental pollution, archaeology, cultural assets, etc.). Fundamentals of PIXE analysis are described here: the production of characteristic x-rays and inner-shell ionization by heavy charged particles, the continuous background in the PIXE spectrum, quantitative formulae of PIXE analysis, the detection limit of PIXE analysis, etc. (author)
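The detection limit mentioned in the abstract is conventionally estimated with a 3-sigma criterion on the continuous background under the characteristic peak. A minimal sketch; the background counts and the sensitivity value (counts per μg/g) are illustrative assumptions, not figures from the text.

```python
import math

def detection_limit(bg_counts, sensitivity):
    """3-sigma detection limit: the concentration whose peak area would
    equal 3 * sqrt(background counts under the peak).  `sensitivity` is
    counts per (ug/g) of analyte; both inputs here are illustrative."""
    return 3.0 * math.sqrt(bg_counts) / sensitivity

dl = detection_limit(bg_counts=900.0, sensitivity=45.0)  # -> 2.0 ug/g
```

The formula makes the trade-off explicit: the limit improves with higher sensitivity but only with the square root of any reduction in background.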
International Nuclear Information System (INIS)
Porten, D.R.; Crowe, R.D.
1994-01-01
The purpose of this accident safety analysis is to document in detail the analyses whose results were reported in summary form in the K Basins Safety Analysis Report WHC-SD-SNF-SAR-001. The safety analysis addressed the potential for release of radioactive and non-radioactive hazardous material located in the K Basins and their supporting facilities. The safety analysis covers the hazards associated with normal K Basin fuel storage and handling operations, fuel encapsulation, sludge encapsulation, and canister clean-up and disposal. After a review of the Criticality Safety Evaluation of the K Basin activities, the following postulated events were evaluated: Crane failure and casks dropped into loadout pit; Design basis earthquake; Hypothetical loss of basin water accident analysis; Combustion of uranium fuel following dryout; Crane failure and cask dropped onto floor of transfer area; Spent ion exchange shipment for burial; Hydrogen deflagration in ion exchange modules and filters; Release of Chlorine; Power availability and reliability; and Ashfall
International Nuclear Information System (INIS)
Goetz, A.; Gerring, M.; Svensson, O.; Brockhauser, S.
2012-01-01
Data Analysis Workbench (DAWB) is a new software tool being developed at the ESRF. Its goal is to provide a tool both for online data analysis, which can be used on the beamlines, and for offline data analysis, which users can use during experiments or take home. The tool includes support for data visualization and work-flows. Work-flows allow algorithms which exploit parallel architectures to be designed from existing high-level modules for data analysis in combination with data collection. The workbench uses Passerelle as the work-flow engine and EDNA plug-ins for data analysis. Actors talking to Tango are used for sending commands to a limited set of hardware to start existing data collection algorithms. A Tango server allows work-flows to be executed from existing applications. There are scripting interfaces to Python, Javascript and SPEC. Currently the workbench is being tested on a selected number of beamlines at the ESRF. (authors)
J Olive, David
2017-01-01
This text presents methods that are robust to the assumption of a multivariate normal distribution or methods that are robust to certain types of outliers. Instead of using exact theory based on the multivariate normal distribution, the simpler and more applicable large sample theory is given. The text develops some of the first practical robust regression and robust multivariate location and dispersion estimators backed by theory. The robust techniques are illustrated for methods such as principal component analysis, canonical correlation analysis, and factor analysis. A simple way to bootstrap confidence regions is also provided. Much of the research on robust multivariate analysis in this book is being published for the first time. The text is suitable for a first course in Multivariate Statistical Analysis or a first course in Robust Statistics. This graduate text is also useful for people who are familiar with the traditional multivariate topics, but want to know more about handling data sets with...
Field, Michael
2017-01-01
This book provides a rigorous introduction to the techniques and results of real analysis, metric spaces and multivariate differentiation, suitable for undergraduate courses. Starting from the very foundations of analysis, it offers a complete first course in real analysis, including topics rarely found in such detail in an undergraduate textbook such as the construction of non-analytic smooth functions, applications of the Euler-Maclaurin formula to estimates, and fractal geometry. Drawing on the author’s extensive teaching and research experience, the exposition is guided by carefully chosen examples and counter-examples, with the emphasis placed on the key ideas underlying the theory. Much of the content is informed by its applicability: Fourier analysis is developed to the point where it can be rigorously applied to partial differential equations or computation, and the theory of metric spaces includes applications to ordinary differential equations and fractals. Essential Real Analysis will appeal t...
Real analysis and applications
Botelho, Fabio Silva
2018-01-01
This textbook introduces readers to real analysis in one and n dimensions. It is divided into two parts: Part I explores real analysis in one variable, starting with key concepts such as the construction of the real number system, metric spaces, and real sequences and series. In turn, Part II addresses the multi-variable aspects of real analysis. Further, the book presents detailed, rigorous proofs of the implicit function theorem for the vectorial case by applying the Banach fixed-point theorem and the differential forms concept to surfaces in Rn. It also provides a brief introduction to Riemannian geometry. With its rigorous, elegant proofs, this self-contained work is easy to read, making it suitable for undergraduate and beginning graduate students seeking a deeper understanding of real analysis and applications, and for all those looking for a well-founded, detailed approach to real analysis.
Nonactivation interaction analysis. Chapter 5
International Nuclear Information System (INIS)
1976-01-01
Analyses are described including the alpha scattering analysis, beta absorption and scattering analysis, gamma and X-ray absorption and scattering analysis, X-ray fluorescence analysis, neutron absorption and scattering analysis, Moessbauer effect application and an analysis based on the application of radiation ionizing effects. (J.P.)
Is activation analysis still active?
International Nuclear Information System (INIS)
Chai Zhifang
2001-01-01
This paper reviews some aspects of neutron activation analysis (NAA), covering instrumental neutron activation analysis (INAA), the k₀ method, prompt gamma-ray neutron activation analysis (PGNAA), radiochemical neutron activation analysis (RNAA) and molecular activation analysis (MAA). A comparison of neutron activation analysis with other analytical techniques is also made. (author)
Czech Academy of Sciences Publication Activity Database
Seitl, Stanislav; Malíková, Lucie; Sobek, J.; Frantík, P.; Lopez-Crespo, P.
2017-01-01
Roč. 11, č. 41 (2017), s. 323-331 ISSN 1971-8993 Institutional support: RVO:68081723 Keywords: Crack tip fields * Higher-order terms * Multi-parameter approximation * Optical data processing Subject RIV: JL - Materials Fatigue, Friction Mechanics OBOR OECD: Audio engineering, reliability analysis
International Nuclear Information System (INIS)
Depres, B.; Dossantos-Uzarralde, P.
2009-01-01
More than 150 researchers and engineers from universities and industry met to discuss new methodologies developed for assessing uncertainty. About 20 papers were presented; the main topics were methods to study the propagation of uncertainties, sensitivity analysis, nuclear data covariances, and multi-parameter optimisation. This report gathers the contributions of CEA researchers and engineers.
International Nuclear Information System (INIS)
González Caballero, I; Cuesta Noriega, A; Rodríguez Marrero, A; Fernández del Castillo, E
2012-01-01
The analysis of the complex LHC data usually follows a standard path that aims at minimizing not only the amount of data but also the number of observables used. After a number of slimming and skimming steps, the remaining few terabytes of ROOT files hold a selection of the events and a flat structure for the variables needed, which can be more easily inspected and traversed in the final stages of the analysis. PROOF arises at this point as an efficient mechanism to distribute the analysis load, either by taking advantage of all the cores in modern CPUs through PROOF Lite, or by using the PROOF Cluster or PROOF on Demand tools to build dynamic PROOF clusters on computing facilities with spare CPUs. However, using PROOF at the level required for a serious analysis introduces some difficulties that may deter new adopters. We have developed the PROOF Analysis Framework (PAF) to facilitate the development of new analyses by uniformly exposing the PROOF-related configurations across technologies and by taking care of the routine tasks as much as possible. We describe the details of the PAF implementation as well as how we succeeded in engaging a group of CMS physicists to use PAF as their daily analysis framework.
Hazard Analysis Database Report
Energy Technology Data Exchange (ETDEWEB)
GAULT, G.W.
1999-10-13
The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for the Tank Waste Remediation System (TWRS) Final Safety Analysis Report (FSAR). The FSAR is part of the approved TWRS Authorization Basis (AB). This document describes, identifies, and defines the contents and structure of the TWRS FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The TWRS Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The database supports the preparation of Chapters 3, 4, and 5 of the TWRS FSAR and the USQ process and consists of two major, interrelated data sets: (1) Hazard Evaluation Database--Data from the results of the hazard evaluations; and (2) Hazard Topography Database--Data from the system familiarization and hazard identification.
Containment vessel stability analysis
International Nuclear Information System (INIS)
Harstead, G.A.; Morris, N.F.; Unsal, A.I.
1983-01-01
The stability analysis for a steel containment shell is presented herein. The containment is a freestanding shell consisting of a vertical cylinder with a hemispherical dome. It is stiffened by large ring stiffeners and relatively small longitudinal stiffeners. The containment vessel is subjected to both static and dynamic loads which can cause buckling. These loads must be combined prior to their use in a stability analysis. The buckling loads were computed with the aid of ASME Code Case N-284, used in conjunction with general-purpose computer codes and in-house programs. The equations contained in the Code Case were used to compute the knockdown factors due to shell imperfections. After these knockdown factors were applied to the critical stress states determined by freezing the maximum dynamic stresses and combining them with other static stresses, a linear bifurcation analysis was carried out with the aid of the BOSOR4 program. Since the containment shell contained large penetrations, the Code Case had to be supplemented by a local buckling analysis of the shell area surrounding the largest penetration. This analysis was carried out with the aid of the NASTRAN program. Although the factor of safety against buckling obtained in this analysis was satisfactory, it is claimed that the use of the Code Case knockdown factors is unduly conservative when applied to the analysis of buckling around penetrations. (orig.)
Feasibility analysis of marine ecological on-line integrated monitoring system
Chu, D. Z.; Cao, X.; Zhang, S. W.; Wu, N.; Ma, R.; Zhang, L.; Cao, L.
2017-08-01
In-situ water quality sensors are susceptible to biofouling, seawater corrosion, and wave impact damage, and their scattered distribution makes maintenance inconvenient. This paper proposes a highly integrated marine ecological on-line monitoring system that can be housed inside a monitoring station. All sensors are classified so that similar sensors are connected in series and the resulting groups run in parallel. The system composition and workflow are described, along with design issues requiring attention and their corresponding solutions. Using multi-parameter water quality measurements and five nutrient salts as verification indices, comparison experiments between in-situ and system data were carried out. The results showed good consistency for the nutrient salts, pH, and salinity; temperature and dissolved oxygen followed the same trends but with some deviation. Turbidity fluctuated greatly, and chlorophyll showed a similar pattern. To address these phenomena, three directions for system optimization are proposed.
Whole blood flow cytometric analysis of Ureaplasma-stimulated monocytes from pregnant women.
Friedland, Yael D; Lee-Pullen, Tracey F; Nathan, Elizabeth; Watts, Rory; Keelan, Jeffrey A; Payne, Matthew S; Ireland, Demelza J
2015-06-01
We hypothesised that circulating monocytes of women with vaginal colonisation with Ureaplasma spp., genital microorganisms known to cause inflammation-driven preterm birth, would elicit a tolerised cytokine response to subsequent in vitro Ureaplasma parvum serovar 3 (UpSV3) stimulation. Using multi-parameter flow cytometry, we found no differences with regard to maternal colonisation status in the frequency of TNF-α-, IL-6-, IL-8- and IL-1β-expressing monocytes in response to subsequent UpSV3 stimulation (P > 0.10 for all cytokines). We conclude that vaginal Ureaplasma spp. colonisation does not specifically tolerise monocytes of pregnant women towards decreased responses to subsequent stimulation. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
International Nuclear Information System (INIS)
Thompson, W.A. Jr.
1979-11-01
This paper briefly describes WASH 1400 and the Lewis report. It attempts to define basic concepts such as risk and risk analysis, common-mode failure, and rare event. Several probabilistic models which go beyond the WASH 1400 methodology are introduced; the common characteristic of these models is that they recognize explicitly that risk analysis is time-dependent, whereas WASH 1400 takes a per-demand failure-rate approach which obscures the important fact that accidents are time-related. Further, the presentation of a realistic risk analysis should recognize that there are various risks which compete with one another for the lives of the individuals at risk. A way of doing this is suggested.
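The competing-risks idea in the abstract, several risks competing for the same individuals, can be illustrated with a small Monte Carlo sketch. Assuming two exponential hazards with invented rates, the fraction of failures claimed by risk A should approach λ_A/(λ_A + λ_B):

```python
import random

def fraction_claimed_by_a(lam_a, lam_b, n, seed=1):
    """Simulate two exponential risks competing for each individual;
    whichever fires first 'claims' that individual.  Returns the
    fraction of failures attributed to risk A, which in theory
    approaches lam_a / (lam_a + lam_b)."""
    rng = random.Random(seed)
    wins_a = sum(
        1 for _ in range(n)
        if rng.expovariate(lam_a) < rng.expovariate(lam_b)
    )
    return wins_a / n

# Invented hazard rates: risk A twice as intense as risk B.
frac_a = fraction_claimed_by_a(lam_a=2.0, lam_b=1.0, n=20000)  # theory: 2/3
```

This is the time-dependent view the paper advocates: each risk has a hazard rate over time, rather than a per-demand failure probability.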
International Nuclear Information System (INIS)
Kartiwa Sumadi; Yayah Rohayati
1996-01-01
The 'monazit' analytical program has been set up for routine analysis of rare earth elements in monazite and xenotime mineral samples. The total relative error of the analysis is very low, less than 2.50%, and the reproducibility of the counting statistics and the stability of the instrument were excellent. The precision and accuracy of the analytical program are very good, with maximum relative deviations of 5.22% and 1.61%, respectively. The mineral compositions of the 30 monazite samples have also been calculated from their chemical constituents, and the results were compared with grain-counting microscopic analysis.
Methods of Multivariate Analysis
Rencher, Alvin C
2012-01-01
Praise for the Second Edition "This book is a systematic, well-written, well-organized text on multivariate analysis packed with intuition and insight . . . There is much practical wisdom in this book that is hard to find elsewhere."-IIE Transactions Filled with new and timely content, Methods of Multivariate Analysis, Third Edition provides examples and exercises based on more than sixty real data sets from a wide variety of scientific fields. It takes a "methods" approach to the subject, placing an emphasis on how students and practitioners can employ multivariate analysis in real-life sit
Dunham, Ken
2014-01-01
The rapid growth and development of Android-based devices has resulted in a wealth of sensitive information on mobile devices that offer minimal malware protection. This has created an immediate demand for security professionals that understand how to best approach the subject of Android malware threats and analysis.In Android Malware and Analysis, Ken Dunham, renowned global malware expert and author, teams up with international experts to document the best tools and tactics available for analyzing Android malware. The book covers both methods of malware analysis: dynamic and static.This tact
Aven, Terje
2012-01-01
Foundations of Risk Analysis presents the issues core to risk analysis - understanding what risk means, expressing risk, building risk models, addressing uncertainty, and applying probability models to real problems. The author provides the readers with the knowledge and basic thinking they require to successfully manage risk and uncertainty to support decision making. This updated edition reflects recent developments on risk and uncertainty concepts, representations and treatment. New material in Foundations of Risk Analysis includes:An up to date presentation of how to understand, define and
International Nuclear Information System (INIS)
Ko, Myeong Su; Kim, Tae Hwa; Park, Gyu Hyeon; Yang, Jong Beom; Oh, Chang Hwan; Lee, Kyoung Hye
2010-04-01
This textbook describes instrument analysis in an accessible way across twelve chapters. The contents cover pH measurement (principle, pH meters, measurement procedure, and example experiments); centrifugation; absorptiometry; fluorescence methods; atomic absorption analysis; gas chromatography; gas chromatography-mass spectrometry; high-performance liquid chromatography and liquid chromatography-mass spectrometry; electrophoresis (practical cases, analysis of results, and examples); PCR (principle, devices, applications, and examples); and enzyme-linked immunosorbent assay, including indirect ELISA, sandwich ELISA, and ELISA readers.
International Nuclear Information System (INIS)
Williams, Mike; Egede, Ulrik; Paterson, Stuart
2011-01-01
The distributed analysis experience to date at LHCb has been positive: job success rates are high and wait times for high-priority jobs are low. LHCb users access the grid using the GANGA job-management package, while the LHCb virtual organization manages its resources using the DIRAC package. This clear division of labor has benefitted LHCb and its users greatly; it is a major reason why distributed analysis at LHCb has been so successful. The newly formed LHCb distributed analysis support team has also proved to be a success.
Factor analysis and scintigraphy
International Nuclear Information System (INIS)
Di Paola, R.; Penel, C.; Bazin, J.P.; Berche, C.
1976-01-01
The goal of factor analysis is usually to achieve a reduction of a large set of data, extracting essential features without prior hypothesis. Due to the development of computerized systems, the use of larger samples, the possibility of sequential data acquisition and the increase in dynamic studies, the problem of data compression is now encountered routinely. Results obtained for the compression of scintigraphic images are presented first. Then the possibilities offered by factor analysis for scan processing are discussed. Finally, the use of this analysis for multidimensional studies, and especially dynamic studies, is considered for compression and processing [fr
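The data-reduction idea described above, extracting essential features without a prior hypothesis, can be sketched with a power-iteration estimate of the first principal component. This is a simplified PCA stand-in for the factor analysis the abstract discusses, run on invented two-dimensional data:

```python
import random

def first_pc(rows, iters=200, seed=0):
    """First principal component of mean-centred rows via power
    iteration, applying C = X^T X / n to a vector without forming C."""
    n, d = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(d)]
    x = [[r[j] - means[j] for j in range(d)] for r in rows]
    rng = random.Random(seed)
    v = [rng.random() + 0.1 for _ in range(d)]   # random start vector
    for _ in range(iters):
        proj = [sum(row[j] * v[j] for j in range(d)) for row in x]
        w = [sum(p * row[j] for p, row in zip(proj, x)) / n
             for j in range(d)]
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]                # renormalize each step
    return v

# Invented, strongly correlated 2-D data: the first PC should align
# with the diagonal direction (1, 1) / sqrt(2).
data = [[t, t + 0.01 * ((-1) ** t)] for t in range(10)]
pc = first_pc(data)
```

Projecting each observation onto the leading components compresses the data while keeping most of its variance, which is the compression use the abstract describes for scintigraphic images.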
Energy Technology Data Exchange (ETDEWEB)
Ko, Myeong Su; Kim, Tae Hwa; Park, Gyu Hyeon; Yang, Jong Beom; Oh, Chang Hwan; Lee, Kyoung Hye
2010-04-15
This textbook describes instrument analysis in an accessible way across twelve chapters. The contents cover pH measurement (principle, pH meters, measurement procedure, and example experiments); centrifugation; absorptiometry; fluorescence methods; atomic absorption analysis; gas chromatography; gas chromatography-mass spectrometry; high-performance liquid chromatography and liquid chromatography-mass spectrometry; electrophoresis (practical cases, analysis of results, and examples); PCR (principle, devices, applications, and examples); and enzyme-linked immunosorbent assay, including indirect ELISA, sandwich ELISA, and ELISA readers.
DEFF Research Database (Denmark)
Raket, Lars Lau
We propose a direction in the field of statistics which we will call functional object analysis. This subfield considers the analysis of functional objects defined on continuous domains. In this setting we focus on model-based statistics, with a particular emphasis on mixed-effect formulations, where the observed functional signal is assumed to consist of both fixed and random functional effects. This thesis takes the initial steps toward the development of likelihood-based methodology for functional objects. We first consider analysis of functional data defined on high...
Bayesian nonparametric data analysis
Müller, Peter; Jara, Alejandro; Hanson, Tim
2015-01-01
This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.
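A concrete instance of the nonparametric Bayesian models such a book reviews is the Dirichlet process; its stick-breaking construction can be sketched in a few lines. Truncating to a fixed number of atoms is an approximation, and the concentration parameter below is an arbitrary choice for illustration:

```python
import random

def stick_breaking(alpha, n_atoms, seed=0):
    """Truncated stick-breaking weights for a Dirichlet process DP(alpha):
    beta_k ~ Beta(1, alpha); w_k = beta_k * prod_{j<k} (1 - beta_j)."""
    rng = random.Random(seed)
    weights, remaining = [], 1.0
    for _ in range(n_atoms):
        b = rng.betavariate(1.0, alpha)
        weights.append(b * remaining)   # break off a piece of the stick
        remaining *= 1.0 - b            # what is left to break later
    return weights

w = stick_breaking(alpha=2.0, n_atoms=100)
```

Smaller alpha concentrates mass on a few atoms (few clusters); larger alpha spreads it over many, which is why alpha is often called the concentration parameter.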
International Nuclear Information System (INIS)
Ramirez T, J.J.; Lopez M, J.; Sandoval J, A.R.; Villasenor S, P.; Aspiazu F, J.A.
2001-01-01
An elemental, metallographic, and phase analysis was carried out to determine the oxidation states of Fe contained in three metallic pieces of unknown material: a block, a plate, and a cylinder. Results are presented from the elemental analysis, which was carried out in the Tandem Accelerator of ININ by proton induced X-ray emission (PIXE). The phase analysis was carried out by X-ray diffraction, which allowed the type of alloy or alloys formed to be identified. The combined application of nuclear and metallographic techniques allows the integral characterization of industrial metals. (Author)
Ash, Robert B; Lukacs, E
1972-01-01
Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var
Iremonger, M J
1982-01-01
BASIC Stress Analysis aims to help students become proficient at BASIC programming by actually using it in an important engineering subject. It also enables the student to use computing as a means of learning stress analysis, because writing a program is analogous to teaching: it is necessary to understand the subject matter. The book begins by introducing the BASIC approach and the concept of stress analysis at first- and second-year undergraduate level. Subsequent chapters contain a summary of relevant theory, worked examples containing computer programs, and a set of problems. Topics c
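The flavor of program such a book pairs with its theory can be sketched in Python rather than BASIC: axial stress σ = F/A and outer-fibre bending stress σ = Mc/I, with illustrative loads and a rectangular section (all numbers invented, not taken from the book's worked examples):

```python
def axial_stress(force_n, area_m2):
    """sigma = F / A, in Pa."""
    return force_n / area_m2

def bending_stress(moment_nm, c_m, inertia_m4):
    """Outer-fibre bending stress sigma = M * c / I, in Pa."""
    return moment_nm * c_m / inertia_m4

# Illustrative numbers: a 10 kN axial load on a 5 cm^2 bar, and a
# 2 kN*m moment on a 50 mm x 100 mm rectangular section.
sigma_axial = axial_stress(10_000.0, 5e-4)   # -> 20 MPa
I_rect = 0.05 * 0.1 ** 3 / 12                # b*h^3/12 for the section
sigma_bend = bending_stress(2_000.0, 0.05, I_rect)  # c = h/2 -> 24 MPa
```

Writing even this small program forces the units and the section-property formula to be understood, which is exactly the pedagogical point the book makes.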
Fundamentals of mathematical analysis
Paul J Sally, Jr
2013-01-01
This is a textbook for a course in Honors Analysis (for freshman/sophomore undergraduates) or Real Analysis (for junior/senior undergraduates) or Analysis-I (beginning graduates). It is intended for students who completed a course in "AP Calculus", possibly followed by a routine course in multivariable calculus and a computational course in linear algebra. There are three features that distinguish this book from many other books of a similar nature and which are important for the use of this book as a text. The first, and most important, feature is the collection of exercises. These are spread
Systems analysis-independent analysis and verification
Energy Technology Data Exchange (ETDEWEB)
Badin, J.S.; DiPietro, J.P. [Energetics, Inc., Columbia, MD (United States)
1995-09-01
The DOE Hydrogen Program is supporting research, development, and demonstration activities to overcome the barriers to the integration of hydrogen into the Nation's energy infrastructure. Much work is required to gain acceptance of hydrogen energy system concepts and to develop them for implementation. A systems analysis database has been created that includes formal documentation of technology characterization profiles and cost and performance information. Through a systematic and quantitative approach, system developers can understand and address important issues and thereby assure effective and timely commercial implementation. This project builds upon and expands the previously developed and tested pathway model and provides the basis for a consistent and objective analysis of all hydrogen energy concepts considered by the DOE Hydrogen Program Manager. This project can greatly accelerate the development of a system by minimizing the risk of costly design evolutions and by stimulating discussions, feedback, and coordination among key players, allowing them to assess the analysis, evaluate the trade-offs, and address any emerging problem areas. Specific analytical studies will result in the validation of the competitive feasibility of the proposed system and identify system development needs. Systems that are investigated include hydrogen-bromine electrolysis, municipal solid waste gasification, electro-farming (biomass gasifier and PEM fuel cell), a wind/hydrogen hybrid system for remote sites, home electrolysis and alternate infrastructure options, renewable-based electrolysis to fuel a PEM fuel cell vehicle fleet, and geothermal energy used to produce hydrogen. These systems are compared to conventional and benchmark technologies. Interim results and findings are presented. Independent analyses emphasize quality, integrity, objectivity, a long-term perspective, corporate memory, and the merging of technical, economic, operational, and programmatic expertise.
Plasma data analysis using statistical analysis system
International Nuclear Information System (INIS)
Yoshida, Z.; Iwata, Y.; Fukuda, Y.; Inoue, N.
1987-01-01
Multivariate factor analysis has been applied to a plasma database of REPUTE-1. The characteristics of the reverse-field pinch plasma in REPUTE-1 are shown to be explained by four independent parameters, which are described in the report. The well-known scaling laws F_χ ∝ I_p, T_e ∝ I_p, and τ_E ∝ N_e are also confirmed. 4 refs., 8 figs., 1 tab
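Scaling laws of the kind the abstract confirms, such as a temperature proportional to plasma current, are typically checked by fitting the exponent on log-log axes with least squares. A minimal sketch on synthetic data (the current values and the 0.5 coefficient are invented, not REPUTE-1 measurements):

```python
import math

def fit_power_law(xs, ys):
    """Least-squares slope of log(y) against log(x), i.e. the exponent
    p in y ~ x^p."""
    lx = [math.log(x) for x in xs]
    ly = [math.log(y) for y in ys]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    num = sum((a - mx) * (b - my) for a, b in zip(lx, ly))
    den = sum((a - mx) ** 2 for a in lx)
    return num / den

# Synthetic check of a linear scaling (exponent 1) between an
# electron temperature and a plasma current; coefficients invented.
Ip = [100.0, 150.0, 220.0, 300.0]
Te = [0.5 * i for i in Ip]
exponent = fit_power_law(Ip, Te)   # -> 1.0
```

An exponent near 1 from the fit is what "proportional to" means operationally when confirming such scaling laws against a database.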
Summary Analysis: Hanford Site Composite Analysis Update
Energy Technology Data Exchange (ETDEWEB)
Nichols, W. E. [CH2M HILL Plateau Remediation Company, Richland, WA (United States); Lehman, L. L. [CH2M HILL Plateau Remediation Company, Richland, WA (United States)
2017-06-05
The Hanford Site’s currently maintained Composite Analysis, originally completed in 1998, requires an update. A previous update effort was undertaken by the U.S. Department of Energy (DOE) in 2001-2005, but was ended before completion to allow the Tank Closure & Waste Management Environmental Impact Statement (TC&WM EIS) (DOE/EIS-0391) to be prepared without potential for conflicting sitewide models. This EIS was issued in 2012, and the deferral was ended with guidance in memorandum “Modeling to Support Regulatory Decision Making at Hanford” (Williams, 2012) provided with the aim of ensuring subsequent modeling is consistent with the EIS.
Genome-wide analysis of effectors of peroxisome biogenesis.
Directory of Open Access Journals (Sweden)
Ramsey A Saleem
2010-08-01
Peroxisomes are intracellular organelles that house a number of diverse metabolic processes, notably those required for beta-oxidation of fatty acids. Peroxisome biogenesis can be induced by the presence of peroxisome proliferators, including fatty acids, which activate complex cellular programs that underlie the induction process. Here, we used multi-parameter quantitative phenotype analyses of an arrayed mutant collection of yeast cells induced to proliferate peroxisomes, to establish a comprehensive inventory of genes required for peroxisome induction and function. The assays employed include growth in the presence of fatty acids, and confocal imaging and flow cytometry through the induction process. In addition to the classical phenotypes associated with loss of peroxisomal functions, these studies identified 169 genes required for robust signaling, transcription, normal peroxisomal development and morphologies, and transmission of peroxisomes to daughter cells. These gene products are localized throughout the cell, and many have indirect connections to peroxisome function. By integration with extant data sets, we present a total of 211 genes linked to peroxisome biogenesis and highlight the complex networks through which information flows during peroxisome biogenesis and function.
Analysis of a convenient information bound for general quantum channels
International Nuclear Information System (INIS)
O'Loan, C J
2007-01-01
Open questions from Sarovar and Milburn (2006 J. Phys. A: Math. Gen. 39 8487) are answered. Sarovar and Milburn derived a convenient upper bound for the Fisher information of a one-parameter quantum channel. They showed that for quasi-classical models their bound is achievable and they gave a necessary and sufficient condition for positive operator-valued measures (POVMs) attaining this bound. They asked (i) whether their bound is attainable more generally and (ii) whether explicit expressions for optimal POVMs can be derived from the attainability condition. We show that the symmetric logarithmic derivative (SLD) quantum information is less than or equal to the Sarovar-Milburn (SM) bound, i.e., H(θ) ≤ C_Y(θ), and we find conditions for equality. As the Fisher information is less than or equal to the SLD quantum information, i.e., F_M(θ) ≤ H(θ), we can deduce when equality holds in F_M(θ) ≤ C_Y(θ). Equality does not hold for all channels. As a consequence, the attainability condition cannot be used to test for optimal POVMs for all channels. These results are extended to multi-parameter channels
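Restating the abstract's inequalities in display form (symbols as defined in the abstract; this is only a typeset summary, not a new result):

```latex
F_M(\theta) \;\le\; H(\theta) \;\le\; C_Y(\theta)
```

Here $F_M(\theta)$ is the Fisher information of a POVM $M$, $H(\theta)$ the SLD quantum information, and $C_Y(\theta)$ the Sarovar-Milburn bound; because equality in the second step fails for some channels, the attainability condition cannot serve as a universal test for optimal POVMs.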
He, Jingrui
2012-01-01
This book focuses on rare category analysis where the majority classes have smooth distributions and the minority classes exhibit the compactness property. It focuses on challenging cases where the support regions of the majority and minority classes overlap.
Longitudinal categorical data analysis
Sutradhar, Brajendra C
2014-01-01
This is the first book in longitudinal categorical data analysis with parametric correlation models developed based on dynamic relationships among repeated categorical responses. This book is a natural generalization of the longitudinal binary data analysis to the multinomial data setup with more than two categories. Thus, unlike the existing books on cross-sectional categorical data analysis using log linear models, this book uses multinomial probability models both in cross-sectional and longitudinal setups. A theoretical foundation is provided for the analysis of univariate multinomial responses, by developing models systematically for the cases with no covariates as well as categorical covariates, both in cross-sectional and longitudinal setups. In the longitudinal setup, both stationary and non-stationary covariates are considered. These models have also been extended to the bivariate multinomial setup along with suitable covariates. For the inferences, the book uses the generalized quasi-likelihood as w...
International Nuclear Information System (INIS)
1981-09-01
Suggestions are made concerning the method of fault tree analysis and the use of certain symbols in the examination of system failures. The purpose of the fault tree analysis is to find logical connections of component or subsystem failures leading to undesirable occurrences. The results of these examinations are part of the system assessment concerning operation and safety. The objectives of the analysis are: systematic identification of all possible failure combinations (causes) leading to a specific undesirable occurrence, and determination of reliability parameters such as the frequency of failure combinations, the frequency of the undesirable occurrence, or the non-availability of the system when required. The fault tree analysis provides a clear and reconstructable documentation of the examination. (orig./HP) [de
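To make the gate logic concrete, here is a minimal sketch (with hypothetical component names and probabilities, not taken from this report) of how failure probabilities of independent basic events combine through AND/OR gates to the top-event probability:

```python
# Minimal fault-tree sketch: basic events with given failure probabilities
# are combined through AND/OR gates, assuming statistically independent
# failures. Numbers are illustrative only.

def gate_and(*probs):
    """Probability that ALL inputs fail (AND gate)."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def gate_or(*probs):
    """Probability that AT LEAST ONE input fails (OR gate)."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Undesirable top event: pump fails AND (valve fails OR control fails)
p_pump, p_valve, p_ctrl = 1e-3, 5e-4, 2e-4
p_top = gate_and(p_pump, gate_or(p_valve, p_ctrl))
print(p_top)  # ≈ 7.0e-07
```

Enumerating which AND/OR combinations reach the top event yields the minimal cut sets, i.e. the failure combinations the abstract refers to.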
Denker, A; Rauschenberg, J; Röhrich, J; Strub, E
2006-01-01
Materials analysis with ion beams exploits the interaction of ions with the electrons and nuclei in the sample. Among the vast variety of possible analytical techniques available with ion beams, we will restrict ourselves to ion beam analysis in the energy range from one to several MeV per mass unit. It is possible to use either the back-scattered projectiles (RBS – Rutherford Back Scattering) or the recoiled atoms themselves (ERDA – Elastic Recoil Detection Analysis) from the elastic scattering processes. These techniques allow the simultaneous and absolute determination of stoichiometry and depth profiles of the detected elements. The interaction of the ions with the electrons in the sample produces holes in the inner electronic shells of the sample atoms, which recombine and emit X-rays characteristic for the element in question. Particle Induced X-ray Emission (PIXE) has been shown to be a fast technique for the analysis of elements with an atomic number above 11.
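For the elastic-scattering techniques mentioned, the energy of a back-scattered projectile is set by the two-body kinematic factor K = E_scattered/E_incident; the sketch below assumes the standard textbook formula, with illustrative masses (4He on Au):

```python
import math

def rbs_kinematic_factor(m_projectile, m_target, theta_deg):
    """Kinematic factor K = E_scattered / E_incident for elastic scattering
    at lab angle theta (standard RBS two-body formula)."""
    t = math.radians(theta_deg)
    root = math.sqrt(m_target**2 - (m_projectile * math.sin(t))**2)
    return ((m_projectile * math.cos(t) + root) / (m_projectile + m_target))**2

# 4He backscattered at 180 degrees from Au (A = 197):
# K reduces to ((M2 - M1) / (M2 + M1))**2
print(round(rbs_kinematic_factor(4.0, 197.0, 180.0), 3))  # 0.922
```

Because K depends on the target mass, measuring the back-scattered energy identifies the element, which is what makes RBS depth profiling element-specific.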
DEFF Research Database (Denmark)
Vatrapu, Ravi; Mukkamala, Raghava Rao; Hussain, Abid
2016-01-01
Social media interactions can be modeled at the level of individual agents (e.g., with automata and agent-based modeling). However, when it comes to organizational and societal units of analysis, there exists no approach to conceptualize, model, analyze, explain, and predict social media interactions as individuals' associations with ideas, values, identities, and so on. To address this gap, this paper presents conceptual and formal models of social data, and an analytical framework for combining big social data sets with organizational and societal data sets. Three empirical studies of big social data are presented to illustrate and demonstrate social set analysis in terms of fuzzy set-theoretical sentiment analysis, crisp set-theoretical interaction analysis, and event-studies-oriented set-theoretical visualizations. Implications for big data analytics, current limitations of the set-theoretical approach, and future directions are outlined.
PWR systems transient analysis
International Nuclear Information System (INIS)
Kennedy, M.F.; Peeler, G.B.; Abramson, P.B.
1985-01-01
Analysis of transients in pressurized water reactor (PWR) systems involves the assessment of the response of the total plant, including primary and secondary coolant systems, steam piping and turbine (possibly including the complete feedwater train), and various control and safety systems. Transient analysis is performed as part of the plant safety analysis to ensure the adequacy of the reactor design and operating procedures and to verify the applicable plant emergency guidelines. Event sequences which must be examined are developed by considering possible failures or maloperations of plant components. These vary in severity (and calculational difficulty) from a series of normal operational transients, such as minor load changes, reactor trips, and valve and pump malfunctions, up to the double-ended guillotine rupture of a primary reactor coolant system pipe known as a Large Break Loss of Coolant Accident (LBLOCA). The focus of this paper is the analysis of all those transients and accidents except loss of coolant accidents
Full closure strategic analysis.
2014-07-01
The full closure strategic analysis was conducted to create a decision process whereby full roadway closures for construction and maintenance activities can be evaluated and approved or denied by CDOT Traffic personnel. The study reviewed current...
Electrical Subsurface Grounding Analysis
International Nuclear Information System (INIS)
J.M. Calle
2000-01-01
The purpose and objective of this analysis are to determine the present grounding requirements of the Exploratory Studies Facility (ESF) subsurface electrical system and to verify that the actual grounding system and devices satisfy those requirements
DEFF Research Database (Denmark)
Skrypnyuk, Nataliya; Nielson, Flemming; Pilegaard, Henrik
2009-01-01
We present the ongoing work on the pathway analysis of a stochastic calculus. Firstly we present a particular stochastic calculus that we have chosen for our modeling - the Interactive Markov Chains calculus, IMC for short. After that we specify a few restrictions that we have introduced into the syntax of IMC in order to make our analysis feasible. Finally we describe the analysis itself together with several theoretical results that we have proved for it.
Canonical Information Analysis
DEFF Research Database (Denmark)
Vestergaard, Jacob Schack; Nielsen, Allan Aasbjerg
2015-01-01
Canonical correlation analysis is an established multivariate statistical method in which correlation between linear combinations of multivariate sets of variables is maximized. In canonical information analysis, introduced here, linear correlation as a measure of association between variables is replaced by the information theoretical, entropy based measure mutual information, which is a much more general measure of association. We make canonical information analysis feasible for large sample problems, including for example multispectral images, due to the use of a fast kernel density estimator for entropy estimation. Canonical information analysis is applied successfully to (1) simple simulated data to illustrate the basic idea and evaluate performance, (2) fusion of weather radar and optical geostationary satellite data in a situation with heavy precipitation, and (3) change detection in optical...
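For contrast with the entropy-based method the abstract describes, the classical linear canonical correlation analysis it generalizes can be sketched as follows (a minimal numpy implementation under the usual full-rank assumptions; this is not the authors' code):

```python
import numpy as np

def canonical_correlations(X, Y):
    """Canonical correlations between column sets X (n x p) and Y (n x q):
    orthonormalize each centered block via QR, then take the singular
    values of the cross-product of the orthonormal bases."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    Qx, _ = np.linalg.qr(Xc)
    Qy, _ = np.linalg.qr(Yc)
    s = np.linalg.svd(Qx.T @ Qy, compute_uv=False)
    return np.clip(s, 0.0, 1.0)  # guard against tiny numerical overshoot

# Synthetic check: Y is a linear map of X plus small noise, so the leading
# canonical correlation should be close to 1.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))
Y = X @ rng.normal(size=(3, 2)) + 0.05 * rng.normal(size=(500, 2))
print(canonical_correlations(X, Y)[0] > 0.99)  # True
```

Canonical information analysis replaces the correlation being maximized here with mutual information estimated by kernel density methods, which also captures nonlinear association.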
Qualitative Data Analysis Strategies
Greaves, Kristoffer
2014-01-01
A set of concept maps for qualitative data analysis strategies, inspired by Corbin, JM & Strauss, AL 2008, Basics of qualitative research: Techniques and procedures for developing grounded theory, 3rd edn, Sage Publications, Inc, Thousand Oaks, California.
Statistical data analysis handbook
National Research Council Canada - National Science Library
Wall, Francis J
1986-01-01
It must be emphasized that this is not a text book on statistics. Instead it is a working tool that presents data analysis in clear, concise terms which can be readily understood even by those without formal training in statistics...
Fanfani, Alessandra; Sanches, Jose Afonso; Andreeva, Julia; Bagliesi, Giusepppe; Bauerdick, Lothar; Belforte, Stefano; Bittencourt Sampaio, Patricia; Bloom, Ken; Blumenfeld, Barry; Bonacorsi, Daniele; Brew, Chris; Calloni, Marco; Cesini, Daniele; Cinquilli, Mattia; Codispoti, Giuseppe; D'Hondt, Jorgen; Dong, Liang; Dongiovanni, Danilo; Donvito, Giacinto; Dykstra, David; Edelmann, Erik; Egeland, Ricky; Elmer, Peter; Eulisse, Giulio; Evans, Dave; Fanzago, Federica; Farina, Fabio; Feichtinger, Derek; Fisk, Ian; Flix, Josep; Grandi, Claudio; Guo, Yuyi; Happonen, Kalle; Hernandez, Jose M; Huang, Chih-Hao; Kang, Kejing; Karavakis, Edward; Kasemann, Matthias; Kavka, Carlos; Khan, Akram; Kim, Bockjoo; Klem, Jukka; Koivumaki, Jesper; Kress, Thomas; Kreuzer, Peter; Kurca, Tibor; Kuznetsov, Valentin; Lacaprara, Stefano; Lassila-Perini, Kati; Letts, James; Linden, Tomas; Lueking, Lee; Maes, Joris; Magini, Nicolo; Maier, Gerhild; McBride, Patricia; Metson, Simon; Miccio, Vincenzo; Padhi, Sanjay; Pi, Haifeng; Riahi, Hassen; Riley, Daniel; Rossman, Paul; Saiz, Pablo; Sartirana, Andrea; Sciaba, Andrea; Sekhri, Vijay; Spiga, Daniele; Tuura, Lassi; Vaandering, Eric; Vanelderen, Lukas; Van Mulders, Petra; Vedaee, Aresh; Villella, Ilaria; Wicklund, Eric; Wildish, Tony; Wissing, Christoph; Wurthwein, Frank
2009-01-01
The CMS experiment expects to manage several Pbytes of data each year during the LHC programme, distributing them over many computing sites around the world and enabling data access at those centers for analysis. CMS has identified the distributed sites as the primary location for physics analysis to support a wide community with thousands of potential users. This represents an unprecedented experimental challenge in terms of the scale of distributed computing resources and the number of users. An overview of the computing architecture, the software tools and the distributed infrastructure is reported. Summaries of the experience in establishing efficient and scalable operations to get prepared for CMS distributed analysis are presented, followed by the user experience in their current analysis activities.
NOAA's Inundation Analysis Tool
National Oceanic and Atmospheric Administration, Department of Commerce — Coastal storms and other meteorological phenomena can have a significant impact on how high water levels rise and how often. The inundation analysis program is...
Multidimensional nonlinear descriptive analysis
Nishisato, Shizuhiko
2006-01-01
Quantification of categorical, or non-numerical, data is a problem that scientists face across a wide range of disciplines. Exploring data analysis in various areas of research, such as the social sciences and biology, Multidimensional Nonlinear Descriptive Analysis presents methods for analyzing categorical data that are not necessarily sampled randomly from a normal population and often involve nonlinear relations. This reference not only provides an overview of multidimensional nonlinear descriptive analysis (MUNDA) of discrete data, it also offers new results in a variety of fields. The first part of the book covers conceptual and technical preliminaries needed to understand the data analysis in subsequent chapters. The next two parts contain applications of MUNDA to diverse data types, with each chapter devoted to one type of categorical data, a brief historical comment, and basic skills peculiar to the data types. The final part examines several problems and then concludes with suggestions for futu...
Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars
2016-04-12
A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.
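A hypothetical sketch of the architecture described above; all class, field, and function names are illustrative assumptions, not the actual system's API:

```python
from dataclasses import dataclass

# Sketch: a component library feeds a building model, an analysis step
# produces a baseline energy model, an energy conservation measure (ECM)
# yields an optimized model, and a recommendation step compares the two.

@dataclass
class BuildingComponent:
    name: str
    annual_kwh: float  # assumed per-component annual energy use

@dataclass
class BuildingModel:
    components: list

    def baseline_energy(self):
        """Baseline energy model: total annual use of all components."""
        return sum(c.annual_kwh for c in self.components)

def apply_ecm(model, component_name, savings_fraction):
    """Apply an ECM by reducing one component's use, returning a new
    (optimized) model so the baseline stays intact for comparison."""
    new_components = [
        BuildingComponent(c.name, c.annual_kwh * (1.0 - savings_fraction))
        if c.name == component_name else c
        for c in model.components
    ]
    return BuildingModel(new_components)

# Recommendation step: assess an optimized model against the baseline.
model = BuildingModel([BuildingComponent("lighting", 12000.0),
                       BuildingComponent("hvac", 30000.0)])
optimized = apply_ecm(model, "lighting", 0.4)
savings = model.baseline_energy() - optimized.baseline_energy()
print(savings)  # about 4800 kWh/yr saved
```

Keeping the baseline model immutable and generating a fresh model per ECM mirrors the comparison the recommendation tool is described as performing.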
Water Quality Analysis Simulation
The Water Quality Analysis Simulation Program, an enhancement of the original WASP. This model helps users interpret and predict water quality responses to natural phenomena and man-made pollution for various pollution management decisions.
What is being tested? Synovial fluid is a thick liquid that acts as a lubricant for the body's ...
Hytönen, Tuomas; Veraar, Mark; Weis, Lutz
The present volume develops the theory of integration in Banach spaces, martingales and UMD spaces, and culminates in a treatment of the Hilbert transform, Littlewood-Paley theory and the vector-valued Mihlin multiplier theorem. Over the past fifteen years, motivated by regularity problems in evolution equations, there has been tremendous progress in the analysis of Banach space-valued functions and processes. The contents of this extensive and powerful toolbox have been mostly scattered around in research papers and lecture notes. Collecting this diverse body of material into a unified and accessible presentation fills a gap in the existing literature. The principal audience that we have in mind consists of researchers who need and use Analysis in Banach Spaces as a tool for studying problems in partial differential equations, harmonic analysis, and stochastic analysis. Self-contained and offering complete proofs, this work is accessible to graduate students and researchers with a background in functional an...
Analysis Streamlining in ATLAS
Heinrich, Lukas; The ATLAS collaboration
2018-01-01
We present recent work within the ATLAS collaboration to centrally provide tools that facilitate analysis management and highly automated container-based analysis execution, in order both to enable non-experts to benefit from these best practices and to allow the collaboration to track and re-execute analyses independently, e.g. during their review phase. Through integration with the ATLAS GLANCE system, users can request a pre-configured but customizable version control setup, including continuous integration for automated build and testing as well as continuous Linux Container image building for software preservation purposes. As analyses typically require many individual steps, analysis workflow pipelines can then be defined using such images and the yadage workflow description language. The integration into the workflow execution service REANA allows the interactive or automated reproduction of the main analysis results by orchestrating a large number of container jobs using Kubernetes. For long-term archival,...
Wolff, Thomas H; Shubin, Carol
2003-01-01
This book demonstrates how harmonic analysis can provide penetrating insights into deep aspects of modern analysis. It is both an introduction to the subject as a whole and an overview of those branches of harmonic analysis that are relevant to the Kakeya conjecture. The usual background material is covered in the first few chapters: the Fourier transform, convolution, the inversion theorem, the uncertainty principle and the method of stationary phase. However, the choice of topics is highly selective, with emphasis on those frequently used in research inspired by the problems discussed in the later chapters. These include questions related to the restriction conjecture and the Kakeya conjecture, distance sets, and Fourier transforms of singular measures. These problems are diverse, but often interconnected; they all combine sophisticated Fourier analysis with intriguing links to other areas of mathematics and they continue to stimulate first-rate work. The book focuses on laying out a solid foundation for fu...
Water Quality Analysis Simulation
U.S. Environmental Protection Agency — The Water Quality Analysis Simulation Program, an enhancement of the original WASP. This model helps users interpret and predict water quality responses to natural...
Federal Laboratory Consortium — Provides engineering design of aircraft components, subsystems and installations using Pro/E, Anvil 1000, CADKEY 97, AutoCAD 13. Engineering analysis tools include...
CSIR Research Space (South Africa)
Khuluse, S
2009-04-01
(ii) determination of the distribution of the damage and (iii) preparation of products that enable prediction of future risk events. The methodology provided by extreme value theory can also be a powerful tool in risk analysis...
Ziemer, William P
2017-01-01
This first year graduate text is a comprehensive resource in real analysis based on a modern treatment of measure and integration. Presented in a definitive and self-contained manner, it features a natural progression of concepts from simple to difficult. Several innovative topics are featured, including differentiation of measures, elements of Functional Analysis, the Riesz Representation Theorem, Schwartz distributions, the area formula, Sobolev functions and applications to harmonic functions. Together, the selection of topics forms a sound foundation in real analysis that is particularly suited to students going on to further study in partial differential equations. This second edition of Modern Real Analysis contains many substantial improvements, including the addition of problems for practicing techniques, and an entirely new section devoted to the relationship between Lebesgue and improper integrals. Aimed at graduate students with an understanding of advanced calculus, the text will also appeal to mo...
DEFF Research Database (Denmark)
Fischer, Paul; Hilbert, Astrid
2012-01-01
We introduce a platform which supplies an easy-to-handle, interactive, extendable, and fast analysis tool for time series analysis. In contrast to other software suites like Maple, Matlab, or R, which use a command-line-like interface and where the user has to memorize/look up the appropriate commands, our application is select-and-click-driven. It allows the user to derive many different sequences of deviations for a given time series and to visualize them in different ways in order to judge their expressive power and to reuse the procedure found. For many transformations or model fits, the user may choose between manual and automated parameter selection. The user can define new transformations and add them to the system. The application contains efficient implementations of advanced and recent techniques for time series analysis, including techniques related to extreme value analysis and filtering...
International Nuclear Information System (INIS)
Holland, W.E.
1980-02-01
A method was developed to determine if boron-loaded polymeric material contained enriched boron or natural boron. A prototype analyzer was constructed, and initial planning was done for an actual analysis facility
Stakeholder Analysis Worksheet
Stakeholder Analysis Worksheet: A worksheet that can be used to document potential stakeholder groups, the information or expertise they hold, the role that they can play, their interests or concerns about the HIA.
Energy Technology Data Exchange (ETDEWEB)
Arent, D.; Benioff, R.; Mosey, G.; Bird, L.; Brown, J.; Brown, E.; Vimmerstedt, L.; Aabakken, J.; Parks, K.; Lapsa, M.; Davis, S.; Olszewski, M.; Cox, D.; McElhaney, K.; Hadley, S.; Hostick, D.; Nicholls, A.; McDonald, S.; Holloman, B.
2006-10-01
This paper presents the results of energy market analysis sponsored by the Department of Energy's (DOE) Weatherization and International Program (WIP) within the Office of Energy Efficiency and Renewable Energy (EERE). The analysis was conducted by a team of DOE laboratory experts from the National Renewable Energy Laboratory (NREL), Oak Ridge National Laboratory (ORNL), and Pacific Northwest National Laboratory (PNNL), with additional input from Lawrence Berkeley National Laboratory (LBNL). The analysis was structured to identify those markets and niches where government can create the biggest impact by informing management decisions in the private and public sectors. The analysis identifies those markets and niches where opportunities exist for increasing energy efficiency and renewable energy use.
Principles of Fourier analysis
Howell, Kenneth B
2001-01-01
Fourier analysis is one of the most useful and widely employed sets of tools for the engineer, the scientist, and the applied mathematician. As such, students and practitioners in these disciplines need a practical and mathematically solid introduction to its principles. They need straightforward verifications of its results and formulas, and they need clear indications of the limitations of those results and formulas. Principles of Fourier Analysis furnishes all this and more. It provides a comprehensive overview of the mathematical theory of Fourier analysis, including the development of Fourier series, "classical" Fourier transforms, generalized Fourier transforms and analysis, and the discrete theory. Much of the author's development is strikingly different from typical presentations. His approach to defining the classical Fourier transform results in a much cleaner, more coherent theory that leads naturally to a starting point for the generalized theory. He also introduces a new generalized theory based ...
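As a small illustration of the discrete theory mentioned above (not drawn from the book), a direct DFT can be checked against a library FFT and against Parseval's identity:

```python
import numpy as np

# Direct O(N^2) discrete Fourier transform, verified two ways:
# against numpy's FFT, and via Parseval's identity
#   sum |x[n]|^2 = (1/N) * sum |X[k]|^2.
def dft(x):
    n = np.arange(x.size)
    W = np.exp(-2j * np.pi * np.outer(n, n) / x.size)  # DFT matrix
    return W @ x

x = np.array([1.0, 2.0, 0.0, -1.0])
X = dft(x)
print(np.allclose(X, np.fft.fft(x)))                                    # True
print(np.isclose(np.sum(np.abs(x)**2), np.sum(np.abs(X)**2) / x.size))  # True
```

The 1/N factor in Parseval's identity matches the unnormalized transform convention used by `numpy.fft.fft`.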
Quantitative investment analysis
DeFusco, Richard
2007-01-01
In the "Second Edition" of "Quantitative Investment Analysis," financial experts Richard DeFusco, Dennis McLeavey, Jerald Pinto, and David Runkle outline the tools and techniques needed to understand and apply quantitative methods to today's investment process.
Energy Technology Data Exchange (ETDEWEB)
Wood, William Monford [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2015-02-23
This report presents a systematic study of the standard analysis of rod-pinch radiographs for obtaining quantitative measurements of areal mass densities, and makes suggestions for improving the methodology of obtaining quantitative information from radiographed objects.
Biodiesel Emissions Analysis Program
Using existing data, the EPA's biodiesel emissions analysis program sought to quantify the air pollution emission effects of biodiesel for diesel engines that have not been specifically modified to operate on biodiesel.
Introduction to global analysis
Kahn, Donald W
2007-01-01
This text introduces the methods of mathematical analysis as applied to manifolds, including the roles of differentiation and integration, infinite dimensions, Morse theory, Lie groups, and dynamical systems. 1980 edition.
Biorefinery Sustainability Analysis
DEFF Research Database (Denmark)
J. S. M. Silva, Carla; Prunescu, Remus Mihail; Gernaey, Krist
2017-01-01
This chapter deals with sustainability analysis of biorefinery systems in terms of environmental and socio-economic indicators. Life cycle analysis has methodological issues related to the functional unit (FU), allocation, land use, and biogenic carbon neutrality of the reference system and of the biorefinery-based system. Socio-economic criteria and indicators used in sustainability frameworks assessment are presented and discussed. There is not one single methodology that can aptly cover the synergies of environmental, economic, social and governance issues required to assess the sustainable...
Pesticide Instrumental Analysis
International Nuclear Information System (INIS)
Samir, E.; Fonseca, E.; Baldyga, N.; Acosta, A.; Gonzalez, F.; Felicita, F.; Tomasso, M.; Esquivel, D.; Parada, A.; Enriquez, P.; Amilibia, M.
2012-01-01
This workshop addressed the evaluation of the impact of pesticides on vegetable matrices, with the purpose of determining them by GC/MS analysis. The working materials were a lettuce matrix, chard, and a mix of green leaves, together with the pesticides.
Kansas Data Access and Support Center — The Kansas GAP Analysis Land Cover database depicts 43 land cover classes for the state of Kansas. The database was generated using a two-stage hybrid classification...
Perspectives in shape analysis
Bruckstein, Alfred; Maragos, Petros; Wuhrer, Stefanie
2016-01-01
This book presents recent advances in the field of shape analysis. Written by experts in the fields of continuous-scale shape analysis, discrete shape analysis and sparsity, and numerical computing who hail from different communities, it provides a unique view of the topic from a broad range of perspectives. Over the last decade, it has become increasingly affordable to digitize shape information at high resolution. Yet analyzing and processing this data remains challenging because of the large amount of data involved, and because modern applications such as human-computer interaction require real-time processing. Meeting these challenges requires interdisciplinary approaches that combine concepts from a variety of research areas, including numerical computing, differential geometry, deformable shape modeling, sparse data representation, and machine learning. On the algorithmic side, many shape analysis tasks are modeled using partial differential equations, which can be solved using tools from the field of n...
National Research Council Canada - National Science Library
1998-01-01
Establishing proper job procedures is one of the benefits of conducting a job hazard analysis: carefully studying and recording each step of a job, identifying existing or potential job hazards...
Main: Nucleotide Analysis [KOME
Lifescience Database Archive (English)
Nucleotide Analysis: result of a blastn search against the japonica genome sequence. Files: kome_japonica_genome_blast_search_result.zip, kome_japonica_genome_blast_search_result
Schuller, Björn W
2013-01-01
This book provides the reader with the knowledge necessary for comprehension of the field of Intelligent Audio Analysis. It firstly introduces standard methods and discusses the typical Intelligent Audio Analysis chain going from audio data to audio features to audio recognition. Further, an introduction to audio source separation, and enhancement and robustness are given. After the introductory parts, the book shows several applications for the three types of audio: speech, music, and general sound. Each task is shortly introduced, followed by a description of the specific data and methods applied, experiments and results, and a conclusion for this specific task. The book provides benchmark results and standardized test-beds for a broader range of audio analysis tasks. The main focus thereby lies on the parallel advancement of realism in audio analysis, as too often today’s results are overly optimistic owing to idealized testing conditions, and it serves to stimulate synergies arising from transfer of ...
Scientific stream pollution analysis
National Research Council Canada - National Science Library
Nemerow, Nelson Leonard
1974-01-01
A comprehensive description of the analysis of water pollution that presents a careful balance of the biological,hydrological, chemical and mathematical concepts involved in the evaluation of stream...
International Nuclear Information System (INIS)
Andreeva, J; Maier, G; Spiga, D; Calloni, M; Colling, D; Fanzago, F; D'Hondt, J; Maes, J; Van Mulders, P; Villella, I; Klem, J; Letts, J; Padhi, S; Sarkar, S
2010-01-01
During normal data taking CMS expects to support potentially as many as 2000 analysis users. Since the beginning of 2008 there have been more than 800 individuals who submitted a remote analysis job to the CMS computing infrastructure. The bulk of these users will be supported at the over 40 CMS Tier-2 centres. Supporting a globally distributed community of users on a globally distributed set of computing clusters is a task that requires reconsidering the normal methods of user support for Analysis Operations. In 2008 CMS formed an Analysis Support Task Force in preparation for large-scale physics analysis activities. The charge of the task force was to evaluate the available support tools, the user support techniques, and the direct feedback of users with the goal of improving the success rate and user experience when utilizing the distributed computing environment. The task force determined the tools needed to assess and reduce the number of non-zero exit code applications submitted through the grid interfaces and worked with the CMS experiment dashboard developers to obtain the necessary information to quickly and proactively identify issues with user jobs and data sets hosted at various sites. Results of the analysis group surveys were compiled. Reference platforms for testing and debugging problems were established in various geographic regions. The task force also assessed the resources needed to make the transition to a permanent Analysis Operations task. In this presentation the results of the task force will be discussed as well as the CMS Analysis Operations plans for the start of data taking.
Invitation to classical analysis
Duren, Peter
2012-01-01
This book gives a rigorous treatment of selected topics in classical analysis, with many applications and examples. The exposition is at the undergraduate level, building on basic principles of advanced calculus without appeal to more sophisticated techniques of complex analysis and Lebesgue integration. Among the topics covered are Fourier series and integrals, approximation theory, Stirling's formula, the gamma function, Bernoulli numbers and polynomials, the Riemann zeta function, Tauberian theorems, elliptic integrals, ramifications of the Cantor set, and a theoretical discussion of differ
Analysis of irradiated materials
International Nuclear Information System (INIS)
Bellamy, B.A.
1988-01-01
Papers presented at the UKAEA Conference on Materials Analysis by Physical Techniques (1987) covered a wide range of techniques as applied to the analysis of irradiated materials. These varied from reactor component materials, materials associated with the Authority's radwaste disposal programme, fission products and products associated with the decommissioning of nuclear reactors. An invited paper giving a very comprehensive review of Laser Ablation Microprobe Mass Spectroscopy (LAMMS) was included in the programme. (author)
Oktavianto, Digit
2013-01-01
This book is a step-by-step, practical tutorial for analyzing and detecting malware and performing digital investigations. This book features clear and concise guidance in an easily accessible format.Cuckoo Malware Analysis is great for anyone who wants to analyze malware through programming, networking, disassembling, forensics, and virtualization. Whether you are new to malware analysis or have some experience, this book will help you get started with Cuckoo Sandbox so you can start analysing malware effectively and efficiently.
International Nuclear Information System (INIS)
Niehaus, F.
1988-01-01
In this paper, the risks of various energy systems are discussed, considering severe accident analysis, particularly probabilistic safety analysis and probabilistic safety criteria, and the applications of these criteria and analyses. Comparative risk analysis has demonstrated that the largest source of risk in every society is daily small accidents. Nevertheless, we have to be more concerned about severe accidents. The comparative risk analysis of five different energy systems (coal, oil, gas, LWR and STEC (solar)) for the public has shown that the main sources of risk are coal and oil. The latest comparative risk study of various energy sources, conducted in the USA, has revealed that the number of victims from coal is 42 times as many as the victims from nuclear. A study of severe accidents from hydro-dams in the United States has estimated the probability of dam failure at 1 in 10,000 years and the number of victims between 11,000 and 260,000. The average occupational risk from coal is one fatal accident per 1,000 workers/year. Probabilistic safety analysis is a method that can be used to assess nuclear energy risks, to analyze severe accidents, and to model all possible accident sequences and consequences. Fault tree analysis is used to determine the probability of failure of the different systems at each point of the accident sequences and to calculate the probability of risks. After calculating the probability of failure, criteria for judging the numerical results have to be developed, that is, quantitative and qualitative goals. To achieve these goals, several systems have been devised by various IAEA member countries. The probabilistic safety analysis method has been developed by establishing a computer program providing access to different categories of safety-related information. 19 tabs. (author)
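The fault tree quantification described in this abstract can be illustrated with a minimal sketch. This is a generic example (the gate structure, event names and probabilities below are hypothetical, not taken from the paper): independent basic-event probabilities are combined through AND gates (product) and OR gates (complement of the product of complements) to give a top-event probability.

```python
# Illustrative fault-tree quantification (generic example, not the paper's model).

def and_gate(probs):
    """P(all input events occur), assuming independent events."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """P(at least one input event occurs), assuming independent events."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical system: top event = (pump A fails AND pump B fails) OR valve fails.
pump_failure = and_gate([1e-3, 1e-3])      # redundant pumps, per-demand failure probabilities
top_event = or_gate([pump_failure, 1e-5])  # OR with a single valve failure
print(f"Top-event probability per demand: {top_event:.2e}")
```

For rare events the OR gate result is close to the sum of its inputs, which is why redundancy (the AND gate) drives the pump contribution down to 1e-6.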
International Nuclear Information System (INIS)
Arien, B.
1998-01-01
The objective of SCK-CEN's programme on reactor safety is to develop expertise in probabilistic and deterministic reactor safety analysis. The research programme consists of four main activities: the development of software for reliability analysis of large systems; participation in the international PHEBUS-FP programme on severe accidents; the development of an expert system to aid diagnosis; and the development and application of a probabilistic reactor dynamics method. Main achievements in 1999 are reported
Brady, Sir Michael; Highnam, Ralph; Irving, Benjamin; Schnabel, Julia A
2016-10-01
Cancer is one of the world's major healthcare challenges and, as such, an important application of medical image analysis. After a brief introduction to cancer, we summarise some of the major developments in oncological image analysis over the past 20 years, but concentrating those in the authors' laboratories, and then outline opportunities and challenges for the next decade. Copyright © 2016 Elsevier B.V. All rights reserved.
International Nuclear Information System (INIS)
Tibari, Elghali; Taous, Fouad; Marah, Hamid
2014-01-01
This report presents results of stable isotope analyses carried out at the CNESTEN DASTE in Rabat (Morocco) on behalf of Senegal. These analyses cover 127 samples. Oxygen-18 and deuterium analyses of water were performed by infrared laser spectroscopy using an LGR / DLT-100 with autosampler. The results are expressed as δ values (‰) relative to V-SMOW, to ±0.3 ‰ for oxygen-18 and ±1 ‰ for deuterium.
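The δ notation used for these results follows the standard definition, δ = (R_sample/R_standard − 1) × 1000 in per mil relative to a reference such as V-SMOW. A small sketch (the sample ratio below is hypothetical; the V-SMOW ¹⁸O/¹⁶O ratio is a commonly cited reference value, not from this report):

```python
# Sketch of stable-isotope delta notation (general definition, not the lab's code).

VSMOW_18O_16O = 2005.20e-6  # commonly cited 18O/16O ratio of the V-SMOW standard

def delta_permil(r_sample, r_standard):
    """delta value in per mil relative to the standard."""
    return (r_sample / r_standard - 1.0) * 1000.0

# Hypothetical water sample slightly depleted in 18O relative to V-SMOW:
r_sample = 1995.0e-6
print(f"delta-18O = {delta_permil(r_sample, VSMOW_18O_16O):.1f} per mil")
```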
Oden, J Tinsley
2010-01-01
The textbook is designed to drive a crash course for beginning graduate students majoring in something besides mathematics, introducing mathematical foundations that lead to classical results in functional analysis. More specifically, Oden and Demkowicz want to prepare students to learn the variational theory of partial differential equations, distributions, and Sobolev spaces and numerical analysis with an emphasis on finite element methods. The 1996 first edition has been used in a rather intensive two-semester course. -Book News, June 2010
Griffel, DH
2002-01-01
A stimulating introductory text, this volume examines many important applications of functional analysis to mechanics, fluid mechanics, diffusive growth, and approximation. Detailed enough to impart a thorough understanding, the text is also sufficiently straightforward for those unfamiliar with abstract analysis. Its four-part treatment begins with distribution theory and discussions of Green's functions. Essentially independent of the preceding material, the second and third parts deal with Banach spaces, Hilbert space, spectral theory, and variational techniques. The final part outlines the
Ivan Stosic; Drasko Nikolic; Aleksandar Zdravkovic
2012-01-01
The main purpose of this paper is to examine the impact of the current Serbian macro-environment on businesses through the implementation of PEST analysis as a framework for assessing the general or macro environment in which companies operate. The authors argue that the elements in the presented PEST analysis indicate that the current macro-environment is characterized by the dominance of threats and weaknesses, with few opportunities and strengths. Consequently, there is a strong need for faste...
Forensic neutron activation analysis
International Nuclear Information System (INIS)
Kishi, T.
1987-01-01
The progress of forensic neutron activation analysis (FNAA) in Japan is described. FNAA began in 1965 and during the past 20 years many cases have been handled; these include determination of toxic materials, comparison examination of physical evidences (e.g., paints, metal fragments, plastics and inks) and drug sample differentiation. Neutron activation analysis is applied routinely to the scientific criminal investigation as one of multielement analytical techniques. This paper also discusses these routine works. (author) 14 refs
Probabilistic Structural Analysis Program
Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.
2010-01-01
NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and life-prediction methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
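The core computation such a program performs, estimating a probability of failure from uncertain inputs, can be sketched in a few lines. This is a minimal direct Monte Carlo illustration (not NESSUS itself, and far simpler than its advanced mean value or adaptive importance sampling methods): sample an uncertain load L and capacity C and estimate P(C − L < 0). The distributions and parameters below are hypothetical.

```python
# Minimal direct Monte Carlo probability-of-failure sketch (illustrative only).
import random

def prob_failure(n=100_000, seed=1):
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        load = rng.gauss(100.0, 15.0)      # hypothetical load: mean 100, sd 15
        capacity = rng.gauss(160.0, 10.0)  # hypothetical capacity: mean 160, sd 10
        if capacity - load < 0.0:          # limit-state function g = C - L
            failures += 1
    return failures / n

pf = prob_failure()
print(f"Estimated probability of failure: {pf:.1e}")
```

For rare failures (here roughly 4e-4 analytically), direct sampling like this needs very many samples, which is exactly why production codes use variance-reduction methods such as importance sampling.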
Directory of Open Access Journals (Sweden)
Iulian N. BUJOREANU
2011-01-01
Full Text Available Sensitivity analysis represents such a well-known and deeply analyzed subject that anyone entering the field feels unable to add anything new. Still, there are many facets to be taken into consideration. The paper introduces the reader to the various ways sensitivity analysis is implemented and the reasons for which it has to be implemented in most analyses in decision-making processes. Risk analysis is of utmost importance in dealing with resource allocation and is presented at the beginning of the paper as the initial reason to implement sensitivity analysis. Different views and approaches are added during the discussion of sensitivity analysis so that the reader develops as thorough an opinion as possible on the use and utility of sensitivity analysis. Finally, a round-up conclusion brings us to the question of the possibility of generating the future and analyzing it before it unfolds, so that, when it happens, it brings less uncertainty.
International Nuclear Information System (INIS)
Grimanis, A.P.
1985-01-01
A review of research and development on NAA as well as examples of applications of this method are presented, taken from work carried out over the last 21 years at the Radioanalytical Laboratory of the Department of Chemistry in the Greek Nuclear Research Center ''Demokritos''. Improved and faster radiochemical NAA methods have been developed for the determination of Au, Ni, Cl, As, Cu, U, Cr, Eu, Hg and Mo in several materials, and for the simultaneous determination of Br and I; Mg, Sr and Ni; As and Cu; As, Sb and Hg; Mn, Sr and Ba; Cd and Zn; Se and As; and Mo and Cr in biological materials. Instrumental NAA methods have also been developed for the determination of Ag, Cl and Na in lake waters, Al, Ca, Mg and V in wines, 7 trace elements in biological materials, 17 trace elements in sediments and 20 minor and trace elements in ceramics. A comprehensive computer program for routine activation analysis using Ge(Li) detectors has been worked out. A rather extensive charged-particle activation analysis programme has been carried out over the last 10 years, including particle-induced X-ray emission (PIXE) analysis, particle-induced prompt gamma-ray emission (PIGE) analysis, other nuclear reactions and proton activation analysis. A special neutron activation method, delayed fission neutron counting, is used for the analysis of fissionable elements, such as U, Th and Pu, in samples from the whole nuclear fuel cycle, including geological, enriched and nuclear safeguards samples
Integrated genetic analysis microsystems
International Nuclear Information System (INIS)
Lagally, Eric T; Mathies, Richard A
2004-01-01
With the completion of the Human Genome Project and the ongoing DNA sequencing of the genomes of other animals, bacteria, plants and others, a wealth of new information about the genetic composition of organisms has become available. However, as the demand for sequence information grows, so does the workload required both to generate this sequence and to use it for targeted genetic analysis. Microfabricated genetic analysis systems are well poised to assist in the collection and use of these data through increased analysis speed, lower analysis cost and higher parallelism leading to increased assay throughput. In addition, such integrated microsystems may point the way to targeted genetic experiments on single cells and in other areas that are otherwise very difficult. Concomitant with these advantages, such systems, when fully integrated, should be capable of forming portable systems for high-speed in situ analyses, enabling a new standard in disciplines such as clinical chemistry, forensics, biowarfare detection and epidemiology. This review will discuss the various technologies available for genetic analysis on the microscale, and efforts to integrate them to form fully functional robust analysis devices. (topical review)
INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES
Directory of Open Access Journals (Sweden)
Caescu Stefan Claudiu
2011-12-01
Full Text Available Theme The situation analysis, as a separate component of the strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand and also on the organization’s resources and capabilities on the other. Objectives of the Research The main purpose of the study of the analysis techniques of the internal environment is to provide insight on those aspects that are of strategic importance to the organization. Literature Review The marketing environment consists of two distinct components, the internal environment that is made from specific variables within the organization and the external environment that is made from variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which are influenced to a great extent and totally controlled by it. The study of the internal environment must answer all resource related questions, solve all resource management issues and represents the first step in drawing up the marketing strategy. Research Methodology The present paper accomplished a documentary study of the main techniques used for the analysis of the internal environment. Results The special literature emphasizes that the differences in performance from one organization to another are primarily dependent not on the differences between the fields of activity, but especially on the differences between the resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of the organizational resources, the performance analysis, the value chain analysis and the functional analysis. Implications Basically such
Professionalizing Intelligence Analysis
Directory of Open Access Journals (Sweden)
James B. Bruce
2015-09-01
Full Text Available This article examines the current state of professionalism in national security intelligence analysis in the U.S. Government. Since the introduction of major intelligence reforms directed by the Intelligence Reform and Terrorism Prevention Act (IRTPA in December, 2004, we have seen notable strides in many aspects of intelligence professionalization, including in analysis. But progress is halting, uneven, and by no means permanent. To consolidate its gains, and if it is to continue improving, the U.S. intelligence community (IC should commit itself to accomplishing a new program of further professionalization of analysis to ensure that it will develop an analytic cadre that is fully prepared to deal with the complexities of an emerging multipolar and highly dynamic world that the IC itself is forecasting. Some recent reforms in intelligence analysis can be assessed against established standards of more fully developed professions; these may well fall short of moving the IC closer to the more fully professionalized analytical capability required for producing the kind of analysis needed now by the United States.
Harmonic and geometric analysis
Citti, Giovanna; Pérez, Carlos; Sarti, Alessandro; Zhong, Xiao
2015-01-01
This book presents an expanded version of four series of lectures delivered by the authors at the CRM. Harmonic analysis, understood in a broad sense, has a very wide interplay with partial differential equations and in particular with the theory of quasiconformal mappings and its applications. Some areas in which real analysis has been extremely influential are PDE's and geometric analysis. Their foundations and subsequent developments made extensive use of the Calderón–Zygmund theory, especially the Lp inequalities for Calderón–Zygmund operators (Beurling transform and Riesz transform, among others) and the theory of Muckenhoupt weights. The first chapter is an application of harmonic analysis and the Heisenberg group to understanding human vision, while the second and third chapters cover some of the main topics on linear and multilinear harmonic analysis. The last serves as a comprehensive introduction to a deep result from De Giorgi, Moser and Nash on the regularity of elliptic partial differen...
Hansson, Sven Ove; Aven, Terje
2014-07-01
This article discusses to what extent risk analysis is scientific in view of a set of commonly used definitions and criteria. We consider scientific knowledge to be characterized by its subject matter, its success in developing the best available knowledge in its fields of study, and the epistemic norms and values that guide scientific investigations. We proceed to assess the field of risk analysis according to these criteria. For this purpose, we use a model for risk analysis in which science is used as a base for decision making on risks, which covers the five elements evidence, knowledge base, broad risk evaluation, managerial review and judgment, and the decision; and that relates these elements to the domains experts and decisionmakers, and to the domains fact-based or value-based. We conclude that risk analysis is a scientific field of study, when understood as consisting primarily of (i) knowledge about risk-related phenomena, processes, events, etc., and (ii) concepts, theories, frameworks, approaches, principles, methods and models to understand, assess, characterize, communicate, and manage risk, in general and for specific applications (the instrumental part). © 2014 Society for Risk Analysis.
Zhou, Qing; Son, Kyungjin; Liu, Ying; Revzin, Alexander
2015-01-01
Biosensors first appeared several decades ago to address the need for monitoring physiological parameters such as oxygen or glucose in biological fluids such as blood. More recently, a new wave of biosensors has emerged in order to provide more nuanced and granular information about the composition and function of living cells. Such biosensors exist at the confluence of technology and medicine and often strive to connect cell phenotype or function to physiological or pathophysiological processes. Our review aims to describe some of the key technological aspects of biosensors being developed for cell analysis. The technological aspects covered in our review include biorecognition elements used for biosensor construction, methods for integrating cells with biosensors, approaches to single-cell analysis, and the use of nanostructured biosensors for cell analysis. Our hope is that the spectrum of possibilities for cell analysis described in this review may pique the interest of biomedical scientists and engineers and may spur new collaborations in the area of using biosensors for cell analysis.
Clinical reasoning: concept analysis.
Simmons, Barbara
2010-05-01
This paper is a report of a concept analysis of clinical reasoning in nursing. Clinical reasoning is an ambiguous term that is often used synonymously with decision-making and clinical judgment. Clinical reasoning has not been clearly defined in the literature. Healthcare settings are increasingly filled with uncertainty, risk and complexity due to increased patient acuity, multiple comorbidities, and enhanced use of technology, all of which require clinical reasoning. Data sources. Literature for this concept analysis was retrieved from several databases, including CINAHL, PubMed, PsycINFO, ERIC and OvidMEDLINE, for the years 1980 to 2008. Rodgers's evolutionary method of concept analysis was used because of its applicability to concepts that are still evolving. Multiple terms have been used synonymously to describe the thinking skills that nurses use. Research in the past 20 years has elucidated differences among these terms and identified the cognitive processes that precede judgment and decision-making. Our concept analysis defines one of these terms, 'clinical reasoning,' as a complex process that uses cognition, metacognition, and discipline-specific knowledge to gather and analyse patient information, evaluate its significance, and weigh alternative actions. This concept analysis provides a middle-range descriptive theory of clinical reasoning in nursing that helps clarify meaning and gives direction for future research. Appropriate instruments to operationalize the concept need to be developed. Research is needed to identify additional variables that have an impact on clinical reasoning and what are the consequences of clinical reasoning in specific situations.
International Nuclear Information System (INIS)
Sitek, J.; Degmova, J.; Dekan, J.
2011-01-01
The Kosice meteorite fell on 28 February 2010 near Kosice and represents a unique find, because the last meteorite fall observed in Slovakia was in the year 1895. It is supposed that for this kind of meteorite the orbit in cosmic space can be calculated. This is one of the most important aspects, because until now only 13 meteorite finds in the world have had their cosmic orbits calculated. Slovakia is a member of the international bolide network dealing with meteorite analysis in Central Europe. Analysis of the Kosice meteorite will also concern the long-lived and short-lived nuclides. The results should contribute to the determination of radiation and formation ages. From structural analysis of the meteorite it will be possible to compare it with similar types of meteorites. In this work Moessbauer spectroscopy is used for phase analysis of the iron-containing components, with the aim of identifying magnetic and non-magnetic fractions. From the analysis of the magnetic part we find that the first sextet, with hyperfine magnetic field 33.5 T, corresponds to the bcc Fe-Ni alloy (kamacite) and the second, with field 31.5 T, to FeS (troilite). Meteorites with the mentioned composition belong to the mineral group of chondrites. Comparing our parameters with results of measurements on similar meteorites, we conclude that the Kosice meteorite contains the same components. According to all Moessbauer parameters we can also include this meteorite in the mineral group of chondrites. (authors)
Foundations of VISAR analysis.
Energy Technology Data Exchange (ETDEWEB)
Dolan, Daniel H.
2006-06-01
The Velocity Interferometer System for Any Reflector (VISAR) is a widely used diagnostic at Sandia National Laboratories. Although the operating principles of the VISAR are well established, recently deployed systems (such as the fast push-pull and air delay VISAR) require more careful consideration, and many common assumptions about VISAR are coming into question. This report presents a comprehensive review of VISAR analysis to address these issues. Detailed treatment of several interferometer configurations is given to identify important aspects of the operation and characterization of VISAR systems. The calculation of velocity from interferometer measurements is also described. The goal is to derive the standard VISAR analysis relationships, indicate when these relationships are valid, and provide alternative methods when the standard analysis fails.
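The standard velocity relationship the report derives can be sketched in its simplest textbook form: velocity is the fringe count F(t) multiplied by the velocity-per-fringe constant VPF = λ / (2τ(1+δ)), where λ is the laser wavelength, τ the interferometer delay, and δ a dispersion correction. The numbers below are hypothetical, and this ignores the push-pull and window corrections the report treats in detail.

```python
# Simplest-form VISAR velocity relation (textbook sketch; hypothetical parameters).

def vpf(wavelength_m, tau_s, delta=0.0):
    """Velocity per fringe (m/s): wavelength / (2 * delay * (1 + dispersion))."""
    return wavelength_m / (2.0 * tau_s * (1.0 + delta))

def velocity(fringes, wavelength_m=532e-9, tau_s=1e-9, delta=0.0):
    """Surface velocity (m/s) from the fringe count."""
    return fringes * vpf(wavelength_m, tau_s, delta)

# Hypothetical setup: 532 nm laser, 1 ns etalon delay, 3.5 fringes observed.
print(f"VPF = {vpf(532e-9, 1e-9):.1f} m/s per fringe")
print(f"v   = {velocity(3.5):.1f} m/s")
```

The etalon delay τ sets the trade-off the report discusses: a longer delay gives a smaller VPF (better velocity resolution) but poorer time resolution.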
Pugh, Charles C
2015-01-01
Based on an honors course taught by the author at UC Berkeley, this introduction to undergraduate real analysis gives a different emphasis by stressing the importance of pictures and hard problems. Topics include: a natural construction of the real numbers, four-dimensional visualization, basic point-set topology, function spaces, multivariable calculus via differential forms (leading to a simple proof of the Brouwer Fixed Point Theorem), and a pictorial treatment of Lebesgue theory. Over 150 detailed illustrations elucidate abstract concepts and salient points in proofs. The exposition is informal and relaxed, with many helpful asides, examples, some jokes, and occasional comments from mathematicians, such as Littlewood, Dieudonné, and Osserman. This book thus succeeds in being more comprehensive, more comprehensible, and more enjoyable, than standard introductions to analysis. New to the second edition of Real Mathematical Analysis is a presentation of Lebesgue integration done almost entirely using the un...
Digital Fourier analysis fundamentals
Kido, Ken'iti
2015-01-01
This textbook is a thorough, accessible introduction to digital Fourier analysis for undergraduate students in the sciences. Beginning with the principles of sine/cosine decomposition, the reader walks through the principles of discrete Fourier analysis before reaching the cornerstone of signal processing: the Fast Fourier Transform. Saturated with clear, coherent illustrations, "Digital Fourier Analysis - Fundamentals" includes practice problems and thorough Appendices for the advanced reader. As a special feature, the book includes interactive applets (available online) that mirror the illustrations. These user-friendly applets animate concepts interactively, allowing the user to experiment with the underlying mathematics. For example, a real sine signal can be treated as a sum of clockwise and counter-clockwise rotating vectors. The applet illustration included with the book animates the rotating vectors and the resulting sine signal. By changing parameters such as amplitude and frequency, the reader ca...
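The "two rotating vectors" view mentioned in the abstract can be checked numerically in a few lines: a real sine is the sum of a counter-clockwise and a clockwise rotating phasor, sin(ωt) = (e^{iωt} − e^{−iωt}) / 2i. The 50 Hz frequency below is just an example value.

```python
# Numerical check of the rotating-phasor decomposition of a real sine.
import cmath
import math

def sine_from_phasors(w, t):
    """Reconstruct sin(w*t) from two counter-rotating unit phasors."""
    ccw = cmath.exp(1j * w * t)   # counter-clockwise rotating vector
    cw = cmath.exp(-1j * w * t)   # clockwise rotating vector
    return ((ccw - cw) / 2j).real

w = 2 * math.pi * 50.0  # example: 50 Hz
for t in (0.0, 0.004, 0.005):
    assert abs(sine_from_phasors(w, t) - math.sin(w * t)) < 1e-12
print("phasor decomposition matches math.sin")
```

This identity is also why the discrete Fourier transform of a real sine shows a pair of mirror-image peaks at positive and negative frequency.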
Frank, IE
1994-01-01
Analyzing observed or measured data is an important step in applied sciences. The recent increase in computer capacity has resulted in a revolution both in data collection and data analysis. An increasing number of scientists, researchers and students are venturing into statistical data analysis; hence the need for more guidance in this field, which was previously dominated mainly by statisticians. This handbook fills the gap in the range of textbooks on data analysis. Written in a dictionary format, it will serve as a comprehensive reference book in a rapidly growing field. However, this book is more structured than an ordinary dictionary, where each entry is a separate, self-contained entity. The authors provide not only definitions and short descriptions, but also offer an overview of the different topics. Therefore, the handbook can also be used as a companion to textbooks for undergraduate or graduate courses. 1700 entries are given in alphabetical order grouped into 20 topics and each topic is organized...
International Nuclear Information System (INIS)
Arien, B.
1998-01-01
Risk assessments of nuclear installations require accurate safety and reliability analyses to estimate the consequences of accidental events and their probability of occurrence. The objective of the work performed in this field at the Belgian Nuclear Research Centre SCK-CEN is to develop expertise in probabilistic and deterministic reactor safety analysis. The four main activities of the research project on reactor safety analysis are: (1) the development of software for the reliable analysis of large systems; (2) the development of an expert system for the aid to diagnosis; (3) the development and the application of a probabilistic reactor-dynamics method, and (4) to participate in the international PHEBUS-FP programme for severe accidents. Progress in research during 1997 is described
Energy Technology Data Exchange (ETDEWEB)
Haurykiewicz, John Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dinehart, Timothy Grant [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parker, Robert Young [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-05-12
The purpose of this process analysis was to analyze the Badge Offices’ current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered for the project primarily through Badge Office Subject Matter Experts (SMEs), and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.
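The throughput/wait-time trade-off studied in this analysis can be illustrated with a closed-form queueing sketch (the report itself used a discrete-event simulation model, not these formulas, and the arrival and service rates below are hypothetical): for an M/M/1 queue, mean time in the system is W = 1/(μ − λ).

```python
# Back-of-the-envelope M/M/1 queueing sketch (illustrative; not the report's model).

def mm1_wait(arrival_rate, service_rate):
    """Mean time in system W for an M/M/1 queue; requires utilization < 1."""
    rho = arrival_rate / service_rate
    if rho >= 1.0:
        raise ValueError("queue is unstable: utilization >= 1")
    return 1.0 / (service_rate - arrival_rate)

# Hypothetical badge office: 20 customers/hour arriving, 25 served/hour.
w = mm1_wait(20.0, 25.0)
print(f"Mean time in system: {w * 60:.0f} minutes")
```

The formula makes the motivation for simulation clear: as utilization approaches 1, wait times blow up nonlinearly, so small changes in service capacity can have large effects on customer wait.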
International Nuclear Information System (INIS)
Deville, J.P.
1998-01-01
Nowadays, there are many surface analysis methods, each having its specificity, its qualities, its constraints (for instance, vacuum) and its limits. Expensive in time and investment, these methods have to be used deliberately. This article is addressed to non-specialists. It gives some elements of choice according to the information sought, the sensitivity, the constraints of use or the answer to a precise question. After recalling the fundamental principles which govern these analysis methods, based on the interaction between radiation (ultraviolet, X-ray) or particles (ions, electrons) and matter, two methods are described in more detail: Auger electron spectroscopy (AES) and X-ray photoemission spectroscopy (ESCA or XPS). Indeed, they are the most widespread methods in laboratories, the easiest to use and probably the most productive for the analysis of surfaces of industrial materials or samples subjected to treatments in aggressive media. (O.M.)
Power electronics reliability analysis.
Energy Technology Data Exchange (ETDEWEB)
Smith, Mark A.; Atcitty, Stanley
2009-12-01
This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help in understanding the main causes of failures, downtime, and cost, and in deciding how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics for each cause. Another approach is to model the functional structure of the equipment with a fault tree and derive system reliability from component reliability; the latter process is demonstrated through the analysis of a fictitious device. The resulting baseline model can then be used in optimization to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability in the future. Reliability analysis helps guide reliability improvements in hardware and software technology, including condition monitoring and prognostics and health management.
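The fault-tree approach mentioned above can be illustrated with a short sketch. The device, gate layout, and failure probabilities below are invented for demonstration and are not taken from the report; the point is only how AND/OR gates combine component failure probabilities (assuming independent failures) into a system-level figure.

```python
from math import prod

def or_gate(ps):
    """System fails if ANY input fails (series structure)."""
    return 1.0 - prod(1.0 - p for p in ps)

def and_gate(ps):
    """System fails only if ALL inputs fail (redundant structure)."""
    return prod(ps)

# Hypothetical annual failure probabilities for a fictitious converter:
fan = 0.10          # one cooling fan
igbt = 0.02         # switching module
controller = 0.05   # control board

# Two redundant fans feed an AND gate; its output ORs with the rest.
p_cooling = and_gate([fan, fan])                   # both fans must fail
p_system = or_gate([p_cooling, igbt, controller])  # any branch fails system
```

With these numbers, redundancy reduces the cooling branch from 0.10 to 0.01, and the overall system failure probability comes out to 1 - (0.99)(0.98)(0.95) ≈ 0.078. A baseline model of this kind is what optimization would then perturb to weigh reliability gains against cost.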
International Nuclear Information System (INIS)
Gregg, H.R.; Meltzer, M.P.
1996-01-01
The portable Contamination Analysis Unit (CAU) measures trace quantities of surface contamination in real time. The detector head of the unit has an opening with an O-ring seal, one or more vacuum valves, and a small mass spectrometer. With the valve closed, the mass spectrometer is evacuated by one or more pumps. The O-ring seal is then placed against the surface to be tested and the vacuum valve is opened. Data are collected from the mass spectrometer, and a portable computer provides the contamination analysis. The CAU supports the decontamination and decommissioning of hazardous and radioactive facilities by measuring residual hazardous surface contamination, such as tritium and trace organics. It provides surface contamination data for research and development applications, gives real-time process-control feedback for industrial cleaning operations, and can be used to determine the readiness of a surface to accept bonding or coatings. 1 fig
Jorgensen, Palle
2017-01-01
The book features new directions in analysis, with an emphasis on Hilbert space, mathematical physics, and stochastic processes. We interpret 'non-commutative analysis' broadly to include representations of non-Abelian groups and non-Abelian algebras, with emphasis on Lie groups and operator algebras (C*-algebras and von Neumann algebras). A second theme is commutative and non-commutative harmonic analysis, spectral theory, operator theory and their applications. The list of topics includes shift-invariant spaces, group actions in differential geometry, and frame theory (over-complete bases) and its applications to engineering (signal processing and multiplexing), projective multi-resolutions, and free probability algebras. The book serves as an accessible introduction, offering a timeless presentation, attractive and accessible to students both in mathematics and in neighboring fields.
DEFF Research Database (Denmark)
Frigaard, Peter; Andersen, Thomas Lykke
The present book describes the most important aspects of wave analysis techniques applied to physical model tests. Moreover, the book serves as technical documentation for the wave analysis software WaveLab 3, cf. Aalborg University (2012). In that respect it should be mentioned that, supplementary to the present technical documentation, there exists also an online help document describing the WaveLab software in detail, including all the input and output fields. In addition to the two main authors, Tue Hald, Jacob Helm-Petersen and Morten Møller Jakobsen have also contributed to the note. Their input is highly acknowledged. The outline of the book is as follows: • Chapters 2 and 3 describe the analysis of waves in the time and frequency domains. • Chapters 4 and 5 describe the separation of incident and reflected waves for the two-dimensional case. • Chapter 6 describes the estimation of the directional spectra, which also...
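The frequency-domain wave analysis covered in the chapters above can be sketched briefly. The example below is not WaveLab code: it builds a synthetic two-component surface elevation record (amplitudes, frequencies, and record length are assumed values) and estimates the variance density spectrum via FFT, from which the zeroth spectral moment gives the significant wave height Hm0 = 4√m0.

```python
import numpy as np

# Synthetic surface elevation: two regular wave components (assumed values).
fs = 10.0                         # sampling frequency [Hz]
t = np.arange(0, 600, 1 / fs)     # 10-minute record
eta = 0.5 * np.sin(2 * np.pi * 0.1 * t) + 0.2 * np.sin(2 * np.pi * 0.25 * t + 1.0)

# One-sided variance density spectrum (raw periodogram).
n = len(t)
E = np.abs(np.fft.rfft(eta)) ** 2 / (fs * n)   # [m^2 s]
E[1:-1] *= 2                                    # fold in negative frequencies
f = np.fft.rfftfreq(n, 1 / fs)
df = f[1] - f[0]

m0 = np.sum(E) * df            # zeroth spectral moment = variance of eta
Hm0 = 4 * np.sqrt(m0)          # spectral significant wave height
fp = f[np.argmax(E)]           # peak frequency
```

For this record the variance is 0.5²/2 + 0.2²/2 = 0.145 m², so Hm0 ≈ 1.52 m and the peak sits at the 0.1 Hz component. A production analysis would additionally smooth the periodogram and, per Chapters 4 and 5, separate incident from reflected waves using multiple gauges.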
Canuto, Claudio
2015-01-01
The purpose of the volume is to provide a support textbook for a second lecture course on Mathematical Analysis. The contents are organised to suit, in particular, students of Engineering, Computer Science and Physics, all areas in which mathematical tools play a crucial role. The basic notions and methods concerning integral and differential calculus for multivariable functions, series of functions and ordinary differential equations are presented in a manner that elicits critical reading and prompts a hands-on approach to concrete applications. The pedagogical layout echoes the one used in the companion text Mathematical Analysis I. The book’s structure has a specifically-designed modular nature, which allows for great flexibility in the preparation of a lecture course on Mathematical Analysis. The style privileges clarity in the exposition and a linear progression through the theory. The material is organised on two levels. The first, reflected in this book, allows students to grasp the essential ideas, ...
Bandemer, Hans
1992-01-01
Fuzzy data such as marks, scores, verbal evaluations, imprecise observations, experts' opinions and grey tone pictures are quite common. In Fuzzy Data Analysis the authors collect their recent results, providing the reader with ideas, approaches and methods for processing such data when looking for sub-structures in knowledge bases, for the evaluation of functional relationships, e.g. in order to specify diagnostic or control systems. The modelling presented uses ideas from fuzzy set theory, and the suggested methods solve problems usually tackled by data analysis when the data are real numbers. Fuzzy Data Analysis is self-contained and is addressed to mathematicians oriented towards applications and to practitioners in any field of application who have some background in mathematics and statistics.
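The fuzzy-set machinery underlying such methods can be illustrated in a few lines. The sketch below is a generic example, not taken from the book: the linguistic values, the triangular membership shapes, and the 0-10 scale are all assumptions, and it uses the standard min/max operators for fuzzy intersection and union.

```python
def triangular(a, b, c):
    """Triangular membership function: support (a, c), peak at b."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return mu

# Hypothetical fuzzy values on a 0-10 scale: a verbal score "about 7"
# and an evaluation "high" (peak placed past the scale end so that
# membership is non-decreasing over [0, 10]).
about_7 = triangular(5, 7, 9)
high = triangular(6, 10, 14)

# Pointwise fuzzy intersection (min) and union (max) at x = 7:
x = 7.0
mu_and = min(about_7(x), high(x))   # degree of "about 7 AND high"
mu_or = max(about_7(x), high(x))    # degree of "about 7 OR high"
```

At x = 7 the score "about 7" holds fully (membership 1.0) while "high" holds only to degree 0.25, so the conjunction is limited by the weaker condition. Diagnostic rules built on such data combine many memberships of this kind.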
Zorich, Vladimir A
2015-01-01
VLADIMIR A. ZORICH is professor of mathematics at Moscow State University. His areas of specialization are analysis, conformal geometry, quasiconformal mappings, and mathematical aspects of thermodynamics. He solved the problem of global homeomorphism for space quasiconformal mappings. He holds a patent in the technology of mechanical engineering, and he is also known by his book Mathematical Analysis of Problems in the Natural Sciences. This second English edition of a very popular two-volume work presents a thorough first course in analysis, leading from real numbers to such advanced topics as differential forms on manifolds; asymptotic methods; Fourier, Laplace, and Legendre transforms; elliptic functions; and distributions. Especially notable in this course are the clearly expressed orientation toward the natural sciences and the informal exploration of the essence and the roots of the basic concepts and theorems of calculus. Clarity of exposition is matched by a wealth of instructive exercises, problems...
DEFF Research Database (Denmark)
Christensen, Ole; Feichtinger, Hans G.; Paukner, Stephan
2015-01-01
, it characterizes a function by its transform over phase space, which is the time–frequency plane (TF-plane) in a musical context or the location–wave-number domain in the context of image processing. Since the transition from the signal domain to the phase space domain introduces an enormous amount of data...... of the generalities relevant for an understanding of Gabor analysis of functions on Rd. We pay special attention to the case d = 2, which is the most important case for image processing and image analysis applications. The chapter is organized as follows. Section 2 presents central tools from functional analysis......, the application of Gabor expansions to image representation is considered in Sect. 6....
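The phase-space characterization described in this record can be sketched with a minimal discrete Gabor (short-time Fourier) transform. The window shape, lengths, and the chirp test signal below are assumed values for illustration; a full Gabor analysis would also address dual windows and frame bounds, which this sketch omits.

```python
import numpy as np

def gabor_transform(signal, fs, win_len=128, hop=32):
    """Discrete Gabor (short-time Fourier) transform with a Gaussian window.

    Returns (coeffs, freqs): rows of coeffs are frequency bins, columns
    are window positions on the time-frequency plane.
    """
    n = np.arange(win_len)
    window = np.exp(-0.5 * ((n - win_len / 2) / (win_len / 8)) ** 2)
    starts = range(0, len(signal) - win_len + 1, hop)
    frames = np.array([signal[s:s + win_len] * window for s in starts])
    coeffs = np.fft.rfft(frames, axis=1).T
    freqs = np.fft.rfftfreq(win_len, 1 / fs)
    return coeffs, freqs

# A linear chirp concentrates along a rising ridge in the TF-plane:
fs = 1000.0
t = np.arange(0, 1, 1 / fs)
chirp = np.sin(2 * np.pi * (50 + 100 * t) * t)   # instantaneous freq 50->250 Hz
C, freqs = gabor_transform(chirp, fs)

ridge_start = freqs[np.argmax(np.abs(C[:, 0]))]   # dominant freq, first frame
ridge_end = freqs[np.argmax(np.abs(C[:, -1]))]    # dominant freq, last frame
```

The ridge of largest coefficients rises from roughly 60 Hz in the first frame toward about 235 Hz in the last, tracking the chirp's instantaneous frequency: exactly the redundancy-versus-locality trade-off of phase-space representations that the chapter discusses for images in dimension d = 2.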