WorldWideScience

Sample records for matlab curve fitting

  1. Edge detection and mathematic fitting for corneal surface with Matlab software.

    Science.gov (United States)

    Di, Yue; Li, Mei-Yan; Qiao, Tong; Lu, Na

    2017-01-01

    To select the optimal edge detection method for identifying the corneal surface and to compare three curve-fitting equations using Matlab software. Fifteen subjects were recruited. The corneal images from optical coherence tomography (OCT) were imported into Matlab software. Five edge detection methods (Canny, LoG, Prewitt, Roberts, Sobel) were used to identify the corneal surface. Two manual identification methods (ginput and getpts) were then applied to mark the edge coordinates, and the differences among these methods were compared. A binomial curve (y = Ax² + Bx + C), a polynomial curve [p(x) = p₁xⁿ + p₂xⁿ⁻¹ + … + pₙx + pₙ₊₁] and a conic section (Ax² + Bxy + Cy² + Dx + Ey + F = 0) were each fitted to the corneal surface, and the relative merits of the three fitted curves were analyzed. Finally, the eccentricity (e) obtained by corneal topography and by the conic section were compared with a paired t-test. All five edge detection algorithms produced continuous coordinates indicating the edge of the corneal surface. The coordinates from manual identification lay close to, but slightly inside, the actual edges. The binomial curve was strongly affected by tilt angle. The polynomial curve lacked geometrical properties and was unstable. The conic section could yield the tilted symmetry axis, eccentricity, circle center, etc. There were no significant differences between the e values obtained by corneal topography and by the conic section (t = 0.9143, P = 0.3760 > 0.05). It is feasible to model the corneal surface with a mathematical curve using Matlab software. Edge detection offers better repeatability and higher efficiency; the manual identification approach is an indispensable complement to detection. Polynomial and conic-section fits are both viable methods for corneal curve fitting, and the conic section was the optimal choice because of its geometrical properties.
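    A minimal MATLAB sketch of the kind of conic-section fit this record describes (not the authors' code): the general conic Ax² + Bxy + Cy² + Dx + Ey + F = 0 is fitted to detected edge coordinates by linear least squares with the normalization F = -1. The variables edgeX and edgeY are hypothetical edge coordinates, e.g. from edge(I,'canny').

        x = edgeX(:);  y = edgeY(:);           % hypothetical edge coordinates from edge detection
        M = [x.^2, x.*y, y.^2, x, y];          % design matrix for the conic terms
        coef = M \ ones(size(x));              % least-squares solution with F fixed to -1
        A = coef(1); B = coef(2); C = coef(3); D = coef(4); E = coef(5); F = -1;
        % Eccentricity and the tilted symmetry axis then follow from the conic invariants.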

  2. A computerized glow curve analysis (GCA) method for WinREMS thermoluminescent dosimeter data using MATLAB

    International Nuclear Information System (INIS)

    Harvey, John A.; Rodrigues, Miesher L.; Kearfott, Kimberlee J.

    2011-01-01

    A computerized glow curve analysis (GCA) program for handling thermoluminescence data originating from WinREMS is presented. The MATLAB program fits the glow peaks using the first-order kinetics model. Tested materials are LiF:Mg,Ti, CaF₂:Dy, CaF₂:Tm, CaF₂:Mn, LiF:Mg,Cu,P, and CaSO₄:Dy, with most having an average figure of merit (FOM) of 1.3% or less, and CaSO₄:Dy 2.2% or less. Output is a list of fit parameters, peak areas, and graphs for each fit, with each glow curve evaluated in 1.5 s or less. - Highlights: → Robust algorithm for performing thermoluminescent dosimeter glow curve analysis. → Written in MATLAB so readily implemented on a variety of computers. → Usage of the figure of merit demonstrated for six different materials.
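    As a hedged illustration of first-order glow-peak fitting in MATLAB (not the program described above), the sketch below fits a single peak with the widely used Kitis first-order peak-shape approximation and reports the figure of merit. It assumes the Optimization Toolbox function lsqcurvefit; T (temperature, K) and counts are hypothetical glow-curve data, and the starting values are illustrative.

        kB   = 8.617e-5;                                   % Boltzmann constant, eV/K
        peak = @(p, T) p(1) .* exp(1 + (p(2)./(kB*T)).*(T - p(3))./p(3) ...
               - (T.^2./p(3)^2) .* exp((p(2)./(kB*T)).*(T - p(3))./p(3)) ...
               .* (1 - 2*kB*T./p(2)) - 2*kB*p(3)./p(2));   % p = [Im, E (eV), Tm (K)]
        [~, iMax] = max(counts);
        p0   = [max(counts), 1.2, T(iMax)];                % rough initial guess
        pFit = lsqcurvefit(peak, p0, T(:), counts(:));
        FOM  = 100 * sum(abs(counts(:) - peak(pFit, T(:)))) / sum(peak(pFit, T(:)));   % figure of merit, %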

  3. Polynomial curve fitting for control rod worth using least square numerical analysis

    International Nuclear Information System (INIS)

    Muhammad Husamuddin Abdul Khalil; Mark Dennis Usang; Julia Abdul Karim; Mohd Amin Sharifuldin Salleh

    2012-01-01

    RTP must have sufficient excess reactivity to compensate for negative reactivity feedback effects, such as those caused by the fuel temperature and power defects of reactivity and by fuel burn-up, and to allow full power operation for a predetermined period of time. To compensate for this excess reactivity, it is necessary to introduce an amount of negative reactivity by adjusting or controlling the control rods at will. Control rod worth depends largely upon the value of the neutron flux at the location of the rod and is well represented by a polynomial curve. The purpose of this paper is to carry out the polynomial curve fitting for control rod worth using least-squares numerical techniques in a MATLAB-compatible language. (author)
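    A minimal sketch of the least-squares polynomial fit the abstract refers to, using base MATLAB's polyfit; rodPos and rodWorth are hypothetical measured rod positions and integral rod worths, and the 4th-degree choice is illustrative.

        c        = polyfit(rodPos(:), rodWorth(:), 4);   % least-squares polynomial coefficients
        worthFit = polyval(c, rodPos(:));                % fitted control rod worth curve
        plot(rodPos, rodWorth, 'o', rodPos, worthFit, '-');
        xlabel('Rod position'); ylabel('Integral rod worth');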

  4. The Predicting Model of E-commerce Site Based on the Ideas of Curve Fitting

    Science.gov (United States)

    Tao, Zhang; Li, Zhang; Dingjun, Chen

    On the basis of the idea of the second multiplication curve fitting, the number and scale of Chinese E-commerce sites are analyzed. A preventing-increase model is introduced in this paper, and the model parameters are solved with the Matlab software. The validity of the preventing-increase model is confirmed through a numerical experiment. The experimental results show that the precision of the preventing-increase model is satisfactory.

  5. CURVE LSFIT, Gamma Spectrometer Calibration by Interactive Fitting Method

    International Nuclear Information System (INIS)

    Olson, D.G.

    1992-01-01

    1 - Description of program or function: CURVE and LSFIT are interactive programs designed to obtain the best data fit to an arbitrary curve. CURVE finds the type of fitting routine which produces the best curve. The types of fitting routines available are linear regression, exponential, logarithmic, power, least squares polynomial, and spline. LSFIT produces a reliable calibration curve for gamma-ray spectrometry by using the uncertainty value associated with each data point. LSFIT is intended for use where an entire efficiency curve is to be made, starting at 30 keV and continuing to 1836 keV. It creates calibration curves using up to three least squares polynomial fits to produce the best curve for photon energies above 120 keV and a spline function to combine these fitted points with a best fit for points below 120 keV. 2 - Method of solution: The quality of fit is tested by comparing the measured y-value to the y-value calculated from the fitted curve. The fractional difference between these two values is printed for the evaluation of the quality of the fit. 3 - Restrictions on the complexity of the problem - Maxima of: 2000 data points of calibration curve output (LSFIT); 30 input data points; 3 least squares polynomial fits (LSFIT). The least squares polynomial fit requires that the number of data points used exceed the degree of fit by at least two.

  6. A versatile curve-fit model for linear to deeply concave rank abundance curves

    NARCIS (Netherlands)

    Neuteboom, J.H.; Struik, P.C.

    2005-01-01

    A new, flexible curve-fit model for linear to concave rank abundance curves was conceptualized and validated using observational data. The model links the geometric-series model and log-series model and can also fit deeply concave rank abundance curves. The model is based - in an unconventional way

  7. GLOBAL AND STRICT CURVE FITTING METHOD

    NARCIS (Netherlands)

    Nakajima, Y.; Mori, S.

    2004-01-01

    To find a global and smooth curve fitting, the cubic B-spline method and gathering-line methods are investigated. When segmenting and recognizing a contour curve of a character shape, some global method is required. If we want to connect contour curves around a singular point like crossing points,

  8. NOISY DISPERSION CURVE PICKING (NDCP): a Matlab-friendly suite package for full control of dispersion curve picking

    Science.gov (United States)

    Granados, I.; Calo, M.; Ramos, V.

    2017-12-01

    We developed a Matlab suite package (NDCP, Noisy Dispersion Curve Picking) that allows full control over the parameters used to correctly identify group velocity dispersion curves in two types of datasets: correlograms between two stations or surface wave records from earthquakes. Using the frequency-time analysis (FTAN), the procedure for obtaining dispersion curves from records with a high noise level becomes difficult, and sometimes the picked curve can be misinterpreted. For correlogram functions, obtained by cross-correlation of noise records or earthquake coda, a non-homogeneous distribution of noise sources yields a non-symmetric Green's function (GF); to retrieve the complete information contained therein, NDCP allows the dispersion curve to be picked in the time domain in both the causal and non-causal parts of the GF. The picked dispersion curve is then displayed on the FTAN diagram in order to check whether it matches the maximum of the signal energy, avoiding confusion with overtones or noise spikes. To illustrate how NDCP performs, we show examples using: i) local correlogram functions obtained from sensors deployed in a volcanic caldera (Los Humeros, in Puebla, Mexico), ii) regional correlogram functions between two stations of the National Seismological Service (SSN, Servicio Sismológico Nacional in Spanish), and iii) a surface wave seismic record for an earthquake located on the Pacific Ocean coast of Mexico and recorded by the SSN. This work is supported by the GEMEX project (Geothermal Europe-Mexico consortium).

  9. FIT3D toolbox: multiple view geometry and 3D reconstruction for Matlab

    NARCIS (Netherlands)

    Esteban, I.; Dijk, J.; Groen, F.

    2010-01-01

    FIT3D is a Toolbox built for Matlab that aims at unifying and distributing a set of tools that will allow the researcher to obtain a complete 3D model from a set of calibrated images. In this paper we motivate and present the structure of the toolbox in a tutorial and example based approach. Given

  10. Testing the validity of stock-recruitment curve fits

    International Nuclear Information System (INIS)

    Christensen, S.W.; Goodyear, C.P.

    1988-01-01

    The utilities relied heavily on the Ricker stock-recruitment model as the basis for quantifying biological compensation in the Hudson River power case. They presented many fits of the Ricker model to data derived from striped bass catch and effort records compiled by the National Marine Fisheries Service. Based on this curve-fitting exercise, a value of 4 was chosen for the parameter alpha in the Ricker model, and this value was used to derive the utilities' estimates of the long-term impact of power plants on striped bass populations. A technique was developed and applied to address a single fundamental question: if the Ricker model were applicable to the Hudson River striped bass population, could the estimates of alpha from the curve-fitting exercise be considered reliable? The technique involved constructing a simulation model that incorporated the essential biological features of the population and simulated the characteristics of the available actual catch-per-unit-effort data through time. The ability or failure to retrieve the known parameter values underlying the simulation model via the curve-fitting exercise was a direct test of the reliability of the results of fitting stock-recruitment curves to the real data. The results demonstrated that estimates of alpha from the curve-fitting exercise were not reliable. The simulation-modeling technique provides an effective way to identify whether or not particular data are appropriate for use in fitting such models. 39 refs., 2 figs., 3 tabs.
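    For reference, the conventional curve-fitting step tested in this study can be sketched in MATLAB as a linear regression on the log-transformed Ricker model R = alpha*S*exp(-beta*S); S and R are hypothetical stock and recruitment indices.

        c        = polyfit(S(:), log(R(:) ./ S(:)), 1);   % log(R/S) = log(alpha) - beta*S
        alphaHat = exp(c(2));                             % estimated Ricker alpha
        betaHat  = -c(1);                                 % estimated Ricker beta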

  11. Curve fitting methods for solar radiation data modeling

    Energy Technology Data Exchange (ETDEWEB)

    Karim, Samsul Ariffin Abdul, E-mail: samsul-ariffin@petronas.com.my; Singh, Balbir Singh Mahinder, E-mail: balbir@petronas.com.my [Department of Fundamental and Applied Sciences, Faculty of Sciences and Information Technology, Universiti Teknologi PETRONAS, Bandar Seri Iskandar, 31750 Tronoh, Perak Darul Ridzuan (Malaysia)]

    2014-10-24

    This paper studies the use of several types of curve fitting methods to smooth global solar radiation data. After the data have been fitted, a mathematical model of global solar radiation is developed. The error of each fit was measured using goodness-of-fit statistics such as the root mean square error (RMSE) and the R² value. The best fitting method is then used as a starting point for the construction of a mathematical model of the solar radiation received at Universiti Teknologi PETRONAS (UTP), Malaysia. Numerical results indicate that Gaussian fitting and sine fitting (both with two terms) give better results compared with the other fitting methods.
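    A hedged sketch of how such a comparison could be run with the Curve Fitting Toolbox, using its built-in two-term Gaussian ('gauss2') and two-term sine ('sin2') library models; tHour and radiation are hypothetical measurement vectors.

        [fG, gofG] = fit(tHour(:), radiation(:), 'gauss2');   % sum of two Gaussian terms
        [fS, gofS] = fit(tHour(:), radiation(:), 'sin2');     % sum of two sine terms
        fprintf('gauss2: RMSE = %.3f, R^2 = %.3f\n', gofG.rmse, gofG.rsquare);
        fprintf('sin2:   RMSE = %.3f, R^2 = %.3f\n', gofS.rmse, gofS.rsquare);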

  12. Curve fitting methods for solar radiation data modeling

    Science.gov (United States)

    Karim, Samsul Ariffin Abdul; Singh, Balbir Singh Mahinder

    2014-10-01

    This paper studies the use of several types of curve fitting methods to smooth global solar radiation data. After the data have been fitted, a mathematical model of global solar radiation is developed. The error of each fit was measured using goodness-of-fit statistics such as the root mean square error (RMSE) and the R² value. The best fitting method is then used as a starting point for the construction of a mathematical model of the solar radiation received at Universiti Teknologi PETRONAS (UTP), Malaysia. Numerical results indicate that Gaussian fitting and sine fitting (both with two terms) give better results compared with the other fitting methods.

  13. Curve fitting methods for solar radiation data modeling

    International Nuclear Information System (INIS)

    Karim, Samsul Ariffin Abdul; Singh, Balbir Singh Mahinder

    2014-01-01

    This paper studies the use of several types of curve fitting methods to smooth global solar radiation data. After the data have been fitted, a mathematical model of global solar radiation is developed. The error of each fit was measured using goodness-of-fit statistics such as the root mean square error (RMSE) and the R² value. The best fitting method is then used as a starting point for the construction of a mathematical model of the solar radiation received at Universiti Teknologi PETRONAS (UTP), Malaysia. Numerical results indicate that Gaussian fitting and sine fitting (both with two terms) give better results compared with the other fitting methods.

  14. From Curve Fitting to Machine Learning

    CERN Document Server

    Zielesny, Achim

    2011-01-01

    The analysis of experimental data is at the heart of science from its beginnings. But it was the advent of digital computers that allowed the execution of highly non-linear and increasingly complex data analysis procedures - methods that were completely unfeasible before. Non-linear curve fitting, clustering and machine learning belong to these modern techniques, which are a further step towards computational intelligence. The goal of this book is to provide an interactive and illustrative guide to these topics. It concentrates on the road from two dimensional curve fitting to multidimensional clus

  15. Quantifying and Reducing Curve-Fitting Uncertainty in Isc

    Energy Technology Data Exchange (ETDEWEB)

    Campanelli, Mark; Duck, Benjamin; Emery, Keith

    2015-06-14

    Current-voltage (I-V) curve measurements of photovoltaic (PV) devices are used to determine performance parameters and to establish traceable calibration chains. Measurement standards specify localized curve fitting methods, e.g., straight-line interpolation/extrapolation of the I-V curve points near short-circuit current, Isc. By considering such fits as statistical linear regressions, uncertainties in the performance parameters are readily quantified. However, the legitimacy of such a computed uncertainty requires that the model be a valid (local) representation of the I-V curve and that the noise be sufficiently well characterized. Using more data points often has the advantage of lowering the uncertainty. However, more data points can make the uncertainty in the fit arbitrarily small, and this fit uncertainty misses the dominant residual uncertainty due to so-called model discrepancy. Using objective Bayesian linear regression for straight-line fits for Isc, we investigate an evidence-based method to automatically choose data windows of I-V points with reduced model discrepancy. We also investigate noise effects. Uncertainties, aligned with the Guide to the Expression of Uncertainty in Measurement (GUM), are quantified throughout.
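    The local straight-line fit near short circuit that the standards specify can be sketched in MATLAB as an ordinary (non-Bayesian) regression; the fixed 5% voltage window below stands in for the evidence-based window selection investigated in the paper, and V and I are hypothetical measured vectors.

        V   = V(:);  I = I(:);                  % ensure column vectors
        win = abs(V) < 0.05 * max(V);           % crude fixed data window near V = 0
        X   = [ones(nnz(win),1), V(win)];       % design matrix for I = b0 + b1*V
        b   = X \ I(win);                       % least-squares straight-line coefficients
        res = I(win) - X*b;
        s2  = (res'*res) / (numel(res) - 2);    % residual variance
        Cb  = s2 * inv(X'*X);                   % coefficient covariance matrix
        Isc = b(1);  uIsc = sqrt(Cb(1,1));      % intercept at V = 0 and its standard uncertainty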

  16. Real-Time Exponential Curve Fits Using Discrete Calculus

    Science.gov (United States)

    Rowe, Geoffrey

    2010-01-01

    An improved solution for curve fitting data to an exponential equation (y = A·e^(Bt) + C) has been developed. This improvement is in four areas -- speed, stability, determinant processing time, and the removal of limits. The solution presented avoids iterative techniques and their stability errors by using three mathematical ideas: discrete calculus, a special relationship (between exponential curves and the Mean Value Theorem for Derivatives), and a simple linear curve fit algorithm. This method can also be applied to fitting data to the general power law equation y = A·x^B + C and the general geometric growth equation y = A·k^(Bt) + C.
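    The paper's exact discrete-calculus algorithm is not reproduced here; the sketch below shows one common non-iterative route for y = A·e^(Bt) + C with equally spaced samples: differencing removes C, the ratio of successive differences yields B, and A and C then follow from a linear fit. t and y are hypothetical column vectors.

        h  = t(2) - t(1);                                % constant sample spacing
        d  = diff(y);                                    % d(k) = A*exp(B*t(k))*(exp(B*h) - 1)
        B  = log(median(d(2:end) ./ d(1:end-1))) / h;    % median ratio for noise robustness
        X  = [exp(B*t), ones(size(t))];                  % linear model in A and C
        ac = X \ y;
        A  = ac(1);  C = ac(2);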

  17. Fitting model-based psychometric functions to simultaneity and temporal-order judgment data: MATLAB and R routines.

    Science.gov (United States)

    Alcalá-Quintana, Rocío; García-Pérez, Miguel A

    2013-12-01

    Research on temporal-order perception uses temporal-order judgment (TOJ) tasks or synchrony judgment (SJ) tasks in their binary SJ2 or ternary SJ3 variants. In all cases, two stimuli are presented with some temporal delay, and observers judge the order of presentation. Arbitrary psychometric functions are typically fitted to obtain performance measures such as sensitivity or the point of subjective simultaneity, but the parameters of these functions are uninterpretable. We describe routines in MATLAB and R that fit model-based functions whose parameters are interpretable in terms of the processes underlying temporal-order and simultaneity judgments and responses. These functions arise from an independent-channels model assuming arrival latencies with exponential distributions and a trichotomous decision space. Different routines fit data separately for SJ2, SJ3, and TOJ tasks, jointly for any two tasks, or also jointly for the three tasks (for common cases in which two or even the three tasks were used with the same stimuli and participants). Additional routines provide bootstrap p-values and confidence intervals for estimated parameters. A further routine is included that obtains performance measures from the fitted functions. An R package for Windows and source code of the MATLAB and R routines are available as Supplementary Files.

  18. Quantifying and Reducing Curve-Fitting Uncertainty in Isc: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Campanelli, Mark; Duck, Benjamin; Emery, Keith

    2015-09-28

    Current-voltage (I-V) curve measurements of photovoltaic (PV) devices are used to determine performance parameters and to establish traceable calibration chains. Measurement standards specify localized curve fitting methods, e.g., straight-line interpolation/extrapolation of the I-V curve points near short-circuit current, Isc. By considering such fits as statistical linear regressions, uncertainties in the performance parameters are readily quantified. However, the legitimacy of such a computed uncertainty requires that the model be a valid (local) representation of the I-V curve and that the noise be sufficiently well characterized. Using more data points often has the advantage of lowering the uncertainty. However, more data points can make the uncertainty in the fit arbitrarily small, and this fit uncertainty misses the dominant residual uncertainty due to so-called model discrepancy. Using objective Bayesian linear regression for straight-line fits for Isc, we investigate an evidence-based method to automatically choose data windows of I-V points with reduced model discrepancy. We also investigate noise effects. Uncertainties, aligned with the Guide to the Expression of Uncertainty in Measurement (GUM), are quantified throughout.

  19. Comparison of ductile-to-brittle transition curve fitting approaches

    International Nuclear Information System (INIS)

    Cao, L.W.; Wu, S.J.; Flewitt, P.E.J.

    2012-01-01

    Ductile-to-brittle transition (DBT) curve fitting approaches are compared over the transition temperature range for reactor pressure vessel steels with different kinds of data, including Charpy-V notch impact energy data and fracture toughness data. Three DBT curve fitting methods have been frequently used in the past: the Burr, S-Weibull and tanh distributions. In general there is greater scatter associated with test data obtained within the transition region, so these methods give results of different accuracies, especially when fitting small quantities of data. The comparison shows that the Burr and tanh distributions can fit well-distributed and large data sets extending across the test temperature range, including the upper and lower shelves, almost equally well. The S-Weibull distribution fit is poor for the lower shelf of the DBT curve. Overall, for both large and small quantities of measured data, the Burr distribution provides the best description. - Highlights: ► The Burr distribution offers a better fit than the S-Weibull and tanh fits. ► The Burr and tanh methods show similar fitting ability for a large data set. ► The Burr method can fit sparse data well distributed across the test temperature range. ► The S-Weibull method cannot fit the lower shelf well and shows poor fitting quality.
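    A minimal MATLAB sketch of the tanh fit named above, using the common transition-curve form E(T) = A + B·tanh((T - T0)/C) and base MATLAB's fminsearch; T and E are hypothetical test temperatures and impact energies, and the starting values are illustrative.

        model = @(p, T) p(1) + p(2) * tanh((T - p(3)) / p(4));   % p = [A, B, T0, C]
        sse   = @(p) sum((E - model(p, T)).^2);
        p0    = [mean(E), (max(E) - min(E))/2, median(T), 30];   % rough starting values
        pFit  = fminsearch(sse, p0);
        lowerShelf = pFit(1) - pFit(2);   upperShelf = pFit(1) + pFit(2);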

  20. Estimating reaction rate constants: comparison between traditional curve fitting and curve resolution

    NARCIS (Netherlands)

    Bijlsma, S.; Boelens, H. F. M.; Hoefsloot, H. C. J.; Smilde, A. K.

    2000-01-01

    A traditional curve fitting (TCF) algorithm is compared with a classical curve resolution (CCR) approach for estimating reaction rate constants from spectral data obtained over time during a chemical reaction. In the TCF algorithm, reaction rate constants are estimated from the absorbance versus time data

  1. MATLAB as an incentive for student learning of skills

    Science.gov (United States)

    Bank, C. G.; Ghent, R. R.

    2016-12-01

    Our course "Computational Geology" takes a holistic approach to student learning by using MATLAB as a focal point to increase students' computing, quantitative reasoning, data analysis, report writing, and teamwork skills. The course, taught since 2007 with recent enrollments around 35 and aimed at 2nd- to 3rd-year students, is required for the Geology and Earth and Environmental Systems major programs, and can be chosen as an elective in our other programs, including Geophysics. The course is divided into five projects: Pacific plate velocity from the Hawaiian hotspot track, predicting CO2 concentration in the atmosphere, the volume of Earth's oceans and sea-level rise, comparing wind directions for Vancouver and Squamish, and groundwater flow. Each project is based on real data, focuses on a mathematical concept (linear interpolation, gradients, descriptive statistics, differential equations) and highlights a programming task (arrays, functions, text file input/output, curve fitting). Working in teams of three, students need to develop a conceptual model to explain the data, and write MATLAB code to visualize the data and match it to their conceptual model. The programming is guided, and students work individually on different aspects (for example: reading the data, fitting a function, unit conversion) which they need to put together to solve the problem. They then synthesize their thought process in a paper. Anecdotal evidence shows that students continue using MATLAB in other courses.

  2. Fitting the curve in Excel®: Systematic curve fitting of laboratory and remotely sensed planetary spectra

    NARCIS (Netherlands)

    McCraig, M.A.; Osinski, G.R.; Cloutis, E.A.; Flemming, R.L.; Izawa, M.R.M.; Reddy, V.; Fieber-Beyer, S.K.; Pompilio, L.; van der Meer, F.D.; Berger, J.A.; Bramble, M.S.; Applin, D.M.

    2017-01-01

    Spectroscopy in planetary science often provides the only information regarding the compositional and mineralogical make up of planetary surfaces. The methods employed when curve fitting and modelling spectra can be confusing and difficult to visualize and comprehend. Researchers who are new to

  3. Analysis of Surface Plasmon Resonance Curves with a Novel Sigmoid-Asymmetric Fitting Algorithm

    Directory of Open Access Journals (Sweden)

    Daeho Jang

    2015-09-01

    The present study introduces a novel curve-fitting algorithm for surface plasmon resonance (SPR) curves using a self-constructed, wedge-shaped beam type angular interrogation SPR spectroscopy technique. Previous fitting approaches such as asymmetric and polynomial equations are still unsatisfactory for analyzing full SPR curves and their use is limited to determining the resonance angle. In the present study, we developed a sigmoid-asymmetric equation that provides excellent curve-fitting for the whole SPR curve over a range of incident angles, including the regions of the critical angle and resonance angle. Regardless of the bulk fluid type (i.e., water and air), the present sigmoid-asymmetric fitting exhibited nearly perfect matching with the full SPR curve, whereas the asymmetric and polynomial curve fitting methods did not. Because the present sigmoid-asymmetric curve-fitting equation can determine the critical angle as well as the resonance angle, the undesired effect caused by the bulk fluid refractive index was excluded by subtracting the critical angle from the resonance angle in real time. In conclusion, the proposed sigmoid-asymmetric curve-fitting algorithm for SPR curves is widely applicable to various SPR measurements, while excluding the effect of bulk fluids on the sensing layer.

  4. Curve fitting for RHB Islamic Bank annual net profit

    Science.gov (United States)

    Nadarajan, Dineswary; Noor, Noor Fadiya Mohd

    2015-05-01

    The RHB Islamic Bank net profit data are obtained for 2004 to 2012. Curve fitting is done by assuming the data are either exact or experimental (subject to smoothing). Higher-order Lagrange polynomial and cubic spline curve-fitting procedures are constructed using Maple software. A normality test is performed to check the adequacy of the data. Regression analysis with curve estimation is conducted in the SPSS environment. All eleven models are found to be acceptable at the 10% significance level of ANOVA. The residual error and the absolute relative true error are calculated and compared. The optimal model based on the minimum average error is proposed.

  5. Prediction of Pressing Quality for Press-Fit Assembly Based on Press-Fit Curve and Maximum Press-Mounting Force

    Directory of Open Access Journals (Sweden)

    Bo You

    2015-01-01

    In order to predict the pressing quality of precision press-fit assembly, press-fit curves and the maximum press-mounting force of press-fit assemblies were investigated by finite element analysis (FEA). The analysis was based on a 3D SolidWorks model using the real dimensions of the microparts and the subsequent FEA model built using ANSYS Workbench. The press-fit process could thus be simulated on the basis of static structural analysis. To verify the FEA results, experiments were carried out using a press-mounting apparatus. The results show that the press-fit curves obtained by FEA agree closely with the curves obtained using the experimental method. In addition, the maximum press-mounting force calculated by FEA agrees with that obtained by the experimental method, with the maximum deviation being 4.6%, a value that can be tolerated. The comparison shows that the press-fit curve and maximum press-mounting force calculated by FEA can be used for predicting the pressing quality during precision press-fit assembly.

  6. Weighted curve-fitting program for the HP 67/97 calculator

    International Nuclear Information System (INIS)

    Stockli, M.P.

    1983-01-01

    The HP 67/97 calculator provides in its standard equipment a curve-fit program for linear, logarithmic, exponential and power functions that is quite useful and popular. However, in more sophisticated applications, proper weights for data are often essential. For this purpose a program package was created which is very similar to the standard curve-fit program but which includes the weights of the data for proper statistical analysis. This allows accurate calculation of the uncertainties of the fitted curve parameters as well as the uncertainties of interpolations or extrapolations, or optionally the uncertainties can be normalized with chi-square. The program is very versatile and allows one to perform quite difficult data analysis in a convenient way with the pocket calculator HP 67/97

  7. Hot Spots Detection of Operating PV Arrays through IR Thermal Image Using Method Based on Curve Fitting of Gray Histogram

    Directory of Open Access Journals (Sweden)

    Jiang Lin

    2016-01-01

    The overall efficiency of PV arrays is affected by hot spots, which should be detected and diagnosed by applying suitable monitoring techniques. The method of using IR thermal images to detect hot spots has been studied as a direct, noncontact, nondestructive technique. However, IR thermal images suffer from relatively high stochastic noise and non-uniformity clutter, so conventional image processing methods are not effective. This paper proposes a method to detect hot spots based on curve fitting of the gray histogram. The results of a MATLAB simulation show that the proposed method effectively detects hot spots while suppressing the noise generated during image acquisition.

  8. GRace: a MATLAB-based application for fitting the discrimination-association model.

    Science.gov (United States)

    Stefanutti, Luca; Vianello, Michelangelo; Anselmi, Pasquale; Robusto, Egidio

    2014-10-28

    The Implicit Association Test (IAT) is a computerized two-choice discrimination task in which stimuli have to be categorized as belonging to target categories or attribute categories by pressing, as quickly and accurately as possible, one of two response keys. The discrimination association model has been recently proposed for the analysis of reaction time and accuracy of an individual respondent to the IAT. The model disentangles the influences of three qualitatively different components on the responses to the IAT: stimuli discrimination, automatic association, and termination criterion. The article presents General Race (GRace), a MATLAB-based application for fitting the discrimination association model to IAT data. GRace has been developed for Windows as a standalone application. It is user-friendly and does not require any programming experience. The use of GRace is illustrated on the data of a Coca Cola-Pepsi Cola IAT, and the results of the analysis are interpreted and discussed.

  9. Sensitivity of Fit Indices to Misspecification in Growth Curve Models

    Science.gov (United States)

    Wu, Wei; West, Stephen G.

    2010-01-01

    This study investigated the sensitivity of fit indices to model misspecification in within-individual covariance structure, between-individual covariance structure, and marginal mean structure in growth curve models. Five commonly used fit indices were examined, including the likelihood ratio test statistic, root mean square error of…

  10. Potential errors when fitting experience curves by means of spreadsheet software

    International Nuclear Information System (INIS)

    Sark, W.G.J.H.M. van; Alsema, E.A.

    2010-01-01

    Progress ratios (PRs) are widely used in forecasting development of many technologies; they are derived from historical data represented in experience curves. Fitting the double logarithmic graphs is easily done with spreadsheet software like Microsoft Excel, by adding a trend line to the graph. However, it is unknown to many that these data are transformed to linear data before a fit is performed. This leads to erroneous results or a transformation bias in the PR, as we demonstrate using the experience curve for photovoltaic technology: logarithmic transformation leads to overestimates of progress ratios and underestimates of goodness of fit. Therefore, other graphing and analysis software is recommended.
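    The transformation bias the abstract describes can be reproduced in MATLAB by comparing a fit in log-log space with a direct nonlinear fit of the power-law experience curve C = C0·x^b; x (cumulative production) and C (unit cost) are hypothetical vectors, and the comparison of the two progress ratios, not the numbers, is the point.

        pLog  = polyfit(log10(x), log10(C), 1);           % linear fit in log-log space (spreadsheet approach)
        bLog  = pLog(1);                                  % learning exponent from the log fit
        model = @(q, x) q(1) * x.^q(2);                   % direct nonlinear power-law fit
        q0    = [C(1) / x(1)^bLog, bLog];
        qFit  = fminsearch(@(q) sum((C - model(q, x)).^2), q0);
        PRlog    = 2^bLog;                                % progress ratio from the log-transformed fit
        PRdirect = 2^qFit(2);                             % progress ratio from the direct fit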

  11. Multimodal determination of Rayleigh dispersion and attenuation curves using the circle fit method

    Science.gov (United States)

    Verachtert, R.; Lombaert, G.; Degrande, G.

    2018-03-01

    This paper introduces the circle fit method for the determination of multi-modal Rayleigh dispersion and attenuation curves as part of a Multichannel Analysis of Surface Waves (MASW) experiment. The wave field is transformed to the frequency-wavenumber (fk) domain using a discretized Hankel transform. In a Nyquist plot of the fk-spectrum, displaying the imaginary part against the real part, the Rayleigh wave modes correspond to circles. The experimental Rayleigh dispersion and attenuation curves are derived from the angular sweep of the central angle of these circles. The method can also be applied to the analytical fk-spectrum of the Green's function of a layered half-space in order to compute dispersion and attenuation curves, as an alternative to solving an eigenvalue problem. A MASW experiment is subsequently simulated for a site with a regular velocity profile and a site with a soft layer trapped between two stiffer layers. The performance of the circle fit method to determine the dispersion and attenuation curves is compared with the peak picking method and the half-power bandwidth method. The circle fit method is found to be the most accurate and robust method for the determination of the dispersion curves. When determining attenuation curves, the circle fit method and half-power bandwidth method are accurate if the mode exhibits a sharp peak in the fk-spectrum. Furthermore, simulated and theoretical attenuation curves determined with the circle fit method agree very well. A similar correspondence is not obtained when using the half-power bandwidth method. Finally, the circle fit method is applied to measurement data obtained for a MASW experiment at a site in Heverlee, Belgium. In order to validate the soil profile obtained from the inversion procedure, force-velocity transfer functions were computed and found in good correspondence with the experimental transfer functions, especially in the frequency range between 5 and 80 Hz.
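    The core geometric step can be sketched in MATLAB with an algebraic (Kasa-type) circle fit to the Nyquist-plot points of one mode, from which the central-angle sweep is obtained; re and im are hypothetical real and imaginary parts of the fk-spectrum around a modal peak, and this is not the authors' implementation.

        A = [2*re(:), 2*im(:), ones(numel(re),1)];   % circle: x^2 + y^2 = 2*xc*x + 2*yc*y + (r^2 - xc^2 - yc^2)
        b = re(:).^2 + im(:).^2;
        s = A \ b;                                   % least-squares circle parameters
        xc = s(1);  yc = s(2);  r = sqrt(s(3) + xc^2 + yc^2);
        theta = atan2(im(:) - yc, re(:) - xc);       % central angles swept by the data points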

  12. Background does not significantly affect power-exponential fitting of gastric emptying curves

    International Nuclear Information System (INIS)

    Jonderko, K.

    1987-01-01

    Using a procedure enabling the assessment of background radiation, research was done to elucidate the course of changes in background activity during gastric emptying measurements. Attention was focused on the changes in the shape of power-exponential fitted gastric emptying curves after correction for background was performed. The observed pattern of background counts made it possible to explain the shifts in the parameters characterizing the power-exponential curves that are connected with background correction. It was concluded that background has a negligible effect on the power-exponential fitting of gastric emptying curves. (author)
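    For context, a power-exponential retention model of the form R(t) = 1 - (1 - exp(-k·t))^beta is commonly used for gastric emptying curves and can be fitted in MATLAB as sketched below; t and R are hypothetical time and fractional-retention vectors, and this standard model form is an assumption rather than necessarily the author's exact parameterization.

        pe   = @(p, t) 1 - (1 - exp(-p(1)*t)).^p(2);           % p = [k, beta]
        p0   = [0.02, 1.5];                                    % illustrative starting values
        pFit = fminsearch(@(p) sum((R - pe(p, t)).^2), p0);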

  13. A graph-based method for fitting planar B-spline curves with intersections

    Directory of Open Access Journals (Sweden)

    Pengbo Bo

    2016-01-01

    The problem of fitting B-spline curves to planar point clouds is studied in this paper. A novel method is proposed to deal with the most challenging case, where multiple intersecting curves or curves with self-intersection are necessary for shape representation. A method based on Delaunay triangulation of the data points is developed to identify connected components; it is also capable of removing outliers. A skeleton representation is utilized to represent the topological structure, which is further used to create a weighted graph for deciding the merging of curve segments. Different from existing approaches, which utilize local shape information near intersections, our method considers the shape characteristics of curve segments in a larger scope and is thus capable of giving more satisfactory results. By fitting each group of data points with a B-spline curve, we solve the problem of curve structure reconstruction from point clouds, as well as the vectorization of simple line-drawing images by reconstructing the drawn lines.

  14. Repair models of cell survival and corresponding computer program for survival curve fitting

    International Nuclear Information System (INIS)

    Shen Xun; Hu Yiwei

    1992-01-01

    Some basic concepts and formulations of two repair models of survival, the incomplete repair (IR) model and the lethal-potentially lethal (LPL) model, are introduced. An IBM-PC computer program for survival curve fitting with these models was developed and applied to fit the survival of human melanoma cells HX118 irradiated at different dose rates. A comparison was made between the repair models and two non-repair models, the multitarget-single hit model and the linear-quadratic model, in the fitting and analysis of the survival-dose curves. It was shown that either the IR model or the LPL model can fit a set of survival curves obtained at different dose rates with the same parameters and provide information on the repair capacity of cells. These two mathematical models could be very useful in quantitative studies of the radiosensitivity and repair capacity of cells.
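    Of the models named above, the linear-quadratic model S(D) = exp(-(alpha·D + beta·D²)) is the simplest to sketch in MATLAB, since it is linear in alpha and beta after taking logarithms; D (dose) and S (surviving fraction) are hypothetical vectors.

        X     = [D(:), D(:).^2];
        ab    = X \ (-log(S(:)));                   % least-squares estimates [alpha; beta]
        alpha = ab(1);  beta = ab(2);
        Sfit  = exp(-(alpha*D(:) + beta*D(:).^2));  % fitted survival curve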

  15. A non-iterative method for fitting decay curves with background

    International Nuclear Information System (INIS)

    Mukoyama, T.

    1982-01-01

    A non-iterative method for fitting a decay curve with background is presented. The sum of an exponential function and a constant term is linearized by the use of the difference equation and parameters are determined by the standard linear least-squares fitting. The validity of the present method has been tested against pseudo-experimental data. (orig.)

  16. Application of tanh curve fitting to toughness data

    International Nuclear Information System (INIS)

    Sakai, Yuzuru; Ogura, Nobukazu

    1985-01-01

    Curve-fitting regression procedures for toughness data have been examined. The objectives of curve fitting in the context of the study of nuclear pressure vessel steels are (1) convenient summarization of test data to permit comparison of materials and testing methods; (2) development of a statistical base concerning the data; (3) surveying the relationships between Charpy data and fracture toughness data; (4) estimation of the fracture toughness level from Charpy absorbed energy data. The computational procedures using the tanh function have been applied to the toughness data (Charpy absorbed energy, static fracture toughness, dynamic fracture toughness, crack arrest toughness) of A533B cl.1 and A508 cl.3 steels. The results of the analysis show the statistical features of the material toughness and give a method for estimating the fracture toughness level from Charpy absorbed energy data. (author)

  17. The thermoluminescence glow-curve analysis using GlowFit - the new powerful tool for deconvolution

    International Nuclear Information System (INIS)

    Puchalska, M.; Bilski, P.

    2005-10-01

    A new computer program, GlowFit, for deconvoluting first-order kinetics thermoluminescence (TL) glow-curves has been developed. A non-linear function describing a single glow-peak is fitted to experimental points using the least squares Levenberg-Marquardt method. The main advantage of GlowFit is in its ability to resolve complex TL glow-curves consisting of strongly overlapping peaks, such as those observed in heavily doped LiF:Mg,Ti (MTT) detectors. This resolution is achieved mainly by setting constraints or by fixing selected parameters. The initial values of the fitted parameters are placed in the so-called pattern files. GlowFit is a Microsoft Windows-operated user-friendly program. Its graphic interface enables easy intuitive manipulation of glow-peaks, at the initial stage (parameter initialization) and at the final stage (manual adjustment) of fitting peak parameters to the glow-curves. The program is freely downloadable from the web site www.ifj.edu.pl/NPP/deconvolution.htm (author)

  18. Application of numerical methods in spectroscopy : fitting of the curve of thermoluminescence

    International Nuclear Information System (INIS)

    RANDRIAMANALINA, S.

    1999-01-01

    The method of non-linear least squares is one of the mathematical tools widely employed in spectroscopy; it is used for the determination of the parameters of a model. On the other hand, the spline function is among the fitting functions that introduce the smallest error; it is used for the calculation of the area under the curve. We present an application of these methods, with the details of the corresponding algorithms, to the fitting of the thermoluminescence curve.

  19. PLOTnFIT: A BASIC program for data plotting and curve fitting

    Energy Technology Data Exchange (ETDEWEB)

    Schiffgens, J O

    1989-10-01

    PLOTnFIT is a BASIC program to be used with an IBM or IBM-compatible personal computer (PC) for plotting and fitting curves to measured or observed data for both extrapolation and interpolation. It uses the least squares method to calculate the coefficients of nth-degree polynomials (e.g., up to 10th degree) of basis functions so that each polynomial fits the data in a least squares sense, then plots the data and the polynomial that a user decides best represents them. PLOTnFIT is very versatile. It can be used to generate linear, semilog, and log-log graphs and can automatically scale the coordinate axes to suit the data. It can plot more than one data set on a graph (e.g., up to 8 data sets) and more data points than a user is likely to put on one graph (e.g., up to 225 points). A PC diskette containing (1) READIST.PNF (a summary of this NUREG), (2) INI06891.SIS and FOL06891.SIS (two data files), and (3) PLOTNFIT.4TH (the latest version of the program) may be obtained from the National Energy Software Center, Argonne National Laboratory, 9700 South Cass Avenue, Argonne, IL 60439. (author)

  20. Fitness function and nonunique solutions in x-ray reflectivity curve fitting: cross-error between surface roughness and mass density

    International Nuclear Information System (INIS)

    Tiilikainen, J; Bosund, V; Mattila, M; Hakkarainen, T; Sormunen, J; Lipsanen, H

    2007-01-01

    Nonunique solutions of the x-ray reflectivity (XRR) curve fitting problem were studied by modelling layer structures with neural networks and designing a fitness function to handle the nonidealities of measurements. Modelled atomic-layer-deposited aluminium oxide film structures were used in the simulations to calculate XRR curves based on Parratt's formalism. This approach reduced the dimensionality of the parameter space and allowed the use of fitness landscapes in the study of nonunique solutions. Fitness landscapes, where the height in a map represents the fitness value as a function of the process parameters, revealed tracks where the local fitness optima lie. The tracks were projected onto the physical parameter space, thus allowing the construction of a cross-error equation between weakly determined parameters, i.e. between the mass density and the surface roughness of a layer. The equation gives the minimum error for the other parameters, which is a consequence of the nonuniqueness of the solution if noise is present. Furthermore, the existence of a possible unique solution in a certain parameter range was found to depend on the layer thickness and the signal-to-noise ratio.

  1. Multi-binding site model-based curve-fitting program for the computation of RIA data

    International Nuclear Information System (INIS)

    Malan, P.G.; Ekins, R.P.; Cox, M.G.; Long, E.M.R.

    1977-01-01

    In this paper, a comparison is made between model-based and empirical curve-fitting procedures. The implementation of a multiple binding-site curve-fitting model, which successfully fits a wide range of assay data and can be run on a mini-computer, is described. The latter, more sophisticated model also provides estimates of the binding site concentrations and the values of the respective equilibrium constants present; the latter have been used for refining assay conditions using computer optimisation techniques. (orig./AJ)

  2. A sigmoidal fit for pressure-volume curves of idiopathic pulmonary fibrosis patients on mechanical ventilation: clinical implications

    Directory of Open Access Journals (Sweden)

    Juliana C. Ferreira

    2011-01-01

    OBJECTIVE: Respiratory pressure-volume curves fitted to exponential equations have been used to assess disease severity and prognosis in spontaneously breathing patients with idiopathic pulmonary fibrosis. Sigmoidal equations have been used to fit pressure-volume curves for mechanically ventilated patients but not for idiopathic pulmonary fibrosis patients. We compared a sigmoidal model and an exponential model for fitting pressure-volume curves from mechanically ventilated patients with idiopathic pulmonary fibrosis. METHODS: Six idiopathic pulmonary fibrosis patients and five controls underwent inflation pressure-volume curves using the constant-flow technique during general anesthesia prior to open lung biopsy or thymectomy. We identified the lower and upper inflection points and fit the curves with an exponential equation, V = A - B·e^(-k·P), and a sigmoid equation, V = a + b/(1 + e^(-(P - c)/d)). RESULTS: The mean lower inflection point for idiopathic pulmonary fibrosis patients was significantly higher (10.5 ± 5.7 cm H2O) than that of controls (3.6 ± 2.4 cm H2O). The sigmoidal equation fit the pressure-volume curves of the fibrotic and control patients well, but the exponential equation fit the data well only when points below 50% of the inspiratory capacity were excluded. CONCLUSION: The elevated lower inflection point and the sigmoidal shape of the pressure-volume curves suggest that respiratory system compliance is decreased close to end-expiratory lung volume in idiopathic pulmonary fibrosis patients under general anesthesia and mechanical ventilation. The sigmoidal fit was superior to the exponential fit for inflation pressure-volume curves of anesthetized patients with idiopathic pulmonary fibrosis and could be useful for guiding mechanical ventilation during general anesthesia in this condition.
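    The sigmoid model quoted in the abstract can be fitted with base MATLAB's fminsearch as sketched below; P and V are hypothetical inflation-limb pressure and volume data, and the starting values are illustrative.

        sig  = @(q, P) q(1) + q(2) ./ (1 + exp(-(P - q(3)) ./ q(4)));   % q = [a, b, c, d]
        q0   = [min(V), max(V) - min(V), median(P), 5];
        qFit = fminsearch(@(q) sum((V - sig(q, P)).^2), q0);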

  3. Decomposition and correction overlapping peaks of LIBS using an error compensation method combined with curve fitting.

    Science.gov (United States)

    Tan, Bing; Huang, Min; Zhu, Qibing; Guo, Ya; Qin, Jianwei

    2017-09-01

    The laser induced breakdown spectroscopy (LIBS) technique is an effective method to detect material composition by obtaining the plasma emission spectrum. Overlapping peaks in the spectrum are a fundamental problem in the qualitative and quantitative analysis of LIBS. Based on a curve fitting method, this paper studies an error compensation method to achieve the decomposition and correction of overlapping peaks. The vital step is that the fitting residual is fed back to the overlapping peaks and multiple curve fitting passes are performed to obtain a lower residual result. For the quantitative experiments on Cu, the Cu-Fe overlapping peaks in the range of 321-327 nm obtained from the LIBS spectra of five different concentrations of CuSO₄·5H₂O solution were decomposed and corrected using the curve fitting and error compensation methods. Compared with curve fitting alone, the error compensation reduced the fitting residual by about 18.12-32.64% and improved the correlation by about 0.86-1.82%. Then, the calibration curve between the intensity and concentration of Cu was established. The error compensation method exhibits a higher linear correlation between the intensity and concentration of Cu, and can be applied to the decomposition and correction of overlapping peaks in LIBS spectra.

  4. A Graphical User Interface to Generalized Linear Models in MATLAB

    Directory of Open Access Journals (Sweden)

    Peter Dunn

    1999-07-01

    Generalized linear models unite a wide variety of statistical models in a common theoretical framework. This paper discusses GLMLAB, software that enables such models to be fitted in the popular mathematical package MATLAB. It provides a graphical user interface to the powerful MATLAB computational engine to produce a program that is easy to use but with many features, including offsets, prior weights and user-defined distributions and link functions. MATLAB's graphical capacities are also utilized in providing a number of simple residual diagnostic plots.
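    For comparison, the same family of models GLMLAB targets can also be fitted with the Statistics Toolbox functions glmfit and glmval, as in the hedged sketch below (a Poisson regression with a log link); X and counts are hypothetical predictor and response data.

        [b, dev, stats] = glmfit(X, counts, 'poisson', 'link', 'log');   % fit the GLM
        yhat = glmval(b, X, 'log');                                      % fitted means
        disp([b, stats.se]);                                             % coefficients and standard errors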

  5. THE CPA QUALIFICATION METHOD BASED ON THE GAUSSIAN CURVE FITTING

    Directory of Open Access Journals (Sweden)

    M.T. Adithia

    2015-01-01

    The Correlation Power Analysis (CPA) attack is an attack on cryptographic devices, especially smart cards. The results of the attack are correlation traces. Based on the correlation traces, an evaluation is done to observe whether significant peaks appear in the traces or not. The evaluation is done manually, by experts. If significant peaks appear, then the smart card is not considered secure, since it is assumed that the secret key is revealed. We develop a method that objectively detects peaks and decides which peak is significant. We conclude that using the Gaussian curve fitting method, the subjective qualification of peak significance can be objectified. Thus, better decisions can be taken by security experts. We also conclude that the Gaussian curve fitting method is able to show the influence of peak sizes, especially the width and height, on the significance of a particular peak.

  6. Reference Curves for Field Tests of Musculoskeletal Fitness in U.S. Children and Adolescents: The 2012 NHANES National Youth Fitness Survey.

    Science.gov (United States)

    Laurson, Kelly R; Saint-Maurice, Pedro F; Welk, Gregory J; Eisenmann, Joey C

    2017-08-01

    Laurson, KR, Saint-Maurice, PF, Welk, GJ, and Eisenmann, JC. Reference curves for field tests of musculoskeletal fitness in U.S. children and adolescents: The 2012 NHANES National Youth Fitness Survey. J Strength Cond Res 31(8): 2075-2082, 2017-The purpose of the study was to describe current levels of musculoskeletal fitness (MSF) in U.S. youth by creating nationally representative age-specific and sex-specific growth curves for handgrip strength (including relative and allometrically scaled handgrip), modified pull-ups, and the plank test. Participants in the National Youth Fitness Survey (n = 1,453) were tested on MSF, aerobic capacity (via submaximal treadmill test), and body composition (body mass index [BMI], waist circumference, and skinfolds). Using LMS regression, age-specific and sex-specific smoothed percentile curves of MSF were created and existing percentiles were used to assign age-specific and sex-specific z-scores for aerobic capacity and body composition. Correlation matrices were created to assess the relationships between z-scores on MSF, aerobic capacity, and body composition. At younger ages (3-10 years), boys scored higher than girls for handgrip strength and modified pull-ups, but not for the plank. By ages 13-15, differences between the boys and girls curves were more pronounced, with boys scoring higher on all tests. Correlations between tests of MSF and aerobic capacity were positive and low-to-moderate in strength. Correlations between tests of MSF and body composition were negative, excluding absolute handgrip strength, which was inversely related to other MSF tests and aerobic capacity but positively associated with body composition. The growth curves herein can be used as normative reference values or a starting point for creating health-related criterion reference standards for these tests. Comparisons with prior national surveys of physical fitness indicate that some components of MSF have likely decreased in the United States over

  7. Study of a photovoltaic system with MPPT using Matlab

    Directory of Open Access Journals (Sweden)

    Dumitru Pop

    2012-12-01

    In this paper a photovoltaic (PV) system is analyzed using Matlab. First, a Matlab code is written in order to obtain the I-V and P-V curves at different values of solar irradiation and cell temperature. The results were compared with the experimental data of a commercial PV module, the USP 150. Then, the code was implemented in Simulink and, along with an MPPT algorithm and a DC-DC converter, the whole system was simulated.
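    A typical starting point for such a script is the single-diode model, solved implicitly for the current at each voltage; the sketch below uses base MATLAB's fzero with illustrative parameter values that are not those of the USP 150 module.

        Iph = 5.0; I0 = 1e-9; n = 1.3; Rs = 0.01; Rsh = 200;   % assumed single-diode parameters
        Vt  = 0.0257;                                          % thermal voltage at about 25 C, V
        V   = linspace(0, 0.7, 200);                           % voltage sweep
        I   = zeros(size(V));
        for k = 1:numel(V)
            f    = @(i) Iph - I0*(exp((V(k) + i*Rs)/(n*Vt)) - 1) - (V(k) + i*Rs)/Rsh - i;
            I(k) = fzero(f, Iph);                              % solve the implicit I-V equation
        end
        P = V .* I;                                            % power for the P-V curve
        plot(V, I); xlabel('Voltage (V)'); ylabel('Current (A)');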

  8. Genetic algorithm using independent component analysis in x-ray reflectivity curve fitting of periodic layer structures

    International Nuclear Information System (INIS)

    Tiilikainen, J; Bosund, V; Tilli, J-M; Sormunen, J; Mattila, M; Hakkarainen, T; Lipsanen, H

    2007-01-01

    A novel genetic algorithm (GA) utilizing independent component analysis (ICA) was developed for x-ray reflectivity (XRR) curve fitting. EFICA was used to reduce mutual information, or interparameter dependences, during the combinatorial phase. The performance of the new algorithm was studied by fitting trial XRR curves to target curves which were computed using realistic multilayer models. The median convergence properties of conventional GA, GA using principal component analysis and the novel GA were compared. GA using ICA was found to outperform the other methods with problems having 41 parameters or more to be fitted without additional XRR curve calculations. The computational complexity of the conventional methods was linear but the novel method had a quadratic computational complexity due to the applied ICA method which sets a practical limit for the dimensionality of the problem to be solved. However, the novel algorithm had the best capability to extend the fitting analysis based on Parratt's formalism to multiperiodic layer structures

  9. PLOTNFIT.4TH, Data Plotting and Curve Fitting by Polynomials

    International Nuclear Information System (INIS)

    Schiffgens, J.O.

    1990-01-01

    1 - Description of program or function: PLOTnFIT is used for plotting and analyzing data by fitting nth-degree polynomials of basis functions to the data interactively and printing graphs of the data and the polynomial functions. It can be used to generate linear, semi-log, and log-log graphs and can automatically scale the coordinate axes to suit the data. Multiple data sets may be plotted on a single graph. An auxiliary program, READ1ST, is included which produces an on-line summary of the information contained in the PLOTnFIT reference report. 2 - Method of solution: PLOTnFIT uses the least squares method to calculate the coefficients of nth-degree (up to 10th-degree) polynomials of 11 selected basis functions such that each polynomial fits the data in a least squares sense. The procedure incorporated in the code uses a linear combination of orthogonal polynomials to avoid ill-conditioning and to perform the curve fitting task with single-precision arithmetic. 3 - Restrictions on the complexity of the problem - Maxima of: 225 data points per job (or graph), including all data sets; 8 data sets (or tasks) per job (or graph).
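    In current MATLAB, the ill-conditioning that PLOTnFIT avoids with orthogonal polynomials can also be mitigated by centering and scaling the abscissa, using polyfit's third output; x and y are hypothetical data vectors and the 8th-degree choice is illustrative.

        [c, S, mu] = polyfit(x, y, 8);      % x is centered and scaled by mu to improve conditioning
        yfit = polyval(c, x, S, mu);        % evaluate the fit with the same scaling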

  10. Potential errors when fitting experience curves by means of spreadsheet software

    NARCIS (Netherlands)

    van Sark, W.G.J.H.M.; Alsema, E.A.

    2010-01-01

    Progress ratios (PRs) are widely used in forecasting development of many technologies; they are derived from historical data represented in experience curves. Fitting the double logarithmic graphs is easily done with spreadsheet software like Microsoft Excel, by adding a trend line to the graph.

  11. Methods for fitting of efficiency curves obtained by means of HPGe gamma rays spectrometers

    International Nuclear Information System (INIS)

    Cardoso, Vanderlei

    2002-01-01

    The present work describes a few methodologies developed for fitting efficiency curves obtained by means of an HPGe gamma-ray spectrometer. The interpolated values were determined by simple polynomial fitting and by polynomial fitting of the ratio between the experimental peak efficiency and the total efficiency calculated by the Monte Carlo technique, as a function of gamma-ray energy. Moreover, non-linear fitting has been performed using a segmented polynomial function and applying the Gauss-Marquardt method. To obtain the peak areas, different methodologies were developed for estimating the background area under each peak. This information was obtained by numerical integration or by using analytical functions associated with the background. One non-calibrated radioactive source was included in the efficiency curve in order to provide additional calibration points. As a by-product, it was possible to determine the activity of this non-calibrated source. For all fittings developed in the present work the covariance matrix methodology was used, which is an essential procedure in order to give a complete description of the partial uncertainties involved. (author)

  12. OXSA: An open-source magnetic resonance spectroscopy analysis toolbox in MATLAB.

    Directory of Open Access Journals (Sweden)

    Lucian A B Purvis

    In vivo magnetic resonance spectroscopy provides insight into metabolism in the human body. New acquisition protocols are often proposed to improve the quality or efficiency of data collection. Processing pipelines must also be developed to use these data optimally. Current fitting software is either targeted at general spectroscopy fitting, or at specific protocols. We therefore introduce the MATLAB-based OXford Spectroscopy Analysis (OXSA) toolbox to allow researchers to rapidly develop their own customised processing pipelines. The toolbox aims to simplify development by: being easy to install and use; seamlessly importing Siemens Digital Imaging and Communications in Medicine (DICOM) standard data; allowing visualisation of spectroscopy data; offering a robust fitting routine; flexibly specifying prior knowledge when fitting; and allowing batch processing of spectra. This article demonstrates how each of these criteria have been fulfilled, and gives technical details about the implementation in MATLAB. The code is freely available to download from https://github.com/oxsatoolbox/oxsa.

  13. OXSA: An open-source magnetic resonance spectroscopy analysis toolbox in MATLAB.

    Science.gov (United States)

    Purvis, Lucian A B; Clarke, William T; Biasiolli, Luca; Valkovič, Ladislav; Robson, Matthew D; Rodgers, Christopher T

    2017-01-01

    In vivo magnetic resonance spectroscopy provides insight into metabolism in the human body. New acquisition protocols are often proposed to improve the quality or efficiency of data collection. Processing pipelines must also be developed to use these data optimally. Current fitting software is either targeted at general spectroscopy fitting, or for specific protocols. We therefore introduce the MATLAB-based OXford Spectroscopy Analysis (OXSA) toolbox to allow researchers to rapidly develop their own customised processing pipelines. The toolbox aims to simplify development by: being easy to install and use; seamlessly importing Siemens Digital Imaging and Communications in Medicine (DICOM) standard data; allowing visualisation of spectroscopy data; offering a robust fitting routine; flexibly specifying prior knowledge when fitting; and allowing batch processing of spectra. This article demonstrates how each of these criteria have been fulfilled, and gives technical details about the implementation in MATLAB. The code is freely available to download from https://github.com/oxsatoolbox/oxsa.

  14. MATLAB - Introduction

    International Nuclear Information System (INIS)

    Jung, Heon Seul

    2005-08-01

    This book introduces MATLAB through the analysis and design of control systems: what control is, examples of control systems, the MATLAB working environment, MATLAB symbols, MATLAB commands, and drawing graphs. It also covers basic use of Simulink, mathematical models of physical systems such as mechanical-electrical analogous systems, system analysis in the time domain, frequency analysis, and state-space design topics such as canonical forms, the principle of duality, and state observer design.

  15. Matlab for dummies

    CERN Document Server

    Sizemore, Jim

    2014-01-01

    Plot graphs, solve equations, and write code in a flash! If you work in a STEM field, chances are you'll be using MATLAB on a daily basis. MATLAB is a popular and powerful computational tool, and this book provides everything you need to start manipulating and plotting your data. MATLAB has rapidly become the premier data tool, and MATLAB For Dummies is a comprehensive guide to the fundamentals. MATLAB For Dummies guides you through this complex computational language from installation to visualization to automation. Learn MATLAB's language fundamentals, including syntax, operators, and data types.

  16. Box-Cox transformation for resolving Peelle's Pertinent Puzzle in curve fitting

    International Nuclear Information System (INIS)

    Oh, Soo-Youl

    2003-01-01

    Incorporating the Box-Cox transformation into a least-squares method is presented as one resolution of an anomaly known as Peelle's Pertinent Puzzle. The transformation is a strategy for making non-normally distributed data resemble normal data. A procedure is proposed: transform the measured raw data with an optimized Box-Cox transformation parameter, fit the transformed data using a usual curve fitting method, then inverse-transform the fitted results to the final estimates. The generalized least-squares method utilized in GMA is adopted as the curve fitting tool for the test of the proposed procedure. In the procedure, covariance matrices are correspondingly transformed and inverse-transformed with the aid of the error propagation law. In addition to a sensible answer to the Peelle's problem itself, the procedure resulted in reasonable estimates of ⁶Li(n,t) cross sections in the energy region from several keV to 800 keV. Meanwhile, comparisons of the present procedure with that of Chiba and Smith show that both procedures yield estimates very close to each other for the sample evaluation on ⁶Li(n,t) above as well as for the Peelle's problem. The two procedures, however, are conceptually very different, and further discussions would be needed for a consensus on this issue of resolving the Puzzle. It is also pointed out that the transformation is applicable not only to a least-squares method but also to other parameter estimation methods such as a usual Bayesian approach formulated with an assumption of normality of the probability density function. (author)
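
    A minimal MATLAB sketch of the three-step procedure described above (transform, fit, inverse-transform) is given below. The data, the fixed transformation parameter lambda, and the quadratic fitting model are illustrative assumptions only; the actual procedure optimizes lambda and propagates covariances, which is not attempted here.

        % Box-Cox transform, curve fit in the transformed space, then inverse-transform.
        x      = (50:50:800)';                           % placeholder energy grid (keV)
        y      = 5*exp(-x/400) + 0.05*randn(size(x));    % placeholder positive "measurements"
        lambda = 0.5;                                    % assumed, fixed Box-Cox parameter
        bc     = @(y) (y.^lambda - 1)/lambda;            % Box-Cox transform (lambda ~= 0)
        ibc    = @(z) (lambda*z + 1).^(1/lambda);        % its inverse
        z      = bc(y);                                  % 1) transform the raw data
        p      = polyfit(x, z, 2);                       % 2) ordinary least-squares fit
        yhat   = ibc(polyval(p, x));                     % 3) inverse-transform the fitted results
        plot(x, y, 'o', x, yhat, '-');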

  17. An Empirical Fitting Method for Type Ia Supernova Light Curves: A Case Study of SN 2011fe

    Energy Technology Data Exchange (ETDEWEB)

    Zheng, WeiKang; Filippenko, Alexei V., E-mail: zwk@astro.berkeley.edu [Department of Astronomy, University of California, Berkeley, CA 94720-3411 (United States)

    2017-03-20

    We present a new empirical fitting method for the optical light curves of Type Ia supernovae (SNe Ia). We find that a variant broken-power-law function provides a good fit, under the simple assumption that the optical emission is approximately the blackbody emission of the expanding fireball. This function is mathematically analytic and is derived directly from the photospheric velocity evolution. When deriving the function, we assume that both the blackbody temperature and photospheric velocity are constant, but the final function is able to accommodate changes in these quantities during the fitting procedure. Applying it to the case study of SN 2011fe gives a surprisingly good fit that can describe the light curves from the first-light time to a few weeks after peak brightness, as well as over a large range of fluxes (∼5 mag, and even ∼7 mag in the g band). Since SNe Ia share similar light-curve shapes, this fitting method has the potential to fit most other SNe Ia and to characterize their properties in large statistical samples, both those already gathered and those expected as new facilities become available.
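
    As an illustration of fitting a smoothly broken power law to a light curve with base MATLAB, the sketch below minimizes a least-squares objective with fminsearch. The functional form, parameter names, and synthetic data are assumptions for demonstration only and are not the exact function of Zheng & Filippenko (2017).

        % Fit a smoothly broken power law f(t) = A*(t/tb)^a1 ./ (1 + (t/tb)^(s*(a1-a2)))^(1/s).
        t    = logspace(-1, 1.5, 40)';                    % placeholder times (days)
        f    = @(p, t) p(1)*(t/p(2)).^p(3) ./ (1 + (t/p(2)).^(p(5)*(p(3)-p(4)))).^(1/p(5));
        ft   = f([1.2, 4, 2.2, -1.8, 2], t) .* (1 + 0.05*randn(size(t)));   % synthetic flux
        p0   = [1, 3, 2, -2, 2];                          % [A, tb, a1, a2, s] initial guess
        sse  = @(p) sum(abs(ft - f(p, t)).^2);            % least-squares objective (abs guards complex values)
        phat = fminsearch(sse, p0, optimset('MaxFunEvals', 1e4, 'MaxIter', 1e4));
        loglog(t, ft, 'o', t, f(phat, t), '-');           % light curve and fitted model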

  18. Based on matlab 3d visualization programming in the application of the uranium exploration

    International Nuclear Information System (INIS)

    Qi Jianquan

    2012-01-01

    Geological theory, geophysical curves, and three-dimensional visualization programmed in Matlab were combined and applied to uranium exploration. With simple Matlab programming and its convenient numerical processing and graphical visualization features, the approach proved effective in identifying ore bodies, tracing ore, delineating the extent of ore bodies, and analysing the sedimentary environment. (author)

  19. Box-Cox transformation for resolving the Peelle's Pertinent Puzzle in a curve fitting

    International Nuclear Information System (INIS)

    Oh, S. Y.; Seo, C. G.

    2004-01-01

    Incorporating the Box-Cox transformation into a curve fitting is presented as one method for resolving an anomaly known as the Peelle's Pertinent Puzzle in the nuclear data community. The Box-Cox transformation is a strategy for making non-normally distributed data resemble normally distributed data. The proposed method consists of the following steps: transform the raw data to be fitted with the optimized Box-Cox transformation parameter, fit the transformed data using a conventional curve fitting tool (the least-squares method in this study), then inverse-transform the fitted results to the final estimates. Covariance matrices are correspondingly transformed and inverse-transformed with the aid of the law of error propagation. In addition to a sensible answer to the Puzzle, the proposed method resulted in reasonable estimates for a test evaluation with pseudo-experimental ⁶Li(n,t) cross sections in the energy region from several keV to 800 keV, while the GMA code resulted in the systematic underestimates that characterize the Puzzle. Meanwhile, it is observed that the present method and the Chiba-Smith method yield almost the same estimates for the test evaluation on ⁶Li(n,t). Conceptually, however, the two methods are very different from each other and further discussions are needed for a consensus on the issue of how to resolve the Puzzle. (authors)

  20. MATLAB syntaksen

    DEFF Research Database (Denmark)

    Skajaa, Anders; Jørgensen, Jakob Heide

    MATLAB is a mathematics program that focuses on the use of matrices and vectors, hence the name MATrix LABoratory. This book is a practical guide to understanding and using the MATLAB syntax, and it works as a quick shortcut for anyone about to start using MATLAB in connection with, for example, their...

  1. CABAS: A freely available PC program for fitting calibration curves in chromosome aberration dosimetry

    International Nuclear Information System (INIS)

    Deperas, J.; Szluiska, M.; Deperas-Kaminska, M.; Edwards, A.; Lloyd, D.; Lindholm, C.; Romm, H.; Roy, L.; Moss, R.; Morand, J.; Wojcik, A.

    2007-01-01

    The aim of biological dosimetry is to estimate the dose to which an accident victim was exposed, together with its associated uncertainty. This process requires the use of the maximum-likelihood method for fitting a calibration curve, a procedure that is not implemented in most statistical computer programs. Several laboratories have produced their own programs, but these are frequently not user-friendly and not available to outside users. We developed software for fitting a linear-quadratic dose-response relationship by the method of maximum likelihood and for estimating a dose from the number of aberrations observed. The program, called CABAS, consists of the main curve-fitting and dose-estimating module and modules for calculating the dose in cases of partial-body exposure, for estimating the minimum number of cells necessary to detect a given dose of radiation, and for calculating the dose in the case of a protracted exposure. (authors)
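
    For orientation only, the sketch below fits a linear-quadratic dose response Y(D) = c + alpha*D + beta*D^2 to aberration counts by maximizing a Poisson likelihood with base MATLAB's fminsearch. The doses, cell numbers, and counts are invented placeholders; the real CABAS program implements far more (uncertainties, partial-body and protracted exposures) and is not reproduced here.

        % Maximum-likelihood fit of Y(D) = c + alpha*D + beta*D^2 to dicentric counts,
        % assuming Poisson statistics for the number of aberrations per dose point.
        D = [0 0.25 0.5 1 2 3 4]';                       % doses (Gy), placeholder calibration points
        N = [5000 3000 2000 1500 1000 800 600]';         % cells scored per dose (placeholder)
        X = [5 29 52 121 281 481 625]';                  % aberrations observed (placeholder)
        yld  = @(p, D) p(1) + p(2)*D + p(3)*D.^2;        % expected yield per cell
        nll  = @(p) sum(N.*yld(p, D) - X.*log(N.*yld(p, D)));   % Poisson negative log-likelihood
        p0   = [0.001 0.02 0.06];                        % starting values [c alpha beta]
        phat = fminsearch(@(p) nll(max(p, 1e-9)), p0);   % crude positivity guard on the parameters
        phat = max(phat, 1e-9);
        fprintf('c = %.4f, alpha = %.4f, beta = %.4f (per cell)\n', phat);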

  2. The environmental Kuznets curve. Does one size fit all?

    International Nuclear Information System (INIS)

    List, J.A.; Gallet, C.A.

    1999-01-01

    This paper uses a new panel data set on state-level sulfur dioxide and nitrogen oxide emissions from 1929-1994 to test the appropriateness of the 'one size fits all' reduced-form regression approach commonly used in the environmental Kuznets curve literature. Empirical results provide initial evidence that an inverted-U shape characterizes the relationship between per capita emissions and per capita incomes at the state level. Parameter estimates suggest, however, that previous studies, which restrict cross-sections to undergo identical experiences over time, may be presenting statistically biased results. 25 refs

  3. Nonlinear method for including the mass uncertainty of standards and the system measurement errors in the fitting of calibration curves

    International Nuclear Information System (INIS)

    Pickles, W.L.; McClure, J.W.; Howell, R.H.

    1978-01-01

    A sophisticated nonlinear multiparameter fitting program was used to produce a best-fit calibration curve for the response of an x-ray fluorescence analyzer to uranium nitrate, freeze-dried, 0.2% accurate, gravimetric standards. The program is based on the unconstrained minimization subroutine VA02A. The program considers the mass values of the gravimetric standards as parameters to be fitted along with the normal calibration curve parameters. The fitting procedure weights with the system errors and the mass errors in a consistent way. The resulting best-fit calibration curve parameters reflect the fact that the masses of the standard samples are measured quantities with a known error. Error estimates for the calibration curve parameters can be obtained from the curvature of the 'Chi-Squared Matrix' or from error relaxation techniques. It was shown that nondispersive XRFA of 0.1 to 1 mg freeze-dried UNO3 can have an accuracy of 0.2% in 1000 s. 5 figures

  4. Statistically generated weighted curve fit of residual functions for modal analysis of structures

    Science.gov (United States)

    Bookout, P. S.

    1995-01-01

    A statistically generated weighting function for a second-order polynomial curve fit of residual functions has been developed. The residual flexibility test method, from which a residual function is generated, is a procedure for modal testing large structures in an external constraint-free environment to measure the effects of higher order modes and interface stiffness. This test method is applicable to structures with distinct degree-of-freedom interfaces to other system components. A theoretical residual function in the displacement/force domain has the characteristics of a relatively flat line in the lower frequencies and a slight upward curvature in the higher frequency range. In the test residual function, the above-mentioned characteristics can be seen in the data, but due to the present limitations in the modal parameter evaluation (natural frequencies and mode shapes) of test data, the residual function has regions of ragged data. A second order polynomial curve fit is required to obtain the residual flexibility term. A weighting function of the data is generated by examining the variances between neighboring data points. From a weighted second-order polynomial curve fit, an accurate residual flexibility value can be obtained. The residual flexibility value and free-free modes from testing are used to improve a mathematical model of the structure. The residual flexibility modal test method is applied to a straight beam with a trunnion appendage and a space shuttle payload pallet simulator.
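
    A weighted second-order polynomial fit of the kind described above can be written in a few lines of base MATLAB with lscov. The residual-function data and the neighbour-variance weights below are placeholders, not the statistically generated weighting function of the report.

        % Weighted least-squares fit of a 2nd-order polynomial to a residual function.
        f  = linspace(1, 50, 60)';                         % frequency points (placeholder)
        r  = 1e-6*(2 + 0.0008*f.^2) + 2e-8*randn(size(f)); % "ragged" residual data (placeholder)
        w  = 1./movvar(r, 5);                              % weights from local variance of neighbours (movvar: R2016a+)
        A  = [ones(size(f)) f f.^2];                       % design matrix for c0 + c1*f + c2*f^2
        c  = lscov(A, r, w);                               % weighted least-squares coefficients
        rf = c(1);                                         % low-frequency limit, used here as the residual flexibility value
        plot(f, r, '.', f, A*c, '-');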

  5. Dose response curve of induction of MN in lymphocytes for energies Cs-137; Curva dosis respuesta de induccion de micronucleos en linfocitos para las energias Cs-137

    Energy Technology Data Exchange (ETDEWEB)

    Serna Berna, A.; Alcaraz, M.; Acevedo, C.; Vicente, V.; Fuente, I. de la; Canteras, M.

    2006-07-01

    The determination of the dose-response curve is a crucial step in using the micronucleus assay in lymphocytes as a biological dosimeter. The most widely used fitting function is the linear-quadratic function. The coefficients are fitted to calibration data provided by irradiations of blood from healthy donors. In our case we constructed the calibration curve corresponding to gamma radiation from Cesium-137 (660 keV). Doses ranged from 0 to 16 Gy. The fitting procedure used was the iteratively reweighted least-squares algorithm implemented in a Matlab routine. The results of the analysis of our data show that the dose-effect curve does not follow a linear-quadratic curve at high radiation doses, with the quadratic parameter diminishing as the dose increases. This can be interpreted as a micronucleus saturation effect beyond a certain dose level. We conclude that the MN assay with lymphocytes can be well characterized as a biological dosimeter up to a maximum dose of 4.5 Gy. (Author)
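
    An iteratively reweighted least-squares fit of a linear-quadratic dose response, in the spirit of the Matlab routine mentioned above, can be sketched as follows. The micronucleus frequencies are invented, and the Poisson-like weighting (inverse of the fitted yield) is an assumption rather than the authors' exact scheme.

        % Iteratively reweighted least squares (IRLS) for Y(D) = c + alpha*D + beta*D^2.
        D = [0 0.5 1 2 3 4]';                        % doses (Gy), placeholder
        Y = [0.012 0.06 0.14 0.35 0.62 0.95]';       % MN per cell, placeholder frequencies
        A = [ones(size(D)) D D.^2];                  % design matrix
        c = A \ Y;                                   % ordinary least squares as the starting point
        for k = 1:20                                 % IRLS iterations
            w = 1 ./ max(A*c, 1e-6);                 % Poisson-like weights: variance proportional to mean
            c = lscov(A, Y, w);                      % weighted least-squares update
        end
        fprintf('c = %.4f, alpha = %.4f, beta = %.4f\n', c);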

  6. Gamma-ray Burst X-ray Flares Light Curve Fitting

    Science.gov (United States)

    Aubain, Jonisha

    2018-01-01

    Gamma-ray bursts (GRBs) are the most luminous explosions in the Universe. These electromagnetic explosions produce jets, observed as a short burst of prompt gamma-ray emission followed by a broadband afterglow. Sharp increases of flux in the X-ray light curves, known as flares, occur in about 50% of the afterglows. In this study, we characterized all of the X-ray afterglows detected by the Swift X-ray Telescope (XRT), with or without flares. We fit flares to the Norris function (Norris et al. 2005) and power laws with breaks where necessary (Racusin et al. 2009). After fitting the Norris function and power laws, we search for the residual pattern detected in prompt GRB pulses (Hakkila et al. 2014, 2015, 2017), which may indicate a common signature of shock physics. If we find the same signature in flares and prompt pulses, it provides insight into what causes them, as well as how these flares are produced.

  7. Fitting sediment rating curves using regression analysis: a case study of Russian Arctic rivers

    Directory of Open Access Journals (Sweden)

    N. I. Tananaev

    2015-03-01

    Published suspended sediment data for Arctic rivers is scarce. Suspended sediment rating curves for three medium to large rivers of the Russian Arctic were obtained using various curve-fitting techniques. Due to the biased sampling strategy, the raw datasets do not exhibit a log-normal distribution, which restricts the applicability of a log-transformed linear fit. Non-linear (power) model coefficients were estimated using the Levenberg-Marquardt, Nelder-Mead and Hooke-Jeeves algorithms, all of which generally showed close agreement. A non-linear power model employing the Levenberg-Marquardt parameter evaluation algorithm was identified as an optimal statistical solution of the problem. Long-term annual suspended sediment loads estimated using the non-linear power model are, in general, consistent with previously published results.

  8. Fitting sediment rating curves using regression analysis: a case study of Russian Arctic rivers

    Science.gov (United States)

    Tananaev, N. I.

    2015-03-01

    Published suspended sediment data for Arctic rivers is scarce. Suspended sediment rating curves for three medium to large rivers of the Russian Arctic were obtained using various curve-fitting techniques. Due to the biased sampling strategy, the raw datasets do not exhibit log-normal distribution, which restricts the applicability of a log-transformed linear fit. Non-linear (power) model coefficients were estimated using the Levenberg-Marquardt, Nelder-Mead and Hooke-Jeeves algorithms, all of which generally showed close agreement. A non-linear power model employing the Levenberg-Marquardt parameter evaluation algorithm was identified as an optimal statistical solution of the problem. Long-term annual suspended sediment loads estimated using the non-linear power model are, in general, consistent with previously published results.
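
    A minimal MATLAB version of fitting a power-law rating curve SSC = a*Q^b by non-linear least squares, here with the Nelder-Mead simplex in fminsearch (one of the algorithms mentioned above), might look like the sketch below. The discharge and concentration values are placeholders, and the log-log fit is only used to seed the optimizer.

        % Non-linear fit of a sediment rating curve SSC = a*Q^b (Nelder-Mead via fminsearch).
        Q   = [120 250 400 650 900 1400 2100 3000]';      % discharge, m^3/s (placeholder)
        SSC = [15 28 45 80 105 170 260 340]';             % suspended sediment conc., mg/L (placeholder)
        pl  = polyfit(log(Q), log(SSC), 1);               % log-log linear fit for a starting guess
        p0  = [exp(pl(2)) pl(1)];                         % [a b] seed
        sse = @(p) sum((SSC - p(1)*Q.^p(2)).^2);          % objective in the original (untransformed) space
        p   = fminsearch(sse, p0);                        % Nelder-Mead simplex search
        loglog(Q, SSC, 'o', Q, p(1)*Q.^p(2), '-');
        xlabel('Q (m^3 s^{-1})'); ylabel('SSC (mg L^{-1})');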

  9. The Matlab Syntax

    DEFF Research Database (Denmark)

    Jørgensen, Jakob Heide; Skajaa, Anders

    Matlab (MATrix LABoratory) is one of the most widely used programming environments for numerical computations and simulations in the technical sciences. The reason is that Matlab makes it easy to get started as well as to construct advanced programs. This book is a practical guide to understanding and using Matlab. It works as a quick reference for anyone who is starting to use Matlab, for example while enrolled in university studies. For this reason, the book is limited to covering what is typically used by a university student and is designed as a reference of the syntax, including plenty of examples. While the primary audience of the book is university students, it is well suited for anyone who wants to become acquainted with Matlab.

  10. A graphical user interface (gui) matlab program Synthetic_Ves For ...

    African Journals Online (AJOL)

    An interactive and robust computer program for 1D forward modeling of Schlumberger Vertical Electrical Sounding (VES) curves for multilayered earth models is presented. The Graphical User Interface (GUI) enabled software, written in MATLAB v.7.12.0.635 (R2011a), accepts user-defined geologic model parameters (i.e. ...

  11. A novel knot selection method for the error-bounded B-spline curve fitting of sampling points in the measuring process

    International Nuclear Information System (INIS)

    Liang, Fusheng; Zhao, Ji; Ji, Shijun; Zhang, Bing; Fan, Cheng

    2017-01-01

    The B-spline curve has been widely used in the reconstruction of measurement data. Error-bounded reconstruction of sampling points can be achieved by knot addition method (KAM) based B-spline curve fitting. In KAM, the selection pattern of the initial knot vector is associated with the number of knots ultimately required. This paper provides a novel initial knot selection method to condense the knot vector required for error-bounded B-spline curve fitting. The initial knots are determined by the distribution of features contained in the discrete sampling points, namely the chord length (arc length) and bending degree (curvature). Firstly, the sampling points are fitted into an approximate B-spline curve Gs with a dense, uniform knot vector to substitute for the description of the features of the sampling points. The feature integral of Gs is built as a monotonically increasing function in analytic form. Then, the initial knots are selected according to constant increments of the feature integral. After that, an iterative knot insertion (IKI) process starting from the initial knots is introduced to improve the fitting precision, and the ultimate knot vector for the error-bounded B-spline curve fitting is achieved. Lastly, two simulations and a measurement experiment are provided, and the results indicate that the proposed knot selection method can reduce the number of knots required. (paper)
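
    For readers who want to experiment with least-squares B-spline fitting over a chosen knot configuration in MATLAB, the Curve Fitting Toolbox function spap2 provides the basic building block. The sketch below (sampling points, spline order, and number of pieces are all placeholders) is not the knot-selection algorithm of the paper, only the kind of fitting step it builds on.

        % Least-squares B-spline approximation over a chosen knot configuration
        % (requires the Curve Fitting Toolbox; spap2/fnval are its spline functions).
        x  = linspace(0, 2*pi, 200);
        y  = sin(x).*exp(-0.2*x) + 0.01*randn(size(x));   % placeholder sampled profile
        k  = 4;                                           % spline order (cubic)
        sp = spap2(12, k, x, y);                          % 12 polynomial pieces; knots placed by spap2
        yh = fnval(sp, x);                                % evaluate the fitted spline
        fprintf('max abs error: %.3g\n', max(abs(y - yh)));
        plot(x, y, '.', x, yh, '-');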

  12. Matlab linear algebra

    CERN Document Server

    Lopez, Cesar

    2014-01-01

    MATLAB is a high-level language and environment for numerical computation, visualization, and programming. Using MATLAB, you can analyze data, develop algorithms, and create models and applications. The language, tools, and built-in math functions enable you to explore multiple approaches and reach a solution faster than with spreadsheets or traditional programming languages, such as C/C++ or Java. MATLAB Linear Algebra introduces you to the MATLAB language with practical hands-on instructions and results, allowing you to quickly achieve your goals. In addition to giving an introduction to

  13. Validation of curve-fitting method for blood retention of 99mTc-GSA. Comparison with blood sampling method

    International Nuclear Information System (INIS)

    Ha-Kawa, Sang Kil; Suga, Yutaka; Kouda, Katsuyasu; Ikeda, Koshi; Tanaka, Yoshimasa

    1997-01-01

    We investigated a curve-fitting method for the rate of blood retention of 99mTc-galactosyl serum albumin (GSA) as a substitute for the blood sampling method. Seven healthy volunteers and 27 patients with liver disease underwent 99mTc-GSA scanning. After normalization of the y-intercept to 100 percent, a biexponential regression curve fitted to the precordial time-activity curve provided the percent injected dose (%ID) of 99mTc-GSA in the blood without blood sampling. The discrepancy between the %ID obtained by the curve-fitting method and that obtained from multiple blood samples was minimal in normal volunteers: 3.1±2.1% (mean±standard deviation, n=77 samples). A slightly greater discrepancy was observed in patients with liver disease (7.5±6.1%, n=135 samples). The %ID at 15 min after injection obtained from the fitted curve was significantly greater in patients with liver cirrhosis than in the controls (53.2±11.6%, n=13, vs. 31.9±2.8%, n=7, p<…) and correlated with the plasma retention rate for indocyanine green (r=-0.869, p<…). These results indicate that the curve-fitting method adequately estimates the blood retention of 99mTc-GSA and could be a substitute for the blood sampling method. (author)
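
    The biexponential fit of a normalized precordial time-activity curve can be reproduced in outline with base MATLAB. Everything below (time points, simulated counts, starting values) is illustrative only; the normalization of the intercept to 100% follows the description above.

        % Biexponential fit of a precordial time-activity curve, y-intercept normalized to 100%.
        t  = (0:0.5:30)';                                      % minutes after injection (placeholder)
        y  = 100*(0.65*exp(-0.25*t) + 0.35*exp(-0.02*t)) ...
             .* (1 + 0.02*randn(size(t)));                     % simulated %ID curve
        f  = @(p, t) p(1)*exp(-p(2)*t) + p(3)*exp(-p(4)*t);    % A1*exp(-k1*t) + A2*exp(-k2*t)
        p  = fminsearch(@(p) sum((y - f(p, t)).^2), [60 0.3 40 0.05]);
        p([1 3]) = 100*p([1 3])/(p(1) + p(3));                 % rescale so the y-intercept is 100 %ID
        id15 = f(p, 15);                                       % %ID remaining in blood at 15 min
        fprintf('%%ID at 15 min: %.1f\n', id15);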

  14. A Hierarchical Modeling for Reactive Power Optimization With Joint Transmission and Distribution Networks by Curve Fitting

    DEFF Research Database (Denmark)

    Ding, Tao; Li, Cheng; Huang, Can

    2018-01-01

    In order to solve the reactive power optimization with joint transmission and distribution networks, a hierarchical modeling method is proposed in this paper. It allows the reactive power optimization of transmission and distribution networks to be performed separately, leading to a master-slave structure, and improves traditional centralized modeling methods by alleviating the big data problem in a control center. Specifically, the transmission-distribution-network coordination issue of the hierarchical modeling method is investigated. First, a curve-fitting approach is developed to provide a cost … optimality. Numerical results on two test systems verify the effectiveness of the proposed hierarchical modeling and curve-fitting methods.

  15. Maximum likelihood fitting of FROC curves under an initial-detection-and-candidate-analysis model

    International Nuclear Information System (INIS)

    Edwards, Darrin C.; Kupinski, Matthew A.; Metz, Charles E.; Nishikawa, Robert M.

    2002-01-01

    We have developed a model for FROC curve fitting that relates the observer's FROC performance not to the ROC performance that would be obtained if the observer's responses were scored on a per image basis, but rather to a hypothesized ROC performance that the observer would obtain in the task of classifying a set of 'candidate detections' as positive or negative. We adopt the assumptions of the Bunch FROC model, namely that the observer's detections are all mutually independent, as well as assumptions qualitatively similar to, but different in nature from, those made by Chakraborty in his AFROC scoring methodology. Under the assumptions of our model, we show that the observer's FROC performance is a linearly scaled version of the candidate analysis ROC curve, where the scaling factors are just given by the FROC operating point coordinates for detecting initial candidates. Further, we show that the likelihood function of the model parameters given observational data takes on a simple form, and we develop a maximum likelihood method for fitting a FROC curve to this data. FROC and AFROC curves are produced for computer vision observer datasets and compared with the results of the AFROC scoring method. Although developed primarily with computer vision schemes in mind, we hope that the methodology presented here will prove worthy of further study in other applications as well

  16. Basis and application of MATLAB

    International Nuclear Information System (INIS)

    Shin, Chun Sik; An, Yeong Ju; Byun, Gi Sik; Lee, Hyeong Gi

    1998-01-01

    This book introduces MATLAB, covering operations and basic functions; MATLAB programming constructs such as for, if, else, while, and script files; equations, calculation, and interpolation; matrix operations; file management functions, including input-output of MATLAB internal files and of external files; basic graph functions, two-dimensional and three-dimensional graphs, and other graph functions; and the debugger of MATLAB in versions 4.2 and 5.1.

  17. Data fitting by G1 rational cubic Bézier curves using harmony search

    Directory of Open Access Journals (Sweden)

    Najihah Mohamed

    2015-07-01

    A metaheuristic algorithm called Harmony Search (HS) is implemented for data fitting by rational cubic Bézier curves. HS is a derivative-free real-parameter optimization algorithm, and draws its inspiration from the musical improvisation process of searching for a perfect state of harmony. HS is suitable for multivariate non-linear optimization problems. Fitting is mainly achieved using rational cubic Bézier curves with G1 continuity at every joint between segments of the whole data set. This approach contributes significantly to making the technique automated. HS is used to optimize the positions of the middle points and the values of the shape parameters. Test outline images and a comparative experimental analysis are presented to show the effectiveness and robustness of the proposed method. Statistical testing between HS and two other metaheuristic algorithms is used in the analysis of several outline images. All of the algorithms improvised a near-optimal solution, but the result obtained by HS is better than the results of the other two algorithms.

  18. Reflection curves—new computation and rendering techniques

    Directory of Open Access Journals (Sweden)

    Dan-Eugen Ulmet

    2004-05-01

    Reflection curves on surfaces are important tools for free-form surface interrogation. They are essential for industrial 3D CAD/CAM systems and for rendering purposes. In this note, new approaches to the computation and rendering of reflection curves on surfaces are introduced. These approaches are designed to take advantage of the graphics libraries of recent releases of commercial systems such as the OpenInventor toolkit (developed by Silicon Graphics) or Matlab (developed by The MathWorks). A new relation between reflection curves and contour curves is derived; this theoretical result is used for a straightforward Matlab implementation of reflection curves. A new type of reflection curve is also generated using the OpenInventor texture and environment mapping implementations. This allows the computation, rendering, and animation of reflection curves at interactive rates, which makes it particularly useful for industrial applications.

  19. A direct method to solve optimal knots of B-spline curves: An application for non-uniform B-spline curves fitting.

    Directory of Open Access Journals (Sweden)

    Van Than Dung

    B-spline functions are widely used in many industrial applications such as computer graphics representations, computer-aided design, computer-aided manufacturing, computer numerical control, etc. Recently, there have been demands, e.g. in the reverse engineering (RE) area, to employ B-spline curves for non-trivial cases that include curves with discontinuous points, cusps or turning points in the sampled data. The most challenging task in these cases is the identification of the number of knots and their respective locations in non-uniform space at the lowest computational cost. This paper presents a new strategy for fitting any form of curve by B-spline functions via a local algorithm. A new two-step method for fast knot calculation is proposed. In the first step, the data is split using a bisecting method with a predetermined allowable error to obtain coarse knots. Secondly, the knots are optimized, for both locations and continuity levels, by employing a non-linear least squares technique. The B-spline function is, therefore, obtained by solving the ordinary least squares problem. The performance of the proposed method is validated using various numerical experimental data, with and without simulated noise, generated by a B-spline function and by deterministic parametric functions. This paper also benchmarks the proposed method against existing methods in the literature. The proposed method is shown to be able to reconstruct B-spline functions from sampled data within acceptable tolerance. It is also shown that the proposed method can be applied to fitting any type of curve, ranging from smooth curves to discontinuous ones. In addition, the method does not require excessive computational cost, which allows it to be used in automatic reverse engineering applications.

  20. Research on Standard and Automatic Judgment of Press-fit Curve of Locomotive Wheel-set Based on AAR Standard

    Science.gov (United States)

    Lu, Jun; Xiao, Jun; Gao, Dong Jun; Zong, Shu Yu; Li, Zhu

    2018-03-01

    In the production of Association of American Railroads (AAR) locomotive wheel-sets, the press-fit curve is the most important basis for judging the reliability of wheel-set assembly. In the past, most production enterprises mainly used manual inspection methods to judge assembly quality, and cases of misjudgement occurred. For this reason, research on the standard was carried out, and the automatic judgment of the press-fit curve was analysed and designed, so as to provide guidance for locomotive wheel-set production based on the AAR standard.

  1. Matlab for engineers explained

    CERN Document Server

    Gustafsson, Fredrik

    2003-01-01

    This book is written for students in bachelor and master programs and has four different purposes, which split the book into four parts: 1. To teach first- or early-year undergraduate engineering students basic knowledge in technical computations and programming using MATLAB. The first part starts from first principles and is therefore well suited both for readers with prior exposure to MATLAB but lacking a solid foundational knowledge of the capabilities of the system and for readers without any previous experience with MATLAB. The foundational knowledge gained from these interactive guided tours of the system will hopefully be sufficient for an effective utilization of MATLAB in the engineering profession, in education and in research. 2. To explain the foundations of more advanced use of MATLAB using the facilities added in the last couple of years, such as extended data structures, object orientation and advanced graphics. 3. To give an introduction to the use of MATLAB in typical undergraduate courses in elec...

  2. An introduction to MATLAB.

    Science.gov (United States)

    Sobie, Eric A

    2011-09-13

    This two-part lecture introduces students to the scientific computing language MATLAB. Prior computer programming experience is not required. The lectures present basic concepts of computer programming logic that tend to cause difficulties for beginners in addition to concepts that relate specifically to the MATLAB language syntax. The lectures begin with a discussion of vectors, matrices, and arrays. Because many types of biological data, such as fluorescence images and DNA microarrays, are stored as two-dimensional objects, processing these data is a form of array manipulation, and MATLAB is especially adept at handling such array objects. The students are introduced to basic commands in MATLAB, as well as built-in functions that provide useful shortcuts. The second lecture focuses on the differences between MATLAB scripts and MATLAB functions and describes when one method of programming organization might be preferable to the other. The principles are illustrated through the analysis of experimental data, specifically measurements of intracellular calcium concentration in live cells obtained using confocal microscopy.

  3. A new method for curve fitting to the data with low statistics not using the chi2-method

    International Nuclear Information System (INIS)

    Awaya, T.

    1979-01-01

    A new method which does not use the χ²-fitting method is investigated in order to fit the theoretical curve to data with low statistics. The method is compared with the usual and modified χ²-fitting methods. The analyses are done for data generated by computers. It is concluded that the new method gives good results in all the cases. (Auth.)

  4. Numerical generation of boundary-fitted curvilinear coordinate systems for arbitrarily curved surfaces

    International Nuclear Information System (INIS)

    Takagi, T.; Miki, K.; Chen, B.C.J.; Sha, W.T.

    1985-01-01

    A new method is presented for numerically generating boundary-fitted coordinate systems for arbitrarily curved surfaces. The three-dimensional surface has been expressed by functions of two parameters using the geometrical modeling techniques in computer graphics. This leads to new quasi-one- and two-dimensional elliptic partial differential equations for coordinate transformation. Since the equations involve the derivatives of the surface expressions, the grids generated by the equations distribute on the surface depending on its slope and curvature. A computer program GRID-CS based on the method was developed and applied to a surface of the second order, a torus and a surface of a primary containment vessel for a nuclear reactor. These applications confirm that GRID-CS is a convenient and efficient tool for grid generation on arbitrarily curved surfaces.

  5. The GMT/MATLAB Toolbox

    Science.gov (United States)

    Wessel, Paul; Luis, Joaquim F.

    2017-02-01

    The GMT/MATLAB toolbox is a basic interface between MATLAB® (or Octave) and GMT, the Generic Mapping Tools, which allows MATLAB users full access to all GMT modules. Data may be passed between the two programs using intermediate MATLAB structures that organize the metadata needed; these are produced when GMT modules are run. In addition, standard MATLAB matrix data can be used directly as input to GMT modules. The toolbox improves interoperability between two widely used tools in the geosciences and extends the capability of both tools: GMT gains access to the powerful computational capabilities of MATLAB while the latter gains the ability to access specialized gridding algorithms and can produce publication-quality PostScript-based illustrations. The toolbox is available on all platforms and may be downloaded from the GMT website.

  6. MATLAB matrix algebra

    CERN Document Server

    Pérez López, César

    2014-01-01

    MATLAB is a high-level language and environment for numerical computation, visualization, and programming. Using MATLAB, you can analyze data, develop algorithms, and create models and applications. The language, tools, and built-in math functions enable you to explore multiple approaches and reach a solution faster than with spreadsheets or traditional programming languages, such as C/C++ or Java. MATLAB Matrix Algebra introduces you to the MATLAB language with practical hands-on instructions and results, allowing you to quickly achieve your goals. Starting with a look at symbolic and numeric variables, with an emphasis on vector and matrix variables, you will go on to examine functions and operations that support vectors and matrices as arguments, including those based on analytic parent functions. Computational methods for finding eigenvalues and eigenvectors of matrices are detailed, leading to various matrix decompositions. Applications such as change of bases, the classification of quadratic forms and ...

  7. A bivariate contaminated binormal model for robust fitting of proper ROC curves to a pair of correlated, possibly degenerate, ROC datasets.

    Science.gov (United States)

    Zhai, Xuetong; Chakraborty, Dev P

    2017-06-01

    The objective was to design and implement a bivariate extension to the contaminated binormal model (CBM) to fit paired receiver operating characteristic (ROC) datasets, possibly degenerate, with proper ROC curves. Paired datasets yield two correlated ratings per case. Degenerate datasets have no interior operating points, and proper ROC curves do not inappropriately cross the chance diagonal. The existing method, developed more than three decades ago, utilizes a bivariate extension to the binormal model, implemented in CORROC2 software, which yields improper ROC curves and cannot fit degenerate datasets. CBM can fit proper ROC curves to unpaired (i.e., yielding one rating per case) and degenerate datasets, and there is a clear scientific need to extend it to handle paired datasets. In CBM, nondiseased cases are modeled by a probability density function (pdf) consisting of a unit-variance peak centered at zero. Diseased cases are modeled with a mixture distribution whose pdf consists of two unit-variance peaks, one centered at positive μ with integrated probability α, the mixing fraction parameter, corresponding to the fraction of diseased cases where the disease was visible to the radiologist, and one centered at zero, with integrated probability (1-α), corresponding to disease that was not visible. It is shown that: (a) for nondiseased cases the bivariate extension is a unit-variance bivariate normal distribution centered at (0,0) with a specified correlation ρ1; (b) for diseased cases the bivariate extension is a mixture distribution with four peaks, corresponding to disease not visible in either condition, disease visible in only one condition (contributing two peaks), and disease visible in both conditions. An expression for the likelihood function is derived. A maximum likelihood estimation (MLE) algorithm, CORCBM, was implemented in the R programming language that yields parameter estimates, the covariance matrix of the parameters, and other statistics.

  8. Matlab laser toolbox

    NARCIS (Netherlands)

    Römer, Gerardus Richardus, Bernardus, Engelina; Huis in 't Veld, Bert; Schmidt, M.; Vollertsen, F.; Geiger, M.

    2010-01-01

    Matlab® is a program for numeric computation, simulation and visualization, developed by The MathWorks, Inc. It is used heavily in education, research, and industry for solving general as well as application-specific problems that arise in various disciplines. For this purpose Matlab has

  9. R and Matlab

    CERN Document Server

    Hiebeler, David E

    2015-01-01

    The First Book to Explain How a User of R or MATLAB Can Benefit from the OtherIn today's increasingly interdisciplinary world, R and MATLAB® users from different backgrounds must often work together and share code. R and MATLAB® is designed for users who already know R or MATLAB and now need to learn the other platform. The book makes the transition from one platform to the other as quick and painless as possible.Enables R and MATLAB Users to Easily Collaborate and Share CodeThe author covers essential tasks, such as working with matrices and vectors, writing functions and other programming co

  10. MatLab Script and Functional Programming

    Science.gov (United States)

    Shaykhian, Gholam Ali

    2007-01-01

    MatLab Script and Functional Programming: MatLab is one of the most widely used very high level programming languages for scientific and engineering computations. It is very user-friendly and needs practically no formal programming knowledge. Presented here are MatLab programming aspects and not just the MatLab commands for scientists and engineers who do not have formal programming training and also have no significant time to spare for learning programming to solve their real world problems. Specifically provided are programs for visualization. The MatLab seminar covers the functional and script programming aspect of MatLab language. Specific expectations are: a) Recognize MatLab commands, script and function. b) Create, and run a MatLab function. c) Read, recognize, and describe MatLab syntax. d) Recognize decisions, loops and matrix operators. e) Evaluate scope among multiple files, and multiple functions within a file. f) Declare, define and use scalar variables, vectors and matrices.

  11. Non-linear least squares curve fitting of a simple theoretical model to radioimmunoassay dose-response data using a mini-computer

    International Nuclear Information System (INIS)

    Wilkins, T.A.; Chadney, D.C.; Bryant, J.; Palmstroem, S.H.; Winder, R.L.

    1977-01-01

    Using the simple univalent antigen-univalent antibody equilibrium model, the dose-response curve of a radioimmunoassay (RIA) may be expressed as a function of Y, X and the four physical parameters of the idealised system. A compact but powerful mini-computer program has been written in BASIC for rapid iterative non-linear least squares curve fitting and dose interpolation with this function. In its simplest form the program can be operated in an 8K byte mini-computer. The program has been extensively tested with data from 10 different assay systems (RIA and CPBA) for the measurement of drugs and hormones ranging in molecular size from thyroxine to insulin. For each assay system the results have been analysed in terms of (a) curve-fitting biases and (b) direct comparison with manual fitting. In all cases the quality of fitting was remarkably good, in spite of the fact that the chemistry of each system departed significantly from one or more of the assumptions implicit in the model used. A mathematical analysis of departures from the model's principal assumption has provided an explanation for this somewhat unexpected observation. The essential features of this analysis are presented in this paper together with the statistical analyses of the performance of the program. From these, and the results obtained to date in the routine quality control of these 10 assays, it is concluded that the method of curve fitting and dose interpolation presented in this paper is likely to be of general applicability. (orig.)
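
    The paper's mass-action equilibrium model is not reproduced here, but to give a feel for fitting a four-parameter sigmoidal RIA dose-response curve in modern MATLAB, the sketch below fits a four-parameter logistic (a common surrogate, not the authors' model) by non-linear least squares. The doses and bound counts are placeholders.

        % Four-parameter logistic fit of an RIA dose-response curve (surrogate model).
        dose = [0.1 0.3 1 3 10 30 100 300]';               % standard doses (placeholder units)
        B    = [95 92 84 70 48 28 14 8]';                  % bound counts, % of B0 (placeholder)
        f4pl = @(p, x) p(4) + (p(1) - p(4))./(1 + (x/p(3)).^p(2));  % p = [top slope ED50 bottom]
        p    = fminsearch(@(p) sum((B - f4pl(p, dose)).^2), [100 1 10 5]);
        xg   = logspace(-1, 2.5)';                          % fine dose grid for plotting
        semilogx(dose, B, 'o', xg, f4pl(p, xg), '-');
        fprintf('ED50 approx %.2f (same units as dose)\n', p(3));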

  12. A Matlab Tool for Tumor Localization in Parathyroid Sestamibi Scintigraphy

    Directory of Open Access Journals (Sweden)

    M. Đurović

    2015-11-01

    The Submarine method for localization of parathyroid tumors (PTs) has proved to be effective in cases of the typical pitfalls of conventional scintigraphic methods (combined subtraction and double-phase methods). It uses images obtained by standard dynamic parathyroid sestamibi scintigraphy as suggested by the European Association of Nuclear Medicine. This paper presents: 1) the developed Matlab interface that enables the implementation and evaluation of algorithms for the automatic application of the Submarine method; 2) the algorithm for automatic extraction of the entire thyroid region from the background radioactivity using operations from mathematical morphology applied to the dynamic scintigrams; 3) the results obtained by the algorithm for localization and visualization of PTs based on estimation of the exponentially decreasing trend of time-activity curves. The algorithm was tested on a group of 20 patients with histopathologically proven PTs using the developed Matlab interface.

  13. Glycation and secondary conformational changes of human serum albumin: study of the FTIR spectroscopic curve-fitting technique

    Directory of Open Access Journals (Sweden)

    Yu-Ting Huang

    2016-05-01

    The aim of this study was to investigate both the glycation kinetics and the secondary conformational changes of human serum albumin (HSA) after reaction with ribose. Browning and fluorescence determinations as well as Fourier transform infrared (FTIR) microspectroscopy with a curve-fitting technique were applied. Various concentrations of ribose were incubated over a 12-week period at 37 ± 0.5 °C under dark conditions. The results clearly show that glycation in the HSA-ribose reaction mixtures increased markedly with the amount of ribose used and the incubation time, leading to marked alterations of the protein conformation of HSA as determined by FTIR. In addition, the reaction solutions were colored from light to deep brown, as determined by optical observation. The increase in fluorescence intensity from HSA-ribose mixtures seemed to occur more quickly than browning, suggesting that the fluorescent products were produced earlier in the process than the compounds causing browning. Moreover, the predominant α-helical composition of HSA decreased with an increase in ribose concentration and incubation time, whereas the total β-structure and random coil composition increased, as determined by curve-fitted FTIR microspectroscopy analysis. We also found that the peak intensity ratio at 1044 cm−1/1542 cm−1 markedly decreased prior to 4 weeks of incubation and then almost plateaued, implying that the consumption of ribose in the glycation reaction might have been accelerated over the first 4 weeks of incubation and then gradually decreased. This study first evidences that two unique IR peaks at 1710 cm−1 [carbonyl groups of irreversible products produced by the reaction and deposition of advanced glycation end products (AGEs)] and 1621 cm−1 (aggregated HSA molecules) were clearly observed in the curve-fitted FTIR spectra of HSA-ribose mixtures over the course of the incubation time. This study
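
    Curve fitting of a protein infrared band into component sub-bands, of the kind underlying the analysis above, is commonly posed as a sum-of-Gaussians least-squares problem. The MATLAB sketch below is a generic illustration with invented band positions and a synthetic spectrum; it is not the authors' protocol.

        % Decompose a spectral band into Gaussian components by non-linear least squares.
        nu    = (1600:0.5:1700)';                            % wavenumber axis, cm^-1 (placeholder)
        gauss = @(a, c, w, x) a*exp(-((x - c)/w).^2);
        spec  = gauss(1.0, 1654, 9, nu) + gauss(0.5, 1628, 8, nu) ...      % synthetic "spectrum"
              + gauss(0.3, 1680, 7, nu) + 0.01*randn(size(nu));
        model = @(p, x) gauss(p(1), p(2), p(3), x) + gauss(p(4), p(5), p(6), x) ...
                      + gauss(p(7), p(8), p(9), x);
        p0    = [0.9 1655 10  0.4 1630 10  0.3 1678 8];      % initial band guesses, [a c w] per band
        p     = fminsearch(@(p) sum((spec - model(p, nu)).^2), p0, ...
                           optimset('MaxFunEvals', 2e4, 'MaxIter', 2e4));
        areas = abs(p([1 4 7]).*p([3 6 9]))*sqrt(pi);        % Gaussian areas -> relative band content
        plot(nu, spec, '.', nu, model(p, nu), '-');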

  14. Statistics in Matlab a primer

    CERN Document Server

    Cho, MoonJung

    2014-01-01

    List of Tables; Preface. MATLAB Basics: Desktop Environment; Getting Help and Other Documentation; Data Import and Export; Data I/O via the Command Line; The Import Wizard; Examples of Data I/O in MATLAB; Data I/O with the Statistics Toolbox; More Functions for Data I/O; Data in MATLAB; Data Objects in Base MATLAB; Accessing Data Elements; Examples of Joining Data Sets; Data Types in the Statistics Toolbox; Object-Oriented Programming; Miscellaneous Topics; File and Workspace Management; Punctuation in MATLAB; Arithmetic Operators; Functions in MATLAB; Summary and Further Reading. Visualizing Data: Basic Plot Functions; Plotting 2-D Data; Plotting 3-D Data; Examples; Scatter Plots; Basic 2-D and 3-D Scatter Plots; Scatter Plot Matrix; Examples; GUIs for Graphics; Simple Plot Editing; Plotting Tools Interface; PLOTS Tab; Summary and Further Reading. Descriptive Statistics: Measures of Location; Means, Medians, and Modes; Examples; Measures of Dispersion; Range; Variance and Standard Deviation; Covariance and Correlation; Examples; Describing the Distribution...

  15. Ionization constants by curve fitting: determination of partition and distribution coefficients of acids and bases and their ions.

    Science.gov (United States)

    Clarke, F H; Cahoon, N M

    1987-08-01

    A convenient procedure has been developed for the determination of partition and distribution coefficients. The method involves the potentiometric titration of the compound, first in water and then in a rapidly stirred mixture of water and octanol. An automatic titrator is used, and the data is collected and analyzed by curve fitting on a microcomputer with 64 K of memory. The method is rapid and accurate for compounds with pKa values between 4 and 10. Partition coefficients can be measured for monoprotic and diprotic acids and bases. The partition coefficients of the neutral compound and its ion(s) can be determined by varying the ratio of octanol to water. Distribution coefficients calculated over a wide range of pH values are presented graphically as "distribution profiles". It is shown that subtraction of the titration curve of solvent alone from that of the compound in the solvent offers advantages for pKa determination by curve fitting for compounds of low aqueous solubility.

  16. Generation of synthetic gamma spectra with MATLAB

    International Nuclear Information System (INIS)

    Palmerio, Julian J.; Coppo, Anibal D.

    2009-01-01

    Objectives: The aim of this work is the simulation of gamma spectra using MATLAB in order to generate the efficiency calibration curves that will be used to measure radioactive waste in drums; these curves are necessary for the proper characterization of the drums. A Monte Carlo simulation was developed using the Mersenne Twister random number generator and nuclear data obtained from NIST. This paper shows the results obtained and the difficulties encountered to date; up to this moment, the physical correctness of the simulated spectra has been the only aspect we have worked on. Procedures: A simplified representation of the 'Laboratorio de Verificacion y Control de la Calidad' was chosen. Drums with cemented liquid waste are routinely measured in this laboratory. The commercial program MCNP was also used to obtain a valid reference in the field of spectrum simulation. We analyzed the spectra obtained with MATLAB in light of the classical photon-detection literature and of the spectrum obtained with MCNP. Conclusions: Currently the program developed seems adequate to simulate a measurement in the 'Laboratorio de Verificacion y Control de la Calidad'. The spectra obtained with MATLAB seem to physically represent what is observed in real spectra. However, the program is slow, and the current development efforts are directed at improving the speed of the simulation. One alternative is to use the CUDA language for NVIDIA video cards to parallelize the simulation. An adequate simulation of the electronic measuring chain is also needed to obtain better representations of the shapes of the peaks. (author)

  17. Matlab differential and integral calculus

    CERN Document Server

    Lopez, Cesar

    2014-01-01

    MATLAB is a high-level language and environment for numerical computation, visualization, and programming. Using MATLAB, you can analyze data, develop algorithms, and create models and applications. The language, tools, and built-in math functions enable you to explore multiple approaches and reach a solution faster than with spreadsheets or traditional programming languages, such as C/C++ or Java. MATLAB Differential and Integral Calculus introduces you to the MATLAB language with practical hands-on instructions and results, allowing you to quickly achieve your goals. In addition to givi

  18. Matlab programming for numerical analysis

    CERN Document Server

    Lopez, Cesar

    2014-01-01

    MATLAB is a high-level language and environment for numerical computation, visualization, and programming. Using MATLAB, you can analyze data, develop algorithms, and create models and applications. The language, tools, and built-in math functions enable you to explore multiple approaches and reach a solution faster than with spreadsheets or traditional programming languages, such as C/C++ or Java. Programming MATLAB for Numerical Analysis introduces you to the MATLAB language with practical hands-on instructions and results, allowing you to quickly achieve your goals. You will first become

  19. A MATLAB companion for multivariable calculus

    CERN Document Server

    Cooper, Jeffery

    2001-01-01

    Offering a concise collection of MatLab programs and exercises to accompany a third semester course in multivariable calculus, A MatLab Companion for Multivariable Calculus introduces simple numerical procedures such as numerical differentiation, numerical integration and Newton's method in several variables, thereby allowing students to tackle realistic problems. The many examples show students how to use MatLab effectively and easily in many contexts. Numerous exercises in mathematics and applications areas are presented, graded from routine to more demanding projects requiring some programming. Matlab M-files are provided on the Harcourt/Academic Press web site at http://www.harcourt-ap.com/matlab.html. * Computer-oriented material that complements the essential topics in multivariable calculus * Main ideas presented with examples of computations and graphics displays using MATLAB * Numerous examples of short code in the text, which can be modified for use with the exercises * MATLAB files are used to implem...

  20. Parallelizing AT with MatlabMPI

    International Nuclear Information System (INIS)

    2011-01-01

    The Accelerator Toolbox (AT) is a high-level collection of tools and scripts specifically oriented toward solving problems dealing with computational accelerator physics. It is integrated into the MATLAB environment, which provides an accessible, intuitive interface for accelerator physicists, allowing researchers to focus the majority of their efforts on simulations and calculations, rather than programming and debugging difficulties. Efforts toward parallelization of AT have been put in place to upgrade its performance to modern standards of computing. We utilized the packages MatlabMPI and pMatlab, which were developed by MIT Lincoln Laboratory, to set up a message-passing environment that could be called within MATLAB, which provided the necessary prerequisites for multithread processing capabilities. On local quad-core CPUs, we were able to demonstrate processor efficiencies of roughly 95% and speed increases of nearly 380%. By exploiting the efficacy of modern-day parallel computing, we were able to demonstrate incredibly efficient speed increments per processor in AT's beam-tracking functions. Extrapolating from prediction, we can expect to reduce week-long computation runtimes to less than 15 minutes. This is a huge performance improvement and has enormous implications for the future computing power of the accelerator physics group at SSRL. However, one of the downfalls of parringpass is its current lack of transparency; the pMatlab and MatlabMPI packages must first be well understood by the user before the system can be configured to run the scripts. In addition, the instantiation of argument parameters requires internal modification of the source code. Thus, parringpass cannot be directly run from the MATLAB command line, which detracts from its flexibility and user-friendliness. Future work in AT's parallelization will focus on development of external functions and scripts that can be called from within MATLAB and configured on multiple nodes, while

  1. Numerical methods using Matlab

    CERN Document Server

    Lindfield, George

    2012-01-01

    Numerical Methods using MATLAB, 3e, is an extensive reference offering hundreds of useful and important numerical algorithms that can be implemented into MATLAB for a graphical interpretation to help researchers analyze a particular outcome. Many worked examples are given together with exercises and solutions to illustrate how numerical methods can be used to study problems that have applications in the biosciences, chaos, optimization, engineering and science across the board. Numerical Methods using MATLAB, 3e, is an extensive reference offering hundreds of use

  2. Cuckoo search with Lévy flights for weighted Bayesian energy functional optimization in global-support curve data fitting.

    Science.gov (United States)

    Gálvez, Akemi; Iglesias, Andrés; Cabellos, Luis

    2014-01-01

    The problem of data fitting is very important in many theoretical and applied fields. In this paper, we consider the problem of optimizing a weighted Bayesian energy functional for data fitting by using global-support approximating curves. By global-support curves we mean curves expressed as a linear combination of basis functions whose support is the whole domain of the problem, as opposed to other common approaches in CAD/CAM and computer graphics driven by piecewise functions (such as B-splines and NURBS) that provide local control of the shape of the curve. Our method applies a powerful nature-inspired metaheuristic algorithm called cuckoo search, introduced recently to solve optimization problems. A major advantage of this method is its simplicity: cuckoo search requires only two parameters, many fewer than other metaheuristic approaches, so the parameter tuning becomes a very simple task. The paper shows that this new approach can be successfully used to solve our optimization problem. To check the performance of our approach, it has been applied to five illustrative examples of different types, including open and closed 2D and 3D curves that exhibit challenging features, such as cusps and self-intersections. Our results show that the method performs pretty well, being able to solve our minimization problem in an astonishingly straightforward way.

  3. Characterization of Yellow Seahorse Hippocampus kuda feeding click sound signals in a laboratory environment: an application of probability density function and power spectral density analyses

    Digital Repository Service at National Institute of Oceanography (India)

    Chakraborty, B.; Saran, A.K.; Kuncolienker, D.S.; Sreepada, R.A.; Haris, K.; Fernandes, W.A

    based on the assumption of combinations of normal/Gaussian distributions indicate well-fitted multimodal curves generated using MATLAB programs and its Curve Fitting Toolbox (MathWorks Inc. 2005). Out of the twenty-three clicks, four clicks (two clicks of 16 and 18 cm male...

  4. Nonlinear models for fitting growth curves of Nellore cows reared in the Amazon Biome

    Directory of Open Access Journals (Sweden)

    Kedma Nayra da Silva Marinho

    2013-09-01

    Growth curves of Nellore cows were estimated by comparing six nonlinear models: Brody, Logistic, two alternatives by Gompertz, Richards and Von Bertalanffy. The models were fitted to weight-age data, from birth to 750 days of age of 29,221 cows, born between 1976 and 2006 in the Brazilian states of Acre, Amapá, Amazonas, Pará, Rondônia, Roraima and Tocantins. The models were fitted by the Gauss-Newton method. The goodness of fit of the models was evaluated by using mean square error, adjusted coefficient of determination, prediction error and mean absolute error. Biological interpretation of parameters was accomplished by plotting estimated weights versus the observed weight means, instantaneous growth rate, absolute maturity rate, relative instantaneous growth rate, inflection point and magnitude of the parameters A (asymptotic weight) and K (maturing rate). The Brody and Von Bertalanffy models fitted the weight-age data but the other models did not. The average weight (A) and growth rate (K) were: 384.6±1.63 kg and 0.0022±0.00002 (Brody) and 313.40±0.70 kg and 0.0045±0.00002 (Von Bertalanffy). The Brody model provides better goodness of fit than the Von Bertalanffy model.
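
    As a hedged illustration only (not the authors' code or data), the sketch below fits the Brody form W(t) = A*(1 - B*exp(-K*t)) to synthetic weight-age records with MATLAB's lsqcurvefit (Optimization Toolbox); all numbers are invented for the example.

        % Hypothetical weight-age data (kg versus days); real records would come from the herd database.
        t    = (0:50:750)';
        Wobs = 380*(1 - 0.9*exp(-0.0022*t)) + 8*randn(size(t));

        % Brody model: W(t) = A*(1 - B*exp(-K*t)); A = asymptotic weight, K = maturing rate
        brody = @(p, t) p(1)*(1 - p(2)*exp(-p(3)*t));

        p0   = [350, 0.9, 0.003];                  % rough starting values
        phat = lsqcurvefit(brody, p0, t, Wobs);    % nonlinear least squares (Optimization Toolbox)

        % Goodness of fit via mean square error, one of the criteria used in the paper
        mse = mean((Wobs - brody(phat, t)).^2);
        fprintf('A = %.1f kg, K = %.5f per day, MSE = %.1f\n', phat(1), phat(3), mse);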

  5. Comparison 3 - 386 MATLAB

    DEFF Research Database (Denmark)

    Jensen, Niels

    1992-01-01

    MATLAB is a C-based general tool for mathematical and engineering calculations with limited capabilities for simulation of non-linear equation systems.

  6. Potential Risk Estimation Drowning Index for Children (PREDIC): a pilot study from Matlab, Bangladesh.

    Science.gov (United States)

    Borse, N N; Hyder, A A; Bishai, D; Baker, T; Arifeen, S E

    2011-11-01

    Childhood drowning is a major public health problem that has been neglected in many low- and middle-income countries. In Matlab, rural Bangladesh, more than 40% of child deaths aged 1-4 years are due to drowning. The main objective of this paper was to develop and evaluate a childhood drowning risk prediction index. A literature review was carried out to document risk factors identified for childhood drowning in Bangladesh. The Newacheck model for special health care needs for children was adapted and applied to construct a childhood drowning risk index called "Potential Risk Estimation Drowning Index for Children" (PREDIC). Finally, the proposed PREDIC Index was applied to childhood drowning deaths and compared with a comparison group of children living in Matlab, Bangladesh. This pilot study used t-tests and Receiver Operating Characteristic (ROC) curves to analyze the results. The PREDIC index was applied to 302 drowning deaths and 624 children 0-4 years old living in Matlab. The results of the t-test indicate that the drowned children had a statistically (t=-8.58, p=0.0001) significant higher mean PREDIC score (6.01) than those in the comparison group (5.26). Drowning cases had a PREDIC score of 6 or more for 68% of the children, whereas the comparison group had 43% of the children with a score of 6 or more, a statistically significant difference (t=-7.36, p<0.001). The area under the Receiver Operating Characteristic curve was 0.662. Index score construction was scientifically plausible, and the index is relatively complete, fairly accurate, and practical. The risk index can help identify and target high-risk children with drowning prevention programs. The PREDIC index needs to be further tested for its accuracy, feasibility and effectiveness in drowning risk reduction in Bangladesh and other countries.

  7. System Simulation of Nuclear Power Plant by Coupling RELAP5 and Matlab/Simulink

    International Nuclear Information System (INIS)

    Meng Lin; Dong Hou; Zhihong Xu; Yanhua Yang; Ronghua Zhang

    2006-01-01

    modeled by RELAP5 code, and its main control and protection system is duplicated by Matlab/Simulink. Some steady states and transients are calculated under control of these I&C systems, and the results are compared with the plant test curves. The application showed that accurate system simulation of NPPs can be achieved by coupling RELAP5 and Matlab/Simulink. This paper will mainly focus on the coupling method, plant thermal-hydraulic model, main control logics, test and application results. (authors)

  8. Essential Matlab and Octave

    CERN Document Server

    Rogel-Salazar, Jesus

    2014-01-01

    ""This is an excellent book for anyone approaching MATLAB or Octave for the first time. The pleasant language used throughout creates the sensation of having the author by your side. … An interesting feature are the examples used to explain the use of functions and operations. … compared to similar texts on Octave and MATLAB, the author introduces at an early stage how to produce line and surface plots with MATLAB and Octave. It is very attractive to students to be able to quickly produce plots with scientific journal quality. … The margin notes are great as they can also work as virtual bookm

  9. ROBUST DECLINE CURVE ANALYSIS

    Directory of Open Access Journals (Sweden)

    Sutawanir Darwis

    2012-05-01

    Empirical decline curve analysis of oil production data gives reasonable answers in hyperbolic type curve situations; however, the methodology has limitations in fitting real historical production data in the presence of unusual observations due to the effect of treatments applied to the well in order to increase production capacity. The development of robust least squares offers new possibilities for better fitting production data using decline curve analysis by down-weighting the unusual observations. This paper proposes a robust least squares fitting (lmRobMM) approach to estimate the decline rate of daily production data and compares the results with reservoir simulation results. For the case study, we use the oil production data at TBA Field, West Java. The results demonstrated that the approach is suitable for decline curve fitting and offers new insight into decline curve analysis in the presence of unusual observations.
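
    A minimal sketch of the same idea in base MATLAB, assuming synthetic daily-rate data with a few treatment-related spikes; a Huber loss minimised with fminsearch stands in for the paper's lmRobMM MM-estimator, but it likewise down-weights unusual observations relative to an ordinary least-squares fit of the hyperbolic (Arps) decline curve.

        % Synthetic daily production with treatment-related spikes (unusual observations).
        t = (1:300)';
        q = 900./(1 + 0.8*0.01*t).^(1/0.8) .* (1 + 0.03*randn(300, 1));
        q([60 61 150]) = 1.8*q([60 61 150]);       % well treatments temporarily boost the rate

        arps  = @(p, t) p(1)./(1 + p(2)*p(3)*t).^(1/p(2));   % p = [qi, b, Di]
        huber = @(r, c) sum((abs(r) <= c).*r.^2/2 + (abs(r) > c).*c.*(abs(r) - c/2));

        obj  = @(p) huber(q - arps(p, t), 0.1*median(q));     % robust loss down-weights the spikes
        phat = fminsearch(obj, [max(q), 0.5, 0.005], optimset('MaxFunEvals', 2e4, 'MaxIter', 2e4));
        fprintf('qi = %.0f, b = %.2f, Di = %.4f per day\n', phat(1), phat(2), phat(3));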

  10. MATLAB 3A

    DEFF Research Database (Denmark)

    Freil, Ole; Kristiansen, Heidi; Kaas, Thomas

    MATLAB 3a – Matematiklaboratoriet is a student book for the first half of the 3rd grade. The book contains four chapters: 'Large numbers and ways of calculating', 'Can you draw it?', 'Multiplication and division', and 'Investigate data and chance'.

  11. Digital speech processing using Matlab

    CERN Document Server

    Gopi, E S

    2014-01-01

    Digital Speech Processing Using Matlab deals with digital speech pattern recognition, speech production model, speech feature extraction, and speech compression. The book is written in a manner that is suitable for beginners pursuing basic research in digital speech processing. Matlab illustrations are provided for most topics to enable better understanding of concepts. This book also deals with the basic pattern recognition techniques (illustrated with speech signals using Matlab) such as PCA, LDA, ICA, SVM, HMM, GMM, BPN, and KSOM.

  12. Solving applied mathematical problems with Matlab

    CERN Document Server

    Xue, Dingyu

    2008-01-01

    Computer Mathematics Language-An Overview. Fundamentals of MATLAB Programming. Calculus Problems. MATLAB Computations of Linear Algebra Problems. Integral Transforms and Complex Variable Functions. Solutions to Nonlinear Equations and Optimization Problems. MATLAB Solutions to Differential Equation Problems. Solving Interpolations and Approximations Problems. Solving Probability and Mathematical Statistics Problems. Nontraditional Solution Methods for Mathematical Problems.

  13. Channel Access Client Toolbox for Matlab

    International Nuclear Information System (INIS)

    2002-01-01

    This paper reports on the MATLAB Channel Access (MCA) Toolbox, a MATLAB [1] interface to the EPICS Channel Access (CA) client library. We are developing the toolbox for SPEAR3 accelerator controls, but it is of general use for accelerator and experimental physics applications programming. It is packaged as a MATLAB toolbox to allow easy development of complex CA client applications entirely in MATLAB. The benefits include: the ability to calculate and display parameters that use EPICS process variables as inputs, availability of MATLAB graphics tools for user interface design, and integration with the MATLAB-based accelerator modeling software - Accelerator Toolbox [2-4]. Another purpose of this paper is to propose a feasible path to a synergy between accelerator control systems and accelerator simulation codes, the idea known as on-line accelerator model

  14. Absolute Distances to Nearby Type Ia Supernovae via Light Curve Fitting Methods

    Science.gov (United States)

    Vinkó, J.; Ordasi, A.; Szalai, T.; Sárneczky, K.; Bányai, E.; Bíró, I. B.; Borkovits, T.; Hegedüs, T.; Hodosán, G.; Kelemen, J.; Klagyivik, P.; Kriskovics, L.; Kun, E.; Marion, G. H.; Marschalkó, G.; Molnár, L.; Nagy, A. P.; Pál, A.; Silverman, J. M.; Szakáts, R.; Szegedi-Elek, E.; Székely, P.; Szing, A.; Vida, K.; Wheeler, J. C.

    2018-06-01

    We present a comparative study of absolute distances to a sample of very nearby, bright Type Ia supernovae (SNe) derived from high cadence, high signal-to-noise, multi-band photometric data. Our sample consists of four SNe: 2012cg, 2012ht, 2013dy and 2014J. We present new homogeneous, high-cadence photometric data in Johnson–Cousins BVRI and Sloan g′r′i′z′ bands taken from two sites (Piszkesteto and Baja, Hungary), and the light curves are analyzed with publicly available light curve fitters (MLCS2k2, SNooPy2 and SALT2.4). When comparing the best-fit parameters provided by the different codes, it is found that the distance moduli of moderately reddened SNe Ia agree within ≲0.2 mag, and the agreement is even better (≲0.1 mag) for the highest signal-to-noise BVRI data. For the highly reddened SN 2014J the dispersion of the inferred distance moduli is slightly higher. These SN-based distances are in good agreement with the Cepheid distances to their host galaxies. We conclude that the current state-of-the-art light curve fitters for Type Ia SNe can provide consistent absolute distance moduli having less than ∼0.1–0.2 mag uncertainty for nearby SNe. Still, there is room for future improvements to reach the desired ∼0.05 mag accuracy in the absolute distance modulus.

  15. DACE - A Matlab Kriging Toolbox

    DEFF Research Database (Denmark)

    2002-01-01

    DACE, Design and Analysis of Computer Experiments, is a Matlab toolbox for working with kriging approximations to computer models.

  16. Curve Fitting via the Criterion of Least Squares. Applications of Algebra and Elementary Calculus to Curve Fitting. [and] Linear Programming in Two Dimensions: I. Applications of High School Algebra to Operations Research. Modules and Monographs in Undergraduate Mathematics and Its Applications Project. UMAP Units 321, 453.

    Science.gov (United States)

    Alexander, John W., Jr.; Rosenberg, Nancy S.

    This document consists of two modules. The first of these views applications of algebra and elementary calculus to curve fitting. The user is provided with information on how to: 1) construct scatter diagrams; 2) choose an appropriate function to fit specific data; 3) understand the underlying theory of least squares; 4) use a computer program to…

  17. Introduction to TAFI - A Matlab® toolbox for analysis of flexural isostasy

    Science.gov (United States)

    Jha, S.; Harry, D. L.; Schutt, D.

    2016-12-01

    The isostatic response to vertical tectonic loads emplaced on thin elastic plates overlying an inviscid substrate, and the corresponding gravity anomalies, are commonly modeled using well established theories and methodologies of flexural analysis. However, such analysis requires some mathematical and coding expertise on the part of users. With that in mind, we designed a new interactive Matlab® toolbox called Toolbox for Analysis of Flexural Isostasy (TAFI). TAFI allows users to create forward models (2-D and 3-D) of flexural deformation of the lithosphere and the resulting gravity anomaly. TAFI computes Green's functions for flexure of the elastic plate subjected to point or line loads, and analytical solutions for harmonic loads. Flexure due to non-impulsive, distributed 2-D or 3-D loads is computed by convolving the appropriate Green's function with a user-supplied spatially discretized load function. The gravity anomaly associated with each density interface is calculated by using the Fourier Transform of the flexural deflection of these interfaces and estimating the gravity in the wavenumber domain. All models created in TAFI are based on Matlab's intrinsic functions and do not require any specialized toolbox, function or library except those distributed with TAFI. Modeling functions within TAFI can be called from the Matlab workspace, from within user-written programs or from TAFI's graphical user interface (GUI). The GUI enables the user to model the flexural deflection of the lithosphere interactively, enabling real-time comparison of the model fit with observed data constraining the flexural deformation and gravity, and facilitating a rapid search for the best fitting flexural model. TAFI is a very useful teaching and research tool and has been tested rigorously in graduate-level teaching and basic research environments.
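
    The underlying physics is the thin elastic plate flexure equation D*del^4(w) + drho*g*w = q. A generic wavenumber-domain sketch of solving that equation for an arbitrary gridded load is given below; it uses only base MATLAB, is not TAFI's own API, and all parameter values are illustrative.

        % Generic spectral solution of plate flexure, D*del^4(w) + drho*g*w = q(x,y).
        nx = 256; dx = 2e3;                                   % 256 x 256 grid, 2 km spacing
        q  = zeros(nx);
        q(120:136, 120:136) = 2700*9.81*2000;                 % hypothetical 2 km topographic load (Pa)

        Te = 25e3; E = 70e9; nu = 0.25;                       % effective elastic thickness and moduli
        D  = E*Te^3/(12*(1 - nu^2));                          % flexural rigidity
        drho = 3300 - 2700; g = 9.81;                         % mantle-infill density contrast

        k = 2*pi*[0:nx/2, -nx/2+1:-1]/(nx*dx);                % FFT wavenumbers (rad/m)
        [kx, ky] = meshgrid(k);
        w = real(ifft2(fft2(q)./(D*(kx.^2 + ky.^2).^2 + drho*g)));   % deflection of the plate (m)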

  18. Matpar: Parallel Extensions for MATLAB

    Science.gov (United States)

    Springer, P. L.

    1998-01-01

    Matpar is a set of client/server software that allows a MATLAB user to take advantage of a parallel computer for very large problems. The user can replace calls to certain built-in MATLAB functions with calls to Matpar functions.

  19. Introduction à MATLAB

    OpenAIRE

    Zenou, Emmanuel

    2013-01-01

    This introduction to MatLab aims to familiarize the reader with a tool that is widely used by the scientific community in laboratories and in industry. It also aims to introduce programming and algorithmics (for those who have never touched them), which is indispensable for any good engineer today. Indeed, many of the notions introduced here are not specific to MatLab but apply to any structured language such as C/C++, Java, etc.

  20. A three-parameter langmuir-type model for fitting standard curves of sandwich enzyme immunoassays with special attention to the α-fetoprotein assay

    NARCIS (Netherlands)

    Kortlandt, W.; Endeman, H.J.; Hoeke, J.O.O.

    In a simplified approach to the reaction kinetics of enzyme-linked immunoassays, a Langmuir-type equation y = [ax/(b + x)] + c was derived. This model proved to be superior to logit-log and semilog models in the curve-fitting of standard curves. An assay for α-fetoprotein developed in our laboratory
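
    A sketch of how such a three-parameter standard curve could be fitted and inverted in MATLAB, assuming made-up calibrator data and the Optimization Toolbox's lsqcurvefit; the authors' own software and data are not reproduced here.

        % Made-up standard curve: response y versus standard concentration x.
        x = [0 5 10 25 50 100 200 400]';                      % e.g. kU/L
        y = [0.04 0.18 0.31 0.62 0.95 1.32 1.61 1.80]';       % e.g. absorbance

        lang = @(p, x) p(1)*x./(p(2) + x) + p(3);             % y = a*x/(b + x) + c
        phat = lsqcurvefit(lang, [max(y), median(x), min(y)], x, y);

        % Invert the fitted curve to read unknown concentrations from new responses
        conc = @(ynew) phat(2)*(ynew - phat(3))./(phat(1) - (ynew - phat(3)));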

  1. Characterization of acid functional groups of carbon dots by nonlinear regression data fitting of potentiometric titration curves

    Science.gov (United States)

    Alves, Larissa A.; de Castro, Arthur H.; de Mendonça, Fernanda G.; de Mesquita, João P.

    2016-05-01

    The oxygenated functional groups present on the surface of carbon dots with an average size of 2.7 ± 0.5 nm were characterized by a variety of techniques. In particular, we discussed the fit data of potentiometric titration curves using a nonlinear regression method based on the Levenberg-Marquardt algorithm. The results obtained by statistical treatment of the titration curve data showed that the best fit was obtained considering the presence of five Brønsted-Lowry acids on the surface of the carbon dots with constant ionization characteristics of carboxylic acids, cyclic ester, phenolic and pyrone-like groups. The total number of oxygenated acid groups obtained was 5 mmol g-1, with approximately 65% (∼2.9 mmol g-1) originating from groups with pKa titrated and initial concentration of HCl solution. Finally, we believe that the methodology used here, together with other characterization techniques, is a simple, fast and powerful tool to characterize the complex acid-base properties of these so interesting and intriguing nanoparticles.

  2. KALMTOOL for use with MATLAB

    DEFF Research Database (Denmark)

    Nørgaard, Peter Magnus; Ravn, Ole; Poulsen, Niels Kjølstad

    2003-01-01

    The KALMTOOL toolbox is a set of MATLAB tools for state estimation for nonlinear systems. The toolbox contains functions for extended Kalman filtering as well as for two new filters called the DD1 filter and the DD2 filter. The toolbox specifically addresses the problem of not having observations available at all sampling instants. All functions are available as m-functions but for faster (much faster!) execution, the DD1 and DD2 filters are also available as C Mex files for MATLAB under Windows and Linux. The toolbox requires MATLAB 6. No additional toolboxes are required.

  3. An interactive graphics program to retrieve, display, compare, manipulate, curve fit, difference and cross plot wind tunnel data

    Science.gov (United States)

    Elliott, R. D.; Werner, N. M.; Baker, W. M.

    1975-01-01

    The Aerodynamic Data Analysis and Integration System (ADAIS) is described: a highly interactive computer graphics program capable of manipulating large quantities of data such that addressable elements of a data base can be called up for graphic display, compared, curve fit, stored, retrieved, differenced, etc. The general nature of the system is evidenced by the fact that limited usage has already occurred with data bases consisting of thermodynamic, basic loads, and flight dynamics data. Productivity using ADAIS of five times that for conventional manual methods of wind tunnel data analysis is routinely achieved. In wind tunnel data analysis, data from one or more runs of a particular test may be called up and displayed along with data from one or more runs of a different test. Curves may be faired through the data points by any of four methods, including cubic spline and least squares polynomial fit up to seventh order.
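
    The two fairing options mentioned above, a least-squares polynomial of up to seventh order and a cubic spline, map directly onto base MATLAB's polyfit/polyval and spline; a small sketch on invented lift-coefficient data (not ADAIS itself) follows.

        % Invented wind-tunnel style data: lift coefficient versus angle of attack (deg).
        alpha = (-4:2:20)';
        CL    = 0.1*alpha + 0.002*alpha.^2 - 1e-4*alpha.^3 + 0.02*randn(size(alpha));

        aq = linspace(min(alpha), max(alpha), 200)';
        [p7, ~, mu] = polyfit(alpha, CL, 7);       % centred/scaled 7th-order least-squares fairing
        CL7 = polyval(p7, aq, [], mu);
        CLs = spline(alpha, CL, aq);               % cubic-spline fairing through the same points

        plot(alpha, CL, 'o', aq, CL7, '-', aq, CLs, '--');
        legend('data', '7th-order polynomial', 'cubic spline');
        xlabel('\alpha (deg)'); ylabel('C_L');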

  4. SiFTO: An Empirical Method for Fitting SN Ia Light Curves

    Science.gov (United States)

    Conley, A.; Sullivan, M.; Hsiao, E. Y.; Guy, J.; Astier, P.; Balam, D.; Balland, C.; Basa, S.; Carlberg, R. G.; Fouchez, D.; Hardin, D.; Howell, D. A.; Hook, I. M.; Pain, R.; Perrett, K.; Pritchet, C. J.; Regnault, N.

    2008-07-01

    We present SiFTO, a new empirical method for modeling Type Ia supernova (SN Ia) light curves by manipulating a spectral template. We make use of high-redshift SN data when training the model, allowing us to extend it bluer than rest-frame U. This increases the utility of our high-redshift SN observations by allowing us to use more of the available data. We find that when the shape of the light curve is described using a stretch prescription, applying the same stretch at all wavelengths is not an adequate description. SiFTO therefore uses a generalization of stretch which applies different stretch factors as a function of both the wavelength of the observed filter and the stretch in the rest-frame B band. We compare SiFTO to other published light-curve models by applying them to the same set of SN photometry, and demonstrate that SiFTO and SALT2 perform better than the alternatives when judged by the scatter around the best-fit luminosity distance relationship. We further demonstrate that when SiFTO and SALT2 are trained on the same data set the cosmological results agree. Based on observations obtained with MegaPrime/MegaCam, a joint project of CFHT and CEA/DAPNIA, at the Canada-France-Hawaii Telescope (CFHT) which is operated by the National Research Council (NRC) of Canada, the Institut National des Sciences de l'Univers of the Centre National de la Recherche Scientifique (CNRS) of France, and the University of Hawaii. This work is based in part on data products produced at the Canadian Astronomy Data Centre as part of the Canada-France-Hawaii Telescope Legacy Survey, a collaborative project of NRC and CNRS.

  5. INFOS: spectrum fitting software for NMR analysis

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Albert A., E-mail: alsi@nmr.phys.chem.ethz.ch [ETH Zürich, Physical Chemistry (Switzerland)

    2017-02-15

    Software for fitting of NMR spectra in MATLAB is presented. Spectra are fitted in the frequency domain, using Fourier transformed lineshapes, which are derived using the experimental acquisition and processing parameters. This yields more accurate fits compared to common fitting methods that use Lorentzian or Gaussian functions. Furthermore, a very time-efficient algorithm for calculating and fitting spectra has been developed. The software also performs initial peak picking, followed by subsequent fitting and refinement of the peak list, by iteratively adding and removing peaks to improve the overall fit. Estimation of error on fitting parameters is performed using a Monte-Carlo approach. Many fitting options allow the software to be flexible enough for a wide array of applications, while still being straightforward to set up with minimal user input.

  6. Advanced Dynamics Analytical and Numerical Calculations with MATLAB

    CERN Document Server

    Marghitu, Dan B

    2012-01-01

    Advanced Dynamics: Analytical and Numerical Calculations with MATLAB provides a thorough, rigorous presentation of kinematics and dynamics while using MATLAB as an integrated tool to solve problems. Topics presented are explained thoroughly and directly, allowing fundamental principles to emerge through applications from areas such as multibody systems, robotics, spacecraft and design of complex mechanical devices. This book differs from others in that it uses symbolic MATLAB for both theory and applications. Special attention is given to solutions that are solved analytically and numerically using MATLAB. The illustrations and figures generated with MATLAB reinforce visual learning while an abundance of examples offer additional support. This book also: Provides solutions analytically and numerically using MATLAB Illustrations and graphs generated with MATLAB reinforce visual learning for students as they study Covers modern technical advancements in areas like multibody systems, robotics, spacecraft and des...

  7. Linear algebra applications using Matlab software

    Directory of Open Access Journals (Sweden)

    Cornelia Victoria Anghel

    2005-10-01

    The paper presents two ways of generating special matrices using functions included in the MatLab software package. The MatLab software package contains a set of functions that generate special matrices used in linear algebra applications and in signal processing from different activity fields. The paper presents two types of special matrices that can be generated using syntaxes written in the dialog window of the MatLab software; to validate the command the Enter key is pressed. The applications presented in the paper represent examples of numerical calculus using the MatLab software and belong to the scientific field „Computer Assisted Mathematics”, thus creating the symbiosis between mathematics and informatics.

  8. THERMODYNAMIC STUDY OF CHARGE-TRANSFER COMPLEX ...

    African Journals Online (AJOL)

    The formation constant of the resulting complex was evaluated from the absorbance-mole ratio data by using a non-linear least squares curve-fitting program (the Curve Fitting Toolbox in MATLAB). The program is based on the iterative adjustment of calculated absorbances to the observed values. The observed absorbance of ...

  9. Cyclotron beam dynamic simulations in MATLAB

    International Nuclear Information System (INIS)

    Karamysheva, G.A.; Karamyshev, O.V.; Lepkina, O.E.

    2008-01-01

    MATLAB is useful for beam dynamic simulations in cyclotrons. Programming in an easy-to-use environment permits creation of models in a short space of time. Advanced graphical tools of MATLAB give good visualization features to created models. The beam dynamic modeling results with an example of two different cyclotron designs are presented. Programming with MATLAB opens wide possibilities of the development of the complex program, able to perform complete block of calculations for the design of the accelerators

  10. Matlab Based LOCO

    International Nuclear Information System (INIS)

    Portmann, Greg; Safranek, James; Huang, Xiaobiao

    2011-01-01

    The LOCO algorithm has been used by many accelerators around the world. Although the uses for LOCO vary, the most common use has been to find calibration errors and correct the optics functions. The light source community in particular has made extensive use of the LOCO algorithms to tightly control the beta function and coupling. Maintaining high quality beam parameters requires constant attention so a relatively large effort was put into software development for the LOCO application. The LOCO code was originally written in FORTRAN. This code worked fine but it was somewhat awkward to use. For instance, the FORTRAN code itself did not calculate the model response matrix. It required a separate modeling code such as MAD to calculate the model matrix then one manually loads the data into the LOCO code. As the number of people interested in LOCO grew, it required making it easier to use. The decision to port LOCO to Matlab was relatively easy. It's best to use a matrix programming language with good graphics capability; Matlab was also being used for high level machine control; and the accelerator modeling code AT, (5), was already developed for Matlab. Since LOCO requires collecting and processing a relative large amount of data, it is very helpful to have the LOCO code compatible with the high level machine control, (3). A number of new features were added while porting the code from FORTRAN and new methods continue to evolve, (7)(9). Although Matlab LOCO was written with AT as the underlying tracking code, a mechanism to connect to other modeling codes has been provided.

  11. GURU v2.0: An interactive Graphical User interface to fit rheometer curves in Han’s model for rubber vulcanization

    OpenAIRE

    Milani, G.; Milani, F.

    2016-01-01

    A GUI software (GURU) for experimental data fitting of rheometer curves in Natural Rubber (NR) vulcanized with sulphur at different curing temperatures is presented. Experimental data are automatically loaded in GURU from an Excel spreadsheet coming from the output of the experimental machine (moving die rheometer). To fit the experimental data, the general reaction scheme proposed by Han and co-workers for NR vulcanized with sulphur is considered. From the simplified kinetic scheme adopted, ...

  12. Accelerating MATLAB with GPU computing a primer with examples

    CERN Document Server

    Suh, Jung W

    2013-01-01

    Beyond simulation and algorithm development, many developers increasingly use MATLAB even for product deployment in computationally heavy fields. This often demands that MATLAB codes run faster by leveraging the distributed parallelism of Graphics Processing Units (GPUs). While MATLAB successfully provides high-level functions as a simulation tool for rapid prototyping, the underlying details and knowledge needed for utilizing GPUs make MATLAB users hesitate to step into it. Accelerating MATLAB with GPUs offers a primer on bridging this gap. Starting with the basics, setting up MATLAB for

  13. Use of a non-linear method for including the mass uncertainty of gravimetric standards and system measurement errors in the fitting of calibration curves for XRFA freeze-dried UNO3 standards

    International Nuclear Information System (INIS)

    Pickles, W.L.; McClure, J.W.; Howell, R.H.

    1978-05-01

    A sophisticated nonlinear multiparameter fitting program was used to produce a best-fit calibration curve for the response of an x-ray fluorescence analyzer to uranium nitrate, freeze-dried, 0.2% accurate, gravimetric standards. The program is based on the unconstrained minimization subroutine VA02A. The program considers the mass values of the gravimetric standards as parameters to be fit along with the normal calibration curve parameters. The fitting procedure weights the system errors and the mass errors in a consistent way. The resulting best-fit calibration curve parameters reflect the fact that the masses of the standard samples are measured quantities with a known error. Error estimates for the calibration curve parameters can be obtained from the curvature of the 'Chi-Squared Matrix' or from error relaxation techniques. It was shown that nondispersive XRFA of 0.1 to 1 mg freeze-dried UNO3 can have an accuracy of 0.2% in 1000 s
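
    A toy MATLAB version of the same idea, in which the standard masses are treated as additional fit parameters constrained by their 0.2% uncertainty; the quadratic response model, all numbers, and the use of fminsearch (rather than VA02A) are assumptions made purely for illustration.

        % Quadratic response versus mass, with the standard masses themselves included as
        % fit parameters penalised by their gravimetric uncertainty.
        m_nom  = [0.1 0.2 0.4 0.6 0.8 1.0]';                  % nominal standard masses (mg)
        sig_m  = 0.002*m_nom;                                 % 0.2% mass errors
        counts = 5e4*m_nom - 4e3*m_nom.^2 + 150*randn(6, 1);  % simulated analyzer responses
        sig_c  = 200*ones(6, 1);                              % system (counting) errors

        cal  = @(p, m) p(1)*m + p(2)*m.^2;                    % assumed calibration-curve shape
        chi2 = @(v) sum(((counts - cal(v(1:2), v(3:8)))./sig_c).^2) + ...
                    sum(((v(3:8) - m_nom)./sig_m).^2);        % both error sources weighted consistently

        vhat = fminsearch(chi2, [5e4; 0; m_nom], optimset('MaxFunEvals', 4e4, 'MaxIter', 4e4));
        fprintf('linear term %.3g, quadratic term %.3g (counts per mg)\n', vhat(1), vhat(2));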

  14. Improved liver R2* mapping by pixel-wise curve fitting with adaptive neighborhood regularization.

    Science.gov (United States)

    Wang, Changqing; Zhang, Xinyuan; Liu, Xiaoyun; He, Taigang; Chen, Wufan; Feng, Qianjin; Feng, Yanqiu

    2018-08-01

    To improve liver R2* mapping by incorporating adaptive neighborhood regularization into pixel-wise curve fitting. Magnetic resonance imaging R2* mapping remains challenging because of the serial images with low signal-to-noise ratio. In this study, we proposed to exploit the neighboring pixels as regularization terms and adaptively determine the regularization parameters according to the interpixel signal similarity. The proposed algorithm, called pixel-wise curve fitting with adaptive neighborhood regularization (PCANR), was compared with the conventional nonlinear least squares (NLS) and nonlocal means filter-based NLS algorithms on simulated, phantom, and in vivo data. Visually, the PCANR algorithm generates R2* maps with significantly reduced noise and well-preserved tiny structures. Quantitatively, the PCANR algorithm produces R2* maps with lower root mean square errors at varying R2* values and signal-to-noise-ratio levels compared with the NLS and nonlocal means filter-based NLS algorithms. For high R2* values under low signal-to-noise-ratio levels, the PCANR algorithm outperforms the NLS and nonlocal means filter-based NLS algorithms in accuracy and precision, in terms of the mean and standard deviation of R2* measurements in selected regions of interest, respectively. The PCANR algorithm can reduce the effect of noise on liver R2* mapping, and the improved measurement precision will benefit the assessment of hepatic iron in clinical practice. Magn Reson Med 80:792-801, 2018. © 2018 International Society for Magnetic Resonance in Medicine.
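
    For reference, the conventional per-pixel fit that PCANR builds on is a mono-exponential decay S(TE) = S0*exp(-R2s*TE); a base-MATLAB sketch on one synthetic pixel is shown below (the neighborhood-regularization terms of PCANR are not reproduced).

        % One synthetic pixel fitted by conventional NLS; PCANR adds adaptively weighted
        % neighborhood terms to this per-pixel objective.
        TE = (0.9:1.2:12.9)*1e-3;                         % echo times (s), illustrative
        S  = 1000*exp(-300*TE) + 20*randn(size(TE));      % noisy signal, true R2* = 300 1/s

        model = @(p, TE) p(1)*exp(-p(2)*TE);              % p = [S0, R2*]
        phat  = fminsearch(@(p) sum((S - model(p, TE)).^2), [max(S), 100]);
        fprintf('S0 = %.0f, R2* = %.0f 1/s\n', phat(1), phat(2));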

  15. Extraction Analysis and Creation of Three-Dimensional Road Profiles Using Matlab OpenCRG Tool

    Directory of Open Access Journals (Sweden)

    Rakesh Hari Borse

    2015-08-01

    In vehicle system dynamics there are wide applications of the simulation of vehicles on road surfaces. These simulation applications are related to vehicle handling, ride comfort and durability. For accurate prediction of results there is a need for a reliable and efficient road representation. The efficient representation of road surface profiles is to represent them in three-dimensional space. This is made possible by the CRG (Curved Regular Grid) approach. OpenCRG is a completely open source project including a tool suite for the creation, modification and evaluation of road surfaces. Its objective is a standardized detailed road surface description and it may be used for applications like tire models, vibrations or driving simulation. The Matlab tool suite of OpenCRG provides powerful modification and creation tools and allows visualization of the 3D road data representation. The current research focuses on basic concepts of OpenCRG and its Matlab environment. The extraction of longitudinal two-dimensional road profiles from the three-dimensional CRG format is researched. The creation of simple virtual three-dimensional roads has been programmed. A Matlab software tool to extract, create and analyze the three-dimensional road profiles is to be developed.

  16. Two Aspects of the Simplex Model: Goodness of Fit to Linear Growth Curve Structures and the Analysis of Mean Trends.

    Science.gov (United States)

    Mandys, Frantisek; Dolan, Conor V.; Molenaar, Peter C. M.

    1994-01-01

    Studied the conditions under which the quasi-Markov simplex model fits a linear growth curve covariance structure and determined when the model is rejected. Presents a quasi-Markov simplex model with structured means and gives an example. (SLD)

  17. TecLines: A MATLAB-Based Toolbox for Tectonic Lineament Analysis from Satellite Images and DEMs, Part 2: Line Segments Linking and Merging

    Directory of Open Access Journals (Sweden)

    Mehdi Rahnama

    2014-11-01

    Extraction and interpretation of tectonic lineaments is one of the routines for mapping large areas using remote sensing data. However, this is a subjective and time-consuming process. It is difficult to choose an optimal lineament extraction method in order to reduce subjectivity and obtain vectors similar to what an analyst would manually extract. The objective of this study is the implementation, evaluation and comparison of Hough transform, segment merging and polynomial fitting methods towards automated tectonic lineament mapping. For this purpose we developed a new MATLAB-based toolbox (TecLines). The proposed toolbox capabilities were validated using a synthetic Digital Elevation Model (DEM) and tested in the Andarab fault zone (Afghanistan), where specific fault structures are known. In this study, we used filters in both the frequency and spatial domains and the tensor voting framework to produce binary edge maps. We used the Hough transform to extract linear image discontinuities. We used B-splines as a polynomial curve fitting method to eliminate artificial line segments that are of no interest and to link discontinuous segments with similar trends. We performed statistical analyses in order to compare the final image discontinuity maps with an existing reference map.

  18. Perancangan Kendali Pid Dengan Matlab

    OpenAIRE

    Sukamta, Sri

    2010-01-01

    PID controller design has so far relied on trial-and-error methods with time-consuming calculations. MATLAB, equipped with the Control Toolbox, helps the designer to see the response of various combinations of constants under different input variations. The use of MATLAB greatly assists the designer in determining the combination of P, I, and D controller terms that produces a good and simple control system.

  19. Learning Matlab a problem solving approach

    CERN Document Server

    Gander, Walter

    2015-01-01

    This comprehensive and stimulating introduction to Matlab, a computer language now widely used for technical computing, is based on an introductory course held at Qian Weichang College, Shanghai University, in the fall of 2014.  Teaching and learning a substantial programming language aren’t always straightforward tasks. Accordingly, this textbook is not meant to cover the whole range of this high-performance technical programming environment, but to motivate first- and second-year undergraduate students in mathematics and computer science to learn Matlab by studying representative problems, developing algorithms and programming them in Matlab. While several topics are taken from the field of scientific computing, the main emphasis is on programming. A wealth of examples are completely discussed and solved, allowing students to learn Matlab by doing: by solving problems, comparing approaches and assessing the proposed solutions.

  20. MATLAB Software Versions and Licenses for the Peregrine System |

    Science.gov (United States)

    High-Performance Computing | NREL. MATLAB Software Versions and Licenses for the Peregrine System. Learn about the MATLAB software versions and licenses available on the Peregrine system; the version on Peregrine is R2017b. Licenses: MATLAB is proprietary software. As such, users have access to a limited number

  1. Accelerator Toolbox for MATLAB

    International Nuclear Information System (INIS)

    Terebilo, Andrei

    2001-01-01

    This paper introduces Accelerator Toolbox (AT)--a collection of tools to model particle accelerators and beam transport lines in the MATLAB environment. At SSRL, it has become the modeling code of choice for the ongoing design and future operation of the SPEAR 3 synchrotron light source. AT was designed to take advantage of power and simplicity of MATLAB--commercially developed environment for technical computing and visualization. Many examples in this paper illustrate the advantages of the AT approach and contrast it with existing accelerator code frameworks

  2. Digital image processing an algorithmic approach with Matlab

    CERN Document Server

    Qidwai, Uvais

    2009-01-01

    Introduction to Image Processing and the MATLAB Environment: Introduction; Digital Image Definitions: Theoretical Account; Image Properties; MATLAB Algorithmic Account; MATLAB Code. Image Acquisition, Types, and File I/O: Image Acquisition; Image Types and File I/O; Basics of Color Images; Other Color Spaces; Algorithmic Account; MATLAB Code. Image Arithmetic: Introduction; Operator Basics; Theoretical Treatment; Algorithmic Treatment; Coding Examples. Affine and Logical Operations, Distortions, and Noise in Images: Introduction; Affine Operations; Logical Operators; Noise in Images; Distortions in Images; Algorithmic Account

  3. Cuckoo Search with Lévy Flights for Weighted Bayesian Energy Functional Optimization in Global-Support Curve Data Fitting

    Directory of Open Access Journals (Sweden)

    Akemi Gálvez

    2014-01-01

    for data fitting by using global-support approximating curves. By global-support curves we mean curves expressed as a linear combination of basis functions whose support is the whole domain of the problem, as opposed to other common approaches in CAD/CAM and computer graphics driven by piecewise functions (such as B-splines and NURBS) that provide local control of the shape of the curve. Our method applies a powerful nature-inspired metaheuristic algorithm called cuckoo search, introduced recently to solve optimization problems. A major advantage of this method is its simplicity: cuckoo search requires only two parameters, many fewer than other metaheuristic approaches, so the parameter tuning becomes a very simple task. The paper shows that this new approach can be successfully used to solve our optimization problem. To check the performance of our approach, it has been applied to five illustrative examples of different types, including open and closed 2D and 3D curves that exhibit challenging features, such as cusps and self-intersections. Our results show that the method performs pretty well, being able to solve our minimization problem in an astonishingly straightforward way.

  4. Une introduction à MATLAB c

    OpenAIRE

    CREMONA, Christian; LABORATOIRE CENTRAL DES PONTS ET CHAUSSEES - LCPC

    2002-01-01

    MATLAB offers all the features of modern approaches to programming: object-oriented programming based on class hierarchies, and event-driven programming of graphics. MATLAB provides very complete online help, in HTML format, for the available functions. RESEARCH REPORT

  5. Test Generator for MATLAB Simulations

    Science.gov (United States)

    Henry, Joel

    2011-01-01

    MATLAB Automated Test Tool, version 3.0 (MATT 3.0) is a software package that provides automated tools that reduce the time needed for extensive testing of simulation models that have been constructed in the MATLAB programming language by use of the Simulink and Real-Time Workshop programs. MATT 3.0 runs on top of the MATLAB engine application-program interface to communicate with the Simulink engine. MATT 3.0 automatically generates source code from the models, generates custom input data for testing both the models and the source code, and generates graphs and other presentations that facilitate comparison of the outputs of the models and the source code for the same input data. Context-sensitive and fully searchable help is provided in HyperText Markup Language (HTML) format.

  6. One hundred physics visualizations using Matlab

    CERN Document Server

    Green, Dan

    2014-01-01

    This book provides visualizations of many topics in general physics. The aim is to have an interactive MATLAB script wherein the user can vary parameters in a specific problem and then immediately see the outcome by way of dynamic movies of the response of the system in question. MATLAB tools are used throughout and the software scripts accompany the text in Symbolic Mathematics, Classical Mechanics, Electromagnetism, Waves and Optics, Gases and Fluid Flow, Quantum Mechanics, Special and General Relativity, and Astrophysics and Cosmology. The emphasis is on building up an intuition by running many different parametric choices chosen actively by the user and watching the subsequent behavior of the system. Physics books using MATLAB do not have the range or the intent of this text. They are rather steeped in technical detail. Symbolic math is used extensively and is integral to the aim of using MATLAB tools to accomplish the technical aspects of problem solving.

  7. Efficient Matlab Programs

    Data.gov (United States)

    National Aeronautics and Space Administration — Matlab has a reputation for running slowly. Here are some pointers on how to speed computations, to an often unexpected degree. Subjects currently covered: Matrix...

  8. Determination of the secondary energy from the electron beam with a flattening foil by computer. Percentage depth dose curve fitting using the specific higher order polynomial

    Energy Technology Data Exchange (ETDEWEB)

    Kawakami, H [Kyushu Univ., Beppu, Oita (Japan). Inst. of Balneotherapeutics

    1980-09-01

    A computer program written in FORTRAN is described for determining the secondary energy of the electron beam which passed through a flattening foil, using a time-sharing computer service. The procedure of this program is first to fit the specific higher order polynomial to the measured percentage depth dose curve. Next, the practical range is evaluated by the point of intersection R of the line tangent to the fitted curve at the inflection point P and the given dose E, as shown in Fig. 2. Finally, the secondary energy corresponding to the determined practical range can be obtained from the experimental equation (2.1) between the practical range R (g/cm2) and the electron energy T (MeV). A graph of the fitted polynomial with the inflection points and the practical range can be plotted on a teletype machine at the request of the user. In order to estimate the shapes of percentage depth dose curves corresponding to electron beams of different energies, we tried to find some specific functional relationships between each coefficient of the fitted seventh-degree equation and the incident electron energies. However, exact relationships could not be obtained owing to irregularity among these coefficients.
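
    The procedure described (seventh-degree polynomial fit, inflection point, tangent line, practical range) can be sketched in present-day MATLAB as follows; the depth-dose numbers and the 2% background level are invented, and polyfit/polyder/roots stand in for the original FORTRAN routines.

        % Invented electron-beam percentage depth dose data (depth z in g/cm2, dose in %).
        z  = (0:0.25:6)';
        dd = 100./(1 + exp(3.5*(z - 3))) + 2 + 0.5*randn(size(z));   % sigmoid fall-off plus a 2% tail

        [p, ~, mu] = polyfit(z, dd, 7);          % seventh-degree polynomial in t = (z - mu(1))/mu(2)
        dp = polyder(p); d2p = polyder(dp);      % first and second derivatives (with respect to t)

        % Inflection point P: real root of the second derivative on the falling part of the curve
        r  = roots(d2p);
        r  = real(r(abs(imag(r)) < 1e-9));
        r  = r(r > min(z - mu(1))/mu(2) & r < max(z - mu(1))/mu(2));
        [~, i] = min(polyval(dp, r));            % the steepest fall-off picks the physical inflection
        tP = r(i);  zP = mu(1) + mu(2)*tP;

        % Tangent at P intersected with the background dose level E gives the practical range R
        slope = polyval(dp, tP)/mu(2);           % dDose/dz at the inflection point
        E  = 2;                                  % e.g. bremsstrahlung tail (%)
        R  = zP + (E - polyval(p, tP))/slope;
        fprintf('Inflection at %.2f g/cm2, practical range R = %.2f g/cm2\n', zP, R);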

  9. Optimization of ACC system spacing policy on curved highway

    Science.gov (United States)

    Ma, Jun; Qian, Kun; Gong, Zaiyan

    2017-05-01

    The paper optimizes the original spacing policy adopted with VTH (Variable Time Headway), and proposes to introduce the road curve curvature K into the spacing policy to cope with following the wrong vehicle, or failing to follow the vehicle, owing to radar limitations on curves in an ACC system. Using MATLAB/Simulink, an automobile longitudinal dynamics model is established. Finally, the paper sets up three common cases, in which the vehicle ahead runs at a uniform velocity, runs at an accelerated velocity, or brakes suddenly, simulates these cases on curves with different curvature, analyzes the curve spacing policy from the perspective of safety and vehicle-following efficiency, and draws a conclusion on whether the optimization scheme is effective or not.

  10. An Accelerator Control Middle Layer Using MATLAB

    International Nuclear Information System (INIS)

    Portmann, Gregory J.; Corbett, Jeff; Terebilo, Andrei

    2005-01-01

    Matlab is an interpretive programming language originally developed for convenient use with the LINPACK and EISPACK libraries. Matlab is appealing for accelerator physics because it is matrix-oriented, provides an active workspace for system variables, powerful graphics capabilities, built-in math libraries, and platform independence. A number of accelerator software toolboxes have been written in Matlab -- the Accelerator Toolbox (AT) for model-based machine simulations, LOCO for on-line model calibration, and Matlab Channel Access (MCA) to connect with EPICS. The function of the MATLAB 'MiddleLayer' is to provide a scripting language for machine simulations and on-line control, including non-EPICS based control systems. The MiddleLayer has simplified and streamlined development of high-level applications including configuration control, energy ramp, orbit correction, photon beam steering, ID compensation, beam-based alignment, tune correction and response matrix measurement. The database-driven Middle Layer software is largely machine-independent and easy to port. Six accelerators presently use the software package with more scheduled to come on line soon

  11. GURU v2.0: An interactive Graphical User interface to fit rheometer curves in Han's model for rubber vulcanization

    Science.gov (United States)

    Milani, G.; Milani, F.

    A GUI software (GURU) for experimental data fitting of rheometer curves in Natural Rubber (NR) vulcanized with sulphur at different curing temperatures is presented. Experimental data are automatically loaded in GURU from an Excel spreadsheet coming from the output of the experimental machine (moving die rheometer). To fit the experimental data, the general reaction scheme proposed by Han and co-workers for NR vulcanized with sulphur is considered. From the simplified kinetic scheme adopted, a closed form solution can be found for the crosslink density, with the only limitation that the induction period is excluded from computations. Three kinetic constants must be determined in such a way to minimize the absolute error between normalized experimental data and numerical prediction. Usually, this result is achieved by means of standard least-squares data fitting. On the contrary, GURU works interactively by means of a Graphical User Interface (GUI) to minimize the error and allows an interactive calibration of the kinetic constants by means of sliders. A simple mouse click on the sliders allows the assignment of a value for each kinetic constant and a visual comparison between numerical and experimental curves. Users will thus find optimal values of the constants by means of a classic trial and error strategy. An experimental case of technical relevance is shown as benchmark.

  12. Automatic Curve Fitting Based on Radial Basis Functions and a Hierarchical Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    G. Trejo-Caballero

    2015-01-01

    Curve fitting is a very challenging problem that arises in a wide variety of scientific and engineering applications. Given a set of data points, possibly noisy, the goal is to build a compact representation of the curve that corresponds to the best estimate of the unknown underlying relationship between two variables. Despite the large number of methods available to tackle this problem, it remains challenging and elusive. In this paper, a new method to tackle such a problem using strictly a linear combination of radial basis functions (RBFs) is proposed. To be more specific, we divide the parameter search space into linear and nonlinear parameter subspaces. We use a hierarchical genetic algorithm (HGA) to minimize a model selection criterion, which allows us to automatically and simultaneously determine the nonlinear parameters and then, by the least-squares method through Singular Value Decomposition, to compute the linear parameters. The method is fully automatic and does not require subjective parameters, for example, a smoothing factor or centre locations, to perform the solution. In order to validate the efficacy of our approach, we perform an experimental study with several tests on benchmark smooth functions. A comparative analysis with two successful methods based on RBF networks has been included.
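
    Once the nonlinear RBF parameters (centres and widths) are fixed, whether by the paper's hierarchical GA or, as in this hedged sketch, simply by hand, the linear weights follow from least squares through the SVD.

        % Noisy samples of an unknown curve.
        x = linspace(0, 1, 60)';
        y = sin(2*pi*x) + 0.05*randn(size(x));

        % "Nonlinear" parameters fixed by hand here (the paper tunes them with a hierarchical GA).
        c = linspace(0, 1, 8);  s = 0.15;          % Gaussian RBF centres and a common width
        Phi = exp(-(x - c).^2/(2*s^2));            % 60 x 8 design matrix (implicit expansion)

        % Linear weights by least squares through the singular value decomposition.
        [U, S, V] = svd(Phi, 'econ');
        w = V*(S\(U'*y));
        yfit = Phi*w;                              % fitted curve at the sample points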

  13. Numerical simulation of dimples in airfoil using MATLAB

    Science.gov (United States)

    Booma Devi, P.; Shah, Dilip A.

    2017-05-01

    The aircraft wing is an important research topic which poses great challenges in terms of aerodynamic efficiency. Flow separation control methods are addressed in classical aerodynamics. This study focuses on the influence of dimples on controlling the flow and also increasing the aerodynamic efficiency. The periodic process of placing the cavities on the wing, starting from root to tip, controls the flow separation. The linear variation of the characteristic curve provides information about the flow separation and the control of flow on the upper surface of the airfoil. Three different shapes are utilized, viz., square, rectangle and triangle. The numerical simulation is carried out using the MATLAB package. A preliminary analysis of the flow separation is carried out, focusing on laminar flow separation, which influences the overall lift and drag generation.

  14. Caracterización a impacto de caucho reciclado mediante elementos finitos

    OpenAIRE

    Escribano Castro, Ane

    2015-01-01

    Hyperelastic analysis of recycled rubber using least-squares fitting methods with the MATLAB program and curve fitting with ANSYS. For the viscoelastic part, an optimization algorithm in MATLAB is used. Verification of results and reliability.

  15. Menhir: An Environment for High Performance Matlab

    Directory of Open Access Journals (Sweden)

    Stéphane Chauveau

    1999-01-01

    In this paper we present Menhir, a compiler for generating sequential or parallel code from the Matlab language. The compiler has been designed in the context of using Matlab as a specification language. One of the major features of Menhir is its retargetability to generate parallel and sequential C or Fortran code. We present the compilation process and the target system description for Menhir. Preliminary performance results are given and compared with MCC, the MathWorks Matlab compiler.

  16. An introduction to differential equations using MATLAB

    CERN Document Server

    Butt, Rizwan

    2016-01-01

    An Introduction to Differential Equations using MATLAB exploits the symbolic, numerical, and graphical capabilities of MATLAB to develop a thorough understanding of differential equations algorithms. This book provides the reader with numerous applications, m-files, and practical examples to problems. Balancing theoretical concepts with computational speed and accuracy, the book includes numerous short programs in MATLAB that can be used to solve problems involving first- and higher-order differential equations, Laplace transforms, linear systems of differential equations, numerical solutions of differential equations, computer graphics, and more. The author emphasizes the basic ideas of analytical and numerical techniques and the uses of modern mathematical software (MATLAB) rather than relying only on complex mathematical derivations, for engineers, mathematicians, computer scientists, and physicists, or for use as a textbook in applied or computational courses. A CD-ROM with all the figures, codes, solutions, appendices...

  17. Accelerator Modeling with MATLAB Accelerator Toolbox

    International Nuclear Information System (INIS)

    2002-01-01

    This paper introduces Accelerator Toolbox (AT)--a collection of tools to model storage rings and beam transport lines in the MATLAB environment. The objective is to illustrate the flexibility and efficiency of the AT-MATLAB framework. The paper discusses three examples of problems that are analyzed frequently in connection with ring-based synchrotron light sources

  18. Faults Detection in a Photovoltaic Generator by Using Matlab Simulink and the chipKIT Max32 Board

    Directory of Open Access Journals (Sweden)

    Riadh Khenfer

    2014-01-01

    This paper presents a laboratory with equipment and an algorithm for teaching graduate students the monitoring and diagnosis of PV arrays. The contribution is the presentation of an algorithm to detect and localize faults in a photovoltaic generator when a limited number of voltage sensors is used. An I-V curve tracer using a capacitive load is exploited to measure the I-V characteristics of PV arrays. Such measurement allows characterization of PV arrays on-site, under real operating conditions, and also provides information for the detection of potential array anomalies. This I-V curve tracer is based on a microcontroller board family called chipKIT Max32, which is a popular platform for physical computing. A user program can be developed visually on the PC side via the graphical user interface (GUI) in Matlab Simulink, where the chipKIT Max32 of Digilent, a low-cost board, is designed for use with the Arduino MPIDE software. The results obtained for the partial shading fault showed the effectiveness of the proposed diagnosis method and the good functioning of this board within the Matlab/Simulink environment.

  19. Growth curve models and statistical diagnostics

    CERN Document Server

    Pan, Jian-Xin

    2002-01-01

    Growth-curve models are generalized multivariate analysis-of-variance models. These models are especially useful for investigating growth problems on short times in economics, biology, medical research, and epidemiology. This book systematically introduces the theory of the GCM with particular emphasis on their multivariate statistical diagnostics, which are based mainly on recent developments made by the authors and their collaborators. The authors provide complete proofs of theorems as well as practical data sets and MATLAB code.

  20. Theoretical derivation of anodizing current and comparison between fitted curves and measured curves under different conditions

    Science.gov (United States)

    Chong, Bin; Yu, Dongliang; Jin, Rong; Wang, Yang; Li, Dongdong; Song, Ye; Gao, Mingqi; Zhu, Xufei

    2015-04-01

    Anodic TiO2 nanotubes have been studied extensively for many years. However, the growth kinetics still remains unclear. The systematic study of the current transient under constant anodizing voltage has not been mentioned in the original literature. Here, a derivation and its corresponding theoretical formula are proposed to overcome this challenge. In this paper, the theoretical expressions for the time dependent ionic current and electronic current are derived to explore the anodizing process of Ti. The anodizing current-time curves under different anodizing voltages and different temperatures are experimentally investigated in the anodization of Ti. Furthermore, the quantitative relationship between the thickness of the barrier layer and anodizing time, and the relationships between the ionic/electronic current and temperatures are proposed in this paper. All of the current-transient plots can be fitted consistently by the proposed theoretical expressions. Additionally, it is the first time that the coefficient A of the exponential relationship (ionic current j_ion = A exp(BE)) has been determined under various temperatures and voltages. The results indicate that as temperature and voltage increase, ionic current and electronic current both increase. The temperature has a larger effect on electronic current than ionic current. These results can promote the research of kinetics from a qualitative to quantitative level.
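
    A minimal numerical sketch of the kind of current transient described above is given below, assuming the exponential high-field relation j_ion = A*exp(BE) with E = U/h and a barrier layer that thickens in proportion to the ionic current; the constants A, B, k and h0 are illustrative, not the paper's fitted values:

      % Minimal sketch (illustrative constants, not fitted values): under constant
      % voltage U the field across the barrier layer is E = U/h, the ionic current
      % follows j_ion = A*exp(B*E), and the layer thickens in proportion to j_ion,
      % so the computed current decays with time.
      A  = 1e-12;      % A/cm^2, pre-exponential coefficient (assumed)
      B  = 1.7e-6;     % cm/V, field coefficient (assumed)
      U  = 40;         % V, anodizing voltage
      k  = 5e-5;       % cm^3/(A*s), thickness growth per unit ionic current (assumed)
      h0 = 3e-6;       % cm, initial barrier-layer thickness (assumed)
      dhdt   = @(t, h) k * A * exp(B * U ./ h);   % barrier-layer growth law
      [t, h] = ode45(dhdt, [0 600], h0);          % integrate over 600 s
      j = A * exp(B * U ./ h);                    % resulting ionic current transient
      semilogy(t, j); xlabel('time (s)'); ylabel('j_{ion} (A cm^{-2})');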

  2. A guide to MATLAB for beginners and experienced users

    CERN Document Server

    Hunt, Brian R; Rosenberg, Jonathan M

    2014-01-01

    Now in its third edition, this outstanding textbook explains everything you need to get started using MATLAB®. It contains concise explanations of essential MATLAB commands, as well as easily understood instructions for using MATLAB's programming features, graphical capabilities, simulation models, and rich desktop interface. MATLAB 8 and its new user interface is treated extensively in the book. New features in this edition include: a complete treatment of MATLAB's publish feature; new material on MATLAB graphics, enabling the user to master quickly the various symbolic and numerical plotting routines; and a robust presentation of MuPAD® and how to use it as a stand-alone platform. The authors have also updated the text throughout, reworking examples and exploring new applications. The book is essential reading for beginners, occasional users and experienced users wishing to brush up their skills. Further resources are available from the authors' website at www-math.umd.edu/schol/a-guide-to-matlab.html.

  3. Nuclear grade cable thermal life model by time temperature superposition algorithm based on Matlab GUI

    International Nuclear Information System (INIS)

    Lu Yanyun; Gu Shenjie; Lou Tianyang

    2014-01-01

    Background: As nuclear grade cable must endure a harsh environment within its design life, it is critical to predict cable thermal life accurately owing to thermal aging, which is one of the dominant factors of the aging mechanism. Purpose: Using the time temperature superposition (TTS) method, the aim is to construct a nuclear grade cable thermal life model, predict cable residual life and develop a life model interactive interface under Matlab GUI. Methods: According to TTS, the nuclear grade cable thermal life model can be constructed by shifting data groups at various temperatures to a preset reference temperature with a translation factor which is determined by nonlinear programming optimization. The interactive interface of the cable thermal life model developed under Matlab GUI consists of a superposition mode and a standard mode, which include features such as optimization of the translation factor, calculation of activation energy, construction of the thermal aging curve and analysis of the aging mechanism. Results: Comparison of the calculation results shows that the TTS method has better accuracy than the standard method. Furthermore, the confidence level of nuclear grade cable thermal life with TTS is higher than that with the standard method. Conclusion: The results show that the TTS methodology is applicable to thermal life prediction of nuclear grade cable. The interactive interface under Matlab GUI achieves the anticipated functionality. (authors)
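
    The following MATLAB sketch illustrates the basic TTS step of finding a translation (shift) factor by optimization; the data and the functional form are synthetic, and it is not the authors' GUI code:

      % Minimal TTS sketch (synthetic data, not the authors' code): find the shift
      % factor a_T that moves an aging curve measured at 120 degC onto the
      % reference curve at 100 degC by nonlinear minimization of the overlap error.
      tRef = logspace(0, 4, 40);                 % aging time at 100 degC, h
      pRef = 100 ./ (1 + (tRef/500).^0.8);       % synthetic property data (reference)
      tHi  = logspace(0, 3, 30);                 % aging time at 120 degC, h
      pHi  = 100 ./ (1 + (tHi/100).^0.8);        % ages ~5x faster (synthetic)
      % After multiplying the 120 degC times by a_T the curve should overlay the
      % reference; compare on the overlapping region by interpolation in log time.
      cost = @(logA) sum((interp1(log10(tRef), pRef, log10(tHi) + logA, 'linear', 'extrap') - pHi).^2);
      logAopt = fminsearch(cost, 0);             % optimize log10(a_T)
      fprintf('Estimated shift factor a_T = %.2f\n', 10^logAopt);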

  4. GERMINATOR: a software package for high-throughput scoring and curve fitting of Arabidopsis seed germination.

    Science.gov (United States)

    Joosen, Ronny V L; Kodde, Jan; Willems, Leo A J; Ligterink, Wilco; van der Plas, Linus H W; Hilhorst, Henk W M

    2010-04-01

    Over the past few decades seed physiology research has contributed to many important scientific discoveries and has provided valuable tools for the production of high quality seeds. An important instrument for this type of research is the accurate quantification of germination; however gathering cumulative germination data is a very laborious task that is often prohibitive to the execution of large experiments. In this paper we present the germinator package: a simple, highly cost-efficient and flexible procedure for high-throughput automatic scoring and evaluation of germination that can be implemented without the use of complex robotics. The germinator package contains three modules: (i) design of experimental setup with various options to replicate and randomize samples; (ii) automatic scoring of germination based on the color contrast between the protruding radicle and seed coat on a single image; and (iii) curve fitting of cumulative germination data and the extraction, recap and visualization of the various germination parameters. The curve-fitting module enables analysis of general cumulative germination data and can be used for all plant species. We show that the automatic scoring system works for Arabidopsis thaliana and Brassica spp. seeds, but is likely to be applicable to other species, as well. In this paper we show the accuracy, reproducibility and flexibility of the germinator package. We have successfully applied it to evaluate natural variation for salt tolerance in a large population of recombinant inbred lines and were able to identify several quantitative trait loci for salt tolerance. Germinator is a low-cost package that allows the monitoring of several thousands of germination tests, several times a day by a single person.
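
    The curve-fitting step can be pictured with the minimal MATLAB sketch below, which fits an assumed Hill-type cumulative germination function to synthetic scores; the germinator package's actual equations and outputs are richer than this:

      % Illustrative sketch (assumed Hill-type model, not necessarily the package's
      % exact equation): fit G(t) = Gmax * t^b / (t50^b + t^b) to cumulative
      % germination counts and extract Gmax, t50 and the slope b.
      t  = [6 12 18 24 30 36 48 60 72 96];       % hours after sowing (synthetic)
      G  = [0 2 10 28 46 60 74 82 85 86];        % cumulative germination, % (synthetic)
      hill = @(p, t) p(1) .* t.^p(3) ./ (p(2).^p(3) + t.^p(3));   % p = [Gmax t50 b]
      sse  = @(p) sum((hill(p, t) - G).^2);
      pFit = fminsearch(sse, [90 30 3]);         % least-squares fit
      fprintf('Gmax = %.1f%%, t50 = %.1f h, slope b = %.1f\n', pFit);
      tt = 1:100;
      plot(t, G, 'o', tt, hill(pFit, tt), '-'); xlabel('time (h)'); ylabel('germination (%)');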

  5. Application of MATLAB in testing digital power supply controllers

    International Nuclear Information System (INIS)

    Ke Xinhua; Chinese Academy of Sciences, Beijing; Lu Songlin; Shen Tianjian

    2008-01-01

    Based on a detailed introduction to MATLAB SISOTOOL in the MATLAB control toolbox and to the magnet power supply system, this paper presents the application of MATLAB SISOTOOL in testing digital power supply controllers. This control tool deserves wider use because of its convenience and ease of use. (authors)

  6. A Monte Carlo Study of the Effect of Item Characteristic Curve Estimation on the Accuracy of Three Person-Fit Statistics

    Science.gov (United States)

    St-Onge, Christina; Valois, Pierre; Abdous, Belkacem; Germain, Stephane

    2009-01-01

    To date, there have been no studies comparing parametric and nonparametric Item Characteristic Curve (ICC) estimation methods on the effectiveness of Person-Fit Statistics (PFS). The primary aim of this study was to determine if the use of ICCs estimated by nonparametric methods would increase the accuracy of item response theory-based PFS for…

  7. MATLAB for Engineering and the Life Sciences

    CERN Document Server

    Tranquillo, Joseph

    2011-01-01

    In recent years, the life sciences have embraced simulation as an important tool in biomedical research. Engineers are also using simulation as a powerful step in the design process. In both arenas, Matlab has become the gold standard. It is easy to learn, flexible, and has a large and growing userbase. MATLAB for Engineering and the Life Sciences is a self-guided tour of the basic functionality of MATLAB along with the functions that are most commonly used in biomedical engineering and other life sciences. Although the text is written for undergraduates, graduate students and academics, those

  8. GURU v2.0: An interactive Graphical User interface to fit rheometer curves in Han’s model for rubber vulcanization

    Directory of Open Access Journals (Sweden)

    G. Milani

    2016-01-01

    Full Text Available A GUI software (GURU) for experimental data fitting of rheometer curves in Natural Rubber (NR) vulcanized with sulphur at different curing temperatures is presented. Experimental data are automatically loaded in GURU from an Excel spreadsheet coming from the output of the experimental machine (moving die rheometer). To fit the experimental data, the general reaction scheme proposed by Han and co-workers for NR vulcanized with sulphur is considered. From the simplified kinetic scheme adopted, a closed form solution can be found for the crosslink density, with the only limitation that the induction period is excluded from computations. Three kinetic constants must be determined in such a way as to minimize the absolute error between normalized experimental data and numerical prediction. Usually, this result is achieved by means of standard least-squares data fitting. On the contrary, GURU works interactively by means of a Graphical User Interface (GUI) to minimize the error and allows an interactive calibration of the kinetic constants by means of sliders. A simple mouse click on the sliders allows the assignment of a value for each kinetic constant and a visual comparison between numerical and experimental curves. Users will thus find optimal values of the constants by means of a classic trial and error strategy. An experimental case of technical relevance is shown as benchmark.
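
    A toy MATLAB sketch of the slider idea is shown below, using a single generic rate constant and a synthetic normalized curve rather than Han's three-constant scheme; it only illustrates how a slider callback can redraw the model over the data:

      % Toy slider sketch (generic one-constant model, not Han's kinetic scheme):
      % moving the slider redraws the model curve over synthetic normalized data,
      % so the constant can be tuned by eye.
      t    = linspace(0, 60, 200);                       % cure time, min (synthetic)
      data = 1 - exp(-0.12*t) + 0.02*randn(size(t));     % synthetic normalized torque
      fig  = figure('Name', 'Interactive fit (toy)');
      plot(t, data, '.'); hold on;
      hFit = plot(t, 1 - exp(-0.05*t), 'r-', 'LineWidth', 1.5);
      xlabel('time (min)'); ylabel('normalized torque');
      uicontrol('Parent', fig, 'Style', 'slider', 'Min', 0.01, 'Max', 0.5, ...
          'Value', 0.05, 'Units', 'normalized', 'Position', [0.15 0.01 0.7 0.05], ...
          'Callback', @(s, ~) set(hFit, 'YData', 1 - exp(-get(s, 'Value')*t)));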

  9. An Accelerator control middle layer using Matlab

    CERN Document Server

    Portmann, G J; Terebilo, Andrei

    2005-01-01

    Matlab is a matrix manipulation language originally developed to be a convenient language for using the LINPACK and EISPACK libraries. What makes Matlab so appealing for accelerator physics is the combination of a matrix oriented programming language, an active workspace for system variables, powerful graphics capability, built-in math libraries, and platform independence. A number of software toolboxes for accelerators have been written in Matlab – the Accelerator Toolbox (AT) for machine simulations, LOCO for accelerator calibration, Matlab Channel Access Toolbox (MCA) for EPICS connections, and the Middle Layer. This paper will describe the MiddleLayer software toolbox that resides between the high-level control applications and the low-level accelerator control system. This software was a collaborative effort between ALS and Spear but was written to easily port. Five accelerators presently use this software – Spear, ALS, CLS, and the X-ray and VUV rings at Brookhaven. The Middle Layer fu...

  10. MOCCASIN: converting MATLAB ODE models to SBML.

    Science.gov (United States)

    Gómez, Harold F; Hucka, Michael; Keating, Sarah M; Nudelman, German; Iber, Dagmar; Sealfon, Stuart C

    2016-06-15

    MATLAB is popular in biological research for creating and simulating models that use ordinary differential equations (ODEs). However, sharing or using these models outside of MATLAB is often problematic. A community standard such as Systems Biology Markup Language (SBML) can serve as a neutral exchange format, but translating models from MATLAB to SBML can be challenging, especially for legacy models not written with translation in mind. We developed MOCCASIN (Model ODE Converter for Creating Automated SBML INteroperability) to help. MOCCASIN can convert ODE-based MATLAB models of biochemical reaction networks into the SBML format. MOCCASIN is available under the terms of the LGPL 2.1 license (http://www.gnu.org/licenses/lgpl-2.1.html). Source code, binaries and test cases can be freely obtained from https://github.com/sbmlteam/moccasin. Contact: mhucka@caltech.edu. More information is available at https://github.com/sbmlteam/moccasin. © The Author 2016. Published by Oxford University Press.

  11. OMPC: an Open-Source MATLAB-to-Python Compiler.

    Science.gov (United States)

    Jurica, Peter; van Leeuwen, Cees

    2009-01-01

    Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB(R), the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose Open-source MATLAB(R)-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB(R) functions into Python programs. The imported MATLAB(R) modules will run independently of MATLAB(R), relying on Python's numerical and scientific libraries. Python offers a stable and mature open source platform that, in many respects, surpasses commonly used, expensive commercial closed source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB(R). OMPC is available at http://ompc.juricap.com.

  12. MatLab Programming for Engineers Having No Formal Programming Knowledge

    Science.gov (United States)

    Shaykhian, Linda H.; Shaykhian, Gholam Ali

    2007-01-01

    MatLab is one of the most widely used very high level programming languages for scientific and engineering computations. It is very user-friendly and needs practically no formal programming knowledge. Presented here are MatLab programming aspects, and not just the MatLab commands, for scientists and engineers who do not have formal programming training and also have no significant time to spare for learning programming to solve their real world problems. Specifically provided are programs for visualization. Also stated are the current limitations of MatLab, which could possibly be addressed by MathWorks Inc. in a future version to make MatLab more versatile.

  13. Cosmology with MATLAB

    CERN Document Server

    Green, Dan

    2016-01-01

    This volume makes explicit use of the synergy between cosmology and high energy physics, for example, supersymmetry and dark matter, or nucleosynthesis and the baryon-to-photon ratio. In particular the exciting possible connection between the recently discovered Higgs scalar and the scalar field responsible for inflation is explored. The recent great advances in the accuracy of the basic cosmological parameters are exploited in that no free scale parameters such as h appear; rather, the basic calculations are done numerically using all sources of energy density simultaneously. Scripts are provided that allow the reader to calculate exact results for the basic parameters. Throughout, MATLAB tools such as symbolic math, numerical solutions, plots and 'movies' of the dynamical evolution of systems are used. The GUI package is also shown as an example of the real time manipulation of parameters which is available to the reader. All the MATLAB scripts are made available to the reader to explore examples of the uses of ...

  14. MatLab 1b

    DEFF Research Database (Denmark)

    Kristiansen, Heidi; Kaas, Thomas; Freil, Ole

    The book contains four chapters: ’Tal i spil’, ’Spejlinger og mønstre’, ’Mønstre med farver, figurer og tal’ and ’Længde og vægt’. MATLAB: •builds on an investigative and problem-oriented approach to mathematics •supports learning-goal-oriented teaching •focuses on mathematical conversations and investigations...

  15. Orthogonal transformations for change detection, Matlab code

    DEFF Research Database (Denmark)

    2005-01-01

    Matlab code to do multivariate alteration detection (MAD) analysis, maximum autocorrelation factor (MAF) analysis, canonical correlation analysis (CCA) and principal component analysis (PCA) on image data.

  16. Mathematical Modeling Using MATLAB

    National Research Council Canada - National Science Library

    Phillips, Donovan

    1998-01-01

    ... Mathematical Modeling Using MATLAB acts as a companion resource to A First Course in Mathematical Modeling with the goal of guiding the reader to a fuller understanding of the modeling process...

  17. Flexible missile autopilot design studies with PC-MATLAB/386

    Science.gov (United States)

    Ruth, Michael J.

    1989-01-01

    Development of a responsive, high-bandwidth missile autopilot for airframes which have structural modes of unusually low frequency presents a challenging design task. Such systems are viable candidates for modern, state-space control design methods. The PC-MATLAB interactive software package provides an environment well-suited to the development of candidate linear control laws for flexible missile autopilots. The strengths of MATLAB include: (1) exceptionally high speed (MATLAB's version for 80386-based PC's offers benchmarks approaching minicomputer and mainframe performance); (2) ability to handle large design models of several hundred degrees of freedom, if necessary; and (3) broad extensibility through user-defined functions. To characterize MATLAB capabilities, a simplified design example is presented. This involves interactive definition of an observer-based state-space compensator for a flexible missile autopilot design task. MATLAB capabilities and limitations, in the context of this design task, are then summarized.

  18. OXlearn: a new MATLAB-based simulation tool for connectionist models.

    Science.gov (United States)

    Ruh, Nicolas; Westermann, Gert

    2009-11-01

    OXlearn is a free, platform-independent MATLAB toolbox in which standard connectionist neural network models can be set up, run, and analyzed by means of a user-friendly graphical interface. Due to its seamless integration with the MATLAB programming environment, the inner workings of the simulation tool can be easily inspected and/or extended using native MATLAB commands or components. This combination of usability, transparency, and extendability makes OXlearn an efficient tool for the implementation of basic research projects or the prototyping of more complex research endeavors, as well as for teaching. Both the MATLAB toolbox and a compiled version that does not require access to MATLAB can be downloaded from http://psych.brookes.ac.uk/oxlearn/.

  19. An Accelerator Control Middle Layer Using MATLAB

    International Nuclear Information System (INIS)

    Portmann, Gregory J.; Corbett, Jeff; Terebilo, Andrei

    2005-01-01

    Matlab is a matrix manipulation language originally developed to be a convenient language for using the LINPACK and EISPACK libraries. What makes Matlab so appealing for accelerator physics is the combination of a matrix oriented programming language, an active workspace for system variables, powerful graphics capability, built-in math libraries, and platform independence. A number of software toolboxes for accelerators have been written in Matlab--the Accelerator Toolbox (AT) for machine simulations, LOCO for accelerator calibration, Matlab Channel Access Toolbox (MCA) for EPICS connections, and the Middle Layer. This paper will describe the "middle layer" software toolbox that resides between the high-level control applications and the low-level accelerator control system. This software was a collaborative effort between ALS (LBNL) and SPEAR3 (SSRL) but easily ports to other machines. Five accelerators presently use this software. The high-level Middle Layer functionality includes energy ramp, configuration control (save/restore), global orbit correction, local photon beam steering, insertion device compensation, beam-based alignment, tune correction, response matrix measurement, and script-based programs for machine physics studies

  20. Protective Benefits of Deep Tube Wells Against Childhood Diarrhea in Matlab, Bangladesh

    Science.gov (United States)

    Winston, Jennifer Jane; Escamilla, Veronica; Perez-Heydrich, Carolina; Carrel, Margaret; Yunus, Mohammad; Streatfield, Peter Kim

    2013-01-01

    Objectives. We investigated whether deep tube wells installed to provide arsenic-free groundwater in rural Bangladesh have the added benefit of reducing childhood diarrheal disease incidence. Methods. We recorded cases of diarrhea in children younger than 5 years in 142 villages of Matlab, Bangladesh, during monthly community health surveys in 2005 and 2006. We surveyed the location and depth of 12 018 tube wells and integrated these data with diarrhea data and other data in a geographic information system. We fit a longitudinal logistic regression model to measure the relationship between childhood diarrhea and deep tube well use. We controlled for maternal education, family wealth, year, and distance to a deep tube well. Results. Household clusters assumed to be using deep tube wells were 48.7% (95% confidence interval = 27.8%, 63.5%) less likely to have a case of childhood diarrhea than were other household clusters. Conclusions. Increased access to deep tube wells may provide dual benefits to vulnerable populations in Matlab, Bangladesh, by reducing the risk of childhood diarrheal disease and decreasing exposure to naturally occurring arsenic in groundwater. PMID:23409905

  1. Methods for fitting of efficiency curves obtained by means of HPGe gamma rays spectrometers; Metodos de ajuste de curvas de eficiencia obtidas por meio de espectrometros de HPGe

    Energy Technology Data Exchange (ETDEWEB)

    Cardoso, Vanderlei

    2002-07-01

    The present work describes a few methodologies developed for fitting efficiency curves obtained by means of a HPGe gamma-ray spectrometer. The interpolated values were determined by simple polynomial fitting and by polynomial fitting of the ratio between the experimental peak efficiency and the total efficiency calculated by the Monte Carlo technique, as a function of gamma-ray energy. Moreover, non-linear fitting has been performed using a segmented polynomial function and applying the Gauss-Marquardt method. To obtain the peak areas, different methodologies were developed in order to estimate the background area under the peak. This information was obtained by numerical integration or by using analytical functions associated with the background. One non-calibrated radioactive source was included in the efficiency curve in order to provide additional calibration points. As a by-product, it was possible to determine the activity of this non-calibrated source. For all fittings developed in the present work the covariance matrix methodology was used, which is an essential procedure in order to give a complete description of the partial uncertainties involved. (author)
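
    One of the simpler approaches mentioned above, a plain polynomial fit of the efficiency curve, can be sketched in MATLAB as follows (synthetic energies and efficiencies, and without the covariance treatment used in the work):

      % Simple sketch of a polynomial efficiency fit (synthetic data; the paper's
      % covariance methodology is not reproduced here): fit ln(efficiency) versus
      % ln(energy) and interpolate at an arbitrary energy.
      E   = [122 245 344 662 779 964 1112 1408];     % gamma-ray energy, keV (synthetic)
      eff = [0.0210 0.0135 0.0102 0.0060 0.0053 0.0044 0.0039 0.0032];  % peak efficiency (synthetic)
      p   = polyfit(log(E), log(eff), 3);            % third-degree polynomial in log-log space
      effAt = @(Eq) exp(polyval(p, log(Eq)));        % interpolation function
      fprintf('Interpolated efficiency at 500 keV: %.4f\n', effAt(500));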

  2. System design through Matlab, control toolbox and Simulink

    CERN Document Server

    Singh, Krishna K

    2001-01-01

    MATLAB, a software package developed by MathWorks, Inc., is powerful, versatile and interactive software for scientific and technical computations including simulations. Specialised toolboxes provided with several built-in functions are a special feature of MATLAB. This book, titled System Design through MATLAB, Control Toolbox and SIMULINK, aims at getting the reader started with computations and simulations in system engineering quickly and easily and then proceeds to build concepts for advanced computations and simulations that include the control and compensation of systems. Simulation through SIMULINK has also been described to allow the reader to get the feel of the real world situation. This book is appropriate for undergraduate students in the final semester of their project work, postgraduate students who have MATLAB integrated in their course or wish to take up a simulation problem in the area of system engineering for their dissertation work, and research scholars for whom MATLAB...

  3. Subband/Transform MATLAB Functions For Processing Images

    Science.gov (United States)

    Glover, D.

    1995-01-01

    SUBTRANS software is a package of routines implementing image-data-processing functions for use with MATLAB(TM) software. Provides capability to transform image data with block transforms and to produce spatial-frequency subbands of transformed data. Functions cascaded to provide further decomposition into more subbands. Also used in image-data-compression systems. For example, transforms used to prepare data for lossy compression. Written for use in MATLAB mathematical-analysis environment.

  4. Experiences with Matlab and VRML in Functional Neuroimaging Visualizations

    DEFF Research Database (Denmark)

    Nielsen, Finn Årup; Hansen, Lars Kai

    2000-01-01

    We describe some experiences with Matlab and VRML. We are developing a toolbox for neuroinformatics and describe some of the functionalities we have implemented or will implement and how Matlab and VRML support the implementation.

  5. Introduction to Numerical Computation - analysis and Matlab illustrations

    DEFF Research Database (Denmark)

    Elden, Lars; Wittmeyer-Koch, Linde; Nielsen, Hans Bruun

    In a modern programming environment like e.g. MATLAB it is possible by simple commands to perform advanced calculations on a personal computer. In order to use such a powerful tool efficiently it is necessary to have an overview of available numerical methods and algorithms and to know about... The methods are illustrated by examples in MATLAB.

  6. Introduction to finite element analysis using MATLAB and Abaqus

    CERN Document Server

    Khennane, Amar

    2013-01-01

    There are some books that target the theory of the finite element, while others focus on the programming side of things. Introduction to Finite Element Analysis Using MATLAB(R) and Abaqus accomplishes both. This book teaches the first principles of the finite element method. It presents the theory of the finite element method while maintaining a balance between its mathematical formulation, programming implementation, and application using commercial software. The computer implementation is carried out using MATLAB, while the practical applications are carried out in both MATLAB and Abaqus...

  7. Digital signal processing using MATLAB

    CERN Document Server

    Schilling, Robert L

    2016-01-01

    Focus on the development, implementation, and application of modern DSP techniques with DIGITAL SIGNAL PROCESSING USING MATLAB(R), 3E. Written in an engaging, informal style, this edition immediately captures your attention and encourages you to explore each critical topic. Every chapter starts with a motivational section that highlights practical examples and challenges that you can solve using techniques covered in the chapter. Each chapter concludes with a detailed case study example, a chapter summary with learning outcomes, and practical homework problems cross-referenced to specific chapter sections for your convenience. DSP Companion software accompanies each book to enable further investigation. The DSP Companion software operates with MATLAB(R) and provides intriguing demonstrations as well as interactive explorations of analysis and design concepts.

  8. A curve-fitting approach to estimate the arterial plasma input function for the assessment of glucose metabolic rate and response to treatment.

    Science.gov (United States)

    Vriens, Dennis; de Geus-Oei, Lioe-Fee; Oyen, Wim J G; Visser, Eric P

    2009-12-01

    For the quantification of dynamic (18)F-FDG PET studies, the arterial plasma time-activity concentration curve (APTAC) needs to be available. This can be obtained using serial sampling of arterial blood or an image-derived input function (IDIF). Arterial sampling is invasive and often not feasible in practice; IDIFs are biased because of partial-volume effects and cannot be used when no large arterial blood pool is in the field of view. We propose a mathematical function, consisting of an initial linear rising activity concentration followed by a triexponential decay, to describe the APTAC. This function was fitted to 80 oncologic patients and verified for 40 different oncologic patients by area-under-the-curve (AUC) comparison, Patlak glucose metabolic rate (MR(glc)) estimation, and therapy response monitoring (Delta MR(glc)). The proposed function was compared with the gold standard (serial arterial sampling) and the IDIF. To determine the free parameters of the function, plasma time-activity curves based on arterial samples in 80 patients were fitted after normalization for administered activity (AA) and initial distribution volume (iDV) of (18)F-FDG. The medians of these free parameters were used for the model. In 40 other patients (20 baseline and 20 follow-up dynamic (18)F-FDG PET scans), this model was validated. The population-based curve, individually calibrated by AA and iDV (APTAC(AA/iDV)), by 1 late arterial sample (APTAC(1 sample)), and by the individual IDIF (APTAC(IDIF)), was compared with the gold standard of serial arterial sampling (APTAC(sampled)) using the AUC. Additionally, these 3 methods of APTAC determination were evaluated with Patlak MR(glc) estimation and with Delta MR(glc) for therapy effects using serial sampling as the gold standard. Excellent individual fits to the function were derived with significantly different decay constants (P < ...). ... AUC from APTAC(AA/iDV), APTAC(1 sample), and APTAC(IDIF) with the gold standard (APTAC(sampled)) were 0...
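
    The proposed functional form, a linear rise joined continuously to a triexponential decay, can be sketched in MATLAB as below; the sample times, parameter values and noise are synthetic, and the AA/iDV calibration of the paper is not reproduced:

      % Illustrative sketch of the described functional form (synthetic data and
      % parameters): a linear rise up to a peak time, continuously joined to a
      % tri-exponential decay, fitted by least squares.
      model = @(p, t) (t <= p(1)) .* (sum(p(2:4)) / p(1) .* t) + ...
                      (t >  p(1)) .* (p(2)*exp(-p(5)*(t - p(1))) + ...
                                      p(3)*exp(-p(6)*(t - p(1))) + ...
                                      p(4)*exp(-p(7)*(t - p(1))));
      tS   = [0.2 0.4 0.6 0.8 1 1.5 2 3 5 10 20 30 45 60];     % sample times, min (synthetic)
      pTru = [0.7 18 6 3 4 0.5 0.02];                          % [t_peak A1 A2 A3 l1 l2 l3]
      cS   = model(pTru, tS) .* (1 + 0.03*randn(size(tS)));    % noisy "plasma samples"
      sse  = @(p) sum((model(p, tS) - cS).^2);
      pFit = fminsearch(sse, [0.5 15 5 2 3 0.3 0.01], optimset('MaxFunEvals', 1e5, 'MaxIter', 1e5));
      tt = linspace(0.01, 60, 500);
      plot(tS, cS, 'o', tt, model(pFit, tt), '-');
      xlabel('time (min)'); ylabel('plasma activity (a.u.)');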

  9. Visual media processing using Matlab beginner's guide

    CERN Document Server

    Siogkas, George

    2013-01-01

    Written in a friendly, Beginner's Guide format, showing the user how to use the digital media aspects of Matlab (image, video, sound) in a practical, tutorial-based style.This is great for novice programmers in any language who would like to use Matlab as a tool for their image and video processing needs, and also comes in handy for photographers or video editors with even less programming experience wanting to find an all-in-one tool for their tasks.

  10. Regularization Tools Version 3.0 for Matlab 5.2

    DEFF Research Database (Denmark)

    Hansen, Per Christian

    1999-01-01

    This communication describes Version 3.0 of Regularization Tools, a Matlab package for analysis and solution of discrete ill-posed problems.

  11. Gpufit: An open-source toolkit for GPU-accelerated curve fitting.

    Science.gov (United States)

    Przybylski, Adrian; Thiel, Björn; Keller-Findeisen, Jan; Stock, Bernd; Bates, Mark

    2017-11-16

    We present a general purpose, open-source software library for estimation of non-linear parameters by the Levenberg-Marquardt algorithm. The software, Gpufit, runs on a Graphics Processing Unit (GPU) and executes computations in parallel, resulting in a significant gain in performance. We measured a speed increase of up to 42 times when comparing Gpufit with an identical CPU-based algorithm, with no loss of precision or accuracy. Gpufit is designed such that it is easily incorporated into existing applications or adapted for new ones. Multiple software interfaces, including to C, Python, and Matlab, ensure that Gpufit is accessible from most programming environments. The full source code is published as an open source software repository, making its function transparent to the user and facilitating future improvements and extensions. As a demonstration, we used Gpufit to accelerate an existing scientific image analysis package, yielding significantly improved processing times for super-resolution fluorescence microscopy datasets.

  12. Novel algorithm and MATLAB-based program for automated power law analysis of single particle, time-dependent mean-square displacement

    Science.gov (United States)

    Umansky, Moti; Weihs, Daphne

    2012-08-01

    should also be backwards compatible. Symbolic Math Toolboxes (5.5) is required. The Curve Fitting Toolbox (3.0) is recommended. Computer: Tested on Windows only, yet should work on any computer running MATLAB. In Windows 7, should be used as administrator, if the user is not the administrator the program may not be able to save outputs and temporary outputs to all locations. Operating system: Any supporting MATLAB (MathWorks Inc.) v7.11 / 2010b or higher. Supplementary material: Sample output files (approx. 30 MBytes) are available. Classification: 12 External routines: Several MATLAB subfunctions (m-files), freely available on the web, were used as part of and included in, this code: count, NaN suite, parseArgs, roundsd, subaxis, wcov, wmean, and the executable pdfTK.exe. Nature of problem: In many physical and biophysical areas employing single-particle tracking, having the time-dependent power-laws governing the time-averaged meansquare displacement (MSD) of a single particle is crucial. Those power laws determine the mode-of-motion and hint at the underlying mechanisms driving motion. Accurate determination of the power laws that describe each trajectory will allow categorization into groups for further analysis of single trajectories or ensemble analysis, e.g. ensemble and time-averaged MSD. Solution method: The algorithm in the provided program automatically analyzes and fits time-dependent power laws to single particle trajectories, then group particles according to user defined cutoffs. It accepts time-dependent trajectories of several particles, each trajectory is run through the program, its time-averaged MSD is calculated, and power laws are determined in regions where the MSD is linear on a log-log scale. Our algorithm searches for high-curvature points in experimental data, here time-dependent MSD. Those serve as anchor points for determining the ranges of the power-law fits. Power-law scaling is then accurately determined and error estimations of the

  13. Transportation channels calculation method in MATLAB

    International Nuclear Information System (INIS)

    Averyanov, G.P.; Budkin, V.A.; Dmitrieva, V.V.; Osadchuk, I.O.; Bashmakov, Yu.A.

    2014-01-01

    Output devices and charged-particle transport channels are necessary components of any modern particle accelerator. They differ both in size and in the choice of focusing elements, depending on the particle accelerator type and its purpose. A package of transport-line design codes for magneto-optical channels in the MATLAB environment is presented in this report. Charged-particle dynamics in a focusing channel can be studied easily by means of the matrix technique, and MATLAB usage is convenient because its basic information objects are matrices. MATLAB allows the use of the modular principle to build the software package. Program blocks are small in size and easy to use; they can be executed separately or together. The set of codes has a user-friendly interface. The transport channel consists of focusing lenses (doublets and triplets). The main magneto-optical channel parameters are the total length and lens positions, and the parameters of the output beam in phase space (channel acceptance, beam emittance, beam transverse dimensions, particle divergence and image stigmaticity). The choice of the channel operation parameters is based on satisfying mutually competing demands, and the channel parameters are therefore calculated using search-based optimization techniques.
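
    The matrix technique mentioned above can be illustrated with the following minimal MATLAB sketch, which propagates a horizontal phase-space vector through a drift, a thin focusing lens and another drift (illustrative lengths and focal length, not a real channel):

      % Minimal transfer-matrix sketch (illustrative numbers): propagate the
      % horizontal phase-space vector [x; x'] through drift and thin-lens elements
      % by multiplying 2x2 matrices (elements applied right to left).
      drift = @(L) [1 L; 0 1];          % drift space of length L (m)
      quad  = @(f) [1 0; -1/f 1];       % thin lens of focal length f (m)
      M  = drift(1.5) * quad(0.8) * drift(2.0);   % 2 m drift, lens, 1.5 m drift
      x0 = [0.002; 0.001];                         % initial offset 2 mm, slope 1 mrad
      x1 = M * x0;                                 % beam coordinates at the channel exit
      fprintf('Exit offset = %.3f mm, exit slope = %.3f mrad\n', 1e3*x1(1), 1e3*x1(2));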

  14. DSISoft—a MATLAB VSP data processing package

    Science.gov (United States)

    Beaty, K. S.; Perron, G.; Kay, I.; Adam, E.

    2002-05-01

    DSISoft is a public domain vertical seismic profile processing software package developed at the Geological Survey of Canada. DSISoft runs under MATLAB version 5.0 and above and hence is portable between computer operating systems supported by MATLAB (i.e. Unix, Windows, Macintosh, Linux). The package includes processing modules for reading and writing various standard seismic data formats, performing data editing, sorting, filtering, and other basic processing modules. The processing sequence can be scripted allowing batch processing and easy documentation. A structured format has been developed to ensure future additions to the package are compatible with existing modules. Interactive modules have been created using MATLAB's graphical user interface builder for displaying seismic data, picking first break times, examining frequency spectra, doing f-k filtering, and plotting the trace header information. DSISoft modular design facilitates the incorporation of new processing algorithms as they are developed. This paper gives an overview of the scope of the software and serves as a guide for the addition of new modules.

  15. Paleomagnetic dating: Methods, MATLAB software, example

    Science.gov (United States)

    Hnatyshin, Danny; Kravchinsky, Vadim A.

    2014-09-01

    A MATLAB software tool has been developed to provide an easy to use graphical interface for the plotting and interpretation of paleomagnetic data. The tool takes either paleomagnetic directions or paleopoles and compares them to a user defined apparent polar wander path or secular variation curve to determine the age of a paleomagnetic sample. Ages can be determined in two ways, either by translating the data onto the reference curve, or by rotating it about a set location (e.g. sampling location). The results are then compiled in data tables which can be exported as an Excel file. The data can also be plotted using a variety of built-in stereographic projections, which can then be exported as an image file. This software was used to date the giant Sukhoi Log gold deposit in Russia. Sukhoi Log has undergone a complicated history of faulting, folding and metamorphism, and is in the vicinity of many granitic bodies. Paleomagnetic analysis of Sukhoi Log allowed for the timing of large scale thermal or chemical events to be determined. Paleomagnetic analysis from gold mineralized black shales was used to define the natural remanent magnetization recorded at Sukhoi Log. The obtained paleomagnetic direction from thermal demagnetization produced a paleopole at 61.3°N, 155.9°E, with the semi-major axis and semi-minor axis of the 95% confidence ellipse being 16.6° and 15.9° respectively. This paleopole is compared to the Siberian apparent polar wander path (APWP) by translating the paleopole to the nearest location on the APWP. This produced an age of 255.2 +32.0/-31.0 Ma, which is the youngest well defined age known for Sukhoi Log. We propose that this is the last major stage of activity at Sukhoi Log, and likely had a role in determining the present day state of mineralization seen at the deposit.

  16. Practical image and video processing using MATLAB

    CERN Document Server

    Marques, Oge

    2011-01-01

    "The book provides a practical introduction to the most important topics in image and video processing using MATLAB (and its Image Processing Toolbox) as a tool to demonstrate the most important techniques and algorithms. The contents are presented in a clear, technically accurate, objective way, with just enough mathematical detail. Most of the chapters are supported by figures, examples, illustrative problems, MATLAB scripts, suggestions for further reading, bibliographical references, useful Web sites, and exercises and computer projects to extend the understanding of their contents"--

  17. Modelling real solar cell using PSCAD/MATLAB

    Energy Technology Data Exchange (ETDEWEB)

    Ramos, Sergio; Silva, Marco; Fernandes, Filipe; Vale, Zita [Polytechnic of Porto (Portugal). GECAD - Knowledge Engineering and Decision Support Research Center

    2012-07-01

    This paper presents the development of a solar photovoltaic (PV) model based on PSCAD/EMTDC - Power System Computer Aided Design - including a mathematical model study. An additional algorithm has been implemented in MATLAB software in order to calculate several parameters required by the PSCAD developed model. All the simulation study has been performed in PSCAD/MATLAB software simulation tool. A real data base concerning irradiance, cell temperature and PV power generation was used in order to support the evaluation of the implemented PV model. (orig.)
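
    The abstract does not spell out the mathematical model here; as a hedged illustration, the sketch below uses the common single-diode PV model with assumed module parameters and solves the implicit I-V equation point by point with fzero:

      % Hedged sketch (assumed single-diode model and module parameters, not the
      % paper's exact equations): trace an I-V curve by solving
      % I = Iph - I0*(exp((V+I*Rs)/(n*Ns*Vt)) - 1) - (V+I*Rs)/Rsh for each V.
      Iph = 5.2; I0 = 9e-8; Rs = 0.3; Rsh = 250; n = 1.3; Ns = 36;  % assumed module data
      Vt  = 0.0257;                              % thermal voltage at ~25 degC, V
      V   = linspace(0, 21.5, 80);
      I   = zeros(size(V));
      for k = 1:numel(V)
          f = @(Ik) Iph - I0*(exp((V(k) + Ik*Rs)/(n*Ns*Vt)) - 1) ...
                    - (V(k) + Ik*Rs)/Rsh - Ik;
          I(k) = fzero(f, Iph);                  % solve the implicit equation for I
      end
      I(I < 0) = 0;                              % clip past open-circuit voltage
      plot(V, V.*I); xlabel('voltage (V)'); ylabel('power (W)');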

  18. Kinematic analysis of the finger exoskeleton using MATLAB/Simulink.

    Science.gov (United States)

    Nasiłowski, Krzysztof; Awrejcewicz, Jan; Lewandowski, Donat

    2014-01-01

    A paralyzed or not fully functional part of the human body can be supported by a properly designed exoskeleton system with motoric abilities, which can help in rehabilitation or in moving a disabled/paralyzed limb. Both the suitably selected geometry and the specialized software are studied using the MATLAB environment. A finger exoskeleton was the basis for the MATLAB/Simulink model. Specialized software such as MATLAB/Simulink gives the opportunity to optimize the calculations and reach precise results, which helps in the next steps of the design process. The calculations carried out yield information on the movement relations between three functionally connected actuators and show the distance and velocity changes during the whole simulation time.

  19. Training course "Porting code from Matlab to Python"

    OpenAIRE

    Diaz, Sandra; Klijn, Wouter; Deepu, Rajalekshmi; Peyser, Alexander; Oden, Lena

    2017-01-01

    Python is becoming a popular language for scientific applications and is increasingly used for high performance computing. In this course we want to introduce Matlab programmers to the usage of Python. Matlab and Python have a comparable language philosophy, but Python can offer better performance using its optimizations and parallelization interfaces. Python also increases the portability and flexibility (interaction with other open source and proprietary software packages) of solutions, and...

  20. Microcontroller USB interfacing with MATLAB GUI for low cost medical ultrasound scanners

    Directory of Open Access Journals (Sweden)

    Jean Rossario Raj

    2016-06-01

    Full Text Available This paper presents an 8051 microcontroller-based control of ultrasound scanner prototype hardware from a host laptop MATLAB GUI. The hardware control of many instruments is carried out by microcontrollers. These microcontrollers are in turn controlled from a GUI residing in a computing machine that is connected over the USB interface. Conventionally such GUIs are developed using ‘C’ language or its variants. But MATLAB GUI is a better tool, when such GUI programs need to do huge image/video processing. However interfacing MATLAB with the microcontroller is a challenging task. Here, MATLAB interfacing through an intermediate MEX ‘C’ language program is presented. This paper outlines the MEX programming methods for achieving the smooth interfacing of microcontrollers with MATLAB GUI.

  1. Deterministic modelling and stochastic simulation of biochemical pathways using MATLAB.

    Science.gov (United States)

    Ullah, M; Schmidt, H; Cho, K H; Wolkenhauer, O

    2006-03-01

    The analysis of complex biochemical networks is conducted in two popular conceptual frameworks for modelling. The deterministic approach requires the solution of ordinary differential equations (ODEs, reaction rate equations) with concentrations as continuous state variables. The stochastic approach involves the simulation of differential-difference equations (chemical master equations, CMEs) with probabilities as variables. This is to generate counts of molecules for chemical species as realisations of random variables drawn from the probability distribution described by the CMEs. Although there are numerous tools available, many of them free, the modelling and simulation environment MATLAB is widely used in the physical and engineering sciences. We describe a collection of MATLAB functions to construct and solve ODEs for deterministic simulation and to implement realisations of CMEs for stochastic simulation using advanced MATLAB coding (Release 14). The program was successfully applied to pathway models from the literature for both cases. The results were compared to implementations using alternative tools for dynamic modelling and simulation of biochemical networks. The aim is to provide a concise set of MATLAB functions that encourage the experimentation with systems biology models. All the script files are available from www.sbi.uni-rostock.de/publications_matlab-paper.html.
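
    As a minimal illustration of the stochastic side (not the authors' toolbox code), the sketch below generates one realisation of a birth-death process with Gillespie's direct method:

      % Minimal sketch (not the authors' functions): one stochastic realisation of
      % the chemical master equation for a birth-death process, X -> X+1 at rate k1
      % and X -> X-1 at rate k2*X, using Gillespie's direct method.
      k1 = 10; k2 = 0.1;            % propensity constants (illustrative)
      x  = 0;  t = 0;  tEnd = 100;
      T  = t;  X  = x;              % recorded trajectory
      while t < tEnd
          a  = [k1, k2*x];                      % propensities of the two reactions
          a0 = sum(a);
          t  = t - log(rand)/a0;                % exponentially distributed waiting time
          if rand < a(1)/a0, x = x + 1; else, x = x - 1; end
          T(end+1) = t;  X(end+1) = x;          %#ok<AGROW>
      end
      stairs(T, X); xlabel('time'); ylabel('molecule count');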

  2. Introduction to fuzzy logic using Matlab

    CERN Document Server

    Sivanandam, SN; Deepa, S N

    2006-01-01

    Fuzzy Logic is at present a hot topic among academicians as well as various programmers. This book is provided to give a broad, in-depth overview of the field of Fuzzy Logic. The basic principles of Fuzzy Logic are discussed in detail with various solved examples. The different approaches and solutions to the problems given in the book are well balanced and pertinent to Fuzzy Logic research projects. The applications of Fuzzy Logic are also dealt with, to make the readers understand the concept of Fuzzy Logic. The solutions to the problems are programmed using MATLAB 6.0 and the simulated results are given. The MATLAB Fuzzy Logic toolbox is provided for easy reference.

  3. MatLab 1a

    DEFF Research Database (Denmark)

    Kristiansen, Heidi; Kaas, Thomas; Freil, Ole

    The book contains four chapters: ’Tæl og brug tal’, ’Undersøg figurer’, ’Plus- og minusproblemer’ and ’Statistik og chance’. MATLAB: •builds on an investigative and problem-oriented approach to mathematics •supports learning-goal-oriented teaching •focuses on mathematical conversations and investigations...

  4. Fitting methods for constructing energy-dependent efficiency curves and their application to ionization chamber measurements

    International Nuclear Information System (INIS)

    Svec, A.; Schrader, H.

    2002-01-01

    An ionization chamber without and with an iron liner (absorber) was calibrated by a set of radionuclide activity standards of the Physikalisch-Technische Bundesanstalt (PTB). The ionization chamber is used as a secondary standard measuring system for activity at the Slovak Institute of Metrology (SMU). Energy-dependent photon-efficiency curves were established for the ionization chamber in defined measurement geometry without and with the liner, and radionuclide efficiencies were calculated. Programmed calculation with an analytical efficiency function and a nonlinear regression algorithm of Microsoft (MS) Excel for fitting was used. Efficiencies from bremsstrahlung of pure beta-particle emitters were calibrated achieving a 10% accuracy level. Such efficiency components are added to obtain the total radionuclide efficiency of photon emitters after beta decay. The method yields differences of experimental and calculated radionuclide efficiencies for most of the photon-emitting radionuclides in the order of a few percent

  5. ImageJ-MATLAB: a bidirectional framework for scientific image analysis interoperability.

    Science.gov (United States)

    Hiner, Mark C; Rueden, Curtis T; Eliceiri, Kevin W

    2017-02-15

    ImageJ-MATLAB is a lightweight Java library facilitating bi-directional interoperability between MATLAB and ImageJ. By defining a standard for translation between matrix and image data structures, researchers are empowered to select the best tool for their image-analysis tasks. Freely available extension to ImageJ2 (http://imagej.net/Downloads). Installation and use instructions available at http://imagej.net/MATLAB_Scripting. Tested with ImageJ 2.0.0-rc-54, Java 1.8.0_66 and MATLAB R2015b. eliceiri@wisc.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  6. Speeding up the MATLAB complex networks package using graphic processors

    International Nuclear Information System (INIS)

    Zhang Bai-Da; Wu Jun-Jie; Li Xin; Tang Yu-Hua

    2011-01-01

    The availability of computers and communication networks allows us to gather and analyse data on a far larger scale than previously. At present, it is believed that statistics is a suitable method to analyse networks with millions, or more, of vertices. The MATLAB language, with its mass of statistical functions, is a good choice to rapidly realize an algorithm prototype of complex networks. The performance of the MATLAB codes can be further improved by using graphic processor units (GPU). This paper presents the strategies and performance of the GPU implementation of a complex networks package, and the Jacket toolbox of MATLAB is used. Compared with some commercially available CPU implementations, GPU can achieve a speedup of, on average, 11.3×. The experimental result proves that the GPU platform combined with the MATLAB language is a good combination for complex network research. (interdisciplinary physics and related areas of science and technology)

  7. Multidimentional and Multi-Parameter Fortran-Based Curve Fitting ...

    African Journals Online (AJOL)

    This work briefly describes the mathematics behind the algorithm, and also elaborates how to implement it using the FORTRAN 95 programming language. The advantage of this algorithm, when it is extended to surfaces and complex functions, is that it gives researchers better confidence in the fitting. It also improves the ...

  8. MatLab 0. Matematiklaboratoriet

    DEFF Research Database (Denmark)

    Kristiansen, Heidi; Kaas, Thomas; Freil, Ole

    The book contains five chapters: ’Tælle og tal’, ’Figurer og mønstre’, ’Hvor mange?’, ’Måling’ and ’Data og diagrammer’. MATLAB: •builds on an investigative and problem-oriented approach to mathematics •supports learning-goal-oriented teaching •focuses on mathematical conversations and investigations that...

  9. Piezoelectric Actuator Modeling Using MSC/NASTRAN and MATLAB

    Science.gov (United States)

    Reaves, Mercedes C.; Horta, Lucas G.

    2003-01-01

    This paper presents a procedure for modeling structures containing piezoelectric actuators using MSC/NASTRAN and MATLAB. The paper describes the utility and functionality of one set of validated modeling tools. The tools described herein use MSC/NASTRAN to model the structure with piezoelectric actuators and a thermally induced strain to model straining of the actuators due to an applied voltage field. MATLAB scripts are used to assemble the dynamic equations and to generate frequency response functions. The application of these tools is discussed using a cantilever aluminum beam with a surface mounted piezoelectric actuator as a sample problem. Software in the form of MSC/NASTRAN DMAP input commands, MATLAB scripts, and a step-by-step procedure to solve the example problem are provided. Analysis results are generated in terms of frequency response functions from deflection and strain data as a function of input voltage to the actuator.
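
    The frequency-response step can be pictured with the short MATLAB sketch below, which uses generic two-degree-of-freedom mass, damping and stiffness matrices rather than the NASTRAN-derived model:

      % Minimal FRF sketch (generic 2-DOF matrices, not the NASTRAN model): the
      % receptance matrix is H(w) = (K - w^2*M + i*w*C)^(-1), evaluated over a band.
      M = diag([1.0 0.5]);                       % mass matrix, kg (illustrative)
      K = [3e4 -1e4; -1e4 1e4];                  % stiffness matrix, N/m (illustrative)
      C = 1e-4 * K;                              % light proportional damping (assumed)
      f = linspace(1, 200, 1000);                % frequency band, Hz
      H11 = zeros(size(f));                      % FRF between force and response at DOF 1
      for k = 1:numel(f)
          w = 2*pi*f(k);
          H = (K - w^2*M + 1i*w*C) \ eye(2);     % invert the dynamic stiffness matrix
          H11(k) = H(1,1);
      end
      semilogy(f, abs(H11)); xlabel('frequency (Hz)'); ylabel('|H_{11}| (m/N)');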

  10. Introduction to multifractal detrended fluctuation analysis in matlab.

    Science.gov (United States)

    Ihlen, Espen A F

    2012-01-01

    Fractal structures are found in biomedical time series from a wide range of physiological phenomena. The multifractal spectrum identifies the deviations in fractal structure within time periods with large and small fluctuations. The present tutorial is an introduction to multifractal detrended fluctuation analysis (MFDFA) that estimates the multifractal spectrum of biomedical time series. The tutorial presents MFDFA step-by-step in an interactive Matlab session. All Matlab tools needed are available in Introduction to MFDFA folder at the website www.ntnu.edu/inm/geri/software. MFDFA are introduced in Matlab code boxes where the reader can employ pieces of, or the entire MFDFA to example time series. After introducing MFDFA, the tutorial discusses the best practice of MFDFA in biomedical signal processing. The main aim of the tutorial is to give the reader a simple self-sustained guide to the implementation of MFDFA and interpretation of the resulting multifractal spectra.

  11. From curve fitting to machine learning an illustrative guide to scientific data analysis and computational intelligence

    CERN Document Server

    Zielesny, Achim

    2016-01-01

    This successful book provides in its second edition an interactive and illustrative guide from two-dimensional curve fitting to multidimensional clustering and machine learning with neural networks or support vector machines. Along the way topics like mathematical optimization or evolutionary algorithms are touched. All concepts and ideas are outlined in a clear cut manner with graphically depicted plausibility arguments and a little elementary mathematics. The major topics are extensively outlined with exploratory examples and applications. The primary goal is to be as illustrative as possible without hiding problems and pitfalls but to address them. The character of an illustrative cookbook is complemented with specific sections that address more fundamental questions like the relation between machine learning and human intelligence. All topics are completely demonstrated with the computing platform Mathematica and the Computational Intelligence Packages (CIP), a high-level function library developed with M...

  12. Matlab Software for Spatial Panels

    NARCIS (Netherlands)

    Elhorst, J.Paul

    2014-01-01

    Elhorst provides Matlab routines to estimate spatial panel data models at his website. This article extends these routines to include the bias correction procedure proposed by Lee and Yu if the spatial panel data model contains spatial and/or time-period fixed effects, the direct and indirect

  13. ECONOMIC MODELING PROCESSES USING MATLAB

    Directory of Open Access Journals (Sweden)

    Anamaria G. MACOVEI

    2008-06-01

    Full Text Available To study economic phenomena and processes using mathematical modeling, and to determine the approximate solution to a problem, we need to choose a method of calculation and a numerical computing package, namely MatLab. Any economic process or phenomenon is described mathematically by a model of its behavior; building such an economic-mathematical model involves the following stages: formulation of the problem, analysis and process modeling, production of the model, design verification, validation and implementation of the model. This article presents an economic model built with mathematical equations and the MatLab software package, which helps us approximate an effective solution. The input data considered are the net cost, the direct cost and the total cost, and the relationship between them. The basic formula for determining the total cost is presented. The economic model calculations were made in the MatLab software package, with a graphic representation and interpretation of the results achieved in terms of our specific problem.
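
    By way of illustration only (the article's actual cost model is not reproduced here), a relation of the kind described, total cost as a fixed component plus a direct cost per unit, can be evaluated and plotted in MatLab in a few lines; the figures below are assumptions.

        % Hedged example of a simple total-cost relation, with made-up figures.
        q = 0:10:500;                 % hypothetical production quantity
        fixedCost = 2000;             % assumed fixed (indirect) cost
        unitCost  = 12.5;             % assumed direct cost per unit
        totalCost = fixedCost + unitCost .* q;
        plot(q, totalCost); xlabel('Quantity'); ylabel('Total cost');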

  14. Testing MONDian dark matter with galactic rotation curves

    International Nuclear Information System (INIS)

    Edmonds, Doug; Farrah, Duncan; Minic, Djordje; Takeuchi, Tatsu; Ho, Chiu Man; Ng, Y. Jack

    2014-01-01

    MONDian dark matter (MDM) is a new form of dark matter quantum that naturally accounts for Milgrom's scaling, usually associated with modified Newtonian dynamics (MOND), and theoretically behaves like cold dark matter (CDM) at cluster and cosmic scales. In this paper, we provide the first observational test of MDM by fitting rotation curves to a sample of 30 local spiral galaxies (z ≈ 0.003). For comparison, we also fit the galactic rotation curves using MOND and CDM. We find that all three models fit the data well. The rotation curves predicted by MDM and MOND are virtually indistinguishable over the range of observed radii (∼1 to 30 kpc). The best-fit MDM and CDM density profiles are compared. We also compare with MDM the dark matter density profiles arising from MOND if Milgrom's formula is interpreted as Newtonian gravity with an extra source term instead of as a modification of inertia. We find that discrepancies between MDM and MOND will occur near the center of a typical spiral galaxy. In these regions, instead of continuing to rise sharply, the MDM mass density turns over and drops as we approach the center of the galaxy. Our results show that MDM, which restricts the nature of the dark matter quantum by accounting for Milgrom's scaling, accurately reproduces observed rotation curves.

  15. A MATLAB-Aided Method for Teaching Calculus-Based Business Mathematics

    Science.gov (United States)

    Liang, Jiajuan; Pan, William S. Y.

    2009-01-01

    MATLAB is a powerful package for numerical computation. MATLAB contains a rich pool of mathematical functions and provides flexible plotting functions for illustrating mathematical solutions. The course of calculus-based business mathematics consists of two major topics: 1) derivative and its applications in business; and 2) integration and its…

  16. A Matlab program for stepwise regression

    Directory of Open Access Journals (Sweden)

    Yanhong Qi

    2016-03-01

    Full Text Available Stepwise linear regression is a multi-variable regression method for identifying the statistically significant variables in a linear regression equation. In the present study, we present a Matlab program for stepwise regression.
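
    The published program itself is not reproduced here; a comparable selection can be obtained with the Statistics and Machine Learning Toolbox function stepwisefit, assuming that toolbox is available, as in this sketch on synthetic data.

        % Hedged sketch: stepwise variable selection with stepwisefit (toolbox function).
        rng(1);
        X = randn(100, 5);                            % candidate predictors
        y = 2*X(:,1) - 3*X(:,3) + 0.5*randn(100, 1);  % only two predictors matter
        [b, se, pval, inmodel] = stepwisefit(X, y, 'penter', 0.05, 'premove', 0.10);
        disp(find(inmodel));                          % indices of the selected variables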

  17. Implementation and Comparison of the Lifting 5/3 and 9/7 Algorithms in MatLab on GPU

    Directory of Open Access Journals (Sweden)

    Randa Khemiri

    2016-06-01

    Full Text Available In order to accelerate the Discrete Wavelet Transform (DWT), we have implemented and compared the lifting "Le Gall 5/3" and "Cohen-Daubechies-Feauveau 9/7" (CDF 9/7) algorithms on a low-cost NVIDIA GPU. The implementation is realized in MatLab using the Parallel Computing Toolbox (PCT). Our experimental results indicate that the speedup is proportional to the image size until it attains a maximum at 2048x2048 pixels, beyond which the curve decreases. The GPU performance improves by a factor of about 2-3 compared with the CPU.
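
    A minimal sketch of one Le Gall 5/3 lifting step on the GPU with the Parallel Computing Toolbox is given below; it is not the paper's implementation, and the symmetric boundary handling and the random test image are assumptions.

        % Hedged sketch: one horizontal 5/3 lifting step on a gpuArray image.
        A = gpuArray(rand(2048));              % example 2048x2048 image on the GPU
        even = A(:, 1:2:end);  odd = A(:, 2:2:end);
        evR = [even(:, 2:end) even(:, end)];   % right neighbour (symmetric edge)
        d = odd - floor((even + evR)/2);       % predict step (detail / high-pass)
        dL = [d(:, 1) d(:, 1:end-1)];          % left neighbour of the detail band
        s = even + floor((dL + d + 2)/4);      % update step (approximation / low-pass)
        lowpass = gather(s); highpass = gather(d);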

  18. Fitness of gutta-percha cones in curved root canals prepared with reciprocating files correlated with tug-back sensation.

    Science.gov (United States)

    Yoon, Heeyoung; Baek, Seung-Ho; Kum, Kee-Yeon; Kim, Hyeon-Cheol; Moon, Young-Mi; Fang, Denny Y; Lee, WooCheol

    2015-01-01

    The purpose of this study was to evaluate the gutta-percha-occupied area (GPOA) and the relationship between GPOA and tug-back sensations in canals instrumented with reciprocating files. Twenty curved canals were instrumented using Reciproc R25 (VDW, Munich, Germany) (group R) and WaveOne Primary (Dentsply Maillefer, Ballaigues, Switzerland) (group W), respectively (n = 10 each). The presence or absence of a tug-back sensation was decided for both the #25/.08 and #30/.06 cones in every canal. The percentage of GPOA at 1-, 2-, and 3-mm levels from the working length was calculated using micro-computed tomographic imaging. The correlation between the sum of the GPOA and the presence of a tug-back sensation was also investigated. The data were analyzed statistically at P = .05. A tug-back sensation was present in 45% and 100% of canals for #25/.08 and #30/.06 cones, respectively, with a significant difference (P < .05) ... sensation (P > .05). Under the conditions of this study, the tug-back sensation can be a definitive determinant for indicating higher cone fitness in the curved canal regardless of the cone type. Copyright © 2015 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  19. Matlab Tools: An Alternative to Planning Systems in Brachytherapy Treatments

    International Nuclear Information System (INIS)

    Herrera, Higmar; Rodriguez, Mercedes; Rodriguez, Miguel

    2006-01-01

    This work proposes the use of the Matlab environment to obtain the treatment dose based on the data reported by Krishnaswamy and Liu et al. The comparison with reported measurements is shown for the Amersham source model. For the 3M source model, measurements with TLDs and a Monte Carlo simulation are compared to the data obtained with Matlab. The difference for the Amersham model is well under the 15% recommended by the IAEA and, for the 3M model, although the difference is greater, the results are consistent. The good agreement with the reported data allows the Matlab calculations to be used in daily brachytherapy treatments

  20. Signals and systems with MATLAB

    CERN Document Server

    Yang, Won Young; Song, Ik H; Cho, Yong S

    2009-01-01

    Covers some of the theoretical foundations and mathematical derivations that can be used in higher-level related subjects such as signal processing, communication, and control, minimizing the mathematical difficulty and computational burden. This book illustrates the usage of MATLAB and Simulink for signal and system analysis and design.

  1. A Matlab/Simulink-Based Interactive Module for Servo Systems Learning

    Science.gov (United States)

    Aliane, N.

    2010-01-01

    This paper presents an interactive module for learning both the fundamental and practical issues of servo systems. This module, developed using Simulink in conjunction with the Matlab graphical user interface (Matlab-GUI) tool, is used to supplement conventional lectures in control engineering and robotics subjects. First, the paper introduces the…

  2. Four points function fitted and first derivative procedure for determining the end points in potentiometric titration curves: statistical analysis and method comparison.

    Science.gov (United States)

    Kholeif, S A

    2001-06-01

    A new method that belongs to the differential category for determining the end points from potentiometric titration curves is presented. It uses a preprocess to find first derivative values by fitting four data points in and around the region of inflection to a non-linear function, and then locate the end point, usually as a maximum or minimum, using an inverse parabolic interpolation procedure that has an analytical solution. The behavior and accuracy of the sigmoid and cumulative non-linear functions used are investigated against three factors. A statistical evaluation of the new method using linear least-squares method validation and multifactor data analysis are covered. The new method is generally applied to symmetrical and unsymmetrical potentiometric titration curves, and the end point is calculated using numerical procedures only. It outperforms the "parent" regular differential method in almost all factors levels and gives accurate results comparable to the true or estimated true end points. Calculated end points from selected experimental titration curves compatible with the equivalence point category of methods, such as Gran or Fortuin, are also compared with the new method.
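
    To make the final step concrete, the sketch below applies the analytical inverse-parabolic interpolation to three assumed first-derivative points around the inflection; the paper's preceding four-point non-linear pre-fit is not reproduced.

        % Hedged sketch: vertex of the parabola through three derivative points.
        V  = [9.8 10.0 10.2];      % titrant volumes near the end point (assumed)
        dE = [41.0 55.0 47.0];     % first-derivative values dE/dV (assumed)
        num = (V(2)-V(1))^2*(dE(2)-dE(3)) - (V(2)-V(3))^2*(dE(2)-dE(1));
        den = (V(2)-V(1))  *(dE(2)-dE(3)) - (V(2)-V(3))  *(dE(2)-dE(1));
        Vend = V(2) - 0.5*num/den; % analytical location of the extremum
        fprintf('Estimated end point: %.3f mL\n', Vend);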

  3. Non-linear modelling to describe lactation curve in Gir crossbred cows

    Directory of Open Access Journals (Sweden)

    Yogesh C. Bangar

    2017-02-01

    Full Text Available Abstract Background The modelling of the lactation curve provides guidelines for formulating farm managerial practices in dairy cows. The aim of the present study was to determine the suitable non-linear model which most accurately fitted the lactation curves of five lactations in 134 Gir crossbred cows reared in the Research-Cum-Development Project (RCDP) on Cattle farm, MPKV (Maharashtra). Four models, viz. gamma-type function, quadratic model, mixed log function and Wilmink model, were fitted to each lactation separately and then compared on the basis of goodness-of-fit measures, viz. adjusted R2, root mean square error (RMSE), Akaike's Information Criterion (AIC) and Bayesian Information Criterion (BIC). Results In general, the highest milk yield was observed in the fourth lactation whereas it was lowest in the first lactation. Among the models investigated, the mixed log function and the gamma-type function provided the best fit of the lactation curve for the first and the remaining lactations, respectively. The quadratic model gave the poorest fit to the lactation curve in almost all lactations. Peak yield was highest and lowest in the fourth and first lactation, respectively. Further, the first lactation showed the highest persistency but a relatively longer time to achieve peak yield than the other lactations. Conclusion Lactation curve modelling using the gamma-type function may be helpful for setting management strategies at the farm level; however, the modelling must be optimized regularly before implementation to enhance productivity in Gir crossbred cows.
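
    For readers who want to try the gamma-type (Wood) function in Matlab, a hedged sketch follows; it uses lsqcurvefit from the Optimization Toolbox on synthetic yields, not the study's Gir crossbred data.

        % Hedged sketch: fitting the Wood lactation curve y = a*t^b*exp(-c*t).
        t = (5:10:305)';                                     % days in milk
        y = 14*t.^0.25.*exp(-0.003*t) + 0.5*randn(size(t));  % synthetic daily yields
        wood = @(p, t) p(1)*t.^p(2).*exp(-p(3)*t);
        p0 = [10 0.2 0.002];                                 % initial guess for [a b c]
        p  = lsqcurvefit(wood, p0, t, y);                    % Optimization Toolbox
        plot(t, y, 'o', t, wood(p, t), '-');
        xlabel('Days in milk'); ylabel('Milk yield (kg/day)');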

  4. Instruction of pattern recognition by MATLAB practice 1

    International Nuclear Information System (INIS)

    1999-06-01

    This book describes pattern recognition through MATLAB practice. It covers the possibilities and limits of AI, an introduction to pattern recognition, vectors and matrices, basic statistics and probability theory, random variables and probability distributions, statistical decision theory, data mining, Gaussian mixture models, nerve cell modeling such as Hebb's learning rule and the LMS learning rule, genetic algorithms, dynamic programming and DTW, HMMs (Markov models and HMM's three problems and their solutions), and an introduction to SVMs with the KKT condition, margin optimization, the kernel trick and MATLAB practice.

  5. An Introduction to Goodness of Fit for PMU Parameter Estimation

    Energy Technology Data Exchange (ETDEWEB)

    Riepnieks, Artis; Kirkham, Harold

    2017-10-01

    New results of measurements of phasor-like signals are presented based on our previous work on the topic. In this document an improved estimation method is described. The algorithm (which is realized in MATLAB software) is discussed. We examine the effect of noisy and distorted signals on the Goodness of Fit metric. The estimation method is shown to perform very well with clean data, with a measurement window as short as half a cycle and as few as 5 samples per cycle. The Goodness of Fit decreases predictably with added phase noise, and seems to be acceptable even with visible distortion in the signal. While the exact results we obtain are specific to our method of estimation, the Goodness of Fit method could be implemented in any phasor measurement unit.

  6. Curve fitting of the corporate recovery rates: the comparison of Beta distribution estimation and kernel density estimation.

    Science.gov (United States)

    Chen, Rongda; Wang, Ze

    2013-01-01

    Recovery rate is essential to the estimation of the portfolio's loss and economic capital. Neglecting the randomness of the distribution of recovery rate may underestimate the risk. The study introduces two kinds of models of distribution, Beta distribution estimation and kernel density distribution estimation, to simulate the distribution of recovery rates of corporate loans and bonds. As is known, models based on Beta distribution are common in daily usage, such as CreditMetrics by J.P. Morgan, Portfolio Manager by KMV and Losscalc by Moody's. However, it has a fatal defect that it can't fit the bimodal or multimodal distributions such as recovery rates of corporate loans and bonds as Moody's new data show. In order to overcome this flaw, the kernel density estimation is introduced and we compare the simulation results by histogram, Beta distribution estimation and kernel density estimation to reach the conclusion that the Gaussian kernel density distribution really better imitates the distribution of the bimodal or multimodal data samples of corporate loans and bonds. Finally, a Chi-square test of the Gaussian kernel density estimation proves that it can fit the curve of recovery rates of loans and bonds. So using the kernel density distribution to precisely delineate the bimodal recovery rates of bonds is optimal in credit risk management.
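
    The contrast between the two estimators can be reproduced in Matlab along the following lines; this is a hedged sketch on synthetic bimodal data (the paper's loan and bond recovery data are not reproduced) and it assumes the Statistics and Machine Learning Toolbox.

        % Hedged sketch: single Beta fit vs. Gaussian kernel density on bimodal data.
        rng(0);
        r = [betarnd(2, 8, 500, 1); betarnd(9, 2, 500, 1)];  % synthetic recovery rates
        phat = betafit(r);                                   % one Beta distribution fit
        xi = linspace(0.001, 0.999, 200);
        fBeta = betapdf(xi, phat(1), phat(2));
        [fKde, xk] = ksdensity(r, xi);                       % Gaussian kernel by default
        histogram(r, 30, 'Normalization', 'pdf'); hold on
        plot(xi, fBeta, '--', xk, fKde, '-'); hold off
        legend('data', 'Beta fit', 'kernel density');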

  7. Elements of matrix modeling and computing with Matlab

    CERN Document Server

    White, Robert E

    2006-01-01

    As discrete models and computing have become more common, there is a need to study matrix computation and numerical linear algebra. Encompassing a diverse mathematical core, Elements of Matrix Modeling and Computing with MATLAB examines a variety of applications and their modeling processes, showing you how to develop matrix models and solve algebraic systems. Emphasizing practical skills, it creates a bridge from problems with two and three variables to more realistic problems that have additional variables. Elements of Matrix Modeling and Computing with MATLAB focuses on seven basic applicat

  8. Estimating aquifer transmissivity from specific capacity using MATLAB.

    Science.gov (United States)

    McLin, Stephen G

    2005-01-01

    Historically, specific capacity information has been used to calculate aquifer transmissivity when pumping test data are unavailable. This paper presents a simple computer program written in the MATLAB programming language that estimates transmissivity from specific capacity data while correcting for aquifer partial penetration and well efficiency. The program graphically plots transmissivity as a function of these factors so that the user can visually estimate their relative importance in a particular application. The program is compatible with any computer operating system running MATLAB, including Windows, Macintosh OS, Linux, and Unix. Two simple examples illustrate program usage.

  9. A curve fitting method for extrinsic camera calibration from a single image of a cylindrical object

    International Nuclear Information System (INIS)

    Winkler, A W; Zagar, B G

    2013-01-01

    An important step in the process of optical steel coil quality assurance is to measure the proportions of width and radius of steel coils as well as the relative position and orientation of the camera. This work attempts to estimate these extrinsic parameters from single images by using the cylindrical coil itself as the calibration target. Therefore, an adaptive least-squares algorithm is applied to fit parametrized curves to the detected true coil outline in the acquisition. The employed model allows for strictly separating the intrinsic and the extrinsic parameters. Thus, the intrinsic camera parameters can be calibrated beforehand using available calibration software. Furthermore, a way to segment the true coil outline in the acquired images is motivated. The proposed optimization method yields highly accurate results and can be generalized even to measure other solids which cannot be characterized by the identification of simple geometric primitives. (paper)
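
    The generic step the method rests on, least-squares fitting of a parametrized curve to detected outline points, can be sketched as below; the simple axis-aligned ellipse and the synthetic points are assumptions, not the paper's coil and camera model.

        % Hedged sketch: least-squares fit of a parametrized curve to outline points.
        theta = linspace(0, 2*pi, 200)';
        pts = [120*cos(theta) + 3*randn(200,1), 80*sin(theta) + 3*randn(200,1)];
        resid = @(p) (pts(:,1) - p(1)).^2/p(3)^2 + (pts(:,2) - p(2)).^2/p(4)^2 - 1;
        p0 = [0 0 100 100];                    % centre and semi-axes, initial guess
        p  = lsqnonlin(resid, p0);             % Optimization Toolbox
        fprintf('centre (%.1f, %.1f), semi-axes %.1f x %.1f\n', p);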

  10. A curve fitting method for extrinsic camera calibration from a single image of a cylindrical object

    Science.gov (United States)

    Winkler, A. W.; Zagar, B. G.

    2013-08-01

    An important step in the process of optical steel coil quality assurance is to measure the proportions of width and radius of steel coils as well as the relative position and orientation of the camera. This work attempts to estimate these extrinsic parameters from single images by using the cylindrical coil itself as the calibration target. Therefore, an adaptive least-squares algorithm is applied to fit parametrized curves to the detected true coil outline in the acquisition. The employed model allows for strictly separating the intrinsic and the extrinsic parameters. Thus, the intrinsic camera parameters can be calibrated beforehand using available calibration software. Furthermore, a way to segment the true coil outline in the acquired images is motivated. The proposed optimization method yields highly accurate results and can be generalized even to measure other solids which cannot be characterized by the identification of simple geometric primitives.

  11. MATLAB/SIMULINK platform for simulation of CANDU reactor control system

    International Nuclear Information System (INIS)

    Javidnia, H.; Jiang, J.

    2007-01-01

    In this paper a simulation platform for CANDU reactors' control system is presented. The platform is built on MATLAB/SIMULINK interactive graphical interface. Since MATLAB/SIMULINK are powerful tools to describe systems mathematically, all the subsystems in a CANDU reactor are represented in MATLAB's language and are implemented in SIMULINK graphical representation. The focus of the paper is on the flux control loop of CANDU reactors. However, the ideas can be extended to include other parts in CANDU power plants and the same technique can be applied to other types of nuclear reactors and their control systems. The CANDU reactor model and xenon feedback model are also discussed in this paper. (author)

  12. Identifying ambiguous prostate gland contours from histology using capsule shape information and least squares curve fitting

    Energy Technology Data Exchange (ETDEWEB)

    Hussein, Rania [DigiPen Institute of Technology, Department of Computer Engineering, Redmond, WA (United States); McKenzie, Frederic D. [Old Dominion University, Department of Electrical and Computer Engineering, Norfolk, VA (United States)

    2007-12-15

    To obtain an accurate assessment of the percentage and depth of extra-capsular soft tissue removed with the prostate by the various surgical techniques in order to help surgeons in determining the appropriateness of different surgical approaches. This can be enhanced by an accurate and automated means of identifying the prostate gland contour. To facilitate 3D reconstruction and, ultimately, more accurate analyses, it is essential for us to identify the capsule boundary that separates the prostate gland tissue from its extra-capsular tissue. However, the capsule is sometimes unrecognizable due to the naturally occurring intrusion of muscle and connective tissue into the prostate gland. At these regions where the capsule disappears, its contour can be arbitrarily created with a continuing contour line based on the natural shape of the prostate. We utilize an algorithm based on a least squares curve fitting technique that uses a prostate shape equation to merge previously detected capsule parts with the shape equation to produce an approximated curve that represents the prostate capsule. We have tested our algorithm using three different shapes on 13 histologic prostate slices that are cut at different locations from the apex. The best result shows a 90% average contour match when compared to pathologist-drawn contours. We believe that automatically identifying histologic prostate contours will lead to increased objective analyses of surgical margins and extracapsular spread of cancer. Our results show that this is achievable. (orig.)

  13. Identifying ambiguous prostate gland contours from histology using capsule shape information and least squares curve fitting

    International Nuclear Information System (INIS)

    Hussein, Rania; McKenzie, Frederic D.

    2007-01-01

    To obtain an accurate assessment of the percentage and depth of extra-capsular soft tissue removed with the prostate by the various surgical techniques in order to help surgeons in determining the appropriateness of different surgical approaches. This can be enhanced by an accurate and automated means of identifying the prostate gland contour. To facilitate 3D reconstruction and, ultimately, more accurate analyses, it is essential for us to identify the capsule boundary that separates the prostate gland tissue from its extra-capsular tissue. However, the capsule is sometimes unrecognizable due to the naturally occurring intrusion of muscle and connective tissue into the prostate gland. At these regions where the capsule disappears, its contour can be arbitrarily created with a continuing contour line based on the natural shape of the prostate. We utilize an algorithm based on a least squares curve fitting technique that uses a prostate shape equation to merge previously detected capsule parts with the shape equation to produce an approximated curve that represents the prostate capsule. We have tested our algorithm using three different shapes on 13 histologic prostate slices that are cut at different locations from the apex. The best result shows a 90% average contour match when compared to pathologist-drawn contours. We believe that automatically identifying histologic prostate contours will lead to increased objective analyses of surgical margins and extracapsular spread of cancer. Our results show that this is achievable. (orig.)

  14. MATLAB Application for Recognizing Handwritten Characters (Aplikasi MATLAB untuk Mengenali Karakter Tulisan Tangan)

    Directory of Open Access Journals (Sweden)

    ali mahmudi

    2017-03-01

    Full Text Available Handwriting recognition is one of the most interesting research objects in the fields of image processing, artificial intelligence and computer vision, because handwritten characters vary from individual to individual: the style, size and orientation of everyone's handwriting is different. Handwriting recognition has been used in many applications, such as reading bank deposit slips, reading postal codes on letters, and helping people manage documents. This paper presents a handwriting recognition application built with Matlab. The Matlab toolboxes used in this research are the Image Processing and Neural Network Toolboxes.

  15. Stress analysis in curved composites due to thermal loading

    Science.gov (United States)

    Polk, Jared Cornelius

    of such a problem. It was ascertained and proven that the general, non-modified (original) version of classical lamination theory cannot be used for an analytical solution for a simply curved beam or any other structure that would require rotations of laminates out their planes in space. Finite element analysis was used to ascertain stress variations in a simply curved beam. It was verified that these solutions reduce to the flat beam solutions as the radius of curvature of the beams tends to infinity. MATLAB was used to conduct the classical lamination theory numerical analysis. A MATLAB program was written to conduct the finite element analysis for the flat and curved beams, isotropic and composite. It does not require incompatibility techniques used in mechanics of isotropic materials for indeterminate structures that are equivalent to fixed-beam problems. Finally, it has the ability to enable the user to define and create unique elements not accessible in commercial software, and modify finite element procedures to take advantage of new paradigms.

  16. Matlab for electrical and computer engineering students and professionals with Simulink

    CERN Document Server

    Priemer, Roland

    2013-01-01

    This book combines the teaching of the MATLAB® programming language with the presentation and development of carefully selected electrical and computer engineering (ECE) fundamentals. This is what distinguishes it from other books concerned with MATLAB®: it is directed specifically to ECE concerns.

  17. Scilab and MATLAB Interfaces to MUMPS (version 4.6 or greater)

    OpenAIRE

    Fèvre , Aurélia; Pralet , Stéphane; L'Excellent , Jean-Yves

    2006-01-01

    This document describes the Scilab and MATLAB interfaces to MUMPS version 4.6. We describe the differences and similarities between usual Fortran/C MUMPS interfaces and its Scilab/MATLAB interfaces, the calling sequences and functionalities. Examples of use and experimental results are also provided.

  18. Spot quantification in two dimensional gel electrophoresis image analysis: comparison of different approaches and presentation of a novel compound fitting algorithm

    Science.gov (United States)

    2014-01-01

    Background Various computer-based methods exist for the detection and quantification of protein spots in two dimensional gel electrophoresis images. Area-based methods are commonly used for spot quantification: an area is assigned to each spot and the sum of the pixel intensities in that area, the so-called volume, is used as a measure for spot signal. Other methods use the optical density, i.e. the intensity of the most intense pixel of a spot, or calculate the volume from the parameters of a fitted function. Results In this study we compare the performance of different spot quantification methods using synthetic and real data. We propose a ready-to-use algorithm for spot detection and quantification that uses fitting of two dimensional Gaussian function curves for the extraction of data from two dimensional gel electrophoresis (2-DE) images. The algorithm implements fitting using logical compounds and is computationally efficient. The applicability of the compound fitting algorithm was evaluated for various simulated data and compared with other quantification approaches. We provide evidence that even if an incorrect bell-shaped function is used, the fitting method is superior to other approaches, especially when spots overlap. Finally, we validated the method with experimental data of urea-based 2-DE of Aβ peptides and re-analyzed published data sets. Our methods showed higher precision and accuracy than other approaches when applied to exposure time series and standard gels. Conclusion Compound fitting as a quantification method for 2-DE spots shows several advantages over other approaches and could be combined with various spot detection methods. The algorithm was scripted in MATLAB (Mathworks) and is available as a supplemental file. PMID:24915860
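
    As a hedged illustration of the basic building block (fitting a single two-dimensional Gaussian to one spot and taking its analytical volume), the sketch below uses a synthetic spot and the Optimization Toolbox; the paper's compound fitting of overlapping spots is more elaborate.

        % Hedged sketch: fit one 2-D Gaussian spot and compute its volume.
        [X, Y] = meshgrid(1:40, 1:40);
        Z = 200*exp(-((X-18).^2 + (Y-22).^2)/(2*4^2)) + 5*randn(40);  % synthetic spot
        g2d = @(p, xy) p(1)*exp(-((xy(:,1)-p(2)).^2 + (xy(:,2)-p(3)).^2)/(2*p(4)^2)) + p(5);
        xy = [X(:) Y(:)];
        p0 = [max(Z(:)) 20 20 3 0];                   % amplitude, centre, width, offset
        p  = lsqcurvefit(g2d, p0, xy, Z(:));          % Optimization Toolbox
        volume = 2*pi*p(1)*p(4)^2;                    % analytical volume of the fitted Gaussian
        fprintf('fitted spot volume: %.1f\n', volume);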

  19. Development of MATLAB Scripts for the Calculation of Thermal Manikin Regional Resistance Values

    Science.gov (United States)

    2016-01-01

    USARIEM Technical Note TN16-1, January 2016: Development of MATLAB® Scripts for the Calculation of Thermal Manikin Regional Resistance Values. Executive Summary: A software tool has been developed via MATLAB® scripts to reduce the amount of repetitive and time-consuming calculations that are

  20. Considerations for reference pump curves

    International Nuclear Information System (INIS)

    Stockton, N.B.

    1992-01-01

    This paper examines problems associated with inservice testing (IST) of pumps to assess their hydraulic performance using reference pump curves to establish acceptance criteria. Safety-related pumps at nuclear power plants are tested under the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code (the Code), Section XI. The Code requires testing pumps at specific reference points of differential pressure or flow rate that can be readily duplicated during subsequent tests. There are many cases where test conditions cannot be duplicated. For some pumps, such as service water or component cooling pumps, the flow rate at any time depends on plant conditions and the arrangement of multiple independent and constantly changing loads. System conditions cannot be controlled to duplicate a specific reference value. In these cases, utilities frequently request to use pump curves for comparison of test data for acceptance. There is no prescribed method for developing a pump reference curve. The methods vary and may yield substantially different results. Some results are conservative when compared to the Code requirements; some are not. The errors associated with different curve testing techniques should be understood and controlled within reasonable bounds. Manufacturer's pump curves, in general, are not sufficiently accurate to use as reference pump curves for IST. Testing using reference curves generated with polynomial least squares fits over limited ranges of pump operation, cubic spline interpolation, or cubic spline least squares fits can provide a measure of pump hydraulic performance that is at least as accurate as the Code required method. Regardless of the test method, error can be reduced by using more accurate instruments, by correcting for systematic errors, by increasing the number of data points, and by taking repetitive measurements at each data point
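
    Two of the curve-generation techniques discussed, a polynomial least-squares fit over a limited range and a cubic spline, are easy to compare in MATLAB; the sketch below uses made-up pump test points and is only an illustration of the idea, not a Code-compliant procedure.

        % Hedged illustration: polynomial and cubic-spline reference curves.
        Q  = [200 400 600 800 1000 1200];   % flow rate, hypothetical test points
        dP = [95 92 86 78 66 50];           % differential pressure, hypothetical
        pcoef = polyfit(Q, dP, 2);          % quadratic least-squares reference curve
        Qq = linspace(min(Q), max(Q), 200);
        plot(Q, dP, 'o', Qq, polyval(pcoef, Qq), '-', Qq, spline(Q, dP, Qq), '--');
        legend('test data', 'polynomial fit', 'cubic spline');
        xlabel('Flow rate'); ylabel('Differential pressure');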

  1. Open Source Power Plant Simulator Development Under Matlab Environment

    International Nuclear Information System (INIS)

    Ratemi, W.M.; Fadilah, S.M.; Abonoor, N

    2008-01-01

    In this paper an open source programming approach is proposed for the development of a power plant simulator in the Matlab environment. With this approach many individuals can contribute to the development of the simulator by developing power plant components at different levels of complexity. Such modules can be modeled based on physical principles, or using neural networks or other methods. All of these modules are categorized in a Matlab library, from which the user can select components and build up a simulator. Many international companies have developed their own authoring tools for building simulators, which therefore remain proprietary and are available only at high cost. Matlab is general-purpose software developed by MathWorks that, with its toolboxes, can be used as the authoring tool for the development of components by different individuals; through appropriate coordination, different plant simulators (nuclear, conventional, or even research reactor) can be assembled on the computer. In this paper, power plant components such as a pressurizer, a reactor, a steam generator, a turbine, a condenser, a feedwater heater, a valve and a pump are modeled based on physical principles. A prototype model of a reactor (a scram case) based on neural networks is also developed. These modules are inserted in two different Matlab libraries, one called physical and the other called neural. Furthermore, during the simulation one can pause, shuffle the modules selected from the two libraries, and then resume the simulation. In addition, a PID controller for the multi-loop plant is developed under the Matlab environment and can be integrated to control the assembled simulator. This paper is an attempt to establish the open source approach for the development of power plant simulators, or even research reactor simulators; coordination among interested individuals or institutions is then required to bring it to a professional level. (author)

  2. Rapid BeagleBoard prototyping with MATLAB and Simulink

    CERN Document Server

    Qin, Fei

    2013-01-01

    This book is a fast-paced guide with practical, hands-on recipes which will show you how to prototype Beagleboard-based audio/video applications using Matlab/Simulink and Sourcery Codebench on a Windows host. Beagleboard Embedded Projects is great for students and academic researchers who have practical ideas and who want to build a proof-of-concept system on an embedded hardware platform quickly and efficiently. It is also useful for product design engineers who want to ratify their applications and reduce the time-to-market. It is assumed that you are familiar with Matlab/Simulink and have som

  3. How to Interface Fortran with Matlab

    OpenAIRE

    Sagastizábal , Claudia; Vige , Guillaume

    1995-01-01

    Projet PROMATH; We describe the general procedure for interfacing Fortran routines with Matlab. We explain how to write a mex-file and the associated gateway function. In particular, each different type of argument is considered in detail. We finish with an illustrative example

  4. Peak oil analyzed with a logistic function and idealized Hubbert curve

    International Nuclear Information System (INIS)

    Gallagher, Brian

    2011-01-01

    A logistic function is used to characterize peak and ultimate production of global crude oil and petroleum-derived liquid fuels. Annual oil production data were incrementally summed to construct a logistic curve in its initial phase. Using a curve-fitting approach, a population-growth logistic function was applied to complete the cumulative production curve. The simulated curve was then deconstructed into a set of annual oil production data producing an 'idealized' Hubbert curve. An idealized Hubbert curve (IHC) is defined as having properties of production data resulting from a constant growth-rate under fixed resource limits. An IHC represents a potential production curve constructed from cumulative production data and provides a new perspective for estimating peak production periods and remaining resources. The IHC model data show that idealized peak oil production occurred in 2009 at 83.2 Mb/d (30.4 Gb/y). IHC simulations of truncated historical oil production data produced similar results and indicate that this methodology can be useful as a prediction tool. - Research Highlights: →Global oil production data were analyzed by a simple curve fitting method. →Best fit-curve results were obtained using two logistic functions on select data. →A broad potential oil production peak is forecast for the years from 2004 to 2014. →Similar results were obtained using historical data from about 10 to 30 years ago. →Two potential oil production decline scenarios were presented and compared.
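
    The core of the approach, fitting a logistic function to cumulative production and differentiating it to obtain an idealized Hubbert curve, can be sketched as follows; the production series is synthetic and lsqcurvefit (Optimization Toolbox) stands in for whatever fitting routine the author used.

        % Hedged sketch: logistic fit of cumulative production, then an idealized Hubbert curve.
        t = (1950:2010)';
        Q = 2200 ./ (1 + exp(-0.05*(t - 2005))) + 10*randn(size(t));  % synthetic cumulative Gb
        logi = @(p, t) p(1) ./ (1 + exp(-p(2)*(t - p(3))));           % ultimate, rate, midpoint
        p = lsqcurvefit(logi, [2000 0.04 2000], t, Q);                % Optimization Toolbox
        tq = (1950:2060)';
        annual = diff(logi(p, tq));                                   % idealized Hubbert curve
        [pk, i] = max(annual);
        fprintf('Idealized peak of %.1f Gb/y in %d\n', pk, tq(i+1));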

  5. Curve fitting of the corporate recovery rates: the comparison of Beta distribution estimation and kernel density estimation.

    Directory of Open Access Journals (Sweden)

    Rongda Chen

    Full Text Available Recovery rate is essential to the estimation of the portfolio's loss and economic capital. Neglecting the randomness of the distribution of recovery rate may underestimate the risk. The study introduces two kinds of models of distribution, Beta distribution estimation and kernel density distribution estimation, to simulate the distribution of recovery rates of corporate loans and bonds. As is known, models based on Beta distribution are common in daily usage, such as CreditMetrics by J.P. Morgan, Portfolio Manager by KMV and Losscalc by Moody's. However, it has a fatal defect that it can't fit the bimodal or multimodal distributions such as recovery rates of corporate loans and bonds as Moody's new data show. In order to overcome this flaw, the kernel density estimation is introduced and we compare the simulation results by histogram, Beta distribution estimation and kernel density estimation to reach the conclusion that the Gaussian kernel density distribution really better imitates the distribution of the bimodal or multimodal data samples of corporate loans and bonds. Finally, a Chi-square test of the Gaussian kernel density estimation proves that it can fit the curve of recovery rates of loans and bonds. So using the kernel density distribution to precisely delineate the bimodal recovery rates of bonds is optimal in credit risk management.

  6. Curve Fitting of the Corporate Recovery Rates: The Comparison of Beta Distribution Estimation and Kernel Density Estimation

    Science.gov (United States)

    Chen, Rongda; Wang, Ze

    2013-01-01

    Recovery rate is essential to the estimation of the portfolio’s loss and economic capital. Neglecting the randomness of the distribution of recovery rate may underestimate the risk. The study introduces two kinds of models of distribution, Beta distribution estimation and kernel density distribution estimation, to simulate the distribution of recovery rates of corporate loans and bonds. As is known, models based on Beta distribution are common in daily usage, such as CreditMetrics by J.P. Morgan, Portfolio Manager by KMV and Losscalc by Moody’s. However, it has a fatal defect that it can’t fit the bimodal or multimodal distributions such as recovery rates of corporate loans and bonds as Moody’s new data show. In order to overcome this flaw, the kernel density estimation is introduced and we compare the simulation results by histogram, Beta distribution estimation and kernel density estimation to reach the conclusion that the Gaussian kernel density distribution really better imitates the distribution of the bimodal or multimodal data samples of corporate loans and bonds. Finally, a Chi-square test of the Gaussian kernel density estimation proves that it can fit the curve of recovery rates of loans and bonds. So using the kernel density distribution to precisely delineate the bimodal recovery rates of bonds is optimal in credit risk management. PMID:23874558

  7. Can CCM law properly represent all extinction curves?

    International Nuclear Information System (INIS)

    Geminale, Anna; Popowski, Piotr

    2005-01-01

    We present the analysis of a large sample of lines of sight with extinction curves covering the wavelength range from near-infrared (NIR) to ultraviolet (UV). We derive total to selective extinction ratios based on the Cardelli, Clayton and Mathis (1989, CCM) law, which is typically used to fit the extinction data both for the diffuse and the dense interstellar medium. We conclude that the CCM law is able to fit most of the extinction curves in our sample. We divide the remaining lines of sight with peculiar extinction into two groups according to two main behaviors: a) the optical/IR and/or UV wavelength region cannot be reproduced by the CCM formula; b) the optical/NIR and UV extinction data are best fit by the CCM law with different values of R_V. We present examples of such curves. The study of both types of peculiar cases can help us to learn about the physical processes that affect dust in the interstellar medium, e.g., formation of mantles on the surface of grains, evaporation, growth or shattering

  8. Feasible Path Generation Using Bezier Curves for Car-Like Vehicle

    Science.gov (United States)

    Latip, Nor Badariyah Abdul; Omar, Rosli

    2017-08-01

    When planning a collision-free path for an autonomous vehicle, the main criteria that have to be considered are the shortest distance, lower computation time and completeness, i.e. a path can be found if one exists. Besides that, a feasible path for the autonomous vehicle is also crucial to guarantee that the vehicle can reach the target destination considering its kinematic constraints such as non-holonomy and minimum turning radius. In order to address these constraints, Bezier curves are applied. In this paper, Bezier curves are modeled and simulated using Matlab software and the feasibility of the resulting path is analyzed. The Bezier curve is derived from a piece-wise linear pre-planned path. It is found that Bezier curves have the capability of making the planned path feasible and could be embedded in a path planning algorithm for an autonomous vehicle with kinematic constraints. It is concluded that the length of the segments of the pre-planned path has to be greater than a nominal value, derived from the vehicle wheelbase, maximum steering angle and maximum speed, to ensure that the path for the autonomous car is feasible.
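
    A cubic Bezier segment is straightforward to evaluate in Matlab; the sketch below uses four assumed control points and relies on implicit expansion (R2016b or later), and it leaves out the kinematic feasibility checks discussed in the paper.

        % Hedged sketch: evaluate and plot a cubic Bezier curve from four control points.
        P = [0 0; 2 0; 3 2; 5 2];                  % assumed control points [x y]
        t = linspace(0, 1, 100)';
        B = (1-t).^3.*P(1,:) + 3*(1-t).^2.*t.*P(2,:) + ...
            3*(1-t).*t.^2.*P(3,:) + t.^3.*P(4,:);  % implicit expansion (R2016b+)
        plot(P(:,1), P(:,2), 'o--', B(:,1), B(:,2), '-');
        axis equal; legend('control polygon', 'Bezier path');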

  9. Dose-response curve estimation: a semiparametric mixture approach.

    Science.gov (United States)

    Yuan, Ying; Yin, Guosheng

    2011-12-01

    In the estimation of a dose-response curve, parametric models are straightforward and efficient but subject to model misspecifications; nonparametric methods are robust but less efficient. As a compromise, we propose a semiparametric approach that combines the advantages of parametric and nonparametric curve estimates. In a mixture form, our estimator takes a weighted average of the parametric and nonparametric curve estimates, in which a higher weight is assigned to the estimate with a better model fit. When the parametric model assumption holds, the semiparametric curve estimate converges to the parametric estimate and thus achieves high efficiency; when the parametric model is misspecified, the semiparametric estimate converges to the nonparametric estimate and remains consistent. We also consider an adaptive weighting scheme to allow the weight to vary according to the local fit of the models. We conduct extensive simulation studies to investigate the performance of the proposed methods and illustrate them with two real examples. © 2011, The International Biometric Society.

  10. Orthogonal transformations for change detection, Matlab code (ENVI-like headers)

    DEFF Research Database (Denmark)

    2007-01-01

    Matlab code to do (iteratively reweighted) multivariate alteration detection (MAD) analysis, maximum autocorrelation factor (MAF) analysis, canonical correlation analysis (CCA) and principal component analysis (PCA) on image data; accommodates ENVI (like) header files.

  11. ABAQUS2MATLAB: A Novel Tool for Finite Element Post-Processing

    DEFF Research Database (Denmark)

    Martínez Pañeda, Emilio; Papazafeiropoulos, George; Muniz-Calvente, Miguel

    2017-01-01

    A novel piece of software is presented to connect Abaqus, a sophisticated finite element package, with Matlab, the most comprehensive program for mathematical analysis. This interface between these well-known codes not only benefits from the image processing and the integrated graph-plotting features ... to demonstrate its capabilities. The source code, detailed documentation and a large number of tutorials can be freely downloaded from www.abaqus2matlab.com.

  12. Technological change in energy systems. Learning curves, logistic curves and input-output coefficients

    International Nuclear Information System (INIS)

    Pan, Haoran; Koehler, Jonathan

    2007-01-01

    Learning curves have recently been widely adopted in climate-economy models to incorporate endogenous change of energy technologies, replacing the conventional assumption of an autonomous energy efficiency improvement. However, there has been little consideration of the credibility of the learning curve. The current trend that many important energy and climate change policy analyses rely on the learning curve means that it is of great importance to critically examine the basis for learning curves. Here, we analyse the use of learning curves in energy technology, usually implemented as a simple power function. We find that the learning curve cannot separate the effects of price and technological change, cannot reflect continuous and qualitative change of both conventional and emerging energy technologies, cannot help to determine the time paths of technological investment, and misses the central role of R and D activity in driving technological change. We argue that a logistic curve of improving performance modified to include R and D activity as a driving variable can better describe the cost reductions in energy technologies. Furthermore, we demonstrate that the top-down Leontief technology can incorporate the bottom-up technologies that improve along either the learning curve or the logistic curve, through changing input-output coefficients. An application to UK wind power illustrates that the logistic curve fits the observed data better and implies greater potential for cost reduction than the learning curve does. (author)
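
    The two functional forms at issue can be compared directly in MATLAB; the sketch below fits a power-law learning curve and a logistic improvement curve to synthetic cost data (the UK wind power data are not reproduced) and assumes the Optimization Toolbox for the logistic fit.

        % Hedged comparison of a learning-curve power law and a logistic curve.
        x = logspace(0, 3, 40)';                          % cumulative capacity (synthetic)
        c = 100*x.^(-0.22) .* exp(0.05*randn(size(x)));   % synthetic unit cost
        plaw = polyfit(log(x), log(c), 1);                % log-log learning curve fit
        logi = @(p, x) p(1) + (p(2)-p(1))./(1 + (x/p(3)).^p(4));
        p = lsqcurvefit(logi, [20 100 50 1], x, c);       % Optimization Toolbox
        loglog(x, c, 'o', x, exp(polyval(plaw, log(x))), '-', x, logi(p, x), '--');
        legend('data', 'learning curve', 'logistic curve');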

  13. The curvature of sensitometric curves for Kodak XV-2 film irradiated with photon and electron beams.

    Science.gov (United States)

    van Battum, L J; Huizenga, H

    2006-07-01

    Sensitometric curves of Kodak XV-2 film, obtained over a period of ten years with various types of equipment, have been analyzed for both photon and electron beams. The sensitometric slope in the dataset varies by more than a factor of 2, which is attributed mainly to variations in developer conditions. In the literature, the single-hit equation has been proposed as a model for the sensitometric curve, with the sensitivity and maximum optical density as parameters. In this work, the single-hit equation has been translated into a polynomial-like function with the sensitometric slope and curvature as parameters. The model has been applied to fit the sensitometric data. If the dataset is fitted for each single sensitometric curve separately, a large variation is observed for both fit parameters. When sensitometric curves are fitted simultaneously, it appears that all curves can be fitted adequately with a sensitometric curvature that is related to the sensitometric slope. When fitting each curve separately, measurement uncertainty apparently hides this relation. This relation appears to depend only on the type of densitometer used. No significant differences between beam energies or beam modalities are observed. Using the intrinsic relation between slope and curvature in fitting sensitometric data, e.g., for pretreatment verification of intensity-modulated radiotherapy, will increase the accuracy of the sensitometric curve. A calibration at a single dose point, together with a predetermined densitometer-dependent parameter OD_max, will be adequate to find the actual relation between optical density and dose.

  14. Digital signal processing with Matlab examples

    CERN Document Server

    Giron-Sierra, Jose Maria

    2017-01-01

    This is the first volume in a trilogy on modern Signal Processing. The three books provide a concise exposition of signal processing topics, and a guide to support individual practical exploration based on MATLAB programs. This book includes MATLAB codes to illustrate each of the main steps of the theory, offering a self-contained guide suitable for independent study. The code is embedded in the text, helping readers to put into practice the ideas and methods discussed. The book is divided into three parts, the first of which introduces readers to periodic and non-periodic signals. The second part is devoted to filtering, which is an important and commonly used application. The third part addresses more advanced topics, including the analysis of real-world non-stationary signals and data, e.g. structural fatigue, earthquakes, electro-encephalograms, birdsong, etc. The book’s last chapter focuses on modulation, an example of the intentional use of non-stationary signals.

  15. Sediment Curve Uncertainty Estimation Using GLUE and Bootstrap Methods

    Directory of Open Access Journals (Sweden)

    aboalhasan fathabadi

    2017-02-01

    Full Text Available Introduction: In order to implement watershed practices to decrease soil erosion effects, the sediment output of the watershed needs to be estimated. The sediment rating curve is the most conventional tool used to estimate sediment. Owing to sampling errors and short records, there are uncertainties in estimating sediment using a sediment rating curve. In this research, the bootstrap and the Generalized Likelihood Uncertainty Estimation (GLUE) resampling techniques were used to calculate suspended sediment loads by using sediment rating curves. Materials and Methods: The total drainage area of the Sefidrood watershed is about 560000 km2. In this study, uncertainty in suspended sediment rating curves was estimated at four stations, including Motorkhane, Miyane Tonel Shomare 7, Stor and Glinak, constructed on the Ayghdamosh, Ghrangho, GHezelOzan and Shahrod rivers, respectively. Data were randomly divided into a training data set (80 percent) and a test set (20 percent) by Latin hypercube random sampling. Different suspended sediment rating curve equations were fitted to log-transformed values of sediment concentration and discharge, and the best-fit models were selected based on the lowest root mean square error (RMSE) and the highest coefficient of correlation (R2). In the GLUE methodology, different parameter sets were sampled randomly from a prior probability distribution. For each station, using the sampled parameter sets and the selected suspended sediment rating curve equation, suspended sediment concentration values were estimated several times (100000 to 400000 times). With respect to the likelihood function and a certain subjective threshold, parameter sets were divided into behavioral and non-behavioral parameter sets. Finally, using the behavioral parameter sets, the 95% confidence intervals for suspended sediment concentration due to parameter uncertainty were estimated. In the bootstrap methodology, observed suspended sediment and discharge vectors were resampled with replacement B (set to
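
    The bootstrap part of the analysis is simple to sketch in Matlab: resample the discharge-concentration pairs with replacement, refit the log-log rating curve each time, and take percentile bounds of the prediction. The data, the value of B and the use of prctile (Statistics Toolbox) below are assumptions for illustration only.

        % Hedged sketch: bootstrap uncertainty bounds for a log-log sediment rating curve.
        rng(2);
        Q = 10.^(1 + 1.5*rand(80, 1));                 % synthetic discharge
        C = 0.05*Q.^1.4 .* exp(0.3*randn(80, 1));      % synthetic sediment concentration
        B = 2000; pred = zeros(B, 1); Q0 = 150;        % predict at a chosen discharge Q0
        for b = 1:B
            k = randi(numel(Q), numel(Q), 1);          % resample pairs with replacement
            cf = polyfit(log(Q(k)), log(C(k)), 1);     % refit the rating curve
            pred(b) = exp(polyval(cf, log(Q0)));
        end
        ci = prctile(pred, [2.5 97.5]);                % 95% uncertainty bounds
        fprintf('95%% interval at Q = %g: [%.2f, %.2f]\n', Q0, ci);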

  16. MATLAB tensor classes for fast algorithm prototyping.

    Energy Technology Data Exchange (ETDEWEB)

    Bader, Brett William; Kolda, Tamara Gibson (Sandia National Laboratories, Livermore, CA)

    2004-10-01

    Tensors (also known as multidimensional arrays or N-way arrays) are used in a variety of applications ranging from chemometrics to psychometrics. We describe four MATLAB classes for tensor manipulations that can be used for fast algorithm prototyping. The tensor class extends the functionality of MATLAB's multidimensional arrays by supporting additional operations such as tensor multiplication. The tensor as matrix class supports the 'matricization' of a tensor, i.e., the conversion of a tensor to a matrix (and vice versa), a commonly used operation in many algorithms. Two additional classes represent tensors stored in decomposed formats: cp tensor and tucker tensor. We describe all of these classes and then demonstrate their use by showing how to implement several tensor algorithms that have appeared in the literature.

  17. Abaqus2Matlab: A suitable tool for finite element post-processing

    DEFF Research Database (Denmark)

    Papazafeiropoulos, George; Muñiz-Calvente, Miguel; Martínez Pañeda, Emilio

    2017-01-01

    A suitable piece of software is presented to connect Abaqus, a sophisticated finite element package, with Matlab, the most comprehensive program for mathematical analysis. This interface between these well-known codes not only benefits from the image processing and the integrated graph-plotting features ... crack propagation in structural materials by means of a cohesive zone approach. The source code, detailed documentation and a large number of tutorials can be freely downloaded from www.abaqus2matlab.com.

  18. Mathematical Formulation used by MATLAB Code to Convert FTIR Interferograms to Calibrated Spectra

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, Derek Elswick [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-19

    This report discusses the mathematical procedures used to convert raw interferograms from Fourier transform infrared (FTIR) sensors to calibrated spectra. The work discussed in this report was completed as part of the Helios project at Los Alamos National Laboratory. MATLAB code was developed to convert the raw interferograms to calibrated spectra. The report summarizes the developed MATLAB scripts and functions, along with a description of the mathematical methods used by the code. The first step in working with raw interferograms is to convert them to uncalibrated spectra by applying an apodization function to the raw data and then by performing a Fourier transform. The developed MATLAB code also addresses phase error correction by applying the Mertz method. This report provides documentation for the MATLAB scripts.
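
    The first processing step described (apodize the interferogram, then Fourier transform it to an uncalibrated spectrum) can be sketched as below; the placeholder interferogram, the Hamming-type apodization window and the omission of the Mertz phase correction are all assumptions, not the Helios scripts themselves.

        % Hedged sketch: apodization and FFT of a (centred, double-sided) interferogram.
        igm = randn(1, 4096);                         % placeholder interferogram
        N = numel(igm);
        w = 0.54 - 0.46*cos(2*pi*(0:N-1)/(N-1));      % Hamming-type apodization window
        spec = abs(fft(igm .* w));                    % uncalibrated magnitude spectrum
        spec = spec(1:N/2);                           % keep the positive-frequency half
        plot(spec); xlabel('Spectral bin'); ylabel('Uncalibrated intensity');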

  19. Observational evidence of dust evolution in galactic extinction curves

    Energy Technology Data Exchange (ETDEWEB)

    Cecchi-Pestellini, Cesare [INAF-Osservatorio Astronomico di Palermo, P.zza Parlamento 1, I-90134 Palermo (Italy); Casu, Silvia; Mulas, Giacomo [INAF-Osservatorio Astronomico di Cagliari, Via della Scienza, I-09047 Selargius (Italy); Zonca, Alberto, E-mail: cecchi-pestellini@astropa.unipa.it, E-mail: silvia@oa-cagliari.inaf.it, E-mail: gmulas@oa-cagliari.inaf.it, E-mail: azonca@oa-cagliari.inaf.it [Dipartimento di Fisica, Università di Cagliari, Strada Prov.le Monserrato-Sestu Km 0.700, I-09042 Monserrato (Italy)

    2014-04-10

    Although structural and optical properties of hydrogenated amorphous carbons are known to respond to varying physical conditions, most conventional extinction models are basically curve fits with modest predictive power. We compare an evolutionary model of the physical properties of carbonaceous grain mantles with their determination by homogeneously fitting observationally derived Galactic extinction curves with the same physically well-defined dust model. We find that a large sample of observed Galactic extinction curves are compatible with the evolutionary scenario underlying such a model, requiring physical conditions fully consistent with standard density, temperature, radiation field intensity, and average age of diffuse interstellar clouds. Hence, through the study of interstellar extinction we may, in principle, understand the evolutionary history of the diffuse interstellar clouds.

  20. Modeling two strains of disease via aggregate-level infectivity curves.

    Science.gov (United States)

    Romanescu, Razvan; Deardon, Rob

    2016-04-01

    Well formulated models of disease spread, and efficient methods to fit them to observed data, are powerful tools for aiding the surveillance and control of infectious diseases. Our project considers the problem of the simultaneous spread of two related strains of disease in a context where spatial location is the key driver of disease spread. We start our modeling work with the individual level models (ILMs) of disease transmission, and extend these models to accommodate the competing spread of the pathogens in a two-tier hierarchical population (whose levels we refer to as 'farm' and 'animal'). The postulated interference mechanism between the two strains is a period of cross-immunity following infection. We also present a framework for speeding up the computationally intensive process of fitting the ILM to data, typically done using Markov chain Monte Carlo (MCMC) in a Bayesian framework, by turning the inference into a two-stage process. First, we approximate the number of animals infected on a farm over time by infectivity curves. These curves are fit to data sampled from farms, using maximum likelihood estimation, then, conditional on the fitted curves, Bayesian MCMC inference proceeds for the remaining parameters. Finally, we use posterior predictive distributions of salient epidemic summary statistics, in order to assess the model fitted.

  1. Matlab modeling of ITER CODAC

    International Nuclear Information System (INIS)

    Pangione, L.; Lister, J.B.

    2008-01-01

    The ITER CODAC (COntrol, Data Access and Communication) conceptual design resulted from 2 years of activity. One result was a proposed functional partitioning of CODAC into different CODAC Systems, each of them partitioned into other CODAC Systems. Considering the large size of this project, simple use of human language assisted by figures would certainly be ineffective in creating an unambiguous description of all interactions and all relations between these Systems. Moreover, the underlying design is resident in the mind of the designers, who must consider all possible situations that could happen to each system. There is therefore a need to model the whole of CODAC with a clear and preferably graphical method, which allows the designers to verify the correctness and the consistency of their project. The aim of this paper is to describe the work started on ITER CODAC modeling using Matlab/Simulink. The main feature of this tool is the possibility of having a simple, graphical, intuitive representation of a complex system and ultimately to run a numerical simulation of it. Using Matlab/Simulink, each CODAC System was represented in a graphical and intuitive form with its relations and interactions through the definition of a small number of simple rules. In a Simulink diagram, each system was represented as a 'black box', both containing, and connected to, a number of other systems. In this way it is possible to move vertically between systems on different levels, to show the relation of membership, or horizontally to analyse the information exchange between systems at the same level. This process can be iterated, starting from a global diagram, in which only CODAC appears with the Plant Systems and the external sites, and going deeper down to the mathematical model of each CODAC system. The Matlab/Simulink features for simulating the whole top diagram encourage us to develop the idea of completing the functionalities of all systems in order to finally have a full

  2. Curve fitting and modeling with splines using statistical variable selection techniques

    Science.gov (United States)

    Smith, P. L.

    1982-01-01

    The successful application of statistical variable selection techniques to fit splines is demonstrated. Major emphasis is given to knot selection, but order determination is also discussed. Two FORTRAN backward elimination programs, using the B-spline basis, were developed. The program for knot elimination is compared in detail with two other spline-fitting methods and several statistical software packages. An example is also given for the two-variable case using a tensor product basis, with a theoretical discussion of the difficulties of their use.
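    The programs described are FORTRAN, but the core operation, a least-squares fit on a B-spline basis, can be sketched in MATLAB (the working language of this collection) using the Curve Fitting Toolbox spline functions. The example below uses an arbitrary noisy test signal and a fixed number of pieces; it does not perform the backward-elimination knot selection that is the subject of the paper.

        % Least-squares fit with a B-spline basis (fixed knots; knot elimination not shown).
        x = linspace(0, 10, 200);
        y = sin(x) + 0.1*randn(size(x));     % noisy test data (assumed)

        k  = 4;                              % order 4 = cubic B-splines
        sp = spap2(8, k, x, y);              % least-squares spline with 8 polynomial pieces
        yf = fnval(sp, x);                   % evaluate the fitted spline

        plot(x, y, '.', x, yf, '-');  legend('data', 'B-spline fit');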

  3. Labbtex: Toolbox para generación de informes en LATEX para Matlab

    Directory of Open Access Journals (Sweden)

    José Luis Almazán Gárate

    2012-10-01

    Full Text Available En este artículo se presenta el software desarrollado por el Equipo H3lite dentro del Departamento de Ingeneniería Civil. Transportes de la Escuela de Ingenieros de Caminos, Canales y Puertos de la Universidad Politécnica de Madrid para la generación de informes enLATEX mediante el software Matlab® y la integración en sus rutinas, Labbtex.La librería Labbtex proporciona un marco flexible para mezclar texto y código Matlab® para la generación automática de documentos. Un rchivo fuente simple contiene el texto de documentación y el código Matlab, al correr la aplicación se genera un documento final LATEX que contiene el texto, gráficos y tablas indicados con el formato de un documento LATEX. El código Matlab genera un documento LATEX usando la sintaxis. Así, LATEX (para composición de texto de alta calidad y Matlab® (para cálculo matemático pueden usarse simultáneamente. Esto permite la generación de informes en tiempo real con un uso de recursos mínimo.

  4. Improvements in Spectrum's fit to program data tool.

    Science.gov (United States)

    Mahiane, Severin G; Marsh, Kimberly; Grantham, Kelsey; Crichlow, Shawna; Caceres, Karen; Stover, John

    2017-04-01

    The Joint United Nations Program on HIV/AIDS-supported Spectrum software package (Glastonbury, Connecticut, USA) is used by most countries worldwide to monitor the HIV epidemic. In Spectrum, HIV incidence trends among adults (aged 15-49 years) are derived by either fitting to seroprevalence surveillance and survey data or generating curves consistent with program and vital registration data, such as historical trends in the number of newly diagnosed infections or people living with HIV and AIDS related deaths. This article describes development and application of the fit to program data (FPD) tool in Joint United Nations Program on HIV/AIDS' 2016 estimates round. In the FPD tool, HIV incidence trends are described as a simple or double logistic function. Function parameters are estimated from historical program data on newly reported HIV cases, people living with HIV or AIDS-related deaths. Inputs can be adjusted for proportions undiagnosed or misclassified deaths. Maximum likelihood estimation or minimum chi-squared distance methods are used to identify the best fitting curve. Asymptotic properties of the estimators from these fits are used to estimate uncertainty. The FPD tool was used to fit incidence for 62 countries in 2016. Maximum likelihood and minimum chi-squared distance methods gave similar results. A double logistic curve adequately described observed trends in all but four countries where a simple logistic curve performed better. Robust HIV-related program and vital registration data are routinely available in many middle-income and high-income countries, whereas HIV seroprevalence surveillance and survey data may be scarce. In these countries, the FPD tool offers a simpler, improved approach to estimating HIV incidence trends.
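    A hedged sketch of the kind of double-logistic incidence trend described above (not the Spectrum/FPD implementation): ordinary least squares stands in here for the maximum-likelihood and minimum chi-squared fits used by the tool, and the yearly case counts are synthetic.

        % Double-logistic HIV incidence trend fitted to yearly case counts (synthetic data).
        dlog  = @(p,t) p(1)./(1+exp(-p(2)*(t-p(3)))) + p(4)./(1+exp(-p(5)*(t-p(6))));

        yr    = 1990:2015;
        cases = dlog([1200 0.6 1996 -700 0.4 2006], yr) + 30*randn(size(yr));   % stand-in for reported cases

        p0   = [max(cases) 0.5 1995 -0.5*max(cases) 0.5 2005];
        pHat = lsqcurvefit(dlog, p0, yr, cases);        % least squares in place of ML / min chi-squared
        plot(yr, cases, 'o', yr, dlog(pHat, yr), '-');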

  5. Automated processing of thermal infrared images of Osservatorio Vesuviano permanent surveillance network by using Matlab code

    Science.gov (United States)

    Sansivero, Fabio; Vilardo, Giuseppe; Caputo, Teresa

    2017-04-01

    polynomial fit Matlab function - LTFC_SCOREF), c) the export of data in different raster formats (i.e. Surfer grd). An interesting example of the elaborations produced by the ASIRA Tools is the map of the temperature change rate, which provides remarkable information about the potential migration of fumarole activity. The high efficiency of Matlab in processing matrix data from IR scenes and the flexibility of this code-development tool proved very useful for producing applications for volcanic surveillance aimed at monitoring the evolution of the surface temperature field in diffuse degassing volcanic areas.

  6. Three-dimensional rendering of segmented object using matlab - biomed 2010.

    Science.gov (United States)

    Anderson, Jeffrey R; Barrett, Steven F

    2010-01-01

    The three-dimensional rendering of microscopic objects is a difficult and challenging task that often requires specialized image processing techniques. Previous work described a semi-automatic segmentation process of fluorescently stained neurons collected as a sequence of slice images with a confocal laser scanning microscope. Once properly segmented, each individual object can be rendered and studied as a three-dimensional virtual object. This paper describes the work associated with the design and development of Matlab files to create three-dimensional images from the segmented object data previously mentioned. Part of the motivation for this work is to integrate both the segmentation and rendering processes into one software application, providing a seamless transition from the segmentation tasks to the rendering and visualization tasks. Previously these tasks were accomplished on two different computer systems, Windows and Linux, which limited the usefulness of the segmentation and rendering applications to those who had both computer systems readily available. The focus of this work is to create custom Matlab image processing algorithms for object rendering and visualization, and to merge these capabilities with the Matlab files that were developed especially for the image segmentation task. The completed Matlab application will contain both the segmentation and rendering processes in a single graphical user interface, or GUI. This process for rendering three-dimensional images in Matlab requires that a sequence of two-dimensional binary images, each representing a cross-sectional slice of the object, be reassembled in a 3D space and covered with a surface. Additional segmented objects can be rendered in the same 3D space. The surface properties of each object can be varied by the user to aid in the study and analysis of the objects. This interactive process becomes a powerful visual tool to study and understand microscopic objects.
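    A minimal base-MATLAB sketch of the rendering step described above: a stack of 2-D binary slices is treated as a 3-D volume and covered with a surface via isosurface and patch. The volume here is synthetic; the application itself works on segmented confocal slices.

        % Reassemble binary slices into a volume and render a surface (illustrative).
        [X, Y, Z] = meshgrid(linspace(-1, 1, 64));
        vol = double(X.^2 + Y.^2 + Z.^2 < 0.5);     % stand-in for a stack of binary slice images

        vol = smooth3(vol, 'box', 3);               % light smoothing before surfacing
        fv  = isosurface(vol, 0.5);                 % triangulated surface at the 0.5 level
        p   = patch(fv, 'FaceColor', 'green', 'EdgeColor', 'none');
        isonormals(vol, p);                         % improve lighting on the surface
        daspect([1 1 1]); view(3); camlight; lighting gouraud;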

  7. Fitted curve parameters for the efficiency of a coaxial HPGe Detector

    International Nuclear Information System (INIS)

    Supian Samat

    1996-01-01

    Using Ngraph software, the parameters of various functions were determined by least-squares analysis of fits to experimental efficiencies ε_f of a coaxial HPGe detector for gamma rays in the energy range 59 keV to 1836 keV. When these parameters had been determined, their reliability was tested by the calculated goodness-of-fit parameter χ²_cal. It is shown that the function ln ε_f = Σ_{j=0}^{n} a_j (ln E/E_0)^j, where n = 3, gives satisfactory results.
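    The quoted functional form is a cubic polynomial in ln(E/E_0), which can be fitted directly in MATLAB with polyfit; the energies and efficiencies below are placeholders, not the measured HPGe data.

        % Fit ln(efficiency) as a cubic polynomial in ln(E/E0) (illustrative data).
        E   = [59 122 344 662 1173 1332 1836];                % gamma-ray energies, keV (assumed)
        eff = [0.052 0.060 0.031 0.019 0.012 0.011 0.009];    % full-energy-peak efficiencies (assumed)
        E0  = 1;                                              % reference energy, keV (assumed)

        a      = polyfit(log(E/E0), log(eff), 3);             % coefficients a_j, with n = 3
        effFit = exp(polyval(a, log(E/E0)));
        chi2   = sum((eff - effFit).^2 ./ effFit);            % crude goodness-of-fit check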

  8. KEGGParser: parsing and editing KEGG pathway maps in Matlab.

    Science.gov (United States)

    Arakelyan, Arsen; Nersisyan, Lilit

    2013-02-15

    KEGG pathway database is a collection of manually drawn pathway maps accompanied with KGML format files intended for use in automatic analysis. KGML files, however, do not contain the required information for complete reproduction of all the events indicated in the static image of a pathway map. Several parsers and editors of KEGG pathways exist for processing KGML files. We introduce KEGGParser-a MATLAB based tool for KEGG pathway parsing, semiautomatic fixing, editing, visualization and analysis in MATLAB environment. It also works with Scilab. The source code is available at http://www.mathworks.com/matlabcentral/fileexchange/37561.

  9. Kinematic simulation and analysis of robot based on MATLAB

    Science.gov (United States)

    Liao, Shuhua; Li, Jiong

    2018-03-01

    The history of industrial automation is characterized by rapidly changing technology, and the industrial robot is, without a doubt, a key piece of special-purpose equipment within it. With the help of MATLAB's matrix and plotting capabilities, a coordinate system is attached to each link using the Denavit-Hartenberg (D-H) parameter method and the kinematic equations of the structure are established in the MATLAB environment. The Robotics Toolbox and GUIDE are then applied jointly to inverse kinematics analysis, path planning and simulation, providing a preliminary solution to the positioning problem of a student-built vehicle-mounted mechanical arm.
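    To make the D-H step concrete, here is a small base-MATLAB sketch (deliberately independent of the Robotics Toolbox) that builds the standard D-H link transform and composes it into the forward kinematics of a hypothetical two-link planar arm; the link lengths and joint angles are arbitrary.

        % Forward kinematics from standard D-H parameters (illustrative two-link arm).
        dh = @(theta, d, a, alpha) ...
            [cos(theta) -sin(theta)*cos(alpha)  sin(theta)*sin(alpha) a*cos(theta);
             sin(theta)  cos(theta)*cos(alpha) -cos(theta)*sin(alpha) a*sin(theta);
             0           sin(alpha)             cos(alpha)            d;
             0           0                      0                     1];

        a1 = 0.5;  a2 = 0.3;                 % link lengths, m (assumed)
        q  = [pi/6, pi/4];                   % joint angles (assumed)

        T = dh(q(1), 0, a1, 0) * dh(q(2), 0, a2, 0);    % base-to-end-effector transform
        endEffectorXY = T(1:2, 4)            % planar end-effector position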

  10. A Total Factor Productivity Toolbox for MATLAB

    NARCIS (Netherlands)

    B.M. Balk (Bert); J. Barbero (Javier); J.L. Zofío (José)

    2018-01-01

    textabstractTotal Factor Productivity Toolbox is a new package for MATLAB that includes functions to calculate the main Total Factor Productivity (TFP) indices and their decompositions, based on Shephard’s distance functions and using Data Envelopment Analysis (DEA) programming techniques. The

  11. A new model describing the curves for repair of both DNA double-strand breaks and chromosome damage

    International Nuclear Information System (INIS)

    Foray, N.; Badie, C.; Alsbeih, G.; Malaise, E.P.; Fertil, B.

    1996-01-01

    A review of reports dealing with fits of the data for repair of DNA double-strand breaks (DSBs) and excess chromosome fragments (ECFs) shows that several models are used to fit the repair curves. Since DSBs and ECFs are correlated, it is worth developing a model describing both phenomena. The curve-fitting models used most extensively, the two repair half-times model for DSBs and the monoexponential plus residual model for ECFs, appear to be too inflexible to describe the repair curves for both DSBs and ECFs. We have therefore developed a new concept based on a variable repair half-time. According to this concept, the repair curve is continuously bending and dependent on time and probably reflects a continuous spectrum of damage repairability. The fits of the curves for DSB repair to the variable repair half-time and the variable repair half-time plus residual models were compared to those obtained with the two half-times plus residual and two half-times models. Similarly, the fits of the curves for ECF repair to the variable repair half-time and variable half-time plus residual models were compared to that obtained with the monoexponential plus residual model. The quality of fit and the dependence of adjustable parameters on the portion of the curve fitted were used as comparison criteria. We found that: (a) It is useful to postulate the existence of a residual term for unrepairable lesions, regardless of the model adopted. (b) With the two cell lines tested (a normal and a hypersensitive one), data for both DSBs and ECFs are best fitted to the variable repair half-time plus residual model, whatever the repair time range. 47 refs., 3 figs., 3 tabs
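    For orientation only, the sketch below fits the classical two-half-times-plus-residual form mentioned above to a synthetic DSB repair curve with lsqcurvefit (Optimization Toolbox); it does not implement the authors' variable-half-time model, and every number in it is invented.

        % Two repair half-times plus a residual term, fitted to synthetic DSB repair data.
        t    = [0 0.25 0.5 1 2 4 8 24];                        % repair time, h (assumed)
        frac = [1.00 0.78 0.62 0.45 0.30 0.20 0.14 0.10];      % fraction of DSBs remaining (assumed)

        % p = [fast fraction, fast half-time, slow half-time, residual fraction]
        model = @(p,t) p(1)*exp(-log(2)*t/p(2)) + (1-p(1)-p(4))*exp(-log(2)*t/p(3)) + p(4);

        p0   = [0.6 0.3 4 0.1];
        pHat = lsqcurvefit(model, p0, t, frac, [0 0 0 0], [1 Inf Inf 1]);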

  12. On the analysis of Canadian Holstein dairy cow lactation curves using standard growth functions.

    Science.gov (United States)

    López, S; France, J; Odongo, N E; McBride, R A; Kebreab, E; AlZahal, O; McBride, B W; Dijkstra, J

    2015-04-01

    Six classical growth functions (monomolecular, Schumacher, Gompertz, logistic, Richards, and Morgan) were fitted to individual and average (by parity) cumulative milk production curves of Canadian Holstein dairy cows. The data analyzed consisted of approximately 91,000 daily milk yield records corresponding to 122 first, 99 second, and 92 third parity individual lactation curves. The functions were fitted using nonlinear regression procedures, and their performance was assessed using goodness-of-fit statistics (coefficient of determination, residual mean squares, Akaike information criterion, and the correlation and concordance coefficients between observed and adjusted milk yields at several days in milk). Overall, all the growth functions evaluated showed an acceptable fit to the cumulative milk production curves, with the Richards equation ranking first (smallest Akaike information criterion) followed by the Morgan equation. Differences among the functions in their goodness-of-fit were enlarged when fitted to average curves by parity, where the sigmoidal functions with a variable point of inflection (Richards and Morgan) outperformed the other 4 equations. All the functions provided satisfactory predictions of milk yield (calculated from the first derivative of the functions) at different lactation stages, from early to late lactation. The Richards and Morgan equations provided the most accurate estimates of peak yield and total milk production per 305-d lactation, whereas the least accurate estimates were obtained with the logistic equation. In conclusion, classical growth functions (especially sigmoidal functions with a variable point of inflection) proved to be feasible alternatives to fit cumulative milk production curves of dairy cows, resulting in suitable statistical performance and accurate estimates of lactation traits. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
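    As an illustration of the fitting approach (not the study's own scripts), a Gompertz cumulative-yield curve can be fitted to cumulative milk production by nonlinear regression, with daily yield and peak yield recovered from the first derivative; all data below are synthetic.

        % Gompertz fit to cumulative milk yield; daily yield from the first derivative.
        dim  = (5:10:305)';                                                % days in milk (assumed)
        cumY = 9000 ./ exp(exp(1.2 - 0.02*dim)) + 50*randn(size(dim));     % synthetic cumulative yield, kg

        gomp = @(b,t) b(1) .* exp(-exp(b(2) - b(3).*t));                   % b = [asymptote, b, c]
        mdl  = fitnlm(dim, cumY, gomp, [9000 1 0.02]);                     % nonlinear least squares
        b    = mdl.Coefficients.Estimate;

        daily = b(1).*b(3).*exp(b(2) - b(3).*dim).*exp(-exp(b(2) - b(3).*dim));   % dY/dt
        [peakYield, iPeak] = max(daily);                                   % estimated peak daily yield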

  13. Effective approach to spectroscopy and spectral analysis techniques using Matlab

    Science.gov (United States)

    Li, Xiang; Lv, Yong

    2017-08-01

    With the development of electronic information, computers and networks, modern education technology has entered a new era, which has had a great impact on the teaching process. Spectroscopy and spectral analysis is an elective course for Optoelectronic Information Science and Engineering. The teaching objective of this course is to master the basic concepts and principles of spectroscopy and the basic technical means of spectral analysis and testing, and then to let the students use the principles and technology of spectroscopy to study the structure and state of materials and the development of the technology. MATLAB (matrix laboratory) is a multi-paradigm numerical computing environment and fourth-generation programming language developed by MathWorks; it allows matrix manipulations and the plotting of functions and data. Based on teaching practice, this paper summarizes the application of MATLAB to the teaching of spectroscopy, which is suitable for most current multimedia-assisted teaching in schools.

  14. OPTICON: Pro-Matlab software for large order controlled structure design

    Science.gov (United States)

    Peterson, Lee D.

    1989-01-01

    A software package for large order controlled structure design is described and demonstrated. The primary program, called OPTICON, uses both Pro-Matlab M-file routines and selected compiled FORTRAN routines linked into the Pro-Matlab structure. The program accepts structural model information in the form of state-space matrices and performs three basic design functions on the model: (1) open loop analyses; (2) closed loop reduced order controller synthesis; and (3) closed loop stability and performance assessment. The current controller synthesis methods which were implemented in this software are based on the Generalized Linear Quadratic Gaussian theory of Bernstein. In particular, a reduced order Optimal Projection synthesis algorithm based on a homotopy solution method was successfully applied to an experimental truss structure using a 58-state dynamic model. These results are presented and discussed. Current plans to expand the practical size of the design model to several hundred states and the intention to interface Pro-Matlab to a supercomputing environment are discussed.

  15. Experimental study on composite solid propellant material burning rate using algorithm MATLAB

    Directory of Open Access Journals (Sweden)

    Thunaipragasam Selvakumaran

    2016-01-01

    Full Text Available In rocketry applications, composite propellants are gradually replacing monopropellants. The burning rate of a solid composite propellant depends on many factors such as the oxidizer-binder ratio, oxidizer particle size and distribution, pressure and temperature. Several researchers have studied mass-varied composite propellants in which the ammonium perchlorate content is varied mainly between 85 and 90%. This paper deals with an oxidizer-rich propellant obtained by allowing small variations of the fuel-cum-binder content of 2%, 4%, 6%, and 8% by mass. Since the percentage of binder is very small compared with the oxidizer, the mixture remains in powder form, and the powder samples are used to make pressed pellets. Experiments were conducted in a closed-window bomb set-up at pressures of 2, 3.5, and 7 MN/m2. The burning rates are calculated from the combustion photography (images taken by a high-speed camera). These images were processed frame by frame in MATLAB, detecting the edges in each frame. The burning rate is obtained as the slope of a linear fit in MATLAB, and it is observed that the burn rate increases with the mass variation of the constituents present in the solid composite propellant. The results indicate a remarkable increase in burn rate of 26.66%, 20%, 16.66%, and 3.33% for Mix 1, 2, 3, and 4 compared with Mix 5 at 7 MN/m2. The percentage variations in burn rate between Mix 1 and Mix 5 at 2, 3.5, and 7 MN/m2 are 25.833%, 32.322%, and 26.185%, respectively.
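    A rough sketch of the image-processing chain implied above, using the Image Processing Toolbox: detect the burning-surface edge in each frame, track its position over time, and take the burn rate as the slope of a linear fit. The frame folder, frame rate and spatial calibration are placeholders.

        % Burn rate from high-speed combustion frames (illustrative pipeline).
        files   = dir('frames/*.png');            % assumed frame location
        fps     = 5000;                           % frames per second (assumed)
        mmPerPx = 0.05;                           % spatial calibration, mm per pixel (assumed)

        pos = zeros(numel(files), 1);
        for k = 1:numel(files)
            I = imread(fullfile(files(k).folder, files(k).name));
            if size(I, 3) == 3, I = rgb2gray(I); end
            BW = edge(I, 'canny');                % burning-surface edge
            [rows, ~] = find(BW);
            pos(k) = max(rows) * mmPerPx;         % regressing surface position, mm
        end

        t = (0:numel(files)-1)' / fps;
        p = polyfit(t, pos, 1);                   % linear fit of position vs. time
        burnRate = p(1);                          % slope = burn rate, mm/s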

  16. Fitting and benchmarking of Monte Carlo output parameters for iridium-192 high dose rate brachytherapy source

    International Nuclear Information System (INIS)

    Acquah, F.G.

    2011-01-01

    Brachytherapy, the use of radioactive sources for the treatment of tumours is an important tool in radiation oncology. Accurate calculations of dose delivered to malignant and normal tissues are the main responsibility of the Medical Physics staff. With the use of Treatment Planning System (TPS) computers now becoming a standard practice in the Radiation Oncology Departments, Independent calculations to certify the results of these commercial TPSs are important part of a good quality management system for brachytherapy implants. There are inherent errors in the dose distributions produced by these TPSs due to its failure to account for heterogeneity in the calculation algorithms and Monte Carlo (MC) method seems to be the panacea for these corrections. In this study, a fit functional form using MC output parameters was performed to reduce dose calculation uncertainty using the Matlab software curve fitting applications. This includes the modification of the AAPM TG-43 parameters to accommodate the new developments for a rapid brachytherapy dose rate calculation. Analytical computations were performed to hybridize the anisotropy function, F(r,θ) and radial dose function, g(r) into a single new function f(r,θ) for the Nucletron microSelectron High Dose Rate 'new or v2' (mHDRv2) 192 Ir brachytherapy source. In order to minimize computation time and to improve the accuracy of manual calculations, the dosimetry function f(r,θ) used fewer parameters and formulas for the fit. Using MC outputs as the standard, the percentage errors for the fits were calculated and used to evaluate the average and maximum uncertainties. Dose rate deviation between the MC data and fit were also quantified as errors(E), which showed minimal values. These results showed that the dosimetry parameters from this study as compared to those of MC outputs parameters were in good agreement and better than the results obtained from literature. The work confirms a lot of promise in building robust

  17. Integration of MATLAB Simulink(Registered Trademark) Models with the Vertical Motion Simulator

    Science.gov (United States)

    Lewis, Emily K.; Vuong, Nghia D.

    2012-01-01

    This paper describes the integration of MATLAB Simulink(Registered TradeMark) models into the Vertical Motion Simulator (VMS) at NASA Ames Research Center. The VMS is a high-fidelity, large motion flight simulator that is capable of simulating a variety of aerospace vehicles. Integrating MATLAB Simulink models into the VMS needed to retain the development flexibility of the MATLAB environment and allow rapid deployment of model changes. The process developed at the VMS was used successfully in a number of recent simulation experiments. This accomplishment demonstrated that the model integrity was preserved, while working within the hard real-time run environment of the VMS architecture, and maintaining the unique flexibility of the VMS to meet diverse research requirements.

  18. MATLAB as a tool as Analysis and Problem Solving Competency Development in Chemical Engineering Degree using MATLAB

    Directory of Open Access Journals (Sweden)

    Maria-Fernanda López-Pérez

    2016-10-01

    The principal purpose of this work is to improve student learning by using MATLAB within a problem-based learning methodology. This methodology allows more effective coordination across the degree programme. The present paper presents a real-world problem, the common elements of most problem-solving contexts, and how the approach is designed to function across all disciplines.

  19. Timescale stretch parameterization of Type Ia supernova B-band light curves

    International Nuclear Information System (INIS)

    Goldhaber, G.; Groom, D.E.; Kim, A.; Aldering, G.; Astier, P.; Conley, A.; Deustua, S.E.; Ellis, R.; Fabbro, S.; Fruchter, A.S.; Goobar, A.; Hook, I.; Irwin, M.; Kim, M.; Knop, R.A.; Lidman, C.; McMahon, R.; Nugent, P.E.; Pain, R.; Panagia, N.; Pennypacker, C.R.; Perlmutter, S.; Ruiz-Lapuente, P.; Schaefer, B.; Walton, N.A.; York, T.

    2001-01-01

    R-band intensity measurements along the light curves of Type Ia supernovae discovered by the Supernova Cosmology Project (SCP) are fitted in brightness to templates, allowing a free parameter, the time-axis width factor w ≡ s(1+z). The data points are then individually aligned in the time axis, normalized and K-corrected back to the rest frame, after which the nearly 1300 normalized intensity measurements are found to lie on a well-determined common rest-frame B-band curve which we call the ''composite curve.'' The same procedure is applied to 18 low-redshift Calan/Tololo SNe with z < 0.11; these nearly 300 B-band photometry points are found to lie on the composite curve equally well. The SCP search technique produces several measurements before maximum light for each supernova. We demonstrate that the linear stretch factor s, which parameterizes the light-curve timescale, appears independent of z and applies equally well to the declining and rising parts of the light curve. In fact, the B-band template that best fits this composite curve fits the individual supernova photometry data when stretched by a factor s with χ²/DoF ∼ 1, thus as well as any parameterization can, given the current data sets. The measurement of the date of explosion, however, is model dependent and not tightly constrained by the current data. We also demonstrate the 1+z light-curve time-axis broadening expected from cosmological expansion. This argues strongly against alternative explanations, such as tired light, for the redshift of distant objects.
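    A toy MATLAB sketch of the stretch idea: given a rest-frame B-band template (here a made-up smooth shape, not a real SN Ia template), the stretch s and a normalization are found by least squares against one supernova's photometry, which is itself generated from the template.

        % Toy stretch fit of a light curve to a template (illustrative only).
        tmplT = -15:60;                                           % rest-frame epochs, days
        tmplF = 0.5*(1+erf(tmplT/5)) .* exp(-max(tmplT,0)/20);    % stand-in template shape

        model = @(p,t) p(2) * interp1(tmplT, tmplF, t/p(1), 'linear', 0);   % p = [stretch s, normalization]
        tObs  = [-8 -4 0 5 12 20 35 50];                          % observed epochs (assumed)
        fObs  = model([1.1 0.9], tObs) + 0.02*randn(size(tObs));  % synthetic photometry
        pHat  = lsqcurvefit(model, [1 1], tObs, fObs);
        stretch = pHat(1)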

  20. Integration of Modeling in Solidworks and Matlab/Simulink Environments

    Directory of Open Access Journals (Sweden)

    Cekus Dawid

    2014-03-01

    Full Text Available This paper describes the procedure for building simulation models using SolidWorks and Matlab/Simulink. The simulation model is created in stages: first a geometric model is developed in SolidWorks, and then, thanks to the data exchange capability, the CAD model is implemented in the Matlab/Simulink computational environment. SimMechanics models make it possible to track many parameters, e.g. trajectories, velocities or accelerations of arbitrary elements of a complex system. As examples of simulation models developed according to the presented method, models of a laboratory truck-mounted crane and a forestry crane are shown. These models enable visualization of a work cycle prescribed by means of kinematic excitations.

  1. Image enhancement using MCNP5 code and MATLAB in neutron radiography

    International Nuclear Information System (INIS)

    Tharwat, Montaser; Mohamed, Nader; Mongy, T.

    2014-01-01

    This work presents a method that can be used to enhance the neutron radiography (NR) image for objects with highly scattering materials like hydrogen, carbon and other light materials. This method uses the Monte Carlo code MCNP5 to simulate the NR process, obtain the flux distribution for each pixel of the image and determine the scattered neutron distribution that causes image blur; it then uses MATLAB to subtract this scattered neutron distribution from the initial image to improve its quality. This work was performed before the commissioning of the digital NR system in Jan. 2013. The MATLAB enhancement method is a good technique in the case of static film-based neutron radiography, while in the neutron imaging (NI) technique, image enhancement and quantitative measurement were performed efficiently using ImageJ software. The enhanced image quality and quantitative measurements are presented in this work. - Highlights: • This work is applicable for static film-based neutron radiography and digital neutron imaging. • MATLAB is a useful tool for image enhancement in radiographic film. • Advanced image processing is available in the ETRR-2 for image processing and data extraction. • The digital imaging system is suitable for complex shapes and sizes, while the MATLAB technique is suitable for simple shapes and sizes. • Quantitative measurements are available
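    A minimal sketch of the enhancement step described above: a scattered-neutron map (standing in for the MCNP5-derived distribution) is subtracted pixel-wise from the blurred radiograph and the result rescaled. The file name is a placeholder and the Image Processing Toolbox is assumed.

        % Subtract a simulated scattered-neutron background from an NR image (illustrative).
        img        = double(imread('radiograph.tif'));   % raw NR image (assumed file name)
        scatterMap = imgaussfilt(img, 50);               % stand-in for the MCNP5 scatter distribution

        enhanced = img - scatterMap;                     % remove the scatter-induced blur component
        enhanced = mat2gray(enhanced);                   % rescale to [0, 1] for display
        imshow(enhanced);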

  2. DynaSim: A MATLAB Toolbox for Neural Modeling and Simulation.

    Science.gov (United States)

    Sherfey, Jason S; Soplata, Austin E; Ardid, Salva; Roberts, Erik A; Stanley, David A; Pittman-Polletta, Benjamin R; Kopell, Nancy J

    2018-01-01

    DynaSim is an open-source MATLAB/GNU Octave toolbox for rapid prototyping of neural models and batch simulation management. It is designed to speed up and simplify the process of generating, sharing, and exploring network models of neurons with one or more compartments. Models can be specified by equations directly (similar to XPP or the Brian simulator) or by lists of predefined or custom model components. The higher-level specification supports arbitrarily complex population models and networks of interconnected populations. DynaSim also includes a large set of features that simplify exploring model dynamics over parameter spaces, running simulations in parallel using both multicore processors and high-performance computer clusters, and analyzing and plotting large numbers of simulated data sets in parallel. It also includes a graphical user interface (DynaSim GUI) that supports full functionality without requiring user programming. The software has been implemented in MATLAB to enable advanced neural modeling using MATLAB, given its popularity and a growing interest in modeling neural systems. The design of DynaSim incorporates a novel schema for model specification to facilitate future interoperability with other specifications (e.g., NeuroML, SBML), simulators (e.g., NEURON, Brian, NEST), and web-based applications (e.g., Geppetto) outside MATLAB. DynaSim is freely available at http://dynasimtoolbox.org. This tool promises to reduce barriers for investigating dynamics in large neural models, facilitate collaborative modeling, and complement other tools being developed in the neuroinformatics community.

  3. FC LSEI WNNLS, Least-Square Fitting Algorithms Using B Splines

    International Nuclear Information System (INIS)

    Hanson, R.J.; Haskell, K.H.

    1989-01-01

    1 - Description of problem or function: FC allows a user to fit discrete data, in a weighted least-squares sense, using piece-wise polynomial functions represented by B-Splines on a given set of knots. In addition to the least-squares fitting of the data, equality, inequality, and periodic constraints at a discrete, user-specified set of points can be imposed on the fitted curve or its derivatives. The subprograms LSEI and WNNLS solve the linearly-constrained least-squares problem. LSEI solves the class of problem with general inequality constraints, and, if requested, obtains a covariance matrix of the solution parameters. WNNLS solves the class of problem with non-negativity constraints. It is anticipated that most users will find LSEI suitable for their needs; however, users with inequalities that are single bounds on variables may wish to use WNNLS. 2 - Method of solution: The discrete data are fit by a linear combination of piece-wise polynomial curves which leads to a linear least-squares system of algebraic equations. Additional information is expressed as a discrete set of linear inequality and equality constraints on the fitted curve which leads to a linearly-constrained least-squares system of algebraic equations. The solution of this system is the main computational problem solved
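    FC, LSEI and WNNLS are FORTRAN routines, but the same constrained least-squares spline problem can be sketched in MATLAB with the Curve Fitting Toolbox collocation matrix (spcol) and the Optimization Toolbox solver lsqlin; the knots, data and the single equality constraint below are illustrative assumptions.

        % Least-squares B-spline fit with an equality constraint (illustrative).
        x = linspace(0, 1, 50)';   y = sin(2*pi*x) + 0.05*randn(50, 1);
        k = 4;                                          % cubic B-splines
        knots = augknt(linspace(0, 1, 8), k);           % knot sequence with repeated end knots

        C   = spcol(knots, k, x);                       % B-spline collocation (design) matrix
        Aeq = spcol(knots, k, 0.5);   beq = 0;          % force the fit through the point (0.5, 0)

        coef = lsqlin(C, y, [], [], Aeq, beq);          % constrained linear least squares
        sp   = spmak(knots, coef');                     % assemble the fitted spline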

  4. Development and Validation of Reentry Simulation Using MATLAB

    National Research Council Canada - National Science Library

    Jameson, Jr, Robert E

    2006-01-01

    This research effort develops a program using MATLAB to solve the equations of motion for atmospheric reentry and analyzes the validity of the program for use as a tool to expeditiously predict reentry profiles...

  5. The New Keynesian Phillips Curve

    DEFF Research Database (Denmark)

    Ólafsson, Tjörvi

    This paper provides a survey on the recent literature on the new Keynesian Phillips curve: the controversies surrounding its microfoundation and estimation, the approaches that have been tried to improve its empirical fit and the challenges it faces adapting to the open-economy framework. The new......, learning or state-dependant pricing. The introduction of openeconomy factors into the new Keynesian Phillips curve complicate matters further as it must capture the nexus between price setting, inflation and the exchange rate. This is nevertheless a crucial feature for any model to be used for inflation...... forecasting in a small open economy like Iceland....

  6. A Compilation of MATLAB Scripts and Functions for MACGMC Analyses

    Science.gov (United States)

    Murthy, Pappu L. N.; Bednarcyk, Brett A.; Mital, Subodh K.

    2017-01-01

    The primary aim of the current effort is to provide scripts that automate many of the repetitive pre- and post-processing tasks associated with composite materials analyses using the Micromechanics Analysis Code with the Generalized Method of Cells (MACGMC). This document consists of a compilation of hundreds of scripts that were developed in the MATLAB (The MathWorks, Inc., Natick, MA) programming language and consolidated into 16 MATLAB functions. MACGMC is a composite material and laminate analysis software code developed at NASA Glenn Research Center. The software package has been built around the generalized method of cells (GMC) family of micromechanics theories. The computer code is developed with a user-friendly framework, along with a library of local inelastic, damage, and failure models. Further, application of simulated thermo-mechanical loading, generation of output results, and selection of architectures to represent the composite material have been automated to increase the user friendliness, as well as to make it more robust in terms of input preparation and code execution. Finally, classical lamination theory has been implemented within the software, wherein GMC is used to model the composite material response of each ply. Thus, the full range of GMC composite material capabilities is available for analysis of arbitrary laminate configurations as well. The pre-processing tasks include generation of a multitude of different repeating unit cells (RUCs) for CMCs and PMCs, visualization of RUCs from MACGMC input and output files and generation of the RUC section of a MACGMC input file. The post-processing tasks include visualization of the predicted composite response, such as local stress and strain contours, damage initiation and progression, stress-strain behavior, and fatigue response. In addition to the above, several miscellaneous scripts have been developed that can be used to perform repeated Monte-Carlo simulations to enable probabilistic

  7. WellReader: a MATLAB program for the analysis of fluorescence and luminescence reporter gene data.

    Science.gov (United States)

    Boyer, Frédéric; Besson, Bruno; Baptist, Guillaume; Izard, Jérôme; Pinel, Corinne; Ropers, Delphine; Geiselmann, Johannes; de Jong, Hidde

    2010-05-01

    Fluorescent and luminescent reporter gene systems in combination with automated microplate readers allow real-time monitoring of gene expression on the population level at high precision and sampling density. This generates large amounts of data for the analysis of which computer tools are missing to date. We have developed WellReader, a MATLAB program for the analysis of fluorescent and luminescent reporter gene data. WellReader allows the user to load the output files of microplate readers, remove outliers, correct for background effects and smooth and fit the data. Moreover, it computes biologically relevant quantities from the measured signals, notably promoter activities and protein concentrations, and compares the resulting expression profiles of different genes under different conditions. WellReader is available under a LGPL licence at http://prabi1.inrialpes.fr/trac/wellreader.

  8. IB2d: a Python and MATLAB implementation of the immersed boundary method.

    Science.gov (United States)

    Battista, Nicholas A; Strickland, W Christopher; Miller, Laura A

    2017-03-29

    The development of fluid-structure interaction (FSI) software involves trade-offs between ease of use, generality, performance, and cost. Typically there are large learning curves when using low-level software to model the interaction of an elastic structure immersed in a uniform density fluid. Many existing codes are not publicly available, and the commercial software that exists usually requires expensive licenses and may not be as robust or allow the necessary flexibility that in house codes can provide. We present an open source immersed boundary software package, IB2d, with full implementations in both MATLAB and Python, that is capable of running a vast range of biomechanics models and is accessible to scientists who have experience in high-level programming environments. IB2d contains multiple options for constructing material properties of the fiber structure, as well as the advection-diffusion of a chemical gradient, muscle mechanics models, and artificial forcing to drive boundaries with a preferred motion.

  9. Fitting fatigue test data with a novel S-N curve using frequentist and Bayesian inference

    NARCIS (Netherlands)

    Leonetti, D.; Maljaars, J.; Snijder, H.H.B.

    2017-01-01

    In design against fatigue, a lower bound stress range vs. endurance curve (S-N curve) is employed to characterize fatigue resistance of plain material and structural details. With respect to the inherent variability of the fatigue life, the S-N curve is related to a certain probability of

  10. Parallel Sequential Monte Carlo for Efficient Density Combination: The Deco Matlab Toolbox

    DEFF Research Database (Denmark)

    Casarin, Roberto; Grassi, Stefano; Ravazzolo, Francesco

    This paper presents the Matlab package DeCo (Density Combination) which is based on the paper by Billio et al. (2013) where a constructive Bayesian approach is presented for combining predictive densities originating from different models or other sources of information. The combination weights...... for standard CPU computing and for Graphical Process Unit (GPU) parallel computing. For the GPU implementation we use the Matlab parallel computing toolbox and show how to use General Purposes GPU computing almost effortless. This GPU implementation comes with a speed up of the execution time up to seventy...... times compared to a standard CPU Matlab implementation on a multicore CPU. We show the use of the package and the computational gain of the GPU version, through some simulation experiments and empirical applications....

  11. Changing patient population in Dhaka Hospital and Matlab Hospital of icddr,b.

    Science.gov (United States)

    Das, S K; Rahman, A; Chisti, M J; Ahmed, S; Malek, M A; Salam, M A; Bardhan, P K; Faruque, A S G

    2014-02-01

    The Diarrhoeal Disease Surveillance System of icddr,b noted increasing number of patients ≥60 years at urban Dhaka and rural Matlab from 2001 to 2012. Shigella and Vibrio cholerae were more frequently isolated from elderly people than children under 5 years and adults aged 5-59 in both areas. The resistance observed to various drugs of Shigella in Dhaka and Matlab was trimethoprim-sulphamethoxazole (72-63%), ampicillin (43-55%), nalidixic acid (58-61%), mecillinam (12-9%), azithromycin (13-0%), ciprofloxacin (11-13%) and ceftriaxone (11-0%). Vibrio cholerae isolated in Dhaka and Matlab was resistant to trimethoprim-sulphamethoxazole (98-94%), furazolidone (100%), erythromycin (71-53%), tetracycline (46-44%), ciprofloxacin (3-10%) and azithromycin (3-0%). © 2013 John Wiley & Sons Ltd.

  12. Study and program implementation of transient curves' piecewise linearization

    International Nuclear Information System (INIS)

    Shi Yang; Zu Hongbiao

    2014-01-01

    Background: Transient curves are essential for the stress analysis of related equipment in a nuclear power plant (NPP). The actual operating data or the design transient data of an NPP usually consist of a large number of data points with very short time intervals. To simplify the analysis, transient curves are generally piecewise linearized in advance; up to now this has been done manually. Purpose: The aim is to develop a method for the piecewise linearization of transient curves and to implement it by programming. Methods: First, the fitting line of a number of data points is obtained by the least-squares method. A segment of the fitting line is fixed when the accumulated linearization error exceeds a preset limit as the number of points increases. The linearization of subsequent data points then begins from the last point of the preceding curve segment to obtain the next segment in the same way, and this continues until the final data point is reached. Finally, junction points are averaged to connect the segments. Results: A computer program named PLTC (Piecewise Linearization for Transient Curves) was implemented and verified by linearizing the standard sine curve and typical transient curves of an NPP. Conclusion: The method and the PLTC program are well suited to the piecewise linearization of transient curves, improving both efficiency and precision. (authors)
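    A compact MATLAB sketch of the greedy segmentation idea described above (not the PLTC source): each segment is extended point by point and fitted with polyfit, and a new segment starts once the accumulated error exceeds a tolerance. The error measure and tolerance are assumptions, and the junction-averaging step is omitted.

        % Greedy piecewise linearization of a sampled curve (illustrative).
        t = linspace(0, 10, 1000);   y = sin(t);      % stand-in transient curve
        tol = 0.02;                                   % allowed error per segment (assumed)

        breaks = 1;   i0 = 1;
        for i = 3:numel(t)
            p   = polyfit(t(i0:i), y(i0:i), 1);       % least-squares line for the current segment
            err = max(abs(polyval(p, t(i0:i)) - y(i0:i)));
            if err > tol                              % error limit exceeded: close the segment
                breaks(end+1) = i - 1;   i0 = i - 1;  %#ok<SAGROW>
            end
        end
        breaks(end+1) = numel(t);
        plot(t, y, t(breaks), y(breaks), 'o-');       % piecewise-linear approximation through the break points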

  13. NNCTRL - a CANCSD toolkit for MATLAB(R)

    DEFF Research Database (Denmark)

    Nørgård, Peter Magnus; Ravn, Ole; Poulsen, Niels Kjølstad

    1996-01-01

    A set of tools for computer-aided neuro-control system design (CANCSD) has been developed for the MATLAB environment. The tools can be used for construction and simulation of a variety of neural network based control systems. The design methods featured in the toolkit are: direct inverse control...

  14. Introducing Effects in an Image: A MATLAB Approach

    OpenAIRE

    Kumar , Vinay; Sood , Saurabh; Mishra , Shruti

    2008-01-01

    A detailed study of introducing morning, night, and other effects in an image is discussed; the original image was taken in the morning. Several examples are also discussed. MATLAB is used for the processing.

  15. Physical fitness and performance. Cardiorespiratory fitness in girls-change from middle to high school.

    Science.gov (United States)

    Pfeiffer, Karin A; Dowda, Marsha; Dishman, Rod K; Sirard, John R; Pate, Russell R

    2007-12-01

    To determine how factors are related to change in cardiorespiratory fitness (CRF) across time in middle school girls followed through high school. Adolescent girls (N = 274, 59% African American, baseline age = 13.6 +/- 0.6 yr) performed a submaximal fitness test (PWC170) in 8th, 9th, and 12th grades. Height, weight, sports participation, and physical activity were also measured. Moderate-to-vigorous physical activity (MVPA) and vigorous physical activity (VPA) were determined by the number of blocks reported on the 3-Day Physical Activity Recall (3DPAR). Individual differences and developmental change in CRF were assessed simultaneously by calculating individual growth curves for each participant, using growth curve modeling. Both weight-relative and absolute CRF increased from 8th to 9th grade and decreased from 9th to 12th grade. On average, girls lost 0.16 kg.m.min.kg.yr in weight-relative PWC170 scores (P interactions between CRF, physical activity, race, BMI, and sports participation.

  16. FPGA curved track fitter with very low resource usage

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Jin-Yuan; Wang, M.; Gottschalk, E.; Shi, Z.; /Fermilab

    2006-11-01

    Standard least-squares curved track fitting process is tailored for FPGA implementation. The coefficients in the fitting matrices are carefully chosen so that only shift and accumulation operations are used in the process. The divisions and full multiplications are eliminated. Comparison in an application example shows that the fitting errors of the low resource usage implementation are less than 4% bigger than the fitting errors of the exact least-squares algorithm. The implementation is suitable for low-cost, low-power applications such as high energy physics detector trigger systems.

  17. Computational colour science using MATLAB

    CERN Document Server

    Westland, Stephen; Cheung, Vien

    2012-01-01

    Computational Colour Science Using MATLAB 2nd Edition offers a practical, problem-based approach to colour physics. The book focuses on the key issues encountered in modern colour engineering, including efficient representation of colour information, Fourier analysis of reflectance spectra and advanced colorimetric computation. Emphasis is placed on the practical applications rather than the techniques themselves, with material structured around key topics. These topics include colour calibration of visual displays, computer recipe prediction and models for colour-appearance prediction. Each t

  18. Microscopic Model of Automobile Lane-changing Virtual Desire Trajectory by Spline Curves

    Directory of Open Access Journals (Sweden)

    Yulong Pei

    2010-05-01

    Full Text Available With the development of microscopic traffic simulation models, they have increasingly become an important tool for transport system analysis and management, which assist the traffic engineer to investigate and evaluate the performance of transport network systems. Lane-changing model is a vital component in any traffic simulation model, which could improve road capacity and reduce vehicles delay so as to reduce the likelihood of congestion occurrence. Therefore, this paper addresses the virtual desire trajectory, a vital part to investigate the behaviour divided into four phases. Based on the boundary conditions, β-spline curves and the corresponding reverse algorithm are introduced firstly. Thus, the relation between the velocity and length of lane-changing is constructed, restricted by the curvature, steering velocity and driving behaviour. Then the virtual desire trajectory curves are presented by Matlab and the error analysis results prove that this proposed description model has higher precision in automobile lane-changing process reconstruction, compared with the surveyed result. KEY WORDS: traffic simulation, lane-changing model, virtual desire trajectory, β-spline curves, driving behaviour

  19. MATLAB/SIMULINK model of CANDU reactor for control studies

    International Nuclear Information System (INIS)

    Javidnia, H.; Jiang, J.

    2006-01-01

    In this paper a MATLAB/SIMULINK model is developed for a CANDU type reactor. The data for the reactor are taken from an Indian PHWR, which is very similar to CANDU in its design. Among the different feedback mechanisms in the core of the reactor, only xenon has been considered which plays an important role in spatial oscillations. The model is verified under closed loop scenarios with simple PI controller. The results of the simulation show that this model can be used for controller design and simulation of the reactor systems. Adding models of the other components of a CANDU reactor would ultimately result in a complete model of CANDU plant in MATLAB/SIMULINK. (author)

  20. Radioligand assays - methods and applications. IV. Uniform regression of hyperbolic and linear radioimmunoassay calibration curves

    Energy Technology Data Exchange (ETDEWEB)

    Keilacker, H; Becker, G; Ziegler, M; Gottschling, H D [Zentralinstitut fuer Diabetes, Karlsburg (German Democratic Republic)

    1980-10-01

    In order to handle all types of radioimmunoassay (RIA) calibration curves obtained in the authors' laboratory in the same way, they tried to find a non-linear expression for their regression which allows calibration curves with different degrees of curvature to be fitted. Considering the two boundary cases of the incubation protocol they derived a hyperbolic inverse regression function: x = a_1 y + a_0 + a_{-1} y^{-1}, where x is the total concentration of antigen, the a_i are constants, and y is the specifically bound radioactivity. An RIA evaluation procedure based on this function is described, providing a fitted inverse RIA calibration curve and some statistical quality parameters. The latter are of an order which is normal for RIA systems. There is an excellent agreement between fitted and experimentally obtained calibration curves having a different degree of curvature.
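    Because the regression function is linear in its coefficients, it can be fitted in MATLAB with an ordinary least-squares backslash solve; the calibration points below are invented, not the authors' data.

        % Hyperbolic inverse regression x = a1*y + a0 + a_{-1}/y (illustrative data).
        y = [850 620 430 260 150 90]';        % specifically bound radioactivity (assumed)
        x = [0.5 1 2 5 10 20]';               % total antigen concentration (assumed)

        A = [y, ones(size(y)), 1./y];         % design matrix for [a1, a0, a_{-1}]
        a = A \ x;                            % linear least squares

        xFit = A * a;                         % fitted antigen concentrations at the standards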

  1. Intelligent traffic lights based on MATLAB

    Science.gov (United States)

    Nie, Ying

    2018-04-01

    In this paper, I describe the traffic light system and some of its shortcomings. Through analysis, MATLAB is used to transform camera photographs into digital signals, and road traffic is classified into three levels: heavy congestion, congestion, and light congestion. Through MCU programming, different roads are assigned different delay times. Using this method saves time and resources and thereby reduces road congestion.

  2. PFA toolbox: a MATLAB tool for Metabolic Flux Analysis.

    Science.gov (United States)

    Morales, Yeimy; Bosque, Gabriel; Vehí, Josep; Picó, Jesús; Llaneras, Francisco

    2016-07-11

    Metabolic Flux Analysis (MFA) is a methodology that has been successfully applied to estimate metabolic fluxes in living cells. However, traditional frameworks based on this approach have some limitations, particularly when measurements are scarce and imprecise. This is very common in industrial environments. The PFA Toolbox can be used to face those scenarios. Here we present the PFA (Possibilistic Flux Analysis) Toolbox for MATLAB, which simplifies the use of Interval and Possibilistic Metabolic Flux Analysis. The main features of the PFA Toolbox are the following: (a) It provides reliable MFA estimations in scenarios where only a few fluxes can be measured or those available are imprecise. (b) It provides tools to easily plot the results as interval estimates or flux distributions. (c) It is composed of simple functions that MATLAB users can apply in flexible ways. (d) It includes a Graphical User Interface (GUI), which provides a visual representation of the measurements and their uncertainty. (e) It can use stoichiometric models in COBRA format. In addition, the PFA Toolbox includes a User's Guide with a thorough description of its functions and several examples. The PFA Toolbox for MATLAB is a freely available Toolbox that is able to perform Interval and Possibilistic MFA estimations.

  3. Matlab enhanced multi-threaded tomography optimization sequence (MEMTOS)

    International Nuclear Information System (INIS)

    Lum, Edward S.; Pope, Chad L.

    2016-01-01

    Highlights: • Monte Carlo simulation of spent nuclear fuel assembly neutron computed tomography. • Optimized parallel calculations conducted from within the MATLAB environment. • Projection difference technique used to identify anomalies in spent nuclear fuel assemblies. - Abstract: One challenge associated with spent nuclear fuel assemblies is the lack of non-destructive analysis techniques to determine if fuel pins have been removed or replaced or if there are significant defects associated with fuel pins deep within a fuel assembly. Neutron computed tomography is a promising technique for addressing these qualitative issues. Monte Carlo simulation of spent nuclear fuel neutron computed tomography allows inexpensive process investigation and optimization. The main purpose of this work is to provide a fully automated advanced simulation framework for the analysis of spent nuclear fuel inspection using neutron computed tomography. The simulation framework, called Matlab Enhanced Multi-Threaded Tomography Optimization Sequence (MEMTOS), not only automates the simulation process, but also generates superior tomography image results. MEMTOS is written in the MATLAB scripting language and addresses file management, parallel Monte Carlo execution, results extraction, and tomography image generation. This paper describes the mathematical basis for neutron computed tomography, the Monte Carlo technique used to simulate neutron computed tomography, and the overall tomography simulation optimization algorithm. Sequence results presented include overall simulation speed enhancement, and tomography and image results obtained for Experimental Breeder Reactor II spent fuel assemblies and light water reactor fuel assemblies. Optimization using a projection difference technique is also described.

  4. A Collection of Nonlinear Aircraft Simulations in MATLAB

    Science.gov (United States)

    Garza, Frederico R.; Morelli, Eugene A.

    2003-01-01

    Nonlinear six degree-of-freedom simulations for a variety of aircraft were created using MATLAB. Data for aircraft geometry, aerodynamic characteristics, mass / inertia properties, and engine characteristics were obtained from open literature publications documenting wind tunnel experiments and flight tests. Each nonlinear simulation was implemented within a common framework in MATLAB, and includes an interface with another commercially-available program to read pilot inputs and produce a three-dimensional (3-D) display of the simulated airplane motion. Aircraft simulations include the General Dynamics F-16 Fighting Falcon, Convair F-106B Delta Dart, Grumman F-14 Tomcat, McDonnell Douglas F-4 Phantom, NASA Langley Free-Flying Aircraft for Sub-scale Experimental Research (FASER), NASA HL-20 Lifting Body, NASA / DARPA X-31 Enhanced Fighter Maneuverability Demonstrator, and the Vought A-7 Corsair II. All nonlinear simulations and 3-D displays run in real time in response to pilot inputs, using contemporary desktop personal computer hardware. The simulations can also be run in batch mode. Each nonlinear simulation includes the full nonlinear dynamics of the bare airframe, with a scaled direct connection from pilot inputs to control surface deflections to provide adequate pilot control. Since all the nonlinear simulations are implemented entirely in MATLAB, user-defined control laws can be added in a straightforward fashion, and the simulations are portable across various computing platforms. Routines for trim, linearization, and numerical integration are included. The general nonlinear simulation framework and the specifics for each particular aircraft are documented.

  5. The use of MATLAB-SIMULINK for evaluation of thermal building behavior; O uso do MATLAB-SIMULINK para avaliacao do comportamento termico de ambientes

    Energy Technology Data Exchange (ETDEWEB)

    Mendes, Nathan; Oliveira, Gustavo H.C.; Araujo, Humberto X. de [Pontificia Universidade Catolica do Parana, Curitiba, PR (Brazil). Lab. de Sistemas Termicos]|[Pontificia Universidade Catolica do Parana, Curitiba, PR (Brazil). Lab. de Automacao e Sistemas]. E-mail: nmendes@ccet.pucpr.br; oliv@ccet.pucpr.br; araujo@ccet.pucpr.br

    2000-07-01

    We describe a mathematical model applied to both building thermal analysis and control systems design. We use a lumped approach to model the room air temperature and a multi-layer model for the building envelope. The capacitance model makes it possible to study the transient behavior of the room air temperature when it is subjected to a sinusoidal variation of the external air temperature, representing a case study for the city of Curitiba-PR, Brazil. To evaluate the building performance with thermal parameters, we use MATLAB/SIMULINK. In the results section, we show the influence of thermal capacitance on the building air temperature and energy consumption, as well as the advantages of using MATLAB/SIMULINK in building thermal and energy analysis. (author)
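    A minimal sketch of the lumped-capacitance idea (not the authors' Simulink model): the room air temperature responds to a sinusoidal external temperature through a single conductance and thermal capacitance, integrated with ode45; every parameter value is invented.

        % Lumped-capacitance room air temperature under a sinusoidal outdoor temperature.
        UA   = 150;                                     % envelope conductance, W/K (assumed)
        C    = 5e6;                                     % room thermal capacitance, J/K (assumed)
        Text = @(t) 25 + 5*sin(2*pi*t/86400);           % outdoor temperature, degC

        dTdt   = @(t, T) UA*(Text(t) - T) / C;          % energy balance on the room air
        [t, T] = ode45(dTdt, [0 3*86400], 22);          % three days, room initially at 22 degC
        plot(t/3600, T); xlabel('time (h)'); ylabel('room air temperature (degC)');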

  6. Physical fitness reference standards in fibromyalgia: The al-Ándalus project.

    Science.gov (United States)

    Álvarez-Gallardo, I C; Carbonell-Baeza, A; Segura-Jiménez, V; Soriano-Maldonado, A; Intemann, T; Aparicio, V A; Estévez-López, F; Camiletti-Moirón, D; Herrador-Colmenero, M; Ruiz, J R; Delgado-Fernández, M; Ortega, F B

    2017-11-01

    We aimed (1) to report age-specific physical fitness levels in people with fibromyalgia of a representative sample from Andalusia; and (2) to compare the fitness levels of people with fibromyalgia with non-fibromyalgia controls. This cross-sectional study included 468 (21 men) patients with fibromyalgia and 360 (55 men) controls. The fibromyalgia sample was geographically representative from southern Spain. Physical fitness was assessed with the Senior Fitness Test battery plus the handgrip test. We applied the Generalized Additive Model for Location, Scale and Shape to calculate percentile curves for women and fitted mean curves using a linear regression for men. Our results show that people with fibromyalgia reached worse performance in all fitness tests than controls (P fitness levels among patients with fibromyalgia and controls in a large sample of patients with fibromyalgia from southern of Spain. Physical fitness levels of people with fibromyalgia from Andalusia are very low in comparison with age-matched healthy controls. This information could be useful to correctly interpret physical fitness assessments and helping health care providers to identify individuals at risk for losing physical independence. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  7. THE CARNEGIE SUPERNOVA PROJECT: LIGHT-CURVE FITTING WITH SNooPy

    International Nuclear Information System (INIS)

    Burns, Christopher R.; Persson, S. E.; Madore, Barry F.; Freedman, Wendy L.; Stritzinger, Maximilian; Phillips, M. M.; Boldt, Luis; Campillay, Abdo; Folatelli, Gaston; Gonzalez, Sergio; Krzeminski, Wojtek; Morrell, Nidia; Salgado, Francisco; Kattner, ShiAnne; Contreras, Carlos; Suntzeff, Nicholas B.

    2011-01-01

    In providing an independent measure of the expansion history of the universe, the Carnegie Supernova Project (CSP) has observed 71 high-z Type Ia supernovae (SNe Ia) in the near-infrared bands Y and J. These can be used to construct rest-frame i-band light curves which, when compared to a low-z sample, yield distance moduli that are less sensitive to extinction and/or decline-rate corrections than in the optical. However, working with NIR observed and i-band rest-frame photometry presents unique challenges and has necessitated the development of a new set of observational tools in order to reduce and analyze both the low-z and high-z CSP sample. We present in this paper the methods used to generate uBVgriYJH light-curve templates based on a sample of 24 high-quality low-z CSP SNe. We also present two methods for determining the distances to the hosts of SN Ia events. A larger sample of 30 low-z SNe Ia in the Hubble flow is used to calibrate these methods. We then apply the method and derive distances to seven galaxies that are so nearby that their motions are not dominated by the Hubble flow.

  8. Design and fabrication of diffractive optical elements with MATLAB

    National Research Council Canada - National Science Library

    Bhattacharya, Shanti (Professor in Optics); Vijayakumar, Anand

    2017-01-01

    ... their diffraction patterns using MATLAB. The fundamentals of fabrication techniques such as photolithography, electron beam lithography, and focused ion beam lithography with basic instructions for the beginner are presented...

  9. Dynamic Regulation of a Cell Adhesion Protein Complex Including CADM1 by Combinatorial Analysis of FRAP with Exponential Curve-Fitting

    Science.gov (United States)

    Sakurai-Yageta, Mika; Maruyama, Tomoko; Suzuki, Takashi; Ichikawa, Kazuhisa; Murakami, Yoshinori

    2015-01-01

    Protein components of cell adhesion machinery show continuous renewal even in the static state of epithelial cells and participate in the formation and maintenance of normal epithelial architecture and tumor suppression. CADM1 is a tumor suppressor belonging to the immunoglobulin superfamily of cell adhesion molecules and forms a cell adhesion complex with an actin-binding protein, 4.1B, and a scaffold protein, MPP3, in the cytoplasm. Here, we investigate dynamic regulation of the CADM1-4.1B-MPP3 complex in mature cell adhesion by fluorescence recovery after photobleaching (FRAP) analysis. Traditional FRAP analyses are performed over relatively short periods of around 10 min. Here, thanks to recent advances in sensitive laser detector systems, we examine FRAP of the CADM1 complex over a longer period of 60 min and analyze the recovery with exponential curve-fitting to distinguish fractions with different diffusion constants. This approach reveals that the fluorescence recovery of CADM1 is fitted by a single exponential function with a time constant (τ) of approximately 16 min, whereas 4.1B and MPP3 are fitted by a double exponential function with two τs of approximately 40-60 sec and 16 min. The longer τ is similar to that of CADM1, suggesting that 4.1B and MPP3 have two distinct fractions, one forming a complex with CADM1 and the other present as a free pool. Fluorescence loss in photobleaching analysis supports the presence of a free pool of these proteins near the plasma membrane. Furthermore, double exponential fitting makes it possible to estimate the ratios of 4.1B and MPP3 present as a free pool and as a complex with CADM1 as approximately 3:2 and 3:1, respectively. Our analyses reveal a central role of CADM1 in stabilizing the complex with 4.1B and MPP3 and provide insight into the dynamics of adhesion complex formation. PMID:25780926
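
    As an illustration only (not the authors' code), the double-exponential recovery fit described above can be sketched with lsqcurvefit from the Optimization Toolbox; the time and fluorescence vectors below are synthetic.

        % Minimal sketch: F(t) = A1*(1-exp(-t/tau1)) + A2*(1-exp(-t/tau2)).
        t = (0:0.5:60)';                                                    % time [min]
        F = 0.3*(1-exp(-t/1)) + 0.2*(1-exp(-t/16)) + 0.01*randn(size(t));   % synthetic recovery
        model = @(p,t) p(1)*(1-exp(-t/p(2))) + p(3)*(1-exp(-t/p(4)));       % p = [A1 tau1 A2 tau2]
        p = lsqcurvefit(model, [0.5 1 0.5 10], t, F);
        fprintf('tau1 = %.1f min, tau2 = %.1f min, A1:A2 = %.1f:%.1f\n', p(2), p(4), p(1), p(3));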

  10. DEM4-26, Least Square Fit for IBM PC by Deming Method

    International Nuclear Information System (INIS)

    Rinard, P.M.; Bosler, G.E.

    1989-01-01

    1 - Description of program or function: DEM4-26 is a generalized least square fitting program based on Deming's method. Functions built into the program for fitting include linear, quadratic, cubic, power, Howard's, exponential, and Gaussian; others can easily be added. The program has the following capabilities: (1) entry, editing, and saving of data; (2) fitting of any of the built-in functions or of a user-supplied function; (3) plotting the data and fitted function on the display screen, with error limits if requested, and with the option of copying the plot to the printer; (4) interpolation of x or y values from the fitted curve with error estimates based on error limits selected by the user; and (5) plotting the residuals between the y data values and the fitted curve, with the option of copying the plot to the printer. 2 - Method of solution: Deming's method
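
    For orientation only (DEM4-26 itself is far more general), the simple linear case of Deming's method can be written in a few lines of MATLAB; the data and the error-variance ratio delta below are hypothetical, and delta = 1 reduces to orthogonal regression.

        % Minimal sketch: Deming fit of y = b0 + b1*x with errors in both variables.
        x = [1 2 3 4 5 6]';  y = [1.1 1.9 3.2 3.9 5.1 6.2]';
        delta = 1;                                   % var(y-error)/var(x-error), assumed known
        n = numel(x);  xb = mean(x);  yb = mean(y);
        sxx = sum((x-xb).^2)/(n-1);  syy = sum((y-yb).^2)/(n-1);  sxy = sum((x-xb).*(y-yb))/(n-1);
        b1 = (syy - delta*sxx + sqrt((syy - delta*sxx)^2 + 4*delta*sxy^2)) / (2*sxy);
        b0 = yb - b1*xb;
        fprintf('Deming fit: y = %.3f + %.3f*x\n', b0, b1);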

  11. Design of PR current control with selective harmonic compensators using Matlab

    Directory of Open Access Journals (Sweden)

    Daniel Zammit

    2017-12-01

    Full Text Available This paper presents a procedure to design a Proportional Resonant (PR current controller with additional PR selective harmonic compensators for Grid Connected Photovoltaic (PV Inverters. The design of the PR current control and the harmonic compensators will be carried out using Matlab. Testing was carried out on a 3 kW Grid-Connected PV Inverter which was designed and constructed for this research. Both simulation and experimental results will be presented. Keywords: Inverters, Proportional-resonant controllers, Harmonic compensation, Photovoltaic, Matlab, SISO design tool

  12. A MATLAB Package for Markov Chain Monte Carlo with a Multi-Unidimensional IRT Model

    Directory of Open Access Journals (Sweden)

    Yanyan Sheng

    2008-11-01

    Full Text Available Unidimensional item response theory (IRT models are useful when each item is designed to measure some facet of a unified latent trait. In practical applications, items are not necessarily measuring the same underlying trait, and hence the more general multi-unidimensional model should be considered. This paper provides the requisite information and description of software that implements the Gibbs sampler for such models with two item parameters and a normal ogive form. The software developed is written in the MATLAB package IRTmu2no. The package is flexible enough to allow a user the choice to simulate binary response data with multiple dimensions, set the number of total or burn-in iterations, specify starting values or prior distributions for model parameters, check convergence of the Markov chain, as well as obtain Bayesian fit statistics. Illustrative examples are provided to demonstrate and validate the use of the software package.

  13. SITE INDEX CURVES AND HYPSOMETRIC RELATIONSHIP FOR Eucalyptus grandis PLANTATIONS FOR THE CAMPOS GERAIS REGION, PARANA STATE

    Directory of Open Access Journals (Sweden)

    Fabiane Aparecida de Souza Retslaff

    2015-06-01

    Full Text Available The study aimed to fit mathematical models for the construction of Site Index curves and to estimate heights at different ages for Eucalyptus grandis in the Campos Gerais region, Parana State. The data used to fit the models came from permanent and temporary plots and a pre-harvest inventory, covering ages from 2.5 to 26.5 years. Several models were tested to represent the sites and the hypsometric relationship. The Site Index curves were constructed by the guide-curve method. For the Site Index, the Chapman-Richards model showed the best fit and precision statistics, generating 5 Site Index curves (with a range of 5 m). The four hypsometric models tested showed satisfactory performance and similar statistics, and the inclusion of the variables dominant height or site index did not substantially improve the goodness-of-fit statistics, but the residuals were more homogeneous and closer to zero.
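
    As an illustration only (with made-up height-age pairs), the Chapman-Richards model H = a*(1 - exp(-b*t))^c mentioned above can be fitted with base-MATLAB fminsearch; the guide-curve construction itself is not shown.

        % Minimal sketch: least-squares fit of the Chapman-Richards growth model.
        age = [3 5 8 12 16 20 26]';                 % years (hypothetical)
        H   = [8 14 21 27 31 33 35]';               % dominant height [m] (hypothetical)
        cr  = @(p,t) p(1)*(1 - exp(-p(2)*t)).^p(3); % p = [a b c]
        sse = @(p) sum((H - cr(p, age)).^2);
        p = fminsearch(sse, [40 0.1 1.5]);
        fprintf('a = %.1f m, b = %.3f, c = %.2f\n', p(1), p(2), p(3));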

  14. MATLAB/SIMULINK COM Component Integration for the XBDK CASE Environment, Oriented to Beamforming Applications

    Directory of Open Access Journals (Sweden)

    Mariano Raboso Mateos

    2009-04-01

    Full Text Available This paper describes the Matlab access interface of the XBDK (XML-Based Beamforming Development Kit) platform. Its main contribution is the use of the Tcl/Tk scripting language to access the Matlab environment through the COM interfaces exposed by the Matlab Automation Server, a powerful and little-known mechanism. Scripting languages offer numerous advantages for designing, building, and debugging prototypes and for automating processes, and most current signal processing tools support them in one way or another. Combining a scripting language with detailed access to Matlab services provides a flexible, fast, and powerful way to integrate those services into an integrated CASE tool such as XBDK.

  15. THREE-PHASE TRANSFORMER PARAMETERS CALCULATION CONSIDERING THE CORE SATURATION FOR THE MATLAB-SIMULINK TRANSFORMER MODEL

    Directory of Open Access Journals (Sweden)

    I. V. Novash

    2015-01-01

    Full Text Available This article describes the parameter calculation for the three-phase two-winding power transformer model from the SimPowerSystems library, which is part of the MatLab-Simulink environment. The presented methodology is based on the power transformer nameplate data. Particular attention is paid to calculating the parameters of the power transformer magnetization curve. A methodology for calculating the parameters of a three-phase two-winding power transformer model that accounts for the nonlinearity of the magnetization curve is not presented in Russian- or English-language sources. The power transformer demo models described in the SimPowerSystems user's guide have pre-calculated parameters, but without reference to the sources used to determine them. A power transformer is a nonlinear element of the power system, so analysis of its performance in different operating modes requires the magnetization curve parameters. The process during no-load energizing of the power transformer is of special interest. This regime is accompanied by an inrush current on the supply side of the power transformer that is several times larger than the transformer rated current. The sharp rise of the magnetizing current is explained by magnetic core saturation; therefore, accounting for the magnetization characteristic is mandatory when modeling transformer no-load energizing. The authors attempt to put all the calculation formulas in a more convenient form and to validate the calculation of the power transformer's nonlinear magnetization characteristic parameters. Inrush current oscillograms obtained during the simulation experiment confirmed the adequacy of the calculated model parameters.

  16. Slow Orbit Feedback at the ALS Using Matlab

    International Nuclear Information System (INIS)

    Portmann, G.

    1999-01-01

    The third generation Advanced Light Source (ALS) produces extremely bright and finely focused photon beams using undulators, wigglers, and bend magnets. In order to position the photon beams accurately, a slow global orbit feedback system has been developed. The dominant causes of orbit motion at the ALS are temperature variation and insertion device motion. This type of motion can be removed using slow global orbit feedback with a data rate of a few Hertz. The remaining orbit motion in the ALS is only 1-3 micron rms. Slow orbit feedback does not require high computational throughput. At the ALS, the global orbit feedback algorithm, based on the singular value decomposition method, is coded in MATLAB and runs on a control room workstation. Using the MATLAB environment to develop, test, and run the storage ring control algorithms has proven to be a fast and efficient way to operate the ALS.

  17. Multilevel Models for the Analysis of Angle-Specific Torque Curves with Application to Master Athletes

    Directory of Open Access Journals (Sweden)

    Carvalho Humberto M.

    2015-12-01

    Full Text Available The aim of this paper was to outline a multilevel modeling approach to fit individual angle-specific torque curves describing concentric knee extension and flexion isokinetic muscular actions in Master athletes. The potential of the analytical approach to examine between-individual differences across the angle-specific torque curves was illustrated, including between-individual variation due to gender differences at a higher level. Torques in concentric muscular actions of knee extension and knee flexion at 60°·s-1 were considered within a range of motion between 5° and 85° (only torques that are "truly" isokinetic). Multilevel time-series models with autoregressive covariance structures were superior fits compared with standard multilevel models for repeated measures when fitting angle-specific torque curves. Third- and fourth-order polynomial models were the best fits to describe angle-specific torque curves of isokinetic knee flexion and extension concentric actions, respectively. The fixed exponents allow interpretations for initial acceleration, the angle at peak torque, and the decrement of torque after peak torque. The multilevel models were also flexible enough to illustrate the influence of gender differences on torque throughout the range of motion and on the shape of the curves. The presented multilevel regression models may afford a general framework to examine angle-specific moment curves obtained by isokinetic dynamometry, and add to the understanding of the mechanisms of strength development, particularly the force-length relationship, both related to performance and injury prevention.

  18. Peak fitting and identification software library for high resolution gamma-ray spectra

    International Nuclear Information System (INIS)

    Uher, Josef; Roach, Greg; Tickner, James

    2010-01-01

    A new gamma-ray spectral analysis software package is under development in our laboratory. It can be operated as a stand-alone program or called as a software library from Java, C, C++ and MATLAB TM environments. It provides an advanced graphical user interface for data acquisition, spectral analysis and radioisotope identification. The code uses a peak-fitting function that includes peak asymmetry, Compton continuum and flexible background terms. Peak fitting function parameters can be calibrated as functions of energy. Each parameter can be constrained to improve fitting of overlapping peaks. All of these features can be adjusted by the user. To assist with peak identification, the code can automatically measure half-lives of single or multiple overlapping peaks from a time series of spectra. It implements library-based peak identification, with options for restricting the search based on radioisotope half-lives and reaction types. The software also improves the reliability of isotope identification by utilizing Monte-Carlo simulation results.

  19. Directional quantile regression in Octave (and MATLAB)

    Czech Academy of Sciences Publication Activity Database

    Boček, Pavel; Šiman, Miroslav

    2016-01-01

    Roč. 52, č. 1 (2016), s. 28-51 ISSN 0023-5954 R&D Projects: GA ČR GA14-07234S Institutional support: RVO:67985556 Keywords : quantile regression * multivariate quantile * depth contour * Matlab Subject RIV: IN - Informatics, Computer Science Impact factor: 0.379, year: 2016 http://library.utia.cas.cz/separaty/2016/SI/bocek-0458380.pdf

  20. Fitness analysis method for magnesium in drinking water with atomic absorption using quadratic curve calibration

    Directory of Open Access Journals (Sweden)

    Esteban Pérez-López

    2014-11-01

    Full Text Available Quantitative chemical analysis is important in research, quality control, the sale of services, and other areas, yet some instrumental analysis methods are limited when quantification relies on a linear calibration curve, sometimes because of the short linear dynamic range of the analyte and sometimes because of the technique itself. This motivates a closer look at the suitability of quadratic curves for analytical quantification, with the aim of showing that they constitute a valid calculation model for instrumental chemical analysis. As a test case, magnesium was determined by atomic absorption spectroscopy in a drinking water sample from the Tacares sector of northern Grecia, using a nonlinear calibration curve with a specifically quadratic behaviour, and the results were compared with those obtained for the same analysis with a linear calibration curve. The results show that the methodology is valid for this determination, since the concentrations obtained are very similar and, according to the hypothesis tests used, can be considered equal.
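
    As an illustration only (the standards and absorbance values below are invented), a quadratic calibration with inverse prediction of the unknown concentration takes only a few MATLAB lines.

        % Minimal sketch: quadratic calibration A = c2*C^2 + c1*C + c0 and inverse prediction.
        conc = [0 0.1 0.2 0.4 0.6 0.8];                  % Mg standards [mg/L]
        absb = [0.002 0.051 0.098 0.186 0.262 0.329];    % measured absorbances
        p = polyfit(conc, absb, 2);                      % calibration coefficients [c2 c1 c0]
        Aunk = 0.150;                                    % absorbance of the unknown sample
        r = roots([p(1) p(2) p(3)-Aunk]);                % solve c2*C^2 + c1*C + (c0 - Aunk) = 0
        Cunk = r(imag(r) == 0 & r > 0 & r < max(conc));  % keep the root inside the calibration range
        fprintf('Estimated concentration: %.3f mg/L\n', Cunk);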

  1. Hyperspectral imaging in medicine: image pre-processing problems and solutions in Matlab.

    Science.gov (United States)

    Koprowski, Robert

    2015-11-01

    The paper presents problems and solutions related to hyperspectral image pre-processing. New methods of preliminary image analysis are proposed. The paper shows problems occurring in Matlab when trying to analyse this type of image. Moreover, new methods are discussed which provide source code in Matlab that can be used in practice without any licensing restrictions. A proposed application and sample results of hyperspectral image analysis are also presented. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Difficulties in fitting the thermal response of atomic force microscope cantilevers for stiffness calibration

    International Nuclear Information System (INIS)

    Cole, D G

    2008-01-01

    This paper discusses the difficulties of calibrating atomic force microscope (AFM) cantilevers, in particular the effect that calibrating under light fluid-loading (in air) or under heavy fluid-loading (in water) has on the ability to use the thermal motion response to fit model parameters that are used to determine cantilever stiffness. For the light fluid-loading case, the resonant frequency and quality factor can easily be used to determine stiffness. The extension of this approach to the heavy fluid-loading case is troublesome due to the low quality factor (high damping) caused by fluid-loading. Simple calibration formulae are difficult to realize, and the best approach is often to curve-fit the thermal response, using the parameters of natural frequency and mass ratio, so that the curve-fit's response is within some acceptable tolerance of the actual thermal response. The parameters can then be used to calculate the cantilever stiffness. However, the process of curve-fitting can lead to erroneous results unless suitable care is taken. A feedback model of the fluid–structure interaction between the unloaded cantilever and the hydrodynamic drag provides a framework for fitting a modeled thermal response to a measured response and for evaluating the parametric uncertainty of the fit. The cases of uncertainty in the natural frequency, the mass ratio, and combined uncertainty are presented, and the implications for system identification and stiffness calibration using curve-fitting techniques are discussed. Finally, considerations and recommendations for the calibration of AFM cantilevers are given in light of the results of this paper.
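
    For orientation only (this is neither the paper's procedure nor its data), fitting a damped simple-harmonic-oscillator model with a white-noise floor to a low-Q thermal spectrum might look as follows; the spectrum is synthetic and lsqcurvefit requires the Optimization Toolbox.

        % Minimal sketch: SHO power-spectral-density fit, p = [A f0 Q floor].
        sho = @(p,f) p(1)*p(2)^4 ./ ((p(2)^2 - f.^2).^2 + (p(2)*f/p(3)).^2) + p(4);
        f = linspace(1, 20, 400)';                                  % frequency [kHz]
        S = sho([1e-3 8 2 1e-4], f) + 2e-5*randn(size(f));          % synthetic low-Q spectrum
        p = lsqcurvefit(sho, [1e-3 7 1 1e-4], f, S, [0 0 0.1 0], []);
        fprintf('f0 = %.2f kHz, Q = %.2f\n', p(2), p(3));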

  3. Scientific computing an introduction using Maple and Matlab

    CERN Document Server

    Gander, Walter; Kwok, Felix

    2014-01-01

    Scientific computing is the study of how to use computers effectively to solve problems that arise from the mathematical modeling of phenomena in science and engineering. It is based on mathematics, numerical and symbolic/algebraic computations and visualization. This book serves as an introduction to both the theory and practice of scientific computing, with each chapter presenting the basic algorithms that serve as the workhorses of many scientific codes; we explain both the theory behind these algorithms and how they must be implemented in order to work reliably in finite-precision arithmetic. The book includes many programs written in Matlab and Maple – Maple is often used to derive numerical algorithms, whereas Matlab is used to implement them. The theory is developed in such a way that students can learn by themselves as they work through the text. Each chapter contains numerous examples and problems to help readers understand the material “hands-on”.

  4. Digital signal processing for wireless communication using Matlab

    CERN Document Server

    Gopi, E S

    2016-01-01

    This book examines signal processing techniques used in wireless communication illustrated by using the Matlab program. The author discusses these techniques as they relate to Doppler spread; delay spread; Rayleigh and Rician channel modeling; rake receiver; diversity techniques; MIMO and OFDM -based transmission techniques; and array signal processing. Related topics such as detection theory, link budget, multiple access techniques, and spread spectrum are also covered.   ·         Illustrates signal processing techniques involved in wireless communication using Matlab ·         Discusses multiple access techniques such as Frequency division multiple access, Time division multiple access, and Code division multiple access ·         Covers band pass modulation techniques such as Binary phase shift keying, Differential phase shift keying, Quadrature phase shift keying, Binary frequency shift keying, Minimum shift keying, and Gaussian minimum shift keying.

  5. KiT: a MATLAB package for kinetochore tracking.

    Science.gov (United States)

    Armond, Jonathan W; Vladimirou, Elina; McAinsh, Andrew D; Burroughs, Nigel J

    2016-06-15

    During mitosis, chromosomes are attached to the mitotic spindle via large protein complexes called kinetochores. The motion of kinetochores throughout mitosis is intricate and automated quantitative tracking of their motion has already revealed many surprising facets of their behaviour. Here, we present 'KiT' (Kinetochore Tracking)-an easy-to-use, open-source software package for tracking kinetochores from live-cell fluorescent movies. KiT supports 2D, 3D and multi-colour movies, quantification of fluorescence, integrated deconvolution, parallel execution and multiple algorithms for particle localization. KiT is free, open-source software implemented in MATLAB and runs on all MATLAB supported platforms. KiT can be downloaded as a package from http://www.mechanochemistry.org/mcainsh/software.php The source repository is available at https://bitbucket.org/jarmond/kit and under continuing development. Supplementary data are available at Bioinformatics online. jonathan.armond@warwick.ac.uk. © The Author 2016. Published by Oxford University Press.

  6. Exact fast computation of band depth for large functional datasets: How quickly can one million curves be ranked?

    KAUST Repository

    Sun, Ying

    2012-10-01

    © 2012 John Wiley & Sons, Ltd. Band depth is an important nonparametric measure that generalizes order statistics and makes univariate methods based on order statistics possible for functional data. However, the computational burden of band depth limits its applicability when large functional or image datasets are considered. This paper proposes an exact fast method to speed up the band depth computation when bands are defined by two curves. Remarkable computational gains are demonstrated through simulation studies comparing our proposal with the original computation and one existing approximate method. For example, we report an experiment where our method can rank one million curves, evaluated at fifty time points each, in 12.4 seconds with Matlab.

  7. Radar signal analysis and processing using Matlab

    CERN Document Server

    Mahafza, Bassem R

    2008-01-01

    Offering radar-related software for the analysis and design of radar waveform and signal processing, this book provides comprehensive coverage of radar signals and signal processing techniques and algorithms. It contains numerous graphical plots, common radar-related functions, table format outputs, and end-of-chapter problems. The complete set of MATLAB[registered] functions and routines are available for download online.

  8. Simulation, design and thermal analysis of a solar Stirling engine using MATLAB

    International Nuclear Information System (INIS)

    Shazly, J.H.; Hafez, A.Z.; El Shenawy, E.T.; Eteiba, M.B.

    2014-01-01

    Highlights: • Modeling and simulation for a prototype of the solar-powered Stirling engine. • The solar-powered Stirling engine working in the low temperature range. • Estimating output power from the solar Stirling engine using a Matlab program. • A solar radiation simulation program presents solar radiation data using MATLAB. - Abstract: This paper presents the modeling and simulation of a prototype of the solar-powered Stirling engine working in the low temperature range. A mathematical model for the thermal analysis of the solar-powered low temperature Stirling engine with heat transfer is developed using a Matlab program. The model takes into consideration the effect of the absorber temperature on the thermal analysis, including radiation and convection heat transfer between the absorber and the working fluid as well as radiation and convection heat transfer between the lower temperature plate and the working fluid. Hence, the present analysis provides theoretical guidance for designing and operating the solar-powered low temperature Stirling engine system, as well as for estimating output power from the solar Stirling engine using a Matlab program. This study attempts to demonstrate the potential of the low temperature Stirling engine as an option for the prime mover of photovoltaic tracking systems. The heat source temperature is 40–60 °C, which is the temperature range available directly from the sun.

  9. Benchmark Simulation Model No 2 in Matlab-Simulink

    DEFF Research Database (Denmark)

    Vrecko, Darko; Gernaey, Krist; Rosen, Christian

    2006-01-01

    In this paper, implementation of the Benchmark Simulation Model No 2 (BSM2) within Matlab-Simulink is presented. The BSM2 is developed for plant-wide WWTP control strategy evaluation on a long-term basis. It consists of a pre-treatment process, an activated sludge process and sludge treatment...

  10. A New Approach for Optimal Sizing of Standalone Photovoltaic Systems

    OpenAIRE

    Khatib, Tamer; Mohamed, Azah; Sopian, K.; Mahmoud, M.

    2012-01-01

    This paper presents a new method for determining the optimal sizing of standalone photovoltaic (PV) system in terms of optimal sizing of PV array and battery storage. A standalone PV system energy flow is first analysed, and the MATLAB fitting tool is used to fit the resultant sizing curves in order to derive general formulas for optimal sizing of PV array and battery. In deriving the formulas for optimal sizing of PV array and battery, the data considered are based on five sites in Malaysia...
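
    As an illustration only (the sizing points below are invented and the chosen model form is an assumption, not the paper's), the Curve Fitting Toolbox can turn a numerically generated sizing curve into a closed-form expression.

        % Minimal sketch: fit PV array capacity C_A against loss-of-load probability LLP.
        LLP = [0.01 0.02 0.05 0.10 0.15 0.20]';
        CA  = [2.9  2.4  1.8  1.4  1.2  1.1]';      % sizing ratios from energy-flow simulation (hypothetical)
        ft  = fit(LLP, CA, 'power2');               % C_A = a*LLP^b + c  (Curve Fitting Toolbox)
        disp(coeffvalues(ft));                      % report the fitted a, b, c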

  11. SIGNUM: A Matlab, TIN-based landscape evolution model

    Science.gov (United States)

    Refice, A.; Giachetta, E.; Capolongo, D.

    2012-08-01

    Several numerical landscape evolution models (LEMs) have been developed to date, and many are available as open source codes. Most are written in efficient programming languages such as Fortran or C, but often require additional code efforts to plug in to more user-friendly data analysis and/or visualization tools to ease interpretation and scientific insight. In this paper, we present an effort to port a common core of accepted physical principles governing landscape evolution directly into a high-level language and data analysis environment such as Matlab. SIGNUM (an acronym for Simple Integrated Geomorphological Numerical Model) is an independent and self-contained Matlab, TIN-based landscape evolution model, built to simulate topography development at various space and time scales. SIGNUM is presently capable of simulating hillslope processes such as linear and nonlinear diffusion, fluvial incision into bedrock, spatially varying surface uplift (which can be used to simulate changes in base level, thrust and faulting), as well as the effects of climate changes. Although based on accepted and well-known processes and algorithms in its present version, it is built with a modular structure that allows the simulated physical processes to be easily modified and upgraded to suit virtually any user's needs. The code is conceived as an open-source project, and is thus an ideal tool for both research and didactic purposes, thanks to the high-level nature of the Matlab environment and its popularity among the scientific community. In this paper the simulation code is presented together with some simple examples of surface evolution, and guidelines for the development of new modules and algorithms are proposed.

  12. Elementary mechanics using Matlab a modern course combining analytical and numerical techniques

    CERN Document Server

    Malthe-Sørenssen, Anders

    2015-01-01

    This book – specifically developed as a novel textbook on elementary classical mechanics – shows how analytical and numerical methods can be seamlessly integrated to solve physics problems. This approach allows students to solve more advanced and applied problems at an earlier stage and equips them to deal with real-world examples well beyond the typical special cases treated in standard textbooks. Another advantage of this approach is that students are brought closer to the way physics is actually discovered and applied, as they are introduced right from the start to a more exploratory way of understanding phenomena and of developing their physical concepts. While not a requirement, it is advantageous for the reader to have some prior knowledge of scientific programming with a scripting-type language. This edition of the book uses Matlab, and a chapter devoted to the basics of scientific programming with Matlab is included. A parallel edition using Python instead of Matlab is also available. Last but not...

  13. Design of high-performance parallelized gene predictors in MATLAB.

    Science.gov (United States)

    Rivard, Sylvain Robert; Mailloux, Jean-Gabriel; Beguenane, Rachid; Bui, Hung Tien

    2012-04-10

    This paper proposes a method of implementing parallel gene prediction algorithms in MATLAB. The proposed designs are based on either Goertzel's algorithm or on FFTs and have been implemented using varying amounts of parallelism on a central processing unit (CPU) and on a graphics processing unit (GPU). Results show that an implementation using a straightforward approach can require over 4.5 h to process 15 million base pairs (bps) whereas a properly designed one could perform the same task in less than five minutes. In the best case, a GPU implementation can yield these results in 57 s. The present work shows how parallelism can be used in MATLAB for gene prediction in very large DNA sequences to produce results that are over 270 times faster than a conventional approach. This is significant as MATLAB is typically overlooked due to its apparent slow processing time even though it offers a convenient environment for bioinformatics. From a practical standpoint, this work proposes two strategies for accelerating genome data processing which rely on different parallelization mechanisms. Using a CPU, the work shows that direct access to the MEX function increases execution speed and that the PARFOR construct should be used in order to take full advantage of the parallelizable Goertzel implementation. When the target is a GPU, the work shows that data needs to be segmented into manageable sizes within the GFOR construct before processing in order to minimize execution time.

  14. Bose-Einstein Condensate Dark Matter Halos Confronted with Galactic Rotation Curves

    Directory of Open Access Journals (Sweden)

    M. Dwornik

    2017-01-01

    Full Text Available We present a comparative confrontation of both the Bose-Einstein Condensate (BEC) and the Navarro-Frenk-White (NFW) dark halo models with galactic rotation curves. We employ 6 High Surface Brightness (HSB), 6 Low Surface Brightness (LSB), and 7 dwarf galaxies with rotation curves falling into two classes. In the first class rotational velocities increase with radius over the observed range. The BEC and NFW models give comparable fits for HSB and LSB galaxies of this type, while for dwarf galaxies the fit is significantly better with the BEC model. In the second class the rotational velocity of HSB and LSB galaxies exhibits long flat plateaus, resulting in a better fit with the NFW model for HSB galaxies and comparable fits for LSB galaxies. We conclude that, due to its avoidance of the central density cusp, the BEC model better fits the dwarf galaxy dark matter distribution. Nevertheless it suffers from a sharp cutoff in larger galaxies, where the NFW model performs better. The investigated galaxy sample obeys the Tully-Fisher relation, including the particular characteristics exhibited by dwarf galaxies. In both models the fitting enforces a relation between dark matter parameters: the characteristic density and the corresponding characteristic distance scale with an inverse power.

  15. Causes of maternal mortality decline in Matlab, Bangladesh.

    Science.gov (United States)

    Chowdhury, Mahbub Elahi; Ahmed, Anisuddin; Kalim, Nahid; Koblinsky, Marge

    2009-04-01

    Bangladesh is distinct among developing countries in achieving a low maternal mortality ratio (MMR) of 322 per 100,000 livebirths despite the very low use of skilled care at delivery (13% nationally). This variation has also been observed in Matlab, a rural area in Bangladesh, where longitudinal data on maternal mortality are available since the mid-1970s. The current study investigated the possible causes of the maternal mortality decline in Matlab. The study analyzed 769 maternal deaths and 215,779 pregnancy records from the Health and Demographic Surveillance System (HDSS) and other sources of safe motherhood data in the ICDDR,B and government service areas in Matlab during 1976-2005. The major interventions that took place in both the areas since the early 1980s were the family-planning programme plus safe menstrual regulation services and safe motherhood interventions (midwives for normal delivery in the ICDDR,B service area from the late 1980s and equal access to comprehensive emergency obstetric care [EmOC] in public facilities for women from both the areas). National programmes for social development and empowerment of women through education and microcredit programmes were implemented in both the areas. The quantitative findings were supplemented by a qualitative study by interviewing local community care providers for their change in practices for maternal healthcare over time. After the introduction of the safe motherhood programme, reduction in maternal mortality was higher in the ICDDR,B service area (68.6%) than in the government service area (50.4%) during 1986-1989 and 2001-2005. Reduction in the number of maternal deaths due to the fertility decline was higher in the government service area (30%) than in the ICDDR,B service area (23%) during 1979-2005. In each area, there has been substantial reduction in abortion-related mortality--86.7% and 78.3%--in the ICDDR,B and government service areas respectively. Education of women was a strong predictor

  16. Evaluation of Interpolants in Their Ability to Fit Seismometric Time Series

    Directory of Open Access Journals (Sweden)

    Kanadpriya Basu

    2015-08-01

    Full Text Available This article is devoted to the study of the ASARCO demolition seismic data. Two different classes of modeling techniques are explored: first, mathematical interpolation methods, and second, statistical smoothing approaches for curve fitting. We estimate the characteristic parameters of the propagation medium for seismic waves with multiple mathematical and statistical techniques, and discuss the relative advantages of each approach for fitting such data. We conclude that mathematical interpolation techniques and statistical curve fitting techniques complement each other and can add value to the study of one-dimensional time series seismographic data: they can be used to add more data to the system when the data set is not large enough to perform standard statistical tests.
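
    For orientation only (on a synthetic trace, not the ASARCO data), the two classes of techniques compared in the article differ in that an interpolant reproduces every sample exactly while a statistical smoother does not.

        % Minimal sketch: exact spline interpolation versus local-regression smoothing.
        t  = linspace(0, 2, 60)';
        y  = exp(-2*t).*sin(20*t) + 0.05*randn(size(t));   % noisy decaying oscillation
        tq = linspace(0, 2, 600)';
        yi = interp1(t, y, tq, 'spline');                  % passes through every sample
        ys = smoothdata(y, 'loess', 15);                   % smooths the samples instead
        plot(t, y, '.', tq, yi, '-', t, ys, '--');
        legend('data', 'spline interpolant', 'loess smoother');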

  17. Research Reactor Power Control System Design by MATLAB/SIMULINK

    International Nuclear Information System (INIS)

    Baang, Dane; Suh, Yong Suk; Kim, Young Ki; Im, Ki Hong

    2013-01-01

    This study shows that MATLAB/SIMULINK can be used efficiently for modeling and power control system design for research reactors. The presented power control system deals with various functions including reactivity control, signal processing, reactivity calculation, alarm request generation, etc.; thus it is required to test all the software logic using proper models for the reactor, control rods, and field instruments. In the MATLAB/SIMULINK tool, point kinetics, thermal, control absorber rod, and other instrument models were developed based on reactor parameters and known properties of each component or system. The software for the power control system was implemented and linked to the model to test each function. The simulation results show that the power control performance and other functions of the system can be easily tested and analyzed in the proposed simulation structure.

  18. Simulaser, a graphical laser simulator based on Matlab Simulink

    CSIR Research Space (South Africa)

    Jacobs, Cobus

    2016-07-01

    Full Text Available We present a single-element plane-wave laser rate equation model and its implementation as a graphical laser simulation library using Matlab Simulink. Simulink’s graphical interface and vector capabilities provide a unique layer of abstraction...

  19. The NNSYSID Toolbox - A MATLAB Toolbox for System Identification with Neural Networks

    DEFF Research Database (Denmark)

    Nørgård, Peter Magnus; Ravn, Ole; Hansen, Lars Kai

    1996-01-01

    To assist the identification of nonlinear dynamic systems, a set of tools has been developed for the MATLAB(R) environment. The tools include a number of different model structures, highly effective training algorithms, functions for validating trained networks, and pruning algorithms for determining the optimal network architecture.

  20. Computational partial differential equations using Matlab

    CERN Document Server

    Li, Jichun

    2008-01-01

    Brief overview of partial differential equations: the parabolic equations; the wave equations; the elliptic equations; differential equations in broader areas; a quick review of numerical methods for PDEs. Finite difference methods for parabolic equations: introduction; theoretical issues (stability, consistency, and convergence); 1-D parabolic equations; 2-D and 3-D parabolic equations; numerical examples with MATLAB codes. Finite difference methods for hyperbolic equations: introduction; some basic difference schemes; dissipation and dispersion errors; extensions to conservation laws; the second-order hyperbolic PDE

  1. Flexible competing risks regression modeling and goodness-of-fit

    DEFF Research Database (Denmark)

    Scheike, Thomas; Zhang, Mei-Jie

    2008-01-01

    In this paper we consider different approaches for estimation and assessment of covariate effects for the cumulative incidence curve in the competing risks model. The classic approach is to model all cause-specific hazards and then estimate the cumulative incidence curve based on these cause-specific hazards. Here we consider a class of flexible regression models that is easy to fit and contains the Fine-Gray model as a special case. One advantage of this approach is that our regression modeling allows for non-proportional hazards. This leads to a new simple goodness-of-fit procedure for the proportional subdistribution hazards assumption that is very easy to use. We also illustrate the use of the flexible regression models to analyze competing risks data when non-proportionality is present in the data.

  2. The link between the baryonic mass distribution and the rotation curve shape

    NARCIS (Netherlands)

    Swaters, R. A.; Sancisi, R.; van der Hulst, J. M.; van Albada, T. S.

    The observed rotation curves of disc galaxies, ranging from late-type dwarf galaxies to early-type spirals, can be fitted remarkably well simply by scaling up the contributions of the stellar and H?i discs. This baryonic scaling model can explain the full breadth of observed rotation curves with

  3. Vectorized Matlab Codes for Linear Two-Dimensional Elasticity

    Directory of Open Access Journals (Sweden)

    Jonas Koko

    2007-01-01

    Full Text Available A vectorized Matlab implementation for the linear finite element is provided for the two-dimensional linear elasticity with mixed boundary conditions. Vectorization means that there is no loop over triangles. Numerical experiments show that our implementation is more efficient than the standard implementation with a loop over all triangles.

  4. Optimization of Fit for Mass Customized Apparel Ordering Using Fit Preference and Self Measurement.

    Science.gov (United States)

    2000-01-01

    in significance and definition for both consumers and manufacturers. Fit preference involves an individualized bias toward a particular look, size...

  5. AIR Tools - A MATLAB package of algebraic iterative reconstruction methods

    DEFF Research Database (Denmark)

    Hansen, Per Christian; Saxild-Hansen, Maria

    2012-01-01

    We present a MATLAB package with implementations of several algebraic iterative reconstruction methods for discretizations of inverse problems. These so-called row action methods rely on semi-convergence for achieving the necessary regularization of the problem. Two classes of methods are implemented: Algebraic Reconstruction Techniques (ART) and Simultaneous Iterative Reconstruction Techniques (SIRT). In addition we provide a few simplified test problems from medical and seismic tomography. For each iterative method, a number of strategies are available for choosing the relaxation parameter...

  6. Nonparametric statistics with applications to science and engineering

    CERN Document Server

    Kvam, Paul H

    2007-01-01

    A thorough and definitive book that fully addresses traditional and modern-day topics of nonparametric statistics. This book presents a practical approach to nonparametric statistical analysis and provides comprehensive coverage of both established and newly developed methods. With the use of MATLAB, the authors present information on theorems and rank tests in an applied fashion, with an emphasis on modern methods in regression and curve fitting, bootstrap confidence intervals, splines, wavelets, empirical likelihood, and goodness-of-fit testing. Nonparametric Statistics with Applications to Science and Engineering begins with succinct coverage of basic results for order statistics, methods of categorical data analysis, nonparametric regression, and curve fitting methods. The authors then focus on nonparametric procedures that are becoming more relevant to engineering researchers and practitioners. The important fundamental materials needed to effectively learn and apply the discussed methods are also provided...

  7. An Open-source Toolbox for Analysing and Processing PhysioNet Databases in MATLAB and Octave

    Directory of Open Access Journals (Sweden)

    Ikaro Silva

    2014-09-01

    Full Text Available The WaveForm DataBase (WFDB Toolbox for MATLAB/Octave enables  integrated access to PhysioNet's software and databases. Using the WFDB Toolbox for MATLAB/Octave, users have access to over 50 physiological databases in PhysioNet. The toolbox allows direct loading into MATLAB/Octave's workspace of over 4 TB of biomedical signals including ECG, EEG, EMG, and PLETH. Additionally, most signals are accompanied by meta data such as medical annotations of clinical events: arrhythmias, sleep stages, seizures, hypotensive episodes, etc. Users of this toolbox should easily be able to reproduce, validate, and compare results published based on PhysioNet's software and databases.

  8. Integration of the IRB120 robot arm in the ROS-MATLAB environment

    OpenAIRE

    Gómez Cuadrado, José Manuel

    2017-01-01

    This project uses the ROS (Robot Operating System) environment to develop the control of the IRB 120 robot arm and its implementation in the MATLAB working environment. It explains the creation of the robot model, trajectory planning, and communication with the robot.

  9. Elementary mathematical and computational tools for electrical and computer engineers using Matlab

    CERN Document Server

    Manassah, Jamal T

    2013-01-01

    Ideal for use as a short-course textbook and for self-study Elementary Mathematical and Computational Tools for Electrical and Computer Engineers Using MATLAB fills that gap. Accessible after just one semester of calculus, it introduces the many practical analytical and numerical tools that are essential to success both in future studies and in professional life. Sharply focused on the needs of the electrical and computer engineering communities, the text provides a wealth of relevant exercises and design problems. Changes in MATLAB's version 6.0 are included in a special addendum.

  10. Trend analyses with river sediment rating curves

    Science.gov (United States)

    Warrick, Jonathan A.

    2015-01-01

    Sediment rating curves, which are fitted relationships between river discharge (Q) and suspended-sediment concentration (C), are commonly used to assess patterns and trends in river water quality. In many of these studies it is assumed that rating curves have a power-law form (i.e., C = aQ^b, where a and b are fitted parameters). Two fundamental questions about the utility of these techniques are assessed in this paper: (i) How well do the parameters, a and b, characterize trends in the data? (ii) Are trends in rating curves diagnostic of changes to river water or sediment discharge? As noted in previous research, the offset parameter, a, is not an independent variable for most rivers, but rather strongly dependent on b and Q. Here it is shown that a is a poor metric for trends in the vertical offset of a rating curve, and a new parameter, â, as determined by the discharge-normalized power function [C = â(Q/Q_GM)^b], where Q_GM is the geometric mean of the Q values sampled, provides a better characterization of trends. However, these techniques must be applied carefully, because curvature in the relationship between log(Q) and log(C), which exists for many rivers, can produce false trends in â and b. Also, it is shown that trends in â and b are not uniquely diagnostic of river water or sediment supply conditions. For example, an increase in â can be caused by an increase in sediment supply, a decrease in water supply, or a combination of these conditions. Large changes in water and sediment supplies can occur without any change in the parameters, â and b. Thus, trend analyses using sediment rating curves must include additional assessments of the time-dependent rates and trends of river water, sediment concentrations, and sediment discharge.
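
    As an illustration only (with invented discharge-concentration pairs), the discharge-normalized power function can be fitted by a straight line in log-log space, which yields the exponent b and the offset â directly.

        % Minimal sketch: fit C = a_hat*(Q/Q_GM)^b from paired Q and C samples.
        Q = [12 30 55 90 150 240 400]';             % discharge [m^3/s]
        C = [20 48 85 130 210 330 520]';            % suspended-sediment concentration [mg/L]
        Qgm  = exp(mean(log(Q)));                   % geometric mean of the sampled discharges
        p    = polyfit(log(Q/Qgm), log(C), 1);      % straight line in log-log space
        b    = p(1);                                % exponent
        ahat = exp(p(2));                           % discharge-normalized offset a_hat
        fprintf('C = %.1f*(Q/%.1f)^%.2f\n', ahat, Qgm, b);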

  11. Polarization Curve of a Non-Uniformly Aged PEM Fuel Cell

    Directory of Open Access Journals (Sweden)

    Andrei Kulikovsky

    2014-01-01

    Full Text Available We develop a semi-analytical model for the polarization curve of a polymer electrolyte membrane (PEM) fuel cell with transport and kinetic parameters of the membrane–electrode assembly (MEA) that are distributed (aged) along the oxygen channel. We show that the curve corresponding to a parameter varying along the channel, in general, does not reduce to the curve for a certain constant value of this parameter. A possibility to determine the shape of the deteriorated MEA parameter along the oxygen channel by fitting the model equation to the cell polarization data is demonstrated.

  12. Quarkonium level fitting with two-power potentials

    International Nuclear Information System (INIS)

    Joshi, G.C.; Wignall, J.W.G.

    1981-01-01

    An attempt has been made to fit psi and UPSILON energy levels and leptonic decay width ratios with a non-relativistic potential model using a potential of the form V(r) = Ar^p + Br^q + C. It is found that reasonable fits to states below hadronic decay threshold can be obtained for values of the powers p and q anywhere along a family of curves in the (p,q) plane that smoothly join the Martin potential (p = 0, q = 0.1) to the potential forms with p approximately -1 suggested by QCD; for the latter case the best fit is obtained with q approximately 0.4 - 0.5

  13. The Matlab Radial Basis Function Toolbox

    Directory of Open Access Journals (Sweden)

    Scott A. Sarra

    2017-03-01

    Full Text Available Radial Basis Function (RBF) methods are important tools for scattered data interpolation and for the solution of Partial Differential Equations in complexly shaped domains. The most straightforward approach used to evaluate the methods involves solving a linear system which is typically poorly conditioned. The Matlab Radial Basis Function toolbox features a regularization method for the ill-conditioned system, extended precision floating point arithmetic, and symmetry exploitation for the purpose of reducing flop counts of the associated numerical linear algebra algorithms.
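
    As an illustration only (this is not the toolbox's own API), the basic mechanics the abstract refers to, building and solving the interpolation system, can be sketched for a 1-D multiquadric kernel with hypothetical data.

        % Minimal sketch: 1-D multiquadric RBF interpolation via the linear system A*w = y.
        xc  = linspace(-1, 1, 15)';                 % data sites = RBF centers
        yc  = exp(-4*xc.^2);                        % hypothetical data values
        c   = 3;                                    % shape parameter
        phi = @(r) sqrt(1 + (c*r).^2);              % multiquadric kernel
        A   = phi(abs(xc - xc'));                   % (often ill-conditioned) system matrix
        w   = A \ yc;                               % expansion weights
        xe  = linspace(-1, 1, 200)';
        ye  = phi(abs(xe - xc')) * w;               % evaluate the interpolant
        plot(xc, yc, 'o', xe, ye, '-');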

  14. Signals and systems primer with Matlab

    CERN Document Server

    Poularikas, Alexander D

    2006-01-01

    Signals and Systems Primer with MATLAB® equally emphasizes the fundamentals of both analog and digital signals and systems. To ensure insight into the basic concepts and methods, the text presents a variety of examples that illustrate a wide range of applications, from microelectromechanical to worldwide communication systems. It also provides MATLAB functions and procedures for practice and verification of these concepts.Taking a pedagogical approach, the author builds a solid foundation in signal processing as well as analog and digital systems. The book first introduces orthogonal signals,

  15. Obtaining DDF Curves of Extreme Rainfall Data Using Bivariate Copula and Frequency Analysis

    DEFF Research Database (Denmark)

    Sadri, Sara; Madsen, Henrik; Mikkelsen, Peter Steen

    2009-01-01

    The analysis is based on rainfall data from a gauging station situated near Copenhagen in Denmark. Rainfall series are extracted in three ways: (1) by maximum mean intensity; (2) by depth and duration; and (3) by volume and duration. The copula approach does not assume that the rainfall variables are independent or jointly normally distributed. For rainfall extracted using method 2, the marginal distribution of depth was found to fit the Generalized Pareto distribution while duration was found to fit the Gamma distribution, using the method of L-moments. For rainfall extracted using method 3, the volume was fit with a generalized Pareto distribution and the duration was fit with a Pearson type III distribution. The Clayton copula was found to be appropriate for bivariate analysis of rainfall depth and duration for both methods 2 and 3. DDF (depth-duration-frequency) curves, which relate depth with duration for a given return period, were derived using the Clayton copula for depth and duration...

  16. Matlab as a robust control design tool

    Science.gov (United States)

    Gregory, Irene M.

    1994-01-01

    This presentation introduces Matlab as a tool used in flight control research. The example used to illustrate some of the capabilities of this software is a robust controller designed for a single-stage-to-orbit air-breathing vehicle's ascent to orbit. The global requirements of the controller are to stabilize the vehicle and follow a trajectory in the presence of atmospheric disturbances and strong dynamic coupling between airframe and propulsion.

  17. Programming for computations MATLAB/Octave : a gentle introduction to numerical simulations with MATLAB/Octave

    CERN Document Server

    Linge, Svein

    2016-01-01

    This book presents computer programming as a key method for solving mathematical problems. There are two versions of the book, one for MATLAB and one for Python. The book was inspired by the Springer book TCSE 6: A Primer on Scientific Programming with Python (by Langtangen), but the style is more accessible and concise, in keeping with the needs of engineering students. The book outlines the shortest possible path from no previous experience with programming to a set of skills that allows the students to write simple programs for solving common mathematical problems with numerical methods in engineering and science courses. The emphasis is on generic algorithms, clean design of programs, use of functions, and automatic tests for verification.

  18. Digital Signal Processing for Medical Imaging Using Matlab

    CERN Document Server

    Gopi, E S

    2013-01-01

    This book describes medical imaging systems, such as X-ray, Computed tomography, MRI, etc. from the point of view of digital signal processing. Readers will see techniques applied to medical imaging such as Radon transformation, image reconstruction, image rendering, image enhancement and restoration, and more. This book also outlines the physics behind medical imaging required to understand the techniques being described. The presentation is designed to be accessible to beginners who are doing research in DSP for medical imaging. Matlab programs and illustrations are used wherever possible to reinforce the concepts being discussed.  ·         Acts as a “starter kit” for beginners doing research in DSP for medical imaging; ·         Uses Matlab programs and illustrations throughout to make content accessible, particularly with techniques such as Radon transformation and image rendering; ·         Includes discussion of the basic principles behind the various medical imaging tec...

  19. Graphical User Interface (GUI) Programming with Matlab for Designing a Mathematical Operations Aid

    OpenAIRE

    Butar Butar, Ronisah Putra

    2011-01-01

    A Graphical User Interface (GUI) is a visually oriented application program built from graphical objects that take the place of text commands for user interaction. In MATLAB, a Graphical User Interface (GUI) is built with the GUIDE application (Graphical User Interface Builder). This paper discusses how to design a tool that assists with mathematical operations using a Graphical User Interface (GUI) program in MATLAB, with the aim of providing an alternative teaching aid...

  20. Algorithm 873: LSTRS: MATLAB Software for Large-Scale Trust-Region Subproblems and Regularization

    DEFF Research Database (Denmark)

    Rojas Larrazabal, Marielba de la Caridad; Santos, Sandra A.; Sorensen, Danny C.

    2008-01-01

    A MATLAB 6.0 implementation of the LSTRS method is presented. LSTRS was described in Rojas, M., Santos, S.A., and Sorensen, D.C., A new matrix-free method for the large-scale trust-region subproblem, SIAM J. Optim., 11(3):611-646, 2000. LSTRS is designed for large-scale quadratic problems with one...... at each step. LSTRS relies on matrix-vector products only and has low and fixed storage requirements, features that make it suitable for large-scale computations. In the MATLAB implementation, the Hessian matrix of the quadratic objective function can be specified either explicitly, or in the form...... of a matrix-vector multiplication routine. Therefore, the implementation preserves the matrix-free nature of the method. A description of the LSTRS method and of the MATLAB software, version 1.2, is presented. Comparisons with other techniques and applications of the method are also included. A guide...

  1. One Curve Embedded Full-Bridge MMC Modeling Method with Detailed Representation of IGBT Characteristics

    Science.gov (United States)

    Hongyang, Yu; Zhengang, Lu; Xi, Yang

    2017-05-01

    The Modular Multilevel Converter (MMC) is more and more widely used in high voltage DC transmission systems and high power motor drive systems. It is a major topological structure for high power AC-DC converters. Due to the large number of modules, the complex control algorithm, and the high-power application background, the MMC model used for simulation should be as accurate as possible in reproducing the details of how the MMC works for dynamic testing of the MMC controller. So far, however, there is no simple simulation MMC model that can reproduce the switching dynamic process. In this paper, a curve embedded full-bridge MMC modeling method with detailed representation of IGBT characteristics is proposed. The method is based on switching-curve lookup and simple circuit calculation, and it is simple to implement. Simulation comparison tests in Matlab/Simulink show that the proposed method is correct.
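
    The "curve embedded" idea, storing a measured switching curve and looking it up at the simulation time steps, can be sketched in MATLAB with interp1. The curve samples below are invented, not measured IGBT data, and the sketch only shows the lookup step, not the full model.

      % Embedded-curve lookup of a turn-on voltage transient at the simulation time steps.
      tCurve = (0:0.1:2)*1e-6;                          % time since the switching command, s
      vCurve = 600*exp(-tCurve/0.4e-6);                 % hypothetical collector-emitter voltage, V
      tSim   = (0:1e-8:2e-6)';                          % simulation time steps
      vSim   = interp1(tCurve, vCurve, tSim, 'pchip');  % look up the embedded curve
      plot(tSim*1e6, vSim); xlabel('t (\mus)'); ylabel('v_{CE} (V)');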

  2. Conformal Interpolating Algorithm Based on Cubic NURBS in Aspheric Ultra-Precision Machining

    International Nuclear Information System (INIS)

    Li, C G; Zhang, Q R; Cao, C G; Zhao, S L

    2006-01-01

    Numeric control machining and on-line compensation for aspheric surfaces are key techniques in ultra-precision machining. In this paper, a conformal cubic NURBS interpolating curve is applied to fit the character curve of an aspheric surface. Its algorithm and process are also proposed and simulated with Matlab 7.0 software. To evaluate the performance of the conformal cubic NURBS interpolation, we compare it with linear interpolation. The results verify that this method can ensure smoothness of the interpolating spline curve and preserve the original shape characteristics. The surface quality obtained by cubic NURBS interpolation is higher than that obtained by linear interpolation. The algorithm is beneficial for increasing the surface form precision of workpieces in ultra-precision machining.
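
    The paper's conformal cubic NURBS scheme is specific to that work, but the basic step of interpolating the character curve of an aspheric surface can be sketched with MATLAB's built-in cubic spline. The vertex radius, conic constant and knot spacing below are assumptions for illustration only.

      % Cubic-spline interpolation of an aspheric (conic) character curve and its form error.
      R = 50; k = -1.2;                                  % assumed vertex radius (mm) and conic constant
      sag = @(r) r.^2 ./ (R + sqrt(R^2 - (1+k)*r.^2));   % aspheric sag equation
      rKnots = linspace(0, 10, 15);                      % sparse data points along the radius
      pp     = spline(rKnots, sag(rKnots));              % piecewise cubic interpolating spline
      rDense = linspace(0, 10, 1000);
      formErr = max(abs(ppval(pp, rDense) - sag(rDense)));
      fprintf('Maximum form error of the spline fit: %.3e mm\n', formErr);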

  3. A linear algebra course with PC-MATLAB : some experiences

    NARCIS (Netherlands)

    Smits, J.G.M.M.; Rijpkema, J.J.M.

    1992-01-01

    The authors present their views on the impact that the use of computers and software packages should have on the contents of a first service course on linear algebra. Furthermore they report on their experiences using the software package PC-MATLAB in such a course.

  4. A platform for dynamic simulation and control of movement based on OpenSim and MATLAB.

    Science.gov (United States)

    Mansouri, Misagh; Reinbolt, Jeffrey A

    2012-05-11

    Numerical simulations play an important role in solving complex engineering problems and have the potential to revolutionize medical decision making and treatment strategies. In this paper, we combine the rapid model-based design, control systems and powerful numerical method strengths of MATLAB/Simulink with the simulation and human movement dynamics strengths of OpenSim by developing a new interface between the two software tools. OpenSim is integrated with Simulink using the MATLAB S-function mechanism, and the interface is demonstrated using both open-loop and closed-loop control systems. While the open-loop system uses MATLAB/Simulink to separately reproduce the OpenSim Forward Dynamics Tool, the closed-loop system adds the unique feature of feedback control to OpenSim, which is necessary for most human movement simulations. An arm model example was successfully used in both open-loop and closed-loop cases. For the open-loop case, the simulation reproduced results from the OpenSim Forward Dynamics Tool with root mean square (RMS) differences of 0.03° for the shoulder elevation angle and 0.06° for the elbow flexion angle. MATLAB's variable step-size integrator reduced the time required to generate the forward dynamic simulation from 7.1s (OpenSim) to 2.9s (MATLAB). For the closed-loop case, a proportional-integral-derivative controller was used to successfully balance a pole on the model's hand despite random force disturbances on the pole. The new interface presented here not only integrates the OpenSim and MATLAB/Simulink software tools, but also will allow neuroscientists, physiologists, biomechanists, and physical therapists to adapt and generate new solutions as treatments for musculoskeletal conditions. Copyright © 2012 Elsevier Ltd. All rights reserved.

  5. MATLAB simulation software used for the PhD thesis "Acquisition of Multi-Band Signals via Compressed Sensing

    DEFF Research Database (Denmark)

    2014-01-01

    MATLAB simulation software used for the PhD thesis "Acquisition of Multi-Band Signals via Compressed Sensing......MATLAB simulation software used for the PhD thesis "Acquisition of Multi-Band Signals via Compressed Sensing...

  6. MBEToolbox: a Matlab toolbox for sequence data analysis in molecular biology and evolution

    Directory of Open Access Journals (Sweden)

    Xia Xuhua

    2005-03-01

    Full Text Available Abstract Background MATLAB is a high-performance language for technical computing, integrating computation, visualization, and programming in an easy-to-use environment. It has been widely used in many areas, such as mathematics and computation, algorithm development, data acquisition, modeling, simulation, and scientific and engineering graphics. However, few functions are freely available in MATLAB to perform the sequence data analyses specifically required for molecular biology and evolution. Results We have developed a MATLAB toolbox, called MBEToolbox, aimed at filling this gap by offering efficient implementations of the most needed functions in molecular biology and evolution. It can be used to manipulate aligned sequences, calculate evolutionary distances, estimate synonymous and nonsynonymous substitution rates, and infer phylogenetic trees. Moreover, it provides an extensible, functional framework for users with more specialized requirements to explore and analyze aligned nucleotide or protein sequences from an evolutionary perspective. The full functions in the toolbox are accessible through the command-line for seasoned MATLAB users. A graphical user interface, that may be especially useful for non-specialist end users, is also provided. Conclusion MBEToolbox is a useful tool that can aid in the exploration, interpretation and visualization of data in molecular biology and evolution. The software is publicly available at http://web.hku.hk/~jamescai/mbetoolbox/ and http://bioinformatics.org/project/?group_id=454.

  7. Visualization problems in conductor thermal aging resp. comparison of matlab and gnuplot graphical outputs

    International Nuclear Information System (INIS)

    Beza, J.; Heckenbergerova, J.

    2012-01-01

    The main aim of this paper is a comparison of the suitability of Matlab and Gnuplot software for visualization of results in the electro-energetic and electrical engineering domain. The whole contribution is targeted at specific graphical outputs for the problem of thermal aging of overhead power transmission lines. First, all graphical outputs selected for comparison are visualized in Matlab, and the strengths and disadvantages of this implementation are described. The same graphics are then created with the free software Gnuplot, and finally a comparison of both visualization tools is made. The last part of the contribution contains the main rules and tips for Gnuplot visualization and can be used as a user handbook. The major target of this contribution is to show the advantages and shortcomings of Matlab graphical outputs together with an introduction to Gnuplot as a suitable alternative for visualization of electrotechnical results. (Authors)

  8. Radial artery pulse waveform analysis based on curve fitting using discrete Fourier series.

    Science.gov (United States)

    Jiang, Zhixing; Zhang, David; Lu, Guangming

    2018-04-19

    Radial artery pulse diagnosis has been playing an important role in traditional Chinese medicine (TCM). Because it is non-invasive and convenient, pulse diagnosis has great significance for disease analysis in modern medicine. Practitioners sense the pulse waveforms at patients' wrists and make diagnoses based on non-objective personal experience. With research on pulse acquisition platforms and computerized analysis methods, the objective study of pulse diagnosis can help TCM keep up with the development of modern medicine. In this paper, we propose a new method to extract features from the pulse waveform based on a discrete Fourier series (DFS). It regards the waveform as a signal that consists of a series of sub-components represented by sine and cosine (SC) signals with different frequencies and amplitudes. After the pulse signals are collected and preprocessed, we fit the average waveform for each sample with a discrete Fourier series by least squares. The feature vector comprises the coefficients of the discrete Fourier series function. Compared with a fitting method using a Gaussian mixture function, the fitting errors of the proposed method are smaller, which indicates that our method can represent the original signal better. The classification performance of the proposed feature is superior to that of other features extracted from the waveform, such as the auto-regression model and the Gaussian mixture model. The coefficients of the optimized DFS function, which is used to fit the arterial pressure waveforms, achieve better performance in modeling the waveforms and hold more potential information for distinguishing different psychological states. Copyright © 2018 Elsevier B.V. All rights reserved.
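
    The core of the method, fitting a truncated sine/cosine series to one averaged pulse period by linear least squares, can be sketched as follows. The synthetic waveform and the choice of ten harmonics are illustrative; they are not the paper's data or settings.

      % Least-squares fit of a truncated Fourier (sine/cosine) series to one pulse period.
      N     = 200;
      t     = (0:N-1)'/N;                      % normalized time over one period
      pulse = exp(-((t-0.25)/0.06).^2) + 0.4*exp(-((t-0.55)/0.10).^2);  % synthetic stand-in beat
      K = 10;                                  % number of harmonics (illustrative choice)
      A = ones(N, 1);                          % design matrix: DC term plus a cos/sin pair per harmonic
      for k = 1:K
          A = [A, cos(2*pi*k*t), sin(2*pi*k*t)];
      end
      c      = A \ pulse;                      % DFS coefficients = the feature vector
      fitted = A * c;                          % reconstructed waveform
      fprintf('RMS fitting error: %.4f\n', sqrt(mean((pulse - fitted).^2)));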

  9. mGrid: A load-balanced distributed computing environment for the remote execution of the user-defined Matlab code

    Directory of Open Access Journals (Sweden)

    Almeida Jonas S

    2006-03-01

    Full Text Available Abstract Background Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. Results mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Conclusion Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. Moreover, the web

  10. mGrid: a load-balanced distributed computing environment for the remote execution of the user-defined Matlab code.

    Science.gov (United States)

    Karpievitch, Yuliya V; Almeida, Jonas S

    2006-03-15

    Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. Moreover, the web-based infrastructure of mGrid allows for it to be easily extensible over

  11. A Modified Johnson-Cook Model for Sheet Metal Forming at Elevated Temperatures and Its Application for Cooled Stress-Strain Curve and Spring-Back Prediction

    International Nuclear Information System (INIS)

    Duc-Toan, Nguyen; Tien-Long, Banh; Young-Suk, Kim; Dong-Won, Jung

    2011-01-01

    In this study, a modified Johnson-Cook (J-C) model and a new method to determine the (J-C) material parameters are proposed to predict the stress-strain curve more correctly for tensile tests at elevated temperatures. A MATLAB tool is used to determine the material parameters by fitting a curve that follows Ludwick's hardening law at various elevated temperatures. Those hardening-law parameters are then utilized to determine the modified (J-C) model material parameters. The modified (J-C) model shows better prediction compared to the conventional one. As a first verification, an FEM tensile test simulation based on the isotropic hardening model for boron sheet steel at elevated temperatures was carried out via a user-material subroutine, using an explicit finite element code, and compared with the measurements. The temperature decrease of all elements due to the air cooling process was then calculated, considering the modified (J-C) model, and coded into a VUMAT subroutine for tensile test simulation of the cooling process. The modified (J-C) model showed good agreement between the simulation results and the corresponding experiments. The second investigation was applied to V-bending spring-back prediction of magnesium alloy sheets at elevated temperatures. Here, the combination of the proposed J-C model with a modified hardening law, considering the unusual plastic behaviour of magnesium alloy sheet, was adopted for FEM simulation of V-bending spring-back prediction and showed good agreement with the corresponding experiments.
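
    The first step described in the abstract, fitting Ludwick's hardening law at a given temperature before deriving the modified Johnson-Cook parameters, might look like the sketch below. The stress-strain data are synthetic and the call assumes the Optimization Toolbox (lsqcurvefit).

      % Fitting Ludwick's law  sigma = sigma0 + K*eps_p^n  to one (synthetic) tensile curve.
      epsP  = linspace(0.002, 0.2, 50)';                   % plastic strain
      sigma = 250 + 520*epsP.^0.35 + 2*randn(size(epsP));  % synthetic "measured" stress, MPa
      ludwick = @(p, e) p(1) + p(2)*e.^p(3);               % p = [sigma0, K, n]
      pFit = lsqcurvefit(ludwick, [200, 400, 0.5], epsP, sigma);
      fprintf('sigma0 = %.1f MPa, K = %.1f MPa, n = %.3f\n', pFit);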

  12. Development of MATLAB software to control data acquisition from a multichannel systems multi-electrode array.

    Science.gov (United States)

    Messier, Erik

    2016-08-01

    A Multichannel Systems (MCS) microelectrode array data acquisition (DAQ) unit is used to collect multichannel electrograms (EGM) from a Langendorff perfused rabbit heart system to study sudden cardiac death (SCD). MCS provides software through which data being processed by the DAQ unit can be displayed and saved, but this software's combined utility with MATLAB is not very effective. MCS's software stores recorded EGM data in a MathCad (MCD) format, which is then converted to a text file format. These text files are very large, and it is therefore very time consuming to import the EGM data into MATLAB for real-time analysis. Therefore, customized MATLAB software was developed to control the acquisition of data from the MCS DAQ unit and provide specific laboratory accommodations for this study of SCD. The developed DAQ unit control software will be able to accurately: provide real-time display of EGM signals; record and save EGM signals in MATLAB in a desired format; and produce real-time analysis of the EGM signals; all through an intuitive GUI.

  13. Time series analysis of cholera in Matlab, Bangladesh, during 1988-2001.

    Science.gov (United States)

    Ali, Mohammad; Kim, Deok Ryun; Yunus, Mohammad; Emch, Michael

    2013-03-01

    The study examined the impact of in-situ climatic and marine environmental variability on cholera incidence in an endemic area of Bangladesh and developed a forecasting model for understanding the magnitude of incidence. Diarrhoea surveillance data collected between 1988 and 2001 were obtained from a field research site in Matlab, Bangladesh. Cholera cases were defined as Vibrio cholerae O1 isolated from faecal specimens of patients who sought care at treatment centres serving the Matlab population. Cholera incidence for 168 months was correlated with remotely-sensed sea-surface temperature (SST) and in-situ environmental data, including rainfall and ambient temperature. A seasonal autoregressive integrated moving average (SARIMA) model was used for determining the impact of climatic and environmental variability on cholera incidence and evaluating the ability of the model to forecast the magnitude of cholera. There were 4,157 cholera cases during the study period, with an average of 1.4 cases per 1,000 people. Since monthly cholera cases varied significantly by month, it was necessary to stabilize the variance of cholera incidence by computing the natural logarithm to conduct the analysis. The SARIMA model shows temporal clustering of cholera at one- and 12-month lags. There was a 6% increase in cholera incidence with a minimum temperature increase of one degree Celsius in the current month. For an increase in SST of one degree Celsius, there was a 25% increase in cholera incidence in the current month and an 18% increase in cholera incidence at two months. Rainfall did not cause variation in cholera incidence during the study period. The model forecast the fluctuation of cholera incidence in Matlab reasonably well (root mean square error, RMSE: 0.108). Thus, the ambient and sea-surface temperature-based model could be used for forecasting cholera outbreaks in Matlab.

  14. Modeling of alpha mass-efficiency curve

    International Nuclear Information System (INIS)

    Semkow, T.M.; Jeter, H.W.; Parsa, B.; Parekh, P.P.; Haines, D.K.; Bari, A.

    2005-01-01

    We present a model for efficiency of a detector counting gross α radioactivity from both thin and thick samples, corresponding to low and high sample masses in the counting planchette. The model includes self-absorption of α particles in the sample, energy loss in the absorber, range straggling, as well as detector edge effects. The surface roughness of the sample is treated in terms of fractal geometry. The model reveals a linear dependence of the detector efficiency on the sample mass, for low masses, as well as a power-law dependence for high masses. It is, therefore, named the linear-power-law (LPL) model. In addition, we consider an empirical power-law (EPL) curve, and an exponential (EXP) curve. A comparison is made of the LPL, EPL, and EXP fits to the experimental α mass-efficiency data from gas-proportional detectors for selected radionuclides: 238 U, 230 Th, 239 Pu, 241 Am, and 244 Cm. Based on this comparison, we recommend working equations for fitting mass-efficiency data. Measurement of α radioactivity from a thick sample can determine the fractal dimension of its surface

  15. ASTEROID LIGHT CURVES FROM THE PALOMAR TRANSIENT FACTORY SURVEY: ROTATION PERIODS AND PHASE FUNCTIONS FROM SPARSE PHOTOMETRY

    Energy Technology Data Exchange (ETDEWEB)

    Waszczak, Adam [Division of Geological and Planetary Sciences, California Institute of Technology, Pasadena, CA 91125 (United States); Chang, Chan-Kao; Cheng, Yu-Chi; Ip, Wing-Huen; Kinoshita, Daisuke [Institute of Astronomy, National Central University, Jhongli, Taiwan (China); Ofek, Eran O. [Benoziyo Center for Astrophysics, Weizmann Institute of Science, Rehovot (Israel); Laher, Russ; Surace, Jason [Spitzer Science Center, California Institute of Technology, Pasadena, CA 91125 (United States); Masci, Frank; Helou, George [Infrared Processing and Analysis Center, California Institute of Technology, Pasadena, CA 91125 (United States); Levitan, David; Prince, Thomas A.; Kulkarni, Shrinivas, E-mail: waszczak@caltech.edu [Division of Physics, Mathematics and Astronomy, California Institute of Technology, Pasadena, CA 91125 (United States)

    2015-09-15

    We fit 54,296 sparsely sampled asteroid light curves in the Palomar Transient Factory survey to a combined rotation plus phase-function model. Each light curve consists of 20 or more observations acquired in a single opposition. Using 805 asteroids in our sample that have reference periods in the literature, we find that the reliability of our fitted periods is a complicated function of the period, amplitude, apparent magnitude, and other light-curve attributes. Using the 805-asteroid ground-truth sample, we train an automated classifier to estimate (along with manual inspection) the validity of the remaining ∼53,000 fitted periods. By this method we find that 9033 of our light curves (of ∼8300 unique asteroids) have “reliable” periods. Subsequent consideration of asteroids with multiple light-curve fits indicates a 4% contamination in these “reliable” periods. For 3902 light curves with sufficient phase-angle coverage and either a reliable fit period or low amplitude, we examine the distribution of several phase-function parameters, none of which are bimodal though all correlate with the bond albedo and with visible-band colors. Comparing the theoretical maximal spin rate of a fluid body with our amplitude versus spin-rate distribution suggests that, if held together only by self-gravity, most asteroids are in general less dense than ∼2 g cm⁻³, while C types have a lower limit of between 1 and 2 g cm⁻³. These results are in agreement with previous density estimates. For 5–20 km diameters, S types rotate faster and have lower amplitudes than C types. If both populations share the same angular momentum, this may indicate the two types’ differing ability to deform under rotational stress. Lastly, we compare our absolute magnitudes (and apparent-magnitude residuals) to those of the Minor Planet Center’s nominal (G = 0.15, rotation-neglecting) model; our phase-function plus Fourier-series fitting reduces asteroid photometric rms

  16. Online design of Matlab/Simulink block schemes

    Directory of Open Access Journals (Sweden)

    Zoltán Janík

    2011-04-01

    Full Text Available The paper presents a new online tool that enables building a Matlab/Simulink block scheme in the Internet environment. The block scheme can be designed in a similar manner to that offered by a local installation of Simulink. The application was created with widely used technologies such as XHTML, CSS, JavaScript and PHP together with an AJAX approach. The created application can be used as a supporting tool in virtual and remote laboratories.

  17. Implementation of Digital Watermarking Using MATLAB Software

    OpenAIRE

    Karnpriya Vyas; Kirti Sethiya; Sonu Jain

    2012-01-01

    Digital watermarking holds significant promise as one of the keys to protecting proprietary digital content in the coming years. It focuses on embedding information inside a digital object such that the embedded information is inseparably bound to the object. The proposed scheme has been implemented in MATLAB, as it is a high-level technical computing language and interactive environment for algorithm development, data visualization, data analysis, and numerical computation. We w...
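
    As a generic illustration of watermark embedding (not the scheme implemented in the paper), the sketch below writes a binary mark into the least-significant bit of a synthetic grayscale image and reads it back, using only base MATLAB.

      % Minimal least-significant-bit (LSB) watermark embedding and recovery.
      host = uint8(repmat(0:255, 256, 1));          % synthetic 256x256 grayscale host image
      wm   = rand(256) > 0.5;                       % binary watermark of the same size
      marked    = host - mod(host, 2) + uint8(wm);  % clear each pixel's LSB, then write the mark bit
      recovered = mod(marked, 2) == 1;              % read the LSB plane back
      fprintf('Recovered watermark matches: %d\n', isequal(recovered, wm));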

  18. On the second kinetic order thermoluminescent glow curves

    International Nuclear Information System (INIS)

    Dang Thanh Luong; Nguyen Hao Quang; Hoang Minh Giang

    1995-01-01

    The kinetic parameters of thermoluminescent materials such as CaF 2 -N and CaSO 4 -Dy with different grain sizes are investigated in detail using the least squares method of fitting. It was found that the activation energy E (or trap depth) and peak temperature T max change with the elapsed time between irradiation and read-out for the low temperature glow curve peaks. Similar TL glow curve shapes are obtained for the different CaSO 4 -Dy grain sizes. (author). 7 refs., 5 figs., 2 tabs

  19. Analysis of the MPEG-1 Layer III (MP3) Algorithm using MATLAB

    CERN Document Server

    Thiagarajan, Jayaraman

    2011-01-01

    The MPEG-1 Layer III (MP3) algorithm is one of the most successful audio formats for consumer audio storage and for transfer and playback of music on digital audio players. The MP3 compression standard along with the AAC (Advanced Audio Coding) algorithm are associated with the most successful music players of the last decade. This book describes the fundamentals and the MATLAB implementation details of the MP3 algorithm. Several of the tedious processes in MP3 are supported by demonstrations using MATLAB software. The book presents the theoretical concepts and algorithms used in the MP3 stand

  20. First experience with the MATLAB Middle Layer at ANKA

    International Nuclear Information System (INIS)

    Marsching, S.; Huttel, E.; Klein, M.; Mueller, A.S.; Smale, N.J.

    2012-01-01

    ANKA is a synchrotron radiation facility at the Karlsruhe Institute of Technology. The MATLAB Middle Layer (MML) is a collection of scripts for the MATLAB programming environment, designed to control and measure parameters of an accelerator. MML has been adapted for use at ANKA and the commissioning process was quite simple and would have been even simpler if we had used one of the control systems directly supported by MML. At ANKA MML is used for accelerator physics studies and regular tasks like beam-based alignment and response matrix analysis using LOCO. Furthermore, we intend to study the MML as default orbit correction tool for user operation. We report on the experience made during the commissioning process and present the latest results obtained while using the MML for machine studies. MML simplifies the task of performing a beam-based alignment dramatically compared to our old solution which required the user to copy the measured data into specific files for further evaluation

  1. The S-curve for forecasting waste generation in construction projects.

    Science.gov (United States)

    Lu, Weisheng; Peng, Yi; Chen, Xi; Skitmore, Martin; Zhang, Xiaoling

    2016-10-01

    Forecasting construction waste generation is the yardstick of any effort by policy-makers, researchers, practitioners and the like to manage construction and demolition (C&D) waste. This paper develops and tests an S-curve model to indicate accumulative waste generation as a project progresses. Using 37,148 disposal records generated from 138 building projects in Hong Kong in four consecutive years from January 2011 to June 2015, a wide range of potential S-curve models are examined, and as a result, the formula that best fits the historical data set is found. The S-curve model is then further linked to project characteristics using artificial neural networks (ANNs) so that it can be used to forecast waste generation in future construction projects. It was found that, among the S-curve models, cumulative logistic distribution is the best formula to fit the historical data. Meanwhile, contract sum, location, public-private nature, and duration can be used to forecast construction waste generation. The study provides contractors with not only an S-curve model to forecast overall waste generation before a project commences, but also with a detailed baseline to benchmark and manage waste during the course of construction. The major contribution of this paper is to the body of knowledge in the field of construction waste generation forecasting. By examining it with an S-curve model, the study elevates construction waste management to a level equivalent to project cost management where the model has already been readily accepted as a standard tool. Copyright © 2016 Elsevier Ltd. All rights reserved.
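
    The best-fitting form reported in the abstract is a cumulative logistic S-curve; fitting such a curve to cumulative waste against project progress could be sketched as below. The data are synthetic and the call assumes the Optimization Toolbox (lsqcurvefit); this is not the paper's ANN-linked model.

      % Fitting a cumulative-logistic S-curve  W(x) = Wmax ./ (1 + exp(-(x - mu)/s)).
      x = linspace(0, 1, 25)';                                       % project progress (0 to 1)
      W = 1200 ./ (1 + exp(-(x - 0.55)/0.12)) + 20*randn(size(x));   % synthetic cumulative waste, tonnes
      scurve = @(p, x) p(1) ./ (1 + exp(-(x - p(2))/p(3)));          % p = [Wmax, mu, s]
      pFit   = lsqcurvefit(scurve, [max(W), 0.5, 0.1], x, W);
      fprintf('Wmax = %.0f t, midpoint = %.2f, scale = %.3f\n', pFit);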

  2. Dose-response calibration curves of {sup 137}Cs gamma rays for dicentric chromosome aberrations in human lymphocytes

    Energy Technology Data Exchange (ETDEWEB)

    Jo, Wol Soon; Oh, Su Jung; Jeong, Soo Kyun; Yang, Kwang Mo [Dept. of Research center, Dong Nam Institute of Radiological and Medical Sciences, Busan (Korea, Republic of); Jeong, Min Ho [Dept. of Microbiology, Dong A University College of Medicine, Busan (Korea, Republic of)

    2012-11-15

    Recently, the increased threat of radiological accidents in industry, such as those involving radiation non-destructive inspection, or of nuclear accidents caused by natural disasters, such as the Fukushima accident, requires a greater capacity for cytogenetic biodosimetry, which is critical for clinical triage of potentially thousands of radiation-exposed individuals. Dicentric chromosome aberration analysis is the conventional means of assessing radiation exposure. Dose-response calibration curves for {sup 137}Cs gamma rays have been established for unstable chromosome aberrations in human peripheral blood lymphocytes in many laboratories of the international biodosimetry network. In this study, therefore, we established dose-response calibration curves for {sup 137}Cs gamma rays in our laboratory according to the IAEA protocols for conducting the dicentric chromosome assay. We established in vitro dose-response calibration curves for dicentric chromosome aberrations in human lymphocytes for {sup 137}Cs gamma rays in the 0 to 5 Gy range, using the maximum likelihood linear-quadratic model, Y = c+αD+βD². The estimated coefficients of the fitted curves were within the 95% confidence intervals (CIs) and the curve fitting of the dose-effect relationship data indicated a good fit to the linear-quadratic model. Hence, meaningful dose estimation from an unknown sample can be determined accurately by using our laboratory's calibration curve according to the standard protocol.
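
    A simplified version of the curve construction, fitting Y(D) = c + alpha*D + beta*D^2 to dicentric yields by weighted least squares rather than the full maximum-likelihood procedure, is sketched below. The dose points, cell numbers and dicentric counts are synthetic, not the laboratory's calibration data.

      % Weighted least-squares fit of the linear-quadratic dose-response curve.
      D     = [0 0.25 0.5 1 2 3 4 5]';                  % dose, Gy
      cells = [5000 4000 3000 2000 1000 800 600 500]';  % cells scored at each dose (synthetic)
      dics  = [3 12 25 60 180 330 530 780]';            % dicentrics observed (synthetic)
      Y     = dics ./ cells;                            % dicentric yield per cell
      X     = [ones(size(D)), D, D.^2];                 % design matrix for c, alpha, beta
      p     = lscov(X, Y, cells);                       % weight by cells scored (rough Poisson weighting)
      fprintf('c = %.4f, alpha = %.4f /Gy, beta = %.4f /Gy^2\n', p);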

  3. A mathematical function for the description of nutrient-response curve.

    Directory of Open Access Journals (Sweden)

    Hamed Ahmadi

    Full Text Available Several mathematical equations have been proposed for modeling the nutrient-response curve for animals and humans, justified by goodness of fit and/or by the biological mechanism. In this paper, a functional form of a generalized quantitative model based on the Rayleigh distribution principle for description of nutrient-response phenomena is derived. Of the three parameters governing the curve, a has a biological interpretation, b may be used to calculate reliable estimates of nutrient-response relationships, and c provides the basis for deriving relationships between nutrient and physiological responses. The new function was successfully applied to fit nutritional data obtained from 6 experiments covering a wide range of nutrients and responses. An evaluation and comparison were also done based on simulated data sets to check the suitability of the new model and a four-parameter logistic model for describing nutrient responses. This study indicates the usefulness and wide applicability of the newly introduced, simple and flexible model when applied as a quantitative approach to characterizing the nutrient-response curve. This new mathematical way to describe nutritional-response data, with some useful biological interpretations, has the potential to be used as an alternative approach in modeling the nutrient-response curve to estimate nutrient efficiency and requirements.

  4. Learning curves in highly skilled chess players: a test of the generality of the power law of practice.

    Science.gov (United States)

    Howard, Robert W

    2014-09-01

    The power law of practice holds that a power function best interrelates skill performance and amount of practice. However, the law's validity and generality are moot. Some researchers argue that it is an artifact of averaging individual exponential curves while others question whether the law generalizes to complex skills and to performance measures other than response time. The present study tested the power law's generality to development over many years of a very complex cognitive skill, chess playing, with 387 skilled participants, most of whom were grandmasters. A power or logarithmic function best fit grouped data but individuals showed much variability. An exponential function usually was the worst fit to individual data. Groups differing in chess talent were compared and a power function best fit the group curve for the more talented players while a quadratic function best fit that for the less talented. After extreme amounts of practice, a logarithmic function best fit grouped data but a quadratic function best fit most individual curves. Individual variability is great and the power law or an exponential law are not the best descriptions of individual chess skill development. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. UNSUPERVISED TRANSIENT LIGHT CURVE ANALYSIS VIA HIERARCHICAL BAYESIAN INFERENCE

    International Nuclear Information System (INIS)

    Sanders, N. E.; Soderberg, A. M.; Betancourt, M.

    2015-01-01

    Historically, light curve studies of supernovae (SNe) and other transient classes have focused on individual objects with copious and high signal-to-noise observations. In the nascent era of wide field transient searches, objects with detailed observations are decreasing as a fraction of the overall known SN population, and this strategy sacrifices the majority of the information contained in the data about the underlying population of transients. A population level modeling approach, simultaneously fitting all available observations of objects in a transient sub-class of interest, fully mines the data to infer the properties of the population and avoids certain systematic biases. We present a novel hierarchical Bayesian statistical model for population level modeling of transient light curves, and discuss its implementation using an efficient Hamiltonian Monte Carlo technique. As a test case, we apply this model to the Type IIP SN sample from the Pan-STARRS1 Medium Deep Survey, consisting of 18,837 photometric observations of 76 SNe, corresponding to a joint posterior distribution with 9176 parameters under our model. Our hierarchical model fits provide improved constraints on light curve parameters relevant to the physical properties of their progenitor stars relative to modeling individual light curves alone. Moreover, we directly evaluate the probability for occurrence rates of unseen light curve characteristics from the model hyperparameters, addressing observational biases in survey methodology. We view this modeling framework as an unsupervised machine learning technique with the ability to maximize scientific returns from data to be collected by future wide field transient searches like LSST

  6. UNSUPERVISED TRANSIENT LIGHT CURVE ANALYSIS VIA HIERARCHICAL BAYESIAN INFERENCE

    Energy Technology Data Exchange (ETDEWEB)

    Sanders, N. E.; Soderberg, A. M. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Betancourt, M., E-mail: nsanders@cfa.harvard.edu [Department of Statistics, University of Warwick, Coventry CV4 7AL (United Kingdom)

    2015-02-10

    Historically, light curve studies of supernovae (SNe) and other transient classes have focused on individual objects with copious and high signal-to-noise observations. In the nascent era of wide field transient searches, objects with detailed observations are decreasing as a fraction of the overall known SN population, and this strategy sacrifices the majority of the information contained in the data about the underlying population of transients. A population level modeling approach, simultaneously fitting all available observations of objects in a transient sub-class of interest, fully mines the data to infer the properties of the population and avoids certain systematic biases. We present a novel hierarchical Bayesian statistical model for population level modeling of transient light curves, and discuss its implementation using an efficient Hamiltonian Monte Carlo technique. As a test case, we apply this model to the Type IIP SN sample from the Pan-STARRS1 Medium Deep Survey, consisting of 18,837 photometric observations of 76 SNe, corresponding to a joint posterior distribution with 9176 parameters under our model. Our hierarchical model fits provide improved constraints on light curve parameters relevant to the physical properties of their progenitor stars relative to modeling individual light curves alone. Moreover, we directly evaluate the probability for occurrence rates of unseen light curve characteristics from the model hyperparameters, addressing observational biases in survey methodology. We view this modeling framework as an unsupervised machine learning technique with the ability to maximize scientific returns from data to be collected by future wide field transient searches like LSST.

  7. Introduction of pattern recognition by MATLAB practice 2

    International Nuclear Information System (INIS)

    1999-06-01

    This book starts with an introduction to and examples of pattern recognition. It describes vectors and matrices, basic statistics and probability distributions, statistical decision theory and probability density functions, linear shunt, vector quantization and clustering, GMM, PCA and KL conversion, LDA, ID3, nerve cell modeling, HMM, SVM and AdaBoost. Directions for MATLAB use are given in the appendix.

  8. Kinetic modeling and fitting software for interconnected reaction schemes: VisKin.

    Science.gov (United States)

    Zhang, Xuan; Andrews, Jared N; Pedersen, Steen E

    2007-02-15

    Reaction kinetics for complex, highly interconnected kinetic schemes are modeled using analytical solutions to a system of ordinary differential equations. The algorithm employs standard linear algebra methods that are implemented using MatLab functions in a Visual Basic interface. A graphical user interface for simple entry of reaction schemes facilitates comparison of a variety of reaction schemes. To ensure microscopic balance, graph theory algorithms are used to determine violations of thermodynamic cycle constraints. Analytical solutions based on linear differential equations result in fast comparisons of first order kinetic rates and amplitudes as a function of changing ligand concentrations. For analysis of higher order kinetics, we also implemented a solution using numerical integration. To determine rate constants from experimental data, fitting algorithms that adjust rate constants to fit the model to imported data were implemented using the Levenberg-Marquardt algorithm or using Broyden-Fletcher-Goldfarb-Shanno methods. We have included the ability to carry out global fitting of data sets obtained at varying ligand concentrations. These tools are combined in a single package, which we have dubbed VisKin, to guide and analyze kinetic experiments. The software is available online for use on PCs.
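
    The analytical solution that VisKin builds on, the time course of a first-order interconnected scheme dC/dt = K*C, can be written with MATLAB's matrix exponential. The two-step scheme A <-> B -> C and its rate constants below are made up for illustration.

      % Analytical time course of a linear kinetic scheme via the matrix exponential.
      k1 = 5; km1 = 2; k2 = 1;                 % hypothetical rate constants, s^-1
      K  = [-k1,        km1, 0;                % d[A]/dt
             k1, -(km1+k2),  0;                % d[B]/dt
              0,         k2, 0];               % d[C]/dt
      C0 = [1; 0; 0];                          % initial concentrations
      t  = linspace(0, 3, 200);
      C  = zeros(3, numel(t));
      for i = 1:numel(t)
          C(:, i) = expm(K*t(i)) * C0;         % C(t) = expm(K*t) * C0
      end
      plot(t, C); legend('A', 'B', 'C'); xlabel('time (s)'); ylabel('concentration');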

  9. Stage discharge curve for Guillemard Bridge streamflow station based on rating curve method using historical flood event data

    International Nuclear Information System (INIS)

    Ros, F C; Sidek, L M; Desa, M N; Arifin, K; Tosaka, H

    2013-01-01

    The purposes of stage-discharge curves range from water quality studies and flood modelling studies to projecting climate change scenarios and so on. As the bed of the river often changes due to the annual monsoon seasons, which sometimes bring massive floods, the capacity of the river changes, causing shifting control to occur. This study proposes to use historical flood event data from 1960 to 2009 to calculate the stage-discharge curve of Guillemard Bridge located on Sg. Kelantan. Regression analysis was done to check the quality of the data and examine the correlation between the two variables, Q and H. The mean values of the two variables were then adopted to find the difference between zero gauge height and the level of zero flow, 'a', together with K and 'n', to fit the rating curve equation and finally plot the stage-discharge rating curve. Regression analysis of the historical flood data indicates that 91 percent of the original uncertainty has been explained by the analysis, with a standard error of 0.085.
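
    Once 'a' (the gauge height of zero flow) is chosen, the rating curve Q = K*(H - a)^n is linear in log coordinates and K and n follow from a simple regression, as sketched below. The stage and discharge values are synthetic, not the Guillemard Bridge record.

      % Log-log regression for the rating curve  Q = K*(H - a)^n  with 'a' assumed known.
      a = 1.2;                                   % assumed level of zero flow, m
      H = [2.0 3.5 5.0 7.5 10 12 15]';           % stage, m (synthetic)
      Q = [80 350 900 2600 5200 7800 12500]';    % discharge, m^3/s (synthetic)
      c = polyfit(log(H - a), log(Q), 1);        % log Q = log K + n*log(H - a)
      n = c(1);  K = exp(c(2));
      fprintf('Q = %.2f * (H - %.1f)^%.2f\n', K, a, n);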

  10. An innovation on high-grade CNC machines tools for B-spline curve method of high-speed interpolation arithmetic

    Science.gov (United States)

    Zhang, Wanjun; Gao, Shanping; Cheng, Xiyan; Zhang, Feng

    2017-04-01

    A novel high-speed B-spline curve interpolation algorithm for high-grade CNC machine tools is introduced. In existing CNC systems for high-grade machine tools, handling the type value points is troublesome, the control precision is not strong, and so on; this work aims to solve these problems. Simulation of specific examples in MATLAB 7.0 showed that the interpolation error is significantly reduced, the control precision is markedly improved, and the real-time requirements of high-speed, high-accuracy interpolation are satisfied.

  11. MATLAB simulation for an experimental setup of digital feedback control

    International Nuclear Information System (INIS)

    Zheng Lifang; Liu Songqiang

    2005-01-01

    This paper describes the digital feedback simulation using MATLAB for an experimental accelerator control setup. By analyzing the plant characteristics in the time and frequency domains, guidelines for the design of the digital filter and PID controller are derived. (authors)
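
    A generic sketch of this kind of workflow (assume or identify a plant model, tune a PID controller, then discretize it) is given below. The plant transfer function, crossover frequency and sample time are invented, and the calls assume the Control System Toolbox (tf, pidtune, c2d, feedback, step).

      % Tune and discretize a PID controller for a hypothetical plant model.
      P  = tf(1, [0.5 1 0]) * tf(1, [0.05 1]);   % hypothetical plant: integrator plus two lags
      C  = pidtune(P, 'PID', 10);                % tune for roughly 10 rad/s crossover
      Ts = 1e-3;                                 % controller sample time, s
      Cd = c2d(C, Ts, 'tustin');                 % digital PID via the bilinear transform
      step(feedback(C*P, 1));                    % continuous closed-loop step response as a check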

  12. Object-oriented Matlab adaptive optics toolbox

    Science.gov (United States)

    Conan, R.; Correia, C.

    2014-08-01

    Object-Oriented Matlab Adaptive Optics (OOMAO) is a Matlab toolbox dedicated to Adaptive Optics (AO) systems. OOMAO is based on a small set of classes representing the source, atmosphere, telescope, wavefront sensor, Deformable Mirror (DM) and an imager of an AO system. This simple set of classes allows simulating Natural Guide Star (NGS) and Laser Guide Star (LGS) Single Conjugate AO (SCAO) and tomography AO systems on telescopes up to the size of the Extremely Large Telescopes (ELT). The discrete phase screens that make up the atmosphere model can be of infinite size, useful for modeling system performance on large time scales. OOMAO comes with its own parametric influence function model to emulate different types of DMs. The cone effect, altitude thickness and intensity profile of LGSs are also reproduced. Both modal and zonal modeling approaches are implemented. OOMAO also has an extensive library of theoretical expressions to evaluate the statistical properties of turbulence wavefronts. The main design characteristics of the OOMAO toolbox are object-oriented modularity, vectorized code and transparent parallel computing. OOMAO has been used to simulate and to design the Multi-Object AO prototype Raven at the Subaru telescope and the Laser Tomography AO system of the Giant Magellan Telescope. In this paper, a Laser Tomography AO system on an ELT is simulated with OOMAO. In the first part, we set up the class parameters and we link the instantiated objects to create the source optical path. Then we build the tomographic reconstructor and write the script for the pseudo-open-loop controller.

  13. Wind Turbine Blockset in Matlab/Simulink. General Overview and Description of the Model

    DEFF Research Database (Denmark)

    Iov, Florin; Hansen, A. D.; Soerensen, P.

    This report presents a newly developed Matlab/Simulink Toolbox for wind turbine applications. This toolbox has been developed during the research project "Simulation Platform to model, optimize and design wind turbines" and it has been used as a general developer tool for three other simulation tools......: Saber, DIgSILENT, HAWC. The report first provides a quick overview of Matlab issues and then explains the structure of the developed toolbox. The attention in the report is mainly drawn to the description of the most important mathematical models, which have been developed in the Toolbox. Then, some...

  14. Prognostics 101: A tutorial for particle filter-based prognostics algorithm using Matlab

    International Nuclear Information System (INIS)

    An, Dawn; Choi, Joo-Ho; Kim, Nam Ho

    2013-01-01

    This paper presents a Matlab-based tutorial for model-based prognostics, which combines a physical model with observed data to identify model parameters, from which the remaining useful life (RUL) can be predicted. Among many model-based prognostics algorithms, the particle filter is used in this tutorial for parameter estimation of a damage or degradation model. The tutorial is presented using a Matlab script with 62 lines, including detailed explanations. As examples, a battery degradation model and a crack growth model are used to explain the updating process of model parameters, damage progression, and RUL prediction. In order to illustrate the results, the RUL at an arbitrary cycle is predicted in the form of a distribution along with the median and 90% prediction interval. This tutorial will be helpful for beginners in prognostics to understand and use the prognostics method, and we hope it provides a standard of particle filter based prognostics. -- Highlights: ► A Matlab-based tutorial for model-based prognostics is presented. ► A battery degradation model and a crack growth model are used as examples. ► The RUL at an arbitrary cycle is predicted using the particle filter
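
    A stripped-down, one-parameter particle filter update in the spirit of the tutorial (but not the paper's actual 62-line script) is sketched below. The growth model, noise levels and particle count are invented, and randsample/prctile assume the Statistics Toolbox.

      % One-parameter particle filter: damage grows as x(k+1) = x(k)*exp(theta), theta unknown.
      N      = 5000;                              % number of particles
      theta  = 0.05 + 0.02*randn(N, 1);           % prior particles for the model parameter
      x      = 1e-3*ones(N, 1);                   % initial damage state for every particle
      sigObs = 5e-5;                              % assumed measurement noise standard deviation
      zMeas  = 1e-3*exp(0.06*(1:20));             % synthetic measurements over 20 cycles
      for k = 1:numel(zMeas)
          x = x .* exp(theta);                            % propagate each particle
          w = exp(-0.5*((zMeas(k) - x)/sigObs).^2);       % likelihood weights
          w = w / sum(w);
          idx   = randsample(N, N, true, w);              % resample according to the weights
          x     = x(idx);
          theta = theta(idx) + 1e-4*randn(N, 1);          % small jitter keeps particle diversity
      end
      fprintf('theta: median %.4f, 90%% interval [%.4f, %.4f]\n', ...
          median(theta), prctile(theta, 5), prctile(theta, 95));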

  15. Light extraction block with curved surface

    Science.gov (United States)

    Levermore, Peter; Krall, Emory; Silvernail, Jeffrey; Rajan, Kamala; Brown, Julia J.

    2016-03-22

    Light extraction blocks, and OLED lighting panels using light extraction blocks, are described, in which the light extraction blocks include various curved shapes that provide improved light extraction properties compared to a parallel emissive surface, and a thinner form factor and better light extraction than a hemisphere. Lighting systems described herein may include a light source with an OLED panel. A light extraction block with a three-dimensional light emitting surface may be optically coupled to the light source. The three-dimensional light emitting surface of the block may include a substantially curved surface, with further characteristics related to the curvature of the surface at given points. A first radius of curvature corresponding to a maximum principal curvature k.sub.1 at a point p on the substantially curved surface may be greater than a maximum height of the light extraction block. A maximum height of the light extraction block may be less than 50% of a maximum width of the light extraction block. Surfaces with cross sections made up of line segments and inflection points may also be fit to approximated curves for calculating the radius of curvature.

  16. Design of elliptic curve cryptoprocessors over GF(2^163) using the Gaussian normal basis

    Directory of Open Access Journals (Sweden)

    Paulo Cesar Realpe

    2014-05-01

    Full Text Available This paper presents the efficient hardware implementation of cryptoprocessors that carry out the scalar multiplication kP over the finite field GF(2^163) using two digit-level multipliers. The finite field arithmetic operations were implemented using a Gaussian normal basis (GNB) representation, and the scalar multiplication kP was implemented using the Lopez-Dahab algorithm, the 2-NAF halve-and-add algorithm and the w-tNAF method for Koblitz curves. The processors were designed using a VHDL description, synthesized on the Stratix-IV FPGA using Quartus II 12.0 and verified using SignalTAP II and Matlab. The simulation results show that the cryptoprocessors achieve very good performance in carrying out the scalar multiplication kP. In this case, the computation times of the multiplication kP using Lopez-Dahab, 2-NAF halve-and-add and 16-tNAF for Koblitz curves were 13.37 µs, 16.90 µs and 5.05 µs, respectively.

  17. Research based on matlab method of digital trapezoidal shaping filter

    International Nuclear Information System (INIS)

    Zhou Qinghua; Zhang Ruanyu; Li Taihua

    2008-01-01

    In order to develop digital shaping systems quickly and conveniently, the paper presents a method of optimizing the trapezoidal shaping filter's parameters using MATLAB, and discusses the effects of the parameters on the shaping result obtained by this method. (authors)
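
    One widely used formulation that could be prototyped and tuned this way is a Jordanov-style recursive trapezoidal shaper (rise time k samples, flat top m samples, with a pole-zero factor M for an exponentially decaying pulse). The sketch below is an assumption about the filter form, not the paper's implementation, and all parameter values are illustrative.

      % Jordanov-style trapezoidal shaping of a synthetic exponentially decaying pulse.
      tau = 50;  k = 20;  m = 10;  l = k + m;    % decay constant and shaper lengths, in samples
      M   = 1/(exp(1/tau) - 1);                  % decay-correction (pole-zero) factor
      n   = 0:499;
      v   = [zeros(1, 50), exp(-(0:449)/tau)];   % synthetic detector pulse starting at sample 51
      vp  = [zeros(1, k + l), v];                % zero-pad so the delayed taps are defined
      d   = vp(k+l+1:end) - vp(l+1:end-k) - vp(k+1:end-l) + vp(1:end-k-l);
      p   = cumsum(d);                           % first accumulator
      s   = cumsum(p + M*d);                     % second accumulator: trapezoidal output
      plot(n, v, n, s/max(s)); legend('input pulse', 'trapezoid (normalized)');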

  18. Unified approach for estimating the probabilistic design S-N curves of three commonly used fatigue stress-life models

    International Nuclear Information System (INIS)

    Zhao Yongxiang; Wang Jinnuo; Gao Qing

    2001-01-01

    A unified approach, referred to as the general maximum likelihood method, is presented for estimating the probabilistic design S-N curves and their confidence bounds for the three commonly used fatigue stress-life models, namely the three-parameter, Langer and Basquin models. The curves are described by a general form of the mean and standard deviation S-N curves of the logarithm of fatigue life. Different from existing methods, i.e., the conventional method and the classical maximum likelihood method, the present approach considers the statistical characteristics of the whole test data set. The parameters of the mean curve are first estimated by the least squares method and then the parameters of the standard deviation curve are evaluated by a mathematical programming method so as to agree with the maximum likelihood principle. Fit quality of the curves is assessed by the fitted relation coefficient, the total fitted standard error and the confidence bounds. Application to the virtual stress amplitude-crack initiation life data of a nuclear engineering material, Chinese 1Cr18Ni9Ti stainless steel pipe-weld metal, has indicated the validity of the approach for S-N data where both S and N show the character of random variables. Applications to the two sets of S-N data of Chinese 45 carbon steel notched specimens (k t = 2.0) have indicated the validity of the present approach for test results obtained respectively from a group fatigue test and from a maximum likelihood fatigue test. In these applications, it was revealed that in general the fit is best for the three-parameter model, slightly inferior for the Langer relation and poor for the Basquin equation. Relative to the existing methods, the present approach gives a better fit. In addition, the possible non-conservative predictions of the existing methods, which result from the influence of local statistical characteristics of the data, are also overcome by the present approach.
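
    Of the three models, the Basquin equation is linear in log-log coordinates, so a quick mean curve and log-life scatter can be obtained by simple regression, as in the sketch below. This is a much cruder scheme than the paper's general maximum likelihood method, and the S-N data are synthetic.

      % Mean-curve fit of the Basquin equation  N = C * S^(-b)  by log-log regression.
      S = [500 450 400 350 320 300 280]';                 % stress amplitude, MPa (synthetic)
      N = [1.2e4 3.0e4 8.5e4 2.6e5 6.0e5 1.4e6 3.5e6]';   % cycles to failure (synthetic)
      c = polyfit(log10(S), log10(N), 1);                 % log10(N) = c(1)*log10(S) + c(2)
      b = -c(1);  C = 10^c(2);
      sLogN = std(log10(N) - polyval(c, log10(S)));       % scatter of log-life about the mean curve
      fprintf('N = %.3g * S^-%.2f,  std(log10 N) = %.3f\n', C, b, sLogN);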

  19. An Introduction to Transient Engine Applications Using the Numerical Propulsion System Simulation (NPSS) and MATLAB

    Science.gov (United States)

    Chin, Jeffrey C.; Csank, Jeffrey T.; Haller, William J.; Seidel, Jonathan A.

    2016-01-01

    This document outlines methodologies designed to improve the interface between the Numerical Propulsion System Simulation framework and various control and dynamic analyses developed in the Matlab and Simulink environment. Although NPSS is most commonly used for steady-state modeling, this paper is intended to supplement the relatively sparse documentation on its transient analysis functionality. Matlab has become an extremely popular engineering environment, and better methodologies are necessary to develop tools that leverage the benefits of these disparate frameworks. Transient analysis is not a new feature of the Numerical Propulsion System Simulation (NPSS), but transient considerations are becoming more pertinent as multidisciplinary trade-offs begin to play a larger role in advanced engine designs. This paper serves to supplement the relatively sparse documentation on transient modeling and cover the budding convergence between NPSS and Matlab-based modeling toolsets. The following sections explore various design patterns to rapidly develop transient models. Each approach starts with a base model built with NPSS, and assumes the reader already has a basic understanding of how to construct a steady-state model. The second half of the paper focuses on further enhancements required to subsequently interface NPSS with Matlab codes. The first method is the simplest and most straightforward but performance constrained, and the last is the most abstract. These methods aren't mutually exclusive and the specific implementation details could vary greatly based on the designer's discretion. Basic recommendations are provided to organize model logic in a format most easily amenable to integration with existing Matlab control toolsets.

  20. Incorporating Nonstationarity into IDF Curves across CONUS from Station Records and Implications

    Science.gov (United States)

    Wang, K.; Lettenmaier, D. P.

    2017-12-01

    Intensity-duration-frequency (IDF) curves are widely used for engineering design of storm-affected structures. Current practice is that IDF curves are based on observed precipitation extremes fit to a stationary probability distribution (e.g., the extreme value family). However, there is increasing evidence of nonstationarity in station records. We apply the Mann-Kendall trend test to over 1000 stations across the CONUS at a 0.05 significance level, and find that about 30% of stations tested have significant nonstationarity for at least one duration (1-, 2-, 3-, 6-, 12-, 24-, and 48-hours). We fit the stations to a GEV distribution with time-varying location and scale parameters using a Bayesian methodology and compare the fit of stationary versus nonstationary GEV distributions to observed precipitation extremes. Within our fitted nonstationary GEV distributions, we compare distributions with a time-varying location parameter versus distributions with both time-varying location and scale parameters. For distributions with two time-varying parameters, we pay particular attention to instances where location and scale trends have opposing directions. Finally, we use the mathematical framework based on the work of Koutsoyiannis to generate IDF curves based on the fitted GEV distributions and discuss the implications that using time-varying parameters may have on simple scaling relationships. We apply the above methods to evaluate how frequency statistics based on a stationary assumption compare to those that incorporate nonstationarity for both short and long term projects. Overall, we find that neglecting nonstationarity can lead to under- or over-estimates (depending on the trend for the given duration and region) of important statistics such as the design storm.
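
    As a stationary baseline of the kind this work extends, one duration's annual-maximum intensities can be fitted to a GEV distribution and turned into return levels with gevfit/gevinv (Statistics and Machine Learning Toolbox); the nonstationary, time-varying-parameter fits described in the abstract require custom likelihood code. The series below is synthetic.

      % Stationary GEV fit of annual-maximum intensities and the implied return levels.
      rng(1);
      annMax = 20 + 8*randn(60, 1).^2;                      % synthetic 60-year annual maxima, mm/h
      parmhat = gevfit(annMax);                             % [shape k, scale sigma, location mu]
      T  = [2 10 25 50 100]';                               % return periods, years
      iT = gevinv(1 - 1./T, parmhat(1), parmhat(2), parmhat(3));   % return-level quantiles
      disp(table(T, iT, 'VariableNames', {'ReturnPeriod_yr', 'Intensity_mm_per_h'}));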

  1. Desarrollo de una interfaz para el control del robot IRB desde Matlab

    OpenAIRE

    Gutiérrez Corbacho, Azahara

    2014-01-01

    The objective of this project is to establish communication with the robotic arm, the ABB IRB120, through the mathematical software tool Matlab. To this end, we develop a communication socket, which is responsible for sending and processing the data. To verify that the communication works and that the data are sent correctly, a series of communication interfaces with the robot and a final application are implemented in Matlab. The first is a graphical interface ...

  2. Gene ARMADA: an integrated multi-analysis platform for microarray data implemented in MATLAB.

    Science.gov (United States)

    Chatziioannou, Aristotelis; Moulos, Panagiotis; Kolisis, Fragiskos N

    2009-10-27

    The microarray data analysis realm is ever growing through the development of various tools, open source and commercial. However, there is an absence of predefined rational algorithmic analysis workflows or batch standardized processing to incorporate all steps, from raw data import up to the derivation of significantly differentially expressed gene lists. This absence obfuscates the analytical procedure and obstructs the massive comparative processing of genomic microarray datasets. Moreover, the solutions provided depend heavily on the programming skills of the user, whereas GUI-embedded solutions do not provide direct support for various raw image analysis formats or a versatile and simultaneously flexible combination of signal processing methods. We describe here Gene ARMADA (Automated Robust MicroArray Data Analysis), a MATLAB implemented platform with a Graphical User Interface. This suite integrates all steps of microarray data analysis including automated data import, noise correction and filtering, normalization, statistical selection of differentially expressed genes, clustering, classification and annotation. In its current version, Gene ARMADA fully supports two-colour cDNA and Affymetrix oligonucleotide arrays, plus custom arrays for which experimental details are given in tabular form (Excel spreadsheet, comma separated values, tab-delimited text formats). It also supports the analysis of already processed results through its versatile import editor. Besides being fully automated, Gene ARMADA incorporates numerous functionalities of the Statistics and Bioinformatics Toolboxes of MATLAB. In addition, it provides numerous visualization and exploration tools plus customizable export data formats for seamless integration with other analysis tools or MATLAB, for further processing. Gene ARMADA requires MATLAB 7.4 (R2007a) or higher and is also distributed as a stand-alone application with MATLAB Component Runtime. Gene ARMADA provides a

  3. tgcd: An R package for analyzing thermoluminescence glow curves

    Directory of Open Access Journals (Sweden)

    Jun Peng

    2016-01-01

    Full Text Available Thermoluminescence (TL) glow curves are widely used in dosimetric studies. Many commercial and freely distributed programs are used to deconvolute TL glow curves. This study introduces an open-source R package, tgcd, to conduct TL glow curve analysis, such as kinetic parameter estimation, glow peak simulation, and peak shape analysis. TL glow curves can be deconvoluted according to the general-order empirical expression or the semi-analytical expression derived from the one trap-one recombination center (OTOR) model based on the Lambert W function, using a modified Levenberg–Marquardt algorithm from which any of the parameters can be constrained or fixed. The package provides an interactive environment to initialize parameters and offers an automated “trial-and-error” protocol to obtain optimal fit results. First-order, second-order, and general-order glow peaks (curves) are simulated according to a number of simple kinetic models. The package was developed using a combination of Fortran and R programming languages to improve efficiency and flexibility.

  4. Experimental R-curve behavior in partially stabilized zirconia using moiré interferometry

    International Nuclear Information System (INIS)

    Perry, K.E.; Okada, H.; Atluri, S.N.

    1993-01-01

    Moiré interferometry is employed to study toughening in partially stabilized zirconia (PSZ). Energy to fracture as a function of crack growth curves (R-curves) is derived from mode I compliance calculations and from near-tip fitting of the moiré fringes. The effect of the tetragonal to monoclinic phase transformation in the zirconia is found by comparing the bulk compliance R-curves to the locally derived moiré R-curve. Localized strain field plots are produced from the moiré data for the PSZ zirconia. The observed transformation zone height compares favorably with that predicted by Okada et al. in a companion paper, as does the qualitative nature of the R-curve with predictions by Stump and Budiansky

  5. JMat - Herramienta remota de cálculo y multiusuario para el aprendizaje basado en problemas usando Matlab

    Directory of Open Access Journals (Sweden)

    Bladimir Bacca Cortes

    2011-01-01

    Full Text Available JMat is a computation tool based on JAVA and EJS (Easy Java Simulations), with a client/server scheme, multi-user support and remote access to Matlab. The application is oriented toward offering users interaction with Matlab through three interfaces: a Command Console, from which text commands compatible with Matlab are invoked remotely; a Workspace and Plotting interface, which keeps an automatic record of the user's variables and plots them individually; and User Functions and File Transfer, with which the user creates functions and sends and receives data to and from the server. JMat requires Internet access, a remote server where Matlab is installed, and a client (web browser or application); Matlab is not required on the client. JMat is currently being used at the Universidad del Valle in the courses on Automatic Process Control, Intelligent Control, Artificial Neural Networks, Signal Processing and Digital Image Processing as a tool for problem-based learning on the university's eLearning platform.

  6. Using user models in Matlab® within the Aspen Plus® interface with an Excel® link

    Directory of Open Access Journals (Sweden)

    Javier Fontalvo Alzate

    2014-05-01

    Full Text Available Process intensification and new technologies require tools for process design that can be integrated into well-known simulation software, such as Aspen Plus®. Thus, unit operations that are not included in traditional Aspen Plus software packages can be simulated with Matlab® and integrated within the Aspen Plus interface. In this way, the user can take advantage of all of the tools of Aspen Plus, such as optimization, sensitivity analysis and cost estimation. This paper gives a detailed description of how to implement this integration. The interface between Matlab and Aspen Plus is accomplished by sending the relevant information from Aspen Plus to Excel, which feeds the information to a Matlab routine. Once the Matlab routine processes the information, it is returned to Excel and to Aspen Plus. This paper includes the Excel and Matlab template files so the reader can implement their own simulations. By applying the protocol described here, a hybrid distillation-vapor permeation system has been simulated as an example of the applications that can be implemented. For the hybrid system, the effect of membrane selectivity on membrane area and reboiler duty for the partial dehydration of ethanol is studied. Very high selectivities are not necessarily required for an optimum hybrid distillation and vapor permeation system.

  7. Nuclear Fuel Depletion Analysis Using Matlab Software

    Science.gov (United States)

    Faghihi, F.; Nematollahi, M. R.

    Coupled first-order initial value problems (IVPs) arise frequently in many areas of engineering and science. In this article, we present a code comprising three computer programs, used together with the Matlab software, to solve and plot the solutions of first-order coupled stiff or non-stiff IVPs. Some engineering and scientific problems related to IVPs are given, and fuel depletion (production of the 239Pu isotope) in a Pressurized Water Nuclear Reactor (PWR) is computed by the present code.
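
    As a hedged illustration of how such coupled IVPs are handled in MATLAB, the sketch below integrates a simplified U-238 -> Np-239 -> Pu-239 production chain with the stiff solver ode15s. The flux, cross section and initial densities are assumed placeholder values, not the data used in the article.

      phi    = 3e13;                      % neutron flux, n/cm^2/s (assumed)
      sigU   = 2.7e-24;                   % U-238 capture cross section, cm^2 (assumed)
      lamNp  = log(2)/(2.36*24*3600);     % Np-239 decay constant, 1/s
      N0     = [1e22; 0; 0];              % [U-238; Np-239; Pu-239] densities

      dNdt = @(t, N) [ -sigU*phi*N(1);
                        sigU*phi*N(1) - lamNp*N(2);
                        lamNp*N(2) ];

      [t, N] = ode15s(dNdt, [0 3e7], N0); % ode15s copes with stiff systems
      plot(t/86400, N(:,3));
      xlabel('time (days)'); ylabel('Pu-239 number density (1/cm^3)');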

  8. Stress strain flow curves for Cu-OFP

    International Nuclear Information System (INIS)

    Sandstroem, Rolf; Hallgren, Josefin

    2009-04-01

    Stress strain curves of oxygen-free copper alloyed with phosphorus (Cu-OFP) have been determined in compression and tension. The compression tests were performed at room temperature for strain rates between 10^-5 and 10^-3 1/s. The tests in tension covered the temperature range 20 to 175 deg C for strain rates between 10^-7 and 5x10^-3 1/s. The results in compression and tension were close for similar strain rates. A model for stress strain curves has been formulated using basic dislocation mechanisms. The model has been set up in such a way that fitting of parameters to the curves is avoided. By using a fundamental creep model as a basis, a direct relation to creep data has been established. The maximum engineering flow stress in tension is related to the creep stress giving the same strain rate. The model reproduces the measured flow curves as a function of temperature and strain rate in the investigated interval. The model is suitable for use in finite-element computations of structures in Cu-OFP

  9. Maximum safe speed estimation using planar quintic Bezier curve with C2 continuity

    Science.gov (United States)

    Ibrahim, Mohamad Fakharuddin; Misro, Md Yushalify; Ramli, Ahmad; Ali, Jamaludin Md

    2017-08-01

    This paper describes an alternative way of estimating the design speed, or the maximum speed at which a vehicle can be driven safely on a road, using curvature information from Bezier curve fitting on a map. We tested the approach on a route along Tun Sardon Road, Balik Pulau, Penang, Malaysia. We propose using piecewise planar quintic Bezier curves that satisfy curvature continuity between joined curves in the process of mapping the road. By finding the derivatives of the quintic Bezier curve, the curvature was calculated and the design speed was derived. In this paper, a higher order of Bezier curve is used; a higher-degree curve gives users more freedom to control the shape of the curve compared to a curve of lower degree.
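
    A minimal MATLAB sketch of the idea follows: the curvature of a planar quintic Bezier curve is evaluated from its first and second derivatives, and a simple speed bound v = sqrt(g*fs/kappa) is applied. The control points and the side-friction factor fs are illustrative assumptions, not values from the paper.

      P = [0 0; 20 5; 45 8; 70 4; 95 -6; 120 -10];   % six control points (m)
      n = 5;  t = linspace(0, 1, 200)';
      D1 = n*diff(P);            % control points of the 1st derivative (degree 4)
      D2 = (n-1)*diff(D1);       % control points of the 2nd derivative (degree 3)

      B = zeros(numel(t), 2); dB = B; ddB = B;
      for i = 0:n,   B   = B   + nchoosek(n,i)  *(1-t).^(n-i)  .*t.^i * P(i+1,:);  end
      for i = 0:n-1, dB  = dB  + nchoosek(n-1,i)*(1-t).^(n-1-i).*t.^i * D1(i+1,:); end
      for i = 0:n-2, ddB = ddB + nchoosek(n-2,i)*(1-t).^(n-2-i).*t.^i * D2(i+1,:); end

      kappa = abs(dB(:,1).*ddB(:,2) - dB(:,2).*ddB(:,1)) ./ sum(dB.^2, 2).^1.5;
      g = 9.81; fs = 0.15;                           % assumed side-friction factor
      vmax = sqrt(g*fs ./ max(kappa, eps)) * 3.6;    % km/h along the curve
      fprintf('estimated safe speed on this segment: %.1f km/h\n', min(vmax));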

  10. Advanced topics in the arithmetic of elliptic curves

    CERN Document Server

    Silverman, Joseph H

    1994-01-01

    In the introduction to the first volume of The Arithmetic of Elliptic Curves (Springer-Verlag, 1986), I observed that "the theory of elliptic curves is rich, varied, and amazingly vast," and as a consequence, "many important topics had to be omitted." I included a brief introduction to ten additional topics as an appendix to the first volume, with the tacit understanding that eventually there might be a second volume containing the details. You are now holding that second volume. Unfortunately, it turned out that even those ten topics would not fit into a single book, so I was forced to make some choices. The following material is covered in this book: I. Elliptic and modular functions for the full modular group. II. Elliptic curves with complex multiplication. III. Elliptic surfaces and specialization theorems. IV. Neron models, Kodaira-Neron classification of special fibers, Tate's algorithm, and Ogg's conductor-discriminant formula. V. Tate's theory of q-curves over p-adic fields. VI. Neron's theory of can...

  11. Problem-based learning in communication systems using MATLAB and Simulink

    CERN Document Server

    Choi, Kwonhue

    2016-01-01

    Designed to help teach and understand communication systems using a classroom-tested, active learning approach. This book covers the basic concepts of signals, and analog and digital communications, to more complex simulations in communication systems. Problem-Based Learning in Communication Systems Using MATLAB and Simulink begins by introducing MATLAB and Simulink to prepare readers who are unfamiliar with these environments in order to tackle projects and exercises included in this book. Discussions on simulation of signals, filter design, sampling and reconstruction, and analog communications are covered next. The book concludes by covering advanced topics such as Viterbi decoding, OFDM and MIMO. In addition, this book contains examples of how to convert waveforms, constructed in simulation, into electric signals. It also includes problems illustrating how to complete actual wireless communications in the band near ultrasonic frequencies. A content-mapping table is included in this book to help instruc...

  12. Numerical methods in finance and economics a MATLAB-based introduction

    CERN Document Server

    Brandimarte, Paolo

    2006-01-01

    A state-of-the-art introduction to the powerful mathematical and statistical tools used in the field of finance. The use of mathematical models and numerical techniques is a practice employed by a growing number of applied mathematicians working on applications in finance. Reflecting this development, Numerical Methods in Finance and Economics: A MATLAB®-Based Introduction, Second Edition bridges the gap between financial theory and computational practice while showing readers how to utilize MATLAB®, the powerful numerical computing environment, for financial applications. The author provides an essential foundation in finance and numerical analysis in addition to background material for students from both engineering and economics perspectives. A wide range of topics is covered, including standard numerical analysis methods, Monte Carlo methods to simulate systems affected by significant uncertainty, and optimization methods to find an optimal set of decisions. Among this book's most outstanding features is the...

  13. Surface Fitting for Quasi Scattered Data from Coordinate Measuring Systems.

    Science.gov (United States)

    Mao, Qing; Liu, Shugui; Wang, Sen; Ma, Xinhui

    2018-01-13

    Non-uniform rational B-spline (NURBS) surface fitting from data points is widely used in the fields of computer aided design (CAD), medical imaging, cultural relic representation and object-shape detection. Usually, the measured data acquired from coordinate measuring systems are neither gridded nor completely scattered. The distribution of this kind of data is scattered in physical space, but the data points are stored in a way consistent with the order of measurement, so they are named quasi scattered data in this paper. They can therefore be organized into rows easily, but the number of points in each row is random. In order to overcome the difficulty of surface fitting from this kind of data, a new method based on resampling is proposed. It consists of three major steps: (1) NURBS curve fitting for each row, (2) resampling on the fitted curve and (3) surface fitting from the resampled data. An iterative projection optimization scheme is applied in the first and third steps to yield an advisable parameterization and reduce the time cost of projection. A resampling approach based on parameters, local peaks and contour curvature is proposed to overcome the problems of node redundancy and high time consumption in the fitting of this kind of scattered data. Numerical experiments are conducted with both simulation and practical data, and the results show that the proposed method is fast, effective and robust. What's more, by analyzing the fitting results acquired from data with different degrees of scatterness, it can be demonstrated that the error introduced by resampling is negligible and that the method is therefore feasible.
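
    The resampling idea can be sketched in MATLAB as below: each measured row, with its own random point count, is fitted with a parametric cubic spline over the normalised chord length and re-evaluated at a common parameter set, producing a regular grid that a tensor-product B-spline/NURBS surface fit can consume. The cell array of rows is a random placeholder, not measured data, and the cubic spline stands in for the NURBS curve fit described in the paper.

      rows = { rand(37,3), rand(52,3), rand(44,3) };   % placeholder rows of points
      m    = 50;  u = linspace(0, 1, m);               % common resampling parameters
      grid3d = zeros(numel(rows), m, 3);

      for r = 1:numel(rows)
          pts = rows{r};
          s = [0; cumsum(sqrt(sum(diff(pts).^2, 2)))]; % chord-length parameterisation
          s = s / s(end);
          for c = 1:3
              grid3d(r, :, c) = spline(s, pts(:, c), u);   % fit and resample
          end
      end
      % grid3d is now a regular (row x m) grid ready for surface fitting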

  14. Spreadsheets, Graphing Calculators and the Line of Best Fit

    Directory of Open Access Journals (Sweden)

    Bernie O'Sullivan

    2003-07-01

    One technique that can now be done, almost mindlessly, is the line of best fit. Both the graphing calculator and the Excel spreadsheet produce models for collected data that appear to be very good fits, but upon closer scrutiny, are revealed to be quite poor. This article will examine one such case. I will couch the paper within the framework of a very good classroom investigation that will help generate students’ understanding of the basic principles of curve fitting and will enable them to produce a very accurate model of collected data by combining the technology of the graphing calculator and the spreadsheet.

  15. Representative Stress-Strain Curve by Spherical Indentation on Elastic-Plastic Materials

    Directory of Open Access Journals (Sweden)

    Chao Chang

    2018-01-01

    Full Text Available The tensile stress-strain curve of metallic materials can be determined from the representative stress-strain curve obtained by spherical indentation. Tabor empirically determined the stress constraint factor (stress CF, ψ) and the strain constraint factor (strain CF, β), but the choice of values for ψ and β is still under discussion. In this study, a new insight into the relationship between the constraint factors of stress and strain is analytically described based on the formulation of Tabor's equation. Experimental tests were performed to evaluate these constraint factors. From the results, representative stress-strain curves using the proposed strain constraint factor fit the nominal stress-strain curve better than those using Tabor's constraint factors.

  16. Light Curve Simulation Using Spacecraft CAD Models and Empirical Material Spectral BRDFS

    Science.gov (United States)

    Willison, A.; Bedard, D.

    This paper presents a Matlab-based light curve simulation software package that uses computer-aided design (CAD) models of spacecraft and the spectral bidirectional reflectance distribution function (sBRDF) of their homogenous surface materials. It represents the overall optical reflectance of objects as a sBRDF, a spectrometric quantity, obtainable during an optical ground truth experiment. The broadband bidirectional reflectance distribution function (BRDF), the basis of a broadband light curve, is produced by integrating the sBRDF over the optical wavelength range. Colour-filtered BRDFs, the basis of colour-filtered light curves, are produced by first multiplying the sBRDF by colour filters, and integrating the products. The software package's validity is established through comparison of simulated reflectance spectra and broadband light curves with those measured of the CanX-1 Engineering Model (EM) nanosatellite, collected during an optical ground truth experiment. It is currently being extended to simulate light curves of spacecraft in Earth orbit, using spacecraft Two-Line-Element (TLE) sets, yaw/pitch/roll angles, and observer coordinates. Measured light curves of the NEOSSat spacecraft will be used to validate simulated quantities. The sBRDF was chosen to represent material reflectance as it is spectrometric and a function of illumination and observation geometry. Homogeneous material sBRDFs were obtained using a goniospectrometer for a range of illumination and observation geometries, collected in a controlled environment. The materials analyzed include aluminum alloy, two types of triple-junction photovoltaic (TJPV) cell, white paint, and multi-layer insulation (MLI). Interpolation and extrapolation methods were used to determine the sBRDF for all possible illumination and observation geometries not measured in the laboratory, resulting in empirical look-up tables. These look-up tables are referenced when calculating the overall sBRDF of objects, where

  17. Optimization design of wind turbine drive train based on Matlab genetic algorithm toolbox

    Science.gov (United States)

    Li, R. N.; Liu, X.; Liu, S. J.

    2013-12-01

    In order to ensure the high efficiency of the whole flexible drive train of the front-end speed-adjusting wind turbine, the working principle of the main part of the drive train is analyzed. As critical parameters, the rotating speed ratios of three planetary gear trains are selected as the research subject. The mathematical model of the torque converter speed ratio is established based on these three critical variables, and the effect of the key parameters on the efficiency of the hydraulic mechanical transmission is analyzed. Based on the torque balance and the energy balance, and referring to the hydraulic mechanical transmission characteristics, the transmission efficiency expression of the whole drive train is established. The fitness function and constraint functions are established based on the drive train transmission efficiency and the torque converter rotating speed ratio range, respectively, and the optimization calculation is carried out using the MATLAB genetic algorithm toolbox. The optimization method and results provide an optimization program for the exact matching of the wind turbine rotor, gearbox, hydraulic mechanical transmission, hydraulic torque converter and synchronous generator, ensure that the drive train works with high efficiency, and give a reference for the selection of the torque converter and hydraulic mechanical transmission.
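
    A hedged sketch of such a set-up with the MATLAB ga() solver is given below: the three planetary speed ratios are the design variables, a toy efficiency expression stands in for the transmission-efficiency model derived in the paper, and the bounds and ratio-range constraint are illustrative assumptions only.

      eff = @(x) 0.97*prod(1 - 0.02*abs(x - 1));        % placeholder efficiency model
      fit = @(x) -eff(x);                               % ga() minimises, so negate

      lb = [0.5 0.5 0.5];  ub = [4 4 4];                % assumed ratio bounds
      nonlcon = @(x) deal([prod(x) - 8; 1 - prod(x)], []);  % keep overall ratio in range

      opts = optimoptions('ga', 'PopulationSize', 60, 'MaxGenerations', 100);
      [xBest, fBest] = ga(fit, 3, [], [], [], [], lb, ub, nonlcon, opts);
      fprintf('best ratios: %.2f %.2f %.2f, efficiency %.3f\n', xBest, -fBest);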

  18. Optimization design of wind turbine drive train based on Matlab genetic algorithm toolbox

    International Nuclear Information System (INIS)

    Li, R N; Liu, X; Liu, S J

    2013-01-01

    In order to ensure the high efficiency of the whole flexible drive train of the front-end speed-adjusting wind turbine, the working principle of the main part of the drive train is analyzed. As critical parameters, the rotating speed ratios of three planetary gear trains are selected as the research subject. The mathematical model of the torque converter speed ratio is established based on these three critical variables, and the effect of the key parameters on the efficiency of the hydraulic mechanical transmission is analyzed. Based on the torque balance and the energy balance, and referring to the hydraulic mechanical transmission characteristics, the transmission efficiency expression of the whole drive train is established. The fitness function and constraint functions are established based on the drive train transmission efficiency and the torque converter rotating speed ratio range, respectively, and the optimization calculation is carried out using the MATLAB genetic algorithm toolbox. The optimization method and results provide an optimization program for the exact matching of the wind turbine rotor, gearbox, hydraulic mechanical transmission, hydraulic torque converter and synchronous generator, ensure that the drive train works with high efficiency, and give a reference for the selection of the torque converter and hydraulic mechanical transmission.

  19. Generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB.

    Science.gov (United States)

    Lee, Leng-Feng; Umberger, Brian R

    2016-01-01

    Computer modeling, simulation and optimization are powerful tools that have seen increased use in biomechanics research. Dynamic optimizations can be categorized as either data-tracking or predictive problems. The data-tracking approach has been used extensively to address human movement problems of clinical relevance. The predictive approach also holds great promise, but has seen limited use in clinical applications. Enhanced software tools would facilitate the application of predictive musculoskeletal simulations to clinically-relevant research. The open-source software OpenSim provides tools for generating tracking simulations but not predictive simulations. However, OpenSim includes an extensive application programming interface that permits extending its capabilities with scripting languages such as MATLAB. In the work presented here, we combine the computational tools provided by MATLAB with the musculoskeletal modeling capabilities of OpenSim to create a framework for generating predictive simulations of musculoskeletal movement based on direct collocation optimal control techniques. In many cases, the direct collocation approach can be used to solve optimal control problems considerably faster than traditional shooting methods. Cyclical and discrete movement problems were solved using a simple 1 degree of freedom musculoskeletal model and a model of the human lower limb, respectively. The problems could be solved in reasonable amounts of time (several seconds to 1-2 hours) using the open-source IPOPT solver. The problems could also be solved using the fmincon solver that is included with MATLAB, but the computation times were excessively long for all but the smallest of problems. The performance advantage for IPOPT was derived primarily by exploiting sparsity in the constraints Jacobian. The framework presented here provides a powerful and flexible approach for generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB. This

  20. Estimating stock parameters from trawl cpue-at-age series using year-class curves

    NARCIS (Netherlands)

    Cotter, A.J.R.; Mesnil, B.; Piet, G.J.

    2007-01-01

    A year-class curve is a plot of log cpue (catch per unit effort) over age for a single year class of a species (in contrast to the better known catch curve, fitted to multiple year classes at one time). When linear, the intercept and slope estimate the log cpue at age 0 and the average rate of total

  1. Linking Advanced Visualization and MATLAB for the Analysis of 3D Gene Expression Data

    Energy Technology Data Exchange (ETDEWEB)

    Ruebel, Oliver; Keranen, Soile V.E.; Biggin, Mark; Knowles, David W.; Weber, Gunther H.; Hagen, Hans; Hamann, Bernd; Bethel, E. Wes

    2011-03-30

    Three-dimensional gene expression PointCloud data generated by the Berkeley Drosophila Transcription Network Project (BDTNP) provides quantitative information about the spatial and temporal expression of genes in early Drosophila embryos at cellular resolution. The BDTNP team visualizes and analyzes Point-Cloud data using the software application PointCloudXplore (PCX). To maximize the impact of novel, complex data sets, such as PointClouds, the data needs to be accessible to biologists and comprehensible to developers of analysis functions. We address this challenge by linking PCX and Matlab via a dedicated interface, thereby providing biologists seamless access to advanced data analysis functions and giving bioinformatics researchers the opportunity to integrate their analysis directly into the visualization application. To demonstrate the usefulness of this approach, we computationally model parts of the expression pattern of the gene even skipped using a genetic algorithm implemented in Matlab and integrated into PCX via our Matlab interface.

  2. Small-signal modelling and analysis of switching converters using MATLAB

    NARCIS (Netherlands)

    Duarte, J.L.

    1998-01-01

    A general procedure for the description of power electronic circuit dynamics is proposed, with the intention of control system design and discrete-time system simulation. The approach is especially suited for use alongside computer-aided analysis and synthesis software packages such as MATLAB.

  3. Learning Curves: Making Quality Online Health Information Available at a Fitness Center

    OpenAIRE

    Dobbins, Montie T.; Tarver, Talicia; Adams, Mararia; Jones, Dixie A.

    2012-01-01

    Meeting consumer health information needs can be a challenge. Research suggests that women seek health information from a variety of resources, including the Internet. In an effort to make women aware of reliable health information sources, the Louisiana State University Health Sciences Center – Shreveport Medical Library engaged in a partnership with a franchise location of Curves International, Inc. This article will discuss the project, its goals and its challenges.

  4. Learning Curves: Making Quality Online Health Information Available at a Fitness Center.

    Science.gov (United States)

    Dobbins, Montie T; Tarver, Talicia; Adams, Mararia; Jones, Dixie A

    2012-01-01

    Meeting consumer health information needs can be a challenge. Research suggests that women seek health information from a variety of resources, including the Internet. In an effort to make women aware of reliable health information sources, the Louisiana State University Health Sciences Center - Shreveport Medical Library engaged in a partnership with a franchise location of Curves International, Inc. This article will discuss the project, its goals and its challenges.

  5. A fit method for the determination of inherent filtration with diagnostic x-ray units

    International Nuclear Information System (INIS)

    Meghzifene, K; Nowotny, R; Aiginger, H

    2006-01-01

    A method for the determination of the total inherent filtration of clinical x-ray units using attenuation curves was devised. A model for the calculation of x-ray spectra is used to calculate kerma values, which are then adjusted to the experimental data by minimizing the sum of the squared relative differences in kerma using a modified simplex fit process. The model considers tube voltage, voltage ripple, anode angle and additional filters. The fit parameters are the thickness of an additional inherent Al filter and a general normalization factor. Nineteen sets of measurements, each including attenuation data for three tube voltages and five Al-filter settings, were obtained. Relative differences between experimental and calculated kerma using the data for the additional filter thickness are within a range of -7.6% to 6.4%. Quality curves, i.e. the relationship of additional filtration to HVL, are often used to determine filtration, but the results show that standard quality curves do not reflect the variety of conditions encountered in practice. To relate the thickness of the additional filter to the condition of the anode surface, the data fits were also made using tungsten as the filter material. These fits gave a fit quality identical to that for aluminium, with a tungsten filter thickness of 2.12-8.21 μm, which is within the range of the additional absorbing layers determined for rough anodes
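
    The fitting step can be sketched in MATLAB roughly as follows, with fminsearch playing the role of the simplex search over the inherent Al thickness and the normalisation factor. kermaModel() stands for the spectrum-based kerma calculation described in the paper and is assumed to exist; the measured kerma values are placeholders.

      dAl   = [0 1 2 3 4];                        % added Al filters, mm
      kMeas = [1.00 0.71 0.54 0.43 0.35];         % measured relative kerma (placeholder)

      % p(1): inherent Al thickness (mm), p(2): normalisation factor
      obj  = @(p) sum(((p(2)*kermaModel(p(1), dAl) - kMeas) ./ kMeas).^2);
      pFit = fminsearch(obj, [1.5, 1.0]);         % simplex minimisation
      fprintf('inherent filtration: %.2f mm Al (scale %.3f)\n', pFit(1), pFit(2));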

  6. Boundary curves of individual items in the distribution of total depressive symptom scores approximate an exponential pattern in a general population.

    Science.gov (United States)

    Tomitaka, Shinichiro; Kawasaki, Yohei; Ide, Kazuki; Akutagawa, Maiko; Yamada, Hiroshi; Furukawa, Toshiaki A; Ono, Yutaka

    2016-01-01

    Previously, we proposed a model for ordinal scale scoring in which individual thresholds for each item constitute a distribution by each item. This led us to hypothesize that the boundary curves of each depressive symptom score in the distribution of total depressive symptom scores follow a common mathematical model, which is expressed as the product of the frequency of the total depressive symptom scores and the probability of the cumulative distribution function of each item threshold. To verify this hypothesis, we investigated the boundary curves of the distribution of total depressive symptom scores in a general population. Data collected from 21,040 subjects who had completed the Center for Epidemiologic Studies Depression Scale (CES-D) questionnaire as part of a national Japanese survey were analyzed. The CES-D consists of 20 items (16 negative items and four positive items). The boundary curves of adjacent item scores in the distribution of total depressive symptom scores for the 16 negative items were analyzed using log-normal scales and curve fitting. The boundary curves of adjacent item scores for a given symptom approximated a common linear pattern on a log-normal scale. Curve fitting showed that an exponential fit had a markedly higher coefficient of determination than either linear or quadratic fits. With negative affect items, the gap between the total score curve and the boundary curve continuously increased with increasing total depressive symptom scores on a log-normal scale, whereas the boundary curves of positive affect items, which are not considered manifest variables of the latent trait, did not exhibit such increases in this gap. The results of the present study support the hypothesis that the boundary curves of each depressive symptom score in the distribution of total depressive symptom scores commonly follow the predicted mathematical model, which was verified to approximate an exponential mathematical pattern.
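
    The curve-fitting comparison can be illustrated with a short MATLAB sketch: linear, quadratic and exponential models are fitted to a boundary curve (frequency versus total score) and compared by their coefficients of determination. The counts below are synthetic placeholders, not CES-D data.

      score = (0:30)';
      freq  = 900*exp(-0.18*score) + 5*randn(size(score));   % synthetic boundary curve

      X     = [ones(size(score)) score];
      bLin  = X \ freq;                                      % linear fit
      bQuad = [X score.^2] \ freq;                           % quadratic fit
      bExp  = X \ log(max(freq, 1));                         % exponential fit via log

      r2 = @(y, yhat) 1 - sum((y - yhat).^2) / sum((y - mean(y)).^2);
      fprintf('R^2: linear %.3f, quadratic %.3f, exponential %.3f\n', ...
          r2(freq, X*bLin), r2(freq, [X score.^2]*bQuad), r2(freq, exp(X*bExp)));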

  7. Boundary curves of individual items in the distribution of total depressive symptom scores approximate an exponential pattern in a general population

    Directory of Open Access Journals (Sweden)

    Shinichiro Tomitaka

    2016-10-01

    Full Text Available Background Previously, we proposed a model for ordinal scale scoring in which individual thresholds for each item constitute a distribution by each item. This led us to hypothesize that the boundary curves of each depressive symptom score in the distribution of total depressive symptom scores follow a common mathematical model, which is expressed as the product of the frequency of the total depressive symptom scores and the probability of the cumulative distribution function of each item threshold. To verify this hypothesis, we investigated the boundary curves of the distribution of total depressive symptom scores in a general population. Methods Data collected from 21,040 subjects who had completed the Center for Epidemiologic Studies Depression Scale (CES-D) questionnaire as part of a national Japanese survey were analyzed. The CES-D consists of 20 items (16 negative items and four positive items). The boundary curves of adjacent item scores in the distribution of total depressive symptom scores for the 16 negative items were analyzed using log-normal scales and curve fitting. Results The boundary curves of adjacent item scores for a given symptom approximated a common linear pattern on a log-normal scale. Curve fitting showed that an exponential fit had a markedly higher coefficient of determination than either linear or quadratic fits. With negative affect items, the gap between the total score curve and the boundary curve continuously increased with increasing total depressive symptom scores on a log-normal scale, whereas the boundary curves of positive affect items, which are not considered manifest variables of the latent trait, did not exhibit such increases in this gap. Discussion The results of the present study support the hypothesis that the boundary curves of each depressive symptom score in the distribution of total depressive symptom scores commonly follow the predicted mathematical model, which was verified to approximate an

  8. MATLAB algorithm to implement soil water data assimilation with the Ensemble Kalman Filter using HYDRUS.

    Science.gov (United States)

    Valdes-Abellan, Javier; Pachepsky, Yakov; Martinez, Gonzalo

    2018-01-01

    Data assimilation is becoming a promising technique in hydrologic modelling, used to update not only model states but also to infer model parameters, specifically soil hydraulic properties in Richards-equation-based soil water models. The Ensemble Kalman Filter (EnKF) is one of the most widely employed methods among the different data assimilation alternatives. In this study the complete Matlab® code used to study soil data assimilation efficiency under different soil and climatic conditions is shown. The code shows how data assimilation through the EnKF was implemented. The Richards equation was solved with the Hydrus-1D software, which was run from Matlab. In summary: MATLAB routines are released to be used/modified without restriction by other researchers; the code implements the data assimilation Ensemble Kalman Filter method; and the soil water Richards-equation flow is solved by Hydrus-1D.
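
    A minimal sketch of the EnKF analysis step, in the spirit of the released routines but not taken from them, is shown below. Xf would be the forecast ensemble produced by running Hydrus-1D once per member from MATLAB; here it is filled with synthetic numbers, and the observation operator, error variance and observed value are assumptions for the example.

      n = 4;  N = 50;                                % states x ensemble members
      Xf = repmat([0.25; 0.30; 0.33; 0.35], 1, N) + 0.02*randn(n, N);
      H  = [1 0 0 0];                                % observe top-layer water content
      R  = 0.01^2;  y = 0.27;                        % obs-error variance and value

      A   = Xf - mean(Xf, 2);                        % ensemble anomalies
      Pxy = A*(H*A)' / (N-1);                        % state-observation covariance
      Pyy = (H*A)*(H*A)' / (N-1) + R;                % innovation covariance
      K   = Pxy / Pyy;                               % Kalman gain

      yPert = y + sqrt(R)*randn(1, N);               % perturbed observations
      Xa    = Xf + K*(yPert - H*Xf);                 % analysis (updated) ensemble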

  9. Mild angle early onset idiopathic scoliosis children avoid progression under FITS method (Functional Individual Therapy of Scoliosis).

    Science.gov (United States)

    Białek, Marianna

    2015-05-01

    Physiotherapy for stabilization of the idiopathic scoliosis angle in growing children remains controversial. Notably, little data on the effectiveness of physiotherapy in children with Early Onset Idiopathic Scoliosis (EOIS) have been published. The aim of this study was to assess the results of FITS physiotherapy in a group of children with EOIS. The charts of the patients, archived in a prospectively collected database, were retrospectively reviewed. The inclusion criteria were: diagnosis of EOIS based on spine radiography, age below 10 years, both girls and boys, Cobb angle between 11 and 30 degrees, Risser zero, FITS therapy, no other treatment (bracing), and a follow-up of at least 2 years from the initiation of the treatment. The criteria for curve outcome were as follows: curve progression, a Cobb angle increase of 6 degrees or more; curve stabilization, a Cobb angle within 5 degrees of the initial radiograph; and curve correction, a Cobb angle decrease of 6 degrees or more at the final follow-up radiograph. There were 41 children with EOIS, 36 girls and 5 boys, mean age 7.7±1.3 years (range 4 to 9 years), who started FITS therapy. The curve pattern was single thoracic (5 children), single thoracolumbar (22 children) or double thoracic/thoracolumbar (14 children), giving 55 structural curvatures in total. The minimum follow-up was 2 years after initiation of the FITS treatment and the maximum was 16 years (mean 4.8 years). At follow-up the mean age was 12.5±3.4 years. Of the 41 children, 10 had passed the pubertal growth spurt at the final follow-up and 31 were still immature and continued FITS therapy. Of the 41 children, 27 improved, 13 were stable, and one progressed. Of the 55 structural curves, 32 improved, 22 were stable and one progressed. For the 55 structural curves, the Cobb angle decreased significantly from 18.0±5.4 degrees at first assessment to 12.5±6.3 degrees at the last evaluation. FITS physiotherapy was effective in preventing curve progression in children with EOIS. Final post-pubertal follow-up data are needed.

  10. Comparison between two scalar field models using rotation curves of spiral galaxies

    Science.gov (United States)

    Fernández-Hernández, Lizbeth M.; Rodríguez-Meza, Mario A.; Matos, Tonatiuh

    2018-04-01

    Scalar fields have been used as candidates for dark matter in the universe, from axions with masses ∼10^-5 eV down to ultra-light scalar fields. Axions behave as cold dark matter, while in the ultra-light case galaxies are Bose-Einstein condensate drops; these ultra-light fields are also referred to as the scalar field dark matter model. In this work we study rotation curves of low surface brightness spiral galaxies using two scalar field models: the Gross-Pitaevskii Bose-Einstein condensate in the Thomas-Fermi approximation and a scalar field solution of the Klein-Gordon equation. We also use the zero-disk approximation galaxy model, in which photometric data are not considered and only the scalar field dark matter contribution to the rotation curve is taken into account. From the best-fitting analysis of the galaxy catalog we use, we found the range of values of the fitting parameters: the length scale and the central density. The worst fitting results (values of the reduced χ² much greater than 1, on average) were obtained for the Thomas-Fermi model; i.e., the scalar field dark matter model fits the rotation curves of the analysed galaxies better than the Thomas-Fermi approximation. To complete our analysis we compute, from the fitting parameters, the mass of the scalar field models and two astrophysical quantities of interest: the dynamical dark matter mass within 300 pc and the characteristic central surface density of the dark matter models. We found that the value of the central mass within 300 pc is in agreement with previously reported results, that is, this mass is ≈10^7 M⊙, independent of the dark matter model. On the contrary, the value of the characteristic central surface density does depend on the dark matter model.

  11. Calcolo scientifico esercizi e problemi risolti con Matlab e Octave

    CERN Document Server

    Quarteroni, Alfio; Gervasio, Paola

    2017-01-01

    This textbook is designed for Engineering and Science degree courses. It covers all the typical topics of Numerical Mathematics, ranging from the problem of solving systems of linear and nonlinear equations, to approximating a function, computing its minima, its derivatives and its definite integral, up to the solution of ordinary and partial differential equations with finite difference and finite element methods. An opening chapter leads the student through a quick review of the topics from Mathematical Analysis and Linear Algebra used frequently in the volume, and gives an introduction to the MATLAB and Octave languages. To make the presentation more incisive and to provide immediate quantitative feedback on the theory, all the algorithms introduced along the way are implemented in the MATLAB and Octave languages. Numerous exercises, all solved in full, and examples are also proposed, including with reference to applications in...

  12. Computational mathematics models, methods, and analysis with Matlab and MPI

    CERN Document Server

    White, Robert E

    2004-01-01

    Computational Mathematics: Models, Methods, and Analysis with MATLAB and MPI explores and illustrates this process. Each section of the first six chapters is motivated by a specific application. The author applies a model, selects a numerical method, implements computer simulations, and assesses the ensuing results. These chapters include an abundance of MATLAB code. By studying the code instead of using it as a "black box, " you take the first step toward more sophisticated numerical modeling. The last four chapters focus on multiprocessing algorithms implemented using message passing interface (MPI). These chapters include Fortran 9x codes that illustrate the basic MPI subroutines and revisit the applications of the previous chapters from a parallel implementation perspective. All of the codes are available for download from www4.ncsu.edu./~white.This book is not just about math, not just about computing, and not just about applications, but about all three--in other words, computational science. Whether us...

  13. MATLAB based beam orbit correction system of HLS storage ring

    International Nuclear Information System (INIS)

    Ding Shichuan; Liu Gongfa; Xuan Ke; Li Weimin; Wang Lin; Wang Jigang; Li Chuan; Bao Xun; Guo Weiqun

    2006-01-01

    The distortion of the closed orbit usually causes side effects that are harmful to a synchrotron radiation source such as HLS, so it is necessary to correct the closed-orbit distortion. In this paper, the correction principle, development procedure and tests of the MATLAB-based beam orbit correction system of the HLS storage ring are described. The correction system consists of the beam orbit measurement system, the corrector magnet system and the control system, with the MATLAB-based beam orbit correction code running on the operator interface. The beam orbit data are first analyzed and calculated, and then the orbit is corrected by changing the corrector strengths via the control system. Tests show that the maximum closed-orbit distortion is reduced from 4.468 mm before correction to 0.299 mm after correction, while the standard deviation is reduced from 2.986 mm to 0.087 mm. The correction system therefore reaches its design goal. (authors)

  14. SU-F-I-63: Relaxation Times of Lipid Resonances in NAFLD Animal Model Using Enhanced Curve Fitting

    Energy Technology Data Exchange (ETDEWEB)

    Song, K-H; Yoo, C-H; Lim, S-I; Choe, B-Y [Department of Biomedical Engineering, and Research Institute of Biomedical Engineering, The Catholic University of Korea College of Medicine, Seoul (Korea, Republic of)

    2016-06-15

    Purpose: The objective of this study is to evaluate the relaxation time of the methylene resonance in comparison with other lipid resonances. Methods: The examinations were performed on a 3.0T MRI scanner using a four-channel animal coil. Eight more Sprague-Dawley rats in the same baseline weight range were housed with ad libitum access to water and a high-fat (HF) diet (60% fat, 20% protein, and 20% carbohydrate). In order to avoid large blood vessels, a voxel (0.8×0.8×0.8 cm^3) was placed in a homogeneous area of the liver parenchyma during free breathing. Lipid relaxations in normal-chow (NC) and HF diet rats were estimated at a fixed repetition time (TR) of 6000 msec and multiple echo times (TEs) of 40–220 msec. All spectra were processed using the Advanced Method for Accurate, Robust, and Efficient Spectral (AMARES) fitting algorithm of the Java-based Magnetic Resonance User Interface (jMRUI) package. Results: The mean T2 relaxation time of the methylene resonance in the normal-chow diet group was 37.1 msec (M0, 2.9±0.5), with a standard deviation of 4.3 msec. The mean T2 relaxation time of the methylene resonance in the HF diet group was 31.4 msec (M0, 3.7±0.3), with a standard deviation of 1.8 msec. The T2 relaxation times of methylene protons were higher in normal-chow diet rats than in HF rats (p<0.05), and the extrapolated M0 values were higher in HF rats than in NC rats (p<0.005). The excellent linear fits, with R^2>0.9971 and R^2>0.9987, indicate T2 relaxation decay curves with a mono-exponential function. Conclusion: In vivo, a sufficient spectral resolution and a sufficiently high signal-to-noise ratio (SNR) can be achieved, so that the data measured over short TE values can be extrapolated back to TE = 0 to produce better estimates of the relative weights of the spectral components. In the short term, treating the effective decay rate as exponential is an adequate approximation.
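
    As a hedged illustration of the mono-exponential T2 estimation, the MATLAB sketch below fits S(TE) = M0*exp(-TE/T2) to multi-TE signal amplitudes with a log-linear least-squares step. The signal values are synthetic placeholders, not the jMRUI/AMARES output reported above.

      TE = (40:20:220)';                                    % echo times, msec
      S  = 3.2*exp(-TE/35) .* (1 + 0.02*randn(size(TE)));   % synthetic decay data

      b  = [ones(size(TE)) -TE] \ log(S);   % log(S) = log(M0) - TE/T2
      M0 = exp(b(1));
      T2 = 1/b(2);
      fprintf('M0 = %.2f (a.u.), T2 = %.1f msec\n', M0, T2);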

  15. Using MATLAB software with Tomcat server and Java platform for remote image analysis in pathology.

    Science.gov (United States)

    Markiewicz, Tomasz

    2011-03-30

    Matlab is one of the most advanced development tools for applications in engineering practice. From our point of view the most important part is the Image Processing Toolbox, offering many built-in functions, including mathematical morphology, and implementations of many artificial neural networks. It is a very popular platform for creating specialized programs for image analysis, including in pathology. Based on the latest version of the Matlab Builder Java toolbox, it is possible to create software serving as a remote system for image analysis in pathology via internet communication. The internet platform can be realized based on Java Server Pages (JSP) with Tomcat as the servlet container. In the presented software implementation we propose remote image analysis realized by Matlab algorithms. These algorithms can be compiled to an executable JAR file with the help of the Matlab Builder Java toolbox. The Matlab function must be declared with a set of input data, an output structure with numerical results and a Matlab web figure. Any function prepared in that manner can be used as a Java function in JSP. The graphical user interface providing the input data and displaying the results (also in graphical form) must be implemented in JSP. Additionally, data storage to a database can be implemented within the algorithm written in Matlab, with the help of the Matlab Database Toolbox, directly alongside the image processing. The complete JSP page can be run by the Tomcat server. The proposed tool for remote image analysis was tested on the Computerized Analysis of Medical Images (CAMI) software developed by the author. The user provides the image and case information (diagnosis, staining, image parameters, etc.). When the analysis is initialized, the input data and image are sent to the servlet on Tomcat. When the analysis is done, the client obtains the graphical results as an image with the recognized cells marked, as well as the quantitative output. Additionally, the results are stored in a server

  16. Fitness of the analysis method of magnesium in drinking water using atomic absorption with quadratic calibration curve

    International Nuclear Information System (INIS)

    Perez-Lopez, Esteban

    2014-01-01

    Quantitative chemical analysis is of importance in research, as well as in areas such as quality control and the sale of analytical services. Some instrumental analysis methods for quantification with a linear calibration curve have presented limitations, because of the short linear dynamic range of the analyte or, sometimes, limitations of the technique itself. The need was therefore to investigate the suitability of quadratic calibration curves for analytical quantification, with which it was sought to demonstrate that they are a valid calculation model for chemical analysis instruments. The analysis method is based on the technique of atomic absorption spectroscopy, in particular the determination of magnesium in a drinking water sample from the Tacares sector north of Grecia. A nonlinear calibration curve was used, specifically a curve with quadratic behavior, and it was compared with the test results obtained for the same analysis with a linear calibration curve. The results showed that the methodology is valid for the determination in question with full confidence, since the concentrations were very similar and, according to the hypothesis tests used, can be considered equal. (author) [es
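
    A minimal MATLAB sketch of a quadratic calibration curve is given below: absorbance versus standard concentration is fitted with polyfit (degree 2) and the sample concentration is recovered by inverting the polynomial with roots(). The standards, absorbances and sample reading are illustrative placeholders, not the values of the study.

      conc = [0 0.1 0.2 0.4 0.6 0.8];                  % Mg standards, mg/L (assumed)
      absb = [0.002 0.085 0.160 0.290 0.395 0.480];    % measured absorbances (assumed)

      p = polyfit(conc, absb, 2);                      % A = p1*c^2 + p2*c + p3

      Asample = 0.250;                                 % sample absorbance (assumed)
      r = roots([p(1), p(2), p(3) - Asample]);         % solve p(c) = Asample
      c = r(imag(r) == 0 & r > 0 & r <= max(conc));    % keep the physically valid root
      fprintf('Mg concentration: %.3f mg/L\n', c);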

  17. Responsive Graphical User Interface (ReGUI) and its Implementation in MATLAB

    OpenAIRE

    Mikulszky, Matej; Pocsova, Jana; Mojzisova, Andrea; Podlubny, Igor

    2017-01-01

    In this paper we introduce the responsive graphical user interface (ReGUI) approach to creating applications, and demonstrate how this approach can be implemented in MATLAB. The same general technique can be used in other programming languages.

  18. Digital Model of Railway Electric Traction Lines

    Science.gov (United States)

    Garg, Rachana; Mahajan, Priya; Kumar, Parmod

    2017-08-01

    The characteristic impedance and propagation constant define the behavior of signal propagation over transmission lines. A digital model for railway traction lines, including the railway tracks, is developed using a curve-fitting technique in MATLAB. The sensitivity of this model has been computed with respect to frequency, and the digital sensitivity values are compared with the analog sensitivity values. The developed model is useful for digital protection, integrated operation, control and planning of the system.
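
    As a rough sketch of the curve-fitting step, the MATLAB fragment below fits a polynomial model of the characteristic impedance magnitude against log-frequency and then evaluates its sensitivity to frequency numerically. The frequency response is synthetic and the polynomial order is an assumption, not the model actually identified in the paper.

      f  = logspace(1, 4, 40)';                             % Hz
      Zc = 400 + 2e3./sqrt(f) + 0.5*randn(size(f));         % synthetic |Zc|, ohm

      pZ   = polyfit(log10(f), Zc, 4);                      % digital model |Zc| = P(log10 f)
      Zfit = polyval(pZ, log10(f));

      dZdf = gradient(Zfit, f);                             % sensitivity d|Zc|/df
      semilogx(f, Zc, 'o', f, Zfit, '-');
      xlabel('frequency (Hz)'); ylabel('|Z_c| (ohm)');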

  19. Generalized drying curves in conductive/convective paper drying

    Directory of Open Access Journals (Sweden)

    O.C. Motta Lima

    2000-12-01

    Full Text Available This work presents a study of the conductive/convective drying of paper (cellulose sheets) over heated surfaces, under natural and forced air conditions. The experimental apparatus consists of a metallic box heated by a thermostatic bath, with an upper surface on which the paper samples (about 1 mm thick) are placed. The system is exposed to ambient air under two different conditions: natural convection and forced convection provided by an adjustable blower. The influence of the initial paper moisture content, the drying (heated surface) temperature and the air velocity on the behavior of the drying curves is observed under different drying conditions. This influence is then studied through the proposal of generalized drying curves. These curves are analyzed individually for each air condition described above and for both together, and a set of equations to fit them is proposed and discussed.

  20. Matlab based Toolkits used to Interface with Optical Design Software for NASA's James Webb Space Telescope

    Science.gov (United States)

    Howard, Joseph

    2007-01-01

    The viewgraph presentation provides an introduction to the James Webb Space Telescope (JWST). The first part provides a brief overview of Matlab toolkits including CodeV, OSLO, and Zemax Toolkits. The toolkit overview examines purpose, layout, how Matlab gets data from CodeV, function layout, and using cvHELP. The second part provides examples of use with JWST, including wavefront sensitivities and alignment simulations.

  1. Quantification of larval zebrafish motor function in multiwell plates using open-source MATLAB applications.

    Science.gov (United States)

    Zhou, Yangzhong; Cattley, Richard T; Cario, Clinton L; Bai, Qing; Burton, Edward A

    2014-07-01

    This article describes a method to quantify the movements of larval zebrafish in multiwell plates, using the open-source MATLAB applications LSRtrack and LSRanalyze. The protocol comprises four stages: generation of high-quality, flatly illuminated video recordings with exposure settings that facilitate object recognition; analysis of the resulting recordings using tools provided in LSRtrack to optimize tracking accuracy and motion detection; analysis of tracking data using LSRanalyze or custom MATLAB scripts; and implementation of validation controls. The method is reliable, automated and flexible, and requires a plate format suitable for high-throughput applications.

  2. Survival curves study of platelet labelling with 51Cr

    International Nuclear Information System (INIS)

    Penas, M.E.

    1981-01-01

    Platelet kinetics and idiopathic thrombocytopenic purpura were reviewed in the literature. An in vitro platelet labelling procedure with 51Cr, currently under implementation, has been evaluated in human beings. The functions used for fitting considered whether the curve was linear or exponential, as well as the presence of erythrocytes. (author)

  3. Dose-effect Curve for X-radiation in Lymphocytes in Goats

    International Nuclear Information System (INIS)

    Hasanbasic, D.; Saracevic, L.; Sacirbegovic, A.

    1998-01-01

    A dose-effect curve for X-radiation was constructed based on the analysis of chromosome aberrations in lymphocytes of goats. Blood samples from seven goats were irradiated using the MOORHEAD method, slightly modified and adapted to our conditions. A linear-quadratic model was used, and the dose-effect curves were fitted by the least squares method. The collective dose-effect curve for goats is given by the expression y(D) = 8.6639·10^-3·D + 2.9748·10^-2·D^2 + 2.9475·10^-3. Comparison with some domestic animals such as sheep and pigs showed differences not only with respect to the linear-quadratic model, but with respect to other mathematical representations as well. (author)
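
    A minimal MATLAB sketch of fitting the linear-quadratic model y(D) = c + a*D + b*D^2 by ordinary least squares is shown below (the published fits would normally weight the data appropriately). The dose points and aberration yields are synthetic placeholders, not the goat data.

      D = [0 0.25 0.5 1 2 3 4]';                                   % dose, Gy
      y = 2.9e-3 + 8.7e-3*D + 3.0e-2*D.^2 + 1e-3*randn(size(D));   % synthetic yields

      X    = [ones(size(D)) D D.^2];
      coef = X \ y;                                                % [c; a; b]
      fprintf('y(D) = %.2e + %.2e*D + %.2e*D^2\n', coef(1), coef(2), coef(3));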

  4. Method to Calculate the Electricity Generated by a Photovoltaic Cell, Based on Its Mathematical Model Simulations in MATLAB

    Directory of Open Access Journals (Sweden)

    Carlos Morcillo-Herrera

    2015-01-01

    Full Text Available This paper presents a practical method for calculating the electrical energy generated by a PV panel (kWh) through MATLAB simulations based on the mathematical model of the cell, which obtains the “Mean Maximum Power Point” (MMPP) on the characteristic V-P curve in response to evaluating historical climate data at a specific location. This five-step method calculates, through the MMPP per day, per month or per year, the power yield per unit area, then the electrical energy generated by the PV panel, and its real conversion efficiency. To validate the method, it was applied to the Sewage Treatment Plant of the Drinking Water and Sewerage Group of Yucatan (JAPAY), México, testing 250 Wp photovoltaic panels from five different manufacturers. As a result, the performance, the real conversion efficiency, and the electricity generated by the five PV panels under evaluation were obtained, showing the best technical-economic option for developing the PV generation project.
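
    A hedged sketch of the underlying cell model is given below: a single-diode I-V relation (shunt resistance neglected for simplicity) is swept in MATLAB to build the V-P curve, from which the maximum power point is read. All module parameters are illustrative assumptions, not those of the 250 Wp panels evaluated in the paper.

      Iph = 8.6; I0 = 1e-6; nId = 1.3; Ns = 60; Rs = 0.35;   % assumed module parameters
      q = 1.602e-19; k = 1.381e-23; T = 298.15;
      Vt = nId*Ns*k*T/q;                                     % module thermal voltage

      I = linspace(0, 0.999*Iph, 500);                       % current sweep
      V = Vt*log((Iph - I)/I0 + 1) - I*Rs;                   % explicit V(I), no shunt
      P = V .* I;

      [Pmpp, idx] = max(P);                                  % maximum power point
      fprintf('MPP: %.1f W at %.1f V, %.2f A\n', Pmpp, V(idx), I(idx));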

  5. MATLAB-implemented estimation procedure for model-based assessment of hepatic insulin degradation from standard intravenous glucose tolerance test data.

    Science.gov (United States)

    Di Nardo, Francesco; Mengoni, Michele; Morettini, Micaela

    2013-05-01

    The present study provides a novel MATLAB-based parameter estimation procedure for the individual assessment of the hepatic insulin degradation (HID) process from standard frequently-sampled intravenous glucose tolerance test (FSIGTT) data. Direct access to the source code, offered by MATLAB, enabled us to design an optimization procedure based on the alternating use of the Gauss-Newton and Levenberg-Marquardt algorithms, which assures full convergence of the process and containment of the computational time. Reliability was tested by direct comparison with the application, in eighteen non-diabetic subjects, of the well-known kinetic analysis software package SAAM II, and by application to different data. Agreement between MATLAB and SAAM II was warranted by intraclass correlation coefficients ≥0.73, no significant differences between corresponding mean parameter estimates and predictions of HID rate, and consistent residual analysis. Moreover, the MATLAB optimization procedure resulted in a significant 51% reduction of CV% for the parameter worst estimated by SAAM II and kept all model-parameter CV% values at acceptable levels. The MATLAB-based procedure was therefore suggested as a suitable tool for the individual assessment of the HID process. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  6. Numerical Integration Techniques for Curved-Element Discretizations of Molecule–Solvent Interfaces

    Science.gov (United States)

    Bardhan, Jaydeep P.; Altman, Michael D.; Willis, David J.; Lippow, Shaun M.; Tidor, Bruce; White, Jacob K.

    2012-01-01

    approximations with increasing discretization and associated increases in computational resources. The results clearly demonstrate that our methods for approximate integration on an exact geometry are far more accurate than exact integration on an approximate geometry. A MATLAB implementation of the presented integration methods and sample data files containing curved-element discretizations of several small molecules are available online at http://web.mit.edu/tidor. PMID:17627358

  7. Identification of the main thermal characteristics of building components using MATLAB

    DEFF Research Database (Denmark)

    Jimenez, M.J.; Madsen, Henrik; Andersen, Klaus Kaae

    2008-01-01

    This paper presents the application of the IDENT Graphical User Interface of MATLAB to estimate the thermal properties of building components from outdoor dynamic testing, imposing appropriate physical constraints and assuming linear and time invariant parametric models. The theory is briefly...

  8. The Biasing Effects of Unmodeled ARMA Time Series Processes on Latent Growth Curve Model Estimates

    Science.gov (United States)

    Sivo, Stephen; Fan, Xitao; Witta, Lea

    2005-01-01

    The purpose of this study was to evaluate the robustness of estimated growth curve models when there is stationary autocorrelation among manifest variable errors. The results suggest that when, in practice, growth curve models are fitted to longitudinal data, alternative rival hypotheses to consider would include growth models that also specify…

  9. Simulation of the OFDM QPSK Modulation Technique Using MATLAB

    OpenAIRE

    Subrata, Rosalia H; Gozali, Ferrianto

    2015-01-01

    This paper provides a brief explanation of the processing steps involved in Orthogonal Frequency Division Multiplexing (OFDM) with Quadrature Phase Shift Keying (QPSK) modulation technique implemented as a simulation program in MatLab. Input data of the simulation program in the form of random bit stream or text can be selected by users. The process conducted in the simulation is divided into three consecutive steps, processes in the OFDM transmitter, in transmission channel and in the OFDM r...

  10. Estimation of growth curve parameters in Konya Merino sheep ...

    African Journals Online (AJOL)

    The objective of this study was to determine the fitness of Quadratic, Cubic, Gompertz and Logistic functions to the growth curves of Konya Merino lambs obtained by using monthly records of live weight from birth to 480 days of age. The models were evaluated according to determination coefficient (R2), mean square ...
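    As a hedged sketch of the kind of fit described here (not the study's code or data), a Gompertz curve W(t) = A·exp(−b·exp(−k·t)) can be fitted to live-weight records with nlinfit from the Statistics and Machine Learning Toolbox; the ages and weights below are invented.

```matlab
% Fit a Gompertz growth curve W(t) = A*exp(-b*exp(-k*t)) to live-weight data.
% Ages (days) and weights (kg) are illustrative, not the Konya Merino records.
t = [0 30 60 90 120 180 240 300 360 420 480]';
W = [4 11 18 25 31 40 46 51 54 56 58]';

gomp = @(p, t) p(1) .* exp(-p(2) .* exp(-p(3) .* t));
p0   = [60 2.5 0.01];                        % rough starting values [A b k]
p    = nlinfit(t, W, gomp, p0);              % Statistics Toolbox

Wfit = gomp(p, t);
R2   = 1 - sum((W - Wfit).^2) / sum((W - mean(W)).^2);
fprintf('A = %.1f kg, b = %.2f, k = %.4f /day, R^2 = %.3f\n', p(1), p(2), p(3), R2);
```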

  11. A new method for measuring coronary artery diameters with CT spatial profile curves

    International Nuclear Information System (INIS)

    Shimamoto, Ryoichi; Suzuki, Jun-ichi; Yamazaki, Tadashi; Tsuji, Taeko; Ohmoto, Yuki; Morita, Toshihiro; Yamashita, Hiroshi; Honye, Junko; Nagai, Ryozo; Akahane, Masaaki; Ohtomo, Kuni

    2007-01-01

    Purpose: Coronary artery vascular edge recognition on computed tomography (CT) angiograms is influenced by window parameters. A noninvasive method for vascular edge recognition independent of window setting with use of multi-detector row CT was contrived and its feasibility and accuracy were estimated by intravascular ultrasound (IVUS). Methods: Multi-detector row CT was performed to obtain 29 CT spatial profile curves by setting a line cursor across short-axis coronary angiograms processed by multi-planar reconstruction. IVUS was also performed to determine the reference coronary diameter. IVUS diameter was fitted horizontally between two points on the upward and downward slopes of the profile curves and Hounsfield number was measured at the fitted level to test seven candidate indexes for definition of intravascular coronary diameter. The best index from the curves should show the best agreement with IVUS diameter. Results: Of the seven candidates the agreement was the best (agreement: 16 ± 11%) when the two ratios of Hounsfield number at the level of IVUS diameter over that at the peak on the profile curves were used with water and with fat as the background tissue. These edge definitions were achieved by cutting the horizontal distance by the curves at the level defined by the ratio of 0.41 for water background and 0.57 for fat background. Conclusions: Vascular edge recognition of the coronary artery with CT spatial profile curves was feasible and the contrived method could define the coronary diameter with reasonable agreement

  12. MATLAB-Based Program for Teaching Autocorrelation Function and Noise Concepts

    Science.gov (United States)

    Jovanovic Dolecek, G.

    2012-01-01

    An attractive MATLAB-based tool for teaching the basics of autocorrelation function and noise concepts is presented in this paper. This tool enhances traditional in-classroom lecturing. The demonstrations of the tool described here highlight the description of the autocorrelation function (ACF) in a general case for wide-sense stationary (WSS)…
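    As a classroom-style companion to the tool described in this record (not the authors' program), the biased sample autocorrelation of a sinusoid in white noise can be computed with xcorr from the Signal Processing Toolbox:

```matlab
% Sample autocorrelation of a sinusoid in white noise (illustrative values).
fs = 1000;                                   % sampling frequency, Hz
t  = (0:1/fs:1)';                            % 1 s of data
x  = sin(2*pi*50*t) + 0.5*randn(size(t));    % 50 Hz tone plus WSS white noise

maxlag    = 200;
[r, lags] = xcorr(x, maxlag, 'biased');      % biased ACF estimate

plot(lags/fs, r);
xlabel('Lag (s)'); ylabel('r_{xx}');
title('Sample ACF: the periodic part persists, the noise collapses to lag 0');
```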

  13. Measuring modules for the research of compensators of reactive power with voltage stabilization in MATLAB

    Science.gov (United States)

    Vlasayevsky, Stanislav; Klimash, Stepan; Klimash, Vladimir

    2017-10-01

    A set of mathematical modules was developed for evaluating energy performance in studies of electrical systems and complexes in MatLab. The SimPowerSystems electrotechnical library of the MatLab software contains no measuring modules for the energy coefficients that characterize the quality of electricity and the energy efficiency of electrical apparatus. Modules designed to calculate the energy coefficients characterizing the quality of electricity (current distortion and voltage distortion) and the energy-efficiency indicators (power factor and efficiency) are therefore presented. The methods and principles of building the modules are described. Detailed schemes of the modules, built from elements of the Simulink Library, are presented; as a consequence, these modules are compatible with mathematical models of electrical systems and complexes in MatLab. The results of testing the developed modules and of their verification against schemes that have analytical expressions for the energy indicators are also presented.

  14. Optimal Performance of a Nonlinear Gantry Crane System via Priority-based Fitness Scheme in Binary PSO Algorithm

    International Nuclear Information System (INIS)

    Jaafar, Hazriq Izzuan; Ali, Nursabillilah Mohd; Selamat, Nur Asmiza; Kassim, Anuar Mohamed; Mohamed, Z; Abidin, Amar Faiz Zainal; Jamian, J J

    2013-01-01

    This paper presents the development of optimal PID and PD controllers for controlling a nonlinear gantry crane system. The proposed Binary Particle Swarm Optimization (BPSO) algorithm, which uses a Priority-based Fitness Scheme, is adopted to obtain five optimal controller gains. The optimal gains are tested on a control structure that combines PID and PD controllers to examine system responses, including trolley displacement and payload oscillation. The dynamic model of the gantry crane system is derived using the Lagrange equation. Simulation is conducted within the Matlab environment to verify the performance of the system in terms of settling time (Ts), steady-state error (SSE) and overshoot (OS). The proposed technique demonstrates that the implementation of the Priority-based Fitness Scheme in BPSO is effective and able to move the trolley as fast as possible to the various desired positions.

  15. Image enhancement using MCNP5 code and MATLAB in neutron radiography.

    Science.gov (United States)

    Tharwat, Montaser; Mohamed, Nader; Mongy, T

    2014-07-01

    This work presents a method that can be used to enhance neutron radiography (NR) images for objects containing highly scattering materials such as hydrogen, carbon and other light materials. The method uses the Monte Carlo code MCNP5 to simulate the NR process, obtain the flux distribution for each pixel of the image and determine the scattered-neutron distribution that causes image blur, and then uses MATLAB to subtract this scattered-neutron distribution from the initial image to improve its quality. This work was performed before the commissioning of the digital NR system in January 2013. The MATLAB enhancement method is quite a good technique in the case of static film-based neutron radiography, while in the neutron imaging (NI) technique, image enhancement and quantitative measurement were carried out efficiently using ImageJ software. The enhanced image quality and quantitative measurements are presented in this work. Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. A comparative analysis of the EEDF obtained by Regularization and by Least square fit methods

    International Nuclear Information System (INIS)

    Gutierrez T, C.; Flores Ll, H.

    2004-01-01

    The second derivative of the current-voltage (I-V) characteristic curve of a Langmuir probe is numerically calculated using the Tikhonov method in order to determine the electron energy distribution function (EEDF). A comparison between the EEDF obtained in this way and a least-squares (LS) fit is discussed. The experimental I-V curve is obtained with a cylindrical probe in an electron cyclotron resonance (ECR) plasma source. The plasma parameters are determined from the EEDF by means of the Laframboise theory. For the LS fit, the results obtained are similar to those given by the Tikhonov method, but in the former case the procedure is slow to achieve the best fit. (Author)

  17. Calculating the parameters of experimental data Gauss distribution using the least square fit method and evaluation of their accuracy

    International Nuclear Information System (INIS)

    Guseva, E.V.; Peregudov, V.N.

    1982-01-01

    The FITGAV program for calculating the parameters of a Gauss curve describing experimental data is considered. The calculations are based on the least-squares fit method. Estimates of the errors in the parameter determination, as a function of the size of the experimental data sample and of their statistical significance, are obtained. A curve fit using 100 points takes less than 1 s on an SM-4 type computer.
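    FITGAV itself is not reproduced in the record; a hedged modern MATLAB equivalent fits a Gaussian plus constant background by nonlinear least squares (lsqcurvefit, Optimization Toolbox assumed), with simulated counts standing in for the experimental data.

```matlab
% Least-squares fit of a Gaussian plus constant background to spectrum-like data.
% Channel counts are simulated; p = [amplitude, centre, sigma, background].
gauss = @(p, x) p(1)*exp(-0.5*((x - p(2))/p(3)).^2) + p(4);

x = (1:100)';
y = gauss([500 48 6 20], x) + sqrt(gauss([500 48 6 20], x)).*randn(size(x));

p0   = [max(y) 50 5 min(y)];                 % crude starting values
pest = lsqcurvefit(gauss, p0, x, y);

fprintf('A = %.0f, centre = %.2f, sigma = %.2f, bkg = %.1f\n', pest);
```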

  18. Quantitative description of the magnetization curves of amorphous alloys of the series a-DyxGd1-xNi

    International Nuclear Information System (INIS)

    Barbara, B.; Filippi, J.; Amaral, V.S.

    1992-01-01

    The magnetization curves of the series of amorphous alloys Dy x Gd 1-x Ni measured between 1.5 and 4.2 K and up to 15 T, have been fitted to the zero kelvin analytical model of Chudnovsky. The results of these fits allow a detailed understanding of the magnetization curves of amorphous alloys with ferromagnetic interactions. In particular, the ratio D/J of the local anisotropy and exchange energies, and the magnetic and atomic correlation lengths, are accurately determined. (orig.)

  19. A Simulation Platform To Model, Optimize And Design Wind Turbines. The Matlab/Simulink Toolbox

    Directory of Open Access Journals (Sweden)

    Anca Daniela HANSEN

    2002-12-01

    Full Text Available In recent years Matlab/Simulink® has become the most widely used software for the modeling and simulation of dynamic systems. Wind energy conversion systems are an example of such systems, containing subsystems with different ranges of time constants: wind, turbine, generator, power electronics, transformer and grid. The electrical generator and the power converter need the smallest simulation step and therefore these blocks determine the simulation speed. This paper presents a new and integrated simulation platform for modeling, optimizing and designing wind turbines. The platform contains different simulation tools: Matlab/Simulink, used as the basic modeling tool, HAWC, DIgSILENT and Saber.

  20. QUENCH: A software package for the determination of quenching curves in Liquid Scintillation counting

    International Nuclear Information System (INIS)

    Cassette, Philippe

    2016-01-01

    In Liquid Scintillation Counting (LSC), the scintillating source is part of the measurement system and its detection efficiency varies with the scintillator used, the vial, and the volume and chemistry of the sample. The detection efficiency is generally determined using a quenching curve, describing, for a specific radionuclide, the relationship between a quenching index given by the counter and the detection efficiency. A set of quenched LS standard sources is prepared by adding a quenching agent, and the quenching index and detection efficiency are determined for each source. A simple formula is then fitted to the experimental points to define the quenching-curve function. The paper describes a software package specifically devoted to the determination of quenching curves with uncertainties. The experimental measurements are described by their quenching index and detection efficiency, with uncertainties on both quantities. Random Gaussian fluctuations of these experimental measurements are sampled and a polynomial or logarithmic function is fitted to each fluctuation by χ² minimization. This Monte Carlo procedure is repeated many times and eventually the arithmetic mean and the experimental standard deviation of each parameter are calculated, together with the covariances between these parameters. Using these parameters, the detection efficiency corresponding to an arbitrary quenching index within the measured range can be calculated. The associated uncertainty is calculated with the law of propagation of variances, including the covariance terms. - Highlights: • The program “QUENCH” is devoted to the interpolation of quenching curves in LSC. • Functions are fitted to experimental data with uncertainties in both quenching and efficiency. • The parameters of the fitting function and the associated covariance matrix are evaluated. • The detection efficiency and uncertainty corresponding to a given quenching index are calculated.
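    The QUENCH package is not reproduced here, but the Monte Carlo resampling idea it describes can be sketched in a few lines of MATLAB; the quenching indices, efficiencies, uncertainties and the quadratic fitting function below are assumptions for illustration only.

```matlab
% Monte Carlo propagation of uncertainties in a quenching-curve fit.
% Quenching index q, efficiency eff and their standard uncertainties are invented.
q    = [10 9 8 7 6 5]';                    % quenching index (arbitrary units)
eff  = [0.95 0.92 0.88 0.82 0.74 0.63]';
u_q  = 0.1*ones(size(q));   u_eff = 0.01*ones(size(eff));

nMC = 2000;  deg = 2;                      % number of trials, polynomial degree
P   = zeros(nMC, deg+1);
for k = 1:nMC
    qk = q   + u_q   .* randn(size(q));    % Gaussian fluctuation of both axes
    ek = eff + u_eff .* randn(size(eff));
    P(k,:) = polyfit(qk, ek, deg);         % least-squares polynomial fit
end

pMean = mean(P);   pCov = cov(P);          % mean parameters and their covariances
fprintf('mean fit coefficients: %s\n', mat2str(pMean, 4));

q0   = 7.5;                                % arbitrary quenching index to evaluate
effs = P * (q0 .^ (deg:-1:0))';            % evaluate each trial's polynomial at q0
fprintf('efficiency at q = %.1f: %.4f +/- %.4f\n', q0, mean(effs), std(effs));
```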

  1. Strategies for fitting nonlinear ecological models in R, AD Model Builder, and BUGS

    DEFF Research Database (Denmark)

    Bolker, B.M.; Gardner, B.; Maunder, M.

    2013-01-01

    Ecologists often use nonlinear fitting techniques to estimate the parameters of complex ecological models, with attendant frustration. This paper compares three open-source model fitting tools and discusses general strategies for defining and fitting models. R is convenient and (relatively) easy...... to learn, AD Model Builder is fast and robust but comes with a steep learning curve, while BUGS provides the greatest flexibility at the price of speed. Our model-fitting suggestions range from general cultural advice (where possible, use the tools and models that are most common in your subfield...

  2. EEGVIS: A MATLAB toolbox for browsing, exploring, and viewing large datasets

    Directory of Open Access Journals (Sweden)

    Kay A Robbins

    2012-05-01

    Full Text Available Recent advances in data monitoring and sensor technology have accelerated the acquisition of very large data sets. Streaming data sets from instrumentation such as multi-channel EEG recording usually must undergo substantial pre-processing and artifact removal. Even when using automated procedures, most scientists engage in laborious manual examination and processing to assure high quality data and to identify interesting or problematic data segments. Researchers also do not have a convenient method of visually assessing the effects of applying any stage in a processing pipeline. EEGVIS is a MATLAB toolbox that allows users to quickly explore multi-channel EEG and other large array-based data sets using multi-scale drill-down techniques. Customizable summary views reveal potentially interesting sections of data, which users can explore further by clicking to examine using detailed viewing components. The viewer and a companion browser are built on our MoBBED framework, which has a library of modular viewing components that can be mixed and matched to best reveal structure. Users can easily create new viewers for their specific data without any programming during the exploration process. These viewers automatically support pan, zoom, resizing of individual components, and cursor exploration. The toolbox can be used directly in MATLAB at any stage in a processing pipeline, as a plug-in for EEGLAB, or as a standalone precompiled application without MATLAB running. EEGVIS and its supporting packages are freely available under the GNU general public license at http://visual.cs.utsa.edu/eegvis.

  3. EEGVIS: A MATLAB Toolbox for Browsing, Exploring, and Viewing Large Datasets.

    Science.gov (United States)

    Robbins, Kay A

    2012-01-01

    Recent advances in data monitoring and sensor technology have accelerated the acquisition of very large data sets. Streaming data sets from instrumentation such as multi-channel EEG recording usually must undergo substantial pre-processing and artifact removal. Even when using automated procedures, most scientists engage in laborious manual examination and processing to assure high quality data and to identify interesting or problematic data segments. Researchers also do not have a convenient method of visually assessing the effects of applying any stage in a processing pipeline. EEGVIS is a MATLAB toolbox that allows users to quickly explore multi-channel EEG and other large array-based data sets using multi-scale drill-down techniques. Customizable summary views reveal potentially interesting sections of data, which users can explore further by clicking to examine using detailed viewing components. The viewer and a companion browser are built on our MoBBED framework, which has a library of modular viewing components that can be mixed and matched to best reveal structure. Users can easily create new viewers for their specific data without any programming during the exploration process. These viewers automatically support pan, zoom, resizing of individual components, and cursor exploration. The toolbox can be used directly in MATLAB at any stage in a processing pipeline, as a plug-in for EEGLAB, or as a standalone precompiled application without MATLAB running. EEGVIS and its supporting packages are freely available under the GNU general public license at http://visual.cs.utsa.edu/eegvis.

  4. Curve fitting using a genetic algorithm for the X-ray fluorescence measurement of lead in bone

    International Nuclear Information System (INIS)

    Luo, L.; McMaster University, Hamilton; Chettle, D.R.; Nie, H.; McNeill, F.E.; Popovic, M.

    2006-01-01

    We investigated the potential application of the genetic algorithm to the analysis of X-ray fluorescence spectra from measurements of lead in bone. Candidate solutions are first designed based on field knowledge, and the whole operation, evaluation, selection, crossover and mutation, is then repeated until a given convergence criterion is met. An average-parameters-based genetic algorithm is suggested to improve the fitting precision and accuracy. The relative standard deviations (RSD%) of the fitted amplitude, peak position and width are 1.3-7.1, 0.009-0.14 and 1.4-3.3, respectively. The genetic algorithm was shown to give a good resolution and fitting of the Pb K lines and the γ elastic peaks. (author)

  5. Gro2mat: a package to efficiently read gromacs output in MATLAB.

    Science.gov (United States)

    Dien, Hung; Deane, Charlotte M; Knapp, Bernhard

    2014-07-30

    Molecular dynamics (MD) simulations are a state-of-the-art computational method used to investigate molecular interactions at atomic scale. Interaction processes out of experimental reach can be monitored using MD software, such as Gromacs. Here, we present the gro2mat package that allows fast and easy access to Gromacs output files from Matlab. Gro2mat enables direct parsing of the most common Gromacs output formats including the binary xtc-format. No openly available Matlab parser currently exists for this format. The xtc reader is orders of magnitudes faster than other available pdb/ascii workarounds. Gro2mat is especially useful for scientists with an interest in quick prototyping of new mathematical and statistical approaches for Gromacs trajectory analyses. © 2014 Wiley Periodicals, Inc.

  6. Evaluation of Interpolants in Their Ability to Fit Seismometric Time Series

    OpenAIRE

    Basu, Kanadpriya; Mariani, Maria; Serpa, Laura; Sinha, Ritwik

    2015-01-01

    This article is devoted to the study of the ASARCO demolition seismic data. Two different classes of modeling techniques are explored: First, mathematical interpolation methods and second statistical smoothing approaches for curve fitting. We estimate the characteristic parameters of the propagation medium for seismic waves with multiple mathematical and statistical techniques, and provide the relative advantages of each approach to address fitting of such data. We conclude that mathematical ...

  7. Comparison of parametric, orthogonal, and spline functions to model individual lactation curves for milk yield in Canadian Holsteins

    Directory of Open Access Journals (Sweden)

    Corrado Dimauro

    2010-11-01

    Full Text Available Test-day records for milk yield of 57,390 first-lactation Canadian Holsteins were analyzed with a linear model that included the fixed effects of herd-test date and days in milk (DIM) interval nested within age and calving season. Residuals from this model were analyzed as a new variable and fitted with a five-parameter model, fourth-order Legendre polynomials, and linear, quadratic and cubic spline models with three knots. The fit of the models was rather poor, with about 30-40% of the curves showing an adjusted R-square lower than 0.20 across all models. The results underline the great difficulty of modelling individual deviations around the mean curve for milk yield. However, the Ali and Schaeffer five-parameter model and the fourth-order Legendre polynomials were able to detect two basic shapes of individual deviations around the mean curve. Quadratic and, especially, cubic spline functions had better fitting performance but poor predictive ability, due to their great flexibility, which results in an abrupt change of the estimated curve when data are missing. Parametric and orthogonal polynomials seem to be robust and affordable from this standpoint.

  8. Elevation data fitting and precision analysis of Google Earth in road survey

    Science.gov (United States)

    Wei, Haibin; Luan, Xiaohan; Li, Hanchao; Jia, Jiangkun; Chen, Zhao; Han, Leilei

    2018-05-01

    Objective: In order to improve the efficiency of road surveys and save manpower and material resources, this paper applies Google Earth to the feasibility-study stage of road survey and design. Because Google Earth elevation data lack precision, the paper focuses on finding several different fitting or difference methods to improve the data precision, in order to meet, as far as possible, the accuracy requirements of road survey and design specifications. Method: On the basis of the elevation differences at a limited number of public points, the elevation difference at any other point can be fitted or interpolated; a precise elevation can thus be obtained by subtracting the elevation difference from the Google Earth data. A quadratic polynomial surface fitting method, a cubic polynomial surface fitting method, the V4 interpolation method in MATLAB and a neural network method are used in this paper to process Google Earth elevation data, and internal conformity, external conformity and the cross-correlation coefficient are used as evaluation indexes of the data-processing effect. Results: There is no fitting difference at the fitting points when using the V4 interpolation method; its external conformity is the largest and its accuracy improvement is the worst, so the V4 interpolation method is ruled out. The internal and external conformity of the cubic polynomial surface fitting method are both better than those of the quadratic polynomial surface fitting method. The neural network method has a fitting effect similar to that of the cubic polynomial surface fitting method, but its fitting effect is better in the case of larger elevation differences. Because the neural network method is a less manageable fitting model, the cubic polynomial surface fitting method should be used as the main method and the neural network method as the auxiliary method in the case of larger elevation differences. Conclusions: The cubic polynomial surface fitting method can obviously

  9. Applied Statistics Using SPSS, STATISTICA, MATLAB and R

    CERN Document Server

    De Sá, Joaquim P Marques

    2007-01-01

    This practical reference provides a comprehensive introduction and tutorial on the main statistical analysis topics, demonstrating their solution with the most common software package. Intended for anyone needing to apply statistical analysis to a large variety of science and enigineering problems, the book explains and shows how to use SPSS, MATLAB, STATISTICA and R for analysis such as data description, statistical inference, classification and regression, factor analysis, survival data and directional statistics. It concisely explains key concepts and methods, illustrated by practical examp

  10. Solution of the reactor point kinetics equations by MATLAB computing

    Directory of Open Access Journals (Sweden)

    Singh Sudhansu S.

    2015-01-01

    Full Text Available The numerical solution of the point kinetics equations in the presence of Newtonian temperature feedback has been a challenging issue in the analysis of reactor transients. The reactor point kinetics equations are a system of stiff ordinary differential equations which need special numerical treatment. Although a plethora of numerical techniques has been introduced to solve the point kinetics equations over the years, some of the simple and straightforward methods still work very efficiently, with extraordinary accuracy. As an example, it has been shown recently that the fundamental backward Euler finite difference algorithm, with its simplicity, has proven to be one of the most effective legacy methods. Complementing the backward Euler finite difference scheme, the present work demonstrates the application of the ordinary differential equation suite available in the MATLAB software package to solve the stiff reactor point kinetics equations with Newtonian temperature feedback effects very effectively, by analyzing various classic benchmark cases. The fair accuracy of the results implies that the MATLAB ordinary differential equation suite can be applied efficiently to the reactor point kinetics equations as an alternative method for future applications.
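    As a hedged illustration of handing the stiff point kinetics equations to MATLAB's ODE suite (six delayed-neutron groups, textbook-style constants, a step reactivity insertion and no temperature feedback, i.e. not one of the cited benchmark cases):

```matlab
% Point kinetics with six delayed-neutron groups and a step reactivity insertion.
% Kinetic constants are textbook-style values, not taken from the cited benchmarks.
beta_i = [2.11e-4 1.40e-3 1.27e-3 2.57e-3 7.50e-4 2.70e-4];
lam    = [0.0124 0.0305 0.111 0.301 1.14 3.01];
beta   = sum(beta_i);
Lambda = 2e-5;                   % neutron generation time, s
rho    = 0.5*beta;               % step insertion of +0.5 dollars

f = @(t, y) [ (rho - beta)/Lambda * y(1) + lam * y(2:7);
              beta_i'/Lambda * y(1) - lam' .* y(2:7) ];

n0 = 1;
C0 = beta_i' ./ (lam' * Lambda);             % equilibrium precursor concentrations
[t, y] = ode15s(f, [0 10], [n0; C0*n0]);     % stiff solver from the ODE suite

semilogy(t, y(:,1));
xlabel('Time (s)'); ylabel('Relative neutron density n(t)');
```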

  11. Exponential models applied to automated processing of radioimmunoassay standard curves

    International Nuclear Information System (INIS)

    Morin, J.F.; Savina, A.; Caroff, J.; Miossec, J.; Legendre, J.M.; Jacolot, G.; Morin, P.P.

    1979-01-01

    An improved computer processing is described for fitting of radio-immunological standard curves by means of an exponential model on a desk-top calculator. This method has been applied to a variety of radioassays and the results are in accordance with those obtained by more sophisticated models [fr

  12. A synthetic method of solar spectrum based on LED

    Science.gov (United States)

    Wang, Ji-qiang; Su, Shi; Zhang, Guo-yu; Zhang, Jian

    2017-10-01

    A method for synthesizing the solar spectrum, based on the spectral characteristics of the solar spectrum and of LEDs and on the principle of arbitrary spectral synthesis, was studied using 14 kinds of LED with different central wavelengths. The LED and solar spectrum data were first selected with the Origin software; the total number of LEDs for each central band was then calculated from the transformation relation between brightness and illuminance and a least-squares curve fit in Matlab. Finally, the spectral curve of the AM1.5 standard solar spectrum was obtained. The results met the technical indexes of solar spectrum matching within ±20% and a solar constant of >0.5.
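    The record's least-squares synthesis step can be sketched generically: with measured LED spectra as columns of a matrix and a target spectrum on the same wavelength grid, non-negative least squares yields the LED drive weights. The Gaussian LED spectra and the smooth target below are placeholders for the measured data and the AM1.5 spectrum.

```matlab
% Synthesize a target spectrum from LED basis spectra by non-negative least squares.
% LED centre wavelengths and Gaussian shapes are placeholders for measured spectra.
wl      = (350:5:1100)';                         % wavelength grid, nm
centres = linspace(380, 1050, 14);               % 14 LED centre wavelengths
fwhm    = 30;  sigma = fwhm/2.355;

A = exp(-0.5*((wl - centres)/sigma).^2);         % each column = one LED spectrum
                                                 % (implicit expansion, R2016b+)
target = exp(-((wl - 550)/250).^2);              % placeholder for AM1.5 irradiance

w = lsqnonneg(A, target);                        % non-negative LED weights

plot(wl, target, 'k', wl, A*w, 'r--');
legend('target spectrum', 'LED synthesis');
xlabel('Wavelength (nm)'); ylabel('Relative irradiance');
```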

  13. Creep curve modeling of hastelloy-X alloy by using the theta projection method

    International Nuclear Information System (INIS)

    Woo Gon, Kim; Woo-Seog, Ryu; Jong-Hwa, Chang; Song-Nan, Yin

    2007-01-01

    To model the creep curves of Hastelloy-X, which is being considered as a candidate material for VHTR (Very High Temperature gas-cooled Reactor) components, full creep curves were obtained from constant-load creep tests at different stress levels at 950 °C. Using the experimental creep data, the creep curves were modeled by applying the Theta projection method. A number of nonlinear least-squares fitting (NLSF) computations was carried out to establish the suitability of the four Theta parameters. The results showed that the Θ1 and Θ2 parameters could not be optimized well, showing a large error during the fitting of the full creep curves, whereas the Θ3 and Θ4 parameters were optimized without error. To find a suitable cutoff-strain criterion, the NLSF analysis was therefore performed with various cutoff strains for all the creep curves. An optimum cutoff strain for defining the four Theta parameters accurately was found to be 3%. At the 3% cutoff strain, the predicted curves coincided well with the experimental ones. The variation of the four Theta parameters as a function of stress showed good linearity, and the creep curves were modeled well at the low stress levels. The predicted minimum creep rate showed good agreement with the experimental data. Also, for design use of Hastelloy-X, the plot of log stress versus log time to 1% strain was predicted, and the creep-rate curves with time and a cutoff strain at 950 °C were constructed numerically for a wide range of stresses by using the Theta projection method. (authors)
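    As a hedged illustration of the Theta projection form ε(t) = Θ1(1 − e^(−Θ2·t)) + Θ3(e^(Θ4·t) − 1), a single synthetic creep curve can be fitted with lsqcurvefit (Optimization Toolbox assumed); the strain data are invented, not the Hastelloy-X measurements.

```matlab
% Theta projection fit: strain(t) = th1*(1 - exp(-th2*t)) + th3*(exp(th4*t) - 1).
% Time (h) and creep-strain data are synthetic, not the Hastelloy-X measurements.
theta_model = @(th, t) th(1)*(1 - exp(-th(2)*t)) + th(3)*(exp(th(4)*t) - 1);

t       = linspace(0, 500, 60)';
th_true = [0.01 0.05 0.002 0.006];
strain  = theta_model(th_true, t) .* (1 + 0.02*randn(size(t)));

th0 = [0.005 0.01 0.001 0.001];                      % starting guess
lb  = zeros(1,4);                                    % keep Theta parameters positive
th  = lsqcurvefit(theta_model, th0, t, strain, lb, []);

plot(t, strain, '.', t, theta_model(th, t), '-');
xlabel('Time (h)'); ylabel('Creep strain');
fprintf('Theta = %s\n', mat2str(th, 3));
```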

  14. INTER-INTEGRATED CIRCUIT (I2C) AS A MULTI-MICROCONTROLLER COMMUNICATION SYSTEM USING THE ARDUINO PLATFORM AND MATLAB

    Directory of Open Access Journals (Sweden)

    I Nyoman Kusuma Wardana

    2016-06-01

    Full Text Available In applications that use a microcontroller as the main device, users are often faced with the problem of an insufficient number of pins available on a given microcontroller. There are two alternatives when pin usage becomes a crucial issue: replacing the type of microcontroller, or using more than one microcontroller (multi-microcontroller). Each alternative has its own advantages and disadvantages. In this study, the Inter-Integrated Circuit (I2C) protocol is applied to a multi-microcontroller, multi-sensor system using the Arduino platform controlled from MATLAB. One Master and two Slaves are tested. The Master and the Slaves are fully controlled through MATLAB; both Slaves run embedded Arduino programs, while the Master is driven by a MATLAB program. The results show that both Slaves can be controlled well, both for reading the attached sensors and for controlling LEDs, and that the I2C communication system was established successfully.

  15. Development of Graphical User Interface for Finite Element Analysis of Static Loading of a Column using MATLAB

    Directory of Open Access Journals (Sweden)

    Moses Omolayo PETINRIN

    2010-12-01

    Full Text Available In this work, the capability of the MATLAB software package to develop a graphical user interface (GUI) application was demonstrated. A GUI was successfully developed using the MATLAB programming language to study the behaviour of a suspended column under uniaxial static loading by solving a numerical model created with the finite element method (FEM). The comparison between the exact solution from previous research and the numerical analysis showed good agreement. The column average strain, average stress and average load are equivalent to, but more accurate than, the ones obtained when the whole column is taken as one element (two nodes for a one-dimensional linear finite element problem). It was established in this work that MATLAB is not only a software package for numerical computation but also one for application development.

  16. REFLECTED LIGHT CURVES, SPHERICAL AND BOND ALBEDOS OF JUPITER- AND SATURN-LIKE EXOPLANETS

    Energy Technology Data Exchange (ETDEWEB)

    Dyudina, Ulyana; Kopparla, Pushkar; Ingersoll, Andrew P.; Yung, Yuk L. [Division of Geological and Planetary Sciences, 150-21 California Institute of Technology, Pasadena, CA 91125 (United States); Zhang, Xi [University of California Santa Cruz 1156 High Street, Santa Cruz, CA 95064 (United States); Li, Liming [Department of Physics, University of Houston, Houston, TX 77204 (United States); Dones, Luke [Southwest Research Institute, 1050 Walnut Street, Suite 300, Boulder CO 80302 (United States); Verbiscer, Anne, E-mail: ulyana@gps.caltech.edu [Department of Astronomy, University of Virginia, Charlottesville, VA 22904-4325 (United States)

    2016-05-10

    Reflected light curves observed for exoplanets indicate that a few of them host bright clouds. We estimate how the light curve and total stellar heating of a planet depends on forward and backward scattering in the clouds based on Pioneer and Cassini spacecraft images of Jupiter and Saturn. We fit analytical functions to the local reflected brightnesses of Jupiter and Saturn depending on the planet’s phase. These observations cover broadbands at 0.59–0.72 and 0.39–0.5 μ m, and narrowbands at 0.938 (atmospheric window), 0.889 (CH4 absorption band), and 0.24–0.28 μ m. We simulate the images of the planets with a ray-tracing model, and disk-integrate them to produce the full-orbit light curves. For Jupiter, we also fit the modeled light curves to the observed full-disk brightness. We derive spherical albedos for Jupiter and Saturn, and for planets with Lambertian and Rayleigh-scattering atmospheres. Jupiter-like atmospheres can produce light curves that are a factor of two fainter at half-phase than the Lambertian planet, given the same geometric albedo at transit. The spherical albedo is typically lower than for a Lambertian planet by up to a factor of ∼1.5. The Lambertian assumption will underestimate the absorption of the stellar light and the equilibrium temperature of the planetary atmosphere. We also compare our light curves with the light curves of solid bodies: the moons Enceladus and Callisto. Their strong backscattering peak within a few degrees of opposition (secondary eclipse) can lead to an even stronger underestimate of the stellar heating.

  17. Statistical study of clone survival curves after irradiation in one or two stages. Comparison and generalization of different models

    International Nuclear Information System (INIS)

    Lachet, Bernard.

    1975-01-01

    A statistical study was carried out on 208 survival curves for chlorella subjected to γ or particle radiation. The computing programmes used were written in Fortran. The different experimental causes contributing to the variance of a survival rate are analyzed, so that the experiments can be planned accordingly. Each curve was fitted to four models by the weighted least-squares method applied to non-linear functions. The validity of the fits obtained can be checked by the F test. It was possible to define the confidence and prediction zones around an adjusted curve by weighting of the residual variance, in spite of the error on the doses delivered; the confidence limits can then be fixed for a dose estimated from an exact or measured survival. The four models adopted were compared for the precision of their fit (by a non-parametric simultaneous comparison test) and for the scatter of their adjusted parameters: Wideroe's model gives a very good fit to the experimental points at the cost of a scatter of its parameters, which robs them of their presumed meaning. A principal component analysis showed the statistical equivalence of the one- and two-hit target models. Division of the irradiation into two doses, the first fixed by the investigator, leads to families of curves whose equation was established from that of any basic model expressing the dose-survival relationship in one-stage irradiation [fr

  18. Optimal weights for circle fitting with discrete granular data

    International Nuclear Information System (INIS)

    Chernov, N.; Kolganova, E.; Ososkov, G.

    1995-01-01

    The problem of approximating data measured along a circle by modern detectors in high energy physics, such as RICH (Ring Imaging Cherenkov) detectors, is considered. Such detectors, having a discrete cell structure, register the energy dissipation produced by a passing elementary particle not at a single point, but in several adjacent cells over which this energy is distributed. The presence of background hits makes circle-fitting methods based on the least-squares fit inapplicable, due to their sensitivity to noise. In this paper it is shown that an efficient way to overcome these problems of curve fitting is a robust fitting technique based on a reweighted least-squares method with optimally chosen weights, obtained by the use of maximum-likelihood estimates. Results of numerical experiments are given, proving the high efficiency of the suggested method. 9 refs., 5 figs., 1 tab
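    The optimal, maximum-likelihood weights of the paper are not reproduced here; the sketch below only shows the general reweighted least-squares idea, using a Kasa-style algebraic circle fit and generic Cauchy weights on the radial residuals, applied to synthetic ring hits with background outliers.

```matlab
% Iteratively reweighted least-squares circle fit (Kasa linearization, Cauchy weights).
% Synthetic ring hits plus uniform background; the weights are generic, not the
% maximum-likelihood weights derived in the paper.
rng(1);
phi = 2*pi*rand(60,1);
x = 5 + 3*cos(phi) + 0.1*randn(60,1);            % ring of radius 3 centred at (5,-2)
y = -2 + 3*sin(phi) + 0.1*randn(60,1);
x = [x; 10*rand(15,1)];  y = [y; 10*rand(15,1) - 5];   % background hits

w = ones(size(x));
for it = 1:10
    A = [2*x, 2*y, ones(size(x))];
    b = x.^2 + y.^2;
    W = sqrt(w);
    p = (W.*A) \ (W.*b);                         % weighted linear least squares
    xc = p(1);  yc = p(2);
    R  = sqrt(p(3) + xc^2 + yc^2);
    res = sqrt((x - xc).^2 + (y - yc).^2) - R;   % radial residuals
    s   = 1.4826*median(abs(res)) + eps;         % robust scale estimate
    w   = 1 ./ (1 + (res/s).^2);                 % Cauchy-type weights suppress outliers
end
fprintf('centre = (%.2f, %.2f), radius = %.2f\n', xc, yc, R);
```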

  19. MATLAB for laser speckle contrast analysis (LASCA): a practice-based approach

    Science.gov (United States)

    Postnikov, Eugene B.; Tsoy, Maria O.; Postnov, Dmitry E.

    2018-04-01

    Laser Speckle Contrast Analysis (LASCA) is one of the most powerful modern methods for revealing blood dynamics. The experimental design and theory for this method are well established, and the computational recipe is often regarded as trivial. However, the achieved performance and spatial resolution may differ considerably between implementations. We provide a mini-review of known approaches to spatial laser speckle contrast data processing and their realization in MATLAB code, giving an explicit correspondence to the mathematical representation and a discussion of available implementations. We also present an algorithm based on the 2D Haar wavelet transform, likewise supplied with program code. This new method makes it possible to introduce horizontal, vertical and diagonal speckle contrasts, and it may be used for processing highly anisotropic images of vascular trees. We provide a comparative analysis of the accuracy of vascular pattern detection and of the processing times, with special attention to the details of the MATLAB procedures used.
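    As a minimal, toolbox-free sketch of the basic spatial contrast computation reviewed in this record (not the authors' Haar-wavelet variant), the local contrast K = σ/⟨I⟩ over a sliding window can be obtained with two convolutions; the input image here is random noise standing in for a raw speckle frame.

```matlab
% Spatial laser speckle contrast K = std/mean over a sliding NxN window,
% computed with plain convolutions (no toolboxes). The input is random noise
% standing in for a raw speckle frame.
I = rand(256);                         % placeholder raw speckle image
N = 7;  kern = ones(N)/N^2;            % averaging kernel

m  = conv2(I,    kern, 'same');        % local mean
m2 = conv2(I.^2, kern, 'same');        % local mean of squares
K  = sqrt(max(m2 - m.^2, 0)) ./ m;     % local contrast (clamped for round-off)

imagesc(K); axis image; colorbar;
title('Spatial speckle contrast (7x7 window)');
```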

  20. HEART: an automated beat-to-beat cardiovascular analysis package using Matlab.

    Science.gov (United States)

    Schroeder, M J Mark J; Perreault, Bill; Ewert, D L Daniel L; Koenig, S C Steven C

    2004-07-01

    A computer program is described for beat-to-beat analysis of cardiovascular parameters from high-fidelity pressure and flow waveforms. The Hemodynamic Estimation and Analysis Research Tool (HEART) is a post-processing analysis software package developed in Matlab that enables scientists and clinicians to document, load, view, calibrate, and analyze experimental data that have been digitally saved in ascii or binary format. Analysis routines include traditional hemodynamic parameter estimates as well as more sophisticated analyses such as lumped arterial model parameter estimation and vascular impedance frequency spectra. Cardiovascular parameter values of all analyzed beats can be viewed and statistically analyzed. An attractive feature of the HEART program is the ability to analyze data with visual quality assurance throughout the process, thus establishing a framework toward which Good Laboratory Practice (GLP) compliance can be obtained. Additionally, the development of HEART on the Matlab platform provides users with the flexibility to adapt or create study specific analysis files according to their specific needs. Copyright 2003 Elsevier Ltd.

  1. Using commercial simulators for determining flash distillation curves for petroleum fractions

    Directory of Open Access Journals (Sweden)

    Eleonora Erdmann

    2008-01-01

    Full Text Available This work describes a new method for estimating the equilibrium flash vaporisation (EFV) distillation curve for petroleum fractions by using commercial simulators. A commercial simulator was used to implement a stationary model for flash distillation; this model was adjusted using a distillation curve obtained from standard laboratory analytical assays. Such a curve can be one of many types (e.g. ASTM D86, D1160 or D2887) and involves an experimental procedure simpler than that required for obtaining an EFV curve. Any commercial simulator able to model petroleum can be used for the simulation (the HYSYS and CHEMCAD simulators were used here). Several types of petroleum and fractions were experimentally analysed to evaluate the proposed method; these data were then put into a process simulator (according to the proposed method) to estimate the corresponding EFV curves. The HYSYS- and CHEMCAD-estimated curves were close to the average Edmister and Maxwell curves produced by the two traditional estimation methods in all cases. The proposed method has several advantages: it avoids the need to obtain an EFV curve experimentally, it does not depend on the type of experimental curve used to fit the model, and it enables estimation at several pressures using just one experimental curve as data.

  2. Analysis of variation in calibration curves for Kodak XV radiographic film using model-based parameters.

    Science.gov (United States)

    Hsu, Shu-Hui; Kulasekere, Ravi; Roberson, Peter L

    2010-08-05

    Film calibration is time-consuming work when dose accuracy is essential while working in a range of photon scatter environments. This study uses the single-target single-hit model of film response to fit the calibration curves as a function of calibration method, processor condition, field size and depth. Kodak XV film was irradiated perpendicular to the beam axis in a solid water phantom. Standard calibration films (one dose point per film) were irradiated at 90 cm source-to-surface distance (SSD) for various doses (16-128 cGy), depths (0.2, 0.5, 1.5, 5, 10 cm) and field sizes (5 × 5, 10 × 10 and 20 × 20 cm²). The 8-field calibration method (eight dose points per film) was used as a reference for each experiment, taken at 95 cm SSD and 5 cm depth. The delivered doses were measured using an Attix parallel plate chamber for improved accuracy of dose estimation in the buildup region. Three fitting methods with one to three dose points per calibration curve were investigated for the field sizes of 5 × 5, 10 × 10 and 20 × 20 cm². The inter-day variation of model parameters (background, saturation and slope) were 1.8%, 5.7%, and 7.7% (1 σ) using the 8-field method. The saturation parameter ratio of standard to 8-field curves was 1.083 ± 0.005. The slope parameter ratio of standard to 8-field curves ranged from 0.99 to 1.05, depending on field size and depth. The slope parameter ratio decreases with increasing depth below 0.5 cm for the three field sizes. It increases with increasing depths above 0.5 cm. A calibration curve with one to three dose points fitted with the model is possible with 2% accuracy in film dosimetry for various irradiation conditions. The proposed fitting methods may reduce workload while providing energy dependence correction in radiographic film dosimetry. This study is limited to radiographic XV film with a Lumisys scanner.

  3. VQone MATLAB toolbox: A graphical experiment builder for image and video quality evaluations: VQone MATLAB toolbox.

    Science.gov (United States)

    Nuutinen, Mikko; Virtanen, Toni; Rummukainen, Olli; Häkkinen, Jukka

    2016-03-01

    This article presents VQone, a graphical experiment builder, written as a MATLAB toolbox, developed for image and video quality ratings. VQone contains the main elements needed for the subjective image and video quality rating process. This includes building and conducting experiments and data analysis. All functions can be controlled through graphical user interfaces. The experiment builder includes many standardized image and video quality rating methods. Moreover, it enables the creation of new methods or modified versions from standard methods. VQone is distributed free of charge under the terms of the GNU general public license and allows code modifications to be made so that the program's functions can be adjusted according to a user's requirements. VQone is available for download from the project page (http://www.helsinki.fi/psychology/groups/visualcognition/).

  4. Dose - Response Curves for Dicentrics and PCC Rings: Preparedness for Radiological Emergency in Thailand

    International Nuclear Information System (INIS)

    Rungsimaphorn, B.; Rerkamnuaychoke, B.; Sudprasert, W.

    2014-01-01

    Establishing in-vitro dose calibration curves is important for reconstruction of radiation dose in the exposed individuals. The aim of this pioneering work in Thailand was to generate dose-response curves using conventional biological dosimetry: dicentric chromosome assay (DCA) and premature chromosome condensation (PCC) assay. The peripheral blood lymphocytes were irradiated with 137 Cs at a dose rate of 0.652 Gy/min to doses of 0.1, 0.25, 0.5, 0.75, 1, 2, 3, 4 and 5 Gy for DCA technique, and 5, 10, 15, 20 and 25 Gy for PCC technique. The blood samples were cultured and processed following the standard procedure given by the IAEA with slight modifications. At least 500-1,000 metaphases or 100 dicentrics/ PCC rings were analyzed using an automated metaphase finder system. The yield of dicentrics with dose was fitted to a linear quadratic model using Chromosome Aberration Calculation Software (CABAS, version 2.0), whereas the dose-response curve of PCC rings was fitted to a linear relationship. These curves will be useful for in-vitro dose reconstruction and can support the preparedness for radiological emergency in the country.

  5. Coded Modulation in C and MATLAB

    Science.gov (United States)

    Hamkins, Jon; Andrews, Kenneth S.

    2011-01-01

    This software, written separately in C and MATLAB as stand-alone packages with equivalent functionality, implements encoders and decoders for a set of nine error-correcting codes and modulators and demodulators for five modulation types. The software can be used as a single program to simulate the performance of such coded modulation. The error-correcting codes implemented are the nine accumulate repeat-4 jagged accumulate (AR4JA) low-density parity-check (LDPC) codes, which have been approved for international standardization by the Consultative Committee for Space Data Systems, and which are scheduled to fly on a series of NASA missions in the Constellation Program. The software implements the encoder and decoder functions, and contains compressed versions of generator and parity-check matrices used in these operations.

  6. FITTING OF THE DATA FOR DIFFUSION COEFFICIENTS IN UNSATURATED POROUS MEDIA

    Energy Technology Data Exchange (ETDEWEB)

    B. Bullard

    1999-05-01

    The purpose of this calculation is to evaluate diffusion coefficients in unsaturated porous media for use in the TSPA-VA analyses. Using experimental data, regression techniques were used to curve fit the diffusion coefficient in unsaturated porous media as a function of volumetric water content. This calculation substantiates the model fit used in Total System Performance Assessment-1995 An Evaluation of the Potential Yucca Mountain Repository (TSPA-1995), Section 6.5.4.

  7. FITTING OF THE DATA FOR DIFFUSION COEFFICIENTS IN UNSATURATED POROUS MEDIA

    International Nuclear Information System (INIS)

    B. Bullard

    1999-01-01

    The purpose of this calculation is to evaluate diffusion coefficients in unsaturated porous media for use in the TSPA-VA analyses. Using experimental data, regression techniques were used to curve fit the diffusion coefficient in unsaturated porous media as a function of volumetric water content. This calculation substantiates the model fit used in Total System Performance Assessment-1995 An Evaluation of the Potential Yucca Mountain Repository (TSPA-1995), Section 6.5.4

  8. Quantitative description of the magnetization curves of amorphous alloys of the series a-Dy xGd 1-xNi

    Science.gov (United States)

    Barbara, B.; Amaral, V. S.; Filippi, J.

    1992-10-01

    The magnetization curves of the series of amorphous alloys Dy xGd 1- xNi measured between 1.5 and 4.2 K and up to 15 T, have been fitted to the zero kelvin analytical model of Chudnovsky [1]. The results of these fits allow a detailed understanding of the magnetization curves of amorphous alloys with ferromagnetic interactions. In particular, the ratio D/ J of the local anisotropy and exchange energies, and the magnetic and atomic correlation lengths, are accurately determined.

  9. A MATLAB toolbox for the analysis of articulatory data in the production of speech.

    Science.gov (United States)

    Nguyen, N

    2000-08-01

    The goal of this paper is to present EMATOOLS, a set of scripts for displaying and annotating acoustic and articulatory data simultaneously in studies on speech production. These scripts were developed with the use of MATLAB, a multiplatform computing environment for numeric computation and visualization. The system is equipped with a mouse-driven graphical interface made up of a number of displays. This interface can be easily customized to speed up routine tasks. The scripts can also be used in a noninteractive way, as stand-alone MATLAB commands. Output data can be imported into any standard spreadsheet. EMATOOLS is freely available from www.lpl.univ-aix.fr/nguyen/ematools.html.

  10. Analysis of Power Laws, Shape Collapses, and Neural Complexity: New Techniques and MATLAB Support via the NCC Toolbox.

    Science.gov (United States)

    Marshall, Najja; Timme, Nicholas M; Bennett, Nicholas; Ripp, Monica; Lautzenhiser, Edward; Beggs, John M

    2016-01-01

    Neural systems include interactions that occur across many scales. Two divergent methods for characterizing such interactions have drawn on the physical analysis of critical phenomena and the mathematical study of information. Inferring criticality in neural systems has traditionally rested on fitting power laws to the property distributions of "neural avalanches" (contiguous bursts of activity), but the fractal nature of avalanche shapes has recently emerged as another signature of criticality. On the other hand, neural complexity, an information theoretic measure, has been used to capture the interplay between the functional localization of brain regions and their integration for higher cognitive functions. Unfortunately, treatments of all three methods-power-law fitting, avalanche shape collapse, and neural complexity-have suffered from shortcomings. Empirical data often contain biases that introduce deviations from true power law in the tail and head of the distribution, but deviations in the tail have often been unconsidered; avalanche shape collapse has required manual parameter tuning; and the estimation of neural complexity has relied on small data sets or statistical assumptions for the sake of computational efficiency. In this paper we present technical advancements in the analysis of criticality and complexity in neural systems. We use maximum-likelihood estimation to automatically fit power laws with left and right cutoffs, present the first automated shape collapse algorithm, and describe new techniques to account for large numbers of neural variables and small data sets in the calculation of neural complexity. In order to facilitate future research in criticality and complexity, we have made the software utilized in this analysis freely available online in the MATLAB NCC (Neural Complexity and Criticality) Toolbox.
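    The NCC Toolbox should be used for serious analyses; as a hedged illustration of the maximum-likelihood idea for a continuous power law with a fixed lower cutoff (no upper cutoff and no x_min search, unlike the toolbox), the standard closed-form estimator is:

```matlab
% Maximum-likelihood exponent of a continuous power law p(x) ~ x^(-alpha), x >= xmin.
% Data are synthetic; the NCC Toolbox additionally handles cutoffs and xmin selection.
xmin  = 1;
alpha = 2.5;
u     = rand(1e4, 1);
x     = xmin * (1 - u).^(-1/(alpha - 1));        % inverse-CDF power-law samples

tail      = x(x >= xmin);
n         = numel(tail);
alpha_hat = 1 + n / sum(log(tail/xmin));         % closed-form MLE (Clauset et al.)
se        = (alpha_hat - 1) / sqrt(n);           % asymptotic standard error

fprintf('alpha_hat = %.3f +/- %.3f (true %.1f)\n', alpha_hat, se, alpha);
```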

  11. Intensity Conserving Spectral Fitting

    Science.gov (United States)

    Klimchuk, J. A.; Patsourakos, S.; Tripathi, D.

    2015-01-01

    The detailed shapes of spectral line profiles provide valuable information about the emitting plasma, especially when the plasma contains an unresolved mixture of velocities, temperatures, and densities. As a result of finite spectral resolution, the intensity measured by a spectrometer is the average intensity across a wavelength bin of non-zero size. It is assigned to the wavelength position at the center of the bin. However, the actual intensity at that discrete position will be different if the profile is curved, as it invariably is. Standard fitting routines (spline, Gaussian, etc.) do not account for this difference, and this can result in significant errors when making sensitive measurements. Detection of asymmetries in solar coronal emission lines is one example. Removal of line blends is another. We have developed an iterative procedure that corrects for this effect. It can be used with any fitting function, but we employ a cubic spline in a new analysis routine called Intensity Conserving Spline Interpolation (ICSI). As the name implies, it conserves the observed intensity within each wavelength bin, which ordinary fits do not. Given the rapid convergence, speed of computation, and ease of use, we suggest that ICSI be made a standard component of the processing pipeline for spectroscopic data.

  12. Establishment and validation of a dose-effect curve for γ-rays by cytogenetic analysis

    International Nuclear Information System (INIS)

    Barquinero, Joan F.; Caballin, Maria Rosa; Barrios, Leonardo; Ribas, Montserrat; Miro, Rosa; Egozcue, Josep

    1995-01-01

    A dose-effect curve obtained by analysis of dicentric chromosomes after irradiation of peripheral blood samples, from one donor, at 11 different doses of γ-rays is presented. For the elaboration of this curve, more than 18,000 first division metaphases have been analyzed. The results fit very well to the linear-quadratic model. To validate the curve, samples from six individuals (three controls and three occupationally exposed persons) were irradiated at 2 Gy. The results obtained, when compared with the curve, showed that in all cases the 95% confidence interval included the 2 Gy dose, with estimated dose ranges from 1.82 to 2.19 Gy

  13. Wind Turbine Blockset in Matlab/Simulink - General overview and description of the models

    Energy Technology Data Exchange (ETDEWEB)

    Iov, F.; Hansen, A.D.; Soerensen, P.; Blaabjerg, F.

    2004-03-01

    This report presents a new developed Matlab/Simulink Toolbox for wind turbine applications. This toolbox has been developed during the research project 'Simulation Platform to model, optimize and design wind turbines' and it has been used as a general developer tool for other three simulation tools: Saber, DIgSILENT, HAWC. The report provides first a quick overview over Matlab issues and then explains the structure of the developed toolbox. The attention in the report is mainly drawn to the description of the most important mathematical models, which have been developed in the Toolbox. Then, some simulation results using the developed models are shown. Finally, some general conclusions regarding this new developed Toolbox as well as some directions for future work are made. (au)

  14. Wind turbine blockset in Matlab/Simulink. General overview and description of the models

    Energy Technology Data Exchange (ETDEWEB)

    Iov, F.; Timbus, A.V.; Hansen, A.D.; Soerensen, P.; Blaabjerg, F.

    2004-03-01

    This report presents a new developed Matlab/Simulink Toolbox for wind turbine applications. This toolbox has been developed during the research project 'Simulation Platform to model, optimize and design wind turbines' and it has been used as a general developer tool for other three simulation tools: Saber, DIgSILENT, HAWC. The report provides first a quick overview over Matlab issues and then explains the structure of the developed toolbox. The attention in the report is mainly drawn to the description of the most important mathematical models, which have been developed in the Toolbox. Then, some simulation results using the developed models are shown. Finally, some general conclusions regarding this new developed Toolbox as well as some directions for future work are made. (au)

  15. EMPIRICALLY ESTIMATED FAR-UV EXTINCTION CURVES FOR CLASSICAL T TAURI STARS

    Energy Technology Data Exchange (ETDEWEB)

    McJunkin, Matthew; France, Kevin [Laboratory for Atmospheric and Space Physics, University of Colorado, 600 UCB, Boulder, CO 80303-7814 (United States); Schindhelm, Eric [Southwest Research Institute, 1050 Walnut Street, Suite 300, Boulder, CO 80302 (United States); Herczeg, Gregory [Kavli Institute for Astronomy and Astrophysics, Peking University, Yi He Yuan Lu 5, Haidian Qu, 100871 Beijing (China); Schneider, P. Christian [ESA/ESTEC, Keplerlaan 1, 2201 AZ Noordwijk (Netherlands); Brown, Alex, E-mail: matthew.mcjunkin@colorado.edu [Center for Astrophysics and Space Astronomy, University of Colorado, 593 UCB, Boulder, CO 80309-0593 (United States)

    2016-09-10

    Measurements of extinction curves toward young stars are essential for calculating the intrinsic stellar spectrophotometric radiation. This flux determines the chemical properties and evolution of the circumstellar region, including the environment in which planets form. We develop a new technique using H2 emission lines pumped by stellar Lyα photons to characterize the extinction curve by comparing the measured far-ultraviolet H2 line fluxes with model H2 line fluxes. The difference between model and observed fluxes can be attributed to the dust attenuation along the line of sight through both the interstellar and circumstellar material. The extinction curves are fit by a Cardelli et al. (1989) model, and the A_V(H2) values for the 10 targets studied with good extinction fits range from 0.5 to 1.5 mag, with R_V values ranging from 2.0 to 4.7. A_V and R_V are found to be highly degenerate, suggesting that one or the other needs to be calculated independently. Column densities and temperatures for the fluorescent H2 populations are also determined, with averages of log10(N(H2)) = 19.0 and T = 1500 K. This paper explores the strengths and limitations of the newly developed extinction curve technique in order to assess the reliability of the results and improve the method in the future.
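    The essence of the fitting step, comparing attenuated model line fluxes with observed fluxes over a grid of A_V and R_V, can be sketched as follows. The extinction shape functions, wavelengths, and fluxes below are generic placeholders (not the Cardelli et al. coefficients or the paper's measurements); a real analysis would substitute the full Cardelli et al. (1989) parameterization and proper flux uncertainties.

```matlab
% Grid search over A_V and R_V (illustrative placeholders throughout)
wave   = [1250 1350 1450 1550];                % Angstrom (hypothetical H2 lines)
fObs   = [1.0 1.4 1.9 2.6] * 1e-14;            % observed fluxes (hypothetical)
fModel = [3.0 3.6 4.2 5.0] * 1e-14;            % unattenuated model fluxes (hypothetical)

x    = 1e4 ./ wave;                            % inverse microns
aFun = @(x) 0.2 * x;                           % placeholder shape, NOT the CCM a(x)
bFun = @(x) 1.5 * x - 4;                       % placeholder shape, NOT the CCM b(x)

[AV, RV] = meshgrid(0:0.05:3, 2:0.1:6);
chi2 = zeros(size(AV));
for k = 1:numel(AV)
    Alam    = AV(k) * (aFun(x) + bFun(x) ./ RV(k));   % CCM-style A_lambda
    fAtt    = fModel .* 10.^(-0.4 * Alam);            % attenuated model fluxes
    chi2(k) = sum((fAtt - fObs).^2);                  % unweighted chi-square
end
[~, iBest] = min(chi2(:));
fprintf('Best fit: A_V = %.2f, R_V = %.1f\n', AV(iBest), RV(iBest));
```

    Plotting chi2 as a contour map over the (A_V, R_V) grid shows the elongated valley responsible for the degeneracy noted in the abstract.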

  16. Determination of Nuclear Track Parameters for LR-115 Detector by Using of MATLAB Software Technique

    International Nuclear Information System (INIS)

    AL-Jomaily, F.M.; AL-joburi, H.A.; Mheemeed, A.K.

    2013-01-01

    The nuclear track detector parameters, such as track diameter D(μm), number of tracks N_T and track area A_T, were determined using a MATLAB software technique for the LR-115 detector irradiated by alpha particles from a 241 Am source at 1.5, 2.5 and 3.5 MeV, with etching times T_B of 90, 120, 150 and 180 min. Using MATLAB image analysis of the nuclear tracks, the full width at half maximum (FWHM) and relative resolution R% were calculated for each alpha-particle energy. The study shows that increasing the alpha energy on the LR-115 detector requires longer etching times T_B; R% drops to a minimum value and then reaches a stable value at 1.5 and 2.5 MeV, while remaining unstable at 3.5 MeV. The MATLAB image analysis technique used in this study gives good and accurate results for the nuclear track detector parameters, and we recommend using it for the determination of these parameters.
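    A typical MATLAB workflow for extracting such track parameters from a scanned detector image is sketched below using the Image Processing Toolbox; the file name and thresholding choices are assumptions, not the authors' procedure.

```matlab
% Sketch of track counting and sizing (assumed workflow; 'tracks.png'
% is a hypothetical scanned LR-115 image).
img = imread('tracks.png');
if size(img, 3) == 3, img = rgb2gray(img); end
bw = imbinarize(imcomplement(img));      % dark tracks become white blobs
bw = bwareaopen(bw, 5);                  % discard small noise specks

stats = regionprops(bw, 'Area', 'EquivDiameter');
NT = numel(stats);                       % number of tracks N_T
AT = [stats.Area];                       % track areas A_T (pixels^2)
D  = [stats.EquivDiameter];              % equivalent track diameters (pixels)

% Diameter distribution, from which FWHM and R% can be estimated
histogram(D, 30); xlabel('Track diameter (pixels)'); ylabel('Counts');
```

    A pixel-to-micrometre calibration factor would convert the diameters and areas into the D(μm) and A_T values quoted in the abstract.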

  17. Virtual experiment of optical spatial filtering in Matlab environment

    Science.gov (United States)

    Ji, Yunjing; Wang, Chunyong; Song, Yang; Lai, Jiancheng; Wang, Qinghua; Qi, Jing; Shen, Zhonghua

    2017-08-01

    The principle of the spatial filtering experiment is introduced, and a computer simulation platform with a graphical user interface (GUI) has been built in the Matlab environment. With it, various filtering processes for different input images or different filtering purposes can be carried out accurately, and the filtering effect can be observed clearly while adjusting the experimental parameters. The physical nature of optical spatial filtering can thus be shown vividly, improving the effectiveness of experimental teaching.
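    The core of such a virtual 4f filtering experiment reduces to a Fourier transform, a pupil-plane mask, and an inverse transform. The sketch below is a minimal non-GUI version with an assumed test image and an arbitrary low-pass aperture; it is not the paper's GUI platform.

```matlab
% Minimal spatial-filtering simulation (assumed parameters)
img = im2double(imread('cameraman.tif'));    % standard grayscale test image
F   = fftshift(fft2(img));                   % field in the Fourier plane

[rows, cols] = size(img);
[u, v] = meshgrid(1:cols, 1:rows);
radius = 30;                                 % aperture radius in pixels (arbitrary)
mask   = (u - cols/2).^2 + (v - rows/2).^2 <= radius^2;   % circular low-pass aperture

filtered = real(ifft2(ifftshift(F .* mask)));
imshowpair(img, filtered, 'montage');        % compare input and filtered output
```

    Swapping the circular aperture for a high-pass stop or a slit reproduces the other classical filtering demonstrations such a platform can show.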

  18. Gamma-Ray Pulsar Light Curves as Probes of Magnetospheric Structure

    Science.gov (United States)

    Harding, A. K.

    2016-01-01

    The large number of gamma-ray pulsars discovered by the Fermi Gamma-Ray Space Telescope since its launch in 2008 dwarfs the handful that were previously known. The variety of observed light curves makes possible a tomography of both the ensemble-averaged field structure and the high-energy emission regions of a pulsar magnetosphere. Fitting the gamma-ray pulsar light curves with model magnetospheres and emission models has revealed that most of the high-energy emission, and the particle acceleration, takes place near or beyond the light cylinder, near the current sheet. As pulsar magnetosphere models become more sophisticated, it is possible to probe magnetic field structure and emission that are self-consistently determined. Light curve modeling will continue to be a powerful tool for constraining the pulsar magnetosphere physics.

  19. The method in γ spectrum analysis with artificial neural network based on MATLAB

    International Nuclear Information System (INIS)

    Bai Lixin; Zhang Yiyun; Xu Jiayun; Wu Liping

    2003-01-01

    Analyzing γ spectra with an artificial neural network has the advantages of using the information of the whole spectrum and of achieving high analysis precision. A convenient realization based on MATLAB is presented in this paper.
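    A minimal MATLAB sketch of the idea, training a feed-forward network on whole spectra, is shown below; the data sizes, target quantities, and network size are assumptions, since the abstract gives no details.

```matlab
% Sketch of whole-spectrum analysis with a feed-forward network
% (hypothetical data; requires the Deep Learning Toolbox)
Xtrain = rand(1024, 200);      % 200 training spectra, 1024 channels each
Ttrain = rand(3, 200);         % e.g. activities of 3 nuclides (hypothetical)

net = feedforwardnet(10);      % single hidden layer with 10 neurons
net = train(net, Xtrain, Ttrain);

Tpred = net(Xtrain(:, 1));     % analyze one spectrum with the trained network
```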

  20. ADAMS-MATLAB Co-Simulation of A Serial Manipulator

    Directory of Open Access Journals (Sweden)

    Parthasarathy Tejaswin

    2017-01-01

    This paper presents the dynamic modelling and simulation of a now-redundant robot, the Mitsubishi RM-501, and proposes a general algorithm for experimental simulation of the kinematics, dynamics and control analysis of any such robot. Through reverse engineering, a model as accurate as the real robot was developed in SolidWorks. Simulations of the same were performed in ADAMS (dynamic modeling software offered by MSC Software Corp) along with MATLAB for motion studies and control dynamics. Finally, with a user-input path, the accuracy and precision of the simulator were verified.