WorldWideScience

Sample records for multi-dimensional histogram technique

  1. Interpolation between multi-dimensional histograms using a new non-linear moment morphing method

    NARCIS (Netherlands)

    Baak, M.; Gadatsch, S.; Harrington, R.; Verkerke, W.

    2015-01-01

    A prescription is presented for the interpolation between multi-dimensional distribution templates based on one or multiple model parameters. The technique uses a linear combination of templates, each created using fixed values of the model's parameters and transformed according to a specific procedure, to model a non-linear dependency on model parameters and the dependency between them. By construction the technique scales well with the number of input templates used, which is a useful feature in modern day particle physics, where a large number of templates is often required to model the impact of systematic uncertainties.

  2. Interpolation between multi-dimensional histograms using a new non-linear moment morphing method

    Energy Technology Data Exchange (ETDEWEB)

    Baak, M., E-mail: max.baak@cern.ch [CERN, CH-1211 Geneva 23 (Switzerland); Gadatsch, S., E-mail: stefan.gadatsch@nikhef.nl [Nikhef, PO Box 41882, 1009 DB Amsterdam (Netherlands); Harrington, R. [School of Physics and Astronomy, University of Edinburgh, Mayfield Road, Edinburgh, EH9 3JZ, Scotland (United Kingdom); Verkerke, W. [Nikhef, PO Box 41882, 1009 DB Amsterdam (Netherlands)

    2015-01-21

    A prescription is presented for the interpolation between multi-dimensional distribution templates based on one or multiple model parameters. The technique uses a linear combination of templates, each created using fixed values of the model's parameters and transformed according to a specific procedure, to model a non-linear dependency on model parameters and the dependency between them. By construction the technique scales well with the number of input templates used, which is a useful feature in modern day particle physics, where a large number of templates are often required to model the impact of systematic uncertainties.

  3. Interpolation between multi-dimensional histograms using a new non-linear moment morphing method

    International Nuclear Information System (INIS)

    Baak, M.; Gadatsch, S.; Harrington, R.; Verkerke, W.

    2015-01-01

    A prescription is presented for the interpolation between multi-dimensional distribution templates based on one or multiple model parameters. The technique uses a linear combination of templates, each created using fixed values of the model's parameters and transformed according to a specific procedure, to model a non-linear dependency on model parameters and the dependency between them. By construction the technique scales well with the number of input templates used, which is a useful feature in modern day particle physics, where a large number of templates are often required to model the impact of systematic uncertainties.

  4. Interpolation between multi-dimensional histograms using a new non-linear moment morphing method

    CERN Document Server

    Baak, Max; Harrington, Robert; Verkerke, Wouter

    2014-01-01

    A prescription is presented for the interpolation between multi-dimensional distribution templates based on one or multiple model parameters. The technique uses a linear combination of templates, each created using fixed values of the model's parameters and transformed according to a specific procedure, to model a non-linear dependency on model parameters and the dependency between them. By construction the technique scales well with the number of input templates used, which is a useful feature in modern day particle physics, where a large number of templates is often required to model the impact of systematic uncertainties.

  5. Interpolation between multi-dimensional histograms using a new non-linear moment morphing method

    CERN Document Server

    Baak, Max; Harrington, Robert; Verkerke, Wouter

    2015-01-01

    A prescription is presented for the interpolation between multi-dimensional distribution templates based on one or multiple model parameters. The technique uses a linear combination of templates, each created using fixed values of the model's parameters and transformed according to a specific procedure, to model a non-linear dependency on model parameters and the dependency between them. By construction the technique scales well with the number of input templates used, which is a useful feature in modern day particle physics, where a large number of templates is often required to model the impact of systematic uncertainties.
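
    The essence of the method can be sketched in one dimension: templates built at a few reference values of the parameter are shifted and scaled so that their first two moments match the moments interpolated (here with Lagrange-basis weights) to the requested parameter value, and the transformed templates are then combined linearly. Below is a minimal numpy sketch under those assumptions; the function names are illustrative, and the paper's full multi-dimensional treatment (available in RooFit as the RooMomentMorph class) also handles correlations between parameters.

```python
import numpy as np

def lagrange_coeffs(m, m_nodes):
    """Weights c_i(m) of the Lagrange basis through the template nodes."""
    c = []
    for i, mi in enumerate(m_nodes):
        num = np.prod([m - mj for j, mj in enumerate(m_nodes) if j != i])
        den = np.prod([mi - mj for j, mj in enumerate(m_nodes) if j != i])
        c.append(num / den)
    return np.array(c)

def moment_morph_1d(m, m_nodes, samples_per_node, bins):
    """Interpolate a 1D histogram between sample-based templates
    by moment morphing."""
    c = lagrange_coeffs(m, m_nodes)
    mus = np.array([s.mean() for s in samples_per_node])
    sigmas = np.array([s.std() for s in samples_per_node])
    mu_m, sigma_m = c @ mus, c @ sigmas   # interpolated first two moments
    hist = np.zeros(len(bins) - 1)
    for ci, s, mu_i, sig_i in zip(c, samples_per_node, mus, sigmas):
        # Shift/scale template i so its moments match the interpolated ones.
        h, _ = np.histogram((s - mu_i) * (sigma_m / sig_i) + mu_m,
                            bins=bins, density=True)
        hist += ci * h
    return np.clip(hist, 0.0, None)   # negative weights can undershoot slightly
```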

  6. The Research of Histogram Enhancement Technique Based on Matlab Software

    Directory of Open Access Journals (Sweden)

    Li Kai

    2014-08-01

    Histogram enhancement has been widely applied as a standard technique in digital image processing. Working in Matlab, the paper applies the two approaches of histogram equalization and histogram specification to darker images, transforming the original histograms by the methods of partial equilibrium and histogram mapping and thereby enhancing the image information. The results show that both techniques can significantly improve image quality and enhance image features.
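
    The two operations the paper carries out in Matlab have compact array-language equivalents; a numpy sketch for 8-bit grayscale images (the paper itself works in Matlab):

```python
import numpy as np

def equalize(img):
    """Global histogram equalization of an 8-bit grayscale image."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum() / img.size
    lut = np.round(255 * cdf).astype(np.uint8)   # map each level via the CDF
    return lut[img]

def specify(img, ref):
    """Histogram specification: remap img so its histogram matches ref's."""
    src_cdf = np.bincount(img.ravel(), minlength=256).cumsum() / img.size
    ref_cdf = np.bincount(ref.ravel(), minlength=256).cumsum() / ref.size
    # For each source level, pick the reference level with the closest CDF.
    lut = np.searchsorted(ref_cdf, src_cdf).clip(0, 255).astype(np.uint8)
    return lut[img]
```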

  7. AHIMSA - Ad hoc histogram information measure sensing algorithm for feature selection in the context of histogram inspired clustering techniques

    Science.gov (United States)

    Dasarathy, B. V.

    1976-01-01

    An algorithm is proposed for dimensionality reduction in the context of clustering techniques based on histogram analysis. The approach is based on an evaluation of the hills and valleys in the unidimensional histograms along the different features and provides an economical means of assessing the significance of the features in a nonparametric unsupervised data environment. The method has relevance to remote sensing applications.
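
    The hills-and-valleys idea can be sketched as follows; this is not Dasarathy's information measure itself, just an assumed toy scoring of each feature by the structure of its one-dimensional histogram:

```python
import numpy as np

def hill_valley_counts(x, bins=32, smooth=3):
    """Count local maxima (hills) and minima (valleys) in a smoothed
    1-D histogram of one feature; multimodality hints at cluster structure."""
    h, _ = np.histogram(x, bins=bins)
    h = np.convolve(h, np.ones(smooth) / smooth, mode="same")  # light smoothing
    mid = h[1:-1]
    hills = ((mid > h[:-2]) & (mid > h[2:])).sum()
    valleys = ((mid < h[:-2]) & (mid < h[2:])).sum()
    return hills, valleys

def rank_features(X, keep=5):
    """Keep the features (columns of X) whose histograms show the most valleys."""
    scores = [hill_valley_counts(X[:, j])[1] for j in range(X.shape[1])]
    return np.argsort(scores)[::-1][:keep]
```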

  8. Motor Oil Classification using Color Histograms and Pattern Recognition Techniques.

    Science.gov (United States)

    Ahmadi, Shiva; Mani-Varnosfaderani, Ahmad; Habibi, Biuck

    2018-04-20

    Motor oil classification is important for quality control and the identification of oil adulteration. In this work, we propose a simple, rapid, inexpensive and nondestructive approach based on image analysis and pattern recognition techniques for the classification of nine different types of motor oils according to their corresponding color histograms. For this, we applied color histograms in different color spaces such as red green blue (RGB), grayscale, and hue saturation intensity (HSI) in order to extract features that can help with the classification procedure. These color histograms and their combinations were used as input for model development and then were statistically evaluated by using linear discriminant analysis (LDA), quadratic discriminant analysis (QDA), and support vector machine (SVM) techniques. Here, two common solutions for solving a multiclass classification problem were applied: (1) transformation to a binary classification problem using a one-against-all (OAA) approach and (2) extension from binary classifiers to a single globally optimized multilabel classification model. In the OAA strategy, LDA, QDA, and SVM reached up to 97% in terms of accuracy, sensitivity, and specificity for both the training and test sets. In the extension from the binary case, despite good performances by the SVM classification model, QDA and LDA provided better results, up to 92% for RGB-grayscale-HSI color histograms and up to 93% for the HSI color map, respectively. In order to reduce the number of independent variables for modeling, a principal component analysis algorithm was used. Our results suggest that the proposed method is promising for the identification and classification of different types of motor oils.
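
    A condensed sketch of the OAA pipeline under stated assumptions (scikit-learn as the modeling library; `images` and `labels` are hypothetical inputs): per-channel RGB histograms as features, PCA for variable reduction, and a one-vs-rest SVM:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.multiclass import OneVsRestClassifier
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

def rgb_histogram(img, bins=32):
    """Concatenated per-channel histograms of an HxWx3 uint8 image."""
    return np.concatenate([
        np.histogram(img[..., c], bins=bins, range=(0, 255), density=True)[0]
        for c in range(3)
    ])

def train_oaa_svm(images, labels):
    X = np.stack([rgb_histogram(im) for im in images])
    # PCA reduces the histogram features; OneVsRestClassifier makes SVC OAA.
    clf = make_pipeline(PCA(n_components=10),
                        OneVsRestClassifier(SVC(kernel="rbf")))
    return clf.fit(X, np.asarray(labels))
```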

  9. Multi-dimensional imaging

    CERN Document Server

    Javidi, Bahram; Andres, Pedro

    2014-01-01

    Provides a broad overview of advanced multidimensional imaging systems with contributions from leading researchers in the field. Multi-dimensional Imaging takes the reader from the introductory concepts through to the latest applications of these techniques. Split into three parts covering 3D image capture, processing, visualization and display, using (1) a multi-view approach and (2) a holographic approach, followed by a third part addressing other 3D systems approaches, applications and signal processing for advanced 3D imaging. This book describes recent developments, as well as the prospects and

  10. Evaluating spatial- and temporal-oriented multi-dimensional visualization techniques

    Directory of Open Access Journals (Sweden)

    Chong Ho Yu

    2003-07-01

    Visualization tools are said to be helpful for researchers to unveil hidden patterns and relationships among variables, and also for teachers to present abstract statistical concepts and complicated data structures in a concrete manner. However, higher-dimension visualization techniques can be confusing and even misleading, especially when human-instrument interface and cognitive issues are under-applied. In this article, the efficacy of function-based, data-driven, spatial-oriented, and temporal-oriented visualization techniques is discussed based upon extensive review. Readers can find practical implications for both research and instructional practices. For research purposes, spatial-based graphs, such as Trellis displays in S-Plus, are preferable to temporal-based displays, such as the 3D animated plot in SAS/Insight. For teaching purposes, temporal-based displays, such as the 3D animation plot in Maple, seem to have advantages over spatial-based graphs, such as the 3D triangular coordinate plot in SyStat.

  11. Contrast Enhancement Using Brightness Preserving Histogram Equalization Technique for Classification of Date Varieties

    Directory of Open Access Journals (Sweden)

    G Thomas

    2014-06-01

    Computer vision techniques are becoming popular for quality assessment of many products in food industries. Image enhancement is the first step in analyzing the images in order to obtain detailed information for the determination of quality. In this study, the brightness-preserving histogram equalization technique was used to enhance the features of gray-scale images to classify three date varieties (Khalas, Fard and Madina). Mean, entropy, kurtosis and skewness features were extracted from the original and enhanced images. Mean and entropy from original images and kurtosis from the enhanced images were selected based on Lukka's feature selection approach. An overall classification efficiency of 93.72% was achieved with just three features. The brightness-preserving histogram equalization technique has great potential to improve the classification of various quality attributes of food and agricultural products with minimum features.
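
    The four features used in the study are simple to compute; a sketch for an 8-bit grayscale image (entropy taken over the normalized histogram):

```python
import numpy as np
from scipy.stats import kurtosis, skew

def date_image_features(img):
    """Mean, entropy, kurtosis and skewness of a grayscale image."""
    p = np.bincount(img.ravel(), minlength=256) / img.size
    p = p[p > 0]
    entropy = -np.sum(p * np.log2(p))   # Shannon entropy of the histogram
    v = img.ravel().astype(float)
    return np.array([v.mean(), entropy, kurtosis(v), skew(v)])
```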

  12. Estimates of error introduced when one-dimensional inverse heat transfer techniques are applied to multi-dimensional problems

    International Nuclear Information System (INIS)

    Lopez, C.; Koski, J.A.; Razani, A.

    2000-01-01

    A study of the errors introduced when one-dimensional inverse heat conduction techniques are applied to problems involving two-dimensional heat transfer effects was performed. The geometry used for the study was a cylinder with similar dimensions as a typical container used for the transportation of radioactive materials. The finite element analysis code MSC P/Thermal was used to generate synthetic test data that was then used as input for an inverse heat conduction code. Four different problems were considered, including one with uniform flux around the outer surface of the cylinder and three with non-uniform flux applied over 360-degree, 180-degree, and 90-degree sections of the outer surface of the cylinder. The Sandia One-Dimensional Direct and Inverse Thermal (SODDIT) code was used to estimate the surface heat flux of all four cases. The error analysis was performed by comparing the results from SODDIT and the heat flux calculated based on the temperature results obtained from P/Thermal. Results showed an increase in error of the surface heat flux estimates as the applied heat became more localized. For the uniform case, SODDIT provided heat flux estimates with a maximum error of 0.5%, whereas for the non-uniform cases, the maximum errors were found to be about 3%, 7%, and 18% for the 360-degree, 180-degree, and 90-degree cases, respectively

  13. PROCESS PERFORMANCE EVALUATION USING HISTOGRAM AND TAGUCHI TECHNIQUE IN LOCK MANUFACTURING COMPANY

    Directory of Open Access Journals (Sweden)

    Hagos Berhane

    2013-12-01

    Process capability analysis is a vital part of an overall quality improvement program. It is a technique that has application in many segments of the product cycle, including product and process design, vendor sourcing, production or manufacturing planning, and manufacturing. Frequently, a process capability study involves observing a quality characteristic of the product. Since this information usually pertains to the product rather than the process, this analysis should strictly speaking be called a product analysis study. A true process capability study in this context would involve collecting data that relate to process parameters so that remedial actions can be identified on a timely basis. The present study attempts to analyze the performance of drilling, pressing, and reaming operations carried out for the manufacturing of two major lock components, viz. handle and lever plate, at Gaurav International, Aligarh (India). The data collected for depth of hole on handle, central hole diameter, and key hole diameter are used to construct histograms. Next, the information available in the frequency distribution table, the process mean, the process capability from calculations, and the specification limits provided by the manufacturing concern are used with the Taguchi technique. The data obtained from the histogram and Taguchi techniques combined are used to evaluate the performance of the manufacturing process. Results of this study indicated that the performance of all the processes used to produce depth of hole on handle, key hole diameter, and central hole diameter is potentially incapable, as the process capability indices are found to be 0.54, 0.54 and 0.76, respectively. The numbers of nonconforming parts expressed in parts per million (ppm) that have fallen outside the specification limits are found to be 140000, 26666.66, and 146666.66 for depth of hole on handle, central hole diameter, and key hole diameter, respectively. As a result, the total loss incurred
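
    The capability index and ppm figures follow from the sample mean, sample spread, and the specification limits. The study reads fallout off the observed histogram; the sketch below shows the usual normal-theory version (for a centred process with Cp = 0.54, the two-sided fallout is on the order of 10^5 ppm, consistent in magnitude with the reported 140000):

```python
import numpy as np
from scipy.stats import norm

def process_capability(x, lsl, usl):
    """Cp, Cpk and expected fallout (ppm) assuming approximate normality."""
    mu, sigma = np.mean(x), np.std(x, ddof=1)
    cp = (usl - lsl) / (6 * sigma)                  # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)     # accounts for centering
    # Normal-tail estimate of parts falling outside the specification limits.
    ppm = (norm.cdf(lsl, mu, sigma) + norm.sf(usl, mu, sigma)) * 1e6
    return cp, cpk, ppm
```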

  14. Multi-Dimensional Path Queries

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    1998-01-01

    We present the path-relationship model that supports multi-dimensional data modeling and querying. A path-relationship database is composed of sets of paths and sets of relationships. A path is a sequence of related elements (atoms, paths, and sets of paths). A relationship is a binary path... to create nested path structures. We present an SQL-like query language that is based on path expressions and we show how to use it to express multi-dimensional path queries that are suited for advanced data analysis in decision support environments like data warehousing environments...

  15. DNA IMAGE CYTOMETRY IN PROGNOSTICATION OF COLORECTAL CANCER: PRACTICAL CONSIDERATIONS OF THE TECHNIQUE AND INTERPRETATION OF THE HISTOGRAMS

    Directory of Open Access Journals (Sweden)

    Abdelbaset Buhmeida

    2011-05-01

    The role of DNA content as a prognostic factor in colorectal cancer (CRC) is highly controversial. Some of these controversies are due to purely technical reasons, e.g. variable practices in interpreting the DNA histograms, which is problematic particularly in advanced cases. In this report, we give a detailed account of various options for how these histograms could be optimally interpreted, with the idea of establishing the potential value of DNA image cytometry in prognosis and in selection of proper treatment. The material consists of nuclei isolated from 50 µm paraffin sections from 160 patients with stage II, III or IV CRC diagnosed, treated and followed-up in our clinic. The nuclei were stained with the Feulgen stain. Nuclear DNA was measured using computer-assisted image cytometry. We applied four different approaches to analyse the DNA histograms: (1) appearance of the histogram (the ABCDE approach), (2) range of DNA values, (3) peak evaluation, and (4) events present at high DNA values. Intra-observer reproducibility of these four histogram interpretations was 89%, 95%, 96%, and 100%, respectively. We depicted selected histograms to illustrate the four analytical approaches in cases with different stages of CRC, with variable disease outcome. In our analysis, the range of DNA values was the best prognosticator, i.e., the tumours with the widest histograms had the most ominous prognosis. These data implicate that DNA cytometry based on isolated nuclei is valuable in predicting the prognosis of CRC. Different interpretation techniques differed in their reproducibility, but the method showing the best prognostic value also had high reproducibility in our analysis.

  16. COLOUR IMAGE ENHANCEMENT BASED ON HISTOGRAM EQUALIZATION

    OpenAIRE

    Kanika Kapoor and Shaveta Arora

    2015-01-01

    Histogram equalization is a nonlinear technique for adjusting the contrast of an image using its histogram. It tends to shift the mean brightness of a gray-scale image away from the mean brightness of the original image. There are various types of histogram equalization techniques, such as Histogram Equalization, Contrast Limited Adaptive Histogram Equalization, Brightness Preserving Bi Histogram Equalization, Dualistic Sub Image Histogram Equalization, Minimum Mean Brightness Error Bi Histog...

  17. Investigating the Role of Global Histogram Equalization Technique for 99mTechnetium-Methylene diphosphonate Bone Scan Image Enhancement.

    Science.gov (United States)

    Pandey, Anil Kumar; Sharma, Param Dev; Dheer, Pankaj; Parida, Girish Kumar; Goyal, Harish; Patel, Chetan; Bal, Chandrashekhar; Kumar, Rakesh

    2017-01-01

    99mTechnetium-methylene diphosphonate (99mTc-MDP) bone scan images have a limited number of counts per pixel, and hence, they have inferior image quality compared to X-rays. Theoretically, the global histogram equalization (GHE) technique can improve the contrast of a given image, though the practical benefits of doing so have only limited acceptance. In this study, we have investigated the effect of the GHE technique on 99mTc-MDP bone scan images. A set of 89 low-contrast 99mTc-MDP whole-body bone scan images were included in this study. These images were acquired with parallel hole collimation on a Symbia E gamma camera. The images were then processed with the histogram equalization technique. The image quality of input and processed images was reviewed by two nuclear medicine physicians on a 5-point scale, where a score of 1 is for very poor and 5 is for the best image quality. A statistical test was applied to find the significance of the difference between the mean scores assigned to input and processed images. This technique improves the contrast of the images; however, oversaturation was noticed in the processed images. Student's t-test was applied, and a statistically significant difference in the input and processed image quality was found at P < 0.05, suggesting that the histogram equalization technique in combination with some other postprocessing technique is useful.

  18. Novel Variants of a Histogram Shift-Based Reversible Watermarking Technique for Medical Images to Improve Hiding Capacity

    Directory of Open Access Journals (Sweden)

    Vishakha Kelkar

    2017-01-01

    Full Text Available In telemedicine systems, critical medical data is shared on a public communication channel. This increases the risk of unauthorised access to patient’s information. This underlines the importance of secrecy and authentication for the medical data. This paper presents two innovative variations of classical histogram shift methods to increase the hiding capacity. The first technique divides the image into nonoverlapping blocks and embeds the watermark individually using the histogram method. The second method separates the region of interest and embeds the watermark only in the region of noninterest. This approach preserves the medical information intact. This method finds its use in critical medical cases. The high PSNR (above 45 dB obtained for both techniques indicates imperceptibility of the approaches. Experimental results illustrate superiority of the proposed approaches when compared with other methods based on histogram shifting techniques. These techniques improve embedding capacity by 5–15% depending on the image type, without affecting the quality of the watermarked image. Both techniques also enable lossless reconstruction of the watermark and the host medical image. A higher embedding capacity makes the proposed approaches attractive for medical image watermarking applications without compromising the quality of the image.

  19. Novel Variants of a Histogram Shift-Based Reversible Watermarking Technique for Medical Images to Improve Hiding Capacity

    Science.gov (United States)

    Tuckley, Kushal

    2017-01-01

    In telemedicine systems, critical medical data is shared on a public communication channel. This increases the risk of unauthorised access to patient's information. This underlines the importance of secrecy and authentication for the medical data. This paper presents two innovative variations of classical histogram shift methods to increase the hiding capacity. The first technique divides the image into nonoverlapping blocks and embeds the watermark individually using the histogram method. The second method separates the region of interest and embeds the watermark only in the region of noninterest. This approach preserves the medical information intact. This method finds its use in critical medical cases. The high PSNR (above 45 dB) obtained for both techniques indicates imperceptibility of the approaches. Experimental results illustrate superiority of the proposed approaches when compared with other methods based on histogram shifting techniques. These techniques improve embedding capacity by 5–15% depending on the image type, without affecting the quality of the watermarked image. Both techniques also enable lossless reconstruction of the watermark and the host medical image. A higher embedding capacity makes the proposed approaches attractive for medical image watermarking applications without compromising the quality of the image. PMID:29104744
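
    Both proposed variants (block-wise embedding and region-of-noninterest embedding) wrap the classical single-peak histogram-shift step. A sketch of that base operation is shown below; extraction reverses it using the stored peak/zero pair, which is what makes the scheme lossless:

```python
import numpy as np

def embed_histogram_shift(img, bits):
    """Classical histogram-shift embedding: free the bin next to the peak by
    shifting the bins between the peak and an empty bin, then encode one bit
    per peak-valued pixel (1 -> move into the freed bin, 0 -> stay put)."""
    h = np.bincount(img.ravel(), minlength=256)
    peak = int(h.argmax())
    empty = np.flatnonzero(h[peak + 1:] == 0)
    if empty.size == 0:
        raise ValueError("no empty bin above the peak; real schemes preprocess")
    zero = peak + 1 + int(empty[0])
    out = img.astype(np.int16)
    out[(out > peak) & (out < zero)] += 1     # shift: bin peak+1 is now free
    flat = out.ravel()
    slots = np.flatnonzero(flat == peak)      # capacity = count of peak pixels
    n = min(slots.size, len(bits))
    flat[slots[:n]] += np.asarray(bits[:n], dtype=np.int16)
    return flat.reshape(img.shape).astype(np.uint8), peak, zero
```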

  20. Histogram-based normalization technique on human brain magnetic resonance images from different acquisitions.

    Science.gov (United States)

    Sun, Xiaofei; Shi, Lin; Luo, Yishan; Yang, Wei; Li, Hongpeng; Liang, Peipeng; Li, Kuncheng; Mok, Vincent C T; Chu, Winnie C W; Wang, Defeng

    2015-07-28

    Intensity normalization is an important preprocessing step in brain magnetic resonance image (MRI) analysis. During MR image acquisition, different scanners or parameters would be used for scanning different subjects or the same subject at a different time, which may result in large intensity variations. This intensity variation will greatly undermine the performance of subsequent MRI processing and population analysis, such as image registration, segmentation, and tissue volume measurement. In this work, we proposed a new histogram normalization method to reduce the intensity variation between MRIs obtained from different acquisitions. In our experiment, we scanned each subject twice on two different scanners using different imaging parameters. With noise estimation, the image with the lower noise level was determined and treated as the high-quality reference image. Then the histogram of the low-quality image was normalized to the histogram of the high-quality image. The normalization algorithm includes two main steps: (1) intensity scaling (IS), where, for the high-quality reference image, the intensities of the image are first rescaled to a range between the low intensity region (LIR) value and the high intensity region (HIR) value; and (2) histogram normalization (HN), where the histogram of the low-quality image as input image is stretched to match the histogram of the reference image, so that the intensity range in the normalized image will also lie between LIR and HIR. We performed three sets of experiments to evaluate the proposed method, i.e., image registration, segmentation, and tissue volume measurement, and compared this with the existing intensity normalization method. The results validate that our histogram normalization framework achieves better results in all the experiments. It is also demonstrated that the brain template with normalization preprocessing is of higher quality than the template with no normalization processing. We have proposed
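
    The two steps (IS then HN) amount to a rescale followed by quantile mapping. A sketch under stated assumptions (LIR and HIR are supplied by the caller here, whereas the paper derives them from the reference image's histogram regions):

```python
import numpy as np

def normalize_to_reference(low_img, ref_img, lir, hir, n_quantiles=256):
    """(1) rescale the reference intensities to [LIR, HIR];
    (2) quantile-map (histogram-match) the low-quality image onto it."""
    ref = np.interp(ref_img, (ref_img.min(), ref_img.max()), (lir, hir))
    q = np.linspace(0, 100, n_quantiles)
    src_q = np.percentile(low_img, q)
    ref_q = np.percentile(ref, q)
    # Piecewise-linear map sending source quantiles onto reference quantiles.
    return np.interp(low_img, src_q, ref_q)
```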

  1. Multi-dimensional Fuzzy Euler Approximation

    Directory of Open Access Journals (Sweden)

    Yangyang Hao

    2017-05-01

    Multi-dimensional fuzzy differential equations driven by a multi-dimensional Liu process have been intensively applied in many fields. However, the analytic solution of a multi-dimensional fuzzy differential equation cannot always be obtained, so it is necessary to discuss numerical results in most situations. This paper focuses on numerical methods for multi-dimensional fuzzy differential equations. The multi-dimensional fuzzy Taylor expansion is given; based on this expansion, a numerical method designed to solve multi-dimensional fuzzy differential equations via the multi-dimensional Euler method is presented, and its local convergence is also discussed.

  2. Enhancement of Edge-based Image Quality Measures Using Entropy for Histogram Equalization-based Contrast Enhancement Techniques

    Directory of Open Access Journals (Sweden)

    H. T. R. Kurmasha

    2017-12-01

    An edge-based image quality measure (IQM) technique for the assessment of histogram equalization (HE)-based contrast enhancement techniques has been proposed that outperforms the Absolute Mean Brightness Error (AMBE) and Entropy, which are the most commonly used IQMs to evaluate histogram equalization based techniques, and also the two prominent fidelity-based IQMs, which are the Multi-Scale Structural Similarity (MSSIM) and Information Fidelity Criterion-based (IFC) measures. The statistical evaluation results show that the edge-based IQM, which was designed for detecting noise artifact distortion, has a Pearson Correlation Coefficient (PCC) > 0.86, while the others have poor or fair correlation to human opinion, considering Human Visual Perception (HVP). Based on HVP, this paper proposes an enhancement to the classic edge-based IQM by taking into account the brightness saturation distortion, which is the most prominent distortion in HE-based contrast enhancement techniques. It is tested and found to have a significantly good correlation (PCC > 0.87, Spearman rank order correlation coefficient (SROCC) > 0.92, Root Mean Squared Error (RMSE) < 0.1054, and Outlier Ratio (OR) = 0%).
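
    The four agreement statistics quoted above are standard; a sketch (the outlier threshold here is an assumption, and formal IQM evaluations usually fit the objective scores to the subjective scores before computing PCC and RMSE):

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

def iqm_agreement(iqm_scores, opinion_scores, outlier_tol=2.0):
    """PCC, SROCC, RMSE and outlier ratio of an IQM against human scores."""
    x = np.asarray(iqm_scores, float)
    y = np.asarray(opinion_scores, float)
    pcc = pearsonr(x, y)[0]
    srocc = spearmanr(x, y)[0]
    rmse = np.sqrt(np.mean((x - y) ** 2))
    outlier_ratio = np.mean(np.abs(x - y) > outlier_tol * y.std())
    return pcc, srocc, rmse, outlier_ratio
```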

  3. Reduction of multi-dimensional laboratory data to a two-dimensional plot: a novel technique for the identification of laboratory error.

    Science.gov (United States)

    Kazmierczak, Steven C; Leen, Todd K; Erdogmus, Deniz; Carreira-Perpinan, Miguel A

    2007-01-01

    The clinical laboratory generates large amounts of patient-specific data. Detection of errors that arise during pre-analytical, analytical, and post-analytical processes is difficult. We performed a pilot study, utilizing a multidimensional data reduction technique, to assess the utility of this method for identifying errors in laboratory data. We evaluated 13,670 individual patient records collected over a 2-month period from hospital inpatients and outpatients. We utilized those patient records that contained a complete set of 14 different biochemical analytes. We used two-dimensional generative topographic mapping to project the 14-dimensional record to a two-dimensional space. The use of a two-dimensional generative topographic mapping technique to plot multi-analyte patient data as a two-dimensional graph allows for the rapid identification of potentially anomalous data. Although we performed a retrospective analysis, this technique has the benefit of being able to assess laboratory-generated data in real time, allowing for the rapid identification and correction of anomalous data before they are released to the physician. In addition, serial laboratory multi-analyte data for an individual patient can also be plotted as a two-dimensional plot. This tool might also be useful for assessing patient wellbeing and prognosis.

  4. Two multi-dimensional uncertainty relations

    International Nuclear Information System (INIS)

    Skala, L; Kapsa, V

    2008-01-01

    Two multi-dimensional uncertainty relations, one related to the probability density and the other one related to the probability density current, are derived and discussed. Both relations are stronger than the usual uncertainty relations for the coordinates and momentum

  5. Multi-dimensional modeling of two-phase flow in rod bundles and interpretation of velocities measured in BWRs by the cross-correlation technique

    International Nuclear Information System (INIS)

    Analytis, G.Th.; Luebbesmeyer, D.

    1984-04-01

    The authors present an interpretation, as precise as possible, of velocity measurements in BWRs by the cross-correlation technique, which is based on the radially non-uniform quality and velocity distribution in BWR-type bundles, as well as on our knowledge about the spatial 'field of view' of the in-core neutron detectors. After formulating the three-dimensional two-fluid model volume/time-averaged equations and pointing out some problems associated with averaging, they expound a little on the turbulence mixing and void drift effects, as well as on the way they are modelled in advanced subchannel analysis codes like THERMIT or COBRA-TF. Subsequently, some comparisons are made between axial velocities measured in a commercial BWR by neutron noise analysis, and the steam velocities of the four subchannels nearest to the instrument tube of one of the four bundles as predicted by COBRA-III and by THERMIT. Although as expected, for well-known reasons, COBRA-III predicts subchannel steam velocities which are close to each other, THERMIT correctly predicts in the upper half of the core three largely different steam velocities in the three different types of BWR subchannels (corner, edge and interior). (Auth.)

  6. Impact of the radiotherapy technique on the correlation between dose–volume histograms of the bladder wall defined on MRI imaging and dose–volume/surface histograms in prostate cancer patients

    International Nuclear Information System (INIS)

    Maggio, Angelo; Carillo, Viviana; Perna, Lucia; Fiorino, Claudio; Cozzarini, Cesare; Rancati, Tiziana; Valdagni, Riccardo; Gabriele, Pietro

    2013-01-01

    The aim of this study was to evaluate the correlation between the 'true' absolute and relative dose–volume histograms (DVHs) of the bladder wall, the dose–wall histogram (DWH) defined on MRI imaging, and other surrogates of bladder dosimetry in prostate cancer patients, planned both with 3D-conformal and intensity-modulated radiation therapy (IMRT) techniques. For 17 prostate cancer patients, previously treated with radical intent, CT and MRI scans were acquired and matched. The contours of bladder walls were drawn by using MRI images. External bladder surfaces were then used to generate artificial bladder walls by performing automatic contractions of 5, 7 and 10 mm. For each patient a 3D conformal radiotherapy (3DCRT) and an IMRT treatment plan were generated with a prescription dose of 77.4 Gy (1.8 Gy/fr), and DVHs of the whole bladder and of the artificial walls (DVH5, DVH7 and DVH10) and dose–surface histograms (DSHs) were calculated and compared against the DWH in absolute and relative value, for both treatment planning techniques. A specific software (VODCA v. 4.4.0, MSS Inc.) was used for calculating the dose–volume/surface histograms. Correlation was quantified for selected dose–volume/surface parameters by the Spearman correlation coefficient. The agreement between %DWH and DVH5, DVH7 and DVH10 was found to be very good (maximum average deviations below 2%, SD < 5%): DVH5 showed the best agreement. The correlation was slightly better for absolute (R = 0.80–0.94) compared to relative (R = 0.66–0.92) histograms. The DSH was also found to be highly correlated with the DWH, although slightly higher deviations were generally found. The DVH was not a good surrogate of the DWH (R < 0.7 for most parameters). When comparing the two treatment techniques, more pronounced differences between relative histograms were seen for IMRT with respect to 3DCRT (p < 0.0001).

  7. A histogram-based technique for rapid vector extraction from PIV photographs

    Science.gov (United States)

    Humphreys, William M., Jr.

    1991-01-01

    A new analysis technique, performed totally in the image plane, is proposed which rapidly extracts all available vectors from individual interrogation regions on PIV photographs. The technique avoids the need for using Fourier transforms with the associated computational burden. The data acquisition and analysis procedure is described, and results of a preliminary simulation study to evaluate the accuracy of the technique are presented. Recently obtained PIV photographs are analyzed.

  8. Multi-Dimensional Aggregation for Temporal Data

    DEFF Research Database (Denmark)

    Böhlen, M. H.; Gamper, J.; Jensen, Christian Søndergaard

    2006-01-01

    Business Intelligence solutions, encompassing technologies such as multi-dimensional data modeling and aggregate query processing, are being applied increasingly to non-traditional data. This paper extends multi-dimensional aggregation to apply to data with associated interval values that capture... the case that the data holds for each point in the interval, as well as the case where the data holds only for the entire interval, but must be adjusted to apply to sub-intervals. The paper reports on an implementation of the new operator and on an empirical study that indicates that the operator scales to large data...

  9. The 'thousand words' problem: Summarizing multi-dimensional data

    International Nuclear Information System (INIS)

    Scott, David M.

    2011-01-01

    Research highlights: Sophisticated process sensors produce large multi-dimensional data sets. Plant control systems cannot handle images or large amounts of data. Various techniques reduce the dimensionality, extracting information from raw data. Simple 1D and 2D methods can often be extended to 3D and 4D applications. Abstract: An inherent difficulty in the application of multi-dimensional sensing to process monitoring and control is the extraction and interpretation of useful information. Ultimately the measured data must be collapsed into a relatively small number of values that capture the salient characteristics of the process. Although multiple dimensions are frequently necessary to isolate a particular physical attribute (such as the distribution of a particular chemical species in a reactor), plant control systems are not equipped to use such data directly. The production of a multi-dimensional data set (often displayed as an image) is not the final step of the measurement process, because information must still be extracted from the raw data. In the metaphor of one picture being equal to a thousand words, the problem becomes one of paraphrasing a lengthy description of the image with one or two well-chosen words. Various approaches to solving this problem are discussed using examples from the fields of particle characterization, image processing, and process tomography.

  10. Multi-dimensional quasitoeplitz Markov chains

    Directory of Open Access Journals (Sweden)

    Alexander N. Dudin

    1999-01-01

    This paper deals with multi-dimensional quasitoeplitz Markov chains. We establish a sufficient equilibrium condition and derive a functional matrix equation for the corresponding vector-generating function, whose solution is given algorithmically. The results are demonstrated in the form of examples and applications in queues with BMAP input, which operate in a synchronous random environment.

  11. Reproducibility of brain ADC histograms

    International Nuclear Information System (INIS)

    Steens, S.C.A.; Buchem, M.A. van; Admiraal-Behloul, F.; Schaap, J.A.; Hoogenraad, F.G.C.; Wheeler-Kingshott, C.A.M.; Tofts, P.S.; Cessie, S. le

    2004-01-01

    The aim of this study was to assess the effect of differences in acquisition technique on whole-brain apparent diffusion coefficient (ADC) histogram parameters, as well as to assess scan-rescan reproducibility. Diffusion-weighted imaging (DWI) was performed in 7 healthy subjects with b-values 0-800, 0-1000, and 0-1500 s/mm², and fluid-attenuated inversion recovery (FLAIR) DWI with b-values 0-1000 s/mm². All sequences were repeated with and without repositioning. The peak location, peak height, and mean ADC of the ADC histograms and mean ADC of a region of interest (ROI) in the white matter were compared using paired-sample t tests. Scan-rescan reproducibility was assessed using paired-sample t tests, and repeatability coefficients were reported. With increasing maximum b-values, ADC histograms shifted to lower values, with an increase in peak height (p<0.01). With FLAIR DWI, the ADC histogram shifted to lower values with a significantly higher, narrower peak (p<0.01), although the ROI mean ADC showed no significant differences. For scan-rescan reproducibility, no significant differences were observed. Different DWI pulse sequences give rise to different ADC histograms. With a given pulse sequence, however, ADC histogram analysis is a robust and reproducible technique. Using FLAIR DWI, the partial-voluming effect of cerebrospinal fluid, and thus its confounding effect on histogram analyses, can be reduced
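
    The three histogram parameters compared in the study reduce to a few lines given an ADC map and a brain mask (the binning choice here is an assumption and affects the peak estimates):

```python
import numpy as np

def adc_histogram_parameters(adc_map, brain_mask, bins=128):
    """Peak location, peak height and mean of a whole-brain ADC histogram."""
    values = adc_map[brain_mask > 0]
    h, edges = np.histogram(values, bins=bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    i = h.argmax()
    return centers[i], h[i], values.mean()
```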

  12. Multi-dimensional Laplace transforms and applications

    International Nuclear Information System (INIS)

    Mughrabi, T.A.

    1988-01-01

    In this dissertation we establish new theorems for computing certain types of multidimensional Laplace transform pairs from known one-dimensional Laplace transforms. The theorems are applied to the most commonly used special functions, and so we obtain many two- and three-dimensional Laplace transform pairs. As applications, some boundary value problems involving linear partial differential equations are solved by the use of multi-dimensional Laplace transformation. We also establish some relations between the Laplace transformation and other integral transformations in two variables.
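
    For reference, the two-dimensional member of the transform family the dissertation works with has the standard form (higher dimensions extend the iterated integral in the obvious way):

```latex
F(s_1, s_2) \;=\; \mathcal{L}_2\{f\}(s_1, s_2)
           \;=\; \int_0^{\infty}\!\!\int_0^{\infty}
                 e^{-s_1 t_1 - s_2 t_2}\, f(t_1, t_2)\,\mathrm{d}t_1\,\mathrm{d}t_2 .
```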

  13. Multi-dimensional medical images compressed and filtered with wavelets

    International Nuclear Information System (INIS)

    Boyen, H.; Reeth, F. van; Flerackers, E.

    2002-01-01

    other direction for 3D, and even more complex figures for 4D, because each dimension in each level halves the data to be transformed. After calculating the complex boundaries for filtering with band- and notch-filters on 1D- to 4D-data, the non-standard decomposition gives a fast wavelet algorithm. We propose a new method for calculating the complex geometric figures formed by band- and notch-filters when using the non-standard decomposition, making it possible to compress and filter multi-dimensional images with those wavelet techniques as well. This leads to faster wavelet algorithms to compress and filter multi-dimensional medical images. (author)

  14. Transport stochastic multi-dimensional media

    International Nuclear Information System (INIS)

    Haran, O.; Shvarts, D.

    1996-01-01

    Many physical phenomena evolve according to known deterministic rules, but in stochastic media in which the composition changes in space and time. Examples of such phenomena are heat transfer in a turbulent atmosphere with non-uniform diffraction coefficients, neutron transfer in the boiling coolant of a nuclear reactor, and radiation transfer through concrete shields. The results of measurements conducted upon such media are stochastic by nature, and depend on the specific realization of the media. In the last decade there have been considerable efforts to describe linear particle transport in one-dimensional stochastic media composed of several immiscible materials. However, transport in two- or three-dimensional stochastic media has rarely been addressed. The important effect in multi-dimensional transport that does not appear in one dimension is the ability to bypass obstacles. The current work is an attempt to quantify this effect. (authors)

  15. Transport stochastic multi-dimensional media

    Energy Technology Data Exchange (ETDEWEB)

    Haran, O; Shvarts, D [Israel Atomic Energy Commission, Beersheba (Israel). Nuclear Research Center-Negev; Thiberger, R [Ben-Gurion Univ. of the Negev, Beersheba (Israel)

    1996-12-01

    Many physical phenomena evolve according to known deterministic rules, but in stochastic media in which the composition changes in space and time. Examples of such phenomena are heat transfer in a turbulent atmosphere with non-uniform diffraction coefficients, neutron transfer in the boiling coolant of a nuclear reactor, and radiation transfer through concrete shields. The results of measurements conducted upon such media are stochastic by nature, and depend on the specific realization of the media. In the last decade there have been considerable efforts to describe linear particle transport in one-dimensional stochastic media composed of several immiscible materials. However, transport in two- or three-dimensional stochastic media has rarely been addressed. The important effect in multi-dimensional transport that does not appear in one dimension is the ability to bypass obstacles. The current work is an attempt to quantify this effect. (authors).

  16. Application of whole-lesion histogram analysis of pharmacokinetic parameters in dynamic contrast-enhanced MRI of breast lesions with the CAIPIRINHA-Dixon-TWIST-VIBE technique.

    Science.gov (United States)

    Li, Zhiwei; Ai, Tao; Hu, Yiqi; Yan, Xu; Nickel, Marcel Dominik; Xu, Xiao; Xia, Liming

    2018-01-01

    To investigate the application of whole-lesion histogram analysis of pharmacokinetic parameters for differentiating malignant from benign breast lesions on dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI). In all, 92 women with 97 breast lesions (26 benign and 71 malignant lesions) were enrolled in this study. Patients underwent dynamic breast MRI at 3T using a prototypical CAIPIRINHA-Dixon-TWIST-VIBE (CDT-VIBE) sequence and a subsequent surgery or biopsy. The inflow rate of the agent between plasma and interstitium (Ktrans), the outflow rate of the agent between interstitium and plasma (Kep), and the extravascular space volume per unit volume of tissue (ve), including the mean value, 25th/50th/75th/90th percentiles, skewness, and kurtosis, were then calculated based on the whole lesion. A single-sample Kolmogorov-Smirnov test, paired t-test, and receiver operating characteristic (ROC) curve analysis were used for statistical analysis. Malignant breast lesions had significantly higher Ktrans and Kep, lower ve in mean values and 25th/50th/75th/90th percentiles, and significantly higher skewness of ve than benign breast lesions (all P < 0.05). The 90th percentile of Ktrans, the 90th percentile of Kep, and the 50th percentile of ve showed the greatest areas under the ROC curve (AUC) for each pharmacokinetic parameter derived from DCE-MRI. The 90th percentile of Kep achieved the highest AUC value (0.927) among all histogram-derived values. The whole-lesion histogram analysis of pharmacokinetic parameters can improve the diagnostic accuracy of breast DCE-MRI with the CDT-VIBE technique. The 90th percentile of Kep may be the best indicator in differentiation between malignant and benign breast lesions. Level of Evidence: 4. Technical Efficacy: Stage 2. J. Magn. Reson. Imaging 2018;47:91-96. © 2017 International Society for Magnetic Resonance in Medicine.
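
    The whole-lesion descriptors are plain order statistics and moments over the voxels inside the lesion mask; a sketch (the commented AUC line shows how a single descriptor such as the 90th percentile would be scored, with `labels` and `cases` as hypothetical inputs):

```python
import numpy as np
from scipy.stats import kurtosis, skew
from sklearn.metrics import roc_auc_score

def lesion_histogram_stats(param_map, lesion_mask):
    """Mean, 25/50/75/90th percentiles, skewness and kurtosis of a
    pharmacokinetic map (e.g. Ktrans) over the whole lesion."""
    v = param_map[lesion_mask > 0].astype(float)
    return np.concatenate([[v.mean()],
                           np.percentile(v, [25, 50, 75, 90]),
                           [skew(v), kurtosis(v)]])

# AUC of the 90th percentile (index 4) for malignant (1) vs benign (0) labels:
# auc = roc_auc_score(labels, [lesion_histogram_stats(m, s)[4] for m, s in cases])
```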

  17. The Amazing Histogram.

    Science.gov (United States)

    Vandermeulen, H.; DeWreede, R. E.

    1983-01-01

    Presents a histogram drawing program which sorts real numbers in up to 30 categories. Entered data are sorted and saved in a text file which is then used to generate the histogram. Complete Applesoft program listings are included. (JN)

  18. Information granules in image histogram analysis.

    Science.gov (United States)

    Wieclawek, Wojciech

    2018-04-01

    A concept of granular computing employed in intensity-based image enhancement is discussed. First, a weighted granular computing idea is introduced. Then, the implementation of this term in the image processing area is presented. Finally, multidimensional granular histogram analysis is introduced. The proposed approach is dedicated to digital images, especially to medical images acquired by Computed Tomography (CT). Like the histogram equalization approach, this method is based on image histogram analysis. Yet, unlike the histogram equalization technique, it works on a selected range of the pixel intensity and is controlled by two parameters. Performance is tested on anonymous clinical CT series. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Parotid gland tumors: A comparison of postoperative radiotherapy techniques using three dimensional (3D) dose distributions and dose-volume histograms (DVHs)

    International Nuclear Information System (INIS)

    Yaparpalvi, Ravindra; Fontenla, Doracy P.; Tyerech, Sangeeta K.; Boselli, Lucia R.; Beitler, Jonathan J.

    1998-01-01

    Purpose: To compare different treatment techniques for unilateral treatment of parotid gland tumors. Methods and Materials: The CT-scans of a representative parotid patient were used. The field size was 9 x 11 cm, the separation was 15.5 cm, and the prescription depth was 4.5 cm. Using 3D dose distributions, tissue inhomogeneity corrections, scatter integration (for photons) and pencil beam (for electrons) algorithms and dose-volume histograms (DVHs), nine treatment techniques were compared: [1] unilateral 6 MV photons; [2] unilateral 12 MeV electrons; [3] unilateral 16 MeV electrons; [4] an ipsilateral wedge pair technique using 6 MV photons; [5] a 3-field AP (wedged), PA (wedged) and lateral portal technique using 6 MV photons; [6] a mixed beam technique using 6 MV photons and 12 MeV electrons (1:4 weighting); [7] a mixed beam technique using 6 MV photons and 16 MeV electrons (1:4 weighting); [8] a mixed beam technique using 18 MV photons and 20 MeV electrons (2:3 weighting); [9] a mixed beam technique using 18 MV photons and 20 MeV electrons (1:1 weighting). Results: Using dose-volume histograms to evaluate the dose to the contralateral parotid gland, the percentage of contralateral parotid volume receiving ≥ 30% of the prescribed dose was 100% for techniques [1], [8] and [9], and < 5% for techniques [2] through [7]. Evaluating the 'hottest' 5 cc of the ipsilateral mandible and temporal lobes, the hot spots were: 152% and 150% for technique [2], 132% and 130% for technique [6]. Comparing the exit doses, techniques [1], [8] and [9] contributed ≥ 50% of the prescribed dose to the contralateral mandible and the temporal lobes. Only techniques [2] and [6] kept the highest point doses to both the brain stem and the spinal cord below 50% of the prescribed dose. Conclusion: The single photon lateral field [1] and the mixed electron-photon beams [8] and [9] are not recommended treatment techniques for unilateral parotid irradiation because of high doses delivered to the

  20. Limits of dose escalation in lung cancer: a dose-volume histogram analysis comparing coplanar and non-coplanar techniques

    Energy Technology Data Exchange (ETDEWEB)

    Derycke, S; Van Duyse, B; Schelfhout, J; De Neve, W

    1995-12-01

    To evaluate the feasibility of dose escalation in radiotherapy of inoperable lung cancer, a dose-volume histogram analysis was performed comparing standard coplanar (2D) with non-coplanar (3D) beam arrangements on a non-selected group of 20 patients planned by Sherouse's GRATIS 3D-planning system. Serial CT-scanning was performed and 2 Target Volumes (TVs) were defined. Gross Tumor Volume (GTV) defined a high-dose Target Volume (TV-1). GTV plus location of node stations with > 10% probability of invasion (Minet et al.) defined an intermediate-dose Target Volume (TV-2). However, nodal regions which are incompatible with cure were excluded from TV-2. These are ATS-regions 1, 8, 9 and 14, all left and right, as well as heterolateral regions. For 3D-planning, Beam's Eye View selected (by an experienced planner) beam arrangements were optimised using Superdot, a method of target dose-gradient annihilation developed by Sherouse. A second 3D-planning was performed using 4 beam incidences with maximal angular separation. The linac's isocenter for the optimal arrangement was located at the geometrical center of gravity of a tetrahedron, the tetrahedron's corners being the consecutive positions of the virtual source. This ideal beam arrangement was approximated as closely as possible, taking into account technical limitations (patient-couch-gantry collisions). Criteria for tolerance were met if no points inside the spinal cord exceeded 50 Gy and if at least 50% of the lung volume received less than 20 Gy. If dose regions below 50 Gy were judged acceptable at TV-2, 2D- as well as 3D-plans allow safe escalation to 80 Gy at TV-1. When TV-2 needed to be encompassed by isodose surfaces exceeding 50 Gy, 3D-plans were necessary to limit the dose at the spinal cord below tolerance. For large TVs the dose is limited by lung tolerance for 3D-plans. An analysis (including NTCP-TCP as cost functions) of rival 3D-plans is being performed.

  1. The histogramming tool hparse

    International Nuclear Information System (INIS)

    Nikulin, V.; Shabratova, G.

    2005-01-01

    A general-purpose package aimed at simplifying histogramming in data analysis is described. The proposed dedicated language for writing histogramming scripts provides an effective and flexible tool for defining complicated histogram sets. A script is more transparent and much easier to maintain than the corresponding C++ code. In TTree analysis it can be a good complement to the TTreeViewer class: the TTreeViewer is used to choose the required histogram/cut set, while hparse generates code for the systematic analysis.

  2. Steganalytic methods for the detection of histogram shifting data-hiding schemes

    OpenAIRE

    Lerch Hostalot, Daniel

    2011-01-01

    In this paper, some steganalytic techniques designed to detect the existence of hidden messages embedded using histogram shifting methods are presented. Firstly, some techniques to identify specific methods of histogram shifting, based on visible marks on the histogram or abnormal statistical distributions, are suggested. Then, we present a general technique capable of detecting all histogram shifting techniques analyzed. This technique is based on the effect of histogram shifting methods on the "volat...
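
    The "visible marks" class of detectors looks for the near-empty bins that shifting leaves behind; a minimal sketch of that idea (the relative-drop threshold is an assumption, and the paper's general detector uses a different, "volatility"-style statistic):

```python
import numpy as np

def suspicious_gaps(img, rel_drop=0.05):
    """Flag near-empty bins sandwiched between well-populated neighbours,
    a typical artifact of histogram-shift embedding."""
    h = np.bincount(img.ravel(), minlength=256).astype(float)
    nbr = 0.5 * (h[:-2] + h[2:])                  # mean of the two neighbours
    gaps = np.flatnonzero((h[1:-1] < rel_drop * nbr) & (nbr > 0)) + 1
    return gaps   # bin indices that look like shift artifacts
```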

  3. Parotid gland tumors: a comparison of postoperative radiotherapy techniques using three dimensional (3-D) dose distributions and dose-volume histograms (DVH)

    International Nuclear Information System (INIS)

    Yaparpalvi, R.; Tyerech, S.K.; Boselli, L.R.; Fontenla, D.P.; Beitler, J.J.; Vikram, B.

    1996-01-01

    Purpose/Objective: To compare different treatment techniques for unilateral treatment of parotid gland tumors. Materials and Methods: Twenty patients previously treated postoperatively for parotid gland tumors were retrospectively reviewed. The average field size was 9 x 11 cm, the average separation was 15.5 cm, and the average prescription depth was 4.5 cm. Using 3-D dose distributions, tissue inhomogeneity corrections, scatter integration (for photons) and pencil beam (for electrons) algorithms and DVHs, nine treatment techniques were compared using a representative patient. The treatment techniques investigated were: [1] unilateral 6 MV photons. [2] unilateral 12 MeV electrons. [3] unilateral 16 MeV electrons. [4] an ipsilateral wedge pair technique using 6 MV photons and a 45-degree wedge. [5] a 3-field AP (wedged), PA (wedged) and lateral portal technique using 6 MV photons. [6] a mixed beam technique using 6 MV photons and 12 MeV electrons (1:4 weighting). [7] a mixed beam technique using 6 MV photons and 16 MeV electrons (1:4 weighting). [8] a mixed beam technique using 18 MV photons and 20 MeV electrons (2:3 weighting). [9] a mixed beam technique using 18 MV photons and 20 MeV electrons (1:1 weighting). Results: Using dose-volume histograms to evaluate the dose to the contralateral parotid gland, the percentage of contralateral parotid volume receiving ≥30% of the prescribed dose was 100% for techniques [1], [8] and [9], and <5% for techniques [2] through [7]. Evaluating the 'hottest' 5 cc of the ipsilateral mandible and temporal lobes, the hot spots were: 152% and 150% for technique [2], 132% and 130% for technique [6]. Comparing the exit doses, techniques [1] and [8] contributed ≥50% of the prescribed dose to the contralateral mandible and the temporal lobes. Only techniques [2] and [6] kept the highest point doses to both the brain stem and the spinal cord below 50% of the prescribed dose. Conclusion: The single photon lateral field [1] and the mixed

  4. SU-F-T-254: Dose Volume Histogram (DVH) Analysis of Breath Hold Vs Free Breathing Techniques for Esophageal Tumors

    Energy Technology Data Exchange (ETDEWEB)

    Badkul, R; Doke, K; Pokhrel, D; Aguilera, N; Lominska, C [University of Kansas Medical Center, Kansas City, KS (United States)

    2016-06-15

    Purpose: Lung and heart doses and associated toxicity are of concern in radiotherapy for esophageal cancer. This study evaluates the dosimetry of the deep-inspiration-breath-hold (DIBH) technique as compared to free-breathing (FB) using 3D-conformal treatment (3D-CRT) of esophageal cancer. Methods: Eight patients were planned with FB and DIBH CT scans. DIBH scans were acquired using the Varian RPM system. FB and DIBH CTs were contoured per RTOG-1010 to create the planning target volume (PTV) as well as organs-at-risk volumes (OARs). Two sets of gross target volumes (GTVs) of 5 cm length were contoured for each patient: proximal at the level of the carina and distal at the level of the gastroesophageal junction; these were enlarged with appropriate margins to generate the Clinical Target Volume and PTV. 3D-CRT plans were created in the Eclipse planning system to deliver 45 Gy to 95% of the PTV in 25 fractions for both proximal and distal tumors on FB and DIBH scans. For distal tumors celiac nodes were covered electively. DVH parameters for lung and heart OARs were generated and analyzed. Results: All DIBH DVH parameters were normalized to FB plan values. The averages of heart-mean and heart-V40 were 0.70 and 0.66 for proximal lesions. For distal lesions the ratios were 1.21 and 2.22, respectively. For DIBH the total lung volume increased by 2.43 times versus the FB scan. The averages of lung-mean, V30, V20, V10 and V5 were 0.82, 0.92, 0.76, 0.77 and 0.79 for proximal lesions and 1.17, 0.66, 0.87, 0.93 and 1.03 for distal lesions. Heart doses were lower for breath-hold proximal lesions but higher for distal lesions as compared to free-breathing plans. Lung doses were lower for both proximal and distal breath-hold lesions except the mean lung dose and V5 for distal lesions. Conclusion: This study showed improvement of OAR doses for esophageal lesions at mid-thoracic level utilizing the DIBH vs FB technique but did not show consistent OAR sparing with DIBH for distal lesions.

  5. SU-F-T-254: Dose Volume Histogram (DVH) Analysis of Breath Hold Vs Free Breathing Techniques for Esophageal Tumors

    International Nuclear Information System (INIS)

    Badkul, R; Doke, K; Pokhrel, D; Aguilera, N; Lominska, C

    2016-01-01

    Purpose: Lung and heart doses and associated toxicity are of concern in radiotherapy for esophageal cancer. This study evaluates the dosimetry of the deep-inspiration-breath-hold (DIBH) technique as compared to free-breathing (FB) using 3D-conformal treatment (3D-CRT) of esophageal cancer. Methods: Eight patients were planned with FB and DIBH CT scans. DIBH scans were acquired using the Varian RPM system. FB and DIBH CTs were contoured per RTOG-1010 to create the planning target volume (PTV) as well as organs-at-risk volumes (OARs). Two sets of gross target volumes (GTVs) of 5 cm length were contoured for each patient: proximal at the level of the carina and distal at the level of the gastroesophageal junction; these were enlarged with appropriate margins to generate the Clinical Target Volume and PTV. 3D-CRT plans were created in the Eclipse planning system to deliver 45 Gy to 95% of the PTV in 25 fractions for both proximal and distal tumors on FB and DIBH scans. For distal tumors celiac nodes were covered electively. DVH parameters for lung and heart OARs were generated and analyzed. Results: All DIBH DVH parameters were normalized to FB plan values. The averages of heart-mean and heart-V40 were 0.70 and 0.66 for proximal lesions. For distal lesions the ratios were 1.21 and 2.22, respectively. For DIBH the total lung volume increased by 2.43 times versus the FB scan. The averages of lung-mean, V30, V20, V10 and V5 were 0.82, 0.92, 0.76, 0.77 and 0.79 for proximal lesions and 1.17, 0.66, 0.87, 0.93 and 1.03 for distal lesions. Heart doses were lower for breath-hold proximal lesions but higher for distal lesions as compared to free-breathing plans. Lung doses were lower for both proximal and distal breath-hold lesions except the mean lung dose and V5 for distal lesions. Conclusion: This study showed improvement of OAR doses for esophageal lesions at mid-thoracic level utilizing the DIBH vs FB technique but did not show consistent OAR sparing with DIBH for distal lesions.
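
    The DVH parameters analyzed above (V30, V20, V10, V5, mean dose) derive from the cumulative dose-volume histogram; a sketch given a dose grid and a binary structure mask:

```python
import numpy as np

def cumulative_dvh(dose, structure_mask, step_gy=0.1):
    """Fraction of the structure volume receiving at least each dose level."""
    d = dose[structure_mask > 0]
    levels = np.arange(0.0, d.max() + step_gy, step_gy)
    volume_fraction = np.array([(d >= lv).mean() for lv in levels])
    return levels, volume_fraction

def v_x(dose, structure_mask, x_gy):
    """V_x in percent, e.g. v_x(dose, lung_mask, 20.0) for lung V20."""
    return 100.0 * np.mean(dose[structure_mask > 0] >= x_gy)
```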

  6. Multi-dimensional discovery of biomarker and phenotype complexes

    Directory of Open Access Journals (Sweden)

    Huang Kun

    2010-10-01

    Full Text Available Background: Given the rapid growth of translational research and personalized healthcare paradigms, the ability to relate and reason upon networks of bio-molecular and phenotypic variables at various levels of granularity in order to diagnose, stage and plan treatments for disease states is highly desirable. Numerous techniques exist that can be used to develop networks of co-expressed or otherwise related genes and clinical features. Such techniques can also be used to create formalized knowledge collections based upon the information incumbent to ontologies and domain literature. However, reports of integrative approaches that bridge such networks to create systems-level models of disease or wellness are notably lacking in the contemporary literature. Results: In response to the preceding gap in knowledge and practice, we report upon a prototypical series of experiments that utilize multi-modal approaches to network induction. These experiments are intended to elicit meaningful and significant biomarker-phenotype complexes spanning multiple levels of granularity. This work has been performed in the experimental context of a large-scale clinical and basic science data repository maintained by the National Cancer Institute (NCI)-funded Chronic Lymphocytic Leukemia Research Consortium. Conclusions: Our results indicate that it is computationally tractable to link orthogonal networks of genes, clinical features, and conceptual knowledge to create multi-dimensional models of interrelated biomarkers and phenotypes. Further, our results indicate that such systems-level models contain interrelated bio-molecular and clinical markers capable of supporting hypothesis discovery and testing. Based on such findings, we propose a conceptual model intended to inform the cross-linkage of the results of such methods. This model has as its aim the identification of novel and knowledge-anchored biomarker-phenotype complexes.

  7. Regionally adaptive histogram equalization of the chest

    International Nuclear Information System (INIS)

    Sherrier, R.H.; Johnson, G.A.

    1986-01-01

    Advances in digital chest radiography have resulted in the acquisition of high-quality digital images of the human chest. With these advances, there arises a genuine need for image processing algorithms specific to chest images. The authors have implemented the technique of histogram equalization, noting the problems encountered when it is adapted to chest images. These problems have been successfully solved with a regionally adaptive histogram equalization method. Histograms are calculated locally and then modified according to both the mean pixel value of a given region and certain characteristics of the cumulative distribution function. The method allows certain regions of the chest radiograph to be enhanced differentially.
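
    The following sketch shows the skeleton of such a regionally adaptive scheme: each tile of the image is equalized with its own histogram CDF. The paper's further modification of each CDF by the regional mean pixel value is not specified in the abstract and is therefore omitted here; the tile count and bin count are illustrative assumptions:

      import numpy as np

      def regional_hist_eq(img, tiles=8, nbins=256):
          """Equalize each tile of a 2D image with its own histogram CDF
          (simplified: no inter-tile blending, no CDF modification)."""
          out = np.empty(img.shape, dtype=float)
          h, w = img.shape
          ys = np.linspace(0, h, tiles + 1, dtype=int)
          xs = np.linspace(0, w, tiles + 1, dtype=int)
          for y0, y1 in zip(ys[:-1], ys[1:]):
              for x0, x1 in zip(xs[:-1], xs[1:]):
                  tile = img[y0:y1, x0:x1]
                  hist, edges = np.histogram(tile, bins=nbins)
                  cdf = hist.cumsum() / hist.sum()           # local cumulative distribution
                  idx = np.clip(np.digitize(tile, edges[1:-1]), 0, nbins - 1)
                  out[y0:y1, x0:x1] = cdf[idx] * img.max()   # map through the local CDF
          return out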

  8. Multi-Dimensional Customer Data Analysis in Online Auctions

    Institute of Scientific and Technical Information of China (English)

    LAO Guoling; XIONG Kuan; QIN Zheng

    2007-01-01

    In this paper, we designed a customer-centered data warehouse system with five subjects (listing, bidding, transaction, accounts, and customer contact) based on the business process of online auction companies. For each subject, we analyzed its fact indexes and dimensions. Then, taking the transaction subject as an example, we analyzed the data warehouse model in detail and obtained the multi-dimensional analysis structure of the transaction subject. Finally, using data mining for customer segmentation, we divided customers into four types: impulse customers, prudent customers, potential customers, and ordinary customers. With the results of multi-dimensional customer data analysis, online auction companies can better target their marketing and increase customer loyalty.

  9. Development of MARS for multi-dimensional and multi-purpose thermal-hydraulic system analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Won Jae; Chung, Bub Dong; Kim, Kyung Doo; Hwang, Moon Kyu; Jeong, Jae Jun; Ha, Kwi Seok; Joo, Han Gyu [Korea Atomic Energy Research Institute, T/H Safety Research Team, Yusung, Daejeon (Korea)

    2000-10-01

    MARS (Multi-dimensional Analysis of Reactor Safety) code is being developed by KAERI for the realistic thermal-hydraulic simulation of light water reactor system transients. MARS 1.4 has been developed as the final version of the basic code frame for the multi-dimensional analysis of system thermal-hydraulics. Relative to MARS 1.3, MARS 1.4 provides enhanced code capability and user friendliness through the unification of input/output features, code models and code functions, and through code modernization. Further improvements of the thermal-hydraulic models, numerical methods and user friendliness are being carried out for enhanced code accuracy. As a multi-purpose safety analysis code system, a coupled analysis system, MARS/MASTER/CONTEMPT, has been developed using the multiple DLL (Dynamic Link Library) techniques of the Windows system. This code system enables coupled, and thus more realistic, analysis of multi-dimensional thermal-hydraulics (MARS 2.0), three-dimensional core kinetics (MASTER) and containment thermal-hydraulics (CONTEMPT). This paper discusses the MARS development program and the developmental progress of MARS 1.4 and MARS/MASTER/CONTEMPT, focusing on major features of the codes and their verification. It also discusses thermal-hydraulic models and new code features under development. (author)

  10. Development of MARS for multi-dimensional and multi-purpose thermal-hydraulic system analysis

    International Nuclear Information System (INIS)

    Lee, Won Jae; Chung, Bub Dong; Kim, Kyung Doo; Hwang, Moon Kyu; Jeong, Jae Jun; Ha, Kwi Seok; Joo, Han Gyu

    2000-01-01

    MARS (Multi-dimensional Analysis of Reactor Safety) code is being developed by KAERI for the realistic thermal-hydraulic simulation of light water reactor system transients. MARS 1.4 has been developed as the final version of the basic code frame for the multi-dimensional analysis of system thermal-hydraulics. Relative to MARS 1.3, MARS 1.4 provides enhanced code capability and user friendliness through the unification of input/output features, code models and code functions, and through code modernization. Further improvements of the thermal-hydraulic models, numerical methods and user friendliness are being carried out for enhanced code accuracy. As a multi-purpose safety analysis code system, a coupled analysis system, MARS/MASTER/CONTEMPT, has been developed using the multiple DLL (Dynamic Link Library) techniques of the Windows system. This code system enables coupled, and thus more realistic, analysis of multi-dimensional thermal-hydraulics (MARS 2.0), three-dimensional core kinetics (MASTER) and containment thermal-hydraulics (CONTEMPT). This paper discusses the MARS development program and the developmental progress of MARS 1.4 and MARS/MASTER/CONTEMPT, focusing on major features of the codes and their verification. It also discusses thermal-hydraulic models and new code features under development. (author)

  11. Decay rate in a multi-dimensional fission problem

    Energy Technology Data Exchange (ETDEWEB)

    Brink, D M; Canto, L F

    1986-06-01

    The multi-dimensional diffusion approach of Zhang Jing Shang and Weidenmueller (1983 Phys. Rev. C28, 2190) is used to study a simplified model for induced fission. In this model it is shown that the coupling of the fission coordinate to the intrinsic degrees of freedom is equivalent to an extra friction and a mass correction in the corresponding one-dimensional problem.

  12. Image matrix processor for fast multi-dimensional computations

    Science.gov (United States)

    Roberson, George P.; Skeate, Michael F.

    1996-01-01

    An apparatus for multi-dimensional computation which comprises a computation engine, including a plurality of processing modules. The processing modules are configured in parallel and compute respective contributions to a computed multi-dimensional image of respective two dimensional data sets. A high-speed, parallel access storage system is provided which stores the multi-dimensional data sets, and a switching circuit routes the data among the processing modules in the computation engine and the storage system. A data acquisition port receives the two dimensional data sets representing projections through an image, for reconstruction algorithms such as encountered in computerized tomography. The processing modules include a programmable local host, by which they may be configured to execute a plurality of different types of multi-dimensional algorithms. The processing modules thus include an image manipulation processor, which includes a source cache, a target cache, a coefficient table, and control software for executing image transformation routines using data in the source cache and the coefficient table and loading resulting data in the target cache. The local host processor operates to load the source cache with a two dimensional data set, loads the coefficient table, and transfers resulting data out of the target cache to the storage system, or to another destination.

  13. Development and Validation of Multi-Dimensional Personality ...

    African Journals Online (AJOL)

    This study was carried out to establish the scientific processes for the development and validation of Multi-dimensional Personality Inventory (MPI). The process of development and validation occurred in three phases with five components of Agreeableness, Conscientiousness, Emotional stability, Extroversion, and ...

  14. Balanced sensitivity functions for tuning multi-dimensional Bayesian network classifiers

    NARCIS (Netherlands)

    Bolt, J.H.; van der Gaag, L.C.

    Multi-dimensional Bayesian network classifiers are Bayesian networks of restricted topological structure, which are tailored to classifying data instances into multiple dimensions. Like more traditional classifiers, multi-dimensional classifiers are typically learned from data and may include

  15. Multi-dimensional Bin Packing Problems with Guillotine Constraints

    DEFF Research Database (Denmark)

    Amossen, Rasmus Resen; Pisinger, David

    2010-01-01

    The problem addressed in this paper is the decision problem of determining if a set of multi-dimensional rectangular boxes can be orthogonally packed into a rectangular bin while satisfying the requirement that the packing should be guillotine cuttable. That is, there should exist a series of face-parallel straight cuts that can recursively cut the bin into pieces so that each piece contains a box and no box has been intersected by a cut. The unrestricted problem is known to be NP-hard. In this paper we present a generalization of a constructive algorithm for the multi-dimensional bin packing problem, with and without the guillotine constraint, based on constraint programming.

  16. Multi-dimensional Code Development for Safety Analysis of LMR

    International Nuclear Information System (INIS)

    Ha, K. S.; Jeong, H. Y.; Kwon, Y. M.; Lee, Y. B.

    2006-08-01

    A liquid metal reactor loaded with metallic fuel has an inherent safety mechanism due to several negative reactivity feedbacks. Although this feature was demonstrated through experiments in the EBR-II, no computer program until now has analyzed it exactly because of the complexity of the reactivity feedback mechanism. A detailed multi-dimensional program was developed through the International Nuclear Energy Research Initiative (INERI) from 2003 to 2005. This report covers the numerical coupling of the multi-dimensional program with the SSC-K code, which is used for the safety analysis of liquid metal reactors at KAERI. The coupled code has been validated by comparing its analysis results with those of ANL's SAS-SASSYS code for the UTOP, ULOF, and ULOHS transients applied to the safety analysis of KALIMER-150

  17. Peer Pressure in Multi-Dimensional Work Tasks

    OpenAIRE

    Felix Ebeling; Gerlinde Fellner; Johannes Wahlig

    2012-01-01

    We study the influence of peer pressure in multi-dimensional work tasks, both theoretically and in a controlled laboratory experiment, where workers face peer pressure in only one work dimension. We find that effort provision increases in the dimension where peer pressure is introduced. However, not all of this increase translates into a productivity gain, since the effect is partly offset by a decrease of effort in the work dimension without peer pressure. Furthermore, this tradeoff is stronger...

  18. Multi-dimensional virtual system introduced to enhance canonical sampling

    Science.gov (United States)

    Higo, Junichi; Kasahara, Kota; Nakamura, Haruki

    2017-10-01

    When an important process of a molecular system occurs via a combination of two or more rare events, which occur almost independently of one another, computational sampling of the important process is difficult. Here, to sample such a process effectively, we developed a new method, named the "multi-dimensional Virtual-system coupled Monte Carlo (multi-dimensional-VcMC)" method, where the system interacts with a virtual system expressed by two or more virtual coordinates. Each virtual coordinate controls sampling along a reaction coordinate. By setting multiple reaction coordinates related to the corresponding rare events, sampling of the important process can be enhanced. An advantage of multi-dimensional-VcMC is its simplicity: the conformation moves widely in the multi-dimensional reaction coordinate space without knowledge of the canonical distribution functions of the system. To examine the effectiveness of the algorithm, we introduced a toy model where two molecules (a receptor and its ligand) bind to and unbind from each other. The receptor has a deep binding pocket, which the ligand enters for binding. Furthermore, a gate is set at the entrance of the pocket, and the gate is usually closed. Thus, molecular binding takes place via two events: ligand approach to the pocket and gate opening. In two-dimensional (2D)-VcMC, the two molecules exhibited repeated binding and unbinding, and an equilibrated distribution was obtained as expected. A conventional canonical simulation, which was 200 times longer than the 2D-VcMC run, failed to sample the binding/unbinding effectively. The current method is applicable to various biological systems.

  19. Code Coupling for Multi-Dimensional Core Transient Analysis

    International Nuclear Information System (INIS)

    Park, Jin-Woo; Park, Guen-Tae; Park, Min-Ho; Ryu, Seok-Hee; Um, Kil-Sup; Lee, Jae-Il

    2015-01-01

    After a CEA ejection, the nuclear power of the reactor increases dramatically in an exponential manner until the Doppler effect becomes important and turns the reactivity balance and power down to lower levels. Although this happens in a very short period of time, only a few seconds, the energy generated can be very significant and cause fuel failures. The current safety analysis methodology, which is based on overly conservative assumptions with the point kinetics model, results in quite adverse consequences. Thus, KEPCO Nuclear Fuel (KNF) is developing a multi-dimensional safety analysis methodology to mitigate the consequences of the single CEA ejection accident. For this purpose, the three-dimensional core neutron kinetics code ASTRA, the sub-channel analysis code THALES, and the fuel performance analysis code FROST, all of which have transient calculation capability, were coupled using the message passing interface (MPI). This paper presents the methodology used for code coupling and the preliminary simulation results with the coupled code system (CHASER). The multi-dimensional core transient analysis code system, CHASER, has been developed and applied to simulate a single CEA ejection accident. CHASER gave a good prediction of multi-dimensional core transient behaviors during the transient. In the near future, a multi-dimensional CEA ejection analysis methodology using CHASER is planned to be developed. CHASER is expected to be a useful tool for gaining safety margin for reactivity-initiated accidents (RIAs), such as a single CEA ejection accident.

  20. Code Coupling for Multi-Dimensional Core Transient Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin-Woo; Park, Guen-Tae; Park, Min-Ho; Ryu, Seok-Hee; Um, Kil-Sup; Lee, Jae-Il [KEPCO NF, Daejeon (Korea, Republic of)]

    2015-05-15

    After a CEA ejection, the nuclear power of the reactor increases dramatically in an exponential manner until the Doppler effect becomes important and turns the reactivity balance and power down to lower levels. Although this happens in a very short period of time, only a few seconds, the energy generated can be very significant and cause fuel failures. The current safety analysis methodology, which is based on overly conservative assumptions with the point kinetics model, results in quite adverse consequences. Thus, KEPCO Nuclear Fuel (KNF) is developing a multi-dimensional safety analysis methodology to mitigate the consequences of the single CEA ejection accident. For this purpose, the three-dimensional core neutron kinetics code ASTRA, the sub-channel analysis code THALES, and the fuel performance analysis code FROST, all of which have transient calculation capability, were coupled using the message passing interface (MPI). This paper presents the methodology used for code coupling and the preliminary simulation results with the coupled code system (CHASER). The multi-dimensional core transient analysis code system, CHASER, has been developed and applied to simulate a single CEA ejection accident. CHASER gave a good prediction of multi-dimensional core transient behaviors during the transient. In the near future, a multi-dimensional CEA ejection analysis methodology using CHASER is planned to be developed. CHASER is expected to be a useful tool for gaining safety margin for reactivity-initiated accidents (RIAs), such as a single CEA ejection accident.

  1. Machine assisted histogram classification

    Science.gov (United States)

    Benyó, B.; Gaspar, C.; Somogyi, P.

    2010-04-01

    LHCb is one of the four major experiments under completion at the Large Hadron Collider (LHC). Monitoring the quality of the acquired data is important, because it allows the verification of the detector performance. Anomalies, such as missing values or unexpected distributions, can be indicators of a malfunctioning detector, resulting in poor data quality. Spotting faulty or ageing components can be done either visually, using instruments such as the LHCb Histogram Presenter, or with the help of automated tools. In order to assist detector experts in handling the vast monitoring information resulting from the sheer size of the detector, we propose a graph-based clustering tool combined with a machine learning algorithm and demonstrate its use by processing histograms representing 2D hitmap events. We prove the concept by detecting ion feedback events in the LHCb experiment's RICH subdetector.

  2. Machine assisted histogram classification

    Energy Technology Data Exchange (ETDEWEB)

    Benyo, B; Somogyi, P [BME-IIT, H-1117 Budapest, Magyar tudosok koerutja 2. (Hungary); Gaspar, C, E-mail: Peter.Somogyi@cern.ch [CERN-PH, CH-1211 Geneve 23 (Switzerland)

    2010-04-01

    LHCb is one of the four major experiments under completion at the Large Hadron Collider (LHC). Monitoring the quality of the acquired data is important, because it allows the verification of the detector performance. Anomalies, such as missing values or unexpected distributions, can be indicators of a malfunctioning detector, resulting in poor data quality. Spotting faulty or ageing components can be done either visually, using instruments such as the LHCb Histogram Presenter, or with the help of automated tools. In order to assist detector experts in handling the vast monitoring information resulting from the sheer size of the detector, we propose a graph-based clustering tool combined with a machine learning algorithm and demonstrate its use by processing histograms representing 2D hitmap events. We prove the concept by detecting ion feedback events in the LHCb experiment's RICH subdetector.

  3. Quantifying multi-dimensional attributes of human activities at various geographic scales based on smartphone tracking.

    Science.gov (United States)

    Zhou, Xiaolu; Li, Dongying

    2018-05-09

    Advancement in location-aware technologies and information and communication technology in the past decades has furthered our knowledge of the interaction between human activities and the built environment. An increasing number of studies have collected data regarding individual activities to better understand how the environment shapes human behavior. Despite this growing interest, some challenges exist in collecting and processing individuals' activity data, e.g., capturing people's precise environmental contexts and analyzing data at multiple spatial scales. In this study, we propose and implement an innovative system that integrates smartphone-based step tracking with an app and sequential tile-scan techniques to collect and process activity data. We apply the OpenStreetMap tile system to aggregate positioning points at various scales. We also propose duration, step, and probability surfaces to quantify the multi-dimensional attributes of activities. Results show that, by running the app in the background, smartphones can measure multi-dimensional attributes of human activities, including space, duration, step, and location uncertainty, at various spatial scales. By coordinating the Global Positioning System (GPS) sensor with the accelerometer, the app conserves battery power that would otherwise be drained quickly by the GPS sensor. Based on a test dataset, we were able to detect the recreational center and sports center as the spaces where the user was most active, among other places visited. The methods provide techniques to address key issues in analyzing human activity data. The system can support future studies on behavioral and health consequences related to individuals' environmental exposure.

  4. A multi-dimensional sampling method for locating small scatterers

    International Nuclear Information System (INIS)

    Song, Rencheng; Zhong, Yu; Chen, Xudong

    2012-01-01

    A multiple signal classification (MUSIC)-like multi-dimensional sampling method (MDSM) is introduced to locate small three-dimensional scatterers using electromagnetic waves. The indicator is built with the most stable part of the signal subspace of the multi-static response matrix on a set of combinatorial sampling nodes inside the domain of interest. It has two main advantages compared to conventional MUSIC methods. First, the MDSM is more robust against noise. Second, it can work with a single incidence even for multiple scatterers. Numerical simulations are presented to show the good performance of the proposed method. (paper)
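
    For orientation, a minimal sketch of a conventional MUSIC indicator for point-scatterer localization is given below; it projects a free-space Green's-function test vector onto the noise subspace of the multi-static response matrix. The combinatorial sampling-node construction that distinguishes the MDSM is not reproduced here; the wavenumber, antenna positions and signal-subspace rank are assumed inputs:

      import numpy as np

      def music_indicator(K, antennas, grid, k, rank):
          """MUSIC pseudo-spectrum from a multi-static response matrix K.
          antennas: (N, 3) positions; grid: (M, 3) sampling points."""
          U, s, Vh = np.linalg.svd(K)
          Us = U[:, :rank]                               # signal subspace
          P = np.eye(len(antennas)) - Us @ Us.conj().T   # noise-subspace projector
          vals = np.empty(len(grid))
          for m, z in enumerate(grid):
              d = np.linalg.norm(antennas - z, axis=1)
              g = np.exp(1j * k * d) / (4 * np.pi * d)   # free-space Green's vector
              g /= np.linalg.norm(g)
              vals[m] = 1.0 / np.linalg.norm(P @ g)      # peaks at scatterer locations
          return vals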

  5. Multi-dimensional cubic interpolation for ICF hydrodynamics simulation

    International Nuclear Information System (INIS)

    Aoki, Takayuki; Yabe, Takashi.

    1991-04-01

    A new interpolation method is proposed to solve the multi-dimensional hyperbolic equations which appear in describing the hydrodynamics of inertial confinement fusion (ICF) implosion. The advection phase of the cubic-interpolated pseudo-particle (CIP) scheme is greatly improved by assuming continuity of the second and third spatial derivatives, in addition to the physical value and the first derivative. These derivatives are derived from the given physical equation. In order to evaluate the new method, Zalesak's example is tested, and good results are successfully obtained. (author)
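
    The flavour of the CIP advection phase can be seen in this one-dimensional sketch, which carries the value and its first spatial derivative and advects both with an upwind cubic polynomial (constant velocity u > 0 and a periodic grid are assumed; the higher-derivative extension proposed in the paper is not reproduced):

      import numpy as np

      def cip_step(f, g, u, dx, dt):
          """One CIP step for df/dt + u*df/dx = 0, u > 0, periodic grid.
          f: grid values; g: spatial derivatives df/dx carried as extra unknowns."""
          fup, gup = np.roll(f, 1), np.roll(g, 1)    # upwind neighbours (i - 1)
          D, xi = -dx, -u * dt                       # distances measured toward upwind
          a = (g + gup) / D**2 + 2.0 * (f - fup) / D**3
          b = 3.0 * (fup - f) / D**2 - (2.0 * g + gup) / D
          f_new = ((a * xi + b) * xi + g) * xi + f   # cubic evaluated at departure point
          g_new = (3.0 * a * xi + 2.0 * b) * xi + g  # its analytic derivative
          return f_new, g_new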

  6. Multi-dimensional beam emittance and β-functions

    International Nuclear Information System (INIS)

    Buon, J.

    1993-05-01

    The concept of r.m.s. emittance is extended to the case of several degrees of freedom that are coupled. That multi-dimensional emittance is lower than the product of the emittances attached to each degree of freedom, but is conserved in a linear motion. An envelope-hyperellipsoid is introduced to define the β-functions of the beam envelope. In contrast to one-degree-of-freedom motion, it is emphasized that these envelope functions differ from the amplitude functions of the normal modes of motion, as a result of the difference between the Liouville and Lagrange invariants. (author) 4 refs
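
    The coupled emittance described here is the square root of the determinant of the beam's second-moment matrix; a small numerical check of the stated inequality (the 4D emittance does not exceed the product of the projected emittances) can be written as follows, with an arbitrary illustrative covariance matrix:

      import numpy as np

      rng = np.random.default_rng(0)
      cov = [[2.0, 0.5, 0.3, 0.0],    # second moments of (x, x', y, y'),
             [0.5, 1.0, 0.0, 0.2],    # chosen arbitrarily but positive definite
             [0.3, 0.0, 1.5, 0.4],
             [0.0, 0.2, 0.4, 1.0]]
      coords = rng.multivariate_normal(np.zeros(4), cov, size=100_000)
      sigma = np.cov(coords.T)                       # sampled 4x4 moment matrix
      eps_4d = np.sqrt(np.linalg.det(sigma))         # coupled 4D r.m.s. emittance
      eps_x = np.sqrt(np.linalg.det(sigma[:2, :2]))  # projected emittance in (x, x')
      eps_y = np.sqrt(np.linalg.det(sigma[2:, 2:]))  # projected emittance in (y, y')
      print(eps_4d <= eps_x * eps_y)                 # True: coupling lowers the 4D emittance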

  7. Multi-dimensional technology-enabled social learning approach

    DEFF Research Database (Denmark)

    Petreski, Hristijan; Tsekeridou, Sofia; Prasad, Neeli R.

    2013-01-01

    ... in learning while socializing within their learning communities. However, their "educational" usage is still limited to the facilitation of online learning communities and to collaborative authoring of learning material complementary to existing formal (e-)learning services. If the educational system doesn't respond to these systemic and structural changes and/or challenges and retains its status quo, it is jeopardizing its own existence, or the existence of education as we know it. This paper aims to proceed one step further by proposing a multi-dimensional approach for technology-enabled social learning...

  8. Estimate of pulse-sequence data acquisition system for multi-dimensional measurement

    International Nuclear Information System (INIS)

    Kitamura, Yasunori; Sakae, Takeji; Nohtomi, Akihiro; Matoba, Masaru; Matsumoto, Yuzuru.

    1996-01-01

    A pulse-sequence data acquisition system has been newly designed and evaluated for the measurement of one- or multi-dimensional pulse trains coming from radiation detectors. In this system, in order to realize pulse-sequence data acquisition, the arrival time of each pulse is recorded in the memory of a personal computer (PC). For multi-dimensional data acquisition with several input channels, each arrival-time datum is tagged with a 'flag' which indicates the input channel of the arriving pulse. Counting losses due to the processing time of the PC are expected to be reduced by using a First-In-First-Out (FIFO) memory unit. In order to verify this system, a computer simulation was performed. Various sets of random pulse trains with different mean pulse rates (1-600 kcps) were generated using a Monte Carlo simulation technique. These pulse trains were then processed by another code which simulates the newly designed data acquisition system including a FIFO memory unit; the memory size was assumed to be 0-100 words. The pulse trains recorded on the PC with the various FIFO memory sizes were then examined. The results of the simulation show that the system with a 3-word FIFO memory unit works successfully up to a pulse rate of 10 kcps without any severe counting losses. (author)
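
    A hedged sketch of the kind of simulation described, treating the PC as a server that stores one record per fixed processing time and the FIFO as a finite queue (the 100 µs record time is an assumption chosen to be consistent with the reported 10 kcps limit, not a figure from the paper):

      import numpy as np

      def fifo_loss_fraction(rate_cps, fifo_words, t_proc=1e-4, n_pulses=200_000, seed=1):
          """Fraction of pulses lost when a Poisson pulse train feeds a FIFO of
          fifo_words entries drained at one pulse per t_proc seconds."""
          rng = np.random.default_rng(seed)
          arrivals = np.cumsum(rng.exponential(1.0 / rate_cps, n_pulses))
          t_done = 0.0   # completion time of the record currently being stored
          n_q = 0        # pulses held in the FIFO (incl. the one being stored)
          lost = 0
          for t in arrivals:
              while n_q > 0 and t_done <= t:   # drain completions before this arrival
                  n_q -= 1
                  if n_q > 0:
                      t_done += t_proc         # PC immediately starts the next record
              if n_q < fifo_words:
                  n_q += 1
                  if n_q == 1:
                      t_done = t + t_proc      # PC was idle; start storing now
              else:
                  lost += 1                    # FIFO full: counting loss
          return lost / n_pulses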

  9. Estimate of pulse-sequence data acquisition system for multi-dimensional measurement

    Energy Technology Data Exchange (ETDEWEB)

    Kitamura, Yasunori; Sakae, Takeji; Nohtomi, Akihiro; Matoba, Masaru [Kyushu Univ., Fukuoka (Japan). Faculty of Engineering; Matsumoto, Yuzuru

    1996-07-01

    A pulse-sequence data acquisition system has been newly designed and evaluated for the measurement of one- or multi-dimensional pulse trains coming from radiation detectors. In this system, in order to realize pulse-sequence data acquisition, the arrival time of each pulse is recorded in the memory of a personal computer (PC). For multi-dimensional data acquisition with several input channels, each arrival-time datum is tagged with a 'flag' which indicates the input channel of the arriving pulse. Counting losses due to the processing time of the PC are expected to be reduced by using a First-In-First-Out (FIFO) memory unit. In order to verify this system, a computer simulation was performed. Various sets of random pulse trains with different mean pulse rates (1-600 kcps) were generated using a Monte Carlo simulation technique. These pulse trains were then processed by another code which simulates the newly designed data acquisition system including a FIFO memory unit; the memory size was assumed to be 0-100 words. The pulse trains recorded on the PC with the various FIFO memory sizes were then examined. The results of the simulation show that the system with a 3-word FIFO memory unit works successfully up to a pulse rate of 10 kcps without any severe counting losses. (author)

  10. Meta-modelling, visualization and emulation of multi-dimensional data for virtual production intelligence

    Science.gov (United States)

    Schulz, Wolfgang; Hermanns, Torsten; Al Khawli, Toufik

    2017-07-01

    Decision making for competitive production in high-wage countries is a daily challenge where rational and irrational methods are used. The design of decision-making processes is an intriguing, discipline-spanning science. However, there are gaps in understanding the impact of the known mathematical and procedural methods on the usage of rational choice theory. Following Benjamin Franklin's rule for decision making, formulated in London in 1772 and called by him "Prudential Algebra" in the sense of prudential reasons, one of the major ingredients of Meta-Modelling can be identified, finally leading to one algebraic value labelling the results (criteria settings) of alternative decisions (parameter settings). This work describes advances in Meta-Modelling techniques applied to multi-dimensional and multi-criterial optimization by identifying the persistence level of the corresponding Morse-Smale Complex. Implementations for laser cutting and laser drilling are presented, including the generation of fast and frugal Meta-Models with controlled error based on mathematical model reduction. Reduced Models are derived to avoid any unnecessary complexity. Both model reduction and analysis of the multi-dimensional parameter space are used to enable interactive communication between Discovery Finders and Invention Makers. Emulators and visualizations of a metamodel are introduced as components of Virtual Production Intelligence, making the methods of Scientific Design Thinking applicable and making the developer as well as the operator more skilled.

  11. Goodness-of-fit tests for multi-dimensional copulas: Expanding application to historical drought data

    Directory of Open Access Journals (Sweden)

    Ming-wei Ma

    2013-01-01

    Full Text Available The question of how to choose a copula model that best fits a given dataset is a predominant limitation of the copula approach, and the present study aims to investigate techniques of goodness-of-fit tests for multi-dimensional copulas. A goodness-of-fit test based on Rosenblatt's transformation was mathematically expanded from two dimensions to three dimensions, and procedures for a bootstrap version of the test are provided. Through stochastic copula simulation, an empirical application to historical drought data at the Lintong Gauge Station shows that the goodness-of-fit tests perform well, revealing that both trivariate Gaussian and Student t copulas are acceptable for modeling the dependence structures of the observed drought duration, severity, and peak. The goodness-of-fit tests for multi-dimensional copulas can provide further support and greatly help in the potential applications of a wider range of copulas to describe the associations of correlated hydrological variables. However, for the application of copulas with more than three dimensions, more complicated computational efforts as well as exploration and parameterization of the corresponding copulas are required.

  12. Histogram Estimators of Bivariate Densities

    National Research Council Canada - National Science Library

    Husemann, Joyce A

    1986-01-01

    One-dimensional fixed-interval histogram estimators of univariate probability density functions are less efficient than the analogous variable-interval estimators which are constructed from intervals...

  13. Classification of high-resolution multi-swath hyperspectral data using Landsat 8 surface reflectance data as a calibration target and a novel histogram based unsupervised classification technique to determine natural classes from biophysically relevant fit parameters

    Science.gov (United States)

    McCann, C.; Repasky, K. S.; Morin, M.; Lawrence, R. L.; Powell, S. L.

    2016-12-01

    Compact, cost-effective, flight-based hyperspectral imaging systems can provide scientifically relevant data over large areas for a variety of applications such as ecosystem studies, precision agriculture, and land management. To fully realize this capability, unsupervised classification techniques are needed that work on radiometrically calibrated data and that cluster based on biophysical similarity rather than simply spectral similarity. An automated technique to produce high-resolution, large-area, radiometrically calibrated hyperspectral data sets, using the Landsat surface reflectance data product as a calibration target, was developed and applied to three subsequent years of data covering approximately 1850 hectares. The radiometrically calibrated data allow inter-comparison of the temporal series. Advantages of the radiometric calibration technique include the need for minimal site access, no ancillary instrumentation, and automated processing. Fitting the reflectance spectrum of each pixel with a set of biophysically relevant basis functions reduces the data from 80 spectral bands to 9 parameters, providing noise reduction and data compression. Examination of histograms of these parameters allows for the determination of natural splittings into biophysically similar clusters. This method creates clusters that are similar in terms of biophysical parameters, not simply spectral proximity. Furthermore, this method can be applied to other data sets, such as urban scenes, by developing other physically meaningful basis functions. The ability to use hyperspectral imaging for a variety of important applications requires the development of data processing techniques that can be automated. The radiometric calibration combined with the histogram-based unsupervised classification technique presented here provides one potential avenue for managing the big data associated with hyperspectral imaging.
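
    The two data-reduction steps described, fitting each pixel spectrum with a small set of basis functions and then splitting the parameter histograms at their valleys, might be sketched as follows (the basis matrix itself is application-specific and is assumed given):

      import numpy as np

      def fit_parameters(spectra, basis):
          """Least-squares coefficients of each pixel spectrum in a basis.
          spectra: (n_pixels, n_bands); basis: (n_bands, n_params)."""
          coef, *_ = np.linalg.lstsq(basis, spectra.T, rcond=None)
          return coef.T                            # (n_pixels, n_params)

      def histogram_splits(param, nbins=100):
          """Class boundaries at interior minima (valleys) of a parameter histogram."""
          hist, edges = np.histogram(param, bins=nbins)
          centers = 0.5 * (edges[:-1] + edges[1:])
          return [centers[i] for i in range(1, nbins - 1)
                  if hist[i] < hist[i - 1] and hist[i] <= hist[i + 1]]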

  14. Device for multi-dimensional γ-γ-coincidence study

    International Nuclear Information System (INIS)

    Gruzinova, T.M.; Erokhina, K.I.; Kutuzov, V.I.; Lemberg, I.Kh.; Petrov, S.A.; Revenko, V.S.; Senin, A.T.; Chugunov, I.N.; Shishlinov, V.M.

    1977-01-01

    A device for studying multi-dimensional γ-γ coincidences is described which operates on-line with the BESM-4 computer. The device comprises Ge(Li) detectors, analog-to-digital converters, shaper discriminators and fast amplifiers. To control the operation of the device as a whole and to issue the necessary commands, an information distributor has been developed. The following specific features of the device operation are noted: the device may operate both in the regime of recording spectra of direct γ radiation in the block memory of a multi-channel analyzer and in the regime of data transfer to the computer memory; the device performs registration of coincidences; and it transfers information to the computer through a channel with direct access to the memory. The procedure for processing the data, which are recorded on magnetic tape, is considered. The partial spectra obtained are in good agreement with data obtained elsewhere.

  15. Benchmarking multi-dimensional large strain consolidation analyses

    International Nuclear Information System (INIS)

    Priestley, D.; Fredlund, M.D.; Van Zyl, D.

    2010-01-01

    Analyzing the consolidation of tailings slurries and dredged fills requires a more extensive formulation than is used for common (small strain) consolidation problems. Large strain consolidation theories have traditionally been limited to 1-D formulations. SoilVision Systems has developed the capacity to analyze large strain consolidation problems in 2 and 3-D. The benchmarking of such formulations is not a trivial task. This paper presents several examples of modeling large strain consolidation in the beta versions of the new software. These examples were taken from the literature and were used to benchmark the large strain formulation used by the new software. The benchmarks reported here are: a comparison to the consolidation software application CONDES0, Townsend's Scenario B and a multi-dimensional analysis of long-term column tests performed on oil sands tailings. All three of these benchmarks were attained using the SVOffice suite. (author)

  16. A Multi-Dimensional Classification Model for Scientific Workflow Characteristics

    Energy Technology Data Exchange (ETDEWEB)

    Ramakrishnan, Lavanya; Plale, Beth

    2010-04-05

    Workflows have been used to model repeatable tasks or operations in manufacturing, business processes, and software. In recent years, workflows have been increasingly used for the orchestration of science discovery tasks that use distributed resources and web services environments through resource models such as grid and cloud computing. Workflows have disparate requirements and constraints that affect how they might be managed in distributed environments. In this paper, we present a multi-dimensional classification model illustrated by workflow examples obtained through a survey of scientists from different domains, including bioinformatics and biomedical research, weather and ocean modeling, and astronomy, detailing their data and computational requirements. The survey results and classification model contribute to a high-level understanding of scientific workflows.

  17. Anonymous voting for multi-dimensional CV quantum system

    International Nuclear Information System (INIS)

    Shi Rong-Hua; Xiao Yi; Shi Jin-Jing; Guo Ying; Lee, Moon-Ho

    2016-01-01

    We investigate the design of anonymous voting protocols, a CV-based binary-valued ballot and a CV-based multi-valued ballot with continuous variables (CV), in a multi-dimensional quantum cryptosystem to ensure the security of the voting procedure and data privacy. Quantum entangled states are employed in the continuous-variable quantum system to carry the voting information and assist information transmission, which takes advantage of GHZ-like states in terms of improving the utilization of quantum states by decreasing the number of required quantum states. It provides a potential approach to achieve efficient quantum anonymous voting with high transmission security, especially in large-scale votes. (paper)

  18. Fast multi-dimensional NMR by minimal sampling

    Science.gov (United States)

    Kupče, Ēriks; Freeman, Ray

    2008-03-01

    A new scheme is proposed for very fast acquisition of three-dimensional NMR spectra based on minimal sampling, instead of the customary step-wise exploration of all of evolution space. The method relies on prior experiments to determine accurate values for the evolving frequencies and intensities from the two-dimensional 'first planes' recorded by setting t1 = 0 or t2 = 0. With this prior knowledge, the entire three-dimensional spectrum can be reconstructed by an additional measurement of the response at a single location (t1∗,t2∗) where t1∗ and t2∗ are fixed values of the evolution times. A key feature is the ability to resolve problems of overlap in the acquisition dimension. Applied to a small protein, agitoxin, the three-dimensional HNCO spectrum is obtained 35 times faster than systematic Cartesian sampling of the evolution domain. The extension to multi-dimensional spectroscopy is outlined.

  19. Advanced concepts in multi-dimensional radiation detection and imaging

    International Nuclear Information System (INIS)

    Vetter, Kai; Barnowski, Ross; Pavlovsky, Ryan; Haefner, Andy; Torii, Tatsuo; Shikaze, Yoshiaki; Sanada, Yukihisa

    2016-01-01

    Recent developments in detector fabrication, signal readout, and data processing enable new concepts in radiation detection that are relevant for applications ranging from fundamental physics to medicine as well as nuclear security and safety. We present recent progress in multi-dimensional radiation detection and imaging in the Berkeley Applied Nuclear Physics program. It is based on the ability to reconstruct scenes in three dimensions and fuse them with gamma-ray image information. We are using the High-Efficiency Multimode Imager HEMI in its Compton imaging mode and combining it with contextual sensors such as the Microsoft Kinect or visual cameras. This new concept of volumetric imaging, or scene data fusion, provides unprecedented capabilities in radiation detection and imaging relevant to the detection and mapping of radiological and nuclear materials. This concept brings us one step closer to seeing the world with gamma-ray eyes. (author)

  20. MEASURING PERFORMANCE IN ORGANIZATIONS FROM MULTI-DIMENSIONAL PERSPECTIVE

    Directory of Open Access Journals (Sweden)

    ȘTEFĂNESCU CRISTIAN

    2017-08-01

    Full Text Available In the present turbulent financial and economic conditions, a major challenge for the general management of organizations, and in particular for strategic human resources management, is to establish a clear, coherent and consistent framework for measuring organizational performance and economic efficiency. This paper conducts an exploratory review of the literature on measuring organizational performance. Based on the results of this research, the paper proposes a multi-dimensional model for measuring organizational performance, providing a mechanism that allows the quantification of performance based on selected criteria. The model attempts to eliminate the inconsistencies and incongruities among the organizational effectiveness models developed by specialists in organization theory, the performance measurement models developed by specialists in management accounting, and the models for measuring efficiency and effectiveness developed by specialists in strategic management and entrepreneurship.

  1. Scientific Visualization and Simulation for Multi-dimensional Marine Environment Data

    Science.gov (United States)

    Su, T.; Liu, H.; Wang, W.; Song, Z.; Jia, Z.

    2017-12-01

    With increasing attention on the ocean and the rapid development of marine detection, there are growing demands for realistic simulation and interactive visualization of the marine environment in real time. Based on advanced technologies such as GPU rendering, CUDA parallel computing and a rapid grid-oriented strategy, a series of efficient and high-quality visualization methods, which can deal with large-scale and multi-dimensional marine data in different environmental circumstances, is proposed in this paper. Firstly, a high-quality seawater simulation is realized by the FFT algorithm, bump mapping and texture animation technology. Secondly, large-scale multi-dimensional marine hydrological environmental data are visualized by 3D interactive technologies and volume rendering techniques. Thirdly, seabed terrain data are simulated with an improved Delaunay algorithm, a surface reconstruction algorithm, a dynamic LOD algorithm and GPU programming techniques. Fourthly, seamless real-time modelling of both ocean and land on a digital globe is achieved with WebGL to meet the requirements of web-based applications. The experiments suggest that these methods not only produce a satisfying marine environment simulation, but also meet the rendering requirements of global multi-dimensional marine data. Additionally, a simulation system for underwater oil spills is established with the OSG 3D rendering engine. It is integrated with the marine visualization methods mentioned above, and dynamically and simultaneously shows movement processes, physical parameters, and current velocity and direction for different types of deep-water oil spill particles (oil particles, hydrate particles, gas particles, etc.) in multiple dimensions. With such an application, valuable reference and decision-making information can be provided for understanding the progress of an oil spill in deep water, which is helpful for ocean disaster forecasting, warning and emergency response.

  2. Investigating Student Understanding of Histograms

    Science.gov (United States)

    Kaplan, Jennifer J.; Gabrosek, John G.; Curtiss, Phyllis; Malone, Chris

    2014-01-01

    Histograms are adept at revealing the distribution of data values, especially the shape of the distribution and any outlier values. They are included in introductory statistics texts, research methods texts, and in the popular press, yet students often have difficulty interpreting the information conveyed by a histogram. This research identifies…

  3. The Histogram-Area Connection

    Science.gov (United States)

    Gratzer, William; Carpenter, James E.

    2008-01-01

    This article demonstrates an alternative approach to the construction of histograms--one based on the notion of using area to represent relative density in intervals of unequal length. The resulting histograms illustrate the connection between the area of the rectangles associated with particular outcomes and the relative frequency (probability)…

  4. Multi-dimensional relativistic simulations of core-collapse supernovae with energy-dependent neutrino transport

    International Nuclear Information System (INIS)

    Mueller, Bernhard

    2009-01-01

    In this thesis, we have presented the first multi-dimensional models of core-collapse supernovae that combine a detailed, up-to-date treatment of neutrino transport, the equation of state, and, in particular, general relativistic gravity. Building on the well-tested neutrino transport code VERTEX and the GR hydrodynamics code CoCoNuT, we developed and implemented a relativistic generalization of a ray-by-ray-plus method for energy-dependent neutrino transport. The result of these efforts, the VERTEX-CoCoNuT code, also incorporates a number of improved numerical techniques that have not been used in the code components VERTEX and CoCoNuT before. In order to validate the VERTEX-CoCoNuT code, we conducted several test simulations in spherical symmetry, most notably a comparison with the one-dimensional relativistic supernova code AGILE-BOLTZTRAN and the Newtonian PROMETHEUS-VERTEX code. (orig.)

  5. Multi-dimensional relativistic simulations of core-collapse supernovae with energy-dependent neutrino transport

    Energy Technology Data Exchange (ETDEWEB)

    Mueller, Bernhard

    2009-05-07

    In this thesis, we have presented the first multi-dimensional models of core-collapse supernovae that combine a detailed, up-to-date treatment of neutrino transport, the equation of state, and, in particular, general relativistic gravity. Building on the well-tested neutrino transport code VERTEX and the GR hydrodynamics code CoCoNuT, we developed and implemented a relativistic generalization of a ray-by-ray-plus method for energy-dependent neutrino transport. The result of these efforts, the VERTEX-CoCoNuT code, also incorporates a number of improved numerical techniques that have not been used in the code components VERTEX and CoCoNuT before. In order to validate the VERTEX-CoCoNuT code, we conducted several test simulations in spherical symmetry, most notably a comparison with the one-dimensional relativistic supernova code AGILE-BOLTZTRAN and the Newtonian PROMETHEUS-VERTEX code. (orig.)

  6. Multi-Dimensional Bitmap Indices for Optimising Data Access within Object Oriented Databases at CERN

    CERN Document Server

    Stockinger, K

    2001-01-01

    Efficient query processing in high-dimensional search spaces is an important requirement for many analysis tools. In the literature on index data structures one can find a wide range of methods for optimising database access. In particular, bitmap indices have recently gained substantial popularity in data warehouse applications with large amounts of read-mostly data. Bitmap indices are implemented in various commercial database products and are used for querying typical business applications. However, scientific data, which is mostly characterised by non-discrete attribute values, cannot be queried efficiently by the techniques currently supported. In this thesis we propose a novel access method based on bitmap indices that efficiently handles multi-dimensional queries against typical scientific data. The algorithm is called GenericRangeEval and is an extension of a bitmap index for discrete attribute values. By means of a cost model we study the performance of queries with various selectivities against uniform...
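
    The general idea behind binned bitmap indices for non-discrete values, although not the thesis's GenericRangeEval algorithm itself, can be sketched as follows: the attribute is binned, one bitmap is kept per bin, and a range query ORs the fully covered bins while re-checking raw values only in the two boundary bins (the candidate check):

      import numpy as np

      class BinnedBitmapIndex:
          """Simplified binned bitmap index for one continuous attribute."""
          def __init__(self, values, nbins=64):
              self.values = np.asarray(values, dtype=float)
              self.edges = np.linspace(self.values.min(), self.values.max(), nbins + 1)
              bins = np.clip(np.digitize(self.values, self.edges[1:-1]), 0, nbins - 1)
              self.bitmaps = [bins == b for b in range(nbins)]  # one boolean bitmap per bin

          def range_query(self, lo, hi):
              """Boolean mask of rows with lo <= value < hi."""
              nb = len(self.bitmaps)
              b_lo = int(np.clip(np.digitize(lo, self.edges[1:-1]), 0, nb - 1))
              b_hi = int(np.clip(np.digitize(hi, self.edges[1:-1]), 0, nb - 1))
              hit = np.zeros(len(self.values), dtype=bool)
              for b in range(b_lo + 1, b_hi):      # bins fully inside the range
                  hit |= self.bitmaps[b]
              for b in {b_lo, b_hi}:               # boundary bins: candidate check
                  hit |= self.bitmaps[b] & (self.values >= lo) & (self.values < hi)
              return hit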

  7. Analytical modeling for fractional multi-dimensional diffusion equations by using Laplace transform

    Directory of Open Access Journals (Sweden)

    Devendra Kumar

    2015-01-01

    Full Text Available In this paper, we propose a simple numerical algorithm for solving multi-dimensional diffusion equations of fractional order, which describe density dynamics in a material undergoing diffusion, by using the homotopy analysis transform method. The fractional derivative is described in the Caputo sense. The homotopy analysis transform method is an innovative adjustment of the Laplace transform method and makes the calculation much simpler. The technique is not limited to a small parameter, as in the classical perturbation method. The scheme gives an analytical solution in the form of a convergent series with easily computable components, requiring no linearization or small perturbation. The numerical solutions obtained by the proposed method indicate that the approach is easy to implement and computationally very attractive.

  8. Advanced multi-dimensional imaging of gamma-ray radiation

    International Nuclear Information System (INIS)

    Woodring, Mitchell; Beddingfield, David; Souza, David; Entine, Gerald; Squillante, Michael; Christian, James; Kogan, Alex

    2003-01-01

    The tracking of radiation contamination and distribution has become a high-priority US DOE task. To support DOE needs, Radiation Monitoring Devices Inc. has been actively carrying out research and development on a gamma-radiation imager, RadCam 2000™. The imager is based upon a position-sensitive PMT (PSPMT) coupled to a scintillator near a MURA coded aperture. The modulated gamma flux detected by the PSPMT is mathematically decoded to produce images that are displayed on a computer in near real time. Additionally, we have developed a data-manipulation scheme which allows a multi-dimensional data array, composed of x position, y position, and energy, to be used in the imaging process. In the imager software, a gate can be set on a specific isotope energy to reveal where in the field of view the gated data lie or, conversely, a gate can be set on an area in the field of view to examine which isotopes are present in that area. This process is complicated by the FFT decoding used with the coded aperture; however, we have achieved excellent performance, and results are presented here.

  9. Secondary Channel Bifurcation Geometry: A Multi-dimensional Problem

    Science.gov (United States)

    Gaeuman, D.; Stewart, R. L.

    2017-12-01

    The construction of secondary channels (or side channels) is a popular strategy for increasing aquatic habitat complexity in managed rivers. Such channels, however, frequently experience aggradation that prevents surface water from entering the side channels near their bifurcation points during periods of relatively low discharge. This failure to maintain an uninterrupted surface water connection with the main channel can reduce the habitat value of side channels for fish species that prefer lotic conditions. Various factors have been proposed as potential controls on the fate of side channels, including water surface slope differences between the main and secondary channels, the presence of main channel secondary circulation, transverse bed slopes, and bifurcation angle. A quantitative assessment of more than 50 natural and constructed secondary channels in the Trinity River of northern California indicates that bifurcations can assume a variety of configurations that are formed by different processes and whose longevity is governed by different sets of factors. Moreover, factors such as bifurcation angle and water surface slope vary with discharge level and are continuously distributed in space, such that they must be viewed as a multi-dimensional field rather than a single-valued attribute that can be assigned to a particular bifurcation.

  10. MULTI-DIMENSIONAL PATTERN DISCOVERY OF TRAJECTORIES USING CONTEXTUAL INFORMATION

    Directory of Open Access Journals (Sweden)

    M. Sharif

    2017-10-01

    Full Text Available Movements of point objects are highly sensitive to the underlying situations and conditions during the movement, which are known as contexts. Analyzing movement patterns while accounting for contextual information helps to better understand how point objects behave in various contexts and how contexts affect their trajectories. One potential solution for discovering moving-object patterns is analyzing the similarities of their trajectories. This article, therefore, contextualizes the similarity measure of trajectories using not only their spatial footprints but also a notion of internal and external contexts. The dynamic time warping (DTW) method is employed to assess the multi-dimensional similarities of trajectories. The results of similarity searches are then utilized in discovering the relative movement patterns of the moving point objects. Several experiments were conducted on real datasets obtained from commercial airplanes and the weather information during the flights. The results demonstrated the robustness of the DTW method in quantifying the commonalities of trajectories and discovering movement patterns with 80% accuracy. Moreover, the results revealed the importance of exploiting contextual information, because it can both enhance and restrict movements.
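
    A minimal sketch of the DTW similarity measure used here, applied to trajectories whose samples stack spatial and contextual components into one vector (equal weighting of the components via a Euclidean point distance is an assumption; the paper's context weighting is not specified):

      import numpy as np

      def dtw_distance(a, b):
          """DTW distance between trajectories a (n, d) and b (m, d); each row is
          one sample of d components (e.g. x, y plus contextual variables)."""
          n, m = len(a), len(b)
          D = np.full((n + 1, m + 1), np.inf)
          D[0, 0] = 0.0
          for i in range(1, n + 1):
              for j in range(1, m + 1):
                  cost = np.linalg.norm(a[i - 1] - b[j - 1])   # local point distance
                  D[i, j] = cost + min(D[i - 1, j],            # insertion
                                       D[i, j - 1],            # deletion
                                       D[i - 1, j - 1])        # match
          return D[n, m]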

  11. The development of a multi-dimensional gambling accessibility scale.

    Science.gov (United States)

    Hing, Nerilee; Haw, John

    2009-12-01

    The aim of the current study was to develop a scale of gambling accessibility that would have theoretical significance to exposure theory and also serve to highlight the accessibility risk factors for problem gambling. Scale items were generated from the Productivity Commission's (Australia's Gambling Industries: Report No. 10. AusInfo, Canberra, 1999) recommendations and tested on a group with high exposure to the gambling environment. In total, 533 gaming venue employees (aged 18-70 years; 67% women) completed a questionnaire that included six 13-item scales measuring accessibility across a range of gambling forms (gaming machines, keno, casino table games, lotteries, horse and dog racing, sports betting). Also included in the questionnaire was the Problem Gambling Severity Index (PGSI) along with measures of gambling frequency and expenditure. Principal components analysis indicated that a common three factor structure existed across all forms of gambling and these were labelled social accessibility, physical accessibility and cognitive accessibility. However, convergent validity was not demonstrated with inconsistent correlations between each subscale and measures of gambling behaviour. These results are discussed in light of exposure theory and the further development of a multi-dimensional measure of gambling accessibility.

  12. Multi-Dimensional Pattern Discovery of Trajectories Using Contextual Information

    Science.gov (United States)

    Sharif, M.; Alesheikh, A. A.

    2017-10-01

    Movements of point objects are highly sensitive to the underlying situations and conditions during the movement, which are known as contexts. Analyzing movement patterns while accounting for contextual information helps to better understand how point objects behave in various contexts and how contexts affect their trajectories. One potential solution for discovering moving-object patterns is analyzing the similarities of their trajectories. This article, therefore, contextualizes the similarity measure of trajectories using not only their spatial footprints but also a notion of internal and external contexts. The dynamic time warping (DTW) method is employed to assess the multi-dimensional similarities of trajectories. The results of similarity searches are then utilized in discovering the relative movement patterns of the moving point objects. Several experiments were conducted on real datasets obtained from commercial airplanes and the weather information during the flights. The results demonstrated the robustness of the DTW method in quantifying the commonalities of trajectories and discovering movement patterns with 80% accuracy. Moreover, the results revealed the importance of exploiting contextual information, because it can both enhance and restrict movements.

  13. The multi-dimensional roles of astrocytes in ALS.

    Science.gov (United States)

    Yamanaka, Koji; Komine, Okiru

    2018-01-01

    Despite significant progress in understanding the molecular and genetic aspects of amyotrophic lateral sclerosis (ALS), a fatal neurodegenerative disease characterized by the progressive loss of motor neurons, the precise and comprehensive pathomechanisms remain largely unknown. In addition to motor neuron involvement, recent studies using cellular and animal models of ALS indicate that there is a complex interplay between motor neurons and neighboring non-neuronal cells, such as astrocytes, in non-cell autonomous neurodegeneration. Astrocytes are key homeostatic cells that play numerous supportive roles in maintaining the brain environment. In neurodegenerative diseases such as ALS, astrocytes change their shape and molecular expression patterns and are referred to as reactive or activated astrocytes. Reactive astrocytes in ALS lose their beneficial functions and gain detrimental roles. In addition, interactions between motor neurons and astrocytes are impaired in ALS. In this review, we summarize growing evidence that astrocytes are critically involved in the survival and demise of motor neurons through several key molecules and cascades in astrocytes in both sporadic and inherited ALS. These observations strongly suggest that astrocytes have multi-dimensional roles in disease and are a viable therapeutic target for ALS.

  14. Gamma histograms for radiotherapy plan evaluation

    International Nuclear Information System (INIS)

    Spezi, Emiliano; Lewis, D. Geraint

    2006-01-01

    Background and purpose: The technique known as the 'γ evaluation method' incorporates pass-fail criteria for both distance-to-agreement and dose difference analysis of 3D dose distributions and provides a numerical index (γ) as a measure of the agreement between two datasets. As the γ evaluation index is being adopted in more centres as part of treatment plan verification procedures for 2D and 3D dose maps, the development of methods capable of encapsulating the information provided by this technique is recommended. Patients and methods: In this work the concept of the γ index was extended to create gamma histograms (GH) in order to provide a measure of the agreement between two datasets in two or three dimensions. Gamma area histogram (GAH) and gamma volume histogram (GVH) graphs were produced using one or more 2D γ maps generated for each slice of the irradiated volume. GHs were calculated for IMRT plans, evaluating the 3D dose distribution from a commercial treatment planning system (TPS) against a Monte Carlo (MC) calculation used as the reference dataset. Results: The extent of local anatomical inhomogeneities in the plans under consideration was strongly correlated with the level of difference between the reference and evaluated calculations. GHs provided an immediate visual representation of the proportion of the treated volume that fulfilled the γ criterion and offered a concise method for comparative numerical evaluation of dose distributions. Conclusions: We have introduced the concept of GHs and investigated its applications to the evaluation and verification of IMRT plans. The gamma histogram concept set out in this paper can provide a valuable technique for quantitative comparison of dose distributions and could be applied as a tool for the quality assurance of treatment planning systems.
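    A gamma histogram can be sketched directly from its definition: compute a γ map between a reference and an evaluated dose grid, then accumulate the fraction of area (or volume) passing each γ level. The following is a brute-force illustration under assumed criteria (3 mm distance-to-agreement, 3% global dose difference, 1 mm grid spacing), not the authors' code.

        # Brute-force 2-D gamma index (global normalization) - a sketch, not
        # the paper's code. dta in mm, dd as a fraction of the max dose.
        import numpy as np

        def gamma_map(ref, ev, spacing=1.0, dta=3.0, dd=0.03):
            ny, nx = ref.shape
            yy, xx = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
            dd_abs = dd * ref.max()
            g = np.empty_like(ref, dtype=float)
            for iy in range(ny):
                for ix in range(nx):
                    dist2 = ((yy - iy) ** 2 + (xx - ix) ** 2) * spacing ** 2
                    dose2 = (ev - ref[iy, ix]) ** 2
                    g[iy, ix] = np.sqrt((dist2 / dta ** 2 + dose2 / dd_abs ** 2).min())
            return g

        ref = np.random.default_rng(1).random((40, 40)) * 2.0            # toy dose maps
        ev = ref + np.random.default_rng(2).normal(0.0, 0.02, ref.shape)
        g = gamma_map(ref, ev)

        levels = np.linspace(0.0, 2.0, 21)
        gah = [(g <= t).mean() for t in levels]        # cumulative gamma area histogram
        print("pass rate at gamma <= 1:", (g <= 1.0).mean())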

  15. Algorithm for locating the extremum of a multi-dimensional constrained function and its application to the PPPL Hybrid Study

    International Nuclear Information System (INIS)

    Bathke, C.

    1978-03-01

    A description is presented of a general algorithm for locating the extremum of a multi-dimensional constrained function. The algorithm employs a series of techniques dominated by random shrinkage, steepest descent, and adaptive creeping. A discussion follows of the algorithm's application to a "real-world" problem, namely the optimization of the price of electricity, P_eh, from a hybrid fusion-fission reactor. On the basis of comparisons with other optimization schemes of a survey nature, the algorithm is concluded to yield a good approximation to the location of a function's optimum.
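    The record names the dominant techniques but not their details, so the following is a speculative sketch of one plausible reading: random sampling inside a progressively shrinking box ("random shrinkage") followed by finite-difference steepest descent. The test function, bounds, and step sizes are all placeholders; the actual algorithm and the hybrid-reactor cost function P_eh are not given in this record.

        # A speculative sketch of the two dominant phases named in the abstract.
        import numpy as np

        def minimize(f, lo, hi, n_shrink=20, n_samples=200, lr=1e-2, eps=1e-6):
            lo, hi = np.asarray(lo, float), np.asarray(hi, float)
            rng = np.random.default_rng(0)
            best = None
            for _ in range(n_shrink):                  # random shrinkage phase
                pts = rng.uniform(lo, hi, size=(n_samples, len(lo)))
                vals = np.apply_along_axis(f, 1, pts)
                best = pts[vals.argmin()]
                half = (hi - lo) / 4.0                 # shrink the box around the best point
                lo, hi = best - half, best + half
            x = best                                   # steepest descent phase
            for _ in range(500):
                grad = np.array([(f(x + eps * e) - f(x - eps * e)) / (2 * eps)
                                 for e in np.eye(len(x))])
                x = x - lr * grad
            return x

        print(minimize(lambda v: (v[0] - 1.0) ** 2 + (v[1] + 2.0) ** 2, [-5, -5], [5, 5]))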

  16. Multi-dimensional conversion to the ion-hybrid mode

    International Nuclear Information System (INIS)

    Tracy, E.R.; Kaufman, A.N.; Brizard, A.J.; Morehead, J.J.

    1996-01-01

    We first demonstrate that the dispersion matrix for linear conversion of a magnetosonic wave to an ion-hybrid wave (as in a D-T plasma) can be congruently transformed to Friedland's normal form. As a result, this conversion can be represented as a two-step process of successive linear conversions in phase space. We then proceed to study the multi-dimensional case of tokamak geometry. After Fourier transforming the toroidal dependence, we deal with the two-dimensional poloidal xy-plane and the two-dimensional k_x-k_y plane, forming a four-dimensional phase space. The dispersion manifolds for the magnetosonic wave [D_M(x, k) = 0] and the ion-hybrid wave [D_H(x, k) = 0] are each three-dimensional. (Their intersection, on which mode conversion occurs, is two-dimensional.) The incident magnetosonic wave (radiated by an antenna) is a two-dimensional set of rays (a Lagrangian manifold): k(x) = ∇θ(x), with θ(x) the phase of the magnetosonic wave. When these rays pierce the ion-hybrid dispersion manifold, they convert to a set of ion-hybrid rays. Then, when those rays intersect the magnetosonic dispersion manifold, they convert to a set of "reflected" magnetosonic rays. This set of rays is distinct from the set of incident rays that have been reflected by the inner surface of the tokamak plasma. As a result, the total destructive interference that can occur in the one-dimensional case may become only partial. We explore the implications of this startling phenomenon both analytically and geometrically.

  17. Complexity of possibly gapped histogram and analysis of histogram

    Science.gov (United States)

    Fushing, Hsieh; Roy, Tania

    2018-02-01

    We demonstrate that gaps and distributional patterns embedded within real-valued measurements are inseparable biological and mechanistic information contents of the system. Such patterns are discovered through a data-driven, possibly gapped histogram, which further leads to the geometry-based analysis of histogram (ANOHT). Constructing a possibly gapped histogram is a complex problem of statistical mechanics, the ensemble of candidate histograms being captured by a two-layer Ising model. It is also a distinctive problem of information theory from the perspective of data compression via uniformity. By defining a Hamiltonian (or energy) as the sum of the total coding lengths of bin boundaries and the total decoding errors within bins, the problem of computing the minimum-energy macroscopic states is, surprisingly, resolved by applying the hierarchical clustering algorithm. A possibly gapped histogram thus corresponds to a macro-state. The first phase of ANOHT is then developed for simultaneous comparison of multiple treatments, while the second phase is developed, based on classical empirical process theory, for a tree geometry that can check the authenticity of the branches of the treatment tree. The well-known Iris data are used to illustrate our technical developments, and a large baseball pitching dataset and a heavily right-censored divorce dataset are analysed to showcase the existential gaps and the utilities of ANOHT.
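    The key computational step, forming a possibly gapped histogram via hierarchical clustering, can be imitated crudely as follows. This sketch is ours, not the authors' Hamiltonian minimization: single-linkage clustering splits 1-D data at its largest gaps, and each resulting cluster is binned separately so the gap stays explicit.

        # A simplified stand-in for the Hamiltonian minimization: single-linkage
        # clustering splits the data at its largest gaps; clusters become
        # separately binned segments of a possibly gapped histogram.
        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage

        rng = np.random.default_rng(0)
        x = np.concatenate([rng.normal(0.0, 1.0, 200), rng.normal(8.0, 0.5, 100)])

        Z = linkage(x.reshape(-1, 1), method="single")
        labels = fcluster(Z, t=2, criterion="maxclust")     # two macro-states

        for k in sorted(set(labels)):
            seg = np.sort(x[labels == k])
            counts, edges = np.histogram(seg, bins="auto")
            print(f"cluster {k}: [{seg[0]:.2f}, {seg[-1]:.2f}], "
                  f"{seg.size} points in {counts.size} bins")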

  18. CHIL - a comprehensive histogramming language

    International Nuclear Information System (INIS)

    Milner, W.T.; Biggerstaff, J.A.

    1984-01-01

    A high-level language, CHIL, has been developed for use in processing event-by-event experimental data at the Holifield Heavy Ion Research Facility (HHIRF) using PERKIN-ELMER 3230 computers. CHIL has been fully integrated into all software which supports on-line and off-line histogramming and off-line preprocessing. CHIL supports simple gates, free-form gates (2-D regions of arbitrary shape), condition-test-and-branch statements, bit-tests, loops, calls to up to three user-supplied subroutines, and histogram-generating statements. Any combination of 1-, 2-, 3- or 4-D histograms (32 megachannels maximum) may be recorded at 16 or 32 bits/channel. User routines may intercept the data being processed and modify it as desired. The CPU-intensive part of the processing utilizes microcoded routines which enhance performance by about a factor of two.
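    The record does not reproduce CHIL's actual syntax, so the sketch below is a Python analog of the same workflow rather than CHIL code: events are processed one by one, a free-form 2-D gate acts as the selection condition, and accepted events increment a spectrum.

        # Not CHIL syntax - a Python analog of gated event-by-event histogramming.
        import numpy as np
        from matplotlib.path import Path

        gate = Path([(100, 100), (400, 120), (380, 420), (90, 380)])   # free-form 2-D gate
        spectrum = np.zeros(512, dtype=np.int64)                       # 512-channel histogram

        events = np.random.default_rng(0).integers(0, 512, size=(10_000, 3))
        inside = gate.contains_points(events[:, :2])    # condition test per event
        for e in events[inside]:
            spectrum[e[2]] += 1                         # increment gated spectrum

        print("gated events:", int(inside.sum()))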

  19. CHIL - A comprehensive histogramming language

    International Nuclear Information System (INIS)

    Milner, W.T.; Biggerstaff, J.A.

    1985-01-01

    A high-level language, CHIL, has been developed for use in processing event-by-event experimental data at the Holifield Heavy Ion Research Facility (HHIRF) using PERKIN-ELMER 3230 computers. CHIL has been fully integrated into all software which supports on-line and off-line histogramming and off-line preprocessing. CHIL supports simple gates, free-form gates (2-D regions of arbitrary shape), condition-test-and-branch statements, bit-tests, loops, calls to up to three user-supplied subroutines, and histogram-generating statements. Any combination of 1-, 2-, 3- or 4-D histograms (32 megachannels maximum) may be recorded at 16 or 32 bits/channel. User routines may intercept the data being processed and modify it as desired. The CPU-intensive part of the processing utilizes microcoded routines which enhance performance by about a factor of two.

  1. Multi-dimensional SAR tomography for monitoring the deformation of newly built concrete buildings

    Science.gov (United States)

    Ma, Peifeng; Lin, Hui; Lan, Hengxing; Chen, Fulong

    2015-08-01

    Deformation often occurs in buildings at early ages, and constant inspection of deformation is of significant importance for discovering possible cracking and avoiding wall failure. This paper exploits the multi-dimensional SAR tomography technique to monitor the deformation performance of two newly built buildings (B1 and B2), with a special focus on the effects of concrete creep and shrinkage. To separate the nonlinear thermal expansion from the total deformations, the extended 4-D SAR technique is exploited. The thermal map estimated from 44 TerraSAR-X images demonstrates that the derived thermal amplitude is highly related to building height, owing to the upward accumulative effect of thermal expansion. The linear deformation velocity map reveals that B1 was subject to settlement during the construction period; in addition, the creep and shrinkage of B1 led to wall shortening, a height-dependent movement in the downward direction, and the asymmetrical creep of B2 triggered wall deflection, a height-dependent movement in the deflection direction. It is also validated that the extended 4-D SAR can rectify the bias in the wall shortening and wall deflection estimated by 4-D SAR.
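    The separation of a linear deformation rate from thermal expansion can be illustrated schematically with a per-pixel regression of displacement on acquisition time and temperature. This is a simplified stand-in for the extended 4-D SAR estimation, with all numbers hypothetical.

        # A schematic per-pixel regression (not the authors' estimator):
        # displacement = v * t + k * temperature + offset, so least squares
        # separates the linear rate v from the thermal amplitude k.
        import numpy as np

        rng = np.random.default_rng(0)
        t = np.linspace(0.0, 3.5, 44)                   # years; 44 acquisitions as in the record
        temp = 10.0 * np.sin(2 * np.pi * t) + 20.0      # deg C, hypothetical

        v_true, k_true = -2.0, 0.15                     # mm/yr and mm/degC, hypothetical
        disp = v_true * t + k_true * temp + rng.normal(0.0, 0.3, t.size)

        A = np.column_stack([t, temp, np.ones_like(t)]) # design matrix
        (v, k, c), *_ = np.linalg.lstsq(A, disp, rcond=None)
        print(f"velocity {v:.2f} mm/yr, thermal amplitude {k:.3f} mm/degC")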

  2. Multispectral histogram normalization contrast enhancement

    Science.gov (United States)

    Soha, J. M.; Schwartz, A. A.

    1979-01-01

    A multispectral histogram normalization or decorrelation enhancement which achieves effective color composites by removing interband correlation is described. The enhancement procedure employs either linear or nonlinear transformations to equalize principal component variances. An additional rotation to any set of orthogonal coordinates is thus possible, while full histogram utilization is maintained by avoiding the reintroduction of correlation. For the three-dimensional case, the enhancement procedure may be implemented with a lookup table. An application of the enhancement to Landsat multispectral scanning imagery is presented.
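    The enhancement family described here is essentially a decorrelation stretch. A minimal sketch, not the authors' implementation: rotate the bands into principal components, equalize the component variances, and rotate back so the result stays interpretable in the original band space.

        # A minimal decorrelation stretch: rotate bands to principal
        # components, equalize variances, rotate back, restore band scale.
        import numpy as np

        bands = np.random.default_rng(0).random((3, 256, 256))   # toy 3-band image
        X = bands.reshape(3, -1).T                               # pixels x bands
        mu, sigma = X.mean(axis=0), X.std(axis=0)

        evals, evecs = np.linalg.eigh(np.cov(X - mu, rowvar=False))
        T = evecs @ np.diag(1.0 / np.sqrt(evals)) @ evecs.T      # symmetric whitening
        Y = (X - mu) @ T                                         # decorrelated pixels

        stretched = (Y * sigma + mu).T.reshape(bands.shape)
        print(np.round(np.corrcoef(stretched.reshape(3, -1)), 3))  # ~ identity matrix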

  3. Live histograms in moving windows

    International Nuclear Information System (INIS)

    Zhil'tsov, V.E.

    1989-01-01

    The application of computer graphics to specific hardware testing is discussed. The hardware is a position-sensitive detector (a multiwire proportional chamber), used in high-energy physics experiments, together with its read-out electronics. The testing program (XPERT), which utilises a multi-window user interface, is described. Data are represented as histograms in windows. The windows on the screen may be moved, reordered, and resized. Histograms may be put into any window, and a hardcopy may be made. Some program internals are discussed. The computer environment is quite simple: MS-DOS IBM PC/XT, 256 KB RAM, CGA, 5.25'' FD, Epson MX. 4 refs.; 7 figs.

  4. MRI histogram analysis enables objective and continuous classification of intervertebral disc degeneration.

    Science.gov (United States)

    Waldenberg, Christian; Hebelka, Hanna; Brisby, Helena; Lagerstrand, Kerstin Magdalena

    2018-05-01

    Magnetic resonance imaging (MRI) is the best diagnostic imaging method for low back pain. However, the technique is currently not utilized to its full capacity, often failing to depict painful intervertebral discs (IVDs), potentially due to the rough degeneration classification system used clinically today. MR image histograms, which reflect IVD heterogeneity, may offer sensitive imaging biomarkers for IVD degeneration classification. This study investigates the feasibility of using histogram analysis as a means of objective and continuous grading of IVD degeneration. Forty-nine IVDs in ten low back pain patients (six males, 25-69 years) were examined with MRI (T2-weighted images and T2-maps). Each IVD was semi-automatically segmented on three mid-sagittal slices. Histogram features of the IVD were extracted from the defined regions of interest and correlated to Pfirrmann grade. Both T2-weighted images and T2-maps displayed similar histogram features. Histograms of well-hydrated IVDs displayed two separate peaks, representing annulus fibrosus and nucleus pulposus. Degenerated IVDs displayed decreased peak separation, where the separation was shown to correlate strongly with Pfirrmann grade, and IVDs of different grades displayed distinct histogram appearances. Histogram features correlated well with IVD degeneration, suggesting that IVD histogram analysis is a suitable tool for objective and continuous IVD degeneration classification. As histogram analysis revealed IVD heterogeneity, it may serve as a clinical tool for characterization of regional IVD degeneration effects. To elucidate the usefulness of histogram analysis in patient management, IVD histogram features of asymptomatic and symptomatic individuals need to be compared.
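    The peak-separation feature can be sketched as follows; this is our illustration, not the study's pipeline, and the T2 value pools are synthetic stand-ins for annulus- and nucleus-like voxels.

        # Synthetic ROI histogram with two pools of T2 values; the feature of
        # interest is the separation between the two dominant peaks.
        import numpy as np
        from scipy.signal import find_peaks

        rng = np.random.default_rng(0)
        roi_t2 = np.concatenate([rng.normal(60.0, 8.0, 2000),     # annulus-like pool
                                 rng.normal(120.0, 15.0, 1500)])  # nucleus-like pool

        counts, edges = np.histogram(roi_t2, bins=64)
        centers = 0.5 * (edges[:-1] + edges[1:])
        peaks, _ = find_peaks(counts, prominence=counts.max() * 0.1)

        if len(peaks) >= 2:
            top2 = peaks[np.argsort(counts[peaks])[-2:]]
            print(f"peak separation: {abs(centers[top2[0]] - centers[top2[1]]):.1f} ms")
        else:
            print("single peak - consistent with a degenerated disc")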

  5. Boundary condition histograms for modulated phases

    International Nuclear Information System (INIS)

    Benakli, M.; Gabay, M.; Saslow, W.M.

    1997-11-01

    Boundary conditions strongly affect the results of numerical computations for finite-size inhomogeneous or incommensurate structures. We present a method which makes it possible to deal with this problem, both for ground-state and for critical properties: it combines fluctuating boundary conditions with specific histogram techniques. Our approach applies to classical as well as quantum systems. In particular, current-current correlation functions, which probe the large-scale coherence of the states, can be accurately evaluated. We illustrate our method on a frustrated two-dimensional XY model. (author)

  6. Use of benchmark dose-volume histograms for selection of the optimal technique between three-dimensional conformal radiation therapy and intensity-modulated radiation therapy in prostate cancer

    International Nuclear Information System (INIS)

    Luo Chunhui; Yang, Claus Chunli; Narayan, Samir; Stern, Robin L.; Perks, Julian; Goldberg, Zelanna; Ryu, Janice; Purdy, James A.; Vijayakumar, Srinivasan

    2006-01-01

    Purpose: The aim of this study was to develop and validate our own benchmark dose-volume histograms (DVHs) of the bladder and rectum for both conventional three-dimensional conformal radiation therapy (3D-CRT) and intensity-modulated radiation therapy (IMRT), and to evaluate quantitatively the benefits of using IMRT vs. 3D-CRT in treating localized prostate cancer. Methods and Materials: During the implementation of IMRT for prostate cancer, our policy was to plan each patient with both 3D-CRT and IMRT. This study included 31 patients with T1b to T2c localized prostate cancer, for whom we completed double-planning using both 3D-CRT and IMRT techniques. The target volumes included the prostate, with or without the proximal seminal vesicles. Bladder and rectum DVH data were summarized to obtain an average DVH for each technique and then compared using two-tailed paired t-test analysis. Results: For 3D-CRT our bladder doses were as follows: mean 28.8 Gy, V60 16.4%, V70 10.9%; rectal doses were: mean 39.3 Gy, V60 21.8%, V70 13.6%. IMRT plans resulted in similar mean dose values (bladder 26.4 Gy, rectum 34.9 Gy) but lower values of V70 for the bladder (7.8%) and rectum (9.3%). These benchmark DVHs have resulted in a critical evaluation of our 3D-CRT techniques over time. Conclusion: Our institution has developed benchmark DVHs for the bladder and rectum based on our clinical experience with 3D-CRT and IMRT. We use these standards, as well as differences in individual cases, to make decisions on whether patients may benefit from IMRT treatment rather than 3D-CRT.
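    The DVH metrics quoted above (mean dose, V60, V70) are straightforward to compute from a dose grid and an organ mask. A minimal sketch with toy data, not the planning-system calculation:

        # Mean dose and VD metrics from a 3-D dose grid and a boolean organ mask.
        import numpy as np

        dose = np.random.default_rng(0).random((50, 50, 50)) * 80.0   # toy dose grid, Gy
        organ = np.zeros_like(dose, dtype=bool)
        organ[20:35, 20:35, 20:35] = True                             # toy contour

        organ_dose = dose[organ]
        print(f"mean dose: {organ_dose.mean():.1f} Gy")
        for d in (60.0, 70.0):
            print(f"V{d:.0f}: {100.0 * (organ_dose >= d).mean():.1f}%")

        bins = np.arange(0.0, 81.0, 1.0)                              # cumulative DVH curve
        dvh = [100.0 * (organ_dose >= b).mean() for b in bins]
        print(f"DVH at 60 Gy: {dvh[60]:.1f}%")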

  7. Tracking techniques for the method of characteristics applied to the neutron transport problem in multi-dimensional domains; Techniques de traçage pour la méthode des caractéristiques appliquée à la résolution de l'équation du transport des neutrons en domaines multi-dimensionnels

    Energy Technology Data Exchange (ETDEWEB)

    Fevotte, F

    2008-10-15

    In the past years, the Method of Characteristics (MOC) has become a popular tool for the numerical solution of the neutron transport equation. Among its most interesting advantages are its good ratio of precision to computing time, as well as its ability to accurately describe complicated geometries using unstructured meshes. In order to reduce the computing resources needed by the method of characteristics, we propose in this dissertation two lines of improvement. The first axis of development is based on an analysis of the transverse integration technique in the method of characteristics. Various limitations have been discerned in this regard, which we intend to correct by proposing a new variant of the method of characteristics. Through a better treatment of material discontinuities in the geometry, our aim is to increase the accuracy of the transverse integration formula in order to decrease the computing resources without sacrificing the quality of the results. This method has been tested numerically in order to show its interest. Analysing the numerical results obtained with this new method also allows a better understanding of the transverse integration approximations. Another improvement comes from the observation that industrial reactor cores exhibit very complex structures, but are often partly composed of a lattice of geometrically identical cells or assemblies. We propose a systematic method that takes advantage of repetitions in the geometry to reduce the storage requirements for geometric data. Based on group theory, this method can be employed for all lattice geometries. We present some numerical results showing the interest of the method in industrial contexts. (author)

  8. Multi-Dimensional Damage Detection for Surfaces and Structures

    Science.gov (United States)

    Williams, Martha; Lewis, Mark; Roberson, Luke; Medelius, Pedro; Gibson, Tracy; Parks, Steen; Snyder, Sarah

    2013-01-01

    Current designs for inflatable or semi-rigidized structures for habitats and space applications use a multiple-layer construction, alternating thin layers with thicker, stronger layers, which produces a layered composite structure that is much better at resisting damage. Even though such composite structures or layered systems are robust, they can still be susceptible to penetration damage. The ability to detect damage to the surfaces of inflatable or semi-rigid habitat structures is of great interest to NASA. Damage caused by impacts of foreign objects such as micrometeorites can rupture the shell of these structures, causing loss of critical hardware and/or the life of the crew. While not all impacts will have a catastrophic result, it will be very important to identify and locate areas of the exterior shell that have been damaged by impacts so that repairs (or other provisions) can be made to reduce the probability of shell wall rupture. This disclosure describes a system that will provide real-time data regarding the health of the inflatable shell or rigidized structures, and information related to the location and depth of impact damage. The innovation described here is a method of determining the size, location, and direction of damage in a multilayered structure. In the multi-dimensional damage detection system, two-dimensional thin-film detection layers are used to form a layered composite, with non-detection layers separating the detection layers. The non-detection layers may be either thicker or thinner than the detection layers. The thin-film damage detection layers are thin films of materials with a conductive grid or striped pattern. The conductive pattern may be applied by several methods, including printing, plating, sputtering, photolithography, and etching, and can include as many detection layers as are necessary for the structure construction or to afford the required level of detection detail. The damage is detected using a detector or

  9. A Generic multi-dimensional feature extraction method using multiobjective genetic programming.

    Science.gov (United States)

    Zhang, Yang; Rockett, Peter I

    2009-01-01

    In this paper, we present a generic feature extraction method for pattern classification using multiobjective genetic programming. This not only evolves the (near-)optimal set of mappings from a pattern space to a multi-dimensional decision space, but also simultaneously optimizes the dimensionality of that decision space. The presented framework evolves vector-to-vector feature extractors that maximize class separability. We demonstrate the efficacy of our approach by making statistically founded comparisons with a wide variety of established classifier paradigms over a range of datasets, and find that for most of the pairwise comparisons, our evolutionary method delivers statistically smaller misclassification errors. At very worst, our method displays no statistical difference in a few pairwise comparisons with established classifier/dataset combinations; crucially, none of the misclassification results produced by our method is worse than that of any comparator classifier. Although principally focused on feature extraction, feature selection is also performed as an implicit side effect; we show that both feature extraction and selection are important to the success of our technique. The presented method has the practical consequence of obviating the need to exhaustively evaluate a large family of conventional classifiers when faced with a new pattern recognition problem in order to attain a good classification accuracy.

  10. The method of separation for evolutionary spectral density estimation of multi-variate and multi-dimensional non-stationary stochastic processes

    KAUST Repository

    Schillinger, Dominik

    2013-07-01

    The method of separation can be used as a non-parametric estimation technique, especially suitable for evolutionary spectral density functions of uniformly modulated and strongly narrow-band stochastic processes. The paper at hand provides a consistent derivation of method-of-separation-based spectrum estimation for the general multi-variate and multi-dimensional case. The validity of the method is demonstrated by benchmark tests with uniformly modulated spectra, for which convergence to the analytical solution is demonstrated. The key advantage of the method of separation is the minimization of spectral dispersion due to optimum time- or space-frequency localization. This is illustrated by the calibration of multi-dimensional and multi-variate geometric imperfection models from strongly narrow-band measurements in I-beams and cylindrical shells. Finally, the application of the method-of-separation-based estimates to the stochastic buckling analysis of the example structures is briefly discussed.

  11. Histogram deconvolution - An aid to automated classifiers

    Science.gov (United States)

    Lorre, J. J.

    1983-01-01

    It is shown that N-dimensional histograms are convolved by the addition of noise in the picture domain. Three methods are described which provide the ability to deconvolve such noise-affected histograms. The purpose of the deconvolution is to provide automated classifiers with a higher quality N-dimensional histogram from which to obtain classification statistics.
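    The record does not spell out the paper's three deconvolution methods, so the sketch below shows one standard choice, Richardson-Lucy deconvolution, applied to a 1-D histogram with an assumed Gaussian noise kernel; the channel counts are synthetic.

        # Richardson-Lucy deconvolution of a 1-D histogram with an assumed
        # Gaussian noise kernel; counts and kernel are synthetic.
        import numpy as np
        from scipy.signal import fftconvolve

        def richardson_lucy(observed, kernel, iters=50):
            kernel = kernel / kernel.sum()
            mirrored = kernel[::-1]
            estimate = np.full(observed.shape, observed.mean())
            for _ in range(iters):
                blurred = fftconvolve(estimate, kernel, mode="same")
                ratio = observed / np.maximum(blurred, 1e-12)
                estimate = estimate * fftconvolve(ratio, mirrored, mode="same")
            return estimate

        x = np.arange(-10, 11)
        kernel = np.exp(-0.5 * (x / 2.0) ** 2)              # assumed noise PSF
        true_hist = np.zeros(128)
        true_hist[40], true_hist[80] = 500.0, 300.0         # two sharp classes
        observed = np.maximum(fftconvolve(true_hist, kernel / kernel.sum(), mode="same"), 0.0)

        restored = richardson_lucy(observed, kernel)
        print("restored peak near channel:", int(restored.argmax()))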

  12. Theory and Application of DNA Histogram Analysis.

    Science.gov (United States)

    Bagwell, Charles Bruce

    The underlying principles and assumptions associated with DNA histograms are discussed, along with the characteristics of fluorescent probes. Information theory is described and used to calculate the information content of a DNA histogram. Two major types of DNA histogram analyses are proposed: parametric and nonparametric analysis. Three levels…

  13. Towards Semantic Web Services on Large, Multi-Dimensional Coverages

    Science.gov (United States)

    Baumann, P.

    2009-04-01

    Observed and simulated data in the Earth Sciences often come as coverages, the general term for space-time varying phenomena as set forth by standardization bodies like the Open GeoSpatial Consortium (OGC) and ISO. Among such data are 1-d time series, 2-D surface data, 3-D surface data time series as well as x/y/z geophysical and oceanographic data, and 4-D metocean simulation results. With increasing dimensionality the data sizes grow exponentially, up to Petabyte object sizes. Open standards for exploiting coverage archives over the Web are available to a varying extent. The OGC Web Coverage Service (WCS) standard defines basic extraction operations: spatio-temporal and band subsetting, scaling, reprojection, and data format encoding of the result - a simple interoperable interface for coverage access. More processing functionality is available with products like Matlab, Grid-type interfaces, and the OGC Web Processing Service (WPS). However, these often lack properties known as advantageous from databases: declarativeness (describe results rather than the algorithms), safe in evaluation (no request can keep a server busy infinitely), and optimizable (enable the server to rearrange the request so as to produce the same result faster). WPS defines a geo-enabled SOAP interface for remote procedure calls. This allows to webify any program, but does not allow for semantic interoperability: a function is identified only by its function name and parameters while the semantics is encoded in the (only human readable) title and abstract. Hence, another desirable property is missing, namely an explicit semantics which allows for machine-machine communication and reasoning a la Semantic Web. The OGC Web Coverage Processing Service (WCPS) language, which has been adopted as an international standard by OGC in December 2008, defines a flexible interface for the navigation, extraction, and ad-hoc analysis of large, multi-dimensional raster coverages. It is abstract in that it

  14. LHCb: Machine assisted histogram classification

    CERN Multimedia

    Somogyi, P; Gaspar, C

    2009-01-01

    LHCb is one of the four major experiments under completion at the Large Hadron Collider (LHC). Monitoring the quality of the acquired data is important, because it allows the verification of the detector performance. Anomalies, such as missing values or unexpected distributions, can be indicators of a malfunctioning detector, resulting in poor data quality. Spotting faulty components can be done either visually, using instruments such as the LHCb Histogram Presenter, or by automated tools. In order to assist detector experts in handling the vast monitoring information resulting from the sheer size of the detector, a graph-theoretic clustering tool, combined with machine learning algorithms, is proposed and demonstrated by processing histograms representing 2D event hitmaps. The concept is proven by detecting ion feedback events in the LHCb RICH subdetector.

  15. Visualizing Contour Trees within Histograms

    DEFF Research Database (Denmark)

    Kraus, Martin

    2010-01-01

    Many of the topological features of the isosurfaces of a scalar volume field can be compactly represented by its contour tree. Unfortunately, the contour trees of most real-world volume data sets are too complex to be visualized by dot-and-line diagrams. Therefore, we propose a new visualization that is suitable for large contour trees and efficiently conveys the topological structure of the most important isosurface components. This visualization is integrated into a histogram of the volume data; thus, it offers strictly more information than a traditional histogram. We present algorithms to automatically compute the graph layout and to calculate appropriate approximations of the contour tree and the surface area of the relevant isosurface components. The benefits of this new visualization are demonstrated with the help of several publicly available volume data sets.

  16. Multi-dimensional simulations of core-collapse supernova explosions with CHIMERA

    Science.gov (United States)

    Messer, O. E. B.; Harris, J. A.; Hix, W. R.; Lentz, E. J.; Bruenn, S. W.; Mezzacappa, A.

    2018-04-01

    Unraveling the core-collapse supernova (CCSN) mechanism is a problem that remains essentially unsolved despite more than four decades of effort. Spherically symmetric models with otherwise high physical fidelity generally fail to produce explosions, and it is widely accepted that CCSNe are inherently multi-dimensional. Progress in realistic modeling has occurred recently through the availability of petascale platforms and the increasing sophistication of supernova codes. We will discuss our most recent work on understanding neutrino-driven CCSN explosions employing multi-dimensional neutrino-radiation hydrodynamics simulations with the Chimera code. We discuss the inputs and resulting outputs from these simulations, the role of neutrino radiation transport, and the importance of multi-dimensional fluid flows in shaping the explosions. We also highlight the production of 48Ca in long-running Chimera simulations.

  17. Towards Optimal Multi-Dimensional Query Processing with BitmapIndices

    Energy Technology Data Exchange (ETDEWEB)

    Rotem, Doron; Stockinger, Kurt; Wu, Kesheng

    2005-09-30

    Bitmap indices have been widely used in scientific applications and commercial systems for processing complex, multi-dimensional queries where traditional tree-based indices would not work efficiently. This paper studies strategies for minimizing the access costs for processing multi-dimensional queries using bitmap indices with binning. Innovative features of our algorithm include (a) optimally placing the bin boundaries and (b) dynamically reordering the evaluation of the query terms. In addition, we derive several analytical results concerning optimal bin allocation for a probabilistic query model. Our experimental evaluation with real life data shows an average I/O cost improvement of at least a factor of 10 for multi-dimensional queries on datasets from two different applications. Our experiments also indicate that the speedup increases with the number of query dimensions.
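    The central idea, bitmap indices with binning, can be sketched compactly: fully covered bins are answered from the bitmaps alone, and only the two edge bins of a range query require a candidate check against the raw values. This illustration is ours and omits the paper's bin-boundary optimization and term reordering.

        # Binned bitmap index: bitmaps answer fully covered bins; only the two
        # edge bins require a candidate check against the raw values.
        import numpy as np

        values = np.random.default_rng(0).random(100_000) * 100.0
        edges = np.linspace(0.0, 100.0, 11)                # 10 equi-width bins
        bin_of = np.digitize(values, edges) - 1
        bitmaps = [bin_of == b for b in range(10)]

        def range_query(lo, hi):
            lo_bin, hi_bin = np.digitize([lo, hi], edges) - 1
            hits = np.zeros(values.size, dtype=bool)
            for b in range(lo_bin + 1, hi_bin):            # fully covered bins
                hits |= bitmaps[b]
            for b in {lo_bin, hi_bin}:                     # edge bins: candidate check
                hits |= bitmaps[b] & (values >= lo) & (values <= hi)
            return hits

        print(range_query(12.3, 47.9).sum(),
              ((values >= 12.3) & (values <= 47.9)).sum())   # identical counts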

  18. Development of multi-dimensional body image scale for malaysian female adolescents.

    Science.gov (United States)

    Chin, Yit Siew; Taib, Mohd Nasir Mohd; Shariff, Zalilah Mohd; Khor, Geok Lin

    2008-01-01

    The present study was conducted to develop a Multi-dimensional Body Image Scale for Malaysian female adolescents. Data were collected from 328 female adolescents at a secondary school in the Kuantan district, state of Pahang, Malaysia, using a self-administered questionnaire and anthropometric measurements. The questionnaire comprised multiple measures of body image, the Eating Attitude Test (EAT-26; Garner & Garfinkel, 1979) and the Rosenberg Self-esteem Inventory (Rosenberg, 1965). The 152 items from the selected multiple measures of body image were examined through factor analysis and for internal consistency. Correlations between the Multi-dimensional Body Image Scale and body mass index (BMI), risk of eating disorders and self-esteem were assessed for construct validity. A seven-factor model of a 62-item Multi-dimensional Body Image Scale for Malaysian female adolescents, with construct validity and good internal consistency, was developed. The scale encompasses (1) preoccupation with thinness and dieting behavior, (2) appearance and body satisfaction, (3) body importance, (4) muscle-increasing behavior, (5) extreme dieting behavior, (6) appearance importance, and (7) perception of size and shape dimensions. In addition, a multi-dimensional body image composite score was proposed to screen for negative body image risk in female adolescents. Body image was found to be correlated with BMI, risk of eating disorders and self-esteem in female adolescents. In short, the present study supports a multi-dimensional concept of body image and provides new insight into its multi-dimensionality in Malaysian female adolescents, with preliminary validity and reliability of the scale. The Multi-dimensional Body Image Scale can be used in future intervention programs to identify female adolescents who are potentially at risk of developing body image disturbance.

  19. POLARIZED LINE FORMATION IN MULTI-DIMENSIONAL MEDIA. III. HANLE EFFECT WITH PARTIAL FREQUENCY REDISTRIBUTION

    International Nuclear Information System (INIS)

    Anusha, L. S.; Nagendra, K. N.

    2011-01-01

    In two previous papers, we solved the polarized radiative transfer (RT) equation in multi-dimensional (multi-D) geometries with partial frequency redistribution as the scattering mechanism. We assumed Rayleigh scattering as the only source of linear polarization (Q/I, U/I) in both these papers. In this paper, we extend these previous works to include the effect of weak oriented magnetic fields (Hanle effect) on line scattering. We generalize the technique of Stokes vector decomposition in terms of the irreducible spherical tensors T^K_Q, developed by Anusha and Nagendra, to the case of RT with Hanle effect. A fast iterative method of solution (based on the Stabilized Preconditioned Bi-Conjugate-Gradient technique), developed by Anusha et al., is now generalized to the case of RT in magnetized three-dimensional media. We use the efficient short-characteristics formal solution method for multi-D media, generalized appropriately to the present context. The main results of this paper are the following: (1) a comparison of emergent (I, Q/I, U/I) profiles formed in one-dimensional (1D) media, with the corresponding emergent, spatially averaged profiles formed in multi-D media, shows that in the spatially resolved structures, the assumption of 1D may lead to large errors in linear polarization, especially in the line wings. (2) The multi-D RT in semi-infinite non-magnetic media causes a strong spatial variation of the emergent (Q/I, U/I) profiles, which is more pronounced in the line wings. (3) The presence of a weak magnetic field modifies the spatial variation of the emergent (Q/I, U/I) profiles in the line core, by producing significant changes in their magnitudes.

  20. 3D Model Retrieval Based on Vector Quantisation Index Histograms

    International Nuclear Information System (INIS)

    Lu, Z M; Luo, H; Pan, J S

    2006-01-01

    This paper proposes a novel technique for retrieving 3D mesh models using vector quantisation index histograms. First, points are sampled uniformly on the mesh surface. Second, five features representing global and local properties are extracted for each point, yielding a feature vector per point. Third, several models are selected from each class and their feature vectors are employed as a training set; after training with the LBG algorithm, a public codebook is constructed. Next, the codeword index histograms of the query model and of the models in the database are computed. The last step is to compute the distance between the histogram of the query and those of the models in the database. Experimental results show the effectiveness of our method.
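    The retrieval pipeline can be sketched end to end, with scikit-learn's KMeans standing in for LBG codebook training and random vectors standing in for the five per-point mesh features:

        # KMeans stands in for LBG codebook training; feature vectors are
        # random stand-ins for the five per-point mesh features.
        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)
        train_feats = rng.random((5000, 5))                # pooled training features
        codebook = KMeans(n_clusters=64, n_init=4, random_state=0).fit(train_feats)

        def index_histogram(point_feats):
            idx = codebook.predict(point_feats)            # codeword index per point
            h = np.bincount(idx, minlength=64).astype(float)
            return h / h.sum()

        query = index_histogram(rng.random((1000, 5)))
        database = [index_histogram(rng.random((1000, 5))) for _ in range(20)]
        dists = [np.abs(query - h).sum() for h in database]   # L1 histogram distance
        print("best match:", int(np.argmin(dists)))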

  1. Numerical solution to a multi-dimensional linear inverse heat conduction problem by a splitting-based conjugate gradient method

    International Nuclear Information System (INIS)

    Dinh Nho Hao; Nguyen Trung Thanh; Sahli, Hichem

    2008-01-01

    In this paper we consider a multi-dimensional inverse heat conduction problem with time-dependent coefficients in a box, which is well known to be severely ill-posed, by a variational method. The gradient of the functional to be minimized is obtained with the aid of an adjoint problem, and the conjugate gradient method with a stopping rule is then applied to this ill-posed optimization problem. To enhance the stability and the accuracy of the numerical solution to the problem, we apply this scheme to the discretized inverse problem rather than to the continuous one. The difficulties posed by the large dimensions of the discretized problems are overcome by a splitting method which only requires the solution of easy-to-solve one-dimensional problems. The numerical results provided by our method are very good, and the techniques seem to be very promising.

  2. Chemical shift-dependent apparent scalar couplings: An alternative concept of chemical shift monitoring in multi-dimensional NMR experiments

    International Nuclear Information System (INIS)

    Kwiatkowski, Witek; Riek, Roland

    2003-01-01

    The paper presents an alternative technique for chemical shift monitoring in a multi-dimensional NMR experiment. The monitored chemical shift is coded in the line-shape of a cross-peak through an apparent residual scalar coupling active during an established evolution period or acquisition. The size of the apparent scalar coupling is manipulated with an off-resonance radio-frequency pulse in order to correlate the size of the coupling with the position of the additional chemical shift. The strength of this concept is that chemical shift information is added without an additional evolution period and accompanying polarization transfer periods. This concept was incorporated into the three-dimensional triple-resonance experiment HNCA, adding the information of the 1Hα chemical shifts. The experiment is called HNCA coded HA, since the chemical shift of 1Hα is coded in the line-shape of the cross-peak along the 13Cα dimension.

  3. The use of multi-dimensional flow and morphodynamic models for restoration design analysis

    Science.gov (United States)

    McDonald, R.; Nelson, J. M.

    2013-12-01

    River restoration projects with the goal of restoring a wide range of morphologic and ecologic channel processes and functions have become common. The complex interactions between flow and sediment-transport make it challenging to design river channels that are both self-sustaining and improve ecosystem function. The relative immaturity of the field of river restoration and shortcomings in existing methodologies for evaluating channel designs contribute to this problem, often leading to project failures. The call for increased monitoring of constructed channels to evaluate which restoration techniques do and do not work is ubiquitous and may lead to improved channel restoration projects. However, an alternative approach is to detect project flaws before the channels are built by using numerical models to simulate hydraulic and sediment-transport processes and habitat in the proposed channel (Restoration Design Analysis). Multi-dimensional models provide spatially distributed quantities throughout the project domain that may be used to quantitatively evaluate restoration designs for such important metrics as (1) the change in water-surface elevation which can affect the extent and duration of floodplain reconnection, (2) sediment-transport and morphologic change which can affect the channel stability and long-term maintenance of the design; and (3) habitat changes. These models also provide an efficient way to evaluate such quantities over a range of appropriate discharges including low-probability events which often prove the greatest risk to the long-term stability of restored channels. Currently there are many free and open-source modeling frameworks available for such analysis including iRIC, Delft3D, and TELEMAC. In this presentation we give examples of Restoration Design Analysis for each of the metrics above from projects on the Russian River, CA and the Kootenai River, ID. These examples demonstrate how detailed Restoration Design Analysis can be used to

  4. Development and empirical validation of symmetric component measures of multi-dimensional constructs

    DEFF Research Database (Denmark)

    Sørensen, Hans Eibe; Slater, Stanley F.

    2008-01-01

    Atheoretical measure purification may lead to construct deficient measures. The purpose of this paper is to provide a theoretically driven procedure for the development and empirical validation of symmetric component measures of multi-dimensional constructs. We place particular emphasis on establ...

  5. Developing a Multi-Dimensional Evaluation Framework for Faculty Teaching and Service Performance

    Science.gov (United States)

    Baker, Diane F.; Neely, Walter P.; Prenshaw, Penelope J.; Taylor, Patrick A.

    2015-01-01

    A task force was created in a small, AACSB-accredited business school to develop a more comprehensive set of standards for faculty performance. The task force relied heavily on faculty input to identify and describe key dimensions that capture effective teaching and service performance. The result is a multi-dimensional framework that will be used…

  6. Developing Multi-Dimensional Evaluation Criteria for English Learning Websites with University Students and Professors

    Science.gov (United States)

    Liu, Gi-Zen; Liu, Zih-Hui; Hwang, Gwo-Jen

    2011-01-01

    Many English learning websites have been developed worldwide, but little research has been conducted concerning the development of comprehensive evaluation criteria. The main purpose of this study is thus to construct a multi-dimensional set of criteria to help learners and teachers evaluate the quality of English learning websites. These…

  7. A Replication Study on the Multi-Dimensionality of Online Social Presence

    Science.gov (United States)

    Mykota, David B.

    2015-01-01

    The purpose of the present study is to conduct an external replication into the multi-dimensionality of social presence as measured by the Computer-Mediated Communication Questionnaire (Tu, 2005). Online social presence is one of the more important constructs for determining the level of interaction and effectiveness of learning in an online…

  8. Multi-dimensional microanalysis of masklessly implanted atoms using focused heavy ion beam

    International Nuclear Information System (INIS)

    Mokuno, Yoshiaki; Horino, Yuji; Chayahara, Akiyoshi; Kiuchi, Masato; Fujii, Kanenaga; Satou, Mamoru

    1992-01-01

    A multi-dimensional structure fabricated by maskless MeV gold implantation in a silicon wafer was analyzed with a 3 MeV carbon ion microprobe using a microbeam line developed at GIRIO. The minimum line width of the implanted region was estimated to be about 5 μm. The advantages of heavy ions for microanalysis were demonstrated. (author)

  9. Multi-dimensional database design and implementation of dam safety monitoring system

    Directory of Open Access Journals (Sweden)

    Zhao Erfeng

    2008-09-01

    To improve the effectiveness of dam safety monitoring database systems, the development process of a multi-dimensional conceptual data model was analyzed and a logical design was achieved in multi-dimensional database mode. The optimal data model was confirmed by identifying data objects, defining relations and reviewing entities. The conversion of relations among entities to external keys, and of entities and physical attributes to tables and fields, is interpreted completely. On this basis, a multi-dimensional database reflecting the management and analysis of monitoring data in a dam safety monitoring system has been established, for which fact tables and dimension tables have been designed. Finally, based on the service design and user interface design, the dam safety monitoring system has been developed with Delphi as the development tool. This development project shows that a multi-dimensional database can simplify the development process and minimize hidden dangers in the database structure design. It is superior to other dam safety monitoring system development models and can provide a new research direction for system developers.

  10. Skip-webs: Efficient distributed data structures for multi-dimensional data sets

    DEFF Research Database (Denmark)

    Arge, Lars; Eppstein, David; Goodrich, Michael T.

    2005-01-01

    querying scenarios, which include linear (one-dimensional) data, such as sorted sets, as well as multi-dimensional data, such as d-dimensional octrees and digital tries of character strings defined over a fixed alphabet. We show how to perform a query over such a set of n items spread among n hosts using O...

  11. Uncertainty Evaluation with Multi-Dimensional Model of LBLOCA in OPR1000 Plant

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jieun; Oh, Deog Yeon; Seul, Kwang-Won; Lee, Jin Ho [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2016-10-15

    KINS has used KINS-REM (KINS Realistic Evaluation Methodology), which was developed for best-estimate (BE) calculation and uncertainty quantification for regulatory audit. This methodology has been improved continuously by numerous studies of, for example, uncertainty parameters and uncertainty ranges. In this study, to confirm the applicability of the improved KINS-REM to the OPR1000 plant, an uncertainty evaluation with a multi-dimensional model, capturing multi-dimensional phenomena, was conducted with the MARS-KS code. The reactor vessel was modeled using the MULTID component of the MARS-KS code, and a total of 29 uncertainty parameters were considered in 124 sampled calculations. Through the 124 calculations with the Mosaique program and the MARS-KS code, the peak cladding temperature (PCT) was calculated, and the final PCT was determined by the third-order Wilks' formula. The uncertainty parameters with strong influence were investigated by Pearson coefficient analysis; they were mostly related to plant operation and fuel material properties. The results of the 124 calculations and the sensitivity analysis show that the improved KINS-REM can reasonably be applied to uncertainty evaluations with multi-dimensional model calculations of OPR1000 plants.
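    The 124-run sample size matches the third-order one-sided Wilks criterion at the 95%/95% level, so the bounding PCT is simply the third-largest sampled value. A minimal sketch with hypothetical PCT values, including a check of the sample-size condition:

        # Third-order one-sided Wilks estimate at 95%/95% from 124 runs:
        # the third-largest PCT bounds the 95th percentile with 95% confidence.
        import numpy as np
        from math import comb

        pct = np.random.default_rng(0).normal(1150.0, 40.0, 124)   # K, hypothetical PCTs
        wilks_9595 = np.sort(pct)[-3]                              # 3rd order statistic
        print(f"PCT(95/95) = {wilks_9595:.1f} K")

        # Sample-size check: the chance that fewer than 3 runs exceed the true
        # 95th percentile, i.e. P[Bin(124, 0.95) >= 122], must not exceed 0.05.
        n, p = 124, 0.95
        risk = sum(comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(n - 2, n + 1))
        print(f"coverage risk: {risk:.4f} (must be <= 0.05)")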

  12. Multi-dimensional information diffusion and balancing market supply: an agent-based approach

    NARCIS (Netherlands)

    Osinga, S.A.; Kramer, M.R.; Hofstede, G.J.; Beulens, A.J.M.

    2013-01-01

    This agent-based information management model is designed to explore how multi-dimensional information, spreading through a population of agents (for example farmers) affects market supply. Farmers make quality decisions that must be aligned with available markets. Markets distinguish themselves by

  13. Exact asymptotic expansions for solutions of multi-dimensional renewal equations

    International Nuclear Information System (INIS)

    Sgibnev, M S

    2006-01-01

    We derive expansions with exact asymptotic expressions for the remainders for solutions of multi-dimensional renewal equations. The effect of the roots of the characteristic equation on the asymptotic representation of solutions is taken into account. The resulting formulae are used to investigate the asymptotic behaviour of the average number of particles in age-dependent branching processes having several types of particles

  14. Effects of bathymetric lidar errors on flow properties predicted with a multi-dimensional hydraulic model

    Science.gov (United States)

    J. McKean; D. Tonina; C. Bohn; C. W. Wright

    2014-01-01

    New remote sensing technologies and improved computer performance now allow numerical flow modeling over large stream domains. However, there has been limited testing of whether channel topography can be remotely mapped with accuracy necessary for such modeling. We assessed the ability of the Experimental Advanced Airborne Research Lidar, to support a multi-dimensional...

  15. The application of a multi-dimensional assessment approach to talent identification in Australian football.

    Science.gov (United States)

    Woods, Carl T; Raynor, Annette J; Bruce, Lyndell; McDonald, Zane; Robertson, Sam

    2016-07-01

    This study investigated whether a multi-dimensional assessment could assist with talent identification in junior Australian football (AF). Participants were recruited from an elite under-18 (U18) AF competition and classified into two groups: talent identified (State U18 Academy representatives; n = 42; 17.6 ± 0.4 y) and non-talent identified (non-State U18 Academy representatives; n = 42; 17.4 ± 0.5 y). Both groups completed a multi-dimensional assessment, which consisted of physical (standing height, dynamic vertical jump height and 20 m multistage fitness test), technical (kicking and handballing tests) and perceptual-cognitive (video decision-making task) performance outcome tests. A multivariate analysis of variance tested the main effect of status on the test criteria, whilst a receiver operating characteristic curve assessed the discrimination provided by the full assessment. The talent identified players outperformed their non-talent identified peers on each test, and the full assessment discriminated between talent identified and non-talent identified participants. When compared to single-assessment approaches, this multi-dimensional assessment reflects a more comprehensive means of talent identification in AF. This study further highlights the importance of assessing multi-dimensional performance qualities when identifying talented team-sport players.

  16. Low-diffusion rotated upwind schemes, multigrid and defect correction for steady, multi-dimensional Euler flows

    NARCIS (Netherlands)

    Koren, B.; Hackbusch, W.; Trottenberg, U.

    1991-01-01

    Two simple, multi-dimensional upwind discretizations for the steady Euler equations are derived, with the emphasis lying on both a good accuracy and a good solvability. The multi-dimensional upwinding consists of applying a one-dimensional Riemann solver with a locally rotated left and right state,

  17. Development and assessment of multi-dimensional flow model in MARS compared with the RPI air-water experiment

    International Nuclear Information System (INIS)

    Lee, Seok Min; Lee, Un Chul; Bae, Sung Won; Chung, Bub Dong

    2004-01-01

    Multi-dimensional flow models in system codes have been developed over the past many years. RELAP5-3D, CATHARE and TRACE each have their own multi-dimensional flow models and have successfully applied them to system safety analysis. At KAERI, the MARS (Multi-dimensional Analysis of Reactor Safety) code was likewise developed by integrating the RELAP5/MOD3 and COBRA-TF codes. Even though the COBRA-TF module can analyze three-dimensional flow, it is limited in treating 3D shear-stress-dominant phenomena and cylindrical geometries. Therefore, multi-dimensional analysis models were newly developed by implementing three-dimensional momentum flux and diffusion terms. The multi-dimensional model has been assessed against multi-dimensional conceptual problems and CFD code results. Although the assessment results were reasonable, the multi-dimensional model had not been validated against two-phase flow experimental data. In this paper, the multi-dimensional air-water two-phase flow experiment is simulated and analyzed.

  18. Pedagogical Factors Stimulating the Self-Development of Students' Multi-Dimensional Thinking in Terms of Subject-Oriented Teaching

    Science.gov (United States)

    Andreev, Valentin I.

    2014-01-01

    The main aim of this research is to disclose the essence of students' multi-dimensional thinking, and to reveal the ranking of factors which stimulate the effectiveness of the self-development of students' multi-dimensional thinking in subject-oriented teaching. Subject-oriented learning is characterized as a type of learning where…

  19. Synthesis of Joint Volumes, Visualization of Paths, and Revision of Viewing Sequences in a Multi-dimensional Seismic Data Viewer

    Science.gov (United States)

    Chen, D. M.; Clapp, R. G.; Biondi, B.

    2006-12-01

    Ricksep is a freely-available interactive viewer for multi-dimensional data sets. The viewer is very useful for simultaneous display of multiple data sets from different viewing angles, animation of movement along a path through the data space, and selection of local regions for data processing and information extraction. Several new viewing features are added to enhance the program's functionality in the following three aspects. First, two new data synthesis algorithms are created to adaptively combine information from a data set with mostly high-frequency content, such as seismic data, and another data set with mainly low-frequency content, such as velocity data. Using the algorithms, these two data sets can be synthesized into a single data set which resembles the high-frequency data set on a local scale and at the same time resembles the low- frequency data set on a larger scale. As a result, the originally separated high and low-frequency details can now be more accurately and conveniently studied together. Second, a projection algorithm is developed to display paths through the data space. Paths are geophysically important because they represent wells into the ground. Two difficulties often associated with tracking paths are that they normally cannot be seen clearly inside multi-dimensional spaces and depth information is lost along the direction of projection when ordinary projection techniques are used. The new algorithm projects samples along the path in three orthogonal directions and effectively restores important depth information by using variable projection parameters which are functions of the distance away from the path. Multiple paths in the data space can be generated using different character symbols as positional markers, and users can easily create, modify, and view paths in real time. Third, a viewing history list is implemented which enables Ricksep's users to create, edit and save a recipe for the sequence of viewing states. Then, the recipe

  20. An Overview of Multi-Dimensional Models of the Sacramento–San Joaquin Delta

    Directory of Open Access Journals (Sweden)

    Michael L. MacWilliams

    2016-12-01

    doi: https://doi.org/10.15447/sfews.2016v14iss4art2. Over the past 15 years, the development and application of multi-dimensional hydrodynamic models in San Francisco Bay and the Sacramento–San Joaquin Delta has transformed our ability to analyze and understand the underlying physics of the system. Initial applications of three-dimensional models focused primarily on salt intrusion, and provided a valuable resource for investigating how sea level rise and levee failures in the Delta could influence water quality in the Delta under future conditions. However, multi-dimensional models have also provided significant insights into some of the fundamental biological relationships that have shaped our thinking about the system by exploring the relationship among X2, flow, fish abundance, and the low salinity zone. Through the coupling of multi-dimensional models with wind wave and sediment transport models, it has been possible to move beyond salinity to understand how large-scale changes to the system are likely to affect sediment dynamics, and to assess the potential effects on species that rely on turbidity for habitat. Lastly, the coupling of multi-dimensional hydrodynamic models with particle tracking models has led to advances in our thinking about residence time, the retention of food organisms in the estuary, the effect of south Delta exports on larval entrainment, and the pathways and behaviors of salmonids that travel through the Delta. This paper provides an overview of these recent advances and how they have increased our understanding of the distribution and movement of fish and food organisms. The applications presented serve as a guide to the current state of the science of Delta modeling and provide examples of how we can use multi-dimensional models to predict how future Delta conditions will affect both fish and water supply.

  1. Decomposition analysis of differential dose volume histograms

    International Nuclear Information System (INIS)

    Heuvel, Frank van den

    2006-01-01

    Dose volume histograms are a common tool to assess the value of a treatment plan for various forms of radiation therapy treatment. The purpose of this work is to introduce, validate, and apply a set of tools to analyze differential dose volume histograms by decomposing them into physically and clinically meaningful normal distributions. A weighted sum of the decomposed normal distributions (e.g., weighted dose) is proposed as a new measure of target dose, rather than the more unstable point dose. The method and its theory are presented and validated using simulated distributions. Additional validation is performed by analyzing simple four-field box techniques encompassing a predefined target, using different treatment energies inside a water phantom. Furthermore, two clinical situations are analyzed using this methodology to illustrate practical usefulness. A treatment plan for a breast patient using a tangential field setup with wedges is compared to a comparable geometry using dose compensators. Finally, a normal tissue complication probability (NTCP) calculation is refined using this decomposition. The NTCP calculation is performed on a liver as organ at risk in a treatment of a mesothelioma patient with involvement of the right lung. The comparison of the wedged breast treatment versus the compensator technique yields comparable classical dose parameters (e.g., conformity index ≅1 and equal dose at the ICRU dose point). The methodology proposed here shows a 4% difference in weighted dose outlining the difference in treatment using a single parameter instead of at least two in a classical analysis (e.g., mean dose, and maximal dose, or total dose variance). NTCP-calculations for the mesothelioma case are generated automatically and show a 3% decrease with respect to the classical calculation. The decrease is slightly dependent on the fractionation and on the α/β-value utilized. In conclusion, this method is able to distinguish clinically
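
    As a rough illustration of the decomposition idea (not the author's exact algorithm), the sketch below fits a two-component normal mixture to a synthetic differential DVH with SciPy and reports the weighted dose; all data and parameter values here are hypothetical.

```python
# Illustrative sketch: decompose a differential dose-volume histogram
# into two normal components and report the weighted dose.
import numpy as np
from scipy.optimize import curve_fit

def two_gaussians(d, w1, mu1, s1, w2, mu2, s2):
    g = lambda mu, s: np.exp(-0.5 * ((d - mu) / s) ** 2) / (s * np.sqrt(2 * np.pi))
    return w1 * g(mu1, s1) + w2 * g(mu2, s2)

# hypothetical differential DVH: dose bin centres (Gy) and relative volume
dose = np.linspace(40.0, 70.0, 61)
volume = two_gaussians(dose, 0.7, 60.0, 1.5, 0.3, 55.0, 3.0)
volume += np.random.default_rng(0).normal(0, 1e-3, dose.size)

p0 = [0.5, 59.0, 2.0, 0.5, 54.0, 2.0]          # initial guesses
popt, _ = curve_fit(two_gaussians, dose, volume, p0=p0)

w = np.array([popt[0], popt[3]])                # component weights
mu = np.array([popt[1], popt[4]])               # component mean doses
weighted_dose = np.sum(w * mu) / np.sum(w)      # weighted-dose measure
print(f"weighted dose = {weighted_dose:.2f} Gy")
```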

  2. Clinical Utility of Blood Cell Histogram Interpretation.

    Science.gov (United States)

    Thomas, E T Arun; Bhagya, S; Majeed, Abdul

    2017-09-01

    An automated haematology analyser provides blood cell histograms by plotting the sizes of different blood cells on the X-axis and their relative number on the Y-axis. Histogram interpretation needs careful analysis of the Red Blood Cell (RBC), White Blood Cell (WBC) and platelet distribution curves. Histogram analysis is often a neglected part of the automated haemogram which, if interpreted well, has significant potential to provide diagnostically relevant information even before higher-level investigations are ordered.

  3. System for histogram entry, retrieval, and plotting

    International Nuclear Information System (INIS)

    Kellogg, M.; Gallup, J.M.; Shlaer, S.; Spencer, N.

    1977-10-01

    This manual describes the systems for producing histograms and dot plots that were designed for use in connection with the Q general-purpose data-acquisition system. These systems allow for the creation of histograms; the entry, retrieval, and plotting of data in the form of histograms; and the dynamic display of scatter plots as data are acquired. Although the systems are designed for use with Q, they can also be used as a part of other applications. 3 figures

  4. Histogram equalization with Bayesian estimation for noise robust speech recognition.

    Science.gov (United States)

    Suh, Youngjoo; Kim, Hoirin

    2018-02-01

    The histogram equalization approach is an efficient feature normalization technique for noise robust automatic speech recognition. However, it suffers from performance degradation when some fundamental conditions are not satisfied in the test environment. To remedy these limitations of the original histogram equalization methods, class-based histogram equalization approach has been proposed. Although this approach showed substantial performance improvement under noise environments, it still suffers from performance degradation due to the overfitting problem when test data are insufficient. To address this issue, the proposed histogram equalization technique employs the Bayesian estimation method in the test cumulative distribution function estimation. It was reported in a previous study conducted on the Aurora-4 task that the proposed approach provided substantial performance gains in speech recognition systems based on the acoustic modeling of the Gaussian mixture model-hidden Markov model. In this work, the proposed approach was examined in speech recognition systems with deep neural network-hidden Markov model (DNN-HMM), the current mainstream speech recognition approach where it also showed meaningful performance improvement over the conventional maximum likelihood estimation-based method. The fusion of the proposed features with the mel-frequency cepstral coefficients provided additional performance gains in DNN-HMM systems, which otherwise suffer from performance degradation in the clean test condition.
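
    The core of histogram equalization as feature normalization is CDF matching. The sketch below is a minimal quantile-mapping version with an ad hoc shrinkage weight standing in for the Bayesian smoothing of the test CDF; `tau` and all names are our assumptions, not the paper's.

```python
# Minimal sketch: map each test feature value through a smoothed test CDF
# and then through the inverse of a reference (training) CDF. With few
# test frames, the blend trusts the reference distribution more.
import numpy as np

def heq_normalize(test, reference, tau=50.0):
    ref_sorted = np.sort(reference)
    n = test.size
    ranks = np.argsort(np.argsort(test))
    emp_cdf = (ranks + 0.5) / n                          # empirical test CDF
    prior_cdf = np.searchsorted(ref_sorted, test) / ref_sorted.size
    w = n / (n + tau)                                    # shrinkage weight
    cdf = w * emp_cdf + (1.0 - w) * prior_cdf
    # inverse reference CDF by interpolation over its order statistics
    q = (np.arange(ref_sorted.size) + 0.5) / ref_sorted.size
    return np.interp(cdf, q, ref_sorted)

rng = np.random.default_rng(1)
train = rng.normal(0.0, 1.0, 5000)                       # clean-condition feature
noisy = rng.normal(0.8, 1.7, 200)                        # shifted/scaled test feature
out = heq_normalize(noisy, train)
print(out.mean(), out.std())                             # roughly matched to train
```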

  5. Multi dimensional analysis of Design Basis Events using MARS-LMR

    International Nuclear Information System (INIS)

    Woo, Seung Min; Chang, Soon Heung

    2012-01-01

    Highlights: ► The previously one-dimensional sodium hot pool model is modified to a three-dimensional node system, because a one-dimensional analysis cannot represent the phenomena inside a large pool with many internal components. ► The results of the multi-dimensional analysis are compared with the one-dimensional analysis results under normal operation, TOP (Transient of Over Power), LOF (Loss of Flow), and LOHS (Loss of Heat Sink) conditions. ► Differences in the sodium flow pattern due to structure effects in the hot pool and in the core mass flow rates lead to different sodium temperatures and temperature histories under transient conditions. - Abstract: KALIMER-600 (Korea Advanced Liquid Metal Reactor), a pool-type SFR (Sodium-cooled Fast Reactor), was developed by KAERI (Korea Atomic Energy Research Institute). DBE (Design Basis Events) for KALIMER-600 had been analyzed in one dimension. In this study, the one-dimensional sodium hot pool model is modified to a three-dimensional node system, because the one-dimensional analysis cannot represent the phenomena inside a large pool with many internal components, such as the UIS (Upper Internal Structure), IHX (Intermediate Heat eXchanger), DHX (Decay Heat eXchanger), and pump. The results of the multi-dimensional analysis are compared with the one-dimensional analysis results under normal operation, TOP, LOF, and LOHS conditions. First, the results in the normal operation condition show good agreement between the one- and multi-dimensional analyses. However, according to the sodium temperatures at the core inlet and outlet, the fuel centerline, the cladding and the PDRC (Passive Decay heat Removal Circuit), the temperatures of the one-dimensional analysis are generally higher than those of the multi-dimensional analysis in all conditions except the normal operation state, and the PDRC operation time in the one-dimensional analysis is generally longer than

  6. A Diminution Method of Large Multi-dimensional Data Retrievals

    Directory of Open Access Journals (Sweden)

    Nushwan Yousif Baithoon

    2010-01-01

    The intention of this work is to introduce a method of compressing data at the transmitter (source) and expanding it at the receiver (destination). The amount of data compression is directly related to data dimensionality; hence, for example, an N by N RGB image file is considered to be an M-D image data file with M=3. Also, to capture the amount of scatter in an M-D file, the covariance matrix is calculated, along with the average value of each dimension, to represent the signature or code for each individual data set to be sent by the source. At the destination, random sets can test a particular received signature so that only one set is acceptable, thus giving the corresponding intended set to be received. Sound results are obtained depending on the constraints being implemented. These constraints are user-tolerant in so far as how well tuned or rapid the information is to be processed for data retrieval. The proposed method is well suited to application areas where both source and destination are communicating using the same sets of data files at each end. Also, such a technique is feasible given the availability of fast microprocessors and frame-grabbers.
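
    A minimal sketch of the signature idea described above, assuming the per-channel means plus the covariance matrix form the transmitted code; the matching tolerance and function names are illustrative.

```python
# Represent an M-D data set (here an RGB image, M = 3) by its channel
# means and 3x3 pixel covariance, and let the receiver pick the stored
# set whose signature matches the transmitted one.
import numpy as np

def signature(image):                   # image: (H, W, 3) array
    pixels = image.reshape(-1, 3).astype(float)
    return pixels.mean(axis=0), np.cov(pixels, rowvar=False)

def matches(sig_a, sig_b, tol=1e-6):    # tolerance sets how "well tuned" it is
    mean_a, cov_a = sig_a
    mean_b, cov_b = sig_b
    return (np.allclose(mean_a, mean_b, atol=tol)
            and np.allclose(cov_a, cov_b, atol=tol))

def retrieve(sig, library):
    # receiver side: test candidate sets from the shared library against
    # the transmitted signature and accept only a unique match
    hits = [img for img in library if matches(signature(img), sig)]
    return hits[0] if len(hits) == 1 else None
```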

  7. Multi-dimensional analysis of the ECC behavior in the UPI plant Kori Unit 1

    International Nuclear Information System (INIS)

    Bae, Sungwon; Chung, Bub-Dong; Bang, Young Seok

    2008-01-01

    A multi-dimensional transient analysis of the LBLOCA of Kori Unit 1 has been performed using the MARS code. Based on the 1-D nodalization of Kori Unit 1, the reactor vessel nodalization has been replaced by the multi-dimensional component. The multi-dimensional component for the reactor vessel is designed with 5 radial, 8 peripheral, and 21 vertical grids. It is assumed that the fuel assemblies are homogeneously distributed over the inner 3 radial grids. The next outer radial grid region is modeled as the core bypass, and the outermost radial grid is used for the downcomer region. The corresponding heat structures and fuels are modified to fit the multi-dimensional reactor vessel model. The form drag coefficients for the upper plenum and the core are set to 0.6 and 9.39, respectively. The form drag coefficients for the radial and peripheral directions are assigned the same values on the assumption of a homogeneous distribution of the flow obstacles. After obtaining the 102% power steady operation condition, a cold leg LOCA simulation is performed over a 400 second period. The multi-dimensional steady-state results show no severe differences compared to the traditional 1-D nodalization results. After the ECC injection starts, a liquid pool is maintained in the upper plenum because the ECCS water cannot overcome the upward gas flow that comes from the reactor core through the upper tie plate. The depth of the ECCS water pool is predicted to be about 20% of the total height between the upper tie plate and the centerline of the hot leg pipe. The region in the vicinity of the active ECCS shows a deeper liquid pool. The accumulated water flow rate passing the upper tie plate is calculated from the transient result. Much downward water flow is obtained at the outermost region of the upper plenum space. The downward-flow-dominant region is about 32.3% of the total upper tie plate area. The accumulated ECCS bypass ratio is predicted to be 27.64% at 300 seconds. It is calculated

  8. Algorithms for adaptive histogram equalization

    International Nuclear Information System (INIS)

    Pizer, S.M.; Austin, J.D.; Cromartie, R.; Geselowitz, A.; Ter Haar Romeny, B.; Zimmerman, J.B.; Zuiderveld, K.

    1986-01-01

    Adaptive histogram equalization (ahe) is a contrast enhancement method designed to be broadly applicable and having demonstrated effectiveness [Zimmerman, 1985]. However, slow speed and the overenhancement of noise it produces in relatively homogeneous regions are two problems. The authors summarize algorithms designed to overcome these and other concerns. These algorithms include interpolated ahe, to speed up the method on general purpose computers; a version of interpolated ahe designed to run in a few seconds on feedback processors; a version of full ahe designed to run in under one second on custom VLSI hardware; and clipped ahe, designed to overcome the problem of overenhancement of noise contrast. The authors conclude that clipped ahe should become a method of choice in medical imaging and probably also in other areas of digital imaging, and that clipped ahe can be made adequately fast to be routinely applied in the normal display sequence
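
    A minimal sketch of the clipping idea, applied globally rather than adaptively per region: histogram counts above a clip limit are redistributed, bounding the slope of the equalization mapping and hence the noise amplification. The clip fraction is an arbitrary choice.

```python
# Clipped histogram equalization on a greyscale image. Full clipped
# *adaptive* HE would apply this per region; global here for brevity.
import numpy as np

def clipped_hist_eq(img, clip_fraction=0.01, nbins=256):
    hist, edges = np.histogram(img, bins=nbins, range=(0, 255))
    clip = max(1, int(clip_fraction * img.size))
    excess = np.sum(np.maximum(hist - clip, 0))
    hist = np.minimum(hist, clip) + excess // nbins   # redistribute excess counts
    cdf = np.cumsum(hist).astype(float)
    cdf /= cdf[-1]                                    # normalized mapping
    return np.interp(img.ravel(), edges[:-1], cdf * 255.0).reshape(img.shape)

img = np.random.default_rng(9).integers(0, 256, (64, 64))
out = clipped_hist_eq(img)
```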

  9. WORKER, a program for histogram manipulation

    International Nuclear Information System (INIS)

    Bolger, J.E.; Ellinger, H.; Moore, C.F.

    1979-01-01

    A set of programs is provided which may link to any user-written program, permitting dynamic creation of histograms as well as display, manipulation and transfer of histogrammed data. With wide flexibility, constants within the user's code may be set or monitored at any time during execution. (Auth.)

  10. Interpreting Histograms. As Easy as It Seems?

    Science.gov (United States)

    Lem, Stephanie; Onghena, Patrick; Verschaffel, Lieven; Van Dooren, Wim

    2014-01-01

    Histograms are widely used, but recent studies have shown that they are not as easy to interpret as it might seem. In this article, we report on three studies on the interpretation of histograms in which we investigated, namely, (1) whether the misinterpretation by university students can be considered to be the result of heuristic reasoning, (2)…

  11. Spline smoothing of histograms by linear programming

    Science.gov (United States)

    Bennett, J. O.

    1972-01-01

    An algorithm is presented for obtaining an approximating function to the frequency distribution from a sample of size n. To obtain the approximating function, a histogram is made from the data. Next, Euclidean-space approximations to the graph of the histogram, using central B-splines as basis elements, are obtained by linear programming. The approximating function has area one and is nonnegative.
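
    A sketch in the spirit of the abstract, with simplifying assumptions: degree-1 B-splines (hat functions) replace central B-splines, and the LP minimizes the sum of absolute deviations subject to nonnegativity and unit area.

```python
# LP smoothing of a histogram: nonnegative spline coefficients c and
# per-bin slacks e >= |B c - y|; minimize sum(e) with total area = 1.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(2)
sample = rng.normal(0, 1, 500)
y, edges = np.histogram(sample, bins=30, density=True)
x = 0.5 * (edges[:-1] + edges[1:])                       # bin centres
width = edges[1] - edges[0]

knots = np.linspace(edges[0], edges[-1], 12)
h = knots[1] - knots[0]
B = np.maximum(1 - np.abs(x[:, None] - knots[None, :]) / h, 0)  # hat basis
area = B.sum(axis=0) * width                             # approximate basis areas

n, m = B.shape
c_obj = np.r_[np.zeros(m), np.ones(n)]                   # minimize sum of slacks
A_ub = np.block([[B, -np.eye(n)], [-B, -np.eye(n)]])     # |B c - y| <= e
b_ub = np.r_[y, -y]
A_eq = np.r_[area, np.zeros(n)][None, :]                 # unit total area
res = linprog(c_obj, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0])

coeffs = res.x[:m]                                       # nonnegative by default bounds
smooth = B @ coeffs                                      # smoothed density estimate
```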

  12. A multi-dimensional assessment of urban vulnerability to climate change in Sub-Saharan Africa

    DEFF Research Database (Denmark)

    Herslund, Lise Byskov; Jalyer, Fatameh; Jean-Baptiste, Nathalie

    2016-01-01

    In this paper, we develop and apply a multi-dimensional vulnerability assessment framework for understanding the impacts of climate change-induced hazards in Sub- Saharan African cities. The research was carried out within the European/African FP7 project CLimate change and Urban Vulnerability...... in Africa, which investigated climate change-induced risks, assessed vulnerability and proposed policy initiatives in five African cities. Dar es Salaam (Tanzania) was used as a main case with a particular focus on urban flooding. The multi-dimensional assessment covered the physical, institutional...... encroachment on green and flood-prone land). Scenario modeling suggests that vulnerability will continue to increase strongly due to the expected loss of agricultural land at the urban fringes and loss of green space within the city. However, weak institutional commitment and capacity limit the potential...

  13. Multi-dimensional instability of electrostatic solitary structures in magnetized nonthermal dusty plasmas

    International Nuclear Information System (INIS)

    Mamun, A.A.; Russel, S.M.; Mendoza-Briceno, C.A.; Alam, M.N.; Datta, T.K.; Das, A.K.

    1999-05-01

    A rigorous theoretical investigation has been made of multi-dimensional instability of obliquely propagating electrostatic solitary structures in a hot magnetized nonthermal dusty plasma which consists of a negatively charged hot dust fluid, Boltzmann distributed electrons, and nonthermally distributed ions. The Zakharov-Kuznetsov equation for the electrostatic solitary structures that exist in such a dusty plasma system is derived by the reductive perturbation method. The multi-dimensional instability of these solitary waves is also studied by the small-k (long wavelength plane wave) perturbation expansion method. The nature of these solitary structures, the instability criterion, and their growth rate depending on dust-temperature, external magnetic field, and obliqueness are discussed. The implications of these results to some space and astrophysical dusty plasma situations are briefly mentioned. (author)
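
    For reference, the Zakharov-Kuznetsov equation in a commonly quoted normalized form is given below; the coefficients A and B collect the plasma parameters, and the paper's exact coefficients depend on its own derivation.

```latex
\frac{\partial \psi}{\partial \tau}
  + A\,\psi\,\frac{\partial \psi}{\partial \xi}
  + B\,\frac{\partial}{\partial \xi}
    \left( \frac{\partial^{2}}{\partial \xi^{2}}
         + \frac{\partial^{2}}{\partial \eta^{2}}
         + \frac{\partial^{2}}{\partial \zeta^{2}} \right) \psi = 0
```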

  14. A Shell Multi-dimensional Hierarchical Cubing Approach for High-Dimensional Cube

    Science.gov (United States)

    Zou, Shuzhi; Zhao, Li; Hu, Kongfa

    The pre-computation of data cubes is critical for improving the response time of OLAP systems and accelerating data mining tasks in large data warehouses. However, as the size of data warehouses grows, the time it takes to perform this pre-computation becomes a significant performance bottleneck. In a high-dimensional data warehouse, it might not be practical to build all these cuboids and their indices. In this paper, we propose a shell multi-dimensional hierarchical cubing algorithm, based on an extension of the previous minimal cubing approach. This method partitions the high-dimensional data cube into low-dimensional hierarchical cubes. Experimental results show that the proposed method is significantly more efficient than other existing cubing methods.
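
    A toy sketch of the shell-fragment idea: rather than materializing all cuboids, precompute inverted indices only for small groups of dimensions and intersect their row-id sets at query time. The grouping, data, and names are illustrative.

```python
# Build inverted indices per dimension fragment; a query over several
# dimensions intersects the row-id sets of the relevant fragments.
from collections import defaultdict
from itertools import combinations

def build_fragments(rows, dims_per_fragment=2):
    d = len(rows[0])
    fragments = {}
    for group in combinations(range(d), dims_per_fragment):
        index = defaultdict(set)
        for rid, row in enumerate(rows):
            index[tuple(row[i] for i in group)].add(rid)
        fragments[group] = index
    return fragments

rows = [("a", 1, "x"), ("a", 2, "x"), ("b", 1, "y")]
frags = build_fragments(rows)
# query: dim0 == "a" AND dim1 == 1, answered by the (0, 1) fragment
print(frags[(0, 1)][("a", 1)])        # -> {0}
```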

  15. Finite element method for radiation heat transfer in multi-dimensional graded index medium

    International Nuclear Information System (INIS)

    Liu, L.H.; Zhang, L.; Tan, H.P.

    2006-01-01

    In graded index medium, ray goes along a curved path determined by Fermat principle, and curved ray-tracing is very difficult and complex. To avoid the complicated and time-consuming computation of curved ray trajectories, a finite element method based on discrete ordinate equation is developed to solve the radiative transfer problem in a multi-dimensional semitransparent graded index medium. Two particular test problems of radiative transfer are taken as examples to verify this finite element method. The predicted dimensionless net radiative heat fluxes are determined by the proposed method and compared with the results obtained by finite volume method. The results show that the finite element method presented in this paper has a good accuracy in solving the multi-dimensional radiative transfer problem in semitransparent graded index medium

  16. towards a theory-based multi-dimensional framework for assessment in mathematics: The "SEA" framework

    Science.gov (United States)

    Anku, Sitsofe E.

    1997-09-01

    Using the reform documents of the National Council of Teachers of Mathematics (NCTM) (NCTM, 1989, 1991, 1995), a theory-based multi-dimensional assessment framework (the "SEA" framework) which should help expand the scope of assessment in mathematics is proposed. This framework uses a context based on mathematical reasoning and has components that comprise mathematical concepts, mathematical procedures, mathematical communication, mathematical problem solving, and mathematical disposition.

  17. Ionizing Shocks in Argon. Part 2: Transient and Multi-Dimensional Effects (Preprint)

    Science.gov (United States)

    2010-09-09

    We extend the computations of ionizing shocks in argon to unsteady and multi-dimensional, using a collisional-radiative

  18. Development of multi-dimensional body image scale for malaysian female adolescents

    OpenAIRE

    Chin, Yit Siew; Taib, Mohd Nasir Mohd; Shariff, Zalilah Mohd; Khor, Geok Lin

    2008-01-01

    The present study was conducted to develop a Multi-dimensional Body Image Scale for Malaysian female adolescents. Data were collected among 328 female adolescents from a secondary school in Kuantan district, state of Pahang, Malaysia by using a self-administered questionnaire and anthropometric measurements. The self-administered questionnaire comprised multiple measures of body image, Eating Attitude Test (EAT-26; Garner & Garfinkel, 1979) and Rosenberg Self-esteem Inventory (Rosenberg, 1965...

  19. Functional consequences of trust in the construction supply chain: a multi-dimensional view

    OpenAIRE

    Manu, E; Ankrah, N; Chinyio, EA; Proverbs, D

    2016-01-01

    Trust is often linked to the emergence of cooperative behaviours that contribute to successful project outcomes. However, some have questioned the functional relevance of trust in contractual relations, arguing that control-induced cooperation can emerge from enforcement of contracts. These mixed views are further complicated by the multi-dimensional nature of trust, as different trust dimensions could have varying functional consequences. The aim of this study was to provide some clarity on ...

  20. Study on the construction of multi-dimensional Remote Sensing feature space for hydrological drought

    International Nuclear Information System (INIS)

    Xiang, Daxiang; Tan, Debao; Wen, Xiongfei; Shen, Shaohong; Li, Zhe; Cui, Yuanlai

    2014-01-01

    Hydrological drought refers to an abnormal water shortage caused by precipitation and surface water shortages or a groundwater imbalance. Hydrological drought is reflected in a drop in surface water, a decrease in vegetation productivity, an increase in the temperature difference between day and night, and so on. Remote sensing permits the observation of surface water, vegetation, temperature and other information from a macro perspective. This paper analyzes the correlation and differentiation of both remote sensing and surface-measured indicators, after the selection and extraction of a series of representative remote sensing characteristic parameters according to the spectral characterization of surface features in remote sensing imagery, such as the vegetation index, surface temperature and surface water from HJ-1A/B CCD/IRS data. Finally, multi-dimensional remote sensing features for hydrological drought are built on an intelligent collaborative model. Further, for the Dongting Lake area, two drought events are analyzed to verify the multi-dimensional features, using remote sensing data from different phases and field observation data. The experimental results show that multi-dimensional features are a good method for hydrological drought

  1. Minimizing I/O Costs of Multi-Dimensional Queries with BitmapIndices

    Energy Technology Data Exchange (ETDEWEB)

    Rotem, Doron; Stockinger, Kurt; Wu, Kesheng

    2006-03-30

    Bitmap indices have been widely used in scientific applications and commercial systems for processing complex, multi-dimensional queries where traditional tree-based indices would not work efficiently. A common approach for reducing the size of a bitmap index for high-cardinality attributes is to group ranges of values of an attribute into bins and then build a bitmap for each bin rather than a bitmap for each value of the attribute. Binning reduces storage costs; however, results of queries based on bins often require additional filtering for discarding false positives, i.e., records in the result that do not satisfy the query constraints. This additional filtering, also known as "candidate checking," requires access to the base data on disk and involves significant I/O costs. This paper studies strategies for minimizing the I/O costs of "candidate checking" for multi-dimensional queries. This is done by determining the number of bins allocated for each dimension and then placing bin boundaries in optimal locations. Our algorithms use knowledge of the data distribution and query workload. We derive several analytical results concerning optimal bin allocation for a probabilistic query model. Our experimental evaluation with real-life data shows an average I/O cost improvement of at least a factor of 10 for multi-dimensional queries on datasets from two different applications. Our experiments also indicate that the speedup increases with the number of query dimensions.
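
    A minimal sketch of binned bitmaps with candidate checking, using equi-width bins (the paper optimizes the bin boundary placement): a range query ORs the fully covered bins and touches base data only for the boundary bins.

```python
# One boolean bitmap per bin; boundary bins require a "candidate check"
# against the base data, which is the I/O cost the paper minimizes.
import numpy as np

values = np.random.default_rng(3).uniform(0, 100, 10000)
edges = np.linspace(0, 100, 11)                  # 10 equi-width bins
bin_of = np.digitize(values, edges) - 1
bitmaps = [(bin_of == b) for b in range(10)]

def range_query(lo, hi):
    full = [b for b in range(10) if edges[b] >= lo and edges[b + 1] <= hi]
    edge_bins = {int(np.digitize(lo, edges)) - 1, int(np.digitize(hi, edges)) - 1}
    hits = np.zeros(values.size, bool)
    for b in full:                               # fully covered: bitmap OR only
        hits |= bitmaps[b]
    for b in edge_bins - set(full):              # candidate check: read base data
        hits |= bitmaps[b] & (values >= lo) & (values < hi)
    return hits

print(range_query(17.5, 42.0).sum())
```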

  2. Quadrant Dynamic with Automatic Plateau Limit Histogram Equalization for Image Enhancement

    Directory of Open Access Journals (Sweden)

    P. Jagatheeswari

    2014-01-01

    The fundamental and important preprocessing stage in image processing is the image contrast enhancement technique. Histogram equalization is an effective contrast enhancement technique. In this paper, a histogram equalization based technique called quadrant dynamic with automatic plateau limit histogram equalization (QDAPLHE) is introduced. In this method, a hybrid of dynamic and clipped histogram equalization methods is used to increase brightness preservation and to reduce overenhancement. Initially, the proposed QDAPLHE algorithm passes the input image through a median filter to remove the noise present in the image. Then the histogram of the filtered image is divided into four subhistograms, keeping the second separation point at the mean brightness. The clipping process is then implemented by automatically calculating the plateau limit as the clipping level. The clipped portion of the histogram is modified to reduce the loss of image intensity values. Finally, the clipped portion is redistributed uniformly to the entire dynamic range and conventional histogram equalization is executed in each subhistogram independently. Based on qualitative and quantitative analysis, the QDAPLHE method outperforms some existing methods in the literature.

  3. Histogram analysis for smartphone-based rapid hematocrit determination

    Science.gov (United States)

    Jalal, Uddin M.; Kim, Sang C.; Shim, Joon S.

    2017-01-01

    A novel and rapid analysis technique using histograms has been proposed for the colorimetric quantification of blood hematocrits. A smartphone-based “Histogram” app for the detection of hematocrits has been developed, integrating the smartphone's embedded camera with a microfluidic chip via a custom-made optical platform. The developed histogram analysis shows its effectiveness in the automatic detection of the sample channel, including auto-calibration, and can analyze single-channel as well as multi-channel images. Furthermore, the analysis method is advantageous for the quantification of blood hematocrit under both constant and varying optical conditions. The rapid determination of blood hematocrit carries a wealth of information regarding physiological disorders, and the use of such reproducible, cost-effective, and standard techniques may effectively help with the diagnosis and prevention of a number of human diseases. PMID:28717569

  4. Bi-Histogram Equalization with Brightness Preservation Using Contrast Enhancement

    OpenAIRE

    A. Anitha Rani; Gowthami Rajagopal; A. Jagadeswaran

    2014-01-01

    Contrast enhancement is an important factor in the image preprocessing step. One of the widely accepted contrast enhancement methods is histogram equalization. Although histogram equalization achieves comparatively better performance on almost all types of images, global histogram equalization sometimes produces excessive visual deterioration. A new extension of bi-histogram equalization called Bi-Histogram Equalization with Neighborhood Metric (BHENM) is presented. First, large histogram bins that cause w...
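
    A minimal sketch of plain bi-histogram equalization, assuming a mean split; the neighborhood-metric refinement of BHENM is not reproduced here.

```python
# Split the image at its mean intensity and equalize the two
# sub-histograms independently, each mapped back into its own half of
# the range, which keeps the output mean close to the input mean.
import numpy as np

def bhe(img):
    out = np.empty_like(img, dtype=float)
    mean = img.mean()
    for lo, hi, mask in [(img.min(), mean, img <= mean),
                         (mean, img.max(), img > mean)]:
        vals = img[mask].astype(float)
        ranks = np.argsort(np.argsort(vals))
        cdf = (ranks + 1) / vals.size
        out[mask] = lo + cdf * (hi - lo)         # equalize within [lo, hi]
    return out

img = np.random.default_rng(10).integers(0, 256, (64, 64))
out = bhe(img)
print(img.mean(), out.mean())                    # means stay close
```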

  5. Color Histogram Diffusion for Image Enhancement

    Science.gov (United States)

    Kim, Taemin

    2011-01-01

    Various color histogram equalization (CHE) methods have been proposed to extend grayscale histogram equalization (GHE) for color images. In this paper a new method called histogram diffusion that extends the GHE method to arbitrary dimensions is proposed. Ranges in a histogram are specified as overlapping bars of uniform heights and variable widths which are proportional to their frequencies. This diagram is called the vistogram. As an alternative approach to GHE, the squared error of the vistogram from the uniform distribution is minimized. Each bar in the vistogram is approximated by a Gaussian function. Gaussian particles in the vistogram diffuse as a nonlinear autonomous system of ordinary differential equations. CHE results of color images showed that the approach is effective.

  6. Adaptive histogram equalization and its variations

    NARCIS (Netherlands)

    Pizer, S.M.; Amburn, E.P.; Austin, J.D.; Cromartie, R.; Geselowitz, A.; Greer, Trey; Haar Romenij, ter B.M.; Zimmerman, J.B.; Zuiderveld, K.J.

    1987-01-01

    Adaptive histogram equalization (ahe) is a contrast enhancement method designed to be broadly applicable and having demonstrated effectiveness. However, slow speed and the overenhancement of noise it produces in relatively homogeneous regions are two problems. We report algorithms designed to

  7. Subtracting and Fitting Histograms using Profile Likelihood

    CERN Document Server

    D'Almeida, F M L

    2008-01-01

    It is known that many interesting signals expected at LHC are of unknown shape and strongly contaminated by background events. These signals will be difficult to detect during the first years of LHC operation due to the initial low luminosity. In this work, we present a method of subtracting histograms based on the profile likelihood function when the background is previously estimated by Monte Carlo events and one has low statistics. Estimators for the signal in each bin of the histogram difference are calculated, as well as limits for the signals at 68.3% Confidence Level, in a low-statistics case with an exponential background and a Gaussian signal. The method can also be used to fit histograms when the signal shape is known. Our results show good performance and avoid the problem of negative values when subtracting histograms.
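
    A hedged single-bin sketch of the core idea: profile the background out of a Poisson likelihood in which the background is estimated from Monte Carlo generated with `tau` times the data statistics, then scan the signal. Counts and names are invented for illustration.

```python
# Per-bin profile likelihood: n observed events, m MC background events,
# tau = MC/data luminosity ratio. The 68.3% interval is where
# -2 ln(profile likelihood ratio) rises by 1 from its minimum.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import poisson

n, m, tau = 25, 60, 4.0

def nll(s, b):
    return -(poisson.logpmf(n, s + b) + poisson.logpmf(m, tau * b))

def profile_nll(s):
    # minimize over the nuisance background b at fixed signal s
    res = minimize_scalar(lambda b: nll(s, b), bounds=(1e-9, n + m),
                          method="bounded")
    return res.fun

s_grid = np.linspace(0.0, 30.0, 301)
pnll = np.array([profile_nll(s) for s in s_grid])
q = 2.0 * (pnll - pnll.min())                    # -2 ln(lambda(s))
s_hat = s_grid[np.argmin(pnll)]
interval = s_grid[q <= 1.0]
print(f"s_hat = {s_hat:.2f}, 68.3% CL: [{interval[0]:.2f}, {interval[-1]:.2f}]")
```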

  8. The Online Histogram Presenter for the ATLAS experiment: A modular system for histogram visualization

    International Nuclear Information System (INIS)

    Dotti, Andrea; Adragna, Paolo; Vitillo, Roberto A

    2010-01-01

    The Online Histogram Presenter (OHP) is the ATLAS tool to display histograms produced by the online monitoring system. In spite of the name, the Online Histogram Presenter is much more than just a histogram display. To cope with the large amount of data, the application has been designed to minimise the network traffic; sophisticated caching, hashing and filtering algorithms reduce memory and CPU usage. The system uses Qt and ROOT for histogram visualisation and manipulation. In addition, histogram visualisation can be extensively customised through configuration files. Finally, its very modular architecture features a lightweight plug-in system, allowing extensions to accommodate specific user needs. After an architectural overview of the application, the paper is going to present in detail the solutions adopted to increase the performance and a description of the plug-in system.

  10. The equivalent Histograms in clinical practice

    International Nuclear Information System (INIS)

    Pizarro Trigo, F.; Teijeira Garcia, M.; Zaballos Carrera, S.

    2013-01-01

    The tolerances established for organs at risk [1] under standard fractionation schedules (2 Gy/session, 5 sessions per week) are frequently misused when applied to dose-volume histograms of non-standard schedules. The purpose of this work is to establish when this misuse may be most significant and to carry out a fractionation transformation of non-standard dose-volume histograms. A case that can be useful for making clinical decisions is presented. (Author)
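
    A sketch of the usual linear-quadratic conversion behind such equivalent histograms, assuming the standard EQD2 form is what is meant: each DVH dose bin delivered at d Gy/fraction is rescaled to its 2 Gy/fraction equivalent before standard tolerances are checked.

```python
# EQD2 = D * (d + alpha/beta) / (2 + alpha/beta), the standard LQ-model
# rescaling of a total dose D delivered at d Gy per fraction.
import numpy as np

def eqd2(total_dose, dose_per_fraction, alpha_beta=3.0):
    return total_dose * (dose_per_fraction + alpha_beta) / (2.0 + alpha_beta)

bin_doses = np.linspace(0, 55, 12)               # cumulative DVH dose axis (Gy)
n_fractions = 20                                 # hypothetical hypofractionation
print(eqd2(bin_doses, bin_doses / n_fractions))  # transformed dose axis
```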

  11. Histogram Equalization to Model Adaptation for Robust Speech Recognition

    Directory of Open Access Journals (Sweden)

    Suh Youngjoo

    2010-01-01

    We propose a new model adaptation method based on the histogram equalization technique for providing robustness in noisy environments. The trained acoustic mean models of a speech recognizer are adapted into environmentally matched conditions by using the histogram equalization algorithm on a single utterance basis. For more robust speech recognition in the heavily noisy conditions, trained acoustic covariance models are efficiently adapted by the signal-to-noise ratio-dependent linear interpolation between trained covariance models and utterance-level sample covariance models. Speech recognition experiments on both the digit-based Aurora2 task and the large vocabulary-based task showed that the proposed model adaptation approach provides significant performance improvements compared to the baseline speech recognizer trained on the clean speech data.

  12. Confirmatory factor analysis of the Multi-dimensional Emotional Empathy Scale in the South African context

    Directory of Open Access Journals (Sweden)

    Chantal Olckers

    2010-11-01

    Orientation: Empathy is a core competency in aiding individuals to address the challenges of social living. An indicator of emotional intelligence, it is useful in a globalising and cosmopolitan world. Moreover, managing staff, stakeholders and conflict in many social settings relies on communicative skills, of which empathy forms a large part. Empathy plays a pivotal role in negotiating, persuading and influencing behaviour. The skill of being able to empathise thus enables the possessor to attune to the needs of clients and employees and provides opportunities to become responsive to these needs. Research purpose: This study attempted to determine the construct validity of the Multi-dimensional Emotional Empathy Scale within the South African context. Motivation for the study: In South Africa, a large number of psychometrical instruments have been adopted directly from abroad. Studies determining the construct validity of several of these imported instruments, however, have shown that these instruments are not suited for use in the South African context. Research design, approach and method: The study was based on a quantitative research method with a survey design. A convenience sample of 212 respondents completed the Multi-dimensional Emotional Empathy Scale. The constructs explored were Suffering, Positive Sharing, Responsive Crying, Emotional Attention, a Feel for Others and Emotional Contagion. The statistical procedure used was a confirmatory factor analysis. Main findings: The study showed that, from a South African perspective, the Multi-dimensional Emotional Empathy Scale lacks sufficient construct validity. Practical/managerial implications: Further refinement of the model would provide valuable information that would aid people to be more appreciative of individual contributions, to meet client needs and to understand the motivations of others. Contribution/value-add: From a South African perspective, the findings of this study are

  13. Pedagogic discourse in introductory classes: Multi-dimensional analysis of textbooks and lectures in biology and macroeconomics

    Science.gov (United States)

    Carkin, Susan

    The broad goal of this study is to represent the linguistic variation of textbooks and lectures, the primary input for student learning---and sometimes the sole input in the large introductory classes which characterize General Education at many state universities. Computer techniques are used to analyze a corpus of textbooks and lectures from first-year university classes in macroeconomics and biology. These spoken and written variants are compared to each other as well as to benchmark texts from other multi-dimensional studies in order to examine their patterns, relations, and functions. A corpus consisting of 147,000 words was created from macroeconomics and biology lectures at a medium-large state university and from a set of nationally "best-selling" textbooks used in these same introductory survey courses. The corpus was analyzed using multi-dimensional methodology (Biber, 1988). The analysis consists of both empirical and qualitative phases. Quantitative analyses are undertaken on the linguistic features, their patterns of co-occurrence, and on the contextual elements of classrooms and textbooks. The contextual analysis is used to functionally interpret the statistical patterns of co-occurrence along five dimensions of textual variation, demonstrating patterns of difference and similarity with reference to text excerpts. Results of the analysis suggest that academic discourse is far from monolithic. Pedagogic discourse in introductory classes varies by modality and discipline, but not always in the directions expected. In the present study the most abstract texts were biology lectures---more abstract than written genres of academic prose and more abstract than introductory textbooks. Academic lectures in both disciplines, monologues which carry a heavy informational load, were extremely interactive, more like conversation than academic prose. A third finding suggests that introductory survey textbooks differ from those used in upper division classes by being

  14. Structural diversity: a multi-dimensional approach to assess recreational services in urban parks.

    Science.gov (United States)

    Voigt, Annette; Kabisch, Nadja; Wurster, Daniel; Haase, Dagmar; Breuste, Jürgen

    2014-05-01

    Urban green spaces provide important recreational services for urban residents. In general, when park visitors enjoy "the green," they are in actuality appreciating a mix of biotic, abiotic, and man-made park infrastructure elements and qualities. We argue that these three dimensions of structural diversity have an influence on how people use and value urban parks. We present a straightforward approach for assessing urban parks that combines multi-dimensional landscape mapping and questionnaire surveys. We discuss the method as well the results from its application to differently sized parks in Berlin and Salzburg.

  15. MINIMUM ENTROPY DECONVOLUTION OF ONE- AND MULTI-DIMENSIONAL NON-GAUSSIAN LINEAR RANDOM PROCESSES

    Institute of Scientific and Technical Information of China (English)

    程乾生

    1990-01-01

    The minimum entropy deconvolution is considered as one of the methods for decomposing non-Gaussian linear processes. The concept of peakedness of a system response sequence is presented and its properties are studied. With the aid of the peakedness, the convergence theory of the minimum entropy deconvolution is established. The problem of the minimum entropy deconvolution of multi-dimensional non-Gaussian linear random processes is first investigated and the corresponding theory is given. In addition, the relation between the minimum entropy deconvolution and parameter method is discussed.
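
    For orientation, the classical Wiggins iteration that the minimum entropy deconvolution literature builds on can be sketched as follows; the filter length, iteration count, and normalization are arbitrary choices here, and this code is not the paper's theoretical contribution.

```python
# Wiggins-style MED: iterate f = R^{-1} g, where R is the input
# autocorrelation matrix and g cross-correlates the input with the cubed
# filter output, driving the output toward maximal "peakedness".
import numpy as np
from scipy.linalg import toeplitz, solve
from scipy.signal import lfilter

def med_filter(x, L=20, iters=30):
    N = len(x)
    r = np.correlate(x, x, mode="full")[N - 1:N - 1 + L]
    R = toeplitz(r)                               # Toeplitz autocorrelation matrix
    f = np.zeros(L); f[L // 2] = 1.0              # start from a delayed spike
    for _ in range(iters):
        y = lfilter(f, [1.0], x)                  # current filter output
        y3 = y ** 3
        g = np.array([np.dot(y3[k:], x[:N - k]) for k in range(L)])
        f = solve(R, g)
        f /= np.sqrt(f @ R @ f)                   # keep output energy fixed
    return f

rng = np.random.default_rng(8)
spikes = np.zeros(500); spikes[rng.integers(0, 500, 8)] = rng.normal(0, 1, 8)
x = np.convolve(spikes, np.hanning(25), mode="same")   # smeared spike train
recovered = lfilter(med_filter(x), [1.0], x)           # should be spikier than x
```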

  16. On the use of multi-dimensional scaling and electromagnetic tracking in high dose rate brachytherapy

    Science.gov (United States)

    Götz, Th I.; Ermer, M.; Salas-González, D.; Kellermeier, M.; Strnad, V.; Bert, Ch; Hensel, B.; Tomé, A. M.; Lang, E. W.

    2017-10-01

    High dose rate brachytherapy requires frequent verification of the precise dwell positions of the radiation source. The current investigation proposes a multi-dimensional scaling transformation of both data sets to estimate dwell positions without any external reference. Furthermore, the related distributions of dwell positions are characterized by uni- or bi-modal heavy-tailed distributions. The latter are well represented by α-stable distributions. The newly proposed data analysis provides dwell position deviations with high accuracy and, furthermore, offers a convenient visualization of the actual shapes of the catheters which guide the radiation source during the treatment.
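
    A small sketch of the reference-free reconstruction idea: recover dwell positions from pairwise distances alone with metric MDS, then compare to the plan after a Procrustes alignment. The synthetic catheter geometry and all parameters are invented.

```python
# MDS reconstructs positions up to rotation/reflection, so a Procrustes
# alignment is needed before comparing recovered and planned geometries.
import numpy as np
from sklearn.manifold import MDS
from scipy.spatial import procrustes
from scipy.spatial.distance import pdist, squareform

planned = np.cumsum(np.ones((10, 3)) * [5.0, 0.3, 0.0], axis=0)  # mm, idealized
measured = planned + np.random.default_rng(4).normal(0, 0.2, planned.shape)

D = squareform(pdist(measured))                  # only pairwise distances used
mds = MDS(n_components=3, dissimilarity="precomputed", random_state=0)
recovered = mds.fit_transform(D)

_, aligned, disparity = procrustes(planned, recovered)
print(f"residual disparity after alignment: {disparity:.4f}")
```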

  17. Coupling Visualization and Data Analysis for Knowledge Discovery from Multi-dimensional Scientific Data

    International Nuclear Information System (INIS)

    Rubel, Oliver; Ahern, Sean; Bethel, E. Wes; Biggin, Mark D.; Childs, Hank; Cormier-Michel, Estelle; DePace, Angela; Eisen, Michael B.; Fowlkes, Charless C.; Geddes, Cameron G.R.; Hagen, Hans; Hamann, Bernd; Huang, Min-Yu; Keranen, Soile V.E.; Knowles, David W.; Hendriks, Chris L. Luengo; Malik, Jitendra; Meredith, Jeremy; Messmer, Peter; Prabhat; Ushizima, Daniela; Weber, Gunther H.; Wu, Kesheng

    2010-01-01

    Knowledge discovery from large and complex scientific data is a challenging task. With the ability to measure and simulate more processes at increasingly finer spatial and temporal scales, the growing number of data dimensions and data objects presents tremendous challenges for effective data analysis and data exploration methods and tools. The combination and close integration of methods from scientific visualization, information visualization, automated data analysis, and other enabling technologies (such as efficient data management) supports knowledge discovery from multi-dimensional scientific data. This paper surveys two distinct applications in developmental biology and accelerator physics, illustrating the effectiveness of the described approach.

  18. MXA: a customizable HDF5-based data format for multi-dimensional data sets

    International Nuclear Information System (INIS)

    Jackson, M; Simmons, J P; De Graef, M

    2010-01-01

    A new digital file format is proposed for the long-term archival storage of experimental data sets generated by serial sectioning instruments. The format is known as the multi-dimensional eXtensible Archive (MXA) format and is based on the public-domain Hierarchical Data Format (HDF5). The MXA data model and its description by means of an eXtensible Markup Language (XML) file with an associated Document Type Definition (DTD) are described in detail. The public-domain MXA package is available through a dedicated web site (mxa.web.cmu.edu), along with implementation details and example data files
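
    The sketch below shows the general pattern (HDF5 data plus an attached XML model description) with h5py; it mimics the spirit of MXA, not its actual schema, and every name and attribute here is hypothetical.

```python
# Store a 3-D stack of serial sections with self-describing metadata and
# carry the XML model description along as a file attribute.
import numpy as np
import h5py

stack = np.random.default_rng(5).integers(0, 255, (50, 512, 512), dtype=np.uint8)
xml_model = "<model><dimension name='slice' count='50'/></model>"  # hypothetical

with h5py.File("sections.mxa.h5", "w") as f:
    dset = f.create_dataset("data/sections", data=stack,
                            chunks=(1, 512, 512), compression="gzip")
    dset.attrs["units"] = "grey level"
    dset.attrs["voxel_size_um"] = (0.5, 0.1, 0.1)
    f.attrs["model_xml"] = xml_model             # XML description travels with data
```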

  19. Portable laser synthesizer for high-speed multi-dimensional spectroscopy

    Science.gov (United States)

    Demos, Stavros G [Livermore, CA; Shverdin, Miroslav Y [Sunnyvale, CA; Shirk, Michael D [Brentwood, CA

    2012-05-29

    Portable, field-deployable laser synthesizer devices designed for multi-dimensional spectrometry and time-resolved and/or hyperspectral imaging include a coherent light source which simultaneously produces a very broad, energetic, discrete spectrum spanning through or within the ultraviolet, visible, and near infrared wavelengths. The light output is spectrally resolved and each wavelength is delayed with respect to each other. A probe enables light delivery to a target. For multidimensional spectroscopy applications, the probe can collect the resulting emission and deliver this radiation to a time gated spectrometer for temporal and spectral analysis.

  20. Investigation of multi-dimensional computational models for calculating pollutant transport

    International Nuclear Information System (INIS)

    Pepper, D.W.; Cooper, R.E.; Baker, A.J.

    1980-01-01

    A performance study of five numerical solution algorithms for multi-dimensional advection-diffusion prediction on mesoscale grids was made. Test problems include transport of point and distributed sources, and a simulation of a continuous source. In all cases, analytical solutions are available to assess relative accuracy. The particle-in-cell and second-moment algorithms, both of which employ sub-grid resolution coupled with Lagrangian advection, exhibit superior accuracy in modeling a point source release. For modeling of a distributed source, algorithms based upon the pseudospectral and finite element interpolation concepts exhibit improved accuracy on practical discretizations

  1. Assessment of wall friction model in multi-dimensional component of MARS with air–water cross flow experiment

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Jin-Hwa [Nuclear Thermal-Hydraulic Engineering Laboratory, Seoul National University, Gwanak 599, Gwanak-ro, Gwanak-gu, Seoul 151-742 (Korea, Republic of); Korea Atomic Energy Research Institute, 989-111, Daedeok-daero, Yuseong-gu, Daejeon 305-600 (Korea, Republic of); Choi, Chi-Jin [Nuclear Thermal-Hydraulic Engineering Laboratory, Seoul National University, Gwanak 599, Gwanak-ro, Gwanak-gu, Seoul 151-742 (Korea, Republic of); Cho, Hyoung-Kyu, E-mail: chohk@snu.ac.kr [Nuclear Thermal-Hydraulic Engineering Laboratory, Seoul National University, Gwanak 599, Gwanak-ro, Gwanak-gu, Seoul 151-742 (Korea, Republic of); Euh, Dong-Jin [Korea Atomic Energy Research Institute, 989-111, Daedeok-daero, Yuseong-gu, Daejeon 305-600 (Korea, Republic of); Park, Goon-Cherl [Nuclear Thermal-Hydraulic Engineering Laboratory, Seoul National University, Gwanak 599, Gwanak-ro, Gwanak-gu, Seoul 151-742 (Korea, Republic of)

    2017-02-15

    Recently, high-precision, high-accuracy analysis of multi-dimensional thermal hydraulic phenomena in a nuclear power plant has become a state-of-the-art issue. The system analysis code MARS also adopted a multi-dimensional module to simulate such phenomena more accurately. Although the module is applied to represent multi-dimensional phenomena, the models and correlations implemented in it are one-dimensional empirical ones based on one-dimensional pipe experimental results. Prior to the application of multi-dimensional simulation tools, however, the constitutive models for a two-phase flow need to be carefully validated, such as the wall friction model. Especially, in a Direct Vessel Injection (DVI) system, the injected emergency core coolant (ECC) on the upper part of the downcomer interacts with the lateral steam flow during the reflood phase of a Large-Break Loss-Of-Coolant-Accident (LBLOCA). The interaction between the falling film and the lateral steam flow induces a multi-dimensional two-phase flow. The prediction of ECC flow behavior plays a key role in determining the amount of coolant that can be used for core cooling. Therefore, the wall friction model which is implemented to simulate the multi-dimensional phenomena should be assessed against multi-dimensional experimental results. In this paper, air–water cross film flow experiments simulating the multi-dimensional phenomenon in the upper part of the downcomer as a conceptual problem will be introduced. The two-dimensional local liquid film velocity and thickness data were used as benchmark data for code assessment. The previous wall friction model of the MARS-MultiD in the annular flow regime was then modified. As a result, the modified MARS-MultiD produced improved calculation results compared with the previous version.

  2. 640-slice DVCT multi-dimensionally and dynamically presents changes in bladder volume and urine flow rate

    Science.gov (United States)

    Su, Yunshan; Fang, Kewei; Mao, Chongwen; Xiang, Shutian; Wang, Jin; Li, Yingwen

    2018-01-01

    The present study aimed to explore the application of 640-slice dynamic volume computed tomography (DVCT) to excretory cystography and urethrography. A total of 70 healthy subjects were included in the study. Excretory cystography and urethrography using 640-slice DVCT was conducted to continuously record the motions of the bladder and the proximal female and male urethra. The patients' voiding process was divided into early, early to middle, middle, middle to late, and late voiding phases. The subjects were analyzed using DVCT and conventional CT. The cross-sectional areas of various sections of the male and female urethra were evaluated, and the average urine flow rate was calculated. The 640-slice DVCT technique was used to dynamically observe the urine flow rate and changes in bladder volume at all voiding phases. The urine volume detected by 640-slice DVCT exhibited no significant difference compared with the actual volume, and no significant difference compared with that determined using conventional CT. Furthermore, no significant difference in the volume of the bladder at each phase of the voiding process was detected between 640-slice DVCT and conventional CT. The results indicate that 640-slice DVCT can accurately evaluate the status of the male posterior urethra and female urethra. In conclusion, 640-slice DVCT is able to multi-dimensionally and dynamically present changes in bladder volume and urine flow rate, and could obtain similar results to conventional CT in detecting urine volume, as well as the status of the male posterior urethra and female urethra. PMID:29467853

  3. Optimization of an Electromagnetics Code with Multicore Wavefront Diamond Blocking and Multi-dimensional Intra-Tile Parallelization

    KAUST Repository

    Malas, Tareq M.

    2016-07-21

    Understanding and optimizing the properties of solar cells is becoming a key issue in the search for alternatives to nuclear and fossil energy sources. A theoretical analysis via numerical simulations involves solving Maxwell's Equations in discretized form and typically requires substantial computing effort. We start from a hybrid-parallel (MPI+OpenMP) production code that implements the Time Harmonic Inverse Iteration Method (THIIM) with Finite-Difference Frequency Domain (FDFD) discretization. Although this algorithm has the characteristics of a strongly bandwidth-bound stencil update scheme, it is significantly different from the popular stencil types that have been exhaustively studied in the high performance computing literature to date. We apply a recently developed stencil optimization technique, multicore wavefront diamond tiling with multi-dimensional cache block sharing, and describe in detail the peculiarities that need to be considered due to the special stencil structure. Concurrency in updating the components of the electric and magnetic fields provides an additional level of parallelism. The dependence of the cache size requirement of the optimized code on the blocking parameters is modeled accurately, and an auto-tuner searches for optimal configurations in the remaining parameter space. We were able to completely decouple the execution from the memory bandwidth bottleneck, accelerating the implementation by a factor of three to four compared to an optimal implementation with pure spatial blocking on an 18-core Intel Haswell CPU.

  4. Induction machine bearing faults detection based on a multi-dimensional MUSIC algorithm and maximum likelihood estimation.

    Science.gov (United States)

    Elbouchikhi, Elhoussin; Choqueuse, Vincent; Benbouzid, Mohamed

    2016-07-01

    Condition monitoring of electric drives is of paramount importance since it contributes to enhance the system reliability and availability. Moreover, the knowledge about the fault mode behavior is extremely important in order to improve system protection and fault-tolerant control. Fault detection and diagnosis in squirrel cage induction machines based on motor current signature analysis (MCSA) has been widely investigated. Several high resolution spectral estimation techniques have been developed and used to detect induction machine abnormal operating conditions. This paper focuses on the application of MCSA for the detection of abnormal mechanical conditions that may lead to induction machines failure. In fact, this paper is devoted to the detection of single-point defects in bearings based on parametric spectral estimation. A multi-dimensional MUSIC (MD MUSIC) algorithm has been developed for bearing faults detection based on bearing faults characteristic frequencies. This method has been used to estimate the fundamental frequency and the fault related frequency. Then, an amplitude estimator of the fault characteristic frequencies has been proposed and fault indicator has been derived for fault severity measurement. The proposed bearing faults detection approach is assessed using simulated stator currents data, issued from a coupled electromagnetic circuits approach for air-gap eccentricity emulating bearing faults. Then, experimental data are used for validation purposes. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
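
    A sketch of the subspace principle behind MUSIC in its basic 1-D form (the paper's multi-dimensional variant and amplitude estimator are not reproduced): estimate a sample covariance, split off the noise subspace, and locate peaks of the pseudospectrum. The 50 Hz supply and 87 Hz "fault" line are synthetic.

```python
# Basic MUSIC pseudospectrum for a stator-current-like signal.
import numpy as np

fs, N = 1000.0, 2000
t = np.arange(N) / fs
x = np.sin(2 * np.pi * 50 * t) + 0.2 * np.sin(2 * np.pi * 87 * t)
x += 0.1 * np.random.default_rng(6).standard_normal(N)

M, p = 40, 4                                     # subarray length, signal-space dim
X = np.lib.stride_tricks.sliding_window_view(x, M)
R = X.T @ X / X.shape[0]                         # sample covariance (M x M)
eigval, eigvec = np.linalg.eigh(R)               # eigenvalues in ascending order
En = eigvec[:, :M - p]                           # noise subspace

freqs = np.linspace(30, 120, 901)
a = np.exp(-2j * np.pi * np.outer(freqs / fs, np.arange(M)))  # steering vectors
pseudo = 1.0 / np.linalg.norm(a.conj() @ En, axis=1) ** 2     # peaks at line freqs

peaks = (pseudo[1:-1] > pseudo[:-2]) & (pseudo[1:-1] > pseudo[2:])
best = np.argsort(pseudo[1:-1][peaks])[-2:]
print(freqs[1:-1][peaks][best])                  # approx. 50 Hz and 87 Hz
```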

  5. Improvement of multi-dimensional realistic thermal-hydraulic system analysis code, MARS 1.3

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Won Jae; Chung, Bub Dong; Jeong, Jae Jun; Ha, Kwi Seok

    1998-09-01

    The MARS (Multi-dimensional Analysis of Reactor Safety) code is a multi-dimensional, best-estimate thermal-hydraulic system analysis code. This report describes the new features that have been improved in the MARS 1.3 code since the release of MARS 1.3 in July 1998. The new features include: - implementation of point kinetics model into the 3D module - unification of the heat structure model - extension of the control function to the 3D module variables - improvement of the 3D module input check function. Each of the items has been implemented in the developmental version of the MARS 1.3.1 code and, then, independently verified and assessed. The effectiveness of the new features is well verified and it is shown that these improvements greatly extend the code capability and enhance the user friendliness. Relevant input data changes are also described. In addition to the improvements, this report briefly summarizes the future code developmental activities that are being carried out or planned, such as coupling of MARS 1.3 with the containment code CONTEMPT and the three-dimensional reactor kinetics code MASTER 2.0. (author). 8 refs.

  6. Positioning Analysis of Potato Chips Using the Multi-Dimensional Scaling Method in Batu City

    Directory of Open Access Journals (Sweden)

    Siti Asmaul Mustaniroh

    2016-11-01

    Potato chips are one of the main products of Batu city. Based on data from the Batu city government, in 2002 there were only 2 selling units; by 2008 more potato chip selling units had appeared, so research on the positioning of potato chips in Batu city is important. The purposes of this research are to understand which attributes influence customer decisions to buy and consume potato chips, and to analyze the positioning formed among four potato chip brands (Cita Mandiri, Gizi Food, Leo, Rimbaku) based on customer perception in Batu city, using the Multi Dimensional Scaling method. The attributes that influence customers to buy and consume potato chips are product (taste and crunchiness), price (price relative to quality, and affordability), distribution (local stock of the products and how strategic the selling location is), and promotion (the use of advertising or promotional media such as internet, radio, or brochures). Based on the Multi Dimensional Scaling method, the positioning structure is: Gizi Food as market leader, Leo as market challenger, and Rimbaku and Cita Mandiri as market followers.

  7. Multi-dimensional analysis of high resolution γ-ray data

    International Nuclear Information System (INIS)

    Flibotte, S.; Huttmeier, U.J.; France, G. de; Haas, B.; Romain, P.; Theisen, Ch.; Vivien, J.P.; Zen, J.; Bednarczyk, P.

    1992-01-01

    High resolution γ-ray multi-detectors capable of measuring high-fold coincidences with a large efficiency are presently under construction (EUROGAM, GASP, GAMMASPHERE). Future experimental progress in our understanding of nuclear structure at high spin critically depends on our ability to analyze the data in a multi-dimensional space and to resolve small photopeaks of interest from the generally large background. Development of programs to process such high-fold events is still in its infancy, and only the 3-fold case has been treated so far. As a contribution to the software development associated with the EUROGAM spectrometer, we have written and tested the performance of computer codes designed to select multi-dimensional gates from 3-, 4- and 5-fold coincidence databases. The tests were performed on events generated with a Monte Carlo simulation and also on experimental data (triples) recorded with the 8π spectrometer and with a preliminary version of the EUROGAM array. (author). 7 refs., 3 tabs., 1 fig.

  8. Continuous Energy, Multi-Dimensional Transport Calculations for Problem Dependent Resonance Self-Shielding

    International Nuclear Information System (INIS)

    Downar, T.

    2009-01-01

    The overall objective of this work has been to eliminate the approximations used in current resonance treatments by developing continuous energy multi-dimensional transport calculations for problem dependent self-shielding calculations. The work builds on the existing resonance treatment capabilities in the ORNL SCALE code system. Specifically, the methods utilize the existing continuous energy SCALE5 module, CENTRM, and the multi-dimensional discrete ordinates solver, NEWT, to develop a new coupled code, CENTRM/NEWT. The work addresses specific theoretical limitations in the existing CENTRM resonance treatment, and investigates advanced numerical and parallel computing algorithms for CENTRM and NEWT in order to reduce the computational burden. The result will be a new computer code capable of performing problem dependent self-shielding analysis for both existing and proposed GEN-IV fuel designs. The objective of the work was to have an immediate impact on the safety analysis of existing reactors through improvements in the calculation of fuel temperature effects, as well as on the analysis of more sophisticated GEN-IV/NGNP systems through improvements in the depletion/transmutation of actinides for Advanced Fuel Cycle Initiatives.

  10. Identifying associations between pig pathologies using a multi-dimensional machine learning methodology

    Directory of Open Access Journals (Sweden)

    Sanchez-Vazquez, Manuel J; Nielen, Mirjam; Edwards, Sandra A; Gunn, George J; Lewis, Fraser I

    2012-08-01

    Background: Abattoir detected pathologies are of crucial importance to both pig production and food safety. Usually, more than one pathology coexists in a pig herd, although it often remains unknown how these different pathologies interrelate. Identification of the associations between different pathologies may facilitate an improved understanding of their underlying biological linkage, and support veterinarians in encouraging control strategies aimed at reducing the prevalence of not just one, but two or more conditions simultaneously. Results: A multi-dimensional machine learning methodology was used to identify associations between ten typical pathologies in 6485 batches of slaughtered finishing pigs, assisting the comprehension of their biological association. Pathologies potentially associated with septicaemia (e.g. pericarditis, peritonitis) appear interrelated, suggesting on-going bacterial challenges by pathogens such as Haemophilus parasuis and Streptococcus suis. Furthermore, hepatic scarring appears interrelated with both milk spot livers (Ascaris suum) and bacteria-related pathologies, suggesting a potential multi-pathogen nature for this pathology. Conclusions: The application of novel multi-dimensional machine learning methodology provided new insights into how typical pig pathologies are potentially interrelated at batch level. The methodology presented is a powerful exploratory tool to generate hypotheses, applicable to a wide range of studies in veterinary research.

  12. High-frequency stock linkage and multi-dimensional stationary processes

    Science.gov (United States)

    Wang, Xi; Bao, Si; Chen, Jingchao

    2017-02-01

    In recent years, China's stock market has experienced dramatic fluctuations; in particular, in the second half of 2014 and in 2015 the market rose sharply and then fell quickly. Many classical financial phenomena, such as stock plate linkage, appeared repeatedly during this period. In general, these phenomena have usually been studied using daily-level or minute-level data. Our paper focuses on the linkage phenomenon in Chinese stock data at the 5-second level during this extremely volatile period. The method used to select the linkage points and the arbitrage strategy are both based on multi-dimensional stationary processes. A new programmatic method for testing multi-dimensional stationarity is proposed, and the detailed program is presented in the paper's appendix. Because of the existence of the stationary process, the strategy's logarithmic cumulative average return converges under the conditions of the strong ergodic theorem, and this ensures the effectiveness of the stocks' linkage points and a more stable statistical arbitrage strategy.

  13. Statistical Projections for Multi-resolution, Multi-dimensional Visual Data Exploration and Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Hoa T. [Univ. of Utah, Salt Lake City, UT (United States); Stone, Daithi [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bethel, E. Wes [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-01-01

    An ongoing challenge in visual exploration and analysis of large, multi-dimensional datasets is how to present useful, concise information to a user for specific visualization tasks. Typical approaches to this problem have proposed either reduced-resolution versions of data, or projections of data, or both. These approaches still have limitations, such as high computational cost or loss of accuracy. In this work, we explore the use of a statistical metric as the basis for both projections and reduced-resolution versions of data, with a particular focus on preserving one key trait in data, namely variation. We use two different case studies to explore this idea: one that uses a synthetic dataset, and another that uses a large ensemble collection produced by an atmospheric modeling code to study long-term changes in global precipitation. The primary finding of our work is that, in terms of preserving the variation signal inherent in data, a statistical measure more faithfully preserves this key characteristic across both multi-dimensional projections and multi-resolution representations than a methodology based upon averaging.

  15. Adherence is a multi-dimensional construct in the POUNDS LOST trial

    Science.gov (United States)

    Williamson, Donald A.; Anton, Stephen D.; Han, Hongmei; Champagne, Catherine M.; Allen, Ray; LeBlanc, Eric; Ryan, Donna H.; McManus, Katherine; Laranjo, Nancy; Carey, Vincent J.; Loria, Catherine M.; Bray, George A.; Sacks, Frank M.

    2011-01-01

    Research on the conceptualization of adherence to treatment has not addressed a key question: Is adherence best defined as a uni-dimensional or a multi-dimensional behavioral construct? The primary aim of this study was to test which of these conceptual models best described adherence to a weight management program. This ancillary study was conducted as part of the POUNDS LOST trial, which tested the efficacy of four dietary macronutrient compositions for promoting weight loss. A sample of 811 overweight/obese adults was recruited across two clinical sites, and each participant was randomly assigned to one of four macronutrient prescriptions: (1) Low fat (20% of energy), average protein (15% of energy); (2) High fat (40%), average protein (15%); (3) Low fat (20%), high protein (25%); (4) High fat (40%), high protein (25%). Throughout the first 6 months of the study, a computer tracking system collected data on eight indicators of adherence. Computer tracking data from the initial 6 months of the intervention were analyzed using exploratory and confirmatory analyses. Two factors (accounting for 66% of the variance) were identified and confirmed: (1) behavioral adherence and (2) dietary adherence. Behavioral adherence did not differ across the four interventions, but prescription of a high fat diet (vs. a low fat diet) was found to be associated with higher levels of dietary adherence. The findings of this study indicated that adherence to a weight management program was best conceptualized as multi-dimensional, with two dimensions: behavioral and dietary adherence. PMID:19856202

  16. Multi-dimensional self-esteem and magnitude of change in the treatment of anorexia nervosa.

    Science.gov (United States)

    Collin, Paula; Karatzias, Thanos; Power, Kevin; Howard, Ruth; Grierson, David; Yellowlees, Alex

    2016-03-30

    Self-esteem improvement is one of the main targets of inpatient eating disorder programmes. The present study sought to examine multi-dimensional self-esteem and the magnitude of change in eating psychopathology among adults participating in a specialist inpatient treatment programme for anorexia nervosa. A standardised assessment battery, including multi-dimensional measures of eating psychopathology and self-esteem, was completed pre- and post-treatment by 60 participants (all white Scottish females, mean age = 25.63 years). Statistical analyses indicated that self-esteem improved along with eating psychopathology and weight over the course of treatment, but that improvements were domain-specific and small in size. Global self-esteem was not predictive of treatment outcome. Dimensions of self-esteem at baseline (Lovability and Moral Self-approval), however, were predictive of the magnitude of change in dimensions of eating psychopathology (Shape and Weight Concern). The magnitude of change in the Self-Control and Lovability dimensions was predictive of the magnitude of change in eating psychopathology (Global, Dietary Restraint, and Shape Concern). The results of this study demonstrate that the relationship between self-esteem and eating disorders is far from straightforward, and suggest that future research and interventions should focus less exclusively on self-esteem as a uni-dimensional psychological construct. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  17. Multi-dimensional design window search system using neural networks in reactor core design

    International Nuclear Information System (INIS)

    Kugo, Teruhiko; Nakagawa, Masayuki

    2000-02-01

    In reactor core design, many parametric survey calculations must be carried out to decide an optimal set of basic design parameter values. They consume a large amount of computation time and labor in the conventional way. To directly support design work, we investigate a procedure to efficiently search for a design window, defined as the feasible design parameter ranges satisfying design criteria and requirements, in a multi-dimensional space composed of several basic design parameters. We apply the present method to the neutronics and thermal hydraulics fields and develop a multi-dimensional design window search system based on it. The principle of the method is to construct a multilayer neural network that, after a training process, quickly simulates the response of an analysis code, and to reduce computation time by using the neural network in place of parametric studies with the analysis codes. The system runs on an engineering workstation (EWS) with an efficient man-machine interface for pre- and post-processing. This report describes the principle of the method, the structure of the system, guidance on the usage of the system, guidelines for the efficient training of neural networks, instructions for the input data for analysis calculations, and so on. (author)
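
    As a rough illustration of the surrogate idea described above (not the authors' system, which couples the network to real reactor analysis codes), the sketch below trains a small neural network on a handful of runs of a stand-in analysis function, then sweeps a dense parameter grid cheaply to map the design window. The stand-in function, the design limit of 1.5 and the network size are all assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def analysis_code(x):
    # Placeholder for an expensive core-design calculation returning, e.g.,
    # a peaking factor as a function of two basic design parameters.
    return 1.0 + 0.8 * x[:, 0] ** 2 + 0.5 * np.sin(3 * x[:, 1])

rng = np.random.default_rng(0)
X_train = rng.uniform(-1, 1, size=(200, 2))       # sparse "expensive" runs
y_train = analysis_code(X_train)

surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
surrogate.fit(X_train, y_train)                   # training process

# Dense sweep with the cheap surrogate; the design window is the region
# where the predicted response satisfies the (illustrative) design limit.
g = np.linspace(-1, 1, 201)
grid = np.array(np.meshgrid(g, g)).reshape(2, -1).T
ok = surrogate.predict(grid) < 1.5
print("design window covers %.1f%% of the parameter box" % (ok.mean() * 100))
```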

  18. Breast density pattern characterization by histogram features and texture descriptors

    Directory of Open Access Journals (Sweden)

    Pedro Cunha Carneiro

    2017-04-01

    Introduction: Breast cancer is the leading cause of death for women in Brazil, as in most countries in the world. Breast density is related to the risk of breast cancer, but in medical practice breast density classification is merely visual and dependent on professional experience, making this task very subjective. The purpose of this paper is to investigate image features based on histograms and Haralick texture descriptors so as to separate mammographic images into categories of breast density using an Artificial Neural Network. Methods: We used 307 mammographic images from the INbreast digital database, extracting histogram features and texture descriptors of all mammograms and selecting them with the K-means technique. These groups of selected features were then used as inputs to an Artificial Neural Network to classify the images automatically into the four categories reported by radiologists. Results: An average accuracy of 92.9% was obtained in a few tests using only some of the Haralick texture descriptors. The accuracy rate increased to 98.95% when texture descriptors were combined with histogram-based features. Conclusion: Texture descriptors have proven to be better than gray-level features at differentiating breast densities in mammographic images. This work shows that feature selection and classification can be automated with acceptable error rates, since the extracted features suit the characteristics of the images in this problem.
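
    A small sketch of the kind of feature extraction the abstract describes - histogram statistics plus Haralick-style descriptors from a gray-level co-occurrence matrix - using scikit-image. The particular moments, GLCM distances/angles and the random stand-in image are illustrative assumptions; the neural-network stage is omitted.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def density_features(img):
    """Histogram moments plus four GLCM texture descriptors for one 8-bit image."""
    hist, _ = np.histogram(img, bins=256, range=(0, 256), density=True)
    levels = np.arange(256)
    mean = (levels * hist).sum()
    std = np.sqrt(((levels - mean) ** 2 * hist).sum())
    skew = (((levels - mean) / (std + 1e-12)) ** 3 * hist).sum()
    # Gray-level co-occurrence matrix at distance 1, two directions.
    glcm = graycomatrix(img, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    texture = [graycoprops(glcm, p).mean()
               for p in ("contrast", "correlation", "energy", "homogeneity")]
    return np.array([mean, std, skew, *texture])

img = (np.random.rand(128, 128) * 255).astype(np.uint8)   # stand-in mammogram ROI
print(density_features(img))
```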

  19. Automatic analysis of flow cytometric DNA histograms from irradiated mouse male germ cells

    International Nuclear Information System (INIS)

    Lampariello, F.; Mauro, F.; Uccelli, R.; Spano, M.

    1989-01-01

    An automatic procedure for recovering the DNA content distribution of irradiated mouse testis cells from flow cytometric histograms is presented. First, a suitable mathematical model is developed to represent the pattern of DNA content and fluorescence distribution in the sample. Then a parameter estimation procedure, based on the maximum likelihood approach, is constructed by means of an optimization technique. This procedure has been applied to a set of DNA histograms corresponding to different doses of 0.4-MeV neutrons and to different time intervals after irradiation. In each case, good agreement between the measured histograms and the corresponding fits has been obtained. The results indicate that the proposed method for the quantitative analysis of germ cell DNA histograms can be usefully applied to the study of the cytotoxic and mutagenic action of agents of toxicological interest such as ionizing radiation. 18 refs.
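
    A hedged sketch of the estimation step: fitting a mixture model to DNA-content data by maximum likelihood with a numerical optimizer. The two-Gaussian model (2C and 4C peaks) is a simplification of the authors' model, and all numbers are synthetic stand-ins.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def neg_log_likelihood(theta, data):
    # Mixture of two Gaussians: weight, mean/sd of peak 1, mean/sd of peak 2.
    w, mu1, s1, mu2, s2 = theta
    pdf = w * norm.pdf(data, mu1, s1) + (1 - w) * norm.pdf(data, mu2, s2)
    return -np.sum(np.log(pdf + 1e-300))

rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(100, 5, 3000),    # 2C peak (channel units)
                       rng.normal(200, 8, 1000)])   # 4C peak
res = minimize(neg_log_likelihood, x0=[0.5, 90, 10, 210, 10], args=(data,),
               bounds=[(0.01, 0.99), (50, 150), (1, 30), (150, 250), (1, 30)],
               method="L-BFGS-B")
print("fitted weight and peak parameters:", np.round(res.x, 2))
```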

  20. Dose-volume histograms for optimization of treatment plans illustrated by the example of oesophagus carcinoma

    International Nuclear Information System (INIS)

    Roth, J.; Huenig, R.; Huegli, C.

    1995-01-01

    Using the example of oesophagus carcinoma, dose-volume histograms for diverse treatment techniques are calculated and judged by means of multiplanar isodose representations. The selected treatment plans are ranked with the aid of the dose-volume histograms, distinguishing the tissue inside from the tissue outside of the target volume. Describing the spatial dose distribution as a function of the different volumes, and of the respective fractions of the tumor dose therein, with the help of dose-volume histograms establishes a correlation between the physical parameters and the biological effects. In addition, one has to bear in mind the consequences of measures that influence the reaction and the side-effects of radiotherapy (e.g. chemotherapy), i.e. the recuperation of the tissues that were irradiated intentionally or inevitably. Taking all this into account, it is evident that dose-volume histograms are a powerful tool for assessing the quality of treatment plans. (orig./MG) [de]
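
    A minimal sketch of how a cumulative dose-volume histogram is computed from a 3-D dose grid and a binary structure mask. The synthetic dose distribution and the cubic "target volume" are stand-ins for a real treatment plan.

```python
import numpy as np

def cumulative_dvh(dose, mask, n_bins=100):
    """Return dose levels and the % of structure volume receiving >= each level."""
    d = dose[mask]                                  # doses inside the structure
    levels = np.linspace(0.0, d.max(), n_bins)
    volume_pct = np.array([(d >= lv).mean() * 100.0 for lv in levels])
    return levels, volume_pct

dose = np.random.gamma(shape=9.0, scale=6.0, size=(40, 40, 40))  # synthetic Gy grid
target = np.zeros_like(dose, dtype=bool)
target[10:30, 10:30, 10:30] = True          # cubic stand-in for a target volume
levels, vol = cumulative_dvh(dose, target)
i = min(np.searchsorted(levels, 50.0), len(levels) - 1)
print("V50 (volume receiving >= 50 Gy): %.1f%%" % vol[i])
```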

  1. Experimental observation of a multi-dimensional mixing behavior of steam-water flow in the MIDAS test facility

    International Nuclear Information System (INIS)

    Kweon, T. S.; Yun, B. J.; Ah, D. J.; Ju, I. C.; Song, C. H.; Park, J. K.

    2001-01-01

    Multi-dimensional thermal-hydraulic behavior, such as ECC (Emergency Core Cooling) bypass, ECC penetration, steam-water condensation and accumulated water level, in the annular downcomer of a PWR (Pressurized Water Reactor) vessel with a DVI (Direct Vessel Injection) mode is presented based on experimental observations in the MIDAS (Multi-dimensional Investigation in Downcomer Annulus Simulation) steam-water facility. From steady-state tests simulating the late reflood phase of a LBLOCA (Large Break Loss-of-Coolant Accident), the major thermal-hydraulic phenomena in the downcomer are quantified under a wide range of test conditions. In particular, isothermal lines clearly show the multi-dimensional phase interaction between steam and water in the annular downcomer. Overall, the test results show that multi-dimensional thermal-hydraulic behaviors occur in the downcomer annulus region as expected. The MIDAS test facility is a steam-water separate effect test facility, linearly scaled down by 1/4.93 from a 1400 MWe PWR-type nuclear reactor, focusing on understanding multi-dimensional thermal-hydraulic phenomena in the annular downcomer with various safety injection locations during the refill or reflood phase of a LBLOCA in a PWR.

  2. Image compression using moving average histogram and RBF network

    International Nuclear Information System (INIS)

    Khowaja, S.; Ismaili, I.A.

    2015-01-01

    Modernization and globalization have made multimedia technology one of the fastest growing fields in recent times, but optimal use of bandwidth and storage remains a topic that attracts the research community. Considering that images have a lion's share in multimedia communication, efficient image compression techniques have become a basic need for optimal use of bandwidth and space. This paper proposes a novel method for image compression based on the fusion of a moving average histogram and an RBF (Radial Basis Function) network. The proposed technique employs the concept of reducing the number of color intensity levels using the moving average histogram technique, followed by correction of the color intensity levels using RBF networks at the reconstruction phase. Existing methods have used low resolution images for testing purposes, but the proposed method has been tested on various image resolutions to allow a clear assessment of the technique. The proposed method has been tested on 35 images of varying resolution and compared with existing algorithms in terms of CR (Compression Ratio), MSE (Mean Square Error), PSNR (Peak Signal to Noise Ratio) and computational complexity. The outcome shows that the proposed methodology offers a better trade-off in terms of compression ratio, PSNR (which determines the quality of the image) and computational complexity. (author)
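
    The record does not give the algorithm's details; the sketch below shows one plausible reading of the first stage, reducing intensity levels by re-quantizing pixels to the local peaks of a moving-average-smoothed histogram. The window size and peak selection are assumptions of this sketch, and the RBF correction stage is omitted.

```python
import numpy as np

def moving_average_quantize(img, window=9):
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    kernel = np.ones(window) / window
    smooth = np.convolve(hist, kernel, mode="same")       # moving-average histogram
    # Keep local maxima of the smoothed histogram as representative levels.
    peaks = [i for i in range(1, 255)
             if smooth[i] >= smooth[i - 1] and smooth[i] >= smooth[i + 1]]
    levels = np.array(peaks if peaks else [128])
    # Map every pixel to its nearest representative level.
    idx = np.abs(img[..., None].astype(int) - levels[None, None, :]).argmin(-1)
    return levels[idx].astype(np.uint8), levels

img = (np.random.rand(64, 64) * 255).astype(np.uint8)     # stand-in image
reduced, levels = moving_average_quantize(img)
print(len(levels), "intensity levels retained out of 256")
```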

  3. Development and assessment of Multi-dimensional flow models in the thermal-hydraulic system analysis code MARS

    Energy Technology Data Exchange (ETDEWEB)

    Chung, B. D.; Bae, S. W.; Jeong, J. J.; Lee, S. M

    2005-04-15

    A new multi-dimensional component has been developed to allow for more flexible 3D capabilities in the system code MARS. This component can be applied in Cartesian and cylindrical coordinates. For the development of this model, the 3D convection and diffusion terms were implemented in the momentum and energy equations, and a simple Prandtl mixing-length model was applied for the turbulent viscosity. The developed multi-dimensional component was assessed against five conceptual problems with analytic solutions, and several separate effect tests (SETs) were calculated and compared with experimental data. With this newly developed multi-dimensional flow module, the MARS code can realistically calculate the flow fields in pools such as those occurring in the core, steam generators and IRWST.

  5. ACTION RECOGNITION USING SALIENT NEIGHBORING HISTOGRAMS

    DEFF Research Database (Denmark)

    Ren, Huamin; Moeslund, Thomas B.

    2013-01-01

    Combining spatio-temporal interest points with Bag-of-Words models achieves state-of-the-art performance in action recognition. However, existing methods based on "bag-of-words" models either are too local to capture the variance in space/time or fail to solve the ambiguity problem in the spatial and temporal dimensions. Instead, we propose a salient vocabulary construction algorithm to select visual words from a global point of view, and form compact descriptors to represent discriminative histograms in the neighborhoods. Those salient neighboring histograms are then trained to model different actions.
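
    For context, the sketch below shows the standard bag-of-words pipeline the abstract builds on: clustering local spatio-temporal descriptors into a visual vocabulary and histogramming word assignments per video. The salient-vocabulary selection proposed in the paper is not reproduced, and the random descriptors are stand-ins for real HOG/HOF-style features.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
descriptors = rng.normal(size=(5000, 72))           # stand-in local descriptors
vocab = KMeans(n_clusters=64, n_init=4, random_state=0).fit(descriptors)

def bow_histogram(video_descriptors, vocab):
    # Assign each descriptor to its nearest visual word, then count words.
    words = vocab.predict(video_descriptors)
    hist = np.bincount(words, minlength=vocab.n_clusters).astype(float)
    return hist / hist.sum()                        # normalized word histogram

video = rng.normal(size=(300, 72))                  # descriptors of one clip
print(bow_histogram(video, vocab)[:8])
```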

  6. Oriented Shape Index Histograms for Cell Classification

    DEFF Research Database (Denmark)

    Larsen, Anders Boesen Lindbo; Dahl, Anders Bjorholm; Larsen, Rasmus

    2015-01-01

    We propose a novel extension to the shape index histogram feature descriptor in which the orientation of the second-order curvature is included in the histograms. The orientation of the shape index is reminiscent of, but not equal to, the gradient orientation widely used for feature description. We evaluate our new feature descriptor using a public dataset consisting of HEp-2 cell images from indirect immunofluorescence imaging. Our results show that we can improve classification performance significantly when including the shape index orientation. Notably, we show that shape index orientation...
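
    A minimal sketch of the underlying descriptor - the per-pixel shape index derived from second-order curvature, histogrammed over an image - using scikit-image. The orientation extension proposed in the paper is not reproduced here, and the random image and binning are illustrative.

```python
import numpy as np
from skimage.feature import shape_index

img = np.random.rand(128, 128)                      # stand-in cell image
s = shape_index(img, sigma=2)                       # shape index in [-1, 1]
s = s[np.isfinite(s)]                               # flat regions yield NaN
hist, edges = np.histogram(s, bins=8, range=(-1, 1), density=True)
print(np.round(hist, 3))                            # 8-bin shape index histogram
```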

  7. Best-estimated multi-dimensional calculation during LB LOCA for APR1400

    International Nuclear Information System (INIS)

    Oh, D. Y.; Bang, Y. S.; Cheong, A. J.; Woong, S.; Korea, W.

    2010-01-01

    Best-estimate (BE) calculation with uncertainty quantification for emergency core cooling system (ECCS) performance analysis during a Loss of Coolant Accident (LOCA) is increasingly used in the nuclear industry and in regulation. In Korea, demand for regulatory audit calculations is continuously increasing to support safety reviews for life extension, power up-rating and advanced nuclear reactor designs. The thermal-hydraulic system code MARS (Multi-dimensional Analysis of Reactor Safety), with multi-dimensional capability, is used for audit calculations. It describes the complicated phenomena in the reactor coolant system by effectively consolidating the one-dimensional RELAP5/MOD3 code with the multi-dimensional COBRA-TF code. The advanced power reactor (APR1400) to be evaluated has four separate hydraulic trains of the high pressure injection system (HPSI) with direct vessel injection (DVI), which differs from existing commercial PWRs. The thermal-hydraulic behavior of a DVI plant is also considerably different from that of a cold-leg injection plant, since the low pressure safety injection system is eliminated and the high pressure safety flow is injected at a specific elevation of the reactor vessel downcomer. The ECCS bypass induced by downcomer boiling due to hot-wall heating of the reactor vessel during the reflood phase is one of the important phenomena that should be considered in DVI plants. Therefore, in this study, BE calculations with one-dimensional (1-D) and multi-dimensional (multi-D) MARS models during a LBLOCA are performed for the APR1400 plant. In the multi-D evaluation, the reactor vessel is modeled with multi-D components, and specific treatment of the flow paths inside the reactor vessel, e.g., the upper guide structure, is essential. The concept of a hot zone is adopted to simulate the limiting thermal-hydraulic conditions surrounding the hot rod, similar to the hot channel in 1-D. Also, alternative treatment of the hot rods in multi-D is...

  8. Approximate series solution of multi-dimensional, time fractional-order (heat-like) diffusion equations using FRDTM.

    Science.gov (United States)

    Singh, Brajesh K; Srivastava, Vineet K

    2015-04-01

    The main goal of this paper is to present a new approximate series solution of the multi-dimensional (heat-like) diffusion equation with a time-fractional derivative in Caputo form, using a semi-analytical approach: the fractional-order reduced differential transform method (FRDTM). The efficiency of FRDTM is confirmed by considering four test problems of the multi-dimensional time fractional-order diffusion equation. FRDTM is a very efficient, effective and powerful mathematical tool which provides exact or very close approximate solutions for a wide range of real-world problems arising in engineering and the natural sciences, modelled in terms of differential equations.
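
    For readers unfamiliar with the method, the display below sketches the FRDTM recurrence in its commonly published form for the heat-like equation with a Caputo derivative of order 0 < α ≤ 1; the notation follows the general FRDTM literature rather than this specific paper.

```latex
% Time-fractional heat-like equation: D_t^alpha u = nabla^2 u, 0 < alpha <= 1.
\[
  u(\mathbf{x},t)=\sum_{k=0}^{\infty} U_k(\mathbf{x})\,t^{k\alpha},
  \qquad
  U_{k+1}(\mathbf{x})
    =\frac{\Gamma(k\alpha+1)}{\Gamma\bigl((k+1)\alpha+1\bigr)}\,
     \nabla^{2}U_k(\mathbf{x}),
  \qquad
  U_0(\mathbf{x})=u(\mathbf{x},0).
\]
```

    As a sanity check, for α = 1 the gamma-function ratio reduces to 1/(k+1), recovering the classical heat-kernel series u = Σ_k t^k ∇^{2k} u_0 / k!.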

  9. Rarefaction and shock waves for multi-dimensional hyperbolic conservation laws

    International Nuclear Information System (INIS)

    Dening, Li

    1991-01-01

    In this paper, the author wants to show the local existence of a solution combining shock and rarefaction waves for the multi-dimensional hyperbolic system of conservation laws. The typical example he has in mind is the Euler equations for a compressible fluid. More generally, he studies the hyperbolic system of conservation laws
    \[ \partial_t F^0(u) + \sum_{j=1}^{n} \partial_{x_j} F^j(u) = 0, \qquad (1.2) \]
    where $u = (u_1, \dots, u_m)$ and the $F^j(u)$, $j = 0, \dots, n$, are $m$-dimensional vector-valued functions. In the following, some conditions are imposed on the system (1.2); all of these conditions are satisfied by the Euler equations.

  10. A new analytical method to solve the heat equation for a multi-dimensional composite slab

    International Nuclear Information System (INIS)

    Lu, X; Tervola, P; Viljanen, M

    2005-01-01

    A novel analytical approach has been developed for heat conduction in a multi-dimensional composite slab subject to time-dependent boundary changes of the first kind. Boundary temperatures are represented as Fourier series. Taking advantage of the periodic properties of the boundary changes, the analytical solution is obtained and expressed explicitly. Nearly all published works necessitate searching for associated eigenvalues in solving such a problem, even for a one-dimensional composite slab. In this paper, the proposed method involves no iterative computation, such as numerically searching for eigenvalues, and no residue evaluation. The adopted method is simple and represents an extension of the novel analytical approach derived for the one-dimensional composite slab. Moreover, the method of 'separation of variables' employed in this paper is new. The mathematical formula for the solutions is concise and straightforward, and the physical parameters appear clearly in it. A comparison with numerical calculations is also presented.

  11. Extending the Implicit Association Test (IAT): assessing consumer attitudes based on multi-dimensional implicit associations.

    Science.gov (United States)

    Gattol, Valentin; Sääksjärvi, Maria; Carbon, Claus-Christian

    2011-01-05

    The authors present a procedural extension of the popular Implicit Association Test (IAT) that allows for indirect measurement of attitudes on multiple dimensions (e.g., safe-unsafe; young-old; innovative-conventional) rather than on a single evaluative dimension only (e.g., good-bad). In two within-subjects studies, attitudes toward three automobile brands were measured on six attribute dimensions. Emphasis was placed on evaluating the methodological appropriateness of the new procedure, providing strong evidence for its reliability, validity, and sensitivity. The new procedure yields detailed information on the multifaceted nature of brand associations that can add up to a more abstract overall attitude. Like the IAT, its multi-dimensional extension (dubbed md-IAT) is suited to reliably measuring attitudes consumers may not be consciously aware of, able to express, or willing to share with the researcher.

  12. Analysis of multi-dimensional and countercurrent effects in a BWR loss-of-coolant accident

    International Nuclear Information System (INIS)

    Shiralkar, B.S.; Dix, G.E.; Alamgir, M.

    1989-01-01

    The presence of parallel enclosed channels in a BWR provides opportunities for multiple flow regimes in co-current and countercurrent flow under Loss-of-Coolant Accident (LOCA) conditions. To address and understand these phenomena, an integrated experimental and analytical study has been conducted. The primary experimental facility was the Steam Sector Test Facility (SSTF), which simulated a full scale 30° sector of a BWR/6 reactor vessel. Both steady-state separate effects tests and integral transients with vessel blowdown and refill were performed. The presence of multi-dimensional and parallel channel effects was found to be very beneficial to BWR LOCA performance. The best estimate TRAC-BWR computer code was extended as part of this study by incorporating a phenomenological upper plenum mixing model. TRAC-BWR was applied to the analysis of these full scale experiments, and excellent predictions of the phenomena and experimental trends were achieved. (orig.)

  13. Single-phase multi-dimensional thermohydraulics direct numerical simulation code DINUS-3. Input data description

    Energy Technology Data Exchange (ETDEWEB)

    Muramatsu, Toshiharu [Power Reactor and Nuclear Fuel Development Corp., Oarai, Ibaraki (Japan). Oarai Engineering Center

    1998-08-01

    This report explains the numerical methods of, and how to set up input data for, the single-phase multi-dimensional thermohydraulics direct numerical simulation code DINUS-3 (Direct Numerical Simulation using a 3rd-order upwind scheme). The code was developed at the Power Reactor and Nuclear Fuel Development Corporation (PNC) to simulate non-stationary temperature fluctuation phenomena related to thermal striping. The DINUS-3 code is characterized by the use of a third-order upwind scheme for the convection terms in the instantaneous Navier-Stokes and energy equations, and by an adaptive time-step control system based on fuzzy theory. The author expects this report to be very useful for applying the DINUS-3 code to the evaluation of various non-stationary thermohydraulic phenomena in reactors. (author)
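
    As a rough illustration of the scheme named in the record (not PNC's code), the sketch below advects a 1-D pulse using the third-order upwind-biased derivative for positive velocity. The grid, CFL number and the explicit Euler time stepping are assumptions of this sketch.

```python
import numpy as np

def third_order_upwind_step(u, c, dx, dt):
    # For c > 0: u_x(i) ~ (u[i-2] - 6 u[i-1] + 3 u[i] + 2 u[i+1]) / (6 dx),
    # an upwind-biased third-order stencil; np.roll gives periodic boundaries.
    ux = (np.roll(u, 2) - 6 * np.roll(u, 1) + 3 * u + 2 * np.roll(u, -1)) / (6 * dx)
    return u - c * dt * ux

n, c = 200, 1.0
dx = 1.0 / n
dt = 0.2 * dx / c                      # small CFL for the explicit step
x = np.arange(n) * dx
u = np.exp(-200 * (x - 0.3) ** 2)      # Gaussian pulse centered at x = 0.3
for _ in range(int(0.4 / (c * dt))):   # advect the pulse by ~0.4
    u = third_order_upwind_step(u, c, dx, dt)
print("pulse peak now near x = %.2f" % x[np.argmax(u)])
```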

  14. Application of neural network to multi-dimensional design window search

    International Nuclear Information System (INIS)

    Kugo, T.; Nakagawa, M.

    1996-01-01

    In reactor core design, many parametric survey calculations must be carried out to decide an optimal set of basic design parameter values. They consume a large amount of computation time and labor in the conventional way. To directly support such work, we investigate a procedure to efficiently search for a design window, defined as the feasible design parameter ranges satisfying design criteria and requirements, in a multi-dimensional space composed of several basic design parameters. The principle of the present method is to construct a multilayer neural network that, through a training process, quickly simulates the response of an analysis code, and to reduce computation time by using the neural network as a substitute for the analysis code. We apply the present method to the fuel pin design of high conversion light water reactors, covering the neutronics and thermal hydraulics fields, to demonstrate the performance of the method. (author)

  15. Multi-dimensional analysis of high resolution γ-ray data

    Energy Technology Data Exchange (ETDEWEB)

    Flibotte, S.; Huettmeier, U.J.; France, G. de; Haas, B.; Romain, P.; Theisen, Ch.; Vivien, J.P.; Zen, J. [Strasbourg-1 Univ., 67 (France). Centre de Recherches Nucleaires

    1992-12-31

    A new generation of high resolution γ-ray spectrometers capable of recording high-fold coincidence events with a large efficiency will soon be available. Algorithms are developed to analyze high-fold γ-ray coincidences. As a contribution to the software development associated with the EUROGAM spectrometer, the performances of computer codes designed to select multi-dimensional gates from 3-, 4- and 5-fold coincidence databases were tested. The tests were performed on events generated with a Monte Carlo simulation and also on real experimental triple data recorded with the 8π spectrometer and with a preliminary version of the EUROGAM array. (R.P.) 14 refs.; 3 figs.; 3 tabs.

  16. Multi-dimensional knowledge translation: enabling health informatics capacity audits using patient journey models.

    Science.gov (United States)

    Catley, Christina; McGregor, Carolyn; Percival, Jennifer; Curry, Joanne; James, Andrew

    2008-01-01

    This paper presents a multi-dimensional approach to knowledge translation, enabling results obtained from a survey evaluating the uptake of Information Technology within Neonatal Intensive Care Units to be translated into knowledge in the form of health informatics capacity audits. Survey data, spanning multiple roles, patient care scenarios, levels, and hospitals, are translated into patient journey models using a structured data modeling approach. The data model is defined such that users can develop queries to generate patient journey models based on a pre-defined Patient Journey Model architecture (PaJMa). PaJMa models are then analyzed to build capacity audits. Capacity audits offer a sophisticated view of health informatics usage, providing not only details of what IT solutions a hospital utilizes, but also answering the questions of when, how and why: when the IT solutions are integrated into the patient journey, how they support the patient information flow, and why they improve the patient journey.

  17. A Complete Video Coding Chain Based on Multi-Dimensional Discrete Cosine Transform

    Directory of Open Access Journals (Sweden)

    T. Fryza

    2010-09-01

    The paper deals with a video compression method based on the multi-dimensional discrete cosine transform. The encoder and decoder architectures are presented, including definitions of all mathematical operations such as the forward and inverse 3-D DCT, quantization and thresholding. New quantization tables and entropy code dictionaries are proposed according to the particular number of pictures processed at a time. The practical properties of the 3-D DCT coding chain compared with modern video compression methods (such as H.264 and WebM), as well as the computational complexity, are presented. It is shown that the best compression properties are achieved by the complex H.264 codec; on the other hand, the computational complexity - especially on the encoding side - is lower for the 3-D DCT method.
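
    A hedged sketch of the core transform stage such a chain uses: a forward 3-D DCT over a group of frames, coarse uniform quantization, and the inverse transform. The 8x8x8 block size and the quantization step are illustrative assumptions, and the entropy-coding stage is omitted.

```python
import numpy as np
from scipy.fft import dctn, idctn

def code_block(block, q_step=20.0):
    """Forward 3-D DCT -> uniform quantization -> dequantize -> inverse 3-D DCT."""
    coeffs = dctn(block, type=2, norm="ortho")      # separable 3-D DCT-II
    quantized = np.round(coeffs / q_step)           # most small coefficients -> 0
    return idctn(quantized * q_step, type=2, norm="ortho")

frames = np.random.rand(8, 8, 8) * 255              # 8 frames of an 8x8 patch
decoded = code_block(frames)
print("mean abs reconstruction error: %.2f" % np.mean(np.abs(frames - decoded)))
```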

  19. MULTI-DIMENSIONAL MASS SPECTROMETRY-BASED SHOTGUN LIPIDOMICS AND NOVEL STRATEGIES FOR LIPIDOMIC ANALYSES

    Science.gov (United States)

    Han, Xianlin; Yang, Kui; Gross, Richard W.

    2011-01-01

    Since our last comprehensive review on multi-dimensional mass spectrometry-based shotgun lipidomics (Mass Spectrom. Rev. 24 (2005), 367), many new developments in the field of lipidomics have occurred. These developments include new strategies and refinements for shotgun lipidomic approaches that use direct infusion, including novel fragmentation strategies, identification of multiple new informative dimensions for mass spectrometric interrogation, and the development of new bioinformatic approaches for enhanced identification and quantitation of the individual molecular constituents that comprise each cell's lipidome. Concurrently, advances in liquid chromatography-based platforms and novel strategies for quantitative matrix-assisted laser desorption/ionization mass spectrometry for lipidomic analyses have been made. Through the synergistic use of this repertoire of new mass spectrometric approaches, the power and scope of lipidomics have been greatly expanded to accelerate progress toward a comprehensive understanding of the pleiotropic roles of lipids in biological systems. PMID:21755525

  20. Challenges in Constructing a Multi-dimensional European Job Quality Index

    DEFF Research Database (Denmark)

    Leschke, Janine; Watt, Andrew

    2014-01-01

    There are few attempts to benchmark job quality in a multi-dimensional perspective across Europe. Against this background, we have created a synthetic job quality index (JQI) for the EU27 countries in an attempt to shed light on the question of how European countries compare with each other and how they are developing over time in terms of job quality. Taking account of the multi-faceted nature of job quality, the JQI is compiled on the basis of six sub-indices which cover the most important dimensions of job quality as identified in the literature. This makes it possible to benchmark job quality performances and the outcomes in six sub-dimensions of job quality, and to compare them with each other, across gender and over time. At the same time, the limitations of such a composite index need to be borne in mind; the most important challenges are the availability (over time) and timeliness of data. The paper addresses the methods used to construct the JQI...

  1. Energy method for multi-dimensional balance laws with non-local dissipation

    KAUST Repository

    Duan, Renjun; Fellner, Klemens; Zhu, Changjiang

    2010-06-01

    In this paper, we are concerned with a class of multi-dimensional balance laws with a non-local dissipative source which arise as simplified models for the hydrodynamics of radiating gases. At first we introduce the energy method in the setting of smooth perturbations and study the stability of constant states. Precisely, we use Fourier space analysis to quantify the energy dissipation rate and recover the optimal time-decay estimates for perturbed solutions via an interpolation inequality in Fourier space. As an application, the developed energy method is used to prove stability of smooth planar waves in all dimensions n ≥ 2, and also to show existence and stability of time-periodic solutions in the presence of a time-periodic source. Optimal rates of convergence of solutions towards the planar waves or time-periodic states are also shown, provided the initial perturbations are in L1. © 2009 Elsevier Masson SAS.

  2. Multi-dimensional fiber-optic radiation sensor for ocular proton therapy dosimetry

    International Nuclear Information System (INIS)

    Jang, K.W.; Yoo, W.J.; Moon, J.; Han, K.T.; Park, B.G.; Shin, D.; Park, S-Y.; Lee, B.

    2012-01-01

    In this study, we fabricated a multi-dimensional fiber-optic radiation sensor, consisting of organic scintillators, plastic optical fibers and a water phantom with a polymethyl methacrylate structure, for ocular proton therapy dosimetry. To characterize the sensor, we measured the spread-out Bragg peak of a 120 MeV proton beam using a one-dimensional sensor array of 30 fiber-optic radiation sensors with a 1.5 mm interval. A uniform spread-out Bragg peak region was obtained at depths of 20 to 25 mm in the phantom with the one-dimensional array. In addition, the Bragg peak of a 109 MeV proton beam was measured at a depth of 11.5 mm in the phantom using a two-dimensional array of 10×3 sensors with a 0.5 mm interval.

  3. Multi-dimensional single-spin nano-optomechanics with a levitated nanodiamond

    Science.gov (United States)

    Neukirch, Levi P.; von Haartman, Eva; Rosenholm, Jessica M.; Nick Vamivakas, A.

    2015-10-01

    Considerable advances made in the development of nanomechanical and nano-optomechanical devices have enabled the observation of quantum effects, improved sensitivity to minute forces, and provided avenues to probe fundamental physics at the nanoscale. Concurrently, solid-state quantum emitters with optically accessible spin degrees of freedom have been pursued in applications ranging from quantum information science to nanoscale sensing. Here, we demonstrate a hybrid nano-optomechanical system composed of a nanodiamond (containing a single nitrogen-vacancy centre) that is levitated in an optical dipole trap. The mechanical state of the diamond is controlled by modulation of the optical trapping potential. We demonstrate the ability to imprint the multi-dimensional mechanical motion of the cavity-free mechanical oscillator into the nitrogen-vacancy centre fluorescence and manipulate the mechanical system's intrinsic spin. This result represents the first step towards a hybrid quantum system based on levitating nanoparticles that simultaneously engages optical, phononic and spin degrees of freedom.

  4. CANDU safety analysis system establishment; development of trip coverage and multi-dimensional hydrogen analysis methodology

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jong Ho; Ohn, M. Y.; Cho, C. H. [KOPEC, Taejon (Korea)

    2002-03-01

    The trip coverage analysis model requires the geometry network for the primary and secondary circuits as well as the plant control systems in order to simulate all the possible plant operating conditions throughout the plant life. The model was validated against power maneuvering and the Wolsong 4 commissioning tests. Trip coverage maps were produced for the large break loss of coolant accident and the complete loss of class IV power event. Reliable multi-dimensional hydrogen analysis requires a strong thermal-hydraulic modelling capability. To acquire such a basic capability and to verify the applicability of the GOTHIC code, assessments of the heat transfer, hydrogen mixing and combustion models were performed. An assessment methodology for flame acceleration and deflagration-to-detonation transition was also established. 22 refs., 120 figs., 31 tabs. (Author)

  5. Multi-dimensional diagnostics of high power ion beams by Arrayed Pinhole Camera System

    International Nuclear Information System (INIS)

    Yasuike, K.; Miyamoto, S.; Shirai, N.; Akiba, T.; Nakai, S.; Imasaki, K.; Yamanaka, C.

    1993-01-01

    The authors developed a multi-dimensional beam diagnostics system with spatial and time resolution, based on the newly developed Arrayed Pinhole Camera (APC). The APC measures the spatial distribution of beam divergence and flux density. Two types of particle detectors are used in this study: CR-39, which records time-integrated images, and a gated micro-channel plate (MCP) with a CCD camera, which enables time-resolved diagnostics. The diagnostics systems achieve a divergence resolution better than 10 mrad and a spatial resolution of 0.5 mm at the object, and the time-resolving system has a time resolution of 10 ns. The experiments were performed on the Reiden-IV and Reiden-SHVS induction linacs: time-integrated divergence distributions were obtained for the Reiden-IV proton beam, and time-resolved images were obtained on Reiden-SHVS.

  6. An exploration study to find important factors influencing on multi-dimensional organizational culture

    Directory of Open Access Journals (Sweden)

    Naser Azad

    2013-08-01

    This paper presents an empirical investigation to find important factors influencing multi-dimensional organizational culture. The proposed study designs a Likert-scale questionnaire consisting of 21 questions, distributes it among 300 people working in different business units, and collects 283 completed responses. Cronbach's alpha is calculated as 0.799. In addition, the Kaiser-Meyer-Olkin Measure of Sampling Adequacy and the approximate Chi-Square are 0.821 and 1395.74, respectively. The study implements principal component analysis, and the results indicate four factors influencing organizational culture: diversity in culture, connection-based culture, integrated culture and structure of culture. In terms of diversity in culture, sensitivity to quality data and cultural flexibility are the most influential sub-factors, while connection-based marketing and relational satisfaction are two important sub-factors associated with connection-based culture. The study discusses other issues as well.

  7. Racial-ethnic self-schemas: Multi-dimensional identity-based motivation

    Science.gov (United States)

    Oyserman, Daphna

    2008-01-01

    Prior self-schema research focuses on benefits of being schematic vs. aschematic in stereotyped domains. The current studies build on this work, examining racial-ethnic self-schemas as multi-dimensional, containing multiple, conflicting, and non-integrated images. A multidimensional perspective captures complexity; examining net effects of dimensions predicts within-group differences in academic engagement and well-being. When racial-ethnicity self-schemas focus attention on membership in both in-group and broader society, engagement with school should increase since school is not seen as out-group defining. When racial-ethnicity self-schemas focus attention on inclusion (not obstacles to inclusion) in broader society, risk of depressive symptoms should decrease. Support for these hypotheses was found in two separate samples (8th graders, n = 213, 9th graders followed to 12th grade n = 141). PMID:19122837

  8. Development of Multi-Dimensional RELAP5 with Conservative Momentum Flux

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Hyung Wook; Lee, Sang Yong [KINGS, Ulsan (Korea, Republic of)

    2016-10-15

    The non-conservative form of the momentum equations is used in many codes; however, using the non-conservative form for non-porous or open-medium problems may not be appropriate. In this paper, two aspects of multi-dimensional codes are discussed: the appropriateness of the form of the momentum equations, and the implementation of the conservative momentum flux term in RELAP5. Once the validity of the modified code is confirmed, it is applied to the analysis of the large break LOCA for APR-1400. From the present and a former study, it is shown that RELAP5 Multi-D with conservative convective terms is applicable to LOCA analysis, and the implementation of the conservative convective terms in RELAP5 appears to be successful. Further efforts have to be made to make it more robust.
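
    For reference, the display below contrasts the two forms the abstract refers to, written for single-phase flow (the RELAP5 two-fluid equations carry additional phasic and interfacial terms); this is textbook material, not the paper's derivation.

```latex
% Conservative (divergence) form of the momentum equation:
\[
  \frac{\partial(\rho\mathbf{u})}{\partial t}
    + \nabla\!\cdot\!\left(\rho\,\mathbf{u}\otimes\mathbf{u}\right)
    = -\nabla p + \rho\mathbf{g}.
\]
% Non-conservative (convective) form, obtained using mass conservation:
\[
  \rho\,\frac{\partial\mathbf{u}}{\partial t}
    + \rho\,(\mathbf{u}\cdot\nabla)\mathbf{u}
    = -\nabla p + \rho\mathbf{g}.
\]
```

    The two coincide for smooth solutions, but only the conservative form preserves momentum exactly across discontinuities or on the coarse open-medium meshes the abstract is concerned with.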

  9. A nodal collocation approximation for the multi-dimensional PL equations - 2D applications

    International Nuclear Information System (INIS)

    Capilla, M.; Talavera, C.F.; Ginestar, D.; Verdu, G.

    2008-01-01

    A classical approach to solving the neutron transport equation is to apply the spherical harmonics method, obtaining a finite approximation known as the PL equations. In this work, the derivation of the PL equations for multi-dimensional geometries is reviewed, and a nodal collocation method is developed to discretize these equations on a rectangular mesh, based on the expansion of the neutronic fluxes in terms of orthogonal Legendre polynomials. The performance of the method and the dominant transport Lambda Modes are obtained for a homogeneous 2D problem, a heterogeneous 2D anisotropic scattering problem, a heterogeneous 2D problem and a benchmark problem corresponding to a MOX fuel reactor core.

  10. Fuzzy Regression Prediction and Application Based on Multi-Dimensional Factors of Freight Volume

    Science.gov (United States)

    Xiao, Mengting; Li, Cheng

    2018-01-01

    Based on the actual development of air cargo, the multi-dimensional fuzzy regression method is used to determine the influencing factors, and the three most important factors - GDP, total fixed assets investment and regular flight route mileage - are identified. Using systems viewpoints and analogy methods, fuzzy numbers and multiple regression are combined to predict civil aviation cargo volume. Comparison with the 13th Five-Year Plan for China's Civil Aviation Development (2016-2020) shows that this method can effectively improve forecasting accuracy and reduce forecasting risk, demonstrating that the model is feasible for predicting civil aviation freight volume and has high practical significance and operability.

  11. Multi Dimensional Honey Bee Foraging Algorithm Based on Optimal Energy Consumption

    Science.gov (United States)

    Saritha, R.; Vinod Chandra, S. S.

    2017-10-01

    In this paper a new nature-inspired algorithm is proposed based on the natural foraging behavior of multi-dimensional honey bee colonies. This method handles issues that arise when food is shared from multiple sources by multiple swarms at multiple destinations. The self-organizing nature of natural honey bee swarms in multiple colonies is based on the principle of energy consumption: swarms of multiple colonies select a food source to optimally fulfill the requirements of their colonies, based on the energy required to transport food between a source and a destination. Minimum use of energy leads to maximum profit in each colony. The mathematical model proposed here is based on this principle. It has been successfully evaluated by applying it to a multi-objective transportation problem, optimizing cost and time. The algorithm optimizes the needs at each destination in linear time.

  12. Energy method for multi-dimensional balance laws with non-local dissipation

    KAUST Repository

    Duan, Renjun; Fellner, Klemens; Zhu, Changjiang

    2010-01-01

    In this paper, we are concerned with a class of multi-dimensional balance laws with a non-local dissipative source which arise as simplified models for the hydrodynamics of radiating gases. At first we introduce the energy method in the setting of smooth perturbations and study the stability of constant states. Precisely, we use Fourier space analysis to quantify the energy dissipation rate and recover the optimal time-decay estimates for perturbed solutions via an interpolation inequality in Fourier space. As an application, the developed energy method is used to prove stability of smooth planar waves in all dimensions n ≥ 2, and also to show existence and stability of time-periodic solutions in the presence of a time-periodic source. Optimal rates of convergence of solutions towards the planar waves or time-periodic states are also shown, provided the initial perturbations are in L1. © 2009 Elsevier Masson SAS.

  13. A multi-dimensional framework to assist in the design of successful shared services centres

    Directory of Open Access Journals (Sweden)

    Mark Borman

    2012-04-01

    Organisations are increasingly looking to realise the benefits of shared services, yet there is little guidance available as to the best way to proceed. A multi-dimensional framework is presented that considers the service provided, the design of the shared services centre and the organisational context it sits within. Case studies are then used to determine which specific attributes from each dimension are associated with success and how they should be aligned. It is concluded that there appears to be a single, broadly standard pattern of attributes for successful Shared Services Centres (SSCs) across the proposed dimensions of Activity, Environment, History, Resources, Strategy, Structure, Management, Technology and Individual Skills. It should also be noted, though, that some deviation from the identified standard along some dimensions is possible without adverse effect - i.e. the alignment identified appears to be relatively soft.

  14. Histogram-based quantitative evaluation of endobronchial ultrasonography images of peripheral pulmonary lesion.

    Science.gov (United States)

    Morikawa, Kei; Kurimoto, Noriaki; Inoue, Takeo; Mineshita, Masamichi; Miyazawa, Teruomi

    2015-01-01

    Endobronchial ultrasonography using a guide sheath (EBUS-GS) is an increasingly common bronchoscopic technique, but currently no methods have been established to quantitatively evaluate EBUS images of peripheral pulmonary lesions. The purpose of this study was to evaluate whether histogram data collected from EBUS-GS images can contribute to the diagnosis of lung cancer. Histogram-based analyses focusing on the brightness of EBUS images were retrospectively conducted: 60 patients (38 lung cancer; 22 inflammatory diseases) with clear EBUS images were included. For each patient, a 400-pixel region of interest, typically located at a 3- to 5-mm radius from the probe, was selected from EBUS images recorded during bronchoscopy. Histogram height, width, height/width ratio, standard deviation, kurtosis and skewness were investigated as diagnostic indicators. Median histogram height, width, height/width ratio and standard deviation were significantly different between lung cancer and benign lesions (all p values statistically significant), with the clearest separation provided by histogram standard deviation. Histogram standard deviation appears to be the most useful characteristic for diagnosing lung cancer using EBUS images. © 2015 S. Karger AG, Basel.
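
    The indicators named above are straightforward to compute once the region of interest is available. A minimal sketch, assuming a hypothetical 8-bit 400-pixel ROI array; the ROI selection and any clinical thresholds are not reproduced:

    ```python
    import numpy as np
    from scipy.stats import kurtosis, skew

    def histogram_indicators(roi, n_bins=256):
        """Brightness-histogram features of an 8-bit ultrasound ROI."""
        hist, _ = np.histogram(roi, bins=n_bins, range=(0, 256))
        occupied = np.nonzero(hist)[0]
        height = int(hist.max())                     # peak bin count
        width = int(occupied[-1] - occupied[0] + 1)  # occupied brightness range
        return {"height": height,
                "width": width,
                "height_width_ratio": height / width,
                "std": float(np.std(roi)),
                "kurtosis": float(kurtosis(roi.ravel())),
                "skewness": float(skew(roi.ravel()))}

    rng = np.random.default_rng(0)
    roi = rng.normal(120, 25, size=400).clip(0, 255).astype(np.uint8)  # fake ROI
    print(histogram_indicators(roi))
    ```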

  15. TWO-DIMENSIONAL CORE-COLLAPSE SUPERNOVA MODELS WITH MULTI-DIMENSIONAL TRANSPORT

    International Nuclear Information System (INIS)

    Dolence, Joshua C.; Burrows, Adam; Zhang, Weiqun

    2015-01-01

    We present new two-dimensional (2D) axisymmetric neutrino radiation/hydrodynamic models of core-collapse supernova (CCSN) cores. We use the CASTRO code, which incorporates truly multi-dimensional, multi-group, flux-limited diffusion (MGFLD) neutrino transport, including all relevant O(v/c) terms. Our main motivation for carrying out this study is to compare with recent 2D models produced by other groups who have obtained explosions for some progenitor stars and with recent 2D VULCAN results that did not incorporate O(v/c) terms. We follow the evolution of 12, 15, 20, and 25 solar-mass progenitors to approximately 600 ms after bounce and do not obtain an explosion in any of these models. Though the reason for the qualitative disagreement among the groups engaged in CCSN modeling remains unclear, we speculate that the simplifying "ray-by-ray" approach employed by all other groups may be compromising their results. We show that "ray-by-ray" calculations greatly exaggerate the angular and temporal variations of the neutrino fluxes, which we argue are better captured by our multi-dimensional MGFLD approach. On the other hand, our 2D models also make approximations, making it difficult to draw definitive conclusions concerning the root of the differences between groups. We discuss some of the diagnostics often employed in the analyses of CCSN simulations and highlight the intimate relationship between the various explosion conditions that have been proposed. Finally, we explore the ingredients that may be missing in current calculations that may be important in reproducing the properties of the average CCSNe, should the delayed neutrino-heating mechanism be the correct mechanism of explosion.

  16. Calculation of multi-dimensional dose distribution in medium due to proton beam incidence

    International Nuclear Information System (INIS)

    Kawachi, Kiyomitsu; Inada, Tetsuo

    1978-01-01

    The method of analyzing the multi-dimensional dose distribution in a medium due to proton beam incidence is presented, with the aim of obtaining a reliable yet simple method from the clinical viewpoint, especially for the medical treatment of cancer. The heavy ion beam extracted from an accelerator has to be adjusted to fit the location and size of the cancer, utilizing a modified range modulator, a ridge filter, a bolus and a special scanning apparatus. A precise calculation of the multi-dimensional dose distribution of the proton beam is needed to confine the treatment to the target region. The analytical formulas comprise those for the fluence distribution in a medium, the divergence of the flying range, the energy distribution itself, the lateral dose distribution and the two-dimensional dose distribution. The following analytical results, among others, are presented and evaluated: the fluence distribution in polystyrene for protons with incident energies of 40 and 60 MeV; the energy distribution of protons at the position of the Bragg peak for various incident energies; the depth dose distribution in polystyrene for protons with incident energies of 40 and 60 MeV and an average energy of 100 MeV; the proton fluence and dose distribution as functions of depth for an incident average energy of 250 MeV; the statistically estimated percentage errors in the proton fluence and dose distribution; the estimated minimum detectable tumor thickness as a function of the number of incident protons for different incident spectra with an average energy of 250 MeV; and the isodose distribution in a plane containing the central axis for an incident proton beam of 3 mm diameter and 40 MeV. (Nakai, Y.)

  17. Predicting respiratory tumor motion with multi-dimensional adaptive filters and support vector regression

    International Nuclear Information System (INIS)

    Riaz, Nadeem; Wiersma, Rodney; Mao Weihua; Xing Lei; Shanker, Piyush; Gudmundsson, Olafur; Widrow, Bernard

    2009-01-01

    Intra-fraction tumor tracking methods can improve radiation delivery during radiotherapy sessions. Image acquisition for tumor tracking and subsequent adjustment of the treatment beam with gating or beam tracking introduces time latency and necessitates predicting the future position of the tumor. This study evaluates the use of multi-dimensional linear adaptive filters and support vector regression to predict the motion of lung tumors tracked at 30 Hz. We expand on the prior work of other groups who have looked at adaptive filters by using a general framework of a multiple-input single-output (MISO) adaptive system that uses multiple correlated signals to predict the motion of a tumor. We compare the performance of these two novel methods to conventional methods like linear regression and single-input, single-output adaptive filters. At 400 ms latency the average root-mean-square-errors (RMSEs) for the 14 treatment sessions studied using no prediction, linear regression, single-output adaptive filter, MISO and support vector regression are 2.58, 1.60, 1.58, 1.71 and 1.26 mm, respectively. At 1 s, the RMSEs are 4.40, 2.61, 3.34, 2.66 and 1.93 mm, respectively. We find that support vector regression most accurately predicts the future tumor position of the methods studied and can provide a RMSE of less than 2 mm at 1 s latency. Also, a multi-dimensional adaptive filter framework provides improved performance over single-dimension adaptive filters. Work is underway to combine these two frameworks to improve performance.
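
    The MISO idea, predicting one target trace from windows of several correlated inputs, can be sketched with a normalized LMS adaptive filter. The configuration below (taps, step size, toy breathing-like signals) is illustrative, not the paper's setup:

    ```python
    import numpy as np

    def miso_lms_predict(signals, target, horizon, taps=10, mu=0.01):
        """Predict target[t + horizon] from the last 'taps' samples of every
        input signal with a normalized LMS update (multiple-input,
        single-output)."""
        n_sig, T = signals.shape
        w = np.zeros(n_sig * taps)
        preds = np.zeros(T)
        for t in range(taps - 1, T - horizon):
            x = signals[:, t - taps + 1:t + 1].ravel()      # stacked input window
            preds[t + horizon] = w @ x
            err = target[t + horizon] - preds[t + horizon]  # prediction error
            w += mu * err * x / (x @ x + 1e-8)              # normalized LMS step
        return preds

    # Toy example: two correlated breathing-like traces predict a third.
    t = np.linspace(0, 60, 1800)                      # 30 Hz sampling, 60 s
    sig = np.vstack([np.sin(2*np.pi*0.3*t), np.cos(2*np.pi*0.3*t)])
    target = np.sin(2*np.pi*0.3*t + 0.2)
    pred = miso_lms_predict(sig, target, horizon=12)  # 12 samples = 400 ms
    rmse = np.sqrt(np.mean((pred[600:] - target[600:])**2))
    print(f"RMSE after convergence: {rmse:.4f}")
    ```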

  18. Optimal sensor configuration for flexible structures with multi-dimensional mode shapes

    International Nuclear Information System (INIS)

    Chang, Minwoo; Pakzad, Shamim N

    2015-01-01

    A framework for deciding the optimal sensor configuration is implemented for civil structures with multi-dimensional mode shapes, which enhances the applicability of structural health monitoring for existing structures. Optimal sensor placement (OSP) algorithms are used to determine the best sensor configuration for structures with a priori knowledge of modal information. The signal strength at each node is evaluated by effective independence and modified variance methods. Euclidean norm of signal strength indices associated with each node is used to expand OSP applicability into flexible structures. The number of sensors for each method is determined using the threshold for modal assurance criterion (MAC) between estimated (from a set of observations) and target mode shapes. Kriging is utilized to infer the modal estimates for unobserved locations with a weighted sum of known neighbors. A Kriging model can be expressed as a sum of linear regression and random error which is assumed as the realization of a stochastic process. This study presents the effects of Kriging parameters for the accurate estimation of mode shapes and the minimum number of sensors. The feasible ranges to satisfy MAC criteria are investigated and used to suggest the adequate searching bounds for associated parameters. The finite element model of a tall building is used to demonstrate the application of optimal sensor configuration. The dynamic modes of flexible structure at centroid are appropriately interpreted into the outermost sensor locations when OSP methods are implemented. Kriging is successfully used to interpolate the mode shapes from a set of sensors and to monitor structures associated with multi-dimensional mode shapes. (paper)
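
    The effective independence (EfI) ranking mentioned above has a compact standard form: iteratively delete the candidate degree of freedom contributing least to the Fisher information of the target modes. A sketch with random mode shapes standing in for a real model:

    ```python
    import numpy as np

    def effective_independence(Phi, n_sensors):
        """Select sensor DOFs by effective independence (EfI): iteratively
        delete the row of the mode-shape matrix whose EfI value (its
        contribution to the Fisher information of the modal coordinates)
        is smallest."""
        keep = np.arange(Phi.shape[0])
        while keep.size > n_sensors:
            P = Phi[keep]
            # Diagonal of the projection matrix P (P^T P)^-1 P^T
            E = np.einsum('ij,jk,ik->i', P, np.linalg.inv(P.T @ P), P)
            keep = np.delete(keep, np.argmin(E))
        return keep

    rng = np.random.default_rng(1)
    Phi = rng.standard_normal((50, 3))     # 50 candidate DOFs, 3 target modes
    print(sorted(effective_independence(Phi, n_sensors=8)))
    ```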

  19. Histogram Modification and Wavelet Transform for High Performance Watermarking

    Directory of Open Access Journals (Sweden)

    Ying-Shen Juang

    2012-01-01

    This paper proposes a reversible watermarking technique for natural images. Owing to the similarity of neighboring coefficients' values in the wavelet domain, most differences between two adjacent pixels are close to zero. The histogram is built based on these difference statistics. As more peak points can be used for secret data hiding, the hiding capacity is improved compared with conventional methods. Moreover, as the concentration of the differences around zero is improved, the transparency of the host image can be increased. Experimental results and comparison show that the proposed method has advantages in both hiding capacity and transparency.
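
    The histogram-shifting principle can be shown compactly on spatial-domain pixel differences (the paper builds the difference histogram from wavelet-domain coefficients, which concentrates it more sharply around zero). A minimal single-peak sketch; pixel overflow/underflow handling is omitted:

    ```python
    import numpy as np

    def embed(row, bits):
        """Reversible data hiding by histogram shifting of adjacent-pixel
        differences, using the d == 0 peak. Capacity = number of zero
        differences in the row."""
        d = np.diff(row.astype(np.int32))
        bits = list(bits)
        for i in range(len(d)):
            if d[i] >= 1:              # shift the positive half right by one ...
                d[i] += 1
            elif d[i] == 0 and bits:   # ... so d == 1 is free to carry a bit
                d[i] += bits.pop(0)
        return np.concatenate(([row[0]], row[0] + np.cumsum(d)))

    def extract(marked, n_bits):
        """Recover the payload and restore the original row exactly."""
        d = np.diff(marked.astype(np.int32))
        bits = []
        for i in range(len(d)):
            if d[i] in (0, 1) and len(bits) < n_bits:
                bits.append(int(d[i]))     # payload position
                d[i] = 0
            elif d[i] >= 2:                # undo the shift
                d[i] -= 1
        return np.concatenate(([marked[0]], marked[0] + np.cumsum(d))), bits

    row = np.array([100, 100, 101, 103, 103, 102, 102], dtype=np.int32)
    marked = embed(row, [1, 0, 1])
    restored, payload = extract(marked, 3)
    print(payload, np.array_equal(restored, row))   # [1, 0, 1] True
    ```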

  20. Multi-dimensional dosimetric verification of stereotactic radiotherapy for uveal melanoma using radiochromic EBT film

    International Nuclear Information System (INIS)

    Sturtewagen, E.; Fuss, M.; Georg, D.; Paelinck, L.; Wagter, C. de

    2008-01-01

    Since 1997, linac based stereotactic radiotherapy (SRT) of uveal melanoma has been continuously developed at the Department of Radiotherapy, Medical University Vienna. The aim of the present study was (i) to test a new type of radiochromic film (Gafchromic EBT) for dosimetric verification of class solutions for these treatments and (ii) to verify treatment plan acceptance criteria, which are based on gamma value statistics. An EPSON Expression 1680 Pro flat bed scanner was utilized for film reading. To establish a calibration curve, films were cut into squares of 2 x 2 cm², positioned at 5 cm depth in a solid water phantom and irradiated with different dose levels between 0.5 and 5 Gy in a 5 x 5 cm² field at 6 MV. A previously developed solid phantom (polystyrene) was used with overall dimensions corresponding to an average human head. EBT films were placed at four different depths (10, 20, 25 and 30 mm) and all films were irradiated simultaneously. Four different treatment plans were verified that resemble typical clinical situations. These plans differed in irradiation technique (conformal mMLC or circular arc SRT) and in tumour size (PTV of 1 or 2.5 cm³). In-house developed software was applied to calculate gamma (γ) index values and to perform several statistical operations (e.g. γ-area histograms). At depths of 10 mm, γ1% (the γ-value exceeded by 1% of the points in the region of interest) was between 1 and 3, and γ > 1 areas (the percentage of γ-values above 1 in the region of interest) reached almost 30%. At larger depths, i.e. closer to the isocenter, γ1% was lower and γ > 1 areas were mostly below 5%. Average γ values were about 0.5. Apart from the compromised accuracy in the buildup region, previously defined IMRT acceptance criteria [Stock et al., Phys. Med. Biol. 50 (2005) 399-411] could be applied to SRT as well. Radiochromic EBT films, in combination with a flat-bed scanner, were found to be an ideal multidimensional dosimetric tool for treatment verification.
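
    The γ statistics quoted above follow the standard gamma-index construction combining dose difference and distance-to-agreement. A minimal 1D sketch with assumed 3%/3 mm criteria and a toy profile:

    ```python
    import numpy as np

    def gamma_index_1d(dose_eval, dose_ref, dx, dd=0.03, dta=3.0):
        """1D gamma index: for each evaluated point, minimize the combined
        dose-difference / distance-to-agreement metric over the reference
        profile. dd = dose criterion (fraction of max), dta in mm, dx = grid
        spacing in mm."""
        x = np.arange(len(dose_ref)) * dx
        norm = dd * dose_ref.max()                 # global dose normalization
        gam = np.empty(len(dose_eval))
        for i, (xi, di) in enumerate(zip(x, dose_eval)):
            gam[i] = np.sqrt(((x - xi) / dta) ** 2 +
                             ((dose_ref - di) / norm) ** 2).min()
        return gam

    x = np.arange(100) * 1.0                       # 1 mm grid
    ref = np.exp(-((x - 50) / 12) ** 2)            # reference profile
    ev = 1.02 * np.exp(-((x - 51) / 12) ** 2)      # shifted, rescaled measurement
    g = gamma_index_1d(ev, ref, dx=1.0)
    print("gamma > 1 area: %.1f%%" % (100 * np.mean(g > 1)))
    print("gamma_1%%: %.2f" % np.percentile(g, 99))  # value exceeded by 1% of points
    ```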

  1. A comparison of automatic histogram constructions

    NARCIS (Netherlands)

    Davies, P.L.; Gather, U.; Nordman, D.J.; Weinert, H.

    2009-01-01

    Even for a well-trained statistician the construction of a histogram for a given real-valued data set is a difficult problem. It is even more difficult to construct a fully automatic procedure which specifies the number and widths of the bins in a satisfactory manner for a wide range of data sets.
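
    As a concrete illustration of why this is hard, numpy ships several fully automatic bin-selection rules, and they can disagree substantially even on simple data:

    ```python
    import numpy as np

    # Automatic histogram constructions trade bias against variance
    # differently, so they disagree even on a simple bimodal sample.
    rng = np.random.default_rng(42)
    data = np.concatenate([rng.normal(0, 1, 700), rng.normal(5, 0.5, 300)])

    for rule in ["sturges", "scott", "fd", "doane", "auto"]:
        edges = np.histogram_bin_edges(data, bins=rule)
        print(f"{rule:8s} -> {len(edges) - 1:3d} bins")
    ```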

  2. The Multi-Dimensional Blood/Injury Phobia Inventory : Its psychometric properties and relationship with disgust propensity and disgust sensitivity

    NARCIS (Netherlands)

    van Overveld, Mark; de Jong, Peter J.; Peters, Madelon L.

    The Multi-Dimensional Blood Phobia Inventory (MBPI: Wenzel & Holt, 2003) is the only instrument available that assesses both disgust and anxiety for blood-phobic stimuli. As inflated levels of disgust propensity (i.e., tendency to experience disgust more readily) are often observed in blood phobia,

  3. A multi-dimensional approach to talent: An empirical analysis of the definition of talent in Dutch academia

    NARCIS (Netherlands)

    Thunissen, M.; Arensbergen, P. van

    2015-01-01

    - Purpose – The purpose of this paper is to contribute to the development of a broader, multi-dimensional approach to talent that helps scholars and practitioners to fully understand the nuances and complexity of talent in the organizational context. - Design/methodology/approach – The data were

  4. Multi-dimensional Analysis for SLB Transient in ATLAS Facility as Activity of DSP (Domestic Standard Problem)

    International Nuclear Information System (INIS)

    Bae, B. U.; Park, Y. S.; Kim, J. R.; Kang, K. H.; Choi, K. Y.; Sung, H. J.; Hwang, M. J.; Kang, D. H.; Lim, S. G.; Jun, S. S.

    2015-01-01

    Participants in DSP-03 were divided into three groups, and each group focused on a specific subject related to enhancing the code analysis. Group A investigated the scaling capability of the ATLAS test data by comparison with a code analysis of the prototype, and Group C investigated the effects of various models in the one-dimensional codes. This paper briefly summarizes the code analysis results from the Group B participants in DSP-03 of the ATLAS test facility. The code analysis by Group B focuses on investigating the multi-dimensional thermal hydraulic phenomena in the ATLAS facility during the SLB transient. Even though a one-dimensional system analysis code cannot simulate the whole ATLAS facility with a nodalization of CFD (Computational Fluid Dynamics) scale, the reactor pressure vessel can be modeled with multi-dimensional components to reflect the thermal mixing phenomena inside the downcomer and the core. Also, CFD can give useful information for understanding complex phenomena in specific components such as the reactor pressure vessel. In the analysis activity of Group B in ATLAS DSP-03, participants adopted a multi-dimensional approach to the code analysis of the SLB transient in the ATLAS test facility. The main purpose of the analysis was to investigate the prediction capability of multi-dimensional analysis tools for the SLB experiment. In particular, the asymmetric cooling and thermal mixing phenomena in the reactor pressure vessel were a major focus of the multi-dimensional modeling.

  5. Shroud leakage flow models and a multi-dimensional coupling CFD (computational fluid dynamics) method for shrouded turbines

    International Nuclear Information System (INIS)

    Zou, Zhengping; Liu, Jingyuan; Zhang, Weihao; Wang, Peng

    2016-01-01

    Multi-dimensional coupling simulation is an effective approach for evaluating the flow and aero-thermal performance of shrouded turbines, which balances simulation accuracy against computing cost. In this paper, 1D leakage models are proposed based on classical jet theories and dynamics equations, which can be used to evaluate most of the main features of shroud leakage flow, including the mass flow rate, radial and circumferential momentum, temperature and the jet width. The 1D models are then expanded to 2D distributions on the interface by using a multi-dimensional scaling method. Based on the models and multi-dimensional scaling, a multi-dimensional coupling simulation method for shrouded turbines is developed, in which boundary sources and sinks are set on the interface between the shroud and the main flow passage. To verify the precision, simulations at the design point and off-design points of a 1.5 stage turbine are conducted. The results indicate that the models and methods give predictions with sufficient accuracy for most of the flow field features and will contribute to deeper understanding and better design methods for shrouded axial turbines, which are important devices in energy engineering. - Highlights: • Free and wall attached jet theories are used to model the leakage flow in shrouds. • Leakage flow rate is modeled by virtual labyrinth number and residual-energy factor. • A scaling method is applied to the 1D model to obtain 2D distributions on interfaces. • A multi-dimensional coupling CFD method for shrouded turbines is proposed. • The proposed coupling method can give accurate predictions with low computing cost.

  6. Analysis of Phenix End-of-Life asymmetry test with multi-dimensional pool modeling of MARS-LMR code

    International Nuclear Information System (INIS)

    Jeong, H.-Y.; Ha, K.-S.; Choi, C.-W.; Park, M.-G.

    2015-01-01

    Highlights: • Pool behaviors under asymmetrical conditions in an SFR were evaluated with MARS-LMR. • The Phenix End-of-Life asymmetry test was analyzed one-dimensionally and multi-dimensionally. • One-dimensional modeling has limitations in predicting the cold pool temperature. • Multi-dimensional modeling shows improved prediction of stratification and mixing. - Abstract: The understanding of complicated pool behaviors and their modeling is essential for the design and safety analysis of a pool-type Sodium-cooled Fast Reactor. One remarkable recent effort in the study of pool thermal–hydraulic behaviors is the asymmetry test performed as a part of the Phenix End-of-Life tests by the CEA. To evaluate the performance of the MARS-LMR code, a key system analysis tool for the design of an SFR in Korea, in predicting thermal hydraulic behaviors under an asymmetrical condition, the Phenix asymmetry test is analyzed with MARS-LMR in the present study. Pool regions are modeled with two different approaches, one-dimensional and multi-dimensional, and the prediction results are analyzed to assess the appropriateness of each modeling method. The prediction with one-dimensional pool modeling shows a large deviation from the measured data at the early stage of the test, which reveals its limitations in describing the complicated thermal–hydraulic phenomena. When the pool regions are modeled multi-dimensionally, the prediction improves considerably. This improvement is explained by the enhanced modeling of pool mixing in the multi-dimensional approach. On the basis of the results of the present study, it is concluded that accurate modeling of pool thermal–hydraulics is a prerequisite for the evaluation of design performance and the quantification of safety margins in future SFR developments.

  7. Efficient contrast enhancement through log-power histogram modification

    NARCIS (Netherlands)

    Wu, T.; Toet, A.

    2014-01-01

    A simple power-logarithm histogram modification operator is proposed to enhance digital image contrast. First a logarithm operator reduces the effect of spikes and transforms the image histogram into a smoothed one that approximates a uniform histogram while retaining the relative size ordering of the bins.
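
    A minimal sketch of the general log-then-power idea as described (the exponent and operator details are assumptions, not the paper's calibrated design): modify the histogram before building the equalization lookup table.

    ```python
    import numpy as np

    def log_power_enhance(img, p=1.5):
        """Contrast enhancement by mapping through the CDF of a modified
        histogram h' = log(1 + h)**p rather than the raw histogram. The
        log tames spikes while preserving bin-size ordering; p is an
        assumed tuning knob."""
        h, _ = np.histogram(img, bins=256, range=(0, 256))
        h_mod = np.log1p(h) ** p
        cdf = np.cumsum(h_mod)
        lut = np.round(255 * cdf / cdf[-1]).astype(np.uint8)   # mapping table
        return lut[img]

    rng = np.random.default_rng(3)
    img = rng.normal(90, 20, (64, 64)).clip(0, 255).astype(np.uint8)
    print(img.std(), log_power_enhance(img).std())   # contrast increases
    ```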

  8. Analytic Approximations to the Free Boundary and Multi-dimensional Problems in Financial Derivatives Pricing

    Science.gov (United States)

    Lau, Chun Sing

    This thesis studies two types of problems in financial derivatives pricing. The first type is the free boundary problem, which can be formulated as a partial differential equation (PDE) subject to a set of free boundary conditions. Although the functional form of the free boundary condition is given explicitly, the location of the free boundary is unknown and can only be determined implicitly by imposing continuity conditions on the solution. Two specific problems are studied in detail, namely the valuation of fixed-rate mortgages and CEV American options. The second type is the multi-dimensional problem, which involves multiple correlated stochastic variables and their governing PDE. One typical problem we focus on is the valuation of basket-spread options, whose underlying asset prices are driven by correlated geometric Brownian motions (GBMs). Analytic approximate solutions are derived for each of these three problems. For each of the two free boundary problems, we propose a parametric moving boundary to approximate the unknown free boundary, so that the original problem transforms into a moving boundary problem which can be solved analytically. The governing parameter of the moving boundary is determined by imposing the first derivative continuity condition on the solution. The analytic form of the solution allows the price and the hedging parameters to be computed very efficiently. When compared against the benchmark finite-difference method, the computational time is significantly reduced without compromising the accuracy. The multi-stage scheme further allows the approximate results to systematically converge to the benchmark results as one recasts the moving boundary into a piecewise smooth continuous function. For the multi-dimensional problem, we generalize the Kirk (1995) approximate two-asset spread option formula to the case of multi-asset basket-spread options. Since the final formula is in closed form, all the hedging parameters can also be derived in closed form.
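
    For reference, the two-asset Kirk (1995) approximation that the thesis generalizes fits in a few lines; the basket-spread extension itself is not reproduced here.

    ```python
    import numpy as np
    from scipy.stats import norm

    def kirk_spread_call(F1, F2, K, sigma1, sigma2, rho, T, r):
        """Kirk (1995) approximation for a European spread call paying
        max(F1 - F2 - K, 0). Inputs are forward prices; the spread option
        is priced as a Black-style exchange option on F1 / (F2 + K)."""
        b = F2 / (F2 + K)
        sig = np.sqrt(sigma1**2 - 2*rho*sigma1*sigma2*b + (sigma2*b)**2)
        d1 = (np.log(F1 / (F2 + K)) + 0.5 * sig**2 * T) / (sig * np.sqrt(T))
        d2 = d1 - sig * np.sqrt(T)
        return np.exp(-r * T) * (F1 * norm.cdf(d1) - (F2 + K) * norm.cdf(d2))

    print(kirk_spread_call(F1=110, F2=100, K=5, sigma1=0.3,
                           sigma2=0.25, rho=0.6, T=1.0, r=0.02))
    ```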

  9. Robust histogram-based image retrieval

    Czech Academy of Sciences Publication Activity Database

    Höschl, Cyril; Flusser, Jan

    2016-01-01

    Vol. 69, No. 1 (2016), pp. 72-81. ISSN 0167-8655. R&D Projects: GA ČR GA15-16928S. Institutional support: RVO:67985556. Keywords: Image retrieval * Noisy image * Histogram * Convolution * Moments * Invariants. Subject RIV: JD - Computer Applications, Robotics. Impact factor: 1.995, year: 2016. http://library.utia.cas.cz/separaty/2015/ZOI/hoschl-0452147.pdf

  10. Reducing variability in the output of pattern classifiers using histogram shaping

    International Nuclear Information System (INIS)

    Gupta, Shalini; Kan, Chih-Wen; Markey, Mia K.

    2010-01-01

    Purpose: The authors present a novel technique based on histogram shaping to reduce the variability in the output and (sensitivity, specificity) pairs of pattern classifiers with identical ROC curves, but differently distributed outputs. Methods: The authors identify different sources of variability in the output of linear pattern classifiers with identical ROC curves, which also result in classifiers with differently distributed outputs. They theoretically develop a novel technique based on the matching of the histograms of these differently distributed pattern classifier outputs to reduce the variability in their (sensitivity, specificity) pairs at fixed decision thresholds, and to reduce the variability in their actual output values. They empirically demonstrate the efficacy of the proposed technique by means of analyses on the simulated data and real world mammography data. Results: For the simulated data, with three different known sources of variability, and for the real world mammography data with unknown sources of variability, the proposed classifier output calibration technique significantly reduced the variability in the classifiers' (sensitivity, specificity) pairs at fixed decision thresholds. Furthermore, for classifiers with monotonically or approximately monotonically related output variables, the histogram shaping technique also significantly reduced the variability in their actual output values. Conclusions: Classifier output calibration based on histogram shaping can be successfully employed to reduce the variability in the output values and (sensitivity, specificity) pairs of pattern classifiers with identical ROC curves, but differently distributed outputs.
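
    One common way to realize such histogram shaping is quantile (CDF) matching of one classifier's outputs to a reference distribution; being monotone, the mapping leaves the ROC curve unchanged while making fixed decision thresholds comparable. A sketch of this flavor of the idea; the authors' exact shaping procedure may differ:

    ```python
    import numpy as np

    def match_output_histogram(source, reference):
        """Map 'source' classifier outputs so their distribution matches
        'reference' via quantile matching. The transform is monotone, so
        (sensitivity, specificity) pairs at fixed thresholds become
        comparable across classifiers without altering the ROC curve."""
        ranks = np.searchsorted(np.sort(source), source, side="right") / len(source)
        return np.quantile(reference, ranks.clip(0, 1))

    rng = np.random.default_rng(7)
    a = rng.beta(2, 5, 1000)          # classifier A outputs
    b = rng.normal(0.6, 0.1, 1000)    # classifier B outputs, different scale
    a_cal = match_output_histogram(a, b)
    print(np.round([np.mean(a), np.mean(a_cal), np.mean(b)], 3))
    ```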

  11. Unstable Periodic Orbit Analysis of Histograms of Chaotic Time Series

    International Nuclear Information System (INIS)

    Zoldi, S.M.

    1998-01-01

    Using the Lorenz equations, we have investigated whether unstable periodic orbits (UPOs) associated with a strange attractor may predict the occurrence of the robust sharp peaks in histograms of some experimental chaotic time series. Histograms with sharp peaks occur for the Lorenz parameter value r=60.0 but not for r=28.0 , and the sharp peaks for r=60.0 do not correspond to a histogram derived from any single UPO. However, we show that histograms derived from the time series of a non-Axiom-A chaotic system can be accurately predicted by an escape-time weighting of UPO histograms. copyright 1998 The American Physical Society

  12. Software Defined Networking (SDN) controlled all optical switching networks with multi-dimensional switching architecture

    Science.gov (United States)

    Zhao, Yongli; Ji, Yuefeng; Zhang, Jie; Li, Hui; Xiong, Qianjin; Qiu, Shaofeng

    2014-08-01

    Ultrahigh throughout capacity requirement is challenging the current optical switching nodes with the fast development of data center networks. Pbit/s level all optical switching networks need to be deployed soon, which will cause the high complexity of node architecture. How to control the future network and node equipment together will become a new problem. An enhanced Software Defined Networking (eSDN) control architecture is proposed in the paper, which consists of Provider NOX (P-NOX) and Node NOX (N-NOX). With the cooperation of P-NOX and N-NOX, the flexible control of the entire network can be achieved. All optical switching network testbed has been experimentally demonstrated with efficient control of enhanced Software Defined Networking (eSDN). Pbit/s level all optical switching nodes in the testbed are implemented based on multi-dimensional switching architecture, i.e. multi-level and multi-planar. Due to the space and cost limitation, each optical switching node is only equipped with four input line boxes and four output line boxes respectively. Experimental results are given to verify the performance of our proposed control and switching architecture.

  13. Effect of a Multi-Dimensional Intervention Programme on the Motivation of Physical Education Students

    Science.gov (United States)

    Amado, Diana; Del Villar, Fernando; Leo, Francisco Miguel; Sánchez-Oliva, David; Sánchez-Miguel, Pedro Antonio; García-Calvo, Tomás

    2014-01-01

    This research study purports to verify the effect produced on the motivation of physical education students of a multi-dimensional programme in dance teaching sessions. This programme incorporates the application of teaching skills directed towards supporting the needs of autonomy, competence and relatedness. A quasi-experimental design was carried out with two natural groups of 4th year Secondary Education students - control and experimental -, delivering 12 dance teaching sessions. A prior training programme was carried out with the teacher in the experimental group to support these needs. An initial and final measurement was taken in both groups and the results revealed that the students from the experimental group showed an increase of the perception of autonomy and, in general, of the level of self-determination towards the curricular content of corporal expression focused on dance in physical education. To this end, we highlight the programme's usefulness in increasing the students' motivation towards this content, which is so complicated for teachers of this area to develop. PMID:24454831

  14. Wind Farm Power Forecasting for Less Than an Hour Using Multi Dimensional Models

    DEFF Research Database (Denmark)

    Knudsen, Torben; Bak, Thomas; Jensen, Tom Nørgaard

    2018-01-01

    The paper focuses on prediction of wind farm power for horizons of 0-10 minutes and not more than one hour using statistical methods. These short term predictions are relevant for transmission system operators, wind farm operators and traders. Previous research indicates that for short time horizons the persistence method performs as well as more complex methods. However, these results are based on accumulated power for an entire wind farm. The contribution of this paper is to develop multi-dimensional linear methods based on measurements of power or wind speed from individual wind turbines, which reduce the prediction error variance estimate compared to the persistence method. We also present convincing examples showing that the predictions follow the wind farm power over a window of an hour.
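
    A minimal sketch of the multi-dimensional linear idea, with simulated per-turbine power standing in for real measurements: regress future farm power on the current power of each individual turbine, with persistence as the baseline.

    ```python
    import numpy as np

    def fit_multidim_predictor(turbine_power, horizon):
        """Least-squares linear predictor of total farm power 'horizon'
        steps ahead from the current power of each individual turbine."""
        X = turbine_power[:-horizon]                  # (T - h, n_turbines)
        y = turbine_power[horizon:].sum(axis=1)       # future farm power
        coef, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], y, rcond=None)
        return coef

    rng = np.random.default_rng(11)
    base = np.cumsum(rng.normal(0, 0.1, 2000))        # shared wind resource
    pw = np.clip(base[:, None] + rng.normal(0, 0.3, (2000, 5)), 0, None)
    coef = fit_multidim_predictor(pw, horizon=10)
    pred = np.c_[pw[:-10], np.ones(1990)] @ coef
    persist = pw[:-10].sum(axis=1)                    # persistence baseline
    actual = pw[10:].sum(axis=1)
    print("model RMSE:", np.sqrt(np.mean((pred - actual)**2)).round(3),
          "persistence RMSE:", np.sqrt(np.mean((persist - actual)**2)).round(3))
    ```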

  15. Stochastic volatility and multi-dimensional modeling in the European energy market

    Energy Technology Data Exchange (ETDEWEB)

    Vos, Linda

    2012-07-01

    In energy prices there is evidence of stochastic volatility. Stochastic volatility affects the price of path-dependent options and therefore has to be modeled properly. We introduce a multi-dimensional non-Gaussian stochastic volatility model with leverage which can be used in energy pricing. It captures special features of energy prices like price spikes, mean-reversion, stochastic volatility and inverse leverage. Moreover it allows modeling dependencies between different commodities. The derived forward price dynamics based on this multi-variate spot price model provides a very flexible structure, including contango, backwardation and hump-shaped forward curves. Alternatively, energy prices can be modeled by a 2-factor model consisting of a non-Gaussian stable CARMA process and a non-stationary trend modeled by a Lévy process. This model is also able to capture special features like price spikes, mean reversion and the low-frequency dynamics in the market. A robust L1-filter is introduced to filter out the states of the CARMA process. When applied to German electricity EEX exchange data, an overall negative risk premium is found; however, close to delivery a positive risk premium is observed. (Author)

  16. A Simple Free Surface Tracking Model for Multi-dimensional Two-Fluid Approaches

    International Nuclear Information System (INIS)

    Lee, Seungjun; Yoon, Han Young

    2014-01-01

    Developments in two-phase experiments devoted to finding unknown phenomenological relationships have refined conventional flow pattern maps into sophisticated ones and even extended them to multi-dimensional use. However, for a system including a large void fraction gradient, such as a pool with a free surface, the flow pattern varies spatially across a small number of cells, which sometimes results in unstable and unrealistic predictions of the flow at cells with large void fraction gradients. The numerical stability problem arising from the free surface is thus of major interest in analyses of passive cooling pools that convect decay heat naturally, which have recently become a design feature for increasing the safety level of nuclear reactors. In this research, a new and simple free surface tracking method combined with a simplified topology map is presented. The method modifies the interfacial drag coefficient only for the cells identified as the free surface. Its performance is shown by comparing natural convection analyses of a small-scale pool under single- and two-phase conditions. A simple free surface tracking model with a simplified topology map is developed.

  17. Application of neural network to multi-dimensional design window search in reactor core design

    International Nuclear Information System (INIS)

    Kugo, Teruhiko; Nakagawa, Masayuki

    1999-01-01

    In reactor core design, many parametric survey calculations must be carried out to decide an optimal set of basic design parameter values; in the conventional approach these consume a large amount of computation time and labor. To support design work, we investigate a procedure to efficiently search for a design window, defined as the ranges of feasible design parameters satisfying design criteria and requirements, in a multi-dimensional space composed of several basic design parameters. The present method is applied to the neutronics and thermal hydraulics fields. Its principle is to construct a multilayer neural network that quickly simulates the response of an analysis code through a training process, and then to reduce computation time by using the neural network in place of parametric studies with the analysis codes. To verify the applicability of the present method to neutronics and thermal hydraulics design, we have applied it to high conversion water reactors and examined the effects of the structure of the neural network and the number of teaching patterns on the accuracy of the design window estimated by the neural network. From the results of these applications, a guideline for applying the present method is proposed; following the guideline, the method can predict an appropriate design window in a reasonable computation time. (author)
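
    A minimal sketch of the surrogate idea with a made-up code response and criterion (the paper's network structure, training patterns and design criteria are specific to the reactor application): train a small network on a limited set of code runs, then sweep a dense parameter grid cheaply.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    def code_response(x):              # hypothetical expensive analysis code
        return 1.2 - 0.8 * x[:, 0] ** 2 + 0.3 * x[:, 1]

    rng = np.random.default_rng(5)
    X_train = rng.uniform(-1, 1, (200, 2))          # training patterns
    mlp = MLPRegressor((32, 32), max_iter=5000, random_state=0)
    mlp.fit(X_train, code_response(X_train))        # train the surrogate

    # Sweep a dense 2D parameter grid with the cheap surrogate and keep
    # the points satisfying the (assumed) design criterion.
    g = np.stack(np.meshgrid(np.linspace(-1, 1, 101),
                             np.linspace(-1, 1, 101)), -1).reshape(-1, 2)
    window = g[mlp.predict(g) >= 1.0]
    print(f"{len(window)} of {len(g)} grid points inside the design window")
    ```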

  18. Psychometric evaluation of a multi-dimensional measure of satisfaction with behavioral interventions.

    Science.gov (United States)

    Sidani, Souraya; Epstein, Dana R; Fox, Mary

    2017-10-01

    Treatment satisfaction is recognized as an essential aspect in the evaluation of an intervention's effectiveness, but there is no measure that provides for its comprehensive assessment with regard to behavioral interventions. Informed by a conceptualization generated from a literature review, we developed a measure that covers several domains of satisfaction with behavioral interventions. In this paper, we briefly review its conceptualization and describe the Multi-Dimensional Treatment Satisfaction Measure (MDTSM) subscales. Satisfaction refers to the appraisal of the treatment's process and outcome attributes. The MDTSM has 11 subscales assessing treatment process and outcome attributes: treatment components' suitability and utility, attitude toward treatment, desire for continued treatment use, therapist competence and interpersonal style, format and dose, perceived benefits of the health problem and everyday functioning, discomfort, and attribution of outcomes to treatment. The MDTSM was completed by persons (N = 213) in the intervention group in a large trial of a multi-component behavioral intervention for insomnia within 1 week following treatment completion. The MDTSM's subscales demonstrated internal consistency reliability (α: .65 - .93) and validity (correlated with self-reported adherence and perceived insomnia severity at post-test). The MDTSM subscales can be used to assess satisfaction with behavioral interventions and point to aspects of treatments that are viewed favorably or unfavorably. © 2017 Wiley Periodicals, Inc.

  19. Development of a multi-dimensional measure of resilience in adolescents: the Adolescent Resilience Questionnaire

    Directory of Open Access Journals (Sweden)

    Buzwell Simone

    2011-10-01

    Background: The concept of resilience has captured the imagination of researchers and policy makers over the past two decades. However, despite the ever growing body of resilience research, there is a paucity of relevant, comprehensive measurement tools. In this article, the development of a theoretically based, comprehensive multi-dimensional measure of resilience in adolescents is described. Methods: Extensive literature review and focus groups with young people living with chronic illness informed the conceptual development of scales and items. Two sequential rounds of factor and scale analyses were undertaken to revise the conceptually developed scales using data collected from young people living with a chronic illness and a general population sample. Results: The revised Adolescent Resilience Questionnaire comprises 93 items and 12 scales measuring resilience factors in the domains of self, family, peer, school and community. All scales have acceptable alpha coefficients. Revised scales closely reflect conceptually developed scales. Conclusions: It is proposed that, with further psychometric testing, this new measure of resilience will provide researchers and clinicians with a comprehensive and developmentally appropriate instrument to measure a young person's capacity to achieve positive outcomes despite life stressors.

  20. Fluorescence Intrinsic Characterization of Excitation-Emission Matrix Using Multi-Dimensional Ensemble Empirical Mode Decomposition

    Directory of Open Access Journals (Sweden)

    Tzu-Chien Hsiao

    2013-11-01

    Excitation-emission matrix (EEM) fluorescence spectroscopy is a noninvasive method for tissue diagnosis and has become important in clinical use. However, the intrinsic characterization of EEM fluorescence remains unclear. Photobleaching and the complexity of the chemical compounds make it difficult to distinguish individual compounds due to overlapping features. Conventional studies use principal component analysis (PCA) for EEM fluorescence analysis, and the relationship between the EEM features extracted by PCA and diseases has been examined. The spectral features of different tissue constituents are not fully separable or clearly defined. Recently, a non-stationary method called multi-dimensional ensemble empirical mode decomposition (MEEMD) was introduced; this method can extract the intrinsic oscillations on multiple spatial scales without loss of information. The aim of this study was to propose a fluorescence spectroscopy system for EEM measurements and to describe a method for extracting the intrinsic characteristics of EEM by MEEMD. The results indicate that, although PCA provides the principal factor for the spectral features associated with chemical compounds, MEEMD can provide additional intrinsic features with more reliable mapping of the chemical compounds. MEEMD has the potential to extract intrinsic fluorescence features and improve the detection of biochemical changes.

  2. Spectral analysis of multi-dimensional self-similar Markov processes

    International Nuclear Information System (INIS)

    Modarresi, N; Rezakhah, S

    2010-01-01

    In this paper we consider a discrete scale invariant (DSI) process {X(t), t in R+} with scale l > 1. We consider a fixed number of observations in every scale, say T, and acquire our samples at discrete points α^k, k in W, where α is obtained by the equality l = α^T and W = {0, 1, ...}. We thus provide a discrete time scale invariant (DT-SI) process X(.) with the parameter space {α^k, k in W}. We find the spectral representation of the covariance function of such a DT-SI process. By providing the harmonic-like representation of multi-dimensional self-similar processes, their spectral density functions are presented. We assume that the process {X(t), t in R+} is also Markov in the wide sense and provide a discrete time scale invariant Markov (DT-SIM) process with the above scheme of sampling. We present an example of the DT-SIM process, simple Brownian motion, by the above sampling scheme and verify our results. Finally, we find the spectral density matrix of such a DT-SIM process and show that its associated T-dimensional self-similar Markov process is fully specified by {R_j^H(1), R_j^H(0), j = 0, 1, ..., T - 1}, where R_j^H(τ) is the covariance function of the jth and (j + τ)th observations of the process.

  3. Measurement of multi-dimensional flow structure for flow boiling in a tube

    International Nuclear Information System (INIS)

    Adachi, Yu; Ito, Daisuke; Saito, Yasushi

    2014-01-01

    With the aim of measuring the multi-dimensional flow structure of in-tube boiling two-phase flow, the authors built their own wire mesh measurement system based on electrical conductivity measurement and examined the relationship between the electrical conductivity obtained by the wire mesh sensor and the void fraction. In addition, the authors measured the void fraction using neutron radiography and compared the result with the values measured by the wire mesh sensor. The comparison showed that, similarly to previous results, the new method underestimates the void fraction in flows with void fractions of about 0.2-0.5. Moreover, since the wire mesh sensor cannot measure dispersed droplets, it tends to overestimate the void fraction in the high void fraction region, such as churn flow accompanied by droplet generation. In the electrical-conductivity wire mesh sensor method, it is therefore necessary to correctly take into account the effect of liquid films and droplets. The authors also built a measurement system based on the capacitance wire mesh sensor method, which exploits differences in dielectric constant, confirmed the transmission and reception signals using deionized water as a medium, and demonstrated the validity of the system. The capacitance method has the potential to measure dispersed droplets as well. (A.O.)

  4. Operationalising the Sustainable Knowledge Society Concept through a Multi-dimensional Scorecard

    Science.gov (United States)

    Dragomirescu, Horatiu; Sharma, Ravi S.

    Since the early 21st Century, building a Knowledge Society represents an aspiration not only for the developed countries, but for the developing ones too. There is an increasing concern worldwide for rendering this process manageable towards a sustainable, equitable and ethically sound societal system. As proper management, including at the societal level, requires both wisdom and measurement, the operationalisation of the Knowledge Society concept encompasses a qualitative side, related to vision-building, and a quantitative one, pertaining to designing and using dedicated metrics. The endeavour of enabling policy-makers to map, steer and monitor the sustainable development of the Knowledge Society at national level, in a world increasingly based on creativity, learning and open communication, has led researchers to devise a wide range of composite indexes. However, as such indexes are generated through weighting and aggregation, their usefulness is limited to retrospectively assessing and comparing levels and states already attained; therefore, to better serve policy-making purposes, composite indexes should be complemented by other instruments. Complexification, inspired by the systemic paradigm, makes it possible to obtain "rich pictures" of the Knowledge Society; to this end, a multi-dimensional scorecard of Knowledge Society development is hereby suggested, which seeks a more contextual orientation towards sustainability. It is assumed that, in the case of the Knowledge Society, the sustainability condition goes well beyond the "greening" desideratum and should be of a higher order, relying upon the conversion of natural and productive life-cycles into virtuous circles of self-sustainability.

  5. Dynameomics: a multi-dimensional analysis-optimized database for dynamic protein data.

    Science.gov (United States)

    Kehl, Catherine; Simms, Andrew M; Toofanny, Rudesh D; Daggett, Valerie

    2008-06-01

    The Dynameomics project is our effort to characterize the native-state dynamics and folding/unfolding pathways of representatives of all known protein folds by way of molecular dynamics simulations, as described by Beck et al. (in Protein Eng. Des. Select., the first paper in this series). The data produced by these simulations are highly multidimensional in structure and multi-terabytes in size. Both of these features present significant challenges for storage, retrieval and analysis. For optimal data modeling and flexibility, we needed a platform that supported both multidimensional indices and hierarchical relationships between related types of data and that could be integrated within our data warehouse, as described in the accompanying paper directly preceding this one. For these reasons, we have chosen On-line Analytical Processing (OLAP), a multi-dimensional analysis optimized database, as an analytical platform for these data. OLAP is a mature technology in the financial sector, but it has not been used extensively for scientific analysis. Our project is furthermore unusual for its focus on the multidimensional and analytical capabilities of OLAP rather than its aggregation capacities. The dimensional data model and hierarchies are very flexible. The query language is concise for complex analysis and rapid data retrieval. OLAP shows great promise for dynamic protein analysis in bioengineering and biomedical applications. In addition, OLAP may have similar potential for other scientific and engineering applications involving large and complex datasets.

  6. Moving toward multi-dimensional radiotherapy and the role of radiobiology

    International Nuclear Information System (INIS)

    Oita, Masataka; Uto, Yoshihiro; Aoyama, Hideki

    2014-01-01

    Recent radiotherapy for cancer treatment enables high-precision irradiation of the target under computed image guidance. Developments in such radiotherapy have played a large role in improved cancer treatment strategies. In addition, molecular mechanistic studies of cancer cell proliferation contribute to the multidisciplinary field of clinical radiotherapy. The combination of image guidance and molecular targeting of cancer cells therefore makes individualized cancer treatment possible. In particular, the use of particle beams and boron neutron capture therapy (BNCT) has attracted attention, and installations of such devices are widely planned. With the progress of, and collaboration between, radiation biology and engineering physics, the establishment of a new style of radiotherapy is becoming possible in the post-genome era. In the 2010s, high-tech machines enabling spatiotemporally controlled radiotherapy came into practice. Although much remains to be improved, e.g., more precise prediction of the radiosensitivity and growth of individual tumors and of adverse outcomes after treatment, multi-dimensional optimization of individualized irradiation based on molecular radiation biology and medical physics is important for the further development of radiotherapy. (author)

  7. Analysis of UPTF downcomer tests with the Cathare multi-dimensional model

    International Nuclear Information System (INIS)

    Dor, I.

    1993-01-01

    This paper presents the analysis and the modelling - with the system code CATHARE - of UPTF downcomer refill tests simulating the refill phase of a large break LOCA. The modelling approach in a system code is discussed. First, the reasons why available flooding correlations are difficult to use in a system code in this particular case are developed. Then the use of a 1-D modelling of the downcomer with specific closure relations for the annular geometry is examined. But UPTF 1:1 scale tests and CREARE reduced scale tests point out some weaknesses of this modelling due to the particular multi-dimensional nature of the flow in the upper part of the downcomer. Thus a 2-D model is elaborated and implemented into the CATHARE version 1.3e code. The assessment of the model is based on UPTF 1:1 scale tests (saturated and subcooled conditions). Discretization and meshing influences are investigated. On the basis of saturated tests, a new discretization is proposed for different terms of the momentum balance equations (interfacial friction, momentum transport terms), which results in a significant improvement. Sensitivity studies performed on subcooled tests show that the water downflow predictions are improved by increasing the condensation in the downcomer. (author). 8 figs., 5 tabs., 9 refs., 2 appendices

  8. Multi-dimensional self-esteem and substance use among Chinese adolescents.

    Science.gov (United States)

    Wu, Cynthia S T; Wong, Ho Ting; Shek, Carmen H M; Loke, Alice Yuen

    2014-10-01

    Substance use among adolescents has caused worldwide public health concern in recent years. Overseas studies have demonstrated an association between adolescent self-esteem and substance use, but studies within a Chinese context are limited. A study was therefore initiated to: (1) explore the 30-day prevalence of substance use (smoking, drinking, and drugs) among male and female adolescents in Hong Kong; (2) identify the significant associations between multidimensional self-esteem and gender; and (3) examine the relationship between multi-dimensional self-esteem and substance use. A self-esteem scale and the Chinese version of the global school-based student health survey were adopted. A total of 1,223 students were recruited from two mixed-gender schools and one boys' school. Among females, there was a lower 30-day prevalence of cigarette, alcohol, and drug use. They also had significantly higher peer and family self-esteem but lower sport-related self-esteem. Body image self-esteem was a predictor of alcohol use among females, while peer and school self-esteem were predictors of drug use among males. In summary, the findings demonstrated the influence of self-esteem on the overall well-being of adolescents. Schools could play a role in promoting physical fitness and positive relationships between adolescents and their peers, family, and schools to fulfill their physical and psychological self-esteem needs.

  9. The knock study of methanol fuel based on multi-dimensional simulation analysis

    International Nuclear Information System (INIS)

    Zhen, Xudong; Liu, Daming; Wang, Yang

    2017-01-01

    Methanol is an alternative fuel and is considered one of the most favorable fuels for engines. In this study, knocking combustion in a developed ORCEM (optical rapid compression and expansion machine) is studied by multi-dimensional simulation analysis. A large-eddy simulation (LES) model coupled with methanol chemical reaction kinetics (21 species and 84 elementary reactions) is adopted to study knocking combustion. The results showed that end-gas auto-ignition first occurred near the chamber wall because of the higher temperature and pressure there. The H2O2 species can serve as a good flame front indicator. OH radicals played the major role during knocking combustion, whereas HCO radicals were generated in such small amounts that their concentration can almost be ignored. The mean reaction intensities of CH2O, OH, H2O2 and CO were higher than those of other species during knocking combustion. Finally, this paper puts forward some new suggestions regarding weaknesses in knocking combustion research on methanol fuel. - Highlights: • Knocking combustion of methanol was studied in a developed ORCEM. • LES coupled with detailed chemical kinetics was adopted for the simulation study. • End-gas auto-ignition first occurred near the chamber wall. • The OH radical was the predominant species during knocking combustion. • The H2O2 species can serve as a good flame front indicator.

  10. Chi-square tests for comparing weighted histograms

    International Nuclear Information System (INIS)

    Gagunashvili, N.D.

    2010-01-01

    Weighted histograms in Monte Carlo simulations are often used for the estimation of probability density functions. They are obtained as a result of random experiments with random events that have weights. In this paper, the bin contents of a weighted histogram are considered as a sum of random variables with a random number of terms. Generalizations of the classical chi-square test for comparing weighted histograms are proposed. Numerical examples illustrate an application of the tests for the histograms with different statistics of events and different weighted functions. The proposed tests can be used for the comparison of experimental data histograms with simulated data histograms as well as for the two simulated data histograms.
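
    A simplified normal-approximation statistic in the spirit of these tests treats each bin content as a sum of weights whose variance is estimated by the sum of squared weights; the paper's generalizations handle low-statistics bins more carefully. Sketch for two weighted histograms with identical expected normalization:

    ```python
    import numpy as np
    from scipy.stats import chi2

    def weighted_hist_chi2(w1, s1sq, w2, s2sq):
        """Chi-square comparison of two weighted histograms: w = per-bin
        sums of weights, s2 = per-bin sums of squared weights (variance
        estimates). dof = number of bins for this unnormalized comparison."""
        stat = np.sum((w1 - w2) ** 2 / (s1sq + s2sq))
        return stat, chi2.sf(stat, len(w1))     # statistic and p-value

    rng = np.random.default_rng(13)
    x1, wt1 = rng.exponential(1, 5000), rng.uniform(0.5, 1.5, 5000)
    x2, wt2 = rng.exponential(1, 5000), rng.uniform(0.5, 1.5, 5000)
    edges = np.linspace(0, 5, 11)
    w1, _ = np.histogram(x1, edges, weights=wt1)
    s1, _ = np.histogram(x1, edges, weights=wt1**2)
    w2, _ = np.histogram(x2, edges, weights=wt2)
    s2, _ = np.histogram(x2, edges, weights=wt2**2)
    print(weighted_hist_chi2(w1, s1, w2, s2))
    ```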

  11. Visual Contrast Enhancement Algorithm Based on Histogram Equalization

    Directory of Open Access Journals (Sweden)

    Chih-Chung Ting

    2015-07-01

    Image enhancement techniques primarily improve the contrast of an image to lend it a better appearance. One of the popular enhancement methods is histogram equalization (HE) because of its simplicity and effectiveness. However, it is rarely applied to consumer electronics products because it can cause excessive contrast enhancement and feature loss problems. These problems make the images processed by HE look unnatural and introduce unwanted artifacts in them. In this study, a visual contrast enhancement algorithm (VCEA) based on HE is proposed. VCEA considers the requirements of human visual perception in order to address the drawbacks of HE. It effectively solves the excessive contrast enhancement problem by adjusting the spaces between two adjacent gray values of the HE histogram. In addition, VCEA reduces the effects of the feature loss problem by using the obtained spaces. Furthermore, VCEA enhances the detailed textures of an image to generate an enhanced image with better visual quality. Experimental results show that images obtained by applying VCEA have higher contrast and are more suited to human visual perception than those processed by HE and other HE-based methods.
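
    For orientation, the following is a minimal sketch of plain global HE, the baseline that VCEA modifies; VCEA's contribution (adjusting the spacing between adjacent output gray levels) is not reproduced here.

```python
import numpy as np

def histogram_equalization(img):
    """Classic global HE for an 8-bit grayscale image: map each gray
    level through the normalized cumulative histogram."""
    hist = np.bincount(img.ravel(), minlength=256)     # gray-level counts
    cdf = np.cumsum(hist).astype(float)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())  # normalize to [0, 1]
    lut = np.round(255 * cdf).astype(np.uint8)         # gray-level mapping
    return lut[img]

# usage: equalized = histogram_equalization(gray_image_uint8)
```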

  12. Visual Contrast Enhancement Algorithm Based on Histogram Equalization

    Science.gov (United States)

    Ting, Chih-Chung; Wu, Bing-Fei; Chung, Meng-Liang; Chiu, Chung-Cheng; Wu, Ya-Ching

    2015-01-01

    Image enhancement techniques primarily improve the contrast of an image to lend it a better appearance. One of the popular enhancement methods is histogram equalization (HE) because of its simplicity and effectiveness. However, it is rarely applied to consumer electronics products because it can cause excessive contrast enhancement and feature loss problems. These problems make the images processed by HE look unnatural and introduce unwanted artifacts in them. In this study, a visual contrast enhancement algorithm (VCEA) based on HE is proposed. VCEA considers the requirements of the human visual perception in order to address the drawbacks of HE. It effectively solves the excessive contrast enhancement problem by adjusting the spaces between two adjacent gray values of the HE histogram. In addition, VCEA reduces the effects of the feature loss problem by using the obtained spaces. Furthermore, VCEA enhances the detailed textures of an image to generate an enhanced image with better visual quality. Experimental results show that images obtained by applying VCEA have higher contrast and are more suited to human visual perception than those processed by HE and other HE-based methods. PMID:26184219

  13. Accelerated weight histogram method for exploring free energy landscapes

    Energy Technology Data Exchange (ETDEWEB)

    Lindahl, V.; Lidmar, J.; Hess, B. [Department of Theoretical Physics and Swedish e-Science Research Center, KTH Royal Institute of Technology, 10691 Stockholm (Sweden)

    2014-07-28

    Calculating free energies is an important and notoriously difficult task for molecular simulations. The rapid increase in computational power has made it possible to probe increasingly complex systems, yet extracting accurate free energies from these simulations remains a major challenge. Fully exploring the free energy landscape of, say, a biological macromolecule typically requires sampling large conformational changes and slow transitions. Often, the only feasible way to study such a system is to simulate it using an enhanced sampling method. The accelerated weight histogram (AWH) method is a new, efficient extended ensemble sampling technique which adaptively biases the simulation to promote exploration of the free energy landscape. The AWH method uses a probability weight histogram which allows for efficient free energy updates and results in an easy discretization procedure. A major advantage of the method is its general formulation, making it a powerful platform for developing further extensions and analyzing its relation to already existing methods. Here, we demonstrate its efficiency and general applicability by calculating the potential of mean force along a reaction coordinate for both a single dimension and multiple dimensions. We make use of a non-uniform, free energy dependent target distribution in reaction coordinate space so that computational efforts are not wasted on physically irrelevant regions. We present numerical results for molecular dynamics simulations of lithium acetate in solution and chignolin, a 10-residue long peptide that folds into a β-hairpin. We further present practical guidelines for setting up and running an AWH simulation.
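
    The core adaptive-bias idea can be illustrated with a schematic, Wang-Landau-flavored sketch on a toy 1-D free energy profile: the bias of the visited bin is repeatedly penalized until the histogram of visits is flat, at which point the bias tracks the free energy up to a constant. This is only a caricature of AWH; the actual method works with probability weight histograms and free-energy-dependent target distributions.

```python
import numpy as np

rng = np.random.default_rng(1)

# toy 1-D free energy profile (units of kT) over 50 bins: a double well
x = np.linspace(0.0, 1.0, 50)
F = 2000.0 * ((x - 0.25) * (x - 0.75)) ** 2

bias = np.zeros_like(F)   # adaptive bias; converges toward F (+ constant)
gamma = 1.0               # update size, halved whenever the histogram is flat
i = 25                    # current bin

for stage in range(10):
    hist = np.zeros_like(F)
    for _ in range(100_000):
        j = min(max(i + rng.choice((-1, 1)), 0), 49)
        # Metropolis step on the biased profile F - bias
        if rng.random() < np.exp((F[i] - bias[i]) - (F[j] - bias[j])):
            i = j
        bias[i] -= gamma  # penalize the visited bin to flatten the sampling
        hist[i] += 1
    if hist.min() > 0.8 * hist.mean():  # flat enough -> refine the update
        gamma *= 0.5

# the recovered profile should match F up to an additive constant
print(np.abs((bias - bias.min()) - (F - F.min())).max())
```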

  14. Improved Steganographic Method Preserving Pixel-Value Differencing Histogram with Modulus Function

    Directory of Open Access Journals (Sweden)

    Lee Hae-Yeoun

    2010-01-01

    We herein advance a secure steganographic algorithm that uses a turnover policy and a novel adjusting process. Although the method of Wang et al., which uses Pixel-Value Differencing (PVD) with a modulus function, provides high capacity and good image quality, the embedding process causes a number of artifacts, such as abnormal increases and fluctuations in the PVD histogram, which may reveal the existence of the hidden message. In order to enhance the security of the algorithm, a turnover policy is used that prevents abnormal increases in the histogram values, and a novel adjusting process is devised to remove the fluctuations at the border of the subrange in the PVD histogram. The proposed method therefore eliminates all the weaknesses of the PVD steganographic methods thus far proposed and guarantees secure communication. In the experiments described herein, the proposed algorithm is compared with other PVD steganographic algorithms by using well-known steganalysis techniques, such as RS-analysis, steganalysis for LSB matching, and histogram-based attacks. The results support our contention that the proposed method enhances security by keeping the PVD histogram similar to the cover, while also providing high embedding capacity and good imperceptibility to the naked eye.
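
    For context, here is a minimal sketch of the basic PVD embedding step, i.e., the scheme whose histogram artifacts the paper repairs. The range table is one common choice, and boundary fall-off checks, the turnover policy, and the adjusting process are all omitted.

```python
import math

# quantization ranges for the absolute pixel difference (a common choice)
RANGES = [(0, 7), (8, 15), (16, 31), (32, 63), (64, 127), (128, 255)]

def pvd_embed_pair(p1, p2, bits):
    """Embed a bit string (e.g. '101') into one pixel pair using basic
    pixel-value differencing. Returns the modified pair."""
    d = abs(p2 - p1)
    lo, hi = next(r for r in RANGES if r[0] <= d <= r[1])
    t = int(math.log2(hi - lo + 1))        # capacity of this range, in bits
    b = int(bits[:t].ljust(t, '0'), 2)     # payload chunk for this pair
    d_new = lo + b                         # new difference encodes the bits
    m = d_new - d
    # split the change between the two pixels, preserving their order
    if p2 >= p1:
        return p1 - m // 2, p2 + (m - m // 2)
    return p1 + (m - m // 2), p2 - m // 2

print(pvd_embed_pair(100, 110, '1011'))    # d=10 -> range (8,15), 3 bits
```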

  16. Flat-histogram methods in quantum Monte Carlo simulations: Application to the t-J model

    International Nuclear Information System (INIS)

    Diamantis, Nikolaos G.; Manousakis, Efstratios

    2016-01-01

    We show that flat-histogram techniques can be applied to the sampling of quantum Monte Carlo simulations in order to improve the statistical quality of the results at long imaginary time or low excitation energy. Typical imaginary-time correlation functions calculated in quantum Monte Carlo are subject to exponentially growing errors as the range of imaginary time grows, and this smears the information on the low-energy excitations. We show that we can extract the low-energy physics by modifying the Monte Carlo sampling technique to one in which configurations that contribute to making the histogram of certain quantities flat are promoted. We apply the diagrammatic Monte Carlo (diag-MC) method to the motion of a single hole in the t-J model and we show that the implementation of flat-histogram techniques allows us to calculate the Green's function over a wide range of imaginary time. In addition, we show that applying the flat-histogram technique alleviates the “sign” problem associated with the simulation of the single-hole Green's function at long imaginary time. (paper)

  17. A Distributed Multi-dimensional SOLAP Model of Remote Sensing Data and Its Application in Drought Analysis

    Directory of Open Access Journals (Sweden)

    LI Jiyuan

    2014-06-01

    SOLAP (Spatial On-Line Analytical Processing) has recently been applied to the multi-dimensional analysis of remote sensing data. However, its computational performance faces a considerable challenge from large-scale datasets. A geo-raster cube model extended by Map-Reduce is proposed, which brings Map-Reduce (a data-intensive computing paradigm) into the OLAP field. In this model, the existing methods are modified to fit a distributed environment based on multi-level raster tiles. Multi-dimensional map algebra is then introduced to decompose the SOLAP computation into multiple distributed parallel map-algebra functions on tiles under the support of Map-Reduce. Drought monitoring by remote sensing data is employed as a case study to illustrate the model construction and application. A prototype is also implemented, and performance testing shows the efficiency and scalability of this model.
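
    The decomposition idea can be sketched with an ordinary map/reduce pair in Python: each raster tile contributes a local partial aggregate, and the reduce step combines them into the global measure. The measure (a simple region mean) and the tile sizes here are hypothetical stand-ins.

```python
from functools import reduce
import numpy as np

def map_tile(tile):
    """Map step: a local map-algebra aggregate on one raster tile
    (here, the partial sums needed for a region-wide mean)."""
    valid = ~np.isnan(tile)                  # mask out no-data cells
    return float(tile[valid].sum()), int(valid.sum())

def reduce_partials(a, b):
    """Reduce step: combine partial aggregates from two tiles."""
    return a[0] + b[0], a[1] + b[1]

rng = np.random.default_rng(0)
tiles = [rng.uniform(0.0, 1.0, (256, 256)) for _ in range(8)]  # toy tiles
total, count = reduce(reduce_partials, map(map_tile, tiles))
print(f"region-wide mean index: {total / count:.4f}")
```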

  18. optBINS: Optimal Binning for histograms

    Science.gov (United States)

    Knuth, Kevin H.

    2018-03-01

    optBINS (optimal binning) determines the optimal number of bins in a uniform bin-width histogram by deriving the posterior probability for the number of bins in a piecewise-constant density model after assigning a multinomial likelihood and a non-informative prior. The maximum of the posterior probability occurs at a point where the prior probability and the joint likelihood are balanced. The interplay between these opposing factors effectively implements Occam's razor by selecting the simplest model that best describes the data.
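
    A small sketch of the objective, assuming the standard form of Knuth's relative log posterior for an M-bin equal-width histogram; treat the exact constant terms as an assumption.

```python
import numpy as np
from scipy.special import gammaln

def log_posterior(data, m):
    """Relative log posterior for an m-bin equal-width histogram
    (the optBINS objective, up to an additive constant)."""
    n = len(data)
    counts, _ = np.histogram(data, bins=m)
    return (n * np.log(m)
            + gammaln(m / 2) - m * gammaln(0.5)
            - gammaln(n + m / 2)
            + np.sum(gammaln(counts + 0.5)))

def optbins(data, max_bins=100):
    """Return the bin count that maximizes the posterior."""
    ms = np.arange(1, max_bins + 1)
    return ms[np.argmax([log_posterior(data, m) for m in ms])]

rng = np.random.default_rng(0)
print(optbins(rng.normal(size=2000)))   # typically a few dozen bins
```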

  19. Effect of a Multi-Dimensional and Inter-Sectoral Intervention on the Adherence of Psychiatric Patients.

    Directory of Open Access Journals (Sweden)

    Anne Pauly

    In psychiatry, hospital stays and transitions to the ambulatory sector are susceptible to major changes in drug therapy that lead to complex medication regimens and common non-adherence among psychiatric patients. A multi-dimensional and inter-sectoral intervention is hypothesized to improve the adherence of psychiatric patients to their pharmacotherapy. 269 patients from a German university hospital were included in a prospective, open, clinical trial with consecutive control and intervention groups. Control patients (09/2012-03/2013) received usual care, whereas intervention patients (05/2013-12/2013) underwent a program to enhance adherence during their stay and up to three months after discharge. The program consisted of therapy simplification and individualized patient education (multi-dimensional component) during the stay and at discharge, as well as subsequent phone calls after discharge (inter-sectoral component). Adherence was measured by the "Medication Adherence Report Scale" (MARS) and the "Drug Attitude Inventory" (DAI). The improvement in the MARS score between admission and three months after discharge was 1.33 points (95% CI: 0.73-1.93) higher in the intervention group compared to controls. In addition, the DAI score improved 1.93 points (95% CI: 1.15-2.72) more for intervention patients. These two findings indicate significantly higher medication adherence following the investigated multi-dimensional and inter-sectoral program. German Clinical Trials Register DRKS00006358.

  20. Comparative study of the two-fluid momentum equations for multi-dimensional bubbly flows: Modification of Reynolds stress

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung Jun; Park, Ik Kyu; Yoon, Han Young [Thermal-Hydraulic Safety Research Division, Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Jae, Byoung [School of Mechanical Engineering, Chungnam National University, Daejeon (Korea, Republic of)

    2017-01-15

    Two-fluid equations are widely used to obtain averaged behaviors of two-phase flows. This study addresses a problem that may arise when the two-fluid equations are used for multi-dimensional bubbly flows. If steady drag is the only force accounted for in the interfacial momentum transfer, the disperse-phase velocity should equal the continuous-phase velocity when the flow is fully developed without gravity. However, existing momentum equations may show unphysical results in estimating the relative velocity of the disperse phase against the continuous phase. First, we examine two types of existing momentum equations. One is the standard two-fluid momentum equation, in which the disperse phase is treated as a continuum. The other is the averaged momentum equation derived from solid/fluid particle motion. We show that the existing equations are not appropriate for multi-dimensional bubbly flows. To resolve the problem mentioned above, we modify the form of the Reynolds stress terms in the averaged momentum equation based on the solid/fluid particle motion. The proposed equation shows physically correct results for both multi-dimensional laminar and turbulent flows.

  1. Multi-dimensional two-phase flow measurements in a large-diameter pipe using wire-mesh sensor

    International Nuclear Information System (INIS)

    Kanai, Taizo; Furuya, Masahiro; Arai, Takahiro; Shirakawa, Kenetsu; Nishi, Yoshihisa; Ueda, Nobuyuki

    2011-01-01

    The authors developed a measurement method to determine the multi-dimensionality of two-phase flow. A wire-mesh sensor (WMS) can acquire a void fraction distribution at high temporal and spatial resolution and can also estimate the velocity of a vertical rising flow by investigating the signal time delay of the upstream WMS relative to the downstream one. Previously, one-dimensional velocity was estimated by using the same point of each WMS at a temporal resolution of 1.0 - 5.0 s. The authors propose to extend this time-series analysis to estimate the multi-dimensional velocity profile via cross-correlation analysis between a point of the upstream WMS and multiple points downstream. Bubbles behave in various ways according to size, so they are classified into groups via wavelet analysis before the cross-correlation analysis. This method was verified with air-water straight and swirl flows within a large-diameter vertical pipe. A high-speed camera was used to set the parameters of the cross-correlation analysis. The results revealed that, for the rising straight and swirl flows, large-scale bubbles tend to move to the center, while small bubbles are pushed to the outside or sucked into the space where the large bubbles existed. Moreover, it was found that this method can estimate the rotational velocity component of the swirl flow as well as measure the multi-dimensional velocity vector at a high temporal resolution of 0.2 s. (author)

  2. Multi-Dimensional Spectrum-Effect Relationship of the Impact of Chinese Herbal Formula Lichong Shengsui Yin on Ovarian Cancer

    Directory of Open Access Journals (Sweden)

    Yanhong Wang

    2017-06-01

    Lichong Shengsui Yin (LCSSY) is an effective and classic compound prescription of Traditional Chinese Medicines (TCMs) used for the treatment of ovarian cancer. To investigate its pharmacodynamic basis for treating ovarian cancer, the multi-dimensional spectrum-effect relationship was determined. Four compositions (I to IV) were obtained by extracting LCSSY successively with supercritical CO2 fluid extraction, 75% ethanol reflux extraction, and the water extraction-ethanol precipitation method. Nine samples for pharmacological evaluation and fingerprint analysis were prepared by changing the content of the four compositions. The specific proportions of the four compositions were designed according to a four-factor, three-level L9(3^4) orthogonal test. The pharmacological evaluation included in vitro tumor inhibition experiments and the survival extension rate in tumor-bearing nude mice. The fingerprint analyzed by chromatographic condition I (high-performance liquid chromatography-photodiode array detector, HPLC-PDA) identified 19 common peaks. High-performance liquid chromatography-photodiode array detector-evaporative light-scattering detector (HPLC-PDA-ELSD) hyphenated techniques were used to compensate for the use of a single detector, and the fingerprint analyzed by chromatographic condition II identified 28 common peaks in PDA and 23 common peaks in ELSD. Furthermore, multiple statistical analyses were utilized to calculate the relationships between the peaks and the pharmacological results. The union of the regression and correlation analysis results comprised peaks X5, X9, X11, X12, X16, X18, Y5, Y8, Y12, Y14, Y20, Z4, Z5, Z6, and Z8. The intersection of the regression and correlation analysis results comprised peaks X11, X12, X16, X18, Y5, Y12, and Z5. The correlated peaks were assigned by comparing the fingerprints with the negative control samples and reference standard samples, and identifying the structure using high

  3. Multi-dimensional Analysis Method of Hydrogen Combustion in the Containment of a Nuclear Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jongtae; Hong, Seongwan [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Kim, Gun Hong [Kyungwon E and C Co., Seongnam (Korea, Republic of)

    2014-05-15

    The most severe case is the occurrence of detonation, which induces a few-fold greater pressure load on the containment wall than a deflagration flame. The occurrence of a containment-wide global detonation is prohibited by national regulation. The compartments located in the flow path, such as the steam generator compartment, annular compartment, and dome region, are likely to have highly concentrated hydrogen. If the hydrogen concentration in a compartment is found to be far below the detonation criterion during an accident progression, the occurrence of a detonative explosion in that compartment can be excluded. However, if it is not, it is necessary to evaluate the characteristics of flame acceleration in the containment. The possibility of a flame transition from deflagration to detonation (DDT) can be evaluated from a calculated hydrogen distribution in a compartment by using the sigma-lambda criteria. However, this method can give a very conservative result because the geometric characteristics of a real compartment are not well considered. In order to evaluate the containment integrity against the threat of a hydrogen explosion, it is necessary to establish an integrated evaluation system, which includes lumped-parameter and detailed analysis methods. In this study, a method for the multi-dimensional analysis of hydrogen combustion is proposed to mechanistically evaluate the flame acceleration characteristics with geometric effects. The geometry of the containment is modeled 3-dimensionally using a CAD tool. To resolve the propagating flame front, an adaptive mesh refinement method is coupled with the combustion analysis solver.

  4. Consumer preference of fertilizer in West Java using multi-dimensional scaling approach

    Science.gov (United States)

    Utami, Hesty Nurul; Sadeli, Agriani Hermita; Perdana, Tomy; Renaldy, Eddy; Mahra Arari, H.; Ajeng Sesy N., P.; Fernianda Rahayu, H.; Ginanjar, Tetep; Sanjaya, Sonny

    2018-02-01

    There are various fertilizer products in the market for farmers to use in farming activities. Fertilizers supplement soil nutrients and build up soil fertility in order to support plant nutrition and increase plant productivity. Fertilizers consist of nitrogen, phosphorus, potassium, micronutrients and other complex nutrients, and are commonly used in agricultural activities to improve the quantity and quality of the harvest. Recently, market demand for fertilizer has increased dramatically; consequently, fertilizer companies are required to develop strategies informed by consumer preferences on several issues. Consumer preference reflects consumer needs as selected by the individual, is measured by the utility derived from the alternatives the market offers, and acts as the final determinant in the purchase process. West Java is one of the main producers of agricultural products and is therefore one of the potential consumers of fertilizers for farming activities. This research is a case study in nine districts in West Java province, i.e., Bandung, West Bandung, Bogor, Depok, Garut, Indramayu, Majalengka, Cirebon and Cianjur. The purpose of this research is to describe the attributes of consumer preference for fertilizers. The multi-dimensional scaling method is used as a quantitative method to help visualize the level of similarity of individual cases in a dataset, to describe and map the information, and to meet the research goal. The attributes in this research are availability, nutrient content, price, form of fertilizer, decomposition speed, ease of use, label, packaging type, color, design and size of packaging, hardening process and promotion. Two fertilizer brands showed a tendency towards similarity in product availability, price, decomposition speed and hardening process.
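
    A minimal sketch of how such a perceptual map could be produced with off-the-shelf multi-dimensional scaling; the brand-by-attribute ratings below are hypothetical, not the survey data.

```python
import numpy as np
from sklearn.manifold import MDS

# hypothetical mean ratings: rows = fertilizer brands,
# columns = the twelve preference attributes from the survey
ratings = np.array([
    [4.2, 3.8, 3.1, 4.0, 3.5, 4.1, 3.0, 3.2, 2.9, 3.3, 3.6, 3.4],  # brand A
    [4.1, 3.7, 3.3, 3.9, 3.6, 4.0, 3.1, 3.1, 3.0, 3.2, 3.5, 3.3],  # brand B
    [2.5, 3.0, 4.2, 2.8, 2.4, 3.1, 3.9, 4.0, 3.8, 3.9, 2.7, 4.1],  # brand C
])

# dissimilarity = Euclidean distance between attribute profiles
d = np.linalg.norm(ratings[:, None, :] - ratings[None, :, :], axis=-1)

# project the brands onto a 2-D perceptual map
mds = MDS(n_components=2, dissimilarity='precomputed', random_state=0)
coords = mds.fit_transform(d)
print(coords)   # nearby points = brands perceived as similar
```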

  5. Design of a Multi Dimensional Database for the Archimed DataWarehouse.

    Science.gov (United States)

    Bréant, Claudine; Thurler, Gérald; Borst, François; Geissbuhler, Antoine

    2005-01-01

    The Archimed data warehouse project started in 1993 at the Geneva University Hospital. It has progressively integrated seven data marts (or domains of activity) archiving medical data such as Admission/Discharge/Transfer (ADT) data, laboratory results, radiology exams, diagnoses, and procedure codes. The objective of the Archimed data warehouse is to facilitate access to an integrated and coherent view of patient medical data in order to support analytical activities such as medical statistics, clinical studies, retrieval of similar cases and data mining processes. This paper discusses three principal design aspects relative to the conception of the database of the data warehouse: 1) the granularity of the database, which refers to the level of detail or summarization of the data; 2) the database model and architecture, describing how data are presented to end users and how new data are integrated; 3) the life cycle of the database, which ensures the long-term scalability of the environment. Both the organization of patient medical data using a standardized elementary fact representation and the use of the multi-dimensional model have proved to be powerful design tools for integrating data coming from the multiple heterogeneous database systems that form the transactional Hospital Information System (HIS). Concurrently, building the data warehouse incrementally has helped to control the evolution of the data content. These three design aspects bring clarity and performance regarding data access. They also provide long-term scalability to the system and resilience to further changes that may occur in the source systems feeding the data warehouse.

  6. Magnetic quantum tunneling: key insights from multi-dimensional high-field EPR.

    Science.gov (United States)

    Lawrence, J; Yang, E-C; Hendrickson, D N; Hill, S

    2009-08-21

    Multi-dimensional high-field/frequency electron paramagnetic resonance (HFEPR) spectroscopy is performed on single crystals of the high-symmetry spin S = 4 tetranuclear single-molecule magnet (SMM) [Ni(hmp)(dmb)Cl]_4, where hmp(-) is the anion of 2-hydroxymethylpyridine and dmb is 3,3-dimethyl-1-butanol. Measurements performed as a function of the applied magnetic field strength and its orientation within the hard plane reveal the four-fold behavior associated with the fourth-order transverse zero-field splitting (ZFS) interaction, (1/2)B_4^4(S_+^4 + S_-^4), within the framework of a rigid-spin approximation (with S = 4). This ZFS interaction mixes the m_s = ±4 ground states in second order of perturbation, generating a sizeable (12 MHz) tunnel splitting, which explains the fast magnetic quantum tunneling in this SMM. Meanwhile, multi-frequency measurements performed with the field parallel to the easy axis reveal HFEPR transitions associated with excited spin multiplets (S < 4), giving access to the exchange coupling between the spin s = 1 Ni(II) ions within the cluster, as well as a characterization of the ZFS within excited states. The combined experimental studies support recent work indicating that the fourth-order anisotropy associated with the S = 4 state originates from second-order ZFS interactions associated with the individual Ni(II) centers, but only as a result of higher-order processes that occur via S-mixing between the ground state and higher-lying (S < 4) spin multiplets. We argue that this S-mixing plays an important role in the low-temperature quantum dynamics associated with many other well-known SMMs.

  7. Installation of aerosol behavior model into multi-dimensional thermal hydraulic analysis code AQUA

    International Nuclear Information System (INIS)

    Kisohara, Naoyuki; Yamaguchi, Akira

    1997-12-01

    The safety analysis of the FBR plant system for sodium leak phenomena needs to evaluate the deposition of aerosol particles on the components in the plant, the chemical reaction of the aerosol with humidity in the air, and the effect of combustion heat transferred through the aerosol to the structural components. For this purpose, the ABC-INTG (Aerosol Behavior in Containment-INTeGrated Version) code has been developed and used until now. This code calculates aerosol behavior in a gas region of uniform temperature and pressure with a 1-cell model. More detailed calculation of aerosol behavior, however, requires the installation of the aerosol model into the multi-cell thermal-hydraulic analysis code AQUA. AQUA can calculate the carrier gas flow, temperature and the distribution of the aerosol spatial concentration. On the other hand, ABC-INTG can calculate the generation, deposition to the wall and floor, and agglomeration of aerosol particles, and work out the distribution of the aerosol particle size. Thus, the combination of these two codes makes it possible to deal with an aerosol model coupling the distribution of the aerosol spatial concentration with that of the aerosol particle size. This report describes the aerosol behavior model, how to install the aerosol model into AQUA, and the new subroutines added to the code. Furthermore, test calculations with a simple structural model were executed with this code, and appropriate results were obtained. This code thus shows promise for predicting aerosol behavior through coupled analysis with multi-dimensional gas thermo-dynamics for sodium combustion evaluation. (J.P.N.)

  8. A revised Thai Multi-Dimensional Scale of Perceived Social Support.

    Science.gov (United States)

    Wongpakaran, Nahathai; Wongpakaran, Tinakon

    2012-11-01

    In order to ensure the construct validity of the three-factor model of the Multi-dimensional Scale of Perceived Social Support (MSPSS), and based on the assumption that it helps users differentiate between sources of social support, in this study a revised version was created and tested. The aim was to compare the level of model fit of the original version of the MSPSS against the revised version, which contains a minor change from the original. The study was conducted on 486 medical students who completed the original and revised versions of the MSPSS, as well as the Rosenberg Self-Esteem Scale (Rosenberg, 1965) and Beck Depression Inventory II (Beck, Steer, & Brown, 1996). Confirmatory factor analysis was performed to compare the results, showing that the revised version of the MSPSS demonstrated good internal consistency, with a Cronbach's alpha of .92 for the MSPSS questionnaire, and a significant correlation with the other scales, as predicted. The revised version provided better internal consistency, increasing the Cronbach's alpha for the Significant Others sub-scale from 0.86 to 0.92. Confirmatory factor analysis revealed an acceptable model fit: χ2 = 128.11, df = 51, p < .001; TLI 0.94; CFI 0.95; GFI 0.90; PNFI 0.71; AGFI 0.85; RMSEA 0.093 (0.073-0.113) and SRMR 0.042, which is better than the original version. The tendency of the new version was to display a better level of fit with a larger sample size. The limitations of the study are discussed, as well as recommendations for further study.

  9. Oceans 2.0: Interactive tools for the Visualization of Multi-dimensional Ocean Sensor Data

    Science.gov (United States)

    Biffard, B.; Valenzuela, M.; Conley, P.; MacArthur, M.; Tredger, S.; Guillemot, E.; Pirenne, B.

    2016-12-01

    Ocean Networks Canada (ONC) operates ocean observatories on all three of Canada's coasts. The instruments produce 280 gigabytes of data per day with 1/2 petabyte archived so far. In 2015, 13 terabytes were downloaded by over 500 users from across the world. ONC's data management system is referred to as "Oceans 2.0" owing to its interactive, participative features. A key element of Oceans 2.0 is real time data acquisition and processing: custom device drivers implement the input-output protocol of each instrument. Automatic parsing and calibration takes place on the fly, followed by event detection and quality control. All raw data are stored in a file archive, while the processed data are copied to fast databases. Interactive access to processed data is provided through data download and visualization/quick look features that are adapted to diverse data types (scalar, acoustic, video, multi-dimensional, etc). Data may be post or re-processed to add features, analysis or correct errors, update calibrations, etc. A robust storage structure has been developed consisting of an extensive file system and a no-SQL database (Cassandra). Cassandra is a node-based open source distributed database management system. It is scalable and offers improved performance for big data. A key feature is data summarization. The system has also been integrated with web services and an ERDDAP OPeNDAP server, capable of serving scalar and multidimensional data from Cassandra for fixed or mobile devices. A complex data viewer has been developed making use of the big data capability to interactively display live or historic echo sounder and acoustic Doppler current profiler data, where users can scroll, apply processing filters and zoom through gigabytes of data with simple interactions. This new technology brings scientists one step closer to a comprehensive, web-based data analysis environment in which visual assessment, filtering, event detection and annotation can be integrated.

  10. Evaluating accessibility to Bangkok Metro Systems using multi-dimensional criteria across user groups

    Directory of Open Access Journals (Sweden)

    Duangporn Prasertsubpakij

    2012-07-01

    Metro systems act as fast and efficient transport systems for many modern metropolises; however, enhancing higher usage of such systems often conflicts with providing suitable accessibility options. The traditional approach of metro accessibility studies seems to be an ineffective measure to gauge sustainable access in which the equal rights of all users are taken into account. Bangkok Metropolitan Region (BMR) transportation has increasingly relied on the role of two mass rapid transit systems publicly called "BTS Skytrain" and "MRT Subway", due to limited availability of land and massive road congestion; however, access to such transit arguably treats some vulnerable groups, especially women, the elderly and disabled people, unfairly. This study constructs a multi-dimensional assessment of accessibility considerations to scrutinize how user groups access metro services, based on the BMR empirical case. 600 individual passengers at various stations were asked to rate a questionnaire that simultaneously considers accessibility aspects of spatial, feeder-connectivity, temporal, comfort/safety, psychosocial and other dimensions. Interestingly, the user-disaggregated accessibility model found that the lower the accessibility perceptions related to uncomfortable and unsafe environmental conditions, the greater the equitable access to services, as illustrated by the MRT Hua Lumphong and MRT Petchaburi stations. The study suggests that, to balance the access priorities of groups on services, policy actions should emphasize acceptably safe access for individuals, cost-efficient feeder services connecting the metro lines, socioeconomic influences and time allocation. Insightful discussions on an integrated approach balancing different dimensions of accessibility, and the resulting recommendations, would contribute to accessibility-based knowledge and the potential propensity to use public transit, furthering transport sustainability.

  11. Studying Operation Rules of Cascade Reservoirs Based on Multi-Dimensional Dynamics Programming

    Directory of Open Access Journals (Sweden)

    Zhiqiang Jiang

    2017-12-01

    Although many optimization models and methods are currently applied to reservoir operation, the optimal operation decisions made through these models and methods amount to a retrospective review. Owing to the limited accuracy of hydrological prediction, it is practical and feasible to obtain a suboptimal or satisfactory solution from established operation rules in actual reservoir operation, especially for mid- and long-term operation. In order to obtain optimized sample data with global optimality, and to make the extracted operation rules more reasonable and reliable, this paper presents a multi-dimensional dynamic programming model for the optimal joint operation of cascade reservoirs and provides the corresponding recursive equation and the specific solution steps. Taking the Li Xianjiang cascade reservoirs as a case study, seven uncertain problems in the whole operation period of the cascade reservoirs are summarized after a detailed analysis of the obtained optimal sample data, and two sub-models are put forward to solve these uncertain problems. Finally, by dividing the whole operation period into four characteristic sections, this paper extracts the operation rules of each reservoir for each section. Comparing the simulation results of the extracted operation rules with the conventional joint operation method indicates that the power generation under the obtained rules shows a certain degree of improvement both in inspection years and in typical years (i.e., wet, normal and dry years). The rationality and effectiveness of the extracted operation rules are thus verified by the comparative analysis.
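
    The recursive idea can be sketched for a single reservoir (the paper's model extends the state space to the whole cascade): a backward dynamic program over a discretized storage grid. All quantities below are hypothetical.

```python
import numpy as np

# toy setup: one reservoir, T monthly stages, storage discretized into levels
T, levels = 12, 11
storage = np.linspace(0.0, 100.0, levels)           # storage grid (10^6 m3)
inflow = np.array([20, 25, 40, 60, 80, 90, 70, 50, 35, 30, 25, 20.])

def power(release, s_avg):
    # hypothetical generation: release times head, with head taken
    # as a simple increasing function of average storage
    return release * (50.0 + 0.5 * s_avg) * 1e-3

# value[t, i] = best future generation from stage t with storage level i
value = np.zeros((T + 1, levels))
policy = np.zeros((T, levels), dtype=int)

for t in range(T - 1, -1, -1):                      # backward recursion
    for i, s in enumerate(storage):
        best, arg = -np.inf, i
        for j, s_next in enumerate(storage):        # candidate end storage
            release = s + inflow[t] - s_next
            if release < 0:                         # infeasible transition
                continue
            gain = power(release, 0.5 * (s + s_next)) + value[t + 1, j]
            if gain > best:
                best, arg = gain, j
        value[t, i], policy[t, i] = best, arg

print(value[0, levels // 2])   # optimal generation starting from mid storage
```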

  12. Design and implement of BESIII online histogramming software

    International Nuclear Information System (INIS)

    Li Fei; Wang Liang; Liu Yingjie; Chinese Academy of Sciences, Beijing; Zhu Kejun; Zhao Jingwei

    2007-01-01

    The online histogramming software is an important part of the BESIII DAQ (Data Acquisition) system. This article introduces the main requirements and design of the online histogramming software, and presents how histograms are produced, transmitted and gathered in the distributed environment in the current software implementation. The article also illustrates a simple and easily extensible setup scheme based on an XML configuration database. (authors)

  13. A monitoring program of the histograms based on ROOT package

    International Nuclear Information System (INIS)

    Zhou Yongzhao; Liang Hao; Chen Yixin; Xue Jundong; Yang Tao; Gong Datao; Jin Ge; Yu Xiaoqi

    2002-01-01

    KHBOOK is a histogram monitor and browser based on the ROOT package, which reads histogram files in HBOOK format from Physmon, converts them into ROOT format, and browses the histograms in Repeat and Overlap modes to monitor and trace the quality of the data from the DAQ. KHBOOK is a program with a small memory footprint, easy maintenance and fast execution, built from mono-behavior classes and a C++ communication class.

  14. Glioma grade assessment by using histogram analysis of diffusion tensor imaging-derived maps

    International Nuclear Information System (INIS)

    Jakab, Andras; Berenyi, Ervin; Molnar, Peter; Emri, Miklos

    2011-01-01

    Current endeavors in neuro-oncology include morphological validation of imaging methods by histology, including molecular and immunohistochemical techniques. Diffusion tensor imaging (DTI) is an up-to-date methodology of intracranial diagnostics that has gained importance in studies of neoplasia. Our aim was to assess the feasibility of discriminant analysis applied to histograms of preoperative diffusion tensor imaging-derived images for the prediction of glioma grade validated by histomorphology. Tumors of 40 consecutive patients included 13 grade II astrocytomas, seven oligoastrocytomas, six grade II oligodendrogliomas, three grade III oligoastrocytomas, and 11 glioblastoma multiformes. Preoperative DTI data comprised: unweighted (B_0) images, fractional anisotropy, longitudinal and radial diffusivity maps, directionally averaged diffusion-weighted imaging, and trace images. Sampling consisted of generating histograms for gross tumor volumes; 25 histogram bins per scalar map were calculated. The histogram bins that allowed the most precise determination of low-grade (LG) or high-grade (HG) classification were selected by multivariate discriminant analysis. Accuracy of the model was defined by the success rate of the leave-one-out cross-validation. Statistical descriptors of voxel value distribution did not differ between LG and HG tumors and did not allow classification. The histogram model had 88.5% specificity and 85.7% sensitivity in the separation of LG and HG gliomas; specificity was improved when cases with oligodendroglial components were omitted. Constructing histograms of preoperative radiological images over the tumor volume allows representation of the grade and enables discrimination of LG and HG gliomas, which has been confirmed by histopathology. (orig.)
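
    A sketch of the analysis pipeline described (discriminant analysis over selected histogram bins, accuracy assessed by leave-one-out cross-validation), using synthetic stand-in features rather than the study's DTI data.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_score

# hypothetical design matrix: one row per tumor, columns = selected
# histogram bins pooled from the DTI-derived maps (FA, diffusivities, ...)
rng = np.random.default_rng(0)
X_low = rng.normal(0.0, 1.0, size=(26, 8))     # low-grade gliomas
X_high = rng.normal(0.8, 1.0, size=(14, 8))    # high-grade gliomas
X = np.vstack([X_low, X_high])
y = np.array([0] * 26 + [1] * 14)              # 0 = LG, 1 = HG

# discriminant analysis with leave-one-out cross-validation
acc = cross_val_score(LinearDiscriminantAnalysis(), X, y,
                      cv=LeaveOneOut()).mean()
print(f"LOOCV accuracy: {acc:.2f}")
```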

  15. Quantifying multi-dimensional functional trait spaces of trees: empirical versus theoretical approaches

    Science.gov (United States)

    Ogle, K.; Fell, M.; Barber, J. J.

    2016-12-01

    Empirical, field studies of plant functional traits have revealed important trade-offs among pairs or triplets of traits, such as the leaf (LES) and wood (WES) economics spectra. Trade-offs include correlations between leaf longevity (LL) vs specific leaf area (SLA), LL vs mass-specific leaf respiration rate (RmL), SLA vs RmL, and resistance to breakage vs wood density. Ordination analyses (e.g., PCA) show groupings of traits that tend to align with different life-history strategies or taxonomic groups. It is unclear, however, what underlies such trade-offs and emergent spectra. Do they arise from inherent physiological constraints on growth, or are they more reflective of environmental filtering? The relative importance of these mechanisms has implications for predicting biogeochemical cycling, which is influenced by trait distributions of the plant community. We address this question using an individual-based model of tree growth (ACGCA) to quantify the theoretical trait space of trees that emerges from physiological constraints. ACGCA's inputs include 32 physiological, anatomical, and allometric traits, many of which are related to the LES and WES. We fit ACGCA to 1.6 million USFS FIA observations of tree diameters and heights to obtain vectors of trait values that produce realistic growth, and we explored the structure of this trait space. No notable correlations emerged among the 496 trait pairs, but stepwise regressions revealed complicated multi-variate structure: e.g., relationships between pairs of traits (e.g., RmL and SLA) are governed by other traits (e.g., LL, radiation-use efficiency [RUE]). We also simulated growth under various canopy gap scenarios that impose varying degrees of environmental filtering to explore the multi-dimensional trait space (hypervolume) of trees that died vs survived. The centroid and volume of the hypervolumes differed among dead and live trees, especially under gap conditions leading to low mortality. Traits most predictive

  16. Inferring dynamic gene regulatory networks in cardiac differentiation through the integration of multi-dimensional data.

    Science.gov (United States)

    Gong, Wuming; Koyano-Nakagawa, Naoko; Li, Tongbin; Garry, Daniel J

    2015-03-07

    -CM transitions. We report a novel method to systematically integrate multi-dimensional -omics data and reconstruct the gene regulatory networks. This method will allow one to rapidly determine the cis-modules that regulate key genes during cardiac differentiation.

  17. Histogram-based ionogram displays and their application to autoscaling

    Science.gov (United States)

    Lynn, Kenneth J. W.

    2018-03-01

    A simple method is described for displaying and autoscaling the basic ionogram parameters foF2 and h'F2, as well as some additional layer parameters, from digital ionograms. The technique employed is based on forming frequency and height histograms from each ionogram. This technique has been applied specifically to ionograms produced by the IPS5D ionosonde developed and operated by the Australian Space Weather Service (SWS). The SWS ionograms are archived in a cleaned format and are readily available from the SWS internet site. However, the method is applicable to any ionosonde that produces ionograms in a digital format at a useful signal-to-noise level. The most novel feature of the technique for autoscaling is its simplicity and its avoidance of the mathematical imaging and line-fitting techniques often used. The program arose from the necessity of displaying many days of ionogram output to allow the location of specific types of ionospheric event, such as ionospheric storms, travelling ionospheric disturbances and repetitive ionospheric height changes, for further investigation and measurement. Examples and applications of the method are given, including the removal of sporadic E and spread F.
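
    A minimal sketch of the histogram idea, assuming a cleaned ionogram given as an amplitude matrix; the thresholds and read-out rules below are illustrative guesses, and the paper's layer separation and sporadic-E handling are omitted.

```python
import numpy as np

def autoscale(ionogram, freqs, heights, echo_thresh=3.0):
    """Histogram-based scaling sketch for a cleaned digital ionogram.

    ionogram : 2-D echo-amplitude array indexed [height, frequency]
    Returns rough (foF2, h'F2) picks."""
    echoes = ionogram > echo_thresh              # binarize echo returns
    freq_hist = echoes.sum(axis=0)               # echo count per frequency
    hgt_hist = echoes.sum(axis=1)                # echo count per height
    # foF2: the highest frequency column still containing echoes
    fof2 = freqs[np.nonzero(freq_hist)[0].max()]
    # h'F2: the lowest height bin carrying a substantial share of echoes
    hpf2 = heights[np.nonzero(hgt_hist >= 0.5 * hgt_hist.max())[0].min()]
    return fof2, hpf2
```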

  18. Devaney chaos, Li-Yorke chaos, and multi-dimensional Li-Yorke chaos for topological dynamics

    Science.gov (United States)

    Dai, Xiongping; Tang, Xinjia

    2017-11-01

    Let π : T × X → X, written T↷π X, be a topological semiflow/flow on a uniform space X with T a multiplicative topological semigroup/group, not necessarily discrete. We then prove: if T↷π X is non-minimal topologically transitive with dense almost periodic points, then it is sensitive to initial conditions. As a result, Devaney chaos implies sensitivity to initial conditions in this very general setting. Let R+↷π X be a C0-semiflow on a Polish space; then we show: if R+↷π X is topologically transitive with at least one periodic point p and there is a dense orbit with no nonempty interior, then it is multi-dimensional Li-Yorke chaotic; that is, there is an uncountable set Θ ⊆ X such that for any k ≥ 2 and any distinct points x1, …, xk ∈ Θ, one can find two time sequences sn → ∞, tn → ∞ along which the orbits of x1, …, xk simultaneously approach one another and simultaneously remain separated, respectively. Moreover, let X be a non-singleton Polish space; then we prove: any weakly-mixing C0-semiflow R+↷π X is densely multi-dimensional Li-Yorke chaotic; any minimal weakly-mixing topological flow T↷π X with T abelian is densely multi-dimensional Li-Yorke chaotic; any weakly-mixing topological flow T↷π X is densely Li-Yorke chaotic. In addition, we construct a completely Li-Yorke chaotic minimal SL(2, R)-acting flow on the compact metric space R ∪ {∞}. Our various chaotic dynamics are sensitive to the choices of the topology of the phase semigroup/group T.

  19. Value-at-risk estimation with fuzzy histograms

    NARCIS (Netherlands)

    Almeida, R.J.; Kaymak, U.

    2008-01-01

    Value at risk (VaR) is a measure for senior management that summarises the financial risk a company faces into one single number. In this paper, we consider the use of fuzzy histograms for quantifying the value-at-risk of a portfolio. It is shown that the use of fuzzy histograms provides a good
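
    A small sketch of the idea, assuming triangular membership functions and a crisp quantile read-out at the end; the paper's treatment carries the fuzziness further into the calculation.

```python
import numpy as np

def fuzzy_histogram(returns, centers):
    """Histogram with triangular membership: each observation is spread
    fractionally over its nearest bin centers instead of one crisp bin."""
    width = centers[1] - centers[0]
    counts = np.zeros_like(centers)
    for r in returns:
        mu = np.clip(1.0 - np.abs(r - centers) / width, 0.0, None)
        if mu.sum() > 0:                       # ignore out-of-range values
            counts += mu / mu.sum()
    return counts / counts.sum()

def hist_var(probs, centers, alpha=0.05):
    """VaR as the alpha quantile of the histogram-implied distribution."""
    return -centers[np.searchsorted(np.cumsum(probs), alpha)]

rng = np.random.default_rng(0)
rets = rng.normal(0.0005, 0.01, 1000)          # hypothetical daily returns
centers = np.linspace(-0.05, 0.05, 51)
print(f"1-day 95% VaR: {hist_var(fuzzy_histogram(rets, centers), centers):.4f}")
```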

  20. Multiple histogram method and static Monte Carlo sampling

    NARCIS (Netherlands)

    Inda, M.A.; Frenkel, D.

    2004-01-01

    We describe an approach to use multiple-histogram methods in combination with static, biased Monte Carlo simulations. To illustrate this, we computed the force-extension curve of an athermal polymer from multiple histograms constructed in a series of static Rosenbluth Monte Carlo simulations. From
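
    A generic self-consistent multiple-histogram (WHAM-style) iteration is sketched below on a toy 1-D problem; the paper's specific combination with static Rosenbluth sampling is not reproduced.

```python
import numpy as np

def wham(hists, bias, n_iter=2000):
    """Self-consistent multiple-histogram iteration, kT = 1.

    hists : (K, B) histogram counts from K biased runs
    bias  : (K, B) bias potential of run k evaluated at bin b
    Returns the estimated unbiased probability over the B bins."""
    counts = hists.sum(axis=0)          # pooled counts per bin
    n_k = hists.sum(axis=1)             # samples per run
    c = np.exp(-bias)                   # bias Boltzmann factors
    z = np.ones(len(hists))             # per-run normalization constants
    for _ in range(n_iter):
        p = counts / ((n_k / z) @ c)    # unbiased estimate given z
        p /= p.sum()
        z = c @ p                       # update normalizations given p
    return p

# toy demo: two harmonically biased runs sampling a 1-D coordinate
rng = np.random.default_rng(0)
B = 30
x = np.linspace(0.0, 1.0, B)
bias = np.stack([40.0 * (x - 0.3) ** 2, 40.0 * (x - 0.7) ** 2])
true_p = np.exp(-8.0 * (x - 0.5) ** 2)
true_p /= true_p.sum()
hists = []
for b in bias:
    q = true_p * np.exp(-b)             # biased sampling distribution
    hists.append(np.bincount(rng.choice(B, 5000, p=q / q.sum()), minlength=B))
print(np.abs(wham(np.stack(hists), bias) - true_p).max())  # small error
```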

  1. Calibration of 14C Histograms : A Comparison of Methods

    NARCIS (Netherlands)

    Stolk, Ad; Törnqvist, Torbjörn E.; Hekhuis, Kilian P.V.; Berendsen, Henk J.A.; Plicht, Johannes van der

    1994-01-01

    The interpretation of C-14 histograms is complicated by the non-linearity of the C-14 time scale in terms of calendar years, which may result in clustering of C-14 ages in certain time intervals unrelated to the (geologic or archaeologic) phenomenon of interest. One can calibrate C-14 histograms for

  2. Calculation of complication probability of pion treatment at PSI using dose-volume histograms

    International Nuclear Information System (INIS)

    Nakagawa, Keiichi; Akanuma, Atsuo; Aoki, Yukimasa

    1991-01-01

    In the conformation technique a target volume is irradiated uniformly, as in conventional radiotherapy, whereas the surrounding tissues and organs are irradiated nonuniformly. Clinical data on radiation injury accumulated with conventional radiation are therefore not applicable without appropriate compensation. Recently a putative solution to this problem was proposed by Lyman using dose-volume histograms. This histogram reduction method reduces a given dose-volume histogram of an organ, by interpolation, to a single step corresponding to the equivalent complication probability. As a result it converts nonuniform radiation into a unique whole-organ dose with the equivalent likelihood of radiation injury. This method is based on low-LET radiation with conventional fractionation schedules. When it is applied to high-LET radiation such as negative pion treatment, the high-LET dose should be converted to an equivalent photon dose using an appropriate value of RBE. In the present study the histogram reduction method was applied to actual patients treated by the negative pion conformation technique at the Paul Scherrer Institute. Out of 90 evaluable cases of pelvic tumors, 16 developed grade III-IV bladder injury and 7 developed grade III-IV rectal injury. The 90 cases were divided into roughly equal groups according to the equivalent doses to the entire bladder and rectum. Complication rates and equivalent doses to the full organs in these groups could be represented by a sigmoid dose-effect relation. When the RBE from a pion dose to a photon dose is assumed to be 2.1 for bladder injury, the rates of bladder complications fit the theoretical complication curve best. When the RBE value was 2.3, the rates of rectal injury fit the theoretical curve best. These values are close to the conversion factor of 2.0 that is used in clinical practice at PSI. This agreement suggests the clinical feasibility of the histogram reduction method in conformation radiotherapy. (author)
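
    The histogram reduction step can be sketched in the Lyman-Kutcher-Burman form: reduce the dose-volume histogram to an effective volume uniformly receiving the maximum dose, then evaluate the complication probability from a probit dose-response. The parameter values below are hypothetical, and pion doses would first be converted to photon-equivalent doses with the RBE (about 2.1-2.3 in this study).

```python
import numpy as np
from scipy.stats import norm

def lkb_ntcp(doses, volumes, td50_1=80.0, m=0.15, n=0.5):
    """Lyman-Kutcher-Burman style histogram reduction (a sketch of the
    method the paper applies; td50_1, m, n are hypothetical).

    doses   : per-bin dose of a differential DVH (photon-equivalent Gy)
    volumes : per-bin fractional organ volume (sums to 1)"""
    d_max = doses.max()
    # reduce the DVH to an effective volume uniformly receiving d_max
    v_eff = np.sum(volumes * (doses / d_max) ** (1.0 / n))
    td50 = td50_1 * v_eff ** (-n)          # volume-scaled tolerance dose
    t = (d_max - td50) / (m * td50)
    return norm.cdf(t)                     # complication probability

# toy DVH: 60% of the organ at 30 Gy, 30% at 50 Gy, 10% at 70 Gy
print(lkb_ntcp(np.array([30., 50., 70.]), np.array([0.6, 0.3, 0.1])))
```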

  3. Controllable preparation of multi-dimensional hybrid materials of nickel-cobalt layered double hydroxide nanorods/nanosheets on electrospun carbon nanofibers for high-performance supercapacitors

    International Nuclear Information System (INIS)

    Lai, Feili; Huang, Yunpeng; Miao, Yue-E; Liu, Tianxi

    2015-01-01

    Graphical Abstract: Multi-dimensional hybrid materials of nickel-cobalt layered double hydroxide nanorods/nanosheets grown on electrospun carbon nanofiber membranes were prepared via electrospinning combined with solution co-deposition for high-performance supercapacitor electrodes. - Highlights: • Ni-Co LDH@CNF hybrids were prepared by electrospinning and solution co-deposition. • Ni-Co LDH@CNF hybrids show high electrochemical performance for supercapacitors. • This method can be extended to other bimetallic@CNF hybrids for electrode materials. - Abstract: Hybrid nanomaterials with hierarchical structures have been considered as one kind of the most promising electrode materials for high-performance supercapacitors with high capacity and long cycle lifetime. In this work, multi-dimensional hybrid materials of nickel-cobalt layered double hydroxide (Ni-Co LDH) nanorods/nanosheets on carbon nanofibers (CNFs) were prepared by electrospinning technique combined with one-step solution co-deposition method. Carbon nanofiber membranes were obtained by electrospinning of polyacrylonitrile (PAN) followed by pre-oxidation and carbonization. The successful growth of Ni-Co LDH with different morphologies on CNF membrane by using two kinds of auxiliary agents reveals the simplicity and universality of this method. The uniform and immense growth of Ni-Co LDH on CNFs significantly improves its dispersion and distribution. Meanwhile the hierarchical structure of carbon nanofiber@nickel-cobalt layered double hydroxide nanorods/nanosheets (CNF@Ni-Co LDH NR/NS) hybrid membranes provide not only more active sites for electrochemical reaction but also more efficient pathways for electron transport. Galvanostatic charge-discharge measurements reveal high specific capacitances of 1378.2 F g−1 and 1195.4 F g−1 (based on Ni-Co LDH mass) at 1 A g−1 for CNF@Ni-Co LDH NR and CNF@Ni-Co LDH NS hybrid membranes, respectively. Moreover, cycling stabilities for both hybrid membranes are

  4. A Two-Temperature Open-Source CFD Model for Hypersonic Reacting Flows, Part Two: Multi-Dimensional Analysis †

    OpenAIRE

    Vincent Casseau; Daniel E. R. Espinoza; Thomas J. Scanlon; Richard E. Brown

    2016-01-01

    hy2Foam is a newly-coded open-source two-temperature computational fluid dynamics (CFD) solver that has previously been validated for zero-dimensional test cases. It aims at (1) giving open-source access to a state-of-the-art hypersonic CFD solver to students and researchers; and (2) providing a foundation for a future hybrid CFD-DSMC (direct simulation Monte Carlo) code within the OpenFOAM framework. This paper focuses on the multi-dimensional verification of hy2Foam and firstly describes th...

  5. Theoretical background and implementation of the finite element method for multi-dimensional Fokker-Planck equation analysis

    Czech Academy of Sciences Publication Activity Database

    Král, Radomil; Náprstek, Jiří

    2017-01-01

    Roč. 113, November (2017), s. 54-75 ISSN 0965-9978 R&D Projects: GA ČR(CZ) GP14-34467P; GA ČR(CZ) GA15-01035S Institutional support: RVO:68378297 Keywords: Fokker-Planck equation * finite element method * simplex element * multi-dimensional problem * non-symmetric operator Subject RIV: JM - Building Engineering OBOR OECD: Mechanical engineering Impact factor: 3.000, year: 2016 https://www.sciencedirect.com/science/article/pii/S0965997817301904

  6. DPAK and HPAK: a versatile display and histogramming package

    International Nuclear Information System (INIS)

    Logg, C.A.; Boyarski, A.M.; Cook, A.J.; Cottrell, R.L.A.; Sund, S.

    1979-07-01

    The features of a display and histogram package which requires a minimal number of subroutine calls in order to generate graphic output in many flavors on a variety of devices are described. Default options are preset to values that are generally most wanted, but the default values may be readily changed to the user's needs. The description falls naturally into two parts, namely, the set of routines (DPAK) for displaying data on some device, and the set of routines (HPAK) for generating histograms. HPAK provides a means of allocating memory for histograms, accumulating data into histograms, and subsequently displaying the histograms via calls to the DPAK routines. Histograms and displays of either one or two independent variables can be made.

  7. Examining the evolution towards turbulence through spatio-temporal analysis of multi-dimensional structures formed by instability growth along a shear layer

    Science.gov (United States)

    Merritt, Elizabeth; Doss, Forrest; Loomis, Eric; Flippo, Kirk; Devolder, Barbara; Welser-Sherrill, Leslie; Fincke, James; Kline, John

    2014-10-01

    The counter-propagating shear campaign is examining instability growth and its transition to turbulence relevant to mix in ICF capsules. Experimental platforms on both OMEGA and NIF use anti-symmetric flows about a shear interface to examine isolated Kelvin-Helmholtz instability growth. Measurements of interface (an Al or Ti tracer layer) dynamics are used to benchmark the LANL RAGE hydrocode with the BHR turbulence model. The tracer layer does not expand uniformly, but breaks up into multi-dimensional structures that are initially quasi-2D due to the target geometry. We are developing techniques to analyze the multi-D structure growth along the tracer surface, with a focus on characterizing the time-dependent structures' spectrum of scales in order to appraise a transition to turbulence in the system and potentially provide tighter constraints on initialization schemes for the BHR model. To this end, we use a wavelet-based analysis to diagnose single-time radiographs of the tracer layer surface (with low and amplified roughness for random-noise seeding) containing observed spatially non-repetitive features, in order to identify spatial and temporal trends in radiographs taken at different times across several experimental shots. This work was conducted under the auspices of the U.S. Department of Energy by LANL under Contract DE-AC52-06NA25396.

  8. An integrated approach for the knowledge discovery in computer simulation models with a multi-dimensional parameter space

    Energy Technology Data Exchange (ETDEWEB)

    Khawli, Toufik Al; Eppelt, Urs; Hermanns, Torsten [RWTH Aachen University, Chair for Nonlinear Dynamics, Steinbachstr. 15, 52047 Aachen (Germany); Gebhardt, Sascha [RWTH Aachen University, Virtual Reality Group, IT Center, Seffenter Weg 23, 52074 Aachen (Germany); Kuhlen, Torsten [Forschungszentrum Jülich GmbH, Institute for Advanced Simulation (IAS), Jülich Supercomputing Centre (JSC), Wilhelm-Johnen-Straße, 52425 Jülich (Germany); Schulz, Wolfgang [Fraunhofer, ILT Laser Technology, Steinbachstr. 15, 52047 Aachen (Germany)

    2016-06-08

    In production industries, parameter identification, sensitivity analysis and multi-dimensional visualization are vital steps in the planning process for achieving optimal designs and gaining valuable information. Sensitivity analysis and visualization can help in identifying the most-influential parameters and quantify their contribution to the model output, reduce the model complexity, and enhance the understanding of the model behavior. Typically, this requires a large number of simulations, which can be both very expensive and time consuming when the simulation models are numerically complex and the number of parameter inputs increases. There are three main constituent parts in this work. The first part is to substitute the numerical, physical model by an accurate surrogate model, the so-called metamodel. The second part includes a multi-dimensional visualization approach for the visual exploration of metamodels. In the third part, the metamodel is used to provide the two global sensitivity measures: i) the Elementary Effect for screening the parameters, and ii) the variance decomposition method for calculating the Sobol indices that quantify both the main and interaction effects. The application of the proposed approach is illustrated with an industrial application with the goal of optimizing a drilling process using a Gaussian laser beam.
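
    The screening step can be sketched as Morris-style elementary effects evaluated on a cheap surrogate; the stand-in model and the step size below are hypothetical.

```python
import numpy as np

def elementary_effects(model, n_params, n_paths=20, delta=0.1, seed=0):
    """Morris-style elementary-effects screening on a cheap metamodel:
    perturb one normalized parameter at a time along random paths and
    average the absolute output changes (mu*)."""
    rng = np.random.default_rng(seed)
    effects = [[] for _ in range(n_params)]
    for _ in range(n_paths):
        x = rng.uniform(0.0, 1.0 - delta, n_params)   # random base point
        y = model(x)
        for k in rng.permutation(n_params):           # random step order
            x[k] += delta                             # one-at-a-time step
            y_new = model(x)
            effects[k].append(abs(y_new - y) / delta)
            y = y_new
    return [float(np.mean(e)) for e in effects]       # mu* per parameter

# hypothetical stand-in for the drilling-process surrogate model
surrogate = lambda p: p[0] ** 2 + 0.5 * p[1] + 0.01 * p[2] + p[0] * p[1]
print(elementary_effects(surrogate, n_params=3))
```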

  9. An integrative multi-dimensional genetic and epigenetic strategy to identify aberrant genes and pathways in cancer

    Directory of Open Access Journals (Sweden)

    Lockwood William W

    2010-05-01

    Background: Genomics has substantially changed our approach to cancer research. Gene expression profiling, for example, has been utilized to delineate subtypes of cancer and has facilitated derivation of predictive and prognostic signatures. The emergence of technologies for the high-resolution, genome-wide description of genetic and epigenetic features has enabled the identification of a multitude of causal DNA events in tumors. This has afforded the potential for large-scale integration of genome and transcriptome data generated from a variety of technology platforms to acquire a better understanding of cancer. Results: Here we show how multi-dimensional genomics data analysis enables the deciphering of mechanisms that disrupt regulatory/signaling cascades and their downstream effects. Since not all gene expression changes observed in a tumor are causal to cancer development, we demonstrate an approach based on multiple concerted disruption (MCD) analysis of genes that facilitates the rational deduction of aberrant genes and pathways, which otherwise would be overlooked in single genomic dimension investigations. Conclusions: Notably, this is the first comprehensive study of breast cancer cells by parallel integrative genome-wide analyses of DNA copy number, LOH, and DNA methylation status to interpret changes in gene expression pattern. Our findings demonstrate the power of a multi-dimensional approach to elucidate events which would escape conventional single-dimensional analysis and, as such, reduce the cohort sample size for cancer gene discovery.

  10. Assessment of multi-dimensional analysis capacity of the MARS using the OECD-SETH PANDA tests

    International Nuclear Information System (INIS)

    Bae, S. W.; Jung, J. J.; Jung, B. D.

    2004-01-01

    The objectives of the OECD/NEA PANDA tests are to validate and assess computer codes that analyze non-condensable gas concentrations and mixing phenomena in a reactor containment building. The main issue is the multi-dimensional analysis capability involved in the mixing of non-condensable gases, i.e., hydrogen. The main tests consist of a superheated steam flow injected into a large vessel initially filled with air or air/helium mixtures; the temperature and concentration of non-condensable gases are then measured. A pre-calculation of the PANDA tests has been performed with MARS, even though MARS is not a containment analysis code. Three cases among the 25 PANDA tests were selected and modeled to simulate the jet plumes and air mixing in a large vessel. The cylindrical vessel, 4 m in diameter and 8 m in height, was simplified to a rectangular geometry for the calculation. The results reveal that the MARS code has the capability to distinguish the multi-dimensional distribution of the velocity and temperature fields.

  11. Principal component analysis of the CT density histogram to generate parametric response maps of COPD

    Science.gov (United States)

    Zha, N.; Capaldi, D. P. I.; Pike, D.; McCormack, D. G.; Cunningham, I. A.; Parraga, G.

    2015-03-01

    Pulmonary x-ray computed tomography (CT) may be used to characterize emphysema and airways disease in patients with chronic obstructive pulmonary disease (COPD). One analysis approach, parametric response mapping (PRM), utilizes registered inspiratory and expiratory CT image volumes and CT-density-histogram thresholds, but there is no consensus regarding the threshold values used, or their clinical meaning. Principal component analysis (PCA) of the CT density histogram can be exploited to quantify emphysema using data-driven CT-density-histogram thresholds. Thus, the objective of this proof-of-concept demonstration was to develop a PRM approach using PCA-derived thresholds in COPD patients and ex-smokers without airflow limitation. Methods: Fifteen COPD ex-smokers and 5 normal ex-smokers were evaluated. Thoracic CT images were acquired at full inspiration and full expiration, and these images were non-rigidly co-registered. PCA was performed on the CT density histograms, and the components with eigenvalues greater than one were summed. Since the values of the principal component curve correlate directly with the variability in the sample, the maximum and minimum points on the curve were used as threshold values for the PCA-adjusted PRM technique. Results: A significant correlation was determined between conventional and PCA-adjusted PRM with 3He MRI apparent diffusion coefficient (p<0.001), with CT RA950 (p<0.0001), as well as with 3He MRI ventilation defect percent, a measurement of both small airways disease (p=0.049 and p=0.06, respectively) and emphysema (p=0.02). Conclusions: PRM generated using PCA thresholds of the CT density histogram showed significant correlations with CT and 3He MRI measurements of emphysema, but not airways disease.
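
    A minimal sketch of the threshold-derivation step, with hypothetical input (`ct_density_histograms.npy` is an assumed file holding one normalized CT density histogram per subject over common HU bins; the paper's exact preprocessing, e.g., any standardization applied before the eigenvalue-greater-than-one rule, is not reproduced here):

        import numpy as np
        from sklearn.decomposition import PCA

        hists = np.load("ct_density_histograms.npy")   # (n_subjects, n_bins), assumed file
        bin_centers = np.linspace(-1000, 0, hists.shape[1])

        pca = PCA().fit(hists)
        keep = pca.explained_variance_ > 1.0           # eigenvalues greater than one
        pc_curve = pca.components_[keep].sum(axis=0)   # summed retained components

        # Extrema of the summed component curve serve as the data-driven
        # thresholds of the PCA-adjusted PRM.
        lo = bin_centers[np.argmin(pc_curve)]
        hi = bin_centers[np.argmax(pc_curve)]
        print(f"PCA-derived thresholds: {lo:.0f} HU and {hi:.0f} HU")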

  12. Multi-Dimensional Quantum Effect Simulation Using a Density-Gradient Model and Script-Level Programming Techniques

    Science.gov (United States)

    Rafferty, Connor S.; Biegel, Bryan A.; Yu, Zhi-Ping; Ancona, Mario G.; Bude, J.; Dutton, Robert W.; Saini, Subhash (Technical Monitor)

    1998-01-01

    A density-gradient (DG) model is used to calculate quantum-mechanical corrections to classical carrier transport in MOS (Metal Oxide Semiconductor) inversion/accumulation layers. The model is compared to measured data and to a fully self-consistent coupled Schrodinger and Poisson equation (SCSP) solver. Good agreement is demonstrated for MOS capacitors with gate oxide as thin as 21 Å. It is then applied to study carrier distribution in ultra-short MOSFETs (Metal Oxide Semiconductor Field Effect Transistor) with surface roughness. This work represents the first implementation of the DG formulation on multidimensional unstructured meshes. It was enabled by a powerful scripting approach which provides an easy-to-use and flexible framework for solving the fourth-order PDEs (Partial Differential Equation) of the DG model.

  13. Characterization of sediments in the Clinch River, Tennessee, using remote sensing and multi-dimensional GIS techniques

    International Nuclear Information System (INIS)

    Levine, D.A.; Hargrove, W.W.; Hoffman, F.

    1995-01-01

    Remotely sensed hydro-acoustic data were used as input to spatial extrapolation tools in a GIS to develop two- and three-dimensional models of sediment densities in the Clinch River arm of Watts Bar Reservoir, Tennessee. This work delineated sediment deposition zones to streamline sediment sampling and to provide a tool for estimating sediment volumes and extrapolating contaminant concentrations throughout the system. The Clinch River arm of Watts Bar Reservoir has been accumulating sediment-bound contaminants from three Department of Energy (DOE) facilities on the Oak Ridge Reservation, Tennessee. Public concern regarding human and ecological health resulted in Watts Bar Reservoir being placed on the National Priorities List for SUPERFUND. As a result, DOE initiated and is funding the Clinch River Environmental Restoration Program (CR-ERP) to perform a remedial investigation to determine the nature and extent of sediment contamination in the Watts Bar Reservoir and the Clinch River and to quantify any human or ecological health risks. The first step in characterizing Clinch River sediments was to determine the locations of deposition zones. It was also important to know the sediment type distribution within deposition zones because most sediment-bound contaminants are preferentially associated with fine particles. A dual-frequency hydro-acoustic survey was performed to determine: (1) depth to the sediment-water interface, (2) depth of the sediment layer, and (3) sediment characteristics (density) with depth (at approximately 0.5-foot intervals). An array of geophysical instruments was used to meet the objectives of this investigation.

  14. IMPLEMENTATION OF THE HISTOGRAM EQUALIZATION METHOD TO IMPROVE DIGITAL IMAGE QUALITY

    Directory of Open Access Journals (Sweden)

    Isa Akhlis

    2012-02-01

    Radiography can be used to help diagnose disease in the medical field. Radiographic images generally still appear blurred, so processing is required to remove or reduce that blur. The aim of this research is to design software to improve the quality of digital X-ray (Roentgen) images by increasing their contrast. One method for increasing the contrast of a digital image is the histogram equalization method, which spreads the image's gray levels evenly over all gray levels. The results show that the histogram equalization method can be used to increase image contrast; the effect can be seen directly on the monitor screen. Keywords: radiographic image, histogram equalization
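
    For reference, a minimal NumPy sketch of global histogram equalization as described here: each gray level is remapped through the normalized cumulative histogram so that gray levels spread across the full range.

        import numpy as np

        def equalize(img):
            """Histogram equalization for an 8-bit grayscale image."""
            hist = np.bincount(img.ravel(), minlength=256)
            cdf = np.cumsum(hist).astype(float)
            cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())
            lut = np.round(255 * cdf).astype(np.uint8)   # gray-level remap table
            return lut[img]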

  15. HPLOT: the graphics interface package for the HBOOK histogramming package

    International Nuclear Information System (INIS)

    Watkins, H.

    1978-01-01

    The subroutine package HPLOT described in this report enables the CERN histogramming package HBOOK to produce high-quality pictures by means of high-resolution devices such as plotters. HPLOT can be implemented on any scientific computing system with a Fortran IV compiler and can be interfaced with any graphics package; special routines in addition to the basic ones enable users to embellish their histograms. Examples are also given of the use of HPLOT as a graphics package for plotting simple pictures without histograms. (Auth.)

  16. Urban agriculture: multi-dimensional tools for social development in poor neighbourhoods

    Directory of Open Access Journals (Sweden)

    E. Duchemin

    2009-01-01

    For over 30 years, different urban agriculture (UA) experiments have been undertaken in Montreal (Quebec, Canada). The Community Gardening Program, managed by the City, and 6 collective gardens, managed by community organizations, are discussed in this article. These experiments have different objectives, including food security, socialization and education. Although these have changed over time, they have also differed depending on geographic location (neighbourhood). The UA initiatives in Montreal have resulted in the development of a centre with significant vegetable production and a socialization and education environment that fosters individual and collective social development in districts with a significant economically disadvantaged population. The various approaches attain the established objectives and are multi-dimensional tools used for the social development of disadvantaged populations.

  17. Face recognition algorithm using extended vector quantization histogram features.

    Science.gov (United States)

    Yan, Yan; Lee, Feifei; Wu, Xueqian; Chen, Qiu

    2018-01-01

    In this paper, we propose a face recognition algorithm based on a combination of vector quantization (VQ) and Markov stationary features (MSF). The VQ algorithm has been shown to be an effective method for generating features; it extracts a codevector histogram as a facial feature representation for face recognition. Still, the VQ histogram features are unable to convey spatial structural information, which to some extent limits their usefulness in discrimination. To alleviate this limitation of VQ histograms, we utilize Markov stationary features (MSF) to extend the VQ histogram-based features so as to add spatial structural information. We demonstrate the effectiveness of our proposed algorithm by achieving recognition results superior to those of several state-of-the-art methods on publicly available face databases.
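
    A hedged sketch of the codevector-histogram idea (k-means stands in here for a generic VQ codebook design; `train_patches` is a hypothetical array of training patches, and the 4x4 patch size and 64-entry codebook are illustrative, not the paper's settings):

        import numpy as np
        from sklearn.cluster import KMeans

        def vq_histogram(img, codebook, patch=4):
            """Quantize each non-overlapping patch of a grayscale face
            image to its nearest codevector and histogram the codes."""
            h, w = img.shape
            blocks = (img[:h - h % patch, :w - w % patch]
                      .reshape(h // patch, patch, w // patch, patch)
                      .swapaxes(1, 2).reshape(-1, patch * patch))
            codes = codebook.predict(blocks.astype(float))
            return np.bincount(codes, minlength=codebook.n_clusters)

        # Codebook learned once from training patches:
        # codebook = KMeans(n_clusters=64, n_init=10).fit(train_patches)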

  18. A Modified Image Comparison Algorithm Using Histogram Features

    OpenAIRE

    Al-Oraiqat, Anas M.; Kostyukova, Natalya S.

    2018-01-01

    This article discusses the problem of color image content comparison. In particular, methods of image content comparison are analyzed, the limitations of color histograms are described, and a modified method of image content comparison is proposed. This method uses color histograms and also considers color locations. Testing and analysis of the base and modified algorithms are performed. The modified method shows 97% average precision on a collection containing about 700 images without loss of the adv...
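
    As a baseline for what the modified method improves on, here is a plain color-histogram comparison sketch (histogram intersection); the location-aware weighting that the article adds is not reproduced here.

        import numpy as np

        def rgb_histogram(img, bins=8):
            """Joint RGB histogram (bins per channel), normalized to sum to 1."""
            h, _ = np.histogramdd(img.reshape(-1, 3).astype(float),
                                  bins=(bins,) * 3, range=((0, 256),) * 3)
            return h.ravel() / h.sum()

        def intersection(h1, h2):
            """Histogram intersection similarity in [0, 1]."""
            return np.minimum(h1, h2).sum()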

  19. On generalized de Rham-Hodge complexes, the related characteristic Chern classes and some applications to integrable multi-dimensional differential systems on Riemannian manifolds

    International Nuclear Information System (INIS)

    Bogolubov, Nikolai N. Jr.; Prykarpatsky, Anatoliy K.

    2006-12-01

    The differential-geometric aspects of generalized de Rham-Hodge complexes naturally related with integrable multi-dimensional differential systems of M. Gromov type, as well as the geometric structure of Chern characteristic classes are studied. Special differential invariants of the Chern type are constructed, their importance for the integrability of multi-dimensional nonlinear differential systems on Riemannian manifolds is discussed. An example of the three-dimensional Davey-Stewartson type nonlinear strongly integrable differential system is considered, its Cartan type connection mapping and related Chern type differential invariants are analyzed. (author)

  20. Efficient Scalable Median Filtering Using Histogram-Based Operations.

    Science.gov (United States)

    Green, Oded

    2018-05-01

    Median filtering is a smoothing technique for noise removal in images. While there are various implementations of median filtering for a single-core CPU, there are few implementations for accelerators and multi-core systems. Many parallel implementations of median filtering use a sorting algorithm for rearranging the values within a filtering window and taking the median of the sorted values. While using sorting algorithms allows for simple parallel implementations, the cost of the sorting becomes prohibitive as the filtering windows grow. This makes such algorithms, sequential and parallel alike, inefficient. In this work, we introduce the first software parallel median filtering that is non-sorting-based. The new algorithm uses efficient histogram-based operations. These reduce the computational requirements of the new algorithm while also accessing the image fewer times. We show an implementation of our algorithm for both the CPU and NVIDIA's CUDA-supported graphics processing unit (GPU). The new algorithm is compared with several other leading CPU and GPU implementations. The CPU implementation shows near-perfect linear scaling on a quad-core system. The GPU implementation is several orders of magnitude faster than the other GPU implementations for mid-size median filters. For small kernels, comparison-based approaches are preferable, as fewer operations are required. Lastly, the new algorithm is open-source and can be found in the OpenCV library.
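
    A compact sketch of the underlying idea for 8-bit images (Huang's running-histogram median, which the paper generalizes and parallelizes; this plain-Python version is for clarity, not speed):

        import numpy as np

        def median_filter_8bit(img, r):
            """Sliding-window median using a running 256-bin histogram:
            each horizontal step swaps one window column in and out
            instead of re-sorting the whole (2r+1)^2 window."""
            img = np.pad(img, r, mode="edge")
            H, W = img.shape
            out = np.empty((H - 2 * r, W - 2 * r), dtype=np.uint8)
            half = ((2 * r + 1) ** 2) // 2 + 1          # rank of the median
            for y in range(r, H - r):
                hist = np.zeros(256, dtype=int)
                for v in img[y - r:y + r + 1, :2 * r + 1].ravel():
                    hist[v] += 1                         # first window of the row
                for x in range(r, W - r):
                    if x > r:                            # slide one column right
                        for v in img[y - r:y + r + 1, x - r - 1]: hist[v] -= 1
                        for v in img[y - r:y + r + 1, x + r]:     hist[v] += 1
                    out[y - r, x - r] = np.searchsorted(np.cumsum(hist), half)
            return out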

  1. Landmark Detection in Orbital Images Using Salience Histograms

    Science.gov (United States)

    Wagstaff, Kiri L.; Panetta, Julian; Schorghofer, Norbert; Greeley, Ronald; Pendleton Hoffer, Mary; Bunte, Melissa

    2010-01-01

    NASA's planetary missions have collected, and continue to collect, massive volumes of orbital imagery. The volume is such that it is difficult to manually review all of the data and determine its significance. As a result, images are indexed and searchable by location and date but generally not by their content. A new automated method analyzes images and identifies "landmarks," or visually salient features such as gullies, craters, dust devil tracks, and the like. This technique uses a statistical measure of salience derived from information theory, so it is not associated with any specific landmark type. It identifies regions that are unusual or that stand out from their surroundings, so the resulting landmarks are context-sensitive areas that can be used to recognize the same area when it is encountered again. A machine learning classifier is used to identify the type of each discovered landmark. Using a specified window size, an intensity histogram is computed for each such window within the larger image (sliding the window across the image). Next, a salience map is computed that specifies, for each pixel, the salience of the window centered at that pixel. The salience map is thresholded to identify landmark contours (polygons) using the upper quartile of salience values. Descriptive attributes are extracted for each landmark polygon: size, perimeter, mean intensity, standard deviation of intensity, and shape features derived from an ellipse fit.
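
    A hedged sketch of a window-histogram salience map in this spirit (the KL divergence from the whole-image histogram is one plausible information-theoretic score, and non-overlapping windows replace the paper's sliding window for brevity):

        import numpy as np

        def salience_map(img, win=32, bins=32):
            """Per-window salience: KL divergence between the window's
            intensity histogram and the global histogram; windows in
            the upper quartile are flagged as candidate landmarks."""
            eps = 1e-12
            g = np.histogram(img, bins=bins, range=(0, 256))[0]
            g = g / g.sum() + eps
            H, W = img.shape
            sal = np.zeros((H // win, W // win))
            for i in range(H // win):
                for j in range(W // win):
                    w = img[i * win:(i + 1) * win, j * win:(j + 1) * win]
                    p = np.histogram(w, bins=bins, range=(0, 256))[0] / win**2 + eps
                    sal[i, j] = np.sum(p * np.log(p / g))
            return sal, sal >= np.quantile(sal, 0.75)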

  2. Multifractal diffusion entropy analysis: Optimal bin width of probability histograms

    Science.gov (United States)

    Jizba, Petr; Korbel, Jan

    2014-11-01

    In the framework of Multifractal Diffusion Entropy Analysis we propose a method for choosing an optimal bin-width in histograms generated from underlying probability distributions of interest. The method presented uses techniques of Rényi’s entropy and the mean squared error analysis to discuss the conditions under which the error in the multifractal spectrum estimation is minimal. We illustrate the utility of our approach by focusing on a scaling behavior of financial time series. In particular, we analyze the S&P500 stock index as sampled at a daily rate in the time period 1950-2013. In order to demonstrate a strength of the method proposed we compare the multifractal δ-spectrum for various bin-widths and show the robustness of the method, especially for large values of q. For such values, other methods in use, e.g., those based on moment estimation, tend to fail for heavy-tailed data or data with long correlations. Connection between the δ-spectrum and Rényi’s q parameter is also discussed and elucidated on a simple example of multiscale time series.

  3. Differential diagnosis of normal pressure hydrocephalus by MRI mean diffusivity histogram analysis.

    Science.gov (United States)

    Ivkovic, M; Liu, B; Ahmed, F; Moore, D; Huang, C; Raj, A; Kovanlikaya, I; Heier, L; Relkin, N

    2013-01-01

    Accurate diagnosis of normal pressure hydrocephalus is challenging because the clinical symptoms and radiographic appearance of NPH often overlap those of other conditions, including age-related neurodegenerative disorders such as Alzheimer and Parkinson diseases. We hypothesized that radiologic differences between NPH and AD/PD can be characterized by a robust and objective MR imaging DTI technique that does not require intersubject image registration or operator-defined regions of interest, thus avoiding many pitfalls common in DTI methods. We collected 3T DTI data from 15 patients with probable NPH and 25 controls with AD, PD, or dementia with Lewy bodies. We developed a parametric model for the shape of intracranial mean diffusivity histograms that separates brain and ventricular components from a third component composed mostly of partial volume voxels. To accurately fit the shape of the third component, we constructed a parametric function named the generalized Voss-Dyke function. We then examined the use of the fitting parameters for the differential diagnosis of NPH from AD, PD, and DLB. Using parameters for the MD histogram shape, we distinguished clinically probable NPH from the 3 other disorders with 86% sensitivity and 96% specificity. The technique yielded 86% sensitivity and 88% specificity when differentiating NPH from AD only. An adequate parametric model for the shape of intracranial MD histograms can distinguish NPH from AD, PD, or DLB with high sensitivity and specificity.

  4. Clarification of the use of chi-square and likelihood functions in fits to histograms

    International Nuclear Information System (INIS)

    Baker, S.; Cousins, R.D.

    1984-01-01

    We consider the problem of fitting curves to histograms in which the data obey multinomial or Poisson statistics. Techniques commonly used by physicists are examined in light of standard results found in the statistics literature. We review the relationship between multinomial and Poisson distributions, and clarify a sufficient condition for equality of the area under the fitted curve and the number of events on the histogram. Following the statisticians, we use the likelihood ratio test to construct a general χ² statistic, χ²_λ, which yields parameter and error estimates identical to those of the method of maximum likelihood. The χ²_λ statistic is further useful for testing goodness-of-fit since the value of its minimum asymptotically obeys a classical chi-square distribution. One should be aware, however, of the potential for statistical bias, especially when the number of events is small. (orig.)
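
    A worked sketch of the likelihood-ratio statistic for Poisson-distributed bins, χ²_λ = 2 Σ_i [μ_i − n_i + n_i ln(n_i/μ_i)] (the n_i = 0 terms reduce to 2μ_i); the Gaussian shape and the optimizer choice are illustrative, not prescribed by the paper.

        import numpy as np
        from scipy.optimize import minimize

        def chi2_lambda(params, centers, n, model):
            """Poisson likelihood chi-square for histogram counts n."""
            mu = np.clip(model(centers, *params), 1e-9, None)
            safe_n = np.where(n > 0, n, 1)
            terms = mu - n + np.where(n > 0, n * np.log(safe_n / mu), 0.0)
            return 2.0 * terms.sum()

        def gauss(x, amp, mean, sigma):
            return amp * np.exp(-0.5 * ((x - mean) / sigma) ** 2)

        rng = np.random.default_rng(1)
        n, edges = np.histogram(rng.normal(5.0, 1.0, 2000), bins=50, range=(0, 10))
        centers = 0.5 * (edges[:-1] + edges[1:])
        res = minimize(chi2_lambda, x0=[100.0, 4.0, 2.0],
                       args=(centers, n, gauss), method="Nelder-Mead")
        # res.fun at the minimum asymptotically obeys a classical
        # chi-square distribution, so it doubles as a goodness-of-fit test.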

  5. Multi-stream LSTM-HMM decoding and histogram equalization for noise robust keyword spotting.

    Science.gov (United States)

    Wöllmer, Martin; Marchi, Erik; Squartini, Stefano; Schuller, Björn

    2011-09-01

    Highly spontaneous, conversational, and potentially emotional and noisy speech is known to be a challenge for today's automatic speech recognition (ASR) systems, which highlights the need for advanced algorithms that improve speech features and models. Histogram equalization is an efficient method to reduce the mismatch between clean and noisy conditions by normalizing all moments of the probability distribution of the feature vector components. In this article, we propose to combine histogram equalization and multi-condition training for robust keyword detection in noisy speech. To better cope with conversational speaking styles, we show how contextual information can be effectively exploited in a multi-stream ASR framework that dynamically models context-sensitive phoneme estimates generated by a long short-term memory neural network. The proposed techniques are evaluated on the SEMAINE database, a corpus containing emotionally colored conversations with a cognitive system for "Sensitive Artificial Listening".

  6. A novel and efficient analytical method for calculation of the transient temperature field in a multi-dimensional composite slab

    International Nuclear Information System (INIS)

    Lu, X; Tervola, P; Viljanen, M

    2005-01-01

    This paper provides an efficient analytical tool for solving the heat conduction equation in a multi-dimensional composite slab subject to generally time-dependent boundary conditions. A temporal Laplace transformation and novel separation of variables are applied to the heat equation. The time-dependent boundary conditions are approximated with Fourier series. Taking advantage of the periodic properties of Fourier series, the corresponding analytical solution is obtained and expressed explicitly through employing variable transformations. For such conduction problems, nearly all the published works necessitate numerical work such as computing residues or searching for eigenvalues even for a one-dimensional composite slab. In this paper, the proposed method involves no numerical iteration. The final closed form solution is straightforward; hence, the physical parameters are clearly shown in the formula. The accuracy of the developed analytical method is demonstrated by comparison with numerical calculations

  7. Development of a multi-dimensional realistic thermal-hydraulic system analysis code, MARS 1.3 and its verification

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Won Jae; Chung, Bub Dong; Jeong, Jae Jun; Ha, Kwi Seok [Korea Atomic Energy Research Institute, Taejon (Korea)

    1998-06-01

    A multi-dimensional realistic thermal-hydraulic system analysis code, MARS version 1.3, has been developed. The main purpose of the MARS 1.3 development is to provide a realistic analysis capability for the transient two-phase thermal-hydraulics of Pressurized Water Reactors (PWRs), especially during Large Break Loss of Coolant Accidents (LBLOCAs), where multi-dimensional phenomena dominate the transients. The MARS code is a unified version of the USNRC-developed COBRA-TF, a three-dimensional (3D) reactor vessel analysis code, and RELAP5/MOD3.2.1.2, a one-dimensional (1D) reactor system analysis code. The developmental requirements for MARS were chosen not only to best utilize the existing capability of the codes but also to enhance code maintenance, user accessibility, user friendliness, code portability, code readability, and code flexibility. To preserve the existing capabilities of the codes and to enhance code maintenance, user accessibility and user friendliness, MARS has been unified into a single code consisting of a 1D module (RELAP5) and a 3D module (COBRA-TF). This is realized by implicitly integrating the system pressure matrix equations of the hydrodynamic models and solving them simultaneously, by modifying the 1D/3D calculation sequence to operate under a single Central Processor Unit (CPU), and by unifying the input structure and the light water property routines of both modules. In addition, the code structure of the 1D module has been completely restructured using the modular data structure of standard FORTRAN 90, which greatly improves code maintenance, readability and portability. For code flexibility, a dynamic memory management scheme is applied in both modules. MARS 1.3 now runs on PC/Windows and HP/UNIX platforms having a single CPU, and users have the option to select the 3D module to model the 3D thermal-hydraulics in the reactor vessel or other

  8. Assessment of the RELAP5 multi-dimensional component model using data from LOFT test L2-5

    International Nuclear Information System (INIS)

    Davis, C.B.

    1998-01-01

    The capability of the RELAP5-3D computer code to perform multi-dimensional analysis of a pressurized water reactor (PWR) was assessed using data from the LOFT L2-5 experiment. The LOFT facility was a 50 MW PWR that was designed to simulate the response of a commercial PWR during a loss-of-coolant accident. Test L2-5 simulated a 200% double-ended cold leg break with an immediate primary coolant pump trip. A three-dimensional model of the LOFT reactor vessel was developed. Calculations of the LOFT L2-5 experiment were performed using the RELAP5-3D Version BF02 computer code. The calculated thermal-hydraulic responses of the LOFT primary and secondary coolant systems were generally in reasonable agreement with the test. The calculated results were also generally as good as or better than those obtained previously with RELAP/MOD3

  9. Lithium Depletion in Solar-like Stars: Effect of Overshooting Based on Realistic Multi-dimensional Simulations

    Science.gov (United States)

    Baraffe, I.; Pratt, J.; Goffrey, T.; Constantino, T.; Folini, D.; Popov, M. V.; Walder, R.; Viallet, M.

    2017-08-01

    We study lithium depletion in low-mass and solar-like stars as a function of time, using a new diffusion coefficient describing extra-mixing taking place at the bottom of a convective envelope. This new form is motivated by multi-dimensional fully compressible, time-implicit hydrodynamic simulations performed with the MUSIC code. Intermittent convective mixing at the convective boundary in a star can be modeled using extreme value theory, a statistical analysis frequently used for finance, meteorology, and environmental science. In this Letter, we implement this statistical diffusion coefficient in a one-dimensional stellar evolution code, using parameters calibrated from multi-dimensional hydrodynamic simulations of a young low-mass star. We propose a new scenario that can explain observations of the surface abundance of lithium in the Sun and in clusters covering a wide range of ages, from ∼50 Myr to ∼4 Gyr. Because it relies on our physical model of convective penetration, this scenario has a limited number of assumptions. It can explain the observed trend between rotation and depletion, based on a single additional assumption, namely, that rotation affects the mixing efficiency at the convective boundary. We suggest the existence of a threshold in stellar rotation rate above which rotation strongly prevents the vertical penetration of plumes and below which rotation has small effects. In addition to providing a possible explanation for the long-standing problem of lithium depletion in pre-main-sequence and main-sequence stars, the strength of our scenario is that its basic assumptions can be tested by future hydrodynamic simulations.

  10. A new improvement on a chemical kinetic model of primary reference fuel for multi-dimensional CFD simulation

    International Nuclear Information System (INIS)

    Zhen, Xudong; Wang, Yang; Liu, Daming

    2016-01-01

    Highlights: • A new optimized chemical kinetic mechanism for PRF is developed. • The mechanism optimization is performed based on CHEMKIN simulations. • More reactions of C0–C1 oxidation are added in the present mechanism. • Good performance of the mechanism is achieved by validation against various reactors and operating conditions. - Abstract: In the present study, a new optimized chemical kinetic reaction mechanism for the oxidation of PRF (primary reference fuel), a surrogate for gasoline, has been developed for the multi-dimensional CFD (computational fluid dynamics) combustion simulations of internal combustion engines. In order to carry out in-depth research on the combustion phenomena of internal combustion engines, an optimized reduced PRF mechanism including more intermediate species and radicals was developed. The developed mechanism consists of iso-octane (C8H18) and n-heptane (C7H16) surrogates and contains 51 species and 193 reactions. Compared with many other PRF mechanisms, more reactions of C0–C1 oxidation (100 reactions) are included in the present mechanism. In order to improve the performance of the model, the developed mechanism focused on improving the prediction of the ignition delay time. The developed mechanism has been validated against various experimental and simulation data, including shock tube data, laminar flame speed data and HCCI (homogeneous charge compression ignition) engine data. The results showed that the developed PRF mechanism was in agreement with the experimental data and other approved reduced mechanisms, and it can be applied to multi-dimensional CFD simulations of internal combustion engines.

  11. Lithium Depletion in Solar-like Stars: Effect of Overshooting Based on Realistic Multi-dimensional Simulations

    International Nuclear Information System (INIS)

    Baraffe, I.; Pratt, J.; Goffrey, T.; Constantino, T.; Viallet, M.; Folini, D.; Popov, M. V.; Walder, R.

    2017-01-01

    We study lithium depletion in low-mass and solar-like stars as a function of time, using a new diffusion coefficient describing extra-mixing taking place at the bottom of a convective envelope. This new form is motivated by multi-dimensional fully compressible, time-implicit hydrodynamic simulations performed with the MUSIC code. Intermittent convective mixing at the convective boundary in a star can be modeled using extreme value theory, a statistical analysis frequently used for finance, meteorology, and environmental science. In this Letter, we implement this statistical diffusion coefficient in a one-dimensional stellar evolution code, using parameters calibrated from multi-dimensional hydrodynamic simulations of a young low-mass star. We propose a new scenario that can explain observations of the surface abundance of lithium in the Sun and in clusters covering a wide range of ages, from ∼50 Myr to ∼4 Gyr. Because it relies on our physical model of convective penetration, this scenario has a limited number of assumptions. It can explain the observed trend between rotation and depletion, based on a single additional assumption, namely, that rotation affects the mixing efficiency at the convective boundary. We suggest the existence of a threshold in stellar rotation rate above which rotation strongly prevents the vertical penetration of plumes and below which rotation has small effects. In addition to providing a possible explanation for the long-standing problem of lithium depletion in pre-main-sequence and main-sequence stars, the strength of our scenario is that its basic assumptions can be tested by future hydrodynamic simulations.

  12. Lithium Depletion in Solar-like Stars: Effect of Overshooting Based on Realistic Multi-dimensional Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Baraffe, I.; Pratt, J.; Goffrey, T.; Constantino, T.; Viallet, M. [Astrophysics Group, University of Exeter, Exeter EX4 4QL (United Kingdom); Folini, D.; Popov, M. V.; Walder, R., E-mail: i.baraffe@ex.ac.uk [Ecole Normale Supérieure de Lyon, CRAL, UMR CNRS 5574, F-69364 Lyon Cedex 07 (France)

    2017-08-10

    We study lithium depletion in low-mass and solar-like stars as a function of time, using a new diffusion coefficient describing extra-mixing taking place at the bottom of a convective envelope. This new form is motivated by multi-dimensional fully compressible, time-implicit hydrodynamic simulations performed with the MUSIC code. Intermittent convective mixing at the convective boundary in a star can be modeled using extreme value theory, a statistical analysis frequently used for finance, meteorology, and environmental science. In this Letter, we implement this statistical diffusion coefficient in a one-dimensional stellar evolution code, using parameters calibrated from multi-dimensional hydrodynamic simulations of a young low-mass star. We propose a new scenario that can explain observations of the surface abundance of lithium in the Sun and in clusters covering a wide range of ages, from ∼50 Myr to ∼4 Gyr. Because it relies on our physical model of convective penetration, this scenario has a limited number of assumptions. It can explain the observed trend between rotation and depletion, based on a single additional assumption, namely, that rotation affects the mixing efficiency at the convective boundary. We suggest the existence of a threshold in stellar rotation rate above which rotation strongly prevents the vertical penetration of plumes and below which rotation has small effects. In addition to providing a possible explanation for the long-standing problem of lithium depletion in pre-main-sequence and main-sequence stars, the strength of our scenario is that its basic assumptions can be tested by future hydrodynamic simulations.

  13. Condition monitoring of face milling tool using K-star algorithm and histogram features of vibration signal

    Directory of Open Access Journals (Sweden)

    C.K. Madhusudana

    2016-09-01

    This paper deals with fault diagnosis of the face milling tool based on a machine learning approach using histogram features and the K-star algorithm. Vibration signals of the milling tool under healthy and different fault conditions are acquired during machining of steel alloy 42CrMo4. Histogram features are extracted from the acquired signals. A decision tree is used to select the salient features out of all the extracted features, and these selected features are used as input to the classifier. The K-star algorithm is used as the classifier, and the output of the model is utilised to study and classify the different conditions of the face milling tool. Based on the experimental results, the K-star algorithm provided a classification accuracy in the range of 94% to 96% with histogram features, which is acceptable for fault diagnosis.
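
    The histogram-feature extraction itself is simple; a minimal sketch (the bin count and framing are illustrative, not the paper's exact settings):

        import numpy as np

        def histogram_features(frame, bins=20):
            """Normalized bin counts over the frame's own amplitude range:
            a fixed-length descriptor of the vibration amplitude
            distribution for one machining frame."""
            counts, _ = np.histogram(frame, bins=bins)
            return counts / counts.sum()

        # Downstream, the paper selects salient bins with a decision tree
        # and classifies with the K-star algorithm (available in Weka).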

  14. Numerical Computation of Wave-Plasma Interactions in Multi-Dimensional Systems

    International Nuclear Information System (INIS)

    D. A. D'Ippolito; J. R. Myra

    2005-01-01

    This project studied two kinds of nonlinear interactions between ion cyclotron range of frequency waves and fusion plasmas. A wavelet technique was also developed for analyzing the complex wave fields produced by wave propagation codes.

  15. Histogram analysis of diffusion kurtosis imaging of nasopharyngeal carcinoma: Correlation between quantitative parameters and clinical stage.

    Science.gov (United States)

    Xu, Xiao-Quan; Ma, Gao; Wang, Yan-Jun; Hu, Hao; Su, Guo-Yi; Shi, Hai-Bin; Wu, Fei-Yun

    2017-07-18

    To evaluate the correlation between histogram parameters derived from diffusion-kurtosis (DK) imaging and the clinical stage of nasopharyngeal carcinoma (NPC). Histogram parameters, including the mean, median, 10th and 90th percentiles, skewness and kurtosis of Dapp and Kapp, were calculated. Patients were divided into low and high T, N and clinical stage based on the American Joint Committee on Cancer (AJCC) staging system. Differences in histogram parameters between low and high T, N and AJCC stages were compared using the t test. Multiple receiver operating characteristic (ROC) curves were used to determine and compare the value of significant parameters in predicting high T, N and AJCC stage, respectively. High T-stage (T3/4) NPC showed significantly higher Kapp-mean (P = 0.018), Kapp-median (P = 0.029) and Kapp-90th (P = 0.003) than low T-stage (T1/2) NPC. High N-stage NPC (N2/3) showed significantly lower Dapp-mean (P = 0.002), Dapp-median (P = 0.002) and Dapp-10th than low N-stage NPC. DK imaging-derived parameters correlated well with the clinical stage of NPC, and could therefore serve as an adjunctive imaging technique for evaluating NPC.

  16. Multipeak Mean Based Optimized Histogram Modification Framework Using Swarm Intelligence for Image Contrast Enhancement

    Directory of Open Access Journals (Sweden)

    P. Babu

    2015-01-01

    A novel approach, the multipeak mean based optimized histogram modification framework (MMOHM), is introduced for enhancing the contrast as well as preserving essential details of any given gray-scale or colour image. The basic idea of this technique is the calculation of multiple peaks (local maxima) from the original histogram. The mean value of the multiple peaks is computed, and the input image's histogram is segmented into two sub-histograms based on this multipeak mean (mmean) value. Then, a bi-criteria optimization problem is formulated, and the sub-histograms are modified by selecting optimal contrast enhancement parameters. While formulating the enhancement parameters, particle swarm optimization is employed to find their optimal values. Finally, the union of the modified sub-histograms produces a contrast-enhanced and detail-preserved output image. This mechanism enhances the contrast of the input image better than existing contemporary HE methods. The performance of the proposed method is well supported by contrast enhancement quantitative metrics such as discrete entropy, the natural image quality evaluator, and absolute mean brightness error.
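
    A simplified sketch of the splitting-and-equalizing core (the multipeak mean is taken as the mean position of the histogram's local maxima; the PSO-optimized enhancement parameters of the full method are omitted):

        import numpy as np
        from scipy.signal import argrelmax

        def mmean_bi_equalize(img):
            """Split the histogram at the multipeak mean and equalize
            each sub-histogram into its own output range."""
            hist = np.bincount(img.ravel(), minlength=256).astype(float)
            peaks = argrelmax(hist, order=3)[0]
            m = int(round(peaks.mean())) if peaks.size else int(img.mean())
            lut = np.zeros(256, dtype=np.uint8)
            for lo, hi in ((0, m), (m + 1, 255)):
                seg = hist[lo:hi + 1]
                if lo > hi or seg.sum() == 0:
                    continue
                cdf = np.cumsum(seg) / seg.sum()
                lut[lo:hi + 1] = np.round(lo + cdf * (hi - lo)).astype(np.uint8)
            return lut[img]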

  17. Discrimination of paediatric brain tumours using apparent diffusion coefficient histograms

    International Nuclear Information System (INIS)

    Bull, Jonathan G.; Clark, Christopher A.; Saunders, Dawn E.

    2012-01-01

    To determine if histograms of apparent diffusion coefficients (ADC) can be used to differentiate paediatric brain tumours. Imaging of histologically confirmed tumours with pre-operative ADC maps was reviewed (54 cases, 32 male, mean age 6.1 years; range 0.1-15.8 years) comprising 6 groups. Whole-tumour ADC histograms were calculated and normalised for volume. Stepwise logistic regression analysis was used to differentiate tumour types using histogram metrics, initially for all groups and then for specific subsets. All 6 groups (5 dysembryoplastic neuroectodermal tumours, 22 primitive neuroectodermal tumours (PNET), 5 ependymomas, 7 choroid plexus papillomas, 4 atypical teratoid rhabdoid tumours (ATRT) and 9 juvenile pilocytic astrocytomas (JPA)) were compared. 74% (40/54) were correctly classified using logistic regression of ADC histogram parameters. In the analysis of posterior fossa tumours, 80% of ependymomas, 100% of astrocytomas and 94% of PNET-medulloblastoma were classified correctly. All PNETs were discriminated from ATRTs (22 PNET and 4 supratentorial ATRTs) (100%). ADC histograms are useful in differentiating paediatric brain tumours, in particular the common posterior fossa tumours of childhood. PNETs were differentiated from supratentorial ATRTs in all cases, which has important implications in terms of clinical management. (orig.)

  18. Defect detection based on extreme edge of defective region histogram

    Directory of Open Access Journals (Sweden)

    Zouhir Wakaf

    2018-01-01

    Automatic thresholding has been used by many applications in image processing and pattern recognition systems, with specific attention given to inspection for quality control purposes in various industries like steel processing and textile manufacturing. The automatic thresholding problem has been addressed well by the commonly used Otsu method, which provides suitable results for thresholding images with a histogram of bimodal distribution. However, the Otsu method fails when the histogram is unimodal or close to unimodal. Defects have different shapes and sizes, ranging from very small to large, and the gray-level distribution of the image histogram can vary between unimodal and multimodal. Furthermore, Otsu-revised methods, like the valley-emphasis method and the background histogram mode extents, which overcome the drawbacks of the Otsu method, require preprocessing steps and fail to use a general threshold for multimodal defects. This study proposes a new automatic thresholding algorithm based on the acquisition of the defective-region histogram and the selection of its extreme edge as the threshold value to segment all defective objects in the foreground from the image background. To evaluate the proposed defect-detection method, common standard images for experimentation were used. Experimental results show that the proposed method outperforms the current methods in terms of defect detection.
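
    For contrast with the proposed extreme-edge rule, a sketch of the classical Otsu baseline that the study improves on (maximize the between-class variance over all candidate gray-level splits):

        import numpy as np

        def otsu_threshold(img):
            """Classical Otsu threshold for an 8-bit grayscale image."""
            hist = np.bincount(img.ravel(), minlength=256).astype(float)
            p = hist / hist.sum()
            omega = np.cumsum(p)                    # class-0 probability
            mu = np.cumsum(p * np.arange(256))      # class-0 first moment
            with np.errstate(divide="ignore", invalid="ignore"):
                sigma_b2 = (mu[-1] * omega - mu) ** 2 / (omega * (1 - omega))
            return int(np.nanargmax(sigma_b2))      # split maximizing between-class variance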

  19. Multi-dimensional Rankings, Program Termination, and Complexity Bounds of Flowchart Programs

    OpenAIRE

    Alias , Christophe; Darte , Alain; Feautrier , Paul; Gonnord , Laure

    2010-01-01

    Proving the termination of a flowchart program can be done by exhibiting a ranking function, i.e., a function from the program states to a well-founded set, which strictly decreases at each program step. A standard method to automatically generate such a function is to compute invariants for each program point and to search for a ranking in a restricted class of functions that can be handled with linear programming techniques. Previous algorithms based on affine rankin...

  20. Simple one-dimensional finite element algorithm with multi-dimensional capabilities

    International Nuclear Information System (INIS)

    Pepper, D.W.; Baker, A.J.

    1978-01-01

    The application of the finite element procedure to the solution of partial differential equations is gaining widespread acceptance. The ability of the finite element procedure to solve arbitrarily shaped problems, as well as its alleviation of boundary-condition problems, is well known. By using local interpolation functionals over each subdomain, or element, a set of linearized algebraic equations is obtained which can be solved using any direct, iterative, or inverse numerical technique. Subsequent use of an explicit or implicit integration procedure permits closure of the solution over the global domain.

  1. Bounding the Computational Complexity of Flowchart Programs with Multi-dimensional Rankings

    OpenAIRE

    Alias , Christophe; Darte , Alain; Feautrier , Paul; Gonnord , Laure

    2010-01-01

    Proving the termination of a flowchart program can be done by exhibiting a ranking function, i.e., a function from the program states to a well-founded set, which strictly decreases at each program step. A standard method to automatically generate such a function is to compute invariants for each program point and to search for a ranking in a restricted class of functions that can be handled with linear programming techniques. Our first contribution is to propose an efficient algorithm to com...

  2. Histogram bin width selection for time-dependent Poisson processes

    International Nuclear Information System (INIS)

    Koyama, Shinsuke; Shinomoto, Shigeru

    2004-01-01

    In constructing a time histogram of the event sequences derived from a nonstationary point process, we wish to determine the bin width such that the mean squared error of the histogram from the underlying rate of occurrence is minimized. We find that the optimal bin widths obtained for a doubly stochastic Poisson process and a sinusoidally regulated Poisson process exhibit different scaling relations with respect to the number of sequences, time scale and amplitude of rate modulation, but both diverge under similar parametric conditions. This implies that under these conditions, no determination of the time-dependent rate can be made. We also apply the kernel method to these point processes, and find that the optimal kernels do not exhibit any critical phenomena, unlike the time histogram method
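
    For practical use, a closely related data-driven criterion from the same line of work (the mean-squared-error cost of Shimazaki and Shinomoto) can be scanned numerically; a hedged sketch:

        import numpy as np

        def best_bin_width(events, widths):
            """Pick the width minimizing C(w) = (2*mean - var) / w**2,
            where mean and var are taken over the per-bin counts."""
            costs = []
            for w in widths:
                edges = np.arange(events.min(), events.max() + w, w)
                k = np.histogram(events, bins=edges)[0]
                costs.append((2 * k.mean() - k.var()) / w ** 2)
            return widths[int(np.argmin(costs))]

        rng = np.random.default_rng(0)
        events = rng.exponential(1.0, 2000).cumsum()   # toy Poisson event times
        print(best_bin_width(events, np.linspace(0.5, 20.0, 40)))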

  3. Histogram bin width selection for time-dependent Poisson processes

    Energy Technology Data Exchange (ETDEWEB)

    Koyama, Shinsuke; Shinomoto, Shigeru [Department of Physics, Graduate School of Science, Kyoto University, Sakyo-ku, Kyoto 606-8502 (Japan)

    2004-07-23

    In constructing a time histogram of the event sequences derived from a nonstationary point process, we wish to determine the bin width such that the mean squared error of the histogram from the underlying rate of occurrence is minimized. We find that the optimal bin widths obtained for a doubly stochastic Poisson process and a sinusoidally regulated Poisson process exhibit different scaling relations with respect to the number of sequences, time scale and amplitude of rate modulation, but both diverge under similar parametric conditions. This implies that under these conditions, no determination of the time-dependent rate can be made. We also apply the kernel method to these point processes, and find that the optimal kernels do not exhibit any critical phenomena, unlike the time histogram method.

  4. Finding significantly connected voxels based on histograms of connection strengths

    DEFF Research Database (Denmark)

    Kasenburg, Niklas; Pedersen, Morten Vester; Darkner, Sune

    2016-01-01

    We explore a new approach for structural connectivity based segmentations of subcortical brain regions. Connectivity based segmentations are usually based on fibre connections from a seed region to predefined target regions. We present a method for finding significantly connected voxels based on the distribution of connection strengths. Paths from seed voxels to all voxels in a target region are obtained from a shortest-path tractography. For each seed voxel we approximate the distribution with a histogram of path scores. We hypothesise that the majority of estimated connections are false-positives and that their connection strength is distributed differently from true-positive connections. Therefore, an empirical null-distribution is defined for each target region as the average normalized histogram over all voxels in the seed region. Single histograms are then tested against the corresponding null-distribution.

  5. THE EFFECT OF HISTOGRAM EQUALIZATION ON DIGITAL IMAGE QUALITY IMPROVEMENT

    Directory of Open Access Journals (Sweden)

    Sisilia Daeng Bakka Mau

    2016-04-01

    This study discusses the use of the histogram equalization method for improving image quality. Image quality improvement (image enhancement) is one of the initial steps in raising the quality of an image. It is needed because the images under study often have poor quality, for example images that are noisy, blurred, too dark or too bright, or insufficiently sharp. Image quality improvement is the process of clarifying and sharpening certain features of an image so that the image is easier to perceive and to analyze more carefully. The results of this study prove that the histogram equalization method can be used to increase image contrast and improve image quality, so that the information in the image is more clearly visible. Keywords: image quality improvement, histogram equalization, digital image

  6. Multi-dimensional Rankings, Program Termination, and Complexity Bounds of Flowchart Programs

    Science.gov (United States)

    Alias, Christophe; Darte, Alain; Feautrier, Paul; Gonnord, Laure

    Proving the termination of a flowchart program can be done by exhibiting a ranking function, i.e., a function from the program states to a well-founded set, which strictly decreases at each program step. A standard method to automatically generate such a function is to compute invariants for each program point and to search for a ranking in a restricted class of functions that can be handled with linear programming techniques. Previous algorithms based on affine rankings either are applicable only to simple loops (i.e., single-node flowcharts) and rely on enumeration, or are not complete in the sense that they are not guaranteed to find a ranking in the class of functions they consider, if one exists. Our first contribution is to propose an efficient algorithm to compute ranking functions: It can handle flowcharts of arbitrary structure, the class of candidate rankings it explores is larger, and our method, although greedy, is provably complete. Our second contribution is to show how to use the ranking functions we generate to get upper bounds for the computational complexity (number of transitions) of the source program. This estimate is a polynomial, which means that we can handle programs with more than linear complexity. We applied the method on a collection of test cases from the literature. We also show the links and differences with previous techniques based on the insertion of counters.

  7. VHDL implementation on histogram with ADC CAMAC module

    International Nuclear Information System (INIS)

    Ruby Santhi, R.; Satyanarayana, V.V.V.; Ajith Kumar, B.P.

    2007-01-01

    Data acquisition and analysis in modern nuclear spectroscopy systems have been undergoing major changes because of demands for faster speed and higher resolution. The CAMAC module described here, an FPGA-based 8K x 24-bit histogram memory integrated with an ADC on a single board, has been designed and fabricated. The module accepts input from a spectroscopy amplifier for pulse height analysis and offers single spectra for a few selected parameters. These online histograms are used to monitor the progress of experiments while they are running.

  8. Integral Histogram with Random Projection for Pedestrian Detection.

    Directory of Open Access Journals (Sweden)

    Chang-Hua Liu

    In this paper, we give a systematic study reporting several deep insights into the HOG descriptor, one of the most widely used features in modern computer vision and image processing applications. We first show that its gradient magnitudes can be randomly projected with a random matrix. To handle over-fitting, an integral histogram based on the differences of randomly selected blocks is proposed. The experiments show that both the random projection and the integral histogram clearly outperform the HOG feature. Finally, the two ideas are combined into a new descriptor termed IHRP, which outperforms the HOG feature with fewer dimensions and higher speed.
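
    A minimal sketch of the integral histogram itself (one cumulative image per bin, so any rectangle's histogram costs four lookups per bin; the random-projection part of IHRP is not reproduced here):

        import numpy as np

        def integral_histogram(img, bins=16):
            """ih[b, y, x] = count of pixels with bin index b inside
            the rectangle img[:y, :x] (half-open)."""
            idx = np.clip((img.astype(int) * bins) // 256, 0, bins - 1)
            onehot = (idx[None] == np.arange(bins)[:, None, None]).astype(np.int32)
            return onehot.cumsum(axis=1).cumsum(axis=2)

        def region_hist(ih, y0, x0, y1, x1):
            """Histogram of img[y0:y1, x0:x1] from four prefix lookups."""
            def at(y, x):
                return ih[:, y - 1, x - 1] if (y > 0 and x > 0) else 0
            return at(y1, x1) - at(y0, x1) - at(y1, x0) + at(y0, x0)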

  9. Similarity from multi-dimensional scaling: solving the accuracy and diversity dilemma in information filtering.

    Directory of Open Access Journals (Sweden)

    Wei Zeng

    Recommender systems are designed to assist individual users to navigate through the rapidly growing amount of information. One of the most successful recommendation techniques is collaborative filtering, which has been extensively investigated and has already found wide applications in e-commerce. One of the challenges in this algorithm is how to accurately quantify the similarities of user pairs and item pairs. In this paper, we employ the multidimensional scaling (MDS) method to measure the similarities between nodes in user-item bipartite networks. The MDS method can extract the essential similarity information from the networks by smoothing out noise, which provides a graphical display of the structure of the networks. With the similarity measured from MDS, we find that the item-based collaborative filtering algorithm can outperform the diffusion-based recommendation algorithms. Moreover, we show that this method tends to recommend unpopular items and increase the global diversification of the networks in the long term.

  10. Similarity from multi-dimensional scaling: solving the accuracy and diversity dilemma in information filtering.

    Science.gov (United States)

    Zeng, Wei; Zeng, An; Liu, Hao; Shang, Ming-Sheng; Zhang, Yi-Cheng

    2014-01-01

    Recommender systems are designed to assist individual users to navigate through the rapidly growing amount of information. One of the most successful recommendation techniques is collaborative filtering, which has been extensively investigated and has already found wide applications in e-commerce. One of the challenges in this algorithm is how to accurately quantify the similarities of user pairs and item pairs. In this paper, we employ the multidimensional scaling (MDS) method to measure the similarities between nodes in user-item bipartite networks. The MDS method can extract the essential similarity information from the networks by smoothing out noise, which provides a graphical display of the structure of the networks. With the similarity measured from MDS, we find that the item-based collaborative filtering algorithm can outperform the diffusion-based recommendation algorithms. Moreover, we show that this method tends to recommend unpopular items and increase the global diversification of the networks in the long term.
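
    A hedged sketch of the embedding step with scikit-learn (the random binary matrix R is a toy stand-in for a real user-item network, and cosine distance is one reasonable choice for the input dissimilarities):

        import numpy as np
        from sklearn.manifold import MDS

        R = (np.random.default_rng(0).random((100, 400)) < 0.05).astype(float)
        unit = R / (np.linalg.norm(R, axis=1, keepdims=True) + 1e-12)
        D = 1.0 - unit @ unit.T                     # cosine distance between users

        coords = MDS(n_components=2, dissimilarity="precomputed",
                     random_state=0).fit_transform(D)
        # Smoothed similarity from proximity in the embedding, which
        # filters noise relative to raw co-rating counts.
        gap = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
        mds_sim = 1.0 / (1.0 + gap)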

  11. Numerical simulation of multi-dimensional two-phase flow based on flux vector splitting

    Energy Technology Data Exchange (ETDEWEB)

    Staedtke, H.; Franchello, G.; Worth, B. [Joint Research Centre - Ispra Establishment (Italy)

    1995-09-01

    This paper describes a new approach to the numerical simulation of transient, multidimensional two-phase flow. The development is based on a fully hyperbolic two-fluid model of two-phase flow using separated conservation equations for the two phases. Features of the new model include the existence of real eigenvalues, and a complete set of independent eigenvectors which can be expressed algebraically in terms of the major dependent flow parameters. This facilitates the application of numerical techniques specifically developed for high speed single-phase gas flows which combine signal propagation along characteristic lines with the conservation property with respect to mass, momentum and energy. Advantages of the new model for the numerical simulation of one- and two- dimensional two-phase flow are discussed.

  12. A study on the multi-dimensional spectral analysis for response of a piping model with two-seismic inputs

    International Nuclear Information System (INIS)

    Suzuki, K.; Sato, H.

    1975-01-01

    Power and cross-power spectrum analysis, by which the vibration characteristics of structures, such as natural frequency, mode of vibration and damping ratio, can be identified, is effective for confirming these characteristics after construction is completed, using the response to small earthquakes or micro-tremors under operating conditions. This method of analysis, previously utilized only for systems with a single input, is here extended to the analysis of a medium-scale model of a piping system subjected to two seismic inputs. The piping system, attached to a three-storied concrete structure model constructed on a shaking table, was excited by earthquake motions. The inputs to the piping system were recorded at the second floor and at the ceiling of the third floor, where the system was attached. The output, the response of the piping system, was measured at a middle point on the system. As a result, the multi-dimensional power spectrum analysis proves effective for a more reliable identification of the vibration characteristics of the multi-input structural system.
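
    The identification step described above rests on standard spectral estimates; as a hedged single-input sketch (the paper's contribution is the extension to two inputs), the code below estimates a frequency response from simulated input and output records via H(f) = S_xy(f) / S_xx(f), with the 5 Hz "structure" and all signal parameters invented for illustration.

        import numpy as np
        from scipy.signal import butter, lfilter, csd, welch

        rng = np.random.default_rng(0)
        fs = 200.0
        x = rng.normal(size=16384)                # recorded input motion
        # Toy "structure": a band-pass filter resonant near 5 Hz.
        b, a = butter(2, [4.0, 6.0], btype='band', fs=fs)
        y = lfilter(b, a, x) + 0.01 * rng.normal(size=x.size)

        f, Sxx = welch(x, fs=fs, nperseg=1024)    # input power spectrum
        _, Sxy = csd(x, y, fs=fs, nperseg=1024)   # cross-power spectrum
        H = Sxy / Sxx                             # frequency response estimate
        print("identified frequency:", f[np.argmax(np.abs(H))], "Hz")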

  13. A multi-dimensional quasi-discrete model for the analysis of Diesel fuel droplet heating and evaporation

    KAUST Repository

    Sazhin, Sergei S.

    2014-08-01

    A new multi-dimensional quasi-discrete model is suggested and tested for the analysis of heating and evaporation of Diesel fuel droplets. As in the original quasi-discrete model suggested earlier, the components of Diesel fuel with close thermodynamic and transport properties are grouped together to form quasi-components. In contrast to the original quasi-discrete model, the new model takes into account the contribution of not only alkanes, but also various other groups of hydrocarbons in Diesel fuels; quasi-components are formed within individual groups. Also, in contrast to the original quasi-discrete model, the contributions of individual components are not approximated by the distribution function of carbon numbers. The formation of quasi-components is based on taking into account the contributions of individual components without any approximations. Groups contributing small molar fractions to the composition of Diesel fuel (less than about 1.5%) are replaced with characteristic components. The actual Diesel fuel is thus simplified to six groups (alkanes, cycloalkanes, bicycloalkanes, alkylbenzenes, indanes & tetralines, and naphthalenes) and three components: C19H34 (tricycloalkane), C13H12 (diaromatic), and C14H10 (phenanthrene). It is shown that the approximation of Diesel fuel by 15 quasi-components and components leads to errors in estimated temperatures and evaporation times in typical Diesel engine conditions not exceeding about 3.7% and 2.5%, respectively, which is acceptable for most engineering applications. © 2014 Published by Elsevier Ltd. All rights reserved.

  14. The multi-dimensional model of Māori identity and cultural engagement: item response theory analysis of scale properties.

    Science.gov (United States)

    Sibley, Chris G; Houkamau, Carla A

    2013-01-01

    We argue that there is a need for culture-specific measures of identity that delineate the factors that most make sense for specific cultural groups. One such measure, recently developed specifically for Māori peoples, is the Multi-Dimensional Model of Māori Identity and Cultural Engagement (MMM-ICE). Māori are the indigenous peoples of New Zealand. The MMM-ICE is a 6-factor measure that assesses the following aspects of identity and cultural engagement as Māori: (a) group membership evaluation, (b) socio-political consciousness, (c) cultural efficacy and active identity engagement, (d) spirituality, (e) interdependent self-concept, and (f) authenticity beliefs. This article examines the scale properties of the MMM-ICE using item response theory (IRT) analysis in a sample of 492 Māori. The MMM-ICE subscales showed reasonably even levels of measurement precision across the latent trait range. Analysis of age (cohort) effects further indicated that most aspects of Māori identification tended to be higher among older Māori, and these cohort effects were similar for both men and women. This study provides novel support for the reliability and measurement precision of the MMM-ICE. The study also provides a first step in exploring change and stability in Māori identity across the life span. A copy of the scale, along with recommendations for scale scoring, is included.

  15. Parallel Implementation of the Multi-Dimensional Spectral Code SPECT3D on large 3D grids.

    Science.gov (United States)

    Golovkin, Igor E.; Macfarlane, Joseph J.; Woodruff, Pamela R.; Pereyra, Nicolas A.

    2006-10-01

    The multi-dimensional collisional-radiative, spectral analysis code SPECT3D can be used to study radiation from complex plasmas. SPECT3D can generate instantaneous and time-gated images and spectra, space-resolved and streaked spectra, which makes it a valuable tool for post-processing hydrodynamics calculations and direct comparison between simulations and experimental data. On large three dimensional grids, transporting radiation along lines of sight (LOS) requires substantial memory and CPU resources. Currently, the parallel option in SPECT3D is based on parallelization over photon frequencies and allows for a nearly linear speed-up for a variety of problems. In addition, we are introducing a new parallel mechanism that will greatly reduce memory requirements. In the new implementation, spatial domain decomposition will be utilized allowing transport along a LOS to be performed only on the mesh cells the LOS crosses. The ability to operate on a fraction of the grid is crucial for post-processing the results of large-scale three-dimensional hydrodynamics simulations. We will present a parallel implementation of the code and provide a scalability study performed on a Linux cluster.

  16. Multi-dimensional optimization of a terawatt seeded tapered Free Electron Laser with a Multi-Objective Genetic Algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Juhao, E-mail: jhwu@SLAC.Stanford.EDU [SLAC National Accelerator Laboratory, Menlo Park, CA 94025 (United States); Hu, Newman [Valley Christian High School, 100 Skyway Drive, San Jose, CA 95111 (United States); Setiawan, Hananiel [The Facility for Rare Isotope Beams, Michigan State University, East Lansing, MI 48824 (United States); Huang, Xiaobiao; Raubenheimer, Tor O. [SLAC National Accelerator Laboratory, Menlo Park, CA 94025 (United States); Jiao, Yi [Institute of High Energy Physics, Chinese Academy of Sciences, Beijing 100049 (China); Yu, George [Columbia University, New York, NY 10027 (United States); Mandlekar, Ajay [California Institute of Technology, Pasadena, CA 91125 (United States); Spampinati, Simone [Sincrotrone Trieste S.C.p.A. di interesse nazionale, Strada Statale 14-km 163,5 in AREA Science Park, 34149 Basovizza, Trieste (Italy); Fang, Kun [SLAC National Accelerator Laboratory, Menlo Park, CA 94025 (United States); Chu, Chungming [The Facility for Rare Isotope Beams, Michigan State University, East Lansing, MI 48824 (United States); Qiang, Ji [Lawrence Berkeley National Laboratory, University of California, Berkeley, CA 94720 (United States)

    2017-02-21

    There is great interest in generating high-power hard X-ray Free Electron Laser (FEL) pulses at the terawatt (TW) level, which could enable coherent diffraction imaging of complex molecules like proteins and probe fundamental high-field physics. A feasibility study of producing such X-ray pulses was carried out employing a configuration beginning with a Self-Amplified Spontaneous Emission FEL, followed by a “self-seeding” crystal monochromator generating a fully coherent seed, and finishing with a long tapered undulator where the coherent seed recombines with the electron bunch and is amplified to high power. The undulator tapering profile, the phase advance in the undulator break sections, the quadrupole focusing strength, etc., are parameters to be optimized. A Genetic Algorithm (GA) is adopted for this multi-dimensional optimization. Concrete examples are given for the LINAC Coherent Light Source (LCLS) and LCLS-II-type systems. An analytical estimate is also developed to cross-check the simulation and optimization results as a quick and complementary tool.
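
    The abstract gives no implementation detail, so the following is a minimal, hedged sketch of a genetic algorithm optimizing a two-parameter taper profile; the analytic "power" function is an invented stand-in for a full FEL simulation, as are the parameter names and ranges.

        import numpy as np

        rng = np.random.default_rng(0)

        # Invented surrogate for FEL power vs. taper start z0 and strength c;
        # in practice each evaluation would be a full simulation run.
        def power(params):
            z0, c = params
            return -((z0 - 30.0) ** 2 + 50.0 * (c - 0.01) ** 2)

        pop = rng.uniform([0.0, 0.0], [100.0, 0.05], size=(40, 2))
        for _ in range(50):
            fitness = np.array([power(p) for p in pop])
            parents = pop[np.argsort(fitness)[-20:]]             # selection
            children = parents[rng.integers(0, 20, 40)].copy()   # reproduction
            children += rng.normal(0.0, [1.0, 0.001], children.shape)  # mutation
            pop = children
        best = pop[np.argmax([power(p) for p in pop])]
        print("best taper parameters:", best)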

  17. A Two-Temperature Open-Source CFD Model for Hypersonic Reacting Flows, Part Two: Multi-Dimensional Analysis †

    Directory of Open Access Journals (Sweden)

    Vincent Casseau

    2016-12-01

    Full Text Available hy2Foam is a newly coded open-source two-temperature computational fluid dynamics (CFD) solver that has previously been validated for zero-dimensional test cases. It aims at (1) giving open-source access to a state-of-the-art hypersonic CFD solver to students and researchers; and (2) providing a foundation for a future hybrid CFD-DSMC (direct simulation Monte Carlo) code within the OpenFOAM framework. This paper focuses on the multi-dimensional verification of hy2Foam and firstly describes the different models implemented. In conjunction with employing the coupled vibration-dissociation-vibration (CVDV) chemistry–vibration model, novel use is made of the quantum-kinetic (QK) rates in a CFD solver. hy2Foam has been shown to produce results in good agreement with previously published data for a Mach 11 nitrogen flow over a blunted cone and with the dsmcFoam code for a Mach 20 cylinder flow for a binary reacting mixture. This latter case scenario provides a useful basis for other codes to compare against.

  18. Multi-dimensional approach of MARS-LMR for the analysis of Phenix End-of-Life natural circulation test

    International Nuclear Information System (INIS)

    Jeong, Hae Yong; Ha, Kwi Seok; Chang, Won Pyo; Lee, Kwi Lim

    2012-01-01

    Phenix is one of the important prototype sodium-cooled fast reactors (SFRs) in the history of nuclear reactor development. It was operated successfully for 35 years by the French Commissariat a l'energie atomique (CEA) and Electricite de France (EdF), achieving its original objectives of demonstrating fast breeder reactor technology and serving as an irradiation facility for innovative fuels and materials. After its final shutdown in 2009, CEA launched the Phenix End-of-Life (EOL) test program. It provided a unique opportunity to generate reliable test data, which are indispensable for the validation and verification of an SFR system analysis code. KAERI joined this international collaboration program of the IAEA CRP and performed pre-test and post-test analyses using one-dimensional modeling in the MARS-LMR code, which had been developed by KAERI for the transient analysis of SFR systems. These previous studies identified limitations in modeling the complicated thermal-hydraulic behavior of the large pool volumes with a one-dimensional approach. Recently, KAERI performed the analysis of the Phenix EOL natural circulation test with multi-dimensional pool modeling, which is detailed below.

  19. Multi-dimensional TOF-SIMS analysis for effective profiling of disease-related ions from the tissue surface.

    Science.gov (United States)

    Park, Ji-Won; Jeong, Hyobin; Kang, Byeongsoo; Kim, Su Jin; Park, Sang Yoon; Kang, Sokbom; Kim, Hark Kyun; Choi, Joon Sig; Hwang, Daehee; Lee, Tae Geol

    2015-06-05

    Time-of-flight secondary ion mass spectrometry (TOF-SIMS) emerges as a promising tool to identify the ions (small molecules) indicative of disease states from the surface of patient tissues. In TOF-SIMS analysis, an enhanced ionization of surface molecules is critical to increase the number of detected ions. Several methods have been developed to enhance ionization capability. However, how these methods improve identification of disease-related ions has not been systematically explored. Here, we present a multi-dimensional SIMS (MD-SIMS) that combines conventional TOF-SIMS and metal-assisted SIMS (MetA-SIMS). Using this approach, we analyzed cancer and adjacent normal tissues first by TOF-SIMS and subsequently by MetA-SIMS. In total, TOF- and MetA-SIMS detected 632 and 959 ions, respectively. Among them, 426 were commonly detected by both methods, while 206 and 533 were detected uniquely by TOF- and MetA-SIMS, respectively. Of the 426 commonly detected ions, 250 increased in their intensities by MetA-SIMS, whereas 176 decreased. The integrated analysis of the ions detected by the two methods resulted in an increased number of discriminatory ions leading to an enhanced separation between cancer and normal tissues. Therefore, the results show that MD-SIMS can be a useful approach to provide a comprehensive list of discriminatory ions indicative of disease states.

  20. Assessment of the physico-chemical behavior of titanium dioxide nanoparticles in aquatic environments using multi-dimensional parameter testing

    International Nuclear Information System (INIS)

    Kammer, Frank von der; Ottofuelling, Stephanie; Hofmann, Thilo

    2010-01-01

    Assessment of the behavior and fate of engineered nanoparticles (ENPs) in natural aquatic media is crucial for the identification of environmentally critical properties of the ENPs. Here we present a methodology for testing the dispersion stability, ζ-potential and particle size of engineered nanoparticles as a function of pH and water composition. The results obtained for two already widely used titanium dioxide nanoparticles (Evonik P25 and Hombikat UV-100) serve as a proof of concept for the proposed testing scheme. In most cases the behavior of the particles in the tested settings follows the expectations derived from classical DLVO theory for metal oxide particles with variable charge and an isoelectric point at around pH 5, but deviations also occur. Regardless of a 5-fold difference in BET specific surface area, particles composed of the same core material behave in an overall comparable manner. The presented methodology can act as a basis for the development of standardised methods for comparing the behavior of different nanoparticles within aquatic systems. - The behavior of engineered nanoparticles in the aquatic environment can be elucidated using a multi-dimensional parameter set acquired by a semi-automated experimental set-up.

  1. Psychometric Properties of Multi-Dimensional Scale of Perceived Social Support in Chinese Parents of Children with Cerebral Palsy

    Directory of Open Access Journals (Sweden)

    Yongli Wang

    2017-11-01

    Full Text Available The Multi-dimensional Scale of Perceived Social Support (MSPSS) is one of the most extensively used instruments to assess social support. The purpose of this research was to test the reliability, factorial validity, concurrent validity and measurement invariance across gender groups of the MSPSS in Chinese parents of children with cerebral palsy. A total of 487 participants aged 21–55 years were recruited to complete the Chinese MSPSS and the Parenting Stress Index-Short Form (PSI-SF). Composite reliability was calculated as the internal consistency of the Chinese MSPSS, and a (multi-group) confirmatory factor analysis (CFA) was conducted to test the factorial validity and measurement invariance across gender. Pearson correlations were calculated to test the relationships between the MSPSS and the PSI-SF. The Chinese MSPSS had satisfactory internal reliability, with composite reliability values of more than 0.7. The CFA indicated that the original three-factor model was replicated in this specific population. Importantly, the results of the multi-group CFA demonstrated that configural, metric, and scalar invariance across gender groups was supported. In addition, all three subscales of the MSPSS were significantly correlated with the PSI-SF. These findings suggest that the Chinese MSPSS is a reliable and valid tool for assessing social support and can generally be utilized across gender in parents of children with cerebral palsy.

  2. Multi-dimensional approach of MARS-LMR for the analysis of Phenix End-of-Life natural circulation test

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Hae Yong; Ha, Kwi Seok; Chang, Won Pyo; Lee, Kwi Lim [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2012-05-15

    Phenix is one of the important prototype sodium-cooled fast reactors (SFRs) in the history of nuclear reactor development. It was operated successfully for 35 years by the French Commissariat a l'energie atomique (CEA) and Electricite de France (EdF), achieving its original objectives of demonstrating fast breeder reactor technology and serving as an irradiation facility for innovative fuels and materials. After its final shutdown in 2009, CEA launched the Phenix End-of-Life (EOL) test program. It provided a unique opportunity to generate reliable test data, which are indispensable for the validation and verification of an SFR system analysis code. KAERI joined this international collaboration program of the IAEA CRP and performed pre-test and post-test analyses using one-dimensional modeling in the MARS-LMR code, which had been developed by KAERI for the transient analysis of SFR systems. These previous studies identified limitations in modeling the complicated thermal-hydraulic behavior of the large pool volumes with a one-dimensional approach. Recently, KAERI performed the analysis of the Phenix EOL natural circulation test with multi-dimensional pool modeling, which is detailed below.

  3. Development of an Output-based Adaptive Method for Multi-Dimensional Euler and Navier-Stokes Simulations

    Science.gov (United States)

    Darmofal, David L.

    2003-01-01

    The use of computational simulations in the prediction of complex aerodynamic flows is becoming increasingly prevalent in the design process within the aerospace industry. Continuing advancements in both computing technology and algorithmic development are ultimately leading to attempts at simulating ever-larger, more complex problems. However, by increasing the reliance on computational simulations in the design cycle, we must also increase the accuracy of these simulations in order to maintain or improve the reliability and safety of the resulting aircraft. At the same time, large-scale computational simulations must be made more affordable so that their potential benefits can be fully realized within the design cycle. Thus, a continuing need exists for increasing the accuracy and efficiency of computational algorithms such that computational fluid dynamics can become a viable tool in the design of more reliable, safer aircraft. The objective of this research was the development of an error estimation and grid adaptation strategy for reducing simulation errors in integral outputs (functionals) such as lift or drag from multi-dimensional Euler and Navier-Stokes simulations. In this final report, we summarize our work during this grant.

  4. Multi-dimensional fission-barrier calculations from Se to the SHE; from the proton to the neutron drip lines

    International Nuclear Information System (INIS)

    Moeller, Peter; Sierk, Arnold J.; Bengtsson, Ragnar; Iwamoto, Akira

    2003-01-01

    We present fission-barrier-height calculations for nuclei throughout the periodic system based on a realistic theoretical model of the multi-dimensional potential-energy surface of a fissioning nucleus. This surface guides the nuclear shape evolution from the ground state, over inner and outer saddle points, to the final configurations of separated fission fragments. We have previously shown that our macroscopic-microscopic nuclear potential-energy model yields calculated 'outer' fission-barrier heights (E_B) for even-even nuclei throughout the periodic system that agree with experimental data to within about 1.0 MeV. We present final results of this work. Just recently we have enhanced our macroscopic-microscopic nuclear potential-energy model to also allow the consideration of axially asymmetric shapes. This shape degree of freedom has a substantial effect on the calculated height (E_A) of the inner peak of some actinide fission barriers. We present examples of fission-barrier calculations by use of this model with its redetermined constants. Finally we discuss what the model now tells us about fission barriers at the end of the r-process nucleosynthesis path. (author)

  5. Transport synthetic acceleration scheme for multi-dimensional neutron transport problems

    Energy Technology Data Exchange (ETDEWEB)

    Modak, R S; Kumar, Vinod; Menon, S V.G. [Theoretical Physics Div., Bhabha Atomic Research Centre, Mumbai (India); Gupta, Anurag [Reactor Physics Design Div., Bhabha Atomic Research Centre, Mumbai (India)

    2005-09-15

    The numerical solution of the linear multi-energy-group neutron transport equation is required in several analyses in nuclear reactor physics and allied areas. Computer codes based on the discrete ordinates (Sn) method are commonly used for this purpose. These codes solve the external-source problem and the K-eigenvalue problem. The overall solution technique involves the solution of a source problem in each energy group as an intermediate procedure. Such a single-group source problem is solved by the so-called Source Iteration (SI) method. As is well known, the SI method converges very slowly for optically thick and highly scattering regions, leading to large CPU times. Over the last three decades, many schemes have been tried to accelerate the SI, the most prominent being the Diffusion Synthetic Acceleration (DSA) scheme. The DSA scheme, however, often fails and is also rather difficult to implement. In view of this, in 1997, Ramone and others developed a new acceleration scheme called Transport Synthetic Acceleration (TSA), which is much more robust and easy to implement. This scheme has recently been incorporated into 2-D and 3-D in-house codes at BARC. This report presents studies on the utility of the TSA scheme for fairly general test problems involving many energy groups and anisotropic scattering. The scheme is found to be useful for problems in Cartesian as well as cylindrical geometry. (author)
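
    As a hedged illustration of why source iteration converges slowly for highly scattering media, the sketch below uses the textbook infinite-homogeneous-medium limit, in which each sweep reduces the error by the scattering ratio c; this is a pedagogical reduction, not the BARC codes or the TSA scheme itself.

        # In an infinite homogeneous medium, SI reduces to the fixed-point
        # iteration phi_{k+1} = c*phi_k + q with exact answer q/(1 - c),
        # so the error contracts by c per sweep: slow when c -> 1.
        def source_iteration(c, q=1.0, tol=1e-8, max_sweeps=10**6):
            phi, sweeps = 0.0, 0
            while sweeps < max_sweeps:
                phi_new = c * phi + q   # transport sweep + scattering source
                sweeps += 1
                if abs(phi_new - phi) < tol:
                    return phi_new, sweeps
                phi = phi_new
            return phi, sweeps

        for c in (0.5, 0.9, 0.99, 0.999):
            phi, n = source_iteration(c)
            print(f"c = {c}: {n} sweeps, phi = {phi:.3f} (exact {1/(1-c):.3f})")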

  6. magHD: a new approach to multi-dimensional data storage, analysis, display and exploitation

    International Nuclear Information System (INIS)

    Angleraud, Christophe

    2014-01-01

    The ever increasing amount of data and processing capability – following the well-known Moore's law – is challenging the way scientists and engineers currently exploit large datasets. Scientific visualization tools, although quite powerful, are often too generic and provide abstract views of phenomena, thus preventing cross-discipline fertilization. On the other hand, Geographic Information Systems allow visually appealing maps to be built, but these often become confusing as more layers are added. Moreover, the introduction of time as a fourth analysis dimension, to allow the analysis of time-dependent phenomena such as meteorological or climate models, is encouraging real-time data exploration techniques in which spatio-temporal points of interest are detected through the human brain's integration of moving images. Magellium has been involved in high-performance image processing chains for satellite imagery, as well as scientific signal analysis and geographic information management, since its creation in 2003. We believe that recent work on big data, GPU and peer-to-peer collaborative processing can open a new breakthrough in data analysis and display that will serve many new applications in collaborative scientific computing, environment mapping and understanding. The magHD (for Magellium Hyper-Dimension) project aims at developing software solutions that bring highly interactive tools for complex dataset analysis and exploration to commodity hardware, targeting small- to medium-scale clusters with expansion capabilities to large cloud-based clusters.

  7. A multi-dimensional functional principal components analysis of EEG data.

    Science.gov (United States)

    Hasenstab, Kyle; Scheffler, Aaron; Telesca, Donatello; Sugar, Catherine A; Jeste, Shafali; DiStefano, Charlotte; Şentürk, Damla

    2017-09-01

    The electroencephalography (EEG) data created in event-related potential (ERP) experiments have a complex high-dimensional structure. Each stimulus presentation, or trial, generates an ERP waveform which is an instance of functional data. The experiments are made up of sequences of multiple trials, resulting in longitudinal functional data and moreover, responses are recorded at multiple electrodes on the scalp, adding an electrode dimension. Traditional EEG analyses involve multiple simplifications of this structure to increase the signal-to-noise ratio, effectively collapsing the functional and longitudinal components by identifying key features of the ERPs and averaging them across trials. Motivated by an implicit learning paradigm used in autism research in which the functional, longitudinal, and electrode components all have critical interpretations, we propose a multidimensional functional principal components analysis (MD-FPCA) technique which does not collapse any of the dimensions of the ERP data. The proposed decomposition is based on separation of the total variation into subject and subunit level variation which are further decomposed in a two-stage functional principal components analysis. The proposed methodology is shown to be useful for modeling longitudinal trends in the ERP functions, leading to novel insights into the learning patterns of children with Autism Spectrum Disorder (ASD) and their typically developing peers as well as comparisons between the two groups. Finite sample properties of MD-FPCA are further studied via extensive simulations. © 2017, The International Biometric Society.
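
    The full two-stage MD-FPCA is involved; as a hedged sketch of its basic building block, the code below performs ordinary functional PCA by eigendecomposition of the sample covariance of curves observed on a common grid, with synthetic ERP-like data standing in for real recordings.

        import numpy as np

        rng = np.random.default_rng(1)
        t = np.linspace(0.0, 1.0, 100)
        # 50 synthetic "ERP" curves: two smooth modes plus noise.
        curves = (rng.normal(size=(50, 1)) * np.sin(2 * np.pi * t)
                  + rng.normal(size=(50, 1)) * np.cos(2 * np.pi * t)
                  + 0.1 * rng.normal(size=(50, 100)))

        mean = curves.mean(axis=0)
        cov = np.cov(curves, rowvar=False)          # sample covariance surface
        evals, evecs = np.linalg.eigh(cov)
        order = np.argsort(evals)[::-1]
        eigenfunctions = evecs[:, order[:2]]        # leading eigenfunctions
        scores = (curves - mean) @ eigenfunctions   # per-curve FPC scores
        print("variance explained:", evals[order[:2]] / evals.sum())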

  8. Transport synthetic acceleration scheme for multi-dimensional neutron transport problems

    International Nuclear Information System (INIS)

    Modak, R.S.; Vinod Kumar; Menon, S.V.G.; Gupta, Anurag

    2005-09-01

    The numerical solution of the linear multi-energy-group neutron transport equation is required in several analyses in nuclear reactor physics and allied areas. Computer codes based on the discrete ordinates (Sn) method are commonly used for this purpose. These codes solve the external-source problem and the K-eigenvalue problem. The overall solution technique involves the solution of a source problem in each energy group as an intermediate procedure. Such a single-group source problem is solved by the so-called Source Iteration (SI) method. As is well known, the SI method converges very slowly for optically thick and highly scattering regions, leading to large CPU times. Over the last three decades, many schemes have been tried to accelerate the SI, the most prominent being the Diffusion Synthetic Acceleration (DSA) scheme. The DSA scheme, however, often fails and is also rather difficult to implement. In view of this, in 1997, Ramone and others developed a new acceleration scheme called Transport Synthetic Acceleration (TSA), which is much more robust and easy to implement. This scheme has recently been incorporated into 2-D and 3-D in-house codes at BARC. This report presents studies on the utility of the TSA scheme for fairly general test problems involving many energy groups and anisotropic scattering. The scheme is found to be useful for problems in Cartesian as well as cylindrical geometry. (author)

  9. Multi-dimensional upwinding-based implicit LES for the vorticity transport equations

    Science.gov (United States)

    Foti, Daniel; Duraisamy, Karthik

    2017-11-01

    Complex turbulent flows such as rotorcraft and wind turbine wakes are characterized by the presence of strong coherent structures that can be compactly described by vorticity variables. The vorticity-velocity formulation of the incompressible Navier-Stokes equations is employed to increase numerical efficiency. Compared to the traditional velocity-pressure formulation, high-order numerical methods and sub-grid scale models for the vorticity transport equation (VTE) have not been fully investigated, and consistent treatment of the convection and stretching terms also needs to be addressed. Our belief is that, by carefully designing sharp gradient-capturing numerical schemes, coherent structures can be more efficiently captured using the vorticity-velocity formulation. In this work, a multidimensional upwind approach for the VTE is developed using the generalized Riemann problem-based scheme devised by Parish et al. (Computers & Fluids, 2016). The algorithm obtains high resolution by augmenting the upwind fluxes with transverse and normal direction corrections. The approach is investigated with several canonical vortex-dominated flows, including isolated and interacting vortices and turbulent flows. The capability of the technique to represent sub-grid scale effects is also assessed. This work was supported by a Navy contract titled "Turbulence Modelling Across Disparate Length Scales for Naval Computational Fluid Dynamics Applications," through Continuum Dynamics, Inc.

  10. Breast lesion characterization using whole-lesion histogram analysis with stretched-exponential diffusion model.

    Science.gov (United States)

    Liu, Chunling; Wang, Kun; Li, Xiaodan; Zhang, Jine; Ding, Jie; Spuhler, Karl; Duong, Timothy; Liang, Changhong; Huang, Chuan

    2018-06-01

    Diffusion-weighted imaging (DWI) has been studied in breast imaging and can provide more information about diffusion, perfusion and other physiological properties of interest than standard pulse sequences. The stretched-exponential model has previously been shown to be more reliable than conventional DWI techniques, but different diagnostic sensitivities were found from study to study. This work investigated the characteristics of whole-lesion histogram parameters derived from the stretched-exponential diffusion model for benign and malignant breast lesions, compared them with the conventional apparent diffusion coefficient (ADC), and further determined which histogram metrics best differentiate malignant from benign lesions. This was a prospective study. Seventy females were included. Multi-b-value DWI was performed on a 1.5T scanner. Whole-lesion histogram parameters for the distributed diffusion coefficient (DDC), heterogeneity index (α), and ADC were calculated by two radiologists and compared among benign lesions, ductal carcinoma in situ (DCIS), and invasive carcinoma confirmed by pathology. Nonparametric tests were performed for comparisons among invasive carcinoma, DCIS, and benign lesions. Comparisons of receiver operating characteristic (ROC) curves were performed to assess the ability to discriminate malignant from benign lesions. The majority of histogram parameters (mean/min/max, skewness/kurtosis, 10th-90th percentile values) from DDC, α, and ADC were significantly different among invasive carcinoma, DCIS, and benign lesions. DDC_10% (area under the curve [AUC] = 0.931), ADC_10% (AUC = 0.893), and α_mean (AUC = 0.787) were found to be the best metrics for differentiating benign from malignant tumors among all histogram parameters derived from DDC, ADC, and α, respectively. The combination of DDC_10% and α_mean, using logistic regression, yielded the highest sensitivity (90.2%) and specificity (95.5%). DDC_10% and α_mean derived from whole-lesion histogram analysis thus show promise for differentiating malignant from benign breast lesions.
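
    As a hedged sketch of the quantities involved, the code below fits the stretched-exponential signal model S(b) = S0 exp(-(b DDC)^α) voxel by voxel to synthetic data and then computes whole-lesion histogram metrics such as the 10th percentile of DDC and the mean of α; the b-value set, noise level, and ROI size are assumptions for illustration, not the study's protocol.

        import numpy as np
        from scipy.optimize import curve_fit
        from scipy.stats import skew, kurtosis

        b = np.array([0.0, 200.0, 400.0, 600.0, 800.0, 1000.0])  # s/mm^2

        def stretched_exp(b, s0, ddc, alpha):
            return s0 * np.exp(-(b * ddc) ** alpha)   # DDC in mm^2/s

        rng = np.random.default_rng(0)
        ddc_map, alpha_map = [], []
        for _ in range(500):                          # voxels in the ROI
            sig = stretched_exp(b, 1.0, 1.2e-3, 0.8)
            sig *= 1.0 + 0.02 * rng.normal(size=b.size)
            (s0, ddc, alpha), _ = curve_fit(
                stretched_exp, b, sig, p0=(1.0, 1e-3, 0.9),
                bounds=([0.0, 1e-5, 0.1], [2.0, 1e-2, 1.0]))
            ddc_map.append(ddc)
            alpha_map.append(alpha)

        ddc_map = np.asarray(ddc_map)
        print("DDC 10th percentile:", np.percentile(ddc_map, 10))
        print("alpha mean:", np.mean(alpha_map))
        print("DDC skewness/kurtosis:", skew(ddc_map), kurtosis(ddc_map))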

  11. Measuring kinetics of complex single ion channel data using mean-variance histograms.

    Science.gov (United States)

    Patlak, J B

    1993-07-01

    The mean-variance histogram technique provided a more credible analysis of the open, closed, and subconductance times for the patch. I also show that the method produces accurate results on simulated data in a wide variety of conditions, whereas the half-amplitude method, when applied to complex simulated data, shows the same errors as were apparent in the real data. The utility and the limitations of this new method are discussed.
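
    The core construction is compact enough to sketch: slide a fixed window along the digitized current record and accumulate (mean, variance) pairs in a two-dimensional histogram, so that dwells at a conductance level appear as low-variance peaks. The window length, bin count, and synthetic trace below are illustrative assumptions, not Patlak's exact procedure.

        import numpy as np

        def mean_variance_histogram(current, window=10, bins=50):
            # (mean, variance) of every length-`window` segment, binned in 2D.
            w = np.lib.stride_tricks.sliding_window_view(current, window)
            return np.histogram2d(w.mean(axis=1), w.var(axis=1), bins=bins)

        rng = np.random.default_rng(0)
        levels = rng.choice([0.0, 2.0], size=500, p=[0.6, 0.4])  # dwell levels
        trace = np.repeat(levels, 10) + 0.2 * rng.normal(size=5000)
        H, mean_edges, var_edges = mean_variance_histogram(trace)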

  12. A demonstration of adjoint methods for multi-dimensional remote sensing of the atmosphere and surface

    International Nuclear Information System (INIS)

    Martin, William G.K.; Hasekamp, Otto P.

    2018-01-01

    Highlights:
    • We demonstrate adjoint methods for atmospheric remote sensing in a two-dimensional setting.
    • Searchlight functions are used to handle the singularity of measurement response functions.
    • Adjoint methods require two radiative transfer calculations to evaluate the measurement misfit function and its derivatives with respect to all unknown parameters.
    • Synthetic retrieval studies show the scalability of adjoint methods to problems with thousands of measurements and unknown parameters.
    • Adjoint methods and the searchlight function technique are generalizable to 3D remote sensing.

    Abstract: In previous work, we derived the adjoint method as a computationally efficient path to three-dimensional (3D) retrievals of clouds and aerosols. In this paper we will demonstrate the use of adjoint methods for retrieving two-dimensional (2D) fields of cloud extinction. The demonstration uses a new 2D radiative transfer solver (FSDOM). This radiation code was augmented with adjoint methods to allow efficient derivative calculations needed to retrieve cloud and surface properties from multi-angle reflectance measurements. The code was then used in three synthetic retrieval studies. Our retrieval algorithm adjusts the cloud extinction field and surface albedo to minimize the measurement misfit function with a gradient-based, quasi-Newton approach. At each step we compute the value of the misfit function and its gradient with two calls to the solver FSDOM. First we solve the forward radiative transfer equation to compute the residual misfit with measurements, and second we solve the adjoint radiative transfer equation to compute the gradient of the misfit function with respect to all unknowns. The synthetic retrieval studies verify that adjoint methods are scalable to retrieval problems with many measurements and unknowns. We can retrieve the vertically-integrated optical depth of moderately thick clouds as a function of the horizontal coordinate. It is also
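
    The computational pattern stated in the highlights (one forward solve for the misfit, one adjoint solve for its gradient with respect to all unknowns) can be sketched with a toy linear forward operator, whose adjoint is simply the transpose; this is an analogy for exposition, not the FSDOM radiative transfer code.

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(0)
        A = rng.normal(size=(30, 100))   # toy forward operator: 30 measurements,
        x_true = rng.normal(size=100)    # 100 unknowns (e.g. extinction values)
        d = A @ x_true                   # synthetic measurements

        def misfit_and_gradient(x):
            r = A @ x - d                # "forward solve" -> residual
            grad = A.T @ r               # "adjoint solve" -> full gradient
            return 0.5 * r @ r, grad

        # Gradient-based quasi-Newton minimization, as in the paper's setup.
        res = minimize(misfit_and_gradient, np.zeros(100),
                       jac=True, method='L-BFGS-B')
        print("final misfit:", res.fun)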

  13. Comments on 'Reconsidering the definition of a dose-volume histogram'-dose-mass histogram (DMH) versus dose-volume histogram (DVH) for predicting radiation-induced pneumonitis

    International Nuclear Information System (INIS)

    Mavroidis, Panayiotis; Plataniotis, Georgios A; Gorka, Magdalena Adamus; Lind, Bengt K

    2006-01-01

    In a recently published paper (Nioutsikou et al 2005 Phys. Med. Biol. 50 L17) the authors showed that the dose-mass histogram (DMH) concept is a more accurate descriptor of the dose delivered to lung than the traditionally used dose-volume histogram (DVH) concept. Furthermore, they state that if a functional imaging modality could also be registered to the anatomical imaging modality, providing a functional weighting across the organ (functional mass), then the more general and realistic concept of the dose-functioning mass histogram (D[F]MH) could be an even more appropriate descriptor. The comments of the present letter to the editor are in line with the basic arguments of that work, since their general conclusions appear to be supported by the comparison of the DMH and DVH concepts using radiobiological measures. In this study, it is examined whether the dose-mass histogram (DMH) concept deviates significantly from the widely used dose-volume histogram (DVH) concept regarding the expected lung complications, and whether there are clinical indications supporting these results. The problem was investigated theoretically by applying two hypothetical dose distributions (Gaussian and semi-Gaussian shaped) to two lungs of uniform and varying densities. The influence of the deviation between DVHs and DMHs on the treatment outcome was estimated using the relative seriality and LKB models with the Gagliardi et al (2000 Int. J. Radiat. Oncol. Biol. Phys. 46 373) and Seppenwoolde et al (2003 Int. J. Radiat. Oncol. Biol. Phys. 55 724) parameter sets for radiation pneumonitis, respectively. Furthermore, the biological equivalent of their difference was estimated by the biologically effective uniform dose (D-bar) and equivalent uniform dose (EUD) concepts, respectively. It is shown that the relation between the DVHs and DMHs varies depending on the underlying cell density distribution and the applied dose distribution. However, the range of their deviation in terms of

  14. Fuzzy Logic-Based Histogram Equalization for Image Contrast Enhancement

    Directory of Open Access Journals (Sweden)

    V. Magudeeswaran

    2013-01-01

    Full Text Available Fuzzy logic-based histogram equalization (FHE) is proposed for image contrast enhancement. The FHE consists of two stages. First, a fuzzy histogram is computed based on fuzzy set theory to handle the inexactness of gray-level values better than classical crisp histograms. In the second stage, the fuzzy histogram is divided into two sub-histograms based on the median value of the original image, and the two are equalized independently to preserve image brightness. The qualitative and quantitative analyses of the proposed FHE algorithm are carried out using two well-known measures, average information content (AIC) and the natural image quality evaluator (NIQE) index, for various images. From the qualitative and quantitative measures, it is interesting to see that the proposed method provides optimal results, giving better contrast enhancement while preserving the local information of the original image. Experimental results show that the proposed method can effectively and significantly eliminate the washed-out appearance and adverse artifacts induced by several existing methods. The proposed method has been tested on several images and gives better visual quality compared with conventional methods.
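
    The second stage (median split plus independent equalization) has a crisp, non-fuzzy analogue that is easy to sketch; the code below implements that brightness-preserving bi-histogram equalization for an 8-bit grayscale image, with the fuzzy-histogram first stage omitted.

        import numpy as np

        def bi_histogram_equalize(img):
            # Split at the median and equalize each half within its own
            # sub-range, which tends to preserve the mean brightness.
            med = np.median(img)
            out = np.empty_like(img, dtype=np.uint8)
            halves = ((0.0, med, img <= med), (med, 255.0, img > med))
            for lo, hi, mask in halves:
                vals = img[mask]
                hist, edges = np.histogram(vals, bins=256, range=(0, 255))
                cdf = hist.cumsum() / max(vals.size, 1)
                mapped = lo + cdf * (hi - lo)      # remap into [lo, hi]
                out[mask] = np.interp(vals, edges[:-1], mapped).astype(np.uint8)
            return out

        rng = np.random.default_rng(0)
        dark = (rng.random((64, 64)) * 128).astype(np.uint8)  # synthetic image
        enhanced = bi_histogram_equalize(dark)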

  15. Hybrid Histogram Descriptor: A Fusion Feature Representation for Image Retrieval.

    Science.gov (United States)

    Feng, Qinghe; Hao, Qiaohong; Chen, Yuqi; Yi, Yugen; Wei, Ying; Dai, Jiangyan

    2018-06-15

    Currently, visual sensors are becoming increasingly affordable and widespread, accelerating the growth of image data. Image retrieval has attracted increasing interest due to space exploration, industrial, and biomedical applications. Nevertheless, designing an effective feature representation is acknowledged as a hard yet fundamental issue. This paper presents a fusion feature representation called the hybrid histogram descriptor (HHD) for image retrieval. The proposed descriptor combines two histograms: a perceptually uniform histogram, extracted by exploiting the color and edge-orientation information in perceptually uniform regions; and a motif co-occurrence histogram, acquired by calculating the probability of a pair of motif patterns. To evaluate the performance, we benchmarked the proposed descriptor on the RSSCN7, AID, Outex-00013, Outex-00014 and ETHZ-53 datasets. Experimental results suggest that the proposed descriptor is more effective and robust than ten recent fusion-based descriptors under the content-based image retrieval framework. The computational complexity was also analyzed to give an in-depth evaluation. Furthermore, compared with state-of-the-art convolutional neural network (CNN)-based descriptors, the proposed descriptor achieves comparable performance without requiring any training process.
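
    A faithful HHD implementation is longer than an abstract allows; as a hedged, simplified stand-in, the sketch below fuses a color histogram with an edge-orientation histogram and compares descriptors with an L1 distance, whereas the real HHD works over perceptually uniform regions and motif co-occurrence statistics.

        import numpy as np

        def fused_descriptor(img_hsv, gx, gy, color_bins=32, ori_bins=18):
            # Concatenate a (normalized) hue histogram with a histogram of
            # gradient orientations: a crude analogue of histogram fusion.
            h = np.histogram(img_hsv[..., 0], bins=color_bins, range=(0, 1))[0]
            o = np.histogram(np.arctan2(gy, gx), bins=ori_bins,
                             range=(-np.pi, np.pi))[0]
            return np.concatenate([h / max(h.sum(), 1), o / max(o.sum(), 1)])

        def l1_distance(d1, d2):
            return np.abs(d1 - d2).sum()   # smaller means more similar

        rng = np.random.default_rng(0)
        img = rng.random((64, 64, 3))      # synthetic HSV-like image
        gy, gx = np.gradient(img[..., 2])  # gradients of the V channel
        desc = fused_descriptor(img, gx, gy)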

  16. Improved LSB matching steganography with histogram characters reserved

    Science.gov (United States)

    Chen, Zhihong; Liu, Wenyao

    2008-03-01

    This letter builds on research into the LSB (least significant bit, i.e. the last bit of a binary pixel value) matching steganographic method and on steganalytic methods that target the histograms of cover images, and proposes a modification to LSB matching. In LSB matching, if the LSB of the next cover pixel matches the next bit of secret data, nothing is done; otherwise, one is added to or subtracted from the cover pixel value at random. In our improved method, a steganographic information table is defined which records the changes introduced by the embedded secret bits. Using the table, the choice of adding or subtracting one for the next pixel with the same value is made dynamically, so that the change to the histogram of the cover image is minimized. The modified method therefore embeds the same payload as LSB matching but with improved steganographic security and less vulnerability to attacks. Experimental results show that the histograms maintain their attributes, such as peak values and overall trends, to an acceptable degree, and that the new method outperforms LSB matching in terms of histogram distortion and resistance against existing steganalysis.
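
    For orientation, the sketch below implements the baseline LSB matching embedder that the letter starts from: where the cover pixel's LSB already equals the secret bit nothing is done, otherwise one is added or subtracted at random. The proposed histogram-aware choice of sign via the information table is only indicated in a comment, since the abstract does not give its exact bookkeeping.

        import numpy as np

        def lsb_match_embed(pixels, bits, rng):
            # Baseline LSB matching; the improved method above would replace
            # rng.choice with a table-driven choice that minimizes the running
            # change to the image histogram.
            out = pixels.copy()
            for i, bit in enumerate(bits):
                if (out[i] & 1) != bit:
                    step = rng.choice((-1, 1))
                    if out[i] == 0:
                        step = 1           # stay inside [0, 255]
                    elif out[i] == 255:
                        step = -1
                    out[i] += step
            return out

        rng = np.random.default_rng(0)
        cover = rng.integers(0, 256, size=1000, dtype=np.int32)
        secret = rng.integers(0, 2, size=200)
        stego = lsb_match_embed(cover, secret, rng)
        assert np.all((stego[:200] & 1) == secret)   # payload is recoverable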

  17. The development of a collapsing method for the mixed group and point cross sections and its application on multi-dimensional deep penetration calculations

    International Nuclear Information System (INIS)

    Bor-Jing Chang; Yen-Wan H. Liu

    1992-01-01

    The HYBRID, or mixed group and point, method was developed to solve the neutron transport equation deterministically using a detailed treatment at cross-section minima for deep penetration calculations. Its application so far has been limited to one-dimensional calculations due to the enormous computing time involved in multi-dimensional calculations. In this article, a collapsing method is developed for the mixed group and point cross-section sets to provide a more direct and practical way of using the HYBRID method in multi-dimensional calculations. A test problem is run. The method is then applied to the calculation of a deep penetration benchmark experiment. It is observed that half of the window effect is smeared in the collapsing treatment, but it still provides a better cross-section set than the VITAMIN-C cross sections for deep penetration calculations.
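
    The abstract does not spell out the collapsing step itself; the sketch below shows the standard flux-weighted group collapse that any such scheme builds on, sigma_G = sum_g(sigma_g * phi_g) / sum_g(phi_g) over the fine groups g in coarse group G, with invented fine-group data and weighting flux.

        import numpy as np

        def collapse(sigma_fine, phi_fine, group_bounds):
            # Flux-weighted average of fine-group cross sections within
            # each coarse group defined by consecutive bounds.
            sigma_coarse = []
            for lo, hi in zip(group_bounds[:-1], group_bounds[1:]):
                w = phi_fine[lo:hi]
                sigma_coarse.append(np.dot(sigma_fine[lo:hi], w) / w.sum())
            return np.array(sigma_coarse)

        sigma_fine = np.linspace(2.0, 0.5, 100)         # invented fine-group data
        phi_fine = np.exp(-np.linspace(0.0, 3.0, 100))  # invented flux weight
        print(collapse(sigma_fine, phi_fine, [0, 30, 70, 100]))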

  18. Effect of multi-dimensional ultraviolet light exposure on the growth of pentacene film and application to organic field-effect transistors.

    Science.gov (United States)

    Bae, Jin-Hyuk; Lee, Sin-Doo; Choi, Jong Sun; Park, Jaehoon

    2012-05-01

    We report on the multi-dimensional alignment of pentacene molecules on a poly(methyl methacrylate)-based photosensitive polymer (PMMA-polymer) and its effect on the electrical performance of the pentacene-based field-effect transistor (FET). Pentacene molecules are shown to be preferentially aligned on the linearly polarized ultraviolet (LPUV)-exposed PMMA-polymer layer, in contrast to their isotropic alignment on the bare PMMA-polymer layer. Multi-dimensional alignment of pentacene molecules in the film could be achieved by adjusting the direction of the LPUV exposure of the PMMA-polymer. The control of pentacene molecular alignment is found to be promising for field-effect mobility enhancement in the pentacene FET.

  19. Large Break LOCA Analysis with New Downcomer Nodalization and Multi-Dimensional Model and Effect of Cross Flow Option in MARS Code

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Hyung-wook; Lee, Sang-yong; Oh, Seung-jong; Kim, Woong-bae [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2016-10-15

    The phenomena of a LOCA have been investigated for a long time; the most extensive research project was the 2D/3D program of experiments, whose results show that flow conditions in the downcomer during end-of-blowdown were highly multi-dimensional at full scale. In this paper, the authors modified the nodalization of the MARS code LBLOCA input deck and performed an LBLOCA analysis with the new input deck. An LBLOCA analysis for the APR1400 with the new downcomer input deck was conducted using KREM with the MARS-KS 1.4 version code; the analyzed case was an LBLOCA with a 100% break of the cold leg. The authors developed input decks with a new downcomer nodalization and a multi-dimensional downcomer model, performed the LOCA analysis with the new input decks, and compared the results with the existing analysis. The peak cladding temperature (PCT) from the new and multi-dimensional input decks follows a trend similar to that of the original input deck, but drops more rapidly. The PCT from the new and multi-dimensional input decks satisfies the PCT design limit, so no acceptance-criteria issue arises even when these decks are applied to the LBLOCA analysis. In a future study, a comparative analysis with experimental results will be performed.

  20. Multi-dimensional Mixing Behavior of Steam-Water Flow in a Downcomer Annulus during LBLOCA Reflood Phase with a DVI Injection Mode

    International Nuclear Information System (INIS)

    Kwon, T.S.; Yun, B.J.; Euh, D.J.; Chu, I.C.; Song, C.H.

    2002-01-01

    Multi-dimensional thermal-hydraulic behavior in the downcomer annulus of a pressurized water reactor vessel with a Direct Vessel Injection (DVI) mode is presented, based on experimental observations in the MIDAS (Multi-dimensional Investigation in Downcomer Annulus Simulation) steam-water test facility. From the steady-state tests simulating the late reflood phase of a Large Break Loss-of-Coolant Accident (LBLOCA), isothermal lines show very well the multidimensional phenomena of phasic interaction between steam and water in the downcomer annulus. MIDAS is a steam-water separate-effect test facility, a 1/4.93 linearly scaled-down model of a 1400 MWe PWR-type nuclear reactor, focused on understanding multi-dimensional thermal-hydraulic phenomena in the downcomer annulus with various types of safety injection during the refill or reflood phase of an LBLOCA. The initial and boundary conditions are scaled from the pre-test analysis based on a preliminary calculation using the TRAC code. Superheated steam with a superheating degree of 80 K at a given downcomer pressure of 180 kPa is injected equally through three intact cold legs into the downcomer. (authors)

  1. 3D facial expression recognition based on histograms of surface differential quantities

    KAUST Repository

    Li, Huibin; Morvan, Jean-Marie; Chen, Liming

    2011-01-01

    To characterize shape information of the local neighborhood of facial landmarks, we calculate the weighted statistical distributions of surface differential quantities, including the histogram of mesh gradient (HoG) and the histogram of shape index (HoS). Normal cycle

  2. Development and validation of the Bullying and Cyberbullying Scale for Adolescents: A multi-dimensional measurement model.

    Science.gov (United States)

    Thomas, Hannah J; Scott, James G; Coates, Jason M; Connor, Jason P

    2018-05-03

    Intervention on adolescent bullying relies on valid and reliable measurement of victimization and perpetration experiences across different behavioural expressions. This study developed and validated a survey tool that integrates measurement of both traditional and cyber bullying to test a theoretically driven multi-dimensional model. Adolescents from 10 mainstream secondary schools completed a baseline and follow-up survey (N = 1,217; M_age = 14 years; 66.2% male). The Bullying and Cyberbullying Scale for Adolescents (BCS-A) developed for this study comprised parallel victimization and perpetration subscales, each with 20 items. Additional measures of bullying (Olweus Global Bullying and the Forms of Bullying Scale [FBS]), as well as measures of internalizing and externalizing problems, school connectedness, social support, and personality, were used to further assess validity. The factor structure was determined, and the suitability of items was then assessed according to the following criteria: (1) factor interpretability, (2) item correlations, (3) model parsimony, and (4) measurement equivalence across victimization and perpetration experiences. The final models comprised four factors: physical, verbal, relational, and cyber. The final scale was revised to two 13-item subscales. The BCS-A demonstrated acceptable concurrent and convergent validity (internalizing and externalizing problems, school connectedness, social support, and personality), as well as predictive validity over 6 months. The BCS-A has sound psychometric properties. This tool establishes measurement equivalence across types of involvement and behavioural forms common among adolescents. An improved measurement method could add greater rigour to the evaluation of intervention programmes and also enable interventions to be tailored to subscale profiles. © 2018 The British Psychological Society.

  3. SU-F-T-312: Identifying Distinct Radiation Therapy Plan Classes Through Multi-Dimensional Analysis of Plan Complexity Metrics

    Energy Technology Data Exchange (ETDEWEB)

    Desai, V; Labby, Z; Culberson, W [University of Wisc Madison, Madison, WI (United States)

    2016-06-15

    Purpose: To determine whether body site-specific treatment plans form unique “plan class” clusters in a multi-dimensional analysis of plan complexity metrics such that a single beam quality correction determined for a representative plan could be universally applied within the “plan class”, thereby increasing the dosimetric accuracy of a detector’s response within a subset of similarly modulated nonstandard deliveries. Methods: We collected 95 clinical volumetric modulated arc therapy (VMAT) plans from four body sites (brain, lung, prostate, and spine). The lung data was further subdivided into SBRT and non-SBRT data for a total of five plan classes. For each control point in each plan, a variety of aperture-based complexity metrics were calculated and stored as unique characteristics of each patient plan. A multiple comparison of means analysis was performed such that every plan class was compared to every other plan class for every complexity metric in order to determine which groups could be considered different from one another. Statistical significance was assessed after correcting for multiple hypothesis testing. Results: Six out of a possible 10 pairwise plan class comparisons were uniquely distinguished based on at least nine out of 14 of the proposed metrics (Brain/Lung, Brain/SBRT lung, Lung/Prostate, Lung/SBRT Lung, Lung/Spine, Prostate/SBRT Lung). Eight out of 14 of the complexity metrics could distinguish at least six out of the possible 10 pairwise plan class comparisons. Conclusion: Aperture-based complexity metrics could prove to be useful tools to quantitatively describe a distinct class of treatment plans. Certain plan-averaged complexity metrics could be considered unique characteristics of a particular plan. A new approach to generating plan-class specific reference (pcsr) fields could be established through a targeted preservation of select complexity metrics or a clustering algorithm that identifies plans exhibiting similar
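
    The statistical machinery described (pairwise comparisons with a multiple-testing correction) is easy to reproduce in outline; the sketch below uses pairwise Mann-Whitney tests with a simplified Holm correction on invented complexity values, which is a stand-in for, not a reconstruction of, the authors' exact analysis.

        import numpy as np
        from itertools import combinations
        from scipy.stats import mannwhitneyu

        rng = np.random.default_rng(0)
        # Invented per-plan complexity metric for three hypothetical classes.
        classes = {'brain': rng.normal(0.30, 0.05, 20),
                   'lung': rng.normal(0.45, 0.05, 20),
                   'prostate': rng.normal(0.32, 0.05, 20)}

        pairs = list(combinations(classes, 2))
        pvals = [mannwhitneyu(classes[a], classes[b]).pvalue for a, b in pairs]

        # Simplified Holm step-down correction (a full implementation also
        # enforces monotonicity of the adjusted p-values).
        m = len(pvals)
        for rank, k in enumerate(np.argsort(pvals)):
            adj = min((m - rank) * pvals[k], 1.0)
            print(pairs[k], "adjusted p =", round(adj, 4))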

  4. [Multi-dimensional exploration of the characteristics of emotional regulation in children with attention-deficit/hyperactivity disorder].

    Science.gov (United States)

    Yu, Xiaoyan; Liu, Lu; Sun, Li; Qian, Ying; Qian, Qiujin; Wu, Zhaomin; Cao, Qingjiu; Wang, Yufeng

    2015-10-20

    To explore the characteristics of emotional regulation in children with attention-deficit/hyperactivity disorder (ADHD), two hundred and eighty-two children diagnosed with ADHD according to the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (DSM-IV) were recruited from the child psychiatric clinic of Peking University Sixth Hospital/Institute of Mental Health from August 2012 to April 2014, and 260 normal children from local primary schools were selected as the healthy control group. The emotional factors or items of the Conners' Parent Rating Scale, the Behavior Rating Inventory of Executive Function (BRIEF), Achenbach's Child Behavior Checklist (CBCL) and the Rutter Children Behavior Questionnaire were used to assess the characteristics of emotional regulation multi-dimensionally. After controlling for the effects of age, sex and intelligence quotient (IQ), the emotional lability (EL) scores of the ADHD group on the Conners scale were significantly higher than those of the healthy control group [(4.3±2.6) vs (1.4±1.5)]; the emotional control (ECTRL) scores of the ADHD group were significantly higher than those of the control group [(16.1±4.4) vs (12.0±2.5)]; the deficient emotional self-regulation (DESR) scores of the ADHD group were significantly higher than those of the control group [(26.8±11.0) vs (6.6±6.8)]; and the emotional symptoms (ES) scores of the ADHD group were significantly higher than those of the control group [(2.7±2.0) vs (1.7±1.5)]. These results indicate that children with ADHD show broad deficits in emotional regulation.

  5. Ethnicity, work-related stress and subjective reports of health by migrant workers: a multi-dimensional model.

    Science.gov (United States)

    Capasso, Roberto; Zurlo, Maria Clelia; Smith, Andrew P

    2018-02-01

    This study integrates different aspects of ethnicity and work-related stress dimensions (based on the Demands-Resources-Individual-Effects model, DRIVE [Mark, G. M., and A. P. Smith. 2008. "Stress Models: A Review and Suggested New Direction." In Occupational Health Psychology, edited by J. Houdmont and S. Leka, 111-144. Nottingham: Nottingham University Press]) and aims to test a multi-dimensional model that combines individual differences, ethnicity dimensions, work characteristics, and perceived job satisfaction/stress as independent variables in the prediction of subjective reports of health by workers differing in ethnicity. A questionnaire consisting of the following sections was administered to 900 workers in Southern Italy: for individual and cultural characteristics, coping strategies, personality behaviours, and acculturation strategies; for work characteristics, perceived job demands and job resources/rewards; for appraisals, perceived job stress/satisfaction and racial discrimination; and for subjective reports of health, psychological disorders and general health. Analyses were conducted to test the reliability and construct validity of the factors extracted for all dimensions of the proposed model, and logistic regression analyses were used to evaluate the main effects of the independent variables on the health outcomes. Principal component analysis (PCA) yielded seven factors for individual and cultural characteristics (emotional/relational coping, objective coping, Type A behaviour, negative affectivity, social inhibition, affirmation/maintenance culture, and search identity/adoption of the host culture); three factors for work characteristics (work demands, intrinsic/extrinsic rewards, and work resources); three factors for appraisals (perceived job satisfaction, perceived job stress, and perceived racial discrimination); and three factors for subjective reports of health (interpersonal disorders, anxious-depressive disorders, and general health). Logistic

  6. ADC histogram analysis for adrenal tumors: histogram analysis of apparent diffusion coefficient in differentiating adrenal adenoma from pheochromocytoma.

    Science.gov (United States)

    Umanodan, Tomokazu; Fukukura, Yoshihiko; Kumagae, Yuichi; Shindo, Toshikazu; Nakajo, Masatoyo; Takumi, Koji; Nakajo, Masanori; Hakamada, Hiroto; Umanodan, Aya; Yoshiura, Takashi

    2017-04-01

    To determine the diagnostic performance of apparent diffusion coefficient (ADC) histogram analysis in diffusion-weighted (DW) magnetic resonance imaging (MRI) for differentiating adrenal adenoma from pheochromocytoma, we retrospectively evaluated 52 adrenal tumors (39 adenomas and 13 pheochromocytomas) in 47 patients (21 men, 26 women; mean age, 59.3 years; range, 16-86 years) who underwent DW 3.0T MRI. Histogram parameters of ADC (b-values of 0 and 200 [ADC_200], 0 and 400 [ADC_400], and 0 and 800 s/mm^2 [ADC_800]), namely mean, variance, coefficient of variation (CV), kurtosis, skewness, and entropy, were compared between adrenal adenomas and pheochromocytomas using the Mann-Whitney U-test. Receiver operating characteristic (ROC) curves for the histogram parameters were generated to differentiate adrenal adenomas from pheochromocytomas. Sensitivity and specificity were calculated using a threshold criterion that maximizes the average of sensitivity and specificity. Variance and CV of ADC_800 were significantly higher in pheochromocytomas than in adrenal adenomas. The areas under the ROC curves of the best histogram parameters for diagnosing adrenal adenomas were 0.82 for ADC_200, 0.87 for ADC_400, and 0.92 for ADC_800, with sensitivity of 84.6% and specificity of 84.6% (cutoff, ≤2.82) for ADC_200; sensitivity of 89.7% and specificity of 84.6% (cutoff, ≤2.77) for ADC_400; and sensitivity of 94.9% and specificity of 92.3% (cutoff, ≤2.67) for ADC_800. ADC histogram analysis of DW MRI can help differentiate adrenal adenoma from pheochromocytoma. Level of Evidence: 3. J. Magn. Reson. Imaging 2017;45:1195-1203. © 2016 International Society for Magnetic Resonance in Medicine.
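
    The threshold criterion used here, maximizing the average of sensitivity and specificity, is straightforward to reproduce; the sketch below does so on invented ADC-like values with scikit-learn, so the numbers are purely illustrative.

        import numpy as np
        from sklearn.metrics import roc_curve, auc

        rng = np.random.default_rng(0)
        # Invented ADC-style values (x10^-3 mm^2/s): adenomas lower on average.
        adc = np.concatenate([rng.normal(2.2, 0.4, 39),    # adenomas
                              rng.normal(3.2, 0.4, 13)])   # pheochromocytomas
        label = np.concatenate([np.ones(39), np.zeros(13)])  # 1 = adenoma

        # Adenoma is predicted below the threshold, so score with -adc.
        fpr, tpr, thresholds = roc_curve(label, -adc)
        print("AUC:", auc(fpr, tpr))

        # Cutoff maximizing the average of sensitivity and specificity.
        j = np.argmax((tpr + (1.0 - fpr)) / 2.0)
        print("cutoff:", -thresholds[j], "sens:", tpr[j], "spec:", 1 - fpr[j])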

  7. AN ILLUMINATION INVARIANT FACE RECOGNITION BY ENHANCED CONTRAST LIMITED ADAPTIVE HISTOGRAM EQUALIZATION

    Directory of Open Access Journals (Sweden)

    A. Thamizharasi

    2016-05-01

    Face recognition systems are gaining importance in social networks and surveillance. The face recognition task is complex due to variations in illumination, expression, occlusion, aging, and pose. Illumination variations in an image are due to changes in lighting conditions, poor illumination, low contrast, or increased brightness; they adversely affect image quality and recognition accuracy, so they have to be compensated for in pre-processing prior to face recognition. Contrast Limited Adaptive Histogram Equalization (CLAHE) is an image enhancement technique popular for enhancing medical images. The proposed work creates an illumination-invariant face recognition system by enhancing the Contrast Limited Adaptive Histogram Equalization technique; this method is termed "Enhanced CLAHE". The efficiency of Enhanced CLAHE is tested using a fuzzy k-nearest-neighbour classifier and the Fisher face subspace projection method. The recognition accuracy, Equal Error Rate, and False Acceptance Rate at 1% are calculated, and the performance of the CLAHE and Enhanced CLAHE methods is compared on three public face databases: AR, Yale, and ORL. Enhanced CLAHE achieves a much higher recognition accuracy than CLAHE.
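    As a concrete illustration of the base technique this record builds on, the following minimal sketch applies standard CLAHE to a grayscale face image with OpenCV. The file path, clip limit, and tile grid size are illustrative defaults, not the paper's "Enhanced CLAHE" settings, which are not specified in the abstract.

```python
import cv2

# Load a face image in grayscale (the path is a placeholder).
gray = cv2.imread("face.png", cv2.IMREAD_GRAYSCALE)

# Contrast Limited Adaptive Histogram Equalization: equalize the
# histogram over local tiles, clipping each tile's histogram at
# clipLimit to bound noise amplification in flat regions.
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
enhanced = clahe.apply(gray)

cv2.imwrite("face_clahe.png", enhanced)
```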

  8. Optimized broad-histogram simulations for strong first-order phase transitions: droplet transitions in the large-Q Potts model

    International Nuclear Information System (INIS)

    Bauer, Bela; Troyer, Matthias; Gull, Emanuel; Trebst, Simon; Huse, David A

    2010-01-01

    The numerical simulation of strongly first-order phase transitions has remained a notoriously difficult problem even for classical systems due to the exponentially suppressed (thermal) equilibration in the vicinity of such a transition. In the absence of efficient update techniques, a common approach for improving equilibration in Monte Carlo simulations is broadening the sampled statistical ensemble beyond the bimodal distribution of the canonical ensemble. Here we show how a recently developed feedback algorithm can systematically optimize such broad-histogram ensembles and significantly speed up equilibration in comparison with other extended ensemble techniques such as flat-histogram, multicanonical and Wang–Landau sampling. We simulate, as a prototypical example of a strong first-order transition, the two-dimensional Potts model with up to Q = 250 different states in large systems. The optimized histogram develops a distinct multi-peak structure, thereby resolving entropic barriers and their associated phase transitions in the phase coexistence region, such as droplet nucleation and annihilation, and droplet–strip transitions for systems with periodic boundary conditions. We characterize the efficiency of the optimized histogram sampling by measuring round-trip times τ(N, Q) across the phase transition for samples comprising N spins. While we find power-law scaling of τ versus N for small Q ∼ 2, we observe a crossover to exponential scaling for larger Q. These results demonstrate that, despite the ensemble optimization, broad-histogram simulations cannot fully eliminate the supercritical slowing down at strongly first-order transitions.
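    For orientation, the sketch below implements plain Wang-Landau flat-histogram sampling for a small two-dimensional Q-state Potts model, one of the extended-ensemble baselines this record compares its optimized ensembles against; the feedback-optimized method itself is more involved. Lattice size, Q, the flatness criterion, and the stopping rule are demo-scale choices.

```python
import numpy as np

rng = np.random.default_rng(0)
L, Q = 8, 10                       # small lattice, moderate Q (demo scale)
N = L * L
spins = rng.integers(0, Q, size=(L, L))

def energy(s):
    # -1 per pair of equal nearest neighbors, periodic boundaries
    return -int(np.sum(s == np.roll(s, 1, 0)) + np.sum(s == np.roll(s, 1, 1)))

E, E_min = energy(spins), -2 * N
log_g = np.zeros(2 * N + 1)        # running estimate of ln g(E)
hist = np.zeros(2 * N + 1)
f = 1.0                            # ln of the modification factor

while f > 1e-4:
    for _ in range(20000):
        i, j = rng.integers(0, L, size=2)
        old, new = spins[i, j], rng.integers(0, Q)
        nbrs = (spins[(i + 1) % L, j], spins[(i - 1) % L, j],
                spins[i, (j + 1) % L], spins[i, (j - 1) % L])
        dE = sum(n == old for n in nbrs) - sum(n == new for n in nbrs)
        # accept with min(1, g(E)/g(E')) so the energy histogram flattens
        if np.log(rng.random()) < log_g[E - E_min] - log_g[E + dE - E_min]:
            spins[i, j], E = new, E + dE
        log_g[E - E_min] += f
        hist[E - E_min] += 1
    seen = hist > 0
    if hist[seen].min() > 0.8 * hist[seen].mean():   # crude flatness test
        hist[:] = 0
        f /= 2                     # refine the density-of-states estimate

print("ln g(E) estimated on", int(seen.sum()), "energy levels")
```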

  9. CHOBS: Color Histogram of Block Statistics for Automatic Bleeding Detection in Wireless Capsule Endoscopy Video.

    Science.gov (United States)

    Ghosh, Tonmoy; Fattah, Shaikh Anowarul; Wahid, Khan A

    2018-01-01

    Wireless capsule endoscopy (WCE) is the most advanced technology to visualize the whole gastrointestinal (GI) tract in a non-invasive way. Its major disadvantage, however, is the long reviewing time, which is laborious because continuous manual intervention is necessary. In order to reduce the burden on the clinician, an automatic bleeding detection method for WCE video is proposed in this paper, based on the color histogram of block statistics (CHOBS). A single pixel in a WCE image may be distorted due to capsule motion in the GI tract; instead of considering individual pixel values, a block surrounding each pixel is therefore chosen for extracting local statistical features. By combining the local block features of the three color planes of the RGB color space, an index value is defined. A color histogram extracted from those index values provides a distinguishable color texture feature. A feature reduction technique utilizing the color histogram pattern and principal component analysis is proposed, which can drastically reduce the feature dimension. For bleeding zone detection, blocks are classified using the already extracted local features, which adds no extra computational burden for feature extraction. Extensive experimentation on several WCE videos and 2300 images collected from a publicly available database shows very satisfactory bleeding frame and zone detection performance in comparison with some existing methods. For bleeding frame detection, the accuracy, sensitivity, and specificity obtained with the proposed method are 97.85%, 99.47%, and 99.15%, respectively; for bleeding zone detection, a precision of 95.75% is achieved. The proposed method offers not only a low feature dimension but also highly satisfactory bleeding detection performance, and can effectively detect bleeding frames and zones in continuous WCE video data.
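    A rough sketch of the block-statistics idea described above: each pixel's neighborhood is summarized by the local means of the three RGB planes, the three means are quantized and packed into a single index, and the histogram of that index is the color-texture feature. The block size, quantization levels, and index construction here are illustrative stand-ins, not the paper's exact CHOBS definitions.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def chobs_like_feature(rgb, block=5, levels=8):
    """Histogram of a per-pixel index built from local block means of
    the R, G, B planes (illustrative stand-in for CHOBS)."""
    img = rgb.astype(float) / 255.0
    # local mean of each color plane over a block x block neighborhood
    means = [uniform_filter(img[..., c], size=block) for c in range(3)]
    # quantize each local mean and pack the three digits into one index
    q = [np.clip((m * levels).astype(int), 0, levels - 1) for m in means]
    index = (q[0] * levels + q[1]) * levels + q[2]
    hist = np.bincount(index.ravel(), minlength=levels ** 3)
    return hist / hist.sum()

frame = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)  # fake frame
print(chobs_like_feature(frame).shape)   # (512,) feature vector
```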

  10. Multifractal analysis of three-dimensional histogram from color images

    International Nuclear Information System (INIS)

    Chauveau, Julien; Rousseau, David; Richard, Paul; Chapeau-Blondeau, Francois

    2010-01-01

    Natural images, especially color or multicomponent images, are complex information-carrying signals. To contribute to the characterization of this complexity, we investigate the possibility of multiscale organization in the colorimetric structure of natural images. This is realized by means of a multifractal analysis applied to the three-dimensional histogram of natural color images. The observed behaviors are confronted with those of reference models with known multifractal properties. We use for this purpose synthetic random images with trivial monofractal behavior, and multidimensional multiplicative cascades known for their actual multifractal behavior. The behaviors observed on natural images exhibit similarities with those of the multifractal multiplicative cascades and display the signature of elaborate multiscale organization in the histograms of natural color images. This type of characterization of colorimetric properties can be helpful in various tasks of digital image processing, for instance modeling, classification, and indexing.
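    The first step described here, building the three-dimensional RGB histogram and probing it across scales, is compact to write down. The sketch below computes the q-order partition sums of the occupied colorimetric boxes at several box sizes; multifractality would show up as nontrivial scaling of these sums with scale. Bin counts, moment orders, and the random test image are illustrative.

```python
import numpy as np

def partition_sums(rgb, qs=(0.5, 2.0, 3.0), bins_per_axis=(64, 32, 16, 8)):
    """3D color histogram at several colorimetric box sizes, with the
    q-order sums M_q = sum_i p_i**q over occupied boxes at each size."""
    pix = rgb.reshape(-1, 3).astype(float)
    out = {}
    for b in bins_per_axis:
        h, _ = np.histogramdd(pix, bins=(b, b, b), range=[(0, 256)] * 3)
        p = h[h > 0] / h.sum()           # occupied-box probabilities
        out[b] = {q: float(np.sum(p ** q)) for q in qs}
    return out
# Log-log slopes of M_q versus box size relate to the generalized
# dimensions D_q (up to the usual 1/(q-1) normalization).

img = np.random.randint(0, 256, (128, 128, 3))
print(partition_sums(img))
```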

  11. A novel parallel architecture for local histogram equalization

    Science.gov (United States)

    Ohannessian, Mesrob I.; Choueiter, Ghinwa F.; Diab, Hassan

    2005-07-01

    Local histogram equalization is an image enhancement algorithm that has found wide application in the pre-processing stage of areas such as computer vision, pattern recognition, and medical imaging. The computationally intensive nature of the procedure, however, is a major limitation when real-time interactive applications are in question. This work explores the possibility of performing parallel local histogram equalization, using an array of special-purpose elementary processors, through an HDL implementation that targets FPGA or ASIC platforms. A novel parallelization scheme is presented and the corresponding architecture is derived. The algorithm is reduced to pixel-level operations. Processing elements are assigned image blocks to maintain a reasonable performance-cost ratio. To further simplify both the processor and memory organizations, a bit-serial access scheme is used. A brief performance assessment is provided to illustrate and quantify the merit of the approach.

  12. Histogram specification as a method of density modification

    International Nuclear Information System (INIS)

    Harrison, R.W.

    1988-01-01

    A new method for improving the quality and extending the resolution of Fourier maps is described. The method is based on a histogram analysis of the electron density. The distribution of electron density values in the map is forced to be 'ideal'. The 'ideal' distribution is assumed to be Gaussian. The application of the method to improve the electron density map for the protein Acinetobacter asparaginase, which is a tetrameric enzyme of molecular weight 140000 daltons, is described. (orig.)
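    The forcing step described here is ordinary histogram specification: remap the map values monotonically so that their empirical distribution matches the target. A minimal NumPy/SciPy sketch with a Gaussian target; the density map and the Gaussian parameters are illustrative.

```python
import numpy as np
from scipy.stats import norm

def specify_gaussian(rho, mu=0.0, sigma=1.0):
    """Monotone remapping of map values so their distribution becomes
    (approximately) Gaussian with mean mu and standard deviation sigma."""
    flat = rho.ravel()
    ranks = flat.argsort().argsort()          # rank of each density value
    u = (ranks + 0.5) / flat.size             # empirical CDF in (0, 1)
    ideal = norm.ppf(u, loc=mu, scale=sigma)  # invert the target CDF
    return ideal.reshape(rho.shape)

rho = np.random.exponential(size=(32, 32, 32))    # stand-in density map
rho_ideal = specify_gaussian(rho, mu=rho.mean(), sigma=rho.std())
```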

  13. Breast density pattern characterization by histogram features and texture descriptors

    OpenAIRE

    Carneiro, Pedro Cunha; Franco, Marcelo Lemos Nunes; Thomaz, Ricardo de Lima; Patrocinio, Ana Claudia

    2017-01-01

    Introduction: Breast cancer is the leading cause of death among women in Brazil, as in most countries of the world. Breast density is related to the risk of breast cancer; in medical practice, however, breast density classification is merely visual and dependent on professional experience, making this task very subjective. The purpose of this paper is to investigate image features based on histograms and Haralick texture descriptors so as to separate mammo...

  14. Retrospective Reconstructions of Active Bone Marrow Dose-Volume Histograms

    International Nuclear Information System (INIS)

    Veres, Cristina; Allodji, Rodrigue S.; Llanas, Damien; Vu Bezin, Jérémi; Chavaudra, Jean; Mège, Jean Pierre; Lefkopoulos, Dimitri; Quiniou, Eric; Deutsh, Eric; Vathaire, Florent de; Diallo, Ibrahima

    2014-01-01

    Purpose: To present a method for calculating dose-volume histograms (DVHs) to the active bone marrow (ABM) of patients who had undergone radiation therapy (RT) and subsequently developed leukemia. Methods and Materials: The study focuses on 15 patients treated between 1961 and 1996. Whole-body RT planning computed tomographic (CT) data were not available; we therefore generated representative whole-body CTs similar to patient anatomy. In addition, we developed a method enabling us to obtain information on the density distribution of ABM over the whole skeleton. Dose could then be calculated in a series of points distributed over the skeleton in such a way that their local density reflected age-specific data for ABM distribution. Dose to particular regions and dose-volume histograms of the entire ABM were estimated for all patients. Results: Depending on patient age, the total number of dose calculation points generated ranged from 1,190,970 to 4,108,524. The average dose to ABM ranged from 0.3 to 16.4 Gy. DVH analysis showed that the median doses (D50%) ranged from 0.06 to 12.8 Gy. We also evaluated the inhomogeneity of individual patient ABM dose distributions according to the clinical situation. The coefficient of variation of the dose over the whole ABM ranged from 1.0 to 5.7, meaning that the standard deviation could be more than 5 times higher than the mean. Conclusions: For patients with available long-term follow-up data, our method provides reconstruction of dose-volume data comparable to the detailed dose calculations that have become standard in modern CT-based 3-dimensional RT planning. Our strategy of using dose-volume histograms offers new perspectives to retrospective epidemiological studies.
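    For reference, a cumulative dose-volume histogram of the kind reconstructed here is straightforward to compute once per-point doses exist. A minimal sketch, assuming each calculation point represents an equal share of ABM volume; the dose samples and bin width are synthetic placeholders.

```python
import numpy as np

def cumulative_dvh(dose, bin_gy=0.1):
    """Cumulative DVH: fraction of the sampled volume receiving at
    least D, for each dose level D."""
    edges = np.arange(0.0, dose.max() + bin_gy, bin_gy)
    counts, _ = np.histogram(dose, bins=edges)
    frac_at_least = 1.0 - np.cumsum(counts) / dose.size
    return edges[1:], frac_at_least

# Stand-in for doses at points distributed over the active bone marrow.
dose = np.random.gamma(shape=2.0, scale=1.5, size=1_000_000)
d, v = cumulative_dvh(dose)
d50 = d[np.searchsorted(-v, -0.5)]       # median dose D50%
print(f"D50% ~ {d50:.2f} Gy")
```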

  15. Independent histogram pursuit for segmentation of skin lesions

    DEFF Research Database (Denmark)

    Gomez, D.D.; Butakoff, C.; Ersbøll, Bjarne Kjær

    2008-01-01

    In this paper, an unsupervised algorithm, called the Independent Histogram Pursuit (HIP), for segmenting dermatological lesions is proposed. The algorithm estimates a set of linear combinations of image bands that enhance different structures embedded in the image. In particular, the first estima...... to deal with different types of dermatological lesions. The boundary detection precision using k-means segmentation was close to 97%. The proposed algorithm can be easily combined with the majority of classification algorithms....

  16. Color and Contrast Enhancement by Controlled Piecewise Affine Histogram Equalization

    Directory of Open Access Journals (Sweden)

    Jose-Luis Lisani

    2012-10-01

    This paper presents a simple contrast enhancement algorithm based on histogram equalization (HE). The proposed algorithm performs a piecewise affine transform of the intensity levels of a digital image such that the new cumulative distribution function will be approximately uniform (as with HE), but where the stretching of the range is locally controlled to avoid brutal noise enhancement. We call this algorithm Piecewise Affine Equalization (PAE). Several experiments show that, in general, the new algorithm improves HE results.
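    For contrast with PAE, the classical global HE transform it refines fits in a few lines: map each intensity through the normalized cumulative histogram. A minimal sketch for an 8-bit grayscale image.

```python
import numpy as np

def global_histogram_equalization(gray):
    """Classical HE: intensities are mapped through the cumulative
    histogram; PAE replaces this with a controlled piecewise affine map."""
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = np.cumsum(hist) / gray.size
    lut = np.round(255 * cdf).astype(np.uint8)   # lookup table
    return lut[gray]

img = np.random.randint(0, 128, (64, 64), dtype=np.uint8)  # low contrast
equalized = global_histogram_equalization(img)
```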

  17. Contrast Enhancement Algorithm Based on Gap Adjustment for Histogram Equalization

    Directory of Open Access Journals (Sweden)

    Chung-Cheng Chiu

    2016-06-01

    Image enhancement methods have been widely used to improve the visual effects of images. Owing to its simplicity and effectiveness, histogram equalization (HE) is one of the methods used for enhancing image contrast. However, HE may result in over-enhancement and feature loss problems that lead to an unnatural look and loss of detail in the processed images. Researchers have proposed various HE-based methods to solve the over-enhancement problem; however, they have largely ignored the feature loss problem. Therefore, a contrast enhancement algorithm based on gap adjustment for histogram equalization (CegaHE) is proposed. It builds on a visual contrast enhancement algorithm based on histogram equalization (VCEA), which generates visually pleasing enhanced images, and improves the enhancement effects of VCEA. CegaHE adjusts the gaps between adjacent gray values based on an adjustment equation, which takes the properties of human visual perception into consideration, to solve the over-enhancement problem. Besides, it also alleviates the feature loss problem and further enhances the textures in the dark regions of the images to improve the quality of the processed images for human visual perception. Experimental results demonstrate that CegaHE is a reliable method for contrast enhancement and that it significantly outperforms VCEA and other methods.

  18. Contrast Enhancement Algorithm Based on Gap Adjustment for Histogram Equalization

    Science.gov (United States)

    Chiu, Chung-Cheng; Ting, Chih-Chung

    2016-01-01

    Image enhancement methods have been widely used to improve the visual effects of images. Owing to its simplicity and effectiveness, histogram equalization (HE) is one of the methods used for enhancing image contrast. However, HE may result in over-enhancement and feature loss problems that lead to an unnatural look and loss of detail in the processed images. Researchers have proposed various HE-based methods to solve the over-enhancement problem; however, they have largely ignored the feature loss problem. Therefore, a contrast enhancement algorithm based on gap adjustment for histogram equalization (CegaHE) is proposed. It builds on a visual contrast enhancement algorithm based on histogram equalization (VCEA), which generates visually pleasing enhanced images, and improves the enhancement effects of VCEA. CegaHE adjusts the gaps between adjacent gray values based on an adjustment equation, which takes the properties of human visual perception into consideration, to solve the over-enhancement problem. Besides, it also alleviates the feature loss problem and further enhances the textures in the dark regions of the images to improve the quality of the processed images for human visual perception. Experimental results demonstrate that CegaHE is a reliable method for contrast enhancement and that it significantly outperforms VCEA and other methods. PMID:27338412

  19. Non-parametric comparison of histogrammed two-dimensional data distributions using the Energy Test

    International Nuclear Information System (INIS)

    Reid, Ivan D; Lopes, Raul H C; Hobson, Peter R

    2012-01-01

    When monitoring complex experiments, comparison is often made between regularly acquired histograms of data and reference histograms which represent the ideal state of the equipment. With the larger HEP experiments now ramping up, there is a need for automation of this task since the volume of comparisons could overwhelm human operators. However, the two-dimensional histogram comparison tools available in ROOT have been noted in the past to exhibit shortcomings. We discuss a newer comparison test for two-dimensional histograms, based on the Energy Test of Aslan and Zech, which provides more conclusive discrimination between histograms of data coming from different distributions than methods provided in a recent ROOT release.
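    A minimal sketch of an energy-test statistic in the spirit of Aslan and Zech, applied directly to two binned 2D distributions by treating bin centers as charges weighted by bin contents. A faithful implementation works on unbinned points and excludes self-pairs; the small-distance cutoff and normalization below are illustrative simplifications.

```python
import numpy as np
from scipy.spatial.distance import cdist

def energy_statistic(h1, h2, eps=1e-3):
    """Energy-test-style statistic between two 2D histograms using a
    logarithmic distance weighting phi(r) = -ln(r + eps)."""
    def centers_weights(h):
        ix, iy = np.nonzero(h)
        pts = np.column_stack([ix, iy]).astype(float)
        return pts, h[ix, iy] / h.sum()
    p1, w1 = centers_weights(np.asarray(h1, float))
    p2, w2 = centers_weights(np.asarray(h2, float))
    phi = lambda r: -np.log(r + eps)
    pair = lambda pa, wa, pb, wb: wa @ phi(cdist(pa, pb)) @ wb
    # self-energies plus cross term, as for opposite-sign charges
    return pair(p1, w1, p1, w1) + pair(p2, w2, p2, w2) - 2 * pair(p1, w1, p2, w2)

a = np.random.multinomial(5000, np.ones(64) / 64).reshape(8, 8)
b = np.random.multinomial(5000, np.ones(64) / 64).reshape(8, 8)
print(energy_statistic(a, b))   # larger values indicate incompatibility
```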

  20. Application of the multi-dimensional surface water modeling system at Bridge 339, Copper River Highway, Alaska

    Science.gov (United States)

    Brabets, Timothy P.; Conaway, Jeffrey S.

    2009-01-01

    The Copper River Basin, the sixth largest watershed in Alaska, drains an area of 24,200 square miles. This large, glacier-fed river flows across a wide alluvial fan before it enters the Gulf of Alaska. Bridges along the Copper River Highway, which traverses the alluvial fan, have been impacted by channel migration. Due to a major channel change in 2001, Bridge 339 at Mile 36 of the highway has undergone excessive scour, resulting in damage to its abutments and approaches. During the snow- and ice-melt runoff season, which typically extends from mid-May to September, the design discharge for the bridge is often exceeded. The approach channel shifts continuously, and during our study it shifted back and forth from the left bank to a course along the right bank nearly parallel to the road. Maintenance at Bridge 339 has been costly and will continue to be so if no action is taken. Possible solutions to the scour and erosion problem include (1) constructing a guide bank to redirect flow, (2) dredging approximately 1,000 feet of channel above the bridge to align flow perpendicular to the bridge, and (3) extending the bridge. The USGS Multi-Dimensional Surface Water Modeling System (MD_SWMS) was used to assess these possible solutions. The major limitation of modeling these scenarios was the inability to predict ongoing channel migration. We used a hybrid dataset of surveyed and synthetic bathymetry in the approach channel, which provided the best approximation of this dynamic system. Under existing conditions and at the highest measured discharge and stage of 32,500 ft³/s and 51.08 ft, respectively, the velocities and shear stresses simulated by MD_SWMS indicate scour and erosion will continue. Construction of a 250-foot-long guide bank would not improve conditions because it is not long enough. Dredging a channel upstream of Bridge 339 would help align the flow perpendicular to Bridge 339, but because of the mobility of the channel bed, the dredged channel would

  1. A Novel Contrast Enhancement Technique on Palm Bone Images

    Directory of Open Access Journals (Sweden)

    Yung-Tsang Chang

    2014-09-01

    Contrast enhancement plays a fundamental role in image processing. Many histogram-based techniques are widely used for contrast enhancement of given images, due to their simple function and effectiveness. However, conventional histogram equalization (HE) methods result in excessive contrast enhancement, which causes unnatural-looking and unsatisfactory results for a variety of low-contrast images. To solve such problems, a novel multi-histogram equalization technique is proposed in this paper to enhance the contrast of palm bone X-ray radiographs. The mean-variance analysis method is employed to partition the histogram of the original grey-scale image into multiple sub-histograms, which are then independently equalized. By using this mean-variance partition method, the proposed multi-histogram equalization technique achieves contrast enhancement of the palm bone X-ray radiographs. Experimental results show that the multi-histogram equalization technique achieves a lower average absolute mean brightness error (AMBE) value while simultaneously preserving the mean brightness and enhancing the local contrast of the original image.

  2. The INTERGROWTH-21st Project Neurodevelopment Package: a novel method for the multi-dimensional assessment of neurodevelopment in pre-school age children.

    Directory of Open Access Journals (Sweden)

    Michelle Fernandes

    BACKGROUND: The International Fetal and Newborn Growth Consortium for the 21st Century (INTERGROWTH-21st) Project is a population-based, longitudinal study describing early growth and development in an optimally healthy cohort of 4607 mothers and newborns. At 24 months, children are assessed for neurodevelopmental outcomes with the INTERGROWTH-21st Neurodevelopment Package. This paper describes neurodevelopment tools for preschoolers and the systematic approach leading to the development of the Package. METHODS: An advisory panel shortlisted project-specific criteria (such as multi-dimensional assessment and suitability for international populations) to be fulfilled by a neurodevelopment instrument. A literature review of well-established tools for preschoolers revealed 47 candidates, none of which fulfilled all the project's criteria. A multi-dimensional assessment was, therefore, compiled using a package-based approach by: (i) categorizing desired outcomes into domains, (ii) devising domain-specific criteria for tool selection, and (iii) selecting the most appropriate measure for each domain. RESULTS: The Package measures vision (Cardiff tests); cortical auditory processing (auditory evoked potentials to a novelty oddball paradigm); and cognition, language skills, behavior, motor skills, and attention (the INTERGROWTH-21st Neurodevelopment Assessment) in 35-45 minutes. Sleep-wake patterns (actigraphy) are also assessed. Tablet-based applications with integrated quality checks and automated, wireless electroencephalography make the Package easy to administer in the field by non-specialist staff. The Package is in use in Brazil, India, Italy, Kenya and the United Kingdom. CONCLUSIONS: The INTERGROWTH-21st Neurodevelopment Package is a multi-dimensional instrument measuring early child development (ECD). Its developmental approach may be useful to those involved in large-scale ECD research and surveillance efforts.

  3. Bin Ratio-Based Histogram Distances and Their Application to Image Classification.

    Science.gov (United States)

    Hu, Weiming; Xie, Nianhua; Hu, Ruiguang; Ling, Haibin; Chen, Qiang; Yan, Shuicheng; Maybank, Stephen

    2014-12-01

    Large variations in image background may cause partial matching and normalization problems for histogram-based representations, i.e., the histograms of the same category may have bins which are significantly different, and normalization may produce large changes in the differences between corresponding bins. In this paper, we deal with this problem by using the ratios between bin values of histograms, rather than the bin-value differences used in traditional histogram distances. We propose a bin ratio-based histogram distance (BRD), which is an intra-cross-bin distance, in contrast with previous bin-to-bin distances and cross-bin distances. The BRD is robust to partial matching and histogram normalization, and captures correlations between bins with only a linear computational complexity. We combine the BRD with the ℓ1 histogram distance and the χ² histogram distance to generate the ℓ1-BRD and the χ²-BRD, respectively. These combinations exploit and benefit from the robustness of the BRD under partial matching and the robustness of the ℓ1 and χ² distances to small noise. We propose a method for assessing the robustness of histogram distances to partial matching. The BRDs and logistic regression-based histogram fusion are applied to image classification. The experimental results on synthetic data sets show the robustness of the BRDs to partial matching, and the experiments on seven benchmark data sets demonstrate promising results of the BRDs for image classification.
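    The exact BRD formulation is given in the paper; the sketch below only illustrates the general idea of comparing histograms through ratios of corresponding bins rather than differences, with ad hoc smoothing. Note how a uniform rescaling of either histogram, the normalization effect the BRD is designed to resist, costs nothing here.

```python
import numpy as np

def bin_ratio_distance(h, g, eps=1e-9):
    """Illustrative ratio-based histogram dissimilarity (not the
    authors' exact BRD): compares per-bin log-ratios, so a common
    multiplicative offset between the histograms is free."""
    h = np.asarray(h, float) + eps
    g = np.asarray(g, float) + eps
    r = np.log(h / h.sum()) - np.log(g / g.sum())
    return float(np.sum((r - r.mean()) ** 2))

a = np.array([10, 5, 1, 3])
b = np.array([20, 10, 2, 6])      # b is a rescaled copy of a
print(bin_ratio_distance(a, b))   # ~0 despite large bin differences
```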

  4. Histogram specification as a method of density modification

    Energy Technology Data Exchange (ETDEWEB)

    Harrison, R.W.

    1988-12-01

    A new method for improving the quality and extending the resolution of Fourier maps is described. The method is based on a histogram analysis of the electron density. The distribution of electron density values in the map is forced to be 'ideal'. The 'ideal' distribution is assumed to be Gaussian. The application of the method to improve the electron density map for the protein Acinetobacter asparaginase, which is a tetrameric enzyme of molecular weight 140000 daltons, is described.

  5. High capacity, high speed histogramming data acquisition memory

    International Nuclear Information System (INIS)

    Epstein, A.; Boulin, C.

    1996-01-01

    A double-width CAMAC DRAM store module was developed for use as a histogramming memory in fast time-resolved synchrotron radiation applications to molecular biology. High-speed direct memory modify (3 MHz) is accomplished by using a discrete DRAM controller and fast page mode access. The module can be configured using standard SIMMs to sizes of up to 64M words. The word width is 16 bits, and the module handles overflows by storing the overflow addresses in a dedicated FIFO. Simultaneous front-panel DMM/DMI access and CAMAC readout of the overflow addresses is supported

  6. WASP (Write a Scientific Paper) using Excel - 4: Histograms.

    Science.gov (United States)

    Grech, Victor

    2018-02-01

    Plotting data into graphs is a crucial step in data analysis as part of an initial descriptive statistics exercise since it gives the researcher an overview of the shape and nature of the data. Outlier values may also be identified, and these may be incorrect data, or true and important outliers. This paper explains how to access Microsoft Excel's Analysis Toolpak and provides some pointers for the utilisation of the histogram tool within the Toolpak. Copyright © 2018. Published by Elsevier B.V.

  7. Text-Independent Speaker Identification Using the Histogram Transform Model

    DEFF Research Database (Denmark)

    Ma, Zhanyu; Yu, Hong; Tan, Zheng-Hua

    2016-01-01

    In this paper, we propose a novel probabilistic method for the task of text-independent speaker identification (SI). In order to capture the dynamic information during SI, we design super-MFCC features by cascading three neighboring Mel-frequency cepstral coefficient (MFCC) frames together....... These super-MFCC vectors are utilized for probabilistic model training such that the speaker's characteristics can be sufficiently captured. The probability density function (PDF) of the aforementioned super-MFCC features is estimated by the recently proposed histogram transform (HT) method. To recedes

  8. Characterization of dissolved organic matter in a coral reef ecosystem subjected to anthropogenic pressures (La Réunion Island, Indian Ocean) using multi-dimensional fluorescence spectroscopy.

    Science.gov (United States)

    Tedetti, Marc; Cuet, Pascale; Guigue, Catherine; Goutx, Madeleine

    2011-05-01

    highly impacted by sewage effluents, which are numerous in this coastal area of La Réunion Island. We conclude that multi-dimensional fluorescence spectroscopy (EEM) coupled with the determination of HIX and BIX is a good tool for assessing the origin and distribution of DOM in coral reef ecosystems subjected to anthropogenic impacts. Copyright © 2011 Elsevier B.V. All rights reserved.

  9. Assessment of MARS for downcomer multi-dimensional thermal hydraulics during LBLOCA reflood using KAERI air-water direct vessel injection tests

    Energy Technology Data Exchange (ETDEWEB)

    Won-Jae, Lee; Kwi-Seok, Ha; Chul-Hwa, Song [Korea Atomic Energy Research Inst., Daejeon (Korea, Republic of)

    2001-07-01

    The MARS code has been assessed for the downcomer multi-dimensional thermal hydraulics during a large-break loss-of-coolant accident (LBLOCA) reflood of the Korean Next Generation Reactor (KNGR), which adopted an upper direct vessel injection (DVI) design. Direct DVI bypass and downcomer level sweep-out tests carried out at a 1/50-scale air-water DVI test facility are simulated to examine the capability of MARS. Test conditions are selected such that they represent typical reflood conditions of the KNGR, that is, DVI injection velocities of 1.0 to 1.6 m/sec and air injection velocities of 18.0 to 35.0 m/sec, for single and double DVI configurations. The MARS calculation is first adjusted to the experimental DVI film distribution, which largely affects air-water interaction in a scaled-down downcomer; the code is then assessed against the selected test matrix. With some improvements to the MARS thermal-hydraulic (T/H) models, it has been demonstrated that the MARS code is capable of simulating the direct DVI bypass and downcomer level sweep-out, as well as the multi-dimensional thermal hydraulics in the downcomer, where the condensation effect is excluded. (authors)

  10. Urban air quality forecasting based on multi-dimensional collaborative Support Vector Regression (SVR): A case study of Beijing-Tianjin-Shijiazhuang.

    Science.gov (United States)

    Liu, Bing-Chun; Binaykia, Arihant; Chang, Pei-Chann; Tiwari, Manoj Kumar; Tsao, Cheng-Chin

    2017-01-01

    Today, China is facing a very serious air pollution problem, with a dreadful impact on human health as well as the environment. The urban cities in China are the most affected due to their rapid industrial and economic growth. Therefore, it is of extreme importance to come up with new, better, and more reliable forecasting models to accurately predict air quality. This paper selects Beijing, Tianjin, and Shijiazhuang, three cities in the Jingjinji region, to develop a new collaborative forecasting model using Support Vector Regression (SVR) for urban Air Quality Index (AQI) prediction in China. The present study aims to improve forecasting results by minimizing the prediction error of present machine learning algorithms, taking multiple-city, multi-dimensional air quality information and weather conditions as input. The results show a decrease in MAPE for multiple-city, multi-dimensional regression when there is a strong interaction and correlation between the air quality characteristic attributes and the AQI. Geographical location is also found to play a significant role in the AQI prediction for Beijing, Tianjin, and Shijiazhuang.
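    A minimal sketch of the kind of multi-dimensional SVR regression described, using scikit-learn's SVR. The feature layout (neighboring cities' AQI plus weather variables) and all data are synthetic placeholders, and the hyperparameters are illustrative.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
# Synthetic stand-in: columns = [own AQI(t-1), city2 AQI(t-1),
# city3 AQI(t-1), temperature, wind speed, humidity]
X = rng.normal(size=(500, 6))
y = 100 + 20 * X[:, 0] + 8 * X[:, 1] + 4 * X[:, 2] - 10 * X[:, 4] \
    + rng.normal(0, 2, size=500)            # next-day AQI (synthetic)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0, epsilon=1.0))
model.fit(X[:400], y[:400])
pred = model.predict(X[400:])
mape = np.mean(np.abs((pred - y[400:]) / y[400:]))
print(f"MAPE on held-out synthetic data: {mape:.3f}")
```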

  11. Development of multi-dimensional analysis method for porous blockage in fuel subassembly. Numerical simulation for 4 subchannel geometry water test

    International Nuclear Information System (INIS)

    Tanaka, Masa-aki; Kamide, Hideki

    2001-02-01

    This investigation deals with porous blockages in a wire-spacer-type fuel subassembly of fast breeder reactors (FBRs). A multi-dimensional analysis method for a porous blockage in a fuel subassembly is developed using the standard k-ε turbulence model with typical correlations from handbooks. The purpose of this analysis method is to evaluate the position and magnitude of the maximum temperature and to investigate the thermo-hydraulic phenomena in the porous blockage. Verification of the method was conducted based on the results of a 4-subchannel geometry water test. It was revealed that the evaluation of the porosity distribution and the particle diameter in a porous blockage is important for predicting the temperature distribution. The method could simulate the spatial characteristics of the velocity and temperature distributions in the blockage and evaluate the pin surface temperature inside the porous blockage. Through this verification, it is shown that the multi-dimensional analysis method is useful for predicting the thermo-hydraulic field and the highest temperature in a porous blockage. (author)

  12. Differentially Private Event Histogram Publication on Sequences over Graphs

    Institute of Scientific and Technical Information of China (English)

    Ning Wang; Yu Gu; Jia Xu; Fang-Fang Li; Ge Yu

    2017-01-01

    The big data era is coming with strong and ever-growing demands on analyzing personal information and footprints in the cyber world. To enable such analysis without privacy leak risk, differential privacy (DP) has been quickly rising in recent years as the first practical privacy protection model with rigorous theoretical guarantees. This paper discusses how to publish differentially private histograms on events in the time series domain, with sequences of personal events over graphs with events as edges. Such individual-generated sequences commonly appear in formalized industrial workflows, online game logs, and spatial-temporal trajectories. Directly publishing the statistics of sequences may compromise personal privacy. While existing DP mechanisms mainly target normalized domains with fixed and aligned dimensions, our problem raises new challenges when the sequences can follow arbitrary paths on the graph. To tackle the problem, we reformulate it with a three-step framework, which 1) carefully truncates the original sequences, trading off errors introduced by the truncation against those introduced by the noise added to guarantee privacy, 2) decomposes the event graph into path sub-domains based on a group of event pivots, and 3) employs a deeply optimized tree-based histogram construction approach for each sub-domain to benefit from less noise addition. We present a careful analysis of our framework to support thorough optimizations over each step, and verify the huge improvements of our proposals over state-of-the-art solutions.
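    At the core of any differentially private histogram release is the Laplace mechanism; the paper's contribution is the three-step framework around it, which is not reproduced here. A minimal sketch of the base mechanism, where the L1 sensitivity plays the role of the truncation length in the paper.

```python
import numpy as np

def dp_histogram(counts, epsilon, l1_sensitivity):
    """epsilon-DP histogram release via the Laplace mechanism.
    l1_sensitivity bounds how much one individual's data can change
    the whole count vector (e.g., the sequence truncation length)."""
    scale = l1_sensitivity / epsilon
    noisy = counts + np.random.laplace(0.0, scale, size=counts.shape)
    return np.clip(np.round(noisy), 0, None)   # optional post-processing

true_counts = np.array([120, 45, 8, 300, 17])  # events per histogram bin
print(dp_histogram(true_counts, epsilon=1.0, l1_sensitivity=5))
```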

  13. Variational Histogram Equalization for Single Color Image Defogging

    Directory of Open Access Journals (Sweden)

    Li Zhou

    2016-01-01

    Foggy images taken in bad weather inevitably suffer from contrast loss and color distortion. Existing defogging methods merely resort to digging out an accurate scene transmission, in ignorance of their unpleasing distortion and high complexity. Different from previous works, we propose a simple but powerful method based on histogram equalization and the physical degradation model. By revising two constraints in a variational histogram equalization framework, the intensity component of a fog-free image can be estimated in HSI color space, since the airlight is inferred in advance through a color attenuation prior. To cut down the time consumption, a general variation filter is proposed to obtain a numerical solution from the revised framework. Once the estimated intensity component is obtained, it is easy to infer the saturation component from the physical degradation model in the saturation channel. Accordingly, the fog-free image can be restored with the estimated intensity and saturation components. Finally, the proposed method is tested on several foggy images and assessed by two no-reference indexes. Experimental results reveal that our method is superior to three groups of relevant state-of-the-art defogging methods.

  14. REAL-TIME FACE RECOGNITION BASED ON OPTICAL FLOW AND HISTOGRAM EQUALIZATION

    Directory of Open Access Journals (Sweden)

    D. Sathish Kumar

    2013-05-01

    Face recognition is one of the intensive areas of research in computer vision and pattern recognition, much of which is focused on the recognition of faces under varying facial expressions and pose variations. The constrained optical flow algorithm discussed in this paper recognizes facial images involving various expressions based on motion vector computation. This paper proposes an optical flow computation algorithm that processes frames of varying facial gestures and integrates them with a synthesized image in a probabilistic framework. A histogram equalization technique is also used to overcome the effect of illumination while capturing the input data with camera devices; it additionally enhances the contrast of the image for better processing. The experimental results confirm that the proposed face recognition system is more robust and recognizes facial images under varying expressions and pose variations more accurately.

  15. Stochastic Learning of Multi-Instance Dictionary for Earth Mover's Distance based Histogram Comparison

    OpenAIRE

    Fan, Jihong; Liang, Ru-Ze

    2016-01-01

    Dictionaries play an important role in multi-instance data representation: they map bags of instances to histograms. Earth mover's distance (EMD) is the most effective histogram distance metric for the application of multi-instance retrieval. However, up to now, no existing multi-instance dictionary learning method has been designed for EMD-based histogram comparison. To fill this gap, we develop the first EMD-optimal dictionary learning method using stochastic optimization. In the stoc...
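    For reference, a minimal EMD computation between two histograms over ordered bins, using SciPy's wasserstein_distance; the dictionary-learning contribution of the record is not reproduced here.

```python
import numpy as np
from scipy.stats import wasserstein_distance

# Two histograms over the same ordered bins (e.g., dictionary codewords
# arranged on a line); EMD is the minimal mass-transport cost.
bins = np.arange(8)
h1 = np.array([0.1, 0.4, 0.3, 0.1, 0.1, 0.0, 0.0, 0.0])
h2 = np.array([0.0, 0.0, 0.1, 0.1, 0.3, 0.4, 0.1, 0.0])

emd = wasserstein_distance(bins, bins, u_weights=h1, v_weights=h2)
print(f"EMD = {emd:.3f}")
print(f"L1  = {np.abs(h1 - h2).sum():.3f}")   # bin-to-bin baseline
```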

  16. Using histograms to introduce randomization in the generation of ensembles of decision trees

    Science.gov (United States)

    Kamath, Chandrika; Cantu-Paz, Erick; Littau, David

    2005-02-22

    A system for decision tree ensembles that includes a module to read the data, a module to create a histogram, a module to evaluate a potential split according to some criterion using the histogram, a module to select a split point randomly in an interval around the best split, a module to split the data, and a module to combine multiple decision trees in ensembles. The decision tree method includes the steps of reading the data, creating a histogram, evaluating a potential split according to some criterion using the histogram, selecting a split point randomly in an interval around the best split, splitting the data, and combining multiple decision trees in ensembles.
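    A minimal sketch of the randomized split-selection step described: candidate thresholds are the bin edges of a feature histogram, the best edge is found by an impurity criterion, and the actual split point is then drawn uniformly from the interval around that edge. The bin count and the Gini criterion are illustrative choices.

```python
import numpy as np

def randomized_histogram_split(x, y, bins=32, rng=None):
    """Find the best histogram-edge split of feature x for binary
    labels y, then randomize the split point around the best edge."""
    rng = rng or np.random.default_rng()
    edges = np.histogram_bin_edges(x, bins=bins)
    best_gini, best_i = np.inf, None
    for i in range(1, bins):
        parts = (y[x < edges[i]], y[x >= edges[i]])
        if min(len(p) for p in parts) == 0:
            continue
        gini = sum(len(p) / len(y) * 2 * np.mean(p) * (1 - np.mean(p))
                   for p in parts)       # weighted Gini impurity
        if gini < best_gini:
            best_gini, best_i = gini, i
    # draw the split uniformly from the interval around the best edge
    return rng.uniform(edges[best_i - 1], edges[best_i + 1])

x = np.random.normal(size=400)
y = (x > 0.3).astype(int)
print(randomized_histogram_split(x, y))
```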

  17. Transverse Position Reconstruction in a Liquid Argon Time Projection Chamber using Principal Component Analysis and Multi-Dimensional Fitting

    Science.gov (United States)

    Watson, Andrew William

    2017-08-01

    pocket above the liquid region, respectively. One of the lingering challenges in this experiment, however, is the determination of an event's position along the other two spatial dimensions, that is, its transverse or "xy" position. Some liquid noble element TPCs have achieved remarkably accurate event position reconstructions, typically using the relative amounts of S2 light collected by Photo-Multiplier Tubes ("PMTs") as the input data to their reconstruction algorithms. This approach has been particularly challenging in DarkSide-50, partly due to unexpected asymmetries in the detector, and partly due to the design of the detector itself. A variety of xy-reconstruction methods ("xy methods" for short) have come and gone in DS-50, with only a few of them providing useful results. The xy method described in this dissertation is a two-step Principal Component Analysis / Multi-Dimensional Fit (PCAMDF) reconstruction. In a nutshell, this method develops a functional mapping from the 19-dimensional space of the signal received by the PMTs at the "top" (or the "anode" end) of the DarkSide-50 TPC to each of the transverse coordinates, x and y. PCAMDF is a low-level "machine learning" algorithm, and as such, needs to be "trained" with a sample of representative events; in this case, these are provided by the DarkSide geant4-based Monte Carlo, g4ds. In this work, a thorough description of the PCAMDF xy-reconstruction method is provided along with an analysis of its performance on MC events and data. The method is applied to several classes of data events, including coincident decays, external gamma rays from calibration sources, and both atmospheric argon "AAr" and underground argon "UAr". Discrepancies between the MC and data are explored, and fiducial volume cuts are calculated. Finally, a novel method is proposed for finding the accuracy of the PCAMDF reconstruction on data by using the asymmetry of the S2 light collected on the anode and cathode PMT arrays as a function
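    A minimal sketch of the two-step idea as described: reduce the 19 PMT signals with PCA, then fit a functional mapping to a transverse coordinate. scikit-learn's PCA and a quadratic least-squares fit stand in for the dissertation's specific multi-dimensional fitter, and the PMT layout and light model are entirely synthetic.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(2)
# Synthetic training events: true xy positions and a fake 19-PMT S2
# light pattern that falls off smoothly with distance to each PMT.
xy = rng.uniform(-1, 1, size=(2000, 2))
pmt_pos = rng.uniform(-1, 1, size=(19, 2))
dist2 = ((xy[:, None, :] - pmt_pos[None, :, :]) ** 2).sum(axis=-1)
s2 = 1.0 / (0.3 + dist2) + 0.01 * rng.normal(size=dist2.shape)

# Step 1: PCA on the light pattern; step 2: polynomial fit to x.
model = make_pipeline(PCA(n_components=6), PolynomialFeatures(degree=2),
                      LinearRegression())
model.fit(s2, xy[:, 0])
rms = np.sqrt(np.mean((model.predict(s2) - xy[:, 0]) ** 2))
print(f"x-reconstruction RMS on training events: {rms:.3f}")
```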

  18. Support vector machine for breast cancer classification using diffusion-weighted MRI histogram features: Preliminary study.

    Science.gov (United States)

    Vidić, Igor; Egnell, Liv; Jerome, Neil P; Teruel, Jose R; Sjøbakk, Torill E; Østlie, Agnes; Fjøsne, Hans E; Bathen, Tone F; Goa, Pål Erik

    2018-05-01

    Diffusion-weighted MRI (DWI) is currently one of the fastest developing MRI-based techniques in oncology. Histogram properties from model fitting of DWI are useful features for differentiation of lesions, and classification can potentially be improved by machine learning. To evaluate classification of malignant and benign tumors and breast cancer subtypes using a support vector machine (SVM). Prospective. Fifty-one patients with benign (n = 23) and malignant (n = 28) breast tumors (26 ER+, of which six were HER2+). Patients were imaged with DW-MRI (3T) using twice-refocused spin-echo echo-planar imaging with repetition time/echo time (TR/TE) = 9000/86 msec, 90 × 90 matrix size, 2 × 2 mm in-plane resolution, 2.5 mm slice thickness, and 13 b-values. Apparent diffusion coefficient (ADC), relative enhanced diffusivity (RED), and the intravoxel incoherent motion (IVIM) parameters diffusivity (D), pseudo-diffusivity (D*), and perfusion fraction (f) were calculated. The histogram properties (median, mean, standard deviation, skewness, kurtosis) were used as features in SVM (10-fold cross-validation) for differentiation of lesions and subtyping. Accuracies of the SVM classifications were calculated to find the combination of features with the highest prediction accuracy. Mann-Whitney tests were performed for univariate comparisons. For benign versus malignant tumors, univariate analysis found 11 histogram properties to be significant differentiators. Using SVM, the highest accuracy (0.96) was achieved from a single feature (mean of RED), or from three-feature combinations of IVIM or ADC. Combining features from all models gave perfect classification. No single feature predicted HER2 status of ER+ tumors (univariate or SVM), although high accuracy (0.90) was achieved with SVM combining several features. Importantly, these features had to include higher-order statistics (kurtosis and skewness), indicating the importance of accounting for heterogeneity. Our
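    A minimal sketch of the classification step: per-lesion histogram properties are assembled into feature vectors and fed to an SVM with 10-fold cross-validation, as in the study. Everything below is synthetic; the class sizes merely mirror the 23 benign / 28 malignant split.

```python
import numpy as np
from scipy import stats
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

def histogram_features(values):
    """The five histogram properties used as SVM features."""
    return [np.median(values), np.mean(values), np.std(values),
            stats.skew(values), stats.kurtosis(values)]

rng = np.random.default_rng(3)
labels = np.array([0] * 23 + [1] * 28)        # benign vs malignant
# Synthetic stand-in: each "lesion" is a bag of ADC-like voxel values
# whose distribution shape depends on the class.
X = np.array([histogram_features(rng.gamma(2.0 + 1.5 * c, 0.5, size=200))
              for c in labels])

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
print(f"10-fold CV accuracy: {cross_val_score(clf, X, labels, cv=10).mean():.2f}")
```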

  19. TOP-DRAWER, Histograms, Scatterplots, Curve-Smoothing

    International Nuclear Information System (INIS)

    Chaffee, R.B.

    1988-01-01

    Description of program or function: TOP DRAWER produces histograms, scatterplots, data points with error bars and plot symbols, and curves passing through data points, with elaborate titles. It also does smoothing and calculates frequency distributions. There is little facility, however, for arithmetic manipulation. Because of its restricted applicability, TOP DRAWER can be controlled by a relatively simple set of commands, and this control is further simplified by the choice of reasonable default values for all parameters. Despite this emphasis on simplicity, TOP DRAWER plots are of exceptional quality and are suitable for publication. Input is normally from card-image records, although a set of subroutines is provided to accommodate FORTRAN calls. The program contains switches which can be set to generate code suitable for execution on IBM, DEC VAX, and PRIME computers

  20. Steam leak detection method in pipeline using histogram analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Se Oh; Jeon, Hyeong Seop; Son, Ki Sung; Chae, Gyung Sun [Saean Engineering Corp, Seoul (Korea, Republic of); Park, Jong Won [Dept. of Information Communications Engineering, Chungnam NationalUnversity, Daejeon (Korea, Republic of)

    2015-10-15

    Leak detection in a pipeline usually involves acoustic emission sensors, typically contact-type sensors. Contact-type sensors pose difficulties for installation and cannot operate in areas with high temperature and radiation. Therefore, many researchers have recently studied leak detection using a camera, which has the advantages of long-distance monitoring and wide-area surveillance. However, conventional leak detection based on difference images often mistakes the vibration of a structure for a leak. In this paper, we propose a method for steam leakage detection using the moving average of difference images and histogram analysis. The proposed method can separate leakage from structural vibration. The working performance of the proposed method is verified by comparison with experimental results.

  1. Histogram plots and cutoff energies for nuclear discrete levels

    International Nuclear Information System (INIS)

    Belgya, T.; Molnar, G.; Fazekas, B.; Oestoer, J.

    1997-05-01

    Discrete level schemes for 1277 nuclei, from 6Li through 251Es, extracted from the Evaluated Nuclear Structure Data File were analyzed. Cutoff energies (Umax), indicating the upper limit of level scheme completeness, were deduced from inspection of histograms of the cumulative number of levels. Parameters of the constant-temperature level density formula (nuclear temperature T and energy shift U0) were obtained by means of a least-squares fit of the formula to the known levels below the cutoff energy. The results are tabulated for all 1277 nuclei, allowing for an easy and reliable application of the constant-temperature level density approach. A complete set of cumulative plots of discrete levels is also provided. (author). 5 figs, 2 tabs
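    The constant-temperature formula being fitted is commonly written for the cumulative level count as N(E) = exp((E - U0)/T). A minimal least-squares sketch with SciPy on a synthetic level scheme generated from that same model.

```python
import numpy as np
from scipy.optimize import curve_fit

def ct_cumulative(E, T, U0):
    """Constant-temperature cumulative level count N(E) = exp((E-U0)/T)."""
    return np.exp((E - U0) / T)

# Synthetic discrete levels consistent with the model: E_n = U0 + T ln n,
# with a little scatter (energies in MeV, purely illustrative).
rng = np.random.default_rng(4)
n = np.arange(1, 201)
levels = -1.0 + 0.8 * np.log(n) + rng.normal(0, 0.03, n.size)

(T, U0), _ = curve_fit(ct_cumulative, levels, n, p0=(1.0, 0.0))
print(f"fitted T = {T:.2f} MeV, U0 = {U0:.2f} MeV")
```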

  2. TSimpleAnalysis: histogramming many trees in parallel

    CERN Document Server

    Giommi, Luca

    2016-01-01

    I worked inside the ROOT team of the EP-SFT group. My project focused on writing a ROOT class whose aim is to create histograms from a TChain. The name of the class is TSimpleAnalysis and it is already integrated in ROOT. The work that I did was to write the source and header file of the class, and also a python script that allows the user to use the class from the command line. This represents a great improvement with respect to the usual user code, which needs many lines of code to do the same thing. (Link for the class: https://root.cern.ch/doc/master/classTSimpleAnalysis.html)

  3. Fast Graph Partitioning Active Contours for Image Segmentation Using Histograms

    Directory of Open Access Journals (Sweden)

    Nath, Sumit K.

    2009-01-01

    Abstract: We present a method to improve the accuracy and speed, as well as significantly reduce the memory requirements, of the recently proposed Graph Partitioning Active Contours (GPAC) algorithm for image segmentation in the work of Sumengen and Manjunath (2006). Instead of computing an approximate but still expensive dissimilarity matrix, whose size is quadratic in the image size, we use fixed-length histograms and an intensity-based symmetric-centrosymmetric extensor matrix to jointly compute the terms associated with the complete dissimilarity matrix over regular image tiles. This computationally efficient reformulation of GPAC, using a very small memory footprint, offers two distinct advantages over the original implementation: it speeds up convergence of the evolving active contour and seamlessly extends the performance of GPAC to multidimensional images.

  4. Adaptive Kalman filtering for histogram-based appearance learning in infrared imagery.

    Science.gov (United States)

    Venkataraman, Vijay; Fan, Guoliang; Havlicek, Joseph P; Fan, Xin; Zhai, Yan; Yeary, Mark B

    2012-11-01

    Targets of interest in video acquired from imaging infrared sensors often exhibit profound appearance variations due to a variety of factors, including complex target maneuvers, ego-motion of the sensor platform, background clutter, etc., making it difficult to maintain a reliable detection process and track lock over extended time periods. Two key issues in overcoming this problem are how to represent the target and how to learn its appearance online. In this paper, we adopt a recent appearance model that estimates the pixel intensity histograms as well as the distribution of local standard deviations in both the foreground and background regions for robust target representation. Appearance learning is then cast as an adaptive Kalman filtering problem where the process and measurement noise variances are both unknown. We formulate this problem using both covariance matching and, for the first time in a visual tracking application, the recent autocovariance least-squares (ALS) method. Although convergence of the ALS algorithm is guaranteed only for the case of globally wide sense stationary process and measurement noises, we demonstrate for the first time that the technique can often be applied with great effectiveness under the much weaker assumption of piecewise stationarity. The performance advantages of the ALS method relative to the classical covariance matching are illustrated by means of simulated stationary and nonstationary systems. Against real data, our results show that the ALS-based algorithm outperforms the covariance matching as well as the traditional histogram similarity-based methods, achieving sub-pixel tracking accuracy against the well-known AMCOM closure sequences and the recent SENSIAC automatic target recognition dataset.
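    For orientation, a one-dimensional covariance-matching sketch, the classical baseline the paper compares ALS against: the measurement-noise variance R is re-estimated online by matching the empirical innovation variance to its model value S = P + R. The dynamics, window length, and noise levels are synthetic.

```python
import numpy as np

rng = np.random.default_rng(5)
n, Q, R_true = 500, 0.01, 4.0
x_true = np.cumsum(rng.normal(0, np.sqrt(Q), n))    # random-walk state
z = x_true + rng.normal(0, np.sqrt(R_true), n)      # noisy measurements

x, P, R = 0.0, 1.0, 1.0        # state, covariance, and initial R guess
window, innovations = 50, []
for k in range(n):
    P += Q                     # predict (identity dynamics)
    nu = z[k] - x              # innovation
    innovations.append(nu)
    K = P / (P + R)            # Kalman gain
    x += K * nu
    P *= 1 - K
    # covariance matching: empirical innovation variance ~ P + R
    if len(innovations) >= window:
        S_hat = np.mean(np.square(innovations[-window:]))
        R = max(S_hat - P, 1e-6)

print(f"estimated R ~ {R:.2f} (true value {R_true})")
```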

  5. Optimism and well-being: a prospective multi-method and multi-dimensional examination of optimism as a resilience factor following the occurrence of stressful life events.

    Science.gov (United States)

    Kleiman, Evan M; Chiara, Alexandra M; Liu, Richard T; Jager-Hyman, Shari G; Choi, Jimmy Y; Alloy, Lauren B

    2017-02-01

    Optimism has been conceptualised variously as positive expectations (PE) for the future, optimistic attributions, illusion of control, and self-enhancing biases. Relatively little research has examined these multiple dimensions of optimism in relation to psychological and physical health. The current study assessed the multi-dimensional nature of optimism within a prospective vulnerability-stress framework. Initial principal component analyses revealed the following dimensions: PEs, Inferential Style (IS), Sense of Invulnerability (SI), and Overconfidence (O). Prospective follow-up analyses demonstrated that PE was associated with fewer depressive episodes and moderated the effect of stressful life events on depressive symptoms. SI also moderated the effect of life stress on anxiety symptoms. Generally, our findings indicated that optimism is a multifaceted construct and that not all forms of optimism have the same effects on well-being. Specifically, PE may be the most relevant to depression, whereas SI may be the most relevant to anxiety.

  6. The discharge behavior of lithium-ion batteries using the Dual-Potential Multi-Scale Multi-Dimensional (MSMD) Battery Model

    DEFF Research Database (Denmark)

    Saeed Madani, Seyed; Swierczynski, Maciej Jozef; Kær, Søren Knudsen

    2017-01-01

    This paper gives insight into the discharge behavior of lithium-ion batteries based on the investigations that have been done by researchers [1-19]. In this article, the battery's discharge behaviour at various discharge rates is studied, and the surface monitor plot of the discharge curve and the volume monitor ... are used to analyse the discharge behaviour of lithium-ion batteries. The results show that the surface monitor plot of the discharge curve at 1 C has a decreasing trend and the volume monitor plot of the maximum temperature in the domain has a slightly increasing pattern over the simulation time. For the curves of discharge ... the plots of the maximum temperature in the domain and the maximum temperature in the area are illustrated. Additionally, an external and internal short-circuit treatment for three cases has been studied. The Dual-Potential Multi-Scale Multi-Dimensional (MSMD) Battery Model (BM) was used via ANSYS FLUENT software...

  7. On Some New Properties of the Fundamental Solution to the Multi-Dimensional Space- and Time-Fractional Diffusion-Wave Equation

    Directory of Open Access Journals (Sweden)

    Yuri Luchko

    2017-12-01

    In this paper, some new properties of the fundamental solution to the multi-dimensional space- and time-fractional diffusion-wave equation are deduced. We start with the Mellin-Barnes representation of the fundamental solution that was derived in previous publications of the author. The Mellin-Barnes integral is used to obtain two new representations of the fundamental solution in the form of the Mellin convolution of special functions of the Wright type. Moreover, some new closed-form formulas for particular cases of the fundamental solution are derived. In particular, we solve the open problem of representing the fundamental solution to the two-dimensional neutral-fractional diffusion-wave equation in terms of known special functions.

  8. The multi-dimensional talent support tool (mBET): a systemic approach towards individualized support of the gifted and talented in Austria

    Directory of Open Access Journals (Sweden)

    Johanna Stahl

    2015-03-01

    Providing gifted students with personalized talent development programs is a challenge for teachers and educators alike. The multi-dimensional talent support tool (mBET) guides teachers on their way to individualized gifted programs. Within a holistic and systemic concept of giftedness, the mBET brings together the perspectives of teachers, parents, and the individual student in assessing talents as well as relevant personality characteristics and environmental factors. By facilitating support-oriented round-table talks, the mBET helps teachers, parents, and students develop individually tailored talent development programs, taking into consideration both talents and other factors relevant for successful gifted education (i.e., non-cognitive personality characteristics and environmental factors).

  9. Symbol recognition via statistical integration of pixel-level constraint histograms: a new descriptor.

    Science.gov (United States)

    Yang, Su

    2005-02-01

    A new descriptor for symbol recognition is proposed. 1) A histogram is constructed for every pixel to figure out the distribution of the constraints among the other pixels. 2) All the histograms are statistically integrated to form a feature vector with fixed dimension. The robustness and invariance were experimentally confirmed.

  10. Hand Vein Images Enhancement Based on Local Gray-level Information Histogram

    Directory of Open Access Journals (Sweden)

    Jun Wang

    2015-06-01

    Based on histogram equalization theory, this paper presents a novel histogram concept to realize contrast enhancement of hand vein images while avoiding the loss of topological vein structure or the introduction of fake vein information. Firstly, we propose the concept of the gray-level information histogram, whose fundamental characteristic is that the amplitudes of its components objectively reflect the contribution of the gray levels to the representation of image information. Then, we propose a histogram equalization method composed of an automatic histogram separation module and an intensity transformation module. The separation module combines the proposed prompt multiple-threshold procedure with an optimum peak signal-to-noise ratio (PSNR) calculation to separate the histogram into small-scale details, and the intensity transformation module enhances the vein images with preservation of the vein topological structure and gray information for each generated sub-histogram. Experimental results show that the proposed method can achieve an extremely good contrast enhancement effect.

  11. Infrared Contrast Enhancement Through Log-Power Histogram Modification

    NARCIS (Netherlands)

    Toet, A.; Wu, T.

    2015-01-01

    A simple power-logarithm histogram modification operator is proposed to enhance infrared (IR) image contrast. The algorithm combines a logarithm operator that smoothes the input image histogram while retaining the relative ordering of the original bins, with a power operator that restores the

  12. Thresholding using two-dimensional histogram and watershed algorithm in the luggage inspection system

    International Nuclear Information System (INIS)

    Chen Jingyun; Cong Peng; Song Qi

    2006-01-01

    The authors present a new DR image segmentation method based on two-dimensional histogram and watershed algorithm. The authors use watershed algorithm to locate threshold on the vertical projection plane of two-dimensional histogram. This method is applied to the segmentation of DR images produced by luggage inspection system with DR-CT. The advantage of this method is also analyzed. (authors)
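
    As an illustration of the two-dimensional histogram idea described above, the sketch below builds the (gray level, local mean) histogram with NumPy/SciPy and locates a threshold on its vertical projection. A simple valley search between the two dominant peaks stands in for the watershed step used by the authors; the 3x3 local-mean window, the bin count and the peak guard are assumptions of mine, not details from the paper.

      import numpy as np
      from scipy import ndimage

      def threshold_from_2d_histogram(img, bins=64):
          """Pick a gray-level threshold from the vertical projection of the
          (gray level, local mean) 2-D histogram; a valley search replaces
          the paper's watershed step in this sketch."""
          local_mean = ndimage.uniform_filter(img.astype(float), size=3)
          h2d, gray_edges, _ = np.histogram2d(img.ravel(), local_mean.ravel(),
                                              bins=bins)
          projection = h2d.sum(axis=1)        # profile along the gray-level axis
          p1 = int(np.argmax(projection))     # strongest peak
          masked = projection.copy()
          masked[max(p1 - 5, 0):p1 + 5] = 0   # suppress it to find the second peak
          p2 = int(np.argmax(masked))
          lo, hi = sorted((p1, p2))
          valley = lo + int(np.argmin(projection[lo:hi + 1]))
          return gray_edges[valley]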

  13. Curvature histogram features for retrieval of images of smooth 3D objects

    International Nuclear Information System (INIS)

    Zhdanov, I; Scherbakov, O; Potapov, A; Peterson, M

    2014-01-01

    We consider image features based on histograms of oriented gradients (HOG) augmented with a contour curvature histogram (HOG-CH), and compare them with results of the well-known scale-invariant feature transform (SIFT) approach as applied to the retrieval of images of smooth 3D objects.

  14. Cross-interval histogram analysis of neuronal activity on multi-electrode arrays

    NARCIS (Netherlands)

    Castellone, P.; Rutten, Wim; Marani, Enrico

    2003-01-01

    Cross-neuron-interval histogram (CNIH) analysis has been performed in order to study correlated activity and connectivity between pairs of neurons in a spontaneously active developing cultured network of rat cortical cells. Thirty-eight histograms could be analyzed using two parameters, one for the

  15. Treatment plan evaluation using dose-volume histogram (DVH) and spatial dose-volume histogram (zDVH)

    International Nuclear Information System (INIS)

    Cheng, C.-W.; Das, Indra J.

    1999-01-01

    Objective: The dose-volume histogram (DVH) has been accepted as a tool for treatment-plan evaluation. However, DVH lacks spatial information. A new concept, the z-dependent dose-volume histogram (zDVH), is presented as a supplement to the DVH in three-dimensional (3D) treatment planning to provide the spatial variation, as well as the size and magnitude of the different dose regions within a region of interest. Materials and Methods: Three-dimensional dose calculations were carried out with various plans for three disease sites: lung, breast, and prostate. DVHs were calculated for the entire volume. A zDVH is defined as a differential dose-volume histogram with respect to a computed tomographic (CT) slice position. In this study, zDVHs were calculated for each CT slice in the treatment field. DVHs and zDVHs were compared. Results: In the irradiation of lung, DVH calculation indicated that the treatment plan satisfied the dose-volume constraint placed on the lung and zDVH of the lung revealed that a sizable fraction of the lung centered about the central axis (CAX) received a significant dose, a situation that warranted a modification of the treatment plan due to the removal of one lung. In the irradiation of breast with tangential fields, the DVH showed that about 7% of the breast volume received at least 110% of the prescribed dose (PD) and about 11% of the breast received less than 98% PD. However, the zDVHs of the breast volume in each of seven planes showed the existence of high-dose regions of 34% and 15%, respectively, of the volume in the two caudal-most planes and cold spots of about 40% in the two cephalic planes. In the treatment planning of prostate, DVHs showed that about 15% of the bladder and 40% of the rectum received 102% PD, whereas about 30% of the bladder and 50% of the rectum received the full dose. Taking into account the hollow structure of both the bladder and the rectum, the dose-surface histograms (DSH) showed larger hot-spot volume, about
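
    To make the DVH/zDVH distinction concrete: a cumulative DVH aggregates dose over the whole structure, while a zDVH is a differential dose-volume histogram computed per CT slice. The sketch below is a minimal NumPy rendering of that distinction; the array layout (3-D dose grid, boolean structure mask, slice-index grid) and the bin count are assumptions, not the authors' implementation.

      import numpy as np

      def cumulative_dvh(dose, mask, bins=100):
          """Cumulative DVH: % of structure volume receiving at least each dose."""
          d = dose[mask]
          hist, edges = np.histogram(d, bins=bins, range=(0.0, d.max()))
          volume = hist[::-1].cumsum()[::-1] / d.size * 100.0
          return edges[:-1], volume

      def zdvh(dose, mask, z_index, bins=100):
          """zDVH: one differential dose histogram per CT slice, exposing where
          along the cranio-caudal axis hot and cold regions are located."""
          dmax = dose[mask].max()
          out = {}
          for z in np.unique(z_index[mask]):
              sl = mask & (z_index == z)
              hist, edges = np.histogram(dose[sl], bins=bins, range=(0.0, dmax))
              out[int(z)] = (edges[:-1], hist / sl.sum() * 100.0)
          return out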

  16. Optimized broad-histogram simulations for strong first-order phase transitions: droplet transitions in the large-Q Potts model

    Science.gov (United States)

    Bauer, Bela; Gull, Emanuel; Trebst, Simon; Troyer, Matthias; Huse, David A.

    2010-01-01

    The numerical simulation of strongly first-order phase transitions has remained a notoriously difficult problem even for classical systems due to the exponentially suppressed (thermal) equilibration in the vicinity of such a transition. In the absence of efficient update techniques, a common approach for improving equilibration in Monte Carlo simulations is broadening the sampled statistical ensemble beyond the bimodal distribution of the canonical ensemble. Here we show how a recently developed feedback algorithm can systematically optimize such broad-histogram ensembles and significantly speed up equilibration in comparison with other extended ensemble techniques such as flat-histogram, multicanonical and Wang-Landau sampling. We simulate, as a prototypical example of a strong first-order transition, the two-dimensional Potts model with up to Q = 250 different states in large systems. The optimized histogram develops a distinct multi-peak structure, thereby resolving entropic barriers and their associated phase transitions in the phase coexistence region—such as droplet nucleation and annihilation, and droplet-strip transitions for systems with periodic boundary conditions. We characterize the efficiency of the optimized histogram sampling by measuring round-trip times τ(N, Q) across the phase transition for samples comprised of N spins. While we find power-law scaling of τ versus N for small Q ≲ 50 and N ≲ 40², we observe a crossover to exponential scaling for larger Q. These results demonstrate that despite the ensemble optimization, broad-histogram simulations cannot fully eliminate the supercritical slowing down at strongly first-order transitions.
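
    For orientation, the flat-histogram baseline that the feedback-optimized ensemble improves upon can be written down compactly. The sketch below is a plain Wang-Landau estimate of the density of states g(E) for the 2-D Q-state Potts model with periodic boundaries; the lattice size, sweep length, flatness criterion and stopping threshold are arbitrary choices of mine, not the settings of the study.

      import numpy as np

      def wang_landau_potts(L=8, Q=10, lnf_final=1e-6, flatness=0.8, seed=0):
          """Wang-Landau sampling of ln g(E) for the 2-D Q-state Potts model."""
          rng = np.random.default_rng(seed)
          spins = rng.integers(Q, size=(L, L))
          n_bonds = 2 * L * L                  # periodic boundary conditions
          def energy(s):
              return -(np.sum(s == np.roll(s, 1, 0)) + np.sum(s == np.roll(s, 1, 1)))
          lng = np.zeros(n_bonds + 1)          # ln g(E) for E = -n_bonds .. 0
          hist = np.zeros_like(lng)
          E, lnf = energy(spins), 1.0
          while lnf > lnf_final:
              for _ in range(10000):
                  i, j = rng.integers(L, size=2)
                  new = rng.integers(Q)
                  nbrs = [spins[(i + 1) % L, j], spins[(i - 1) % L, j],
                          spins[i, (j + 1) % L], spins[i, (j - 1) % L]]
                  # bonds to the old value break, bonds to the new value form
                  dE = sum(n == spins[i, j] for n in nbrs) - sum(n == new for n in nbrs)
                  if np.log(rng.random()) < lng[E + n_bonds] - lng[E + dE + n_bonds]:
                      spins[i, j], E = new, E + dE
                  lng[E + n_bonds] += lnf      # update at the current energy
                  hist[E + n_bonds] += 1
              visited = hist > 0
              if hist[visited].min() > flatness * hist[visited].mean():
                  hist[:] = 0
                  lnf /= 2.0                   # refine the modification factor
          return lng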

  17. Disentangling the health benefits of walking from increased exposure to falls in older people using remote gait monitoring and multi-dimensional analysis.

    Science.gov (United States)

    Brodie, Matthew A; Okubo, Yoshiro; Annegarn, Janneke; Wieching, Rainer; Lord, Stephen R; Delbaere, Kim

    2017-01-01

    Falls and physical deconditioning are two major health problems for older people. Recent advances in remote physiological monitoring provide new opportunities to investigate why walking exercise, with its many health benefits, can both increase and decrease fall rates in older people. In this paper we combine remote wearable-device monitoring of daily gait with non-linear multi-dimensional pattern recognition analysis to disentangle the complex associations between walking, health and fall rates. One week of activities of daily living (ADL) was recorded with a wearable device in 96 independent-living older people prior to completing 6 months of exergaming interventions. Using the wearable-device data, the quantity, intensity, variability and distribution of daily walking patterns were assessed. At baseline, clinical assessments of health, falls, and sensorimotor and physiological fall risks were completed. At 6 months, fall rates and sensorimotor and physiological fall risks were re-assessed. A non-linear multi-dimensional analysis was conducted to identify risk groups according to their daily walking patterns. Four distinct risk groups were identified: the Impaired (93% fallers), Restrained (8% fallers), Active (50% fallers) and Athletic (4% fallers). Walking was strongly associated with multiple health benefits and protective of falls for the top-performing Athletic risk group. However, in the middle of the spectrum, the Active risk group, who were more active, younger and healthier, were 6.25 times more likely to be fallers than their Restrained counterparts. Remote monitoring of daily walking patterns may provide a new way to distinguish Impaired people at risk of falling because of frailty from Active people at risk of falling from greater exposure to situations where falls could occur, but further validation is required. Wearable-device risk profiling could help in developing more personalised interventions for older people seeking the health benefits of walking.

  18. Neutron stars as X-ray burst sources. II. Burst energy histograms and why they burst

    International Nuclear Information System (INIS)

    Baan, W.A.

    1979-01-01

    In this work we explore some of the implications of a model for X-ray burst sources where bursts are caused by Kruskal-Schwarzschild instabilities at the magnetopause of an accreting and rotating neutron star. A number of simplifying assumptions are made in order to test the model using observed burst-energy histograms for the rapid burster MXB 1730-335. The predicted histograms have a correct general shape, but it appears that other effects are important as well, and that mode competition, for instance, may suppress the histograms at high burst energies. An explanation is ventured for the enhancement in the histogram at the highest burst energies, which produces the bimodal shape in high accretion rate histograms. Quantitative criteria are given for deciding when accreting neutron stars are steady sources or burst sources, and these criteria are tested using the X-ray pulsars.

  19. Whole-Lesion Histogram Analysis of Apparent Diffusion Coefficient for the Assessment of Cervical Cancer.

    Science.gov (United States)

    Guan, Yue; Shi, Hua; Chen, Ying; Liu, Song; Li, Weifeng; Jiang, Zhuoran; Wang, Huanhuan; He, Jian; Zhou, Zhengyang; Ge, Yun

    2016-01-01

    The aim of this study was to explore the application of whole-lesion histogram analysis of apparent diffusion coefficient (ADC) values in cervical cancer. A total of 54 women (mean age, 53 years) with cervical cancers prospectively underwent 3-T diffusion-weighted imaging with b values of 0 and 800 s/mm². Whole-lesion histogram analysis of ADC values was performed. A paired-sample t test was used to compare differences in ADC histogram parameters between cervical cancers and normal cervical tissues. Receiver operating characteristic curves were constructed to identify the optimal threshold of each parameter. All histogram parameters in this study, including ADCmean, ADCmin, ADC10%-ADC90%, mode, skewness, and kurtosis, were significantly lower in cervical cancers than in normal cervical tissues (all P values significant). Whole-lesion histogram analysis of ADC maps is useful in the assessment of cervical cancer.
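
    The histogram parameters listed in this record are simple statistics of the voxel values inside the segmented lesion. A short sketch (the bin count and the tallest-bin mode estimator are my choices, not the study's protocol):

      import numpy as np
      from scipy import stats

      def adc_histogram_features(adc_values):
          """Whole-lesion ADC histogram parameters: mean, minimum, deciles
          ADC10%..ADC90%, mode (tallest-bin centre), skewness and kurtosis."""
          v = np.asarray(adc_values, dtype=float).ravel()
          deciles = np.percentile(v, range(10, 100, 10))
          hist, edges = np.histogram(v, bins=64)
          peak = int(np.argmax(hist))
          feats = {"ADCmean": v.mean(), "ADCmin": v.min(),
                   "mode": 0.5 * (edges[peak] + edges[peak + 1]),
                   "skewness": stats.skew(v), "kurtosis": stats.kurtosis(v)}
          feats.update({f"ADC{p}%": d for p, d in zip(range(10, 100, 10), deciles)})
          return feats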

  20. PARAFAC: a chemometric tool for multi-dimensional data treatment. Applications in direct determination of drugs in human plasma by spectrofluorimetry

    Directory of Open Access Journals (Sweden)

    Marcelo M. Sena

    2005-10-01

    Full Text Available Since the last decade, the combined use of chemometrics and molecular spectroscopic techniques has become a new alternative for direct drug determination, without the need for physical separation. Among the new methodologies developed, the application of PARAFAC in the decomposition of spectrofluorimetric data should be highlighted. The first objective of this article is to describe the theoretical basis of PARAFAC. For this purpose, a discussion about the order of chemometric methods used in multivariate calibration and the development of multi-dimensional methods is presented first. The other objective of this article is to present to the Brazilian chemical community the potential of the PARAFAC/spectrofluorimetry combination for the determination of drugs in complex biological matrices. For this purpose, two applications, aimed at determining doxorubicin and salicylate, respectively, in human plasma, are presented.

  1. Stochastic learning of multi-instance dictionary for earth mover’s distance-based histogram comparison

    KAUST Repository

    Fan, Jihong; Liang, Ru-Ze

    2016-01-01

    The dictionary plays an important role in multi-instance data representation. It maps bags of instances to histograms. Earth mover's distance (EMD) is the most effective histogram distance metric for the application of multi-instance retrieval. However

  2. Novel multi-dimensional heteronuclear NMR techniques for the study of 13C-O-acetylated oligosaccharides: Expanding the dimensions for carbohydrate structures

    Energy Technology Data Exchange (ETDEWEB)

    Jones, David N.M. [University of Colorado Health Sciences Center, Departments of Pharmacology (United States); Bendiak, Brad [University of Colorado Health Sciences Center, Departments of Cellular and Structural Biology (United States)

    1999-10-15

    Complex carbohydrates have critical roles in a wide variety of biological processes. An understanding of the molecular mechanisms that underlie these processes is essential in the development of novel oligosaccharide-based therapeutic strategies. Unfortunately, obtaining detailed structural information for larger oligosaccharides (>10 residues) can be exceedingly difficult, especially where the amount of sample available is limited. Here we demonstrate the application of ¹³C-O-acetylation in combination with novel NMR experiments to obtain much of the information required to characterize the primary structure of oligosaccharides. (H)CMeCOH-HEHAHA and H(CMe)COH-HEHAHA experiments are presented that use heteronuclear Hartmann-Hahn transfer to correlate the acetyl groups with sugar ring protons in peracetylated oligosaccharides. The in-phase, pure absorption nature of the correlation peaks in these experiments allows measurement of both chemical shifts and, importantly, ¹H-¹H coupling constants that are used to define the stereochemistry of the sugar ring. The (HCMe)COH and (HCMe)COH-RELAY experiments provide additional methods for obtaining chemical shift assignments for larger oligosaccharides to define the sites of glycosidic linkages from the patterns of acetylation.

  3. Using the Bootstrap Method for a Statistical Significance Test of Differences between Summary Histograms

    Science.gov (United States)

    Xu, Kuan-Man

    2006-01-01

    A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
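
    As a rough illustration of the procedure described above: pool the individual histograms, resample them with replacement under the null hypothesis of no difference, and compare the observed Euclidean distance between the two summary histograms against the resampled distribution. The normalization and the number of resamples below are assumptions of mine.

      import numpy as np

      def summary_histogram(hists):
          """Sum individual histograms (rows) and normalize to unit area."""
          s = np.asarray(hists, dtype=float).sum(axis=0)
          return s / s.sum()

      def bootstrap_p_value(hists_a, hists_b, n_boot=2000, seed=0):
          """Bootstrap significance level of the Euclidean distance between
          two summary histograms."""
          rng = np.random.default_rng(seed)
          observed = np.linalg.norm(summary_histogram(hists_a)
                                    - summary_histogram(hists_b))
          pooled = np.vstack([hists_a, hists_b])
          na, nb = len(hists_a), len(hists_b)
          exceed = 0
          for _ in range(n_boot):
              a = pooled[rng.integers(len(pooled), size=na)]  # resample under H0
              b = pooled[rng.integers(len(pooled), size=nb)]
              if np.linalg.norm(summary_histogram(a) - summary_histogram(b)) >= observed:
                  exceed += 1
          return exceed / n_boot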

  4. Histogram analysis of diffusion measures in clinically isolated syndromes and relapsing-remitting multiple sclerosis

    International Nuclear Information System (INIS)

    Yu Chunshui; Lin Fuchun; Liu Yaou; Duan Yunyun; Lei Hao; Li Kuncheng

    2008-01-01

    Objective: The purposes of our study were to employ diffusion tensor imaging (DTI)-based histogram analysis to determine the presence of occult damage in clinically isolated syndrome (CIS), to compare its severity with relapsing-remitting multiple sclerosis (RRMS), and to determine correlations between DTI histogram measures and clinical and MRI indices in these two diseases. Materials and methods: DTI scans were performed in 19 CIS and 19 RRMS patients and 19 matched healthy volunteers. Histogram analyses of mean diffusivity and fractional anisotropy were performed in normal-appearing brain tissue (NABT), normal-appearing white matter (NAWM) and gray matter (NAGM). Correlations were analyzed between these measures and expanded disability status scale (EDSS) scores, T2WI lesion volumes (LV) and normalized brain tissue volumes (NBTV) in CIS and RRMS patients. Results: Significant differences were found among CIS, RRMS and control groups in the NBTV and most of the DTI histogram measures of the NABT, NAWM and NAGM. In CIS patients, some DTI histogram measures showed significant correlations with LV and NBTV, but none of them with EDSS. In RRMS patients, however, some DTI histogram measures were significantly correlated with LV, NBTV and EDSS. Conclusion: Occult damage occurs in both NAGM and NAWM in CIS, but the severity is milder than that in RRMS. In CIS and RRMS, the occult damage might be related to both T2 lesion load and brain tissue atrophy. Some DTI histogram measures might be useful for assessing the disease progression in RRMS patients.

  5. Parameterization of the Age-Dependent Whole Brain Apparent Diffusion Coefficient Histogram

    Science.gov (United States)

    Batra, Marion; Nägele, Thomas

    2015-01-01

    Purpose. The distribution of apparent diffusion coefficient (ADC) values in the brain can be used to characterize age effects and pathological changes of the brain tissue. The aim of this study was the parameterization of the whole brain ADC histogram by an advanced model with influence of age considered. Methods. Whole brain ADC histograms were calculated for all data and for seven age groups between 10 and 80 years. Modeling of the histograms was performed for two parts of the histogram separately: the brain tissue part was modeled by two Gaussian curves, while the remaining part was fitted by the sum of a Gaussian curve, a biexponential decay, and a straight line. Results. A consistent fitting of the histograms of all age groups was possible with the proposed model. Conclusions. This study confirms the strong dependence of the whole brain ADC histograms on the age of the examined subjects. The proposed model can be used to characterize changes of the whole brain ADC histogram in certain diseases under consideration of age effects. PMID:26609526

  6. Histogram analysis of T2*-based pharmacokinetic imaging in cerebral glioma grading.

    Science.gov (United States)

    Liu, Hua-Shan; Chiang, Shih-Wei; Chung, Hsiao-Wen; Tsai, Ping-Huei; Hsu, Fei-Ting; Cho, Nai-Yu; Wang, Chao-Ying; Chou, Ming-Chung; Chen, Cheng-Yu

    2018-03-01

    To investigate the feasibility of histogram analysis of the T2*-based permeability parameter volume transfer constant (Ktrans) for glioma grading and to explore the diagnostic performance of the histogram analysis of Ktrans and blood plasma volume (vp). We recruited 31 and 11 patients with high- and low-grade gliomas, respectively. The histogram parameters of Ktrans and vp, derived from first-pass pharmacokinetic modeling based on T2* dynamic susceptibility-weighted contrast-enhanced perfusion-weighted magnetic resonance imaging (T2* DSC-PW-MRI) of the entire tumor volume, were evaluated for differentiating glioma grades. Histogram parameters of Ktrans and vp showed significant differences between high- and low-grade gliomas and exhibited significant correlations with tumor grades. The mean Ktrans derived from T2* DSC-PW-MRI had the highest sensitivity and specificity for differentiating high-grade from low-grade gliomas compared with other histogram parameters of Ktrans and vp. Histogram analysis of T2*-based pharmacokinetic imaging is useful for cerebral glioma grading. The histogram parameters of the entire-tumor Ktrans measurement can provide increased accuracy with additional information regarding microvascular permeability changes for identifying high-grade brain tumors. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Parameterization of the Age-Dependent Whole Brain Apparent Diffusion Coefficient Histogram

    Directory of Open Access Journals (Sweden)

    Uwe Klose

    2015-01-01

    Full Text Available Purpose. The distribution of apparent diffusion coefficient (ADC) values in the brain can be used to characterize age effects and pathological changes of the brain tissue. The aim of this study was the parameterization of the whole brain ADC histogram by an advanced model with influence of age considered. Methods. Whole brain ADC histograms were calculated for all data and for seven age groups between 10 and 80 years. Modeling of the histograms was performed for two parts of the histogram separately: the brain tissue part was modeled by two Gaussian curves, while the remaining part was fitted by the sum of a Gaussian curve, a biexponential decay, and a straight line. Results. A consistent fitting of the histograms of all age groups was possible with the proposed model. Conclusions. This study confirms the strong dependence of the whole brain ADC histograms on the age of the examined subjects. The proposed model can be used to characterize changes of the whole brain ADC histogram in certain diseases under consideration of age effects.
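
    The brain-tissue part of the proposed model (two Gaussian curves) reduces to a standard least-squares fit. The sketch below fits synthetic data with SciPy; the peak positions, widths and units are illustrative assumptions, and the non-tissue part of the model (a further Gaussian, a biexponential decay and a straight line) is omitted.

      import numpy as np
      from scipy.optimize import curve_fit

      def tissue_model(x, a1, m1, s1, a2, m2, s2):
          """Two-Gaussian model of the brain-tissue part of the ADC histogram."""
          g = lambda a, m, s: a * np.exp(-0.5 * ((x - m) / s) ** 2)
          return g(a1, m1, s1) + g(a2, m2, s2)

      # Synthetic stand-in for a measured whole-brain ADC histogram (10^-3 mm^2/s)
      centers = np.linspace(0.2, 1.6, 120)
      rng = np.random.default_rng(1)
      counts = tissue_model(centers, 1.0, 0.72, 0.08, 0.5, 0.95, 0.15)
      counts = counts + rng.normal(0.0, 0.01, centers.size)

      popt, _ = curve_fit(tissue_model, centers, counts,
                          p0=[1.0, 0.7, 0.1, 0.4, 1.0, 0.2])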

  8. An evaluation of an improved method for computing histograms in dynamic tracer studies using positron-emission tomography

    International Nuclear Information System (INIS)

    Ollinger, J.M.; Snyder, D.L.

    1986-01-01

    A method for computing approximate minimum-mean-square-error estimates of histograms from list-mode data for use in dynamic tracer studies is evaluated. Parameters estimated from these histograms are significantly more accurate than those estimated from histograms computed by a commonly used method.

  9. Evaluation of low-grade glioma structural changes after chemotherapy using DTI-based histogram analysis and functional diffusion maps

    Energy Technology Data Exchange (ETDEWEB)

    Castellano, Antonella; Iadanza, Antonella; Falini, Andrea [San Raffaele Scientific Institute and Vita-Salute San Raffaele University, Neuroradiology Unit and CERMAC, Milano (Italy); Donativi, Marina [University of Salento, Department of Mathematics and Physics "Ennio De Giorgi" and A.D.A.M. (Advanced Data Analysis in Medicine), Lecce (Italy); Ruda, Roberta; Bertero, Luca; Soffietti, Riccardo [University of Torino, Department of Neuro-oncology, Turin (Italy); De Nunzio, Giorgio [University of Salento, Department of Mathematics and Physics "Ennio De Giorgi" and A.D.A.M. (Advanced Data Analysis in Medicine), Lecce (Italy); INFN (National Institute of Nuclear Physics), Lecce (Italy); Riva, Marco; Bello, Lorenzo [Universita degli Studi di Milano, Milan, and Humanitas Research Hospital, Department of Medical Biotechnology and Translational Medicine, Rozzano, MI (Italy); Rucco, Matteo [University of Camerino, School of Science and Technology, Computer Science Division, Camerino, MC (Italy)

    2016-05-15

    To explore the role of diffusion tensor imaging (DTI)-based histogram analysis and functional diffusion maps (fDMs) in evaluating structural changes of low-grade gliomas (LGGs) receiving temozolomide (TMZ) chemotherapy. Twenty-one LGG patients underwent 3T-MR examinations before and after three and six cycles of dose-dense TMZ, including 3D-fluid-attenuated inversion recovery (FLAIR) sequences and DTI (b = 1000 s/mm², 32 directions). Mean diffusivity (MD), fractional anisotropy (FA), and tensor-decomposition DTI maps (p and q) were obtained. Histogram and fDM analyses were performed on co-registered baseline and post-chemotherapy maps. DTI changes were compared with modifications of tumour area and volume [according to Response Assessment in Neuro-Oncology (RANO) criteria], and seizure response. After three cycles of TMZ, 20/21 patients were stable according to RANO criteria, but DTI changes were observed in all patients (Wilcoxon test, P ≤ 0.03). After six cycles, DTI changes were more pronounced (P ≤ 0.005). Seventy-five percent of patients had an early seizure response with significant improvement of DTI values, maintaining stability on FLAIR. Early changes of the 25th percentiles of p and MD predicted final volume change (R² = 0.614 and 0.561, P < 0.0005, respectively). TMZ-related changes were located mainly at tumour borders on p and MD fDMs. DTI-based histogram and fDM analyses are useful techniques to evaluate the early effects of TMZ chemotherapy in LGG patients. (orig.)

  10. HEp-2 Cell Classification Using Shape Index Histograms With Donut-Shaped Spatial Pooling

    DEFF Research Database (Denmark)

    Larsen, Anders Boesen Lindbo; Vestergaard, Jacob Schack; Larsen, Rasmus

    2014-01-01

    We present a new method for automatic classification of indirect immunofluorescence images of HEp-2 cells into different staining pattern classes. Our method is based on a new texture measure called shape index histograms that captures second-order image structure at multiple scales. Moreover, we ... datasets. Our results show that shape index histograms are superior to other popular texture descriptors for HEp-2 cell classification. Moreover, when comparing to other automated systems for HEp-2 cell classification we show that shape index histograms are very competitive; especially considering ...

  11. An alternative to γ histograms for ROI-based quantitative dose comparisons

    International Nuclear Information System (INIS)

    Dvorak, P

    2009-01-01

    An alternative to gamma (γ) histograms for ROI-based quantitative comparisons of dose distributions using the γ concept is proposed. The method provides minimum values of dose difference and distance-to-agreement such that a pre-set fraction of the region of interest passes the γ test. Compared to standard γ histograms, the method provides more information in terms of pass rate per γ calculation. This is achieved at negligible additional calculation cost and without loss of accuracy. The presented method is proposed as a useful and complementary alternative to standard γ histograms, increasing both the quantity and quality of information for use in acceptance or rejection decisions. (note)

  12. Making ceramics used for compound environment into multi-composite and evaluation of their multi-dimensional system

    International Nuclear Information System (INIS)

    Mitsuhashi, Takefumi

    1996-01-01

    In order to greatly advance current nuclear power technology, the development of boundary materials suitable for use between environments with largely different properties is indispensable. In the first-period research, ceramics were found to have corrosion resistance in liquid sodium far superior to that of metals. In addition, boundary materials require thermal, mechanical and radiation-resistant properties. The second-period project aims to establish the basic technology for synthesizing multi-composite materials that combine the excellent characteristics of individual monolithic ceramics. The liquid sodium immersion tests of various ceramics from the first-period research are reported, and the diffusion of sodium in ceramics was also examined. As a simplified quick evaluation technique, a corrosion test in KOH solution was carried out. As for ceramic multi-composites, Y ions were implanted into the surface of alumina, and the resulting changes in structure and corrosion resistance were examined. The surface condition of the ceramics and the adsorption of alkali metals were also investigated. (K.I.)

  13. Development and application of a living probabilistic safety assessment tool: Multi-objective multi-dimensional optimization of surveillance requirements in NPPs considering their ageing

    International Nuclear Information System (INIS)

    Kančev, Duško; Čepin, Marko; Gjorgiev, Blaže

    2014-01-01

    The benefits of utilizing probabilistic safety assessment towards the improvement of nuclear power plant safety are presented in this paper. Namely, a nuclear power plant risk reduction can be achieved by risk-informed optimization of the deterministically-determined surveillance requirements. A living probabilistic safety assessment tool for time-dependent risk analysis at the component, system and plant levels is developed. The study herein focuses on the application of this living probabilistic safety assessment tool as a computer platform for multi-objective multi-dimensional optimization of the surveillance requirements of selected safety equipment, seen from the aspect of risk-informed reasoning. The living probabilistic safety assessment tool is based on a newly developed model for calculating the time-dependent unavailability of ageing safety equipment within nuclear power plants. By coupling the time-dependent unavailability model with commercial software used for probabilistic safety assessment modelling at the plant level, the framework of the new platform, i.e. the living probabilistic safety assessment tool, is established. In this way, the time-dependent core damage frequency is obtained and is further utilized as the first objective function within a multi-objective multi-dimensional optimization case study presented within this paper. The test and maintenance costs are designated as the second objective function, and the dose incurred in performing the test and maintenance activities as the third. The obtained results underline, in general, the usefulness and importance of a living probabilistic safety assessment, seen as a dynamic probabilistic safety assessment tool as opposed to the conventional, time-averaged unavailability-based probabilistic safety assessment. The results of the optimization, in particular, indicate that the test intervals derived as optimal differ from the deterministically-determined ones defined within the existing technical specifications.

  14. Histogram-driven cupping correction (HDCC) in CT

    Science.gov (United States)

    Kyriakou, Y.; Meyer, M.; Lapp, R.; Kalender, W. A.

    2010-04-01

    Typical cupping correction methods are pre-processing methods which require either pre-calibration measurements or simulations of standard objects to approximate and correct for beam hardening and scatter. Some of them require knowledge of spectra, detector characteristics, etc. The aim of this work was to develop a practical histogram-driven cupping correction (HDCC) method to post-process the reconstructed images. We use a polynomial representation of the raw data generated by forward projection of the reconstructed images; forward and backprojection are performed on graphics processing units (GPU). The coefficients of the polynomial are optimized using a simplex minimization of the joint entropy of the CT image and its gradient. The algorithm was evaluated using simulations and measurements of homogeneous and inhomogeneous phantoms. For the measurements, a C-arm flat-detector CT (FD-CT) system with a 30×40 cm² detector, a kilovoltage on-board imager (radiation therapy simulator) and a micro-CT system were used. The algorithm reduced cupping artifacts both in simulations and measurements using a fourth-order polynomial and was in good agreement with the reference. The minimization algorithm required less than 70 iterations to adjust the coefficients, performing only a linear combination of basis images and thus executing without time-consuming operations. HDCC reduced cupping artifacts without the need for pre-calibration or other scan information, enabling a retrospective improvement of CT image homogeneity. However, the method can work with other cupping correction algorithms or in a calibration manner, as well.

  15. Entropy-based viscous regularization for the multi-dimensional Euler equations in low-Mach and transonic flows

    Energy Technology Data Exchange (ETDEWEB)

    Marc O Delchini; Jean E. Ragusa; Ray A. Berry

    2015-07-01

    We present a new version of the entropy viscosity method, a viscous regularization technique for hyperbolic conservation laws, that is well-suited for low-Mach flows. By means of a low-Mach asymptotic study, new expressions for the entropy viscosity coefficients are derived. These definitions are valid for a wide range of Mach numbers, from subsonic flows (with very low Mach numbers) to supersonic flows, and no longer depend on an analytical expression for the entropy function. In addition, the entropy viscosity method is extended to Euler equations with variable area for nozzle flow problems. The effectiveness of the method is demonstrated using various 1-D and 2-D benchmark tests: flow in a converging–diverging nozzle; Leblanc shock tube; slow moving shock; strong shock for liquid phase; low-Mach flows around a cylinder and over a circular hump; and supersonic flow in a compression corner. Convergence studies are performed for smooth solutions and solutions with shocks present.

  16. Numerical solutions of multi-dimensional solidification/melting problems by the dual reciprocity boundary element method

    Energy Technology Data Exchange (ETDEWEB)

    Jo, Jong Chull; Shin, Won Ky [Korea Institute of Nuclear Safety, Taejon (Korea, Republic of)

    1997-12-31

    This paper presents an effective and simple procedure for the simulation of the motion of the solid-liquid interfacial boundary and the transient temperature field during the phase change process. To accomplish this purpose, an iterative implicit solution algorithm has been developed by employing the dual reciprocity boundary element method. The dual reciprocity boundary element approach provided in this paper is much simpler than the usual boundary element method, which applies a reciprocity principle and an available technique for dealing with the domain integral of the boundary element formulation simultaneously. The effectiveness of the present analysis method has been illustrated through comparisons of the calculation results of an example with its semi-analytical or other numerical solutions where available. 22 refs., 3 figs. (Author)

  17. Numerical solutions of multi-dimensional solidification/melting problems by the dual reciprocity boundary element method

    International Nuclear Information System (INIS)

    Jo, Jong Chull; Shin, Won Ky

    1997-01-01

    This paper presents an effective and simple procedure for the simulation of the motion of the solid-liquid interfacial boundary and the transient temperature field during the phase change process. To accomplish this purpose, an iterative implicit solution algorithm has been developed by employing the dual reciprocity boundary element method. The dual reciprocity boundary element approach provided in this paper is much simpler than the usual boundary element method, which applies a reciprocity principle and an available technique for dealing with the domain integral of the boundary element formulation simultaneously. The effectiveness of the present analysis method has been illustrated through comparisons of the calculation results of an example with its semi-analytical or other numerical solutions where available

  18. Numerical solutions of multi-dimensional solidification/melting problems by the dual reciprocity boundary element method

    Energy Technology Data Exchange (ETDEWEB)

    Jo, Jong Chull; Shin, Won Ky [Korea Institute of Nuclear Safety, Taejon (Korea, Republic of)

    1998-12-31

    This paper presents an effective and simple procedure for the simulation of the motion of the solid-liquid interfacial boundary and the transient temperature field during the phase change process. To accomplish this purpose, an iterative implicit solution algorithm has been developed by employing the dual reciprocity boundary element method. The dual reciprocity boundary element approach provided in this paper is much simpler than the usual boundary element method, which applies a reciprocity principle and an available technique for dealing with the domain integral of the boundary element formulation simultaneously. The effectiveness of the present analysis method has been illustrated through comparisons of the calculation results of an example with its semi-analytical or other numerical solutions where available. 22 refs., 3 figs. (Author)

  19. Reliability Study Regarding the Use of Histogram Similarity Methods for Damage Detection

    Directory of Open Access Journals (Sweden)

    Nicoleta Gillich

    2013-01-01

    Full Text Available The paper analyses the reliability of three dissimilarity estimators for comparing histograms, as support for a frequency-based damage detection method able to identify structural changes in beam-like structures. First, a brief presentation of our own damage detection method is given, with a focus on damage localization. It consists of comparing a histogram derived from measurement results with a large series of computed histograms, namely the damage location indexes for all locations along the beam. We tested dissimilarity estimators such as the Minkowski-form distances, the Kullback-Leibler divergence and the histogram intersection, and found the Minkowski distance to provide the best results. It was tested for numerous locations, using real measurement results as well as results artificially degraded by noise, proving its reliability.
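
    The three estimators compared in the record have compact definitions for normalized histograms; the following sketch is a minimal rendering (the epsilon guard against empty bins is my addition):

      import numpy as np

      def minkowski_distance(h1, h2, p=2):
          """Minkowski-form distance; p=2 gives the Euclidean case."""
          return float(np.sum(np.abs(h1 - h2) ** p) ** (1.0 / p))

      def kullback_leibler(h1, h2, eps=1e-12):
          """KL divergence between two histograms (re-normalized, eps-guarded)."""
          p = (h1 + eps) / (h1 + eps).sum()
          q = (h2 + eps) / (h2 + eps).sum()
          return float(np.sum(p * np.log(p / q)))

      def histogram_intersection(h1, h2):
          """Intersection is a similarity score, not a distance: higher = closer."""
          return float(np.minimum(h1, h2).sum())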

  20. Image Enhancement via Subimage Histogram Equalization Based on Mean and Variance

    Directory of Open Access Journals (Sweden)

    Liyun Zhuang

    2017-01-01

    Full Text Available This paper puts forward a novel image enhancement method via Mean and Variance based Subimage Histogram Equalization (MVSIHE), which effectively increases the contrast of the input image with brightness and details well preserved compared with some other methods based on histogram equalization (HE). Firstly, the histogram of input image is divided into four segments based on the mean and variance of luminance component, and the histogram bins of each segment are modified and equalized, respectively. Secondly, the result is obtained via the concatenation of the processed subhistograms. Lastly, the normalization method is deployed on intensity levels, and the integration of the processed image with the input image is performed. 100 benchmark images from a public image database named CVG-UGR-Database are used for comparison with other state-of-the-art methods. The experiment results show that the algorithm can not only enhance image information effectively but also well preserve brightness and details of the original image.
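
    A minimal sketch of the segment-then-equalize idea: the intensity range is cut at the luminance mean and at mean ± one standard deviation into four segments, and each segment is equalized within its own bounds. The exact segmentation rule and the final integration step of the paper are not reproduced; this is an assumption-laden illustration only.

      import numpy as np

      def mvsihe_sketch(img):
          """Split the intensity range at (mean - std, mean, mean + std) into
          four segments and equalize each segment within its own range."""
          x = img.astype(float)
          m, s = x.mean(), x.std()
          edges = np.unique([x.min(), m - s, m, m + s, x.max()])
          out = np.zeros_like(x)
          for lo, hi in zip(edges[:-1], edges[1:]):
              seg = (x >= lo) & (x <= hi)
              if seg.any() and hi > lo:
                  v = x[seg]
                  ranks = np.searchsorted(np.sort(v), v, side="right") / v.size
                  out[seg] = lo + ranks * (hi - lo)   # equalize inside [lo, hi]
          return np.clip(out, 0, 255).astype(np.uint8)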

  1. Adaptive Histogram Equalization Based Image Forensics Using Statistics of DC DCT Coefficients

    Directory of Open Access Journals (Sweden)

    Neetu Singh

    2018-01-01

    Full Text Available The vulnerability of digital images to manipulation is growing. This has motivated an area of research dealing with digital image forgeries. Certifying the origin and content of digital images is an open problem in the multimedia world. One of the ways to assess the authenticity of an image is to detect the presence of any type of contrast enhancement. In this work, a novel and simple machine learning tool is proposed to detect the presence of histogram equalization using statistical parameters of DC Discrete Cosine Transform (DCT) coefficients. The statistical parameters of the Gaussian Mixture Model (GMM) fitted to the DC DCT coefficients are used as features for classifying original and histogram-equalized images. An SVM classifier has been developed that can detect histogram-equalized images with accuracy greater than 95% at a false positive rate below 5%.
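
    A hedged sketch of the pipeline the abstract implies: extract the per-block DC DCT coefficients, fit a Gaussian mixture to them, and use the mixture parameters as the feature vector for a classifier. The block size, component count and use of scikit-learn are my assumptions, not the paper's implementation.

      import numpy as np
      from scipy.fft import dctn
      from sklearn.mixture import GaussianMixture

      def dc_dct_gmm_features(img, block=8, n_components=2):
          """GMM parameters (means, variances, weights) fitted to the image's
          per-block DC DCT coefficients, concatenated into one feature vector."""
          h = img.shape[0] - img.shape[0] % block
          w = img.shape[1] - img.shape[1] % block
          tiles = (img[:h, :w].reshape(h // block, block, w // block, block)
                              .transpose(0, 2, 1, 3).reshape(-1, block, block))
          dc = np.array([dctn(t, norm="ortho")[0, 0] for t in tiles])
          gmm = GaussianMixture(n_components=n_components, random_state=0)
          gmm.fit(dc[:, None])
          return np.hstack([gmm.means_.ravel(), gmm.covariances_.ravel(),
                            gmm.weights_])

      # An SVM (e.g. sklearn.svm.SVC) would then be trained on such feature
      # vectors computed from original and histogram-equalized images.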

  2. Histograms of Arecibo World Days Measurements and Linear-H Fits Between 1985 and 1995

    National Research Council Canada - National Science Library

    Melendez-Alvira, D

    1998-01-01

    This document presents histograms of linear-H model fits to electron density profiles measured with the incoherent scatter radar of the Arecibo Observatory in Puerto Rico during the World Days between 1985 and 1995...

  3. Image Enhancement via Subimage Histogram Equalization Based on Mean and Variance

    Science.gov (United States)

    2017-01-01

    This paper puts forward a novel image enhancement method via Mean and Variance based Subimage Histogram Equalization (MVSIHE), which effectively increases the contrast of the input image with brightness and details well preserved compared with some other methods based on histogram equalization (HE). Firstly, the histogram of input image is divided into four segments based on the mean and variance of luminance component, and the histogram bins of each segment are modified and equalized, respectively. Secondly, the result is obtained via the concatenation of the processed subhistograms. Lastly, the normalization method is deployed on intensity levels, and the integration of the processed image with the input image is performed. 100 benchmark images from a public image database named CVG-UGR-Database are used for comparison with other state-of-the-art methods. The experiment results show that the algorithm can not only enhance image information effectively but also well preserve brightness and details of the original image. PMID:29403529

  4. Efficient Human Action and Gait Analysis Using Multiresolution Motion Energy Histogram

    Directory of Open Access Journals (Sweden)

    Kuo-Chin Fan

    2010-01-01

    Full Text Available The Average Motion Energy (AME) image is a good way to describe human motions. However, it faces a computational efficiency problem as the number of database templates increases. In this paper, we propose a histogram-based approach to improve computational efficiency: we convert the human action/gait recognition problem into a histogram matching problem. In order to speed up the recognition process, we adopt a multiresolution structure on the Motion Energy Histogram (MEH). To utilize the multiresolution structure more efficiently, we propose an automated uneven partitioning method based on the quadtree decomposition of the MEH. In that case, the computation time depends only on the number of partitioned histogram bins, which is much smaller than in the AME method. Two applications, action recognition and gait classification, are conducted in the experiments to demonstrate the feasibility and validity of the proposed approach.

  5. Image Enhancement via Subimage Histogram Equalization Based on Mean and Variance.

    Science.gov (United States)

    Zhuang, Liyun; Guan, Yepeng

    2017-01-01

    This paper puts forward a novel image enhancement method via Mean and Variance based Subimage Histogram Equalization (MVSIHE), which effectively increases the contrast of the input image with brightness and details well preserved compared with some other methods based on histogram equalization (HE). Firstly, the histogram of input image is divided into four segments based on the mean and variance of luminance component, and the histogram bins of each segment are modified and equalized, respectively. Secondly, the result is obtained via the concatenation of the processed subhistograms. Lastly, the normalization method is deployed on intensity levels, and the integration of the processed image with the input image is performed. 100 benchmark images from a public image database named CVG-UGR-Database are used for comparison with other state-of-the-art methods. The experiment results show that the algorithm can not only enhance image information effectively but also well preserve brightness and details of the original image.

  6. The equivalent histograms in clinical practice

    Energy Technology Data Exchange (ETDEWEB)

    Pizarro Trigo, F.; Teijeira Garcia, M.; Zaballos Carrera, S.

    2013-07-01

    Full Text Available The tolerances established for organs at risk [1] under standard fractionation schemes (2 Gy/session, 5 sessions per week) are frequently misapplied to dose-volume histograms of non-standard schemes. The purpose of this work is to establish when this misuse may be most important and to perform a fractionation transformation of non-standard dose-volume histograms. A case that can be useful for making clinical decisions is presented. (Author)
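
    The standard way to carry out the fractionation transformation the authors refer to is the linear-quadratic EQD2 conversion, applied bin by bin to the dose-volume histogram. A small worked sketch (the alpha/beta value is tissue- and endpoint-dependent):

      def eqd2(dose_per_fraction_gy, n_fractions, alpha_beta_gy):
          """Equivalent dose in 2 Gy fractions under the linear-quadratic
          model: EQD2 = D * (d + alpha/beta) / (2 + alpha/beta)."""
          total = dose_per_fraction_gy * n_fractions
          return total * (dose_per_fraction_gy + alpha_beta_gy) / (2.0 + alpha_beta_gy)

      # Example: a DVH bin receiving 3 Gy x 15 fractions with alpha/beta = 3 Gy
      # (late-responding tissue) gives eqd2(3.0, 15, 3.0) = 54.0 Gy, so the
      # 45 Gy physical dose is biologically 'hotter' than a 2 Gy/fraction course.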

  7. Hot Spots Detection of Operating PV Arrays through IR Thermal Image Using Method Based on Curve Fitting of Gray Histogram

    Directory of Open Access Journals (Sweden)

    Jiang Lin

    2016-01-01

    Full Text Available The overall efficiency of PV arrays is affected by hot spots, which should be detected and diagnosed by applying suitable monitoring techniques. The use of IR thermal images to detect hot spots has been studied as a direct, noncontact, nondestructive technique. However, IR thermal images suffer from relatively high stochastic noise and non-uniformity clutter, so conventional image processing methods are not effective. This paper proposes a method to detect hot spots based on curve fitting of the gray histogram. MATLAB simulation results show that the proposed method effectively detects hot spots while suppressing the noise generated during image acquisition.

  8. Evaluation of dose-volume histograms after prostate seed implantation. 4-year experience

    International Nuclear Information System (INIS)

    Hoinkis, C.; Lehmann, D.; Winkler, C.; Herrmann, T.; Hakenberg, O.W.; Wirth, M.P.

    2004-01-01

    Background and purpose: permanent interstitial brachytherapy by seed implantation is a treatment alternative for low-volume low-risk prostate cancer and a complex interdisciplinary treatment with a learning curve. Dose-volume histograms are used to assess postimplant quality. The authors evaluated their learning curve based on dose-volume histograms and analyzed factors influencing implantation quality. Patients and methods: since 1999, 38 patients with a minimum follow-up of 6 months were treated at the authors' institution with seed implantation using palladium-103 or iodine-125, initially using the preplan method and later real-time planning. Postimplant CT was performed after 4 weeks. The dose-volume indices D90, V100 and V150, the Dmax of pre- and postplans, and the size and position of the volume receiving the prescribed dose (high-dose volume) of the postplans were evaluated. In six patients, postplan imaging by both CT and MRI was used and prostate volumes were compared with preimplant transrectal ultrasound volumes. The first five patients were treated under external supervision. Results: patients were divided into three consecutive groups for analysis of the learning curve (group 1: n = 5 patients treated under external supervision; group 2: n = 13 patients; group 3: n = 20 patients). The postimplant D90 for the three groups was 79.3%, 74.2%, and 99.9%, and the postimplant V100 was 78.6%, 73.5%, and 88.2%, respectively. The relationship between high-dose volume and prostate volume showed a similar increase as the D90, while the relationship between the high-dose volume lying outside the prostate and the prostate volume remained constant. The ratio between prostate volumes from transrectal ultrasound and CT imaging decreased with increasing postimplant D90, while the preplanning D90 and V100 remained constant. The different isotopes used, the method of planning, and the implanted activity per prostate volume did not influence results. Conclusion: a learning curve characterized by an increase
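
    For reference, the two indices at the centre of this learning-curve analysis are simple order statistics of the voxel doses inside the prostate contour. A minimal sketch assuming equal voxel volumes (function names are mine):

      import numpy as np

      def d90(dose_in_structure, prescribed_dose):
          """D90: dose covering 90% of the structure, as % of the prescription
          (the 10th percentile of the voxel doses)."""
          return np.percentile(dose_in_structure, 10) / prescribed_dose * 100.0

      def v100(dose_in_structure, prescribed_dose):
          """V100: % of the structure volume receiving at least the full dose."""
          return float(np.mean(np.asarray(dose_in_structure) >= prescribed_dose) * 100.0)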

  9. A Cross-Cultural Comparison of Singaporean and Taiwanese Eighth Graders' Science Learning Self-Efficacy from a Multi-Dimensional Perspective

    Science.gov (United States)

    Lin, Tzung-Jin; Tan, Aik Ling; Tsai, Chin-Chung

    2013-05-01

    Due to the scarcity of cross-cultural comparative studies in exploring students' self-efficacy in science learning, this study attempted to develop a multi-dimensional science learning self-efficacy (SLSE) instrument to measure 316 Singaporean and 303 Taiwanese eighth graders' SLSE and further to examine the differences between the two student groups. Moreover, within-culture comparisons were made in terms of gender. The results showed that, first, the SLSE instrument was valid and reliable for measuring the Singaporean and Taiwanese students' SLSE. Second, through a two-way multivariate analysis of variance analysis (nationality by gender), the main result indicated that the SLSE held by the Singaporean eighth graders was significantly higher than that of their Taiwanese counterparts in all dimensions, including 'conceptual understanding and higher-order cognitive skills', 'practical work (PW)', 'everyday application', and 'science communication'. In addition, the within-culture gender comparisons indicated that the male Singaporean students tended to possess higher SLSE than the female students did in all SLSE dimensions except for the 'PW' dimension. However, no gender differences were found in the Taiwanese sample. The findings unraveled in this study were interpreted from a socio-cultural perspective in terms of the curriculum differences, societal expectations of science education, and educational policies in Singapore and Taiwan.

  10. Vision from next generation sequencing: multi-dimensional genome-wide analysis for producing gene regulatory networks underlying retinal development, aging and disease.

    Science.gov (United States)

    Yang, Hyun-Jin; Ratnapriya, Rinki; Cogliati, Tiziana; Kim, Jung-Woong; Swaroop, Anand

    2015-05-01

    Genomics and genetics have invaded all aspects of biology and medicine, opening uncharted territory for scientific exploration. The definition of "gene" itself has become ambiguous, and the central dogma is continuously being revised and expanded. Computational biology and computational medicine are no longer intellectual domains of the chosen few. Next generation sequencing (NGS) technology, together with novel methods of pattern recognition and network analyses, has revolutionized the way we think about fundamental biological mechanisms and cellular pathways. In this review, we discuss NGS-based genome-wide approaches that can provide deeper insights into retinal development, aging and disease pathogenesis. We first focus on gene regulatory networks (GRNs) that govern the differentiation of retinal photoreceptors and modulate adaptive response during aging. Then, we discuss NGS technology in the context of retinal disease and develop a vision for therapies based on network biology. We should emphasize that basic strategies for network construction and analyses can be transported to any tissue or cell type. We believe that specific and uniform guidelines are required for generation of genome, transcriptome and epigenome data to facilitate comparative analysis and integration of multi-dimensional data sets, and for constructing networks underlying complex biological processes. As cellular homeostasis and organismal survival are dependent on gene-gene and gene-environment interactions, we believe that network-based biology will provide the foundation for deciphering disease mechanisms and discovering novel drug targets for retinal neurodegenerative diseases. Published by Elsevier Ltd.

  11. Multi-dimensional, fully-implicit, spectral method for the Vlasov-Maxwell equations with exact conservation laws in discrete form

    Science.gov (United States)

    Delzanno, G. L.

    2015-11-01

    A spectral method for the numerical solution of the multi-dimensional Vlasov-Maxwell equations is presented. The plasma distribution function is expanded in Fourier (for the spatial part) and Hermite (for the velocity part) basis functions, leading to a truncated system of ordinary differential equations for the expansion coefficients (moments) that is discretized with an implicit, second order accurate Crank-Nicolson time discretization. The discrete non-linear system is solved with a preconditioned Jacobian-Free Newton-Krylov method. It is shown analytically that the Fourier-Hermite method features exact conservation laws for total mass, momentum and energy in discrete form. Standard tests involving plasma waves and the whistler instability confirm the validity of the conservation laws numerically. The whistler instability test also shows that we can step over the fastest time scale in the system without incurring in numerical instabilities. Some preconditioning strategies are presented, showing that the number of linear iterations of the Krylov solver can be drastically reduced and a significant gain in performance can be obtained.

  12. Performance of Edmonton Frail Scale on frailty assessment: its association with multi-dimensional geriatric conditions assessed with specific screening tools.

    Science.gov (United States)

    Perna, Simone; Francis, Matthew D'Arcy; Bologna, Chiara; Moncaglieri, Francesca; Riva, Antonella; Morazzoni, Paolo; Allegrini, Pietro; Isu, Antonio; Vigo, Beatrice; Guerriero, Fabio; Rondanelli, Mariangela

    2017-01-04

    The aim of this study was to evaluate the performance of the Edmonton Frail Scale (EFS) on frailty assessment in association with multi-dimensional conditions assessed with specific screening tools, and to explore the prevalence of frailty by gender. We enrolled 366 hospitalised patients (women/men: 251/115), mean age 81.5 years. The EFS was given to the patients to evaluate their frailty. Then we collected data concerning cognitive status through the Mini-Mental State Examination (MMSE), health status (evaluated with the number of diseases), functional independence (Barthel Index and Activities of Daily Living; BI, ADL, IADL), use of drugs (counting of drugs taken every day), Mini Nutritional Assessment (MNA), Geriatric Depression Scale (GDS), Skeletal Muscle Index of sarcopenia (SMI), osteoporosis and functionality (handgrip strength). According to the EFS, 19.7% of subjects were classified as non-frail, 66.4% as apparently vulnerable and 13.9% as severely frail. The EFS scores were significantly associated with cognition (MMSE: β = 0.980), nutrition (MNA: β = -0.413) and physical performance (handgrip: β = -0.114). The EFS thus proved to be an effective tool for stratifying the state of frailty in a group of institutionalized elderly. As a matter of fact, the EFS has been shown to be associated with several geriatric conditions such as independence, drug assumption, mood, and mental, functional and nutritional status.

  13. AdS and stabilized extra dimensions in multi-dimensional gravitational models with nonlinear scalar curvature terms R^-1 and R^4

    International Nuclear Information System (INIS)

    Guenther, Uwe; Zhuk, Alexander; Bezerra, Valdir B; Romero, Carlos

    2005-01-01

    We study multi-dimensional gravitational models with scalar curvature nonlinearities of types R^-1 and R^4. It is assumed that the corresponding higher-dimensional spacetime manifolds undergo a spontaneous compactification to manifolds with a warped product structure. Special attention has been paid to the stability of the extra-dimensional factor spaces. It is shown that for certain parameter regions the systems allow for a freezing stabilization of these spaces. In particular, we find for the R^-1 model that configurations with stabilized extra dimensions do not provide a late-time acceleration (they are AdS), whereas the solution branch which allows for accelerated expansion (the dS branch) is incompatible with stabilized factor spaces. In the case of the R^4 model, we obtain that the stability region in parameter space depends on the total dimension D = dim(M) of the higher-dimensional spacetime M. For D > 8 the stability region consists of a single (absolutely stable) sector which is shielded from a conformal singularity (and an antigravity sector beyond it) by a potential barrier of infinite height and width. This sector is smoothly connected with the stability region of a curvature-linear model. For D ≤ 8 ... R^4 model

  14. Directional Histogram Ratio at Random Probes: A Local Thresholding Criterion for Capillary Images

    Science.gov (United States)

    Lu, Na; Silva, Jharon; Gu, Yu; Gerber, Scott; Wu, Hulin; Gelbard, Harris; Dewhurst, Stephen; Miao, Hongyu

    2013-01-01

    With the development of micron-scale imaging techniques, capillaries can be conveniently visualized using methods such as two-photon and whole mount microscopy. However, the presence of background staining, leaky vessels and the diffusion of small fluorescent molecules can lead to significant complexity in image analysis and loss of information necessary to accurately quantify vascular metrics. One solution to this problem is the development of accurate thresholding algorithms that reliably distinguish blood vessels from surrounding tissue. Although various thresholding algorithms have been proposed, our results suggest that without appropriate pre- or post-processing, the existing approaches may fail to obtain satisfactory results for capillary images that include areas of contamination. In this study, we propose a novel local thresholding algorithm, called directional histogram ratio at random probes (DHR-RP). This method explicitly considers the geometric features of tube-like objects in conducting image binarization, and has a reliable performance in distinguishing small vessels from either clean or contaminated background. Experimental and simulation studies suggest that our DHR-RP algorithm is superior to existing thresholding methods. PMID:23525856
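    The sketch below illustrates the general idea of thresholding locally at randomly sampled probe windows instead of applying one global threshold. It is a simplified stand-in: the probes here use Otsu's criterion, whereas DHR-RP scores the directional histogram ratio of tube-like structures, which is not reproduced.

```python
# Toy local thresholding at random probe windows (not the authors' DHR-RP).
import numpy as np
from skimage.filters import threshold_otsu

def local_threshold_at_probes(img, n_probes=200, win=31, seed=0):
    rng = np.random.default_rng(seed)
    half = win // 2
    probe_t = []
    ys = rng.integers(half, img.shape[0] - half, n_probes)
    xs = rng.integers(half, img.shape[1] - half, n_probes)
    for y, x in zip(ys, xs):
        patch = img[y - half:y + half + 1, x - half:x + half + 1]
        if patch.std() > 0:               # Otsu needs more than one gray level
            probe_t.append(threshold_otsu(patch))
    # Crude aggregation: binarize with the median of the local probe thresholds.
    return img > np.median(probe_t)

mask = local_threshold_at_probes(np.random.rand(128, 128))
```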

  15. Thai Finger-Spelling Recognition Using a Cascaded Classifier Based on Histogram of Orientation Gradient Features

    Directory of Open Access Journals (Sweden)

    Kittasil Silanon

    2017-01-01

    Hand posture recognition is an essential module in applications such as human-computer interaction (HCI), games, and sign language systems, in which performance and robustness are the primary requirements. In this paper, we propose automatic classification to recognize 21 hand postures that represent letters in Thai finger-spelling, based on the Histogram of Orientation Gradient (HOG) feature (which focuses on the information within certain regions of the image rather than on each single pixel) and the Adaptive Boost (AdaBoost) learning technique, which selects the best weak classifiers and constructs a strong classifier consisting of several weak classifiers cascaded in a detection architecture. We collected 21 static hand posture images from 10 subjects for testing and training in Thai letter finger-spelling. Three training parameters, the false positive rate (FPR), the true positive rate (TPR), and the number of training stages (N), were adjusted in three experiments to achieve the most suitable training model for each hand posture. All cascaded classifiers are loaded into the system simultaneously to classify different hand postures. A correlation coefficient is computed to distinguish hand postures that are similar. The system achieves approximately 78% accuracy on average over all classifier experiments.
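    A minimal sketch of the HOG-plus-boosting pipeline the abstract describes is given below: HOG summarizes gradient orientations over image cells, and AdaBoost builds a strong classifier from weak decision stumps. The cascade structure and the real posture dataset are omitted; images and labels here are synthetic placeholders.

```python
# Sketch: HOG features + AdaBoost (default base learner is a decision stump).
import numpy as np
from skimage.feature import hog
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

def hog_features(images):
    return np.array([hog(im, orientations=9, pixels_per_cell=(8, 8),
                         cells_per_block=(2, 2)) for im in images])

images = np.random.rand(40, 64, 64)        # placeholder hand-posture images
labels = np.repeat(np.arange(4), 10)       # 4 toy posture classes

X = hog_features(images)
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, random_state=0)
clf = AdaBoostClassifier(n_estimators=200, random_state=0)
clf.fit(X_tr, y_tr)
print("toy accuracy:", clf.score(X_te, y_te))
```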

  16. Histogram-Based Thresholding for Detection and Quantification of Hemorrhages in Retinal Images

    Directory of Open Access Journals (Sweden)

    Hussain Fadhel Hamdan Jaafar

    2016-12-01

    Retinal image analysis is commonly used for the detection and quantification of diabetic retinopathy. In retinal images, dark lesions, including hemorrhages and microaneurysms, are the earliest warnings of vision loss. In this paper, a new algorithm for the extraction and quantification of hemorrhages in fundus images is presented. Hemorrhage candidates are extracted in a preliminary coarse segmentation step followed by a fine segmentation step. Local variation processes are applied in the coarse segmentation step to determine the boundaries of all candidates with distinct edges. Fine segmentation processes are based on histogram thresholding to extract real hemorrhages from the segmented candidates locally. The proposed method was trained and tested using an image dataset of 153 manually labeled retinal images. At the pixel level, the proposed method could identify abnormal retinal images with 90.7% sensitivity and a predictive value of 85.1%. Due to its distinctive performance measurements, this technique demonstrates that it could be used for computer-aided mass screening of retinal diseases.
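    The sketch below captures the coarse-to-fine structure described above under stated assumptions: a coarse pass keeps regions of high local variation as candidates, and a fine pass applies a histogram (Otsu) threshold within each candidate's bounding box to keep only dark pixels. The paper's exact local-variation operator and tuning are not reproduced.

```python
# Two-stage toy segmentation: local-variance candidates, then local histogram
# thresholding inside each candidate region (dark lesions assumed).
import numpy as np
from scipy import ndimage
from skimage.filters import threshold_otsu

def detect_dark_lesions(img, var_win=9, var_frac=0.9):
    mean = ndimage.uniform_filter(img, var_win)
    sq_mean = ndimage.uniform_filter(img ** 2, var_win)
    local_var = sq_mean - mean ** 2                  # coarse step
    candidates = local_var > np.quantile(local_var, var_frac)
    labels, _ = ndimage.label(candidates)
    lesion_mask = np.zeros_like(img, dtype=bool)
    for sl in ndimage.find_objects(labels):          # fine step, per candidate
        patch = img[sl]
        if patch.std() > 0:
            lesion_mask[sl] |= patch < threshold_otsu(patch)
    return lesion_mask

mask = detect_dark_lesions(np.random.rand(256, 256))
```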

  17. Histogram based analysis of lung perfusion of children after congenital diaphragmatic hernia repair.

    Science.gov (United States)

    Kassner, Nora; Weis, Meike; Zahn, Katrin; Schaible, Thomas; Schoenberg, Stefan O; Schad, Lothar R; Zöllner, Frank G

    2018-05-01

    To investigate a histogram-based approach to characterize the distribution of perfusion in the whole left and right lung by descriptive statistics, and to show how histograms can be used to visually explore perfusion defects in two-year-old children after congenital diaphragmatic hernia (CDH) repair. 28 children (age 24.2 ± 1.7 months; all left-sided hernia; 9 after extracorporeal membrane oxygenation therapy) underwent quantitative DCE-MRI of the lung. Segmentations of the left and right lung were manually drawn to mask the calculated pulmonary blood flow maps and then to derive histograms for each lung side. Individual and group-wise analyses of the histograms of the left and right lung were performed. The ipsilateral and contralateral lung show significant differences in histogram shape and in descriptive statistics (Wilcoxon signed-rank test) for several histogram-derived parameters. Histogram analysis can be a valuable tool to characterize and visualize whole-lung perfusion of children after CDH repair. It allows for several possibilities to analyze the data, describing the perfusion differences between the right and left lung but also exploring and visualizing localized perfusion patterns in the 3D lung volume. Subgroup analysis will be possible given sufficient sample sizes. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. ADC histogram analysis of muscle lymphoma - Correlation with histopathology in a rare entity.

    Science.gov (United States)

    Meyer, Hans-Jonas; Pazaitis, Nikolaos; Surov, Alexey

    2018-06-21

    Diffusion-weighted imaging (DWI) is able to reflect histopathology architecture. A novel imaging approach, namely histogram analysis, is used to further characterize lesions on MRI. The purpose of this study is to correlate histogram parameters derived from apparent diffusion coefficient (ADC) maps with histopathology parameters in muscle lymphoma. Eight patients (mean age 64.8 years, range 45-72 years) with histopathologically confirmed muscle lymphoma were retrospectively identified. Cell count, total nucleic area and average nucleic area were estimated using ImageJ. Additionally, the Ki67 index was calculated. DWI was obtained on a 1.5T scanner using b-values of 0 and 1000 s/mm². Histogram analysis was performed as a whole-lesion measurement using a custom-made Matlab-based application. The correlation analysis revealed statistically significant correlations between cell count and ADCmean (p = -0.76, P = 0.03) as well as ADCp75 (p = -0.79, P = 0.02). Kurtosis and entropy correlated with average nucleic area (p = -0.81, P = 0.02 and p = 0.88, P = 0.007, respectively). None of the analyzed ADC parameters correlated with total nucleic area or with the Ki67 index. This study identified significant correlations between cellularity and histogram parameters derived from ADC maps in muscle lymphoma. Thus, histogram analysis parameters reflect histopathology in muscle tumors. Advances in knowledge: Whole-lesion ADC histogram analysis is able to reflect histopathology parameters in muscle lymphomas.
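    A small sketch of the whole-lesion workflow follows: pool all ADC values inside the lesion mask, derive histogram metrics, and correlate a metric against a histopathology measure with Spearman's rho. All arrays are synthetic placeholders; the paper's custom Matlab tool is not reproduced.

```python
# Whole-lesion ADC histogram metrics and a Spearman correlation (toy data).
import numpy as np
from scipy import stats

def adc_histogram_metrics(adc_map, mask, bins=64):
    vals = adc_map[mask]
    hist, _ = np.histogram(vals, bins=bins)
    p = hist / hist.sum()
    return {
        "mean": vals.mean(),
        "p75": np.percentile(vals, 75),
        "kurtosis": stats.kurtosis(vals),
        "entropy": -np.sum(p[p > 0] * np.log2(p[p > 0])),
    }

rng = np.random.default_rng(0)
adc_means = rng.normal(1.0, 0.2, 8)                    # one value per lesion
cell_counts = -2.0 * adc_means + rng.normal(0, 0.1, 8) # simulated inverse link
rho, pval = stats.spearmanr(adc_means, cell_counts)
print(f"Spearman rho = {rho:.2f}, p = {pval:.3f}")
```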

  19. DE-STRIPING FOR TDICCD REMOTE SENSING IMAGE BASED ON STATISTICAL FEATURES OF HISTOGRAM

    Directory of Open Access Journals (Sweden)

    H.-T. Gao

    2016-06-01

    Aiming at the striping noise caused by the non-uniform response of remote sensing TDI CCD detectors, a novel de-striping method based on statistical features of the image histogram is put forward. By analysing the distribution of histograms, the centroid of the histogram is selected as an eigenvalue representing the uniformity of ground objects. The histogram centroids of the whole image and of each pixel are calculated first, and the differences between them are regarded as rough correction coefficients. Then, to avoid the sensitivity caused by a single parameter, and considering the strong continuity and correlation of ground objects between two adjacent pixels, the correlation coefficient of the histograms is introduced to reflect their similarity, and a fine correction coefficient is obtained by searching around the rough correction coefficient. Additionally, in view of the influence of bright clouds on the histogram, automatic cloud detection based on multiple features, including grey level, texture, fractal dimension and edges, is used to pre-process the image. Two level-0 panchromatic images of the SJ-9A satellite with obvious stripe noise were processed by the proposed method to evaluate its performance. The results show that the visual quality of the images is improved because the stripe noise is entirely removed; a quantitative analysis of the non-uniformity of the result reaches about 1%, which is better than the histogram matching method.
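    The rough-correction step lends itself to a compact sketch, assuming each TDI CCD detector element maps to one image column: each column's gray-level centroid (mean) is compared with the whole-image centroid and rescaled toward it. The fine search around the rough coefficient, the histogram-correlation refinement, and the cloud masking are omitted.

```python
# Rough de-striping: per-column histogram centroid matched to the global one.
import numpy as np

def destripe_rough(img):
    global_centroid = img.mean()
    column_centroids = img.mean(axis=0)            # one per detector column
    gain = global_centroid / np.maximum(column_centroids, 1e-6)
    return img * gain[np.newaxis, :]

striped = np.random.rand(512, 512) * np.linspace(0.8, 1.2, 512)  # column gains
corrected = destripe_rough(striped)
print("column-mean spread before/after:",
      striped.mean(axis=0).std(), corrected.mean(axis=0).std())
```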

  20. A multi-dimensional analysis of the upper Rio Grande-San Luis Valley social-ecological system

    Science.gov (United States)

    Mix, Ken

    The Upper Rio Grande (URG), located in the San Luis Valley (SLV) of southern Colorado, is the primary contributor to streamflow to the Rio Grande Basin, upstream of the confluence of the Rio Conchos at Presidio, TX. The URG-SLV includes a complex irrigation-dependent agricultural social-ecological system (SES), which began development in 1852, and today generates more than 30% of the SLV revenue. The diversions of Rio Grande water for irrigation in the SLV have had a disproportionate impact on the downstream portion of the river. These diversions caused the flow to cease at Ciudad Juarez, Mexico in the late 1880s, creating international conflict. Similarly, low flows in New Mexico and Texas led to interstate conflict. Understanding changes in the URG-SLV that led to this event and the interactions among various drivers of change in the URG-SLV is a difficult task. One reason is that complex social-ecological systems are adaptive, contain feedbacks, emergent properties, cross-scale linkages, large-scale dynamics and non-linearities. Further, most analyses of SES to date have been qualitative, utilizing conceptual models to understand driver interactions. This study utilizes both qualitative and quantitative techniques to develop an innovative approach for analyzing driver interactions in the URG-SLV. Five drivers were identified for the URG-SLV social-ecological system: water (streamflow), water rights, climate, agriculture, and internal and external water policy. The drivers contained several longitudes (data aspect) relevant to the system, except water policy, for which only discrete events were present. Change point and statistical analyses were applied to the longitudes to identify quantifiable changes, to allow detection of cross-scale linkages between drivers, and presence of feedback cycles. Agriculture was identified as the driver signal. Change points for agricultural expansion defined four distinct periods: 1852--1923, 1924--1948, 1949--1978 and 1979

  1. Interpretation of erythrocyte histograms obtained from automated hematology analyzers in hematologic diseases

    Directory of Open Access Journals (Sweden)

    Ali Maleki

    2015-12-01

    Background: The graphical data of blood cells (histograms and cytograms or scattergrams) that are usually available in all modern automated hematology analyzers are an integral part of the automated complete blood count (CBC). To find incorrect results from automated hematology analyzers and identify the samples that require additional analysis, laboratory staff can use these data for quality control of the obtained results and as an aid in identifying complex and troublesome cases. Methods: In this descriptive analytic study, in addition to erythrocyte graphs from a variety of patients referred to our clinical laboratory (Zagros Hospital, Kermanshah, Iran) from March 2013 to February 2014, papers published in the relevant literature as well as available published manuals of automated blood cell counters were used. Articles related to the keyword erythrocyte graphs and related terms were searched in databases such as SpringerLink, Google Scholar, PubMed and ScienceDirect. Articles related to erythrograms, erythrocyte histograms and hematology analyzer graphs involved in the diagnosis of hematological disorders were then selected for this study. Results: Histograms and various automated CBC parameters become abnormal in different pathologic conditions and can present important clues for the diagnosis and treatment of hematologic and non-hematologic disorders. In several instances, these histograms have characteristic appearances in a wide range of pathological conditions. In some hematologic disorders, such as iron deficiency or megaloblastic anemia, sequential histograms can clearly show the progress of treatment and management. Conclusion: These graphical data are often accompanied by other automated CBC parameters and by microscopic examination of peripheral blood smears (PBS), and can help in monitoring.

  2. Can histogram analysis of MR images predict aggressiveness in pancreatic neuroendocrine tumors?

    Science.gov (United States)

    De Robertis, Riccardo; Maris, Bogdan; Cardobi, Nicolò; Tinazzi Martini, Paolo; Gobbo, Stefano; Capelli, Paola; Ortolani, Silvia; Cingarlini, Sara; Paiella, Salvatore; Landoni, Luca; Butturini, Giovanni; Regi, Paolo; Scarpa, Aldo; Tortora, Giampaolo; D'Onofrio, Mirko

    2018-06-01

    To evaluate MRI-derived whole-tumour histogram analysis parameters in predicting pancreatic neuroendocrine neoplasm (panNEN) grade and aggressiveness. Pre-operative MR images of 42 consecutive patients with panNENs >1 cm were retrospectively analysed. T1-/T2-weighted images and ADC maps were analysed. Histogram-derived parameters were compared to histopathological features using the Mann-Whitney U test. Diagnostic accuracy was assessed by ROC-AUC analysis; sensitivity and specificity were assessed for each histogram parameter. ADC entropy was significantly higher in G2-3 tumours with ROC-AUC 0.757; sensitivity and specificity were 83.3% (95% CI: 61.2-94.5) and 61.1% (95% CI: 36.1-81.7). ADC kurtosis was higher in panNENs with vascular involvement, nodal and hepatic metastases (p = .008, .021 and .008; ROC-AUC = 0.820, 0.709 and 0.820); sensitivity and specificity were: 85.7/74.3% (95% CI: 42-99.2/56.4-86.9), 36.8/96.5% (95% CI: 17.2-61.4/76-99.8) and 100/62.8% (95% CI: 56.1-100/44.9-78.1). No significant differences between groups were found for the other histogram-derived parameters (p > .05). Whole-tumour histogram analysis of ADC maps may be helpful in predicting tumour grade, vascular involvement, and nodal and liver metastases in panNENs. ADC entropy and ADC kurtosis are the most accurate parameters for identification of panNENs with malignant behaviour. • Whole-tumour ADC histogram analysis can predict aggressiveness in pancreatic neuroendocrine neoplasms. • ADC entropy and kurtosis are higher in aggressive tumours. • ADC histogram analysis can quantify tumour diffusion heterogeneity. • Non-invasive quantification of tumour heterogeneity can provide adjunctive information for prognostication.

  3. Sensitivity improvement for correlations involving arginine side-chain Nε/Hε resonances in multi-dimensional NMR experiments using broadband ¹⁵N 180° pulses

    International Nuclear Information System (INIS)

    Iwahara, Junji; Clore, G. Marius

    2006-01-01

    Due to practical limitations in available ¹⁵N rf field strength, imperfections in ¹⁵N 180° pulses arising from off-resonance effects can result in significant sensitivity loss, even if the chemical shift offset is relatively small. Indeed, in multi-dimensional NMR experiments optimized for protein backbone amide groups, cross-peaks arising from the Arg guanidino ¹⁵Nε (∼85 ppm) are highly attenuated by the presence of multiple INEPT transfer steps. To improve the sensitivity for correlations involving Arg Nε-Hε groups, we have incorporated ¹⁵N broadband 180° pulses into 3D ¹⁵N-separated NOE-HSQC and HNCACB experiments. Two ¹⁵N WURST pulses incorporated at the INEPT transfer steps of the 3D ¹⁵N-separated NOE-HSQC pulse sequence resulted in a ∼1.5-fold increase in sensitivity for the Arg Nε-Hε signals at 800 MHz. For the 3D HNCACB experiment, five ¹⁵N Abramovich-Vega pulses were incorporated for broadband inversion and refocusing, and the sensitivity of Arg ¹Hε-¹⁵Nε-¹³Cγ/¹³Cδ correlation peaks was enhanced by a factor of ∼1.7 at 500 MHz. These experiments eliminate the necessity for additional experiments to assign Arg ¹Hε and ¹⁵Nε resonances. In addition, the increased sensitivity afforded for the detection of NOE cross-peaks involving correlations with the ¹⁵Nε/¹Hε of Arg in 3D ¹⁵N-separated NOE experiments should prove to be very useful for structural analysis of interactions involving Arg side-chains.

  4. Generalized multi-dimensional adaptive filtering for conventional and spiral single-slice, multi-slice, and cone-beam CT

    International Nuclear Information System (INIS)

    Kachelriess, Marc; Watzke, Oliver; Kalender, Willi A.

    2001-01-01

    In modern computed tomography (CT) there is a strong desire to reduce patient dose and/or to improve image quality by increasing spatial resolution and decreasing image noise. These are conflicting demands since increasing resolution at a constant noise level or decreasing noise at a constant resolution level implies a higher demand on x-ray power and an increase of patient dose. X-ray tube power is limited due to technical reasons. We therefore developed a generalized multi-dimensional adaptive filtering approach that applies nonlinear filters in up to three dimensions in the raw data domain. This new method differs from approaches in the literature since our nonlinear filters are applied not only in the detector row direction but also in the view and in the z-direction. This true three-dimensional filtering improves the quantum statistics of a measured projection value in proportion to the third power of the filter size. Resolution tradeoffs are shared among these three dimensions and thus are considerably smaller as compared to one-dimensional smoothing approaches. Patient data of spiral and sequential single- and multi-slice CT scans as well as simulated spiral cone-beam data were processed to evaluate these new approaches. Image quality was assessed by evaluation of difference images, by measuring the image noise and the noise reduction, and by calculating the image resolution using point spread functions. The use of generalized adaptive filters helps to reduce image noise or, alternatively, patient dose. Image noise structures, typically along the direction of the highest attenuation, are effectively reduced. Noise reduction values of typically 30%-60% can be achieved in noncylindrical body regions like the shoulder. The loss in image resolution remains below 5% for all cases. In addition, the new method has a great potential to reduce metal artifacts, e.g., in the hip region.

  5. AdS and stabilized extra dimensions in multi-dimensional gravitational models with nonlinear scalar curvature terms R{sup -1} and R{sup 4}

    Energy Technology Data Exchange (ETDEWEB)

    Guenther, Uwe [Gravitationsprojekt, Mathematische Physik I, Institut fuer Mathematik, Universitaet Potsdam, Am Neuen Palais 10, PF 601553, D-14415 Potsdam (Germany); Zhuk, Alexander [Department of Physics, University of Odessa, 2 Dvoryanskaya St, Odessa 65100 (Ukraine); Bezerra, Valdir B [Departamento de Física, Universidade Federal da Paraíba, C Postal 5008, João Pessoa, PB, 58059-970 (Brazil); Romero, Carlos [Departamento de Física, Universidade Federal da Paraíba, C Postal 5008, João Pessoa, PB, 58059-970 (Brazil)

    2005-08-21

    We study multi-dimensional gravitational models with scalar curvature nonlinearities of types R{sup -1} and R{sup 4}. It is assumed that the corresponding higher dimensional spacetime manifolds undergo a spontaneous compactification to manifolds with a warped product structure. Special attention has been paid to the stability of the extra-dimensional factor spaces. It is shown that for certain parameter regions the systems allow for a freezing stabilization of these spaces. In particular, we find for the R{sup -1} model that configurations with stabilized extra dimensions do not provide a late-time acceleration (they are AdS), whereas the solution branch which allows for accelerated expansion (the dS branch) is incompatible with stabilized factor spaces. In the case of the R{sup 4} model, we obtain that the stability region in parameter space depends on the total dimension D = dim(M) of the higher dimensional spacetime M. For D > 8 the stability region consists of a single (absolutely stable) sector which is shielded from a conformal singularity (and an antigravity sector beyond it) by a potential barrier of infinite height and width. This sector is smoothly connected with the stability region of a curvature-linear model. For D < 8 an additional (metastable) sector exists which is separated from the conformal singularity by a potential barrier of finite height and width so that systems in this sector are prone to collapse into the conformal singularity. This second sector is not smoothly connected with the first (absolutely stable) one. Several limiting cases and the possibility of inflation are discussed for the R{sup 4} model.

  6. A NEW MULTI-DIMENSIONAL GENERAL RELATIVISTIC NEUTRINO HYDRODYNAMICS CODE FOR CORE-COLLAPSE SUPERNOVAE. II. RELATIVISTIC EXPLOSION MODELS OF CORE-COLLAPSE SUPERNOVAE

    Energy Technology Data Exchange (ETDEWEB)

    Mueller, Bernhard; Janka, Hans-Thomas; Marek, Andreas, E-mail: bjmuellr@mpa-garching.mpg.de, E-mail: thj@mpa-garching.mpg.de [Max-Planck-Institut fuer Astrophysik, Karl-Schwarzschild-Str. 1, D-85748 Garching (Germany)

    2012-09-01

    We present the first two-dimensional general relativistic (GR) simulations of stellar core collapse and explosion with the COCONUT hydrodynamics code in combination with the VERTEX solver for energy-dependent, three-flavor neutrino transport, using the extended conformal flatness condition for approximating the space-time metric and a ray-by-ray-plus ansatz to tackle the multi-dimensionality of the transport. For both of the investigated 11.2 and 15 M{sub Sun} progenitors we obtain successful, though seemingly marginal, neutrino-driven supernova explosions. This outcome and the time evolution of the models basically agree with results previously obtained with the PROMETHEUS hydro solver including an approximative treatment of relativistic effects by a modified Newtonian potential. However, GR models exhibit subtle differences in the neutrinospheric conditions compared with Newtonian and pseudo-Newtonian simulations. These differences lead to significantly higher luminosities and mean energies of the radiated electron neutrinos and antineutrinos and therefore to larger energy-deposition rates and heating efficiencies in the gain layer with favorable consequences for strong nonradial mass motions and ultimately for an explosion. Moreover, energy transfer to the stellar medium around the neutrinospheres through nucleon recoil in scattering reactions of heavy-lepton neutrinos also enhances the mentioned effects. Together with previous pseudo-Newtonian models, the presented relativistic calculations suggest that the treatment of gravity and energy-exchanging neutrino interactions can make differences of even 50%-100% in some quantities and is likely to contribute to a finally successful explosion mechanism no less than hydrodynamical differences between different dimensions.

  7. Preliminary H{sub 2} Combustion Analysis in the Containment of APR1400 for SBLOCA Accident using a Multi-Dimensional H{sub 2} Analysis System

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Hyung Seok; Kim, Jongtae; Kim, Sang-Baik; Hong, Seong-Wan [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    COM3D analyzes the overpressure buildup resulting from the propagation of a hydrogen flame along the structures and walls in the containment, using the hydrogen distribution calculated by GASFLOW. MAAP evaluates the hydrogen source during a severe accident and transfers it to GASFLOW. We previously performed a hydrogen combustion analysis using the multi-dimensional hydrogen analysis system for a station blackout (SBO) accident under the assumption of a 100% metal-water reaction in the reactor vessel. The COM3D results showed that the pressure buildup was about 250 kPa because the flame speed did not increase above 300 m/s and the pressure wave passed through the open spaces in the large containment. To increase the reliability of the COM3D calculation, it is necessary to perform the hydrogen combustion analysis for another accident, such as a small-break loss-of-coolant accident (SBLOCA). KAERI performed a hydrogen combustion analysis for an SBLOCA accident using the multi-dimensional hydrogen analysis system under the assumption of a 100% metal-water reaction in the reactor vessel. From the COM3D results, we find that the pressure buildup was approximately 310 kPa because the flame speed did not increase above 100 m/s owing to the high steam concentration and low oxygen concentration in the hydrogen-distributed region of the containment. The predicted maximum overpressure in the SBLOCA accident is similar to that of the COM3D results for the SBO accident. Thus, we found that the maximum overpressure due to hydrogen combustion in the containment may depend on the amount of hydrogen released from the reactor vessel.

  8. IMPROVEMENT OF ACCURACY OF RADIATIVE HEAT TRANSFER DIFFERENTIAL APPROXIMATION METHOD FOR MULTI DIMENSIONAL SYSTEMS BY MEANS OF AUTO-ADAPTABLE BOUNDARY CONDITIONS

    Directory of Open Access Journals (Sweden)

    K. V. Dobrego

    2015-01-01

    The differential approximation is derived from the radiation transfer equation by averaging over the solid angle. It is one of the more effective methods for engineering calculations of radiative heat transfer in complex three-dimensional thermal power systems with selective and scattering media. A new method for improving the accuracy of the differential approximation, based on the use of auto-adaptable boundary conditions, is introduced in this paper. The efficiency of the method is proved for test 2D systems. Self-consistent auto-adaptable boundary conditions taking into consideration the non-orthogonal component of the radiation flux incident on the boundary are formulated. It is demonstrated that taking the non-orthogonal incident flux into consideration in multi-dimensional systems, such as furnaces, boilers and combustion chambers, improves the accuracy of the radiant flux simulations, particularly in the zones adjacent to the edges of the chamber. Test simulations utilizing the differential approximation method with traditional boundary conditions, the new self-consistent boundary conditions and the “precise” discrete ordinates method were performed. The mean square errors of the resulting radiative fluxes calculated along the boundary of rectangular and triangular test areas were decreased 1.5-2 times by using auto-adaptable boundary conditions. Radiation flux gaps in the corner points of non-symmetric systems are revealed by using auto-adaptable boundary conditions, which cannot be obtained by using the conventional boundary conditions.

  9. Conformal irradiation of the prostate: estimating long-term rectal bleeding risk using dose-volume histograms

    International Nuclear Information System (INIS)

    Hartford, Alan C.; Niemierko, Andrzej; Adams, Judith A.; Urie, Marcia M.; Shipley, William U.

    1996-01-01

    Conclusions: There is a dose-volume relationship for rectal mucosal bleeding in the region between 60 and 75 CGE; therefore, efforts to spare rectal wall volume using improved treatment planning and delivery techniques are important. Stratifying dose-volume histograms (DVHs) into risk groups, as done in this study, represents a useful means of analyzing empirical data as a function of heterogeneous dose distributions. Modeling efforts may extend these results to more heterogeneous treatment techniques. Such analysis of DVH data may allow practicing clinicians to better assess the risk of various treatments, fields, or doses when caring for an individual patient
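    A cumulative dose-volume histogram of the kind analyzed above can be computed directly from a dose grid and a structure mask, as in the minimal sketch below; the dose grid, mask and dose levels are synthetic placeholders.

```python
# Cumulative DVH: fraction of a structure receiving at least each dose level.
import numpy as np

def cumulative_dvh(dose, mask, levels):
    vals = dose[mask]
    return np.array([(vals >= d).mean() for d in levels])

dose = np.random.gamma(shape=9.0, scale=8.0, size=(64, 64, 32))   # toy dose, Gy
rectum = np.zeros_like(dose, dtype=bool)
rectum[20:40, 20:40, 10:20] = True                                # toy structure
levels = np.arange(0, 81, 5)
dvh = cumulative_dvh(dose, rectum, levels)
print(f"V70Gy = {100 * dvh[levels == 70][0]:.1f}% of structure volume")
```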

  10. Comparative study of pulsed-continuous arterial spin labeling and dynamic susceptibility contrast imaging by histogram analysis in evaluation of glial tumors.

    Science.gov (United States)

    Arisawa, Atsuko; Watanabe, Yoshiyuki; Tanaka, Hisashi; Takahashi, Hiroto; Matsuo, Chisato; Fujiwara, Takuya; Fujiwara, Masahiro; Fujimoto, Yasunori; Tomiyama, Noriyuki

    2018-06-01

    Arterial spin labeling (ASL) is a non-invasive perfusion technique that may be an alternative to dynamic susceptibility contrast magnetic resonance imaging (DSC-MRI) for the assessment of brain tumors. To our knowledge, there have been no reports on histogram analysis of ASL. The purpose of this study was to determine whether ASL is comparable with DSC-MRI in terms of differentiating high-grade and low-grade gliomas by evaluating the histogram analysis of cerebral blood flow (CBF) in the entire tumor. Thirty-four patients with pathologically proven glioma underwent ASL and DSC-MRI. High-signal areas on contrast-enhanced T1-weighted images or high-intensity areas on fluid-attenuated inversion recovery images were designated as the volumes of interest (VOIs). ASL-CBF, DSC-CBF, and DSC-cerebral blood volume maps were constructed and co-registered to the VOI. Perfusion histogram analyses of the whole VOI and statistical analyses were performed to compare the ASL and DSC images. There was no significant difference in the mean values for any of the histogram metrics in either the low-grade gliomas (n = 15) or the high-grade gliomas (n = 19). Strong correlations were seen in the 75th percentile, mean, median, and standard deviation values between the ASL and DSC images. The area under the curve values tended to be greater for the DSC images than for the ASL images. DSC-MRI is superior to ASL for distinguishing high-grade from low-grade glioma. ASL could be an alternative evaluation method when DSC-MRI cannot be used, e.g., in patients with renal failure, those in whom repeated examination is required, and children.

  11. Whole-lesion histogram analysis metrics of the apparent diffusion coefficient as a marker of breast lesions characterization at 1.5 T

    International Nuclear Information System (INIS)

    Bougias, H.; Ghiatas, A.; Priovolos, D.; Veliou, K.; Christou, A.

    2017-01-01

    Introduction: To retrospectively assess the role of whole-lesion apparent diffusion coefficient (ADC) histogram analysis in the characterization of breast tumors by comparing different histogram metrics. Methods: 49 patients with 53 breast lesions underwent magnetic resonance imaging (MRI). ADC histogram parameters, including the mean, mode, 10th/50th/90th percentile, skewness, kurtosis, and entropy ADCs, were derived for the whole-lesion volume in each patient. The Mann–Whitney U-test and the area under the receiver-operating characteristic curve (AUC) were used for statistical analysis. Results: The mean, mode and 10th/50th/90th percentile ADC values were significantly lower in malignant lesions compared with benign ones (all P < 0.0001), while skewness was significantly higher in malignant lesions (P = 0.02). However, no significant difference was found between malignant and benign lesions for entropy and kurtosis (P = 0.06 and P = 1.00, respectively). Univariate logistic regression showed that the 10th and 50th percentile ADCs yielded the highest AUC (0.985; 95% confidence interval [CI]: 0.902, 1.000 and 0.982; 95% CI: 0.896, 1.000, respectively), whereas the kurtosis value yielded the lowest AUC (0.500; 95% CI: 0.355, 0.645), indicating that the 10th and 50th percentile ADC values may be more accurate for lesion discrimination. Conclusion: Whole-lesion ADC histogram analysis could be a helpful index in the characterization and differentiation of benign and malignant breast lesions, with the 10th and 50th percentile ADCs being the most accurate discriminators. - Highlights: • DWI is a noninvasive technique that allows quantification of water diffusion in tissues. • ADC histogram analysis is a useful index in differentiating benign and malignant breast tumors. • The 10th and 50th percentile ADC values are the best discriminators between breast lesions.
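    The sketch below shows how a single histogram metric, here a hypothetical 10th-percentile ADC per lesion, can be scored as a benign/malignant discriminator with ROC-AUC, as reported above. The values are simulated so that the malignant group has lower ADC percentiles.

```python
# ROC-AUC of one histogram metric as a lesion discriminator (synthetic data).
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
p10_benign = rng.normal(1.4, 0.2, 25)       # toy 10th-percentile ADC values
p10_malignant = rng.normal(0.9, 0.2, 28)
scores = np.concatenate([p10_benign, p10_malignant])
y = np.concatenate([np.zeros(25), np.ones(28)])   # 1 = malignant
# Negate so that a higher score means more suspicious (lower ADC).
print("AUC:", roc_auc_score(y, -scores))
```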

  12. DSP+FPGA-based real-time histogram equalization system of infrared image

    Science.gov (United States)

    Gu, Dongsheng; Yang, Nansheng; Pi, Defu; Hua, Min; Shen, Xiaoyan; Zhang, Ruolan

    2001-10-01

    Histogram modification is a simple but effective method to enhance an infrared image. Owing to the different characteristics of different infrared images, there are several methods to equalize an infrared image's histogram, such as the traditional HE (histogram equalization) method and the improved HP (histogram projection) and PE (plateau equalization) methods. To realize these methods in a single system, the system must have a large memory and extremely fast speed. In our system, we introduce DSP+FPGA-based real-time processing technology to support them together: the FPGA realizes the part common to these methods, while the DSP handles the parts that differ. The choice of method and the parameters can be input from a keyboard or a computer. By this means, the system is powerful yet easy to operate and maintain. In this article, we give the block diagram of the system and the software flow chart of the methods, and at the end we show an infrared image and its histogram before and after processing with the HE method.
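    Of the variants listed above, plateau equalization (PE) is the easiest to sketch: the histogram is clipped at a plateau value before the usual cumulative remapping, which keeps large uniform backgrounds from monopolizing the output gray levels. Ordinary HE is the special case of an unbounded plateau. The plateau value below is an arbitrary placeholder.

```python
# Plateau equalization: clip the histogram, then remap through its CDF.
import numpy as np

def plateau_equalize(img, plateau, levels=256):
    hist, _ = np.histogram(img, bins=levels, range=(0, levels))
    clipped = np.minimum(hist, plateau)            # the PE modification
    cdf = np.cumsum(clipped).astype(float)
    lut = np.round((levels - 1) * cdf / cdf[-1]).astype(np.uint8)
    return lut[img]

ir = np.random.randint(0, 256, (240, 320), dtype=np.uint8)  # toy IR frame
enhanced = plateau_equalize(ir, plateau=200)
```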

  13. Particle swarm optimization-based local entropy weighted histogram equalization for infrared image enhancement

    Science.gov (United States)

    Wan, Minjie; Gu, Guohua; Qian, Weixian; Ren, Kan; Chen, Qian; Maldague, Xavier

    2018-06-01

    Infrared image enhancement plays a significant role in intelligent urban surveillance systems for smart city applications. Unlike existing methods that only exaggerate the global contrast, we propose a particle swarm optimization-based local entropy weighted histogram equalization which involves the enhancement of both local details and foreground/background contrast. First, a novel local entropy weighted histogram depicting the distribution of detail information is calculated based on a modified hyperbolic tangent function. Then, the histogram is divided into two parts via a threshold maximizing the inter-class variance in order to improve the contrast of foreground and background, respectively. To avoid over-enhancement and noise amplification, double plateau thresholds of the presented histogram are formulated by means of the particle swarm optimization algorithm. Lastly, each sub-image is equalized independently according to the constrained sub-local entropy weighted histogram. Comparative experiments implemented on real infrared images prove that our algorithm outperforms other state-of-the-art methods in terms of both visual and quantitative evaluations.
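    To make the optimization step concrete, the toy below uses a small particle swarm to find a single histogram threshold maximizing the inter-class variance (Otsu's criterion). The paper searches for double plateau thresholds of an entropy-weighted histogram; this sketch only demonstrates the PSO-over-histogram mechanic, with standard inertia and attraction coefficients assumed.

```python
# Mini PSO searching a histogram threshold that maximizes inter-class variance.
import numpy as np

def interclass_variance(hist, t):
    p = hist / hist.sum()
    w0, w1 = p[:t].sum(), p[t:].sum()
    if w0 == 0 or w1 == 0:
        return 0.0
    g = np.arange(len(p))
    mu0 = (g[:t] * p[:t]).sum() / w0
    mu1 = (g[t:] * p[t:]).sum() / w1
    return w0 * w1 * (mu0 - mu1) ** 2

def pso_threshold(hist, n_particles=20, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(1, len(hist) - 1, n_particles)        # particle positions
    v = np.zeros(n_particles)                              # velocities
    pbest = x.copy()
    pbest_f = np.array([interclass_variance(hist, int(t)) for t in x])
    gbest = pbest[pbest_f.argmax()]
    for _ in range(iters):
        r1, r2 = rng.random(n_particles), rng.random(n_particles)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = np.clip(x + v, 1, len(hist) - 1)
        f = np.array([interclass_variance(hist, int(t)) for t in x])
        better = f > pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[pbest_f.argmax()]
    return int(gbest)

data = np.random.randn(10000) + 3 * (np.random.rand(10000) > 0.5)  # bimodal
hist, _ = np.histogram(data, bins=256)
print("PSO threshold bin:", pso_threshold(hist))
```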

  14. Histogram Curve Matching Approaches for Object-based Image Classification of Land Cover and Land Use

    Science.gov (United States)

    Toure, Sory I.; Stow, Douglas A.; Weeks, John R.; Kumar, Sunil

    2013-01-01

    The classification of image-objects is usually done using parametric statistical measures of central tendency and/or dispersion (e.g., mean or standard deviation). The objectives of this study were to analyze digital number histograms of image objects and evaluate classification measures exploiting characteristic signatures of such histograms. Two histogram-matching classifiers were evaluated and compared to the standard nearest-neighbor-to-mean classifier. An ADS40 airborne multispectral image of San Diego, California was used for assessing the utility of curve matching classifiers in a geographic object-based image analysis (GEOBIA) approach. The classifications were performed with data sets having 0.5 m, 2.5 m, and 5 m spatial resolutions. Results show that histograms are reliable features for characterizing classes. Also, both histogram-matching classifiers consistently performed better than the one based on the standard nearest-neighbor-to-mean rule. The highest classification accuracies were produced with images having 2.5 m spatial resolution. PMID:24403648
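    A histogram curve matching classifier of the general kind evaluated above can be sketched briefly: each image-object's normalized histogram is compared against per-class reference histograms, here with histogram intersection as the similarity measure (the study's exact matching measures are not reproduced). The reference histograms and object pixels are synthetic placeholders.

```python
# Classify an image-object by matching its histogram to class references.
import numpy as np

def normalized_hist(values, bins=32, value_range=(0, 255)):
    h, _ = np.histogram(values, bins=bins, range=value_range)
    return h / h.sum()

def histogram_intersection(h1, h2):
    return np.minimum(h1, h2).sum()        # 1.0 means identical histograms

def classify_object(pixel_values, class_hists):
    h = normalized_hist(pixel_values)
    scores = {c: histogram_intersection(h, ref) for c, ref in class_hists.items()}
    return max(scores, key=scores.get)

rng = np.random.default_rng(2)
class_hists = {
    "vegetation": normalized_hist(rng.normal(80, 15, 5000)),
    "pavement": normalized_hist(rng.normal(170, 20, 5000)),
}
obj_pixels = rng.normal(85, 15, 400)       # pixels of one unlabeled object
print(classify_object(obj_pixels, class_hists))
```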

  15. Histogram Analysis of Diffusion Tensor Imaging Parameters in Pediatric Cerebellar Tumors.

    Science.gov (United States)

    Wagner, Matthias W; Narayan, Anand K; Bosemani, Thangamadhan; Huisman, Thierry A G M; Poretti, Andrea

    2016-05-01

    Apparent diffusion coefficient (ADC) values have been shown to assist in differentiating cerebellar pilocytic astrocytomas and medulloblastomas. Previous studies have applied only ADC measurements and calculated the mean/median values. Here we investigated the value of whole-tumor diffusion tensor imaging (DTI) histogram characteristics for the differentiation of cerebellar pilocytic astrocytomas and medulloblastomas. Presurgical DTI data were analyzed with a region of interest (ROI) approach to include the entire tumor. For each tumor, histogram-derived metrics including the 25th percentile, 75th percentile, and skewness were calculated for fractional anisotropy (FA) and mean (MD), axial (AD), and radial (RD) diffusivity. The histogram metrics were used as the primary predictors of interest in a logistic regression model. Histogram skewness showed statistically significant differences for MD between low- and high-grade tumors (P = .008). The 25th percentile for MD yields the best results for the presurgical differentiation between pediatric cerebellar pilocytic astrocytomas and medulloblastomas. The analysis of other DTI metrics does not provide additional diagnostic value. Our study confirms the diagnostic value of quantitative histogram analysis of DTI data in pediatric neuro-oncology. Copyright © 2015 by the American Society of Neuroimaging.

  16. Value of MR histogram analyses for prediction of microvascular invasion of hepatocellular carcinoma.

    Science.gov (United States)

    Huang, Ya-Qin; Liang, He-Yue; Yang, Zhao-Xia; Ding, Ying; Zeng, Meng-Su; Rao, Sheng-Xiang

    2016-06-01

    The objective is to explore the value of preoperative magnetic resonance (MR) histogram analyses in predicting microvascular invasion (MVI) of hepatocellular carcinoma (HCC). Fifty-one patients with histologically confirmed HCC who underwent diffusion-weighted and contrast-enhanced MR imaging were included. Histogram analyses were performed and the mean, variance, skewness, kurtosis, and 1st, 10th, 50th, 90th, and 99th percentiles were derived. Quantitative histogram parameters were compared between HCCs with and without MVI. Receiver operating characteristic (ROC) analyses were generated to compare the diagnostic performance of tumor size, histogram analyses of apparent diffusion coefficient (ADC) maps, and MR enhancement. The mean and the 1st, 10th, and 50th percentiles of the ADC maps, and the mean, variance, and 1st, 10th, 50th, 90th, and 99th percentiles of the portal venous phase (PVP) images, were significantly different between the groups with and without MVI. MR histogram analyses, in particular the 1st percentile for PVP images, held promise for the prediction of MVI of HCC.

  17. Histogram analysis of noise performance on fractional anisotropy brain MR image with different diffusion gradient numbers

    International Nuclear Information System (INIS)

    Chang, Yong Min; Kim, Yong Sun; Kang, Duk Sik; Lee, Young Joo; Sohn, Chul Ho; Woo, Seung Koo; Suh, Kyung Jin

    2005-01-01

    We wished to analyze, qualitatively and quantitatively, the noise performance of fractional anisotropy (FA) brain images for different numbers of diffusion gradient directions by using the histogram method. Diffusion tensor images were acquired using a 3.0 T MR scanner from ten normal volunteers who had no neurological symptoms. Single-shot spin-echo EPI with a Stejskal-Tanner type diffusion gradient scheme was employed for the diffusion tensor measurement. With a b-value of 1000 s/mm², diffusion tensor images were obtained for 6, 11, 23, 35 and 47 diffusion gradient directions. FA images were generated for each DTI scheme. Histograms were then obtained at selected ROIs for the anatomical structures on the FA image. At the same ROI location, the mean FA value and the standard deviation of the mean FA value were calculated. The quality of the FA image improved as the number of diffusion gradient directions increased, showing better contrast between WM and GM. The histogram showed that the variance of FA values was reduced as the number of diffusion gradient directions increased. This histogram analysis was in good agreement with the result obtained using quantitative analysis. The image quality of the FA map was significantly improved as the number of diffusion gradient directions increased. The histogram analysis demonstrated well that the improvement in the FA images resulted from the reduction in the variance of the FA values included in the ROI

  18. Sound is Multi-Dimensional

    DEFF Research Database (Denmark)

    Bergstrøm-Nielsen, Carl

    2006-01-01

    The first part of this work examines the concept of musical parameter theory and discusses its methodical use. The second part is an annotated catalogue of 33 different students' compositions, presented in their totality with English translations, created between 1985 and 2006 as part of the subject Intuitive Music at Music Therapy, AAU. 20 of these have sound files as well. The work thus serves as an anthology of this form of composition. All the compositions are systematically presented according to parameters: pitch, duration, dynamics, timbre, density, pulse-no pulse, tempo, stylistic

  19. A dose-volume histogram based decision-support system for dosimetric comparison of radiotherapy treatment plans

    International Nuclear Information System (INIS)

    Alfonso, J. C. L.; Herrero, M. A.; Núñez, L.

    2015-01-01

    The choice of any radiotherapy treatment plan is usually made after the evaluation of a few preliminary isodose distributions obtained from different beam configurations. Despite considerable advances in planning techniques, such a final decision remains a challenging task that would greatly benefit from efficient and reliable assessment tools. For any dosimetric plan considered, data on dose-volume histograms supplied by treatment planning systems are used to provide estimates of planning target coverage as well as of sparing of organs at risk and the remaining healthy tissue. These partial metrics are then combined into a dose distribution index (DDI), which provides a unified, easy-to-read score for each competing radiotherapy plan; a sketch of this combination step is given below. To assess the performance of the proposed scoring system, DDI figures for fifty brain cancer patients were retrospectively evaluated. Patients were divided into three groups depending on tumor location and malignancy. For each patient, three tentative plans were designed and recorded during planning, one of which was eventually selected for treatment. We were thus able to compare the plans with better DDI scores and those actually delivered. When planning target coverage and organs at risk sparing are considered equally important, the tentative plan with the highest DDI score is shown to coincide with that actually delivered in 32 of the 50 patients considered. In 15 (respectively 3) of the remaining 18 cases, the plan with the highest DDI value still coincides with that actually selected, provided that organs at risk sparing is given higher priority (respectively, lower priority) than target coverage. DDI provides a straightforward and non-subjective tool for dosimetric comparison of tentative radiotherapy plans. In particular, DDI readily quantifies differences among competing plans with similar-looking dose-volume histograms and can be easily implemented for any tumor type and localization, irrespective of the planning system.
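    As a purely illustrative sketch of the combination step referenced above, the snippet below folds normalized partial metrics into one plan score with explicit priority weights; the DDI's actual functional form and weighting are defined in the paper and are not reproduced here.

```python
# Toy plan score from DVH-derived partial metrics (all values placeholders).
def plan_score(target_coverage, oar_sparing, healthy_sparing,
               w_target=0.5, w_oar=0.3, w_healthy=0.2):
    """Each partial metric is assumed normalized to [0, 1], higher is better."""
    return (w_target * target_coverage
            + w_oar * oar_sparing
            + w_healthy * healthy_sparing)

plans = {"plan A": (0.97, 0.80, 0.85), "plan B": (0.95, 0.90, 0.88)}
for name, metrics in plans.items():
    print(name, round(plan_score(*metrics), 3))
# Shifting weight from target coverage to OAR sparing can reorder the plans,
# mirroring the priority analysis reported for the 18 discordant cases.
```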

  20. Infrared face recognition based on LBP histogram and KW feature selection

    Science.gov (United States)

    Xie, Zhihua

    2014-07-01

    The conventional feature based on the local binary pattern (LBP) histogram still has room for performance improvement. This paper focuses on the dimension reduction of LBP micro-patterns and proposes an improved infrared face recognition method based on LBP histogram representation. To extract robust local features in infrared face images, LBP is chosen to capture the composition of micro-patterns in sub-blocks. Based on statistical test theory, a Kruskal-Wallis (KW) feature selection method is proposed to obtain the LBP patterns which are suitable for infrared face recognition. The experimental results show that the combination of LBP and KW feature selection improves the performance of infrared face recognition; the proposed method outperforms traditional methods based on the LBP histogram, the discrete cosine transform (DCT) or principal component analysis (PCA).
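    The pipeline above reduces to two steps that the sketch below reproduces on synthetic data: build a uniform-LBP histogram per face (whole-image here, rather than per-block as in the paper), then keep only the histogram bins whose frequencies separate the subject classes under the Kruskal-Wallis test. The 0.05 selection level is an assumption.

```python
# LBP histogram features with Kruskal-Wallis (KW) feature selection (toy data).
import numpy as np
from skimage.feature import local_binary_pattern
from scipy.stats import kruskal

def lbp_histogram(img, P=8, R=1):
    codes = local_binary_pattern(img, P, R, method="uniform")
    n_bins = P + 2                        # number of uniform LBP patterns
    h, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins))
    return h / h.sum()

rng = np.random.default_rng(3)
faces = (rng.random((30, 64, 64)) * 255).astype(np.uint8)  # toy "IR faces"
labels = np.repeat(np.arange(3), 10)                       # 3 toy subjects

X = np.array([lbp_histogram(f) for f in faces])
selected = []
for j in range(X.shape[1]):               # KW test per LBP pattern bin
    groups = [X[labels == c, j] for c in np.unique(labels)]
    stat, p = kruskal(*groups)
    if p < 0.05:                           # assumed selection criterion
        selected.append(j)
print("selected LBP bins:", selected)
```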