WorldWideScience

Sample records for distributed mammogram analysis

  1. Anatomic breast coordinate system for mammogram analysis

    DEFF Research Database (Denmark)

    Karemore, Gopal; Brandt, S.; Karssemeijer, N.

    2011-01-01

Purpose: Many researchers have investigated measures other than density in the mammogram, such as texture-based measures, to improve breast cancer risk assessment. However, parenchymal texture characteristics are highly dependent on the orientation of the vasculature and fibrous tissue... was represented by geodesic distance (s) from the nipple and parametric angle (θ), as shown in figure 1. The scoring technique, called MTR (mammographic texture resemblance marker), used this breast coordinate system to extract Gaussian derivative features. The features extracted using the (x,y) and the curve... methodologies, as seen from table 2 in the given temporal study. Conclusion: The curvilinear anatomical breast coordinate system facilitated computerized analysis of mammograms. The proposed coordinate system slightly improved the risk segregation by mammographic texture resemblance and minimized the geometrical...

  2. Mammogram CAD, hybrid registration and iconic analysis

    Science.gov (United States)

    Boucher, A.; Cloppet, F.; Vincent, N.

    2013-03-01

This paper develops a computer-aided diagnosis (CAD) system based on a two-step methodology to register and analyze pairs of temporal mammograms. The concept of a "medical file", including all previous medical information on a patient, enables joint analysis of acquisitions taken at different times and the detection of significant modifications. The registration method aims to superimpose the anatomical structures of the breast as closely as possible. It is designed to eliminate deformations introduced by the acquisition process while preserving those due to breast changes indicative of malignancy. To reach this goal, a reference image is computed from control points based on automatically extracted anatomical features. The second image of the pair is then realigned on the reference image using a coarse-to-fine approach, guided by expert knowledge, that allows both rigid and non-rigid transforms. The joint analysis detects the evolution between two images representing the same scene. To achieve this, the registration error limits must be known so that the observation scale can be adapted. The approach used in this paper is based on a sparse image representation: decomposed into regular patterns, the images are analyzed from a new angle. The evolution-detection problem has many practical applications, especially in medical imaging. The CAD is evaluated using the recall and precision of differences between mammograms.

  3. MAMMOGRAMS ANALYSIS USING SVM CLASSIFIER IN COMBINED TRANSFORMS DOMAIN

    Directory of Open Access Journals (Sweden)

    B.N. Prathibha

    2011-02-01

Breast cancer is a primary cause of mortality and morbidity in women. Reports reveal that the earlier abnormalities are detected, the better the survival. Digital mammograms are one of the most effective means of detecting possible breast anomalies at early stages. Digital mammograms supported by Computer-Aided Diagnostic (CAD) systems help radiologists take reliable decisions. The proposed CAD system extracts wavelet features and spectral features for better classification of mammograms. A Support Vector Machine (SVM) classifier is used to analyze 206 mammogram images from the MIAS database according to the severity of abnormality, i.e., benign and malignant. The proposed system gives 93.14% accuracy for discrimination between normal and malignant, 87.25% for normal and benign, and 89.22% for benign and malignant samples. The study reveals that features extracted in a hybrid transform domain with an SVM classifier prove to be a promising tool for the analysis of mammograms.
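The wavelet-feature idea in this abstract can be sketched with a one-level Haar transform; the following is a generic NumPy illustration of subband-energy texture features, not the authors' exact feature set or classifier:

```python
import numpy as np

def haar_dwt2(img):
    """One-level 2-D Haar transform: returns (LL, LH, HL, HH) subbands.
    Expects an array with even height and width."""
    a = img.astype(float)
    # transform rows into low-pass and high-pass halves
    lo = (a[:, 0::2] + a[:, 1::2]) / 2.0
    hi = (a[:, 0::2] - a[:, 1::2]) / 2.0
    # transform columns of each half
    ll = (lo[0::2, :] + lo[1::2, :]) / 2.0
    lh = (lo[0::2, :] - lo[1::2, :]) / 2.0
    hl = (hi[0::2, :] + hi[1::2, :]) / 2.0
    hh = (hi[0::2, :] - hi[1::2, :]) / 2.0
    return ll, lh, hl, hh

def wavelet_energy_features(img):
    """Mean energy of each detail subband -- a small texture descriptor."""
    _, lh, hl, hh = haar_dwt2(img)
    return np.array([np.mean(lh ** 2), np.mean(hl ** 2), np.mean(hh ** 2)])
```

For classification, feature vectors of this kind would then be fed to an SVM trained on labelled benign/malignant cases.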

  4. Computerized Analysis and Detection of Missed Cancer in Screening Mammogram

    National Research Council Canada - National Science Library

    Li, Lihua

    2005-01-01

This project aims to explore an innovative CAD strategy for improving early detection of breast cancer in screening mammograms by focusing on computerized analysis and detection of cancers missed by radiologists...

  5. Computerized Analysis and Detection of Missed Cancer in Screening Mammogram

    National Research Council Canada - National Science Library

    Li, Lihua

    2004-01-01

This project aims to explore an innovative CAD strategy for improving early detection of breast cancer in screening mammograms by focusing on computerized analysis and detection of cancers missed by radiologists...

  6. Application of texture analysis method for mammogram density classification

    Science.gov (United States)

    Nithya, R.; Santhi, B.

    2017-07-01

Mammographic density is considered a major risk factor for developing breast cancer. This paper proposes an automated approach to classifying breast tissue types in digital mammograms. The main objective of the proposed Computer-Aided Diagnosis (CAD) system is to investigate various feature extraction methods and classifiers to improve diagnostic accuracy in mammogram density classification. Texture analysis methods are used to extract features from the mammogram: histogram, Gray Level Co-Occurrence Matrix (GLCM), Gray Level Run Length Matrix (GLRLM), Gray Level Difference Matrix (GLDM), Local Binary Pattern (LBP), entropy, Discrete Wavelet Transform (DWT), Wavelet Packet Transform (WPT), Gabor transform and trace transform. The extracted features are selected using Analysis of Variance (ANOVA) and fed into classifiers for two-class (fatty/dense) and three-class (fatty/glandular/dense) breast density classification. This work was carried out using the mini-Mammographic Image Analysis Society (MIAS) database. Five classifiers are employed, namely Artificial Neural Network (ANN), Linear Discriminant Analysis (LDA), Naive Bayes (NB), K-Nearest Neighbor (KNN), and Support Vector Machine (SVM). Experimental results show that ANN outperforms the LDA, NB, KNN and SVM classifiers, achieving 97.5% accuracy for three-class and 99.37% for two-class density classification.
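The ANOVA-based selection step described above can be sketched as a one-way F statistic computed per feature column; a minimal NumPy version (in the paper's pipeline such a statistic ranks texture features before classification):

```python
import numpy as np

def anova_f(X, y):
    """One-way ANOVA F statistic per feature column.
    X: (n_samples, n_features), y: integer class labels.
    Larger F means the feature separates the classes better."""
    classes = np.unique(y)
    n, grand = len(y), X.mean(axis=0)
    ss_between = np.zeros(X.shape[1])
    ss_within = np.zeros(X.shape[1])
    for c in classes:
        Xc = X[y == c]
        ss_between += len(Xc) * (Xc.mean(axis=0) - grand) ** 2
        ss_within += ((Xc - Xc.mean(axis=0)) ** 2).sum(axis=0)
    df_b, df_w = len(classes) - 1, n - len(classes)
    return (ss_between / df_b) / (ss_within / df_w)
```

Features with the highest F values would be kept and passed to the classifiers.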

  7. Field nonuniformity correction for quantitative analysis of digitized mammograms

    International Nuclear Information System (INIS)

    Pawluczyk, Olga; Yaffe, Martin J.

    2001-01-01

Several factors, including the heel effect, variation in distance from the x-ray source to points in the image and path obliquity, contribute to the signal nonuniformity of mammograms. To best use digitized mammograms for quantitative image analysis, these field nonuniformities must be corrected. An empirically based correction method, which uses a bowl-shaped calibration phantom, has been developed. Due to the annular spherical shape of the phantom, its attenuation is constant over the entire image. Remaining nonuniformities are due only to the heel and inverse square effects, as well as the variable path through the beam filter, compression plate and image receptor. In logarithmic space, a normalized image of the phantom can be added to mammograms to correct for these effects. Then, an analytical correction for path obliquity in the breast can be applied to the images. It was found that the correction reduces the errors associated with field nonuniformity from 14% to 2% for a 4 cm block of material corresponding to a combination of 50% fibroglandular and 50% fatty breast tissue. A repeatability study showed that in regions as far as 20 cm away from the chest wall, variations due to imaging conditions and phantom alignment contribute <2% to the overall corrected signal.
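The log-space correction described here can be sketched as follows; a minimal NumPy illustration under the assumption that the phantom image captures only the multiplicative field nonuniformity:

```python
import numpy as np

def flatfield_correct_log(mammo, phantom):
    """Correct field nonuniformity in log space.

    The bowl phantom has constant attenuation, so any variation in its
    log image is due to the heel effect, inverse-square falloff and the
    variable filter path.  Adding the zero-mean negative of that
    variation to a log mammogram cancels the same effects."""
    log_m = np.log(mammo.astype(float))
    log_p = np.log(phantom.astype(float))
    return log_m + (log_p.mean() - log_p)
```

Applied to a mammogram of a uniform object, the corrected log image should be flat up to noise.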

  8. Computerized image analysis: estimation of breast density on mammograms

    Science.gov (United States)

    Zhou, Chuan; Chan, Heang-Ping; Petrick, Nicholas; Sahiner, Berkman; Helvie, Mark A.; Roubidoux, Marilyn A.; Hadjiiski, Lubomir M.; Goodsitt, Mitchell M.

    2000-06-01

An automated image analysis tool is being developed for estimation of mammographic breast density, which may be useful for risk estimation or for monitoring breast density change in a prevention or intervention program. A mammogram is digitized using a laser scanner and the resolution is reduced to a pixel size of 0.8 mm X 0.8 mm. Breast density analysis is performed in three stages. First, the breast region is segmented from the surrounding background by an automated breast boundary-tracking algorithm. Second, an adaptive dynamic range compression technique is applied to the breast image to reduce the range of the gray level distribution in the low frequency background and to enhance the differences in the characteristic features of the gray level histogram for breasts of different densities. Third, rule-based classification is used to classify the breast images into several classes according to the characteristic features of their gray level histogram. For each image, a gray level threshold is automatically determined to segment the dense tissue from the breast region. The area of segmented dense tissue as a percentage of the breast area is then estimated. In this preliminary study, we analyzed the interobserver variation of breast density estimation by two experienced radiologists using the BI-RADS lexicon. The radiologists' visually estimated percent breast densities were compared with the computer's calculation. The results demonstrate the feasibility of estimating mammographic breast density using computer vision techniques and its potential to improve the accuracy and reproducibility in comparison with the subjective visual assessment by radiologists.
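The threshold-then-measure step can be sketched with a standard Otsu threshold standing in for the paper's rule-based, histogram-specific threshold selection (a substitution, chosen only because Otsu's method is widely known):

```python
import numpy as np

def otsu_threshold(pixels, nbins=256):
    """Gray-level threshold maximizing between-class variance (Otsu)."""
    hist, edges = np.histogram(pixels, bins=nbins)
    p = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2.0
    w0 = np.cumsum(p)                 # class-0 probability up to each bin
    mu = np.cumsum(p * centers)       # cumulative mean
    mu_t = mu[-1]
    w1 = 1.0 - w0
    valid = (w0 > 0) & (w1 > 0)
    between = np.zeros(nbins)
    between[valid] = (mu_t * w0[valid] - mu[valid]) ** 2 / (w0[valid] * w1[valid])
    return centers[np.argmax(between)]

def percent_density(breast_pixels, threshold):
    """Dense-tissue area as a percentage of the breast area."""
    dense = np.count_nonzero(breast_pixels > threshold)
    return 100.0 * dense / breast_pixels.size
```

In the paper the percentage is computed only over the segmented breast region, not the whole image.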

  9. Use of prior mammograms in the transition to digital mammography: A performance and cost analysis

    International Nuclear Information System (INIS)

    Taylor-Phillips, S.; Wallis, M.G.; Duncan, A.; Gale, A.G.

    2012-01-01

Breast screening in Europe is gradually changing from film to digital imaging and reporting. During the transition period, prior mammograms (from the preceding screening round) are films, potentially causing difficulties in comparison with current digital mammograms. To examine this, breast screening performance was measured at a digital mammography workstation with prior mammograms displayed in different formats, and the associated costs were calculated. 160 selected difficult cases (41% malignant) were read by eight UK-qualified mammography readers under three conditions: with film prior mammograms; with digitised prior mammograms; or without prior mammograms. Lesion location and probability of malignancy were recorded, alongside a decision on whether to recall each case for further tests. JAFROC analysis showed a difference between conditions (p = .006); performance with prior mammograms in either film or digitised format was superior to that without prior mammograms (p < .05). There was no difference in performance when the prior mammograms were presented in film or digitised form. The number of benign or normal cases recalled was 26% higher without prior mammograms than with digitised or film prior mammograms (p < .05). This would correspond to an increase in recall rate at the study hospital from 4.3% to 5.5% with no associated increase in cancer detection rate. The cost of this increase was estimated at £11,581 (€13,666) per 10,000 women screened, which is higher than the cost of digitised (£11,114/€13,115) or film display (£6451/€7612) of the prior mammograms. It is recommended that, where available, prior mammograms be used in the transition to digital breast screening.

  10. An anatomically oriented breast coordinate system for mammogram analysis

    DEFF Research Database (Denmark)

    Brandt, Sami; Karemore, Gopal; Karssemeijer, Nico

    2011-01-01

    and the shape of the breast boundary because these are the most robust features independent of the breast size and shape. On the basis of these landmarks, we have constructed a nonlinear mapping between the parameter frame and the breast region in the mammogram. This mapping makes it possible to identify...... the corresponding positions and orientations among all of the ML or MLO mammograms, which facilitates an implicit use of the registration, i.e., no explicit image warping is needed. We additionally show how the coordinate transform can be used to extract Gaussian derivative features so that the feature positions...... and orientations are registered and extracted without non-linearly deforming the images. We use the proposed breast coordinate transform in a cross-sectional breast cancer risk assessment study of 490 women, in which we attempt to learn breast cancer risk factors from mammograms that were taken prior to when...

  11. TU-F-18C-09: Mammogram Surveillance Using Texture Analysis for Breast Cancer Patients After Radiation Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Kuo, H; Tome, W; FOX, J; Hong, L; Yaparpalvi, R; Mehta, K; Bodner, W; Kalnicki, S [Montefiore Medical Center/Albert Einstein College of Medicine, Bronx, New York (United States); Huang, Y [Memorial Sloan-Kettering Cancer Center, Great Neck, NY (United States)

    2014-06-15

Purpose: To study the feasibility of applying a cancer risk model established from treated patients to predict the risk of recurrence on follow-up mammography after radiation therapy, for both the ipsilateral and contralateral breast. Methods: An extensive set of textural feature functions was applied to a set of 196 mammograms from 50 patients. 56 mammograms from 28 patients were used as the training set, 44 mammograms from 22 patients as the test set, and the rest for prediction. Feature functions include histogram, gradient, co-occurrence matrix, run-length matrix and wavelet energy features. An optimum subset of the feature functions was selected by Fisher coefficient (FO) or mutual information (MI) (up to the top 10 features), or by a method combining FO, MI and principal components (FMP) (up to the top 30 features). One-nearest neighbor (1-NN), linear discriminant analysis (LDA) and nonlinear discriminant analysis (NDA) were used to build a risk model of breast cancer from the training set of mammograms at the time of diagnosis. The risk model was then used to predict the risk of recurrence from mammograms taken one year and three years after RT. Results: FMP with NDA had the best classification power in separating the training-set mammograms with lesions from those without. The FMP-with-NDA model achieved a true positive (TP) rate of 82%, compared with 45.5% using FO with 1-NN. The best false positive (FP) rates were 0% and 3.6% in the contralateral breast at 1 and 3 years after RT, and 10.9% in the ipsilateral breast at 3 years after RT. Conclusion: Texture analysis offers high dimensionality for differentiating breast tissue in mammograms. Using NDA to separate mammograms with lesions from those without, rather high TP and low FP rates can be achieved in mammogram surveillance of patients treated with conservative surgery combined with RT.

  12. Analysis and comparison of breast density according to age on mammogram between Korean and Western women

    International Nuclear Information System (INIS)

    Kim, Seung Hyung; Kim, Mi Hye; Oh, Ki Keun

    2000-01-01

To compare changes in breast parenchymal density among diverse age groups in asymptomatic Korean women with those of Western women, and to evaluate the effect of different patterns of breast parenchymal density on the sensitivity of screening mammography in Korean women. We analyzed the distribution of breast parenchymal density among diverse age groups in 823 asymptomatic Korean women aged 30-64 who underwent screening mammography between January and December 1998. On the basis of ACR BI-RADS breast composition, four density patterns were designated: patterns 1 and 2 correspond to fatty mammograms, and patterns 3 and 4 to dense mammograms. We compared the results with those for Western women. In Korean women, the frequency of dense mammograms was 88.1% (30-34 years old), 91.1% (35-39), 78.3% (40-44), 61.1% (45-49), 30.1% (50-54), 21.1% (55-59), and 7.0% (60-64). Korean women in their 40s thus showed a higher frequency of dense mammograms, but this frequency decreased abruptly between the ages of 40 and 54. In Western women, however, there was little difference between 40 and 54-year-olds: the figures were 47.2% (40-44 years), 44.8% (45-49), and 44.4% (50-54). Because the frequency of dense mammograms changes little between Western women in their forties and those in their fifties, mammographic sensitivity differs only slightly between these two age groups. In Korean women, by contrast, dense mammograms are much more frequent in the forties than in Western women of the same age, and their frequency decreases abruptly thereafter; mammographic sensitivity in Korean women therefore appears to be lower in the forties than in the fifties. It is therefore thought that mammography combined with ultrasonography may increase screening sensitivity among Korean women under 50, who have a relatively higher incidence of breast cancer in the younger age groups than do Western women. (author)

  13. Detection of masses in mammograms by analysis of gradient vector convergence using sector filter

    International Nuclear Information System (INIS)

    Fakhari, Y.; Karimian, A.; Mohammadbeigi, M.

    2012-01-01

Although mammography is the main diagnostic method for breast cancer, the interpretation of mammograms is a difficult task that depends on the experience and skill of the radiologist. Computer-Aided Detection (CADe) systems have been proposed to help radiologists interpret mammograms. In this paper a novel filter, called the Sector filter, is proposed to detect masses. The filter works by analyzing the convergence of gradient vectors toward the center of the filter. Using it, rounded convex regions, which are more likely to belong to a mass, can be detected in a gray-scale image. After applying the filter to the images at two scales, and to their linear combination, suspicious points were selected by a specific process. The implemented method achieved promising results; its performance was competitive with, and in some cases better than, that of other methods in the literature. (authors)
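The gradient-convergence measure at the heart of the Sector filter can be sketched as below; a simplified NumPy version that scores a whole patch rather than the authors' sector-wise filter:

```python
import numpy as np

def convergence_score(patch):
    """Mean convergence of gradient vectors toward the patch centre.

    Around a bright, roughly round mass, intensity increases toward the
    centre, so gradients point inward; their component along the inward
    radial direction is then positive on average."""
    gy, gx = np.gradient(patch.astype(float))
    h, w = patch.shape
    yy, xx = np.mgrid[0:h, 0:w]
    ry, rx = yy - (h - 1) / 2.0, xx - (w - 1) / 2.0
    norm = np.hypot(ry, rx)
    norm[norm == 0] = 1.0  # avoid division by zero at the centre pixel
    # negative sign: outward radial component of an inward gradient is negative
    conv = -(gx * rx + gy * ry) / norm
    return conv.mean()
```

Scanning such a score over the image at several scales and thresholding it would flag candidate mass locations.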

  14. Registration and analysis for images couple : application to mammograms

    OpenAIRE

    Boucher, Arnaud

    2014-01-01

    Advisor: Nicole Vincent. Date and location of PhD thesis defense: 10 January 2013, University of Paris Descartes In this thesis, the problem addressed is the development of a computer-aided diagnosis system (CAD) based on conjoint analysis of several images, and therefore on the comparison of these medical images. The particularity of our approach is to look for evolutions or aberrant new tissues in a given set, rather than attempting to characterize, with a strong a priori, the type of ti...

  15. Detection of architectural distortion in prior screening mammograms using Gabor filters, phase portraits, fractal dimension, and texture analysis

    International Nuclear Information System (INIS)

    Rangayyan, Rangaraj M.; Prajna, Shormistha; Ayres, Fabio J.; Desautels, J.E.L.

    2008-01-01

Mammography is a widely used screening tool for the early detection of breast cancer. One of the commonly missed signs of breast cancer is architectural distortion. The purpose of this study is to explore the application of fractal analysis and texture measures for the detection of architectural distortion in screening mammograms taken prior to the detection of breast cancer. A method based on Gabor filters and phase portrait analysis was used to detect initial candidates for sites of architectural distortion. A total of 386 regions of interest (ROIs) were automatically obtained from 14 "prior mammograms", including 21 ROIs related to architectural distortion. From the corresponding set of 14 "detection mammograms", 398 ROIs were obtained, including 18 related to breast cancer. For each ROI, the fractal dimension and Haralick's texture features were computed. The fractal dimension of the ROIs was calculated using the circular average power spectrum technique. The average fractal dimension of the normal (false-positive) ROIs was significantly higher than that of the ROIs with architectural distortion (p = 0.006). For the "prior mammograms", the best receiver operating characteristics (ROC) performance achieved, in terms of the area under the ROC curve, was 0.80 with a Bayesian classifier using four features including fractal dimension, entropy, sum entropy, and inverse difference moment. Analysis of the performance of the methods with free-response receiver operating characteristics indicated a sensitivity of 0.79 at 8.4 false positives per image in the detection of sites of architectural distortion in the "prior mammograms". Fractal dimension offers a promising way to detect the presence of architectural distortion in prior mammograms. (orig.)
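The circular-average power-spectrum estimate of fractal dimension can be sketched as follows; the mapping from spectral slope β to dimension assumes a fractional Brownian surface model (D = (8 − β)/2), which is an illustrative assumption, not the paper's exact calibration:

```python
import numpy as np

def spectral_fractal_dimension(img):
    """Fractal dimension via the circularly averaged power spectrum:
    fit P(f) ~ f**(-beta) on a log-log scale, then map beta to a
    dimension with D = (8 - beta) / 2 (fractional Brownian surface)."""
    a = img.astype(float) - img.mean()
    ps = np.abs(np.fft.fftshift(np.fft.fft2(a))) ** 2
    h, w = a.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - h // 2, xx - w // 2).astype(int)
    freqs, power = [], []
    for k in range(1, min(h, w) // 2):   # average power over each radius ring
        ring = ps[r == k]
        if ring.size and ring.mean() > 0:
            freqs.append(float(k))
            power.append(ring.mean())
    slope, _ = np.polyfit(np.log(freqs), np.log(power), 1)
    beta = -slope
    return (8.0 - beta) / 2.0
```

White noise has a flat spectrum (β ≈ 0, D ≈ 4), while smoother textures give steeper spectra and lower D.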

  16. Review and analysis of mammograms of the Servicio de Radiologia, Hospital Calderon Guardia from January 2013 to May 2013

    International Nuclear Information System (INIS)

    Lawrence Villalobos, Andrea; Solis Vargas, Carlos

    2013-01-01

An analysis of 400 mammograms from the Radiology Service of the Calderon Guardia Hospital was carried out, relating the statistical findings to BI-RADS category, hereditary family factors, age group, and personal pathological history for patients seen between January 2013 and May 2013. Calcifications were identified as the most frequent pathological finding in the mammograms analyzed, and ultrasound as the complementary method to mammography most frequently used. Mammography remains the screening study of choice for the early detection of breast cancer. Support from specialists in radiological medical imaging is recommended in order to implement early reporting at the second level of care.

  17. Iso-precision scaling of digitized mammograms to facilitate image analysis

    International Nuclear Information System (INIS)

    Karssmeijer, N.; van Erning, L.

    1991-01-01

This paper reports on a 12-bit CCD camera equipped with a linear sensor of 4096 photodiodes, used to digitize conventional mammographic films. An iso-precision conversion of the pixel values is performed to transform the image data to a scale on which the image noise is equal at each level. For this purpose, film noise and digitization noise have been determined as functions of optical density and pixel size. It appears that only at high optical densities is digitization noise comparable to or larger than film noise. The quantization error caused by compressing images recorded at 12 bits per pixel to 8-bit images by an iso-precision conversion has been calculated as a function of the number of quantization levels. For mammograms digitized in a 4096 × 4096 matrix, the additional error caused by such a scale transform is only about 1.5 percent. An iso-precision scale transform can be advantageous when automated procedures for quantitative image analysis are developed. Especially when the aim is to detect signals in noise, a constant noise level over the whole pixel-value range is very convenient. This is demonstrated by applying local thresholding to detect small microcalcifications; results are compared with those obtained using logarithmic or linearized scales.
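The iso-precision idea, rescaling pixel values so that the noise standard deviation is constant on the new scale, can be sketched as a variance-stabilizing lookup table; a generic illustration under the assumption that the noise level sigma(v) has already been measured:

```python
import numpy as np

def iso_precision_lut(values, sigma):
    """Variance-stabilising lookup table t(v) = integral dv / sigma(v).

    By propagation of error, std(t(V)) ~ t'(v) * sigma(v) = 1, so the
    noise level is (approximately) the same at every level of the new
    scale.  `values` are sampled pixel values, `sigma` the measured
    noise std at each value; the integral is a left Riemann sum."""
    steps = np.diff(values) / sigma[:-1]
    return np.concatenate([[0.0], np.cumsum(steps)])
```

For Poisson-like noise, sigma(v) = sqrt(v), this reproduces the classic square-root (Anscombe-style) stabilization t(v) ≈ 2·sqrt(v).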

  18. Automatic classification for mammogram backgrounds based on bi-rads complexity definition and on a multi content analysis framework

    Science.gov (United States)

    Wu, Jie; Besnehard, Quentin; Marchessoux, Cédric

    2011-03-01

Clinical studies for the validation of new medical imaging devices require hundreds of images. An important step in creating and tuning the study protocol is the classification of images into "difficult" and "easy" cases. This consists of classifying each image based on features such as the complexity of the background and the visibility of the disease (lesions). An automatic background classification tool for mammograms would therefore help such clinical studies. The classification tool is based on a multi-content analysis (MCA) framework first developed to recognize the image content of computer screenshots. With the implementation of new texture features and a defined breast density scale, the MCA framework is able to classify digital mammograms automatically with satisfying accuracy. The BI-RADS (Breast Imaging Reporting and Data System) density scale, which standardizes mammography reporting terminology and assessment and recommendation categories, is used for grouping the mammograms. Selected features are input into a decision-tree classification scheme in the MCA framework, the so-called "weak classifier" (any classifier with a global error rate below 50%). With the AdaBoost iteration algorithm, these weak classifiers are combined into a "strong classifier" (a classifier with a low global error rate) for classifying one category. The classification results for one strong classifier show good accuracy with high true-positive rates. For the four categories the results are: TP = 90.38%, TN = 67.88%, FP = 32.12% and FN = 9.62%.

  19. False Negative Mammogram of Breast Cancer : Analysis of Mammographic and Sonographic Findings and Correlation with Clinical Findings

    International Nuclear Information System (INIS)

    Lee, Kil Jun; Lee, Ji Yeon; Han, Sung Nim; Jeong, Seong Ki; Tae, Seok; Shin, Kyoung Ja; Lee, Sang Chun

    1995-01-01

Recent mammographic equipment is of good quality and yields high diagnostic accuracy for the detection of breast cancer. However, a negative mammogram does not necessarily rule out breast cancer. We therefore reviewed the causes of false-negative mammography in confirmed breast cancer, in order to improve diagnostic accuracy and guide the clinical approach. We reviewed 19 cases of confirmed breast cancer with false-negative mammography but positive sonographic findings. Retrospective analysis was performed by correlating patient age, sonographic findings and mass size, mammographic breast pattern, cause of the false-negative mammogram, and clinical symptoms. Among the 5 patients below 35 years of age, the mass was not visible due to dense breast in 4 cases and due to small size in 1 case. Of the 14 patients over 35 years of age, 11 had normal mammographic findings (4 with dense breasts and 7 with small masses); the remaining 3 cases showed asymmetric density in 2 and architectural distortion in 1. All showed a mass lesion on sonography: an ill-defined malignant appearance in 14, a well-defined malignant appearance in 2, and a well-defined benign appearance in 3 cases. A negative mammogram should be correlated with sonography in cases of dense breast, in patients below 35 years of age with a palpable mass, and in those at risk for breast cancer.

  20. Detecting mammographically occult cancer in women with dense breasts using Radon Cumulative Distribution Transform: a preliminary analysis

    Science.gov (United States)

    Lee, Juhun; Nishikawa, Robert M.; Rohde, Gustavo K.

    2018-02-01

We propose using novel imaging biomarkers for detecting mammographically occult (MO) cancer in women with dense breast tissue. MO cancer denotes visually occluded, or very subtle, cancer that radiologists fail to recognize as a sign of cancer. We used the Radon Cumulative Distribution Transform (RCDT) as a novel image transformation to project the difference between left and right mammograms into a space that increases the detectability of occult cancer. We used a dataset of 617 screening full-field digital mammograms (FFDMs) of 238 women with dense breast tissue. Among the 238 women, 173 were normal with 2-4 consecutive screening mammograms (552 normal mammograms in total), and the remaining 65 women had an MO cancer with a negative screening mammogram. We used Principal Component Analysis (PCA) to find representative patterns in normal mammograms in the RCDT space, and projected all mammograms onto the space constructed by the first 30 eigenvectors of the RCDT of normal cases. Under 10-fold cross-validation, we conducted quantitative feature analysis to classify normal mammograms and mammograms with MO cancer, using receiver operating characteristic (ROC) analysis to evaluate the classifier's output with the area under the ROC curve (AUC) as the figure of merit. Four eigenvectors were selected via a feature selection method. The mean and standard deviation of the AUC of the trained classifier on the test set were 0.74 and 0.08, respectively. In conclusion, we utilized imaging biomarkers that highlight differences between left and right mammograms to detect MO cancer using a novel image transformation.
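The eigenspace step, representing each case by its coefficients in a basis learned from normal cases, can be sketched in NumPy (PCA by SVD; the RCDT itself is omitted and the vectors are assumed to be already transformed and flattened):

```python
import numpy as np

def pca_basis(normal_vectors, k):
    """Mean and top-k principal directions of (flattened) normal cases.
    normal_vectors: (n_cases, n_dims) array."""
    mean = normal_vectors.mean(axis=0)
    X = normal_vectors - mean
    # rows of Vt are the principal directions of the centred data
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return mean, Vt[:k]

def project_coeffs(vec, mean, basis):
    """Coefficients of one case in the normal-case eigenspace; these
    serve as features for a normal-vs-MO-cancer classifier."""
    return basis @ (vec - mean)
```

A case poorly described by the normal-case basis (large residual after reconstruction from its coefficients) is a candidate abnormal case.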

  1. Computerized image analysis: Texture-field orientation method for pectoral muscle identification on MLO-view mammograms

    International Nuclear Information System (INIS)

    Zhou Chuan; Wei Jun; Chan, Heang-Ping; Paramagul, Chintana; Hadjiiski, Lubomir M.; Sahiner, Berkman; Douglas, Julie A.

    2010-01-01

Purpose: To develop a new texture-field orientation (TFO) method that combines a priori knowledge with local and global information for the automated identification of the pectoral muscle on mammograms. Methods: The authors designed a gradient-based directional kernel (GDK) filter to enhance linear texture structures, and a gradient-based texture analysis to extract a texture orientation image representing the dominant texture orientation at each pixel. The texture orientation image was enhanced by a second GDK filter for ridge-point extraction. The extracted ridge points were validated, and ridges unlikely to lie on the pectoral boundary were removed automatically. A shortest-path finding method was used to generate a probability image representing the likelihood that each remaining ridge point lay on the true pectoral boundary. Finally, the pectoral boundary was tracked by searching for the ridge points with the highest probability of lying on the pectoral boundary. A data set of 130 MLO-view digitized film mammograms (DFMs) from 65 patients was used to train the TFO algorithm. An independent data set of 637 MLO-view DFMs from 562 patients was used to evaluate its performance. Another independent data set of 92 MLO-view full-field digital mammograms (FFDMs) from 92 patients was used to assess the adaptability of the TFO algorithm to FFDMs. The pectoral boundary detection accuracy of the TFO method was quantified by comparison with an experienced radiologist's manually drawn pectoral boundary using three performance metrics: the percent overlap area (POA), the Hausdorff distance (Hdist), and the average distance (AvgDist). Results: The mean and standard deviation of POA, Hdist, and AvgDist were 95.0±3.6%, 3.45±2.16 mm, and 1.12±0.82 mm, respectively. For the POA measure, 91.5%, 97.3%, and 98.9% of the computer-detected pectoral muscles had POA larger than 90%, 85%, and 80%, respectively. For the distance measures, 85.4% and 98.0% of the
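The shortest-path boundary tracking can be sketched as a dynamic program over a cost image (low cost = high pectoral-boundary likelihood); a generic seam-finding version, not the authors' exact graph construction:

```python
import numpy as np

def min_cost_path(cost):
    """Dynamic-programming shortest path from the top row to the bottom
    row of a cost image, moving at most one column per step.  Returns
    the column index chosen in each row."""
    h, w = cost.shape
    acc = cost.astype(float).copy()
    for i in range(1, h):
        # minimum accumulated cost among the three neighbours above
        left = np.concatenate([[np.inf], acc[i - 1, :-1]])
        right = np.concatenate([acc[i - 1, 1:], [np.inf]])
        acc[i] += np.minimum(acc[i - 1], np.minimum(left, right))
    # backtrack from the cheapest cell in the bottom row
    path = [int(np.argmin(acc[-1]))]
    for i in range(h - 2, -1, -1):
        j = path[-1]
        lo, hi = max(0, j - 1), min(w, j + 2)
        path.append(lo + int(np.argmin(acc[i, lo:hi])))
    return path[::-1]
```

Run on a per-pixel boundary-likelihood map (negated so that likely boundary pixels are cheap), the returned path traces a connected boundary estimate.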

  2. Can Australian radiographers assess screening mammograms accurately? Biennial follow-up from a four year prospective study and lesion analysis

    International Nuclear Information System (INIS)

    Moran, S.; Warren-Forward, H.

    2016-01-01

Introduction: Globally, the role of the radiographer is changing; some countries have developed advanced roles with specific scopes of practice, and others, like Australia, are in the process of this change. This paper demonstrates the abilities of Australian radiographers in mammogram screen reading, highlighting some of their specific difficulties with different lesion types. Method: Six experienced radiographers participated in a prospective study, each screen reading 2000 mammograms between 2010 and 2011. This paper examines the results for those same women at biennial re-screen. Analysis included validation of normal results by negative follow-up screens and new cancers at biennial review; the types of lesions detected and missed were also analysed. Results: After biennial review, three cancers in 2013/2014 had been marked as abnormal by one radiographer two years prior, which increased her sensitivity from 64% to 85%. Sensitivity for the radiologists decreased from the assumed 100% to 95%. Radiographers appeared to be skilled in the detection of calcifications and architectural distortions but had difficulty with non-specific densities. Conclusion: This study demonstrates the potential for Australian radiographers to enhance the accuracy of screen-reading programs. - Highlights: • Radiographers have the potential to increase breast cancer detection rates. • Radiographers appear to be skilled at detecting calcifications. • Lesions commonly overlooked by radiographers could be targeted for training.

  3. Digitized mammograms

    International Nuclear Information System (INIS)

    Bruneton, J.N.; Balu-Maestro, C.; Rogopoulos, A.; Chauvel, C.; Geoffray, A.

    1988-01-01

    Two observers conducted a blind evaluation of 100 mammography files, including 47 malignant cases. Films were read both before and after image digitization at 50 μm and 100 μm with the FilmDRSII. Digitization permitted better analysis of the normal anatomic structures and moderately improved diagnostic sensitivity. Searches for microcalcifications before and after digitization at 100 μm and 50 μm showed better analysis of anatomic structures after digitization (especially for solitary microcalcifications). The diagnostic benefit, with discovery of clustered microcalcifications, was more limited (one case at 100 μm, nine cases at 50 μm). Recognition of microcalcifications was clearly improved in dense breasts, which can benefit from reinterpretation after digitization at 50 μm rather than 100 μm.

  4. Evaluation of mammogram compression efficiency

    International Nuclear Information System (INIS)

    Przelaskowski, A.; Surowski, P.; Kukula, A.

    2005-01-01

    Lossy image coding significantly improves performance over lossless methods, but reliable control of diagnostic accuracy regarding compressed images is necessary. The acceptable range of compression ratios must be safe with respect to as many objective criteria as possible. This study evaluates the compression efficiency of digital mammograms in both a numerically lossless (reversible) and a lossy (irreversible) manner. Effective compression methods and concepts were examined to increase archiving and telediagnosis performance. Lossless compression, as a primary applicable tool for medical applications, was verified on a set of 131 mammograms. Moreover, nine radiologists participated in the evaluation of lossy compression of mammograms. Subjective rating of diagnostically important features brought a set of mean rates given for each test image. The lesion detection test resulted in binary decision data analyzed statistically. The radiologists rated and interpreted malignant and benign lesions, representative pathology symptoms, and other structures susceptible to compression distortions contained in 22 original and 62 reconstructed mammograms. Test mammograms were collected in two radiology centers over three years and then selected according to diagnostic content suitable for an evaluation of compression effects. Lossless compression efficiency of the tested coders varied, but CALIC, JPEG-LS, and SPIHT performed the best. The evaluation of lossy compression effects affecting detection ability was based on ROC-like analysis. Assuming a two-sided significance level of p=0.05, the null hypothesis that lower bit rate reconstructions are as useful for diagnosis as the originals was false in sensitivity tests with 0.04 bpp mammograms. However, verification of the same hypothesis with 0.1 bpp reconstructions suggested their acceptance. Moreover, the 1 bpp reconstructions were rated very similarly to the original mammograms in the diagnostic quality evaluation test, but the
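Bit rates such as the 0.04, 0.1, and 1 bpp figures above are simply the compressed size in bits divided by the pixel count. A minimal sketch, using zlib as a stand-in for the CALIC/JPEG-LS/SPIHT coders actually evaluated:

```python
import zlib
import numpy as np

def bits_per_pixel(img_u8):
    """Bit rate of a losslessly (zlib) compressed 8-bit image, in bits
    per pixel. zlib stands in for the dedicated coders of the study;
    real mammogram coders achieve better rates on real images."""
    raw = img_u8.tobytes()
    packed = zlib.compress(raw, level=9)
    return 8.0 * len(packed) / img_u8.size
```

At 8 bits per pixel, 1 bpp corresponds to an 8:1 compression ratio, and 0.04 bpp to 200:1.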

  5. A deep learning approach for the analysis of masses in mammograms with minimal user intervention.

    Science.gov (United States)

    Dhungel, Neeraj; Carneiro, Gustavo; Bradley, Andrew P

    2017-04-01

    We present an integrated methodology for detecting, segmenting and classifying breast masses from mammograms with minimal user intervention. This is a long-standing problem due to the low signal-to-noise ratio in the visualisation of breast masses, combined with their large variability in terms of shape, size, appearance and location. We break the problem down into three stages: mass detection, mass segmentation, and mass classification. For the detection, we propose a cascade of deep learning methods to select hypotheses that are refined based on Bayesian optimisation. For the segmentation, we propose the use of deep structured output learning that is subsequently refined by a level set method. Finally, for the classification, we propose the use of a deep learning classifier, which is pre-trained with a regression to hand-crafted feature values and fine-tuned based on the annotations of the breast mass classification dataset. We test our proposed system on the publicly available INbreast dataset and compare the results with the current state-of-the-art methodologies. This evaluation shows that our system detects 90% of masses at 1 false positive per image, has a segmentation accuracy of around 0.85 (Dice index) on the correctly detected masses, and overall classifies masses as malignant or benign with sensitivity (Se) of 0.98 and specificity (Sp) of 0.7. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Analysis of Oriented Texture with Application to the Detection of Architectural Distortion in Mammograms

    CERN Document Server

    Ayres, Fabio; Desautels, JE Leo

    2011-01-01

    The presence of oriented features in images often conveys important information about the scene or the objects contained; the analysis of oriented patterns is an important task in the general framework of image understanding. As in many other applications of computer vision, the general framework for the understanding of oriented features in images can be divided into low- and high-level analysis. In the context of the study of oriented features, low-level analysis includes the detection of oriented features in images; a measure of the local magnitude and orientation of oriented features over

  7. Double versus single reading of mammograms in a breast cancer screening programme: a cost-consequence analysis.

    Science.gov (United States)

    Posso, Margarita C; Puig, Teresa; Quintana, Ma Jesus; Solà-Roca, Judit; Bonfill, Xavier

    2016-09-01

    To assess the costs and health-related outcomes of double versus single reading of digital mammograms in a breast cancer screening programme. Based on data from 57,157 digital screening mammograms from women aged 50-69 years, we compared costs, false-positive results, positive predictive value and cancer detection rate using four reading strategies: double reading with and without consensus and arbitration, and single reading with first reader only and second reader only. Four highly trained radiologists read the mammograms. Double reading with consensus and arbitration was 15 % (Euro 334,341) more expensive than single reading with first reader only. False-positive results were more frequent at double reading with consensus and arbitration than at single reading with first reader only (4.5 % and 4.2 %, respectively), whereas the cancer detection rates were similar for both reading strategies (4.6 and 4.2 per 1000 screens; p = 0.283). Our results suggest that changing to single reading of mammograms could produce savings in breast cancer screening. Single reading could reduce the frequency of false-positive results without changing the cancer detection rate. These results are not conclusive and cannot be generalized to other contexts with less trained radiologists. • Double reading of digital mammograms is more expensive than single reading. • Compared to single reading, double reading yields a higher proportion of false-positive results. • The cancer detection rate was similar for double and single readings. • Single reading may be a cost-effective strategy in breast cancer screening programmes.

  8. Improved Screening Mammogram Workflow by Maximizing PACS Streamlining Capabilities in an Academic Breast Center.

    Science.gov (United States)

    Pham, Ramya; Forsberg, Daniel; Plecha, Donna

    2017-04-01

    The aim of this study was to perform an operational improvement project targeted at the breast imaging reading workflow of mammography examinations at an academic medical center with its associated breast centers and satellite sites. Through careful analysis of the current workflow, two major issues were identified: stockpiling of paperwork and multiple worklists. Both issues were considered to cause significant delays to the start of interpreting screening mammograms. Four workflow changes were suggested (scanning of paperwork, worklist consolidation, use of chat functionality, and tracking of case distribution among trainees) and implemented in July 2015. Timestamp data was collected 2 months before (May-Jun) and after (Aug-Sep) the implemented changes. Generalized linear models were used to analyze the data. The results showed significant improvements for the interpretation of screening mammograms. The average time elapsed to open a case was reduced from 70 to 28 min (a 60 % decrease), while the workflow for diagnostic mammograms remained largely unaltered even with an increased volume of mammography examinations (a 31 % increase, from 4344 examinations for May-Jun to 5678 for Aug-Sep). In conclusion, targeted efforts to improve the breast imaging reading workflow for screening mammograms in a teaching environment provided significant performance improvements without affecting the workflow of diagnostic mammograms.

  9. Applying a social network analysis (SNA) approach to understanding radiologists' performance in reading mammograms

    Science.gov (United States)

    Tavakoli Taba, Seyedamir; Hossain, Liaquat; Heard, Robert; Brennan, Patrick; Lee, Warwick; Lewis, Sarah

    2017-03-01

    Rationale and objectives: Observer performance has been widely studied through examining the characteristics of individuals. Applying a systems perspective to understanding the system's output, however, requires a study of the interactions between observers. This research describes a mixed-methods approach that applies a social network analysis (SNA) together with the more traditional approach of examining personal/individual characteristics in understanding observer performance in mammography. Materials and Methods: Using social network theories and measures to understand observer performance, we designed a social networks survey instrument for collecting personal and network data about observers involved in mammography performance studies. We present the results of a study by our group in which 31 Australian breast radiologists originally reviewed 60 mammographic cases (comprising 20 abnormal and 40 normal cases) and then completed an online questionnaire about their social networks and personal characteristics. A jackknife free response operating characteristic (JAFROC) method was used to measure the performance of the radiologists. JAFROC was tested against various personal and network measures to verify the theoretical model. Results: The results from this study suggest a strong association between social networks and observer performance for Australian radiologists. Network factors accounted for 48% of the variance in observer performance, in comparison to 15.5% for personal characteristics, for this study group. Conclusion: This study suggests a strong new direction for research into improving observer performance. Future studies of observer performance should consider the influence of social networks as part of their research paradigm, with equal or greater vigour than the traditional constructs of personal characteristics.

  10. Receiver operating characteristic analysis for the detection of simulated microcalcifications on mammograms using hardcopy images

    International Nuclear Information System (INIS)

    Lai, C J; Shaw, Chris C; Whitman, Gary J; Yang, Wei T; Dempsey, Peter J; Nguyen, Victoria; Ice, Mary F

    2006-01-01

    The aim of this study was to compare mammography systems based on three different detectors (a conventional screen-film (SF) combination, an a-Si/CsI flat-panel (FP) detector, and a charge-coupled device (CCD)-based x-ray phosphor detector) for their performance in detecting simulated microcalcifications (MCs). Calcium carbonate grains of 112-150 μm were used to simulate MCs and were overlapped with a slab phantom of simulated 50% adipose/50% glandular breast tissue-equivalent material, referred to as the uniform background. For the tissue structure background, 200-250 μm calcium carbonate grains were used and overlapped with an anthropomorphic breast phantom. All MC phantom images were acquired with and without magnification (1.8X). The hardcopy images were reviewed by five mammographers. A five-point confidence level rating was used to score each detection task. Receiver operating characteristic (ROC) analysis was performed, and the areas under the ROC curves (A_z values) were used to compare the performances of the three mammography systems under various conditions. The results showed that, with a uniform background and contact images, the FP-based system performed significantly better than the SF and the CCD-based systems. For magnified images with a uniform background, the SF and the FP-based systems performed equally well and significantly better than the CCD-based system. With tissue structure background and contact images, the SF system performed significantly better than the FP and the CCD-based systems. With magnified images and a tissue structure background, the SF and the CCD-based systems performed equally well and significantly better than the FP-based system. In the detection of MCs in the fibroglandular and the heterogeneously dense regions, no significant differences were found, except that the SF system performed significantly better than the CCD-based system in the fibroglandular regions for the contact images.
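From five-point confidence ratings such as these, a non-parametric area under the ROC curve can be computed as the Wilcoxon/Mann-Whitney statistic. Studies like this one usually fit a binormal model to obtain A_z; the sketch below is the simpler, related empirical summary:

```python
def empirical_auc(neg_scores, pos_scores):
    """Area under the empirical ROC curve from observer confidence
    ratings: the fraction of (negative, positive) pairs in which the
    positive case received the higher rating, ties counting one half.
    A non-parametric counterpart to the binormal A_z of the study."""
    wins = 0.0
    for n in neg_scores:
        for p in pos_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(neg_scores) * len(pos_scores))
```

An AUC of 1.0 means every lesion-present image was rated above every lesion-absent image; 0.5 is chance performance.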

  11. Risks of mammograms

    International Nuclear Information System (INIS)

    Swartz, H.M.; Reichling, B.A.

    1977-01-01

    In summary, the following practical guidelines for mammography are offered: 1. Any woman, regardless of age, with signs or symptoms that indicate breast cancer should have a mammogram. 2. A woman who has a high risk for breast cancer (e.g., strong family history, no pregnancy before 30 years of age, or a previous breast cancer) should receive periodic screening examinations, including mammography. 3. Periodic screening for asymptomatic women over the age of 50 is indicated. 4. The value of periodic screening for asymptomatic women who are not considered to be at high risk and are under the age of 50 years is not established. Such screening should be carried out only when useful data can be collected on the benefits and risks of this procedure. 5. For any individual woman, the risk of inducing breast cancer by mammography is very low. 6. Mammograms should be made only with modern equipment and techniques designed to provide optimum information with minimal dose

  12. Quantitative assessment of breast density from mammograms

    International Nuclear Information System (INIS)

    Jamal, N.; Ng, K.H.

    2004-01-01

    Full text: It is known that breast density is increasingly used as a risk factor for breast cancer. This study was undertaken to develop and validate a semi-automated computer technique for the quantitative assessment of breast density from digitised mammograms. A computer technique had been developed using MATLAB (Version 6.1)-based GUI applications. This semi-automated image analysis tool consists of gradient correction, segmentation of the breast region from the background, segmentation of the fibroglandular and adipose regions within the breast area, and calculation of breast density. The density is defined as the percentage of the fibroglandular tissue area divided by the total breast area in the mammogram. This technique was clinically validated with 122 normal mammograms; these were subjectively evaluated and classified according to the five parenchyma patterns of Tabar's scheme (Class I-V) by a consultant radiologist. There was a statistically significant correlation between the computer technique and the subjective classification (r² = 0.84, p < 0.05), and 71.3% of the subjective classifications were reproduced correctly by the computer technique. We have developed a computer technique for the quantitative assessment of breast density and validated its accuracy for computerized classification based on Tabar's scheme. This quantitative tool is useful for the evaluation of a large dataset of mammograms to predict breast cancer risk based on density. Furthermore, it has the potential to provide an early marker for success or failure in chemoprevention studies such as hormonal replacement therapy. Copyright (2004) Australasian College of Physical Scientists and Engineers in Medicine
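The density measure defined above (fibroglandular area over total breast area) reduces to two segmentation masks and a ratio. A minimal sketch assuming the two thresholds are already known; in the described tool they come from the semi-automated segmentation steps rather than being fixed arguments:

```python
import numpy as np

def percent_density(img, breast_thr, fibro_thr):
    """Mammographic percent density: fibroglandular tissue area divided
    by total breast area, times 100. The thresholds are assumed inputs
    standing in for the tool's interactive segmentation."""
    breast = img > breast_thr   # breast region vs. dark background
    fibro = img > fibro_thr     # dense (fibroglandular) tissue
    return 100.0 * fibro.sum() / breast.sum()
```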

  13. Radiation scars on mammograms

    International Nuclear Information System (INIS)

    Otto, H.; Breining, H.; Knappschafts-Krankenhaus Essen

    1985-01-01

    Six patients with radiation scars are described. In each case the diagnosis was confirmed histologically; in five cases corresponding mammograms were available. The histological appearances of radiation scars are described and the radiological features are presented. These lesions can be diagnosed mammographically in vivo. Macroscopically, differentiation from a scirrhous carcinoma is not possible, and therefore a radiation scar must always be excised; this also leads to definitive cure. On mammographic screening the incidence is 0.5 to 0.9 per thousand. The significance of radiation scars depends on the fact that they are pre-cancerous, and therefore their detection is equivalent to the early diagnosis of a carcinoma, with the possibility of a complete cure. (orig.)

  14. Clustering microcalcifications techniques in digital mammograms

    Science.gov (United States)

    Díaz, Claudia. C.; Bosco, Paolo; Cerello, Piergiorgio

    2008-11-01

    Breast cancer has become a serious public health problem around the world. However, this pathology can be treated if it is detected in early stages. This task is achieved by a radiologist, who may have to read a large number of mammograms per day, whether for screening or diagnostic purposes. However, human factors can affect the diagnosis. Computer Aided Detection is an automatic system that can help specialists in the detection of possible signs of malignancy in mammograms. Microcalcifications play an important role in early detection, so we focused on their study. The two mammographic features indicating that microcalcifications are probably malignant are small size and clustered distribution. We worked with density techniques for automatic clustering, and we applied them on a mammography CAD prototype developed at INFN-Turin, Italy. An improvement in performance was achieved analyzing images from a Perugia-Assisi Hospital in Italy.
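The clustering criterion described, small calcifications grouped at small mutual distances, can be sketched as single-linkage grouping with a distance cut-off. This is a simplified stand-in for the density techniques used in the INFN CAD prototype, and the parameter values are illustrative:

```python
import numpy as np

def cluster_candidates(points, max_dist=10.0, min_size=3):
    """Group candidate microcalcification coordinates into clusters:
    two candidates join the same cluster if they lie within max_dist
    pixels of each other (single linkage, via union-find), and
    clusters smaller than min_size are discarded as isolated noise."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(pts[i] - pts[j]) <= max_dist:
                parent[find(i)] = find(j)

    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return [idx for idx in groups.values() if len(idx) >= min_size]
```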

  15. Analysis of the effect of spatial resolution on texture features in the classification of breast masses in mammograms

    International Nuclear Information System (INIS)

    Rangayyan, R.M.; Nguyen, T.M.; Ayres, F.J.; Nandi, A.K.

    2007-01-01

    The present study investigates the effect of spatial resolution on co-occurrence matrix-based texture features in discriminating breast lesions as benign masses or malignant tumors. The highest classification result, in terms of the area under the receiver operating characteristics (ROC) curve, of A_z = 0.74, was obtained at the spatial resolution of 800 μm using all 14 of Haralick's texture features computed using the margins, or ribbons, of the breast masses as seen on mammograms. Furthermore, our study indicates that texture features computed using the ribbons resulted in higher classification accuracy than the same texture features computed using the corresponding regions of interest within the mass boundaries drawn by an expert radiologist. Classification experiments using each single texture feature showed that the feature F8, sum entropy, gives consistently high classification results, with an average A_z of 0.64 across all levels of resolution. At certain levels of resolution, the features F5, F9, and F11 individually gave the highest classification result, with A_z = 0.70. (orig.)
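Haralick's sum entropy (F8), the most consistent single feature in this study, is computed from a grey-level co-occurrence matrix (GLCM). A minimal sketch for a single pixel offset; the quantisation level and offset below are illustrative, not the study's settings:

```python
import numpy as np

def sum_entropy(img, levels=8, dx=1, dy=0):
    """Haralick's F8 (sum entropy) from a grey-level co-occurrence
    matrix at offset (dx, dy): the entropy of p_{x+y}(k), the
    distribution of the sum of co-occurring grey levels."""
    q = (img.astype(float) / img.max() * (levels - 1)).astype(int)
    glcm = np.zeros((levels, levels))
    h, w = q.shape
    for y in range(h - dy):
        for x in range(w - dx):
            glcm[q[y, x], q[y + dy, x + dx]] += 1
    glcm /= glcm.sum()
    # p_{x+y}(k): probability that the two grey levels sum to k.
    p_sum = np.zeros(2 * levels - 1)
    for i in range(levels):
        for j in range(levels):
            p_sum[i + j] += glcm[i, j]
    nz = p_sum[p_sum > 0]
    return -(nz * np.log2(nz)).sum()
```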

  16. Mammogram segmentation using maximal cell strength updation in cellular automata.

    Science.gov (United States)

    Anitha, J; Peter, J Dinesh

    2015-08-01

    Breast cancer is the most frequently diagnosed type of cancer among women. The mammogram is one of the most effective tools for early detection of breast cancer. Various computer-aided systems have been introduced to detect breast cancer from mammogram images. In a computer-aided diagnosis system, detection and segmentation of breast masses from the background tissues is an important issue. In this paper, an automatic segmentation method is proposed to identify and segment the suspicious mass regions of a mammogram using a modified transition rule named maximal cell strength updation in cellular automata (CA). In coarse-level segmentation, the proposed method performs adaptive global thresholding based on histogram peak analysis to obtain the rough region of interest. An automatic seed point selection is proposed using the gray-level co-occurrence matrix-based sum average feature in the coarse segmented image. Finally, the method utilizes CA with the identified initial seed point and the modified transition rule to segment the mass region. The proposed approach is evaluated on a dataset of 70 mammograms with masses from the mini-MIAS database. Experimental results show that the proposed approach yields promising results in segmenting the mass region, with a sensitivity of 92.25% and an accuracy of 93.48%.
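The coarse-level step, an adaptive global threshold from histogram peak analysis, can be sketched as: find the dominant histogram peak and place the threshold above it. The fixed offset fraction below is an illustrative assumption, not the paper's adaptive rule:

```python
import numpy as np

def histogram_peak_threshold(img, bins=64, offset=0.25):
    """Coarse global threshold from histogram peak analysis: take the
    strongest histogram peak as the dominant grey level (background or
    bulk tissue) and place the threshold a fixed fraction of the
    remaining intensity range above it."""
    counts, edges = np.histogram(img, bins=bins)
    k = int(np.argmax(counts))
    peak = 0.5 * (edges[k] + edges[k + 1])   # centre of the modal bin
    return peak + offset * (img.max() - peak)
```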

  17. Interobserver variability in interpretation of mammogram

    International Nuclear Information System (INIS)

    Lee, Kyung Jae; Lee, Hae Kyung; Lee, Won Chul; Hwang, In Young; Park, Young Gyu; Jung, Sang Seol; Kim, Hoon Kyo; Kim, Mi Hye; Kim, Hak Hee

    2004-01-01

    The purpose of this study was to evaluate the performance of radiologists in mammographic screening, and to analyze interobserver agreement in the interpretation of mammograms. 50 women were selected as subjects from the patients who were screened with mammograms at two university hospitals. The images were analyzed by five radiologists working independently and without any knowledge of the final diagnosis. The interobserver variation was analyzed by using the kappa statistic. There was moderate agreement for the findings of parenchymal pattern (κ=0.44; 95% CI 0.39-0.49), calcification type (κ=0.66; 95% CI 0.60-0.72) and calcification distribution (κ=0.43; 95% CI 0.38-0.48). The mean kappa values ranged from 0.42 to 0.66 for the mass findings. The mean kappa value for the final conclusion was 0.44 (95% CI 0.38-0.51). In general, moderate agreement was evident for all the categories that were evaluated. The general agreement was moderate, but there was wide variability in some findings. To improve accuracy and reduce variability among physicians in interpretation, proper training of radiologists and standardization of criteria are essential for breast screening.
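The kappa statistic used here compares observed agreement with the agreement expected by chance from the readers' marginal frequencies. A minimal two-reader (Cohen's kappa) sketch; pooling five readers, as in this study, calls for Fleiss' kappa or pairwise averaging instead:

```python
def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two readers' categorical ratings:
    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    agreement and p_e the chance agreement from the marginals."""
    n = len(ratings_a)
    cats = set(ratings_a) | set(ratings_b)
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    p_e = sum((ratings_a.count(c) / n) * (ratings_b.count(c) / n)
              for c in cats)
    return (p_o - p_e) / (1.0 - p_e)
```

Kappa is 1 for perfect agreement, 0 for chance-level agreement, and negative when readers agree less often than chance.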

  18. Computerized classification of mass lesions in digital mammograms

    International Nuclear Information System (INIS)

    Giger, M.L.; Doi, K.; Yin, F.F.; Schmidt, R.A.; Vyborny, C.J.

    1989-01-01

    Subjective classification of masses on mammograms is a difficult task. On average, about 25% of masses referred for surgical biopsy are actually malignant. The authors are developing, as an aid to radiologists, a computerized scheme for the classification of lesions in mammograms to reduce the false-negative and false-positive diagnoses of malignancies. The classification scheme involves the extraction of border information from the mammographic lesion in order to quantify the degree of spiculation, which is related to the possibility of malignancy. Clinical film mammograms are digitized with an optical drum scanner (0.1-mm pixel size) for analysis on a Micro VAX 3500 computer. Border information (fluctuations) is obtained from the difference between the lesion border and its smoothed border. Using the rms variation of the frequency content of these fluctuations, approximately 85% of the cancerous lesions were correctly classified as malignant, while 15% of benign lesions were misclassified, in a preliminary study
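The border-fluctuation idea above, the difference between a lesion border and its smoothed version, can be sketched on a radial distance signature. This is a rough proxy for the spiculation measure described, with an assumed moving-average smoother and window size:

```python
import numpy as np

def border_roughness(radii, window=5):
    """RMS fluctuation of a lesion's radial border signature about its
    moving-average smoothed version; spiculated (malignant-looking)
    borders give larger values than smooth ones. The border is treated
    as closed, so the signature is padded circularly."""
    r = np.asarray(radii, dtype=float)
    kernel = np.ones(window) / window
    padded = np.concatenate([r[-window:], r, r[:window]])
    smooth = np.convolve(padded, kernel, mode="same")[window:-window]
    return float(np.sqrt(np.mean((r - smooth) ** 2)))
```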

  19. Computerized detection of mass lesions in digital mammograms

    International Nuclear Information System (INIS)

    Yin, F.F.; Giger, M.L.; Doi, K.; Metz, C.E.; Vyborny, C.J.; Schmidt, R.A.

    1989-01-01

    Early detection of breast cancer from the periodic screening of asymptomatic women could reduce breast cancer mortality by at least 40%. The authors are developing a computerized scheme for the detection of mass lesions in digital mammograms as an aid to radiologists in such high-volume screening programs. Based on left-right architectural symmetry and gray-level histogram analysis, bilateral subtraction of left and right breast images is performed. False-positive detections included in bilateral-difference images are reduced with various image feature-extraction techniques. The database involves clinical film mammograms digitized by a TV camera and analyzed on a Micro-VAX workstation. Among five different bilateral subtraction techniques investigated, a nonlinear approach provided superior lesion enhancement. Feature-extraction techniques substantially reduced the remaining false positives. Preliminary results, for 32 pairs of clinical mammograms, yielded a true-positive rate of approximately 95% with a false-positive rate of about 2 per image
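Bilateral subtraction itself is a short operation: mirror one breast image so both share an orientation, subtract, and keep positive residuals as asymmetry candidates. A linear sketch, assuming the two images are already registered; the abstract notes that a nonlinear variant performed better:

```python
import numpy as np

def bilateral_difference(left, right):
    """Bilateral subtraction for mass detection: mirror the right-breast
    image so both breasts share an orientation, subtract, and keep the
    positive differences as candidate asymmetries in the left breast."""
    mirrored = np.fliplr(right)
    diff = left.astype(float) - mirrored.astype(float)
    return np.clip(diff, 0, None)
```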

  20. Use of prior mammograms in the classification of benign and malignant masses

    International Nuclear Information System (INIS)

    Varela, Celia; Karssemeijer, Nico; Hendriks, Jan H.C.L.; Holland, Roland

    2005-01-01

    The purpose of this study was to determine the importance of using prior mammograms for classification of benign and malignant masses. Five radiologists and one resident classified mass lesions in 198 mammograms obtained from a population-based screening program. Cases were interpreted twice, once without and once with comparison of previous mammograms, in a sequential reading order using soft copy image display. The radiologists' performances in classifying benign and malignant masses without and with previous mammograms were evaluated with receiver operating characteristic (ROC) analysis. The statistical significance of the difference in performances was calculated using analysis of variance. The use of prior mammograms improved the classification performance of all participants in the study. The mean area under the ROC curve of the readers increased from 0.763 to 0.796. This difference in performance was statistically significant (P = 0.008)

  1. Detecting microcalcifications in digital mammogram using wavelets

    International Nuclear Information System (INIS)

    Yang Jucheng; Park Dongsun

    2004-01-01

    delegates of different kinds of the family wavelets. Among them, the Biorthogonal wavelet is bi-orthogonal, but it is neither orthogonal nor symmetric, while the other three wavelets are both bi-orthogonal and orthogonal and, in addition, near symmetric. These different characteristics will affect their detection results. We first decompose the mammogram using the db4, bior3.7, coif3 and sym2 wavelets respectively, and for each family wavelet the image is decomposed into 4 levels (the first level is the original image). The detection of microcalcifications is accomplished by setting the wavelet coefficients of the upper-left (approximation) sub-band to zero in order to suppress the image background information before the reconstruction of the image. The reconstructed mammogram is expected to contain only high-frequency components, which include the microcalcifications. After the wavelet transform, the third step is to locate microcalcifications through a thresholding operation. The labeling operation with a threshold changes each reconstructed image into a binary image. The threshold is determined through a series of simulation studies. The final step of the proposed detection algorithm is post-processing to eliminate tiny isolated points using binary morphological closing and opening operators. The digital mammogram database used in this work is the MIAS (Mammographic Image Analysis Society) database. The images in this database were scanned with a Joyce-Loebl microdensitometer SCANDIG-3, which has a linear response in the optical density range 0-3.2. Each pixel is 8 bits deep, at a resolution of 50 μm x 50 μm, and regions of microcalcifications have been marked by a veteran diagnostician. Twenty-five images (twelve with benign and thirteen with malignant microcalcifications) were selected for this experiment. 
The performance of the proposed algorithm is evaluated by a free-response receiver operating characteristic (FROC) in terms of true-positive (TP) fraction for a given number of
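The suppress-the-approximation-band scheme described above can be sketched with a single-level 2D Haar transform. Haar stands in here for the db4/bior3.7/coif3/sym2 wavelets of the study, and one level for its four-level decomposition, so this is an illustration of the idea rather than the paper's algorithm:

```python
import numpy as np

def haar2d(img):
    """One level of the orthonormal 2D Haar wavelet transform."""
    a = img[0::2, 0::2]; b = img[0::2, 1::2]
    c = img[1::2, 0::2]; d = img[1::2, 1::2]
    ll = (a + b + c + d) / 2.0   # approximation (low-frequency) band
    lh = (a - b + c - d) / 2.0
    hl = (a + b - c - d) / 2.0
    hh = (a - b - c + d) / 2.0
    return ll, lh, hl, hh

def ihaar2d(ll, lh, hl, hh):
    """Exact inverse of haar2d."""
    h, w = ll.shape
    out = np.zeros((2 * h, 2 * w))
    out[0::2, 0::2] = (ll + lh + hl + hh) / 2.0
    out[0::2, 1::2] = (ll - lh + hl - hh) / 2.0
    out[1::2, 0::2] = (ll + lh - hl - hh) / 2.0
    out[1::2, 1::2] = (ll - lh - hl + hh) / 2.0
    return out

def detect_bright_spots(img, thr):
    """Zero the approximation band to suppress background, reconstruct,
    and threshold the high-frequency residual, as in the scheme above."""
    ll, lh, hl, hh = haar2d(img.astype(float))
    highpass = ihaar2d(np.zeros_like(ll), lh, hl, hh)
    return highpass > thr
```

The paper additionally cleans the binary result with morphological closing and opening to remove tiny isolated responses.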

  2. Detection of microcalcifications in television-enhanced nonmagnified screen-film mammograms compared with matching magnification unenhanced mammograms

    International Nuclear Information System (INIS)

    Kimme-Smith, C.; Gormley, L.S.; Gold, R.H.; Bassett, L.W.

    1988-01-01

    The object of this investigation was to determine which imaging method was associated with greater accuracy in the interpretation of breast microcalcifications: 1.5X to 2.0X magnification with a microfocal spot, or Damon DETECT-enhanced mammograms. The authors' test series consisted of matched pairs of images of 31 breasts, each containing a cluster of microcalcifications within a biopsy-proved benign (N = 21) or malignant (N = 10) lesion. Three experienced mammographers and three senior radiology residents with 2 weeks of training in mammography interpreted the calcifications. On the basis of receiver operating characteristic analysis, the authors conclude that (1) inexperienced mammographers should not use television image enhancement alone to evaluate microcalcifications and (2) television-enhanced mammograms are not a substitute for microfocal spot magnification mammograms.

  3. Should previous mammograms be digitised in the transition to digital mammography?

    International Nuclear Information System (INIS)

    Taylor-Phillips, S.; Gale, A.G.; Wallis, M.G.

    2009-01-01

    Breast screening specificity is improved if previous mammograms are available, which presents a challenge when converting to digital mammography. Two display options were investigated: mounting previous film mammograms on a multiviewer adjacent to the workstation, or digitising them for soft copy display. Eight qualified screen readers were videotaped undertaking routine screen reading for two 45-min sessions in each scenario. Analysis of gross eye and head movements showed that when digitised, previous mammograms were examined a greater number of times per case (p=0.03), due to a combination of being used in 19% more cases (p=0.04) and where used, looked at a greater number of times (28% increase, p=0.04). Digitising previous mammograms reduced both the average time taken per case by 18% (p=0.04) and the participants' perceptions of workload (p < 0.05). Digitising previous analogue mammograms may be advantageous, in particular in increasing their level of use. (orig.)

  4. Locally adaptive decision in detection of clustered microcalcifications in mammograms

    Science.gov (United States)

    Sainz de Cea, María V.; Nishikawa, Robert M.; Yang, Yongyi

    2018-02-01

    In computer-aided detection or diagnosis of clustered microcalcifications (MCs) in mammograms, the performance often suffers not only from the presence of false positives (FPs) among the detected individual MCs but also from large variability in detection accuracy among different cases. To address this issue, we investigate a locally adaptive decision scheme in MC detection by exploiting the noise characteristics in a lesion area. Instead of developing a new MC detector, we propose a decision scheme for how best to decide whether a detected object is an MC or not in the detector output. We formulate the individual MCs as statistical outliers compared to the many noisy detections in a lesion area so as to account for the local image characteristics. To identify the MCs, we first consider a parametric method for outlier detection, the Mahalanobis distance detector, which is based on a multi-dimensional Gaussian distribution on the noisy detections. We also consider a non-parametric method which is based on a stochastic neighbor graph model of the detected objects. We demonstrated the proposed decision approach with two existing MC detectors on a set of 188 full-field digital mammograms (95 cases). The results, evaluated using free response operating characteristic (FROC) analysis, showed a significant improvement in detection accuracy by the proposed outlier decision approach over traditional thresholding (the partial area under the FROC curve increased from 3.95 to 4.25), with fewer FPs at a given sensitivity level. The proposed adaptive decision approach could not only reduce the number of FPs in detected MCs but also improve case-to-case consistency in detection.
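The parametric (Mahalanobis) branch of the decision scheme reduces to fitting a mean and covariance to the feature vectors of the detections in a region and ranking objects by distance, with the most distant outliers kept as likely true MCs. A minimal sketch with hypothetical feature vectors, not the authors' detector output:

```python
import numpy as np

def mahalanobis_distances(features):
    """Mahalanobis distance of each feature vector to the cloud of all
    detections in a region. In the scheme above, high-distance outliers
    relative to the many noisy detections are the candidate true MCs."""
    x = np.asarray(features, dtype=float)
    mu = x.mean(axis=0)
    cov = np.cov(x, rowvar=False)
    # Small ridge keeps the covariance invertible for degenerate clouds.
    inv = np.linalg.inv(cov + 1e-9 * np.eye(cov.shape[0]))
    diff = x - mu
    return np.sqrt(np.einsum('ij,jk,ik->i', diff, inv, diff))
```

A decision rule would then threshold these distances, e.g. against a chi-distribution quantile under the Gaussian noise model.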

  5. Improved Classification of Mammograms Following Idealized Training

    Science.gov (United States)

    Hornsby, Adam N.; Love, Bradley C.

    2014-01-01

    People often make decisions by stochastically retrieving a small set of relevant memories. This limited retrieval implies that human performance can be improved by training on idealized category distributions (Giguère & Love, 2013). Here, we evaluate whether the benefits of idealized training extend to categorization of real-world stimuli, namely classifying mammograms as normal or tumorous. Participants in the idealized condition were trained exclusively on items that, according to a norming study, were relatively unambiguous. Participants in the actual condition were trained on a representative range of items. Despite being exclusively trained on easy items, idealized-condition participants were more accurate than those in the actual condition when tested on a range of item types. However, idealized participants experienced difficulties when test items were very dissimilar from training cases. The benefits of idealization, attributable to reducing noise arising from cognitive limitations in memory retrieval, suggest ways to improve real-world decision making. PMID:24955325

  7. Intelligent detection of microcalcification from digitized mammograms

    Indian Academy of Sciences (India)

    Permanent link: https://www.ias.ac.in/article/fulltext/sadh/036/01/0125-0139. Keywords. Breast cancer; digital mammogram; artificial neural networks; microcalcification detection. Abstract. This paper reports the design and implementation of an intelligent system for detection of microcalcification from digital mammograms.

  10. Search for lesions in mammograms: Statistical characterization of observer responses

    International Nuclear Information System (INIS)

    Bochud, Francois O.; Abbey, Craig K.; Eckstein, Miguel P.

    2004-01-01

    We investigate human performance for visually detecting simulated microcalcifications and tumors embedded in x-ray mammograms as a function of signal contrast and the number of possible signal locations. Our results show that performance degradation with an increasing number of locations is well approximated by signal detection theory (SDT) with the usual Gaussian assumption. However, more stringent statistical analysis finds a departure from Gaussian assumptions for the detection of microcalcifications. We investigated whether these departures from the SDT Gaussian model could be accounted for by an increase in human internal response correlations arising from the image-pixel correlations present in 1/f spectrum backgrounds and/or observer internal response distributions that departed from the Gaussian assumption. Results were consistent with a departure from the Gaussian response distributions and suggested that the human observer internal responses were more compact than the Gaussian distribution. Finally, we conducted a free search experiment where the signal could appear anywhere within the image. Results show that human performance in a multiple-alternative forced-choice experiment can be used to predict performance in the clinically realistic free search experiment when the investigator takes into account the search area and the observers' inherent spatial imprecision to localize the targets.

  11. Distributed Analysis in CMS

    CERN Document Server

    Fanfani, Alessandra; Sanches, Jose Afonso; Andreeva, Julia; Bagliesi, Giusepppe; Bauerdick, Lothar; Belforte, Stefano; Bittencourt Sampaio, Patricia; Bloom, Ken; Blumenfeld, Barry; Bonacorsi, Daniele; Brew, Chris; Calloni, Marco; Cesini, Daniele; Cinquilli, Mattia; Codispoti, Giuseppe; D'Hondt, Jorgen; Dong, Liang; Dongiovanni, Danilo; Donvito, Giacinto; Dykstra, David; Edelmann, Erik; Egeland, Ricky; Elmer, Peter; Eulisse, Giulio; Evans, Dave; Fanzago, Federica; Farina, Fabio; Feichtinger, Derek; Fisk, Ian; Flix, Josep; Grandi, Claudio; Guo, Yuyi; Happonen, Kalle; Hernandez, Jose M; Huang, Chih-Hao; Kang, Kejing; Karavakis, Edward; Kasemann, Matthias; Kavka, Carlos; Khan, Akram; Kim, Bockjoo; Klem, Jukka; Koivumaki, Jesper; Kress, Thomas; Kreuzer, Peter; Kurca, Tibor; Kuznetsov, Valentin; Lacaprara, Stefano; Lassila-Perini, Kati; Letts, James; Linden, Tomas; Lueking, Lee; Maes, Joris; Magini, Nicolo; Maier, Gerhild; McBride, Patricia; Metson, Simon; Miccio, Vincenzo; Padhi, Sanjay; Pi, Haifeng; Riahi, Hassen; Riley, Daniel; Rossman, Paul; Saiz, Pablo; Sartirana, Andrea; Sciaba, Andrea; Sekhri, Vijay; Spiga, Daniele; Tuura, Lassi; Vaandering, Eric; Vanelderen, Lukas; Van Mulders, Petra; Vedaee, Aresh; Villella, Ilaria; Wicklund, Eric; Wildish, Tony; Wissing, Christoph; Wurthwein, Frank

    2009-01-01

    The CMS experiment expects to manage several Pbytes of data each year during the LHC programme, distributing them over many computing sites around the world and enabling data access at those centers for analysis. CMS has identified the distributed sites as the primary location for physics analysis to support a wide community of thousands of potential users. This represents an unprecedented experimental challenge in terms of the scale of distributed computing resources and the number of users. An overview of the computing architecture, the software tools and the distributed infrastructure is reported. Summaries of the experience in establishing efficient and scalable operations to get prepared for CMS distributed analysis are presented, followed by the user experience in their current analysis activities.

  12. Decision support system for breast cancer detection using mammograms.

    Science.gov (United States)

    Ganesan, Karthikeyan; Acharya, Rajendra U; Chua, Chua K; Min, Lim C; Mathew, Betty; Thomas, Abraham K

    2013-07-01

    Mammograms are by far one of the most preferred methods of screening for breast cancer. Early detection of breast cancer can improve survival rates to a greater extent. Although the analysis and diagnosis of breast cancer are done by experienced radiologists, there is always the possibility of human error. Interobserver and intraobserver errors occur frequently in the analysis of medical images, given the high variability between every patient. Also, the sensitivity of mammographic screening varies with image quality and the expertise of the radiologist. Thus, there is no gold standard for the screening process. To offset this variability and to standardize the diagnostic procedures, efforts are being made to develop automated techniques for diagnosis and grading of breast cancer images. This article presents a classification pipeline to improve the accuracy of differentiation between normal, benign, and malignant mammograms. Several features based on higher-order spectra, local binary pattern, Laws' texture energy, and discrete wavelet transform were extracted from mammograms. Feature selection techniques based on sequential forward, backward, plus-l-takeaway-r, individual, and branch-and-bound selections using the Mahalanobis distance criterion were used to rank the features and find classification accuracies for combinations of several features based on the ranking. Six classifiers were used, namely, decision tree classifier, fisher classifier, linear discriminant classifier, nearest mean classifier, Parzen classifier, and support vector machine classifier. We evaluated our proposed methodology with 300 mammograms obtained from the Digital Database for Screening Mammography and 300 mammograms from the Singapore Anti-Tuberculosis Association CommHealth database. Sensitivity, specificity, and accuracy values were used to compare the performances of the classifiers. Our results show that the decision tree classifier demonstrated an excellent performance compared to the other classifiers.
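The sequential forward selection mentioned in this record can be sketched generically as a greedy loop that adds whichever feature most improves a separability score. This is a minimal sketch, not the authors' pipeline; the score function is supplied by the caller and could, for instance, implement the Mahalanobis distance criterion.

```python
import numpy as np

def sequential_forward_selection(X, y, score_fn, k):
    """Greedy SFS: repeatedly add the feature column that maximizes
    score_fn(X[:, selected + [j]], y) until k features are chosen."""
    selected = []
    remaining = list(range(X.shape[1]))
    while len(selected) < k:
        best = max(remaining, key=lambda j: score_fn(X[:, selected + [j]], y))
        selected.append(best)
        remaining.remove(best)
    return selected
```

Backward selection and plus-l-takeaway-r follow the same pattern, alternating add and remove steps against the same score function.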

  13. Distributed analysis at LHCb

    International Nuclear Information System (INIS)

    Williams, Mike; Egede, Ulrik; Paterson, Stuart

    2011-01-01

    The distributed analysis experience to date at LHCb has been positive: job success rates are high and wait times for high-priority jobs are low. LHCb users access the grid using the GANGA job-management package, while the LHCb virtual organization manages its resources using the DIRAC package. This clear division of labor has benefitted LHCb and its users greatly; it is a major reason why distributed analysis at LHCb has been so successful. The newly formed LHCb distributed analysis support team has also proved to be a success.

  14. Using autoencoders for mammogram compression.

    Science.gov (United States)

    Tan, Chun Chet; Eswaran, Chikkannan

    2011-02-01

    This paper presents the results obtained for medical image compression using autoencoder neural networks. Since mammograms (medical images) are usually large, training of autoencoders becomes extremely tedious and difficult if the whole image is used for training. We show in this paper that the autoencoders can be trained successfully by using image patches instead of the whole image. The compression performances of different types of autoencoders are compared based on two parameters, namely mean square error and structural similarity index. It is found from the experimental results that the autoencoder which does not use Restricted Boltzmann Machine pre-training yields better results than those which use this pre-training method.
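The patch-based training idea above, feeding an autoencoder fixed-size tiles rather than a full mammogram, rests on a simple preprocessing step. A minimal sketch of that step (the non-overlapping tiling and flattening are assumptions; the record does not specify patch geometry):

```python
import numpy as np

def extract_patches(image, patch_size):
    """Split a 2-D image into non-overlapping square patches, each
    flattened to a vector usable as one autoencoder training sample.
    Trailing rows/columns that do not fill a patch are discarded."""
    h, w = image.shape
    ph, pw = h // patch_size, w // patch_size
    return (image[:ph * patch_size, :pw * patch_size]
            .reshape(ph, patch_size, pw, patch_size)
            .transpose(0, 2, 1, 3)
            .reshape(ph * pw, patch_size * patch_size))
```

Each row of the result is one small training sample, so the autoencoder's input layer stays fixed at `patch_size**2` regardless of mammogram dimensions.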

  15. Automated detection of microcalcification clusters in digital mammograms based on wavelet domain hidden Markov tree modeling

    International Nuclear Information System (INIS)

    Regentova, E.; Zhang, L.; Veni, G.; Zheng, J.

    2007-01-01

    A system is designed for detecting microcalcification clusters (MCC) in digital mammograms. The system is intended for computer-aided diagnostic prompting. Further discrimination of MCC as benign or malignant is assumed to be performed by radiologists. Processing of mammograms is based on statistical modeling by means of wavelet domain hidden Markov trees (WHMT). Segmentation is performed by weighted likelihood evaluation followed by classification based on spatial filters for single microcalcification (MC) and MC cluster detection. The analysis is carried out on FROC curves for 40 mammograms from the mini-MIAS database and for 100 mammograms with 50 cancerous and 50 benign cases from the DDSM database. The designed system is capable of detecting 100% of true positive cases in these sets. The rate of false positives is 2.9 per case for the mini-MIAS dataset and 0.01 for the DDSM images. (orig.)

  16. Resolution effects on the morphology of calcifications in digital mammograms

    Energy Technology Data Exchange (ETDEWEB)

    Kallergi, Maria; He, Li; Gavrielides, Marios; Heine, John; Clarke, Laurence P [Department of Radiology, College of Medicine, and H. Lee Moffitt Cancer Center and Research Institute at the University of South Florida, 12901 Bruce B. Downs Blvd., Box 17, Tampa, FL 33612 (United States)

    1999-12-31

    The development of computer assisted diagnosis (CAD) techniques and direct digital mammography systems have generated significant interest in the issue of the effect of image resolution on the detection and classification (benign vs malignant) of mammographic abnormalities. CAD in particular seems to depend heavily on image resolution, either due to the inherent algorithm design and optimization, which is almost always resolution dependent, or due to the differences in image content at the various resolutions. This twofold dependence makes it even more difficult to answer the question of what is the minimum resolution required for successful detection and/or classification of a specific mammographic abnormality, such as calcifications. One may begin by evaluating the losses in the mammograms as the films are digitized with different pixel sizes and depths. In this paper we attempted to measure these losses for the case of calcifications at four different spatial resolutions through a simulation model and a classification scheme that is based only on morphological features. The results showed that a 60 µm pixel size and 12 bits per pixel should at least be used if the morphology and distribution of the calcifications are essential components in the CAD algorithm design. These conclusions were tested with the use of a wavelet-based algorithm for the segmentation of simulated mammographic calcifications at various resolutions. The evaluation of the segmentation through shape analysis and classification supported the initial conclusion. (authors) 14 refs., 1 tab.

  17. Quantification and normalization of x-ray mammograms

    International Nuclear Information System (INIS)

    Tromans, Christopher E; Cocker, Mary R; Brady, Michael

    2012-01-01

    The analysis of (x-ray) mammograms remains qualitative, relying on the judgement of clinicians. We present a novel method to compute a quantitative, normalized measure of tissue radiodensity traversed by the primary beam incident on each pixel of a mammogram, a measure we term the standard attenuation rate (SAR). SAR enables: the estimation of breast density which is linked to cancer risk; direct comparison between images; the full potential of computer aided diagnosis to be utilized; and a basis for digital breast tomosynthesis reconstruction. It does this by removing the effects of the imaging conditions under which the mammogram is acquired. First, the x-ray spectrum incident upon the breast is calculated, and from this, the energy exiting the breast is calculated. The contribution of scattered radiation is calculated and subtracted. The SAR measure is the scaling factor that must be applied to the reference material in order to match the primary attenuation of the breast. Specifically, this is the scaled reference material attenuation which when traversed by an identical beam to that traversing the breast, and when subsequently detected, results in the primary component of the pixel intensity observed in the breast image. We present results using two tissue equivalent phantoms, as well as a sensitivity analysis to detector response changes over time and possible errors in compressed thickness measurement. (paper)
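The SAR idea above, finding the scaling of a reference material whose primary attenuation matches that of the breast, reduces in a monoenergetic toy setting to inverting the Beer-Lambert law. This sketch is a deliberate simplification for illustration, not the authors' polyenergetic, scatter-corrected method; the function name and parameters are assumptions.

```python
import math

def standard_attenuation_rate(pixel_intensity, incident_intensity, mu_reference):
    """Toy, monoenergetic SAR: the thickness of reference material whose
    primary attenuation exp(-mu * t) reproduces the observed,
    scatter-corrected pixel intensity (Beer-Lambert inversion)."""
    return -math.log(pixel_intensity / incident_intensity) / mu_reference
```

Because the imaging conditions enter only through the incident intensity and the reference attenuation coefficient, pixels from images acquired under different conditions map onto comparable, normalized thickness values.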

  18. ATLAS Distributed Analysis Tools

    CERN Document Server

    Gonzalez de la Hoz, Santiago; Liko, Dietrich

    2008-01-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experiences obtained operating the system on several grid flavours were essential to performing user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase data was registered in the LHC File Catalog (LFC) and replicated to external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS management computing board decided to integrate the collaboration efforts in distributed analysis in only one project, GANGA. The goal is to test the reconstruction and analysis software in a large-scale data production using Grid flavours at several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting a...

  19. Automated detection of microcalcification clusters in mammograms

    Science.gov (United States)

    Karale, Vikrant A.; Mukhopadhyay, Sudipta; Singh, Tulika; Khandelwal, Niranjan; Sadhu, Anup

    2017-03-01

    Mammography is the most efficient modality for detection of breast cancer at an early stage. Microcalcifications are tiny bright spots in mammograms and can often be missed by the radiologist during diagnosis. The presence of microcalcification clusters in mammograms can act as an early sign of breast cancer. This paper presents a completely automated computer-aided detection (CAD) system for detection of microcalcification clusters in mammograms. Unsharp masking is used as a preprocessing step, which enhances the contrast between microcalcifications and the background. The preprocessed image is thresholded and various shape and intensity based features are extracted. A support vector machine (SVM) classifier is used to reduce the false positives while preserving the true microcalcification clusters. The proposed technique is applied on two different databases, i.e., the DDSM and a private database. The proposed technique shows good sensitivity with moderate false positives (FPs) per image on both databases.
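The unsharp masking preprocessing step described above amplifies small bright details, such as microcalcifications, by adding back the difference between an image and a blurred copy of itself. A minimal sketch using a box blur (the kernel size, edge padding, and amount are assumptions; the record does not specify them):

```python
import numpy as np

def unsharp_mask(image, kernel=3, amount=1.0):
    """Enhance local contrast: subtract a box-blurred copy and add the
    difference back, boosting small bright details against the background."""
    pad = kernel // 2
    padded = np.pad(np.asarray(image, dtype=float), pad, mode='edge')
    blurred = np.zeros(image.shape, dtype=float)
    for dy in range(kernel):          # accumulate the kernel x kernel neighborhood
        for dx in range(kernel):
            blurred += padded[dy:dy + image.shape[0], dx:dx + image.shape[1]]
    blurred /= kernel * kernel
    return image + amount * (image - blurred)
```

Flat regions pass through unchanged, while an isolated bright pixel is pushed further above its surroundings, which is exactly the contrast boost a thresholding stage benefits from.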

  20. Breast composition measurements using retrospective standard mammogram form (SMF)

    International Nuclear Information System (INIS)

    Highnam, R; Pan, X; Warren, R; Jeffreys, M; Smith, G Davey; Brady, M

    2006-01-01

    The standard mammogram form (SMF) representation of an x-ray mammogram is a standardized, quantitative representation of the breast from which the volume of non-fat tissue and breast density can be easily estimated, both of which are of significant interest in determining breast cancer risk. Previous theoretical analysis of SMF had suggested that a complete and substantial set of calibration data (such as mAs and kVp) would be needed to generate realistic breast composition measures, and yet there are many interesting trials that have retrospectively collected images with no calibration data. The main contribution of this paper is to revisit our previous theoretical analysis of SMF with respect to errors in the calibration data and to show how and why that theoretical analysis did not match the results from the practical implementations of SMF. In particular, we show how, by estimating breast thickness for every image, we are effectively compensating for any errors in the calibration data. To illustrate our findings, the current implementation of SMF (version 2.2β) was run over 4028 digitized film-screen mammograms taken from six sites over the years 1988-2002, with and without using the known calibration data. Results show that the SMF implementation running without any calibration data at all generates results which display a strong relationship with those generated when running with a complete set of calibration data and, most importantly, with an expert's visual assessment of breast composition using established techniques. SMF shows considerable promise in being of major use in large epidemiological studies related to breast cancer which require the automated analysis of large numbers of films from many years previously where little or no calibration data is available.

  1. Mammogram synthesis using a 3D simulation. I. Breast tissue model and image acquisition simulation

    International Nuclear Information System (INIS)

    Bakic, Predrag R.; Albert, Michael; Brzakovic, Dragana; Maidment, Andrew D. A.

    2002-01-01

    A method is proposed for generating synthetic mammograms based upon simulations of breast tissue and the mammographic imaging process. A computer breast model has been designed with a realistic distribution of large and medium scale tissue structures. Parameters controlling the size and placement of simulated structures (adipose compartments and ducts) provide a method for consistently modeling images of the same simulated breast with modified position or acquisition parameters. The mammographic imaging process is simulated using a compression model and a model of the x-ray image acquisition process. The compression model estimates breast deformation using tissue elasticity parameters found in the literature and clinical force values. The synthetic mammograms were generated by a mammogram acquisition model using a monoenergetic parallel beam approximation applied to the synthetically compressed breast phantom.

  2. Early detection of breast cancer mass lesions by mammogram segmentation images based on texture features

    International Nuclear Information System (INIS)

    Mahmood, F.H.

    2012-01-01

    Mammography is at present one of the available methods for early detection of masses, calcifications, and other abnormalities related to breast cancer. The challenge lies in early and accurate detection to overcome the development of breast cancer, which affects more and more women throughout the world. Breast cancer is diagnosed at advanced stages with the help of digital mammogram images. Masses appear in a mammogram as fine, granular clusters, which are often difficult to identify in a raw mammogram. The incidence of breast cancer in women has increased significantly in recent years. This paper proposes a computer aided diagnostic system for the extraction of features such as mass lesions in mammograms for early detection of breast cancer. The proposed technique is based on a four-step procedure: (a) preprocessing of the image, (b) specification of regions of interest (ROI), (c) supervised segmentation performed in two stages using the minimum distance (MD) criterion, and (d) feature extraction based on gray-level co-occurrence matrices (GLCM) for the identification of mass lesions. The method suggested for the detection of mass lesions from mammogram image segmentation and analysis was tested on several images taken from Al-llwiya Hospital in Baghdad, Iraq. The proposed technique shows better results.

  3. INDIAM--an e-learning system for the interpretation of mammograms.

    Science.gov (United States)

    Guliato, Denise; Bôaventura, Ricardo S; Maia, Marcelo A; Rangayyan, Rangaraj M; Simedo, Mariângela S; Macedo, Túlio A A

    2009-08-01

    We propose the design of a teaching system named Interpretation and Diagnosis of Mammograms (INDIAM) for training students in the interpretation of mammograms and diagnosis of breast cancer. The proposed system integrates an illustrated tutorial on radiology of the breast, that is, mammography, which uses education techniques to guide the user (doctors, students, or researchers) through various concepts related to the diagnosis of breast cancer. The user can obtain informative text about specific subjects, access a library of bibliographic references, and retrieve cases from a mammographic database that are similar to a query case on hand. The information of each case stored in the mammographic database includes the radiological findings, the clinical history, the lifestyle of the patient, and complementary exams. The breast cancer tutorial is linked to a module that simulates the analysis and diagnosis of a mammogram. The tutorial incorporates tools for helping the user to evaluate his or her knowledge about a specific subject by using the education system or by simulating a diagnosis with appropriate feedback in case of error. The system also makes available digital image processing tools that allow the user to draw the contour of a lesion, the contour of the breast, or identify a cluster of calcifications in a given mammogram. The contours provided by the user are submitted to the system for evaluation. The teaching system is integrated with AMDI-An Indexed Atlas of Digital Mammograms-that includes case studies, e-learning, and research systems. All the resources are accessible via the Web.

  4. Breast Cancer Detection with Gabor Features from Digital Mammograms

    Directory of Open Access Journals (Sweden)

    Yufeng Zheng

    2010-01-01

    A new breast cancer detection algorithm, named the “Gabor Cancer Detection” (GCD) algorithm, utilizing Gabor features is proposed. Three major steps are involved in the GCD algorithm: preprocessing, segmentation (generating alarm segments), and classification (reducing false alarms). In preprocessing, a digital mammogram is down-sampled, quantized, denoised and enhanced. Nonlinear diffusion is used for noise suppression. In segmentation, a band-pass filter is formed by rotating a 1-D Gaussian filter (off center) in frequency space, termed a “Circular Gaussian Filter” (CGF). A CGF can be uniquely characterized by specifying a central frequency and a frequency band. A mass or calcification is a space-occupying lesion and usually appears as a bright region on a mammogram. The alarm segments (suspected masses or calcifications) can be extracted using a threshold that is adaptively decided upon histogram analysis of the CGF-filtered mammogram. In classification, a Gabor filter bank is formed with five bands by four orientations (horizontal, vertical, 45°, and 135°) in the Fourier frequency domain. For each mammographic image, twenty Gabor-filtered images are produced. A set of edge histogram descriptors (EHD) is then extracted from the 20 Gabor images for classification. An EHD signature is computed with four orientations of Gabor images along each band, and five EHD signatures are then joined together to form an EHD feature vector of 20 dimensions. With the EHD features, the fuzzy C-means clustering technique and a k-nearest neighbor (KNN) classifier are used to reduce the number of false alarms. The experimental results tested on the DDSM database (University of South Florida) show the promise of the GCD algorithm in breast cancer detection, which achieved a TP (true positive) rate of 90% at FPI (false positives per image) of 1.21 in mass detection, and TP = 93% at FPI = 1.19 in calcification detection.
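The Circular Gaussian Filter described in this record, a 1-D Gaussian profile rotated about the DC point in frequency space, can be sketched as a radial band-pass mask applied via the FFT. This is an illustrative reconstruction from the description, not the authors' code; the square-image assumption and parameter names are mine.

```python
import numpy as np

def circular_gaussian_filter(size, center_freq, bandwidth):
    """Frequency-domain band-pass mask: a 1-D Gaussian profile rotated
    about DC, peaking at radius `center_freq` (in cycles per image)."""
    fy = np.fft.fftfreq(size)[:, None] * size
    fx = np.fft.fftfreq(size)[None, :] * size
    radius = np.hypot(fy, fx)
    return np.exp(-0.5 * ((radius - center_freq) / bandwidth) ** 2)

def cgf_filter(image, center_freq, bandwidth):
    """Apply the CGF by pointwise multiplication in the Fourier domain
    (square images assumed for simplicity)."""
    mask = circular_gaussian_filter(image.shape[0], center_freq, bandwidth)
    return np.real(np.fft.ifft2(np.fft.fft2(image) * mask))
```

The mask is fully characterized by its central frequency and bandwidth, matching the record's description, and it strongly suppresses DC, so smooth background is removed before the adaptive thresholding step.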

  5. Normal mammogram detection based on local probability difference transforms and support vector machines

    International Nuclear Information System (INIS)

    Chiracharit, W.; Kumhom, P.; Chamnongthai, K.; Sun, Y.; Delp, E.J.; Babbs, C.F

    2007-01-01

    Automatic detection of normal mammograms, as a ''first look'' for breast cancer, is a new approach to computer-aided diagnosis. This approach may be limited, however, by two main causes. The first problem is the presence of poorly separable ''crossed-distributions'' in which the correct classification depends upon the value of each feature. The second problem is overlap of the feature distributions that are extracted from digitized mammograms of normal and abnormal patients. Here we introduce a new Support Vector Machine (SVM) based method utilizing the proposed uncrossing mapping and Local Probability Difference (LPD). Crossed-distribution feature pairs are identified and mapped into new features that can be separated by a zero-hyperplane of the new axis. The probability density functions of the features of normal and abnormal mammograms are then sampled and the local probability difference functions are estimated to enhance the features. From 1,000 ground-truth-known mammograms, 250 normal and 250 abnormal cases, including spiculated lesions, circumscribed masses or microcalcifications, are used for training a support vector machine. The classification results tested with another 250 normal and 250 abnormal sets show improved testing performances with 90% sensitivity and 89% specificity. (author)

  6. Association between Radiologists' Experience and Accuracy in Interpreting Screening Mammograms

    Directory of Open Access Journals (Sweden)

    Maristany Maria-Teresa

    2008-04-01

    Background: Radiologists have been observed to differ, sometimes substantially, both in their interpretations of mammograms and in their recommendations for follow-up. The aim of this study was to determine how factors related to radiologists' experience affect the accuracy of mammogram readings. Methods: We selected a random sample of screening mammograms from a population-based breast cancer screening program. The sample was composed of 30 women with histopathologically-confirmed breast cancer and 170 women without breast cancer after a 2-year follow-up (the proportion of cancers was oversampled). These 200 mammograms were read by 21 radiologists routinely interpreting mammograms, with different amounts of experience, and by seven readers who did not routinely interpret mammograms. All readers were blinded to the results of the screening. An assessment was considered positive when BI-RADS III, 0, IV, or V was reported (additional evaluation required). Diagnostic accuracy was calculated through sensitivity and specificity. Results: Average specificity was higher in radiologists routinely interpreting mammograms than in those who did not (66% vs 56%). Conclusion: Among radiologists who read routinely, volume is not associated with better performance when interpreting screening mammograms, although specificity decreased in radiologists not routinely reading mammograms. Follow-up of cases for which further workup is recommended might reduce variability in mammogram readings and improve the quality of breast cancer screening programs.

  7. Global detection approach for clustered microcalcifications in mammograms using a deep learning network.

    Science.gov (United States)

    Wang, Juan; Nishikawa, Robert M; Yang, Yongyi

    2017-04-01

    In computerized detection of clustered microcalcifications (MCs) from mammograms, the traditional approach is to apply a pattern detector to locate the presence of individual MCs, which are subsequently grouped into clusters. Such an approach is often susceptible to the occurrence of false positives (FPs) caused by local image patterns that resemble MCs. We investigate the feasibility of a direct detection approach to determining whether an image region contains clustered MCs or not. Toward this goal, we develop a deep convolutional neural network (CNN) as the classifier model to which the input consists of a large image window ([Formula: see text] in size). The multiple layers in the CNN classifier are trained to automatically extract image features relevant to MCs at different spatial scales. In the experiments, we demonstrated this approach on a dataset consisting of both screen-film mammograms and full-field digital mammograms. We evaluated the detection performance both on classifying image regions of clustered MCs using a receiver operating characteristic (ROC) analysis and on detecting clustered MCs from full mammograms by a free-response receiver operating characteristic analysis. For comparison, we also considered a recently developed MC detector with FP suppression. In classifying image regions of clustered MCs, the CNN classifier achieved 0.971 in the area under the ROC curve, compared to 0.944 for the MC detector. In detecting clustered MCs from full mammograms, at 90% sensitivity, the CNN classifier obtained an FP rate of 0.69 clusters/image, compared to 1.17 clusters/image by the MC detector. These results indicate that using global image features can be more effective in discriminating clustered MCs from FPs caused by various sources, such as linear structures, thereby providing a more accurate detection of clustered MCs on mammograms.
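The areas under the ROC curve reported above (0.971 vs 0.944) can be computed without explicit thresholding via the Mann-Whitney U statistic. A minimal sketch of that evaluation step (illustrative only, not the authors' evaluation code):

```python
def roc_auc(scores_pos, scores_neg):
    """AUC as the probability that a randomly chosen positive case
    outscores a randomly chosen negative one, ties counting half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))
```

Feeding the classifier's region scores for MC-positive and MC-negative image regions into such a function yields the area-under-ROC figures quoted in the record.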

  8. Automatic detection of anomalies in screening mammograms

    Science.gov (United States)

    2013-01-01

    Background Diagnostic performance in breast screening programs may be influenced by the prior probability of disease. Since breast cancer incidence is roughly half a percent in the general population, there is a large probability that the screening exam will be normal. That factor may contribute to false negatives. Screening programs typically exhibit about 83% sensitivity and 91% specificity. This investigation was undertaken to determine whether a system could be developed to pre-sort screening images into normal and suspicious bins based on their likelihood of containing disease. Wavelets were investigated as a method to parse the image data, potentially removing confounding information. The development of a classification system based on features extracted from wavelet-transformed mammograms is reported. Methods In the multi-step procedure, images were processed using 2D discrete wavelet transforms to create a set of maps at different size scales. Next, statistical features were computed from each map, and a subset of these features was the input for a concerted-effort set of naïve Bayesian classifiers. The classifier network was constructed to calculate the probability that the parent mammography image contained an abnormality. The abnormalities were not identified, nor were they regionalized. The algorithm was tested on two publicly available databases: the Digital Database for Screening Mammography (DDSM) and the Mammographic Images Analysis Society's database (MIAS). These databases contain radiologist-verified images and feature common abnormalities including spiculations, masses, geometric deformations and fibroid tissues. Results The classifier-network designs tested achieved sensitivities and specificities sufficient to be potentially useful in a clinical setting. This first series of tests identified networks with 100% sensitivity and up to 79% specificity for abnormalities. This performance significantly exceeds the mean sensitivity reported in the literature.
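The map-building step can be illustrated with one level of the Haar transform, the simplest 2D discrete wavelet, followed by the kind of per-subband statistics a Bayesian classifier would consume. This is a sketch of the general technique, not the paper's exact filter bank:

```python
def haar2d(img):
    """One level of an (unnormalized) 2D Haar discrete wavelet transform.
    img is a list of lists with even height and width; returns the four
    subband maps: LL (approximation) and LH/HL/HH (details)."""
    h, w = len(img), len(img[0])
    half_h, half_w = h // 2, w // 2
    LL = [[0.0] * half_w for _ in range(half_h)]
    LH = [[0.0] * half_w for _ in range(half_h)]
    HL = [[0.0] * half_w for _ in range(half_h)]
    HH = [[0.0] * half_w for _ in range(half_h)]
    for i in range(0, h, 2):
        for j in range(0, w, 2):
            a, b = img[i][j], img[i][j + 1]
            c, d = img[i + 1][j], img[i + 1][j + 1]
            LL[i // 2][j // 2] = (a + b + c + d) / 4.0
            LH[i // 2][j // 2] = (a + b - c - d) / 4.0  # horizontal detail
            HL[i // 2][j // 2] = (a - b + c - d) / 4.0  # vertical detail
            HH[i // 2][j // 2] = (a - b - c + d) / 4.0  # diagonal detail
    return LL, LH, HL, HH

def band_stats(band):
    """Mean and standard deviation of one subband map: the kind of
    statistical feature computed from each size-scale map."""
    vals = [v for row in band for v in row]
    mean = sum(vals) / len(vals)
    std = (sum((v - mean) ** 2 for v in vals) / len(vals)) ** 0.5
    return mean, std
```

Repeating the transform on the LL band yields maps at successively coarser scales, as the procedure above describes.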

  9. Rapid point-of-care breath test for biomarkers of breast cancer and abnormal mammograms.

    Directory of Open Access Journals (Sweden)

    Michael Phillips

    BACKGROUND: Previous studies have reported volatile organic compounds (VOCs) in breath as biomarkers of breast cancer and abnormal mammograms, apparently resulting from increased oxidative stress and cytochrome p450 induction. We evaluated a six-minute point-of-care breath test for VOC biomarkers in women screened for breast cancer at centers in the USA and the Netherlands. METHODS: 244 women had a screening mammogram (normal/abnormal 93/37) or a breast biopsy (cancer/no cancer 35/79). A mobile point-of-care system collected and concentrated breath and air VOCs for analysis with gas chromatography and surface acoustic wave detection. Chromatograms were segmented into a time series of alveolar gradients (breath minus room air). Segmental alveolar gradients were ranked as candidate biomarkers by C-statistic value (area under the curve [AUC] of the receiver operating characteristic [ROC] curve). Multivariate predictive algorithms were constructed employing significant biomarkers identified with multiple Monte Carlo simulations and cross-validated with a leave-one-out (LOO) procedure. RESULTS: Performance of breath biomarker algorithms was determined in three groups: breast cancer on biopsy versus normal screening mammograms (81.8% sensitivity, 70.0% specificity, accuracy [C-statistic value] 79%, 73% on LOO, negative predictive value 99.9%); normal versus abnormal screening mammograms (86.5% sensitivity, 66.7% specificity, accuracy 83%, 62% on LOO); and cancer versus no cancer on breast biopsy (75.8% sensitivity, 74.0% specificity, accuracy 78%, 67% on LOO). CONCLUSIONS: A pilot study of a six-minute point-of-care breath test for volatile biomarkers accurately identified women with breast cancer and with abnormal mammograms. Breath testing could potentially reduce the number of needless mammograms without loss of diagnostic sensitivity.
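The leave-one-out (LOO) validation used above can be sketched with a toy nearest-class-mean classifier standing in for the paper's multivariate algorithms; the 1-D biomarker values below are hypothetical:

```python
def nearest_mean_fit(xs, ys):
    """Fit a nearest-class-mean model on 1-D features (a toy stand-in
    for the multivariate predictive algorithms)."""
    means = {}
    for label in set(ys):
        vals = [x for x, y in zip(xs, ys) if y == label]
        means[label] = sum(vals) / len(vals)
    return means

def nearest_mean_predict(means, x):
    return min(means, key=lambda label: abs(x - means[label]))

def loo_accuracy(xs, ys):
    """Leave-one-out cross-validation: each sample is predicted by a
    model trained on all the other samples."""
    correct = 0
    for i in range(len(xs)):
        model = nearest_mean_fit(xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:])
        correct += nearest_mean_predict(model, xs[i]) == ys[i]
    return correct / len(xs)

# Hypothetical 1-D biomarker values for two groups (0 = control, 1 = case)
print(loo_accuracy([0.1, 0.2, 0.15, 0.9, 1.0, 0.95], [0, 0, 0, 1, 1, 1]))  # 1.0
```

LOO gives a nearly unbiased estimate of out-of-sample accuracy, which is why the abstract reports accuracy both with and without it.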

  10. Computerized detection of masses on mammograms by entropy maximization thresholding

    International Nuclear Information System (INIS)

    Kom, Guillaume; Tiedeu, Alain; Feudjio, Cyrille; Ngundam, J.

    2010-03-01

    In many cases, masses in X-ray mammograms are subtle and their detection can benefit from an automated system serving as a diagnostic aid. To this end, the authors propose in this paper a new computer-aided mass detection method for breast cancer diagnosis. The first step focuses on wavelet filter enhancement, which removes the bright background due to dense breast tissue and some film artifacts while preserving features and patterns related to the masses. In the second step, the enhanced image is thresholded by Entropy Maximization Thresholding (EMT) to obtain segmented masses. An efficiency of 98.181% is achieved by analyzing a database of 84 mammograms previously marked by radiologists and digitized at a pixel size of 343 μm x 343 μm. The segmentation results, in terms of size of detected masses, give a relative error on mass area of less than 8%. The performance of the proposed method has also been evaluated by means of receiver operating characteristic (ROC) analysis, yielding areas under the ROC curve (Az) of 0.9224 and 0.9295, depending on whether the enhancement step is applied or not. Furthermore, we observe that the EMT yields excellent segmentation results compared to those found in the literature. (author)
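Entropy-maximization thresholding is commonly formulated as in Kapur's method: choose the threshold that maximizes the summed entropies of the background and foreground gray-level distributions. A sketch over a gray-level histogram (an illustration of the general technique, not the authors' exact implementation):

```python
import math

def max_entropy_threshold(hist):
    """Kapur-style entropy-maximization threshold: pick the bin t that
    maximizes the sum of the entropies of the gray-level distributions
    below and above t."""
    total = sum(hist)
    p = [h / total for h in hist]
    best_t, best_h = 0, -1.0
    for t in range(1, len(hist)):
        w0 = sum(p[:t])          # background probability mass
        w1 = 1.0 - w0            # foreground probability mass
        if w0 <= 0 or w1 <= 0:
            continue
        h0 = -sum(q / w0 * math.log(q / w0) for q in p[:t] if q > 0)
        h1 = -sum(q / w1 * math.log(q / w1) for q in p[t:] if q > 0)
        if h0 + h1 > best_h:
            best_h, best_t = h0 + h1, t
    return best_t  # bins < t are background, bins >= t are foreground
```

On a bimodal histogram the maximum falls in the valley between the modes, which is what separates candidate masses from the enhanced background.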

  11. Comparative analysis of image segmentation methods for mammogram images

    Directory of Open Access Journals (Sweden)

    Toni Arifin

    2016-09-01

    Abstract: Cancer is a disease with high prevalence worldwide; some 8.2 million people die of cancer. A common cancer in women is breast cancer, a malignancy arising from glandular cells, gland ducts and the supporting tissues of the breast. Among the many ways of detecting breast cancer is mammography, which examines the human breast using low-dose X-rays. The resulting mammogram images can be examined with image processing, so that observation does not take a long time and observational error is reduced. One step of image processing is image segmentation, an important stage in image analysis, so a suitable segmentation method is needed. This study compares two segmentation methods for mammogram images, the Watershed method and the Otsu method, and assesses image quality by calculating the signal-to-noise ratio (SNR) and the running time of each method. The results show an SNR of 7.475 dB for the Watershed method and 6.197 dB for the Otsu method, so the Watershed method performs better; in terms of running time, the Watershed method, at 0.016 seconds, is also faster than the Otsu method.
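Of the two methods compared, Otsu's is the more compact to sketch: it chooses the threshold that maximizes the between-class variance of the gray-level histogram. A minimal pure-Python version (illustrative only, not the study's implementation):

```python
def otsu_threshold(hist):
    """Otsu's method: choose the threshold maximizing the between-class
    variance of background and foreground. hist[i] is the count of
    pixels with gray level i."""
    total = sum(hist)
    sum_all = sum(i * h for i, h in enumerate(hist))
    w0 = 0.0     # background pixel count so far
    sum0 = 0.0   # background gray-level sum so far
    best_t, best_var = 0, -1.0
    for t, h in enumerate(hist):
        w0 += h
        if w0 == 0:
            continue
        w1 = total - w0
        if w1 == 0:
            break
        sum0 += t * h
        m0 = sum0 / w0                    # background mean
        m1 = (sum_all - sum0) / w1        # foreground mean
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t  # gray levels <= t are background
```

Watershed segmentation, by contrast, is a region-growing flooding process and needs substantially more machinery, which is part of why the two methods trade off quality against running time.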

  12. Diagnostic image quality of mammograms in German outpatient medical care

    International Nuclear Information System (INIS)

    Pfandzelter, R.; Wuelfing, U.; Boedeker, B.

    2010-01-01

    Purpose: A total of 79 115 mammograms from statutory health insurance (SHI) physicians within German outpatient care were evaluated with respect to the diagnostic image quality. Materials and Methods: Mammograms were randomly selected between 2006 and 2008 by the regional Associations of Statutory Health Insurance Physicians and submitted to regional boards of experts for external evaluation. The mammogram quality was evaluated using a 3-point scale (adequate, borderline, failure) and documented using a nationally standardized protocol. Results: 87.6 % of the mammograms were classified as adequate, 11.0 % as borderline and 1.4 % as failure. Mediolateral oblique mammograms (mlo) had worse ratings than craniocaudal mammograms (cc). Main reasons for classifying the mammograms as borderline or failure were 'inframammary fold not adequately visualized' (mlo), 'pectoral muscle not in the correct angle or not to the level with the nipple' (mlo), 'the nipple not in profile' (mlo, cc) and 'breast not completely or not adequately visualized' (cc). Conclusion: The results show a good overall quality of mammograms in German outpatient medical care. Failures can be associated predominantly with incorrect positioning of the breast. More precisely defined quality criteria using objective measures are recommended, especially for craniocaudal mammograms (cc). (orig.)

  13. Scheduling mammograms for asymptomatic women

    International Nuclear Information System (INIS)

    Gohagan, J.K.; Darby, W.P.; Spitznagel, E.L.; Tome, A.E.

    1988-01-01

    A decision theoretic model was used to investigate the relative importance of risk level, radiation hazard, mammographic accuracy, and cost in mammographic screening decisions. The model uses woman-specific medical and family history facts and clinic-specific information regarding mammographic accuracy and practice to profile both the woman and the clinic, and to formulate periodic screening recommendations. Model parameters were varied extensively to investigate the sensitivity of screening schedules to input values. Multivariate risk was estimated within the program using published data from the Breast Cancer Detection Demonstration Project 5-year follow-up study. Radiation hazard estimates were developed from published radiation physics and radioepidemiologic risk data. Benchmark values for mammographic sensitivity and specificity under screening conditions were calculated from Breast Cancer Detection Demonstration Project data. Procedural costs used in the analysis were varied around values reflecting conditions at the Washington University Medical Center. Mortality advantages of early versus late breast cancer detection were accounted for using Health Insurance Plan of New York case survival rates. Results are compared with published screening policies to provide insight into implicit assumptions behind those policies. This analysis emphasizes the importance of accounting for variations in clinical accuracy under screening circumstances, in costs, in radiation exposure, and in woman-specific risk when recommending mammographic screening
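One ingredient such a decision-theoretic model must combine is the woman-specific posterior probability of cancer given a positive screen, which follows from Bayes' rule. A sketch with illustrative numbers (the 0.5% prior and the sensitivity/specificity figures are assumptions for the example, not the model's actual parameters):

```python
def ppv(prior, sensitivity, specificity):
    """Posterior probability of disease given a positive screen
    (positive predictive value) via Bayes' rule."""
    true_pos = prior * sensitivity
    false_pos = (1 - prior) * (1 - specificity)
    return true_pos / (true_pos + false_pos)

# Illustrative values: 0.5% prior risk, 83% sensitivity, 91% specificity
print(round(ppv(0.005, 0.83, 0.91), 3))  # 0.044
```

Even with reasonable test accuracy, a low prior drives the posterior down to a few percent, which is why woman-specific risk so strongly shapes the recommended screening schedule.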

  14. The classification of normal screening mammograms

    Science.gov (United States)

    Ang, Zoey Z. Y.; Rawashdeh, Mohammad A.; Heard, Robert; Brennan, Patrick C.; Lee, Warwick; Lewis, Sarah J.

    2016-03-01

    Rationale and objectives: To understand how breast screen readers classify the difficulty of normal screening mammograms using common lexicon describing normal appearances. Cases were also assessed on their suitability for a single reader strategy. Materials and Methods: 15 breast readers were asked to interpret a test set of 29 normal screening mammogram cases and classify them by rating the difficulty of the case on a five-point Likert scale, identifying the salient features and assessing their suitability for single reading. Using the False Positive Fractions from a previous study, the 29 cases were classified into 10 'low', 10 'medium' and 9 'high' difficulty categories. Data were analyzed with descriptive statistics. Spearman's correlation was used to test the strength of association between the difficulty of the cases and the readers' recommendation for a single reading strategy. Results: The ratings from readers in this study corresponded to the known difficulty level of cases for the 'low' and 'high' difficulty cases. Uniform ductal pattern and density, symmetrical mammographic features and the absence of micro-calcifications were the main reasons associated with 'low' difficulty cases. The 'high' difficulty cases were described as having 'dense breasts'. There was a statistically significant negative correlation between the difficulty of the cases and readers' recommendation for single reading (r = -0.475, P = 0.009). Conclusion: The findings demonstrated potential relationships between certain mammographic features and the difficulty for readers to classify mammograms as 'normal'. The standard Australian practice of double reading was deemed more suitable for most cases. There was an inverse moderate association between the difficulty of the cases and the recommendations for single reading.
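Spearman's correlation, as used above, is Pearson's correlation applied to ranks, with ties sharing their average rank. A self-contained sketch:

```python
def rank(xs):
    """Average ranks (1-based); tied values share the mean of their ranks."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(xs, ys):
    """Spearman's rho: Pearson correlation of the two rank vectors."""
    rx, ry = rank(xs), rank(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den
```

A value of -0.475, as reported above, indicates a moderate inverse monotone association between case difficulty and single-reading recommendations.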

  15. Distribution system modeling and analysis

    CERN Document Server

    Kersting, William H

    2001-01-01

    For decades, distribution engineers did not have the sophisticated tools developed for analyzing transmission systems-often they had only their instincts. Things have changed, and we now have computer programs that allow engineers to simulate, analyze, and optimize distribution systems. Powerful as these programs are, however, without a real understanding of the operating characteristics of a distribution system, engineers using the programs can easily make serious errors in their designs and operating procedures. Distribution System Modeling and Analysis helps prevent those errors. It gives readers a basic understanding of the modeling and operating characteristics of the major components of a distribution system. One by one, the author develops and analyzes each component as a stand-alone element, then puts them all together to analyze a distribution system comprising the various shunt and series devices for power-flow and short-circuit studies. He includes the derivation of all models and includes many num...

  16. Assessment of a novel mass detection algorithm in mammograms

    Directory of Open Access Journals (Sweden)

    Ehsan Kozegar

    2013-01-01

    Settings and Design: The proposed mass detector consists of two major steps. In the first step, several suspicious regions are extracted from the mammograms using an adaptive thresholding technique. In the second step, false positives originating from the previous stage are reduced by a machine learning approach. Materials and Methods: All modules of the mass detector were assessed on the mini-MIAS database. In addition, the algorithm was tested on the INBreast database for further validation. Results: According to FROC analysis, our mass detection algorithm outperforms other competing methods. Conclusions: We should not focus solely on sensitivity in the segmentation phase: if the FP rate is ignored in pursuit of higher sensitivity, the learning algorithm becomes biased toward false positives and sensitivity drops dramatically in the false-positive reduction phase. Mass detection should therefore be treated as a cost-sensitive problem, because misclassification costs are unequal.

  17. Dynamic multiple thresholding breast boundary detection algorithm for mammograms

    International Nuclear Information System (INIS)

    Wu, Yi-Ta; Zhou Chuan; Chan, Heang-Ping; Paramagul, Chintana; Hadjiiski, Lubomir M.; Daly, Caroline Plowden; Douglas, Julie A.; Zhang Yiheng; Sahiner, Berkman; Shi Jiazheng; Wei Jun

    2010-01-01

    Purpose: Automated detection of breast boundary is one of the fundamental steps for computer-aided analysis of mammograms. In this study, the authors developed a new dynamic multiple thresholding based breast boundary (MTBB) detection method for digitized mammograms. Methods: A large data set of 716 screen-film mammograms (442 CC view and 274 MLO view) obtained from consecutive cases of an Institutional Review Board approved project were used. An experienced breast radiologist manually traced the breast boundary on each digitized image using a graphical interface to provide a reference standard. The initial breast boundary (MTBB-Initial) was obtained by dynamically adapting the threshold to the gray level range in local regions of the breast periphery. The initial breast boundary was then refined by using gradient information from horizontal and vertical Sobel filtering to obtain the final breast boundary (MTBB-Final). The accuracy of the breast boundary detection algorithm was evaluated by comparison with the reference standard using three performance metrics: the Hausdorff distance (HDist), the average minimum Euclidean distance (AMinDist), and the area overlap measure (AOM). Results: In comparison with the authors' previously developed gradient-based breast boundary (GBB) algorithm, it was found that 68%, 85%, and 94% of images had HDist errors less than 6 pixels (4.8 mm) for GBB, MTBB-Initial, and MTBB-Final, respectively. 89%, 90%, and 96% of images had AMinDist errors less than 1.5 pixels (1.2 mm) for GBB, MTBB-Initial, and MTBB-Final, respectively. 96%, 98%, and 99% of images had AOM values larger than 0.9 for GBB, MTBB-Initial, and MTBB-Final, respectively. The improvement by the MTBB-Final method was statistically significant for all the evaluation measures by the Wilcoxon signed rank test (p<0.0001). Conclusions: The MTBB approach that combined dynamic multiple thresholding and gradient information provided better performance than the gradient-based breast boundary (GBB) detection algorithm.
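The Hausdorff distance used as an evaluation metric above is the largest distance from any point of one boundary to its nearest point on the other, taken in both directions. A sketch over small point sets (illustrative, not the study's implementation):

```python
def hausdorff(a, b):
    """Hausdorff distance between two point sets a and b, each a list
    of (x, y) tuples (e.g. sampled boundary points)."""
    def directed(p, q):
        # For every point in p, find its nearest neighbor in q,
        # then take the worst (largest) of those nearest distances.
        return max(min(((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
                       for (x2, y2) in q)
                   for (x1, y1) in p)
    return max(directed(a, b), directed(b, a))
```

Because it reports the single worst mismatch, HDist complements AMinDist (an average error) and AOM (a region-overlap measure).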

  18. Availability and accessibility of subsidized mammogram screening program in peninsular Malaysia: A preliminary study using travel impedance approach.

    Science.gov (United States)

    Mahmud, Aidalina; Aljunid, Syed Mohamed

    2018-01-01

    Access to healthcare is essential in the pursuit of universal health coverage. Components of access are availability, accessibility (spatial and non-spatial), affordability and acceptability. Measuring spatial accessibility is a common approach to evaluating access to health care. This study aimed to determine the availability and spatial accessibility of subsidised mammogram screening in Peninsular Malaysia. Availability was determined from the number and distribution of facilities. Spatial accessibility was determined using the travel impedance approach to represent revealed access, as opposed to the potential access measured by other spatial measurement methods. The driving distance of return trips from the respondent's residence to the facilities was determined using a mapping application. The travel expenditure was estimated by multiplying the total travel distance by a standardised travel allowance rate, plus parking fees. Respondents in this study were 344 breast cancer patients who received treatment at 4 referral hospitals between 2015 and 2016. In terms of availability, there were at least 6 major entities which provided subsidised mammogram programs. Mammogram facilities involved in these programs were located more densely in the central and west coast regions of the Peninsula. The ratio of mammogram facilities to the target population of women aged 40-74 years ranged between 1:10,000 and 1:80,000. In terms of accessibility, among the 3.6% of respondents who had undergone mammogram screening, the mean travel distance was 53.4 km (SD = 34.5, range 8-112 km) and the mean travel expenditure was RM 38.97 (SD = 24.00, range RM7.60-78.40). Among those who did not go for mammogram screening, the estimated travel distance and expenditure had a skewed distribution with a median travel distance of 22.0 km (IQR 12.0, 42.0, range 2.0-340.0) and a median travel cost of RM 17.40 (IQR 10.40, 30.00, range 3.40-240.00).
Higher travel impedance was noted among those who

  19. Distributed analysis challenges in ATLAS

    Energy Technology Data Exchange (ETDEWEB)

    Duckeck, Guenter; Legger, Federica; Mitterer, Christoph Anton; Walker, Rodney [Ludwig-Maximilians-Universitaet Muenchen (Germany)

    2016-07-01

    The ATLAS computing model has undergone massive changes to meet the high luminosity challenge of the second run of the Large Hadron Collider (LHC) at CERN. The production system and distributed data management have been redesigned, a new data format and event model for analysis have been introduced, and common reduction and derivation frameworks have been developed. We report on the impact these changes have on the distributed analysis system, study the various patterns of grid usage for user analysis, focusing on the differences between the first and the second LHC runs, and measure performances of user jobs.

  20. The ATLAS distributed analysis system

    International Nuclear Information System (INIS)

    Legger, F

    2014-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of Grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high and steadily improving; Grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. Both the user support techniques and the direct feedback of users have been effective in improving the success rate and user experience when utilizing the distributed computing environment. In this contribution a description of the main components, activities and achievements of ATLAS distributed analysis is given. Several future improvements being undertaken will be described.

  1. The ATLAS distributed analysis system

    Science.gov (United States)

    Legger, F.; Atlas Collaboration

    2014-06-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of Grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high and steadily improving; Grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. Both the user support techniques and the direct feedback of users have been effective in improving the success rate and user experience when utilizing the distributed computing environment. In this contribution a description of the main components, activities and achievements of ATLAS distributed analysis is given. Several future improvements being undertaken will be described.

  2. The ATLAS distributed analysis system

    OpenAIRE

    Legger, F.

    2014-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During...

  3. Cancer on a mammogram is not memorable: readers remember their recalls and not cancers

    International Nuclear Information System (INIS)

    Pitman, Alexander G.; Kok, Phebe; Zentner, Lucila

    2012-01-01

    To determine if presence of cancer on a mammogram makes that mammogram more memorable. A total of 100 mammograms (25 cancers) were grouped into 5 sets of 20 cases. Set pairs were presented in five reads to eight radiologist readers. Readers were asked to 'clear' or 'call back' cases, and at post-baseline reads to indicate whether each case was 'new' or 'old' (remembered from prior read). Two sets were presented only at baseline, to calculate each reader's false recollection rate. For cases presented more than once ('old' cases, 100 presentations) readers could have 'correct memory' or 'memory loss'. Memory performance was defined as odds ratio of correct memory to memory loss. Multivariate logistic data regression analysis identified predictors of memory performance from: reader, set, time since last read, presence of cancer, and whether the case was called back at the last read. Memory performance differed markedly between readers and reader identity was a highly significant predictor of memory performance. Presence of cancer was not a significant predictor of memory performance (odds ratio 0.77, 95% CI: 0.49–1.21). Whether the case was called back at the last read was a highly significant predictor (odds ratio 4.22, 95% CI: 2.70–6.61) for the model incorporating reader variability, and also the model without reader variability (odds ratio 2.67, 95% CI: 1.74–4.08). The only statistically significant predictor of radiologist memory for a mammogram was whether the radiologist 'called it back' at a prior reading round. Presence of cancer on a mammogram did not make it memorable.
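Odds ratios with 95% confidence intervals, as reported above, can be computed from a 2x2 table with the standard log-odds normal approximation. A sketch with made-up counts, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI from a 2x2 table:
    a = exposed with outcome,   b = exposed without,
    c = unexposed with outcome, d = unexposed without.
    CI uses the normal approximation on the log-odds scale."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi
```

An interval that straddles 1 (like the 0.49–1.21 above for presence of cancer) means the factor is not a statistically significant predictor, while one entirely above 1 (like 2.70–6.61 for prior recall) is.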

  4. Improving work-up of the abnormal mammogram through organized assessment: results from the ontario breast screening program.

    Science.gov (United States)

    Quan, May Lynn; Shumak, Rene S; Majpruz, Vicky; Holloway, Claire M D; O'Malley, Frances P; Chiarelli, Anna M

    2012-03-01

    Women with an abnormal screening mammogram should ideally undergo an organized assessment to attain a timely diagnosis. This study evaluated outcomes of women undergoing work-up after abnormal mammogram through a formal breast assessment affiliate (BAA) program with explicit care pathways compared with usual care (UC) using developed quality indicators for screening mammography programs. Between January 1 and December 31, 2007, a total of 320,635 women underwent a screening mammogram through the Ontario Breast Screening Program (OBSP), of whom 25,543 had an abnormal result requiring further assessment. Established indicators assessing timeliness, appropriateness of follow-up, and biopsy rates were compared between women who were assessed through either a BAA or UC using χ² analysis. Work-up of the abnormal mammogram for patients screened through a BAA resulted in a greater proportion of women attaining a definitive diagnosis within the recommended time interval when a histologic diagnosis was required. In addition, use of other quality measures including specimen radiography for both core biopsies and surgical specimens and preoperative core needle biopsy was greater in BAA facilities. These findings support future efforts to increase the number of BAAs within the OBSP, because the pathways and reporting methods associated with them result in improvements in our ability to provide timely and appropriate care for women requiring work-up of an abnormal mammogram.
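The χ² comparison of proportions between the BAA and UC groups uses the standard Pearson statistic for a 2x2 table; a sketch with made-up counts and no continuity correction:

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic (no continuity correction) for the
    2x2 table [[a, b], [c, d]], e.g. rows = BAA / UC and columns =
    indicator met / not met (counts here are hypothetical)."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
```

The statistic is compared against the χ² distribution with one degree of freedom; values above 3.84 correspond to p < 0.05.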

  5. The ATLAS Distributed Analysis System

    CERN Document Server

    Legger, F; The ATLAS collaboration; Pacheco Pages, A; Stradling, A

    2013-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high but steadily improving; grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters ...

  6. The ATLAS Distributed Analysis System

    CERN Document Server

    Legger, F; The ATLAS collaboration

    2014-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high but steadily improving; grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters ...

  7. What Is a Mammogram and When Should I Get One?

    Science.gov (United States)


  8. Hospitalized women's willingness to pay for an inpatient screening mammogram.

    Science.gov (United States)

    Khaliq, Waseem; Harris, Ché Matthew; Landis, Regina; Bridges, John F P; Wright, Scott M

    2014-01-01

    Lower rates for breast cancer screening persist among low income and uninsured women. Although Medicare and many other insurance plans would pay for screening mammograms done during hospital stays, breast cancer screening has not been part of usual hospital care. This study explores the mean amount of money that hospitalized women were willing to contribute towards the cost of a screening mammogram. Of the 193 enrolled patients, 72% were willing to pay a mean of $83.41 (95% CI, $71.51-$95.31) in advance towards inpatient screening mammogram costs. The study's findings suggest that hospitalized women value the prospect of screening mammography during the hospitalization. It may be wise policy to offer mammograms to nonadherent hospitalized women, especially those who are at high risk for developing breast cancer. © 2014 Annals of Family Medicine, Inc.
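A mean with a 95% confidence interval, as reported above, follows from the normal approximation mean ± 1.96 × SE. A sketch with illustrative values, not the study's data:

```python
def mean_ci(values, z=1.96):
    """Sample mean with a normal-approximation 95% confidence interval,
    using the sample standard deviation (n - 1 denominator) to form
    the standard error."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)
    se = (var / n) ** 0.5
    return mean, mean - z * se, mean + z * se
```

With the study's n = 139 willing respondents (72% of 193), this construction reproduces intervals of the form $83.41 ($71.51–$95.31).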

  9. Consensus double reading of mammograms in private practice

    International Nuclear Information System (INIS)

    Pacher, B.; Tscherney, R.; Litmann-Rowenta, B.; Liskutin, J.; Mazewski, I.; Leitner, H.; Tscholakoff, D.

    2004-01-01

    Purpose: To evaluate retrospectively the results of consensus double reading of mammograms in a private practice for a period of 1.5 years (November 2001 to March 2003). Materials and Methods: Two independent experts with dedicated training read all mammograms on a weekly basis. All mammograms including sonographic examinations were evaluated independently and categorized using the BI-RADS classification. The achieved consensus included a possible recommendation for recall or therapy. A total of 3936 mammograms and 1912 sonography studies were evaluated. All cases with BI-RADS 4 and 5 categories were compared with the histologic results. For a period of three months, the acceptance of double reading including a delay of the final report by one week was tested with a questionnaire and informed consent sheet. Results: BI-RADS categories 4 and 5 were found in 57 cases, with 41 consensus results by two independent readers and 26 carcinomas verified by histology. No consensus could be reached in 16 patients, of whom 10 had a final histologic result, with 5 benign lesions and 5 carcinomas of less than 1 cm in diameter. Clinical symptoms or alterations were absent in all patients. The 5 carcinomas were discovered by the double reading procedure. The result of the questionnaire (695 questionnaires) showed a refusal rate of 0.7%, with only 5 women refusing the opportunity of double reading their mammograms. Conclusion: Double reading of mammograms by independent experts is feasible, shows a measurable increase in quality and is accepted by almost all women. (orig.)

  10. Vitamin D intake, month the mammogram was taken and mammographic density in Norwegian women aged 50-69.

    Directory of Open Access Journals (Sweden)

    Merete Ellingjord-Dale

    Full Text Available The role of vitamin D in breast cancer etiology is unclear. There is some, but inconsistent, evidence that vitamin D is associated with both breast cancer risk and mammographic density (MD). We evaluated the associations of MD with the month the mammogram was taken, and with vitamin D intake, in a population of women from Norway, a country with limited sunlight exposure for a large part of the year. 3114 women aged 50-69, who participated in the Norwegian Breast Cancer Screening Program (NBCSP) in 2004 or 2006/07, completed risk factor and food frequency (FFQ) questionnaires. Dietary and total (dietary plus supplements) vitamin D, calcium and energy intakes were estimated by the FFQ. The month when the mammogram was taken was recorded on the mammogram. Percent MD was assessed using a computer-assisted method (Madena, University of Southern California) after digitization of the films. Linear regression models were used to investigate the associations of percent MD with the month the mammogram was taken and with vitamin D and calcium intakes, adjusting for age, body mass index (BMI), study year, estrogen and progestin therapy (EPT), education, parity, calcium intake and energy intake. There was no statistically significant association between the month the mammogram was taken and percent MD. Overall, there was no association between percent MD and quartiles of total or dietary vitamin D intake, or of calcium intake. However, an analysis restricted to women aged <55 years revealed a suggestive inverse association between total vitamin D intake and percent MD (p for trend = 0.03). Overall, we found no strong evidence that the month the mammogram was taken was associated with percent MD. We found no inverse association between vitamin D intake and percent MD overall, but observed a suggestive inverse association between dietary vitamin D and MD for women less than 55 years old.

  11. The visibility of cancer on previous mammograms in retrospective review

    International Nuclear Information System (INIS)

    Saarenmaa, I.; Salminen, T.; Geiger, U.; Heikkinen, P.; Hyvarinen, S.; Isola, J.; Kataja, V.; Kokko, M.-L.; Kokko, R.; Kumpulainen, E.; Karkkainen, A.; Pakkanen, J.; Peltonen, P.; Piironen, A.; Salo, A.; Talviala, M.-L.; Hakama, M.

    2001-01-01

    AIM: To study how many tumours were visible in retrospect on mammograms originally reported as normal or benign in patients coming to surgery with proven breast cancer. The effect of making the pre-operative mammogram available was also assessed. MATERIALS AND METHODS: Three hundred and twenty initial mammograms of consecutive new breast cancer cases were analysed by a group of radiologists in the knowledge that all patients were later diagnosed with breast cancer. The films were read twice, first without and then with the later (pre-operative) mammograms available. The parenchymal density in the location of the tumour was classified as fatty, mixed or dense, and the tumours were classified as visible or not visible. The reasons for the invisibility of the tumour in the earlier examination were analysed. RESULTS: Fourteen per cent (45) of cancers were retrospectively visible in earlier mammograms without the pre-operative mammograms having been shown, and 29% (95) when pre-operative mammograms were shown. Breast parenchymal density decreased with age and the visibility of tumours increased with age. When considered simultaneously, the effect of age (over 55 vs under 55) was greater (OR = 2.9) than the effect of density (fatty vs others) (OR = 1.5). The most common reasons for non-detection were that the lesion was overlooked (55%), diagnosed as benign (33%) or was visible only in one projection (26%). A growing density was the most common (37%) feature of those lesions originally overlooked or regarded as benign. CONCLUSIONS: Tumours are commonly visible in retrospect, but few of them exhibit specific signs of cancer, and they are recognized only if they grow or otherwise change. It is not possible to differentiate most of them from normal parenchymal densities. Saarenmaa, I. (2001)

  12. Clinical Image Evaluation of Film Mammograms in Korea: Comparison with the ACR Standard

    International Nuclear Information System (INIS)

    Gwak, Yeon Joo; Kim, Hye Jung; Kwak, Jin Young; Son, Eun Ju; Ko, Kyung Hee; Lee, Jin Hwa; Lim, Hyo Soon; Lee, You Jin; Park, Ji Won; Shin, Kyung Min; Jang, Yun-Jin

    2013-01-01

    The goal of this study was to compare the overall quality of film mammograms taken according to the Korean standard with the American College of Radiology (ACR) standard for clinical image evaluation, and to identify means of improving mammography quality in Korea. Four hundred and sixty-eight sets of film mammograms were evaluated with respect to the Korean and ACR standards for clinical image evaluation. The pass and failure rates of mammograms were compared by medical facility type. Average scores in each category of the two standards were evaluated. Receiver operating characteristic curve analysis was used to identify an optimal Korean standard pass mark, taking the ACR standard as the reference standard. 93.6% (438/468) of mammograms passed the Korean standard, whereas only 80.1% (375/468) passed the ACR standard (p < 0.001). Non-radiologic private clinics had the lowest pass rates (88.1% by the Korean standard, 71.8% by the ACR standard) and the lowest total score (76.0) by the Korean standard. Average scores for positioning were the lowest (19.3/29 by the Korean standard and 3.7/5 by the ACR standard). A cutoff score of 77.0 on the Korean standard was found to correspond to a pass level when the ACR standard was applied. We suggest that tighter regulations, such as raising the Korean pass mark, subtracting more points for severe deficiencies, or considering a very low score in even a single category as failure, are needed to improve the quality of mammography in Korea.

  13. Association between mammogram density and background parenchymal enhancement of breast MRI

    Science.gov (United States)

    Aghaei, Faranak; Danala, Gopichandh; Wang, Yunzhi; Zarafshani, Ali; Qian, Wei; Liu, Hong; Zheng, Bin

    2018-02-01

    Breast density has been widely considered an important risk factor for breast cancer. The purpose of this study is to examine the association between mammographic density results and background parenchymal enhancement (BPE) of breast MRI. A dataset of breast MR images was acquired from 65 high-risk women. Based on mammographic density (BI-RADS) results, the dataset was divided into low and high breast density groups. The low-density group has 15 cases with mammographic density BI-RADS 1 and 2, while the high-density group includes 50 cases rated by radiologists as mammographic density BI-RADS 3 and 4. A computer-aided detection (CAD) scheme was applied to segment and register breast regions depicted on sequential images of breast MRI scans. The CAD scheme computed 20 global BPE features from the two breast regions combined, from the left and right breast regions separately, and from the bilateral difference between the left and right breast regions. An image feature selection method, namely the CFS method, was applied to remove the most redundant features and select optimal features from the initial feature pool. A logistic regression classifier was then built using the optimal features to predict mammographic density from the BPE features. Using a leave-one-case-out validation method, the classifier yields an accuracy of 82% and an area under the ROC curve of AUC = 0.81 ± 0.09. A box-plot based analysis also shows a negative association between mammographic density results and BPE features in the MRI images. This study demonstrated a negative association between mammographic density and BPE of breast MRI images.
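
The classifier-and-validation step described in this record can be sketched in plain numpy. This is a hedged illustration only: the gradient-descent logistic regression, the three synthetic features, and the random data stand in for the authors' 20 optimal BPE features and actual 15/50 case split.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, iters=500):
    # Plain gradient-descent logistic regression (bias folded into the weights).
    Xb = np.hstack([X, np.ones((len(X), 1))])
    w = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - y) / len(y)
    return w

def predict_proba(X, w):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return 1.0 / (1.0 + np.exp(-Xb @ w))

def leave_one_out_accuracy(X, y):
    # Leave-one-case-out: train on n-1 cases, test on the single held-out case.
    hits = 0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        w = fit_logistic(X[mask], y[mask])
        hits += int((predict_proba(X[i : i + 1], w)[0] >= 0.5) == y[i])
    return hits / len(y)

# Synthetic stand-in for the 65-case feature matrix (features are illustrative).
rng = np.random.default_rng(0)
X0 = rng.normal(0.0, 1.0, (15, 3))   # "low density" cases
X1 = rng.normal(1.5, 1.0, (50, 3))   # "high density" cases
X = np.vstack([X0, X1])
y = np.r_[np.zeros(15), np.ones(50)]
print(round(leave_one_out_accuracy(X, y), 2))
```

With the synthetic separation used here the leave-one-case-out accuracy lands well above chance, mirroring the shape of the evaluation, not its reported numbers.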

  14. Distributed Data Analysis in ATLAS

    CERN Document Server

    Nilsson, P; The ATLAS collaboration

    2012-01-01

    Data analysis using grid resources is one of the fundamental challenges to be addressed before the start of LHC data taking. The ATLAS detector will produce petabytes of data per year, and roughly one thousand users will need to run physics analyses on this data. Appropriate user interfaces and helper applications have been made available to ensure that the grid resources can be used without requiring expertise in grid technology. These tools enlarge the number of grid users from a few production administrators to potentially all participating physicists. ATLAS makes use of three grid infrastructures for the distributed analysis: the EGEE sites, the Open Science Grid, and NorduGrid. These grids are managed by the gLite workload management system, the PanDA workload management system, and ARC middleware; many sites can be accessed via both the gLite WMS and PanDA. Users can choose between two front-end tools to access the distributed resources. Ganga is a tool co-developed with LHCb to provide a common interfa...

  15. Three-Class Mammogram Classification Based on Descriptive CNN Features

    Directory of Open Access Journals (Sweden)

    M. Mohsin Jadoon

    2017-01-01

    Full Text Available In this paper, a novel classification technique for large mammogram data sets using a deep learning method is proposed. The proposed model targets a three-class classification study (normal, malignant, and benign cases). In our model we present two methods, namely convolutional neural network-discrete wavelet (CNN-DW) and convolutional neural network-curvelet transform (CNN-CT). An augmented data set is generated by using mammogram patches. To enhance the contrast of mammogram images, the data set is filtered by contrast limited adaptive histogram equalization (CLAHE). In the CNN-DW method, enhanced mammogram images are decomposed into four subbands by means of the two-dimensional discrete wavelet transform (2D-DWT), while in the second method the discrete curvelet transform (DCT) is used. In both methods, dense scale invariant features (DSIFT) are extracted for all subbands. An input data matrix containing the subband features of all mammogram patches is created and processed as input to a convolutional neural network (CNN). A softmax layer and a support vector machine (SVM) layer are used to train the CNN for classification. The proposed methods have been compared with existing methods in terms of accuracy rate, error rate, and various validation assessment measures. CNN-DW and CNN-CT achieved accuracy rates of 81.83% and 83.74%, respectively. Simulation results clearly validate the significance and impact of our proposed model as compared to other well-known existing techniques.
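
The four-subband decomposition at the heart of the CNN-DW method can be illustrated with a one-level 2-D Haar transform. Haar is an assumed stand-in here, since the record does not name the wavelet family, and the CLAHE, DSIFT and CNN stages are omitted.

```python
import numpy as np

def haar_dwt2(img):
    """One-level 2-D Haar DWT: returns the LL, LH, HL, HH subbands.
    Illustrates the four-subband decomposition used for feature extraction."""
    img = img[: img.shape[0] // 2 * 2, : img.shape[1] // 2 * 2].astype(float)
    # Row transform: pairwise average (low-pass) and difference (high-pass).
    lo_r = (img[:, ::2] + img[:, 1::2]) / 2.0
    hi_r = (img[:, ::2] - img[:, 1::2]) / 2.0
    # Column transform applied to both row-transform outputs.
    ll = (lo_r[::2] + lo_r[1::2]) / 2.0
    lh = (lo_r[::2] - lo_r[1::2]) / 2.0
    hl = (hi_r[::2] + hi_r[1::2]) / 2.0
    hh = (hi_r[::2] - hi_r[1::2]) / 2.0
    return ll, lh, hl, hh

patch = np.arange(64, dtype=float).reshape(8, 8)  # stand-in mammogram patch
subbands = haar_dwt2(patch)
print([b.shape for b in subbands])  # four half-resolution subbands
```

Each subband is half the patch resolution; in the paper's pipeline, DSIFT features would then be computed on each of the four subbands.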

  16. Mammogram retrieval through machine learning within BI-RADS standards.

    Science.gov (United States)

    Wei, Chia-Hung; Li, Yue; Huang, Pai Jung

    2011-08-01

    A content-based mammogram retrieval system can support the usual comparisons physicians make on images, answering similarity queries over images stored in the database. The importance of searching for similar mammograms lies in the fact that physicians usually try to recall similar cases by seeking images that are pathologically similar to a given image. This paper presents a content-based mammogram retrieval system, which employs a query example to search for similar mammograms in the database. In this system the mammographic lesions are interpreted based on their medical characteristics specified in the Breast Imaging Reporting and Data System (BI-RADS) standards. A hierarchical similarity measurement scheme based on a distance weighting function is proposed to model the user's perception and maximize the effectiveness of each feature in a mammographic descriptor. A machine learning approach based on support vector machines and the user's relevance feedback is also proposed to analyze the user's information need in order to retrieve target images more accurately. Experimental results demonstrate that the proposed machine learning approach with the radial basis function (RBF) kernel achieves the best performance among all kernels tested. Furthermore, the results also show that the proposed learning approach can improve retrieval performance when applied to retrieve mammograms with similar mass and calcification lesions, respectively. Copyright © 2011 Elsevier Inc. All rights reserved.

  17. A Probabilistic Approach for Breast Boundary Extraction in Mammograms

    Directory of Open Access Journals (Sweden)

    Hamed Habibi Aghdam

    2013-01-01

    Full Text Available The extraction of the breast boundary is crucial for further analysis of a mammogram. Methods to extract the breast boundary can be classified into two categories: methods based on image processing techniques and those based on models. The former use image transformation techniques such as thresholding, morphological operations, and region growing. In the second category, the boundary is extracted using more advanced techniques, such as the active contour model. The problem with thresholding methods is that it is hard to automatically find the optimal threshold value using histogram information. Active contour models, on the other hand, require a starting point close to the actual boundary to be able to successfully extract the boundary. In this paper, we propose a probabilistic approach to address these problems. In our approach we use local binary patterns to describe the texture around each pixel. In addition, the smoothness of the boundary is handled by a new probability model. Experimental results show that the proposed method achieves 38% and 50% improvement over the results obtained by the active contour model and threshold-based methods, respectively, and it increases the stability of the boundary extraction process by up to 86%.
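
The texture descriptor this record relies on, the local binary pattern, can be computed in a few lines. Below is a generic 8-neighbour LBP sketch, not the authors' exact variant.

```python
import numpy as np

def lbp8(img):
    """Basic 8-neighbour local binary pattern for each interior pixel.
    Each neighbour >= centre contributes one bit; the resulting 8-bit code
    summarises the local texture around the pixel."""
    c = img[1:-1, 1:-1]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(c, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        nb = img[1 + dy : img.shape[0] - 1 + dy, 1 + dx : img.shape[1] - 1 + dx]
        code |= (nb >= c).astype(np.uint8) << bit
    return code

img = np.array([[0, 0, 0],
                [0, 5, 9],
                [0, 0, 0]], dtype=float)
print(lbp8(img))  # only the right neighbour (9 >= 5) sets its bit
```

Histograms of such codes over a neighbourhood give the per-pixel texture description used in the probabilistic boundary model.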

  18. Study of the Effects of Total Modulation Transfer Function Changes on Observer Performance Using Clinical Mammograms.

    Science.gov (United States)

    Bencomo, Jose Antonio Fagundez

    The main goal of this study was to relate physical changes in image quality, measured by the modulation transfer function (MTF), to diagnostic accuracy. One hundred and fifty Kodak Min-R screen/film combination conventional craniocaudal mammograms obtained with the Pfizer Microfocus mammographic system were selected from the files of the Department of Radiology at M.D. Anderson Hospital and Tumor Institute. The mammograms included 88 cases with a variety of benign diagnoses and 62 cases with a variety of malignant biopsy diagnoses. The average age of the patient population was 55 years. 70 cases presented calcifications, with 30 cases having calcifications smaller than 0.5 mm. 46 cases presented irregularly bordered masses larger than 1 cm. 30 cases presented smoothly bordered masses, 20 of them larger than 1 cm. Four separate copies of the original images were made, each having a different change in the MTF, using a defocusing technique whereby copies of the original were obtained by light exposure through different thicknesses (spacings) of transparent film base. The mammograms were randomized and evaluated by three experienced mammographers for the degree of visibility of various anatomical breast structures and pathological lesions (masses and calcifications), subjective image quality, and mammographic interpretation. 3,000 separate evaluations were analyzed by several statistical techniques, including receiver operating characteristic curve analysis, the McNemar test for differences between proportions, and the Landis et al. method of agreement (weighted kappa) for ordinal categorical data. Results from the statistical analysis show: (1) There were no statistically significant differences in the diagnostic accuracy of the observers when diagnosing from mammograms with the same MTF. (2) There were no statistically significant differences in diagnostic accuracy for each observer when diagnosing from mammograms with the different MTFs used in the study. (3) There statistical

  19. Automatic correspondence detection in mammogram and breast tomosynthesis images

    Science.gov (United States)

    Ehrhardt, Jan; Krüger, Julia; Bischof, Arpad; Barkhausen, Jörg; Handels, Heinz

    2012-02-01

    Two-dimensional mammography is the major imaging modality in breast cancer detection. A disadvantage of mammography is the projective nature of this imaging technique. Tomosynthesis is an attractive modality with the potential to combine the high contrast and high resolution of digital mammography with the advantages of 3D imaging. In order to facilitate diagnostics and treatment in the current clinical work-flow, correspondences between tomosynthesis images and previous mammographic exams of the same women have to be determined. In this paper, we propose a method to detect correspondences in 2D mammograms and 3D tomosynthesis images automatically. In general, this 2D/3D correspondence problem is ill-posed, because a point in the 2D mammogram corresponds to a line in the 3D tomosynthesis image. The goal of our method is to detect the "most probable" 3D position in the tomosynthesis images corresponding to a selected point in the 2D mammogram. We present two alternative approaches to solve this 2D/3D correspondence problem: a 2D/3D registration method and a 2D/2D mapping between mammogram and tomosynthesis projection images with a following back projection. The advantages and limitations of both approaches are discussed and the performance of the methods is evaluated qualitatively and quantitatively using a software phantom and clinical breast image data. Although the proposed 2D/3D registration method can compensate for moderate breast deformations caused by different breast compressions, this approach is not suitable for clinical tomosynthesis data due to the limited resolution and blurring effects perpendicular to the direction of projection. The quantitative results show that the proposed 2D/2D mapping method is capable of detecting corresponding positions in mammograms and tomosynthesis images automatically for 61 out of 65 landmarks. 
The proposed method can facilitate diagnosis, visual inspection and comparison of 2D mammograms and 3D tomosynthesis images for

  20. Distributed mobility management - framework & analysis

    NARCIS (Netherlands)

    Liebsch, M.; Seite, P.; Karagiannis, Georgios

    2013-01-01

    Mobile operators consider the distribution of mobility anchors to enable offloading some traffic from their core network. The Distributed Mobility Management (DMM) Working Group is investigating the impact of decentralized mobility management to existing protocol solutions, while taking into account

  1. Comparative Distributions of Hazard Modeling Analysis

    Directory of Open Access Journals (Sweden)

    Rana Abdul Wajid

    2006-07-01

    Full Text Available In this paper we present a comparison among the distributions used in hazard analysis. A simulation technique has been used to study the behavior of hazard distribution models. The fundamentals of hazard analysis are discussed using failure criteria. We illustrate the flexibility of the hazard modeling distribution, which can approximate several different distributions.
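
As a concrete instance of the kind of comparison the record discusses, the Weibull family alone already covers decreasing, constant and increasing hazard shapes (a standard textbook fact, not a result from this record).

```python
def weibull_hazard(t, k, lam):
    """Weibull hazard rate h(t) = f(t)/S(t) = (k/lam) * (t/lam)**(k-1).
    k < 1 gives a decreasing hazard, k = 1 the constant (exponential) hazard,
    and k > 1 an increasing, wear-out-type hazard."""
    return (k / lam) * (t / lam) ** (k - 1)

# Hazard at t = 0.5, 1.0, 2.0 for three shape parameters (scale lam = 1):
for k in (0.5, 1.0, 2.0):
    print(k, [round(weibull_hazard(t, k, 1.0), 3) for t in (0.5, 1.0, 2.0)])
```

The k = 1 row reproduces the constant exponential hazard, which is why the exponential distribution is the usual baseline in such comparisons.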

  2. Parameter estimation in stochastic mammogram model by heuristic optimization techniques.

    NARCIS (Netherlands)

    Selvan, S.E.; Xavier, C.C.; Karssemeijer, N.; Sequeira, J.; Cherian, R.A.; Dhala, B.Y.

    2006-01-01

    The appearance of disproportionately large amounts of high-density breast parenchyma in mammograms has been found to be a strong indicator of the risk of developing breast cancer. Hence, the breast density model is popular for risk estimation or for monitoring breast density change in prevention or

  3. Flu Shots, Mammogram, and the Perception of Probabilities

    NARCIS (Netherlands)

    Carman, K.G.; Kooreman, P.

    2010-01-01

    We study individuals’ decisions to decline or accept preventive health care interventions such as flu shots and mammograms. In particular, we analyze the role of perceptions of the effectiveness of the intervention, by eliciting individuals' subjective probabilities of sickness and survival, with

  4. Volumetric breast density estimation from full-field digital mammograms.

    NARCIS (Netherlands)

    Engeland, S. van; Snoeren, P.R.; Huisman, H.J.; Boetes, C.; Karssemeijer, N.

    2006-01-01

    A method is presented for estimation of dense breast tissue volume from mammograms obtained with full-field digital mammography (FFDM). The thickness of dense tissue mapping to a pixel is determined by using a physical model of image acquisition. This model is based on the assumption that the breast

  5. Automatic breast cancer risk assessment from digital mammograms

    DEFF Research Database (Denmark)

    Karemore, Gopal Raghunath; Brandt, Sami; Karssemeijer, N

    Purpose: Textural characteristics of the breast tissue structure on mammogram have been shown to improve breast cancer risk assessment in several large studies. Currently, however, the texture is not used to assess risk in standard clinical procedures or involved in general breast cancer risk ass...

  6. Effect of JPEG2000 mammogram compression on microcalcifications segmentation

    International Nuclear Information System (INIS)

    Georgiev, V.; Arikidis, N.; Karahaliou, A.; Skiadopoulos, S.; Costaridou, L.

    2012-01-01

    The purpose of this study is to investigate the effect of mammographic image compression on the automated segmentation of individual microcalcifications. The dataset consisted of individual microcalcifications of 105 clusters originating from mammograms of the Digital Database for Screening Mammography. A JPEG2000 wavelet-based compression algorithm was used for compressing mammograms at 7 compression ratios (CRs): 10:1, 20:1, 30:1, 40:1, 50:1, 70:1 and 100:1. A gradient-based active contours segmentation algorithm was employed for segmentation of microcalcifications as depicted on original and compressed mammograms. The performance of the microcalcification segmentation algorithm on original and compressed mammograms was evaluated by means of the area overlap measure (AOM) and the distance differentiation metrics d_mean and d_max, comparing automatically derived microcalcification borders to those manually defined by an expert radiologist. The AOM monotonically decreased as the CR increased, while the d_mean and d_max metrics monotonically increased with increasing CR. The performance of the segmentation algorithm on original mammograms was (mean ± standard deviation): AOM = 0.91 ± 0.08, d_mean = 0.06 ± 0.05 and d_max = 0.45 ± 0.20, while on 40:1 compressed images the algorithm's performance was: AOM = 0.69 ± 0.15, d_mean = 0.23 ± 0.13 and d_max = 0.92 ± 0.39. Mammographic image compression deteriorates the performance of the segmentation algorithm, influencing the quantification of individual microcalcification morphological properties and subsequently affecting computer-aided diagnosis of microcalcification clusters. (authors)
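
The area overlap measure used for evaluation can be sketched as follows. The intersection-over-union form shown here is one common definition; the record does not state which variant the authors used.

```python
import numpy as np

def area_overlap(seg, ref):
    """Area overlap between a segmented and a reference binary mask.
    Intersection over union: 1.0 means perfect agreement, 0.0 no overlap."""
    seg, ref = seg.astype(bool), ref.astype(bool)
    inter = np.logical_and(seg, ref).sum()
    union = np.logical_or(seg, ref).sum()
    return inter / union if union else 1.0

# Toy microcalcification masks: the segmentation is shifted one pixel
# to the right of the reference border.
ref = np.zeros((8, 8), int); ref[2:6, 2:6] = 1
seg = np.zeros((8, 8), int); seg[2:6, 3:7] = 1
print(round(area_overlap(seg, ref), 2))  # -> 0.6
```

A drop in this score after compression, as reported in the record, means the active-contour border drifts away from the radiologist's reference border.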

  7. Modeling sequential context effects in diagnostic interpretation of screening mammograms.

    Science.gov (United States)

    Alamudun, Folami; Paulus, Paige; Yoon, Hong-Jun; Tourassi, Georgia

    2018-07-01

    Prior research has shown that physicians' medical decisions can be influenced by sequential context, particularly when successive stimuli exhibit similar characteristics during medical image analysis. This type of systematic error is known to psychophysicists as the sequential context effect: judgments are influenced by features of, and decisions about, the preceding case in the sequence of examined cases, rather than being based solely on the peculiarities of the present case. We determine whether radiologists experience some form of context bias, using screening mammography as the use case. To this end, we explore correlations between perceptual behavior and diagnostic decisions on previous cases and diagnostic decisions on the current case. We hypothesize that a radiologist's visual search pattern and diagnostic decisions in previous cases are predictive of the radiologist's current diagnostic decisions. To test our hypothesis, we tasked 10 radiologists of varied experience to conduct blind reviews of 100 four-view screening mammograms. Eye-tracking data and diagnostic decisions were collected from each radiologist under conditions mimicking clinical practice. Perceptual behavior was quantified using the fractal dimension of the gaze scanpath, computed using the Minkowski-Bouligand box-counting method. To test the effect of previous behavior and decisions, we conducted a multifactor fixed-effects ANOVA. Further, to examine the predictive value of previous perceptual behavior and decisions, we trained and evaluated a predictive model for radiologists' current diagnostic decisions. ANOVA tests showed that previous visual behavior, characterized by fractal analysis, previous diagnostic decisions, and image characteristics of previous cases are significant predictors of current diagnostic decisions. Additionally, predictive modeling of diagnostic decisions showed an overall improvement in prediction error when the model is trained on additional information about
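
The Minkowski-Bouligand box-counting estimate of a scanpath's fractal dimension can be sketched as follows; the grid sizes and the synthetic straight-line scanpath are illustrative, not taken from the study.

```python
import numpy as np

def box_counting_dimension(points, sizes=(1, 2, 4, 8, 16)):
    """Minkowski-Bouligand (box-counting) dimension of a 2-D point set.
    Counts occupied boxes N(s) at several box sizes s and fits the slope of
    log N(s) versus log(1/s)."""
    points = np.asarray(points, float)
    counts = []
    for s in sizes:
        # Quantise each point to its box index and count distinct boxes.
        boxes = {tuple(b) for b in np.floor(points / s).astype(int)}
        counts.append(len(boxes))
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# A straight-line scanpath should have dimension close to 1.
line = np.column_stack([np.arange(0, 64, 0.25), np.arange(0, 64, 0.25)])
print(round(box_counting_dimension(line), 2))  # -> 1.0
```

A gaze scanpath that wanders over the whole image fills more boxes at fine scales and yields a dimension closer to 2, which is what makes the measure a compact summary of search behavior.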

  8. Distribution system analysis and automation

    CERN Document Server

    Gers, Juan

    2013-01-01

    A comprehensive guide to techniques that allow engineers to simulate, analyse and optimise power distribution systems which, combined with automation, underpin the emerging concept of the "smart grid". The book supports theoretical concepts with real-world applications and MATLAB exercises.

  9. Mobile Versus Fixed Facility: Latinas' Attitudes and Preferences for Obtaining a Mammogram.

    Science.gov (United States)

    Scheel, John R; Tillack, Allison A; Mercer, Lauren; Coronado, Gloria D; Beresford, Shirley A A; Molina, Yamile; Thompson, Beti

    2018-01-01

    Mobile mammographic services have been proposed as a way to reduce Latinas' disproportionate late-stage presentation compared with white women by increasing their access to mammography. The aims of this study were to assess why Latinas may not use mobile mammographic services and to explore their preferences after using these services. Using a mixed-methods approach, a secondary analysis was conducted of baseline survey data (n = 538) from a randomized controlled trial to improve screening mammography rates among Latinas in Washington. Descriptive statistics and bivariate regression were used to characterize mammography location preferences and to test for associations with sociodemographic indices, health care access, and perceived breast cancer risk and beliefs. On the basis of these findings, a qualitative study (n = 18) was used to explore changes in perceptions after using mobile mammographic services. More Latinas preferred obtaining a mammogram at a fixed facility (52.3% [n = 276]) compared with having no preference (46.3% [n = 249]) and preferring mobile mammographic services (1.7% [n = 9]). Concerns about privacy and comfort (15.6% [n = 84]) and about general quality (10.6% [n = 57]) were common reasons for preferring a fixed facility. Those with no history of mammography preferred a fixed facility (P mobile mammographic services after obtaining a mammogram. Although most Latinas preferred obtaining a mammogram at a fixed facility, positive experiences with mobile mammography services changed their attitudes toward them. These findings highlight the need to include community education when using mobile mammographic service to increase screening mammography rates in underserved communities. Copyright © 2017 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  10. Effects of Different Compression Techniques on Diagnostic Accuracies of Breast Masses on Digitized Mammograms

    International Nuclear Information System (INIS)

    Zhigang Liang; Xiangying Du; Jiabin Liu; Yanhui Yang; Dongdong Rong; Xinyu Yao; Kuncheng Li

    2008-01-01

    Background: The JPEG 2000 compression technique has recently been introduced into the medical imaging field. It is critical to understand the effects of this technique on the detection of breast masses on digitized images by human observers. Purpose: To evaluate whether lossless and lossy techniques affect the diagnostic results for malignant and benign breast masses on digitized mammograms. Material and Methods: A total of 90 screen-film mammograms, including craniocaudal and lateral views obtained from 45 patients, were selected by two non-observing radiologists. Of these, 22 cases were benign lesions and 23 cases were malignant. The mammographic films were digitized by a laser film digitizer and compressed to three levels (lossless, and lossy 20:1 and 40:1) using the JPEG 2000 wavelet-based image compression algorithm. Four radiologists with 10-12 years' experience in mammography interpreted the original and compressed images. The time interval between reading sessions was 3 weeks. A five-point malignancy scale was used, with a score of 1 meaning definitely not a malignant mass, 2 not a malignant mass, 3 possibly a malignant mass, 4 probably a malignant mass, and 5 definitely a malignant mass. The radiologists' performance was evaluated using receiver operating characteristic analysis. Results: The average Az value for all radiologists decreased from 0.8933 for the original uncompressed images to 0.8299 for the images compressed at 40:1. This difference was not statistically significant. The detection accuracy of the original images was better than that of the compressed images, and the Az values decreased with increasing compression ratio. Conclusion: Digitized mammograms compressed at 40:1 could be used as substitutes for original images in the diagnosis of breast cancer.
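
The Az figures in this record come from ROC analysis of five-point malignancy ratings. The sketch below shows the idea nonparametrically, using the rank-statistic (Wilcoxon) form of the area rather than the binormal fit that observer studies typically use; the ratings are hypothetical.

```python
import numpy as np

def empirical_az(ratings_neg, ratings_pos):
    """Empirical area under the ROC curve from ordinal malignancy ratings.
    Equals the probability that a malignant case receives a higher rating
    than a benign one, with ties counted one-half; a nonparametric stand-in
    for the binormal Az fits usually reported in observer studies."""
    neg, pos = np.asarray(ratings_neg), np.asarray(ratings_pos)
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

benign = [1, 2, 2, 3, 1, 2]       # hypothetical 5-point scale ratings
malignant = [3, 4, 5, 4, 2, 5]
print(round(empirical_az(benign, malignant), 3))  # -> 0.917
```

A drop in this area after compression, as measured in the study, means the rating distributions for benign and malignant cases overlap more.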

  11. Studies on computer-aided diagnosis systems for chest radiographs and mammograms (in Japanese)

    International Nuclear Information System (INIS)

    Hara, Takeshi

    2001-01-01

    This thesis describes computer-aided diagnosis (CAD) systems for chest radiographs and mammograms. Preprocessing and image processing methods for each CAD system include dynamic range compression and region segmentation techniques. A new pattern recognition technique combines genetic algorithms with template matching methods to detect lung nodules. A genetic algorithm was employed to select the optimal shape of simulated nodular shadows to be compared with real lesions on digitized chest images. Detection performance was evaluated using 332 chest radiographs from the database of the Japanese Society of Radiological Technology. Our average true-positive rate was 72.8%, with an average of 11 false-positive findings per image. A new detection method using high-resolution digital images with 0.05 mm sampling is also proposed for the mammogram CAD system to detect very small microcalcifications. An automated classification method uses feature extraction based on fractal dimension analysis of masses. Using over 200 cases to evaluate the detection of mammographic masses and calcifications, the detection rates of masses and microcalcifications were 87% and 96%, with 1.5 and 1.8 false-positive findings per image, respectively. For the classification of benign vs. malignant lesions, the Az values, defined as the areas under the ROC curves derived from the classification schemes for masses and microcalcifications, were 0.84 and 0.89. To demonstrate the practicality of these CAD systems in a computer-network environment, we propose to use the mammogram CAD system via the Internet and WWW. A common gateway interface and a server-client approach for the CAD system via the Internet will permit display of the CAD results on ordinary computers.

  12. Multiplexed wavelet transform technique for detection of microcalcification in digitized mammograms.

    Science.gov (United States)

    Mini, M G; Devassia, V P; Thomas, Tessamma

    2004-12-01

    Wavelet transform (WT) is a potential tool for the detection of microcalcifications, an early sign of breast cancer. This article describes the implementation and evaluates the performance of two novel WT-based schemes for the automatic detection of clustered microcalcifications in digitized mammograms. Employing a one-dimensional WT technique that utilizes the pseudo-periodicity property of image sequences, the proposed algorithms achieve high detection efficiency and low processing memory requirements. The detection is achieved from the parent-child relationship between the zero-crossings (Marr-Hildreth (M-H) detector) or local extrema (Canny detector) of the WT coefficients at different levels of decomposition. The detected pixels are weighted before the inverse transform is computed, and they are segmented by simple global gray-level thresholding. Both detectors produce 95% detection sensitivity, although the M-H detector yields more false positives. The M-H detector preserves the shape information and provides better detection sensitivity for mammograms containing widely distributed calcifications.
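
    A rough sketch of the multiscale coincidence idea: an undecimated 1-D Haar transform stands in for the paper's wavelet, and a pixel is flagged when its detail magnitude is large at two consecutive levels. The kernel, threshold, and test signal are illustrative assumptions, not the authors' parameters:

```python
import numpy as np

def haar_details(signal, level):
    """Detail coefficients of an undecimated Haar transform at a dyadic
    scale (a stand-in for the 1-D WT used in the paper)."""
    s = 2 ** level
    kernel = np.concatenate([np.ones(s), -np.ones(s)]) / (2 * s)
    return np.convolve(signal, kernel, mode="same")

def detect_spikes(row, threshold=0.2):
    """Flag pixels whose detail magnitude is large at BOTH level 1 and
    level 2 -- a crude version of the parent-child coincidence rule."""
    d1 = np.abs(haar_details(row, 1))
    d2 = np.abs(haar_details(row, 2))
    return np.flatnonzero((d1 > threshold) & (d2 > threshold))

# Hypothetical image row: a slowly varying background plus a bright
# 2-pixel "microcalcification" at index 40.
row = np.linspace(0.2, 0.4, 80)
row[40:42] += 1.0
hits = detect_spikes(row)   # indices clustered around the spike
```

    The slowly varying background produces small detail coefficients at every level, while the spike responds strongly at both scales, which is what the cross-scale coincidence exploits.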

  13. Transient stability analysis of a distribution network with distributed generators

    NARCIS (Netherlands)

    Xyngi, I.; Ishchenko, A.; Popov, M.; Sluis, van der L.

    2009-01-01

    This letter describes the transient stability analysis of a 10-kV distribution network with wind generators, microturbines, and CHP plants. The network, modeled in Matlab/Simulink, takes into account detailed dynamic models of the generators. Fault simulations at various locations are

  14. Monte Carlo Modelling of Mammograms : Development and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Spyrou, G; Panayiotakis, G [University of Patras, School of Medicine, Medical Physics Department, 265 00 Patras (Greece); Bakas, A [Technological Educational Institution of Athens, Department of Radiography, 122 10 Athens (Greece); Tzanakos, G [University of Athens, Department of Physics, Division of Nuclear and Particle Physics, 157 71 Athens (Greece)

    1999-12-31

    A software package using Monte Carlo methods has been developed for the simulation of x-ray mammography. A simplified geometry of the mammographic apparatus has been considered along with the software phantom of compressed breast. This phantom may contain inhomogeneities of various compositions and sizes at any point. Using this model one can produce simulated mammograms. Results that demonstrate the validity of this simulation are presented. (authors) 16 refs, 4 figs
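
    A minimal sketch of the kind of Monte Carlo photon transport such a simulator builds on: photons are traced through a software phantom containing one denser inhomogeneity, and survival to the detector follows Beer-Lambert attenuation. The attenuation coefficients, geometry, and absence of scatter are simplifying assumptions, not the authors' model:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical linear attenuation coefficients (1/cm) at a mammographic
# energy; real values depend on the spectrum and tissue composition.
MU_BREAST = 0.75
MU_LESION = 1.10

THICKNESS = 4.0      # compressed-breast thickness (cm)
N_PHOTONS = 20000    # photons traced per detector pixel
PIXELS = 32          # one detector row

def optical_depth(pixel):
    """Total attenuation along a straight vertical ray: uniform breast
    tissue with a 1 cm denser inhomogeneity under pixels 12..19."""
    if 12 <= pixel < 20:
        return MU_BREAST * (THICKNESS - 1.0) + MU_LESION * 1.0
    return MU_BREAST * THICKNESS

# A photon reaches the detector if its sampled free path (in mean-free-
# path units) exceeds the optical depth, i.e. with probability exp(-mu*t).
image = np.empty(PIXELS)
for px in range(PIXELS):
    free_paths = rng.exponential(1.0, size=N_PHOTONS)
    image[px] = np.mean(free_paths > optical_depth(px))
```

    The denser region transmits fewer photons, so it appears as a darker band in the simulated transmission profile, which is the contrast mechanism a simulated mammogram reproduces.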

  15. Monte Carlo Modelling of Mammograms : Development and Validation

    International Nuclear Information System (INIS)

    Spyrou, G.; Panayiotakis, G.; Bakas, A.; Tzanakos, G.

    1998-01-01

    A software package using Monte Carlo methods has been developed for the simulation of x-ray mammography. A simplified geometry of the mammographic apparatus has been considered along with the software phantom of compressed breast. This phantom may contain inhomogeneities of various compositions and sizes at any point. Using this model one can produce simulated mammograms. Results that demonstrate the validity of this simulation are presented. (authors)

  16. Estimating average glandular dose by measuring glandular rate in mammograms

    International Nuclear Information System (INIS)

    Goto, Sachiko; Azuma, Yoshiharu; Sumimoto, Tetsuhiro; Eiho, Shigeru

    2003-01-01

    The glandular rate of the breast was objectively measured in order to calculate individual patient exposure dose (average glandular dose) in mammography. By employing image processing techniques and breast-equivalent phantoms with various glandular rate values, a conversion curve for pixel value to glandular rate can be determined by a neural network. Accordingly, the pixel values in clinical mammograms can be converted to the glandular rate value for each pixel. The individual average glandular dose can therefore be calculated using the individual glandular rates on the basis of the dosimetry method employed for quality control in mammography. In the present study, a data set of 100 craniocaudal mammograms from 50 patients was used to evaluate our method. The average glandular rate and average glandular dose of the data set were 41.2% and 1.79 mGy, respectively. The error in calculating the individual glandular rate can be estimated to be less than ±3%. When the calculation error of the glandular rate is taken into consideration, the error in the individual average glandular dose can be estimated to be 13% or less. We feel that our method for determining the glandular rate from mammograms is useful for minimizing subjectivity in the evaluation of patient breast composition. (author)
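
    The pixel-to-glandular-rate conversion described above can be sketched as follows, with linear interpolation of a phantom calibration curve standing in for the paper's neural network. All calibration points and the dose-conversion factor are invented for illustration:

```python
import numpy as np

# Hypothetical calibration from breast-equivalent phantoms: mean pixel
# value measured at known glandular rates. The paper fits this mapping
# with a neural network; linear interpolation stands in for it here.
calib_pixel = np.array([500.0, 900.0, 1400.0, 2000.0, 2700.0])
calib_gland = np.array([0.0, 25.0, 50.0, 75.0, 100.0])   # percent

def glandular_rate(pixels):
    """Map mammogram pixel values to a per-pixel glandular rate (%)."""
    return np.interp(pixels, calib_pixel, calib_gland)

def average_glandular_dose(pixels, incident_air_kerma_mgy=8.0):
    """Average glandular dose (mGy) from the mean glandular rate, using
    a toy linear dose-conversion factor -- illustrative numbers only."""
    g = glandular_rate(pixels).mean() / 100.0
    dgn = 0.15 + 0.10 * g     # hypothetical dose per unit air kerma
    return incident_air_kerma_mgy * dgn

breast_pixels = np.array([900.0, 1100.0, 1400.0, 1700.0, 2000.0])
agd = average_glandular_dose(breast_pixels)
```

    The real dosimetry uses published glandularity-dependent conversion factors rather than this linear toy, but the chain -- pixel values to glandular rates to an individual average glandular dose -- is the same.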

  17. Distributed computing and nuclear reactor analysis

    International Nuclear Information System (INIS)

    Brown, F.B.; Derstine, K.L.; Blomquist, R.N.

    1994-01-01

    Large-scale scientific and engineering calculations for nuclear reactor analysis can now be carried out effectively in a distributed computing environment, at costs far lower than for traditional mainframes. The distributed computing environment must include support for traditional system services, such as a queuing system for batch work, reliable filesystem backups, and parallel processing capabilities for large jobs. All ANL computer codes for reactor analysis have been adapted successfully to a distributed system based on workstations and X-terminals. Distributed parallel processing has been demonstrated to be effective for long-running Monte Carlo calculations

  18. Distributed analysis with PROOF in ATLAS collaboration

    International Nuclear Information System (INIS)

    Panitkin, S Y; Ernst, M; Ito, H; Maeno, T; Majewski, S; Rind, O; Tarrade, F; Wenaus, T; Ye, S; Benjamin, D; Montoya, G Carillo; Guan, W; Mellado, B; Xu, N; Cranmer, K; Shibata, A

    2010-01-01

    The Parallel ROOT Facility (PROOF) is a distributed analysis system that exploits the inherent event-level parallelism of high energy physics data. PROOF can be configured to work with centralized storage systems, but it is especially effective together with distributed local storage systems such as Xrootd, where data are distributed over the computing nodes. It works efficiently on different types of hardware and scales well from a multi-core laptop to large computing farms. From that point of view it is well suited for both large central analysis facilities and Tier 3 type analysis farms. PROOF can be used in interactive or batch-like regimes. The interactive regime allows the user to work with distributed data from the ROOT command prompt and to get real-time feedback on analysis progress and intermediate results. We will discuss our experience with PROOF in the context of ATLAS Collaboration distributed analysis. In particular, we will discuss PROOF performance in various analysis scenarios and in multi-user, multi-session environments. We will also describe PROOF integration with the ATLAS distributed data management system and the prospects of running PROOF on geographically distributed analysis farms.

  19. Distributed analysis with PROOF in ATLAS collaboration

    Energy Technology Data Exchange (ETDEWEB)

    Panitkin, S Y; Ernst, M; Ito, H; Maeno, T; Majewski, S; Rind, O; Tarrade, F; Wenaus, T; Ye, S [Brookhaven National Laboratory, Upton, NY 11973 (United States); Benjamin, D [Duke University, Durham, NC 27708 (United States); Montoya, G Carillo; Guan, W; Mellado, B; Xu, N [University of Wisconsin-Madison, Madison, WI 53706 (United States); Cranmer, K; Shibata, A [New York University, New York, NY 10003 (United States)

    2010-04-01

    The Parallel ROOT Facility (PROOF) is a distributed analysis system that exploits the inherent event-level parallelism of high energy physics data. PROOF can be configured to work with centralized storage systems, but it is especially effective together with distributed local storage systems such as Xrootd, where data are distributed over the computing nodes. It works efficiently on different types of hardware and scales well from a multi-core laptop to large computing farms. From that point of view it is well suited for both large central analysis facilities and Tier 3 type analysis farms. PROOF can be used in interactive or batch-like regimes. The interactive regime allows the user to work with distributed data from the ROOT command prompt and to get real-time feedback on analysis progress and intermediate results. We will discuss our experience with PROOF in the context of ATLAS Collaboration distributed analysis. In particular, we will discuss PROOF performance in various analysis scenarios and in multi-user, multi-session environments. We will also describe PROOF integration with the ATLAS distributed data management system and the prospects of running PROOF on geographically distributed analysis farms.
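
    The event-level parallelism PROOF exploits can be illustrated with a small scatter/merge sketch; threads stand in for PROOF workers, and the histogrammed quantity is invented:

```python
from concurrent.futures import ThreadPoolExecutor

import numpy as np

def analyze_chunk(events):
    """Per-worker analysis: histogram a derived quantity over one chunk
    of events (a toy stand-in for a real physics analysis)."""
    values = np.sqrt(np.abs(events[:, 0] * events[:, 1]))
    hist, _ = np.histogram(values, bins=10, range=(0.0, 4.0))
    return hist

def parallel_analysis(events, n_workers=4):
    """Scatter chunks of events to workers and merge the partial
    histograms, mimicking PROOF's event-level scatter/merge."""
    chunks = np.array_split(events, n_workers)
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        partial = list(pool.map(analyze_chunk, chunks))
    return np.sum(partial, axis=0)

rng = np.random.default_rng(1)
events = rng.normal(1.0, 0.5, size=(100_000, 2))
merged = parallel_analysis(events)
assert np.array_equal(merged, analyze_chunk(events))  # same as a serial run
```

    Because a histogram is additive over disjoint event subsets, the merged result is identical to a serial pass, which is what makes this workload embarrassingly parallel at the event level.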

  20. Analysis of irregularly distributed points

    DEFF Research Database (Denmark)

    Hartelius, Karsten

    1996-01-01

    conditional modes are applied to this problem. The Kalman filter is described as a powerful tool for modelling two-dimensional data. Motivated by the development of the reduced update Kalman filter we propose a reduced update Kalman smoother which offers considerable computational savings. Kriging...... on hybridisation analysis, which comprises matching a grid to an arrayed set of DNA-clones spotted onto a hybridisation filter. The line process has proven to perform a satisfactory modelling of shifted fields (subgrids) in the hybridisation grid, and a two-staged hierarchical grid matching scheme which...

  1. Volumetric quantification of the effect of aging and hormone replacement therapy on breast composition from digital mammograms

    International Nuclear Information System (INIS)

    Hammann-Kloss, J.S.; Bick, U.; Fallenberg, E.; Engelken, F.

    2014-01-01

    Objective: To assess the physiological changes in breast composition with aging using volumetric breast composition measurement from digital mammograms and to assess the effect of hormone replacement therapy (HRT). Methods: A total of 764 consecutive mammograms of 208 non-HRT using women and 508 mammograms of 134 HRT-using women were analyzed using a volumetric breast composition assessment software (Quantra™, Hologic Inc.). Fibroglandular tissue volume (FTV), breast volume (BV), and percent density (PD) were measured. For statistical analysis, women were divided into a premenopausal (<46 years), a perimenopausal (46–55 years), and a postmenopausal (>55 years) age group. More detailed graphical analysis was performed using smaller age brackets. Women using HRT were compared to age-matched controls not using HRT. Results: Women in the postmenopausal age group had a significantly lower FTV and PD and a significantly higher BV than women in the premenopausal age group (FTV: 77 vs. 120 cm³, respectively; PD: 16% vs. 28%, respectively; BV: 478 vs. 406 cm³, respectively; p < 0.01 for all). Median FTV was nearly stable in consecutive mammograms in the premenopausal and postmenopausal age groups, but declined at a rate of 3.9% per year in the perimenopausal period. Median PD was constant in the premenopausal and postmenopausal age groups and declined at a rate of 0.57% per year in the perimenopausal age group. BV continuously increased with age. Women using HRT throughout the study had a 5% higher PD than women not using HRT (22% vs. 17%, respectively; p < 0.001). Conclusions: Accurate knowledge of normal changes in breast composition is of particular interest nowadays due to the importance of breast density for breast cancer risk evaluation. FTV and PD change significantly during the perimenopausal period but remain relatively constant before and thereafter. Median total breast volume consistently increases with age and further contributes to changes in breast

  2. Cluster analysis for determining distribution center location

    Science.gov (United States)

    Lestari Widaningrum, Dyah; Andika, Aditya; Murphiyanto, Richard Dimas Julian

    2017-12-01

    Determination of distribution facilities is highly important for surviving the high level of competition in today's business world. Companies can operate multiple distribution centers to mitigate supply chain risk. Thus, new problems arise, namely how many facilities should be provided and where. This study examines a fast-food restaurant brand located in Greater Jakarta. This brand is included in the category of top 5 fast-food restaurant chains based on retail sales. There were three stages in this study: compiling spatial data, cluster analysis, and network analysis. Cluster analysis results are used to consider the location of the additional distribution center. Network analysis results show a more efficient process, referring to a shorter distance in the distribution process.
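
    The clustering stage can be sketched with a plain k-means over store coordinates, where each centroid is a candidate distribution-center location. The coordinates below are synthetic, and the study does not necessarily use k-means specifically:

```python
import numpy as np

rng = np.random.default_rng(7)

def kmeans(points, k, iters=50):
    """Plain k-means; each centroid is a candidate distribution-center
    location for the stores assigned to it."""
    centers = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(points[:, None] - centers[None, :], axis=2)
        labels = dists.argmin(axis=1)
        centers = np.array([points[labels == j].mean(axis=0)
                            for j in range(k)])
    return centers, labels

# Synthetic store coordinates in two dense areas of a metro region.
stores = np.vstack([
    rng.normal([0.0, 0.0], 0.5, size=(30, 2)),
    rng.normal([5.0, 5.0], 0.5, size=(30, 2)),
])
centers, labels = kmeans(stores, k=2)
```

    In practice the cluster count k would be chosen from the spatial data (and the subsequent network analysis would evaluate travel distance for each candidate), but the centroid-as-candidate-site idea is the core of the clustering stage.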

  3. Can Australian radiographers assess screening mammograms accurately? First stage results from a four year prospective study

    International Nuclear Information System (INIS)

    Moran, S.; Warren-Forward, H.

    2016-01-01

    Introduction: Globally, the role of the radiographer is changing; some countries have developed advanced roles with specific scopes of practice. Other countries, like Australia, are in the process of this change. The aim of this research is to assess the diagnostic outcomes reported by the radiographers and compare them to those reported by current screen readers. Method: Six experienced radiographers were invited to participate in a prospective study conducted between 2010 and 2011. They were required to read 2000 mammograms each. Their results were compared with those of the radiologists. Statistical analysis of the results included overall cancer detection rates, recall rates, levels of agreement, kappa, sensitivity, specificity, accuracy, positive predictive value and negative predictive value. Results: A total of 9348 women were included in the study. The percentage of cancers detected by the radiographers ranged from 53% to 100% of the cancers detected by the radiologists. Radiologist recall rates ranged between 3.4% and 5.5%, and the radiographers' recall rates ranged between 2.9% and 9.8%. Level of agreement of the radiographers with the radiologists ranged from 90% to 96%. Conclusion: The potential for accuracy in screen reading by Australian radiographers is supported by the results of this study. Implementation of formal training is likely to result in an increase in the diagnostic accuracy of radiographers. - Highlights: • Radiographers prospectively read 2000 screening mammograms each. • These results support potential for accuracy in screen reading by radiographers. • Will advanced practice be introduced within BreastScreen Australia?
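
    The agreement statistics reported above (sensitivity, specificity, kappa) can be computed from paired recall decisions as sketched below; the decision vectors are invented, not study data:

```python
import numpy as np

def reader_stats(reader, reference):
    """Sensitivity, specificity and Cohen's kappa for one reader's
    recall decisions (1 = recall) against the reference decisions."""
    reader = np.asarray(reader)
    reference = np.asarray(reference)
    n = reader.size
    tp = np.sum((reader == 1) & (reference == 1))
    tn = np.sum((reader == 0) & (reference == 0))
    fp = np.sum((reader == 1) & (reference == 0))
    fn = np.sum((reader == 0) & (reference == 1))
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    observed = (tp + tn) / n
    # Chance agreement from the two readers' marginal recall rates.
    expected = ((tp + fp) * (tp + fn) + (tn + fn) * (tn + fp)) / n ** 2
    kappa = (observed - expected) / (1.0 - expected)
    return float(sensitivity), float(specificity), float(kappa)

# Invented recall decisions on ten screens (not study data).
radiographer = [1, 1, 0, 0, 1, 0, 0, 0, 1, 0]
radiologist = [1, 1, 0, 0, 0, 0, 0, 1, 1, 0]
sens, spec, kappa = reader_stats(radiographer, radiologist)
```

    Kappa corrects the raw agreement for the agreement expected by chance from each reader's marginal recall rate, which is why it is reported alongside the plain level of agreement.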

  4. ELM BASED CAD SYSTEM TO CLASSIFY MAMMOGRAMS BY THE COMBINATION OF CLBP AND CONTOURLET

    Directory of Open Access Journals (Sweden)

    S Venkatalakshmi

    2017-05-01

    Breast cancer is a serious life threat to womanhood worldwide. Mammography is the promising screening tool, which can show the abnormality being detected. However, physicians find it difficult to detect the affected regions, as the size of microcalcifications is very small. Hence it would be better if a CAD system could accompany the physician in detecting the malicious regions. Taking this as a challenge, this paper presents a CAD system for mammogram classification which is proven to be accurate and reliable. The entire work is decomposed into four different stages, and the outcome of a phase is passed as the input of the following phase. Initially, the mammogram is pre-processed by an adaptive median filter and the segmentation is done by GHFCM. The features are extracted by combining the texture feature descriptors Completed Local Binary Pattern (CLBP) and contourlet to frame the feature sets. In the training phase, an Extreme Learning Machine (ELM) is trained with the feature sets. During the testing phase, the ELM can classify between normal, malignant and benign types of cancer. The performance of the proposed approach is analysed by varying the classifier, feature extractors and parameters of the feature extractor. From the experimental analysis, it is evident that the proposed work outperforms the analogous techniques in terms of accuracy, sensitivity and specificity.
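
    The texture stage builds on local binary patterns. The sketch below computes the basic 8-neighbour LBP histogram; the paper's CLBP adds sign/magnitude components, and the contourlet part is omitted here:

```python
import numpy as np

def lbp_histogram(image):
    """Basic 8-neighbour local binary pattern histogram (the paper's
    CLBP is a richer variant; this sketches the core idea)."""
    center = image[1:-1, 1:-1]
    codes = np.zeros_like(center, dtype=np.uint8)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(offsets):
        neigh = image[1 + dy:image.shape[0] - 1 + dy,
                      1 + dx:image.shape[1] - 1 + dx]
        codes |= (neigh >= center).astype(np.uint8) << bit
    hist = np.bincount(codes.ravel(), minlength=256)
    return hist / hist.sum()

rng = np.random.default_rng(3)
roi = rng.integers(0, 255, size=(32, 32)).astype(float)  # stand-in ROI
features = lbp_histogram(roi)
```

    Each pixel's 8-bit code records which neighbours are at least as bright as it, so the normalized 256-bin histogram summarizes local texture independently of absolute intensity -- the property that makes LBP-style features useful across mammograms with different exposures.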

  5. Regions of micro-calcifications clusters detection based on new features from imbalance data in mammograms

    Science.gov (United States)

    Wang, Keju; Dong, Min; Yang, Zhen; Guo, Yanan; Ma, Yide

    2017-02-01

    Breast cancer is the most common cancer among women. A micro-calcification cluster on an X-ray mammogram is one of the most important abnormalities, and it is effective for early cancer detection. The Surrounding Region Dependence Method (SRDM), a statistical texture analysis method, is applied for detecting Regions of Interest (ROIs) containing microcalcifications. Inspired by the SRDM, we present a method that extracts gray-level and other features which are effective for predicting the positive and negative regions of micro-calcification clusters in mammograms. By constructing a set of artificial images containing only micro-calcifications, we locate the suspicious calcification pixels of an SRDM matrix in the original image map. Features are extracted based on these pixels for the imbalanced data, and then the repeated random subsampling method and a Random Forest (RF) classifier are used for classification. The True Positive (TP) rate and False Positive (FP) rate reflect the quality of the result. The TP rate is 90% and the FP rate is 88.8% when the threshold q is 10. We draw the Receiver Operating Characteristic (ROC) curve, and the Area Under the ROC Curve (AUC) value reaches 0.9224. The experiment indicates that our method is effective. A novel micro-calcification cluster detection method is developed, which is based on new features for imbalanced data in mammography, and it can be considered to help improve the accuracy of computer-aided diagnosis of breast cancer.

  6. Automatic and consistent registration framework for temporal pairs of mammograms in application to breast cancer risk assessment due to hormone replacement therapy (HRT)

    DEFF Research Database (Denmark)

    Karemore, Gopal Raghunath; Carreras, I. Arganda; Nielsen, Mads

    2009-01-01

     Purpose: Mammographic density is a strong risk factor for breast cancer. However, whether changes in mammographic density due to HRT are associated with risk remains unclear. The aim of this study is to provide a framework for accurate interval change analysis in temporal pairs of mammograms of ...

  7. Generalized Analysis of a Distribution Separation Method

    Directory of Open Access Journals (Sweden)

    Peng Zhang

    2016-04-01

    Separating two probability distributions from a mixture model that is made up of the combinations of the two is essential to a wide range of applications. For example, in information retrieval (IR), there often exists a mixture distribution consisting of a relevance distribution that we need to estimate and an irrelevance distribution that we hope to get rid of. Recently, a distribution separation method (DSM) was proposed to approximate the relevance distribution, by separating a seed irrelevance distribution from the mixture distribution. It was successfully applied to an IR task, namely pseudo-relevance feedback (PRF), where the query expansion model is often a mixture term distribution. Although initially developed in the context of IR, DSM is indeed a general mathematical formulation for probability distribution separation. Thus, it is important to further generalize its basic analysis and to explore its connections to other related methods. In this article, we first extend DSM's theoretical analysis, which was originally based on the Pearson correlation coefficient, to entropy-related measures, including the KL-divergence (Kullback–Leibler divergence), the symmetrized KL-divergence and the JS-divergence (Jensen–Shannon divergence). Second, we investigate the distribution separation idea in a well-known method, namely the mixture model feedback (MMF) approach. We prove that MMF also complies with the linear combination assumption, and then, DSM's linear separation algorithm can largely simplify the EM algorithm in MMF. These theoretical analyses, as well as further empirical evaluation results, demonstrate the advantages of our DSM approach.
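
    The linear-combination assumption at the heart of DSM can be shown in a few lines: given the mixture M = λS + (1−λ)R and the seed irrelevance distribution S, the relevance distribution R is recovered by inverting the combination. Here λ is passed in as known; estimating it (e.g., via the correlation analysis the article discusses) is the harder part the paper addresses:

```python
import numpy as np

def separate(mixture, seed, lam):
    """Invert the linear-combination assumption M = lam*S + (1-lam)*R to
    recover the relevance distribution R from the mixture M and the seed
    irrelevance distribution S."""
    r = (mixture - lam * seed) / (1.0 - lam)
    r = np.clip(r, 0.0, None)      # guard against small negative values
    return r / r.sum()

# Hypothetical term distributions over a six-term vocabulary.
relevance = np.array([0.40, 0.25, 0.15, 0.10, 0.06, 0.04])
irrelevance = np.array([0.05, 0.05, 0.10, 0.20, 0.30, 0.30])
lam = 0.35
mixture = lam * irrelevance + (1.0 - lam) * relevance

recovered = separate(mixture, irrelevance, lam)
```

    With an exact λ the inversion recovers R perfectly; with a mis-estimated λ or an imperfect seed S, the clipping and renormalization keep the output a valid distribution, which is where DSM's correlation- and divergence-based analysis comes in.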

  8. Reproducibility of computer-aided detection system in digital mammograms

    International Nuclear Information System (INIS)

    Kim, Seung Ja; Cho, Nariya; Cha, Joo Hee; Chung, Hye Kyung; Lee, Sin Ho; Cho, Kyung Soo; Kim, Sun Mi; Moon, Woo Kyung

    2005-01-01

    To evaluate the reproducibility of the computer-aided detection (CAD) system for digital mammograms. We applied the CAD system (ImageChecker M1000-DM, version 3.1; R2 Technology) to full-field digital mammograms. These mammograms were taken twice at an interval of 10-45 days (mean: 25 days) for 34 preoperative patients (breast cancer n=27, benign disease n=7; age range: 20-66 years, mean age: 47.9 years). On the mammograms, lesions were visible in 19 patients and were depicted as 15 masses and 12 calcification clusters. We analyzed the sensitivity, the false positive rate (FPR) and the reproducibility of the CAD marks. The broader sensitivities of the CAD system were 80% (12 of 15) and 67% (10 of 15) for masses, and 100% (12 of 12) for calcification clusters. The strict sensitivities were 50% (15 of 30) and 50% (15 of 30) for masses and 92% (22 of 24) and 79% (19 of 24) for the clusters. The FPR for the masses was 0.21-0.22/image, the FPR for the clusters was 0.03-0.04/image and the total FPR was 0.24-0.26/image. Among the 132 mammography images, 59% (78 of 132) were identical regardless of the existence of CAD marks, and 22% (15 of 69) of the images with CAD marks were identical. The reproducibility of the CAD marks for the true positive masses was 67% (12 of 18) and 71% (17 of 24) for the true positive clusters. The reproducibility of the CAD marks for the false positive masses was 8% (4 of 53), and the reproducibility of the CAD marks for the false positive clusters was 14% (1 of 7). The reproducibility of the total mass marks was 23% (16 of 71), and the reproducibility of the total cluster marks was 58% (18 of 31). The CAD system showed higher sensitivity and reproducibility of CAD marks for the calcification clusters, which are related to breast cancer. Yet the overall reproducibility of CAD marks was low; therefore, the CAD system must be applied considering this limitation.

  9. A similarity measure method combining location feature for mammogram retrieval.

    Science.gov (United States)

    Wang, Zhiqiong; Xin, Junchang; Huang, Yukun; Li, Chen; Xu, Ling; Li, Yang; Zhang, Hao; Gu, Huizi; Qian, Wei

    2018-05-28

    Breast cancer, the most common malignancy among women, has a high mortality rate in clinical practice. Early detection, diagnosis and treatment can reduce the mortalities of breast cancer greatly. The method of mammogram retrieval can help doctors to find the early breast lesions effectively and determine a reasonable feature set for image similarity measure. This will improve the accuracy effectively for mammogram retrieval. This paper proposes a similarity measure method combining location feature for mammogram retrieval. Firstly, the images are pre-processed, the regions of interest are detected and the lesions are segmented in order to get the center point and radius of the lesions. Then, the method, namely Coherent Point Drift, is used for image registration with the pre-defined standard image. The center point and radius of the lesions after registration are obtained and the standard location feature of the image is constructed. This standard location feature can help figure out the location similarity between the image pair from the query image to each dataset image in the database. Next, the content feature of the image is extracted, including the Histogram of Oriented Gradients, the Edge Direction Histogram, the Local Binary Pattern and the Gray Level Histogram, and the image pair content similarity can be calculated using the Earth Mover's Distance. Finally, the location similarity and content similarity are fused to form the image fusion similarity, and the specified number of the most similar images can be returned according to it. In the experiment, 440 mammograms, which are from Chinese women in Northeast China, are used as the database. When fusing 40% lesion location feature similarity and 60% content feature similarity, the results have obvious advantages. At this time, precision is 0.83, recall is 0.76, comprehensive indicator is 0.79, satisfaction is 96.0%, mean is 4.2 and variance is 17.7. 
The results show that the precision and recall of this
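
    The fusion step can be sketched as below: a 1-D Earth Mover's Distance on content histograms (for 1-D histograms the EMD reduces to the L1 distance between the CDFs), a toy location similarity from lesion centre and radius, and the 40%/60% weighting reported in the experiment. The similarity forms themselves are illustrative assumptions:

```python
import numpy as np

def emd_1d(h1, h2):
    """Earth Mover's Distance between normalized 1-D histograms; in 1-D
    it reduces to the L1 distance between the two CDFs."""
    return float(np.abs(np.cumsum(h1) - np.cumsum(h2)).sum())

def location_similarity(c1, r1, c2, r2):
    """Toy location similarity from lesion-centre distance and radius
    difference after registration (the paper's exact form may differ)."""
    d = np.linalg.norm(np.subtract(c1, c2))
    return 1.0 / (1.0 + d + abs(r1 - r2))

def fused_similarity(loc_sim, content_dist, w_loc=0.4):
    """Weighted fusion: 40% location / 60% content, as in the experiment."""
    content_sim = 1.0 / (1.0 + content_dist)
    return w_loc * loc_sim + (1.0 - w_loc) * content_sim

query_hist = np.array([0.1, 0.2, 0.4, 0.2, 0.1])   # toy content feature
cand_hist = np.array([0.2, 0.3, 0.3, 0.1, 0.1])
score = fused_similarity(
    location_similarity((10, 12), 4.0, (11, 12), 3.5),
    emd_1d(query_hist, cand_hist),
)
```

    Ranking the database by this fused score and returning the top matches is the retrieval step; the 40/60 split is the weighting the authors found to work best.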

  10. The Effect of Breast Implants on Mammogram Outcomes.

    Science.gov (United States)

    Kam, Kelli; Lee, Esther; Pairawan, Seyed; Anderson, Kendra; Cora, Cherie; Bae, Won; Senthil, Maheswari; Solomon, Naveenraj; Lum, Sharon

    2015-10-01

    Breast cancer detection in women with implants has been questioned. We sought to evaluate the impact of breast implants on mammographic outcomes. A retrospective review of women undergoing mammography between March 1 and October 30, 2013 was performed. Demographic characteristics and mammogram results were compared between women with and without breast implants. Overall, 4.8 per cent of 1863 women identified during the study period had breast implants. Median age was 59 years (26-93). Women with implants were younger (53.9 vs 59.2 years) and more likely to have dense breast tissue (72.1% vs 56.4%, P = 0.004) than those without. There were no statistically significant differences with regards to Breast Imaging Reporting and Data System 0 score (13.3% with implants vs 21.4% without), call back exam (18.9% with vs 24.1% without), time to resolution of abnormal imaging (58.6 days with vs 43.3 without), or cancer detection rate (0% with implants vs 1.0% without). Because implants did not significantly affect mammogram results, women with implants should be reassured that mammography remains useful in detecting cancer. However, future research is required to determine whether lower call back rates and longer time to resolution of imaging findings contribute to delays in diagnosis in patients with implants.

  11. Ameliorating mammograms by using novel image processing algorithms

    Science.gov (United States)

    Pillai, A.; Kwartowitz, D.

    2014-03-01

    Mammography is one of the most important tools for the early detection of breast cancer typically through detection of characteristic masses and/or micro calcifications. Digital mammography has become commonplace in recent years. High quality mammogram images are large in size, providing high-resolution data. Estimates of the false negative rate for cancers in mammography are approximately 10%-30%. This may be due to observation error, but more frequently it is because the cancer is hidden by other dense tissue in the breast and even after retrospective review of the mammogram, cannot be seen. In this study, we report on the results of novel image processing algorithms that will enhance the images providing decision support to reading physicians. Techniques such as Butterworth high pass filtering and Gabor filters will be applied to enhance images; followed by segmentation of the region of interest (ROI). Subsequently, the textural features will be extracted from the ROI, which will be used to classify the ROIs as either masses or non-masses. Among the statistical methods most used for the characterization of textures, the co-occurrence matrix makes it possible to determine the frequency of appearance of two pixels separated by a distance, at an angle from the horizontal. This matrix contains a very large amount of information that is complex. Therefore, it is not used directly but through measurements known as indices of texture such as average, variance, energy, contrast, correlation, normalized correlation and entropy.
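
    The co-occurrence indices named above (energy, contrast, correlation, entropy) can be computed from a normalized GLCM as sketched here for horizontally adjacent pixel pairs; quantization to 8 grey levels is an arbitrary choice for illustration:

```python
import numpy as np

def glcm(image, levels=8):
    """Normalized grey-level co-occurrence matrix for horizontally
    adjacent pixel pairs (distance 1, angle 0)."""
    q = np.clip((image / image.max() * levels).astype(int), 0, levels - 1)
    m = np.zeros((levels, levels))
    np.add.at(m, (q[:, :-1].ravel(), q[:, 1:].ravel()), 1)
    return m / m.sum()

def texture_indices(p):
    """Classic texture indices computed from a normalized GLCM."""
    i, j = np.indices(p.shape)
    contrast = ((i - j) ** 2 * p).sum()
    energy = (p ** 2).sum()
    entropy = -(p[p > 0] * np.log2(p[p > 0])).sum()
    mu_i, mu_j = (i * p).sum(), (j * p).sum()
    sd_i = np.sqrt((((i - mu_i) ** 2) * p).sum())
    sd_j = np.sqrt((((j - mu_j) ** 2) * p).sum())
    correlation = (((i - mu_i) * (j - mu_j)) * p).sum() / (sd_i * sd_j)
    return {"contrast": contrast, "energy": energy,
            "entropy": entropy, "correlation": correlation}

rng = np.random.default_rng(5)
roi = rng.random((16, 16))           # stand-in for a mammogram ROI
indices = texture_indices(glcm(roi))
```

    A full system would average these indices over several pair distances and angles; masses and normal tissue then separate in this reduced index space rather than in the raw co-occurrence matrix.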

  12. Mammographer personality traits – elements of the optimal mammogram experience

    Directory of Open Access Journals (Sweden)

    Amanda Louw

    2014-11-01

    Objectives: The aim of this study was to investigate some of the factors that influence patients' perceptions. Patients' perceptions of, and preferences regarding, the personality traits of mammographers are discussed in this article. Method: In this descriptive, exploratory study, a non-probability convenience sampling method was used to gather data from 274 mammogram patients at four clinical training centres in Gauteng by means of a questionnaire. The respondents had to rate the importance of 24 personality traits of mammographers. Validity, reliability, credibility and ethical considerations were taken into account. Results: Of all the questionnaires, 91% were returned. The data were interpreted using descriptive statistics and factor analysis, and four factors were identified from the personality scale. Conclusion: It appears that patients judge mammographers by the confidence they inspire, the care they provide, how safe they make patients feel, and how well they communicate. Since the mammographer-patient relationship strongly influences patients' impressions of mammograms, these four factors can be regarded as fundamental elements of an optimal mammogram examination.

  13. Parameter optimization of a computer-aided diagnosis scheme for the segmentation of microcalcification clusters in mammograms

    International Nuclear Information System (INIS)

    Gavrielides, Marios A.; Lo, Joseph Y.; Floyd, Carey E. Jr.

    2002-01-01

    Our purpose in this study is to develop a parameter optimization technique for the segmentation of suspicious microcalcification clusters in digitized mammograms. In previous work, a computer-aided diagnosis (CAD) scheme was developed that used local histogram analysis of overlapping subimages and a fuzzy rule-based classifier to segment individual microcalcifications, and clustering analysis for reducing the number of false positive clusters. The performance of this previous CAD scheme depended on a large number of parameters such as the intervals used to calculate fuzzy membership values and on the combination of membership values used by each decision rule. These parameters were optimized empirically based on the performance of the algorithm on the training set. In order to overcome the limitations of manual training and rule generation, the segmentation algorithm was modified in order to incorporate automatic parameter optimization. For the segmentation of individual microcalcifications, the new algorithm used a neural network with fuzzy-scaled inputs. The fuzzy-scaled inputs were created by processing the histogram features with a family of membership functions, the parameters of which were automatically extracted from the distribution of the feature values. The neural network was trained to classify feature vectors as either positive or negative. Individual microcalcifications were segmented from positive subimages. After clustering, another neural network was trained to eliminate false positive clusters. A database of 98 images provided training and testing sets to optimize the parameters and evaluate the CAD scheme, respectively. The performance of the algorithm was evaluated with a FROC analysis. At a sensitivity rate of 93.2%, there was an average of 0.8 false positive clusters per image. The results are very comparable with those taken using our previously published rule-based method. However, the new algorithm is more suited to generalize its
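
    The automatic parameter extraction can be sketched as deriving membership-function breakpoints from the feature distribution itself, so no manual tuning is needed. The percentiles and ramp shape below are illustrative stand-ins for the statistics the authors extract:

```python
import numpy as np

def membership_params(feature_values):
    """Derive membership breakpoints automatically from the observed
    feature distribution (percentiles stand in for whatever statistics
    the authors extract)."""
    lo, hi = np.percentile(feature_values, [10, 90])
    return lo, hi

def fuzzy_scale(x, lo, hi):
    """Ramp membership for 'high': 0 at/below lo, 1 at/above hi."""
    return float(np.clip((x - lo) / (hi - lo), 0.0, 1.0))

rng = np.random.default_rng(9)
contrast_feature = rng.normal(100.0, 15.0, size=500)   # invented feature
lo, hi = membership_params(contrast_feature)
scaled_input = fuzzy_scale(120.0, lo, hi)   # fuzzy-scaled input in [0, 1]
```

    Feeding such fuzzy-scaled values into a neural network, instead of hand-chosen intervals and hand-written rules, is what replaces the empirical tuning of the earlier rule-based scheme.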

  14. Distributed Algorithms for Time Optimal Reachability Analysis

    DEFF Research Database (Denmark)

    Zhang, Zhengkui; Nielsen, Brian; Larsen, Kim Guldstrand

    2016-01-01

    Time optimal reachability analysis is a novel model-based technique for solving scheduling and planning problems. After modeling them as reachability problems using timed automata, a real-time model checker can compute the fastest trace to the goal states, which constitutes a time optimal schedule. We propose distributed computing to accelerate time optimal reachability analysis. We develop five distributed state exploration algorithms and implement them in Uppaal, enabling it to exploit the compute resources of a dedicated model-checking cluster. We experimentally evaluate the implemented algorithms with four models in terms of their ability to compute near- or proven-optimal solutions, their scalability, their time and memory consumption, and their communication overhead. Our results show that the distributed algorithms work much faster than sequential algorithms and have good speedup in general.

  15. Computer aided detection of clusters of microcalcifications on full field digital mammograms

    International Nuclear Information System (INIS)

    Ge Jun; Sahiner, Berkman; Hadjiiski, Lubomir M.; Chan, H.-P.; Wei Jun; Helvie, Mark A.; Zhou Chuan

    2006-01-01

    We are developing a computer-aided detection (CAD) system to identify microcalcification clusters (MCCs) automatically on full field digital mammograms (FFDMs). The CAD system includes six stages: preprocessing; image enhancement; segmentation of microcalcification candidates; false positive (FP) reduction for individual microcalcifications; regional clustering; and FP reduction for clustered microcalcifications. At the stage of FP reduction for individual microcalcifications, a truncated sum-of-squares error function was used to improve the efficiency and robustness of the training of an artificial neural network in our CAD system for FFDMs. At the stage of FP reduction for clustered microcalcifications, morphological features and features derived from the artificial neural network outputs were extracted from each cluster. Stepwise linear discriminant analysis (LDA) was used to select the features. An LDA classifier was then used to differentiate clustered microcalcifications from FPs. A data set of 96 cases with 192 images was collected at the University of Michigan. This data set contained 96 MCCs, of which 28 clusters were proven by biopsy to be malignant and 68 were proven to be benign. The data set was separated into two independent data sets for training and testing of the CAD system in a cross-validation scheme. When one data set was used to train and validate the convolution neural network (CNN) in our CAD system, the other data set was used to evaluate the detection performance. With the use of a truncated error metric, the training of CNN could be accelerated and the classification performance was improved. The CNN in combination with an LDA classifier could substantially reduce FPs with a small tradeoff in sensitivity. By using the free-response receiver operating characteristic methodology, it was found that our CAD system can achieve a cluster-based sensitivity of 70, 80, and 90 % at 0.21, 0.61, and 1.49 FPs/image, respectively. For case

  16. Performance optimisations for distributed analysis in ALICE

    CERN Document Server

    Betev, L; Gheata, M; Grigoras, C; Hristov, P

    2014-01-01

    Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlations or resonances studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system has still to be improved by a...

  17. A new approach to develop computer-aided detection schemes of digital mammograms

    Science.gov (United States)

    Tan, Maxine; Qian, Wei; Pu, Jiantao; Liu, Hong; Zheng, Bin

    2015-06-01

    The purpose of this study is to develop a new global mammographic image feature analysis based computer-aided detection (CAD) scheme and evaluate its performance in detecting positive screening mammography examinations. A dataset that includes images acquired from 1896 full-field digital mammography (FFDM) screening examinations was used in this study. Among them, 812 cases were positive for cancer and 1084 were negative or benign. After segmenting the breast area, a computerized scheme was applied to compute 92 global mammographic tissue density based features on each of four mammograms of the craniocaudal (CC) and mediolateral oblique (MLO) views. After adding three existing popular risk factors (woman’s age, subjectively rated mammographic density, and family breast cancer history) into the initial feature pool, we applied a sequential forward floating selection feature selection algorithm to select relevant features from the bilateral CC and MLO view images separately. The selected CC and MLO view image features were used to train two artificial neural networks (ANNs). The results were then fused by a third ANN to build a two-stage classifier to predict the likelihood of the FFDM screening examination being positive. CAD performance was tested using a ten-fold cross-validation method. The computed area under the receiver operating characteristic curve was AUC = 0.779 ± 0.025 and the odds ratio monotonically increased from 1 to 31.55 as CAD-generated detection scores increased. The study demonstrated that this new global image feature based CAD scheme had a relatively higher discriminatory power to cue the FFDM examinations with high risk of being positive, which may provide a new CAD-cueing method to assist radiologists in reading and interpreting screening mammograms.

  18. Exposure parameters of mammograms with and without mass lesions from a South African breast care centre

    International Nuclear Information System (INIS)

    Acho, Sussan N.; Boonzaier, Willem P. E.; Nel, Ina F.

    2017-01-01

    In South African breast care centres, full-field digital mammography units provide breast imaging services to symptomatic and asymptomatic women simultaneously. This study evaluated the technical exposure parameters of 800 mammograms of which 100 mammograms had obvious mass lesions in the fibro-glandular tissue. The average breast compression force of mammograms with mass lesions in the fibro-glandular tissue was 18.4% less than the average breast compression force of mammograms without mass lesions. The average mean glandular dose (MGD), tube potential (kVp) and compressed breast thickness (CBT) values were 2.14 mGy, 30.5 kVp and 63.9 mm, respectively, for mammograms with mass lesions, and 1.45 mGy, 29.6 kVp and 56.9 mm, respectively, for mammograms without mass lesions. Overall, the average MGD and mean CBT of mammograms with mass lesions were significantly higher compared to those without mass lesions (p < 0.05), although there was no significant difference in their tube potentials (p > 0.05). (authors)

  19. Effect on sensitivity and specificity of mammography screening with or without comparison of old mammograms

    International Nuclear Information System (INIS)

    Thurfjell, M.G.; Vitak, B.; Azavedo, E.; Svane, G.; Thurfjell, E.

    2000-01-01

    In order to evaluate the effect of old mammograms on the specificity and sensitivity of radiologists in mammography screening, one hundred and fifty sets of screening mammograms were examined by 3 experienced screeners twice: once without and once in comparison with older mammograms. The films came from a population-based screening done during the first half of 1994 and comprised all 35 cancers detected during screening in 1994, 12/24 interval cancers, 14/34 cancers detected in the following screening and 89 normal mammograms. Without old mammograms, the screeners detected an average of 40.3 cancers (range 37-42), with a specificity of 87% (85-88%). With old mammograms, the screeners detected 37.7 cancers (range 34-42) with a specificity of 96% (94-99%). The change in detection rate was not significant. However, the increase in specificity was significant for each screener. Mammography screening with old mammograms available for comparison decreased the false-positive recall rate. The effect on sensitivity, however, was unclear.
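The reported figures follow directly from simple reader counts. A minimal sketch, with illustrative counts reconstructed from the study set described above (61 cancers in total and 89 normal mammograms; the individual true-positive and true-negative counts below are assumptions for the example, not values from the paper):

```python
def sensitivity(true_positives, total_cancers):
    """Fraction of cancers detected by the reader."""
    return true_positives / total_cancers

def specificity(true_negatives, total_normals):
    """Fraction of normal mammograms correctly passed by the reader."""
    return true_negatives / total_normals

# Illustrative counts for one reading of the 150-case set
# (35 + 12 + 14 = 61 cancers, 89 normal mammograms):
sens = sensitivity(40, 61)   # about 40 of 61 cancers detected
spec = specificity(77, 89)   # about 87% specificity
```

With comparison mammograms available, fewer false positives raise the true-negative count and hence the specificity, at a possible small cost in detected cancers.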

  20. Survival Function Analysis of Planet Size Distribution

    OpenAIRE

    Zeng, Li; Jacobsen, Stein B.; Sasselov, Dimitar D.; Vanderburg, Andrew

    2018-01-01

    Applying the survival function analysis to the planet radius distribution of the Kepler exoplanet candidates, we have identified two natural divisions of planet radius at 4 Earth radii and 10 Earth radii. These divisions place constraints on planet formation and interior structure model. The division at 4 Earth radii separates small exoplanets from large exoplanets above. When combined with the recently-discovered radius gap at 2 Earth radii, it supports the treatment of planets 2-4 Earth rad...
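The empirical survival function used in this kind of analysis can be sketched in a few lines; the toy radius sample below is an assumption for illustration, not the Kepler candidate catalogue used in the study.

```python
import numpy as np

def survival_function(values):
    """Empirical survival function S(x) = fraction of samples strictly
    greater than x, evaluated at the sorted sample points."""
    x = np.sort(np.asarray(values, dtype=float))
    n = x.size
    s = 1.0 - np.arange(1, n + 1) / n
    return x, s

# Toy planet radii in Earth radii (illustrative only):
radii = np.array([1.2, 1.8, 2.5, 3.1, 4.0, 6.5, 9.8, 11.2, 14.0])
x, s = survival_function(radii)
```

Breaks in the slope of log S(R) against log R are what mark natural divisions in the radius distribution, such as the ones reported at 4 and 10 Earth radii.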

  1. A fully-automated software pipeline for integrating breast density and parenchymal texture analysis for digital mammograms: parameter optimization in a case-control breast cancer risk assessment study

    Science.gov (United States)

    Zheng, Yuanjie; Wang, Yan; Keller, Brad M.; Conant, Emily; Gee, James C.; Kontos, Despina

    2013-02-01

    Estimating a woman's risk of breast cancer is becoming increasingly important in clinical practice. Mammographic density, estimated as the percent of dense (PD) tissue area within the breast, has been shown to be a strong risk factor. Studies also support a relationship between mammographic texture and breast cancer risk. We have developed a fully-automated software pipeline for computerized analysis of digital mammography parenchymal patterns by quantitatively measuring both breast density and texture properties. Our pipeline combines advanced computer algorithms of pattern recognition, computer vision, and machine learning and offers a standardized tool for breast cancer risk assessment studies. Different from many existing methods performing parenchymal texture analysis within specific breast subregions, our pipeline extracts texture descriptors for points on a spatial regular lattice and from a surrounding window of each lattice point, to characterize the local mammographic appearance throughout the whole breast. To demonstrate the utility of our pipeline, and optimize its parameters, we perform a case-control study by retrospectively analyzing a total of 472 digital mammography studies. Specifically, we investigate the window size, which is a lattice related parameter, and compare the performance of texture features to that of breast PD in classifying case-control status. Our results suggest that different window sizes may be optimal for raw (12.7 mm2) versus vendor post-processed images (6.3 mm2). We also show that the combination of PD and texture features outperforms PD alone. The improvement is significant (p=0.03) when raw images and window size of 12.7 mm2 are used, having an ROC AUC of 0.66. The combination of PD and our texture features computed from post-processed images with a window size of 6.3 mm2 achieves an ROC AUC of 0.75.

  2. A Single Sided Edge Marking Method for Detecting Pectoral Muscle in Digital Mammograms

    Directory of Open Access Journals (Sweden)

    G. Toz

    2018-02-01

    In the computer-assisted diagnosis of breast cancer, the removal of the pectoral muscle from mammograms is very important. In this study, a new method, called the Single-Sided Edge Marking (SSEM) technique, is proposed for the identification of the pectoral muscle border in mammograms. 60 mammograms from the INbreast database were used to test the proposed method. The results obtained were compared for False Positive Rate, False Negative Rate, and Sensitivity using the ground truth values pre-determined by radiologists for the same images. Accordingly, it has been shown that the proposed method can detect the pectoral muscle border with an average of 95.6% sensitivity.

  3. Distributed analysis in ATLAS using GANGA

    International Nuclear Information System (INIS)

    Elmsheuser, Johannes; Brochu, Frederic; Egede, Ulrik; Reece, Will; Williams, Michael; Gaidioz, Benjamin; Maier, Andrew; Moscicki, Jakub; Vanderster, Daniel; Lee, Hurng-Chun; Pajchel, Katarina; Samset, Bjorn; Slater, Mark; Soroko, Alexander; Cowan, Greig

    2010-01-01

    Distributed data analysis using Grid resources is one of the fundamental applications in high energy physics to be addressed and realized before the start of LHC data taking. The needs to manage the resources are very high. In every experiment up to a thousand physicists will be submitting analysis jobs to the Grid. Appropriate user interfaces and helper applications have to be made available to assure that all users can use the Grid without expertise in Grid technology. These tools enlarge the number of Grid users from a few production administrators to potentially all participating physicists. The GANGA job management system (http://cern.ch/ganga), developed as a common project between the ATLAS and LHCb experiments, provides and integrates these kinds of tools. GANGA provides a simple and consistent way of preparing, organizing and executing analysis tasks within the experiment analysis framework, implemented through a plug-in system. It allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid, hiding Grid technicalities. We will be reporting on the plug-ins and our experiences of distributed data analysis using GANGA within the ATLAS experiment. Support for all Grids presently used by ATLAS, namely the LCG/EGEE, NDGF/NorduGrid, and OSG/PanDA is provided. The integration and interaction with the ATLAS data management system DQ2 into GANGA is a key functionality. An intelligent job brokering is set up by using the job splitting mechanism together with data-set and file location knowledge. The brokering is aided by an automated system that regularly processes test analysis jobs at all ATLAS DQ2 supported sites. Large numbers of analysis jobs can be sent to the locations of data following the ATLAS computing model. GANGA supports amongst other things tasks of user analysis with reconstructed data and small scale production of Monte Carlo data.

  4. Objective Bayesian Analysis of Skew-t Distributions

    KAUST Repository

    BRANCO, MARCIA D'ELIA

    2012-02-27

    We study the Jeffreys prior and its properties for the shape parameter of univariate skew-t distributions with linear and nonlinear Student's t skewing functions. In both cases, we show that the resulting priors for the shape parameter are symmetric around zero and proper. Moreover, we propose a Student's t approximation of the Jeffreys prior that makes an objective Bayesian analysis easy to perform. We carry out a Monte Carlo simulation study that demonstrates an overall better behaviour of the maximum a posteriori estimator compared with the maximum likelihood estimator. We also compare the frequentist coverage of the credible intervals based on the Jeffreys prior and its approximation and show that they are similar. We further discuss location-scale models under scale mixtures of skew-normal distributions and show some conditions for the existence of the posterior distribution and its moments. Finally, we present three numerical examples to illustrate the implications of our results on inference for skew-t distributions. © 2012 Board of the Foundation of the Scandinavian Journal of Statistics.

  5. Uncertainty analysis for secondary energy distributions

    International Nuclear Information System (INIS)

    Gerstl, S.A.W.

    1978-01-01

    In many transport calculations the integral design parameter of interest (response) is determined mainly by secondary particles such as gamma rays from (n,γ) reactions or secondary neutrons from inelastic scattering events or (n,2n) reactions. Standard sensitivity analysis usually allows one to calculate the sensitivities to the production cross sections of such secondaries, but an extended formalism is needed to also obtain the sensitivities to the energy distribution of the generated secondary particles. For a 30-group standard cross-section set, 84% of all non-zero table positions pertain to the description of secondary energy distributions (SED's) and only 16% to the actual reaction cross sections. Therefore, any sensitivity/uncertainty analysis which does not consider the effects of SED's is incomplete and neglects most of the input data. This paper describes how sensitivity profiles for SED's are obtained and used to estimate the uncertainty of an integral response due to uncertainties in these SED's. The detailed theory is documented elsewhere and implemented in the LASL sensitivity code SENSIT. SED sensitivity profiles have proven particularly valuable in cross-section uncertainty analyses for fusion reactors. Even when the production cross sections for secondary neutrons were assumed to be without error, the uncertainties in the energy distribution of these secondaries produced appreciable uncertainties in the calculated tritium breeding rate. However, complete error files for SED's are presently nonexistent. Therefore, methods will be described that allow rough error estimates due to estimated SED uncertainties based on integral SED sensitivities.

  6. Scaling analysis of meteorite shower mass distributions

    DEFF Research Database (Denmark)

    Oddershede, Lene; Meibom, A.; Bohr, Jakob

    1998-01-01

    Meteorite showers are the remains of extraterrestrial objects captured by the gravitational field of the Earth. We have analyzed the mass distributions of fragments from 16 meteorite showers for scaling. The distributions exhibit distinct scaling behavior over several orders of magnitude; the observed scaling exponents vary from shower to shower. Half of the analyzed showers show a single scaling region, while the other half show multiple scaling regimes. Such an analysis can provide knowledge about the fragmentation process and about the original meteoroid. We also suggest comparing the observed scaling exponents to exponents observed in laboratory experiments and discuss the possibility that one can derive insight into the original shapes of the meteoroids.
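A scaling exponent of the kind discussed above can be estimated from a fragment mass sample with a simple rank-based log-log fit. This is an illustrative sketch on synthetic data, assuming a single power-law regime; the paper's showers can have multiple regimes, which this single fit would not capture.

```python
import numpy as np

def scaling_exponent(masses):
    """Estimate a power-law scaling exponent alpha, assuming
    N(>=m) ~ m**(-alpha), by least-squares fit of log(rank)
    against log(mass) over the whole sample."""
    m = np.sort(np.asarray(masses, dtype=float))[::-1]
    rank = np.arange(1, m.size + 1)      # rank_i = number of fragments >= m_i
    slope, _ = np.polyfit(np.log(m), np.log(rank), 1)
    return -slope

# Synthetic fragment masses drawn from a classical Pareto law with
# exponent 1 (np.random.pareto is Lomax, so shift by +1):
rng = np.random.default_rng(1)
masses = rng.pareto(1.0, size=500) + 1.0
alpha = scaling_exponent(masses)
```

For real showers one would fit each scaling region separately and compare the exponents across showers.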

  7. Performance optimisations for distributed analysis in ALICE

    International Nuclear Information System (INIS)

    Betev, L; Gheata, A; Grigoras, C; Hristov, P; Gheata, M

    2014-01-01

    Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlations or resonances studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system has still to be improved by an important factor to satisfy the analysis needs. We have instrumented all analysis jobs with "sensors" collecting comprehensive monitoring information on the job running conditions and performance in order to identify bottlenecks in the data processing flow. These data are collected by the MonALISa-based ALICE Grid monitoring system and are used to steer and improve the job submission and management policy, to identify operational problems in real time and to perform automatic corrective actions. In parallel with an upgrade of our production system we are aiming for low-level improvements related to data format, data management and merging of results to allow for a better performing ALICE analysis.

  8. Area and volumetric density estimation in processed full-field digital mammograms for risk assessment of breast cancer.

    Directory of Open Access Journals (Sweden)

    Abbas Cheddad

    INTRODUCTION: Mammographic density, the white radiopaque part of a mammogram, is a marker of breast cancer risk and mammographic sensitivity. There are several means of measuring mammographic density, among which are area-based and volumetric-based approaches. Current volumetric methods use only unprocessed, raw mammograms, which is a problematic restriction since such raw mammograms are normally not stored. We describe fully automated methods for measuring both area and volumetric mammographic density from processed images. METHODS: The data set used in this study comprises raw and processed images of the same view from 1462 women. We developed two algorithms for processed images, an automated area-based approach (CASAM-Area) and a volumetric-based approach (CASAM-Vol). The latter method was based on training a random forest prediction model with image statistical features as predictors, against a volumetric measure, Volpara, for corresponding raw images. We contrast the three methods, CASAM-Area, CASAM-Vol and Volpara, directly and in terms of association with breast cancer risk and a known genetic variant for mammographic density and breast cancer, rs10995190 in the gene ZNF365. Associations with breast cancer risk were evaluated using images from 47 breast cancer cases and 1011 control subjects. The genetic association analysis was based on 1011 control subjects. RESULTS: All three measures of mammographic density were associated with breast cancer risk and rs10995190 (p0.10 for risk, p>0.03 for rs10995190). CONCLUSIONS: Our results show that it is possible to obtain reliable automated measures of volumetric and area mammographic density from processed digital images. Area and volumetric measures of density on processed digital images performed similarly in terms of risk and genetic association.

  9. Analysis of Jingdong Mall Logistics Distribution Model

    Science.gov (United States)

    Shao, Kang; Cheng, Feng

    In recent years, the development of electronic commerce in our country has picked up pace. The role of logistics has been highlighted, and more and more electronic commerce enterprises are beginning to realize the importance of logistics to the success or failure of the enterprise. In this paper, the author takes Jingdong Mall as an example, performs a SWOT analysis of the current situation of its self-built logistics system, identifies the problems existing in Jingdong Mall's current logistics distribution, and gives appropriate recommendations.

  10. Computer aided monitoring of breast abnormalities in X-ray mammograms

    OpenAIRE

    Selvan, Arul; Saatchi, Reza; Ferris, Christine

    2011-01-01

    X-ray mammography is regarded as the most effective tool for the detection and diagnosis of breast cancer, but the interpretation of mammograms is a difficult and error-prone task. Computer-aided detection (CADe) systems address the problem that radiologists often miss signs of cancers that are retrospectively visible in mammograms. Furthermore, computer-aided diagnosis (CADx) systems assist the radiologist in the classification of mammographic lesions as benign or malignant [1]. This p...

  11. Fibrocystic change of breast : relation with parenchymal pattern on mammogram and fibroadenoma

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Ki Yeol; Cha, In Ho; Kang, Eun Young; Kim, Jung Hyuk [Korea Univ. College of Medicine, Seoul (Korea, Republic of)

    1996-10-01

    To determine the relationship between fibrocystic change and parenchymal pattern and fibroadenoma on mammograms, mammograms of 135 patients with histologically diagnosed fibrocystic disease after excisional biopsy were retrospectively analyzed and correlated with pathologic specimens. Classification of the parenchymal pattern was based on Wolfe's method. On mammograms, we observed abnormality in 88 of the 135 cases; the 135 cases comprised 70 of DY, 30 of P2, 20 of P1, and 15 of N1 following Wolfe's parenchymal patterns. Among the 88 abnormal cases we observed 37 cases of a mass with clear boundaries, five of a mass with unclear boundaries, 22 with clustered microcalcifications, six with macrocalcifications and 18 with asymmetric dense breast. Histologic examination revealed a varying composition of stromal fibrosis, epithelial hyperplasia, cyst formation, apocrine metaplasia, etc. Histologically confirmed fibroadenomatoid change in 18 cases appeared as a radiopaque mass on mammogram; in all except three of these the mass was well defined. Fibrocystic disease was prevalent in Wolfe's P2 and DY patterns (about 80%). About 40% of fibrocystic changes appearing as a well-defined mass on mammogram showed fibroadenomatoid change histologically and were difficult to differentiate from fibroadenoma. Fibrocystic disease should therefore be included in the differential diagnosis of a mass which appears well-defined on mammogram.

  12. Fibrocystic change of breast : relation with parenchymal pattern on mammogram and fibroadenoma

    International Nuclear Information System (INIS)

    Lee, Ki Yeol; Cha, In Ho; Kang, Eun Young; Kim, Jung Hyuk

    1996-01-01

    To determine the relationship between fibrocystic change and parenchymal pattern and fibroadenoma on mammograms, mammograms of 135 patients with histologically diagnosed fibrocystic disease after excisional biopsy were retrospectively analyzed and correlated with pathologic specimens. Classification of the parenchymal pattern was based on Wolfe's method. On mammograms, we observed abnormality in 88 of the 135 cases; the 135 cases comprised 70 of DY, 30 of P2, 20 of P1, and 15 of N1 following Wolfe's parenchymal patterns. Among the 88 abnormal cases we observed 37 cases of a mass with clear boundaries, five of a mass with unclear boundaries, 22 with clustered microcalcifications, six with macrocalcifications and 18 with asymmetric dense breast. Histologic examination revealed a varying composition of stromal fibrosis, epithelial hyperplasia, cyst formation, apocrine metaplasia, etc. Histologically confirmed fibroadenomatoid change in 18 cases appeared as a radiopaque mass on mammogram; in all except three of these the mass was well defined. Fibrocystic disease was prevalent in Wolfe's P2 and DY patterns (about 80%). About 40% of fibrocystic changes appearing as a well-defined mass on mammogram showed fibroadenomatoid change histologically and were difficult to differentiate from fibroadenoma. Fibrocystic disease should therefore be included in the differential diagnosis of a mass which appears well-defined on mammogram.

  13. Volumetric breast density estimation from full-field digital mammograms.

    Science.gov (United States)

    van Engeland, Saskia; Snoeren, Peter R; Huisman, Henkjan; Boetes, Carla; Karssemeijer, Nico

    2006-03-01

    A method is presented for estimation of dense breast tissue volume from mammograms obtained with full-field digital mammography (FFDM). The thickness of dense tissue mapping to a pixel is determined by using a physical model of image acquisition. This model is based on the assumption that the breast is composed of two types of tissue, fat and parenchyma. Effective linear attenuation coefficients of these tissues are derived from empirical data as a function of tube voltage (kVp), anode material, filtration, and compressed breast thickness. By employing these, tissue composition at a given pixel is computed after performing breast thickness compensation, using a reference value for fatty tissue determined by the maximum pixel value in the breast tissue projection. Validation has been performed using 22 FFDM cases acquired with a GE Senographe 2000D by comparing the volume estimates with volumes obtained by semi-automatic segmentation of breast magnetic resonance imaging (MRI) data. The correlation between MRI and mammography volumes was 0.94 on a per image basis and 0.97 on a per patient basis. Using the dense tissue volumes from MRI data as the gold standard, the average relative error of the volume estimates was 13.6%.
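The two-tissue attenuation model underlying this estimation can be sketched as follows. The attenuation coefficients and the per-pixel quantities below are illustrative assumptions; the paper derives its effective coefficients empirically as functions of kVp, anode material, filtration, and compressed breast thickness.

```python
import numpy as np

# Hypothetical effective linear attenuation coefficients (1/cm) for
# fat and dense (parenchymal) tissue; illustrative values only.
MU_FAT, MU_DENSE = 0.46, 0.80

def dense_thickness(log_signal, log_signal_fat, breast_thickness_cm):
    """Dense-tissue thickness (cm) at a pixel under the two-tissue model.

    log_signal     : -ln(transmitted intensity) at the pixel
    log_signal_fat : same quantity at a reference pixel assumed to be
                     entirely fatty tissue (here the model gives
                     log_signal - log_signal_fat = (mu_d - mu_f) * h_d)
    """
    h = (log_signal - log_signal_fat) / (MU_DENSE - MU_FAT)
    return np.clip(h, 0.0, breast_thickness_cm)

# Example: a 5 cm compressed breast, pixel whose extra attenuation
# corresponds to 0.85 cm of dense tissue above the fat reference:
h = dense_thickness(MU_FAT * 5.0 + (MU_DENSE - MU_FAT) * 0.85,
                    MU_FAT * 5.0, 5.0)
```

Summing the per-pixel thickness over the breast region and multiplying by the pixel area yields the dense tissue volume compared against MRI.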

  14. Evaluation of hybrids algorithms for mass detection in digitalized mammograms

    International Nuclear Information System (INIS)

    Cordero, Jose; Garzon Reyes, Johnson

    2011-01-01

    Breast cancer remains a significant public health problem; early detection of lesions can increase the success of medical treatments. Mammography is an effective imaging modality for early diagnosis of abnormalities, in which an image of the mammary gland is obtained with low-dose X-rays. It allows detection of a tumor or circumscribed mass two to three years before it becomes clinically palpable, and it is the only method that has so far achieved a reduction in mortality from breast cancer. In this paper, three hybrid algorithms for circumscribed mass detection in digitized mammograms are evaluated. The first stage corresponds to a review of the enhancement and segmentation techniques used in the processing of mammographic images. Afterwards, a shape filtering was applied to the resulting regions. The surviving regions were processed by means of a Bayesian filter, where the characteristics vector for the classifier was constructed with few measurements. Later, the implemented algorithms were evaluated by ROC curves, where 40 images were taken for the test: 20 normal images and 20 images with circumscribed lesions. Finally, the advantages and disadvantages of every algorithm in the correct detection of a lesion are discussed.

  15. Quantifying the effect of colorization enhancement on mammogram images

    Science.gov (United States)

    Wojnicki, Paul J.; Uyeda, Elizabeth; Micheli-Tzanakou, Evangelia

    2002-04-01

    Current radiological displays provide only grayscale images of mammograms. The limitation of the image space to grayscale provides only luminance differences and textures as cues for object recognition within the image. However, color can be an important and significant cue in the detection of shapes and objects. Increasing detection ability allows the radiologist to interpret the images in more detail, improving object recognition and diagnostic accuracy. Color detection experiments using our stimulus system have demonstrated that an observer can detect only an average of 140 levels of grayscale. An optimally colorized image can allow a user to distinguish 250-1000 different levels, hence increasing potential image feature detection by 2-7 times. By implementing a colorization map which follows the luminance map of the original grayscale image, the luminance profile is preserved and color is isolated as the enhancement mechanism. The effect of this enhancement mechanism on the shape, frequency composition and statistical characteristics of the Visual Evoked Potential (VEP) is analyzed and presented. Thus, the effectiveness of the image colorization is measured quantitatively using the VEP.
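A luminance-preserving colorization map can be sketched as follows. This is an illustrative construction, not the map used in the study: each pixel is the neutral gray plus a zero-luma chromatic offset, so the Rec. 601 luma of the output equals the input gray level exactly while color carries the extra distinguishable levels.

```python
import numpy as np

LUMA = np.array([0.299, 0.587, 0.114])       # Rec. 601 luma weights
# A zero-luma chromatic direction: adding it changes color, not luma.
CHROMA_DIR = np.array([1.0, -LUMA[0] / LUMA[1], 0.0])

def colorize(gray, strength=0.8):
    """Colorize grayscale values in [0, 1] while preserving luminance.

    Output per pixel = (g, g, g) + k * CHROMA_DIR, with k chosen so all
    channels stay inside [0, 1]; since CHROMA_DIR has zero luma, the
    luma of every output pixel equals the input gray level.
    (Illustrative map; the paper's colorization map may differ.)
    """
    g = np.asarray(gray, dtype=float)
    # Largest chroma amplitude keeping R <= 1 and G >= 0:
    k = strength * np.minimum(1.0 - g, g * LUMA[1] / LUMA[0])
    return g[..., None] + k[..., None] * CHROMA_DIR
```

Because the luminance profile is untouched, any change in the measured VEP can be attributed to the added color rather than to brightness differences.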

  16. iPixel: a visual content-based and semantic search engine for retrieving digitized mammograms by using collective intelligence.

    Science.gov (United States)

    Alor-Hernández, Giner; Pérez-Gallardo, Yuliana; Posada-Gómez, Rubén; Cortes-Robles, Guillermo; Rodríguez-González, Alejandro; Aguilar-Laserre, Alberto A

    2012-09-01

    Nowadays, traditional search engines such as Google, Yahoo and Bing facilitate the retrieval of information in the format of images, but the results are not always useful for the users. This is mainly due to two problems: (1) the semantic keywords are not taken into consideration and (2) it is not always possible to establish a query using the image features. This issue has been covered in different domains in order to develop content-based image retrieval (CBIR) systems. The expert community has focussed their attention on the healthcare domain, where a lot of visual information for medical analysis is available. This paper provides a solution called iPixel Visual Search Engine, which involves semantics and content issues in order to search for digitized mammograms. iPixel offers the possibility of retrieving mammogram features using collective intelligence and implementing a CBIR algorithm. Our proposal compares not only features with similar semantic meaning, but also visual features. In this sense, the comparisons are made in different ways: by the number of regions per image, by maximum and minimum size of regions per image and by average intensity level of each region. iPixel Visual Search Engine supports the medical community in differential diagnoses related to the diseases of the breast. The iPixel Visual Search Engine has been validated by experts in the healthcare domain, such as radiologists, in addition to experts in digital image analysis.

  17. Reduction of false positives in the detection of architectural distortion in mammograms by using a geometrically constrained phase portrait model

    International Nuclear Information System (INIS)

    Ayres, Fabio J.; Rangayyan, Rangaraj M.

    2007-01-01

    Objective One of the commonly missed signs of breast cancer is architectural distortion. We have developed techniques for the detection of architectural distortion in mammograms, based on the analysis of oriented texture through the application of Gabor filters and a linear phase portrait model. In this paper, we propose constraining the shape of the general phase portrait model as a means to reduce the false-positive rate in the detection of architectural distortion. Material and methods The methods were tested with one set of 19 cases of architectural distortion and 41 normal mammograms, and with another set of 37 cases of architectural distortion. Results Sensitivity rates of 84% with 4.5 false positives per image and 81% with 10 false positives per image were obtained for the two sets of images. Conclusion The adoption of a constrained phase portrait model with a symmetric matrix and the incorporation of its condition number in the analysis resulted in a reduction in the false-positive rate in the detection of architectural distortion. The proposed techniques, dedicated to the detection and localization of architectural distortion, should lead to efficient detection of early signs of breast cancer. (orig.)
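
    The shape constraint can be illustrated in a few lines: the fitted phase-portrait matrix is projected onto the symmetric matrices, and its condition number is computed as an additional feature. This is a sketch of the idea, not the published code; the example matrix is invented.

```python
import numpy as np

def symmetrize(A):
    """Project a fitted phase-portrait matrix onto the symmetric
    matrices - the shape constraint used to reject spurious fits."""
    A = np.asarray(A, dtype=float)
    return 0.5 * (A + A.T)

def condition_number(A):
    """2-norm condition number of the constrained (symmetric) matrix,
    usable as an extra feature in the false-positive analysis."""
    return np.linalg.cond(symmetrize(A))

A = np.array([[2.0, 1.0],
              [0.0, 1.0]])   # an unconstrained least-squares fit
As = symmetrize(A)           # the symmetric projection
```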

  18. CMS distributed data analysis with CRAB3

    Science.gov (United States)

    Mascheroni, M.; Balcas, J.; Belforte, S.; Bockelman, B. P.; Hernandez, J. M.; Ciangottini, D.; Konstantinov, P. B.; Silva, J. M. D.; Ali, M. A. B. M.; Melo, A. M.; Riahi, H.; Tanasijczuk, A. J.; Yusli, M. N. B.; Wolf, M.; Woodard, A. E.; Vaandering, E.

    2015-12-01

    The CMS Remote Analysis Builder (CRAB) is a distributed workflow management tool which facilitates analysis tasks by isolating users from the technical details of the Grid infrastructure. Throughout LHC Run 1, CRAB has been successfully employed by an average of 350 distinct users each week executing about 200,000 jobs per day. CRAB has been significantly upgraded in order to face the new challenges posed by LHC Run 2. Components of the new system include 1) a lightweight client, 2) a central primary server which communicates with the clients through a REST interface, 3) secondary servers which manage user analysis tasks and submit jobs to the CMS resource provisioning system, and 4) a central service to asynchronously move user data from temporary storage in the execution site to the desired storage location. The new system improves the robustness, scalability and sustainability of the service. Here we provide an overview of the new system, operation, and user support, report on its current status, and identify lessons learned from the commissioning phase and production roll-out.

  19. Buffered Communication Analysis in Distributed Multiparty Sessions

    Science.gov (United States)

    Deniélou, Pierre-Malo; Yoshida, Nobuko

    Many communication-centred systems today rely on asynchronous messaging among distributed peers to make efficient use of parallel execution and resource access. With such asynchrony, the communication buffers can grow without bound over time. This paper proposes a static verification methodology based on multiparty session types which can efficiently compute upper bounds on buffer sizes. Our analysis relies on a uniform causality audit of the entire collaboration pattern - an examination that is not always possible from each end-point type. We extend this method to design algorithms that allocate communication channels in order to optimise the memory requirements of session executions. From these analyses, we propose two refinement methods which respect buffer bounds: a global protocol refinement that automatically inserts confirmation messages to guarantee stipulated buffer sizes, and a local protocol refinement to optimise asynchronous messaging without buffer overflow. Finally, our work is applied to overcome a buffer overflow problem of the multi-buffering algorithm.

  20. Node-based analysis of species distributions

    DEFF Research Database (Denmark)

    Borregaard, Michael Krabbe; Rahbek, Carsten; Fjeldså, Jon

    2014-01-01

    overrepresentation score (SOS) and the geographic node divergence (GND) score, which together combine ecological and evolutionary patterns into a single framework and avoids many of the problems that characterize community phylogenetic methods in current use.This approach goes through each node in the phylogeny...... with case studies on two groups with well-described biogeographical histories: a local-scale community data set of hummingbirds in the North Andes, and a large-scale data set of the distribution of all species of New World flycatchers. The node-based analysis of these two groups generates a set...... of intuitively interpretable patterns that are consistent with current biogeographical knowledge.Importantly, the results are statistically tractable, opening many possibilities for their use in analyses of evolutionary, historical and spatial patterns of species diversity. The method is implemented...

  1. Probabilistic analysis of flaw distribution on structure under cyclic load

    International Nuclear Information System (INIS)

    Kwak, Sang Log; Choi, Young Hwan; Kim, Hho Jung

    2003-01-01

    Flaw geometries, applied stress, and material properties are major input variables for fracture mechanics analysis. A probabilistic approach can be applied to account for the uncertainties in these input variables, but it requires many assumptions due to the lack of data on initial flaw distributions. In this study, correlations are examined between initial flaw distributions and in-service flaw distributions on structures under cyclic load. For the analysis, LEFM theories and Monte Carlo simulation are applied. The results show that in-service flaw distributions are determined by the initial flaw distributions rather than by the fatigue crack growth rate, so the initial flaw distribution can be derived from in-service flaw distributions
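
    A Monte Carlo pass of the kind described, growing a sampled initial flaw population under cyclic load with the Paris law from LEFM, can be sketched as follows. All constants and the lognormal initial distribution are illustrative assumptions, not the study's values.

```python
import numpy as np

def grow_flaws(a0, cycles, C=1e-11, m=3.0, delta_sigma=100.0):
    """Propagate a sample of initial flaw depths through `cycles` load
    cycles with the Paris law da/dN = C * (dK)^m, where the stress
    intensity range is dK = delta_sigma * sqrt(pi * a)."""
    a = np.asarray(a0, dtype=float).copy()
    for _ in range(cycles):
        dK = delta_sigma * np.sqrt(np.pi * a)
        a += C * dK ** m
    return a

# Sample an (assumed) lognormal initial flaw-depth distribution [m],
# then compare it with the simulated in-service distribution.
rng = np.random.default_rng(0)
initial = rng.lognormal(mean=-5.0, sigma=0.5, size=1000)
final = grow_flaws(initial, cycles=200)
```

Comparing histograms of `initial` and `final` is the kind of correlation check the abstract refers to: the in-service distribution largely mirrors the initial one.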

  2. Mammogram classification scheme using 2D-discrete wavelet and local binary pattern for detection of breast cancer

    Science.gov (United States)

    Adi Putra, Januar

    2018-04-01

    In this paper, we propose a new mammogram classification scheme to classify breast tissue as normal or abnormal. A feature matrix is generated by applying the Local Binary Pattern to all the detail coefficients from the 2D-DWT of the region of interest (ROI) of a mammogram. Feature selection is then used to reduce the dimensionality of the data and discard irrelevant features: the F-test and T-test are applied to the feature extraction results to select the relevant features. The best features are fed to a Neural Network classifier for classification. In this research we use the MIAS and DDSM databases. In addition to the suggested scheme, competing schemes are also simulated for comparative analysis. It is observed that the proposed scheme performs better in terms of accuracy, specificity and sensitivity. Based on the experiments, the proposed scheme can produce a high accuracy of 92.71%, while the lowest accuracy obtained is 77.08%.
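
    The Local Binary Pattern step can be sketched as a basic 3x3, 8-neighbour LBP. The paper applies it to DWT detail coefficients; here a raw toy patch is used purely for illustration, and the bit ordering is an arbitrary choice.

```python
import numpy as np

def lbp_3x3(img):
    """Basic 8-neighbour Local Binary Pattern for interior pixels.

    Each neighbour >= centre contributes one bit; the resulting 8-bit
    codes are the texture descriptors whose histogram would form the
    feature vector."""
    img = np.asarray(img, dtype=float)
    c = img[1:-1, 1:-1]
    # Neighbours enumerated clockwise from the top-left corner.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(c, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        n = img[1 + dy: img.shape[0] - 1 + dy,
                1 + dx: img.shape[1] - 1 + dx]
        code |= (n >= c).astype(np.uint8) << bit
    return code

patch = np.array([[5, 4, 3],
                  [6, 4, 2],
                  [7, 8, 1]])
codes = lbp_3x3(patch)
```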

  3. Performance of computer-aided detection in false-negative screening mammograms of breast cancers

    International Nuclear Information System (INIS)

    Han, Boo Kyung; Kim, Ji Young; Shin, Jung Hee; Choe, Yeon Hyeon

    2004-01-01

    To analyze retrospectively the abnormalities visible on the false-negative screening mammograms of patients with breast cancer, and to determine the performance of computer-aided detection (CAD) in the detection of cancers. Of 108 consecutive cases of breast cancer diagnosed over a period of 6 years for which previous screening mammograms were available, 32 retrospectively visible abnormalities (at whose locations cancer later developed) were found in previous mammograms that had originally been reported as negative. These 32 patients ranged in age from 38 to 72 years (mean 52 years). We analyzed their previous mammographic findings and assessed the ability of CAD to mark cancers in the previous mammograms, according to the clinical presentation, the type of abnormality and the mammographic parenchymal density. In these 32 previous mammograms of breast cancers (20 asymptomatic, 12 symptomatic), the retrospectively visible abnormalities were identified as densities in 22, calcifications in 8, and densities with calcifications in 2. CAD marked abnormalities in 20 (63%) of the 32 cancers with false-negative screening mammograms: 14 (70%) of the 20 subsequent screening-detected cancers, 5 (50%) of the 10 interval cancers, and 1 (50%) of the 2 cancers palpable after the screening interval. CAD marked 12 (50%) of the 24 densities and 9 (90%) of the 10 calcifications. CAD marked abnormalities in 7 (50%) of the 14 predominantly fatty breasts and 13 (72%) of the 18 dense breasts. CAD-assisted diagnosis could potentially decrease the number of false-negative mammograms caused by failure to recognize the cancer in the screening program, although its usefulness in the prevention of interval cancers appears to be limited

  4. Clinical images evaluation of mammograms: a national survey

    International Nuclear Information System (INIS)

    Moon, Woo Kyung; Kim, Tae Jung; Cha, Joo Hee

    2003-01-01

    The goal of this study was to survey the overall quality of mammographic images in Korea. A total of 598 mammographic images collected from 257 hospitals nationwide were reviewed in terms of eight image quality categories, namely positioning, compression, contrast, exposure, sharpness, noise, artifacts, and examination identification, and rated on a five-point scale (1=severe deficiency, 2=major deficiency, 3=minor deficiency, 4=good, 5=best). Failure was defined as the occurrence of more than four major deficiencies or one severe deficiency (score of 1 or 2). The results were compared among hospitals of varying kinds, and common problems in clinical image quality were identified. Two hundred and seventeen mammographic images (36.3%) failed the evaluation. Poor images were found, in descending order of frequency, at The Society for Medical Examination (33/69, 47.8%), non-radiology clinics (42/88, 47.7%), general hospitals (92/216, 42.6%), radiology clinics (39/102, 38.2%), and university hospitals (11/123, 8.9%) (p<0.01, Chi-square test). Among the 598 images, serious problems were related to positioning in 23.7% of instances (n=142) (p<0.01, Chi-square test), examination identification in 5.7% (n=34), exposure in 5.4% (n=32), contrast in 4.2% (n=25), sharpness in 2.7% (n=16), compression in 2.5% (n=15), artifacts in 2.5% (n=15), and noise in 0.3% (n=2). This study showed that in Korea, 36.3% of the mammograms examined in this sampling had important image-related defects that might have led to serious errors in patient management. The failure rate was significantly higher at non-radiology clinics and The Society for Medical Examination than at university hospitals
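
    Reading the stated failure criterion literally, the pass/fail rule can be expressed as a small function; the exact interpretation of "more than four major deficiencies" is taken at face value here, which is an assumption.

```python
def fails_evaluation(scores):
    """Apply the survey's failure rule to the eight category scores
    (1 = severe deficiency, 2 = major, 3 = minor, 4 = good, 5 = best):
    fail on more than four major deficiencies or any severe one."""
    severe = sum(1 for s in scores if s == 1)
    major = sum(1 for s in scores if s == 2)
    return severe >= 1 or major > 4

one_severe = fails_evaluation([1, 5, 5, 5, 5, 5, 5, 5])   # fails
five_major = fails_evaluation([2, 2, 2, 2, 2, 4, 4, 4])   # fails
four_major = fails_evaluation([2, 2, 2, 2, 4, 4, 4, 4])   # passes
```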

  5. DETECTION OF MICROCALCIFICATION IN DIGITAL MAMMOGRAMS USING ONE DIMENSIONAL WAVELET TRANSFORM

    Directory of Open Access Journals (Sweden)

    T. Balakumaran

    2010-11-01

    Full Text Available Mammography is the most efficient method for early detection of breast cancer. Clusters of microcalcifications are an early sign of breast cancer, and their detection is the key to improving prognosis. Microcalcifications appear in a mammogram image as tiny localized granular points, which are often difficult to detect by the naked eye because of their small size. Automatic and accurate detection of microcalcifications has received much attention from radiologists and physicians. An efficient approach to automatic detection of clustered microcalcifications in digitized mammograms is the use of Computer Aided Diagnosis (CAD) systems. This paper presents a one-dimensional wavelet-based multiscale products scheme for microcalcification detection in mammogram images. The detection of microcalcifications was achieved by decomposing each line of the mammogram by a 1D wavelet transform into different frequency sub-bands, suppressing the low-frequency sub-band, and finally reconstructing the mammogram from the sub-bands containing only significant high-frequency features. The significant features are obtained by multiscale products. Preliminary results indicate that the proposed scheme is better at suppressing the background and detecting microcalcification clusters than other wavelet decomposition methods.
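
    The multiscale-product idea can be sketched with a simple a-trous-style decomposition, where a box filter stands in for the paper's wavelet filters; this is therefore an assumption-laden illustration, not the published scheme. A small singularity reinforces across scales under the pointwise product, while smooth background does not.

```python
import numpy as np

def detail(signal, scale):
    """Detail coefficients from a smoothing difference,
    d_j = smooth_{j-1} - smooth_j, with a box filter of width 2**j."""
    s = np.asarray(signal, dtype=float)

    def smooth(x, w):
        if w <= 1:
            return x
        return np.convolve(x, np.ones(w) / w, mode="same")

    return smooth(s, 2 ** (scale - 1)) - smooth(s, 2 ** scale)

def multiscale_product(signal, scales=(1, 2)):
    """Pointwise product of detail signals across scales: isolated
    spikes (microcalcification-like) survive, noise is attenuated."""
    p = np.ones(len(signal))
    for j in scales:
        p *= detail(signal, j)
    return p

line = np.zeros(64)
line[32] = 10.0          # a tiny bright spot on one mammogram scan line
p = multiscale_product(line)
```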

  6. Diagnostic abilities of three CAD methods for assessing microcalcifications in mammograms and an aspect of equivocal cases decisions by radiologists

    International Nuclear Information System (INIS)

    Hung, W.T.; Nguyen, H.T.; Thornton, B.S.; Rickard, M.T.; Blinowska, A.

    2003-01-01

    Radiologists use an 'Overall impression' rating to assess a suspicious region on a mammogram. The value ranges from 1 to 5. They will definitely send a patient for biopsy if the rating is 4 or 5. They will send the patient for core biopsy when a rating of 3 (indeterminate) is given. We have developed three methods to aid diagnosis of cases with microcalcifications. The first two methods, namely, Bayesian and multiple logistic regression (with a special 'cutting score' technique), utilise six parameter ratings which minimise subjectivity in characterising the microcalcifications. The third method uses three parameters (age of patient, uniformity of size of microcalcification and their distribution) in a multiple stepwise regression. For both training set and test set, all three methods are as good as the two radiologists in terms of percentages of correct classification. Therefore, all three proposed methods potentially can be used as second readers. Copyright (2003) Australasian College of Physical Scientists and Engineers in Medicine

  7. Development of pair distribution function analysis

    International Nuclear Information System (INIS)

    Vondreele, R.; Billinge, S.; Kwei, G.; Lawson, A.

    1996-01-01

    This is the final report of a 3-year LDRD project at LANL. It has become more and more evident that structural coherence in the CuO_2 planes of high-T_c superconducting materials over some intermediate length scale (nm range) is important to superconductivity. In recent years, the pair distribution function (PDF) analysis of powder diffraction data has been developed for extracting structural information on these length scales. This project sought to expand and develop this technique, use it to analyze neutron powder diffraction data, and apply it to problems. In particular, our interest is in the area of high-T_c superconductors, although we planned to extend the study to the closely related perovskite ferroelectric materials and other materials where the local structure affects the properties and where detailed knowledge of the local and intermediate-range structure is important. In addition, we planned to carry out single crystal experiments to look for diffuse scattering. This information augments the information from the PDF

  8. Study on Compression Induced Contrast in X-ray Mammograms Using Breast Mimicking Phantoms

    Directory of Open Access Journals (Sweden)

    A. B. M. Aowlad Hossain

    2015-09-01

    Full Text Available X-ray mammography is commonly used to screen for cancers or tumors in the breast using low-dose x-rays. But mammograms suffer from a low-contrast problem. The breast is compressed in mammography to reduce x-ray scattering effects. As tumors are stiffer than normal tissues, they undergo smaller deformation under compression; therefore, the image intensity at the tumor region may change less than in the background tissues. In this study, we try to extract compression-induced contrast from multiple mammographic images of tumorous breast phantoms taken under different compressions. This work extends our previous simulation study with experiments and further analysis. We have used FEM models for a synthetic phantom and constructed a physical phantom using agar and n-propanol for the simulation and experiment. X-ray images of the deformed phantoms were obtained under three compression steps, and a non-rigid registration technique was applied to register these images. It is observed that the image intensity changes at the tumor are noticeably smaller than those in the surroundings, which induces a detectable contrast. Adding this compression-induced contrast to the simulated and experimental images improved their original contrast by a factor of about 1.4

  9. An automatic system to discriminate malignant from benign massive lesions on mammograms

    International Nuclear Information System (INIS)

    Retico, A.; Delogu, P.; Fantacci, M.E.; Kasae, P.

    2006-01-01

    Mammography is widely recognized as the most reliable technique for early detection of breast cancers. Automated or semi-automated computerized classification schemes can be very useful in assisting radiologists with a second opinion about the visual diagnosis of breast lesions, thus leading to a reduction in the number of unnecessary biopsies. We present a computer-aided diagnosis (CADi) system for the characterization of massive lesions in mammograms, whose aim is to distinguish malignant from benign masses. The CADi system we realized is based on a three-stage algorithm: (a) a segmentation technique extracts the contours of the massive lesion from the image; (b) 16 features based on size and shape of the lesion are computed; (c) a neural classifier merges the features into an estimated likelihood of malignancy. A data set of 226 massive lesions (109 malignant and 117 benign) has been used in this study. The system performances have been evaluated in terms of receiver-operating characteristic (ROC) analysis, obtaining A_z = 0.80 ± 0.04 as the estimated area under the ROC curve
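
    The reported A_z can be estimated nonparametrically as the probability that a malignant case outscores a benign one, which is equivalent to the Mann-Whitney statistic. The scores below are invented for illustration, and the ± uncertainty is not computed here.

```python
import numpy as np

def az_score(malignant_scores, benign_scores):
    """Nonparametric estimate of the area under the ROC curve (A_z):
    the fraction of (malignant, benign) pairs where the malignant case
    scores higher, counting ties as one half."""
    m = np.asarray(malignant_scores, dtype=float)
    b = np.asarray(benign_scores, dtype=float)
    wins = (m[:, None] > b[None, :]).sum()
    ties = (m[:, None] == b[None, :]).sum()
    return (wins + 0.5 * ties) / (m.size * b.size)

az = az_score([0.9, 0.8, 0.6, 0.4], [0.7, 0.5, 0.3, 0.2])
```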

  10. Corroded scale analysis from water distribution pipes

    Directory of Open Access Journals (Sweden)

    Rajaković-Ognjanović Vladana N.

    2011-01-01

    Full Text Available The subject of this study was the steel pipes that are part of Belgrade's drinking water supply network. In order to investigate the mutual effects of corrosion and water quality, the corrosion scales on the pipes were analyzed. The idea was to improve control of corrosion processes and prevent the impact of corrosion on water quality degradation. The instrumental methods used for corrosion scale characterization were: scanning electron microscopy (SEM) for the investigation of the corrosion scales on the analyzed sample surfaces, X-ray diffraction (XRD) for the analysis of the presence of solid forms inside scales, scanning electron microscopy (SEM) for the microstructural analysis of the corroded scales, and the BET adsorption isotherm for surface area determination. Depending on the composition of water next to the pipe surface, corrosion of iron results in the formation of different compounds and solid phases. The composition and structure of the iron scales in drinking water distribution pipes depend on the type of the metal and the composition of the aqueous phase. Their formation is probably governed by several factors that include water quality parameters such as pH, alkalinity, buffer intensity, natural organic matter (NOM) concentration, and dissolved oxygen (DO) concentration. Factors such as water flow patterns, seasonal fluctuations in temperature, and microbiological activity, as well as water treatment practices such as the application of corrosion inhibitors, can also influence corrosion scale formation and growth. Therefore, the corrosion scales found in iron and steel pipes are expected to have unique features for each site. Compounds that are found in iron corrosion scales often include goethite, lepidocrocite, magnetite, hematite, ferrous oxide, siderite, ferrous hydroxide, ferric hydroxide, ferrihydrite, calcium carbonate and green rusts.
Iron scales have characteristic features that include: corroded floor, porous core that contains

  11. RELIABILITY ANALYSIS OF POWER DISTRIBUTION SYSTEMS

    Directory of Open Access Journals (Sweden)

    Popescu V.S.

    2012-04-01

    Full Text Available Power distribution systems are basic parts of power systems, and the reliability of these systems is at present a key issue for power engineering development and requires special attention. Operation of distribution systems is accompanied by a number of random factors that produce a large number of unplanned interruptions. Research has shown that the predominant factors that have a significant influence on the reliability of distribution systems are: weather conditions (39.7%), defects in equipment (25%) and unknown random factors (20.1%). The article studies the influence of this random behaviour and presents reliability estimates for predominantly rural electrical distribution systems.

  12. A retrospective study of the performance of radiographers in interpreting screening mammograms

    International Nuclear Information System (INIS)

    Moran, S.; Warren-Forward, H.

    2011-01-01

    Purpose: This paper provides data on the continued success of radiographers in reviewing mammograms with similar accuracy to screen readers. Method: The participants consisted of 7 radiographers and 2 current official screen readers. Two hundred and fifty sets of mammograms from 2003 were used in this study. Each participant reviewed each set of mammograms and classified it as Rescreen or Recall. Patient outcomes were assessed by following up the results of any histology or pathology tests in 2003 or the 2005/2006 screening results. Results: The screen readers' sensitivities ranged from 79% to 93% and their specificities from 82% to 84%. The radiographers' values ranged from 57% to 97% and from 63% to 80%, respectively. Conclusion: The sensitivity and specificity values attained by some radiographers were equivalent to those of both screen readers. The accuracy rates of the radiographers suggest that screen reading by selected and appropriately trained radiographers should be achievable in Australia.

  13. Computer aided system for segmentation and visualization of microcalcifications in digital mammograms

    International Nuclear Information System (INIS)

    Reljin, B.; Reljin, I.; Milosevic, Z.; Stojic, T.

    2009-01-01

    Two methods for segmentation and visualization of microcalcifications in digital or digitized mammograms are described. The first method is based on modern mathematical morphology, while the second uses a multifractal approach. In the first method, an appropriate combination of morphological operations yields high local contrast enhancement, followed by significant suppression of background tissue, irrespective of its radiological density. By an iterative procedure, this method strongly emphasizes only small bright details, the possible microcalcifications. In the multifractal approach, corresponding multifractal 'images' are created from the initial mammogram, from which a radiologist has the freedom to change the level of segmentation. An appropriate user-friendly computer-aided visualization (CAV) system with the two methods embedded has been realized. The interactive approach enables the physician to control the level and the quality of segmentation. The suggested methods were tested on mammograms from the MIAS database as a gold standard and on images from clinical practice, using digitized films and digital images from a full-field digital mammography unit. (authors)
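
    The morphological route can be illustrated with a white top-hat, the classic combination of operations for enhancing small bright details while suppressing background tissue. This is a numpy-only sketch on a toy patch, not the authors' exact operator combination.

```python
import numpy as np

def _windows(img, size):
    """All size x size shifted views of an edge-padded image."""
    pad = size // 2
    p = np.pad(img, pad, mode="edge")
    return [p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
            for dy in range(size) for dx in range(size)]

def grey_erode(img, size=3):
    """Minimum filter over a square structuring element."""
    return np.min(_windows(img, size), axis=0)

def grey_dilate(img, size=3):
    """Maximum filter over a square structuring element."""
    return np.max(_windows(img, size), axis=0)

def top_hat(img, size=3):
    """White top-hat: image minus its opening. Bright details narrower
    than the structuring element survive; the background is removed."""
    opened = grey_dilate(grey_erode(img, size), size)
    return img - opened

tissue = np.full((7, 7), 10.0)
tissue[3, 3] = 25.0      # a candidate microcalcification
residue = top_hat(tissue)
```

Iterating such enhancement steps at increasing scales is the spirit of the procedure the abstract describes.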

  14. Staging of breast cancer and the advanced applications of digital mammogram: what the physician needs to know?

    Science.gov (United States)

    Helal, Maha H; Mansour, Sahar M; Zaglol, Mai; Salaleldin, Lamia A; Nada, Omniya M; Haggag, Marwa A

    2017-03-01

    To study the role of advanced applications of digital mammogram, whether contrast-enhanced spectral mammography (CESM) or digital breast tomosynthesis (DBT), in the "T" staging of histologically proven breast cancer before planning for treatment management. In this prospective analysis, we evaluated 98 proven malignant breast masses regarding their size, multiplicity and the presence of associated clusters of microcalcifications. Evaluation methods included digital mammography (DM), 3D tomosynthesis and CESM. Traditional DM was performed first; then, within a 10-14-day interval, breast tomosynthesis and contrast-based mammography were performed for the involved breast only. Views at tomosynthesis were acquired in a "step-and-shoot" tube motion mode to produce multiple (11-15) low-dose images, and in the contrast-enhanced study, low-energy (22-33 kVp) and high-energy (44-49 kVp) exposures were taken after the i.v. injection of the contrast agent. Operative data were the gold standard reference. Breast tomosynthesis showed higher accuracy in size assessment (n = 69, 70.4%) than contrast-enhanced (n = 49, 50%) and regular mammography (n = 59, 60.2%). Contrast-enhanced mammography presented the least performance in assessing calcifications, yet it was the most sensitive in the detection of multiplicity (92.3%), followed by tomosynthesis (77%) and regular mammography (53.8%). The combined analysis of the three modalities provided an accuracy of 74% in the "T" staging of breast cancer. The combined application of tomosynthesis and contrast-enhanced digital mammogram enhanced the performance of traditional DM and presented an informative method in the staging of breast cancer. Advances in knowledge: Staging and management planning of breast cancer can differ according to tumour size, multiplicity and the presence of microcalcifications. DBT shows sharp outlines of the tumour with no tissue overlap and spots microcalcifications. Contrast

  15. silicon bipolar distributed oscillator design and analysis

    African Journals Online (AJOL)

    digital and analogue market, wired or wireless is making it necessary to operate ... is generally high; this additional power is supplied by the external dc source. ... distributed oscillator consists of a pair of transmission lines with characteristic ...

  16. Distributed energy store railguns experiment and analysis

    International Nuclear Information System (INIS)

    Holland, L.D.

    1984-01-01

    Electromagnetic acceleration of projectiles holds the potential for achieving higher velocities than yet achieved by any other means. A railgun is the simplest form of electromagnetic macroparticle accelerator and can generate the highest sustained accelerating force. The practical length of conventional railguns is limited by the impedance of the rails because current must be carried along the entire length of the rails. A railgun and power supply system called the distributed energy store railgun was proposed as a solution to this limitation. The distributed energy store railgun used multiple current sources connected to the rails of a railgun at points distributed along the bore. These current sources (energy stores) are turned on in sequence as the projectile moves down the bore so that current is fed to the railgun from behind the armature. In this system the length of the rails that carry the full armature current is less than the total length of the railgun. If a sufficient number of energy stores is used, this removes the limitation on the length of a railgun. An additional feature of distributed energy store type railguns is that they can be designed to maintain a constant pressure on the projectile being accelerated. A distributed energy store railgun was constructed and successfully operated. In addition to this first demonstration of the distributed energy store railgun principle, a theoretical model of the system was also constructed

  17. Web Based Distributed Coastal Image Analysis System, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This project develops a Web-based distributed image analysis system processing the Moderate Resolution Imaging Spectroradiometer (MODIS) data to provide decision...

  18. Analysis of mixed mode microwave distribution manifolds

    International Nuclear Information System (INIS)

    White, T.L.

    1982-09-01

    The 28-GHz microwave distribution manifold used in the ELMO Bumpy Torus-Scale (EBT-S) experiments consists of a toroidal metallic cavity, whose dimensions are much greater than a wavelength, fed by a source of microwave power. Equalization of the mixed mode power distribution to the 24 cavities of EBT-S is accomplished by empirically adjusting the coupling irises which are equally spaced around the manifold. The performance of the manifold to date has been very good, yet no analytical models exist for optimizing manifold transmission efficiency or for scaling this technology to the EBT-P manifold design. The present report develops a general model for mixed mode microwave distribution manifolds based on isotropic plane wave sources of varying amplitudes that are distributed toroidally around the manifold. The calculated manifold transmission efficiency for the most recent EBT-S coupling iris modification is 90%, which agrees with the average measured transmission efficiency. The model also predicts the coupling iris areas required to balance the distribution of microwave power while maximizing transmission efficiency, and losses in the waveguide feeds connecting the irises to the cavities of EBT are calculated using an approach similar to the calculation of manifold losses. The model will be used to evaluate EBT-P manifold designs

  19. Distributed energy store railguns experiment and analysis

    Science.gov (United States)

    Holland, L. D.

    1984-02-01

    Electromagnetic acceleration of projectiles holds the potential for achieving higher velocities than yet achieved by any other means. A railgun is the simplest form of electromagnetic macroparticle accelerator and can generate the highest sustained accelerating force. The practical length of conventional railguns is limited by the impedance of the rails because current must be carried along the entire length of the rails. A railgun and power supply system called the distributed energy store railgun was proposed as a solution to this limitation. A distributed energy storage railgun was constructed and successfully operated. In addition to this demonstration of the distributed energy store railgun principle, a theoretical model of the system was also constructed. A simple simulation of the railgun system based on this model, but ignoring frictional drag, was compared with the experimental results. During the process of comparing results from the simulation and the experiment, the effect of significant frictional drag of the projectile on the sidewalls of the bore was observed.
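The theoretical model mentioned in this abstract lends itself to a very small numerical sketch. The snippet below integrates the frictionless equation of motion F = ½L′I² under a constant armature current; all parameter values (inductance gradient, current, mass, bore length) are illustrative assumptions, not figures from the report.

```python
# Minimal frictionless railgun sketch: constant armature current,
# accelerating force F = 0.5 * L' * I**2, integrated with explicit
# Euler steps. All parameter values are illustrative assumptions.

def simulate_railgun(l_prime=0.4e-6,   # inductance gradient, H/m (assumed)
                     current=300e3,    # armature current, A (assumed)
                     mass=0.01,        # projectile mass, kg (assumed)
                     length=1.0,       # bore length, m (assumed)
                     dt=1e-6):         # time step, s
    """Return (exit_velocity, exit_time) for a projectile accelerated
    along a rail of the given length."""
    force = 0.5 * l_prime * current ** 2   # constant accelerating force, N
    accel = force / mass
    x = v = t = 0.0
    while x < length:
        v += accel * dt
        x += v * dt
        t += dt
    return v, t

v, t = simulate_railgun()
```

With these assumed numbers the force is constant, so the exit velocity approaches sqrt(2·a·length); a finer `dt` trades runtime for accuracy.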

  20. Field distribution analysis in deflecting structures

    Energy Technology Data Exchange (ETDEWEB)

    Paramonov, V.V. [Joint Inst. for Nuclear Research, Moscow (Russian Federation)

    2013-02-15

    Deflecting structures are now used mainly for bunch rotation in emittance exchange concepts, bunch diagnostics and to increase the luminosity. The bunch rotation is a transformation of a particle distribution in the six-dimensional phase space. Together with the expected transformations, deflecting structures introduce distortions due to particularities - aberrations - in the deflecting field distribution. The distributions of deflecting fields are considered with respect to nonlinear additions, which produce emittance deterioration during a transformation. The deflecting field is treated as a combination of hybrid waves HE₁ and HM₁. The criteria for selection and formation of deflecting structures with a minimized level of aberrations are formulated and applied to known structures. Results of the study are confirmed by comparison with results of numerical simulations.

  1. Field distribution analysis in deflecting structures

    International Nuclear Information System (INIS)

    Paramonov, V.V.

    2013-02-01

    Deflecting structures are now used mainly for bunch rotation in emittance exchange concepts, bunch diagnostics and to increase the luminosity. The bunch rotation is a transformation of a particle distribution in the six-dimensional phase space. Together with the expected transformations, deflecting structures introduce distortions due to particularities - aberrations - in the deflecting field distribution. The distributions of deflecting fields are considered with respect to nonlinear additions, which produce emittance deterioration during a transformation. The deflecting field is treated as a combination of hybrid waves HE₁ and HM₁. The criteria for selection and formation of deflecting structures with a minimized level of aberrations are formulated and applied to known structures. Results of the study are confirmed by comparison with results of numerical simulations.

  2. Statistical analysis of partial reduced width distributions

    International Nuclear Information System (INIS)

    Tran Quoc Thuong.

    1973-01-01

    The aim of this study was to develop rigorous methods for analysing experimental event distributions according to a χ² law and to check whether the number of degrees of freedom ν is compatible with the value 1 for the reduced neutron width distribution. Two statistical methods were used (the maximum-likelihood method and the method of moments); it was shown, in a few particular cases, that ν is compatible with 1. The difference between ν and 1, if it exists, should not exceed 3%. These results confirm the validity of the compound nucleus model.
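The method of moments mentioned above has a particularly compact form: for reduced widths normalized to unit mean, a χ² law with ν degrees of freedom has variance 2/ν. The sketch below applies that estimator to a synthetic Porter-Thomas (ν = 1) sample; the data are simulated, not experimental.

```python
import numpy as np

def estimate_dof(widths):
    """Method-of-moments estimate of the chi-square degrees of freedom nu
    for reduced widths: after normalising to unit mean, Var = 2/nu."""
    y = np.asarray(widths, dtype=float)
    y = y / y.mean()
    return 2.0 / y.var()

rng = np.random.default_rng(0)
# Porter-Thomas (nu = 1) sample: squares of standard normal variates
sample = rng.standard_normal(200_000) ** 2
nu_hat = estimate_dof(sample)   # should come out close to 1
```

A maximum-likelihood fit, as used in the study, would weight small widths differently, but for a clean sample both estimators agree closely.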

  3. About normal distribution on SO(3) group in texture analysis

    Science.gov (United States)

    Savyolova, T. I.; Filatov, S. V.

    2017-12-01

    This article studies and compares different normal distributions (NDs) on SO(3) group, which are used in texture analysis. Those NDs are: Fisher normal distribution (FND), Bunge normal distribution (BND), central normal distribution (CND) and wrapped normal distribution (WND). All of the previously mentioned NDs are central functions on SO(3) group. CND is a subcase for normal CLT-motivated distributions on SO(3) (CLT here is Parthasarathy’s central limit theorem). WND is motivated by CLT in R³ and mapped to SO(3) group. A Monte Carlo method for modeling normally distributed values was studied for both CND and WND. All of the NDs mentioned above are used for modeling different components of crystallites orientation distribution function in texture analysis.
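A minimal sketch of the Monte Carlo idea for the WND case: sample an axis-angle vector from a normal distribution in R³ and map it onto SO(3) with Rodrigues' formula. The standard deviation used below is an arbitrary illustrative choice.

```python
import numpy as np

def random_rotation_wrapped_normal(sigma, rng):
    """Draw a rotation matrix by sampling an axis-angle vector from
    N(0, sigma^2 I) in R^3 and mapping it to SO(3) via the matrix
    exponential (Rodrigues' formula)."""
    w = rng.normal(0.0, sigma, 3)          # tangent-space sample in R^3
    theta = np.linalg.norm(w)
    if theta < 1e-12:
        return np.eye(3)                   # identity for a zero vector
    k = w / theta                          # unit rotation axis
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])     # cross-product matrix of k
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

rng = np.random.default_rng(1)
R = random_rotation_wrapped_normal(0.2, rng)   # one random orientation
```

Each draw is a proper rotation (orthogonal, determinant +1), so repeated draws give a cloud of orientations concentrated around the identity, as a wrapped normal on SO(3) should.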

  4. Distributed crack analysis of ceramic inlays

    NARCIS (Netherlands)

    Peters, M.C.R.B.; Vree, de J.H.P.; Brekelmans, W.A.M.

    1993-01-01

    In all-ceramic restorations, crack formation and propagation phenomena are of major concern, since they may result in intra-oral fracture. The objective of this study was calculation of damage in porcelain MOD inlays by utilization of a finite-element (FE) implementation of the distributed crack

  5. Objective Bayesian Analysis of Skew- t Distributions

    KAUST Repository

    BRANCO, MARCIA D'ELIA; GENTON, MARC G.; LISEO, BRUNERO

    2012-01-01

    We study the Jeffreys prior and its properties for the shape parameter of univariate skew-t distributions with linear and nonlinear Student's t skewing functions. In both cases, we show that the resulting priors for the shape parameter are symmetric

  6. Analysis of refrigerant mal-distribution

    DEFF Research Database (Denmark)

    Kærn, Martin Ryhl; Elmegaard, Brian

    2009-01-01

    to be two straight tubes. The refrigerant maldistribution is then induced to the evaporator by varying the vapor quality at the inlet to each tube and the air-flow across each tube. Finally it is shown that mal-distribution can be compensated by an intelligent distributor, that ensures equal superheat...

  7. An abnormal screening mammogram causes more anxiety than a palpable lump in benign breast disease

    NARCIS (Netherlands)

    Keyzer-Dekker, C. M. G.; van Esch, L.; de Vries, J.; Ernst, Marloes; Nieuwenhuijzen, G. A. P.; Roukema, J. A.; van der Steeg, A. F. W.

    Being recalled for further diagnostic procedures after an abnormal screening mammogram (ASM) can evoke a high state anxiety with lowered quality of life (QoL). We examined whether these adverse psychological consequences are found in all women with benign breast disease (BBD) or are particular to

  8. Noise equalization for detection of microcalcification clusters in direct digital mammogram images.

    NARCIS (Netherlands)

    McLoughlin, K.J.; Bones, P.J.; Karssemeijer, N.

    2004-01-01

    Equalizing image noise is shown to be an important step in the automatic detection of microcalcifications in digital mammography. This study extends a well established film-screen noise equalization scheme developed by Veldkamp et al. for application to full-field digital mammogram (FFDM) images. A

  9. New Embedded Denotes Fuzzy C-Mean Application for Breast Cancer Density Segmentation in Digital Mammograms

    Science.gov (United States)

    Othman, Khairulnizam; Ahmad, Afandi

    2016-11-01

    In this research we explore the application of the proposed normalized technique, embedded in an advanced fast c-means algorithm, to the problem of finding the segments of different breast tissue regions in mammograms. The goal of the segmentation algorithm is to see whether the new denoted fuzzy c-means algorithm can separate the different densities of the different breast patterns. The new density segmentation is applied with multi-selection of seed labels to provide the hard constraint, where the seed labels are user defined. The new denoted fuzzy c-means has been explored on images of various imaging modalities, but not yet on large-format digital mammograms. Therefore, this project mainly focuses on using the proposed normalized technique within fuzzy c-means to perform segmentation that increases the visibility of different breast densities in mammography images. Segmentation of the mammogram into different mammographic densities is useful for risk assessment and for quantitative evaluation of density changes. Our proposed methodology for segmenting mammograms into density-based categories has been tested on the MIAS database and the Trueta database.
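For orientation, a plain (unmodified) fuzzy c-means iteration on pixel intensities looks like the sketch below; the paper's normalized, seed-constrained variant adds further steps, so this is only the baseline algorithm, with synthetic two-density data standing in for a mammogram.

```python
import numpy as np

def fuzzy_c_means(pixels, c=2, m=2.0, iters=100, seed=0):
    """Baseline fuzzy c-means on a 1-D array of pixel intensities.
    Returns (centers, membership); membership columns sum to 1."""
    rng = np.random.default_rng(seed)
    x = np.asarray(pixels, float).ravel()
    u = rng.random((c, x.size))
    u /= u.sum(axis=0)                        # normalise memberships
    for _ in range(iters):
        um = u ** m                           # fuzzified memberships
        centers = (um @ x) / um.sum(axis=1)   # weighted cluster means
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12
        u = d ** (-2.0 / (m - 1.0))           # standard FCM update
        u /= u.sum(axis=0)
    return centers, u

# Synthetic "fatty vs dense" intensities (illustrative, not real data)
rng = np.random.default_rng(4)
pixels = np.concatenate([rng.normal(0.2, 0.05, 500),
                         rng.normal(0.8, 0.05, 500)])
centers, u = fuzzy_c_means(pixels, c=2)
```

On this toy data the two centers converge near 0.2 and 0.8, i.e. the algorithm separates the two simulated density populations.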

  10. Response Time Analysis of Distributed Web Systems Using QPNs

    Directory of Open Access Journals (Sweden)

    Tomasz Rak

    2015-01-01

    Full Text Available A performance model is used for studying distributed Web systems. Performance evaluation is done by obtaining load test measurements. The Queueing Petri Nets formalism supports modeling and performance analysis of distributed World Wide Web environments. The proposed distributed Web system modeling and design methodology has been applied in the evaluation of several system architectures under different external loads. Furthermore, performance analysis is done to determine the system response time.
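Queueing Petri Nets are well beyond a short snippet, but the response-time quantity being modeled can be illustrated with the elementary M/M/1 formula R = 1/(μ − λ); this is a back-of-envelope baseline, not anything from the paper.

```python
def mm1_response_time(arrival_rate, service_rate):
    """Mean response time of an M/M/1 queue, R = 1 / (mu - lambda).
    A far simpler baseline than the Queueing Petri Net models used
    for full distributed Web systems."""
    if arrival_rate >= service_rate:
        raise ValueError("queue is unstable: lambda must be < mu")
    return 1.0 / (service_rate - arrival_rate)

# e.g. 80 requests/s arriving at a server that completes 100/s
# (illustrative numbers): mean response time is 0.05 s.
r = mm1_response_time(80.0, 100.0)
```

The formula makes the key qualitative point of any such analysis visible: response time diverges as the load λ approaches the capacity μ.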

  11. Empirical analysis for Distributed Energy Resources' impact on future distribution network

    DEFF Research Database (Denmark)

    Han, Xue; Sandels, Claes; Zhu, Kun

    2012-01-01

    There has been a large body of statements claiming that the large-scale deployment of Distributed Energy Resources (DERs) will eventually reshape the future distribution grid operation in various ways. Thus, it is interesting to introduce a platform to interpret to what extent the power system...... operation will be altered. In this paper, quantitative results in terms of how the future distribution grid will be changed by the deployment of distributed generation, active demand and electric vehicles, are presented. The analysis is based on the conditions for both a radial and a meshed distribution...... network. The input parameters are based on the current and envisioned DER deployment scenarios proposed for Sweden....

  12. Economic analysis of efficient distribution transformer trends

    Energy Technology Data Exchange (ETDEWEB)

    Downing, D.J.; McConnell, B.W.; Barnes, P.R.; Hadley, S.W.; Van Dyke, J.W.

    1998-03-01

    This report outlines an approach that will account for uncertainty in the development of evaluation factors used to identify transformer designs with the lowest total owning cost (TOC). The TOC methodology is described and the most highly variable parameters are discussed. The model is developed to account for uncertainties as well as statistical distributions for the important parameters. Sample calculations are presented. The TOC methodology is applied to data provided by two utilities in order to test its validity.
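The TOC methodology referred to here is conventionally written TOC = BP + A·NL + B·LL, with BP the bid price and A, B the evaluation factors in $/W of no-load and load loss. The sketch below propagates uncertainty in A and B with a simple Monte Carlo draw; every numeric value is an illustrative assumption, not data from the report.

```python
import numpy as np

def total_owning_cost(bid_price, no_load_loss_w, load_loss_w, a, b):
    """Classic transformer evaluation formula TOC = BP + A*NL + B*LL,
    with A and B in $/W; accepts scalars or NumPy arrays for A and B."""
    return bid_price + a * no_load_loss_w + b * load_loss_w

rng = np.random.default_rng(2)
# Treat the loss-evaluation factors as uncertain (normally distributed,
# illustrative means and spreads):
a = rng.normal(3.0, 0.5, 10_000)   # $/W of no-load loss
b = rng.normal(1.0, 0.2, 10_000)   # $/W of load loss
toc = total_owning_cost(1500.0, 100.0, 300.0, a, b)
mean_toc = toc.mean()              # close to 1500 + 3*100 + 1*300 = 2100
```

Comparing the resulting TOC distributions for two candidate designs, rather than single point values, is exactly the kind of uncertainty-aware comparison the report argues for.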

  13. Nonlinear Progressive Collapse Analysis Including Distributed Plasticity

    Directory of Open Access Journals (Sweden)

    Mohamed Osama Ahmed

    2016-01-01

    Full Text Available This paper demonstrates the effect of incorporating distributed plasticity in nonlinear analytical models used to assess the potential for progressive collapse of steel framed regular building structures. The emphasis of this paper is on the deformation response under the notionally removed column in a typical Alternate Path (AP) method. The AP method employed in this paper is based on the provisions of the Unified Facilities Criteria – Design of Buildings to Resist Progressive Collapse, developed and updated by the U.S. Department of Defense [1]. The AP method is often used to assess the potential for progressive collapse of building structures that fall under Occupancy Category III or IV. A case study steel building is used to examine the effect of incorporating distributed plasticity, where moment frames were used on the perimeter as well as in the interior of the three-dimensional structural system. It is concluded that the use of moment resisting frames within the structural system will enhance resistance to progressive collapse through ductile deformation response and that it is conservative to ignore the effects of distributed plasticity in determining peak displacement response under the notionally removed column.

  14. Patient understanding of the revised USPSTF screening mammogram guidelines: need for development of patient decision aids

    Directory of Open Access Journals (Sweden)

    Allen Summer V

    2012-10-01

    Full Text Available Abstract Background The purpose of the study was to examine patients’ understanding of the revised screening mammogram guidelines released by the United States Preventive Services Task Force (USPSTF in 2009 addressing age at initiation and frequency of screening mammography. Methods Patients from the Departments of Family Medicine, Internal Medicine, and Obstetrics and Gynecology (n = 150 at a tertiary care medical center in the United States completed a survey regarding their understanding of the revised USPSTF guidelines following their release, within four to six months of their scheduled mammogram (March 2010 to May 2010. Results Of the patients surveyed, 97/147 (67% indicated increased confusion regarding the age and frequency of screening mammography, 61/148 (41% reported increased anxiety about mammograms, and 58/146 (40% reported anxiety about their own health status following the release of the revised screening guidelines. Most of the patients surveyed, 111/148 (75%, did not expect to change their timing or frequency of screening mammograms in the future. Conclusion Results from this survey suggested increased confusion and possibly an increase in patients’ anxiety related to screening mammography and their own health status following the release of the revised USPSTF screening mammogram guidelines to the public and subsequent media portrayal of the revised guidelines. Although the study did not specifically address causality for these findings, the results highlight the need for improvements in the communication of guidelines to patients and the public. Development of shared decision-making tools and outcomes should be considered to address the communication challenge.

  15. A Comparison of Distribution Free and Non-Distribution Free Factor Analysis Methods

    Science.gov (United States)

    Ritter, Nicola L.

    2012-01-01

    Many researchers recognize that factor analysis can be conducted on both correlation matrices and variance-covariance matrices. Although most researchers extract factors from non-distribution free or parametric methods, researchers can also extract factors from distribution free or non-parametric methods. The nature of the data dictates the method…

  16. Probabilistic analysis in normal operation of distribution system with distributed generation

    DEFF Research Database (Denmark)

    Villafafila-Robles, R.; Sumper, A.; Bak-Jensen, B.

    2011-01-01

    Nowadays, the incorporation of high levels of small-scale non-dispatchable distributed generation is leading to the transition from the traditional 'vertical' power system structure to a 'horizontally-operated' power system, where the distribution networks contain both stochastic generation...... and load. This fact increases the number of stochastic inputs and dependence structures between them need to be considered. The deterministic analysis is not enough to cope with these issues and a new approach is needed. Probabilistic analysis provides a better approach. Moreover, as distribution systems...

  17. Distributed bearing fault diagnosis based on vibration analysis

    Science.gov (United States)

    Dolenc, Boštjan; Boškoski, Pavle; Juričić, Đani

    2016-01-01

    Distributed bearing faults appear under various circumstances, for example due to electroerosion or the progression of localized faults. Bearings with distributed faults tend to generate more complex vibration patterns than those with localized faults. Despite the frequent occurrence of such faults, their diagnosis has attracted limited attention. This paper examines a method for the diagnosis of distributed bearing faults employing vibration analysis. The vibrational patterns generated are modeled by incorporating the geometrical imperfections of the bearing components. Comparing envelope spectra of vibration signals shows that one can distinguish between localized and distributed faults. Furthermore, a diagnostic procedure for the detection of distributed faults is proposed. This is evaluated on several bearings with naturally born distributed faults, which are compared with fault-free bearings and bearings with localized faults. It is shown experimentally that features extracted from vibrations in fault-free, localized and distributed fault conditions form clearly separable clusters, thus enabling diagnosis.
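The envelope-spectrum comparison described above can be sketched as follows: form the analytic signal, take its magnitude as the envelope, and inspect the spectrum of the envelope for the fault frequency. The synthetic signal (a 3 kHz resonance amplitude-modulated at 120 Hz) is an illustrative stand-in for a measured vibration.

```python
import numpy as np

def envelope_spectrum(x, fs):
    """Envelope spectrum via the analytic signal: zero the negative
    frequencies of the FFT, take the magnitude of the inverse FFT as
    the envelope, then return the spectrum of the mean-removed envelope."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0                    # Nyquist bin for even n
    env = np.abs(np.fft.ifft(X * h))       # amplitude envelope
    env = env - env.mean()
    spec = np.abs(np.fft.rfft(env)) / n
    freqs = np.fft.rfftfreq(n, 1.0 / fs)
    return freqs, spec

# Synthetic faulty-bearing signal (all values illustrative):
fs = 20_000
t = np.arange(fs) / fs                     # 1 s of samples
x = (1 + 0.8 * np.cos(2 * np.pi * 120 * t)) * np.sin(2 * np.pi * 3000 * t)
freqs, spec = envelope_spectrum(x, fs)
peak_freq = freqs[np.argmax(spec)]         # dominant envelope frequency
```

The raw spectrum of `x` shows only the 3 kHz resonance and its sidebands; the envelope spectrum exposes the 120 Hz modulation directly, which is why envelope analysis is the standard tool for bearing fault frequencies.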

  18. A Modeling Framework for Schedulability Analysis of Distributed Avionics Systems

    DEFF Research Database (Denmark)

    Han, Pujie; Zhai, Zhengjun; Nielsen, Brian

    2018-01-01

    This paper presents a modeling framework for schedulability analysis of distributed integrated modular avionics (DIMA) systems that consist of spatially distributed ARINC-653 modules connected by a unified AFDX network. We model a DIMA system as a set of stopwatch automata (SWA) in UPPAAL...

  19. User-friendly Tool for Power Flow Analysis and Distributed ...

    African Journals Online (AJOL)

    Akorede

    ... greenhouse gas emissions and the current deregulation of electric energy ..... Visual composition and temporal behaviour of GUI.

  20. Residence time distribution software analysis. User's manual

    International Nuclear Information System (INIS)

    1996-01-01

    Radiotracer applications cover a wide range of industrial activities in chemical and metallurgical processes, water treatment, mineral processing, environmental protection and civil engineering. Experiment design, data acquisition, treatment and interpretation are the basic elements of tracer methodology. The application of radiotracers to determine the impulse response as the residence time distribution (RTD), as well as the technical conditions for conducting experiments in industry and in the environment, creates a need for data processing using special software. Important progress has been made during recent years in the preparation of software programs for data treatment and interpretation. The software package developed for industrial process analysis and diagnosis by the stimulus-response methods contains all the methods for data processing for radiotracer experiments.
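One of the basic RTD quantities such a package computes is the mean residence time, the first moment of the tracer response curve. A sketch under the assumption of an ideal mixed tank (exponential response, illustrative time constant):

```python
import numpy as np

def _trapz(y, x):
    """Trapezoid-rule integral of samples y over grid x."""
    dx = x[1:] - x[:-1]
    return float(np.sum(dx * (y[1:] + y[:-1]) / 2.0))

def mean_residence_time(t, c):
    """First moment of a tracer response curve:
    MRT = integral(t*C dt) / integral(C dt)."""
    t = np.asarray(t, float)
    c = np.asarray(c, float)
    return _trapz(t * c, t) / _trapz(c, t)

# Ideal mixed tank with time constant tau = 5 s (assumed):
t = np.linspace(0.0, 100.0, 2001)
c = np.exp(-t / 5.0)                # exponential impulse response
mrt = mean_residence_time(t, c)     # recovers tau, about 5 s
```

Higher moments of the same curve give the variance of the RTD, which is what diagnostic methods use to detect dead zones and bypassing in process vessels.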

  1. A combined approach for the enhancement and segmentation of mammograms using modified fuzzy C-means method in wavelet domain

    OpenAIRE

    Srivastava, Subodh; Sharma, Neeraj; Singh, S. K.; Srivastava, R.

    2014-01-01

    In this paper, a combined approach for enhancement and segmentation of mammograms is proposed. In preprocessing stage, a contrast limited adaptive histogram equalization (CLAHE) method is applied to obtain the better contrast mammograms. After this, the proposed combined methods are applied. In the first step of the proposed approach, a two dimensional (2D) discrete wavelet transform (DWT) is applied to all the input images. In the second step, a proposed nonlinear complex diffusion based uns...

  2. Detecting microcalcifications in mammograms by using SVM method for the diagnostics of breast cancer

    Science.gov (United States)

    Wan, Baikun; Wang, Ruiping; Qi, Hongzhi; Cao, Xuchen

    2005-01-01

    Support vector machine (SVM) is a new statistical learning method. Compared with classical machine learning methods, the SVM learning principle is to minimize the structural risk instead of the empirical risk, and it gives better generalization performance. Because SVM training is a convex quadratic optimization problem, the local optimal solution is certainly the global optimal one. In this paper an SVM algorithm is applied to detect the microcalcifications (MCCs) in mammograms for the diagnosis of breast cancer, which has not been reported before. It was tested with 10 mammograms and the results show that the algorithm can achieve a higher true positive rate in comparison with an artificial neural network (ANN) based on empirical risk minimization, and is valuable for further study and application in clinical engineering.
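As a sketch of the classifier side of such a system, the snippet below trains a tiny linear SVM by sub-gradient descent on the hinge loss; the paper solves the exact quadratic program instead, and the two-feature "patch" data here are synthetic.

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=200, seed=0):
    """Tiny linear SVM trained by per-sample sub-gradient descent on
    lam/2*||w||^2 + hinge loss. y must be in {-1, +1}. A stand-in for
    the quadratic-programming SVM, not the paper's implementation."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        for i in rng.permutation(n):
            if y[i] * (X[i] @ w + b) < 1:          # margin violated
                w = (1 - lr * lam) * w + lr * y[i] * X[i]
                b += lr * y[i]
            else:                                   # only regularize
                w = (1 - lr * lam) * w
    return w, b

# Toy "patch features": MCC-like patches (+1) vs background (-1),
# drawn from two well-separated Gaussian clusters (synthetic data).
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(2.0, 1.0, (50, 2)),
               rng.normal(-2.0, 1.0, (50, 2))])
y = np.array([1] * 50 + [-1] * 50)
w, b = train_linear_svm(X, y)
accuracy = np.mean(np.sign(X @ w + b) == y)
```

Real MCC detectors replace the two toy features with texture and contrast features extracted around candidate bright spots, and typically use a kernelized SVM.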

  3. Bayesian Maximum Entropy Based Algorithm for Digital X-ray Mammogram Processing

    Directory of Open Access Journals (Sweden)

    Radu Mutihac

    2009-06-01

    Full Text Available Basics of Bayesian statistics in inverse problems using the maximum entropy principle are summarized in connection with the restoration of positive, additive images from various types of data like X-ray digital mammograms. An efficient iterative algorithm for image restoration from large data sets, based on the conjugate gradient method and Lagrange multipliers in nonlinear optimization of a specific potential function, was developed. The point spread function of the imaging system was determined by numerical simulations of inhomogeneous breast-like tissue with microcalcification inclusions of various opacities. The processed digital and digitized mammograms proved superior to their raw counterparts in terms of contrast, resolution, noise, and visibility of details.

  4. Statistical Analysis Of Failure Strength Of Material Using Weibull Distribution

    International Nuclear Information System (INIS)

    Entin Hartini; Mike Susmikanti; Antonius Sitompul

    2008-01-01

    In the evaluation of ceramic and glass material strength, a statistical approach is necessary. The strength of ceramic and glass depends on the size distribution of flaws in these materials. The distribution of strength for ductile materials is narrow and close to a Gaussian distribution, while the strength of brittle materials such as ceramic and glass follows a Weibull distribution. The Weibull distribution is an indicator of the failure of material strength resulting from a distribution of flaw sizes. In this paper, the cumulative probability of material strength to failure, the cumulative probability of failure versus fracture stress, and the cumulative probability of reliability of the material were calculated. Statistical criteria supporting the strength analysis of silicon nitride material were computed utilizing MATLAB. (author)
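The Weibull quantities listed in the abstract all derive from the two-parameter cumulative distribution F(s) = 1 − exp(−(s/s₀)^m). A sketch with illustrative (assumed) scale and shape values, not the paper's silicon nitride data:

```python
import numpy as np

def weibull_failure_probability(stress, scale, shape):
    """Two-parameter Weibull CDF F(s) = 1 - exp(-(s/scale)**shape):
    the probability that a specimen fails at or below the given stress."""
    return 1.0 - np.exp(-(np.asarray(stress, float) / scale) ** shape)

# Illustrative brittle-ceramic parameters (assumed):
scale, shape = 700.0, 10.0        # characteristic strength (MPa), modulus

# At the characteristic strength the failure probability is 1 - 1/e:
p_at_scale = weibull_failure_probability(700.0, scale, shape)

# Reliability (survival probability) at a lower service stress:
reliability = 1.0 - weibull_failure_probability(500.0, scale, shape)
```

The shape parameter m (the Weibull modulus) is the key design number: a larger m means a narrower strength distribution and hence more predictable failure, which is exactly what the flaw-size argument in the abstract captures.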

  5. Segmentation of the Breast Region in Digital Mammograms and Detection of Masses

    OpenAIRE

    Armen Sahakyan; Hakop Sarukhanyan

    2012-01-01

    The mammography is the most effective procedure for an early diagnosis of the breast cancer. Finding an accurate and efficient breast region segmentation technique still remains a challenging problem in digital mammography. In this paper we explore an automated technique for mammogram segmentation. The proposed algorithm uses morphological preprocessing algorithm in order to: remove digitization noises and separate background region from the breast profile region for further edge detection an...

  6. Computerized detection of masses on mammograms: A comparative study of two algorithms

    International Nuclear Information System (INIS)

    Tiedeu, A.; Kom, G.; Kom, M.

    2007-02-01

    In this paper, we implement and carry out the comparison of two methods of computer-aided detection of masses on mammograms. The two algorithms each consist of three steps: segmentation, binarization and noise suppression, but use different techniques for each step. A database of 60 images was used to compare the performance of the two algorithms in terms of general detection efficiency and conservation of size and shape of detected masses. (author)

  7. Modeling and analysis of solar distributed generation

    Science.gov (United States)

    Ortiz Rivera, Eduardo Ivan

    power point tracking algorithms. Finally, several PV power applications will be presented, like circuit analysis for a load connected to two different PV arrays, speed control for a dc motor connected to a PVM, and a novel single-phase photovoltaic inverter system using the Z-source converter.

  8. Modelling and analysis of distributed simulation protocols with distributed graph transformation

    OpenAIRE

    Lara, Juan de; Taentzer, Gabriele

    2005-01-01

    J. de Lara, and G. Taentzer, "Modelling and analysis of distributed simulation protocols with distributed graph transformation...

  9. Ethnic differences in social support after initial receipt of an abnormal mammogram.

    Science.gov (United States)

    Molina, Yamile; Hohl, Sarah D; Nguyen, Michelle; Hempstead, Bridgette H; Weatherby, Shauna Rae; Dunbar, Claire; Beresford, Shirley A A; Ceballos, Rachel M

    2016-10-01

    We examine access to and type of social support after initial receipt of an abnormal mammogram across non-Latina White (NLW), African American, and Latina women. This cross-sectional study used a mixed method design, with quantitative and qualitative measures. Women were recruited through 2 community advocates and 3 breast-health-related care organizations. With regard to access, African American women were less likely to access social support relative to NLW counterparts. Similar nonsignificant differences were found for Latinas. Women did not discuss results with family and friends to avoid burdening social networks and negative reactions. Networks' geographic constraints and medical mistrust influenced Latina and African American women's decisions to discuss results. With regard to type of social support, women reported emotional support across ethnicity. Latina and African American women reported more instrumental support, whereas NLW women reported more informational support in the context of their well-being. There are shared and culturally unique aspects of women's experiences with social support after initially receiving an abnormal mammogram. Latina and African American women may particularly benefit from informational support from health care professionals. Communitywide efforts to mitigate mistrust and encourage active communication about cancer may improve ethnic disparities in emotional well-being and diagnostic resolution during initial receipt of an abnormal mammogram. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  10. Quantitative assessment of breast density from digitized mammograms into Tabar's patterns

    Energy Technology Data Exchange (ETDEWEB)

    Jamal, N [Medical Technology Division, Malaysian Institute for Nuclear Technology Research (MINT), 43000 Kajang (Malaysia); Ng, K-H [Department of Radiology, University of Malaya, 50603 Kuala Lumpur (Malaysia); Looi, L-M [Department of Pathology, University of Malaya, 50603 Kuala Lumpur (Malaysia); McLean, D [Medical Physics Department, Westmead Hospital, Sydney, NSW 2145 (Australia); Zulfiqar, A [Department of Radiology, Hospital Universiti Kebangsaan Malaysia, 56000 Kuala Lumpur (Malaysia); Tan, S-P [Department of Radiology, Hospital Universiti Kebangsaan Malaysia, 56000 Kuala Lumpur (Malaysia); Liew, W-F [Department of Radiology, Hospital Universiti Kebangsaan Malaysia, 56000 Kuala Lumpur (Malaysia); Shantini, A [Department of Radiology, Kuala Lumpur Hospital, 50586 Kuala Lumpur (Malaysia); Ranganathan, S [Department of Radiology, University of Malaya, 50603 Kuala Lumpur (Malaysia)

    2006-11-21

    We describe a semi-automated technique for the quantitative assessment of breast density from digitized mammograms in comparison with patterns suggested by Tabar. It was developed using the MATLAB-based graphical user interface applications. It is based on an interactive thresholding method, after a short automated method that shows the fibroglandular tissue area, breast area and breast density each time new thresholds are placed on the image. The breast density is taken as a percentage of the fibroglandular tissue to the breast tissue areas. It was tested in four different ways, namely by examining: (i) correlation of the quantitative assessment results with subjective classification, (ii) classification performance using the quantitative assessment technique, (iii) interobserver agreement and (iv) intraobserver agreement. The results of the quantitative assessment correlated well (r² = 0.92) with the subjective Tabar patterns classified by the radiologist (correctly classified 83% of digitized mammograms). The average kappa coefficient for the agreement between the readers was 0.63. This indicated moderate agreement between the three observers in classifying breast density using the quantitative assessment technique. The kappa coefficient of 0.75 for intraobserver agreement reflected good agreement between two sets of readings. The technique may be useful as a supplement to the radiologist's assessment in classifying mammograms into Tabar's pattern associated with breast cancer risk.

  11. Quantitative assessment of breast density from digitized mammograms into Tabar's patterns

    International Nuclear Information System (INIS)

    Jamal, N; Ng, K-H; Looi, L-M; McLean, D; Zulfiqar, A; Tan, S-P; Liew, W-F; Shantini, A; Ranganathan, S

    2006-01-01

    We describe a semi-automated technique for the quantitative assessment of breast density from digitized mammograms in comparison with patterns suggested by Tabar. It was developed using the MATLAB-based graphical user interface applications. It is based on an interactive thresholding method, after a short automated method that shows the fibroglandular tissue area, breast area and breast density each time new thresholds are placed on the image. The breast density is taken as a percentage of the fibroglandular tissue to the breast tissue areas. It was tested in four different ways, namely by examining: (i) correlation of the quantitative assessment results with subjective classification, (ii) classification performance using the quantitative assessment technique, (iii) interobserver agreement and (iv) intraobserver agreement. The results of the quantitative assessment correlated well (r² = 0.92) with the subjective Tabar patterns classified by the radiologist (correctly classified 83% of digitized mammograms). The average kappa coefficient for the agreement between the readers was 0.63. This indicated moderate agreement between the three observers in classifying breast density using the quantitative assessment technique. The kappa coefficient of 0.75 for intraobserver agreement reflected good agreement between two sets of readings. The technique may be useful as a supplement to the radiologist's assessment in classifying mammograms into Tabar's pattern associated with breast cancer risk.
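The percent-density computation at the heart of this technique reduces to two thresholded area counts: breast-versus-background and dense-versus-fatty tissue. A sketch on a toy image, with user-style thresholds chosen arbitrarily:

```python
import numpy as np

def breast_density_percent(image, breast_thr, dense_thr):
    """Percent density = (fibroglandular area) / (breast area) * 100,
    with both areas defined by intensity thresholds. A sketch of the
    interactive thresholding idea; thresholds would be user-chosen."""
    img = np.asarray(image, float)
    breast = img > breast_thr          # breast tissue vs background
    dense = img > dense_thr            # fibroglandular (dense) tissue
    return 100.0 * dense.sum() / breast.sum()

# Toy image: background 0, fatty tissue 100, dense tissue 200
img = np.zeros((100, 100))
img[10:90, 10:90] = 100.0     # breast region: 6400 pixels
img[30:70, 30:70] = 200.0     # dense region:  1600 pixels
density = breast_density_percent(img, breast_thr=50.0, dense_thr=150.0)
```

On this toy image the dense region is a quarter of the breast region, so the function returns 25.0; on a real digitized mammogram the thresholds are adjusted interactively, which is exactly what the described GUI supports.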

  12. Using x-ray mammograms to assist in microwave breast image interpretation.

    Science.gov (United States)

    Curtis, Charlotte; Frayne, Richard; Fear, Elise

    2012-01-01

    Current clinical breast imaging modalities include ultrasound, magnetic resonance (MR) imaging, and the ubiquitous X-ray mammography. Microwave imaging, which takes advantage of differing electromagnetic properties to obtain image contrast, shows potential as a complementary imaging technique. As an emerging modality, interpretation of 3D microwave images poses a significant challenge. MR images are often used to assist in this task, and X-ray mammograms are readily available. However, X-ray mammograms provide 2D images of a breast under compression, resulting in significant geometric distortion. This paper presents a method to estimate the 3D shape of the breast and locations of regions of interest from standard clinical mammograms. The technique was developed using MR images as the reference 3D shape with the future intention of using microwave images. Twelve breast shapes were estimated and compared to ground truth MR images, resulting in a skin surface estimation accurate to within an average Euclidean distance of 10 mm. The 3D locations of regions of interest were estimated to be within the same clinical area of the breast as corresponding regions seen on MR imaging. These results encourage investigation into the use of mammography as a source of information to assist with microwave image interpretation as well as validation of microwave imaging techniques.

  13. Using X-Ray Mammograms to Assist in Microwave Breast Image Interpretation

    Directory of Open Access Journals (Sweden)

    Charlotte Curtis

    2012-01-01

    Current clinical breast imaging modalities include ultrasound, magnetic resonance (MR) imaging, and the ubiquitous X-ray mammography. Microwave imaging, which takes advantage of differing electromagnetic properties to obtain image contrast, shows potential as a complementary imaging technique. As an emerging modality, interpretation of 3D microwave images poses a significant challenge. MR images are often used to assist in this task, and X-ray mammograms are readily available. However, X-ray mammograms provide 2D images of a breast under compression, resulting in significant geometric distortion. This paper presents a method to estimate the 3D shape of the breast and locations of regions of interest from standard clinical mammograms. The technique was developed using MR images as the reference 3D shape with the future intention of using microwave images. Twelve breast shapes were estimated and compared to ground truth MR images, resulting in a skin surface estimation accurate to within an average Euclidean distance of 10 mm. The 3D locations of regions of interest were estimated to be within the same clinical area of the breast as corresponding regions seen on MR imaging. These results encourage investigation into the use of mammography as a source of information to assist with microwave image interpretation as well as validation of microwave imaging techniques.

  14. Reading screening mammograms – Attitudes among radiologists and radiographers about skill mix

    International Nuclear Information System (INIS)

    Johansen, Lena Westphal; Brodersen, John

    2011-01-01

    Introduction: Because of a shortage of personnel for the Danish mammography screening programme, the aim of this study was to investigate the attitudes of radiologists and radiographers towards a future implementation of radiographers reading screening mammograms. Materials and methods: Seven combined phenomenological and hermeneutical interviews with radiographers and radiologists were performed. Stratified selection was used for sampling of informants. The interviews were analysed against theory about quality, organization and profession. Results: Quality-related possibilities: radiographers routinely measure performance quality, radiographers obtain sufficient reading qualifications, and skill mix improves quality. Quality-related obstacles: radiologists do not routinely measure performance quality. Organization-related possibilities: shortage of radiologists, positive attitudes of managers, and improved working relations. Organization-related obstacles: shortage of radiographers and negative attitudes of managers. Profession-related possibilities: positive experience with skill mix. Profession-related obstacles: worries about negative consequences for the training of radiologists, and resistance against handing over tasks to another profession. Conclusion: Attitudes towards radiographers reading screening mammograms are attached to quality, organisational or professional perspectives. Radiographers are capable of learning to read mammograms at a sufficient performance level, but routine measurement of performance quality is essential. Resistance against skill mix may be caused by an emotionally conditioned fear of losing demarcations. The main motive for skill mix is improvement of the utilization of resources. No evidence was found regarding the organisational and financial consequences of skill mix. Despite this, all radiologists and radiographers experienced with skill mix were strong advocates for reading radiographers.

  15. Computer-aided diagnosis scheme for histological classification of clustered microcalcifications on magnification mammograms

    International Nuclear Information System (INIS)

    Nakayama, Ryohei; Uchiyama, Yoshikazu; Watanabe, Ryoji; Katsuragawa, Shigehiko; Namba, Kiyoshi; Doi, Kunio

    2004-01-01

    The histological classification of clustered microcalcifications on mammograms can be difficult, and thus often requires biopsy or follow-up. Our purpose in this study was to develop a computer-aided diagnosis scheme for identifying the histological classification of clustered microcalcifications on magnification mammograms in order to assist the radiologists' interpretation as a 'second opinion'. Our database consisted of 58 magnification mammograms, which included 35 malignant clustered microcalcifications (9 invasive carcinomas, 12 noninvasive carcinomas of the comedo type, and 14 noninvasive carcinomas of the noncomedo type) and 23 benign clustered microcalcifications (17 mastopathies and 6 fibroadenomas). The histological classifications of all clustered microcalcifications were proved by pathologic diagnosis. The clustered microcalcifications were first segmented by use of a novel filter bank and a thresholding technique. Five objective features of clustered microcalcifications were determined by taking into account the subjective features that experienced radiologists commonly use to identify possible histological classifications. The Bayes decision rule with the five objective features was employed for distinguishing between the five histological classifications. The classification accuracies for distinguishing between the three malignant histological classifications were 77.8% (7/9) for invasive carcinoma, 75.0% (9/12) for noninvasive carcinoma of the comedo type, and 92.9% (13/14) for noninvasive carcinoma of the noncomedo type. The classification accuracies for distinguishing between the two benign histological classifications were 94.1% (16/17) for mastopathy and 100.0% (6/6) for fibroadenoma. This computerized method would be useful in assisting radiologists in their assessments of clustered microcalcifications.

  16. Comparative analysis through probability distributions of a data set

    Science.gov (United States)

    Cristea, Gabriel; Constantinescu, Dan Mihai

    2018-02-01

    In practice, probability distributions are applied in such diverse fields as risk analysis, reliability engineering, chemical engineering, hydrology, image processing, physics, market research, business and economic research, customer support, medicine, sociology and demography. This article highlights important aspects of fitting probability distributions to data and applying the analysis results to make informed decisions. There are a number of statistical methods available which can help us to select the best-fitting model. Some of the graphs display both input data and fitted distributions at the same time, as probability density and cumulative distribution. Goodness-of-fit tests can be used to determine whether a certain distribution is a good fit. The main idea is to measure the "distance" between the data and the tested distribution, and compare that distance to some threshold values. Calculating the goodness-of-fit statistics also enables us to order the fitted distributions according to how well they fit the data. This particular feature is very helpful for comparing the fitted models. The paper presents a comparison of the most commonly used goodness-of-fit tests: Kolmogorov-Smirnov, Anderson-Darling, and Chi-Squared. A large set of data is analyzed and conclusions are drawn by visualizing the data, comparing multiple fitted distributions and selecting the best model. These graphs should be viewed as an addition to the goodness-of-fit tests.
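    The core idea of ranking fitted models by the "distance" between the empirical and fitted distributions can be illustrated with the Kolmogorov-Smirnov statistic. A minimal numpy sketch; the data, candidate models and seed are invented for illustration:

```python
import math
import numpy as np

def ks_statistic(data, cdf):
    """Kolmogorov-Smirnov distance between the empirical CDF and `cdf`."""
    x = np.sort(data)
    n = len(x)
    f = np.array([cdf(v) for v in x])
    ecdf_hi = np.arange(1, n + 1) / n   # ECDF just after each point
    ecdf_lo = np.arange(0, n) / n       # ECDF just before each point
    return max(np.max(ecdf_hi - f), np.max(f - ecdf_lo))

rng = np.random.default_rng(1)
data = rng.normal(5.0, 2.0, 500)

# Candidate 1: normal distribution fitted by moments.
mu, sigma = data.mean(), data.std()
norm_cdf = lambda v: 0.5 * (1 + math.erf((v - mu) / (sigma * math.sqrt(2))))

# Candidate 2: exponential fitted by its mean (a deliberately poor model here).
lam = 1.0 / data.mean()
exp_cdf = lambda v: 1 - math.exp(-lam * v) if v > 0 else 0.0

d_norm = ks_statistic(data, norm_cdf)
d_exp = ks_statistic(data, exp_cdf)
print(f"KS distance: normal={d_norm:.3f}, exponential={d_exp:.3f}")
# The smaller distance identifies the better-fitting model.
```

    Note that when parameters are estimated from the same data, the standard KS critical values are optimistic; dedicated tables (e.g. Lilliefors) or simulation are needed for a formal test.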

  17. Distribution of lod scores in oligogenic linkage analysis.

    Science.gov (United States)

    Williams, J T; North, K E; Martin, L J; Comuzzie, A G; Göring, H H; Blangero, J

    2001-01-01

    In variance component oligogenic linkage analysis it can happen that the residual additive genetic variance is estimated at its lower bound of zero when estimating the effect of the ith quantitative trait locus. Using quantitative trait Q1 from the Genetic Analysis Workshop 12 simulated general population data, we compare the observed lod scores from oligogenic linkage analysis with the empirical lod score distribution under a null model of no linkage. We find that zero residual additive genetic variance in the null model alters the usual distribution of the likelihood-ratio statistic.

  18. System analysis and planning of a gas distribution network

    Energy Technology Data Exchange (ETDEWEB)

    Salas, Edwin F.M.; Farias, Helio Monteiro [AUTOMIND, Rio de Janeiro, RJ (Brazil); Costa, Carla V.R. [Universidade Salvador (UNIFACS), BA (Brazil)

    2009-07-01

    The increase in demand by gas consumers requires that projects or improvements in gas distribution networks be made carefully and safely to ensure a continuous, efficient and economical supply. Gas distribution companies must ensure that the networks and equipment involved are defined and designed at the appropriate time to attend to the demands of the market. To do that, a gas distribution network analysis and planning tool should use distribution and transmission network models for the current situation and for the future changes to be implemented. These models are used to evaluate project options and help in making appropriate decisions in order to minimize the capital investment in new components or simple changes in operational procedures. Gas demands are increasing, and it is important that gas distributors design new distribution systems to ensure this growth, considering the financial constraints of the company as well as local legislation and regulation. In this study some steps of developing a flexible system that attends to those needs are described. The analysis of distribution requires geographically referenced data for the models, as well as accurate connectivity and equipment attributes. GIS systems are often used as a repository that holds the majority of this information, and they are constantly updated as distribution network equipment is modified. The distribution network model built from this system ensures that the model represents the current network condition. The benefits of this architecture drastically reduce the creation and maintenance cost of the network models, because network component data are conveniently made available to populate the distribution network. This architecture ensures that the models continually reflect the reality of the distribution network. (author)

  19. Joint two-view information for computerized detection of microcalcifications on mammograms

    International Nuclear Information System (INIS)

    Sahiner, Berkman; Chan, H.-P.; Hadjiiski, Lubomir M.; Helvie, Mark A.; Paramagul, Chinatana; Ge Jun; Wei Jun; Zhou Chuan

    2006-01-01

    We are developing new techniques to improve the accuracy of computerized microcalcification detection by using the joint two-view information on craniocaudal (CC) and mediolateral-oblique (MLO) views. After cluster candidates were detected using a single-view detection technique, candidates on CC and MLO views were paired using their radial distances from the nipple. Candidate pairs were classified with a similarity classifier that used the joint information from both views. Each cluster candidate was also characterized by its single-view features. The outputs of the similarity classifier and the single-view classifier were fused and the cluster candidate was classified as a true microcalcification cluster or a false-positive (FP) using the fused two-view information. A data set of 116 pairs of mammograms containing microcalcification clusters and 203 pairs of normal images from the University of South Florida (USF) public database was used for training the two-view detection algorithm. The trained method was tested on an independent test set of 167 pairs of mammograms, which contained 71 normal pairs and 96 pairs with microcalcification clusters collected at the University of Michigan (UM). The similarity classifier had a very low FP rate for the test set at low and medium levels of sensitivity. However, the highest mammogram-based sensitivity that could be reached by the similarity classifier was 69%. The single-view classifier had a higher FP rate compared to the similarity classifier, but it could reach a maximum mammogram-based sensitivity of 93%. The fusion method combined the scores of these two classifiers so that the number of FPs was substantially reduced at relatively low and medium sensitivities, and a relatively high maximum sensitivity was maintained. For the malignant microcalcification clusters, at a mammogram-based sensitivity of 80%, the FP rates were 0.18 and 0.35 with the two-view fusion and single-view detection methods, respectively. 
When the
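    The two-view pairing and score-fusion steps described above can be sketched as follows. This is a simplified illustration, not the authors' implementation; the pairing tolerance, the weighted-average fusion rule and all numeric values are assumptions.

```python
def pair_candidates(cc_dists, mlo_dists, tol=10.0):
    """Pair CC/MLO cluster candidates whose radial distances from the
    nipple (hypothetical units, mm) agree within `tol`."""
    pairs = []
    for i, dc in enumerate(cc_dists):
        for j, dm in enumerate(mlo_dists):
            if abs(dc - dm) <= tol:
                pairs.append((i, j))
    return pairs

def fuse_scores(single_view, similarity, w=0.5):
    """Fuse a single-view classifier score with a two-view similarity
    score by a weighted average (the paper's fusion rule is not
    specified here, so this is a stand-in)."""
    return w * single_view + (1 - w) * similarity

print(pair_candidates([25.0, 60.0], [28.0, 90.0]))  # → [(0, 0)]
print(fuse_scores(0.9, 0.7))
```

    The fused score is then thresholded to decide whether a candidate is kept as a true cluster or rejected as a false positive.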

  20. LHCb Distributed Data Analysis on the Computing Grid

    CERN Document Server

    Paterson, S; Parkes, C

    2006-01-01

    LHCb is one of the four Large Hadron Collider (LHC) experiments based at CERN, the European Organisation for Nuclear Research. The LHC experiments will start taking an unprecedented amount of data when they come online in 2007. Since no single institute has the compute resources to handle this data, resources must be pooled to form the Grid. Where the Internet has made it possible to share information stored on computers across the world, Grid computing aims to provide access to computing power and storage capacity on geographically distributed systems. LHCb software applications must work seamlessly on the Grid allowing users to efficiently access distributed compute resources. It is essential to the success of the LHCb experiment that physicists can access data from the detector, stored in many heterogeneous systems, to perform distributed data analysis. This thesis describes the work performed to enable distributed data analysis for the LHCb experiment on the LHC Computing Grid.

  1. An integrated economic and distributional analysis of energy policies

    International Nuclear Information System (INIS)

    Labandeira, Xavier; Labeaga, Jose M.; Rodriguez, Miguel

    2009-01-01

    Most public policies, particularly those in the energy sphere, have not only efficiency but also distributional effects. However, there is a trade-off between modelling approaches suitable for calculating those impacts on the economy. For the former most of the studies have been conducted with general equilibrium models, whereas partial equilibrium models represent the main approach for distributional analysis. This paper proposes a methodology to simultaneously carry out an analysis of the distributional and efficiency consequences of changes in energy taxation. In order to do so, we have integrated a microeconomic household demand model and a computable general equilibrium model for the Spanish economy. We illustrate the advantages of this approach by simulating a revenue-neutral reform in Spanish indirect taxation, with a large increase of energy taxes that serve an environmental purpose. The results show that the reforms bring about significant efficiency and distributional effects, in some cases counterintuitive, and demonstrate the academic and social utility of this approximation. (author)

  2. An integrated economic and distributional analysis of energy policies

    Energy Technology Data Exchange (ETDEWEB)

    Labandeira, Xavier [Facultade de CC. Economicas, University of Vigo, 36310 Vigo (Spain); Labeaga, Jose M. [Instituto de Estudios Fiscales, Avda. Cardenal Herrera Oria 378, 28035 Madrid (Spain); Rodriguez, Miguel [Facultade de CC. Empresariais e Turismo, University of Vigo, 32004 Ourense (Spain)

    2009-12-15

    Most public policies, particularly those in the energy sphere, have not only efficiency but also distributional effects. However, there is a trade-off between modelling approaches suitable for calculating those impacts on the economy. For the former most of the studies have been conducted with general equilibrium models, whereas partial equilibrium models represent the main approach for distributional analysis. This paper proposes a methodology to simultaneously carry out an analysis of the distributional and efficiency consequences of changes in energy taxation. In order to do so, we have integrated a microeconomic household demand model and a computable general equilibrium model for the Spanish economy. We illustrate the advantages of this approach by simulating a revenue-neutral reform in Spanish indirect taxation, with a large increase of energy taxes that serve an environmental purpose. The results show that the reforms bring about significant efficiency and distributional effects, in some cases counterintuitive, and demonstrate the academic and social utility of this approximation. (author)

  3. Analysis and Comparison of Typical Models within Distribution Network Design

    DEFF Research Database (Denmark)

    Jørgensen, Hans Jacob; Larsen, Allan; Madsen, Oli B.G.

    This paper investigates the characteristics of typical optimisation models within Distribution Network Design. During the paper fourteen models known from the literature will be thoroughly analysed. Through this analysis a schematic approach to categorisation of distribution network design models...... for educational purposes. Furthermore, the paper can be seen as a practical introduction to network design modelling as well as a being an art manual or recipe when constructing such a model....

  4. Retrospective analysis of 'gamma distribution' based IMRT QA criteria

    International Nuclear Information System (INIS)

    Wen, C.; Chappell, R.A.

    2010-01-01

    IMRT has been implemented in clinical practice at the Royal Hobart Hospital (RHH) since mid 2006 for treating patients with head and neck (H and N) or prostate tumours. A local quality assurance (QA) acceptance criterion based on the 'gamma distribution' for approving IMRT plans was developed and implemented in early 2007. A retrospective analysis of this criterion over 194 clinical cases is presented. The RHH IMRT criterion was established on the assumption that the gamma distribution, obtained through inter-comparison of 2D dose maps between planned and delivered dose, is governed by a positive-half normal distribution. A commercial system, MapCheck, was used for 2D dose map comparison with a built-in gamma analysis tool. A gamma distribution histogram was generated and recorded for all cases. By retrospectively analysing those distributions using a curve-fitting technique, a statistical gamma distribution can be obtained and evaluated. This analytical result can be used for future IMRT planning and treatment delivery. The analyses indicate that the gamma distribution obtained through MapCheck is well under the normal distribution, particularly for prostate cases. The applied pass/fail criteria are not overly sensitive in identifying 'false fails' but can be tightened further for smaller fields, while for the larger fields found in both H and N and prostate cases the criteria were correctly applied. Non-uniform distribution of detectors in MapCheck and the experience level of planners are two major factors behind variation in the gamma distribution among clinical cases. This criterion, derived from clinical statistics, is superior and more accurate than a single-valued criterion for the IMRT QA acceptance procedure. (author)
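    The assumed positive-half-normal model for the per-point gamma values can be illustrated in a few lines: the half-normal scale has a closed-form maximum-likelihood estimate, and a conventional pass rate follows directly. A sketch with simulated values; the scale, sample size and gamma ≤ 1 pass criterion are assumptions, and this is not the MapCheck analysis itself.

```python
import numpy as np

rng = np.random.default_rng(2)
# Simulated per-point gamma values: a positive half-normal, as assumed
# for the RHH criteria (the scale 0.4 is hypothetical).
gamma = np.abs(rng.normal(0.0, 0.4, 2000))

# MLE of the half-normal scale: sigma^2 = mean of the squared gamma values.
sigma_hat = np.sqrt(np.mean(gamma ** 2))

# Conventional pass rate: fraction of points with gamma <= 1.
pass_rate = np.mean(gamma <= 1.0)
print(f"sigma≈{sigma_hat:.3f}, pass rate={100 * pass_rate:.1f}%")
```

    Fitting the observed gamma histograms against this model is what lets a statistical, rather than single-valued, acceptance criterion be derived.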

  5. Analysis and Comparison of Typical Models within Distribution Network Design

    DEFF Research Database (Denmark)

    Jørgensen, Hans Jacob; Larsen, Allan; Madsen, Oli B.G.

    Efficient and cost effective transportation and logistics plays a vital role in the supply chains of the modern world’s manufacturers. Global distribution of goods is a very complicated matter as it involves many different distinct planning problems. The focus of this presentation is to demonstrate...... a number of important issues which have been identified when addressing the Distribution Network Design problem from a modelling angle. More specifically, we present an analysis of the research which has been performed in utilizing operational research in developing and optimising distribution systems....

  6. Modelling Framework and the Quantitative Analysis of Distributed Energy Resources in Future Distribution Networks

    DEFF Research Database (Denmark)

    Han, Xue; Sandels, Claes; Zhu, Kun

    2013-01-01

    There has been a large body of statements claiming that the large-scale deployment of Distributed Energy Resources (DERs) could eventually reshape the future distribution grid operation in numerous ways. Thus, it is necessary to introduce a framework to measure to what extent the power system......, comprising distributed generation, active demand and electric vehicles. Subsequently, quantitative analysis was made on the basis of the current and envisioned DER deployment scenarios proposed for Sweden. Simulations are performed in two typical distribution network models for four seasons. The simulation...... results show that in general the DER deployment brings in the possibilities to reduce the power losses and voltage drops by compensating power from the local generation and optimizing the local load profiles....

  7. First Experiences with LHC Grid Computing and Distributed Analysis

    CERN Document Server

    Fisk, Ian

    2010-01-01

    In this presentation the experiences of the LHC experiments using grid computing were presented, with a focus on experience with distributed analysis. After many years of development, preparation, exercises, and validation the LHC (Large Hadron Collider) experiments are in operation. The computing infrastructure has been heavily utilized in the first 6 months of data collection. The general experience of exploiting the grid infrastructure for organized processing and preparation is described, as well as the successes employing the infrastructure for distributed analysis. At the end the expected evolution and future plans are outlined.

  8. Annual Screening Mammogram and its Relation to Breast Density

    Directory of Open Access Journals (Sweden)

    Sabek EAS

    2017-11-01

    Background: Current national screening programs depend entirely on mammographic evaluation. After the increased incidence of breast cancer in women under the age of 35, mammography sensitivity is now in question. Several factors add to the decreased sensitivity of mammography, such as increased density in older age groups and increased aggressiveness of tumour biology. All these factors change the reliability of the screening program. The study is a retrospective study conducted at Ain Shams University. Method: 138 patients diagnosed with breast cancer underwent both mammography and sonography to determine the percentage of patients with more than one focus, the age and density distribution of breast cancer in the affected patients, and the accuracy of both mammography and US. Results: In this population, we found that around 61.44% have areas of density ranging from dense breast, heterogeneous density or scattered density. These areas of density render mammography a less sensitive tool, as its sensitivity falls to 34.09%, while that of US was 77.27%. Conclusion: As breast cancer is prevalent in the younger population, and with increased density in the older population, who are relatively insensitive to mammography, we recommend the use of Automated Breast Ultrasound (ABUS) in the national screening program.

  9. Quantitative comparison of clustered microcalcifications in for-presentation and for-processing mammograms in full-field digital mammography.

    Science.gov (United States)

    Wang, Juan; Nishikawa, Robert M; Yang, Yongyi

    2017-07-01

    Mammograms acquired with full-field digital mammography (FFDM) systems are provided in both "for-processing" and "for-presentation" image formats. For-presentation images are traditionally intended for visual assessment by the radiologists. In this study, we investigate the feasibility of using for-presentation images in computerized analysis and diagnosis of microcalcification (MC) lesions. We make use of a set of 188 matched mammogram image pairs of MC lesions from 95 cases (biopsy proven), in which both for-presentation and for-processing images are provided for each lesion. We then analyze and characterize the MC lesions from for-presentation images and compare them with their counterparts in for-processing images. Specifically, we consider three important aspects in computer-aided diagnosis (CAD) of MC lesions. First, we quantify each MC lesion with a set of 10 image features of clustered MCs and 12 textural features of the lesion area. Second, we assess the detectability of individual MCs in each lesion from the for-presentation images by a commonly used difference-of-Gaussians (DoG) detector. Finally, we study the diagnostic accuracy in discriminating between benign and malignant MC lesions from the for-presentation images by a pretrained support vector machine (SVM) classifier. To accommodate the underlying background suppression and image enhancement in for-presentation images, a normalization procedure is applied. The quantitative image features of MC lesions from for-presentation images are highly consistent with those from for-processing images. The values of Pearson's correlation coefficient between features from the two formats range from 0.824 to 0.961 for the 10 MC image features, and from 0.871 to 0.963 for the 12 textural features. In detection of individual MCs, the FROC curve from for-presentation is similar to that from for-processing.
In particular, at sensitivity level of 80%, the average number of false-positives (FPs) per image region is 9
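    The per-feature agreement reported above is a Pearson correlation between matched measurements from the two image formats. A minimal sketch with invented feature values (not the study's data):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two feature vectors."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float(xc @ yc / np.sqrt((xc @ xc) * (yc @ yc)))

# Toy example: one MC feature measured on the same 6 lesions in
# for-processing vs. for-presentation images (values are made up).
for_processing = [3.1, 4.7, 2.2, 5.9, 3.8, 4.1]
for_presentation = [3.0, 4.9, 2.5, 5.7, 3.6, 4.3]
print(f"r = {pearson_r(for_processing, for_presentation):.3f}")
```

    Repeating this per feature over all matched lesions gives the ranges of r quoted in the abstract.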

  10. Molecular Isotopic Distribution Analysis (MIDAs) with adjustable mass accuracy.

    Science.gov (United States)

    Alves, Gelio; Ogurtsov, Aleksey Y; Yu, Yi-Kuo

    2014-01-01

    In this paper, we present Molecular Isotopic Distribution Analysis (MIDAs), a new software tool designed to compute molecular isotopic distributions with adjustable accuracies. MIDAs offers two algorithms, one polynomial-based and one Fourier-transform-based, both of which compute molecular isotopic distributions accurately and efficiently. The polynomial-based algorithm contains few novel aspects, whereas the Fourier-transform-based algorithm consists mainly of improvements to other existing Fourier-transform-based algorithms. We have benchmarked the performance of the two algorithms implemented in MIDAs with that of eight software packages (BRAIN, Emass, Mercury, Mercury5, NeutronCluster, Qmass, JFC, IC) using a consensus set of benchmark molecules. Under the proposed evaluation criteria, MIDAs's algorithms, JFC, and Emass compute with comparable accuracy the coarse-grained (low-resolution) isotopic distributions and are more accurate than the other software packages. For fine-grained isotopic distributions, we compared IC, MIDAs's polynomial algorithm, and MIDAs's Fourier transform algorithm. Among the three, IC and MIDAs's polynomial algorithm compute isotopic distributions that better resemble their corresponding exact fine-grained (high-resolution) isotopic distributions. MIDAs can be accessed freely through a user-friendly web-interface at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/midas/index.html.
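    The polynomial approach to coarse-grained (nominal-mass) isotopic distributions amounts to convolving per-element isotope-abundance vectors, once per atom. A small sketch, not MIDAs itself; the abundances are standard natural values and the example molecule is arbitrary:

```python
import numpy as np

# Abundance "polynomials" per element: index k is the probability of
# carrying k extra neutrons (natural abundances, coarse-grained).
ISOTOPES = {
    "C": [0.9893, 0.0107],
    "H": [0.999885, 0.000115],
    "O": [0.99757, 0.00038, 0.00205],
}

def isotopic_distribution(formula):
    """Coarse isotopic distribution by repeated polynomial convolution.
    `formula` maps element symbol -> atom count, e.g. {"C": 2, "H": 6}."""
    dist = np.array([1.0])
    for elem, count in formula.items():
        p = np.array(ISOTOPES[elem])
        for _ in range(count):
            dist = np.convolve(dist, p)
    return dist

# Ethanol, C2H6O: monoisotopic peak plus heavier satellites.
d = isotopic_distribution({"C": 2, "H": 6, "O": 1})
print([f"{x:.5f}" for x in d[:3]])  # M+0, M+1, M+2 probabilities
```

    For large molecules the repeated convolutions are what the Fourier-transform variant accelerates, since convolution becomes multiplication in the frequency domain.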

  11. GIS-based poverty and population distribution analysis in China

    Science.gov (United States)

    Cui, Jing; Wang, Yingjie; Yan, Hong

    2009-07-01

    Geographically, poverty status is not only related to socioeconomic factors but is also strongly affected by the geographical environment. In this paper, a GIS-based poverty and population distribution analysis method is introduced for revealing their regional differences. More than 100,000 poor villages and 592 national key poor counties are chosen for the analysis. The results show that poverty distribution tends to concentrate in most of west China and in the mountainous rural areas of mid China. Furthermore, the fifth census data are overlaid on those poor areas in order to capture the internal diversity of their socioeconomic characteristics. By overlaying poverty-related socioeconomic parameters, such as sex ratio, illiteracy, education level, percentage of ethnic minorities and family composition, the findings show that poverty distribution is strongly correlated with a high illiteracy rate, a high percentage of ethnic minorities, and larger family size.

  12. Location Analysis of Freight Distribution Terminal of Jakarta City, Indonesia

    Directory of Open Access Journals (Sweden)

    Nahry Nahry

    2016-03-01

    Currently Jakarta has two freight terminals, namely Pulo Gebang and Tanah Merdeka. However, both terminals are used only for parking and have not yet been utilized properly, e.g. for consolidation. Goods consolidation, which is usually performed in a distribution terminal, may reduce the amount of freight flow within the city. This paper aims to determine the best location for a distribution terminal in Jakarta among those two terminals and two additional alternative sites, namely Lodan and Rawa Buaya. It begins with the identification of the important factors that affect location selection, carried out by Likert analysis of questionnaires distributed to logistics firms. The best location is determined by applying overlay analysis using ArcGIS 9.2. Four grid maps are produced to represent the accessibility, cost, time, and environment factors as the important location factors. The result shows the ranking from best to worst: Lodan, Tanah Merdeka, Pulo Gebang, and Rawa Buaya.

  13. Silicon Bipolar Distributed Oscillator Design and Analysis

    African Journals Online (AJOL)

    The design and analysis of a high-frequency silicon bipolar oscillator using a common emitter (CE) configuration with distributed output is carried out. The general condition for oscillation and the resulting analytical expressions for the frequency of oscillation were reviewed. Transmission line design was carried out using Butterworth LC ...

  14. Tense Usage Analysis in Verb Distribution in Brazilian Portuguese.

    Science.gov (United States)

    Hoge, Henry W., Comp.

    This section of a four-part research project investigating the syntax of Brazilian Portuguese presents data concerning tense usage in verb distribution. The data are derived from the analysis of selected literary samples from representative and contemporary writers. The selection of authors and tabulation of data are also described. Materials…

  15. Three-Phase Harmonic Analysis Method for Unbalanced Distribution Systems

    Directory of Open Access Journals (Sweden)

    Jen-Hao Teng

    2014-01-01

    Due to the unbalanced features of distribution systems, a three-phase harmonic analysis method is essential to accurately analyze the harmonic impact on distribution systems. Moreover, harmonic analysis is the basic tool for harmonic filter design and harmonic resonance mitigation; therefore, the computational performance should also be efficient. An accurate and efficient three-phase harmonic analysis method for unbalanced distribution systems is proposed in this paper. The variations of bus voltages, bus current injections and branch currents affected by harmonic current injections can be analyzed by two relationship matrices developed from the topological characteristics of distribution systems. Some useful formulas are then derived to solve the three-phase harmonic propagation problem. After the harmonic propagation for each harmonic order is calculated, the total harmonic distortion (THD) of the bus voltages can be calculated accordingly. The proposed method has better computational performance, since the time-consuming full admittance matrix inverse employed by commonly used harmonic analysis methods is not necessary in the solution procedure. In addition, the proposed method can provide novel viewpoints in calculating the branch currents and bus voltages under harmonic pollution, which are vital for harmonic filter design. Test results demonstrate the effectiveness and efficiency of the proposed method.
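    Once the harmonic propagation for each order has been solved, the THD of a bus voltage follows from the usual definition: the RMS of the harmonic magnitudes divided by the fundamental. A minimal sketch with made-up per-unit magnitudes:

```python
import math

def thd(harmonic_magnitudes):
    """Total harmonic distortion of a bus voltage, given magnitudes
    [V1, V2, V3, ...] for harmonic orders 1, 2, 3, ... (V1 = fundamental)."""
    v1 = harmonic_magnitudes[0]
    return math.sqrt(sum(v * v for v in harmonic_magnitudes[1:])) / v1

# Fundamental of 1.0 pu with 5% fifth and 3% seventh harmonics
# (values invented for illustration).
print(f"THD = {100 * thd([1.0, 0.0, 0.0, 0.0, 0.05, 0.0, 0.03]):.2f}%")
# → THD = 5.83%
```

    In the proposed method this computation is applied per phase at each bus after the per-order propagation step.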

  16. AND LANDSCAPE-ECOLOGICAL ANALYSIS OF ITS DISTRIBUTION

    OpenAIRE

    S. M. Musaeva

    2012-01-01

The article is devoted to the study of the helminth fauna of the striped lizard in the Lankaran natural region, together with a landscape-ecological analysis of its distribution. In a study of 99 individuals of the striped lizard, 14 helminth species were found in total: 1 trematode, 1 cestode, 3 acanthocephalans and 9 nematodes.

  17. Analysis of thrips distribution: application of spatial statistics and Kriging

    Science.gov (United States)

    John Aleong; Bruce L. Parker; Margaret Skinner; Diantha Howard

    1991-01-01

    Kriging is a statistical technique that provides predictions for spatially and temporally correlated data. Observations of thrips distribution and density in Vermont soils are made in both space and time. Traditional statistical analysis of such data assumes that the counts taken over space and time are independent, which is not necessarily true. Therefore, to analyze...
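Ordinary kriging itself reduces to one small linear system per prediction point. The sketch below uses an exponential covariance with made-up sill and range values; in practice these parameters are fitted to the empirical variogram of the thrips counts, and the function name is an assumption of this sketch:

```python
import numpy as np

def ordinary_kriging(xy, z, xy0, sill=1.0, rng=10.0):
    """Ordinary-kriging prediction at location xy0 from observations (xy, z),
    using an exponential covariance C(h) = sill * exp(-h / rng)."""
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    C = sill * np.exp(-d / rng)
    n = len(z)
    # Augment with the unbiasedness constraint (weights sum to 1).
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = C
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = sill * np.exp(-np.linalg.norm(xy - xy0, axis=1) / rng)
    w = np.linalg.solve(A, b)[:n]
    return float(w @ z)

# Three sampled soil locations with thrips counts; predict at (2, 2).
pts = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
counts = np.array([4.0, 8.0, 6.0])
est = ordinary_kriging(pts, counts, np.array([2.0, 2.0]))
```

Because kriging is an exact interpolator, predicting at an observed location returns the observed count.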

  18. Data synthesis and display programs for wave distribution function analysis

    Science.gov (United States)

    Storey, L. R. O.; Yeh, K. J.

    1992-01-01

    At the National Space Science Data Center (NSSDC) software was written to synthesize and display artificial data for use in developing the methodology of wave distribution analysis. The software comprises two separate interactive programs, one for data synthesis and the other for data display.

  19. Weibull distribution in reliability data analysis in nuclear power plant

    International Nuclear Information System (INIS)

    Ma Yingfei; Zhang Zhijian; Zhang Min; Zheng Gangyang

    2015-01-01

Reliability is an important issue affecting each stage of the life cycle of a product or system, from commissioning to decommissioning. Reliability engineering includes equipment failure data processing, quantitative assessment of system reliability, maintenance planning, etc. Reliability data are the various data that describe the reliability of a system or component during its operation; they may take the form of numbers, graphics, symbols, text or curves. Quantitative reliability assessment is the task of reliability data analysis: it provides the information needed to prevent, detect and correct defects in the reliability design. Reliability data analysis proceeds alongside the stages of the product life cycle and the associated reliability activities. Reliability data for Systems, Structures and Components (SSCs) in nuclear power plants are a key input to probabilistic safety assessment (PSA), reliability-centered maintenance and life cycle management. The Weibull distribution is widely used in reliability engineering, failure analysis and industrial engineering, for example to represent manufacturing and delivery times; it is commonly used to model time to failure, time to repair and material strength. In this paper, an improved Weibull distribution is introduced to analyze the reliability data of the SSCs in nuclear power plants, and an example is given to present the results of the new method. The Weibull distribution fits the reliability data of mechanical equipment in nuclear power plants very well and is a widely used mathematical model for reliability analysis. The two-parameter and three-parameter Weibull distributions are the forms in common use; comparative analysis shows that the three-parameter Weibull distribution fits the data better, reflects the reliability characteristics of the equipment and is closer to the actual situation. (author)
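As a concrete illustration of the standard two-parameter case (the paper's improved three-parameter fit is not reproduced here), the shape and scale can be estimated from a small failure-time sample by median-rank regression, a common hand-calculation method:

```python
import math

def weibull_mrr(failure_times):
    """Two-parameter Weibull fit by median-rank regression.
    Returns (shape beta, scale eta)."""
    t = sorted(failure_times)
    n = len(t)
    # Bernard's approximation of the median ranks.
    F = [(i - 0.3) / (n + 0.4) for i in range(1, n + 1)]
    x = [math.log(ti) for ti in t]
    y = [math.log(-math.log(1.0 - Fi)) for Fi in F]
    xb, yb = sum(x) / n, sum(y) / n
    beta = (sum((xi - xb) * (yi - yb) for xi, yi in zip(x, y))
            / sum((xi - xb) ** 2 for xi in x))
    eta = math.exp(xb - yb / beta)      # from the regression intercept
    return beta, eta

def reliability(t, beta, eta, gamma=0.0):
    """Weibull reliability R(t) = exp(-((t - gamma) / eta) ** beta);
    gamma is the location parameter of the three-parameter form."""
    return math.exp(-(((t - gamma) / eta) ** beta))

beta_hat, eta_hat = weibull_mrr([35.0, 62.0, 88.0, 115.0, 160.0])
```

The failure times above are invented for illustration; with real SSC failure records the same two lines yield the fitted shape and scale, and `reliability` then gives the survival curve.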

  20. Validation of a method for measuring the volumetric breast density from digital mammograms

    International Nuclear Information System (INIS)

    Alonzo-Proulx, O; Shen, S Z; Yaffe, M J; Packard, N; Boone, J M; Al-Mayah, A; Brock, K K

    2010-01-01

The purpose of this study was to evaluate the performance of an algorithm used to measure the volumetric breast density (VBD) from digital mammograms. The algorithm is based on the calibration of the detector signal versus the thickness and composition of breast-equivalent phantoms. The baseline error in the density from the algorithm was found to be 1.25 ± 2.3% VBD units (PVBD) when tested against a set of calibration phantoms, of thicknesses 3-8 cm, with compositions equivalent to fibroglandular content (breast density) between 0% and 100% and under x-ray beams between 26 kVp and 32 kVp with a Rh/Rh anode/filter. The algorithm was also tested against images from a dedicated breast computed tomography (CT) scanner acquired on 26 volunteers. The CT images were segmented into regions representing adipose, fibroglandular and skin tissues, and then deformed using a finite-element algorithm to simulate the effects of compression in mammography. The mean volume, VBD and thickness of the compressed breast for these deformed images were respectively 558 cm³, 23.6% and 62 mm. The displaced CT images were then used to generate simulated digital mammograms, considering the effects of the polychromatic x-ray spectrum, the primary and scattered energy transmitted through the breast, the anti-scatter grid and the detector efficiency. The simulated mammograms were analyzed with the VBD algorithm and compared with the deformed CT volumes. With the Rh/Rh anode/filter, the root mean square difference between the VBD from CT and from the algorithm was 2.6 PVBD, and a linear regression between the two gave a slope of 0.992 with an intercept of -1.4 PVBD and a correlation with R² = 0.963. The results with the Mo/Mo and Mo/Rh anode/filter were similar.
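The calibration-inversion step at the heart of such an algorithm can be sketched as follows. The calibration numbers, the function names, and the single-thickness simplification are all illustrative assumptions of this sketch; the real calibration is a measured surface over thickness, composition and kVp:

```python
import numpy as np

# Illustrative calibration for one compressed thickness and kVp:
# log detector signal versus glandular fraction g (values invented).
g_cal = np.array([0.0, 0.25, 0.5, 0.75, 1.0])     # fibroglandular fraction
s_cal = np.array([2.10, 1.95, 1.82, 1.70, 1.60])  # log-signal, arbitrary units

def pixel_glandular_fraction(log_signal):
    # Invert the calibration by linear interpolation; signal falls with g,
    # so flip the arrays for np.interp, which requires increasing x.
    return np.interp(log_signal, s_cal[::-1], g_cal[::-1])

def volumetric_breast_density(log_image, thickness_mm, pixel_area_mm2):
    g = pixel_glandular_fraction(log_image)
    fibro_volume = float(g.sum()) * thickness_mm * pixel_area_mm2  # mm^3
    vbd_percent = 100.0 * float(g.mean())
    return fibro_volume, vbd_percent

img = np.full((4, 4), 1.82)               # uniform half-glandular "breast"
fibro_mm3, vbd = volumetric_breast_density(img, 62.0, 0.01)
```

A uniform image at the mid-calibration signal maps to g = 0.5 everywhere, i.e. a VBD of 50 PVBD.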

  1. Validation of a method for measuring the volumetric breast density from digital mammograms

    Energy Technology Data Exchange (ETDEWEB)

    Alonzo-Proulx, O; Shen, S Z; Yaffe, M J [Sunnybrook Health Sciences Centre, University of Toronto, Toronto, Ontario M4N 3M5 (Canada); Packard, N; Boone, J M [UC Davis Medical Center, University of California-Davis, Sacramento, CA 95817 (United States); Al-Mayah, A; Brock, K K, E-mail: oliviera@sri.utoronto.c [University Health Network, University of Toronto, Toronto, Ontario M5G 2M9 (Canada)

    2010-06-07

The purpose of this study was to evaluate the performance of an algorithm used to measure the volumetric breast density (VBD) from digital mammograms. The algorithm is based on the calibration of the detector signal versus the thickness and composition of breast-equivalent phantoms. The baseline error in the density from the algorithm was found to be 1.25 ± 2.3% VBD units (PVBD) when tested against a set of calibration phantoms, of thicknesses 3-8 cm, with compositions equivalent to fibroglandular content (breast density) between 0% and 100% and under x-ray beams between 26 kVp and 32 kVp with a Rh/Rh anode/filter. The algorithm was also tested against images from a dedicated breast computed tomography (CT) scanner acquired on 26 volunteers. The CT images were segmented into regions representing adipose, fibroglandular and skin tissues, and then deformed using a finite-element algorithm to simulate the effects of compression in mammography. The mean volume, VBD and thickness of the compressed breast for these deformed images were respectively 558 cm³, 23.6% and 62 mm. The displaced CT images were then used to generate simulated digital mammograms, considering the effects of the polychromatic x-ray spectrum, the primary and scattered energy transmitted through the breast, the anti-scatter grid and the detector efficiency. The simulated mammograms were analyzed with the VBD algorithm and compared with the deformed CT volumes. With the Rh/Rh anode/filter, the root mean square difference between the VBD from CT and from the algorithm was 2.6 PVBD, and a linear regression between the two gave a slope of 0.992 with an intercept of -1.4 PVBD and a correlation with R² = 0.963. The results with the Mo/Mo and Mo/Rh anode/filter were similar.

  2. Classification of masses on mammograms using support vector machine

    Science.gov (United States)

    Chu, Yong; Li, Lihua; Goldgof, Dmitry B.; Qui, Yan; Clark, Robert A.

    2003-05-01

Mammography is the most effective method for early detection of breast cancer. However, the positive predictive value for classification of malignant and benign lesions from mammographic images is not very high: clinical studies have shown that the fraction of biopsies that actually find cancer is very low, between 15% and 30%. It is important to increase diagnostic accuracy by improving the positive predictive value, thereby reducing the number of unnecessary biopsies. In this paper, a new classification method is proposed to distinguish malignant from benign masses in mammography using the Support Vector Machine (SVM). Thirteen features were selected based on receiver operating characteristic (ROC) analysis of classification using each feature individually; these comprise four shape features, two gradient features and seven Laws features. With these features, an SVM with a Gaussian kernel, trained by sequential minimal optimization, was used to classify the masses into two categories, benign and malignant. The data set used in this study consists of 193 cases: 96 benign and 97 malignant. A leave-one-out evaluation of the SVM classifier was performed. The results show that the positive predictive value of the presented method is 81.6%, with a sensitivity of 83.7% and a false-positive rate of 30.2%, demonstrating that the SVM-based classifier is effective in mass classification.
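The single-feature ROC selection step described above can be reproduced with the Mann-Whitney formulation of the AUC: for one feature, the AUC is the probability that a malignant case scores above a benign one. Ranking the thirteen candidate features by this score mirrors the paper's selection; the code below is a generic sketch, not the authors' implementation:

```python
import itertools

def auc(benign_vals, malignant_vals):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a malignant case's feature value exceeds a
    benign case's, counting ties as half."""
    wins = sum(1.0 if m > b else 0.5 if m == b else 0.0
               for m, b in itertools.product(malignant_vals, benign_vals))
    return wins / (len(malignant_vals) * len(benign_vals))

# Perfect separation between benign and malignant feature values -> AUC = 1.
score = auc([0.2, 0.3, 0.4], [0.6, 0.8, 0.9])
```

Features whose single-feature AUC stays near 0.5 carry no class information and are dropped before the SVM is trained on the survivors.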

  3. Neuronal model with distributed delay: analysis and simulation study for gamma distribution memory kernel.

    Science.gov (United States)

    Karmeshu; Gupta, Varun; Kadambari, K V

    2011-06-01

A single neuronal model incorporating distributed delay (memory) is proposed. The stochastic model has been formulated as a stochastic integro-differential equation (SIDE), which results in the underlying process being non-Markovian. A detailed analysis of the model when the distributed delay kernel has exponential form (weak delay) has been carried out. The choice of an exponential kernel enables the transformation of the non-Markovian model into a Markovian model in an extended state space. For the study of the first passage time (FPT) with an exponential delay kernel, the model has been transformed into a system of coupled stochastic differential equations (SDEs) in a two-dimensional state space. Simulation studies of the SDEs provide insight into the effect of a weak delay kernel on the inter-spike interval (ISI) distribution. A measure based on the Jensen-Shannon divergence is proposed which can be used to choose between two competing models, the distributed delay model vis-à-vis the LIF model. An interesting feature of the model is that the behavior of the coefficient of variation CV(t) of the ISI distribution with respect to the memory kernel time constant η reveals that the neuron can switch from a bursting to a non-bursting state as the noise intensity parameter changes. The membrane potential exhibits a decaying auto-correlation structure, with or without damped oscillatory behavior depending on the choice of parameters; this is in agreement with empirically observed patterns of spike counts in a fixed time window. The power spectral density derived from the auto-correlation function exhibits single and double peaks. The model is also examined for the case of strong delay, with a memory kernel having the form of a Gamma distribution. In contrast to the fast decay of the damped oscillations of the ISI distribution for the model with a weak delay kernel, the decay of the damped oscillations is slower for the model with a strong delay kernel.
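The exponential-kernel transformation to a two-dimensional Markovian SDE system (the "linear chain trick") can be simulated by Euler-Maruyama as sketched below. The specific drift terms, feedback sign and all parameter values are illustrative stand-ins, not the paper's model; the point is the structure: an auxiliary variable u with time constant η replaces the memory integral.

```python
import numpy as np

def simulate(T=200.0, dt=0.01, tau=10.0, eta=5.0, k=0.5,
             mu=1.0, sigma=0.3, v_th=1.0, seed=0):
    """Euler-Maruyama simulation of a leaky neuron v with an exponentially
    distributed delayed feedback u, plus a threshold-and-reset spike rule.
    Returns the inter-spike intervals (ISIs)."""
    rng = np.random.default_rng(seed)
    v = u = 0.0
    last_spike, isis = 0.0, []
    for i in range(int(T / dt)):
        dv = (-v / tau + k * u + mu) * dt \
             + sigma * np.sqrt(dt) * rng.standard_normal()
        du = (v - u) / eta * dt   # exponential memory kernel, time constant eta
        v, u = v + dv, u + du
        if v >= v_th:             # spike: record interval, reset potential
            t = (i + 1) * dt
            isis.append(t - last_spike)
            last_spike, v = t, 0.0
    return np.array(isis)

isis = simulate()
cv = isis.std() / isis.mean()     # coefficient of variation of the ISIs
```

Sweeping η and σ in such a simulation is how the CV-versus-kernel-time-constant behaviour described above would be explored numerically.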

  4. User-friendly Tool for Power Flow Analysis and Distributed Generation Optimisation in Radial Distribution Networks

    Directory of Open Access Journals (Sweden)

    M. F. Akorede

    2017-06-01

Full Text Available The intent of power distribution companies (DISCOs) is to deliver electric power to their customers in an efficient and reliable manner, with minimal energy loss cost. One major way to minimise power loss on a given power system is to install distributed generation (DG) units on the distribution networks. However, to maximise the benefits, it is highly crucial for a DISCO to ensure that these DG units are of optimal size and sited in the best locations on the network. This paper gives an overview of a software package developed in this study, called the Power System Analysis and DG Optimisation Tool (PFADOT). The main purpose of the graphical-user-interface-based package is to guide a DISCO in finding the optimal size and location for DG placement in radial distribution networks. The package, which is also suitable for load flow analysis, employs the GUI feature of MATLAB. Three objective functions are formulated into a single optimisation problem and solved with a fuzzy genetic algorithm to obtain the DG optimal size and location simultaneously. The accuracy and reliability of the developed tool were validated using several radial test systems, and the results obtained were evaluated against an existing similar package cited in the literature; the results are impressive and the tool is computationally efficient.
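The load-flow engine underneath such a tool can be sketched, for a single radial feeder, with the classic backward/forward sweep method. This is a generic textbook sketch under simplifying assumptions (one chain of buses, per-unit data); PFADOT's actual implementation and its fuzzy-GA optimisation layer are not shown:

```python
import numpy as np

def radial_power_flow(z_line, s_load, v0=1.0 + 0j, iters=20):
    """Backward/forward sweep load flow for a radial feeder whose buses
    1..n are fed in a chain from the substation voltage v0.
    z_line[i]: series impedance of the branch into bus i (pu)
    s_load[i]: complex power demand at bus i (pu)"""
    n = len(s_load)
    v = np.full(n, v0, dtype=complex)
    for _ in range(iters):
        i_load = np.conj(np.asarray(s_load) / v)         # load currents
        i_branch = np.cumsum(i_load[::-1])[::-1]         # backward sweep
        drop = np.cumsum(np.asarray(z_line) * i_branch)  # forward sweep
        v = v0 - drop
    return v

# Three-bus feeder: identical branches and loads, all values in pu.
v = radial_power_flow([0.01 + 0.02j] * 3, [0.1 + 0.05j] * 3)
```

Total feeder loss follows as the sum of |I|²R over the branches; siting a DG unit amounts to adding a negative load at a candidate bus and re-running this flow inside the optimiser's objective function.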

  5. Reading screening mammograms - Attitudes among radiologists and radiographers about skill mix

    DEFF Research Database (Denmark)

    Johansen, Lena Westphal; Brodersen, John

    2011-01-01

INTRODUCTION: Because of a shortage of personnel for the Danish mammography screening programme, the aim of this study was to investigate the attitudes of radiologists and radiographers towards a future implementation of radiographers reading screening mammograms. MATERIALS AND METHODS: Seven...... of managers, and improved working relations. Organization-related obstacles: shortage of radiographers and negative attitudes of managers. Profession-related possibilities: positive experience with skill mix. Profession-related obstacles: worries about negative consequences for the training...... and financial consequences of skill mix. Despite this, all radiologists and radiographers experienced with skill mix were strong advocates for reading radiographers....

  6. The cost of electricity distribution in Italy: a quantitative analysis

    International Nuclear Information System (INIS)

    Scarpa, C.

    1998-01-01

This paper presents a quantitative analysis of the cost of medium- and low-tension electricity distribution in Italy. An econometric analysis of the cost function is proposed, on the basis of data on 147 zones of the dominant firm, ENEL. Data are available only for 1996, which forced us to carry out only a cross-section OLS analysis. The econometric estimate shows the existence of significant scale economies that the current organisational structure does not exploit. On this basis it is also possible to assess to what extent exogenous cost drivers affect costs. The role of the numerous exogenous factors considered seems, however, quite limited. The area of the distribution zone and an indicator of quality are the only elements that appear significant from an economic viewpoint [it
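A cross-section OLS estimate of a log-linear cost function of this kind can be sketched as follows. The data below are synthetic stand-ins for the 147 ENEL zones, and every coefficient and variable name is invented for illustration; only the estimation mechanics match the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 147                                  # one observation per zone, 1996
log_q = rng.normal(5.0, 1.0, n)          # log energy distributed
log_area = rng.normal(7.0, 0.5, n)       # log service-area size
quality = rng.normal(0.0, 1.0, n)        # quality indicator

# Synthetic "true" cost process with an output elasticity below 1,
# i.e. built-in economies of scale.
log_cost = (1.0 + 0.8 * log_q + 0.1 * log_area - 0.05 * quality
            + rng.normal(0.0, 0.1, n))

X = np.column_stack([np.ones(n), log_q, log_area, quality])
beta, *_ = np.linalg.lstsq(X, log_cost, rcond=None)
scale_elasticity = beta[1]   # < 1 implies unexploited scale economies
```

An estimated output elasticity of cost below one is exactly the "significant scale economies" finding: average cost falls as zone output grows.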

  7. Quantitative analysis of tritium distribution in austenitic stainless steels welds

    International Nuclear Information System (INIS)

    Roustila, A.; Kuromoto, N.; Brass, A.M.; Chene, J.

    1994-01-01

Tritium autoradiography was used to study the tritium distribution in laser and arc (TIG) weldments performed on tritiated AISI 316 samples. Quantitative values of the local tritium concentration were obtained from microdensitometric analysis of the autoradiographs. This procedure was used to map the tritium concentration in the samples before and after the laser and TIG treatments. The effects of the detritiation conditions and of welding on the tritium distribution in the material are extensively characterized. The results illustrate the value of the technique for predicting possible embrittlement of the material associated with a local enhancement of the tritium concentration and the presence of helium-3 generated by tritium decay. ((orig.))

  8. Pseudodifferential Analysis, Automorphic Distributions in the Plane and Modular Forms

    CERN Document Server

    Unterberger, Andre

    2011-01-01

    Pseudodifferential analysis, introduced in this book in a way adapted to the needs of number theorists, relates automorphic function theory in the hyperbolic half-plane I to automorphic distribution theory in the plane. Spectral-theoretic questions are discussed in one or the other environment: in the latter one, the problem of decomposing automorphic functions in I according to the spectral decomposition of the modular Laplacian gives way to the simpler one of decomposing automorphic distributions in R2 into homogeneous components. The Poincare summation process, which consists in building au

  9. Statistical analysis of rockfall volume distributions: Implications for rockfall dynamics

    Science.gov (United States)

    Dussauge, Carine; Grasso, Jean-Robert; Helmstetter, AgnèS.

    2003-06-01

We analyze the volume distribution of natural rockfalls in different geological settings (calcareous cliffs in the French Alps, Grenoble area, and granite Yosemite cliffs, California Sierra) and different volume ranges (regional and worldwide catalogs). Contrary to previous studies that included several types of landslides, we restrict our analysis to rockfall sources which originated on subvertical cliffs. For the three data sets, we find that the rockfall volumes follow a power law distribution with a similar exponent value, within error bars. This power law distribution was also proposed for rockfall volumes that occurred along road cuts. All these results argue for a recurrent power law distribution of rockfall volumes on subvertical cliffs over a large range of rockfall sizes (10² to 10¹⁰ m³), regardless of the geological settings and of the preexisting geometry of fracture patterns, which are drastically different in the three studied areas. The power law distribution of rockfall volumes could emerge from two types of processes. First, the observed power law distribution of rockfall volumes is similar to the one reported for both fragmentation experiments and fragmentation models. This argues for the geometry of rock mass fragment sizes possibly controlling the rockfall volumes; in this case neither cascade nor avalanche processes would influence the rockfall volume distribution. Second, without any requirement of scale-invariant quenched heterogeneity patterns, the rock mass dynamics can arise from avalanche processes driven by fluctuations of the rock mass properties, e.g., cohesion or friction angle. This model may also explain the power law distribution reported for landslides involving unconsolidated materials. We find that the exponent value for rockfall volumes on subvertical cliffs, 0.5 ± 0.2, is significantly smaller than the 1.2 ± 0.3 value reported for mixed landslide types. This change of exponents can be driven by the material strength, which
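The power-law exponent of such a volume catalog is usually estimated by maximum likelihood rather than by fitting a binned histogram. A minimal version of the standard estimator, assuming a cumulative distribution N(>V) proportional to V**(-b) above a completeness threshold v_min (a generic sketch, not the authors' code):

```python
import math

def powerlaw_exponent(volumes, v_min):
    """Maximum-likelihood estimate of the exponent b of a cumulative
    power law N(>V) ~ V**(-b), using only events with V >= v_min
    (the standard Hill/Aki-type estimator)."""
    logs = [math.log(v / v_min) for v in volumes if v >= v_min]
    return len(logs) / sum(logs)

# Synthetic catalog drawn (deterministically, via inverse-CDF quantiles)
# from a power law with b = 0.5 above v_min = 100 m^3.
vols = [100.0 * ((i - 0.5) / 1000.0) ** -2.0 for i in range(1, 1001)]
b_hat = powerlaw_exponent(vols, 100.0)   # close to the generating b = 0.5
```

Applied to the three rockfall catalogs above the completeness volume of each, this is the kind of estimate that yields the reported 0.5 ± 0.2 exponent.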

  10. Análise do Sistema de Informação do Programa de Controle do Câncer de Mama (SISMAMA mediante avaliação de 1.000 exames nas cidades de Barra Mansa e Volta Redonda Analysis of the Breast Cancer Control Program Information System (SISMAMA, with review of 1,000 mammograms in the cities of Barra Mansa and Volta Redonda

    Directory of Open Access Journals (Sweden)

    Sissy Bullos Lins dos Santos

    2010-10-01

Full Text Available OBJECTIVE: To analyze the Breast Cancer Control Program Information System (SISMAMA), implemented in 2009 by the Brazilian Health Ministry. MATERIALS AND METHODS: This was a retrospective study involving the analysis of 1,000 requisition forms and results of mammograms performed by SUS - Sistema Único de Saúde (the Brazilian unified public health system) in the cities participating in the present study, during the period from August to October 2009. The quality of the information submitted through the processing of these data was analyzed, along with the deviations generated by missing or inappropriately completed data on the forms. RESULTS: The most frequent problem was the omission of data on the forms, particularly in the field for previous surgeries, with 302 omissions (30.2%) observed. CONCLUSION: Although the System requires some adjustments, owing to the time elapsed between its creation and its implementation, these adjustments do not directly affect the validity of the System. The main sources of error in feeding the Health Ministry database were the failure to fill in information relevant to closing the reports, and the lack of familiarization and training of the professionals involved in this process and in relaying the mammogram results.

  11. Spatial Distribution Characteristics of Healthcare Facilities in Nanjing: Network Point Pattern Analysis and Correlation Analysis

    Directory of Open Access Journals (Sweden)

    Jianhua Ni

    2016-08-01

Full Text Available The spatial distribution of urban service facilities is largely constrained by the road network. In this study, network point pattern analysis and correlation analysis were used to analyze the relationship between the road network and the distribution of healthcare facilities. The weighted network kernel density estimation method proposed in this study identifies significant differences between the areas outside and inside the Ming city wall. The results of network K-function analysis show that private hospitals are more evenly distributed than public hospitals, and that pharmacy stores tend to cluster around hospitals along the road network. Computing the correlations between the different categories of hospitals and street centrality, we find that the distribution of these hospitals correlates highly with street centrality, and that the correlations are higher for private and small hospitals than for public and large hospitals. The comprehensive analysis results could help in examining the reasonableness of the existing urban healthcare facility distribution and in optimizing the locations of new healthcare facilities.

  12. The clinical significance of normal mammograms and normal sonograms in patients with palpable abnormalities of the breast

    International Nuclear Information System (INIS)

    Lee, Jin Hwa; Yoon, Seong Kuk; Choi, Sun Seob; Nam, Kyung Jin; Cho, Se Heon; Kim, Dae Cheol; Kim, Jung Il; Kim, Eun Kyung

    2006-01-01

We wanted to evaluate the clinical significance of normal mammograms and normal sonograms in patients with palpable abnormalities of the breast. From April 2003 to February 2005, 107 patients with 113 palpable abnormalities who had combined normal sonographic and normal mammographic findings were retrospectively studied. The evaluated parameters included the age of the patients, the clinical referrals, the distribution of the locations of the palpable abnormalities, whether there was a past surgical history, the mammographic densities and the sonographic echo patterns (purely hyperechoic fibrous tissue, mixed fibroglandular breast tissue, predominantly isoechoic glandular tissue and isoechoic subcutaneous fat tissue) at the sites of clinical concern, whether there was a change in imaging and/or physical examination results at follow-up, and whether there were biopsy results. This study period was chosen to allow a follow-up period of at least 12 months. The patients' ages ranged from 22 to 66 years (mean age: 48.8 years), and 62 of the 107 patients (58%) were between 41 and 50 years old. The most common location of the palpable abnormalities was the upper outer portion of the breast (45%), and most of the mammographic densities were dense patterns (BI-RADS Type 3 or 4: 91%). Our cases showed a similar distribution across all types of sonographic echo patterns. Twenty-three patients underwent biopsy; all the biopsy specimens were benign. For the 84 patients with 90 palpable abnormalities who were followed, there was no interval development of breast cancer in the areas of clinical concern. Our results suggest that we can follow up and prevent unnecessary biopsies in women with palpable abnormalities when both mammography and ultrasonography show normal tissue, but this study was limited by its small sample size. Therefore, a larger study will be needed to better define the negative predictive value of combined normal sonographic and mammographic findings.

  13. Comparison of Two-dimensional Synthesized Mammograms versus Original Digital Mammograms Alone and in Combination with Tomosynthesis Images

    Science.gov (United States)

    Guo, Ben; Catullo, Victor J.; Chough, Denise M.; Kelly, Amy E.; Lu, Amy H.; Rathfon, Grace Y.; Lee Spangler, Marion; Sumkin, Jules H.; Wallace, Luisa P.; Bandos, Andriy I.

    2014-01-01

Purpose To assess interpretation performance and radiation dose when two-dimensional synthesized mammography (SM) images versus standard full-field digital mammography (FFDM) images are used alone or in combination with digital breast tomosynthesis images. Materials and Methods A fully crossed, mode-balanced multicase (n = 123), multireader (n = 8), retrospective observer performance study was performed by using deidentified images acquired between 2008 and 2011 under institutional review board approved, HIPAA-compliant protocols, during which each patient signed informed consent. The cohort included 36 cases of biopsy-proven cancer, 35 cases of biopsy-proven benign lesions, and 52 normal or benign cases (Breast Imaging Reporting and Data System [BI-RADS] score of 1 or 2) with negative 1-year follow-up results. Accuracy of sequentially reported probability of malignancy ratings and seven-category forced BI-RADS ratings was evaluated by using areas under the receiver operating characteristic curve (AUCs) in the random-reader analysis. Results Probability of malignancy-based mean AUCs for SM and FFDM images alone were 0.894 and 0.889, respectively (difference, -0.005; 95% confidence interval [CI]: -0.062, 0.054; P = .85). Mean AUCs for SM with tomosynthesis and FFDM with tomosynthesis were 0.916 and 0.939, respectively (difference, 0.023; 95% CI: -0.011, 0.057; P = .19). In terms of the reader-specific AUCs, five readers performed better with SM alone versus FFDM alone, and all eight readers performed better with combined FFDM and tomosynthesis (absolute differences from 0.003 to 0.052). Similar results were obtained by using a nonparametric analysis of forced BI-RADS ratings. Conclusion SM alone or in combination with tomosynthesis is comparable in performance to FFDM alone or in combination with tomosynthesis and may eliminate the need for FFDM as part of a routine clinical study. © RSNA, 2014 PMID:24475859

  14. HammerCloud: A Stress Testing System for Distributed Analysis

    International Nuclear Information System (INIS)

    Ster, Daniel C van der; García, Mario Úbeda; Paladin, Massimo; Elmsheuser, Johannes

    2011-01-01

Distributed analysis of LHC data is an I/O-intensive activity which places large demands on the internal network, storage, and local disks at remote computing facilities. Commissioning and maintaining a site to provide an efficient distributed analysis service is therefore a challenge which can be aided by tools that help evaluate a variety of infrastructure designs and configurations. HammerCloud is one such tool; it is a stress testing service which is used by central operations teams, regional coordinators, and local site admins to (a) submit an arbitrary number of analysis jobs to a number of sites, (b) maintain, at a steady state, a predefined number of jobs running at the sites under test, (c) produce web-based reports summarizing the efficiency and performance of the sites under test, and (d) present a web interface for historical test results, both to evaluate progress and to compare sites. HammerCloud was built around the distributed analysis framework Ganga, exploiting its API for grid job management. HammerCloud has been employed by the ATLAS experiment for continuous testing of many sites worldwide, and also during large-scale computing challenges such as STEP'09 and UAT'09, where the scale of the tests exceeded 10,000 concurrently running jobs and 1,000,000 total jobs over multi-day periods. In addition, HammerCloud is being adopted by the CMS experiment; the plugin structure of HammerCloud allows the execution of CMS jobs using their official tool (CRAB).
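Item (b), holding a site at a steady-state number of running jobs, amounts to a top-up loop run each polling cycle. A toy simulation of that mode (job durations are random stand-ins and the function is invented for this sketch; the real service drives grid jobs through Ganga's API):

```python
import random

def run_cycles(target, cycles, seed=0):
    """Each polling cycle: let some simulated jobs finish, then submit
    exactly enough new jobs to restore the running count to `target`.
    Returns the total number of jobs submitted and the per-cycle
    running-job counts."""
    rng = random.Random(seed)
    running, submitted, history = [], 0, []
    for _ in range(cycles):
        running = [t - 1 for t in running if t > 1]   # jobs completing
        need = target - len(running)                  # top up to steady state
        running += [rng.randint(2, 6) for _ in range(need)]
        submitted += need
        history.append(len(running))
    return submitted, history

submitted, history = run_cycles(target=50, cycles=20)
```

After every top-up the site sits at exactly the target load, which is what makes the per-site efficiency numbers in the web reports comparable across cycles.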

  15. HammerCloud: A Stress Testing System for Distributed Analysis

    CERN Document Server

    van der Ster, Daniel C; Ubeda Garcia, Mario; Paladin, Massimo

    2011-01-01

Distributed analysis of LHC data is an I/O-intensive activity which places large demands on the internal network, storage, and local disks at remote computing facilities. Commissioning and maintaining a site to provide an efficient distributed analysis service is therefore a challenge which can be aided by tools that help evaluate a variety of infrastructure designs and configurations. HammerCloud (HC) is one such tool; it is a stress testing service which is used by central operations teams, regional coordinators, and local site admins to (a) submit an arbitrary number of analysis jobs to a number of sites, (b) maintain, at a steady state, a predefined number of jobs running at the sites under test, (c) produce web-based reports summarizing the efficiency and performance of the sites under test, and (d) present a web interface for historical test results, both to evaluate progress and to compare sites. HC was built around the distributed analysis framework Ganga, exploiting its API for grid job management. HC has been ...

  16. An improved algorithm for connectivity analysis of distribution networks

    International Nuclear Information System (INIS)

    Kansal, M.L.; Devi, Sunita

    2007-01-01

In the present paper, an efficient algorithm for connectivity analysis of moderately sized distribution networks is suggested. The algorithm is based on the generation of all possible minimal system cutsets, and it is efficient because it identifies only the necessary and sufficient failure conditions in n-out-of-n type distribution networks. The proposed algorithm is demonstrated with the help of saturated and unsaturated distribution networks. Its computational efficiency is justified by comparing the computational effort with that of the previously suggested appended spanning tree (AST) algorithm. The proposed technique has the added advantage that it can be used to generate system inequalities, which is useful in the reliability estimation of capacitated networks.
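For a small network, minimal cutsets can be enumerated by brute force in order of increasing size, which makes the minimality test trivial (any smaller cutset has already been found). This is a generic sketch of the concept, not the paper's improved algorithm:

```python
from itertools import combinations

def connected(nodes, edges, removed):
    """BFS connectivity of the network with `removed` edges taken out;
    the n-out-of-n criterion: the system is up iff all nodes reachable."""
    alive = [e for e in edges if e not in removed]
    seen, stack = {nodes[0]}, [nodes[0]]
    while stack:
        u = stack.pop()
        for a, b in alive:
            for v in ((b,) if a == u else (a,) if b == u else ()):
                if v not in seen:
                    seen.add(v)
                    stack.append(v)
    return len(seen) == len(nodes)

def minimal_cutsets(nodes, edges):
    """Enumerate minimal edge cutsets by increasing size (brute force;
    adequate for moderately sized networks)."""
    cuts = []
    for k in range(1, len(edges) + 1):
        for cand in combinations(edges, k):
            s = set(cand)
            if not connected(nodes, edges, s) \
               and not any(c < s for c in cuts):   # drop supersets
                cuts.append(s)
    return cuts

# A three-node ring: every pair of edges is a minimal cutset.
ring_cuts = minimal_cutsets(["A", "B", "C"],
                            [("A", "B"), ("B", "C"), ("C", "A")])
```

Each minimal cutset then contributes one term (or inequality) to the system unavailability bound used in the reliability estimation.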

  17. DISCRN: A Distributed Storytelling Framework for Intelligence Analysis.

    Science.gov (United States)

    Shukla, Manu; Dos Santos, Raimundo; Chen, Feng; Lu, Chang-Tien

    2017-09-01

Storytelling connects entities (people, organizations) using their observed relationships to establish meaningful storylines. This can be extended to spatiotemporal storytelling that incorporates locations, time, and graph computations to enhance coherence and meaning. But when performed sequentially, these computations become a bottleneck because the massive number of entities makes the space and time complexity untenable. This article presents DISCRN, or distributed spatiotemporal ConceptSearch-based storytelling, a distributed framework for performing spatiotemporal storytelling. The framework extracts entities from microblogs and event data, and links these entities using a novel ConceptSearch to derive storylines in a distributed fashion utilizing the key-value pair paradigm. Performing these operations at scale allows deeper and broader analysis of storylines. The novel parallelization techniques speed up the generation and filtering of storylines on massive datasets. Experiments with microblog posts such as Twitter data and Global Database of Events, Language, and Tone events show the efficiency of the techniques in DISCRN.

  18. Vibrational Energy Distribution Analysis (VEDA): Scopes and limitations

    Science.gov (United States)

    Jamróz, Michał H.

    2013-10-01

    The principle of operation of the VEDA program, written by the author for Potential Energy Distribution (PED) analysis of theoretical vibrational spectra, is described. Nowadays, PED analysis is an indispensable tool in any serious analysis of vibrational spectra. To perform a PED analysis it is necessary to define 3N-6 linearly independent local mode coordinates; even for 20-atom molecules this is a difficult task. The VEDA program reads the input data automatically from Gaussian output files. VEDA then automatically proposes an introductory set of local mode coordinates. Next, more adequate coordinates are proposed by the program and optimized to obtain maximal elements in each column (internal coordinate) of the PED matrix (the EPM parameter). The possibility of automatic optimization of PED contributions is a unique feature of the VEDA program, absent from other programs performing PED analysis.
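    Once a coordinate set is chosen, the PED matrix is typically read as percent contributions. A minimal sketch with an assumed toy matrix (not VEDA's actual optimization) shows the interpretation step: a well-chosen coordinate set makes each normal mode dominated by a single internal coordinate.

```python
import numpy as np

# Toy PED matrix: rows = normal modes, columns = internal coordinates (assumed values)
ped = np.array([[0.70, 0.25, 0.05],
                [0.20, 0.05, 0.75]])

# Percent contribution of each internal coordinate to each normal mode
percent = 100 * ped / ped.sum(axis=1, keepdims=True)

# Index of the coordinate dominating each mode
dominant = percent.argmax(axis=1)
```

    Maximizing these dominant entries over candidate coordinate sets is, in spirit, what the EPM optimization automates.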

  19. Distribution System Reliability Analysis for Smart Grid Applications

    Science.gov (United States)

    Aljohani, Tawfiq Masad

    Reliability of power systems is a key aspect in modern power system planning, design, and operation. The ascendance of the smart grid concept has provided high hopes of developing an intelligent network that is capable of being a self-healing grid, offering the ability to overcome the interruption problems that face the utility and cost it tens of millions in repair and loss. To address its reliability concerns, the power utilities and interested parties have spent an extensive amount of time and effort analyzing and studying the reliability of the generation and transmission sectors of the power grid. Only recently has attention shifted to improving the reliability of the distribution network, the connection point between the power providers and the consumers, where most electricity problems occur. In this work, we examine the effect of smart grid applications on improving the reliability of power distribution networks. The test system used in this thesis is the IEEE 34-node test feeder, released in 2003 by the Distribution System Analysis Subcommittee of the IEEE Power Engineering Society. The objective is to analyze the feeder for the optimal placement of automatic switching devices and to quantify their proper installation based on the performance of the distribution system, measured by changes in the reliability indices SAIDI, SAIFI, and EUE. The goal is to design and simulate the effect of installing Distributed Generators (DGs) on the utility's distribution system and to measure the potential improvement in its reliability. The software used in this work is DISREL, an intelligent power distribution package developed by General Reliability Co.
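    The reliability indices mentioned above follow directly from outage records. A minimal sketch with hypothetical outage data (not the DISREL computation) illustrates SAIFI, SAIDI, and the derived CAIDI:

```python
# Hypothetical outage log for one feeder: (customers_affected, duration_hours)
outages = [(120, 1.5), (300, 0.5), (45, 4.0)]
total_customers = 1000  # customers served by the feeder (assumed)

# SAIFI: average number of interruptions per customer served
saifi = sum(n for n, _ in outages) / total_customers
# SAIDI: average interruption duration (hours) per customer served
saidi = sum(n * d for n, d in outages) / total_customers
# CAIDI: average restoration time per interruption
caidi = saidi / saifi
```

    Switching devices and DGs improve these indices by shrinking either the number of customers each fault interrupts (SAIFI) or the time they stay interrupted (SAIDI).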

  20. Risk analysis for a local gas distribution network

    International Nuclear Information System (INIS)

    Peters, J.W.

    1991-01-01

    Cost control and service reliability are popular topics when discussing the strategic issues facing local distribution companies (LDCs) in the 1990s. The ability to provide secure and uninterrupted gas service is crucial for growth and company image, both with the public and with regulatory agencies. At the same time, the industry is facing unprecedented competition from alternate fuels, and cost control is essential for maintaining a competitive edge in the market. On the surface, it would appear that cost control and service reliability are contradictory terms. Improvement in service reliability should cost something, or does it? Risk analysis can provide the answer from a distribution design perspective. From a gas distribution engineer's perspective, projects such as loops, backfeeds and even valve placement are designed to reduce, minimize and/or eliminate potential customer outages. These projects improve service reliability by acting as backups should a failure occur on a component of the distribution network. These contingency projects are cost-effective, but their long-term benefit, or true value, is in question. Their purpose is to maintain supply to an area of the distribution network in the event of a failure somewhere else. Two phrases, 'potential customer outages' and 'in the event of a failure', identify uncertainty

  1. Analysis of rainfall distribution in Kelantan river basin, Malaysia

    Science.gov (United States)

    Che Ros, Faizah; Tosaka, Hiroyuki

    2018-03-01

    Using rain gauges on their own as input carries great uncertainties regarding runoff estimation, especially when the area is large and the rainfall is measured and recorded at irregularly spaced gauging stations. Hence, spatial interpolation is the key to obtaining a continuous and orderly rainfall distribution at unknown points, to serve as input to the rainfall-runoff processes of distributed and semi-distributed numerical models. It is crucial to study and predict the behaviour of rainfall and river runoff to reduce flood damage in the affected areas along the Kelantan river. Thus, a good knowledge of the rainfall distribution is essential in early flood prediction studies. Forty-six rainfall stations and their daily time series were used to interpolate gridded rainfall surfaces using the inverse-distance weighting (IDW) and inverse-distance and elevation weighting (IDEW) methods, as well as the average rainfall distribution. A sensitivity analysis of the distance and elevation parameters was conducted to see the variation produced. The accuracy of these interpolated datasets was examined using cross-validation assessment.
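    The IDW step can be sketched compactly: each grid point is estimated as a weighted average of gauge readings, with weights decaying as an inverse power of distance (IDEW additionally weights by an elevation term). Gauge coordinates and rainfall values below are assumed for illustration.

```python
import numpy as np

def idw(xy_known, values, xy_query, power=2.0):
    """Inverse-distance weighting: each gauge weighted by 1/d**power."""
    d = np.linalg.norm(xy_known[None, :, :] - xy_query[:, None, :], axis=2)
    d = np.maximum(d, 1e-12)            # avoid division by zero at a gauge
    w = 1.0 / d**power
    return (w @ values) / w.sum(axis=1)

gauges = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])  # gauge coordinates, km
rain   = np.array([5.0, 15.0, 10.0])                       # mm/day at each gauge
grid   = np.array([[5.0, 5.0]])                            # one query point
est = idw(gauges, rain, grid)
```

    With the three gauges equidistant from the query point, the estimate is simply their mean, 10 mm/day; the `power` exponent is one of the parameters a sensitivity analysis would vary.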

  2. Size distribution measurements and chemical analysis of aerosol components

    Energy Technology Data Exchange (ETDEWEB)

    Pakkanen, T.A.

    1995-12-31

    The principal aims of this work were to improve the existing methods for size distribution measurements and to draw conclusions about atmospheric and in-stack aerosol chemistry and physics by utilizing the measured size distributions of various aerosol components. Sample dissolution with dilute nitric acid in an ultrasonic bath and subsequent graphite furnace atomic absorption spectrometric analysis was found to result in low blank values and good recoveries for several elements in atmospheric fine-particle size fractions below 2 µm equivalent aerodynamic diameter (EAD). Furthermore, it turned out that a substantial amount of the analytes associated with insoluble material could be recovered, since suspensions were formed. The size distribution measurements of in-stack combustion aerosols indicated bimodal size distributions for most components measured. The existence of the fine-particle mode suggests that a substantial fraction of such elements with bimodal size distributions may vaporize and nucleate during the combustion process. In southern Norway, size distributions of atmospheric aerosol components usually exhibited one or two fine-particle modes and one or two coarse-particle modes. Atmospheric relative humidity values higher than 80% resulted in a significant increase of the mass median diameters of the droplet mode. Important local and/or regional sources of As, Br, I, K, Mn, Pb, Sb, Si and Zn were found to exist in southern Norway. The existence of these sources was reflected in the corresponding size distributions determined, and was utilized in the development of a source identification method based on size distribution data. On the Finnish south coast, atmospheric coarse-particle nitrate was found to be formed mostly through an atmospheric reaction of nitric acid with existing coarse-particle sea salt, but reactions and/or adsorption of nitric acid with soil-derived particles also occurred. Chloride was depleted when acidic species reacted

  3. Comparing Distributions of Environmental Outcomes for Regulatory Environmental Justice Analysis

    Directory of Open Access Journals (Sweden)

    Glenn Sheriff

    2011-05-01

    Economists have long been interested in measuring the distributional impacts of policy interventions. As environmental justice (EJ) emerged as an ethical issue in the 1970s, the academic literature has provided statistical analyses of the incidence and causes of various environmental outcomes as they relate to race, income, and other demographic variables. In the context of regulatory impacts, however, there is a lack of consensus regarding what information is relevant for EJ analysis, and how best to present it. This paper helps frame the discussion by suggesting a set of questions fundamental to regulatory EJ analysis, reviewing past approaches to quantifying distributional equity, and discussing the potential for adapting existing tools to the regulatory context.

  4. DIRAC - The Distributed MC Production and Analysis for LHCb

    CERN Document Server

    Tsaregorodtsev, A

    2004-01-01

    DIRAC is the LHCb distributed computing grid infrastructure for MC production and analysis. Its architecture is based on a set of distributed collaborating services. The service decomposition broadly follows the ARDA project proposal, allowing for the possibility of interchanging the EGEE/ARDA and DIRAC components in the future. Some components developed outside the DIRAC project are already in use as services, for example the File Catalog developed by the AliEn project. An overview of the DIRAC architecture will be given, covering in particular the recent developments to support user analysis. The main design choices will be presented. One of the main design goals of DIRAC is simplicity of installation, configuration and operation of the various services. This allows all the DIRAC resources to be easily managed by a single Production Manager. The modular design of the DIRAC components allows its functionality to be easily extended to include new computing and storage elements or to handle new tasks. The DIRAC system al...

  5. Energy system analysis of fuel cells and distributed generation

    DEFF Research Database (Denmark)

    Mathiesen, Brian Vad; Lund, Henrik

    2007-01-01

    This chapter introduces Energy System Analysis methodologies and tools, which can be used for identifying the best application of different Fuel Cell (FC) technologies to different regional or national energy systems. The main point is that the benefits of using FC technologies indeed depend on the energy system in which they are used. Consequently, coherent energy systems analyses of specific and complete energy systems must be conducted in order to evaluate the benefits of FC technologies and in order to be able to compare alternative solutions. In relation to distributed generation, FC technologies are very often connected to the use of hydrogen, which has to be provided e.g. from electrolysers. Decentralised and distributed generation has the possibility of improving the overall energy efficiency and flexibility of energy systems. Therefore, energy system analysis tools and methodologies...

  6. Similarity estimation for reference image retrieval in mammograms using convolutional neural network

    Science.gov (United States)

    Muramatsu, Chisako; Higuchi, Shunichi; Morita, Takako; Oiwa, Mikinao; Fujita, Hiroshi

    2018-02-01

    Periodic breast cancer screening with mammography is considered effective in decreasing breast cancer mortality. For screening programs to be successful, an intelligent image analysis system may support radiologists' efficient image interpretation. In our previous studies, we have investigated image retrieval schemes for diagnostic references of breast lesions on mammograms and ultrasound images. Using a machine learning method, reliable similarity measures that agree with radiologists' similarity were determined, and relevant images could be retrieved. However, our previous method includes a feature extraction step in which hand-crafted features were determined based on manual outlines of the masses. Obtaining manual outlines of masses is not practical in clinical settings, and such data would be operator-dependent. In this study, we investigated a similarity estimation scheme using a convolutional neural network (CNN) to skip this procedure and to determine data-driven similarity scores. By using the CNN as a feature extractor, with the extracted features employed in the determination of similarity measures by a conventional 3-layered neural network, the determined similarity measures correlated well with the subjective ratings, and the precision of retrieving diagnostically relevant images was comparable with that of the conventional method using hand-crafted features. Using the CNN to determine the similarity measure directly gave comparable results. By optimizing the network parameters, the results may be further improved. The proposed method is potentially useful for determining similarity measures without precise lesion outlines for the retrieval of similar mass images on mammograms.
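    A retrieval scheme of this kind reduces, at query time, to ranking a reference library by the similarity of feature vectors. The sketch below uses tiny hand-assumed vectors in place of real CNN features, and cosine similarity stands in for the learned similarity measure.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical CNN feature vectors for a query mass and a reference library
query = np.array([0.9, 0.1, 0.4])
library = {
    "case_A": np.array([0.8, 0.2, 0.5]),
    "case_B": np.array([0.1, 0.9, 0.2]),
}

# Rank reference cases by decreasing similarity to the query
ranked = sorted(library, key=lambda k: cosine_similarity(query, library[k]),
                reverse=True)
```

    The learned measure in the study replaces this fixed geometric similarity with one trained to agree with radiologists' subjective ratings.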

  7. AND LANDSCAPE-ECOLOGICAL ANALYSIS OF ITS DISTRIBUTION

    Directory of Open Access Journals (Sweden)

    S. M. Musaeva

    2012-01-01

    The article is devoted to the study of the helminth fauna of the striped lizard in the Lankaran natural region. A landscape and ecological analysis of the distribution of the helminth fauna is provided. As a result of studies on 99 individuals of the striped lizard, a total of 14 helminth species were found, including 1 trematode, 1 cestode, 3 acanthocephalans and 9 nematodes.

  8. Automatic analysis of attack data from distributed honeypot network

    Science.gov (United States)

    Safarik, Jakub; Voznak, MIroslav; Rezac, Filip; Partila, Pavol; Tomala, Karel

    2013-05-01

    There are many ways of getting real data about malicious activity in a network. One of them relies on masquerading monitoring servers as production ones. These servers are called honeypots, and data about attacks on them brings us valuable information about actual attacks and the techniques used by hackers. The article describes a distributed topology of honeypots, which was developed with a strong orientation toward monitoring IP telephony traffic. IP telephony servers can easily be exposed to various types of attacks, and without protection this situation can lead to loss of money and other unpleasant consequences. Using a distributed topology, with honeypots placed in different geographical locations and networks, provides more valuable and independent results. With an automatic system gathering information from all honeypots, it is possible to work with all the information at one centralized point. Communication between the honeypots and the centralized data store uses secure SSH tunnels, and the server communicates only with authorized honeypots. The centralized server also automatically analyses the data from each honeypot. The results of this analysis, along with other statistical data about malicious activity, are easily accessible through a built-in web server. All statistical and analysis reports serve as the information basis for an algorithm which classifies the different types of VoIP attacks used. The web interface then provides a tool for quick comparison and evaluation of actual attacks in all monitored networks. The article describes both the honeypot nodes in the distributed architecture, which monitor suspicious activity, and the methods and algorithms used on the server side for the analysis of the gathered data.

  9. Performance analysis of distributed beamforming in a spectrum sharing system

    KAUST Repository

    Yang, Liang; Alouini, Mohamed-Slim; Qaraqe, Khalid A.

    2012-01-01

    In this paper, we consider a distributed beamforming scheme (DBF) in a spectrum sharing system where multiple secondary users share the spectrum with the licensed primary users under an interference temperature constraint. We assume that DBF is applied at the secondary users. We first consider optimal beamforming and compare it with the user selection scheme in terms of the outage probability and bit-error rate performance. Since perfect feedback is difficult to obtain, we then investigate a limited feedback DBF scheme and develop an outage probability analysis for a random vector quantization (RVQ) design algorithm. Numerical results are provided to illustrate our mathematical formalism and verify our analysis. © 2012 IEEE.
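    The outage probability under idealized full-feedback beamforming can be checked by simulation. The sketch below is an illustrative model with assumed parameters, not the paper's RVQ limited-feedback analysis, and it ignores the interference-temperature constraint: the combined SNR over N Rayleigh branches is treated as a sum of i.i.d. exponentials, and a Monte Carlo estimate is compared against the closed-form Gamma CDF.

```python
import numpy as np
from scipy.special import gammainc  # regularized lower incomplete gamma

rng = np.random.default_rng(0)
N, snr_bar, gamma_th = 4, 1.0, 0.5   # branches, mean per-branch SNR, threshold (assumed)

# With maximum-ratio-style combining over Rayleigh channels, the combined SNR
# is a sum of N i.i.d. exponentials, i.e. a Gamma(N, snr_bar) variate.
snr = rng.exponential(snr_bar, size=(1_000_000, N)).sum(axis=1)
p_out_mc = (snr < gamma_th).mean()

# Closed-form outage probability: P(N, gamma_th / snr_bar)
p_out_exact = gammainc(N, gamma_th / snr_bar)
```

    Limited feedback (the RVQ case analyzed in the paper) degrades the effective combined SNR, so its outage curve sits above this full-feedback baseline.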

  10. Performance analysis of distributed beamforming in a spectrum sharing system

    KAUST Repository

    Yang, Liang

    2012-09-01

    In this paper, we consider a distributed beamforming scheme (DBF) in a spectrum sharing system where multiple secondary users share the spectrum with the licensed primary users under an interference temperature constraint. We assume that DBF is applied at the secondary users. We first consider optimal beamforming and compare it with the user selection scheme in terms of the outage probability and bit-error rate performance. Since perfect feedback is difficult to obtain, we then investigate a limited feedback DBF scheme and develop an outage probability analysis for a random vector quantization (RVQ) design algorithm. Numerical results are provided to illustrate our mathematical formalism and verify our analysis. © 2012 IEEE.

  11. Thermographic Analysis of Stress Distribution in Welded Joints

    Directory of Open Access Journals (Sweden)

    Domazet Ž.

    2010-06-01

    The fatigue life prediction of welded joints based on S-N curves in conjunction with nominal stresses is generally not reliable. The stress distribution in the welded area, affected by geometrical inhomogeneity, irregular weld surface and weld toe radius, is quite complex, so the local (structural) stress concept has been adopted in recent papers. The aim of this paper is to determine the stress distribution in plate-type aluminum welded joints, to analyze the reliability of TSA (Thermal Stress Analysis) in this kind of investigation, and to obtain numerical values of stress concentration factors for practical use. The stress distribution in aluminum butt and fillet welded joints is determined using three different methods: strain gauge measurement, thermal stress analysis and FEM. The obtained results show good agreement - the TSA mutually confirmed the FEM model and the stresses measured by strain gauges. According to the obtained results, it may be stated that TSA, as a relatively new measurement technique, may in the future become a standard tool for the experimental investigation of stress concentration and fatigue in welded joints, and can help to develop more accurate numerical tools for fatigue life prediction.

  12. Thermographic Analysis of Stress Distribution in Welded Joints

    Science.gov (United States)

    Piršić, T.; Krstulović Opara, L.; Domazet, Ž.

    2010-06-01

    The fatigue life prediction of welded joints based on S-N curves in conjunction with nominal stresses is generally not reliable. The stress distribution in the welded area, affected by geometrical inhomogeneity, irregular weld surface and weld toe radius, is quite complex, so the local (structural) stress concept has been adopted in recent papers. The aim of this paper is to determine the stress distribution in plate-type aluminum welded joints, to analyze the reliability of TSA (Thermal Stress Analysis) in this kind of investigation, and to obtain numerical values of stress concentration factors for practical use. The stress distribution in aluminum butt and fillet welded joints is determined using three different methods: strain gauge measurement, thermal stress analysis and FEM. The obtained results show good agreement - the TSA mutually confirmed the FEM model and the stresses measured by strain gauges. According to the obtained results, it may be stated that TSA, as a relatively new measurement technique, may in the future become a standard tool for the experimental investigation of stress concentration and fatigue in welded joints, and can help to develop more accurate numerical tools for fatigue life prediction.

  13. Development of a site analysis tool for distributed wind projects

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, Shawn [The Cadmus Group, Inc., Waltham MA (United States)

    2012-02-28

    The Cadmus Group, Inc., in collaboration with the National Renewable Energy Laboratory (NREL) and Encraft, was awarded a grant from the Department of Energy (DOE) to develop a site analysis tool for distributed wind technologies. As the principal investigator for this project, Mr. Shawn Shaw was responsible for overall project management, direction, and technical approach. The product resulting from this project is the Distributed Wind Site Analysis Tool (DSAT), a software tool for analyzing proposed sites for distributed wind technology (DWT) systems. This user-friendly tool supports the long-term growth and stability of the DWT market by providing reliable, realistic estimates of site and system energy output and feasibility. DSAT, which is accessible online and requires no purchase or download of software, is available in two account types. Standard: this free account allows the user to analyze a limited number of sites and to produce a system performance report for each. Professional: for a small annual fee, users can analyze an unlimited number of sites, produce system performance reports, and generate other customizable reports containing key information such as visual influence and wind resources. The tool's interactive maps allow users to create site models that incorporate the obstructions and terrain types present. Users can generate site reports immediately after entering the requisite site information. Ideally, this tool also educates users regarding good site selection and effective evaluation practices.

  14. Componential distribution analysis of food using near infrared ray image

    Science.gov (United States)

    Yamauchi, Hiroki; Kato, Kunihito; Yamamoto, Kazuhiko; Ogawa, Noriko; Ohba, Kimie

    2008-11-01

    The components of food related to "deliciousness" are usually evaluated by componential analysis, which determines the content and type of components in the food. However, componential analysis cannot resolve spatial detail, and the measurement is time-consuming. We propose a method to measure the two-dimensional distribution of a component in food using near-infrared (IR) images. The advantage of our method is the ability to visualize invisible components. Many components in food have characteristic absorption and reflection of light in the IR range. The component content is measured using subtraction between images at two wavelengths of near-IR light. In this paper, we describe a method to measure food components using near-IR image processing, and we show an application to visualize the saccharose in a pumpkin.
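    The two-wavelength subtraction can be sketched directly: pixels where the reference image is much brighter than the image at the absorbing wavelength are rich in the target component. The tiny images and the 0.1 threshold below are assumed values for illustration.

```python
import numpy as np

# Hypothetical near-IR reflectance images at two wavelengths
img_absorb = np.array([[0.60, 0.58],
                       [0.20, 0.59]])  # the target component absorbs here
img_ref    = np.array([[0.62, 0.60],
                       [0.61, 0.61]])  # little absorption at this wavelength

# Difference image: large values flag pixels rich in the target component
component_map = img_ref - img_absorb
mask = component_map > 0.1             # assumed detection threshold
```

    In this toy example only the lower-left pixel is flagged, mimicking a localized pocket of the component.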

  15. Principal Component Analysis for Normal-Distribution-Valued Symbolic Data.

    Science.gov (United States)

    Wang, Huiwen; Chen, Meiling; Shi, Xiaojun; Li, Nan

    2016-02-01

    This paper puts forward a new approach to principal component analysis (PCA) for normal-distribution-valued symbolic data, which has a vast potential of applications in the economics and management fields. We derive a full set of numerical characteristics and the variance-covariance structure for such data, which forms the foundation of our analytical PCA approach. Our approach is able to use all of the variance information in the original data, unlike the prevailing representative-type approaches in the literature, which use only centers, vertices, etc. The paper also provides an accurate approach to constructing the observations in PC space based on the linear additivity property of the normal distribution. The effectiveness of the proposed method is illustrated by simulated numerical experiments. Finally, our method is applied to explain the puzzle of the risk-return tradeoff in China's stock market.
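    For contrast, the representative-type baseline the paper improves upon amounts to ordinary PCA on the distribution centers alone, discarding the within-observation variance. A minimal sketch of that baseline, on synthetic means with assumed dimensions:

```python
import numpy as np

rng = np.random.default_rng(2)
means = rng.normal(size=(50, 3))   # 50 observations, 3 symbolic variables (centers only)

# Ordinary PCA on the centers: eigendecomposition of the sample covariance
X = means - means.mean(axis=0)
cov = X.T @ X / (len(X) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]  # components by decreasing explained variance
scores = X @ eigvecs[:, order]
```

    The paper's contribution is precisely to fold the per-observation variance-covariance information into this decomposition rather than operating on `means` alone.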

  16. Growing axons analysis by using Granulometric Size Distribution

    International Nuclear Information System (INIS)

    Gonzalez, Mariela A; Ballarin, Virginia L; Rapacioli, Melina; CelIn, A R; Sanchez, V; Flores, V

    2011-01-01

    Neurite growth (neuritogenesis) in vitro is a common methodology in the field of developmental neurobiology. Morphological analysis of growing neurites is usually difficult because their thinness and low contrast usually prevent clear observation of their shape, number, length and spatial orientation. This paper presents the use of the granulometric size distribution to automatically obtain information about the shape, size and spatial orientation of growing axons in tissue cultures. The results presented here show that the granulometric size distribution is a very useful morphological tool, since it allows the automatic detection of growing axons and the precise characterization of a relevant parameter indicative of the axonal growth spatial orientation, namely the quantification of the angle of deviation of the growing direction. The developed algorithms automatically quantify this orientation, facilitating the analysis of these images, which is important given the large number of images that need to be processed in this type of study.
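    A granulometric size distribution (pattern spectrum) is obtained by applying morphological openings with structuring elements of increasing size and recording the area removed at each scale. The sketch below uses square structuring elements on a synthetic binary image; the study works on axon images and also extracts orientation, which is not shown here.

```python
import numpy as np
from scipy import ndimage

def granulometry(binary, max_radius):
    """Pattern spectrum: area removed by openings of increasing radius."""
    areas = [binary.sum()]
    for r in range(1, max_radius + 1):
        se = np.ones((2 * r + 1, 2 * r + 1), bool)  # square structuring element
        areas.append(ndimage.binary_opening(binary, structure=se).sum())
    return -np.diff(areas)  # area lost at each scale

img = np.zeros((20, 20), bool)
img[2:5, 2:5] = True       # small 3x3 object
img[8:15, 8:15] = True     # larger 7x7 object
spectrum = granulometry(img, 4)
```

    The spectrum peaks at the scales where each object disappears, which is how granulometry encodes object size; with oriented (e.g. linear) structuring elements the same idea yields the growth-direction information used in the paper.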

  17. Water hammer analysis in a water distribution system

    Directory of Open Access Journals (Sweden)

    John Twyman

    2017-04-01

    The solution of water hammer in a water distribution system (WDS) is shown by applying three hybrid methods (HM) based on Box's scheme, McCormack's method and the Diffusive Scheme. Each HM formulation is reviewed, together with its relative advantages and disadvantages. The analyzed WDS has pipes with different lengths, diameters and wave speeds, so the Courant number differs in each pipe according to the adopted discretization. The HM results are compared with the results obtained by the Method of Characteristics (MOC). Regarding numerical attenuation, the second-order schemes based on Box and McCormack are more conservative from a numerical point of view, making them recommendable for the analysis of water hammer in water distribution systems.
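    The per-pipe Courant number that drives the choice of scheme is straightforward to compute as Cn = a·Δt/Δx. Pipe lengths, wave speeds, reach counts and the time step below are assumed values for illustration:

```python
# Per-pipe Courant number Cn = a*dt/dx for a common time step (values assumed)
pipes = {                      # name: (length L in m, wave speed a in m/s, reaches N)
    "P1": (600.0, 1200.0, 10),
    "P2": (450.0, 1000.0, 10),
}
dt = 0.05                      # shared time step, s

cn = {name: a * dt / (L / n) for name, (L, a, n) in pipes.items()}
# Here P1 lands exactly on the MOC characteristic grid (Cn = 1),
# while P2 does not (Cn > 1), which is where hybrid schemes become attractive.
```

    A single shared time step rarely gives Cn = 1 in every pipe of a real WDS, so either interpolation (with its numerical attenuation) or a hybrid scheme is needed.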

  18. Distributed analysis using GANGA on the EGEE/LCG infrastructure

    International Nuclear Information System (INIS)

    Elmsheuser, J; Brochu, F; Harrison, K; Egede, U; Gaidioz, B; Liko, D; Maier, A; Moscicki, J; Muraru, A; Lee, H-C; Romanovsky, V; Soroko, A; Tan, C L

    2008-01-01

    The distributed data analysis using Grid resources is one of the fundamental applications in high energy physics to be addressed and realized before the start of LHC data taking. The need to facilitate access to the resources is very high. In every experiment, up to a thousand physicists will be submitting analysis jobs to the Grid. Appropriate user interfaces and helper applications have to be made available to ensure that all users can use the Grid without much expertise in Grid technology. These tools enlarge the number of Grid users from a few production administrators to potentially all participating physicists. The GANGA job management system (http://cern.ch/ganga), developed as a common project between the ATLAS and LHCb experiments, provides and integrates these kinds of tools. GANGA provides a simple and consistent way of preparing, organizing and executing analysis tasks within the experiment analysis framework, implemented through a plug-in system. It allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid, hiding Grid technicalities. We report on the plug-ins and our experiences with distributed data analysis using GANGA within the ATLAS experiment and the EGEE/LCG infrastructure. The integration of the ATLAS data management system DQ2 into GANGA is a key functionality. In combination with the job-splitting mechanism, large numbers of jobs can be sent to the locations of the data, following the ATLAS computing model. GANGA supports tasks of user analysis with reconstructed data and small-scale production of Monte Carlo data

  19. Statistical analysis of the spatial distribution of galaxies and clusters

    International Nuclear Information System (INIS)

    Cappi, Alberto

    1993-01-01

    This thesis deals with the analysis of the distribution of galaxies and clusters, describing some observational problems and statistical results. The first chapter gives a theoretical introduction, aiming to describe the framework of the formation of structures and tracing the history of the Universe from the Planck time, t_P = 10^-43 s, at a temperature corresponding to 10^19 GeV, to the present epoch. The most usual statistical tools and models of the galaxy distribution, with their advantages and limitations, are described in chapter two. A study of the main observed properties of galaxy clustering, together with a detailed statistical analysis of the effects of selecting galaxies according to apparent magnitude or diameter, is reported in chapter three. Chapter four delineates some properties of groups of galaxies, explaining the reasons for discrepant results on group distributions. Chapter five is a study of the distribution of galaxy clusters with different statistical tools, like correlations, percolation, the void probability function and counts in cells; the same scaling-invariant behaviour as for galaxies is found. Chapter six describes our finding that rich galaxy clusters too belong to the fundamental plane of elliptical galaxies, and discusses its possible implications. Finally, chapter seven reviews the possibilities offered by multi-slit and multi-fibre spectrographs, and presents some observational work on nearby and distant galaxy clusters. In particular, it shows the opportunities offered by ongoing surveys of galaxies coupled with multi-object fibre spectrographs, focusing on the ESO Key Programme "A galaxy redshift survey in the south galactic pole region", in which I collaborate, and on MEFOS, a multi-fibre instrument with automatic positioning. Published papers related to the work described in this thesis are reported in the last appendix. (author) [fr

  20. Analysis of Statistical Distributions Used for Modeling Reliability and Failure Rate of Temperature Alarm Circuit

    International Nuclear Information System (INIS)

    El-Shanshoury, G.I.

    2011-01-01

    Several statistical distributions are used to model various reliability and maintainability parameters. The applied distribution depends on the nature of the data being analyzed. The present paper deals with the analysis of some statistical distributions used in reliability, in order to reach the best-fitting distribution. The calculations rely on circuit quantity parameters obtained using the Relex 2009 computer program. The statistical analysis of ten different distributions indicated that the Weibull distribution gives the best fit for modeling the reliability of the data set of the Temperature Alarm Circuit (TAC), whereas the Exponential distribution is found to be the best fit for modeling the failure rate.
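    The best-fit selection can be sketched with standard tools: fit candidate distributions by maximum likelihood and compare their log-likelihoods. The study used Relex 2009 and ten candidate distributions; here only Weibull and Exponential are compared, on synthetic failure times.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic failure times drawn from a Weibull law (shape 1.8), for illustration only
times = stats.weibull_min.rvs(1.8, scale=100.0, size=500, random_state=rng)

fits = {}
for name, dist in {"weibull": stats.weibull_min, "exponential": stats.expon}.items():
    params = dist.fit(times, floc=0)        # maximum-likelihood fit, location fixed at 0
    fits[name] = dist.logpdf(times, *params).sum()  # log-likelihood of the fit

best = max(fits, key=fits.get)              # distribution with the highest likelihood
```

    Since the Exponential is the shape-1 special case of the Weibull, data with an increasing hazard rate favors the Weibull fit, mirroring the paper's finding for the TAC reliability data.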

  1. Automated registration of diagnostic to prediagnostic x-ray mammograms: Evaluation and comparison to radiologists' accuracy

    International Nuclear Information System (INIS)

    Pinto Pereira, Snehal M.; Hipwell, John H.; McCormack, Valerie A.; Tanner, Christine; Moss, Sue M.; Wilkinson, Louise S.; Khoo, Lisanne A. L.; Pagliari, Catriona; Skippage, Pippa L.; Kliger, Carole J.; Hawkes, David J.; Santos Silva, Isabel M. dos

    2010-01-01

    Purpose: To compare and evaluate intensity-based registration methods for computation of serial x-ray mammogram correspondence. Methods: X-ray mammograms were simulated from MRIs of 20 women using finite element methods for modeling breast compressions and employing a MRI/x-ray appearance change model. The parameter configurations of three registration methods, affine, fluid, and free-form deformation (FFD), were optimized for registering x-ray mammograms on these simulated images. Five mammography film readers independently identified landmarks (tumor, nipple, and usually two other normal features) on pairs of diagnostic and corresponding prediagnostic digitized images from 52 breast cancer cases. Landmarks were independently reidentified by each reader. Target registration errors were calculated to compare the three registration methods using the reader landmarks as a gold standard. Data were analyzed using multilevel methods. Results: Between-reader variability varied with landmark (p<0.01) and screen (p=0.03), with between-reader mean distance (mm) in point location on the diagnostic/prediagnostic images of 2.50 (95% CI 1.95, 3.15)/2.84 (2.24, 3.55) for nipples and 4.26 (3.43, 5.24)/4.76 (3.85, 5.84) for tumors. Registration accuracy was sensitive to the type of landmark and the amount of breast density. For dense breasts (≥40%), the affine and fluid methods outperformed FFD. For breasts with lower density, the affine registration surpassed both fluid and FFD. Mean accuracy (mm) of the affine registration varied between 3.16 (95% CI 2.56, 3.90) for nipple points in breasts with density 20%-39% and 5.73 (4.80, 6.84) for tumor points in breasts with density <20%. Conclusions: Affine registration accuracy was comparable to that between independent film readers. More advanced two-dimensional nonrigid registration algorithms were incapable of increasing the accuracy of image alignment when compared to affine registration.
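    Target registration error is simply the distance between transformed landmarks and their gold-standard positions. A minimal sketch with an assumed 2-D affine transform and hypothetical landmark coordinates:

```python
import numpy as np

def tre(transform, moving_pts, fixed_pts):
    """Mean Euclidean distance between mapped landmarks and their gold standard."""
    mapped = moving_pts @ transform[:2, :2].T + transform[:2, 2]
    return float(np.linalg.norm(mapped - fixed_pts, axis=1).mean())

# Hypothetical 2x3 affine (here a pure translation by (3, -1), coordinates in mm)
A = np.array([[1.0, 0.0,  3.0],
              [0.0, 1.0, -1.0]])
moving = np.array([[10.0, 20.0], [35.0, 5.0]])   # landmarks on prediagnostic image
fixed  = np.array([[13.0, 19.0], [38.0, 4.5]])   # reader landmarks on diagnostic image
err = tre(A, moving, fixed)
```

    In the study, independently re-identified reader landmarks play the role of `fixed`, so between-reader landmark variability sets the floor against which registration accuracy is judged.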

  2. Simultaneous detection and classification of breast masses in digital mammograms via a deep learning YOLO-based CAD system.

    Science.gov (United States)

    Al-Masni, Mohammed A; Al-Antari, Mugahed A; Park, Jeong-Min; Gi, Geon; Kim, Tae-Yeon; Rivera, Patricio; Valarezo, Edwin; Choi, Mun-Taek; Han, Seung-Moo; Kim, Tae-Seong

    2018-04-01

Automatic detection and classification of masses in mammograms remain a major challenge and play a crucial role in assisting radiologists toward an accurate diagnosis. In this paper, we propose a novel Computer-Aided Diagnosis (CAD) system based on one of the regional deep learning techniques: an ROI-based Convolutional Neural Network (CNN) called You Only Look Once (YOLO). Although most previous studies only deal with classification of masses, our proposed YOLO-based CAD system handles detection and classification simultaneously in one framework. The proposed CAD system contains four main stages: preprocessing of mammograms, feature extraction utilizing deep convolutional networks, mass detection with confidence scores, and finally mass classification using Fully Connected Neural Networks (FC-NNs). In this study, we utilized 600 original mammograms from the Digital Database for Screening Mammography (DDSM) and 2,400 augmented mammograms, with the information of the masses and their types, in training and testing our CAD. The trained YOLO-based CAD system detects the masses and then classifies their types into benign or malignant. Our results with five-fold cross-validation tests show that the proposed CAD system detects the mass location with an overall accuracy of 99.7%. The system also distinguishes between benign and malignant lesions with an overall accuracy of 97%. Our proposed system even works on some challenging breast cancer cases where the masses exist over the pectoral muscles or dense regions. Copyright © 2018 Elsevier B.V. All rights reserved.
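Detection accuracy in systems like this is commonly scored by intersection-over-union (IoU) between predicted and ground-truth mass boxes; the abstract does not state the exact criterion used, so the following is a generic sketch with made-up box coordinates.

```python
def iou(box_a, box_b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)   # intersection corners
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union if union else 0.0

# A predicted mass box counts as a correct detection when its IoU with
# the ground-truth box exceeds a threshold (0.5 is a common choice).
print(iou((0, 0, 10, 10), (5, 0, 15, 10)))  # 0.3333...
```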

  3. Computer-Aided Detection in Digital Mammography: False-Positive Marks and Their Reproducibility in Negative Mammograms

    International Nuclear Information System (INIS)

    Kim, Seung Ja; Moon, Woo Kyung; Cho, Nariya; Chang, Jung Min; Seong, Min Hyun

    2009-01-01

Background: There are relatively few studies reporting the frequency of false-positive computer-aided detection (CAD) marks and their reproducibility in normal cases. Purpose: To evaluate retrospectively the false-positive mark rate of a CAD system and the reproducibility of false-positive marks in two sets of negative digital mammograms. Material and Methods: Two sets of negative digital mammograms were obtained in 360 women (mean age 57 years, range 30-76 years) with an approximate interval of 1 year (mean time 343.7 days), and a CAD system was applied. False-positive CAD marks and their reproducibility were determined. Results: Of the 360 patients, 252 (70.0%) and 240 (66.7%) had 1-7 CAD marks on the initial and second mammograms, respectively. The false-positive CAD mark rate was 1.5 (1.1 for masses and 0.4 for calcifications) and 1.4 (1.0 for masses and 0.4 for calcifications) per examination in the initial and second mammograms, respectively. The reproducibility of the false-positive CAD marks was 12.0% for both mass (81/680) and microcalcification (33/278) marks. Conclusion: False-positive CAD marks were seen in approximately 70% of normal cases. However, the reproducibility was very low. Radiologists must be familiar with the findings of false-positive CAD marks, since they are very common and can increase the recall rate in screening.
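The reported ~12% reproducibility can be checked directly from the raw counts given in the abstract (81/680 mass marks, 33/278 microcalcification marks); both fractions come out at roughly 12%:

```python
# Reproducing the reported reproducibility figures from the raw counts
reproducible_mass, total_mass = 81, 680
reproducible_calc, total_calc = 33, 278

print(round(100 * reproducible_mass / total_mass, 1))  # 11.9
print(round(100 * reproducible_calc / total_calc, 1))  # 11.9
```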

  4. Job optimization in ATLAS TAG-based distributed analysis

    Science.gov (United States)

    Mambelli, M.; Cranshaw, J.; Gardner, R.; Maeno, T.; Malon, D.; Novak, M.

    2010-04-01

    The ATLAS experiment is projected to collect over one billion events/year during the first few years of operation. The efficient selection of events for various physics analyses across all appropriate samples presents a significant technical challenge. ATLAS computing infrastructure leverages the Grid to tackle the analysis across large samples by organizing data into a hierarchical structure and exploiting distributed computing to churn through the computations. This includes events at different stages of processing: RAW, ESD (Event Summary Data), AOD (Analysis Object Data), DPD (Derived Physics Data). Event Level Metadata Tags (TAGs) contain information about each event stored using multiple technologies accessible by POOL and various web services. This allows users to apply selection cuts on quantities of interest across the entire sample to compile a subset of events that are appropriate for their analysis. This paper describes new methods for organizing jobs using the TAGs criteria to analyze ATLAS data. It further compares different access patterns to the event data and explores ways to partition the workload for event selection and analysis. Here analysis is defined as a broader set of event processing tasks including event selection and reduction operations ("skimming", "slimming" and "thinning") as well as DPD making. Specifically it compares analysis with direct access to the events (AOD and ESD data) to access mediated by different TAG-based event selections. We then compare different ways of splitting the processing to maximize performance.
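The core idea of TAG-mediated selection — filtering on compact event-level metadata before touching the much larger AOD/ESD event data — can be sketched as follows; the record fields and cut values are invented for illustration and are not ATLAS TAG schema.

```python
# Hypothetical TAG records: per-event metadata rows with quantities of interest
tags = [
    {"event_id": 1, "n_jets": 4, "missing_et": 55.0},
    {"event_id": 2, "n_jets": 1, "missing_et": 12.0},
    {"event_id": 3, "n_jets": 3, "missing_et": 80.0},
]

# A selection cut over TAG quantities compiles the list of event IDs to
# fetch, so the expensive event data is only read for the selected subset.
selected = [t["event_id"] for t in tags
            if t["n_jets"] >= 3 and t["missing_et"] > 40.0]
print(selected)  # [1, 3]
```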

  5. Distributed analysis environment for HEP and interdisciplinary applications

    International Nuclear Information System (INIS)

    Moscicki, J.T.

    2003-01-01

Huge data volumes of Large Hadron Collider experiments require parallel end-user analysis on clusters of hundreds of machines. While the focus of end-user High-Energy Physics analysis is on ntuples, the master-worker model of parallel processing may also be used in other contexts such as detector simulation. The aim of the DIANE R and D project (http://cern.ch/it-proj-diane), currently carried out by the CERN IT/API group, is to create a generic, component-based framework for distributed, parallel data processing in the master-worker model. Pre-compiled user analysis code is loaded dynamically at runtime into component libraries and called back when appropriate. Such an application-oriented framework must be flexible enough to integrate with emerging GRID technologies as they become available. Therefore, common services such as environment reconstruction, code distribution, load balancing and authentication are designed and implemented as pluggable modules. This approach allows them to be easily replaced with modules implemented with newer technology as necessary. The paper gives an overview of the DIANE architecture and explains the main design choices. Selected examples of diverse applications from a variety of domains applicable to DIANE are presented, as well as preliminary benchmarking results.
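A minimal master-worker sketch using Python's multiprocessing module illustrates the processing model that DIANE generalizes to clusters; the analysis function and work units are placeholders, not DIANE's API.

```python
from multiprocessing import Pool

def analyse(chunk):
    """Stand-in for pre-compiled user analysis code run on a worker."""
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    # Work units the master farms out; in DIANE these would be ntuple slices
    chunks = [range(0, 100), range(100, 200), range(200, 300)]
    with Pool(processes=3) as pool:          # local workers; a cluster in DIANE's case
        partial = pool.map(analyse, chunks)  # master dispatches tasks to workers
    print(sum(partial))                      # master merges the worker results
```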

  6. Precision Statistical Analysis of Images Based on Brightness Distribution

    Directory of Open Access Journals (Sweden)

    Muzhir Shaban Al-Ani

    2017-07-01

Full Text Available Studying the content of images is an important topic through which reasonable and accurate analyses of images are generated. Image analysis has recently become a vital field because of the huge number of images transferred via transmission media in our daily life; these crowded media make image analysis a prominent research area. In this paper, the implemented system passes through several steps to compute the statistical measures of standard deviation and mean values of both color and grey images, and the last step of the proposed method compares the obtained results across the different cases of the test phase. The statistical parameters are implemented to characterize the content of an image and its texture: standard deviation, mean and correlation values are used to study the intensity distribution of the tested images. Reasonable results are obtained for both standard deviation and mean value via the implementation of the system. The major issue addressed in this work is brightness distribution, characterized via statistical measures applied under different types of lighting.
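A minimal sketch of the statistical measures named above (mean, standard deviation and channel correlation) computed on synthetic images; the image sizes and random content are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)
grey = rng.integers(0, 256, size=(64, 64)).astype(float)       # toy grey image
colour = rng.integers(0, 256, size=(64, 64, 3)).astype(float)  # toy RGB image

# Brightness distribution summarized by mean and standard deviation
print(grey.mean(), grey.std())

# Correlation between two colour channels as an intensity/texture descriptor
r, g = colour[..., 0].ravel(), colour[..., 1].ravel()
print(np.corrcoef(r, g)[0, 1])
```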

  7. Distributed analysis environment for HEP and interdisciplinary applications

    CERN Document Server

    Moscicki, J T

    2003-01-01

Huge data volumes of Large Hadron Collider experiments require parallel end-user analysis on clusters of hundreds of machines. While the focus of end-user High-Energy Physics analysis is on ntuples, the master-worker model of parallel processing may also be used in other contexts such as detector simulation. The aim of the DIANE R&D project (http://cern.ch/it-proj-diane), currently carried out by the CERN IT/API group, is to create a generic, component-based framework for distributed, parallel data processing in the master-worker model. Pre-compiled user analysis code is loaded dynamically at runtime into component libraries and called back when appropriate. Such an application-oriented framework must be flexible enough to integrate with emerging GRID technologies as they become available. Therefore, common services such as environment reconstruction, code distribution, load balancing and authentication are designed and implemented as pluggable modules. This approach allows them to be easily replaced with modules implemented with newer technology as necessary.

  8. Aeroelastic Analysis of a Distributed Electric Propulsion Wing

    Science.gov (United States)

    Massey, Steven J.; Stanford, Bret K.; Wieseman, Carol D.; Heeg, Jennifer

    2017-01-01

An aeroelastic analysis of a prototype distributed electric propulsion wing is presented. Results using MSC Nastran (Registered Trademark) doublet lattice aerodynamics are compared to those based on FUN3D Reynolds-Averaged Navier-Stokes aerodynamics. Four levels of grid refinement were examined for the FUN3D solutions, and the solutions were seen to be well converged. It was found that no oscillatory instability existed, only that of divergence, which occurred in the first bending mode at a dynamic pressure of over three times the flutter clearance condition.

  9. The relationship of psychosocial factors to mammograms, physical activity, and fruit and vegetable consumption among sisters of breast cancer patients

    Directory of Open Access Journals (Sweden)

    Hartman SJ

    2011-08-01

Full Text Available Sheri J Hartman,1 Shira I Dunsiger,1 Paul B Jacobsen2 (1Centers for Behavioral and Preventive Medicine, The Miriam Hospital and W Alpert Medical School of Brown University, Providence, RI; 2Department of Health Outcomes and Behavior, H Lee Moffitt Cancer Center and Research Institute, Tampa, FL, USA). Abstract: This study examined the relationship of psychosocial factors to health-promoting behaviors in sisters of breast cancer patients. One hundred and twenty sisters of breast cancer patients completed questionnaires assessing response efficacy of mammography screenings, physical activity, and fruit and vegetable consumption on decreasing breast cancer risk, breast cancer worry, involvement in their sister's cancer care, mammography screenings, physical activity, and fruit and vegetable consumption. Results indicate that greater perceived effectiveness for mammograms was associated with a 67% increase in odds of yearly mammograms. Greater involvement in the patient's care was associated with a 7% decrease in odds of yearly mammograms. Greater perceived effectiveness for physical activity was significantly related to greater physical activity. There was a trend for greater perceived effectiveness for fruits and vegetables to be associated with consuming more fruits and vegetables. Breast cancer worry was not significantly associated with the outcomes. While perceived effectiveness for a specific health behavior in reducing breast cancer risk was consistently related to engaging in that health behavior, women reported significantly lower perceived effectiveness for physical activity and fruits and vegetables than for mammograms. Making women aware of the health benefits of these behaviors may be important in promoting changes. Keywords: breast cancer risk, mammograms, physical activity, diet, perceived effectiveness

  10. Towards an in-plane methodology to track breast lesions using mammograms and patient-specific finite-element simulations

    Science.gov (United States)

    Lapuebla-Ferri, Andrés; Cegoñino-Banzo, José; Jiménez-Mocholí, Antonio-José; Pérez del Palomar, Amaya

    2017-11-01

In breast cancer screening or diagnosis, it is usual to combine different images in order to locate a lesion as accurately as possible. These images are generated using a single or several imaging techniques. As x-ray-based mammography is widely used, a breast lesion is located in the same plane of the image (mammogram), but tracking it across mammograms corresponding to different views is a challenging task for medical physicians. Accordingly, simulation tools and methodologies that use patient-specific numerical models can facilitate the task of fusing information from different images. Additionally, these tools need to be as straightforward as possible to facilitate their translation to the clinical area. This paper presents a patient-specific, finite-element-based and semi-automated simulation methodology to track breast lesions across mammograms. A realistic three-dimensional computer model of a patient's breast was generated from magnetic resonance imaging to simulate mammographic compressions in cranio-caudal (CC, head-to-toe) and medio-lateral oblique (MLO, shoulder-to-opposite hip) directions. For each compression being simulated, a virtual mammogram was obtained and subsequently superimposed on the corresponding real mammogram, by sharing the nipple as a common feature. Two-dimensional rigid-body transformations were applied, and the error distance measured between the centroids of the tumors previously located on each image was 3.84 mm and 2.41 mm for CC and MLO compression, respectively. Considering that the scope of this work is to conceive a methodology translatable to clinical practice, the results indicate that it could be helpful in supporting the tracking of breast lesions.
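The final superposition step described above amounts to a 2D rigid-body transform followed by a centroid-distance error measure; the sketch below uses invented centroid coordinates, not the study's data.

```python
import numpy as np

def rigid_transform(points, theta, t):
    """Apply a 2D rigid-body transform (rotation by theta, translation t)."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    return points @ R.T + t

# Hypothetical tumour centroids (mm) on the virtual and the real mammogram
virtual_centroid = np.array([[25.0, 40.0]])
real_centroid = np.array([[27.0, 41.5]])

# Align the virtual image, then measure the residual centroid distance
aligned = rigid_transform(virtual_centroid, theta=0.0, t=np.array([2.0, 1.5]))
print(np.linalg.norm(aligned - real_centroid))  # 0.0
```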

  11. Block-based wavelet transform coding of mammograms with region-adaptive quantization

    Science.gov (United States)

    Moon, Nam Su; Song, Jun S.; Kwon, Musik; Kim, JongHyo; Lee, ChoongWoong

    1998-06-01

To achieve both a high compression ratio and information preservation, an efficient approach is to combine segmentation and a lossy compression scheme. Microcalcification in a mammogram is one of the most significant signs of early-stage breast cancer. Therefore, in coding, detection and segmentation of microcalcifications enable us to preserve them well by allocating more bits to them than to other regions. Segmentation of microcalcifications is performed both in the spatial domain and in the wavelet transform domain. A peak-error-controllable quantization step, designed off-line, is suitable for medical image compression. For region-adaptive quantization, block-based wavelet transform coding is adopted and different peak-error-constrained quantizers are applied to blocks according to the segmentation result. In view of preservation of microcalcifications, the proposed coding scheme shows better performance than JPEG.
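The idea of region-adaptive, peak-error-constrained quantization can be sketched with a one-level Haar-style (average/difference) transform and two quantizer steps: a fine step for blocks flagged as containing microcalcifications and a coarse step elsewhere. The step sizes are illustrative, not those of the paper.

```python
import numpy as np

def haar2d(block):
    """One level of a 2D Haar-style average/difference transform
    on an even-sized block."""
    a = (block[:, 0::2] + block[:, 1::2]) / 2   # row averages
    d = (block[:, 0::2] - block[:, 1::2]) / 2   # row differences
    rows = np.hstack([a, d])
    a = (rows[0::2, :] + rows[1::2, :]) / 2     # column averages
    d = (rows[0::2, :] - rows[1::2, :]) / 2     # column differences
    return np.vstack([a, d])

def quantize(coeffs, step):
    """Uniform quantization; the peak error is bounded by step / 2."""
    return np.round(coeffs / step) * step

block = np.arange(16, dtype=float).reshape(4, 4)
fine = quantize(haar2d(block), step=0.5)    # block flagged as microcalcification
coarse = quantize(haar2d(block), step=4.0)  # ordinary background block
print(np.abs(haar2d(block) - fine).max() <= 0.25)  # True
```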

  12. Primary breast osteosarcoma mimicking calcified fibroadenoma on screening digital breast tomosynthesis mammogram

    Directory of Open Access Journals (Sweden)

    Debbie Lee Bennett, MD

    2017-12-01

Full Text Available Primary breast osteosarcoma is a rare malignancy, described mostly in case reports in the literature. The appearance of breast osteosarcoma on digital breast tomosynthesis imaging has not yet been described. A 69-year-old woman presented for routine screening mammography and was found to have a calcified mass in her right breast. The pattern of calcification appeared "sunburst" on digital breast tomosynthesis images. This mass was larger than on the previous year's mammogram, at which time it had been interpreted as a benign calcified fibroadenoma. The subsequent workup demonstrated the mass to be a primary breast osteosarcoma. The patient's workup and treatment are detailed in this case. Primary breast osteosarcoma, although rare, should be included as a diagnostic consideration for breast masses with a sunburst pattern of calcifications, particularly when the mammographic appearance has changed.

  13. Case base classification on digital mammograms: improving the performance of case base classifier

    Science.gov (United States)

    Raman, Valliappan; Then, H. H.; Sumari, Putra; Venkatesa Mohan, N.

    2011-10-01

Breast cancer continues to be a significant public health problem in the world. Early detection is the key to improving breast cancer prognosis. The aim of the research presented here is twofold. The first stage involves machine learning techniques, which segment and extract features from the masses in digital mammograms. The second level is a problem-solving approach which includes classification of masses by a performance-based case base classifier. In this paper we build a case-based classifier in order to diagnose mammographic images. We explain the different methods and behaviors that have been added to the classifier to improve its performance. The initial performance-based classifier with bagging proposed in this paper has been implemented, and it shows an improvement in specificity and sensitivity.

  14. Abnormality detection of mammograms by discriminative dictionary learning on DSIFT descriptors.

    Science.gov (United States)

    Tavakoli, Nasrin; Karimi, Maryam; Nejati, Mansour; Karimi, Nader; Reza Soroushmehr, S M; Samavi, Shadrokh; Najarian, Kayvan

    2017-07-01

Detection and classification of breast lesions using mammographic images is one of the most difficult tasks in medical image processing. A number of learning and non-learning methods have been proposed for detecting and classifying these lesions. However, the accuracy of the detection/classification still needs improvement. In this paper we propose a powerful classification method based on sparse learning to diagnose breast cancer in mammograms. For this purpose, a supervised discriminative dictionary learning approach is applied on dense scale-invariant feature transform (DSIFT) features. A linear classifier is also simultaneously learned with the dictionary, which can effectively classify the sparse representations. Our experimental results show the superior performance of our method compared to existing approaches.

  15. Characterizing single-molecule FRET dynamics with probability distribution analysis.

    Science.gov (United States)

    Santoso, Yusdi; Torella, Joseph P; Kapanidis, Achillefs N

    2010-07-12

    Probability distribution analysis (PDA) is a recently developed statistical tool for predicting the shapes of single-molecule fluorescence resonance energy transfer (smFRET) histograms, which allows the identification of single or multiple static molecular species within a single histogram. We used a generalized PDA method to predict the shapes of FRET histograms for molecules interconverting dynamically between multiple states. This method is tested on a series of model systems, including both static DNA fragments and dynamic DNA hairpins. By fitting the shape of this expected distribution to experimental data, the timescale of hairpin conformational fluctuations can be recovered, in good agreement with earlier published results obtained using different techniques. This method is also applied to studying the conformational fluctuations in the unliganded Klenow fragment (KF) of Escherichia coli DNA polymerase I, which allows both confirmation of the consistency of a simple, two-state kinetic model with the observed smFRET distribution of unliganded KF and extraction of a millisecond fluctuation timescale, in good agreement with rates reported elsewhere. We expect this method to be useful in extracting rates from processes exhibiting dynamic FRET, and in hypothesis-testing models of conformational dynamics against experimental data.
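A toy simulation of the dynamic-averaging effect that PDA models: bursts from molecules mixing two FRET states produce an apparent-efficiency histogram broadened by shot noise. The state efficiencies, photon counts, and the beta-distributed state occupancy (a crude stand-in for the real interconversion kinetics) are all assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

E_low, E_high = 0.2, 0.8       # FRET efficiencies of the two states (assumed)
n_bursts, n_photons = 5000, 50

# Fraction of each observation window spent in the high-FRET state;
# a beta distribution is used here as a crude stand-in for the kinetics.
frac_high = rng.beta(2.0, 2.0, size=n_bursts)
E_mean = frac_high * E_high + (1 - frac_high) * E_low

# Shot noise: acceptor photon counts are binomial in the mean efficiency
acceptor = rng.binomial(n_photons, E_mean)
E_apparent = acceptor / n_photons

# The resulting histogram is the quantity PDA predicts and fits
hist, edges = np.histogram(E_apparent, bins=25, range=(0.0, 1.0))
print(hist.sum())  # 5000
```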

  16. Hough transform for clustered microcalcifications detection in full-field digital mammograms

    Science.gov (United States)

    Fanizzi, A.; Basile, T. M. A.; Losurdo, L.; Amoroso, N.; Bellotti, R.; Bottigli, U.; Dentamaro, R.; Didonna, V.; Fausto, A.; Massafra, R.; Moschetta, M.; Tamborra, P.; Tangaro, S.; La Forgia, D.

    2017-09-01

Many screening programs use mammography as the principal diagnostic tool for detecting breast cancer at a very early stage. Despite the efficacy of mammograms in highlighting breast diseases, the detection of some lesions remains doubtful for radiologists. In particular, the extremely minute and elongated salt-like particles of microcalcifications are sometimes no larger than 0.1 mm and represent approximately half of all cancers detected by means of mammograms. Hence the need for automatic tools able to support radiologists in their work. Here, we propose a computer-assisted diagnostic tool to support radiologists in identifying microcalcifications in full (native) digital mammographic images. The proposed CAD system consists of a pre-processing step, which improves contrast and reduces noise by applying a Sobel edge detection algorithm and a Gaussian filter, followed by a microcalcification detection step performed by exploiting the circular Hough transform. The procedure's performance was tested on 200 images from the Breast Cancer Digital Repository (BCDR), a publicly available database. The automatically detected clusters of microcalcifications were evaluated by skilled radiologists, who assessed the validity of the correctly identified regions of interest as well as the system error in cases of missed clustered microcalcifications. The system performance was evaluated in terms of sensitivity and false positives per image (FPi) rate, and the proposed model accurately predicted the microcalcification clusters, obtaining performances (sensitivity = 91.78% and FPi rate = 3.99) which compare favorably to other state-of-the-art approaches.
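A hand-rolled circular Hough accumulator for a single, known radius illustrates the detection step; real CAD systems sweep a range of radii and pre-process the image as described above. The synthetic edge map is invented for the example.

```python
import numpy as np

def hough_circle_votes(edge_points, radius, shape):
    """Accumulate votes for circle centres at a fixed radius: each edge
    point votes along a circle of that radius around itself."""
    acc = np.zeros(shape, dtype=int)
    thetas = np.linspace(0, 2 * np.pi, 64, endpoint=False)
    for y, x in edge_points:
        cy = np.round(y - radius * np.sin(thetas)).astype(int)
        cx = np.round(x - radius * np.cos(thetas)).astype(int)
        ok = (cy >= 0) & (cy < shape[0]) & (cx >= 0) & (cx < shape[1])
        np.add.at(acc, (cy[ok], cx[ok]), 1)
    return acc

# Synthetic edge map: points on a circle of radius 5 centred at (20, 20)
angles = np.linspace(0, 2 * np.pi, 32, endpoint=False)
edge_pts = [(20 + 5 * np.sin(a), 20 + 5 * np.cos(a)) for a in angles]
acc = hough_circle_votes(edge_pts, radius=5, shape=(40, 40))

# The true centre collects the most votes
print(acc[20, 20] == acc.max())  # True
```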

  17. Pair distribution function analysis applied to decahedral gold nanoparticles

    International Nuclear Information System (INIS)

    Nakotte, H; Silkwood, C; Kiefer, B; Karpov, D; Fohtung, E; Page, K; Wang, H-W; Olds, D; Manna, S; Fullerton, E E

    2017-01-01

    The five-fold symmetry of face-centered cubic (fcc) derived nanoparticles is inconsistent with the translational symmetry of a Bravais lattice and generally explained by multiple twinning of a tetrahedral subunit about a (joint) symmetry axis, with or without structural modification to the fcc motif. Unlike in bulk materials, five-fold twinning in cubic nanoparticles is common and strongly affects their structural, chemical, and electronic properties. To test and verify theoretical approaches, it is therefore pertinent that the local structural features of such materials can be fully characterized. The small size of nanoparticles severely limits the application of traditional analysis techniques, such as Bragg diffraction. A complete description of the atomic arrangement in nanoparticles therefore requires a departure from the concept of translational symmetry, and prevents fully evaluating all the structural features experimentally. We describe how recent advances in instrumentation, together with the increasing power of computing, are shaping the development of alternative analysis methods of scattering data for nanostructures. We present the application of Debye scattering and pair distribution function (PDF) analysis towards modeling of the total scattering data for the example of decahedral gold nanoparticles. PDF measurements provide a statistical description of the pair correlations of atoms within a material, allowing one to evaluate the probability of finding two atoms within a given distance. We explored the sensitivity of existing synchrotron x-ray PDF instruments for distinguishing four different simple models for our gold nanoparticles: a multiply twinned fcc decahedron with either a single gap or multiple distributed gaps, a relaxed body-centered orthorhombic (bco) decahedron, and a hybrid decahedron. The data simulations of the models were then compared with experimental data from synchrotron x-ray total scattering. We present our experimentally
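The starting point of both Debye scattering and PDF modeling is the set of all interatomic distances in the particle; a minimal sketch on a four-atom toy cluster (not a realistic decahedron) shows how the distance histogram underlying the PDF is obtained.

```python
import numpy as np

def pair_distances(coords):
    """All interatomic distances in a cluster, i.e. the input to a
    PDF / Debye-scattering model."""
    diff = coords[:, None, :] - coords[None, :, :]
    d = np.sqrt((diff ** 2).sum(-1))
    iu = np.triu_indices(len(coords), k=1)  # each pair counted once
    return d[iu]

# Toy 'nanoparticle': four atoms at a cube corner and its neighbours (assumed)
coords = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
d = pair_distances(coords)

# Histogram of pair distances: three at 1.0, three at sqrt(2)
hist, edges = np.histogram(d, bins=[0.9, 1.1, 1.3, 1.5])
print(hist)  # [3 0 3]
```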

  18. Data intensive high energy physics analysis in a distributed cloud

    Science.gov (United States)

    Charbonneau, A.; Agarwal, A.; Anderson, M.; Armstrong, P.; Fransham, K.; Gable, I.; Harris, D.; Impey, R.; Leavett-Brown, C.; Paterson, M.; Podaima, W.; Sobie, R. J.; Vliet, M.

    2012-02-01

    We show that distributed Infrastructure-as-a-Service (IaaS) compute clouds can be effectively used for the analysis of high energy physics data. We have designed a distributed cloud system that works with any application using large input data sets requiring a high throughput computing environment. The system uses IaaS-enabled science and commercial clusters in Canada and the United States. We describe the process in which a user prepares an analysis virtual machine (VM) and submits batch jobs to a central scheduler. The system boots the user-specific VM on one of the IaaS clouds, runs the jobs and returns the output to the user. The user application accesses a central database for calibration data during the execution of the application. Similarly, the data is located in a central location and streamed by the running application. The system can easily run one hundred simultaneous jobs in an efficient manner and should scale to many hundreds and possibly thousands of user jobs.

  19. Distributed Data Analysis in the ATLAS Experiment: Challenges and Solutions

    International Nuclear Information System (INIS)

    Elmsheuser, Johannes; Van der Ster, Daniel

    2012-01-01

The ATLAS experiment at the LHC at CERN is recording and simulating several tens of petabytes of data per year. To analyse these data the ATLAS experiment has developed and operates a mature and stable distributed analysis (DA) service on the Worldwide LHC Computing Grid. The service is actively used: more than 1400 users have submitted jobs in the year 2011 and a total of more than 1 million jobs run every week. Users are provided with a suite of tools to submit Athena, ROOT or generic jobs to the Grid, and the PanDA workload management system is responsible for their execution. The reliability of the DA service is high and steadily improving; Grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. This paper will review the state of the DA tools and services, summarize the past year of distributed analysis activity, and present the directions for future improvements to the system.

  20. Data intensive high energy physics analysis in a distributed cloud

    International Nuclear Information System (INIS)

    Charbonneau, A; Impey, R; Podaima, W; Agarwal, A; Anderson, M; Armstrong, P; Fransham, K; Gable, I; Harris, D; Leavett-Brown, C; Paterson, M; Sobie, R J; Vliet, M

    2012-01-01

    We show that distributed Infrastructure-as-a-Service (IaaS) compute clouds can be effectively used for the analysis of high energy physics data. We have designed a distributed cloud system that works with any application using large input data sets requiring a high throughput computing environment. The system uses IaaS-enabled science and commercial clusters in Canada and the United States. We describe the process in which a user prepares an analysis virtual machine (VM) and submits batch jobs to a central scheduler. The system boots the user-specific VM on one of the IaaS clouds, runs the jobs and returns the output to the user. The user application accesses a central database for calibration data during the execution of the application. Similarly, the data is located in a central location and streamed by the running application. The system can easily run one hundred simultaneous jobs in an efficient manner and should scale to many hundreds and possibly thousands of user jobs.

  1. Implementation of force distribution analysis for molecular dynamics simulations

    Directory of Open Access Journals (Sweden)

    Seifert Christian

    2011-04-01

Full Text Available Abstract. Background: The way mechanical stress is distributed inside and propagated by proteins and other biopolymers largely defines their function. Yet, determining the network of interactions propagating internal strain remains a challenge for both experiment and theory. Based on molecular dynamics simulations, we developed force distribution analysis (FDA), a method that allows visualizing strain propagation in macromolecules. Results: To be immediately applicable to a wide range of systems, FDA was implemented as an extension to Gromacs, a commonly used package for molecular simulations. The FDA code comes with an easy-to-use command line interface and can be applied directly to every system built using Gromacs. We provide an additional R package with functions for advanced statistical analysis and presentation of the FDA data. Conclusions: Using FDA, we were able to explain the origin of mechanical robustness in immunoglobulin domains and silk fibers. By elucidating the propagation of internal strain upon ligand binding, we previously also successfully revealed the functionality of a stiff allosteric protein. FDA thus has the potential to be a valuable tool in the investigation and rational design of mechanical properties in proteins and nano-materials.
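In spirit, FDA bookkeeps the pairwise forces between atoms rather than the net force per atom; a toy version for a harmonic pair potential (not the Gromacs implementation, and with invented coordinates and force constants) looks like this:

```python
import numpy as np

def pairwise_force_magnitudes(coords, k=1.0, r0=1.0):
    """Toy FDA-style quantity: magnitude of the pairwise force |dV/dr|
    for a harmonic potential V(r) = k/2 (r - r0)^2 between every pair."""
    n = len(coords)
    forces = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            r = np.linalg.norm(coords[i] - coords[j])
            forces[i, j] = forces[j, i] = abs(k * (r - r0))
    return forces

# Three collinear 'atoms'; the distant pair carries the larger pair force
coords = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [3.0, 0.0, 0.0]])
F = pairwise_force_magnitudes(coords)
print(F[0, 1], F[0, 2])  # 0.5 2.0
```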

  2. Evaluation of Distribution Analysis Software for DER Applications

    Energy Technology Data Exchange (ETDEWEB)

    Staunton, RH

    2003-01-23

unstoppable. In response, energy providers will be forced to both fully acknowledge the trend and plan for accommodating DER [3]. With bureaucratic barriers [4], lack of time/resources, tariffs, etc. still seen in certain regions of the country, changes still need to be made. Given continued technical advances in DER, the time is fast approaching when the industry, nation-wide, must not only accept DER freely but also provide or review in-depth technical assessments of how DER should be integrated into and managed throughout the distribution system. Characterization studies are needed to fully understand how both the utility system and DER devices themselves will respond to all reasonable events (e.g., grid disturbances, faults, rapid growth, diverse and multiple DER systems, large reactive loads). Some of this work has already begun as it relates to operation and control of DER [5] and microturbine performance characterization [6,7]. One of the most urgently needed tools that can provide these types of analyses is a distribution network analysis program in combination with models for various DER. Together, they can be used for (1) analyzing DER placement in distribution networks and (2) helping to ensure that adequate transmission reliability is maintained. Surveys of the market show products that represent a partial match to these needs; specifically, software that has been developed to plan electrical distribution systems and analyze reliability (in a near total absence of DER). The first part of this study (Sections 2 and 3 of the report) looks at a number of these software programs and provides both summary descriptions and comparisons. The second part of this study (Section 4 of the report) considers the suitability of these analysis tools for DER studies. It considers steady-state modeling and assessment work performed by ORNL using one commercially available tool on feeder data provided by a southern utility. Appendix A provides a technical report on the results of

  3. Analysis of acidic properties of distribution transformer oil insulation ...

    African Journals Online (AJOL)

    This paper examined the acidic properties of distribution transformer oil insulation in service at Jericho distribution network Ibadan, Nigeria. Five oil samples each from six distribution transformers (DT1, DT2, DT3, DT4 and DT5) making a total of thirty samples were taken from different installed distribution transformers all ...

  4. Reliability analysis of water distribution systems under uncertainty

    International Nuclear Information System (INIS)

    Kansal, M.L.; Kumar, Arun; Sharma, P.B.

    1995-01-01

    In most of the developing countries, the Water Distribution Networks (WDN) are of intermittent type because of the shortage of safe drinking water. Failure of a pipeline(s) in such cases will cause not only a fall in one or more nodal heads but also poor connectivity of the source with various demand nodes of the system. Most of the previous works have used a two-step algorithm based on a pathset or cutset approach for connectivity analysis. The computations become more cumbersome when the connectivity of all demand nodes taken together with that of supply is carried out. In the present paper, network connectivity based on the concept of an Appended Spanning Tree (AST) is suggested to compute global network connectivity, which is defined as the probability of the source node being connected with all the demand nodes simultaneously. The concept of the AST has distinct advantages, as it attacks the problem directly rather than indirectly as most of the studies so far have done. Since the water distribution system is a repairable one, a general expression for pipeline availability using the failure/repair rate is considered. Furthermore, the sensitivity of global reliability estimates to likely errors in the estimation of failure/repair rates of various pipelines is also studied.
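
    The global connectivity measure described above (the probability that the source reaches every demand node simultaneously) can be estimated directly by Monte Carlo simulation over pipe availabilities. Below is a minimal sketch on a hypothetical 4-node network with independent pipe failures; it is a stand-in for the record's Appended Spanning Tree computation, not the authors' algorithm:

```python
# Monte Carlo estimate of global network connectivity: the probability that
# the source is simultaneously connected to ALL demand nodes, given
# independent pipe availabilities. The 4-node network is hypothetical.
import random

def connected_to_all(nodes, up_pipes, source):
    """Depth-first search: is every node reachable from source via up pipes?"""
    seen, stack = {source}, [source]
    while stack:
        u = stack.pop()
        for a, b in up_pipes:
            v = b if a == u else a if b == u else None
            if v is not None and v not in seen:
                seen.add(v)
                stack.append(v)
    return seen == set(nodes)

def global_connectivity(nodes, pipes, source, trials=20000, seed=1):
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        up = [(a, b) for a, b, avail in pipes if rng.random() < avail]
        hits += connected_to_all(nodes, up, source)
    return hits / trials

# source 0 feeds demand nodes 1-3 through pipes with availabilities 0.95/0.90
pipes = [(0, 1, 0.95), (1, 2, 0.90), (1, 3, 0.90), (2, 3, 0.95)]
print(round(global_connectivity([0, 1, 2, 3], pipes, 0), 3))
```

    For this toy network the exact value is 0.95 x (0.9^2 + 2 x 0.9 x 0.1 x 0.95) = 0.932, which the simulation approaches as the trial count grows.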

  5. Harmonic Analysis of Electric Vehicle Loadings on Distribution System

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Yijun A [University of Southern California, Department of Electrical Engineering; Xu, Yunshan [University of Southern California, Department of Electrical Engineering; Chen, Zimin [University of Southern California, Department of Electrical Engineering; Peng, Fei [University of Southern California, Department of Electrical Engineering; Beshir, Mohammed [University of Southern California, Department of Electrical Engineering

    2014-12-01

    With the increasing number of Electric Vehicles (EVs), the power system is facing huge challenges from the high penetration of EV charging stations. Therefore, a technical study of the impact of EV charging on the distribution system is required. This paper applies PSCAD software to analyze the Total Harmonic Distortion (THD) introduced by Electric Vehicle charging stations in power systems. The paper starts by choosing the IEEE 34-node test feeder as the distribution system, building an electric vehicle level-two charging battery model, and setting up four different testing scenarios: overhead transmission line versus underground cable, industrial area, transformer, and photovoltaic (PV) system. Statistical methods are then used to analyze the characteristics of THD in the plug-in transient, plug-out transient, and steady-state charging conditions associated with these four scenarios. Finally, the factors influencing the THD in the different scenarios are identified. The results lead to constructive suggestions for both Electric Vehicle charging station construction and customers' charging habits.
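
    The THD figure this record analyzes is the ratio of the RMS harmonic content to the fundamental. A minimal sketch of computing it from a sampled waveform with an FFT; the 60 Hz test signal with a 5% third harmonic is synthetic, not taken from the paper:

```python
# Total Harmonic Distortion of a sampled waveform via FFT:
# THD = sqrt(sum of harmonic magnitudes^2) / fundamental magnitude.
import numpy as np

def thd(signal, fs, f0):
    spectrum = np.abs(np.fft.rfft(signal)) / len(signal)
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    def mag(f):  # magnitude of the bin nearest frequency f
        return spectrum[np.argmin(np.abs(freqs - f))]
    fund = mag(f0)
    harmonics = [mag(k * f0) for k in range(2, 11)]
    return np.sqrt(sum(h * h for h in harmonics)) / fund

t = np.arange(0, 1, 1 / 3000)                          # 1 s at 3 kHz sampling
i = np.sin(2*np.pi*60*t) + 0.05*np.sin(2*np.pi*180*t)  # 60 Hz + 5% 3rd harmonic
print(f"THD = {thd(i, 3000, 60):.3f}")                 # ≈ 0.05
```

    With a 1 Hz bin resolution the 60 Hz fundamental and 180 Hz harmonic land exactly on FFT bins, so the recovered THD matches the injected 5% distortion.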

  6. A Distributed Flocking Approach for Information Stream Clustering Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Xiaohui [ORNL; Potok, Thomas E [ORNL

    2006-01-01

    Intelligence analysts are currently overwhelmed with the amount of information streams generated every day, and there is a lack of comprehensive tools that can analyze these streams in real time. Document clustering analysis plays an important role in improving the accuracy of information retrieval. However, most clustering technologies can only be applied to static document collections because they normally require a large amount of computational resources and a long time to obtain accurate results. It is very difficult to cluster a dynamically changing text information stream on an individual computer. Our early research resulted in a dynamic reactive flock clustering algorithm which can continually refine the clustering result and quickly react to changes in document contents. This characteristic makes the algorithm suitable for cluster analysis of dynamically changing document information, such as text information streams. Because of the decentralized character of this algorithm, a distributed approach is a very natural way to increase its clustering speed. In this paper, we present a distributed multi-agent flocking approach to text information stream clustering and discuss the decentralized architectures and communication schemes for load balancing and status information synchronization in this approach.

  7. Modelling and analysis of solar cell efficiency distributions

    Science.gov (United States)

    Wasmer, Sven; Greulich, Johannes

    2017-08-01

    We present an approach to model the distribution of solar cell efficiencies achieved in production lines based on numerical simulations, metamodeling and Monte Carlo simulations. We validate our methodology using the example of an industrially feasible p-type multicrystalline silicon “passivated emitter and rear cell” process. Applying the metamodel, we investigate the impact of each input parameter on the distribution of cell efficiencies in a variance-based sensitivity analysis, identifying the parameters and processes that need to be improved and controlled most accurately. We show that if these could be optimized, the mean cell efficiency of our examined cell process would increase from 17.62% ± 0.41% to 18.48% ± 0.09%. As the method relies on advanced characterization and simulation techniques, we furthermore introduce a simplification that enhances applicability by requiring only two common measurements of finished cells. The presented approaches can be especially helpful when ramping up production, but can also be applied to enhance established manufacturing.
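
    The metamodel-plus-Monte-Carlo pipeline described above can be sketched simply: sample the input-parameter distributions, push each sample through a cheap surrogate of the full cell simulation, and read off the resulting efficiency distribution. Both the surrogate below and the parameter spreads are invented for illustration, not the paper's models:

```python
# Monte Carlo propagation of process-parameter spread through a surrogate
# (meta)model of cell efficiency. Surrogate and distributions are hypothetical.
import numpy as np

def surrogate_efficiency(sheet_res, lifetime):
    # invented metamodel: efficiency in % around a 17.6% baseline
    return 17.6 - 0.002 * (sheet_res - 90)**2 + 0.4 * np.log(lifetime / 100)

rng = np.random.default_rng(0)
sheet_res = rng.normal(90, 8, 100_000)               # emitter sheet resistance, ohm/sq
lifetime = rng.lognormal(np.log(100), 0.2, 100_000)  # bulk lifetime, us
eta = surrogate_efficiency(sheet_res, lifetime)
print(f"mean = {eta.mean():.2f}%, std = {eta.std():.2f}%")
```

    Tightening either input distribution and re-running directly shows how much of the efficiency spread that parameter is responsible for, which is the essence of the variance-based sensitivity analysis mentioned above.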

  8. Primitive Path Analysis and Stress Distribution in Highly Strained Macromolecules.

    Science.gov (United States)

    Hsu, Hsiao-Ping; Kremer, Kurt

    2018-01-16

    Polymer material properties are strongly affected by entanglement effects. For long polymer chains and composite materials, they are expected to be at the origin of many technically important phenomena, such as shear thinning or the Mullins effect, which microscopically can be related to topological constraints between chains. Starting from fully equilibrated, highly entangled polymer melts, we investigate the effect of isochoric elongation on the entanglement structure and force distribution of such systems. Theoretically, the related viscoelastic response is usually discussed in terms of the tube model. We relate stress relaxation in the linear and nonlinear viscoelastic regimes to a primitive path analysis (PPA) and show that tension forces along the original paths and along the primitive paths, that is, the backbone of the tube, in the stretching direction correspond to each other. Unlike the homogeneous relaxation along the chain contour, the PPA reveals a previously unobserved, long-lived clustering of topological constraints along the chains in the deformed state.

  9. Computer-aided detection system performance on current and previous digital mammograms in patients with contralateral metachronous breast cancer

    International Nuclear Information System (INIS)

    Kim, Seung Ja; Moon, Woo Kyung; Cho, Nariya; Chang, Jung Min

    2012-01-01

    Background: The computer-aided detection (CAD) system is widely used for screening mammography. The performance of the CAD system for contralateral breast cancer has not been reported for women with a history of breast cancer. Purpose: To retrospectively evaluate the performance of a CAD system on current and previous mammograms in patients with contralateral metachronous breast cancer. Material and Methods: During a 3-year period, 4945 postoperative patients had follow-up examinations, from whom we selected 55 women with contralateral breast cancers. Among them, 38 had visible malignant signs on the current mammograms. We analyzed the sensitivity and false-positive marks of the system on the current and previous mammograms according to lesion type and breast density. Results: The total visible lesion components on the current mammograms included 27 masses and 14 calcifications in 38 patients. The case-based sensitivity for all lesion types was 63.2% (24/38) with false-positive marks of 0.71 per patient. The lesion-based sensitivity for masses and calcifications was 59.3% (16/27) and 71.4% (10/14), respectively. The lesion-based sensitivity for masses in fatty and dense breasts was 68.8% (11/16) and 45.5% (5/11), respectively. The lesion-based sensitivity for calcifications in fatty and dense breasts was 100.0% (3/3) and 63.6% (7/11), respectively. The total visible lesion components on the previous mammograms included 13 masses and three calcifications in 16 patients, and the sensitivity for all lesion types was 31.3% (5/16) with false-positive marks of 0.81 per patient. On these mammograms, the sensitivity for masses and calcifications was 30.8% (4/13) and 33.3% (1/3), respectively. The sensitivity in fatty and dense breasts was 28.6% (2/7) and 33.3% (3/9), respectively. Conclusion: In the women with a history of breast cancer, the sensitivity of the CAD system in visible contralateral breast cancer was lower than in most previous reports using the same CAD
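
    The case-based figures quoted in this record follow from simple ratios of detected cases to all cases, and of false-positive marks to patients. A quick check (the absolute count of 27 false-positive marks is back-computed here from the reported 0.71 per patient, i.e. an assumption):

```python
# Case-based sensitivity and false-positive rate as used in the abstract.
detected, cases = 24, 38
fp_marks, patients = 27, 38        # 27 marks is assumed from 0.71 per patient
print(f"sensitivity = {100 * detected / cases:.1f}%")   # 63.2%
print(f"FP per patient = {fp_marks / patients:.2f}")    # 0.71
```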

  10. Distributional patterns of Cecropia (Cecropiaceae): a panbiogeographic analysis

    Directory of Open Access Journals (Sweden)

    Franco Rosselli Pilar

    1997-06-01

    A panbiogeographic analysis of the distributional patterns of 60 species of Cecropia was carried out. Based on the distributional ranges of 36 species, we found eight generalized tracks for Cecropia species, whereas the distributional patterns of the other 24 species were uninformative for the analysis. The major concentration of species of Cecropia is in the Neotropical Andean region, where there are three generalized tracks and two nodes. The northern Andes in Colombia and Ecuador are richer than the central Andes in Perú; they contain two generalized tracks, one to the west and another to the east, each formed by the individual tracks of eight species. There are four generalized tracks outside the Andean region: two in the Amazonian region, in Guayana-Pará and in Manaus; one in Roraima; one in Serra do Mar in the Atlantic forest of Brazil; and one in Central America. Speciation in Cecropia may be related to the first Andean uplift.

  11. Dictionaries and distributions: Combining expert knowledge and large scale textual data content analysis: Distributed dictionary representation.

    Science.gov (United States)

    Garten, Justin; Hoover, Joe; Johnson, Kate M; Boghrati, Reihane; Iskiwitch, Carol; Dehghani, Morteza

    2018-02-01

    Theory-driven text analysis has made extensive use of psychological concept dictionaries, leading to a wide range of important results. These dictionaries have generally been applied through word count methods which have proven to be both simple and effective. In this paper, we introduce Distributed Dictionary Representations (DDR), a method that applies psychological dictionaries using semantic similarity rather than word counts. This allows for the measurement of the similarity between dictionaries and spans of text ranging from complete documents to individual words. We show how DDR enables dictionary authors to place greater emphasis on construct validity without sacrificing linguistic coverage. We further demonstrate the benefits of DDR on two real-world tasks and finally conduct an extensive study of the interaction between dictionary size and task performance. These studies allow us to examine how DDR and word count methods complement one another as tools for applying concept dictionaries and where each is best applied. Finally, we provide references to tools and resources to make this method both available and accessible to a broad psychological audience.
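
    The core DDR computation described above is small: represent the dictionary as the mean of its word vectors and score any span of text by cosine similarity to that centroid. A minimal sketch with invented 3-dimensional embeddings (real use would load pretrained vectors such as word2vec):

```python
# Distributed Dictionary Representation in miniature: dictionary centroid vs.
# text centroid, compared by cosine similarity. Embeddings are invented.
import numpy as np

emb = {  # hypothetical word embeddings
    "happy": [0.9, 0.1, 0.0], "joy":   [0.8, 0.2, 0.1],
    "glad":  [0.85, 0.15, 0.05], "sad": [-0.7, 0.2, 0.1],
    "report": [0.0, 0.1, 0.9], "table": [0.05, 0.0, 0.95],
}

def centroid(words):
    return np.mean([emb[w] for w in words if w in emb], axis=0)

def ddr_score(dictionary, text_words):
    d, t = centroid(dictionary), centroid(text_words)
    return float(d @ t / (np.linalg.norm(d) * np.linalg.norm(t)))

positive = ["happy", "joy", "glad"]
print(ddr_score(positive, ["glad", "report"]))   # closer to the dictionary
print(ddr_score(positive, ["sad", "table"]))     # farther from it
```

    Unlike a word count, the second text still receives a graded (here negative) score even though none of its words appear verbatim in the dictionary, which is the coverage advantage the record describes.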

  12. Performance analysis of distributed beamforming in a spectrum sharing system

    KAUST Repository

    Yang, Liang

    2013-05-01

    In this paper, we consider a distributed beamforming scheme (DBF) in a spectrum sharing system where multiple secondary users share the spectrum with some licensed primary users under an interference temperature constraint. We assume that the DBF is applied at the secondary users. We first consider optimal beamforming and compare it with the user selection scheme in terms of the outage probability and bit error rate performance metrics. Since perfect feedback is difficult to obtain, we then investigate a limited feedback DBF scheme and develop an analysis for a random vector quantization design algorithm. Specifically, the approximate statistics functions of the squared inner product between the optimal and quantized vectors are derived. With these statistics, we analyze the outage performance. Furthermore, the effects of channel estimation error and number of primary users on the system performance are investigated. Finally, optimal power adaptation and cochannel interference are considered and analyzed. Numerical and simulation results are provided to illustrate our mathematical formalism and verify our analysis. © 2012 IEEE.
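
    The random vector quantization scheme analyzed in this record can be sketched as follows: draw a codebook of 2^B random unit vectors, let the receiver pick the one maximizing the beamforming gain |h^H w|^2, and feed back only its B-bit index. The antenna count and codebook size below are illustrative, not the paper's settings:

```python
# Random vector quantization (RVQ) for limited-feedback beamforming.
import numpy as np

rng = np.random.default_rng(7)
Nt, B = 4, 6                       # transmit antennas, feedback bits
h = (rng.standard_normal(Nt) + 1j * rng.standard_normal(Nt)) / np.sqrt(2)

# codebook of 2^B isotropic random unit vectors
codebook = rng.standard_normal((2**B, Nt)) + 1j * rng.standard_normal((2**B, Nt))
codebook /= np.linalg.norm(codebook, axis=1, keepdims=True)

gains = np.abs(codebook.conj() @ h)**2
best = int(np.argmax(gains))       # only this B-bit index is fed back
# squared inner product of quantized vs. perfect-CSI beamforming (<= 1)
print(f"quantization gain ratio = {gains[best] / np.linalg.norm(h)**2:.3f}")
```

    The printed ratio is the squared inner product between the quantized and optimal beamforming directions, the statistic whose distribution the record's analysis approximates.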

  13. An Effective Distributed Model for Power System Transient Stability Analysis

    Directory of Open Access Journals (Sweden)

    MUTHU, B. M.

    2011-08-01

    Modern power systems consist of many interconnected synchronous generators with different inertia constants, connected to a large transmission network, with an ever-increasing demand for power exchange. The size of the power system grows exponentially due to the increase in power demand. The data required for various power system applications are stored in different formats in a heterogeneous environment, and the applications themselves have been developed and deployed on different platforms and in different language paradigms. Interoperability between power system applications therefore becomes a major issue. The main aim of the paper is to develop a generalized distributed model for carrying out power system stability analysis. A flexible and loosely coupled JAX-RPC model has been developed for representing transient stability analysis of large interconnected power systems. The proposed model includes Pre-Fault, During-Fault, Post-Fault and Swing Curve services which are accessible to remote power system clients when the system is subjected to large disturbances. A generalized XML-based model for data representation has also been proposed for exchanging data in order to enhance the interoperability between legacy power system applications. The performance measure, Round Trip Time (RTT), is estimated for different power systems using the proposed JAX-RPC model and compared with the results obtained using traditional client-server and Java RMI models.

  14. Location and Size Planning of Distributed Photovoltaic Generation in Distribution network System Based on K-means Clustering Analysis

    Science.gov (United States)

    Lu, Siqi; Wang, Xiaorong; Wu, Junyong

    2018-01-01

    The paper presents a method, based on a data-driven K-means clustering algorithm, to generate planning scenarios for the location and size planning of distributed photovoltaic (PV) units in the network. Taking the power losses of the network, the installation and maintenance costs of distributed PV, the profit of distributed PV, and the voltage offset as objectives, and the locations and sizes of distributed PV units as decision variables, the Pareto optimal front is obtained through a self-adaptive genetic algorithm (GA), and solutions are ranked by the technique for order preference by similarity to an ideal solution (TOPSIS). Finally, the planning schemes at the top of the ranking list are selected according to different planning emphases after detailed analysis. The proposed method is applied to a 10-kV distribution network in Gansu Province, China, and the results are discussed.
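
    The TOPSIS ranking step mentioned above can be sketched in a few lines: normalize the decision matrix, weight it, and score each plan by its closeness to the ideal solution relative to the worst. The 4-plan matrix (losses, cost, profit, voltage offset) and equal weights below are invented, not the paper's data:

```python
# TOPSIS ranking of candidate PV siting/sizing plans.
import numpy as np

def topsis(matrix, weights, benefit):  # benefit[j] = True if larger is better
    norm = matrix / np.linalg.norm(matrix, axis=0)
    v = norm * weights
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    worst = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_best = np.linalg.norm(v - ideal, axis=1)
    d_worst = np.linalg.norm(v - worst, axis=1)
    return d_worst / (d_best + d_worst)   # closeness: higher is better

plans = np.array([[120, 300, 80, 0.04],   # losses kW, cost k$, profit k$, dV pu
                  [100, 340, 90, 0.03],
                  [140, 280, 70, 0.05],
                  [110, 320, 85, 0.02]])
score = topsis(plans, np.array([0.25] * 4),
               np.array([False, False, True, False]))
print(np.argsort(score)[::-1])  # plan indices ranked best-first
```

    Shifting the weight vector toward, say, voltage offset reproduces the "different planning emphases" selection described in the abstract.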

  15. Analysis of distribution systems with a high penetration of distributed generation

    DEFF Research Database (Denmark)

    Lund, Torsten

    Since the mid eighties, a large number of wind turbines and distributed combined heat and power plants (CHPs) have been connected to the Danish power system. Especially in the Western part, comprising Jutland and Funen, the penetration is high compared to the load demand. In some periods the wind...... power alone can cover the entire load demand. The objective of the work is to investigate the influence of wind power and distributed combined heat and power production on the operation of the distribution systems. Where other projects have focused on the modeling and control of the generators and prime...... movers, the focus of this project is on the operation of an entire distribution system with several wind farms and CHPs. Firstly, the subject of allocation of power system losses in a distribution system with distributed generation is treated. A new approach to loss allocation based on current injections...

  16. Finite element analysis of thermal stress distribution in different ...

    African Journals Online (AJOL)

    Nigerian Journal of Clinical Practice. Journal Home ... Von Mises and thermal stress distributions were evaluated. Results: In all ... distribution. Key words: Amalgam, finite element method, glass ionomer cement, resin composite, thermal stress ...

  17. Characteristics of Quoit filter, a digital filter developed for the extraction of circumscribed shadows, and its applications to mammograms

    International Nuclear Information System (INIS)

    Isobe, Yoshiaki; Ohkubo, Natsumi; Yamamoto, Shinji; Toriwaki, Jun-ichiro; Kobatake, Hidefumi.

    1993-01-01

    This paper presents a newly developed filter, called the Quoit filter, which detects circumscribed shadows (concentric circular isolated images) such as typical cancer regions. The Quoit filter is based on mathematical morphology and has the following interesting properties. (1) The output of this filter is analytically expressible when the input image is assumed to be a concentric circular model (the output is predictable for typical inputs). (2) The filter has the ability to selectively reconstruct the original isolated models mentioned in (1) when it is applied sequentially twice. The filter was tested on the detection of cancer regions in X-ray mammograms; for 12 cancer mammograms, it achieved a true-positive cancer detection rate of 100%. (author)
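
    The behaviour of such a circular-shadow detector can be illustrated with a simplified morphological filter: compare the maximum inside a small disc with the maximum over a surrounding ring, so that an isolated bright blob yields a large response while extended structures cancel. This is a sketch in the spirit of the Quoit filter, not the published definition, and the 2-D test image is synthetic:

```python
# Quoit-like morphological filter: disc maximum minus surrounding-ring
# maximum; responds strongly to isolated circular bright regions.
import numpy as np

def quoit_like(img, r_in=3, r_out=6):
    h, w = img.shape
    out = np.zeros_like(img, dtype=float)
    yy, xx = np.mgrid[-r_out:r_out + 1, -r_out:r_out + 1]
    d = np.hypot(yy, xx)
    disc, ring = d <= r_in, (d > r_in) & (d <= r_out)
    for y in range(r_out, h - r_out):
        for x in range(r_out, w - r_out):
            patch = img[y - r_out:y + r_out + 1, x - r_out:x + r_out + 1]
            out[y, x] = patch[disc].max() - patch[ring].max()
    return out

img = np.zeros((40, 40))
yy, xx = np.mgrid[:40, :40]
img[np.hypot(yy - 20, xx - 20) <= 2] = 1.0   # isolated circular "mass"
resp = quoit_like(img)
print(np.unravel_index(resp.argmax(), resp.shape))  # peaks at/near the blob
```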

  18. Stability analysis of distributed order fractional Chen system.

    Science.gov (United States)

    Aminikhah, H; Refahi Sheikhani, A; Rezazadeh, H

    2013-01-01

    We first investigate sufficient and necessary conditions for the stability of nonlinear distributed order fractional systems and then generalize the integer-order Chen system into the distributed order fractional domain. Based on the asymptotic stability theory of nonlinear distributed order fractional systems, the stability of the distributed order fractional Chen system is discussed. In addition, we find that chaos exists in the double fractional order Chen system. Numerical solutions are used to verify the analytical results.
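
    For reference, the distributed-order Caputo derivative and the integer-order Chen system being generalized can be written in their standard literature forms (not copied from the paper; the weight function and parameter values are the commonly cited ones):

```latex
% Distributed-order Caputo derivative with weight function c(\alpha) \ge 0
\mathcal{D}^{c(\alpha)} x(t) = \int_0^1 c(\alpha)\, {}^{C}\!D^{\alpha} x(t)\, d\alpha

% Integer-order Chen system; chaotic for a = 35,\ b = 3,\ c = 28
\begin{aligned}
\dot{x} &= a\,(y - x) \\
\dot{y} &= (c - a)\,x - x z + c y \\
\dot{z} &= x y - b z
\end{aligned}
```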

  20. Factory Gate Pricing: An Analysis of the Dutch Retail Distribution

    NARCIS (Netherlands)

    H.M. le Blanc; F. Cruijssen (Frans); H.A. Fleuren; M.B.M. de Koster (René)

    2004-01-01

    Factory Gate Pricing (FGP) is a relatively new phenomenon in retail distribution. Under FGP, products are no longer delivered at the retailer distribution center, but collected by the retailer at the factory gates of the suppliers. Owing to both the asymmetry in the distribution networks

  1. Factory Gate Pricing : An Analysis of the Dutch Retail Distribution

    NARCIS (Netherlands)

    Le Blanc, H.M.; Cruijssen, F.C.A.M.; Fleuren, H.A.; de Koster, M.B.M.

    2004-01-01

    Factory Gate Pricing (FGP) is a relatively new phenomenon in retail distribution. Under FGP, products are no longer delivered at the retailer distribution center, but collected by the retailer at the factory gates of the suppliers. Owing to both the asymmetry in the distribution networks (the supplier

  2. Development of a Computer-aided Diagnosis System for Early Detection of Masses Using Retrospectively Detected Cancers on Prior Mammograms

    Science.gov (United States)

    2009-06-01

    design a classifier for the differentiation of the abnormal from the normal structures. In this project, we have also investigated the...were trained: one with the current mammograms and the other with the prior mammograms. A feed-forward backpropagation artificial neural network...Sahiner, B., Helvie, M. A., Petrick, N., Roubidoux, M. A., Wilson, T. E., Adler, D. D., Paramagul, C., Newman, J. S. and Gopal, S. S

  3. Is there a correlation between the presence of a spiculated mass on mammogram and luminal a subtype breast cancer?

    International Nuclear Information System (INIS)

    Liu, Song; Wu, Xiao Dong; Xu, Wen Jian; Lin, Qing; Liu, Xue Jun; Li, Ying

    2016-01-01

    To determine whether the appearance of a spiculated mass on a mammogram is associated with luminal A subtype breast cancer and the factors that may influence the presence or absence of the spiculated mass. Three hundred seventeen (317) patients who underwent image-guided or surgical biopsy between December 2014 and April 2015 were included in the study. Radiologists conducted retrospective assessments of the presence of spiculated masses according to the criteria of the Breast Imaging Reporting and Data System. We used combinations of estrogen receptor (ER), progesterone receptor (PR), human epithelial growth factor receptor 2 (HER2), and Ki67 as surrogate markers to identify molecular subtypes of breast cancer. The Pearson chi-square test was employed to measure the statistical significance of correlations. Furthermore, we built a bivariate logistic regression model to quantify the relative contribution of the factors that may influence the presence or absence of the spiculated mass. Seventy-one percent (71%) of the spiculated masses were classified as luminal A. Masses classified as luminal A were 10.3 times more likely to present as a spiculated mass on a mammogram than all other subtypes. Patients with a low Ki67 index (< 14%) and negative HER2 status were most likely to present with a spiculated mass on their mammograms (p < 0.001). Hormone receptor status (ER and PR), pathology grade, and overall breast composition were all associated with the presence of a spiculated mass, but with smaller contributions than Ki67 and HER2. We observed an association between the luminal A subtype of invasive breast cancer and the presence of a spiculated mass on a mammogram. It is hypothesized that a lower Ki67 index and HER2 negativity may be the most significant factors in the presence of a spiculated mass.

  4. Is there a correlation between the presence of a spiculated mass on mammogram and luminal a subtype breast cancer?

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Song; Wu, Xiao Dong; Xu, Wen Jian; Lin, Qing; Liu, Xue Jun; Li, Ying [The Affiliated Hospital of Qingdao University, Qingdao (China)

    2016-11-15

    To determine whether the appearance of a spiculated mass on a mammogram is associated with luminal A subtype breast cancer and the factors that may influence the presence or absence of the spiculated mass. Three hundred seventeen (317) patients who underwent image-guided or surgical biopsy between December 2014 and April 2015 were included in the study. Radiologists conducted retrospective assessments of the presence of spiculated masses according to the criteria of the Breast Imaging Reporting and Data System. We used combinations of estrogen receptor (ER), progesterone receptor (PR), human epithelial growth factor receptor 2 (HER2), and Ki67 as surrogate markers to identify molecular subtypes of breast cancer. The Pearson chi-square test was employed to measure the statistical significance of correlations. Furthermore, we built a bivariate logistic regression model to quantify the relative contribution of the factors that may influence the presence or absence of the spiculated mass. Seventy-one percent (71%) of the spiculated masses were classified as luminal A. Masses classified as luminal A were 10.3 times more likely to present as a spiculated mass on a mammogram than all other subtypes. Patients with a low Ki67 index (< 14%) and negative HER2 status were most likely to present with a spiculated mass on their mammograms (p < 0.001). Hormone receptor status (ER and PR), pathology grade, and overall breast composition were all associated with the presence of a spiculated mass, but with smaller contributions than Ki67 and HER2. We observed an association between the luminal A subtype of invasive breast cancer and the presence of a spiculated mass on a mammogram. It is hypothesized that a lower Ki67 index and HER2 negativity may be the most significant factors in the presence of a spiculated mass.

  5. Psychological distress, social withdrawal, and coping following receipt of an abnormal mammogram among different ethnicities: a mediation model.

    Science.gov (United States)

    Molina, Yamile; Beresford, Shirley A A; Espinoza, Noah; Thompson, Beti

    2014-09-01

    To explore ethnic differences in psychological distress and social withdrawal after receiving an abnormal mammogram result and to assess if coping strategies mediate ethnic differences. Descriptive correlational. Two urban mobile mammography units and a rural community hospital in the state of Washington. 41 Latina and 41 non-Latina Caucasian (NLC) women who had received an abnormal mammogram result. Women completed standard sociodemographic questions, Impact of Event Scale-Revised, the social dimension of the Psychological Consequences Questionnaire, and the Brief COPE. Ethnicity, psychological distress, social withdrawal, and coping. Latinas experienced greater psychological distress and social withdrawal compared to NLC counterparts. Denial as a coping strategy mediated ethnic differences in psychological distress. Religious coping mediated ethnic differences in social withdrawal. Larger population-based studies are necessary to understand how ethnic differences in coping strategies can influence psychological outcomes. This is an important finding that warrants additional study among women who are and are not diagnosed with breast cancer following an abnormal mammogram. Nurses may be able to work with Latina patients to diminish denial coping and consequent distress. Nurses may be particularly effective, given cultural values concerning strong interpersonal relationships and respect for authority figures.
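
    The mediation claims in this record rest on product-of-coefficients logic: the indirect effect of ethnicity (X) on distress (Y) through a coping strategy (M) is a·b, where a comes from the X→M regression and b from the M→Y regression adjusting for X. A sketch on simulated data; the effect sizes are invented, not the study's estimates:

```python
# Product-of-coefficients mediation sketch with plain least-squares fits.
import numpy as np

def ols_slope(X, y):
    """Slopes (excluding intercept) of y regressed on columns of X."""
    X1 = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(X1, y, rcond=None)[0][1:]

rng = np.random.default_rng(3)
x = rng.integers(0, 2, 500).astype(float)        # 0 = NLC, 1 = Latina
m = 0.8 * x + rng.standard_normal(500)           # mediator: denial coping
y = 0.6 * m + 0.1 * x + rng.standard_normal(500) # outcome: distress

a = ols_slope(x[:, None], m)[0]                  # X -> M
b = ols_slope(np.column_stack([m, x]), y)[0]     # M -> Y, controlling for X
print(f"indirect effect a*b = {a * b:.2f}")      # near 0.8 * 0.6 = 0.48
```

    In practice the indirect effect would be tested with bootstrapped confidence intervals rather than read off a single fit, but the decomposition is the same.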

  6. Dynamic models for transient stability analysis of transmission and distribution systems with distributed generation : an overview

    NARCIS (Netherlands)

    Boemer, J.C.; Gibescu, M.; Kling, W.L.

    2009-01-01

    Distributed generation is increasing in today's power systems. Small-scale systems such as photovoltaic, biomass, or small cogeneration plants are connected at the distribution level, while large wind farms will be connected at the transmission level. Both trends lead to a replacement of large

  7. Rod internal pressure quantification and distribution analysis using Frapcon

    Energy Technology Data Exchange (ETDEWEB)

    Jessee, Matthew Anderson [ORNL; Wieselquist, William A [ORNL; Ivanov, Kostadin [Pennsylvania State University, University Park

    2015-09-01

    This report documents work performed supporting the Department of Energy (DOE) Office of Nuclear Energy (NE) Fuel Cycle Technologies Used Fuel Disposition Campaign (UFDC) under work breakdown structure element 1.02.08.10, ST Analysis. In particular, this report fulfills the M4 milestone M4FT-15OR0810036, Quantify effects of power uncertainty on fuel assembly characteristics, within work package FT-15OR081003 ST Analysis-ORNL. This research was also supported by the Consortium for Advanced Simulation of Light Water Reactors (http://www.casl.gov), an Energy Innovation Hub (http://www.energy.gov/hubs) for Modeling and Simulation of Nuclear Reactors under U.S. Department of Energy Contract No. DE-AC05-00OR22725. The discharge rod internal pressure (RIP) and cladding hoop stress (CHS) distributions are quantified for Watts Bar Nuclear Unit 1 (WBN1) fuel rods by modeling core cycle design data, operation data (including modeling significant trips and downpowers), and as-built fuel enrichments and densities of each fuel rod in FRAPCON-3.5. A methodology is developed which tracks inter-cycle assembly movements and assembly batch fabrication information to build individual FRAPCON inputs for each evaluated WBN1 fuel rod. An alternate model for the amount of helium released from the zirconium diboride (ZrB2) integral fuel burnable absorber (IFBA) layer is derived and applied to FRAPCON output data to quantify the RIP and CHS for these types of fuel rods. SCALE/Polaris is used to quantify fuel rod-specific spectral quantities and the amount of gaseous fission products produced in the fuel for use in FRAPCON inputs. Fuel rods with ZrB2 IFBA layers (i.e., IFBA rods) are determined to have RIP predictions that are elevated when compared to fuel rods without IFBA layers (i.e., standard rods), despite the fact that IFBA rods often have reduced fill pressures and annular fuel pellets. The primary contributor to elevated RIP predictions at burnups less than and greater than 30 GWd

  8. A novel featureless approach to mass detection in digital mammograms based on support vector machines

    Energy Technology Data Exchange (ETDEWEB)

    Campanini, Renato [Department of Physics, University of Bologna, and INFN, Bologna (Italy); Dongiovanni, Danilo [Department of Physics, University of Bologna, and INFN, Bologna (Italy); Iampieri, Emiro [Department of Physics, University of Bologna, and INFN, Bologna (Italy); Lanconelli, Nico [Department of Physics, University of Bologna, and INFN, Bologna (Italy); Masotti, Matteo [Department of Physics, University of Bologna, and INFN, Bologna (Italy); Palermo, Giuseppe [Department of Physics, University of Bologna, and INFN, Bologna (Italy); Riccardi, Alessandro [Department of Physics, University of Bologna, and INFN, Bologna (Italy); Roffilli, Matteo [Department of Computer Science, University of Bologna, Bologna (Italy)

    2004-03-21

    In this work, we present a novel approach to mass detection in digital mammograms. The great variability in the appearance of masses is the main obstacle to building a mass detection method. It is indeed demanding to characterize all the varieties of masses with a reduced set of features. Hence, in our approach we have chosen not to extract any features for the detection of the regions of interest; instead, we exploit all the information available in the image. A multiresolution overcomplete wavelet representation is performed, in order to codify the image with redundancy of information. The resulting vectors of this very large space are then provided to a first support vector machine (SVM) classifier. The detection task is considered here as a two-class pattern recognition problem: crops are classified as suspect or not by this SVM classifier. False candidates are eliminated with a second, cascaded SVM. To further reduce the number of false positives, an ensemble of experts is applied: the final suspect regions are obtained by a voting strategy. The sensitivity of the presented system is nearly 80% with a false-positive rate of 1.1 marks per image, estimated on images from the USF DDSM database.
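    The detection pipeline described above (first-stage classifier, cascaded second stage, then an ensemble vote) can be sketched as follows. The stage classifiers here are simple score thresholds standing in for the trained SVMs, and all feature names and values are illustrative, not taken from the paper:

```python
# Sketch of a two-stage cascade with ensemble voting for
# false-positive reduction. Thresholds and features are toy values.

def cascade_detect(candidates, stage1, stage2, experts, min_votes):
    """candidates: list of feature dicts; stage1/stage2: callables
    returning True for 'suspect'; experts: voting callables."""
    survivors = [c for c in candidates if stage1(c)]      # first SVM
    survivors = [c for c in survivors if stage2(c)]       # cascaded SVM
    # Ensemble of experts: keep a region only if enough experts agree.
    return [c for c in survivors
            if sum(e(c) for e in experts) >= min_votes]

# Toy stand-ins: each classifier thresholds one feature.
stage1 = lambda c: c["wavelet_energy"] > 0.5
stage2 = lambda c: c["contrast"] > 0.3
experts = [lambda c: c["wavelet_energy"] > 0.6,
           lambda c: c["contrast"] > 0.4,
           lambda c: c["size"] > 0.2]

candidates = [
    {"wavelet_energy": 0.9, "contrast": 0.7, "size": 0.5},   # kept (3 votes)
    {"wavelet_energy": 0.7, "contrast": 0.35, "size": 0.1},  # only 1 vote
    {"wavelet_energy": 0.2, "contrast": 0.9, "size": 0.9},   # rejected early
]
marks = cascade_detect(candidates, stage1, stage2, experts, min_votes=2)
print(len(marks))  # 1
```

    Only the first candidate survives both stages and collects the two expert votes required; the voting step is what trims borderline false positives that slip past the cascade.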

  9. Evaluating radiographers' diagnostic accuracy in screen-reading mammograms: what constitutes a quality study?

    International Nuclear Information System (INIS)

    Debono, Josephine C; Poulos, Ann E

    2015-01-01

    The aim of this study was to first evaluate the quality of studies investigating the diagnostic accuracy of radiographers as mammogram screen-readers and then to develop an adapted tool for determining the quality of screen-reading studies. A literature search was used to identify relevant studies and a quality evaluation tool constructed by combining the criteria for quality of Whiting, Rutjes, Dinnes et al. and Brealey and Westwood. This constructed tool was then applied to the studies and subsequently adapted specifically for use in evaluating quality in studies investigating diagnostic accuracy of screen-readers. Eleven studies were identified and the constructed tool applied to evaluate quality. This evaluation resulted in the identification of quality issues with the studies such as potential for bias, applicability of results, study conduct, reporting of the study and observer characteristics. An assessment of the applicability and relevance of the tool for this area of research resulted in adaptations to the criteria and the development of a tool specifically for evaluating diagnostic accuracy in screen-reading. This tool, with further refinement and rigorous validation, can make a significant contribution to promoting well-designed studies in this important area of research and practice.

  10. Relationship between arterial vascular calcifications seen on screening mammograms and biochemical markers of endothelial injury

    Energy Technology Data Exchange (ETDEWEB)

    Pidal, Diego [Unidad de Investigacion del, Hospital de Jove, Gijon (Spain)], E-mail: dpidal@hotmail.com; Sanchez Vidal, M Teresa [Servicio de Medicina Interna, Hospital de Jove (Spain)], E-mail: medicinainterna@hospitaldejove.com; Rodriguez, Juan Carlos [Unidad de Investigacion del, Hospital de Jove, Gijon (Spain); Servicio de Cirugia General, Hospital de Jove (Spain); Instituto Universitario de Oncologia del Principado de Asturias, Oviedo (Spain)], E-mail: investigacion@hospitaldejove.com; Corte, M Daniela [Unidad de Investigacion del, Hospital de Jove, Gijon (Spain); Instituto Universitario de Oncologia del Principado de Asturias, Oviedo (Spain)], E-mail: mdanielac@hotmail.com; Pravia, Paz [Servicio de Radiodiagnostico, Hospital de Jove (Spain)], E-mail: radiologia@hospitaldejove.com; Guinea, Oscar [Servicio de Radiodiagnostico, Hospital de Jove (Spain)], E-mail: oscarfguinea@seram.org; Pidal, Ivan [Unidad de Investigacion del, Hospital de Jove, Gijon (Spain)], E-mail: ivanpida@hotmail.com; Bongera, Miguel [Unidad de Investigacion del, Hospital de Jove, Gijon (Spain)], E-mail: mbchoppy@hotmail.com; Escribano, Damaso [Servicio de Medicina Interna, Hospital de Jove (Spain)], E-mail: medicinainterna@hospitaldejove.com; Gonzalez, Luis O. [Unidad de Investigacion del, Hospital de Jove, Gijon (Spain)], E-mail: lovidiog@telefonica.net; Diez, M Cruz [Servicio de Cirugia General, Hospital de Jove (Spain)], E-mail: cirugiageneral@hospitaldejove.com; Venta, Rafael [Servicio de Analisis Clinicos, Hospital de San Agustin, Aviles (Spain); Departamento de Bioquimica y Biologia Molecular, Universidad de Oviedo (Spain)], E-mail: rafael.venta@sespa.princast.es; Vizoso, Francisco J. [Unidad de Investigacion del, Hospital de Jove, Gijon (Spain); Servicio de Cirugia General, Hospital de Jove (Spain); Instituto Universitario de Oncologia del Principado de Asturias, Oviedo (Spain)], E-mail: fjvizoso@telefonica.net

    2009-01-15

    To assess whether breast arterial calcifications (BAC) are associated with altered serum markers of cardiovascular risk, mammograms and records from 1759 women (age range: 45-65 years) screened for breast cancer were revised. One hundred and forty-seven (8.36%) women showed BAC. A total of 136 women with BAC and controls (mean age: 57 and 55 years, respectively) accepted entering the study. There were no significant differences in serum levels of urea, glucose, uric acid, creatinine, total cholesterol, HDL-C, LDL-C, folic acid, vitamin B{sub 12}, TSH or cysteine between both groups of patients. However, women with BAC showed higher serum levels of triglycerides (p = 0.006), homocysteine (p = 0.002) and hs-CRP (p = 0.003) than women without BAC. Likewise, we found a significantly higher percentage of cases with an elevated LDL-C/HDL-C ratio (coronary risk index >2) amongst women with BAC than in women without BAC (56.7 and 38.2%, respectively; p = 0.04). Our results indicate that the finding of BAC identifies women with altered serum markers of cardiovascular risk.

  11. Mass-Like Fibrocystic Disease of the Breast : Characteristic Findings on Mammogram and Sonogram

    International Nuclear Information System (INIS)

    Kim, Hyeon Hee; Yun, Duk Hee; Hwang, Ho Kyumg; Kim, Jang Min; Kim, Young Sun; Lee, Jung Hee

    1995-01-01

    This study was performed to evaluate the mammographic and sonographic features of mass-like fibrocystic disease of the breast, to differentiate it from other breast masses. We retrospectively analyzed characteristic mammographic (16 cases) and sonographic (39 cases) findings of histopathologically proven mass-like fibrocystic disease of the breast in 39 patients. Of the 16 patients with mammograms, mass-like fibrocystic disease of the breast was round in shape in 12 cases and of high density in 14 cases. The margin of the mass was well marginated in 8 cases and poorly marginated in 8 cases. Calcification within the mass was not detected in 13 cases. In the 39 patients with sonograms, mass-like fibrocystic disease of the breast was mostly ovoid in shape (24 cases), hypoechoic (23 cases), with homogeneous internal echo (36 cases), well defined (28 cases), and with equivocal posterior shadowing (26 cases). The T/AP ratio of the mass was not less than 1.5 in 29 cases. Bilateral edge-shadowing of the mass was not noted in 24 cases. Characteristic findings of mass-like fibrocystic disease of the breast are round shape, high density and a well-defined mass on the mammogram, and ovoid shape, homogeneous internal echo and a well-marginated mass on the sonogram, which are similar to those of other benign lesions. Mass-like fibrocystic disease, which is a frequent cause of breast lumps, should be included in the differential diagnosis of breast masses with benign mammographic and/or sonographic findings.

  12. Mass-Like Fibrocystic Disease of the Breast : Characteristic Findings on Mammogram and Sonogram

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyeon Hee; Yun, Duk Hee; Hwang, Ho Kyumg; Kim, Jang Min; Kim, Young Sun; Lee, Jung Hee [Kwang Myung Sung Ae Hospital, Gwangmyeong (Korea, Republic of)

    1995-12-15

    This study was performed to evaluate the mammographic and sonographic features of mass-like fibrocystic disease of the breast, to differentiate it from other breast masses. We retrospectively analyzed characteristic mammographic (16 cases) and sonographic (39 cases) findings of histopathologically proven mass-like fibrocystic disease of the breast in 39 patients. Of the 16 patients with mammograms, mass-like fibrocystic disease of the breast was round in shape in 12 cases and of high density in 14 cases. The margin of the mass was well marginated in 8 cases and poorly marginated in 8 cases. Calcification within the mass was not detected in 13 cases. In the 39 patients with sonograms, mass-like fibrocystic disease of the breast was mostly ovoid in shape (24 cases), hypoechoic (23 cases), with homogeneous internal echo (36 cases), well defined (28 cases), and with equivocal posterior shadowing (26 cases). The T/AP ratio of the mass was not less than 1.5 in 29 cases. Bilateral edge-shadowing of the mass was not noted in 24 cases. Characteristic findings of mass-like fibrocystic disease of the breast are round shape, high density and a well-defined mass on the mammogram, and ovoid shape, homogeneous internal echo and a well-marginated mass on the sonogram, which are similar to those of other benign lesions. Mass-like fibrocystic disease, which is a frequent cause of breast lumps, should be included in the differential diagnosis of breast masses with benign mammographic and/or sonographic findings.

  13. An SVM classifier to separate false signals from microcalcifications in digital mammograms

    Energy Technology Data Exchange (ETDEWEB)

    Bazzani, Armando; Bollini, Dante; Brancaccio, Rosa; Campanini, Renato; Riccardi, Alessandro; Romani, Davide [Department of Physics, University of Bologna (Italy); INFN, Bologna (Italy); Lanconelli, Nico [Department of Physics, University of Bologna, and INFN, Bologna (Italy). E-mail: nico.lanconelli@bo.infn.it; Bevilacqua, Alessandro [Department of Electronics, Computer Science and Systems, University of Bologna, and INFN, Bologna (Italy)

    2001-06-01

    In this paper we investigate the feasibility of using an SVM (support vector machine) classifier in our automatic system for the detection of clustered microcalcifications in digital mammograms. SVM is a technique for pattern recognition which relies on statistical learning theory. It minimizes a function of two terms: the number of misclassified vectors of the training set and a term accounting for the classifier's generalization capability. We compare the SVM classifier with an MLP (multi-layer perceptron) in the false-positive reduction phase of our detection scheme: a detected signal is classified either as a microcalcification or as a false signal, according to the values of a set of its features. The SVM classifier achieves slightly better results than the MLP (Az value of 0.963 against 0.958) when a large amount of training data is available; the improvement becomes much more evident (Az value of 0.952 against 0.918) for training sets of reduced size. Finally, the SVM classifier is much easier to configure than the MLP. (author)

  14. Three-dimensional reconstruction of clustered microcalcifications from two digitized mammograms

    Science.gov (United States)

    Stotzka, Rainer; Mueller, Tim O.; Epper, Wolfgang; Gemmeke, Hartmut

    1998-06-01

    X-ray mammography is one of the most significant diagnostic methods in early detection of breast cancer. Usually two X-ray images from different angles are taken of each breast to make even overlapping structures visible. X-ray mammography has a very high spatial resolution and can show microcalcifications of 50-200 microns in size. Clusters of microcalcifications are one of the most important, and often the only, indicators of malignant tumors. These calcifications are in some cases extremely difficult to detect. Computer-assisted diagnosis of digitized mammograms may improve the detection and interpretation of microcalcifications and lead to more reliable diagnostic findings. We built a low-cost mammography workstation to detect and classify clusters of microcalcifications and tissue densities automatically. New in this approach is the estimation of the 3D formation of segmented microcalcifications and its visualization, which puts additional diagnostic information at the radiologist's disposal. The real problem in using only two or three projections for reconstruction is the severe loss of volume information. Therefore the arrangement of a cluster is estimated using only the positions of segmented microcalcifications. The arrangement of microcalcifications is visualized to the physician by rotating.
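    The core geometric step, recovering a 3-D position from two projections taken at known angles, can be sketched with a simplified parallel-beam geometry (the paper's actual acquisition geometry is not specified here; angles and coordinates are illustrative):

```python
import numpy as np

# Sketch: recover a microcalcification's 3-D position from its detected
# positions in two parallel-beam projections at known angles about the
# vertical axis. Geometry is simplified; all values are illustrative.

def reconstruct(u1, v1, u2, v2, theta1, theta2):
    # Each view measures u = x*cos(theta) + z*sin(theta) and v = y.
    A = np.array([[np.cos(theta1), np.sin(theta1)],
                  [np.cos(theta2), np.sin(theta2)]])
    x, z = np.linalg.solve(A, np.array([u1, u2]))
    y = 0.5 * (v1 + v2)   # y is unchanged by rotation about the y-axis
    return np.array([x, y, z])

# Simulate: a point at (3, 7, -2) mm, views at 0 and 30 degrees.
p = np.array([3.0, 7.0, -2.0])
t1, t2 = 0.0, np.deg2rad(30)
u1, v1 = p[0] * np.cos(t1) + p[2] * np.sin(t1), p[1]
u2, v2 = p[0] * np.cos(t2) + p[2] * np.sin(t2), p[1]
print(np.allclose(reconstruct(u1, v1, u2, v2, t1, t2), p))  # True
```

    With only two views the system is exactly determined for each matched point, which is why the abstract stresses the loss of volume information: the cluster's spatial arrangement can be estimated, but not a full volumetric image.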

  15. Application of support vector machines to breast cancer screening using mammogram and history data

    Science.gov (United States)

    Land, Walker H., Jr.; Akanda, Anab; Lo, Joseph Y.; Anderson, Francis; Bryden, Margaret

    2002-05-01

    Support Vector Machines (SVMs) are a new and radically different class of classifiers and learning machines that use a hypothesis space of linear functions in a high-dimensional feature space. This relatively new paradigm, based on Statistical Learning Theory (SLT) and Structural Risk Minimization (SRM), has many advantages over traditional neural networks, which are based on Empirical Risk Minimization (ERM). Unlike neural network training, SVM training always finds a global minimum. Furthermore, SVMs have an inherent ability to solve pattern classification problems without incorporating any problem-domain knowledge. In this study, the SVM was employed as a pattern classifier operating on mammography data used for breast cancer detection. The main focus was to formulate the best learning machine configurations for optimum specificity and positive predictive value at very high sensitivities. Using a mammogram database of 500 biopsy-proven samples, the best performing SVM, on average, was able to achieve (under statistical 5-fold cross-validation) a specificity of 45.0% and a positive predictive value (PPV) of 50.1% at 100% sensitivity. At 97% sensitivity, a specificity of 55.8% and a PPV of 55.2% were obtained.
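    An operating point such as "specificity at 100% sensitivity" is read off a classifier's scores by lowering the decision threshold until every positive case is caught, then measuring how many negatives still fall below it. A minimal sketch (the scores below are toy values, not the study's data):

```python
# Sketch of operating-point selection: specificity at a target
# sensitivity, computed directly from classifier scores.

def specificity_at_sensitivity(scores_pos, scores_neg, target_sens=1.0):
    # Threshold must keep at least target_sens of positives above it.
    n_keep = int(round(target_sens * len(scores_pos)))
    thr = sorted(scores_pos, reverse=True)[n_keep - 1]
    true_neg = sum(s < thr for s in scores_neg)
    return true_neg / len(scores_neg)

cancers = [0.9, 0.8, 0.75, 0.6]           # biopsy-positive scores (toy)
normals = [0.85, 0.7, 0.5, 0.4, 0.3, 0.2] # biopsy-negative scores (toy)
print(specificity_at_sensitivity(cancers, normals))  # 0.666... at 100% sens.
```

    The threshold lands on the lowest-scoring cancer case, so specificity at 100% sensitivity is governed entirely by how many normals score below the hardest positive, which is why it is a demanding figure of merit.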

  16. Relationship between arterial vascular calcifications seen on screening mammograms and biochemical markers of endothelial injury

    International Nuclear Information System (INIS)

    Pidal, Diego; Sanchez Vidal, M Teresa; Rodriguez, Juan Carlos; Corte, M Daniela; Pravia, Paz; Guinea, Oscar; Pidal, Ivan; Bongera, Miguel; Escribano, Damaso; Gonzalez, Luis O.; Diez, M Cruz; Venta, Rafael; Vizoso, Francisco J.

    2009-01-01

    To assess whether breast arterial calcifications (BAC) are associated with altered serum markers of cardiovascular risk, mammograms and records from 1759 women (age range: 45-65 years) screened for breast cancer were revised. One hundred and forty-seven (8.36%) women showed BAC. A total of 136 women with BAC and controls (mean age: 57 and 55 years, respectively) accepted entering the study. There were no significant differences in serum levels of urea, glucose, uric acid, creatinine, total cholesterol, HDL-C, LDL-C, folic acid, vitamin B12, TSH or cysteine between both groups of patients. However, women with BAC showed higher serum levels of triglycerides (p = 0.006), homocysteine (p = 0.002) and hs-CRP (p = 0.003) than women without BAC. Likewise, we found a significantly higher percentage of cases with an elevated LDL-C/HDL-C ratio (coronary risk index >2) amongst women with BAC than in women without BAC (56.7 and 38.2%, respectively; p = 0.04). Our results indicate that the finding of BAC identifies women with altered serum markers of cardiovascular risk.

  17. Women with physical disability and the mammogram: An observational study to identify barriers and facilitators

    International Nuclear Information System (INIS)

    Poulos, Ann; Balandin, Susan; Llewellyn, Gwynnyth; McCarthy, Louella; Dark, Leigha

    2011-01-01

    Purpose: To identify barriers and facilitators experienced by women with physical disability having a mammogram. Method: Direct observation of the mammography procedure for women with a range of physical disabilities at screening facilities of BreastScreen NSW Australia. Results: A volunteer sample of 13 women with varying degrees of physical disability participated in the study. The outcomes suggested that many barriers for women with physical disability can be ameliorated by environmental adaptations and guidelines for both radiographers and women. Some women, however, cannot be screened successfully, or can be screened only with a level of trauma and/or pain which militates against their continuation within the screening program. This study identified physical limitations which preclude a successful outcome, limitations which increase the discomfort/pain of the procedure, and aspects of the procedure which can be improved to minimise the experience of discomfort/pain. Conclusion: The outcomes of the study indicate that a decision tool should be developed to provide information for women with physical disability and their doctors on the likelihood of a successful outcome from participation in mammography screening.

  18. Women with physical disability and the mammogram: An observational study to identify barriers and facilitators

    Energy Technology Data Exchange (ETDEWEB)

    Poulos, Ann, E-mail: ann.poulos@sydney.edu.a [University of Sydney, Faculty of Health Sciences, Discipline of Medical Radiation Sciences, PO Box 170, Lidcombe, NSW 1825 (Australia); Balandin, Susan [University of Sydney, Faculty of Health Sciences, Discipline of Speech Pathology, PO Box 170, Lidcombe, NSW 1825 (Australia); Avdeling for helse- og sosialfag, Hogskolen i Molde, Postboks 2110, 6402 Molde (Norway); Llewellyn, Gwynnyth; McCarthy, Louella [University of Sydney, Faculty of Health Sciences, Discipline of Occupational Therapy, PO Box 170, Lidcombe, NSW 1825 (Australia); Dark, Leigha [University of Sydney, Faculty of Health Sciences, Discipline of Speech Pathology, PO Box 170, Lidcombe, NSW 1825 (Australia)

    2011-02-15

    Purpose: To identify barriers and facilitators experienced by women with physical disability having a mammogram. Method: Direct observation of the mammography procedure for women with a range of physical disabilities at screening facilities of BreastScreen NSW Australia. Results: A volunteer sample of 13 women with varying degrees of physical disability participated in the study. The outcomes suggested that many barriers for women with physical disability can be ameliorated by environmental adaptations and guidelines for both radiographers and women. Some women, however, cannot be screened successfully, or can be screened only with a level of trauma and/or pain which militates against their continuation within the screening program. This study identified physical limitations which preclude a successful outcome, limitations which increase the discomfort/pain of the procedure, and aspects of the procedure which can be improved to minimise the experience of discomfort/pain. Conclusion: The outcomes of the study indicate that a decision tool should be developed to provide information for women with physical disability and their doctors on the likelihood of a successful outcome from participation in mammography screening.

  19. Performance Analysis of the Consensus-Based Distributed LMS Algorithm

    Directory of Open Access Journals (Sweden)

    Gonzalo Mateos

    2009-01-01

    Low-cost estimation of stationary signals and reduced-complexity tracking of nonstationary processes are well-motivated tasks that can be accomplished using ad hoc wireless sensor networks (WSNs). To this end, a fully distributed least mean-square (D-LMS) algorithm is developed in this paper, in which sensors exchange messages with single-hop neighbors to consent on the network-wide estimates adaptively. The novel approach does not require a Hamiltonian cycle or a special bridge subset of sensors, while communications among sensors are allowed to be noisy. A mean-square error (MSE) performance analysis of D-LMS is conducted in the presence of a time-varying parameter vector, which adheres to a first-order autoregressive model. For sensor observations that are related to the parameter vector of interest via a linear Gaussian model, and after adopting simplifying independence assumptions, exact closed-form expressions are derived for the global and sensor-level MSE evolution as well as its steady-state (s.s.) values. Mean and MSE-sense stability of D-LMS are also established. Interestingly, extensive numerical tests demonstrate that for small step-sizes the results accurately extend to the pragmatic setting whereby sensors acquire temporally correlated, not necessarily Gaussian data.
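    The idea of combining a local LMS update with single-hop information exchange can be sketched with a toy consensus recursion. This is a simplified scheme, not the paper's exact D-LMS algorithm; the network topology, step size and noise levels are made up:

```python
import numpy as np

# Toy sketch of consensus-based distributed LMS: each sensor runs a
# local LMS update on its own observation, then averages its estimate
# with single-hop neighbors. Simplified recursion; values illustrative.

rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0])            # parameter vector to estimate
neighbors = {0: [1], 1: [0, 2], 2: [1]}   # 3-sensor line network
est = [np.zeros(2) for _ in neighbors]    # per-sensor local estimates
mu = 0.05                                 # LMS step size

for t in range(2000):
    # Local LMS step at each sensor.
    for k in neighbors:
        h = rng.standard_normal(2)                    # regressor
        d = h @ w_true + 0.1 * rng.standard_normal()  # noisy observation
        est[k] = est[k] + mu * (d - h @ est[k]) * h
    # Consensus step: average with single-hop neighbors only.
    est = [np.mean([est[k]] + [est[j] for j in neighbors[k]], axis=0)
           for k in neighbors]

print(max(np.linalg.norm(e - w_true) for e in est) < 0.3)  # all sensors agree
```

    Each sensor communicates only with its immediate neighbors, yet all three estimates settle near the true parameter vector, which is the qualitative behavior the MSE analysis in the paper makes precise.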

  20. The ganga user interface for physics analysis and distributed resources

    CERN Document Server

    Soroko, A; Adams, D; Harrison, K; Charpentier, P; Maier, A; Mato, P; Moscicki, J T; Egede, U; Martyniak, J; Jones, R; Patrick, G N

    2004-01-01

    A physicist analysing data from the LHC experiments will have to deal with data and computing resources that are distributed across multiple locations and have different access methods. Ganga helps by providing a uniform high-level interface to the different low-level solutions for the required tasks, ranging from the specification of input data to the retrieval and post-processing of the output. For LHCb and ATLAS the goal is to assist in running jobs based on the Gaudi/Athena C++ framework. Ganga is written in Python and presents the user with a single GUI rather than a set of different applications. It uses pluggable modules to interact with external tools for operations such as querying metadata catalogues, job configuration and job submission. At start-up, the user is presented with a list of templates for common analysis tasks, and information about ongoing tasks is stored from one invocation to the next. Ganga can also be used through a command line interface. This closely mirrors the functionality of ...

  1. Distribution

    Science.gov (United States)

    John R. Jones

    1985-01-01

    Quaking aspen is the most widely distributed native North American tree species (Little 1971, Sargent 1890). It grows in a great diversity of regions, environments, and communities (Harshberger 1911). Only one deciduous tree species in the world, the closely related Eurasian aspen (Populus tremula), has a wider range (Weigle and Frothingham 1911)....

  2. The analysis of annual dose distributions for radiation workers

    International Nuclear Information System (INIS)

    Mill, A.J.

    1984-05-01

    The system of dose limitation recommended by the ICRP includes the requirement that no worker shall exceed the current dose limit of 50 mSv/a. Continuous exposure at this limit corresponds to an annual death rate comparable with that in 'high risk' industries. In practice, there is a distribution of doses with an arithmetic mean lower than the dose limit. In its 1977 report UNSCEAR defined a reference dose distribution for the purposes of comparison. However, this two-parameter distribution does not show the departure from log-normality normally observed in actual distributions at doses which are a significant proportion of the annual limit. In this report an alternative model is suggested, based on a three-parameter log-normal distribution. The third parameter is an 'effective dose limit', and such a model fits the observed departure from log-normality in actual dose distributions very well. (author)
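    The report's exact three-parameter form is not reproduced here, but the qualitative behavior can be sketched by assuming a bounded (Johnson S_B-type) construction in which the log-odds of dose against an effective limit L is normal. For small doses this behaves like an ordinary log-normal, while the density is forced to zero at d = L, producing the departure from log-normality near the limit. All parameter values are illustrative:

```python
import numpy as np

# Sketch of a bounded dose model with an "effective dose limit" L:
# assume log(d / (L - d)) ~ Normal(mu, sigma). This is an assumed
# Johnson S_B-type form, not necessarily the report's parameterization.

rng = np.random.default_rng(1)
L = 50.0               # effective dose limit, mSv (illustrative)
mu, sigma = -2.5, 1.0  # normal parameters on the logit scale (illustrative)

z = rng.normal(mu, sigma, 100_000)
doses = L * np.exp(z) / (1.0 + np.exp(z))   # maps the real line onto (0, L)

print(doses.max() < L)    # True: no simulated dose exceeds the limit
print(doses.mean() < L)   # True: the arithmetic mean sits well below it
```

    By construction every simulated dose stays below L, and the bulk of the distribution remains approximately log-normal, matching the observation that real dose distributions depart from log-normality only near the annual limit.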

  3. A Maximum Entropy Approach to Loss Distribution Analysis

    Directory of Open Access Journals (Sweden)

    Marco Bee

    2013-03-01

    In this paper we propose an approach to the estimation and simulation of loss distributions based on Maximum Entropy (ME), a non-parametric technique that maximizes the Shannon entropy of the data under moment constraints. Special cases of the ME density correspond to standard distributions; the methodology is therefore very general, as it nests most classical parametric approaches. Sampling the ME distribution is essential in many contexts, such as loss models constructed via compound distributions. Given the difficulties in carrying out exact simulation, we propose an innovative algorithm, obtained by means of an extension of Adaptive Importance Sampling (AIS), for the approximate simulation of the ME distribution. Several numerical experiments confirm that the AIS-based simulation technique works well, and an application to insurance data gives further insights into the usefulness of the method for modelling, estimating and simulating loss distributions.
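    The claim that special cases of the ME density correspond to standard distributions can be illustrated with the simplest closed-form case: on [0, inf) with only the first moment constrained, the maximum-entropy density is the exponential. The sketch below samples it by inverse CDF and checks the moment constraint empirically (the paper's AIS sampler handles the general case; this is only the closed-form special case):

```python
import math
import random

# ME density on [0, inf) under a mean constraint m is f(x) = (1/m) e^(-x/m),
# i.e. the exponential distribution. Sample by inverse CDF and verify.

random.seed(42)
m = 2.5                                   # imposed mean constraint
samples = [-m * math.log(1.0 - random.random()) for _ in range(200_000)]

emp_mean = sum(samples) / len(samples)
print(abs(emp_mean - m) < 0.05)           # sample mean matches the constraint
# Differential entropy of Exp(mean m) is 1 + ln(m); any other density on
# [0, inf) with the same mean has strictly lower entropy.
print(round(1 + math.log(m), 3))
```

    The same logic extends to higher-order moment constraints, where the multipliers of the exponential-family ME density no longer have closed forms and numerical methods such as the paper's AIS extension become necessary.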

  4. A Distributional Analysis of the Gender Wage Gap in Bangladesh

    OpenAIRE

    Salma Ahmed; Pushkar Maitra

    2011-01-01

    This paper decomposes the gender wage gap along the entire wage distribution into an endowment effect and a discrimination effect, taking into account possible selection into full-time employment. Applying a new decomposition approach to the Bangladesh Labour Force Survey (LFS) data, we find that women are paid less than men everywhere along the wage distribution and that the gap is wider at the lower end of the distribution. Discrimination against women is the primary determinant of the wage gap. W...

  5. Empirical analysis on the runners' velocity distribution in city marathons

    Science.gov (United States)

    Lin, Zhenquan; Meng, Fan

    2018-01-01

    In recent decades, much research has been performed on human temporal activity and mobility patterns, while few investigations have examined the features of the velocity distributions of human mobility. In this paper, we investigated empirically the velocity distributions of finishers in the New York City, Chicago, Berlin and London marathons. By statistical analyses of the finish time records, we captured some statistical features of human behavior in marathons: (1) the velocity distributions of all finishers, and of the subset of finishers in the fastest age group, both follow a log-normal distribution; (2) in the New York City marathon, the velocity distribution of all male runners across eight 5-kilometer internal timing courses undergoes two transitions: from a log-normal distribution at the initial stage (the first few courses) to a Gaussian distribution at the middle stage (several middle courses), and back to a log-normal distribution at the last stage (the final courses); (3) the intensity of the competition, described by the root-mean-square value of the rank changes of all runners, weakens from the initial stage to the middle stage, corresponding to the transition from a log-normal to a Gaussian distribution; when the competition strengthens again in the last course of the middle stage, a transition from a Gaussian back to a log-normal distribution occurs at the last stage. This study may enrich research on human mobility patterns and draw attention to the velocity features of human mobility.
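    The competition-intensity measure, the root-mean-square change in runners' ranks between consecutive timing courses, can be sketched directly. The split times below are synthetic; in the paper they come from the 5-km interval records:

```python
import numpy as np

# Sketch of the competition-intensity measure: RMS change in runners'
# ranks between two consecutive timing points. Data are synthetic.

def rms_rank_change(times_a, times_b):
    """times_a, times_b: cumulative times of the same runners at two
    consecutive timing points."""
    rank_a = np.argsort(np.argsort(times_a))   # rank of each runner
    rank_b = np.argsort(np.argsort(times_b))
    return np.sqrt(np.mean((rank_b - rank_a) ** 2))

rng = np.random.default_rng(7)
t1 = rng.lognormal(mean=3.0, sigma=0.1, size=1000)       # times at course k
t2 = t1 + rng.lognormal(mean=3.0, sigma=0.1, size=1000)  # plus next split
print(rms_rank_change(t1, t1) == 0.0)   # a field compared with itself
print(rms_rank_change(t1, t2) > 0.0)    # independent splits reshuffle ranks
```

    A large RMS value means the field is reshuffling heavily (intense competition); a value near zero means ranks are frozen, which is the regime the paper associates with the Gaussian middle stage.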

  6. Distributed Leadership in Drainage Basin Management: A Critical Analysis of ‘River Chief Policy’ from a Distributed Leadership Perspective

    Science.gov (United States)

    Zhang, Liuyi

    2018-02-01

    Water resources management has been more significant than ever since the official document stipulated 'three red lines' to strictly control water usage and water pollution, accelerating the promotion of the 'River Chief Policy' throughout China. The policy launches creative approaches to include people from different administrative levels and distributes power to increase drainage basin management efficiency. Its execution resembles features of distributed leadership theory, a widely acknowledged Western leadership theory with an innovative perspective suited to the modern world. This paper analyses the policy from a distributed leadership perspective using Taylor's critical policy analysis framework.

  7. A planning and analysis framework for evaluating distributed generation and utility strategies

    International Nuclear Information System (INIS)

    Ault, Graham W.

    2000-01-01

    The number of smaller-scale distributed power generation units connected to the distribution networks of electricity utilities in the UK and elsewhere has grown significantly in recent years. Numerous economic and political drivers have stimulated this growth and continue to provide the environment for future growth in distributed generation. The simple fact that distributed generation is independent of the distribution utility complicates planning and operational tasks for the distribution network. The uncertainty relating to the number, location and type of distributed generating units to be connected to the distribution network in the future makes distribution planning a particularly difficult activity. This thesis concerns the problem of distribution network and business planning in the era of distributed generation. A distributed generation strategic analysis framework is proposed to provide the analytical capability and the planning and decision-making framework needed to enable distribution utilities to deal effectively with the challenges and opportunities presented to them by distributed generation. The distributed generation strategic analysis framework is based on the best features of modern planning and decision-making methodologies and facilitates scenario-based analysis across many utility strategic options and uncertainties. Case studies are presented and assessed to clearly illustrate the potential benefits of such an approach to distributed generation planning in the UK electricity supply industry. (author)

  8. Distribution analysis of segmented wave sea clutter in littoral environments

    CSIR Research Space (South Africa)

    Strempel, MD

    2015-10-01

    … are then fitted against the K-distribution. It is shown that the approach can accurately describe specific sections of the wave, with a reduced error between the actual and estimated distributions. The improved probability density function (PDF) representation...
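    The K-distribution used for sea clutter admits a compound ("texture times speckle") construction: a gamma-distributed texture modulating exponential speckle intensity. A sketch of generating K-distributed intensities this way and recovering the shape parameter from moments, using the relation E[z^2]/E[z]^2 = 2(1 + 1/nu); the data are synthetic, not the CSIR radar measurements:

```python
import numpy as np

# Compound view of K-distributed clutter intensity: gamma texture
# (mean 1, shape nu) times unit-mean exponential speckle. The shape
# parameter is recovered from the first two intensity moments.

rng = np.random.default_rng(3)
nu = 1.5                                   # true K-distribution shape
n = 500_000
texture = rng.gamma(shape=nu, scale=1.0 / nu, size=n)   # mean-1 texture
intensity = texture * rng.exponential(scale=1.0, size=n)

m1, m2 = intensity.mean(), (intensity ** 2).mean()
nu_hat = 1.0 / (m2 / (2.0 * m1 ** 2) - 1.0)             # moment estimator
print(abs(nu_hat - nu) < 0.1)              # estimate lands near the true shape
```

    Small nu gives spiky, heavy-tailed clutter; as nu grows the texture flattens and the intensity tends to the exponential (Rayleigh-amplitude) case, which is why the shape parameter is the quantity of interest when fitting segmented wave sections.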

  9. Analysis of contrast and absorbed doses in mammography

    International Nuclear Information System (INIS)

    Augusto, F.M.; Ghilardi Netto, T.; Subtil, L.J.; Silva, R. da

    2001-01-01

    One of the great causes of mortality among women worldwide is breast cancer. Mammography is the most efficient method for detecting some cases of breast cancer before they become clinically evident. The quality of an imaging system must be determined by its ability to detect soft-tissue masses, cysts or tumors, as well as calcifications. This detection is directly connected with the contrast obtained in these images. The objective of this work is to develop a method for analyzing this contrast in mammograms, verifying the doses associated with these mammograms and comparing them with national and international reference levels. (author)
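    A simple contrast measure of the kind used to evaluate mammographic image quality compares the mean signal in a lesion region with the surrounding background, normalised by the background noise (a contrast-to-noise ratio). The phantom below is synthetic; the paper's actual contrast metric may differ:

```python
import numpy as np

# Sketch of a contrast-to-noise ratio (CNR) on a synthetic phantom:
# a uniform noisy background with an embedded higher-signal "mass".

rng = np.random.default_rng(5)
image = rng.normal(loc=100.0, scale=5.0, size=(64, 64))  # background
image[24:40, 24:40] += 20.0                              # embedded mass

lesion = image[24:40, 24:40]
background = np.concatenate([image[:24].ravel(), image[40:].ravel()])
cnr = (lesion.mean() - background.mean()) / background.std()

print(cnr > 3.0)   # the mass is clearly detectable in this phantom
```

    Dose and contrast trade off in mammography (higher tube output generally lowers quantum noise and raises CNR), which is why a contrast analysis is naturally paired with a check of patient doses against reference levels.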

  10. Similarity Analysis for Reactor Flow Distribution Test and Its Validation

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Soon Joon; Ha, Jung Hui [Heungdeok IT Valley, Yongin (Korea, Republic of); Lee, Taehoo; Han, Ji Woong [KAERI, Daejeon (Korea, Republic of)

    2015-05-15

    facility. It was clearly found in Hong et al. In this study the feasibility of the similarity analysis of Hong et al. was examined. The similarity analysis was applied to SFR which has been designed in KAERI (Korea Atomic Energy Research Institute) in order to design the reactor flow distribution test. The length scale was assumed to be 1/5, and the velocity scale 1/2, which bounds the square root of the length scale (1/√5). The CFX calculations for both prototype and model were carried out and the flow field was compared.

  11. Entropy-based derivation of generalized distributions for hydrometeorological frequency analysis

    Science.gov (United States)

    Chen, Lu; Singh, Vijay P.

    2018-02-01

    Frequency analysis of hydrometeorological and hydrological extremes is needed for the design of hydraulic and civil infrastructure as well as for water resources management. A multitude of distributions have been employed for frequency analysis of these extremes, but no single distribution has been accepted as a global standard. Employing entropy theory, this study derived five generalized distributions for frequency analysis, each using a different kind of information encoded as constraints. These distributions were the generalized gamma (GG), the generalized beta distribution of the second kind (GB2), and the Halphen type A (Hal-A), Halphen type B (Hal-B) and Halphen type inverse B (Hal-IB) distributions, among which the GG and GB2 distributions were previously derived by Papalexiou and Koutsoyiannis (2012), while the Halphen family is first derived using entropy theory in this paper. Entropy theory allowed the parameters of the distributions to be estimated in terms of the constraints used for their derivation. The distributions were tested using extreme daily and hourly rainfall data. Results show that the root mean square error (RMSE) values were very small, indicating that the five generalized distributions fitted the extreme rainfall data well. Among them, according to the Akaike information criterion (AIC) values, the GB2 and the Halphen family generally gave a better fit. These generalized distributions are therefore among the best choices for frequency analysis, and the entropy-based derivation opens a new way for frequency analysis of hydrometeorological extremes.
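
As a concrete illustration of this fitting-and-comparison workflow, the sketch below fits the generalized gamma family (via `scipy.stats.gengamma`) and an ordinary gamma baseline to a synthetic rainfall-extreme sample and ranks them by AIC. The sample and the maximum-likelihood fitting are hypothetical stand-ins; in the paper the parameters follow from the entropy constraints rather than from numerical maximum likelihood, but the AIC ranking step is the same.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical stand-in for an extreme daily rainfall sample (mm).
data = rng.gamma(shape=2.0, scale=30.0, size=200)

# Fit two candidate families by maximum likelihood and compare AIC.
candidates = {
    "generalized gamma (GG)": stats.gengamma,
    "ordinary gamma": stats.gamma,
}
aics = {}
for name, dist in candidates.items():
    params = dist.fit(data)
    loglik = np.sum(dist.logpdf(data, *params))
    aics[name] = 2 * len(params) - 2 * loglik  # lower AIC = better fit
    print(f"{name}: AIC = {aics[name]:.1f}")
```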

  12. Skewness and kurtosis analysis for non-Gaussian distributions

    Science.gov (United States)

    Celikoglu, Ahmet; Tirnakli, Ugur

    2018-06-01

    In this paper we address a number of pitfalls regarding the use of kurtosis as a measure of deviations from the Gaussian. We treat kurtosis in both its standard definition and that which arises in q-statistics, namely q-kurtosis. We have recently shown that the relation proposed by Cristelli et al. (2012) between skewness and kurtosis can only be verified for relatively small data sets, independently of the type of statistics chosen; however it fails for sufficiently large data sets, if the fourth moment of the distribution is finite. For infinite fourth moments, kurtosis is not defined as the size of the data set tends to infinity. For distributions with finite fourth moments, the size, N, of the data set for which the standard kurtosis saturates to a fixed value, depends on the deviation of the original distribution from the Gaussian. Nevertheless, using kurtosis as a criterion for deciding which distribution deviates further from the Gaussian can be misleading for small data sets, even for finite fourth moment distributions. Going over to q-statistics, we find that although the value of q-kurtosis is finite in the range of 0 < q < 3, this quantity is not useful for comparing different non-Gaussian distributed data sets, unless the appropriate q value, which truly characterizes the data set of interest, is chosen. Finally, we propose a method to determine the correct q value and thereby to compute the q-kurtosis of q-Gaussian distributed data sets.
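
The size-dependence of sample kurtosis described above is easy to demonstrate numerically. The sketch below is a hypothetical illustration (not the authors' code) using a Laplace distribution, whose true excess kurtosis is 3; the estimate settles near the true value only as N grows large.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Sample excess kurtosis of a finite-fourth-moment, non-Gaussian law:
# small-N estimates are unreliable; the estimate saturates near the
# true value (3 for the Laplace distribution) only for large N.
estimates = {}
for n in (100, 10_000, 1_000_000):
    x = rng.laplace(size=n)
    estimates[n] = stats.kurtosis(x)  # Fisher definition: Gaussian -> 0
    print(f"N = {n:>9}: sample excess kurtosis = {estimates[n]:.2f}")
```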

  13. Analysis of a Pareto Mixture Distribution for Maritime Surveillance Radar

    Directory of Open Access Journals (Sweden)

    Graham V. Weinberg

    2012-01-01

    Full Text Available The Pareto distribution has been shown to be an excellent model for X-band high-resolution maritime surveillance radar clutter returns. Given the success of mixture distributions in radar, it is thus of interest to consider the effect of Pareto mixture models. This paper introduces a formulation of a Pareto intensity mixture distribution and investigates coherent multilook radar detector performance using this new clutter model. Clutter parameter estimates are derived from data sets produced by the Defence Science and Technology Organisation's Ingara maritime surveillance radar.
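
A Pareto intensity mixture of the kind introduced here can be simulated directly. The sketch below draws from a two-component mixture of Lomax-form Pareto laws by inverse-CDF sampling; the weight and shape parameters are illustrative choices, not values fitted to the Ingara data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two-component Pareto (Lomax-form) intensity mixture:
# f(x) = w * Pareto(a1) + (1 - w) * Pareto(a2), x >= 0.
w, a1, a2 = 0.7, 3.5, 11.0   # illustrative parameters only

n = 100_000
shapes = np.where(rng.random(n) < w, a1, a2)   # pick a component per draw
# Lomax CDF is F(x) = 1 - (1 + x)**(-a), so X = (1 - U)**(-1/a) - 1.
samples = (1.0 - rng.random(n)) ** (-1.0 / shapes) - 1.0

# The heavier-tailed component dominates high-threshold exceedances,
# which is what matters for detector false-alarm behaviour.
print("P(X > 2) ~", float(np.mean(samples > 2.0)))
```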

  14. The gluon distribution at small x - a phenomenological analysis

    International Nuclear Information System (INIS)

    Harriman, P.N.; Martin, A.D.; Stirling, W.J.; Roberts, R.G.

    1990-03-01

    The size of the gluon distribution at small x has important implications for phenomenology at future high energy hadron-hadron and lepton-hadron colliders. We extend a recent global parton distribution fit to investigate the constraints on the gluon from deep inelastic and prompt photon data. In particular, we estimate a band of allowed gluon distributions with qualitatively different small-x behaviour and study the implications of these on a variety of cross sections at high energy pp and ep colliders. (author)

  15. Distributed processing and analysis of ATLAS experimental data

    CERN Document Server

    Barberis, D; The ATLAS collaboration

    2011-01-01

    The ATLAS experiment has been taking data steadily since Autumn 2009, collecting close to 1 fb-1 of data (several petabytes of raw and reconstructed data per year of data-taking). Data are calibrated, reconstructed, distributed and analysed at over 100 different sites using the World-wide LHC Computing Grid and the tools produced by the ATLAS Distributed Computing project. In addition to event data, ATLAS produces a wealth of information on detector status, luminosity, calibrations, alignments, and data processing conditions. This information is stored in relational databases, online and offline, and made transparently available to analysers of ATLAS data world-wide through an infrastructure consisting of distributed database replicas and web servers that exploit caching technologies. This paper reports on the experience of using this distributed computing infrastructure with real data and in real time, on the evolution of the computing model driven by this experience, and on the system performance during the first...

  16. Distributed processing and analysis of ATLAS experimental data

    CERN Document Server

    Barberis, D; The ATLAS collaboration

    2011-01-01

    The ATLAS experiment has been taking data steadily since Autumn 2009, and has so far collected over 5 fb-1 of data (several petabytes of raw and reconstructed data per year of data-taking). Data are calibrated, reconstructed, distributed and analysed at over 100 different sites using the World-wide LHC Computing Grid and the tools produced by the ATLAS Distributed Computing project. In addition to event data, ATLAS produces a wealth of information on detector status, luminosity, calibrations, alignments, and data processing conditions. This information is stored in relational databases, online and offline, and made transparently available to analysers of ATLAS data world-wide through an infrastructure consisting of distributed database replicas and web servers that exploit caching technologies. This paper reports on the experience of using this distributed computing infrastructure with real data and in real time, on the evolution of the computing model driven by this experience, and on the system performance during the...

  17. Analysis of acidic properties of distribution transformer oil insulation

    African Journals Online (AJOL)

    The system detects when the acid- ... rated above 500 kVA are classed as power transformers. Transformers rated at ... generate great impact in safety, reliability and cost of the electric ... the primary voltage of the electric distribution system to.

  18. Determination analysis of energy conservation standards for distribution transformers

    Energy Technology Data Exchange (ETDEWEB)

    Barnes, P.R.; Van Dyke, J.W.; McConnell, B.W.; Das, S.

    1996-07-01

    This report contains information for US DOE to use in making a determination on proposing energy conservation standards for distribution transformers as required by the Energy Policy Act of 1992. Potential for saving energy with more efficient liquid-immersed and dry-type distribution transformers could be significant because these transformers account for an estimated 140 billion kWh of the annual energy lost in the delivery of electricity. Objective was to determine whether energy conservation standards for distribution transformers would have the potential for significant energy savings, be technically feasible, and be economically justified from a national perspective. It was found that energy conservation for distribution transformers would be technically and economically feasible. Based on the energy conservation options analyzed, 3.6-13.7 quads of energy could be saved from 2000 to 2030.
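
A quick unit-conversion check relates the reported 140 billion kWh of annual transformer losses to the 3.6-13.7 quad savings estimate (1 quad = 10^15 Btu, 1 kWh ≈ 3412 Btu). The arithmetic below is illustrative only, using round conversion factors:

```python
# Illustrative arithmetic only; conversion factors are approximate.
KWH_TO_BTU = 3412.0
QUAD_BTU = 1.0e15            # 1 quad = 10^15 Btu

annual_loss_kwh = 140e9      # reported annual distribution-transformer losses
annual_loss_quads = annual_loss_kwh * KWH_TO_BTU / QUAD_BTU
pool_quads = 31 * annual_loss_quads   # 2000-2030 horizon, inclusive

print(f"annual losses ~ {annual_loss_quads:.2f} quads")
print(f"2000-2030 loss pool ~ {pool_quads:.1f} quads")
```

On these rough numbers the projected 3.6-13.7 quad savings would recover a substantial fraction of the roughly 15-quad cumulative loss pool.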

  19. A preliminary survey and analysis of the spatial distribution of ...

    African Journals Online (AJOL)

    The spatial distribution of aquatic macroinvertebrates in the Okavango River ... of taxa was recorded in marginal vegetation in the channels and lagoons, ... highlights the importance of maintaining a mosaic of aquatic habitats in the Delta.

  20. Short circuit analysis of distribution system with integration of DG

    DEFF Research Database (Denmark)

    Su, Chi; Liu, Zhou; Chen, Zhe

    2014-01-01

    Integration of distributed generation (DG) such as wind turbines into distribution systems is increasing all around the world because of its flexible and environmentally friendly characteristics. However, DG integration may change the pattern of the fault currents in the distribution system and as a result bring challenges to the network protection system. This problem has been frequently discussed in the literature, but mostly considering only the balanced fault situation. This paper presents an investigation of the influence of full converter based wind turbine (WT) integration on fault currents during both balanced and unbalanced faults. Major factors such as external grid short circuit power capacity, WT integration location, and connection type of the WT integration transformer are taken into account. In turn, the challenges brought to the protection system in the distribution network are presented...

  1. A preliminary survey and analysis of the spatial distribution of ...

    African Journals Online (AJOL)

    The spatial distribution of aquatic macroinvertebrates in the Okavango River Delta, ... seasonally-flooded pools and temporary rain-filled pools in MGR and CI. ... biodiversity of the Okavango Delta, thereby contributing to its conservation.

  2. Distributed Multiscale Data Analysis and Processing for Sensor Networks

    National Research Council Canada - National Science Library

    Wagner, Raymond; Sarvotham, Shriram; Choi, Hyeokho; Baraniuk, Richard

    2005-01-01

    .... Second, the communication overhead of multiscale algorithms can become prohibitive. In this paper, we take a first step in addressing both shortcomings by introducing two new distributed multiresolution transforms...

  3. Tradespace Analysis Tool for Designing Earth Science Distributed Missions

    Data.gov (United States)

    National Aeronautics and Space Administration — The ESTO 2030 Science Vision envisions the future of Earth Science to be characterized by 'many more distributed observations,' and 'formation-flying [missions that]...

  4. Data analysis and mapping of the mountain permafrost distribution

    Science.gov (United States)

    Deluigi, Nicola; Lambiel, Christophe; Kanevski, Mikhail

    2017-04-01

    the permafrost occurrence where it is unknown, the mentioned supervised learning techniques inferred a classification function from labelled training data (pixels of permafrost absence and presence). Particular attention was given to the pre-processing of the dataset, with a study of its complexity and of the relation between the permafrost data and the employed environmental variables. The application of feature selection techniques completed this analysis and revealed redundant or uninformative predictors. Classification performances were assessed with the AUROC on independent validation sets (0.81 for LR, 0.85 for SVM and 0.88 for RF). At the micro scale, the obtained permafrost maps are consistent with the field reality thanks to the high resolution of the dataset (10 meters). Moreover, compared to classical models, the permafrost prediction is computed without resorting to altitude thresholds (above which permafrost may be found). Finally, as machine learning is a non-deterministic approach, mountain permafrost distribution maps are presented and discussed together with corresponding uncertainty maps, which provide information on the quality of the results.
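
The supervised-learning evaluation described above can be sketched as follows. The predictors here are random stand-ins for the environmental variables (the real study used labelled permafrost presence/absence pixels at 10 m resolution), so only the workflow, not the numbers, carries over.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)

# Synthetic stand-ins for environmental predictors and permafrost labels.
X = rng.normal(size=(2000, 5))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=2000) > 0).astype(int)

# Train on labelled pixels, assess with AUROC on an independent split.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(f"held-out AUROC: {auc:.2f}")
```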

  5. Spatial Distribution Analysis of Scrub Typhus in Korea

    OpenAIRE

    Jin, Hong Sung; Chu, Chaeshin; Han, Dong Yeob

    2013-01-01

    Objective: This study analyzes the spatial distribution of scrub typhus in Korea. Methods: A spatial distribution of Orientia tsutsugamushi occurrence using a geographic information system (GIS) is presented, and analyzed by means of spatial clustering and correlations. Results: The provinces of Gangwon-do and Gyeongsangbuk-do show a low incidence throughout the year. Some districts have almost identical environmental conditions of scrub typhus incidence. The land use change of districts does...

  6. Analysis of transverse field distributions in Porro prism resonators

    Science.gov (United States)

    Litvin, Igor A.; Burger, Liesl; Forbes, Andrew

    2007-05-01

    A model to describe the transverse field distribution of the output beam from Porro prism resonators is proposed. The model allows the prediction of the output transverse field distribution by assuming that the main areas of loss are located at the apexes of the Porro prisms. Experimental work on a particular system showed some interesting correlations between the time domain behavior of the resonator and the transverse field output. These findings are presented and discussed.

  7. Analysis of Strengthening Steel Distribution Channel in Domestic Automotive Industry

    OpenAIRE

    Pangraksa, Sugeng; Djajadiningrat, Surna Tjahja

    2013-01-01

    Distribution has a strategic role in moving product from the manufacturer to the end-user. The automotive industry needs a distribution channel with excellent data management, timely delivery management, excellent quality management, and competitive cost reduction. Krakatau Steel (KS) distributors have weaknesses in entering the current automotive market, which imposes tight prerequisites such as consistency of product quality, good cooperation, close relationships, continuous cost reduction, and wide spread to...

  8. Distributed generation: An empirical analysis of primary motivators

    International Nuclear Information System (INIS)

    Carley, Sanya

    2009-01-01

    Once dominated by centralized fossil-fuel power plants, the electricity industry in the United States is now evolving into a more decentralized and deregulated entity. While the future scope and scale of the industry are not yet apparent, recent trends indicate that distributed generation electricity applications may play an important role in this transformation. This paper examines which types of utilities are more likely to adopt distributed generation systems and, additionally, which factors motivate decisions of adoption and system capacity size. Results of a standard two-part model reveal that private utilities are significantly more inclined to adopt distributed generation than cooperatives and other types of public utilities. We also find evidence that interconnection standards and renewable portfolio standards effectively encourage consumer-owned distributed generation, while market forces associated with greater market competition encourage utility-owned distributed generation. Net metering programs are also found to have a significant marginal effect on distributed generation adoption and deployment.
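
The "standard two-part model" referred to here combines a binary adoption equation with a conditional capacity equation estimated only for adopters. A minimal sketch on synthetic data (all variables and coefficients are hypothetical, and sklearn stands in for the paper's econometric estimation):

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(4)

# Hypothetical utility-level covariates: private-ownership dummy,
# net-metering indicator, and a market-competition index.
n = 500
X = np.column_stack([
    rng.integers(0, 2, n).astype(float),
    rng.integers(0, 2, n).astype(float),
    rng.normal(size=n),
])

# Part 1: whether the utility adopts distributed generation (logit).
p_adopt = 1 / (1 + np.exp(-(-1.0 + 1.2 * X[:, 0] + 0.8 * X[:, 1])))
adopt = rng.random(n) < p_adopt

# Part 2: system capacity, modelled only for adopters (linear).
capacity = 5 + 3 * X[:, 2] + rng.normal(size=n)

part1 = LogisticRegression().fit(X, adopt)
part2 = LinearRegression().fit(X[adopt], capacity[adopt])
print("adoption coefficients:", np.round(part1.coef_[0], 2))
print("capacity coefficients:", np.round(part2.coef_, 2))
```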

  9. Distributed generation: An empirical analysis of primary motivators

    Energy Technology Data Exchange (ETDEWEB)

    Carley, Sanya [Department of Public Policy and Center for Sustainable Energy, Environment, and Economic Development, University of North Carolina at Chapel Hill, CB3435, Chapel Hill, NC 27599 (United States)], E-mail: scarley@email.unc.edu

    2009-05-15

    Once dominated by centralized fossil-fuel power plants, the electricity industry in the United States is now evolving into a more decentralized and deregulated entity. While the future scope and scale of the industry are not yet apparent, recent trends indicate that distributed generation electricity applications may play an important role in this transformation. This paper examines which types of utilities are more likely to adopt distributed generation systems and, additionally, which factors motivate decisions of adoption and system capacity size. Results of a standard two-part model reveal that private utilities are significantly more inclined to adopt distributed generation than cooperatives and other types of public utilities. We also find evidence that interconnection standards and renewable portfolio standards effectively encourage consumer-owned distributed generation, while market forces associated with greater market competition encourage utility-owned distributed generation. Net metering programs are also found to have a significant marginal effect on distributed generation adoption and deployment.

  11. Computer-aided detection system applied to full-field digital mammograms

    International Nuclear Information System (INIS)

    Vega Bolivar, Alfonso; Sanchez Gomez, Sonia; Merino, Paula; Alonso-Bartolome, Pilar; Ortega Garcia, Estrella; Munoz Cacho, Pedro; Hoffmeister, Jeffrey W.

    2010-01-01

    Background: Although mammography remains the mainstay for breast cancer screening, it is an imperfect examination with a sensitivity of 75-92% for breast cancer. Computer-aided detection (CAD) has been developed to improve mammographic detection of breast cancer. Purpose: To retrospectively estimate CAD sensitivity and false-positive rate with full-field digital mammograms (FFDMs). Material and Methods: CAD was used to evaluate 151 cases of ductal carcinoma in situ (DCIS) (n=48) and invasive breast cancer (n=103) detected with FFDM. Retrospectively, CAD sensitivity was estimated based on breast density, mammographic presentation, histopathology type, and lesion size. CAD false-positive rate was estimated with screening FFDMs from 200 women. Results: CAD detected 93% (141/151) of cancer cases: 97% (28/29) in fatty breasts, 94% (81/86) in breasts containing scattered fibroglandular densities, 90% (28/31) in heterogeneously dense breasts, and 80% (4/5) in extremely dense breasts. CAD detected 98% (54/55) of cancers manifesting as calcifications, 89% (74/83) as masses, and 100% (13/13) as mixed masses and calcifications. CAD detected 92% (73/79) of invasive ductal carcinomas, 89% (8/9) of invasive lobular carcinomas, 93% (14/15) of other invasive carcinomas, and 96% (46/48) of DCIS. CAD sensitivity for cancers 1-10 mm was 87% (47/54); 11-20 mm, 99% (70/71); 21-30 mm, 86% (12/14); and larger than 30 mm, 100% (12/12). The CAD false-positive rate was 2.5 marks per case. Conclusion: CAD with FFDM showed a high sensitivity in identifying cancers manifesting as calcifications or masses. CAD sensitivity was maintained in small lesions (1-20 mm) and invasive lobular carcinomas, which have lower mammographic sensitivity
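
The subgroup sensitivities quoted above are simple ratios of cancers detected by CAD to cancers in the subgroup; recomputing a few of them from the reported counts:

```python
# Reported counts: (cancers detected by CAD, cancers in subgroup).
subgroups = {
    "all cases": (141, 151),
    "calcifications": (54, 55),
    "masses": (74, 83),
    "DCIS": (46, 48),
}

sensitivity = {}
for name, (hit, total) in subgroups.items():
    sensitivity[name] = 100.0 * hit / total
    print(f"{name:>15}: {hit}/{total} = {sensitivity[name]:.0f}%")
```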

  12. Human observer detection experiments with mammograms and power-law noise

    International Nuclear Information System (INIS)

    Burgess, Arthur E.; Jacobson, Francine L.; Judy, Philip F.

    2001-01-01

    We determined contrast thresholds for lesion detection as a function of lesion size in both mammograms and filtered noise backgrounds with the same average power spectrum, P(f) = B/f^3. Experiments were done using hybrid images with digital images of tumors added to digitized normal backgrounds, displayed on a monochrome monitor. Four tumors were extracted from digitized specimen radiographs. The lesion sizes were varied by digital rescaling to cover the range from 0.5 to 16 mm. Amplitudes were varied to determine the value required for 92% correct detection in two-alternative forced-choice (2AFC) and 90% for search experiments. Three observers participated, two physicists and a radiologist. The 2AFC mammographic results demonstrated a novel contrast-detail (CD) diagram with threshold amplitudes that increased steadily (with slope of 0.3) with increasing size for lesions larger than 1 mm. The slopes for prewhitening model observers were about 0.4. Human efficiency relative to these models was as high as 90%. The CD diagram slopes for the 2AFC experiments with filtered noise were 0.44 for humans and 0.5 for models. Human efficiency relative to the ideal observer was about 40%. The difference in efficiencies for the two types of backgrounds indicates that breast structure cannot be considered to be pure random noise for 2AFC experiments. Instead, 2AFC human detection with mammographic backgrounds is limited by a combination of noise and deterministic masking effects. The search experiments also gave thresholds that increased with lesion size. However, there was no difference in human results for mammographic and filtered noise backgrounds, suggesting that breast structure can be considered to be pure random noise for this task. Our conclusion is that, in spite of the fact that mammographic backgrounds have nonstationary statistics, models based on statistical decision theory can still be applied successfully to estimate human performance.
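
Filtered-noise backgrounds with the mammographic power-law spectrum P(f) = B/f^3 are commonly synthesized in the Fourier domain. The sketch below assumes random phases and takes the real part of the inverse FFT, a standard shortcut rather than the authors' exact procedure:

```python
import numpy as np

rng = np.random.default_rng(5)

def power_law_noise(n=256, beta=3.0):
    """2-D noise field with isotropic power spectrum P(f) proportional to 1/f**beta."""
    fx, fy = np.fft.fftfreq(n), np.fft.fftfreq(n)
    f = np.hypot(*np.meshgrid(fx, fy, indexing="ij"))
    f[0, 0] = 1.0 / n                     # avoid division by zero at DC
    amplitude = f ** (-beta / 2.0)        # sqrt of the target spectrum
    phase = np.exp(2j * np.pi * rng.random((n, n)))
    field = np.fft.ifft2(amplitude * phase).real
    return (field - field.mean()) / field.std()   # zero mean, unit variance

background = power_law_noise()
print(background.shape)
```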

  13. Comparison Between Digital and Synthetic 2D Mammograms in Breast Density Interpretation.

    Science.gov (United States)

    Alshafeiy, Taghreed I; Wadih, Antoine; Nicholson, Brandi T; Rochman, Carrie M; Peppard, Heather R; Patrie, James T; Harvey, Jennifer A

    2017-07-01

    The purpose of this study was to compare assessments of breast density on synthetic 2D images as compared with digital 2D mammograms. This retrospective study included consecutive women undergoing screening with digital 2D mammography and tomosynthesis during May 2015 with a negative or benign outcome. In separate reading sessions, three radiologists with 5-25 years of clinical experience and 1 year of experience with synthetic 2D mammography read digital 2D and synthetic 2D images and assigned breast density categories according to the 5th edition of BI-RADS. Inter- and intrareader agreement was assessed for each BI-RADS density assessment and combined dense and nondense categories using percent agreement and Cohen kappa coefficient for consensus and all reads. A total of 309 patients met study inclusion criteria. Agreement between consensus BI-RADS density categories assigned for digital and synthetic 2D mammography was 80.3% (95% CI, 75.4-84.5%) with κ = 0.73 (95% CI, 0.66-0.79). For combined dense and nondense categories, agreement reached 91.9% (95% CI, 88.2-94.7%). For consensus readings, similar numbers of patients were shifted between nondense and dense categories (11 and 14, respectively) with the synthetic 2D compared with digital 2D mammography. Interreader differences were apparent; assignment to dense categories was greater with digital 2D mammography for reader 1 (odds ratio [OR], 1.26; p = 0.002), the same for reader 2 (OR, 0.91; p = 0.262), and greater with synthetic 2D mammography for reader 3 (OR, 0.86; p = 0.033). Overall, synthetic 2D mammography is comparable with digital 2D mammography in assessment of breast density, though there is some variability by reader. Practices can readily adopt synthetic 2D mammography without concern that it will affect density assessment and subsequent recommendations for supplemental screening.
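
Cohen's kappa, used above for inter- and intrareader agreement, corrects observed agreement for agreement expected by chance. A self-contained sketch on hypothetical density labels from two reading sessions:

```python
import numpy as np

def cohen_kappa(r1, r2):
    """Cohen's kappa for two raters' categorical labels."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    p_obs = np.mean(r1 == r2)                       # observed agreement
    p_chance = sum(np.mean(r1 == c) * np.mean(r2 == c)
                   for c in np.union1d(r1, r2))     # chance agreement
    return (p_obs - p_chance) / (1.0 - p_chance)

# Hypothetical BI-RADS density categories (a-d) for 12 patients.
digital = list("aabbccddbbcc")
synthetic = list("aabbccdcbbcc")
print(round(cohen_kappa(digital, synthetic), 2))
```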

  14. Computer-aided classification of breast masses using contrast-enhanced digital mammograms

    Science.gov (United States)

    Danala, Gopichandh; Aghaei, Faranak; Heidari, Morteza; Wu, Teresa; Patel, Bhavika; Zheng, Bin

    2018-02-01

    By taking advantage of both mammography and breast MRI, contrast-enhanced digital mammography (CEDM) has emerged as a promising new imaging modality to improve the efficacy of breast cancer screening and diagnosis. The primary objective of this study is to develop and evaluate a new computer-aided detection and diagnosis (CAD) scheme for CEDM images to classify between malignant and benign breast masses. A CEDM dataset consisting of 111 patients (33 benign and 78 malignant) was retrospectively assembled. Each case includes two types of images, namely low-energy (LE) and dual-energy subtracted (DES) images. First, the CAD scheme applied a hybrid segmentation method to automatically segment masses depicted on LE and DES images separately. Optimal segmentation results from DES images were also mapped to LE images and vice versa. Next, a set of 109 quantitative image features related to mass shape and density heterogeneity was initially computed. Last, four multilayer perceptron-based machine learning classifiers, integrated with a correlation-based feature subset evaluator and a leave-one-case-out cross-validation method, were built to classify mass regions depicted on LE and DES images, respectively. Initially, when the CAD scheme was applied to the original segmentation of DES and LE images, the areas under the ROC curves were 0.7585+/-0.0526 and 0.7534+/-0.0470, respectively. After optimal segmentation mapping from DES to LE images, the AUC value of the CAD scheme significantly increased to 0.8477+/-0.0376. Because DES images suppress overlapping dense breast tissue around lesions, segmentation accuracy was significantly improved as compared to regular mammograms, and the study demonstrated that computer-aided classification of breast masses using CEDM images yielded higher performance.

  15. Computer-aided detection system applied to full-field digital mammograms

    Energy Technology Data Exchange (ETDEWEB)

    Vega Bolivar, Alfonso; Sanchez Gomez, Sonia; Merino, Paula; Alonso-Bartolome, Pilar; Ortega Garcia, Estrella (Dept. of Radiology, Univ. Marques of Valdecilla Hospital, Santander (Spain)), e-mail: avegab@telefonica.net; Munoz Cacho, Pedro (Dept. of Statistics, Univ. Marques of Valdecilla Hospital, Santander (Spain)); Hoffmeister, Jeffrey W. (iCAD, Inc., Nashua, NH (United States))

    2010-12-15

    Background: Although mammography remains the mainstay for breast cancer screening, it is an imperfect examination with a sensitivity of 75-92% for breast cancer. Computer-aided detection (CAD) has been developed to improve mammographic detection of breast cancer. Purpose: To retrospectively estimate CAD sensitivity and false-positive rate with full-field digital mammograms (FFDMs). Material and Methods: CAD was used to evaluate 151 cases of ductal carcinoma in situ (DCIS) (n=48) and invasive breast cancer (n=103) detected with FFDM. Retrospectively, CAD sensitivity was estimated based on breast density, mammographic presentation, histopathology type, and lesion size. CAD false-positive rate was estimated with screening FFDMs from 200 women. Results: CAD detected 93% (141/151) of cancer cases: 97% (28/29) in fatty breasts, 94% (81/86) in breasts containing scattered fibroglandular densities, 90% (28/31) in heterogeneously dense breasts, and 80% (4/5) in extremely dense breasts. CAD detected 98% (54/55) of cancers manifesting as calcifications, 89% (74/83) as masses, and 100% (13/13) as mixed masses and calcifications. CAD detected 92% (73/79) of invasive ductal carcinomas, 89% (8/9) of invasive lobular carcinomas, 93% (14/15) of other invasive carcinomas, and 96% (46/48) of DCIS. CAD sensitivity for cancers 1-10 mm was 87% (47/54); 11-20 mm, 99% (70/71); 21-30 mm, 86% (12/14); and larger than 30 mm, 100% (12/12). The CAD false-positive rate was 2.5 marks per case. Conclusion: CAD with FFDM showed a high sensitivity in identifying cancers manifesting as calcifications or masses. CAD sensitivity was maintained in small lesions (1-20 mm) and invasive lobular carcinomas, which have lower mammographic sensitivity

  16. Volumetric Mammogram Assessment: A Helpful Tool in the Treatment of Breast Asymmetries.

    Science.gov (United States)

    Zimman, Oscar A; Butto, Carlos D; Rostagno, Román; Rostagno, Camila

    2017-12-01

    The surgical approach to breast asymmetry depends on several factors, including the surgeon's experience, the anatomy of the patient, and several methods that may help to choose a technique and define the size of the implant or the amount of breast tissue to be excised. The aim of this study is to assist in the evaluation of breast volumes with the Quantra™ software application, intended for use with Hologic™ digital mammography systems. Twenty-eight women were studied preoperatively with full-field digital mammography (FFDM) using the Quantra™ software application. The case diagnoses were breast hypertrophy, ptosis, hypoplasia, and reconstruction, and the surgeries included breast reduction, mastopexy, mastopexy with breast reduction, mastoplasty with breast augmentation, breast augmentation, and immediate or delayed breast reconstruction. Patients were evaluated from 6 to 18 months after surgery. Volumetric mammogram studies help to decide the amount of tissue to be excised, the size of the implants, or the combination of both. The results of this study were evaluated by surgeons and patients and found to be highly satisfactory. The use of full-field digital mammography with adequate software should be considered as another tool to assist in making decisions regarding the correction of breast asymmetries. Level of Evidence IV.

  17. Toward the breast screening balance sheet: cumulative risk of false positives for annual versus biennial mammograms commencing at age 40 or 50.

    Science.gov (United States)

    Winch, Caleb J; Sherman, Kerry A; Boyages, John

    2015-01-01

This study aimed to: (1) Estimate cumulative risk of recall from breast screening where no cancer is detected (a harm) in Australia; (2) Compare women screened annually versus biennially, commencing age 40 versus 50; and (3) Compare with international findings. At the no-cost metropolitan program studied, women attended biennial screening, but were offered annual screening if regarded at elevated risk for breast cancer. The cumulative risk of at least one recall was estimated using discrete-time survival analysis. Cancer detection statistics were computed. In total, 801,636 mammograms were undertaken in 231,824 women. Over 10 years, cumulative risk of recall was 13.3 % (95 % CI 12.7-13.8) for those screened biennially, and 19.9 % (CI 16.6-23.2) for those screened annually from age 50-51. Cumulative risk of complex false positive involving a biopsy was 3.1 % (CI 2.9-3.4) and 5.0 % (CI 3.4-6.6), respectively. From age 40-41, the risk of recall was 15.1 % (CI 14.3-16.0) and 22.5 % (CI 17.9-27.1) for biennial and annual screening, respectively. Corresponding rates of complex false positive were 3.3 % (CI 2.9-3.8) and 6.3 % (CI 3.4-9.1). Over 10 mammograms, invasive cancer was detected in 3.4 % (CI 3.3-3.5) and ductal carcinoma in situ in 0.7 % (CI 0.6-0.7) of women, with a non-significant trend toward a larger proportion of Tis and T1N0 cancers in women screened annually (74.5 %) versus biennially (70.1 %), χ² = 2.77, p = 0.10. Cancer detection was comparable to international findings. Recall risk was equal to European estimates for women screening from 50 and lower for screening from 40. Recall risk was half of United States' rates across start age and rescreening interval categories. Future benefit/harm balance sheets may be useful for communicating these findings to women.
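The cumulative-risk figures above come from combining per-screen recall probabilities across successive screens. A minimal sketch of that combination (assuming, for illustration only, independent and constant per-screen hazards; the study itself used discrete-time survival analysis on observed data):

```python
def cumulative_risk(per_screen_probs):
    """Cumulative probability of at least one false-positive recall over
    a sequence of screens, assuming independence between screens: the
    discrete-time survival view, risk = 1 - product of per-screen
    'no recall' probabilities."""
    survival = 1.0
    for p in per_screen_probs:
        survival *= (1.0 - p)
    return 1.0 - survival

# Illustrative (not the study's) hazards: 5 biennial screens at ~2.8%
# recall each give a 10-year cumulative risk in the low teens.
biennial = cumulative_risk([0.028] * 5)
# 10 annual screens at the same per-screen hazard roughly double exposure.
annual = cumulative_risk([0.028] * 10)
```

Varying the per-screen hazard list by age or screening round recovers the more general discrete-time formulation.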

  18. PanDA: distributed production and distributed analysis system for ATLAS

    International Nuclear Information System (INIS)

    Maeno, T

    2008-01-01

A new distributed software system was developed in the fall of 2005 for the ATLAS experiment at the LHC. This system, called PANDA, provides an integrated service architecture with late binding of jobs, maximal automation through layered services, tight binding with the ATLAS Distributed Data Management system [1], advanced error discovery and recovery procedures, and other features. In this talk, we will describe the PANDA software system. Special emphasis will be placed on the evolution of PANDA based on one and a half years of real experience in carrying out Computer System Commissioning data production [2] for ATLAS. The architecture of PANDA is well suited for the computing needs of the ATLAS experiment, which is expected to be one of the first HEP experiments to operate at the petabyte scale.

  19. The Aggregation of Individual Distributive Preferences through the Distributive Liberal Social Contract : Normative Analysis.

    OpenAIRE

    Jean Mercier-Ythier

    2010-01-01

We consider abstract social systems of private property, made of n individuals endowed with non-paternalistic interdependent preferences, who interact through exchanges on competitive markets and Pareto-efficient lump-sum transfers. The transfers follow from a distributive liberal social contract defined as a redistribution of initial endowments such that the resulting market equilibrium allocation is both Pareto-efficient relative to individual interdependent preferences, and unanimously weak...

  20. Sample path analysis and distributions of boundary crossing times

    CERN Document Server

    Zacks, Shelemyahu

    2017-01-01

This monograph is focused on the derivations of exact distributions of first boundary crossing times of Poisson processes, compound Poisson processes, and more general renewal processes. The content is limited to the distributions of first boundary crossing times and their applications to various stochastic models. This book provides the theory and techniques for exact computations of distributions and moments of level crossing times. In addition, these techniques could replace simulations in many cases, thus providing more insight into the phenomena studied. This book takes a general approach to studying telegraph processes and is based on nearly thirty published papers by the author and collaborators over the past twenty-five years. No prior knowledge of advanced probability is required, making the book accessible to students and researchers in applied probability, operations research, applied physics, and applied mathematics.
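As a toy illustration of the subject matter: for a plain Poisson process and a constant integer boundary, the first crossing time is exactly Erlang-distributed, so simulation serves only as a sanity check against the closed form. The sketch below is generic and assumes nothing from the book itself:

```python
import random

def first_crossing_time(lam, level, rng):
    """First time a rate-`lam` Poisson counting process reaches `level`.
    Inter-arrival times are i.i.d. Exponential(lam), so the crossing time
    is Erlang(level, lam) with mean level/lam: a case where the exact
    distribution is known in closed form."""
    t = 0.0
    for _ in range(level):
        t += rng.expovariate(lam)
    return t

rng = random.Random(42)
samples = [first_crossing_time(2.0, 5, rng) for _ in range(20000)]
mean = sum(samples) / len(samples)  # theory: 5 / 2.0 = 2.5
```

For compound Poisson or general renewal inputs the same simulation skeleton applies, but the exact distributions derived in the book replace the simple Erlang form.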

  1. A digital elevation analysis: Spatially distributed flow apportioning algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sang-Hyun; Kim, Kyung-Hyun [Pusan National University, Pusan(Korea); Jung, Sun-Hee [Korea Environment Institute, (Korea)

    2001-06-30

A flow determination algorithm is proposed for the distributed hydrologic model. The advantages of single and multiple flow direction schemes are selectively considered to address the drawbacks of existing algorithms. A spatially varied flow apportioning factor is introduced in order to accommodate the accumulated area from upslope cells. The channel initiation threshold area (CIT) concept is expanded and integrated into the spatially distributed flow apportioning algorithm in order to delineate a realistic channel network. Application to a field example suggests that the linearly distributed flow apportioning scheme provides some advantages over existing approaches, such as the relaxation of over-dissipation problems near channel cells, the connectivity feature of river cells, the continuity of saturated areas, and the removal of the need to optimize a few parameters required by existing algorithms. The effects of grid sizes are explored spatially as well as statistically. (author). 28 refs., 7 figs.
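The apportioning idea can be sketched for a single grid cell: outflow is split among downslope neighbours in proportion to a power of the local slope. The function below is an illustrative stand-in (the names, the exponent `p`, and the neighbour encoding are assumptions, not the paper's formulation, where the apportioning factor varies spatially):

```python
def apportion_flow(center, neighbors, p=1.0):
    """Split outflow from a cell among its lower neighbours in proportion
    to slope**p. Large p approaches single-flow-direction behaviour;
    p = 1 is a plain multiple-flow-direction split.
    `neighbors` maps direction -> (elevation, distance to neighbour)."""
    slopes = {}
    for d, (z, dist) in neighbors.items():
        drop = (center - z) / dist
        if drop > 0:                      # only downslope neighbours receive flow
            slopes[d] = drop ** p
    total = sum(slopes.values())
    if total == 0:                        # pit or flat cell: nothing apportioned
        return {}
    return {d: s / total for d, s in slopes.items()}

weights = apportion_flow(
    10.0,
    {"E": (9.0, 1.0), "SE": (8.0, 1.41), "S": (11.0, 1.0)},  # S is upslope
)
```

Accumulated upslope area is then obtained by propagating these weights over the grid in topological (high-to-low) order.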

  2. Simulation and energy analysis of distributed electric heating system

    Science.gov (United States)

    Yu, Bo; Han, Shenchao; Yang, Yanchun; Liu, Mingyuan

    2018-02-01

The distributed electric heating system assists a solar heating system by using an air-source heat pump. As an auxiliary heat source, the air-source heat pump makes up for the defects of the conventional solar thermal system and provides 24-hour high-efficiency operation. It has practical value and significance for reducing emissions and promoting building energy efficiency. The system is simulated using Polysun software and compared with an ordinary electric boiler heating system. The simulation results show that, for the same energy request, the distributed electric heating system saves 5844.5 kWh of fuel and energy consumption and reduces carbon-dioxide emissions by 3135 kg. The energy-saving and emission-reduction effect of the distributed electric heating system is thus considerable.

  3. Interactive microbial distribution analysis using BioAtlas

    DEFF Research Database (Denmark)

    Lund, Jesper; List, Markus; Baumbach, Jan

    2017-01-01

…to analyze microbial distribution in a location-specific context. BioAtlas is an interactive web application that closes this gap between sequence databases, taxonomy profiling and geo/body-location information. It enables users to browse taxonomically annotated sequences across (i) the world map, (ii) human body maps and (iii) user-defined maps. It further allows for (iv) uploading of own sample data, which can be placed on existing maps to (v) browse the distribution of the associated taxonomies. Finally, BioAtlas enables users to (vi) contribute custom maps (e.g. for plants or animals) and to map…

  4. Core Flow Distribution from Coupled Supercritical Water Reactor Analysis

    Directory of Open Access Journals (Sweden)

    Po Hu

    2014-01-01

Full Text Available This paper introduces an extended code package, PARCS/RELAP5, to analyze the steady state of the US reference SCWR design. An 8 × 8 quarter core model in PARCS and a reactor core model in RELAP5 are used to study the core flow distribution under various steady state conditions. The possibility of moderator flow reversal is found in some hot moderator channels. Different moderator flow orifice strategies, both uniform across the core and nonuniform based on the power distribution, are explored with the goal of preventing the reversal.

  5. Energy efficiency analysis of reconfigured distribution system for practical loads

    Directory of Open Access Journals (Sweden)

    Pawan Kumar

    2016-09-01

Full Text Available In a deregulated rate structure, the performance evaluation of a distribution system for energy efficiency includes loss minimization, improved power quality, loadability limit, and reliability and availability of supply. Energy efficiency changes with the variation in loading pattern and the load behaviour. Further, the load at each node is not explicitly of any one type; rather, its characteristics depend upon the node voltages. In most cases, load is assumed to be constant power (real and reactive). In this paper, voltage-dependent practical loads are represented with a composite load model, and the energy efficiency performance of the distribution system for practical loads is evaluated in different configurations of a 33-node system.

  6. Automated registration of diagnostic to prediagnostic x-ray mammograms: Evaluation and comparison to radiologists' accuracy

    Energy Technology Data Exchange (ETDEWEB)

    Pinto Pereira, Snehal M.; Hipwell, John H.; McCormack, Valerie A.; Tanner, Christine; Moss, Sue M.; Wilkinson, Louise S.; Khoo, Lisanne A. L.; Pagliari, Catriona; Skippage, Pippa L.; Kliger, Carole J.; Hawkes, David J.; Santos Silva, Isabel M. dos [Cancer Research UK Epidemiology and Genetics Group, London School of Hygiene and Tropical Medicine, Keppel Street, London WC1E 7HT (United Kingdom); Centre for Medical Image Computing, University College London, London WC1E 6BT (United Kingdom); Lifestyle and Cancer Group, International Agency for Research on Cancer, 150 cours Albert Thomas, Lyon 69008 (France); Centre for Medical Image Computing, University College London, London WC1E 6BT (United Kingdom); Cancer Screening Evaluation Unit, Institute of Cancer Research, Surrey SM2 5NG (United Kingdom); St. George' s Healthcare NHS Trust and South West London Breast Screening Service, London SW17 0QT (United Kingdom); Centre for Medical Image Computing, University College London, London WC1E 6BT (United Kingdom); Cancer Research UK Epidemiology and Genetics Group, London School of Hygiene and Tropical Medicine, Keppel Street, London WC1E 7HT (United Kingdom)

    2010-09-15

Purpose: To compare and evaluate intensity-based registration methods for computation of serial x-ray mammogram correspondence. Methods: X-ray mammograms were simulated from MRIs of 20 women using finite element methods for modeling breast compressions and employing a MRI/x-ray appearance change model. The parameter configurations of three registration methods, affine, fluid, and free-form deformation (FFD), were optimized for registering x-ray mammograms on these simulated images. Five mammography film readers independently identified landmarks (tumor, nipple, and usually two other normal features) on pairs of diagnostic and corresponding prediagnostic digitized images from 52 breast cancer cases. Landmarks were independently reidentified by each reader. Target registration errors were calculated to compare the three registration methods using the reader landmarks as a gold standard. Data were analyzed using multilevel methods. Results: Between-reader variability varied with landmark (p<0.01) and screen (p=0.03), with between-reader mean distance (mm) in point location on the diagnostic/prediagnostic images of 2.50 (95% CI 1.95, 3.15)/2.84 (2.24, 3.55) for nipples and 4.26 (3.43, 5.24)/4.76 (3.85, 5.84) for tumors. Registration accuracy was sensitive to the type of landmark and the amount of breast density. For dense breasts (≥40%), the affine and fluid methods outperformed FFD. For breasts with lower density, the affine registration surpassed both fluid and FFD. Mean accuracy (mm) of the affine registration varied between 3.16 (95% CI 2.56, 3.90) for nipple points in breasts with density 20%-39% and 5.73 (4.80, 6.84) for tumor points in breasts with density <20%. Conclusions: Affine registration accuracy was comparable to that between independent film readers. More advanced two-dimensional nonrigid registration algorithms were incapable of increasing the accuracy of image alignment when compared to affine registration.
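Target registration error as used here is simply the distance between transformed landmarks and their gold-standard positions. A minimal 2-D sketch (the helper names and the hand-specified transform are hypothetical; the study's transforms were estimated by the registration algorithms):

```python
import math

def apply_affine(A, t, pt):
    """2-D affine map: x -> A @ x + t, with A given as a nested 2x2 list."""
    x, y = pt
    return (A[0][0] * x + A[0][1] * y + t[0],
            A[1][0] * x + A[1][1] * y + t[1])

def target_registration_error(A, t, moving_pts, fixed_pts):
    """Mean Euclidean distance between transformed landmarks and their
    gold-standard positions: the TRE used to score registrations."""
    errs = [math.dist(apply_affine(A, t, m), f)
            for m, f in zip(moving_pts, fixed_pts)]
    return sum(errs) / len(errs)

identity = [[1.0, 0.0], [0.0, 1.0]]
tre = target_registration_error(identity, (3.0, 4.0),
                                [(0.0, 0.0), (10.0, 5.0)],
                                [(3.0, 4.0), (13.0, 9.0)])  # exact match -> 0
```

Comparing TRE across affine, fluid, and FFD outputs on the same landmark pairs is then a matter of swapping in each method's estimated transform.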

  7. Analysis of temperature distribution in a heat conducting fiber with ...

    African Journals Online (AJOL)

    The temperature distribution in a heat conducting fiber is computed using the Galerkin Finite Element Method in the present study. The weak form of the governing differential equation is obtained and nodal temperatures for linear and quadratic interpolation functions for different mesh densities are calculated for Neumann ...
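A Galerkin finite element solution of this kind can be sketched in one dimension: steady conduction with a fixed temperature at one end and a prescribed (Neumann) flux at the other, linear elements, and a tridiagonal solve. This is a generic, hypothetical illustration, not the article's code:

```python
def fem_1d_conduction(n, k=1.0, q=1.0, u0=0.0, length=1.0):
    """Galerkin FEM with linear elements for steady 1-D conduction
    -k u'' = 0 on [0, L], with Dirichlet temperature u0 at x = 0 and a
    Neumann flux q at x = L. Exact solution is the line u(x) = u0 + (q/k)x,
    which linear elements reproduce at the nodes.
    Returns nodal temperatures at the n+1 equally spaced nodes."""
    h = length / n
    ke = k / h                       # element stiffness (k/h) * [[1,-1],[-1,1]]
    # Tridiagonal system for the unknown nodes 1..n (node 0 is Dirichlet).
    main = [2.0 * ke] * (n - 1) + [ke]
    off = [-ke] * (n - 1)
    rhs = [0.0] * n
    rhs[0] += ke * u0                # Dirichlet value moved to the load vector
    rhs[-1] += q                     # Neumann flux enters the load vector
    # Thomas algorithm for the symmetric tridiagonal solve.
    for i in range(1, n):
        m = off[i - 1] / main[i - 1]
        main[i] -= m * off[i - 1]
        rhs[i] -= m * rhs[i - 1]
    u = [0.0] * n
    u[-1] = rhs[-1] / main[-1]
    for i in range(n - 2, -1, -1):
        u[i] = (rhs[i] - off[i] * u[i + 1]) / main[i]
    return [u0] + u

temps = fem_1d_conduction(n=4, k=2.0, q=3.0, u0=1.0)
# nodes at x = 0, 0.25, 0.5, 0.75, 1; exact u(x) = 1 + 1.5 x
```

Quadratic interpolation functions and varying mesh densities, as in the article, change the element matrices but not the overall assemble-and-solve structure.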

  8. Polybutadiene latex particle size distribution analysis utilizing a disk centrifuge

    NARCIS (Netherlands)

    Verdurmen, E.M.F.J.; Albers, J.G.; German, A.L.

    1994-01-01

Polybutadiene (I) latexes prepared by emulsifier-free emulsion polymerization, having particle diameters of 50-300 nm for both unimodal and bimodal particle size distributions, were analyzed by the line-start (LIST) method in a Brookhaven disk centrifuge photosedimentometer. A special spin fluid was designed to…

  9. Resonance analysis in parallel voltage-controlled Distributed Generation inverters

    DEFF Research Database (Denmark)

    Wang, Xiongfei; Blaabjerg, Frede; Chen, Zhe

    2013-01-01

Thanks to the fast responses of the inner voltage and current control loops, the dynamic behaviors of parallel voltage-controlled Distributed Generation (DG) inverters not only rely on the stability of load sharing among them, but are also subject to the interactions between the voltage control loops…

  10. Analysis of the Relationship between Shared Leadership and Distributed Leadership

    Science.gov (United States)

    Goksoy, Suleyman

    2016-01-01

    Problem Statement: The current study's purpose is: First, to examine the relationship between shared leadership and distributed leadership, which, despite having many similar aspects in theory and practice, are defined as separate concepts. Second, to compare the two approaches and dissipate the theoretical contradictions. In this sense, the main…

  11. Comparative Analysis of Possible Designs for Flexible Distribution System Operation

    DEFF Research Database (Denmark)

    Lin, Jeremy; Knezovic, Katarina

    2016-01-01

    for achieving the most efficient utilization of these resources while meeting the forecasted load. In this paper, we present possible system design frameworks proposed for flexible distribution system operation. Critical evaluations and comparison of these models are made based on a number of key attributes...

  12. A formal analysis of a dynamic distributed spanning tree algorithm

    NARCIS (Netherlands)

    Mooij, A.J.; Wesselink, J.W.

    2003-01-01

We analyze the spanning tree algorithm in the IEEE 1394.1 draft standard, whose correctness has not previously been proved. This algorithm is a fully-dynamic distributed graph algorithm, which, in general, is hard to develop. The approach we use is to formally develop an algorithm that is…

  13. Analysis Of Rainfall Distribution In Owerri And Enugu, Nigeria Using ...

    African Journals Online (AJOL)

    The precipitation concentration index (PCI) of Owerri and Enugu for 1974 to 2011 was computed to characterise the rainfall distribution for both locations. The PCI was estimated on an annual and seasonal scale. The seasonal estimation was based on the categorisation of the seasons in eastern Nigeria into long wet ...

  14. Smart optimisation and sensitivity analysis in water distribution systems

    CSIR Research Space (South Africa)

    Page, Philip R

    2015-12-01

    Full Text Available optimisation of a water distribution system by keeping the average pressure unchanged as water demands change, by changing the speed of the pumps. Another application area considered, using the same mathematical notions, is the study of the sensitivity...

  15. Chemical bonding and charge density distribution analysis of ...

    Indian Academy of Sciences (India)

…tice and the electron density distributions in the unit cell of the samples were investigated. Structural … titanium and oxygen ions and predominant ionic nature between barium and oxygen ions. Average grain sizes … trations (at <1%) is responsible for the formation of …

  16. Bidirectional reflectance distribution function measurements and analysis of retroreflective materials.

    Science.gov (United States)

    Belcour, Laurent; Pacanowski, Romain; Delahaie, Marion; Laville-Geay, Aude; Eupherte, Laure

    2014-12-01

We compare the performance of various analytical retroreflecting bidirectional reflectance distribution function (BRDF) models to assess how accurately they reproduce measured data of retroreflecting materials. We introduce a new parametrization, the back vector parametrization, to analyze retroreflecting data, and we show that this parametrization better preserves the isotropy of the data. Furthermore, we update existing BRDF models to improve the representation of retroreflective data.

  17. Diagnostic accuracy of commercial system for computer-assisted detection (CADx) as an adjunct to interpretation of mammograms

    International Nuclear Information System (INIS)

    Menna, Sabatino; Di Virgilio, Maria Rosaria; Burke, Paolo; Frigerio, Alfonso; Boglione, Elisa; Ciccarelli, Grazia; Di Filippo, Sabato; Garretti, Licia

    2005-01-01

Purpose. To evaluate the diagnostic accuracy of the commercial computer-aided detection (CADx) system for the reading of mammograms. Materials and methods. The study assessed the Second Look system developed and marketed by CADx Medical Systems, Montreal, Canada. The diagnostic sensitivity was evaluated by means of a retrospective study on 98 consecutive cancers detected at screening by double independent reading. The specificity and the positive predictive value (PPV) for cancer of the CADx system were prospectively evaluated on a second group of 560 consecutive mammograms of asymptomatic women not included in a screening program. The radiologist who was present during the test assessed the abnormal mammographic findings by one or more of the following diagnostic procedures: physical examination, additional mammographic detail views with or without magnification, ultrasonography, ultrasound- or mammography-guided fine needle aspiration cytology, and core biopsy. The exams first underwent conventional reading and then a second reading carried out with the aid of the CADx system. Results. The overall diagnostic sensitivity of the CADx system on the 98 screening cancers was 81.6%; in particular, it was 89.3% for calcifications, 83.9% for masses and only 37.5% for architectural distortion. The CADx markings for each mammogram were 4.7 on average. Identification of invasive carcinoma was independent of tumour size. In the second group of 560 mammograms, the CADx system marked all cases identified as positive by conventional reading and confirmed by biopsy (7/7), but did not permit the detection of any additional cancer. The CADx markings per exam were 4.2 on average, the specificity was 13.7% and the PPV was 0.55%, versus a 13.7% recall rate of conventional reading. CADx reading led to a 1.96% (11/560) increase in the women requiring further diagnostic investigation. Conclusions. The results of our study show that the diagnostic sensitivity of the CADx system is lower…
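The quantities reported here are standard 2×2-table statistics. A small helper makes the definitions explicit (the counts below are illustrative assumptions, except that 80/98 is consistent with the reported 81.6% sensitivity):

```python
def diagnostic_metrics(tp, fn, fp, tn):
    """Sensitivity, specificity and positive predictive value from a
    2x2 table of true/false positives and negatives: the quantities
    this kind of CADx study reports."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    return sensitivity, specificity, ppv

# Illustrative counts only (the study's full 2x2 table is not given):
sens, spec, ppv = diagnostic_metrics(tp=80, fn=18, fp=483, tn=77)
# 80 of 98 cancers marked -> sensitivity ~0.816, matching the reported 81.6%
```

Note that a per-mark PPV (cancers per CAD mark) differs from the per-case PPV above, which is one reason very small PPV figures can coexist with moderate specificity.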

  18. Translating the 2-dimensional mammogram into a 3-dimensional breast: Identifying factors that influence the movement of pre-operatively placed wire.

    Science.gov (United States)

    Park, Ko Un; Nathanson, David

    2017-08-01

    Pre-operative measurements from the skin to a wire-localized breast lesion can differ from operating room measurements. This study was designed to measure the discrepancies and study factors that may contribute to wire movement. Prospective data were collected on patients who underwent wire localization lumpectomy. Clip and hook location, breast size, density, and direction of wire placement were the main focus of the analysis. Wire movement was more likely with longer distance from skin to hook or clip, larger breast size (especially if "fatty"), longer time between wire placement and surgery start time, and medial wire placement in larger breast. Age, body mass index, presence of mass, malignant diagnosis, tumor grade, and clip distance to the chest wall were not associated with wire movement. A longer distance from skin to hook correlated with larger specimen volume. Translation of the lesion location from a 2-dimensional mammogram into 3-dimensional breasts is sometimes discrepant because of movement of the localizing wire. Breast size, distance of skin to clip or hook, and wire exit site in larger breasts have a significant impact on wire movement. This information may guide the surgeon's skin incision and extent of excision. © 2017 Wiley Periodicals, Inc.

  19. On process capability and system availability analysis of the inverse Rayleigh distribution

    Directory of Open Access Journals (Sweden)

    Sajid Ali

    2015-04-01

    Full Text Available In this article, process capability and system availability analysis is discussed for the inverse Rayleigh lifetime distribution. Bayesian approach with a conjugate gamma distribution is adopted for the analysis. Different types of loss functions are considered to find Bayes estimates of the process capability and system availability. A simulation study is conducted for the comparison of different loss functions.
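The conjugate structure can be made concrete: for the inverse Rayleigh density f(x; θ) = (2θ/x³)·exp(−θ/x²), a Gamma(a, b) prior on θ updates to Gamma(a + n, b + Σ 1/xᵢ²), and the Bayes estimate under squared-error loss is the posterior mean. A sketch with illustrative parameter values (the paper's capability and availability indices are then functions of θ):

```python
def posterior_gamma(data, a, b):
    """Conjugate update for the inverse Rayleigh scale parameter theta:
    with density f(x; theta) = (2*theta/x**3) * exp(-theta/x**2) and a
    Gamma(a, b) prior (shape a, rate b), the likelihood is proportional
    to theta**n * exp(-theta * sum(1/x_i**2)), so the posterior is
    Gamma(a + n, b + sum(1/x_i**2))."""
    n = len(data)
    s = sum(1.0 / (x * x) for x in data)
    return a + n, b + s

def bayes_estimate_sel(data, a, b):
    """Bayes estimate of theta under squared-error loss = posterior mean.
    (The other loss functions considered in the paper lead to other
    posterior summaries, e.g. posterior medians or weighted means.)"""
    shape, rate = posterior_gamma(data, a, b)
    return shape / rate

theta_hat = bayes_estimate_sel([1.0, 2.0, 0.5], a=2.0, b=1.0)
```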

  20. "Thanks for Letting Us All Share Your Mammogram Experience Virtually": Developing a Web-Based Hub for Breast Cancer Screening.

    Science.gov (United States)

    Galpin, Adam; Meredith, Joanne; Ure, Cathy; Robinson, Leslie

    2017-10-27

    The decision around whether to attend breast cancer screening can often involve making sense of confusing and contradictory information on its risks and benefits. The Word of Mouth Mammogram e-Network (WoMMeN) project was established to create a Web-based resource to support decision making regarding breast cancer screening. This paper presents data from our user-centered approach in engaging stakeholders (both health professionals and service users) in the design of this Web-based resource. Our novel approach involved creating a user design group within Facebook to allow them access to ongoing discussion between researchers, radiographers, and existing and potential service users. This study had two objectives. The first was to examine the utility of an online user design group for generating insight for the creation of Web-based health resources. We sought to explore the advantages and limitations of this approach. The second objective was to analyze what women want from a Web-based resource for breast cancer screening. We recruited a user design group on Facebook and conducted a survey within the group, asking questions about design considerations for a Web-based breast cancer screening hub. Although the membership of the Facebook group varied over time, there were 71 members in the Facebook group at the end point of analysis. We next conducted a framework analysis on 70 threads from Facebook and a thematic analysis on the 23 survey responses. We focused additionally on how the themes were discussed by the different stakeholders within the context of the design group. Two major themes were found across both the Facebook discussion and the survey data: (1) the power of information and (2) the hub as a place for communication and support. Information was considered as empowering but also recognized as threatening. Communication and the sharing of experiences were deemed important, but there was also recognition of potential miscommunication within online

  1. Dynamical Analysis of SIR Epidemic Models with Distributed Delay

    Directory of Open Access Journals (Sweden)

    Wencai Zhao

    2013-01-01

Full Text Available SIR epidemic models with distributed delay are proposed. Firstly, the dynamical behaviors of the model without vaccination are studied. Using the Jacobian matrix, the stability of the equilibrium points of the system without vaccination is analyzed, and the basic reproduction number R is obtained. In order to study the important role of vaccination in preventing diseases, the model with distributed delay under impulsive vaccination is formulated. The sufficient conditions for global asymptotic stability of the “infection-free” periodic solution and for the permanence of the model are obtained by using Floquet’s theorem, small-amplitude perturbation skills, and the comparison theorem. Lastly, numerical simulation is presented to illustrate our main conclusion that vaccination has significant effects on the dynamical behaviors of the model. The results can provide an effective tactical basis for practical infectious disease prevention.
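A distributed delay with a gamma (Erlang) kernel can be simulated with the standard "linear chain trick", replacing the delayed term by a chain of ODE stages. The sketch below applies this to the infectious period of a basic SIR model; the parameters are illustrative and the model is a simplification of the paper's (which also includes impulsive vaccination):

```python
def sir_erlang_delay(beta=0.3, gamma=0.1, k=3, i0=0.01, days=400, dt=0.1):
    """SIR model whose infectious period follows an Erlang(k, k*gamma)
    distribution, via the linear chain trick: a gamma-distributed delay
    is equivalent to k sequential infectious stages, each left at rate
    k*gamma. k = 1 recovers the classic exponential SIR.
    Forward-Euler sketch; returns the final susceptible fraction."""
    s, r = 1.0 - i0, 0.0
    stages = [i0] + [0.0] * (k - 1)    # infectious sub-compartments I_1..I_k
    rate = k * gamma
    for _ in range(int(days / dt)):
        i_total = sum(stages)
        new_inf = beta * s * i_total   # force of infection on susceptibles
        s -= dt * new_inf
        flows = [rate * x for x in stages]
        stages[0] += dt * (new_inf - flows[0])
        for j in range(1, k):
            stages[j] += dt * (flows[j - 1] - flows[j])
        r += dt * flows[-1]            # recoveries leave the last stage
    return s

final_s = sir_erlang_delay()
```

Since R here equals beta times the mean infectious period (beta/gamma = 3 with these values), the final epidemic size is largely insensitive to k, while the transient dynamics are not.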

  2. Analysis of the logistics processes in the wine distribution

    OpenAIRE

    Slavkovský, Matúš

    2011-01-01

The Master's thesis addresses the importance of logistics in the retail business and the importance of reducing logistics costs. It includes theoretical knowledge as well as an analysis of the relevant markets that produce and consume wine in the largest quantities. The thesis is focused on the analysis of the logistical processes and costs of an e-shop. Based on this analysis, measures to improve the company's logistics processes are proposed. The goal of the Master's thesis is...

  3. Distributed Scheduling in Time Dependent Environments: Algorithms and Analysis

    OpenAIRE

    Shmuel, Ori; Cohen, Asaf; Gurewitz, Omer

    2017-01-01

Consider the problem of a multiple access channel in a time dependent environment with a large number of users. In such a system, mostly due to practical constraints (e.g., decoding complexity), not all users can be scheduled together, and usually only one user may transmit at any given time. Assuming a distributed, opportunistic scheduling algorithm, we analyse the system's properties, such as delay, QoS and capacity scaling laws. Specifically, we start with analyzing the performance while …

  4. Fully Stochastic Distributed Methodology for Multivariate Flood Frequency Analysis

    Directory of Open Access Journals (Sweden)

    Isabel Flores-Montoya

    2016-05-01

Full Text Available An adequate estimation of the extreme behavior of basin response is essential both for designing river structures and for evaluating their risk. The aim of this paper is to develop a new methodology to generate extreme hydrograph series of thousands of years using an event-based model. To this end, a spatial-temporal synthetic rainfall generator (RainSimV3) is combined with a distributed, physically based rainfall–runoff event-based model (RIBS). The use of an event-based model allows simulating longer hydrograph series with less computational and data requirements but needs to characterize the initial basin state, which depends on the initial basin moisture distribution. To overcome this problem, this paper proposes a probabilistic calibration–simulation approach, which considers the initial state and the model parameters as random variables characterized by a probability distribution through a Monte Carlo simulation. This approach is compared with two other approaches, the deterministic and the semi-deterministic approaches. Both approaches use a unique initial state. The deterministic approach also uses a unique value of the model parameters, while the semi-deterministic approach obtains these values from their probability distributions through a Monte Carlo simulation, considering the basin variability. This methodology has been applied to the Corbès and Générargues basins, in the southeast of France. The results show that the probabilistic approach offers the best fit. This means that the proposed methodology can be successfully used to characterize the extreme behavior of the basin, considering the basin variability and overcoming the initial basin state problem.
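The probabilistic calibration–simulation idea (drawing the initial basin state and the model parameters from probability distributions for every synthetic event, rather than fixing them) can be sketched with a toy stand-in for the rainfall–runoff model. None of the function names, distributions, or parameter ranges below come from the paper:

```python
import random

def toy_event_peak(rain_mm, moisture, runoff_coef):
    """Hypothetical stand-in for an event-based rainfall-runoff model:
    peak flow grows with rainfall, antecedent moisture and a runoff
    coefficient. Not RIBS, just a placeholder with the same interface."""
    return runoff_coef * rain_mm * (0.2 + 0.8 * moisture)

def probabilistic_peaks(rain_events, n_draws, rng):
    """Monte Carlo over both the initial basin state and a model
    parameter for every synthetic rainfall event, yielding an empirical
    distribution of peaks instead of a single deterministic value."""
    peaks = []
    for rain in rain_events:
        for _ in range(n_draws):
            moisture = rng.betavariate(2.0, 5.0)   # initial basin state
            coef = rng.uniform(0.3, 0.7)           # uncertain model parameter
            peaks.append(toy_event_peak(rain, moisture, coef))
    return peaks

rng = random.Random(7)
peaks = probabilistic_peaks([20.0, 50.0, 120.0], n_draws=1000, rng=rng)
```

The deterministic variant collapses both draws to fixed values; the semi-deterministic variant samples only the parameters while keeping a single initial state.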

  5. HPC Performance Analysis of a Distributed Information Enterprise Simulation

    National Research Council Canada - National Science Library

    Hanna, James P; Walter, Martin J; Hillman, Robert G

    2004-01-01

    .... The analysis identified several performance limitations and bottlenecks. One critical limitation addressed and eliminated was simultaneously mixing a periodic process model with an event driven model causing rollbacks...

  6. Finite element analysis for temperature distributions in a cold forging

    International Nuclear Information System (INIS)

    Kim, Dong Bum; Lee, In Hwan; Cho, Hae Yong; Kim, Sung Wook; Song, In Chul; Jeon, Byung Cheol

    2013-01-01

    In this research, the finite element method is utilized to predict the temperature distributions in a cold-forging process for a cambolt. The cambolt is mainly used as a part of a suspension system of a vehicle. The cambolt has an off-centered lobe that manipulates the vertical position of the knuckle and wheel to a slight degree. The cambolt requires certain mechanical properties, such as strength and endurance limits. Moreover, temperature is also an important factor to realize mass production and improve efficiency. However, direct measurement of temperature in a forging process is infeasible with existing technology; therefore, there is a critical need for a new technique. Accordingly, in this study, a thermo-coupled finite element method is developed for predicting the temperature distribution. The rate of energy conversion to heat for the workpiece material is determined, and the temperature distribution is analyzed throughout the forging process for a cambolt. The temperatures associated with different punch speeds are also studied, as well as the relationships between load, temperature, and punch speed. Experimental verification of the technique is presented.

  7. Finite element analysis for temperature distributions in a cold forging

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Dong Bum; Lee, In Hwan; Cho, Hae Yong [Chungbuk National University, Cheongju (Korea, Republic of); Kim, Sung Wook [Yanbian National University, Yanbian (China); Song, In Chul; Jeon, Byung Cheol [Sunil dyfas, Jincheon (Korea, Republic of)

    2013-10-15

    In this research, the finite element method is utilized to predict the temperature distributions in a cold-forging process for a cambolt. The cambolt is mainly used as a part of a suspension system of a vehicle. The cambolt has an off-centered lobe that manipulates the vertical position of the knuckle and wheel to a slight degree. The cambolt requires certain mechanical properties, such as strength and endurance limits. Moreover, temperature is also an important factor to realize mass production and improve efficiency. However, direct measurement of temperature in a forging process is infeasible with existing technology; therefore, there is a critical need for a new technique. Accordingly, in this study, a thermo-coupled finite element method is developed for predicting the temperature distribution. The rate of energy conversion to heat for the workpiece material is determined, and the temperature distribution is analyzed throughout the forging process for a cambolt. The temperatures associated with different punch speeds are also studied, as well as the relationships between load, temperature, and punch speed. Experimental verification of the technique is presented.

  8. Staghorn: An Automated Large-Scale Distributed System Analysis Platform

    Energy Technology Data Exchange (ETDEWEB)

    Gabert, Kasimir [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Burns, Ian [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Elliott, Steven [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kallaher, Jenna [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vail, Adam [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-09-01

    Conducting experiments on large-scale distributed computing systems is becoming significantly easier with the assistance of emulation. Researchers can now create a model of a distributed computing environment and then generate a virtual, laboratory copy of the entire system composed of potentially thousands of virtual machines, switches, and software. The use of real software, running at clock rate in full virtual machines, allows experiments to produce meaningful results without necessitating a full understanding of all model components. However, the ability to inspect and modify elements within these models is bound by the limitation that such modifications must compete with the model, either running in or alongside it. This inhibits entire classes of analyses from being conducted upon these models. We developed a mechanism to snapshot an entire emulation-based model as it is running. This allows us to "freeze time" and subsequently fork execution, replay execution, modify arbitrary parts of the model, or deeply explore the model. This snapshot includes capturing packets in transit and other input/output state along with the running virtual machines. We were able to build this system in Linux using Open vSwitch and Kernel Virtual Machines on top of Sandia's emulation platform Firewheel. This primitive opens the door to numerous subsequent analyses on models, including state space exploration, debugging distributed systems, performance optimizations, improved training environments, and improved experiment repeatability.

  9. Analysis of magnetic electron lens with secant hyperbolic field distribution

    International Nuclear Information System (INIS)

    Pany, S.S.; Ahmed, Z.; Dubey, B.P.

    2014-01-01

    Electron-optical imaging instruments like the Scanning Electron Microscope (SEM) and Transmission Electron Microscope (TEM) use specially designed solenoid electromagnets for focusing of the electron beam. Indicators of imaging performance of these instruments, like spatial resolution, have a strong correlation with the focal characteristics of the magnetic lenses, which in turn have been shown to be sensitive to the details of the spatial distribution of the axial magnetic field. Owing to the complexity of designing practical lenses, empirical mathematical expressions are important to obtain the desired focal properties. Thus the degree of accuracy of such models in representing the actual field distribution determines the accuracy of the calculations and ultimately the performance of the lens. Historically, the mathematical models proposed by Glaser [1] and Ramberg [2] have been extensively used. In this paper the authors discuss another model with a secant-hyperbolic type magnetic field distribution function, and present a comparison between models, utilizing results from finite element-based field simulations as the reference for evaluating performance.
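
    The two field models being compared can be written down directly: Glaser's bell-shaped distribution B(z) = B0 / (1 + (z/a)^2) and the secant-hyperbolic profile B(z) = B0 sech(z/a). A small numerical sketch (the parameter values are illustrative, not from the paper):

```python
import numpy as np

def glaser_field(z, B0, a):
    """Glaser bell-shaped axial field model: B(z) = B0 / (1 + (z/a)^2)."""
    return B0 / (1.0 + (z / a) ** 2)

def sech_field(z, B0, a):
    """Secant-hyperbolic axial field model: B(z) = B0 / cosh(z/a)."""
    return B0 / np.cosh(z / a)

z = np.linspace(-5.0, 5.0, 1001)  # axial coordinate, in units of the half-width a
B0, a = 1.0, 1.0
Bg = glaser_field(z, B0, a)
Bs = sech_field(z, B0, a)

# Both models peak at B0 on axis, but the sech profile decays exponentially
# in the tails while the Glaser bell decays only as 1/z^2 -- which is why
# the two give measurably different focal properties.
print(Bg.max(), Bs.max())
print(glaser_field(5.0, B0, a), sech_field(5.0, B0, a))
```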

  10. The Emotions of Abstract Words: A Distributional Semantic Analysis.

    Science.gov (United States)

    Lenci, Alessandro; Lebani, Gianluca E; Passaro, Lucia C

    2018-04-06

    Recent psycholinguistic and neuroscientific research has emphasized the crucial role of emotions for abstract words, which would be grounded by affective experience, instead of a sensorimotor one. The hypothesis of affective embodiment has been proposed as an alternative to the idea that abstract words are linguistically coded and that linguistic processing plays a key role in their acquisition and processing. In this paper, we use distributional semantic models to explore the complex interplay between linguistic and affective information in the representation of abstract words. Distributional analyses on Italian norming data show that abstract words have more affective content and tend to co-occur with contexts with higher emotive values, according to affective statistical indices estimated in terms of distributional similarity with a restricted number of seed words strongly associated with a set of basic emotions. Therefore, the strong affective content of abstract words might just be an indirect byproduct of co-occurrence statistics. This is consistent with a version of representational pluralism in which concepts that are fully embodied either at the sensorimotor or at the affective level live side-by-side with concepts only indirectly embodied via their linguistic associations with other embodied words. Copyright © 2018 Cognitive Science Society, Inc.
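
    The affective statistical indices described above ("estimated in terms of distributional similarity with a restricted number of seed words") can be sketched as a similarity-weighted average of seed-word valences. The vectors and values below are toy assumptions, not the authors' Italian norming data:

```python
import numpy as np

# Toy distributional vectors (in the study these come from a corpus-trained
# distributional semantic model). All names and values are illustrative.
vectors = {
    "joy":     np.array([0.9, 0.1, 0.0]),
    "fear":    np.array([0.0, 0.9, 0.1]),
    "freedom": np.array([0.7, 0.3, 0.1]),  # abstract target word
}
seed_valence = {"joy": 1.0, "fear": -1.0}  # seed words with known emotion values

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def affective_index(word):
    """Estimate a word's emotive value as the similarity-weighted mean of the
    seed words' valences (one simple instance of the paper's general idea)."""
    sims = {s: cosine(vectors[word], vectors[s]) for s in seed_valence}
    total = sum(abs(v) for v in sims.values())
    return sum(sims[s] * seed_valence[s] for s in sims) / total

print(round(affective_index("freedom"), 3))
```

Because the index is driven entirely by co-occurrence similarity, a high affective score for an abstract word can arise without any direct affective grounding, which is the "indirect byproduct" point made in the abstract.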

  11. Improving mass candidate detection in mammograms via feature maxima propagation and local feature selection.

    Science.gov (United States)

    Melendez, Jaime; Sánchez, Clara I; van Ginneken, Bram; Karssemeijer, Nico

    2014-08-01

    Mass candidate detection is a crucial component of multistep computer-aided detection (CAD) systems. It is usually performed by combining several local features by means of a classifier. When these features are processed on a per-image-location basis (e.g., for each pixel), mismatching problems may arise while constructing feature vectors for classification, which is especially true when the behavior expected from the evaluated features is a peaked response due to the presence of a mass. In this study, two of these problems, consisting of maxima misalignment and differences of maxima spread, are identified and two solutions are proposed. The first proposed method, feature maxima propagation, reproduces feature maxima through their neighboring locations. The second method, local feature selection, combines different subsets of features for different feature vectors associated with image locations. Both methods are applied independently and together. The proposed methods are included in a mammogram-based CAD system intended for mass detection in screening. Experiments are carried out with a database of 382 digital cases. Sensitivity is assessed at two sets of operating points. The first one is the interval of 3.5-15 false positives per image (FPs/image), which is typical for mass candidate detection. The second one is 1 FP/image, which allows estimation of the quality of the mass candidate detector's output for use in subsequent steps of the CAD system. The best results are obtained when the proposed methods are applied together. In that case, the mean sensitivity in the interval of 3.5-15 FPs/image significantly increases from 0.926 to 0.958 (p < 0.0002). At the lower rate of 1 FP/image, the mean sensitivity improves from 0.628 to 0.734 (p < 0.0002). Given the improved detection performance, the authors believe that the strategies proposed in this paper can render mass candidate detection approaches based on image location classification more robust to feature
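
    The first method, feature maxima propagation, amounts to spreading each local maximum over its neighbourhood so that peaked responses from different feature maps line up when per-pixel feature vectors are assembled. A minimal sketch using a sliding-window maximum (the window shape and radius are assumptions, not the authors' exact parameters):

```python
import numpy as np

def propagate_maxima(feature_map, radius=2):
    """Replace each pixel by the maximum feature response within a square
    neighbourhood, so nearby peaks from different feature maps align."""
    padded = np.pad(feature_map, radius, mode="edge")
    out = np.empty_like(feature_map)
    h, w = feature_map.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + 2 * radius + 1,
                               j:j + 2 * radius + 1].max()
    return out

# Two toy feature maps whose peaks are misaligned by one pixel.
f1 = np.zeros((7, 7)); f1[3, 3] = 1.0
f2 = np.zeros((7, 7)); f2[3, 4] = 1.0

p1, p2 = propagate_maxima(f1), propagate_maxima(f2)
# After propagation both maps carry their peak value at the same location,
# so the feature vector at (3, 3) now sees both peaked responses.
print(p1[3, 3], p2[3, 3])
```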

  12. Current issues and challenges in global analysis of parton distributions

    International Nuclear Information System (INIS)

    Tung, Wu-Ki

    2007-01-01

    A new implementation of precise perturbative QCD calculation of deep inelastic scattering structure functions and cross sections, incorporating heavy quark mass effects, is applied to the global analysis of the full HERA I data sets on NC and CC cross sections, in conjunction with other experiments. Improved agreement between the NLO QCD theory and the global data sets is obtained. A comparison of the new results with those of the previous analysis, based on the conventional zero-mass parton formalism, is made. Exploratory work on the implications of new fixed-target neutrino scattering and Drell-Yan data for the global analysis is also discussed. (author)

  13. BioAtlas: Interactive web service for microbial distribution analysis

    DEFF Research Database (Denmark)

    Lund, Jesper; List, Markus; Baumbach, Jan

    Massive amounts of 16S rRNA sequencing data have been stored in publicly accessible databases, such as GOLD, SILVA, GreenGenes (GG), and the Ribosomal Database Project (RDP). Many of these sequences are tagged with geo-locations. Nevertheless, researchers currently lack a user-friendly tool...... to analyze microbial distribution in a location-specific context. BioAtlas is an interactive web application that closes this gap between sequence databases, taxonomy profiling and geo/body-location information. It enables users to browse taxonomically annotated sequences across (i) the world map, (ii) human...

  14. Analysis of the porosity distribution of mixed oxide pins

    International Nuclear Information System (INIS)

    Lieblich, M.; Lopez, J.

    1987-01-01

    Within the framework of the Joint Irradiation Program IVO-FR2-Vg7 with the Karlsruhe Nuclear Research Centre (KfK), 30 mixed-oxide fuel rods were irradiated in the FR2 experimental reactor. The pins were located in 10 single-walled NaK capsules. The behaviour of the fuel during burnup was studied, in particular the residual porosity and crack distribution in the pellets, partial densification, etc. In this work, 3 pins from capsule No. 165 were analyzed. The experimental results (pore and crack profiles) were interpreted with the fuel rod code SATURN. (Author) 20 refs

  15. Role of technetium-99m sestamibi scintimammography and contrast-enhanced magnetic resonance imaging for the evaluation of indeterminate mammograms

    International Nuclear Information System (INIS)

    Tiling, R.; Moser, R.; Meyer, G.; Tatsch, K.; Hahn, K.; Khalkhali, I.; Sommer, H.; Willemsen, F.; Pfluger, T.

    1997-01-01

    This study evaluated and compared technetium-99m sestamibi scintimammography (SMM) and breast magnetic resonance imaging (MRI) results in patients with indeterminate mammograms to determine whether either technique can improve the sensitivity and specificity for the diagnosis of breast carcinoma. From 123 consecutive patients who underwent physical examination, mammography, SMM, and histopathologic confirmation, a subgroup of 82 patients presenting with indeterminate mammograms was studied. Sixty-eight patients underwent contrast-enhanced MRI. SMM results were scored on the basis of the intensity and pattern of sestamibi uptake. MRI images were scored on the basis of signal intensity increase after administration of contrast material as well as the enhancement pattern and speed of gadolinium uptake. The results obtained with the two techniques were compared and related to the final histopathologic diagnoses. Considering indeterminate findings as positive, the sensitivity of SMM was 79% and the specificity, 70%. MRI displayed a sensitivity of 84% and a specificity of 49%. When indeterminate results were considered negative, the sensitivity and specificity of SMM were 62% and 83%, respectively. MRI revealed a sensitivity and specificity of 56% and 79%, respectively. The calculated sensitivities and specificities demonstrate the diagnostic limitations of both SMM and MRI in the evaluation of patients with indeterminate mammographic findings. Due to the higher specificity, SMM may be the preferred modality in the evaluation of selected patients with breast abnormalities. (orig.)
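
    The reported rates follow from the usual definitions, sensitivity = TP/(TP+FN) and specificity = TN/(TN+FP). The counts below are hypothetical, chosen only to roughly reproduce the SMM figures for the 82-patient subgroup and to show how reclassifying indeterminate findings shifts the trade-off:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative counts only (the abstract reports rates, not raw counts).
# Indeterminate reads counted as positive: more cancers are caught (TP up),
# but more benign cases are flagged (FP up).
sens_pos, spec_pos = sensitivity_specificity(tp=38, fn=10, tn=24, fp=10)
# The same reads counted as negative: fewer FP, but more missed cancers (FN up).
sens_neg, spec_neg = sensitivity_specificity(tp=30, fn=18, tn=28, fp=6)
print(round(sens_pos, 2), round(spec_pos, 2))  # higher sensitivity, lower specificity
print(round(sens_neg, 2), round(spec_neg, 2))  # lower sensitivity, higher specificity
```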

  16. Genetic Fuzzy System (GFS) based wavelet co-occurrence feature selection in mammogram classification for breast cancer diagnosis

    Directory of Open Access Journals (Sweden)

    Meenakshi M. Pawar

    2016-09-01

    Breast cancer is a significant health problem diagnosed mostly in women worldwide; early detection via digital mammography can reduce the mortality rate. This paper presents a wrapper-based feature selection approach for wavelet co-occurrence features (WCF) using a Genetic Fuzzy System (GFS) in the mammogram classification problem. The performance of the GFS algorithm is demonstrated on the mini-MIAS database. WCF features are obtained from the detail wavelet coefficients at each level of decomposition of the mammogram image. At the first level of decomposition, 18 features are applied to the GFS algorithm, which selects 5 features with an average classification success rate of 39.64%. At the second level it selects 9 of 36 features, improving the success rate to 56.75%. At the third level, 16 of 54 features are selected and the average success rate improves to 64.98%. Finally, at the fourth level, 72 features are applied to the GFS, which selects 16 of them, increasing the average success rate to 89.47%. Hence, the GFS algorithm is an effective way of obtaining an optimal feature set for breast cancer diagnosis.
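
    The WCF pipeline (wavelet detail subbands, then co-occurrence statistics per subband) can be sketched as follows. A one-level Haar transform and two classic Haralick features stand in for the paper's pipeline; the wavelet family, co-occurrence offset, and feature set used in the paper are not specified here, so treat every choice as an assumption:

```python
import numpy as np

def haar_detail(img):
    """One level of a 2-D Haar transform; returns the three detail subbands
    (horizontal, vertical, diagonal)."""
    a = img[0::2, 0::2]; b = img[0::2, 1::2]
    c = img[1::2, 0::2]; d = img[1::2, 1::2]
    lh = (a - b + c - d) / 4.0   # horizontal detail
    hl = (a + b - c - d) / 4.0   # vertical detail
    hh = (a - b - c + d) / 4.0   # diagonal detail
    return lh, hl, hh

def cooccurrence_features(band, levels=8):
    """Grey-level co-occurrence matrix (offset (0, 1)) of a quantized subband,
    reduced to two classic Haralick features: contrast and energy."""
    edges = np.linspace(band.min(), band.max(), levels + 1)[1:-1]
    q = np.digitize(band, edges)
    glcm = np.zeros((levels, levels))
    for i, j in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[i, j] += 1
    glcm /= glcm.sum()
    idx = np.arange(levels)
    contrast = float(((idx[:, None] - idx[None, :]) ** 2 * glcm).sum())
    energy = float((glcm ** 2).sum())
    return contrast, energy

rng = np.random.default_rng(0)
img = rng.random((64, 64))  # stand-in for a mammogram region of interest
features = [f for band in haar_detail(img) for f in cooccurrence_features(band)]
print(len(features))  # 3 subbands x 2 features = 6 features per level
```

Repeating this at successive decomposition levels grows the candidate feature pool, which is then pruned by the GFS wrapper.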

  17. Multi-task transfer learning deep convolutional neural network: application to computer-aided diagnosis of breast cancer on mammograms

    Science.gov (United States)

    Samala, Ravi K.; Chan, Heang-Ping; Hadjiiski, Lubomir M.; Helvie, Mark A.; Cha, Kenny H.; Richter, Caleb D.

    2017-12-01

    Transfer learning in deep convolutional neural networks (DCNNs) is an important step in their application to medical imaging tasks. We propose a multi-task transfer learning DCNN with the aim of translating the ‘knowledge’ learned from non-medical images to medical diagnostic tasks through supervised training and increasing the generalization capabilities of DCNNs by simultaneously learning auxiliary tasks. We studied this approach in an important application: classification of malignant and benign breast masses. With Institutional Review Board (IRB) approval, digitized screen-film mammograms (SFMs) and digital mammograms (DMs) were collected from our patient files and additional SFMs were obtained from the Digital Database for Screening Mammography. The data set consisted of 2242 views with 2454 masses (1057 malignant, 1397 benign). In single-task transfer learning, the DCNN was trained and tested on SFMs. In multi-task transfer learning, SFMs and DMs were used to train the DCNN, which was then tested on SFMs. N-fold cross-validation with the training set was used for training and parameter optimization. On the independent test set, the multi-task transfer learning DCNN was found to have significantly (p = 0.007) higher performance compared to the single-task transfer learning DCNN. This study demonstrates that multi-task transfer learning may be an effective approach for training DCNN in medical imaging applications when training samples from a single modality are limited.

  18. Analysis and Synthesis of Distributed Real-Time Embedded Systems

    DEFF Research Database (Denmark)

    Pop, Paul; Eles, Petru; Peng, Zebo

    like automotive electronics, real-time multimedia, avionics, medical equipment, and factory systems. The proposed analysis and synthesis techniques derive optimized implementations that fulfill the imposed design constraints. An important part of the implementation process is the synthesis...

  19. Experience with a distributed computing system for magnetic field analysis

    International Nuclear Information System (INIS)

    Newman, M.J.

    1978-08-01

    The development of a general-purpose computer system, THESEUS, is described; its initial use has been magnetic field analysis. The system involves several computers connected by data links. Some are small computers with interactive graphics facilities and limited analysis capabilities, and others are large computers for batch execution of analysis programs with heavy processor demands. The system is highly modular for easy extension and highly portable for transfer to different computers, and it can easily be adapted for a completely different application. It provides a highly efficient and flexible interface between magnet designers and specialised analysis programs. Both the advantages and the problems experienced are highlighted, together with a mention of possible future developments. (U.K.)

  20. Finite element analysis of thermal stress distribution in different ...

    African Journals Online (AJOL)

    Nigerian Journal of Clinical Practice • Jan-Feb 2016 • Vol 19 • Issue 1. Abstract ... Key words: Amalgam, finite element method, glass ionomer cement, resin composite, thermal stress ... applications for force analysis and assessment of different.

  1. Archiving, Distribution and Analysis of Solar-B Data

    Science.gov (United States)

    Shimojo, M.

    2007-10-01

    The Solar-B Mission Operation and Data Analysis (MODA) working group has been discussing the data analysis system for Solar-B data since 2001. In the paper, based on the Solar-B MODA document and the recent work in Japan, we introduce the dataflow from Solar-B to scientists, the data format and data-level of Solar-B data, and the data searching/providing system.

  2. Fuel distribution process risk analysis in East Borneo

    Directory of Open Access Journals (Sweden)

    Laksmita Raizsa

    2018-01-01

    Fuel distribution is an important aspect of fulfilling customer demand; delays in it can lead to fuel scarcity, and many risks occur in the distribution process. House of Risk is a method for mitigating such risks. The study identifies seven risk events and nine risk agents, and an occurrence-severity matrix is used to eliminate low-impact risks. House of Risk 1 is used to determine the Aggregate Risk Potential (ARP), and a Pareto diagram is applied to prioritize, on the basis of ARP, the risks that must be mitigated by preventive actions. Four priority risk agents are identified for mitigation: A8 (car trouble), A4 (human error), A3 (bank deposit errors and underpayment), and A6 (traffic accidents). House of Risk 2 maps preventive actions to risk agents and yields an Effectiveness to Difficulty (ETD) ratio for each mitigating action. Conducting a routine safety talk once every three days, with an ETD of 2088, is the primary preventive action.
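
    In the standard House of Risk formulation (after Pujawan and Geraldin), ARP and ETD are simple weighted sums. The numbers below are illustrative, not the study's values:

```python
# Standard House of Risk bookkeeping; all inputs here are illustrative.
def arp(occurrence, severities, correlations):
    """Aggregate Risk Potential of one risk agent:
    ARP_j = O_j * sum_i(S_i * R_ij), where O_j is the agent's occurrence,
    S_i the severity of risk event i, and R_ij their correlation score."""
    return occurrence * sum(s * r for s, r in zip(severities, correlations))

def etd(arps, effectiveness, difficulty):
    """Effectiveness-to-Difficulty ratio of one preventive action:
    ETD_k = sum_j(ARP_j * E_jk) / D_k, where E_jk is the action's
    effectiveness against agent j and D_k its difficulty of implementation."""
    total_effectiveness = sum(a * e for a, e in zip(arps, effectiveness))
    return total_effectiveness / difficulty

severities = [7, 5]  # severity S_i of two hypothetical risk events
agent_arps = [
    arp(occurrence=4, severities=severities, correlations=[9, 3]),  # e.g. A8
    arp(occurrence=3, severities=severities, correlations=[3, 9]),  # e.g. A4
]
print(agent_arps)                                  # [312, 198]
print(etd(agent_arps, effectiveness=[9, 3], difficulty=3))  # 1134.0
```

Agents are ranked by ARP (House of Risk 1, Pareto step); actions are ranked by ETD (House of Risk 2).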

  3. Visualization and analysis of lipopolysaccharide distribution in binary phospholipid bilayers

    Energy Technology Data Exchange (ETDEWEB)

    Henning, Maria Florencia [Instituto de Investigaciones Bioquimicas La Plata (INIBIOLP), CCT-La Plata, CONICET, Facultad de Ciencias Medicas, UNLP, Calles 60 y 120, 1900 La Plata (Argentina); Sanchez, Susana [Laboratory for Fluorescence Dynamics, University of California-Irvine, Irvine, CA (United States); Bakas, Laura, E-mail: lbakas@biol.unlp.edu.ar [Instituto de Investigaciones Bioquimicas La Plata (INIBIOLP), CCT-La Plata, CONICET, Facultad de Ciencias Medicas, UNLP, Calles 60 y 120, 1900 La Plata (Argentina); Departamento de Ciencias Biologicas, Facultad de Ciencias Exactas, UNLP, Calles 47 y 115, 1900 La Plata (Argentina)

    2009-05-22

    Lipopolysaccharide (LPS) is an endotoxin released from the outer membrane of Gram-negative bacteria during infections. It has been reported that LPS may play a role in the outer membrane of bacteria similar to that of cholesterol in eukaryotic plasma membranes. In this article we compare the effect of introducing LPS or cholesterol into liposomes made of dipalmitoylphosphatidylcholine/dioleoylphosphatidylcholine on the solubilization process by Triton X-100. The results show that liposomes containing LPS or cholesterol are more resistant to solubilization by Triton X-100 than the binary phospholipid mixtures at 4 °C. The LPS distribution was analyzed on GUVs of DPPC:DOPC using FITC-LPS. Solid and liquid-crystalline domains were visualized by labeling the GUVs with LAURDAN, and GP images were acquired using a two-photon microscope. The images show a selective distribution of LPS in gel domains. Our results support the hypothesis that LPS could aggregate and concentrate selectively in biological membranes, providing a mechanism to bring together several components of the LPS-sensing machinery.
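
    The GP images referred to are computed pixel-wise from LAURDAN's two emission channels with the standard generalized polarization formula; the intensities below are illustrative values, not the study's data:

```python
def laurdan_gp(i_blue, i_red):
    """LAURDAN generalized polarization: GP = (I_440 - I_490) / (I_440 + I_490).
    Ordered (gel-phase) lipid domains give GP values toward +1, while
    liquid-crystalline domains give lower, even negative, values."""
    return (i_blue - i_red) / (i_blue + i_red)

# Illustrative channel intensities for a gel pixel and a fluid pixel.
print(round(laurdan_gp(80.0, 20.0), 2))   # ordered domain: high GP
print(round(laurdan_gp(30.0, 70.0), 2))   # disordered domain: low GP
```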

  4. Distributed sensing signal analysis of deformable plate/membrane mirrors

    Science.gov (United States)

    Lu, Yifan; Yue, Honghao; Deng, Zongquan; Tzou, Hornsen

    2017-11-01

    Deformable optical mirrors usually play key roles in aerospace and optical structural systems applied to space telescopes, radars, solar collectors, communication antennas, etc. Limited by the payload capacity of current launch vehicles, the deformable mirrors should be lightweight and are generally made of ultra-thin plates or even membranes. These plate/membrane mirrors are susceptible to external excitations and this may lead to surface inaccuracy and jeopardize relevant working performance. In order to investigate the modal vibration characteristics of the mirror, a piezoelectric layer is fully laminated on its non-reflective side to serve as sensors. The piezoelectric layer is segmented into infinitesimal elements so that microscopic distributed sensing signals can be explored. In this paper, the deformable mirror is modeled as a pre-tensioned plate and membrane respectively and sensing signal distributions of the two models are compared. Different pre-tensioning forces are also applied to reveal the tension effects on the mode shape and sensing signals of the mirror. Analytical results in this study could be used as guideline of optimal sensor/actuator placement for deformable space mirrors.

  5. A Script Analysis of the Distribution of Counterfeit Alcohol Across Two European Jurisdictions

    OpenAIRE

    Lord, Nicholas; Spencer, Jonathan; Bellotti, Elisa; Benson, Katie

    2017-01-01

    This article presents a script analysis of the distribution of counterfeit alcohols across two European jurisdictions. Based on an analysis of case file data from a European regulator and interviews with investigators, the article deconstructs the organisation of the distribution of the alcohol across jurisdictions into five scenes (collection, logistics, delivery, disposal, proceeds/finance) and analyses the actual (or likely permutations of) behaviours within each scene. The analysis also i...

  6. Measurement based scenario analysis of short-range distribution system planning

    DEFF Research Database (Denmark)

    Chen, Peiyuan; Bak-Jensen, Birgitte; Chen, Zhe

    2009-01-01

    This paper focuses on short-range distribution system planning using a probabilistic approach. Empirical probability distributions of load demand and distributed generation are derived from historical measurement data and incorporated into the system planning. Simulations with various...... feasible scenarios are performed based on a local distribution system at Støvring in Denmark. Simulation results provide more accurate and insightful information for the decision-maker when using the probabilistic analysis than when using the worst-case analysis, so that better planning can be achieved....

  7. Analysis of the tropospheric water distribution during FIRE 2

    Science.gov (United States)

    Westphal, Douglas L.

    1993-01-01

    The Penn State/NCAR mesoscale model, as adapted for use at ARC, was used as a testbed for the development and validation of cloud models for use in General Circulation Models (GCM's). This modeling approach also allows us to intercompare the predictions of the various cloud schemes within the same dynamical framework. The use of the PSU/NCAR mesoscale model also allows us to compare our results with FIRE-II (First International Satellite Cloud Climatology Project Regional Experiment) observations, instead of climate statistics. Though a promising approach, our work to date revealed several difficulties. First, the model by design is limited in spatial coverage and is only run for 12 to 48 hours at a time. Hence the quality of the simulation will depend heavily on the initial conditions. The poor quality of upper-tropospheric measurements of water vapor is well known and the situation is particularly bad for mid-latitude winter since the coupling with the surface is less direct than in summer so that relying on the model to spin-up a reasonable moisture field is not always successful. Though one of the most common atmospheric constituents, water vapor is relatively difficult to measure accurately, especially operationally over large areas. The standard NWS sondes have little sensitivity at the low temperatures where cirrus form and the data from the GOES 6.7 micron channel is difficult to quantify. For this reason, the goals of FIRE Cirrus II included characterizing the three-dimensional distribution of water vapor and clouds. In studying the data from FIRE Cirrus II, it was found that no single special observation technique provides accurate regional distributions of water vapor. The Raman lidar provides accurate measurements, but only at the Hub, for levels up to 10 km, and during nighttime hours. 
The CLASS sondes are more sensitive to moisture at low temperatures than are the NWS sondes, but the four stations only cover an area of two hundred kilometers on a side.

  8. CLUSTER ANALYSIS UKRAINIAN REGIONAL DISTRIBUTION BY LEVEL OF INNOVATION

    Directory of Open Access Journals (Sweden)

    Roman Shchur

    2016-07-01

    A SWOT analysis of the threats and benefits of the innovation development strategy of the Ivano-Frankivsk region in the context of financial support was conducted. A methodical approach to determining the potential of public-private partnerships, as a tool for financing innovative economic development, was identified. A cluster analysis of the possibilities of forming public-private partnerships in a particular region was carried out. An optimal set of problem areas that require urgent solutions and financial support is defined on the basis of the cluster approach; this will help to form practical recommendations for an effective financial mechanism in the regions of Ukraine. Key words: the mechanism of financial provision for innovation development, innovation development, public-private partnerships, cluster analysis, innovative development strategy.

  9. Quantitative analysis of distributed control paradigms of robot swarms

    DEFF Research Database (Denmark)

    Ngo, Trung Dung

    2010-01-01

    Given a task of designing a controller for mobile robots in swarms, one might wonder which distributed control paradigms should be selected. Until now, paradigms of robot controllers have been within either behaviour based control or neural network based control, which have been recognized as two mainstreams of controller design for mobile robots. However, in swarm robotics, it is not clear how to determine control paradigms. In this paper we study the two control paradigms with various experiments of swarm aggregation. First, we introduce the two control paradigms for mobile robots. Second, we describe the physical and simulated robots, experiment scenario, and experiment setup. Third, we present our robot controllers based on behaviour based and neural network based paradigms. Fourth, we graphically show their experiment results and quantitatively analyse the results in comparison of the two...

  10. BEANS - a software package for distributed Big Data analysis

    Science.gov (United States)

    Hypki, Arkadiusz

    2018-03-01

    BEANS software is a web-based, easy to install and maintain, new tool to store and analyse massive amounts of data in a distributed way. It provides a clear interface for querying, filtering, aggregating, and plotting data from an arbitrary number of datasets. Its main purpose is to simplify the process of storing, examining and finding new relations in huge datasets. The software is an answer to a growing need of the astronomical community to have a versatile tool to store, analyse and compare complex astrophysical numerical simulations with observations (e.g. simulations of the Galaxy or star clusters with the Gaia archive). However, this software was built in a general form and is ready to use in any other research field. It can be used as a building block for other open source software too.

  11. TCP isoeffect analysis using a heterogeneous distribution of radiosensitivity

    International Nuclear Information System (INIS)

    Carlone, Marco; Wilkins, David; Nyiri, Balazs; Raaphorst, Peter

    2004-01-01

    A formula for the α/β ratio is derived using the heterogeneous (population averaged) tumor control model. This formula is nearly identical to the formula obtained using the homogeneous (individual) tumor control model, but the new formula includes extra terms showing that the α/β ratio, the ratio of the mean value of α divided by the mean value of β that would be observed in a patient population, explicitly depends on the survival level and heterogeneity. The magnitude of this correction is estimated for prostate cancer, and this appears to raise the mean value of the ratio estimate by about 20%. The method also allows investigation of confidence limits for α/β based on a population distribution of radiosensitivity. For a widely heterogeneous population, the upper 95% confidence interval for the α/β ratio can be as high as 7.3 Gy, even though the population mean is between 2.3 and 2.6 Gy
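
    The population-averaged model can be sketched by averaging the individual Poisson/linear-quadratic TCP over a Gaussian spread of radiosensitivity α. All parameter values below are illustrative assumptions, not the paper's prostate fits:

```python
import math
import random

def tcp_individual(alpha, beta, dose_per_fraction, n_fractions, clonogens=1e7):
    """Poisson TCP for one patient under the linear-quadratic model:
    TCP = exp(-N * exp(-n * (alpha*d + beta*d^2)))."""
    lq_exponent = n_fractions * (alpha * dose_per_fraction
                                 + beta * dose_per_fraction ** 2)
    return math.exp(-clonogens * math.exp(-lq_exponent))

def tcp_population(alpha_mean, alpha_sd, beta, d, n, samples=20000, seed=1):
    """Population-averaged TCP: mean of individual TCPs over a (truncated)
    Gaussian spread of the radiosensitivity alpha."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        a = max(1e-6, rng.gauss(alpha_mean, alpha_sd))
        total += tcp_individual(a, beta, d, n)
    return total / samples

homog = tcp_individual(0.15, 0.05, 2.0, 35)          # single-alpha patient
heterog = tcp_population(0.15, 0.04, 0.05, 2.0, 35)  # heterogeneous cohort
# Heterogeneity flattens the population dose-response curve relative to the
# homogeneous (single-alpha) model, which is what shifts isoeffect-based
# alpha/beta estimates.
print(round(homog, 3), round(heterog, 3))
```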

  12. Analysis and Optimization of Distributed Real-Time Embedded Systems

    DEFF Research Database (Denmark)

    Pop, Paul; Eles, Petru; Peng, Zebo

    2006-01-01

    and scheduling policies. In this context, the task of designing such systems is becoming increasingly difficult. The success of new adequate design methods depends on the availability of efficient analysis as well as optimization techniques. In this paper, we present both analysis and optimization approaches...... characteristic to this class of systems: mapping of functionality, the optimization of the access to the communication channel, and the assignment of scheduling policies to processes. Optimization heuristics aiming at producing a schedulable system, with a given amount of resources, are presented....

  13. A data analysis expert system for large established distributed databases

    Science.gov (United States)

    Gnacek, Anne-Marie; An, Y. Kim; Ryan, J. Patrick

    1987-01-01

    A design for a natural language database interface system, called the Deductively Augmented NASA Management Decision support System (DANMDS), is presented. The DANMDS system components have been chosen on the basis of the following considerations: maximal employment of the existing NASA IBM-PC computers and supporting software; local structuring and storing of external data via the entity-relationship model; a natural easy-to-use error-free database query language; user ability to alter query language vocabulary and data analysis heuristic; and significant artificial intelligence data analysis heuristic techniques that allow the system to become progressively and automatically more useful.

  14. Evolution of the sedimentation technique for particle size distribution analysis

    International Nuclear Information System (INIS)

    Maley, R.

    1998-01-01

    After an introduction on the significance of particle size measurements, sedimentation methods are described, with emphasis on the evolution of the gravitational approach. The gravitational technique based on mass determination by X-ray absorption allows fast analysis by automation and easy data handling, in addition to providing the accuracy required by quality control and research applications [it]
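
    Gravitational sedimentation sizing rests on Stokes' law: a particle that settles a depth h in time t has terminal velocity h/t, and the corresponding Stokes diameter is d = sqrt(18 η (h/t) / ((ρp - ρf) g)). A sketch with illustrative values:

```python
import math

def stokes_diameter(settling_depth, time, particle_density, fluid_density,
                    viscosity, g=9.81):
    """Stokes diameter from gravitational sedimentation:
    d = sqrt(18 * eta * (h/t) / ((rho_p - rho_f) * g)).
    Valid in the creeping-flow regime (particle Reynolds number << 1)."""
    velocity = settling_depth / time  # terminal settling velocity, m/s
    return math.sqrt(18.0 * viscosity * velocity
                     / ((particle_density - fluid_density) * g))

# Illustrative case: silica (2650 kg/m^3) settling 5 cm in water over 60 s.
d = stokes_diameter(settling_depth=0.05, time=60.0,
                    particle_density=2650.0, fluid_density=1000.0,
                    viscosity=1.0e-3)
print(round(d * 1e6, 1), "micrometres")
```

An X-ray sedimentation instrument inverts this relation: it records mass concentration (via X-ray attenuation) at a known depth over time and converts each time to a Stokes diameter, yielding the cumulative size distribution.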

  15. Distributed Robustness Analysis of Interconnected Uncertain Systems Using Chordal Decomposition

    DEFF Research Database (Denmark)

    Pakazad, Sina Khoshfetrat; Hansson, Anders; Andersen, Martin Skovgaard

    2014-01-01

    Large-scale interconnected uncertain systems commonly have large state and uncertainty dimensions. Aside from the heavy computational cost of performing robust stability analysis in a centralized manner, privacy requirements in the network can also introduce further issues. In this paper, we util...

  16. Channel flow analysis. [velocity distribution throughout blade flow field

    Science.gov (United States)

    Katsanis, T.

    1973-01-01

    The design of a proper blade profile requires calculation of the blade row flow field in order to determine the velocities on the blade surfaces. An analysis theory is presented for several methods used for this calculation and associated computer programs that were developed are discussed.

  17. Morphological analysis of polymer systems with broad particle size distribution

    Czech Academy of Sciences Publication Activity Database

    Šlouf, Miroslav; Ostafinska, Aleksandra; Nevoralová, Martina; Fortelný, Ivan

    2015-01-01

Vol. 42, April (2015), pp. 8-16. ISSN 0142-9418. R&D Projects: GA ČR(CZ) GA14-17921S. Institutional support: RVO:61389013. Keywords: polymer blends; morphology; image analysis. Subject RIV: JJ - Other Materials. Impact factor: 2.350, year: 2015

  18. Advanced analysis of metal distributions in human hair

    International Nuclear Information System (INIS)

    Kempson, Ivan M.; Skinner, William M.

    2006-01-01

A variety of techniques (secondary electron microscopy with energy-dispersive X-ray analysis, time-of-flight secondary ion mass spectrometry, and synchrotron X-ray fluorescence) were utilized to distinguish metal contamination of hair from endogenous uptake in an individual exposed to a polluted environment, in this case a lead smelter. Evidence was sought for elements less affected by contamination and potentially indicative of biogenic activity. The unique combination of surface sensitivity, spatial resolution, and detection limits used here has provided new insight regarding hair analysis. Metals such as Ca, Fe, and Pb appeared to have little value as indicators of endogenous uptake and were mainly due to contamination. Cu and Zn, however, demonstrate behaviors worthy of further investigation into relating hair concentrations to endogenous function.

  19. Gaze distribution analysis and saliency prediction across age groups.

    Science.gov (United States)

    Krishna, Onkar; Helo, Andrea; Rämä, Pia; Aizawa, Kiyoharu

    2018-01-01

Knowledge of the human visual system helps to develop better computational models of visual attention. State-of-the-art models have been developed to mimic the visual attention system of young adults but largely ignore the variations that occur with age. In this paper, we investigated how visual scene processing changes with age and we propose an age-adapted framework that helps to develop a computational model that can predict saliency across different age groups. Our analysis uncovers how the explorativeness of an observer varies with age, how well saliency maps of an age group agree with fixation points of observers from the same or different age groups, and how age influences the center bias tendency. We analyzed the eye movement behavior of 82 observers belonging to four age groups while they explored visual scenes. Explorativeness was quantified in terms of the entropy of a saliency map, and the area under the curve (AUC) metric was used to quantify the agreement analysis and the center bias tendency. The analysis results were used to develop age-adapted saliency models. Our results suggest that the proposed age-adapted saliency model outperforms existing saliency models in predicting the regions of interest across age groups.
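The two metrics named in the abstract can be sketched in a few lines of NumPy. This is an illustrative implementation, not the authors' code: the entropy treats a saliency map as a probability distribution, and the AUC here is a simple rank-based (Mann-Whitney) variant; the paper's exact AUC flavor is not specified in the record.

```python
import numpy as np

def saliency_entropy(smap):
    """Shannon entropy (bits) of a saliency map treated as a distribution."""
    p = smap.ravel().astype(float)
    p = p / p.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def auc_fixation(smap, fix_mask):
    """AUC: probability that a fixated pixel outscores a non-fixated one."""
    s = smap.ravel().astype(float)
    m = fix_mask.ravel().astype(bool)
    pos, neg = s[m], s[~m]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (pos.size * neg.size)

# toy 2x2 example: the single fixation lands on the most salient pixel
smap = np.array([[0.1, 0.2], [0.3, 0.9]])
fixations = np.array([[0, 0], [0, 1]], dtype=bool)
```

A uniform map has maximal entropy (an "explorative" observer), while a peaked map whose peak attracts the fixations yields an AUC near 1.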

  20. Distributed Finite Element Analysis Using a Transputer Network

    Science.gov (United States)

    Watson, James; Favenesi, James; Danial, Albert; Tombrello, Joseph; Yang, Dabby; Reynolds, Brian; Turrentine, Ronald; Shephard, Mark; Baehmann, Peggy

    1989-01-01

    The principal objective of this research effort was to demonstrate the extraordinarily cost effective acceleration of finite element structural analysis problems using a transputer-based parallel processing network. This objective was accomplished in the form of a commercially viable parallel processing workstation. The workstation is a desktop size, low-maintenance computing unit capable of supercomputer performance yet costs two orders of magnitude less. To achieve the principal research objective, a transputer based structural analysis workstation termed XPFEM was implemented with linear static structural analysis capabilities resembling commercially available NASTRAN. Finite element model files, generated using the on-line preprocessing module or external preprocessing packages, are downloaded to a network of 32 transputers for accelerated solution. The system currently executes at about one third Cray X-MP24 speed but additional acceleration appears likely. For the NASA selected demonstration problem of a Space Shuttle main engine turbine blade model with about 1500 nodes and 4500 independent degrees of freedom, the Cray X-MP24 required 23.9 seconds to obtain a solution while the transputer network, operated from an IBM PC-AT compatible host computer, required 71.7 seconds. Consequently, the $80,000 transputer network demonstrated a cost-performance ratio about 60 times better than the $15,000,000 Cray X-MP24 system.
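The "about 60 times" claim can be checked directly from the figures quoted in the abstract; the short calculation below is only that arithmetic, using the stated prices and timings.

```python
# figures quoted in the abstract
cost_cray, cost_xpfem = 15_000_000, 80_000   # USD
t_cray, t_xpfem = 23.9, 71.7                 # seconds, turbine-blade demo problem

cost_ratio = cost_cray / cost_xpfem          # 187.5x cheaper
speed_ratio = t_xpfem / t_cray               # ~3x slower
advantage = cost_ratio / speed_ratio         # ~62, i.e. "about 60 times"
```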

  1. Neighborhood Structural Similarity Mapping for the Classification of Masses in Mammograms.

    Science.gov (United States)

    Rabidas, Rinku; Midya, Abhishek; Chakraborty, Jayasree

    2018-05-01

In this paper, two novel feature extraction methods, using neighborhood structural similarity (NSS), are proposed for the characterization of mammographic masses as benign or malignant. Since the gray-level distribution of pixels differs between benign and malignant masses, with more regular and homogeneous patterns visible in benign masses, the proposed method exploits the similarity between neighboring regions of masses by designing two new features, namely NSS-I and NSS-II, which capture global similarity at different scales. Complementary to these global features, uniform local binary patterns are computed to enhance the classification efficiency by combining with the proposed features. The performance of the features is evaluated using images from the mini-mammographic image analysis society (mini-MIAS) and digital database for screening mammography (DDSM) databases, where a tenfold cross-validation technique is incorporated with Fisher linear discriminant analysis after selecting the optimal set of features using a stepwise logistic regression method. The best area under the receiver operating characteristic curve of 0.98 is achieved with the mini-MIAS database, while that for the DDSM database is 0.93.
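The uniform local binary patterns used as complementary features can be sketched as follows. This is a generic textbook LBP implementation (8 neighbors, radius 1, 59-bin histogram), not the authors' NSS features or their exact LBP configuration.

```python
import numpy as np

def lbp_codes(img):
    """8-neighbour local binary pattern code for each interior pixel."""
    c = img[1:-1, 1:-1]
    nbrs = [img[:-2, :-2], img[:-2, 1:-1], img[:-2, 2:], img[1:-1, 2:],
            img[2:, 2:], img[2:, 1:-1], img[2:, :-2], img[1:-1, :-2]]
    code = np.zeros(c.shape, dtype=int)
    for bit, n in enumerate(nbrs):          # one bit per neighbour, circular order
        code |= (n >= c).astype(int) << bit
    return code

def is_uniform(pattern):
    """'Uniform' = at most 2 bitwise 0/1 transitions around the circular code."""
    bits = [(pattern >> i) & 1 for i in range(8)]
    return sum(bits[i] != bits[(i + 1) % 8] for i in range(8)) <= 2

def uniform_lbp_histogram(img):
    """Normalised 59-bin histogram: 58 uniform patterns + 1 catch-all bin."""
    uniform = [p for p in range(256) if is_uniform(p)]
    index = {p: i for i, p in enumerate(uniform)}
    hist = np.zeros(len(uniform) + 1)
    for code in lbp_codes(img).ravel():
        hist[index.get(int(code), len(uniform))] += 1
    return hist / hist.sum()
```

For a flat patch every code is 255 (all neighbors tie the center), a uniform pattern, so the histogram concentrates in a single bin; textured masses spread mass over many bins.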

  2. Analysis of fold distributions of segmented clover detectors

    International Nuclear Information System (INIS)

    Bhattacharya, Pintu; Kshetri, Ritesh

    2015-01-01

We have studied the effect of segmentation on the full energy deposition of a gamma-ray through studies of the fold distribution. The response of seven segmented TIGRESS detectors up to an energy of 8 MeV has been studied by utilizing standard sources of 152Eu and 56,60Co and a radioactive 11Be beam with an energy of 16.5 MeV. The experiment was performed at the ISAC-II facility at TRIUMF, using a thick gold target. The β⁻ decay of 11Be (τ1/2 = 13.81(8) s) produces high-energy gamma-rays up to 7974 keV. A 1 mm thick annular double-sided silicon detector of the BAMBINO detector was mounted 19.4 mm downstream of the target position and used for detection of the electrons in coincidence with the gamma-rays from the seven TIGRESS detectors. The master trigger allowed data to be collected either in Ge singles mode or with a Ge-Si coincidence condition. The standard sources of 152Eu and 56,60Co were also used to obtain low-energy data

  3. Using Weibull Distribution Analysis to Evaluate ALARA Performance

    International Nuclear Information System (INIS)

    Frome, E.L.; Watkins, J.P.; Hagemeyer, D.A.

    2009-01-01

    As Low as Reasonably Achievable (ALARA) is the underlying principle for protecting nuclear workers from potential health outcomes related to occupational radiation exposure. Radiation protection performance is currently evaluated by measures such as collective dose and average measurable dose, which do not indicate ALARA performance. The purpose of this work is to show how statistical modeling of individual doses using the Weibull distribution can provide objective supplemental performance indicators for comparing ALARA implementation among sites and for insights into ALARA practices within a site. Maximum likelihood methods were employed to estimate the Weibull shape and scale parameters used for performance indicators. The shape parameter reflects the effectiveness of maximizing the number of workers receiving lower doses and is represented as the slope of the fitted line on a Weibull probability plot. Additional performance indicators derived from the model parameters include the 99th percentile and the exceedance fraction. When grouping sites by collective total effective dose equivalent (TEDE) and ranking by 99th percentile with confidence intervals, differences in performance among sites can be readily identified. Applying this methodology will enable more efficient and complete evaluation of the effectiveness of ALARA implementation.
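The fitting and the two derived indicators described above can be sketched with SciPy. This is an illustrative reconstruction, not the authors' code: the dose sample is synthetic, and the 5 mSv threshold for the exceedance fraction is a hypothetical example value.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# hypothetical annual doses (mSv) for one site's monitored workers
doses = stats.weibull_min.rvs(0.8, scale=1.5, size=500, random_state=rng)

# maximum-likelihood Weibull fit with the location fixed at zero, as for doses
shape, loc, scale = stats.weibull_min.fit(doses, floc=0)

p99 = stats.weibull_min.ppf(0.99, shape, scale=scale)    # 99th percentile
threshold = 5.0                                          # hypothetical limit, mSv
exceedance = stats.weibull_min.sf(threshold, shape, scale=scale)
```

A steeper shape parameter (slope on the Weibull probability plot) indicates more workers pushed toward low doses; ranking sites by the fitted 99th percentile is then straightforward.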

  4. Analysis of dose distribution in interventionist radiology professionals

    International Nuclear Information System (INIS)

    Mauricio, Claudia L.P.; Silva, Leonardo Peres; Canevaro, Lucia V.; Luz, Eara de Souza

    2005-01-01

In this work, an evaluation was made of the distribution of dose received by professionals involved in some procedures of interventional radiology at hospitals and clinics in Rio de Janeiro, RJ, Brazil. For these measurements, thermoluminescent dosemeters (TLDs) of LiF:Mg,Ti (TLD-100) were used, positioned at different points of the professionals' bodies: the hands, knees, neck, forehead and chest, inside and outside the lead apron. The measurements were made per procedure and/or per day of work, and the TLDs were calibrated in terms of the operational quantity personal dose equivalent (Hp(d)) at different depths: 0.07 mm, 3 mm and 10 mm. In some procedures, physicians (staff and residents) received significant doses. The results show the importance of appropriate training of these professionals and the use of personal protective equipment (PPE), such as thyroid shields, which are not always used. Based on these evaluations, some suggestions were made in order to optimize the dose values in these procedures, along with a discussion of the need for additional monitoring points.

  5. Distributed support modelling for vertical track dynamic analysis

    Science.gov (United States)

    Blanco, B.; Alonso, A.; Kari, L.; Gil-Negrete, N.; Giménez, J. G.

    2018-04-01

    The finite length nature of rail-pad supports is characterised by a Timoshenko beam element formulation over an elastic foundation, giving rise to the distributed support element. The new element is integrated into a vertical track model, which is solved in frequency and time domain. The developed formulation is obtained by solving the governing equations of a Timoshenko beam for this particular case. The interaction between sleeper and rail via the elastic connection is considered in an analytical, compact and efficient way. The modelling technique results in realistic amplitudes of the 'pinned-pinned' vibration mode and, additionally, it leads to a smooth evolution of the contact force temporal response and to reduced amplitudes of the rail vertical oscillation, as compared to the results from concentrated support models. Simulations are performed for both parametric and sinusoidal roughness excitation. The model of support proposed here is compared with a previous finite length model developed by other authors, coming to the conclusion that the proposed model gives accurate results at a reduced computational cost.

  6. A framework for establishing the technical efficiency of Electricity Distribution Counties (EDCs) using Data Envelopment Analysis

    International Nuclear Information System (INIS)

    Mullarkey, Shane; Caulfield, Brian; McCormack, Sarah; Basu, Biswajit

    2015-01-01

Highlights: • Six models are employed to establish the technical efficiency of Electricity Distribution Counties. • A diagnostic parameter is incorporated to account for differences across Electricity Distribution Counties. • The amalgamation of Electricity Distribution Counties leads to improved efficiency in the production of energy. - Abstract: European energy market liberalization has entailed the restructuring of electricity power markets through the unbundling of electricity generation, transmission, distribution and supply activities, and the introduction of competition into electricity generation. Under these new electricity market regimes, it is important to have an evaluation tool that is capable of examining the impacts of these market changes. The adoption of Data Envelopment Analysis as a form of benchmarking for electricity distribution regulation is one method to conduct this analysis. This paper applies a Data Envelopment Analysis framework to the electricity distribution network in Ireland to explore the merits of using this approach, to determine the technical efficiency and the potential scope for efficiency improvements through reorganizing and the amalgamation of the distribution network in Ireland. The results presented show that overall grid efficiency is improved through this restructuring. A diagnostic parameter is defined and pursued to account for aberrations across Electricity Distribution Counties, as opposed to the traditionally employed environmental variables. The adoption of this diagnostic parameter leads to a more intuitive understanding of Electricity Distribution Counties.
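A Data Envelopment Analysis efficiency score can be computed as a small linear program. The sketch below is a generic input-oriented CCR model (one of the standard DEA formulations; the record does not specify which of the six models uses this orientation), with a toy one-input, one-output example whose data are invented.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, k):
    """Input-oriented CCR efficiency of decision-making unit k.

    X: (units, inputs), Y: (units, outputs).  Solves
    min theta  s.t.  X^T lam <= theta * X[k],  Y^T lam >= Y[k],  lam >= 0.
    """
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(n + 1)
    c[0] = 1.0                                     # minimise theta
    A_in = np.hstack([-X[k].reshape(m, 1), X.T])   # peer inputs vs theta * own
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])    # peer outputs at least own
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -Y[k]]),
                  bounds=[(0, None)] * (n + 1))
    return res.fun

# toy example: one input (e.g. network length), one output (energy delivered)
X = np.array([[2.0], [4.0], [8.0]])
Y = np.array([[2.0], [4.0], [4.0]])
effs = [ccr_efficiency(X, Y, k) for k in range(3)]   # third unit is inefficient
```

Units on the efficient frontier score 1; the third unit scores 0.5 because a peer delivers the same output with half the input.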

  7. Analysis of distribution of PSL intensity recorded in imaging plate

    International Nuclear Information System (INIS)

    Oda, Keiji; Tsukahara, Kazutaka; Tada, Hidenori; Yamauchi, Tomoya

    2006-01-01

Supplementary experiments and theoretical considerations have been performed regarding a new method for particle identification with an imaging plate, which was proposed in a previous paper. The imaging plate was exposed to 137Cs γ-rays, 2 MeV protons accelerated by a tandem Van de Graaff, and X-rays emitted from a tube operated at 20-70 kV, as well as α- and β-rays. The frequency distribution of PSL intensity in a pixel of 100 μm × 100 μm was measured and the standard deviation was obtained by fitting to a Gaussian. It was confirmed that the relative standard deviation decreased with the average PSL intensity for every radiation species and that the curves were roughly divided into four groups: α-rays, protons, β-rays and photons. In the second step, these data were analyzed by plotting the square of the relative standard deviation against the average PSL intensity on a full-log scale, where the relation should be expressed by a straight line with a slope of -1 provided that the deviation is dominated only by statistical fluctuation. The data for α- and β-rays deviated from a straight line and approached saturated values as the average PSL intensity increased. This saturation was considered to be caused by inhomogeneity in the source intensity. It was also pointed out that the value of the intercept on the full-log plot would carry important information about the PSL reading efficiency, one of the characteristic parameters of an imaging plate. (author)
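The slope of -1 expected under pure counting statistics follows from the Poisson relation var = mean, so (σ/μ)² = 1/μ. The simulation below is an illustrative check of that relation only (the PSL means are arbitrary), not a model of the imaging-plate physics.

```python
import numpy as np

rng = np.random.default_rng(0)
means = np.array([5.0, 20.0, 80.0, 320.0])   # hypothetical average PSL per pixel

rel_var = []
for m in means:
    counts = rng.poisson(m, size=20000)      # pixel values, pure counting noise
    rel_var.append(counts.var() / counts.mean() ** 2)

# slope of log(relative variance) vs log(mean): -1 for a Poisson-limited signal
slope = np.polyfit(np.log(means), np.log(rel_var), 1)[0]
```

A measured slope flattening away from -1, as the abstract reports for α- and β-rays, signals an extra non-statistical variance source such as source inhomogeneity.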

  8. Photoelastic analysis of stress distribution in oral rehabilitation.

    Science.gov (United States)

    Turcio, Karina Helga Leal; Goiato, Marcelo Coelho; Gennari Filho, Humberto; dos Santos, Daniela Micheline

    2009-03-01

The purpose of this study was to present a literature review about photoelasticity, a laboratory method for evaluating the behavior of implant prostheses. Fixed or removable prostheses function as levers on supporting teeth, allowing forces to cause tooth movement if not carefully planned. Hence, during treatment planning, the dentist must be aware of the biomechanics involved and prevent movement of supporting teeth, decreasing the lever-type forces generated by these prostheses. Photoelastic analysis has great applicability in restorative dentistry as it allows prediction and minimization of biomechanical critical points through modifications in treatment planning.

  9. Lognormal Distribution of Cellular Uptake of Radioactivity: Statistical Analysis of α-Particle Track Autoradiography

    Science.gov (United States)

    Neti, Prasad V.S.V.; Howell, Roger W.

    2010-01-01

Recently, the distribution of radioactivity among a population of cells labeled with 210Po was shown to be well described by a lognormal (LN) distribution function (J Nucl Med. 2006;47:1049-1058) with the aid of autoradiography. To ascertain the influence of Poisson statistics on the interpretation of the autoradiographic data, the present work reports a detailed statistical analysis of these earlier data. Methods: The measured distributions of α-particle tracks per cell were subjected to statistical tests with Poisson, LN, and Poisson-lognormal (P-LN) models. Results: The LN distribution function best describes the distribution of radioactivity among cell populations exposed to 0.52 and 3.8 kBq/mL of 210Po-citrate. When cells were exposed to 67 kBq/mL, the P-LN distribution function gave a better fit; however, the underlying activity distribution remained lognormal. Conclusion: The present analysis generally provides further support for the use of LN distributions to describe the cellular uptake of radioactivity. Care should be exercised when analyzing autoradiographic data on activity distributions to ensure that Poisson processes do not distort the underlying LN distribution. PMID:18483086
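The Poisson-lognormal structure (lognormal activities, Poisson track counts given each activity) can be illustrated with a simulation. This sketch is not the paper's analysis: the parameters are invented, and the mixed likelihood is evaluated at the generating parameters by simple quadrature rather than fitted.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# hypothetical data: lognormal cellular activities, Poisson tracks per cell
activities = rng.lognormal(mean=1.5, sigma=0.8, size=2000)
tracks = rng.poisson(activities)

# plain Poisson fit: the MLE of the rate is the sample mean
lam = tracks.mean()
ll_poisson = stats.poisson.logpmf(tracks, lam).sum()

# Poisson-lognormal likelihood: mix the Poisson pmf over a lognormal grid
a = np.linspace(1e-3, activities.max() * 3, 400)
w = stats.lognorm.pdf(a, s=0.8, scale=np.exp(1.5))
pmf = stats.poisson.pmf(tracks[:, None], a[None, :])
cell_like = (pmf * w).sum(axis=1) * (a[1] - a[0])
ll_mixed = np.log(cell_like + 1e-300).sum()
```

The track counts are strongly overdispersed relative to a single-rate Poisson, so the mixed model attains a much higher log-likelihood, mirroring the paper's point that the counting process rides on top of an underlying LN activity distribution.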

  10. Impact analysis of automotive structures with distributed smart material systems

    Science.gov (United States)

    Peelamedu, Saravanan M.; Naganathan, Ganapathy; Buckley, Stephen J.

    1999-06-01

A new class of automobiles has structural skins that are quite different from current designs. In particular, new families of composite skins are being developed with new injection molding processes. These skins, while supporting the concept of lighter vehicles of the future, are also susceptible to damage upon impact. It is important that their design be based on a better understanding of the type of impact loads and the resulting strains and damage. It is possible that these skins can be integrally designed with active materials to counter damage. This paper presents a preliminary analysis of a new class of automotive skins, using piezoceramic as a smart material. The main objective is to model the complex system, with the skin as a layered plate structure involving a lightweight foam material with active materials embedded in it. To begin with, a cantilever beam structure is subjected to a load through a piezoceramic and the resulting strain at the active material site is predicted, accounting for the material properties, piezoceramic thickness and adhesive thickness, including the effect of the adhesives. A finite element analysis is carried out for comparison with experimental work. Further work in this direction would provide an analytical tool serving as the basis for algorithms to predict and counter impacts on future classes of automobiles.

  11. Analysis of the international distribution of per capita CO2 emissions using the polarization concept

    International Nuclear Information System (INIS)

    Duro, Juan Antonio; Padilla, Emilio

    2008-01-01

The concept of polarization refers to the extent to which a given distribution leads to the formation of homogeneous groups with opposing interests. This concept, which is fundamentally different from the traditional one of inequality, is related to the level of potential conflict inherent in a distribution. The polarization approach has been widely applied in the analysis of income distribution. The extension of this approach to the analysis of the international distribution of CO2 emissions is quite useful, as it gives a potent informative instrument for characterizing the state and evolution of the international distribution of emissions and its possible political consequences in terms of tensions and the probability of achieving agreements. In this paper we analyze the international distribution of per capita CO2 emissions between 1971 and 2001 through an adaptation of polarization concepts and measures. We find that the most interesting grouped description deriving from the analysis is one with two groups, which broadly coincide with the Annex B and non-Annex B countries of the Kyoto Protocol; this shows the power of polarization analysis for explaining the formation of groups in the real world. The analysis also shows a significant reduction in international polarization of per capita CO2 emissions between 1971 and 1995, but little change since 1995, which might indicate that the polarized distribution of emissions is still one of the important factors leading to difficulties in achieving agreements for reducing global emissions. (author)
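A standard polarization measure of the kind adapted in such studies is the Esteban-Ray index; the record does not name the exact measure used, so the sketch below is illustrative, with invented group shares and emission levels.

```python
import numpy as np

def esteban_ray(shares, means, alpha=1.0):
    """Esteban-Ray polarization of a grouped distribution (alpha in (0, 1.6])."""
    pi = np.asarray(shares, float)
    y = np.asarray(means, float)
    return float(np.sum(pi[:, None] ** (1 + alpha) * pi[None, :]
                        * np.abs(y[:, None] - y[None, :])))

# two opposing blocs score higher than the same range spread over four groups
bipolar = esteban_ray([0.5, 0.5], [1.0, 9.0])
spread = esteban_ray([0.25] * 4, [1.0, 4.0, 6.0, 9.0])
```

Unlike an inequality index, this measure peaks when the population clusters into a few internally homogeneous, mutually distant groups, which is exactly the two-bloc (Annex B vs non-Annex B) pattern the paper reports.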

  12. Attitudes of women in their forties toward the 2009 USPSTF mammogram guidelines: a randomized trial on the effects of media exposure.

    Science.gov (United States)

    Davidson, AuTumn S; Liao, Xun; Magee, B Dale

    2011-07-01

The objective of the study was to assess women's attitudes toward the 2009 US Preventive Services Task Force mammography screening guideline changes and evaluate the role of media in shaping opinions. The design comprised 249 women, aged 39-49 years, presenting for annual examinations, randomized to read 1 of 2 articles and then complete a survey. Eighty-eight percent overestimated the lifetime breast cancer (BrCa) risk. Eighty-nine percent want yearly mammograms in their 40s. Eighty-six percent felt the changes were unsafe, and even if the changes were doctor recommended, 84% would not delay screening until age 50 years. Those with a friend/relative with BrCa were more likely to want annual mammography in their forties (92% vs 77%, P = .001) and to feel the changes were unsafe (91% vs 69%, P ≤ .0001). Participants with previous false-positive mammograms were less likely to accept a doctor-recommended screening delay until age 50 years (8% vs 21%, P = .01). Women overestimate BrCa risk. Skepticism of the new mammogram guidelines exists and is increased by exposure to negative media. Those with prior false-positive mammograms are less likely to accept the changes. Copyright © 2011 Mosby, Inc. All rights reserved.
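Comparisons such as 92% vs 77% are typically tested with a two-proportion z-test (or chi-square). The sketch below is generic, not the study's analysis, and the group counts are hypothetical numbers merely consistent with the reported percentages.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided z-test for equality of two proportions (pooled standard error)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    pval = math.erfc(abs(z) / math.sqrt(2))   # two-sided normal p-value
    return z, pval

# hypothetical counts giving roughly 92% vs 77%
z, pval = two_proportion_z(138, 150, 76, 99)
```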

  13. Extending the LWS Data Environment: Distributed Data Processing and Analysis

    Science.gov (United States)

    Narock, Thomas

    2005-01-01

The final stages of this work saw changes to the original framework, as well as the completion and integration of several data processing services. Initially, it was thought that a peer-to-peer architecture was necessary to make this work possible. The peer-to-peer architecture provided many benefits, including the dynamic discovery of new services that would be continually added. A prototype example was built and, while it showed promise, a major disadvantage was that it was not easily integrated into the existing data environment. While the peer-to-peer system worked well for finding and accessing distributed data processing services, it was found that its use was limited by the difficulty in calling it from existing tools and services. After collaborations with members of the data community, it was determined that our data processing system was of high value and that a new interface should be pursued in order for the community to take full advantage of it. As such, the framework was modified from a peer-to-peer architecture to a more traditional web service approach. Following this change, multiple data processing services were added. These services include such things as coordinate transformations and subsetting of data. A collaboration with the Virtual Heliospheric Observatory (VHO) assisted with integrating the new architecture into the VHO. This allows anyone using the VHO to search for data and then pass that data through our processing services prior to downloading it. As a second attempt at demonstrating the new system, a collaboration was established with the Collaborative Sun Earth Connector (CoSEC) group at Lockheed Martin. This group is working on a graphical user interface to the Virtual Observatories and data processing software. The intent is to provide a high-level, easy-to-use graphical interface that will allow access to the existing Virtual Observatories and data processing services from one convenient application. Working with the CoSEC group, we provided access to our data processing services.

  14. Chromosome aberration analysis based on a beta-binomial distribution

    International Nuclear Information System (INIS)

    Otake, Masanori; Prentice, R.L.

    1983-10-01

The analyses carried out here generalized earlier studies of chromosomal aberrations in the populations of Hiroshima and Nagasaki by allowing extra-binomial variation in aberrant cell counts, corresponding to within-subject correlations in cell aberrations. Strong within-subject correlations were detected, with corresponding standard errors for the average number of aberrant cells that were often substantially larger than previously assumed. The extra-binomial variation is accommodated in the analysis in the present report, as described in the section on dose-response models, by using a beta-binomial (B-B) variance structure. It is emphasized that we have generally satisfactory agreement between the observed and the B-B fitted frequencies by city-dose category. The chromosomal aberration data considered here are not extensive enough to allow a precise discrimination between competing dose-response models. A quadratic gamma-ray and linear neutron model, however, most closely fits the chromosome data. (author)
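The key point of the beta-binomial variance structure is its inflation of the binomial variance by a factor 1 + (n - 1)ρ, where ρ is the within-subject (intra-class) correlation. The numbers below are hypothetical, chosen only to illustrate that relationship.

```python
from scipy import stats

n, p, rho = 100, 0.03, 0.1      # cells scored per subject, aberration rate,
                                # and within-subject (intra-class) correlation
# beta-binomial parameters expressed through the intra-class correlation rho
a = p * (1 / rho - 1)
b = (1 - p) * (1 / rho - 1)

var_binomial = n * p * (1 - p)
var_betabinom = stats.betabinom.var(n, a, b)
inflation = var_betabinom / var_binomial     # equals 1 + (n - 1) * rho
```

With 100 cells per subject and ρ = 0.1, the variance is inflated nearly elevenfold, which is why standard errors computed under a plain binomial model were "substantially larger than previously assumed" once the correlation was acknowledged.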

  15. Movement of Fuel Ashore: Storage, Capacity, Throughput, and Distribution Analysis

    Science.gov (United States)

    2015-12-01

in MPEM, as the basis for analysis. In keeping with the spirit of EF21 and seabasing concepts, this approach assumes that all other combat... [remainder of abstract garbled; a ground combat element task-organization listing could not be recovered]

  16. Fault Diagnosis for Electrical Distribution Systems using Structural Analysis

    DEFF Research Database (Denmark)

    Knüppel, Thyge; Blanke, Mogens; Østergaard, Jacob

    2014-01-01

As the grid changes, analytic redundancy relations (ARR) are likely to change. The algorithms used for diagnosis may need to change accordingly, and finding efficient methods for ARR generation is essential to employing fault-tolerant methods in the grid. Structural analysis (SA) is based on graph-theoretical results that offer to find analytic redundancies in large sets of equations from the structure (topology) of the equations alone. A salient feature is automated generation of redundancy relations. The method is indeed feasible in electrical networks, where circuit theory and network topology together formulate the constraints that define a structure graph. This paper shows how three-phase networks are modelled and analysed using structural methods, and it extends earlier results by showing how physical faults can be identified such that adequate remedial actions can be taken. The paper illustrates a feasible modelling technique for structural analysis.

  17. The GIS and analysis of earthquake damage distribution of the 1303 Hongtong M=8 earthquake

    Science.gov (United States)

    Gao, Meng-Tan; Jin, Xue-Shen; An, Wei-Ping; Lü, Xiao-Jian

    2004-07-01

A geographic information system (GIS) of the 1303 Hongtong M=8 earthquake has been established. Using the spatial analysis functions of GIS, the spatial distribution characteristics of damage and the isoseismals of the earthquake are studied. By comparison with a standard earthquake intensity attenuation relationship, an abnormal damage distribution of the earthquake is found, and the relationship of the abnormal distribution to tectonics, site conditions and basins is analyzed. The influence on ground motion exerted by the earthquake source and the underground structures near the source is also studied. The implications of the abnormal damage distribution for seismic zonation, anti-earthquake design, earthquake prediction and earthquake emergency response are discussed.

  18. Distributional Benefit Analysis of a National Air Quality Rule

    Directory of Open Access Journals (Sweden)

    Jin Huang

    2011-06-01

Under Executive Order 12898, the U.S. Environmental Protection Agency (EPA) must perform environmental justice (EJ) reviews of its rules and regulations. EJ analyses address the hypothesis that environmental disamenities are experienced disproportionately by poor and/or minority subgroups. Such analyses typically use communities as the unit of analysis. While community-based approaches make sense when considering where polluting sources locate, they are less appropriate for national air quality rules affecting many sources and pollutants that can travel thousands of miles. We compare exposures and health risks of EJ-identified individuals rather than communities to analyze EPA's Heavy Duty Diesel (HDD) rule as an example national air quality rule. Air pollutant exposures are estimated within grid cells by air quality models; all individuals in the same grid cell are assigned the same exposure. Using an inequality index, we find that inequality within racial/ethnic subgroups far outweighs inequality between them. We find, moreover, that the HDD rule leaves between-subgroup inequality essentially unchanged. Changes in health risks also depend on subgroups' baseline incidence rates, which differ across subgroups. Thus, health risk reductions may not follow the same pattern as reductions in exposure. These results are likely representative of other national air quality rules as well.
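The within- vs between-subgroup comparison relies on a decomposable inequality index; the record does not say which index was used, so the sketch below uses Theil's T, a standard choice that splits exactly into between- and within-group components. The exposure values and groups are invented.

```python
import numpy as np

def theil(x):
    """Theil's T inequality index of a positive-valued sample."""
    x = np.asarray(x, float)
    m = x.mean()
    return float(np.mean(x / m * np.log(x / m)))

def theil_decomposition(values, groups):
    """Split Theil's T into between-group and within-group components."""
    values = np.asarray(values, float)
    groups = np.asarray(groups)
    mu = values.mean()
    between, within = 0.0, 0.0
    for g in np.unique(groups):
        v = values[groups == g]
        share_val = v.sum() / values.sum()     # group's share of total exposure
        between += share_val * np.log(v.mean() / mu)
        within += share_val * theil(v)
    return between, within

# hypothetical exposures for two subgroups
vals = np.array([1.0, 2.0, 3.0, 10.0, 12.0, 14.0])
grps = np.array([0, 0, 0, 1, 1, 1])
b, w = theil_decomposition(vals, grps)
```

The two components sum exactly to the overall index, which is what lets a study state that within-subgroup inequality "far outweighs" the between-subgroup part.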

  19. Sensitivity Analysis of Dynamic Tariff Method for Congestion Management in Distribution Networks

    DEFF Research Database (Denmark)

    Huang, Shaojun; Wu, Qiuwei; Liu, Zhaoxi

    2015-01-01

The dynamic tariff (DT) method is designed for the distribution system operator (DSO) to alleviate congestions that might occur in a distribution network with high penetration of distributed energy resources (DERs). Sensitivity analysis of the DT method is crucial because of its decentralized control manner. The sensitivity analysis can obtain the changes of the optimal energy planning, and thereby of the line loading profiles, for infinitesimal changes of parameters by differentiating the KKT conditions of the convex quadratic program on which the DT method is based. Three case...
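Differentiating the KKT conditions of a convex QP amounts to solving the (linear) KKT system again with the differentiated right-hand side. The toy problem below illustrates the mechanics only; it is not the DT congestion model, and the QP data are invented.

```python
import numpy as np

# toy QP: min 0.5 * x'Qx + c(p)'x  s.t.  A x = b,  with c(p) = c0 + p * dc_dp
Q = np.eye(2)
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
c0 = np.zeros(2)
dc_dp = np.array([1.0, 0.0])

def solve_kkt(p):
    """Primal solution from the stationarity + feasibility (KKT) system."""
    K = np.block([[Q, A.T], [A, np.zeros((1, 1))]])
    rhs = np.concatenate([-(c0 + p * dc_dp), b])
    return np.linalg.solve(K, rhs)[:2]

# sensitivity dx/dp: differentiate the KKT system with respect to p
K = np.block([[Q, A.T], [A, np.zeros((1, 1))]])
dx_dp = np.linalg.solve(K, np.concatenate([-dc_dp, np.zeros(1)]))[:2]
```

Because the problem is a QP with a fixed active set, the sensitivity is exact: a finite-difference check of the solution against `dx_dp` agrees to rounding error, which is the property the DT sensitivity analysis exploits to propagate parameter changes into line loading profiles.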

  20. Parametric distribution approach for flow availability in small hydro potential analysis

    Science.gov (United States)

    Abdullah, Samizee; Basri, Mohd Juhari Mat; Jamaluddin, Zahrul Zamri; Azrulhisham, Engku Ahmad; Othman, Jamel

    2016-10-01

Small hydro systems are one of the important sources of renewable energy and have been recognized worldwide as a clean energy source. Small hydropower generation, which uses the potential energy in flowing water to produce electricity, is often questioned due to the inconsistent and intermittent nature of the power generated. Potential analysis of a small hydro system, which depends mainly on the availability of water, requires knowledge of the water flow or stream flow distribution. This paper presents the possibility of applying the Pearson system to approximate the stream flow availability distribution in a small hydro system. Considering the stochastic nature of stream flow, the Pearson parametric distribution approximation was computed from the direct correlation between the first four statistical moments of the distribution, a defining characteristic of the Pearson system. The advantage of applying these statistical moments in small hydro potential analysis is the ability to analyse the varying shapes of the stream flow distribution.
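The first four moments that drive the Pearson system are the mean, variance, skewness, and kurtosis; the pair (β1 = skew², β2 = kurtosis) locates a distribution on the Pearson plane and so selects the family. The flow record below is synthetic (a gamma shape is a common stand-in for skewed flow-duration behaviour), purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)
# hypothetical daily stream-flow record (m^3/s)
flows = rng.gamma(shape=2.0, scale=5.0, size=3650)

mean = flows.mean()
var = flows.var()
skew = np.mean((flows - mean) ** 3) / var ** 1.5
kurt = np.mean((flows - mean) ** 4) / var ** 2    # Pearson (non-excess) kurtosis

# beta1 and beta2 locate the flow distribution on the Pearson plane
beta1, beta2 = skew ** 2, kurt
```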

  1. SU-E-I-59: Investigation of the Usefulness of a Standard Deviation and Mammary Gland Density as Indexes for Mammogram Classification.

    Science.gov (United States)

    Takarabe, S; Yabuuchi, H; Morishita, J

    2012-06-01

    To investigate the usefulness of the standard deviation of pixel values in the whole mammary gland region and the percentage of the high-density mammary gland region relative to the whole mammary gland region as features for classifying mammograms into four categories based on the ACR BI-RADS breast composition. We used 36 digital mediolateral oblique view mammograms (18 patients) approved by our IRB. These images were classified into the four breast composition categories by an experienced breast radiologist, and the results of this classification were regarded as a gold standard. First, the whole mammary region of a breast was divided into two regions, a high-density mammary gland region and a low/iso-density mammary gland region, using a threshold value obtained from the pixel values of the pectoral muscle region. Then the percentage of the high-density mammary gland region relative to the whole mammary gland region was calculated. In addition, as a new method, the standard deviation of pixel values in the whole mammary gland region was calculated as an index of the intermingling of mammary glands and fat. Finally, all mammograms were classified using the combination of the high-density percentage and the standard deviation of each image. The agreement rate between our proposed method and the gold standard was 86% (31/36). This result indicates that our method has the potential to classify mammograms. The combination of the standard deviation of pixel values in the whole mammary gland region and the percentage of the high-density mammary gland region relative to the whole mammary gland region is useful as a feature set for classifying mammograms based on the ACR BI-RADS breast composition. © 2012 American Association of Physicists in Medicine.
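The two features described above, the standard deviation of gland-region pixel values and the percentage of the high-density region, reduce to a few lines of code. A hedged sketch, assuming the gland region is available as a flat list of pixel values and the threshold has already been derived from the pectoral muscle region (all names are hypothetical):

```python
import statistics

def breast_density_features(gland_pixels, threshold):
    """Return (std_dev, high_density_percentage) for a mammary gland region."""
    # spread of pixel values reflects the intermingling of glands and fat
    std_dev = statistics.pstdev(gland_pixels)
    # fraction of the gland region at or above the pectoral-muscle threshold
    high = sum(1 for p in gland_pixels if p >= threshold)
    pct_high = 100.0 * high / len(gland_pixels)
    return std_dev, pct_high
```

A classifier would then place each image in a BI-RADS composition category from this two-dimensional feature vector.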

  2. Advancing Collaborative Climate Studies through Globally Distributed Geospatial Analysis

    Science.gov (United States)

    Singh, R.; Percivall, G.

    2009-12-01

    Infrastructure and the broader GEOSS architecture. Of specific interest to this session is the work on geospatial workflows and geo-processing and data discovery and access. CCIP demonstrates standards-based interoperability between geospatial applications in the service of Climate Change analysis. CCIP is planned to be a yearly exercise. It consists of a network of online data services (WCS, WFS, SOS), analysis services (WPS, WCPS, WMS), and clients that exercise those services. In 2009, CCIP focuses on Australia, and the initial application of existing OGC services to climate studies. The results of the 2009 CCIP will serve as requirements for more complex geo-processing services to be developed for CCIP 2010. The benefits of CCIP include accelerating the implementation of the GCOS, and building confidence that implementations using multi-vendor interoperable technologies can help resolve vexing climate change questions. AIP-2: Architecture Implementation Pilot, Phase 2 CCIP: Climate Challenge Integration Plugfest GEO: Group on Earth Observations GEOSS: Global Earth Observing System of Systems GCOS: Global Climate Observing System OGC: Open Geospatial Consortium SOS: Sensor Observation Service WCS: Web Coverage Service WCPS: Web Coverage Processing Service WFS: Web Feature Service WMS: Web Mapping Service

  3. THE COMPARATIVE ANALYSIS OF TWO DIFFERENT STATISTICAL DISTRIBUTIONS USED TO ESTIMATE THE WIND ENERGY POTENTIAL

    Directory of Open Access Journals (Sweden)

    Mehmet KURBAN

    2007-01-01

    Full Text Available In this paper, the wind energy potential of the region is analyzed with the Weibull and Rayleigh statistical distribution functions, using wind speed data measured every 15 seconds in July, August, September, and October of 2005 at a height of 10 m on the 30-m observation pole of the wind observation station constructed within the scope of the scientific research project titled "The Construction of a Hybrid (Wind-Solar) Power Plant Model by Determining the Wind and Solar Potential in the Iki Eylul Campus of A.U.", supported by Anadolu University. The maximum likelihood method is used to find the parameters of these distributions. The analysis for the months considered shows that the Weibull distribution models the wind speeds better than the Rayleigh distribution. Furthermore, the error in the monthly power density values computed with the Weibull distribution is smaller than that computed with the Rayleigh distribution.

  4. Making distributed ALICE analysis simple using the GRID plug-in

    International Nuclear Information System (INIS)

    Gheata, A; Gheata, M

    2012-01-01

    We have developed an interface within the ALICE analysis framework that allows transparent usage of the experiment's distributed resources. This analysis plug-in makes it possible to configure back-end specific parameters from a single interface and to run the same custom user analysis unchanged in many computing environments, from local workstations to PROOF clusters or GRID resources. The tool is now used extensively in the ALICE collaboration for both end-user analysis and large scale productions.

  5. A combined approach for the enhancement and segmentation of mammograms using modified fuzzy C-means method in wavelet domain.

    Science.gov (United States)

    Srivastava, Subodh; Sharma, Neeraj; Singh, S K; Srivastava, R

    2014-07-01

    In this paper, a combined approach for the enhancement and segmentation of mammograms is proposed. In the preprocessing stage, a contrast limited adaptive histogram equalization (CLAHE) method is applied to improve the contrast of the mammograms. After this, the proposed combined methods are applied. In the first step of the proposed approach, a two-dimensional (2D) discrete wavelet transform (DWT) is applied to all the input images. In the second step, a proposed nonlinear complex diffusion based unsharp masking and crispening method is applied to the approximation coefficients of the wavelet transformed images to further highlight abnormalities such as micro-calcifications and tumours, and to reduce false positives (FPs). Thirdly, a modified fuzzy c-means (FCM) segmentation method is applied to the output of the second step. In the modified FCM method, mutual information is proposed as a similarity measure in place of the conventional Euclidean distance based dissimilarity measure for FCM segmentation. Finally, the inverse 2D-DWT is applied. The efficacy of the proposed unsharp masking and crispening method for image enhancement is evaluated in terms of signal-to-noise ratio (SNR), and that of the proposed segmentation method in terms of random index (RI), global consistency error (GCE), and variation of information (VoI). The performance of the proposed segmentation approach is compared with other commonly used segmentation approaches such as Otsu's thresholding, texture based segmentation, k-means, and FCM clustering, as well as thresholding. The obtained results show that the proposed segmentation approach performs better and requires less processing time than the standard FCM and the other segmentation methods considered.
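As a point of reference for the segmentation step, a bare-bones fuzzy c-means loop on scalar pixel values looks as follows. Note that the paper replaces the Euclidean dissimilarity used here with a mutual-information measure; this sketch keeps the classic distance for simplicity, and all names are ours:

```python
def fcm(values, c=2, m=2.0, iters=100):
    """Classic fuzzy c-means on a list of scalar values (fuzzifier m > 1)."""
    lo_v, hi_v = min(values), max(values)
    # spread the initial cluster centers across the value range
    centers = [lo_v + (hi_v - lo_v) * (i + 1) / (c + 1) for i in range(c)]
    for _ in range(iters):
        # membership update: u_ij = 1 / sum_k (d_ij / d_kj)^(2/(m-1))
        u = []
        for x in values:
            d = [abs(x - ctr) or 1e-12 for ctr in centers]  # avoid div by zero
            u.append([1.0 / sum((d[i] / d[k]) ** (2.0 / (m - 1.0))
                                for k in range(c))
                      for i in range(c)])
        # center update: membership-weighted mean with weights u^m
        centers = [sum(u[j][i] ** m * values[j] for j in range(len(values)))
                   / sum(u[j][i] ** m for j in range(len(values)))
                   for i in range(c)]
    return centers, u
```

Swapping the distance `d` for any other dissimilarity (as the paper does with mutual information) leaves the update structure unchanged.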

  6. Mapping 3D breast lesions from full-field digital mammograms using subject-specific finite element models

    Science.gov (United States)

    García, E.; Oliver, A.; Diaz, O.; Diez, Y.; Gubern-Mérida, A.; Martí, R.; Martí, J.

    2017-03-01

    Patient-specific finite element (FE) models of the breast have received increasing attention due to their potential capability of fusing images from different modalities. During the Magnetic Resonance Imaging (MRI) to X-ray mammography registration procedure, the FE model is compressed, mimicking the mammographic acquisition. Subsequently, suspicious lesions in the MRI volume can be projected into the 2D mammographic space. However, most registration algorithms do not provide the reverse mapping, preventing recovery of the 3D geometrical information of lesions localized in the mammograms. In this work we introduce a fast method to localize the 3D position of a lesion within the MRI, using both cranio-caudal (CC) and medio-lateral oblique (MLO) mammographic projections and indexing the tetrahedral elements of the biomechanical model by means of a uniform grid. For each marked lesion in the Full-Field Digital Mammogram (FFDM), the X-ray path from the source to the marker is calculated. Barycentric coordinates are computed in the tetrahedra traversed by the ray. The list of elements and coordinates allows two curves to be localized within the MRI, and the closest point between the two curves is taken as the 3D position of the lesion. The registration errors obtained in the mammographic space are 9.89 +/- 3.72 mm in the CC projection and 8.04 +/- 4.68 mm in the MLO projection, and the error in the 3D MRI space is 10.29 +/- 3.99 mm. The uniform grid is computed in 0.1 to 0.7 seconds, and the average time to compute the 3D location of a lesion is about 8 ms.
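The barycentric-coordinate computation is the geometric core of the method. A self-contained sketch for a single point and tetrahedron, solving p = a + u(b-a) + v(c-a) + w(d-a) by Cramer's rule (names are ours, and the paper's grid-accelerated traversal is omitted):

```python
def det3(m):
    """Determinant of a 3x3 matrix given as a list of rows."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def barycentric(p, a, b, c, d):
    """Barycentric coordinates of point p w.r.t. tetrahedron (a, b, c, d)."""
    cols = [[b[i] - a[i] for i in range(3)],
            [c[i] - a[i] for i in range(3)],
            [d[i] - a[i] for i in range(3)]]
    rhs = [p[i] - a[i] for i in range(3)]
    M = [[cols[0][i], cols[1][i], cols[2][i]] for i in range(3)]
    D = det3(M)

    def replaced(col, v):  # M with one column swapped for the RHS
        N = [row[:] for row in M]
        for i in range(3):
            N[i][col] = v[i]
        return N

    u = det3(replaced(0, rhs)) / D
    v = det3(replaced(1, rhs)) / D
    w = det3(replaced(2, rhs)) / D
    return (1.0 - u - v - w, u, v, w)
```

The point lies inside the tetrahedron exactly when all four coordinates are in [0, 1]; they always sum to 1, which is what makes them usable for interpolating a position along the X-ray path.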

  7. Image quality of a wet laser printer versus a paper printer for full-field digital mammograms.

    Science.gov (United States)

    Schueller, Gerd; Kaindl, Elisabeth; Matzek, Wolfgang K; Semturs, Friedrich; Schueller-Weidekamm, Claudia; Helbich, Thomas H

    2006-01-01

    The purpose of our study was to compare the image quality of a wet laser printer with that of a paper printer for full-field digital mammography (FFDM). For a wet laser printer and a paper printer, both connected to an FFDM system, image quality parameters (luminance density, dynamic range) were evaluated using a standardized printer test image. The detectability of standardized objects on a phantom was also evaluated. Furthermore, 640 mammograms of 80 patients with different breast tissue composition patterns were imaged with both printers. Subjective image quality parameters (brightness, contrast, and detection of details of anatomic structures, that is, skin, subcutis, musculature, glandular tissue, and fat), the detectability of breast lesions (masses, calcifications), and the diagnostic performance according to the BI-RADS classification were evaluated. Both the luminance density and the dynamic range were superior for the wet laser printer. More standardized objects were visible on the phantom imaged with the wet laser printer than with the paper printer (13/16 vs 11/16). Every subjective image quality parameter of the mammograms from the wet laser printer was rated superior to that of the paper printer. Significantly more breast lesions were detected on the wet laser printer images than on the paper printer images (masses, 13 vs 10; calcifications, 65 vs 48). On the paper printer images, BI-RADS 4 and 5 categories were underestimated for 10 (43.5%) of 23 patients. For FFDM, images obtained from a wet laser printer show superior objective and subjective image quality compared with those from a paper printer. As a consequence, the paper printer should not be used for FFDM.

  8. Hydrogen distribution analysis for CANDU 6 containment using the GOTHIC containment analysis code

    International Nuclear Information System (INIS)

    Nguyen, T.H.; Collins, W.M.

    1995-01-01

    Hydrogen may be generated in the reactor core by the zircaloy-steam reaction in a postulated loss of coolant accident (LOCA) scenario with loss of emergency core cooling (ECC). It is important to predict the hydrogen distribution within containment in order to determine whether flammable mixtures exist. This information is required to determine the best locations in containment for the placement of mitigation devices such as igniters and recombiners. For large break loss of coolant accidents, hydrogen is released after the break flow has subsided. Following this period of high discharge, the flow in the containment building undergoes a transition from forced flow to buoyancy-driven flow (particularly when local air coolers (LACS) are not credited). One-dimensional (lumped parameter) computer codes are applicable during the initial period, when a high degree of mixing occurs due to the forced flow generated by the break. However, during the post-blowdown phase the assumption of homogeneity becomes less accurate, and it is necessary to employ three-dimensional codes to capture local effects. This is particularly important for purely buoyant flows, which may exhibit stratification effects. In the present analysis a three-dimensional model of CANDU 6 containment was constructed with the GOTHIC computer code, using a relatively coarse mesh adequate to capture the salient features of the flow during the blowdown and hydrogen release periods. A 3D grid representation was employed for the portion of containment in which the primary flow (LOCA and post-LOCA) was deemed to occur; the remainder of containment was represented by lumped nodes. The results of the analysis indicate that flammable concentrations exist for several minutes in the vicinity of the break and in the steam generator enclosure, because the hydrogen released from the break is directed primarily upwards into the steam generator enclosure by buoyancy effects. Once hydrogen production ends

  9. Fissure formation in coke. 3: Coke size distribution and statistical analysis

    Energy Technology Data Exchange (ETDEWEB)

    D.R. Jenkins; D.E. Shaw; M.R. Mahoney [CSIRO, North Ryde, NSW (Australia). Mathematical and Information Sciences

    2010-07-15

    A model of coke stabilization, based on a fundamental model of fissuring during carbonisation, is used to demonstrate the applicability of the fissuring model to actual coke size distributions. The results indicate that the degree of stabilization is important in determining the size distribution. A modified form of the Weibull distribution is shown to provide a better representation of the whole coke size distribution than the Rosin-Rammler distribution, which is generally fitted only to the lump coke. A statistical analysis of a large number of experiments in a pilot scale coke oven shows reasonably good prediction of the coke mean size based on parameters related to blend rank, amount of low rank coal, fluidity, and ash. However, the prediction of measures of the spread of the size distribution is more problematic. The fissuring model, the size distribution representation, and the statistical analysis together provide a comprehensive capability for understanding and predicting the mean size and distribution of coke lumps produced during carbonisation. 12 refs., 16 figs., 4 tabs.

  10. Projection methods for the analysis of molecular-frame photoelectron angular distributions

    International Nuclear Information System (INIS)

    Lucchese, R.R.; Montuoro, R.; Grum-Grzhimailo, A.N.; Liu, X.-J.; Pruemper, G.; Morishita, Y.; Saito, N.; Ueda, K.

    2007-01-01

    The analysis of the molecular-frame photoelectron angular distributions (MFPADs) is discussed within the dipole approximation. The general expressions are reviewed and strategies for extracting the maximum amount of information from different types of experimental measurements are considered. The analysis of the N 1s photoionization of NO is given to illustrate the method

  11. Decomposition and Projection Methods for Distributed Robustness Analysis of Interconnected Uncertain Systems

    DEFF Research Database (Denmark)

    Pakazad, Sina Khoshfetrat; Hansson, Anders; Andersen, Martin Skovgaard

    2013-01-01

    We consider a class of convex feasibility problems where the constraints that describe the feasible set are loosely coupled. These problems arise in robust stability analysis of large, weakly interconnected uncertain systems. To facilitate distributed implementation of robust stability analysis o...

  12. Analysis of the influences of grid-connected PV power system on distribution grids

    Directory of Open Access Journals (Sweden)

    Dumitru Popandron

    2013-12-01

    Full Text Available This paper presents the analysis of producing 2.8 MW of electric power using a solar photovoltaic plant. The PV plant will be connected to the distribution network. The study focuses on the influence of connecting a photovoltaic system to the grid, using modern software for analysis, modeling, and simulation of power systems.

  13. Combining Static Analysis and Runtime Checking in Security Aspects for Distributed Tuple Spaces

    DEFF Research Database (Denmark)

    Yang, Fan; Aotani, Tomoyuki; Masuhara, Hidehiko

    2011-01-01

    Enforcing security policies to distributed systems is difficult, in particular, to a system containing untrusted components. We designed AspectKE*, an aspect-oriented programming language based on distributed tuple spaces to tackle this issue. One of the key features in AspectKE* is the program...... analysis predicates and functions that provide information on future behavior of a program. With a dual value evaluation mechanism that handles results of static analysis and runtime values at the same time, those functions and predicates enable the users to specify security policies in a uniform manner....... Our two-staged implementation strategy gathers fundamental static analysis information at load-time, so as to avoid performing all analysis at runtime. We built a compiler for AspectKE*, and successfully implemented security aspects for a distributed chat system and an electronic healthcare record...

  14. Just fracking: a distributive environmental justice analysis of unconventional gas development in Pennsylvania, USA

    Science.gov (United States)

    Clough, Emily; Bell, Derek

    2016-02-01

    This letter presents a distributive environmental justice analysis of unconventional gas development in the area of Pennsylvania lying over the Marcellus Shale, the largest shale gas formation in play in the United States. The extraction of shale gas using unconventional wells, which are hydraulically fractured (fracking), has increased dramatically since 2005. As the number of wells has grown, so have concerns about the potential public health effects on nearby communities. These concerns make shale gas development an environmental justice issue. This letter examines whether the hazards associated with proximity to wells and the economic benefits of shale gas production are fairly distributed. We distinguish two types of distributive environmental justice: traditional and benefit sharing. We ask the traditional question: are there a disproportionate number of minority or low-income residents in areas near to unconventional wells in Pennsylvania? However, we extend this analysis in two ways: we examine income distribution and level of education; and we compare before and after shale gas development. This contributes to discussions of benefit sharing by showing how the income distribution of the population has changed. We use a binary dasymetric technique to remap the data from the 2000 US Census and the 2009-2013 American Communities Survey and combine that data with a buffer containment analysis of unconventional wells to compare the characteristics of the population living nearer to unconventional wells with those further away before and after shale gas development. Our analysis indicates that there is no evidence of traditional distributive environmental injustice: there is not a disproportionate number of minority or low-income residents in areas near to unconventional wells. However, our analysis is consistent with the claim that there is benefit sharing distributive environmental injustice: the income distribution of the population nearer to shale gas wells

  15. Spherical Harmonic Analysis of Particle Velocity Distribution Function: Comparison of Moments and Anisotropies using Cluster Data

    Science.gov (United States)

    Gurgiolo, Chris; Vinas, Adolfo F.

    2009-01-01

    This paper presents a spherical harmonic analysis of the plasma velocity distribution function, using high angular, energy, and time resolution Cluster data obtained from the PEACE spectrometer instrument, to demonstrate how this analysis models the particle distribution function and its moments and anisotropies. The results show that spherical harmonic analysis produces a robust physical representation model of the velocity distribution function, resolving the main features of the measured distributions. From the spherical harmonic analysis, a minimum set of nine spectral coefficients was obtained, from which the moments (up to the heat flux), anisotropies, and asymmetries of the velocity distribution function were calculated. The spherical harmonic method provides a potentially effective "compression" technique that can easily be carried out onboard a spacecraft to determine the moments and anisotropies of the particle velocity distribution function for any species. These calculations were implemented using three different approaches: standard traditional integration, integration of the spherical harmonic (SPH) spectral coefficients, and singular value decomposition (SVD) on the spherical harmonics. A comparison among the methods shows that both the SPH and SVD approaches agree remarkably well with the standard moment integration method.

  16. Grid Databases for Shared Image Analysis in the MammoGrid Project

    CERN Document Server

    Amendolia, S R; Hauer, T; Manset, D; McClatchey, R; Odeh, M; Reading, T; Rogulin, D; Schottlander, D; Solomonides, T

    2004-01-01

    The MammoGrid project aims to prove that Grid infrastructures can be used for collaborative clinical analysis of database-resident but geographically distributed medical images. This requires: a) the provision of a clinician-facing front-end workstation and b) the ability to service real-world clinician queries across a distributed and federated database. The MammoGrid project will prove the viability of the Grid by harnessing its power to enable radiologists from geographically dispersed hospitals to share standardized mammograms, to compare diagnoses (with and without computer aided detection of tumours) and to perform sophisticated epidemiological studies across national boundaries. This paper outlines the approach taken in MammoGrid to seamlessly connect radiologist workstations across a Grid using an "information infrastructure" and a DICOM-compliant object model residing in multiple distributed data stores in Italy and the UK

  17. An approach to prospective consequential life cycle assessment and net energy analysis of distributed electricity generation

    International Nuclear Information System (INIS)

    Jones, Christopher; Gilbert, Paul; Raugei, Marco; Mander, Sarah; Leccisi, Enrica

    2017-01-01

    Increasing distributed renewable electricity generation is one of a number of technology pathways available to policy makers to meet environmental and other sustainability goals. Determining the efficacy of such a pathway for a national electricity system implies evaluating whole system change in future scenarios. Life cycle assessment (LCA) and net energy analysis (NEA) are two methodologies suitable for prospective and consequential analysis of energy performance and associated impacts. This paper discusses the benefits and limitations of prospective and consequential LCA and NEA analysis of distributed generation. It concludes that a combined LCA and NEA approach is a valuable tool for decision makers if a number of recommendations are addressed. Static and dynamic temporal allocation are both needed for a fair comparison of distributed renewables with thermal power stations to account for their different impact profiles over time. The trade-offs between comprehensiveness and uncertainty in consequential analysis should be acknowledged, with system boundary expansion and system simulation models limited to those clearly justified by the research goal. The results of this approach are explorative, rather than for accounting purposes; this interpretive remit, and the assumptions in scenarios and system models on which results are contingent, must be clear to end users. - Highlights: • A common LCA and NEA framework for prospective, consequential analysis is discussed. • Approach to combined LCA and NEA of distributed generation scenarios is proposed. • Static and dynamic temporal allocation needed to assess distributed generation uptake.

  18. A new measure of uncertainty importance based on distributional sensitivity analysis for PSA

    International Nuclear Information System (INIS)

    Han, Seok Jung; Tak, Nam Il; Chun, Moon Hyun

    1996-01-01

    The main objective of the present study is to propose a new measure of uncertainty importance based on distributional sensitivity analysis. The new measure utilizes a metric distance obtained from cumulative distribution functions (cdfs). The measure is evaluated for two cases: one where the cdf is given by a known analytical distribution, and the other where it is given by an empirical distribution generated by a crude Monte Carlo simulation. To study its applicability, the present measure has been applied to two different cases, and the results are compared with those of three existing methods. The present approach provides a useful cdf-based measure of uncertainty importance; it is simple, and uncertainty importance can be calculated without any complex process. On the basis of the results obtained in the present work, the method is recommended as a tool for the analysis of uncertainty importance.
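As an illustration of a cdf-based metric distance (a Kolmogorov-style sup-distance, not necessarily the paper's exact metric), the distance between two empirical cdfs can be computed as:

```python
def ecdf_distance(xs, ys):
    """Sup-distance between the empirical cdfs of two samples."""
    pts = sorted(set(xs) | set(ys))  # cdfs can only differ at sample points

    def F(sample, t):  # empirical cdf evaluated at t
        return sum(1 for v in sample if v <= t) / len(sample)

    return max(abs(F(xs, t) - F(ys, t)) for t in pts)
```

In an importance ranking, an input whose perturbation produces a larger cdf distance for the output would be scored as more important.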

  19. Evaluating Domestic Hot Water Distribution System Options with Validated Analysis Models

    Energy Technology Data Exchange (ETDEWEB)

    Weitzel, E. [Alliance for Residential Building Innovation, Davis, CA (United States); Hoeschele, E. [Alliance for Residential Building Innovation, Davis, CA (United States)

    2014-09-01

    A developing body of work collects data on domestic hot water consumption, water use behaviors, and the energy efficiency of various distribution systems. A full distribution system model, developed in the Transient System Simulation Tool (TRNSYS), has been validated using field monitoring data and then exercised in a number of climates to understand the impact of climate on performance. In this study, the Building America team built upon previous analysis modeling work to evaluate different distribution systems and the sensitivity of water heating energy and water use efficiency to variations in climate, load, distribution type, insulation, and compact plumbing practices. Overall, 124 different TRNSYS models were simulated. The results of this work are useful for informing future development of water heating best practice guides, as well as more accurate (and computationally efficient) distribution models for annual whole-house simulation programs.

  20. Nonlinear analysis of field distribution in electric motor with periodicity conditions

    Energy Technology Data Exchange (ETDEWEB)

    Stabrowski, M M; Sikora, J

    1981-01-01

    Numerical analysis of the electromagnetic field distribution in a linear-motion tubular electric motor has been performed with the aid of the finite element method. Two Fortran programmes for the solution of DBBF and BF large linear symmetric equation systems have been developed for this analysis. A new iterative algorithm, taking into account iron nonlinearity and periodicity conditions, has been introduced. The final results of the analysis, in the form of induction diagrams and the motor driving force, are directly useful to motor designers.

  1. Projection methods for the analysis of molecular-frame photoelectron angular distributions

    International Nuclear Information System (INIS)

    Grum-Grzhimailo, A.N.; Lucchese, R.R.; Liu, X.-J.; Pruemper, G.; Morishita, Y.; Saito, N.; Ueda, K.

    2007-01-01

    A projection method is developed for extracting the nondipole contribution from the molecular frame photoelectron angular distributions of linear molecules. A corresponding convenient parametric form for the angular distributions is derived. The analysis was performed for the N 1s photoionization of the NO molecule a few eV above the ionization threshold. No detectable nondipole contribution was found for the photon energy of 412 eV

  2. Advanced Hydroinformatic Techniques for the Simulation and Analysis of Water Supply and Distribution Systems

    OpenAIRE

    Herrera, Manuel; Meniconi, Silvia; Alvisi, Stefano; Izquierdo, Joaquin

    2018-01-01

    This document is intended as a presentation of the Special Issue “Advanced Hydroinformatic Techniques for the Simulation and Analysis of Water Supply and Distribution Systems”. The final aim of this Special Issue is to propose a suitable framework supporting insightful hydraulic mechanisms to aid the decision-making processes of water utility managers and practitioners. Its 18 peer-reviewed articles present topics as varied as: water distribution system design, optimization of network perf...

  3. The unequal distribution of unequal pay - An empirical analysis of the gender wage gap in Switzerland

    OpenAIRE

    Dorothe Bonjour; Michael Gerfin

    2001-01-01

    In this paper we analyze the distribution of the gender wage gap. Using microdata for Switzerland we estimate conditional wage distribution functions and find that the total wage gap and its discrimination component are not constant over the range of wages. At low wages an overproportional part of the wage gap is due to discrimination. In a further analysis of specific individuals we examine the wage gap at different quantiles and propose a new measure to assess equal earnings opportunities. ...

  4. Vibration analysis of continuous maglev guideways with a moving distributed load model

    International Nuclear Information System (INIS)

    Teng, N G; Qiao, B P

    2008-01-01

    A moving distributed load model with constant speed is established for vertical vibration analysis of a continuous guideway in a maglev transportation system. The guideway is considered a continuous structural system, and the action of maglev vehicles on the guideway is modeled as a moving distributed load. Vibration of the continuous guideways used in the Shanghai maglev line is analyzed with this model. The factors that affect the vibration of the guideways, such as speed, guideway span, frequency, and damping, are discussed.

  5. The Analysis of Tree Species Distribution Information Extraction and Landscape Pattern Based on Remote Sensing Images

    Directory of Open Access Journals (Sweden)

    Yi Zeng

    2017-08-01

    Full Text Available The forest ecosystem is the largest terrestrial vegetation type and plays an irreplaceable role of unique value. At the landscape scale, research on forest landscape patterns has become a current hot spot, within which the study of forest canopy structure is especially important: canopy structure determines the process and strength of forest energy flow, which in turn influences the ecosystem's adjustment to climate and species diversity to some extent. Extracting the factors that influence canopy structure and analyzing the vegetation distribution pattern are therefore essential. To address these problems, remote sensing technology, which is superior to other technical means because of its timeliness and large-scale monitoring capability, is applied in this study. Taking Lingkong Mountain as the study area, the paper uses remote sensing images to analyze the forest distribution pattern and obtain the spatial characteristics of the canopy structure distribution, with DEM data as the basic data from which the factors influencing canopy structure are extracted. The tree distribution pattern is further analyzed using terrain parameters, spatial analysis tools, and quantitative simulation of surface processes. The Hydrological Analysis tool is used to build a distributed hydrological model, and a corresponding algorithm is applied to determine surface water flow paths, the river network, and basin boundaries. Results show that the distributions of the dominant tree species form patches at the landscape scale and exhibit spatial heterogeneity that is closely related to terrain factors. After overlay analysis of aspect, slope, and the forest distribution pattern, the areas most suitable for stand growth and the best living conditions are obtained.
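The flow-direction step of such a distributed hydrological model is commonly implemented with the D8 rule: each cell drains to the steepest-downslope of its eight neighbours. A simplified sketch on a small elevation grid (the Hydrological Analysis tool's actual algorithm may differ; names are ours):

```python
import math

def d8_direction(dem, r, c):
    """Return (dr, dc) of the steepest downslope neighbour, or None for a sink."""
    best, best_slope = None, 0.0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue
            rr, cc = r + dr, c + dc
            if 0 <= rr < len(dem) and 0 <= cc < len(dem[0]):
                dist = math.hypot(dr, dc)  # diagonals are sqrt(2) away
                slope = (dem[r][c] - dem[rr][cc]) / dist
                if slope > best_slope:
                    best, best_slope = (dr, dc), slope
    return best
```

Following these directions cell to cell traces the surface flow paths, and accumulating the counts of upstream cells yields the river network and basin boundaries mentioned above.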

  6. AMIC: an expandable integrated analog front-end for light distribution moments analysis

    OpenAIRE

    SPAGGIARI, MICHELE; Herrero Bosch, Vicente; Lerche, Christoph Werner; Aliaga Varea, Ramón José; Monzó Ferrer, José María; Gadea Gironés, Rafael

    2011-01-01

    In this article we introduce AMIC (Analog Moments Integrated Circuit), a novel analog Application Specific Integrated Circuit (ASIC) front-end for Positron Emission Tomography (PET) applications. Its working principle is based on mathematical analysis of light distribution through moments calculation. Each moment provides useful information about light distribution, such as energy, position, depth of interaction, skewness (deformation due to border effect) etc. A current buffer delivers a cop...
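
    The AMIC front-end computes moments of the light distribution; the quantities named in the abstract (energy, position, skewness) are the 0th, 1st and standardised 3rd moments. A small numpy sketch of those calculations on a sampled 1-D light profile (the Gaussian spot below is illustrative, not the ASIC's analog implementation):

```python
import numpy as np

def light_moments(positions, signal):
    """Raw and central moments of a sampled light distribution."""
    energy = signal.sum()                                 # 0th moment: total light
    centroid = (positions * signal).sum() / energy        # 1st moment: position
    var = ((positions - centroid) ** 2 * signal).sum() / energy   # 2nd central
    skew = ((positions - centroid) ** 3 * signal).sum() / energy / var ** 1.5
    return energy, centroid, var, skew

# Symmetric Gaussian-like light spot centred at x = 2.0 (arbitrary units).
x = np.linspace(0, 4, 81)
s = np.exp(-((x - 2.0) ** 2) / (2 * 0.3 ** 2))
e, c, v, sk = light_moments(x, s)
print(round(c, 3), round(sk, 3))  # centroid ~2.0, skewness ~0 (symmetric spot)
```

A spot deformed by a border effect would shift the centroid and produce a nonzero skewness, which is what the ASIC exploits.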

  7. Determinants of the distribution and concentration of biogas production in Germany. A spatial econometric analysis

    International Nuclear Information System (INIS)

    Scholz, Lukas

    2015-01-01

    The biogas production in Germany is characterized by a heterogeneous distribution and the formation of regional centers. In the present study the determinants of the spatial distribution and concentration are analyzed with methods of spatial statistics and spatial econometrics. In addition to the consideration of "classic" site factors of agricultural production, the analysis here focuses on the possible relevance of agglomeration effects. The results of the work contribute to a better understanding of the regional distribution and concentration of the biogas production in Germany. [de
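
    The abstract does not name its spatial statistics, but the standard starting point for detecting regional clustering of this kind is global Moran's I. A minimal numpy sketch on four hypothetical regions with rook-contiguity weights (illustrative data, not the study's):

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I for values on n spatial units, given an n x n
    spatial weight matrix (w[i, j] > 0 if units i and j are neighbours)."""
    z = values - values.mean()
    num = (weights * np.outer(z, z)).sum()
    return len(values) / weights.sum() * num / (z ** 2).sum()

# 4 regions along a line, rook contiguity.
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
clustered = np.array([1.0, 2.0, 3.0, 4.0])      # similar neighbours -> I > 0
dispersed = np.array([1.0, 4.0, 1.0, 4.0])      # alternating values -> I < 0
print(morans_i(clustered, W) > 0, morans_i(dispersed, W) < 0)  # True True
```

Positive I indicates the kind of regional concentration of biogas production the study describes; a spatial econometric model would then add site factors and agglomeration terms as regressors.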

  8. Powerlaw: a Python package for analysis of heavy-tailed distributions.

    Science.gov (United States)

    Alstott, Jeff; Bullmore, Ed; Plenz, Dietmar

    2014-01-01

    Power laws are theoretically interesting probability distributions that are also frequently used to describe empirical data. In recent years, effective statistical methods for fitting power laws have been developed, but appropriate use of these techniques requires significant programming and statistical insight. In order to greatly decrease the barriers to using good statistical methods for fitting power law distributions, we developed the powerlaw Python package. This software package provides easy commands for basic fitting and statistical analysis of distributions. Notably, it also seeks to support a variety of user needs by being exhaustive in the options available to the user. The source code is publicly available and easily extensible.
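
    For readers without the package at hand, the core fit that powerlaw performs is the continuous maximum-likelihood estimate of the exponent (the Clauset-Shalizi-Newman estimator). A self-contained numpy sketch of that estimator on a synthetic Pareto sample, not the package's own code:

```python
import numpy as np

def fit_power_law_alpha(data, xmin):
    """Continuous MLE of the power-law exponent alpha for the tail
    x >= xmin: alpha = 1 + n / sum(ln(x / xmin))."""
    tail = data[data >= xmin]
    return 1.0 + len(tail) / np.log(tail / xmin).sum()

# Synthetic sample with known alpha = 2.5, xmin = 1 (inverse-CDF sampling).
rng = np.random.default_rng(0)
u = rng.random(20000)
sample = (1.0 - u) ** (-1.0 / 1.5)   # CDF: F(x) = 1 - x^-(alpha - 1)
alpha_hat = fit_power_law_alpha(sample, xmin=1.0)
print(round(alpha_hat, 2))  # close to 2.5
```

The powerlaw package wraps this estimator together with xmin selection, goodness-of-fit tests, and likelihood-ratio comparisons against alternatives such as the lognormal.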

  9. Power distribution, the environment, and public health. A state-level analysis

    International Nuclear Information System (INIS)

    Boyce, James K.; Klemer, Andrew R.; Templet, Paul H.; Willis, Cleve E.

    1999-01-01

    This paper examines relationships among power distribution, the environment, and public health by means of a cross-sectional analysis of the 50 US states. A measure of inter-state variations in power distribution is derived from data on voter participation, tax fairness, Medicaid access, and educational attainment. We develop and estimate a recursive model linking the distribution of power to environmental policy, environmental stress, and public health. The results support the hypothesis that greater power inequality leads to weaker environmental policies, which in turn lead to greater environmental degradation and to adverse public health outcomes.
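
    A recursive (triangular) model of the kind described can be estimated equation by equation with OLS, since each equation contains only predetermined variables. A numpy sketch on synthetic data with assumed coefficients (-0.8, -0.6, -0.5 are illustrative, not the paper's estimates):

```python
import numpy as np

def ols(y, X):
    """OLS coefficients, intercept first."""
    Xc = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(Xc, y, rcond=None)[0]

rng = np.random.default_rng(1)
n = 500
power_ineq = rng.normal(size=n)                               # power inequality
policy = -0.8 * power_ineq + rng.normal(scale=0.3, size=n)    # env. policy
stress = -0.6 * policy + rng.normal(scale=0.3, size=n)        # env. stress
health = -0.5 * stress + rng.normal(scale=0.3, size=n)        # public health

# Stage-by-stage OLS is consistent for a recursive system because no
# equation's regressors depend on its own error term.
b_policy = ols(policy, power_ineq)
b_stress = ols(stress, policy)
b_health = ols(health, stress)
print(round(b_policy[1], 1), round(b_stress[1], 1), round(b_health[1], 1))
```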

  10. Power distribution, the environment, and public health. A state-level analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boyce, James K. [Department of Economics, University of Massachusetts, Amherst, MA 01003 (United States); Klemer, Andrew R. [Department of Biology, University of Minnesota, Duluth, MN (United States); Templet, Paul H. [Institute of Environmental Studies, Louisiana State University, Baton Rouge, LA (United States); Willis, Cleve E. [Department of Resource Economics, University of Massachusetts, Amherst, MA 01003 (United States)

    1999-04-15

    This paper examines relationships among power distribution, the environment, and public health by means of a cross-sectional analysis of the 50 US states. A measure of inter-state variations in power distribution is derived from data on voter participation, tax fairness, Medicaid access, and educational attainment. We develop and estimate a recursive model linking the distribution of power to environmental policy, environmental stress, and public health. The results support the hypothesis that greater power inequality leads to weaker environmental policies, which in turn lead to greater environmental degradation and to adverse public health outcomes.

  12. Geographic distribution of suicide and railway suicide in Belgium, 2008-2013: a principal component analysis.

    Science.gov (United States)

    Strale, Mathieu; Krysinska, Karolina; Overmeiren, Gaëtan Van; Andriessen, Karl

    2017-06-01

    This study investigated the geographic distribution of suicide and railway suicide in Belgium over 2008-2013 at the local (i.e., district or arrondissement) level. There were differences in the regional distribution of suicide and railway suicide in Belgium over the study period. Principal component analysis identified three groups of correlations among population variables and socio-economic indicators, such as population density, unemployment, and age group distribution, on two components that helped explain the variance of railway suicide at the local (arrondissement) level. This information is of particular importance for preventing suicides in high-risk areas on the Belgian railway network.
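
    Principal component analysis on standardised district-level indicators can be computed directly via the SVD. A numpy sketch with synthetic indicators (the variable names and the shared latent factor are illustrative assumptions, not the study's data):

```python
import numpy as np

def pca(X):
    """PCA of standardised data via SVD: returns scores, component
    loadings, and explained-variance ratios."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    var = s ** 2 / (len(X) - 1)
    return U * s, Vt, var / var.sum()

# Synthetic district indicators: density and unemployment share a latent
# factor; age structure is independent.
rng = np.random.default_rng(2)
latent = rng.normal(size=40)
X = np.column_stack([
    2.0 * latent + rng.normal(scale=0.2, size=40),   # population density
    1.5 * latent + rng.normal(scale=0.2, size=40),   # unemployment
    rng.normal(size=40),                             # age structure
])
scores, comps, ratio = pca(X)
print(ratio[0] > 0.6)  # True: the first component captures the shared variation
```

The components can then be regressed against local railway-suicide counts, as in the study's variance-explanation step.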

  13. Powerlaw: a Python package for analysis of heavy-tailed distributions.

    Directory of Open Access Journals (Sweden)

    Jeff Alstott

    Full Text Available Power laws are theoretically interesting probability distributions that are also frequently used to describe empirical data. In recent years, effective statistical methods for fitting power laws have been developed, but appropriate use of these techniques requires significant programming and statistical insight. In order to greatly decrease the barriers to using good statistical methods for fitting power law distributions, we developed the powerlaw Python package. This software package provides easy commands for basic fitting and statistical analysis of distributions. Notably, it also seeks to support a variety of user needs by being exhaustive in the options available to the user. The source code is publicly available and easily extensible.

  14. Development of neural network for analysis of local power distributions in BWR fuel bundles

    International Nuclear Information System (INIS)

    Tanabe, Akira; Yamamoto, Toru; Shinfuku, Kimihiro; Nakamae, Takuji.

    1993-01-01

    A neural network model has been developed to learn the local power distributions in a BWR fuel bundle. A two-layer neural network with 128 elements in total is used for this model. The network learns 33 cases of local power peaking factors of fuel rods with a given enrichment distribution as the teacher signals, which were calculated by a fuel bundle nuclear analysis code based on precise physical models. The network learned the teacher signals to within 1% error. It is also able to calculate the local power distributions to within a few percent error for enrichment distributions that differ from the teacher signals, provided the average enrichment is close to 2%. The neural network is simple, and its computing speed is 300 times faster than that of the precise nuclear analysis code. The model was applied to search for an enrichment distribution that meets a target local power distribution in a fuel bundle, and an enrichment distribution with a flat power shape was obtained within a short computing time. (author)
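
    The abstract does not specify the network's architecture beyond "two layers"; the sketch below shows the general idea with a small one-hidden-layer regression network trained by gradient descent on a toy surrogate mapping (4 enrichment values to a scalar peaking factor). The data, sizes, and target function are all assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy stand-in: 33 "bundles" of 4 enrichment values, and a peaking factor
# produced by a smooth nonlinear surrogate function.
X = rng.uniform(1.0, 3.0, size=(33, 4))
y = 1.0 + 0.2 * np.tanh(X @ np.array([0.5, -0.3, 0.4, -0.2]))

W1 = rng.normal(scale=0.1, size=(4, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.1, size=8); b2 = 0.0

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

_, pred = forward(X)
loss0 = np.mean((pred - y) ** 2)
lr = 0.05
for _ in range(2000):
    h, pred = forward(X)
    err = pred - y                          # dL/dpred up to a constant factor
    gW2 = h.T @ err / len(y); gb2 = err.mean()
    dh = np.outer(err, W2) * (1 - h ** 2)   # back-prop through tanh
    gW1 = X.T @ dh / len(y); gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2
_, pred = forward(X)
print(np.mean((pred - y) ** 2) < loss0)  # True: the fit improved
```

Once trained, such a network is cheap to evaluate, which is what makes the inverse search over enrichment distributions fast.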

  15. Is Middle-Upper Arm Circumference "normally" distributed? Secondary data analysis of 852 nutrition surveys.

    Science.gov (United States)

    Frison, Severine; Checchi, Francesco; Kerac, Marko; Nicholas, Jennifer

    2016-01-01

    Wasting is a major public health issue throughout the developing world. Out of the 6.9 million estimated deaths among children under five annually, over 800,000 deaths (11.6 %) are attributed to wasting. Wasting is quantified as low Weight-For-Height (WFH) and/or low Mid-Upper Arm Circumference (MUAC) (since 2005). Many statistical procedures are based on the assumption that the data used are normally distributed. Analyses have been conducted on the distribution of WFH, but there are no equivalent studies on the distribution of MUAC. This secondary data analysis assesses the normality of the MUAC distributions of 852 cross-sectional nutrition survey datasets of children 6 to 59 months old and examines different approaches to normalising "non-normal" distributions. The distribution of MUAC showed no departure from a normal distribution in 319 (37.7 %) distributions using the Shapiro-Wilk test. Of the 533 surveys showing departure from a normal distribution, 183 (34.3 %) were skewed (D'Agostino test) and 196 (36.8 %) had a kurtosis different from that of the normal distribution (Anscombe-Glynn test). Testing for normality can be sensitive to data quality, design effect and sample size. Of the 533 surveys showing departure from a normal distribution, 294 (55.2 %) showed high digit preference, 164 (30.8 %) had a large design effect, and 204 (38.3 %) a large sample size. Spline and LOESS smoothing techniques were explored, and both work well. After Spline smoothing, 56.7 % of the MUAC distributions showing departure from normality were "normalised", and 59.7 % after LOESS. Box-Cox power transformation had similar results, with 57 % of such distributions approximating "normal" after transformation. Applying a Box-Cox transformation after Spline or LOESS smoothing increased that proportion to 82.4 % and 82.7 %, respectively. This suggests that statistical approaches relying on the
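
    The Box-Cox step used in the study can be sketched with numpy alone: choose the power parameter lambda by maximising the Box-Cox profile log-likelihood, then check that the transform reduces skewness. The right-skewed lognormal sample below stands in for a skewed MUAC distribution; it is not survey data.

```python
import numpy as np

def skewness(x):
    z = (x - x.mean()) / x.std()
    return (z ** 3).mean()

def boxcox(x, lam):
    return np.log(x) if lam == 0 else (x ** lam - 1) / lam

def best_boxcox_lambda(x, grid=np.linspace(-2, 2, 81)):
    """Pick lambda maximising the Box-Cox profile log-likelihood."""
    n = len(x)
    def loglik(lam):
        y = boxcox(x, lam)
        return -n / 2 * np.log(y.var()) + (lam - 1) * np.log(x).sum()
    return max(grid, key=loglik)

# Right-skewed (lognormal) sample as a stand-in for a skewed distribution.
rng = np.random.default_rng(4)
x = np.exp(rng.normal(size=2000))
lam = best_boxcox_lambda(x)
print(abs(skewness(boxcox(x, lam))) < abs(skewness(x)))  # True
```

For a lognormal sample the search settles near lambda = 0, i.e. the log transform, which normalises it exactly.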

  16. Estimation of monthly solar radiation distribution for solar energy system analysis

    International Nuclear Information System (INIS)

    Coskun, C.; Oktay, Z.; Dincer, I.

    2011-01-01

    The concept of probability density frequency, successfully used for analyses of wind speed and outdoor temperature distributions, is now modified and proposed for estimating solar radiation distributions for the design and analysis of solar energy systems. In this study, the global solar radiation distribution is comprehensively analyzed for photovoltaic (PV) panel and thermal collector systems. A case study is conducted with actual global solar irradiation data from the last 15 years recorded by the Turkish State Meteorological Service. It is found that the intensity of global solar irradiance greatly affects energy and exergy efficiencies and hence the performance of collectors. -- Research highlights: → The first study to apply the global solar radiation distribution to solar energy system analyses. → The first study to present the global solar radiation distribution as a function of solar irradiance intensity. → Time probability intensity frequency and probability power distribution do not share similar distribution patterns for each month. → There is no relation between the distribution of annual time lapse and solar energy with respect to the intensity of solar irradiance.
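
    A probability density frequency of irradiance can be built by binning measured irradiance values into intensity bands and normalising so the densities integrate to one. The hourly values below are synthetic placeholders for the meteorological records the study uses:

```python
import numpy as np

# Hypothetical hourly global solar irradiance for one month (W/m^2);
# in practice these come from measured meteorological records.
rng = np.random.default_rng(5)
irradiance = rng.gamma(shape=2.0, scale=200.0, size=720)

# Probability density frequency: fraction of time per intensity band,
# normalised so the density integrates to 1 over the binned range.
bins = np.arange(0, 1401, 100)
density, edges = np.histogram(irradiance, bins=bins, density=True)
print(np.isclose((density * np.diff(edges)).sum(), 1.0))  # True
```

Weighting collector efficiency by this density, band by band, gives the intensity-dependent performance estimate the abstract describes.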

  17. Size distribution of magnetic iron oxide nanoparticles using Warren-Averbach XRD analysis

    Science.gov (United States)

    Mahadevan, S.; Behera, S. P.; Gnanaprakash, G.; Jayakumar, T.; Philip, J.; Rao, B. P. C.

    2012-07-01

    We use the Fourier-transform-based Warren-Averbach (WA) analysis to separate the contributions of X-ray diffraction (XRD) profile broadening due to crystallite size and microstrain for magnetic iron oxide nanoparticles. The profile shape of the column length distribution, obtained from the WA analysis, is used to analyze the shape of the magnetic iron oxide nanoparticles. From the column length distribution, the crystallite size and its distribution are estimated for these nanoparticles and compared with the size distribution obtained from dynamic light scattering measurements. The crystallite size and size distribution obtained from the WA analysis are explained in terms of the experimental parameters employed in preparing these magnetic iron oxide nanoparticles. The variation of the volume-weighted diameter (Dv, from WA analysis) with saturation magnetization (Ms) fits well to a core-shell model, wherein Ms = Mbulk(1 - 6g/Dv), with Mbulk the bulk magnetization of iron oxide and g the thickness of the magnetically disordered shell.
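
    The core-shell relation Ms = Mbulk(1 - 6g/Dv) is linear in 1/Dv, so a straight-line fit of Ms against 1/Dv recovers Mbulk from the intercept and the shell thickness g from the slope. A numpy sketch on synthetic values (Mbulk = 80 and g = 1 are assumed for illustration, not the paper's results):

```python
import numpy as np

Mbulk, g = 80.0, 1.0                                  # assumed, for illustration
Dv = np.array([8.0, 10.0, 12.0, 15.0, 20.0])          # diameters, nm
Ms = Mbulk * (1 - 6 * g / Dv) \
     + np.random.default_rng(6).normal(scale=0.3, size=5)  # noisy "measurements"

# Ms = Mbulk - (6 * g * Mbulk) * (1/Dv): slope and intercept give both unknowns.
slope, intercept = np.polyfit(1.0 / Dv, Ms, 1)
Mbulk_hat = intercept
g_hat = -slope / (6 * intercept)
print(round(Mbulk_hat, 1), round(g_hat, 2))  # close to 80 and 1
```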

  18. A subchannel and CFD analysis of void distribution for the BWR fuel bundle test benchmark

    International Nuclear Information System (INIS)

    In, Wang-Kee; Hwang, Dae-Hyun; Jeong, Jae Jun

    2013-01-01

    Highlights: ► We analyzed subchannel void distributions using subchannel, system and CFD codes. ► The mean error and standard deviation at steady states were compared. ► The deviation of the CFD simulation was greater than those of the others. ► The large deviation of the CFD prediction is due to interface model uncertainties. -- Abstract: The subchannel grade and microscopic void distributions in the NUPEC (Nuclear Power Engineering Corporation) BFBT (BWR Full-Size Fine-Mesh Bundle Tests) facility have been evaluated with a subchannel analysis code MATRA, a system code MARS and a CFD code CFX-10. Sixteen test series from five different test bundles were selected for the analysis of the steady-state subchannel void distributions. Four test cases for a high burn-up 8 × 8 fuel bundle with a single water rod were simulated using CFX-10 for the microscopic void distribution benchmark. Two transient cases, a turbine trip without a bypass as a typical power transient and a re-circulation pump trip as a flow transient, were also chosen for this analysis. It was found that the steady-state void distributions calculated by both the MATRA and MARS codes coincided well with the measured data in the range of thermodynamic qualities from 5 to 25%. The results of the transient calculations were also similar to each other and very reasonable. The CFD simulation reproduced the overall radial void distribution trend which produces less vapor in the central part of the bundle and more vapor in the periphery. However, the predicted variation of the void distribution inside the subchannels is small, while the measured one is large showing a very high concentration in the center of the subchannels. The variations of the void distribution between the center of the subchannels and the subchannel gap are estimated to be about 5–10% for the CFD prediction and more than 20% for the experiment
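
    The benchmark's figures of merit, the mean error and its standard deviation between predicted and measured void fractions, reduce to a short calculation. The void-fraction values below are invented placeholders, not BFBT data:

```python
import numpy as np

# Hypothetical predicted vs measured subchannel void fractions (%).
measured = np.array([12.0, 18.5, 25.0, 31.2, 40.1])
predicted = np.array([11.4, 19.2, 24.1, 32.8, 38.9])

error = predicted - measured
print(round(error.mean(), 2), round(error.std(ddof=1), 2))  # -0.08 1.19
```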

  19. A density distribution algorithm for bone incorporating local orthotropy, modal analysis and theories of cellular solids.

    Science.gov (United States)

    Impelluso, Thomas J

    2003-06-01

    An algorithm for bone remodeling is presented which allows for both a redistribution of density and a continuous change of the principal material directions of the orthotropic material properties of bone. It employs a modal analysis to add density for growth and a local effective-strain-based analysis to redistribute density. General redistribution functions are presented. The model utilizes theories of cellular solids to relate density and strength. The code predicts the same general density distributions and local orthotropy as observed in reality.

  20. Reliability Worth Analysis of Distribution Systems Using Cascade Correlation Neural Networks

    DEFF Research Database (Denmark)

    Heidari, Alireza; Agelidis, Vassilios; Pou, Josep

    2018-01-01

    Reliability worth analysis is of great importance in the area of distribution network planning and operation. The precision of reliability worth can be greatly affected by the customer interruption cost model used, and the choice of cost model can change system and load-point reliability indices. In this study, a cascade correlation neural network is adopted to further develop two cost models, comprising a probabilistic distribution model and an average or aggregate model. A contingency-based analytical technique is adopted to conduct the reliability worth analysis. Furthermore, the possible effects...